Convolutional neural network approach for heart murmur detection in auscultation signals using wavelet transform based features

Robertas Petrolis1, Renata Paukstaitiene1, Gabriele Rudokaite2, Andrius Macas3, Arturas Grigaliunas1, Algimantas Krisciukaitis1
1Department of Physics, Mathematics and Biophysics, Lithuanian University of Health Sciences; 2Department of Cardiology, Lithuanian University of Health Sciences; 3Department of Anaesthesiology, Medical Academy, Lithuanian University of Health Sciences


Abstract

Experienced cardiologists classify heart auscultation signals with high agreement among each other, yet they often find it difficult to verbalize precisely the key features they rely on. Therefore, machine learning algorithms trained on expert-annotated data could be a valuable clinical decision support tool, increasing the reliability of cardiac diagnostics.

We propose here an algorithm for the classification of cardiac auscultation signals in which key features are calculated by the wavelet transform and the decision is made by a convolutional neural network trained on the annotated George B. Moody PhysioNet Challenge 2022 dataset.
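To make the decision stage concrete, the sketch below shows the kind of small two-dimensional convolutional network implied by this pipeline, taking CWT scalogram "images" as input. The layer sizes, input shape, and two-class output are illustrative assumptions; the actual architecture is not specified in this abstract.

```python
# Hypothetical minimal 2-D CNN for murmur present/absent classification
# of CWT scalograms; layer sizes and input shape are illustrative only.
import torch
import torch.nn as nn


class MurmurCNN(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),   # fixed-size output regardless of window length
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):                   # x: (batch, 1, n_scales, n_samples)
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))


# Example: one scalogram with 64 scales and 600 time samples
logits = MurmurCNN()(torch.randn(1, 1, 64, 600))
```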

The heart auscultation signal contains strong beats, representing cardiac valve closures, and the murmur sounds that follow or even partially overlap with the beats. These key components of the signal differ in time and frequency; therefore, the continuous wavelet transform (CWT) was chosen as the method for primary feature formation. The CWT of a randomly taken excerpt of the signal is a two-dimensional array in which areas of high values represent the strong beats and areas of moderate values represent murmur sounds, if they are present. The differences in amplitude, time, and frequency of the analyzed components are clearly visible when the CWT result is displayed as an image. The strong-beat representations in these images provide the time marks for the expected representations of the sought murmur sounds. Therefore, we do not delineate individual cardiac cycles; instead, we process the whole signal, calculating CWT results in sliding, overlapping windows of at least one cardiac cycle in length. For the final analysis we use only a few selected CWT results per recording, those with the lowest non-zero entropy, which discards noisy or corrupted parts of the signal. The convolutional neural network performs the final classification.
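The following is a minimal sketch of the windowed CWT feature formation and the lowest non-zero entropy window selection described above. The wavelet family, scale range, window length, step, and number of kept windows are assumptions chosen for illustration, not the settings actually used in this work.

```python
# Sketch: CWT scalograms over sliding, overlapping windows, keeping the
# few windows with lowest non-zero Shannon entropy (assumed parameters).
import numpy as np
import pywt


def cwt_scalogram(segment, fs, scales=np.arange(1, 65), wavelet="morl"):
    """Return |CWT| of one signal window as a 2-D (scale x time) array."""
    coefs, _ = pywt.cwt(segment, scales, wavelet, sampling_period=1.0 / fs)
    return np.abs(coefs)


def shannon_entropy(scalogram):
    """Shannon entropy of the normalized scalogram energy distribution."""
    p = scalogram.ravel() ** 2
    total = p.sum()
    if total == 0:                      # flat/empty window: treat as zero entropy
        return 0.0
    p = p / total
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


def select_windows(signal, fs, win_s=1.5, step_s=0.5, n_keep=5):
    """Slide overlapping windows (at least one cardiac cycle long), compute
    CWT scalograms, and keep the n_keep windows with lowest non-zero entropy."""
    win, step = int(win_s * fs), int(step_s * fs)
    candidates = []
    for start in range(0, len(signal) - win + 1, step):
        scal = cwt_scalogram(signal[start:start + win], fs)
        h = shannon_entropy(scal)
        if h > 0:                       # discard degenerate windows
            candidates.append((h, scal))
    candidates.sort(key=lambda c: c[0])  # lower entropy ~ less noisy content
    return [scal for _, scal in candidates[:n_keep]]
```

The selected scalograms would then be passed, e.g. one per forward pass, to a CNN classifier such as the sketch above.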

The algorithm was tested on the Challenge dataset, showing a sensitivity of 0.83, a specificity of 0.71, and a score of 1087 during the unofficial phase.