Transferability and Adversarial Training in Automatic Classification of the Electrocardiogram with Deep Learning

Arvid Eriksson1, Thomas B. Schön2, Antonio H. Ribeiro2
1KTH Royal Institute of Technology, 2Uppsala University


Abstract

Background: Automatic electrocardiogram (ECG) analysis using deep neural networks has shown promising results in recent years. These models are, however, susceptible to domain shift when applied to data distributions they were not trained on. This is especially important in automatic ECG analysis, as many settings, such as individual hospitals, lack the data or resources required to train suitable models from scratch.

Methods: We investigate how training on worst-case artificial samples through adversarial training can promote models that are both robust and easily adapted to new datasets through fine-tuning. We use Auto-PGD (APGD) to adversarially attack and train deep convolutional neural networks. We compare the area under the precision-recall curve (AUPRC) for the classification of atrial fibrillation using two cohorts: PTB-XL for training, and CODE-15% for fine-tuning and evaluating the models.
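APGD belongs to the projected-gradient-descent (PGD) family of attacks, which iteratively perturb an input to maximize the loss while staying within an epsilon-ball of the original. As a rough illustration only (not the paper's implementation, which uses APGD's adaptive step sizes and deep convolutional networks), a basic L-infinity PGD attack on a logistic-regression classifier could be sketched as:

```python
import numpy as np

def pgd_attack(w, b, x, y, eps=0.1, alpha=0.02, steps=10):
    """Basic L-infinity PGD attack on a logistic-regression classifier.

    Illustrative stand-in for the APGD attacks referenced in the text:
    APGD adds adaptive step sizes and restarts on top of this
    projected-gradient loop. All names here are hypothetical.
    """
    x_adv = x.copy()
    for _ in range(steps):
        z = w @ x_adv + b
        p = 1.0 / (1.0 + np.exp(-z))               # sigmoid probability
        grad = (p - y) * w                         # d(BCE loss)/dx
        x_adv = x_adv + alpha * np.sign(grad)      # ascend the loss
        x_adv = np.clip(x_adv, x - eps, x + eps)   # project to eps-ball
    return x_adv
```

Adversarial training then amounts to generating such perturbed samples for each mini-batch and updating the model on them instead of (or alongside) the clean samples.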

Results: Our results show that models adversarially trained on ECG data transfer better when fine-tuned on new datasets than normally trained models (0.692 vs. 0.639 AUPRC), and need roughly a quarter of the data to fine-tune to a similar performance. They can even surpass models trained solely on the new dataset using more total time and data (0.719 vs. 0.679 AUPRC).

Discussion: This study examines the impact of adversarial training through the lens of transferability. We show that adversarial training improves robustness, often at the cost of performance on clean data, while also producing models that are better suited for fine-tuning. Our work thus paves the way for training general models that can be adapted to new settings for automatic ECG analysis with high performance.