AI Article Synopsis

  • Early detection and intervention of arrhythmia are vital for effective treatment and for reducing complications, prompting the development of an explainable deep learning model (XDM) for arrhythmia classification.
  • Utilizing a large dataset of 86,802 electrocardiograms (ECGs), the XDM was validated against external data from 36,961 ECGs to ensure its accuracy and explainability.
  • The XDM achieved area under the curve (AUC) scores of 0.976 and 0.966 in internal and external validation, respectively. It classified arrhythmias effectively while also explaining its classifications, which enhances its usefulness in clinical settings.

Article Abstract

Background: Early detection and intervention are the cornerstones of appropriate arrhythmia treatment and of preventing complications and mortality. Although diverse deep learning models have been developed to detect arrhythmia, they have been criticized for their unexplainable nature. In this study, we developed an explainable deep learning model (XDM) to classify arrhythmia and validated its performance using diverse external validation data.

Methods: In this retrospective study, the Sejong dataset, comprising 86,802 electrocardiograms (ECGs), was used to develop and internally validate the XDM. The XDM, based on a neural network-backed ensemble tree, was built from six feature modules that can explain the reasons for its decisions. The model was externally validated using 36,961 ECGs from four non-restricted datasets.
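
The abstract describes the XDM only at a high level: neural-network feature modules whose outputs feed an ensemble tree. The sketch below illustrates that kind of hybrid, assuming 1-D convolutional modules over a 12-lead ECG and a scikit-learn gradient-boosted tree on the concatenated module outputs; the module design, dimensions, and classifier choice are illustrative assumptions, not the published architecture.

```python
# Minimal sketch of a "feature modules + ensemble tree" hybrid, as a stand-in for the
# pipeline described in the Methods. All sizes and choices here are assumptions.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import GradientBoostingClassifier

class FeatureModule(nn.Module):
    """One of six hypothetical 1-D CNN modules mapping a 12-lead ECG to a small feature vector."""
    def __init__(self, out_dim: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(12, 16, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
            nn.Flatten(),
            nn.Linear(16, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)             # shape: (batch, out_dim)

def extract_features(ecg_batch: torch.Tensor, modules) -> np.ndarray:
    """Concatenate the outputs of all feature modules into one feature matrix."""
    with torch.no_grad():
        feats = [m(ecg_batch) for m in modules]
    return torch.cat(feats, dim=1).numpy()

# Toy data: 32 ECGs, 12 leads, 5000 samples (e.g. 10 s at 500 Hz), 4 rhythm classes.
ecgs = torch.randn(32, 12, 5000)
labels = np.random.randint(0, 4, size=32)

modules = [FeatureModule() for _ in range(6)]   # six feature modules
features = extract_features(ecgs, modules)      # shape: (32, 48)

# Tree ensemble on top of the module features; per-feature importances of such an
# ensemble are one way a hybrid like this can expose the basis of its decisions.
clf = GradientBoostingClassifier().fit(features, labels)
print(clf.predict(features[:5]))
```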

Results: In internal and external validation of the XDM, the average areas under the receiver operating characteristic curve (AUCs) for 12-lead ECG arrhythmia classification were 0.976 and 0.966, respectively. The XDM outperformed a previous simple multi-class deep learning model that used the same method. In internal and external validation, the AUCs for the explainability outputs ranged from 0.925 to 0.991.
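
The reported figures are AUCs averaged over arrhythmia classes; the abstract does not state the averaging scheme. The snippet below shows one common convention, a macro-averaged one-vs-rest ROC AUC computed with scikit-learn on synthetic scores; it is illustrative only and not the authors' evaluation code.

```python
# One common way to obtain a single averaged AUC for a multi-class classifier:
# macro-averaged one-vs-rest ROC AUC. Data here are synthetic placeholders.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 4

y_true = rng.integers(0, n_classes, size=n_samples)      # true rhythm labels
scores = rng.random((n_samples, n_classes))               # model class scores
scores = scores / scores.sum(axis=1, keepdims=True)       # normalize to probabilities

macro_auc = roc_auc_score(y_true, scores, multi_class="ovr", average="macro")
print(f"macro one-vs-rest AUC: {macro_auc:.3f}")
```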

Conclusion: Our XDM successfully classified arrhythmia from diverse formats of ECGs and could effectively describe the reasons for its decisions. Therefore, an explainable deep learning methodology can improve accuracy over conventional deep learning methods, and the transparency of the XDM supports its application in clinical practice.

Source
http://dx.doi.org/10.1016/j.jelectrocard.2021.06.006

Publication Analysis

Top Keywords

deep learning (24)
explainable deep (12)
learning model (12)
external validation (12)
internal external (8)
xdm (7)
arrhythmia (6)
deep (6)
learning (6)
detection classification (4)
