Abstract
An electrocardiogram (ECG) monitors the electrical activity generated by the heart and is used to detect fatal cardiovascular diseases (CVDs). Conventionally, to capture this electrical activity precisely, clinical experts use multiple-lead ECGs (typically 12 leads). Recently, large-scale deep learning models have been used to detect these diseases; however, they require large memory and long inference times. We propose a low-parameter model, Low Resource Heart-Network (LRH-Net), that detects ECG anomalies in resource-constrained environments. In addition, multi-level knowledge distillation (MLKD) is employed to improve model generalization. MLKD distils dark knowledge from higher-parameter models (teachers) trained on different lead configurations into LRH-Net. LRH-Net has 106× fewer parameters and 76% faster inference than the teacher model for detecting CVDs. Using MLKD, the performance of LRH-Net on reduced-lead data improved by up to 3.25%, making it suitable for edge devices.
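The teacher-to-student transfer mentioned above rests on the standard knowledge-distillation objective: the student is trained against the teacher's temperature-softened output distribution in addition to the ground-truth labels. The following is a minimal NumPy sketch of that objective only; the temperature, weighting factor `alpha`, and the toy 5-class logits are illustrative assumptions, not values from the paper, and the actual MLKD setup involves multiple teachers on different lead configurations.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge".
    z = logits / temperature
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, hard_label,
                      temperature=4.0, alpha=0.5):
    # Weighted sum of a soft objective (match the teacher) and a hard
    # objective (match the label). temperature and alpha are
    # illustrative hyperparameters, not taken from the paper.
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 to keep its gradient magnitude comparable.
    soft = (temperature ** 2) * np.sum(
        p_teacher * (np.log(p_teacher) - np.log(p_student)))
    # Cross-entropy against the ground-truth class index.
    hard = -np.log(softmax(student_logits)[hard_label])
    return alpha * soft + (1.0 - alpha) * hard

# Toy example: teacher and student logits over 5 disease classes.
teacher = np.array([4.0, 1.0, 0.5, 0.2, 0.1])
student = np.array([2.5, 1.2, 0.8, 0.3, 0.2])
loss = distillation_loss(student, teacher, hard_label=0)
print(loss)
```

A student whose outputs already match the teacher incurs no soft-term penalty, so the loss reduces to the weighted hard cross-entropy.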