Background: Accurate determination of low-density lipoprotein cholesterol (LDL) is important for assessing the risk of coronary heart disease and atherosclerosis. Apart from direct measurement of LDL, estimating models (equations) are used. A more recent approach is the use of machine learning (ML) algorithms.
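The best-known of these estimating equations is the Friedewald formula, which the paper later uses as a baseline. A minimal sketch (the function name and the example values are illustrative, not from the paper; units are mg/dL):

```python
def friedewald_ldl(tc, hdl, tg):
    """Friedewald estimate of LDL cholesterol (mg/dL): LDL = TC - HDL - TG/5.

    The equation is not considered valid for triglycerides >= 400 mg/dL.
    """
    if tg >= 400:
        raise ValueError("Friedewald equation is invalid for TG >= 400 mg/dL")
    return tc - hdl - tg / 5

print(friedewald_ldl(200, 50, 150))  # -> 120.0
```

The fixed TG/5 term is the equation's main weakness; the Martin equation replaces it with a patient-specific divisor, and the ML methods below learn the mapping from data instead.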

Methods: ML algorithms were used to determine LDL (regression) from total cholesterol, HDL and triglycerides. The methods used were multivariate Linear Regression (LR), Support Vector Machines (SVM), Extreme Gradient Boosting (XGB) and Deep Neural Networks (DNN), on both larger and smaller data sets. In addition, LDL values were classified according to both the NCEP ATP III and the European Society of Cardiology guidelines.
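The simplest of these models, multivariate linear regression on the three lipid-panel inputs, can be sketched with ordinary least squares. The synthetic data below is purely illustrative (a Friedewald-like relation plus noise), not the paper's data set:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
tc = rng.normal(200, 35, n)    # total cholesterol, mg/dL
hdl = rng.normal(50, 12, n)    # HDL cholesterol, mg/dL
tg = rng.normal(130, 40, n)    # triglycerides, mg/dL

# Simulated "directly measured" LDL: Friedewald-like relation plus noise.
ldl = tc - hdl - tg / 5 + rng.normal(0, 5, n)

# Fit LDL ~ b0 + b1*TC + b2*HDL + b3*TG by ordinary least squares.
X = np.column_stack([np.ones(n), tc, hdl, tg])
coef, *_ = np.linalg.lstsq(X, ldl, rcond=None)

# Standard Error of the Estimate (SEE), the regression metric used below.
resid = ldl - X @ coef
see = np.sqrt(np.sum(resid**2) / (n - X.shape[1]))
print(coef.round(2), round(see, 2))
```

SVM, XGB and DNN regressors are fitted to the same three inputs; only the hypothesis class changes.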

Results: Regression performance was assessed by the Standard Error of the Estimate (SEE). ML methods performed better than the established equations (Friedewald and Martin). The performance of all ML methods was comparable on large data sets and was affected by the divergence between the train and test data sets, as measured by the Jensen-Shannon divergence. Classification accuracy was not satisfactory for any model.
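The Jensen-Shannon divergence used here compares the train and test distributions; it is symmetric and, in base 2, bounded in [0, 1]. A self-contained sketch over binned values (the function name and histograms are illustrative, not from the paper):

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        mask = a > 0  # 0 * log(0) is taken as 0
        return np.sum(a[mask] * np.log2(a[mask] / (b[mask] + eps)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Histograms of, e.g., LDL values in the train vs. test set (illustrative bins).
train_hist = [5, 20, 40, 25, 10]
test_hist = [10, 30, 35, 15, 10]
print(round(js_divergence(train_hist, test_hist), 4))
```

A value near 0 means the two sets are distributed alike; values closer to 1 signal the kind of train/test mismatch that degraded the ML models' performance.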

Conclusions: Direct determination of LDL remains the preferred route. When it is not available, ML methods can be a good substitute; not only deep neural networks but also less computationally expensive methods can perform as well as deep learning.


Source: http://dx.doi.org/10.1016/j.cca.2021.02.020
