The use of ensembles in machine learning (ML) has had a considerable impact in increasing the accuracy and stability of predictors. This increase in accuracy has come at the cost of comprehensibility as, by definition, an ensemble model is considerably more complex than its component models. This is of significance for decision support systems in medicine because of the reluctance to use models that are essentially black boxes. Work on making ensembles comprehensible has so far focused on global models that mirror the behaviour of the ensemble as closely as possible. With such global models there is a clear trade-off between comprehensibility and fidelity. In this paper, we pursue another tack, looking at local comprehensibility, where the output of the ensemble is explained on a case-by-case basis. We argue that this meets the requirements of medical decision support systems. The approach presented here identifies the ensemble members that best fit the case in question and presents the behaviour of these as the explanation.
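The sketch below illustrates one plausible reading of that idea, not the authors' implementation: train a bagged ensemble, and for a single case select the members that agree most confidently with the ensemble's own prediction, treating those members as the local explanation. All names (`explain_case`, the choice of bagged decision trees, the confidence-based selection rule) are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy data and a small bagged ensemble of shallow trees, standing in for
# the paper's component models (assumed setup, not the authors' data).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
ensemble = []
for _ in range(25):
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    tree = DecisionTreeClassifier(max_depth=3, random_state=0)
    tree.fit(X_train[idx], y_train[idx])
    ensemble.append(tree)

def explain_case(case, k=3):
    """Return the ensemble's prediction for one case, plus the k members
    that best fit that case (here: highest confidence in the ensemble's
    predicted class). Those members serve as the local explanation."""
    case = case.reshape(1, -1)
    votes = np.array([m.predict(case)[0] for m in ensemble])
    ensemble_pred = np.bincount(votes).argmax()  # majority vote

    # Each member's confidence in the class the ensemble predicted.
    conf = []
    for m in ensemble:
        probs = m.predict_proba(case)[0]
        cls = np.where(m.classes_ == ensemble_pred)[0]
        conf.append(probs[cls[0]] if cls.size else 0.0)
    conf = np.array(conf)

    best = np.argsort(conf)[::-1][:k]  # best-fitting members
    return ensemble_pred, [(i, conf[i], ensemble[i]) for i in best]

pred, members = explain_case(X_test[0])
print("ensemble prediction:", pred)
for i, c, tree in members:
    # Each selected tree is small enough to inspect directly,
    # e.g. via sklearn.tree.export_text(tree).
    print(f"member {i}: confidence {c:.2f}, depth {tree.get_depth()}")
```

In this reading, comprehensibility comes from the fact that each selected member is itself a small, interpretable model, so the case-specific explanation never requires inspecting the whole ensemble.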
DOI: http://dx.doi.org/10.1016/s0933-3657(03)00056-3