In this article, we consider the privacy implications of posting data from small, randomized trials, observational studies, or case series in anesthesia from a few (e.g., 1-3) hospitals. Prior to publishing such data as supplemental digital content, the authors remove attributes that could be used to re-identify individuals, a process known as "anonymization." Posting health information that has been properly "de-identified" is assumed to pose no risk to patient privacy. Yet, computer scientists have demonstrated that this assumption is flawed. We consider various realistic scenarios in which the publication of such data could lead to breaches of patient privacy, and review several examples of successful privacy attacks along with the methods used. We survey the latest models and methods from computer science for protecting health information and their application to posting data from small anesthesia studies. To illustrate the vulnerability of such published data, we calculate the "population uniqueness" for patients undergoing one or more surgical procedures using data from the State of Texas. For a patient selected uniformly at random, the probability that an adversary could match this patient's record to a unique record in the state external database was 42.8% (SE < 0.1%). Although 42.8% is already an unacceptably high level of risk, it underestimates the risk for patients from smaller states or provinces. We propose an editorial policy that greatly reduces the likelihood of a privacy breach while supporting the goal of transparency of the research process.
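The "population uniqueness" metric the authors compute can be illustrated with a short sketch. This is not the authors' code; it is a minimal, hypothetical example assuming that each patient is represented by a tuple of quasi-identifiers (here an invented combination of age band, ZIP prefix, and procedure code). A record is "population unique" if no other record in the population shares its quasi-identifier combination, and the metric is the fraction of such records:

```python
from collections import Counter

def population_uniqueness(records):
    """Fraction of records whose quasi-identifier combination
    appears exactly once in the population."""
    counts = Counter(records)
    unique = sum(c for combo, c in counts.items() if c == 1)
    return unique / len(records)

# Toy population of (age band, ZIP prefix, procedure code) tuples;
# the values are illustrative, not drawn from the Texas data.
population = [
    ("40-49", "787", "4443"),
    ("40-49", "787", "4443"),  # shares its combination, so not unique
    ("60-69", "750", "8151"),
    ("30-39", "770", "4701"),
]
print(population_uniqueness(population))  # → 0.5
```

In this toy population, two of the four records have a quasi-identifier combination shared by no one else, giving a uniqueness of 0.5; the abstract's 42.8% figure is the analogous fraction computed over Texas surgical discharge records.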
DOI: http://dx.doi.org/10.1213/ANE.0000000000001331