Recent developments in autism research at the U.S. National Institutes of Health (NIH) have raised significant data privacy concerns, concerns that persist even after the agency walked back a proposed national registry of diagnosed individuals. Attorneys Ariana Aboulafia and Andrew Crawford of the Center for Democracy and Technology have highlighted these issues, emphasizing that the planned research still poses a range of privacy risks.
Last week, NIH announced plans to aggregate records from both government and private sources, including electronic health records, pharmacy data, and information from wearable technologies. This initiative aims to create a comprehensive database for studying autism and identifying those affected by the condition.
Following the announcement, NIH faced substantial backlash from patient advocates and privacy organizations, prompting a rapid reassessment of its plans for a national registry. The agency's retreat, however, does not eliminate the pressing concerns associated with the collection and use of sensitive health data, as Aboulafia and Crawford noted in their discussion with Information Security Media Group.
The primary sticking point remains who can access this amassed data and how it might be put to secondary uses. Aboulafia raised critical questions about how many third parties might gain access to the information and what purposes the compiled data might ultimately serve. "There are still questions we don't have answers for," she stated, emphasizing the ongoing uncertainty despite NIH's retreat from the registry plan.
Concerns extend beyond data collection practices; Crawford pointed out that patients may hesitate to share personal health information if they fear it will be used for purposes beyond their own care. "Exchanging information with your doctor serves a clear purpose—diagnosis and treatment—but when that data is repurposed, it can erode trust," he said, noting that this distrust could hinder patient engagement with health providers and technology.
NIH has yet to respond to inquiries about the data privacy implications of its autism research strategy, a plan that could have far-reaching effects on patient confidentiality and data security. In their interview, Aboulafia and Crawford explored additional considerations, including consent issues, particularly for data concerning minors, and how HIPAA and other privacy regulations apply to health research.
The conversation also addressed the risks of using de-identified or anonymized data in research, as well as the recent bankruptcy of genetics firm 23andMe, which has sparked renewed concerns about the privacy of health data in an increasingly digital landscape.
Aboulafia and Crawford are well-positioned to discuss these matters: Aboulafia leads CDT's Disability Rights in Technology Policy work, advocating for equitable technology practices, while Crawford oversees CDT's Data and Privacy Project, having previously advised on judicial and technology matters in Congress. Their insights underscore the complex interplay between health data research and privacy, a dynamic that continues to evolve as technology advances.