
December 18, 2023
by Ferdous Al-Faruque

NIST drafts differential privacy guidance to enhance individual privacy in research

The US National Institute of Standards and Technology (NIST) has published draft guidance that describes differential privacy, offers advice on how it can be achieved, and discusses concerns about its use. According to the agency, differential privacy is a technology that enables researchers to quantify the privacy risk to an individual when that person’s data is included in a published dataset.
 
On 11 December, NIST published the draft guidance, entitled “Guidelines for Evaluating Differential Privacy Guarantees.” Differential privacy is a technique that enhances the privacy of data by adding random noise, which helps mask the identity of the person the data was obtained from.
 
“More noise yields better privacy but also degrades the utility of the result,” the guidance states. “This dynamic is often called the privacy-utility trade-off, and it can be difficult to achieve high utility and strong privacy protection in some cases.
 
“In addition, some differentially private techniques can create or magnify systemic, human, or statistical bias in results, so care must be taken to understand and mitigate these impacts,” the document added.
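To illustrate that trade-off, here is a minimal sketch, not drawn from the NIST draft, of the classic Laplace mechanism, one common way to achieve differential privacy for a simple counting query. The dataset, function names, and epsilon values below are purely hypothetical.

```python
# Illustrative sketch of the Laplace mechanism for a counting query.
# All names, data, and parameter values are hypothetical, not taken
# from the NIST draft guidance.
import numpy as np

def dp_count(data, predicate, epsilon):
    """Return a differentially private count of records matching `predicate`.

    A counting query has sensitivity 1 (adding or removing one person's
    record changes the true count by at most 1), so adding Laplace noise
    with scale 1/epsilon yields an epsilon-differentially-private answer.
    """
    true_count = sum(1 for record in data if predicate(record))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [34, 29, 47, 52, 61, 38, 45]  # toy dataset

# Smaller epsilon -> larger noise -> stronger privacy, but a noisier,
# less useful answer; larger epsilon reverses the trade-off.
for eps in (0.1, 1.0, 10.0):
    print(f"epsilon={eps}: noisy count of people 40+ = {dp_count(ages, lambda age: age >= 40, eps):.2f}")
```

In this sketch, the privacy parameter epsilon directly controls the noise scale: lowering it strengthens the privacy guarantee while degrading the accuracy of the released count, which is the privacy-utility trade-off the guidance describes.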
 
The document was developed in response to an executive order published in October requiring federal agencies to develop plans to ensure safe and secure use of artificial intelligence (AI) technology. While the guidance is not specific to the healthcare industry, it could have implications for research involving health or medical data.
 
While differential privacy has been around as a theoretical framework for almost two decades, NIST noted that it is considered a privacy-enhancing technology (PET) that currently lacks standards, which the agency says creates a barrier for users.
 
“This publication is intended to help practitioners of all backgrounds — policymakers, business owners, product managers, IT technicians, software engineers, data scientists, researchers, and academics — understand, evaluate, and compare differential privacy guarantees,” the agency said. “In particular, this publication highlights privacy hazards that practitioners should consider carefully.”
 
The guidance is divided into three sections. The first section details NIST’s definition of differential privacy, what differential privacy guarantees are, how they compare to one another, and how they are used in the real world. The second section details differentially private algorithms and provides different scenarios in which they can be used. Finally, the third section discusses concerns about differentially private analysis techniques.
 
While NIST notes that there are a number of concerns with using differential privacy, it is currently the best-known method for protecting data from malicious hackers.
 
“This publication has summarized just a few of the many kinds of data analyses that can be accomplished with differential privacy, and current research is expanding these capabilities every year,” the agency added. “In addition, an increasing number of open-source libraries and systems are starting to bring these techniques into practice.”
 
Stakeholders can comment on the guidance on NIST’s website until 25 January 2024.
 
NIST differential privacy draft guidance