The Information Commissioner’s position paper on the UK government’s proposal for a trustworthy digital identity system provides an insight into the interplay between data protection and digital identity.

Important points

  • Given the type and volume of data affected, every controller involved in verifying digital identities would be legally required to carry out a data protection impact assessment (DPIA). Prior consultation with the ICO may also be required.
  • Concepts such as federated identity management, attribute-based credentials, tokenization, and on-device processing are important. They can reduce the likelihood and severity of potential risks and harms, such as misuse of personal information or loss of confidence and unwarranted intrusion, and can reduce costs (both of implementation and of compliance).
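
Tokenization, one of the techniques named above, can be sketched roughly as follows. The `TokenVault` class and its interface are illustrative assumptions for this note, not anything defined by the trust framework; a real deployment would use a hardened, access-controlled token store.

```python
import secrets


class TokenVault:
    """Illustrative (hypothetical) token vault: swaps a direct identifier
    for a random token so downstream services never see the raw value.
    The token-to-value mapping never leaves the vault."""

    def __init__(self):
        self._token_to_value = {}

    def tokenize(self, value: str) -> str:
        # A fresh random token carries no information about the value itself.
        token = secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can map a token back to the original value.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("1985-03-14")  # e.g. a date of birth
assert token != "1985-03-14"
assert vault.detokenize(token) == "1985-03-14"
```

Because each token is random rather than derived from the value, a breach of a relying party that holds only tokens exposes nothing about the underlying personal data.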

Purpose limitation

  • Organizations should not use data that a person provides specifically for digital identity verification for other purposes, unless permitted by law or with the person’s consent.
  • In the ICO’s experience, failure to limit the purposes for which organizations collect personal data creates risks for individuals.
  • People have a reasonable expectation that organizations will use their data for the purposes they are initially informed about.
  • It would seriously undermine public confidence in the framework if organizations used people’s data in ways they would not expect.
  • This could be the case both within private sector organizations and within government.
  • Additionally, processing data collected for one purpose for another, incompatible purpose (unless an exception applies) violates the UK GDPR. The governing body of the framework should therefore play an important role in ensuring that, in practice, data used for digital identity remains limited to that purpose.

Automated processing

  • Concerns and potential risks can arise for individuals when digital identity and attribute systems (or the service providers using digital identity and attributes) rely on automated processing. This could include the use of algorithms or artificial intelligence as part of the system.
  • Even where automated processing does not fall under Article 22 – for example, because the processing is not solely automated or does not have legal or similarly significant effects – organizations must still fully consider and comply with data protection rights and obligations, in particular transparency, accuracy, and redress mechanisms.
  • The ICO welcomes the trust framework’s recognition of potential discriminatory bias in automated decision-making, and the requirement that an appropriate governing body receive annual exclusion reports.
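
The kind of aggregate an exclusion report might draw on can be sketched as below. The `exclusion_report` function and its input format are assumptions made for illustration; the trust framework does not prescribe a reporting format.

```python
from collections import defaultdict


def exclusion_report(decisions):
    """Hypothetical helper: summarize automated verification outcomes
    per demographic group as pass rates. `decisions` is a list of
    (group, passed) pairs."""
    totals = defaultdict(int)
    passes = defaultdict(int)
    for group, passed in decisions:
        totals[group] += 1
        if passed:
            passes[group] += 1
    # Pass rate per group; large gaps between groups would warrant
    # investigation for discriminatory bias or exclusion.
    return {group: passes[group] / totals[group] for group in totals}


rates = exclusion_report([
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
])
```

Monitoring such rates over time, alongside false-positive and false-negative rates, is one way an operator could surface the discriminatory effects the framework asks governing bodies to track.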

Information for children

The UK GDPR states that children generally deserve special protection because of the risks associated with the collection and processing of their data. Every digital identity system must therefore give special consideration to how it safely accommodates and protects children.

Fair and lawful processing

  • It is important that individuals are offered a real choice over whether a digital identity and attribute scheme processes their data. However, organizations must be particularly careful when relying on consent or explicit consent as a prerequisite for access to a digital identity and attribute scheme within the framework. Consent is unlikely to be valid where the organization holds power over the individual, such as a government agency, or a potential employer seeking confirmation of a medical screening.
  • All organizations within the trust framework must consider not only how they can use digital identities and attributes, but also whether they should do so in a given scenario. Whether processing is fair may depend on how the personal data is obtained. If organizations deceive or mislead people when collecting personal information, the processing is unlikely to be fair.

Transparency

  • When communicating with the public about digital identities and attributes, it is important that the information is user-friendly and easily understood by those who are not technical specialists.
  • There should also be a reasonable level of consistency between different controllers and the privacy information they provide.
  • The design of this new system should also make it possible to build transparency best practice into the service experience itself, not just into formal privacy notices. This should include user experience (UX) design and testing.

Data minimization

  • If businesses collect excessive data for identification purposes, “function creep” can occur, with the data coming to be seen as valuable for marketing. There is also a risk of multiple organizations holding duplicate data that can be hacked or misused.
  • Organizations must therefore ensure that personal data is adequate, relevant and limited to what is necessary for the purposes for which it is processed.
  • Only process data that is necessary to verify a person’s identity or their attributes.
  • Collecting, using, and retaining only the minimum amount of data required reduces data protection risks, so every system within the framework should support this.
  • Organizations should only have access to the data they need to run their services – for example, verifying a fact rather than receiving the detailed underlying information.
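
Verifying a fact rather than disclosing the underlying data can be sketched as follows. The `is_over_18` helper is hypothetical, shown only to illustrate the minimization principle: the relying party receives a single yes/no answer, never the date of birth itself.

```python
from datetime import date


def is_over_18(date_of_birth: date, today: date) -> bool:
    """Hypothetical attribute check: answer only the yes/no question a
    relying party needs, instead of disclosing the date of birth.
    Tuple comparison avoids constructing an invalid date for people
    born on 29 February."""
    return (today.year - date_of_birth.year, today.month, today.day) >= (
        18, date_of_birth.month, date_of_birth.day,
    )


# The relying party sees a boolean, not the birth date.
assert is_over_18(date(2000, 1, 1), date(2024, 6, 1))
assert not is_over_18(date(2010, 1, 1), date(2024, 6, 1))
```

In a full scheme the boolean would typically be conveyed as a signed attribute or credential (e.g. via attribute-based credentials, mentioned earlier), so the relying party can trust the answer without ever handling the source data.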

Security

  • Any scheme should be based on strong technical and organizational security measures. This reflects how attractive the data in digital identity systems is to bad actors, and the high risk posed to individuals if that data is compromised.
  • These measures could include the use of privacy enhancing technology to minimize the risk of fraud, impersonation and other misuse or data loss.
  • Organizations should keep security measures under regular review to ensure their effectiveness, including monitoring of false-positive rates.
  • A distributed, decentralized model can support the effectiveness of these measures, alongside joint assessment of threats and clarification of risks.
