Privatus 6 1 8 – Automated Privacy Protection

At a glance


  • The UK GDPR has provisions on:
    • automated individual decision-making (making a decision solely by automated means without any human involvement); and
    • profiling (automated processing of personal data to evaluate certain things about an individual). Profiling can be part of an automated decision-making process.
  • The UK GDPR applies to all automated individual decision-making and profiling.
  • Article 22 of the UK GDPR has additional rules to protect individuals if you are carrying out solely automated decision-making that has legal or similarly significant effects on them.
  • You can only carry out this type of decision-making where the decision is:
    • necessary for the entry into or performance of a contract; or
    • authorised by domestic law applicable to the controller; or
    • based on the individual's explicit consent.
  • You must identify whether any of your processing falls under Article 22 and, if so, make sure that you:
    • give individuals information about the processing;
    • introduce simple ways for them to request human intervention or challenge a decision;
    • carry out regular checks to make sure that your systems are working as intended.

Checklists

All automated individual decision-making and profiling

To comply with the UK GDPR:


We have a lawful basis to carry out profiling and/or automated decision-making and document this in our data protection policy.

We send individuals a link to our privacy statement when we have obtained their personal data indirectly.

We explain how people can access details of the information we used to create their profile.

We tell people who provide us with their personal data how they can object to profiling, including profiling for marketing purposes.

We have procedures for customers to access the personal data input into the profiles so they can review and edit for any accuracy issues.

We have additional checks in place for our profiling/automated decision-making systems to protect any vulnerable groups (including children).

We only collect the minimum amount of data needed and have a clear retention policy for the profiles we create.

As a model of best practice:

We carry out a DPIA to consider and address the risks before we start any new automated decision-making or profiling.

We tell our customers about the profiling and automated decision-making we carry out, what information we use to create the profiles and where we get this information from.

We use anonymised data in our profiling activities.

Solely automated individual decision-making, including profiling with legal or similarly significant effects (Article 22)

To comply with the UK GDPR:

We carry out a DPIA to identify the risks to individuals, show how we are going to deal with them and what measures we have in place to meet UK GDPR requirements.

We carry out processing under Article 22(1) for contractual purposes and we can demonstrate why it's necessary.

OR

We carry out processing under Article 22(1) because we have the individual's explicit consent recorded. We can show when and how we obtained consent. We tell individuals how they can withdraw consent and have a simple way for them to do this.

OR

We carry out processing under Article 22(1) because we are authorised or required by law to do so. This is the most appropriate way to achieve our aims.

We don't use special category data in our automated decision-making systems unless we have a lawful basis to do so, and we can demonstrate what that basis is. We delete any special category data accidentally created.

We explain that we use automated decision-making processes, including profiling. We explain what information we use, why we use it and what the effects might be.

We have a simple way for people to ask us to reconsider an automated decision.

We have identified staff in our organisation who are authorised to carry out reviews and change decisions.

We regularly check our systems for accuracy and bias and feed any changes back into the design process.


As a model of best practice:

We use visuals to explain what information we collect/use and why this is relevant to the process.

We have signed up to [standard], a set of ethical principles, to build trust with our customers. This is available on our website and on paper.

In brief

What is automated individual decision-making and profiling?

Automated individual decision-making is a decision made by automated means without any human involvement.

Examples of this include:

  • an online decision to award a loan; and
  • a recruitment aptitude test which uses pre-programmed algorithms and criteria.

Automated individual decision-making does not have to involve profiling, although it often will do.

The UK GDPR says that profiling is:

'Any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.'

[Article 4(4)]

Organisations obtain personal information about individuals from a variety of different sources. Internet searches, buying habits, lifestyle and behaviour data gathered from mobile phones, social networks, video surveillance systems and the Internet of Things are examples of the types of data organisations might collect.

Information is analysed to classify people into different groups or sectors, using algorithms and machine-learning. This analysis identifies links between different behaviours and characteristics to create profiles for individuals. There is more information about algorithms and machine-learning in our paper on big data, artificial intelligence, machine learning and data protection.

Based on the traits of others who appear similar, organisations use profiling to:

  • find something out about individuals' preferences;
  • predict their behaviour; and/or
  • make decisions about them.

This can be very useful for organisations and individuals in many sectors, including healthcare, education, financial services and marketing.

Automated individual decision-making and profiling can lead to quicker and more consistent decisions. But if they are used irresponsibly there are significant risks for individuals. The UK GDPR provisions are designed to address these risks.

What does the UK GDPR say about automated individual decision-making and profiling?

The UK GDPR restricts you from making solely automated decisions, including those based on profiling, that have a legal or similarly significant effect on individuals.

'The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.'

[Article 22(1)]

For something to be solely automated there must be no human involvement in the decision-making process.

The restriction only covers solely automated individual decision-making that produces legal or similarly significant effects. These types of effect are not defined in the UK GDPR, but the decision must have a serious negative impact on an individual to be caught by this provision.

A legal effect is something that adversely affects someone's legal rights. Similarly significant effects are more difficult to define but would include, for example, automatic refusal of an online credit application, and e-recruiting practices without human intervention.

When can we carry out this type of processing?

Solely automated individual decision-making - including profiling - with legal or similarly significant effects is restricted, although this restriction can be lifted in certain circumstances.

You can only carry out solely automated decision-making with legal or similarly significant effects if the decision is:

  • necessary for entering into or performance of a contract between an organisation and the individual;
  • authorised by law (for example, to prevent fraud or tax evasion); or
  • based on the individual's explicit consent.

If you're using special category personal data you can only carry out processing described in Article 22(1) if:

  • you have the individual's explicit consent; or
  • the processing is necessary for reasons of substantial public interest.

What else do we need to consider?

Because this type of processing is considered to be high-risk the UK GDPR requires you to carry out a Data Protection Impact Assessment (DPIA) to show that you have identified and assessed what those risks are and how you will address them.

As well as restricting the circumstances in which you can carry out solely automated individual decision-making (as described in Article 22(1)) the UK GDPR also:

  • requires you to give individuals specific information about the processing;
  • obliges you to take steps to prevent errors, bias and discrimination; and
  • gives individuals rights to challenge and request a review of the decision.

These provisions are designed to increase individuals' understanding of how you might be using their personal data.

You must:

  • provide meaningful information about the logic involved in the decision-making process, as well as the significance and the envisaged consequences for the individual;
  • use appropriate mathematical or statistical procedures;
  • ensure that individuals can:
    • obtain human intervention;
    • express their point of view; and
    • obtain an explanation of the decision and challenge it;
  • put appropriate technical and organisational measures in place, so that you can correct inaccuracies and minimise the risk of errors;
  • secure personal data in a way that is proportionate to the risk to the interests and rights of the individual, and that prevents discriminatory effects.

What if Article 22 doesn't apply to our processing?

Article 22 applies to solely automated individual decision-making, including profiling, with legal or similarly significant effects.

If your processing does not match this definition then you can continue to carry out profiling and automated decision-making.

But you must still comply with the UK GDPR principles.

You must identify and record your lawful basis for the processing.

You need to have processes in place so people can exercise their rights.

Individuals have a right to object to profiling in certain circumstances. You must bring details of this right specifically to their attention.

Further Reading

In more detail – ICO guidance

We have published detailed guidance on automated decision-making and profiling.

In more detail – European Data Protection Board

The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the EU version of the GDPR.

WP29 has adopted guidelines on Automated individual decision-making and Profiling, which have been endorsed by the EDPB.

EDPB guidelines are no longer directly relevant to the UK regime and are not binding under it. However, they may still provide helpful guidance on certain issues.

In detail

What is profiling?

Profiling analyses aspects of an individual's personality, behaviour, interests and habits to make predictions or decisions about them.

The UK GDPR defines profiling as follows:

'profiling' means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.


[Article 4(4)]

Organisations obtain personal information about individuals from a variety of different sources. Internet searches, buying habits, lifestyle and behaviour data gathered from mobile phones, social networks, video surveillance systems and the Internet of Things are examples of the types of data organisations might collect.

They analyse this information to classify people into different groups or sectors. This analysis identifies correlations between different behaviours and characteristics to create profiles for individuals. This profile will be new personal data about that individual.

Organisations use profiling to:

  • find something out about individuals' preferences;
  • predict their behaviour; and/or
  • make decisions about them.

Profiling can use algorithms. An algorithm is a sequence of instructions or set of rules designed to complete a task or solve a problem. Profiling uses algorithms to find correlations between separate datasets. These algorithms can then be used to make a wide range of decisions, for example to predict behaviour or to control access to a service. Artificial intelligence (AI) systems and machine learning are increasingly used to create and apply algorithms. There is more information about algorithms, AI and machine-learning in our paper on big data, artificial intelligence, machine learning and data protection.

You are carrying out profiling if you:

  • collect and analyse personal data on a large scale, using algorithms, AI or machine-learning;
  • identify associations to build links between different behaviours and attributes;
  • create profiles that you apply to individuals; or
  • predict individuals' behaviour based on their assigned profiles.
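
The steps above can be sketched in code. This is a deliberately simplified, hypothetical illustration — the field names, rules and thresholds are all invented; real profiling systems use far richer data and statistical or machine-learning models rather than hand-written rules:

```python
# Toy illustration of profiling: assign an individual to a profile
# based on observed behavioural attributes, then use that profile
# (not the individual) to make a prediction. All fields, rules and
# thresholds here are invented for illustration.

def build_profile(person: dict) -> str:
    """Classify a person into a coarse profile from behaviour data."""
    if person["page_views_per_week"] > 50 and person["purchases_per_month"] > 4:
        return "frequent-buyer"
    if person["page_views_per_week"] > 50:
        return "browser"
    return "occasional-visitor"

def predict_interest(profile: str) -> bool:
    """Predict behaviour based on the assigned profile."""
    return profile in ("frequent-buyer", "browser")

alice = {"page_views_per_week": 80, "purchases_per_month": 6}
print(build_profile(alice))                     # frequent-buyer
print(predict_interest(build_profile(alice)))   # True
```

Note that the profile itself ("frequent-buyer") is new personal data created about the individual, which is why the UK GDPR principles apply to it.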

Although many people think of marketing as being the most common reason for profiling, this is not the only application.


Example

Profiling is used in some medical treatments, by applying machine learning to predict patients' health or the likelihood of a treatment being successful for a particular patient based on certain group characteristics.

Less obvious forms of profiling involve drawing inferences from apparently unrelated aspects of individuals' behaviour.

Example

Using social media posts to analyse the personalities of car drivers: an algorithm analyses words and phrases which suggest 'safe' and 'unsafe' driving in order to assign a risk level to an individual and set their insurance premium accordingly.

What is automated decision-making?

Automated decision-making is the process of making a decision by automated means without any human involvement. These decisions can be based on factual data, as well as on digitally created profiles or inferred data. Examples of this include:

  • an online decision to award a loan; and
  • an aptitude test used for recruitment which uses pre-programmed algorithms and criteria.

Automated decision-making often involves profiling, but it does not have to.

Example

An examination board uses an automated system to mark multiple choice exam answer sheets. The system is pre-programmed with the number of correct answers required to achieve pass and distinction marks. The scores are automatically attributed to the candidates based on the number of correct answers and the results are available online.

This is an automated decision-making process that doesn't involve profiling.
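
The marking logic in the example can be sketched as follows (the pass and distinction thresholds are invented for illustration, not taken from the example):

```python
# Toy automated decision without profiling: a pre-programmed marking
# scheme maps a count of correct answers to a result. No personal
# characteristics are evaluated, only the answer count. The
# thresholds are invented for illustration.

PASS_MARK = 40         # correct answers needed for a pass
DISTINCTION_MARK = 70  # correct answers needed for a distinction

def grade(correct_answers: int) -> str:
    """Return the result attributed automatically to a candidate."""
    if correct_answers >= DISTINCTION_MARK:
        return "distinction"
    if correct_answers >= PASS_MARK:
        return "pass"
    return "fail"

print(grade(75))  # distinction
print(grade(50))  # pass
print(grade(12))  # fail
```

Because the decision depends only on the answer count and never evaluates personal aspects of the candidate, it is automated decision-making without profiling.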

What are the benefits of profiling and automated decision-making?

Profiling and automated decision making can be very useful for organisations and also benefit individuals in many sectors, including healthcare, education, financial services and marketing. They can lead to quicker and more consistent decisions, particularly in cases where a very large volume of data needs to be analysed and decisions made very quickly.

What are the risks?

Although these techniques can be useful, there are potential risks:

  • Profiling is often invisible to individuals.
  • People might not expect their personal information to be used in this way.
  • People might not understand how the process works or how it can affect them.
  • The decisions taken may lead to significant adverse effects for some people.

Just because analysis of the data finds a correlation doesn't mean that this is significant. As the process can only make an assumption about someone's behaviour or characteristics, there will always be a margin of error and a balancing exercise is needed to weigh up the risks of using the results. The UK GDPR provisions are designed to address these risks.
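
One concrete way organisations can check for this kind of risk — a hypothetical sketch, not a method prescribed by the guidance — is to measure a system's error rate separately for different groups before relying on its decisions, since a large gap can signal bias or an unreliable correlation:

```python
# Hypothetical accuracy/bias check: compare a system's error rate
# across groups before relying on its automated decisions.
# All prediction/outcome data here is invented for illustration.

def error_rate(predictions, actuals):
    """Fraction of decisions where the prediction was wrong."""
    wrong = sum(1 for p, a in zip(predictions, actuals) if p != a)
    return wrong / len(predictions)

# Predicted vs actual outcomes for two (invented) groups.
group_a_pred = [1, 1, 0, 1, 0, 1, 1, 0]
group_a_true = [1, 1, 0, 1, 0, 1, 0, 0]
group_b_pred = [1, 0, 0, 0, 1, 0, 0, 0]
group_b_true = [1, 1, 1, 0, 1, 1, 0, 0]

rate_a = error_rate(group_a_pred, group_a_true)  # 0.125
rate_b = error_rate(group_b_pred, group_b_true)  # 0.375

# A large gap between groups suggests the system needs review.
print(abs(rate_a - rate_b) > 0.1)  # True
```

Feeding the results of checks like this back into the design process is one way to meet the expectation of regular accuracy and bias reviews.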

Further Reading


The European Data Protection Board (EDPB), which has replaced the Article 29 Working Party (WP29), includes representatives from the data protection authorities of each EU member state. It adopts guidelines for complying with the requirements of the GDPR. EDPB guidelines are no longer directly relevant to the UK regime and are not binding under it. However, they may still provide helpful guidance on certain issues.

WP29 adopted guidelines on automated individual decision-making and profiling – Chapter II, which have been endorsed by the EDPB.




