
Processing sensitive personal data

The new, harmonised EU General Data Protection Regulation came into force on May 24, 2016 and applies from May 25, 2018 without any further transition period. All companies operating in the EU, regardless of where they are headquartered, are already subject to the regulation; the corresponding legislative work in Switzerland is already in progress. Reason enough for us to take a closer look at the General Data Protection Regulation in a series of articles.

Particularly sensitive personal data

Data is considered THE resource of the future, the crude oil of the digital age, so to speak. We are leaving more and more decisions in our lives to algorithms, and this is usually to our great advantage. We are happy to have our navigation device record and transmit our location and speed in the car if this allows the route guidance to take us even more reliably past traffic jams and construction sites to our destination. Article 9 of the General Data Protection Regulation prohibits the processing of information considered to be particularly critical, but grants some exceptions. This sensitive information is referred to as special categories of personal data. The following fall into this category:

  • racial and ethnic origin
  • political opinions
  • religious and ideological convictions
  • trade union membership
  • genetic data
  • biometric data for uniquely identifying a person
  • health data
  • data concerning sex life or sexual orientation

The last point is likely to result in civil status becoming a special category of data in some countries if a legal distinction is made, for example, between marriage and same-sex partnership. In most cases - but not necessarily always - this restriction on use can be lifted with the express consent of the data subject. Where this is not possible in an individual case, there are some exceptions for very specific purposes that may be applied provided certain safeguards are in place. Use in the context of health care is one example.

Is the selective protection of individual features still up to date?

From a legal point of view, it may appear that particularly sensitive information can be identified and its processing prohibited relatively easily on the basis of the criteria mentioned above. But given today's technical possibilities, classifying individual data attributes as harmless or sensitive falls short in practice. We constantly share a lot of personal data unconsciously and indirectly. In a Stanford University study, for example, an algorithm was able to assess a person's five basic personality factors better than that person's work colleagues based on just 10 Facebook likes. If a profile had 70 likes, the algorithm already outperformed good friends and roommates. On average, 150 likes were enough to beat the family, and to catch up with the spouses the algorithm needed 300 likes. Another algorithm determined the sexual orientation of men with 88% accuracy based solely on their position in a Facebook friendship network, and determined with 85% accuracy whether the Americans involved in the study vote Democrat or Republican. These examples show that, when it comes to data protection, it is no longer primarily the stored and processed data that needs to be examined in a system today, but above all the algorithms applied to it. In this respect, data protection is certainly more topical than ever, but as a term it conveys an increasingly antiquated picture, because it is becoming ever more apparent that the focus is less on the processed data itself than on what is done with it. The General Data Protection Regulation addresses this aspect: these algorithms - very useful in practice but feared by data protection advocates - are summarized under the term profiling. Profiling is defined in Article 4 as follows: “'Profiling' means any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”
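
To make the mechanics of such profiling tangible, here is a minimal, purely illustrative sketch in Python: a plain logistic regression that predicts a made-up trait from which pages a person has liked. The data, the trait and the model choice are our own assumptions and not the actual method of the study cited above.

# Minimal sketch of like-based profiling (illustrative only, not the
# Stanford study's actual model): predict a single made-up trait from
# which pages a person has liked, using a plain logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: one row per person, one column per page,
# 1 = liked the page, 0 = did not. The labels are a fictitious binary trait.
likes = np.array([
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
])
trait = np.array([1, 0, 1, 0])  # e.g. "extraverted: yes/no"

model = LogisticRegression()
model.fit(likes, trait)

# A new person's likes are enough to produce a probability for the trait,
# i.e. a piece of derived - and possibly sensitive - personal data.
new_person = np.array([[1, 0, 1, 1, 0]])
print(model.predict_proba(new_person)[0, 1])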

Profiling requirements that are difficult to meet

First, profiling is subject to special provisions regarding the right to information and the right to object. In particular, there are certain information requirements at various points in the process that must not be forgotten. This appears feasible. Secondly, a review and correction may be required at any time. This is much more complicated, because it is in the nature of modern machine-learning algorithms that, much like our biological thought processes, they often cannot clearly state why exactly they came to their conclusion. A person would put it this way: “It felt right.” We must come to terms with the uncomfortable idea that even computers today reach their conclusions with a good pinch of intuition. However, this intuition is so good and accurate that it is worth relying on the algorithms. The most restrictive provision, though, is Article 22 (1): “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.”
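
In practice, reviewing and correcting a decision presupposes being able to show a person which inputs drove it. For simple, linear scoring models at least, such a rudimentary “why” can still be reconstructed per decision; the sketch below uses entirely made-up weights and features to illustrate this. For the modern machine-learning models mentioned above, no comparably clean breakdown exists, which is exactly the difficulty described here.

# Rudimentary per-decision explanation for a linear scoring model
# (all weights, features and numbers are made up for illustration).
weights = {"age": -0.02, "income": 0.00003, "previous_claims": -0.4}
bias = 0.5

applicant = {"age": 45, "income": 60000, "previous_claims": 2}

# Each feature's contribution to the score can be listed individually,
# which gives a human reviewer a starting point for review and correction.
contributions = {name: weights[name] * applicant[name] for name in weights}
score = bias + sum(contributions.values())

print(f"score = {score:.2f}")
for name, value in sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True):
    print(f"  {name}: {value:+.2f}")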

The point is therefore that, explicitly, no fully automated decisions may be made. The idea behind this is that any profiling process can only take a limited context of information into account, which may not include all relevant facts. The authors therefore hope that a human reviewer may know and take further information into account. To what extent this hope is justified, and in which cases, we will leave open at this stage. But it can probably be doubted that an average insurance clerk, for example, has significantly more information at hand when reviewing the hundredth medical claim of the day.
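
In practice this means that an automated score may at most be a proposal; the decision that takes legal effect must pass through a human. The following human-in-the-loop gate is our own sketch of one possible design, not a procedure prescribed by the Regulation, and all class and field names are hypothetical.

# Illustrative human-in-the-loop gate (our own sketch, not prescribed by
# the GDPR): the profiling model only proposes a decision; nothing takes
# legal effect until a human reviewer has confirmed or overridden it.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Case:
    case_id: str
    score: float                          # output of the profiling model
    human_decision: Optional[str] = None  # "approve"/"reject", set by a person

def propose_decision(case: Case) -> str:
    """Automated proposal only - never the final, legally binding decision."""
    return "approve" if case.score >= 0.5 else "reject"

def final_decision(case: Case) -> str:
    proposal = propose_decision(case)
    if case.human_decision is None:
        raise RuntimeError(
            f"Case {case.case_id}: proposal '{proposal}' still needs human "
            "review before a decision with legal effect is made (cf. Art. 22)."
        )
    return case.human_decision

case = Case(case_id="2018-0042", score=0.83)
case.human_decision = "approve"  # entered by the clerk after reviewing the case
print(final_decision(case))      # -> approve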

Contestability of decisions as a solution

Profiling and automated decisions in individual cases are permitted if they are necessary to fulfill a contract with the data subject or if the data subject has given their express consent. First and foremost, it must be ensured that the person concerned can effectively challenge the automated decision. From an IT perspective, one of the more questionable aspects of the Regulation is that it rests on the increasingly outdated assumption that people are, in principle, able to make better decisions.
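
What “effectively challenge” could look like technically is not spelled out in the Regulation. One conceivable building block, sketched below under our own assumptions, is to record every automated decision together with its inputs and model version, so that an objection by the data subject can trigger a documented human re-evaluation. All names and fields are hypothetical.

# Sketch of a decision log supporting contestability (our own assumption,
# not a requirement spelled out in the GDPR): every automated decision is
# stored with its inputs so that an objection can trigger a human review.
import json
from datetime import datetime, timezone

decision_log = []  # in practice: a durable, access-controlled store

def record_decision(subject_id, inputs, outcome, model_version):
    entry = {
        "subject_id": subject_id,
        "inputs": inputs,
        "outcome": outcome,
        "model_version": model_version,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "contested": False,
        "human_review_outcome": None,
    }
    decision_log.append(entry)
    return entry

def contest_decision(entry, reviewer_outcome):
    """Called when the data subject objects; a human records the new outcome."""
    entry["contested"] = True
    entry["human_review_outcome"] = reviewer_outcome
    return entry

e = record_decision("subject-17", {"income": 52000, "claims": 3}, "reject", "v1.4")
contest_decision(e, "approve")
print(json.dumps(e, indent=2))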

“It's never going to be perfect.... But if the car is unlikely to crash in a hundred lifetimes, or a thousand lifetimes, then it is probably ok... There is some chance that anytime a human driver gets in the car that they will have an accident that is their fault. It's never zero.... The key threshold for autonomy is: How much better does autonomy need to be than a person before you can rely on it.” -- Elon Musk, TED 2017

Creating universal artificial intelligence that performs better than a human in every situation is probably still a long way off. On the other hand, it is very possible today to develop artificial intelligence that surpasses humans in a few specific tasks. The question today is therefore rather how much better a computer must be at driving a car, for example, before it is perceived as equal to a human in terms of safety.

“We operate internally with the assumption that an autonomous car needs to be 100 times better than a human.” -- Axel Nix, senior engineer in Harman International's autonomous vehicle team, The Observer

The combination of profiling and particularly sensitive data categories is especially demanding under the General Data Protection Regulation. Additional, sometimes massive restrictions lurk here. However, an in-depth discussion is only meaningful in the context of very specific individual cases, which is why we refrain from it here. In any case, a thorough clarification is absolutely necessary. A solution can nevertheless almost always be found for legitimate interests; it simply requires careful design and possibly an iterative approach to finding solutions, with regular coordination with data protection. As a small consolation, it can be said that once you have overcome these hurdles, you have gained a head start over the competition that cannot be caught up without considerable effort.

Our series of articles on the subject

About the author

Stefan Haller is an IT expert specializing in risk management, information security and data protection at linkyard. He supports companies and authorities with risk analyses in projects, the design and implementation of compliance requirements in software solutions, and the preparation of IT security and authorization concepts. He is certified in risk management and, as an internal auditor for over 10 years, has carried out numerous security audits based on the ISO 27001 standard. Do you have any questions about implementation in your company? stefan.haller@linkyard.ch | +41 78 746 51 16