WS27

WORKSHOP SUMMARY

25 FEBRUARY 2019

The Trouble with Article 25 (and How to Fix It): The Future of Data Protection by Design and Default


Summary by Alessandra Calvi, Brussels Privacy Hub, VUB



On 25 February 2019, the Brussels Privacy Hub hosted Ira Rubinstein (New York University School of Law), who presented a paper in progress, “The Trouble with Article 25 (and How to Fix It): The Future of Data Protection by Design and Default”, offering his perspective as a US scholar on Article 25 of the General Data Protection Regulation (GDPR). He debated his ideas with Dr. Hielke Hijmans (Brussels Privacy Hub), who also chaired the discussion, and with the audience.


Hielke Hijmans opened the event, introducing the speaker and providing an overview of Article 25. Hijmans acknowledged the ambiguity of Article 25 but reiterated its importance for the accountability of data controllers. He argued that technological solutions should help controllers comply with legal rules and increase trust in them. Nevertheless, he admitted that accountability has an ethical dimension, going beyond strict legal compliance, whose relationship with Article 25 remains undefined. He pinpointed the four key elements of Article 25: first, the multifactor test, pursuant to which the design should be based on four basic requirements: the state of the art, costs, the nature of processing and risks; second, the need to comply with privacy by design at two stages: when the means of processing are determined and at the time of the processing itself; third, the (little) guidance for action the article provides, since the only technical and organisational measure it suggests is pseudonymisation; and fourth, the goals privacy by design should achieve, namely to implement data protection principles, to integrate safeguards into the processing so as to meet the requirements of the GDPR, and to protect data subject rights.
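Since pseudonymisation is the only technical measure Article 25 names explicitly, a minimal sketch may help readers picture what it involves. The Python snippet below (an illustration of the author of this summary, not drawn from the talk; the field names and placeholder key are hypothetical) replaces a direct identifier with a keyed HMAC, so that records remain linkable while re-identification requires a separately held secret:

    import hmac
    import hashlib

    # The key must be stored separately from the pseudonymised data;
    # otherwise the data would not remain merely pseudonymous.
    SECRET_KEY = b"replace-with-a-securely-stored-key"

    def pseudonymise(identifier: str) -> str:
        """Map a direct identifier to a stable pseudonym.

        The same input always yields the same pseudonym, so records can
        still be linked, but reversing the mapping requires the key.
        """
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                        hashlib.sha256).hexdigest()

    record = {"email": "alice@example.com", "purchase": "book"}
    record["email"] = pseudonymise(record["email"])
    print(record)  # e.g. {'email': '9b2f...', 'purchase': 'book'}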


Addressing the speaker, Hijmans questioned whether data protection by default should be considered a subset of data protection by design that gives preference to consent as the legitimate ground for processing, or rather an independent element. He outlined the discrepancies between the GDPR recitals and the text of Article 25 as regards the obligations imposed on designers and their enforceability. He was curious about the effect of Article 25 in strengthening trust in data controllers and processors. Finally, he emphasised the importance of the principle of data minimisation in the GDPR, asking whether its scope should cover not only the collection of data but also its use.


In his presentation, Ira Rubinstein argued that Article 25 missed the opportunity to reconcile regulators and engineers by failing to introduce a clear and definitive obligation to adopt Privacy Enhancing Technologies (PETs) and Privacy Engineering (PE). He clarified that he had focused his analysis on data protection by design, rather than by default, because of its broader scope and its independence from consent. Rubinstein defined PETs as tools built upon cryptographic protocols that can be deployed by users, data controllers or both parties collaboratively. He described PE as the integration of privacy requirements into system engineering activities, based on four steps: first, an analysis of privacy and functional requirements; second, the development of a design that satisfies those requirements; third, the implementation of the design; and fourth, testing to verify that the requirements are effectively met in the implementation.
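As a rough illustration of the fourth step, a privacy requirement can often be expressed as an automated test. The hypothetical Python example below, again not from the talk, checks that a storage routine really drops direct identifiers before a record is persisted:

    import unittest

    FORBIDDEN_FIELDS = {"email", "name", "phone"}  # illustrative requirement

    def store_record(record: dict) -> dict:
        """Hypothetical storage routine meant to drop raw identifiers."""
        return {k: v for k, v in record.items() if k not in FORBIDDEN_FIELDS}

    class PrivacyRequirementTest(unittest.TestCase):
        def test_no_raw_identifiers_are_stored(self):
            stored = store_record({"email": "alice@example.com",
                                   "purchase": "book"})
            for field in FORBIDDEN_FIELDS:
                self.assertNotIn(field, stored)

    if __name__ == "__main__":
        unittest.main()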


Rubinstein warned against the tendency to overlook the relationship between trust and ethics, arguing that the notion of trust should be interpreted in a technical sense, as vulnerability to threats. Furthermore, he provided a road map of PETs and PE. For PETs, he distinguished between “hard PETs” and “soft PETs”. The former presuppose distrust in data controllers, are based on data minimisation, and aim to reduce any disclosure of personal data by means of cryptographic protocols. The latter imply a trustworthy data controller, are based on data management, and aim to help users make wise choices via cookie management, privacy dashboards, ad icons, etc. For PE, he introduced the distinction between “privacy by architecture” solutions, which strive to minimise the collection of personal data mainly by technical means (e.g. anonymisation, client-side data storage and processing, use of “hard PETs”), and “privacy by policy” solutions, which focus on the implementation of data management notices and on “soft PETs”.
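The contrast between the two design philosophies can be made concrete with a small sketch. In the hypothetical Python example below (the summariser's own illustration, not from the talk), a “privacy by architecture” client derives only the coarse attribute a service actually needs, so the raw birthdate never leaves the user's device; a “privacy by policy” design would instead transmit the full birthdate and rely on notices:

    from datetime import date

    def age_bracket(birthdate: date) -> str:
        """Derive a coarse age bracket client-side; the raw birthdate
        is never sent to the server (data minimisation by design)."""
        today = date.today()
        age = today.year - birthdate.year - (
            (today.month, today.day) < (birthdate.month, birthdate.day)
        )
        if age < 18:
            return "under-18"
        if age < 40:
            return "18-39"
        return "40+"

    # Only the minimised attribute is included in the request payload.
    payload = {"age_bracket": age_bracket(date(1990, 5, 1))}
    print(payload)  # {'age_bracket': '18-39'}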


He explained that developing a PETs- and PE-based design scheme for privacy controls would require setting the requirements through an analysis based on legal, ethical and social norms; designing once the requirements are set; identifying the privacy controls that address those requirements; taking a technical decision on the application of PETs; and taking an architectural decision on the application of Privacy Enhancing Architectures. Rubinstein highlighted a tension in the GDPR, left unresolved by Article 25, between principles that presuppose a trusted controller (such as transparency and consent), for which the application of “soft PETs” would suffice, and those that treat controllers with distrust, such as data minimisation, for which “hard PETs” would be necessary. Using a case study on targeted advertising, in which one ad tech firm adopted a “privacy by policy” solution without changing its pre-GDPR business model and the other a “privacy by architecture” approach, Rubinstein showed how Article 25 fails to give a clear answer as to which of the two firms complies with it, illustrating the aforementioned tension in the GDPR.


Finally, Rubinstein outlined the five main problem areas of Article 25: first, its overlap with other GDPR provisions that call for the adoption of technical and organisational measures, which leads to confusion and risks diluting the importance of technical measures in favour of organisational ones, which are less costly and time-consuming for companies; second, the failure to define the scope of the article in relation to the objectives pursued; third, its alleged technological neutrality, in the name of which Article 25 neglects to provide guidance on the “technical and organisational measures”, the reference to pseudonymisation being insufficient; fourth, the vagueness of the wording “appropriate” and “in an effective manner”, which makes it impossible to tell which of the two approaches, “privacy by policy” or “privacy by architecture”, should be preferred; and, finally, the failure to establish a baseline for the quality of privacy controls, since the article does not specify what “state of the art” means.


Rubinstein then suggested possible solutions to address these shortcomings. He called for a revision of the GDPR to favour “hard PETs” and PE. He proposed using product liability criteria or ENISA’s work on PETs maturity to better define the “state of the art”. He urged the EDPS and the DPAs to encourage the adoption of “hard PETs” and PE, especially in the public sector, since public bodies are not under the pressure of market demands. This could include, for example, using “hard PETs” and PE in ‘smart’ projects (e.g. smart meters, smart tolls, smart cars, etc.); developing catalogues of Best Available Technologies (BATs); investing in PET maturity assessments and repositories; and issuing more ambitious opinions on available technologies. Further, he recommended looking for opportunities to adjust penalties so as to reward “hard PETs” and PE solutions and to penalise reliance on “soft PETs” and PE that disregard data minimisation, though he admitted that the possibility of issuing sanctions against designers is not straightforward under the GDPR.


A lively Q&A session with the audience followed. Hijmans queried the speaker about the relationship between Privacy by Design, the Data Protection Impact Assessment (DPIA) and accountability. In this respect, Rubinstein warned against the tendency to reduce Privacy by Design to a DPIA, arguing that it should cover all the design steps. The discussion with the attendees focused on the effectiveness of the solutions proposed by the speaker, and offered by the GDPR itself, to fill the gaps in Article 25. It was agreed that clear quality requirements for the “state of the art” should be specified. It was pointed out that certification may be lengthy and costly for businesses, and that such a system struggles to keep pace with the rapid development of technology. The idea of letting industry self-regulate was raised, but its effectiveness was questioned. The limited impact of codes of conduct, most of the time too policy-oriented, was also criticised.


Hijmans concluded the debate by expressing his appreciation for this productive and educational session, confident that the discussion would foster further reflection.


Connect with us


Brussels Privacy Hub

Law Science Technology & Society (LSTS)

Vrije Universiteit Brussel

Pleinlaan 2 • 1050 Brussels

Belgium

info@brusselsprivacyhub.eu

@privacyhub_bru


Copyright © Brussels Privacy Hub