The GDPR has been in force for less than two years, but we are already observing its effect on data subjects, data controllers/processors, and data privacy regulators. A huge, complex set of societal regulations is bound to have problematic passages and unintended consequences. So it seems time to begin considering how to revise the law for efficiency, effectiveness, and sound policy.

In November 2019, German data privacy authorities released a report noting the problems they encountered implementing the GDPR and proposing changes to the law. The Report could be the first formal proposal in an effort to amend EU privacy law.

The Report on Experience Gained in the Implementation of the GDPR [link to English translation] is a product of both the Federal and State Data Protection Supervisory Authorities in Germany, known as the Datenschutzkonferenz or DSK. The report is a wish list for privacy law revisions by one set of that law’s primary enforcers.

Moving past the predictable whining about lack of resources to handle the increased traffic of disclosures and complaints, not to mention the resultant lack of staff to conduct spot inspections to see whether businesses are truly GDPR compliant, German authorities note both 1) practical changes that would help enforce the law and 2) concerns about companies getting away with marketing tactics that the authorities find unsavory. They lament the perception that companies know data privacy regulators are overwhelmed and are taking advantage of the situation by shirking compliance obligations.

Without the authority and influence of the German privacy regulatory bureaucracy, I still have a wish list of changes that would make the GDPR more effective, simpler to comply with, and easier to enforce. My wishes address some of the same problems as the German report, and with one exception my proposed solutions are different.

Let’s start with the exception. The report notes the avalanche of needless data incident reports made by companies afraid of fines for not reporting suspicions in less than three days. As any US state attorney general could have told them years ago, forcing companies to report data incidents within 72 hours with no “risk” exception leads to excessive reporting, wasting regulators’ time. German authorities propose that reporting not be required for data incidents likely to pose a low risk to the “rights and freedoms” of data subjects. This is an excellent proposal, and I include it in my wish list as well.

I would add my wish that the 72-hour reporting period be measured NOT from the first knowledge of a prospective data incident – which often results in filing formal reports before anyone understands what actually happened – but from the time that the data controller knows which data was likely at risk. Those may be the same moment, but usually they are not. Often it takes time after learning of a breach or a mistake to determine whether any data was exposed, and if so, which data. If the EU grants me this change – and the change really would benefit everyone – I would even be comfortable with a regulatory assumption that controllers must presume, as a matter of law, that all the data that could have been exposed in the incident was in fact exposed.

German authorities complain about enforcement of the purpose limitation principle (Article 5(1)(b)), among other things, fretting that the scientific research exception is too broad. Followers of this blog know that I have discussed the scientific exception before. My wish would be for the scientific research exception to be clarified and expanded so that universities and hospitals will not fear violating privacy law simply by following appropriate scientific procedures. The EU’s demands for specificity in consent, if strictly followed, make blind trials impossible.

Further, the legal basis for using data is confusing and muddled where clinical research is involved. For these reasons, and for the sake of the intellectual and economic interests of the European Union, I wish for a standard, agreed-upon consent form that researchers can use with their subjects, plus a special, specific “legal basis” for using data in clinical trials that acknowledges the need for information to be ethically withheld from subjects and that recognizes the accepted afterlife of clinical trial information in publications for the advancement of science. (It also wouldn’t hurt to have clarification that clinical test subjects, once they have agreed to be in a trial and been given adequate notice of what that trial entails, should not be able to remove their names and data from the scientific proceedings.)

The report contains a provision requesting a fourth category of regulated entities in the data chain: the producer. A producer would be a hardware or software company that is not currently regulated by EU privacy rules in its development, manufacturing, and sales roles. The DSK wants these roles regulated and held subject to the GDPR. While “Privacy by Design” principles make sense in a world where data is becoming more deeply regulated every day, and while building data security into products is a laudable goal, this proposal, without a long waiting period prior to enforcement and a set of exceptions and exemptions, would cause chaos in the marketplace.

Every new car, every IoT device, every airplane, every smart doorbell would be subject to regulation and litigious second-guessing over how much security its maker should have built in. This gives EU privacy regulators veto power over manufacturing spending decisions and product development, crippling the emerging market for smart items. While I am sure the German authorities would enjoy exercising power over every manufacturing company selling goods in Europe, this is a lousy and entirely impractical idea.

The German authority is deeply hostile to the concept of profiling, although the practice can describe a wide variety of activities, from guessing what kinds of books a person likes in order to suggest more relevant purchase options, to preparing comprehensive files containing both public and private data used for multiple purposes. It would seem to me that the latter example could be troubling, while the former is much less so. The Report proposes that:

“The only possible legal bases for profiling should be – apart from having a specific legal basis – consent or a contract. That would ensure that profiling is only possible if the data subject is aware of it and has consented to it.”

The DSK also makes clear that specific legal bases for profiling should be limited and that “There can . . . be no substitute for a statutory regulation.” My wish, if the EU legislators consider this proposal, is that they do not paint with too broad a brush. A personal profile may be built on a few items of information, or tens of thousands of items. The profile may identify a person or simply follow a device ID within one website. Not all of these functions demand consent, and if they all did, people’s internet experiences would be decidedly poorer without much, if any, benefit to the data subject.

This report begins a necessary process – public discussion of how the GDPR is meeting its goals and how it can be revised for the public good. While I may not agree with all the proposals, the exercise is worth undertaking, and I salute this Report for raising important issues, even if I rarely agree with its proposed solutions.