Today – on European Data Protection Day – the 2021 Dutch Privacy Awards were handed out during the Dutch National Privacy Conference, a joint initiative by Privacy First and the Dutch Platform for the Information Society (ECP). These Awards provide a platform for companies and governments that see privacy as an opportunity to distinguish themselves positively and to make privacy-friendly entrepreneurship and innovation the norm. The winners of the Dutch Privacy Awards 2021 are STER, NLdigital, Schluss, FCInet and the Dutch Ministry of Justice and Security.
Advertising without storage of personal data, contextual targeting: proven effectiveness
The Dutch Stichting Ether Reclame (Ether Advertising Foundation), better known as STER, was one of the first organizations in the Netherlands to abandon the common model of offering advertisements based on information collected via cookies. STER has developed a procedure that uses only the content of the webpages visited. No personal data (such as browser version, IP address or click-through behaviour) are collected at all. Advertisers submit their advertisements to STER, which places them on the website in conformity with a protocol STER developed, based on a number of simple categories. These categories are linked to the content that is shown, such as a TV program that someone has selected. The protocol has been built up and refined over time and now works reliably.
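The category-based approach described above can be illustrated with a minimal sketch. This is not STER's actual protocol: the category names, ads and function are purely hypothetical, and only serve to show that ad selection can depend on page content alone, with no user data involved.

```python
# Advertisers register ads against content categories, not user profiles.
# Categories and ads below are illustrative, not STER's real inventory.
AD_INVENTORY = {
    "sports": ["sneaker ad", "energy drink ad"],
    "cooking": ["kitchenware ad", "grocery ad"],
    "news": ["newspaper subscription ad"],
}

def select_ads(page_category: str) -> list[str]:
    """Pick ads using only the page's content category.

    No cookies, IP addresses or click-through history are consulted:
    the same page shows the same ads to every visitor.
    """
    return AD_INVENTORY.get(page_category, AD_INVENTORY["news"])

print(select_ads("sports"))  # selection depends solely on page context
```

Because the lookup key is a property of the page rather than of the visitor, no personal data ever enters the selection step, which is the essence of contextual targeting.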
In this way, STER kills several birds with one stone. Most importantly, initial applications show that this approach is at least as effective for advertisers as the old cookie-based way. Secondly, the approach removes parties from the chain. Data brokers who played a role in the old system are now superfluous. Apart from the financial gain for the chain, this also prevents data coming into the possession of parties the data should not end up with. And thirdly, STER stays in control of its own advertising campaigns.
This makes STER a deserved winner of the Dutch Privacy Awards. The concept developed is innovative and helps to protect the privacy of citizens without them having to make any effort. STER is also investigating the possibility of using the approach more broadly. This too is an innovation that the expert panel applauds.
In that sense STER’s approach is also a well-founded response to the data-driven superpowers on the market as it demonstrates that the endless collection of personal data is not at all necessary to get your message across, whether it is commercial or idealistic.
STER could perhaps also have been submitted as a Business-to-Business entry, but the direct interests of consumers meant that it was listed in the category of consumer solutions.
Organisational innovation and practical application: Data Pro Code
Entries for the Dutch Privacy Awards often relate to technical innovations. At NLdigital it is not the technology but the approach that is innovative. NLdigital has given concrete meaning to GDPR obligations through agreements, focusing mainly on data processors rather than on controllers. This enables processors to conclude agreements more quickly, practically and with sufficient care – agreements which are also verifiable in this regard. Many companies provide services by making applications available, which involves data processing. That requires processing agreements, which not every organization finds easy to draw up. Filling in the corresponding statement results in an appropriate processing agreement for clients.
NLdigital’s code of conduct called Data Pro Code is a practical instrument tailor made for the target group: IT companies that process data on behalf of others. With the help of (600) participants/members, the Code is drawn up as an elaboration of Art. 28 of the GDPR. It has been approved by the Dutch Data Protection Authority and has led to a publicly accessible certification.
Winner: FCInet & Ministry of Justice and Security
Ma³tch, privacy on the government agenda: innovative data minimization
FCInet is innovative, privacy-enhancing technology that was developed by the Dutch Ministry of Justice and Security and the Dutch Ministry of Finance. It is meant to assist in the fight against (international) crime. Part of FCInet is Ma³tch, which stands for Autonomous Anonymous Analysis. With this feature the Financial Criminal Investigation Services (FCIS) can share secure and pseudonymized datasets on a national level (for example with the Financial Intelligence Unit-Netherlands and the Fiscal Information and Investigation Service), but also internationally. Ma³tch is a technology that supports and obliges the parties concerned to make careful considerations per data field: which data do they want to compare, and under which conditions? This ensures that parties can set up the infrastructure in such a way that it can be technically enforced that data are exchanged only on a legitimate basis.
Through hashing, organization A encrypts (bundles of) personal data in such a way that receiving party B can check whether a person known to organization B is also known to organization A. Only if there is a match (because organization B's own list of known persons, in hashed form, is checked against the list it received) does the next step take place, whereby organization B actually requests information about the person concerned from organization A. The check takes place in a secure decentralized environment, so organization A does not know whether there is a hit. The technology thus prevents the unnecessary perusal of personal data in the course of comparisons.
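The hashed-matching step can be sketched in a few lines. This is only a simplified illustration of the idea, not the actual Ma³tch implementation (which uses more sophisticated anonymizing filters); the names, the plain SHA-256 hashing and the shared salt are all assumptions made for the example.

```python
import hashlib

def pseudonymize(records: set[str], salt: str) -> set[str]:
    """Hash each identifier with a shared salt so that raw personal
    data never leaves the sending organization."""
    return {hashlib.sha256((salt + r).encode()).hexdigest() for r in records}

# A salt agreed between the organizations (illustrative value).
SALT = "shared-secret"

# Organization A sends only its hashed filter, never the names themselves.
filter_from_a = pseudonymize({"alice", "bob", "carol"}, SALT)

# Organization B hashes its own records and checks them locally against
# A's filter; A never learns whether there was a hit.
org_b_records = {"bob", "dave"}
matches = {r for r in org_b_records
           if hashlib.sha256((SALT + r).encode()).hexdigest() in filter_from_a}

print(matches)  # {'bob'} — only this hit justifies a follow-up request to A
```

Because the comparison runs entirely on B's side against hashed values, only confirmed matches ever lead to an actual exchange of personal data, which mirrors the data-minimization logic described above.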
The open source code technology of FCInet offers broader possibilities for application, which is encouraged by the expert panel and was an important reason for the submission: it can be reused in many other organizations and systems. The panel therefore assessed this initiative as a good investment in privacy by the government, where, clearly, the issue of privacy really is on the agenda.
Schluss applied for the Dutch Privacy Awards in 2021 for the third time. That is not the reason for the Incentive Award, even though it may encourage others to persevere in a similar way.
The reason is that it is a very appealing initiative, focused on the self-management of personal data. In the form of an app, private users are offered a vault for their personal data, whether of a medical, financial or other nature. Users decide which people or organizations get access to their data. The idea is that others who are allowed to see the data no longer need to store these data themselves. Schluss has no insight into who uses the app; its role is only to facilitate the process. The technology, which is open source, guarantees transparency about the operation of the app.
Schluss won the prestigious Incentive Award because thus far the app has only had a beta release. However, promising projects have been started with the Volksbank and there is a pilot in collaboration with the Royal Dutch Association of Civil-law Notaries. With the mission statement (‘With Schluss, only you decide who gets to know which of your details’) in mind, Schluss chose to become a cooperative, an organizational form that appealed to the expert panel. With this national Incentive Award the panel hopes to encourage the initiators to continue along this path and to persuade parties to join forces with Schluss.
There are four categories in which the Awards are granted:
1. the category of Consumer solutions (business-to-consumer)
2. the category of Business solutions (within a company or business-to-business)
3. the category of Public services (public authority-to-citizen)
4. the incentive award for a groundbreaking technology or person.
From the various entries, the independent expert panel chose the following nominees per category (in no particular order):
Roseman Labs (Secure Multiparty Computation)
Ministry of Health (CoronaMelder)
NLdigital (Data Pro Code)
FCInet & Ministry of Justice (Ma³tch)
STER (Contextual targeting)
During the National Privacy Conference all nominees presented their projects to the audience in Award pitches. Thereafter, the Awards were handed out. Click HERE for the entire expert panel report (pdf in Dutch), which includes participation criteria and explanatory notes on all the nominees and winners.
National Privacy Conference
The Dutch National Privacy Conference is a joint initiative of Privacy First and ECP | Platform for the Information Society. Once a year, the conference brings together Dutch industry, public authorities, the academic community and civil society with the aim of building a privacy-friendly information society. The mission of both the National Privacy Conference and Privacy First is to turn the Netherlands into a guiding nation in the field of privacy. To this end, privacy by design is key.
These were the speakers during the 2021 National Privacy Conference, in order of appearance:
- Monique Verdier (vice chairwoman of the Dutch Data Protection Authority)
- Judith van Schie (Considerati)
- Erik Gerritsen (Secretary General of the Dutch Ministry of Health, Welfare and Sport)
- Mieke van Heesewijk (SIDN Fund)
- Peter Verkoulen (Dutch Blockchain Coalition)
- Paul Tang (MEP for PvdA)
- Ancilla van de Leest (Privacy First chairwoman)
- Chris van Dam (Member of the Dutch House of Representatives for CDA)
- Evelyn Austin (director of Bits of Freedom)
- Wilmar Hendriks (chairman of the expert panel of the Dutch Privacy Awards).
The entire conference was livestreamed from Nieuwspoort in The Hague: see https://www.nieuwspoort.nl/agenda/overzicht/privacy-conferentie-2021/stream and https://youtu.be/asEX1jy4Tv0.
Dutch Privacy Awards expert panel
The independent expert Award panel consists of privacy experts from different fields:
- Wilmar Hendriks, founder of Control Privacy and member of the Privacy First advisory board (panel chairman)
- Ancilla van de Leest, Privacy First chairwoman
- Paul Korremans, partner at Comfort Information Architects and Privacy First board member
- Marc van Lieshout, managing director at iHub, Radboud University Nijmegen
- Alex Commandeur, senior advisor BMC Advies
- Melanie Rieback, CEO and co-founder of Radically Open Security
- Nico Mookhoek, privacy lawyer and founder of DePrivacyGuru
- Rion Rijker, privacy and data protection expert, IT lawyer and partner at Fresa Consulting.
In order to ensure that the Award process is run objectively, no panel member may judge an entry from his or her own organization.
In collaboration with the Dutch Platform for the Information Society (ECP), Privacy First organizes the Dutch Privacy Awards with the support of the Democracy & Media Foundation and The Privacy Factory.
Pre-registrations for the 2022 Dutch Privacy Awards are welcome!
Would you like to become a sponsor of the Dutch Privacy Awards? Please contact Privacy First!
It is with great concern that Privacy First has taken note of the Dutch draft bill on COVID-19 test certificates. Under this bill, a negative COVID-19 test certificate will become mandatory for access to sporting and youth activities, all sorts of events and public places, including bars and restaurants and cultural and higher education institutions. Those who have no such certificate risk high fines. This will put pressure on everyone's right to privacy.
Serious violation of fundamental rights
The draft bill severely infringes numerous fundamental and human rights, including the right to privacy, physical integrity and freedom of movement in combination with other relevant human rights such as the right to participate in cultural life, the right to education and various children’s rights such as the right to recreation. Any curtailment of these rights must be strictly necessary, proportionate and effective. However, the current draft bill fails to demonstrate this, while the required necessity in the public interest is simply assumed. More privacy-friendly alternatives to reopen and normalize society do not seem to have been considered. For these reasons alone, the proposal cannot pass the human rights test and should therefore be withdrawn.
The proposal also violates the general prohibition of discrimination, as it introduces a broad social distinction based on medical status. This puts pressure on social life and may lead to large-scale inequality, stigmatization, social segregation and even social tensions, as large groups in society will be unwilling or unable (for various reasons) to get tested, or will not do so systematically. During the recent Dutch National Privacy Conference organized by Privacy First and the Platform for the Information Society (ECP), it already became clear that the introduction of a mandatory ‘corona passport’ could have a socially disruptive effect. On that occasion the Dutch Data Protection Authority, among others, took a strong stand against it. Such social risks apply all the more strongly to the indirect vaccination obligation that follows from the corona test certificate. In this regard, Privacy First recalls that both the Dutch House of Representatives and the Parliamentary Assembly of the Council of Europe have recently expressed their opposition to a direct or indirect vaccination requirement. In addition, the draft bill under consideration could set precedents for other medical conditions and other sectors of society, putting pressure on a much broader range of socio-economic rights. For all of these reasons, Privacy First strongly recommends that the Dutch government withdraw this draft bill.
Multiple privacy violations
Moreover, from the perspective of the right to privacy, a number of specific objections and questions apply. First of all, the draft bill introduces a mandatory ‘proof of health’ for participation in a large part of social life, in flagrant violation of the right to privacy and the protection of personal data. The draft bill also introduces an identification requirement at the entrance of public places, in violation of the right to anonymity in public spaces. Furthermore, the bill results in the inconsistent application of existing legislation to one and the same act, namely testing: with far-reaching consequences on the one hand for a precious achievement like medical confidentiality and citizens’ trust in that confidentiality, and on the other hand for the practical implementation of retention periods, even though the processing of the test result does not change. After all, it is not the result of the test that should determine whether the file falls under the Dutch Medical Treatment Contracts Act (WGBO, which entails medical secrecy and a retention period of 20 years) or under the Public Health Act (with a retention period of five years), but the act of testing itself. Moreover, it is unclear why the current draft bill seeks to connect to the Public Health Act and/or the WGBO if it only concerns obtaining a test certificate for the purpose of participating in society (and thus no medical treatment or public health task). Here, the only possible basis for processing and for breaching medical confidentiality should be consent. In this case, however, there cannot be the legally required freely given consent, since testing will be a compelling condition for participation in society.
Privacy requires clarity
Many other issues are still unclear: which data will be stored, where, by whom, and which data may possibly be exchanged? To what extent will there be personal localization and identification as opposed to occasional verification and authentication? Why may test results be kept for an unnecessarily long time (five or even 20 years)? How great are the risks of hacking, data breaches, fraud and forgery? To what extent will there be decentralized, privacy-friendly technology, privacy by design, open source software, data minimization and anonymization? Will test certificates remain free of charge and to what extent will privacy-friendly diversity and choice in testing applications be possible? Is work already underway to introduce an ‘alternative digital carrier’ in place of the Dutch CoronaCheck app, namely a chip, with all the risks that entails? How will function creep and profiling be prevented and are there any arrangements when it comes to data protection supervision? Will non-digital, paper alternatives always remain available? What will happen to the test material taken, i.e. everyone’s DNA? And when will the corona test certificates be abolished?
As long as such concerns and questions remain unanswered, submission of this bill makes no sense at all and the corona test certificate will only lead to the destruction of social capital. Privacy First therefore reiterates its request that the current proposal be withdrawn and not submitted to Parliament. Failing this, Privacy First will reserve the right to have the matter reviewed by the courts and declared unlawful.
See the Dutch National Privacy Conference, 28 January 2021, https://youtu.be/asEX1jy4Tv0?t=9378, starting at 2h 36 min 18 sec.
See Council of Europe, Parliamentary Assembly, Resolution 2361 (2021): Covid-19 vaccines: ethical, legal and practical considerations, https://pace.coe.int/en/files/29004/html, par. 7.3.1-7.3.2: “Ensure that citizens are informed that the vaccination is NOT mandatory and that no one is politically, socially, or otherwise pressured to get themselves vaccinated, if they do not wish to do so themselves; ensure that no one is discriminated against for not having been vaccinated, due to possible health risks or not wanting to be vaccinated.” See also, for example, Dutch House of Representatives, Motion by Member Azarkan on No Corona Vaccination Obligation (28 October 2020), Parliamentary Document 25295-676, https://zoek.officielebekendmakingen.nl/kst-25295-676.html: “The House (...) pronounces that there should never be a direct or indirect coronavirus vaccination obligation in the future”; Motion by Member Azarkan on Access to Public Benefits for All Regardless of Vaccination or Testing Status (5 January 2021), Parliamentary Document 25295-864, https://zoek.officielebekendmakingen.nl/kst-25295-864.html: “The House (...) requests the government to enable access to public services for all regardless of vaccination or testing status.”
Under the Corona Pandemic Emergency Act, the Dutch government has the option to introduce all kinds of restrictive measures, including the wide-ranging and mandatory use of face masks – unless the Dutch House of Representatives rejects this measure later this week. In this context, Privacy First today has sent the following email to the House of Representatives:
Dear Members of Parliament,
On 19 November, the government submitted to you the Regulation concerning additional requirements for face masks under COVID-19. Under this regulation, wearing a face mask will become mandatory in numerous places (including shops, railway stations, airports and schools) as of 1 December 2020. This obligation can be periodically extended by the government without the consent of Parliament. Based on the Corona Pandemic Emergency Act, you currently have seven days to exercise your right of veto and prevent the entry into force of a wide-ranging face mask obligation. By 26 November at the latest, you will be able to vote on this issue and reject this measure.
The wearing of face masks has been the subject of much public debate for months. Both the government and the National Institute for Public Health and the Environment (RIVM) have repeatedly stated that wearing non-medical face masks is hardly effective in combating the coronavirus. Scientists seem to be divided on this. At the same time, wearing a face mask can also have the opposite effect, i.e. harm people's health. There is a consensus, however, that in a legal sense the compulsory use of face masks is an infringement of the right to privacy and self-determination.
This accordingly falls within the scope of Privacy First. The right to privacy is a universal human right that is protected in the Netherlands by international and European treaties and by our national Constitution. Any infringement of the right to privacy must therefore be strictly necessary, proportionate and effective. If that is not the case, it is an unjustified breach and therefore a violation of the right to privacy, both as a human right and as a constitutional right. As long as the wearing of non-medical face masks to defeat the coronavirus has not proven effective and can even have adverse health effects, there cannot be any social necessity for the introduction of a general face mask obligation. Such an obligation would thus amount to a social experiment with unforeseen consequences. This is not in keeping with a free and democratic constitutional society under the rule of law. Privacy First therefore advises you to reject the proposed regulation for the introduction of compulsory face masks, and instead to propose that face masks continue to be worn on a voluntary basis.
The Privacy First Foundation
In the fight against the coronavirus, the Dutch government this week made clear that the introduction of a curfew is imminent. Because of this, Privacy First today has sent the following appeal to the Dutch House of Representatives:
Dear Members of Parliament,
This week the Netherlands finds itself at a historic human rights crossroads: is a nation-wide curfew going to be introduced for the first time since World War II? For Privacy First such a far-reaching, generic measure would be disproportionate and far from necessary in virtually every situation. Moreover, in the fight against the coronavirus the effectiveness of such a measure remains unknown to this date. For that reason alone, there can be no legally required social necessity for a curfew. A curfew could in fact also be counterproductive, as it would harm the mental and (therefore also) physical health of large groups in society. Besides, a curfew in the Netherlands is yet another step towards a surveillance society. The use of lighter, targeted and more effective measures is always preferable. Should a curfew nonetheless be introduced, Privacy First would consider it a massive violation of the right to privacy and freedom of movement. Privacy First therefore calls on you to not let this happen and to thwart the introduction of a curfew.
The Privacy First Foundation
Update 17 February 2021: this week, in summary proceedings, the district court of The Hague handed down a ground-breaking ruling that says that the curfew was wrongly introduced under the Dutch Extraordinary Powers Act. The current Dutch curfew is therefore unlawful. Moreover, the court found that there are "major question marks regarding the factual substantiation by the State of the necessity of the curfew. (...) Before a far-reaching restriction such as a curfew is introduced, it must be clear that no other, less far-reaching measures are available and that the introduction of the curfew will actually have a substantial effect", stated the court, which was not convinced that this was the case. In addition, the court raised the question of why an urgent (but voluntary) curfew advice had not been chosen. The court also noted that "the Dutch Outbreak Management Team, according to the team itself, has no evidence that the curfew will make a substantial contribution to reducing the spread of the virus." All this "makes the State's assertion that a curfew is inevitable at least debatable and without convincing justification", the court concluded. (See judgment (in Dutch), paragraphs 4.12-4.14.)
The judgment of the district court of The Hague is in line with Privacy First’s earlier position. Privacy First hopes that this will be confirmed on appeal by the Hague Court of Appeal and that it will also lead to the rejection of the curfew by both the Dutch House of Representatives and the Senate.
A Dutch court has today handed down a judgment in preliminary injunction proceedings brought by Privacy First concerning the UBO register. The district court of The Hague confirmed that there is every reason to doubt the legality of the European anti-money laundering directives on which the UBO register is founded. On this point the judge follows the very critical opinion of the European Data Protection Supervisor. The interim proceedings court ruled that it cannot be excluded that the Court of Justice of the European Union (CJEU) will conclude that the public character of the UBO register is at odds with the proportionality principle. Questions over its legality were recently referred to the CJEU by a Luxembourg national court; the Dutch court therefore saw no need to do the same.
Privacy First had also requested a temporary deactivation of the UBO register. This, however, is a step too far for the court, which states that deactivating the register is not possible as long as the underlying EU directive is still in force: it would put the Netherlands in a position in which it operates in violation of that directive. With this claim, the judge says, Privacy First is getting ahead of itself. Privacy First will examine the ruling on this point, also with a view to a possible appeal.
‘The introduction of the UBO register would mean that privacy-sensitive data of millions of people will be up for grabs’, comments Privacy First’s attorney Otto Volgenant of Boekx Attorneys. ‘On all sides there are strong doubts whether this is actually an effective means in the fight against money laundering and terrorism. It’s like using a sledgehammer to crack a nut. The Court of Justice of the European Union will eventually adjudicate the case, and I expect it will annul the UBO register.’
At the start of this year, the Privacy First Foundation initiated fundamental legal action against the Dutch government on account of the new UBO register, which is linked to the Trade Register of the Dutch Chamber of Commerce. Under the law the UBO register is based on, all 1.5 million Dutch legal entities that are included in the Trade Register will have to make public all sorts of privacy-sensitive data about their Ultimate Beneficial Owners. This concerns personal data of millions of directors, shareholders and high executives of companies (including family businesses), foundations, associations, churches, social organizations, charities, etc. Privacy First deems that this is a massive privacy violation, one which also creates personal safety risks. That is why Privacy First has asked the court to immediately declare the UBO register unlawful. A lot of information in the register will be publicly available and can be requested by anyone. In Privacy First’s opinion this is completely disproportionate and an infringement of European privacy law. The CJEU will examine whether the European legislation on which the UBO register is based violates the fundamental right to privacy.
The ruling (in Dutch) by the interim proceedings court can be found here: http://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBDHA:2021:2457.
Update 15 April 2021: yesterday Privacy First filed an urgent appeal against the entire judgment with the Court of Appeal of The Hague. The appeal subpoena can be found HERE (pdf in Dutch). Privacy First requests the Court, inter alia, to ask preliminary questions about the UBO register to the European Court of Justice and to suspend the UBO register until these questions are answered. In view of the major interests at stake, Privacy First hopes that the Court of Appeal of The Hague will hear this case as soon as possible.
Privacy First initiates legal action against the Dutch government on account of the recently introduced UBO register. The preliminary injunction proceedings point to the invalidity of the legislation on which this register is based. The consequences of this new piece of legislation are far-reaching, as the register contains very privacy-sensitive information. Data relating to the financial situation of natural persons will be up for grabs. More than 1.5 million legal entities that are registered in the Dutch Trade Register will have to make public details about their Ultimate Beneficial Owners (UBOs). The UBO register is publicly accessible: a request for information costs €2.50.
The UBO register aims to prevent money laundering but will lead to defamation.
The privacy breach that is the result of the UBO register and the public accessibility of sensitive data are disproportionate. The goal of the register is to thwart money laundering and terrorist financing. In order to achieve this goal there is no need for a UBO register, at least not one that is publicly accessible.
That is why Privacy First wants the UBO register to be rendered inoperative by a court, which, if necessary, should submit questions of interpretation to the highest court in Europe: the European Court of Justice. In cases like these, the judiciary has the final say. It is not uncommon for a court to overrule privacy-violating legislation, and in this respect Privacy First’s litigation has been successful in the past.
The proceedings will take place before The Hague District Court on 25 February 2021 at 12pm. The entire summons can be found HERE (pdf in Dutch). The ruling will follow two or three weeks after the hearing.
Background of the UBO register case
On 24 June 2020, the Dutch ‘Implementation Act for the Registration of Ultimate Beneficial Owners of Companies and Other Legal Entities’ came into effect in the Netherlands. On the basis of this new Act, a new UBO register linked to the Trade Register of the Dutch Chamber of Commerce will contain information about all ultimate beneficial owners of companies and other legal entities founded in the Netherlands. The register should indicate what share is owned by the UBO: 25-50%, 50-75% or more than 75%. Furthermore, the name, month and year of birth as well as the nationality of the UBO will be made public, with all the privacy risks this entails.
Since 27 September 2020, newly founded entities have to register the ultimate beneficial owners in the UBO register. Existing legal entities will have to do so before 27 March 2022.
The Act provides very few possibilities to shield information: only for persons under police protection, minors and those placed under guardianship. This means that the holdings of practically every UBO will become a matter of public record. Anyone has access to the UBO register, with extracts coming at a price of €2.50.
European money laundering directive
The new Act stems from the fifth European money laundering directive, which obliges EU Member States to register UBOs and disclose their details to the public. According to the European legislator, this contributes to the proclaimed objective of countering money laundering and terrorist financing. The transparency is supposed to be a deterrent for persons who set out to launder money or finance terrorism.
Massive privacy violation and fundamental criticism
The question is whether this approach is effective and proportionate. Registering the personal data of all UBOs and making these publicly available is a generic precautionary measure. 99.99% of UBOs have nothing to do with money laundering or terrorist financing. Even if it were proportionate to collect information on all UBOs, making that information available only to government agencies engaged in combating money laundering and terrorism should suffice. It is not appropriate to disclose that information to everyone. The European Data Protection Supervisor (EDPS) deemed this privacy violation to be disproportionate. This opinion, however, did not lead to an amendment of the European directive.
When this Act was discussed in Dutch Parliament, fundamental criticism came from various corners of society. The business community made its voice heard because it perceived privacy risks and feared − and now indeed experiences − an increase in costs. UBOs of family-owned companies that have remained out of the public eye up until now are running major privacy and security risks. There was also a great deal of attention for the position of social organizations − such as church communities and NGOs − that attach great importance to the protection of those affiliated with them. Associations and foundations that do not have owners face a different burden: they have to put the data that are already in the Trade Register in yet another register. Unfortunately these complaints have not resulted in any changes to the legislation.
Legal proceedings look promising
Privacy First has initiated legal proceedings against the UBO register for violation of the fundamental right to privacy and the protection of personal data. Privacy First asks the Dutch court to render the UBO register inoperative in the short term and, if necessary, to submit questions of interpretation on this matter to the highest court in Europe, the Court of Justice of the European Union.
The Dutch Act as well as the underlying European directive are in conflict with both the EU Charter of Fundamental Rights and the GDPR. It is the legislator who has created this legislation, but it will be up to the court to review it thoroughly. Ultimately, the court has the last word. If the (European) legislator fails to take adequate account of the protection of fundamental rights, then the (European) court can invalidate this legislation. This would not be unique. The Court of Justice of the European Union has previously declared legislation invalid due to privacy violations, for example the Data Retention Directive and, more recently, the Privacy Shield. Dutch courts too regularly annul privacy-invading regulations. Privacy First has previously successfully challenged the validity of legislation, for example in the proceedings concerning the Telecommunications Data Retention Act and the System Risk Indication (SyRI). Viewed against this background, a positive outcome in the case against the UBO register is far from unlikely.
This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First has previously criticized this plan in a television broadcast by the current affairs program Nieuwsuur. Today, Privacy First sent the following letter to the House of Representatives:
Dear Members of Parliament,
With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Public Health Institute (RIVM). Privacy First advises to reject this proposal on account of the following fundamental concerns and risks:
Violation of fundamental administrative and privacy principles
- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated and there is no other country where the application of similar technologies made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.
Alternatives are less invasive: subsidiarity
- The State Secretary has failed to adequately investigate more privacy-friendly alternatives. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.
Lack of transparency and independent oversight
- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may be applicable to the proposal only partially as anonymous data and statistics are exempt from the GDPR. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.
Structural changes and chilling effect
- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.
Faulty method with a significant impact
- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.
Specific prior consent
Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.
It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.
The Privacy First Foundation
The Privacy Collective press release
Millions of Dutch internet users victim of unlawful collection and use of personal data
The Privacy Collective takes Oracle and Salesforce to Court
The Privacy Collective - a foundation that acts against violations of privacy rights - is taking Oracle and Salesforce to court. The foundation accuses the technology companies of unlawfully collecting and processing the data of millions of Dutch internet users. The foundation has launched a class action, a legal procedure in which compensation is claimed for a large group of individuals. It is the first time this legal instrument has been used in the Netherlands in a case of infringement of the General Data Protection Regulation (GDPR).
Christiaan Alberdingk Thijm, lead lawyer in the case: “This is one of the largest cases of unlawful processing of personal data in the history of the internet. Almost every Dutch individual who reads or views information online is structurally affected by the practices of Oracle and Salesforce. Practices that merely serve a commercial purpose.”
Online shadow profile
Oracle and Salesforce collect data from website visitors continuously and on a large scale. By combining this with additional information, they create a personal profile of each individual internet user. The millions of profiles are used, among other things, to offer personalized online advertisements and are unlawfully shared with numerous commercial parties, including ad-tech companies. The tech giants collect their information using - among other things - specially developed cookies. Alberdingk Thijm: “Most people do not know that they have such an online 'shadow profile'. They don't know what it looks like and have certainly not given legitimate consent.” Under the GDPR, Oracle and Salesforce are obliged to ask for consent before collecting and sharing personal data. “These parties violate internet users' right to privacy. The right to protection of personal data and the right to protection of privacy are recognized as fundamental rights", says Alberdingk Thijm.
The possibility to claim damages in a class action was recently created under Dutch law. “Claiming damages in a class action is an important tool to ensure the enforcement of the GDPR,” says Joris van Hoboken, a board member of the foundation and professor in Information Law. “It gives the GDPR teeth.” The Privacy Collective calls upon individual consumers to register with the foundation in order to show their support. Based on the number of victims, the total extent of the damage could exceed 10 billion euros. Several organizations support The Privacy Collective's campaign, including Privacy First, Bits of Freedom, Qiy Foundation and Freedom Internet. The claims are being fully funded by Innsworth, a litigation funder. This funding makes it possible to bundle common claims in a collective action without exposing individual claimants to litigation costs. Innsworth is financing a similar class action in England and Wales, which is currently being prepared.
Source: The Privacy Collective press release, 14 August 2020.
More information: https://theprivacycollective.eu/en/.
On 21 June 2020, the Dutch group Viruswaarheid organised a large-scale protest against the Corona emergency legislation at the Malieveld in The Hague (Netherlands). Many thousands of people were planning to come here to demonstrate peacefully for their freedom. Despite an unjustified ban on this demonstration, several speakers who had been invited for the occasion still made an appearance, including Privacy First chairman Bas Filippini. You can watch his entire speech (dubbed into English) below or on YouTube. Click here for the original version in Dutch.
Privacy First will continue to oppose totalitarian emergency legislation, including the Dutch Corona Emergency Law, by all political and legal means. Do you support us in this? Then become a donor of Privacy First!
Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among them Privacy First) to submit position papers and take part in the hearing. Below are both the full text of our position paper and the text that was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries gives reason to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive, as their use leads to a false sense of safety. Moreover, it is very hard to reach the most vulnerable group of people (the elderly) through this medium. This should already be enough reason to refrain from using Corona apps.
In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of centralized instead of decentralized (on-device) storage, as well as a lack of open source software. However, not even on-device storage offers any guarantee against misuse, malware and spyware, nor does it make users any less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus. One that is based on the principles of proportionality and subsidiarity, i.e., large scale testing of people to learn about infection rates and immunization. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is decided that there will be a Corona app after all, then it should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. This has not been the case so far, judging by the developments of the past few days. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Even a democratic decision to nullify this right would simply be unacceptable. If ‘Corona apps’ are indeed deployed widely, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper, this is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation:
  - What is the problem?
  - What is the scale of the problem?
  - What are possible objectives, how can we achieve them, and how can we measure progress towards them?
The first question is already impossible to answer, as testing is currently partial and selective. The total infected population is unknown, and the number of people who have recovered is equally unknown and goes unreported. What we do see is fearmongering as a result of emotion and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. Not only IT professionals and virologists should be involved in this; to no lesser extent we need philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and to be able to determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So, fix the fundamentals, deal with the treatment and test capacity and stop building new technological gadgets and draconian apps used in dictatorial regimes in Asia. And take The Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.