Today, Privacy First sent the following plea to the Dutch House of Representatives:
Dear Members of Parliament,
It is with great disapproval that the Privacy First Foundation has taken note of the planned introduction of coronavirus entry passes for bars and restaurants, events and cultural institutions. This will lead to a division in society, exclusion of vulnerable groups and a massive violation of everyone’s right to privacy. Below, Privacy First will briefly explain this.
Serious violation of fundamental rights
The coronavirus entry pass (‘corona pass’) constitutes a serious infringement of numerous fundamental human rights, including the right to privacy, physical self-determination, bodily integrity and freedom of movement, in conjunction with other classic human rights such as the right to participate in cultural life and various children’s rights such as the right to recreation. Any curtailment of these rights must be strictly necessary, proportionate and effective. In the case of the corona pass, however, this has not been demonstrated to date and the required necessity is simply being assumed in the public interest. More privacy-friendly alternatives to reopen and normalize society seem never to have been seriously considered. For these reasons alone, the corona pass cannot pass the human rights test and should therefore be repealed. In this context, Privacy First would also like to remind you of countries such as England, Belgium and Denmark, where a similar pass was deliberately not introduced or was done away with not long after its introduction. In the Netherlands, support for the corona pass has been greatly lacking in recent days, and many thousands of entrepreneurs have already let it be known that they will not cooperate. Privacy First therefore expects that the introduction of the corona pass will lead to massive civil disobedience and successful lawsuits against the Dutch government.
The introduction of the corona pass violates the general prohibition of discrimination, as it introduces a broad social distinction based on medical status. This puts a strain on social life and may lead to widespread inequality, stigmatization, social segregation and even possible tensions, as large groups in society will not (or not systematically) want to, or will not be able to get tested or vaccinated (for a variety of reasons), or obtain a digital test or vaccination certificate. During our National Privacy Conference in early 2021, Privacy First already took the position that the introduction of a mandatory ‘corona passport’ would have a socially disruptive effect. On that occasion, the Dutch Data Protection Authority, among others, explicitly took a stand against the introduction of such a passport. The aforementioned social risks apply all the more strongly to the vaccination coercion that is caused by the introduction of the corona pass. In this regard, Privacy First would like to remind you of the fact that both your House of Representatives and the Parliamentary Assembly of the Council of Europe have expressed their opposition to a direct or indirect vaccination requirement. In addition, the corona pass will have the potential to set precedent for other medical conditions and other sectors of society, putting pressure on a much wider range of socio-economic human rights. For these reasons, Privacy First calls on you to block the introduction of the corona pass.
Multiple privacy violations
From the perspective of the right to privacy, there are a number of other specific concerns and questions. First of all, the corona pass introduces a mandatory ‘health proof’ for participation in a large part of social life, in flagrant violation of the right to privacy and the protection of personal data. Through the mandatory display of an ID card in addition to the corona pass, an entirely new identification requirement is created in public places. The existing anonymity in public space is thus removed, with all the dangers and risks that this entails. Moreover, this new identification requirement raises questions about the capacity of entrepreneurs to verify a person’s identity and to assess their state of health by means of the corona pass.
Moreover, the underlying legislation results in the inconsistent application of existing legislation with regard to the same act, i.e. testing, with far-reaching consequences on the one hand for a vital achievement such as medical confidentiality and the public’s trust in that confidentiality, and on the other for the practical implementation of retention periods for the test results, while the processing of these results does not change. After all, it is not the result of the test that should determine whether the registration of the testing falls under the Dutch Medical Treatment Agreement Act (‘Wgbo’, which requires medical confidentiality and a 20-year retention period) or the Dutch Public Health Act (‘Wpg’, which requires a 5-year retention period), but the act of testing itself. Besides, it is questionable why a connection was sought with the Wpg and/or Wgbo, now that the aim is obtaining a certificate for participation in society and it concerns neither medical treatment (Wgbo) nor public health tasks (Wpg). The only lawful ground for processing personal data for the purpose of ascertaining the presence of the coronavirus, and for breaching medical confidentiality, should be consent. In this case, however, there cannot be the legally required freely given consent, since testing and vaccination will be a mandatory condition for participation in society.
Privacy requires clarity
Many other things are and remain unclear: what data will be stored, where, by whom and in which systems? To what extent will there be an international and European exchange of such data? Which parties with which purposes will have access to or will copy the data, or put these in huge new national databases together with our health data? Will we have constant personal localization and identification, or only occasional verification and authentication? Why can test results be kept for an unnecessarily long time? How great are the risks of hacking, data breaches, fraud and forgery? To what extent have decentralized, privacy-friendly technologies and privacy by design, open source software, data minimization and anonymization seriously been considered? How long will test certificates remain free of charge? Is work already underway to introduce an ‘alternative digital medium’ to the Dutch CoronaCheck app, namely a chip (card), with all the objections and risks that entails? Why has there been no independent Privacy Impact Assessment (PIA)? How many more times must the country accept emergency laws to close privacy leaks, when our overburdened and understaffed Data Protection Authority is already noting that there is no legal basis for the processing of the data concerned? How will unforeseen uses and abuses, function creep and profiling be prevented, and how is privacy oversight arranged? Will non-digital, paper alternatives remain available at all times? Why is the ‘yellow booklet’ not accepted as a privacy-friendly alternative, as it is in other countries? What happens with the test material – i.e. everyone’s DNA – at the various testing sites? And when will the corona pass be abolished? In other words, to what extent is this actually a ‘temporary’ measure?
In the view of Privacy First, the introduction of the corona pass will merely lead to an impractical burden on entrepreneurs, innumerable deficiencies and a destruction of capital for society. Privacy First therefore requests that the members of the House of Representatives block the introduction of the corona pass. Should they fail to do so, Privacy First reserves the right to have the legislation introducing the corona pass reviewed against international and European law and declared inoperative by the courts. Preparations for such legal proceedings, by us and many others, are already underway.
Privacy First Foundation
 See National Privacy Conference 28 January 2021, https://youtu.be/asEX1jy4Tv0?t=9378, starting at 2h 36 min 18 sec.
See Council of Europe, Parliamentary Assembly, Resolution 2361 (2021): Covid-19 vaccines: ethical, legal and practical considerations, https://pace.coe.int/en/files/29004/html, par. 7.3.1-7.3.2: “Ensure that citizens are informed that the vaccination is NOT mandatory and that no one is politically, socially, or otherwise pressured to get themselves vaccinated, if they do not wish to do so themselves; ensure that no one is discriminated against for not having been vaccinated, due to possible health risks or not wanting to be vaccinated.” See also, inter alia, Dutch House of Representatives, Motion by Member Azarkan on no corona vaccination requirement (28 October 2020), House of Representatives, 25295-676, https://zoek.officielebekendmakingen.nl/kst-25295-676.html: “The House of Representatives (...) expresses that there should never be a direct or indirect corona vaccination obligation in the future”; Motion by Member Azarkan on access to public benefits for all regardless of vaccination or testing status (5 January 2021), House of Representatives 25295-864, https://zoek.officielebekendmakingen.nl/kst-25295-864.html: “The House of Representatives (...) requests the government to allow access to public benefits for all regardless of vaccination or testing status.”
An earlier, similar version of this commentary appeared as early as March 2021: https://www.privacyfirst.eu/focus-areas/law-and-politics/695-privacy-first-position-concerning-the-dutch-draft-bill-on-covid-19-test-certificates.html.
Summary proceedings against massive privacy violation by Automatic Number Plate Recognition (ANPR) camera surveillance
Challenging large-scale privacy violations in court has long been Privacy First’s established practice. In recent years, Privacy First has successfully done so against the central storage in the Netherlands of everyone’s fingerprints under the Dutch Passport Act, against the storage of everyone’s communications data under the Dutch Telecommunications Data Retention Act and – in coalition with other parties – against large-scale risk profiling of innocent citizens through the Dutch System Risk Indication (SyRI).
A current and urgent issue that equally merits going to court concerns the Dutch legislation on Automatic Number Plate Recognition (ANPR), which has applied since 2019 under Art. 126jj of the Dutch Code of Criminal Procedure. Under this piece of law, the number plates of millions of cars in the Netherlands (i.e. everyone’s travel movements) are continuously stored for four weeks in a central police database for criminal investigation purposes, regardless of whether one is suspected of anything. This is totally unnecessary, completely disproportionate and also ineffective, as was revealed in evaluation reports published today by the Dutch Research and Documentation Center (‘WODC’, part of the Dutch Ministry of Justice and Security). Supervision is lacking and the system can easily be abused, as newspaper NRC Handelsblad recently confirmed in its reporting.
Privacy First has therefore prepared a lawsuit to have the ANPR legislation repealed on account of violation of European privacy law. Summary proceedings against the Dutch government will take place at the district court of The Hague on 10 November 2021. Through Pro Bono Connect, Privacy First has engaged CMS as the law firm that will take care of the litigation in this case. Our summons in summary proceedings can be found HERE (pdf in Dutch). If necessary, these preliminary proceedings will be followed by broader proceedings on the merits. After all, there is no doubt that the current ANPR law constitutes a massive privacy violation and simply does not belong in a free democratic society. Considering the relevant European case law, Privacy First deems the likelihood of successful legal action very high.
Case details: Privacy First vs. the State (Dutch Ministry of Justice and Security), Wednesday 10 November 2021 11.00 am, The Hague district court. You are welcome to attend the court hearing. A route description in Dutch can be found here.
Would you like to support these legal proceedings? Then please consider becoming a donor! Privacy First consists largely of volunteers and is entirely dependent on sponsorship and donations to pursue litigation.
It is with great concern that Privacy First has taken note of the Dutch draft bill on COVID-19 test certificates. Under this bill, a negative COVID-19 test certificate will become mandatory for access to sporting and youth activities, all sorts of events and public places, including bars and restaurants and cultural and higher education institutions. Those who have no such certificate risk high fines. This will put pressure on everyone’s right to privacy.
Serious violation of fundamental rights
The draft bill severely infringes numerous fundamental and human rights, including the right to privacy, physical integrity and freedom of movement in combination with other relevant human rights such as the right to participate in cultural life, the right to education and various children’s rights such as the right to recreation. Any curtailment of these rights must be strictly necessary, proportionate and effective. However, the current draft bill fails to demonstrate this, while the required necessity in the public interest is simply assumed. More privacy-friendly alternatives to reopen and normalize society do not seem to have been considered. For these reasons alone, the proposal cannot pass the human rights test and should therefore be withdrawn.
The proposal also violates the general prohibition of discrimination, as it introduces a broad social distinction based on medical status. This puts pressure on social life and may lead to large-scale inequality, stigmatization, social segregation and even possible tensions, as large groups in society will not (or not systematically) want to or will not be able to get tested (for various reasons). During the recent Dutch National Privacy Conference organized by Privacy First and the Platform for the Information Society (ECP), it already became clear that the introduction of a mandatory ‘corona passport’ could have a socially disruptive effect. On that occasion the Dutch Data Protection Authority, among others, took a strong stand against it. Such social risks apply all the more strongly to the indirect vaccination obligation that follows on from the corona test certificate. In this regard, Privacy First wants to recall that recently both the Dutch House of Representatives and the Parliamentary Assembly of the Council of Europe have expressed their opposition to a direct or indirect vaccination requirement. In addition, the draft bill under consideration will have the potential to set precedents for other medical conditions and other sectors of society, putting pressure on a much broader range of socio-economic rights. For all of these reasons, Privacy First strongly recommends that the Dutch government withdraw this draft bill.
Multiple privacy violations
Moreover, from the perspective of the right to privacy, a number of specific objections and questions apply. First of all, the draft bill introduces a mandatory ‘proof of healthiness’ for participation in a large part of social life, in flagrant violation of the right to privacy and the protection of personal data. The draft bill also introduces an identification requirement at the entrance of public places, in violation of the right to anonymity in public space. In addition, the bill results in the inconsistent application of existing legislation to the same act, namely testing, with far-reaching consequences on the one hand for a precious achievement like medical confidentiality and citizens’ trust in that confidentiality, and on the other hand for the practical implementation of retention periods, while the processing of the test result does not change. After all, it is not the result of the test that should determine whether the file falls under the Dutch Medical Treatment Contracts Act (WGBO, which has a medical secrecy requirement and a retention period of 20 years) or under the Public Health Act (with a retention period of five years), but the act of testing itself. Moreover, it is unclear why the current draft bill seeks to connect to the Public Health Act and/or the WGBO if it only concerns obtaining a test certificate for the purpose of participating in society (and therefore neither medical treatment nor a public health task). Here, the only lawful basis for processing and for breaching medical confidentiality should be consent. In this case, however, there cannot be the legally required freely given consent, since testing will be a compelling condition for participation in society.
Privacy requires clarity
Many other issues are still unclear: which data will be stored, where, by whom, and which data may possibly be exchanged? To what extent will there be personal localization and identification as opposed to occasional verification and authentication? Why may test results be kept for an unnecessarily long time (five or even 20 years)? How great are the risks of hacking, data breaches, fraud and forgery? To what extent will there be decentralized, privacy-friendly technology, privacy by design, open source software, data minimization and anonymization? Will test certificates remain free of charge and to what extent will privacy-friendly diversity and choice in testing applications be possible? Is work already underway to introduce an ‘alternative digital carrier’ in place of the Dutch CoronaCheck app, namely a chip, with all the risks that entails? How will function creep and profiling be prevented and are there any arrangements when it comes to data protection supervision? Will non-digital, paper alternatives always remain available? What will happen to the test material taken, i.e. everyone’s DNA? And when will the corona test certificates be abolished?
As long as such concerns and questions remain unanswered, submission of this bill makes no sense at all and the corona test certificate will only lead to the destruction of social capital. Privacy First therefore reiterates its request that the current proposal be withdrawn and not submitted to Parliament. Failing this, Privacy First reserves the right to have the matter reviewed by the courts and declared unlawful.
 See the Dutch National Privacy Conference, 28 January 2021, https://youtu.be/asEX1jy4Tv0?t=9378, starting at 2h 36 min 18 sec.
See Council of Europe, Parliamentary Assembly, Resolution 2361 (2021): Covid-19 vaccines: ethical, legal and practical considerations, https://pace.coe.int/en/files/29004/html, par. 7.3.1-7.3.2: “Ensure that citizens are informed that the vaccination is NOT mandatory and that no one is politically, socially, or otherwise pressured to get themselves vaccinated, if they do not wish to do so themselves; ensure that no one is discriminated against for not having been vaccinated, due to possible health risks or not wanting to be vaccinated.” See also, for example, Dutch House of Representatives, Motion by Member Azarkan on No Corona Vaccination Obligation (28 October 2020), Parliamentary Document 25295-676, https://zoek.officielebekendmakingen.nl/kst-25295-676.html: “The House (...) pronounces that there should never be a direct or indirect coronavirus vaccination obligation in the future”; Motion by Member Azarkan on Access to Public Benefits for All Regardless of Vaccination or Testing Status (5 January 2021), Parliamentary Document 25295-864, https://zoek.officielebekendmakingen.nl/kst-25295-864.html: “The House (...) requests the government to enable access to public services for all regardless of vaccination or testing status.”
This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First previously criticized this plan in a broadcast of the current affairs television program Nieuwsuur. Today, Privacy First subsequently sent the following letter to the House of Representatives:
Dear Members of Parliament,
With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Public Health Institute (RIVM). Privacy First advises to reject this proposal on account of the following fundamental concerns and risks:
Violation of fundamental administrative and privacy principles
- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated and there is no other country where the application of similar technologies made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.
Alternatives are less invasive: subsidiarity
- The State Secretary has failed to adequately investigate more privacy-friendly alternatives. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.
Lack of transparency and independent oversight
- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may only be partially applicable to the proposal, as anonymous data and statistics are exempt from it. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.
Structural changes and chilling effect
- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.
Faulty method with a significant impact
- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large-scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.
Specific prior consent
Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.
It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.
The Privacy First Foundation
Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among them Privacy First) to submit position papers and take part in the hearing. Below are both the full text of our position paper and the text that was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries gives ground to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive, as their use leads to a false sense of safety. Moreover, it is very hard to involve the most vulnerable group of people (the elderly) through this means. This alone should be enough reason to refrain from using Corona apps.
In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central instead of decentralized (personal) storage, as well as in the absence of open source software. However, not even personal storage offers any guarantee against misuse, malware and spyware, nor does it make users any less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunization. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is nevertheless decided to introduce a Corona app after all, then this should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. Judging by the developments of the past few days, this has not been the case so far. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If ‘Corona apps’ are indeed to be deployed widely, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. In case these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper; this is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation:
  - What is the problem?
  - What is the scale of the problem?
  - What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?
It is already impossible to answer the first question, as we currently test only partially and selectively. The total infected population is unknown; those who have recovered are unknown as well and are not reported. There is, however, fearmongering as a result of emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. This should involve not only IT professionals and virologists, but equally philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thus to determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So, fix the fundamentals, deal with the treatment and test capacity and stop building new technological gadgets and draconian apps used in dictatorial regimes in Asia. And take The Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.
With great concern, Privacy First has taken note of the intention of the Dutch government to employ special apps in the fight against the coronavirus. In Privacy First’s view, the use of such apps is a dangerous development, because it could lead to stigmatisation and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect. Furthermore, there is a substantial risk that the collected data will be used and misused for multiple (illegitimate) purposes by companies and public authorities. Moreover, if these data fall into the hands of criminal organizations, they will be a gold mine for criminal activities. For Privacy First, these risks of Corona apps do not outweigh their presumed benefits.
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional State. Not even a democratic decision to nullify this right would be acceptable. If the deployment of ‘Corona apps’ does become widespread, then at the very least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
Today an important debate will take place in the Dutch House of Representatives about the introduction of Passenger Name Records (PNR): the large-scale, years-long storage of all sorts of data on airline passengers, supposedly to fight crime and terrorism. Privacy First has major objections and at the end of last week sent the following letter to the House. Today’s parliamentary debate was first scheduled to take place on 14 May 2018, but was postponed until further notice (following a similar letter from Privacy First). Following new parliamentary questions, the debate will now take place today after all. Here is the full text of our most recent letter:
Dear Members of the House of Representatives,
On Monday afternoon, this 11 March, you will discuss the Dutch implementation of the European directive on Passenger Name Records (PNR) with minister Grapperhaus (Justice and Security). In Privacy First’s view, both the European PNR directive as well as the Dutch implementation thereof are legally untenable. We shall here briefly elucidate our position.
Under the minister’s legislative proposal concerning PNR, numerous data of every single airline passenger travelling to or from the Netherlands will be stored for five years in a central government database of the new Passenger Information Unit and will be used to prevent, investigate and prosecute crimes and terrorism. Sensitive personal data (such as names, addresses, telephone numbers, email addresses, dates of birth, travel data, ID document numbers, destinations, fellow passengers and payment data) of many millions of passengers will, as a result, become available for many years for the purpose of data mining and profiling. In essence, this means that every airline passenger will be treated as a potential criminal or terrorist. In 99.9% of all cases, however, this concerns perfectly innocent citizens, mainly holidaymakers and business travellers. This is a flagrant breach of their right to privacy and freedom of movement. Privacy First already made these arguments last year in the Volkskrant and on BNR Nieuwsradio. Because of privacy objections, there has been a lot of political resistance in recent years to such large-scale storage of PNR data, which has been rejected by both the House of Representatives and the European Parliament on several occasions since 2010. In 2015, Dutch ruling parties VVD and PvdA were absolutely opposed to PNR as well. Back then, they called it a ‘holiday register’ and themselves threatened to take the matter to the European Court of Justice if the PNR directive were adopted. However, after the attacks in Paris and Brussels, many political restraints seemed to have evaporated, and in 2016 the PNR directive came about after all. To date, however, the legally required necessity and proportionality of this directive have yet to be demonstrated.
In the summer of 2017, the European Court of Justice issued an important ruling with regard to the similar PNR agreement between the EU and Canada. The Court declared this agreement invalid because it violates the right to privacy. Among other things, the Court held that the envisaged agreement must “limit the retention of PNR data after the air passengers’ departure to that of passengers in respect of whom there is objective evidence from which it may be inferred that they may present a risk in terms of the fight against terrorism and serious transnational crime.” (See Opinion 1/15 (26 July 2017), par. 207.) Ever since this ruling, the European PNR directive has been on legally uncertain footing. The Dutch government therefore has valid “concerns about the future viability of the PNR directive” (see Note in response to report, p. 23, in Dutch). Privacy First expects that the current PNR directive will soon be submitted to the European Court of Justice for judicial review and will then be declared unlawful. Subsequently, a situation will arise that is similar to the one we witnessed a few years ago with regard to the European Telecommunications Data Retention Act: as soon as that European directive was annulled, the Dutch implementing provisions were equally invalidated in interim injunction proceedings.
The current Dutch PNR legislative proposal seems unlawful a priori because of a lack of demonstrable necessity, proportionality and subsidiarity. The legislative proposal comes down to mass surveillance of mostly innocent citizens; in the 2016 Tele2 case the European Court already ruled that this type of legislation is unlawful. Thereupon the Netherlands pledged before the UN Human Rights Council “to ensure that the collection and maintenance of data for criminal [investigation] purposes does not entail massive surveillance of innocent persons.” The Netherlands now seems to renege on that promise. After all, a lot of completely unnecessary data on every airline passenger will be stored for years and can be used by various Dutch, European and even non-European government agencies. Moreover, the effectiveness of PNR has to date never been demonstrated, as the minister himself affirmed: “There is no statistical support” (see Note in response to report, p. 8, in Dutch). The risk of unjust suspicion and discrimination (due to fallible algorithms used for profiling) under the proposed PNR system is serious, which also increases the likelihood of delays and missed flights for innocent passengers. All the while, wanted persons will often stay under the radar and choose alternative travel routes. Furthermore, the legislative proposal entirely fails to address the role and capabilities of the secret services, which will be granted secret and shielded access to the central PNR database under the new Dutch Intelligence and Security Services Act. The most questionable aspect of the Dutch PNR legislative proposal, however, is that it goes two steps further than the European PNR directive itself: after all, it is the Dutch government’s own decision to also store the data of passengers on all intra-EU flights. This is not obligatory under the PNR directive, and the Netherlands could have limited it to preselected flights (judged to be at risk) only.
This would have been in line with the advice of most experts in this field, who argue for targeted action as opposed to mass surveillance. In other words: focus on persons against whom there is a reasonable suspicion, in accordance with the principles of our democracy under the rule of law.
Privacy First Advice
Privacy First strongly advises you to reject the current legislative proposal and to replace it with a privacy-friendly version. Should this lead to the European Commission referring the Netherlands to the European Court of Justice for failing to implement the present PNR directive, Privacy First is confident this would end in a clear victory for the Netherlands. EU Member States simply cannot be expected to implement privacy-violating EU rules. This applies equally to the national implementation of relevant resolutions of the UN Security Council (in this case UNSC Res. 2396 (2017)), which is similarly at odds with international human rights law. In this respect, Privacy First has already warned of the abuse of the Dutch TRIP system (which is also used for PNR) by other UN Member States. In this regard, the Netherlands has its own responsibility under the Dutch Constitution as well as under international law.
Privacy First Foundation
Update 19 March 2019: Regrettably, today the House of Representatives has adopted the legislative proposal almost unchanged; only GroenLinks, SP, PvdD and Denk voted against. Unfortunately, a motion by GroenLinks and SP to provoke legal action by the European Commission against the Dutch government about the PNR directive was rejected. The only bright spot is the widely adopted motion for the judicial reassessment and possible revision of the PNR directive at a European political level. (Only PVV and FvD voted against this motion.) Next stop: the Senate.
Update 4 June 2019: despite our sending the above letter a second time and despite other critical input from Privacy First, the Senate today has unfortunately adopted the legislative proposal. Only GroenLinks, PvdD and SP voted against, in spite of the enormous error rates (false positives) of 99.7% that recently came to light in the comparable German PNR system; see https://www.sueddeutsche.de/digital/fluggastdaten-bka-falschtreffer-1.4419760. Meanwhile, large-scale cases have been brought against the European PNR directive in Germany and Austria in order for the European Court of Justice to nullify it on account of violations of the right to privacy; see the German-English campaign website https://nopnr.eu and https://www.nrc.nl/nieuws/2019/05/15/burgers-in-verzet-tegen-opslaan-passagiersgegevens-a3960431. As soon as the European Court rules that the PNR directive is unlawful, Privacy First will start interim injunction proceedings in order for the Dutch PNR law to be rendered inoperative. Moreover, yesterday Privacy First put the PNR law on the agenda of the UN Human Rights Committee in Geneva. On 1 and 2 July 2019, the overall human rights situation in the Netherlands (including violations of the right to privacy) will be critically reviewed by this Committee.
The Dutch Ministry of Finance is about to oblige companies to export personal data on a large scale. The measure is hidden in a subordinate clause of a letter from the Minister of Finance, although it has major consequences. The measure obliges companies that trade in 'virtual assets' (such as bitcoins, real estate, but also purchases in computer games) to include personal data of customers in the transaction records and messages. The information from all parties involved needs to remain visible and available to everyone in the value chain.
Consumers, companies and citizens cannot object to this mandatory addition of their personal data. The topic is not receiving the proper amount of political attention because it is presented as a technical measure. In his letter to Dutch Parliament of 21 March 2019, the Minister fails to point out its large scope and impact. It is, however, suggested that a consultation round will take on board the market responses to the envisaged rules.
Privacy First and VBNL (United Bitcoin Companies Netherlands) have meanwhile understood that the worldwide objections to the proposed measure are being ignored. That is why they are today sending an urgent letter to the Dutch Minister of Finance. They ask him to study the issue more thoroughly, together with all relevant Ministries, and in particular to better inform Parliament. In doing so, they point to the conflicts of law that may arise, as the measure may well violate international agreements and treaties that protect privacy.
Given that consumers are known to be very reluctant to make their own data available to private and commercial institutions, the government must be similarly reluctant on their behalf. Privacy First finds it extremely unfortunate that the Ministry of Finance seems intent on granting this blanket permission for the unbridled export of personal data without giving it proper attention and without applying due process.
There is no merit to the claim that the measure is required for counter-terrorism purposes. Experts at Europol (!) indicate that the international proposal is "overkill" and not necessary for investigative purposes. The rule adds nothing to the existing European framework against money laundering and terrorist financing and only increases the risk of unwanted data breaches.
Privacy First and VBNL hope that their letter will make Dutch Parliament aware that this proposal goes far beyond the much-debated access regime of the recent second European Payment Services Directive (PSD2). Under PSD2, consumers can decide to share data themselves. Under this proposal, they will be deprived of that fundamental right for all kinds of economic acts. Privacy First and VBNL are calling on parliamentarians to protect consumers and businesses against this unnecessary planned measure.
The letter can be downloaded here (pdf).
Writing a New Year’s Column about the state of affairs concerning the protection of everyone’s privacy weighs me down this year. With the exception of a few bright spots, privacy in the Netherlands and the rest of the world has greatly deteriorated. For a while it seemed that the 2013 revelations of Edward Snowden about secret services tracking everyone’s online behavior would be a rude wake-up call for the world. It was thought that an increasing number of data breaches, along with a rising number of governments and companies getting hacked, would make people realize that storing large amounts of data centrally is not the solution. The Arab Spring of 2011 was expected to bring about major change through the unprecedented use of (social) media.
The European Union successfully voted against the exchange of data relating to travel movements, paved the way for the current General Data Protection Regulation and seemed to become the shining alternative example under the guidance of Germany, a country known for its vigilance when it comes to privacy. Unfortunately, things turned out differently. Under the Obama administration, Snowden was shunned as a traitor and other whistleblowers were clamped down on harder than ever before. Julian Assange was forced into exile, while the killing of people by drones, without any form of trial, was implemented on a large scale. Extrajudicial killings with collateral damage... while the public discussion was about waterboarding... Discussions on such ‘secondary topics’ have by now become commonplace in politics, and so has the framing and blaming of opponents in the polarized public debate (the focus is usually on the person rather than on the argument itself).
Looking back on 2018, Privacy First identifies a great number of areas where the breakdown of privacy is evident:
Government & privacy
In March, an advisory referendum was held in the Netherlands on the introduction of the so-called Tapping law. Immediately afterwards, the advisory referendum instrument itself was abolished. This happened in a time of unprecedented technological possibilities to organize referendums in various ways in a shared democracy. That’s outrageous. The outcome of the referendum was not taken into account and the Tapping law was introduced just like that. Moreover, it turned out that all along, the Dutch Minister of the Interior had withheld an important report on the functioning of the Dutch General Intelligence and Security Service.
Apparently this was nothing to worry about and occurred without any consequences. The recent report by the Dutch State Commission on the (re)introduction of referendums will likely end up in a drawer, not to be looked at again.
Fear of losing one’s role and the political mood of the day are all too important in a culture in which ‘professional politicians’ are afraid to make mistakes, but which is full of incidents nonetheless. One’s job or profession comes first, representing citizens comes second. Invariably, incidents are put under a magnifying glass in order to push through binding legislation with a broad scope, without any review of compliance with guiding principles such as necessity, purpose limitation, subsidiarity and proportionality. There is an ever wider gap between government and citizens, who are not trusted but are expected to be fully transparent towards that self-same government. A government that time and again appears to be concealing matters from citizens. A government that is required by law to protect and promote privacy, but is itself still the most prominent privacy violator.
The medical establishment & privacy
In this area things got really out of hand in 2018. Through various coordinated media offensives, the EU and the member states are trying to make us believe in the advantages of relinquishing our right to physical integrity and our humanity. The sharing of biometric data with the United States continues unabated. We saw the police calling for compulsory DNA databases, compulsory vaccination programs, the use of smart medicines with microchips and the phasing out of alternative therapies. Furthermore, health insurance companies cautiously started to cover genetic testing and increasingly did away with medical confidentiality, the Organ Donation Act was introduced, and microchips implanted in humans (the cyborg as the highest ideal in Silicon Valley propaganda) became ever more popular.
How long before microchips become compulsory for all citizens? All (domestic) animals in the EU have already preceded us. And then there’s the Electronic Health Record, which was first rejected in the Dutch Senate but has reappeared on the minister’s agenda via a detour. Driven by commercial interests, it is being rammed down the throats of general practitioners while alternatives such as Whitebox are not taken seriously. The influence of Big Pharma through lobbying with government bodies and participating in government working groups is particularly acute. They closely cooperate with a few IT companies to realize their ideal of large and centralized networks and systems. It’s their year-end bonus and growth at the expense of our freedom and well-being.
Media & privacy
Naturally, we cannot overlook ‘fake news’. One of the premises for having privacy is being able to form your own opinion and respect and learn from the opinions of others. Furthermore, independent left and right-wing media are essential in a democratic constitutional State. It's their task to monitor the functioning of elected and unelected representatives in politics and in government. Journalists should be able to penetrate into the capillaries of society in order to produce local, national and global news.
Ever since free news gathering came about, it has been a challenge to obtain news based on facts. It’s not always easy to distinguish a press service, PR and propaganda from one another. In times of rapid technological changes and new opportunities, they should be continuously reviewed according to the principles of journalism. That’s nothing new. What is new, however, is that the European Union and our own Minister for the Interior, Kajsa Ollongren, feel they’re doing the right thing by outsourcing censorship to social media companies that are active on a global scale and have proven to be unreliable.
While Facebook and Google have to defend themselves in court for spreading fake news and censoring accounts, governments hand over the monitoring task to them. The privacy violators and fake news distributors as the guardians of our privacy and journalism: that’s the world upside down. In so doing, this minister and this government undermine the constitutional State and show disdain for intelligent citizens. It’s time for a structural change in our media system, based on new technologies such as blockchain and the founding of a government media office whose task is to fund all media outlets through citizens’ contributions, taking into account each outlet’s scope and number of members. That concerns all media, including the so-called alternative media, which should not be censored.
Finance & privacy
The erosion of one’s privacy increasingly manifests itself at a financial level too. The fact of the matter is that the tax authorities already know in detail what the spending patterns of all companies and citizens look like. Thanks to the Tapping law, they can now pass on this information in real time to the secret services (the General Intelligence and Security Service is watching along). Furthermore, a well-intended initiative such as PSD2 is being introduced in a wholly improvident and privacy-unfriendly way: basic conditions relating to the ownership of bank details (of citizens, account holders) are devoid of substance. Simple features such as the selective sharing of banking details, for example according to the type of payment or time period, are not available. What’s more, payment details of third parties who have not given their consent are sent along.
In the meantime, the ‘cash = criminal’ campaign goes on relentlessly. The right to cash and anonymous payment is disappearing, even though the Dutch Central Bank itself now warns that the role of cash is crucial to our society. Privacy First already voiced its opinion on this topic in 2016 during a public debate. The latest development in this regard is the further linking of information through Big Data and profiling by debt-collecting agencies and public authorities. Excluding citizens from the electronic monetary system as a new form of punishment, instead of letting them pay fines, is a not so distant prospect. In this regard, a lot of experimentation is going on in China, and there have been calls in Europe to move in the same direction, supposedly in order to fight terrorism. In other words, in the future it will become increasingly difficult to raise your voice and organize against abuse of power by governments and companies: from on high it takes only the press of a button and you may no longer be able to withdraw cash, travel or carry out online activities. In that case you have become an electronic outcast, banished from society.
Public domain & privacy
In 2018, privacy in public space has anything but improved. Whereas 20 years ago the Netherlands was deemed too small to require everyone out on the streets to be able to identify themselves, by now all governments and municipalities in Europe are developing ‘smart city’ concepts. If you ask what the benefits and use of a smart city are (beyond the permanent supervision of citizens), proponents will say something vague about traffic problems and claim that the ‘killer applications’ will become visible only once the network of beacons is in place. In other words, there are absolutely no solid figures which would justify the necessity, subsidiarity and proportionality of smart cities. And that’s not even taking basic civil rights such as privacy into consideration.
Just to give a few examples:
- ANPR legislation applies from 1 January 2019 (all travel movements on public roads will be stored in a centralized police database for four weeks)
- A database consisting of all travel movements and stays of European citizens and toll rates as per 2023
- Emergency chips in every vehicle with a two-way communication feature (better known as spyware) as per 1 January 2019
- Cameras and two-way communication in public space, built into the lampposts among other objects as part of smart city projects
- A decision to introduce additional cameras in public transport as per 2019
- The introduction of Smart Cities and the introduction of unlimited beacons (doesn’t it sound so much better than electronic concentration camp posts?)
- Linking together all traffic centers and control rooms (including those of security companies operating on the private market)
- Citizens are permanently monitored by invisible and unknown eyes.
Private domain & privacy
It’s well known that governments and companies are keen to take a peek into our homes, but the extent to which this was advanced last year was out of all proportion. Let’s start with the energy companies, who foist compulsory smart meters on citizens. With an ‘appointment to install a smart meter’ which you never asked for, it’s almost impossible to steer clear of the red tape. After several cancellations on my part and phone calls to energy provider Nuon, they simply kept pushing forward. I still don’t have a smart meter, and it will stay that way.
Once again Silicon Valley featured prominently in the news in 2018. Unelected dictatorial executives, no less powerful than many a nation state, promote their utopias as trendy and modern among citizens. Self-driving cars take autonomy and joy away from citizens (the number of accidents is very small considering the millions of cars on the road each day), while even children can tell that a hybrid approach is the only option. The implementation of smart speakers by these same tech companies is downright spooky. By bringing smart toys onto the market, toy manufacturers likewise respond to needs that we all supposedly have. We can all too readily guess what these developments will mean for our privacy. The manipulation and distortion of facts and images will starkly increase.
Children & privacy
Children and youths represent the future, and nothing of the above bodes well for them. Screen addiction is sharply on the rise, and as children are being raised amidst propaganda and fake news, much more attention should go to forming one’s own opinion and taking responsibility. Centralized pupil monitoring systems are being introduced indifferently in the education system, information is exchanged with parents, and not having interactive whiteboards and iPads in the classroom has become unthinkable. The first thing children see every single day is a screen with Google on it... Big Brother.
Dependence on the internet and social media results in impulsive behaviour among children, exposes them to the madness of the day and affects their historical awareness and ability to discern underlying links. The way of thinking at universities is becoming increasingly one-sided and undesirable views are marginalized. The causes of problems are not examined, books are not read though there is certainly no lack of opinions. It’s all about making your voice heard within the limits of self-censorship that’s in force in order to prevent becoming the odd one out in the group. The same pattern can be identified when it comes to forming opinions in politics, where discussing various issues based on facts seems no longer possible. Not to mention that the opinions of citizens are considered irrelevant by our politicians. Good quality education focused on forming opinions and on creating self-reflective minds instead of a robot-way of thinking, is essential for the development of a healthy democracy.
Are there any positive developments?
It's no easy task to identify any positive developments in the field of privacy. The fact is that the introduction of the GDPR and the corresponding option to impose fines has brought privacy more sharply into focus among companies and citizens than the revelations of Snowden have been able to do. The danger of the GDPR, however, is that it narrows down privacy to data protection and administrative red tape.
Another positive development is the growing number of (as yet small) initiatives whereby companies and governments consider privacy protection a business or PR opportunity. This is evidenced by the number of participants in the 2019 Dutch Privacy Awards. Recurring themes are means of anonymous communication (email, search engines, browsers), subscription-based alternatives to social networks and messaging services (WhatsApp, Facebook, Instagram and Twitter), blockchain technology and privacy by design projects by large organizations and companies.
Privacy First has teamed up with a few top quality pro bono attorneys who are prepared to represent us in court. However, judges are reluctant to go off the beaten track and come up with progressive rulings in cases such as those concerning number plate parking, average speed checks, Automatic Number Plate Recognition, the Tapping Law, etc. For years, Privacy First has been suffering from a lack of funding. Many of those who sympathize with us, find the topic of privacy a bit eerie. They support us morally but don’t dare to make a donation. After all, you draw attention to yourself when you’re concerned with issues such as privacy. That’s how bad things have become; fear and self-censorship... two bad counsellors! It’s high time for a government that seriously deals with privacy issues.
Constitutional reform should urgently be placed on the agenda
Privacy First is a great proponent of constitutional reform (see our 2017 New Year’s column about Shared Democracy), based on the principles of the democratic constitutional State and the European Convention on Human Rights (ECHR). Our democracy is only 150 years old and should be adapted to this current day and age. This means that the structure of the EU should be changed. Citizens should take on a central and active role. Government policies should focus on technological developments in order to reinforce democracy and formulate a response to the concentration of power of multinational companies.
Privacy First argues that the establishment of a Ministry of Technology has the highest priority in order to be able to stay up to date with the rapid developments in this field and produce adequate policies accordingly. It should live up to the standards of the ECHR and the Dutch Constitution and avoid becoming a victim of the increasing lobbying efforts in this sector. Moreover, it is time for a Minister of IT & Privacy who stays up to date on all developments and acts with sufficient powers and in accordance with the review of a Constitutional Court.
The protection of citizens’ privacy should be facilitated and there should be privacy-friendly alternatives for current services by technology companies. For 2019, Privacy First has a few tips for ordinary citizens:
- Watch out for and stay away from ‘smart’ initiatives on the basis of Big Data and profiling!
- Keep an eye on the ‘cash = criminal’ campaign. Make at least 50% of your payments anonymously in cash.
- Be cautious when communicating through Google, Apple, Facebook and Microsoft. Look for or develop new platforms based on Quantum AI encryption and use alternative browsers (TOR), networks (VPN) and search engines (Startpage).
- Be careful when it comes to medical data and physical integrity. Exercise your right to refuse the exchange of your medical data as long as initiatives such as Whitebox are not used.
- Be aware of your right to remain anonymous, both at home and in public space. Campaign against toll payment, microchips in number plates, ANPR and number plate parking.
- Be aware of your legal rights to bring lawsuits, for example against personalized waste disposal passes, camera surveillance, etc.
- Watch out for ‘smart’ meters, speakers, toys and other internet-connected objects in the house. Purchase only privacy-by-design solutions with privacy-enhancing technology!
The Netherlands and Europe as guiding nations in the field of privacy, with groundbreaking initiatives and solutions for apparent contradictions between privacy and security - that’s Privacy First's aim. There’s still a long way to go, however, and we’re being blown off course ever more. That’s partly because a comprehensive vision of our society and a democracy 3.0 is lacking. So we continue to drift rudderless, ending up in the big manipulation machine of large companies one step at a time. We need many more yellow vests before things change. Privacy First would like to contribute to shaping and promoting a comprehensive, positive vision for the future. A future based on the principles that our society was built on and on the need for greater freedom, with all the inevitable restrictions this entails. We will have to do it together. Please support Privacy First actively with a generous donation, for your own freedom and that of your children in 2019!
To an open and free society! I wish everyone a lot of privacy in 2019 and beyond!
Bas Filippini, Privacy First chairman
A train passenger has submitted an enforcement request to the Dutch Data Protection Authority, because he argues that Dutch Railways (NS) violates the privacy of train passengers.
In response to three new attempts by Dutch Railways (NS) to violate the privacy of train passengers, NS customer Michiel Jonker has submitted a request for enforcement to the Dutch Data Protection Authority (DPA). It concerns:
- Refusing to reimburse the remaining balance on anonymous public transport chip cards if the holder does not provide his or her name and address data to NS;
- Refusing to sell international train tickets at station desks if buyers do not provide their name and address data to NS;
- Charging, since 2 July 2018, additional "service costs" when holders of anonymous public transport chip cards pay in cash for topping up the balance on these cards.
NS has been attacking the privacy of Dutch train passengers in various ways since July 2014. Those earlier violations concerned:
- Discriminating against holders of anonymous public transport chip cards during discount hours;
- Requiring de-anonymization of the anonymous public transport chip cards when NS is asked to provide services (for example, reimbursing money in the event of delays);
- Printing two unique card numbers on each anonymous public transport chip card, as a result of which the anonymity of these cards is compromised.
As a traveler who wants to maintain his privacy, Jonker has repeatedly asked the DPA to investigate these violations and take enforcement measures. Jonker has already won several lawsuits against the DPA, which initially refused even to investigate the reports.
The recently adopted General Data Protection Regulation (GDPR) will play an important role in the assessment of the new violations by NS. Another central issue will be the right to pay in cash, which protects privacy.
Jonker: "In all these matters, the question is whether users of Dutch public transport are entitled to a real, effective protection of their privacy. This question is more relevant than ever, when you see how people are treated in situations where privacy is not adequately protected. We don't only think about China with its Social Credit score, or the United States with their "No Fly" lists, but also about European countries where laws have been adopted in recent years that allow the government to spy on travelers who are not even suspected of any punishable or risky behavior. For example France with its permanent state of emergency and the Netherlands with its new Intelligence and Security Act."
In this new case, Jonker is supported by Privacy First and Maatschappij voor Beter OV.
Source: https://www.liberties.eu/en/news/ns-privacy-fight-passenger-privacy/15444, 25 July 2018.