Privacy First initiates legal action against the Dutch government over the recently introduced UBO register. In these preliminary injunction proceedings, Privacy First argues that the legislation on which this register is based is invalid. The consequences of this new piece of legislation are far-reaching, as the register contains highly privacy-sensitive information: data relating to the financial situation of natural persons will be up for grabs. More than 1.5 million legal entities registered in the Dutch Trade Register will have to make public details about their Ultimate Beneficial Owners (UBOs). The UBO register is publicly accessible: a request for information costs €2.50.
The UBO register aims to prevent money laundering but will lead to defamation.
The privacy breach caused by the UBO register and the public accessibility of sensitive data are disproportionate. The goal of the register is to thwart money laundering and terrorist financing, but achieving that goal does not require a UBO register, and certainly not one that is publicly accessible.
That is why Privacy First wants the UBO register to be rendered inoperative by a court, which, if necessary, should submit questions of interpretation to the highest court in Europe: the European Court of Justice. In cases like these, the judiciary has the final say. It is not uncommon for a court to overrule privacy-violating legislation, and in this respect Privacy First’s litigation has been successful in the past.
The proceedings will take place before The Hague District Court on 25 February 2021 at 12pm. The entire summons can be found HERE (pdf in Dutch). The ruling will follow two or three weeks after the hearing.
Background of the UBO register case
On 24 June 2020, the Dutch ‘Implementation Act for the Registration of Ultimate Beneficial Owners of Companies and Other Legal Entities’ came into effect in the Netherlands. On the basis of this new Act, a new UBO register linked to the Commercial Register of the Dutch Chamber of Commerce will contain information about all ultimate beneficial owners of companies and other legal entities founded in the Netherlands. The register must indicate the size of the UBO’s shareholding: 25-50%, 50-75% or more than 75%. Furthermore, the name, month and year of birth as well as the nationality of the UBO will be made public, with all the privacy risks this entails.
Since 27 September 2020, newly founded entities have to register the ultimate beneficial owners in the UBO register. Existing legal entities will have to do so before 27 March 2022.
The Act provides very few possibilities to shield information from public view. This is possible only for persons under police protection, minors and those placed under guardianship. This means that the shareholdings of practically every UBO will become a matter of public record. Anyone can access the UBO register, with extracts coming at a price of €2.50.
European money laundering directive
The new Act stems from the fifth European money laundering directive, which obliges EU Member States to register UBOs and disclose their details to the public. According to the European legislator, this contributes to the proclaimed objective of countering money laundering and terrorist financing. The transparency is supposed to be a deterrent for persons who set out to launder money or finance terrorism.
Massive privacy violation and fundamental criticism
The question is whether this measure actually has the intended effect. Registering the personal data of all UBOs and making these publicly available is a generic precautionary measure: 99.99% of UBOs have nothing to do with money laundering or terrorist financing. Even if it were proportionate to collect information on all UBOs, making that information available only to government agencies engaged in combating money laundering and terrorism should suffice. It is not appropriate to disclose that information to everyone. The European Data Protection Supervisor (EDPS) deemed this privacy violation disproportionate. This opinion, however, did not lead to an amendment of the European Directive.
When this Act was discussed in the Dutch Parliament, fundamental criticism came from various corners of society. The business community made its voice heard because it perceived privacy risks and feared − and now indeed experiences − an increase in costs. UBOs of family-owned companies that have remained out of the public eye up until now run major privacy and security risks. There was also a great deal of attention to the position of social organizations − such as church communities and NGOs − that attach great importance to the protection of those affiliated with them. Associations and foundations that do not have owners face a different burden: they have to put data that are already in the Trade Register into yet another register. Unfortunately, these objections have not resulted in any changes to the legislation.
Legal proceedings look promising
Privacy First has initiated legal proceedings against the UBO register for violation of the fundamental right to privacy and the protection of personal data. Privacy First asks the Dutch court to render the UBO register inoperative in the short term and, if necessary, to submit questions of interpretation on this matter to the highest court in Europe, the Court of Justice of the European Union.
The Dutch Act as well as the underlying European directive are in conflict with both the European Charter of Fundamental Rights and the GDPR. The legislator created this legislation, but it is up to the court to review it thoroughly; ultimately, the court has the last word. If the (European) legislator fails to take adequate account of the protection of fundamental rights, then the (European) court can invalidate the legislation. This would not be unique: the Court of Justice of the European Union has previously declared legislation invalid due to privacy violations, for example the Data Retention Directive and, more recently, the Privacy Shield. Dutch courts too regularly annul privacy-invading regulations. Privacy First has previously successfully challenged the validity of legislation, for example in the proceedings concerning the Telecommunications Data Retention Act and the System Risk Indication (SyRI). Viewed against this background, a positive outcome in the case against the UBO register is far from unlikely.
This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First has previously criticized this plan in a television broadcast by current affairs program Nieuwsuur. Subsequently, today Privacy First has sent the following letter to the House of Representatives:
Dear Members of Parliament,
With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Public Health Institute (RIVM). Privacy First advises to reject this proposal on account of the following fundamental concerns and risks:
Violation of fundamental administrative and privacy principles
- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated and there is no other country where the application of similar technologies made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.
Alternatives are less invasive: subsidiarity
- The State Secretary failed to adequately investigate alternatives which are more privacy friendly. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.
Lack of transparency and independent oversight
- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may be applicable to the proposal only partially as anonymous data and statistics are exempt from the GDPR. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.
Structural changes and chilling effect
- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.
Faulty method with a significant impact
- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.
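The singling-out concern raised above can be made concrete with a small sketch. This is a purely hypothetical illustration, not the actual CBS/RIVM pipeline: it only shows why a suppression threshold (such as the proposal's 15 units per data point) protects each individual release of aggregated counts, but not combinations of releases.

```python
# Hypothetical illustration of the singling-out risk in aggregated
# location data. Counts below a threshold (here 15, as in the proposal)
# are suppressed before publication -- but differencing two published
# aggregates can still reveal the movement of a single person.

THRESHOLD = 15

def publish(count):
    """Suppress counts below the anonymity threshold."""
    return count if count >= THRESHOLD else None

# Two consecutive hourly aggregates for the same (hypothetical) cell area.
hour_1 = publish(16)   # 16 phones present -> passes threshold, published
hour_2 = publish(15)   # 15 phones present -> passes threshold, published

# Each release individually satisfies the threshold, yet their
# difference shows that exactly one person left the area between the
# two measurements. The threshold protects releases, not combinations.
if hour_1 is not None and hour_2 is not None:
    print(hour_1 - hour_2)  # -> 1
```

Combined with auxiliary data about who was in the area (data fusion), such a difference can point to an identifiable individual, which is the risk named above.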
Specific prior consent
Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.
It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.
The Privacy First Foundation
Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (Privacy First among them) to submit position papers and take part in the hearing. Below are both the full text of our position paper and the text that was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.
Dear Members of Parliament,
Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.
Lack of necessity and effectiveness
With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries indicates there is ground to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive as their use leads to a false sense of safety. Moreover, it’s very hard to involve the most vulnerable group of people (the elderly) through this means. This should already be enough reason to refrain from using Corona apps.
In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect.
Risks of misuse
There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central instead of decentral (personal) storage and in the absence of open source software. However, not even personal storage offers any guarantee against misuse, malware and spyware, nor does it make users less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.
For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.
Testing instead of apps
According to Privacy First, there is a better and more effective solution in the fight against the coronavirus. One that is based on the principles of proportionality and subsidiarity, i.e., large scale testing of people to learn about infection rates and immunization. To this end, the necessary test capacity should become available as soon as possible.
Haste is rarely a good thing
If, despite all the above-mentioned objections, it is nevertheless decided that there will be a Corona app after all, then it should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. Judging by the developments of the past few days, this has not been the case so far. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.
Privacy by design
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Even a democratic decision to nullify this right is simply unacceptable. If the deployment of ‘Corona apps’ does indeed become widespread, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The Privacy First Foundation
Dear Members of Parliament,
You have received our position paper; this is our oral explanation.
First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.
With this in mind, we look at three legal principles:
- Legitimate purpose limitation. What is the problem? What is the scale of the problem? What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?
It’s already impossible to answer the first question, as testing is currently partial and selective. The total infected population is unknown; those who have recovered are unknown as well and go unreported. There is, however, fearmongering driven by emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.
Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. This should involve not only IT professionals and virologists, but to no lesser extent philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.
- Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and to be able to determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.
- Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.
On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So fix the fundamentals, address treatment and test capacity, and stop building new technological gadgets and the kind of draconian apps used by dictatorial regimes in Asia. And take the Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.
With great concern, Privacy First has taken note of the intention of the Dutch government to employ special apps in the fight against the coronavirus. In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatisation and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect. Furthermore, there is a substantial risk that the collected data will be used and misused for multiple (illegitimate) purposes by companies and public authorities. Moreover, if these data fall into the hands of criminal organizations, they will be a gold mine for criminal activities. For Privacy First, these risks of Corona apps do not outweigh their presumed benefits.
The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional State. Even a democratic decision to nullify this right is simply unacceptable. If the deployment of ‘Corona apps’ does indeed become widespread, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. If these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.
The world is hit exceptionally hard by the coronavirus. This pandemic is not only a health hazard, but can also lead to a human rights crisis, endangering privacy among other rights.
The right to privacy includes the protection of everyone’s private life, personal data, confidential communication, home inviolability and physical integrity. Privacy First was founded to protect and promote these rights. Not only in times of peace and prosperity, but also in times of crisis.
Now more than ever, it is vital to stand up for our social freedom and privacy. Fear should not play a role in this. However, various countries have introduced draconian laws, measures and infrastructures. Much is at stake here, namely preserving everyone’s freedom, autonomy and human dignity.
Privacy First monitors these developments and reacts proactively as soon as governments are about to take measures that are not strictly necessary and proportionate. In this respect, Privacy First holds that the following measures are in essence illegitimate:
- Mass surveillance
- Forced inspections in the home
- Abolition of anonymous or cash payments
- Secret use of camera surveillance and biometrics
- Every form of infringement on medical confidentiality.
Privacy First will see to it that justified measures will only apply temporarily and will be lifted as soon as the Corona crisis is over. It should be ensured that no new, structural and permanent emergency legislation is introduced. While the measures are in place, effective legal means should remain available and privacy supervisory bodies should remain critical.
Moreover, in order to control the coronavirus effectively, we should rely on the individual responsibility of citizens. Much is possible on the basis of voluntariness and individual, fully informed, specific and prior consent.
As always, Privacy First is prepared to assist in the development of privacy-friendly policies and solutions based on privacy by design, preferably in collaboration with relevant organizations and experts. Especially in these times, the Netherlands (and the European Union) can become an international point of reference when it comes to fighting a pandemic while preserving democratic values and the right to privacy. Only in this way will the Corona crisis not lastingly weaken our world; instead, we will emerge from it stronger together.
The coronavirus has plunged the whole world into a deep crisis and governments are doing their utmost to control its spread. As I wrote in my previous column, it is important especially now to keep cool heads and to protect our civil rights and privacy. A short and temporary infringement of our privacy in the general interest may be legitimate. The western model should imply a partial, temporary lockdown, lasting at most twice the incubation period, so as to control the spread of the virus through increased testing and to relieve the healthcare system by augmenting the number of critical care beds.
Moreover, this should be a participatory lockdown, based on voluntary participation and citizens’ individual responsibility. This is only logical, as trust is the cornerstone of our democratic society, even though at times there is a lack of it. This concerns trust in fellow citizens, the government and first of all, oneself. At this point in time I have a lot of confidence in the Dutch approach, which is a combination of common sense and relying on healthcare experts. Ultimately, we will have to learn to live with this virus and control potential outbreaks.
To measure is to know and therefore it is essential to scale up the number of tests with the right test equipment without delay. There are tests which can indicate quickly whether someone is infected. It is interesting to note that in Germany, where practically everyone with symptoms is being tested, the percentages of gravely ill and deceased people are considerably lower than in countries where testing is very limited. For policy makers and politicians it is thus very important to take the right decisions on the basis of facts.
If not, there will be a long, emotionally driven struggle; the encroachment on our freedom will not be short and temporary, and power will shift disproportionately into the hands of the State. Such a scenario would see us move towards a forced surveillance society (see the current situation in Israel, the newly introduced legislation in the UK, and EU proposals with regard to telecom location data), characterised by the abolishment of anonymous (cash) payments (see the current guidelines in the Dutch retail sector), the dissolution of medical confidentiality and physical integrity in the context of potential virus infections (compulsory vaccinations and apps), and censorship of any alternative or undesired sources of information that counter the prevailing narrative. Moreover, the commercial interests of IT and pharmaceutical companies would come to dominate even more.
In the best-case scenario, both society and the economy will soon be able to revive on the basis of individual and aggregate test results, with this lesson to bear in mind: let’s not lose sight of the importance of our freedom, health and individual responsibility. All of a sudden, citizens have been left to their own devices, and this experience will make them realize that life is not malleable and our society is not a mere paper exercise. This situation could lead to increased civic participation and less government, i.e. a greater focus on critical functions. When we look around now, we see positive-minded, well-informed and responsible citizens, and there is no need to keep focusing on a handful of exceptions. That is, as long as the measures in place are comprehensible, measurable and very temporary, and are not packaged into structural legislation that misuses the crisis to grant certain organizations and sectors greater influence and power.
Finally, it’s worth realizing that all the entrepreneurial Dutch, without whom we would not be able to pay for our fine public services, also deserve a round of applause. And perhaps the idea of a basic income for every citizen could be considered once more. In other words: let’s aim for more individual decisions in a freer society supported by technology and common sense!
Here’s to a free 2020!
Privacy First chairman
(in personal capacity)
Many questions have been raised about Privacy First’s point of view in relation to the protection of privacy in crisis situations, such as the one we’re currently experiencing as a result of the coronavirus. As indicated previously, I support the precautionary principle: we don’t know what we don’t know, nor what is in fact effective. A strict, western-style approach on the basis of a temporary (partial) lockdown for a (very) short period of time will drastically flatten the coronavirus curve and make sure the healthcare system does not collapse. It also allows us to gain time to find a vaccine or medicine. We still don’t know exactly what kind of virus we’re dealing with, how it came into existence and how to control it.
Our society is built on trust. In a crisis situation like the one we’re in now, the authorities will have to take temporary crisis measures that allow citizens to do the right thing voluntarily and on the basis of trust. This may temporarily restrict privacy-related rights, such as freedom of movement and/or physical integrity (think of quarantine). The government can choose a full or a partial lockdown. In making this choice, it is essential that we rely on the norms and values of our free, democratic society, and that there is trust both in the citizenry and in the means and measures that may be employed. Ideally, this would result in a participatory lockdown based on everyone’s freedom and sense of responsibility.
Past experience shows that when there is open and honest communication, citizens act responsibly and in the general interest. This implies that draconian and structural legislative measures that restrict freedom can be kept at bay, much to the benefit of the people and the economy. In this respect, it is significant that practically all companies, institutions and organizations currently comply with the protocols, and even do more than what is required. After a period of inaction, the Dutch government has decided to act and take responsibility, which is most welcome. After all, this concerns a potentially great number of very sick patients and fatalities, including many elderly and vulnerable people.
Our government has opted for a democratic instead of a dictatorial approach, and that is to be applauded. So let’s use this moment to keep a cool head instead of infringing upon everyone’s privacy, freedom of movement, bodily integrity and ability to pay in cash. A bitter wind is sweeping through Denmark, where a coronavirus emergency law has been rushed through that allows the authorities to force people to be vaccinated (even though there is no vaccine yet), and through France, where permanent crisis measures seem to have been implemented. All this is incompatible with a decent society and sets misplaced precedents. Let’s act in the general interest on the basis of trust and everyone’s own responsibility. For that, we need neither to be locked up, nor do we want to see the army in the streets, or any other draconian measures or laws put in place.
Let’s strive for a free and trustworthy Netherlands and Europe.
Privacy First chairman
(in personal capacity)
In the context of the National Privacy Conference organized by Privacy First and the Dutch Platform for the Information Society (ECP), today the Dutch Privacy Awards have been handed out. These Awards offer a podium to organizations that consider privacy as an opportunity to positively distinguish themselves and want privacy-friendly entrepreneurship and innovation to become a benchmark. The winners of the 2020 Dutch Privacy Awards are Publicroam, NUTS and Candle.
Safe and easy access to WiFi everywhere for guest users
Most people in libraries, hotels, coffee bars and other public places log onto the local WiFi network to save on mobile data and to avoid relying on mobile networks, which may not be available everywhere indoors. Often, WiFi networks operate with a single, local password, displayed on tables and screens. This makes users’ digital activities vulnerable in more ways than one, with all the ensuing nasty consequences. On top of that, users may not be informed about what the internet provider does with their personal data. It is said that the trade in personal data is by now more profitable than the trade in oil.
These risks were first identified by educational institutions and later by public authorities, which led to the creation of international roaming services like Eduroam and Govroam. But why aren't such services available everywhere and to everyone? Publicroam set out to change just that and is being welcomed in more and more places. Rightly so, according to the Privacy Awards expert panel. Several large municipalities and organizations (among them all libraries in the Netherlands) are already connected to Publicroam, or will be soon. In itself this facility is not a completely new solution, but the expert panel is particularly impressed by the fact that it can offer great advantages to literally everyone in the country – and possibly beyond – and can therefore have a huge impact on what we're used to: one account that lets all users go online automatically and securely, with serious respect for privacy.
It’s possible after all: sound business initiatives that respect privacy; Publicroam is proof of this.
Decentralized infrastructure for privacy-friendly communication in healthcare
The NUTS Foundation is an initiative that aims to offer a privacy-friendly solution for identity management and the sharing of personal data in healthcare. Individuals keep control over which healthcare data may be shared between healthcare providers. The NUTS Foundation has laid down its principles in a manifesto to which all participants must subscribe, and which states that all software being developed must be open source. What the NUTS Foundation is striving for is a decentralized system that keeps control over personal health information in the hands of the people involved.
The services offered by the decentralized network are based on the principles of privacy by design. Identity management solutions help to irrefutably establish the identity of the individuals concerned. The decentralized approach is in line with the digital healthcare architecture that is currently in the making and is, in part, already being introduced. In this way, healthcare information components can use the decentralized facilities that are being realized through NUTS.
In the eyes of the expert panel, the NUTS Foundation is a strong example of an initiative that not only looks at privacy issues comprehensively but also creates concrete solutions to them. The open source community that the NUTS Foundation is bringing to fruition prevents vendor lock-in in crucial areas of the digital healthcare infrastructure. Emerging digital Personal Healthcare Environments can equally make use of the decentralized administrative provisions that NUTS is working towards. The rationale behind NUTS – creating a utility for a crucial part of the digital healthcare architecture – particularly appeals to the expert panel. Broadening the foundation, which currently relies largely on a single company, will further increase support for this initiative.
In order to give the NUTS Foundation the opportunity to further realize its ideals and propagate them more widely, the expert panel has decided to confer this year's Dutch Privacy Award for business solutions on the NUTS Foundation.
Privacy-friendly smart home solution
Candle is the outcome of a privacy-by-design risk analysis of Internet of Things products, which often connect to a cloud server unnecessarily. The project concentrates on developing alternative smart systems in and around the home, based on the principle that a connection to the internet is unnecessary. Candle started off as a project organization run by students from universities and colleges of higher education as well as by artists' collectives, aiming to develop practical hardware solutions combined with open source software. Various domestic appliances such as central heating, cameras, CO2 sensors and other applications can easily be connected with one another. A switch is used to make contact with an external network, so users make a deliberate choice whenever they import or export emails and other data.
Candle shows that it is entirely feasible to create a smart solution without Big Tech companies and their data-driven models. Meanwhile, there are various concept solutions that companies can actually put into practice. At its core, Candle is privacy by design, and it opens people's eyes to alternative smart systems.
"The market for ethical technology will grow in much the same way as the market for organic food has grown enormously. But how do we boost this market? That's the challenge. The GDPR has ploughed the earth. Now it's time to sow and entrust this concept to consumers", comments Candle.
There are four categories in which applicants are awarded:
1. the category of Consumer solutions (business-to-consumer)
2. the category of Business solutions (within a company or business-to-business)
3. the category of Public services (public authority-to-citizen)
4. the incentive prize for a groundbreaking technology or person.
From the various entries, the independent expert panel chose the following nominees per category:
Nominees were announced in three categories: Consumer solutions, Business solutions and Public services.
During the National Privacy Conference the nominees presented their projects to the audience in Award pitches. Thereafter, the Awards were handed out. Click HERE for the entire expert panel report (pdf), which includes participation criteria and explanatory notes on all the nominees and winners.
National Privacy Conference
The National Privacy Conference is an initiative of ECP | Platform for the Information Society and Privacy First. Once a year, the conference brings together Dutch industry, public authorities, the academic community and civil society with the aim of building a privacy-friendly information society. The mission of both the National Privacy Conference and Privacy First is to turn the Netherlands into a leading nation in the field of privacy. To this end, privacy by design is key.
These were the speakers at the 2020 National Privacy Conference, in order of appearance:
- Monique Verdier (vice chair of the Dutch Data Protection Authority)
- Richard van Hooijdonk (trendwatcher/futurist) and Bas Filippini (founder and chairman of Privacy First)
- Tom Vreeburg (IT-auditor)
- Coen Steenhuisen (privacy advisor at Privacy Company)
- Peter Fleischer (global privacy counsel at Google)
- Sander Klous (professor in Big Data Eco Systems, University of Amsterdam)
- Kees Verhoeven (Member of the Dutch House of Representatives for D66).
Expert panel of the Dutch Privacy Awards
The independent expert award panel consists of privacy experts from different fields:
• Bas Filippini, founder and chairman of Privacy First
• Paul Korremans, partner at Comfort Information Architects and Privacy First board member
• Marie-José Bonthuis, owner of IT’s Privacy
• Esther Janssen, attorney at Brandeis Attorneys specialized in information law and fundamental rights
• Marc van Lieshout, managing director at iHub, Radboud University Nijmegen
• Melanie Rieback, CEO and co-founder of Radically Open Security
• Nico Mookhoek, privacy lawyer and owner of NMLA
• Wilmar Hendriks, founder of Control Privacy and member of the Privacy First advisory board
• Alex Commandeur, senior advisor at BMC Advies.
To ensure that the award process is run objectively, panel members do not judge entries from their own organizations.
Privacy First organizes the Dutch Privacy Awards with the support of the Democracy & Media Foundation and in collaboration with ECP. Would you like to become a partner of the Dutch Privacy Awards? Then please contact Privacy First!
Today, the district court of The Hague ruled on the use of the algorithm-based system SyRI (System Risk Indication) by the Dutch government. The judges decided that the government, in trying to detect social services fraud, has to stop profiling citizens on the basis of large-scale data analysis. As a result, people in the Netherlands are no longer 'suspected from the very start' ('bij voorbaat verdacht').
The case against the Dutch government was brought by a coalition of NGOs, consisting of the Dutch Platform for the Protection of Civil Rights (Platform Bescherming Burgerrechten), the Netherlands Committee of Jurists for Human Rights (Nederlands Juristen Comité voor de Mensenrechten, NJCM), Privacy First, the KDVP Foundation (privacy in mental healthcare), Dutch trade union FNV, the National Clients Council (LCR) and authors Tommy Wieringa and Maxim Februari.
The court concludes that SyRI is in violation of the European Convention on Human Rights. SyRI impinges disproportionately on the private life of citizens. This concerns not only those that SyRI has flagged as an 'increased risk', but everyone whose data are analysed by the system. According to the court, SyRI is non-transparent and therefore cannot be scrutinized. Citizens can neither anticipate the intrusion into their private life, nor can they guard themselves against it.
Moreover, the court draws attention to the real risk of discrimination and stigmatization of citizens in the disadvantaged urban areas where SyRI is deployed, on the grounds of socio-economic status and possibly migration background. There is a risk – one that cannot be examined – that SyRI operates on the basis of prejudices. The attorneys of the claimant parties, Mr. Ekker and Mr. Linders, had this to say: "The court confirms that the large-scale linking of personal data is in violation of EU law, Dutch law and fundamental human rights, including the protection of privacy. This ruling is therefore also important for other European countries and on a wider international level."
From now on, as long as there is no well-founded suspicion, personal data from different sources may no longer be combined.
Line in the sand
"This ruling is an important line in the sand against the unbridled collection of data and risk profiling. The court puts a clear stop to the massive surveillance that innocent citizens have been under. SyRI and similar systems should be abolished immediately", states Privacy First director Vincent Böhre.
"Today we have been proved right on all fundamental aspects. This is a well-timed victory for the legal protection of all citizens in the Netherlands", says Tijmen Wisman of the Platform for the Protection of Civil Rights.
Another plaintiff in the case, trade union FNV, equally rejects SyRI on grounds of principle. "We are delighted that the court has now definitively cancelled SyRI", comments Kitty Jong, vice chair of FNV.
The parties hope that the ruling will herald a turning point in the way the government deals with citizens' data. They believe this view is supported by the court's considerations, which apply not only to SyRI but also to similar practices. Many municipalities in the Netherlands have their own data-linking systems that profile citizens for all sorts of policy purposes. A legislative proposal on combining data that would be even greater in scope than SyRI, enabling the databases of private parties to be lumped together with those of public authorities, is already in the making. The decision by The Hague district court, however, clamps down on these Big Data practices. According to the claimant parties, it is therefore of crucial importance that the SyRI ruling affect both current and future policies.
The case against SyRI serves both a legal and a social goal, and with this ruling both goals have been achieved. Merel Hendrickx of PILP-NJCM: "Apart from stopping SyRI, we also aimed to initiate a public debate about the way the government deals with citizens in a digitising society. This ruling shows how important it is to have that discussion."
Although SyRI was adopted in 2014 without any fuss, the discussion about its legality intensified after the lawsuit was announced. At the start of 2019, the use of SyRI in two Rotterdam neighbourhoods led to protests among inhabitants and a discussion in the municipal council. Soon after, the mayor of Rotterdam, Ahmed Aboutaleb, pulled the plug on the SyRI program because of doubts over its legal basis. In June 2019, Dutch newspaper Volkskrant revealed that SyRI had not detected a single fraudster since its inception. In October 2019, the UN Special Rapporteur on extreme poverty and human rights, Philip Alston, wrote a critical letter to the district court of The Hague expressing serious doubts over the legality of SyRI. Late November 2019, SyRI won a Big Brother Award.
The coalition of parties was represented in court by Anton Ekker (Ekker Advocatuur) and Douwe Linders (SOLV Attorneys). The proceedings were coordinated by the Public Interest Litigation Project (PILP) of the NJCM.
The full ruling of the court can be found HERE (official translation in English).
Fundamental lawsuit against mass risk profiling of unsuspected citizens
On Tuesday October 29 at 9:30 am, the district court of The Hague will hear the main proceedings brought by a broad coalition of Dutch civil society organizations against Systeem Risico Indicatie (System Risk Indication, SyRI). SyRI uses secret algorithms to screen entire residential areas and profile citizens for the risk of committing fraud with social services. According to the coalition of plaintiffs, this system poses a threat to the rule of law and must be declared unlawful.
The group of plaintiffs – consisting of the Dutch Platform for the Protection of Civil Rights, the Netherlands Committee of Jurists for Human Rights (NJCM), the Privacy First Foundation, the KDVP Foundation (privacy in mental healthcare) and the National Client Council (LCR) – sued the Dutch Ministry of Social Affairs in March 2018. Authors Tommy Wieringa and Maxim Februari, who had previously spoken very critically about SyRI, joined the proceedings in their personal capacity. In July 2018, Dutch labour union FNV also joined the coalition.
The parties are represented by Anton Ekker (Ekker Advocatuur) and Douwe Linders (SOLV Attorneys). The case is coordinated by the Public Interest Litigation Project (PILP) of the NJCM.
Trawl method on unsuspected citizens
SyRI links citizens' personal data from various government databases on a large scale. These centrally collected data are then analyzed by secret algorithms to determine whether citizens run the risk of being guilty of one of the many forms of fraud and violations the system covers. If SyRI's analysis leads to a risk notification, the citizen in question is included in the so-called Risk Notices Register (Register Risicomeldingen), which can be accessed by government authorities.
SyRI uses this trawl method to screen all residents of a neighborhood or area. For this, the system uses almost all data that government authorities store about citizens. It comprises 17 data categories, which together provide a very intrusive picture of someone's private life. SyRI currently covers the databases of the Dutch Tax Authorities, Inspectorate of Social Affairs, Employment Office, Social Security Bank, municipalities and the Immigration Service. According to the Dutch Council of State (Raad van State), which gave a negative opinion on the SyRI bill, it was hard to imagine any data that did not fall within the scope of the system. Former chairman Kohnstamm of the Dutch Data Protection Authority, which also issued a negative opinion on the system, called the adoption of the SyRI legislation "dramatic" at the time.
Threat to the rule of law
According to the claimants, SyRI is a black box that poses major risks to the democratic rule of law. A citizen, who can be screened by SyRI without cause, has no way of knowing what data are used, which analysis is carried out, or what makes him or her a 'risk'. Moreover, because SyRI operates in secret, citizens are unable to refute an incorrect risk indication. The use of SyRI makes the legal process and the associated procedures opaque.
SyRI thereby undermines the relationship of trust between the government and its citizens; these citizens are in fact suspected in advance. Virtually all information that they share with the government, often to be eligible for basic services, can be used against them secretly without any suspicion.
The plaintiffs in this lawsuit are not opposed to the government combating fraud. They simply believe that this should be done on the basis of a concrete suspicion, not through trawl searches of the private lives of unsuspected Dutch citizens in search of possible fraud risks. According to the claimants, this disproportionate method does more harm than good. There are better and less radical forms of fraud prevention than SyRI.
Not one fraudster detected yet
The five SyRI investigations announced since the system's legal introduction have by now turned tens of thousands of citizens inside out, but have not detected a single fraudster. This was revealed at the end of June 2019 by Dutch newspaper Volkskrant, which managed to get hold of evaluations of SyRI investigations. The investigations failed because the analyses were incorrect, because the implementing bodies lacked capacity and time, but also because there is disagreement within the government about SyRI.
For example, mayor Aboutaleb of Rotterdam pulled the plug on the SyRI investigation in two neighborhoods in Rotterdam South last summer, because the Ministry, unlike the municipality, also wanted to use police and healthcare data in the investigation. The deployment of SyRI also led to protests among the neighborhoods' residents, who made it clear that they felt insulted and unfairly treated.
UN expresses concern about SyRI
Earlier this month, the UN Special Rapporteur on extreme poverty and human rights, Philip Alston, wrote to the court to express his concerns about SyRI and urged the judges to assess the case thoroughly. According to the rapporteur, several fundamental rights are at stake. His letter describes SyRI as the digital equivalent of a social detective who visits every household in an area without permission and searches for cases of fraud; in the analogue world such a massive manhunt would immediately meet with great resistance, but with a digital instrument such as SyRI it is wrongly assumed that 'ignorance is bliss'.
The court hearing is open to the public and will take place on Tuesday October 29th from 9.30 am in the Palace of Justice, Prins Clauslaan 60 in The Hague. Case number: C/09/550982 HA ZA 18/388 (Nederlands Juristen Comité c.s./Staat).
Source: campaign website Bijvoorbaatverdacht.nl.