This week the Dutch House of Representatives will debate the ‘temporary’ Corona emergency law under which the movements of everyone in the Netherlands can henceforth be monitored ‘anonymously’. Privacy First previously criticized this plan in a broadcast of the current affairs program Nieuwsuur. Today, Privacy First sent the following letter to the House of Representatives:

Dear Members of Parliament,

With great concern, Privacy First has taken note of the ‘temporary’ legislative proposal to provide COVID-19 related telecommunications data to the Dutch National Institute for Public Health and the Environment (RIVM). Privacy First advises the House to reject this proposal on account of the following fundamental concerns and risks:

Violation of fundamental administrative and privacy principles

- There is no societal necessity for this legislative proposal. Other forms of monitoring have already proven sufficiently effective. The necessity of this proposal has not been demonstrated and there is no other country where the application of similar technologies made any significant contribution.
- The proposal is entirely disproportionate as it encompasses all telecom location data in the entire country. Any form of differentiation is absent. The same applies to data minimization: a sample would be sufficient.
- The proposal goes into effect retroactively on 1 January 2020. This violates legal certainty and the principle of legality, particularly because this date is long before the Dutch ‘start’ of the pandemic (11 March 2020).
- The system of ‘further instructions from the minister’ that has been chosen for the proposal is completely undemocratic. This further erodes the democratic rule of law and the oversight of parliament.
- The proposal does not mention 'privacy by design' or the implementation thereof, while this should actually be one of its prominent features.

Alternatives are less invasive: subsidiarity

- The State Secretary failed to adequately investigate more privacy-friendly alternatives. Does she even have any interest in this at all?
- Data in the possession of telecom providers are pseudonymized with unique ID numbers and as such are submitted to Statistics Netherlands (CBS). This means that huge amounts of sensitive personal data become very vulnerable. Anonymization by CBS happens only at a later stage.
- When used, the data are filtered based on geographical origin. This creates a risk of discrimination on the basis of nationality, which is prohibited.
- It is unclear whether the CBS and the RIVM intend to ‘enrich’ these data with other data, which could lead to function creep and potential data misuse.

Lack of transparency and independent oversight

- Up until now, the Privacy Impact Assessment (PIA) of the proposal has not been made public.
- There is no independent oversight on the measures and effects (by a judge or an independent commission).
- The GDPR may apply to the proposal only partially, as anonymous data and statistics fall outside its scope. This gives rise to new risks of data misuse, poor digital protection, data breaches, etc. General privacy principles should therefore be made applicable in any case.

Structural changes and chilling effect

- This proposal seems to be temporary, but the history of similar legislation shows that it will most likely become permanent.
- Regardless of the ‘anonymization’ of various data, this proposal will make many people feel like they are being monitored, which in turn will make them behave unnaturally. The risk of a societal chilling effect is huge.

Faulty method with a significant impact

- The effectiveness of the legislative proposal is unknown. In essence, it constitutes a large scale experiment. However, Dutch society is not meant to be a living laboratory.
- By means of data fusion, it appears that individuals could still be identified on the basis of anonymous data. Even at the chosen threshold of 15 units per data point, the risk of unique singling out and identification is likely still too large.
- The proposal will lead to false signals and blind spots due to people with several telephones as well as vulnerable groups without telephones, etc.
- There is a large risk of function creep, of surreptitious use and misuse of data (including the international exchange thereof) by other public services (including the intelligence services) and future public authorities.
- This proposal puts pressure not just on the right to privacy, but on other human rights as well, including the right to freedom of movement and the right to demonstrate. The proposal can easily lead to structural crowd control that does not belong in a democratic society.

Specific prior consent

Quite apart from the above concerns and risks, Privacy First doubts whether the use of telecom data by telecom providers, as envisaged by the legislative proposal, is lawful in the first place. In the view of Privacy First, this would require either explicit, specific and prior consent (opt-in) from customers, or the possibility for them to opt-out at a later stage and to have the right to have all their data removed.

It is up to you as Members of Parliament to protect our society from this legislative proposal. If you fail to do so, Privacy First reserves the right to take legal action against this law.

For further information or questions with regard to everything discussed above, Privacy First can be contacted at all times by telephone (+31-20-8100279) and by email.

Yours sincerely,

The Privacy First Foundation

The Privacy Collective press release

Millions of Dutch internet users victims of unlawful collection and use of personal data

The Privacy Collective takes Oracle and Salesforce to Court

The Privacy Collective - a foundation that acts against violations of privacy rights - is taking Oracle and Salesforce to court. The foundation accuses the technology companies of unlawfully collecting and processing the data of millions of Dutch internet users. The foundation has launched a class action, a legal procedure in which compensation is claimed for a large group of individuals. It is the first time this legal instrument has been used in the Netherlands in a case concerning infringement of the General Data Protection Regulation (GDPR).

Christiaan Alberdingk Thijm, lead lawyer in the case: “This is one of the largest cases of unlawful processing of personal data in the history of the internet. Almost every Dutch individual who reads or views information online is structurally affected by the practices of Oracle and Salesforce. Practices that merely serve a commercial purpose.”

Online shadow profile

Oracle and Salesforce collect data from website visitors continuously and on a large scale. By combining these data with additional information, they create a personal profile of each individual internet user. The millions of profiles are used, among other things, to offer personalized online advertisements and are unlawfully shared with numerous commercial parties, including ad-tech companies. The tech giants collect their information using - among other things - specially developed cookies. Alberdingk Thijm: “Most people do not know that they have such an online 'shadow profile'. They don't know what it looks like and have certainly not given legitimate consent.” For the collection and sharing of personal data, Oracle and Salesforce are obliged to ask for consent under the GDPR. “These parties violate internet users' right to privacy. The right to protection of personal data and the right to protection of privacy are recognized as fundamental rights”, says Alberdingk Thijm.

Class action

The possibility to claim damages in a class action was recently created under Dutch law. “Claiming damages in a class action is an important tool to ensure the enforcement of the GDPR,” says Joris van Hoboken, a board member of the foundation and professor of Information Law. “It gives the GDPR teeth.” The Privacy Collective calls upon individual consumers to register with the foundation in order to show their support. Based on the number of victims, the total extent of the damage could exceed 10 billion euros. Several organizations support The Privacy Collective's campaign, including Privacy First, Bits of Freedom, Qiy Foundation and Freedom Internet. The claims are being fully funded by Innsworth, a litigation funder. This funding makes it possible to bundle common claims in a collective action without exposing individual claimants to litigation costs. Innsworth is financing a similar class action in England and Wales, which is currently being prepared.

Source: The Privacy Collective press release, 14 August 2020.

More information: https://theprivacycollective.eu/en/.

On 21 June 2020, the Dutch group Viruswaarheid.nl organised a large-scale protest against the Corona emergency legislation at the Malieveld in The Hague (Netherlands). Many thousands of people were planning to come there to demonstrate peacefully for their freedom. Despite an unjustified ban on this demonstration, several speakers who had been invited for the occasion still made an appearance, including Privacy First chairman Bas Filippini. You can watch his entire speech (dubbed into English) below or on YouTube. Click here for the original version in Dutch.

Privacy First will continue to oppose totalitarian emergency legislation, including the Dutch Corona Emergency Law, by all political and legal means. Do you support us in this? Then become a donor of Privacy First!

Yesterday, there was a hearing in the Dutch House of Representatives in which the by now notorious Corona app was critically discussed. The House had invited various experts and organizations (among them Privacy First) to submit position papers and take part in the hearing. Below is the full text of our position paper, as well as the text which was read out at the hearing. A video of the entire hearing (in Dutch) can be found HERE. Click HERE for the program, all speakers and position papers.

Dear Members of Parliament,

Thank you kindly for your invitation to take part in this roundtable discussion about the so-called Corona app. In the view of Privacy First, apps like these are a threat to everyone’s privacy. We will briefly clarify this below.

Lack of necessity and effectiveness

With great concern, Privacy First has taken note of the intention of the Dutch government to employ a contact tracing app in the fight against the coronavirus. Thus far, the social necessity of such apps has not been proven, while the experience of other countries indicates there is ground to seriously doubt their benefit and effectiveness. In fact, these apps may even be counterproductive as their use leads to a false sense of safety. Moreover, it’s very hard to involve the most vulnerable group of people (the elderly) through this means. This should already be enough reason to refrain from using Corona apps.

Surveillance society

In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect.

Risks of misuse

There is a significant risk that the collected data will be used for multiple purposes (function creep) and be misused by both companies and public authorities. The risk of surreptitious access, hacking, data breaches and misuse is substantial, particularly in the case of central rather than decentralized (personal) storage and in the absence of open source software. However, not even personal storage offers any guarantee against misuse, malware and spyware, nor does it make users less dependent on technical vulnerabilities. Moreover, if the data fall into the hands of criminal organizations, they will be a gold mine for criminal activities.

For Privacy First, the risks of Corona apps do not outweigh their presumed benefits. Therefore, Privacy First advises the House to urge the cabinet not to proceed with the introduction of such apps.

Testing instead of apps

According to Privacy First, there is a better and more effective solution in the fight against the coronavirus, one that is based on the principles of proportionality and subsidiarity: large-scale testing of people to learn about infection rates and immunity. To this end, the necessary test capacity should become available as soon as possible.

Haste is rarely a good thing

If, despite all the above-mentioned objections, it is decided that there will be a Corona app after all, then this should come about only after a careful social and democratic process with sufficiently critical, objective and independent scrutiny. This has not been the case so far, judging by the developments of the past few days. In this context, Privacy First recommends that the House call on the cabinet to put its plans on ice and impose a moratorium on the use of Corona apps.

Privacy by design

The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional state. Any democratic decision to nullify this right is simply unacceptable. If indeed the deployment of ‘Corona apps’ will be widespread, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. In case these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.      

Yours faithfully,

The Privacy First Foundation
(...)


Dear Members of Parliament,

You have received our position paper; this is our oral explanation.

First of all: Privacy First is firmly against any form of surveillance infrastructure, with or without apps.

With this in mind, we look at three legal principles:

  •  Legitimate purpose limitation.
    - What is the problem?
    - What is the scale of the problem?
    - What are possible objectives, how can we achieve these objectives, and how can we measure progress towards them?

    It’s already impossible to answer the first question, as we currently test only partially and selectively. The total number of infections is unknown; the number of people who have recovered is also unknown and goes unreported. There is, however, fearmongering as a result of emotions and selective reporting: deaths with multiple causes (dying with as opposed to dying from Corona) and admissions to critical care units.

    Let us be clear: we will first have to map out the causes of this problem before we can draw conclusions and talk about solutions. Not only IT professionals and virologists should be involved in this; we equally need philosophers, legal scholars, sociologists, entrepreneurs and others who represent society.

  • Necessity and proportionality. In terms of test capacity, critical care units, medical materials and medical personnel, we essentially have a capacity problem. So there is no doubt in our mind what we should be focusing on, also in view of future outbreaks: testing the entire population in order to tell who is infected and who is immune, and thus to determine the real problem. 97% of the population is unaffected. Ensure separation and proper care for high-risk groups. Halt crisis communication and start crisis management. Take all treatment methods seriously, including those that are not profitable for Big Pharma and Big Tech.

  • Subsidiarity. Once we know the problem, we may ask what the solutions are. Additional personnel at municipal health centers? Building a critical care unit hospital specifically for situations like these? Increasing the test capacity in order to be able to take decisions based on figures? All of this is possible within our current health system, with the general practitioner as the first point of contact.

On the basis of trust, we have given our government six weeks to get its act together. And what do we get in return? Distrust and monitoring tools. And still shortages of medical equipment. So, fix the fundamentals, deal with the treatment and test capacity, and stop building new technological gadgets and draconian apps of the kind used by dictatorial regimes in Asia. And take the Netherlands out of this prolonged lockdown as soon as possible. Privacy First is opposed to a ‘1.5-meter society’ as the new normal, and is instead in favor of a common-sense society based on trust in mature citizens.

With great concern, Privacy First has taken note of the intention of the Dutch government to employ special apps in the fight against the coronavirus. In Privacy First’s view, the use of such apps is a dangerous development because it could lead to stigmatization and numerous unfounded suspicions, and may also cause unnecessary unrest and panic. Even when ‘anonymized’, the data from these apps can still be traced back to individuals through data fusion. If this technology is introduced on a large scale, it will result in a surveillance society in which everyone is continuously monitored – something people will be acutely aware of and which would lead to an imminent societal chilling effect. Furthermore, there is a substantial risk that the collected data will be used and misused for multiple (illegitimate) purposes by companies and public authorities. Moreover, if these data fall into the hands of criminal organizations, they will be a gold mine for criminal activities. For Privacy First, these risks of Corona apps do not outweigh their presumed benefits.

The right to anonymity in public space is a fundamental right, one that is crucial for the functioning of our democratic constitutional State. Any democratic decision to nullify this right is simply unacceptable. If indeed the deployment of ‘Corona apps’ will be widespread, then at least their use should be strictly anonymous and voluntary. That is to say, they should be used only for a legitimate, specific purpose, following individual, prior consent without any form of outside pressure and on the premise that all the necessary information is provided. In this respect, privacy by design (embedding privacy protection in technology) must be a guiding principle. For Privacy First, these are stringent and non-negotiable prerequisites. In case these conditions are not met, Privacy First will not hesitate to bring proceedings before a court.

The world has been hit exceptionally hard by the coronavirus. This pandemic is not only a health hazard, but can also lead to a human rights crisis, endangering privacy among other rights.

The right to privacy includes the protection of everyone’s private life, personal data, confidential communication, home inviolability and physical integrity. Privacy First was founded to protect and promote these rights. Not only in times of peace and prosperity, but also in times of crisis.

Now more than ever, it is vital to stand up for our social freedom and privacy. Fear should not play a role in this. However, various countries have introduced draconian laws, measures and infrastructures. Much is at stake here, namely preserving everyone’s freedom, autonomy and human dignity.

Privacy First monitors these developments and reacts proactively as soon as governments are about to take measures that are not strictly necessary and proportionate. In this respect, Privacy First holds that the following measures are in essence illegitimate:
- Mass surveillance
- Forced inspections in the home
- Abolition of anonymous or cash payments
- Secret use of camera surveillance and biometrics
- Every form of infringement on medical confidentiality.

Privacy First will see to it that justified measures will only apply temporarily and will be lifted as soon as the Corona crisis is over. It should be ensured that no new, structural and permanent emergency legislation is introduced. While the measures are in place, effective legal means should remain available and privacy supervisory bodies should remain critical.

Moreover, in order to control the coronavirus effectively, we should rely on the individual responsibility of citizens. Much is possible on the basis of voluntariness and individual, fully informed, specific and prior consent.

As always, Privacy First is prepared to assist in the development of privacy-friendly policies and any solutions based on privacy by design, preferably in collaboration with relevant organizations and experts. Especially in these times, the Netherlands (and the European Union) can become an international point of reference when it comes to fighting a pandemic while preserving democratic values and the right to privacy. Only in this way can we prevent the Corona crisis from lastingly weakening our world, and instead emerge from it stronger together.

Health and common sense

Saturday, 28 March 2020

Column

The coronavirus has plunged the whole world into a deep crisis and governments are doing their utmost to control its spread. As I wrote in my previous column, it is important especially now to keep a cool head and to protect our civil rights and privacy. A short and temporary infringement of our privacy in the general interest may be legitimate. The Western model should entail a partial, temporary lockdown, lasting at most twice the incubation period, aimed at controlling the spread of the virus through increased testing and at supporting the healthcare system by expanding the number of critical care beds.

Moreover, this should be a participatory lockdown, based on voluntary participation and citizens’ individual responsibility. This is only logical, as trust is the cornerstone of our democratic society, even though at times there is a lack of it. This concerns trust in fellow citizens, the government and first of all, oneself. At this point in time I have a lot of confidence in the Dutch approach, which is a combination of common sense and relying on healthcare experts. Ultimately, we will have to learn to live with this virus and control potential outbreaks.

To measure is to know and therefore it is essential to scale up the number of tests with the right test equipment without delay. There are tests which can indicate quickly whether someone is infected. It is interesting to note that in Germany, where practically everyone with symptoms is being tested, the percentages of gravely ill and deceased people are considerably lower than in countries where testing is very limited. For policy makers and politicians it is thus very important to take the right decisions on the basis of facts.

If not, there will be a long-standing and emotionally driven struggle, the encroachment on our freedom will not be short and temporary, and power will shift disproportionately into the hands of the State. Such a scenario will see us move towards a forced surveillance society (see the current situation in Israel, the newly introduced legislation in the UK, as well as EU proposals with regard to telecom location data), characterized by the abolishment of anonymous (cash) payments (see the current guidelines in the Dutch retail sector), the dissolution of medical confidentiality and physical integrity in the context of potential virus infections (compulsory vaccinations and apps), and censorship of any alternative or undesired sources of information that counter the prevailing narrative. Besides, the commercial interests of IT and pharmaceutical companies would come to dominate even more.

In the best-case scenario, both society and the economy will soon be able to revive on the basis of individual and aggregate test results, with this lesson to bear in mind: let’s not lose sight of the importance of our freedom, health and individual responsibility. All of a sudden, citizens have been left to their own devices and this experience will make them realize that life is not malleable and our society is not a mere paper exercise. This situation could lead to increased civic participation and less government, i.e. a greater focus on critical functions. When we take a look around now, we see positive-minded, well-informed and responsible citizens and there is no need to keep focusing on a handful of exceptions. That is, as long as the measures in place are comprehensible, measurable and very temporary, and are not packaged into structural legislation, thereby misusing the crisis in order to grant certain organizations and sectors greater influence and power.

Finally, it’s worth realizing that all entrepreneurial Dutchmen, without whom we would not be able to pay for our fine public services, also deserve a round of applause. And perhaps the idea of a basic income for every citizen could be reviewed once more. In other words: let’s aim for more individual decisions in a freer society that is supported by technology and common sense!

Here’s to a free 2020!

Bas Filippini,
Privacy First chairman
(in personal capacity)

Column

Many questions have been raised about Privacy First’s point of view in relation to the protection of privacy in crisis situations, such as the one we’re currently experiencing as a result of the coronavirus. As indicated previously, I support the precautionary principle, i.e., we don’t know what we don’t know and what in fact is effective. A strict, western-style approach on the basis of a temporary (partial) lockdown for a (very) short period of time will drastically flatten the coronavirus curve and will make sure the healthcare system does not collapse. This also allows us to gain time to find a vaccine or medicine. We still don’t know exactly what kind of virus we’re dealing with, how it came into existence and how to control it.

Our society is built on trust. In a crisis situation like the one we’re in now, authorities will have to take temporary crisis measures which allow citizens to do the right thing voluntarily and on the basis of trust. This may temporarily restrict rights such as privacy, freedom of movement and/or physical integrity (think of quarantine). The government can choose a full or a partial lockdown. In making this choice, it is essential that we rely on the norms and values of our free, democratic society, and that there is trust both in the citizenry and in the means and measures that may be employed. Ideally, this would result in a participatory lockdown based on everyone’s freedom and sense of responsibility.

Past experience shows that when there is open and honest communication, citizens act responsibly and in the general interest. This implies that draconian and structural legislative measures that restrict freedom can be kept at bay, much to the benefit of the people and the economy. In this respect, it is significant that practically all companies, institutions and organizations currently comply with the protocols, and even do more than what is required. After a period of inaction, the Dutch government has decided to act and take responsibility, which is most welcome. After all, this concerns a potentially great number of very sick patients and fatalities, including many elderly and vulnerable people.

Our government has opted for a democratic instead of a dictatorial approach, and that is to be applauded. So let’s use this moment to keep a cool head instead of infringing upon everyone’s freedom and right to privacy, freedom of movement, bodily integrity and cash payments. I see a bitter wind sweeping through Denmark, where a coronavirus emergency law has been rushed through, allowing the authorities to force people to be vaccinated (even though there is no vaccine yet), and through France too, where permanent crisis measures seem to have been implemented. All this is incompatible with a decent society and sets misplaced precedents. Let’s act in the general interest on the basis of trust and everyone’s own responsibility. For that, we need neither to be locked up, nor do we want to see the army in the streets, or any other draconian measures or laws put in place.

Let’s strive for a free and trustworthy Netherlands and Europe.

Bas Filippini,
Privacy First chairman
(in personal capacity)

In the context of the National Privacy Conference organized by Privacy First and the Dutch Platform for the Information Society (ECP), today the Dutch Privacy Awards have been handed out. These Awards offer a podium to organizations that consider privacy as an opportunity to positively distinguish themselves and want privacy-friendly entrepreneurship and innovation to become a benchmark. The winners of the 2020 Dutch Privacy Awards are Publicroam, NUTS and Candle.

Winner: Publicroam

Safe and easy access to WiFi everywhere for guest users

Most people in libraries, hotels, coffee bars and other public places log onto the local WiFi network in order to save on mobile data and to avoid relying on mobile networks, which may not be available everywhere indoors. Often, WiFi networks operate on the basis of a single, local password, displayed on tables and screens. This makes the digital activities of users vulnerable in more ways than one, with all the ensuing nasty consequences. On top of that, users may not be informed about what the internet provider does with their personal data. It is said that the trade in personal data is by now more profitable than the trade in oil.

These risks were first identified by educational institutions and later by public authorities. This led to the creation of international roaming services like Eduroam and Govroam. But why aren’t such services available everywhere and to everyone? Publicroam set out to change just that and is being welcomed in more and more places. And rightfully so, according to the Privacy Awards expert panel. Several large municipalities and organizations (all libraries in the Netherlands among them) are already connected to Publicroam, or will be soon. In and of itself this facility is not a completely new solution, but the expert panel is particularly impressed by the fact that it can offer great advantages to literally everyone in the country – and possibly beyond – and can therefore have a huge impact on what we’re used to: one account which allows all users to go online automatically and securely, with serious respect for privacy ensured.

It’s possible after all: sound business initiatives that respect privacy; Publicroam is proof of this.

Winner: NUTS

Decentralized infrastructure for privacy-friendly communication in healthcare

The NUTS Foundation is an initiative which aims to offer a privacy-friendly solution to identity management and the sharing of personal data in healthcare environments. It entails that individuals keep control over which healthcare data may be shared between healthcare providers. The NUTS Foundation has laid down its principles in a manifesto which all participants must subscribe to and which states that all software being developed must be open source. The result the NUTS Foundation is striving for is a decentralized system which keeps control over personal health information in the hands of the people involved.

The services offered by the decentralized network are based on the principles of privacy by design. Identity management solutions contribute to irrefutably establishing the identity of the individuals concerned. The decentralized approach is in line with the digital healthcare architecture which is currently in the making and partly already being introduced. In this way, healthcare information components can use the decentralized facilities that are being realized through NUTS.

In the eyes of the expert panel, the NUTS Foundation is a strong example of an initiative which not only looks at privacy issues in a comprehensive way but creates concrete solutions to these issues as well. The open source community that the NUTS Foundation is bringing to fruition prevents vendor lock-in in crucial areas of the digital healthcare infrastructure. Emerging digital Personal Healthcare Areas can equally make use of the decentralized administrative provisions which NUTS is working towards. The rationale behind NUTS – creating a utility for a crucial part of the digital healthcare architecture – particularly appeals to the expert panel. Expanding the foundation, which currently relies by and large on a single company, will further increase the support for this initiative.

In order to give the NUTS Foundation the opportunity to further realize its ideals and to propagate these more widely, the expert panel has decided to confer this year’s Dutch Privacy Award for business solutions to the NUTS Foundation.

Winner: Candle

Privacy-friendly smart home solution

Candle is a reaction to a risk analysis (privacy by design) of Internet of Things products which unnecessarily connect to a cloud server. It’s a project which concentrates on developing alternative smart systems in and around the home, based on the principle that a connection to the internet is unnecessary. Candle started off as a project organization run by students from universities and colleges of higher education as well as by artists’ collectives, aimed at developing practical hardware solutions combined with open source software. Various domestic appliances such as central heating, cameras, CO2 sensors and other applications can easily be connected with one another. A switch is used to make contact with an external network. Users make a deliberate choice when they import and export emails and other data.

Candle shows that it’s perfectly feasible to create a smart home solution without Big Tech companies and their data-driven models. Meanwhile, there are various concept solutions which companies can actually put into practice. At its core, Candle is privacy by design and it opens people’s eyes to alternative smart systems.

"The market for ethical technology will grow in much the same way as the market for biological food has grown enormously. But how do we boost this market? That’s the challenge. The GDPR has ploughed the earth. Now it’s time to sow and entrust this concept to consumers", comments Candle.

Nominations

Awards are granted in four categories:

1. the category of Consumer solutions (business-to-consumer)

2. the category of Business solutions (within a company or business-to-business)

3. the category of Public services (public authority-to-citizen)

4. the incentive prize for a groundbreaking technology or person.

From the various entries, the independent expert panel chose the following nominees per category:

Consumer solutions: Publicroam, Candle, Skotty
Business solutions: NUTS, Rabobank/Deloitte
Public services: (no entries)

During the National Privacy Conference the nominees presented their projects to the audience in Award pitches. Thereafter, the Awards were handed out. Click HERE for the entire expert panel report (pdf), which includes participation criteria and explanatory notes on all the nominees and winners.

National Privacy Conference

The National Privacy Conference is a joint initiative of ECP | Platform for the Information Society and Privacy First. Once a year, the conference brings together Dutch industry, public authorities, the academic community and civil society with the aim of building a privacy-friendly information society. The mission of both the National Privacy Conference and Privacy First is to turn the Netherlands into a guiding nation in the field of privacy. To this end, privacy by design is key.

These were the speakers at the 2020 National Privacy Conference, in order of appearance:

- Monique Verdier (vice chairman of the Dutch Data Protection Authority)
- Richard van Hooijdonk (trendwatcher/futurist) and Bas Filippini (founder and chairman of Privacy First)
- Tom Vreeburg (IT-auditor)
- Coen Steenhuisen (privacy advisor at Privacy Company)
- Peter Fleischer (global privacy counsel at Google)
- Sander Klous (professor of Big Data Ecosystems, University of Amsterdam)
- Kees Verhoeven (Member of the Dutch House of Representatives for D66).

Expert panel of the Dutch Privacy Awards

The independent expert award panel consists of privacy experts from different fields:

• Bas Filippini, founder and chairman of Privacy First
• Paul Korremans, partner at Comfort Information Architects and Privacy First board member
• Marie-José Bonthuis, owner of IT’s Privacy
• Esther Janssen, attorney at Brandeis Attorneys specialized in information law and fundamental rights
• Marc van Lieshout, managing director at iHub, Radboud University Nijmegen
• Melanie Rieback, CEO and co-founder of Radically Open Security
• Nico Mookhoek, privacy lawyer and owner of NMLA
• Wilmar Hendriks, founder of Control Privacy and member of the Privacy First advisory board
• Alex Commandeur, senior advisor at BMC Advies.

In order to make sure that the award process is run objectively, panel members do not judge entries from their own organizations.

Privacy First organizes the Dutch Privacy Awards with the support of the Democracy & Media Foundation and in collaboration with ECP. Would you like to become a partner of the Dutch Privacy Awards? Then please contact Privacy First!

 


Today, the district court of The Hague ruled on the use of the algorithm-based system SyRI (System Risk Indication) by the Dutch government. The judges decided that the government, in trying to detect social services fraud, has to stop profiling citizens on the basis of large scale data analysis. As a result, people in the Netherlands are no longer 'suspected from the very start’ ("bij voorbaat verdacht").

The case against the Dutch government was brought by a coalition of NGOs, consisting of the Dutch Platform for the Protection of Civil Rights (Platform Bescherming Burgerrechten), the Netherlands Committee of Jurists for Human Rights (Nederlands Juristen Comité voor de Mensenrechten, NJCM), Privacy First, the KDVP Foundation (privacy in mental healthcare), Dutch trade union FNV, the National Clients Council (LCR) and authors Tommy Wieringa and Maxim Februari.

The court concludes that SyRI is in violation of the European Convention on Human Rights. SyRI impinges disproportionately on the private life of citizens. This concerns not only those that SyRI has flagged as an 'increased risk', but everyone whose data are analysed by the system. According to the court, SyRI is non-transparent and therefore cannot be scrutinized. Citizens can neither anticipate the intrusion into their private life, nor can they guard themselves against it.

Moreover, the court draws attention to the real risk of discrimination against and stigmatization of citizens in disadvantaged urban areas where SyRI is deployed, on the grounds of socio-economic status and possibly migration background. There is a risk – which cannot be examined – that SyRI operates on the basis of prejudices. The attorneys of the claimant parties, Mr. Ekker and Mr. Linders, had this to say: "The court confirms that the large-scale linking of personal data is in violation of EU law, Dutch law and fundamental human rights, including the protection of privacy. Therefore, this ruling is also important for other European countries and on a wider international level."

From now on, as long as there is no well-founded suspicion, personal data from different sources may no longer be combined.

Line in the sand

"This ruling is an important line in the sand against the unbridled collection of data and risk profiling. The court puts a clear stop to the massive surveillance that innocent citizens have been under. SyRI and similar systems should be abolished immediately", states Privacy First director Vincent Böhre.

"Today we have been proved right on all fundamental aspects. This is a well-timed victory for the legal protection of all citizens in the Netherlands", says Tijmen Wisman of the Platform for the Protection of Civil Rights.

Another plaintiff in the case, trade union FNV, equally rejects SyRI on grounds of principle. "We are delighted that the court has now definitively cancelled SyRI", comments Kitty Jong, vice chair of FNV.

Turning point

The parties hope that the ruling will herald a turning point in the way in which the government deals with the data of citizens. They believe this viewpoint is endorsed by the considerations of the court: these apply not only to SyRI, but also to similar practices. Many municipalities in the Netherlands have their own data linking systems which profile citizens for all sorts of policy purposes. When it comes to combining data, a legislative proposal that would be greater in scope than SyRI and would enable lumping together the databases of private parties and those of public authorities was anything but unthinkable. The decision by the Hague district court, however, clamps down on these Big Data practices. According to the claimant parties, it is therefore of crucial importance that the SyRI ruling will affect both current and future political policies.

Public debate

The case against SyRI serves both a legal and a social goal. With this ruling, both goals are reached. Merel Hendrickx of PILP-NJCM: "Apart from stopping SyRI, we also aimed at initiating a public debate about the way the government deals with citizens in a society undergoing digitisation. This ruling shows how important it is to have that discussion."

Although SyRI was adopted in 2014 without any fuss, the discussion about its legality intensified after the lawsuit was announced. At the start of 2019, the use of SyRI in two Rotterdam neighbourhoods led to protests among inhabitants and a discussion in the municipal council. Soon after, the mayor of Rotterdam, Ahmed Aboutaleb, pulled the plug on the SyRI program because of doubts over its legal basis. In June 2019, Dutch newspaper Volkskrant revealed that SyRI had not detected a single fraudster since its inception. In October 2019, the UN Special Rapporteur on extreme poverty and human rights, Philip Alston, wrote a critical letter to the district court of The Hague expressing serious doubts over the legality of SyRI. Late November 2019, SyRI won a Big Brother Award.

The coalition of parties was represented in court by Anton Ekker (Ekker Advocatuur) and Douwe Linders (SOLV Attorneys). The proceedings were coordinated by the Public Interest Litigation Project (PILP) of the NJCM.

The full ruling of the court can be found HERE (official translation in English).
