social.stefan-muenz.de

Items tagged with: edri

I've joined ReclaimYourFace in calling to protect our public spaces from biometric mass surveillance - will you?
https://reclaimyourface.eu/
#ReclaimYourFace #EDRi
 
As an EU citizen, I've submitted my response to the European Commission’s public consultation on artificial intelligence (AI) after consulting EDRi's recommendations. (PDF alert).

#AI #ArtificialIntelligence #EU #EC #EDRi
 
EDRi calls for fundamental rights-based responses to COVID-19

In a recent statement released on 20 March 2020, European Digital Rights
(EDRi) calls on the Member States and institutions of the European Union
(EU) to ensure that, while developing public health measures to tackle
COVID-19, they:
  • Strictly uphold fundamental rights;
  • Protect data for now and the future;
  • Limit the purpose of data for COVID-19 crisis only;
  • Implement exceptional measures for the duration of the crisis only;
  • Condemn racism and discrimination;
  • Defend freedom of expression and information.
EDRi’s Head of Policy, Diego Naranjo, explains that: "EDRi supports
necessary, proportionate measures, fully in line with national and
international human rights and data protection and privacy legislation,
taken in order to tackle the COVID-19 global pandemic. These measures
must not, however, set a precedent for rolling back the fundamental
rights obligations enshrined in European law."

EDRi recognises that Coronavirus (COVID-19) disease poses a global
public health challenge of unprecedented proportions. The use of
good-quality data can support the development of evidence-based
responses. However, we are witnessing a surge of emergency-related
policy initiatives, some of them risking the abuse of sensitive personal
data in an attempt to safeguard public health. When acting to address
such a crisis, measures must comply with international human rights law
and cannot lead to disproportionate and unnecessary actions. It is also
vital that measures are not extended once we are no longer in a state of
emergency.

EDRi's Executive Director, Claire Fernandez, emphasises that: "In times
of crisis, our authorities and communities must show responsibility,
resilience, solidarity, and offer support to healthcare systems in order
to protect our lives. States’ emergency responses to the COVID-19
pandemic must be proportionate, however, and be re-evaluated at
specified intervals. By doing this, states will prevent the
normalisation of rights-limiting measures, scope creep, data retention
or enhanced surveillance that will otherwise be harmful long after the
impacts of the pandemic have been managed."

In these times of pandemic and emergency measures, EDRi expresses
solidarity towards collective protection and support for our health
systems. We will continue monitoring and denouncing abuses of human
rights in times when people are particularly vulnerable.

Read full statement "EDRi calls for fundamental rights-based responses
to COVID-19": https://edri.org/covid19-edri-coronavirus-fundamentalrights/

EDRi Members and Observers' Responses to COVID-19:

#EDRi #CivilRights #DigitalRights #Europe
 

Your family is none of their business



  • Today’s children have the most complex digital footprint in human history, with their data being collected by private companies and governments alike.
  • The consequences for a child’s future revolve around the freedom to learn from mistakes, the reputational damage caused by past mistakes, and the traumatic effects of discriminatory algorithms.
Summer is that time of the year when parents get to spend more time with their children. Often enough, this also means children get to spend more time with electronic devices, their own or their parents’. Taking a selfie with the little one, or keeping them busy with a Facebook game or a YouTube animations playlist – these are the kinds of things that make the digital footprint of today’s child the largest in human history.

Who wants your child’s data?

https://edri.org/your-family-is-none-of-their-business/

Mobile phones, tablets and other electronic devices can open the door to the exploitation of data about the person using that device – how old they are, what race they are, where they are located, what websites they visit, etc. Often enough, that person is a child. But who would want a child’s data?

Companies that develop “smart” toys are the first example. In the past year, they’ve been in the spotlight for excessively collecting, storing and mishandling minors’ data. Perhaps you still remember the notorious case of “My Friend Cayla”, the “smart” doll that was shown to record children’s conversations and share them with advertisers. In fact, the doll was banned in Germany as an illegal “hidden espionage device”. However, the list of “smart” technologies collecting children’s data is long. Another example of a private company mistreating children’s data is Google offering its school products to young American students and tracking them across their different (home) devices to train other Google products. A German DPA (Data Protection Authority) also decided to ban Microsoft Office 365 from schools over privacy concerns.

Besides private companies, state authorities have an interest in recording, storing and using children’s online activity. For example, a 2018 Big Brother Watch report points out that in the United Kingdom the “Department for Education (DfE) demands a huge volume of data about individual children from state funded schools and nurseries, three times every year in the School Census, and other annual surveys.” Data collected by schools (child’s name, birth date, ethnicity, school performance, special educational needs and so on) is combined with social media profiles or other data (e.g. household data) bought from data brokers. Why link all these records? Local authorities wish to focus on training algorithms that predict children’s behaviour in order to identify children supposedly prone to gang affiliation or political radicalisation.

Consequences for a child’s future

Today’s children have the largest digital footprint in human history. Sometimes, the collection of a child’s data starts even before they are born, and this data will increasingly determine their future. What does this mean for kids’ development and their life choices?

The extensive collection of data about today’s children aims at neutralising behavioural “errors” and optimising their performance. But mistakes are valuable during a child’s self-development – committing errors and learning from them is an important complement to receiving knowledge from adults. In fact, a recent psychology study shows that failing to answer a test question benefits the learning process. Constantly using algorithms to optimise performance based on a child’s digital footprint will damage the child’s right to make and learn from mistakes.

https://invidio.us/watch?v=afYVNHDLTMc

#edri #kids #family #children #gafam #protection #privacy #collection #tracking
 

Microsoft Office 365 banned from German schools over privacy concerns


In a bombshell decision, the Data Protection Authority (DPA) of the German Land of Hesse has ruled that schools are banned from using Microsoft’s cloud office product “Office 365”. According to the decision, the platform’s standard settings expose personal information about school pupils and teachers “to possible access by US officials” and are thus incompatible with European and local data protection laws.

The ruling is the result of several years of domestic debate about whether German schools and other state institutions should be using Microsoft software at all, reports ZDNet. In 2018, investigators in the Netherlands discovered that the data collected by Microsoft “could include anything from standard software diagnostics to user content from inside applications, such as sentences from documents and email subject lines.” All of this contravenes the General Data Protection Regulation (GDPR) and potentially local laws for the protection of the personal data of underage pupils.

While Microsoft’s “Office 365” is not a new product, the company has recently changed its offering in Germany: Until now, it provided customers with a special German cloud version hosted on servers run by German telecoms giant Deutsche Telekom. Deutsche Telekom served as a kind of infrastructure trustee, putting customer data outside the legal reach of US law enforcement and intelligence agencies. In 2018, however, Microsoft announced that this special arrangement would be terminated in 2019 and that German customers would be offered a move to Microsoft’s standard cloud offering in the EU.

Microsoft insists that nothing changes for customers, because the new “Office 365” servers are also located in the EU or even in Germany. However, legal developments in the US have put the Hesse DPA on high alert: The newly enacted US CLOUD Act empowers US government agencies to request access to customer data from all US-based companies, no matter where their servers are located.

To make things even worse, Germany’s Federal Office for Information Security (BSI) recently expressed concerns about the telemetry data that the Windows 10 operating system collects and transmits to Microsoft. So even if German (or European) schools stopped using the company’s cloud office suite, its ubiquitous Windows operating system would still leak data to the US, with no way for users to control or stop it.

School pupils are usually not able to give consent, Max Schrems from EDRi member noyb told ZDNet. “And if data is sent to Microsoft in the US, it is subject to US mass surveillance laws. This is illegal under EU law.” Even if that was legal, says the Hesse DPA, schools and other public institutions in Germany have a “particular responsibility for what they do with personal data, and how transparent they are about that.”

It seems that fulfilling those responsibilities hasn’t been possible while using Microsoft Office 365. As a next step, it is crucial that European DPAs discuss these findings within the European Data Protection Board and come to an EU-wide rule that protects children’s personal data from unregulated access by US agencies. Otherwise, European schools would be well advised to switch to privacy-friendly alternatives such as Linux, LibreOffice, and Nextcloud.

https://edri.org/microsoft-office-365-banned-from-german-schools-over-privacy-concerns/

#microsoft #office365 #office #spy #school #germany #german #edri #banned #privacy #education #libreoffice #nextcloud #linux
 