Privacy Perspectives June 2024


Published on June 12, 2024 by Carlo Cilento


Welcome to the first installment of Privacy Perspectives. This is a new space for deeper dives into Privacy Monthly material, and for stories that don't quite fit the Privacy Monthly. Every story comes with a direct link to the source, some commentary for context, and sometimes a personal take.

  1. EDPB cuts AI no slack
  2. YouTube drops the ball on political ads
  3. How do apps protect female health data?
  4. The Markup on car tracking and mortgage brokers
  5. SDKs and the FTC
  6. How GPS changed location data

EDPB cuts AI no slack

The AI Act is stealing the media spotlight, and for good reason: it is the first act of its kind and is likely to set the tone for AI policy discourse worldwide, much like the GDPR did for privacy law. But the privacy people are also discussing other AI-related news that flew under the radar of the general media: the European Data Protection Board (EDPB) published its report on the work of the ChatGPT taskforce.

Here is some context: in 2023 the Italian privacy watchdog banned ChatGPT for about a month over privacy concerns. This prompted the EDPB (that is, the committee where all EU privacy regulators sit) to launch a broader investigation through the so-called “ChatGPT taskforce”. The result is a report that lays out the common ground found by European regulators.

The report is very important because ChatGPT’s issues are largely common to all foundational models: for instance, they hallucinate, they cannot be made to forget data, and they are mostly trained on nonconsensually scraped data. All of these are serious issues that regulators will need to tackle in the near future, and their approach will heavily impact foundational models on the EU market.

The report doesn’t beat around the bush and states quite clearly that regulators expect full compliance from the providers of AI and that technical impossibility is no excuse for non-compliance. In other words, the EDPB is not willing to cut OpenAI (and other players) any slack on the grounds that complying with the GDPR is technically impossible.

The report stresses that implementing safeguards can help with compliance. But we should be realistic here: many safeguards that are commonplace in other industries simply do not work for AI, at least given the current state of the art. If your training data is the entire open Web, things like anonymization and sanitization of sensitive data are simply not possible, nor is any serious work to improve data quality.
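To make the point concrete, here is a minimal sketch of what rule-based sanitization typically looks like; the patterns are illustrative, not a real pipeline. It catches the obvious cases and misses everything the rules don't anticipate, which at Web scale is most of the sensitive data.

```typescript
// Illustrative regex-based PII scrubbing: the kind of safeguard that
// works on a small, curated dataset but not on web-scale training data.
const EMAIL = /[\w.+-]+@[\w-]+\.[\w.]+/g;
const PHONE = /\+?\d[\d\s().-]{7,}\d/g;

function scrubPii(text: string): string {
  return text.replace(EMAIL, "[EMAIL]").replace(PHONE, "[PHONE]");
}

// The easy cases get caught...
console.log(scrubPii("Contact alice@example.com or +1 555 123 4567"));
// ...but names, addresses, and health details sail straight through,
// because no rule anticipated them.
console.log(scrubPii("Alice Smith from 12 Oak Lane stopped her medication"));
```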

Individual regulators may very well stray from the stance of the EDPB, as the report is not binding in nature. And of course, there is no saying where the Court of Justice will stand when it finally deals with AI and the GDPR.

Nonetheless, should the line in the report prevail, foundational models might be in trouble on the EU market.

YouTube drops the ball on political ads

An investigation by Access Now and Global Witness highlights that YouTube is doing little or nothing to address election disinformation in India.

The two groups uploaded 48 video ads containing grossly false electoral information in three languages, including English, which should be the easiest one for Google to work with. All of them passed YouTube’s review. The only reason they were not broadcast is that Access Now pulled them beforehand.

Maybe those mass layoffs from Google’s trust and safety team weren’t such a great idea after all?

How do apps protect female health data?

Writing for The Pulse, Matt Fisher summarizes a recent study on privacy in female mHealth apps on the US market. Spoiler alert: privacy practices are terrible across the industry. To no small extent, this is because many mHealth apps are not covered by HIPAA, a US health care sector law that protects health information.

As Matt correctly points out, HIPAA can be confusing for non-lawyers. Whether data fall under HIPAA depends not only on their nature but also on the context in which they were collected. To grossly simplify, health data collected outside the health care system do not fall under HIPAA, no matter how sensitive they might be.

So, “Alice’s menstrual cycle stopped” is protected health information when Alice tells her doctor, but not when she types it into her mHealth app. This is counterintuitive and, therefore, confusing for Alice. She may mistakenly think that the information is always covered by HIPAA and believe her data to be safer than they actually are.

It is worth noting that health data privacy has been incredibly important since Dobbs v. Jackson. After the ruling, residents of certain States risk prosecution and imprisonment for seeking health care, and mHealth apps are a treasure trove of potentially incriminating evidence. The FTC is doing its best to control the damage, but there will be no real fix until the US protects health data with a federal privacy law.

The Markup on car tracking and mortgage brokers

When the harms of surveillance are discussed, people usually think of future dystopias and spy story scenarios. The reality is often more mundane: think less “1984” and more “dangerous ex stalking you”.

An excellent article co-published by The Markup and CalMatters explains how car tracking enables domestic abuse by allowing the abuser to locate the driver. As the author correctly notes, cars are often a lifeline for victims of abuse, which makes car-enabled stalking all the more problematic. Sometimes even a restraining order is not enough to stop the tracking.

The Markup also investigated the use of Meta’s pixel by US mortgage brokers and found that many of them, including some heavyweights, share users’ financial data with Facebook without their consent or even their knowledge.

Meta bans businesses from sending sensitive information via its pixel and claims to use automated tools to block such data from being sent. That being said, the results of The Markup’s investigation suggest that Meta is probably not enforcing its policies too strictly.
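For illustration, here is a hypothetical sketch of how this kind of leak usually happens. The fbq("track", ...) call is the Pixel's real client-side API; the event payload and field names below are invented for the example, not taken from The Markup's findings.

```typescript
// Minimal declaration of the Meta Pixel's global function, which the
// pixel snippet normally injects into the page.
declare const fbq: (
  command: "track",
  eventName: string,
  customData?: Record<string, unknown>,
) => void;

// A developer wires form values straight into a standard "Lead" event.
// Custom properties are forwarded to Meta as-is, alongside the cookie
// and browser identifiers the pixel already collects.
fbq("track", "Lead", {
  loan_amount: 420000,           // hypothetical field
  credit_score_band: "680-719",  // hypothetical field
  veteran_status: "yes",         // hypothetical field
});
```

Nothing in this flow requires malice: a marketing team asks for richer conversion data, a developer passes the form object along, and sensitive fields travel to Meta by default.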

SDKs and the FTC

Andrew Folks takes an in-depth look at some of the legal issues of software development kits (SDKs) and offers an overview of recent FTC enforcement against illegal SDK tracking.

A software development kit (SDK) is a bundle of software-building tools. Typically, the owners of an SDK will incorporate tracking technology in the code and make it available to third-party developers. As a result, developers get to use the SDK for free and the SDK owners get to collect data from the end user.
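Here is a hypothetical sketch of that bargain; every name in it is invented for illustration. The developer calls the useful, documented API, and the SDK reports home as a side effect.

```typescript
// Invented example of an ad SDK whose free feature doubles as tracking.
class AdsSdk {
  constructor(private appId: string) {
    // Phones home on startup with a device identifier the app
    // developer never explicitly chose to share.
    this.report("init", { device: getDeviceId() });
  }

  // The "free" feature the developer integrates the SDK for.
  showBanner(slot: string): void {
    this.report("impression", { slot });
  }

  private report(event: string, data: Record<string, string>): void {
    fetch("https://collect.sdk-owner.example/v1/events", {
      method: "POST",
      body: JSON.stringify({ app: this.appId, event, ...data }),
    });
  }
}

function getDeviceId(): string {
  // Stand-in for a persistent advertising identifier.
  return "device-1234";
}

// The developer's code looks innocent; the data collection is baked in.
new AdsSdk("com.example.flashlight").showBanner("home-screen");
```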

SDKs are the privacy catastrophe that hardly anyone is talking about. Just about everyone has this spyware on their phones. This happens even on the EU market, despite the strict opt-in consent the ePrivacy Directive requires for such tracking. In fact, many companies essentially sidestep the law by requiring consent to tracking for the app to work at all.

How GPS changed location data

Writing about the recent FCC fines against mobile carriers, Cobun Zweifel-Keegan provides an interesting overview of how GPS changed the economic value of location data. In a nutshell, GPS created new and profitable revenue streams for communications carriers but also generated new expectations of privacy among customers.
