Privacy by design vs. privacy by paperwork

Let's talk about the difference between trying to be GDPR-compliant and not processing any personal data at all.

Lisa Figas

Lisa is TelemetryDeck's co-founder and product/marketing person

When I looked at our competitors' websites the other day, my jaw dropped: Privacy was listed as a feature and the explanatory text said, “Your data is yours to own. [Company Name] does not sell our customers' user data.” Phew, quite honestly: Not selling data should be a matter of course, not a marketing feature. Also, this sounds like a business decision and not one that was made in product design.

I take this moment of discovery as an opportunity to explain what our understanding of privacy is. And I want to make an important distinction: privacy by design vs. privacy by paperwork.

Privacy by design

Stop using tools just because you've always used them. Actively search for alternatives

Privacy by design – as we understand it – means that, from the very start of a project, all technological decisions are geared towards privacy. To be specific: storing only the data that is absolutely necessary, or actively seeking strategies that do not require any data or require as little data as possible.

This approach also takes into account the principle of data minimization, which is required by the GDPR, for example. Only personal data that is absolutely necessary should be collected. The fact that someone might look at some statistics in the distant future is not sufficient justification.

To give a concrete example, let me describe the process that TelemetryDeck implemented from day 1 to separate user behavior information from actual users: Our SDKs accept a custom user identifier, such as an email address or an internal identifier, to help you identify your users. We salt and hash the identifier on the user’s device. When a signal arrives at our server, we add our own salt to the user identifier and hash it again. This ensures that neither TelemetryDeck nor our customers can reconstruct the original identifier, protecting the user’s privacy.
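The double-hashing scheme described above can be sketched roughly like this. This is an illustrative sketch, not TelemetryDeck's actual implementation: the salt values, function names, and the choice of SHA-256 are all assumptions made for the example.

```python
import hashlib

# Illustrative salts -- in practice the client salt ships with the app/SDK,
# while the server salt is a secret known only to the analytics backend.
CLIENT_SALT = "app-specific-salt"
SERVER_SALT = "secret-server-salt"


def hash_on_device(user_identifier: str) -> str:
    """Step 1: the SDK salts and hashes the identifier (e.g. an email
    address) on the user's device, before it is ever transmitted."""
    return hashlib.sha256((CLIENT_SALT + user_identifier).encode()).hexdigest()


def hash_on_server(client_hash: str) -> str:
    """Step 2: when the signal arrives, the server adds its own salt
    and hashes the already-hashed value again."""
    return hashlib.sha256((SERVER_SALT + client_hash).encode()).hexdigest()


# The stored ID is stable for the same user, so usage can still be
# aggregated per user -- but neither side alone can reverse it back
# to the original identifier.
stored_id = hash_on_server(hash_on_device("jane@example.com"))
```

Because the first hash happens on the device, the raw identifier never reaches the server; because the second salt never leaves the server, a customer holding only the client-side hash cannot reproduce the stored ID either.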

So privacy by design, as I understand it, is something that developers implement directly in products they build from scratch and fully understand. Privacy by paperwork, by contrast, is done by external consultants: people who do not work on the products themselves and are usually not directly affected by the strategies they develop.

Privacy by paperwork

There are numerous tutorials on how to (supposedly) use an analytics tool in a legally secure way. Especially in the case of software that transfers personal data to the USA, or to a company headquartered in the USA, there is a lot of work to be done. I'll give you a brief overview.

  • SCC: Since the Privacy Shield, which provided the legal framework for data transfers to the USA, was invalidated, other legal bases for such transfers must be established. For this purpose, you agree on so-called Standard Contractual Clauses (SCC) with the provider. However, these alone are not sufficient as a safeguard.
  • DPA: You sign a data processing agreement (DPA) with the US provider of the analytics software, i.e. a contract stipulating that the company may process your users' data on your behalf.
  • Storage location: Pay attention to where the analytics tool stores the data. Data from users located within the EU should not leave the EU if possible. Many providers therefore offer the option of booking data centers within the EU. While this is helpful with regard to the GDPR, the problem remains that access by US authorities can still be enforced through the CLOUD Act. So even this measure alone is not sufficient to protect the personal data of EU citizens.
  • Consent: App users must provide consent prior to data collection. It should be possible to refuse or allow tracking in general and tracking for advertising purposes individually. By the way: An analysis done by Flurry in 2021 showed that 96% of users reject the analysis of their activities.
  • Settings: When setting up analytics software, you must make sure that so-called IP anonymization is performed. This means that part of the IP address is truncated during tracking, making it more difficult to identify the individual user. However, I would like to note that this measure in particular is window dressing, because the IP address has long since ceased to be the only way to identify individuals.
  • Data deletion: Users of your app have the right to request the deletion of their personal data. Requests for data erasure can also be made in relation to the data stored by an analytics company. You have to react within certain deadlines and carry out the deletion very thoroughly. Make sure that the provider you choose supports these deletion requests.
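To make the IP anonymization point concrete, truncation usually means zeroing the host part of the address before it is stored. The sketch below is an assumption for illustration; the /24 and /48 prefix lengths mirror common vendor defaults, but actual tools may truncate differently.

```python
import ipaddress


def anonymize_ip(ip: str) -> str:
    """Zero out the host part of an IP address: keep the first three
    octets of an IPv4 address (/24) or the first 48 bits of an IPv6
    address, and discard the rest."""
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)


print(anonymize_ip("203.0.113.42"))  # → 203.0.113.0
```

Note that a /24 still narrows a user down to at most 256 addresses, which is exactly why this measure alone does little against fingerprinting by other signals.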

Austrian data privacy activist Max Schrems likes to call these measures an “ice floe” tactic. If you want to stand on a pack of drift ice, no individual floating ice sheet is enough to support your weight. You have to stack them all on top of each other and hope that they hold. There is no guarantee that they will.

The everlasting drama of analytics software

Talk openly about supporting the principle of privacy by design.

The other day, an entrepreneur told me that he installs Google Analytics in all his customers' projects because they demand it. They already use the same service for the website, so it makes sense to use it to analyze the app as well. But he is sure that 90 percent never look at the numbers. The reasons for this are many and varied. Sometimes there is no budget for optimizing an app at all—so the usage data is not even relevant. In other cases, there is simply no one in the company who is qualified to evaluate usage data. A shift in priorities can also be a reason for unnecessarily collecting data that no one looks at.

In this way, almost endless databases full of personal data are created. They contain sensitive information about the behavior and preferences of individuals. The only entity that benefits from this data is the analytics provider—in this case, Google. The company benefits because the collected information can be used to optimize advertising for each individual Google user. Everyone else loses. The users lose their privacy and the app providers lose time and money in the completely unnecessary attempt to make Google Analytics reasonably legally compliant. This whole situation is infuriating and pointless!

How can we do better?

So what can the app industry do to improve the situation for everyone? I've put together a few suggestions.

  • Acknowledge the privacy issues that arise from monitoring usage behavior.
  • Stop using tools just because you've always used them. Actively search for alternatives.
  • Build analytics software according to the principle of privacy by design.
  • Budget for tools that support your business model and respect the privacy of your users. Stop using the free standard without calculating the consequences.
  • Talk openly about supporting the principle of privacy by design.

It is necessary that we work together to restrain surveillance capitalism and all its negative effects on society, democracy, and the climate. None of us can do it alone. But together we can achieve a significant improvement.