Can Germany Learn to Love Technology as Much as Science? — Privacy & Automation in a post-COVID World

Alex Rutherford
Nov 19, 2021 · 4 min read

I live in Berlin, Germany. I work for the Max Planck Institute, a highly trusted public institution backed by long-term state funding. Germany is, of course, one of the most developed countries in the world, responsible for many marvels of manufacturing and science, and it even elected a scientist to lead the country for 16 years.

Given this, it surprises many that Germans are at best ambivalent towards new technology, the eventual application of science. In fact, Google ranked Germany last out of 27 European countries in readiness for digital learning, fax machines remain a growing channel of communication, and Germans suffer some of the slowest and most expensive mobile data in Europe.

Part of this stems from extremely conservative values regarding data privacy. A Eurobarometer report from 2015 noted that Germans are the Europeans most likely to say they feel they have no control over their information on the internet (45%) and, unsurprisingly given their history, also lead in awareness of mass data collection by governments (76%).


As part of the process of renting an apartment in Germany, I had to present a number of documents: passport, employment contract and others. Not unusual. What is unusual is that much of this documentation is exchanged in hard copy rather than electronically (German law generally considers electronic signatures non-binding). Without access to a printer, I went to a printing shop and signed onto a shared computer. Since hard copies are the norm in Germany, many previous users had also needed to print their documents there. I was horrified to find that many of those documents were visible to any other user in the Downloads folder: boarding passes, TV license correspondence, payslips and so on.

These various pieces of empirical data and anecdote prompted me to think about attitudes to science and technology here in the heart of Europe, and how the long-term changes in behaviour due to COVID might affect them.

At this point it is helpful to recapitulate why an institution stores personal data. Organisations that provide services such as filing taxes, medical care or renting an apartment need to keep records of personal data in order to offer the same service continuously over time, a kind of institutional memory. As Max Weber, an (ironically) German sociologist, noted, this allows bureaucracies to be rational and consistent, and is thus a prerequisite for fairness and inclusion.

As such, bureaucracies are in fact forms of automation that lie on a spectrum including driverless vehicles and personal computers. Organisations need data to operate, just as machines do for producing textiles (punch cards) or exchanging information (personal communications). The kind of data required varies a lot between these applications. But as I have written before, the thorniest (and most interesting) phenomena arise at the shifting interface between humans and machines.

Max Weber

Could the embrace of automation be the key to resolving these Teutonic attitudes to data privacy and technology? This is particularly relevant given the swift adoption of many new technologies and practices during the pandemic, a phenomenon David Autor has termed "automation forcing".

It is true that digital technologies allow information to be retrieved and transferred quickly and at scale. Thus a single adversary intent on releasing private data can do far more damage than if they had to interrogate individuals and record their information with pen and paper (something former East Germans are painfully aware of). So it might reasonably be assumed that automation uniformly creates unwanted privacy risks.

But my experience in the copy shop suggests the opposite. If the required information exchange were automated, i.e. electronic, there would be no copies left on a shared computer, nor a physical copy to be destroyed or archived. Humans are not exactly secure conduits for sensitive data, even when their intentions are benign. A popular attack in the last decade was the "USB drop": a discarded USB drive would be found by an unsuspecting user and connected to their device. From there, an adversary can do many things, from logging keystrokes to storing charge and literally destroying the machine. The 2020 Verizon Data Breach Investigations Report lists phishing and credential theft (read: insecure passwords) as the top actions leading to data breaches, decidedly human phenomena.

But conversely, what of the privacy risks that automation removes? Fraud and anomaly detection are sophisticated automated tools that flag suspicious transactions and events. Penetration testing of company and government networks can likewise be automated with quite accessible software. And Facebook, not a company often associated with satisfactory data protection, has used a considerable proportion of the energy in its data centres to continually update access permissions on content in light of changing privacy controls.
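To make the anomaly-detection point concrete, here is a minimal sketch in Python. It uses a simple z-score rule to flag transactions that deviate strongly from the historical mean; real fraud-detection systems use far richer features and models, and the function name, threshold and sample data here are purely illustrative assumptions.

```python
# Minimal sketch: flag transactions whose amount deviates strongly
# from the mean of the batch (z-score rule). Threshold and data are
# illustrative, not taken from any real system.
from statistics import mean, stdev

def flag_anomalies(amounts, threshold=2.0):
    """Return indices of amounts more than `threshold` sample
    standard deviations away from the mean."""
    mu = mean(amounts)
    sigma = stdev(amounts)
    if sigma == 0:  # all amounts identical: nothing to flag
        return []
    return [i for i, a in enumerate(amounts)
            if abs(a - mu) / sigma > threshold]

# Seven ordinary payments and one outlier at index 5.
history = [12.5, 9.9, 11.0, 10.4, 13.1, 950.0, 12.2, 10.8]
print(flag_anomalies(history))  # → [5]
```

The point of the example is that no human needs to read each transaction: the machine scans everything, uniformly and tirelessly, which is precisely the kind of privacy-protecting work that is impractical to do by hand.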

Will the practices adopted in response to COVID-19 that ease these problems persist in Germany? Recent developments are mixed. On one hand, Angela Merkel recently unveiled plans to respond to the COVID-19 pandemic by investing in a new epidemiological research hub in coordination with the World Health Organisation. Meanwhile the Anmeldung process, the mandatory registration of your residence that is notorious among expats, which was rapidly retooled to be conducted electronically during the pandemic, returned to a paper format on 1st January 2021.
