»Privacy won't come back overnight«

With styling tips and anti-drone fashion, he takes on surveillance – Adam Harvey wants to bring fun into the uncomfortable debate about data protection and privacy. In this interview he explains why that matters and where things can go from here. – This text is available in German and English.

21 September 2016 · 15 minutes

Have you ever thought about choosing special make-up to hide from video surveillance? Probably not – but it works! Adam Harvey is an artist and engineer working on creative ways to adapt to global mass surveillance. Besides »CV Dazzle«, style tips against facial recognition, he has designed »Stealth Wear«, fashion that shields you from infrared cameras, and developed the »Off Pocket«, a Faraday cage phone case that blocks tracking and eavesdropping. He also created the »Think Privacy« slogans, »Skylift«, a device to spoof a smartphone's geolocation, and »Camoflash«, an anti-paparazzi flashlight.

Five months ago he moved from New York to Berlin. Now he speaks with me about art, surveillance and democracy.

Adam, I see you’re not wearing your Stealth Wear. Don’t you believe in the garments you’re selling?

Adam Harvey: I think that Stealth Wear and another project I worked on called »CV Dazzle« – a project to work around face recognition with special haircuts and make-up – are not impractical, but very specific solutions. You might own a gown or tuxedo, but you don't wear that every day; that would be ridiculous. Even if you wear a tuxedo once a year, you could justify buying it. With Stealth Wear it's similar. There might be a situation, depending on who you are and where you live, where blocking thermal imaging is how you want to or need to dress. Not that you need to buy the clothes that I make, but the idea is that if somebody could make a counter-surveillance garment that's practical once a year or once every other year, then it has value. And if that one time you wear it is very critical, then it can have a lot of value. I think that Stealth Wear and CV Dazzle could become more practical as we learn how vulnerable we are to emerging types of surveillance.
Artist and engineer at the same time – Adam Harvey in his studio. – Source: Adam Harvey
Tell me more about that.

Adam Harvey: I think, although this is a bit speculative, that walking around in a highly surveilled environment and exposing your face is going to feel uncomfortable in a matter of a few years. For example, even retail stores now have facial recognition. And so now somebody can send you an e-mail based on you visiting that store. And it’ll say potentially: »Thanks for visiting, here’s 10% off if you come back.« And you’ll be like: How did you know that I was shopping for spaghetti? The company knows exactly where you were in the store, at what time and whether you purchased something or not. It gets very specific, very psychological, very behavioural.

There are many people who say they don’t have anything to hide. If the shop knows that they’ve looked at spaghetti and they get 10% off that is an advantage.

Adam Harvey: The coupon strategy is actually effective. It means that people are willing to exchange privacy, or their data, for a coupon. Making people understand what it is they're giving out and what it means to have privacy or control of your data is at the heart of the question whether you have nothing to hide or nothing to fear. If you don't worry about your data – »It's just a chat«, »It's just an e-mail«, »It's just my location«, »It's just my friend's address« – when you internalize this, it's very easy to say that there is no damage. This is the big problem with privacy from a legal perspective: there is no physical harm that I'm doing to you by taking your data or incentivizing you to give it to me.

Then why worry about privacy?

Adam Harvey: The long term consequences of these short term decisions are that you have these massive, massive (!) databases which centralize intelligence about people, about culture, about businesses, about demographics – deep insights into what people are thinking about right now, where everybody has been, everything that you've thought about, purchased, written, said, visited. Everyone. It's enormous. You can't unplug Google. When you participate in these services, you're changing the course of democracy by creating powerful companies that have an increasing amount of leverage over world decisions.

A company as digital state

What leverage would that be?

Adam Harvey: You can't unplug Google. Quite literally, the world would begin to stop. Even at a personal level. Maybe not you, but a lot of people – you can't unplug them from Google anymore. As Evgeny Morozov has said: »Google is the backend of people's lives.« You wake up in the morning – communication, travel, finding what you need, finding places to get food – you do it through Google. Google is a part of your life; you can't easily remove that. Even if I do, I e-mail people who use Google. There is no way to avoid this kind of »state«. It's not a nation state, but a digital state that we participate in but don't get to vote in. It's not a democracy; we're not making the decisions. We have some influence by not using it or using a competitor. But you have no right to say anything to Google. They have their own type of governance that controls many things in the same way that a real government would. This is the long term problem. When you say »If I have nothing to hide, I have nothing to fear – give it all away«, you're a part of this.

This idea comes from companies like Sun Microsystems, Oracle and Google; Facebook came later. They promote a Cybernetic Utopia: connect everyone through Google, connect everyone through technology, and we create an equilibrium where machines take care of the work in the background. Everyone is kind of free, because a lot of things are provided or taken care of by machines and networks, creating a baseline stability that gets rid of corruption in politics – because now everyone is equally known in the network, everyone has equal weight.

Of course this doesn't work, because some people want more power. The Cybernetic Utopia is echoed in Twitter, in Facebook and a lot of social media: »Everyone can use this and they have the ability to communicate with everyone.« Well, not really, because I can say something and the algorithm quickly weighs it down. You can say something, pay 5 Euro to promote it, and then it goes to the top of my feed. You can control the visibility of what's seen.

And then there's the Facebook emotional contagion study, where Facebook changed the mood of the news appearing on someone's feed, and that changed their behaviour. That is sinister, because when you're in a good mood you make different decisions about your life than when you're in a bad mood. So the avalanche effect is: I change something in the system, because I control it, and that changes the way that you feel, and that changes the things that you decide. And that changes what you study, what you decide to wear, who you decide to hang out with – and so: your life.

You’re an artist, activist, a privacy advocate. Where do you find the courage to take on that powerful digital state?

Adam Harvey: Well, people do say that my work is activism. But I like what Jake Appelbaum says – I guess you could also call him an activist, previously one of the developers of Tor (free software for anonymous communication that directs traffic through several Tor servers, thus concealing the users' location). He said that people use the word activism pejoratively to describe someone who is participating in democracy. But actually activism is what you are supposed to be doing if you live in a democracy. And so it's somehow backwards that we think about activism as something outside, unnatural. So in that way I don't think I'm an activist.

Okay, but even if you’re not an activist, still you do something. Why?

Adam Harvey: I'm vocal. My work is a bit of a struggle, because people don't want to hear about the risks and consequences of what they do. The critical side of things is not always that fun – what I like to do is make it fun. It's not about finger wagging, it's just about being savvy. A lot of people aren't interested in discussing privacy if it gets in the way of making money. People working at ethically dubious tech companies often don't want to discuss privacy issues, because their company is part of the problem.
With his project »Think Privacy«, Adam Harvey wants to get people interested in privacy. – Source: Adam Harvey
But there are also a lot of people who are not making money with data and they are not interested either. Or they don’t do anything, because they feel unable to. What about them?

Adam Harvey: Even people who work professionally in privacy feel like the challenge is overwhelming. But I always like to encourage people and note that it's kind of foolish to think privacy is going to fix itself overnight or even within a year. It took a while to get where we are now. »Let's Encrypt« – a certificate authority launched in April 2016 that issues certificates in an automated process requiring little time and effort, thus making encryption more attractive – was a huge step forward, but that movement was years in the making. So it is important to start now, but it's 2 years for some of the easy fixes, 5 years for some of the harder fixes, and maybe it's 100 years before society changes enough that people actually have the data protections they deserve. That might sound fatalistically long-term, but I think I'm being the opposite: realistic and optimistic. You can't approach privacy expecting too much to happen too soon. Progress will be incremental.

Privacy Step by Step

Incremental Privacy – what does that mean?

Adam Harvey: Taking a lot of small steps. You understand that you have to go to work tomorrow and you have to use email. So maybe you can improve the email. So you decide to use a different provider instead of Google. It doesn't fix everything, because you still email other people with Gmail addresses, and your WiFi MAC address (the unique hardware address of your device) and phone signals are being tracked everywhere you go. There are companies that sell this data. You can improve that by turning off WiFi when you leave the house or using airplane mode more often. That's kind of annoying, but it's manageable. You then realize that your IP address, your browser fingerprint (the combination of details your browser transmits automatically – your language and display settings, your operating system, the location of your provider's server – which together can generally identify your browser) and your cookies (small files a website stores in your browser so it can recognize you on your next visit) are being used to track you everywhere you go online. You can improve that situation by using a VPN (a Virtual Private Network, an encrypted network within the internet that makes you appear to be in a different location) to cloak your IP address, installing browser extensions to block tracking, or clearing your cookies every week, every day, or a couple of times a day, depending on how concerned you are about all this.
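The tracking Harvey describes works by combining individually harmless details into one unique identifier. A minimal sketch of that principle – the attribute names and values below are hypothetical examples, not data from any real tracker:

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Combine a browser's advertised attributes into one stable identifier."""
    canonical = "|".join(f"{key}={attrs[key]}" for key in sorted(attrs))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Hypothetical values of the kind a browser transmits automatically.
visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/49.0",
    "language": "de-DE",
    "screen": "1920x1080x24",
    "timezone": "Europe/Berlin",
    "fonts": "Arial,DejaVu Sans,Liberation Serif",
}

# The same browser yields the same ID on every visit; changing even one
# attribute (e.g. the language setting) yields a completely different one.
print(fingerprint(visitor))
```

No single attribute identifies anyone, but the combination usually does – which is why each countermeasure Harvey mentions (blocking trackers, clearing cookies, a VPN) removes only part of the signal.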

I see where you're going. So the logical conclusion would be to only use Signal (a free smartphone messenger for encrypted communication, previously known as TextSecure) and Tor, and give up all your convenience for the sake of privacy?

Adam Harvey: No, it's not all or nothing. That's how people end up losing: by becoming fatalistic about it.

So what then?

Adam Harvey: It's just like politics: if you change one person's mind, they change someone else's perspective, and that has the potential to grow a lot. There's this project, »Think Privacy«, that I'm working on, and I think it can help get people interested in privacy.

In which way?

Adam Harvey: One of the posters that I made, »data never dies«, is about this larger idea: You're here in Berlin and you know that. You don't know where else you are, though. You're probably somewhere in Facebook's or Google's data centre in Ireland. You're probably on a Facebook server in – I think it's Prineville, Oregon – and you're on a lot of other servers you have no idea about. I don't think anyone has any idea how many places in the world they exist anymore. And it doesn't really matter if it's just a couple of words that you wrote. And it doesn't really matter if it's a photo of your cat. But if someone has a photo of you in your apartment, your heart rate, your eye colour, your hair colour, your height and clothes – over a while all your clothes, your wardrobe – everything that you've written in emails, your speed of typing, your voice and so on… I don't mean to get creepy, but a lot of that information about you is already stored in databases and linked to your identity. At some point you can use this data to reconstruct someone's life.
Adam Harvey's anti-drone fashion tested by infrared cameras. – Source: Adam Harvey
That does sound creepy. What would the companies do with that data?

Adam Harvey: Marketing agencies are doing a psychological behaviour analysis. They could be saying: »Oh, we saw you with earrings on from Chanel. We know you like earrings from Chanel, so you get 10% off your next purchase.« And you wonder: How do they know I like Chanel? That probably doesn't sound too bad. But combine that with expressions, logos on your clothes, the people in your photos, face recognition of everyone else, location. And then it's not advertising anymore – that's the wrong word – you algorithmically nudge people towards the person you want them to be. That is not advertising. Another thing is that at some point you can begin to reproduce parts of that person. Whether they're media representations, like a neural network that writes in your style, or an artificial speaking agent that talks in your voice, with your cadence and in your style. At some point, what's the difference between you and that higher-resolution, more organized version of yourself? So we could have your DNA – and the FBI has the DNA of a lot of people. They could have all your biometrics: your fingerprints, eyes, height, behavioural biometrics like the speed at which you type. The way you walk, the way you move your body, express yourself – is that a part of you? I would say it is. It's part of what makes you you. But some people don't think that way; they think: »That's a biometric, and so that's data we're going to collect.«

On my way to your studio I tried to count the surveillance cameras recording my passing by. I counted 12. It’s not a very nice feeling to be aware of this surveillance, so I wonder how you feel to be working in this field. Are you happy?

Adam Harvey: I know I feel happier here in Berlin than, for example, in London. In London I feel very watched – not as a target, but just in general. You can feel the surveillance in London; it's everywhere you go. To me, personally, it's depressing to be watched by people I don't know, whom I can't even see, and whose purposes I don't know. To just sit there and have unknown agencies stare at you is a depressing culture. I think that should be factored in when you measure the quality of life in a city. Beyond the personal feelings, seeing a lot of cameras suggests that there's a reason why everyone is being watched – some kind of threat, some kind of danger. It shows a lack of trust in the environment and in the people around you. I think you want to live somewhere where people trust each other and where you're not relying on some kind of automated analytics to enforce rules for you. It's pretty great to see people participating in democracy rather than outsourcing it to a company to regulate for them.

Every photo can be surveillance

Can you imagine going back to a point where you would walk the street and you wouldn’t even notice the cameras?

Adam Harvey: Actually no. At Pennsylvania State University, where I went to undergrad, there were a lot of riots. Drunk people who couldn't get enough pizza went out to the streets and wrecked a lot of things. It was not really a big threat, just college chaos. But as a result of these riots, which made the university look really bad, the administration wanted to install cameras. I was working as a photographer at the school newspaper at that time, and the police came to the newspaper and basically said: »We want all of the photos to identify people in the riots.« From what I heard and witnessed, the police got whatever photos they wanted. I don't know if my photos were included – it wasn't my assignment to photograph the riots – but those of other photographers were given to the police. This was a big turning point, because then I realized that every image a photographer makes is a form of surveillance; it is a form of proof. Everything I thought I was doing as an artist, journalist or photographer was also surveillance.

Isn’t that a bit over the top?

Adam Harvey: It sounds like that, but technically it’s true and it happens so much now with Instagram and with Facebook. It’s a new form of distributed mass surveillance, of crowd surveillance. Every time you take a photo and upload it to Instagram – if you don’t think that law enforcement agencies are monitoring that, then you are not reading the news. Facebook, Instagram, Twitter are truly Social Media, but they’re much more – they’re law enforcement tools. You can’t see that when you’re posting stuff. It’s fun, it’s a good way to share and you learn a lot. If you could only see the police officer, if you could only see where your photos end up, then I think you would change your behaviour. That’s the problem. We need to be able to know what happens beyond the upload.

That would be the role of the media.

Adam Harvey: The problem is that there is a relationship between technology companies and media companies – which are a kind of technology company themselves – about the narratives. I think it's hard to be really critical of Facebook and Google or the ideas they support. Even as a media company you are participating in a form of surveillance capitalism, because you're tracking everyone that reads the article with third-party trackers. It's as if you were going to Alcoholics Anonymous and they told you they had an open bar at the next meeting. That would be insane, right? You can't hurt the person that you're trying to help. But that's what happens with the articles. Do you see the irony of tracking someone while telling them about the dangers of being tracked?

I do. Maybe that is also a question of incremental privacy, not just all or nothing. As for the coverage, it’s difficult to keep up with the speed of developments. Smart technology has become such a complex system within a few years.

Adam Harvey: People have been arguing that government is too slow to keep up regulating technology, but now it’s moving even faster. I’m not sure who can keep up with the pace of technology anymore. As we speak, China is developing artificial intelligence missiles. The anxiety of the whole thing, of talking about privacy is: You start talking about trackers in your browser and you end up talking about artificial intelligence missiles in China – and somehow they’re linked. Because they both rely on the concentration of wealth and data and the speed of technology. It’s very difficult to have any conversation about privacy without [losing oneself in it], at least with me. Because I think that all these problems are related and they demand more conversation, more examination, more experimental thought about them.
With this make-up and haircut you can outwit facial recognition software. – Source: Adam Harvey
What are you currently working on?

Adam Harvey: I'm working on making CV Dazzle an open source software program, so that anyone can use it to make their face less visible to face detection. So instead of wearing the make-up patterns in the project photos, you could apply them digitally. And I'm still working on the Faraday cage phone case – a new version of it, like the previous one but with more security features. I can't tell you more about it, because it's still in progress. But what I can say is that there are a lot of new ways to spy on a phone. For an investigative journalist, for example, it can be quite dangerous to carry around a phone.

Then I'm doing a show at Aksioma next year, which is in Slovenia. It will be the »Think Privacy« slogans and some new work. For the end of this year – as long as it can get printed in time – there will be a new book called »The Most Unwanted«: 50 to 100 emerging surveillance technologies, put together like an annual report.

So what do you intend to achieve with all these projects you do?

Adam Harvey: What motivates me is that surveillance is fascist: it tells you what to do, who to be, and what you can't do. For many reasons that is terrible. The ability to express yourself and to be free of a dictatorial ideology is what art is all about. So it's kind of the exact opposite: art and fascism.

The idea with all the work I do is to explore how we are going to adapt our behaviour to surveillance. I think you can't really stop surveillance – that would have been a discussion to have a hundred years ago; it's definitely too late for that now. The discussion now is how you change the next hundred years, whether that means limiting the use of data collection tools, finding new work-arounds, or finding new ways of adapting. And of course: how do you change the legal policy? I think that only comes when there's enough public support.

With illustrations by Ronja Schweer for Perspective Daily

by Nikola Schmidt

Nikola works on the ground rules of human coexistence. With a focus on international and European law, she studied law in Berlin and Istanbul. Her question: How can law be shaped so that everyone can live as freely and self-determinedly as possible?

Nikola was a staff writer at Perspective Daily until December 2016 and has been a guest author since.

Topics: Internet, Activism
