Why Privacy Matters – TED talk

Alessandro Acquisti on social media usage experiments at Carnegie Mellon:

Technical proof

Facial recognition (on a cloud-based cluster with a database of public Facebook profile pictures) can find a person’s Facebook profile from a snapshot of a passer-by; from that name, a social security number can be inferred with additional databases. More info on the original experiment page.
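Conceptually, the matching step looks roughly like the following sketch. It assumes the open-source face_recognition package and made-up file names – not the cloud setup Acquisti actually used:

```python
import face_recognition  # assumes the open-source face_recognition package

# Hypothetical database of public profile pictures; names and paths are placeholders.
profile_files = {"alice": "profiles/alice.jpg", "bob": "profiles/bob.jpg"}
known = {}
for name, path in profile_files.items():
    encodings = face_recognition.face_encodings(face_recognition.load_image_file(path))
    if encodings:                      # skip pictures with no detectable face
        known[name] = encodings[0]

# Snapshot of a passer-by, matched against the profile pictures.
snapshot = face_recognition.load_image_file("passerby.jpg")
for unknown in face_recognition.face_encodings(snapshot):
    distances = face_recognition.face_distance(list(known.values()), unknown)
    name, distance = min(zip(known, distances), key=lambda pair: pair[1])
    if distance < 0.6:                 # the library's usual default tolerance
        print("Likely match:", name)
```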

Social media post judgement bias

People who upload pictures (including embarrassing ones) to social media judge others more harshly for such images than people who don’t (think of recruiting situations)

Nothing to hide myth

A woman shown with a baby (in a picture or in text) is less likely to receive job interview invitations. More info on the original experiment page.

Informed decision in marketing myth

We react positively to photos “merged together” from close friends’ faces – but we no longer recognize them (i.e. we don’t know what is actually happening and thus can’t make an informed decision)

Transparency myth

If the “how we use your data” notice (phrased in a way that usually makes people reluctant to share) appears just 15 seconds before a sensitive question (e.g. in a questionnaire or sign-up wizard), people answer as if no warning was given – i.e. they share more than if the warning and the question followed each other directly

“people don’t care about privacy.”

Often, a service doesn’t really leave a choice. That doesn’t mean people don’t care; they might simply judge their benefit higher than their damage.

Analogy with the Garden of Eden: in the garden, Adam and Eve had no material desires left open, yet they could not recognize and reflect on themselves. Upon recognizing their true nature, they had to leave the garden. In a similar sense, we need to take care of our privacy to recognize our freedom, while marketers suggest that they fulfil all of our material needs (a.k.a. free online services). We trade autonomy and freedom for comfort.

Big data can be a force for freedom and a force for (hidden) manipulation.

“Computer bedienen”

“Wir bedienen Computer” – “Computer bedienen uns” (“We operate computers” – “Computers serve us”)

There is a strange double meaning in the German word “bedienen”: it means
“to serve” (with a slightly respectful connotation, more like a waiter, less like a servant) but also
“to operate” (a machine).

From the initial sentence alone, you can’t tell who is master and who is slave (to use some computer-related terms). This struck me recently when I read a chapter in Frank Schirrmacher’s Payback (“Chaos im Kurzzeitgedächtnis”, p. 64).

One of his points is that services like Google Now are usually seen as digital butlers, but at the same time they select information for us, which inevitably controls what we do and think.


Privacy needs a culture of anonymity (more than technical solutions)

Our understanding of privacy is currently under permanent discussion and re-definition. Social networks encourage sharing of private details, but this also means sharing these details with a large corporation (and most likely with its advertising clients). Intelligence agencies skim through our conversations in their quest to identify potential future terrorists. Quite a few people are scared by this.

We see different reactions: the technology-savvy now roll out heavy encryption and other technology. It quickly becomes an arms race to get something “really” secure and anonymous. To the majority, this technology looks – and in large parts sadly is – highly complicated, and as if it would take all the fun out of digital interaction. It even seems to reassure them in their fatalistic perception that they are lost anyway. Then they stop caring altogether. And there are still enough who didn’t really notice and are not inclined to take part in the discussion.

Technology, in consequence, should not be the first thing to look at here. What we need is a cultural shift towards anonymity and privacy. That means insight into the value of privacy (e.g. as a precondition for liberty) when we actively think about it, for instance in discussions. But it should eventually go deeper and become an almost subconscious value that we consider intuitively, like fairness. Anonymity should weave into our everyday decisions, not as an “always on” but as an always-available option.

La Bauta (in the back) was the everyday mask in Venice – picture by richspalding

In an article for the magazine <kes>, Johannes Wiele puts forward three theses:

  • Social conflicts can’t be solved, just temporarily settled/negotiated (in pluralistic societies)
  • Almost all actors in politics pursue (what they think is) the good cause
  • (Despotism provokes resistance. This point is less relevant here but can explain one of the motivations for such a cultural shift)

The first two points combined mean that we need to arrive at a common understanding, at the level of society, of the benefits and risks of digital technology. We need to compare our “traditional” values, and the preconditions they build on, to the conditions of the digital world. Some values might be difficult to keep, some might need to be redefined, and we will need new, different social rules. These discussions must reach society as a whole (involving “all actors in politics”) to achieve a broad understanding and constitute new social norms (this also resonates with Sascha Lobo’s call at re:publica this year). Technological implementations, such as email encryption, might be a consequence of this new culture, but they are not at the heart of it.

Wiele references the mask culture of 18th-century Venice to illustrate how vivid and detailed such a culture can be: various masks for various events, rituals around masking and un-masking, and obligations like the prohibition on carrying arms while masked (see his blog for details). Wiele also mentions that masks became popular because of the excessive surveillance prevailing in Venice at that time. Wearing a mask was part of a strict social codex and its appearance was heavily regulated. This gave others, such as non-Venetian traders, the assurance that the bearer of the mask had certain privileges and could be trusted, while still hiding his identity. It is like a third party confirming, for example, certain access rights when you want to use an online service, but without giving away your full identity (and, ideally, without the third party itself learning for which service and when you use this confirmation). Digital certificates represent parts of this concept, but they don’t have a working “mask mode” yet.
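To make that “mask mode” idea a bit more concrete, here is a deliberately simplified sketch: a third party vouches for an attribute without any identity attached. It uses a shared-secret HMAC purely for brevity; a real system would rely on public-key signatures or proper anonymous credentials, and the attribute name and keys below are invented:

```python
import hashlib
import hmac
import secrets

# Simplification: issuer and verifier share this key; a real system would use
# public-key signatures or anonymous credentials instead.
ISSUER_KEY = secrets.token_bytes(32)

def issue_mask_token(attribute: str) -> tuple[str, str]:
    """The 'mask maker' vouches for an attribute without attaching any identity."""
    nonce = secrets.token_hex(8)       # random nonce so each token is distinct
    payload = f"{attribute}:{nonce}"
    tag = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return payload, tag

def verify_mask_token(payload: str, tag: str) -> bool:
    """The service checks the vouching and learns only the attribute, not who you are."""
    expected = hmac.new(ISSUER_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

payload, tag = issue_mask_token("licensed-trader")
print(verify_mask_token(payload, tag))  # True: trusted, yet anonymous
```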

Because it was a cultural or social standard, no one had to justify why they wanted to stay anonymous under normal circumstances. The mask culture might even look playful to us nowadays, which I find a good aspect.

A culture of (a choice of) anonymity could be an interesting development and consequence of the current situation. It is certainly the only way to a profound and sustainable, or trustworthy and applicable, concept of privacy.


Does my personal past influence my favour for Windows Mobile?

olive and pink posters from 2006, winmo from 2012

Holding a Windows Phone next to the poster announcing my art school’s final exhibition reveals a stunning similarity: strictly left-aligned, ultra-light Swiss sans serifs, white text on bold colours. It struck me even more when I explored the other colour schemes of WinMo, since they offer a light olive and a sky blue that Bernhard used for the poster series.

Preview of the Flash portfolio

And then I remembered my first portfolio from 2004: tiles with project previews moving in the “wind” and whirling away when you went for details (you can check it out yourself). Compare this to the main-screen tile animation in Windows Mobile. Stunning.

Moving tiles on my Windows Phone


USB surgery

ad hoc soldering place

So, after a long break I heated up my soldering iron again. My USB hub had a broken cable and I decided to fix it and, at the same time, extend it. I had also wanted to solder USB cables for a long time, since the work is rather simple on a technical level, yet USB is an important and omnipresent item in the computer ecosystem.

Close look at ugliness

There was a little challenge built in, too, since my hub used the cable colours blue, green, red, and transparent (the USB standard would have been black, green, white, and red). Everything looks fine now.
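For reference, the standard USB 2.0 wire colours map to the following functions; matching the hub’s nonstandard colours to them (e.g. by tracing the wires with a multimeter) was the actual puzzle:

```python
# Standard USB 2.0 wire colours and their functions; the hub's blue, green, red,
# and transparent wires had to be matched against these before soldering.
USB_STANDARD_WIRING = {
    "red":   "VBUS (+5 V)",
    "white": "D- (data minus)",
    "green": "D+ (data plus)",
    "black": "GND (ground)",
}

for colour, function in USB_STANDARD_WIRING.items():
    print(f"{colour:>6} -> {function}")
```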

Of course, the effort was in no relation to the monetary value of the device – but repairing is not about money; it is about curiosity and autonomy, as laid down in the Repair Manifesto (PDF).

New hub with just the cover plate missing


Why Google might not really love open source

In contrast to my earlier estimation of Google’s Android plans, Symbian’s Executive Director Lee Williams recently explained his sharp take on the Android (business) model on GigaOm. Obviously, he’s a competitor, but he also manages to shed interesting light on potential Google plans:

The Android system is basically open, but to use it in any reasonable way (if you are not a true hacker), you need a Google account for Mail, Maps, Market, etc. And this account isn’t just anything: it’s a unique identifier that lets Google collect all of your information, your habits, and your device usage in one basket. This enables them to serve you highly profiled and personalized ads (which can be sold expensively, I guess).
While you personally might say “I don’t mind”, it’s a problem for a lot of other service providers who are not able “to get through” to the customer because he or she is already tied to Google.

Additionally, the applications that enforce this strong Google-account/device connection are all proprietary, i.e. not open. Google is really serious about protecting these apps, as its series of cease-and-desist letters has shown. And because they are so central to the Android OS, Lee Williams has a good point in claiming that Android itself is not really open: neither concerning these central apps, nor for other service providers. Hopefully, his Symbian Foundation will keep this case in mind.

And again, it looks like a “winner takes it all” attempt, which is one of the biggest factors of uneasiness in my mixed feelings towards Google.

Thanks, Fee, for pointing me to this.


Looking forward to Video Surveillance

a quick shot from the entrance of a Kaufhof department store


digital ambiguity

screenshot of Martin's favicons with white and grey backgrounds
The website of the interaction designer Martin Frey is represented by a fascinating favicon. It’s a rather simple matrix of grey and transparent pixels, with his initials “MF” set in pure white. With the browser’s location bar usually set to white as well, the “MF” should remain invisible (at least until it is displayed on the grey background of a tab, as in my case).

If I look at my notebook’s screen at a very shallow angle, however, I can see the initials nevertheless – stunning! Even more surprisingly, I was not able to reproduce this “hologram effect” on my large flat screen or on my girlfriend’s notebook.

To me, it seems like a little secret hidden in an actually exposed but usually overlooked place. Despite the binary world’s strict commandment to be either 1 or 0, smart and gentle (or even “in between”) notions still might be possible.
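Out of curiosity: a favicon of this kind could be generated roughly like this. The sketch assumes the Pillow imaging library, and the grid size, colours, and text placement are my own placeholders, not Martin’s actual file:

```python
from PIL import Image, ImageDraw, ImageFont  # assumes the Pillow package

SIZE = 16
icon = Image.new("RGBA", (SIZE, SIZE), (0, 0, 0, 0))     # fully transparent canvas

# Checkerboard of grey and transparent pixels, as described above.
for y in range(SIZE):
    for x in range(SIZE):
        if (x + y) % 2 == 0:
            icon.putpixel((x, y), (200, 200, 200, 255))  # light grey

# Pure white initials on top; position and font are rough placeholders.
draw = ImageDraw.Draw(icon)
draw.text((1, 2), "MF", fill=(255, 255, 255, 255), font=ImageFont.load_default())

icon.save("favicon.ico", format="ICO", sizes=[(SIZE, SIZE)])
```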

screenshot of the SAP favicon
Another interesting favicon is used by SAP: It keeps on scrolling until the site is fully loaded – very nice!


All your data are belonging to us!

Bundestag camera surveillance

A proposed new law is facing a lot of controversy at the moment: the TKÜ (Law on the Surveillance of Telecommunications). Unfortunately, a lot of people are completely unaware and uninformed about the problems at hand, especially if they don’t read much online. I think this is very problematic for two main reasons (many more can easily be found via the link in the corner of this site): the relational data stored is more sensitive than we might think, and our belief that state authorities are the good guys is not necessarily true.

Isn’t it all a minor problem, since they are just storing the relational data (who with whom, when, and where) and don’t record, e.g., the voice itself (they do, but via another law)? Actually, content is completely irrelevant: the whole field of Social Network Analysis strives to map entire social networks (you and your friends and their friends…) based on communication alone (one very good example is MIT’s Reality Mining project). They can even estimate your general happiness: spending time with friends usually makes people more content. As the analysis produces very concrete and specific patterns, it is ideally suited for a pattern-based search for criminals/terrorists. Especially “home-grown terrorists” will show very sharp disruptions in their social life. All data sets should not only be stored but scanned carefully for suspicious behaviour if we want to take prevention seriously!
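To illustrate how far plain relational data carries, here is a small, entirely hypothetical sketch: it takes made-up call records (who called whom, when), counts each person’s monthly contacts, and flags sharp drops in social activity. The records, names, and thresholds are invented for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical call-detail records (caller, callee, timestamp) -- the kind of
# relational data the law would retain; all entries are made up.
records = [
    ("alice", "bob",   datetime(2007, 9, 3)),
    ("alice", "carol", datetime(2007, 9, 5)),
    ("bob",   "carol", datetime(2007, 9, 12)),
    ("alice", "bob",   datetime(2007, 10, 2)),
]

# Set of contacts per person per month -- a crude "social activity" signal.
activity = defaultdict(lambda: defaultdict(set))
for caller, callee, ts in records:
    month = (ts.year, ts.month)
    activity[caller][month].add(callee)
    activity[callee][month].add(caller)

# Flag anyone whose circle of contacts shrinks abruptly from one month to the
# next; the threshold (a drop to a third or less) is arbitrary.
for person, months in activity.items():
    ordered = sorted(months)
    for prev, cur in zip(ordered, ordered[1:]):
        before, after = len(months[prev]), len(months[cur])
        if before >= 3 and after * 3 <= before:
            print(f"{person}: contacts dropped from {before} to {after} in {cur}")
```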

Still no problem, because we don’t have anything to hide! We even stopped downloading files from dubious sources, so the copyright industry’s desires behind the law can’t harm us, either. But what if your friend becomes a suspect? Remember that you are linked to pretty much everyone with only six people in between? I’m pretty sure you will find a true terrorist much closer in your “network”. And you can get locked up by prosecution authorities yourself, too! Visiting Afghanistan for whatever reason (relatives? an NGO project?) is clearly not a good idea, but probably not very likely for most of us, either. So Guantanamo seems far enough away (still, you could get “extracted”), but it serves as a first example of why naively believing in the good state is a bad idea: while the U.S.A. can still be regarded as a democracy and a constitutional state, everything you know about that becomes irrelevant once you find yourself in “the camp”. No civil rights, as you are outside the U.S. and of course Europe (if you consider yourself a civilian), and no rights from the Geneva Convention (if you consider yourself a soldier). No perspective of being heard by a lawyer, either.
For Germans, there is a very recent example from home: a sociologist working at Humboldt University, on cities in particular, was arrested for being part of a “terrorist community” (it’s all about communities…). Not because he really did something, but because he was supposedly providing the “intellectual basis” for others, via his scientific research. Once you are suspected of terrorism you lose a lot of rights, e.g. talking to your attorney in private. Unfortunately, it’s exactly that attorney you need to get you out of prison.

While it is certainly necessary to provide security for the people, there are limits that should be respected so that we do not lose our freedom in tight situations.
On November 6th, we can give our concerns a voice!


Google takes care of you!

While looking for a film I saw at this year’s Ars Electronica, I got quite a good result via Google: http://www.thearkfilm.com/

But instead of the website, I got some smart advice from Google, as you can see below:
google anti malware
(you can right-click “view image” for a better readable version, until I have better scripts for that)

[edit]
I had my doubts about the Google warning, but the Ars Electronica link was exactly the same, so I followed it and found a very nice page describing the film, with a trailer and several articles linking to all the awards the film has won so far (SIGGRAPH et al.!) – as expected.
Did that site become “badware” through Google’s algorithms? And
why did Google/StopBadware not provide any “no badware” button, as we know it from spam reporting and elsewhere?
[/edit]

Has anyone ever experienced something similar?
