Why Privacy Matters – TED talk

Alessandro Acquisti on social media usage experiments at Carnegie Mellon:

Technical proof

Facial recognition (run on a cloud-based cluster against a database of public Facebook profile pictures) can identify the Facebook profile of a passerby from a single snapshot; from the name found there, a social security number can be inferred with additional databases. More info on the original experiment page.

Social media post judgement bias

People who upload pictures (including embarrassing ones) to social media judge others more harshly for such images than those who don’t (think of recruiting situations)

Nothing to hide myth

A woman with a baby (in her picture or in the text) is less likely to receive invitations to job interviews. More info on the original experiment page.

Informed decision in marketing myth

We react positively to photos “merged together” from close friends’ faces – but we no longer recognize them (i.e. we don’t know what is actually happening and thus can’t make an informed decision)

Transparency myth

If the “how we use your data” notice (with usage terms that usually indicate reluctance to share) appears only 15 seconds before a sensitive question (e.g. in a questionnaire or sign-up wizard), people answer as if no warning was given (i.e. they share more than if the warning and the question were close together)

“people don’t care about privacy.”

Often, a service doesn’t really leave a choice. That doesn’t mean people don’t care; they may just judge the benefits higher than the damage.

Analogy with the Garden of Eden: in the garden, Adam & Eve had no material desires left open, yet they could not recognize and reflect on themselves. By recognizing their true nature, they had to leave the garden. In a similar sense, we need to take care of our privacy to recognize our freedom – while marketers suggest that they fulfill all of our material needs (a.k.a. free online services). We trade autonomy and freedom for comfort.

Big data can be a force for freedom and a force for (hidden) manipulation.




Firefox OS privacy controls

5 key privacy features of Firefox OS in an overview

Mozilla teamed up with Telekom Innovation Labs and IXDS to develop an introduction tour and control interface for the remarkable privacy features of their novel Firefox OS. This operating system is meant for entry-level phones in emerging markets, where Mozilla sees the chance and the necessity to empower users for a safe journey into the mobile web.


For any user, but especially for these potential “newbies”, privacy and security need good explanation and motivation: security risks can feel distant and vague, and technical backgrounds can feel intimidating. At the same time, users are usually after something other than privacy, such as setting up their phone to make the first call. “Respect my task & time”, as Larissa Co puts it in her excellent talk. (Yes, we need to face it: taking care of privacy and security is on no one’s to-do list and is usually not “productive” per se.)


We approached this UX challenge by strictly limiting topics and features, using short and fresh explanations, and carefully drawn illustrations. To arrive there, we initiated co-creative workshops with users, Mozilla, and Telekom Group Privacy. We also worked very closely with the respective tech teams to make sure our ideas would make it into reality. We could build on IXDS user research knowledge on privacy and identity from previous projects and from Mozilla’s research team.
Mozilla does not rely on the exploitation of private data (e.g., for targeted marketing) and is therefore a trustworthy broker for the user. This lets them offer features like granular permissions (available in Android only very briefly in 4.3 / 2013 and in the Cyanogen custom ROM) or blurred location tracking.

The results were also presented at the W3C workshop on usable privacy controls (Berlin, 2014).

  • Early sketches from a workshop to find the best approach for the privacy tour.


During the highly playful workshops, the participants produced some really entertaining and insightful explanations of privacy topics, e.g. a role-playing video in the style of a kids’ TV series to explain email encryption. This shows how important it is to move privacy questions out of their dry and defensive atmosphere and give more personal, active, and playful answers.

It also helps to be very clear about your target groups. Only a few activists will accept harsh security and privacy settings that really protect them, even against more targeted attacks. Regular users see their benefit in more peace of mind and a sense of control over their data, but this must be balanced with general comfort and user experience.

Note: The Privacy Dashboard and the included Guided Tour were also scientifically evaluated in Piekarska et al. (2015): Because We Care. Privacy Dashboard on Firefox OS.


Privacy needs a culture of anonymity (more than technical solutions)

Our understanding of privacy is currently under permanent discussion and re-definition. Social networks encourage sharing of private details, but this also means sharing these details with a large corporation (and most likely with its advertising clients). Intelligence agencies skim through our conversations in their quest to identify potential future “maybe terrorists”. Understandably, quite a few people are scared by this.

We see different reactions: the tech-savvy now roll out heavy encryption and other technology. It quickly becomes an arms race to get something “really” secure and anonymous. To the majority, this technology looks – and in large parts sadly is – highly complicated, as if it will take all the fun out of digital interaction. It even seems to confirm their fatalistic perception that they are lost anyway, so they stop caring altogether. And there are still enough people who haven’t really noticed and are not inclined to take part in the discussion.

Technology, in consequence, should not be the first thing to look at here. What we need is a cultural shift towards anonymity and privacy. That means insight into the value of privacy (e.g., as a precondition for liberty) when we actively think about it, e.g. in discussions. But it should eventually go deeper and become an almost subconscious value that we consider intuitively, like fairness. Anonymity should weave into our everyday decisions, not as an “always on” but as an always-available option.

La Bauta (in the back) was the everyday mask in Venice – picture by richspalding

In an article for the magazine <kes>, Johannes Wiele puts forward three theses:

  • Social conflicts can’t be solved, just temporarily settled/negotiated (in pluralistic societies)
  • Almost all actors in politics pursue (what they think is) the good cause
  • (Despotism provokes resistance. This point is less relevant here but can explain one of the motivations for such a cultural shift)

The first two points combined mean: we need to arrive at a common understanding, at the societal level, of the benefits and risks of digital technology. We need to compare our “traditional” values and the preconditions they build on to the conditions of the digital world. Some values might be difficult to keep, some might need to be redefined, and we will need new, different social rules. These discussions must reach the societal level (involving “all actors in politics”) to achieve a broad understanding and constitute new social norms (this also resonates with Sascha Lobo’s call at re:publica this year). Technological implementations, such as email encryption, might be a consequence of this new culture, but they are not at the heart of it.

Wiele references the mask culture of 18th-century Venice to illustrate how vivid and detailed such a culture can be: various masks for various events, rituals around masking and un-masking, obligations like the prohibition on carrying arms while masked (see his blog for details). Wiele also mentions that masks became popular because of the excessive surveillance prevailing in Venice at that time. Wearing a mask was part of a strict social codex, and its appearance was highly regulated. This gave others, such as non-Venetian traders, the assurance that the bearer of the mask had certain privileges and could be trusted, while still hiding his identity. This is like a third party confirming, e.g., certain access rights when you want to use an online service, but without giving away your full identity (and, ideally, without the third party itself learning for which service and when you use this confirmation). Digital certificates represent parts of this concept, but they don’t have a working “mask mode” yet.
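The mask-as-credential idea can be sketched in code. The following is a hypothetical toy, not a real credential scheme: an authority attests an attribute for a random pseudonym, so a service can verify the privilege without learning who the bearer is. (A real system would use blind signatures so that even the authority cannot link pseudonym and bearer; all names and keys here are made up, and verifier and authority share a key only for the sake of the toy.)

```python
import hashlib
import hmac
import secrets

# Toy "mask": the authority vouches for an attribute under a random
# pseudonym, so the verifier never sees the bearer's real identity.
AUTHORITY_KEY = b"authority-secret-key"  # held by the attesting third party

def issue_credential(attribute: str):
    pseudonym = secrets.token_hex(8)  # random handle, no link to a name
    msg = f"{pseudonym}:{attribute}".encode()
    tag = hmac.new(AUTHORITY_KEY, msg, hashlib.sha256).hexdigest()
    return pseudonym, attribute, tag

def verify(pseudonym: str, attribute: str, tag: str) -> bool:
    msg = f"{pseudonym}:{attribute}".encode()
    expected = hmac.new(AUTHORITY_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

cred = issue_credential("licensed trader")
print(verify(*cred))                      # True: the privilege checks out
print(verify(cred[0], "noble", cred[2]))  # False: a forged attribute fails
```

The point of the sketch is the separation: the verifier learns only “this pseudonym holds this privilege”, which mirrors what the Venetian mask signalled.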

Because it was a cultural and social standard, no one had to justify wanting to stay anonymous under normal circumstances. The mask culture might even look playful to us nowadays, which I find a good aspect.

A culture of (choice of) anonymity could be an interesting development and consequence of the current situation. It is certainly the only way to a profound and sustainable, or trustworthy and applicable, concept of privacy.


emotisys on asteroids!


Recently, I moved my entire online household to a new infrastructure and a new provider. I felt I was reaching the limits of my old one, e.g. in terms of mailbox size and, more importantly for me, functionality like server-side mail filtering and security features like ssh/sftp access. During my most recent research project (part of SASER) I learnt quite a bit about server security and attacks, which gave me decent motivation. Then again, I didn’t want to maintain or pay for a full-blown server. And I wanted to keep “my stuff”, such as emails, under German privacy law and with a trustworthy provider. Quite a shopping list. I’m very confident that I’ve found the perfect match for me: uberspace.de


Here are some reasons:


When I visited their website for the first time, I was immediately struck by the colloquial but informative language. You never feel like you’re in a tech shop, although you can get every technical detail explained if you want. They managed to find an informal tone, the right length of explanations, the right examples where you need them, and to sprinkle subtle humor on top. I read half of their instructions even before I signed up, just because it was an easy read! That is a very rare thing if you look at most manuals. And you can just write to them and get help very quickly in the same supportive tone. Since it’s the “technical staff” writing the texts, you get a feeling for their attitude. Being accessible and personal contributes to trust.


If you want to manage private data online, it’s not enough for the service provider to have good intentions: their competencies must allow them to run state-of-the-art technology in a way that keeps your data safe. As the thorough documentation shows and the team portraits tell you, the people behind uberspace know what they are doing. They also tell you (better: everyone) about the limits of their service and that they are not infallible, something I find an important part of “expertise”. Of course, no service is infallible (due to humans and software), but most of them will give you a false sense of security and hide the details in fine print. Credible expertise is a basis for trust.


When you look at the most popular/best-marketed webhosters, their offers are full of superlatives but also of *-links to the fine print. What looked like a great offer becomes actually quite pricey 6 months later. uberspace make it very clear what you get and what you can’t. They tell you about their technology, their services, and their calculation (!). Actually, you can decide yourself how much you want to pay, but with their transparent calculation they give you a good understanding of what will be fair. And since they obviously treat you fairly, you are inclined to do the same in exchange.

They will explain how they made all of these choices. Along the way, you’ll learn quite a bit about webhosting, including how secure your data can be. They also have terms of service, although they are called house rules. Again, these are not in dry and “exact” judicial language but lay out situations that require, e.g., intervention by uberspace staff, and explain what will happen.

You get the impression that they have a reason for everything they do – and if they don’t, they won’t do it. The best example is the personal data required to start an uberspace: zero. There is no reason for them to record it; they don’t need to contact you, and they don’t need your bank account (if you don’t pay by yourself, they’ll simply close your webspace. Fair deal). Not recording non-essential data is obviously very simple, no one needs to tick any privacy statements, and it’s just good privacy practice. Data they don’t store can’t be misused, stolen, or required to be handed over by law. Clear business models, clear dos and don’ts, clear data policy: transparency builds trust like little else.

Challenging and Encouraging

In short, uberspace is for autonomous users: people who already know or want to learn about web hosting technology and want to take a look behind the scenes, so that they are in control of their scripts and data. uberspace do a great deal to help you with that, including a more or less subtle push towards the command line interface. This again is a sign that uberspace take their users seriously: web technology isn’t Lego, but you can learn it, too. The more you understand, the more you are in control.



Design at Linux Tag 2014

Going to a Linux conference as a designer might sound like an exotic idea at first sight. While this deserves a second thought (see later), I have to admit that I had a special approach, too, when I went to this year’s LinuxTag at Station, Berlin: I went there as part of the SASER research team at IDL, which is (at the topmost level) concerned with defining new, more efficient, reliable, and secure communications on the internet. So we were in the sysadmin domain already (although Linux is far more than a tool for admins, even if that might be what you think of).

Visual Analytics for Security Admins

We gave a talk on our interim results, which are visual analytics tools to investigate huge log files (i.e. text files): Opening Treasure Boxes. Exploring log files with visual data analysis to detect security breaches (slides). As part of the “Tracing and Logging” track, we talked to system administrators, security analysts, and everyone else interested. It was a great opportunity for us to extend our contact with users, get feedback, and acquire new use cases. We assumed that we would also “evangelize” a little in favor of visual analytics and visualizations beyond bar and line charts. We were quite right about that: our ideas were partly seen as strange and unusual, but we also received quite some thankful feedback from analysts who said our ideas opened new opportunities for them.

A view of the Kibana interface (image from Elastic Search)

But I also want to point out that a couple of tools in the same track provided very decent user interfaces and good visualization support. This is especially remarkable as log files are abstract things and enormously large, which makes providing a grip on them a real design challenge. Lennart Koopmann of Torch presented Graylog2, with an interface to query large logfiles, get an overview of the values in the file, and also get visual support for results in the form of timelines. Even more dedicated to a graphical user interface was Kibana (which builds on logstash and Elasticsearch), presented by Bernd Erk of Netways. I was impressed by the visual support for building and modifying queries, the ease of building graphs, and the clean overall interface. In many regards it reminded me of Splunk, which is also great but not open source software. As we found during the preparation of our talk, the event monitoring system Icinga2 is also starting to include interesting visualizations – Markus Frosch (of Icinga) just didn’t put a huge focus on the new interface.
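To give a feel for what these tools automate, here is a minimal, hypothetical sketch of the core operation behind such a timeline view: filter a log for a pattern and bucket the hits into one-minute bins. The log format and keyword are invented for the example.

```python
from collections import Counter
from datetime import datetime

def timeline(lines, keyword):
    """Count matching log lines per minute: the raw data behind a timeline chart."""
    bins = Counter()
    for line in lines:
        if keyword in line:
            # assumes each line starts with "YYYY-MM-DD HH:MM:SS"
            minute = datetime.strptime(line[:16], "%Y-%m-%d %H:%M")
            bins[minute] += 1
    return bins

log = [
    "2014-05-10 12:01:33 sshd[4211]: Failed password for root",
    "2014-05-10 12:01:45 sshd[4212]: Failed password for admin",
    "2014-05-10 12:02:10 sshd[4213]: Accepted password for jan",
]
for minute, count in sorted(timeline(log, "Failed password").items()):
    print(minute, count)
```

Tools like Kibana or Graylog2 do essentially this at scale, with indexing and a query language on top, and then render the bins as a chart.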

Design for Open Source Software (needed!)

Coming back to the suspected repulsion between design and Linux, or even design and open source: open source software has gained a bad reputation for ugly or “just enough” user interfaces, with little help for users to find a workflow or to just please the eye (things like Firefox or Fritzing are exceptions, of course, but they are also rather recent offsprings). It seems as if open source is much more appealing to developers than to designers. I have no instant explanation for that – if you do, please let me know! I have to admit that a lot of the software I saw during LinuxTag unfortunately confirmed this prejudice. The more delighted I was when I saw how well-crafted things like Kibana were.

It might be worth noting that Edna Kropp and Nicole Charlier of akquinet gave a basic introduction to user-centered design and how they work as “on-site UX consultants”. While it was pretty basic for a designer, it was probably new and remarkable for many of the developers (hopefully) listening. I think many more talks like this are necessary to reach a common understanding between developers and designers in the open source scene.

Further notes

The bare crypto stick (it has a modest but nice casing in the final version)

For the really cautious, there is the Crypto Stick: it looks like a thumb drive but actually hosts a micro-processor, a smartcard, and an SD card. You can use it to establish secure connections from untrusted systems (like internet cafés), store your passwords, and more. You can even transport documents “plausibly hidden”, e.g. in case you get searched at an airport – and you don’t have to think of Snowden to understand how relevant that can be. I liked the idea of a “security thing” that is really strong but also makes it easier for people to stay safe online.
By the way: it’s open software and open hardware, so you can build it at home (although the small form factor makes that complicated).

UDOO: Standard PC interfaces for the “Linux part” seen at the front here, with pin headers in Arduino due format at the back

Even physical computing was a topic, and it was the subject of the only other presentation given by an interaction designer: Michelangelo Guarise presented UDOO, which combines an Arduino Due-derived board with a Linux system running on a powerful quad-core ARM chip. This “natural” combination is popping up in various flavors at the moment, combining the sensor-friendly, real-time-capable Arduino architecture with high-performance computing. I hope they will soon add their platform as a part in the Fritzing library, and I’m curious about the projects that will build on this single-board computer!

And I got a trusted certificate from CAcert to (soon) sign my email and SSL server connections – yeeha! I was impressed by how seriously they take the process, with several people checking my ID cards separately. Trust on the internet is a delicate thing, and digital signatures can help a lot here.



PRISM, security, and the user

The extent, if not totality, of the US spy program PRISM has shocked the world. It still does, as new details emerge and no official plans to improve transparency or legitimacy are announced.

The activities uncovered put yet another spotlight on the vulnerability of the “information society” we live in and appreciate for its comfort. As the term already suggests, information plays the key role, and it is also key to gaining or exerting power. Therefore, criminals work on malware to gain information about our credit cards and to steal business secrets. Companies are after your intimate behavior to personalize advertising. And now it turns out that even friendly constitutional democracies filter data on a massive scale as part of their “intelligence” (how far this even involves “business intelligence” is one of the unanswered questions).

In this light, improving the security of messages and the transmission networks themselves becomes critical.

As an example of secure messaging, SiMKo, the top-security devices by Deutsche Telekom, aims to protect government communications – as it seems now, this is necessary not only against spy organizations but also to keep friendly secret services at bay. T-Systems works with IXDS to not “just” deliver top security but to keep usability and joy of use up at the same time. [I work for IXDS]

I also joined the project SASER, an EU-funded research activity for more stable, secure, and efficient network technology. As part of the Interaction Design Lab, we will develop visualization tools for complex data that help security analysts find and stop vulnerabilities or attacks.

More secure technology and “security habits” certainly help on an individual level. Attempts at total surveillance, however, need to be blocked at the societal (or political) level. Only if we value transparency and accountability more than secrecy, even in the event of terror, can we keep freedom of speech vivid and our democracy healthy.


Defining privacy

The spreading of personal information in the digital age, and the loss of control over it, is continually increasing. In its essence, this is nothing very new, but we are witnessing (or are part of) some major shifts right now: the rise of online social networks, high-precision targeted advertising, and the level of surveillance as part of anti-terrorism measures. The significance of privacy is currently being re-negotiated (details below).

At the same time, the technical possibilities to control and broker one’s personal data streams have increased just as much – unfortunately, most of these possibilities are stuck in theory, and decent tools are missing. We should expect (or build) a groundbreaking solution here. I find this particularly striking as I had the privilege to work on such a tool over a year ago, and sadly it hasn’t really come to market as of today (I’ll go into details in a separate article).

Photo (slightly cropped) by ecoev on Flickr

A couple of days ago, I had the privilege to attend a conference on privacy held by Germany’s internet industry association eco. By the mere count of participants (overwhelmingly in black suits) it was a small meeting, but as the participation of the German Minister of the Interior, Hans-Peter Friedrich, and the EU Commissioner for Justice and Fundamental Rights, Viviane Reding, shows, it was of extremely high profile for our society’s rule makers.
From a citizen’s point of view, the event was pretty interesting, as you could witness the actors and debates that shape the laws of tomorrow. For designers, however, the lack of discussable solutions, or even just adventurous experiments, was disappointing. I have the strong impression that some practical contributions would inspire the debate and could bring a more differentiated or “realistic” view to some legal considerations.

Defining terms – not just a question for law makers

While defining terms sounds like hairsplitting detail work, knowing about different aspects and concepts of privacy and data protection focuses the often superficial and emotional debates. I’ll look very briefly at two questions: protect data against whom or what? And what is the data to be protected?

During the eco meeting, Axel Spieß, an international expert in this (legal) domain, pointed out the very different meanings of “privacy” in the US and “Datenschutz” in Germany: in the US, privacy mainly refers to the “right to be let alone” of a citizen against the state (4th Amendment). In contrast, acquiring and selling user data is a pure matter of private business and contracts. “Data protection” would usually refer to measures that prevent the theft or loss of data.
Under German jurisdiction, however, “Datenschutz”/data protection is affected by any transaction (or even just the collection) of “information that identifies a person”, because it is considered to violate one’s “informational self-determination”. And this needs to be respected by governmental authorities as well as private companies.
(For the UK position, BBC News has a comprehensive article for you.)

There is also a fundamentally different perception of who owns the data (US, mostly: the company that collects or buys it; Germany: the person it refers to). Ownership of personal information is also an important point for a couple of service ideas around transparent data trade (see the practical article on that).

In his speech at the congress, Minister Friedrich implied that data collection by authorities was rather harmless since it couldn’t happen without laws and was under public control. But since the 9/11 attacks, we should be aware of how easily security (or anxiety) rules over freedom (and, as part of it, privacy), and how otherwise illegal activities and questionable surveillance pass through.
[end of sidetrack]

The other important definition concerns the term “identifying personal information”: intuitively, one would think of the more sensitive information, such as name, address, and phone numbers (IP numbers? already a hot debate!). And indeed, some laws contain such lists. However, in the age of sophisticated data mining, “insensitive” data (such as the items of a single purchase) is easily combined into “more sensitive” data (such as buying habits and all deviations, like job loss, illnesses, diets, or even pregnancies). As behaviour prediction becomes reality, there is no insensitive data any more (as the German Constitutional Court stated as early as 1983).
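As a toy illustration of this combination effect (all items and weights below are invented, loosely inspired by press reports on retail analytics): each purchase item carries almost no signal on its own, but summing many weak signals yields a confident, sensitive inference.

```python
# Hypothetical signal weights: no single item is telling by itself.
PREGNANCY_SIGNALS = {
    "unscented lotion": 0.30,
    "zinc supplement": 0.25,
    "cotton balls": 0.15,
    "large tote bag": 0.10,
}

def pregnancy_score(basket):
    # Weak signals add up across an "insensitive" list of purchases.
    return sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in basket)

basket = ["bread", "unscented lotion", "zinc supplement", "cotton balls"]
print(round(pregnancy_score(basket), 2))  # 0.7: a sensitive inference emerges
```

The legal point follows directly: if a list of groceries can be turned into a health prediction, fixed lists of “sensitive” fields cannot delimit what needs protection.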

Who defines the privacy of the future?

Inside the EU, the debate around privacy has been active for quite a while now. Commissioner Reding claims that it is at the heart of the Digital Agenda (which has its own commissioner, Neelie Kroes). For the EU, a unified data protection and privacy legislation would not only facilitate trade inside the union, it would also be a strong signal towards other societies and markets. Companies with businesses in the EU would at least have to take the EU rules into account, if not completely follow them (what this could mean can be seen in the discussions around Facebook and Street View).

So far, the EU has been quite successful in setting the agenda and the terms of the discussion. They also convince (or persuade) more and more non-European countries to follow their model. Obviously, this upcoming normative power of the EU is at odds with US interests and US companies (who form, again, most of the internet as we know it). More or less recently (02/2012), the Obama administration came up with a regulation of its own, the much-debated Consumer Privacy Bill of Rights. Given the US traditions described above, this might appear a strange thing (some of the differences are outlined here and here).

With the models currently debated on both sides of the Atlantic, we are negotiating nothing less than the fundamental privacy rules of the future digital society.


Predicting behaviour from user data

Since people follow rather stable routines, it is possible to predict their behaviour (within a range of certainty) by analysing their activities in the past. One important research effort in this direction was carried out in the Context project at the University of Helsinki from 2002-2005, with a focus on which places people go to and where they meet.

Today, tremendous amounts of behavioural data are generated through web log statistics, tracking cookies and beacons, and mobile phone positions (cell towers and GPS). New mechanisms are evolving that make this data usable, even in real time (e.g. Google’s MapReduce approach). This was the takeaway of a Structure Big Data conference that promises an “inevitable, even irresistible surveillance society” (Jeff Jonas, an IBM engineer quoted in a Computerworld article).
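The map/reduce pattern mentioned above can be sketched in a few lines. This is a single-process toy (real MapReduce distributes both phases across a cluster), counting visits per cell-tower ID from invented position records:

```python
from collections import defaultdict

def map_phase(records):
    # Emit a (key, 1) pair per observation, here keyed by cell tower.
    for record in records:
        yield record["cell"], 1

def reduce_phase(pairs):
    # Sum the counts per key; in real MapReduce this runs per partition.
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

records = [{"cell": "A"}, {"cell": "B"}, {"cell": "A"}, {"cell": "A"}]
print(reduce_phase(map_phase(records)))  # {'A': 3, 'B': 1}
```

The same two phases scale from this toy to billions of records, which is exactly why behavioural aggregation became feasible in near real time.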

While the ability to “look into people’s minds” scares privacy experts, it also promises to deliver perfect filters for users who feel lost in the tremendous stream of news and information. And it offers them a personalized experience of services.

Another point of concern:

The higher the amount and variety of data collected, the more unique the data sets a single person produces. One example is website visitor identification through the browser footprint. It might look pretty generic at first view, but since it includes the fonts installed, version numbers of plugins, etc., very few people actually have the same browser footprint.
While the data itself is usually collected in a “non-identifying, anonymized form”, the combined data sets render anonymity an illusion.
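A minimal sketch of the footprint effect (all attribute values below are invented): each attribute is shared by many users, but hashing the combination yields an identifier that very few browsers share, and changing a single detail changes it completely.

```python
import hashlib

def browser_footprint(attrs):
    # Hash the sorted attribute list into a compact identifier.
    blob = "|".join(f"{key}={value}" for key, value in sorted(attrs.items()))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

browser = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/29.0",
    "fonts": "DejaVu Sans, Liberation Serif, Noto Sans",
    "plugins": "Flash 13.0, Java 7u55",
    "screen": "1920x1080x24",
    "timezone": "UTC+02:00",
}
print(browser_footprint(browser))

# One extra font is enough to produce a completely different footprint.
variant = dict(browser, fonts=browser["fonts"] + ", Comic Sans")
print(browser_footprint(variant))
```

No name or address appears anywhere, yet the identifier is stable across visits, which is what makes the “anonymized form” claim so weak.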

[update 02/2012:]

The New York Times had an extensive report on how large supermarkets collect data on their customers. Although the individual data pieces are rather trivial (who buys what and when), the large numbers and the fairly unchanging behaviour of each customer allow them to infer that customer’s personal needs very precisely.

They even feature a story about targeting a pregnant teenager with baby products before even the teenager’s father knew that his daughter was pregnant. While this is probably a rare case, it shows that large numbers and decent data mining can not only report but even predict personal needs and wishes.


Homo Reparans

In their preface, the curators of Ars Electronica 2010 sounded pretty alarmist. Not to put too fine a point on the summary, you could phrase it as “the world is on fire, act now, there is no time for pessimism (or maybe even thinking)”. Before I went, I had my doubts whether a call for immediate action, combined with an apparently clear goal of getting the world “back on the right track” (i.e. repairing it), would put us into a mode of “no alternative” (TINA). If there were no alternative, asking questions, reflecting, and disagreeing would have to be regarded as a waste of time…

Tove Kjellmark's "Destruction of the Ego" Robot needs a repair


Technology and democracy

Of course, you can read the initial statement as a provocation, and the actual discussions were more balanced and aware of the two sides that many measures to “save the world” have. The relations and tensions between technology and democracy formed a core topic of this festival around “Art, Technology, and Society”. Andreas Lehner of the CCC underlined the importance of hacker organisations for democracy with a quote from Albert Einstein: “Think also about the fact that it is the engineers who make true democracy possible.” But while they (often) lay the foundations and enable communication and exchange, in a democracy they should not be the ones to take decisions. As Amelia Andersdotter (MEP for the Piratpartiet) reported from her own experience, many decisions in today’s politics come prefabricated from expert organisations like the WTO. The European and national parliaments have little left to change – and often lack the time and expertise to fully understand the concepts, let alone improve them. Additionally, more funding and experts give large companies an advantage in the competition of opinions. This is not a conspiracy by “them” (a somewhat diffuse enemy Ralf Schmerberg wants to bash in his movie Problema), but a self-reinforcing process that needs to be changed.

216 prepared dc-motors/filler wire 1.0mm by Zimoun.


Today’s technology even enables a level of hypercommunication that goes further than most of its users want, into the most private aspects of life. Google Street View and Facebook were not at the center of discussions but a permanent subtext. (Maybe this is also because today’s developments in this area spread in real time while solid critique is delayed, as Geert Lovink put it.) The Austrian philosopher Andreas Hirsch even claimed that there is nothing left but an illusion of privacy. This brought up a fruitful debate with two remarkable statements:

Even this illusion is still valuable to Derrick de Kerckhove, because as long as we think of a private space, we can also think of a public space, reserved for arguments that are not meant personally. Joichi Ito, who actually (and deliberately) lives a pretty transparent life himself, still was not convinced: privacy is needed so that civil actions gain enough momentum before they come under public/governmental control (and possibly restrictions). If there is no private space anymore, there won’t be any strong impulses for the public space, the res publica. Privacy is an essential prerequisite to make “repair” possible for re(s)publics and democracies.

(You could also think of China as an extreme form of the expert society referred to by Andersdotter above, which might prescribe unacceptable standards for your way of life.)

Ars Electronica Courtyard

Courtyard of former Tabakfabrik, the venue of Ars Electronica

Open tools

A strong grassroots movement (at least at ars electronica) is dedicated to open-source technology. Open software is much more common and accepted today (just think of Mozilla’s Firefox), but the movement is no longer about software alone.

Unhappy with today’s versions of social networks, Maxwell Salzberg presented the Diaspora project, which aims at making the flow of (user) data more transparent and thus giving users better control over their privacy (Geert Lovink added similar projects like GNUsocial, AppleSeed, and status.net). I personally find this extremely important, not only because of the significance social networks have today, but also because the architecture of the new systems will have to offer solutions for some tricky problems (like interoperability, widespread acceptance, and ease of use).

Head mounted eyetracking set

Head mounted open source eyetracker (hardware side)

When you hit the borders of open-source software, you will soon want additional hardware. The Free Art and Technology Lab, together with the Graffiti Research Lab, openFrameworks, and the Ebeling Group, created a do-it-yourself eyetracking system (with claimed hardware costs of $50) called the EyeWriter. They initially created it for a friend who suffers from ALS/Lou Gehrig’s disease and can’t move anything but his eyes. While eyetracking as an expensive technology is not new, this package of hardware instructions and powerful software puts it into the hands of everybody, to explore, adapt, and improve.


MakerBot in action

Material/3D printers used to be another high-tech, high-cost device. Now there is the MakerBot to print out the 3D models of the general public. And from being the attraction itself, it already tends to become a supporting part of other artworks, like in Daan van den Berg’s Merrick.

Oribots by Matthew Gardiner

The parts of Matthew Gardiner’s Oribotic project are also created with a 3d printer.

Even “bio technology” is now tinkered with. There are more speculative designs for debate, like Catherin Kramer’s Community Meat Lab. It combines the (future) in-vitro growing of meat with a community-based form of production. While open-sourcing biotech, she also tries to avoid the gap between food producers and consumers that today’s industrialised supply systems create. Ready to build (it yourself) is the low-tech/recycled Windowfarms Project by Britta Riley.


Britta Riley’s Windowfarms

Interestingly enough, considerable exhibition space was given to industrial companies in the Repair Fair – some of them surely proof of former utopias and brave entrepreneurship. Other big ones like Siemens or Linz AG appeared in a strange contrast to the rest of the exhibition, being responsible for the often-bemoaned state of the environment as well as potential contributors to “reparation”.

Repair for originals!

For many people who create open-source technology today, disassembling devices because they were broken has often been the first step into working with technology. This is captured in one of the statements of the Platform 21 Repair Manifesto, (to me) one of the most important documents of the ars:

7. To repair is to discover.


10. Repairing is independence.
Don’t be a slave to technology – be its master. If it’s broken, fix it and make it better. And if you’re a master, empower others.


Straightforward repair by felting in situ: Woolfiller by Heleen Klopper

Additionally, they gave me a very enlightening explanation of “authenticity”, the feeling that certain things are somehow woven into our personal history. We often tend to cling to old stuff, even when new products are easily available (a marketing department’s nightmare: happy people don’t buy new stuff, and authenticity is hard to synthesize). When you repair something after an accident or because it is worn out, you focus especially on the parts of a thing which make you aware of your “common experiences”. And repairing causes a self-reinforcing exchange with a thing: you dedicate time and effort, and this makes it even more important and unique to you.

9. Repaired things are unique.
Even fakes become originals when you repair them.

Shoe Goo Repair

At the Shoe Goo repair station, Arne Hendriks applies “street knowledge” from skaters to make your shoes live longer.


Looking forward to Video Surveillance

a quick shot from the entrance of a Kaufhof department store