The White Cyclosa Trashline spider has finished weaving its web, but the spider doesn’t rest. It collects the remains of its prey, along with rubble and debris, and uses its white web to stitch it all together. The White Cyclosa spider is sculpting a life-size self-portrait to be exhibited on its web. It is not an anatomically accurate portrait, but the audience for this piece is not picky. And that is exactly the point. The spider’s main predators are wasps, and the self-portrait is set as a decoy. If a wasp attacks, every sculpture helps reduce the spider’s vulnerability: a wrong identification leads to a failed targeted killing attempt, and that might be all the spider needs to escape. Not a perfect protection, but an obfuscation.
Obfuscation is defined by Finn Brunton and Helen Nissenbaum as “the production, inclusion, addition, or communication of misleading, ambiguous, or false data in an effort to evade, distract, or confuse data gatherers or diminish the reliability (and value) of data aggregations.”[1] While a chameleon camouflages by changing itself to blend with its surroundings, the White Cyclosa spider obfuscates by changing its surroundings to blend with itself.
The unique creative resistance of the White Cyclosa spider sets the stage for discussing obfuscation in terms of conflict and information. While it is not common in nature, human culture and history provide many examples of obfuscation. In a famous scene in Stanley Kubrick’s Spartacus (1960), when the Romans demand that the captured slaves identify their leader, every slave in the crowd shouts “I’m Spartacus”. In World War II, Allied planes dropped tons of aluminum chaff to jam German radars and make it harder for them to identify the attacking aircraft. In 2008, Anthony Curcio used obfuscation to rob a Brink’s armored car in front of a Bank of America branch. He posted an ad on Craigslist inviting cleanup workers to meet outside the bank wearing an outfit identical to his own, and escaped with two bags holding $400,000.
From my own perspective and training in graphic, interaction, and information design, obfuscation is somewhat counter-intuitive. In my first year as a Visual Communication student, 20 years ago, I learned about Gestalt theory and how our visual and cognitive system constantly processes vast amounts of visual data, separates foreground from background, groups similar signals, identifies contrasts and patterns, and imposes order on the visual world. As designers we take advantage of this order bias to build artificial graphic systems that leverage this visual sense-making. Designers make choices about typographic hierarchy, balanced color schemes, interaction flows, and information visualization in order to direct and shape human attention.
And at the Israeli NGO The Public Knowledge Workshop, where I work as a media activist, we have been investing a lot of energy in trying to make sense of messy government data, to find signals in the noise, and to use clear evidence to hold the government accountable.
So what led me to work against all of this and take up obfuscation as my weapon of choice?
Data and its Discontents
The celebration of data and its “big” promises to inform our lives has been met with a mixed response. The popularity of social media and mobile technology has also increased the invasiveness of commercial and government surveillance. What was initially dismissed as rumor and tech-conspiracy theory was later substantiated by Edward Snowden’s leak of the NSA’s extensive surveillance program and the tech giants’ compliance with this widespread spying. All of this has led to widespread concern about the slow erosion of privacy and the widening use of cognitive manipulation by algorithms. This growing data anxiety is, however, mostly impotent, as people find it hard to exercise their political agency online, and are often not even sure whether such agency can exist at all. This power imbalance feeds the common techno-determinist approach that claims this is “what technology wants” and that there is therefore nothing we can do about it. Unable to resist, feeling the battle was lost before it even began, many try to belittle the losses. They imagine their lives without privacy, decide they have “nothing to hide”, and conclude that this techno-cultural shift should not carry grave personal consequences.
Crypto culture evolved around a technological response to surveillance. It is centered on the use of encryption technologies in an attempt to hide and protect data and communication from being tracked by third parties. Complex mechanisms involving the exchange of trusted keys and elaborate processes of encryption and decryption provide individuals some refuge from the all-watching eyes of online surveillance. Indeed, when Google can only deliver your emails but not read them, or when your text messages are encrypted, the ability of these communication platforms to peek into your content is severely diminished. Some advocate not only encrypting as much as possible but also “just saying no”: feeling exploited by abusive communication platforms, they opt out entirely, committing so-called “social media suicide”.
Yet I am afraid these solutions are not gaining enough traction to become viable alternatives to the big data surveillance status quo. At best, both encryption and opting out provide individual protection for a few tech-savvy elites, while removing these critical agents from the scene of the action and making it harder for them to deliver their message to the unencrypted “ignorant” masses. At the end of the day, people don’t go online to hide; they go online to express themselves and to communicate with others. My concern is that by focusing on individual encryption we put the responsibility on those who “simply don’t get it”, rather than challenging the systemic vulnerabilities of the web and working together to hold those who exploit them accountable.
Fighting Data with Data
Data obfuscation emerged in recent years as a different countermeasure. While encryption and opting out are based on restrictive measures and individual protection, data obfuscation takes expression to a whole new level. In 2006, in an effort to fight search engine profiling, my colleagues Daniel C. Howe and Helen Nissenbaum developed the TrackMeNot browser extension. TrackMeNot continuously performs randomized search queries, obfuscating the genuine searches performed by the individual. Similarly, the AdNauseam browser extension[2] that we launched in 2014 not only blocks ads but also quietly clicks on all of them. Both extensions surround the genuine data with automated noise, fighting big data surveillance by making the data bigger than the surveillance can successfully analyze.
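To illustrate the principle, here is a minimal sketch of noise-query injection in the spirit of TrackMeNot. It is not the extension’s actual code; the search endpoint, seed queries, and timing are my own illustrative assumptions. A background script periodically fires a plausible query, so a profiler cannot tell which searches were genuinely the user’s.

```typescript
// Minimal sketch of query obfuscation in the spirit of TrackMeNot (not its actual code).
// Assumes a browser-extension background context where fetch() and setTimeout() exist;
// the search endpoint and seed queries below are illustrative placeholders.

const seedQueries: string[] = [
  "weather tomorrow",
  "cheap flights",
  "banana bread recipe",
  "local election results",
  "how to fix a bicycle chain",
];

// Pick a random decoy query from the seed list.
function randomQuery(): string {
  return seedQueries[Math.floor(Math.random() * seedQueries.length)];
}

// Fire a decoy search so it gets logged alongside the user's genuine queries.
async function sendDecoy(): Promise<void> {
  const q = encodeURIComponent(randomQuery());
  await fetch(`https://search.example.com/?q=${q}`); // hypothetical search engine
}

// Schedule decoys at randomized intervals (here: every one to five minutes),
// so the noise does not itself form an obvious machine-like pattern.
function scheduleDecoys(): void {
  const delayMs = (1 + Math.random() * 4) * 60_000;
  setTimeout(async () => {
    await sendDecoy();
    scheduleDecoys();
  }, delayMs);
}

scheduleDecoys();
```

The obfuscation here is purely additive: nothing is hidden, the genuine signal is simply drowned in plausible noise.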
In the TOR network each communication packet ping-pongs between many relay nodes before hitting its target. With every communication session broken into small packets, each of them apparently coming from a different IP address, it is nearly impossible to track down who in this vast network is on the other end of the line. During the public uprising following the Iranian election in 2009, people all around the world set up TOR relays and proxy servers in solidarity, to help obfuscate Iranian activists’ communications and protect them from the Iranian government’s surveillance. This aspect of collective action and solidarity is an important complement to the individual solutions of cryptography and a refreshing alternative to a culture of determinist defeatism.
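The layered routing can be sketched conceptually as follows. This is not TOR’s actual protocol or cryptography; the relay names are arbitrary and the “encryption” is a toy placeholder, but it shows why each relay learns only the next hop and never the whole route.

```typescript
// Conceptual sketch of onion-style routing, not TOR's actual protocol or crypto.

interface Onion {
  nextHop: string | null; // where the peeling relay should forward the payload
  payload: string;        // either another (wrapped) onion or the final message
}

// Toy stand-ins for real public-key encryption: each "ciphertext" is merely
// tagged with the relay's name, so only the matching relay can peel it.
function encryptFor(relay: string, data: string): string {
  return `${relay}::${data}`;
}
function decryptAt(relay: string, data: string): string {
  const prefix = `${relay}::`;
  if (!data.startsWith(prefix)) throw new Error("wrong relay for this layer");
  return data.slice(prefix.length);
}

// The sender wraps the message once per relay, innermost layer first.
function buildOnion(message: string, route: string[]): string {
  let layer = encryptFor(
    route[route.length - 1],
    JSON.stringify({ nextHop: null, payload: message }),
  );
  for (let i = route.length - 2; i >= 0; i--) {
    layer = encryptFor(
      route[i],
      JSON.stringify({ nextHop: route[i + 1], payload: layer }),
    );
  }
  return layer;
}

// Each relay peels exactly one layer: it learns only the next hop,
// never the full route, the original sender, or the final content.
function peelAt(relay: string, onion: string): Onion {
  return JSON.parse(decryptAt(relay, onion)) as Onion;
}

// Example: route a message through relays A -> B -> C.
const route = ["A", "B", "C"];
let packet = buildOnion("hello from no one in particular", route);
for (const relay of route) {
  const { nextHop, payload } = peelAt(relay, packet);
  console.log(`${relay} forwards to ${nextHop ?? "the final destination"}`);
  packet = payload;
}
```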
Why Obfuscate?
In their book Obfuscation: A User’s Guide for Privacy and Protest,[3] Finn Brunton and Helen Nissenbaum categorize the different functions of obfuscation. While it is far from a silver-bullet solution to privacy and data protection, in many cases it can provide the right means, depending on the end.
The White Cyclosa and Anthony Curcio were not using obfuscation as a permanent protection but simply as a way to postpone identification and buy enough time to escape their pursuers. The rebel slaves were shouting “I’m Spartacus” to provide cover and express protest. The TOR network is used to prevent individual exposure. TrackMeNot[4] and AdNauseam are used to interfere with profiling and provide plausible deniability. Other examples show different uses and different goals. We should therefore always analyze the means of obfuscation in the context of its goals.
Is It Wrong?
Data is mostly used as a proxy in a more scientific, and presumably more reliable, process of gathering knowledge. If obfuscation is set to tamper with that, is it ethical? Some ethicists would answer with a flat “No!”. Kant, for example, held truth to be the highest value, claiming you shouldn’t lie even to a murderer asking you for the location of his intended victim[5]. Many other ethicists criticize Kant’s uncompromising position, arguing for a more nuanced approach to the ethics of knowledge exchange.
There are definitely questionable uses of obfuscation, be it Anthony Curcio’s robbery or governments deliberately obfuscating open government data to make it nominally transparent yet practically unintelligible. In fact, obfuscation is always questionable, and that question is the key to understanding the informational power structures it addresses.
Distributed Denial of Service (DDoS) attacks use obfuscation to generate many automated hits on a website, flooding the target servers until traffic slows to a halt, often crashing the servers or making them practically unusable. The automated requests are impossible to distinguish from those of genuine users, so refusing to serve some requests would mean refusing to serve any, effectively shutting down all communication. Many online hacktivist groups justify this tactic as a legitimate tool for fighting tyrannical governments and corrupt corporations. But the use of DDoS does not stop there: the practice and technology made iconic by groups like Anonymous are more often used for criminal activity and blackmail, and even by repressive governments against opposition groups and civil society organizations.
So am I trying to argue that the ends justify the means? They might, but the means cannot be justified out of context. Nor should any use of data be outright justified or vilified, whether it claims to be truthful or not. We should generate, collect, analyze, or obfuscate data with the larger social and political implications in mind. Obfuscation manufactures uncertainty as a tactic; it raises the question of how “big” data can really get before it collapses under its own weight. Since most of today’s algorithmic mechanisms rely heavily on automatically identifying trends and patterns in large datasets, letting the statistical model emerge from correlations, they are left vulnerable to obfuscation and other forms of noise. The more authoritative and influential the algorithm, the higher the stakes for influencing it. And the more data sources the algorithm relies on, the more it exposes itself as a potential target for obfuscation.
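To make that vulnerability concrete, here is a toy sketch of my own (not drawn from any system cited here): a naive profiler infers interests from relative click frequencies, and uniformly distributed automated noise quickly flattens the inferred profile.

```typescript
// Toy illustration (not any real profiler): a naive interest profile inferred
// from click counts, and how uniformly distributed noise clicks dilute it.

type Clicks = Record<string, number>;

// Normalize raw click counts into relative frequencies per topic.
function profile(clicks: Clicks): Record<string, number> {
  const total = Object.values(clicks).reduce((a, b) => a + b, 0);
  return Object.fromEntries(
    Object.entries(clicks).map(([topic, n]) => [topic, n / total]),
  );
}

// Mix in `noisePerTopic` automated clicks spread evenly across all topics.
function obfuscate(clicks: Clicks, noisePerTopic: number): Clicks {
  return Object.fromEntries(
    Object.entries(clicks).map(([topic, n]) => [topic, n + noisePerTopic]),
  );
}

const genuine: Clicks = { politics: 40, sports: 5, cooking: 5 };
console.log(profile(genuine));                // politics dominates at 0.8
console.log(profile(obfuscate(genuine, 50))); // politics drops to 0.45, nearly flat
```

The profiler ends up with more data than before, yet the correlations it relies on are now worth much less.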
Such “gaming” of algorithmic authority has become more and more controversial in the past few years with the rise of fake news, obfuscated through clickbait and disseminated through social media filter bubbles. While obfuscation could potentially promote healthy data skepticism, it may also further the mistrust of the scientific process and of fact-based public discourse.
To conclude, I would argue that data obfuscation should not be read outside the wider cultural and political context of today’s big data controversies. In this context, tech giants develop clandestine corporate surveillance apparatuses to spy on us and manipulate our perception of the world in the service of their business interests. Governments either cooperate with these tech giants or use their own technological might and reach to do the same. And some bad actors use this crisis of trust to throw the baby out with the bathwater and dismiss any notion of truth as the product of lies and manipulation serving narrow interests. I argue that obfuscation allows us the much-needed benefit of the doubt, for it exposes the seams within these opaque algorithmic control mechanisms.
Rather than go down the nihilistic trajectory of post-truth, a constructive use of these technologies and countermeasures could challenge the balance of power between tech giants and the wider public and lead us back to defining the ethics of big data, to reevaluating the terms of its collection, analysis, storage, accessibility, and use. Finally, when considering obfuscation’s impact on our technological, cultural, and ethical landscape, we may apply Melvin Kranzberg’s first law of technology[6] and realize that…
Obfuscation is neither good nor bad;
nor is it neutral.
Visual examples
The White Cyclosa spider sculpts its own self-portraits to obfuscate its true identity on the web.
In 2013 Ben Grosser created the ScareMail browser extension, which adds “scary” text to emails in Gmail. ScareMail appends an algorithmically generated narrative containing a collection of probable NSA search terms. Every story is unique, to defeat automated filtering; every story is nonsensical, to increase tracker frustration.[7]
While the AdNauseam extension blocks and then quietly clicks all ads on the websites the user visits, its ad vault shows all of these ads at once. This colorful visual overdose exposes the targeting failure of surveillance advertising when the ads are doomed to fight for attention all at once – an illustration of the data glut generated by the extension’s automatic clicking function.
Unfitbits is a series of obfuscation solutions to “release your fitness data from yourself”. Tega Brain and Surya Mattu advocate attaching fitness trackers such as the Fitbit to various motorized devices to trick the trackers and potentially gain insurance discounts. In the image: a Fitbit bio-tracking device mounted on a metronome.
More information
- Obfuscation: A User’s Guide for Privacy and Protest / Finn Brunton and Helen Nissenbaum
- Surveillance Countermeasures: Expressive Privacy via Obfuscation / Daniel C. Howe
[1] Brunton, Finn, and Helen Fay Nissenbaum. Obfuscation: A User’s Guide for Privacy and Protest. Cambridge, MA: MIT Press, 2016.
[2] Howe, Daniel C., Mushon Zer-Aviv, and Helen Nissenbaum. “AdNauseam: Clicking Ads So You Don’t Have To.” Accessed February 18, 2018. http://adnauseam.io/.
[3] Brunton, Finn, and Helen Fay Nissenbaum. Obfuscation: A User’s Guide for Privacy and Protest. Cambridge, MA: MIT Press, 2016.
[4] Howe, Daniel C., and Helen Nissenbaum. TrackMeNot. Accessed February 18, 2018. https://cs.nyu.edu/trackmenot/.
[5] Kant, I. Groundwork of the Metaphysics of Morals, The Metaphysics of Morals and “On a supposed right to lie from philanthropy,” in Immanuel Kant, Practical Philosophy, eds. Mary Gregor and Allen W. Wood (Cambridge: CUP, 1986).
[6] Kranzberg, Melvin. “Technology and History: ‘Kranzberg’s Laws’.” Technology and Culture 27, no. 3 (1986): 544–60. doi:10.2307/3105385.
[7] Howe, Daniel C. “Surveillance Countermeasures: Expressive Privacy via Obfuscation.” Post-Digital Research 3, no. 1 (2014). Edited by Christian Ulrik Andersen, Geoff Cox, and Georgios Papadopoulos.