At first glance, an algorithm is a step-by-step procedure for calculations, and as such a wonderfully boring, predictable thing. As a result, watching algorithms is only bearable for geeks or when animated, as a dance for example. Their computation consists of sequences of commands that instruct a machine how to accomplish a task (Cormen 2013).
These instructions are expressed in programming languages such as Java, C++, Python, or Fortran and inform an “application” or “software” how to behave, whereby they rely on protocols to exchange communication with other software, devices, or internet nodes. At a second glance, one discovers that algorithms bear a paradoxical relation: while their steps are predictable, their impact isn’t. Their technical reality is fundamentally disparate from their social effects (Bunz 2014, 55). From this perspective, algorithms are unpredictable but of glittering power, reshaping our societies: they trade at the stock market, calculate organ donations in hospitals, become a weapon in the war against terrorism by controlling and managing “kill lists”, and run ever more devices in this world, from phones to traffic lights to our heating; in the humanities, they explore knowledge by processing large data sets.
Thus, much like the tools that turned us from an animal rationale into a homo faber (Arendt 1958), algorithms shift our human condition. Today, we find ourselves in the new role of a homo communicatus; the algorithmic processing of knowledge is said to affect the human brain (Carr 2010), or to open its own mode of thought, as the English-Italian philosopher Luciana Parisi writes: instead of simply performing rules, algorithms “have become performing entities and are now actualities that select, evaluate, transform, and produce data” (Parisi 2013, IX): they have agency. When they are no longer seen as “a tool to accomplish a task” (XIII), the age-old definition of algorithms is challenged – a definition dating back to the Persian mathematician Muhammad al-Khwarizmi (c. 780–850), whose Latinized name gave us the word “algorism”. But if algorithms are no longer step-by-step procedures for calculations that consist of instructions and follow a finite set of rules to carry out a computation, one is confronted with a range of new questions. What are algorithms now? Why is their status changing? And how is this of use to the humanities?
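The age-old definition under challenge here can still be made concrete. A classic instance of a finite, step-by-step procedure is Euclid’s algorithm for the greatest common divisor, sketched below in Python purely for illustration:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite set of rules, carried out step by step.

    Each step applies one fixed rule; the loop is guaranteed to terminate
    because the remainder strictly decreases toward zero.
    """
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a

print(gcd(1071, 462))  # prints 21
```

Every run of this procedure is fully determined by its input, which is exactly the predictability the classical definition describes.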
What can algorithms know?
The present power of algorithms is fueled by another entity: data. Generally referred to as big data, large data sets – whose technical history has been well described by Kevin Driscoll (2012) – contest the rule of algorithms. Experts agree that without data to process, the algorithm remains inert (Berry 2011, 33; Cheney-Lippold 2011; Manovich 2013). The effectiveness of algorithms is strongly related to the data sets they compute, and computer scientists (Domingos 2012) as well as businesspeople (Croll and Yoskovitz 2013) ponder whether more data beat better algorithms, or whether it is the other way round. Thus, it is no surprise that within the humanities, algorithms are not only reflected upon but also used to analyze cultural data.
To analyze culture in a new way, large data sets are compiled and digitized material is visualized. The cultural analyst Lev Manovich (2013) has worked on visualization tools to dissect videos on the level of specific frames. This approach has been carried further by the artist and hacker Robert M Ochshorn, who uses algorithms to read out films as if they were text. The literary scholar Franco Moretti (2005) uses quantitative methods to generate a graph of the fast rise and dramatic fall of British novelistic genres (44 genres over 160 years), or to map the radically changing geography of village narratives. These new ways of generating cultural knowledge by applying quantitative methods to data sets with algorithms are referred to as “Digital Humanities” (cf. Berry 2012; Lunenfeld et al. 2012). Here, algorithms become a way of “knowing knowledge” (Berry 2012, 6), which has triggered new debates. Reducing algorithms to a mere instrumental tool has been strongly criticized, for example in Alan Liu’s (2012) call to rethink the idea of instrumentality.
What can algorithms do?
From a philosophical perspective, the question of the being of algorithms causes interesting problems. The withdrawal of their being starts with their denotation: “algorithm” is not the only way to address a computational process. “Code” is the language in which an algorithm is written. The word “algorithm” – a set of rules to be followed in calculations – marks a mathematical perspective, while the word “code” – a system of signs that represent others – takes a linguistic perspective. Two words describe the same “thing.”
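The two perspectives can indeed address one and the same “thing”: in the sketch below, the rule table is an algorithm in the mathematical sense – a set of rules of the form (state, symbol) → (write, move, next state) – while the Python lines that execute it are code. The machine and its rules are invented here for illustration; they follow the general scheme of a Turing-style symbol manipulator, not any particular historical notation.

```python
def run_machine(rules, tape, state="start", max_steps=1000):
    """Simulate a one-tape symbol-manipulating machine.

    The mathematical object is the rule table; the linguistic object
    is this code that reads and executes it, symbol by symbol.
    """
    cells = dict(enumerate(tape))  # sparse tape; empty cells read "_"
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells.get(head, "_"))]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A toy rule table (invented for illustration): invert a binary string.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_machine(invert, "1011"))  # prints 0100
```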
Likewise, it is hard to say what code or an algorithm does. Its ontological status – what is its being? – has been highly disputed ever since the British computer scientist Alan Turing (1936) wrote about the idea of a ‘Universal Machine’. This theoretical device manipulates symbols, and thereby simulates the logic of any computer algorithm. When Turing, at that time still a young student at King’s College, Cambridge, worked on the proof “On Computable Numbers, with an Application to the Entscheidungsproblem”, he found that if it is possible to give a mathematical description of the structure of a machine, then every machine can be simulated by manipulating symbols. When software consisting of algorithms runs, his “universal” principle comes to life. This life is generally described as “virtual”, which is why the status of the algorithm is such a disputed issue. Philosophers and theorists have approached what algorithms do from various sides. Accordingly, they describe algorithms:
- As inexistent: The German media theorist Friedrich Kittler famously claims “There is no software” (1995). In his description, algorithms are a mere effect of the hardware they rely on, designed to disguise our determination by technical hardware. While this is a radical approach of some beauty, it also poses various problems: from a philosophical perspective, it repeats the gesture of idealistic philosophy of seeking truth behind a curtain; from a humanistic perspective, it operates along the lines of technological determinism; from a pragmatic perspective, it lacks an approach to study the evolving field of software further. Against Kittler, Andrew Goffey (2008) has demonstrated the many ways in which an algorithm executes “control”.
- As a transforming activity: Kittler scholar Wolfgang Ernst focuses on the fact that an algorithm stores information in a different way than writing: breaking it down into 0s and 1s, it doesn’t narrate but counts. Therefore, archives in the age of digital collections become a “mathematically defined space” (cf. Ernst 2012; Parikka 2012). Shintaro Miyazaki (2012) also emphasizes the activity but insists that in a strict sense it is not even mathematical. In his view, an algorithm formulated in a programming language is not the same as an algebraic formula: it is not “recursive”. Alexander Galloway stresses the specific ontological quality of algorithms and code as executable: “code is the summation of language plus an executable metalayer that encapsulates that language” (Galloway 2004, 165). While the ontological status of the algorithm as process has been acknowledged, the approach has also been questioned. Against the tendency to treat source code as an origin from which algorithmic actions emerge, Chun (2008) makes the point that interfaces are more than the effect of their source.
- As an element of interaction: Software studies looks further into algorithmic activity in a broader sense and seeks to overcome the “immateriality” of software (Fuller 2008, 4). Aspects of design, glitches, interactions, or preferences come into focus (Fuller 2003), as do the social conditions of the algorithm’s production (Berry 2011, 43-51). Software can only be understood in the middle of things: “we can only begin with things” (Chun 2008, 324). While this approach widens the view, it again invites criticism from the Kittler school, which suspects this position of falling back into technological oblivion.
- As a thing: Wendy Chun (2013, 8) and Yuk Hui both look closer at algorithms as things. Looking at source code, Chun stresses that it is an abstraction that is haunted, as it can do and be things: “it can be interpreted or compiled; it can be rendered into machine-readable commands that are then executed” (2013, 51). Addressing data as a “digital object” and relating it to the philosophical discourse on things, Hui notes that “the evolution of technical standards from GML to XML to Web ontologies blurs the distinction between a simple text file and a structured computer program” (Hui 2012, 394). Next to the “natural objects” discussed in continental philosophy and the “technical objects” discussed in Simondon’s philosophy of technology, the new ontological status of a “digital object” emerges. This digital object is defined by its relations: “relationality is the point where algorithms act” (394).
- As the power that drives cognitive capitalism: Discussing technological empires like Google, Pasquinelli (2009) has argued that besides being apparatuses of surveillance or control, the algorithms employed by Google or Facebook capture living time and living labour. Driven by algorithms, these technological empires host free content produced by free labour with the aim of exploiting it, applying a new form of parasitic rent based on metadata and algorithms. Google does not need to possess the information on the internet, but it owns the algorithms that manage to find the right data. As Pasquinelli writes, thanks to its algorithms Google becomes “a global rentier that is exploiting the new lands of the internet“ (2009, 153). Discussing financial algorithms, Franco Berardi and Geert Lovink paint an even darker picture and warn that the process of predatory power has become automated (Berardi and Lovink 2011), with humans no longer in charge.
- As an ethical problem: Apart from everyday objects that have acquired tremendous power to regulate behaviour through their algorithmic arming, algorithms are also explicitly programmed to make decisions and thus become loaded with value judgments, as is explicitly the case in high-frequency trading, in profiling suspects, or in organ donation. Kraemer, van Overveld, and Peterson (2011) have argued that in these cases the designers of algorithms should, where possible, leave ethical issues to users, or make the ethical assumptions within the algorithm transparent. Susan Schuppli (2014) has taken this argument further. Looking beyond the use of armed drones to the algorithms which calculate who can be considered a terrorist and should be killed, she warns that computational regimes produce new relations of power for which we have inadequate legal frameworks and modes of political resistance.
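Pasquinelli’s argument above concerns Google’s PageRank in particular; its core mechanics can be sketched as a simple power iteration over a link graph. The graph, damping factor, and iteration count below are illustrative assumptions for a toy example, not Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank by power iteration.

    A page's rank is the damped sum of the ranks of the pages linking
    to it, with each linker's rank split evenly among its outgoing links.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

# Toy link graph: each key lists the pages it links to.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

The sketch makes Pasquinelli’s point tangible: the algorithm owns none of the pages it ranks, yet the ranking it produces is what organizes access to them.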
What can algorithms hope?
As the immaterial materiality on which the 21st century is to be built, algorithms are strange things in which base and superstructure coincide. In principle, this transformation has a strong potential. Yet it is also highly divergent, and next to creative disruption one finds cheerful destruction. Recently, technology has mainly been explored from economic perspectives; this exploration remains poetically ignorant, but it does not need to stay that way forever.
Arendt, H. (1958): The Human Condition. Chicago: University of Chicago Press.
Berardi, F. and G. Lovink (2011): “A Call to the Army of Love and to the Army of Software.” <http://networkcultures.org/wpmu/geert/2011/10/12/franco-berardi-geert-lovink-a-call-to-the-army-of-love-and-to-the-army-of-software>, [accessed 1 October 2014].
Berry, D. (2011): The Philosophy of Software: Code and Mediation in the Digital Age. London: Palgrave.
Berry, D. (2012): Understanding Digital Humanities. London: Palgrave Macmillan.
Bunz, M. (2014): The Silent Revolution. How Digitalization Transforms Knowledge, Work, Journalism and Politics Without Making Too Much Noise. Basingstoke: Palgrave Macmillan.
Carr, N. (2010): The Shallows: How the Internet is Changing the Way We Think, Read and Remember. London: Atlantic Books.
Cheney-Lippold, J. (2011): “A New Algorithmic Identity. Soft Biopolitics and the Modulation of Control.” Theory, Culture & Society, 28, 6, 164-181.
Chun, W. (2008): “On ‘Sourcery,’ or Code as Fetish.” Configurations, 16/3, Fall, 299-324.
Chun, W. (2013): Programmed Visions: Software and Memory. Cambridge MA: MIT Press.
Cormen, T.H. (2013): Algorithms Unlocked. Cambridge MA: MIT Press.
Croll, A. and Yoskovitz, B. (2013): Lean Analytics: Use Data to Build a Better Startup Faster. Sebastopol: O’Reilly Media.
Domingos, P. (2012): “A few useful things to know about machine learning.” Communications of the ACM, 55,10, 78–87.
Driscoll, K. (2012): “From Punchcards to ‘Big Data’: A History of Database Populism.” communication +1, Volume 1, Article 4, available at <http://scholarworks.umass.edu/cpo/vol1/iss1/4>, [accessed 1 October 2014].
Ernst, W. (2012): Digital Memory and the Archive. Minneapolis: Minnesota University Press.
Fuller, M. (2003): Behind the Blip, Essays On the Culture of Software. New York: Autonomedia.
Fuller, M. (ed.) (2008): Software Studies, A Lexicon. Cambridge MA: MIT Press.
Galloway, A. (2004): Protocol: How Control Exists After Decentralization. Cambridge MA: MIT Press.
Goffey, A. (2008): “Algorithm,” in: Software Studies, A Lexicon, edited by M. Fuller. Cambridge MA: MIT Press, 15-20.
Hui, Y. (2012): “What is a Digital Object?” Metaphilosophy, 43, 380–395.
Kittler, F. (1995): “There is no software.” ctheory 10.18, available at: <http://www.ctheory.net/articles.aspx?id=74>, [accessed 1 October 2014].
Kraemer, F., van Overveld, K. and Peterson, M. (2011): “Is there an ethics of algorithms?” Ethics and Information Technology, 13 (3), 251–260.
Lunenfeld, P.; Burdick, A.; Drucker, J.; Presner, T.; Schnapp, J. (2012): Digital_Humanities. Cambridge MA: MIT Press.
Manovich, L. (2013): Software Takes Command. London: Bloomsbury.
Miyazaki, S. (2012): “Algorhythmics: Understanding Micro-Temporality in Computational Cultures.” Computational Culture – A Journal of Software Studies, 28 September 2012, available at: <http://computationalculture.net/article/algorhythmics-understanding-micro-temporality-in-computational-cultures>, [accessed 1 October 2014].
Liu, A. (2012): “Where is Cultural Criticism in the Digital Humanities?” Debates in the Digital Humanities. Minneapolis: University of Minnesota Press, <http://dhdebates.gc.cuny.edu/debates/text/20>, [accessed 1 October 2014].
Moretti, F. (2005): Graphs, Maps, Trees. London: Verso.
Parikka, J. (2012): What is Media Archaeology? Cambridge: Polity Press.
Parisi, L. (2013): Contagious Architecture: Computation, Aesthetics, and Space. Cambridge MA: MIT Press.
Pasquinelli, M. (2009): “Google’s PageRank Algorithm: A Diagram of Cognitive Capitalism and the Rentier of the Common Intellect.” Deep Search, edited by Konrad Becker and Felix Stalder. London: Transaction Publishers.
Schuppli, S. (2014): “Deadly algorithms: Can legal codes hold software accountable for code that kills?” Radical Philosophy, 187, Sept/Oct: <https://www.radicalphilosophy.com/commentary/deadly-algorithms>, [accessed 1 October 2014].
Turing, A. (1936): “On computable numbers, with an application to the Entscheidungsproblem.” Proceedings of the London Mathematical Society 42/2, 230-265.