
Introduction

Cultural studies of data mining: Introduction

European Journal of Cultural Studies
2015, Vol. 18(4-5) 379–394
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/1367549415577395
ecs.sagepub.com

Mark Andrejevic

The University of Queensland, Australia

Alison Hearn

University of Western Ontario, Canada

Helen Kennedy

University of Sheffield, UK

Over the past two years, the total amount of data about everything from the humidity of shipping crates to toilet flushes in shopping malls to tweets about Justin Bieber has exceeded the total amount yet recorded in human history – equivalent to a zettabyte of data, or a sextillion bytes, and growing (Griffin, 2014). Given this, it is now axiomatic to claim that we are in the age of big data and are witnessing a quantitative (and perhaps qualitative) 'revolution' (Lohr, 2012) in human knowledge, driven by accompanying forms of data mining and analytics. New analytical methods and businesses seeking to monetize this explosion of data emerge daily. These companies and their analytic methods, often offered in black-boxed proprietary form, promise to help us gain insight into public opinion, mood, networks, behaviour patterns and relationships. Data analytics and machine learning are also ostensibly paving the way for a more intelligent Web 3.0, promising a more productive and intuitive user/consumer experience. Data analytics involve far more than targeted advertising, however; they envision new strategies for forecasting, targeting and decision-making in a growing range of social realms, such as marketing, employment, education, health care, policing, urban planning and epidemiology. They also have the potential to usher in new, unaccountable and opaque forms of discrimination and social sorting based not on human-scale narratives but on incomprehensibly large, and continually growing, networks of interconnections.

Corresponding author:
Helen Kennedy, University of Sheffield, Sheffield, S10 2TU, UK.
Email: h.kennedy@sheffield.ac.uk


A well-developed cultural studies approach has an important role to play in considering the social and political consequences of data mining and analytics. When every move
we make online is tracked by privately owned corporations and the state, advertisements
follow us around in material retail spaces, and even our sleep patterns become fodder for
self-tracking (to gain self-knowledge), we cannot afford to limit our thinking about
data analysis technologies by approaching them solely as communication media. Instead,
we must see them as techno-economic constructs whose operations have important
implications for the management of populations and the formation of subjects. As Web
3.0 and the big data it generates move inexorably towards predictive analytics and the
overt technocratic management of human sociality, questions need to be asked about
what kinds of data are gathered, constructed and sold; how these processes are designed
and implemented; to what ends data are deployed; who gets access to them; how their
analysis is regulated (boyd and Crawford, 2012) and what, if any, possibilities for agency
and better accountability data mining and analytics open up.
Although cultural studies has always been good at picking up on significant cultural
trends and, in some cases, at anticipating them, databases and data processing practices
do not fit neatly within the traditional ambit of the field and pose some challenges to it.
Data mining challenges conventional forms of narrative and representation, promising
to discern patterns that are so complex that they are beyond the reach of human perception, and in some cases of any meaningful explanation or interpretation. Moreover, as a
signifying practice, data mining is less accessible than other forms of cultural representation, such as the cultural texts and everyday life practices that have been the traditional focus of cultural studies research. Data mining is not only a highly technical practice; it also tends to be non-transparent in its applications, which are generally privately owned and controlled. As a result, it can be extremely difficult for independent
researchers to gain access to data sets and analytic methods in order to critically assess
them. Most often, with so little insight available into their production, we are left only
to theorize their effects.
In his 1963 inaugural address at the University of Birmingham, Richard Hoggart captured cultural studies' original intentions when he observed that his motives (and by association, those of the emerging discipline) for attending to the contemporary and especially commercial culture were to find ways of understanding the languages of developing popular media forms: television, magazines, and, especially, advertising (as paraphrased by Gray et al., 2007: 5). Tellingly, Hoggart's speech also critically diagnosed the forms of categorization and targeting that have since become a staple of commercial media research and marketing: 'So much language is used not as exploration but as persuasion and manipulation', he observed – manipulation that treats individuals as 'bits', bits who belong to 'large lopsided blocks – as citizens, as consumers, as middle-aged middle-class parents … as voters or viewers, as religiously inclined car owners, as typical teenagers or redbrick professors' (Gray et al., 2007: 5). It is a formulation that anticipates a tendency that has become increasingly pronounced throughout the last decades of the 20th century, as we have moved into the era of mass customization, targeting and personalization. Indeed, this trend has subsequently developed to the point that the manufacture of the atomized 'bits' that are the targets of customized messaging has become as significant as the messages themselves. So, as we tweet, post, like, share and Google to generate meaning, related platforms and their analytics generate us as 'bits' in turn and deploy our communicative efforts to their own advantage.
Micro-targeting via data analytics and metricization is 'the message' in the current conjuncture. This development is reflected in the heightened popular attention to media technologies that function in a different register than the content-driven mass media: the rise of a fascination not just with the Internet and the mobile phone, but also with databases, their algorithms and analysts. Drawing on John Durham Peters (2013), we might describe this development as the rise of 'logistical media' – media whose content is 'not so much narratival or representational as organizational' (p. 40). Peters (2013) refers to media such as data processors broadly construed – 'media that arrange people and property into time and space' (p. 40). It is a formulation that captures the allocative character of data mining and the processes of segmentation, sorting and categorizing that emerge from the search for useful patterns.
To observe that the database as a research object fits more comfortably into the category of logistical media than traditional mass media is not to suggest that databases are contentless, but rather to note the shift away from interpretive approaches and meaning-making practices towards the project of arranging and sorting people (and things) in time and space. Consider, for example, Google's oft-repeated response to concerns about its data mining of personal messages on its free email service: 'no humans read your email'. The process of reading proper is not what matters in this approach to the medium. That is, content is not approached as content but as metadata – information about the content that can be used to analyse and categorize individuals and groups of people, places and things. Messages are mined not to understand or interpret their intended or received content but to arrange and sort people and their interactions according to the priorities set by Google's business model.
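To make the contrast concrete, consider a minimal sketch, in Python, of what such metadatification might look like; the feature names and keyword lists here are our own invented illustrations, not Google's actual pipeline or schema:

import re
from datetime import datetime

def metadatify(sender: str, recipient: str, sent_at: datetime, body: str) -> dict:
    """Reduce a message to data about itself: nothing here 'reads' the text,
    it only extracts features that can sort the sender into categories.
    All field names and keyword lists are illustrative, not any provider's."""
    words = set(re.findall(r"[a-z']+", body.lower()))
    return {
        "from_domain": sender.split("@")[-1],
        "to_domain": recipient.split("@")[-1],
        "hour_sent": sent_at.hour,                     # when the user is active
        "word_count": len(body.split()),               # volume, not meaning
        "mentions_travel": bool(words & {"flight", "hotel", "airport"}),
        "mentions_purchase": bool(words & {"order", "receipt", "invoice"}),
    }

profile = metadatify(
    "alice@example.com", "bob@example.org",
    datetime(2015, 3, 2, 23, 40),
    "Your flight is booked; the hotel receipt is attached.",
)
print(profile)  # categories about the message, never its intended meaning

Nothing in the sketch interprets the message; it only produces sortable categories – which is precisely the point.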
We might describe this process of metadatification – whereby a message is reconfigured into data about itself – as the post-ideological or post-textual moment taken to its logical conclusion. That is, once the notion of a containable, interpretable and transmissible dominant meaning is deconstructed beyond recognition, what we are left with is the circulation of affects, and eventually, from an instrumental point of view, their effects. This trajectory was anticipated, perhaps inadvertently, by John Fiske's move towards a version of post-textualism in which, he states, 'there are no texts, no audiences. There is only an instance of the process of making and circulating meanings and pleasures' (Turner, 1992: 123). Intimations of these developments can also be found within semiotic criticism itself, for example, in early discussions about promotionalism by Andrew Wernick. In his work, echoing Barthes, the logics of capitalist instrumentality prey parasitically on modes of human representation, turning them towards politically interested ends. Anticipating the reductive instrumentality of metadatification and its predictive uses, Wernick (1990) argued that a promotional message is an 'anticipatory advocate' of its original referent, marked not by what it says but by what it does (p. 184).
Perhaps not coincidentally, recent forms of social and cultural theory mirror developments in big data analytics; new materialism, object-oriented ontology, post-humanism and new medium theory – all of which are coming to play an important role in digital media studies – de-centre the human and her attendant political and cultural concerns in favour of a flat ontology wherein humans are but one node, and perhaps not the most important, in complex networks of interactions and assemblages. Thus, analysis of the circulation of affects and effects rather than of meanings, content or representations, connected as they are to human-centred forms of meaning-making, has become a dominant trope in some influential current approaches to media. Such analyses tend to fashion themselves as anti-discursive in their rejection of a focus on representation and cognition and their turn towards bodies and things in their materiality (rather than their signification). For example, Karen Barad (2003), whose theory of 'agential realism' has been influential in the development of new materialist approaches recently taken up in media studies (and digital media studies in particular), argues that

Language has been granted too much power. The linguistic turn, the semiotic turn, the interpretive turn, the cultural turn: it seems that at every turn lately every thing – even materiality – is turned into a matter of language or some other form of cultural representation. (p. 804)

Google, with its attempt to distance itself from reading, might well agree. By contrast, Graeme Turner (1992) argues that cultural studies is 'a formation in which language looms as the most essential of concepts, either in its own right or through being appropriated as a model for understanding other cultural systems' (p. 13). To line up the cultural
turn alongside the linguistic or semiotic one, as Barad does, is not to suggest that cultural
approaches limit themselves solely to questions of textual representation, but that when
they turn their attention to audiences, institutions, artefacts, economies and activities,
they remain within the horizon of interpretation, explanation and narrative. We suggest
that this adherence to the horizon of meaning is a strategic critical resource in the face of
theoretical tendencies that reproduce the correlational logic of the database by focusing
on patterns and effects rather than on interpretations or explanations.
A related tendency is manifest in the growing interest in German-inflected medium theory, as seen in the growing influence of Friedrich Kittler. As Jeremy Packer (2013) puts it, Kittler's work represents 'a cold-turkey cure for hermeneutic and ideological fixation' (p. 295). In John Durham Peters' words, the German theorist has 'no use for the category of the human or experience' (Kittler, 2010: 5). For him, the conditions of the latter are subordinated to core media processes in a particular historical period: namely, the collection, storage and processing of data. The virtue of such an approach, according to Packer (2013), is that it addresses a shift in the way digital media operate, a shift that he describes in distinctly logistical terms: 'digital media power is first and foremost epistemological, not ideological. It is computational' (p. 297). Packer's words provide a theoretical echo of former Wired magazine editor Chris Anderson's (2008) post-hermeneutic paean to 'the end of theory' ushered in by the advent of big data:

Out with every theory of human behavior, from linguistics to sociology … Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves. (n.p.)

The various strands of theory referenced above are diverse movements on their own,
and there are significant differences among them, but they partake to various degrees in
countering an emphasis on the discursive. It is not difficult to discern certain affinities
between tendencies in these approaches and a datalogical turn more generally: that is, a growing interest in what automated forms of information processing can reveal about the object formerly known as 'content'. This affinity, in turn, raises questions regarding the critical purchase of such approaches, including whether they are adequate to the task of critiquing digital media practices and the social relations they rely upon and/or reproduce. Or, on the contrary, are these theoretical tools prone to buy into the logics of metadatification? Google may not measure ideology, but does this mean that ideology and the need for a critical political analysis of it cease to exist? More pointedly, as Alex Galloway (2013) asks,

Why, within the current renaissance of research in continental philosophy, is there a coincidence between the structure of ontological systems and the structure of the most highly evolved technologies of post-Fordist capitalism? […] Why, in short, is there a coincidence between today's ontologies and the software of big business? (p. 347)

Taken in the spirit of a needed corrective to a perceived over-emphasis on discourse and language, the re-assertion of the extra-discursive serves as a potentially useful provocation. Elizabeth Grosz (2004) (whose work has also been influential in the formation of new materialism), for example, issues a call for

a remembrance of what we have forgotten – not just the body, but that which makes it possible and which limits its actions: the precarious, accidental, contingent, expedient, striving, dynamic status of life in a messy, complicated, resistant, brute world of materiality. (p. 2)

But, while it is true that the tendency towards textualism or constructivism is not foreign to cultural studies, neither is its critique. Almost a quarter century ago, Graeme Turner (1992) noted in response to the celebration of the possibilities of textual polysemy (as a form of liberation from the alleged tyranny of a text's dominant meaning):

The obvious limitation to the progressive effect of this multitude of textual possibilities is that 'making over' the meaning of a television program may be much easier than climbing out of a ghetto, changing the color of one's skin, changing one's gender … (p. 122)

Indeed, in part to address the limitations of textualism, cultural studies has always
incorporated a much broader scope of research objects, including cultural institutions,
audiences, subcultures and the practices of everyday life.
We argue that cultural studies is well positioned to challenge the dangers that can come from an over-correction of a focus on discourse (whether in the realm of audiences, artefacts or texts). In promising to push beyond narrowly human-centred concerns, for example, these recent theoretical positions threaten to adopt a 'view from nowhere' in which the goal of comprehensiveness (the inclusion of all components of an endless network of inter-relations) tends towards a politically inert process of specification in which structures of power and influence dissipate into networks and assemblages. As in the case of data mining, infinite specification results in a deadlock of interpretation. The various parties to an interaction become subsumed to its overall effects so that all we can say, in the end, is, 'it's complicated'. Jane Bennett's (2009) version of vibrant materialism, for example, displaces simple narratives of causality with the recognition of the constantly unfolding network of contributors to any event or outcome. Likewise, in data mining, humans take their place within the growing menagerie of data about inanimate objects, forces, interactions, flora and fauna. Data become not so much symbols to be understood and differentiated as inputs to be sorted. The multiplication of variables and the automaticity of the algorithm displace the roles of interpretation, comprehension and, most significantly, politics. As Galloway (2012) has also argued, these theories remove the point of decision from the people (demos) to the object world at large. And, because there is no way of generating a dynamics of contestation and argument in a flat ontology of ever proliferating relations or objects, these theories leave little room for judgment – obviously a crucial element in any political project.
Our commitment to cultural studies' focus on politics and power (and relatedly, discourse and representation) is not meant as a retro or rear-guard action counterposed to approaches that have forsaken or surpassed the cultural, but rather a (re-)assertion of its unsurpassability. This is not to say that there is nothing beyond culture, but that the significance of the assertion of such a beyond is inescapably caught up in cultural logics. A critical analysis of data mining, then, must involve an interrogation of the embrace of a post-cultural imaginary within contemporary media theory – that is, such an analysis must step back and explicitly situate data mining culturally, politically and economically. Such an approach presses against the notion that it would be either possible or desirable to permanently bracket the distinctive character of the cultural (and associated concerns with the political implications of representation, narration, explanation and interpretation) in the complementary fantasies of automated tracking and sense-making that dispense with the need for comprehension and the theoretical analysis of assemblages in their infinite complexity. In the end, the cultural, representational and political cannot be surpassed by theory or algorithm, no matter how complex, unknowable or compelling they might be. Indeed, complex and compelling theories and algorithms are products of history and subject to cultural logics, no matter how vociferously they might claim it to be otherwise. If cultural studies is 'intellectual political work' or a practice which 'always thinks about its intervention in a world in which it would make some difference, in which it would have some effect' (Hall, 1992: 275), then the cultural studies of data mining and data analytics must attend to questions of power, subjectivity, governance, autonomy, representation, control and resistance which have always been central to enquiry in this field. The articles in this special issue do just that.
One of the pressing tasks of such an endeavour must necessarily be an engagement
with the social relations that render data mining itself opaque. In this regard, our concerns align with those strands of cultural studies that locate their roots in a critique of
political economy. As boyd and Crawford (2012) have argued, the creation of huge data
sets ushers in a new digital divide between those with access to data (and the resources
for making sense of it) and those without, as elite commercial companies like Google,
Facebook and Amazon have the best access to data, as well as the best tools and methods
to make sense of them (Williamson, 2014). These companies (and others like them, such
as OKCupid) can experiment constantly for commercial purposes on their (sometimes
very large) user populations without notifying them, whereas those supposed to be operating in the public interest are effectively locked out or provided only limited access.
This state of affairs seems unlikely to change any time soon, given the economic and
competitive incentive to maintain the proprietary character of data mining. Data mining is capital-intensive insofar as it relies on building the networks and platforms to collect data, the server farms to store it and the computing power to analyse it. If one of the casualties of a fascination with the virtual character of software, applications, code and data has been a shift away from political-economic approaches to the media, the reliance of data mining on capital-intensive infrastructure reminds us of the ongoing salience of political-economic concerns. These debates inform the articles in this collection.
While developments in data collection and processing appear complex, opaque, ineluctable and unassailable, it is crucial to remember that they comprise just another phase
in human technological development, constituted by and constitutive of dominant economic and cultural systems. This, of course, is the insight that cultural studies approaches
bring to bear on the study of data mining practices and discourses. These practices
may promote and exploit the dream of total knowledge and future prediction, but such
initiatives (at least in their proprietary form) call out to be examined in their mundane
materiality, in their contexts of production and use, for the ways they are helping to inaugurate new monopolies of knowledge, systems of governance, forms of subjectivity, and
social and cultural divisions and stratifications. No matter the kinds or numbers of
machines we have wrought, in the end, we remain humans attempting to make meaning
with whatever forms of communication we have available. The collection of essays presented here takes up these concerns by developing critical approaches to emerging technologies and practices of data mining and analysis; they historicize, probe, complexify,
humanize, identify contradictions and modes of resistance, and trace the ways our cultural worlds are changing in the wake of data analytics. As such, they address themes
central to both academic discussions of digital media and public concerns about emerging forms of data-driven sorting, prediction and decision-making.
The first group of essays provides historical and macro-assessments of the cultural
worlds we are making as a result of big data and data analytics. Each essay challenges
the celebratory presentism with which many of these developments are often met and
highlights the specific kinds of inversions and contradictions the new regime of algorithmic culture can engender. They remind us that technological forms, and the rhetorics
and analytic practices that accompany them, do not come from nowhere they have
histories, which shape and condition them, and inevitably bear the marks of the cultural,
social and political conditions surrounding their production and implementation. As
such, and most crucially, these essays remind us that the development of big data analytics conditions new forms of political domination and resistance.
In the first of these essays, 'Algorithmic culture', Ted Striphas examines the ways in which computational processes of sorting, classifying and hierarchizing of people, places, objects and ideas (e.g. on sites like Amazon and Netflix or the results generated by Google, Bing and others) have profoundly altered the way culture, as a category of experience, is now practised, experienced and understood. Following Fredric Jameson's famous dictum 'always historicize', and in the spirit of Raymond Williams' Culture and Society (1958), Striphas examines the conceptual conditions out of which algorithmic culture has developed by addressing a small group of terms whose bearing on the meaning of the word culture has become unusually strong in recent years. Williams identified the first one – information – in his later work, The Sociology of Culture (1981); the other two – crowd and algorithm – are Striphas' own. Arguing that the semantic dimensions of algorithmic culture are at least as important as the technological ones, Striphas takes us through the histories of these contemporary keywords, highlighting their definitional contradictions, expansions and contractions, and identifying the ways the circulation of these concepts supports the privatization of cultural processes: the consolidation of the work of culture into the hands of a few powerful companies and their ability to use a variety of legal instruments to hide the how and why of this work from the public eye. At stake, he argues, is the gradual abandonment of culture's publicness and, ultimately, the emergence of a strange new breed of elite culture purporting to be its opposite. In an era of information surfeit, the information organizers are the new gatekeepers and their algorithms the contemporary analogues of objectivity – as if the results they yield are not already shaped by the imperatives baked into the algorithms. In this regard, cultural concerns return us to questions regarding control over informational resources in the digital era.
Robert Gehl picks up the theme of instrumentalized sharing in the digital age but historicizes and individualizes it in his essay 'Sharing, knowledge management, and big data: A partial genealogy of the data scientist'. Following Foucault's genealogical method, Gehl unpacks the discourse of sharing as it emerged specifically in the corporate practices of knowledge management and the emergence of the knowledge worker in the 1990s. At this time, a firm's strategic advantage was seen to lie in its employees' tacit knowledge, and the goal of management strategy was to compel it to be shared. This was the beginning of the highly flexible, precarious knowledge worker of today, whose primary asset is communicative skill and whose primary role is to turn knowledge into information, all under the aegis of innovation, creativity and play. Tracing the connections between the rise of the knowledge economy in the 1960s, knowledge management strategies and the knowledge worker in the 1990s, and the Facebook 'like' button in the 2000s, Gehl then focuses on the worker whose job it is to make sense of all the information being produced: the data scientist. While the data scientist's role has been touted as 'the sexiest job in the 21st century', Gehl argues that the systems designed by data scientists will inevitably fold back on them, subjecting these workers to instrumental logics of their own design. Ultimately, data scientists will feel the problematic material effects of informationalization along with the rest of us: precarity, low pay and exploitative working conditions. As Gehl writes, what big data firms crave from their data scientists is precisely what they enjoy with their data: 'cheapness and ubiquity'.
Moving from the data scientist to the actual techniques of data mining and predictive analytics, Adrian Mackenzie similarly looks to problematize the evolution of our current digital moment in his essay 'The production of prediction: what does machine learning want?'. Mackenzie draws on the Foucauldian concept of problematization, insisting that in order to assess the ways in which material life is being transformed, we need to examine the conceptual components and techniques of machine learning and prediction that lie behind data analytics. Mackenzie draws our attention to the fact that these techniques are generic and indifferent to the specific data in question, insisting that we must recognize conceptual tools, such as decision trees, perceptrons, logistic and linear regression models, association rules and Naïve Bayes classifiers, as methods that have histories, cultural contexts and biases. Mackenzie traces the styles of reasoning and the narratives embedded in these different techniques of machine learning and data analysis, and argues that, in these forms of reasoning, we can see foundational assumptions that are now working to condition the contours of our lives. Mackenzie argues that these abstract techniques are performative, comprising forms of material action in the world. Like Gehl, he contends that these predictive techniques are recursive and will inevitably fold back on themselves, working to normalize the situations in which they themselves are entangled. But, while Gehl insists that the data scientist will eventually be displaced by the very automation he helped to design, Mackenzie concludes by highlighting the trans-individual co-operative potential of the techniques themselves, reminding us to be attentive to the specific political and economic contexts of these technological developments and their liberatory possibilities.
The next group of papers develops and particularizes the issues and contradictions that accompany the implementation of data analytics already identified – issues involving power, capitalism, labour, representation, materiality, surveillance and subjectivity. They do so by examining specific examples of the use and deployment of data mining and analysis across a range of social and cultural phenomena. These papers undertake the hard work of identifying and analysing new kinds of socio-cultural industry workers – the infomediaries, consumer data miners and reputation managers who have arisen to curate, shape and predict our tastes and desires in the 21st century.
In the first of these essays, 'Curation by code: infomediaries and the data-mining of taste', Jeremy Morris examines automated music recommendation systems, arguing that they occupy an increasingly central position in the circulation of media and cultural products. Far from being neutral purveyors of predictions, these recommendation systems exert a logistical power that shapes the ways audiences discover, use and experience informational and cultural content (Gillespie, 2014). The paper examines the specific case of The Echo Nest, a music intelligence service that provides a massive database, which services other recommendation systems, media measurement companies, labels and other arms of the music industries. Largely a business-to-business affair, the company is a paradigmatic infomediary, providing not just content but also establishing the databases, relational connections and audience intelligence that underpin the recommendations. Like ratings and measurement companies before them, Morris argues, an emerging constellation of infomediaries simultaneously measure and manufacture audiences. By looking at the shift from intermediation to infomediation, and through a critical interpretive reading of The Echo Nest database and some of its algorithms, the paper probes how cultural content itself is being readied, or pre-configured, for algorithmic analysis and examines what effects this has on its production. Picking up on the themes of inversion and recursivity highlighted by Gehl and Mackenzie and echoing Striphas' claims, Morris underscores the ways in which the culture industries and our own cultural tastes are being re-shaped as a result of what Kate Crawford (2013) has called 'data fundamentalism' (n.p.).
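As a toy illustration of such curation by code – our own minimal sketch, not The Echo Nest's actual methods – an item-similarity recommender needs nothing more than a matrix of play counts to start sorting listeners' futures:

import numpy as np

# Rows are listeners, columns are tracks; 1 = played. Invented data.
plays = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])

# Cosine similarity between tracks, computed from co-listening alone.
norms = np.linalg.norm(plays, axis=0)
sim = (plays.T @ plays) / np.outer(norms, norms)

def recommend(track: int, k: int = 2) -> list:
    """Tracks most co-listened-with `track`, excluding itself."""
    ranked = np.argsort(sim[track])[::-1]
    return [int(t) for t in ranked if t != track][:k]

print(recommend(0))  # what listeners of track 0 are shown next

The logistical power lies in the output: what gets surfaced next is decided by co-occurrence patterns, not by any interpretation of the music itself.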
While data analytics re-shape musical consumption and production, they are also busy transforming retail environments. In their paper, 'Making data mining a natural part of life: Physical retailing, customer surveillance, and the 21st century social imaginary', Joseph Turow, Lee McGuigan and Elena Rosa Maris examine current efforts to reorganize retail environments around the data capturing and processing affordances of digital media. Using Charles Taylor's concept of the social imaginary, Turow,
McGuigan and Maris contend that new institutional actors in the retail industry are using
predictive analytics to re-shape lifestyles and identities and are instantiating new forms
of social discrimination as a result. The authors focus specifically on loyalty programmes
in retail stores, tracing their history from green stamps to branded charge cards and from
frequent flyer programmes to the rise of Internet shopping, as they work to attract consumers and collect highly personalized data about them at the same time. Given that all
manner of mobile apps and games are venues for performing loyalty and accumulating
rewards, individual consumers now leave ever-growing trails of data breadcrumbs for
retailers. These analytics are deployed in turn on shop floors via mechanisms like
dynamic pricing and digital price displays, or via loyalty rewards apps like Shopkick,
which explicitly rewards consumers for checking in with stores or scanning items using
their smart phones. In this way, the authors argue, storefronts literally become factories
for the generation of consumer data, and function, simultaneously, to produce new subjectivities where who you actually are is determined by where you spend time, and
which things you buy. Not all consumers and their data are created equal, however; as
these new tracking technologies work to naturalize and reward the practices of surveillance, they develop new forms of social discrimination predicated entirely on consumer
behaviour. In the wake of these developments, data-driven discrimination increases and
we see a lessening of democratic relations in all areas of life.
Kate Crawford, Tero Karppi and Jessa Lingel take up similar concerns about the extraction of value from all aspects of human life, specifically sleep, in their essay 'Our metrics, ourselves: One hundred years of self-tracking from the weight scale to the wrist wearable device'. Attending to the commodification and knowledge-making that
occur between bodies and data, the authors focus on the emergence of contemporary
self-tracking apps and devices like the Fitbit Flex, Jawbone UP or the Nike FuelBand SE
and analyse them by way of a comparison with a historical precedent – the weight scale. Initially positioned as a public machine that would reward users with candy along with their weight when it emerged in the early 1900s, the weight scale gradually became a private home-based device, tied to normalizing body types but sold under the
guise of increasing self-knowledge and social power. Indeed, the message of self-knowledge through measurement can be found reiterated in much contemporary advertising for
new self-tracking apps and devices and in the rise of the current Quantified Self movement. These devices reinforce the view that individuals are not the most authoritative
source of data about themselves (Bossewitch and Sinnreich, 2013) but, rather, need the
help of a quantifying machine to know themselves better. Taking issue with Jonathan
Crary's (2013) claim that, in an era of 24/7 activity, sleep is one of the only areas of life
yet to be colonized by the market, Crawford et al. point out that extracting monetary
value from sleep is exactly what devices like Fitbit and Jawbone UP now seek to do.
While users extract value from the measures provided, companies collect data from individual users that can then be used or sold in a myriad of ways. In addition to the monetization of user data, the authors argue that there is an implicit form of normalization of
bodies and standards at work in the operations of these devices, which remain opaque to
users. These new apps and devices help to create more efficient, normalized gendered
bodies; function to naturalize forms of implicit participation in regimes of data surveillance, monitoring and extraction; generate data sets for all manner and means of capitalist industry; and, most troublingly, extract value from our desire for self-understanding, all while we sleep.
Music production and consumption, the retail shopping experience and even our sleep patterns are increasingly conditioned and determined by new forms of data extraction and analysis, and all of these tools, devices and systems incorporate some aspect of geolocation to enable their functioning. In 'Platform specificity and the politics of location data extraction', Rowan Wilken and Carlos Barreneche examine the growing commercial significance of location data and argue for a medium-specific analysis of the differences between dominant techniques of geolocation data extraction and their political-economic arrangements. To this end, the authors offer a comparison between the ways in which Foursquare and Google each extract and use geocoded user data. Foursquare has moved from a gamified form of user check-in to a geospecific kind of user retail recommendation and ranking platform. Indeed, Foursquare would fit neatly with the kinds of retail shopping apps and devices reviewed by Turow et al., as it puts its geolocation data to work to produce a retail consumer database. The ultimate effect, the authors argue, is the generation of geodemographic profiling, which can then be used for predictive purposes. Google, on the other hand, deploys its own users as 'sensors' to drive mobility flows. From Global Positioning System (GPS) signals in mobile phones, to WiFi access points, to check-ins on sites like Facebook and Twitter, Google collects user data, claiming to anonymize it. The truth is, however, that many Android cell phones transmit geolocation data whether the phone is running or not, and Google only needs a small number of anonymous spatio-temporal mobility traces to uniquely identify most of us. As with so many other forms of data analytics, Google's ultimate goal is to be able to predict user behaviour in order to convert data traffic into foot traffic for local retailers. Insofar as they do not simply collect information about humans and their environment, but reflexively interact with, and adjust to, them, modulating flows of populations of users, the authors argue that we must come to see these kinds of geolocation as environmental technologies. Recalling Crawford et al.'s claims about the production of the ideal worker subject via self-measuring devices and apps, Wilken and Barreneche argue that geolocation techniques facilitate technocratic forms of urban mobilities, aligning the rhythms of the city with the rhythms of the commodity. The authors caution us to be mindful that these forms of consumer tracking are also underpinned by efforts to securitize mobility and, thereby, contribute to the construction of distinct 'place ontologies' – that is, ways of categorizing the world that threaten to re-form our very notion of the public.
Mark Davis expands on the themes of surveillance, consumer tracking and neoliberal digital capitalism in his essay 'Ebooks in the global information economy'. Noting that little scholarly attention has generally been paid to the ebook, Davis points out that Amazon, the world's largest purveyor of ebooks, is also one of a group of large digital media corporations currently vying for dominance in the global information economy. As such, Davis argues, we must understand the ebook as a node in the wider discourse of 'neoliberal digital informationism'. In other words, ebooks are exemplary objects in the current alignment of neoliberal economic doctrine and celebratory forms of information capitalism. As Davis shows, reviewing the work of Babbage, Taylor, Mumford, Turing, Peter Drucker and Daniel Bell, discourses of information and market freedom have gone hand in hand for many decades and have culminated in the rise of contemporary neoliberal digital capitalism and Silicon Valley ideology. Davis then turns to the ebook, arguing that it works to commodify reading and readers, co-opts users' labour and serves as a strategic site for the corporate struggle over the digital commons. For example, Amazon's proprietary Whispersync technology works to collect information about Kindle ebook users, such as what they read, how fast they read and what passages they underline, and then adds this to its consumer database. Indeed, Amazon's business strategy depends on selling books at cost while generating valuable user data, and, in turn, Amazon exerts its distributing power by delisting publishers who refuse to participate in these practices. Ebook publishers also use reader data to design their next publishing projects, down to preferred word length or type of protagonist. In this way, the ebook as data-generating device serves to re-shape what kinds of books and information actually get published, producing a new information ecology driven by market information and surveillance, and signalling the retreat of public duty notions of book publishing. Focusing specifically on Amazon's Kindle, Davis positions the ebook as a strategic commodity, a crucial element in corporate competition that has little to do with book publishing, and everything to do with beating competitors Apple and Google in the race to dominate the global information economy. Under the aegis of 'creative disruption', global network powers, like Amazon, naturalize the practices of digital enclosure and digital capitalism, and place citizens in a democratic double-bind, where the potential for enhanced participation provided by digital media must be traded off against new forms of corporate surveillance. Davis concludes by making a plea to step up efforts to popularize a progressive critique of digital network media which, paradoxically, depends on currently out-of-fashion notions of a critical public culture.
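A hypothetical sketch of the kind of reading telemetry Davis describes – the record structure and field names below are our invention, not Amazon's actual Whispersync schema – shows how easily a private act of reading becomes a row in a consumer database:

from dataclasses import dataclass, field

@dataclass
class ReadingSession:
    """One reading session reduced to the behavioural measures named above:
    what was read, how fast, and which passages were underlined.
    Hypothetical structure for illustration only."""
    book_id: str
    pages_read: int
    minutes: float
    highlights: list = field(default_factory=list)

    def pages_per_minute(self) -> float:
        return self.pages_read / self.minutes

session = ReadingSession("B00EXAMPLE", pages_read=30, minutes=42.0,
                         highlights=["the digital commons"])
print(round(session.pages_per_minute(), 2), session.highlights)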
In his essay 'Open source intelligence, social media and law enforcement: visions, constraints and critiques', Dan Trottier continues a focus on questions of surveillance and big data analytics by examining the adoption of open-source intelligence (OSINT) and social media monitoring in European law enforcement and intelligence agencies. He argues that OSINT practices are techno-economic constructs, working to augment the visibility and exploitation of social life. Positioning the current use of OSINT in relation to CCTV monitoring of the 1980s and 1990s, Trottier goes on to examine the implementation of open-source social media intelligence tools, such as NiceTrack, in policing contexts in Europe. While police agencies tend to describe social media information as a pure, constantly flowing source of information and social media users as, alternatively, customers or criminals, the actual implementation of OSINT in policing contexts brings to the foreground a far more complex set of issues. Drawing from interviews with 19 police and intelligence officials and 14 privacy advocates from 13 European Union (EU) member states, Trottier highlights respondents' motivations to adopt social media monitoring technologies in their jurisdictions; these include the desire to identify terror threats, child exploitation and copyright infringement. Trottier goes on to highlight the ways in which the use of OSINT is severely limited and conditioned by already-existing legal frameworks, budgetary constraints, an unresponsive culture of policing and the circulation of deliberate misinformation online, and enumerates the liabilities and social costs attached to the use of OSINT. These include an increase in forms of categorical discrimination, the gradual expansion of monitoring and surveillance to more and more areas of life or 'function creep', the criminalization of online spaces and the demarcation of every individual online as a possible suspect, and an increased risk that the public will see OSINT as yet another opaque form
of police surveillance. From his analysis, Trottier notes that the use of OSINT collapses social
and institutional contexts, remediating personal content offered on privately owned platforms
into public evidence that can be used in a variety of troubling and unpredictable ways. These
new uses of social media data, in turn, can come to have a chilling effect on users' online speech, rendering us all more passive in the long run.
Following in the cultural studies tradition of identifying spaces for agency in the face of structures of power, the final three papers of the volume pose distinct challenges to accounts of big data that see it as ineluctably oriented towards power asymmetries and domination. These papers locate possibilities for resistance, engagement and enrichment via the use of new means of computation and big data analytics. In 'Plenty as a response to austerity: big data expertise, cultures and communities', Caroline Bassett focuses on the links between data analytics and governance. Echoing Mark Davis, she situates the explosion of big data within the trends of advanced neoliberal digital capitalism, specifically its recent obsession with austerity. But, drawing on research into the use of big data analytics in small communities in the United Kingdom, she argues that these analytical tools are not intrinsically virtuous or terminally evil. Indeed, Bassett contends that there can be good uses and forms of big data analytics. In an attempt to address the lag that occurs between the introduction of new technological tools and the development of expertise on the ground to make use of these tools, Bassett begins from first principles, working to redefine both the terms 'big data' and 'expertise'. Bassett picks up themes explored by Adrian Mackenzie as she draws on Bernard Stiegler's claims about the co-construction of humans and technological forms, and the pharmacological nature of technologies insofar as they can have both curative and toxic dimensions. Bassett redefines big data as a process, involving humans and machines, hermeneutic and algorithmic operations engaging specific locations and conditions, and producing forms of knowledge that tend to present themselves as entirely computational. She goes on to reassess definitions of expertise, arguing that, as big data challenges the very ways in which we define and think about expertise, it simultaneously is shaped by, and has embedded within it, various kinds of already-existing distributed human expertise. If we see big data as constitutive of and constituted by forms of human expertise, Bassett argues, then we can see it as open, malleable and amenable to alternative uses. Given this corrective to our understanding of big data analytics as inherently sociotechnical forms of operation, the question becomes: who has access to these tools and how are they being used? If, as Bassett hopes, we can realign our understanding of big data analytics away from celebratory views of it as a kind of post-interpretive computational knowing, and towards a view of it as co-constitutive of humans and their social conditions, then we might find ways to use big data circuits to enable particular forms of life to flourish.
Dan McQuillan takes Edward Snowden's revelations as a starting point to interrogate the continued salience of our established modes of critique in contemporary algorithmic culture. In 'Algorithmic states of exception', McQuillan takes up Bassett's claim that our critical focus should not be on big data per se, but on the nature of the material-political apparatus that connects data to decision-making and governance. Tracing changes in the conceptual apparatus that buttresses and supports certain kinds of data analytics, McQuillan specifically examines the shift from the use of relational database management systems to the use of NoSQL – a form of data storage that does not look for relations between data but stores all manner and means of data in a schema-less system. McQuillan argues that this kind of data storage facilitates forms of data mining that stress prediction and correlation over relationships and causation. These predictions are then deployed as forms of algorithmic regulation, prescribing corrective measures and new modes of subjectivity. McQuillan, drawing on Agamben, asserts that these trends towards prediction in all fields of life are colliding with our assumptions about political and judicial fairness and are producing a perpetual 'state of exception'; they increasingly operate with coercive force on populations, but without any substantive controls or legal consequences; these new operations have the potential to create social consequences that are unaddressed in law. McQuillan asks how we might create counter-movements to these new forms of exclusionary power and suggests that a potentially fruitful way forward involves making correlations between historical and contemporary forms of resistance and assertions of the commons. For example, McQuillan traces the ways 13th-century Christian antinomianism resonates in the work of hacker group Anonymous; both, he argues, sought to disrupt apparatuses of control without themselves engaging in the cycle of law-making and law-preserving. He also notes that 18th-century food riots in Britain share much in common with contemporary DIY cryptoparties and festivals; both work to intervene in situations of excess and unfairness, using whatever tools they have at hand, in a collectivist mode and without legal cover. These historical examples, McQuillan argues, can give us clues as to how to re-shape the dominant privatized big data apparatus; the way forward lies not in discarding our technological affordances, but in leaning against their current forms and logics and reconfiguring them to enable and advance social struggles.
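The contrast McQuillan draws can be sketched in a few lines of Python (our illustration, not his): a relational table must declare its relations up front, whereas a schema-less, NoSQL-style store accepts records of any shape, ready to be mined for correlations later:

import sqlite3

# Relational storage: the schema and its relations are declared in advance.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
db.execute("INSERT INTO users (name, city) VALUES (?, ?)", ("Ada", "Leeds"))

# Schema-less, NoSQL-style storage (a plain list of dicts stands in for a
# document store): records of any shape are accepted, no relations required.
documents = [
    {"name": "Ada", "city": "Leeds"},
    {"tweet": "hello", "geo": (53.8, -1.5), "device": "phone"},
    {"purchase": "kettle", "loyalty_id": 991},
]

# Prediction-oriented mining then looks for correlations across whatever
# fields happen to co-occur, rather than pre-modelled relationships.
flagged = [d for d in documents if "geo" in d or "loyalty_id" in d]
print(len(flagged), "records available for correlation without any schema")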
Finally, Fenwick McKelvey, Matthew Tiessen and Luke Simcoe argue, like McQuillan, that concerns about publicness and privacy no longer obtain in contemporary algorithmic culture. Rather, they insist that our digital lives are now working to produce a vast simulated reality. Inspired by the 1964 science fiction classic Simulacron-3, which envisages an entire city run by computers and mined by scientists as a giant exercise in market research, 'A consensual hallucination no more? The Internet as simulation machine' explores the operations and implications of today's Internet, run as it is on the fuel of the individual impulse to share and communicate. In a kind of matrix-like inversion, the authors claim that humans' attempts to share, connect and make meaning have been reduced to 'standing reserves' of information, owned and used overwhelmingly by those who wield global power and control: banks, corporations, governments. The Internet as simulation machine intensifies existing power asymmetries and social and financial exclusion in addition to producing new ones; it creates new digital divides between those who do not choose to take part in life online and those who do, those who have the means to curate their online identity and those who do not, and those who do and do not have access to the 'firehose' of data flowing in through sites such as Twitter and Facebook. McKelvey et al. explore possibilities for resistance under circumstances where traditional logical forms of antagonistic communication are also inevitably reduced to feeding bots making stock market decisions. Focusing on the work of 4chan and the Deterritorial Support Group, they ask whether it might not make more sense to make no sense at all in this day and age. Is non-communication or idiocy a new form of resistance, or will communicative tricksters be used to train the next generation of simulation machines?
Our goal with this volume is to ask cultural studies scholars to take seriously the proliferating processes of data mining and analytics. As we have argued, we recognize that these developments can prove challenging as a focus of study. This is so not just because what they are and how they operate are so opaque and relatively new, or because they are so different from other kinds of cultural objects, or even because they are predicated on denying that they are cultural objects in the first place, but because of the real material barriers that exist for us to access their operations, logics and forms of implementation. These barriers alone are reason enough to bring questions of power, ownership, signification and subjectivity back to the forefront of their critique. To be sure, the essays we have assembled only begin to scratch the surface of these developments in big data and data analytics. Much more remains to be done. We need to develop new methodologies and new intellectual and critical competencies to tackle the embedded assumptions buried in the code and their political and cultural implications. Our ability to accomplish these things will require more than isolated scholarly effort; collaborative, politically engaged activist sensibilities will no doubt be required in order to push past the privatized digital enclosures and open up access to the algorithms, analytics, distributive regimes and infrastructural monopolies that are increasingly coming to condition the contours and substance of our daily lives.

References
Anderson C (2008) The end of theory: The data deluge makes the scientific method obsolete.
Wired Magazine, 23 June. Available at: http://www.wired.com/science/discoveries/magazine/16-07/pb_theory (accessed 30 August 2013).
Barad K (2003) Posthumanist performativity: Toward an understanding of how matter comes to
matter. Signs 28(3): 801–831.
Bennett J (2009) Vibrant Matter: A Political Ecology of Things. Durham, NC: Duke University
Press.
Bossewitch J and Sinnreich A (2013) The end of forgetting: Strategic agency beyond the panopticon. New Media & Society 15(2): 224–242.
boyd d and Crawford K (2012) Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society 15(5): 662–679.
Crary J (2013) 24/7: Late Capitalism and the Ends of Sleep. New York: Verso.
Crawford K (2013) The hidden biases in big data. Harvard Business Review, 1 April. Available at:
http://blogs.hbr.org/2013/04/the-hidden-biases-in-big-data/
Galloway AR (2012) A response to Graham Harman's 'Marginalia on Radical Thinking'. An und für sich, 3 June. Available at: https://itself.wordpress.com/2012/06/03/a-response-to-graham-harmans-marginalia-on-radical-thinking/
Galloway AR (2013) The poverty of philosophy: Realism and post-Fordism. Critical Inquiry 39(2): 347–366.
Gillespie T (2014) The relevance of algorithms. In: Gillespie T, Boczkowski P and Foot K (eds) Media Technologies: Essays on Communication, Materiality, and Society. Cambridge, MA: MIT Press, pp. 167–193.
Gray A, Campbell J, Erickson M, et al. (2007) CCCS Selected Working Papers, vol. 1. New York:
Routledge.
Griffin (2014) Incredible machines: trying to negotiate in a zettabyte world. The Vancouver Sun, 3 March. Available at: http://blogs.vancouversun.com/2014/03/03/incredible-machines-trying-to-negotiate-in-a-zettabyte-world/ (accessed 25 March 2015).
Grosz E (2004) The Nick of Time: Politics, Evolution, and the Untimely. Durham, NC: Duke
University Press.
Hall S (1992) Cultural studies and its theoretical legacies. In: Grossberg L, Nelson C and Treichler
P (eds) Cultural Studies. New York and London: Routledge.

Downloaded from ecs.sagepub.com at CMU Libraries - library.cmich.edu on December 8, 2015

394

European Journal of Cultural Studies 18(4-5)

Kittler F (2010) Optical Media. Cambridge: Polity Press.


Lohr S (2012) The age of big data. The New York Times, 11 February. Available at: http://www.
nytimes.com/2012/02/12/sunday-review/big-datas-impact-in-the-world.html?_r=0 (accessed
10 August 2012).
Packer J (2013) Epistemology not ideology OR why we need new Germans. Communication and Critical/Cultural Studies 10(2-3): 295–300.
Peters JD (2013) Calendar, clock, tower. In: Stolow J (ed.) Deus in Machina: Religion and Technology in Historical Perspective. New York: Fordham University Press, pp. 25–42.
Turner G (1992) British Cultural Studies. London and New York: Routledge.
Wernick A (1990) Promotional Culture: Advertising, Ideology and Symbolic Expression. London, Thousand Oaks, CA and New Delhi: Sage.
Williamson B (2014) The death of the theorist and the emergence of data and algorithms in digital social research. Impact of Social Sciences, LSE Blog, 10 February. Available at: http://blogs.lse.ac.uk/impactofsocialsciences/2014/02/10/the-death-of-the-theorist-in-digital-social-research/
Williams R (1958) Culture and Society, 1780-1950. New York: Columbia University Press.
Williams R (1981) The Sociology of Culture. Chicago: University of Chicago Press.

Biographical notes
Mark Andrejevic is Associate Professor of Media Studies at Pomona College in the US. His latest
book, Infoglut: How Too Much Information Is Changing the Way We Think and Know (2013),
explores the social, cultural, and theoretical implications of data mining and predictive analytics.
His work has appeared in edited collections and in academic journals including Television and
New Media; New Media and Society; Critical Studies in Media Communication; Theory, Culture
& Society; Surveillance & Society; The International Journal of Communication; Cultural Studies;
The Communication Review, and the Canadian Journal of Communication. His current work
explores the logic of automated surveillance, sensing, and response associated with drones.
Alison Hearn is an Associate Professor at the University of Western Ontario in Canada. Her
research focuses on the intersections of promotional culture, new media, self-presentation, and
new forms of labour and economic value. She also writes on the university as a cultural and political site. She has published widely in such journals as Continuum, Journal of Consumer Culture,
Journal of Communication Inquiry, and Topia: Canadian Journal of Cultural Studies, and in
edited volumes including The Media and Social Theory, Blowing Up the Brand, and The Routledge
Companion to Advertising and Promotional Culture. She is co-author, with Liora Salter, of
Outside the Lines: Issues in Interdisciplinary Research (McGill-Queen's University Press, 1997).
Helen Kennedy is Professor of Digital Society at the University of Sheffield. She has been
researching new and digital media since they came into existence and has published widely in this
field. She is author of Net Work: Ethics and Values in Web Design (Palgrave Macmillan, 2011). Her work has been published in various journals including New Media and Society; Information, Communication and Society; Media, Culture and Society; Convergence; Ephemera; The
Information Society. She is currently researching ordinary forms of social media data mining, and
how ordinary people interact with data visualisations.
