Data Models and Complexes of Subjectivation

Brian Holmes just posted an interesting text on "Guattari's Schizoanalytic Cartographies". Since I haven't read much Guattari, I was surprised to learn that he had quite an elaborate concept of modelling in digital environments. Holmes sums up Guattari's 1981 lecture "Le Capitalisme Mondial Intégré et la révolution moléculaire" [PDF] as follows:

The key concepts of this far-seeing text are modeling and machinic-semiotic integration. The word “modeling” designates the simulation of dynamic systems, typically carried out by the application of strategically formulated computer algorithms to data-inputs gathered by scientists (which can be social scientists, psychologists, market researchers and so on). This kind of modeling has become essential to the planning of what Guattari called “collective facilities,” which increasingly take the form of privately owned consumption environments […] Since WWII, the primary vector of this uniquely neoliberal form of control has been cybernetic modeling and the construction of interactive environments, or sites of “machinic-semiotic integration,” where the very freedom of the users continually generates the data allowing our progressively fine-tuned entrapment, within custom-built settings that morph and mutate to match the evolution of our already programmed dreams.

Drawing on further texts by Guattari, Holmes also discusses means of moving beyond this kind of territorialised subjectivity – "a cartography of escape routes leading beyond the black holes of neoliberal control, toward the possibility of collective speech." The concept of the "Fourfold" that he suggests seems rather intricate, but when filled with practical experience it might actually help to pin down the somewhat vague notion of "lines of flight".

I was glad to see that Holmes' text connects two issues I have struggled to bring together for a while: the theme of an "ethics of transgression" I touched upon in an article from 2005, and my current interest in the intersection between modelling and governmentality. This all fits in very well with a discussion I suggested for the upcoming workshop "Modes of Governance in Digitally Networked Environments", organised by Malte Ziewitz and Christian Pentzold at the Oxford Internet Institute. Here's my abstract:

“Predictive Consumer Modelling as a Mode of Governance

In July 2008, Google filed a patent application for "Network node ad targeting", a system that ranks members of an online social network according to their influence within a community. It allows advertisers to place their commercial messages on the profile pages of the most influential members, thereby increasing the effectiveness of the ad campaign. In exchange, the influential members receive a financial incentive and are thus encouraged to strengthen their position within the community.

The patent sheds some light on how the data collected by popular online services – especially search engines – is used to devise increasingly sophisticated advertising strategies. A key aspect of these techniques is the construction of models. To process the wealth of data entering these systems, there has to be a formalised means of interpreting it: specifying which indicators should be parsed and what kind of meaning is ascribed to specific data combinations and frequency distributions. Consumer models can be based on social network data, click streams, search queries, location, etc. Their purpose is to enable the prediction of behaviour by sorting individuals into segments, thereby reducing uncertainty and increasing returns on ad investment.

Considering the current pervasiveness of these advertising techniques, two lines of enquiry should be explored further:

1. How is data turned into information and on whose terms? While some services have begun to provide users with technical means to access and edit their own data, administrators mostly retain control over how larger quantities of data are processed and grouped together. Such asymmetries can be traced on different levels (visibility, accessibility, editability) on different technical layers of the system (database field definition, database entry, model, algorithm). Overall, it can be argued that the strong commercial bias in model construction results in an asymmetric distribution of access and transparency, which in turn inhibits the emergence of alternative epistemological trajectories.

2. What effect do the meaning ascriptions implemented in consumer models have on individual and collective subjectivation processes? If influential community members are primarily addressed as efficient disseminators of commercial messages, how are community, trust and friendship affected? If an information need expressed in a search query is primarily interpreted as a consumption need, how are information gathering and learning affected? Foucault's concept of governmentality can help to understand the link between (external) control and (internal) conduct involved in the increasingly commercial framing of subjectivation processes."
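The patent itself does not disclose how influence is computed, so as a minimal sketch of the general mechanism the abstract describes – ranking members by their position in the network – one could use simple degree centrality. All names and the scoring rule below are hypothetical stand-ins for illustration, not Google's actual algorithm:

```python
from collections import defaultdict

def rank_by_influence(edges):
    """Rank members of a social network by a crude influence proxy:
    the number of connections each member has (degree centrality).
    `edges` is a list of (member_a, member_b) friendship pairs."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    # Most-connected members first; ties broken alphabetically.
    return sorted(degree, key=lambda member: (-degree[member], member))

friendships = [("ann", "bob"), ("ann", "cid"), ("ann", "dee"), ("bob", "cid")]
print(rank_by_influence(friendships))  # ann, with three ties, ranks first
```

An advertiser in the scenario above would then target the profile pages at the top of this ranking; real systems would of course weight ties by interaction frequency and use far richer signals than raw degree.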


About Theo Röhle

Theo Röhle is a doctoral student in Media Culture at the University of Hamburg. His dissertation project, with the working title "Dispositiv Google", examines to what extent existing media-studies concepts of power retain their validity in the case of search engines.

2 thoughts on "Data Models and Complexes of Subjectivation"

  1. Hello Theo –

    I am pleased that you find my Guattari text useful: the idea was to resituate his and Deleuze’s work within the rather vast social-science context that is only indicated rather elusively in their writing. So your response is exactly the kind I was looking for! In turn I find quite fascinating and pertinent your abstract on “Predictive Consumer Modeling as a Mode of Governance.” The question of asymmetry is fundamental. We are constantly told in the neoliberal societies that we are all free, and so are massive corporations, to which commercial law gives the status of legal persons (despite the fact that they are immortal). But these corporate-persons now increasingly have (or can purchase) extraordinary capacities of data-gathering, analysis and modeling that allow them to map out the ways that flesh-and-blood individuals act, and even trace the contours of our desire. Their advantage over the rest of us is immense: with their information about our intimate existence they are able to shape the architectures of interaction that we inhabit on a daily basis. Not only cultural critique of the kind I do, but serious and disciplined social-scientific research is needed in these areas now, and it should be brought to the center of academic and governmental institutions. That’s why I am curious to read your forthcoming paper.

    You might also be interested in my text “Future Map,” which attempts both to establish the archaeology of predatory data-gathering systems (again, with extensive reference to postwar cybernetics), and to work out a Foucaultian interpretation of the way the use of this data structures contemporary capitalist society, drawing on the later Foucault who was no longer so concerned with discipline but rather with internalized governmentality. That text is here:

    all the best, Brian

  2. Hi Brian,

    thanks for your comments, I'll definitely have a closer look at the "Future Map" text. It seems to fit this discussion perfectly. The OII workshop probably won't result in a paper, since it's mainly discussion-based. But check out the book from the Deep Search conference coming up in June at Transaction Publishers. My chapter contains similar points on governmentality, and it's going to be a very interesting collection altogether.

