Data Models and Complexes of Subjectivation
Brian Holmes just posted an interesting text on “Guattari’s Schizoanalytic Cartographies”. Since I haven’t read much Guattari, I was surprised to learn that he had quite an elaborate concept of modelling in digital environments. Holmes sums up Guattari’s 1981 lecture “Le Capitalisme Mondial Intégré et la révolution moléculaire” [PDF] as follows:
The key concepts of this far-seeing text are modeling and machinic-semiotic integration. The word “modeling” designates the simulation of dynamic systems, typically carried out by the application of strategically formulated computer algorithms to data-inputs gathered by scientists (which can be social scientists, psychologists, market researchers and so on). This kind of modeling has become essential to the planning of what Guattari called “collective facilities,” which increasingly take the form of privately owned consumption environments [...] Since WWII, the primary vector of this uniquely neoliberal form of control has been cybernetic modeling and the construction of interactive environments, or sites of “machinic-semiotic integration,” where the very freedom of the users continually generates the data allowing our progressively fine-tuned entrapment, within custom-built settings that morph and mutate to match the evolution of our already programmed dreams.
Drawing on further texts by Guattari, Holmes also discusses means of moving beyond this kind of territorialised subjectivity – “a cartography of escape routes leading beyond the black holes of neoliberal control, toward the possibility of collective speech.” The concept of the “Fourfold” he suggests seems rather intricate, but when filled with practical experience it might actually help to pin down the somewhat vague notion of “lines of flight”.
I was glad to see that Holmes’ text connects two issues I have struggled to bring together for a while: the theme of an “ethics of transgression” I touched upon in an article from 2005, and my current interest in the intersection between modelling and governmentality. This all fits in very well with a discussion I suggested for the upcoming workshop “Modes of Governance in Digitally Networked Environments”, organised by Malte Ziewitz and Christian Pentzold at the Oxford Internet Institute. Here’s my abstract:
“Predictive Consumer Modelling as a Mode of Governance
In July 2008, Google filed a patent application for “Network node ad targeting“, a system that ranks members of an online social network according to their influence within a community. It allows advertisers to place their commercial messages on the profile pages of the most influential members, thereby increasing the effectiveness of the ad campaign. In exchange, the influential members receive a financial incentive and are thus encouraged to strengthen their position within the community.
The patent sheds some light on how the data collected by popular online services – especially search engines – is used to devise increasingly sophisticated advertising strategies. A key aspect of these techniques is the construction of models. To process the wealth of data entering the systems, there has to be a formalised means of interpreting it: specifying which indicators are parsed and what kind of meaning is ascribed to specific data combinations and frequency distributions. Consumer models can be based on social network data, click streams, search queries, location, etc. Their purpose is to enable the prediction of behaviour by sorting individuals into segments, thereby reducing uncertainty and increasing returns on ad investment.
Considering the current pervasiveness of these advertising techniques, two lines of enquiry should be explored further:
1. How is data turned into information, and on whose terms? While some services have begun to provide users with technical means to access and edit their own data, administrators mostly retain control over how larger quantities of data are processed and grouped together. Such asymmetries can be traced at different levels (visibility, accessibility, editability) and on different technical layers of the system (database field definition, database entry, model, algorithm). Overall, it can be argued that the strong commercial bias in model construction results in an asymmetric distribution of access and transparency, which in turn inhibits the emergence of alternative epistemological trajectories.
2. What effect do the meaning ascriptions implemented in consumer models have on individual and collective subjectivation processes? If influential community members are primarily addressed as efficient disseminators of commercial messages, how are community, trust and friendship affected? If an information need expressed in a search query is primarily interpreted as a consumption need, how are information gathering and learning affected? Foucault’s concept of governmentality can help us understand the link between (external) control and (internal) conduct involved in the increasingly commercial framing of subjectivation processes.”
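To make the patent’s mechanism a little more concrete: the core idea is to score the members of a social graph by influence and target the top-ranked profiles for ad placement. The sketch below is purely hypothetical – the patent does not disclose its actual scoring method – and uses follower count (in-degree) as a stand-in for influence; all names are invented.

```python
# Hypothetical sketch of influence-based ad targeting: score each member
# of a social graph and select the highest-ranked profiles as placement
# targets. Influence here is simply follower count (in-degree); a real
# system would combine far richer signals.

def rank_by_influence(follows):
    """follows: list of (follower, followed) pairs; returns members
    sorted from most to least followed."""
    scores = {}
    for follower, followed in follows:
        scores[followed] = scores.get(followed, 0) + 1
        scores.setdefault(follower, 0)  # members with no followers score 0
    return sorted(scores, key=lambda member: scores[member], reverse=True)

edges = [
    ("ann", "bea"), ("carl", "bea"), ("dana", "bea"),
    ("bea", "carl"), ("ann", "carl"),
]
ranking = rank_by_influence(edges)   # bea has 3 followers, carl 2
ad_targets = ranking[:2]             # place ads on the top two profiles
```

The financial incentive described in the abstract then closes the loop: members rewarded for their rank have a reason to raise their in-degree further, which is exactly the entanglement of “freedom” and data generation the post discusses.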
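The abstract’s point about model construction – that a formalised interpretation scheme decides which indicators count and what meaning data combinations receive – can also be illustrated with a toy example. Everything below is invented for illustration: the segment names, the keyword lists, and the idea that query matching alone would drive segmentation.

```python
# A minimal, hypothetical illustration of model construction: a fixed
# interpretation scheme maps raw behavioural data (here, search queries)
# onto consumer segments. The scheme itself embodies the "meaning
# ascriptions" discussed above - whoever defines the keywords defines
# what a query is taken to mean.

SEGMENT_KEYWORDS = {
    "travel": {"flight", "hotel", "beach"},
    "tech": {"laptop", "phone", "camera"},
}

def segment_user(queries):
    """Count keyword hits per segment and assign the user to the
    segment with the most hits (or 'unclassified' if none match)."""
    hits = {segment: 0 for segment in SEGMENT_KEYWORDS}
    for query in queries:
        for segment, words in SEGMENT_KEYWORDS.items():
            if any(word in query for word in words):
                hits[segment] += 1
    best = max(hits, key=lambda s: hits[s])
    return best if hits[best] > 0 else "unclassified"

segment = segment_user([
    "cheap flight to rome",
    "hotel near colosseum",
    "camera reviews",
])
# two "travel" hits outweigh one "tech" hit
```

Even this crude sketch shows where the asymmetry in question 1 lives: users see only the output (the ads), while the segment definitions, thresholds, and the decision to read an information need as a consumption need all sit on the operator’s side.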