Figuring control in the algorithmic era
Control. What does this term mean in the 21st century of networked sociality and digital information tangles?
I don’t mean control in the sense that systems theory or cybernetics considers it, at least not at first. Rather, I want to explore this question as a real, embodied concern, primarily in everyday use of social media, although it can be asked at many levels beyond this everyday use.
At the individual level, we might ask: How much control do we have over our information and its flow through global social networks? What should we do to control our presentation of self in various social network sites and search engine results?
At the organizational, corporate, or developer level, we might ask similar questions using different words: How can we optimize our visibility in search engine results? How can we make sure we’re placing advertisements in the right location or using the best algorithms?
At the level of legislation, we might ask: How can we control information flow and use so that an individual’s data privacy is preserved? How can we build systems to protect copyright holders, so that music sharing can be detected and therefore controlled or halted?
These questions of control all have something in common: they require us to define the term. To consider the extent to which we possess control, wield it, or understand its role in our everyday lives, we make certain assumptions about how it works in particular contexts. For example, I might believe that actions I take (pressing ‘send’ on a text message or updating my Facebook status) will yield particular results (my friend receives my text message or can read my Facebook status update). I might also believe that the outcomes of specific acts can or should be known (I search for nearby restaurants on my smartphone map, and the map will somehow display all the restaurants, unless I input more specific parameters).
When we think about this process for even a short time, we realize that actions are not necessarily connected to specific and known consequences. There are mediating factors, more than could possibly be counted. Control is in some ways a tautology, since it requires itself in order to work properly. Still, we make it work on an everyday level. We use the term, perhaps as a metonym, to stand in for a host of features, events, and cause/effect relations. In short, control is a strong concept-in-use with high ambiguity.
One way to consider this concept is to recognize that any attempt to conceptualize it goes well beyond basic definition into the realm of operationalization: defining terms in relation to how they function in action. How we understand and feel about ‘control’ might be an ongoing decision in specific contexts, a negotiation over time, or, in situations that seem unavoidable or ‘just the way things are,’ a foregone conclusion.
One way to explore the complexity of the everyday experience of control is to layer and juxtapose voices and stories. In a paper I’m currently working on, I’m developing cases that take the shape of dialogues between various agents in particular situations. These are intended as figurations that can help us think through various working patterns of control, including beliefs about control, its affective elements, enactments of control through specific code operations such as algorithms, sensemaking around perceived or actual loss of control, and the consequences of maintaining an ambiguous stance toward the notion and operation of control within techno-cultural contexts. I believe this is a useful analytical move toward thinking about what is and what could be otherwise.
The question “What constitutes control?” or “Where is the locus of control?” is my concern but, inspired by actor-network theory and situated-knowledge perspectives, it does not comprise the focus of my analytical gaze. Rather, I follow the figurations, refigurations, and configurations that may be interpreted as control, or that function in ways that centralize control as a feature of becoming, whether at the surface or within the infrastructures of discourse, interaction spaces, or other cultural formations.
Stated more practically, I have several cases, each beginning with an instance where the idea or actuality of control can be noticed. I pick up actual or potential threads in that instance, treating them as parts of an ongoing conversation among various human and nonhuman actants. This method is largely inspired by my (2013a) argument that through a playful process of sampling and arranging, we may find new analytical pathways or move in analytical directions that might not otherwise be apparent. Among other activities, the analysis consists of juxtaposing certain enactments of what might be considered ‘control’. Through this, I hope to lay out some of the assumptions at work in this interplay of human and nonhuman agents.
Although this analysis is still in process, I already notice that control seems to have a persistent characteristic of (or perhaps is in a constant state of) paradox. This strongly resonates with what I found in 1995-7, when I was studying self-described ‘heavy users’ of the internet.
Below is a brief snippet from a case I’m working on. Here, I explore the paradox of control within a particular moment on Facebook: experiencing the news feed. Facebook controls the algorithms that, in turn, largely control what we see in our news feeds. While some experience this as seamless or unproblematic, others, such as MR below, express ongoing frustration that the flow of information is too much to handle. MR’s interview comments are juxtaposed with Facebook’s public announcements about improvements to its algorithms. From this juxtaposition, we see opposing desires, assumptions about control, and a possible disconnect between what Facebook thinks MR wants and what MR really wants. More agencies can and should be woven into this piece, but perhaps this snippet gives a sense of the direction of the representational style:
Self /vs/ Facebook
“I can’t control anything!” /vs/ “Our goal is to give you control!”
The goal of News Feed is to deliver the right content to the right people at the right time so they don’t miss the stories that are important to them. (Facebook News, 12/2013)
When horrible news is everywhere in your town plus everywhere in your feed, it’s completely overwhelming. I had to shut it all down. (MR, research participant, 09/2013)
Starting soon, we’ll be doing a better job of distinguishing between a high quality article on a website versus a meme photo. (Facebook News, 12/2013)
That’s what they want you to do. Be overwhelmed. (MR, research participant, 09/2013)
As a result, people may start seeing a few more stories returning to their feed with new comments highlighted. Our testing has shown that doing this in moderation for just a small number of stories can lead to more conversations between people and their friends on all types of content. (Facebook News, 12/2013)
So I had to learn to manage it. Which means shutting it down at certain times. Otherwise, I can’t do anything or get anything done. It occupies my entire day. I’m constantly confronted with issues, over and over. It’s everywhere: rape culture, the fucked up tea party antics, toxic dumping, school shootings … the list could go on. (MR, research participant, 09/2013)
So how does News Feed know which of those stories to show? By letting people decide who and what to connect with, and by listening to feedback. (Facebook News, 12/2013)
I can only react. I can’t plan. I don’t have any agency. (MR, research participant, 09/2013)
Algorithms function in powerful ways to mediate experience. This has been discussed and critiqued in many ways (e.g., Introna & Nissenbaum, 2000; Mackenzie, 2005; Cheney-Lippold, 2011; Berry, 2012; Bucher, 2012; Gillespie, 2014). In this project, I explore how we can begin to talk about these technological/human relationships in ways that complicate, and yet illustrate more clearly, a key element of these relationships: control. I consider those intersections where control seems to be simplified into a causal chain of actions and outcomes. By focusing on certain desires, illusions, and invisibilities, I seek to ‘stay with the trouble,’ as Haraway would suggest (1991), and consider possible alternatives for how we might conceptualize relations between humans and technologies for ethically sensible digital futures.
Berry, D. M. (2012). The social epistemologies of software. Social Epistemology: A Journal of Knowledge, Culture and Policy, 26(3–4), 379–398. DOI: 10.1080/02691728.2012.727191
Bucher, T. (2012). Want to be on top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 0(0), 1–17. DOI: 10.1177/1461444812440159
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181. DOI: 10.1177/0263276411424420
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies. Cambridge, MA: MIT Press.
Goffman, E. (1959). The presentation of self in everyday life. New York, NY: Anchor Books.
Haraway, D. (1991). Simians, cyborgs and women: The reinvention of nature (pp. 149–181). New York, NY: Routledge.
Introna, L., & Nissenbaum, H. (2000). Shaping the Web: Why the politics of search engines matters. The Information Society, 16, 169–185.
Kacholia, V., & Ji, M. (2013). News Feed FYI: Helping you find more news to talk about. Facebook Newsroom/Blog, December 2013. Available from: http://newsroom.fb.com/News/768/News-Feed-FYI-Helping-You-Find-More-News-to-Talk-Abou
Mackenzie, A. (2005). The performativity of code: Software and cultures of circulation. Theory, Culture & Society, 22, 71–92. DOI: 10.1177/0263276405048436
Markham, A. (2013a). Remix culture, remix methods: Rethinking qualitative inquiry for social media contexts. In N. Denzin & M. Giardina (Eds.), Global dimensions of qualitative inquiry (pp. 63–81). Walnut Creek, CA: Left Coast Press.