Algorithmic Identity: Debrief
Let’s debrief that previous post.
First, here’s why this matters. These companies make promises of convenience (no need to think about what movie to watch next, Netflix will figure that out for you) and control (you’re taking the actions, so you must be in control of who knows what about you). But at what cost, and in what reality? In practice, you are being defined by algorithms to which you have zero exposure. And even if you did have access, you’d come up against what Fuller and Goffey call grey media: lines and lines of code that are indecipherable, especially once compiled. In their words, grey media includes “expert systems, workflow, databases, human-computer interaction and the sub-media world of leaks, networks and permissions structures that establish what eventually appears as conventional media.”
Cheney-Lippold equates the kind of control that the creators of algorithms wield with Foucault’s biopower: defining our bodies and ourselves in order to exert power (Cheney-Lippold calls it “soft biopower,” for obvious reasons). These algorithms, and the complex databases they populate, are the mechanisms that control not just what we see, when, and how, but, consequently, what actions we want to take, what products we want to buy, and what thoughts we want to think.
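To make that claim concrete, here is a deliberately toy sketch of how an algorithmic identity gets assembled. Every signal, category name, and weight below is invented for illustration; real systems are vastly more complex (and greyer), but the shape is the same: behavior goes in, a measurable type comes out, and that type steers what you’re shown next.

```python
# Hypothetical sketch: inferring a "measurable type" from behavioral signals.
# All events, categories, and weights are made up for illustration.

# Observed behavior -- the only "input" the user knowingly provides.
user_events = [
    "watched:rom-com",
    "watched:rom-com",
    "clicked:makeup-ad",
    "watched:documentary",
]

# Hidden scoring rules: each event nudges the user toward a category
# the user never sees and never chose.
CATEGORY_WEIGHTS = {
    "watched:rom-com":     {"likely-female": 2, "likely-male": 0},
    "clicked:makeup-ad":   {"likely-female": 3, "likely-male": 0},
    "watched:documentary": {"likely-female": 0, "likely-male": 1},
}

def infer_identity(events):
    """Sum the weights for each category and return the highest scorer."""
    scores = {}
    for event in events:
        for category, weight in CATEGORY_WEIGHTS.get(event, {}).items():
            scores[category] = scores.get(category, 0) + weight
    # The user is *defined* as whichever category scores highest.
    return max(scores, key=scores.get)

identity = infer_identity(user_events)
print(identity)  # this inferred label then drives future recommendations
```

Note what’s missing from this loop: the user. The categories are statistical artifacts, not self-descriptions, yet they determine what surfaces on the screen, which in turn generates the next round of “behavior” to score.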
Don’t believe me? In 2014, Facebook revealed that it had been altering what users saw in their “news feeds” to test whether it could shift the moods those users expressed in their subsequent posts. Now picture this happening during an election year. Too late.