On August 6, Donald Trump issued an executive order that would ban TikTok in the U.S. within 45 days if its Chinese parent did not sell it to an American company. The stated reason is to combat any Chinese spying that might be facilitated by the app’s data collection practices and, more ludicrously, to address the app’s censorship of pro-minority protests and prevent the spread of conspiracy theories and disinformation about Covid-19 — things Trump is manifestly unconcerned about domestically. China certainly could be gathering TikTok’s data, as this piece by tech analyst Ben Thompson details. “That, though, is not the primary risk,” he writes. “What should truly concern Americans is the algorithm.”
By “the algorithm,” Thompson means TikTok’s method of filling users’ feeds without any active intervention on the users’ part. You just watch content on the site, and based on your responses to it the app purportedly infers what you want and begins to show you more of it. It also draws on other user data it tracks, outlined vaguely in this blog post that offers the company’s explanation of how its algorithms work. It concludes on this Kondo-esque note:
Ultimately, your For You feed is powered by your feedback: the system is designed to continuously improve, correct, and learn from your own engagement with the platform to produce personalized recommendations that we hope inspire creativity and bring joy with every refresh of your For You feed.
TikTok declutters your media consumption so that every video sparks joy. But could it also spark mind control? In this post on the Financial Times site, Izabella Kaminska talks to Paul Dabrowa, “an artificial intelligence and social media expert specializing in the operational underpinnings of persuasion and related psychology,” who claims that TikTok is a “program that develops predictive behavioral models expunged from the data exposed by users’ digital footprints online. This includes their computers, their smartphones, wearables, and just about any other data tracking devices.” The point of this is not necessarily to entertain users and aggregate their attention for advertisers but, Dabrowa argues, to brainwash them.
Unlike Facebook which analyses your current friendship network, TikTok uses a behavioral profile powered by artificial intelligence to populate a user’s feed before friends are even added. It also predicts the type of friends you should have for your personality. Once outfitted with this information, the TikTok AI has the capacity to train users using similar methods that dog trainers use, i.e. deploying positive and negative feedback loops to encourage TikTok users to behave in certain ways … Initially the videos would appear funny and generate positive emotions, at which point they would be directed to a propaganda video generated by the CCP, with the hope they would then share it. With repeated exposure the positive emotions will become subconsciously linked to the propaganda message in the same way a dog can be made to sit with food training. Over time children could be trained to associate positive emotions with political positions positive to the CCP or react negatively to positions negative to the CCP.
Thompson doesn’t go so far as to liken TikTok users to dogs being trained behavioristically with video treats, but he does warn about TikTok as a propaganda machine: “TikTok’s algorithm, unmoored from the constraints of your social network or professional content creators, is free to promote whatever videos it likes, without anyone knowing the difference.”
Never mind that the same could be said about any platform whose content is sorted by black-boxed algorithms. (And never mind that television networks are free to show you whatever ads they want while you are trying to watch some other content.) To some patriotic minds, the threat of TikTok is that it will make kids love “Marxism” (as though that applies to China’s current political economy).
It seems xenophobically paranoid (if not fully disingenuous) to argue that TikTok is making Manchurian candidates of its users. But that’s not to say it isn’t conditioning us. Another tech industry analyst, Eugene Wei, in this essay, also focuses on TikTok’s “eerily perceptive” algorithm and its predictive powers:
The beauty is its algorithm is so efficient that its interest graph [TikTok’s database of which users are interested in what] can be assembled without imposing much of a burden on the user at all. It is passive personalization, learning through consumption. Because the videos are so short, the volume of training data a user provides per unit of time is high. Because the videos are entertaining, this training process feels effortless, even enjoyable, for the user.
By “training process” Wei seems to mean the algorithm, but it applies equally to the user. The “passive personalization” it entails is equally ambiguous: It applies to both the algorithm that is figuring a user out and the user who is being made to feel “personalized” — recognized as a unique person — simply by watching videos. They don’t even have to “like” them. The app’s environment transforms users’ passivity into self-assertiveness, treating behavioral data as though it were freighted with intimations of their true selves, their real desires. “When you gaze into TikTok,” Wei writes, “TikTok gazes into you.”
If one overlooks the double-sidedness of the relation between users and sorting algorithms — how they mutually shape each other — one not only risks celebrating TikTok (and other algorithmically driven systems) for magically catering to the pre-existing desires of individuals, as Wei tends to and as many enthusiastic commentators have done over the past few years. One also risks investing TikTok with irresistible mind-control powers that are the flip side of “knowing you better than you do.” But the effects of interacting with algorithms are not simply a matter of being programmed or being served. They operate on us through the tensions between activity and passivity that they generate. If they enact a form of control, it is often simultaneously experienced as a kind of liberation. If they implement censorship, they also make possible hypervisibility. If they attempt to dictate who we are, they also make us more materially aware of our own individuality.
Regardless of any potential ulterior motives, TikTok must first induce users to keep using it. One might see that as a matter of matching its supply with what the audience demands, but it is also a matter of producing the audience that can enjoy the app. If there is any sort of mind control involved, this would be it — the same processes of seduction that characterize the system of consumerism in general. The relation between what we want to see and what we end up spending so much time watching is pretty tenuous to begin with; TikTok is designed to preserve and administrate that gap while seeming to close it.
Apologists for TikTok often tout how fun it is. Whenever I see something described as “fun,” I reach for my Baudrillard to trot out his passages about the “fun morality” — the compulsory nature of enjoyment in a consumer society. Fun, as I interpret the word, is not just a synonym for having a good time; it’s an “aesthetic category,” in Sianne Ngai’s sense of the term, that captures a structure of feeling peculiar to the way capitalism subjectivizes us. It indexes the degree to which our capacity to experience pleasure has been subsumed by consumerism. That is, “fun” marks how effectively we’ve been produced as an audience.
“Fun” often evokes the kinds of pleasures typified by the “experience economy,” in which retail is conflated with tourism, saturated with an “authenticity” that deconstructs itself. It includes media “experiences” as well — on-screen entertainment, often presented serially in feeds. (Channel flipping and “doomscrolling” are fun’s inverted cousins.) It conceives of pleasure as a commodity, as an end that can be abstracted from its means. It presumes — or rather prescribes — a sense of time as a uniform emptiness, a blankness that must be filled with various prefabricated chunks of spent attention. “Fun” is when we defeat boredom; being bored by default is fun’s prerequisite.
In a 1956 essay, “The World as Phantom and as Matrix,” critic Günther Anders links “fun” to the individual’s disavowal of cultural indoctrination.
To transform a man into a nobody (and one who is proud of being a nobody) it is no longer necessary to drown him in the mass or to enlist him as an actual member of a mass organization. No method of depersonalizing man, of depriving him of his human powers, is more effective than one which seems to preserve the freedom of the person and the rights of individuality. And when the conditioning is carried out separately for each individual, in the solitude of his home, in millions of secluded homes, it is incomparably more successful. For this conditioning is disguised as “fun”; the victim is not told that he is asked to sacrifice anything; and since the procedure leaves him with the delusion of his privacy or at least of his private home, it remains perfectly discreet. The old saying “a man’s own home is as precious as gold” has again become true, though in an entirely new sense. For today, the home is valuable not only to its owner, but also to the owners of the homeowners — the caterers of radio and television who serve the homeowner his daily fare.
Admittedly, this talk of “conditioning” is not far from the Pavlovian dogs that Dabrowa invoked. But though Anders mentions “caterers,” this conditioning is not some sort of “food training” carried out by specific types of TV shows. Rather it relies on the flow of programming, and the individualized sense of control that comes from private, personal viewing.
Wei’s claim that users “enjoy” TikTok’s algorithmic training process could be understood in these terms. Its “conditioning” is carried out algorithmically for each individual in the solitude of their own phone. But users are encouraged to understand this as a process they are directing in their own interests. Users can see themselves as producing the content they enjoy by consuming it. I’m programming the programmers! Meanwhile, the proprietors of algorithmically sorted feeds — the “caterers” of content providing our “daily fare” — are ultimately the “owners of the phone owners,” capturing our data and our time, which together produce the capture not of our minds but of our will.
Anders’s essay (as I have described before) treats video consumption as a means of producing the self as passive. When social media structure us as both broadcaster and audience, that doesn’t change: Broadcasting the self is less a matter of self-expression than of the chance to engage with ourselves as a media object. “Broadcasting” can be seen not so much as a practice but as a state of mind brought on when experience and mediation seem interchangeable, an attitude that treats attention not as a condition of communication but as its outcome, its product. The broadcast mentality is not contingent on having a big audience; rather what is significant is that you can be an audience to yourself.
Algorithms make us a media object for ourselves in a more immediate way. You can interact with yourself as constituted by the recommendations. Serial interaction with a stream of videos chosen “For You” projects a self forward while also implicitly erasing any sense of self rooted in past experience. You can look into TikTok to see yourself reflected back not only in videos you make and in the response they get but in the entire shape of the interface. The videos you watch may be about this or that, but the app positions you so that the sequence of them is always about you.
Anders claims that “modern mass consumption is a sum of solo performances; each consumer, an unpaid homeworker employed in the production of the mass man.” In other words, though you consume alone, you are working in tandem with everyone else to produce the idea of the audience, which in turn organizes how you feel about what you see, where you see yourself in relation to everyone else. But as Deleuze argues in the “Postscript on the Societies of Control,” “we no longer find ourselves dealing with the mass/individual pair. Individuals have become ‘dividuals,’ and masses, samples, data, markets, or ‘banks.’” We can update Anders’s “production of mass man” to the production of the “dividual”: Everyone on their phones is constructing and reinforcing the idea of the modern consumer as a mesh of configurable data points, consuming databases. Our “solo performances” on apps aggregate into a new kind of selfhood that is disseminated back to us.
In “Why the Next Song Matters: Streaming, Recommendation, Scarcity,” Eric Drott details this process with respect to recommendation algorithms in music streaming services like Spotify. He traces their shift from marketing themselves on their seemingly infinite databases (listen to whatever you want!) to their purported ability to serve up the “perfect song” (here’s something you didn’t know you could want this much!). This parallels TikTok’s promise to provide the content that users “want.” Streaming platforms, Drott argues, sell customers on fulfilling not a lack but the lack of a lack. They claim to solve the “scarcity of scarcity” that the digital surfeit has supposedly brought on. This is a familiar trope across tech companies: They have brought us access to so much goodness, to such a “consumer surplus,” that they now need to spoon-feed what’s good to us.
After all, it is tricky to know what to choose, especially when we are unmoored from any framework that could fully stabilize a set of tastes or needs as “natural” or “proper to one’s station” or “authentic” and so on. It’s not as though taste in music is inborn or even personal; it obviously varies according to the social context in which you are listening as well as what sort of listening habits you have cultivated. And it is not as though pieces of music get exhausted through consumption. You can play the same album over and over again and enjoy it more and more. Requiring novelty is a learned behavior, anchored to economic rewards, among them the fruits of status or fashionability.
Given the effort that wanting new things requires, it makes sense that we would become susceptible to systems that can do our desiring for us. Algorithms don’t reflect existing needs or wants; they are a system for instilling new ones. So opening TikTok or Spotify or Netflix is typically not a way to watch something specific but a place to turn when you want to want something. Their algorithms produce a sense of having been “satisfied” without having to have first been deprived or lacking, and without having to make the effort of bringing one’s desire into a sharper focus. They make desire an inert consumer good, available on demand as already enjoyed. When you consume an algorithmic feed, you are constituted as a subject who is continually winning at wanting.
Drott notes how the problem of “too much choice” can manifest as nostalgia for someone who knows better than us what we should listen to: “These figures include the clerks who once staffed now-shuttered independent record stores; the enthusiasts responsible for the vibrancy of local music scenes; or — more intimately still — older siblings whose role as unofficial ‘cultural intermediaries’ was inseparable from their status as musical ego-ideals.” Streaming services want us to replace those figures with algorithms, which are presumably better because they appear as our servants; they seem to merely organize the proclivities we already have and are trying to discover in ourselves.
But the shift in how streaming services market themselves reflects a bait and switch with respect to what they offer: You wanted more music; you got Big Brother. “Recasting abundance as scarcity is performative in that it fabricates a need,” Drott points out. That is, it invents what it pretends to discover. I think this applies to algorithmic “discovery” in general: It aims to produce a desire that you can recognize as your own, which in turn stabilizes your sense of self as someone with legible tastes. As Drott explains, “The gap between individual and the subject position that is symbolically constructed via music appears to collapse: the streaming service apparently interpellates us as ourselves and as nothing else.” I am music and I write the songs.
Rather than merely listening to what we like, as if we had tastes that were separate from our essential self, what we like is collapsed into who we are. “What services touting their curated selections are selling clients is, in addition to music, a range of subject positions they can adopt through music,” Drott writes, “alleviating them of the burden of having to fabricate such subjectivities themselves.” The fact that there is a “range” reflects the instability of the self that such recommendations bring into view. Even as algorithms promise an accessible self, they also take for granted that users are alienated from their own desire and need a prosthetic for experiencing it.
But why would anyone want an algorithm to perform the work of fashioning a self for them? One way of describing the difficulty that work poses is in terms of a capitalist crisis of overproduction: How can profits be realized if there are too many goods and not enough consumer demand for them? “Without the production of desire, there is no continuous self-transformation of capital,” Drott notes, glossing Marx. “And without the continuous self-transformation of capital — its transmutation from money into productive capital into commodities endowed with surplus value back into an increased sum of money — one is left with a collection of inert entities: money that buys nothing, labor that is idle, machines that gather dust, commodities that nobody buys.”
This tension, which saturates capitalist culture and its institutions, places intense pressure on individuals to want things. It’s not as though there is something “natural” to consumerism. It’s not anchored by subsistence; rather, it serves as the systemic expression of the fantasy of being able to banish subsistence. Consumerism rationalizes the egregious inequality capitalism produces by positing consumers as liberated and autonomous, not merely capable but compelled to never stop wanting more.
Capitalism’s requirement of ceaselessly expanding consumer demand — the crisis of overproduction — gets translated on the level of the individual into a compulsion to want things, a sense that insatiability is a positive character trait, that knowing what to buy or having a wish list of consumer goods and experiences is an index of how well one belongs, how successful one is at being a person. Consumerism is hard work in this sense; it takes a lot of effort to continually reproduce desire, to keep up with what is supposed to be desirable and why, to be able to see yourself being seen consuming what is supposed to be consumed in just the right way, as in an Instagram brunch pic.
Alienation from our capacity to desire is not an intrinsic psychological truth; it is a socioeconomic condition. It is reflected by the sorts of desire made culturally available, the sorts of “fun” on offer, which are premised on alienation and built of contradictions (contrived spontaneity, reified “authenticity,” achievable aspiration, etc.). Capitalist culture interpellates us as both insatiable and easily satisfied by such products as TikTok videos and pop songs. The result of this contradiction is that the more we consume on these terms, the more we feel the requirement to want more. In other words, if TikTok is making kids Marxists, it is because it is suffusing their lives with the intolerable logic of capital, not because it is showing them CCP-approved agitprop.
To manage and regularize levels of consumer demand, capitalism requires that our consumption become deskilled: We must be brought to unlearn how to satisfy ourselves on our own terms (assuming we ever knew how in the first place). Sometimes this process passes under the name of fashion, which valorizes change for its own sake, or rather for the sake of those who manage the cycles of change. But the deskilling is not entirely imposed from above; it is also effectuated by our own practices, which are often felt to be “fun” or “convenient” or “trending.”
Anders’s claim that we are all unpaid workers “employed in the production of the mass man” anticipates the late Bernard Stiegler’s idea that consumption under contemporary consumerism is a form of “proletarianization,” that is, a sort of alienated and abstracted labor. Whereas 19th century capitalism proletarianized workers (deskilling them into “abstract labor” for production processes), Stiegler argues in For a New Critique of Political Economy that 20th century capitalism proletarianized consumers (deskilling their desire into fungible, abstract libidinal energy). As Jason Read explains in this excerpt on Stiegler from his book The Politics of Transindividuality, proletarianization is “the basis for understanding the transfer of knowledge of cooking to microwaveable meals and the knowledge of play from the child to the video game.” Deskilled consumerism is the reduction of the possibilities of knowledge and pleasure to “fun.” As Read notes, Stiegler links this process to media consumption specifically: Video, in his view, has the capability of synchronizing all viewers’ experience of time to the same rhythm of the interface. Through this process, what Stiegler calls the “relations of consumption” — how the desire for things and uses we make of them are fundamentally social — is transformed. The “synchronization of consciousness,” Read writes, “destroys the basis for individuation.” That echoes Anders’s argument about radio and TV making people “proud of being a nobody.”
Hiroki Azuma, in his 2001 book Otaku: Japan’s Database Animals, refers to this kind of deskilling as well, drawing on Alexandre Kojève’s reading of Hegel to argue that postmodern consumers have been “animalized” — a not entirely useful metaphor but one that resonates with Dabrowa’s imagery. Azuma argues that for homo sapiens to become “human,” they must “struggle against nature,” whereas animals “live in harmony with nature.” Human “desire,” then, involves sublimation and a higher purpose than the mere satisfaction of animal wants — something more than “food training.”
For Azuma, what makes human desire “human” is that it takes other people’s desires into account — “the desire of the other is itself desired.” People want to be wanted, and want to want things that others want — sometimes this is described as “mimetic desire” or “social proof.” This is “human” because humans are purportedly alone among species in being able to conceive of intersubjective desire. When consumers find that they can satisfy themselves without drawing on desire for the desire of the other — when pleasure is abstracted from a social situation — they are “animalized”: They are plugged into routinized circuits of stimulus and response that allow them to experience pleasure without social communication. This isn’t a matter of being behavioristically conditioned by rewards so much as blue-pilling oneself, opting for the juicy steak over the conditions of possibility for collective meaning and joy. In practice it means accepting “convenient” modes of consumerism that eliminate the “friction” of other people or embracing algorithms that supplant social communication, providing “the desire of the other” without the trouble of reciprocity with that other. “The objects of desire that previously could not be had without social communication, such as everyday meals and sexual partners, can now be obtained very easily, without all that troublesome communication, through fast food or the sex industry,” Azuma argues. “So it can be said that in this way our society has truly been stepping down the path of animalization for several decades.”
Algorithms play their part by replacing social interaction with data processing. What others do is presented to us directly as what kind of content we should want. Stiegler understands this sort of content as a mediatized “exteriorization of memory and knowledge” that allows consumers to be “controlled by the cognitive and cultural industries of control societies.” Media products replace our lived experience and reprogram our relation to memory, gesture, and pleasure — standardizing and “grammatizing” them, working against our capacity to individuate ourselves as Stiegler believes is our human destiny. The result of this process is what Stiegler labels “systemic stupidity” — or what I am calling the invention of “fun.”
In Azuma’s terms, “systemic stupidity” is animalized consumption that is detached from any master narratives that could give it that higher purpose and instead plays out as the compulsive fulfillment of base needs. He sees otaku — that is, obsessive anime fans — as emblematic consumers of this type who “detach ‘form’ from ‘content’” not “for the purpose of finding meaning in various works or engaging in social activities but rather in order to confirm the self as a pure idle spectator (which is the self as ‘pure form’).” They content themselves, Azuma argues, with consuming endless configurations of tropes drawn from a database of emotionally triggering components. That foreshadows not only how porn sites work but also Pandora and Netflix; they disaggregate content into its component appealing parts and Frankenstein together new iterations that can serve specific niches or search terms. It is also in part how TikTok seems to work, according to this analysis, tagging content by common traits.
Algorithmic culture generally lets us experience the self as “pure form,” atomized within a society remade as network connections, achieving the status of “a pure idle spectator” who is preoccupied with consuming the self itself rather than some other social experience or representation of reality. The feed is just for you; you are the only reason for it appearing in just that way, and it accomplishes nothing beyond allowing you to enjoy your pivotal centrality to that closed loop.
Proletarianization, then, solves the problem of consumer demand by stripping consumers of their “transindividuality” — the way they are particularized and grounded within a concrete community — rendering them as atomized sheep in a herd, a statistic in an overarching balance sheet. In this condition, their affect suitably waned, they are amenable to having their desire for things induced serially and perpetually. “The consumer resolves the problem of overconsumption by quickly and obsessively adopting new technologies, new needs, new objects,” Read notes (and you could add “new playlists, new songs, new apps, new videos”), “but in doing so it produces a crisis of subjectivity, a breakdown of individuation and responsibility that is incapable of constituting itself in relation to a future.” Azuma puts the same general idea this way: “Today, emotional activities are being ‘processed’ nonsocially, in solitude, and in an animalistic fashion. For in the postmodern, database-model society, there cannot be such a thing as a grand empathy. Today, many otaku works are clearly consumed as tools for such animalistic ‘processing.’” The same might be said of TikTok videos.
Endless consumption of this sort dissolves the self, but that is where algorithms become the cure as well as the disease. They reconstitute the self externally and seem to solve the “crisis of subjectivity” they help facilitate. In the process, algorithms train users in what is supposed to be “fun” — the structure of feeling that links the dispersed self back to them — reinforcing lessons we have already absorbed from the “exteriorizations” of other entertainment forms. But this isn’t a matter of affiliating oneself with certain types of content, or branding oneself with a subculture. The self is not anchored by specific content but by the orientation toward time that algorithmic feeds establish: Time needs to be “consumed” so that we can realize ourselves as having recognizable interests, so that we can produce ourselves as a “self” through consumption. It doesn’t matter what specific sort of thing Spotify or TikTok recommends to us, only that it continues to do so. Then we continue to be someone specific.
Algorithms, in other words, teach us to locate ourselves not in our transindividual social relations but in what Drott calls “next-ness”: a foreshortened sense of the horizon of the self. Within that narrow span of space-time, the self is readily dissolved into the range of subject positions that streaming services offer through the content they automatically provide, the “subcomponents into which individuals can be — and have been — discomposed.” Just as pieces of content are broken down into their component parts, so are the consumers. “You want to be the song that you hear in your head,” as Bono says.
The “normative listener” doesn’t have a particular consistent identity, the way one might have belonged to a subculture or modeled oneself after a particular type in the late 20th century. (That was The Breakfast Club model of subjectivation, where one is a nerd, a jock, a preppie, a rebel, or a weirdo.) Rather one’s tastes, Drott argues, “cohere … by virtue of a steadfast refusal of any positive principle of coherence, fluctuating according to context, affect, setting, and other contingent factors.” The conformity of what Anders called the “mass man” consists not in specific shared tastes but in a common condition of being broken apart into these configurable subcomponents, signifiers of identity that are now spoken by feeds (much like they were once spoken by consumer goods laden with “characteristics” in Kelvin Lancaster’s theory of consumerism). Azuma describes this as “database consumption.” But this shouldn’t be understood as consumers confronting an array of options; it is more that we live our lives as part of the spreadsheet, algorithmically processed as data and for data in a material world that is rendered as interchangeable cells, one piece of content after another.
Algorithms serve the capitalist process of producing consumers in its image and instilling in them the desire for the kinds of standardized culture it can reproduce profitably. As a result we are isolated and controlled through the permission to consume when we are no longer compelled by our inability to afford it. Through forms of media, selected for us by ever more responsive algorithms (the human editors that once programmed us giving way to automated systems), we internalize certain patterns of pleasure and behavior that valorize convenience and efficiency and condemn the complications of interpersonal relationships. Media provide emotional experience on a commodity basis, decontextualized and abstracted from the fabric of social relations. They induce a compulsive passivity that simulates autonomy with none of the responsibility.
As much as I agree with all that, and the critiques of “animalization” and “proletarianization” and “depersonalization,” the terms remain off-putting. Who wants to think of themselves or anyone else as “systemically stupid” or somehow less than human? Would you trust what such critics had to say about the world? In recounting these theories, I’m wary of their implications that there is a correct way to consume — a non-stupid way to do it like a “human.” Critiques of consumerism often play as disavowals, as pleas for individual immunity from the snares that caught all the ordinary “stupid” people. But everyone is essentially complicit in a libidinal economy that is fully subsumed by capitalism. Empathy is on its terms.
Sianne Ngai’s Theory of the Gimmick offers a way of describing the deskilling of consumer demand that grapples with that complicity and largely avoids reducing consumer subjects to passive victims or trained animals. That is, there is less moralizing, implicit or explicit, in her analysis of the aesthetic forms capitalism generates to reproduce itself. Instead she focuses on omnipresent ambivalence. “In a world in which everything is made to be sold for profit and engineered to appeal to what a consumer is preshaped to desire,” she asks, “how can there not be a philosophically as well as historically meaningful uncertainty at the heart of the aesthetic evaluations through which we process the pleasures we take in it?” That uncertainty is what she calls the gimmick, which is not simply about clever novelty but also about how commodities seem to lie:
All subjects in capitalism find something gimmicky. When asked to explain why they are characterizing their specific object this way, their responses become tellingly similar: because it is trying too hard, because it is not working hard enough, because its promises of value are unconvincing, because it is instructing me exactly how to consume it (and so on) …
Gimmicks tend to seem like cheats or shortcuts — anything that feels exploitative — but they can also appear as clever hacks (another form of exploitation). They seem to foreground their obviousness in a way that demands we judge them, but it’s “a judgment in which skepticism and enjoyment coincide,” Ngai writes. In judging something a gimmick, we complete it rather than negate it. A gimmick works by inciting a reaction. As with a P.T. Barnum exhibit, the point is not to fool us but to engage us with debased ingenuity. Calling something a gimmick is a “way of communicating the falseness of a thing’s promises of reducing labor, saving time, and expanding value, without disavowing their appeal or social effectivity,” Ngai argues.
Tech platforms, especially, rely on gimmicks — something that Wei notes in passing in his TikTok analysis. You can trace the gimmickry at the level of content, in how prominent all the memes are, or at the level of the interface, where users are offered some new way to process or produce content, whether that is zany image filters and AR lenses or algorithmic feeds. Algorithms try too hard, they make wrong guesses, they tell us how content should be consumed, they flaunt their labor-saving capacity — all of which instills an ambivalence toward them that indexes our dependency on them. Algorithmic feeds offer to save us the labor of making a self through media consumption — an imperative under neoliberalism to make our personalities productive — by handing us a gimmicky self: an overtly false identity that conveys how thoroughly we have been surveilled and the hidden truth about who we really are under capitalist conditions.
As with most marketing and advertising, a gimmick works by letting us see through it and feel superior to it, even as it lodges its framing of the world in our consciousness. We want to believe and disbelieve simultaneously. The gimmick offers a way to hold those contradictory impulses together, allowing us to examine and disavow the desires through which capitalism subjectivates us. It cloaks our complicity in those desires and registers our ambivalence about the erstwhile necessary labor it saves — what it would otherwise require to desire things, to produce ourselves as selves. We displace our induced desire for capitalist culture (characterized by novelty, disposability, status positionality, reification, fetishization) onto gimmicks, which we enjoy through debunking as much as through enchantment — they become bound up with each other.
“The conjoining of enigma and transparency in the gimmick points to a key shift in the way illusions become socially effective,” Ngai notes, with respect to conceptual art, but it applies equally to notoriously black-boxed algorithms. “It ultimately reflects our simultaneous recognition of what we can but also cannot grasp about a productive process from an artifact’s appearance … as well as a double-sided gestalt: ‘work’ conjoined to an equivocal ‘zero’ or disappearance of work.” We know how algorithms are supposed to work in general and why they can’t possibly “really” work to define us and predict us, yet we find ourselves trapped in their world, being predicted, contained, controlled. In supplying a proxy for that “desire of the desire of the other,” they keep the idea of sociality alive in a distorted form — a social distancing without the social. Within algorithmically organized media, we perceive the activity of other people and are even “connected” to them but always from behind a veil of perpetual processing. All we know of what other people want is brought to us as a faint reflection of ourselves, and it can never exceed that horizon.
In his discussion of music-recommendation algorithms, Drott argues that streaming companies are heavily invested in predictive analytics “not because they have some virtuous interest in matching musics and users, despite marketing rhetoric to that effect, but rather because it opens a temporal horizon just wide enough to enable the ongoing reproduction of their business — which amounts to the ongoing reproduction of capital.” But he wonders whether listeners will be taken in: “Whether it matters as much to listeners remains an open question, though it is one that streaming platforms and other digital media companies are striving to resolve in their favor, deploying the considerable resources they command in an effort to fabricate a desire for the curation services they provide.” TikTok seems to be succeeding on that front, selling the waning of affect as a service. It aims to be the last gimmick, the master gimmick, the one that resolves all the ambivalence and ambiguity of capitalist desire into a single, ceaseless stream of subjectivizing content. The next thing you see will reveal how you die.