Image by Agnes Jonas

On the Grid | How Surveillance Became a Love Language

Zoë Hitzig

A full-page ad in a November 1990 issue of Fortune magazine features two dozen men in dark suits turned away from the viewer. Standing in neat rows, most blend together in a uniform mass. But three are singled out, with red targets pinned to their backs. The text below the shadowy tableau reads: “Wouldn’t it be great if new customers were this easy to spot? Now they can be.” Bullseye.

The spread was for Lotus MarketPlace, a collaboration between Lotus Development Corporation, a spreadsheet-software tech giant then valued at over one billion dollars, and Equifax, one of the country’s largest consumer credit agencies. Lotus MarketPlace contained detailed profiles of 120 million Americans, including their names, addresses, phone numbers, marital statuses, estimated household incomes, and purchase histories, all filed into lifestyle categories such as “cautious young couples” and “inner-city singles.” For $695 (about $1,600 today), a company could purchase eleven CD-ROM discs of consumer data covering half the U.S. population.

In a Wall Street Journal piece that ran just after the Fortune ad appeared, a Georgetown professor was quoted describing the product as “a big step toward people completely losing control of how, and by whom, personal information is used.” A staff attorney with the American Civil Liberties Union said that Lotus was “stretching for the broadest interpretation of the law and looking for ways to get around its intent.” The article ricocheted through nascent cyberspace, finding its way onto message boards and email lists where angry netizens encouraged one another to call Lotus to insist that their names be removed. This online backlash quickly grew into a coordinated attack, with phone calls and letters overwhelming Lotus’s headquarters.

Nine months after the product was announced, Lotus’s president publicly acknowledged the “volume and tenor of the concerns raised” as well as the insurmountable expense of removing the more than thirty thousand people who had demanded to be taken out of the database. Lotus MarketPlace was canceled before it even launched. Today, this triumph of privacy advocates reads like a false dawn. In the decades since, we have indeed, as the Georgetown professor warned, completely lost “control of how, and by whom, personal information is used.”

And we know how it happened. After the dot-com bubble burst in 2000, Google realized that it was sitting on a monetizable surplus: the data produced by people’s engagement with its search engine could be wielded to customize ads. And then it discovered that the more precisely those ads were targeted, the more lucrative they became. Google’s data was richer and vaster than Lotus’s — a combination of search histories, IP addresses, and metadata that could paint a picture of what a specific person in a specific place at a specific time wanted to do, know, or buy. Google’s approach was stealthier, too. The company updated its terms of service to note, without explicitly mentioning advertising, that it was stockpiling this information “to improve the quality of our service and to better understand how people interact with us.” Facebook and data brokers like Acxiom quickly followed suit, refining ad-targeting algorithms and accumulating massive data sets of consumer profiles. By the late 2000s, smartphones provided new sources of data, harvesting information all day long, not just when people were at their computers. There were no more full-page ads in popular magazines. Just terms of service in minuscule fonts, manipulative interfaces, and other tricks of the magician.

The result is that we are now locked in innumerable contracts through which we surrender our personal information for convenience or pleasure — for better search results, faster delivery, more helpful recommendations, thimblefuls of dopamine. We feel conflicted about these agreements, but also powerless to amend or terminate them. Even the “techlash” of the late 2010s — when scandals like Cambridge Analytica and high-octane critiques like Shoshana Zuboff’s The Age of Surveillance Capitalism ended the era of unquestioned techno-optimism — did little to free us from these arrangements. According to a 2023 Pew Research Center survey, 73 percent of Americans feel that “they have very little to no control over the data collected about them by companies.” And yet, we still turn over our information voluntarily for trifles — FaceApp, which demands full access to cameras and camera rolls in exchange for filters that add or subtract twenty years or twenty pounds, has been downloaded by over 350 million people across the globe. We are burnt out, fatigued, addicted. We talk casually about our Big Tech overlords, more or less accepting our debased roles in their fiefdoms.

But there is something more insidious happening, too. Technology companies have so thoroughly conditioned us to believe we are powerless when it comes to digital privacy that our attitudes toward privacy more broadly have also been warped. Just as in the era of the PATRIOT Act the national security state insisted that it was virtuous, even patriotic, to give in to the intelligence machine, tech culture now ascribes its own virtues to the forfeiture of privacy: realness and connection. Where we once guarded our control over personal information, we now give up control not just freely but even tenderly, monitoring and being monitored by loved ones through social media platforms like BeReal and location-sharing apps. It’s a strange form of Stockholm syndrome for the surveillance age — we love, and love with, the tools of our captors. Resigned to the Big Tech companies recording our every move, we’ve invited friends, family, and partners to join them in watching us. We’ve begun to celebrate surveillance as a form of intimacy.

 

Find My Friends, an app that allows people to track their consenting contacts’ whereabouts in real time, was introduced in 2011 with the launch of the iPhone 4S. In the app, you can set up alerts for when someone enters or leaves a specified location. Or, you can simply treat Find My as a live map, and watch your targets strut around the neighborhood, the city, or the globe. In 2019, Apple merged Find My Friends with Find My iPhone, which geolocates Apple ID-linked devices. Now, the streamlined Find My app (represented by a green target with a blue bullseye) takes care of both, as if friends are also expensive possessions to track in case they get lost or stolen. And location-tracking is not limited to Find My: Google Maps offers the same, and Snapchat has a similar feature called Snap Map, which the company claims is accessed by over 350 million users per month.

What might have seemed, not too long ago, like a dangerous act of exposure has rapidly become a security blanket and a source of recreation. Location-sharing apps allow parents to track their adolescent children, and adult children to keep tabs on their senescent parents. Marketed as “family safety” solutions, location-tracking apps like Life360 offer more than just real-time location data. They also maintain a database of your family’s movements, storing up to thirty days of precise location history for every member of your “circle.” There are even smartwatches and other GPS devices designed for kids who don’t yet have phones; the Wizard Watch, for example, says it “gives guardians the confidence to allow their loved one to explore the world outside, without the stress and fear of wondering where they are or if they are safe.” In these duty-bound dynamics, there may be a clear sense in which the person tracking is responsible for the well-being of the person who’s being tracked — one party gives up privacy in exchange for care.

In friendships and intimate partnerships there may be good safety rationales to turn on location sharing, but there’s nothing in the implicit relationship contract to suggest that one person can monitor the other’s whereabouts. Still, it can be entertaining to track the people in our lives. As one 22-year-old Find My user who habitually retrieves ten friends’ locations told Vox, the app “is so, so common among basically everyone I know, just for safety reasons but also for fun.” A 2023 TikTok featuring a screen recording of the Find My interface overlaid with a GIF of Pedro Pascal eating a sandwich and the words “Me checking find my friends to make sure all my sims are where they’re supposed to be” garnered nearly ten million views, one million likes, and countless videos riffing on the format. This kind of location-sharing turns friendship into a video game. And if it’s all a game, there’s no reason to object.

Those who celebrate the fun side of location-sharing apps don’t talk about them in terms of control; they talk about convenience. Apps like Find My save people who are always down to hang from the effort of texting, and enable spontaneous coordination in a globalized, expeditious present in which friends are literally hard to find, stochastically whizzing across and between cities. But what’s really disturbing — and representative of how this technology is changing relationships — is how people talk about mutual location-sharing like it’s a badge of intimacy, the implication being that truly close friends deserve to know everything about each other, including minute-by-minute coordinates. The same Vox article describes the phenomenon as “the next step in digital intimacy after following someone on Instagram.” And some say that it makes their friendships deeper. In The Paris Review, Sophie Haigney (a Drift contributor) cheekily declares Find My to be her “favorite app” and recounts using it “constantly and impractically” to check on her loved ones. “I guess it makes me feel close to them in a stupid technology way,” she explains. In a poetic apologia for the app published in the New York Times, the novelist Kathleen Alcott muses, “Find My Friends rewards a groundwork of trust that’s already laid, magnifying what we know to be true about the people we love through the changes in place that express it.”

In his landmark work from the middle of the last century, sociologist Erving Goffman theorized that in any social interaction, individuals are like actors who tailor their performances to their audiences and their contexts. Building on Goffman’s ideas, media scholars like danah boyd have used the term “context collapse” to describe how social media demands a unified presentation of the self to distinct audiences simultaneously. Unlike face-to-face interactions — in which we present ourselves differently to family, friends, or colleagues — social media forces us to speak to all of them at once. It also introduces another audience member: the algorithm. Whenever we communicate online, we communicate to a collapsed version of our social worlds via a medium that is structured to maximize engagement — by prioritizing the extreme, or the enviable, or the seemingly successful. And so we find ourselves further and further from anything that resembles our complete “self,” presenting ourselves as — per a popular meme — professional on LinkedIn, wholesome on Facebook, slutty on Tinder, and stylish on Instagram. Location-sharing apps, on the other hand, can offer the illusion of remaining whole. They entice in part because they seem to counter the distortionary, performative aspects of social media. They allow us to exercise our desire to rein in our audience and banish the always-lurking algorithm, sharing a truly unfiltered stream of information with the small group we’ve pulled in close. “In a world where we use social media to broadcast highly curated versions of ourselves,” Alcott writes, Find My furnishes an “antithesis.”

Location-sharing suggests that to be authentic is to cede agency over what you share — to give your friends unmediated access to your life. Putting the imperative in its very name, the French social media app BeReal capitalizes on a hunger for alternatives to the performance of social media. At some uncertain time every day, BeReal pings its users: “⚠️ Time To BeReal ⚠️” Users then have two minutes to take and post two photos using their front and back cameras simultaneously. Photos can be posted along with optional geotags, though not all users are aware of this feature and may unknowingly share their locations. It’s a digital approximation of the panopticon’s punishing 360-degree view, with the randomly timed notifications creating that disciplining feeling of always possibly being watched.

The sense that we are showing less filtered and more continuous footage of our lives to our friends distracts us from the reality that we are also showing all of it to the extraordinarily powerful tech companies that gave us the idea to do so in the first place. It’s not nice to think about how Apple can see where you are at all times of the day, but it’s lovely to think about a good friend catching you on a long hike in the middle of a big green swath on the little map in the palm of their hand. So even when we supposedly react against the ways that tech companies try to control us, we reinforce the surveillance logic on which they thrive. We continue to subscribe to and even spread the idea that it is virtuous to give up agency over the personal information we share with others, because we’re conditioned by the tech companies to trade that agency for convenience. By the time we start sharing our location, we’ve likely already given that same real-time data to companies like Apple and Google.

Perhaps we share our information willy-nilly with our friends simply because it’s, at this point, an unremarkable thing to do. As Eva Galperin, the cybersecurity director at digital rights group Electronic Frontier Foundation, told the New York Times, “People do this sort of indefinite data sharing because it is normalized within their immediate family or friend group.” It is advantageous for the omniscient, omnipotent tech companies, which have been tracking and psychologically manipulating us for almost two decades, if we adopt a posture of powerlessness about our personal information in all realms of life, online and off. Our dependence on them grows, giving them license to continue harvesting more and better data even as regulatory scrutiny intensifies. As a non-location-sharer in a friend group filled with them, I am often teased for pushing against Find My and for other practices (auto-deleting messages, always using Signal, affixing privacy screens to my devices) that I like to think of as basic digital hygiene. It’s worth mentioning that I am the only one in the group who thinks about information technologies for a living — I recently wrote a doctoral dissertation on the economics of privacy and am currently a researcher at OpenAI (all views here are, of course, my own). Still, my friends insist I’m paranoid. Behind their taunts lies an internalized and unquestioned consensus that being truthful means full disclosure by default.

 

Our surveillance Stockholm syndrome is not only making us more submissive to Big Tech; it’s also changing how we relate to each other. It creates snags in relationships, to be sure — location-sharing apps, for example, expose white lies, stoke FOMO, and enable unwanted or unwarranted deductions about who’s sleeping with whom. But there may be deeper relational losses, too, that come from the moral attitude that says it’s wrong to have secrets, and that it’s wrong to have regions of our lives that are not translated into data. By replacing opportunities for genuine reflection and connection with runnels of information, our appropriation of digital surveillance may diminish our autonomy, erode trust, and undermine the meaning of our relationships with others and with ourselves.

In her 2015 book In Defense of Secrets, the late French psychoanalyst Anne Dufourmantelle corrects one of the implicit claims of the surveillance-as-intimacy perspective: “Transparency is not truth,” she writes. Believing that it is leaves our psychological landscapes exposed and open to manipulation by external forces. A “free life,” she argues, is precisely one that is “capable of generating” secrets. Clearly, our sharing-is-caring regime makes it harder to have secrets. In her playful discussion of Find My, Haigney reports, “someone above the age of forty asked me recently how anyone in my generation has affairs, if we all know where others are at any given time.” Affairs are not the only kind of secret to be had in intimate relationships, of course: some other common secrets have to do with gambling, drugs, alcohol, frivolous shopping, illicit friendships, delicate health, or financial issues. Whatever the secret, our adoption of surveillance as intimacy makes it harder to keep the activity or fact hidden, and indeed tells us we are wrong to do so.

It seems plausible that our warm acceptance of surveillance tools has been at least a partial factor in the recent popularity of non-monogamy. Given the constant flow of information from social media and tracking apps, it can be simpler to pursue an open relationship than to hide an affair. But even if our embrace of digital surveillance has potentially helped to push us toward open relationships, it has also made carrying them out more difficult. The writer and artist Shelby Lorman reports falling into a “digitally induced paranoia” when her boyfriend posted an inscrutable candlelit photo on his Instagram story during their year-long attempt at an open relationship, for which they had adopted a “don’t ask, don’t tell” policy about the others they were dating. “We’re so inundated in the amount of access we have to everyone, all the time, that it’s easy to dismiss how this impacts us, especially romantically,” she writes. As an anonymous writer in The New Statesman put it about their own non-monogamous relationship, “I want to know everything. But sometimes the details make me feel jealous and insecure.” Of course they do! Imagine how much more of a mess Proust’s narrator — already constantly seeking to uncover his lover Albertine’s secrets — would be if he were equipped with contemporary tracking tools and cast into a society that normalizes them. In Proust’s world, surveillance breeds obsession, not intimacy, and entrenches insecurity rather than securing love, distancing the narrator from the object of his affection. Eventually, he observes, “we only love what we do not wholly possess.”

So secret affairs are unworkable, because we know where everyone is at all times. And open relationships can devolve into paranoia, or ratchet up to spectacles of endless disclosure. But even for those relationships in which neither party has secrets, something is lost when we decide to share everything: the freedom to reflect on and narrate our needs and desires, and tailor them for specific listeners. When we surrender control over what and how we disclose, we undercut our capacities for self-determination.

In a thrilling new book, The Right to Oblivion, political theorist Lowry Pressly centers a defense of privacy not on secrets, as Dufourmantelle does, but on the value of oblivion. Oblivion, for Pressly, “describes a state of affairs about which there is no information or knowledge one way or the other, only ambiguity and potential.” Oblivion and secrecy are not the same — for some experience to become a secret, it must first travel out of the domain of oblivion and into that of information. Pressly argues that true privacy requires safeguarding oblivion, not secrets. Surveillance-as-intimacy renders the self as a “repository of information to be got at rather than a human being whose depths are unknown and respected as such.” Open relationships can inscribe a partner as a “repository” by assuming that they are a sum of facts about what they bought, where they went, whom they flirted with or kissed or brought home from the bar. To be constantly worried about disclosure is to be always in the process of codifying experience as information. Some of the most tantalizing and powerful encounters in our lives resist the kind of classification that often weighs us down and anchors us in the shallow end of what’s possible. In all relationships in which we treat each other as repositories of information, we tend toward surveillance — to our mutual detriment. Pressly points out that “the child who is tracked by her parents from her earliest opportunities for independence, whether in the physical world or online” will ultimately miss out on “opportunities to be trusted,” which are crucial to “personal development and moral self-worth.”

Our surveillance Stockholm syndrome blinkers us in another kind of relationship — our relationship with ourselves. The most dramatic example of this behavior is “digital hoarding,” the practice of relentlessly collecting digital files to the point that virtual clutter causes stress, confusion, and an overwhelming sense of disorder. Many of us have some digital hoarding tendencies — deleting photos can feel like an impossible task, as though the memories and relationships they represent might dissolve if they were to be wiped from our machines. These habits represent a conflation of “memory” of the human kind with the “memory” of the machine kind. Apple, for example, shows us “memories” from our camera rolls, employing facial recognition and metadata to put together collections like “Last Weekend in Kansas City” or “All Together,” a photo album of you and your family. It’s a bit creepy — after all, Apple is showing its hand, proving that it can infer which faces belong to which of our friends — but it’s also endearing. Every time you smile at a “memory” in spite of yourself, you are unknowingly saluting the principle that we ought not have power over the information we spew onto countless servers, as well as the more foundational principle that memories — the kaleidoscopic whorl of experience that we draw on to make life meaningful — ought to be tabulated into neat packets of information. The suggestion that we have no realm of oblivion and that we are the sum of our data, in Pressly’s words, creates an “excess of historicity” about one’s own life that can lead to a “sense of life becoming more fixed, more factual, with less ambiguity and life-giving potentiality.” It diminishes our belief in “that central capacity of human agency to change and become different” from who we were in the past.

Today’s ascendant technologies — large machine learning models, often mythologized as “artificial intelligence,” that promise and threaten to bring about profound changes in the social order — evolve the capitalists’ surveillance practices, and our modes of participation in them. Security expert Bruce Schneier warns that the new generation of artificial intelligence tools enables mass spying, which goes beyond the mass surveillance that we have already normalized. Surveillance is about tracking actions — what you do, where you go, what you buy. Spying, on the other hand, is about gleaning intent through a careful study of what you say, what you think, and what you feel. While surveillance is easy today, with our devices logging our physical coordinates, our transactions, and our website visits, spying has remained relatively labor-intensive, requiring analysis of large amounts of unstructured data like text, audio, and video. The new wave of machine learning models can take enormous amounts of messy data and instantly produce summaries that anyone can understand.

The normalization of mass spying could go further than surveillance did in skewing our relationships. The devices cozied up in our homes — Ring cameras capturing every neighborhood drama, Alexa politely ignoring our off-key singing — are already quietly recording and transmitting data every moment of the day. There have been flashes of resistance to the creep of these gadgets. Amazon’s ill-fated Ring Nation — a television show that featured Ring-captured clips of doorstep marriage proposals (“Ring, you heard it first!”), kids being chased across their yard by cranes and cats, and deer and iguanas chilling on patios — was canceled after one season, having caught the attention of high-profile critics like Senator Ed Markey. “The Ring platform has too often made over-policing and over-surveillance a real and pressing problem for America’s neighborhoods, and attempts to normalize these problems are no laughing matter,” he cautioned. A writer in Vice pronounced the show an audacious new step in “Amazon’s propaganda campaign to normalize surveillance.” Still, these technologies continue to proliferate, even incorporating new language models. In October, Ring launched an A.I.-powered search tool that can pinpoint specific objects and activities from recorded footage. The search is not yet very sophisticated, but it’s not hard to imagine it soon enabling queries like, “What did my partner get up to while I was gone?” In this world, you wouldn’t even need to be suspicious about something specific — a generalized hunch would be enough to format a query, and receive an easily digestible response. This is a significant shift beyond our current capabilities; intimate spying typically entails continually monitoring live feeds, manually reviewing recorded data, and watching dots on location-sharing apps. If we’ve already adopted digital surveillance as a modern love language, are we going to normalize and then moralize digital spying too?

Lotus MarketPlace tried to put targets on our backs, but we threw them off. Three decades later, we have bullseyes on all sides and don’t seem to care. In fact, we now fasten targets on our friends like charms on a friendship bracelet. We say — with pride — that we have nothing to hide. In our unthinking acceptance and enforcement of the relational terms of service that cast surveillance as a form of intimacy, we not only make ourselves ever more powerless in the grips of our captors, but also overlook what these contracts may devalue or destroy entirely: the deep autonomy, trust, and moral self-worth born out of secrets and regions of our lives that should be protected from a translation into mere information. In a 1991 postmortem of the Lotus MarketPlace debacle in the Technology Review, scholar Langdon Winner augured, “The troubles unearthed during the MarketPlace furor will not vanish with the product’s ignominious death.” Indeed, the troubles live on, even as our response to them has been subdued. To distract us from their power over us, at first the tech companies hid their intentions. Now that we’ve caught on, they’ve taken a new approach. They’ve served us a tiny sip of their own intoxicating power — they’ve given us power over each other.

Zoë Hitzig is an economist and the author of two poetry collections, most recently Not Us Now. She is the poetry editor of The Drift.