Personal trust in an age of state surveillance
An excerpt from Snowden’s Box: Trust in the Age of Surveillance by Jessica Bruder and Dale Maharidge.
The story of Snowden’s box is deeply human, somewhat messy, and more than a little weird. It’s about a brief moment when strangers worked together to build an underground railroad for secrets—a high-stakes endeavor that relied, more than anything, on bonds of trust.
That’s no small thing. We live in an era of suspicion, marked by an eroding faith in government, the media, and even each other. Social scientists have studied the decline of public confidence, quizzing Americans on the same set of topics over and over for nearly half a century as part of a long-term project called the General Social Survey. It includes this question:
Generally speaking, would you say that most people can be trusted or that you can’t be too careful in dealing with people?
Researchers ran the first round of the survey in 1972. At the time, nearly half of the people who responded said they trusted others. By the latest round in 2018 — two years after Donald Trump was elected president — the figure had dropped to less than a third.
That’s scary news. Trust is the basis of all cooperative action in a free society. It’s the feeling of fellowship that allows people to take risks and grow. It’s also the underpinning of democracy. And it’s fragile, easy to undermine. Massive domestic spying systems — like the one Snowden revealed — are corrosive to the kind of deep human connections that nourish trust and collaboration.
Consider East Germany, the most notorious surveillance state in modern history. Its secret police force—the Ministerium für Staatssicherheit, better known as the Stasi—was created in 1950. By the time it disbanded four decades later, the Stasi had grown to include some 86,000 full-time employees. If you add part-time and unofficial agents, the total number of people spying for the secret police had risen to more than half a million.
The population was saturated with snoops. Some estimates set the ratio at one informer to every six and a half citizens. Most major institutions—from universities to churches and political parties—were infiltrated. Neighbors snitched on neighbors, and mistrust was rampant. Even a member of the nation’s Olympic bobsledding team, Harald Czudaj, confessed to spying on his fellow athletes. Years later, he recounted through tears that police had caught him driving drunk and blackmailed him into becoming an informer.
East Germany was an extreme case, but even in systems with fewer informants state surveillance frays the relationships between citizens. People grow wary, exhausted from the constant pressure. Eventually, they turn against each other.
“If we are gathering data on people all the time on the basis that they may do something wrong, this is promoting a view that, as citizens, we cannot be trusted,” explained University of Sheffield sociology professor Clive Norris, testifying more than a decade ago before members of British Parliament. Today, the United Kingdom maintains one of the most extensive surveillance systems in the world.
The constant monitoring of a population, Norris and other scholars note, “fosters suspicion,” undermines “cohesion and solidarity,” and amounts to “a slow social suicide.” In other words: paranoia will destroy you.
The Snowden affair created a groundswell of concern about how ordinary people are monitored by powerful entities, from governments to tech firms and other corporate interests. It sparked a public conversation on privacy, security, and freedom in the digital age, pushing our culture—at least for a moment—past the point of what the writer Cory Doctorow calls “peak indifference.”
Before Snowden came on the scene, state-run surveillance rarely made it into mainstream American discourse. One of the rare exceptions came in 1993, when officials under the Clinton administration unveiled a device called the Clipper chip and proposed installing it in telephones nationwide. This cutting-edge microchip would encrypt users’ communications, but it would also provide direct access for eavesdropping by intelligence and law enforcement agencies.
Privacy advocates, politicians, technologists, and civil libertarians were alarmed by the Orwellian plan. Together, they formed a motley opposition with members ranging from the ACLU to Rush Limbaugh.
“The precise object of their rage is the Clipper chip, officially known as the MYK-78 and not much bigger than a tooth,” wrote journalist Steven Levy. “Just another tiny square of plastic covering a silicon thicket. A computer chip, from the outside indistinguishable from thousands of others. It seems improbable that this black Chiclet is the focal point of a battle that may determine the degree to which our civil liberties survive in the next century. But that is the shared belief . . . The Clipper chip has prompted what might be considered the first holy war of the information highway.”
Polling showed that 80 percent of Americans didn’t want their phones to be Clipper-chipped. So the White House mounted a bizarre public relations offensive, including a WIRED article by NSA chief counsel Stewart A. Baker titled “Don’t Worry Be Happy: Why Clipper Is Good for You.”
But that breezy, Bobby McFerrin–flavored headline wasn’t enough to save the government’s plan. Around the same time Baker’s article came out, an AT&T Bell Laboratories researcher named Matthew Blaze announced a security flaw in the Clipper chip: hackers could use it to encrypt communications the government wouldn’t be able to crack. The technology was sunk for good.
After that brief flare in public consciousness, dialogue about surveillance in America largely returned to the underground: a lively counterculture of cypherpunks, hackers, and digitally literate civil libertarians. It was far enough outside the mainstream that even Greenwald, who’d written for Salon and the Guardian about abuses of surveillance and went on to play a pivotal role in the NSA leaks, had given little thought to securing his own private communications.
When a mysterious person—using the handle “Cincinnatus”—pleaded with him to set up encrypted email, Greenwald blew off the request. “Despite my intentions, I never created the time to work on encryption,” he later wrote. “It was simply that on my always too-long list of things to take care of, installing encryption technology at the behest of this unknown person never became pressing enough for me to stop other things and focus on it.”
Cincinnatus was, of course, Snowden.
In early 2013, most journalists were like Greenwald. Protecting their email from prying eyes wasn’t a priority. The general public was even less interested in such matters. In the immediate aftermath of the Snowden leaks, that changed. For many, privacy went from a curious abstraction to an immediate, tangible concern. Encryption got hip. At packed CryptoParties, grassroots privacy activists around the globe taught layfolk how to safeguard their online communications. By the time the television series Mr. Robot aired in 2015, the public was primed to immerse itself in a drama about a hacktivist collective. The show’s first season averaged 1.39 million viewers an episode. Meanwhile, a handful of heavy-hitting tech giants followed consumers’ interests—or at least their disposable income—and began touting privacy and security as integral features.
At the same time, all the talk of using technology to protect civil liberties may have worked to obscure some equally valuable truths. Chief among them: encryption is an important tool, but it’s not everything. All meaningful communication and collaboration rely on a bedrock of trust: people of good faith working together. The most advanced algorithms can’t outpace that basic principle.
During a debate over secure messaging, Jon Evans, a TechCrunch columnist and software engineer, explained as much. “You always have to trust somebody. It’s inevitable,” he wrote. “Real security design is about navigating the compromises between usability and security, determining the sophistication and threat model of your users, deciding who you have to trust and who you can’t afford to.”
In essence, when it comes to trust, there is always a practical tradeoff to be made — unless you live on a desert island and you’re a do-it-yourselfer who relies exclusively on home-brewed code, building proprietary smartphones out of coconuts and corresponding only with yourself. Faith in no one, after all, is a recipe for isolation.
Unfortunately, as Evans pointed out, we can’t afford to trust others indiscriminately. Sometimes the figures who are most beholden to the public prove least worthy of its confidence.
That was true of James Clapper, who probably never imagined his tenure as US director of national intelligence would collide with one of the most significant leaks in national history. Three months before the first secrets spilled out of Snowden’s box, Clapper was questioned by Senator Ron Wyden (D-OR) at an open congressional hearing.
Wyden’s job, of course, meant he had a national security clearance. So it seems likely the senator already knew the answer when he asked Clapper if the NSA collected “any type of data at all on millions or hundreds of millions of Americans.”
“No, sir,” Clapper replied. He added: “Not wittingly.” That was a lie.
When his bad-faith testimony was exposed by Snowden’s leaks, Clapper’s first move was to deflect attention away from the lie. He assured Americans: a hunt was on for the dastardly individual who had stolen the government’s secrets. “This is someone who for whatever reason, has chosen to violate a sacred trust for this country,” he thundered.
That remark revealed either extreme hypocrisy or — even worse — the kind of cynicism that makes hypocrisy irrelevant. But the absurdity of making such a charge against the leaker, even with his own credibility in tatters, seemed lost on Clapper.
The intelligence chief would finally apologize for giving “clearly erroneous” information to Congress and the American people. Months later he would resign — his long career as a public servant derailed by deceit. But in the days following the first NSA leaks, contrition was not part of the script. Clapper focused on a single talking point: a traitor had betrayed America. In doing so, he underscored — over and over, as if immune to irony — the importance of upholding public trust.
During one especially revealing interview, NBC anchor Andrea Mitchell asked him how hard it was for intelligence officials to safeguard classified information. Clapper rattled off a litany of tools and protocols. He mentioned security clearances for federal workers and contractors, along with the spy-proof rooms known as SCIFs, or sensitive compartmented information facilities.
But even with the most sophisticated strategies the government could develop, keeping secrets was a “tough problem,” Clapper admitted. After all, systems are only as reliable as the people who operate them.
“When it all boils down to it,” he concluded, “it is all about personal trust.”
It’s easy to believe that small things — individual actions and human relationships — don’t make much of a difference in the face of an authoritarian regime. We disagree. We want this book to serve as a quiet testament to the power of trust, and why it’s worth fighting for a culture where it can thrive.
Trust is the glue of the world, the difference between civilization and chaos. It’s what lets people come together in any kind of cooperative action, from social movements to marriages and markets. When shared between members of a civic-minded community, trust is the one thing that can keep state power in check — unless, of course, we allow ourselves to be manipulated by fear and, in the silence that follows, grow apart from one another.