August 19, 2018

Symptom – Election Interference. Disease – Evaporation of Trust

By Dr. Jen Easterly and Joshua Geltzer, J.D. / 08.09.2018


Almost two years after the pivotal 2016 presidential election, America is finally piecing together what happened. Thanks to detailed indictments by Special Counsel Robert Mueller, bipartisan analysis by the Senate Intelligence Committee (validating earlier Intelligence Community reporting), and extensive testimony by tech leaders like Mark Zuckerberg in the face of mounting congressional pressure, Americans are beginning to understand the full assault on democratic processes directed from Moscow. With the midterm elections almost upon us and a political influence campaign again underway, this time at least there’s a sense of the threat we face.

But we will remain dangerously behind the curve if it takes two years for us to comprehend each new type of threat that emerges online. From China’s theft of sensitive information about U.S. government employees to Iran’s denial-of-service attacks on our financial institutions; from North Korea’s destructive hack of Sony Pictures to Russia’s interference with our elections; from international cyber criminals emptying our bank accounts to terrorists like ISIS leveraging the Internet to recruit and radicalize vulnerable youth, these cyber-enabled threats—and whatever’s coming next—are just symptoms. The underlying disease is an assault on trust in the digital age, indeed an assault on what’s arguably our most critical asset: our cognitive infrastructure.

Think of it as the collective brains of the American polity—our collective consciousness, the fundamental things we think and believe, the shared understandings of the world that drive our ability to make decisions. For centuries, we took this for granted as a national security asset. That’s because it could be shielded, at least in part, from those who would try to weaken it. Think of Britain’s cutting of Germany’s transatlantic cables on the first day of World War I, thus ensuring that German propaganda could never directly threaten the minds of still-neutral Americans.

The ability to shield America’s cognitive infrastructure from assault and protect the “human operating system” that relies on it has crumbled. America’s adversaries—as Jonathon Morgan and Renee DiResta have compellingly explained here on Just Security—have weaponized information, or really disinformation, to infect Americans’ perceptions of what’s true and even of truth itself; what’s more, their efforts have been deliberately embraced and amplified by politicians and media outlets here at home. No longer are facts such “stubborn things,” as John Adams famously asserted. Particular manifestations of this have been damaging enough, from minority voters manipulated into staying home on Election Day 2016 to vulnerable individuals seduced by ISIS into slaying fellow Americans. But it’s the broader campaign that requires our attention—and our response to it that demands elevation as a national security priority.

Information operations like Moscow’s assault on the 2016 presidential election, when combined with domestic forces assailing key elements of our democracy and society, are having a corrosive effect on this country’s cognitive infrastructure. Already, many Americans express fundamental doubt about institutions that, mere years ago, were deemed sacred to our democratic health. The free press has become “fake news”; our free and fair elections have become “rigged”; our independent judiciary has become “so-called judges.” What’s next? One could imagine a concerted attack on the confidence that Americans (and others) place in the financial markets, perhaps through the deliberate dissemination of fake information manufactured to drive malign market activity or even just to confuse traders.

What’s even scarier is that the weapons already eviscerating Americans’ trust—trust in facts, in institutions, even in each other—are still in their infancy. Moscow’s 2016 election interference involved fairly standard cyber tradecraft, not high-tech cyber-weapons. Yet we’re on the brink of seeing the weaponization of far more sophisticated tools like “deep fakes”: images, audio, and even video that can make it look like a particular leader said or did something she didn’t. The dangers of swift, even violent mobilization in response to deep fakes disseminated online are particularly grave, given the human impulse to regard seeing as believing. Similarly, ISIS has used simple lists of personally identifiable information to put digital cross-hairs on U.S. service members and their families. Imagine the far greater and more sophisticated threats that await when terrorists and others combine all of the sensitive information on us, given away through social media and smartphone apps, sold by vendors, and stolen by hackers.

We need to recognize our cognitive infrastructure, on which trust is built, as a critical national asset that’s crumbling, and we need to shore it up—fast. It’s a challenge that government, the private sector, and civil society must tackle together.

First, all three need to recognize America’s cognitive infrastructure as foundational to our country’s economic and political health so that resources are rapidly allocated to studying its decline and funding diverse plans for revitalizing it. Second, the government and private sector need to build relationships for promoting the strength of America’s cognitive infrastructure and addressing threats to it, especially because those relationships hold the capacity for information sharing and complementary affirmative messaging that restores Americans’ capacity for critical thought and healthy skepticism of what they see tweeted and posted. The administration’s recent announcement of a new National Risk Management Center is a start, but its mandate must be expansive and flexible enough to encompass the new types of cyber-enabled threats that continue to emerge. While the private sector of course has its own incentives for whether, when, and how to act, an increasing set of private actors has shown a willingness at least to try addressing cyber-related threats, and arming those actors with better information on those threats is a step that government can and should take.

Third, the private sector—informed by the government—must experiment with technology to build “digital guideposts” bolstering our cognitive infrastructure. What we have in mind are information campaigns in which browsers and websites periodically encourage users to read and consider items before re-posting them and to remember that online accounts may not be genuine, all bolstered by civic education in school that fosters healthy online consumption and behavior from an early age. One rough model here might be Facebook’s existing practice of presenting users who are algorithmically determined to be at risk of suicide with a menu of options for getting help. Fourth, we need a restoration of old-fashioned facts and a belief that they exist in the first place—and partnerships between social media companies and independent fact-checkers like Snopes.com are a good way to start. These efforts need to be broadened and deepened: broadened across a wider array of communications platforms and deepened to address more uploaded content, more quickly—before falsehoods go viral.

It’s no secret that good national security policy rests on good decision-making. But good decision-making rests on good information and basic trust that, when we debate tough issues, there’s at least a set of facts in the world defining those issues. Before we can improve America’s national security decision-making, we need to improve a piece of our critical infrastructure long taken for granted: our cognitive infrastructure.


Originally published by Just Security, New York University School of Law, under a Creative Commons Attribution-NonCommercial-NoDerivs license.
