
Camille François MIA ’13: “An Optimist Mind”

By Brett Essler
Posted Sep 27 2021

In her Twitter bio, Camille François MIA ’13 describes herself as an “optimist mind working on digital harms.” As a leading expert on cyberconflict who is focused on detecting and mitigating the ever-growing threats of disinformation, media manipulation, and harassment, François has seen some of the most dispiriting behavior imaginable. That she remains an optimist is no small feat. 

As the chief innovation officer at Graphika, François oversaw the cybersecurity firm’s analysis, investigation, and R&D teams. She is also an affiliate at Harvard University’s Berkman Klein Center for Internet and Society, was recently named among Time magazine's “100 Next” global leaders for her work on information operations, and appeared in Alex Gibney’s HBO documentary Agents of Chaos.

Now, François—who has lent her expertise to the government of France and the U.S. Senate Select Intelligence Committee—returns to SIPA this fall to teach Information Operations on Social Media, a short course that provides students with the foundations to understand this emerging space of cyberconflict. 

“Camille François has become one of the most globally influential researchers tracking how states are using social media for disinformation and misinformation,” says Jason Healey, a senior research scholar at SIPA who leads the School’s Tech and Policy initiative. “This course will be a great introduction to the topic and help open doors to jobs in government or the Trust and Safety teams of tech platforms so that the students can understand and help defeat online hate, conspiracy theories, and extremism.”

The course, François says, will give students a better understanding of organized disinformation campaigns designed to manipulate social discourse. If students are lucky, they may also benefit from some of François’s optimism. 

François recently shared her thoughts on the current state of mis- and disinformation and what SIPA students can expect from her new course. 

How has your approach to cyber policy, or to policy that combats mis- and disinformation, changed since you began in the field? 

Propaganda and disinformation have always been a facet of geopolitics. But until a few years ago, influence operations on social media were a strategic blind spot for many in cybersecurity and national security. Following the U.S. presidential election in 2016, new approaches to conceptualizing, detecting, and addressing information operations emerged across the public and private sectors. Detection methods across social media platforms matured, along with attribution practices, for instance. 

What did U.S. cybersecurity experts get right in the 2020 election? 

Information operations—and disinformation campaigns in general—were a key part of election security efforts in 2020. Several foreign-based disinformation campaigns were detected and dismantled thanks to the collective—and creative!—efforts of investigative journalists, technology platforms, and governmental entities. Experts expected Russian, Chinese, and Iranian efforts to use information operations to target the election: they were right. They feared that these operations would leverage real individuals, wittingly or unwittingly recruited into the campaigns, and use artificial intelligence to better hide inauthentic accounts on social media. This too happened: tactics evolve quickly in this field, as adversaries adapt to detection and mitigation methods.

The thorniest questions in 2020 had to do with domestic disinformation. And of course, the more chaotic the information environment is, the easier it is for foreign actors to take advantage of existing vulnerabilities in our online conversations. 

How is mis- and disinformation regarding COVID primarily spread? Is it a domestic campaign or one initiated by foreign actors? 

To date, the vast majority of harmful and false information spread on social media regarding COVID is propagated by real people, some operating out of fear and distrust, others operating out of political or commercial motives. This isn’t to say that we haven’t seen foreign actors pushing COVID disinformation and harnessing these fears, but the volume of these operations is minimal compared to what is being shared by real people online and propagated through domestic campaigns. 

People tend to put very different concepts and threats in the big grab bag of disinformation and misinformation—a situation that mirrors the early days of cybersecurity. Foreign actors pursuing their own geopolitical goals, the global market of fake accounts and the disinformation-for-hire industry, conspiracy theories and the communities who spread them, systematic amplification of harmful information by platforms, the dire lack of trust in institutions responsible for scientific communications—all of these shape the challenges we face with COVID information online. It’s essential to disentangle them, as they require different types of intervention.

What is the biggest emerging threat we face in terms of cybersecurity or mis- and disinformation? 

Information operations are designed to erode trust in democratic systems, and to intimidate and silence critical voices. But by the same token, moral panic and overreaction to these threats are also harmful and counterproductive. I think, for example, of politicians who blamed organic popular online demonstrations on Russian trolls—this happened with the George Floyd protests in the U.S. and with the gilets jaunes [yellow vests] in France. The biggest challenge we face is to find the calm and rigor to properly separate the different issues at play in mis- and disinformation, and to identify the right instruments to tackle each of them. 

Why did you want to come back to SIPA to teach?

Many believe that tackling information operations and disinformation online will require armies of engineers and data scientists. I don’t. Students of international and public affairs are bound to be confronted with these issues in their professional lives, whether they choose to be leaders in government, the tech sector, the media, or civil society. Technology issues are increasingly geopolitical, and their expertise and creativity will be sorely needed to navigate these topics. 

How is the class Information Operations on Social Media structured? What will students take away? 

It’s a deep dive into the part of the disinformation field that’s closest to cybersecurity—the detection and mitigation of information operations, strategically deployed by organized actors to manipulate people and conversations on social media. This emerging field, dedicated to studying how powerful actors use digital tools to manipulate and how to detect and mitigate these campaigns, is a little sister to the cybersecurity field. It borrows a lot from that discipline, notably forensic principles and attribution theories, but also innovates to bring new perspectives and tools to tackle these threats. 

What are the new kinds of roles or careers students will find now in this space whether in the public, private, or nonprofit sectors? How can institutions like SIPA provide students with the right curriculum and experiential learning for these opportunities?

Many fields are affected by these issues. Having an in-depth understanding of information operations online will be essential for careers focusing on cybersecurity, human rights, national security, or the growing field of Trust and Safety within the technology industry.