A trio of news articles in February and March on the reach and impact of the global fake news machine completely blew my mind, and I’ve been ruminating on them ever since. It struck me that these developments have huge implications for higher education, both for college and university leaders and for the communications people who support them.
On February 23, the Chronicle of Higher Education posted this piece: How Russian Trolls Used Higher Ed to Sow Discord Online. According to the Chronicle’s research, “At least 129 Twitter accounts associated with a Kremlin-aligned propaganda outfit…tweeted and retweeted about issues pertinent to higher education from 2015 to 2017. …Some accounts…generated original and hostile commentary directed at aspects of American higher ed. Other accounts mostly amplified the opinions of legitimate American critics of academe in the United States.”
On February 15, Inside Higher Ed reported on research documenting the role that Russian social media bots played in the racial unrest at the University of Missouri in 2015. “A new journal article in Strategic Studies Quarterly reveals that the Russian bots had another target in the fall of 2015: students at the University of Missouri at Columbia. The bots created false impressions about some threats against black students and faculty members at the university, which resulted in some campus leaders calling for people to stay home and many students to say that they were terrified. The false reports also contributed to a negative image of the university—particularly with regard to its support for minority students—that the university continues to fight.”
MIT Professor Sinan Aral wrote in the New York Times in March about research he and his colleagues conducted on the spread of fake news on Twitter. They found that false stories spread farther and faster than legitimate ones, and that human behavior played a greater role in this spread than bots. This reinforces research at Carnegie Mellon, which found that people have a variety of strategies for avoiding factual information and seeking out news that reinforces their biases.
While misinformation is nothing new, the tools for spreading it are ever more effective and expansive. “Social networks have lacked the checks and balances or strategic best practices of traditional businesses, media outlets and government since their inception,” said Nikki Sunstrum, Director of Social Media at the University of Michigan. “The platforms 67% of Americans rely on for news in this country were originally only intended to rank cute co-eds, share what you were eating for breakfast and snap a racy photo. The unfiltered dissemination of this type of information naturally progressed as the platforms grew, eliminating barriers for global conversation and providing a limitless playground for any user to become an expert, offer opinion, or share content with little to no consideration for credentials, expertise or fact-checking.”
And, of significant concern, students may be particularly susceptible to malicious campaigns aimed at sowing discord. A 2016 Stanford study found that a very large percentage of students have difficulty judging the credibility of the news they read.
Universities are addressing this troubling trend through coursework, extracurricular skill-development programs, and a growing body of scholarly research that contributes to our understanding of the problem. Some campuses are launching major initiatives and centers, such as the new Center for Social Media Responsibility at the University of Michigan.
More immediately, how should colleges prepare for and respond to disinformation that targets their students, faculty, staff, alumni, and institutions?
Steve Kloehn, vice president for communications at Carnegie Mellon, said the core tenets of good communication continue to serve us well. “At the heart of it, I think, are the very basics of our work: get ahead of the narrative, know your audiences (including the ones who think you are full of fake news), make a positive case instead of a defensive one, show instead of tell whenever you can, and be unrelenting and disciplined,” he said.
Many tried-and-true efforts to communicate factual information and combat rumors continue to be effective, but institutions have to respond much more rapidly than before. Colleges and universities, which prize thoughtfulness over brevity and speed, often struggle to be nimble enough. One way to help your organization respond more rapidly is to create processes for forecasting issues on the horizon, developing policies, crafting messaging, and vetting that messaging with leadership. While the details of a particular issue may be unique, the broad outlines can often be predicted and prepared for.
Other specific approaches I have found to be helpful include:
- Strong listening systems for identifying rumors and misinformation early; this may include social media listening tools, Google alerts, listservs, paying attention to the student newspaper, and good old-fashioned networks of colleagues who are ready to report what they hear.
- A thorough archive of topical materials including existing policies, relevant data and previous public statements; a great example of this is Michigan’s Key Issues website, which serves as a resource not only for journalists but also for campus communicators and others who may have to respond quickly.
- FAQs that tackle tough questions and misinformation head-on, written in concise and accessible language with a neutral tone; during a flare-up, these should be posted prominently on your website and tagged effectively so they show up in search.
- Direct outreach to key audiences and group leaders who can disseminate your information; for example, during disputes over a Palestinian Solidarity Conference the University of Michigan worked with Hillel, the Anti-Defamation League and Arab American groups in Southeast Michigan to reach their members.
- Deployment of credible, third-party experts who can offer facts and speak up on your behalf.
What other ideas do you have about tackling the growing challenge of deliberate misinformation? I welcome your ideas and case studies.