Table of Contents
- Executive Summary
- Introduction: What Drives the Need for New Skills?
- What Do We Mean By Cyber Citizenship And What Skills Contribute To It?
- What Does Research Say About Building These Skills?
- What are the Challenges to Implementation in the U.S. Education System?
- New Instructional Materials Developed for Educators, But Also A New Problem
- A First Step: The Cyber Citizenship Portal
- Recommendations
- Conclusion: What Would Success Look Like?
- Appendix: Diagram of Emerging Network
What Does Research Say About Building These Skills?
Policy leaders, educators, and parents have all voiced a sense of urgency about the problems of an increasingly polluted information environment and the need to figure out what teaching methods or materials will build the critical thinking skills that students need. Yet helping students to become aware of manipulation by social media algorithms, to avoid being misled by false claims (spread by automated bots or otherwise), and to learn to check sources before getting carried away by emotions—these are skills neither easily gained nor easily assessed. Indicators of success are not going to come from one-and-done modules or multiple-choice tests.
Fortunately, there are some foundations to build upon in studying what works. As discussed earlier, the field of media literacy education is not new. Indeed, the National Association for Media Literacy Education (NAMLE) emerged before the dawn of either Facebook or YouTube. Efforts to infuse classroom lessons with new communications tools and help build students’ skills in choosing and using media and technology for civic dialogue have been underway for decades. Leading organizations that support educators, like the National Council of Teachers of English, the National Council for the Social Studies, and ISTE, have been grappling with and developing new frameworks for teaching that recognize the power of communications and network tools that enable anyone to publish anything at any time. Educators have also benefited over the past decade from research on the benefits of harnessing digital media for connected learning. The Connected Learning Alliance has found that students learn most when their personal interests are connected to “meaningful relationships and real-world opportunity.”1 And in cybersecurity, researchers can now collect data on the reduction of risky behavior, such as click rates on phishing tests, after students are taught how to avoid scams online.
However, in the areas that make up the intersection of cyber citizenship, a robust base of replicated studies on how to build skills and an awareness specific to fighting disinformation, misinformation, and mal-information does not yet exist. This is a result of both the relative newness of the problem and the fact that the focus until now has been on remedies such as technological fixes or regulation. It also points to a policy need: support for more research into which skill-building approaches work best, for which types of students, in which contexts—and for which desired outcomes. Each element is a critical part of strengthening the education system, including informal learning settings, such as extracurricular programs for children, teens, or adults.
Fortunately, research initiatives are taking shape. One is Mapping Impactful Media Literacy Practices, a two-year, U.S.-based research project started in 2020, that begins with a comprehensive review of current studies and includes international perspectives.2 It aims to identify what the research counts as an impactful media literacy practice, map that practice in different educational settings, and create a tool for educators to determine the impact of their own efforts.
As that work unfolds, there are some key lessons from new studies that have emerged in the past few years to examine how to help build resilience to dis- and misinformation. For example, a 2020 study of Learn to Discern, a program designed and run by the international education and research firm IREX, showed that targeted skill-building messages were effective in reducing engagement with misinformation by helping people recognize how various sources and authors were triggering their emotions. The program is designed to help people become more aware of their whole information environment, including a recognition of how some texts or videos are designed to generate strong emotional responses that could lead to the sharing and spreading of misinformation.
Notably, this “emphasis on emotional awareness and thought-driven decision-making,” as IREX describes it,3 is different from approaches that rely on a deep reading of just one text—an approach that has drawn criticism of late.4 Asking students to think critically only about the words inside a text misses the mark, especially when so many texts are little more than memes with photos and a few words. A key part of media literacy education today involves helping students to do “lateral reading,” in which they shift horizontally in their web browser, opening a new tab to search for evidence that a source is considered credible by others.5 Students also benefit from seeing the way an issue is framed. This can happen when teachers help them see the larger environment of information at their fingertips, including various authors and motivations, and how their own information environment is shaping their understanding of the world.
In the study using Learn to Discern materials, Facebook users were shown a short video about media literacy that informed them about Russian sources of news messages. Then they were asked whether they would “like” various pieces of politically right-leaning Russian content on social media. The study, led by RAND as part of its research on Russian propaganda, showed that materials such as “the video on media literacy…appeared to reduce the number of self-reported ‘likes’ for politically right-leaning Russian content.”6 This was just one study, and many more are needed. Yet the results point to early evidence that interventions can be helpful.
Another research-validated approach is designed to build resilient mindsets by putting people into synthetic environments in which they can “experience” and learn, without the associated risks of the real world.7 One example of this is making students into antagonists (or “chief disinformation officers”) in simulation games that show how chaos ensues when bad information spreads. Researchers at the University of Cambridge have developed two such games—Bad News and Harmony Square—to study their impact. Harmony Square is a free, 10-minute game available on the internet that, as the researchers explain, “incorporates active experiential learning through a perspective-taking exercise: players are tasked with spreading misinformation and fomenting internal divisions in the quiet, peaceful neighborhood of Harmony Square.”8 Their study examined whether players perform better than people who did not play the game on tests of their ability to spot trolls and identify emotionally exploitative, conspiratorial, or polarizing content. In a randomized, controlled trial involving 681 people (half from the U.S. and half from other parts of the world), they found that those who played Harmony Square were more likely to rate manipulative social media content as unreliable, had more confidence in spotting that content, and were less likely to share it.9
And more research is emerging from programs such as KQED Learn, an educational site associated with the public radio station in California’s Bay Area. In 2020, it conducted pre- and post-tests with nearly 200 students whose teachers participated in KQED’s media literacy and civic engagement programs. The students showed significant growth in their ability to distinguish between legitimate and dubious photographs and to compare articles for reliability. They also improved in their ability to check the reliability of sources by using lateral reading techniques instead of relying on searches within the text itself.10
These three examples show how education and learning strategies are evolving to incorporate new findings in the science of human behavior and cognition. And more studies continue to emerge, from scholars at research centers such as the newly formed Center for an Informed Public and in publications such as the Journal of Media Literacy Education. Gathering robust evidence on how educators should go about building these skills and developing these mindsets is important. As the KQED study showed, this is about more than teaching students to look for a “.edu” or “.org” on a website, since that is now understood to be nowhere near enough: bad actors can easily stand up fake “.org” websites. Nor is it about telling them to avoid Wikipedia. (We find that many teachers are not aware that Wikipedia’s commitment to transparency, and its self-policing corps of human editors, now makes it one of the most reliable sources for checking contested information.) This kind of teaching also goes beyond traditional news literacy education, which provides a good foundation but may not go far enough in helping people spot trolls and bot-generated content intentionally designed to sow division. And it recognizes that online threats are about more than clicking the wrong link and downloading malware.
Interdisciplinary efforts are thus key. As part of these, drawing in new fields will be even more valuable to future research on effectiveness. For example, cognitive science is bringing new insights into how people learn and are shaped by the information they ingest and the information environments they participate in. This includes the impact of psychological phenomena such as the illusory truth effect (the impact of seeing false information repeated so often that it seems true) and first-impression bias (in which the information people read first has lasting effects on behavior). Public interest technologists and those who study polarization on the internet are paying attention to the impact of algorithms on what information people see. One example is the way that YouTube serves up videos that lead people down rabbit holes of extremism. And national security experts are constantly identifying what military analysts call TTPs (Tactics, Techniques, and Procedures) of social media warfare by foreign groups and governments that are intent on disrupting democracies.
Here again we see the value of building bridges between different fields and approaches. Educators are not often privy to these new developments, and the instructional materials they use can quickly become out of date. For example, Russian actors targeted the 2016 U.S. presidential election using thousands of false front accounts (“sock-puppets”) and tens of thousands of algorithmic bots that injected disinformation themes into the U.S. political ecosystem. In the 2020 election, they pivoted to elevating domestic sources of disinformation and conspiracy theories.
Educational materials will thus need to incorporate more recognition of the changing TTPs of threat actors, to ensure that teachers do not become unwitting conduits for more polarization. This may be especially important if they are designing lessons that, say, ask students to argue two sides of an issue that might actually be part of a disinformation campaign.
Citations
- See the Connected Learning Alliance for more about the elements of and principles behind “connected learning,” at source
- Paul Mihailidis, associate professor and graduate program director for the Media Design program at Emerson College, is the principal investigator, and NAMLE is the lead partner. See source
- For further reading, see Randomized Control Trial Finds IREX’s Media Literacy Messages to Be Effective in Reducing Engagement with Disinformation, published by IREX, source
- Charlie Warzel, “Don’t Go Down the Rabbit Hole: Critical Thinking, as We’re Taught to Do It, Isn’t Helping in the Fight Against Misinformation,” New York Times, February 18, 2021, source
- The concept of lateral reading is described in several media literacy resources and emerged from the research of the Stanford History Education Group, led by Sam Wineburg. For a useful overview, see “Expand Your View with Lateral Reading,” an article published by the News Literacy Project at source
- Todd C. Helmus, et al., Russian Propaganda Hits Its Mark (Santa Monica, CA: RAND Corporation, 2020), source
- Harmony Square is the example noted here because of the Cambridge research study, but educators may also be interested in another game, which may be appropriate for slightly younger students, called Interland, designed as part of Google’s Be Internet Awesome initiative. As Google describes it, “Interland is a free, web-based game designed to help kids learn five foundational lessons across four different mini-games, or ‘lands.’ Kids are invited to play their way to Internet Awesome in a quest to deny hackers, sink phishers, one-up cyberbullies, outsmart oversharers and become safe, confident explorers of the online world.” For more, see source
- Jon Roozenbeek and Sander van der Linden, “Breaking Harmony Square: A Game that ‘Inoculates’ against Political Misinformation,” Misinformation Review, November 6, 2020, 1, source
- Roozenbeek and van der Linden, 5–7, source
- Institute for the Study of Knowledge Management in Education, Equity and Access, Civic Engagement and Media Literacy – Final Report: KQED Learn Impact Study (Half Moon Bay, CA: Institute for the Study of Knowledge Management in Education, 2020), source