On July 24, 2019, former special counsel Robert Mueller sat before the House Intelligence Committee to field questions about the 448-page report his nearly two-year investigation had generated. It was his first public appearance since the investigation concluded, and reporters from every major media outlet packed into the committee room to hear Mr. Mueller’s verdict on the integrity of the 2016 election.

Expanding upon the intelligence community’s consensus that Russia and other foreign actors mounted a dedicated effort to influence the presidential race, Mueller gravely warned that “they’re doing it as we sit here, and they expect to do it during the next campaign.” With this message coming out of every television, radio, and newspaper in the United States, vigilance surrounding social media misinformation was at an all-time high in the months preceding and weeks following the 2020 election.

Among those who heard Mr. Mueller’s salient warning was junior Chase Small. Small became concerned about election interference after reading a report by Professor Michael A. McFaul, the director of Stanford’s Freeman Spogli Institute for International Studies. “I had been super interested in what had been put in place to avoid that sort of interference again in this election and saw that most of that had fallen to the platforms themselves,” Small recalls.

Small went looking for opportunities that allowed students to assist in avoiding a repeat of the 2016 election. Luckily for Small, the newly formed Election Integrity Project was looking for interested undergraduates. The Stanford Internet Observatory, in conjunction with Graphika, the Atlantic Council’s Digital Forensic Research Lab, the University of Washington’s Center for an Informed Public, and the Stanford Program on Democracy and the Internet, had recognized the same lack of external checks on social media misinformation during election periods. In response, these organizations jointly founded the Election Integrity Project (EIP). As described by Stanford’s Cyber Policy Center, the EIP is “focused on supporting real-time information exchange between the research community, election officials, government agencies, civil society organizations, and social media platforms.” Aiming “to detect and mitigate the impact of attempts to prevent or deter people from voting or to delegitimize election results,” the EIP provided an opportunity for Stanford students to impact the 2020 election in real time.

Most Stanford undergraduates at the EIP served as Tier 1 Analysts. They performed “the first pass of analysis when [the EIP] receives new pieces of potential misinformation,” according to sophomore Jennifer John. Senior Dylan Junkin, another Tier 1 Analyst, said undergraduates at the EIP also gathered “some basic data about the misinformation, like who it’s targeting, who is the source, and assigning a score that correlates to its threat level.”

Tier 1 Analysts were first on the scene. Each report ticket, submitted to the EIP by either a government official or a member of the general public, contained a brief description of the suspected misinformation. The first step in analyzing a ticket was to decide whether it fell within the EIP’s scope. “We had a pretty narrow scope in what we were looking at,” John explained, “so we wouldn’t go after claims like, ‘Joe Biden is a fraud.’ It would have to be something that directly misleads people about how to vote, why you shouldn’t vote, or overall delegitimizing the election results.” Tickets were also sorted by target audience; each analyst was assigned to a certain swing state or demographic, such as the LGBT community, black voters, or the disabled and elderly, which was John’s specialization. Analysts would then advise Stanford graduate students in management positions on the argument for removing the post. If the ticket fell within the EIP’s scope and violated the policies of a given platform, analysts would, according to John, “submit the ticket to someone involved in site integrity and safety at the platform where it was spreading. Sometimes this would mean pinging someone at Facebook and Reddit and Instagram and TikTok, depending on how far it had spread.” If the platform agreed that the post was not acceptable, it would be promptly removed or labeled as erroneous for viewers.

While Russian interference in the 2016 election may have inspired the creation of EIP, international interference wound up as only a fragment of the 2020 project. After slogging through hundreds of pieces of misinformation, Junkin noted that “the vast majority of misleading posts were domestic.” To anyone who has glanced at President Trump’s Twitter account since November 3 or tuned into Sean Hannity in the past week, many of the most common narratives will sound familiar.

In the weeks leading up to Election Day, Junkin recalled that the majority of reported posts involved narratives around mail-in ballots: claims about postal workers throwing out ballots, dead people receiving ballots, or general misconceptions about the legality of mail-in voting. For instance, Junkin analyzed and reported multiple claims about Minnesota Congresswoman Ilhan Omar allegedly buying votes and committing ballot harvesting, an illegal practice in which third parties, not voters, submit absentee ballots. After Election Day, Junkin found that “the misinformation was no longer about the process. Most posts began to discuss the people interacting with completed ballots, like ballot counters and poll watchers.” In an instance both Junkin and John encountered, a poll watcher in Philadelphia who was temporarily denied access to the vote-counting area was paraded as proof that all Republican poll workers were being barred from entry.

One of the EIP’s most persistent challenges originated in Antrim County, Michigan, where human error resulted in Trump’s and Biden’s vote totals being initially inverted before the issue was quickly rectified. Although the Dominion Voting Systems machines on which the votes were cast functioned properly, Small explained that “Antrim County has become the poster child for the claim that Dominion is at the center of a federal voter fraud scheme.” For example, right-wing news site OANN has repeatedly run articles claiming “vulnerabilities” and “glitches” in Dominion Voting Systems machines, and President Trump himself tweeted, “REPORT: DOMINION DELETED 2.7 MILLION TRUMP VOTES NATIONWIDE. DATA ANALYSIS FINDS 221,000 PENNSYLVANIA VOTES SWITCHED FROM PRESIDENT TRUMP TO BIDEN. 941,000 TRUMP VOTES DELETED. STATES USING DOMINION VOTING SYSTEMS SWITCHED 435,000 VOTES FROM TRUMP TO BIDEN.” While Junkin noted that “the President’s statements may run against the work that we’re trying to do,” he also contended that he and the rest of the analysts at the EIP believe it is imperative that the information be corrected outside that media atmosphere. Unfortunately, Junkin also suggested that even basic information warnings on the President’s tweets can “feed into his narrative that big tech is trying to censor him and his message.”

While most posts are simply flagged with a footnote reading “This claim about the election is disputed” and remain visible on the site, Twitter’s and Facebook’s recent application of fact-checking has led to an alarming rise of alternative platforms that spread misinformation. John said that he has seen “a number of people moving to Parler, which frames itself as the free speech version of Twitter. It’s number one on the App Store—over one million downloads in the last week. Ted Cruz has an account. President Trump doesn’t yet, but most suspect that it’s only a matter of time.” Parler raises a firestorm of questions about how much mainstream platforms can fact-check the president without losing business, and about how to deal with rampant misinformation on a platform that markets itself as “fact check free.” Even so, Parler will likely continue to rise in significance in the coming months as a major player on the right.

Despite having possibly kickstarted a dangerous boom on Parler, increased fact-checking on social media platforms over the course of the 2020 election has been crucial in maintaining the integrity of American democracy. With Stanford undergraduates serving as the first set of eyes on every piece of misinformation, the EIP was and continues to be a model of how Stanford students can combat misinformation in tangible ways. As the EIP continues to confront posts about vote counting, legal challenges to the results, and a whole host of remaining issues in the coming weeks, EIP analysts can be proud that they played a central role in protecting the integrity of the most important election of our lifetimes.