By Caty Cherepakhov

2016 was the first time a large-scale online disinformation campaign was used to interfere with a U.S. election, and it succeeded because social media can scale any message, giving people and organizations unprecedented access to one another. Given the steady increase in the time people spend online, disinformation is a real and growing threat, and, perhaps most important, it is likely to get worse in 2020.

While Russia still actively promotes messaging that runs counter to the U.S.’s global standing and to democracy overall, it can now do so more effectively through Russia Today (better known as RT) and its social media channels than through countless smaller-scale disinformation campaigns. Disinformation campaigns used to originate in Russia, and in 2020 some could still come from countries like Russia, China, and Iran, but researchers predict the stories that gain traction will be home-grown. We have already seen politicians and news outlets spread disinformation to advance their political goals, a nearly unique problem for the U.S., where a two-party system is coupled with a partisan divide in which outlets and topics people trust.

While there will certainly be wild cards, here are some trends and tactics that will likely be in play for 2020.

  • More rallies and counter-rallies. Demonstrations tend to get mainstream media coverage, a primary goal of disinformation campaigns, so we’re likely to see more attempts to organize them.

  • Disinformation from ordinary Americans. People or local officials will be paid by an individual or organization with a political goal to create social media accounts, host events, or even run a full campaign, a playbook that recently saw success in Madagascar. Not to mention the confused Americans who share false stories outright!

  • Voter suppression through disinformation. Using robocalls to misinform voters has proven successful at reducing turnout in other countries, and while there is no hard research on the impact of ads with this goal in 2016, the tactic will almost certainly be tested in 2020.

  • Deepfakes making news. As seen with the wide circulation of a crudely edited video of Speaker Pelosi, one fake video can drive a whole news cycle, and the problem will only get worse as deepfakes become more sophisticated.

  • Confusion through Instagram memes. Instagram is expected to be the largest vehicle for disinformation, particularly through memes targeting both parties, due to the diversity and age range of its users.

  • WhatsApp mass forwarding. WhatsApp is viewed as the platform to watch because users can forward messages to hundreds of people at once, a feature that fueled an increase in domestic disinformation around recent Indian elections.

Overall, 2020 is likely to see a mix of new and old tactics, and getting ahead of these attacks is far easier if you are alert and prepared on the front end rather than scrambling in the midst of one. Whether you lead a communications team or write social for a brand, here’s what you can do to prepare:

  • Vet your own messaging. Red-team your message frame, statements, and responses to attacks to ensure there is no room for misinterpretation and that your proposed response won’t create unintended negative consequences. Simple statements and inadequate responses can be distorted into a slew of negative headlines.
  • Monitor your mentions across platforms. Keep track of what people are saying about your organization or campaign on social media. Use social monitoring tools to track upticks in mentions; a spike during odd hours could indicate coordinated bots operating from other countries. Monitoring software like Talkwalker or NetBase can flag these spikes automatically so you know an attack is coming.
  • Train your team. Learn how to recognize suspicious websites, topics, and warning signs for inauthentic social media accounts (like the misuse of popular phrases and grammatical errors specific to non-native English speakers), and utilize sites where stories can be reliably fact-checked.
  • Educate your followers. Whether it’s journalists covering your brand (who often inadvertently propagate misinformation just by covering it in the wrong way) or volunteers on a campaign, make sure people know the facts and are ready to dispute disinformation in their communities or online. If you’re seeing a lot of disinformation, an FAQ one-pager dispelling common misconceptions is a must!
  • Make a plan of action. Don’t wait to be attacked before putting a plan in place. Identify potential attack points and vulnerabilities, and prepare a comprehensive response plan. If you’re running a campaign, for example, prepare a fact-checking site with clear, direct statements and resources for common lines of attack — time is of the essence, and it’s better to be prepared from the get-go rather than scramble after an onslaught of fake posts.
  • Respond carefully. When experiencing a disinformation campaign, you may be inclined to respond through official channels, but that often isn’t the best move. Only respond formally if the false information is gaining mainstream media attention — if it isn’t, your response will likely give it some. If you do respond, do so forcefully and clearly, and be sure to include proof to back it up.
  • Manage your reputation. People are more likely to believe bad things about an organization they don’t like or know anything about. Building brand loyalty, developing a positive reputation, and doing good can help protect organizations from boycotts or lasting damage to the brand. And, it gives you substantive evidence to counteract an attack.
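For readers with a data-minded teammate, the mention-monitoring advice above can be sketched in a few lines of code. This is a minimal, hypothetical spike detector over hourly mention counts; the function name, threshold, and sample data are illustrative assumptions, not features of any particular monitoring product, which typically handle this automatically.

```python
from statistics import mean, stdev

def flag_spikes(hourly_counts, window=24, threshold=3.0):
    """Flag hours whose mention count exceeds the trailing baseline
    by more than `threshold` standard deviations.

    `hourly_counts` is a list of mention counts, one per hour,
    oldest first. Returns the indices of the flagged hours.
    """
    spikes = []
    for i in range(window, len(hourly_counts)):
        baseline = hourly_counts[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Floor sigma at 1.0 so a perfectly flat baseline still
        # requires a meaningful jump before an hour is flagged.
        if hourly_counts[i] > mu + threshold * max(sigma, 1.0):
            spikes.append(i)
    return spikes

# A quiet stretch with one burst at hour 30 (e.g. 3 a.m. local time).
counts = [10, 12, 9, 11, 10, 13, 8, 12, 11, 10, 9, 12,
          10, 11, 13, 9, 10, 12, 11, 10, 9, 13, 12, 10,
          11, 10, 12, 9, 11, 10, 180, 11, 10, 12]
print(flag_spikes(counts))  # hour 30 is flagged
```

An off-hours spike like the one above is exactly the pattern worth escalating to a human for review before assuming an attack is under way.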

The goal of those spreading disinformation in 2020 isn’t just to misinform. These efforts are designed to confuse, to spread doubt, and to blur the line between fact and fiction. Nefarious actors will use a variety of channels and messages to reach people, both in the run-up to Election Day and year-round. Understanding these threats and taking proactive measures will help brands and campaigns weather the storm and avoid long-term reputational damage.

Before joining Precision, Caty received her master’s in Comparative Politics from New York University and wrote her thesis on the impact of disinformation on democratic decline in the U.S., U.K., and Canada.