False news stories are 70 percent more likely to be retweeted than true stories are, according to research from MIT. It’s no wonder we hear so much about disinformation and misinformation these days, from the “Big Lie” of voter fraud in the 2020 presidential election to the ongoing backlash that Spotify is facing as notable artists pull content from its platform for hosting a podcast that has spread vaccine misinformation.
While the battle against inaccuracies may have largely started in the political world, organizations of all kinds are now susceptible to falsehoods spreading among consumers, employees, and other stakeholders. For many, this misinformation carries real consequences, negatively impacting business. And as the topic gets increasing attention, bad actors continue to devise new strategies for spreading false information, making it nearly impossible to have a set playbook for what to watch for and how to fight it.
Our team at Precision has been helping clients — from political campaigns to leading corporations — navigate these complex issues so they can quickly identify and proactively combat false narratives. Here are our top tips for approaching disinformation:
➔ To catch issues early, forethought and customization are critical. If you wait until a false narrative “rises to the surface” and shows up on your timeline, it will already have a life of its own. Monitoring for false narratives requires planning, a deep understanding of your industry and its key influencers (both within your industry and beyond), and a skilled team of data scientists and analysts. There is no single playbook for identifying misinformation, but with the right team and tools in place, you can build an infrastructure that catches toxic narratives early, making them easier to nip in the bud. Social listening tools are a great place to start, as they allow you to track conversations via keyword-based queries and author lists. It also helps to incorporate image recognition and variations of keywords (e.g., “v*ccine” instead of “vaccine”), since disinformers often use images or text variations to evade detection. As you become more familiar with the toxic ecosystem you are facing and its key players, analyzing profile attributes and behaviors can help you segment influencers into distinct factions, making it easier to understand and predict trends in behavior.
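To illustrate the keyword-variation idea above, here is a minimal sketch in Python of how obfuscated spellings like “v*ccine” can be caught with a character-substitution regex. The substitution map and sample posts are illustrative assumptions, not part of any particular social listening tool:

```python
import re

def variant_pattern(keyword: str) -> re.Pattern:
    """Build a regex matching common obfuscations of a keyword,
    e.g. 'v*ccine' or 'v a c c i n e' for 'vaccine'. The substitution
    map below is a hypothetical starter set, not an exhaustive list."""
    substitutions = {
        "a": "[a@*4]", "e": "[e*3]", "i": "[i*1!]",
        "o": "[o*0]", "s": "[s*5$]", "c": "[c*(]",
    }
    parts = []
    for ch in keyword.lower():
        # Fall back to the literal letter plus '*' for unmapped characters.
        parts.append(substitutions.get(ch, f"[{re.escape(ch)}*]"))
    # Allow an optional separator (space, dot, dash) between letters.
    return re.compile(r"[\s.\-_]?".join(parts), re.IGNORECASE)

pattern = variant_pattern("vaccine")
posts = ["new v*ccine claims", "vaccine news", "v a c c i n e facts", "vacation plans"]
flagged = [p for p in posts if pattern.search(p)]
# 'vacation plans' is not flagged: its letters don't fit the pattern.
```

In practice a monitoring team would feed patterns like this into a listening pipeline alongside author lists and image recognition, rather than scanning posts one by one.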
➔ Focus on the present and future, not the past. You will have a better grasp of the misinformation landscape affecting your organization by tracking in real time, rather than looking back at past weeks or months. By conducting deep influencer and keyword monitoring daily, you will be able to track the “who” of the narrative in addition to the “what.” This should also give you a good sense of which platforms misinformation related to your organization thrives on, and can often let you confine your monitoring to just those platforms. While Facebook, Twitter, and Telegram are generally the platforms most relied upon to spread misinformation, it’s important to determine whether this holds for your own unique influencer base.
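The daily “who” and “where” tracking described above can be sketched as a simple tally over mention records. The record fields and account names here are hypothetical placeholders for whatever a social listening export actually provides:

```python
from collections import Counter
from datetime import date

# Hypothetical daily mention records from a social listening export;
# the field names and accounts are illustrative, not from any real tool.
mentions = [
    {"platform": "Twitter",  "author": "@acct_a", "day": date(2022, 3, 1)},
    {"platform": "Telegram", "author": "@acct_b", "day": date(2022, 3, 1)},
    {"platform": "Twitter",  "author": "@acct_a", "day": date(2022, 3, 2)},
    {"platform": "Facebook", "author": "@acct_c", "day": date(2022, 3, 2)},
]

by_platform = Counter(m["platform"] for m in mentions)  # the "where"
by_author = Counter(m["author"] for m in mentions)      # the "who"

# Platforms ranked by volume suggest where to concentrate monitoring.
top_platforms = [p for p, _ in by_platform.most_common()]
```

Run daily, counts like these reveal which accounts keep driving a narrative and which platforms matter for your particular influencer base.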
➔ Responding to misinformation is not always the best way to combat it. Sometimes publicly responding to a false narrative can amplify conversations and engagement around it, having the opposite of the intended effect. Sharing misinformation simply to add a comment of “this is wrong” still contributes to its spread. Assess each threat on an individual basis and, if you do engage, come from a place of understanding and develop an empathetic response — research shows this is much more effective than using negative rhetoric. For example, in responding to misinformation about COVID-19, one might start with a statement like “I know the past two years have been frustrating, but…” or “I also find the changing regulations complicated, however…” to show a desire to help rather than fight.
➔ Get familiar with content moderation policies. Platforms such as Twitter, Facebook, and YouTube are often inconsistent in how they respond to misinformation. By having a thorough grasp of each platform’s content moderation policies, you can proactively identify content that should be removed or accounts that should be suspended, and alert the platforms accordingly.
Understanding who is spearheading a misinformation narrative, and who keeps engaging and re-engaging with it, will allow you to spot trends in behavior and predict future moves. But tracking misinformation can be difficult. A skilled data analysis team can provide extra support in helping you corral misinformation as soon as it starts, giving you a head start on quashing harmful narratives before they circulate widely.