
Two Basic Patterns for Misinformation – Brad Chen



Revelations of the deceptive tactics applied in recent elections have induced a heightened awareness among Europeans and Americans of the danger posed by misinformation, particularly when combined with the reach of online social media. As citizens, many of us feel compelled to rise to the challenge, to defend our form of government and our way of life, but our efforts may be wasted unless we understand the challenge first. With that in mind, here are two complementary patterns for misinformation.

The first general pattern is the information void. Information voids occur when there is public interest in information and that information is not available. It could be that the information does not exist, as with certain questions of science and religion, or that the information is being withheld, for example a secret. Human nature dictates that information voids tend to be filled, with rumors, speculation, and sometimes lies fulfilling the public’s need when the truth is not forthcoming.

The second general pattern is the destruction of information. This occurs when an adversary acts to disrupt public acceptance of a particular fact or understanding. Lies are one important technique but there are many others.

An astute reader might now be thinking, “Okay, so there is gossip and there are lies. What’s the big deal?” The simplicity of this model is a feature, but what is more important is how this structure motivates strategies to mitigate misinformation. Let’s consider some examples.

Information Voids

Popular concerns about misinformation tend to focus on weighty topics like election interference and foreign intelligence services, but these associations are misleading. Consider an insecure lover, separated from their partner. That absence creates an information void that can easily provoke suspicion or fear. Over time our lonely soul may become thoroughly disoriented, susceptible to all manner of paranoid fantasy. A phone call or a friendly word might put our poor soul’s mind at rest, but often the only option is to recognize the void, assume benign intent, and calmly wait for the separation to end and full confidence to be restored.

Secrecy commonly provokes information voids. Imagine a new or secret project in a high-tech software company. As the existence of the project becomes known, rumors begin to circulate. A range of theories emerges, some plain, some more creative. Existing conspiracy theories easily attach themselves, given the void created by the dearth of facts. Which rumors are most likely to capture the public’s attention? Although clever executives can turn such enthusiasm to their advantage in viral marketing, more commonly the confusion does more harm than good.

It’s easy to imagine misinformation as pure evil when in fact the underlying behavior is as basic as human imagination. Consider a general preparing for an imminent attack by a cunning adversary. The general must prepare a plan mindful of incomplete information and the possibility of deception. Under such conditions, imagination or the lack thereof may be the difference between survival and an untimely death.

As a 21st-century example of information voids, Web search engines can manifest “data voids” (as described in Wired and the New York Times), where search demand runs ahead of available content on a subject of public interest. This can happen in media markets where demand outpaces content sources. It can also occur after a major news event, in the gap before authoritative news sources respond. A variant of this phenomenon is the social media “rabbit hole”, where personalization technologies guided by popularity lead a consumer toward unsavory content. Mass personalization combined with the global reach of social media makes this a hard problem. It is as if the perennial rumor mill has become global and searchable, with the most salacious gossip available to all and the most provocative gossipers achieving minor stardom.
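
To make the mechanism concrete, here is a minimal sketch, purely illustrative and not drawn from any real search or recommendation system, of how a ranker that scores results by engagement alone will fill a data void with whatever content happens to exist, however unreliable. The item names and scores are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    authoritative: bool   # e.g. an established news source
    engagement: float     # clicks and shares, which sensational content tends to win

def rank(items):
    # Score by engagement alone; source quality plays no role in the ordering.
    return sorted(items, key=lambda item: item.engagement, reverse=True)

# A breaking-news query, before newsrooms have published anything substantive.
results = rank([
    Item("Wild speculation thread", authoritative=False, engagement=9.2),
    Item("Conspiracy video",        authoritative=False, engagement=7.8),
    Item("Wire-service brief",      authoritative=True,  engagement=1.1),
])

for item in results:
    label = "authoritative" if item.authoritative else "void-filler"
    print(f"{item.title} ({label})")
```

The point is the incentive, not the code: when authoritative content is scarce, optimizing for popularity rewards whoever fills the void first.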

The point of these examples is to demonstrate that information voids are all around us and are as old as human storytelling. As a consumer of information, you can be wary of the enthusiasm that commonly accompanies the most salacious attempts to fill the void. By recognizing voids for what they are, and giving due consideration to benign, dull explanations of the unknowable, an information consumer can avoid the messiness of falsely incriminating others. Moreover, unless you live in a society of charlatans, the odds of being correct are in your favor.

If you are responsible for creating an information void by withholding information, you can constructively manage the situation by staying aware of public interest and by being as transparent as possible without compromising the project’s strategic intent. By developing a sensitivity to how secrecy induces rumor and speculation, you can better monitor the misunderstandings and distrust that secrecy breeds, and strike a better balance between transparency and the protection of confidential information.

Destruction of Information

Information voids occur when information of public interest is unknown. When the information is known but inconvenient, an adversary may apply themselves to destroying it. As a first example, let’s go back to Rome in 1633, to a pivotal event in the birth of modern democracy. For refusing to acknowledge that the Sun revolved around the Earth, Galileo Galilei was found “vehemently suspect of heresy”. The scientific community throughout Europe took note, and not only for the sake of the movement of celestial bodies. In refusing to bow to orthodoxy, Galileo questioned the authority of the ruling theocracy to dictate science, and more generally the authority of the integrated church-state to dictate to its subjects what they would be allowed to believe. Although Galileo lived the rest of his life under house arrest, his struggle reinforced the empirical turn championed by Sir Francis Bacon, who is commonly credited with establishing the scientific method, which uses skepticism and peer scrutiny to discourage competent scientists from maintaining beliefs that are false. The premise that the scientific community could govern itself without relying on divine intervention was itself a narrow case of the general notion that a person can discover new truth through disciplined rational thought, without relying on royal or ecclesiastic exceptionalism: the basis of the intellectual movement known as the Enlightenment. The Enlightenment in turn gave credibility to a rebellious group of American colonies who believed they could rule themselves without the help of the British monarch. And the rest is history.

The Enlightenment marked the key departure from the medieval dependence on theocracy and royal exceptionalism. Facts and science displaced faith and authority in how European civilization constructed its system of beliefs, often at odds with the purveyors of faith and authority. Given how the resistance against misinformation supported the birth of modern democracy, it is unsurprising that misinformation today is weakening democracy and restoring a medieval way of thinking. Geocentrism is echoed in popular themes like climate-change denial, creationism, and anti-vaccine activism. Instead of hunting witches, we demonize Jews, Muslims, immigrants, and whichever political party we disagree with. When opinion leaders argue against competent science on the basis of bad science, religion, and “common-sense” arguments, they do more than advance their causes: they compromise the public’s faith in competent science, and ultimately the public’s ability to believe any facts at all. A public deprived of facts is left making decisions based on opinions, creating opportunities for charismatic leaders whose priorities are at odds with truth and reason.

Birtherism, a discredited body of conspiracy theories questioning Barack Obama’s U.S. citizenship, is a modern example of information destruction. Despite his having satisfied the normal standard of proof of U.S. citizenship, a variety of creative theories emerged before and during Obama’s presidency, among them:
– that he was born in Kenya, not in Hawaii
– that he had held Indonesian citizenship, forfeiting his U.S. citizenship
– that he was born with dual British and U.S. citizenship
The willingness of public figures to participate in this disinformation strategy, and sometimes actively support it, illustrates the insidious nature of such campaigns. Ben Nimmo, Senior Fellow at the Atlantic Council’s Digital Forensic Research Lab, describes four basic tactics for disinformation: distract, divide, discourage, and deceive. Birtherism demonstrates all of them. It effectively distracted voters from issues like healthcare and the subprime mortgage crisis, which were less easily handled by Obama’s opposition. It divided the electorate, with as many as 25% of those polled in 2010 doubting Obama’s natural-born U.S. citizenship. It created sufficient doubt to discourage some from voting at all, and it deceived many who chose to invest themselves in false claims or to question the accepted truth.

As rumors develop around the 2020 U.S. presidential candidates, the election has already produced notable examples of information destruction. The New York Times reported on JoeBiden.info, a deceptive website easily confused with authoritative content from the candidate Joe Biden. The site claims to be based purely on factual information, but it is managed by a Biden opponent and built to present isolated events from the candidate’s history out of context while omitting Biden’s own message. It also relies on identity deception: the key detail that the site is maintained by an adversary is unavailable to a casual consumer of the content. The clandestine nature of the site’s authorship exemplifies a general problem of weak identity in Internet services, with the author’s right to privacy at odds with the public’s expectation of protection from deception. To protect the public from related deceptive practices on radio and television, some countries enforce ownership and disclosure requirements.

Another variant of information destruction is controlling the public narrative. Phineas T. Barnum is credited with the quote ‘There’s no such thing as bad publicity’, a maxim modern politicians adapt by generating newsworthy material at strategic moments to seize control of headlines, often drawing attention away from opponents or from negative news about themselves. For a modern example, consider the timing of the Skripal poisoning shortly before the Russian presidential election. By triggering calls for retaliation from the UK and US, the events supported and reinforced a domestic narrative of Russophobia and nationalism. Without taking credit for the attack, Putin’s team acknowledged the positive impact on voter turnout, which was critical for the Russian president’s support. For a U.S. example, witnesses subpoenaed by special counsel Robert Mueller stated that the release of emails stolen from the Hillary Clinton campaign was timed to draw attention away from the Access Hollywood recording in which Donald Trump spoke graphically of his crude behavior towards women. During the Trump presidency it is difficult to prove that the administration’s initiatives on immigration and trade are timed to control the press, but objectively they have done well at holding the headlines, while the actual impact of those policies remains unclear.

As a final example of controlling the public narrative, Never Trust A Pelican is a children’s story from Thailand in which a hungry pelican fabricates a rumor of an imminent natural disaster to convince the residents of a pond to entrust it with their lives. The pelican is both trusted and well fed until a clever crab recognizes the falsehood.

What to do?

Misinformation is messy business. Some examples don’t fit cleanly into this model, but more important than the model is having a plan for handling the problems. Awareness is an important first step towards meeting these challenges, but if we stop at concluding “I have been deceived” we abandon ourselves to a world depleted of fact, ceding the advantage to the purveyors of charisma and popularity. To maintain the factual basis for public beliefs we need to learn to value reason and fact. These two patterns for misinformation provide a simple structure that can help.

For the first pattern, the information void, learn to recognize information voids for what they are. If you are a member of the curious public, be wary of those who attempt to fill the void with rumors and speculation. If you are party to creating a void, recognize that you can prevent misunderstanding through transparent disclosure of facts rather than relying on secrecy and ignorance.

For the second pattern, information destruction, make a personal commitment to valuing reason and fact. Support competent science and best practices in professional journalism. Learn to recognize medieval thinking, and when you see it, point it out to somebody who doesn’t. Be wary of the omission and manipulation of context. Understand the weakness of online identity and how it facilitates deception. Be aware of the public narrative and those who seek to manipulate it, and take ownership of your own path through these stories.


