In this article:
- People have long used the media to spread lies and deceit; though “fake news” is a relatively modern term, it describes something that has been around for centuries.
- Throughout the years, misinformation and disinformation have had a hand in shaping battles, empires, and stock markets. They also have the power to kill.
- The problem of false and harmful information is multi-layered and pervasive, and looking back might help us pick up lessons we can use moving forward in the fight against fake news.
The term “fake news” has been around since the 2000s, but it was only in the past few years — and largely thanks to a loud liar of a former president — that it truly became a household term.
Hailed as 2017’s word of the year by Collins Dictionary, the term refers to “fake, often sensational, information disseminated under the guise of news reporting.” And while some scholars have raised concerns with the linguistics of it, “fake news” has come to be an umbrella term for more specific words like misinformation and disinformation, which describe problematic and harmful information in today’s media landscape.
But the phenomenon defined by fake news has been around for much longer — before troll farms sprouted across the globe, before social media bots came to be, and before social media platforms themselves grew to be such a major part of our lives with their secret algorithms.
The truth is, people have used the media to disseminate lies and deceive others for centuries now. It’s just that today’s unique mix of technology, advertising systems, and culture has brought about a bigger, ever-evolving monster that researchers and governments have still yet to fully grasp.
In that effort, it might be worth taking a look back to recount some of the biggest examples of fake news that have changed history, and what we might learn from them today.
The Making of Rome’s First Emperor (32 BC)
Before he rebranded himself as Caesar Augustus, the first emperor of Rome was called Octavian. He was deeply respected by his people as a great leader who brought about peace and prosperity through constitutional reforms, imperial expansion, and the creation of efficient public services like postal delivery, road networks, and even a fire brigade.
He was also a formidable military leader and defeated former ally Mark Antony on his way to becoming emperor in the Battle of Actium. Before the actual battle, however, was a ferocious propaganda war.
In contrast to Mark Antony, who had moved to Egypt to stay with Queen Cleopatra, Octavian had remained in Rome and knew the people much better. Tensions had been rising between them for a decade, but the final straw that led to the people turning against Mark Antony was a questionable document that historians today still debate the authenticity of — Mark Antony’s will.
Octavian read the questionable (and inflammatory) document aloud in the Senate and spread copies of it across Rome, playing on anti-eastern prejudice and public suspicion of powerful women to cast Mark Antony as having betrayed Rome for the love of Cleopatra. Before the will, he had also circulated small coins and poetry that painted Mark Antony as a drunk, and therefore, unfit to rule.
He eventually did defeat Mark Antony in battle, after defeating him in the court of public opinion. Octavian went on to rule Rome for over four decades afterward.
The fact that people remember Cleopatra today as the power-hungry temptress that corrupted Mark Antony — instead of a shrewd political leader who wanted to protect her crown and maintain her country’s independence — is a testament to the staying power of Octavian’s propaganda.
We’re most vulnerable when disinformation plays to our existing biases and prejudices, and this kind of disinformation disproportionately targets women.
For example, part of the lies that sent a sitting Philippine senator to jail was an alleged sex tape that was definitely not of her, but was used to fuel public derision of her as “an immoral woman.” She remains unfairly incarcerated to this day.
Stocks and War (1814)
The stock market has been described as “just vibes,” which I’m tempted to believe, though people with a deeper understanding describe it as a complex adaptive system that may or may not truly matter to those of us who aren’t corporate hotshots.
Interestingly, the stock exchange has attracted fraudsters for as long as it’s existed, and one of the biggest stock market hoaxes involved another war — this time the Napoleonic one — and is known as the Great Stock Exchange Fraud of 1814.
In February of 1814, a man arrived at the Ship Inn in Dover, England, bearing news of Napoleon’s defeat and death. Though Napoleon did, eventually, die, he was still very much alive at the time — and would live on until 1821.
But the English didn’t know that, and the man who bore the news looked legit: He wore a military uniform and acted the part of Colonel du Bourg convincingly.
He also gave the locals the juicy details they wanted to hear: “The Allies are in Paris, Bonaparte is dead, destroyed by the Cossacks, and literally torn in a thousand pieces; the Cossacks fought for a share of him as if they were fighting for gold,” he said. “The country can expect a speedy peace.”
And soon, stocks — which, at the time, looked more like today’s bonds — rose. Authorities eventually exposed the story as a hoax, but not before the fraudsters had made a lot of money selling off their government securities at suddenly very good prices.
Behind the fraud was a group of eight people, including naval hero and Parliament member Lord Cochrane. Six of them were tried and sentenced to a year of prison time and public humiliation.
There’s a lot of money to be made in lies, both in and out of the stock exchange. And this raises plenty of questions about how today’s digital journalism and social media platforms are currently set up.
Interesting, too, is how fast the fake news peddlers were caught. Maybe it was the sheer audacity of the lie, but maybe it also has something to do with how this particular kind of fake news pissed off powerful people and threatened their wealth.
Creatures on the Moon (1835)
This one’s a classic on the power of a compelling story.
On August 25, 1835, the New York Sun began publishing a series of articles that claimed that there was life on the moon. And it wasn’t just any life: It was an entire system of fascinating aliens like bipedal beavers and bat-humans with wings.
Technically, the article series was sci-fi passed off as news — something the radio drama The War of the Worlds also did a century later, though that broadcast didn’t cause quite the panic the media made it out to be. Nevertheless, the Sun did enjoy a substantial rise in readership.
There’s money to be made in sensationalism, and it’s a practice we now know as yellow journalism. (And it, too, has been linked to war.)
But as funny as it seems today to fall for a hoax like this, it’s hard to blame the people of the time. Newspapers, after all, are supposed to be credible sources of information, and the problem of fake news is, at its core, a problem of public trust in media institutions.
Mark Twain’s Too-Early Death (1897)
Born Samuel Langhorne Clemens in 1835, Mark Twain was a lot of things: a humorist, an entrepreneur, and a lecturer. He was also anti-imperialist, anti-slavery, and pro-union. He passed away in 1910, but some reports of his death came way too early.
In 1897, reports surfaced that Mark Twain had been “dying in poverty in London.”
Twain himself, who was very much alive and well (and not quite poor at the time), didn’t know whether to be amused or annoyed.
Nevertheless, he did come out with a clarification and a possible explanation for the confusion in an article dated June 2 of that year.
“The report of my death was an exaggeration,” he was quoted as saying. The reports may have had a nugget of truth, he guessed: a cousin of his, James Ross Clemens, had been ill but had since recovered.
This is an early example of what has become a long tradition of saying celebrities have died. And though we can’t know for sure if this particular news was maliciously and deliberately spread — that is, if it’s mis- or disinformation — we do know that this type of news today tends to be commercially motivated, as there’s money to be made in clicks and ads.
And, like Twain’s story, it works because it taps into our emotions. Just like people loved Twain, we care about celebrities today. People and things we have an emotional connection to tend to be low-hanging fruit for those who’d like to spread fake news.
Fake News to Catch Fake News (1903)
This is an interesting one, and it involves competing publications: the Clarksburg Daily Telegram and the Clarksburg Daily News. In the early 1900s, the former was suspicious that the latter was stealing their stories, and they turned to fake news to prove it.
The Daily Telegram ran a story about the death of a person called Mejk Swenekafew, a Slav who lived near the Columbia coal mine. He was shot and in critical condition after a fight that involved — of all things — a pet dog.
Just as the Daily Telegram predicted, the Daily News ran the story, too, and was caught red-handed. In fact, the non-existent shooting victim’s surname, spelled backwards, reads “we fake news.”
Stealing articles is never a good thing, but what this case does point to is how we can learn more about misinformation and disinformation today if we look at how news is produced, alongside the more explored issue of how readers might fall for fake news.
Today’s economy has brought about a precarious class of freelance digital media laborers, as newspapers lay off staff en masse, and have continued to do so during the pandemic. This raises the critical questions of who is giving us our news, and how.
In her typology of mis- and disinformation, Dr. Claire Wardle points to poor journalism as one cause of false connection, misleading content, and false context.
All War Is Deception (1917, 1943)
In the time of World Wars I and II, propaganda was a potent and oft-used weapon for gathering public support, as Octavian did, but also for fooling the enemy.
In 1917, two London-based newspapers, the Times and the Daily Mail, ran a story about a “Kadaver” factory called the Kadaververwertungsanstalt in Germany. Citing anonymous sources, the reports claimed that the factory rendered human corpses into soap and margarine.
The story was later traced back to MI7, a branch of the British War Office that had hired 13 officers and 25 writers. This horrific (and thankfully fake) piece of news is just one of many churned out by all sides in World War I.
In the days leading up to D-Day during World War II, the German forces that were occupying the Netherlands used the local publication Haarlemsche Courant to spread Nazi propaganda. Using it, they tried to downplay the Allies’ progress and impending arrival.
In response, locals put together a “fake” version of the paper that revealed the truth — and two versions of the Courant were published on D-Day.
All warfare, according to Sun Tzu, is based on deception.
And it has its unintended consequences: Historians explain that false stories like the corpse factory “encouraged later disbelief when early reports circulated about the Holocaust under Hitler.”
But as with the Dutch resistance, we can also see how we can combat disinformation and reclaim narratives by understanding what’s going on around us and in the media. This is the hope of games like Go Viral, developed by Cambridge University researchers to help people understand and spot fake news about COVID-19.
The Fraudulent Research Hailed by Anti-vaxxers (1998)
Speaking of COVID-19 disinformation, one of the most worrisome — and deadly — ways false information has worsened today’s pandemic is the rise of the anti-vax movement, which had been causing (metaphorical) headaches and (literal) outbreaks of measles and other preventable diseases before the pandemic came around.
In 1998, a major medical journal published a small study that tied the measles, mumps, and rubella vaccine (a vaccine that is given to millions of children every year) to the development of autism. The problem was, the research was terrible.
For starters, the paper had no business being in a major journal, much less making bold claims about vaccination and health: It studied only 12 children and contained multiple discrepancies.
It was also debunked time and again (with some studies involving over 600,000 children) — leading to The Lancet retracting the paper and lead researcher Andrew Wakefield being stripped of his medical license.
(Plus, it’s pretty ableist to be terrified of your child having autism and not, say, polio, which is actually deadly but preventable through vaccines.)
Despite this, the fake science stuck, and two decades later, anti-vax propaganda translates to a staggering $1.1 billion in revenues for social media platforms.
The interesting thing about this case is that the media was complicit in spreading Wakefield’s worrisome ideas. Outlets at the time gave extensive coverage to the dubious science without paying attention to the glaring methodological mistakes in the research, mistakes that The Lancet itself, for some reason, also missed.
Of course, this isn’t to subtract from the blame on Wakefield, because what he did was incredibly irresponsible. Since then, he’s had two decades to either replicate his results (to disprove his critics) or to acknowledge his mistakes — and he’s done neither. Not in the face of preventable illness, nor the deaths of children. And that means that he’s crossed from irresponsible to just plain evil.
But we also can’t deny how hard it has become for the media to undo the damage caused by giving Wakefield a platform to begin with.
And this is a mistake that we see too often. Many new and exciting medical studies reported in the news turn out to be poorly designed in the end. But they make for far more clickable material than detailed discussions about methodological validity, data saturation, or even epistemological assumptions of any given study.
A Lot to Learn
As we develop techniques and technologies to combat fake news, it’s important to recognize that we are fighting a monster as old as media itself, and to keep learning.
Too often, we blame others for falling into the many traps of fake news when, statistically, we’re likely to have fallen into one, too. Not only does this blind us to our own susceptibility to fake news, it also keeps us from finding solutions together.
Back in 1944, Robert Knapp, a researcher from the US government’s Office of War Information, published a set of recommendations for what he had termed “effective rumor control.” Chief among them is to build faith in media institutions and to ensure that quality information is as accessible as possible — recommendations that still make a lot of sense today.
Fact From Fiction is a biweekly column on misinformation and disinformation around the world.