Here in the US, we are deep into election season, and it is impossible to debate politics without also debating how technology is distorting it. There are the AI-generated deepfake images Donald Trump circulated of Taylor Swift appearing to endorse his campaign, as well as disproven conspiracy theories about rigged voting machines. And then there are the malicious disinformation campaigns on social media, which are coming from everywhere – with seemingly no solutions in sight.
The Microsoft Threat Analysis Center released a report charting a recent rise in fake activist and news websites, as well as fake accounts on social media, created by operatives in Russia, Iran and China. Generally, their goals are to create chaos during the election and exacerbate tensions over race, gender and other hot-button cultural issues.
On X, Elon Musk released a chatbot called Grok, which spent several weeks spouting misinformation about when to vote. Meanwhile, Meta, owner of Facebook, announced it has a solution to the political misinformation conundrum: its social platform Threads won’t “recommend” political content. This is akin to claiming one can remove butter from shortbread – it sounds like a healthy goal until you try to separate out what is “political” from everything else.
I recently talked about all this at the National Book Festival in Washington DC on a panel with propaganda expert Peter Pomerantsev. Our moderator asked us whether the 2024 election is better or worse than the 2020 one in terms of misinformation and disinformation, the latter being false information spread deliberately to deceive or mislead. It is a complicated question, because social media has changed so much since 2020.
I think people are more aware of online misinformation, but they are more confused by it than ever. Partly that is because the previous generation of social platforms is crumbling away, and the new ones have fragmented us into dozens of spaces. But in the US, this confusion is also engineered. Politicians have sued and hamstrung academic groups like the Stanford Internet Observatory in California, which tracked US election misinformation online in 2020. We know the propaganda is out there, but nobody is able to analyse it adequately.
Plus, as Pomerantsev said, it isn’t as if online disinformation will evaporate after the elections. Indeed, many people crave it. Propaganda makes us feel like we are part of a community, united against a common enemy. This insight is particularly profound when it comes to social media, which is also designed to make people feel like they are part of a community even when they are alone with their glowing screen.
We are at a weird historical juncture. Experts may understand why propaganda works, but no longer know how it reaches us on a technical level. When the military sent propaganda to adversaries in the past century, they loaded pamphlets into planes and dropped them behind enemy lines. It was pretty obvious where the information was coming from, and why. Today, companies hide the way false information rockets across their platforms. Their algorithms for surfacing content are secret, and so are the identities of many people posting. We don’t need better technology to solve our misinformation problems; we need transparency from the companies disseminating it. They should be honest about where the content in our feeds is coming from, because most of what we see is determined by algorithm, and comes from strangers we never opted to follow. If researchers knew how information got into our feeds, and how people respond to it, they might come up with tools that prevent dangerous lies from spreading.
Still, there are non-technical solutions too. Bestselling author Rebecca Yarros, whose fantasy novel Fourth Wing is about a group of students at a war college for dragon riders, also spoke at the National Book Festival. As Yarros’s main character Violet gains more experience, she realises that the leaders of the college have been rewriting history books to justify a centuries-long war. In reality, her people started the war by colonising the group Violet once thought were the baddies.
Yarros explained that she wrote the book in part to protest US politicians who are removing references to slavery from history books. Many audience members thanked her for telling a story that debunked propaganda and was on the side of colonised people.
I walked around with a smile for a while afterwards. Partly it was the fizz of being in a real-life community, not one fabricated by propaganda. But it was more than that. Popular stories like Fourth Wing give me hope that the escape from propaganda can be just as compelling as the escape into it.
Annalee’s week
What I’m listening to
The podcast Tested, by Rose Eveleth, about the history and science of sex testing at the Olympics
What I’m reading
Peter Pomerantsev’s How to Win an Information War, a tale of British psychological operations in the second world war
What I’m working on
Learning about the history of biang biang noodles
Annalee Newitz is a science journalist and author. Their latest book is Stories Are Weapons: Psychological warfare and the American mind. They are the co-host of the Hugo-winning podcast Our Opinions Are Correct. You can follow them @annaleen and their website is techsploitation.com