By TALI ARBEL | January 19, 2021
Major social platforms cracked down on the spread of misinformation and conspiracy theories in the lead-up to the presidential election, and expanded their efforts in the wake of the Jan. 6 Capitol riot. But Apple and Google, among others, have left open a major loophole for this material: podcasts.
Podcasts made available by the two Big Tech companies let you tune into the world of the QAnon conspiracy theory, wallow in President Donald Trump’s false claims of a stolen election and bask in other extremism. Accounts banned from social media for election misinformation, threats, bullying and other rule violations also live on as podcasts available on the tech giants’ platforms.
Conspiracy theorists have peddled stolen-election fantasies, coronavirus conspiracies and violent rhetoric. One podcaster, RedPill78, called the Capitol siege a “staged event” in a Jan. 11 episode of Red Pill News...
Podcasting “plays a particularly outsized role” in propagating white supremacy, said a 2018 report from the Anti-Defamation League. Many white supremacists, like QAnon adherents, support Trump. Podcasting’s an intimate, humanizing mode of communication that lets extremists expound on their ideas for hours at a time, said Oren Segal of ADL’s Center on Extremism...
“Podcasts filled with hatred and incitement to violence should not be treated any differently than any other content,” Segal said. “If you’re going to take a strong stance against hate and extremism in the platform in any way, it should be all-inclusive.”..
Podcasts suffer from the same misinformation problem as other platforms, said Shane Creevy, head of editorial for Kinzen, a startup created by former Facebook and Twitter executives that offers a disinformation tracker to companies, including some that host or curate podcasts.
Creevy points out that it’s harder to analyze misinformation in video and audio than in text. Podcasts can also run for hours, making them difficult to monitor. And podcasting poses an additional challenge: there are no reliable statistics on its audience, unlike a YouTube stream, which shows views, or a tweet or Facebook post, which show likes and shares, Creevy said.
But some argue that tech-company moderation is opaque and inconsistent, creating a new set of problems. Censorship “goes with the tide against what’s popular in any given moment,” said Jillian York, an expert at the Electronic Frontier Foundation, a digital-rights group. Right now, she said, “that tide is against the speech of right-wing extremists ... but tomorrow the tide might be against opposition activists.”
AP Technology Editor David Hamilton contributed to this article.