Fake Claims Over US Shooting Dog Facebook, YouTube
SAN FRANCISCO • A week after the school shooting in Parkland, Florida, Facebook and YouTube vowed to crack down on the trolls.
Thousands of posts and videos had popped up on the sites, falsely claiming that survivors of the shooting were paid actors or part of various conspiracy theories.
Facebook called the posts “abhorrent”. YouTube, which is owned by Google, said it needed to do better.
Both promised to remove the content. The companies have since aggressively pulled down many posts and videos and reduced the visibility of others.
Yet spot searches of the sites revealed that the noxious content was far from eradicated.
On Facebook and Instagram, which is owned by Facebook, searches on Friday for the hashtag #crisisactor, used to accuse the Parkland survivors of being actors, turned up hundreds of posts perpetuating the falsehood, although some also criticised the conspiracy theory.
Many of the posts had been tweaked ever so slightly – for example, videos had been renamed #propaganda rather than #hoax – to evade automated detection.
And on YouTube, while many of the conspiracy videos claiming that the students were actors had been taken down, other videos that claimed the shooting had been a hoax remained rife.
The teenagers of Marjory Stoneman Douglas High School, who lost 17 of their classmates and school staff members in the mass shooting on Feb 14, have emerged as passionate advocates for reform, speaking openly of their anger in the hope of forcing a reckoning on guns.
But in certain right-wing corners of the Internet – and increasingly from more mainstream voices like radio talk-show host Rush Limbaugh and a commentator on CNN – the students were portrayed not as grief-ridden survivors but as pawns and conspiracists intent on exploiting a tragedy to undermine the nation’s laws.
In these baseless accounts, the students were described as “crisis actors” who travel to the sites of shootings to instigate fury against guns. Or they were called Federal Bureau of Investigation plants, put forward to defend the bureau over its failure to catch the shooter.
They have been portrayed as puppets being coached and manipulated by the Democratic Party, gun control activists, the so-called antifa or anti-fascist movement, and the left-wing billionaire George Soros. The theories were far-fetched. But they found a broad and prominent audience online.
Last Tuesday, the President’s son Donald Trump Jr liked a pair of tweets that accused David Hogg – a 17-year-old who is among the most outspoken of the Parkland students – of criticising the Trump administration in an effort to protect his father, whom the teenager has described as a retired FBI agent.
Others offered more sweeping condemnations. Mr Alex Jones, the conspiracy theorist behind the site Infowars, suggested that the mass shooting was a “false flag” orchestrated by anti-gun groups.
On his radio programme, Mr Limbaugh said of the student activists last Monday: “Everything they’re doing is right out of the Democrat Party’s various playbooks.
“It has the same enemies – the NRA (National Rifle Association) and guns.”
By last Tuesday, that argument had migrated to CNN.
In an on-air appearance, Mr Jack Kingston, a former United States representative from Georgia and a regular CNN commentator, asked: “Do we really think – and I say this sincerely – do we really think 17-year-olds on their own are going to plan a nationwide rally?”
The resilience of misinformation, despite efforts by the technology behemoths to eliminate it, has become a real-time case study of how the companies are constantly a step behind in stamping out the content.
At every turn, trolls, conspiracy theorists and others have proved to be more adept at taking advantage of exactly what the sites were created to do – encourage people to post almost anything they want – than the companies are at catching them.
“They’re not able to police their platforms when the type of content that they’re promising to prohibit changes on a too-frequent basis,” said Mr Jonathon Morgan, founder of a company that tracks disinformation online, of Facebook and YouTube.
The difficulty of dealing with inappropriate online content stands out with the Parkland shooting because the tech companies have effectively committed to removing any accusations that the Parkland survivors were actors, a step they did not take after other recent mass shootings, such as last October’s massacre in Las Vegas.
Mr Morgan said in the past, the companies typically addressed specific types of content only when it was illegal – posts from terrorist organisations, for example.
Facebook and YouTube’s promises follow criticism in recent months over how their sites can be gamed to spread Russian propaganda, among other abuses.
The companies have said they are betting big on artificial intelligence systems to help identify and take down inappropriate content, though that technology is still being developed.
In the meantime, they have hired, or said they plan to hire, more people to comb through what is posted to their sites. Facebook said it was hiring 1,000 new moderators to review content. YouTube has said it plans to have 10,000 moderators by year end.