Google And Facebook Under Fire For Spreading Hoaxes During Las Vegas Attack
Source: Jessica Lachenal
People scrambling for shelter at the Route 91 Harvest country music festival after gunfire was heard on October 1, 2017 in Las Vegas, Nevada.
Immediately following the first announcements that a terrorist had opened fire on Las Vegas concert-goers, people took to the internet for answers. Instead, what they found was Google and Facebook surfacing dubious news results from even more dubious web sources.
The two internet giants are under fire for displaying highly questionable news items from far right-wing pundit websites and — get this — 4chan as news search results in the hours following the shooting. Those sites and the notoriously anarchic, conspiracy-theory-laden image board were circulating "breaking news" that wrongly identified the shooter, along with other misinformation. It was only after mainstream media websites began covering the story late Sunday and early Monday morning that these results were phased out and replaced by more reputable sources.
Perhaps more worryingly, it wasn't revealed until hours later that the person identified by these sites actually had nothing to do with the Las Vegas attack.
According to ABC 7, a Google spokesperson commented on the inclusion of 4chan's "/pol/" or "Politically Incorrect" board in the news section, saying, "Within hours, the 4chan story was algorithmically replaced [emphasis ours] by relevant results. This should not have appeared for any queries, and we'll continue to make algorithmic improvements to prevent this from happening in the future."
For one thing, there's a huge problem in how the Google spokesperson referred to the 4chan post as a "story," as if 4chan should in any way be counted as a news site. And it's in this lending of legitimacy that we see the problem running underneath nearly all discussions of fake news: by relying on algorithms and other computer-only methods to decide what counts as news, both Google and Facebook have opened themselves up to highlighting "news sites" that would never pass muster if a human being were involved in the vetting process. Arguably, that over-reliance on algorithms is the root cause of these failures.
Consider Facebook's own past struggles with fake news. The company first made headlines last year when it replaced its human news curators and editors — responsible for vetting what appears in the "Trending" section on your Facebook page — with an algorithm, in an attempt to thwart accusations from the right that those editors had a liberal bias. A follow-up investigation by Facebook argued that there was no bias involved in their work, but many people remained unsatisfied by that explanation.
Critics later voiced their displeasure with Facebook's decision to use an algorithm, warning that such a system could easily be gamed by anybody motivated enough to try. More recently, The Verge reported on a Facebook redesign of the Trending News page, sharing an explanation from Facebook about how stories are chosen. According to the company, a story is selected based on "the engagement around the article on Facebook, the engagement around the publisher overall, and whether other articles are linking to it." Again, no mention of human oversight.
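To see why critics call an engagement-only signal so easy to game, consider a minimal, purely hypothetical sketch of that kind of ranking. The weights, field names, and numbers below are our own illustrative assumptions, not Facebook's actual system; the point is simply that a score built only from engagement and links has no notion of whether a source is reputable.

# Hypothetical sketch of an engagement-weighted trending score, loosely based on
# the criteria Facebook described (article engagement, publisher engagement,
# inbound links). Weights and field names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    article_engagement: int    # shares/reactions/comments on this article
    publisher_engagement: int  # overall engagement the publisher receives
    inbound_links: int         # other articles linking to this one

def trending_score(story: Story) -> float:
    # Purely engagement-driven: nothing here checks whether the source is
    # reputable or the claim is verified, so coordinated engagement can
    # push an unvetted rumor above a verified report.
    return (
        1.0 * story.article_engagement
        + 0.5 * story.publisher_engagement
        + 2.0 * story.inbound_links
    )

stories = [
    Story("Verified wire report", article_engagement=800, publisher_engagement=5000, inbound_links=3),
    Story("Unvetted forum rumor", article_engagement=4000, publisher_engagement=200, inbound_links=40),
]

for s in sorted(stories, key=trending_score, reverse=True):
    print(f"{trending_score(s):>8.1f}  {s.title}")

With these made-up numbers, the unvetted rumor outscores the verified report (4,180 to 3,306), which is exactly the failure mode critics warned about once human oversight was removed.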
There's another worrying facet to the rapid spread of fake news about the Las Vegas attack. Another site included in the early search results was the Gateway Pundit, which took a deeper dive into the incorrectly named suspect, describing him as "a far-left loon" and "a Democrat who liked Rachel Maddow, MoveOn.org, and associated with the anti-Trump army," reports CNN (warning: video containing gunshot sounds autoplays on the site). Other right-wing sites latched onto this narrative with little to no confirmation, and ironically, it was those very sites that helped the fake news go viral.
This site and a few other dubious ones turned up on Facebook's Crisis Response page, where people can check in to let family and friends know they're safe. Along with check-ins, visitors are offered news regarding the relevant crisis; this is where those "stories" turned up. Facebook commented on their inclusion to Fast Company, saying, "We are working to fix the issue that allowed this to happen in the first place and deeply regret the confusion this caused."
Both Google's and Facebook's comments about "working to fix the issue" echo previous promises from both companies to "do better" when it comes to their role in the news media. Facebook in particular has been in the hot seat of late as it continues to assist the federal government in its investigation into Russia's efforts to influence the 2016 election. Numerous promises to work on solutions have poured out of Facebook, from both CEO Mark Zuckerberg and vice president of global policy Joel Kaplan, and if this mistake regarding Las Vegas attack news is anything to go by, those promises have yet to be fulfilled.
The LA Times spoke with Gabriel Kahn, a professor at the USC Annenberg School for Communication and Journalism, regarding the use of algorithms to highlight news stories. He summed up the argument against algorithm-based news curation, saying, "These algorithms were designed with intent and the intent is to reap financial reward. They're very effective, but there's also collateral damage as a result of designing platforms that way. It's not good enough to say, 'Hey, we're neutral. We're simply an algorithm and a platform.' They have a major responsibility that they still have not fully come to terms with."