Facebook shows you what you want to see post-election
Source: WTNH
Your Facebook feed might be one of angst and despair, or of celebrations and “I told you so’s.” It depends on the people you’re friends with and the online community you’ve created with your clicks, likes and shares.
Facebook’s algorithm knows what you like based on the videos you watch, the people you talk to, and the content you interact with. It then shows you more of the same, creating what are known as “filter bubbles”: you begin to see only the content you like and agree with, while Facebook hides dissenting points of view.
This means news on Facebook comes with confirmation bias — it reinforces what you already think is true — and people are increasingly frustrated.
Facebook denies it’s a media company, yet almost half of U.S. adults get news from Facebook.
When Facebook fired its human curators and began to rely on algorithms to surface popular stories earlier this year, fake news proliferated.
Viral memes and propaganda spread among people with similar beliefs and interests. It’s cheaper and easier to create and spread ideological disinformation than deeply researched and reported news. And it comes from all over: teens in Macedonia were responsible for a large portion of fake pro-Trump news, according to a BuzzFeed analysis.
Filter bubbles became especially problematic during the presidential election.
Hyperpartisan news sites and fake websites distributed false stories about voter fraud, election conspiracies, and the candidates’ pasts that spread like wildfire on Facebook. Such misinformation was more prevalent on right-leaning Facebook pages. As CNNMoney’s Brian Stelter said in response to the growing number of false viral stories, people should adopt a “triple check before you share” rule.
Today, many people are shocked by Trump’s victory. Words of fear and sorrow fill their Facebook feeds, and even those with thousands of friends are probably only seeing posts that echo their feelings.
But if you voted for Trump, chances are your feed reflects the opposite. You might see a cascade of #MakeAmericaGreatAgain hashtags and friends celebrating.
Facebook will likely face questions about its responsibility to weed out fake sites and false stories. The company did not respond to questions about whether it plans to modify the news feed algorithms. On an earnings call in July, Facebook CEO Mark Zuckerberg argued that social media sites actually provide more diversity of opinion than traditional news outlets.
But Facebook is just the vehicle through which friends and family share news. It’s not necessarily about trusting the news people see on Facebook, but about trusting the people who share it.
The U.S. presidential election starkly illustrated the problems with filter bubbles and personalized social networks. Almost two billion people use the social network, where friends and family share and nurture similar beliefs and where false news snowballs. It raises questions about the civic responsibility of the world’s biggest platform, and whether automated fact-checking could have changed an election.