

Facebook fake news did not influence US election - Zuckerberg
Source: Newman




Facebook founder and CEO Mark Zuckerberg has hit back at "crazy" claims that fake news posted on the platform contributed to the election of Donald Trump to the White House.

Several articles published following the shock result have pointed the finger at social networks for failing to tackle biased and factually incorrect stories, suggesting that the growing number of people who get their news from social platforms had influenced the result.

But speaking at a technology conference broadcast live on his Facebook page, Mr Zuckerberg said: "Personally, I think the idea that fake news on Facebook - of which it's a very small amount of the content - influenced the election in any way is a pretty crazy idea."

He added: "Part of what I think is going on here is that people are trying to understand the result of the election. But I do think there is a certain profound lack of empathy in asserting that the only reason that someone could have voted the way they did is because they saw some fake news.

"If you believe that, then I don't think you have internalised the message that Trump supporters are trying to send in this election."

According to data from the Reuters Institute, almost half (46%) of people in the US use social media for news every week - up from 26% three years ago.

Facebook has come in for particular criticism in the past year over its trending module, which is reported to have promoted several bogus news stories since human curators were removed in favour of automation.

While some of these articles have been conservative in nature, not all have been: one popular meme that circulated throughout the US election campaign contains false quotes attributed to the president-elect.

In it, he is wrongly said to have called Republican voters "the dumbest group of voters in the country".

Nic Newman, a digital strategist and research associate at the Reuters Institute, said that "for many people, Facebook is now the front page of the internet and here news is selected by an algorithm not an editor - with Mark Zuckerberg as editor in chief".

"In that context, the impact of traditional - fact and reason-based - news brands is diluted, flattened because it looks pretty similar to a huge range of other content that include disreputable news, partial news and fake news. In the process trust in the media as a whole is affected along with its impact.

"The editorial endorsement of a newspaper for one candidate or another has become irrelevant because consumers make up their own minds now on the basis of a huge stream of information coming at them every day."

Others have expressed concern that Facebook's algorithms - promoting content that past behaviour suggests a user will like - can lead to an "echo chamber" effect, meaning that people are often exposed only to opinions they tend to agree with.
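The "echo chamber" concern is essentially a feedback loop: a ranking system that favours content resembling what a user engaged with before will, over time, surface less and less of anything else. The toy Python sketch below illustrates that dynamic with made-up numbers; the post data, the affinity scores and the update rule are all hypothetical and are not based on Facebook's actual algorithms.

```python
import random

# Illustrative only: a toy ranking loop showing how scoring posts by past
# engagement can narrow the mix of viewpoints a user sees over time.
# All data and weights here are invented for this sketch.

POSTS = [{"id": i, "leaning": random.choice(["left", "right"])} for i in range(200)]

def rank_feed(posts, affinity, top_k=10):
    """Order posts by how closely they match the user's inferred preferences."""
    scored = sorted(posts, key=lambda p: affinity.get(p["leaning"], 0.0), reverse=True)
    return scored[:top_k]

def simulate(days=30):
    # Start with only a mild preference for one side.
    affinity = {"left": 0.55, "right": 0.45}
    for _ in range(days):
        feed = rank_feed(POSTS, affinity)
        # Assume the user engages mostly with posts matching their lean,
        # and each engagement nudges the inferred affinity further that way.
        for post in feed:
            if affinity[post["leaning"]] >= 0.5:
                affinity[post["leaning"]] += 0.01
        total = sum(affinity.values())
        affinity = {k: v / total for k, v in affinity.items()}
    return affinity

if __name__ == "__main__":
    print(simulate())  # the initially preferred leaning comes to dominate the feed
```

Even starting from a small initial tilt, the loop steadily amplifies it, which is the pattern critics describe when they talk about filter bubbles.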

Richard Sambrook, professor of journalism at Cardiff University, is among those concerned.

He said: "There is no doubt that the online filter bubble polarises views and hardens divisions. And most people don't recognise it.

"Try to find views different from your own on Facebook or Twitter - it's not impossible, but it is really hard.

"The algorithms just serve up what they think you want to see. But diversity of views is very important - it builds understanding and tolerance. Understanding and penetrating these echo chambers is in my view an urgent task."

Either way, Professor Sambrook claims Facebook is "trying to duck the responsibility for what they have enabled" by insisting it is just a technology firm, rather than a media company.

But Mr Newman says research is divided on the "echo chamber" effect, adding: "Yes, the choices that Facebook make mean that you are more likely to see things you like, but there is also evidence that digital and social media expose you to more sources than ever before.

"We should remember that a few decades ago most people only took one newspaper with a defined political perspective. If we want to be, we can be far better informed today."

On the "echo chamber" issue, Mr Zuckerberg told the Techonomy conference: "All the research we have suggests that this isn't really a problem... but for whatever reason we've had a really hard time getting that out."

He said the company's research had found that "almost everyone" had some friends - typically at least 10% - who leaned the other way politically.

He added that information received from Facebook was "going to be inherently more diverse" than that previously received through a small number of TV networks or newspapers, many of which had their own political standpoints.

In a statement, Adam Mosseri, Facebook's VP of product management, said: "We take misinformation on Facebook very seriously. We value authentic communication, and hear consistently from those who use Facebook that they prefer not to see misinformation.

"In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution.

"In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing.

"Despite these efforts we understand there's so much more we need to do, and that is why it's important that we keep improving our ability to detect misinformation. We're committed to continuing to work on this issue and improve the experiences on our platform."

