Source: Jacob Axelrad




In an attempt to win back the good graces of people angered by its controversial emotional experiment this summer, Facebook said Thursday that, in the future, its research on users will be subject to more rigorous internal guidelines and standards.

Any research involving content that "may be considered deeply personal" will be assessed more closely before it can begin. Among the steps it is taking, Facebook says its research projects will now be reviewed by a panel of Facebook researchers, its research practices will become part of the company's "six-week training program," and its research will now be available on a new website.

Beyond those vague new guidelines, however, Facebook is still playing it close to the vest when it comes to its research practices, and the company made no promises about seeking user consent before conducting experiments on users, experts note.

"It's clear that Facebook is chastened by the backlash they received to the emotions study. That said, this response feels quite weak to me. Basically, they've promised more in-depth internal review, which they should be doing if only to prevent another spell of bad PR. But they've not given users the option to opt out of such research, they've not announced involving outside experts in reviews, and they've not addressed questions of whether certain types of experiments are ethical," says Ethan Zuckerman, director of the Center for Civic Media at the Massachusetts Institute of Technology. "I'm glad they've recognized they have a problem, but the steps announced seem modest to me."

The study that sparked Facebook's policy change, published in an academic journal in June, deliberately manipulated users' news feeds for one week in 2012 without their knowledge. The purpose was to test whether people responded differently to an excess of positive or negative content shown in their feeds. It turns out, to a degree, they do. As the experiment showed, more positive content resulted in more positive reactions; more negative content resulted in more negative reactions.

But when people learned Facebook was messing around with their news feeds (for purposes other than the standard targeted advertising, of course), they were not happy. A worldwide backlash erupted. Users took to Twitter to vent their frustration. A movement urging users to quit Facebook quickly gained popularity. Facebook was roundly criticized for experimenting on users without their consent. And, finally, Facebook apologized.

Granted, Internet companies experimenting with users' data is not new and is not unique to Facebook. Shortly after news surfaced about the Facebook experiment, the dating site OKCupid came out to publicly defend its own manipulation of user information. Google analyzes Gmail messages to target users with ads and search results. And Amazon uses customer information to recommend products it thinks people will like.

Now, Facebook appears to be promising a more open, thoughtful approach to its future research. In a statement, the company's chief technology officer, Mike Schroepfer, said Facebook has learned from its past missteps.

"We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," he said in the statement. "It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would have also benefitted from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it."

Nonetheless, Facebook's business model relies on constant experimentation. And with roughly a seventh of the world's population using its service, it is unlikely to alter its successful practices due to the consternation of users.

"Facebook has every reason to manipulate the News Feed to optimize for whatever user engagement metrics correspond to the best returns for advertisers, which in turn correspond to the best returns for Facebook," writes Marcus Wohlsen in Wired. "And it has every reason to use other experiments in an effort to improve other parts of its operation. This is the way many online companies work."

Demonstrating its ever-expanding scope, Facebook on Friday won approval from the European Union to complete its $19 billion purchase of the messaging service WhatsApp, the company's largest acquisition so far.

This comes at a time when many disgruntled Facebook users are seeking alternatives. This week, Facebook apologized to the LGBT community after drawing criticism for its real-name policy, which requires people to use the site under their legal names. The policy upset many users, especially members of the LGBT community who go by names different from their legal names.

Notably, social media newcomer Ello has attracted attention for positioning itself as the anti-Facebook. The invitation-only site features no ads and does not turn over user information to third parties. And Ello does not require that people use their real names, earning it praise from the LGBT community.

According to Paul Budnitz, Ello's founder, the site has been signing up new users at a rate of around 4,000 people per hour, though that number has not been independently verified.