Content kingmaker: quality or webpage position?
Source: University of Southern California
Position bias: variation in votes based on position. Credit: PLOS ONE
In today's Information Age, it's easy to get overwhelmed by online content. On YouTube alone, more than 100 hours' worth of video is uploaded every minute. Showcasing the most interesting content lets providers signal quality to their audiences and encourages users to stay on the site, consuming content and earning advertising dollars for the provider. However, this influx of information makes it difficult for both content providers and users to determine what is interesting and worth consuming.
Due to the sheer volume of submitted content, some providers, such as Reddit, depend on user ratings, or peer recommendations, to surface and rank their most interesting content for other users. Providers rely on this collective intelligence to identify new, high-quality content, but collective judgments made via peer recommendation can be biased and inconsistent. In practice, peer recommendation often leads to "winner-take-all" and "irrational herding" behaviors, in which similar content receives widely different numbers and types of recommendations.
In a study published this week in the peer-reviewed online journal PLOS ONE, researchers evaluated several popular peer recommendation strategies and how well they identify interesting content. Dr. Kristina Lerman, a computer science professor at the USC Viterbi School of Engineering and a Project Leader at USC Viterbi's Information Sciences Institute, and co-author Tad Hogg, a Research Fellow at the Institute for Molecular Manufacturing in Palo Alto, CA, first determined what kind of content users prefer and then evaluated how position on a webpage affects collective judgments about content.
"Psychologists have known for decades that position bias affects perception: people pay more attention to items at the top of a list than those below them," said USC Viterbi computer science professor, Dr. Kristina Lerman. "We were surprised, however, how strongly this affected user behavior and the outcomes of recommendation."
Lerman and Hogg found that, because of position bias, consumers pay roughly five times more attention to material near the top of a webpage than to material farther down. This bias is a potential problem for sites that rely on peer recommendation alone. On Reddit, for example, posts appear in order of popularity, derived from users' up-votes and down-votes, with more popular posts nearer the top of the page. Due to position bias, users are more likely to see, consume, and recommend already-popular content positioned near the top, creating a runaway feedback loop that further amplifies its popularity at the expense of potentially more interesting content farther down the page.
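This runaway loop is easy to reproduce in a toy simulation. The sketch below is illustrative only, not the paper's model: the item count, vote probability, and attention curve are all assumptions, with the attention curve chosen so the top slot is viewed about five times as often as the bottom slot, matching the attention gap reported above. All items have identical intrinsic quality, yet ranking by total votes lets whichever item gets an early random lead lock in the top position and keep harvesting votes.

```python
import random

random.seed(42)

N_ITEMS = 20      # hypothetical: items of identical intrinsic quality
N_USERS = 20000   # hypothetical: sequential visitors to the page
P_VOTE = 0.2      # hypothetical: chance a viewed item earns a vote

def p_view(rank):
    # Position bias: attention decays with rank so that the top slot
    # is viewed ~5x as often as the bottom slot.
    return 0.5 / (1.0 + 4.0 * rank / (N_ITEMS - 1))

votes = [0] * N_ITEMS
order = list(range(N_ITEMS))  # current page order, top of page first

for _ in range(N_USERS):
    for rank, item in enumerate(order):
        if random.random() < p_view(rank) and random.random() < P_VOTE:
            votes[item] += 1
    # Reddit-style policy: re-rank by aggregate popularity for the next visitor.
    order.sort(key=lambda i: votes[i], reverse=True)

totals = sorted(votes, reverse=True)
print("votes, top 3 items   :", totals[:3])
print("votes, bottom 3 items:", totals[-3:])
```

Despite equal quality, the items that happen to collect a few early votes end up with several times more votes than those stuck at the bottom, because the top slots get the lion's share of attention.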
In their study, Lerman and Hogg demonstrated that ordering content by recency of recommendation, rather than by aggregate popularity (total 'likes' or recommendations), yields better estimates of what users actually find interesting and would prefer to consume.
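Under the same assumed attention curve, switching the ranking policy from aggregate popularity to recency of last recommendation keeps the page churning, so no item can monopolize the high-attention slots. The sketch below reuses the hypothetical parameters from the previous sketch; the only substantive change is the sort key.

```python
import random

random.seed(42)

N_ITEMS = 20      # same hypothetical setup as the sketch above
N_USERS = 20000
P_VOTE = 0.2

def p_view(rank):
    # Same assumed 5x top-to-bottom attention falloff as before.
    return 0.5 / (1.0 + 4.0 * rank / (N_ITEMS - 1))

votes = [0] * N_ITEMS
last_rec = [0] * N_ITEMS      # timestamp of each item's latest recommendation
order = list(range(N_ITEMS))

for t in range(1, N_USERS + 1):
    for rank, item in enumerate(order):
        if random.random() < p_view(rank) and random.random() < P_VOTE:
            votes[item] += 1
            last_rec[item] = t
    # Recency policy: most recently recommended items move to the top,
    # giving every item fresh chances at the high-attention slots.
    order.sort(key=lambda i: last_rec[i], reverse=True)

totals = sorted(votes, reverse=True)
print("votes, top 3 items   :", totals[:3])
print("votes, bottom 3 items:", totals[-3:])
```

Because every recommendation bumps an item back to the top, items cycle through the high-attention slots instead of one item camping there, and the final vote totals come out far more even, which is essentially the behavior Lerman credits Twitter with below.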
Twitter's system of sharing and recommending content, in contrast, avoids these "winner-take-all" and "irrational herding" effects by presenting content in chronological order, based on the time of recommendation. Retweets, which function as recommendations, bring older posts back to the top of a user's newsfeed, helping to reduce the herding effect.
"Twitter does the right thing when it pushes newly retweeted posts to the top of the followers' screens, giving them another chance to discover interesting content," said Lerman.
By influencing the peer recommendations that determine how content is ranked, position bias can create a cycle that shuts out quality content. By understanding the factors that shape peer recommendation, providers can more effectively harness consumers' collective judgments about which content is worth their time and attention.