Facebook is in the business of adding to its user base however possible in order to sell more advertising.
The two sides overlap in their devotion to attracting attention and cash. As the following article explains, it is doubtful that Facebook knowingly discriminated against any particular point of view or ideology. Its processes are designed to emphasize what works. But the caution may be that there is no such thing as an unbiased algorithm - however unhappy the implications for those espousing a specific cause - especially when it's rooted in something as expressly intended to generate inclusiveness as a social network. JL
Deepa Seetharaman reports in the Wall Street Journal:
Facebook says three factors have an outsize impact on a post’s relevancy score: the author of the post, the type of post and whether a user’s friends have liked, commented or otherwise engaged with that post. A post is deemed to be read once most of it has been on the user’s screen for a specified, but unknown, amount of time. Other signals include how much time a user spends reading posts from that author and how often they are shared. The network matters, too.
Facebook Inc. is largely ruled by algorithms. But ultimately people must make judgments about what to show its 1.6 billion users, and how.
That mutual dependence on software and people came into view this week following allegations that Facebook workers manipulated the “trending topics” feature by suppressing conservative viewpoints.
To decide what to show on its main news feed, Facebook relies on an algorithm, or computer program, that combs through roughly 100,000 signals and creates a “relevancy score” for each post for a specific user, Facebook says.
An algorithm also governs what users see as trending topics. No two users see the same news feed or trending items.
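To make the ranking idea concrete, here is a minimal sketch of per-user relevancy scoring. The real system weighs roughly 100,000 proprietary signals; the four signal names, the weights and the sample posts below are all invented for illustration, not Facebook's actual model.

```python
# Minimal, hypothetical model of per-user relevancy scoring.
# The real algorithm weighs ~100,000 proprietary signals; the signal
# names and weights here are invented for illustration only.

WEIGHTS = {
    "author_affinity": 3.0,    # how often this user engages with the author
    "post_type_match": 2.0,    # does the user favor this post type?
    "friend_engagement": 2.5,  # likes/comments from the user's friends
    "recency": 1.0,            # newer posts generally rank higher
}

def relevancy_score(signals: dict[str, float]) -> float:
    """Combine normalized (0..1) signals into a single score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

def rank_feed(candidates: dict[str, dict[str, float]]) -> list[str]:
    """Order one user's candidate posts by descending relevancy."""
    return sorted(candidates, key=lambda p: relevancy_score(candidates[p]),
                  reverse=True)

feed = rank_feed({
    "friend_photo": {"author_affinity": 0.9, "friend_engagement": 0.7,
                     "post_type_match": 0.4, "recency": 0.2},
    "page_video":   {"author_affinity": 0.3, "friend_engagement": 0.1,
                     "post_type_match": 0.8, "recency": 0.9},
})
print(feed)  # ['friend_photo', 'page_video']
```

Because the signal values differ for every user, no two users would get the same ordering, which matches the article's point that no two news feeds are alike.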
Facebook has become an important source of news for many users, who spend an average of 50 minutes a day on the social network, according to the company.
Current and former employees say the algorithm is based on years of research observing how users interact with the network, and continual testing. But science goes only so far, and intuition plays a role as well, those current and former employees say.
“Facebook’s news feed team needs a human touch because ranking based purely on algorithms would feel unnatural, the same way that robots today do not appear quite human,” said SC Moatti, a former Facebook product leader and author of “Mobilized,” a book about the business of mobile.
Facebook’s ultimate goal is to get more users to spend more time on the network, so the company spends a lot of time and energy deciding what to show users. No user can see everything that their friends post. In 2013, Facebook said there were, on average, 1,500 potential stories in a user’s feed from friends and pages they follow.
Facebook says three factors have an outsize impact on a post’s relevancy score: the author of the post, the type of post and whether a user’s friends have liked, commented or otherwise engaged with that post. A post is deemed to be read once most of it has been on the user’s screen for a specified, but unknown, amount of time.
Other signals include how much time a user spends reading posts from that author and how often they are shared. Posts from new Facebook friends get higher priority, as Facebook wants users to build connections with these friends, Ms. Moatti said.The network matters too: Users on a slower network may be shown more text posts; faster speed might yield more video. A user might also be shown more video if they tend to watch videos from start to finish, Facebook says.
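Two of the mechanics just described - counting a post as read based on time on screen, and adapting post types to connection speed - can be sketched in a few lines. The 4-second threshold and 1 Mbps cutoff below are assumptions for illustration; Facebook does not publish the real values.

```python
# Hypothetical sketch of read detection and network-aware post selection.
# The threshold and bandwidth cutoff are invented; the real values
# are undisclosed.

READ_THRESHOLD_SECONDS = 4.0  # assumed; the actual value is unpublished

def was_read(seconds_on_screen: float) -> bool:
    """Treat a post as read once most of it has been visible this long."""
    return seconds_on_screen >= READ_THRESHOLD_SECONDS

def preferred_post_types(bandwidth_mbps: float) -> list[str]:
    """On slower networks favor text posts; faster ones get richer media."""
    if bandwidth_mbps < 1.0:  # assumed cutoff for a "slow" connection
        return ["text", "link"]
    return ["video", "photo", "text", "link"]

print(was_read(5.2))              # True
print(preferred_post_types(0.5))  # ['text', 'link']
```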
Facebook launched the news feed almost 10 years ago. It was originally pitched as a personalized newspaper that would surface the most relevant and interesting posts from your network. In early 2009, Facebook began displaying posts purely in reverse chronological order, but reversed course later that year. In 2010, the social network split news feed into “top news” and “most recent” and moved to a new ranking system based on machine learning.
Facebook tweaks the algorithm often - it has announced updates roughly once a month over the past 2½ years; many of those changes were designed to make the feed less promotional and repetitive.
Facebook launched the trending topics feature in 2014 as part of an effort to compete with sites like Twitter as a platform for conversation about real-time news. It assembled a team of contract workers in New York and tasked them with sifting through the most commonly discussed topics on Facebook.
Facebook denied a report from tech blog Gizmodo that its “news curators” altered the trending topics list for political reasons but acknowledged that it deploys people to manage the computer-generated lists of popular topics on the social network.
Reviewers look at hundreds of potential topics, many of which are slight variations of one another, according to one person who worked on the team. The reviewers were urged to combine topics and consolidate them under a single hashtag. For example, a popular video of a giant rat dragging a pizza through a New York subway stop appeared under topics like “pizza,” “rat” and “pizza rat.” The reviewer chose the hashtag “pizza rat.”
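As a rough illustration of that consolidation step, here is a sketch in which a reviewer's decision to merge variant topics under one hashtag is recorded in a simple mapping. The data structure and function are hypothetical, not Facebook's actual tooling.

```python
# Hypothetical sketch of topic consolidation: variant topic strings
# that refer to the same story are merged under a single canonical
# hashtag chosen by a reviewer.

canonical: dict[str, str] = {}  # variant topic -> reviewer-chosen hashtag

def consolidate(variants: list[str], chosen_hashtag: str) -> None:
    """Record the reviewer's decision to merge variants into one tag."""
    for variant in variants:
        canonical[variant.lower()] = chosen_hashtag

consolidate(["pizza", "rat", "pizza rat"], "#pizzarat")
print(canonical["rat"])  # '#pizzarat'
```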
Reviewers also had to check that trending news events weren’t hoaxes; at one point, the team had to corroborate them with stories published by one of 1,000 publications deemed credible by Facebook. The person didn’t know who assembled the list.
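A sketch of that corroboration check, assuming the list works as a simple whitelist of outlet domains. The outlets and helper function below are stand-ins; the real 1,000-publication list was not public.

```python
# Hypothetical sketch of the hoax check: a trend passes only if at
# least one covering story comes from a credible outlet. The set
# below is a stand-in for the ~1,000-outlet list.

CREDIBLE_OUTLETS = {"wsj.com", "nytimes.com", "bbc.com"}

def is_corroborated(story_domains: list[str]) -> bool:
    """True if any covering story comes from a credible outlet."""
    return any(domain in CREDIBLE_OUTLETS for domain in story_domains)

print(is_corroborated(["someblog.example", "wsj.com"]))  # True
print(is_corroborated(["someblog.example"]))             # False
```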
Each curator had to develop 20 trends over an eight-hour shift. The former worker said this was sometimes difficult because of the number of duplicates and hoaxes.
Facebook declined to comment on how the review process works.
Other topics—such as #BlackLivesMatter and #MakeAmericaGreatAgain—are always popular, regardless of recent news. Those hashtags are “blacklisted” unless a news event drives them back into the public sphere. At that point, a reviewer can allow the hashtag to trend again, said the person who worked on the team.
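The gating logic described there can be sketched as follows: evergreen hashtags stay off the trending list unless a reviewer has re-enabled them in response to a news event. The data structures and function are invented for illustration.

```python
# Hypothetical sketch of blacklist gating for evergreen hashtags.
# Reviewers can re-enable a tag when a news event justifies it.

BLACKLISTED = {"#BlackLivesMatter", "#MakeAmericaGreatAgain"}
news_event_override: set[str] = set()  # tags a reviewer has re-enabled

def may_trend(hashtag: str) -> bool:
    """Evergreen tags trend only after a reviewer re-enables them."""
    return hashtag not in BLACKLISTED or hashtag in news_event_override

news_event_override.add("#BlackLivesMatter")  # reviewer action after a news event
print(may_trend("#BlackLivesMatter"))          # True
print(may_trend("#MakeAmericaGreatAgain"))     # False
```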