Pay no attention to the man behind the algorithm
by Heather Zeiger | September 23, 2016
If Facebook is all about crafting an image, the company may need to take a few lessons from its users. This past summer, Facebook’s Trending feature came under fire from Gizmodo for allegedly stifling conservative news outlets. The Trending team had also inserted news items that did not meet the threshold for a trending topic.
Trending is a news aggregator visible on the right side of the screen when you view Facebook on a non-mobile device. As Gizmodo reported, the news was sifted by a group of news curators composed mostly of young, Ivy League-educated journalists.
The fact that Facebook was curating news was not the biggest problem. Traditional media outlets have been doing this for years. Where Facebook fell afoul of its audience is in claiming to be objective and appearing to be data-driven rather than human-driven.
When Facebook announced Trending in 2014, Zuckerberg said that it was a “new product that’s designed to surface interesting and relevant conversations in order to help you discover the best content from all across Facebook.”
Mark Zuckerberg and Facebook did their due diligence to address the accusations of bias, meeting with several well-known conservatives and firing the Trending team. The journalists were replaced with engineers who would oversee the algorithm the journalists had apparently been training.
The new system did not work as well as hoped. On its first day, Trending posted an inappropriate YouTube video. While launch-day glitches can be forgiven, even if they cannot be un-seen, the algorithm continued to prove it was not up to the task by posting a fake news story about a Fox News reporter supposedly fired for supporting Hillary Clinton.
Given that this whole affair started with allegations of partisan bias, this was a particularly unfortunate mistake on the algorithm’s part. Then, on September 11th, Trending posted a story promoting a 9/11 conspiracy theory, a so-called “truther” story. Finally, Trending’s news story about the Apple iPhone 7 came from a satirical website that says on its home page not to confuse its content with real news.
The backlash over this episode was particularly harsh, and some people felt betrayed by Facebook for its lack of transparency.
The Pew Research Center has conducted several surveys on news consumption and social media. According to a 2013 poll, 43% of Facebook users get their news from Facebook, but only 4% of Facebook news consumers consider it the most important way to get their news.
Even though Facebook is not people’s primary news outlet, it has the attention of millions. About 30% of the population overall has received some portion of their news from Facebook, making it one of the largest single influences on news coverage and public opinion.
However, a poll Pew Research conducted after the Trending story broke showed that only about 7% of adults who get their news from social media trust the information they find there.
Customer expectations and company obligations
There is a saying in the tech industry that if you are not paying for it, then you are the product, not the customer. Facebook, whose motto is “It’s free and always will be,” has a trove of data that you have agreed to provide. Every time you upload a picture of your child’s first day of school, or your meal with your spouse at a swanky restaurant, or the sunset from your hotel window, you have just given Facebook, and its advertisers, information for targeted marketing.
What do you get out of sharing this personal information with Facebook, aside from targeted advertising? You get the Facebook experience. And with that experience come certain expectations, whether the company explicitly states them or implies them by way of mission statements.
Consider Disney World as an example. Like Facebook, Disney World has no equivalent competitor. There may be other theme parks, but you cannot get “the Disney World experience” anywhere else. You pay the ticket price with certain expectations beyond the “terms of service” in mind. Is Disney obligated to meet our expectations? Technically, no, but meeting them is good for business.
Facebook’s stated mission is to “give people the power to share and make the world more open and connected.” In an interview about whether Trending news was suppressing conservative views, Mark Zuckerberg said, “Facebook stands for giving everyone a voice.” When Facebook announced Trending, it painted a picture of the user experience:
To the right of your News Feed, you’ll see a list of topics that have recently spiked in popularity. The list is personalized, including topics based on things you’re interested in and what is trending across Facebook overall. Each topic is accompanied by a headline that briefly explains why it is trending. You can click on any headline to see the most interesting posts from your friends or Pages that are talking about that particular topic.
It is implied that Trending, as part of the Facebook experience, will be personalized and based on aggregate data. Does that mean Facebook is obligated to reveal who curates its news and to provide bipartisan coverage? No, but it’s good for business if it does.
Zuckerberg says that Facebook is a technology company, not a media company: “We build tools; we do not produce any content.” With this statement he abdicates responsibility for content by casting Facebook as just the messenger, so to speak.
But when you have the eyes of over a billion people (as of June 2016, Facebook reports 1.03 billion mobile daily active users, on average), there is some responsibility for what content becomes Trending news.
Media ethicist Kelly McBride says that Facebook has an obligation to be transparent about how it curates its stories: “As the No. 1 driver of audience to news sites, Facebook has become the biggest force in the marketplace of ideas. With that influence comes a significant responsibility.”
Beyond the Algorithm
I update a news aggregator website on technology, medicine, and ethics. My job is to sift through my alerts, which serve as an unsophisticated algorithm, and run them through a metric to determine which stories make the cut and which ones do not. One thing that I have noticed over the years of updating the news is that human oversight is absolutely necessary.
For example, if “cyborg” is one of my tags, I will get pings for everything from DC Comics to MMA (mixed martial arts) fighting. I could teach the algorithm to ignore these, but that does not stop the occasional article calling some politician a cyborg (Mitt Romney and Hillary Clinton have both been dubbed cyborgs).
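The kind of tag-plus-blocklist filtering described above can be sketched as a toy keyword filter. The tag, the blocklist terms, and the function names here are illustrative assumptions, not the author’s actual setup:

```python
# A hypothetical sketch of tag-based alert filtering.
# Blocklist terms and headlines are illustrative only.

BLOCKLIST = {"dc comics", "mma", "mixed martial arts"}

def matches_tag(headline: str, tag: str) -> bool:
    """An alert 'pings' when the tag appears anywhere in the headline."""
    return tag.lower() in headline.lower()

def passes_filter(headline: str, tag: str) -> bool:
    """Keep a pinged story unless a blocklisted phrase also appears.

    This screens out DC Comics and MMA stories, but an op-ed calling
    a politician a cyborg still sails through -- which is exactly why
    a human has to review the results.
    """
    if not matches_tag(headline, tag):
        return False
    text = headline.lower()
    return not any(term in text for term in BLOCKLIST)

print(passes_filter("New cyborg prosthetics restore touch", "cyborg"))  # True
print(passes_filter("Cyborg wins MMA title bout", "cyborg"))            # False
print(passes_filter("Is Mitt Romney a cyborg?", "cyborg"))              # True (false positive)
```

The last line shows the limitation: naive substring matching cannot tell a robotics story from a political insult, so the metric the author mentions, and the human applying it, still do the real work.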
Additionally, part of my job is to figure out what distinguishes a news item from satire, speculation, or sponsored content, something that is difficult to teach even the most sophisticated algorithms.
As hard as I may try to be objective, I am still biased because I am human. But even if I had the most sophisticated algorithm available, it too would have biases, because it was made by humans. This is the point of a Wired article titled “Repeat after me: Humans run the Internet, not algorithms.”
The Wired article had to spell this out because the prevailing worldview surrounding technology, and the Internet in particular, is built on false assumptions, and it is those false assumptions that led to the backlash over Facebook’s Trending feature.
In his book To Save Everything, Click Here, Evgeny Morozov critiques Internet-centrism and Silicon Valley’s attitude toward so-called gatekeepers. Facebook and various others in Silicon Valley perpetuate the myth that the Internet is a truly democratic entity, devoid of human manipulation, and that the more information people can access, the better.
To their way of thinking, gatekeepers are the enemy, which is ironic given that they often serve as gatekeepers for news and information themselves. This may be why people were so upset about Facebook’s Trending team: it seemed hypocritical by the Silicon Valley ideal. As Wired pointed out, human intervention is the norm for the Internet:
Facebook is responsible for the misconceptions that persist. But so is Google—especially Google. And so is the tech press. They’ve spent years feeding the notion that the Internet is entirely automated. Though it doesn’t operate that way, people want it to.
On a deeper level, the backlash seems to stem from people’s disappointment that behind the curtain of the great and powerful Facebook is a bunch of under-appreciated journalists writing headlines to news stories that they have deemed newsworthy based on a loosely guided metric.
People desire some way to know Truth, apart from human bias or manipulation. But, in a culture devoid of any notions of objective truth, all we have left are subjects, each looking at reality through their own biased lens. It is telling that, yet again, people are turning to technology, this time in the form of algorithms, for answers to existential questions.
Heather Zeiger is a freelance science writer with advanced degrees in chemistry and bioethics. She writes on the intersection of science, culture, and technology.