Is it time for an Algorithm Ombudsman?

(Photo by Ulysse Bellier and licensed via Creative Commons - CC BY 2.0)

In my recent post The New Gatekeepers, I noted how algorithms play an increasingly important role in determining what information people are presented with online. This is particularly true in an age where, according to a recent Pew Research Center report, 60 percent of Facebook and Twitter users say they use the sites as sources of news. With this new reality in mind, in an op-ed last week the president of the Newseum, Jeffrey Herbst, argued that in our digital world The Algorithm is an Editor, because it often determines, on a personalized level, which news stories individual users see. For Herbst, this revolution raises questions about how social media sites reconcile their for-profit imperative with their newfound roles as providers of information and the social responsibility this entails in democratic societies. While legacy media companies addressed similar issues by attempting to wall off news reporting from advertising, this task may prove more difficult when algorithms are in charge.

A central concern about the rise of algorithms is that they are essentially “black boxes”: very few people understand how they work or what they are programmed to do. Indeed, algorithms are trade secrets that social media companies invest in heavily because they are key to improving the user experience and thus to generating profit. Often, the only way we can find out how they are working is through excellent data journalism that reverse-engineers what they are doing. In a recent article, Investigating the Algorithms that Govern our Lives, Chava Gourarie looks at some of this reporting and draws attention to the fact that algorithms can encode and even amplify forms of discrimination, such as racism and gender bias. While little research has focused on how these imperfect algorithms affect the type of news stories pushed to people's feeds, it is safe to assume that bias persists in ways that are often hard to detect.

Given the power of algorithms in shaping our information ecosystem, as well as their clear imperfections, perhaps it is time for social media companies to consider creating the position of “Algorithm Ombudsman” – an outsider given privileged access to a company’s algorithms in order to ensure that the public good is being taken into account. Having such a public advocate on staff could help build trust among users that companies are doing their best to make the user experience fair and equitable.

The Role of an Ombudsman

An algorithm ombudsman would be the digital equivalent of a public editor at a newspaper. The public editor’s role is to oversee the application of proper journalism ethics and to examine critical errors and omissions. They also serve as an important liaison with readers when disputes about an article or an editorial practice emerge. Ideally, they are given enough autonomy to challenge the editor and the owner without fear of losing their job. Their role is not to micromanage the entire organization, but rather to make sure that the broader practices of the newspaper are ethical and consistent.

Ideally, an algorithm ombudsman would be an already respected individual given privileged access to the inner workings of a company’s algorithms in order to evaluate whether they are operating ethically. So, for example, when claims of algorithmic discrimination are made, this well-respected individual would have the autonomy to investigate. This does not mean that the ombudsman would necessarily get to change company policy, but rather that they would at least be able to truly understand the problem being discussed from all angles and be able to voice their opinion.

What Might an Algorithm Ombudsman Do?

Are different demographic groups being exposed to relevant content? Are important news stories being obscured because they aren’t feel-good and clickable? Is an increased emphasis on video sharing privileging certain types of content over others? These are some of the questions that an algorithm ombudsman could look into if asked. They are questions that many of us have asked ourselves, but which we have little ability to actually investigate.
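To make the first of those questions concrete, here is a minimal sketch, in Python, of the kind of exposure audit an ombudsman with access to feed data might run. Everything in it is hypothetical: the impression-log format, the group labels, and the topic categories are illustrative stand-ins, not anything Facebook or Twitter actually exposes.

```python
from collections import defaultdict

# Hypothetical impression log: one record per story shown, as
# (demographic_group, story_topic). A real audit would draw on the
# platform's internal feed logs, which outsiders cannot currently see.
impressions = [
    ("group_a", "politics"), ("group_a", "entertainment"),
    ("group_a", "politics"), ("group_b", "entertainment"),
    ("group_b", "entertainment"), ("group_b", "sports"),
]

def exposure_rates(log):
    """For each group, the share of its impressions devoted to each topic."""
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for group, topic in log:
        counts[group][topic] += 1
        totals[group] += 1
    return {g: {t: n / totals[g] for t, n in topics.items()}
            for g, topics in counts.items()}

def exposure_gap(rates, topic):
    """Largest difference between any two groups' exposure to one topic."""
    shares = [group_rates.get(topic, 0.0) for group_rates in rates.values()]
    return max(shares) - min(shares)

rates = exposure_rates(impressions)
print(rates)
# A large gap on a topic like "politics" would be a flag worth investigating.
print("politics exposure gap:", round(exposure_gap(rates, "politics"), 2))
```

The point is not the arithmetic, which is trivial, but the access: only someone allowed inside the black box could assemble a log like this in the first place, which is precisely what the ombudsman position would provide.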

In 2014, Facebook famously made slight changes to its algorithm so that articles from the liberal-leaning Upworthy website would no longer appear as often in users’ feeds. Many users welcomed the change because they were tired of the so-called “clickbait.” However, it also meant that the progressive political content once prominent in many feeds was no longer there, demonstrating the power of algorithms to shape our information ecosystem. Yet Facebook did not publicly address these changes, even though they undoubtedly affected political discourse and the exchange of ideas. In this case, the algorithm ombudsman I envision could have looked into the change and issued an opinion on how it was implemented and whether or not it was, on the whole, good for the users of Facebook.

As the lines between media companies and social media are increasingly blurred, social media companies are going to have to come to terms with the greater social responsibility they acquire as some of the most important providers of information in our societies. Therefore, we need new systems that can help generate trust in these platforms and ensure that they are contributing to the common good. Opening up the “black box” of algorithms just a little would go a long way in holding social media platforms accountable in our information ecosystem.


Daniel O’Maley is the Associate Editor at the Center for International Media Assistance.
