Shining a Light on the Algorithms Behind Search Results and News Feeds
Blog Post
Oct. 21, 2019
Over the past two decades, the amount of information available on the internet has grown exponentially. However, as sources of this information have similarly grown in number and capacity, users have struggled to identify which ones are reliable and relevant.
A number of internet platforms, including search engines and social media networks, assert that artificial intelligence enables them to surface relevant content for users. Using proprietary algorithmic tools, these platforms have established complex processes for curating and ranking the content they show users, with the aim of providing personalized search results and news feeds. They assert that this promotes content users will find relevant and meaningful. However, these practices also help drive these companies’ bottom lines, and they can increase the risk that individuals are confined to “filter bubbles” or kept from discovering new content that matches their expanding interests.
Although internet platforms herald the introduction of these curation and ranking algorithms as a positive shift, their proliferation has raised a number of concerns regarding fairness, accountability, and transparency in algorithmic decision-making. These algorithms have become integral to the operations of many internet platforms, enabling companies to become gatekeepers of online speech who exercise significant editorial judgment over information flows. Despite the ubiquitous presence of these curation and ranking algorithms online, users are often unaware that platforms are using algorithms to shape their online experiences. As a result, many individuals accept, often unwittingly, the role that internet platforms and black-box algorithms play in deciding what their online experience will be.
Further, internet platforms have failed to provide adequate transparency around how these tools are developed and deployed, and how they shape the user experience. Platforms also generally offer users and content creators only a limited set of controls over how they engage with these algorithmic tools.
In addition, the widespread deployment of algorithmic curation and ranking tools has raised a number of concerns regarding algorithmic bias and accountability. By default, algorithmic tools are designed to preference certain factors and characteristics over others when curating and ranking content. Developers who select these factors do so based on their priorities and assumptions about what users value in their online experience.
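This preference for certain factors over others can be illustrated with a toy scoring function. The signal names and weights below are purely hypothetical and are not drawn from any real platform; production ranking systems are far more complex and proprietary.

```python
# Illustrative sketch only: a minimal ranking function showing how a curation
# algorithm might weight some signals over others. Signal names and weights
# are hypothetical assumptions for this example, not real platform values.

def rank_items(items, weights):
    """Return items sorted by a weighted sum of their signals, highest first."""
    def score(item):
        return sum(weights.get(signal, 0.0) * value
                   for signal, value in item["signals"].items())
    return sorted(items, key=score, reverse=True)

# Two hypothetical posts with engagement and recency signals.
posts = [
    {"id": "a", "signals": {"engagement": 0.9, "recency": 0.2}},
    {"id": "b", "signals": {"engagement": 0.4, "recency": 0.9}},
]

# The developer's choice of weights encodes assumptions about what users
# value: weighting recency heavily surfaces the newer post "b" ahead of the
# more-engaged post "a".
recency_first = {"engagement": 0.3, "recency": 1.0}
print([p["id"] for p in rank_items(posts, recency_first)])  # ['b', 'a']
```

The point of the sketch is that the ranking outcome is entirely determined by which signals the developer chooses and how they are weighted, choices that are invisible to the end user.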
In our new report, New America’s Open Technology Institute (OTI) explores how three search engines—Google, Bing, and DuckDuckGo—and three internet platforms with news feeds—Facebook, Twitter, and Reddit—utilize algorithmic curation and ranking practices, and what the associated challenges are. The report also explores how internet platforms, policymakers, and researchers can promote greater fairness, accountability, and transparency around these algorithmic decision-making practices. The recommendations presented in the report include:
- Search engines and internet platforms that feature news feeds should make a more concerted effort to raise awareness around algorithmic curation and ranking practices and provide meaningful transparency around how they impact users’ experiences and free expression online.
- Search engines and internet platforms that feature news feeds should provide greater transparency around the different implicit and explicit preferences their curation and ranking systems are designed around.
- Search engines and internet platforms that feature news feeds should provide users with a robust set of controls that enable them to tailor their own search and news feed experiences.
- Search engines and internet platforms that feature news feeds should enable publishers to understand and exercise some control over how their content is collected, curated, and ranked.
- Internet platforms, policymakers, and researchers should collaborate on, promote, and fund further research on the impacts of algorithmic curation and content ranking.
- Internet platforms, researchers, and civil society organizations should collaborate to develop a set of industry-wide best practices for transparency and accountability around algorithmic curation and ranking practices.
This is the second in a series of four reports that will explore how internet platforms are using automated tools to shape the content we see and influence how this content is delivered to us.