“Recommended” news – What happens when algorithms decide what news we read?
A research team explores how media platforms use news recommendation systems and how users perceive them.
News recommendation systems are employed on social media and online news portals to select the content shown to users. The research team examined how such recommendation systems influence journalistic content production, public perception of journalism and trust in journalistic work.
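In essence, such a system scores candidate articles against a profile built from each user's reading history and shows the highest-scoring items first. The sketch below illustrates this general idea with a minimal content-based approach using TF-IDF text similarity; the example articles, the user history and the scoring scheme are illustrative assumptions for this summary, not the systems examined in the project.

```python
# Minimal sketch of a content-based news recommender (illustrative only;
# the articles, user history and scoring below are hypothetical and do not
# reflect the systems studied in the project).
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

articles = [
    "Parliament debates new climate bill",
    "Local football club wins championship final",
    "Film star announces surprise retirement",
    "City council approves housing project",
]

# Articles the user has already read: their "profile" (indices into `articles`).
user_history = [1, 2]

# Represent each article as a TF-IDF vector.
vectorizer = TfidfVectorizer(stop_words="english")
doc_vectors = vectorizer.fit_transform(articles)

# Represent the user as the mean vector of the articles they have read.
profile = np.asarray(doc_vectors[user_history].mean(axis=0))

# Score unread articles by cosine similarity to the user profile.
scores = cosine_similarity(profile, doc_vectors).ravel()
unread = [i for i in range(len(articles)) if i not in user_history]

# Recommend unread articles, best match first.
for i in sorted(unread, key=lambda i: scores[i], reverse=True):
    print(f"{scores[i]:.2f}  {articles[i]}")
```

Production systems typically combine such content signals with collaborative filtering and editorial rules, which is where the tension between personalisation and editorial control discussed below arises.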
The project involved three studies. On the supply side, interviews with news industry professionals explored how news recommendation systems are used and how they influence news production and dissemination. On the user side, a survey of 5,000 users across five countries examined public perceptions of these systems. Finally, the research team led by Frank Esser (University of Zurich) developed a model for the optimal design and use of news recommendation systems.
The most important findings
Most of the 13 media organisations interviewed use news recommendation systems, though their use remains experimental. Many of the news experts involved expressed concern that these systems could undermine editorial control, potentially resulting in filter bubbles and the loss of a common basis for dialogue.
Users, by contrast, are generally supportive of the technology, especially when it comes to sports, entertainment and celebrity news, where personalised content is welcomed. However, for political or local news, they prefer curated journalistic content.
That said, trust in media companies declines when users perceive an over-reliance on algorithm-driven recommendations. Overall, users remain cautious about AI and algorithm-based technologies.
Many users also assume that all media companies in Switzerland already employ news recommendation systems. In reality, only some do, and these are still at an experimental stage. According to the researchers, this misconception may suggest that users struggle to distinguish between personalised advertising and personalised news recommendations, often associating the latter with commercial media outlets.
Implications for policy and practice
To maintain trust in journalism, striking a balance between personalisation and editorial integrity is essential. Adherence to journalistic standards and transparency are important factors in preserving public confidence in media professionals. To avoid misunderstandings, the research team recommends that both media professionals and the public be better informed about the use of algorithms and AI.
The study indicates that algorithms can help distribute content more precisely, but they also pose risks. Only if such systems are used responsibly can digitalisation benefit society and safeguard shared values.
Three main messages
- Although some media outlets have implemented recommendation systems on their websites, their use remains experimental and involves various challenges and trade-offs. These include balancing technological advancement with editorial integrity, meeting user expectations while upholding journalistic responsibility, and addressing the growing influence of tech players in newsrooms.
- To foster greater acceptance of news recommendation systems, it is essential to align them with journalistic criteria, clearly communicate their benefits to users and address concerns regarding filter bubbles and data misuse.
- Looking ahead, users remain cautious about algorithmic and AI-driven technologies for news-related purposes. Their attitudes towards generative AI mirror those towards news recommendation systems, underscoring the relevance of these findings for the future adoption of related technologies in journalism.
Find out more about the methodology used by the researchers and further background information on the NRP 77 project website. More detailed insights into the survey results are available there as well.