In January, Facebook announced major and ongoing changes to its news feed to improve the user experience – and people’s lives. But will it work? And what does it mean for brands?
Are you seeing the right stuff on Facebook? Updates to the algorithm mean that users will see more of what they actually want to see – content from family and friends, and posts that spark the most interactions. Facebook chief executive Mark Zuckerberg wants to encourage more ‘meaningful’ interactions (with each other and with the platform, presumably) to ‘improve people’s lives.’ The move comes after reports that people are using the platform less, and it is one of many changes over the last few months and years that make it harder for brands to reach potential customers organically.
Facebook claims that users will see fewer posts from news sources that people don’t trust, something that makes sense in light of our Trust In News study late last year. The study, conducted across five countries, found that trust in traditional news sources had remained strong – and in general had increased – despite the buzz around the term ‘fake news’ generated by the US president. Social media, meanwhile, was not trusted as a source of accurate information, and trust in it had in fact declined. (In the UK, only 28.9% of people say they trust social media as a news source.)
Facebook has said it will show fewer posts from publishers, and those it does show will be ranked according to how trustworthy users perceive them to be. But will genuine news suppliers be negatively impacted? Legitimate publishers rely on the platform to bring traffic to their sites, and unless a friend or family member shares a story, users are now unlikely to see it.
Kantar Consulting research shows that it is becoming harder for people to find information they perceive as trustworthy: 61% of respondents agreed with that statement in 2008, rising to 70% in 2017. And 78% of those surveyed last year said they were ‘skeptical about the accuracy of news stories and information presented in the media.’
Justine Hess, Associate Head of Global MONITOR, Kantar Consulting, comments: ‘There are organizations out there acting as third-party mediators of content on Facebook, checking what’s true and what isn’t – such as Faktisk in Norway, now one of the most frequently accessed websites in the country. Facebook itself is also doing quite a lot to improve trust in the platform, experimenting with new ways to moderate content, for example, and privileging content from friends and family. For better or worse, the latter move also has follow-on consequences for brands.’