Facebook is set to start testing a new pop-up feature that asks users whether they are certain they want to share an article they haven’t opened. Facebook announced the news in a post on Twitter, which has a similar prompt when users try to retweet an article.
As explained by Facebook on Twitter:
“Starting today, we’re testing a way to promote more informed sharing of news articles. If you go to share a news article link you haven’t opened, we’ll show a prompt encouraging you to open it and read it, before sharing it with others.”
As you can see, the prompts will alert people that they’re about to share a link they haven’t opened, with a brief explainer on the potential issues with this approach. That, ideally, will get more people to reconsider sharing links without understanding the full context, which could help to reduce misinformation, and angst, among Facebook users.
Facebook is planning to add more friction to its platform with a new pop-up that asks users if they have read an article before sharing it. The prompt, currently being tested among select users, will only appear if the person clicks ‘share’ without opening the article.
The new feature is meant to help users be better informed about the articles they wish to share. According to Facebook, this will help combat the spread of misinformation, which has been highly prevalent across social media apps, including Facebook itself.
Here is how it works: when a user goes to share a news article link they haven’t opened, the platform will show a pop-up encouraging them to open and read it before sharing it with others. The pop-up warns the user that not opening the article can mean “missing key facts”, as headlines often don’t tell the whole story.
The new pop-up is similar to a feature Twitter started testing back in June 2020. The main goal is to help users be better informed and to curb the spread of misinformation.
Last June, Twitter added warnings against sharing unopened links, which led to users opening articles 40% more often and slowed the rate at which misinformation spreads on the platform. Twitter’s results show the effectiveness of adding small measures of friction to the sharing process: simply prompting users to pause and consider their action can lead to more informed, measured debate and discussion.
Facebook will be hoping for the same, and with a growing number of people now getting at least some of their news content from The Social Network, maximizing informed engagement, and subsequent debate, is an important consideration for the platform as it looks to play a more positive role in topical discussion and sharing.
Facebook also warns users when they go to share an article that’s more than 90 days old, which aims to slow misdirected discussion around older content. Facebook’s algorithm also takes into account how long you spend reading an article after opening a link when measuring your relative interest in the content.
If this small measure gets more people to take an extra moment to read the full context, it could have a big impact – and as Twitter’s results show, it can be an effective way to reduce fake news across social media platforms. Other platforms such as WhatsApp and Telegram should adopt similar policies so that the circulation of misinformation can be reduced significantly.