Alex Kantrowitz on reforming the Share Button:
A simple product tweak, the research indicated, would likely help Facebook constrain its misinformation problem more than an army of content moderators — all without removing a single post. In this scenario, adding some friction after the first share, or blocking sharing altogether after one share, could help mitigate the spread of misinformation on Facebook.
Seems like another day, another revelation from the 'Facebook Files', and I am sure there are lots more to come. You can of course push back and claim that the leaks were of information gleaned from internal documents and not from actual decision makers, but it's clear that they reflect the overall nature of the beast.
When it comes to misinformation, internal Facebook research noted that people are at least 4x more likely to see misinformation from a shared post. That is, a post that reaches users from outside their friends list is four times more likely to be false, and up to 20 times more likely in some situations. Let that sink in for a moment.
Two years ago, Facebook discovered that users were seeing false information shared into their newsfeed and apparently did nothing about it. The research goes on to recommend that adding friction to sharing, or blocking sharing altogether, would help mitigate the issues they were facing. Whilst it's not known for sure whether any action was taken, and employees did openly discuss the findings on internal systems, it is clear they did not make any meaningful changes.
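The friction-or-block rule the research recommends could be sketched as a simple gate on re-share attempts. This is purely illustrative: the function name, the `share_depth` field, and the policy labels are all my assumptions, not anything from Facebook's actual systems.

```python
# Hypothetical sketch of the two interventions described in the research:
# after a post's first share, either add friction (e.g. a confirmation
# prompt) to further re-shares, or block them altogether.
# All names and thresholds here are illustrative assumptions.

def decide_share(share_depth: int, policy: str = "friction") -> str:
    """Gate a re-share attempt.

    share_depth: 0 for the original post, 1 for its first re-share, etc.
    policy: "friction" to prompt after the first share, "block" to
            disable re-sharing entirely after one share.
    """
    if share_depth == 0:
        return "allow"  # sharing an original post goes through normally
    if policy == "friction":
        return "confirm"  # friction: interstitial after the first share
    return "block"  # blocking sharing altogether after one share
```

Under the "friction" policy a second-hand share triggers a prompt rather than being stopped, while the stricter "block" policy simply ends the chain after one share: the post stops travelling, without a single post being removed.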