In a crackdown on fake news, fake buns and false everything else, Facebook is introducing compulsory ‘Reaction proof’. Before a user can select Like, Heart or one of the other Reaction emoticons, they will have to prove their intent is genuine.
A Facebook spokesperson explained: “Facebook has been accused of promoting false messages and fake stories and we don’t want to be accused of allowing our users to post fake Reactions.
“Too many people are clicking Like or Heart or selecting one of the Reaction emoticons without really meaning it. They just want to ingratiate themselves with the poster so they stay ‘friends’.
“From next month, all users will have to prove their Reactions are genuine by uploading a photo of themselves displaying the Reaction.
“There will be a Help page available but, for example, a Like might be portrayed by a big smile and thumbs up, the Heart by a genuine display of positive emotion, and so on.
“Special AI software we’ve developed will analyse the photo and decide if it’s genuine or not. If it’s not genuine, the user will not be allowed to post a Reaction for 24 hours. A second offence will result in a Reaction ban for a week, a third for a month, and any additional infringements will ban the user for life from posting fake Reactions. That should ensure a more genuine user experience.
“It’s Facebook’s contribution to cleaning up the wave of false and disingenuous information and emotional reactions that is flooding the internet.”
A second conversation with a Facebook insider, who did not want to be named, revealed that the Facebook software team had to build in a special AI bypass for the company’s chairman and CEO, Mark Zuckerberg, as he had been totally unable to provide proof of any genuine emotion.