As a result, people may know beforehand whether a post they are about to share on Facebook has been found to be false.
However, Facebook's News Feed algorithm does intervene to demote false content, ensuring that it reaches fewer people than it otherwise would.
"Fact-checking can take hours, days or weeks, so nobody has time to properly check everything they see online. But it's important somebody's doing it because online misinformation, at its worst, can seriously damage people's safety or health," Full Fact Director Will Moy was quoted as saying.
Facebook started partnering with third-party fact-checkers in December 2016, in the wake of the US presidential election, which saw disinformation spread across the platform, Wired reported.
Since its launch in the US, the programme has received mixed reviews.
While it has been praised for trying to curb the spread of misinformation on the platform, the programme has also been criticised over Facebook's unwillingness to pay for fact-checking and its reliance on users to flag content to third parties, The Guardian reported.
(This story has not been edited by Business Standard staff and is auto-generated from a syndicated feed.)