When someone posts a flood of links, videos, and photos from questionable sources, chances are you’ll start ignoring whatever they post next. Compare that to someone who vets their sources and usually shares posts from well-known, reputable outlets; you’d be far more inclined to check out their links.

It turns out Facebook applies similar logic internally. According to a report from the Washington Post, Facebook has an internal rating system that scores how trustworthy users are when they report fake news, assigning each user a score on a scale from zero to one.

Basically, the problem Facebook discovered was that users would sometimes flag a post as fake news simply because they disagreed with it. Over time, such users’ flags would presumably come to be treated as untrustworthy, while someone who consistently reports genuinely fake news, even in articles they don’t agree with, would have their flags treated as more trustworthy.
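The Post’s report doesn’t describe how such a score would actually be computed, but the idea of weighting a reporter’s track record can be illustrated with a toy sketch. Everything below is hypothetical: the function name, the update rule, and the learning rate are illustrative assumptions, not anything Facebook has disclosed.

```python
# Hypothetical illustration only: this is NOT Facebook's algorithm, just a
# simple way a reporter-trust score bounded between 0 and 1 could be updated.

def updated_trust(trust: float, flag_was_correct: bool, rate: float = 0.1) -> float:
    """Nudge the trust score toward 1 after an accurate flag and toward 0
    after an inaccurate one, keeping the result within [0, 1]."""
    target = 1.0 if flag_was_correct else 0.0
    return trust + rate * (target - trust)

# A user who repeatedly flags stories they merely disagree with drifts
# toward zero, so their future reports carry less weight.
trust = 0.5
for _ in range(10):
    trust = updated_trust(trust, flag_was_correct=False)
print(round(trust, 3))  # → 0.174
```

Under this kind of scheme, a flag from a user near 1 would count for more than one from a user near 0, which is the behavior the report describes.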

Tessa Lyons, the Facebook product manager who deals with misinformation, told the Post that it is “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher.” That being said, it is unclear how Facebook uses these scores, or whether it will ever give users a way to check their own.

