Summary of “‘Fiction is outperforming reality’: how YouTube’s algorithm distorts truth”

An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos.
One YouTube creator, who was barred from earning advertising revenue on his bizarre videos – which featured his children receiving flu shots, having earwax removed, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm.
The company said that in 2016 it began taking user “satisfaction” into account, using surveys, for example, or counting how many “likes” a video received, to “ensure people were satisfied with what they were viewing”.
From the start, we were stunned by how many extreme and conspiratorial videos had been recommended, and the fact that almost all of them appeared to be directed against Clinton.
What was most compelling was how often Chaslot’s software detected anti-Clinton conspiracy videos appearing “Up next” beside other videos.
How does YouTube interpret “viewer interest” – and aren’t “the videos people choose to watch” influenced by what the company shows them?
YouTube’s algorithm may have developed its biases organically, but could it also have been nudged into spreading those videos even further? “If a video starts skyrocketing, there’s no question YouTube’s algorithm is going to start pushing it,” Albright says.
William Ramsey, an occult investigator from southern California who made “Irrefutable Proof: Hillary Clinton Has a Seizure Disorder!”, shared screen grabs that showed the recommendation algorithm pushed his video even after YouTube had emailed him to say it violated its guidelines.

The original article.