YouTube: These testimonials show the dangers of the video recommendation algorithm

YouTube's recommendation algorithm is at the heart of a Mozilla campaign to improve its relevance and reduce its harmful effects. The foundation cites 28 stories of users who were shown bizarre, ultra-violent, pornographic, or dangerous content that had nothing to do with their interests.

Mozilla has launched a mini-site collecting 28 stories from YouTube users about the recommendation algorithm: incidents in which YouTube recommended disturbing videos, depicting for example violence, racism, or conspiracy-theory apologetics, to users, adults and children alike, who had done nothing to prompt them.

YouTube: Mozilla wants to help Google improve its video recommendation algorithm

"I started watching a boxing match, then street boxing matches, and eventually I ended up on street-fight videos, then accidents and urban violence... I came away with a horrible view of the world and feeling bad, without ever having wanted any of it," says one user.

Another user describes how the algorithm keeps escalating: "I started looking for 'fail videos' where people fall or get a little hurt. YouTube then recommended a channel showing dash cam footage from cars. At first there were minor accidents, but it later evolved into cars exploding and falling from bridges, videos in which people clearly did not survive the accident. I felt a little sick, and I never searched for this type of content."

There is also the story of a professor scandalized by the number of conspiracy videos: "I watched serious documentaries about Apollo 11. But my YouTube recommendations are now full of conspiracy-theory videos: September 11, Hitler's escape, alien hunters, and anti-American propaganda." In total, the project site gathers 28 stories of YouTube surfacing anti-LGBT, racist, or sexual content.

70% of videos viewed on YouTube are served by the recommendation algorithm. But the algorithm is a regular target of criticism, as much for its addictive dimension (one can spend hours watching recommended videos) as for its failures. The Mozilla mini-site is the showcase of a project called #YouTubeRegrets, whose goal is to find solutions.

"These stories show that the algorithm gives more weight to engagement - it surfaces content that keeps the user hooked, whether or not that content is dangerous," says Ashley Boyd, project manager at Mozilla. "We think these testimonials fairly represent the larger problem with YouTube's algorithm: recommendations that can aggressively push bizarre or dangerous content. The fact that we cannot study these stories in greater depth, for lack of access to adequate data, reinforces the idea that the algorithm is opaque and out of control."

Mozilla's bet is that YouTube and Google can no longer perfect the algorithm on their own: "We do not think this is a problem that can be solved internally. It is too serious and too complex. YouTube must allow independent researchers to contribute to finding a solution," says Ashley Boyd. The foundation would like to push the platform to open up its data to the research community and help researchers build simulation tools - more broadly, to give them real leeway rather than leaving them on the sidelines.

Source: The Next Web