YouTube will stop recommending videos with conspiracy theories


Last Friday (25th), YouTube announced another measure to improve the quality of the videos that stand out on its platform. The company will recommend fewer videos containing conspiracy theories and misinformation in general, also aiming to reduce the platform's potential to push extremist content to its users.

The recommended videos section on YouTube is driven by algorithms that learn from the content a user watches on the platform. If you watch several cake or lasagna recipe videos, YouTube will surface other popular videos to help you in your noble ambition of improving your culinary skills.
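To make the idea concrete, here is a deliberately simplified sketch of content-based recommendation: score unwatched videos by how much their tags overlap with the viewer's watch history. This is a toy illustration only; YouTube's actual system is a large-scale learned model, and all video titles and tags below are hypothetical.

```python
from collections import Counter

# Hypothetical catalog: video title -> set of descriptive tags.
VIDEOS = {
    "chocolate-cake-recipe": {"baking", "dessert", "recipe"},
    "lasagna-from-scratch": {"pasta", "recipe", "dinner"},
    "sourdough-basics": {"baking", "bread", "recipe"},
    "flat-earth-debate": {"conspiracy", "pseudoscience"},
}

def recommend(watched, candidates=VIDEOS, top_n=2):
    """Rank unwatched videos by tag overlap with the watch history."""
    # Build a profile: how often each tag appears in watched videos.
    profile = Counter()
    for title in watched:
        profile.update(candidates[title])
    # Score each unwatched candidate by summing its tags' profile counts.
    scores = {
        title: sum(profile[tag] for tag in tags)
        for title, tags in candidates.items()
        if title not in watched
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend(["chocolate-cake-recipe", "lasagna-from-scratch"]))
```

A viewer with a recipe-heavy history is steered toward more recipes; the same mechanism, left unchecked, can just as easily reinforce a history of fringe content, which is the "black hole" effect described below.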

However, these same algorithms can end up pulling you into a "black hole" of questionable videos, including content that spreads misinformation. YouTube is aware of this and says it reevaluates its algorithms from time to time; according to the company, hundreds of changes were made over the past year to improve the quality of user recommendations.

The change, according to the company, will affect less than 1% of the videos available on the platform, but the effect can still be significant. In a statement, YouTube said it would begin "to reduce recommendations of content that could misinform users in harmful ways, such as videos promoting a phony miracle cure for a serious illness, claiming the Earth is flat, or making blatantly false claims about historic events like 9/11".

For now, the policy will be rolled out gradually: it will start with a small number of videos shown to users in the US before expanding to more countries as the algorithm becomes more refined.
