The researchers, the New York Times reports, find that the same tendencies that reward extremism also appear with sexual content on YouTube: a user who watches erotic videos might be recommended videos of ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube's algorithm recommends right-wing, extremist videos to users even if they haven't interacted with that content before, a recent study found.
Warning: This episode contains references to guns and gun violence. YouTube’s recommendation algorithm has always been key to keeping users on the site. Watch a cute cat video, and the platform spews ...
A new study conducted by the Computational Social Science Lab (CSSLab) at the University of Pennsylvania sheds light on a pressing question: Does YouTube's algorithm radicalize young Americans?