Google said in a blog post Friday that it would be “taking a closer look” at how the “Up Next” videos that autoplay endlessly after each video ends can lead viewers to conspiracy-theory videos and other “borderline content.”
The announcement comes a day after a BuzzFeed feature detailed how YouTube’s recommendation algorithms often steer viewers toward controversial or even prohibited content if left to their own devices long enough. For example, a YouTube search for “nancy pelosi speech” that BuzzFeed ran on an account with no historical personal data started with a BBC News clip and, ten videos later, served up QAnon conspiracy videos.
“Despite year-old promises to fix its ‘Up Next’ content recommendation system, YouTube is still suggesting conspiracy videos, hyperpartisan and misogynist videos, pirated videos, and content from hate groups following common news-related searches,” the BuzzFeed story said.
Google’s blog post appears written to address the concerns raised in the story.
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” said the post, which was authored by “the YouTube Team.”
“While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community,” the post said.