
YouTube’s algorithm reportedly doesn’t care if you show thumbs-down videos

Written by adrina

YouTube already hides the dislike counts videos receive, but apparently giving a video a thumbs-down doesn’t change how many similar videos the platform recommends to you.
Photo: Wachiwit (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay’s Kitchen Nightmares. Getting drunk one night and watching an entire episode might be partly my fault. But let me tell you: if there’s one thing I’d like to purge from my feed, it’s the famous berating Brit slamming yet another chef while the world’s most obnoxious sound effects (braaa-reeeee) play in the background. I’ve disliked plenty of these videos, but now Hell’s Kitchen is popping up on my page too, and I’m feeling more and more like one of the “raw” steaks Ramsay pokes and swears at.

But apparently I’m not alone in my YouTube recommendations. A report from the Mozilla Foundation published Monday claims, based on a survey and crowdsourced data, that the “Dislike” and “Don’t recommend channel” feedback tools don’t actually change video recommendations.

There are two points here. One is that users consistently feel the controls YouTube provides don’t actually make a difference. The second, based on data collected from users, is that the controls have a “negligible” impact on recommendations, meaning “most unwanted videos still get through.”

The foundation relied on data from its own RegretsReporter browser extension, a tool that lets users block selected YouTube videos from appearing in their feed. According to the report, the analysis drew on 2,757 survey respondents and 22,722 people who gave Mozilla access to more than 567 million video recommendations recorded from late 2021 through June 2022.

Although the researchers admit the survey participants are not a representative sample of YouTube’s large and diverse audience, a third of respondents said that using YouTube’s controls didn’t seem to change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam, only to see them reappear in their feed later. Respondents often said that blocking one channel just led to recommendations from similar channels.

YouTube’s algorithm often recommends videos users don’t want to watch, and it’s frequently worse than old Ramsay cable reruns. A 2021 Mozilla report, also based on crowdsourced user data, claimed that people browsing the video platform are regularly recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla researchers found that rejecting a video — such as a Tucker Carlson screed — would often just lead to another video from Fox News’ YouTube channel being recommended. Based on a review of 40,000 pairs of videos, they found that when a channel is blocked, the algorithm often simply recommends very similar videos from similar channels. Using the “Dislike” or “Not interested” buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared to a control group. The “Don’t recommend channel” and “Remove from watch history” buttons were more effective at correcting users’ feeds, but only by 43% and 29%, respectively.

“In our analysis of the data, we found that YouTube’s user control mechanisms are inadequate as a means of preventing unwanted recommendations,” Mozilla researchers write in their study.

YouTube spokesperson Elena Hernandez told Gizmodo in an email statement, “Our controls don’t filter out entire topics or viewpoints, as that could have negative repercussions for viewers, such as creating echo chambers.” The company has said it doesn’t prevent all content on related topics from being recommended, but it also claims to surface “authoritative” content while suppressing “borderline” videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube’s VP of engineering, wrote that the company’s system is “constantly evolving,” but that providing transparency about its algorithm “isn’t as simple as listing a formula for recommendations,” since its systems take into account clicks, watch time, survey responses, shares, likes, and dislikes.

Of course, like every other social media platform out there, YouTube struggles to build systems that can combat the full breadth of bad or even predatory content uploaded to the site. A forthcoming book shared exclusively with Gizmodo claimed YouTube has given up billions of dollars in advertising revenue in order to deal with the weird and disturbing videos being recommended to children.

While Hernandez claimed the company has expanded its Data API, the spokesperson added, “Mozilla’s report doesn’t take into account how our systems actually work, so it’s difficult for us to glean many insights.”

But that is a criticism Mozilla also lays at Google’s feet, saying the company doesn’t provide enough access for researchers to assess what influences YouTube’s secret sauce, a.k.a. its algorithms.
