YouTube’s powerful recommendation algorithm suggested most of the videos that an army of volunteer watchdogs said they regretted watching, according to a study released Wednesday by Mozilla based on “regret” reports from YouTube users. Of the videos people said they regretted, 71% were recommended by YouTube’s AI. YouTube also recommended videos that were later removed for breaking its own rules, and people in countries where English is not the primary language reported regrettable videos at a higher rate than people in English-speaking countries, according to the report.
YouTube said its own surveys find that users are satisfied with its recommendations, which generally steer people toward authoritative or popular videos. The company added that it cannot adequately review Mozilla’s definition of “regrettable” or the validity of its data, and noted that it is constantly working to improve its recommendations, including 30 changes in the past year to reduce recommendations of harmful videos.
Google’s huge video site is the world’s largest online video source. It reaches over 2 billion viewers every month, and people watch over a billion hours of video there every day. For years, YouTube has touted its algorithmic recommendations, which generate more than 70% of the time people spend watching YouTube. But Mozilla’s report offers a glimpse into some of the potential shortcomings of those recommendations.
Approximately 9% of the recommended “regrettable” videos (a total of 189 videos in this study) were subsequently removed from YouTube. YouTube videos can be removed for a variety of reasons, including breaking rules against offensive or dangerous content or infringing copyright. Sometimes the person who posted a video removes it. But the study confirmed that YouTube removed some videos for violating its policies after previously recommending them.
“That’s just weird,” Brandi Geurkink, Mozilla’s senior advocacy manager and a co-author of the study, said in an interview Tuesday. “The recommendation algorithm was actually working against its own ability to … police the platform.”
YouTube – like Facebook, Twitter, Reddit and many other internet companies that give users a platform to post their own content – has struggled with how to balance freedom of expression with effective policing of offensive or dangerous material posted there. Over the years, YouTube has grappled with misinformation, conspiracy theories, discrimination, hate and harassment, videos of mass murder, and child abuse and exploitation – all on an unprecedented global scale.
The Mozilla study found, for example, that YouTube videos containing misinformation were the ones most frequently reported as regrettable. And the rate of regrettable videos reported was 60% higher in countries where English is not the primary language, particularly Brazil, Germany and France.
The study is based on voluntary reports submitted via a special RegretsReporter extension that Mozilla developed for the Chrome and Firefox web browsers. Tens of thousands of people downloaded the extension and 1,662 submitted at least one report on a YouTube video they regretted seeing, for a total of 3,362 reports from 91 countries between July 2020 and May 2021.
The study has several limitations. The people reporting these regrettable videos are not a random sample: they are volunteers whose willingness to participate may mean they are not representative of the YouTube audience as a whole. The report acknowledges that limitation, as well as the fact that many factors can affect whether a volunteer reports a particular video and that notions of what makes a video regrettable may differ among volunteers.
The study is also based solely on regret reports submitted through desktop web browser extensions, which excludes any viewing on mobile devices or connected televisions. Mobile devices alone account for more than 70% of the time spent watching YouTube.
Mozilla’s report makes several recommendations for YouTube, for legislators and for everyday users.
For individual YouTube viewers, Mozilla recommended checking your data settings on YouTube and Google and reviewing your search and watch histories to delete anything you don’t want influencing your recommendations.
YouTube and other platforms should commission independent audits of their recommendation systems, Mozilla said. It also called for more transparency about borderline content and for greater user control over how personal data feeds into recommendations, including the option to opt out of personalization.
YouTube said it welcomes more research and is exploring options to attract outside researchers to study its systems.
Mozilla also recommended that lawmakers require YouTube and others to release information and build tools that let researchers analyze their recommendation algorithms. And regulation should ensure that platforms account for the risks they create when designing and running automated systems that amplify content at scale.
Mozilla is a software company best known for the Firefox web browser. Google, YouTube’s parent company, is also one of Mozilla’s biggest sources of revenue, through the fees Google pays to be the default search engine in Firefox in many regions of the world.