Mozilla: YouTube’s Algorithm Pushes Harmful Content

A new study conducted by the non-profit Mozilla Foundation suggests that YouTube continues to recommend videos that violate its own content policies.


The crowd-funded study found that around 71% of the videos that viewers considered ‘regrettable’ were recommended by the video platform’s own algorithm.

What Exactly Are YouTube Regrets?

Put simply, any content that a user finds disturbing or harmful is deemed regrettable. For example, some volunteers were repeatedly recommended videos with titles like ‘Man humiliates feminist’.

Such misogynistic videos would fall under YouTube regrets. Other examples include Covid misinformation, political extremism, and hate speech.

Further Investigation

The finding was made using crowdsourced data from RegretsReporter, a browser extension that lets users report harmful videos and record what actually led them there. More than 30,000 YouTube users used the extension to report the harmful content that was recommended to them.

The investigation further suggests that non-English speakers were more exposed to disturbing content: the rate of reported ‘YouTube regrets’ was 60% higher in countries where English is not the primary language. Mozilla says that a total of 3,362 regrettable videos were reported from 91 countries between July 2020 and May 2021. Most of these fell into categories such as misinformation, violent or graphic content, hate speech or polarization, and scams.

The study also suggests that recommended videos were 40% more likely to be reported by users than videos they had searched for.

Brandy Geurkink, Mozilla’s Senior Manager of Advocacy, says that YouTube’s algorithm actively surfaces content that misinforms and harms people:

“Our research confirms that YouTube not only hosts, but actively recommends videos that violate its very own policies. We also now know that people in non-English speaking countries are the most likely to bear the brunt of YouTube’s out-of-control recommendation algorithm. Mozilla hopes that these findings — which are just the tip of the iceberg — will convince the public and lawmakers of the urgent need for better transparency into YouTube’s AI.”

User Attention

The YouTube algorithm is essentially a black box. The video-streaming platform relies on its algorithm to show users what they like, or might like, to keep them hooked for as long as possible.

The study by Mozilla backs up the notion that YouTube’s AI continues to feed users ‘low-grade, divisive misinformation’. This makes it easier for the platform to grab eyeballs by triggering people’s sense of outrage, and it implies that the problem of regrettable or triggering recommendations is systemic. The more outrage the content generates, the more ad revenue it can drive.

To fix YouTube’s algorithm, Mozilla is calling for ‘common sense transparency laws, better oversight, and consumer pressure’. It also called for protections for independent researchers, empowering them to interrogate algorithmic impacts and repair the worst excesses of the YouTube algorithm.

Superficial Claims

YouTube has a history of being accused of fuelling societal ills by feeding users hate speech, political extremism, and conspiracy junk. Its parent company, the tech giant Google, has typically responded to criticism flaring up around its algorithms by announcing a few policy tweaks.

The dangerous behavior of YouTube’s algorithm suggests that Google’s claims of reform have been superficial.


Mumbikar Nikhil | Trendsetter | Influncex’20

Ex: Tell us about a change you’ve seen in the digital blogging space since you started your journey. How has this space evolved?

Nikhil: The biggest change I’ve seen in the digital blogging space is that opportunities have grown a lot, and more and more people are coming into it.

Ex: Can you give us an insight into social media platforms that only an Influencer can know?

Nikhil: One insight I can give you is that influencers who are more personal about their lives have a deeper connection with their audience.

Ex: What would be your calling in a parallel universe? (Alternate career options)

Nikhil: I would have been either a photographer or probably a DOP (Director of Photography).

Ex: A technology you wished you’d invented?

Nikhil: I wish I had invented the smartphone. It’s a device we all rely on, and I think it’s the most popular invention in the history of mankind.

Ex: What’s been your favorite encounter with a fan so far? (DMs included!)

Nikhil: There are many, but one stands out. When I was traveling from Kolkata to Visakhapatnam, two boys were standing just before the toll naka. When I spoke to them, they told me they had come 200 km from a village and had simply guessed I would be passing through, because I hadn’t posted it anywhere. I was really shocked and felt blessed at that moment.

Ex: Social media trolling is seeing an upward rise. Do you think this only increases with the popularity of an Influencer? How does one tackle this situation?

Nikhil: It definitely grows with popularity, or I’d say it depends on how deep a personal connection an influencer has with their audience through their content. When you grow, you’re bound to have bullies and haters. That’s proof you are growing.

Ex: Tell us something we don’t know about the influencer’s content strategy?

Nikhil: One thing people don’t know about the content strategy is that influencing is not just about looking good and wearing good clothes. It’s about saying what needs to be said and doing things that people find challenging. And of course, you need to be calculative and forward-thinking about what’s going to happen, and make your moves accordingly.
