INTRODUCTION
YouTube is the second largest search engine in the world. With 1 billion users worldwide, people of every generation turn to YouTube for entertainment as well as information. Every time users open YouTube, they are presented with an enormous number of videos to pick from. This raises the question: how sure can a user be that what they are watching is reliable?
While YouTube uses AI and human moderators to check for harmful or inappropriate content, this system is often overwhelmed by the sheer volume of uploads. On average, 300 hours of video are uploaded to YouTube every minute, far more than human reviewers can keep up with. AI can help, but at the moment it lacks the understanding of nuance and context needed to accurately filter out “bad” content.
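To put that figure in perspective, here is a quick back-of-the-envelope estimate of what purely human review would require. It is only a sketch: the 300-hours-per-minute figure is the one cited above, and the 8-hour reviewing shift is an illustrative assumption.

```python
# Back-of-the-envelope estimate of the human moderation workload.
# Assumes the commonly cited 300 hours uploaded per minute and an
# 8-hour reviewing shift; both are illustrative assumptions.
HOURS_UPLOADED_PER_MINUTE = 300
SHIFT_HOURS = 8

hours_per_day = HOURS_UPLOADED_PER_MINUTE * 60 * 24  # 432,000 hours/day
reviewers_needed = hours_per_day / SHIFT_HOURS       # 54,000 reviewers

print(f"{hours_per_day:,} hours of video uploaded per day")
print(f"~{reviewers_needed:,.0f} full-time reviewers just to watch it all once")
```

Even under these generous assumptions, tens of thousands of full-time reviewers would be needed just to watch every upload once, before any moderation judgement is applied.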
While many of us who spend time on the internet have developed a sixth sense for harmful and misleading content, children and the elderly remain particularly susceptible to it. Studies have shown that such content increases the risk of anxiety and depression. Misleading content such as fake news, deepfakes, manipulated images and other forms of misinformation can have long-term and widespread effects on society.
With these problems in mind, our feature aims to help reduce the amount of harmful and misleading content on YouTube by enlisting the help of the YouTube community.
PROBLEM
Inappropriate, misleading and harmful content such as violent imagery, fake scientific information and clickbait is uploaded to YouTube daily. Much of it gets past YouTube’s current moderation system.
SOLUTION
Create a robust system in which selected YouTube community members moderate the content they watch, helping to filter out inappropriate and misleading videos. This system would act as an additional layer of moderation alongside YouTube’s existing AI and human moderators.
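As a rough illustration of how this community layer might work, the sketch below aggregates flags from trusted moderators into a per-category confidence score and escalates a video to YouTube’s existing reviewers once enough independent, reputation-weighted reports agree. Everything here (the ModeratorFlag structure, the reputation weights, the 0.7 threshold) is a hypothetical assumption for illustration, not a real YouTube API.

```python
from dataclasses import dataclass

@dataclass
class ModeratorFlag:
    moderator_id: str
    category: str      # e.g. "misleading", "violent", "inappropriate"
    reputation: float  # 0.0-1.0, earned from past flags upheld on review

ESCALATION_THRESHOLD = 0.7  # assumed confidence cutoff
MIN_INDEPENDENT_FLAGS = 3   # require agreement from several moderators

def should_escalate(flags: list[ModeratorFlag]) -> bool:
    """Escalate a video when enough reputable moderators agree on a category."""
    by_category: dict[str, list[float]] = {}
    for flag in flags:
        by_category.setdefault(flag.category, []).append(flag.reputation)
    for reputations in by_category.values():
        if len(reputations) >= MIN_INDEPENDENT_FLAGS:
            # Mean reputation of the flaggers serves as a confidence score.
            if sum(reputations) / len(reputations) >= ESCALATION_THRESHOLD:
                return True
    return False

# Example: three reputable moderators agreeing triggers escalation.
flags = [
    ModeratorFlag("mod_a", "misleading", 0.90),
    ModeratorFlag("mod_b", "misleading", 0.80),
    ModeratorFlag("mod_c", "misleading", 0.75),
]
print(should_escalate(flags))  # True
```

Requiring several independent, reputable flags before escalation keeps a single malicious or careless moderator from taking content down, while still letting community consensus surface problems that automated filters miss.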
SECONDARY RESEARCH
GENERATIONAL PREFERENCES ON YOUTUBE
• Millennials value authenticity.
• Gen Z prefers realistic content and direct brand communication on YouTube.
YOUTUBE'S CHALLENGES
• Children’s mental health is at risk from disturbing content.
• Despite efforts, content moderation is a massive task due to the sheer volume of uploads.
• The platform is actively removing harmful AI-generated content.
• Studies have found YouTube to be an unreliable source for health information, with many videos promoting unhealthy behaviours.
INFLUENCER AWARENESS
• Parents often lack awareness of YouTube influencers and the content their children consume.
YOUTUBE KIDS
• Downloads: The app saw a rise to 131 million downloads in 2023, up from 103 million the previous year.
• Audience: It’s designed for children under 13, but by age 8, many transition to regular YouTube.
• Content Moderation: An algorithm filters content, yet some inappropriate videos may bypass it.
• Advertisements: Ads are present in 95% of videos for young children, sometimes obstructing educational content.
• Parental Usage: Two-thirds of parents report their children use YouTube Kids, but regular YouTube is more popular, with 70% of parents acknowledging its use by their children, often without parental supervision.
STEPS TAKEN BY YOUTUBE
• Content Ratings: Videos have ratings for various maturity levels.
• Supervised Accounts: Parents can limit content for children under 13.
• Comment Restrictions: Comments on videos with minors are disabled to prevent exploitation.
• Content Recommendations: YouTube limits recommendations of potentially risky content featuring minors.
• Technology Improvements: Enhancements in machine learning and CSAI Match technology help protect minors.
• Human Review: AI, with occasional human review, assesses age-restriction of videos.
• Monetization: Stricter criteria for monetizing kids’ videos to discourage low-quality content.
• Regulatory Compliance: Videos must be labeled “Made for Kids” as per regulations.
• Revenue Limitations: Restrictions on targeted advertising and certain features for kids’ videos.
• Adpocalypse Impact: Past brand safety issues led to reduced ad spending and stricter content guidelines.
THE PROBLEM WITH YOUTUBE'S CURRENT MODERATING AND AGE-RESTRICTION SYSTEM
Age Restriction for Videos:
• Age-restricting a video limits its visibility: YouTube promotes age-restricted videos less often, so they earn fewer views.
Human Moderators vs. AI Moderation:
• YouTube is reintroducing human moderators after relying heavily on AI during the pandemic.
• AI filters sometimes failed to match human accuracy, resulting in incorrect takedowns and unwarranted video removals.
Outsourcing Moderation:
• YouTube outsources moderation to contracting companies.
• Moderators work on short-term contracts with minimal job security and low pay.
COMPETITIVE ANALYSIS
USER INTERVIEWS
We conducted user interviews with 6 participants from varied age groups to understand their usage, likes and dislikes, and any other difficulties or challenges they face while using YouTube. We then categorized their demographic data and the insights obtained from the interviews.
FINDINGS FROM INTERVIEWS
• 100% of participants turn to YouTube for a variety of content throughout the day.
• 80% rely on YouTube as their primary app for video content.
• 60% of participants feel they can find any content they want on YouTube.
• Boomers like watching historical, gardening, and political videos.
• Millennials like watching cooking videos, interviews, news, food vlogs, and podcasts.
• Gen Z likes watching music videos, gaming videos, K-pop, technical videos and content related to their studies.
OBSERVATIONS
• Users think that content creators who have been in the business for a long time can be trusted to share accurate information.
• Users trust sources they have been following for some time.
• Kids are very tech-savvy and have access to everything; harmful content can affect them without their realizing it.
• Surveys and feedback from users after they watch a video would be helpful.
• Some think it would be a great idea to require that a creator hold a certain level of education or a relevant degree before speaking on a topic or spreading information.
• Some watch podcasts, news channels, and interviews only from legitimate creators who have been around for quite some time.
PAIN POINTS & CHALLENGES
• Misinformation: induces panic among users.
• Misleading: clickbait that ends up wasting a lot of the viewer’s time.
• Disturbing: abusive content unsettles users, occupies their thoughts long after viewing, and can be frightening.
• Violence: kids try to copy harmful behaviour they see online, which negatively impacts their mental health.
AFFINITY MAPPING
After conducting the interviews, we identified and grouped related keywords and phrases mentioned by the participants. This process produced an affinity map, which helped us pinpoint the key pain points and the features our users wanted.
USER PERSONAS
DEFINING THE PROBLEM
PROBLEM STATEMENT
YouTube users looking for reliable, authentic and safe content on the platform often come across videos that are inappropriate, misleading or harmful because of YouTube’s insufficient moderation mechanism, which leads to a negative user experience.
THE 5 W'S
HOW MIGHT WE...
• ...develop a robust content moderation system to filter out harmful or misleading videos on YouTube?
• ...implement an age-appropriate rating system for YouTube videos to protect minors from unsuitable content?
• ...empower YouTube users with reliable information and tools to identify and report inappropriate or misleading content?
• ...create a transparent and user-friendly system for content warnings?
• ...collaborate with creators, experts, and the community to improve the quality and authenticity of content on YouTube?
PROJECT GOALS
USER FLOWS
SKETCHES
WIREFRAMES
DESIGN MOCK-UPS
ICONS
PROTOTYPE
VIEWER FLOW
MODERATOR FLOW