YouTube Kids

YouTube Kids is a video app and website for children developed by YouTube, a subsidiary of Google. The app provides a version of the service oriented solely towards children, with curated selections of content, parental control features, and filtering of videos deemed inappropriate for viewing by children under the age of 13, in accordance with the Children's Online Privacy Protection Act, which prohibits the regular YouTube app from profiling children under the age of 13 for advertising purposes.

YouTube Kids was first released on February 15, 2015, as a mobile app for Android and iOS, and has since been released for LG, Samsung, and Sony smart TVs, as well as for Android TV. On May 27, 2020, it became available on Apple TV. As of September 2019, the app is available in 69 countries. YouTube launched a web-based version of YouTube Kids on August 30, 2019.

YouTube Kids has faced criticism from advocacy groups, particularly the Fairplay Organization, over the app's use of commercial advertising and over algorithmic suggestions of videos that may be inappropriate for its target audience. The app has been associated with a controversy surrounding disturbing and/or violent videos depicting characters from children's media franchises. Criticism over these videos led YouTube to announce that it would take more stringent action to review and filter such videos when reported by the community, and to prevent them from being accessible within the YouTube Kids app.

Content
The app is divided into four content categories: "Recommended", "Shows", "Music", and "Learning". The categories feature curated selections of content from channels deemed appropriate for children.

In August 2016, the app was updated to support the YouTube Red (now YouTube Premium) subscription service, allowing ad-free playback, background playback, and offline playback for subscribers. In February 2017, YouTube began to introduce premium original series oriented specifically towards YouTube Kids, including DanTDM Creates a Big Scene, Fruit Ninja: Frenzy Force, Hyperlinked, and Kings of Atlantis. YouTube has also presented advocacy campaigns through special playlists featured on YouTube Kids, including "#ReadAlong" (a series of videos, primarily featuring kinetic typography) to promote literacy, "#TodayILearned" (which featured a playlist of STEM-oriented programs and videos), and "Make it Healthy, Make it Fun" (a collaboration with Marc and Pau Gasol to promote healthy living and an active lifestyle to children).

In November 2017, the app was updated to add additional user interface modes designed for different age groups, ranging from the existing simplified interface (intended for younger children) to a more dense interface designed for older children.

In September 2018, YouTube added new age group options relating to the content offered in the app, "Younger" and "Older". "Younger" maintains the existing mix of content offered before, and "Older" adds more content from other genres, such as nature, gaming, and music. In August 2019, the "Younger" setting was split to add a new "Preschool" group, with a focus on "creativity, playfulness, learning, and exploration".

Parental controls
The YouTube Kids app features parental control settings that allow parents to limit screen time, and restrict users from accessing the search tool. Parents can use a passcode or their Google Account to protect these settings, and configure profiles for multiple users to tailor their experiences.

Advertising
The Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) both expressed concern over the use of advertising within the YouTube Kids app, arguing that children would not be able to distinguish the ads from content. Short bumpers were later added to the app to establish a separation between advertising and content.

Filtering issues
The YouTube Kids app has faced criticism over the accessibility of videos that are inappropriate for its target audience. The CCFC filed an FTC complaint over YouTube Kids shortly after its release, citing examples of inappropriate videos that were accessible via the app's search tool (such as those related to wine in their testing), and noting that the Recommended page eventually used search history to surface such videos. YouTube defended the app against the criticism, stating that it was developed in consultation with other advocacy groups, and that the company was open to feedback over the app's operation. A larger YouTube controversy referred to as "Elsagate", discovered by Matan Uziel and Charlie Warzel, has also been associated with the app; it refers to channels that post videos featuring characters from popular franchises (especially, among others, Frozen, PAW Patrol, Thomas and Friends, Peppa Pig, and Spider-Man), but with disturbing, sexually suggestive, violent, or otherwise inappropriate themes and content.

YouTube's global head of family and children's content, Malik Ducard, said that "making the app family friendly is of the utmost importance to us", but acknowledged that the service was not curated all the time, and that parents had the responsibility to use the app's parental controls to control how it is used by their children (including disabling access to the search tool). Josh Golin, director of the Campaign for a Commercial-Free Childhood, argued that automated algorithms were not enough to determine whether a video is age-appropriate, and that the process required manual curation. He added that "the YouTube model has created something which is so vast, but there are 400 hours of content uploaded every minute. It's simply too big. People have been raising these issues for years, just visit any parenting forum and they've been talking about the fake Peppa Pig videos."

In November 2017, YouTube announced that it would take further steps to review and filter videos reported by users as containing inappropriate content, including more stringent use of its filtering and age-restriction system to prevent such videos from appearing on the app and YouTube proper. In an update to the YouTube Kids app that month, a more prominent disclaimer was added to its first-time setup process, stating that the service cannot fully guarantee the appropriateness of videos that were not manually curated, and informing parents of means to report and block videos that they do not find suitable.

These options expanded further in 2018, with the addition of an option to restrict users to human-reviewed channels and recommendations, as well as a manual whitelisting system.