YouTube Officially Rolls Out COPPA Privacy Changes for All Children-Aimed Video Content

This week, YouTube started the official rollout of its new advertising rules for children-aimed video content. The trigger was a lawsuit that YouTube settled with the US government for violating the Children’s Online Privacy Protection Act (COPPA). The new rules bring YouTube’s policies and practices in line with COPPA, and some creators fear massive advertising-revenue losses.

Marketing to Children

Marketing to children is a billion-dollar business for a couple of reasons. Children represent an important demographic to marketers: firstly, their own buying power is considerable and growing; secondly, they influence their parents’ purchasing decisions.

On top of that, the current generation of children is very brand conscious and tends to develop fierce brand loyalty. This starts at toddler age, when children are unable to distinguish commercials from a show. Word-of-mouth marketing is also huge, especially among tweens and teens.

Above all, users don’t always know how to navigate YouTube’s privacy settings properly. Throw data collection into the mix and it is easy to understand why the topic is so controversial. On the one hand, there is a strong commercial incentive to market to kids, albeit in an ethical manner that doesn’t take advantage of their naivety. On the other hand, regulations are getting tighter and fines exponentially higher.

COPPA Around for a While

America’s federal Children’s Online Privacy Protection Act has been around for some time. Congress enacted COPPA in 1998 to limit the collection of personally identifiable information from children without their parents’ consent.

The Federal Trade Commission’s rule implementing COPPA, effective since April 2000, applies to sites, services and apps primarily designed for children 12 and under, and to websites that knowingly collect or maintain personal information from children in that age group. The law applies to content creators in the same way it would if the channel owner ran its own website or app.

COPPA prohibits, among other things, the collection of information such as names, addresses, IP addresses and cookies without prior parental consent. The law also requires sites to “post a complete privacy policy, notify parents directly about their information collection practices, and get verifiable parental consent before collecting personal information from their children or sharing it with others”.

YouTube Privacy Changes

The reason for YouTube’s rollout of this change is a lawsuit that the video-sharing platform settled with the government for violating COPPA. In September, YouTube was fined $170 million by the US regulator, the Federal Trade Commission (FTC), and as part of the settlement promised to make adjustments.

It is now forbidden to run targeted advertisements on children’s videos. Furthermore, commenting is disabled on videos aimed at a young audience, and many of YouTube’s other features, such as push notifications, are turned off as well.

Channel owners have to designate whether the videos they upload are “directed to children”. The purpose of this requirement is to ensure that both YouTube and channel owners are complying with the law.
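For creators who upload via the API rather than the web interface, this designation can be set programmatically. A minimal sketch, assuming the YouTube Data API v3’s `status.selfDeclaredMadeForKids` field on the `videos` resource (the helper function name and sample values here are hypothetical):

```python
# Sketch: how a channel owner's "made for kids" designation might be set
# when uploading through the YouTube Data API v3. The field name
# status.selfDeclaredMadeForKids is from the API's videos resource;
# the helper function and sample titles are illustrative only.

def build_upload_body(title: str, description: str, made_for_kids: bool) -> dict:
    """Build a videos.insert request body carrying the audience designation."""
    return {
        "snippet": {"title": title, "description": description},
        "status": {
            "privacyStatus": "public",
            # The creator's own designation; YouTube says it may
            # override this in cases of error or abuse.
            "selfDeclaredMadeForKids": made_for_kids,
        },
    }

body = build_upload_body("Alphabet Song", "Sing along and learn the ABCs.", True)
print(body["status"]["selfDeclaredMadeForKids"])  # True
```

In a real upload, a body like this would be passed to the API’s `videos.insert` call along with the media file; the sketch only shows where the audience flag lives in the request.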

Also, as it is not always possible to know who the viewer is, anyone who watches a “children’s video” is treated as a child, regardless of the viewer’s actual age.

The new rules apply immediately to all videos worldwide. YouTube will use AI algorithms to verify that creators have labelled their content correctly. Moreover, YouTube says it will override the label in cases of error or abuse, and may take action against violators, including account termination.

Strong Impact on Video Makers’ Business

Channels focused on children’s videos will no doubt feel a strong business impact and possibly a loss of advertising revenues. YouTube said it is “committed to helping creators navigate this new landscape and to supporting our ecosystem of family content”.

Video makers have known since September that the changes were coming. As mentioned above, they must indicate themselves whether a video is intended for children or not. YouTube places the responsibility, and any FTC fines, with the makers.

The FTC does provide guidelines to help distributors and creators determine whether content is aimed at kids. Content isn’t considered “directed to children” just because some children may see it. But, according to distributors and creators, what really constitutes a children’s video remains vague, or at least broad and vague enough to frighten a lot of content creators.

IT communication specialist
Sandra has many years of experience in the IT and tech sector as a communication specialist. She has also been co-director of a company specializing in IT, editorial services and communications project management. She follows relevant cybercrime and online privacy developments.