The Children's Online Privacy Protection Act ("COPPA") is a United States federal law passed by Congress in 1998 to protect children’s online privacy. COPPA requires the Federal Trade Commission (“FTC”) to issue and enforce a rule implementing the law (“the COPPA Rule” or “the Rule”). The FTC’s COPPA Rule became effective in 2000 and was amended in 2013. The FTC is presently in the process of reviewing the Rule again. COPPA is enforced by the FTC and by state Attorneys General, who have the authority to seek civil penalties from companies that violate the law.
Although COPPA is a U.S. law, foreign online service operators must comply with the COPPA Rule if their services, or a portion of the services, are directed to children in the U.S., or if they have actual knowledge that they collect information from children in the U.S. Relatedly, American online service operators that collect information from foreign children are also subject to COPPA.
The COPPA Rule applies to online service operators whose service, or a portion of the service, is directed to children under the age of 13, or who have actual knowledge that they collect information from children under the age of 13. Actual knowledge can be gained in a variety of ways, including when a user attempts to pass through an age gate, when a user or parent tells a customer service team the user’s age, or through user-generated content (e.g., a user stating in her profile that she is in elementary school).
Third parties in the digital advertising ecosystem (i.e., entities other than the online service operator) must comply with the COPPA Rule when they have actual knowledge that they collect information from users of an online service directed to children. The FTC sets forth two scenarios where ad networks and other third parties will likely be deemed to have actual knowledge:
See the FTC’s 2013 Statement of Basis and Purpose.
COPPA’s requirements apply only when the online service operators or third parties described above collect personal information. Because personal information includes persistent identifiers, COPPA applies to the collection of mobile device identifiers, browser-based cookies, or other persistent device identifiers, regardless of whether the device is a personal device (e.g., a mobile phone) or shared device (e.g., a smart TV). Please also refer to the section below: HOW DOES COPPA APPLY TO ADTECH.
Operators covered by the Rule must:
See COPPA Rule at 16 C.F.R. § 312.
In addition to publishers of content directed to children or that have actual knowledge of a child using their service, any company that handles persistent identifiers of an online service it knows to be directed to children is subject to COPPA. This includes demand-side platforms (DSPs), supply-side platforms (SSPs), ad networks, data management platforms (DMPs), customer data platforms (CDPs), analytics and fraud detection companies, measurement providers, and other advertising technology companies.
Although COPPA does not prohibit advertising to children, the Rule prohibits the collection of personal information (including cookies and other persistent identifiers) from children under 13 without verifiable parental consent. The intention behind this prohibition is to stop behavioral advertising, retargeting and profiling of children under 13. Contextual advertising is permissible under COPPA. In practice, this means contextually-based advertising that does not track the user over time and across online services.
Real-time Bidding (“RTB”) is a way of transacting media that allows an individual ad impression to be put up for bid in real-time. This is done through a programmatic on-the-spot auction. RTB allows for the ability to serve targeted ads. The Interactive Advertising Bureau’s OpenRTB is an API specification for an open protocol for the automated trading of digital media across a range of platforms and devices. Part of IAB’s OpenRTB specification, the COPPA flag is an attribute of a bid request that signals whether that request is for the opportunity to serve an ad to a child protected by COPPA. The publisher issuing the bid request has made the determination that the user is a child. The flag will have a value of 1 if the user is a child under 13, and a value of 0 otherwise. This flag allows media buyers to programmatically decide whether to make a bid and whether they can use tracking and targeting technologies with that impression. See IAB Guide to Navigating COPPA. For more information about how an advertiser or adtech partner can act upon the flag, refer to the Knowledge Base.
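The flag handling described above can be sketched in a few lines. The JSON shape follows the OpenRTB 2.x specification, where the COPPA flag lives in the `regs.coppa` attribute; the function names are illustrative, not part of the spec:

```python
import json

def is_coppa_flagged(bid_request_json: str) -> bool:
    """Return True if the bid request carries regs.coppa == 1,
    i.e., the publisher has flagged the impression as child-directed."""
    request = json.loads(bid_request_json)
    return request.get("regs", {}).get("coppa", 0) == 1

def can_use_behavioral_targeting(bid_request_json: str) -> bool:
    # When the COPPA flag is set, tracking and targeting technologies
    # must not be used; only contextual advertising is permissible.
    return not is_coppa_flagged(bid_request_json)

flagged = '{"id": "abc123", "regs": {"coppa": 1}}'
unflagged = '{"id": "def456", "regs": {"coppa": 0}}'
```

A buyer's bidding logic would consult such a check before attaching any identifier-based targeting to a bid.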
There are two categories of online services under COPPA: directed to children and general audience. Mixed audience online services are a subset of the child-directed category of online services.
Directed to children is defined in the COPPA Rule as a commercial website or online service, or portion thereof, that is targeted to children. See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children”). Classifying a service as child-directed involves a subjective determination, but one guided by a number of factors set forth in the COPPA Rule. These factors include:
See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (1)). In other words, if an online service is targeted to appeal to children, then it is considered child-directed, and the operator is required to treat every user of the service as if they are a child and comply with the COPPA Rule. This is true even if the user is accessing the service from a shared device or a device that belongs to an adult.
The COPPA Rule provides for a mixed audience subcategory of the directed to children category. See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Web site or online service directed to children,” paragraph (3)); and Complying with COPPA: Frequently Asked Questions (See FAQs D.4-8). A mixed audience online service falls under the definition of directed to children despite not targeting children under 13 as its primary audience. Operators of mixed audience online services are permitted to implement a neutral age screen for their users. Using the age screen enables the operator of a mixed audience online service to collect personal information from users who indicate they are under 13, but only after obtaining verifiable parental consent. Notably, an online service may be deemed directed to children even if its Terms of Service or Privacy Policy prohibit children under 13 from using the service. In determining whether an online service is child-directed, the FTC will consider the factors set forth in the Rule listed above.
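As an illustration only, a neutral age screen might be backed by a check like the one below; the function name and the coarse year-based age calculation are assumptions for demonstration, not a prescribed implementation:

```python
from datetime import date
from typing import Optional

COPPA_AGE_THRESHOLD = 13  # the Rule protects children under 13

def requires_parental_consent(birth_year: int, today: Optional[date] = None) -> bool:
    """True when the user indicates an age under 13, meaning personal
    information may be collected only after verifiable parental consent."""
    today = today or date.today()
    age = today.year - birth_year  # coarse: ignores whether the birthday has passed
    return age < COPPA_AGE_THRESHOLD
```

Note that a compliant age screen must be neutral: the prompt should not signal that entering an older age unlocks features, which would encourage false answers.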
General audience is the term used by the FTC to describe sites and services that do not target children under 13 as a portion of the audience. See the FTC’s 1999 Statement of Basis and Purpose. The COPPA Rule applies to operators of general audience online services with actual knowledge that they are collecting information from children under 13.
Importantly, a general audience online service does not become a mixed audience service simply because some children use the site or service. However, sites that are widely known to have a large proportion of child users will likely be considered mixed audience sites.
The COPPA Rule defines personal information to include:
See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Personal information”).
Data collection is defined under COPPA as the gathering of any personal information from a child by any means, including but not limited to:
See COPPA Rule at 16 C.F.R. § 312.2 (definition of “Collects or collection”).
COPPA includes a provision which enables industry groups, commercial entities, or others to develop their own COPPA oversight programs, known as Safe Harbor programs. There are currently six COPPA Safe Harbor programs approved by the FTC: Children’s Advertising Review Unit (CARU), the Entertainment Software Rating Board (ESRB), PRIVO, kidSAFE, iKeepSafe, and TRUSTe. A Safe Harbor organization may audit, monitor and provide guidance to its participating companies. A benefit of certification with a Safe Harbor program is that, generally, a disciplinary review for a COPPA violation will allow for a period to cure the violation instead of a formal investigation by the FTC. To be clear, Pixalate is not an FTC approved COPPA Safe Harbor.
Pixalate is providing tools designed to help ad tech companies avoid serving targeted ads on child-directed websites. In the first instance, it is the publisher’s responsibility to designate its app as child-directed wherever appropriate. If a publisher identifies its app as child-directed, ad networks must NOT serve targeted ads on that property. Many ad tech companies have expressed concern, however, that publishers do not always get such designations correct. These ad tech companies are trying to go above and beyond the requirements of COPPA to ensure that targeted ads are not served on child-directed properties. This is where Pixalate’s compliance tools add significant value. Given that manual review of all of the transactions that take place in the ad ecosystem is virtually impossible and that third parties cannot often ascertain the intended audience of an app, Pixalate provides a solution that flags likely child-directed properties for further review or blocking of targeted ads by ad tech companies.
Please note that Pixalate’s assessment of COPPA risk, or potential child directedness, does not provide a legal conclusion regarding an app’s intended audience or the sufficiency of an app’s COPPA Rule compliance.
Pixalate uses a combination of signals to determine if an app is likely directed to children under 13. These signals include: app store category information (e.g., Games, Education, Entertainment); content rating (e.g., Everyone and Everyone 10+ in the Google Play store or 4+ and 9+ in the Apple App store); whether the app is in Google’s Teacher Approved program; and the presence of child keywords in app’s title or description. Additionally, Pixalate cross references apps between the two stores for consistency. If an app is likely child-directed in one store based on Pixalate’s automated methodology, the algorithm will match the equivalent version of the app in the other store and designate it as likely child-directed as well.
The table below provides an overview of Pixalate’s automated methodology for determining whether the audience for an app is likely child-directed. As explained above, child-directed includes mixed audience apps that may be targeting children as a portion of their audience. If an app is not assessed to be likely child-directed under Pixalate’s methodology, the algorithm will deem the audience to be likely general audience.
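To make the combination of signals concrete, here is a purely hypothetical sketch, not Pixalate's actual algorithm: the `AppSignals` structure, the category and rating sets, and the decision order are all assumptions for illustration.

```python
from dataclasses import dataclass, field

# Assumed signal sets, loosely based on the categories and ratings
# discussed in this document; not Pixalate's production lists.
CHILD_CATEGORIES = {"Games", "Education", "Entertainment", "Kids", "Family"}
CHILD_RATINGS = {"Everyone", "Everyone 10+", "4+", "9+"}

@dataclass
class AppSignals:
    category: str
    content_rating: str
    teacher_approved: bool = False
    child_keywords: set = field(default_factory=set)
    matched_child_directed_in_other_store: bool = False

def likely_child_directed(app: AppSignals) -> bool:
    if app.teacher_approved:                        # Teacher Approved alone is decisive
        return True
    if app.matched_child_directed_in_other_store:   # cross-store consistency
        return True
    # Category and rating are never used alone; they combine with keywords.
    if app.category in CHILD_CATEGORIES and app.content_rating in CHILD_RATINGS:
        return bool(app.child_keywords)
    return False
```

The key design point mirrored here is that no single weak signal (category or rating) triggers the designation on its own; only combinations, or strong standalone signals like Teacher Approved, do.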
Pixalate updates the COPPA Audience assessment weekly, but some components used in the assessment, such as Content Rating, Category, and App Description, are refreshed only monthly.
Pixalate uses app store content category and subcategory information, along with other app information, in assessing whether apps are likely child-directed.
Based on a manual review, Pixalate determined that most apps that target children fall into the Games, Education or Entertainment categories or subcategories in both of the app stores. As early as 2012, the FTC found the highest percentage of apps for children were located in the Games and Education categories in the Apple and Android stores. See the FTC’s Staff Report Mobile Apps for Kids: Current Privacy Disclosures are Disappointing, Table 1 at page 5. Based on Pixalate’s analysis, this remains true ten years later. Pixalate also found that a large number of apps that target children are in the Entertainment category. Pixalate uses the Games, Education and Entertainment categories/subcategories along with specific age ratings (discussed below) and child keywords (discussed below) to designate apps as likely child-directed.
Google states that app developers who have designed an app for children must participate in the Designed for Families program. Google also states that developers who have designed apps for everyone, including children and families, may apply to participate in the Designed for Families program. As part of the application, in addition to selecting a content rating, app developers declare that their app is designed for specific target age groups. Apps that are approved for the Designed for Families program appear in the Family category in the Play store. Pixalate uses this category information in combination with specific content ratings (discussed below) to designate apps as likely child-directed.
Apple provides app developers the ability to declare that their apps should be included in the Kids Category on the App Store. App developers that participate in this program are supposed to follow certain guidelines including adherence to children’s privacy laws. Apps in the Kids category are supposed to be designed for children 11 and under. In addition to selecting a content rating, app developers who have designed apps for Apple’s Kids category choose a target age range for the app (5 and under, 6-8, or 9-11). See Choosing a Category, Special Cases, Apps for Kids. Apple also allows app developers to assign their app to the Family subcategory. Pixalate uses this category information in combination with specific content ratings (discussed below) to designate apps as likely child-directed.
Periodically, both app stores rename their categories. In general, any categories that contain the words “Kids” or “Family” are utilized by Pixalate (along with other app information) to designate apps that are likely child-directed. For more detailed information about the app stores content category information that Pixalate uses in its automated methodology, refer to the Knowledge Base.
Both Google and Apple use ratings to describe the app’s content. These ratings do not describe whether an app is targeting children under 13. Pixalate does not rely on these ratings alone in its child-directed automated assessment. Rather, Pixalate uses these ratings only in combination with other factors in its methodology to assess whether apps are child-directed.
Google explains that content ratings are the responsibility of the app developers and the International Age Rating Coalition (IARC). See Apps & Games content ratings on Google Play. Google states, “Content ratings are used to describe the minimum maturity level of content in apps. However, content ratings don’t tell you whether an app is designed for users of a specific age.” See Age-based ratings & description. Rating standards vary by country or region. For North and South America, Google uses ratings that are maintained by the Entertainment Software Rating Board (ESRB). More information about these ratings can be found on the Google Play Help website.
Pixalate uses two of the ESRB content ratings (Everyone and Everyone 10+) in combination with other factors in its child-directed automated assessment. Similarly for apps from other regions, Pixalate uses content ratings that correspond with these ratings. Please refer to the Knowledge Base for a full listing of the content ratings that Pixalate uses in its child-directed automated assessment.
According to the Apple developer website, “An age rating is a required app information property used by the parental controls on the App Store.” Apple provides a list of content descriptions and the app developer identifies how frequently each content type appears in the app. These selections are converted into one of four App Store age ratings: 4+, 9+, 12+ or 17+. See App Store Preview, Get Started, Age Ratings and App Store Connect Help, My Apps, Age ratings.
Pixalate uses two of the Apple age ratings (4+ and 9+) in combination with other factors in its child-directed automated assessment.
Google Play has a program called Teacher Approved in which apps are evaluated by teachers and other specialists for age and content appropriateness. Google explains that teachers and specialists rate apps in the Designed for Families program (discussed above) based on design, appeal, and enrichment; age appropriateness; and the appropriateness of ads, in-app purchases, and cross-promotion. Teacher Approved apps are eligible to appear on the Kids tab on Google Play and display a Teacher Approved badge. Unlike most apps on Google Play, these apps also display the app’s target age group(s) for children under 13.
Any app that is in Google’s Teacher Approved program is designated as child-directed under Pixalate’s automated child-directed assessment.
Pixalate uses a curated list of child-related keywords in an app’s title or app store description to assess whether an app likely targets children under 13. Pixalate uses both a qualitative and quantitative approach for generating the curated child-related keyword list. Pixalate uses a statistical technique based on conditional entropy to determine the most important words/phrases used to describe apps for children. Pixalate also supplements the curated list of child-related keywords based on input from the educators in the Trust and Safety Advisory Board as they manually review apps.
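The conditional-entropy idea can be illustrated with a toy example: for each candidate word, measure how much uncertainty about the child-directed label remains once you know whether the word appears in an app's description. The corpus, labels, and function names below are illustrative only.

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c)

def conditional_entropy(word, docs):
    """H(label | word present?) over (token_set, label) pairs.
    Lower values mean the word is more informative about the label."""
    h, n = 0.0, len(docs)
    for present in (True, False):
        group = [label for tokens, label in docs if (word in tokens) == present]
        if group:
            h += (len(group) / n) * entropy(list(Counter(group).values()))
    return h

# Toy corpus: tokenized descriptions paired with audience labels.
docs = [({"fun", "toddler", "learning"}, "child"),
        ({"kids", "coloring"}, "child"),
        ({"invoice", "tax"}, "general"),
        ({"fitness", "tracker"}, "general")]
```

Ranking candidate words by ascending conditional entropy surfaces the terms that best separate child-directed descriptions from general-audience ones; a word that never appears leaves the label entropy unchanged.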
Pixalate has formed a Trust and Safety Advisory Board, helmed by a former FTC enforcer and composed of qualified educators, to review and assess whether apps are child-directed. The educators on Pixalate’s Trust and Safety Advisory Board make assessments of apps based on the child-directed factors outlined in the COPPA Rule discussed above. App review is an ongoing process. Pixalate prioritizes apps for review based on the popularity of the app measured by the number of downloads in Google Play and the number of reviews in Apple’s App store.
Pixalate approaches COPPA as a risk factor. Pixalate’s assessment of COPPA risk does not provide a legal conclusion regarding whether an app is “directed to children” under COPPA. Pixalate analyzes multiple signals and produces a risk score (low, medium, high or critical) that captures the likelihood that a given app is a potential COPPA risk. In order to do so, the following signals are used:
Because Pixalate continuously monitors these signals, a mobile app’s COPPA Overall Risk Assessment rating can change between low, medium, high and critical over time.
See above: PIXALATE’S AUTOMATED CHILD DIRECTED RISK ASSESSMENT METHODOLOGY and PIXALATE’S MANUAL CHILD DIRECTED RISK ASSESSMENT METHODOLOGY.
Operators that are covered by the COPPA Rule are required to post a clear and comprehensive online privacy policy describing their information practices for personal information collected online from children. Pixalate deems the lack of an identifiable privacy policy for a child-directed app to be a critical COPPA risk factor because it is a violation of COPPA for operators of websites and online services that collect, use, or disclose personal information from children to fail to post a privacy policy online. Additionally, the FTC recommends that all websites and online services - particularly those directed to children - post privacy policies online so visitors can easily learn about the operator’s information practices. Pixalate determines whether an app has a privacy policy based on information provided in the app stores. Additionally, Pixalate uses crawlers to scan developer websites for privacy policies. An app will be flagged as not having a privacy policy if it is not detected under either of these two methods.
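The two detection paths described above might be sketched as follows; the metadata field name, the regex, and the function names are assumptions for illustration:

```python
import re

# Path 2 looks for a "privacy policy" mention in crawled developer-site HTML.
POLICY_PATTERN = re.compile(r"privacy\s*policy", re.IGNORECASE)

def has_policy_in_store_metadata(metadata: dict) -> bool:
    # Path 1: the app store listing exposes a privacy policy URL.
    return bool(metadata.get("privacy_policy_url"))

def has_policy_on_site(html: str) -> bool:
    return bool(POLICY_PATTERN.search(html))

def app_has_privacy_policy(metadata: dict, site_html: str = "") -> bool:
    # Flagged as missing only when neither method detects a policy.
    return has_policy_in_store_metadata(metadata) or has_policy_on_site(site_html)
```

A real crawler would of course fetch and follow links rather than receive HTML as a string; this sketch only captures the "either method suffices" decision logic.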
Photographs, videos, or audio files where such files contain a child’s image or voice are personal information under the COPPA Rule. Geolocation sufficient to identify street name and name of a city or town is also personal information under the Rule. Mobile apps request access to certain device permissions in order to operate, such as access to the device’s camera, microphone or geolocation. In some cases, not all of the permissions that are requested are used. However, the fact that access to certain permissions has been requested creates additional risks since the permissions can be used at any time in the future. Pixalate has classified the most common mobile app permissions in terms of their COPPA risk, i.e. the risk to expose personal information. The permissions that Pixalate deems to be sensitive permissions are shown in the Knowledge Base.
A residential IP is an address assigned by an ISP to a home, and is associated with a single owner and location. IP addresses are persistent identifiers that fall under the definition of personal information in the COPPA Rule. Accordingly, Pixalate deems passing residential IP information to be a COPPA risk factor. Pixalate examines the traffic associated with an app and determines if the end-user IP is transmitted through the advertising pipeline, which exposes granular information about the user’s location. If the IP is passed, it is categorized based on network type, for example: Cable/DSL, Cellular Tower, etc. Cable/DSL IPs are residential IPs, which can be reverse geocoded to expose the location of the user. If the majority of the traffic from an app exposes such Cable/DSL IPs, then Pixalate flags it as passing residential IP traffic in the bidstream. Pixalate interprets the passing of a residential IP in the bidstream as a COPPA risk because advertising can be targeted using this information. However, if the last octet of the residential IP address has been truncated in the programmatic bidstream, it is no longer considered personal information under COPPA. Accordingly, Pixalate does not interpret the passing of a truncated residential IP in the bidstream as a COPPA risk.
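Truncating the last octet of an IPv4 address, as described above, can be done with a small helper like this illustrative one (the function name is an assumption):

```python
def truncate_ipv4(ip: str) -> str:
    """Replace the last octet with 0, e.g. 203.0.113.77 -> 203.0.113.0,
    so the address can no longer be reverse geocoded to a household."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError(f"not an IPv4 address: {ip!r}")
    return ".".join(octets[:3] + ["0"])
```

Applying such truncation before an IP enters the bidstream is one way a seller can keep coarse geographic signal while dropping the household-level precision that makes a residential IP a persistent identifier.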
Geolocation information sufficient to identify street name and name of a city or town is personal information under the COPPA Rule. Pixalate deems passing geolocation information to be a COPPA risk factor. Pixalate examines the traffic associated with an app and determines if the end-users’ GPS coordinates are being transmitted through the advertising pipeline that exposes granular information about the users’ locations. If traffic from an app exposes such GPS location data then it is flagged as passing location information.
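As a sketch of this check, one could look for precise coordinates in the bid request; OpenRTB 2.x carries them in `device.geo.lat` and `device.geo.lon`, and the function name here is an assumption:

```python
import json

def passes_gps_location(bid_request_json: str) -> bool:
    """True when the bid request exposes precise GPS coordinates
    via the OpenRTB device.geo object."""
    geo = (json.loads(bid_request_json)
           .get("device", {})
           .get("geo", {}))
    return geo.get("lat") is not None and geo.get("lon") is not None
```

A monitoring pipeline would apply this per request and flag an app whose traffic repeatedly exposes coordinates.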
The table below details the impact of various signal combinations on the COPPA risk score.
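Since the actual scoring table is not reproduced here, the following is a purely hypothetical sketch of how signal combinations could fold into the four risk tiers; all weights and cutoffs are assumptions, not Pixalate's scoring rules:

```python
def coppa_risk_tier(child_directed: bool,
                    has_privacy_policy: bool,
                    sensitive_permissions: bool,
                    passes_residential_ip: bool,
                    passes_gps: bool) -> str:
    """Map the risk factors discussed in this section to a tier."""
    if not child_directed:
        return "low"
    if not has_privacy_policy:
        return "critical"   # missing policy on a child-directed app
    # Each data-exposure signal bumps the tier one step.
    score = sum([sensitive_permissions, passes_residential_ip, passes_gps])
    return {0: "low", 1: "medium", 2: "high", 3: "critical"}[score]
```

The shape to note is that child-directedness acts as a gate: data-exposure signals only raise the tier once an app is assessed as likely child-directed.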
Pixalate’s COPPA Compliance Tools render opinions that Pixalate believes may be useful to our clients and others in the digital media industry. It is important to note, however, that the mere fact that an app appears to be directed to children (e.g., data subjects under 13 years of age, as defined by the COPPA Rule) does not mean that any such app, or its operator, is failing to comply with the COPPA Rule. Further, with respect to apps that appear to be child-directed and have characteristics that, in Pixalate’s opinion, may trigger related privacy obligations and/or risk, such assertions reflect Pixalate’s opinions (i.e., they are neither facts nor guarantees); and, although Pixalate’s methodologies used to render such opinions are derived from automated processing and at times coupled with human intervention, no assurances can be – or are – given by Pixalate with respect to the accuracy of any such opinions.
“As ad spend on channels like CTV grows by leaps and bounds, advertisers need greater transparency into their programmatic buys.”
Patrick McCormack
Head of Business Development and Global Partnerships, Yahoo
“MRT offers Criteo access to critical insights helping us evaluate brand safety signals and maintain our quality standards across our in-app supply globally.”
François Zolezzi
Head of Supply Quality, Criteo
Eric Bozinny
Senior Director, Marketplace Quality, PubMatic
“To ensure the quality and safety of all our LAN inventory, LinkedIn uses the MRT to evaluate publishers.”
Peter Turner
Business Development, LinkedIn Marketing Solutions
Disclaimer: The content of this page reflects Pixalate’s opinions with respect to the factors that Pixalate believes can be useful to the digital media industry. Any proprietary data shared is grounded in Pixalate’s proprietary technology and analytics, which Pixalate is continuously evaluating and updating. Any references to outside sources should not be construed as endorsements. Pixalate’s opinions are just that - opinion, not facts or guarantees.
Per the MRC, “‘Fraud’ is not intended to represent fraud as defined in various laws, statutes and ordinances or as conventionally used in U.S. Court or other legal proceedings, but rather a custom definition strictly for advertising measurement purposes.” Also per the MRC, “‘Invalid Traffic’ is defined generally as traffic that does not meet certain ad serving quality or completeness criteria, or otherwise does not represent legitimate ad traffic that should be included in measurement counts. Among the reasons why ad traffic may be deemed invalid is it is a result of non-human traffic (spiders, bots, etc.), or activity designed to produce fraudulent traffic.”