Does FTC Report on Data Collection for Kid-Apps Presage Rules for All Developers?

The US Federal Trade Commission has just released a report on apps directed at kids. It covers privacy and data collection, and it arrives right in the wake of the iPhone contacts-upload scandal involving multiple app developers.

Undertaken under the enforcement umbrella of the Children's Online Privacy Protection Rule, the report contains a number of interesting findings and observations. One could argue the findings are limited to the special needs and circumstances of children. In my view, however, the FTC is seeking to establish principles that would apply more broadly to apps and developers in general. Accordingly, there may be implications for in-app advertising if formal rules around data-sharing disclosures and user controls are established. (Location-sharing permissions are a kind of model here.)

The FTC report found more than "8,000 results in the Apple App Store and over 3,600 in the Android Market" that responded to the search query "kids." The FTC then analyzed 480 of the top kids' apps from the iTunes App Store and the Android Market. The basic finding of the analysis is that kids' apps do little to explain what functions of the phone they access or what data they capture (and share with third parties):

[A]cross the wide range of “kids” apps examined in the survey, staff found very little information about the data collection or sharing practices of these apps. Apple’s and Google’s mobile operating systems and app stores provide limited notice to users regarding app capabilities, and leave the bulk of disclosure to individual app developers. In most instances, staff was unable to determine from the information on the app store page or the developer’s landing page whether an app collected any data, let alone the type of data collected, the purpose for such collection, and who collected or obtained access to such data . . .

The FTC expressed disappointment with the paucity of information about "permissions" and data collection: 

Of the 182 Android apps indicating they were intended for use by kids, only 24% specified that the app required “no special permissions to run” – i.e., that a child could use the app without the app accessing any information or capabilities from the mobile device. Conversely, 76% indicated that the app required at least one “permission” to run . . .

[The] Apple app promotion pages that staff examined provided almost no information on individual developers’ data collection and sharing practices. Similarly, the Android app promotion pages that staff examined provided little information other than the mandatory “permissions.” Only three (1.5%) of the 200 Android apps even attempted to convey information about the purpose for the “permissions.”
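
For context on what those "permissions" are: an Android app declares them in its manifest, and that list is what the Android Market surfaces on an app's install page. As a rough, hypothetical Java sketch (the class and method names are mine, not the report's), an app already on the device could read another app's declared permissions like this:

    import android.content.Context;
    import android.content.pm.PackageInfo;
    import android.content.pm.PackageManager;

    public class AppPermissionLister {

        // Return the "permissions" a given installed app declares in its
        // manifest -- the same list the Android Market shows before install.
        public static String[] declaredPermissions(Context context, String packageName)
                throws PackageManager.NameNotFoundException {
            PackageInfo info = context.getPackageManager()
                    .getPackageInfo(packageName, PackageManager.GET_PERMISSIONS);
            // A null result corresponds to the "no special permissions to run"
            // case the FTC describes above.
            return info.requestedPermissions;
        }
    }

The point of the sketch is simply that the raw permission list is machine-readable; what the FTC says is missing is any plain-language explanation of why those permissions are requested.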

As a result, the FTC wants more disclosures and more parental controls. It wants the app stores (Apple and Google) to do much more to provide information to end users:

  • All members of the "kids app ecosystem" – the stores, developers and third parties providing services – should play an active role in providing key information to parents.
  • App developers should provide data practices information in simple and short disclosures. They also should disclose whether the app connects with social media, and whether it contains ads. Third parties that collect data also should disclose their privacy practices.
  • App stores also should take responsibility for ensuring that parents have basic information. "As gatekeepers of the app marketplace, the app stores should do more." The report notes that the stores already provide the architecture for sharing pricing and category data, and should be able to offer a way for developers to disclose their data collection and sharing practices.

Even though this all comes under the specific banner of protecting children, I see it as a template for something broader that could eventually apply to all apps and developers.

If I'm correct, the troubling aspect of all this for app developers and publishers would be the required disclosure of ad-related data sharing, along with any requirement that consumers be given the option to block it or opt out. My guess is that most consumers, if they knew how, would block such data sharing.

That option would compromise the efficacy of ad networks and their targeting capabilities. And while this is speculation on my part, I don't think it's all that wild.