Where current legislative proposals fall short
One example of concerning legislation is Utah's App Store Accountability Act. The bill requires app stores to share whether a user is a kid or teenager with every app developer (effectively millions of individual companies), without parental consent or any rules on how the information may be used. That raises real privacy and safety risks, like the potential for bad actors to sell the data or use it for other nefarious purposes.
This level of data sharing isn’t necessary — a weather app doesn’t need to know if a user is a kid. By contrast, a social media app does need to make significant decisions about age-appropriate content and features. As written, however, the bill helps social media companies avoid that responsibility despite the fact that apps are just one of many ways that kids can access these platforms. And by requiring app stores to obtain parental consent for every single app download, it dictates how parents supervise their kids and potentially cuts teens off from digital services like educational or navigation apps.
A legislative framework that better protects kids
By contrast, we are focused on solutions that require appropriate user consent and minimize data exposure. Our legislative framework, which we'll share with lawmakers as we continue to engage on this issue, calls for app stores to securely provide industry-standard age assurances only to developers who actually need them, and ensures that the information is used responsibly. Here are more details:
- Privacy-preserving age signal shared only with consent: Some legislation, including the Utah bill, requires app stores to send age information to all developers without permission from the user or their parents. In our proposal, only developers whose apps may be risky for minors would request industry-standard age signals from app stores, and the information would be shared only with permission from a user (or their parent). Sharing only with developers who need the information to deliver age-appropriate experiences, and sharing only the minimum amount of data needed to provide an age signal, reduces the risk of sensitive information being spread broadly. (The sketch after this list illustrates this consent-gated flow.)
- Appropriate safety measures within apps: Under our proposal, an age signal helps a developer understand whether a user is an adult or a minor; the developer is then responsible for applying the appropriate safety and privacy protections. For example, an app developer might filter out certain types of content, introduce "take a break" reminders, or offer different privacy settings when they know a user might be a minor. Because developers know their apps best, they are best positioned to determine when and where an age gate might benefit their users, and that judgment may evolve over time. This is another reason why a one-size-fits-all approach won't adequately protect kids.
- Responsible use of age signals: Some legislative proposals create new child safety risks because they establish no guardrails against developers misusing an age signal. Our proposal helps to ensure that any age signals are used responsibly, with clear consequences for developers who violate users’ trust. For example, it protects against a developer improperly accessing or sharing the age signal.
- No personalized ads for minors: Alongside any age assurance proposal, we support banning personalized advertisements targeting users under 18 as an industry standard. At Google, this is a practice we've long disallowed. It's time for other companies to follow suit.
- Centralized parental controls: Recognizing that parents sometimes feel overwhelmed by managing parental controls app by app, our proposal would provide a centralized dashboard that lets parents manage their children's online activity across different apps in one place, and that developers can easily integrate with.
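To make the proposed data flow concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the framework does not define an API, and names like AgeBand, request_age_signal and apply_protections are ours, chosen only to illustrate the two properties described above, namely that the store shares nothing without consent, and that even with consent it shares only a coarse age band, never a birthdate or exact age.

```python
# Hypothetical sketch of a consent-gated age-signal exchange.
# No such API exists in the proposal or in any app store SDK;
# all names here are illustrative assumptions.
from dataclasses import dataclass, field
from enum import Enum


class AgeBand(Enum):
    """Coarse bands only: the signal carries no birthdate or exact age."""
    UNDER_13 = "under_13"
    TEEN = "13_17"
    ADULT = "18_plus"


@dataclass
class Developer:
    name: str
    needs_age_signal: bool  # true only for apps that may be risky for minors


@dataclass
class User:
    age_band: AgeBand
    # Developers the user (or their parent) has approved for age-signal sharing
    consented_developers: set[str] = field(default_factory=set)


def request_age_signal(user: User, developer: Developer) -> AgeBand | None:
    """App-store side: share the minimal signal only with developers who
    need it, and only with explicit user or parental consent."""
    if not developer.needs_age_signal:  # e.g. a weather app: no signal at all
        return None
    if developer.name not in user.consented_developers:
        return None  # no consent, no signal
    return user.age_band


def apply_protections(age: AgeBand | None) -> list[str]:
    """Developer side: decide which protections to apply for this user."""
    if age is None or age is AgeBand.ADULT:
        return []  # default adult experience
    protections = ["filter_mature_content", "break_reminders"]
    if age is AgeBand.UNDER_13:
        protections.append("strictest_privacy_defaults")
    return protections


# Example: a teen consents to a social app but not to a weather app.
teen = User(AgeBand.TEEN, consented_developers={"SocialApp"})
social = Developer("SocialApp", needs_age_signal=True)
weather = Developer("WeatherApp", needs_age_signal=False)

print(apply_protections(request_age_signal(teen, social)))   # teen protections
print(apply_protections(request_age_signal(teen, weather)))  # [] (no signal shared)
```

The key design choice the sketch tries to capture is that no sharing is the default: a developer that doesn't need the signal never sees it, and even a developer that does need it, and has consent, receives only one of three coarse bands.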
This post was first published on Google's blog, “The Keyword,” by Kareem Ghanem, the company's Director for Public Policy.