The ICO’s Age Appropriate Design Code Comes of Age
As well as being relevant to digital services which are targeted at or particularly popular with children, the Code will also be relevant to all providers of Information Society Services (ISS) in the UK – which include games, apps and websites as well as connected toys and IoT devices – including some not-for-profit providers, where there is a likelihood that children may actually use their services.
The Code is not a new law, but a Statutory Code of Practice under the Data Protection Act 2018, which sets 15 standards and explains how GDPR will apply in the context of children using digital services. The ICO and the courts will take the Code into account in deciding whether the requirements of GDPR and PECR have been met for the purposes of enforcement action.
The Code is the first of its kind anywhere in the world and reflects not only increasing policy demands to protect children’s privacy from the likes of the 5Rights Foundation but also the demands of younger users themselves as vocalised recently in relation to family tracking apps such as Life360.
Who is considered a child and which services are in scope?
A child is anybody under the age of 18 – a much higher threshold than the age of 13 set by current UK data protection law for children's consent to online services.
The Code applies to all ISS which are likely to be accessed by under 18s in the UK, not only those which are targeted at them.
In practice this covers a very broad range of services including apps, websites, social media platforms, search engines, messaging and internet telephony services, online marketplaces, content streaming services, online games, news and educational websites, connected toys and IoT devices.
It also includes services provided on an indirect charging basis, e.g. funded by advertising.
Not-for-profit services may also be included if they are the types of services which are typically provided on a commercial basis.
Services which only provide information or bookings for “real world” services are likely to be outside of scope, as are online counselling or preventative services.
Who does it apply to?
The Code applies to providers based in the UK, and to providers based outside of the UK if they provide services to, or monitor, users based in the UK.
When is a service “likely” to be accessed by a child?
The ICO’s view is that the possibility of a service being accessed by a child has to be more probable than not, taking into account the nature and content of the service and whether it has a particular appeal for children, as well as the way in which the service is accessed and any measures which have been implemented to prevent children gaining access.
A “common sense” approach should be taken: rather than focusing on making a service compliant when you would not want children to access it in any event, the ICO recommends that you focus on preventing children from gaining access in the first place.
What are the standards of age appropriate design?
- The best interests of the child: This should be a primary consideration when designing and developing online services likely to be accessed by a child. This means the design of the service should take account of the child’s privacy needs and how they can be best supported by the design of the service.
- Data protection impact assessment: You should undertake a DPIA before launch to assess and mitigate the risks to the rights and freedoms of children likely to use your product or service. You should also undertake a DPIA of your existing services to understand what changes may be required.
- Age appropriate application: Take a risk-based approach to recognising the age of individual users and ensure you effectively apply the standards in this Code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing, or if that is not possible, or you do not wish to do this, you should apply the standards in the Code to all your users instead.
- Transparency: The privacy information you provide must be concise, prominent and in clear language suited to the age of the child. Provide additional specific “bite-sized” child-accessible explanations about how you use personal data at the point that use is activated.
- Detrimental use of data: Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice (such as the CAP Code for marketing and advertising).
- Policies and community standards: Uphold your own published terms, policies and community standards including privacy policies, age restriction, behaviour rules and content policies.
- Default settings: Settings must be “high privacy” by default unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child.
- Data minimisation: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
- Data sharing: Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
- Geolocation: Switch geolocation options off by default unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child. If geolocation services are additional to the core service, then these should be subject to separate privacy settings. Provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to “off” at the end of each session.
- Parental controls: If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
- Profiling: Switch options which use profiling “off” by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing), and use separate privacy settings for each different type of profiling.
- Nudge techniques: Do not use nudge techniques or other tools to lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections.
- Connected toys and devices (IoT): If you provide a connected toy or device ensure you include effective tools to enable conformance to this Code.
- Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
What should you be doing now?
Consider whether your service is or is likely to be accessed by children and, if so, what you already know about the age of the children accessing your service. The Code does not mandate any specific age-verification methods but the ICO has provided some guidance on the ways in which you might be able to determine the ages of the users of your services. Reliance on self-declaration alone is unlikely to be enough, unless the risks to children are very low.
Review your existing age-verification mechanisms and consider whether any new mechanisms need to be adopted.
Update your standard Data Protection Impact Assessment template for new services to include elements which demonstrate how the requirements of the Code have been met, as well as conducting updated DPIAs on your existing services (if you are a larger organisation the ICO also expects you to consult with children and parents). The ICO has also prepared a template DPIA which you may wish to consider.
You should already be complying with the fundamental requirements of GDPR, but do review your existing privacy notices, policies, tools and other information provided to users and consider whether they are adequate for the age-groups of your child users. Consider how you might make information and tools more accessible and easy to understand for the different user age-groups.
Consider whether you also need to make any changes to the design of your services, including your data collection, default privacy settings, or features such as nudge settings.
If you consider that your services are not likely to be accessed by children then you should nevertheless document why you think this is the case, in case you need to justify this to the ICO at a later date.
It will be interesting to see how the Code is taken up. Clearly the experience of Life360 shows that there are compelling commercial, as well as legal, reasons for many providers of digital services to make changes to account for the needs and risks associated with younger users sooner rather than later.