AI algorithms are starting to play a variety of roles in children's digital ecosystems, embedded in the connected toys, apps and services they interact with daily. Such AI systems offer children many advantages, from personalized teaching and learning through intelligent tutoring systems to content monitoring and filtering algorithms that proactively identify potentially harmful content or contexts before children encounter them.

AI systems in games and entertainment services provide personalized content recommendations, while social robots power interactive characters in ways that make them engaging and human-like. Going forward, AI systems will, in all likelihood, become even more pervasive in children's applications simply because of their usefulness in creating compelling, adaptive and personal user experiences.

Such systems could also help make applications more inclusive and accessible, as researchers explore ways to make them sensitive and adaptive to children's specific needs and abilities.

Major AI tools for children

Several apps now on the market leverage AI to help kids play more creatively.

  • Brickit: Having a massive pile of LEGO and no clue what to build next is a parent's nightmare. Brickit uses AI to tell you which bricks you have and suggest what you can build: users simply take a picture, and the app's AI identifies the pieces and proposes builds.
  • Petalica: This smart app uses AI to take kids' analogue drawings to another level. It is an online tool that harnesses machine learning to colorize children's line drawings; users can draw directly or upload their sketches.
  • Artbreeder: This software is for kids who want to experiment with images. It bills itself as a mash-up of human creativity and AI-driven software. The app hosts thousands of images and invites users to mix them into striking, and sometimes bizarre, new ones, generating and modifying faces, landscapes and paintings, among other categories.
  • Luca & Friends: Designed for kids ages 4-8, the app harnesses AI to provide an interactive learning experience in which players play games by moving to select the correct answers. Using basic movements and following simple directions, players might stretch or jump to "touch" or "catch" the correct answers, helping them build strength, endurance, coordination and flexibility while practicing English and Science, Technology, Engineering and Maths (STEM) skills. 

AI for children 

There has been increasing effort to regulate AI more responsibly. Regulatory frameworks have emerged that attempt to systematically characterize the risks of AI technologies and establish methods for identifying and mitigating them. Yet a 2018 UNICEF review of 20 national AI strategies showed that very little attention had been given to safeguarding children's rights in an algorithm-driven economy and society.

Meanwhile, a separate branch of work has focused more specifically on guiding AI (or digital) technologies for children. The two branches are related but developed separately, and they sometimes treat similar topics differently. The result is a confusing array of frameworks and guidelines addressing overlapping concerns, which can make it difficult for designers and practitioners to derive concrete design recommendations and standards.

Guidelines to remember 

As UNICEF and other organizations emphasize, we must pay specific attention to children as AI technology evolves, so that children's particular rights and needs are recognized. The potential impact of artificial intelligence on children deserves special attention, given children's heightened vulnerabilities and the numerous roles that artificial intelligence will play throughout the lifespan of individuals born in the 21st century.

As part of its AI for children project, UNICEF has developed policy guidance to promote children's rights in government and private sector AI policies and practices. The policy guidance explores AI systems and the ways in which they impact children, and it offers nine requirements for child-centered AI:

  • Support children’s development and well-being 
  • Ensure inclusion of and for children 
  • Prioritize fairness and non-discrimination for children 
  • Protect children’s data and privacy 
  • Ensure safety for children 
  • Provide transparency, explainability, and accountability for children 
  • Empower governments and businesses with knowledge of AI and children’s rights  
  • Prepare children for present and future developments in AI 
  • Create an enabling environment 

Understanding impact 

One of the major challenges in designing age-appropriate AI for children is translating AI policies and design guidelines into practice for AI developers. Understanding how AI systems are used in products for children, and what harms and impacts they have, is still a new and emerging area of investigation.

On the one hand, a growing body of literature characterizes AI harms across many kinds of applications. On the other, a complementary body of literature focuses on risks to children, and similarly includes primary research, proposed policies and codes of practice, such as the UK ICO's Age Appropriate Design Code and UNICEF's policy guidance on AI for children. Together, these diverse efforts leave designers and practitioners with a confusing outlook for creating concrete, safe designs for children. There is only one way to analyze the impact these AI tools have on children: to observe and learn.
