A Guide to Navigating AI as a Family
Foundry10 and UC Irvine collaborate on a new report that provides caregivers of K-12 students with strategies for helping youth use AI responsibly.
Generative AI tools can make life easier for kids and teenagers by, for example, helping them explore fun ideas for a science project or create novel graphics. However, GenAI can also make it easier for students to cheat on assignments, and it can expose them to harms such as deepfake photos. The broad impact of new applications such as ChatGPT, and the speed with which this technology has become integrated into the lives of young people, has left caregivers in uncharted territory.
“Things are moving quickly, and parents, teachers, and students alike are all working through what they think and how they reason about AI,” says Gillian R. Hayes, Kleist Professor of Informatics in the Donald Bren School of Information and Computer Sciences (ICS) and Vice Provost of Academic Personnel at UC Irvine.

To help make sense of it all, Hayes and colleagues from UC Irvine worked with collaborators from Foundry10, an education research organization focused on expanding ideas about learning and creating direct value for young people. Together, the team of researchers published a technical report, “Navigating AI as a Family: Caregivers’ Perspectives and Strategies.” The report offers strategies for helping youth build AI literacy, presenting research-backed recommendations and resources for caregivers of children in kindergarten through 12th grade.
Identifying Caregiver Profiles
A key takeaway from the report is the identification of four distinct caregiver profiles, each reflecting a different way of viewing and approaching AI:
- curious newcomer — still in the early stages of exploring AI;
- discerning optimist — has some experience with AI and a deeper understanding of its capabilities;
- concerned critic — deeply skeptical and cautious toward AI; and
- tech-savvy enthusiast — knowledgeable about AI and eager to integrate it into their children’s lives.
“Each profile reflects different attitudes toward integrating AI into both their own lives and their children’s lives, highlighting the diverse perspectives caregivers have on generative AI,” says Aehong Min, an informatics postdoctoral scholar working in the STAR Lab at UCI. “These perspectives balance excitement about AI’s educational potential with concerns about its impact on children’s critical thinking, social development and ethical reasoning.”
Offering Recommendations and Resources
The report offers caregivers evidence-based recommendations in the following three areas:
- Exploring and co-creating with AI, sharing resources to help caregivers expand their understanding of AI and suggesting how to introduce AI at developmentally appropriate ages to ensure children gain critical-thinking skills.
- Building strong ethics, providing strategies for ensuring originality and incorporating ethical AI use into daily life.
- Moderating AI use, explaining how to set clear boundaries for AI use in schoolwork and how to support independent skill development.
“Families want more support and guidance on how to use AI ethically and responsibly,” says Kelli Dickerson, Director of Research for UCI CERES (Connecting the EdTech Research EcoSystem). “Many parents are figuring things out as they go, trying to understand how AI can fit into their family life and how they can strike the right balance between making the most of its benefits and ensuring their children use it safely and responsibly.”
The recommendations in this report guide families in exploring and using AI in ways that align with their values, setting children up for future success in a world of rapid technological change. The report also helps educators who hope to build AI literacy by effectively introducing and leveraging AI in their classrooms.
“UCI and Foundry10 are continuing to collaborate,” says Hayes, “working to figure out how everyone can best work together in the interest of our youth.”
— Shani Murray