Organizational Reports on AI in Education Share a Blind Spot: ‘Street Literacy’

By Jim Shimabukuro (assisted by Copilot)
Editor

International organizations such as the OECD, UNESCO, the World Bank, and EDUCAUSE have produced a steady stream of reports on artificial intelligence in education over the past several years, yet their analyses share a strikingly consistent institutional framing. Across these bodies, AI is conceptualized primarily as a tool for teachers, schools, and education systems, with attention focused on pedagogical integration, governance, ethics, and institutional readiness. The OECD’s Digital Education Outlook 2026, for example, devotes extensive attention to AI as a tutor, partner, or assistant within formal instructional settings, while treating student use outside school largely as a risk to be managed rather than a learning frontier to be understood.1

Image created by Copilot

UNESCO’s guidance similarly emphasizes teacher support, equity, and system‑level governance, while the World Bank frames AI as a lever for personalization and institutional modernization.2 EDUCAUSE, for its part, concentrates on higher‑education governance, academic integrity, and faculty‑centered instructional design.3 Although these organizations differ in tone and emphasis, they converge in their assumption that AI’s educational significance lies primarily within the boundaries of formal schooling.

This institutional lens, however, obscures a rapidly expanding reality: children and young adults are already using generative AI tools independently, outside school structures, to explore ideas, create content, solve problems, and learn about themselves and the world. Their engagement is curiosity‑driven, self‑directed, and woven into the rhythms of everyday digital life. While institutional reports tend to frame such use as a danger—warning of metacognitive offloading, false mastery, or academic dishonesty—the emerging evidence suggests that young people are developing new forms of informal learning that are neither captured nor adequately theorized by school‑centric frameworks.

To understand the educational landscape of the AI era, it is necessary to look beyond institutions and examine what youth are actually doing with AI on their own terms. Recent research from Pew Research Center, Common Sense Media, the National Literacy Trust, UNICEF, and a collaborative study from RAND, Brown University, and Harvard offers a window into the independent learning practices that are reshaping the meaning of education in real time.

Pew Research Center’s report “Teens, Social Media and AI Chatbots 2025” provides one of the clearest empirical portraits of how adolescents are integrating AI chatbots into their everyday lives outside school. Surveying more than 1,400 U.S. teens aged 13–17, Pew found that 64 percent had used chatbots, with nearly one‑third doing so daily, placing AI tools alongside social media as a routine component of youth digital experience.4 Although the report does not explicitly frame chatbot use as learning, its findings reveal a new ecology of informal knowledge‑seeking in which teens move fluidly between videos, search engines, and conversational AI to satisfy curiosity, troubleshoot problems, and explore personal interests.

The normalization of chatbot use suggests that AI is becoming a ubiquitous cognitive companion, shaping how young people ask questions and construct understanding. Rather than treating this as a deviation from formal learning, Pew’s data invites recognition that independent AI engagement is already a central feature of adolescent intellectual life.

Common Sense Media’s recent reporting on youth use of generative AI further illuminates how children and teens are experimenting with AI tools at home and in unstructured time. Their surveys show that students are not only using AI for homework assistance but also for creative writing, idea generation, and exploration of personal interests that extend beyond curricular boundaries.5 By situating AI use within broader patterns of digital citizenship, Common Sense Media highlights the emergence of informal heuristics through which young people assess the trustworthiness, usefulness, and limitations of AI outputs.

Many youth describe blending AI suggestions with their own thinking, revising or challenging AI responses, and using chatbots as springboards for creative or intellectual exploration.5 This “street literacy” around AI—developed outside classrooms and often without adult guidance—constitutes a powerful form of independent learning that institutional reports rarely acknowledge. It reflects a shift in how young people navigate information, authorship, and creativity in an AI‑saturated environment.

The National Literacy Trust’s 2024 report by Irene Picton and Christina Clark offers one of the most direct examinations of how young people are using generative AI to support literacy practices, including uses that extend beyond school assignments. Drawing on data from more than 50,000 children and young people, the report found that three in four adolescents aged 13–18 had used generative AI in 2024, with many reporting that they used it to help with writing in general and one in five using it to write stories.6 These findings suggest that AI is becoming a creative partner in youth literacy development, enabling experimentation with narrative forms, vocabulary, and expression.

Importantly, most young people reported adding their own thoughts to AI outputs, indicating a hybrid authorship model in which AI serves as a scaffold or collaborator rather than a replacement. At the same time, the report notes that some youth copy AI responses without modification, underscoring the need for critical literacy skills that match the affordances of generative tools.6 By framing AI as a force that may “redefine what it means to be literate in the digital age,” the National Literacy Trust recognizes that independent AI use is reshaping youth engagement with reading and writing in ways that transcend school‑based definitions of literacy.

UNICEF’s explainer “Generative AI: Risks and opportunities for children” provides a global, child‑rights‑oriented perspective that explicitly acknowledges children’s independent use of generative AI in everyday life.7 While the document emphasizes the need to balance empowerment with protection, it also notes that children and youth may be using generative AI more than adults and often without the knowledge of parents or teachers. Many young adults report using ChatGPT not only for work tasks but also “to learn something new,” suggesting that AI is functioning as an informal tutor and exploratory partner outside institutional oversight.7

UNICEF highlights that children use AI for homework, decision‑making, and entertainment, often turning to chatbots for immediate answers or guidance.7 Although framed cautiously, the explainer implicitly recognizes that independent AI use is not peripheral but central to contemporary childhood, and that understanding these practices is essential for designing policies that support children’s rights, agency, and development in an AI‑mediated world.

A 2025 study published in JAMA Network Open by Jonathan Cantor and colleagues from RAND, Brown University, and Harvard Medical School offers a compelling lens on a specific but deeply educational form of independent AI use: adolescents and young adults turning to chatbots for mental health advice. Surveying more than 1,000 individuals aged 12–21, the researchers found that one in eight use generative AI tools such as ChatGPT for help when feeling sad, angry, or nervous, with usage highest among those aged 18–21.8

Among those who seek such support, two‑thirds engage at least monthly, and more than 93 percent report that the advice is helpful. Although the study focuses on health rather than education, it reveals a powerful form of self‑directed learning in which young people use AI to understand their emotions, rehearse conversations, and explore coping strategies outside formal therapeutic or educational settings.8 This form of AI‑mediated learning challenges narrow definitions of education by highlighting how youth use chatbots to construct knowledge about themselves and their relationships, demonstrating that independent learning with AI extends into social‑emotional domains that institutions rarely address.

Taken together, these studies reveal a vibrant, complex landscape of independent learning that is unfolding beyond the boundaries of formal schooling. While international organizations continue to frame AI primarily as a tool for teachers and institutions, young people are already using generative AI to explore ideas, express themselves, navigate emotions, and pursue knowledge on their own terms. This divergence between institutional narratives and lived youth practices signals a deeper paradigm shift: education in the AI era is no longer confined to classrooms or curricula but is increasingly distributed across the digital environments where young people spend their time.

The challenge for policymakers, educators, and researchers is not merely to integrate AI into schools but to understand and support the independent learning ecosystems that children and young adults are already building. Recognizing this reality requires moving beyond risk‑centric framings and embracing a more expansive view of education—one that honors curiosity, agency, and the diverse ways young people learn in a world where AI is always within reach.

References

  1. OECD. OECD Digital Education Outlook 2026: Exploring Effective Uses of Generative AI in Education. https://doi.org/10.1787/062a7394-en
  2. UNESCO. AI and the Future of Education. https://www.unesco.org
  3. EDUCAUSE. 2024–2026 Horizon Reports on AI in Higher Education. https://www.educause.edu
  4. Pew Research Center. “Teens, Social Media and AI Chatbots 2025.” https://www.pewresearch.org/internet/2025/12/09/teens-social-media-and-ai-chatbots-2025
  5. Common Sense Media. “New Report Shows Students Are Embracing Artificial Intelligence.” https://www.commonsensemedia.org
  6. Picton, I., & Clark, C. “Children and young people’s use of generative AI to support literacy in 2024.” National Literacy Trust. https://literacytrust.org.uk
  7. UNICEF. “Generative AI: Risks and opportunities for children.” https://www.unicef.org
  8. Cantor, J., et al. “One in eight adolescents and young adults use AI chatbots for mental health advice.” Brown University School of Public Health / JAMA Network Open. https://sph.brown.edu/news/2025-mental-health-ai-chatbots

###