Guardian readers share the ways and reasons they are preparing their children and students for a future that may necessitate familiarity with generative artificial intelligence
Since the release of ChatGPT in late 2022, generative artificial intelligence has trickled down from adults in their offices to university students in campus libraries to teenagers in high school hallways. Now it's reaching the youngest among us, and parents and teachers are grappling with the most responsible way to introduce their under-13s to a new technology that may fundamentally reshape the future. Though the terms of service for ChatGPT, Google's Gemini and other AI models specify that the tools are only meant for those over 13, parents and teachers are taking the matter of AI education into their own hands.
Inspired by a story we published on parents who are teaching their children to use AI to set them up for success in school and at work, we asked Guardian readers how and why - or why not - others are doing the same. Though our original story only concerned parents, we have also included teachers in the responses published below, as preparing children for future studies and jobs is one of educators' responsibilities as well.
Some parents and teachers told us they are going full steam ahead, integrating ChatGPT into everyday interactions with their children to explain new concepts, answer incessant questions and illustrate fanciful tales. Others, wary of the dangers AI can pose to young people, are demonstrating it only a little at a time in closely supervised settings.
Still other parents and teachers are refraining from showing their children and students how to use AI at all, concerned about harms and the impossibility of ethical use of a technology still rife with unresolved copyright questions. We felt their responses made valuable contributions to the discussion, too.
The answers below have been edited for length and clarity.
I swear, watching my kids discover AI has been one of those unexpected parenting joys nobody tells you about. Remember when we all used to say, 'Let's Google it'? Well, my nine-year-old has completely ditched that phrase. Now it's always 'Daddy, can we ChatGPT it?' whenever he's stuck on homework. The best part? He's learning to ask for hints instead of answers - figuring things out himself but with a little AI nudge.
Then there's my six-year-old - this kid's questions could outlast anyone's patience! You know those evenings when your brain is completely fried but they're still going strong? That's when our AI buddy steps in. He will chat away for half an hour, bouncing from dinosaurs to space to 'why is water wet?' while I secretly recharge my mental batteries.
My daughter is only three but already thinks she's royalty. Instead of reading the same princess books for the millionth time, we now create stories where SHE'S the princess having these wild adventures.
-Matt, consultant, 44, Palm Beach Gardens, Florida
I use AI as a kind of intellectual backstop when I'm having a conversation with my children and I can't answer one of their questions. We have AI voice assistants at home and in the car, so this has recently involved asking Alexa for the population of China and for examples of how to use the word 'credential' for homework. It fast-forwards the conversation and keeps the learning flowing. My children are aware that AI can hallucinate, however, so you need to sense-check responses. Alexa recently came up with some incorrect answers about the book Anne of Green Gables, which my eight-year-old challenged!
-Graham, writer, Lamberhurst Quarter
I use AI tools with my three-year-old to help explore the world. We use apps to identify birdsongs and plants on our daily walks, and ChatGPT to answer questions he has about the world ("what are bones for? Where did my bones come from?") especially when we get new books from the library on something he hasn't learned much about yet. I want him to feel comfortable turning to AI to augment his curiosity. At the same time, we minimize all passive play, including screen time.
For now, I try to make time for us to tell stories to each other and do imaginative play for the topics he asks AI about, so he has a chance to do some generative engagement.
AI is a daily part of our routine, but it's not a particularly high priority.
-Nate, data scientist, East Bay, California
I am currently introducing OpenAI's ChatGPT to my oldest daughter, describing it as a creative helper that can be both friend and foe, and teaching her basic prompts to make AI work for her benefit. I show her my daily use of AI at work: drafting emails, optimising event planning, structuring data. I am teaching her that you always have to be sceptical when using AI and should not rely completely on its answers, which can in fact be wrong.
-Ben, 47, entrepreneur, Germany
My son understands Italian (my language) but speaks back in Dutch. My Dutch is very poor. I occasionally use ChatGPT's voice interaction tool to translate more complicated things from Italian or English into Dutch - with the upside that the output can be tailored to a six-year-old.
-Name withheld, Netherlands
To introduce them to AI, we logged into ChatGPT and generated images based on their prompts. While their understanding is still very limited, they are beginning to grasp the idea that you can create fake images and videos online.
We then worked together to create a story, which we gave to ChatGPT to generate a screenplay of a news report. Taking this a step further, we used NotebookLM to convert it into a podcast, which I played through our car stereo. The podcast was formatted as a legitimate-sounding news report, but the story itself was absurd - our Shetland pony turning into a unicorn and teaming up with a toy gibbon to fight evil sheep in Co Kerry.
After listening, I asked them whether we should believe the news report. We then searched online for evidence of the story and, unsurprisingly, found nothing. This helped illustrate the importance of verifying information, even when it sounds official. I have no idea how much of this lesson will stick, but it's an ongoing conversation.
My five-year-old, in particular, is starting to grasp that AI can generate fake content, and my hope is that this awareness will help him recognise or at least question misleading material in the future.
-David, software developer, Ireland
For pupils struggling with writing, I encourage them to ask AI for feedback at a sentence level, requesting information about why sentences don't parse, together with grammatical explanations.
I encourage them not to use AI to produce whole essays because that's not how they will learn. Some still ignore this advice, of course!
-Jenny, English and drama teacher, Valencia, Spain
I have a girl, six, and a boy, 11. Both access ChatGPT via myself. They know that if dad does not know something, they may insist he asks ChatGPT. And I often do, while curating the response. The boy may also access it himself, most often to ask questions about the world around him.
I want my kids to understand that LLMs are a tool to be used to achieve our goals. We also talked about kids who do homework with AI. I explained that we humans also have computers in our heads, which have to be trained in order for us to be successful and own the tools of our success.
-Anton, fintech director, Geneva
I have been frequently rendered speechless over these past two years by all the changes introduced in my profession by AI. I have an 11-year-old and I am staunchly trying to ensure that the first (and possibly only) intelligence she uses is her own; she is aware of the existence of ChatGPT, AI translation engines and the like, but does not use them for schoolwork or any other writing.
-Name withheld, translator, Slovenia
As a guideline, I generally tell students to ask AI the questions that they would ask their teachers. Would they ask their teacher to write them an essay? No. Would they ask their teacher how to improve their essay? Yes.
At the same time, I remind students that AI is inaccurate. It will lie and provide false information. They will need to exercise their critical thinking and fact-checking skills. While we do have guidelines in place to cite AI use, I tell my students they should never cite AI, similar to my guidelines around Wikipedia.
-Adam, secondary English teacher, Vancouver, BC, Canada
Whenever I have used AI in the classroom, it has been to spark children's imaginations rather than replace them. Adobe Firefly is great for encouraging descriptive writing in English: I simply type in children's noun phrases or lists of adjectives, and AI creates an image based on the description. We can then adjust our vocabulary and take great delight in how literally AI interprets it. This is also a really important lesson, as children can see first-hand that AI has its flaws.
I have also used Character.AI to bring historical characters to life, generating quotations and responses for children based on their questions.
The children always ask for the website addresses so they can have a go at home, but I have not shared these directly. I advise the children that AI can sometimes appear so human that they become confused and start believing that the voice that answers is real. So AI tools and websites should always be used with an adult who can remind them of this and keep them safe online.
-Angie, primary teacher, Tunbridge Wells
I have chosen to introduce AI in generative ways, making clear that this is a tool and that it should be used. You need to know how to wield a hammer before you can use it; I see AI the same way. Students more or less see it the same. Their teachers often don't. I predominantly teach Māori and Pasifika students, and although they are often - and rightfully! - annoyed that their culture is sometimes misunderstood by these AI models, the technology is so inherently compelling that students take to it faster than their teachers.
I have done simple things like, during a speech unit, having students use it to generate a speech on a random unfamiliar topic from a list, to get them up and talking.
-Adam, 28, high school teacher, Christchurch, New Zealand
My 11-year-old son uses it to help with homework and studying. He needs to get ideas and then reformulate the answer in his own words, so it's not a copy-and-paste job.
-Joanna, housewife and carer, Bath
I started introducing AI to my firstborn, who is in secondary school, while he was doing his school homework.
My approach has been to show him how AI, particularly ChatGPT, can help him better understand concepts, and to encourage responsible use. As an ardent user of technology, I believe that getting to know AI early will serve him well, since the AI revolution is here to stay.
-Richard, university lecturer, Uganda
My five-year-old son barely even knows the internet exists. Computers are for video calls, games, movies, and reading the day's temperature off the Met Office website. He definitely doesn't need to know anything about LLMs at this stage and certainly shouldn't use them. He needs to learn how to write his own words and draw his own pictures, not use machines for plagiarising and stunting his imagination. If LLMs come up, I would strongly encourage my son not to use them.
They are built on the backs of labour and art from writers and artists who have not licensed their work for this purpose. They are plagiarism machines. They can't create anything new.
I would tell him to be suspicious of any machine wanting to write words for him, which is tantamount to thinking his thoughts for him. Young children deserve opportunities to develop their critical thinking skills and imaginations.
-Name withheld, higher education administrator, Oxford
I am not introducing it to mine - 10 and eight years old. They are extremely sophisticated readers but not great writers. One has ADHD and is always attracted to distraction and shortcuts, ie, instinctively avoids hard work.