    Cognitive offloading: Parents, your job has changed in the AI era

    Humans are pros when it comes to cognitive offloading, meaning using tools to free up mental processing space and avoid thinking

    As the new school year gets underway, artificial intelligence is appearing in many aspects of teaching and learning. Spurred by the hopes that these tools will improve and personalise children’s learning, the large commercial AI labs are hard at work: Google rolled out 30 new education tools and features on Gemini in June; in late July, OpenAI introduced ChatGPT’s student tutor study mode.

Parents should be very wary about children having unfettered access to a new digital technology. We saw social media wreak havoc on young people’s emotional states soon after it debuted more than 20 years ago. With AI, it isn’t just children’s emotional well-being that’s at risk — it’s also their cognitive development. Parents can’t afford to wait for someone else to protect their children. They are, like it or not, the first line of defense and oversight.

Humans are pros when it comes to cognitive offloading, meaning using tools to free up mental processing space and avoid thinking. Why use a map when GPS can navigate? Students behave the same way when there’s a tool like ChatGPT or Gemini readily available to think for them. Some AI tools are carefully designed for education and can help children follow their curiosity, fill in learning gaps, and accommodate learning differences. But these tools stop being helpful when they start doing the thinking for children. When students put their essay prompts or problem sets into regular ChatGPT and it spits out perfect work, they are shortcutting their learning. If the training wheels on a child’s bike not only kept the rider upright but also pedalled and steered automatically, the child would be unlikely to learn to ride. When students use Gemini or DeepSeek to do their history homework for them, that’s what’s happening.

    This is what researchers at MIT recently found when they tested how AI affected writing skills. They split university students ages 18 to 39 into three groups: One wrote with ChatGPT from the start, the second wrote on their own but could use Google search, and the third group was not allowed to use any tools. Later, all of the students revised their writing using ChatGPT to help.

Those who wrote with ChatGPT from the beginning exhibited the worst writing quality and motivation, and brain-activity measurements showed that parts of their brains associated with learning were less active. They struggled to revise their writing because it was never theirs to begin with. Participants who drafted their work unaided performed best. Given that even well-educated university students are at risk, we should be even more worried about children who have yet to fully develop their thinking skills.

Parents are already in the dark about their children’s engagement in school, despite their best efforts to stay apprised. (Grades, for instance, give only a partial picture of a child’s learning.) AI is making this worse. Rectifying this parental awareness gap is paramount, since our research also shows that families are as influential as teachers and peers in helping young children and teens engage deeply in learning. If we don’t help our children use AI wisely, we risk a whole new level of learning loss: a nation of compliant and unmotivated young people who have not developed the muscles to struggle productively, think, work, and contribute to our communities. This is the opposite of the human creativity and problem-solving required to navigate the opportunities and pitfalls of our new AI age.

    The New York Times

    Jenny Anderson & Rebecca Winthrop