Common Arguments in Favour of AI

There are some common reasons that are given when talking about putting AI into classrooms.

“Kids are using AI anyway; putting it in the classroom means teachers have oversight”

Giving in to cheating

This frames cheating with AI as inevitable. Essay-writing services have existed for decades, yet inviting their use into classrooms was never considered.

“Students must learn to critically evaluate AI output”

Students cannot critique what they don’t know

“Hallucination”, the generation of false information, is a well-known problem with generative AI.

Proponents of AI try to re-frame this problem by saying students should “critically evaluate” anything created by AI tools.

If a student is encountering a subject for the first time, they have no way to evaluate whether the AI’s output is true or false.

“AI is the future, students must learn how to use it”

Using AI has negative impacts on learning

Putting AI into classrooms is often framed as “AI literacy”: the idea that AI is simply another tool that children should learn how to use.

However, peer-reviewed studies show that using AI has negative impacts on learning (Kosmyna 2025; Lehmann 2025) and critical thinking (Lee 2025).

It’s possible to make an argument for a single class in which students learn how AI works, along with its strengths and weaknesses. That is very different from recommending its use in every class.

“The classroom AI has safety measures”

Generative AI is inherently uncontrollable

SchoolAI, Microsoft and other companies that market to schools are aware that chat-bots have encouraged children to commit suicide and have engaged young children in conversations about sexual topics.

They reassure parents that their tools are safer than others.

However, because generative AI is driven by natural language, any guardrail such as “don’t talk about sexual topics” can be bypassed through prompt injection.

“AI is used for cutting-edge research, so it is important for children to use it in classrooms”

Chat-bots and scientific tools are not the same thing

Proponents of AI often mention how “AI” is used to find cancer cells or find patterns in large amounts of data.

They are trying to equate tools built to solve a specific scientific problem with general-purpose text-generation tools like ChatGPT and Copilot.

These are not the same thing at all.

The former are tools evaluated through peer-reviewed studies; the latter are commercial products designed to produce believable text.

“Text- and image-generation tools enhance creativity”

Shortcuts deny students opportunities to learn

Drawing and writing are difficult skills that take thousands of hours of practice.

Encouraging students to use these tools denies them the learning opportunities that come with trying, struggling and improving.

“AI keeps kids engaged”

Chat-bots are addictive

AI companies are quick to point out how engaging their tools are. Engagement is the metric they emphasise because studies have shown that tools like ChatGPT are terrible for learning and retention.

The snappy, interactive style of chat-bots decreases attention spans and comprehension.

The final point

There are doubtless other arguments made in favour of deploying AI in classrooms.

However, one fundamental question should be at the heart of any discussion:

Does putting AI in classrooms help students?

  • Do they retain what they learn?
  • Do they feel proud of what they make?
  • Does it help them with emotional growth and resilience?