CBC Interview with Christina Walker

CBC released an interview with Christina Walker, the Vancouver School Board’s Digital Literacy Mentor.

The interview is relatively short but contains several key omissions and inaccuracies that are important to correct and clarify.

Avoids mentioning harms of AI

At the end of the piece the interviewer asks: “Have you noticed a decline in students when they’re using AI, whether it’s those critical thinking skills, whether their grades are going down? Is there any metric for that?”

Christina Walker:
“I wouldn’t say that there’s a metric. There isn’t really a metric just yet, other than, of course, having conversations and of course, as teachers, we are working in the classrooms, we are seeing what’s happening there.”

This is incorrect: there are many different metrics, and there is mounting scientific evidence that using generative AI harms learning.

AI use is correlated with worse learning outcomes (Kosmyna 2025, Lehmann 2025) and poorer critical thinking (Lee 2025), and it leaves users unable to accurately judge their own performance (Fernandes 2025).

Students who use AI perform worse than those who never used it once the AI crutch is taken away (Bastani 2024).

A report from Microsoft itself found that students who used LLMs as a study aid performed worse than students who simply took notes (Kreijkes 2025).

A recent report from the usually AI-positive Brookings Institution was summarized by NPR as “The risks of AI in schools outweigh the benefits”.

The majority of students use it to solve problems for them rather than to learn from it (Anthropic 2025).

The BC Ministry of Education and Child Care’s Considerations for Using AI Tools in K-12 Schools document recommends that school boards “Seek AI tools with a proven track record of effectiveness backed by research or case studies demonstrating that they deepen learning for students.” (page 12)

If the VSB is not aware of any metrics to evaluate the AI tools, what “track record of effectiveness backed by research or case studies” are they using to recommend AI use in schools?

No mention of why AI is being deployed

In the entire interview there is no mention of why the VSB is choosing to deploy AI in classrooms.

Does it help with test scores?

Does it help students to retain information?

If it has benefits, why are those benefits not mentioned?

CBC does not ask this question.

No mention of mental health issues

Despite spending the first part of the interview talking about how many guardrails there are in Microsoft Copilot 13+, Christina Walker does not mention the host of problems associated with continued interaction with chatbots in general.

Children are regularly using chatbots for personal advice (Common Sense Media 2025). Putting a chatbot in the classroom can only increase this reliance on tools for emotional support.

In adults, “chatbot psychosis”, caused by prolonged interaction with chatbots, has affected many people and has prompted a class-action lawsuit.

Calculators, but not smartphones

No discussion of AI in schools would be complete without mentioning the calculator. Everyone thought that the rise of the calculator would mean students would never learn how to add and subtract. But students can still do math, so there was really nothing to worry about.

After bringing up calculators, Christina Walker mentions other technologies that have come out since the calculator: the internet and smartphones.

So if calculators turned out not to be a problem, and we are applying the same logic to AI, surely we would apply the same logic to the internet and smartphones? After all, everyone was worried about calculators, but students adapted.

Curiously, BC currently restricts cellphones in classrooms, with the reasoning that:

By removing the distractions from digital devices, students can focus on their education. This leads to better learning outcomes and helps support their mental health and social connections.

A distraction-causing digital tool that harms mental health and hurts social connections sounds a lot like AI.