After the Bell is a podcast published by the Vancouver School Board. Episodes are typically hosted by Jiana Chow, the Communications Manager at VSB, and cover a whole host of topics.
Episode 23 was originally published on October 29, 2024, and it features a conversation between:
- Jiana Chow, Communications Manager, Vancouver School Board
- Jeff Spence, District Principal, Learning and Information Technology
- Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
The episode is available in many places, including YouTube.
What Jeff Spence and Zhi Su discuss in this episode is useful for understanding the Vancouver School Board's aims when it comes to AI: what their goals are, what they are prioritizing, and how they see the technology.
We are posting this transcript so parents can more easily refer to what was said, with some of the more interesting lines highlighted.
Episode Transcript
Jiana Chow:
Welcome to VSB After the Bell. I’m your host, Jiana Chow. With deep gratitude and respect, we are honoured to be learning and unlearning on the ancestral and unceded lands of the Musqueam, Squamish, and Tsleil-Waututh Nations.
The Vancouver School Board is among the most diverse school systems in Canada with an annual enrollment of approximately 50,000 students from kindergarten to grade 12 and adult education students. For some, advancements in technology can be filled with fear of the unknown. For others, the rise of technology, such as artificial intelligence or AI, is an exciting opportunity to enhance student learning and provide solutions to pre-existing barriers.
However you may feel about AI learning, the reality is that it is here, and it is something we must all learn to adapt to in order to stay up to date with technology. Here to discuss all things AI is Jeff Spence, District Principal, Learning and Information Technology, as well as Zhi Su, District Principal for Vancouver Learning Network and Summer Learning. Thank you both for joining me today.
First off, Jeff, can we start with you? A brief introduction for our listeners.
Jeff Spence:
For sure. My name is Jeff Spence.
I’ve been teaching in the district since 1992. I taught at a variety of high schools, then moved into administration, and I kind of see myself as bridging the gap between what is happening in technology at the schools and the IT department, and I provide this translation service in both directions.
Zhi Su:
Hi, I’m Zhi Su, District Principal of the Vancouver Learning Network and Summer Learning, and I joined the Vancouver School Board in 2001 and taught at various schools, started the digital immersion program at John Oliver, and then moved out of district to become administrator, and then now I’m back in district, and my primary role is to support online learning for the district as well as summer learning over the summer months.
Jiana Chow:
When ChatGPT was first launched, there were a lot of questions around academic integrity and students using it to cheat on exams, or to plagiarize by having it write essays and passing them off as their own work. It’s been two years. What’s the feedback been like?
Zhi Su:
What we’re trying to do is we’re trying to revamp the curriculum to make it so that we’re either including AI or we’re telling students not to use AI or we’re asking students to use AI but then think more about the process and what is it teaching you about your writing, for instance, and how can you improve it? And so I think there is a changing landscape in terms of education and academic integrity.
I think teachers need to recognize that AI use is out there and try to figure out ways to engage students on a more personal level so that the content that’s created by AI is not so generic that it still needs to be personalized by the students.
Jiana Chow:
So how is AI currently being used in schools now?
Jeff Spence:
From a district perspective, we have contracts with Microsoft. So we have Microsoft Copilot AI, which is currently used with all of the offices and with teachers and employees, not with students yet.
But we do know in schools that kids are using AI. They were on right away, on ChatGPT within minutes of it launching. And so we do know that kids use it.
But at this point, I wouldn’t say it’s widespread use requested by teachers. Does that make sense? Teachers are not saying, “hey, you don’t have Copilot, go on to ChatGPT”. They’re not asking students to go on because we’ve warned them about some of the dangers.
Jiana Chow:
Are teachers using it more often then?
Jeff Spence:
We have the whole gamut of people who engage or don’t.
Jiana Chow:
Those who are using it, how are they using it in the classroom?
Zhi Su:
Talking to a couple of teachers this morning, and they’re talking about how they use it for research and as a tool to generate content or curriculum or adapt curriculum they currently use, as well as to generate assessment rubrics and help inform some of the practice that they’re currently doing, but also being able to adapt it. And with the AI, the large language models, teachers can scale up or down the grade level of the content quite easily.
And so, for instance, if a student is struggling to understand and there may be a language barrier or the language may not be student-friendly language, the AI can translate that information for the students so it’s more easily palatable for the students or at their level. So in that sense, it’s a tool that a teacher has in their toolkit.
Jiana Chow:
That’s really interesting. Okay, so they can just use something that they would use for the whole classroom and if they find a student that needs more advanced levels, then they plop it in and ask the AI tool to make it harder or the next grade up?
Zhi Su:
Yes, absolutely. So, for instance, case in point, we have a large number of students that have a variety of range in their math skills. And so, for some teachers, some of the math may be beyond their scope.
for some teachers, some of the math may be beyond their scope… the teacher can take a higher level math and share that with the students for them to use.
Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
And so, for instance, a grade seven teacher who has a number of students that are excelling in math who need more personalized information or personalized learning, the teacher can take a higher level math and share that with the students for them to use. And in that sense, it saves the teacher time and it’s more personalized for a student. But with that said, AI does generate misinformation so the teacher still needs to vet the content before sharing it with the students.
Jiana Chow:
You’re the district principal for the Vancouver Learning Network, which is essentially our online platform for students who want to learn at their own pace.
Zhi Su:
Correct, yes.
Jiana Chow:
So do you see teachers or students using it more because they’re in VLN?
Zhi Su:
Yeah, I think so, partly because it’s on their platform.
And so, they’re using computers, the content is consumed and interacted with online. And so, because of the portability of that technology, you can easily copy and paste, transfer it to AI, have it help translate, for instance. But teachers can also use the students’ work and it can accelerate their grading process.
[AI] can accelerate [teachers’] grading process
Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
Jiana Chow:
So is there a concern with privacy if you’re uploading students’ work?
Jeff Spence:
Oh, yes, absolutely. And the reason is, student work, like consider a student in a Grade 9 class perhaps writing an essay and it may contain personal anecdotes. As soon as it’s uploaded, this information is now given to the AI and the AI can use it for training. And it’s compared to other students who write essays and it’s like, “oh, that’s the same”. It could be used for comparison for cheating where it could be looking for similar stories. But the main thing is the story shouldn’t even be there.
The teacher shouldn’t be uploading it to an AI to be used for future training.
Jiana Chow:
And how are you managing that?
Jeff Spence:
Well, first step is moving to Copilot. And so, most people’s experience with AI, I would guess, is probably with ChatGPT because it was the first one.
And Copilot, in fact, is built on top of ChatGPT. Let me explain it this way. Jiana, if you asked ChatGPT to do something for you, it knows who you are because you had to create an account to use the tool, and then it takes whatever you asked it and it takes the response and it remembers all of that information.
You can review what you asked it months ago even. And so, it’s remembering, it’s taking that and it’s training future models with that data. The difference with Copilot is, Copilot, think of it as an interruption line in between.
So, you now ask Copilot something and then Copilot asks ChatGPT. So, what does ChatGPT see? It sees, “oh, Copilot is asking me something”. But millions of people around the world are using Copilot and so information is coming in.
Now, not only that, Copilot has a number of privacy rules in place.
Jiana Chow:
Compared to ChatGPT.
Jeff Spence:
Yes, because ChatGPT is the source, but Copilot is like a filter layer.
Zhi Su:
Copilot, because we signed in, it’s a walled garden and the information is not stored, it gets deleted. And so, when we create a new chat, it doesn’t understand the context that you’re coming from.
So, oftentimes, in order to create good information, you need to create a good prompt. Part of that is creating the context, the situation. Whereas if you go into ChatGPT, you can look in your account and you can look at your history and it actually knows all this information about you because it’s learning and it’s learning about you.
In order for AI to sustain its attention, it has to generate all this power to focus on you and it’s not optimized that way. And so, when you create a new chat, it’s paying attention to you and you’re communicating back and forth, just like we’re communicating back and forth. But if we stopped communicating, our attention shifts elsewhere.
Jiana Chow:
But then when you go back online, it knows your history from before. No?
Jeff Spence:
It can pick up the thread and continue, but this is where we get into AI hallucination, where it keeps going along, along, along a thread in a conversation and it starts to get distracted and it starts to make things up.
Jiana Chow:
Oh, really? What’s AI hallucination?
Jeff Spence:
Let’s try this example.
So, let’s say a student is using ChatGPT to create a resume and they say, hey, I’m a student, I’m a grade 10 student, I’ve never had a job before, but I did volunteer and I babysat a little bit and I don’t know, something in the neighbourhood, washing cars, and it makes a resume. And it says, okay, I need to also include a cover letter. It’s like, okay, makes a cover letter.
Oh, I don’t really like that cover letter because I also did some of these other things. It doesn’t forget that it was making a resume, but it starts to now go down the path of some of the items that are in there. It’s like, hmm, this is not quite enough.
I’m just going to add some things in here to make you look better because that’s what other people do with their resumes and cover letters because it has the worldwide pool of cover letters. So, all of a sudden, it puts something in and you realize, wait a minute, I’ve never done that. Where did that come from? And so you say to it, “Hey, that’s not true. I was never an alligator wrestler.” And it will say, “Oh, I’m sorry about that.”
Zhi Su:
“Yes, you’re right.” Very agreeable. “Yes, you’re right. I’m sorry.”
Jeff Spence:
“Sorry, let me take that out.” And then it puts something else in. And so students have encountered this hallucination when they’re getting AI to write essays for them. And it may put in, you say, also include references down at the bottom. And there is a reference that is complete fabrication. It puts in a link. You click on the link. There’s like no link. There’s nothing there. And you say, hey, that’s not really a link. It’s like, “Oh, I’m sorry. Let me put in another. Oh, that’s better. Here’s another one.” Yeah, completely fake again.
There is no Professor Su at whatever university. You have to check.
… you say, “also include references down at the bottom.” And there is a reference that is complete fabrication. It puts in a link. You click on the link. There’s like no link. There’s nothing there. And you say, “Hey, that’s not really a link.”
It’s like, “Oh, I’m sorry. Let me put in another. Oh, that’s better. Here’s another one.”
Yeah, completely fake again.
Jeff Spence, District Principal, Learning and Information Technology
Jiana Chow:
So you really have to pay attention.
Jeff Spence:
You have to check.
Jiana Chow:
Yeah. How are these companies, these big tech companies, working with education? Are they even keeping educators in mind as it progresses?
Jeff Spence:
We attend and present at Microsoft-related events. There’s one coming up next month, IT for K-12. And so I’m presenting at that.
Zhi Su:
We do participate in Coast Metro, a meeting for strategic AI planning, along with the ministry, to talk about policy and guidelines for our districts. And that’s exciting. We’re on the forefront and we recognize that we do need to have policies, guidelines, and guardrails in place so that we can implement this intentionally, keeping privacy and student information safe, but also thinking ahead and mapping out what we would like to do in the future.
And we do have a variety of initiatives on the go.
Jiana Chow:
Can you speak more to those initiatives?
Zhi Su:
Yeah. So one of the things that we thought was valuable was something that came out of Seneca College: they created an AI tutor chat-bot for their students. And so that’s something that we’re working with Microsoft on building. Just imagine that we can provide a tutoring service essentially for all our students in the district.
we’re working with Microsoft on building [an AI tutor chat-bot] so that … we can provide a tutoring service essentially for all our students in the district
Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
Jiana Chow:
Is this going to replace teachers?
Jeff Spence:
No, absolutely not. No, because teaching is about relationships and we’re just talking about content right now.
Zhi Su:
Yeah. I do see a separation in like the skills and then the content. And so in the past, there was heavy reliance on the teacher to know everything and see everything, and now I don’t think that’s the case.
I think that we always need teachers. We need people that are able to build relationships with students and teach them the skills, but the content, the content exists online. I think the big thing is that students still need to maintain their curiosity and they need to be able to critically think.
We need people that are able to build relationships with students and teach them the skills, but the content, the content exists online.
Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
And when they access content online, they need to know whether or not this content is credible: is it reliable, or is it misinformation? And that’s part of the critical thinking piece. And so the teachers are essentially the guide. We talk about the facilitator, and the teacher is facilitating a group of students, and the students are all over the place.
the technology [in contrast to teachers] can probably more readily see where the students are progressing
Zhi Su, District Principal for Vancouver Learning Network and Summer Learning
And the technology, where the technology comes in, is the technology can probably more readily see where the students are progressing, where they are along this continuum. And then the teacher being the facilitator can see where the students are and then better engage them. And so there’s a lot of heavy lifting that the technology can do that enhances what the teacher is doing.
So we talk about this as being the age of augmentation and acceleration: make things better, more efficient, faster.
Jiana Chow:
So speaking of policy, what do we have in place at VSB to ensure that the safety of students and teachers is protected?
Zhi Su:
We do have administrative procedures, private information guidelines, and acceptable use agreements, which inform the practice of staff and students. While we do have those in place, we recognize that a refresh is in order to add an addendum recognizing AI. But really, AI is not new; rather, it is a new tool that has surfaced within our digital literacy umbrella.
And I think Jeff can speak to this a little bit more.
Jeff Spence:
One of the roles that is part of my job at the VSB is reviewing some of the admin procedures. An example of that would be, historically, I recall reviewing the use of Google in the classroom, and then more recently the use of social media in the classroom: are our teachers allowed to create accounts for their classroom, or are our students using their own accounts?
Another example is like TikTok in the classroom. Are we using it or are we not using it? So right now we’re reviewing the admin procedures for AI and how does that embed in?
And also keeping it flexible enough that there’s going to be future things that we aren’t even talking about yet. So it really is about appropriate use of technology, software, and hardware in the classroom for learning.
Jiana Chow:
So we’re monitoring how the technology is changing and how users are changing their behaviors as well and then updating our procedures, our policies.
Jeff Spence:
And we look to see what’s appropriate. Like what is appropriate classroom use of these technologies?
Jiana Chow:
So what roles can families play in supporting their child’s interaction with AI?
Zhi Su:
We view parents as partners in education and when it comes to the question of what parents can do or need to know, I think parents need to stay informed and in the loop of what’s going on with artificial intelligence in general.
And some parents are, through their own line of work or business. Parents also need to be aware that their children might be using artificial intelligence to satisfy their own curiosities, and of what types of information, especially private information, are going where. Parents need to know that potentially their child is sharing information that shouldn’t be shared.
Will this impact their future or come back to harm them later? That’s something that we may or may not have considered with social media. What types of data, information, or images are they sharing?
Looking into the minimum age of account creation, AI tools are saying that children between the ages of 13 and 18 require parental consent. Do you recall giving your child permission to use AI?
Another important thing that parents need to do is talk to their child about academic integrity and respecting the rules or instructions of the teacher and helping their child with critical thinking skills and identifying what is valid or credible information as opposed to fake, misinformed, or biased content, as well as navigating ethical considerations around use, academic integrity, copyrights, or intellectual property.
I noticed that the post-secondary institutions are also cautiously embracing AI for research. One of their criteria for AI use for academics is, are you willing to accept the risks of use or false information? I think that question applies to everyone who is using AI regardless of their role or occupation. The future of work is beginning to leverage and integrate AI, and jobs will shift.
Parents will need to help guide their children in choosing future occupations.
Jeff Spence:
I like a lot of what you’re saying, Zhi. Zhi and I are fortunate. Both of us have kids that are in the system. And so we talk to our kids, and unfortunately, the gateway is always around cheating. But really, what is cheating? Years ago, we would say, “You can’t use Google, that’s cheating.”
But now, everybody just uses it. In terms of AI, maybe we need to shift the conversation away from it being cheating and into the realm of, hey, how can we use this for good? What’s the amount that you’re allowed to use it? Is it okay to use an AI to help you get some ideas, and then later on you do all of your own work? Is that considered cheating?
In terms of AI, maybe we need to shift the conversation away from it being cheating and into the realm of, hey, how can we use this for good? What’s the amount that you’re allowed to use it?
Jeff Spence, District Principal, Learning and Information Technology
One of the things I would like to stress is, anytime you’re putting information into the AI, the AI generally is keeping that information. If you write a song and upload it, if you write an essay and upload it, if you create art and upload it, now the AI is using it to train its models. Effectively, you’ve given away your creative work. I think that’s a discussion I really would like parents to have with kids. What are you using it for? Is it helping you, or are you potentially giving away things that in the future you’ll wish you hadn’t given away?
Jiana Chow:
Thank you for sharing all the insights and how it’s impacting teachers and how they teach and students and how they learn.
To our listeners, thank you so much for being part of this conversation. We hope you found this information helpful as we continue to navigate the changing world of AI learning.
[End of transcript]