The classroom is no longer what it was even five years ago. Students aren’t just bringing smartphones to school – they’re bringing powerful AI tools capable of writing essays, solving complex equations, and creating digital art in seconds.
At Lakehead University in Thunder Bay, education leaders aren’t hiding from this technological sea change. They’re embracing it head-on.
“We can’t pretend this isn’t happening,” says Dr. Wayne Melville, Dean of Education at Lakehead University. “Our responsibility is to prepare teachers who understand not just how to manage these tools, but how to leverage them to create deeper learning experiences.”
Last week, over 200 educators from across Northwestern Ontario gathered for a first-of-its-kind regional workshop on artificial intelligence in education. The day-long session brought together teachers, principals, board administrators and faculty from both Lakehead University and Confederation College.
The training comes at a critical moment. According to recent polling by the Ontario College of Teachers, 78% of educators report that students are using AI tools like ChatGPT in their classrooms, but only 23% feel adequately prepared to address that use pedagogically.
“The technology is moving faster than our policies,” explains Michael Morrow, Director of Education for Lakehead Public Schools. “We need a balanced approach that acknowledges both the potential benefits and the very real concerns about academic integrity.”
During hands-on sessions, teachers experimented with prompt engineering – the craft of communicating effectively with AI systems – and explored approaches for detecting AI-generated content. But the most compelling conversations centered on reimagining assessment entirely.
“If a student can use ChatGPT to write an essay in seconds, maybe we shouldn’t be assigning that type of essay anymore,” notes Emma Vieira, an English department head at Sir Winston Churchill Collegiate and Vocational Institute. “Instead, we’re looking at assessments that require students to critically evaluate AI outputs or use the technology as a collaborative tool rather than a replacement for their own thinking.”
The economic stakes are significant. A recent report from the Information and Communications Technology Council projects that Canada will need to fill approximately 250,000 new jobs in AI-related fields by 2025, lending urgency to helping students develop AI literacy early.
“We’re preparing students for a workforce where AI interaction will be as fundamental as email,” says Dr. Greg Rickford, Ontario’s Minister of Northern Development, who attended the morning session. “Northwestern Ontario can’t afford to fall behind on this front.”
Educators in the region face unique challenges implementing AI training. Internet connectivity remains spotty in many rural and remote communities, creating potential equity gaps. Indigenous educators at the conference also raised important questions about ensuring AI tools respect and incorporate traditional knowledge systems.
“These technologies are built on datasets that often exclude Indigenous perspectives,” explains Rachel Chakasim, Indigenous Education Coordinator at Northern Nishnawbe Education Council. “We’re working to ensure our students can engage critically with AI while maintaining connection to cultural knowledge that algorithms simply don’t capture.”
The conference wasn’t without controversy. Some teachers expressed concerns about the environmental impact of large AI models, which require significant energy resources. Others worried about classroom surveillance implications as some districts explore AI tools for monitoring student engagement and behavior.
“There’s no putting this genie back in the bottle,” says Ron Poling, a computer science teacher with the Thunder Bay Catholic District School Board. “But we need to model ethical, thoughtful use of these tools. Our students are watching how we navigate this.”
Perhaps most revealing was a live demonstration where teachers watched as ChatGPT generated a fairly sophisticated essay on Northern Ontario ecology in under 30 seconds. The output wasn’t perfect – it contained several factual errors about local wildlife – but it was certainly passable by high school standards.
“That’s exactly why we need to refocus on skills AI can’t replicate,” says Melville. “Critical thinking, emotional intelligence, ethical judgment, and creative problem-solving – these remain uniquely human domains where our students need to excel.”
The training marks the beginning of a more coordinated regional approach to AI in education. Lakehead University announced plans to develop a certificate program in AI for practicing teachers, while district school boards committed to establishing shared guidelines for appropriate AI use in assignments.
For many educators, the day offered a shift in perspective from seeing AI as a threat to viewing it as a teaching opportunity.
“I came in worried about catching cheaters,” admits Sarah Kosonic, a Grade 10 science teacher. “I’m leaving thinking about how to teach my students to be thoughtful AI users who understand both the capabilities and limitations of these tools.”
As the training wrapped up, Melville emphasized that navigating AI in education isn’t merely a technical challenge but a deeply human one.
“Technology always changes faster than we can adapt our institutions,” he told the assembled educators. “But one thing remains constant – good teaching has always been about relationships, about seeing potential in students and helping them develop as whole people. AI doesn’t change that fundamental truth.”