When a patient walked into an Edmonton emergency room last month, they weren’t just greeted by triage nurses and intake forms. They were quietly scanned by Canada’s first hospital-based artificial intelligence weapons detection system—a technology that might soon become as commonplace in healthcare settings as hand sanitizer dispensers.
The University of Alberta Hospital implemented the AI-powered scanning system in February, marking a troubling evolution in how healthcare facilities are responding to rising violence against medical staff. According to Alberta Health Services (AHS), the technology has already intercepted more than 100 prohibited items, including knives and other potential weapons.
“We’re seeing more verbal and physical threats against our healthcare workers,” says Darryl Ness, executive director for protective services at AHS. “This technology creates another layer of security without slowing down critical care.”
The system, developed by U.S. security firm Evolv Technology, uses advanced sensors and machine learning algorithms to detect concealed weapons without requiring people to empty their pockets or walk through traditional metal detectors. Patients simply walk past unassuming white pillars stationed at entrances, while security staff monitor alerts on tablets.
This installation reflects a disturbing reality: emergency departments have become increasingly dangerous workplaces. A 2022 survey by the Canadian Association of Emergency Physicians found that 68% of emergency doctors reported experiencing verbal harassment weekly, while 16% faced physical threats at least monthly.
The Edmonton deployment follows similar implementations at Toronto’s Hospital for Sick Children and Vancouver General Hospital last year. Hospital administrators cite a troubling post-pandemic surge in healthcare violence as justification for the increased surveillance.
“We’re walking a tightrope between creating safety and maintaining the welcoming environment essential for healthcare,” explains Dr. Samantha Rivera, an emergency medicine specialist who has advocated for better protection measures. “Nobody wants hospitals feeling like courthouses, but staff deserve to work without fear.”
Privacy experts, however, raise concerns about this technological solution. Sharon Polsky of the Privacy and Access Council of Canada questions whether patients were adequately informed about being scanned. “Healthcare facilities are places of vulnerability where people expect confidentiality,” Polsky notes. “These systems collect data, and patients deserve transparency about what happens with that information.”
AHS maintains that the system doesn’t store personal identifiers or images of patients, though the accuracy of such claims remains difficult for outside observers to verify. This represents the classic technology implementation challenge: deployment often outpaces regulatory frameworks and public discussion.
The weapons detection system brings other complications. Traditional security measures like metal detectors and bag searches are visible and easily understood. This new approach, designed to be minimally intrusive, means many patients may not realize they're being scanned at all.
The technology isn’t perfect either. Early deployments in American hospitals reported false positives triggered by medical devices, prosthetics, and even certain types of jewelry. AHS declined to share the system’s current false positive rate at the Edmonton site.
For Indigenous communities and other marginalized groups who have historically experienced discrimination in healthcare settings, increased security measures raise concerns about potential profiling. When alerted to a possible weapon, how do security personnel decide which alerts warrant intervention?
“These systems don’t eliminate human bias,” cautions Rebecca Thompson, a health equity researcher at MacEwan University. “They can actually amplify existing prejudices depending on how alerts are handled.”
The financial investment is substantial. While AHS hasn't disclosed costs, similar deployments in U.S. hospitals range from $75,000 to $125,000 per entrance point, plus annual licensing fees. This comes as many hospitals struggle with staffing shortages and resource constraints.
What makes this situation particularly complex is that healthcare violence is indisputably increasing. Emergency department workers describe being punched, spat on, and threatened regularly. A nurse at Royal Alexandra Hospital, speaking anonymously, described removing her identification badge when leaving work to prevent being recognized by aggressive patients.
“Something needs to change,” she says. “But is more technology the answer, or do we need more human-centered approaches?”
Alternative approaches exist. Some hospitals have implemented de-escalation training, created specialized care units for patients in crisis, and increased mental health resources. Critics of the technological approach argue these human-centered solutions address root causes rather than symptoms.
The Edmonton deployment represents a broader shift in how institutions respond to societal challenges—increasingly turning to algorithmic solutions for human problems. As these systems become normalized in healthcare settings, the question becomes less about whether we can implement such technology and more about whether we should.
For now, as patients enter the University of Alberta Hospital emergency department, they’re participating—knowingly or not—in a significant experiment balancing safety, privacy, and access in modern healthcare. The outcomes will likely influence hospital security nationwide.
What remains certain is that the pressures making hospitals less safe won’t be solved by scanning systems alone. The technology may intercept weapons, but addressing why people feel compelled to bring them into places of healing requires solutions no algorithm can provide.