AI “Black Box” placed in more hospital operating rooms to improve safety

Key Takeaways:

– AI-powered surveillance technology is being used in hospital operating rooms to collect audio, video, patient vital signs, and other surgical data.
– The technology, called the OR Black Box, is sold by Surgical Safety Technologies Inc. and is implemented in over two dozen hospitals in the US and Canada.
– The OR Black Box uses wide-angled cameras and AI models to analyze data and provide insights on protocol compliance, efficiency, safety audits, quality controls, and education.
– The technology anonymizes people in the operating rooms by blurring faces and “cartoonifying” bodies.
– The data collected by the OR Black Box is not intended to assign blame to staff but to prevent mishaps from happening.
– Some hospitals have used the technology to identify areas where protocol is not being followed and have implemented retraining sessions to address the issues.
– There are concerns among hospital staff about the use of AI surveillance and whether errors caught on camera will be used for disciplinary purposes.
– The protection of staff identities and the potential use of data in malpractice cases are also unanswered questions.
– Early adopters of the technology have focused on the positive changes it has brought, such as improving efficiency and teamwork in operating rooms.

Ars Technica:


AI-powered surveillance technology is quickly making its way into hospital operating rooms around the country, where it works to constantly collect audio, video, patient vital signs, and a wealth of other surgical data, all in the name of improving safety and efficiency.

The surveillance technology has been installed in operating rooms in over two dozen hospitals in the US and Canada so far. Most recently, the Boston area’s Brigham and Women’s Faulkner Hospital became one of the latest adopters of the technology, which is sold by Surgical Safety Technologies Inc. in Toronto.

The AI-powered platform is called the OR Black Box, named after the recording devices used in aircraft to help understand events that led to a disaster or other incident. But the name is a bit of a misnomer. The technology is not a literal black box; it’s a set of wide-angled cameras and proprietary, customized AI models. It’s also not necessarily intended to sort out the events that led to a surgical disaster or incident after the fact. Rather, it’s intended to help prevent any mishaps from happening, and its makers and adopters have had to repeatedly assure medical staff that the all-seeing AI won’t be used to pick out individual errors and assign blame to staff.

Instead, the OR Black Box is said to anonymize people in the operating rooms; the models blur faces and “cartoonify” bodies. The software aggregates and analyzes the data and reports back to the hospitals with insights on protocol compliance, efficiency, safety audits, quality controls, and key video and audio clips for review, annotation, and education. After 30 days, all recordings are erased.

For example, the Boston Globe noted that Duke Health in North Carolina started using an OR Black Box four years ago and realized after looking at the data that its surgical teams weren’t carefully following the protocol for preparing patients’ skin for incisions. “It’s the simple things that we thought we were doing well,” Christopher Mantyh, professor of surgery and vice chair of clinical operations at the Duke University School of Medicine, told the Globe. “We looked at it [and said], ‘Yeah, we’re not doing this right.’” According to Mantyh, the hospital bought the OR staff lunch and held a retraining session.

Pros and cons

But unease about the AI surveillance seems to follow the Black Box to each new hospital. In Boston, Janet Donovan, an operating room nurse at Faulkner and a union officer of the Massachusetts Nurses Association, told the Globe that hospital staff have not warmed to the idea. They are particularly concerned about whether errors caught on the AI’s cameras will be used to discipline nurses, and they are not convinced that staff identities will be protected. Donovan notes that, so far, the Black Box is used in only two operating rooms, with the same staff working in both, which would likely make it easy to figure out who was who and who did what from the data.

There is also the lingering question of how the data could come into play in the event of a malpractice case. “Living in the age of technology, we all know that nothing that is recorded ever truly goes away,” Donovan wrote in a letter to the hospital’s chief of surgery.

Indeed, how the data could be used in legal cases remains unclear, and malpractice lawyers will almost certainly try to obtain it if it becomes relevant to a case. Richard Epstein, a professor of law at New York University, noted to the Wall Street Journal last year that it’s possible lawyers could be successful in getting the data. “In the medical world, there is a great deal of strong judicial and medical oversight, and ultimately it won’t be up to any institution to determine or limit the purposes for which the information is used,” he said. But, for now, “nobody knows what will happen. … Legal protections are not clear-cut and are uncertain until tested by litigation and/or legislation.”

For now, early adopters are focusing on the positives of the technology. At Mayo Clinic, which installed Black Boxes in late 2021, hospital administrators used the technology to figure out that key surgical equipment was not optimally arranged in some of its operating rooms. Rearranging it improved efficiency.

“There have been a lot of positive changes around teamwork and team function and how we respond to things,” Sean Cleary, MD, a surgical oncologist at Mayo Clinic, told Becker’s Hospital Review. “When we have something happen in the operating room and the team’s able to respond to a change in plan and done so smoothly and efficiently, we can now characterize that.”

