Jamie Powell has been a nurse long enough to remember when patient evaluations were entirely analog.
“We would have to handwrite our assessment into a paragraph in the chart, in the progress notes,” says Powell.
She’s seen the technology nurses use to keep track of their patients evolve drastically over the course of her 31-year nursing career, from those written paragraphs to a tri-folded piece of paper for each patient refreshed daily to, eventually, the electronic records systems ubiquitous today.
Now as a nurse in Mercy Hospital St. Louis’ orthopedic unit, which she says also serves as the hospital’s innovation unit, Powell is on the frontier of what’s likely to come next. Since February, nurses in Powell’s unit have been piloting and providing feedback on a new artificial intelligence-powered system that can capture a nurse’s patient assessment and automatically populate the electronic record.
It’s part of a collaboration Mercy and a handful of other health care systems across the country have forged with Microsoft, as the tech giant continues to add more capabilities to its Dragon Copilot. The company had already found success with the ambient AI’s ability to accurately draft clinical notes for physicians based on a recorded conversation between them and a patient. It then sought to do the same for nursing workflows, says Umesh Rustogi, general manager of Dragon for nursing at Microsoft Health and Life Sciences.
“About two thirds of the nurses report a very high level of stress and burnout, and what contributes to that is the high level of documentation and the administrative load,” he says. “Just imagine you’re doing tens, if not hundreds, of rows of documentation to manually capture this data. This is what we have basically now tried to change.”
Microsoft’s Dragon Copilot functions like a second set of ears while a nurse performs a patient assessment. On their mobile device, a nurse will select the patient they’re assessing and, with consent from the patient, turn on the ambient AI and narrate their evaluation. The AI captures a transcript of the interaction and then inserts key assessment information into the proper parts of the patient’s medical chart, also referred to as a flow sheet.
“[It] creates efficiencies for the nurses, transforms the nursing workflow, reduces that cognitive burden, and essentially gives time back to nurses so that they could take care of the patient,” Rustogi says.
It’s an opportunity to respond to the current shortage of just over 400,000 licensed practical nurses and registered nurses, which is projected to grow to nearly 500,000 by 2030 per the Health Resources and Services Administration. The innovation seeks to cut down on the vast amount of time—between 25 and 40 percent of a shift, Rustogi estimates—that nurses spend on administrative tasks, what Rustogi says is “time taken away from actually caring for the patient.”
It’s also a chance to more accurately capture the totality of an interaction between nurse and patient, instead of relying on memory like many nurses do, says Stephanie Clements, senior vice president and chief nursing executive at Mercy.
Working off memory, she says, nurses “run the risk of not capturing all of the details of the interaction. The more accurately we’re able to document those interactions, the more specific and detailed and personalized the care becomes.”
Powell admits the transition to narrating her observations so the ambient AI could pick up what she was doing was a bit awkward at first. She had gotten used to carrying on a natural conversation with patients while checking what she needed to and remembering those details after the fact.
“I’d be looking at their dressing. I’d be looking at their IV. I’d be looking at how they’re breathing. I would assess their orientation, their mental status, just having a conversation,” Powell says.
This conversational element is one Rustogi acknowledges was a challenge in developing Dragon Copilot for nursing applications, and a key reason why Microsoft sought Mercy and other health systems as “co-development” partners. He says the partnership flowed both ways, enabling Microsoft to produce a better AI product while the tech company could help health systems manage big changes to their nurses’ workflows.
To that end, Rustogi says the development of Dragon Copilot for nursing was not simply a matter of “repurposing the existing solution” built for physicians, but akin to developing something from scratch.
“It’s a very fast-paced environment. Nurses are always on the move, on the go,” he says. “Also, nurses tend to have a very heavy, structured documentation workflow as compared to physicians.”
The process wasn’t always smooth, especially when they first started using the ambient AI, says Powell.
“It would pick up weird things the patient would say in the conversation, or something that was said on a TV [in the background] or [by] a family member, and it would [put that] on the flow sheet,” she says.
But Powell credits Microsoft with taking comments from her and other nurses seriously and incorporating their feedback. Now, she says Dragon Copilot is much more accurate when it populates a flow sheet from one of her patient interactions.
Her fellow nurse Paige Walton says it’s helped save time. “If you’re in the room and charting while you’re in the room, with it running, it makes it quicker, easier,” she says. “You make sure your assessment’s in and charted and not having to wait until later in the day to chart it.”
Walton admits she sometimes forgets to start the recording from her phone, and would love a voice-activated option akin to saying “Hey Siri” or “OK, Google.” Rustogi says this is common feedback and something Microsoft is exploring a few ways to implement, such as through smart badges or hospital rooms equipped with microphones that are activated by a key word.
This obstacle speaks to the challenge of managing sensitive medical information, which Rustogi and Clements both insist remains protected. Rustogi says Microsoft will use some of the captured data to improve accuracy, safety, and future features, but only as anonymized data, not anything that could identify an individual patient.
Ideas for improvement aside, Powell says she appreciates how the ambient AI captures more of the care that nurses perform on a daily basis. It records the small but important details that can be overlooked when relying on memory, like icing or elevating a limb, working with pain management, and ensuring that alarms are on for safety or that a call light is within a patient’s reach, she explains.
“I’ll be saying those things out loud,” Powell says. “We’re getting credit for the good care that we’re actually providing the patient and picking up on things that we don’t even think about putting in the chart.”
Clements, Mercy’s chief nurse executive, adds she’s heard this sentiment from other caregivers too, who are excited that conversations about things like the social determinants of health, health-related social needs, and other parts of a patient interaction are captured.
“I really see it as our ability to care for the whole person—mind, body, spirit—in a way that we might not have been able to capture without the utilization of technology and AI tools,” she says.
Using Dragon Copilot has also driven key efficiency gains for Mercy. The healthcare system reports the tool led to a 29 percent reduction in incremental overtime and a nearly five percent bump in patient satisfaction. Clements expects full implementation of Dragon Copilot for nursing across Mercy’s acute care facilities given how well it’s doing in surgical nursing units right now.
“We see it as an augmentation of the care we’re providing, not replacing the need for that human interaction,” she says.
And Microsoft is confident enough in the technology that it started rolling out the product more widely earlier this month.
Patients have largely embraced Mercy’s use of Dragon Copilot too, Clements adds. She says she can’t think of any who refused to let a nurse use the tool during an assessment. Instead, they appear happy to be part of the development of a cutting-edge medical tool.
Margie Stratman, whose husband Leon Joseph Stratman was recently at Mercy Hospital St. Louis recovering from a broken hip, says she appreciates that the AI tool keeps her more informed, since nurses now narrate the care they’re providing to her and her husband.
“A lot of times I had no idea what was going on,” she says. “[Now] I understand what they’re saying. Otherwise they keep it all to themselves, and now we know too what’s going on.”
Details that help a patient understand what a nurse is assessing and why can go a long way toward fostering a sense of safety in a setting that’s inherently unnerving for many people, Powell says. That transparency lets nurses lean into the side of the job that technology can never replicate, while shedding some of the mental burden from the parts that can be automated.
“You’re never going to take the human aspect out of nursing. People are scared and they need comfort,” Powell says. “A computer can’t take that away.”