Mixing Realities for More Realistic Trauma Training

One of the first steps in trauma, wound and combat casualty care is to stop the bleeding. “There’s a right amount of pressure. Too much is not a good thing. Not enough is not a good thing,” says Gregory Welch, AdventHealth Endowed Chair in Healthcare Simulation at UCF’s College of Nursing.

Currently, there is no way to tell whether the pressure a trainee applies is correct. A new patented innovation seeks to solve this and other limitations of current training tools.

The multi-sensory wound simulation was developed by Welch and a team of experts from UCF’s College of Nursing, Department of Computer Science, and Institute for Simulation and Training. The innovation merges the physical and virtual worlds to enhance the quality and realism of healthcare training.

Healthcare professionals, combat medics, and first responders may currently receive training in the physical and virtual worlds separately. The physical simulations involve a fake wound and fake blood on a mannequin or standardized patient (an actor). Created with moulage or special effects makeup, these simulated wounds are static, lack responsiveness to treatment, and take time to set up.

There are also computer-based and virtual reality simulations, which lack physical touch for hands-on training. “VR is amazing, but the default is everything is virtual,” Welch says. “In healthcare, hands-on training — being able to touch a patient or grab a tool — really matters.”

UCF’s innovation combines a hands-on, tactile experience in the real world with a dynamic wound in the digital world. The result, according to the patent, is a “powerful combination that has the potential to revolutionize casualty care training for both the military and civilians.”

Realistic Training

It works like this. A user wears a head-mounted augmented reality (AR) system that presents a scenario. The “patient” is physically present in front of the user, either a simulation mannequin or a live participant, and the virtual wound appears seamlessly overlaid on that patient, so that the user perceives only a wounded patient.

On the physical “patient” is UCF’s patented smart moulage, which tells the AR system to fill in the details of the scenario. With it, a user may see blood on their hands until the appropriate amount of pressure is applied; a sensor in the moulage gauges that pressure.

“If you bend it, for example, the smart moulage knows it’s flexing and the digital imagery will match what you’re feeling,” Welch says. “You’re feeling it, and it’s feeling you.”
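How that sensing might drive the imagery is easiest to see in code. The sketch below is a minimal illustration only, not UCF’s implementation: the pressure thresholds, units, and function names are hypothetical stand-ins for the “right amount” of pressure Welch describes.

```python
# Minimal sketch of pressure-driven feedback. All names and thresholds are
# hypothetical illustrations; the patent's actual sensing details are not public.

PRESSURE_LOW_KPA = 20.0   # hypothetical lower bound of the "right amount"
PRESSURE_HIGH_KPA = 45.0  # hypothetical upper bound (too much pressure)

def bleed_rate(applied_kpa: float) -> float:
    """Map sensed pressure to a 0..1 virtual bleed rate for the AR display."""
    if applied_kpa < PRESSURE_LOW_KPA:
        # Too little pressure: bleeding continues, proportional to the shortfall.
        return 1.0 - applied_kpa / PRESSURE_LOW_KPA
    return 0.0  # enough pressure: the virtual bleeding stops

def too_much_pressure(applied_kpa: float) -> bool:
    """Flag excessive pressure so the scenario can coach the trainee to ease off."""
    return applied_kpa > PRESSURE_HIGH_KPA
```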

Users may also feel a pulse and liquid “blood” thanks to an actuator inside the wound that emits water from a reservoir. The latter idea came from co-inventor and global healthcare simulation expert Mindi Anderson.

“Liquid is used already in simulation. For example, when using an IV task trainer, a student may ‘see’ simulated blood when starting an IV,” Anderson says. “This is different because it is done virtually. Water may look like simulated blood, and when the participant puts pressure on the wound to stop the bleeding, they can see the results of their intervention.”

For the user, those results would appear as a slowing rate of blood loss and, potentially, stabilizing vital signs.
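The felt pulse and the emitted liquid could be tied to the same bleed rate. Continuing the hypothetical sketch above, a pulsatile pump duty cycle might look like the following; the heartbeat shape and rate are again illustrative assumptions, not details from the patent.

```python
import math

def pump_duty(t_seconds: float, bleed: float, heart_rate_bpm: float = 100.0) -> float:
    """Duty cycle (0..1) for a small pump: a heartbeat-shaped pulse scaled by
    the current virtual bleed rate. As the trainee's pressure drives the bleed
    rate toward zero, the felt pulse and the emitted "blood" fade together,
    matching the imagery in the headset."""
    phase = (t_seconds * heart_rate_bpm / 60.0) % 1.0  # 0..1 within each beat
    pulse = max(0.0, math.sin(math.pi * phase)) ** 3   # sharp systolic peak
    return bleed * pulse
```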

Clever Coding

Another unique aspect of UCF’s smart moulage innovation is how it connects to the AR system.

Normally, head-mounted displays track the wearer’s location within a room of static objects and overlay digital content onto the wearer’s view of the real world.

“The issue is that the wound location could change, depending on the scenario, and the virtual wound needs to stay stuck, so to speak, to the physical piece to be as realistic as possible. This could be challenging and complex to code,” Welch says.

Instead, Welch and the team developed a way for the wound to identify itself to the AR system, almost like a QR code embedded in the smart moulage. The AR system scans this code and locks onto it, continually tracking the wound and rendering the effects of the user’s interventions in the scenario in real time.
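The patent describes this self-identification only at a high level. One common way to achieve the same behavior is a printed fiducial marker tracked by the headset’s camera, as in the sketch below using OpenCV’s ArUco module; the marker dictionary, marker size, and camera intrinsics here are hypothetical stand-ins, and a real headset would supply calibrated values.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; a real AR headset supplies calibrated values.
camera_matrix = np.array([[800.0, 0.0, 640.0],
                          [0.0, 800.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIZE_M = 0.02  # hypothetical 2 cm marker embedded in the moulage

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

def locate_wound(frame):
    """Return the marker's pose (rvec, tvec) relative to the camera, or None.

    A renderer can anchor the virtual wound to this pose every frame, so the
    imagery stays "stuck" to the physical moulage as it moves."""
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is None:
        return None
    half = MARKER_SIZE_M / 2.0
    # Marker corners in the marker's own coordinate frame (TL, TR, BR, BL).
    object_pts = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(object_pts, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```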

Future Potential

Welch envisions the smart moulage as a simple rubbery skin with electronics and a battery embedded inside, one that can be peeled off, put in a box to charge overnight, and be ready to use each day. Its portability is another benefit.

“A simulated wound, such as this, can be put on a mannequin, person, or anywhere to help with training,” Anderson says. “Participants can see and respond to realistic scenarios and, hopefully, transfer the skills to work with real-life patients.”

“When we cannot find solutions for our needs, we develop our own solutions using interdisciplinary collaborations such as this,” Anderson adds.

The smart moulage has applications beyond severe bleeding scenarios, and could be used for burn training, surgical training and more.

Co-inventors Associate Professor Frank Guido-Sanz and College of Nursing Associate Dean for Simulation and Immersive Learning Mindi Anderson demonstrating the technology. (Photo by Dana Saccoccio)

“The potential of this invention to improve trauma and critical care training scenarios involving wounds and bleeding injuries is endless. It could benefit frontline providers, from first responders to nurses and physicians, across military and civilian settings to ultimately impact patient lives,” says co-inventor Frank Guido-Sanz, who has first-hand experience in trauma care as an acute care nurse practitioner in addition to his faculty appointment at the College of Nursing.

For now, this patented innovation is in the prototype stage. UCF’s Technology Transfer team is working to make potential licensees aware of the technology and to help them explore how it might fit with their systems or envisioned use cases.

While this patent waits to be licensed for broad use, Welch, who has more than 25 patents at UCF, will continue to innovate.

“I think there’s a void between the two spaces of relatively realistic real-world patient scenarios and VR,” Welch says. “That’s the hard space to develop, but there’s a lot of space for something to be done.”
