University awards six XR projects under new initiative

A virtual reality chemotherapy simulation that helps train future medical professionals about what to do when the drug leaks out of blood vessels and threatens a patient’s skin.

A nuclear reactor simulation using virtual reality that offers training without the safety concerns of a live reactor.

These are two of the six extended reality projects to receive a first wave of funding under the University of Michigan’s new XR Initiative, announced in the fall.

The other projects include using augmented reality or virtual reality in immersive audio design classes, architectural construction courses, a virtual physics lab, and the teaching of head and neck anatomy to dental students.

“The university is always asking ‘What’s next?’ and it’s thrilling to see the creative and multidisciplinary projects in health care, engineering and architecture as part of the XR Initiative,” said James Hilton, vice provost for academic innovation.

“These projects are looking at long-standing challenges in new ways and will allow our faculty to use XR to redefine what a hands-on, immersive education looks like for our students and learners beyond campus.”

XR encompasses augmented reality, virtual reality, mixed reality and other variations of computer-generated real and virtual combined environments and human-machine interactions.

The initiative, funded for three years, calls for the Center for Academic Innovation to seed new projects and experiments that integrate XR into residential and online curricula, and to create innovative public-private partnerships to develop new XR-related educational technology.

“We are excited to launch this first round of XR Initiative-funded projects here at the university and our team will shortly begin our design and development processes to deliver these projects at scale,” said Jeremy Nelson, director of the XR Initiative.

Financial awards for the first-round projects ranged from $12,000 to $25,000, and each award will be supported through a number of in-kind investments from the XR Initiative and the Center for Academic Innovation.

“We were blown away by the creativity and ingenuity of the proposals, and we chose projects that we believe will fundamentally change the way we teach and learn,” said James DeVaney, associate vice provost for academic innovation and founding executive director of the Center for Academic Innovation.

“The first wave of XR projects are looking at unique challenges in new ways and target a wide range of learners from high-schoolers through graduate students. That’s important to the center because to understand and make best use of innovative pedagogies and breakthrough technologies we need to design with diverse learners from the start.”

According to the Centers for Disease Control and Prevention, 650,000 people in the United States receive outpatient chemotherapy each year. It is a high-volume, high-risk clinical intervention that requires interprofessional clinical teams to manage, Michelle Aebersold, clinical professor at the School of Nursing, wrote in her proposal.

Her team’s “Getting Under The Skin” project seeks to develop a 3-D environment to help students interested in becoming nurses, pharmacists and physicians manage a serious side effect of the therapy. The goal is prevention, detection and management of a condition in which the drug leaks out of blood vessels as it is being administered, causing skin damage that sometimes goes undetected because of vague symptoms.

“In the area of high-risk medications, we have only been able to show our students the devastating effects of when hazardous yet important intravenous medications leak outside the vessels and cause skin damage,” Aebersold said.

“The ability to help them see how it actually happens inside the body and how quickly it can occur will improve their assessment of the intravenous sites, their knowledge of how important fast treatment is, and their motivation to ensure there are nurse-driven and pharmacy-supported protocols in every infusion center.

“XR has great potential to provide faculty another educational methodology to use in helping students understand their role in caring for patients, being part of a health care team and learning how to care for patients. One added advantage over other simulation methods is that immersive VR can help students understand what it is like to be a patient.”

Another virtual reality project would create 3-D models to give students experience operating a nuclear reactor.

The College of Engineering is home to the No. 1 nuclear engineering program in the country. For several decades up to the early 2000s, the program included training at a physical nuclear reactor. The Ford Nuclear Reactor, originally established as a WWII memorial under the Michigan Memorial Phoenix Project, permanently shut down in 2003.

It was decommissioned over the next four years, leaving U-M as one of the only programs without a research reactor among both the top five-ranked university programs and the Big Ten, said Brendan Kochunas, project manager and assistant professor in the Department of Nuclear Engineering and Radiological Sciences.

The Extended Reality Nuclear Reactor Laboratory simulation would allow some retired courses that used the Ford Nuclear Reactor to be taught again to upper-level undergraduates and graduate students.

“I hope to gain experience and insight into how to apply XR technologies in a practical way to enhance education and research in the field of nuclear engineering,” Kochunas said. “I think XR has such potential for this area of science, or really any area of science where the reality is you have a physical system that is expensive or potentially hazardous.”

Students would use virtual-reality headsets to virtually walk around the reactor control room and floor, look down at the core, view instrument panels and interact with the control panel.

“In reality, one does not simply walk up next to an operating nuclear reactor core, but in virtual reality one can,” Kochunas said. “We can also overlay simulation results on the virtualized physical systems, allowing students to experience neutron fields or temperature fields visually, where in reality this is not possible. Now we get the opportunity to have the Phoenix rise again—only virtually. I think that’s pretty cool.”

Other projects funded are:

Cross-platform XR Tools for Supporting Student Creativity in Immersive Audio Design

Faculty lead: Anıl Çamcı, assistant professor of performing arts technology, School of Music, Theatre & Dance
Description: Develop AR and VR versions of a browser-based tool for designing virtual immersive sonic environments. This tool, Inviso, enables novice and expert users to construct complex immersive sonic environments with 3-D dynamic sound components.
Students: Undergraduate and graduate students taking courses on XR, audio and film production, interaction design, accessibility and areas of practice that deal with data representation and urban planning.

Crashing Trains and Launching Rockets: A Virtual Physics Laboratory for the Classroom

Faculty lead: Thomas Schwarz, associate professor, Department of Physics
Description: This classroom experiment would be used to test the viability of the XR platform for pedagogy in a physics laboratory course and to assess the requirements for developing more general tools for creating future experiments.
Students: Undergraduate physics/engineering students and, potentially, STEM high school students.

Augmented Techtronics

Faculty lead: Jonathan Rule, co-founder of MPR Arquitectos and assistant professor of practice, Taubman College of Architecture and Urban Planning
Description: The project will leverage both VR and AR to demystify construction concepts currently presented through abstract drawings and images. The VR environment will provide a platform construction simulation experience. The AR component will be content embedded in an app accessible on a smart device.
Students: This is a core residential course serving 100 undergraduate and graduate students each year.

Comparison of Student Learning of Head and Neck Anatomy and Diagnosis of Pathology Using XR

Faculty lead: Hera Kim-Berman, clinical assistant professor, Department of Orthodontics and Pediatric Dentistry 
Description: This proposal seeks to compare two learning techniques using two randomly assigned groups of students with minimal experience in interpreting cone beam computed tomography (CBCT) images. One group will be trained to view the data on a computer screen using Dolphin 3-D Imaging software; the VR group will use a headset and the ImmersiveView application to view a rendered 3-D model of the skull with overlaid CBCT data.
Students: The study will enroll second-year dental students.
