The University of Canterbury (UC) is pioneering a study to develop virtual reality (VR) training for early childhood education student teachers. This initiative addresses a persistent challenge in teacher training: limited access to infants under six months of age.
Many aspiring educators are keen to work with infants, but real-life interaction with this age group is often unavailable. To overcome this, the UC team has created VR-based training environments.
The study was initiated by Professor Jayne White, who sought a partnership with HIT Lab NZ to explore this innovative solution. Associate Professor Heide Lukosch, who leads HIT Lab NZ’s Applied Immersive Game Initiative (AIGI), quickly took interest in the collaboration. The AIGI focuses on using immersive gaming applications to enhance educational, social, and health outcomes.
Supported by the University’s Child Well-being Research Institute, the first trial of a VR prototype was conducted under Professor White’s leadership. “VR has proven to be an effective tool for learning practical skills in various fields, and its potential in education is significant,” Professor White said. “We’re excited about the opportunities this tool presents.”
Associate Professor Lukosch added, “The VR application in early childhood education is crucial as infants communicate in ways that can be hard for non-familial adults to understand. Our research aims to help interpret these cues effectively.”
The study, co-led by Lukosch and White, incorporates whanaungatanga, a Mātauranga Māori concept centred on relationships and kinship, guided by UC Senior Lecturer Dr. Ngaroma Williams. This approach focuses on enhancing relational skills for adults working with infants through novel training environments.
“We’re intrigued by how virtual environments can help people in otherwise inaccessible or dangerous situations,” Lukosch explained. “Our goal is to create an immersive environment where users can feel present and accountable for their actions.”
A key element of the VR training involves haptic gloves that simulate the resistance felt when handling an infant. This technology allows users to experience realistic interactions, such as changing, soothing, feeding, or entertaining a virtual infant.
One of the early VR prototypes challenges users to establish positive interactions by interpreting and responding to a virtual infant’s non-verbal cues. These virtual infants can be programmed with preferences that users must learn to recognize and respond to appropriately.
“For instance, if a virtual baby prefers red toys, picking up a red toy would make the baby smile, while choosing a yellow toy might make it cry,” Lukosch explained. Although the system does not yet use artificial intelligence, the team sees potential for future AI integration to create adaptive learning environments.
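The preference-and-response rule Lukosch describes amounts to a simple lookup: each virtual infant carries a set of programmed preferences, and a user's action triggers a positive or negative cue depending on whether it matches. A minimal sketch of that idea might look like the following; the `VirtualInfant` class and its method names are purely illustrative, not the project's actual code.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualInfant:
    """Hypothetical model of a virtual infant with programmed toy preferences."""
    preferred_colors: set = field(default_factory=set)

    def react(self, toy_color: str) -> str:
        """Return a non-verbal cue based on whether the toy matches a preference."""
        return "smile" if toy_color in self.preferred_colors else "cry"

# A baby configured to prefer red toys, per Lukosch's example:
baby = VirtualInfant(preferred_colors={"red"})
print(baby.react("red"))     # smile
print(baby.react("yellow"))  # cry
```

A future AI-driven version, as the team envisions, would replace this fixed rule with behaviour that adapts to each trainee's interaction history.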
The UC team is also collaborating with Professor Tony Walls and Dr. Niki Newman from the University of Otago Simulation Centre, who provide expertise on the healthcare aspects of the study.
As the project moves towards commercialization over the next three years, the team aims to apply the interaction design principles developed from this study to other training environments.