Coding emotion

3 min read
Daniel Stringer / 1 November 2023

“Tell me and I forget, teach me and I may remember, involve me and I learn.”

– Benjamin Franklin

How do you develop Virtual Reality that makes learners feel? You may have used VR to play games, or even to learn practical skills. Here at Inizio Engage XD, we’ve created training applications that help construction workers safely navigate a building site before going there in real life, or that prepare radio presenters to deal with an equipment failure live on air. But what if the skills we want to hone are our emotional resilience, our interpersonal skills, or our ability to deal with emotionally distressing scenarios? In our most recent VR project, with Telford College, we did exactly that, giving trainee Health Care Assistants the opportunity to experience the very emotions they’ll confront in the world of work. That might include feeling overwhelmed, experiencing bereavement, or even facing abuse. My role, as Lead Immersive Developer, was to change the way we develop these experiences: to code emotion.


Finding inspiration

While our colleagues in the UX and Digital teams interviewed prospective users to gain valuable insights, the design and development team looked for inspiration in existing VR experiences. We found the immersive game ‘That Dragon, Cancer,’ which tells the story of the developer’s son Joel, who was diagnosed with terminal cancer at just 12 months old. This game demonstrated that it was possible to take an intensely emotional topic and apply an artistic style that complemented an emotional journey.
The development process starts with our skilled scriptwriters crafting detailed narratives that guide participants through deep emotional experiences. We then create storyboards that align with the story’s rhythm. This cohesive blend of elements guides users steadily toward the intended emotional peak.


Bringing our characters to life

It was crucial to the success of this project to create an emotional connection between our characters and the learner. Trainee HCAs care for Alice, Dennis and Martha through the three scenarios, which take place in Alice’s home, in an A&E department, and in a care home.

Our characters didn’t have hugely detailed facial features. Instead, we relied on posture, eyebrow movements, and voice to convey their emotions. This helped us avoid the ‘uncanny valley’ phenomenon: when a character’s resemblance to a human is remarkably close, yet not precise enough, it induces a sense of unease in viewers rather than the expected familiarity and empathy.

This allowed us to really prioritise the emotional experience. We then worked with professional actors to give our characters a voice, so that we could achieve those emotional nuances.


Developing empathy

We wanted to give users decisions to make, which would help them to be more emotionally invested in the outcome. Should they introduce themselves to Dennis in the hospital, or just carry on with their clinical tasks? Should they go along with our dementia patient Martha’s confusion, or correct her?

Ultimately, the outcomes are the same: Dennis’ health deteriorates, and Martha is devastated and angry as she realises her husband died long ago. Giving our learners choices connects them more deeply to the outcome; it also lets them experience the overwhelm and fear of violence we want to elicit, preparing them for the demands of their role. There are no perfect outcomes here, no tests to pass, no boxes to tick. Instead, this is about allowing trainees to experience these feelings before they happen in real life. For me, no learning tool does that more effectively than VR.