Hatfield, a teacher and cellist, began working with the VR lab in 2017. So far, the students have tested only the audio elements of the virtual practising tool. In November 2019 the first test round with both sound and images was carried out, and the goal is for the VR lab to be in use from 2020, although Hatfield is reluctant to name a specific date. It’s not unusual to hit some bumps in the road when you set out to develop a product the world has never seen before.
Demanding audience
For now, we are watching the virtual practising world on a computer screen on the desk in the VR lab: the newly arrived VR glasses from China are not working properly. Pål Hieronimus Aamodt is the COO and project manager of Pointmedia, the company behind the project’s 3D animation. He is sitting in front of the computer screen, opening and closing windows, clicking on things, trying to get the large, black glasses to communicate with the software they have developed. Scratching his head.
“We’ll take it up with the producer,” he says, partially to himself and partially to Hatfield, who nods.
The VR lab has presented several challenges along the way, such as how to create the sound of a large concert hall in a room that measures only a few short steps from one end to the other. The biggest problem, though, has been the audience.
“In the midst of the development process, we discovered that the virtual person wasn’t good enough. We have spent 1 500 hours creating a new one,” Aamodt explains, bringing up the smallest concert hall on the screen and zooming in on the audience.
They are sitting like a typically well-mannered audience at a classical concert, hands in their laps, gazes directed towards the stage. But the faces all look alike; several of the women have bought the same bright green dress, and the men have gone to the same barber. We are seeing an older version of the program, Hatfield assures us. The new audience members Pointmedia has developed have wrinkles, more texture and varied facial expressions.
“But when you animate everything in 3D there are limits to how close their movements can be to reality. With face capture and motion capture, you can produce audience movements and facial expressions more easily. But a motion capture suit costs around NOK 100 000, so you need the resources to get access to one,” says Hatfield.
Coughing, clapping and booing
Hatfield himself and Pointmedia decide how the audience members look. Their behaviour, on the other hand, can be pre-programmed or triggered by the teacher while the student performs in the virtual concert hall.