We can’t quite explain what you’re going to see if you go to Telepresence, an adventurous virtual-reality undertaking at the Western Front this weekend, but we can promise you this: it won’t be like anything you’ve seen before. The brainchild of media artists Kiran Bhumber and Nancy Lee, with assistance from trumpeter JP Carter and assorted coders and programmers, it aims to upend the conventions of the traditional concert in at least a couple of innovative ways—including breaking down the barrier between stage and seating.
“In general, the way contemporary music is performed, or jazz or electronic music, is very staid and performer-separated,” Bhumber explains, in a conference call with Lee and the Georgia Straight. “You’re in your role as a performer or as an audience member. And what Telepresence does is it allows a new space [in which] to experience a musical performance. And the way that happens is by using VR, but also by having the live performer and the audience members in the same stage area.”
“Virtual reality is kind of a new medium, but conventionally it’s used in a very visually centred way,” Lee adds. “In gaming or in cinema, it’s really focused on the visual element. So we really want to reverse that hierarchy between the audio and the visual, centre the focus and the attention around sound and music, and create a visual element in the virtual environment that will evoke a deeper listening.”
At the Front, Carter, his electronically augmented trumpet, and his listeners will find themselves ringed by a multichannel sound system; given the musician’s highly creative use of looping, reverb, and extended techniques, that should be enough to ensure an immersive environment. Beyond that, each listener will be given a virtual-reality headset that places them, visually, in an abstract, interactive landscape that in some ways mirrors the music. There’ll be an abstract three-dimensional representation of the musician and generative images that presumably will have some sort of synaesthetic component, plus an unspecified “game object” element—and audience members will have a measure of control over at least some of these.
“Some elements of the virtual environment will be sound-reactive to JP’s performance,” Lee explains. “In terms of the reactivity of that, that depends on the position of the audience member’s head and how far it is away from JP’s trumpet—so there are small interactive elements that are visual, and that will be live.”
Meanwhile, Carter will be “conducting” the listening and viewing experience, in a way. “We kind of have a few different choreographic elements with him,” Bhumber says, “but it is more on the improvisational side. He really does play to how people are moving in the space, and what we’ve seen in some instances of his performance is that he will play a certain timbre, and that will change the audience’s position—how their body is positioned, and also how their gaze is positioned.”
And there’s another way in which the usual concert hierarchy is upended: although other applications of virtual-reality technology can feel alienating, the idea here is that everyone’s cocreating the event.
“What we find really interesting is how people are able to connect with each other before the performance, and how people can debrief with each other after the performance, too,” Lee says. “Being able to see how this system that we’ve built for the collective VR experience can facilitate this kind of social interaction, that’s an area that we’re definitely observing.”
Telepresence takes place at the Western Front on Friday and Saturday (December 14 and 15).