You’re sitting in the audience waiting for the Broadway hit Hamilton to start when the producer appears on stage. “I’m sorry, but the actress playing Angelica Schuyler has food poisoning and the understudy isn’t here tonight. We’re going to have to cancel the show, unless…”

This is the scenario Informatics Professor Tess Tanenbaum presents when talking about her new project ShadowCast. She continues:

“…unless there is somebody in the audience who knows the part and is willing to go on.” You raise your hand nervously as the producer scans the audience. The producer sees your hand go up and yells, “You! Get up here on stage.” The show will go on!

“That is the fantasy that we’re trying to fulfill,” says Tanenbaum, referring to the virtual reality karaoke platform she is building specifically for musical theater. ShadowCast is a fully functioning karaoke jukebox that comes with its own virtual Broadway stage and audience, creating an immersive experience for the performer. When you select a song from the 50 different options — ranging from “Let It Go” from Frozen to “You Will Be Found” from Dear Evan Hansen — you also select an avatar. Then, as you step out onto the stage and into the spotlight, facing a virtual audience, you step wholeheartedly into that role.

A performer’s avatar in ShadowCast while singing karaoke to a virtual audience.

The Art of Acting in VR
“Virtual performance, as mediated through an avatar, comes with a host of complex interactional challenges in order to give the performer the ability to easily express and communicate emotion, without distracting them from the moment of the performance,” explains Tanenbaum. Consequently, in creating this immersive experience, Tanenbaum and her team, including informatics Ph.D. student Nazely Hartoonian, have had to explore new forms of nonverbal communication.

“What I love about ShadowCast is its innovative approach to interface design,” says Hartoonian, who earned her bachelor’s degree in economics from UCI in 2019 and met Tanenbaum through her work as the productions director for UCI’s Video Game Development Club (VGDC). “At the touch of a single button, ShadowCast allows its user to seamlessly transition from multiple expressions while giving a virtual performance of a lifetime.”

As lead producer of the ShadowCast project, Hartoonian handles various testing and quality assurance tasks and helps manage communication with the two dozen or so developers involved with the project. She has also worked closely with Tanenbaum and fellow informatics Ph.D. student Jeffrey Bryan to research the state of the art, as outlined in their CHI 2020 paper, “How Do I Make This Thing Smile? An Inventory of Expressive Nonverbal Communication in Commercial Social Virtual Reality Platforms.” Their research highlights the scarcity of interaction paradigms for facial expression as well as the huge gap in other nonverbal forms of communication such as posture, pose, and social status.

“What our system offers is a unique approach to nonverbal communication,” says Hartoonian. In particular, ShadowCast includes a built-in “emotion wheel” that users can leverage to control their facial expressions through head movements.

“When you push down a button, an emotion wheel appears around the periphery of your vision,” explains Tanenbaum. “And it’s this infinitely flexible parametric, analog facial puppeteering space that can be as simple or as complex as you want it to be.” The space is mapped to mimic smiling and frowning, so as you lift your chin up, the avatar starts to smile; as you lower your chin, the avatar grows increasingly angry. There is also a horizontal axis mapped to feelings of surprise or sadness. “This allows your avatar to express laughing or angry crying, or it can be angrily or happily surprised,” notes Tanenbaum. “And the wheel recedes into the background really nicely, so you can do this without having to stop your performance.”

The ShadowCast “emotion wheel” for selecting visual expressions using head movement.
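
To give a rough sense of how a head-driven expression space like this could work, here is a minimal sketch, not the ShadowCast implementation: the axis ranges, deadzone, and blend-weight math below are assumptions. Vertical head motion blends between smiling and frowning, horizontal motion between surprise and sadness, and the resulting weights could drive an avatar’s facial blendshapes.

```python
import math
from dataclasses import dataclass


@dataclass
class ExpressionWeights:
    """Blend weights (0..1) for the four corner expressions of the wheel."""
    smile: float = 0.0
    frown: float = 0.0
    surprise: float = 0.0
    sadness: float = 0.0


def head_to_expression(pitch_deg: float, yaw_deg: float,
                       max_angle: float = 30.0,
                       deadzone: float = 3.0) -> ExpressionWeights:
    """Map head orientation (relative to where the wheel was opened)
    to expression blend weights.

    pitch_deg: chin up (+) / chin down (-), in degrees.
    yaw_deg:   head turned right (+) / left (-), in degrees.
    max_angle: head angle at which an expression reaches full strength.
    deadzone:  small angles are ignored so a neutral face stays neutral.
    """
    def axis(value: float) -> float:
        # Normalize one axis to -1..1, with a neutral deadzone in the middle.
        if abs(value) < deadzone:
            return 0.0
        scaled = (abs(value) - deadzone) / (max_angle - deadzone)
        return math.copysign(min(scaled, 1.0), value)

    v = axis(pitch_deg)   # vertical axis: smile (+) vs. frown (-)
    h = axis(yaw_deg)     # horizontal axis: surprise (+) vs. sadness (-)

    return ExpressionWeights(
        smile=max(v, 0.0),
        frown=max(-v, 0.0),
        surprise=max(h, 0.0),
        sadness=max(-h, 0.0),
    )


# Example: chin lifted 20 degrees while turning slightly right gives a
# mostly smiling, slightly surprised face ("happily surprised").
print(head_to_expression(pitch_deg=20.0, yaw_deg=8.0))
```

In a setup like this, the “expert interface” Tanenbaum describes next would amount to letting a production reassign which core faces occupy which zones of the wheel.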

Furthermore, the interface is not only easy to learn but also infinitely scalable. “We’re imagining an expert interface for people who are producing theater, where they take core faces and assign them zones in this space so that they can then create distinct facial expressive spaces as a performer,” says Tanenbaum. Also, although this emotion wheel is built into ShadowCast, it could easily be applied separately to other systems.

“So ShadowCast has become this great vessel for exploring these questions of how we express emotion in a virtual space, how we create intimacy in a virtual space, in addition to the questions about performance and the pleasures of theater and acting,” she says. “So it’s been a really generative project.”

Potential to Increase Engagement and Access
According to Tanenbaum, ShadowCast also lays the groundwork for a more ambitious project. “The twist to this is that we want it to be public performances,” she says, “so it’s meant to be done in the lobby of a Broadway theater with a touring show.” Tanenbaum has been in contact with ICS alumnus Tim Kashani, who co-founded Apples and Oranges Studios and has produced Tony Award-winning productions. “The idea is that you have an actual audience that is watching you, and your virtual performance is up on a big screen,” says Tanenbaum.

The ShadowCast spectator view.

A separate version of the ShadowCast system resides on a network and creates a spectator screen, so as the real audience watches the virtual performance, they can “puppeteer” the virtual audience using a smartphone app. The real audience thus becomes the virtual audience, supporting the performer. “VR is really good at making you feel like you’re somewhere else, which is good if you have stage fright or you’re self-conscious,” notes Tanenbaum. “It makes you feel safe and supported, and you aren’t looking the real audience in the eyes, but you still get that feedback and sense of connection.”
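
One way to picture that audience-puppeteering loop, purely as an illustration (the reaction names, time window, and transport below are assumptions, not details from the project): phones send reaction taps to the spectator instance, which plays whichever crowd animation dominates the last few seconds.

```python
import time
from collections import Counter
from typing import List, Optional, Tuple

# Hypothetical reaction vocabulary a phone app might send; the real event
# names and network transport are not described in the article.
REACTIONS = {"applaud", "cheer", "laugh", "gasp"}


class AudiencePuppeteer:
    """Aggregate reaction taps from audience phones into one crowd animation.

    Each tap arrives as (timestamp, reaction). A spectator build could poll
    dominant_reaction() every frame and play the matching animation on the
    virtual audience.
    """

    def __init__(self, window_seconds: float = 5.0):
        self.window = window_seconds
        self.events: List[Tuple[float, str]] = []

    def record(self, reaction: str, timestamp: Optional[float] = None) -> None:
        if reaction not in REACTIONS:
            return  # ignore unknown messages from the app
        self.events.append((timestamp if timestamp is not None else time.time(),
                            reaction))

    def dominant_reaction(self, now: Optional[float] = None) -> Optional[str]:
        """Return the most common reaction in the recent window, if any."""
        now = now if now is not None else time.time()
        recent = [r for (t, r) in self.events if now - t <= self.window]
        if not recent:
            return None  # virtual audience sits idle
        return Counter(recent).most_common(1)[0][0]


# Example: three phones cheer and one laughs, so the virtual crowd cheers.
crowd = AudiencePuppeteer()
for reaction in ["cheer", "cheer", "laugh", "cheer"]:
    crowd.record(reaction)
print(crowd.dominant_reaction())  # -> "cheer"
```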

More importantly, ShadowCast has the potential to reach people far beyond theater lobbies. “The software infrastructures we have created for it are the first steps towards building a fully featured platform for virtual reality distributed theater performance,” says Tanenbaum. “We are now in the preliminary stages of expanding our design to accommodate VR performance education, which will allow us to develop curriculums for performing arts education in virtual reality.” Such a digital infrastructure for theater performance could extend the arts to rural and socioeconomically disadvantaged communities where access to theater education and infrastructure is lacking.

“We know that students who are highly engaged with the arts, especially rural and underprivileged students and underrepresented minorities, are much more likely to graduate high school and college and find career success than students who are unengaged with the arts,” says Tanenbaum. “You also make it accessible to people who have typically been excluded from theater,” she continues. “For better or worse, theater tends to revolve a lot around physical appearance, which means that people of color, people of size, people with different gender presentation or mobility impairments often are less successful in theater careers.” ShadowCast aims to create a space for anyone and everyone to participate in theater. “That is the big dream for this project.”

Another more recent goal is to consider how this project might help during the global pandemic. “We’ve been thinking about how this technology could help the theater world while we’re all stuck sheltering in place,” says Tanenbaum. “We have the first steps toward distributed virtual theater performance, at a time when Broadway theaters are all closed.” Perhaps Tanenbaum and her team will help find another way to ensure that the show will go on.

Shani Murray