Extending Reach with Extended Reality: Live Theatre Performance and XR

By: Rebecca Hodge

Traditionally, theatre is defined by a live performance happening in one physical place, at one time, where audiences gather for a collective experience. But this structure presents a fundamental issue: what happens when an audience can’t be there? Potential audience members may not live where a performance is taking place, may not be able to afford the cost of admission, or may have physical limitations that prevent them from attending.

Extended reality (XR) technologies present a possible solution for addressing these issues, with a multitude of methods for live performances to reach and engage with more audience members.

The Pros and Cons of Proshots

Some methods for capturing live theatrical performance already exist and have been successful, but they have areas for improvement. The main existing method is professional video recordings, commonly known as proshots. Most often, these are video recordings made with multiple cameras, providing a variety of shots, then edited together, similar to a film. Sometimes these events are livestreamed; other times they are pre-recorded, edited, and broadcast at a set time; and still other times they are placed on streaming services to watch on demand.

The primary benefit of proshots is that they are easily distributed, making them less expensive for audiences to access. NT Live, the National Theatre in London’s broadcast series, has successfully used proshots to reach a larger audience. Through distribution to movie theaters across the world, the play Prima Facie reached 1.5 million additional viewers. If all these audience members had attended in person, they would have filled the theater for nearly five years of nightly performances.
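
For a sense of scale, that comparison is simple arithmetic. Below is a minimal sketch of the math, assuming a single house of roughly 800 seats (a typical West End capacity, and an assumption rather than a figure given in this article):

```python
# Back-of-envelope check of the "nearly five years" claim.
# The 800-seat capacity is an illustrative assumption, not a cited figure.
additional_viewers = 1_500_000
seats_per_night = 800

sold_out_nights = additional_viewers / seats_per_night  # 1,875 nights
years_of_nightly_shows = sold_out_nights / 365           # about 5.1 years

print(f"{sold_out_nights:,.0f} sold-out nights, roughly {years_of_nightly_shows:.1f} years")
```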

Audience members seated in a dark theater facing a large screen displaying a single chair illuminated by stage lights before an NT Live screening of Inter Alia begins.

Figure 1: Audience members waiting for an NT Live screening of Inter Alia to begin. Source: The New York Times

While the audience reach can be impressive, there are artistic and experiential costs to proshots. One important consideration is that film and theatre are inherently different mediums. Rather than watching from a single perspective at a fixed distance for the duration of a show and choosing where to look, the audience for a proshot sees curated angles and shots, arranged like a film.

“A proshot that uses the language of films, such as editing and close-ups, is using a fundamentally different language than a theater performance and will often come into conflict.”
— Author Function, in “Bad Editing in Modern Musical Proshots” (7:58-8:07)

In essence, proshots are live theatre performances captured and consumed as film, and this means elements of staging and theatrical artistry get lost in translation.

Another key problem is that the end result of a proshot isn’t live by default. Instead, it’s a recording of something live and cannot truly recreate the live performance experience.

Defining Live Performance

If XR is to expand the reach of live performances and do it better than proshots, then there must be some understanding of what “live” really means. When thinking of attending live theatre, a general process comes to mind: buying tickets for a specific performance at a certain date, time, and place; physically going to that location at that time along with other audience members; and seeing physical actors on a stage. But do any of those elements define a performance as truly “live”?

In Liveness: Performance in a Mediatized Culture, Philip Auslander argues that there is no core, eternal definition of liveness. Rather, the concept develops and shifts over time in response to technological advances and historical changes. The Internet and digital media are key advances in recent history, introducing the concept of mediatized performances: performances circulated through audio and video with technologies of reproduction, such as proshots. Auslander challenges how live and mediatized performances are often placed in mutually exclusive opposition. Rather, he views live and mediatized events as linked, with one often drawing on the other.

From this framework, it becomes possible to recognize that a live performance isn’t restricted to physical, in-person, synchronous events. How, then, do we create a sense of liveness in digital, mediatized events? While investigating attempts to create virtual live performances, Pat Healy and Hannah Standiford identified four key components of liveness:

  • Temporality – an event is ephemeral and bound to a “now”

  • Exclusivity – only a limited social group attends the event

  • Spatiality – an event occurs within a set space, whether physical or digital

  • Interactionality – the event has a social aspect that acknowledges a participant’s presence

Depending on how a proshot gets distributed, it can hit some of these factors. For example, NT Live livestreamed a performance of Inter Alia to 680 movie theaters. This fulfills the temporality requirement by being a time-bound performance, happening at the same time as audiences in theaters are watching. It’s exclusive by requiring tickets to watch. It somewhat addresses spatiality, as audiences had to gather at one of the 680 movie theaters to watch the livestreamed performance. Interactionality is mostly absent, however. If a movie theater were empty, nothing would change about the performance. Audience members can interact with each other within the theaters, but this doesn’t really acknowledge their presence within the performance itself.

Other forms of proshot distribution present a mixed bag when addressing liveness. Take the proshot of Hamilton released on Disney+ in 2020 as an example. Since the proshot was on a streaming service, it could be viewed at any time by anyone who had access to Disney+. That crosses out temporality and exclusivity. There is no spatiality beyond gathering around a television screen. Finally, there is no interactionality for audiences.
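
To make the comparison concrete, here is a rough illustrative sketch (not a formal model from Healy and Standiford) that treats the four components as a simple checklist and scores the two proshot examples described above:

```python
from dataclasses import dataclass

# Illustrative sketch only: the four components of liveness as a checklist.
# The True/False values follow the assessments in the text above;
# "somewhat addresses spatiality" is generously rounded up to True.
@dataclass
class LivenessProfile:
    temporality: bool       # bound to a shared "now"
    exclusivity: bool       # limited to a specific social group
    spatiality: bool        # occurs within a set space, physical or digital
    interactionality: bool  # acknowledges the participant's presence

    def score(self) -> int:
        return sum([self.temporality, self.exclusivity,
                    self.spatiality, self.interactionality])

# NT Live's livestream of Inter Alia to 680 movie theaters
inter_alia_livestream = LivenessProfile(
    temporality=True, exclusivity=True, spatiality=True, interactionality=False)

# Hamilton on Disney+, watchable on demand by any subscriber
hamilton_on_demand = LivenessProfile(
    temporality=False, exclusivity=False, spatiality=False, interactionality=False)

print(inter_alia_livestream.score())  # 3 of the 4 components
print(hamilton_on_demand.score())     # 0 of the 4 components
```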

Proshots often don’t address these factors for liveness. But extended reality technologies could offer an alternative that consistently fulfills these elements of liveness while also providing new avenues of creative exploration for theatre artists.

What Possibilities Are Out There?

In-Person Immersives – “Shared Reality”

In-person immersive technology could essentially replace a stage with a screen, but in a way that feels more real to audiences than a movie theater. One company currently broadcasting live sports and experimenting with other experiences using immersive technology is Cosm. Its extremely high-definition screens are arranged in a dome that sweeps around the audience, paired with spatial audio to create a “shared reality” where everyone experiences the events from their own vantage point.

A large audience fills an arena watching a UFC fight inside an octagonal cage at Cosm, with immersive lighting and multiple overhead screens enhancing the “shared reality” experience.

Figure 2: An audience watches a “shared reality” UFC fight at Cosm. Source: Tomorrow’s World Today

If Cosm can accomplish this experience with live broadcast sports, why not a theatrical performance? The simplest application would be a next step up from movie theaters livestreaming proshots: film a performance and stream the feed to immersive screenings in Cosm locations miles away.

Improvements in projection and interactive technology could bring immersive performances to more spaces as well. Ultra-Wideband (UWB) technology presents opportunities for cheaper and better-hidden projectors and sensors, creating visuals that react to changes in the space. This use of technology wouldn’t necessarily capture and reproduce the actual performance so much as the space around it. The technology could act as a replacement for complex physical sets that have to be built and taken down, making performances more portable and less resource-intensive.

Spatiality, exclusivity, and temporality are all present in these “shared reality” experiences, much as they would be for a typical in-person performance. The presence of other audience members can also give a sense of interactionality for shared screenings, while interactive elements of a performance integrating UWB provide more possibilities to engage the audience.

However, these in-person experiences do not address the concern that there are some audience members who physically cannot attend in-person gatherings. Access to the performance is limited to those who can travel to the location where it is taking place. If there isn’t a Cosm location nearby, or the theatre troupe using a mixed reality set isn’t touring to a close town, then these performances are no more inherently accessible than traditional ones.

Augmented Reality, Anywhere You Have a Phone

A number of theatrical experiments have taken advantage of the fact that many people have smartphones with cameras. All Kinds of Limbo, created by the National Theatre’s Immersive Storytelling Studio, is one example that embraced the augmented reality (AR) space to address the limitations on in-person gatherings posed by the pandemic. The short work features a musical performance by a 3D avatar of the singer, visualized wherever viewers point their phones.

Performer Nubiya Brandon stands on a stylized stage in a white gown, arms extended, surrounded by illuminated golden arches and musicians in a digital environment from All Kinds of Limbo.

Figure 3: A still from All Kinds of Limbo, featuring performer Nubiya Brandon. Source: The New York Times

“To summon some of theater’s shared intimacy, it’s being ticketed and broadcast as live, although the show is recorded. Other people attending virtually are represented by blades of moving white light and, by playing with the settings, you can move around the space and see the action from different angles.”
— Andrew Dickson, describing “All Kinds of Limbo” in the New York Times

Another example is The Tempest, produced by Nexus Studios. As viewers watch, actors perform in a remote studio, their motion projected onto 3D avatars on the audience’s screens. The experience offers opportunities for audiences to participate. For instance, as Gonzalo describes the abundance of nature, viewers can tap the screen to plant seeds and create vibrant greenery that overlays the space through their phones.

A person holds a smartphone displaying an augmented reality scene from The Tempest, showing a 3D green character standing on a virtual stage with the prompt “Tap to Create Lightning.”

Figure 4: A still from a video featuring the live AR technology used in The Tempest. Source: Nexus Studios

Each performance described above has a distinct relationship to the key components of liveness. Both provide a shared digital space for audiences, fulfilling the spatiality element. As ticketed events, both offer some amount of exclusivity, though anyone who has a phone could ultimately join in.

Where the two diverge is in their relationships to temporality and interactionality. All Kinds of Limbo was prerecorded but broadcast at specific times to audiences. The Tempest, on the other hand, had the actors performing at the same time the event was broadcast. Both were advertised as “live” performances because they were only available at specific showtimes.

For interactionality, All Kinds of Limbo gave a nod to audience presence by representing audience members as “blades of moving white light” on each other’s screens and allowing them to choose their own perspectives. The Tempest engaged on a more individual level, where audiences could tap on their own screens and see the results reflected in the visuals. However, this interaction was purely visual and independent. The presence of other audience members did not affect an individual audience member’s experience in any notable way.

In both examples, the performances were created by motion capture of a live performer. Ultimately, the quality of that performance can be limited by the quality of the motion capture and resulting output. Seeing human forms lag, glitch, or otherwise move unlike humans may lead to an unintended uncanny experience that breaks immersion.

Issues with motion capture are one facet of how creating AR and other XR experiences can demand significant technical expertise and money. Exploring the ways that extended reality has been used to create Shakespeare productions such as The Tempest, Aneta Mancewicz notes several patterns:

  • Most of the projects investigated relied on public funding from metropolitan centers

  • Most performances were short (10-60 minutes), both because of the costs of creation and the limited attention audiences could sustain with XR

  • Numbers of participants were limited due to technical costs and issues such as data transfer and troubleshooting

While nearly everyone has a smartphone, not every theatre can afford to realize an AR performance on one, and even those that can still face limits on audience attention and technological issues on the audience side.

New Worlds Within VR Headsets

Virtual reality headsets provide full visual immersion for the wearer and potential methods for interactivity with a virtual space. This means that settings for performances could be anything from under the sea to the depths of space or a historical monument, so long as they can be 3D modeled. No matter where an individual looks, the headset ensures they are still within the event space.

“A VR setup, which gives the wearer the freedom to focus anywhere within its 360-degree view, has perhaps more in common with theater, where a spectator chooses the focal point, than with film, where the director does.”
— Alexis Soloski, in the New York Times

A stylized virtual reality scene from Tempest shows two animated figures, one holding a flashlight and another standing near a campfire in a dark forest environment.

Figure 5: A still from Tempest, a VR production created by Tender Claws. Source: The New York Times

In practice, many theatre productions involving VR headsets have both audience and performers appear via virtual avatars in real time. Rather than relying on video recordings or motion capture, performers must translate their work into virtual avatars, which don’t have to be humanoid or realistic. They in turn respond to manifestations of audience members, who have limited forms of expression but are nonetheless present and able to move within the same virtual space.

Companies such as Tender Claws and Adventure Lab have experimented with VR theatre performances that take place in real time with shared virtual spaces, fulfilling the components of spatiality and temporality. By using live performers and allowing them to “riff” off of audience members through their physicality, the experiences can achieve interactionality and a sense of liveness. So long as people have headsets and Wi-Fi, these performances can be experienced anywhere.

“It feels live. It feels present. Even though we’re virtual, I feel you in there.”
— Katelyn Schiller, actress, in the New York Times

For better or for worse, exclusivity is inherent to the system as it stands now. Tender Claws and Adventure Lab report difficulty getting investors on board due to the limitations of the technology. As described by Alexis Soloski in an article looking into VR theatre, “an immersive VR experience can hire and train only so many actors, and those actors can guide only so many people, a barrier to attracting and earning back investment.”

Envisioning New Forms of Live Performance

Extended reality doesn’t seem poised to replace traditional live theatre but does offer a distinct new area of exploration — a partnership between theatre and technology, between the virtual and the real.

“The potential of extended reality for theatre depends on the theatre’s ability to reinvent itself within a novel framework of reality as an overarching and hybrid entity that combines physical and virtual realities.”
— Aneta Mancewicz, in “Extended Reality Shakespeare”

Each technology has its own strengths and weaknesses. However, in many cases, one format can be easily ported to another, allowing for flexibility that decreases the impact of these problems. All Kinds of Limbo was available via AR, VR, and typical streaming on a computer. Cosm’s technology for creating immersive in-person screenings could be ported over to VR headsets or used to recreate 3D environments with AR on a phone. This adaptability between forms of technology presents greater accessibility, as audiences can choose the experience that suits them best.

Throughout the different forms of extended reality and digital performances, many creators have emphasized the importance of interactionality. In many XR theatre experiences, the audience is present within the show and able to interact with it in some way. Rather than quietly sitting in a seat and watching a framed performance, audiences in XR theatre can choose their own perspectives or change the virtual setting. This interactionality provides a differentiating factor for XR theatre, moving it beyond simply recreating existing live, traditional experiences.

In order to expand audience reach, XR technologies will need to improve in cost and accessibility for creators and audiences alike. At present, many XR theatre experiences are experiments, requiring ample outside funding and technological expertise for theatremakers to pull off a show. XR can also require a high level of digital literacy from audience members, who may not be equipped to set up a VR headset or troubleshoot when Wi-Fi gets spotty.

Nonetheless, as artists continue to experiment with XR and live performance, they find new ways to address these issues. XR is not the ultimate solution to expanding audience reach. Rather, it’s a multitude of potential solutions, a toolset for greater artistic experimentation and for crafting new avenues through which audiences can engage with art.