
Opinion: Why the metaverse isn’t ready to be the future of work just yet

While metaverse supporters claim VR can virtually recreate in-person collaborative experiences, the ways users interact with VR systems can introduce usability issues.

The COVID-19 pandemic radically changed how many Canadians work. According to Statistics Canada, the share of Canadians working remotely rose to 40 per cent in April 2020 and had fallen to 18.7 per cent by May 2024.

While telework lets employees save the time and money spent commuting and can offer a better work/life balance, academic studies have shown that collaborative work can suffer in remote contexts.

Some companies argue that remote work lacks “water cooler moments” — impromptu encounters between employees that can spark innovation. Proponents of the metaverse suggest it could be the future of remote work, since virtual worlds and virtual reality (VR) allow people to collaborate in shared virtual spaces.

The compounding effects of Zoom fatigue and awkward attempts to recreate informal office interactions through virtual events like Zoom parties are partly behind recent interest in the metaverse as the future of remote work.

The term metaverse was coined more than three decades ago by novelist Neal Stephenson in his 1992 science-fiction novel Snow Crash.

Today, it’s used to describe “the convergence of physical and virtual space accessed through computers and enabled by immersive technologies such as virtual reality, augmented reality and mixed reality.” These technologies are collectively referred to as extended reality.

User interaction issues

This isn’t the first time companies have explored the use of virtual environments to support remote work. In the early 2000s, companies like IBM and Microsoft had virtual islands in the social virtual world Second Life, where they hosted corporate training and recruitment events. But where Second Life was a desktop platform, the metaverse is primarily meant to be used with VR head-mounted displays.

While metaverse supporters claim VR environments can virtually recreate in-person collaborative experiences or “water cooler moments,” the ways users interact with VR systems can introduce some usability issues.

To interact with 3D content in virtual environments, modern VR systems almost universally use 3D input devices such as controllers or wands, typically tracked in 3D by outward-facing cameras on the headset. Hand tracking is becoming increasingly prevalent as well.

However, these 3D controllers have been shown to lack the finesse of conventional computer input devices like a mouse or touchscreen, resulting in worse performance in common tasks such as acquiring targets (for example, clicking on icons).

Eye tracking is also gaining ground, with recent devices such as the Apple Vision Pro relying almost entirely on eye-based interaction. For repeated target acquisition tasks, eye tracking can cause eye fatigue, and its performance is ultimately limited by the quality of the device’s eye tracker.
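As a rough illustration of how researchers commonly quantify these pointing-performance claims, the Python sketch below computes the Fitts’ law index of difficulty and throughput for a few input devices. Fitts’ law is a standard tool in pointing research, but it is not named in this article, and the distances, target widths and movement times here are invented for illustration, not measurements from any study cited above.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Pointing throughput in bits per second: difficulty divided by movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical trial data: the same targets acquired with different devices.
# Distances and widths in centimetres, movement times in seconds.
trials = {
    "mouse":         {"distance": 16.0, "width": 2.0, "time": 0.7},
    "vr_controller": {"distance": 16.0, "width": 2.0, "time": 1.1},
    "eye_tracker":   {"distance": 16.0, "width": 2.0, "time": 0.9},
}

for device, t in trials.items():
    tp = throughput(t["distance"], t["width"], t["time"])
    print(f"{device:14s} throughput = {tp:.2f} bits/s")
```

Higher throughput means faster, more precise pointing for targets of a given difficulty, which is how "worse performance" with 3D controllers is typically measured in the lab.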

Ergonomics and physical strain

Ergonomic issues also persist. While VR head-mounted displays are becoming increasingly affordable and portable compared to devices from the late 1990s, they can still cause head and neck strain when worn for extended periods of time.

Another common issue is that VR controllers are known to cause arm and shoulder fatigue, a phenomenon commonly referred to as “gorilla arm syndrome.”

There are methods to alleviate this fatigue, such as mapping smaller physical motions to larger virtual motions. However, most commercial device manufacturers have not adopted such techniques yet, instead favouring 1:1 scale interaction. In other words, any movement you make in real life is reproduced at the same scale in VR.

Notably, 1:1 mapping goes back to the earliest examples of 3D interaction (not necessarily in VR) in the mid-1980s, and it has persisted despite extensive academic research since then aimed at improving the situation.
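To make the amplified-motion idea concrete, here is a minimal Python sketch of a constant control-display gain, assuming the hand position is reported as a simple (x, y, z) offset. The gain value of 2.5 is arbitrary and chosen only for illustration; it is not taken from any particular product or study.

```python
def virtual_hand_delta(real_delta, gain=2.5):
    """Map a real hand movement to a virtual one.

    gain = 1.0 reproduces the 1:1 mapping most commercial systems use today;
    gain > 1.0 amplifies motion so the arm travels less for the same virtual reach.
    """
    return tuple(gain * d for d in real_delta)

# Example: the hand moves 10 cm to the right in the real world.
real_movement = (0.10, 0.0, 0.0)                        # metres, (x, y, z)
print(virtual_hand_delta(real_movement, gain=1.0))       # 1:1 -> (0.1, 0.0, 0.0)
print(virtual_hand_delta(real_movement, gain=2.5))       # amplified -> (0.25, 0.0, 0.0)
```

With an amplified mapping, users can cover the same virtual distance with smaller, less fatiguing arm movements, at the cost of some fidelity to real-world motion.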

To their credit, Apple has shown an awareness of this issue with the Vision Pro, using eye tracking as the primary input method, with subtle hand gestures made in a comfortable arm pose acting as a “click.” This technique, likely adapted from the “Gaze + pinch” method, uses eye gaze to select targets and hand gestures to manipulate them.
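The Python sketch below shows one way a gaze-plus-pinch interaction might be structured. It is not Apple’s implementation; the Target type, the two-degree angular threshold and the scene geometry are all invented for illustration.

```python
import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    position: tuple  # (x, y, z) in metres

def gaze_hit(targets, gaze_origin, gaze_direction, max_angle_deg=2.0):
    """Return the target closest to the gaze ray, if any lies within a small
    angular threshold. gaze_direction is assumed to be a unit vector."""
    best, best_angle = None, max_angle_deg
    for t in targets:
        to_target = tuple(p - o for p, o in zip(t.position, gaze_origin))
        dist = math.sqrt(sum(c * c for c in to_target)) or 1e-9
        cos_angle = sum(a * b for a, b in zip(gaze_direction, to_target)) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
        if angle < best_angle:
            best, best_angle = t, angle
    return best

def gaze_plus_pinch(targets, gaze_origin, gaze_direction, pinch_detected):
    """Eye gaze nominates the candidate target; a pinch gesture confirms it,
    playing the role a mouse click would on a desktop."""
    candidate = gaze_hit(targets, gaze_origin, gaze_direction)
    return candidate if (candidate and pinch_detected) else None

# Example: two icons floating two metres in front of a seated user.
icons = [Target("save_icon", (0.0, 1.5, -2.0)), Target("close_icon", (0.5, 1.5, -2.0))]
selected = gaze_plus_pinch(icons,
                           gaze_origin=(0.0, 1.6, 0.0),
                           gaze_direction=(0.0, -0.05, -0.999),
                           pinch_detected=True)
print(selected.name if selected else "nothing selected")  # -> save_icon
```

The appeal of this design is that the eyes do the travelling, which is fast and effortless, while the hand only supplies a tiny confirming gesture that can be made with the arm resting comfortably.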

Cybersickness

A final usability hurdle is cybersickness — visually induced motion sickness commonly observed with VR use. Presently, anywhere from 20 to 95 per cent of users can experience cybersickness.

For years, cybersickness was incorrectly attributed to technical issues, such as display refresh latency. We now know that while latency exacerbates cybersickness, the phenomenon is more complex and caused by human factors not easily addressed through better or faster technology.

The primary cause of cybersickness is believed to be visual-vestibular mismatch. This occurs when your eyes tell you you’re moving while your inner ear tells you that you are not. VR systems that move the user through the environment with a controller joystick are more likely to induce cybersickness than those that rely on natural walking.

Postural stability — our ability to keep upright in the face of conflicting visual information — also plays a role, and sudden motions in VR can cause users to lose their stability. Both issues can be mitigated through better system design, such as minimizing or eliminating virtual movement, or using movement techniques that provide correct cues, like physically walking around an environment.

Since we often want to move around a virtual environment that is larger than the physical space available, other techniques intended for joystick or steering-based virtual movement have also been shown to reduce cybersickness. Examples include reducing the field of view during motion, or snapping the viewpoint between fixed rotation angles and positions rather than moving it smoothly.
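As a sketch of what these mitigations can look like in software, the Python snippet below implements a simple snap turn and a speed-based field-of-view vignette. The 30-degree snap angle, the linear ramp and the 3 m/s speed cap are arbitrary illustrative choices, not values from any specific headset or study.

```python
SNAP_ANGLE_DEG = 30.0  # illustrative snap-turn increment

def snap_turn(current_yaw_deg: float, direction: int) -> float:
    """Rotate the viewpoint in discrete jumps rather than smoothly, so the eyes
    never see a sustained rotation that the inner ear cannot feel."""
    return (current_yaw_deg + direction * SNAP_ANGLE_DEG) % 360.0

def vignette_strength(speed: float, max_speed: float = 3.0) -> float:
    """Narrow the visible field of view as joystick movement speed rises.
    Returns 0.0 (full view) when stationary, up to 1.0 (heavily restricted)
    at max_speed, reducing the amount of visual motion in the periphery."""
    return max(0.0, min(1.0, speed / max_speed))

# Example: standing still vs. moving at 2 m/s with the joystick.
print(vignette_strength(0.0))   # 0.0 -> no vignette
print(vignette_strength(2.0))   # ~0.67 -> field of view noticeably reduced
print(snap_turn(90.0, +1))      # 120.0 -> viewpoint snaps 30 degrees right
```

Both techniques trade a little immersion for comfort: the user sees less of the scene while moving, or turns in visible jumps, but the conflicting motion cues that drive cybersickness are reduced.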

The future of work

As for the future of work in the metaverse, while some may be eager to see the shift in the very near future, there are still too many issues that make working in VR complicated. These issues, unfortunately, will not be easily solved by simply releasing new hardware or software.

Until these issues of user interaction, ergonomics and cybersickness are resolved, the metaverse will not be ready to fully replace traditional office environments or provide a completely effective alternative for remote work.

For now, remote work in the metaverse may appeal to early adopters or companies looking to experiment with virtual spaces, but it is unlikely to become a mainstream solution in the immediate future.

Victoria (Vicky) McArthur receives funding from the Natural Sciences and Engineering Research Council of Canada and the Social Sciences and Humanities Research Council of Canada.

Robert Teather receives funding from the Natural Sciences and Engineering Research Council of Canada through a Discovery Grant entitled "Interaction in Cross-Platform Multi-User Virtual Reality." The research program supported by this grant studies the effectiveness of VR as a multi-user collaboration tool.