Horizon Workrooms

Building the next generation of productivity in AR on the new Quest Pro

Horizon Workrooms is an application that is part of Meta's Work Metaverse experiences. It is designed to let users work in immersive spaces and mirror their computers in VR/AR to enhance their productivity.

Role

Senior Product Designer (IC)

Challenge

To bring the vision for work in the Metaverse to the present by developing new interface paradigms that enable knowledge workers to leverage AR/VR for new productivity use cases

Contributions

Researched and developed new input methods for VR to improve comfort, aid adoption, and increase satisfaction for users across all VR/AR use cases

Worked to abstract the complexity inherent in spatial computing, creating new features and ways of working in VR that are easy to use and desirable for knowledge workers

Contributed to large application redesign projects that introduced new, more flexible ways to work in augmented reality

Worked with researchers and PMs to identify gaps and opportunities and communicated them to other teams at Meta

Impact

Contributed to the Workrooms redesign to create an AR model that is more fluid and flexible across many use cases

Worked on designing new AR interaction patterns shipped with the new Quest Pro

Developed concepts from early-stage ideas to concrete hardware/software specs, directly influencing Meta's hardware roadmap

Context

The Metaverse Vision

The vision for work in the "Metaverse" promises that working in VR and AR is better than using today's computing devices. You wear a headset because it lets you overlay your screens and digital artifacts on top of your world, and you can have meetings with your co-workers from your own home or office in much more immersive ways. It's hard to disagree with this long-term vision, but its resolution needs to be much higher to inspire short-term execution.

The gap between the vision and the present

While there's much speculation about the potential of VR and AR technologies to revolutionize how we work, think, and collaborate, the current design implementations and end-to-end user experience leave much to be desired. When observed closely, today's offerings are light years away from this vision. Key narrative points such as "VR/AR replacing your computer monitor" or "I can be much more productive in VR" miss the granularity required to design good experiences for XR. For example, pinching with your fingers to adjust the size of a window from a distance looks good in a video, but it's not practical in reality. It doesn't provide a good experience because you need to be extremely precise, and the gesture becomes exhausting over an eight-hour workday. The journey of new Quest users begins with excitement, moves to low motivation, and ends with abandonment within two to three weeks of buying the device.

The hardware gap

It's easier to talk about the experience of using VR/AR once you consider the role of hardware. Unlike laptops, VR devices shape how you feel inside an experience. Aspects such as framerate and pixel resolution can change your perception of the entire experience, while weight, tracking, and battery affect how users perceive their interaction with XR applications. On the computing side, standalone devices are limited and will continue to be for a long time, and there are no simple solutions to this problem in the near term. It's easy to see this limitation as an impairment, but it's also possible to frame it as a creative constraint for design: if hardware is limited, then the choice of what to build should be even more intentional. In other words, what experiences should designers focus on that create real user value? What features and use cases are significant enough to justify owning VR hardware? Assuming that some friction points will remain, how do designers create value to aid the adoption of VR in the short term?

The Design gap

The biggest challenge for VR isn't the hardware per se but rethinking how we do interaction design in this new paradigm. Looking at the Quest OS, it's easy to see old patterns finding their way into VR without much concern for potential alternatives or how the end experience will feel for users. There are a few reasons for this. First, VR development originated from games and immersive environments, and you see these choices reflected in the hardware: VR controllers, for example, are joysticks, not touchpads. Because of this, the entire OS reflects simple design choices made years ago, such as point-and-click input methods (either with the hand or the controller).

The design gap is not Meta-specific; it torments every VR manufacturer. It is a universal challenge that marks the end of an era built entirely on the WIMP (Windows, Icons, Menus, Pointer) paradigm. WIMP has been the predominant interface paradigm from the 80s to today. The same model was adapted and improved for touch devices such as the iPad and iPhone, and while the UX seems new, the model remains essentially the same. It's understandable that the first generation of VR operating systems would extend this model. Still, almost ten years later, its limitations can be felt by every user, be they gamers or knowledge workers.

If the experience of using VR is only marginally better than existing computing paradigms, VR's friction won't justify the switch. If the way we interact in VR remains cumbersome and full of micro-frictions, VR offerings will need to be substantially better than existing alternatives.

The dilemma above summarises the intricacies and interdependencies between design and hardware when designing for XR. Different device manufacturers might focus on reducing friction or on improving user value, but VR adoption will only happen when both are in place.

Due to heavy hardware limitations and the slow adoption of serious games, Meta is course-correcting and adjusting its original strategy. Instead of positioning VR as a gaming device, it now focuses on VR for productivity use cases. The Quest Pro was presented as an extension of your workspace, designed for solo work and social interactions in immersive spaces. The entire hardware form factor and spec sheet focused on giving you the best possible device for this new immersive paradigm called the "Metaverse."

Contributions and Projects

As a design leader and individual contributor at Meta, I focused on tangible outcomes I could influence in the short term. I evaluated existing implementations and observed user behavior across different countries to make sure I was focusing on the right problems to solve. Early adopters and users who fall outside the average productivity profile are always more valuable to observe than users at the center of the bell curve. For example, faced with the limitations of small living spaces, many users in Japan already use Horizon Workrooms as their office, spending more than five hours a day in it. These users have developed many hacks to extend battery life and overcome today's limitations to create an ideal experience in the present.

From observations such as these, I developed a working hypothesis that saw general VR adoption as a long-term outcome, but one where it was still possible to triple adoption in the short term by removing major friction points. My approach to solving these problems relied mainly on design and off-the-shelf engineering solutions. The proposals asked hard questions about alternative ways to structure the behavior of applications in VR, but they were also hyper-realistic and pragmatic to implement.

Solving interaction and usability problems

There are two main ways to solve usability and design problems. The first approach focuses on iterating on a solution to improve its comprehension or ease of use. The second uses existing user problems as a starting point to rethink fundamental aspects of how things work. For example, it's possible to make buttons bigger so that far-field interactions with the controller become easier, but the same usability problem can also prompt us to question existing input methods altogether. During my time at Meta, I focused on thinking about problems systemically because of the impact such foundational components could have on the end-to-end user experience.

Going Beyond Computer Screen Mirroring

A second pillar of my work focused on bringing your interactions with your existing devices, OS, and hardware to life in VR. The way Meta approached this space was very literal: the app focuses on letting you extend your computer into VR to create and move windows. My interest lay in taking this initial implementation forward to allow a much tighter integration between VR and the desktop. I was interested in VR as a mediator rather than just a virtual monitor, a metaphor that lets you leverage both VR apps and your computer.

Similarly, the mediator metaphor lets you use your favorite input methods regardless of where and what they are. For example, if VR only works as a virtual monitor, your mouse pointer cannot reach VR apps and spaces. It might seem like a small detail, but users don't distinguish between a desktop window and a VR space; they expect their pointer to work everywhere. Because the metaphor is limited, the implementation is also limited, creating an end experience full of frustrations.
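To make the mediator idea concrete, the sketch below shows one way a single pointer stream could be routed to either the mirrored desktop or a native VR panel depending on which surface currently has focus. This is a minimal illustration in Python, not Workrooms code; every class and method name here (InputMediator, MirroredDesktop, VRPanel) is a placeholder I'm assuming for the example.

```python
# Hypothetical sketch of the "mediator" idea: one pointer stream, many targets.
# None of these names come from any real Quest SDK; they are placeholders.

from dataclasses import dataclass
from typing import Protocol


@dataclass
class PointerEvent:
    x: float          # normalized 0..1 within the focused surface
    y: float
    clicked: bool


class Surface(Protocol):
    """Anything that can receive pointer input: a mirrored desktop or a VR panel."""
    def handle_pointer(self, event: PointerEvent) -> None: ...


class MirroredDesktop:
    def handle_pointer(self, event: PointerEvent) -> None:
        # Forward the event to the remote-desktop stream so the host OS moves its cursor.
        print(f"desktop cursor -> ({event.x:.2f}, {event.y:.2f}) click={event.clicked}")


class VRPanel:
    def __init__(self, name: str) -> None:
        self.name = name

    def handle_pointer(self, event: PointerEvent) -> None:
        # Dispatch to the in-VR UI system instead of the host computer.
        print(f"VR panel '{self.name}' -> ({event.x:.2f}, {event.y:.2f}) click={event.clicked}")


class InputMediator:
    """Routes a single input device (mouse, hand ray, controller) to whichever
    surface currently has focus, so the pointer is never trapped inside the
    virtual monitor."""

    def __init__(self) -> None:
        self.focused: Surface | None = None

    def set_focus(self, surface: Surface) -> None:
        self.focused = surface

    def dispatch(self, event: PointerEvent) -> None:
        if self.focused is not None:
            self.focused.handle_pointer(event)


mediator = InputMediator()
desktop, notes = MirroredDesktop(), VRPanel("sticky-notes")

mediator.set_focus(desktop)
mediator.dispatch(PointerEvent(0.4, 0.6, clicked=True))   # goes to the host computer

mediator.set_focus(notes)                                  # user looks at a VR panel
mediator.dispatch(PointerEvent(0.1, 0.2, clicked=False))   # same mouse, now drives VR UI
```

The point of the sketch is the routing layer itself: once input is mediated rather than piped straight into the virtual monitor, any device can drive any surface.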

The short-term vision I worked to bring to life focused on better connecting what users know today with new affordances, so that controlling their Mac/Windows applications feels easy and personal even when they are just getting set up. I worked with engineers to develop new ways to detect and configure planes in space so that devices such as the new Quest Pro always know where surfaces (planes) are in your worldview. This was important to remove a constant friction in the application today, where users are required to configure a new workspace every time they move to a new area.
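As a rough illustration of the plane-persistence idea, here is a minimal sketch of caching detected planes per room so a known workspace can be restored instantly instead of being reconfigured. It is my own simplification, not the shipped system; the data model, room identifiers, and JSON storage are assumptions made for the example.

```python
# Simplified sketch of persisting detected planes so a workspace doesn't have to be
# reconfigured on every session. Identifiers and storage format are illustrative only.

import json
from dataclasses import dataclass, asdict
from pathlib import Path


@dataclass
class Plane:
    label: str                           # e.g. "desk", "wall"
    center: tuple[float, float, float]   # position in meters
    normal: tuple[float, float, float]   # surface orientation
    extent: tuple[float, float]          # width, height in meters


class WorkspaceStore:
    """Caches detected planes per room so a headset can restore a known workspace."""

    def __init__(self, path: Path) -> None:
        self.path = path
        self._rooms: dict[str, list[Plane]] = {}
        if path.exists():
            raw = json.loads(path.read_text())
            self._rooms = {room: [Plane(**p) for p in planes] for room, planes in raw.items()}

    def save_room(self, room_id: str, planes: list[Plane]) -> None:
        # Persist the scanned surfaces so the next session can skip workspace setup.
        self._rooms[room_id] = planes
        self.path.write_text(json.dumps(
            {room: [asdict(p) for p in ps] for room, ps in self._rooms.items()}, indent=2))

    def restore_room(self, room_id: str) -> list[Plane] | None:
        """Return cached planes if this room was seen before, else None (re-scan needed)."""
        return self._rooms.get(room_id)


store = WorkspaceStore(Path("workspaces.json"))
store.save_room("home-office", [
    Plane("desk", (0.0, 0.75, -0.5), (0.0, 1.0, 0.0), (1.4, 0.7)),
])

cached = store.restore_room("home-office")
if cached:
    print(f"Restored {len(cached)} surfaces without re-scanning")   # fast path
else:
    print("No cached workspace; run plane detection and ask the user to confirm")
```

The design intent is simply that detection happens once per space, and recognition of a previously seen space becomes the fast path.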

My explorations and design choices allowed users to put on their VR headset and get to work in less than 5 seconds rather than the 30 to 40 seconds of today's set-up flow. I focused on flexibility of use to account for the myriad ways users interact with their computers or VR apps: they can be on the go on a train, at a dinner table, in the office, or running late to a meeting room. In contrast, the current app implementation treated the "virtual office" as the primary space, an assumption that didn't match the reality of many VR users who want to get work done fluidly and in a wide range of possible configurations.

Product Impact

This work has influenced a new generation of Workrooms applications and features yet to be released. The result is still far from the long-term vision, but it helped the company take tangible steps toward a design implementation that is valuable and desirable for today's and tomorrow's VR users.