
What if Mixed Reality (MR) applications could seamlessly adapt to any environment — from a cozy living room to a crowded train seat? What if games could weave fantastical stories into your own surroundings, blurring the boundaries between reality and imagination? What if visiting a dream destination were as effortless as stepping outside your front door?
From Ivan Sutherland’s pioneering Sword of Damocles to today’s cutting-edge headsets, MR has continuously expanded how we perceive and interact with digital content. As technology evolves beyond bulky, stationary setups, new hardware and sensors are unlocking fresh possibilities — and new challenges. Researchers are now reimagining what MR can do, where it can exist, and how humans can best experience it.
Recent advances have led to immersive environments that transcend the limits of a single room, enhance accessibility, and enrich human interaction.
At the University of Birmingham’s VR Lab, researchers are pushing these boundaries even further. Led by Dr. Massimiliano di Luca, Prof. Eyal Ofek, Dr. Diar Abdulkarim, Dr. Daniele Giunchi, Yilong Lin, and colleagues, the team is developing technologies that take Mixed Reality beyond dedicated gaming spaces and into everyday life — boosting productivity, enabling new forms of collaboration, and making digital experiences more inclusive and deeply human.
Other publications on this topic:
Talk: Behind the Scenes with Microsoft: VR IN THE WILD 2021
Virtual Reality (VR) & Augmented Reality (AR) pose challenges and opportunities from both a technical and a social perspective. Digital objects, rather than physical ones, can now change our understanding of the world around us. This is a unique opportunity to change the reality we sense.
Microsoft researchers are looking for new ways to extend our abilities beyond our physical limitations: enabling superhuman capabilities on the one hand, and leveling the playing field for people with physical limitations on the other.
Dr. Ofek will describe efforts to design VR & AR applications that adapt to the user’s uncontrolled environment, enabling continuous use at work and during leisure across a wide variety of settings. He will also review efforts to extend rendering to new capabilities, such as haptic rendering.

AvatarPilot: Decoupling one-to-one motions from their semantics with weighted interpolations, C.Y. Wang, Eyal Ofek, H. Kim, P. Panda, A. Won, M. Gonzalez-Franco
ISMAR 2024 Video
Physical constraints in the real spaces where users are situated pose challenges for remote XR and spatial computing interactions using avatars. Users may not have available space in their physical environment to duplicate the physical setup of their collaborators, but if avatars are relocated, one-to-one motions may no longer preserve meaning.
We propose a solution: using weighted interpolations, we can guarantee that everybody is looking at or pointing at the same objects, both locally and remotely. At the same time, this preserves the meaning of gestures and postures that are not object-directed (i.e., that are close to the body). We extend this work to locomotion and direct interactions in near space, such as grabbing objects, exploring the limits of our social and scene understanding, and finding a new use for Inverse Kinematics (IK).
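The weighted-interpolation idea can be sketched as a distance-based blend: gestures near the body keep their one-to-one motion, while far reaches are pulled toward a retargeted, object-aligned position. The following is a minimal illustration of that blending scheme; the function names, thresholds, and linear weighting are simplifications for exposition, not the paper's actual formulation:

```python
import numpy as np

def blend_weight(hand_pos, body_pos, near=0.3, far=0.6):
    """Weight rises from 0 (hand close to the body) to 1 (full reach).

    Distances are in metres; `near` and `far` are illustrative thresholds.
    """
    d = np.linalg.norm(hand_pos - body_pos)
    return np.clip((d - near) / (far - near), 0.0, 1.0)

def retarget_hand(hand_local, body_pos, hand_retargeted):
    """Blend the one-to-one hand position with an object-aligned target.

    Near-body gestures keep their original motion (weight ~0);
    far reaches snap to the retargeted, object-directed position (weight ~1).
    """
    w = blend_weight(hand_local, body_pos)
    return (1.0 - w) * hand_local + w * hand_retargeted
```

The same blend can drive a full IK chain: solve IK toward the blended hand target so the whole arm posture interpolates smoothly between the local and retargeted poses.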

Eyal Ofek, J. Grubert, M. Pahud, M. Phillips, and P.O. Kristensson, Towards a Practical Virtual Office for Mobile Knowledge Workers, Symposium on The New Future of Work 2020 (NFW’20) Video
As more people work from home or while traveling, new opportunities and challenges arise around mobile office work, including using makeshift spaces, working in less-than-optimal conditions, and being decoupled from co-workers. Virtual Reality (VR) has the potential to change the way information workers work: it enables personal, bespoke working environments even on the go. It allows new collaboration approaches that can help mitigate the effects of physical distance. In this paper, we investigate opportunities and challenges for realizing a mobile VR office environment and discuss the implications of recent findings on mixing standard off-the-shelf equipment, such as tablets, laptops, or desktops, with VR to enable effective, efficient, ergonomic, and rewarding mobile knowledge work. Further, we investigate the role of conceptual and physical spaces in a mobile VR office.

DreamWalker: Substituting Real-World Walking Experiences with a Virtual Reality.
J. Yang, C. Holz, Eyal Ofek, and A. Wilson, UIST 2019 Video
We explore a future in which people spend considerably more time in virtual reality, even when walking between real-world locations. In this paper, we present DreamWalker, a VR system that enables real-world walking while users explore and remain fully immersed in large virtual environments via a headset. Provided with a real-world destination, DreamWalker finds a similar path in a pre-authored VR environment and guides the user as they walk through the virtual world. To keep the user from colliding with objects and people in the real world, DreamWalker’s tracking system fuses GPS locations, inside-out tracking, and RGBD frames to 1) continuously and accurately position the user in the real world, 2) sense walkable paths and obstacles in real time, and 3) represent paths through a dynamically changing scene in VR to redirect the user towards the chosen destination. We demonstrate DreamWalker’s versatility by enabling users to walk along three paths across the large Microsoft campus while enjoying pre-authored VR worlds, supported by a variety of obstacle-avoidance and redirection techniques. In our evaluation, 8 participants walked across campus along a 15-minute route, experiencing a virtual Manhattan filled with animated cars, people, and other objects.
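The positioning part of such a system can be approximated with a complementary filter that dead-reckons from inside-out tracking and gently corrects toward absolute GPS fixes. The sketch below is a simplified illustration under assumed names and gains, not DreamWalker's actual tracker, which fuses additional RGBD sensing:

```python
import numpy as np

class PositionFuser:
    """Complementary filter for outdoor positioning (illustrative sketch).

    Inside-out tracking gives smooth frame-to-frame motion but drifts;
    GPS gives absolute but noisy fixes. Dead-reckon from tracking and
    pull slowly toward GPS to cancel accumulated drift.
    """

    def __init__(self, gps_gain=0.05):
        self.gps_gain = gps_gain  # how strongly each GPS fix corrects drift
        self.position = None      # fused 2D position in metres

    def update(self, tracking_delta, gps_position):
        """tracking_delta: frame-to-frame motion from inside-out tracking.
        gps_position: absolute but noisy fix in the same world frame."""
        if self.position is None:
            # Initialise from the first absolute fix.
            self.position = np.asarray(gps_position, dtype=float)
            return self.position
        predicted = self.position + tracking_delta         # dead reckoning
        error = np.asarray(gps_position) - predicted       # drift estimate
        self.position = predicted + self.gps_gain * error  # gentle correction
        return self.position
```

A low `gps_gain` keeps the fused path smooth (GPS jitter barely shows) while still bounding long-term drift; a production system would typically use a Kalman filter with per-sensor noise models instead.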

More adaptive MR work by members of the VR lab:

Reality Check: Blending Virtual Environments with Situated Physical Reality.
J. Hartmann (Waterloo & MSR), C. Holz, Eyal Ofek, & A. D. Wilson, CHI 2019. Video

L-P. Cheng, Eyal Ofek, C. Holz & A.D. Wilson, IEEE VR 2019, Video, MSR Blog

Room2Room: Enabling Life-Size Telepresence in a Projected Augmented Reality Environment.
T. Pejsa, H. Benko, A. D. Wilson & Eyal Ofek, CSCW 2016 Best Paper Video

SnapToReality: Aligning Augmented Reality to the Real World.
B. Nuernberger, Eyal Ofek, H. Benko & A. D. Wilson. CHI 2016 Video

IllumiRoom: Immersive Experiences Beyond the TV Screen.
B. R. Jones, H. Benko, Eyal Ofek, A. D. Wilson Communications of the ACM, Vol. 58 No. 6, 2015 Video, Popular Science

Spatial Constancy of Surface-Embedded Layouts across Multiple Environments, B. Ens, E. Ofek, N. Bruce & P. Irani, SUI 2015 Video

SurroundWeb: Mitigating Privacy Concerns in a 3D Web Browser. J. Vilk, D. Molnar, Eyal Ofek, C. Rossbach, B. Livshits, A. Moshchuk, H. J. Wang & R. Gal, IEEE Security & Privacy 2015 Video, BBC (TV coverage)

RoomAlive: Magical Experiences Enabled by Scalable, Adaptive Projector-Camera Units.
B. Jones, R. Sodhi, M. Murdock, R. Mehra, H. Benko, A. Wilson, Eyal Ofek, B. MacIntyre, N. Raghuvanshi & L. Shapira, UIST 2014 Video, RoomAlive: The Other Resident Video, The Washington Post, BBC

FLARE: Fast Layout for Augmented Reality Applications. R. Gal, L. Shapira, Eyal Ofek & P. Kohli, IEEE ISMAR 2014 Video. Technology incorporated into Unity MARS and HoloLens.

IllumiRoom: Peripheral Projected Illusions for Interactive Experiences. B. Jones, H. Benko, Eyal Ofek & A. Wilson, CHI 2013 Best Paper, Golden Mouse Award (Best Video Show). Video, Xbox Video, NBC
