Haptics

Virtual reality (VR) systems can immerse users in wondrous virtual worlds, rich in sights and sounds. Yet how many times have you reached for an object in VR or augmented reality (AR) and, just as you were about to grab it, felt the jarring sensory shock that the object did not exist in the real world?

Prof. Ofek is part of a team of researchers at the University of Birmingham's VRlab who are tackling one of virtual reality's most demanding frontiers: delivering truly immersive and convincing tactile experiences. Their mission is to make virtual objects not only visible and audible, but touchable.

Compared to the maturity of visual and auditory rendering, today’s haptic feedback remains rudimentary. Consumer devices mostly rely on simple vibrations—small motors or actuators that can only hint at the complexity of real touch. Researchers are working to move beyond these “buzz” sensations toward realistic rendering of textures, resistance, and contact forces. Consumers, meanwhile, wait eagerly for these sensations to become part of their everyday virtual experiences.

The difficulty lies in the nature of touch itself. As Prof. Ofek notes, “It’s easy to trick the eye or the ear—film, for example, creates motion at just 24 frames per second—but touch operates on a different level of complexity.” The sense of touch involves high-resolution spatial and temporal signals across skin, muscles, and joints—making haptic rendering a challenge several orders of magnitude harder than visual or auditory simulation.

The VRlab team—including Dr. Massimiliano di Luca, Prof. Eyal Ofek, Dr. Diar Abdulkarim, Dr. Daniele Giunchi, Yilong Lin, and others—is exploring how existing technologies can be combined and refined to create versatile haptic systems. Their work focuses on enabling users to touch and grasp virtual objects, feel surfaces slide beneath their fingertips, and experience a spectrum of tactile sensations through handheld or wearable devices.

My vision: a future where people can interact with digital worlds as naturally as with the physical one—where the virtual becomes truly tangible.

Some of my current publications include:

Big or Small, It’s All in Your Head: Visuo-Haptic Illusion of Size-Change Using Finger-Repositioning,

M. J. Kim, Eyal Ofek, M. Pahud, M. Sinclair, and A. Bianchi.

Best Paper Honorable Mention at CHI 2024, the major international Human-Computer Interaction conference. In this work we investigated modulating the perceived size of handheld objects. We developed a fixed-size VR controller that leverages finger repositioning to create a visuo-haptic illusion of dynamic size change in handheld virtual objects.

ArmDeformation: Inducing the Sensation of Arm Deformation in Virtual Reality Using Skin-Stretching,

Yilong Lin, P. Zhang, Eyal Ofek, and S. Je, CHI 2024

We introduce ArmDeformation, a wearable device employing skin-stretching to enhance virtual forearm ownership during the arm-deformation illusion. We explored the maximum visual threshold for forearm bending and the minimum detectable bending-direction angle when using skin-stretching in VR. Our study demonstrates that using ArmDeformation in VR applications enhances user realism and enjoyment compared to relying on visual feedback alone.

Enhanced efficiency in visually guided online motor control for actions redirected towards the body midline.

A. Maselli, Eyal Ofek, B. Cohn, K. Henkley, and M. Gonzalez-Franco,
Philosophical Transactions of the Royal Society B, Vol. 378, Issue 1869, 2024

Reaching for objects in a dynamic environment requires fast online corrections that compensate for sudden object shifts or postural changes. We investigate how sensorimotor asymmetries associated with space perception, brain lateralization, and biomechanical constraints affect the efficiency of online corrections.

More of my haptic research:


Haptics in the Metaverse: Haptic Feedback for Virtual, Augmented, Mixed, and eXtended Realities.
C. Pacchierotti, F. Chinello, K. Koumaditis, M. Di Luca, Eyal Ofek, and O. Georgiou,
Eds., Special Issue of IEEE Transactions on Haptics, 2024


A mechatronic shape display based on auxetic materials.
A. Steed, Eyal Ofek, M. Sinclair, and M. Gonzalez-Franco, Nature Communications 12, 4758 (2021).
Nature Device & Materials Blog, Video

HapticBots: Distributed Encountered-type Haptics for VR with Multiple Shape-changing Mobile Robots.
R. Suzuki, Eyal Ofek, M. Sinclair, D. Leithinger, and M. Gonzalez-Franco, UIST 2021. Video

Haptic PIVOT: On-Demand Handhelds in VR.
R. Kovacs, Eyal Ofek, M. Gonzalez-Franco, S. Marwecki, C. Holz, and M. Sinclair, UIST 2020. Video

X-Rings: A Hand-mounted 360 Degree Shape Display for Grasping in Virtual Reality.
E. J. Gonzalez, Eyal Ofek, M. Gonzalez-Franco, and M. Sinclair, UIST 2021. Best Paper: Honorable Mention. Video

Asymmetry of Grasp in Haptic Perception.
M. Gonzalez-Franco, M. Sinclair, and Eyal Ofek, ACM Symposium on Applied Perception 2020 (SAP ’20). Project, Talk

Virtual Reality Without Vision: A Haptic and Auditory White Cane to Navigate Complex Virtual Worlds.
A. F. Siu, M. Sinclair, R. Kovacs, C. Holz, Eyal Ofek, and E. Cutrell, CHI 2020. Honorable Mention Paper. Video, Project, MSR Blog

CapstanCrunch: A Haptic VR Controller with User-supplied Force Feedback.
M. Sinclair, Eyal Ofek, M. Gonzalez-Franco, and C. Holz, UIST 2019. Best Technical Demo: Honorable Mention. Video, MSR Blog

TORC: A Virtual Reality Controller for In-Hand High-Dexterity Finger Interaction.
J. Lee (KAIST & MSR), M. Sinclair, M. Gonzalez-Franco, Eyal Ofek, and C. Holz, CHI 2019. Video, MSR Blog

The Uncanny Valley of Haptics.
C. C. Berger, M. Gonzalez-Franco, Eyal Ofek, and K. Hinckley, Science Robotics, Vol. 3, Issue 17, 18 Apr 2018. Video, Microsoft Research Blog

CLAW: A Multifunctional Handheld Haptic Controller for Grasping, Touching, and Triggering in Virtual Reality.
I. Choi, Eyal Ofek, H. Benko, M. Sinclair, and C. Holz, CHI 2018. Video, MSR Blog

Haptic Links: Bimanual Haptics for Virtual Reality Using Variable Stiffness Actuation.
E. Strasnick, C. Holz, Eyal Ofek, M. Sinclair, and H. Benko, CHI 2018. Video, MSR Blog

Haptic Revolver: Touch, Shear, Texture, and Shape Rendering on a Reconfigurable Virtual Reality Controller.
E. Whitmire (UW), H. Benko, C. Holz, Eyal Ofek, and M. Sinclair, CHI 2018. Best Paper: Honorable Mention. Video, MSR Blog

Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop.
L. P. Cheng (HPI), Eyal Ofek, C. Holz, H. Benko, and A. D. Wilson, CHI 2017

NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers.
H. Benko, C. Holz, M. Sinclair, and Eyal Ofek, UIST 2016. Best Paper: Honorable Mention. Video

Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality Experiences.
M. Azmandian, M. Hancock, H. Benko, Eyal Ofek, and A. D. Wilson, CHI 2016. Golden Mouse Award (Best Video)