Hasso Plattner Institute, Potsdam, Germany
Microsoft Research, Redmond
"Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop" [pdf]
CHI 2017, Denver, Colorado
We propose a class of passive haptics that we call Sparse Haptic Proxy (SHP): a set of geometric primitives that simulate touch feedback in elaborate virtual reality scenes. Unlike previous passive haptics that replicate the virtual environment in physical space, a Sparse Haptic Proxy simulates a scene’s detailed geometry by redirecting the user’s hand to a matching primitive of the proxy. To bridge the divergence of the scene from the proxy, we augment an existing Haptic Retargeting technique with an on-the-fly target remapping: We predict users’ intentions during interaction in the virtual space by analyzing their gaze and hand motions, and consequently redirect their hand to a matching part of the proxy. We conducted three user studies on our haptic retargeting technique and implemented a system from the three main results: 1) The maximum angle participants found acceptable for retargeting their hand is 40°, rated 4.6 out of 5 on average. 2) Tracking participants’ eye gaze reliably predicts their touch intentions (97.5%), even while simultaneously manipulating the user’s hand-eye coordination for retargeting. 3) Participants preferred minimized retargeting distances over better-matching surfaces of our Sparse Haptic Proxy when receiving haptic feedback for single-finger touch input. We demonstrate our system with two virtual scenes: a flight cockpit and a room quest game. While their scene geometries differ substantially, both use the same sparse haptic proxy to provide haptic feedback to the user during task completion.
A virtual reality user may find themselves in different virtual worlds, such as a spy game (middle) or a space simulator (right), yet all of them provide tangible feedback using the same physical geometry in the real world (left).
Since the beginning of Virtual Reality (VR) development, the goal has been to immerse users in a virtual world that feels as real to them as reality itself. While the capabilities of computer graphics, VR display headsets, and 3D audio rendering have progressed significantly over the last decades, our ability to simulate the physicality of the virtual world is still in its infancy.
To help crack this problem, a team of Microsoft Research researchers – Christian Holz, Hrvoje Benko, Andy Wilson, and myself, along with Lung-Pan Cheng, a visiting intern from HPI, Germany – has proposed a new solution that lets users feel their environment while in VR, in a paper published at CHI 2017.
Whenever a user reaches out to touch a virtual object, she expects to feel a physical object. However, to achieve this naïvely, the physical room would have to be remodeled to match each virtual world the user experiences. To solve this problem, the proposed system tricks the senses while in the virtual world by redirecting the user's hands: whenever users see themselves touching a virtual object, their hands are actually touching physical geometry, which may lie in a different location in space.
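The core of this hand redirection is the body-warping idea from the Haptic Retargeting work the paper builds on: as the real hand travels toward the physical prop, the rendered hand is gradually offset so it arrives at the virtual object exactly when the real hand reaches the prop. The following is a minimal sketch of that blend under assumed conventions (3D points as lists, a linear blend by travel progress); the function name and signature are illustrative, not the paper's actual implementation.

```python
import math

def _dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def warped_hand_position(hand, start, physical_target, virtual_target):
    """Body-warping redirection sketch: blend in the offset between the
    physical prop and the virtual object in proportion to how far the
    real hand has traveled from its start position toward the prop.
    At the start the rendered hand matches the real hand; when the real
    hand touches the prop, the rendered hand touches the virtual object."""
    total = _dist(start, physical_target)
    if total == 0:
        progress = 1.0
    else:
        progress = 1.0 - _dist(hand, physical_target) / total
    progress = max(0.0, min(1.0, progress))  # clamp to [0, 1]
    # Offset from physical prop to virtual object, scaled by progress.
    offset = [(v - p) * progress
              for v, p in zip(virtual_target, physical_target)]
    return [h + o for h, o in zip(hand, offset)]
```

Because the offset is introduced incrementally over the whole reach, the manipulation of hand-eye coordination stays below the user's perception threshold, which is what the paper's 40° acceptability study quantifies.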
The user's operations in the virtual world – touching a log, picking up a key, rolling a dial – versus the user's touches in the real world.
By using a simple fixed geometry in the form of a hemisphere surrounding the user, the researchers were able to simulate the physical feedback of scenes with very different geometries. Such a technology has the potential to make virtual worlds more tangible with a cheap and safe solution. The team put the technique through its paces, examining parameters such as the maximum allowed divergence between the visualized world and its approximated physical geometry before users become uneasy.
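For redirection to work, the system must decide early which virtual object the user intends to touch; the paper reports that eye gaze predicts this reliably (97.5%). A minimal sketch of one plausible gaze-based predictor, under assumptions of my own (targets as center points, a unit-length gaze direction, and selection by smallest angular distance to the gaze ray – not the paper's actual classifier), might look like this:

```python
import math

def predict_touch_target(gaze_origin, gaze_dir, candidates):
    """Pick the candidate target whose center best aligns with the
    gaze ray, i.e. the smallest angle between the (unit-length) gaze
    direction and the direction from the eye to the target center."""
    def angle_to(center):
        to_c = [ci - oi for ci, oi in zip(center, gaze_origin)]
        norm = math.sqrt(sum(x * x for x in to_c)) or 1e-9
        cos_a = sum(d * t for d, t in zip(gaze_dir, to_c)) / norm
        # Clamp against floating-point drift before acos.
        return math.acos(max(-1.0, min(1.0, cos_a)))
    return min(candidates, key=angle_to)
```

Once a target is predicted, the system remaps it on the fly to the best-matching primitive of the hemispherical proxy and begins redirecting the hand toward it.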