Home


Selected past Industrial projects

 

[Image: PhotonPaint.jpg]

Photon Paint I (1987) & II (1988)

Photon Paint was an image manipulation and drawing application, first released in 1987 for the Commodore Amiga, followed by Photon Paint II in 1988 and later by Photon Paint for the Macintosh.

Developed by Bazbosoft (Oren Peli, Eyal Ofek & Amir Zbeda) and distributed by MicroIllusions, it was bought by about a third of Amiga owners.

 

[Images: MiniZ.jpg, ZCam.jpg]

ZCam – 1999

ZCam was the first of a line of time-of-flight camera products for video applications by Israeli developer 3DV Systems. The ZCam supplements full-color video imaging with real-time range information, allowing for the capture of video in 3D. The company was bought by Microsoft, and the technology was incorporated in HoloLens.

I was in charge of all software and algorithms for the camera and its applications from its start in 1996 until 2004.
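To illustrate the basic principle (a simplified sketch, not the actual ZCam pipeline or its gated-shutter implementation), a time-of-flight camera recovers per-pixel depth from the round-trip travel time of emitted light; all values below are hypothetical.

import numpy as np

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_round_trip(round_trip_seconds: np.ndarray) -> np.ndarray:
    """Convert per-pixel round-trip times of an emitted light pulse into
    distances: the light travels to the scene and back, so the distance
    to the surface is half the total path length."""
    return 0.5 * SPEED_OF_LIGHT * round_trip_seconds

# Example: a 2x2 'image' of round-trip times around 20 nanoseconds
# corresponds to surfaces roughly 3 meters away.
times = np.array([[20e-9, 21e-9], [19e-9, 22e-9]])
print(depth_from_round_trip(times))  # values near 3.0 m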

Video  Video2 Video3

Awards:

NAB 1999 - Best of show.

Videography 1999 - Editor's Choice

Television Broadcast 1999 - Editor's Pick

Advanced Imaging - Solution of the Year 1999.

3DV Systems

JVC

Wikipedia

WideScreen Review

[Image: StreetSide.jpg]

Local.live Street Side Technical Preview – Feb. 2006

Four months after moving from Microsoft Research to Virtual Earth, we released a technical preview that showed Street Side imagery in the centers of Seattle and San Francisco. The preview allowed the user to ‘drive’ a car along the streets and view images of the surroundings. This was the first online service for an immersive 360° street-level experience. The entire project was developed by me, with the help of B. Snow (web page programming) and R. Welsh (graphic design).


 

Automatic geopositioning of Flickr™ images in Bing maps – Feb. 2010

M. Kroepfl and I worked on matching user images (both those with an initial guess for their location and general ones) to Bing Maps street images. The result shipped as a Bing Maps application in February 2010 and was shown at TED 2010.
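A minimal sketch of the general idea (not the shipped Bing Maps pipeline): match local image features between a user photo and candidate street images near the location guess, and keep the candidate with the most consistent matches. File names and coordinates below are hypothetical placeholders.

import cv2

def match_score(query_path: str, candidate_path: str) -> int:
    """Count ORB feature matches that pass a ratio test between
    a user photo and one candidate street-side image."""
    orb = cv2.ORB_create(nfeatures=2000)
    img1 = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(candidate_path, cv2.IMREAD_GRAYSCALE)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des1, des2, k=2)
    # Lowe's ratio test keeps only distinctive matches.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return len(good)

# Geoposition the photo by the best-matching candidate near the guess.
candidates = {"cand_a.jpg": (47.6097, -122.3331), "cand_b.jpg": (47.6205, -122.3493)}
best = max(candidates, key=lambda p: match_score("user_photo.jpg", p))
print("estimated location:", candidates[best])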

Video

TED talk by Blaise Aguera y Arcas.

Geek in Disguise

Search Engine Land

Tech Flash

Bing Blog

Flickr


Semantically tagging the maps with crowd data – Apr. 2010

I worked on attaching semantic tags to maps based on labels such as those found in Flickr™ images. A service based on this work was demonstrated at Where 2.0 2011 and is accessible on the web.
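A toy sketch of the underlying idea (the shipped service is considerably more involved): aggregate the tags of geotagged photos into map grid cells, so that the dominant tags of each cell become candidate semantic labels for that area. The photo data below is hypothetical.

from collections import Counter, defaultdict

# Hypothetical geotagged photo labels: (latitude, longitude, tag)
photos = [
    (47.6205, -122.3493, "space needle"),
    (47.6206, -122.3490, "space needle"),
    (47.6097, -122.3331, "pike place market"),
]

def cell(lat: float, lon: float, cell_deg: float = 0.005):
    """Snap a coordinate to a coarse grid cell (roughly 500 m across)."""
    return (round(lat / cell_deg), round(lon / cell_deg))

tags_per_cell = defaultdict(Counter)
for lat, lon, tag in photos:
    tags_per_cell[cell(lat, lon)][tag] += 1

# The dominant tag in each cell is a candidate label for that map area.
for c, counts in tags_per_cell.items():
    print(c, counts.most_common(1)[0])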

[Image: Mouse.jpg]

Microsoft Touch Mouse – Aug. 2011

Although I was not involved in the development of the product, it is based on work started by Hrvoje Benko and myself, which was later incorporated in our UIST paper and in our patent.

IllumiRoom: Peripheral Projected Illusions for Interactive Experiences

IllumiRoom – Jan 2013

IllumiRoom is a proof-of-concept system from Microsoft Research. It augments the area surrounding a television screen with projected visualizations to enhance the traditional living room entertainment experience.

Video     Video Ad

 

Flare – Sep 2014

Flare is a rule-based system for generating object layouts for AR applications.
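As a hedged illustration of a rule-based layout approach (not Flare's actual rule language or solver), the sketch below scores candidate 2D placements against simple declarative rules and keeps the best-scoring one; the rules and room extents are hypothetical.

import random

# Each rule maps a candidate position to a penalty; lower total is better.
def near(target, weight=1.0):
    return lambda p: weight * ((p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2)

def inside(xmin, xmax, ymin, ymax, penalty=1e6):
    return lambda p: 0.0 if (xmin <= p[0] <= xmax and ymin <= p[1] <= ymax) else penalty

def layout(rules, trials=1000):
    """Sample candidate positions and return the one that best satisfies the rules."""
    best, best_cost = None, float("inf")
    for _ in range(trials):
        p = (random.uniform(0, 5), random.uniform(0, 3))  # room extents in meters
        cost = sum(rule(p) for rule in rules)
        if cost < best_cost:
            best, best_cost = p, cost
    return best

# Place a virtual label near the sofa at (2.0, 1.0), but inside the visible wall area.
rules = [near((2.0, 1.0)), inside(0.5, 4.5, 0.2, 2.8)]
print(layout(rules))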

Video

 

 

RoomAlive - 2014

RoomAlive is a proof-of-concept prototype that envisions a future of interactive gaming with projection mapping. RoomAlive transforms any room into an immersive, augmented entertainment experience through the use of video projectors. Users can touch, shoot, stomp, dodge and steer projected content that seamlessly co-exists with their existing physical environment. RoomAlive builds heavily on our previous research project, IllumiRoom, which explored interactive projection mapping surrounding a television screen. IllumiRoom was largely focused on display, extending traditional gaming experiences out of the TV. RoomAlive instead focuses on interaction, and the new kinds of games that we can create with interactive projection mapping. RoomAlive looks farther into the future of projection mapping and asks what new experiences we will have in the next few years.

 Description at Projection Mapping.

 Video

RoomAlive Scene

RoomAlive Toolkit (2014)

An open source SDK that enables developers to calibrate a network of multiple Kinect sensors and video projectors. The toolkit also provides a simple projection mapping sample that can be used as a basis to develop new immersive augmented reality experiences similar to those of the IllumiRoom and RoomAlive research projects.
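A hedged sketch of the projection-mapping step that such a calibration enables (illustrative only, not the toolkit's C# API): once a projector's pose and intrinsics are known in room coordinates, any 3D surface point observed by a Kinect can be mapped to the projector pixel that illuminates it. The calibration values below are hypothetical placeholders.

import numpy as np

def project_to_projector(point_room, R, t, K):
    """Map a 3D point in room coordinates to projector pixel coordinates.

    R, t : projector extrinsics (room -> projector coordinates)
    K    : 3x3 projector intrinsic matrix (focal lengths and principal point)
    """
    p_proj = R @ point_room + t          # transform into the projector's frame
    uvw = K @ p_proj                     # apply the pinhole projection
    return uvw[:2] / uvw[2]              # perspective divide -> pixel (u, v)

# Hypothetical calibration values for one projector.
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 0.0])

# A surface point 2 m in front of the projector, 0.5 m to the right.
print(project_to_projector(np.array([0.5, 0.0, 2.0]), R, t, K))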

The RoomAlive Toolkit is provided as open source under the MIT License.

The code is available for download at GitHub: https://github.com/Kinect/RoomAliveToolkit.

B. Lower and A. Wilson gave a talk on the RoomAlive Toolkit for BUILD 2015. You can view it at http://channel9.msdn.com/Events/Build/2015/3-87.

 

Haptic Controllers (2017 - current)

Virtual Reality (VR) and Augmented Reality (AR) have progressed dramatically in the past 30 years. Today, we can wear a consumer head-mounted display and experience fantastic worlds, populated with rich geometry and beautifully, realistically rendered virtual objects. 3D audio plays sounds in our ears as if they were generated by virtual sources in space, and adapts as we move around. However, whenever we reach out a hand to touch a virtual object, the illusion breaks: it is only a mirage, and our hand ends up touching or grasping air.


Compared to the visual and audio rendering capabilities of consumer devices, their tactile offering is mostly limited to a simple buzz – a vibration generated by an internal motor or actuator buried inside the controllers. Although many research works have aimed at rendering different tactile sensations, they have not managed to reach consumers. The reasons are many: laboratory prototypes such as exoskeletons and other hand-mounted devices may require a cumbersome procedure to fit to users, put on, or take off. Many prototype devices can simulate only a specific sensation, such as texture, heat, or weight, which may not be general enough to attract users. And complex mechanics involving a multitude of motors may render a device too expensive, too big, or too fragile to be a consumer product.


We have been exploring a number of ways in which technology can generate a wide range of haptic sensations from within handheld Virtual Reality controllers, not unlike the ones currently used by consumers: enabling users to touch and grasp virtual objects, feel their fingertips slide over object surfaces, and more. The ultimate goal is to allow users to interact with the virtual digital world in more natural ways than ever before.

 

SeeingVR toolkit (2019)

Current virtual reality applications do not support people who have low vision, i.e., vision loss that falls short of complete blindness but is not correctable by glasses. We present SeeingVR, a set of 14 tools that enhance a VR application for people with low vision by providing visual and audio augmentations.
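For illustration only (SeeingVR itself is a set of VR tools, not this code), one representative visual augmentation is an edge-enhancement overlay that makes object boundaries stand out; below is a hedged sketch of that idea applied to a single rendered frame, with a placeholder file path.

import cv2
import numpy as np

def edge_overlay(frame_bgr: np.ndarray, strength: float = 0.8) -> np.ndarray:
    """Overlay high-contrast edges on a rendered frame so that object
    boundaries stand out for viewers with low vision."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)
    overlay = frame_bgr.copy()
    overlay[edges > 0] = (0, 255, 255)   # draw edges in bright yellow
    return cv2.addWeighted(overlay, strength, frame_bgr, 1.0 - strength, 0)

# Example: process one captured frame (path is a placeholder).
frame = cv2.imread("vr_frame.png")
if frame is not None:
    cv2.imwrite("vr_frame_enhanced.png", edge_overlay(frame))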

The code is available for download at GitHub.

Related Publication.

Video

Virtual Reality & Augmented Reality in the wild (2014-Current)

New inside-out tracking HMDs allow users to wander through large environments using continuous ‘inside-out‘ optical tracking, opening the opportunity for applications to spread over large spaces and time intervals. For example, a user may play a multi-player game across multiple rooms or outdoors, or a group of workers may wander through a large space and share the same content. We explore the technologies required to enable this future, as well as some vertical example applications.

Related publications

SurroundWeb: Mitigating Privacy Concerns in a 3D Web Browser. IEEE Security & Privacy 2015

Spatial Constancy of Surface-Embedded Layouts across Multiple Environments 2015

VRoamer: Generating On-The-Fly VR Experiences While Walking inside Large, Unknown Real-World Building Environments. IEEE VR 2019

DreamWalker: Substituting Real-World Walking Experiences with a Virtual Reality.  UIST 2019