ARBA: Augmented Reality Bay Area meetup

On Jan 24, 2017, we joined the first Augmented Reality meetup hosted at the Runway Incubator in San Francisco. This was also my first time attending the meetup.

The company above basically builds a Kinect-like depth sensor that works with a VR headset to provide body tracking.

Here is another interesting device: a company named Ultrahaptics builds an ultrasound emitter matrix (under the palm, where the green dot is), which produces points of air pressure driven by what the computer displays. In VR, this makes it possible to feel virtual objects. However, a flat panel only applies force from one direction; to address this, the emitter blocks can be assembled into a cube-shaped array. This could also be a good add-on for automobile control panels, where Ultrahaptics could provide the feel of real buttons so drivers don't need to look at the panel while driving.

Wearing a HoloLens in daily life really takes a strong mind! All right, on to the presentations. First, Occipital demonstrated a smart mixture of AR and VR. They built a headset for the iPhone that also includes a depth camera. In the demo, they showed how the depth camera can be used to scan a room and create a 3D mesh model of that space, with the texture coming from the iPhone's camera. Once the 3D mesh of the space is generated, the model is loaded into the VR headset so it becomes an AR-like environment. In my opinion, this is an offline real-world mapping trick that takes advantage of standard 3D reconstruction techniques; as a result, the system may not be ready to process real-time point cloud data.
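To make the offline idea concrete, here is a minimal sketch of such a scan-then-load pipeline using the open-source Open3D library, not Occipital's actual SDK. The frame source, camera poses, and intrinsics are placeholders I am assuming for illustration; the point is simply that the fusion happens offline before the mesh is handed to the VR scene.

```python
import numpy as np
import open3d as o3d

def load_scanned_frames():
    """Placeholder: should yield (color, depth, 4x4 camera-to-world pose) tuples
    from a recorded scan. Returns an empty list here so the sketch still runs."""
    return []

# Assumed intrinsics for a structured-light style depth sensor.
intrinsic = o3d.camera.PinholeCameraIntrinsic(
    o3d.camera.PinholeCameraIntrinsicParameters.PrimeSenseDefault)

# A truncated signed distance volume fuses many depth frames into one surface.
volume = o3d.pipelines.integration.ScalableTSDFVolume(
    voxel_length=0.01,  # 1 cm voxels
    sdf_trunc=0.04,
    color_type=o3d.pipelines.integration.TSDFVolumeColorType.RGB8)

for color, depth, pose in load_scanned_frames():
    rgbd = o3d.geometry.RGBDImage.create_from_color_and_depth(
        color, depth, depth_trunc=4.0, convert_rgb_to_intensity=False)
    # integrate() expects the world-to-camera extrinsic, hence the inverse.
    volume.integrate(rgbd, intrinsic, np.linalg.inv(pose))

# The textured mesh is what would be loaded into the VR scene as a static environment.
mesh = volume.extract_triangle_mesh()
mesh.compute_vertex_normals()
o3d.io.write_triangle_mesh("room_scan.ply", mesh)
```

Because everything above runs as a batch step after the scan, the headset only ever sees a finished mesh, which matches my impression that the system is not processing live point cloud data.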

 

The next company, Yowza, showed a new idea for converting our real living space into the digital world. In their approach, a raw mesh of the space is captured and uploaded to their cloud, where the point cloud is segmented and each piece is classified against complete furniture models in their dataset. The matched 3D models then replace the raw mesh, ideally producing a complete, clean 3D scene for a VR environment. 3D object recognition and 3D segmentation are hot topics at SIGGRAPH and CVPR, and this company's idea would be a very good feature for VR.
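As a rough illustration of this segment, classify, and replace idea (my own sketch, not Yowza's actual pipeline), one could cluster the scanned points, run each cluster through some recognition model, and swap in a catalog mesh scaled to fit. The classifier and catalog lookup below are hypothetical placeholders.

```python
import numpy as np
import open3d as o3d

def classify_cluster(points):
    """Hypothetical classifier: map a cluster of 3D points to a furniture label.
    In a real system this would be a trained 3D recognition model."""
    return "chair"  # placeholder

def load_catalog_model(label):
    """Hypothetical lookup of a complete, artist-made mesh for the label."""
    return o3d.geometry.TriangleMesh.create_box()  # placeholder geometry

def replace_furniture(pcd):
    # Group nearby points into object candidates with DBSCAN clustering.
    labels = np.array(pcd.cluster_dbscan(eps=0.05, min_points=50))
    scene = []
    for cluster_id in range(labels.max() + 1):  # label -1 is noise, skipped
        cluster = pcd.select_by_index(np.where(labels == cluster_id)[0])
        category = classify_cluster(np.asarray(cluster.points))
        model = load_catalog_model(category)
        # Crude uniform fit: scale and move the catalog model onto the cluster.
        bbox = cluster.get_axis_aligned_bounding_box()
        model.scale(max(bbox.get_extent()), center=model.get_center())
        model.translate(bbox.get_center() - model.get_center())
        scene.append(model)
    return scene
```

The hard part, of course, is the classifier and the quality of the catalog match, which is exactly why the 3D recognition and segmentation research they build on is so active.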

The last demo was a Tango-based one from Clever Robot Labs. It analyzes the 3D point cloud from a Tango phone to recognize the ceiling, floor, tables, beds, etc. in real time, and then replaces them with VR content dynamically. Interestingly, the algorithm uses the point cloud of the real table to scale the virtual table that replaces it. Please see the video for the result.
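Here is a minimal sketch of how one might detect those large horizontal surfaces with plain RANSAC plane fitting in Open3D. This is my guess at the general approach, not Clever Robot Labs' actual algorithm; the height thresholds and the assumption that y is "up" are mine.

```python
import numpy as np
import open3d as o3d

def label_horizontal_planes(pcd, num_planes=4):
    """Fit dominant planes with RANSAC and guess floor/ceiling/table by height."""
    rest = pcd
    labeled = []
    for _ in range(num_planes):
        if len(rest.points) < 500:
            break
        # plane_model = [a, b, c, d] for the plane ax + by + cz + d = 0
        plane_model, inliers = rest.segment_plane(
            distance_threshold=0.02, ransac_n=3, num_iterations=1000)
        plane = rest.select_by_index(inliers)
        rest = rest.select_by_index(inliers, invert=True)
        normal = np.array(plane_model[:3])
        if abs(normal[1]) < 0.9:   # assuming y is up: skip walls and other vertical planes
            continue
        height = plane.get_center()[1]
        if height < 0.2:
            label = "floor"
        elif height > 2.0:
            label = "ceiling"
        else:
            label = "table"        # mid-height horizontal surface
        # The plane's bounding box is what tells us how big to make the virtual stand-in.
        labeled.append((label, plane.get_axis_aligned_bounding_box()))
    return labeled
```

Reusing the fitted bounding box to size the virtual table is, I suspect, roughly the trick behind the scaling behavior shown in the demo.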

After that, some new members introduced themselves and shared some job openings. It was a very nice experience, with a strong focus on the technical side. I am also glad that our CEO An Li had a good conversation with Ori Inbar. I hope we can join more AR meetups this year!