Hands-on with Canon's Mixed Reality Project MReal

Today at Stanford I tried MReal, an augmented-reality device built by camera giant Canon. It is a wired head-mounted display with three cameras plus a companion decoding unit. Combined with Canon's purpose-built markers, the system can render virtual 3D objects registered in physical space. Unlike the AR devices from most other vendors today, Canon's design is video see-through: the imagery the user sees is actually captured by two cameras positioned at the eyes. A third, simpler camera sits on top of the helmet and points downward when the head is level, tracking a marker map laid out on the floor.

MReal HMD set.

The device runs at 30 fps with a field of view of roughly 110°. As an AR headset aimed at the industrial design world (it seems "Mixed Reality" is just the consumer-friendly name for the more obscure term "Augmented Reality"), the HMD itself is essentially a mount for three cameras; the video streams are aggregated by a decoder-like box and then forwarded to a host PC for processing, which makes the whole rig look rather bulky. Unlike MR rigs hacked together from VR headsets, however, MReal faithfully reproduces real-world depth in front of the eyes, so objects seen through the device appear exactly the same size as they do to the naked eye, which is a nice touch.

The companion processing units.

Because the floor is covered with markers, the system needs no external cameras to track the headset's position in space, and the user interacts with the MR environment through a wand covered in markers as well, so essentially all tracking is image-based. Notably, to support more natural interaction, the system offers a color-picking feature: the user defines a color range, say skin tone, and the hand can then be segmented out of the 2D image. According to the staff, the disparity between the two cameras' images allows the 3D scene to be reconstructed, so the hand can then interact with the virtually rendered objects.
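The color-picking idea above amounts to a per-pixel color-range test. The sketch below is my own minimal illustration of that technique, not Canon's actual implementation; the pixel values and skin-tone range are made up for the example.

```python
# Minimal sketch of color-range ("chroma key"-style) hand segmentation:
# every pixel whose RGB channels all fall inside a user-picked range is
# marked as "hand" (1); everything else is background (0).

def in_range(pixel, lo, hi):
    """True if every channel of an (R, G, B) pixel lies within [lo, hi]."""
    return all(l <= c <= h for c, l, h in zip(pixel, lo, hi))

def segment_hand(image, lo, hi):
    """Return a binary mask (1 = hand) for a 2D list of RGB pixels."""
    return [[1 if in_range(px, lo, hi) else 0 for px in row] for row in image]

# Tiny 2x2 "frame": two skin-toned pixels among background pixels.
frame = [
    [(210, 160, 130), (20, 20, 20)],
    [(0, 120, 255), (205, 155, 125)],
]
skin_lo, skin_hi = (180, 120, 100), (255, 200, 170)

mask = segment_hand(frame, skin_lo, skin_hi)
print(mask)  # [[1, 0], [0, 1]]
```

In a real pipeline this test would typically run in a color space less sensitive to lighting (e.g. HSV or YCbCr), and the resulting mask would be used to composite the live hand pixels over the rendered model.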

Everyone couldn't resist reaching out with their hands~

The main application is industrial design. In a traditional industrial design workflow, building physical prototype models costs around a million dollars on average, and at the final sign-off stage a hand-built physical model is still required for discussion. With a device like this, many designs can instead be displayed at true scale in an augmented-reality space. Because each headset wearer is tracked, people thousands of miles apart can also discuss the same product model: when one user points the handheld tracker at the model, the user on the other side sees a virtually rendered finger pointing at that spot and can walk over to inspect it. Presenting designs virtually this way can greatly reduce the number of mold-making iterations and thus cut design costs, and since the model is shown at full scale in 3D space, designers get a more intuitive feel for the product's appearance. As a video see-through device, it also avoids the translucency of optical see-through displays: rendered objects have solid colors, which suits industrial design purposes. Finally, professional training no longer requires the actual machines; trainees can quickly practice on multiple devices in the virtual environment, which is also helpful for training, say, car-maintenance staff across a manufacturer's entire production line.

On the education side, it could also be applied to medical teaching, where the relevant information can be presented more conveniently. But given that the device costs around 100,000 (US dollars, presumably), I believe industrial use remains far more likely.

Canon's MReal has been out for about three years. It is not aimed at ordinary consumers; an SDK is provided but it is not open source. Still, as a video see-through device that solves the translucency problem of optical see-through displays, it is well worth studying.

IMG_20160413_170532 IMG_20160413_164129

The role the product plays in an industrial production pipeline.

SIGGRAPH 2015 Day 3

Sorry for the late post, but I will definitely finish the five-day series of this journey :)

In the morning I attended the studio course, which covered general-purpose GPU computing on mobile phones. Personally I felt the presentation was not that clear on the technical details, but it definitely covered a lot about what GPGPU technology looks like on mobile devices. The picture below compares the current computing interfaces. I suppose that for heavy computation that does not need to be sent to a server (like real-time tracking on a mobile system), utilizing the GPU should be a good option.

IMG_20150811_101317

Exhibition

PANO_20150811_105119
The main entrance of the exhibition hall. Besides showing GTC talks, Nvidia has a booth showing off its GPU technology for games and movie production. Intel, on the right side, presents its graphics technology for the CPU. Ironically, these pictures together show the current leaders of computer graphics.

Besides the academic exchange, the other important role of the exhibition is as a platform for industrial companies to show off their "coming soon" technologies. Below are some highlights.

IMG_20150811_125951
There are also multiple companies focusing on body-skeleton tracking using infrared camera arrays and reflective markers. Their pipelines also handle motion retargeting. These companies provide third-party solutions for movie and game studios.
IMG_20150811_130157
EPSON demonstrates its augmented-reality eyewear. The AR scene is displayed on a small section of the glasses, and you need to connect them to a dedicated controller (about the size of a phone) for touch-based navigation. The system still feels like a prototype, perhaps targeting certain professional users rather than everyone. And yes, we have the Pixar booth at the back!!!!!
IMG_20150811_130316
This French company shows a very smooth eye-tracking demo. Within a limited distance, the system can track eye-gaze movement and use it to control reading. It is very smooth and effective, and could definitely be used in hospitals for patients who cannot move.
IMG_20150811_104843
A mobile plug-in-style 3D scanning system that very efficiently creates a static 3D model of any object.
IMG_20150811_111337
IMG_20150811_111353 Qualcomm shows a lot of advanced computer vision technology this year. Here is a cloud-based object recognition service for mobile devices. It will ship with their new CPU, so it may take a while to arrive.
IMG_20150811_125723
IMG_20150811_124822 IMG_20150811_125109 USense is a startup targeting VR/AR experiences. Applause for a Chinese startup in the US! They propose two systems: a mobile one, which has two cameras inside the headset to help recognize hand gestures, and a wired version with two cameras to sense the 3D world plus, I think, a Leap Motion sensor for hand gestures.
IMG_20150811_122012
I met Evan at SIGGRAPH! So surprising to run into the guy I had emailed with before about the DI4D system. Their data quality is good, and the new head-mounted capture system is wireless and lightweight. So great to meet him.


Advanced VR Develop Experience Sharing

In the afternoon I attended the talk on VR applications and experience. Sony discussed lessons learned from their new headset: to guide the user in a fully 3D environment, using 3D sound to draw attention is very important, since it is super easy to lose the thread of the story in 3D video. Also, to enhance the VR experience, hand gestures, especially the grasp motion, need to be detected efficiently.

Another talk covered using VR for immersive journalism: creating a new medium that brings the audience to the scene of a crime or a war zone to experience the negative emotions firsthand. For the effect, please refer to the Buzzfeed video below. The last talk was from a startup building a drone that can shoot 360-degree videos.

IMG_20150811_142248 IMG_20150811_145159 IMG_20150811_150408

A secret everyone knows

I feel lucky that I caught this chance. Every SIGGRAPH, Pixar gives away 1,500 RenderMan teapots (in the shape of the famous Utah teapot) to anyone: 500 per day over three days. The line is crazy, and the teapots are normally gone in less than 15 minutes. Each teapot comes in a metal box with a unique ID. OMG, this really will be one of the things that draws me back next year!

So if you want one, remember to check with the Pixar folks about the schedule once the exhibition opens. Oh, and Pixar also likes to host a RenderMan party during SIGGRAPH; it requires an invitation and fills up super fast, so ask the employees how to register for that too.

PhotoGrid_1439353921392