Asia Pacific Workshop on Mixed and Augmented Reality

Keynote Speeches

Day 2: 4/14, Saturday

Keynote: Prof. Ming Ouhyoung
Host: Chu-Song Chen

Title: Four Major Technical Problems Encountered in VR/AR/MR, and Two Case Studies to Deal With Them


In this talk, I will explore the four major technical problems to be solved in VR/AR/MR: (i) display resolution high enough that individual pixels are not visible, (ii) latency (from motion to photons) low enough to avoid dizziness, (iii) tracking precise enough that manipulation and registration are satisfactory, and (iv) the vergence-accommodation conflict inherent to all stereoscopic displays. I will also survey the current state-of-the-art technologies and use two case studies to show potential solutions to these problems. The two case studies are:
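The latency problem above is often framed as a motion-to-photon budget: the serial pipeline stages from sensing to display must sum to less than a comfort threshold. The sketch below is purely illustrative (it is not from the talk); the stage names, millisecond values, and the roughly 20 ms budget are hypothetical examples, and reported comfort thresholds vary across studies and displays.

```python
# Illustrative motion-to-photon latency budget.
# Stage names and millisecond values are hypothetical examples.
PIPELINE_MS = {
    "imu_sampling": 1.0,      # time until the motion sample exists
    "sensor_fusion": 1.5,     # pose estimation from raw sensors
    "render": 8.0,            # GPU frame rendering
    "scanout_display": 8.0,   # scanout and pixel switching
}

def motion_to_photon_ms(stages):
    """Total latency is the sum of the serial pipeline stages."""
    return sum(stages.values())

def within_budget(stages, budget_ms=20.0):
    """Compare against a commonly cited ~20 ms comfort budget
    (the exact threshold is display- and study-dependent)."""
    return motion_to_photon_ms(stages) <= budget_ms

total = motion_to_photon_ms(PIPELINE_MS)
print(f"{total:.1f} ms, within budget: {within_budget(PIPELINE_MS)}")
```

Techniques such as late-stage reprojection effectively shorten this chain by re-warping the rendered frame with the freshest pose just before scanout.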

1. CatAR: A Novel Stereoscopic Augmented Reality Cataract Surgery Training System
We propose CatAR, a novel stereoscopic augmented reality (AR) cataract surgery training system. It tracks instruments dexterously using a specially designed infrared optical system with two cameras and one reflective marker. The tracking accuracy at the instrument tip is 20 μm, far finer than that of previous simulators. Moreover, our system allows trainees to use, and to see, real surgical instruments while practicing. Five training modules with 31 parameters were designed, and 28 participants were enrolled in efficacy and validity tests. The results revealed significant differences between novice and experienced surgeons, and improvements in surgical skills after practicing with CatAR were also significant.
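The abstract does not state CatAR's tracking algorithm, but a textbook way to locate one marker seen by two calibrated cameras is triangulation by the midpoint method: each camera contributes a 3D ray, and the marker position is estimated as the midpoint of the shortest segment joining the two rays. The sketch below assumes hypothetical camera origins and ray directions.

```python
# Two-view triangulation by the midpoint method (textbook technique,
# not necessarily what CatAR uses). Vectors are plain 3-tuples.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def triangulate_midpoint(o1, d1, o2, d2):
    """Find parameters s, t minimizing |o1 + s*d1 - (o2 + t*d2)|
    via the 2x2 normal equations, then return the segment midpoint."""
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    w = sub(o1, o2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, s))     # closest point on ray 1
    p2 = add(o2, scale(d2, t))     # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Hypothetical setup: cameras 20 cm apart, marker 30 cm in front.
tip = triangulate_midpoint((-0.1, 0, 0), (0.1, 0, 0.3),
                           (0.1, 0, 0), (-0.1, 0, 0.3))
```

With noisy real-world rays the two closest points no longer coincide, and the length of the connecting segment gives a useful per-sample estimate of tracking error.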

2. Stereoscopic 360° Imaging
This report proposes a low-cost, portable polycamera system and accompanying methods for capturing and synthesizing stereoscopic 360° panoramas. The polycamera consists of only four cameras with fisheye lenses. Synthesizing panoramas from only four views is challenging because the cameras' viewpoints differ widely and the captured images suffer significant distortion and color degradation, including vignetting, contrast loss, and blurriness. To cope with these challenges, this paper proposes methods for rectifying the polyview images, estimating the depth of the scene, and synthesizing stereoscopic panoramas. The proposed camera is compact, lightweight, and inexpensive, and the proposed methods allow the synthesis of visually pleasing stereoscopic 360° panoramas from the images it captures.
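Rectifying fisheye images typically starts by back-projecting each pixel to a viewing ray. The sketch below uses the common equidistant fisheye idealization r = f·θ; the actual lens model and calibration used in the system are not specified in the abstract, and the principal point and focal length here are assumed parameters.

```python
import math

# Back-project a fisheye pixel to a unit viewing ray under the
# equidistant model r = f * theta (a common fisheye idealization;
# not necessarily the model used by the proposed polycamera).
# cx, cy: principal point in pixels; f: focal length in pixels/radian.
def fisheye_pixel_to_ray(u, v, cx, cy, f):
    dx, dy = u - cx, v - cy
    r = math.hypot(dx, dy)             # radial distance from center
    theta = r / f                      # angle from the optical axis
    phi = math.atan2(dy, dx)           # azimuth around the axis
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

Once every pixel maps to a ray, rectification becomes a resampling step: rays are re-projected onto whatever target surface (e.g. an equirectangular sphere) the panorama synthesis uses.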


Ming Ouhyoung received the BS and MS degrees in electrical engineering from National Taiwan University, Taipei, in 1981 and 1985, respectively, and the Ph.D. degree in computer science from the University of North Carolina at Chapel Hill in January 1990. He was a member of the technical staff at AT&T Bell Laboratories, Middletown, from 1990 to 1991. Since August 1991 he has been with the Department of Computer Science and Information Engineering, National Taiwan University, first as an associate professor and, since August 1995, as a professor. He was the Director of the Center of Excellence for Research in Computer Systems, College of Engineering, from August 1998 to July 2000, the Chairman of the Department of CSIE from August 2000 to July 2002, and the associate dean of the College of EECS from 2012 to 2015. He has published over 100 technical papers on computer graphics, virtual reality, and multimedia systems. He is a senior member of the ACM and a member of the IEEE.

Day 3: 4/15, Sunday

Keynote: Prof. Masahiko Inami
Host: Bing-Yu Chen

Title: JIZAI Body


Social revolutions have been accompanied by innovations in how we view the body. If we regard the information revolution as the establishment of a virtual society alongside the real one, it becomes necessary to design a new view of the body, the "JIZAI body" (freedomated body), which can adapt freely to changes in social structure. In this talk, we discuss the basic knowledge about body editing needed to construct the JIZAI body, based on VR, AR, and robotics.


Masahiko (Masa) Inami is a Professor in the Research Center for Advanced Science and Technology at the University of Tokyo, Japan. He also directs the Inami JIZAI Body Project, JST ERATO. His research interests are in the Augmented Human: human I/O enhancement technologies spanning perception, HCI, and robotics. He received the BE and MS degrees in bioengineering from the Tokyo Institute of Technology and a PhD from the Department of Advanced Interdisciplinary Studies (AIS) at the University of Tokyo in 1999. He joined the Faculty of Engineering of the University of Tokyo, and in 1999 he moved to the University of Electro-Communications. In April 2008, he joined Keio University, where he served as a Professor of the Graduate School of Media Design and the Vice-director of the International Virtual Reality Center until October 2015. In November 2015, he rejoined the University of Tokyo. His installations have appeared at the Ars Electronica Center, and he proposed and organized the Superhuman Sports Society.

Important Dates:

Paper Submission Due: Jan 7, 2018
Final Decision to Authors: Jan 25, 2018
Demo Submission Due: Feb 12, 2018
Camera-Ready Due: Feb 28, 2018
Early Registration Due: Mar 7, 2018
Workshop Dates: Apr 13-17, 2018