Program
Conference Program (preliminary)
Sunday Oct 22 2006
Monday Oct 23 2006
Tuesday Oct 24 2006
Wednesday Oct 25 2006
Keynote I
The Poor Man's Palace: Special Effects in the Real World
Ramesh Raskar
Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA
Bio
Ramesh Raskar joined MERL as a Research Scientist in 2000 after his doctoral research at the University of North Carolina at Chapel Hill, where he developed a framework for projector-based displays. His work spans a range of topics in computer vision, graphics, and HCI, including projective geometry, non-photorealistic rendering, and intelligent user interfaces. Current projects include composite RFID (RFIG), a multi-flash non-photorealistic camera for depth edge detection, locale-aware mobile projectors, high dynamic range video, image fusion for context enhancement, and quadric transfer methods for multi-projector curved screen displays.
Dr. Raskar received the TR100 Award (Technology Review's 100 Top Young Innovators Under 35 worldwide) in 2004; the Global Indus Technovator Award 2003, instituted at MIT to recognize the top 20 Indian technology innovators worldwide; the Mitsubishi Electric Valuable Invention Award in 2004 and 2006; and the Mitsubishi Electric Information Technology R&D Award 2003. He is a member of the ACM and IEEE.
http://www.merl.com/people/raskar/raskar.html
ISMAR Workshop (full day)
Industrial Augmented Reality
Augmented Reality has matured from pure research to actual industrial applications. There are still many research questions to solve, and solutions are discussed every year at the ISMAR conference series. However, making systems really work usually requires more effort, knowledge, and ideas than academia alone can contribute. The goal of this workshop is to bring together people from industry, research labs, and academia who share the goal of employing augmented reality technologies in real industrial settings aimed at producing a commercial benefit.
The workshop is a continuation of the very successful one-day event at ISMAR'05. It is intended to provide a platform to jointly discuss the "devils in the detail" and to identify ways to make the leap from research-based demonstrators to fully integrated systems.
Organizers:
Gudrun Klinker (TU Munich), coordinator
Stefan Noelle (Volkswagen)
Toshikazu Ohshima (Ritsumeikan University, formerly Canon)
Marcus Toennis (TU Munich)
Affiliated Workshop (full day)
Mixed Reality Technology for Film Making
Information on this workshop can be found at http://www.rm.is.ritsumei.ac.jp/MR-PreVizProject/workshop.htm.
ISMAR Tutorial 1 (half day)
Computer Vision for Augmented Reality
Camera-based tracking for augmented reality is nowadays frequently done using AR-Toolkit, whose tracking is based on markers placed in the scene. There are situations where it is not possible to place markers in the scene to track the camera motion; one example is TV-broadcasting applications. In recent years, computer vision techniques have been extended towards markerless, high-quality camera tracking at interactive frame rates. These techniques can be used for camera pose estimation for augmented reality.
The tutorial will review the algorithms and techniques implemented in AR-Toolkit as the state-of-the-art tool for camera tracking in augmented reality. Afterwards, these techniques will be extended towards markerless camera pose estimation in real time. For this, state-of-the-art techniques in computer vision will be explained in detail. The methods introduced include algorithms for the detection and tracking of salient image features in the video, as well as the robust estimation of the camera pose from the motion of these features. In the final part of the tutorial we will discuss techniques used to improve the performance of the algorithms to achieve real-time operation. Throughout the tutorial we will show applications of each of the algorithms to enhance understanding; these examples are used to build a markerless augmented reality system.
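As an illustration of the markerless pipeline outlined above (not the tutorial's own code), the following minimal sketch shows its three stages using OpenCV's Python API: salient-feature detection, frame-to-frame feature tracking, and robust camera pose estimation. The calibration matrix K, distortion coefficients dist, and 3D scene points object_pts are assumed to be known, for example from an offline reconstruction, and serve only as placeholders.

import cv2

def detect_salient_features(gray):
    # Salient-feature detection (Shi-Tomasi corners), typically run on a keyframe.
    return cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                   qualityLevel=0.01, minDistance=7)

def track_and_estimate_pose(prev_gray, gray, prev_pts, object_pts, K, dist):
    # Frame-to-frame tracking of the salient features (pyramidal Lucas-Kanade).
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    ok = status.ravel() == 1

    # Robust pose estimation: RANSAC discards feature motions that are
    # inconsistent with a single rigid camera motion (outliers, mismatches).
    success, rvec, tvec, inliers = cv2.solvePnPRansac(
        object_pts[ok], next_pts[ok], K, dist, reprojectionError=3.0)
    return success, rvec, tvec, next_pts[ok]

In practice, interactive frame rates typically come from re-running the detector only when too many features have been lost, rather than detecting on every frame.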
Organizers:
Jan-Michael Frahm (UNC-Chapel Hill), coordinator and instructor
Other(s) TBD
ISMAR Tutorial 2 (half day)
Head-worn Displays (HWD) Fundamentals and Applications
Trying to develop new applications with HWDs without an associated plan for the development of technology optimized for that application and set of tasks is like trying to swim in a pool with no water.
To enable breakthroughs in the use of HWDs targeted at multiple applications, this tutorial will briefly review the principles of optical imaging for HWDs. We will then explain various perception issues in HWDs which, if understood, can help guide the optimal design and use of the technology across various applications and tasks. We will then detail eyepiece-based versus projection optics, including retinal scanning displays, which can be thought of as a scanning projection system. Video see-through technology will then be introduced, together with some key system aspects related to calibration and applications. Wide-field-of-view optics will then be addressed, and state-of-the-art displays and applications will be reviewed. Finally, we will discuss how free-form optics has already played a key role in HWD design and will help launch the next generation of eyeglass displays for "grand public" use as well.
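To make the eyepiece-based design trade-off mentioned above concrete, here is a back-of-the-envelope sketch: for a simple magnifier eyepiece with the microdisplay at its focal plane, the paraxial field of view is 2*atan(w / 2f), so shortening the focal length widens the field of view but spreads the same pixels over a larger angle. The display width, resolution, and focal lengths below are illustrative assumptions, not figures from the tutorial.

import math

def eyepiece_fov_deg(display_width_mm, focal_length_mm):
    # Paraxial full field of view of a display of width w viewed through an
    # eyepiece of focal length f (display at the focal plane): 2 * atan(w / 2f).
    return math.degrees(2.0 * math.atan(display_width_mm / (2.0 * focal_length_mm)))

display_width_mm = 15.0   # roughly a 0.75-inch 4:3 microdisplay (assumed)
pixels_across = 1280      # horizontal pixel count (assumed)

for f_mm in (20.0, 30.0, 40.0):
    fov = eyepiece_fov_deg(display_width_mm, f_mm)
    arcmin_per_pixel = fov * 60.0 / pixels_across
    print(f"f = {f_mm:4.1f} mm: FOV = {fov:5.1f} deg, "
          f"{arcmin_per_pixel:.2f} arcmin per pixel")

The shortest focal length yields the widest field of view but the coarsest angular resolution, which is one reason wide-field-of-view HWD optics remain a design challenge.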
Organizers:
Jannick Rolland (University of Central Florida), coordinator and
instructor
Andrei State (University of North Carolina at Chapel Hill), instructor
Ozan Cakmakci (University of Central Florida, and Intern at Canon
Japan), instructor
Robert Patterson (Washington State University), instructor
Posters
Chris Warrington, Gerhard Roth, Eric Dubois | Markerless Augmented Reality for Cubic Panorama Sequences
Taehee Lee, Tobias Höllerer | Viewpoint Stabilization for Live Collaborative Video Augmentations
Wayne Piekarski | Robust Gloves For 3D Interaction In Mobile Outdoor AR Environments
Youngjin Hong, Sanggoog Lee, Yongbeom Lee, Sangryong Kim | Mobile Pointing & Input System Using Active Marker
Wafaa Abou Moussa, Jean-Pierre Jessel, Emmanuel Dubois | Notational-based Prototyping of Mixed Interactions
Susanna Nilsson, Björn Johansson | User Experience and Acceptance of a Mixed Reality System in a Naturalistic Setting – A Case Study
Ann-Sofie Gunnarsson, Malinda Rauhala, Anders Henrysson, Anders Ynnerman | Visualization of Sensor Data Using Mobile Phone Augmented Reality
Sanni Siltanen | Texture Generation over the Marker Area
Raphael Grasset, Julian Looser, Mark Billinghurst | Transitional Interface: Concept, Issues and Framework
Jeremiah Neubert, Tom Drummond | Using Backlight Intensity for Device Identification
Yoichi Motokawa, Hideo Saito | Support System for Guitar Playing using Augmented Reality Display
Ryosuke Ichikari, Keisuke Kawano, Asako Kimura, Fumihisa Shibata, Hideyuki Tamura | Mixed Reality Pre-visualization and Camera-Work Authoring in Filmmaking
Sinem Güven, Steven Feiner, Ohan Oda | Mobile Augmented Reality Interaction Techniques for Authoring Situated Media On-Site
Stefan Nölle, Gudrun Klinker | Augmented Reality as a Comparison Tool in Automotive Industry
Seok-Won Lee, Dong-Chul Kim, Do-Yoon Kim, Tack-Don Han | Tag Detection Algorithm for Improving a Instability Problem of an Augmented Reality