ISMAR 06 --- Fifth IEEE and ACM International Symposium on Mixed and Augmented Reality --- Oct. 22 - 25, 2006 in Santa Barbara, CA

Call for Papers


Online Paper Submissions originally due: Monday, April 24, 9pm US Pacific Time Zone.
Extended deadline: Sunday, April 30, 9pm US Pacific Time Zone.
See Accepted Submissions.

Demo Submission due: Monday, July 10.
See Accepted Demos

Student Volunteers due: Saturday, Aug 19

Student Research Colloquium due: Sunday, October 8

Aims and Scope

Mixed Reality (MR) and Augmented Reality (AR) allow the creation of novel types of user interfaces that integrate computer-generated information into the real world. Applications of this technology are beginning to have a significant impact on industry and society. The research field is highly interdisciplinary, bringing together computer vision, computer graphics, user interfaces, human factors, wearable computing, mobile computing, computer networks, displays, and sensors, to name just some of the most important influences.

Since 1999, ISMAR has been the premier forum in this vibrant field. This year we are proud to present the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2006). The symposium will be held on Oct. 22-25, 2006, in sunny Santa Barbara, California, USA. We invite you all to participate in this event for the exchange of new ideas in this exciting field!

Conference Topics

ISMAR06 invites submissions in the general field of Mixed and Augmented Reality.
Topics include, but are not limited to:

MR/AR applications

  • personal MR/AR information systems
  • industrial and military MR/AR applications
  • medical MR/AR applications
  • MR/AR for entertainment
  • MR/AR for architecture
  • MR/AR for art, cultural heritage, or education and training

Tracking and registration

  • position and orientation tracking technology
  • calibration methods
  • sensor fusion
  • vision-based registration and tracking
  • acquisition of 3D scene descriptions

System architecture

  • wearable and mobile computing
  • distributed and collaborative MR/AR
  • display hardware
  • performance issues (real-time approaches)

User interaction

  • interaction techniques for MR/AR
  • collaborative MR/AR
  • multimodal input and output

Information presentation

  • real-time rendering
  • photorealistic rendering
  • object overlay and spatial layout techniques
  • aural, haptic and olfactory augmentation
  • mediated and diminished reality
  • display and view management

Human factors

  • usability studies and experiments
  • acceptance of MR/AR technology
  • social implications

Paper Submissions


All ISMAR program chairs and reviewers must accept a confidentiality agreement, which obligates them to treat submitted papers as confidential information. Patent applications or other forms of IP protection sought by commercial parties are therefore not adversely affected by submitting material to ISMAR. If a paper is accepted, publication will take place on the first day of the conference, Oct. 22, 2006.

Extended Deadline: Sunday, April 30, 9pm US Pacific Time Zone (see news)

Submission notification date: We are aiming for June 24.

ISMAR is a high-quality conference with a competitive submission process. In 2005, ISMAR accepted 22% of the papers submitted based on a rigorous reviewing procedure with at least 4 independent experts providing reviews, and a final decision made in a two-day committee meeting.

For this year, we will accept the following categories of submissions:

  • Full-length papers are up to 10 pages long and are held to rigorous standards of contribution and evaluation.
  • Short papers are up to 4 pages in length and are reviewed through the same process, but the expected amount of contribution is less due to the shorter format. Typical examples are a small but neat technical contribution or an application case study.
  • Posters: Contributions are submitted as an extended abstract of up to 2 pages in length.

Accepted full-length and short papers will be presented at the conference. The allotted presentation time will depend upon the length and content of the paper. Accepted papers and poster abstracts will be published in the proceedings. Previously published work may not be submitted, nor may the work be concurrently submitted to any other conference or journal; such papers will be rejected without review. Reviews will be single blind (i.e., include authorship information with the submitted paper). Papers must be in English.

Authors may include supplementary materials (such as a video) with the submission, and such materials are highly encouraged if they provide evidence of the claimed contribution. The submission should include all information necessary to evaluate the paper and must not ask reviewers to go to web sites or other external information sources, since that may jeopardize the anonymity of the reviewers.

If you have any further questions, contact schmalstieg[at]

Paper and Poster Submission Guidelines

Each submission may be entered under only one category (full paper, short paper, or poster). You must not submit the same material simultaneously to different categories (e.g., as both a full paper and a poster).

Paper and poster submissions must be formatted for 8.5"x11" paper. Please follow the IEEE Computer Society TCVG authoring guidelines (LaTeX style, MS Word template, sample paper).

All materials will be submitted electronically to

If you already have an account with that system (because you have been asked to be a reviewer), please use that account to submit materials. Otherwise, create a new account.

Papers and posters must be submitted as PDF documents.

Supplementary videos are encouraged. Videos should be no longer than 5 minutes, and the total size of all submitted materials (including the PDF document) must be under 50MB. If you include a video, also include a text file describing the codec used to create it. Videos should be playable by either the current Windows Media Player or Apple QuickTime player, preferably without requiring additional codec installations. If you require a special codec (e.g., DivX), make certain that you include instructions on how to find and install it. If the reviewers cannot play your video file, it will reduce the chances of your submission being accepted.

Demo Submissions

Deadline: Monday, July 10

Following the established tradition of showcasing the State of the Art in Augmented and Mixed Reality applications, ISMAR06 is calling for submissions of demonstrations.

The demo session is tentatively scheduled for Tuesday Oct. 24th. Demo presenters will also have one minute and two slides to draw the crowds to their demo during a Demo Fast Forward early in the conference.

In addition to being presented at the demo session, accepted demos will be published on the conference CD-ROM in the form of a one-page description.

Contact: Dirk Reiners, Demo Chair mail[at]

Please make sure to include "ISMAR 06 Demo" in the subject line.

Demo Submission Guidelines

To submit a proposal for a demo, please send email to mail[at] containing

  • The title of the proposed demonstration
  • The names of the authors
  • A brief abstract with keywords
  • The name, mailing address, telephone, fax number and email address of the contact author
  • A one-page storyboard describing your demo. What makes it unique and special? Why will it draw a big crowd? Would your grandma want to see it? Why? Or would an AR expert want to see it and why?
  • [Optional] Video of the demo

In addition, we need the following information for planning the demonstration space:

  • The amount of floor or desktop space needed (length, width, height)
  • The list of equipment you will bring (be as detailed as you can be)
  • Any power, socket and outlet needs (110V AC US sockets are available)
  • Networking requirements (beyond the common mail/web access)
  • Any environmental requirements (does the demonstration require a dark environment? Does it produce, or is it sensitive to, large amounts of noise or light? Etc.)

After sending email, you should receive a notification that your email has been received. If you experience problems sending large emails (esp. movies), contact mail[at] to arrange uploading the files to an FTP server.

Call for Student Volunteers

The ISMAR 06 organizing committee is recruiting student volunteers to serve in various support roles during the conference.

Student volunteer activities:

Typical student volunteer activities include helping with on-site registration, packing conference bags with proceedings, setting up and removing tables, and assisting with audio/visual problems. Student volunteers may need to arrive one or two days earlier to get oriented with the environment, organizing committee, and the resources within the conference area as well as to help with pre-conference preparations.

Why volunteer?

In return for their service, student volunteers receive free registration to the event. However, volunteers must cover their own travel and accommodation expenses.

Due to the highly interdisciplinary nature of this community, student volunteers will have the opportunity to exchange ideas with leading researchers from various branches of science (computer science, psychology, optics, etc.) and engineering that relate to augmented reality.

How to apply?

Interested students should fill out the questionnaire below and contact:

Zhenyao Mo
Student Volunteer Chair
Computer Science Department
University of Southern California

Application deadline: Saturday, Aug 19, 2006

Acceptance notification: Thursday, Aug 31, 2006

Questionnaire for volunteers:

  • Name
  • Email
  • Web page
  • Academic department / Major
  • University
  • Advisor
  • Research interest
  • Degree program (Doctoral/Masters/Undergrad)
  • Have you ever attended ISMAR before?
  • Have you ever been a student volunteer at a conference? Which one?
  • Do you have any special skills? (audio/visual equipment, networking equipment, data entry, etc.)

Accepted Submissions


Masayuki Takemura, Itaru Kitahara, Yuichi Ohta Photometric Inconsistency on a Mixed-Reality Face
Mark Livingston Quantification of Visual Capabilities using Augmented Reality Displays
Istvan Barakonyi, Dieter Schmalstieg Ubiquitous Animated Agents for Augmented Reality
Julien Pilet, Andreas Geiger, Pascal Lagger, Vincent Lepetit, Pascal Fua An All-In-One Solution to Geometric and Radiometric Calibration
Gérald Bianchi, Christoph Jung, Benjamin Knoerlein, Gabor Szekely, Matthias Harders High-fidelity visuo-haptic interaction with virtual objects in multi-modal AR systems
Marcus Toennis, Gudrun Klinker Effective Control of a Car Driver's Attention for Visual and Acoustic Guidance towards the Direction of Imminent Dangers
Kiyohide Satoh, Kazuki Takemoto, Shinji Uchiyama, Hiroyuki Yamamoto A Registration Evaluation System Using an Industrial Robot
Gerhard Reitmayr, Tom Drummond Going out: Robust, Model-based Tracking for Outdoor Augmented Reality
Harald Wuest, Gabriele Bleser, Didier Stricker Online camera pose estimation in partially known and dynamic scenes
Erick Mendez, Denis Kalkofen, Dieter Schmalstieg Interactive Context-Driven Visualization Tools for Augmented Reality
Daniel Pustka, Martin Bauer, Manuel Huber, Gudrun Klinker Spatial Relationship Patterns: Elements of Reusable Tracking and Calibration Systems
Gilles Simon Automatic Online Walls Detection for Immediate Use in AR Tasks
Ronald Azuma, Howard Neely III, Mike Daily, Jon Leonard Performance Analysis of an Outdoor Augmented Reality Tracking System that Relies Upon a Few Mobile Beacons
Juri Platonov, Hauke Heibel, Peter Meier, Bert Grollmann A mobile markerless AR system for maintenance and repair
Kensaku Saitoh, Takashi Machida, Kiyoshi Kiyokawa, Haruo Takemura A 2D-3D Integrated Interface for Mobile Robot Control Using Omnidirectional Images and 3D Geometric Models
Shilpi Gupta, Christopher Jaynes The Universal Media Book: Tracking and Augmenting Moving Surfaces with Projected Information
Martin Bauer, Michael Schlegel, Daniel Pustka, Nassir Navab, Gudrun Klinker Predicting and Estimating the Accuracy of Optical Tracking Systems
Aaron Stafford, Wayne Piekarski, Bruce Thomas Implementation of God-like Interaction Techniques For Supporting Collaboration Between Indoor and Outdoor Users
Ji-Young Oh, Hong Hua User Evaluations on Form Factors of Tangible Magic Lenses
Timothy Edmunds, Dinesh K. Pai An Event Architecture for Distributed Interactive Multisensory Rendering
Bunyo Okumura, Masayuki Kanbara, Naokazu Yokoya Augmented Reality Based on Estimation of Defocusing and Motion Blurring from Captured Images
Bruce Thomas Evaluation of Three Input Techniques for Selection and Annotation of Physical Objects Through an Augmented Reality View
Jason Wither, Stephen DiVerdi, Tobias Hollerer Using Aerial Photographs for Improved Mobile AR Annotation
Michael Zäh, Wolfgang Vogl Interactive Laser Projection for Programming Industrial Robots
Sylvia Irawati, Mark Billinghurst , Andreas Duenser , Scott Green , Heedong Ko "Move the couch where?": Developing an Augmented Reality Multimodal Interface
Jan Fischer, Dirk Bartz Enhanced Visual Realism by Incorporating Camera Image Effects
Muriel Pressigout, Éric Marchand Hybrid tracking algorithms for planar and non-planar structures subject to illumination changes
Jun Ehara, Hideo Saito Texture Overlay for Virtual Clothing Based on PCA of Silhouettes
Sinem Güven, Steven Feiner Visualizing and Navigating Complex Situated Hypermedia in Augmented and Virtual Reality
Alex Olwal LightSense: Enabling Spatially Aware Handheld Interaction Devices


Chris Warrington, Gerhard Roth, Eric Dubois Markerless Augmented Reality for Cubic Panorama Sequences
Taehee Lee, Tobias Höllerer Viewpoint Stabilization for Live Collaborative Video Augmentations
Wayne Piekarski Robust Gloves For 3D Interaction In Mobile Outdoor AR Environments
Youngjin Hong, Sanggoog Lee, Yongbeom Lee, Sangryong Kim Mobile Pointing & Input System Using Active Marker
Wafaa ABOU MOUSSA, Jean-Pierre JESSEL, Emmanuel DUBOIS Notational-based Prototyping of Mixed Interactions
Susanna Nilsson, Björn Johansson User Experience and Acceptance of a Mixed Reality System in a Naturalistic Setting - A Case Study
Ann-Sofie Gunnarsson, Malinda Rauhala, Anders Henrysson, Anders Ynnerman Visualization of Sensor Data Using Mobile Phone Augmented Reality
Sanni Siltanen Texture Generation over the Marker Area
Raphael Grasset, Julian Looser, Mark Billinghurst Transitional Interface: Concept , Issues and Framework
Jeremiah Neubert, Tom Drummond Using Backlight Intensity for Device Identification
Yoichi Motokawa, Hideo Saito Online Tracking of Guitar for Playing Supporting System by Augmented Reality
Ryosuke Ichikari, Keisuke Kawano, Asako Kimura, Fumihisa Shibata, Hideyuki Tamura Mixed Reality Pre-visualization and Camera-Work Authoring in Filmmaking
Sinem Güven, Steven Feiner, Ohan Oda Mobile Augmented Reality Interaction Techniques for Authoring Situated Media On-Site
Stefan Nölle, Gudrun Klinker Augmented Reality as a Comparison Tool in Automotive Industry
Mark Fiala Webtag: A World Wide Internet Based AR System
Seok-Won Lee, Dong-Chul Kim, Do-Yoon Kim, Tack-Don Han Tag Detection Algorithm for Improving a Instability Problem of an Augmented Reality