ISMAR 2014 - Sep 10-12 - Munich, Germany



Doctoral Consortium

We are proud to present the 3rd ISMAR Doctoral Consortium (DC). We continue to build on the success of the previous DCs, where students received invaluable input from their mentors to help improve their research. The goal of the DC is to give PhD students an opportunity to present their research, discuss their current progress and plans, and receive constructive criticism and guidance from mentors and fellow DC participants. This year, four DC students will present their work, as detailed below.

Please join us at the DC session on Tuesday September 9, 2014, 9am-1pm. 

09:00-09:05   Opening Remarks

09:05-10:00   Alexander Plopski
              Corneal Imaging in Localization and HMD interaction
              Mentors: Prof. Dieter Schmalstieg and Dr. Takeshi Kurata

10:00-11:00   Dariusz Rumiński
              Semantic Contextual Augmented Reality Environments
              Mentors: Prof. Steven Feiner, Prof. Dieter Schmalstieg and Dr. Takeshi Kurata

11:00-12:00   Jason Weigel
              Designing Support for Collaboration around Physical Artefacts: Using Augmented Reality in Learning Environments
              Mentors: Prof. Dieter Schmalstieg and Prof. Steven Feiner

12:00-13:00   Fabrizio Cutolo
              Video See Through AR Head-Mounted Display for Medical Procedures
              Mentors: Prof. Steven Feiner and Dr. Takeshi Kurata

Submission Abstracts:

 

Video See Through AR Head-Mounted Display for Medical Procedures
Fabrizio Cutolo, EndoCAS Center, Department of Translational Research and New Technologies in Medicine and Surgery, University of Pisa, Italy
Thesis supervisors: Paolo Domenico Parchi and Vincenzo Ferrari
In the context of image-guided surgery (IGS), AR technology represents a significant development in the field, since it complements and integrates the concepts of surgical navigation based on virtual reality. The aim of the project is to optimize and validate an ergonomic, accurate, and inexpensive video see-through AR system as an aid in various types of surgical procedures. To be successfully introduced into clinical practice, the system should be low-cost and user-friendly.

Corneal Imaging in Localization and HMD interaction
Alexander Plopski, Osaka University, Japan
Thesis supervisors: Kiyoshi Kiyokawa, Haruo Takemura, and Christian Nitschke
The human eyes perceive our surroundings and are arguably our most important sensory organ. Unlike our other senses, the eyes not only perceive but also provide information to a keen observer. However, thus far this has mainly been used to detect reflections of infrared light sources to estimate the user's gaze; the reflection of the visible spectrum, on the other hand, has rarely been utilized. In this dissertation we want to explore how analysis of the corneal image can improve currently available eye-related solutions, such as calibration of optical see-through head-mounted devices, or eye-gaze tracking and point-of-regard estimation in arbitrary environments. We also aim to study how corneal imaging can become an alternative for established augmented reality tasks such as tracking and localisation.

Semantic Contextual Augmented Reality Environments 
Dariusz Rumiński, Poznań University of Economics, Poland
Thesis supervisor: Krzysztof Walczak
The paper presents the concept of dynamic Contextual Augmented Reality Environments (CARE), in which the augmentation presented to users is dynamically constructed from four semantically described elements. The first element is the user's context (preferences, privileges, location, time, device capabilities). The second is a set of trackables: visual markers representing real-world objects that can be augmented for a given user in a given context. The third is a set of content objects, representing interactive 2D and 3D multimedia content, including video sequences and sounds, to be presented on the trackables. The last is a description of a user interface, which may be specific to a particular device or application and which indicates the forms of information presentation and interaction available to a user.
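To make the four CARE elements concrete, the following is a minimal illustrative sketch in Python. All class names, fields, and the selection function are hypothetical assumptions for illustration only; they are not taken from the thesis.

```python
from dataclasses import dataclass, field

# Illustrative data model for the four CARE elements; all names are
# hypothetical, not from the dissertation.

@dataclass
class UserContext:
    preferences: dict
    privileges: set
    location: str
    time: str
    device_capabilities: set  # media types the device can render

@dataclass
class Trackable:
    marker_id: str            # visual marker representing a real-world object
    required_privilege: str   # augmented only for users holding this privilege

@dataclass
class ContentObject:
    media_type: str           # e.g. "3d-model", "video", "sound"
    uri: str

@dataclass
class UIDescription:
    target_device: str
    interactions: list = field(default_factory=list)

def augmentation_for(ctx: UserContext, trackable: Trackable,
                     content: list, ui: UIDescription) -> list:
    """Select the content objects a user may see on a trackable,
    given their context. The UI description would govern presentation
    and interaction; it is carried along but unused in this sketch."""
    if trackable.required_privilege not in ctx.privileges:
        return []
    # Keep only media the user's device is capable of rendering.
    return [c for c in content if c.media_type in ctx.device_capabilities]
```

For example, a user whose device supports only 3D models would receive the 3D content object for a trackable they are privileged to see, while the video object is filtered out; a user without the required privilege receives no augmentation at all.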

Designing Support for Collaboration around Physical Artefacts: Using Augmented Reality in Learning Environments 
Jason Weigel, University of Queensland, Australia
Thesis supervisors: Stephen Viller and Mark Schulz
The aim of this thesis is to identify mechanisms for supporting collaboration around physical artefacts in co-located and remote settings. To explore the research question in the project, a Research through Design approach has been adopted. A technology probe — an evolutionary prototype of a remote collaboration system — will be used to fuel the research. The prototype will facilitate collaboration between small groups around physical artefacts in an augmented learning environment. The prototype will inform future collaborative augmented reality technology design.
