IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, VOL. 17, NO. 9, SEPTEMBER 2011

Guest Editors’ Introduction: Special Section on the IEEE Virtual Reality Conference (VR)

Kiyoshi Kiyokawa, Gudrun Klinker, and Benjamin Lok, Member, IEEE


THE IEEE Virtual Reality Conference (VR) continues to be the leading venue for disseminating the latest VR research, applications, and technologies, and this special section presents significantly extended versions of the six best papers from the IEEE VR 2010 proceedings. These papers highlight the wide span of active research areas in the VR field, including perceptual understanding, interfaces, rendering and visualization technologies, augmented reality, and audio and display technologies.

IEEE VR 2010 had a record number of submissions (161), and each paper was reviewed by at least four experts in the field. An international program committee of 59 VR experts carefully invited reviewers, led discussions, and summarized a consensus review. An international program committee meeting was held where each paper was discussed, and the overall acceptance rate of 25 percent reflects the significant effort expended in selecting the best papers.

The top 10 papers of the IEEE VR 2010 conference were identified from the conference review process and by the conference Awards Committee. The authors of each were then asked to submit an extended version of their conference paper, with a clear focus on additional content that expanded and enhanced the scientific contribution of the original. A standard IEEE Transactions on Visualization and Computer Graphics (TVCG) reviewing cycle was initiated, and all accepted papers went through multiple rounds of revision and review. Significant effort by many people was invested to ensure the quality of this special section, and we are very grateful.

In their paper “Natural Interaction Metaphors for Functional Validations of Virtual Car Models,” Mathias Moehring and Bernd Froehlich provide an elegant solution to a major VR problem: how do you provide natural physical interfaces (e.g., gestures and finger tracking) to purely virtual objects? Further, if the goal is to validate the virtual models, for example, in car manufacturing, the requirements are significant. To address this issue, the authors examined the grasping heuristics of users and developed “Normal Proxies,” a simplification of complex virtual objects that provides improved grasp detection and grasp stability. Expert review and user studies demonstrated that the approach allows the intuitive and reliable assessment required in model verification, and that overall task performance and usability are similar in CAVE and HMD environments.
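To give a flavor of the underlying idea (the paper’s proxy geometry and grasp heuristics are considerably richer than space permits here), a minimal sketch of grasp detection against proxy normals might look like the following; every name, data layout, and threshold below is illustrative and is not taken from the paper.

    import numpy as np

    # Illustrative sketch only: a "normal proxy" is approximated here as sampled
    # surface points with outward normals attached; the paper's actual proxies
    # and heuristics are more elaborate.
    def detect_grasp(finger_tips, proxy_points, proxy_normals,
                     contact_dist=0.01, opposition_thresh=-0.7):
        """Crude two-finger grasp test: at least two fingertips touch the proxy
        surface and their contact normals point roughly against each other."""
        contact_normals = []
        for tip in finger_tips:
            d = np.linalg.norm(proxy_points - np.asarray(tip), axis=1)
            nearest = int(np.argmin(d))
            if d[nearest] < contact_dist:        # fingertip is on or near the surface
                contact_normals.append(proxy_normals[nearest])
        for i in range(len(contact_normals)):
            for j in range(i + 1, len(contact_normals)):
                if np.dot(contact_normals[i], contact_normals[j]) < opposition_thresh:
                    return True
        return False

In a sketch like this, a per-frame call with tracked fingertip positions and a precomputed proxy sampling of the car model would decide when to treat the virtual object as grasped.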

K. Kiyokawa is with the Cybermedia Center, Osaka University, Toyonaka Educational Research Center 517, 1-32 Machikaneyama, Toyonaka, Osaka 560-0043, Japan. E-mail: [email protected].
G. Klinker is with Fachbereich Informatik, Technische Universität München, Boltzmannstr. 3, Raum 03.013.053, D-85748 Garching b. München, Germany. E-mail: [email protected].
B. Lok is with the University of Florida, CSE Building Room 544, PO Box 116120, Gainesville, FL 32611-6120. E-mail: [email protected].

Although it has become common to use multiple projectors to build an immersive display environment, calibrating such displays remains a difficult problem. In their paper “Autocalibrating Tiled Projectors on Piecewise Smooth Vertically Extruded Surfaces,” Behzad Sajadi and Aditi Majumder propose a novel calibration approach for casually aligned projectors using a single camera. They assume the surface is a piecewise smooth vertically extruded surface (e.g., cylindrical displays and CAVEs) for which the aspect ratio of the rectangle formed by the four corners of the surface is known and the boundary is visible and segmentable. Using these priors, they estimate the display’s 3D geometry and the camera’s extrinsic parameters with a nonlinear optimization over a single image, without any explicit display-to-camera correspondences, and then recover the intrinsic and extrinsic parameters of each projector quickly and robustly.

In visual perception, change blindness describes the phenomenon in which viewers of a visual scene fail to detect even significant changes in that scene. The phenomenon has been observed with both computer-generated imagery and real-world scenes. However, until now the influence of stereoscopic vision on change blindness has not been studied thoroughly. In their paper “Change Blindness Phenomena for Virtual Reality Display Systems,” Frank Steinicke, Gerd Bruder, Klaus Hinrichs, and Pete Willemsen investigate change blindness techniques for stereoscopic virtual reality (VR) systems. In their study, they compared several passive and active stereo display systems and found that change blindness phenomena occur with the same magnitude as in monoscopic viewing conditions. They also present and discuss the potential of these techniques for allowing abrupt yet significant changes in a VR environment.

While a significant amount of VR research has focused on the visual and haptic components of immersive environments, audio has received proportionally less attention. Recently, however, there has been increasing interest in new methods for generating audio for virtual environments (VEs) and in evaluating the impact of audio in VEs. In “Sound Synthesis and Evaluation of Interactive Footsteps and Environmental Sounds Rendering for Virtual Reality Applications,” Rolf Nordahl, Luca Turchet, and Stefania Serafin propose new methods for generating audio for immersive VE navigation, specifically footsteps on different materials. They also conducted studies to evaluate whether the surfaces being walked upon could be recognized from the sounds alone, and how environmental sounds affect that recognition.
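Their synthesis engine is physically informed and considerably more sophisticated than can be reproduced here; purely as an illustration of material-dependent footstep rendering, a toy sketch might look like the following, where the material names and parameters are invented for the example rather than drawn from the paper.

    import numpy as np

    SAMPLE_RATE = 44100

    # Invented, purely illustrative material parameters: burst length, decay rate,
    # and a one-pole low-pass coefficient (larger = duller sound).
    MATERIALS = {
        "gravel": {"duration": 0.25, "decay": 30.0, "lowpass": 0.6},
        "wood":   {"duration": 0.12, "decay": 60.0, "lowpass": 0.3},
        "snow":   {"duration": 0.30, "decay": 20.0, "lowpass": 0.8},
    }

    def footstep(material, rng=None):
        """Render one footstep as an exponentially decaying, low-pass filtered noise burst."""
        if rng is None:
            rng = np.random.default_rng()
        p = MATERIALS[material]
        n = int(p["duration"] * SAMPLE_RATE)
        t = np.arange(n) / SAMPLE_RATE
        burst = rng.standard_normal(n) * np.exp(-p["decay"] * t)
        a = p["lowpass"]
        out = np.empty(n)
        acc = 0.0
        for i, x in enumerate(burst):            # one-pole low-pass filter
            acc = (1.0 - a) * x + a * acc
            out[i] = acc
        return out / (np.max(np.abs(out)) + 1e-9)  # normalize to [-1, 1]

In an interactive system, a renderer of this kind would be triggered by detected footfalls and mixed with environmental sounds, which is roughly the setting the authors evaluate.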

How much independence each user should be given in controlling his or her viewpoint during colocated collaboration, so as to maximize efficiency for a given task, has remained an open question. In their paper “Effects of Viewing Conditions and Rotation Methods in a Collaborative Tabletop AR Environment,” Sangyoon Lee and Hong Hua investigate the effects of viewing conditions (synchronous/asynchronous) and rotation methods (direct/indirect) on different types of collaborative tasks (block building/text selection) in their own two-user colocated tabletop augmented reality (AR) environment, SCAPE. They found that the viewing conditions had significant, and different, effects on several objective and subjective measures depending on the task type.

Data visualization has long been an application well suited to immersive virtual reality. However, researchers have recognized the need for tools that exploit both the immersive capabilities of the VR system and the large volumes of data involved. Christoph W. Borst, Jan-Phillip Tiesel, Emad Habib, and Kaushik Das have developed one such tool in their paper “Single-Pass Composable 3D Lens Rendering and Spatiotemporal 3D Lenses.” They propose an interactive 3D lens rendering technique that composes multiple lenses in a single rendering pass, leveraging programmable GPUs to provide these visualization tools at interactive rates.
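The published technique runs in GPU shader code; as a rough, CPU-side illustration of what “composable” and “single pass” mean (each sample is shaded once, and every lens volume containing it contributes to that one evaluation), a sketch might look like the following, where the lens shapes and effects are hypothetical.

    import numpy as np

    # Hypothetical lens: a spherical region of interest plus the effect applied inside it.
    class SphereLens:
        def __init__(self, center, radius, effect):
            self.center = np.asarray(center, dtype=float)
            self.radius = float(radius)
            self.effect = effect                  # e.g., "highlight" or "desaturate"

        def contains(self, point):
            return np.linalg.norm(point - self.center) <= self.radius

    def shade_sample(point, base_color, lenses):
        """Shade a sample once; all lenses containing it compose in this single pass."""
        color = np.array(base_color, dtype=float)
        for lens in lenses:
            if lens.contains(point):
                if lens.effect == "highlight":
                    color = 0.5 * color + 0.5 * np.array([1.0, 1.0, 0.0])  # yellow tint
                elif lens.effect == "desaturate":
                    color[:] = color.mean()
        return color

    # Two overlapping lenses composed without a second rendering pass.
    lenses = [SphereLens((0.0, 0.0, 0.0), 1.0, "highlight"),
              SphereLens((0.5, 0.0, 0.0), 1.0, "desaturate")]
    print(shade_sample(np.array([0.4, 0.0, 0.0]), (0.2, 0.6, 0.8), lenses))

In the paper itself, the corresponding per-fragment logic lives in GPU shaders, which is what makes interactive rates possible for large data sets.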
Thanks to Ryohei Nakatsu, a fellow program chair of IEEE VR 2010, for helping with the international program committee meeting that identified the best papers to be considered for this special section. Thanks also to the IEEE VR 2010 Awards Committee, including Greg Welch, Sabine Coquillart, and Dirk Reiners, who listened to the talks and recommended papers for inclusion in this special section. Finally, thanks to the reviewers who provided very detailed and careful reviews of the included papers.
Kiyoshi Kiyokawa
Gudrun Klinker
Benjamin Lok
Guest Editors


Kiyoshi Kiyokawa received the ME and PhD degrees in information systems from the Nara Institute of Science and Technology in 1996 and 1998, respectively. He is currently an associate professor at the Cybermedia Center, Osaka University. He was a research fellow of the Japan Society for the Promotion of Science in 1998 and worked for the Communications Research Laboratory from 1999 to 2002. He was a visiting researcher at the Human Interface Technology Laboratory at the University of Washington from 2001 to 2002. His research interests include virtual reality, augmented reality, 3D user interfaces, and CSCW. He is a member of the IEICE, the IPSJ, the VRSJ, the HIS, and the ACM.

Gudrun Klinker studied computer science (informatics) at the Friedrich-Alexander Universität Erlangen, the Universität Hamburg (Diplom), and Carnegie Mellon University (PhD) in Pittsburgh, Pennsylvania, focusing on research topics in computer vision. In 1989, she joined the Cambridge Research Laboratory of the Digital Equipment Corporation in Boston, Massachusetts, working in the visualization group on the development of a reusable tele-collaborative data exploration environment to analyze and visualize 3D and higher-dimensional data in medical and industrial applications. Since 1995, she has been researching various aspects of the newly emerging concept of augmented reality, first at the European Computer-industry Research Center, then at the Fraunhofer Institute for Computer Graphics, and since 2000 at the Technical University of Munich. There, her research focuses on developing approaches to ubiquitous augmented reality that lend themselves to realistic industrial applications.

Benjamin Lok received the PhD degree (2002, advisor: Dr. Frederick P. Brooks, Jr.) and the MS degree (1999) from the University of North Carolina at Chapel Hill, and the BS degree in computer science (1997) from the University of Tulsa. He completed a postdoctoral fellowship (2003) under Dr. Larry F. Hodges at the University of North Carolina at Charlotte. He is an associate professor in the Computer and Information Science and Engineering Department at the University of Florida (UF), and an adjunct associate professor in the Surgery Department at the Medical College of Georgia. His research focuses on virtual humans and mixed reality in the areas of computer graphics, virtual environments, and human-computer interaction. He received a US National Science Foundation (NSF) CAREER Award (2007-2012) and the UF ACM CISE Teacher of the Year Award in 2005-2006. He and his students in the Virtual Experiences Research Group have received Best Paper Awards at ACM I3D (Top 3, 2003) and IEEE VR (2008). His work is primarily supported by the NSF and the US National Institutes of Health. He currently serves on the Steering Committee of the IEEE Virtual Reality Conference; he was a program cochair of the ACM VRST 2009 and IEEE Virtual Reality 2010 conferences, and an area chair for IEEE ISMAR 2009. He is on the editorial boards of the International Journal of Human-Computer Studies and Simulation: Transactions of the Society for Modeling and Simulation. He is a member of the IEEE.