An Integrated Design Flow in User Interface and Interaction for Enhancing Mobile AR Gaming Experiences

Raymond Koon Chuan Koh
Interactive and Digital Media Institute / ECE
National University of Singapore
Singapore

Henry Been-Lirn Duh
Interactive and Digital Media Institute / ECE
National University of Singapore
Singapore

Jian Gu
Interactive and Digital Media Institute
National University of Singapore
Singapore

ABSTRACT

Interaction is a prominent feature that distinguishes computer (video) games from other kinds of entertainment [1, 2]. Given that Augmented Reality (AR) presentations enable the creation of user interactions that expand traditional HCI from 2D to 3D spaces, this offers an ideal opportunity for game developers to design more playful and interesting mobile AR games that embed various interaction forms. This paper introduces a mobile AR cooking game whose design flow integrates adopted AR interaction and user interface principles. These embedded interactions are designed with consideration of, and spatial mappings to, real cooking mechanisms in order to provide a unique gaming experience and increased engagement for a better overall AR interaction on the multimodal mobile device.

KEYWORDS: Mobile Augmented Reality, Game Design, 3D Interaction.

INDEX TERMS: H.5.2 [Information Interfaces and Presentation]: User Interfaces - Input devices and strategies, Interaction styles.

1 INTRODUCTION

Good interaction is one of the most important factors influencing a gaming experience. In a game, this ideally means that players should feel a sense of control over their actions and should receive corresponding feedback at appropriate moments. Good interaction also means that the game's interaction metaphors can be spatially mapped to intuitive control mechanisms relative to the real world, so that casual gamers can easily access the game play. Based on Csikszentmihalyi's Flow theory, Sweetser and Wyeth [3] developed a new model, GameFlow, which consists of eight elements for achieving game enjoyment, including control and feedback. It is hence clear that better interaction design makes a game more enjoyable.

1.1 AR Design Principles Adoption for Mobile AR

Augmented Reality (AR) is a form of mixed reality that combines virtual information that is set in physical spaces [4]. Mobile devices (e.g. the Apple iPhone) are increasingly capable in their hardware specifications and performance, making them an ideal platform for AR technologies to be designed and deployed on. Previous studies have provided several design principles and suggestions for AR applications. Namely,

21 Heng Mui Keng Terrace, I3 Building, #02-02-09, NUS, Singapore 119613 / {raymondkoh, eledbl, idmgj}@nus.edu.sg
IEEE International Symposium on Mixed and Augmented Reality 2010 Arts, Media, & Humanities Proceedings, 13-16 October, Seoul, Korea. 978-1-4244-9342-5/10/$26.00 ©2010 IEEE

• The media fusion component of AR takes the tokens of

several input channels and infers user intention from them. It is suggested to combine different input channels under their corresponding predefined boundary conditions [5]. All involved user interfaces have to deliver a consistent user experience.
• A mobile phone can be considered an interaction device with 6 Degrees of Freedom (DOF) input: the combination of its in-built camera, computer vision software and a reference coordinate system collectively allows it to be tracked in three dimensions [6].
• When implementing user interfaces for mobile AR, the dimensional restrictions of desktop metaphors should be reconsidered against the additional DOF available on the mobile phone, as a suitably expressive 3D environment can make information access much more natural because it better models the tangible environment that people work in [7].
• When fiducial markers are used for tracking in mobile AR applications, it is important to consider tangible AR techniques as well, because they enable interaction with virtual content that is more natural and intuitive [8].
• One crucial aspect is the creation of appropriate interaction techniques that allow users to interact with virtual content in an intuitive way. A small pilot study revealed that multisensory feedback enhanced game playing experiences. Mobile devices are ideal for AR applications because they are minimally intrusive, socially acceptable, readily available and highly mobile [9].

Dickey [10] reviewed game design strategies and discussed how interactive choices within games can be used as a design strategy that increases player engagement. We attempt to collectively apply the AR design principles and suggestions mentioned above in a crafted interaction design flow and experience, availing them as interactive options for a mobile-based AR game.
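The 6-DOF principle above reduces, in practice, to reading the translation component of the camera-from-marker pose that a vision tracker reports. A minimal sketch in Python; the 4x4 row-major matrix layout is an assumption (real trackers report an equivalent transformation structure):

```python
import math

def marker_distance(pose):
    """Euclidean distance from the camera to the fiducial marker, taken
    from the translation column of a 4x4 row-major camera-from-marker
    pose matrix (layout assumed; a vision tracker reports equivalent data)."""
    tx, ty, tz = pose[0][3], pose[1][3], pose[2][3]
    return math.sqrt(tx * tx + ty * ty + tz * tz)

# Identity rotation, marker 30 units straight ahead of the camera.
pose = [
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 30.0],
    [0.0, 0.0, 0.0, 1.0],
]
```

This single scalar is what later sections use for both heat control (distance) and shake detection (rate of change of distance).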
By having game interactions that are natural, intuitive and mimicking real-world behavior, players do not need to spend extra time learning how to play the game. In addition, we applied the 'seamful design' [11] methodology, acknowledging that the delivered user experience contains uncertainty caused by the unfamiliar and unpredictable nature of a relatively new mobile AR game technology that fuses empowering interactions. The conceptual application of seamful design includes measures built into the game level of our prototype to mitigate disruptions and breaks in interaction flow. These may collectively result from operating conditions (e.g. poor lighting in the physical environment where the mobile game is played) and user factors (e.g. players who are unaware that fiducial markers need to be kept constantly or regularly within the visual view of the mobile device's camera, because they are new to or unfamiliar with how AR applications work).
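As one illustration of such a seamful-design measure, a marker-loss watchdog could pause the game clock and surface a hint instead of letting a tracking dropout silently break the interaction flow. All names and the grace period here are hypothetical, not from the paper:

```python
class TrackingWatchdog:
    """Seamful-design sketch (hypothetical API): when the fiducial marker
    leaves the camera view, pause the game timer immediately and show a
    hint after a short grace period rather than penalising the player."""

    def __init__(self, grace_seconds=0.5):
        self.grace = grace_seconds
        self.lost_since = None
        self.timer_paused = False
        self.hint_visible = False

    def update(self, marker_visible, now):
        if marker_visible:
            self.lost_since = None
            self.timer_paused = False
            self.hint_visible = False
        else:
            if self.lost_since is None:
                self.lost_since = now
            self.timer_paused = True          # stop penalising the player
            if now - self.lost_since >= self.grace:
                self.hint_visible = True      # e.g. "keep the marker in view"
```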


A mobile AR cooking game that is developed as a stationary, marker-driven, vision-based AR game application [12] is introduced. The following sections describe how its embedded interactions are designed to adopt interface and seamful design guidelines for an ideal multimodal mobile AR application.

2 MOBILE AR COOKING GAME

2.1 Design Procedures

Henrysson and Billinghurst [6] suggested that although the desktop metaphor in HCI has been widely adopted across many computing platforms, it mostly supports only 2D interactions, and its associated input devices are hence designed to support a limiting two DOF (corresponding to the two dimensions of a 2D plane). This renders existing desktop interfaces unsuitable for 3D data interaction in vision-based mobile AR, because the latter can provide such affordances in a multimodal interaction context using the additional hardware features of smart mobile devices, which can now present information in three dimensions [6]. One of the key features of AR technology is that the system can, through vision-based recognition techniques, detect the distance between objects (e.g. fiducial markers) and the camera, and then provide a real sense of physical distance to users. Beyond this point, however, there is no known design process for using this in a multimodal context as an interaction feature of engagement for a mobile AR game, which may employ an interface encompassing various input methods such as physical manipulation, GUI interaction, standard I/O interaction, touch-screen interaction, etc. [13]. We have hence taken a holistic approach with these adopted principles in the design inception of a multimodal, structured mobile AR cooking game that we have developed on the Apple iPhone platform.

2.1.1 Interaction design and management

HCI is one of the most important issues when designing any real-time system, and an ideal AR system that mixes virtual information in a user-friendly way and allows users to interact naturally [14] can complement the context and user experience.
Further, multimodality features mix interactions in a common space, requiring them to be dynamic enough that the most appropriate mechanism of interaction can be interchangeably applied at the appropriate corresponding moment [13]. During conceptualization, the dish to be cooked as the challenge in the first level of our game prototype was identified as "Pan Seared Scallops with Peppers and Onions in Anchovy Oil", with its actual recipe obtained [15] as in Table 1.

Table 1. Actual recipe and game-adapted version of the recipe

Original recipe

Ingredients: Olive oil, minced anchovy fillets, scallops, red/orange pepper, red onion, minced garlic cloves, lime and lemon zest, salt and pepper, parsley sprigs.

Preparation Methods:
1. Heat the olive oil and minced anchovies in a large frying pan over medium-high heat, stirring as the oil heats to dissolve the anchovies. Once the anchovies are sizzling, add the sea scallops, and cook without moving the scallops for 2 minutes.
2. Meanwhile, toss the peppers, onion, garlic, lime zest and lemon zest in a bowl; season to taste with salt and pepper. Sprinkle the pepper mixture onto the scallops, and continue cooking a few more minutes until the scallops have browned on one side. Flip the scallops over, stir the pepper mixture, and continue cooking until the scallops have browned on the other side, 4 to 5 minutes. Garnish with parsley sprigs to serve.


Adapted/modified preparatory steps for the game level (actions are guided by respective visual text labels in the user interface)

Interaction Group 1: Preparation Steps
  Action 1:
    i. Position the mobile device's camera to make sure the 'marker' is within viewing range.
    ii. "Pan Mode" is disabled unless the view is within the established proximity range.
  Effect on Temperature/Heat: Nil
  Domain Type: Real world context; players are guided to establish spatial-game space familiarity.

Interaction Group 2: Main Cooking Tasks
  Action 2: Add "Oil" (Soft Button/2D Icon) into "Frying Pan".
    Effect: Increase. Domain Type: Pan and ingredients are 3D / passive.
  Action 3: Add "Minced Anchovies" (Soft Button/2D Icon) into "Frying Pan".
    Effect: Increase. Domain Type: Pan and ingredients are 3D / passive.
  Action 4: Add "Scallops" (Soft Button/2D Icon) into "Frying Pan".
    Effect: Increase. Domain Type: Pan and ingredients are 3D / passive.
  Action 5: i. Toggle "Pan Mode" (Soft Button/2D Icon), ii. Shake "Frying Pan" and stir contents (finger input), iii. Toggle "Pan Mode" (Soft Button/2D Icon).
    Effect: Decrease. Domain Type: Pan and ingredients become 2D / active during shaking.
  Action 6: i. Add "Pepper & Onion Mix" (Soft Button/2D Icon) into "Frying Pan". ii. Add "Lime and Lemon Mix" (Soft Button/2D Icon) into "Frying Pan".
    Effect: Increase. Domain Type: Pan and ingredients are 3D / passive.
  Action 7: i. Toggle "Pan Mode" (Soft Button/2D Icon), ii. Shake "Frying Pan" and stir contents (finger input), iii. Toggle "Pan Mode" (Soft Button/2D Icon).
    Effect: Decrease. Domain Type: Pan and ingredients become 2D / active during shaking.

Interaction Group 3: Garnishing
  Action 8: Add "Parsley Sprigs" (Soft Button/2D Icon) into "Frying Pan".
    Effect: Nil. Domain Type: Pan and ingredients are 3D / passive.

Note: The "Pan Mode" is a 'Soft Button/2D Icon' feature in the Graphic User Interface that allows toggling between the 2D and 3D modes of the pan.
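The linear-progressive groups in the table above amount to an ordered list of expected actions, each with its effect on the virtual heat. A sketch of how this could be held as data and validated against player input; the action names and the heat increment are illustrative assumptions, not values from the paper:

```python
# Adapted recipe from Table 1 as an ordered sequence of (action, heat effect).
RECIPE_STEPS = [
    ("add_oil", "increase"),
    ("add_minced_anchovies", "increase"),
    ("add_scallops", "increase"),
    ("shake_and_stir", "decrease"),
    ("add_pepper_onion_mix", "increase"),
    ("add_lime_lemon_mix", "increase"),
    ("shake_and_stir", "decrease"),
    ("add_parsley_sprigs", None),
]

def apply_step(step_index, action, heat):
    """Validate the player's action against the linear-progressive
    sequence and return (advanced, new_heat). Wrong actions make no
    progress; the +/-10 heat delta is an illustrative constant."""
    expected, effect = RECIPE_STEPS[step_index]
    if action != expected:
        return False, heat
    if effect == "increase":
        heat += 10
    elif effect == "decrease":
        heat -= 10
    return True, heat
```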

The dish's preparatory steps and actions are then considered and matched to interaction options that are presented in linear-progressive groups as in Table 1. This comparison and assignment is mainly based on identifying gesture similarities in performing such physical actions in the real world (e.g. stirring is associated with a finger swirling in a circular motion on the mobile device's touch-screen, shaking the mobile device as though it were a cooking pan, etc.) that can be supported by design principles across the multiple input channels commonly available on recent smart mobile devices (tangible interface, haptics, 6 DOF, metaphorically designed Graphic User Interfaces, and the possibility of hybrid 2D/3D interfaces and interaction spaces [16]). This is done with the intent of preserving two key aspects: the cooking process flow (for fair realism and challenge without becoming a computer simulation in itself) and a consistent interaction flow (for enjoyment and increased engagement). Two conditional parameters are established as in-game rules to facilitate this balance between a realistic gaming experience and fun and enjoyment.

Firstly, in contextual consideration, we determined that the mobile device can serve both as a provider of an augmented point of view (a chef's perspective in this case) and, at times, entirely as a frying pan. This concept of allowing the player to shift from an external perspective to an embedded agent in game space, thereby becoming part of the game environment, creates more engaging experiences for the player [10]. A "Continuous Natural User Interface" (CNUI) is therefore implemented to allow this switching of roles played by the mobile device, enabling perspective shifts through the course of the game level, and content instances are designed to be cross-domain transferable between 2D and 3D spaces using this content-centered, domain- and interaction-continuous interface [17] approach. Cooking is maintained as the task epicentre, with the domain-transferable interface game elements (frying pan and added ingredients) and actions being directly complementary to the overall interaction experience. When the mobile device is acting as a cooking pan, its physical distance to the fiducial marker is treated as the heat received by a cooking pan placed on a heated stove (Figure 1). This link motivates the player to interactively maintain the physical proximity (mobile device to marker) as a game challenge, while being an effective measure to prevent disruption in the interaction flow caused by sudden loss of visual recognition of the fiducial marker due to poor handling of the mobile device (a seamful design approach [11]).

Figure 1. Concept / inspiration of mobile AR cooking game
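The stirring gesture mentioned above (a finger swirling in a circular motion on the touch-screen) could plausibly be classified by accumulating the angle the touch path sweeps around its own centroid; a sketch under that assumption, with the one-revolution threshold being illustrative:

```python
import math

def is_stir_gesture(points, min_revolutions=1.0):
    """Sketch: classify a touch path (list of (x, y) samples) as a 'stir'
    if the finger sweeps at least min_revolutions full turns around the
    path centroid. Threshold and approach are assumptions."""
    if len(points) < 3:
        return False
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    swept = 0.0
    prev = math.atan2(points[0][1] - cy, points[0][0] - cx)
    for x, y in points[1:]:
        ang = math.atan2(y - cy, x - cx)
        delta = ang - prev
        # unwrap to (-pi, pi] so crossing the atan2 branch cut doesn't reset us
        while delta > math.pi:
            delta -= 2 * math.pi
        while delta <= -math.pi:
            delta += 2 * math.pi
        swept += delta
        prev = ang
    return abs(swept) >= min_revolutions * 2 * math.pi
```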

Secondly, we determined when and whether a user-performed action in the cooking process would contribute to the current cumulative virtual heat level, which is made into a game mechanic that affects scoring (e.g. when adding an ingredient, a pan is typically left on the stove, so the overall heat at the pan's surface should gradually increase and affect the cooking process while this action is being performed over time). In the real world, various factors may directly influence the cooking outcome, for example proper heat control, the optimal sequence of adding ingredients, and the ideal timing to stir the ingredients or shake the pan. These are balanced, made into game mechanics, and grouped into interaction groups based on the different phases of the real cooking process, so that players' reaction times in performing such measured actions impact the cooking/game progress and outcome, while the recurring domain switches [17] remain much less disruptive to the interaction flow (a seamful design approach [11]).

2.1.2 Designing a cross-domain seamful user interface

At the beginning of the level, the cooking pan starts off on the virtual stove (represented by the fiducial marker) while ingredients are to be added sequentially using a first-person view. The stove is a 2D texture set on the marker, serving as a 2D Graphic User Interface (GUI) element that is registered in 3D [16]. During this interval (Interaction Group 1 in Table 1), the frying pan and the added ingredients are passive 3D objects that offer no direct player interaction; the mobile device only augments the chef's point of view with actionable 2D user interface elements (icons) on the screen to add the ingredients (Figure 2). The layout for this phase is designed with a combination of 2D and 3D-registered GUI elements [16] (Figure 3).

Figure 2. Virtual 3D pan on a physical AR marker

Exploiting the advantages of the additional DOF available on the mobile device to increase the intuitiveness of the game play, both the mobile device's in-built physics engine and accelerometer are also featured as interaction mechanisms in the game. After a predefined progression of performing the correct steps of adding ingredients, the player is prompted to 'hold up the pan' by toggling the in-game "Pan Mode" switch (Figure 3). This is implemented in the view that, after the initial steps of preparing the frying pan on the heated stove, the heat intensity in the pan can only be lowered if it is taken away from the heat source (it is intentionally designed that the stove's heat cannot be directly adjusted or controlled). The frying pan and its contents (added ingredients) now become active 2D objects transferred to the 2D GUI on the screen of the mobile device, while the virtual stove remains set on the physical marker. The mobile device is now treated as the pan itself: it can be shaken, using the in-built accelerometer for motion detection (tangible input source), and stirred, using finger input on the touch screen, both as natural gesture inputs to evenly distribute the heat across the added ingredients. This direct interaction is thereby oriented on real-world behaviour [17]. The frying pan is finally placed back onto the stove as another cross-domain instance (toggling "Pan Mode" again on the on-screen 2D GUI) for the last cooking steps to be completed. In comparison with traditional cooking video games (e.g. "Cooking Mama" on the Nintendo DS™ portable console platform), where the player is unable to perform such perspective shifts in temporal spaces, the incorporation of this feature provides a natural and unique gaming experience.
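The "Pan Mode" role switch described above can be sketched as a small state machine that also gates which gestures are meaningful in each domain. All names here are hypothetical, not the game's actual API:

```python
class PanModeController:
    """Sketch of the cross-domain 'Pan Mode' toggle: the pan and its
    ingredients are either passive 3D objects registered on the marker,
    or active 2D objects on the device screen, never both at once."""

    def __init__(self):
        self.mode = "3D_on_marker"

    def toggle(self):
        if self.mode == "3D_on_marker":
            self.mode = "2D_on_screen"    # device now acts as the pan itself
        else:
            self.mode = "3D_on_marker"    # pan placed back on the stove
        return self.mode

    def accepts(self, gesture):
        # Shake/stir only while holding the pan; adding ingredients only
        # while the pan sits on the marker-registered stove.
        if self.mode == "2D_on_screen":
            return gesture in ("shake", "stir")
        return gesture == "add_ingredient"
```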


Figure 3. Main user interface of mobile AR cooking game with hybrid 2D/3D elements

It is recognised that while switching/crossing between domains for the content (frying pan and added ingredients) in this mobile AR game, the interaction flow can be broken if not properly managed. To better preserve it using a seamful design approach [11], transitions in the form of animated sequences of the game content (ingredients) are introduced to present the switching of pan modes and establish continuity (such that the domain is continuous, with the content transferable to and from real-world instances [17]); these transitions have no effect on the in-game countdown timer. To support these sequences, the domain versions of the game assets and interface elements that need to be created ('2D only' or 'both 2D and 3D') are identified so that designers can produce the corresponding variants for use in the game level (Figure 4).

Figure 5. Two versions of the main GUI image overlay for each of the designed domain instances: (left) when 'Pan' is 3D / on marker; (right) when 'Pan' is 2D / onscreen on the mobile device

2.2 Game Scenario

The basic mechanism of the game is that the player needs to follow the sequential indications to finish several cooking steps within the stipulated time limit in order to advance to the next stage (Figure 3). The heat control of the pan and the cooking time are two essential factors that affect the cooking process. Players control the heat by manipulating the physical distance of the mobile device (camera) to the marker. Distance and time are thus integrated to provide a more realistic cooking experience to the player, and player enjoyment and engagement are enhanced through integrating these two factors: physical distance and timing.
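One plausible way the two factors (timing and heat) might combine into a per-step score is sketched below. The weights, bonus formula and ranges are illustrative assumptions, not the paper's actual scoring rule:

```python
def step_score(elapsed, time_limit, heat, heat_range):
    """Hypothetical per-step score: half from finishing quickly, half
    from keeping the virtual heat inside the required band."""
    if elapsed > time_limit:
        return 0                           # step failed: out of time
    time_bonus = 1.0 - elapsed / time_limit
    lo, hi = heat_range
    heat_bonus = 1.0 if lo <= heat <= hi else 0.0
    return round(100 * (0.5 * time_bonus + 0.5 * heat_bonus))
```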

Figure 4. Development screenshots of game asset variants: (left) 2D texture version / flat 2D images; (right) 3D geometry version / 3D objects

Given the limited screen estate of mobile devices in task-focused domains, any user interface elements presented must bear direct information relevance to the user's current task. In switching domains between the 2D and 3D modes, a set of on-screen 2D GUI overlays accompanies each of the two modes, such that icons that are not needed or become redundant after a domain switch are not shown (e.g. when holding up the pan, the player is not expected to add ingredients, as the interaction design only requires shake/stir actions; hence soft buttons/2D icons for adding ingredients are not displayed during this interval). Thereby, as proposed by Vitzthum [18], only the information required to solve the current task (the interaction(s) that have been designed) is shown.
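Applied this way, Vitzthum's guideline amounts to filtering the icon set by the current pan mode; a sketch with hypothetical icon names:

```python
def visible_icons(pan_mode_2d):
    """Sketch: show only the 2D GUI icons relevant to the current task.
    Icon names are hypothetical placeholders."""
    common = ["heat_bar", "timer", "pan_mode_toggle"]
    if pan_mode_2d:
        return common                      # shaking/stirring: no add buttons
    return common + ["oil", "anchovies", "scallops",
                     "pepper_onion_mix", "lime_lemon_mix", "parsley"]
```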


2.3 User's Operating Flow

2.3.1 Mastering the "Heat"

One of the most important factors determining cooking success is mastery of proper heat control during the process. In the real world, there are normally two ways to do this: directly controlling the heat of the stove, or adjusting the distance between the pan and the stove. In this mobile AR cooking game, a natural 3D interaction feature is designed to augment the experience using the latter approach. The virtual pan is presented as being held in hand (i.e. the mobile phone is regarded as the pan itself, Figure 1). As the system is able to calculate the proximity between the physical marker and the mobile phone through AR technology, players can manually manipulate this distance to simulate heat control (the closer the mobile phone is to the physical marker, the stronger the intensity of the heat received), which maps naturally to a typical cooking process in the real world. A heat control status bar is provided in the game's user interface to indicate the virtual heat intensity, providing instant feedback while the player interactively manipulates the physical distance (Figure 5). In the game scenario, the player is at times required to keep the heat intensity level within a specified range. As a result, they need to manipulate and maintain the physical distance during the said interim, bearing similarity to the real-world counterpart of a pan over fire during cooking.

Figure 6. Distance-driven interaction to master the heat (nearer = more heat)
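The distance-to-heat mapping of Section 2.3.1 can be sketched as a clamped linear function; the calibration constants d_min and d_max are assumptions, not values from the paper:

```python
def heat_intensity(distance, d_min=5.0, d_max=40.0):
    """Sketch: map camera-to-marker distance to a 0..1 virtual heat level
    (nearer = hotter), clamped at both ends. d_min/d_max are assumed
    calibration constants."""
    if distance <= d_min:
        return 1.0
    if distance >= d_max:
        return 0.0
    return (d_max - distance) / (d_max - d_min)

def heat_in_range(distance, lo, hi):
    """Instant feedback for the status bar: is the player holding the
    device inside the required heat band?"""
    return lo <= heat_intensity(distance) <= hi
```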

2.3.2 Shaking the "Pan"

To mix the added ingredients together and make sure they are uniformly cooked, it is sometimes necessary to shake the pan in the real cooking process. Shaking the virtual pan in 3D space is designed to map naturally to this real-world operation. This 3D interaction can be implemented in two approaches. The first uses AR technology: since AR technology can detect the distance between the physical marker and the mobile phone, the rate of movement of the mobile phone can be calculated. A threshold value is predefined, and only when the movement speed reaches this value are the corresponding actions considered as shaking in the game. Visual feedback is instantly provided to depict the movements of the ingredients in the virtual pan caused by shaking actions. The second approach uses the in-built physics engine of the mobile phone. The in-built accelerometer is able to detect the shaking action more accurately than the AR technology described earlier, but with one disadvantage: as the force of gravity is a vital component in a physics engine, ingredients added to the virtual pan will automatically be drawn downwards (as induced by gravity) when the mobile phone is held in a vertically upright position without any shaking input by the user. This may cause confusion and would likely render the game playable only when the mobile phone is held horizontally. For this reason, AR technology is primarily used to implement the shaking of the "pan" in this mobile AR game (a seamful design approach [11]).
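The vision-based shake detector described above, which thresholds the movement rate estimated from successive marker-distance samples, might look like the following. The threshold value and sampling scheme are illustrative assumptions:

```python
def is_shake(distances, timestamps, speed_threshold=25.0):
    """Sketch of the vision-based shake detector: estimate the device's
    movement rate from successive camera-to-marker distance samples and
    treat it as a shake only above a predefined threshold (the threshold
    here is illustrative, not from the paper)."""
    for i in range(1, len(distances)):
        dt = timestamps[i] - timestamps[i - 1]
        if dt <= 0:
            continue                      # skip duplicate/out-of-order samples
        speed = abs(distances[i] - distances[i - 1]) / dt
        if speed >= speed_threshold:
            return True
    return False
```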

3 CONCLUSION

Inspired by design principles, guidelines and suggestions for AR interactions and user interfaces, interactive features for a mobile AR game are designed using intuitive mappings that exploit the various input channels of a multimodal mobile device. Various 3D interactions are considered and implemented through the integrative use of mobile AR technology in order to richly simulate cooking procedures, while revolving closely around the applied concept of 'designing around seams' in the seamful design approach [11]. These 3D interactions are naturally mapped to real cooking mechanisms in order to provide unique gaming experiences for increased engagement and to allow for player positioning (points of view). Game interactivity is enriched not only through the application of various interactions in 3D spaces, but also by the provision of proper informational feedback and bidirectional cross-domain transfers of game content, thereby enhancing mobile gaming experiences based on the GameFlow theory [3] in a domain-continuous context [17].

A mobile game is usually played in a mobile context, and players now additionally have the flexibility and opportunity to manipulate the collective multimodal input channels of smart mobile devices (AR marker tracking, touch-screen display, GUI interaction, accelerometer-linked in-game physics, etc.) for innovative interaction experiences. Such manipulations in 3D spaces, when used in a game context, are collectively one of the unique features available on smart mobile platforms. Mobile AR, as one of the more popular technologies in which to embed 3D interactions, will only gradually mature while mobile phones concurrently become increasingly powerful. This provides an ideal opportunity to design and fuse more intricate 3D interactions into mobile games to enhance gaming experiences. The design concept of mapping game interactions to both real control mechanisms and AR-based 3D interactions, and of working around 'seams' in system design for a mobile AR game, can be a useful model to look at. Future work will take an immediate look at establishing usability studies and continue to investigate other possible combinations of interactive choices to be used as design strategies in mobile AR game development. We will also look at how this can be applied in a collaborative (multiplayer) environment so as to further enhance mobile AR gaming, engagement and interaction experiences.

ACKNOWLEDGEMENTS

This research is carried out under NRF Project No. WBS R-705000-025-271, partially funded by a grant from the National Research Foundation administered by the Ministry of Education of Singapore. Special thanks to the student intern from the School of Design, Singapore Polytechnic, Mr Lee Yi Fan, for contributing to the design of game assets for this project, and to Mr Chengyi Liu for his help in the game design.

REFERENCES

[1] T. Grodal. Video games and the pleasures of control. In D. Zillmann & P. Vorderer (Eds.), Media Entertainment: The Psychology of Its Appeal, pages 197-212. Mahwah, NJ: Lawrence Erlbaum Associates, 2000.
[2] P. Vorderer. Interactive entertainment and beyond. In D. Zillmann & P. Vorderer (Eds.), Media Entertainment: The Psychology of Its Appeal, pages 21-36. Mahwah, NJ: Lawrence Erlbaum Associates, 2000.
[3] P. Sweetser and P. Wyeth. GameFlow: a model for evaluating player enjoyment in games. ACM Computers in Entertainment, 3(3), July 2005.
[4] R. T. Azuma. A survey of augmented reality. Presence: Teleoperators and Virtual Environments, pages 355-385, August 1997.
[5] C. Sandor and G. Klinker. A rapid prototyping software infrastructure for user interfaces in ubiquitous augmented reality. Personal and Ubiquitous Computing, 9(3), pages 169-185, May 2005.
[6] A. Henrysson and M. Billinghurst. Using a mobile phone for 6 DOF mesh editing. In Proceedings of the 7th ACM SIGCHI New Zealand Chapter's International Conference on Computer-Human Interaction: Design Centered HCI (CHINZ '07), Hamilton, New Zealand, July 2-4, 2007, vol. 254, ACM, New York, NY, pages 9-16, 2007.
[7] S. DiVerdi, D. Nurmi and T. Hollerer. A framework for generic inter-application interaction for 3D AR environments. In Augmented Reality Toolkit Workshop 2003, IEEE Computer Society, pages 86-93, 2003.
[8] H. Kato, M. Billinghurst, I. Poupyrev, N. Tetsutani and K. Tachibana. Tangible augmented reality for human computer interaction. In Proceedings of Nicograph 2001, Nagoya, Japan, 2001.
[9] F. Zhou, H. B.-L. Duh and M. Billinghurst. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR 2008), pages 193-202, 15-18 September 2008.
[10] M. D. Dickey. Engaging by design: How engagement strategies in popular computer and video games can inform instructional design. Educational Technology Research & Development, 53(2), pages 67-83, 2005.
[11] M. Chalmers and I. MacColl. Seamful and seamless design in ubiquitous computing. In Proceedings of Workshop At the Crossroads: The Interaction of HCI and Systems Issues in UbiComp, 2003.
[12] A. Phillips. Games in AR: Types and technologies. In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality 2009 (ISMAR 2009), 2009.
[13] F. Liarokapis and R. M. Newman. Design experiences of multimodal mixed reality interfaces. In Proceedings of the 25th Annual ACM International Conference on Design of Communication (SIGDOC '07), El Paso, Texas, USA, October 22-24, 2007, ACM, New York, NY, pages 34-41, 2007.
[14] F. Liarokapis, I. Greatbatch, D. Mountain, A. Gunesh, V. Brujic-Okretic and J. Raper. Mobile augmented reality techniques for geovisualisation. In Proceedings of the Ninth International Conference on Information Visualisation (IV 2005), July 6-8, 2005, IEEE Computer Society, Washington, DC, pages 745-751, 2005.
[15] R. Rinehart. Pan Seared Scallops with Pepper and Onions in Anchovy Oil. Retrieved January 17, 2010, from Allrecipes.com: http://allrecipes.com/Recipe/Pan-Seared-Scallops-with-Pepper-and-Onions-in-Anchovy-Oil/Detail.aspx
[16] C. Geiger, L. Oppermann and C. Reimann. 3D-registered interaction-surfaces in augmented reality space. In Augmented Reality Toolkit Workshop, IEEE International, pages 5-13, 2003.
[17] N. Petersen and D. Stricker. Continuous natural user interface: Reducing the gap between real and digital world. In Proceedings of the 8th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2009), pages 23-26, 2009.
[18] A. Vitzthum. SSIML/AR: A visual language for the abstract specification of augmented reality user interfaces. In Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI 2006), pages 135-142, 2006.
