Live Solar System (LSS): Evaluation of an Augmented Reality Book-Based Educational Tool

Aw Kien Sin
Faculty of Information Science and Technology
Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia
[email protected]

Halimah Badioze Zaman
Faculty of Information Science and Technology
Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia
[email protected]

Abstract— Live Solar System (LSS) is an Augmented Reality book-based educational tool for learning Astronomy. Augmented Reality (AR) has its own potential in the education field. In this paper, we review Tangible Augmented Reality (TAR), a combination of the Tangible User Interface and the AR interface. By applying TAR, LSS provides intuitive interaction that gives users a new learning experience. A user study was conducted to test the usability of LSS based on three constructs: ease of use, learnability and effectiveness. The findings of the study showed that LSS is easy to use and easy to learn, and that it also helped users learn Astronomy.

Keywords- Augmented Reality; Tangible User Interface; Tangible Augmented Reality; User study

I. INTRODUCTION

In recent years, Augmented Reality (AR) has increasingly become an important field of research. This trend can be seen easily when one searches for AR-related publications in the ACM Digital Library: a keyword search for Augmented Reality returned only one publication record in 1992, but 513 records in 2009. In addition, several international conferences on AR technology have been held since the end of the 1990s, such as the International Workshop on Augmented Reality (IWAR), the International Symposium on Augmented Reality (ISAR) and the International Symposium on Mixed and Augmented Reality (ISMAR). AR superimposes virtual objects on the real environment by registering them in 3D and giving real-time interactivity to users [1][2]. In 1965, Ivan Sutherland created the first AR system by building the first Head-Mounted Display (HMD), which could display a simple wire-frame cube overlaid on the real world [3]. In 1990, the term AR was coined at Boeing during a project intended to help workers assemble cables in aircraft [4]. Figure 1 shows an example of AR, which blends a virtual 3-dimensional object into the real environment.

AR is a variation of Virtual Reality (VR) [1]. Therefore, both share the same fundamental elements, namely virtual objects, real-time response and visual equipment [5]. However, they differ in a few aspects. AR only superimposes virtual objects on real objects, so the real environment can still be seen; in VR, the real environment is totally replaced by the virtual environment, as shown in Figure 1. Hence, VR limits user activities to a room-sized area, because users cannot see the obstacles around them, whereas outdoor activities can be carried out using AR, because the real environment is still present [6]. VR and AR also differ markedly in the required depth of immersion [7]. The main objective of VR is to design and develop a totally immersive artificial environment, which requires a highly realistic virtual environment, whereas this is not a necessary goal in AR [5]. In the rest of the paper, we first review the background of AR and Tangible Augmented Reality (TAR). In Section III, the AR application Live Solar System (LSS) is presented; the findings of the usability testing of LSS are presented in Section IV; and finally, the paper is concluded in Section V.


Figure 1. An AR example showing the structure of the Sun in 3D.

II. BACKGROUND

AR is able to enrich educational benefits by supporting seamless interaction with both the real and virtual environments, using a tangible interface metaphor for object manipulation, and offering a smooth transition between reality and virtuality [8]. Shelton and Hedley [9] explored the use of AR in teaching undergraduates about the earth-sun relationship in terms of axial tilt and solstices, and found that AR is useful in teaching subjects that students cannot possibly experience first hand in the real world. Moreover, AR can clearly show spatial concepts, temporal concepts and the contextual relationships between real and virtual objects, and all these factors enable AR to be a powerful educational tool [10].

The Tangible User Interface (TUI) has been applied in many AR applications. TUI was introduced by Ishii and Ullmer in 1997, who defined it as augmenting the real physical world by coupling digital information to everyday physical objects [11]. TUI provides physical interaction by turning physical objects into input and output devices for the computer interface [12]. Physical interaction provides direct manipulability and an intuitive understanding to users [13]. For instance, it is very natural for humans to pick up, rotate and place a physical object; hence, when physical objects are used as input or output devices, users do not need to learn much about how to manipulate them. TUI gives intuitive and seamless interaction with digital and physical objects, because it does not require any special-purpose input devices [14][15]. Fitzmaurice and Buxton [16] and Patten and Ishii [17], in their respective research, showed that TUI outperformed standard mouse- and keyboard-based interfaces [18].

TUI offers seamless interaction, but it suffers from a spatial gap: a spatial gap exists when the chosen physical object does not match the desired functionality in the computer interface, and TUI is limited in dynamically changing an object's physical properties [19]. On the other hand, the AR interface supports a spatially seamless workspace, but it has an interaction gap. Due to the complementary nature of AR and TUI, Billinghurst et al. [20] proposed a new interface known as Tangible Augmented Reality (TAR). TAR combines the Tangible User Interface (TUI) with the Augmented Reality interface [20][21]. With TAR, each virtual object is registered to a physical object, and users interact with the virtual objects by manipulating the corresponding tangible objects [20].

Augmented Chemistry [22] is an example of the implementation of AR and TAR in science learning. It is a workbench consisting of a table and a rear-projection screen, and it helps users understand the structure of molecules and atoms by showing the structure in 3D; users use a booklet, a gripper and a cube to interact with the molecule and atom models. Construct3D [23] is a three-dimensional geometric construction tool used to support the learning of high-school mathematics and geometry; users can build virtual 3D geometry using a Head-Mounted Display (HMD) and the Personal Interaction Panel. In Mechanics and Physics education, Physics Playground [24] allows students to create, destroy, modify and interact with different kinds of virtual models; students learn the effects of force, counterforce, speed and velocity through virtual physics experiments. In storytelling, MagicBook [25] allows users to read a book and see the virtual objects through a handheld display (HHD); without any AR display, users can still read the text, look at the pictures and flip the pages. In children's education, 'The Book of Colours' [26] uses AR to explain the basic theory and concepts of colours through an HMD; children can interact with and observe visual feedback from the 3D virtual character. AR is applied not only in formal education but also in informal education.

The Orlando Science Center in the US uses AR to show sea creatures: visitors can navigate a rover through an ocean environment to explore reptiles and fishes in the DinoDigs exhibition hall [27]. These applications show that AR and TAR are already applied in various fields, especially education. Hence, AR combined with TAR is able to offer users intuitive and tangible interaction for learning purposes.

III. LIVE SOLAR SYSTEM

Live Solar System (LSS) is an AR-based educational tool to help students learn Astronomy. Various multimedia elements, such as video, graphics, text and 3D objects, were therefore integrated into LSS to achieve this purpose. The LSS modules were designed and developed based on the Form 3 science syllabus, specifically on the topic 'Stars and Galaxies'. LSS has three modules covering the topics 'Solar System', 'The Sun' and 'The Stars'. LSS allows students to explore the AR solar system freely, so that they can understand, in a more concrete manner, the eight planets of our solar system, and they can also learn about the characteristics of the planets. For example, students can learn about the sun and its phenomena through the combination of physical and digital learning materials.

By applying the TAR concept in LSS, mouse and keyboard devices are not involved during interaction; instead, LSS offers intuitive, tangible interaction to users. Cubes were chosen as the physical objects used during these interactions, because the cube is a familiar physical object in our daily lives and everyone knows how to manipulate one. Actions like 'place', 'pick up', 'throw', 'squeeze', 'press', 'rotate' and 'hold' are commonly performed by humans [28]. Thus, some of these natural human behaviours are used in the LSS interaction, as shown in Figure 2. By manipulating the cube with natural actions, users are able to interact effectively while exploring LSS. Figure 3 shows how a cube is used to change the virtual solar system from normal mode to gravity mode.

To manipulate the movie in LSS, a card with two markers on it is used. The movie is loaded on one of the markers, and a button to play or stop the movie is displayed on the other marker. This movie interaction follows the metaphor of switching a television or monitor on and off in daily life. Figure 4 shows how the user starts the movie by pressing the 'play' button; at the same time, the 'play' button changes to a 'stop' button, which allows the user to stop the movie by pressing it.

Figure 2. The selected natural actions in LSS interaction.

Figure 3. Changing mode by rotating a cube in hand.

Figure 4. How a user plays and stops a movie in LSS.
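To make the interaction mapping concrete, the sketch below is a minimal, purely illustrative model of the tangible interactions shown in Figures 3 and 4: rotating the cube toggles the solar system between normal and gravity mode, and pressing the button marker toggles the movie between play and stop. This is not the authors' implementation; all class, method and state names are hypothetical.

```python
# Illustrative sketch only (not the authors' implementation): a hypothetical
# model of how the tangible interactions in Figures 3 and 4 could be handled.

class LssState:
    """Interaction state driven by the tangible objects (cube and cards)."""

    def __init__(self):
        self.mode = "normal"        # virtual solar system: "normal" or "gravity" mode
        self.movie_playing = False

    def on_cube_rotated(self):
        # Rotating the cube in hand switches the solar system between
        # normal mode and gravity mode (cf. Figure 3).
        self.mode = "gravity" if self.mode == "normal" else "normal"

    def on_button_pressed(self):
        # Pressing the button marker starts or stops the movie; the button
        # label flips between 'play' and 'stop' (cf. Figure 4).
        self.movie_playing = not self.movie_playing

    def button_label(self):
        return "stop" if self.movie_playing else "play"


if __name__ == "__main__":
    state = LssState()
    state.on_cube_rotated()      # user rotates the cube -> gravity mode
    state.on_button_pressed()    # user presses 'play' -> movie starts
    print(state.mode, state.button_label())  # prints: gravity stop
```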

Figure 5. Live Solar System workflow.

Figure 5 shows that LSS spans both the physical and the digital world. The implementation uses a webcam to capture the real-world view, which is then passed to the marker-detection process. Once the ID of a marker has been determined, the associated digital content, such as a 3D object, is loaded. The captured real-world view is then combined with the digital content and presented to the user through either the HMD or the monitor. In LSS, the cube is used as the input device. Figure 6 shows the hardware used in LSS: the HMD, a webcam, the cubes and the cards with markers on them.
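The workflow in Figure 5 can be summarised as a capture-detect-composite-display loop. The sketch below assumes a webcam accessed through OpenCV; the marker tracker and renderer used in LSS are not named in the paper, so detect_markers() and render_overlay() are hypothetical placeholders rather than real library calls.

```python
# Sketch of an LSS-style pipeline (Figure 5): capture a frame, detect markers,
# load the content registered to each marker ID, composite the virtual content
# over the real view, and show the result on the HMD or monitor.
# Assumes OpenCV for capture/display; detect_markers() and render_overlay()
# are placeholders for the actual tracking and 3D rendering code.
import cv2

CONTENT_BY_MARKER_ID = {           # hypothetical marker-ID -> content mapping
    1: "3D solar system model",
    2: "Sun structure video",
}

def detect_markers(frame):
    """Placeholder: return a list of (marker_id, pose) for markers in the frame."""
    return []                      # a real fiducial-marker tracker goes here

def render_overlay(frame, marker_id, pose):
    """Placeholder: draw the content associated with marker_id onto the frame."""
    label = CONTENT_BY_MARKER_ID.get(marker_id, "unknown content")
    cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    return frame

def main():
    cap = cv2.VideoCapture(0)      # webcam captures the real-world view
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        for marker_id, pose in detect_markers(frame):
            frame = render_overlay(frame, marker_id, pose)
        cv2.imshow("LSS view (HMD or monitor)", frame)   # output to the user
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```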

Figure 6. LSS hardware components.

IV. RESULTS OF USABILITY TESTING

We conducted a usability test of LSS based on a case study in a secondary school in Malaysia. The tests focused on the constructs 'ease of use', 'learnability' and 'effectiveness' of LSS. A total of 30 Form Four students were involved in this test, none of whom had any prior experience with AR applications. The evaluation consisted of two phases. In the first phase, 40 students were given a pre-test of their previous knowledge of Astronomy; the pre-test comprised 15 questions and was conducted one month before the second phase. In the second phase, the 40 students were divided into two groups of 20 students each: the experimental group and the control group. Students from the experimental group used LSS with the HMD to learn the topic on Astronomy, whereas the control group used conventional classroom learning, such as reading the textbook and using teaching aids. A post-test and a usability questionnaire were administered to the students from the experimental group after using LSS. The questionnaire comprised a total of 18 questions focused on the ease of learning and ease of use of LSS.

A. Construct: Effectiveness

In this case study, the effectiveness of LSS was defined as the capability of LSS to help students learn. We measured effectiveness by comparing the results of the control and experimental groups. Table I shows the results of both groups. The experimental group's performance increased by 46% after using LSS, compared with only 17% for the control group. This result shows that LSS is able to help users learn Astronomy using AR technology.

TABLE I. COMPARISON OF EXPERIMENTAL AND CONTROL GROUP RESULTS

Group          Average pre-test result (%)   Average post-test result (%)   Difference between pre- and post-test (%)
Experimental   30.25                         76.25                          46
Control        24.25                         41.25                          17
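As a quick worked check, the effectiveness measure above is simply the mean post-test score minus the mean pre-test score for each group, using the group means as reconstructed in Table I:

```python
# Worked check of the gains reported in Table I: the effectiveness measure is
# the mean post-test score minus the mean pre-test score for each group.
group_means = {
    "experimental": {"pre": 30.25, "post": 76.25},
    "control":      {"pre": 24.25, "post": 41.25},
}

for group, scores in group_means.items():
    gain = scores["post"] - scores["pre"]
    print(f"{group}: {gain:.0f} percentage points")   # experimental: 46, control: 17
```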

B. Construct: Ease of Use

The construct 'ease of use' was defined as the ease with which users could use LSS as a learning experience. The instrument used to measure this construct was the questionnaire, administered to the sample of 20 students from the experimental group after they had used LSS. Ten questions in the questionnaire assessed this construct, all in the form of a five-point Likert scale (from strongly agree to strongly disagree). Figure 7 shows that most of the students gave a positive response to the 'ease of use' construct of the LSS prototype: 46% and 42% of students answered 'agree' and 'strongly agree' respectively. Table II shows the distribution of students' responses to each question in the questionnaire; most responses were at the level of 'strongly agree' or 'agree'. These findings indicate that the Tangible Augmented Reality (TAR) interaction approach applied in LSS has made LSS easy for users to use.

Figure 7. LSS: construct 'ease of use'.

TABLE II. DISTRIBUTION OF STUDENTS' RESPONSES (EASE OF USE)

Question   Strongly disagree   Disagree   Slightly disagree   Agree      Strongly agree
Q1         0 (0%)              0 (0%)     3 (15%)             13 (65%)   4 (20%)
Q2         0 (0%)              0 (0%)     4 (20%)             13 (65%)   3 (15%)
Q3         0 (0%)              0 (0%)     5 (25%)             13 (65%)   2 (10%)
Q4         0 (0%)              0 (0%)     1 (5%)              8 (40%)    11 (55%)
Q5         0 (0%)              0 (0%)     2 (10%)             5 (25%)    13 (65%)
Q6         0 (0%)              0 (0%)     1 (5%)              4 (20%)    15 (75%)
Q7         0 (0%)              0 (0%)     1 (5%)              6 (30%)    13 (65%)
Q8         0 (0%)              0 (0%)     2 (10%)             11 (55%)   7 (35%)
Q9         0 (0%)              0 (0%)     3 (15%)             7 (35%)    9 (45%)
Q10        0 (0%)              0 (0%)     2 (10%)             12 (60%)   6 (30%)
Total      0                   0          24                  92         83

C. Construct: Learnability

The learnability construct was defined as the ease of learning Astronomy using LSS. The students were required to complete a task for this purpose, and a task-completion-time test and the questionnaire were used to measure this construct. Eight questions in the questionnaire assessed learnability, all in the form of a five-point Likert scale (from strongly agree to strongly disagree). The sample for this construct again comprised the 20 students from the experimental group, who filled in the questionnaire after using the LSS prototype. Figure 8 shows that most of the students gave a positive response to the learnability construct of LSS, and Table III shows that most of the students' responses fell in the categories 'strongly agree' and 'agree'. These findings indicate that users found it easy to learn about Astronomy through LSS.

Figure 8. LSS: construct 'learnability'.

TABLE III. DISTRIBUTION OF STUDENTS' RESPONSES (LEARNABILITY)

Question   Strongly disagree   Disagree   Slightly disagree   Agree      Strongly agree
S1         0 (0%)              0 (0%)     1 (5%)              5 (25%)    14 (70%)
S2         0 (0%)              0 (0%)     0 (0%)              7 (35%)    13 (65%)
S3         0 (0%)              0 (0%)     1 (5%)              7 (35%)    12 (60%)
S4         0 (0%)              0 (0%)     2 (10%)             6 (30%)    12 (60%)
S5         0 (0%)              0 (0%)     1 (5%)              7 (35%)    12 (60%)
S6         0 (0%)              0 (0%)     1 (5%)              9 (45%)    10 (50%)
S7         0 (0%)              0 (0%)     2 (10%)             8 (40%)    10 (50%)
S8         0 (0%)              0 (0%)     2 (10%)             6 (30%)    12 (60%)
Total      0                   0          10                  55         95
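The aggregate percentages quoted for the 'ease of use' construct (46% 'agree' and about 42% 'strongly agree') follow from dividing the column totals of Table II by the total number of responses (10 questions x 20 students); the same calculation can be applied to the learnability totals in Table III. A short illustrative check:

```python
# Check of the aggregate response percentages for the questionnaire constructs:
# each percentage is a column total divided by (number of questions x 20 students).
# Column totals are taken from Tables II and III.
constructs = {
    "ease of use":  {"questions": 10, "agree": 92, "strongly agree": 83},
    "learnability": {"questions": 8,  "agree": 55, "strongly agree": 95},
}

for name, counts in constructs.items():
    total_responses = counts["questions"] * 20        # 20 respondents per question
    for level in ("agree", "strongly agree"):
        pct = 100 * counts[level] / total_responses
        print(f"{name} - {level}: {pct:.1f}%")         # e.g. ease of use - agree: 46.0%
```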

The task-completion-time test was also used to measure the learnability construct. Ten students were involved as samples for this test, none of whom had any prior experience with augmented reality (AR) applications. They were divided into two groups of five. The control group was given a detailed demonstration of how to use LSS, whereas the experimental group was given only an explanation of how to use LSS by the conventional method of talking and using a simple teaching aid. Each student in both groups was given 5 minutes to use LSS before the actual test was conducted. A task was then given to both groups, and the task completion time was recorded for comparison between the groups. Table IV shows the required tasks given to the samples.

TABLE IV. REQUIRED TASKS

1) View the virtual solar system by using the 3D Solar System Card.
2) Pick the planet Jupiter from the virtual solar system with the Pointer Cube.
3) Rotate the selected planet Jupiter with the Pointer Cube.
4) Delete the selected planet Jupiter from the Pointer Cube by using the Trash Bin Card.
5) Pick the planet Saturn from the virtual solar system with the Pointer Cube.
6) Get the selected planet Saturn's information by using the Information Centre Card.
7) Read out the planet Saturn's diameter.

Table V shows the task completion time results for both the control and experimental groups. The average task completion time was 119.4 seconds for the control group and 128.4 seconds for the experimental group, as shown in Figure 9. Although students from the control group completed the task faster than students from the experimental group, the difference between the two groups was only 9 seconds. Thus, this finding indicates that the experimental group was able to complete the task almost as fast as the control group. It also indicates that the learnability of LSS is high, because the interaction approach used in LSS is similar to the way students interact with objects in their daily lives.

Figure 9. Average task completion time of the control and experimental groups.

TABLE V. RESULT OF TASK COMPLETION TIME FOR THE CONTROL AND EXPERIMENTAL GROUPS

Group          Student    Time (seconds)
Control        A1         140
               A2         130
               A3         105
               A4         115
               A5         107
               Average    119.4
Experimental   B1         110
               B2         140
               B3         127
               B4         135
               B5         130
               Average    128.4
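The averages in Table V are plain means of the five completion times per group; the 9-second gap reported above can be checked directly:

```python
# Check of the average task completion times in Table V and the 9-second gap.
controlled = [140, 130, 105, 115, 107]     # students A1-A5 (seconds)
experimental = [110, 140, 127, 135, 130]   # students B1-B5 (seconds)

avg_controlled = sum(controlled) / len(controlled)          # 119.4 s
avg_experimental = sum(experimental) / len(experimental)    # 128.4 s
print(f"control: {avg_controlled:.1f} s, experimental: {avg_experimental:.1f} s, "
      f"difference: {avg_experimental - avg_controlled:.1f} s")
```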

V. CONCLUSION

In this paper we have presented an Augmented Reality based educational tool, the Live Solar System (LSS). LSS was designed and developed to help students learn Astronomy. The combination of the Augmented Reality (AR) interface and the Tangible User Interface (TUI) provides seamless interaction to users. By applying Tangible Augmented Reality (TAR), LSS was easy to learn from and easy to use, as indicated by the findings of the usability testing conducted. Traditional devices like the keyboard and mouse were no longer used in the LSS prototype; instead, physical objects like cubes and cards were used as input devices. Users used natural interactions like 'rotating', 'picking up', 'placing' and 'holding' to manipulate the physical objects. This natural manipulation approach is similar to the natural and intuitive way users interact with objects in their everyday lives, which made interaction between users and LSS easy. This new approach to learning gave the students a more engaging learning experience.

REFERENCES

[1] R. Azuma, "A Survey of Augmented Reality", Presence: Teleoperators and Virtual Environments, 6(4), 1997, pp. 355-385.
[2] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, B. MacIntyre, "Recent Advances in Augmented Reality", IEEE Computer Graphics and Applications, 2001.
[3] I. Sutherland, "The Ultimate Display", in Proceedings of the International Federation of Information Processing Congress, 1965, vol. 2, pp. 506-508.
[4] W. Barfield and T. Caudell, "Fundamentals of Wearable Computers and Augmented Reality", pp. 447-468.
[5] S.P. Jong, "Augmented Reality Introduction", lecture slides, Department of Computer Science and Engineering, University of Incheon, 2005.
[6] Y.M. Jung and S.C. Jong, "The Virtuality and Reality of Augmented Reality", Journal of Multimedia, 2(1), pp. 32-37, 2007.
[7] P. Milgram, H. Takemura, A. Utsumi, F. Kishino, "Augmented Reality: A Class of Displays on the Reality-Virtuality Continuum", Proceedings of Telemanipulator and Telepresence Technologies, 1994.
[8] M. Billinghurst, "Augmented Reality in Education", 2003. http://www.newhorizons.org/strategies/technology/billinghurst.htm [21 December 2009]
[9] B. Shelton and N. Hedley, "Using augmented reality for teaching earth-sun relationships to undergraduate geography students", The 1st IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany, 2002.
[10] E. Woods, M. Billinghurst, G. Aldridge, B. Garrie, "Augmenting the Science Centre and Museum Experience", Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, 2004, pp. 230-236.
[11] H. Ishii and B. Ullmer, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms", Conference on Human Factors in Computing Systems (CHI '97), 1997, pp. 234-241.
[12] M.J. Kim and M.L. Maher, "Comparison of Designers Using a Tangible User Interface and a Graphical User Interface and the Impact on Spatial Cognition", Proceedings of the International Workshop on Human Behaviour in Designing, 2005, pp. 81-94.
[13] Q. Wang, C. Li, X. Huang and M. Tang, "Tangible Interface: Integration of the Real and Virtual", 7th International Conference on Computer Supported Cooperative Work in Design, 2002, pp. 408-412.

[14] I. Poupyrev, D.S. Tan, M. Billinghurst, H. Kato, H. Regenbrecht, N. Tetsutani, "Developing a Generic Augmented-Reality Interface", Computer, 35(3), 2002, pp. 44-50.
[15] K.J. Kruszyński and R. van Liere, "Tangible props for scientific visualization: concept, requirements, application", Virtual Reality, 2009, vol. 13, no. 4, pp. 235-244.
[16] G. Fitzmaurice and W. Buxton, "An Empirical Evaluation of Graspable User Interfaces: towards specialized, space-multiplexed input", Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '97), 1997, pp. 43-50.
[17] J. Patten and H. Ishii, "A Comparison of Spatial Organization Strategies in Graphical and Tangible User Interfaces", Proceedings of Designing Augmented Reality Environments, 2002, pp. 41-50.
[18] E.V.D. Hoven, J. Frens, D. Aliakseyeu, J.B. Martens, K. Overbeeke, P. Peters, "Design Research & Tangible Interaction", 1st International Conference on Tangible and Embedded Interaction (TEI '07), 2007, pp. 109-115.
[19] M. Billinghurst, "Crossing the chasm", Proceedings of the International Conference on Augmented Tele-Existence (ICAT 2001), http://www.hitl.washington.edu/publications/r-2002-62/r-2002-62.pdf [28 December 2009]
[20] M. Billinghurst, H. Kato, I. Poupyrev, "Collaboration with tangible augmented reality interfaces", HCI International, 2001, pp. 234-241.
[21] M. Billinghurst, R. Grasset, H. Seichter, and A. Dünser, "Towards Ambient Augmented Reality with Tangible Interfaces", in Proceedings of the 13th International Conference on Human-Computer Interaction, Part III: Ubiquitous and Intelligent Interaction, Lecture Notes in Computer Science, vol. 5612, pp. 387-396.
[22] M. Fjeld and B.M. Voegtli, "Augmented Chemistry: An Interactive Educational Workbench", in Proceedings of the 1st International Symposium on Mixed and Augmented Reality, IEEE Computer Society, Washington, DC, p. 259, 2002.
[23] H. Kaufmann, D. Schmalstieg, and M. Wagner, "Construct3D: A Virtual Reality Application for Mathematics and Geometry Education", Education and Information Technologies, 5(4), Dec. 2000, pp. 263-276.
[24] H. Kaufmann and B. Meyer, "Simulating educational physical experiments in augmented reality", in ACM SIGGRAPH ASIA 2008 Educators Programme (SIGGRAPH Asia '08), ACM, 2008, pp. 1-8.
[25] M. Billinghurst, H. Kato, and I. Poupyrev, "The MagicBook: Moving Seamlessly between Reality and Virtuality", IEEE Computer Graphics and Applications, vol. 21, no. 3, pp. 6-8, 2001.
[26] G. Ucelli, G. Conti, R.D. Amicis, R. Servidio, "Learning Using Augmented Reality Technology: Multiple Means of Interaction for Teaching Children the Theory of Colours", Proceedings of Intelligent Technologies for Interactive Entertainment, pp. 193-202, 2005.
[27] C.E. Hughes, E. Smith, C. Stapleton and D.E. Hughes, Proceedings of KSCE 2004, http://www.mcl.ucf.edu/research/seacreatures/KSCE04HughesEtAl.pdf [5 January 2010]
[28] J.G. Sheridan, B.W. Short, K. Van Laerhoven, N. Villar, G. Kortuem, "Exploring Cube Affordances: Towards a Classification of Non-Verbal Dynamics of Physical Interfaces for Wearable Computing", IEE Eurowearable, 2003, pp. 113-118.