Augmented Reality to Enhance Visitors Experience in a Digital Zoo

Johannes Karlsson, Shafiq ur Réhman, Haibo Li
Department of Applied Physics and Electronics, Umeå University, 901 87 Umeå, Sweden

[johannes.karlsson,shafiq.urrehman,haibo.li]@tfe.umu.se

ABSTRACT

In this paper we propose a three-tier protocol for an augmented reality application on mobile phones, providing animal situation awareness and tracking in a wireless sensor network. We have developed and deployed an animal tracking system in a local zoo that can provide visitors with information about individual animals. Here we present our ongoing work on an augmented information system that can provide online, real-time animal information such as an animal's current location and the features of its specific breed.

1. INTRODUCTION

Advances in Wireless Sensor Network (WSN) technology have made it possible to embed digitally controlled sensors (such as cameras, RFID readers, etc.) that compute, analyze and communicate data for an effective digital environment. In environment visualization, the major effort is focused on providing users with information of interest extracted from the data gathered by the sensors in the WSN [9]. Due to the high sensor density needed for full coverage of the monitored environment, the amount of sensed data, and the size of the data displays for presentation to the users, visualization tools based on Mixed Reality (MR) are required. Considering the importance of data visualization in our WSN (i.e. the digital zoo project), we have proposed an Augmented Reality (AR) based approach [2] in order to enhance the experience of physical visitors. In this paper, we propose an AR information-map visualization tool on handheld devices (see Fig. 1) as a suitable data/information visualization method for our WSN project. An AR visual representation of an animal of interest is overlaid on the display screen to enable users to view more information about a specific animal and/or site of interest.

This paper is organized as follows. Section 2 discusses the related work and provides an overview of the digital zoo project. Section 3 presents the system architecture; the hardware details are presented in Section 4, and the software details used to implement the AR information-map visualization tool on handheld devices are given in Section 5. Section 6 concludes this work along with future directions.


Figure 1: System overview: the augmented reality information for an animal of interest is rendered on the handheld device, based on the sink-node dataset in the WSN.

2. BACKGROUND

The Swedish zoo “Lycksele Djurpark” is the northernmost zoo in the country. It specializes in Nordic animals and keeps species such as bear, elk, musk ox and wild boar, to mention some. Due to the weather conditions and the sparsely populated location of the zoo, it is difficult to attract physical visitors during winter. Although the zoo is closed during the winter, the animals' life continues as usual and they stay outside. It is challenging for the zoo to make a profit and to develop under these conditions. The goal of the project “Digital Djurpark” (Digital Zoo), funded by the EU regional development fund, is to make the zoo digital and more attractive. The approach is to build a wireless sensor network covering the whole zoo. The goal is a network consisting of around 100 wireless sensors that are able to configure themselves into a working network. The sensors collect data such as video, pictures, sound and temperature. The sensed data is processed, and semantic information is used to support interaction for both physical and virtual (Internet) visitors [3]. Visitors at home can access live video streams from the wireless media sensor network deployed in the zoo.

Previously, Gunnarsson et al. [8] presented an idea based on augmented reality visualization on a mobile phone for on-site inspection. To the best of our knowledge, data visualization on handheld devices from a WSN as large as ours has never been implemented before. In this work, the augmented reality visualization tool aims at providing real-time augmented information about the animal of interest. Tracking of laboratory mice using a combination of RFID and cameras has been investigated in previous literature [5]; that work, however, covered only a small indoor setting. In our work we consider identification and tracking of animals in a large-scale outdoor environment.

Figure 2: Block diagram of the system architecture. The data is collected at the sink node using RFID and camera nodes, whereas information rendering is performed on the user's handheld device, based on this data, whenever the user is connected to our WSN.

3. SYSTEM ARCHITECTURE

In order to enhance the experience of end-users, the proposed application is based on two main components (see Fig. 2). The first is a deployed WSN consisting of a set of nodes collecting visual information (video data) and radio-frequency identification (RFID) transponders for tracking animals in the zoo. In our WSN only one node acts as a sink; it is connected to a central computer, where the collected data is saved in a database. The second is an AR-based visualization tool, capable of recognizing each handheld display (i.e. mobile phone) and presenting mixed-reality information on it.
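To make the data flow between the two components concrete, the following is a minimal sketch of the kind of record the sink node could serve to connected AR clients. The field names and the JSON encoding are our illustrative assumptions; the paper does not specify a wire format.

```python
import json
import time

# Hypothetical record the sink node could serve to AR clients.
# Field names are illustrative; the paper does not define a schema.
def make_animal_update(animal_id, name, breed, lat, lon):
    return json.dumps({
        "animal_id": animal_id,                # from the RFID transponder (Section 4.1)
        "name": name,                          # static zoo-keeping data
        "breed": breed,
        "position": {"lat": lat, "lon": lon},  # from camera-based tracking
        "timestamp": time.time(),
    })

# Example update for a wolf near Lycksele (coordinates illustrative).
print(make_animal_update(752, "Luna", "grey wolf", 64.5934, 18.6721))
```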

4. HARDWARE: WIRELESS SENSOR NETWORK

The deployed network uses a multihop ad-hoc mesh networking protocol to build up a fully autonomous network. The nodes send messages to the sink, where the data is transferred to an application running on the central computer (i.e. the sink node) and collected into a database. This database can be accessed by the AR components. Our WSN hardware for the AR application is mainly composed of two components: radio-frequency identification (RFID) transponders and ad-hoc wireless cameras.

4.1 Radio-frequency identification

To identify the animals in the zoo, we use passive radio-frequency identification (RFID). Passive transponders do not need any batteries; instead, they draw the energy used to transmit back their message from the reader. Most animals in the zoo carry an RFID chip, either as an implant or as an ear tag. The implants are usually small glass-encapsulated transponders placed below the skin at the back of the animal's neck; such a transponder typically has a length of 10-20 mm and a diameter of about 2 mm. For ear-tag RFID, larger transponders can be used, i.e. a diameter of 20-30 mm and a weight of 3-10 grams.

The RFID transponders used for animals follow the ISO 11785 standard. In the zoo, we essentially use two different protocols within the ISO 11785 standard. One is the ISO full-duplex protocol (FDX-B), in which the transponder sends back its data during the activation period, i.e. while it receives energy from the reader. The other is the ISO half-duplex protocol (HDX), in which the tag first stores the energy received from the reader and then sends back its data. HDX transponders are often relatively large, and thanks to the larger coil a better reading distance can be reached. For many animals, the smaller glass-encapsulated RFID transponder has to be used; since it sits under the skin, it gives the animal a more “natural” appearance in the zoo compared to an ear tag.

Figure 3: Overview of the RFID tags used in our setup: (a) ear-tag RFID transponder (HDX); (b) implant RFID transponder (FDX-B).

In our experimental setup, we have tested two different RFID readers for animal identification. The first is an ISO half-duplex RFID reader based on the Texas Instruments (TI) high-performance LF radio-frequency module RI-RFM-007, with a TI RI-ANT-G04E 1018x518 mm antenna. It is used with a half-duplex RFID ear-tag transponder with a diameter of 30 mm (see Fig. 3). With optimal orientation of the RFID transponder, the reading distance can be up to 80 cm for the TI reader in a controlled environment, i.e. a detection area large enough for the animals wearing ear tags in the zoo. The second is an ISO full-duplex RFID reader based on the Atmel U2270B reader IC, which can be tuned to 134.2 kHz and used for ISO 11785 tags. It can read 20 mm full-duplex ear tags at a range of about 10 cm. For implant tags, however, the Atmel reader only achieves a reading distance of a few centimetres, which is not enough for robust identification of the animals. Considering this limitation, the readers are placed at strategic locations such as eating or drinking areas. It is also possible to put the antennas around tunnels or gates; in these cases a very short reading distance is enough to read the animal's RFID tag.
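As an illustration of the identification data carried by these tags, the sketch below splits a 64-bit ISO 11784 code into its 10-bit country code and 38-bit national ID. The overall bit layout follows ISO 11784, but actual reader modules differ in framing and byte order, so this is a sketch rather than driver code.

```python
def decode_iso11784(code: int):
    """Split a 64-bit ISO 11784 identification code into its fields.

    ISO 11784 carries a 38-bit national ID and a 10-bit country code.
    This sketch assumes the code is already a plain integer with the
    national ID in the low bits; real readers may deliver the frame
    bit-reversed or byte-swapped, which would need handling first.
    """
    national_id = code & ((1 << 38) - 1)            # low 38 bits
    country_code = (code >> 38) & ((1 << 10) - 1)   # next 10 bits
    animal_flag = (code >> 63) & 1                  # animal-application indicator
    return country_code, national_id, animal_flag

# Example: country code 752 (Sweden) and an arbitrary national ID.
code = (1 << 63) | (752 << 38) | 123456789012
print(decode_iso11784(code))  # -> (752, 123456789012, 1)
```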

4.2 Ad-Hoc Wireless Camera

Figure 4: Picture of a wireless ad-hoc camera.

To capture information from the animals, we have developed a camera platform based on the Gumstix Overo™ Fire computer-on-module and a custom image sensor board (see Fig. 4). The Gumstix Overo™ Fire consists of a Texas Instruments OMAP 3530 application processor with an ARM Cortex-A8 CPU and a C64x+ digital signal processor (DSP) core, both running at 600 MHz. The board has 256 MB of RAM and 256 MB of flash memory. For communication, IEEE 802.11g wireless LAN is used in a wireless mesh network. The board measures only 17 mm x 58 mm x 4.2 mm, which enables us to build a very small yet powerful camera platform. The camera board contains an OmniVision OV9710 image sensor supporting 1280x800 at 30 fps; the sensor also has a good low-light sensitivity of 3.3 V per lux-second. For this platform we use a multi-hop wireless ad-hoc network over 802.11 wireless links. Since the platform has both high computational and high communication capacity, it makes it possible to send high-quality real-time video. The platform does, however, have a rather high power consumption, typically a few watts, and will in general be connected to the mains power supply.
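The camera node's software stack is not detailed in the paper; purely as a sketch, a capture loop on such a platform might look as follows, with the device index and resolution as our assumptions (OpenCV used only for illustration).

```python
import time
import cv2  # OpenCV, used here purely for illustration

# Hypothetical capture loop for a camera node. Device index 0 and the
# 1280x800 mode match the OV9710 description above, but the actual
# platform software is not specified in the paper.
cap = cv2.VideoCapture(0)
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 800)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    stamp = time.time()  # timestamp each frame for later track/RFID fusion
    # ... hand (stamp, frame) to the video encoder / tracking pipeline ...
cap.release()
```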

4.3 Hardware deployment

Currently we have 25 video-transmitting camera nodes deployed in the zoo. For the augmented reality application, we are first targeting the installation at the wolf enclosure. Here the area is divided into two sub-areas, and the wolves can pass between them through a tunnel measuring 60x60 centimetres. An RFID antenna is installed around this tunnel, which enables us to get the ID of an animal each time it moves between the two sub-areas. To track the animal from this point, two cameras are installed in the area (see Fig. 5). This setup enables us to identify and track the wolves over a large area in the zoo.
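Assuming, for illustration, that every tunnel read corresponds to a completed crossing, the sub-area bookkeeping this antenna enables can be sketched as a simple toggle per animal (our simplification; the paper does not describe the exact logic):

```python
# Track which sub-area each wolf is in from tunnel RFID events.
# Assumes every read is a full crossing that toggles the side;
# an illustrative simplification of the real deployment.
current_side = {}  # animal_id -> "A" or "B"

def on_tunnel_read(animal_id):
    side = current_side.get(animal_id, "A")  # assume animals start in sub-area A
    current_side[animal_id] = "B" if side == "A" else "A"
    return current_side[animal_id]

print(on_tunnel_read(752))  # first crossing -> wolf 752 is now in sub-area "B"
```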

5. SOFTWARE: AUGMENTED REALITY MODULE

In this section we give a system overview of the software used to visualize the information provided by our WSN. The complete system must be able to perform two main tasks:

• Animal Recognition and Tracking. This module performs detection and tracking of the animals based on RFID and a motion-tracking algorithm.

• Mixed Reality Rendering. This module provides the means for rendering auxiliary information on the display of a user's handheld device.

Figure 5: Wolf cage area: the wolves cross the tunnel equipped with an RFID reader, from which they are identified and then tracked by the cameras using the animal tracking algorithm.

The first task is to identify and track each individual animal in the zoo. The wireless sensor network will ultimately know the geographical position of all animals in the zoo and can send this information to the sink node. The second task is to visualize the collected information in an AR application running on a handheld device. To enable this, we must know the position and orientation of the handheld device. The device must also be connected to the wireless network in order to receive real-time information about the position and ID of all animals in the zoo.

5.1 Animal Recognition and Tracking

For computer-vision tracking of the animals we employ a blob tracking algorithm, which is considered a robust technique for object tracking in outdoor environments [1], [7]. Foreground/background detection performs a segmentation for each pixel in every frame of the video sequence. Once the blob pixels have been determined, the blobs can be tracked. During blob tracking, the system collects the positions of all blobs and saves each blob trajectory; an optional post-processing step can smooth the trajectories. The output video draws an ellipse around each animal in the frame, showing where the tracked animals are. By combining occasional RFID readings with the tracking of animals in the wireless sensor network, the system can report the name above each individual animal to the sink node. For more details about our work on identification and tracking of animals in the zoo, see [4].

Figure 6: Single individual tracking.

The RFID readers are placed at strategic locations such as drinking or feeding areas. The location of each RFID reader is registered so that its static position in the camera view is known. Once an animal stands near an RFID reader, its name is captured; by then tracking the individual animal, its location can be calculated and sent to the control centre. The approach can also be extended to multiple cameras by using the scale-invariant feature transform (SIFT) [6]. The animal recognition and tracking module provides an animal identification code (i.e. “animal-id”); this information is sent to the sink node in the WSN for the AR application.
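A condensed sketch of the segmentation, blob-extraction and identity-binding steps described above is given below. OpenCV's MOG2 background subtractor stands in for the unspecified foreground/background model, and the minimum blob area and binding distance are illustrative parameters, not values from the paper.

```python
import cv2

# MOG2 as a stand-in background model; the paper does not name one.
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

def detect_blobs(frame, min_area=400):
    """Per-pixel foreground segmentation followed by blob extraction."""
    mask = subtractor.apply(frame)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # One ellipse per sufficiently large blob, as in the output video.
    return [cv2.fitEllipse(c) for c in contours
            if cv2.contourArea(c) >= min_area and len(c) >= 5]

def bind_identity(tracks, reader_xy, animal_id, max_dist=50):
    """Bind an occasional RFID read to the track nearest the reader.

    Illustrative only; the actual association logic is described in [4].
    Each track is assumed to be a dict with an "xy" image position.
    """
    def d2(t):
        return (t["xy"][0] - reader_xy[0]) ** 2 + (t["xy"][1] - reader_xy[1]) ** 2
    nearest = min(tracks, key=d2)
    if d2(nearest) <= max_dist ** 2:
        nearest["id"] = animal_id
```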

5.2 Mixed Reality Rendering

The second software module can be split into two components: an AR client running on the handheld device, and a server application running on the sink node in the wireless network. The server application maintains a database with two kinds of data. One is the more static information that is added by the zoo staff or automatically transferred from the zoo-keeping database; this is general information about the animal, such as name, breed, gender, date of birth, offspring, etc. The other is the dynamically updated data from the wireless sensor network, for example an animal's current position and media clips collected of the animal.

The client application on the user's mobile phone connects through the wireless network to the server application. Through this connection the client receives updated information about each animal's current position together with the information related to that animal. To render this information, the client must also know the position and orientation of the handheld device; this is provided by the GPS, accelerometer and digital compass on the device. Once the geographical position and orientation of the handheld device have been calculated, the additional animal information can be rendered on the display. Since quite a lot of information is available for each animal, the AR application should only overlay some of it, such as the animal's name and possibly also the breed. The user can then click on a specific animal to get more detailed information as well as previously recorded video clips.
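To illustrate the rendering geometry, the sketch below maps an animal's geo-position to a horizontal screen coordinate from the device pose. The bearing formula is standard great-circle math; the screen width and camera field of view are assumed values, not taken from the paper.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device to the animal."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(dev_lat, dev_lon, heading_deg, animal_lat, animal_lon,
             screen_w=800, fov_deg=60.0):
    """Horizontal overlay position, or None if the animal is out of view.

    heading_deg comes from the digital compass; fov_deg is an assumed
    horizontal camera field of view (not specified in the paper).
    """
    rel = (bearing_deg(dev_lat, dev_lon, animal_lat, animal_lon)
           - heading_deg + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    if abs(rel) > fov_deg / 2:
        return None
    return int((rel / fov_deg + 0.5) * screen_w)
```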

Figure 7: Augmented reality application on a smart phone.

6. CONCLUSION AND FUTURE WORK

In this paper, we proposed a mixed-reality rendering system on mobile phones, based on animal tracking and monitoring, for our digital zoo project. The main motivation for the proposed system is to provide visitors with additional information about the animal of interest. The preliminary results have so far shown a successful system for identification and tracking of the animals. We believe the proposed system is a useful interaction tool for visitors, providing a cheap and easy way to access the information from our WSN. The complete system will be implemented and evaluated in future work.

Several challenges exist in providing detailed AR information in real time. In the system, several position errors will be introduced and will add up: the position and orientation of the handheld device provided by the on-board sensors will contain errors, and the position of the animal will also contain errors due to the inaccuracy of the positions of the visual sensor nodes. Finally, delays in the system will introduce further error by the time the AR information is rendered on the handheld device. Due to these limitations, it will probably be difficult to render AR information for a group of animals standing close together.

7. REFERENCES


[1] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp. A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing, 50(2):174–188, 2002.
[2] O. Bimber. The End of Hardware: Augmented Reality and Beyond. BookSurge Publishing, 2006.
[3] K. Fahlquist, J. Karlsson, K. Ren, L. Liu, H. Li, S. U. Rehman, and T. Wark. Human animal machine interaction: Animal behavior awareness and digital experience. In Proceedings of ACM Multimedia 2010 - Brave New Ideas, 2010.
[4] J. Karlsson, K. Ren, and H. Li. Tracking and identification of animals for a digital zoo. In Proceedings of the Internet of Things Symposium 2010, 2010.
[5] M. Kritzler, S. Jabs, P. Kegel, and A. Krüger. Indoor tracking of laboratory mice via an RFID-tracking framework. In MELT '08: Proceedings of the First ACM International Workshop on Mobile Entity Localization and Tracking in GPS-less Environments, pages 25–30, New York, NY, USA, 2008. ACM.
[6] D. G. Lowe. Object recognition from local scale-invariant features. In Proceedings of the International Conference on Computer Vision, pages 1150–1157, 1999.
[7] A. Senior, A. Hampapur, Y.-L. Tian, L. Brown, S. Pankanti, and R. Bolle. Appearance models for occlusion handling. Image and Vision Computing, 24(11):1233–1243, November 2006.
[8] A.-S. Gunnarsson, M. Rauhala, A. Henrysson, and A. Ynnerman. Visualization of sensor data using mobile phone augmented reality. In Proceedings of the 5th IEEE and ACM International Symposium on Mixed and Augmented Reality, pages 233–234, 2006.
[9] J. Yick, B. Mukherjee, and D. Ghosal. Wireless sensor network survey. Computer Networks, 52(12):2292–2330, 2008.