Enhancing situational awareness by means of visualization and information integration of sensor networks

J. Timonen*, J. Vankka, Finnish National Defence University, Department of Military Technology, P.O. Box 7, FI-00861 Helsinki, Finland

ABSTRACT

This paper presents a solution for an information integration and sharing architecture that is able to receive data simultaneously from multiple different sensor networks. Creating a Common Operational Picture (COP) object along with the base map of the building plays a key role in the research. The object is combined with the desired map sources and then shared to the mobile devices worn by soldiers in the field. The sensor networks used focus on indoor location techniques, and a simple set of symbols was created to present the information, as an addition to the NATO APP-6B symbols. A core element in this research is MUSAS (Mobile Urban Situational Awareness System), a demonstration environment that implements the central functionalities. Information integration in the system is handled by the Internet Communications Engine (Ice) middleware, together with the server that hosts the COP information and maps. The entire system is closed, such that it does not need any external service, and information transfer to the mobile devices is organized by a tactical 5 GHz WLAN solution. The demonstration environment is implemented using only commercial off-the-shelf (COTS) products. We present a field experiment in which the system was able to integrate and share real-time information from a blue force tracking system, a received signal strength indicator (RSSI) based intrusion detection system, and a robot using simultaneous localization and mapping (SLAM) technology, with all inputs based on real activities. The event was held in an urban warfare training area.

Keywords: Common Operational Picture, information integration, urban warfare, wireless sensor platform, Mobile Urban Situational Awareness System

1. INTRODUCTION

The key aim of the project was to investigate indoor location techniques. The project consists of multiple research groups, each with its own specific research topic. This paper describes the integration of all the sensor systems into a system that fulfills the definition of a COP: sharing a single identical display of relevant information to multiple echelons in the theater1. The Wireless Sensor Systems in Indoor Situation Modeling II (WISM II) project2 focuses on exploring new concepts, not on delivering a product. The novel MUSAS system contains several features incorporated into the software during this research. An essential part of the development was interviews with the research and development department of the Guard Jaeger Regiment, based in Helsinki, Finland. The requirements of the system derive from multiple interviews and testing sessions. In terms of software, the information integration framework was designed and implemented in the project. The Graphical User Interfaces (GUIs) for the Android and Command and Control (C2) applications were also designed and implemented. The Environmental Systems Research Institute's (Esri) frameworks for Android and C# were a great help in this regard. An ad hoc capable network was built as a means of sharing the information. The planning, mechanical work, and installation were done during the project, but the access point itself is a COTS product.

*[email protected]; telephone +358299530472; http://www.puolustusvoimat.fi/

This paper is organized as follows. Section 2 presents existing research in the field. Section 3 presents the overall system and the means of delivering information on the battlefield. Information integration is discussed in Section 4. Section 5 presents the results of the experiment carried out to evaluate the system. Finally, Section 6 concludes the study by discussing the results of this work.

2. RELATED WORK

Future Command and Control (C2) systems are widely studied by many armed forces3. The operational solutions of C2 systems usually combine multiple different networks and information levels into an operating entity. The US Army's Joint Vision document of 2001 highlights the importance of information superiority throughout the battlefield4. One example is the multinational Future Soldier program, led by the United States. This program comprises a Personal Intelligent Agent (PIA) subsystem, which controls the information sent and presented to the soldier. This artificial intelligence agent is sometimes called the fighting person's "digital buddy"5. The British Army is developing the Future Integrated Soldier Technology (FIST) system, which contains a helmet-mounted display for supporting Situational Awareness (SA). This system was evaluated for entry into operations during 20156. Many of the technology programs include projects for various sensors, information sources, networks, and solutions for delivering information from the commander to field personnel. In academic studies, projects are usually limited to investigating a specified research problem, which narrows the scope from an entire system to a partial implementation of the functionality set.

In the area of decision support and emergency management, studies similar to MUSAS have been carried out. Balfour7 discusses a Common Operating Picture Software/Systems (COPSS) system that is capable of presenting a four-dimensional COP in a browser interface. The COPSS highlights scalability in information sources, as well as augmented reality. Adams8 reports on the Emergency-Management Situational-Awareness Prototype (EMSAP), an architecture and command and control prototype for improved SA in emergency situations, such as earthquakes.

In the field of dismounted soldier communication and SA, Williams9 presents the Small Unit Operations Situation Awareness System (SUO SAS). The system emphasizes operations in an infrastructure-less environment, as well as combining GPS with inertial capabilities. It also incorporates a publish-subscribe architecture for blue force tracking. The process perspective of a tactical system with a dismounted soldier is examined by Saarelainen10, who also considers the information transfer model. A Wireless Polling Sensor Network (WPSN) for improved SA and blue force tracking is studied by Saarelainen11. In the field of location services, the need to improve GPS navigation was recognized in the Force XXI Land Warrior program, for which a Dead Reckoning Module (DRM)12 is presented. The COTS perspective has been under examination by armies for several years. The use of Android devices or wearable computers and the addition of augmented reality to the soldier's view are presented in Refs. 13-16. Information integration of a multisensor environment is investigated by Akita17. Frameworks for supporting integration and SA are examined by Schiefelbein18 and Manternach19.

Networking is an essential part of communication in the last mile. Networking was not the key problem in MUSAS, but some research was done in that area as well. COTS Wireless Local Area Network (WLAN) solutions used in proximity to hostile forces are studied in Ref. 20. One perspective is the use of frequency-hopping radios to deliver IP-based traffic for SA, as in Ref. 21. A test bed solution for multi-level data fusion is presented by Paradis22.
In this paper, centralized and hybrid integration methods are examined and evaluated. The paper presents a novel solution for integrating sensor information from various sensor systems using only COTS technology. It also presents the means of delivery to the end user on the battlefield, and focuses on the user interface.

3. SYSTEM OVERVIEW

3.1 Entities

The system consists of three main sensor systems: wearable sensor nodes23, a robot capable of performing SLAM (Simultaneous Localization and Mapping)24, and a wireless sensor platform (WSP)25. These three sensor systems provide the necessary sensor information to the COP server. The server system performs information integration and updates the COP server, which in turn shares the COP throughout the network. The COP sharing framework is capable of hosting multiple information sources, but it can also operate without any external system. In that case, the system operates in map sharing mode, in which the users form the COP by updating information to the system manually. In this setup, all the sensor systems operate in the 2.4 GHz band. The COP sharing framework is implemented using 5 GHz technology to prevent possible interference between the sensor networks and the more powerful COP sharing network. Figure 1 illustrates the relationships between the entities.

Figure 1 Overall System

The connection from the different sensor systems to the COP server in the server system is organized using an Ethernet switch and static IP addresses in the connecting devices. Each sensor system has its own controller, which performs data fusion and then shares the result through the Internet Communications Engine (Ice) middleware. The COP sharing framework in turn uses dynamic IP addresses and is configured as a separate subnetwork.

The fundamental design principle is that the system must be independent: every device must be able to function without an Internet connection. The Common Operational Picture, maps, and network connections are managed locally, with all information coming from inside the system. Connecting to the upper echelon is possible (for example, via VHF radio or microwave link), but is not currently used.

3.2 Sensor systems and location techniques

In the overall system, there are three main sensor systems (see Figure 1), which provide the automatic COP information for data integration. The information received serves different purposes: locating own troops indoors and outdoors, locating unknown targets inside a building, and creating a base map of the building. Table 1 presents the systems, methods, and their purposes.

Table 1 Sensor systems

Sensor system | Purpose | Tracking method | Platform
Device-free localization (DFL) | Detection of unknown targets | Received signal strength indicator (RSSI) | Wireless sensor platform
Wearable sensor node | Blue force tracking | Inertial and time-of-flight (TOF) based navigation | Wireless sensor node
Robot system | Blueprint map, location, sensor location | Simultaneous Localization and Mapping (SLAM) utilizing a laser sensor | Robot, wireless sensor platform, and multichannel router
Android device | Location outdoors | Global Positioning System (GPS) | End user network and Android device

A crucial platform for information delivery is the wireless sensor platform24. This node is a stackable hardware platform designed to host different sensors (e.g., RSSI, microphone, camera, temperature). It has a modular architecture that can support multiple abstraction levels, and it is also capable of performing data fusion and analysis at the network level. The node uses a mesh topology to enable wireless networking in previously unknown environments24-26. This node is important for the WISM II project because it is the fundamental platform in some of the sensor systems.

The device-free localization system uses variations in the received signal strength indicator (RSSI) to locate targets inside the electromagnetic field. In addition to locating targets, it also allows tracking of their movement in the field27. The accuracy of intrusion detection is 0.22 meters on average28, which is more than sufficient from a military perspective. It is also possible to locate objects through walls using this technology, depending on the distance and the wall material. The technique is not visible to the target being tracked. These aspects are studied in more detail in Refs. 29 and 30.

The wearable sensor node31 has been developed by the Technical Research Centre of Finland. The tracking system is based on wireless wearable sensor nodes and sensors installed on the clothing. The tracking data are provided by GPS when moving outdoors, and by inertial or radio-based methods when operating indoors. The purpose of the system is to provide blue force tracking data. It provides extra location methods in the form of a hybrid localization system (HLS), able to combine localization data from multiple sensors. Moreover, this framework provides improved independence concerning location systems. In the WISM II project, the entities used were the inertial sensor and time of flight (TOF)31,32. Simulation of the RF propagation of the system is covered in Ref. 33.

The robot has multiple important tasks in the system. When it is started, it creates a physical origin for the system, and it is capable of locating itself in its own coordinate system34. The location methods of the robot are covered extensively in a doctoral thesis35. The robot has visual-wavelength cameras that can be used to assess a situation. The most important sensor from the MUSAS point of view is the laser sensor used for creating a blueprint for the integration server24,34. Its use for location is also evaluated from the perspective of rescue personnel in Ref. 36. In the WISM II project, a multi-channel router37 was installed in the robot, which enabled connections using various methods (for example, WSP, mobile broadband, and WLAN). In the version we used, we were able to deliver three smaller robots to the target area inside the bigger robot. These robots could be guided as a fleet using novel algorithms, such as those presented in Ref. 38.

3.3 Tactical WLAN solution

A network solution was needed in order to deliver the COP information to the end user. The selected frequency band was 5 GHz WLAN (802.11a), because multiple networks in the system operate in the 2.4 GHz band (see Figure 1). The system has a fundamental need for an access point capable of forming an ad hoc connection in wireless mode (Figure 2). In terms of the entire system, Figure 2 is located in the upper left corner of Figure 1 (end user network and ad hoc). With this technique, it is possible to broaden the operation area dramatically, still using only COTS products. The focus of the WISM II project was not the network solution for delivering the information, so this system was not part of the scientific research as such; it was built in order to permit testing of the system. The network was made as a subproject, in which the Cisco Aironet 1260 Series access point39 was selected to operate as the base station. The access point was originally designed for indoor use in demanding environments. In this project, it was attached to a portable metal box, which held the necessary accessory equipment. The box contained a 10 Ah sealed lead-acid battery, a voltage converter, a fuse box, a voltage display, and a main switch. The completed setup can be seen in Figure 3.

Figure 2 End User Network

Figure 3 Access Point

The map and COP services were physically located in the command post computer, serving the devices in the network. A Dynamic Host Configuration Protocol (DHCP) server was placed in the access point, so that dynamic IP addressing could be used in the mobile devices. Table 2 presents the main features of the access point.

Table 2 Access Point Features

Feature | Value
Weight | 7.2 kg
Power consumption | Approx. 12 W
Battery capacity | 10 ampere-hours
Operating time | Approximately 10 hours
Range | Approximately 200 x 600 meters
User interface | On-off button and voltage display

4. INFORMATION INTEGRATION

4.1 COP server

The COP server has several important tasks. It is located in the center of Figure 1 (COP server, inside the server system). Its most obvious task is to operate as a command post application, providing broad-scope information as well as improved functionality. In the context of the demonstration system, the server application operates as a command and control tool for the platoon leader or company commander. Not only does the application contain COP information, but it also encloses the functionality for data integration, which is hidden from the user. At this stage, the operator of the application has manual tasks such as updating symbols and drawing the rooms based on the robot image. The operator is responsible for the initial configuration of the system. The next stage will use artificial intelligence for these operations.

Figure 4 is a general view of the application. The application is used with a mouse to zoom and drag, and multiple quick buttons are also available. At the top left corner, the coordinates of the mouse cursor are displayed in local coordinates as well as real-world coordinates (MGRS, WGS84). At top center, the user finds a selection of NATO APP-6B symbols, events, and polygon shapes for drawing rooms. The color of the rooms changes according to the situation: if there are hostile elements in a room, the color will be red; after the space has been cleared of all danger, the operator can change the color to green. One very useful feature is the possibility of partial transparency of a raster map on top of the satellite map. This mode reveals shapes of the terrain and different targets, such as monuments hidden in the forest.

Figure 4 Server GUI

There are two dialog boxes at the top of the main screen in Figure 4. The one on the left is a log screen, which displays all the incoming information from the application and sensor systems. The right dialog box plays an essential role in correctly aligning the different coordinate systems. The application uses one origin, with respect to which all the targets are drawn. The origin is a point in a real-world coordinate system; it can be defined manually, but usually it is defined by the location of the robot when it is started. Once the operator defines the orientation and scale of the system, the selected values appear in the dialog in the "Values from Robot Picture" column. After this, the operator can adjust the Master init angle and Scale sliders. This operation places the robot at the right location in terms of the base map and the incoming SLAM information from the robot. These settings affect every sensor system equally. The fields starting from the VTT field (Figure 4) provide individual tuning possibilities for each sensor system in terms of the location of the origin, scale, and angle. Once these settings are saved, the application automatically uses the latest configuration.

For indoor location purposes, a simple symbol layout was created, as seen in Figure 5. The purpose was not to create an extensive symbol palette but to make it possible to present the sensor information. This palette is the subject of further research, since it is not based on any actual standard.
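Returning to the coordinate alignment above, the following is a minimal sketch (not the authors' code) of the kind of planar transform implied by the origin, Master init angle, and Scale settings: a local robot or sensor coordinate is rotated by the init angle, scaled, and offset by the world position of the shared origin. All names are illustrative assumptions.

```java
/**
 * Minimal sketch of the local-to-world alignment implied by the origin,
 * "Master init angle" and "Scale" settings in Section 4.1. A planar
 * approximation and all names are assumptions for illustration.
 */
public final class LocalToWorld {
    private final double originEasting;   // world easting of the shared origin (m)
    private final double originNorthing;  // world northing of the shared origin (m)
    private final double initAngleRad;    // "Master init angle" in radians
    private final double scale;           // "Scale" slider value (unitless)

    public LocalToWorld(double originEasting, double originNorthing,
                        double initAngleDeg, double scale) {
        this.originEasting = originEasting;
        this.originNorthing = originNorthing;
        this.initAngleRad = Math.toRadians(initAngleDeg);
        this.scale = scale;
    }

    /** Returns {easting, northing} for a point given in local coordinates (m). */
    public double[] toWorld(double xLocal, double yLocal) {
        // Rotate by the init angle, scale, then offset by the origin.
        double xr = Math.cos(initAngleRad) * xLocal - Math.sin(initAngleRad) * yLocal;
        double yr = Math.sin(initAngleRad) * xLocal + Math.cos(initAngleRad) * yLocal;
        return new double[] { originEasting + scale * xr, originNorthing + scale * yr };
    }
}
```

The per-sensor-system fields mentioned above would correspond to additional instances of such a transform, each with its own origin offset, angle, and scale.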

An extremely important feature of the COP server is that it raises the abstraction level of targets. After a sensor network is connected to the system, the information is no longer just x and y coordinates bound to the local origin; the track now has a location in real-world coordinates, a type, a symbol, and additional information. This kind of information is easy to share and is interpreted consistently across different systems.

Figure 5 Symbols
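As an illustration of this abstraction step, a track record of roughly the following shape is what the COP server would share instead of raw coordinate pairs. The field names are assumptions for illustration, not the project's actual data model.

```java
/**
 * Illustrative sketch of the higher-abstraction track described above:
 * a raw local (x, y) pair becomes a self-describing object with world
 * coordinates, type, and symbol. Field names are assumed, not the
 * project's actual data model.
 */
public final class CopTrack {
    public final String id;          // unique track identifier
    public final double easting;     // real-world coordinate derived from the shared origin
    public final double northing;
    public final String type;        // e.g. "BLUE_FORCE", "UNKNOWN", "ROBOT"
    public final String symbolCode;  // APP-6B code or a symbol from the palette in Figure 5
    public final String info;        // free-form additional information
    public final long timestamp;     // time of the last update (ms)

    public CopTrack(String id, double easting, double northing, String type,
                    String symbolCode, String info, long timestamp) {
        this.id = id;
        this.easting = easting;
        this.northing = northing;
        this.type = type;
        this.symbolCode = symbolCode;
        this.info = info;
        this.timestamp = timestamp;
    }
}
```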

4.2 Information integration

The goal of information integration is to combine information from multiple sources and deliver it to the end users. The integration system is located inside the COP server seen in Figure 1. The system was planned in such a way that it is quite straightforward to add new sources. During development, evaluation showed that adding a totally new system would take less than a day, provided the Ice middleware could be installed in the attached sensor system. The features discussed in this section concern only the command and control computer, which also hosts the methods for information fusion. The Android devices connect only to the services described in this section.

The IceStorm server40 provides the means of transferring information inside the system. The physical server is one laptop running the software, connected to the system using an Ethernet switch (Figure 6 - Ice). Each connecting system implements a Slice definition file, on the basis of which the implementation of the methods is chosen. It is possible to subscribe to these topics from any device (including Android devices) connected to the Ice middleware. After a client has subscribed to a certain topic, the information is delivered to all subscribers whenever an event arises. The system uses static IP definitions for the sensor systems connecting to the server.
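As an illustration, a sensor-system publisher built on this pattern might look roughly as follows in the classic Ice 3.x Java mapping. The topic name, the endpoint address, and the TrackMonitor interface (a hypothetical Slice definition with a reportPosition() operation, from which TrackMonitorPrx and TrackMonitorPrxHelper would be generated) are stand-ins for the project's actual Slice files, not its real definitions.

```java
// Publisher-side sketch of the IceStorm pattern described above
// (classic Ice 3.x Java mapping; names are illustrative assumptions).
public class TrackPublisher {
    public static void main(String[] args) throws Exception {
        Ice.Communicator ic = Ice.Util.initialize(args);
        // Static IP of the laptop running the IceStorm service (Figure 6 - Ice)
        IceStorm.TopicManagerPrx topicManager = IceStorm.TopicManagerPrxHelper.checkedCast(
                ic.stringToProxy("IceStorm/TopicManager:tcp -h 192.168.1.10 -p 10000"));

        IceStorm.TopicPrx topic;
        try {
            topic = topicManager.retrieve("BlueForceTracks");
        } catch (IceStorm.NoSuchTopic e) {
            topic = topicManager.create("BlueForceTracks"); // first user creates the topic
        }

        // Publish through a oneway proxy; IceStorm pushes the event to every subscriber.
        TrackMonitorPrx monitor =
                TrackMonitorPrxHelper.uncheckedCast(topic.getPublisher().ice_oneway());
        monitor.reportPosition("node-07", 12.4, 3.1); // node id, local x/y in metres

        ic.destroy();
    }
}
```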

Figure 6 Delivery Architecture


The specific sensor networks and technologies used for creating the COP are described in Section 3.2. Figure 6 shows the general architecture for information fusion and exchange, seen in Figure 1 as the COP server. In this project, we had multiple topics used for internal communication, such as real-time node location between the robot and the sensor nodes. The IceStorm server was used for data fusion by the sensor networks, as well as for information fusion to provide a unified COP. This paper presents only the topics needed for subscriptions from a COP point of view.

The elements creating the COP are divided into two different parts: constantly moving targets and static entities. The reason for this is the efficiency of the update methods to the mobile devices: it takes CPU and network capacity to update multiple targets on all of the devices, so we wanted strict control over the targets that were updated constantly. The Esri tracking server (Figure 6)41 was selected to host and update these targets. The COP object, which is formed on the basis of the sensor information, is delivered to the tracking server inside the COP sharing framework, from where it is published to the end users' devices. The tracking server is a separate service to which all the applications must subscribe and which they must add as a layer in the application. The Esri ArcGIS server (Figure 6)42 hosts the complex and static targets, such as rooms. It publishes the map layers in use and also provides tools for analysis. The laser image from the robot is delivered through Ice and presented in an ArcGIS server layer. On top of this layer, the operator can draw the rooms over the appearing image and commence sharing. Table 3 presents the different types of information delivered inside the system. As in many cases, the update frequency is a key issue from an efficiency point of view.

Table 3 Information Types

Information | Delivery solution | Nature | Update interval (ms) | Source
Blue force tracking | Tracking server | Dynamic | 10-1000 (1000 used) | Wearable sensor node
Detection of unknown targets | Tracking server | Dynamic | 10-1000 (1000 used) | Device-free localization
Building blueprint image | ArcGIS server | Dynamic | 5000 | Robot laser sensor
Blueprint COP information | ArcGIS server | Static | Operator action based | Operator
Satellite and raster maps | ArcGIS server | Static | Based on GUI operation | Map server
Events, troops | ArcGIS server | Static | User action based | Android application or command application
Position of device | ArcGIS server | Dynamic | 2000 | GPS chip in Android
Laser outlines of the buildings | ArcGIS server | Static | No updates during operation | Map server

Figure 7 shows the architecture for information delivery from Ice. The operation starts when the application starts and the COP Ice Subscriber is run; the precondition is that the connection to the IceStorm server can be created. The instance of the class first subscribes to the topics it uses. If a topic does not exist, the subscriber creates the topic and then subscribes to it. When a sensor system subscribes to a topic and starts to publish (for example, the robot image), MUSAS begins receiving the information and updating the operator's display. The IceStorm server enables a subscribe-push architecture, in which the connecting devices do not need to send queries to the server: when a sensor system publishes something to a topic, all subscribers of that topic are automatically invoked (push service). This model provides a lightweight solution for connecting to multiple information sources. The IceStorm server is also used between the sensor systems to share information. One example of this is the data fusion used to combine sensor information from the device-free localization and the robot location, in order to improve the location estimate, as well as the results of the RSSI-based location.
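A corresponding subscriber sketch, under the same assumed Slice interface as the publisher sketch in Section 4.2 (with _TrackMonitorDisp as its generated servant base class), illustrates the retrieve-or-create step and the push-style callback described above; it is not the project's actual implementation.

```java
// Subscriber-side sketch of the retrieve-or-create and push behavior
// described above (classic Ice 3.x Java mapping; names are assumptions).
public class CopIceSubscriber {
    public static void main(String[] args) throws Exception {
        Ice.Communicator ic = Ice.Util.initialize(args);
        IceStorm.TopicManagerPrx topicManager = IceStorm.TopicManagerPrxHelper.checkedCast(
                ic.stringToProxy("IceStorm/TopicManager:tcp -h 192.168.1.10 -p 10000"));

        IceStorm.TopicPrx topic;
        try {
            topic = topicManager.retrieve("BlueForceTracks");
        } catch (IceStorm.NoSuchTopic e) {
            topic = topicManager.create("BlueForceTracks"); // create missing topic, then subscribe
        }

        // Servant that IceStorm invokes on every published event (push, no polling).
        Ice.ObjectAdapter adapter = ic.createObjectAdapterWithEndpoints("CopSub", "tcp");
        Ice.ObjectPrx self = adapter.addWithUUID(new _TrackMonitorDisp() {
            @Override
            public void reportPosition(String nodeId, double x, double y, Ice.Current current) {
                // Parse the update and buffer it for batched delivery to the
                // tracking server (cf. StateValues / SendAllToTrackingServer() below).
            }
        }).ice_oneway();

        topic.subscribeAndGetPublisher(new java.util.HashMap<String, String>(), self);
        adapter.activate();
        ic.waitForShutdown();
    }
}
```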

Figure 7 Information Delivery Using Ice (within Figure 6)

In the COP Ice Subscriber implementation, each topic has a specific method, which implements the functionality performed when an event arrives from the Ice middleware. This construction runs in a separate thread, in which the COP Ice Subscriber is started from the main program. The COP Ice Subscriber class contains subclasses for each topic to which it subscribes. When a method is invoked, it usually parses the information so that it fits the definition in the target system. Targets to be shared to the Esri tracking server are passed to a singleton class, StateValues, which gathers the information into a dictionary object. When any information reaches the AddObjectToCOPobject method, an observer notices this and waits a defined amount of time (2000 ms) before it updates the COP object to the tracking server using SendAllToTrackingServer(). The reason for this delay is the ongoing search for a more efficient update solution in a multisensor environment, in which the capacity of the server can be saved for display capabilities.

4.3 Mobile device application

The application for the end user was created for the Android platform. This allows easy deployment on new devices using the same operating system, and offers a wide range of COTS products. In this system, a new device can be joined simply by installing the application and configuring the Wi-Fi settings. The mobile devices can be found in Figure 1 (upper left corner) and Figure 2 (lower center). The application is made as simple as possible because of the extremely stressful and fast-paced environment in which the system operates. It contains only a selected set of features, as illustrated in Table 4.

Table 4 Mobile Device Features

Feature | Selection
Map | Satellite or raster
User view | Targets, troops, events, rooms (and colors), compass
Marking targets | Event, Hostile, Unknown, Neutral
Gestures | Pinch: zoom map; Pinch + rotate: rotate map; Short press: north up; Long press: select satellite or raster map
Compass | Displays north

During the research, it was recognized that it is not possible for the soldier in the field to focus on the device while in action. For this reason, the baseline for the planning was that the device is only for support, not for active use. This principle was taken into account by making the user interface as simple and intuitive as possible. Common examples of use are moving the map, zooming into the map, and adding new targets. Every feature can be used with only one hand, including opening the carrying pouch attached to the torso. The user interface is seen in Figure 8.

Figure 8 Mobile Device User Interface

The Android application subscribes to the services in the ArcGIS server and the tracking server (see Figure 6), which are presented in Section 4.2 in more detail. The application is capable of adding targets or events to the ArcGIS server, but not to the tracking server, since the latter holds objects derived from the sensor networks through information integration. The device has an integrated GPS sensor, which can be enabled from the menu. After the location is detected, it is shared to the ArcGIS server, where it is visible throughout the network.
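For illustration, attaching these layers with the ArcGIS Runtime SDK for Android of that era might look roughly as follows. The service URLs, addresses, and layer choices are assumptions for a sketch, not the project's actual configuration.

```java
// Sketch of how the Android client might attach the shared map and COP
// layers (early ArcGIS Runtime SDK for Android API assumed; the server
// address and service paths are placeholders for the closed network).
import com.esri.android.map.MapView;
import com.esri.android.map.ags.ArcGISDynamicMapServiceLayer;
import com.esri.android.map.ags.ArcGISTiledMapServiceLayer;

public class CopMapSetup {
    /** Adds the base map and COP layers to an existing MapView. */
    public static void attachLayers(MapView mapView) {
        // Static base map (satellite or raster) published by the local ArcGIS server
        mapView.addLayer(new ArcGISTiledMapServiceLayer(
                "http://192.168.1.20:6080/arcgis/rest/services/BaseMap/MapServer"));
        // Static COP layer: rooms, events, and troops drawn by the operator
        mapView.addLayer(new ArcGISDynamicMapServiceLayer(
                "http://192.168.1.20:6080/arcgis/rest/services/COP/MapServer"));
        // Constantly moving targets come from the separate tracking service,
        // which the application subscribes to as its own layer (Section 4.2).
    }
}
```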

5. RESULTS OF THE EXPERIMENT

The system created (Figure 1) was tested in an experiment at Santahamina, Finland, in 2012. In this event, the system formed a common operational picture using the wearable sensor nodes, device-free localization, and mobile robots. The system was able to produce real-time results and deliver information across the area of the operating platoon. The network was built up as the troops advanced inside the building. The experiment was conducted in an urban warfare training yard. The test personnel were a platoon of soldiers specialized in operations in an urban area. During the tests, we also filmed a short movie43, which explains the operational concept of the MUSAS system.

The tests allowed us to make many improvements in terms of integration and performance. For example, the update frequencies of the different systems (see Table 3) were decided based on the performance of the system, as well as on feedback from the testing platoon. The solution was to use an interval from 1 second to 5 seconds, depending on the information type; this was shown to be sufficient for the users in this case. We also tested the system in an outdoor-like environment, which suggested future improvements to the hardware.

In the demonstration event, the soldiers used the end user network devices to build communication at the time of action. The access point (Figure 3) is extremely simple to use, having only one button, with which it can be turned on or shut down. With this setup, it is possible to provide covering range outside the building as well as inside, by using one access point in the doorway. In the demonstration setup, we used a total of four access points: one operating in the command post and three deployed in the terrain with the troops.

The demonstration setup includes a means of attaching the mobile devices to the soldier's equipment. The two options studied were attachment to the left hand (for a right-handed user) and to the upper left torso, using a specific pouch. The first impression was that the hand attachment was better, but the torso attachment proved more reliable. The device is vulnerable when used in the hand, consumes more of the user's energy, and may prevent other activities during battle. The torso attachment is slightly more difficult to reach, but, on the other hand, the device is well protected and unobtrusive there. After some training, the soldiers got used to carrying and using the device attached to the torso. Later, the GUI device will probably be developed to fit this attachment more effectively. The soldiers also used gloves specially designed for tactical use with touch screen capability.

The users gave good feedback on the usability of the mobile devices and also on the speed of the system. More tests are needed to evaluate how well the GUI meets the needs of users in a stressful situation. In this experiment, there was of course no extensive tiring phase (1-2 days of battle) beforehand. A test that incorporated sudden unexpected events and still forced the soldier to use the device would provide crucial information about use under conditions of great stress.

6. CONCLUSIONS AND FUTURE WORK

6.1 Conclusions

In this research, we presented an information integration framework that is capable of hosting multiple different sensor networks operating as information sources. The novel architecture also demonstrates methods of delivering the created common operational picture to the end user using COTS technology, as well as an information integration system using the Ice middleware. At the same time, we presented an Android user interface capable of receiving information from the server inside the closed network. The user also has the possibility of adding simple targets during the action. The application supports multiple different devices (for example, tablets) for improved display capabilities. Development and testing found the system to be a success in its area. Hence, it was decided that research on this system will continue with the addition of more features and improvements to usability.

6.2 Future research

Future plans for development include the implementation of a 3D view in the operator's application, as well as on the Android device. At the same time, NATO Vector Graphics (NVG) will be introduced to the system. The connection methods to the upper echelon (Figure 1) are established and will be used in the next field demonstration event.

The integration of a visual-wavelength camera into the system is currently the subject of research and testing. The video stream will come from a hand-held UAV. The stream can be ordered from any device inside the network, and after the UAV has arrived in the desired area it will be delivered straight to the user's device. Voice over IP (VoIP) technology will also be tested and implemented for an integrated helmet system. A distributed server and map architecture is in the planning phase; the general features of the forthcoming implementation are illustrated in Ref. 44. The analysis tools offered by the framework (such as terrain analysis or probability analysis) will be examined and evaluated. There are multiple entities in the system that could take advantage of artificial intelligence. This perspective will be evaluated and implemented in the next operational version.

7. ACKNOWLEDGEMENTS

This work was funded by the Finnish Funding Agency for Technology and Innovation (TEKES). The context for this work was the WISM II (Wireless Sensor Systems in Indoor Situation Modeling II) project2.

REFERENCES

[1] DOD Dictionary of Military and Associated Terms, "COP Definition," DOD, 10 March 2013, <http://www.dtic.mil/doctrine/dod_dictionary/data/c/10914.html> (10 March 2013).
[2] Björkbom, M., "WISM II (2011-2012)," Aalto University, 3 March 2012, (10 March 2013).
[3] SoldierMod.com, "Programmes at a glance: December 2012, report," Soldiermod.com, 2012, (30 March 2013).
[4] Director for Strategic Plans and Policy, Strategy Division, "Joint Vision 2020," Government Printing Office, Washington, DC (2001).
[5] Taylor, A., "FUTURE SOLDIER 2030 Initiative," US Army RDECOM (2009).
[6] Army-Technology.com, "FIST - Future Infantry Soldier," Army-technology, 5 April 2013, <http://www.army-technology.com/projects/fist/> (30 March 2013).
[7] Balfour, R. E., "Next generation emergency management common operating picture software/systems (COPSS)," Systems, Applications and Technology Conference (LISAT), 1-6 (2012).
[8] Adams, K., Wassell, A., Ceruti, M. G., Castro, E., Lehan, S. F. and Mitchell, J. W., "Emergency-management situational-awareness prototype (EMSAP)," 2011 IEEE First International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 110-114 (2011).
[9] Williams, L. J., "Small unit operations situation awareness system (SUO SAS): An overview," Military Communications Conference, 2003. MILCOM 2003. 2003 IEEE, Vol. 1, 174-178 (2003).
[10] Saarelainen, T. and Timonen, J., "Tactical management in near real-time systems," 2011 IEEE First International Multi-Disciplinary Conference on Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 240-247 (2011).
[11] Saarelainen, T. and Jormakka, J., "C4I2-Tools for the Future Battlefield Warriors," 2010 Fifth International Conference on Digital Telecommunications (ICDT), 38-43 (2010).
[12] Marth, R. B., Sr., Levi, R., Durboraw, I. N., III, and Beam, K., "The integrated navigation capability for the Force XXI Land Warrior," Position Location and Navigation Symposium, IEEE 1998, 193-200 (1998).
[13] Kaul, V., Makaya, C., Das, S., Shur, D. and Samtani, S., "On the adaptation of commercial smartphones to tactical environments," Military Communications Conference, 2011. MILCOM 2011, 2205-2210 (2011).
[14] Suri, N., Pochet, L., Sterling, J., Kohler, R., Casini, E., Kovach, J. et al., "Infrastructure, middleware, and applications for portable cellular devices in tactical edge networks," Military Communications Conference, 2011. MILCOM 2011, 1541-1546 (2011).
[15] Hicks, J. D., Flanagan, R. A., Petrov, P. V. and Stoyen, A. D., "Eyekon: augmented reality for battlefield soldiers," Software Engineering Workshop, 2002. Proceedings. 27th Annual NASA Goddard/IEEE, 156-163 (2002).
[16] Murray, J., "Wearable computers in battle: recent advances in the Land Warrior system," The Fourth International Symposium on Wearable Computers, 169-170 (2000).
[17] Akita, R. M., "User based data fusion approaches," Proceedings of the Fifth International Conference on Information Fusion, 2002, Vol. 2, 1457-1462 (2002).
[18] Schiefelbein, M. C., "Information architecture for threat detection systems," 2008 IEEE Conference on Technologies for Homeland Security, 589-592 (2008).
[19] Manternach, J. and Broadstock, T., "Information integration for situation awareness," International Symposium on Collaborative Technologies and Systems. CTS 2006, 142-149 (2006).
[20] Epstein, M., Gilmour, P. and Yoon, C. J., "Application of commercial wireless LAN technology to forward area mobile communications," Military Communications Conference, 1993. MILCOM '93. Conference record. Communications on the Move, IEEE, Vol. 2, 490-496 (1993).
[21] Ostling, K., "Evaluating an IP-based tactical communication network for highly mobile units," Military Communications Conference, 2001. MILCOM 2001. Communications for Network-Centric Operations: Creating the Information Force. IEEE, Vol. 2, 751-754 (2001).
[22] Paradis, S. and Roy, J., "An architecture and a facility for the integration of all levels of data fusion," Proceedings of the Third International Conference on Information Fusion, 2000. FUSION 2000, Vol. 1, MOD5/11-MOD5/17 (2000).
[23] Saarinen, J. and Heikkila, S., "Laser based personal navigation system," 2005 IEEE International Symposium on Computational Intelligence in Robotics and Automation. CIRA 2005. Proceedings, 315-320 (2005).
[24] Yigitler, H., Virrankoski, R. and Elmusrati, M. S., "Stackable wireless sensor and actuator network platform for wireless automation: The UWASA node," Aalto University Workshop on Wireless Sensor Systems, Espoo, Finland (2010).
[25] Virrankoski, R., Ed., Generic Sensor Network Architecture for Wireless Automation (GENSEN), Proceedings of the University of Vaasa, Reports, Vaasa, Finland (2012).
[26] Yigitler, H., "The UWASA Node Reference Manual," University of Vaasa, Department of Computer Science, Communications and Systems Engineering Group, Vaasa, Finland (2010).
[27] Kaltiokallio, O., Bocca, M. and Eriksson, L. M., "Distributed RSSI processing for intrusion detection in indoor environments," Proceedings of the 9th ACM/IEEE International Conference on Information Processing in Sensor Networks, 404-405 (2010).
[28] Kaltiokallio, O. and Bocca, M., "Real-time intrusion detection and tracking in indoor environment through distributed RSSI processing," 2011 IEEE 17th International Conference on Embedded and Real-Time Computing Systems and Applications (RTCSA), 61-70 (2011).
[29] Wilson, J. and Patwari, N., "See-through walls: Motion tracking using variance-based radio tomography networks," IEEE Transactions on Mobile Computing, Vol. 10, 612-621 (2011).
[30] Zheng, Y. and Men, A., "Through-wall tracking with radio tomography networks using foreground detection," Wireless Communications and Networking Conference (WCNC), 2012 IEEE, 3278-3283 (2012).
[31] Korkalainen, M., Tukeva, P., Lindholm, M. and Kaartinen, J., "Hybrid localization system for situation awareness applications," 3rd Workshop on Wireless Communication and Applications (WoWCA2012), Vaasa, Finland (2012).
[32] Korkalainen, M., Mäyrä, A. P. and Känsälä, K., "An open communication and sensor platform for urban search and rescue operations," Proc. of SPIE, Vol. 85400O-1 (2012).
[33] Korkalainen, M. and Sallinen, M., "A survey of RF-propagation simulation tools for wireless sensor networks," 2010 Fourth International Conference on Sensor Technologies and Applications (SENSORCOMM), 342-347 (2010).
[34] Saarinen, J., Suomela, J., Heikkila, S., Elomaa, M. and Halme, A., "Personal navigation system," 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2004. IROS 2004. Proceedings, Vol. 1, 212-217 (2004).
[35] Saarinen, J., "A sensor-based personal navigation system and its application for incorporating humans into a human-robot team," doctoral dissertation, Helsinki University of Technology, Automation Technology Series A: Research Reports No. 33 (2009).
[36] Saarinen, J., Heikkila, S., Elomaa, M., Suomela, J. and Halme, A., "Rescue personnel localization system," Safety, Security and Rescue Robotics, Workshop, 2005 IEEE International Conference, 218-223 (2005).
[37] Goodmill, "Goodmill w24 mobile multi-channel router solution for broadband connectivity," (2012).
[38] Myrsky, M., Saarinen, J. and Maula, A., "Interface for controlling a fleet of generic machines," Multivehicle Systems, 60-65 (2012).
[39] Florwick, J. W., Amrod, A. C. and Woodhams, J., "Wireless LAN design guide for high density client environments in higher education," user guide (2011).
[40] ZeroC, "IceStorm," ZeroC, 12 March 2013, (22 March 2013).
[41] Esri, "Tracking server 10," Environmental Systems Research Institute, New York, USA (2010).
[42] Esri, "ArcGIS® 10.1 for server functionality matrix," Environmental Systems Research Institute, Redlands, California, USA (2013).
[43] Finnish Combat Camera Team, "WISM II project," 7 March 2013, (5 April 2013).
[44] Timonen, J., "Distributed information system for tactical network," 3rd Workshop on Wireless Communication and Applications (WoWCA2012), Vaasa, Finland (2012).