
ENHANCING MOBILITY INDEPENDENCE FOR VISUALLY IMPAIRED PEOPLE THROUGH THE USE OF ANDROID MOBILE HANDSETS by

TSHEPHISHO JOSEPH SEFARA (201007780)

RESEARCH REPORT

Submitted in fulfilment of the requirements for the HONOURS DEGREE in the DEPARTMENT OF COMPUTER SCIENCES in the FACULTY OF SCIENCE AND AGRICULTURE (School of Mathematical and Computer Sciences) at the UNIVERSITY OF LIMPOPO

SUPERVISOR: Mr PS Ramalepe

2014

DECLARATION

I declare that ENHANCING MOBILITY INDEPENDENCE FOR VISUALLY IMPAIRED PEOPLE THROUGH THE USE OF ANDROID MOBILE HANDSETS is my own work and that all the sources that I have used have been indicated and acknowledged by means of complete references and that this work has not been submitted before for any other degree at any other institution.

Name: TSHEPHISHO JOSEPH SEFARA


Date: 2014/10/29

ACKNOWLEDGEMENTS

I would like to thank the following persons for their contribution to this research project:

- A special thank you to the Lord, who gave me power and hope during the period of this project.
- My supervisor, Mr PS Ramalepe, for his guidance and support.
- My colleagues, for their encouragement in this study.


ABSTRACT

There have been various attempts to help visually impaired people navigate from one destination to another, but most of the systems developed are expensive and too heavy for visually impaired people to use. This study focuses on visually impaired people at the University of Limpopo, Turfloop campus. The aim of this research is to enhance mobility independence for visually impaired people through a mobile navigation system which runs on GPS-enabled Android mobile phones. The system stores the GPS co-ordinates of known roads on the Turfloop campus and notifies users, by vibrating or making a beep sound, when they stray from their route.


CONTENTS

DECLARATION
ACKNOWLEDGEMENTS
ABSTRACT
LIST OF FIGURES
LIST OF TABLES
LIST OF ABBREVIATIONS AND ACRONYMS
CHAPTER 1: INTRODUCTION
1.1 Background of the problem
1.2 Research problem
1.3 Related work
1.4 Purpose of the study
1.4.1 Aim
1.4.2 Objective
1.5 Research methodology
1.5.1 Research design
1.5.2 Sampling
1.5.3 Data collection
1.5.4 Data analysis
1.6 Importance of the proposed research
1.7 Design and implementation
CHAPTER 2: LITERATURE REVIEW
2.1 Introduction
2.2 Review of navigation systems used by visually impaired people
2.2.1 The white cane and RFID navigation system
2.2.2 The guide dog navigation system
2.2.3 The tactile navigation system
2.2.4 Other electronic navigation systems
2.3 Conclusion
CHAPTER 3: METHODOLOGY
3.1 Introduction
3.2 Research design
3.3 Research setting
3.4 Population and sample
3.4.1 The population specification
3.4.2 Probability sampling
3.5 Data collection
3.5.1 Data collection objectives
3.5.2 Aim of data collection
3.5.3 Data collection procedure
3.5.4 Data collection instrument
3.6 Reliability
3.7 Validity
3.8 Pretesting the questionnaire
3.9 Ethical consideration
3.10 Data analysis
3.10.1 Importance of the study
3.10.2 Research objectives
3.10.3 Data presentation and discussion
3.11 Conclusion
CHAPTER 4: DESIGN
4.1 Introduction
4.2 Analysis
4.2.1 Use case
4.2.2 UML activity diagram
4.3 Design of mobile application
4.3.1 Android architecture
4.3.2 Architectural design
4.3.3 Component level design
4.3.4 UI design
4.3.5 Data modelling
4.4 Conclusion
CHAPTER 5: IMPLEMENTATION
5.1 Introduction
5.2 Data providers
5.3 Location
5.3.1 Global Positioning System
5.4 Conclusion
CHAPTER 6: CONCLUSION
6.1 Introduction
6.2 Summary
6.3 Challenges
6.4 Recommendations
6.5 Limitations to the study
BIBLIOGRAPHY
APPENDIX A: QUESTIONNAIRE
APPENDIX B: OBTAINING GOOGLE MAPS API KEY
APPENDIX C: UML DIAGRAMS
APPENDIX D: SOURCE CODE OF THE APPLICATION
APPENDIX E: DATA FROM UL DSU
APPENDIX F: PROJECT PLAN


LIST OF FIGURES

Figure 2.1 A conception of the navigation system [10]
Figure 2.2 Prototype of the oral tactile interface [15]
Figure 2.3 The functional components of any navigation system for the blind [16]
Figure 2.4 Bionic Eyeglass: the first prototype [22]
Figure 2.5 Mobile navigation for the visually impaired based on Microsoft Kinect [23]
Figure 3.1 Age of participants
Figure 3.2 Gender of participants
Figure 3.3 Living areas
Figure 3.4 Travelling
Figure 3.5 Guidance modes
Figure 4.1 UML use case diagram for the system
Figure 4.2 UML activity diagram for the overall system
Figure 4.3 Android software stack [41]
Figure 4.4 Overall mobile application architecture
Figure 4.5 Components of the application
Figure 4.6 Activity lifecycle [42]
Figure 4.7 Component-level design for the MapsActivity
Figure 4.8 Component-level design for the DataBaseHelper
Figure 4.9 Illustration of a view hierarchy, which defines a UI layout [46]
Figure 4.10 UI design
Figure 4.11 UML diagram of the entity GPS_points
Figure 5.1 Data is disabled
Figure 5.2 GPS is disabled
Figure 5.3 Application screenshot during testing
Figure 5.4 Known roads


LIST OF TABLES

Table 3.1 Kinds of smart phones
Table 3.2 Participants' willingness to use the system
Table 3.3 Navigation type
Table 3.4 Use of the white cane when walking with a sighted guide
Table 4.1 StartApplication use case scenario


LIST OF ABBREVIATIONS AND ACRONYMS

ADT     Android Development Tools
API     Application Programming Interface
DGPS    Differential Global Positioning System
DSU     Disabled Student Unit
DVM     Dalvik Virtual Machine
GIS     Geographical Information System
GPS     Global Positioning System
GUI     Graphical User Interface
IBM     International Business Machines
IMU     Inertial Measurement Unit
JDK     Java Development Kit
OHA     Open Handset Alliance
PC      Personal Computer
PDR     Personal Dead-reckoning
RFID    Radio-Frequency Identification
RGB     Red, Green and Blue
SANCB   South African National Council for the Blind
SDK     Software Development Kit
SLAM    Simultaneous Localization and Mapping
SPSS    Statistical Package for the Social Sciences
UI      User Interface
UL      University of Limpopo
UML     Unified Modelling Language
WI-FI   Wireless Fidelity
WSN     Wireless Sensor Network

CHAPTER 1: INTRODUCTION

1.1 Background of the problem

Visually impaired people cannot take part in basic routine activities in the same way that sighted people do, yet they are an important part of humankind. Innovations and technologies are developed to make life easier and more comfortable, but the visually impaired have only limited access to such technologies.

According to statistics from the SANCB [1], over 45 million people around the world are completely blind. The top three causes of blindness worldwide are cataract, which accounts for 39.1% of global blindness; uncorrected refractive errors, which account for 18.2%; and glaucoma, which accounts for 10.1%. The prevalence of sight disability in South Africa is the highest of all disabilities (32%), followed by physical disability (30%), hearing disability (20%), emotional disability (16%), intellectual disability (12%) and communication disability (7%).

The focus of this study is to develop a mobile application which will alert visually impaired people when they are walking on unknown roads. At the UL Turfloop campus, visually impaired people are able to walk in the areas designed for them using blind navigation equipment. They commonly rely on a white cane or a guide dog to reach their desired destination safely. However, this approach breaks down if the guide dog does not know the entire path to the destination. For their daily movement in a predefined area of the Turfloop campus, visually impaired people rely on repetitive, regular roads with the fewest obstructions.


At the UL there are students with visual impairments. They can go to their classrooms and cafeterias by using white canes on roads they have learnt, but it is easy for them to deviate from their way. In this study we develop an application that enhances mobility independence for visually impaired people through the use of Android mobile handsets. The literature review, methodology, design and implementation of the mobile application are explained in the chapters that follow.

1.2 Research problem

The problem is the shortage of advanced technologies to aid visually impaired people in their mobility. Currently they use symbol canes as guidance; the possibility of losing direction is high because a symbol cane cannot determine whether the person is on the right road or not.

1.3 Related work

Researchers have developed navigation equipment for visually impaired people. In 2001, Helal et al. [2] developed a navigation system called Drishti (meaning "vision" in the ancient Indian language Sanskrit). Drishti is an integrated wireless pedestrian navigation system for the visually impaired community [2]. It integrates many technologies, including wearable computers, voice recognition, voice synthesis, wireless networks, GIS and GPS [2]. Drishti uses speech synthesis but requires that users wear headphones on both ears, and many visually impaired people express strong reservations about wearing headphones when they walk [3].

RFID [4] involves a tag affixed to a product which identifies and tracks the product via radio waves. These tags can carry up to 2,000 bytes of data. The technology has three parts: a scanning antenna, a transceiver with a decoder to interpret the data, and a transponder (the RFID tag) pre-set with information [4]. The scanning antenna sends out a radio-frequency signal that provides a means of communication with the RFID tag. When the tag passes through the frequency field of the scanning antenna, it detects the activation signal and can transfer its data back to the scanning antenna [4].

RFID is a non-contact system that uses radio-frequency electromagnetic fields to transfer information for the purpose of automatic identification [4]. RFID can provide navigation both indoors and outdoors: tags inside a building can indicate the location of the stairs and of objects inside a house, while tags placed on the street improve outdoor navigation [4]. Mines present some of the toughest conditions imaginable for an RFID system; even so, Willard Batteries [5], a company which provides batteries and various management services to the mining industry in South Africa, finds that RFID technology can be a principal source of benefits.

Among the equipment Willard [6] manages for mining companies are battery-powered cap lamps, worn by miners on their helmets. By attaching RFID tags to the lamps, Willard can track their use and maintenance. The technology can also identify users and their movements into and out of the mines.

An RFID system [4] involves assembling and inserting a computerized chip, which works out to be expensive. RFID readers struggle to pick up information when passing near metals or liquids. Reader collision can occur when signals from two readers overlap and the tag is unable to respond to both; tag collision can occur when numerous tags in the same area respond at the same time. WSN layers [7], by contrast, can be deployed outside buildings and can cover surrounding obstacles that are far away from the visually impaired person's path.


1.4 Purpose of the study

1.4.1 Aim

This project focuses on visually impaired people at the UL Turfloop campus. Its aim is to enhance their mobility independence by developing a mobile application which will run on GPS-enabled Android mobile devices. The application will complement the use of symbol canes for guidance; using a symbol cane together with the application will provide more advanced navigation.

1.4.2 Objective

The objective is to support visually impaired people in their navigation through the UL Turfloop campus and so enhance their mobility. The application has to be installed on their mobile phones.

1.5 Research methodology

1.5.1 Research design

A quantitative research paradigm will be used because it is independent of the researcher, and a descriptive survey will be used to collect information from the visually impaired people.


1.5.2 Sampling

The sample will be composed of visually impaired students at the UL Turfloop campus. Probability sampling will be used because it provides mathematical statistics and probabilities that can be applied to analyse and interpret the data.

1.5.3 Data collection

The aim of data collection is to understand the respondents' requirements and how they would interact with the mobile application, and to create a basis for the design of the application. Questionnaires will be used to collect the data; a questionnaire will be completed for each participant by asking them the questions it contains.

1.5.4 Data analysis

The data will be analysed using the statistical computer program IBM SPSS, and the research results will be presented in the form of percentages, bar graphs, pie charts and frequency counts.
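To illustrate the kind of summary the analysis will produce, the sketch below computes frequency counts and percentages for one questionnaire item in plain Java. The class name and the sample answers are hypothetical; the actual analysis will be performed in IBM SPSS as stated above.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FrequencyTable {
    /** Returns answer -> percentage (0-100) over all responses, in first-seen order. */
    public static Map<String, Double> percentages(String[] responses) {
        // Count each distinct answer.
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String r : responses) {
            counts.merge(r, 1, Integer::sum);
        }
        // Convert counts to percentages of the total number of responses.
        Map<String, Double> result = new LinkedHashMap<>();
        for (Map.Entry<String, Integer> e : counts.entrySet()) {
            result.put(e.getKey(), 100.0 * e.getValue() / responses.length);
        }
        return result;
    }
}
```

For example, the responses {"yes", "yes", "no", "yes"} yield 75% "yes" and 25% "no", the raw material for the bar graphs and pie charts mentioned above.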

1.6 Importance of the proposed research

This study will be helpful to visually impaired people at the UL Turfloop campus because it will produce an Android mobile phone application that can be used as a guide in their mobility. The study therefore intends to make a difference in the lives of the visually impaired.


1.7 Design and implementation

The design phase describes how the system will operate in terms of the network, software and hardware infrastructure and the database that will be needed. The design is explained in terms of the Android architecture, architectural design, component-level design, UI design and data modelling. The implementation phase describes the significant parts of the code, including GPS connectivity, internet connectivity and the connection to the database.
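To make the core alerting idea concrete: the application stores GPS points of known roads and must decide whether the user's current fix is still near one of them. The sketch below is an illustrative nearest-point check using the haversine distance; the class name, the point format and the 25-metre threshold are assumptions for illustration, not the report's actual code.

```java
public class GeoUtils {
    private static final double EARTH_RADIUS_M = 6_371_000.0;

    /** Great-circle distance between two GPS fixes, in metres (haversine formula). */
    public static double haversineMeters(double lat1, double lon1,
                                         double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    /** True if the fix lies within thresholdMeters of any recorded road point. */
    public static boolean isOnKnownRoad(double lat, double lon,
                                        double[][] roadPoints, double thresholdMeters) {
        for (double[] p : roadPoints) {          // p = {latitude, longitude}
            if (haversineMeters(lat, lon, p[0], p[1]) <= thresholdMeters) {
                return true;
            }
        }
        return false; // a failed check would trigger the vibration or beep
    }
}
```

On the phone, a fix that fails this check would trigger the vibration or beep sound described in the abstract.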


CHAPTER 2: LITERATURE REVIEW

2.1 Introduction

People with visual impairment face large restrictions on their mobility, and even in this era there is little infrastructure to make it easier. Visually impaired people may bump into dangerous obstacles while walking or fall on uneven walking paths; hence, electronic white canes have been studied [8], [9]. Researchers have developed navigation systems to enhance independence, safety and mobility for the visually impaired. This literature review explores current navigation systems used by visually impaired people, including: (a) the white cane and RFID navigation system, (b) the guide dog navigation system, (c) the tactile navigation system, and (d) other electronic navigation systems.

2.2 Review of navigation systems used by visually impaired people

2.2.1 The white cane and RFID navigation system

A white cane is a useful supporting device for visually impaired people, who use it while walking both indoors and outdoors. Its most important function is the detection of obstacles. Visually impaired people can walk independently using a white cane in an area they have learnt, but they cannot walk independently in an unknown area without the help of others, because a white cane cannot give them a route to their destination; it is not a navigation device. Many navigation systems for visually impaired people use GPS. These systems support outdoor navigation of unknown environments, but they are useless in indoor spaces [10]. Fukasawa and Magatani [10] created an indoor navigation system for visually impaired people composed of RFID tags, an intelligent white cane and coloured navigation lines. The intelligent white cane consists of a voice processor, a vibrator, a transceiver for RFID tags and an RGB colour sensor [11].

Figure 2.1 A conception of the navigation system [10].

Navigation lines on the floor lead to different destinations, each with a unique colour and a unique RFID code. The visually impaired person swings the white cane along the navigation lines; if the colour sensor catches the target colour, the cane notifies the user by vibrating, and if the cane reads an RFID tag at a mark point, the voice processor notifies the user of the area code received. This system is useful for visually impaired people mostly at hospitals and airports. The concept of the navigation system is shown in figure 2.1.
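The cane's notification rule described above (vibrate on a colour match, speak on an RFID read) can be sketched as a small decision function. The class, method and enum names are illustrative assumptions, not code from [10] or [11].

```java
public class CaneNotifier {
    public enum Action { NONE, VIBRATE, SPEAK_AREA_CODE }

    /**
     * @param colourMatchesTarget true if the RGB sensor sees the target line colour
     * @param rfidTagDetected     true if the transceiver has read a tag at a mark point
     */
    public static Action decide(boolean colourMatchesTarget, boolean rfidTagDetected) {
        if (rfidTagDetected) {
            return Action.SPEAK_AREA_CODE; // voice processor announces the area code
        }
        if (colourMatchesTarget) {
            return Action.VIBRATE;         // vibrator confirms the user is on the line
        }
        return Action.NONE;
    }
}
```

The sketch assumes the spoken area code takes priority over the vibration when both inputs fire at once; the papers do not specify this ordering.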

2.2.2 The guide dog navigation system

In recent years, researchers have attempted to develop guide dog systems and walk guide systems. Iwatsuka et al. [12] developed a guide dog system which can recognize characters using character recognition. The system focuses on reading room numbers and leads the user to a target place [12], but it cannot read words. Hence, the study of Iwatsuka et al. [12] could be improved further by using a mobile phone with a character recognition function and speech synthesis, so that the phone could speak the recognized room numbers, as well as words, aloud.

2.2.3 The tactile navigation system

The study of Dowson [13] determined that visually impaired people can reliably detect, distinguish and remember a limited number of different tactile paving surfaces and the distinct meanings assigned to them. Recognising that the needs of people with physical and sensory disabilities could create potential conflicts, the research which led to the development of the tactile paving surfaces involved not only the target group (blind and partially sighted people) but also people with a wide range of other disabilities, including wheelchair users and people with walking difficulties, particularly painful conditions such as arthritis. Tactile paving surfaces [13] are used to convey important information to visually impaired pedestrians about their environment, for example a hazard warning, directional guidance, or the presence of an amenity.

By contrast, Kassim et al. [14] revealed that although tactile paving is a textured ground surface installed on the floor to help visually impaired people distinguish direction, location and potentially hazardous environments, visually impaired people sometimes still find it difficult to detect the surrounding area or to identify the correct path from one place to another with a conventional white cane, even when they walk on tactile paving. In this situation they must identify the surroundings by asking a stranger; otherwise they may find themselves in a hazardous situation.

Tang and Beebe [15] created an oral tactile interface to help visually impaired people navigate in outdoor environments. The interface provides directional cues along a tactile channel, which the visually impaired person can use for guidance in outdoor navigation. The device (figure 2.2) is a mouthpiece with a microfabricated electrotactile display on top, for tactile presentation on the palate, and a touch keypad for the tongue at the bottom [15]. The tactile display contains an array of stimulators that shows dynamically refreshable tactile patterns on the oral structures [15]. The user perceives and recognizes the patterns and retrieves the guidance information conveyed in them (figure 2.2b). Figure 2.2a shows the flexible tactor array on top, and figure 2.2b shows the tongue touch keypad at the bottom.

Figure 2.2 Prototype of the oral tactile interface [15].

2.2.4 Other electronic navigation systems

The idea of using GPS [16] to assist the visually impaired with navigation goes back over a decade, to when Collins and Loomis independently proposed it. The first evaluation of GPS for this purpose was carried out by Strauss and his colleagues [17]. Because their research was conducted at an early stage of GPS development, the poor positioning accuracy they obtained precluded practical studies with blind subjects. Nowadays, most GPS applications requiring real-time positioning accuracy better than 25 metres use differential correction (DGPS): correction signals from a GPS receiver at a known fixed location are transmitted by radio link to the mobile receiver, allowing the mobile receiver to determine its position with an absolute positional accuracy on the order of 1 metre or better.
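The differential correction idea can be illustrated with a small sketch: a base station at a precisely known location measures the apparent error in its own fix, and the mobile receiver subtracts that error from its raw fix. This is purely conceptual; real DGPS applies per-satellite range corrections over standardised broadcast formats, and the class and array layout here are assumptions.

```java
public class Dgps {
    /**
     * Applies the base station's observed position error to the rover's raw fix.
     * All fixes are {latitude, longitude} pairs in decimal degrees.
     */
    public static double[] correct(double[] roverRawFix,
                                   double[] baseMeasuredFix, double[] baseKnownFix) {
        // Error the base station observes in its own position right now.
        double errLat = baseMeasuredFix[0] - baseKnownFix[0];
        double errLon = baseMeasuredFix[1] - baseKnownFix[1];
        // Assume the rover, being nearby, suffers roughly the same error.
        return new double[]{roverRawFix[0] - errLat, roverRawFix[1] - errLon};
    }
}
```

The sketch shows why DGPS works best when the rover is close to the base station: the correction is only valid where both receivers see roughly the same error sources.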

Figure 2.3 The functional components of any navigation system for the blind [16].

All GPS-based navigation systems for the blind [16] consist of three functional components (figure 2.3): a module for determining the traveller's position and orientation; a GIS, comprising the system software and the spatial database, for relating the traveller's orientation and GPS coordinates to the surrounding environment; and the UI. There are now a number of research and commercial endeavours around the world utilizing GPS or DGPS to determine the position of a blind traveller [16]. They differ in terms of the three functional modules mentioned above, as well as in physical configuration.

Faibish and Abramovitz [18] presented a new method of navigation for mobile robots based on perception of an unknown environment. The perception relies on information from various kinds of sensors, including low-level vision, infrared and sonar. The navigation algorithm uses the latest perception to build a map of the workspace and computes the most favourable path based on the previous map of the workspace [18]. The algorithm works for both indoor and outdoor navigation and could be improved further for use by visually impaired people. Most sensors perform poorly in unstructured outdoor terrain; the biggest challenge is in outdoor, off-road environments where tall grass, deadfall and other obstacles predominate [19]. Stereo vision may cope better with these challenges, since it can use distant objects as landmarks for outdoor navigation [19].

Most computer navigation systems for the visually impaired lack the prior knowledge needed to determine the orientation of the user with respect to the scene. Coughlan and Yuille [20] demonstrated an algorithm to solve this problem: their system can detect the orientation of the user based on statistics learnt in the specific domain of interest. However, the system should be tested extensively, because Coughlan and Yuille [20] did not quantify the amount of outliers in the outdoor scenes and did not apply error or residual analysis. Residual analysis is a technique for evaluating the goodness of a fitted model. Residuals need to be analysed before any conclusions are drawn. If the fitted model is invalid, the following may result [21]:

 The model may not include an explanatory variable that should be in the model.
 The data may contain influential or outlying observations which may have an impact on the conclusions to be drawn from the analysis.
 The following assumptions may not be valid:
   o Linearity: the regression function is not linear.
   o Homoscedasticity: the error terms do not have a constant variance.
   o Independence: the error terms are not independent.
   o Normality: the error terms are not normally distributed.

These techniques are used to examine the adequacy of the fitted model. They may be based on formal statistical tests, on tables of values of certain statistics, or on graphical representations of those values.
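To make the residual-analysis discussion concrete, the following is a minimal sketch (not part of the reviewed study) of fitting a simple linear regression by least squares and computing its residuals. With an intercept in the model, the residuals of a least-squares fit sum to zero, and any systematic pattern in them signals one of the problems listed above.

```java
// Minimal illustrative sketch: fit y = a + b*x by ordinary least squares
// and compute the residuals e_i = y_i - (a + b*x_i).
public class ResidualCheck {
    // Returns {intercept, slope} of the least-squares line.
    public static double[] fit(double[] x, double[] y) {
        int n = x.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sx += x[i]; sy += y[i];
            sxx += x[i] * x[i]; sxy += x[i] * y[i];
        }
        double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
        double intercept = (sy - slope * sx) / n;
        return new double[] { intercept, slope };
    }

    // Residuals of the fit; for a valid model these should be patternless
    // noise with constant variance.
    public static double[] residuals(double[] x, double[] y, double[] ab) {
        double[] e = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            e[i] = y[i] - (ab[0] + ab[1] * x[i]);
        }
        return e;
    }
}
```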

Karacs et al. [22] conducted a study, Bionic Eyeglass: The First Prototype, to create a personal navigation device for visually impaired people. The system helps visually impaired and blind people in their everyday basic tasks by translating visual information into speech. It uses a cell phone as a frontend and an embedded cellular visual computer, the Bi-i, as a computing service [22]. Karacs et al. [22] created a wireless communication framework between the cell phone and the Bi-i visual computer: the image recorded by the cell phone is streamed to the Bi-i over the wireless link, the Bi-i runs several algorithms, including image recognition analysis, and returns the results via the wireless connection to the cell phone, which then plays the acoustic waveforms received from the Bi-i. The system configuration, consisting of the cell phone (bottom), the Bi-i visual computer (left) and the wireless adapter (right), is shown in figure 2.4.

Figure 2.4 Bionic Eyeglass: The first prototype [22].

Zöllner et al. [23] conducted a study to create a mobile navigation system for visually impaired people based on the Microsoft Kinect. In figure 2.5, (a) is a LilyPad vibe board in a plastic cap, (b) the mobile Kinect camera with battery pack, (c) the Kinect helmet, (d) the vibrotactile waist belt, and (e) the complete setup with backpack. Zöllner et al. [23] utilized the Microsoft Kinect sensor, a vibrotactile waist belt built with Arduino LilyPad vibe boards, and a simple backpack construction carrying a laptop to enable quick debugging (figure 2.5e). For detecting the immediate surroundings, they reversed the standard operating principle of the Kinect: instead of a static Kinect that tracks moving objects, the system tracks the static environment with a moving head-mounted Kinect (figure 2.5c). A 12 V battery pack used to power the mobile Kinect lasted for 5 hours during the tests (figure 2.5b). The waist belt contains three pairs of Arduino LilyPad vibe boards (figure 2.5a) that produce vibrotactile output; the boards are fixed into plastic bottle caps to amplify the perceived vibration (figure 2.5d). Speech output is provided by an ordinary Bluetooth headset for mobile phones (figure 2.5e).

Figure 2.5 Mobile navigation for visually impaired based on Microsoft Kinect [23].

Kaiser and Lawo [24], on the other hand, presented a wearable navigation system for visually impaired and blind people in unknown indoor and outdoor environments. The system navigates, maps and tracks the position of the pedestrian during the exploration of the unknown environment. Simultaneous localization and mapping (SLAM) [24] was adopted from mobile robotics; once a map has been created, the user is guided efficiently by a route-selection method. The user is equipped with a short-range laser, an inertial measurement unit (IMU), a wearable computer for data processing, and a bone-conduction headphone. Pedestrian dead reckoning (PDR) [24] in combination with GPS was used to cover all scenarios of outdoor localization. Blind pedestrians explore the unknown environment while the system builds a map and simultaneously tracks the position of the person. The system was able to construct a map of an unknown environment and to take decisions on choosing safe and effective routes to guide the blind person from a starting point to previously explored destinations [24].

GPS devices have gained popularity in the market, leading to reductions in cost and size and to new features, but they fail to provide an interface for visually impaired people. Sanchez et al. [25] developed a computer tool to help visually impaired people move through a specific area. Ambient GPS [25] is a hardware and software solution for assisting visually impaired people in their outdoor navigation. It uses a GPS device connected to a Pocket PC and a speech synthesis system. The Pocket PC examines and translates the data received from the GPS and contrasts it with previously stored data in order to provide users with information about their destination [25].

The most important difficulty for visually impaired people during outdoor navigation is the need for extra help when trying to reach a particular place. Mecocci et al. [26] proposed to solve this difficulty by supplying the visually impaired person with a mobile system that provides the user with scene descriptions and other important landmarks. The system is capable of interpreting real-world scenes by analysing a single image [26].

2.3 Conclusion

Visually impaired people are not comfortable wearing multiple pieces of navigation equipment on their bodies, as in the studies of Zöllner et al. [23] and Kaiser and Lawo [24]. Hence, mobile navigation using smartphones needs to be developed. The study of Dowson [13] may be investigated further by combining a mobile GPS system with tactile paving, so that visually impaired people avoid difficulties in detecting the surrounding area. The white cane and the guide dog are the most common support tools used by visually impaired people [27]. However, a long white cane cannot be used indoors and in urban areas [28]. Hence, a mobile navigation system on a smartphone may assist visually impaired people in their outdoor navigation and enhance their independence. Guide dogs are assistance dogs trained to assist the visually impaired; trained guide dogs are able to avoid obstacles and to bark to warn of special emergencies. However, a guide dog’s training process lasts up to 18 months [27] and costs around R58 000 per dog, and one dog can only work for 8 to 10 years. Thus, it is necessary for the visually impaired to use a navigation system on a smartphone for their mobility. This study will produce a mobile application to be used by visually impaired people on their mobile phones. The mobile application will support them when walking from one place to another. The navigation systems reviewed are too expensive for visually impaired people. This study is therefore helpful to visually impaired people because they will be able to use the mobile application at no cost.


CHAPTER 3: METHODOLOGY

3.1 Introduction

In the previous chapter, a literature review explored various navigation systems for the visually impaired, and mobile navigation using GPS was identified for investigation. This chapter describes the research methodology used in this study. The study design, the geographical area where the study was conducted, the analysis methods, and the population and sample are explained. The instrument used for data collection, including the methods applied to maintain its reliability and validity, is also explained.

3.2 Research design

The research design is the most significant step because it determines the failure or success of the research. The research design governs the logical sequence of data collection and analysis so that conclusions may be drawn. Definitions of research design are varied and ambiguous. For instance, Rowley [29] defines the research design as the logic that links the data to be collected, and the conclusions to be drawn, to the initial questions of a study.

According to De Vos and Fouche [30], a research design is a blueprint or detailed plan of how a research study is to be conducted: operationalizing variables so they can be measured, selecting a sample of interest to study, collecting data to be used as a basis for testing the hypothesis, and analysing the results. Huysamen [31] offers a closely related definition of research design as the plan or blueprint according to which data is collected to investigate the research hypothesis or question in the most economical manner.


In this study, a quantitative research paradigm was used for the following reasons:

 It involves the use and analysis of numbers.
 It produces statistical data that show how respondents react.
 It is independent of the researcher.

According to Muijs [32], quantitative research explains phenomena by collecting numerical data that are analysed using mathematically based methods (in particular, statistics). A descriptive survey was used because it provides an accurate portrayal or account of characteristics, for example the abilities, beliefs, behaviour, opinions and knowledge of a particular individual, situation or group. This research design was chosen to meet the objectives of the study. A descriptive survey is defined as a method of sociological investigation that uses question-based or statistical surveys to collect information about how people think and act [33]. A survey obtains information from a sample of a population by means of an interview or self-report. A self-report is any method which relies on the participants’ own reports of their feelings, attitudes or beliefs without the help of the researcher. Self-reports are often used as a way of gaining participants’ responses in observational studies and experiments.

3.3 Research setting

The study was conducted at the UL Turfloop campus, which falls under the Capricorn District in the Limpopo Province. The university has a building for people with disabilities called the DSU. The institution provides appropriate support services to empower people with disabilities on campus and in the community. It is committed to providing support services to students with disabilities in order to provide a study environment which promotes fairness and equity for all students, irrespective of race, gender, disability and religion. Students with disabilities are accommodated in university residences.


The DSU offers the following services [34]:

 Orientation and Mobility: the DSU offers orientation and mobility training to new students to familiarise them with their new surroundings.
 Computer Training Facilities: students with disabilities are offered basic computer lessons to develop essential skills for using adaptive technology. Computers with speech synthesis for visually impaired students are available for academic activities.
 Low Vision Reading Room: reading equipment that enlarges print to any size is available for partially sighted students.
 Braille Printing: Braille printers are available to visually impaired students to transcribe print into Braille, and photocopying is done for students.
 Braille and Audio Library Services: audio books, Braille books and journals are available to be loaned out to students.
 Assistive Devices: assistive devices such as tape recorders, typewriters and Perkins Braillers are also loaned out to students.

3.4 Population and sample

According to Kitchenham and Pfleeger [35], the target population is the group or the individuals to whom the survey applies, and the target population should be represented as a final list of all its members. Kitchenham and Pfleeger [35] also define a sample as a representative subset of the target population. The target population of this study comprises all the visually impaired students at the UL Turfloop campus, and the sample is a randomly selected part of that population.


3.4.1 The population specification

The criteria used to select participants were the following:

 The individual should be a registered student of UL.
 The individual is visually impaired.
 The individual is willing to participate.

3.4.2 Probability sampling

In this study, the probability sampling method was used for the following reasons:

 It provides data of known quality.
 It provides data in a timely fashion.
 It provides a quantitative measure of the extent of variation due to random effects.
 It provides acceptable data at minimum cost.
 It provides better control over non-sampling sources of error.
 Mathematical statistics and probability theory can be applied to analyse and interpret the data.

A sample of 24 visually impaired students was randomly selected from the population of 25 visually impaired students. The sample consisted of subjects who were willing to participate in the research and who met the sampling specification; it included 10 blind students and 14 partially sighted students.
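As an illustration of simple random sampling without replacement (a hypothetical sketch, not the study's actual selection procedure), a partial Fisher-Yates shuffle can draw k subjects from a population:

```java
import java.util.Arrays;
import java.util.Random;

// Hypothetical sketch: draw a simple random sample of k subjects without
// replacement using a partial Fisher-Yates shuffle.
public class RandomSample {
    public static int[] sample(int[] population, int k, long seed) {
        int[] copy = population.clone();
        Random rnd = new Random(seed);
        for (int i = 0; i < k; i++) {
            // Swap a uniformly chosen element from the unpicked tail into slot i.
            int j = i + rnd.nextInt(copy.length - i);
            int tmp = copy[i]; copy[i] = copy[j]; copy[j] = tmp;
        }
        return Arrays.copyOf(copy, k);  // the first k elements are the sample
    }
}
```

Each subject has the same probability of selection, which is what makes the quality guarantees of probability sampling listed above possible.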


3.5 Data collection

Data collection is the process of gathering and measuring information on variables of interest in an established, systematic fashion that enables one to answer the stated research questions, test hypotheses and evaluate outcomes.

3.5.1 Data collection objectives

The objectives of data collection are:

 To collect accurate data from the visually impaired people about the proposed Android mobile application.
 To understand whether they want to use the proposed Android mobile application.
 To extract as much information as possible that is pertinent to the visually impaired people.

3.5.2 Aim of data collection

The aim of data collection is to understand the respondents’ requirements and how they would interact with the mobile application, and to create a basis for the design of the application.

Primary data is data observed or collected directly from first-hand experience. The advantages of primary data are the following:

 The data is unbiased, up to date, accurate, reliable, applicable and usable.
 There is direct interaction with the respondents.
 Primary data collection is a powerful method of acquiring information.

Primary data was used in this study for the following reasons [36]:

 It addresses specific research issues.
 It enables greater control over how the information is collected.
 It focuses on both qualitative and quantitative issues.

3.5.3 Data collection procedure

Meetings were arranged with the participants. Each participant was informed about the study and trained to answer the questions. A questionnaire was completed for each participant by asking them the questions from the questionnaire.

3.5.4 Data collection instrument

The questionnaire was chosen as the data collection instrument for the following reasons:

 Doubts regarding the meaning of the questions may be clarified, ensuring that the participant answers the questions in the sense that the researcher intended.
 The importance of the research can be personally presented to the participants and its significance explained to them, motivating honest answers by emphasising their contribution to the research.
 It requires fewer skills than interviewing; hence relatively low-skilled assistants can be recruited to perform this task and speed up the research.

A questionnaire is a printed self-report designed to draw out information through the written responses of the subjects. The contents of the questionnaire were designed with the following in mind:

 Most questions were closed-ended, which made it easy to compare responses to each other.
 They were more specific.
 They required less energy and time to administer.
 They were presented in a consistent manner.

Questionnaires have their weaknesses: subjects may not reflect their true opinions but might answer what they think will please the researcher. To ensure anonymity, we guaranteed that the answers could not be linked to the participants at the data analysis stage, by using anonymous names and by destroying the data at the completion of this study. Questions gathered demographic information such as age and gender, and questions assessing knowledge about navigation tools were included.

3.6 Reliability

Golafshani [37] defines reliability as the extent to which results are consistent over time and accurately represent the total population under study; if the results of a study can be reproduced under a similar methodology, the research instrument is considered reliable. The questionnaires answered by the visually impaired revealed consistency in responses. Reliability can be ensured by reducing sources of measurement error such as bias in data collection. Data collection bias was reduced by assigning one person to conduct the survey and by standardising conditions such as personal attributes, e.g. support and friendliness.

3.7 Validity

According to Golafshani [37], validity determines whether the research truly measures what it was intended to measure, or how truthful the research results are. In other words, does the research instrument allow you to hit "the bull’s eye" of your research object? Researchers generally determine validity by asking a series of questions, and will often look for the answers in the research of others. Content validity refers to the extent to which an instrument represents the factors under study. To achieve content validity, the questionnaire included a variety of questions on the knowledge of visually impaired people about independence and mobility. Questions were based on data from the literature review, covering the various navigation equipment that visually impaired people may know of.

3.8 Pretesting the questionnaire

Pre-testing is the administration of the data collection instrument to a small set of respondents from the population of the full-scale survey. If problems occur in the pre-test, similar problems are likely to arise in the full-scale administration. The purpose of pre-testing is to identify problems with the data collection instrument and to find possible solutions. We pretested the questionnaire with four visually impaired people. All of them were able to answer the questions, and no question was changed following the pre-test.

3.9 Ethical considerations

There are several ethical issues that had to be taken into account when doing research. Conducting research requires integrity, honesty and morality, not only hard work and expertise. This helps to protect the identity and rights of the human subjects. The following ethical considerations were maintained [38]:

 Self-determination: subjects were allowed to choose voluntarily whether to participate.
 Anonymity: anonymity means that a subject’s name cannot be associated with his or her response. In this study, subjects’ names were not included on the questionnaire.
 Harm: sensitive questions were avoided to protect subjects from harm.
 Deception: subjects may be deceived in order to conceal the actions or functions of the study, to disguise its aim, or to conceal the experiences the subjects will undergo. Deception of research subjects involves misrepresentation of facts to convince another person to accept a fallacy, and it may occur without the researcher being aware of it. We avoided any deception in this study and made sure that everyone understood how the study was conducted.

3.10 Data analysis

The data was coded, classified, organised and analysed after it was collected. For the closed questions, the researcher used a computer program called IBM SPSS to analyse the data. The collected data was analysed using descriptive statistics. The purpose of data analysis is to extract meaning and explanation from the raw data. The research results are presented in the form of percentages, bar graphs, pie charts and frequency counts.
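The frequency counts and percentages reported in this section can be reproduced with a few lines of code. The sketch below is illustrative only (the study used IBM SPSS, not custom code); the counts in the usage note are inferred from the rounded percentages in figure 3.1 and are assumptions.

```java
// Illustrative sketch (the study used IBM SPSS, not custom code): compute
// per-category percentages from raw frequency counts.
public class Descriptives {
    public static double[] percentages(int[] counts) {
        int total = 0;
        for (int c : counts) total += c;          // total number of responses
        double[] out = new double[counts.length];
        for (int i = 0; i < counts.length; i++) {
            out[i] = 100.0 * counts[i] / total;   // per-category percentage
        }
        return out;
    }
}
```

With assumed counts of {6, 10, 5, 3} across the four age groups of the 24 participants, the percentages round to 25%, 42%, 21% and 13%, consistent with figure 3.1.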

3.10.1 Importance of the study

As stated in the previous chapters, the aim of this study is to enhance mobility independence for visually impaired people by developing an application which runs on GPS-enabled Android mobile phones.

3.10.2 Research objectives

To achieve the above aim, the following objectives were identified:

 To give the visually impaired people support in their navigation.
 To enhance their mobility through the UL Turfloop campus.

3.10.3 Data presentation and discussion

The data is presented as descriptive statistics through frequency counts and percentages, illustrated in pie charts and bar graphs.

 Demographic characteristics of participants

This section presents the demographics of the participants in terms of age, gender and travel. The participants’ ages range from 16 to older than 36 years. Figure 3.1 shows which age groups participated: 25% of participants were in the age interval 16-19 years, 42% in the interval 20-25 years, 21% in the interval 26-35 years, and 13% were aged 36 years and above.

Figure 3.1 Age of participants


The data shows that visually impaired people in the age interval 16-25 years participated more than students aged 26 and above. The data in figure 3.2 shows that males participated more than females across all age groups, and that there were no females in the age group of 36 years and older.

Figure 3.2 Gender of participants

 Living areas of participants

The majority of participants reside inside the campus (see figure 3.3): about 54% of the visually impaired students reside on campus, while 46% live outside the campus. Hence, on-campus students are at an advantage over off-campus students, because the mobile application functions inside the campus.

Figure 3.3 Living areas

The data in figure 3.4 illustrates how the visually impaired people travel to their school. Since most of them reside inside the campus, 54% of them go to their school using white canes, 33% travel to school by car, and 13% are helped by someone.



Figure 3.4 Travelling

 Mobile navigation system

Table 3.1 shows that 44% of the visually impaired people have smartphones; a smartphone is a mobile phone built on a mobile operating system, with more advanced computing capability and connectivity than a feature phone. The data in table 3.2 shows that all participants are interested in using the system, and they promised to buy an Android mobile phone.



Table 3.1 Kinds of smart phones

    Kind of smart phone     Participants    Percentage
    Android smart phone     8               32
    iPhone                  3               12
    Other                   13              52
    Total                   24              96

(The percentages are computed out of the population of 25 students, hence the total of 96%.)

Table 3.2 Participants who would like to use the system

    Response    Frequency   Percent
    Yes         24          100.0

Figure 3.5 shows the guidance modes: about 85% of the participants chose the vibration mode over the beep and voice modes. The disadvantage of voice navigation is that it does not work properly in noisy areas, which makes vibration navigation more advantageous.



Figure 3.5 Guidance modes

All the visually impaired students chose to use an application that alerts them on unknown roads (table 3.3). Most of them did not like to move around the campus without a sighted guide, but a few of them (the partially sighted) can go to the cafeterias without a sighted guide.

Table 3.3 Navigation type

    Option                                                           Votes
    1. System that guides you to move from point A to point B        0
    2. System that guides you to use roads in the UL campus          0
    3. System that alerts you when you use a road you did not learn  24



The data in table 3.4 reveals that 83% of the visually impaired people use a white cane when they go to the cafeterias with a sighted guide, but 17% of them do not use a white cane.

Table 3.4 Use of the white cane when walking with a sighted guide

    Response    Votes
    Yes         20
    No          4

3.11 Conclusion

In this chapter, IBM SPSS was used for data analysis; descriptive statistics were used to draw pie and bar charts, and tables were added to summarise the results. Now that the research methodology, in the form of the research design, data collection methods, measuring instruments, sampling, validity, reliability and data analysis, has been discussed, the stage has been set for the design of the mobile application. The next chapter therefore deals with the design of the application.


CHAPTER 4: DESIGN

4.1 Introduction

The design phase describes how the system will operate in terms of the network, software and hardware infrastructure, and the database that will be needed. This chapter explains the analysis and design of the mobile application: the Android architecture, the architectural design, the component-level design, the UI design, and data modelling.

4.2 Analysis

4.2.1 Use case

In software and systems engineering, a use case is a formal way of representing how a system interacts with its environment. A use case describes the set of activities performed by the user of the system to achieve a goal. An actor is an external system, organisation or person that triggers the event to which the system responds [39].

 UML use case diagram

In this study, actors are the visually impaired people. Figure 4.1 presents the use case diagram of the system. As the diagram illustrates, there is only one actor. The actor initiates the system and responds to warning messages encountered during the execution of the system, including internet connectivity and GPS status.


[Use case diagram: the visually impaired person (the actor) starts the mobile application and responds to warning messages.]

Figure 4.1 UML use case diagram for the system

 Use case scenario

The use case scenario in table 4.1 describes the way a user may start the mobile application.

Table 4.1 StartApplication use case scenario

    Use case name           Start application
    Context                 Start the application
    Primary actor           Visually impaired person
    Pre-condition           Android device
    Success post-condition  A start-up activity that loads the map view
    Trigger                 User wants to use the system
    Scenario                1. User starts the application.
                            2. User responds to warning messages such as
                               internet/GPS connectivity errors.
    Exceptions              1. Device does not support internet connectivity.
                            2. Device does not support GPS connectivity.


4.2.2 UML activity diagram

An activity diagram shows a software process as a flow of work through a series of actions. Activity diagrams can describe different types of processes, such as:

 A workflow between systems.
 The steps performed in a use case.
 The flow of data between components of the system.
 A software algorithm.

Figure 4.2 shows the activity diagram for the overall system. The user starts the application; the application establishes an internet connection to communicate with Google Play services for authentication, and then loads the map view from the Google Maps servers as well as from the local database. The application also establishes GPS connectivity to retrieve the current location of the user. The mobile navigation system contains GPS co-ordinate points of known areas, and notifies the user by vibrating and/or playing a beep sound when an unknown area is reached.
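The two decision points in this flow can be summarised in a small sketch; the class, enum and method names here are hypothetical, not the application's actual code:

```java
// Hypothetical sketch of the start-up flow: both the internet connection and
// the GPS receiver must be available before the map view and the user's
// position can be loaded.
public class StartupFlow {
    public enum Result { WARN_NO_INTERNET, WARN_NO_GPS, RUNNING }

    public static Result start(boolean internetEnabled, boolean gpsEnabled) {
        if (!internetEnabled) return Result.WARN_NO_INTERNET; // display warning
        if (!gpsEnabled) return Result.WARN_NO_GPS;           // display warning
        // Connect to Google Play services, load the map view, track position...
        return Result.RUNNING;
    }
}
```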


[Flowchart: start; load predefined GPS points from the local database; start the internet connection; if the internet is not enabled, display a warning message; connect to Google Play services; connect to the Google Maps servers; load the map view; start the GPS connection; if GPS is not enabled, display a warning message; connect to the GPS satellite; load the current position of the user; end.]

Figure 4.2 UML activity diagram for the overall system


4.3 Design of mobile application

This section covers the design of the mobile application. Design shows the software from different viewpoints. The mobile application architecture is illustrated, and the components of the mobile application are explained at a high level, including maps and location, the UI, data storage, and an Activity. The goal of design is to produce a model or representation that exhibits firmness, commodity, and delight.

4.3.1 Android architecture

Android is an open-source operating system for mobile devices. Android was founded by Android Inc. and sold to Google in 2005 [40]. In November 2007, the Open Handset Alliance (OHA), a consortium of several top companies, was announced. Its main goal was to develop open standards for mobile devices so that developers could contribute to improving the features of the product [40]. Android’s kernel is based on the Linux kernel.

Above the Linux kernel (figure 4.3) there are middleware, libraries and APIs written in C, and application software running on an application framework which includes Java-compatible libraries. Android’s software stack runs Java applications using the Java core libraries, and every Java application runs in its own Dalvik virtual machine (DVM).



Figure 4.3 Android software stack [41].

Developers can build applications using the SDK provided by Google. It consists of the APIs used to develop Java applications. These APIs are middleware between the application and the contents of the phone, providing access to phone features such as GPS, Bluetooth and contacts.

4.3.2 Architectural design

The software architecture of a program is the structure or structures of the system, which comprise [39]:

 The software components.
 The externally visible properties of those components.
 The relationships among the components.

Software architectural design represents the structure of the data and program components that are required to build a computer-based system [39]. Figure 4.4 depicts the system architecture of the mobile application modules. The application loads the map view from the Google Maps servers onto the main display and loads GPS co-ordinates from the satellite. The display shows the complete map of the area, with the current location shown by a pointer. Figure 4.4 gives an overview of the application, which contains the Google Maps interface; the map view contains zoom options and the location of the user. The mobile application has permissions to access location services, the internet, the vibrator and media content control, and to write to external memory for data or map caching and the database.

The mobile application is built using the Android framework for the mobile platform. It requires the Google Play services library, the Google Maps API v2, and the following permissions:

 Access internet service: used by the API to download map tiles from the Google Maps servers.
 Access network state: allows the API to check the connection status in order to determine whether data can be downloaded.
 Write external storage: allows the API to cache map tile data in the device's external storage area.
 Access coarse location: allows the API to use Wi-Fi and/or mobile cell data to determine the device's location.
 Access fine location: allows the API to use GPS to determine the device's location to within a very small area.

The mobile application uses a pre-existing database that stores the geographical points of the known areas. It compares the current location of the user with these points to determine whether the current location is known or unknown; locations not covered by the database points are treated as unknown.
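The known/unknown decision can be sketched in plain Java. This is an illustrative sketch, not the application's code: haversineMetres(), isKnown() and the sample coordinates are assumptions, and the 16-metre threshold anticipates the value chosen in chapter 5 (the real application uses the database points and Android's own distance utilities).

```java
// Plain-Java sketch of the known/unknown-location check. The helper
// haversineMetres() computes the great-circle distance between two
// coordinates; Android's Location.distanceTo() plays this role in the
// real application.
public class KnownAreaCheck {

    // Great-circle distance in metres between two lat/lon points.
    static double haversineMetres(double lat1, double lon1,
                                  double lat2, double lon2) {
        final double R = 6371000.0; // mean Earth radius in metres
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * R * Math.asin(Math.sqrt(a));
    }

    // A location is "known" if it lies within thresholdMetres of any stored point.
    static boolean isKnown(double lat, double lon,
                           double[][] storedPoints, double thresholdMetres) {
        for (double[] p : storedPoints) {
            if (haversineMetres(lat, lon, p[0], p[1]) < thresholdMetres) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Two illustrative points near the University of Limpopo campus.
        double[][] stored = { {-23.8880, 29.7380}, {-23.8890, 29.7390} };
        System.out.println(isKnown(-23.8880, 29.7380, stored, 16.0)); // prints true
        System.out.println(isKnown(-23.9500, 29.8000, stored, 16.0)); // prints false
    }
}
```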


(Figure 4.4 diagram components: ANDROID HANDSET, linked to PERMISSIONS, VIBRATION, MEDIA CONTENT CONTROL, LOCATION SERVICES (GPS SATELLITE), INTERNET ACCESS (GOOGLE MAPS SERVERS), EXTERNAL MEMORY (MAP OR DATA CACHING, DATABASE) and DISPLAY MAP VIEW (ZOOM OPTIONS, DISPLAY USER LOCATION).)

Figure 4.4 Overall mobile application architecture

4.3.3 Component level design

High-level components give a clear picture of the aim of the application and the various modules involved. This section presents the component-level design of this application and explains the design and importance of each component, as illustrated in figure 4.5.


(Figure 4.5 components: USER INTERFACE, MAPSACTIVITY, DATABASEHELPER.)

Figure 4.5 Components of the application

An Activity is an application component that provides a screen with which users can interact in order to do something, such as send an email, dial the phone, take a photo, or view a map [42]. The lifecycle of an activity is affected by its association with other activities. An activity is created when the method onCreate() is called, and is destroyed or finished when the method onDestroy() is called. The visible lifetime of an activity spans the call to onStart() and the call to onStop(); during this time, the user can see the activity on screen and interact with it. The foreground lifetime of an activity spans the call to onResume() and the call to onPause(); during this time, the activity is in front of all other activities. For example, onPause() is called when the device goes to sleep or when another activity comes in front of it; a paused activity can later be resumed without loss of data. An activity can be stopped by a call to onStop(), mostly when the device has run out of memory. The activity lifecycle is shown in figure 4.6, and the elaboration of the MapsActivity design component is shown in figure 4.7.
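The callback ordering described above can be modelled as a small plain-Java sketch. The class and method bodies here are illustrative assumptions; in a real application these callbacks are invoked by the Android framework, never called directly.

```java
// Plain-Java model (not Android framework code) of the activity lifecycle
// ordering: onCreate/onStart/onResume on launch, onPause when another
// activity comes in front, onResume on return, and onPause/onStop/onDestroy
// when the activity is finished.
import java.util.ArrayList;
import java.util.List;

public class LifecycleModel {
    private final List<String> log = new ArrayList<>();

    private void callback(String name) { log.add(name); }

    // The framework drives an activity through these phases in order.
    public void launch() { callback("onCreate"); callback("onStart"); callback("onResume"); }
    public void pause()  { callback("onPause"); }   // another activity comes in front
    public void resume() { callback("onResume"); }  // user returns; no data lost
    public void finish() { callback("onPause"); callback("onStop"); callback("onDestroy"); }

    public List<String> getLog() { return log; }

    public static void main(String[] args) {
        LifecycleModel a = new LifecycleModel();
        a.launch();   // visible and in the foreground
        a.pause();    // e.g. the device goes to sleep
        a.resume();   // back in the foreground
        a.finish();   // activity destroyed
        System.out.println(a.getLog());
    }
}
```

The visible lifetime corresponds to the span between "onStart" and "onStop" in the log, and the foreground lifetime to the span between "onResume" and "onPause".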


Figure 4.6 Activity lifecycle [42]

Figure 4.7 represents the component-level design using UML notation. The MapsActivity module accesses data by invoking the DataBaseHelper module, which allows the relevant data, in this case geographical coordinates, to be passed to the component.


MapsActivity
  Attributes: LocationManager, Context, db, Cursor, GoogleMap, NetworkInfo
  Operations: onCreate(), myLocationListener(), getSystemService(), showGPSAlert(), findMinDistance(), findDistance(), drawMarker(), myController(), showMobileData(), DataBaseHelper()

DataBaseHelper
  Operations: DataBaseHelper(), onCreate(), onUpgrade(), createDATABASE(), copyDATABASE(), checkDATABASE(), openDataBase(), getLocationPoints(), open(), close()

Figure 4.7 Component-level design for the MapsActivity

An Android application can access location services through Google Play services by adding the Google Play services library as a reference in the application project. The main component of the location framework is the LocationManager [43] system service, which provides APIs to determine the location and bearing of the underlying device. With the Google Maps Android API, maps based on Google Maps data are added to the application. The API automatically handles access to the Google Maps servers, touch gestures, map display, and downloading of map data. The main class in the Google Maps Android API is SupportMapFragment [44]. A SupportMapFragment displays a map on the screen with data retrieved from the Google Maps servers. When the SupportMapFragment has focus, it captures key presses and touch gestures to pan and zoom the map automatically, including handling the network requests for additional map tiles [43]. It also provides all of the UI elements necessary for users to control the map. A LocationListener [45] is used to receive notifications from the LocationManager when the location has changed. To use Google Maps in the mobile application, a Maps API key had to be obtained to register the application with Google Play services. The methods showGPSAlert() and showMobileData() are used to display a warning message if GPS or mobile data connectivity is disabled. The method getSystemService() is used to obtain the vibrator service of the mobile device. The methods findDistance(), findMinDistance(), and drawMarker() are used to calculate the distance, find the minimum distance, and draw a marker on the map, respectively. The method myController() is called by myLocationListener() when new coordinates are received.

Android supports the following data storage options: SQLite databases, network connection, shared preferences, external storage and internal storage. This mobile application uses SQLite database storage. By default, Android installs applications in internal storage. This storage is private because other applications cannot access it, and it is removed when the user uninstalls the application. Databases that are created are accessible to any class inside the mobile application, but not outside it.

As shown in figure 4.8, this is done by creating a sub-class of SQLiteOpenHelper and overriding the onCreate() method to run the SQLite command that opens a database. The method onUpgrade() is called to drop, add or alter the table. The method getReadableDatabase() is called to read from the database; it returns an SQLiteDatabase object. The method createDATABASE() calls the method checkDATABASE() to verify whether the database already exists in the device's internal memory. The method copyDATABASE() is used to copy the database from the application's assets directory to the device's internal memory. The method openDataBase() is used to open the database in internal memory, and the method getLocationPoints() is used to retrieve data from the database. The methods close() and open() are used to close and open the connection to the database. The Android SDK includes the SQLite database tool, which is required to manipulate the database. This application contains a relational database consisting of the geographical points of the roads.

MapsActivity
  Operations: onCreate(), myLocationListener(), LocationManager(), onStatusChanged(), ConnectivityManager(), getSystemService(), showGPSAlert(), findMinDistance(), findDistance(), drawMarker(), myController(), showMobileData(), DataBaseHelper()

DataBaseHelper
  Attributes: Context, SQLiteDatabase, Cursor
  Operations: DataBaseHelper(), onCreate(), onUpgrade(), createDATABASE(), getReadableDatabase(), copyDATABASE(), checkDATABASE(), openDataBase(), getLocationPoints(), open(), close()

Figure 4.8 Component-level design for the DataBaseHelper

4.3.4 UI design

Activities are responsible for creating the UI with the setContentView(View) method. When a new activity is created, it is placed in front of the current activity and enters the running state, where the user interacts with its UI. Android provides pre-built UI components, such as UI controls and layout objects, that are used to create a GUI. All UI elements are built using View and ViewGroup objects. A View object draws something on the screen that the user can interact with, while a ViewGroup object holds other View (and ViewGroup) objects in order to define the layout of the interface [46]. Android provides a collection of both View and ViewGroup subclasses that offer common input controls (such as buttons and text fields) and various layout models (such as a linear or relative layout). Android contains various layout managers that can be used to create the UI. This application employs a fragment layout. Figure 4.9 shows how Views are represented.

Figure 4.9 Illustration of a view hierarchy, which defines a UI layout [46]

The UI of the mobile application is given in figure 4.10. The UI consists of the MapsActivity, which displays the map on the screen; the marker and the zoom control are also displayed on the map.


Figure 4.10 UI design

4.3.5 Data modelling

The local database on the mobile device is used to store the geographical co-ordinates. It consists of one table with three attributes, as shown in figure 4.11. The _id attribute is the unique primary key identifying each row; the _latitudes and _longitudes attributes store the GPS co-ordinates.

GPS_points
  PK: _id
      _latitudes
      _longitudes

Figure 4.11 UML diagram of the entity GPS_points


4.4 Conclusion

The application will be developed for the Android platform using Java. The following development tools are available from the providers' websites:

	the Java JDK [47];

	the Android Studio bundle [48];

	or the ADT bundle [49].

The software above is available for different operating systems. This chapter illustrated the design of the system in the form of the analysis and the design of the mobile application. The next chapter covers the implementation of the mobile application.


CHAPTER 5: IMPLEMENTATION

5.1 Introduction

In this chapter, we illustrate the important parts of the code. These parts include GPS connectivity, data connectivity, and connecting to the local database. The received GPS coordinates are compared with the coordinates in the database to determine whether the user's location is known or unknown.

5.2 Data providers

The data provider used for the Android platform is the Google Maps API. The developed mobile application retrieves data from the Google Maps servers using Google Maps Android API v2. A Google Maps Android API v2 key was obtained from the Google API console [50] to register the mobile application. The following code is declared in the manifest file so that the application has permission to use the internet and to check the internet connectivity status.
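A sketch of the manifest declarations referred to above, using the standard Android permission names:

```xml
<!-- Internet access for downloading map tiles, and network-state access
     for checking the connection status. -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
```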

When the internet is disabled, the output in figure 5.1 is displayed by the code shown below. The user can cancel, or go to the mobile device settings to enable the internet. When the internet is enabled, the application authenticates to the Google Maps servers and loads the map.

if ( !( net != null && net.isConnected()) ) {
    showMobileData();
}


Figure 5.1 Data is disabled

5.3 Location

The most important sensor serving as input to the mobile application is the location sensor of the mobile device. The mobile application uses GPS to determine the user's location.

5.3.1 Global Positioning System

The location of the mobile device can be determined by the following techniques:

	the GPS sensor;

	cell-network towers;

	a Wi-Fi connection.

The signal generated by the cell-network towers is not accurate compared to the GPS and Wi-Fi signals. The developed mobile application uses the GPS sensor. In order for the application to use location information, the following code is declared in the manifest file.
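A sketch of the location declarations referred to above, again using the standard Android permission names (fine location covers GPS-based positioning):

```xml
<!-- Coarse location (Wi-Fi / cell) and fine location (GPS). -->
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
```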

The class MapsActivity registers a LocationListener by calling requestLocationUpdates() on the LocationManager. The LocationListener provides methods that are invoked when there is a GPS event. The most important method is onLocationChanged(Location location), which is invoked when the GPS coordinates change. The mobile application uses coordinates received from LocationManager.GPS_PROVIDER; coordinates with an accuracy worse than 16 metres are dismissed by the application, as shown in the code below. Coordinates with an accuracy of 16 metres or better are passed to the methods myController() and drawMaker().

if ( location.getProvider().equals( "gps" ) ) {
    if ( location.getAccuracy() > 16 ) {
        location = null;
    } else {
        newMap.clear();
        myController(location);
        drawMaker(location);
    }
}

The mobile application depends on the mobile device's GPS sensor. GPS connectivity must be available before the application can execute its functions. The following code is used to check the GPS connectivity; figure 5.2 is displayed when the GPS is disabled. Two buttons allow the user to cancel, or to go to the mobile device settings to enable GPS.

if (!locationManager.isProviderEnabled(LocationManager.GPS_PROVIDER)) {
    showGPSAlert();
}


Figure 5.2 GPS is disabled

When both the internet and GPS are enabled, the application loads the data in the database to determine the known locations on the UL campus. The code below is used to load the data from the database. Locations which are not saved in the database are marked as unknown locations.

public Cursor getLocationPoints() {
    return mDATABASE.query(DATABASE_TABLE,
            new String[] {FIELD_ID, FIELD_LATITUDE, FIELD_LONGITUDE},
            null, null, null, null, null);
}

The application determines the lowest distance between the user's location and the locations retrieved from the database. If the lowest distance is less than 16 metres, then the user is in a known area; for locations that fall in unknown areas, the mobile device vibrates or makes a beep sound. The distance of 16 metres was chosen based on the accuracy of the locations in the database. The code below is used to calculate the minimum distance. The method vibrate() vibrates the device for 400 milliseconds, and a ToneGenerator plays a sound for 500 milliseconds at a volume of 95%.


public void findMinDistance() {
    double min = arrayList[0];
    for ( int j = 1; j < arraySize; j++ )
        if ( min > arrayList[j] )
            min = arrayList[j];
    if ( min > 16.0 ) {
        vib.vibrate( 400 );
        ToneGenerator tone = new ToneGenerator( AudioManager.STREAM_NOTIFICATION, 95 );
        tone.startTone( ToneGenerator.TONE_CDMA_ALERT_CALL_GUARD, 500 );
    }
}

The following code is declared in the manifest file to allow the application to play a sound and to use the mobile device vibrator.
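A sketch of this declaration: the VIBRATE permission is the standard one for the vibrator, and playing a tone through ToneGenerator requires no additional permission.

```xml
<!-- Required for Vibrator.vibrate(). -->
<uses-permission android:name="android.permission.VIBRATE" />
```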

The marker on the map in figure 5.3 is drawn by the code below. The marker is placed at the location of the user by the method addMarker(), and the method animateCamera() is used to animate the map as the user moves.

public void drawMaker( Location loc ) {
    LatLng user = new LatLng( loc.getLatitude(), loc.getLongitude() );
    newMap.addMarker( new MarkerOptions().position(user) ).setTitle("Me");
    newMap.animateCamera( CameraUpdateFactory.newLatLngZoom(user, 17.0f) );
}

Figure 5.3 Application screenshot during testing


Figure 5.4 shows the roads marked as known areas at the UL Turfloop campus. These roads were chosen because the visually impaired people were trained to use them.

Figure 5.4 Known roads

5.4 Conclusion

The mobile application uses a local database in the mobile device's internal memory; the database is verified when the application starts to execute. Android 4.0.4 on a Samsung Galaxy Pocket was used to test the application, and no bugs were found during the tests.


CHAPTER 6: CONCLUSION

6.1 Introduction

This chapter presents a summary of the study, together with its challenges, recommendations and limitations.

6.2 Summary

The study showed that it is possible to enhance mobility independence for visually impaired people. The study was conducted at the UL Turfloop campus and was targeted at the visually impaired students registered with the UL. A literature review was conducted to survey the current navigation equipment used by the visually impaired, and the study identified GPS-based navigation as the solution for enhancing their mobility. Data were collected from the visually impaired students; no sensitive information was collected. The collected data were analysed using the IBM SPSS program, and the analysis indicated that the mobile application could be implemented. The mobile application was then developed for use by the visually impaired; it was tested and gave satisfactory results.

6.3 Challenges

The main challenge was to construct the roads on the campus using GPS coordinates with sufficient accuracy.


6.4 Recommendations

	This study can be improved to cover all the roads in the UL Turfloop campus.

	Ethical clearance can be obtained from the UL to involve the visually impaired students during all the phases of the study.

	Further studies can be made using a qualitative research design to reveal the views of the visually impaired students.

6.5 Limitations to the study

This study is limited in terms of resources: the mobile application was tested on a real device only on Android 4.0.4 instead of on all Android versions. The researcher did, however, test the mobile application on an Android emulator with 100% success for compatibility with the different versions; performance was not measured on the emulator, since the mobile application requires live GPS coordinates.


BIBLIOGRAPHY

[1] “Council's Vision,” 2010. [Online]. Available: http://www.sancb.org.za/article/councils-vision#sthash.mDL9oS5J.dpuf. [Accessed 3 March 2014].
[2] A. Helal, S. E. Moore and B. Ramachandran, “Drishti: An Integrated Navigation System for Visually Impaired and Disabled,” in Wearable Computers, 2001. Proceedings. Fifth International Symposium on, 2001.
[3] V. Kulyukin, C. Gharpure, P. Sute and S. Pavithran, “Perception of Audio Cues in Robot-Assisted Navigation,” in RESNA 27th International Annual Conference, Florida, 2004.
[4] “RFID vs Barcodes: Advantages and disadvantages comparison,” 1 May 2012. [Online]. Available: http://www.aalhysterforklifts.com.au/index.php/about/blog-post/rfid_vs_barcodes_advantages_and_disadvantages_comparison. [Accessed 16 March 2014].
[5] [Online]. Available: http://www.willard.co.za/. [Accessed 16 March 2014].
[6] B. Violino, “Mining for RFID's Benefits,” 15 August 2005. [Online]. Available: http://www.rfidjournal.com/articles/view?1759#sthash.RzDWJrCG.dpuf. [Accessed 16 March 2014].
[7] F. Alsaade, N. Zaman, Q. M. Ilyas, Y. Fauda and M. Ahmad, “Proposing a Smart and Intelligent System to Assist the Blind Community Based on Advance Technologies,” Journal of Applied Sciences, vol. 13, no. 7, pp. 1112-1117, 2013.
[8] K. Yelamarthi, D. Haas, D. Nielsen and S. Mothersell, “RFID and GPS integrated navigation system for the visually impaired,” in Circuits and Systems (MWSCAS), 2010 53rd IEEE International Midwest Symposium on, Seattle, WA, 2010.
[9] Y. Niitsu, T. Taniguchi and K. Kawashima, “Detection and notification of dangerous obstacles and places for visually impaired persons using a smart cane,” in Mobile Computing and Ubiquitous Networking (ICMU), 2014 Seventh International Conference on, Singapore, 2014.
[10] A. J. Fukasawa and K. Magatani, “A navigation system for the visually impaired: an intelligent white cane,” in Engineering in Medicine and Biology Society (EMBC), 2012 Annual International Conference of the IEEE, San Diego, CA, 2012.
[11] Y. Shiizu, Y. Hirahana, K. Yanashima and K. Magatani, “The development of a white cane which navigates the visually impaired,” in Engineering in Medicine and Biology Society, 2007. EMBS 2007. 29th Annual International Conference of the IEEE, Lyon, 2007.
[12] K. Iwatsuka, K. Yamamoto and K. Kato, “Development of a guide dog system for the blind people with character recognition ability,” in Pattern Recognition, 2004. ICPR 2004. Proceedings of the 17th International Conference on, 2004.
[13] A. J. Dowson, “The Development of Surface Tactile Indicators,” in Proceedings of the 7th International Conference on Concrete Block Paving (PAVE AFRICA 2003), Sun City, 2003.
[14] A. M. Kassim, M. A. Azam, N. Abas and T. Yasuno, “Design and development of navigation system by using RFID technology,” in System Engineering and Technology (ICSET), 2013 IEEE 3rd International Conference on, Shah Alam, 2013.
[15] H. Tang and D. J. Beebe, “An Oral Tactile Interface for Blind Navigation,” Neural Systems and Rehabilitation Engineering, IEEE Transactions on, vol. 14, no. 1, pp. 116-123, March 2006.
[16] J. M. Loomis, R. G. Golledge and R. L. Klatzky, “Navigation System for the Blind: Auditory Display Modes and Guidance,” Presence, vol. 7, no. 2, pp. 193-203, April 1998.
[17] D. A. Brusnighan, M. G. Strauss, M. G. Floyd and B. C. Wheeler, “Orientation aid implementing the global positioning system,” in Bioengineering Conference, 1989. Proceedings of the 1989 Fifteenth Annual Northeast, 1989.
[18] S. Faibish and M. Abramovitz, “Perception and navigation of mobile robots,” in Intelligent Control, 1992. Proceedings of the 1992 IEEE International Symposium on, Glasgow, 1992.
[19] M. Agrawal, K. Konolige and R. C. Bolles, “Localization and Mapping for Autonomous Navigation in Outdoor Terrains: A Stereo Vision Approach,” in Applications of Computer Vision, 2007. WACV '07. IEEE Workshop on, Austin, TX, 2007.
[20] J. M. Coughlan and A. L. Yuille, “Manhattan World: Compass Direction from a Single Image by Bayesian Inference,” in Computer Vision, 1999. The Proceedings of the Seventh IEEE International Conference on (Volume 2), Kerkyra, 1999.
[21] X. Yan, “Model Diagnostics: Heteroscedasticity and Linearity,” in Linear Regression Analysis: Theory and Computing, World Scientific Publishing Company, Incorporated, 2009, pp. 195-217.
[22] K. Karacs, A. Lazar, R. Wagner, B. Balint, T. Roska and M. Szuhaj, “Bionic eyeglass: The first prototype. A personal navigation device for visually impaired, a review,” in Applied Sciences on Biomedical and Communication Technologies, 2008. ISABEL '08. First International Symposium on, Aalborg, 2008.
[23] M. Zöllner, S. Huber, H. Reiterer and H. C. Jetter, “NAVI: A Proof-of-concept of a Mobile Navigational Aid for Visually Impaired Based on the Microsoft Kinect,” in Proceedings of the 13th IFIP TC 13 International Conference on Human-Computer Interaction, Volume Part IV, Lisbon, Springer-Verlag, 2011, pp. 584-587.
[24] E. B. Kaiser and M. Lawo, “Wearable navigation system for the visually impaired and blind people,” in Computer and Information Science (ICIS), 2012 IEEE/ACIS 11th International Conference on, 2012.
[25] J. H. Sanchez, F. A. Aguayo and T. M. Hassler, “Independent Outdoor Mobility for the Blind,” in Virtual Rehabilitation, 2007, Venice, Italy, 2007.
[26] A. Mecocci, R. Lodola and U. Salvatore, “Outdoor scenes interpretation suitable for blind people navigation,” in Image Processing and its Applications, 1995. Fifth International Conference on, Edinburgh, 1995.
[27] Y. Wei, X. Kou and M. Lee, “Development of a guide-dog robot system for the visually impaired by using fuzzy logic based human-robot interaction approach,” in Control, Automation and Systems (ICCAS), 2013 13th International Conference on, Gwangju, 2013.
[28] Y. Wang and K. J. Kuchenbecker, “HALO: Haptic Alerts for Low-hanging Obstacles in white cane navigation,” in Haptics Symposium (HAPTICS), 2012 IEEE, Vancouver, BC, 2012.
[29] J. Rowley, “Using case studies in research,” Management Research News, vol. 25, no. 1, pp. 16-27, 2002.
[30] A. S. De Vos and C. B. Fouche, “General introduction to research design, data collection methods and data analysis,” in Research at Grass Roots: A Primer for the Caring Professions, A. S. De Vos, Ed., Pretoria, Van Schaik, 1998, p. 77.
[31] G. H. Huysamen, Methodology for the Social and Behavioural Sciences, Pretoria, Sigma, 1994.
[32] D. Muijs, “Introduction to quantitative research,” in Doing Quantitative Research in Education with SPSS, SAGE Publications, 2010, p. 1.
[33] WebFinance, Inc., [Online]. Available: http://www.businessdictionary.com/definition/survey-research.html. [Accessed 05 June 2014].
[34] “Disabled Student Unit,” 2014. [Online]. Available: http://www.ul.ac.za/index.php?Entity=Disabled%20Student%20Unit. [Accessed 05 June 2014].
[35] B. Kitchenham and S. L. Pfleeger, “Principles of survey research: part 5: populations and samples,” ACM SIGSOFT Software Engineering Notes, vol. 27, no. 5, pp. 17-20, 2002.
[36] “Primary research,” Wikimedia Foundation, Inc., 12 June 2014. [Online]. Available: http://en.wikipedia.org/wiki/Primary_research. [Accessed 15 June 2014].
[37] N. Golafshani, “Understanding reliability and validity in qualitative research,” The Qualitative Report, vol. 8, no. 4, pp. 597-607, 2003.
[38] H. Strydom, “Ethical aspects of research in the social sciences and human service professions,” in Research at Grassroots: For the Social Sciences and Human Service Professions, 2nd ed., A. S. De Vos, H. Strydom, C. B. Fouche and C. S. L. Delport, Eds., Pretoria, Van Schaik, 2002, pp. 64-73.
[39] R. S. Pressman, Software Engineering: A Practitioner's Approach, 7th ed., McGraw-Hill Higher Education, 2010.
[40] “Android (operating system),” Wikimedia Foundation, Inc., 28 July 2014. [Online]. Available: http://en.m.wikipedia.org/wiki/Android_(operating_system). [Accessed 29 July 2014].
[41] “Android Security Overview,” [Online]. Available: https://source.android.com/devices/tech/security/. [Accessed 20 July 2014].
[42] “Activities,” Google Inc., [Online]. Available: http://developer.android.com/guide/components/activities.html. [Accessed 04 August 2014].
[43] “Location and Maps,” [Online]. Available: http://developer.android.com/guide/topics/location/index.html. [Accessed 05 August 2014].
[44] “SupportMapFragment,” [Online]. Available: http://developer.android.com/reference/com/google/android/gms/maps/SupportMapFragment.html. [Accessed 05 August 2014].
[45] “LocationListener,” [Online]. Available: http://developer.android.com/reference/android/location/LocationListener.html. [Accessed 05 August 2014].
[46] “UI Overview,” [Online]. Available: http://developer.android.com/guide/topics/ui/overview.html#Layout. [Accessed 04 August 2014].
[47] “Java SE Development Kit 8 Downloads,” [Online]. Available: www.oracle.com/technetwork/java/javase/downloads/index.html. [Accessed 06 August 2014].
[48] “Android Studio,” [Online]. Available: developers.android.com/sdk/installing/studio.html. [Accessed 07 August 2014].
[49] “Get the Android SDK,” [Online]. Available: developers.android.com/sdk/index.html. [Accessed 07 August 2014].
[50] “Google API,” Google Inc., [Online]. Available: https://code.google.com/apis/console/?noredirect. [Accessed 30 September 2014].


APPENDIX A: QUESTIONNAIRE

The questionnaire was used to gather information from the visually impaired.

Participant number: ______

1. What is your gender?
   Male (1)
   Female (2)

2. What is your age range?
   10-19 years (1)
   20-25 years (2)
   26-35 years (3)
   36+ years (4)

3. How do you travel to school?
   By foot (1)
   By car (2)
   Guide from someone (3)

4. Where do you stay?
   On campus (1)
   Off campus – short distance (2)
   Off campus – long distance (3)

5. Do you walk to cafeterias unaccompanied?
   Yes (1)
   No (2)

6. Do you often get lost?
   Yes (1)
   No (2)

7. Do you generally use a white cane when you walk to the cafeteria alone?
   Most of the times (1)
   Do not go to cafeteria (2)
   Help from someone (3)

8. How many times do you go to the cafeteria per day?
   Once (1)
   Twice (2)
   Thrice (3)
   More than three (4)

9. Do you use a white cane when you walk with a sighted guide?
   Yes (1)
   No (2)

10. What kind of a smart phone do you have?
   Android smart phone (1)
   Blackberry smart phone (2)
   iPhone smart phone (3)
   Windows smart phone (4)
   Other (5)

11. What kind of a mobile navigation system have you used?
   White cane navigation system (1)
   RFID navigation system (2)
   Guide dog navigation system (3)
   Tactile navigation system (4)
   Mobile navigation system (5)
   Other (6)

12. Would you like to use the proposed mobile navigation system together with your white cane?
   Yes (1)
   No (2)

13. What kind of a navigation system do you choose between the following?
   System that guides you to move from point A to point B (1)
   System that guides you to use roads in the UL campus (2)
   System that alerts you when you use a road you did not learn (3)

14. What method of navigation do you prefer between the following?
   Voice navigation (1)
   Vibration navigation (2)
   Beep navigation (3)


APPENDIX B: OBTAINING GOOGLE MAPS API KEY

The next steps should be followed to obtain the Google Maps API Key [50]:

1. Get the SHA-1 fingerprint of the SDK debug certificate. Locate the debug keystore file. The file is named debug.keystore and is created when the project is built. By default, it is stored in the following directory:

	Windows: c:\Users\your_user_name\.android\

Run the following command to display the SHA-1 fingerprint:

keytool -list -v -keystore \debug.keystore -alias androiddebugkey -storepass android -keypass android

2. Create an API project in the Google APIs Console. In the web browser, navigate to the following web page:

https://code.google.com/apis/console/?noredirect

Follow these steps to register for the Google Maps API Key:

a) If you don't have a Google account, use the link on the page to register a new account.
b) Click Create Project; the console will create a project named API Project.
c) A list of APIs and services will be displayed in the main window.
d) Scroll to Google Maps Android API v2 and, to the right of the entry, click the switch indicator to turn it on.
e) On the resulting page, accept the Google Maps Android API Terms of Service.
f) Navigate to the project in the Google APIs Console.

g) Click API Access in the left navigation bar.
h) Click Create New Android Key on the resulting page.
i) In the resulting dialog, enter the SHA-1 fingerprint, then a semicolon, then the application's package name. For example:

51:6C:33:09:12:E0:47:B6:AF:AF:AF:52:6B:AF:0B:9D:38:8E:40:50;se.research.project.naviblind

j) The Google APIs Console will display the API key, for example: AIzaSyAuUA0YvIFPyR3tJwRBN7NXWs-Z8SGTZ3s

3. Add the API Key to the application project. In AndroidManifest.xml, add the following code as a child of the <application> element, and replace API_KEY with the obtained API Key.
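A sketch of the element to add, assuming the meta-data name used by Google Maps Android API v2 (replace API_KEY with the obtained key):

```xml
<!-- Child of <application>; the name attribute is the one expected by
     Google Maps Android API v2. -->
<meta-data
    android:name="com.google.android.maps.v2.API_KEY"
    android:value="API_KEY" />
```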


APPENDIX C: UML DIAGRAMS


APPENDIX D: SOURCE CODE OF THE APPLICATION

Activity_maps.xml is the layout for the UI.
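A minimal sketch of such a layout, assuming the SupportMapFragment-based fragment layout described in section 4.3.4 (the id and sizing attributes are illustrative):

```xml
<!-- Fragment hosting the Google map; MapsActivity inflates this layout. -->
<fragment xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/map"
    android:name="com.google.android.gms.maps.SupportMapFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```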


AndroidManifest.xml is used to define the permissions to access the mobile device's hardware.
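A condensed sketch of the manifest, combining the permissions listed in section 4.3.2 with the package name from the source; the element details are illustrative, not the verbatim file.

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="se.research.project.naviblind">

    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    <uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
    <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
    <uses-permission android:name="android.permission.VIBRATE" />

    <application>
        <activity android:name=".MapsActivity" />
        <!-- The Google Maps API v2 key is added here as a meta-data child. -->
    </application>
</manifest>
```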


MapsActivity.java is the main java class package se.research.project.naviblind; import android.app.AlertDialog; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.database.Cursor; import android.location.Location; import android.location.LocationListener; import android.location.LocationManager; import android.media.AudioManager; import android.media.ToneGenerator; import android.net.ConnectivityManager; import android.net.NetworkInfo; import android.os.Vibrator; import android.provider.Settings; import android.support.v4.app.FragmentActivity; import android.os.Bundle; import android.view.KeyEvent; import android.widget.EditText; import android.widget.TextView; import android.widget.Toast; import com.google.android.gms.maps.CameraUpdateFactory; import com.google.android.gms.maps.GoogleMap; import com.google.android.gms.maps.SupportMapFragment; import com.google.android.gms.maps.model.LatLng; import com.google.android.gms.maps.model.MarkerOptions; import java.io.IOException; import java.security.Key; public class MapsActivity extends FragmentActivity { private GoogleMap newMap; // instance of map DataBaseHelper db; //instance of DataBaseHelper int arraySize = 298; //maximum number of items in the database double [] arrayList ; //stores distance int [] idArray; //stores id field from database String [] latArray; //stores latitudes from database String [] lonArray; //stores longitudes from database double distance; //stores distance between two points Location loc2; //stores locations from the database


    LocationManager locationManager;   // manages location updates
    LocationListener locationListener; // listens for incoming points
    Vibrator vib;                      // instance of the vibrator
    Cursor cursor;                     // retrieves points from the database

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_maps);
        setUpMapIfNeeded();
        tx = (TextView) findViewById(R.id.textView);
        idArray = new int[arraySize];      // initialise idArray
        latArray = new String[arraySize];  // initialise latArray
        lonArray = new String[arraySize];  // initialise lonArray
        arrayList = new double[arraySize]; // initialise arrayList
        vib = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        locationManager = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
        locationListener = new myLocationListener();
        locationManager.requestLocationUpdates(
                LocationManager.GPS_PROVIDER, 3000, 0, locationListener);
        // if the GPS is disabled, call showGPSAlert()
        if (!locationManager.isProviderEnabled(LocationManager.GPS_PROVIDER)) {
            showGPSAlert();
        }
        ConnectivityManager cm = (ConnectivityManager) getSystemService(
                Context.CONNECTIVITY_SERVICE);
        NetworkInfo net = cm.getActiveNetworkInfo();
        // if the network is disabled, call showMobileData()
        if (!(net != null && net.isConnected()))
            showMobileData();
        db = new DataBaseHelper(this); // create an instance of DataBaseHelper
        // try to create the database
        try {
            db.createDATABASE();
        } catch (IOException e) {
            throw new Error("ErrorCopyingDataBase");
        }
        db.openDataBase();               // open the database
        cursor = db.getLocationPoints(); // get the locations from the database
        int counter = 0;                 // declare and initialise counter to zero
        // copy all the fields from the cursor into the three arrays
        if (cursor.moveToFirst())
            do {
                idArray[counter] = Integer.parseInt(cursor.getString(0));
                latArray[counter] = cursor.getString(1);
                lonArray[counter] = cursor.getString(2);
                counter++;
            } while (cursor.moveToNext());
        cursor.close(); // close the cursor
        db.close();     // close the database
    }

    class myLocationListener implements LocationListener {

        public void onLocationChanged(Location location) {
            // pass points with better than 16 m accuracy to myController()
            // and drawMaker()
            if (location != null) {
                if (location.getProvider().equals("gps")) {
                    if (location.getAccuracy() > 16) {
                        location = null;
                    }


                    else {
                        newMap.clear();
                        myController(location);
                        drawMaker(location);
                    }
                }
            }
        }

        public void onStatusChanged(String provider, int status, Bundle extras) { }
        public void onProviderEnabled(String provider) { }
        public void onProviderDisabled(String provider) { }
    }

    // Method to stop location updates and kill the process
    public boolean onKeyDown(int key, KeyEvent event) {
        if (key == KeyEvent.KEYCODE_BACK) {
            locationManager.removeUpdates(locationListener);
            finish();
        }
        return super.onKeyDown(key, event);
    }

    // Method to create locations using the data in the three arrays
    public void myController(Location loc1) {
        // initialise loc2, which stores locations from the database
        loc2 = new Location("");
        for (int k = 0; k < arraySize; k++) {
            loc2.setLatitude(Double.parseDouble(latArray[k]));
            loc2.setLongitude(Double.parseDouble(lonArray[k]));
            findDistance(loc1, loc2, idArray[k] - 1);
        }
        findMinDistance();
    }

    // Method to compute the distance
    public void findDistance(Location userLoc, Location dbLoc, int id) {
        // calculate the distance between the user location and the location
        // in the database, then store the distance in arrayList
        distance = dbLoc.distanceTo(userLoc);
        arrayList[id] = distance;
    }

    // Method to find the minimum distance
    public void findMinDistance() {
        double min = arrayList[0];
        // find the minimum distance
        for (int j = 1; j < arraySize; j++)
            if (min > arrayList[j]) {
                min = arrayList[j];
            }
        if (min > 16.0) {
            vib.vibrate(100); // vibrate for 100 milliseconds
            // play a tone for 500 milliseconds at 95% volume
            ToneGenerator tone =
                    new ToneGenerator(AudioManager.STREAM_NOTIFICATION, 95);
            tone.startTone(ToneGenerator.TONE_CDMA_ALERT_CALL_GUARD, 500);
        }
    }

    // Method to display a dialog message if the internet is disabled
    public void showMobileData() {
        AlertDialog.Builder newDialog = new AlertDialog.Builder(this);
        newDialog.setMessage("Data is disabled.").setCancelable(false).
                setPositiveButton("Settings", new DialogInterface.OnClickListener() {
                    public void onClick(DialogInterface dialogInt, int paramId) {
                        Intent intent = new Intent(Settings.ACTION_SETTINGS);
                        startActivity(intent);
                    }
                });


        newDialog.setNegativeButton("Cancel", new DialogInterface.OnClickListener() {
            public void onClick(DialogInterface dialogInt, int paramId) {
                dialogInt.cancel();
            }
        });
        AlertDialog alertDialog = newDialog.create();
        alertDialog.show();
    }

    // Method to display a dialog message if the GPS is disabled
    public void showGPSAlert() {
        AlertDialog.Builder newDialog = new AlertDialog.Builder(this);
        newDialog.setMessage("GPS is disabled.").setCancelable(false).
                setPositiveButton("Settings", new DialogInterface.OnClickListener() {
                    public void onClick(DialogInterface dialogInt, int paramId) {
                        Intent settingIntent = new Intent(
                                Settings.ACTION_LOCATION_SOURCE_SETTINGS);
                        startActivity(settingIntent);
                    }
                });
        newDialog.setNegativeButton("Cancel", new DialogInterface.OnClickListener() {
            public void onClick(DialogInterface dialogInt, int paramId) {
                dialogInt.cancel();
            }
        });
        AlertDialog alertDialog = newDialog.create();
        alertDialog.show();
    }

    protected void onResume() {
        super.onResume();
        setUpMapIfNeeded();
    }

    private void setUpMapIfNeeded() {
        // check whether a map is already instantiated
        if (newMap == null) {
            // try to obtain the map from the SupportMapFragment
            newMap = ((SupportMapFragment)
                    getSupportFragmentManager().findFragmentById(R.id.map)).getMap();
            // check whether the map exists
            if (newMap != null) {
                setUpMap();
            }
        }
    }

    // Method to set up the map
    private void setUpMap() {
        newMap.addMarker(new MarkerOptions()
                .position(new LatLng(-23.8851494, 29.7391727)).title("Me"));
        newMap.setMapType(GoogleMap.MAP_TYPE_SATELLITE);
    }

    // Method to draw a marker on the screen
    public void drawMaker(Location loc) {
        LatLng user = new LatLng(loc.getLatitude(), loc.getLongitude());
        newMap.addMarker(new MarkerOptions().position(user)).setTitle("Me");
        newMap.animateCamera(CameraUpdateFactory.newLatLngZoom(user, 17.0f));
    }
}
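The off-route test implemented by myController(), findDistance() and findMinDistance() can be exercised away from the device. The sketch below is a standalone version under stated assumptions: Android's Location.distanceTo() is replaced by a haversine great-circle distance, and the names NearestPoint and offRoute are illustrative, not part of the application.

```java
public class NearestPoint {

    static final double EARTH_RADIUS_M = 6371000.0; // mean Earth radius in metres

    // Great-circle (haversine) distance in metres between two lat/lon points;
    // stands in for Android's Location.distanceTo()
    static double distanceMetres(double lat1, double lon1,
                                 double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // True when the user is further than thresholdM metres from every stored
    // road point, i.e. the condition under which the app vibrates and beeps
    static boolean offRoute(double userLat, double userLon,
                            double[][] roadPoints, double thresholdM) {
        double min = Double.MAX_VALUE;
        for (double[] p : roadPoints) {
            double d = distanceMetres(userLat, userLon, p[0], p[1]);
            if (d < min) min = d;
        }
        return min > thresholdM;
    }
}
```

With the campus coordinate from setUpMap() as the only road point, a user standing exactly on it is on route, while one roughly 95 m south exceeds the 16 m threshold and would trigger the alert.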


DataBaseHelper.java is used to access the database.

package se.research.project.naviblind;

import android.content.Context;
import android.database.Cursor;
import android.database.SQLException;
import android.database.sqlite.SQLiteDatabase;
import android.database.sqlite.SQLiteOpenHelper;
import android.os.Build;
import android.util.Log;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class DataBaseHelper extends SQLiteOpenHelper {

    private static String DATABASE_PATH = "";                  // empty path
    private static String DATABASE_NAME = "LOCATIONS.db";      // database name
    private static int VERSION = 1;                            // database version
    private static final String DATABASE_TABLE = "GPS_points"; // database table
    private static final String FIELD_ID = "_id";              // primary key field
    private static final String FIELD_LATITUDE = "_latitude";  // latitude
    private static final String FIELD_LONGITUDE = "_longitude";// longitude
    private SQLiteDatabase mDATABASE;
    private final Context mCONTEXT;

    // Constructor: creates the database and sets the path where it is
    // stored in the phone memory
    public DataBaseHelper(Context context) {
        super(context, DATABASE_NAME, null, VERSION);
        if (Build.VERSION.SDK_INT >= 17)
            DATABASE_PATH = context.getApplicationInfo().dataDir + "/databases/";
        else
            DATABASE_PATH = "/data/data/" + context.getPackageName() + "/databases/";
        this.mCONTEXT = context;
    }

    // Method to check whether the database exists in the phone memory.
    // If it does not exist, it tries to call copyDATABASE()
    public void createDATABASE() throws IOException {
        boolean is_DB_EXISTS = checkDATABASE();
        if (!is_DB_EXISTS) {
            this.getReadableDatabase();
            try {
                copyDATABASE();
            } catch (IOException mIOException) {
                throw new Error("ErrorCopyingDataBase");
            }
        }
    }

    // Method to copy the database from the assets directory to phone memory
    private void copyDATABASE() throws IOException {
        InputStream inputStream = mCONTEXT.getAssets().open(DATABASE_NAME);
        String outFileName = DATABASE_PATH + DATABASE_NAME;
        OutputStream outputStream = new FileOutputStream(outFileName);


        byte[] buffer = new byte[1024];
        int length;
        while ((length = inputStream.read(buffer)) > 0) {
            outputStream.write(buffer, 0, length);
        }
        outputStream.flush();
        outputStream.close();
        inputStream.close();
    } // end of copyDATABASE()

    // Method to check whether the database exists in the phone memory
    private boolean checkDATABASE() {
        File DbFILE = new File(DATABASE_PATH + DATABASE_NAME);
        return DbFILE.exists();
    } // end of checkDATABASE()

    // Method to open the database from the phone memory
    public void openDataBase() throws SQLException {
        String path = DATABASE_PATH + DATABASE_NAME;
        mDATABASE = SQLiteDatabase.openDatabase(path, null,
                SQLiteDatabase.OPEN_READONLY);
    } // end of openDataBase()

    // Method to close the opened database
    public synchronized void close() {
        if (mDATABASE != null)
            mDATABASE.close();
        super.close();
    }

    // Method to return all the fields from the table
    public Cursor getLocationPoints() {
        return mDATABASE.query(DATABASE_TABLE,
                new String[] {FIELD_ID, FIELD_LATITUDE, FIELD_LONGITUDE},
                null, null, null, null, null);
    }

    public void onCreate(SQLiteDatabase db) { }

    public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { }
}
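DataBaseHelper assumes that LOCATIONS.db, shipped in the assets directory, already contains the GPS_points table. The report does not list its schema; the reconstruction below is an assumption, with the column names taken from the field constants above and the TEXT types inferred from the Integer.parseInt and Double.parseDouble calls MapsActivity applies to the cursor values.

```sql
-- Assumed reconstruction of the prepackaged GPS_points table
CREATE TABLE GPS_points (
    _id        INTEGER PRIMARY KEY,  -- read via cursor.getString(0), parsed as int
    _latitude  TEXT NOT NULL,        -- parsed with Double.parseDouble
    _longitude TEXT NOT NULL         -- parsed with Double.parseDouble
);
```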


APPENDIX E: DATA FROM UL DSU

UNIVERSITY OF LIMPOPO – TURFLOOP CAMPUS
DISABLED STUDENT UNIT
STATISTICS FOR 2014 REGISTERED STUDENTS

DISABILITY             No. OF STUDENTS   FEMALE   MALE
BLIND                  11                03       08
ALBINISM               22                11       11
PARTIALLY SIGHTED      14                05       09
MOBILITY IMPAIRED      06                03       03
PHYSICALLY IMPAIRED    10                06       04
HEARING IMPAIRED       01                00       01
MULTIPLE DISABILITY    01                00       01
CEREBRAL PALSY         01                01       00
AMPUTEE                02                01       01
DWARFISM               01                01       00
SCHIZOPHRENIA          01                01       00
MEDICAL CONDITION      05                03       02
TOTAL                  75                35       40

APPENDIX F: PROJECT PLAN

SCHEDULED MEETINGS FOR THE YEAR 2014
TIME: 09:00
DAY: FRIDAY

Planned activities    Duration   Date
Research Proposal     1 hour     11 April 2014
Literature review     1 hour     20 June 2014
Methodology           1 hour     27 June 2014
Design                1 hour     25 July 2014
Implementation        1 hour     22 August 2014
Introduction          1 hour     05 September 2014
Conclusion            1 hour     19 September 2014

