
Design and implementation of a secure mobile phone-based route navigator (mGuide), adapted for the visually challenged people

Abstract

This paper focuses on the development of “mGuide”, a turn-by-turn voice navigation mobile application adapted for visually challenged people (VCP). The application was developed in response to the challenges VCP face in outdoor navigation. It draws on five map servers to provide not only the best real-time routing at any instant but also faster loading and retrieval of route data into the phone’s storage. The user’s destination points are obtained in real time using OpenStreetMap and stored in the database. Once the destination has been set and the mode of travel chosen, the software gives turn-by-turn voice navigation until the destination is reached. In addition, the application can read inbox messages aloud as they are received. Furthermore, the application gives an audio alert to the user in case of a lost route and redirects the user along an alternative route. The application was tested with 9 totally blind students and 1 partially blind student from Kenyatta University in Kenya, all of whom were successfully guided. In testing the performance of the whole system, no sound alerts were issued for deviations of 0–10 m from the middle of the road; for deviations beyond 10 m, sound alerts were heard. The average delay of the sound alert following a wrong turn was 5.23 s.

Introduction

Statistics compiled in 2010 by the World Health Organization showed that 285 million people were visually impaired globally [11]. Of these, 39 million were totally blind and 246 million had low vision. About 90% of visually challenged persons (VCP) live in developing countries. The major causes of visual impairment are uncorrected refractive errors (myopia, hyperopia or astigmatism), cataracts and glaucoma, at 43, 33 and 2%, respectively [13]. Sight loss is often accompanied by a loss of independence, so VCP encounter challenges in outdoor movement and orientation [4]. In addition, reduced visual capacity strains a VCP’s spatial problem-solving ability, leading to stress and anxiety. Consequently, a VCP may avoid leaving home or, in the case of a learning institution, miss classes and other routine activities.

The assistive devices currently relied upon by VCP include the white cane, the guide dog, a guide person and electronic travel aids (ETAs). A cane has a one-time cost and provides the user with a wealth of information about his or her immediate environment [11]. Not only does a white cane help the user in navigation, but it also provides a visual cue to others, informing them of the user’s visual impairment. However, there are drawbacks: a long white cane can be cumbersome when using public transport, or even indoors. Technological advancement has led to the development of the smart cane, which can detect obstacles using ultrasound or radar as it comes in contact with the ground [7]. The white cane has thus been transitioning into the more robust and multifunctional smart canes in use today. For example, a solar-powered smart cane has been proposed which uses an Arduino Uno together with ultrasonic and water sensors; the VCP’s mobile phone is connected via a Bluetooth link, and an app is installed on the user’s phone [1]. A guide dog works on commands only to guide the user around obstacles [11]. It has no sense of direction, so the VCP must be trained to remain in control at all times and give the dog the appropriate direction commands. Further, guide dogs are scarce, and only certain breeds such as the German Shepherd and the Labrador, with qualities including intelligence, persistence and friendliness, can be trained. It takes 1.5–2 years to train a qualified guide dog, which can then be used for about five years. An interesting development in this line is the use of artificial intelligence and automation in the development of guide dog robots [6].

ETAs are based on an ultrasound or laser transmitter which radiates outwards toward any obstacle; the reflected waves, upon reaching the receiver, produce an audio or vibration response. To illustrate the use of ETAs, a helmet-based system that aids the visually impaired has been proposed. The system consists of three working modules, namely face recognition, fall detection and obstacle detection. Face recognition is implemented on a Raspberry Pi, while fall detection, intended for emergency situations, is implemented using an ATmega16 IC. Upon a sudden fall, the fall detection module transfers the individual’s location coordinates to emergency contacts through a GSM module. These proposed ETAs are useful technologies which can work in tandem with our proposed turn-by-turn navigation system to achieve better mobility [12].

Therefore, a mobile application that gives turn-by-turn voice navigation and works in tandem with a white cane, a guide dog or ETAs is necessary. Research commissioned by the National Council for the Disabled in 2003 found that more than a third of VCP in Kenya never venture outdoors without assistance because of their lack of sight [9].

The “mGuide” mobile application developed in this study relies on Global Positioning System (GPS) technology. GPS is a constellation of more than twenty-four satellites orbiting the earth which communicate with receivers on the earth’s surface using radio waves [11]. The “mGuide” application was developed for the Android platform in Java, an object-oriented programming (OOP) language. The advantages of OOP include: (i) improved software productivity due to its modularity; (ii) easier software maintenance and troubleshooting; and (iii) production of higher quality software [14]. These advantages outweigh the drawbacks of longer code and slower-running programs.

Currently, the leading turn-by-turn outdoor navigation application is Google Maps. Google Maps is scripted in C++ on the back-end, while the user interface is written using JavaScript, Extensible Markup Language (XML) and Ajax [11]. The maps, which come pre-installed on smartphones, guide users to set destinations whether walking or driving. The app has notable features: (i) it provides voice-guided turn-by-turn driving directions, with multiple voice prompts before turns and merges; (ii) it automatically recalculates the route when the user makes a wrong turn or changes course; (iii) it offers multiple routes to avoid traffic, highways or tolls, and can automatically re-route on command or based on traffic conditions; and (iv) it uses Android’s built-in voice search to let the user speak destinations to the app [11]. However, Google Maps is designed for general public use and lacks features for the VCP. Other mobile applications include “WayfinderAccess” and “MobileGeo”, which run on Windows Mobile-based smartphones [4]. These applications have their databases hosted on a remote server which can be accessed by a third party. In addition, “Seeing Eye GPS” is an accessible turn-by-turn GPS iPhone app with all the normal navigation features plus features unique to people who are blind [11]. The “Seeing Eye GPS” app requires an iPhone, iPad or iPod touch and is available in English, French, German, Hebrew, Serbian and Spanish. Relying on GPS alone for route navigation leads to erroneous identification of critical points, which could be fatal for VCP [8].

The “mGuide” application in this study gives turn-by-turn voice guidance in English or Kiswahili to a destination. The application can also work in tandem with other ETAs to inform the VCP about above-knee obstacles as real-time route guidance proceeds. In addition, the application uses a set of maps to give the best navigation at any particular time and can switch seamlessly among them. Upon completion of development, the “mGuide” application package (APK) file is installed onto the user's phone. Thereafter, a guide person sets the destination for the VCP, or the destination is spoken directly into the app by the user, as sketched below. Subsequently, the app periodically obtains and updates its location data in real time until the destination is reached. If the user deviates from the set route, the algorithm recalculates the distance to the destination. This algorithm is faster than alternatives such as map-matching and route-prediction algorithms.
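As a minimal sketch of how a spoken destination could be captured on Android (assuming the standard RecognizerIntent API; the class name and request code below are illustrative and not taken from the “mGuide” source), the speech input step might look as follows:

// Hypothetical sketch, not the authors' implementation: capturing a spoken
// destination with Android's built-in speech recognizer.
import android.app.Activity;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognizerIntent;

public class SpeakDestinationActivity extends Activity {
    private static final int REQUEST_SPEECH = 1;   // arbitrary request code

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Speak your destination");
        startActivityForResult(intent, REQUEST_SPEECH);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK) {
            // First recognition result; it would then be geocoded to destination coordinates.
            String spokenDestination =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS).get(0);
        }
    }
}

The recognized text would subsequently be resolved to destination coordinates before guidance starts.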

Statement of the research problem

There is very low usage of GPS-enabled mobile phone navigation among VCP, despite the millions of mobile phones already used by VCP to make calls and listen to received text messages. For the majority of VCP, the white cane and the guide dog are the primary mobility aids through which they have gained the ability to travel somewhat independently. Even so, a guide person is typically at hand to offer assistance even on short, clear paths. VCP therefore find travelling difficult and hazardous because they cannot easily determine “where” things are, a process known as “spatial sensing” [5]. The development of the “mGuide” application involves mapping a route commonly used by the user onto the application’s database. Then, using the difference amplification algorithm (DAA), the user is updated periodically on progress along the route. Accurate localization through GPS enables the calculation of speed from the distance covered in specified time intervals.
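As a sketch of how speed can be derived from distance covered over a time interval (illustrative only; the class and method names below are not taken from the “mGuide” source), two successive GPS fixes can be processed as follows:

// Hypothetical sketch: estimating walking speed from two successive GPS fixes.
// Location.distanceTo() returns metres; Location.getTime() returns milliseconds.
import android.location.Location;

public final class SpeedEstimator {
    private SpeedEstimator() {}

    /** Returns speed in metres per second between two timed fixes. */
    public static double speedBetween(Location previous, Location current) {
        float distanceMetres = previous.distanceTo(current);
        double elapsedSeconds = (current.getTime() - previous.getTime()) / 1000.0;
        return elapsedSeconds > 0 ? distanceMetres / elapsedSeconds : 0.0;
    }
}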

System description

Positioning Technology using GPS

GPS has three main segments which enable it to communicate successfully with receivers on the earth: the space, user and control segments. The space segment is made up of a network of 24 or more radio-emitting and receiving satellites placed in orbit, as shown in Fig. 1 [11]. The user segment may be an airplane, a ship, a car or a mobile terminal [4]. The control segment consists of a network of monitoring and control facilities (master stations, monitoring stations and uploading stations) used to manage the satellite constellation and update the satellite navigation data messages [5]. From the difference between the time the GPS signal was sent by the satellite and the time it is received, the range to the satellite can be expressed as in Eq. 1 [5].

$$P^{s} = \left( {T - T^{s} } \right)c$$
(1)

where $P^{s}$ is the range from the receiver to the satellite, T is the reading of the receiver clock, $T^{s}$ is the reading of the satellite clock and c is the speed of light, \(3.0 \times 10^{8}\) m/s. Relying on a single satellite constellation such as GPS may be detrimental, especially given the high accuracy demands of location-based applications; hence the need for receivers that can use more than one constellation for improved positioning accuracy at any time. Other constellations which offer location services include Galileo, operated by the European Union, and BeiDou, developed by China, among others [2].
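As a simple worked illustration of Eq. 1 (the numbers are chosen for illustration only and are not measurements from this study), a signal travel time of $T - T^{s} = 0.07$ s gives

$$P^{s} = (T - T^{s})\,c = 0.07 \times 3.0 \times 10^{8} \approx 2.1 \times 10^{7}\ {\text{m}} \approx 21{,}000\ {\text{km}},$$

which is of the same order as the range from a GPS satellite to a receiver on the earth's surface.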

Fig. 1
figure 1

The three GPS segments, namely the space, user and control segments [11]

Software Design

The “mGuide” application obtains the shortest distance between two points projected onto the mapped routes, calculates and displays the angle to the next turn along the route, and guides the user to a destination through turn-by-turn voice guidance. Navigation is in real time: the application keeps updating its current location, and if the user deviates from the route, a warning sound is played in the preferred language, English or Kiswahili. The DAA then re-routes the user to the destination along an alternative route. The application flow is illustrated in Fig. 2, and a sketch of the distance-and-bearing step is given after the figure.

Fig. 2
figure 2

A flowchart diagram for the “mGuide” application, used in this work, showing the flow of activities
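The distance-and-bearing step described above could be realized as in the minimal sketch below (assuming Android's standard Location utilities; the helper class name is illustrative and not from the “mGuide” source):

// Hypothetical sketch: distance to the next turn and the bearing (angle) toward it,
// computed from the current fix and the next turn point on the mapped route.
import android.location.Location;

public final class TurnGeometry {
    private TurnGeometry() {}

    /** results[0] = distance in metres, results[1] = initial bearing in degrees. */
    public static float[] distanceAndBearing(double curLat, double curLon,
                                             double turnLat, double turnLon) {
        float[] results = new float[2];
        // distanceBetween() fills the array with the distance and the initial bearing.
        Location.distanceBetween(curLat, curLon, turnLat, turnLon, results);
        return results;
    }
}

A deviation warning would then be triggered whenever the computed distance from the mapped route exceeds the chosen threshold (10 m in the tests reported below).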

The “mGuide” application is divided into packages, which are further divided into smaller modules called classes, each executing a specific task [5]. The “LocationManager” service used in the MainActivity class provides a background service for querying the current location of the mobile terminal and storing it in the mobile database. In addition, the “calculateDistance” method is invoked by the following Java code:

d = calculateDistance(location[0], location[1], location[2], location[3]);
mesagebody.setText("Dear User you are " + d + " meters to reach " + getname + ", accuracy " + mac + " meters");

where “location[0], location[1]” represent the coordinates of the VCP’s current location and “location[2], location[3]” the destination coordinates. Thereafter, the “Tts” method calls the text-to-speech engine to read the route progress out to the user. The code snippet implementing the text-to-speech function is presented below.

Tts t = new Tts(getApplicationContext());
t.guideme("Dear user you are " + d + " meters to reach " + getname + ". Accuracy " + macc + " meters");
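A minimal sketch of what such a “Tts” wrapper could look like (assuming Android's standard TextToSpeech API; the exact “mGuide” implementation may differ) is shown below:

// Hypothetical sketch of a thin wrapper around Android's TextToSpeech engine.
import android.content.Context;
import android.speech.tts.TextToSpeech;
import java.util.Locale;

public class Tts implements TextToSpeech.OnInitListener {
    private final TextToSpeech engine;
    private boolean ready = false;

    public Tts(Context context) {
        engine = new TextToSpeech(context, this);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            engine.setLanguage(Locale.ENGLISH);   // Kiswahili prompts use recorded clips
            ready = true;
        }
    }

    /** Speaks the supplied guidance message once the engine has initialised. */
    public void guideme(String message) {
        if (ready) {
            engine.speak(message, TextToSpeech.QUEUE_FLUSH, null, "mGuideUtterance");
        }
    }
}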

Upon arriving at the destination, the distance (d) equals zero. Therefore, the VCP is notified via the text-to-speech interface, as illustrated by the code below:

x = 0;
if (d == x) {
    mesagebody.setText("Congratulations, you have arrived at your destination");
    t.guideme("Congratulations, you have arrived at your destination");
}

The recorded voice clips for guiding the user along the route include: (i) Turn right (Geuka kulia); (ii) Turn left (Geuka kushoto); (iii) Lost route (Umepotea njia); and (iv) Re-routing (Njia mbadala). The voice clips were recorded and uploaded to the phone database to be called at the appropriate instant, for example as sketched below.
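One way such pre-recorded clips might be played back at the right instant is sketched below (illustrative only; the resource identifiers are placeholders, not the actual “mGuide” assets):

// Hypothetical sketch: playing a pre-recorded prompt (e.g. "Geuka kulia")
// stored as a raw resource when the corresponding navigation event occurs.
import android.content.Context;
import android.media.MediaPlayer;

public final class VoicePrompts {
    private VoicePrompts() {}

    /** Plays the clip for the given raw resource id and releases the player afterwards. */
    public static void play(Context context, int clipResId) {
        MediaPlayer player = MediaPlayer.create(context, clipResId);
        player.setOnCompletionListener(MediaPlayer::release);
        player.start();
    }
}

// Usage (resource id is a placeholder):
// VoicePrompts.play(context, R.raw.turn_right_kiswahili);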

Testing system’s performance

The user was asked to deviate 10 m from the middle of the path, and any alerts issued were recorded. This was followed by deviations of 20, 30 and 40 m from the middle of the path. A test is considered a success if the user stands at any distance less than 50 m from the middle of the path and the application does not sound any verbal alert; if an alert is issued within that distance, the test is considered a failure, since it would amount to a false alarm. For each deviation category, 20 tests were done. The success rate is calculated using Eq. 2 [3].

$${\text{ Success rate}} = \frac{{\text{Number of successful tests}}}{{\text{Total number of tests}}}$$
(2)
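For example (illustrative figures only), if 19 of the 20 tests in a deviation category end without a false alarm, the success rate is $19/20 = 0.95$, i.e. 95%.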

Results and discussion

Code Implementation

Figure 3 shows the implementation of the “mGuide” application. Figure 3a shows the application successfully obtaining the GPS coordinates; the results from the app were compared with those from a handheld GPS device and analyzed. Figure 3b shows the real-time navigation as the user walks along the set route.

Fig. 3
figure 3

Screenshots of the running “mGuide” application. a Obtaining the GPS coordinates and b Real-time navigation to the set destination

Figure 4 shows a screenshot of the implemented program. The guide sets the origin and destination for the VCP, and the application then guides the user turn-by-turn. In case of a lost route, the application re-routes the user to the destination without the need for a guide person. However, it should be noted that, in addition to the “mGuide” application, a white cane or ETA is still needed for detecting obstacles along the route.

Fig. 4
figure 4

Real-time route navigation. a The white arrow showing the distance to the destination and b display of a lost route, with the application re-routing the VCP to the destination

Comparison of “mGuide” application with a Handheld GPS device

The “mGuide” application was tested in real time within XX along selected paths. To start with, testing was done in driving mode, and the results are shown in Fig. 5. While the distance to the destination generally decreased, between 0 and 2 min it increased rather than decreasing as expected; this can be attributed to poor GPS signal reception [10]. Between the 2nd and 14th minute the distance decreases gradually to zero as the user moves toward the destination, implying the code is running successfully. At the 14th minute, when the distance reached zero, the application notified the user that he had arrived at the destination.

Fig. 5
figure 5

A graph of distance against time for a user travelling along a specified path from the main gate to Kiwanja market in XX university, using the developed mGuide application. The distance decreases gradually to zero

Figure 6 shows the results for the XX main gate–Nyayo hostel route. They show good agreement between the longitudes (Fig. 6a) and latitudes (Fig. 6b) obtained by the handheld GPS terminal and by the “mGuide” application. Since the sky was clear, both devices had clear visibility of the GPS satellites, hence the agreement observed. However, the coordinates of point 5 were taken under trees where visibility was low, hence the scattered data points (Fig. 6a). Similarly, the results in Fig. 7a for the longitudes and Fig. 7b for the latitudes are in good agreement, which is attributable to good satellite visibility along the path. In conclusion, “mGuide” is shown to be a reliable navigation application.

A graph of distance after deviation against route number for the different routes is shown in Fig. 8. The “mGuide” application notified the user within a shorter distance after deviation from the set route and is hence more effective for outdoor navigation. This is due to the app’s ability to use a set of five map servers and to switch between them to give the best navigation; one possible arrangement of such switching is sketched after Table 1. Figure 9 shows a visually challenged student walking in the middle of the road to test the application; the user is holding an “mGuide”-enabled phone and wears earphones for the turn-by-turn voice directions. The response to the user’s deviation from the middle of a road or path, tabulated in Table 1, shows that no alarm was sounded for deviations within 10 m; for deviations greater than 10 m, the user receives a voice warning of deviation from the set route.

Fig. 6
figure 6

Graphs of latitude and longitude against point number for Kenyatta University main gate–Nyayo hostel route. a A graph of longitude against point number and b latitude against point number

Fig. 7
figure 7

Graphs of latitude and longitude against point number for the Art zone 39 route. a Latitude and b longitude variations

Fig. 8
figure 8

A graph of distance before notification for different users on the mapped routes; for the mGuide and App B

Fig. 9
figure 9

A photograph of a visually challenged student walking along a path in Kenyatta University (courtesy of the Directorate of Disability)

Table 1 Percentage success rates for the application’s response to deviation from the middle of a route
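As referenced above, “mGuide” switches among several map servers; the sketch below shows one way such a fallback could be arranged (server URLs and the class name are placeholders, not the actual servers or code used by the application):

// Hypothetical sketch: querying a list of map/routing servers in order and using
// the first one that answers, so navigation continues if a server is slow or down.
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

public final class MapServerSelector {
    private MapServerSelector() {}

    /** Returns the route response from the first reachable server, or null if none respond. */
    public static InputStream fetchRoute(List<String> serverUrls, String routeQuery) {
        for (String base : serverUrls) {
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(base + routeQuery).openConnection();
                conn.setConnectTimeout(3000);   // fail over quickly to the next server
                conn.setReadTimeout(3000);
                if (conn.getResponseCode() == HttpURLConnection.HTTP_OK) {
                    return conn.getInputStream();
                }
            } catch (IOException ignored) {
                // Try the next server in the list.
            }
        }
        return null;
    }
}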

Conclusions

In this paper, an optimal navigation system, the “mGuide” application, has been developed and implemented. The application was tested using 9 totally blind students and 1 partially blind student, all from xx, who were successfully guided. Sound alerts are received upon deviation of more than 10 m from the path, and a maximum delay time of 5.23 s was recorded. The application is optimized to guide a user along a specified route through turn-by-turn voice guidance; an alert is communicated to the user whenever he makes a wrong turn, and an alternative route is offered. A success rate of 100% was achieved for deviations of less than 10 m, showing the application’s reliability for use by VCP. This is achieved using the text-to-speech (TTS) engine found on most smartphones: as the VCP navigates through the various apps on the phone, the TTS reads them out. Once the application is opened, the speech-to-text (STT) engine converts the destination spoken by the user into location coordinates and activates the navigation process. The TTS engine also helps to read text messages to the user. Further, since the voice navigation is passive, once the application has been set up by a guide person, the user can then be guided along the set path. Research has shown that using STT rather than soft keys can save 30–40% of the battery life of the user's mobile phone.

Availability of data and materials

NA.

Abbreviations

API:

Application Programming Interface

APK:

Android Package file

DAA:

Difference Amplification Algorithm

GPS:

Global Positioning System

LBS:

Location-Based Services

RFID:

Radio Frequency Identification

SDK:

Software Development Kit

TTS:

Text To Speech

VCP:

Visually Challenged Person

ETA:

Electronic Travel Aid

References

  1. Apprey MW, Agbevanu KT, Gasper GK, Akoi PO (2022) Design and implementation of a solar powered navigation technology for the visually impaired. Sensors International 3:100181. https://doi.org/10.1016/j.sintl.2022.100181


  2. Ashour I, El Tokhey M, Mogahed Y, Ragheb A (2022) Performance of global navigation satellite systems (GNSS) in absence of GPS observations. Ain Shams Engineering Journal 13(2):101589. https://doi.org/10.1016/j.asej.2021.09.016


  3. Baldauf M, Hong S-B (2016) Improving and Assessing the Impact of e-Navigation applications. International Journal of e-Navigation and Maritime Economy 4:1–12. https://doi.org/10.1016/j.enavi.2016.06.001


  4. Bousbia-Salah M, Fezari M, Hamdi R (2005) A navigation system for the blind. In: 16th Triennial World Congress, Prague, Czech Republic

  5. Xu G, Xu Y (2016) GPS: theory, algorithms and applications. https://doi.org/10.1016/B978-0-12-417049-0.00004-3


  6. Hong B, Lin Z, Chen X, Hou J, Lv S, Gao Z (2022) Development and application of key technologies for Guide Dog Robot: A systematic literature review. Robot Auton Syst 154:104104. https://doi.org/10.1016/j.robot.2022.104104


  7. Hussain M, Ullah M, Fareed A, Sohail B (2016) The Smartcane for Blind People An Electronically Smart Stick to Aid Mobility. International Journal of Computer Science and Information Security 14:276–285


  8. Ivanov R (2012) Real-time GPS track simplification algorithm for outdoor navigation of visually impaired. J Netw Comput Appl 35(5):1559–1567. https://doi.org/10.1016/j.jnca.2012.02.002


  9. Kabare K (2018) Social protection and disability in Kenya [Unpublished]. Development Pathways

  10. Moreno EG, Romana MG, Martínez Ó (2016) A first step to diagnostic of urban transport operations by means of GPS receiver. Procedia Computer Science 83:305–312. https://doi.org/10.1016/j.procs.2016.04.130


  11. Pascolini D, Mariotti SP (2010) Global estimates of visual impairment. British Journal of Ophthalmology (online). https://www.who.int/blindness/publications/globaldata/en/

  12. Susan John A, Shelly S (2022) A navigational system for visually challenged persons. Materials Today: Proceedings 62:6873–6878. https://doi.org/10.1016/j.matpr.2022.05.137


  13. WHO (2012) Global data on visual impairments 2010. World Health Organization

  14. Yang J, Lee Y, Chang KH (2018) Evaluations of JaguarCode: A web-based object-oriented programming environment with static and dynamic visualization. J Syst Softw 145:147–163. https://doi.org/10.1016/j.jss.2018.07.037



Funding

This work was supported by the National Commission for Science, Technology and Innovation (NACOSTI) of Kenya.

Author information

Authors and Affiliations

Authors

Contributions

NO came up with the idea, wrote the codes, carried out the experiments, implemented the application and drafted the manuscript. PK helped in the design of the application, interpreting and analyzing data, and writing and revising manuscript. PO helped in interpretation and presentation of graphical data and writing and revising the manuscript. All the authors read and approved the final manuscript.

Corresponding author

Correspondence to Nicholas Ososi Onkoba.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Onkoba, N.O., Karimi, P. & Nyangaresi, P.O. Design and implementation of a secure mobile phone-based route navigator (mGuide), adapted for the visually challenged people. Journal of Electrical Systems and Inf Technol 10, 18 (2023). https://doi.org/10.1186/s43067-023-00087-0
