Design, development and technology acceptance modeling of accessibility systems for people with sensory impairments with applications on autonomous smartphone-based navigation systems for blind people
Keywords: UTAUT; Sensory disabilities; Blind and visually impaired; Deaf and hard of hearing; User-centered design; User-centered training; Cognitive-driven design; Assistive technologies; Sentiment analysis; Usability; User experience (UX); Special education; UEQ+; Outdoor navigation; Indoor navigation; Smartphone; Sensors
This dissertation belongs to the field of Human-Computer Interaction (HCI) and concerns the challenges faced by sensory-disabled and marginalized groups of individuals, the technological solutions developed to enable them, and effective training methodologies. The ultimate goal of this effort is to contribute toward reducing the social exclusion and stigma associated with disability. More specifically, this effort focuses on enabling independent navigation in both outdoor and indoor spaces via assistive technology (AT) solutions that take into consideration the special needs of the blind and visually impaired, and it treats the design of training sessions as an important factor in the success of any AT solution. Blind people face serious restrictions in their lives due to their vision impairment, resulting in both social and professional exclusion and a deteriorated quality of life. Many individuals worldwide suffer from eyesight deficiencies, and a large proportion of them live in low- to middle-income countries, creating additional challenges for any solution. With all this in mind, the MANTO project was initiated, involving the design, implementation, and validation of AT solutions that provide cost- and functionally effective indoor and outdoor blind navigation applications. The design phase involved conducting multiple interviews with blind and visually impaired individuals, from which various categories of beliefs, attitudes, and preferences emerged after a thorough analysis of the collected input. The most significant of these categories were selected to form the requirements concerning the functionality and interface of the two applications targeting outdoor and indoor navigation. The involvement of blind and visually impaired individuals was critical to the applications' development cycle, as dictated by the user-centered design approach followed. 
Taking into account the input of the requirements elicitation phase and the overall goals of cost and functional effectiveness, both applications were built on top of the Android platform using low-end smartphone devices. For outdoor navigation, the application provides safe and highly precise blind pedestrian navigation without requiring the mandatory use of tactile ground surface indicators. The system employs voice instructions to continuously inform the user about the status and progress of the navigation and about obstacles found along the navigational path. The Android application (BlindRouteVision) aggregates data from three different sources: an external high-precision GPS receiver tracking real-time pedestrian mobility; a custom-made external device, consisting of an ultrasonic sensor and a servo mechanism, that functions much like a sonar; and a second external device, installed on traffic lights, that tracks their status in order to enable the safe passing of crossings near them. The user interacts with the system via an appropriately designed voice interface that enables fast and accurate interaction. Likewise, the indoor application provides accurate and safe navigation in indoor spaces. It is based on the combination of a state-of-the-art pedestrian dead reckoning (PDR) algorithm with tactile ground surface indicator guides, the gyroscope sensor found on smartphone devices, and Bluetooth Low Energy (BLE) radio beacons that are used to correct the accumulated error of the PDR method. The application exposes its capabilities to users via a voice-command-based interface that is configurable to their preferences. Both applications were validated in terms of Usability and User Experience (UX) by blind and visually impaired individuals. 
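The indoor positioning principle described above, a PDR estimate driven by gyroscope heading and periodically corrected by known BLE beacon positions, can be sketched as follows. This is a minimal illustration only: the fixed stride length, the blending weight, and the beacon coordinates are hypothetical values, not parameters from the dissertation.

```python
import math

def pdr_step(pos, heading_rad, step_len=0.7):
    """Advance the estimated position by one detected step.
    step_len is a hypothetical fixed stride length in metres;
    heading_rad would come from the smartphone gyroscope."""
    x, y = pos
    return (x + step_len * math.sin(heading_rad),
            y + step_len * math.cos(heading_rad))

def ble_correct(pos, beacon_pos, weight=0.8):
    """Blend the drifting PDR estimate toward the known position
    of a BLE beacon detected at close range, reducing the
    accumulated dead-reckoning error."""
    return (pos[0] + weight * (beacon_pos[0] - pos[0]),
            pos[1] + weight * (beacon_pos[1] - pos[1]))

# Walk five steps due north, then correct at a beacon at (0, 3.5).
pos = (0.0, 0.0)
for _ in range(5):
    pos = pdr_step(pos, heading_rad=0.0)
pos = ble_correct(pos, beacon_pos=(0.0, 3.5))
```

A production PDR pipeline additionally needs step detection from the accelerometer and per-user stride calibration; the sketch shows only the position update and the beacon-based drift correction.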
Usability was assessed through a set of tasks performed by the blind participants, covering the functionality of the outdoor application (completing a pedestrian navigation route, combining pedestrian navigation with public transport, and passing marked crossings near traffic lights) and the indoor application (completing thematic routes and locating Points of Interest (POIs)), while UX was evaluated by means of standardized questionnaires (UEQ+) followed by statistical analysis. Furthermore, as the literature demonstrates, AT solutions are not widely accepted by blind and visually impaired individuals, a finding that was also confirmed by our interactions with them. To address the low acceptance rate and the subsequent abandonment of these solutions within a short period of time, we searched the literature to uncover the underlying causes. This effort revealed many contributing factors of technological, financial, and human nature. Some of these factors can be addressed with current technology and improved assistive device interfaces, while others remain open research problems. Despite this situation, the interviews with blind and visually impaired individuals made it evident that training could play a significant role in improving the low acceptance rate and reducing the abandonment of AT solutions, while being technologically and financially feasible as well as approved by users. To explore the role of training, special training sessions were designed to demonstrate the features of the outdoor blind navigation application (BlindRouteVision). These sessions were incorporated into the special Orientation and Mobility (O&M) courses in which the blind learn fundamental skills for independent mobility. A companion training application, functionally equivalent to the main application, was developed to facilitate and expedite the learning process. 
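The UEQ+ scoring underlying the UX evaluation follows a simple scheme: each scale's items, answered on a 1..7 scale, are shifted to the -3..+3 range and averaged, and an overall Key Performance Indicator (KPI) is computed as the importance-weighted mean of the scale scores. A minimal sketch, with entirely hypothetical ratings (not data from this study):

```python
def scale_mean(item_ratings):
    """Mean of one UEQ+ scale's items after shifting the 1..7
    answers to the -3..+3 range used for interpretation."""
    shifted = [r - 4 for r in item_ratings]
    return sum(shifted) / len(shifted)

def ueq_plus_kpi(scales):
    """Importance-weighted average of the scale means.
    scales: list of (item_ratings, importance) pairs, where
    importance is the participant's 1..7 relevance rating."""
    total_weight = sum(imp for _, imp in scales)
    return sum(imp * scale_mean(items) for items, imp in scales) / total_weight

# Hypothetical responses for two scales (e.g. Efficiency, Trust).
kpi = ueq_plus_kpi([
    ([6, 6, 5, 6], 7),   # rated high, scale considered very important
    ([5, 4, 5, 5], 4),   # moderately positive, less important
])
```

Values above 0.8 are conventionally read as a positive impression, values below -0.8 as negative; the in-between range is neutral.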
The training application itself was also evaluated from the perspective of Usability and UX and, additionally, through sentiment analysis conducted on the blind participants' responses using the Recursive Neural Network (RNN) deep learning models that are part of the CoreNLP framework. Overall, the training application was evaluated positively as succeeding in its goal of facilitating the training sessions. To further validate the importance of training in improving technology acceptance, we extended the Unified Theory of Acceptance and Use of Technology (UTAUT) to include training as one of the external factors that predict behavioral intention, which, according to UTAUT, in turn predicts the actual usage of technology. The extended model was validated during the evaluation of the outdoor blind navigation application. Special questionnaires were used to measure the factors of the model, followed by a thorough statistical analysis employing Exploratory Factor Analysis (EFA), Confirmatory Factor Analysis (CFA), and Structural Equation Modeling (SEM). The results showed partial support for the model, with the newly inserted training factor positively influencing behavioral intention.
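The structural step of such an analysis estimates path coefficients from the predictor factors, including the added training factor, to behavioral intention. The sketch below is a deliberately simplified stand-in: it uses ordinary least squares on synthetic factor scores rather than a full SEM with latent variables, and the factor names, sample size, and effect sizes are illustrative assumptions, not results from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic standardized factor scores (NOT real study data):
# two standard UTAUT predictors plus the added Training factor.
performance_expectancy = rng.normal(size=n)
effort_expectancy = rng.normal(size=n)
training = rng.normal(size=n)

# Behavioral intention generated with a positive training effect
# (true path coefficient 0.4) plus measurement noise.
intention = (0.5 * performance_expectancy
             + 0.3 * effort_expectancy
             + 0.4 * training
             + rng.normal(scale=0.5, size=n))

# Least-squares estimate of the three path coefficients.
X = np.column_stack([performance_expectancy, effort_expectancy, training])
coef, *_ = np.linalg.lstsq(X, intention, rcond=None)
training_path = coef[2]   # estimated Training -> Behavioral Intention path
```

A positive estimated `training_path` mirrors, in miniature, the study's finding that the training factor positively influences behavioral intention; the actual analysis additionally validated the measurement model via EFA and CFA before fitting the structural model.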