Based on user feedback obtained during the 2023 evaluation stages, the VIP SpaceNav system was improved and extended with several functionalities. The space configuration application was updated to simplify the workflow for generating the 3D model of a building. We implemented a settings menu for the user navigation application that lets the user select the navigation mode (guided or free roaming), the format in which navigation cues/orientation information are rendered, and the frequency of TTS directions. Another functionality added to the final prototype is the route summary, which is communicated at the user's request. Obstacle detection based on the YOLOv7-tiny neural network was also integrated into the final prototype. In addition, an audio processing module was implemented that sonifies the nodes of the navigation path as well as the user's proximity to walls and doors.
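For illustration, the obstacle detection step can be pictured as a per-frame query of the detector. The sketch below is a minimal, hedged example assuming a YOLOv7-tiny model exported to ONNX with non-maximum suppression included; the file name, class list, thresholds, and output layout are assumptions for illustration, not the project's actual code.

```python
# Minimal sketch (assumptions: ONNX export with NMS baked in, placeholder
# file name and class list) of querying YOLOv7-tiny for one camera frame.
import cv2
import numpy as np
import onnxruntime as ort

MODEL_PATH = "yolov7-tiny.onnx"          # hypothetical export of the detector
CLASS_NAMES = ["person", "chair", "door", "table"]  # illustrative subset only

session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def detect_obstacles(frame_bgr, conf_threshold=0.35):
    """Return (label, score, box) tuples for one camera frame."""
    # Resize to the 640x640 input expected by YOLOv7-tiny, convert to RGB NCHW.
    img = cv2.cvtColor(cv2.resize(frame_bgr, (640, 640)), cv2.COLOR_BGR2RGB)
    blob = img.astype(np.float32).transpose(2, 0, 1)[None] / 255.0

    # With an end-to-end NMS export, each output row is assumed to be
    # [batch_id, x0, y0, x1, y1, class_id, score].
    detections = session.run(None, {input_name: blob})[0]

    results = []
    for _, x0, y0, x1, y1, cls_id, score in detections:
        if score >= conf_threshold:
            idx = int(cls_id)
            label = CLASS_NAMES[idx] if idx < len(CLASS_NAMES) else f"class_{idx}"
            results.append((label, float(score), (x0, y0, x1, y1)))
    return results
```

In the prototype, detections of this kind would then be filtered by distance and announced to the user; the exact pipeline differs from this sketch.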
To evaluate the latest changes to the configuration application, the time required to define the walls, windows and doors of several areas of the building was measured and compared with the configuration time in previous prototypes. To test the obstacle detection mode, a series of tests evaluated the accuracy with which objects in the scene are detected and labeled. Another test verified that the user's path is correctly detected relative to the navigation graph. We also evaluated the navigation cues/orientation information in the three modes: 8 cardinal directions, clock hours, or the angle relative to the user's forward direction. For the evaluation of the sonification component, a standalone application was developed that plays back either fixed spatial sounds (from the 8 cardinal directions) or sounds that sweep between two of these directions. This application was used to test users' perception of the spatiality of the sounds; the results showed that sound sweeping provides higher localization accuracy than fixed sounds.
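The three cue formats can all be derived from the same bearing information. The snippet below is a hedged sketch of such a conversion under simple assumptions (function names, thresholds, and phrasing are illustrative, not taken from the VIP SpaceNav code base).

```python
# Minimal sketch (assumption): converting the bearing to the next waypoint
# into one of the three cue formats (cardinal, clock hour, relative angle).
CARDINALS = ["north", "north-east", "east", "south-east",
             "south", "south-west", "west", "north-west"]

def relative_angle(target_bearing_deg, user_heading_deg):
    """Signed angle in (-180, 180]; positive means the target is to the right."""
    angle = (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    return angle if angle != -180.0 else 180.0

def cue_cardinal(target_bearing_deg):
    """Absolute 8-wind cardinal direction of the target."""
    return CARDINALS[int((target_bearing_deg % 360.0 + 22.5) // 45.0) % 8]

def cue_clock(target_bearing_deg, user_heading_deg):
    """Clock-hour cue relative to the user's heading (12 o'clock = straight ahead)."""
    hour = round(relative_angle(target_bearing_deg, user_heading_deg) / 30.0) % 12
    return f"{12 if hour == 0 else hour} o'clock"

def cue_angle(target_bearing_deg, user_heading_deg):
    """Signed angle cue, e.g. '40 degrees to the right'."""
    angle = relative_angle(target_bearing_deg, user_heading_deg)
    if abs(angle) < 5:           # illustrative tolerance for "straight ahead"
        return "straight ahead"
    side = "right" if angle > 0 else "left"
    return f"{abs(round(angle))} degrees to the {side}"
```

For example, a waypoint bearing of 90° with the user heading 0° would be rendered as "east", "3 o'clock", or "90 degrees to the right", depending on the selected mode.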
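The sweeping playback mode can likewise be sketched in a simplified form. The example below uses constant-power stereo panning as a stand-in for whatever spatialisation the evaluation application actually uses; all parameter names and values are assumptions.

```python
# Minimal sketch (assumption): a stereo "sweep" between two azimuths using
# constant-power panning, as a simplified stand-in for true spatial audio.
import numpy as np

def sweep_tone(start_deg, end_deg, duration_s=1.5, freq_hz=880.0, sr=44100):
    """Return an (n, 2) float32 stereo buffer whose apparent source moves
    from start_deg to end_deg (0 = straight ahead, +90 = hard right)."""
    t = np.linspace(0.0, duration_s, int(sr * duration_s), endpoint=False)
    tone = 0.3 * np.sin(2.0 * np.pi * freq_hz * t)

    # Linearly interpolate the azimuth, then map it to a pan value in [-1, 1].
    azimuth = np.linspace(start_deg, end_deg, t.size)
    pan = np.clip(azimuth / 90.0, -1.0, 1.0)

    # Constant-power pan law: equal loudness at the centre, smooth at the sides.
    theta = (pan + 1.0) * np.pi / 4.0
    left, right = np.cos(theta) * tone, np.sin(theta) * tone
    return np.stack([left, right], axis=1).astype(np.float32)

# Example: a cue sweeping from straight ahead towards the user's right.
buffer = sweep_tone(0.0, 90.0)
```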
Experiments were carried out with both blindfolded and blind participants. We repeated the experiments from 2023, this time using the new settings menu, the new TTS guidance cues, and the spatial sounds. In addition, complex maps were created for piloting under real conditions: scenarios from the everyday life of students and teachers at the Faculty of Automation and Computers, POLITEHNICA Bucharest were conceived. The following scenarios were repeated (with the new TTS directions and spatial sounds): free roaming in a laboratory room, free roaming in a large hall, guided navigation in a large hall, guided navigation in a simple scenario, and guided navigation in a scenario with an elevator. In addition, a guided navigation experiment was conducted from the ground floor of one building to a laboratory room in another building (passing through multiple hallways and using doors and stairs) by a blindfolded person using TTS cues and spatial sounds. This highly complex scenario demonstrates the potential of the application to be used in real spaces.