NS3 SIMULATOR PROJECT TITLE

Being Aware of the World: Toward Using Social Media to Support the Blind With Navigation

This paper lays the groundwork for assistive navigation that uses wearable sensors and social sensors to foster situational awareness for the blind. Our system acquires social media messages to gauge the relevant aspects of an event and to create alerts. We propose social semantics that capture the parameters required for querying and reasoning about an event of interest, such as what, where, who, when, severity, and action, from the Internet of Things, using an event summarization algorithm. Our approach integrates wearable sensors in the physical world to estimate the user's location based on metric and landmark localization.
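The paper does not publish source code; as a rough illustration only, the social-semantics tuple described above (what, where, who, when, severity, action) could be represented as a simple record, with the field names, severity scale, and summarize() ranking heuristic below being assumptions rather than the paper's implementation:

```python
# Illustrative sketch only: field names, the severity scale, and the
# summarize() heuristic are assumptions, not the paper's published method.
from dataclasses import dataclass
from datetime import datetime
from typing import List


@dataclass
class SocialSemantics:
    """One event of interest extracted from social media messages."""
    what: str          # event type, e.g. "flooding" or "street fair"
    where: str         # landmark or place name mentioned in the messages
    who: str           # accounts or groups reporting the event
    when: datetime     # time the event was reported
    severity: int      # coarse scale, e.g. 0 (informational) .. 3 (hazard)
    action: str        # suggested response, e.g. "avoid area"


def summarize(events: List[SocialSemantics], max_alerts: int = 3) -> List[str]:
    """Rank events by severity and recency, then render short alert strings."""
    ranked = sorted(events, key=lambda e: (e.severity, e.when), reverse=True)
    return [
        f"{e.what} near {e.where} ({e.when:%H:%M}): {e.action}"
        for e in ranked[:max_alerts]
    ]


if __name__ == "__main__":
    alerts = summarize([
        SocialSemantics("flooding", "5th Ave underpass", "@city_alerts",
                        datetime(2016, 5, 4, 14, 20), 3, "avoid area"),
        SocialSemantics("farmers market", "Main St plaza", "@local_events",
                        datetime(2016, 5, 4, 13, 0), 0, "open until 18:00"),
    ])
    print("\n".join(alerts))
```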

Streaming data from the cyber world are employed to provide awareness by summarizing the events around the user according to situation-awareness factors; this is illustrated with disaster and socialization event scenarios. Discovered local events are fed back through sound localization so that the user can actively participate in a social event or receive early warning of hazardous events. A feasibility evaluation of the proposed algorithm included comparing its output to ground truth, a survey of sighted participants about the algorithm's output, and a sound localization user-interface study with blindfolded sighted participants. Our framework thus addresses the navigation problem for the blind by combining the advantages of our real-time localization technologies so that the user is made aware of the world, a necessity for independent travel.
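As a minimal sketch of how a discovered event might be fed back through sound localization, the snippet below uses a constant-power stereo-panning model driven by the event's bearing relative to the user's heading. This rendering model and its function names are assumptions for illustration, not the paper's audio interface.

```python
# Minimal sketch: constant-power stereo pan derived from the event's bearing
# relative to the user's heading. This is an assumed rendering model, not the
# paper's sound localization interface.
import math
from typing import Tuple


def pan_gains(event_bearing_deg: float, user_heading_deg: float) -> Tuple[float, float]:
    """Return (left_gain, right_gain) for an event at a given compass bearing.

    The relative angle is clipped to +/-90 degrees, so events behind the user
    are rendered at the nearest side.
    """
    rel = (event_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    rel = max(-90.0, min(90.0, rel))
    # Map [-90, 90] degrees to a pan position in [0, 1] (0 = hard left).
    pos = (rel + 90.0) / 180.0
    left = math.cos(pos * math.pi / 2.0)
    right = math.sin(pos * math.pi / 2.0)
    return left, right


if __name__ == "__main__":
    # Event due east of a user facing north: output should favor the right ear.
    print(pan_gains(event_bearing_deg=90.0, user_heading_deg=0.0))
```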