Researchers From IBM and Carnegie Mellon Unveil Visual Assistance App


Researchers from IBM and Carnegie Mellon University's Cognitive Assistance Laboratory have partnered to create a smartphone app that helps people living with blindness or other severe visual impairments. The team recently unveiled the first iteration of its work: NavCog, an app that uses an array of sensors and technologies to help blind users navigate their surroundings.

The new app was created within an open source development platform called HULOP, which stands for Human-scale Localization Platform. In simple terms, NavCog works like a turn-by-turn GPS navigator, delivering detailed audio directions from point A to point B through the user's headset rather than via an onscreen map. Instead of relying on traditional GPS, the app uses iBeacon locators to give end users better location accuracy and more precise directions (a minimal ranging sketch appears below). Because of its reliance on iBeacons, NavCog was not designed for open-ended public navigation, but for smaller, instrumented settings such as a college or corporate campus, a mall, or an airport. The team behind the app rolled out an iBeacon-based navigation system on Carnegie Mellon's campus, placing beacons every eight to 12 meters along all major walkways. This network is currently the only map that syncs with NavCog, but the team has published instructions on its GitHub page for organizations interested in rolling out their own iBeacon networks.
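For developers curious what beacon ranging looks like in practice, here is a minimal Swift sketch using Apple's Core Location framework. It is illustrative only: the UUID and class name are placeholders, not NavCog's actual values, and a real app would also handle authorization state and ranging availability.

```swift
import CoreLocation

// A minimal sketch of iBeacon ranging with Core Location.
// The UUID below is a placeholder; a real deployment uses its own beacon UUID.
final class BeaconRanger: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    // Core Location calls this roughly once per second with every matching
    // beacon currently in range, ordered by estimated proximity.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying beaconConstraint: CLBeaconIdentityConstraint) {
        guard let nearest = beacons.first else { return }
        // `accuracy` is a rough RSSI-derived distance estimate in meters.
        print("Beacon \(nearest.major)/\(nearest.minor): ~\(nearest.accuracy) m away")
    }
}
```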


While NavCog is limited in its current iteration, the development team has published a three-year roadmap of enhancements and new technology support that should drastically improve its accuracy and usefulness. iBeacon networks can currently geolocate a user to an accuracy of about 1.5 meters; the team hopes to bring that under 0.5 meters by adding support for ultrasonic ranging. Beyond improved localization, the researchers have several parallel projects underway that will eventually bring facial recognition and mood evaluation into the app's toolset. They envision a final product that lets a user walk across a campus unassisted, recognize a friend passing by, and tell whether that friend appears upset, happy, or excited as they approach.
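Getting from individual beacon distance estimates to a position fix typically means fusing several readings. As a rough illustration of the general technique (an assumption for exposition, not NavCog's published algorithm), a hypothetical inverse-distance weighted centroid over known beacon positions might look like this:

```swift
import Foundation

/// Hypothetical sketch: fuse several beacon range estimates into one 2D fix
/// using an inverse-distance weighted centroid. Not NavCog's actual method.
struct Beacon {
    let x: Double, y: Double   // surveyed beacon position (meters)
    let distance: Double       // RSSI-derived range estimate (meters)
}

func estimatePosition(from beacons: [Beacon]) -> (x: Double, y: Double)? {
    guard !beacons.isEmpty else { return nil }
    var wx = 0.0, wy = 0.0, wSum = 0.0
    for b in beacons {
        // Nearer beacons (smaller distance) get proportionally larger weight.
        let w = 1.0 / max(b.distance, 0.1)
        wx += w * b.x
        wy += w * b.y
        wSum += w
    }
    return (wx / wSum, wy / wSum)
}

// Three beacons spaced roughly 10 m apart, as on the Carnegie Mellon walkways.
let fix = estimatePosition(from: [
    Beacon(x: 0,  y: 0,  distance: 2.0),
    Beacon(x: 10, y: 0,  distance: 8.5),
    Beacon(x: 0,  y: 10, distance: 9.0),
])
```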

The current version of the app is already available on iOS.

