Technology has transformed daily life for people with vision limitations. As many as 50 million Americans have some degree of vision loss — a need poorly served by technology before the arrival of smartphone navigation, smart glasses, sensors, robotics and AI in nearly every device. Today, it’s a whole new and innovative world, supportive of individuals with anything from low vision to complete blindness. Those who are blind can now use haptics to follow a game in real time, even in a stadium. They can create presentations from data sets, read signs at a distance, navigate airports and safely make their way through obstacle-filled streets. Here are five technology enablers that are likely to make a difference:
OneCourt. “Our laptop-sized device uses haptics, or trackable vibrations, to relay key events and spatial positioning of the ball so you can feel the gameplay with your fingertips. As the ball moves, you will feel the vibration on the device moving. Specific aspects of the game are communicated with different vibration patterns. Our haptic language is conveyed in real time, empowering users to experience the game live! The device can be paired with your favorite broadcast both at live games and during the at-home experience. Our device can also be used in conjunction with closed-circuit radios that are available in stadiums and arenas if desired, or it can be used as a standalone device on game day.” Learn more at OneCourt.
Umwelt. “Umwelt, an authoring environment designed for screen-reader users, incorporates an editor that allows someone to upload a dataset and create a customized representation, such as a scatterplot, that can include three modalities: visualization, textual description, and sonification. Sonification involves converting data into nonspeech audio. The system, which can represent a variety of data types, includes a viewer that enables a blind or low-vision user to interactively explore a data representation, seamlessly switching between each modality to interact with data in a different way.” Learn more at Umwelt.
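Sonification, converting data into nonspeech audio, can be sketched in a few lines: map each data value linearly onto a pitch range and render a short tone per point, so the rise and fall of the data becomes audible. The linear value-to-frequency mapping below is a common convention, not necessarily how Umwelt implements it.

```python
# Hedged sketch of sonification: each data point becomes a tone whose pitch
# tracks the value. Pitch range and note length are illustrative choices.
import math
import struct
import wave

def sonify(values, out_path="sonified.wav", lo_hz=220.0, hi_hz=880.0,
           note_sec=0.25, rate=44100):
    """Write a mono 16-bit WAV; higher data value = higher pitch."""
    vmin, vmax = min(values), max(values)
    span = (vmax - vmin) or 1.0  # avoid division by zero on flat data
    frames = bytearray()
    for v in values:
        freq = lo_hz + (v - vmin) / span * (hi_hz - lo_hz)
        for i in range(int(note_sec * rate)):
            sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", sample)
    with wave.open(out_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))

sonify([3, 1, 4, 1, 5, 9, 2, 6])  # rising and falling pitches trace the data shape
```

Listening to the resulting file conveys the overall trend of a series without any visual plot, which is the core idea behind the sonification modality described above.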
Seleste Smart Glasses. “Using our built-in speaker and 8-megapixel autofocus camera, our glasses allow you to read everything from a book to a large sign down the road. We also have OpenAI-enabled scene description that can describe your surroundings in detail. Most importantly, we have a smart assistant called ELA that you can ask questions about virtually anything in front of you, like ‘What do the flowers in this bouquet look like?’ or ‘Can you read me the vegetarian options on this menu?’ ELA can even describe the world as you move around. For example, you can ask it to describe the art as you walk through a gallery, to tell you what stores you pass as you walk around a mall, or even to look for your wallet as you walk around your house. The glasses work by connecting to your phone’s Bluetooth and Wi-Fi or hotspot, so you can use the glasses at home and outside. You can control the glasses using our app, the buttons on the glasses, or voice commands.” Currently iPhone-only. Learn more at Seleste.
Glidance Glide. “The Glide is a small, friendly-looking device with two 7-inch wheels, a stem, and a handle that is held at waist level. To use it, you simply hold the handle and start walking. Glide’s sensors and camera system detect the surroundings, plot safe pathways around objects, and guide you to a specified destination by turning its wheels and applying brakes as needed. Glide can inform you of your surroundings using a speaker or a Bluetooth connection to a wireless headset for more discreet navigation. The first of its kind, Glide is designed as an affordable consumer device, using the latest technologies in advanced robotics and AI.” Learn more at Glidance.
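The "detect, plot a safe path, steer or brake" loop can be sketched with a toy rule: given three forward range readings, brake if anything is too close, otherwise steer toward the more open side. Real systems like Glide fuse camera and sensor data with genuine path planning; the readings, thresholds, and commands below are assumptions that only illustrate the decision structure.

```python
# Minimal sketch of obstacle-aware steering from three assumed forward
# distance readings (in meters). Thresholds are illustrative, not Glide's.

def steer(left_m: float, center_m: float, right_m: float,
          stop_at: float = 0.5, avoid_at: float = 1.5) -> str:
    """Return a steering command from left/center/right range readings."""
    if min(left_m, center_m, right_m) < stop_at:
        return "brake"                     # obstacle too close: apply the brakes
    if center_m < avoid_at:                # something ahead: turn toward open side
        return "steer_left" if left_m > right_m else "steer_right"
    return "straight"

print(steer(2.0, 2.5, 2.0))   # clear path ahead
print(steer(2.0, 1.0, 0.8))   # blocked ahead, left side more open
print(steer(0.3, 1.0, 2.0))   # immediate obstacle: brake
```

In a wheeled guide device, commands like these would translate into differential wheel torque and braking, repeated continuously as the sensors update.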
WeWALK Smart Cane with WeAssist. “By adding voice control, Bluetooth, a touchpad, ultrasonic sensors, vibration, AI, and more, we’ve built a smart cane that can improve the safety of users and help them overcome key barriers to mobility, ultimately contributing to a more equitable and accessible world.” The WeWALK team, some of whom have visual impairments themselves, eat, sleep, and breathe assistive and adaptive technology. The team deeply understands the challenges facing people with visual impairments, such as detecting low-hanging objects, or holding a cane in one hand while bouncing between apps on a phone with the other. They also offer WeAssist, a video-call service that helps those who need assistance, such as inside airports, get specific directions. Learn more in the Microsoft case study at WeWALK.