Sensors and data are proliferating. Whether it’s a phone, a car, or a smart home device, astonishing amounts of data fuel the work of researchers and product developers devising new products and experiences. Among those who will benefit, even more than they already do, are people who are blind or visually impaired.
That’s why Sight Tech Global is pleased to announce two sessions, one with Apple and a separate one with Amazon, focusing on how these two leaders in machine learning and artificial intelligence see the future unfolding, particularly around new experiences that will help people with vision loss. Sight Tech Global is a free, global, virtual event taking place on December 1st and 2nd. Register today.
Designing for Everyone: Accessibility and Machine Learning at Apple
For the Apple session, TechCrunch Editor-in-Chief Matthew Panzarino will speak with Jeff Bigham and Sarah Herrlinger.
Bigham is the head of research for AI/ML accessibility at Apple, as well as an associate professor of computer science at Carnegie Mellon University. He leads a team of researchers and engineers focused on improving accessibility through artificial intelligence and machine learning.
Herrlinger is Apple’s Senior Director of Global Accessibility Policy & Initiatives. She leads accessibility programs at Apple, including support for disability communities around the world, accessibility technologies built into all Apple hardware, software and services, as well as other initiatives that promote Apple’s culture of inclusion.
Apple’s iPhone and VoiceOver are among the most valuable tools available to blind people because they provide so many services, from browsing the web to reading e-mail aloud. With the addition of lidar and computer vision capabilities, among others, the phone in combination with cloud computing has become even more capable as a source of data about the world and a means of interpreting that information in meaningful ways. Herrlinger and Bigham will provide an overview of Apple’s approach to accessible design, the past year’s advancements, inclusiveness in machine learning research, and the latest approaches and future features.
Why does Amazon’s vision include talking less with Alexa?
For the Amazon session, Be My Eyes Vice President Will Butler will speak with Prem Natarajan, Vice President of Alexa AI, and Beatrice Geoffrin, Director of Alexa Trust.
Geoffrin is a Director of Product Management on Amazon’s Alexa team. She heads the Alexa Trust organization, which focuses on earning and maintaining customers’ trust in Alexa and on making Alexa accessible by overseeing the Alexa for Everyone team.
Natarajan leads a multidisciplinary science, engineering and product organization that enhances the customer experience through advances in dialogue modeling, natural language understanding, entity linking and resolution, and related machine learning technologies.
Amazon’s Alexa is already a fixture in many homes and is another great addition to a blind person’s tech toolset. As homes become more and more technology-driven, inputs from multiple sources – teachable artificial intelligence, multimodal understanding, sensors, machine vision, and more – will create a truly ambient experience. In fact, one in five Alexa smart home interactions is already initiated by Alexa without any voice command. As Alexa comes to understand us and our homes well enough to anticipate our needs and act on our behalf in meaningful ways, what are the implications for accessibility?
Don’t forget to register now. Sight Tech Global is free, virtual and global.
Sight Tech Global is a production of the Vista Center for the Blind and Visually Impaired. We are grateful to our current sponsors Ford, Google, Humanware, Microsoft, Mojo Vision, Facebook, Fable, APH and Vispero. If you want to sponsor the event, please contact us. All sponsorship revenues go to the nonprofit Vista Center for the Blind and Visually Impaired, which has served the Silicon Valley area for 75 years.