Apple introduces new features for cognitive accessibility, along with Live Speech, Personal Voice, and Point and Speak in Magnifier

New software features for cognitive, speech, and vision accessibility are coming later this year

CUPERTINO, CALIFORNIA Apple today previewed software features for cognitive, vision, hearing, and mobility accessibility, along with innovative tools for individuals who are nonspeaking or at risk of losing their ability to speak. Coming later this year, users with cognitive disabilities can use iPhone and iPad with greater ease and independence with Assistive Access; nonspeaking individuals can type to speak during calls and conversations with Live Speech; and those at risk of losing their ability to speak can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud to help them interact with physical objects such as household appliances.

These updates draw on advances in hardware and software, include on-device machine learning to ensure user privacy, and expand on Apple’s long-standing commitment to making products for everyone. Apple works in deep collaboration with community groups representing a broad spectrum of users with disabilities to develop accessibility features that make a real impact on people’s lives.

“At Apple, we’ve always believed that the best technology is technology built for everyone,” said Tim Cook, Apple’s CEO. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love.”

“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

Assistive Access Supports Users with Cognitive Disabilities
Assistive Access uses innovations in design to distill apps and experiences to their essential features in order to lighten cognitive load.

The Language Research Center
The Language Research Center (LRC) is operated by the Department of Psychology in Georgia State University’s College of Arts and Sciences on a wooded 55-acre facility south of Atlanta in Decatur, Georgia. Duane Rumbaugh, an experimental psychologist, began language studies with chimpanzees in association with Georgia State University. The studies taught chimpanzees and bonobos to communicate with lexigrams, or abstract symbols, using a computer-monitored keyboard. The language of communication is referred to by researchers as Yerkish, after the Yerkes Primate Center. (2)

Chimpanzee use
Four chimpanzees (Lana, Sherman, Panzee, and Mercury) are used in “a wide array of comparative cognition studies: spatial memory, delay of gratification, numerical cognition, analogical reasoning, and cooperation” at LRC. In addition, current projects at LRC also study vocal communication in nonhuman primates. (3) The LRC is supported, or has been supported, by grants from several government agencies, including the National Institutes of Health (particularly the National Institute of Child Health and Human Development), the National Science Foundation, the National Institute of Mental Health, the National Aeronautics and Space Administration, and the Department of Defense.