Today is the twelfth annual Global Accessibility Awareness Day, a day aimed at raising awareness of digital access and inclusion for people with disabilities and impairments. Apple is once again using the occasion to preview some of its upcoming accessibility features. And this year, Apple has some really fascinating announcements. Here are some of the highlights:
Live Speech and Personal Voice
For people who are unable to speak, Live Speech lets you type words on an iPhone (or iPad or Mac) and have those words spoken out loud during phone calls, FaceTime calls, and in-person conversations. It's a simple idea that makes a lot of sense.
But wait ... there is more. For people who are at risk of losing their ability to speak, such as people recently diagnosed with ALS, Apple has created Personal Voice, a way to create a synthesized voice that sounds like you. To configure it, you read along with a randomized set of text prompts for 15 minutes. Then, your iPhone or iPad creates a voice for you to use.
You don't want someone else to be able to take your voice, so Apple built in some interesting security and privacy protections. First, those phrases are randomized, so you cannot simply record someone else while they are talking and then use that recording to create a Personal Voice; the recording would not match the randomized text prompts. Moreover, the Personal Voice is created right on your device, so your voice is never uploaded to the Internet.
I'm very curious to try this out and see how it works. It sounds like incredibly cool technology.
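For anyone curious about the underlying concept, the core idea of Live Speech (typed text spoken aloud by the device) resembles what developers have long been able to approximate with Apple's AVSpeechSynthesizer API. Here is a minimal Swift sketch of that general idea; it is not Apple's Live Speech or Personal Voice implementation, and the sample phrase and voice choice are just placeholders:

```swift
import AVFoundation

// A minimal sketch of the general idea behind Live Speech: type a phrase,
// and the device speaks it out loud. This uses the long-standing
// AVSpeechSynthesizer API, not Apple's Live Speech or Personal Voice feature.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    // Placeholder voice; the actual feature would use the voice the user
    // selects, including a Personal Voice on supported devices.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("I'll have the usual, please.")
```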
Assistive Access
You may know how to use an iPhone, but the standard interface can be very confusing for some people, including users with cognitive disabilities. When Assistive Access mode is turned on, apps like Camera, Photos, Music, Phone, and Messages take on a simplified interface with large, high-contrast buttons and few menus.
I know quite a few folks who didn't grow up with technology and who would appreciate this mode.
Point and Speak
For people with reduced or no vision, the upcoming Point and Speak mode in the Magnifier app allows a person to interact with physical objects that have text labels. Apple explains: "For example, while using a household appliance — such as a microwave — Point and Speak combines input from the camera, the LiDAR Scanner, and on-device machine learning to announce the text on each button as users move their finger across the keypad."
...and more
And that is just the tip of the iceberg. For example, for people with low vision, Apple is improving the ability to change text size. For people who have trouble hearing, more hearing devices will work with the iPhone. And for people who are sensitive to rapid animations, Apple is adding the ability to automatically pause images with moving elements in Messages and Safari.
These new features will be incredibly valuable for the target audience. But like most accessibility features, I suspect that they will also be appreciated by other folks for various reasons. For example, I'm sure folks will come up with some very creative things to do with Personal Voice. I look forward to seeing all of these new features rolled out.