Google has revealed a string of accessibility updates it’s rolling out for Maps, Search and Assistant, as well as greater availability of some camera-based Pixel features. One of the main focus areas this time around is wheelchair accessibility. A new option that’s gradually becoming available on iOS and Android will allow Maps users to request stair-free walking routes. This feature — which Google says will benefit those traveling with luggage and strollers as well — will be available globally, as long as the company has sufficient data for the region.
Google notes that if you have the wheelchair-accessible option enabled in your transit preferences, this will automatically be applied to walking routes too. Otherwise, when you request a walking route, you can access stair-free directions by tapping the three dots at the top of the screen and enabling the “wheelchair-accessible” option.
On a related note, wheelchair-accessible information will be available across more Google products, namely on Maps for Android Auto and cars with Google built-in. When you search for a place and tap on it, a wheelchair icon will appear if the location has a step-free entrance, accessible restrooms, parking or seating.
It should be easier to find and support businesses owned by people with disabilities in Maps and Search too. If a business chooses to identify itself as “disabled-owned,” this will be mentioned in Maps and Search listings. Google previously rolled out similar Asian-owned, Black-owned, Latino-owned, LGBTQ+ owned, veteran-owned and women-owned business labels.
Elsewhere, Google is enabling screen reader capabilities in Lens in Maps (which was previously called Search with Live View), an augmented reality tool that’s designed to help you find things like ATMs, restrooms and restaurants with the help of your handset’s camera. When you’re in an unfamiliar place, you can tap the camera icon in the search bar and point your phone at the world around you.
“If your screen reader is enabled, you’ll receive auditory feedback of the places around you with helpful information like the name and category of a place and how far away it is,” Eve Andersson, senior director on Google’s Products for All team, wrote in a blog post. This Lens in Maps feature, which is geared toward blind and low-vision folks, will be available on iOS starting today and Android later this year.
On Pixel devices, the Magnifier app uses your camera to help you zoom in on real-world details from afar or to make text on menus and documents easier to read with the help of color filters, brightness and contrast settings. The app is available for Pixel 5 and later devices, but not the Pixel Fold.
Google also notes that the latest version of Guided Frame, which arrived on the Pixel 8 and Pixel 8 Pro earlier this month, recognizes pets, dishes and documents in addition to faces to help people who are blind or have low vision take good-quality photos. The Guided Frame update is coming to Pixel 6 and Pixel 7 devices later this year.
Meanwhile, Google is offering more customizable Assistant Routines. The company says you’ll be able to add a Routine to your home screen as a shortcut, choose its size and customize it with your own images. “Research has shown that this personalization can be particularly helpful for people with cognitive differences and disabilities and we hope it will bring the helpfulness of Assistant Routines to even more people,” Andersson wrote. Google developers took inspiration from Action Blocks for this feature.
Last but not least, Google earlier this year added a feature to the desktop Chrome address bar that detects typos and suggests websites based on what the browser reckons you meant. The feature will be available in Chrome on iOS and Android starting today. The idea is to help people with dyslexia, language learners and anyone prone to typos find what they’re looking for more easily.