Google is rolling out new accessibility enhancements tailored for individuals with disabilities to streamline everyday tasks like taking photos, navigating, and web searching.
Businesses can now indicate on Google Maps and Search that they are owned by people with disabilities, helping customers find and support them.
Google Maps now provides stair-free walking routes and shows wheelchair-friendly locations.
The Lens feature in Maps uses AR and AI to let users explore new places through their phone's camera, with screen reader support coming soon on Android.
Google Chrome now includes typo detection that suggests what users likely meant to type, a feature particularly helpful for people with dyslexia and for language learners.
Additionally, a feature inspired by Action Blocks lets users further personalize their Routines shortcuts; research suggests this kind of customization benefits individuals with cognitive differences.
On Pixel devices, the Magnifier app lets users zoom in on physical details. The Guided Frame feature, available on certain Pixel models, can now detect pets, food, and documents in addition to faces, with support planned for additional Pixel models in upcoming releases.
Google has launched a new AI-powered feature in its search engine, a move that mirrors a similar capability Microsoft introduced in March. The feature lets users generate images directly from their search queries.
According to The Times of India (TOI), the new feature is built on Google's AI-powered Search Generative Experience (SGE) and can return up to four automatically generated images in response to a search query. For example, a user might search for "a friendly Doberman sitting with its plump owner by a pond", and SGE will attempt to generate images matching that description.