Engadget

Google’s accessibility app Lookout can use your phone’s camera to find and recognize objects

It's also making wheelchair information on Google Maps available on desktop.

(Image credit: Google)

Google has updated several of its accessibility apps with capabilities that make them easier to use for the people who rely on them. It has rolled out a new version of the Lookout app, which can read text, and even lengthy documents, out loud for people who are blind or have low vision. The app can also read food labels, recognize currency and describe what it sees through the camera or in an image. The latest version adds a new "Find" mode that lets users choose from seven item categories, including seating, tables, vehicles, utensils and bathrooms.

When users choose a category, the app can recognize objects associated with it as they move their camera around a room. It will then tell them the direction of, or distance to, the object, making it easier for them to interact with their surroundings. Google has also launched an in-app capture button, so users can take photos and quickly get AI-generated descriptions of them.

A screenshot showing object categories in Google Lookout, such as Seating & Tables, Doors & Windows, Cups, etc. (Google)

The company has updated its Look to Speak app as well. Look to Speak lets users communicate by using eye gestures to select phrases from a list, which the app then speaks out loud. Now, Google has added a text-free mode that can also trigger speech when users choose from a photo book of emojis, symbols and photos. Even better, they can personalize what each symbol or image means to them.

Google has also expanded the screen reader capabilities of Lens in Maps, so that it can tell users the names and categories of the places it sees, such as ATMs and restaurants, as well as how far away a particular location is. In addition, the company is rolling out improvements to detailed voice guidance, which provides audio prompts that tell users where to go.

Finally, Google has made Maps' wheelchair information accessible on desktop, four years after it launched on Android and iOS. The Accessible Places feature lets users see whether the place they're visiting can accommodate their needs — businesses and public venues with an accessible entrance, for example, will show a wheelchair icon. They can also use the feature to check whether a location has accessible washrooms, seating and parking. The company says Maps currently has accessibility information for over 50 million places. Those who prefer looking up wheelchair information on Android and iOS can now also easily filter reviews that focus on wheelchair access.

Google made all these announcements at this year's I/O developer conference, where it also revealed that it has open-sourced more code for the Project Gameface hands-free "mouse," allowing Android developers to use it in their apps. The tool lets users control the cursor with head movements and facial gestures, so they can more easily use their computers and phones.

Catch up on all the news from Google I/O 2024 right here!