
Ray-Ban Meta smart glasses are being enhanced with AI-driven visual search capabilities

Starting today, Meta’s AI can also answer queries with real-time information.

The Ray-Ban Meta smart glasses are set for significant upgrades that enhance the capabilities of the social network’s built-in AI assistant. The update adds real-time information support to the assistant and begins tests of new “multimodal” features, which let the AI answer questions about the wearer’s surroundings.

Previously, Meta AI had a knowledge cutoff of December 2022, so it could not provide current-event updates or live data such as sports scores, traffic conditions, or other on-the-go information. That is now changing: Meta CTO Andrew Bosworth announced that all Meta smart glasses in the U.S. will have access to real-time information, powered in part by Bing.

Meta AI's new multimodal features

The upcoming “multimodal AI” feature, first showcased at Connect, is particularly notable. It allows the AI to answer contextual questions based on what the wearer sees through the glasses, an enhancement intended to make Meta AI more practical and less of a novelty, addressing some early criticisms of the smart glasses. An early access beta of the multimodal functionality will be available to a select group of users in the U.S. who opt in, with broader availability expected in 2024.


Mark Zuckerberg and Andrew Bosworth have demonstrated the new capabilities in videos and screenshots. For instance, Zuckerberg used the command “Hey Meta, look and tell me” to get outfit suggestions, identify objects such as fruit, and translate meme text. Bosworth added that users could ask about their immediate environment and generate creative captions for newly taken photos.


Copyright © 2023 GagsHub