
Project Astra is a new multimodal AI agent capable of answering questions in real time, drawing on text, video, images, and speech and pulling up the relevant information. At Google I/O, the company's annual developer conference, Demis Hassabis, the head of Google DeepMind and the leader of Google's AI efforts, presented an early version of what he hopes will become a universal assistant.

According to Google DeepMind, Project Astra is a real-time, multimodal AI assistant that can see the world around it. It can recognise objects, remember where users left them, and provide answers or help with almost any task.

Astra is not limited to smartphones. In a demo video, Google also showed it being used with a pair of smart glasses. Until recently, virtual assistants depended on information extracted from the web and on what users fed them to accomplish tasks. Project Astra can learn about the world directly, bringing the experience as close as possible to interacting with a human assistant.

Project Astra is in the early stages of testing, and no specific launch date has been announced. A test version of the model is available on the DeepMind website.
