Facebook is developing artificial intelligence that can see, hear, and ‘understand’ everything humans do.
The social media giant apparently intends to transmit video recordings from augmented reality glasses to its artificial intelligence in order to train it to perceive the world like people do.
Facebook wants to create artificial intelligence that learns to understand the world in the same way that humans do by tracking our every action.
The tech giant has unveiled ambitions to teach AI to ‘understand and interact with the world like we do’, seen from a first-person perspective. It intends to accomplish this by using video and audio captured by augmented reality (AR) glasses, such as its new high-tech Ray-Bans.
“While AI normally learns from third-person images and videos, next-generation AI will need to learn from videos that depict the world from the center of action,” the company said.
“AI that understands the environment from this perspective could enable a new age of immersive experiences,” it continued.
To begin training its AI assistants, Facebook gathered 2,200 hours of first-person video from 700 people going about their daily lives for the Ego4D project. No current AI system can yet accomplish the tasks the project targets, but they could play a key role in Facebook’s plans to create the ‘metaverse,’ a digital 3D overlay of reality built using VR and AR.
With the launch of the new Facebook x Ray-Ban smart glasses and its Oculus VR headsets, the company has already begun working toward this.
AI can learn to see and hear from enormous amounts of real-world data, and Facebook appears intent on collecting that data through its products to build more intelligent systems.
However, the introduction of AR glasses such as Facebook’s Ray-Bans has raised privacy concerns, as the glasses would allow wearers to discreetly record others without their consent.