Meta’s Habitat 3.0 creates simulated real-life settings to train intelligent AI robots

Image: Meta's Habitat 3.0

A team of researchers at Meta Platforms Inc.’s Fundamental Artificial Intelligence Research division has announced an upgraded version of Habitat, the company’s AI simulation platform for training robots to operate in real-world settings.

Alongside Habitat 3.0, the company introduced the Habitat Synthetic Scenes Dataset, an artist-crafted collection of 3D environments for training AI navigation agents, and revealed HomeRobot, an affordable hardware and software platform for building robot assistants that work in both simulated and physical environments.

In a blog post, the FAIR researchers explained that the new releases mark their ongoing advancements in what they call “embodied AI”: AI agents that can perceive and interact with their surroundings, cohabit safely with human partners, and assist and communicate with them in both digital and physical realms.

Habitat, essentially a collection of virtual environments including offices, homes, and warehouses, serves as a platform to train and refine AI-driven robots for real-world navigation. The intricate virtual settings are constructed using an infrared capture system that accurately measures the shapes and dimensions of objects such as tables, chairs, and books. Within these environments, researchers can train robots to complete complex tasks that require a comprehensive understanding of their surroundings.
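
To give a concrete sense of how researchers work with these environments, here is a minimal sketch using the open-source habitat-sim Python API that underpins the Habitat platform; the scene path and sensor settings are placeholders rather than values from Meta’s announcement.

```python
import habitat_sim

# Point the simulator at a 3D scene file (placeholder path).
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_id = "data/scene_datasets/example/apartment.glb"

# Give the agent an RGB camera so it can perceive the scene.
rgb_sensor = habitat_sim.CameraSensorSpec()
rgb_sensor.uuid = "rgb"
rgb_sensor.sensor_type = habitat_sim.SensorType.COLOR
rgb_sensor.resolution = [480, 640]

agent_cfg = habitat_sim.agent.AgentConfiguration()
agent_cfg.sensor_specifications = [rgb_sensor]

# Create the simulator, then step the agent through the scene using the
# default discrete action space.
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))
for action in ["move_forward", "turn_left", "move_forward"]:
    observations = sim.step(action)  # dict of sensor readings keyed by uuid
print(observations["rgb"].shape)    # e.g. (480, 640, 4) RGBA image
sim.close()
```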

Habitat 3.0 extends these existing capabilities by accommodating both robot and humanoid avatars, facilitating collaborative tasks between humans and robots. This allows for joint activities like cleaning a living room or preparing meals in a kitchen. The inclusion of realistic human avatars enables the simulation of both low- and high-level interactions.
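
Conceptually, a collaborative episode boils down to two agents, a robot policy and a human avatar, acting step by step in the same shared scene. The fragment below is a purely hypothetical illustration of that loop; the class and function names are invented for clarity and are not Habitat 3.0’s actual API.

```python
import random

# Purely hypothetical two-agent loop: a robot and a human avatar acting
# in the same shared scene. All names here are invented, not Habitat's API.
ACTIONS = ["move_forward", "turn_left", "turn_right", "pick", "place"]

class ScriptedAgent:
    """Stand-in for a trained robot policy or a controlled human avatar."""
    def __init__(self, name: str):
        self.name = name

    def act(self, observation: dict) -> str:
        # A real policy would map observations to actions; we pick randomly.
        return random.choice(ACTIONS)

def run_episode(robot: ScriptedAgent, human: ScriptedAgent, steps: int = 10):
    observation = {"scene": "living_room"}  # placeholder shared observation
    for _ in range(steps):
        # Both agents act in every simulation step of the shared scene.
        robot_action = robot.act(observation)
        human_action = human.act(observation)
        observation["last_actions"] = (robot_action, human_action)
    return observation

print(run_episode(ScriptedAgent("robot"), ScriptedAgent("human_avatar")))
```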

According to the FAIR team, Habitat 3.0 can cut the learning time for robot AI agents from months or years to just a few days, enabling rapid testing of new models in safe simulated environments without any risk.

The Habitat Synthetic Scenes Dataset, HSSD-200, is expected to accelerate embodied AI research by providing 3D simulations of real-world scenes crucial for training. It includes 211 high-quality 3D scenes replicating real-world environments, incorporating a diverse set of 18,656 models from 466 semantic categories.
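
Because HSSD-200 is packaged for use with habitat-sim, loading one of its scenes should look roughly like the sketch below; the dataset path, config filename, and scene identifier are assumptions to be checked against the actual download.

```python
import habitat_sim

# Point habitat-sim at the HSSD-200 scene dataset (assumed local paths;
# the config filename and scene ID below are placeholders to verify
# against the actual download).
backend_cfg = habitat_sim.SimulatorConfiguration()
backend_cfg.scene_dataset_config_file = (
    "data/hssd-hab/hssd-hab.scene_dataset_config.json"
)
backend_cfg.scene_id = "102344250"  # hypothetical scene identifier

agent_cfg = habitat_sim.agent.AgentConfiguration()
sim = habitat_sim.Simulator(habitat_sim.Configuration(backend_cfg, [agent_cfg]))

# List a few of the scene handles the dataset exposes.
print(sim.metadata_mediator.get_scene_handles()[:5])
sim.close()
```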

The new HomeRobot library, also introduced by FAIR, offers a hardware and software blueprint for researchers who want to deploy their Habitat-trained models in the physical world. The user-friendly software stack and affordable hardware components are meant to make real-world setups easy, with a particular focus on Open-Vocabulary Mobile Manipulation research.
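
An Open-Vocabulary Mobile Manipulation task generally takes the form “move object X from receptacle Y to receptacle Z,” with no fixed list of object categories. The sketch below is a hypothetical rendering of such a task specification, not HomeRobot’s actual interface.

```python
from dataclasses import dataclass

# Hypothetical sketch of an Open-Vocabulary Mobile Manipulation (OVMM)
# task: "move OBJECT from START_RECEPTACLE to GOAL_RECEPTACLE".
# This illustrates the task structure, not HomeRobot's real interface.
@dataclass
class OVMMTask:
    object_category: str   # open vocabulary: any nameable household object
    start_receptacle: str  # where the object currently sits
    goal_receptacle: str   # where the robot should place it

    def instruction(self) -> str:
        return (f"Move the {self.object_category} from the "
                f"{self.start_receptacle} to the {self.goal_receptacle}.")

task = OVMMTask("stuffed toy", "sofa", "coffee table")
print(task.instruction())
```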

Holger Mueller of Constellation Research Inc. praised Meta’s development, emphasizing that Habitat 3.0’s focus on human-machine interaction is key to integrating robots into daily life.

The researchers at FAIR said their embodied AI research will continue to explore how robots can collaborate with humans in dynamic, ever-changing real-world environments. They plan to use Habitat 3.0 and HSSD-200 to gather data on human-robot interaction and collaboration, and then deploy the simulation-trained models in the physical world to evaluate their performance.