NextFin

Alibaba Unveils Embodied Intelligence Foundation Model RynnBrain, Open-Sources Seven Other Models

Summarized by NextFin AI
  • Alibaba's DAMO Academy has launched the RynnBrain embodied intelligence foundation model and open-sourced the full series of seven models, significantly enhancing robotic intelligence.
  • The RynnBrain model enables robots to retain spatial-temporal memory, allowing them to resume interrupted tasks effectively.
  • With scalability for various applications, RynnBrain is poised to become a foundational model in the embodied AI sector.
  • Built on the Qwen3-VL model, RynnBrain set new records on 16 evaluation benchmarks, outperforming competitors such as Google's Gemini Robotics ER 1.5.

Alibaba's DAMO Academy has released RynnBrain, an embodied intelligence foundation model, and has open-sourced the full series of seven models at once, including a 30B MoE variant.

RynnBrain gives robots spatial-temporal memory and spatial reasoning capabilities for the first time, significantly enhancing their intelligence. For example, a robot running RynnBrain that is interrupted during Task A and asked to perform Task B first can remember Task A's temporal and spatial state and resume it once Task B is completed.

RynnBrain also offers good scalability, enabling rapid post-training of specialized embodied models for navigation, planning, and action. It is expected to become a foundational model for the embodied AI industry.

RynnBrain was reportedly trained on the Qwen3-VL model and optimized with the self-developed RynnScale architecture, which doubles training speed with the same resources. The training data exceeds 20 million pairs. The model set new records on 16 embodied open-source evaluation benchmarks, surpassing top industry models such as Google's Gemini Robotics ER 1.5.
