Your future roommate: NEO Gamma

PLUS: EngineAI’s front-flipping humanoid

Good morning, robotics enthusiasts. Norwegian robotics company 1X Technologies just unveiled NEO Gamma — a softer, gentler humanoid designed specifically for the home.

1X says that NEO’s advanced situational awareness allows it to blend seamlessly into your home, quietly doing the dishes and other chores while you binge your favorite series.

Could this be the high-tech roommate of the future?

In today’s robotics rundown:

  • 1X NEO Gamma home humanoid

  • EngineAI’s front-flipping robot

  • Tiny robot swimming like flatworms

  • New AI that learns like a child

  • Quick hits on other major news

LATEST DEVELOPMENTS

1X

Image source: 1X Technologies

The Rundown: Norwegian robotics company 1X launched NEO Gamma, its next-generation AI-powered humanoid for home environments—with a demo video showing the robot handling household tasks like making coffee, doing laundry, and vacuuming.

The details:

  • NEO Gamma wears a soft knit nylon suit (for enhanced safety) and walks, squats, and sits like a human while performing household activities.

  • The humanoid’s underlying visual manipulation model ensures real-time adaptability when handling unfamiliar objects in different household scenarios.

  • It uses "Emotive Ear Rings" with an in-house language model, an advanced three-speaker system, and four mics for seamless back-and-forth interactions.

  • Its hardware is also now 10x more reliable and 10 dB quieter than the previous version, or about as quiet as a standard refrigerator.

Why it matters: NEO Gamma joins a sea of humanoids from companies like Figure, Boston Dynamics, and Agility. While most startups are testing their humanoids in factory settings, 1X’s focus on a softer robotic roommate marks a different approach. One question remains: can it reach the scale of a mass-market product?

ENGINEAI ROBOTICS

Image source: EngineAI Robotics

The Rundown: China’s EngineAI Robotics just released a video on X showing its lightweight, compact PM01 humanoid performing a full front flip with impressive precision, in what appears to be a world first.

The details:

  • Standing 4.5 feet tall and weighing about 88 pounds, PM01 uses 24 full-body degrees of freedom to move like a human at speeds of about 2 m/s.

  • The interesting bit, however, is flexibility: it supports 320-degree waist rotation, with 5 DoF in each arm and 6 in each leg, enabling complex movements.

  • The humanoid utilizes an x86-based computing system with NVIDIA Jetson Orin modules, supporting cross-platform algorithm deployment.

  • Its smart control interface is inspired by Iron Man, which, the company says, allows users to easily issue commands and access functions.

Why it matters: Since front flips are trickier than back flips, the stunt shows that PM01 is more agile than most humanoids on the market. Plus, it stands out with a roughly $14K price tag, lower than alternatives from Tesla and Unitree, as well as an open development approach.

RESEARCH

Image source: École polytechnique fédérale de Lausanne (EPFL)

The Rundown: A team of Swiss researchers has developed a promising little swimming robot, inspired by marine flatworms, that could be used in pollution monitoring and environmental studies like surveying coral reefs.

The details:

  • Detailed in a new study in Science Robotics, the aquatic robot is smaller than a credit card and weighs only 6 grams.

  • Rather than using traditional propeller-based systems, this tiny robot derives part of its agility from its featherweight oscillating fins, similar to a flatworm's.

  • The bot can autonomously navigate surfaces dense with plants or animals, without harming the environment, the researchers said.

  • Plus, it can move in multiple directions, achieving impressive speeds of up to 4.7 inches per second, equivalent to 2.6 body lengths per second.

Why it matters: Combining cutting-edge robotics with insights from nature, the researchers are leading the way for a new generation of adaptable, environmentally harmonious robotic systems for ecological studies. Of course, there's still more work to be done—this bot can float on the surface, but it can't take a deep dive just yet.

RESEARCH

Image source: Ideogram/The Rundown

The Rundown: Researchers at the Okinawa Institute of Science and Technology in Japan have developed a brain-inspired AI that integrates vision, movement, and language to teach robots more human-like interaction.

The details:

  • The researchers took inspiration from how toddlers connect words and vision to actions, like identifying a red ball after playing with red flowers.

  • Using this idea, they created an embodied AI whose architecture focuses on compositionality, the ability to break down and recombine concepts.

  • In tests, a robot powered by the AI learned to move or stack unfamiliar colored blocks, based on verbal instructions to perform specific tasks.

  • Compared to traditional LLMs, this approach achieves real-time adaptability with much less data and computational power.

Why it matters: Researchers say this new AI model could help develop a new breed of robots that better interpret and respond to humans in real-world settings. Plus, it could help create systems with a deeper understanding of concepts like “suffering,” potentially resulting in more responsible and ethical AI behavior.

QUICK HITS

📰 Everything else in robotics today

Apple released the first developer beta of iOS 18.4, adding support for robot vacuums in the Apple Home app through Matter.

AI platform Hugging Face debuted a foundational AI model for robots that translates natural language commands into physical actions.

Serve’s autonomous robots will soon be delivering Uber Eats orders for Shake Shack and Mister O1 in Miami, following similar deals in Los Angeles.

The Robotics and AI Institute is teaching robotic bikes to jump and robot dogs to run on a track at 11.6 miles per hour, tripling its original speed.

Uber CEO Dara Khosrowshahi said that Elon Musk wasn’t open to making Tesla’s planned robotaxis available on the ride-sharing platform.

Researchers at China’s Northeastern University developed a new H-shaped bionic robot that replicates the running movements of cheetahs.

University of California, Santa Barbara researchers engineered robots that work together to behave as smart materials, with tunable shapes and strength levels.

U.S. researchers are developing surgical robots that administer sight-restoring subretinal injections more precisely than human surgeons, whose motor control limits injection accuracy.

University of Science and Technology of China (USTC) researchers developed a lightweight prosthetic hand with 19 degrees of freedom and human-level functions.

China’s autonomous driving tech company Pony.ai launched a robotaxi service in Guangzhou connecting the airport and railway station to the city center.

COMMUNITY

Join our next workshop on Wednesday, February 26th at 3 PM EST and learn how to create personalized AI video messages at scale with Synthesia's Kevin Alster, Strategic Advisor and former Head of Synthesia Academy.

RSVP here. Not a member yet? Join The Rundown University on a 14-day free trial.

That's it for today!

Before you go, we'd love to know what you thought of today's newsletter to help us improve The Rundown experience for you.

Login or Subscribe to participate in polls.

See you soon,

Rowan, Jennifer, and Joey—The Rundown’s editorial team