- The Rundown Robotics
Nvidia's palm-sized 'robot brain'
PLUS: Tesla swaps out Optimus training plan
Good morning, robotics enthusiasts. Nvidia just unveiled the Jetson AGX Thor, a $3,499 mini “robot brain,” packing desktop-level AI power into a palm-sized chip.
Robots can now run any generative AI model entirely on-device, no cloud required. With early adopters including Meta, Boston Dynamics, and Figure, could this tiny powerhouse spark a new frontier in robotics?
In today’s robotics rundown:
Nvidia’s tiny, powerful new ‘robot brain’
Tesla shifts gears for Optimus training
Boston Dynamics’ Spot flips like a gymnast
Hyundai to launch massive U.S. robotics hub
Quick hits on other robotics news
LATEST DEVELOPMENTS
NVIDIA

Image source: Nvidia
The Rundown: Nvidia just dropped Jetson AGX Thor, a $3,499 “robot brain” that packs desktop-level AI horsepower into a palm-sized module, letting robots run massive language, vision, and multimodal models without ever touching the cloud.
The details:
The module features a 2,560-core Blackwell GPU with 96 fifth-generation Tensor cores, delivering up to 2,070 FP4 teraflops of AI compute.
With 128 GB of RAM and 14 Arm CPU cores, Thor runs large language, vision, and multimodal models locally, with 7x more AI computing power than its predecessor, Jetson Orin.
Early adopters Amazon, Meta, Boston Dynamics, Agibot, and Agility Robotics are integrating Thor into robots for warehouses and research.
Nvidia is also offering a Drive AGX Thor variant for self-driving and autonomous vehicle development.
Why it matters: Nvidia’s Jetson AGX Thor will give physical AI a major boost, letting machines run massive AI models locally, cutting out cloud delays, and enabling real-time decision-making. With performance and efficiency far beyond previous generations, it looks set to make complex, multimodal AI feasible on the edge.
TESLA

Image source: Tesla
The Rundown: Tesla has shaken up its Optimus robot training strategy, ditching motion-capture suits and VR headsets in favor of a vision-only approach using video recordings of human workers performing tasks, Business Insider reports.
The details:
This methodology aligns with Tesla’s self-driving car development, using massive video data to train neural networks for adaptable behaviors.
Workers wear custom helmet-mounted rigs with five in-house cameras, capturing detailed hand and finger movements from multiple angles.
Leadership of the program transitioned to Ashok Elluswamy, Tesla’s AI director, after former Optimus chief Milan Kovac stepped down.
Experts note video-based learning could let Optimus generalize skills, but warn it may lack the physical feedback that comes from direct teleoperation.
Why it matters: This shift, insiders say, could let Tesla scale data collection faster, reflecting Elon Musk’s belief that AI learns best through cameras — a principle already powering Tesla’s self-driving tech. The real question: can it capture enough richly annotated video for a robot to master a wide range of household and industrial tasks?
BOSTON DYNAMICS

Image source: Boston Dynamics
The Rundown: Boston Dynamics just dropped a new clip showing Spot, its four-legged robot dog, landing gymnast-style backflips. But it’s not all for show — as lead engineer Arun Kumar explains, these stunts are a real stress-test for Spot’s agility.
The details:
Kumar explains in the video that backflips are not designed for customers, but to push the robot's hardware and motors to their absolute limits.
Several clips reveal Spot tumbling or landing awkwardly, highlighting the trial-and-error nature of training robots for extreme maneuvers.
Reinforcement learning drives the progress, with Spot training through countless trial-and-error cycles until the flips stick.
Spot’s backflip lessons help engineers develop better recovery algorithms for real-world challenges, ensuring the robot can right itself if it slips or trips.
Why it matters: Spot certainly isn’t the only robot dog that can do tricks, but watching it recover from failures gives us a glimpse into the messy, iterative process of developing real-world robotics. Plus, Kumar explains the real purpose behind the stunts: creating versatile robots that can recover from falls, even while carrying heavy payloads.
HYUNDAI

Image source: Hyundai
The Rundown: Hyundai Motor Group just announced it will invest $26B in the U.S. through 2028, with $5B of that earmarked for a state-of-the-art robotics manufacturing plant to produce 30K robots a year.
The details:
This facility is envisioned as a "Robotics Innovation Hub," focused on design, manufacturing, testing, and the deployment of advanced robots.
As Hyundai owns an 80% stake in Boston Dynamics, the U.S. robotics plant will accelerate the commercialization and scaling of Spot and Atlas robots.
Besides robotics, the plan includes building a new steel mill in Louisiana and scaling up Hyundai and Kia’s existing U.S. car manufacturing operations.
Why it matters: Hyundai is betting big on robotics, planning one of the largest, most advanced robot manufacturing hubs in the U.S. The facility will churn out robots at a scale rarely seen outside China or research labs, while creating thousands of jobs and supercharging Hyundai’s own smart factories.
QUICK HITS
San Francisco partially lifted its five-year ban on private vehicles along Market Street, now allowing Waymo driverless taxis to operate during limited times.
Robomart, a Los Angeles-based startup, unveiled its Level 4 autonomous RM5 delivery robot with a $3 flat fee for customer orders.
China's Haiqin remotely operated vehicle (ROV), designed for deep-sea exploration up to 20K feet, successfully completed its maiden voyage in the South China Sea.
Global robotics investments soared to at least $4.35B in July 2025, with 93 funding rounds dominated by companies in the U.S., China, and Israel, according to a new report.
1X’s Bernt Bornich told CNBC that demand is high for the NEO home humanoid, which he says will offer full autonomy “closer to 2027.”
A fleet of Unitree robot dogs acted as volunteers at China’s Zhejiang University, helping students move into dorms by hauling their luggage.
North Carolina State University researchers created a self-driving lab where multiple robots, guided by AI, autonomously discover and optimize quantum dots.
COMMUNITY
Read our last AI newsletter: The AI app power rankings
Read our last Tech newsletter: Klarna gets $14B reality check
Read our last Robotics newsletter: Drones that fly like birds of prey
Today’s AI tool guide: Create stylish presentations with Canva AI
RSVP to our next workshop @ 4 PM EST Friday: Essential ChatGPT Tips
That's it for today! Before you go, we'd love to know what you thought of today's newsletter to help us improve The Rundown experience for you.
See you soon,
Rowan, Jennifer, and Joey—The Rundown’s editorial team