Amazon's new robot can touch and feel
PLUS: A drone that controls lightning strikes
Good morning, robotics enthusiasts. Amazon just unveiled Vulcan — its most advanced warehouse robot, equipped with a sense of touch that allows it to handle a vast array of items with human-like finesse.
Amazon is revolutionizing warehouse automation—but what does that mean for the company’s 1.5M human warehouse workers? The future of work is changing fast.
In today’s robotics rundown:
Amazon’s Vulcan with a ‘sense of touch’
This drone attracts and controls lightning
Meet Goby: A tiny, hackable $100 bot
Stanford teaches robots to move like us
Quick hits on other robotics news
LATEST DEVELOPMENTS
AMAZON

Image source: Amazon
The Rundown: Amazon just unveiled Vulcan, a cutting-edge robot that it says has a true “sense of touch,” allowing it to pick and sort three-quarters of the items in the company’s vast warehouse stock, a job handled predominantly by humans.
The details:
Vulcan’s grippers come with force feedback sensors that determine how much pressure to apply when handling packages, including fragile ones (a simplified grasp-loop sketch follows these details).
The robot’s arm has a “spatula-like” end effector that can extract items from densely packed compartments that were previously only accessible to humans.
Its tactile sensors and AI-powered interpretation allow for precise, adaptive motion strategies and real-time learning from physical interactions.
Vulcan can handle about 75% of the 1M different items in a typical Amazon warehouse, besting older robots that relied on vision and suction alone.
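For the technically curious, here is a minimal sketch of the kind of force-feedback grasp loop described above. Amazon has not published Vulcan’s control code, so every name, threshold, and the simulated sensor here is purely illustrative.

```python
from dataclasses import dataclass


@dataclass
class GraspConfig:
    target_force_n: float  # desired contact force in newtons
    max_force_n: float     # safety ceiling, e.g. for fragile items
    step_mm: float = 0.2   # closing increment per control cycle


def read_force_sensor(position_mm: float) -> float:
    """Stand-in for a real force sensor: simulated contact begins at
    ~5 mm of closure and stiffens linearly from there."""
    return max(0.0, (position_mm - 5.0) * 0.8)


def close_with_force_feedback(cfg: GraspConfig) -> bool:
    """Close the gripper until the measured force reaches the target.

    Returns True on a stable grasp, False if the safety ceiling would
    be exceeded (e.g. an unexpectedly rigid object)."""
    position_mm = 0.0
    while position_mm < 50.0:              # mechanical travel limit
        force = read_force_sensor(position_mm)
        if force >= cfg.max_force_n:
            return False                   # abort rather than crush the item
        if force >= cfg.target_force_n:
            return True                    # gentle, stable grasp achieved
        position_mm += cfg.step_mm         # a real loop would command motors here
    return False


if __name__ == "__main__":
    fragile = GraspConfig(target_force_n=2.0, max_force_n=3.0)
    print("grasp ok:", close_with_force_feedback(fragile))
```

The key design idea is that the stopping condition comes from the sensed force, not from a fixed gripper position, which is what lets one controller handle both sturdy boxes and fragile items.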
Why it matters: Vulcan can run up to 20 hours a day and is already deployed in U.S. and German warehouses, where it has processed hundreds of thousands of orders. Amazon maintains the robot won’t replace workers, but its ability to learn and improve over time suggests a future where roles could shift significantly.
DRONE INNOVATIONS

Image source: Nippon Telegraph and Telephone Corp.
The Rundown: A Tokyo-based tech giant says that it has developed and successfully tested the world’s first drone-based system capable of actively triggering and guiding lightning strikes — channeling their immense power safely to the ground.
The details:
Nippon Telegraph and Telephone Corp. tested a drone encased in a metal Faraday cage, trailing conductive wire connected to a ground-based switch.
During a storm, the team launched the drone to an altitude of 300 meters, then flipped a high-voltage switch on the ground-side wire to induce a lightning strike (a simplified trigger sketch follows these details).
During tests, the drone survived direct lightning strikes with only partial melting of the cage and was able to remain airborne and operational.
The technology could help mitigate lightning-related damage in Japan, estimated at up to $1.4B annually.
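To make the triggering procedure concrete, here is a hypothetical sketch of the ground-station decision logic. NTT has not released its control software; the field threshold, sensor values, and switch interface are all invented for illustration.

```python
FIELD_THRESHOLD_KV_PER_M = 5.0  # assumed ambient-field level for a likely strike
TARGET_ALTITUDE_M = 300.0       # on-station altitude from the test described above


def should_trigger(field_kv_per_m: float, drone_altitude_m: float,
                   cage_ok: bool) -> bool:
    """Close the ground-side high-voltage switch only when the storm field
    is strong enough, the drone is on station, and its Faraday cage
    reports healthy."""
    return (field_kv_per_m >= FIELD_THRESHOLD_KV_PER_M
            and drone_altitude_m >= TARGET_ALTITUDE_M
            and cage_ok)


if __name__ == "__main__":
    # One simulated decision: strong field, drone at 300 m, cage intact.
    print("trigger:", should_trigger(6.2, 300.0, True))
```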
Why it matters: This technology offers a significant advancement over fixed lightning rods and can be rapidly deployed to protect vulnerable sites such as wind turbines or outdoor venues. Beyond protection, the company is also exploring the possibility of harnessing and storing lightning’s power as a renewable energy source.
CHARMED LABS

Image source: Charmed Labs
The Rundown: If you’ve ever dreamed of seeing the world through the eyes of a mouse, you can now do so with Goby, a tiny, hackable $100 telepresence robot created by Austin-based Charmed Labs.
The details:
Currently crowdfunding for its initial release, Goby is entirely reprogrammable with nothing more than a USB cable and the Arduino IDE.
The robot comes equipped with two independently motorized wheels and a unique articulated tail supported by a small ball.
It features an ESP32-S3 and an OmniVision OV2640 camera sensor for a live video feed, plus a 3-axis accelerometer and wheel odometry sensors.
Its BitBang software enables encrypted, low-latency WebRTC peer-to-peer connections, allowing anyone with the URL to control Goby from anywhere (see the drive-logic sketch below).
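As a taste of what hacking Goby might look like, here is a sketch of differential-drive logic for a two-wheeled bot. Note that Goby itself is reprogrammed via the Arduino IDE (C/C++); Python is used here only to keep this newsletter’s sketches in one language, and the message format and function names are invented, not Charmed Labs’ actual API.

```python
def mix_differential_drive(forward: float, turn: float) -> tuple[float, float]:
    """Map a joystick command to left/right wheel speeds in [-1, 1].

    forward: -1 (reverse) .. 1 (full ahead)
    turn:    -1 (hard left) .. 1 (hard right)
    """
    left = forward + turn
    right = forward - turn
    scale = max(1.0, abs(left), abs(right))  # preserve ratio while clamping
    return left / scale, right / scale


def handle_teleop_message(msg: dict) -> None:
    """React to a control message, e.g. one received over a WebRTC data channel."""
    left, right = mix_differential_drive(msg["forward"], msg["turn"])
    print(f"wheel command: left={left:+.2f} right={right:+.2f}")


if __name__ == "__main__":
    # Simulated remote command: mostly forward with a gentle right turn.
    handle_teleop_message({"forward": 0.8, "turn": 0.5})
```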
Why it matters: Home robot rivals include Loona and Enabot, but Goby stands out for being open source, hackable, and very affordable — offering a unique platform for both fun and practical applications like inspecting tight or hazardous spaces. Plus, setup via a QR code takes minutes, and you can share remote control access too.
STANFORD UNIVERSITY

Image source: Stanford University
The Rundown: This week, Stanford University roboticists released a new paper and demo videos showing how their TWIST real-time teleoperation framework enables humanoids to precisely mimic how humans move.
The details:
TWIST enables real-time teleoperation of humanoid robots by directly imitating human whole-body motions using motion capture (MoCap) data.
Under the hood, it uses a unified neural network controller, trained via reinforcement learning, allowing the robot to perform coordinated skills.
A two-stage framework trains a teacher policy with access to future motion data for smooth actions, then distills it into a student policy that uses only current data (see the sketch below).
TWIST shows versatility on robots like the Unitree G1, enabling tasks such as picking up objects, kicking, and crouching, all controlled by a human operator.
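The two-stage idea is the technically interesting part, so here is a minimal PyTorch sketch of it. This is a schematic reconstruction from the summary above, not the TWIST codebase: the dimensions, network sizes, and plain MSE distillation loss are all simplifying assumptions (the paper trains the teacher with reinforcement learning; a trained teacher is just assumed here).

```python
import torch
import torch.nn as nn

OBS_DIM, FUTURE_DIM, ACT_DIM = 64, 32, 23  # illustrative sizes

# Stage 1: a teacher policy that is allowed to peek at future reference
# motion frames (assumed already trained via RL, per the paper's setup).
teacher = nn.Sequential(nn.Linear(OBS_DIM + FUTURE_DIM, 256), nn.ReLU(),
                        nn.Linear(256, ACT_DIM))

# Stage 2: a deployable student policy conditioned only on the current
# observation, trained to imitate the teacher's actions.
student = nn.Sequential(nn.Linear(OBS_DIM, 256), nn.ReLU(),
                        nn.Linear(256, ACT_DIM))

opt = torch.optim.Adam(student.parameters(), lr=3e-4)

for step in range(1000):
    obs = torch.randn(128, OBS_DIM)        # stand-in for MoCap-driven states
    future = torch.randn(128, FUTURE_DIM)  # future motion only the teacher sees
    with torch.no_grad():
        target_actions = teacher(torch.cat([obs, future], dim=-1))
    loss = nn.functional.mse_loss(student(obs), target_actions)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The payoff of the distillation step is deployability: the student never needs future motion at run time, so it can track a human operator live.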
Why it matters: Compared to systems like H2O, which uses an RGB camera, TWIST leverages high-fidelity MoCap for sophisticated teacher-student training. Limitations include the lack of visual or tactile feedback for operators and robots overheating during prolonged use. Still, it paves the way for more functional humanoids.
QUICK HITS
📰 Everything else in robotics today
Igus, a motion plastics manufacturer in Germany, unveiled its Iggy Rob industrial humanoid, standing 1.7 meters tall and priced at around $54K.
Hugging Face released Open Computer Agent, an open-source, Operator-like AI agent tool that can perform tasks on the web.
Chinese firm Unitree and San Francisco-based Reborn announced a partnership to co-develop advanced AI specifically for Unitree’s humanoids.
A bipartisan group of U.S. lawmakers is urging an investigation into the use of Unitree robots by U.S. prisons and police forces, citing potential security risks.
Zoox, Amazon’s autonomous vehicle unit, paused operations of its driverless testing program and recalled its software following a crash in Las Vegas on April 8.
Chinese company Kepler began testing its fifth-generation humanoid K2, nicknamed 'Bumblebee,' on the SAIC-GM automotive assembly line.
University of Rochester researchers developed a new text-to-video model that learns real-world physics knowledge from time-lapse videos.
Korean researchers developed an autonomous robot specifically designed for wiping and UV-C disinfection in hospitals to reduce human exposure to pathogens.
MIT engineers designed a ping-pong robotic arm that can return shots with high-speed precision.
MIT also developed a system that enables robots to use only internal sensors to learn about an object’s weight or contents by picking it up and gently shaking it.
COMMUNITY
Join our next workshop this Friday, May 9th, at 4 PM EST with Dr. Alvaro Cintas, The Rundown’s AI professor. By the end of the workshop, you’ll be equipped to prompt with precision and creativity across any AI tool and improve AI responses.
RSVP here. Not a member? Join The Rundown University on a 14-day free trial.
That's it for today! Before you go, we'd love to know what you thought of today's newsletter to help us improve The Rundown experience for you.
See you soon,
Rowan, Jennifer, and Joey—The Rundown’s editorial team