Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.
ICRA 2026: 1–5 June 2026, Vienna
Enjoy today's videos!
Suzumori Endo Lab at Science Tokyo has developed a canine musculoskeletal robot using McKibben artificial muscles. The robot reproduces a dog's flexible, hammock-like shoulder structure to study the biomechanical functions of the canine musculoskeletal system.
[ Suzumori Endo Robotics Laboratory ]
SNAIL ROBOT!!!
We present a system that converts speech into physical objects using 3D generative AI and discrete robotic assembly. By using natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robot programming.
[ MIT ]
Meet the new generation of edge AI: a fully autonomous machine-vision system built for robotics, automation, and real-world intelligence. See how the OAK 4 combines computing, recognition, and 3D perception in a single device.
[ Luxonis ]
Thanks, Max!
Inspired by the sinuous resilience of vines, engineers at MIT and Stanford University developed a robotic gripper that can wind around and pick up a variety of objects, including a glass vase and a watermelon, offering a gentler approach than conventional grippers. A larger version of the robotic tendril can also safely lift a person out of bed.
[ MIT ]
[ Paper ]
Thanks, Bram!
Autonomous driving is the defining challenge for AI in the physical world. At Waymo, we address this challenge by prioritizing explicitly safe AI, where safety plays a central role in building our models and AI ecosystem from the ground up.
[ Waymo ]
Created by engineering students at Texas A&M University, this AI robotic dog is reimagining how robots operate in disaster zones. Designed to navigate debris, avoid hazards, and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember, and plan its next move, just like a first responder.
[ Texas A&M ]
In this audio clip, created from data collected by the SuperCam microphone aboard NASA's Perseverance rover, the sound of an electrical discharge can be heard as a Martian dust devil passed over the rover. The recording was made on October 12, 2024, the 1,296th Martian day, or sol, of Perseverance's mission to the Red Planet.
[ NASA Jet Propulsion Laboratory ]
In this episode, we open the archive of presenter Hannah Fry's visit to our California robotics lab. Filmed earlier this year, Hannah interacts with a new set of robots, ones that don't just see, but think, plan, and act. Watch how the team works behind the scenes to test the limits of generalization by having robots handle unseen objects on their own.
[ Google DeepMind ]
A GRASP on Robotics seminar by Parastoo Abtahi of Princeton University, on the topic "When Robots Disappear: From Haptic Illusions in VR to Object-Centric Interactions in AR."
Advances in audiovisual rendering have enabled the commercialization of virtual reality (VR); haptic technologies, however, have not kept pace with these advances. Although various robotic systems aim to bridge this gap by simulating the sensation of touch, numerous hardware limitations make realistic touch interactions in VR difficult. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can create interactions that not only overcome the current limitations of robotic haptic hardware in VR but also extend our abilities beyond what is possible in the physical world.
In the first part of this talk, I will present my work on redirection illusions, which exploit the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, for example the positioning accuracy of drones and the resolution of shape displays. In the second part, I will discuss how we translate these illusory interactions to physical spaces in augmented reality (AR) to facilitate situated and bidirectional human-robot communication by bridging users' mental models and robots' internal representations.
[ University of Pennsylvania GRASP Laboratory ]