
LLM-Guided Robotic Manipulation - Detailed Analysis & Overview


LLM-Guided Robotic Manipulation

Part of my research at Cal Poly, San Luis Obispo. The prompt used was Prompt 3 (alternating color stack with correction): "1. ...

Which LLM is Best for Robotic Manipulation? (Tested!)

GPT-5, Claude Sonnet 4, Grok 4, and Gemini 2.5 Flash: ...

LLM Driven Robotic Manipulation

Controlling Robots using a Large Language Model

Blog: https://mikelikesrobots.github.io/blog/

I Taught My Robot to Grab Things (Using VLA & LLMs)

RoboCrew lib (our code): https://github.com/Grigorij-Dudnik/RoboCrew. In this video, we are taking on the next challenge: teaching ...

Explained: Using LLMs to Control Humanoid Robots

Our Chief Technology Officer, Pras Velagapudi, explains what happens when we use natural language voice commands and ...

LLM-Driven Robotic Manipulation Using Vision, Homography Calibration, and an Agentic AI Framework

This project demonstrates a natural-language-driven ...

An LLM-Guided Robotic Arm System with Hierarchical Decision Making

This video presents a layered embodied-intelligence architecture for ...

Human-in-the-loop Learning for Adaptive Robot Manipulation using LLMs and BTs

IROS 2025: Human-in-the-loop Learning for Adaptive ...

Enhancing the LLM-Based Robot Manipulation Through Human-Robot Collaboration

Our research presents an innovative method that synergizes GPT-4-driven LLMs and human- ...

LLM-Driven Adaptive Robotic Manipulation for Complex Task Planning and Execution

This project explores the integration of Large Language Models (LLMs) and Vision-Language Models (VLMs) into ...

Jailbreaking LLM-Controlled Robots – Alex Robey | IASEAI 2025

What happens when ...

Agentic AI for Natural-Language Robotic Manipulation Using Vision and LLMs

This video demonstrates my final project for RAS 545 ...

Analytical and Voice-Guided Robotic System for Real-Time Object Grasping and Manipulation Using LLM

GazeVLA: Learning Human Intention for Robotic Manipulation (Apr 2026)

Eureka! Extreme Robot Dexterity with LLMs | NVIDIA Research Paper

A new AI agent developed by NVIDIA Research that can teach ...

Enhancing Stability and Reliability in LLM Driven Robotic Manipulation Through Human Skill Demonstration

LLM-Guided Safe Motion Planning with Smoothed Trajectories for Autonomous Vehicles

Gazebo #ROS2 ...

Natural Language Robotic Control: Cube Stacking Task (MuJoCo Simulation)

This video demonstrates a ...
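A common thread across the projects listed above is a two-stage pipeline: an LLM (or VLM) translates a natural-language command into a short plan of robot primitives, and a separate driver executes those primitives. The sketch below illustrates that pattern only; the `plan` function is a hypothetical rule-based stand-in for a real LLM call, and `pick`/`place` are assumed primitive names, not APIs from any of the listed projects.

```python
# Minimal sketch of the LLM-to-robot-primitives pattern shared by the
# projects above. The "planner" here is a rule-based stand-in for an
# LLM call; pick/place are hypothetical primitives for illustration.
from dataclasses import dataclass


@dataclass
class Action:
    verb: str    # primitive name, e.g. "pick" or "place"
    target: str  # object or location the primitive acts on


def plan(command: str) -> list[Action]:
    """Stand-in planner: maps a stacking command to a primitive sequence.

    A real system would prompt an LLM here and parse its structured output.
    """
    words = command.lower().replace(",", "").split()
    colors = [w for w in words if w in {"red", "blue", "green", "yellow"}]
    # Naive parse of the form "stack the X cube on the Y cube".
    if words and words[0] == "stack" and len(colors) == 2:
        top, base = colors
        return [Action("pick", f"{top} cube"),
                Action("place", f"on {base} cube")]
    raise ValueError(f"cannot plan: {command!r}")


def execute(actions: list[Action]) -> list[str]:
    """Stand-in robot driver: logs the primitive calls it would issue."""
    return [f"{a.verb}({a.target})" for a in actions]


if __name__ == "__main__":
    print(execute(plan("Stack the red cube on the blue cube")))
    # → ['pick(red cube)', 'place(on blue cube)']
```

The design point the videos share is the separation of concerns: the language model only produces a symbolic plan, so the low-level controller (MuJoCo, Gazebo/ROS 2, or real hardware) can validate and execute it independently.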