Robotics Engineer — Humanoid Robotics (New Grad)

Avant Robotics USA
Boston, MA

What you'll be doing

• Develop and optimize core robotics algorithms across motion planning, navigation, and perception, including SLAM and sensor fusion of LiDAR, camera, and IMU data

• Implement kinematic and dynamic models for trajectory generation, force control, and robust operation in dynamic, unstructured environments

• Integrate algorithms with real robot hardware, validate in ROS-based simulation (Gazebo, MuJoCo, Isaac Sim), and debug on physical humanoid platforms

• Research and apply cutting-edge techniques in imitation learning, reinforcement learning, and vision-language-action (VLA) models to advance robot autonomy and decision-making

• Collaborate closely across mechanical, electrical, and software teams to drive research from prototype to deployment


What we need to see

• BS, MS, or PhD in Robotics, Computer Science, Mechanical Engineering, Electrical Engineering, or a related field from a top-tier university — graduating in 2025 or 2026

• Proficiency in C++ and Python; hands-on experience with ROS/ROS 2 and at least one simulation environment (Gazebo, MuJoCo, Isaac Sim, or equivalent)

• Solid theoretical foundations in robotics: kinematics, dynamics, control systems, optimization, and sensor processing

• Practical experience in at least one of: motion planning, SLAM, perception, learning-based control, or hardware-software integration on real robots

• Strong problem-solving skills and the drive to take projects from research to real-world deployment


Ways to stand out

• Publication(s) at top robotics or ML venues (ICRA, IROS, RSS, CoRL, NeurIPS, ICLR, CVPR, etc.)

• Research experience in a top university robotics lab with real robot hardware deployment

• Experience with learning-based approaches: imitation learning, reinforcement learning, diffusion policies, or VLA models

• Hands-on experience with humanoid, legged, or mobile robot platforms

• Background in large-scale AI model training and deployment in robotics contexts
