Robotics Intern

Embodied, Inc. is a technology company founded by veteran roboticist Paolo Pirjanian (iRobot, Evolution Robotics) with the conviction that the next big wave of technology will be driven by human-machine interfaces that are socially aware and intelligent.

Embodied’s veteran team of technologists, neuroscientists, child development specialists, and creative storytellers has been reinventing human-machine interaction to enable interactions as natural and intuitive as those between humans. Through extensive research, the team developed a breakthrough technology platform, SocialX™, that incorporates advanced AI and machine learning to support fluid conversation, body language, eye contact, and emotions.

The first iteration of this technology is Moxie™, an animated companion for children developed to help promote social, emotional, and cognitive learning. Recognized by TIME magazine as one of the Best Inventions of 2020, Moxie™ has been called “the robot pal you dreamed of as a kid” (Wired Magazine), “the robot that could be your child’s or parent’s new best friend” (Fast Company), and “a technically impressive childhood robot” (TechCrunch). You can learn all about Moxie™ and see how Embodied (one of Fast Company’s Most Innovative Companies of 2021) works at:


Position Summary


We are looking for an open-minded, creative team player to contribute to a revolutionary robot experience for children. Embodied is looking for candidates interested in learning about and contributing to the development of Moxie, our first AI Character designed to support children’s development and mental wellbeing. The candidate will work within Embodied’s R&D team and will participate in the optimization, improvement, enhancement, and benchmarking of Embodied interaction technologies, with a particular focus on human-robot interaction systems. The candidate will also participate in prototyping and evaluating novel sensors and robot assemblies.

The candidate's project will be selected according to the applicant's skill level and interests. The following list presents some of our current projects, spanning computer vision, speech recognition, and machine learning:

  • Evaluation and benchmarking of state-of-the-art techniques for face recognition and emotion detection
  • Evaluation and implementation of state-of-the-art gesture recognition algorithms
  • Development of algorithms for user pose estimation
  • Evaluation and enhancement of voice activity detection algorithms
  • Development of multimodal (audio, video, text) techniques for estimating the user's state (happiness, boredom, frustration, etc.)


The typical work cycle involves research and algorithm development (typically in Python); algorithm testing, refinement, and benchmarking; and finally C/C++ implementation and integration into the robot.


Qualifications

  • Advanced BS student (80%+ of coursework completed) or graduate student
  • Proficient in C++ and Python, with good documentation and code organization habits
  • Excellent mathematical background, with a solid foundation in probability theory, systems theory, multivariable calculus, and non-linear optimization
  • Excellent GPA
  • Ability to tackle complex technical problems with creative solutions
  • Experience with computer vision, machine learning, or speech recognition is a definite plus
  • Excellent communication, organization, and interpersonal skills: in this role, you will need to communicate clearly and coordinate complex tasks among teams with diverse backgrounds


At Embodied, we support diversity and we are an equal opportunity workplace. We are a dynamic and diverse team that likes to challenge the status quo.


Location: Pasadena, CA — Exceptional remote applicants may be considered.

Contact us: