Research Engineer, SLAM & Multi-View Geometry
As a SLAM / Multi-View Geometry Engineer on the Robotics team, you will develop systems that enable robots to perceive, track, and reconstruct the world in 3D from multi-camera and multimodal sensor data. You will work on real-time and offline SLAM pipelines used during teleoperation and robot data collection, as well as scalable systems for reconstructing and tracking 3D structure from large datasets. Specific responsibilities include:
- Developing and deploying online SLAM systems used during robotic data collection with multi-camera sensor stacks and teleoperation platforms
- Building systems for large-scale 3D reconstruction and point tracking across massive datasets
- Working with research and engineering teams to scale multi-view geometry pipelines to large datasets
- Improving the accuracy, robustness, and scalability of perception systems used in robotics data collection and training pipelines
- Collaborating across robotics, perception, and ML teams to integrate geometry-based methods with learned models
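For readers unfamiliar with the multi-view geometry work described above, a toy sketch of one of its core primitives: triangulating a 3D point from two camera rays via the closest-point midpoint method. This is illustrative only — function names and the parallel-ray tolerance are assumptions, and production SLAM systems use calibrated multi-view solvers — but the underlying geometry is the same.

```python
# Midpoint triangulation: recover a 3D point from two viewing rays,
# each given as an origin o and a direction d in world coordinates.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment connecting two 3D rays."""
    r = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; point is not observable")
    # Solve the 2x2 normal equations for the closest points on each ray.
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, s))
    p2 = add(o2, scale(d2, t))
    return scale(add(p1, p2), 0.5)
```

When the two rays actually intersect (zero-noise observations), the midpoint coincides with the intersection; with pixel noise, it splits the disagreement between the two views.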
Intern, Software Engineer - Perception
As a Perception Engineering Intern at Hayden AI, you will:
- Take ownership of a real project and see it through to completion
- Build and ship features with support from senior engineers
- Write clean, scalable code; test your work; and iterate quickly
- Be involved in all phases, from design discussions to deployment
- Collaborate with engineers in code reviews and team discussions
- Participate in standups, sprint planning, and retrospectives
- Support the team on ad hoc engineering tasks
- Help improve performance, reliability, or usability where needed
- Ask questions, seek feedback, and apply it quickly
Deliverables or project examples may include GPS data analysis, training deep learning models, creating AI datasets, lidar/camera data tooling, test cases for end-to-end system performance, developing a cloud service in the event processing pipeline, and adding a page or new user flow to the Portal web application.
Senior Software Engineering Lead, Resilience and Chaos Engineering
Lead the research and development of novel deep learning algorithms that enable robots to perform complex, contact-rich manipulation tasks. Explore the intersection of computer vision and robotic control, designing systems that allow robots to perceive and interact with objects in dynamic environments. Create models that integrate visual data to guide physical manipulation for sophisticated handling of diverse items. Collaborate with a multidisciplinary team of engineers and researchers to translate cutting-edge concepts into robust capabilities deployable on physical hardware for industrial applications. Research and develop deep learning architectures for visual perception and sensorimotor control in contact-rich scenarios. Design algorithms that enable robots to manipulate complex or deformable objects with high precision. Work with software engineers to optimize and deploy research prototypes onto physical robotic hardware. Evaluate model performance in simulation and real-world environments to ensure robustness and reliability. Identify opportunities to apply state-of-the-art advancements in computer vision and robot learning to practical industrial problems. Mentor junior researchers and contribute to the technical direction of the manipulation research roadmap.
Sensing Systems Engineer
The Systems Engineer is responsible for owning the holistic performance of Sesame’s wearable devices across the full stack, including hardware selection and integration, firmware, signal processing, and application behavior. They research, evaluate, and recommend optimal sensor technologies and devices for various wearable applications considering physical, electrical, software capabilities, cost, schedule, and user impact. They own the end-to-end performance of sensor systems from prototyping through mass production, managing latency, power consumption, thermal constraints, and reliability. The role includes defining system-level test plans, acceptance criteria, and specifications to ensure high-quality user experience and validating each layer of the stack. They design and supervise data collection strategies for ground-truth data sets needed for algorithm and model development. Additionally, they develop, test, and implement signal processing, sensor fusion, and calibration systems to translate raw sensor data into usable outputs, and collaborate with the ML team to enhance Sesame agents’ responses by integrating sensor data.
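As a flavor of the sensor-fusion and calibration work this role owns, here is a minimal complementary filter that blends a gyroscope's integrated angle (smooth but drifting) with an accelerometer's gravity-derived angle (noisy but drift-free) into a pitch estimate. The blend weight and sample period are illustrative assumptions, not a description of Sesame's actual stack.

```python
import math

# Complementary filter: a minimal sensor-fusion sketch.
# ALPHA and DT are illustrative tuning choices.
ALPHA = 0.98   # trust in the gyro over one step
DT = 0.01      # sample period in seconds (100 Hz)

def accel_pitch(ax, az):
    """Pitch angle (radians) implied by the measured gravity vector."""
    return math.atan2(ax, az)

def fuse_step(pitch, gyro_rate, ax, az, alpha=ALPHA, dt=DT):
    """One update: integrate the gyro, then correct toward the accel."""
    gyro_estimate = pitch + gyro_rate * dt
    return alpha * gyro_estimate + (1.0 - alpha) * accel_pitch(ax, az)
```

Latency and power constraints often push wearables toward cheap filters like this rather than a full Kalman filter; the design trade-off is exactly the kind this role arbitrates.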
Service Technician Associate I - Pittsburgh, PA (Contract)
Develop tools for validation and regression testing of image sensors, image processing pipelines, and hardware/software integration. Perform lab and real-world camera data collection and analysis. Participate in tuning sensor parameters and image processing pipelines to optimize image quality. Troubleshoot camera and image-quality issues observed on autonomous vehicles. Design new hardware, and the necessary supporting software, for sensor range testing. Work with the perception software team to assess end-to-end camera performance.
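One common building block of the image-quality regression tooling described above is a sharpness metric compared against a stored baseline. The sketch below uses variance of a discrete Laplacian over a grayscale image (a list of lists of floats); the metric choice and the 0.8 tolerance are illustrative, not a specific vendor's test suite.

```python
# Sharpness metric for camera regression tests: a blurred or defocused
# capture scores lower than a sharp one, so a drop below a stored
# baseline can flag an image-quality regression.

def laplacian_variance(img):
    h, w = len(img), len(img[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbor discrete Laplacian at the interior pixel (x, y).
            lap = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]
                   - 4.0 * img[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def passes_sharpness_baseline(img, baseline, tolerance=0.8):
    """Regression check: score must stay within `tolerance` of baseline."""
    return laplacian_variance(img) >= tolerance * baseline
```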
Senior Manager, Perception
Lead high-impact Perception teams, managing technical roadmap and milestone goals. Collaborate with AI and software leaders, simulation, systems design, and mission assurance teams to deliver a dynamic objects perception system across various sensor modalities and perception pipelines. Build and lead a group of managers and engineers responsible for roadmap, productivity, execution, and impact. Set vision for and grow a team of software engineers involved in planning, execution, and success of complex technical projects, providing technical leadership. Collaborate across teams to brainstorm and accelerate perception capability development. Provide summaries, progress updates, and recommendations to executive leadership. Establish best practices and statistical rigor around data-driven decision-making. Stay updated on industry and academic trends in AI and perception.
Senior Software Engineer, Pilots
As a Senior Software Engineer on the Pilots team, you will:
- Deliver robust, thoroughly tested, and maintainable C++ code for edge and robotics platforms
- Design, implement, and own prototype perception systems that may transition into production-grade solutions
- Construct and refine real-time perception pipelines, including detection, tracking, and sensor fusion
- Adapt and integrate ML and CV models for Hayden-specific applications
- Drive technical decision-making, balancing prototyping speed with production readiness
- Collaborate with the Product team and cross-functional Engineering departments
- Contribute to shared infrastructure, tooling, and architectural patterns as pilots mature into foundational products
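A representative step inside the detection-and-tracking pipelines mentioned above is associating new detections with existing tracks. Below is a greedy IoU matcher — a common baseline before Hungarian assignment — written in Python for brevity even though the role calls for C++; boxes are (x1, y1, x2, y2), and all names and the 0.3 gate are illustrative, not Hayden's implementation.

```python
# Greedy IoU association: match each track to its best unclaimed detection.

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def associate(tracks, detections, min_iou=0.3):
    """Return (track_index, detection_index) pairs, best overlaps first."""
    pairs = sorted(
        ((iou(t, d), ti, di)
         for ti, t in enumerate(tracks)
         for di, d in enumerate(detections)),
        reverse=True)
    matches, used_t, used_d = [], set(), set()
    for score, ti, di in pairs:
        if score < min_iou:
            break  # remaining pairs overlap too little to be the same object
        if ti not in used_t and di not in used_d:
            matches.append((ti, di))
            used_t.add(ti)
            used_d.add(di)
    return matches
```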
Machine Learning Engineer - Perception Mapping
As a software engineer on the perception mapping team at Zoox, you will curate, validate, and label datasets for model training and validation. You will research, implement, and train machine learning models to perform semantic map element detection and closely collaborate with validation teams to formulate and execute model validation pipelines. You will integrate models into the greater onboard autonomy system within compute budgets. Additionally, you will serve as a technical leader on the team, maintaining coding and ML development best practices and contributing to architectural decisions.
Robotics Engineer
As a Software Engineer in the Robotics and Automation group, you will design and deploy systems to automate material science research and discovery laboratories, specializing in robotics, automation, and perception software development. You will:
- Architect and develop software systems that control and orchestrate robotic workcells for autonomous materials experimentation
- Design scalable control frameworks for flexible automation involving robots, motion systems, sensors, and lab instruments
- Collaborate with hardware, mechatronics, and science teams to translate experimental workflows into reliable automated processes
- Build and maintain APIs and services for scheduling, execution, monitoring, and data capture
- Develop simulation, testing, and validation tools to accelerate development and ensure system reliability
- Integrate 2D and 3D vision systems with robotic manipulation, motion planning, and execution
- Optimize system performance, robustness, and throughput under rapid iteration cycles
- Contribute to technical direction, architecture decisions, and best practices
- Mentor junior engineers and help establish engineering standards
- Foster collaboration and open-mindedness, empowering the team to deliver world-class technology at unprecedented speed
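To make the "APIs and services for scheduling, execution, monitoring" duty concrete, here is a minimal sketch of the shape such a workcell execution service might take: jobs are queued, run in order, and their status is queryable, with failures recorded rather than raised. All names (Job, WorkcellScheduler, the status strings) are hypothetical, not this employer's API.

```python
from collections import deque

class Job:
    """A queued workcell action with a queryable lifecycle."""
    def __init__(self, name, action):
        self.name, self.action = name, action
        self.status, self.result = "queued", None

class WorkcellScheduler:
    def __init__(self):
        self._queue = deque()
        self._jobs = {}

    def submit(self, name, action):
        """Enqueue a callable under a unique job name."""
        job = Job(name, action)
        self._queue.append(job)
        self._jobs[name] = job
        return name

    def run_next(self):
        """Execute one job; record failures instead of raising."""
        job = self._queue.popleft()
        job.status = "running"
        try:
            job.result = job.action()
            job.status = "done"
        except Exception as exc:
            job.result, job.status = exc, "failed"
        return job

    def status(self, name):
        return self._jobs[name].status
```

A production version would add persistence, concurrency, and instrument-level timeouts, but the queue-plus-status-ledger shape is the common core.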
Perception Engineer
As a Perception Engineer, you will:
- Design, implement, and deploy 2D and 3D vision systems for robotic manipulation, inspection, state verification, and sensor fusion
- Develop vision-guided automation solutions integrating cameras, lighting, optics, and robots in laboratory and industrial environments
- Implement perception pipelines for object detection, segmentation, pose estimation, and feature extraction
- Own camera calibration and system-level accuracy validation
- Develop novel algorithms for state estimation of fluids and particle flows
- Integrate vision outputs with robot motion planning, grasping, and task execution
- Tune and harden vision systems for robustness against variability in materials, reflections, and environmental conditions
- Collaborate with software, mechatronics, and mechanical teams to translate experimental and operational needs into automated solutions
- Contribute to technical direction, architecture decisions, and best practices across the robotics, perception, and automation software stack
- Bring an attitude of collaboration and open-mindedness that enables fearless, creative problem solving and empowers the team to ship world-class technology at unprecedented speed
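The "camera calibration and system-level accuracy validation" duty above centers on one quantity: reprojection error. The sketch below computes it for an undistorted pinhole model; the intrinsics (fx, fy focal lengths in pixels; cx, cy principal point) are illustrative, and lens distortion is omitted for brevity.

```python
# Reprojection RMSE with a pinhole camera model -- the core metric
# behind camera-calibration accuracy validation.

def project(point3d, fx, fy, cx, cy):
    """Project a camera-frame 3D point to pixel coordinates."""
    X, Y, Z = point3d
    return (fx * X / Z + cx, fy * Y / Z + cy)

def reprojection_rmse(points3d, pixels, fx, fy, cx, cy):
    """Root-mean-square pixel error between projections and detections."""
    sq = 0.0
    for p3, (u, v) in zip(points3d, pixels):
        pu, pv = project(p3, fx, fy, cx, cy)
        sq += (pu - u) ** 2 + (pv - v) ** 2
    return (sq / len(points3d)) ** 0.5
```

In practice the same metric, evaluated on held-out calibration targets, is what turns "the camera is calibrated" into a pass/fail acceptance criterion.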
