Senior Robotics & Software Engineer - Grippers R&D (202648)
Build a best-in-the-world object manipulation system with a cross-functional R&D team spanning Hardware, Software, and AI. Invent and implement strategies for using new robotic grippers to handle previously unpickable items. Improve the adaptability, scalability, and reliability of the robotic platform. Use data to build heuristics for handling different categories of items. Detect anomalies by combining signals to determine whether a robot picked more than one item at once or whether an item is coming apart.
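The anomaly detection described above can be sketched as a signal-combining heuristic. Everything in this sketch is hypothetical: the signal names, the thresholds, and the two-signal vote are illustrative choices, not the production logic.

```python
def detect_pick_anomaly(measured_weight_g, expected_weight_g,
                        detected_item_count, gripper_slip_events,
                        weight_tolerance=0.15):
    """Combine independent signals to flag a suspect pick.

    All signal names and thresholds here are illustrative placeholders,
    not the real heuristic.
    """
    signals = []
    # Weight well above the catalog weight suggests more than one item.
    if measured_weight_g > expected_weight_g * (1 + weight_tolerance):
        signals.append("overweight")
    # Vision counting more than one object in the gripper.
    if detected_item_count > 1:
        signals.append("multi_object_seen")
    # Repeated slip events can indicate an item coming apart.
    if gripper_slip_events >= 2:
        signals.append("unstable_grasp")
    # Require at least two agreeing signals to reduce false positives.
    return len(signals) >= 2, signals
```

Requiring agreement between signals is one simple way to trade recall for a lower false-positive rate.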
Senior Engineer, XBAT Simulation Modeling (R4546) (TX/SD/BOS)
As a Senior Modeling & Simulation Engineer, develop models and infrastructure for the integrated simulation pipeline in C++; design deterministic, high-performance simulation tools capable of faster-than-real-time execution for development, testing, and release; and implement test scenarios along with unit, system, and regression tests. Collaborate across autonomy, embedded, GNC, and test engineering teams to ensure the simulation mirrors real aircraft behavior and mission scenarios. Contribute to platform-agnostic simulation tooling to accelerate future development efforts. Perform verification and validation (V&V) analysis on model tools. Conduct system performance analysis and generate reports and visualizations. Apply best practices in C++, simulation architecture, and performance engineering.
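A deterministic, faster-than-real-time loop of the kind described typically fixes the integration step and avoids wall-clock pacing, so identical inputs give identical trajectories. A minimal sketch (the `dynamics(state, dt)` interface is an assumption, not the real model API):

```python
def run_fixed_step(dynamics, state, dt, t_end):
    """Advance a model with a fixed, deterministic step.

    `dynamics(state, dt)` returning the next state is a stand-in for
    the real model interface. No wall-clock sleeps: the loop runs as
    fast as the CPU allows, hence faster than real time.
    """
    steps = round(t_end / dt)   # integer step count avoids float drift
    trajectory = [state]
    for _ in range(steps):
        state = dynamics(state, dt)
        trajectory.append(state)
    return trajectory

# Example: first-order lag model, x' = -x, integrated with explicit Euler.
traj = run_fixed_step(lambda x, dt: x + dt * (-x), 1.0, 0.01, 1.0)
```

Because the step size and step count are fixed up front, two runs on the same inputs are bit-identical, which is what makes regression testing against golden trajectories practical.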
Staff Engineer, G&C (R4763)
As a Guidance and Controls engineer, you will be responsible for creating and maintaining all control and autonomy algorithms within the XBAT code base. This includes algorithm development, unit tests, component tests, flight software qualification, and flight test support. You will also be responsible for helping update and validate the truth models as required.
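As a generic illustration of the control algorithms such a role develops and unit-tests, here is a minimal PID controller. The gains and the first-step derivative handling are illustrative choices, not anything from the XBAT code base.

```python
class PID:
    """Minimal PID controller; a generic example, with arbitrary gains."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        # Suppress derivative kick on the first step: prev_error is unknown.
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Controllers like this are easy to unit-test in closed loop against a simple truth model, which mirrors the algorithm-development-plus-unit-test responsibility described above.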
Prognostics Reliability Engineer
Lead Zoox’s technical strategy for prognostics across vehicle systems, focusing on reducing in-service failures and improving fleet availability. Identify and prioritize failure modes where prognostics can create meaningful operational value based on failure behavior, detectability, warning horizon, and serviceability. Develop and manage prognostics concepts, methodologies, and technical requirements for monitoring degradation, predicting remaining useful life, and detecting pre-failure behavior in fielded systems. Partner with reliability, design engineering, service, firmware/software, and data teams to define signals, features, infrastructure, and product changes needed for effective prognostics. Translate field performance, repair history, usage patterns, and failure analysis into monitor strategies and deployable health indicators with Design Reliability and Field Reliability. Guide development, validation, and tuning of prognostic models and health monitoring algorithms using field and test data. Establish technical frameworks for evaluating prognostic performance including sensitivity, false positive burden, lead time, robustness, and operational usefulness. Drive tradeoff decisions between prognostics, diagnostics, inspection intervals, and design improvement based on risk, cost, and implementation practicality. Build data and analysis architecture to support prognostics at scale including data quality, feature generation, monitor traceability, and performance feedback loops. Partner with service operations to ensure prognostics outputs translate into actionable maintenance decisions, clear workflows, and measurable business value. Provide technical leadership and mentorship across the prognostics workstream to raise methods, rigor, and cross-functional execution. Communicate recommendations, risks, and roadmap priorities to engineering leadership and stakeholders.
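Remaining-useful-life estimation as described can be illustrated with the simplest possible model: fit a linear trend to a degrading health indicator and extrapolate to a failure threshold. The linear model and the threshold are stand-ins for the model families and criteria a prognostics team would actually evaluate.

```python
def remaining_useful_life(times, health, failure_threshold):
    """Estimate RUL by fitting an ordinary-least-squares line to a
    health indicator and extrapolating to the failure threshold.

    A deliberately simple stand-in for real prognostic models.
    Returns None when no degradation trend is present.
    """
    n = len(times)
    mean_t = sum(times) / n
    mean_h = sum(health) / n
    # OLS slope of health vs. time.
    num = sum((t - mean_t) * (h - mean_h) for t, h in zip(times, health))
    den = sum((t - mean_t) ** 2 for t in times)
    slope = num / den
    if slope >= 0:
        return None  # indicator is not degrading: this model says nothing
    intercept = mean_h - slope * mean_t
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)
```

Even this toy version exposes the evaluation questions the listing raises: lead time (how far ahead `t_fail` lands), sensitivity (how small a slope is trusted), and false-positive burden (what happens when noise produces a spurious negative slope).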
Robotics Software Engineer - Manufacturing Automation
Lead the research and development of novel deep learning algorithms that enable robots to perform complex, contact-rich manipulation tasks. Explore the intersection of computer vision and robotic control by designing systems that allow robots to perceive and interact with objects in dynamic environments. Create models that integrate visual data to guide physical manipulation, moving beyond simple grasping to sophisticated handling of diverse items. Collaborate with a multidisciplinary team of engineers and researchers to translate cutting-edge concepts into robust capabilities that can be deployed on physical hardware for industrial applications. Research and develop deep learning architectures for visual perception and sensorimotor control in contact-rich scenarios. Design algorithms that enable robots to manipulate complex or deformable objects with high precision. Collaborate with software engineers to optimize and deploy research prototypes onto physical robotic hardware. Evaluate model performance in simulation and real-world environments to ensure robustness and reliability. Identify opportunities to apply state-of-the-art advancements in computer vision and robot learning to practical industrial problems. Mentor junior researchers and contribute to the technical direction of the manipulation research roadmap.
Lead Software Engineer, Advanced Pilot Assistant Software (Autonomy/Robotics)
Design, build, and deploy robotic and embedded software that powers advanced pilot assistance systems in production environments. Own autonomy-related features or subsystems from concept through deployment, emphasizing reliability and performance. Write, review, and maintain high-quality Python and C++ code across autonomy, systems, and embedded components. Integrate software with hardware, sensors, and perception or data ingestion pipelines to support autonomous and operator-in-the-loop decision-making. Optimize software for edge compute environments, managing CPU/GPU usage, latency, and implementing appropriate safety mechanisms and fail-safes. Lead testing, validation, and deployment efforts to ensure systems meet safety-critical and mission-critical requirements. Mentor engineers and contribute to technical direction through design reviews, code reviews, and hands-on collaboration.
Manager, Software - Perception (R3770)
Lead multidisciplinary teams in autonomy, integration, and testing: align technical efforts, resolve cross-functional challenges, and drive mission-focused execution while balancing hands-on technical oversight with performance optimization, innovation, and stakeholder communication. Design and implement advanced perception algorithms for object detection, classification, and multi-target tracking across diverse sensors. Integrate data from vision systems, radars, and other sensors using probabilistic and deterministic fusion techniques to generate accurate situational awareness. Develop and refine state estimation algorithms for localization and pose estimation using IMU, GPS, vision, and other sensing inputs. Interpret sensor ICDs and technical specifications to ensure proper data handling and synchronization. Optimize perception pipelines for performance, robustness, and real-time efficiency in simulation and real-world environments. Collaborate closely with autonomy, systems, and integration teams to interface perception outputs with planning, behaviors, and decision-making modules. Validate algorithms using synthetic data, simulations, and field testing. Coordinate with hardware and sensor teams to integrate perception algorithms with onboard compute platforms and sensor payloads. Drive innovation in airborne sensing techniques for unmanned aircraft operating in complex or contested environments. Travel approximately 10-15% of the year to office locations, customer sites, and flight integration events.
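The probabilistic fusion mentioned here reduces, in the scalar case, to the standard Kalman measurement update: weight the current estimate and the new measurement by their relative confidences. This is a generic textbook illustration, not any specific pipeline.

```python
def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update.

    Fuse state estimate x (variance P) with measurement z (variance R).
    The fused variance P*R/(P+R) is always below both inputs, which is
    why fusing sensors improves the estimate.
    """
    K = P / (P + R)            # Kalman gain: weight by relative confidence
    x_new = x + K * (z - x)    # pull the estimate toward the measurement
    P_new = (1 - K) * P        # equals P*R/(P+R)
    return x_new, P_new
```

The multi-sensor, multi-state versions used in real perception stacks generalize exactly this arithmetic to vectors and covariance matrices.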
Tech Lead Manager SWE, SDK
Service Tooling Engineer
Design and build intuitive web interfaces for robot data annotation, dataset visualization, and experiment tracking. Utilize data-driven techniques to optimize interfaces for efficiency and fast iteration cycles. Integrate AI models to automate manual tasks. Collaborate with AI researchers, robot operators, and annotators to support new user experiences.
Senior Robotics Software Engineer | Manipulation
Architect and evolve Gecko Robotics' ROS2-based control framework and planning systems for articulated manipulators. Develop perception-driven motion planning using visual and other sensor inputs. Design closed-loop inspect → analyze → rework workflows. Optimize robotic inspection throughput within active manufacturing lines. Own system-level integration between robot control stack, industrial hardware, and Gecko's inspection software. Support system deployment and validation in production environments.
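The closed-loop inspect → analyze → rework workflow can be sketched as a bounded retry loop. The callables and the pass limit below are placeholders for the real inspection, defect-classification, and repair interfaces, not Gecko's actual software.

```python
def inspect_analyze_rework(scan, classify, rework, max_passes=3):
    """Minimal closed-loop inspect -> analyze -> rework cycle.

    `scan`, `classify`, and `rework` are placeholder callables;
    `max_passes` is an arbitrary escalation limit.
    """
    for _ in range(max_passes):
        measurement = scan()             # INSPECT: collect sensor data
        defects = classify(measurement)  # ANALYZE: find out-of-spec regions
        if not defects:
            return "pass"                # part is in spec; exit the loop
        rework(defects)                  # REWORK: repair, then re-inspect
    return "escalate"                    # still failing after max_passes
```

Bounding the loop matters on an active manufacturing line: a part that cannot be brought into spec in a fixed number of passes is pulled rather than allowed to stall throughput.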