About the job
Location
On-site — Austin, TX
Employment Type
Full-Time
Job Title
Computer Vision Engineer (Junior, Senior, Lead/Principal)
Company Overview
Join 9-mothers, a well-funded startup building autonomous machines for defense applications. Our flagship product is designed to counter small, agile FPV suicide drones of the kind seen in contemporary conflicts such as the war in Ukraine. These systems demand exceptional perception and decision-making capabilities. If you are passionate about turning cutting-edge machine learning into fielded, operational systems, this is the environment for you.
While our current focus is defensive systems, there is significant potential for future expansion into more advanced capabilities.
Position Summary
We are looking for a skilled Computer Vision Engineer to serve as the "vision" of our autonomous counter-small-UAS (c-sUAS) platforms. You will architect, deploy, and refine the perception pipeline, with a focus on low-latency, high-frame-rate processing to reliably track fast-moving targets.
We expect you to be able to build models and systems from first principles rather than simply apply existing frameworks. That means designing bespoke models where off-the-shelf tools like YOLO fall short, backed by a deep understanding of the problem space.
Essential Duties
Design and implement an end-to-end embedded computer vision pipeline in C++ and Python.
Develop and deploy real-time object detection models (e.g., YOLO-family architectures) and optimized CNNs to deliver best-in-class detection and tracking performance.
Optimize code for low-latency, high-frame-rate performance on constrained embedded hardware; hands-on experience with NVIDIA Jetson platforms and camera pipelines is expected.
Apply geometric vision principles (camera calibration, rectification, and 3D geometry) to convert 2D camera detections into precise 3D coordinates for fire-control and guidance systems.
Collaborate closely with robotics and AI teams to ensure that perception data is reliable for autonomous decision-making and effective countermeasure guidance.
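To give a flavor of the geometric-vision work above, here is a minimal sketch of back-projecting a 2D pixel detection into 3D camera-frame coordinates using the standard pinhole model. All parameter values (`fx`, `fy`, `cx`, `cy`, the pixel location, and the range) are hypothetical placeholders, not values from our actual system.

```python
def pixel_to_camera_frame(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) at distance `depth` (meters along the
    optical axis) into camera-frame coordinates (X, Y, Z), assuming a
    calibrated pinhole camera with no lens distortion."""
    x = (u - cx) / fx  # normalized image-plane coordinate
    y = (v - cy) / fy
    return (x * depth, y * depth, depth)

# Example with made-up intrinsics for a 1920x1080 sensor: a detection at
# the principal point maps straight down the optical axis.
print(pixel_to_camera_frame(960.0, 540.0, 100.0,
                            fx=1000.0, fy=1000.0, cx=960.0, cy=540.0))
# (0.0, 0.0, 100.0)
```

In practice this step sits downstream of calibration and rectification, and the resulting camera-frame point would still be transformed into a world or platform frame before being handed to guidance.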