About the job
About Mach Industries
Founded in 2022, Mach Industries is a defense technology company building next-generation autonomous defense platforms. Our mission is to deliver scalable, decentralized defense systems that strengthen the strategic capabilities of the United States and its allies. With a team of roughly 220 employees, we retain the agility and ambition of a startup.
Our vision is to transform the future of warfare through state-of-the-art manufacturing, rapid innovation, and an unwavering commitment to national security. We address the challenges of modern conflict with advanced systems designed to deter kinetic aggression and safeguard global stability.
The Role
As a Perception Engineer at Mach Industries, you will help develop an AI-driven autonomy stack for environments where GPS and other sensing are unavailable or unreliable. You will design, train, and deploy vision and multi-sensor perception systems that enable navigation, targeting, and automatic target recognition across our product lines, collaborating with deep learning, computer vision, and embedded systems teams to turn research-grade algorithms into fielded capabilities.
Key Responsibilities
Develop and optimize detection, segmentation, and tracking architectures (CNN/Transformer) for EO/IR and multi-spectral imagery, ensuring robust generalization across varied and degraded conditions.
Establish training and evaluation pipelines (PR/ROC, mAP, latency, robustness suites); implement continuous regression testing and model-update loops utilizing field data.
Optimize models for real-time embedded inference through quantization and pruning with TensorRT/ONNX Runtime; profile CPU/GPU performance to meet stringent throughput and latency targets on Jetson-class hardware.
Integrate vision outputs with auxiliary sensing modalities (e.g., radar, LiDAR, RF cues) to support confirm/deny processes, association, and track management via decision-level fusion.
Build visualization, triage, and root-cause analysis tools that speed insight from simulations, hardware-in-the-loop tests, and flight logs; develop comprehensive end-to-end test plans in cooperation with hardware and flight teams.
Implement instrumentation for health metrics, drift detection, and graceful degradation; produce clear tests and documentation aligned with performance requirements.

