AI/ML Perception Engineer (Defense Tech)

Approach Venture · Austin, TX · $120k - $210k

Actively hiring · Posted 6 days ago

Role overview

AI/ML Perception Engineer – Build Autonomous Intelligence for Next-Gen Defense Robotics!

Austin, TX | Onsite

Opportunity Summary

A rapidly scaling, venture-backed startup is forging the next generation of autonomous defense robotics designed to react in milliseconds to dynamic, high-threat scenarios. As a Principal AI/ML Perception Engineer, you will take the helm of the perception and autonomy stack, shaping the intelligence that powers a fully autonomous robotic system built for real-world deployment.

In this early engineering role, you’ll architect and implement the full perception pipeline, from sensor fusion and model development to field validation and optimization on embedded platforms. This is a rare zero-to-one opportunity where your work becomes the foundation of the product’s decision-making capabilities. You'll see your models running in rugged outdoor environments, influence the platform’s long-term technical direction, and help define what modern autonomous defense systems are capable of.
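
As a rough, hypothetical illustration of the perception loop this role owns (grab a frame, run a detector, feed detections to a tracking/fusion stage), the skeleton below uses stand-ins throughout: the camera read, detector, and tracker are placeholders, not the company's actual stack.

```python
# Hypothetical perception-loop skeleton. Everything here is a stand-in:
# the "detector" is a brightness threshold on a synthetic frame, where the
# real system would run a compiled CV/ML model on the embedded target.
import numpy as np

def grab_frame():
    """Stand-in for a camera driver read: a synthetic grayscale frame."""
    frame = np.zeros((120, 160), dtype=np.uint8)
    frame[40:60, 70:90] = 255                      # one bright "object"
    return frame

def detect(frame):
    """Stand-in detector: centroid of pixels above a fixed threshold."""
    ys, xs = np.nonzero(frame > 200)
    return [(float(xs.mean()), float(ys.mean()))] if xs.size else []

def update_tracks(tracks, detections):
    """Placeholder association step; a real tracker would gate and filter."""
    return tracks + detections

tracks = []
for _ in range(5):                                  # a few ticks of the main loop
    detections = detect(grab_frame())
    tracks = update_tracks(tracks, detections)

print(f"accumulated {len(tracks)} detections")
```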

About Us

We’re an emerging technology company focused on building compact robotic systems that can perceive, think, and act autonomously under extreme conditions. Our team blends deep experience in autonomy, machine learning, sensing, and hardware development. We thrive at the intersection of software and real-world constraints, solving complex perception challenges across diverse and noisy environments. Our mission moves fast, demands creativity, and requires engineering excellence at every layer of the stack.

Job Duties

  • Build and refine perception pipelines leveraging multi-sensor inputs including camera, IMU, and audio sources
  • Deploy computer vision and ML models tailored for low-latency, compute-limited embedded environments
  • Develop and implement audio-enhanced detection and localization methods using microphone arrays or acoustic sensors
  • Lead the data lifecycle, including collection, annotation, training, validation, and field testing
  • Architect robust real-time sensor fusion frameworks for state estimation and situational awareness (a minimal fusion sketch follows this list)
  • Integrate and bring up sensors (cameras, IMUs, microphones) while writing and optimizing low-level drivers
  • Prototype, evaluate, and iterate rapidly through live field trials in outdoor operational settings
  • Partner closely with controls, embedded, and hardware engineers to ensure system-wide coherence
  • Build tooling for visualization, logging, replay, and real-time debugging
  • Contribute to perception strategy and influence the ongoing roadmap for autonomy and intelligence
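
To make the sensor-fusion duty concrete, here is a minimal sketch of fusing IMU acceleration (prediction) with camera position fixes (correction) in a linear Kalman filter, a simplified stand-in for the EKF/UKF approaches named below. The 1-D state, rates, and noise values are illustrative assumptions, not details of the actual platform.

```python
# Minimal camera+IMU fusion sketch over a [position, velocity] state.
# Sensor names, rates, and noise values are illustrative assumptions.
import numpy as np

dt = 0.01                      # assumed 100 Hz IMU rate
F = np.array([[1.0, dt],       # constant-velocity state transition
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],   # how acceleration enters the state
              [dt]])
H = np.array([[1.0, 0.0]])     # camera observes position only
Q = np.diag([1e-4, 1e-3])      # process noise (tuned in the field)
R = np.array([[0.05]])         # camera measurement noise

x = np.zeros((2, 1))           # state: [position, velocity]
P = np.eye(2)                  # state covariance

def imu_predict(accel):
    """Propagate the state with one IMU acceleration sample."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def camera_update(pos_meas):
    """Correct the state with a camera-derived position measurement."""
    global x, P
    y = np.array([[pos_meas]]) - H @ x      # innovation
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

# Toy usage: ten IMU prediction steps, then one camera correction.
for _ in range(10):
    imu_predict(accel=0.2)
camera_update(pos_meas=0.001)
print(x.ravel())  # fused position / velocity estimate
```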

What we're looking for

  • 3+ years of professional experience developing ML/CV-based perception systems for robotics, autonomy, or embedded products
  • Demonstrated success deploying vision or ML models into constrained, real-time environments
  • Strong proficiency in Python and C++ for research and embedded development
  • Applied experience with object detection, classification, tracking, and related CV workflows
  • Hands-on background with sensor fusion techniques (EKF/UKF or similar) using camera + IMU data
  • Exposure to acoustic or audio-driven sensing methods
  • Familiarity with embedded inference platforms such as NVIDIA Jetson, Coral, or comparable systems
  • Ability to bring up hardware, debug drivers, and tune systems directly on embedded devices
  • Experience working on unmanned systems, robotics autonomy, counter-UAS, or advanced defense hardware
  • Understanding of model compression techniques (quantization, pruning, TensorRT optimization)
  • Knowledge of acoustic localization, filtering, and triangulation methods (see the bearing-estimation sketch after this list)
  • Prior contributions within early-stage or fast-iteration engineering teams
  • Background with real-time, safety-critical, or mission-driven embedded systems
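
As an illustration of the acoustic localization and triangulation knowledge listed above, the sketch below recovers a bearing from a two-microphone time difference of arrival (TDOA) using cross-correlation and a far-field approximation. The array geometry, sample rate, and synthetic signal are assumptions made for the example.

```python
# Minimal TDOA direction-of-arrival sketch for a two-microphone array.
# Geometry, sample rate, and signal are illustrative assumptions.
import numpy as np

fs = 48_000          # sample rate (Hz), assumed
c = 343.0            # speed of sound (m/s)
d = 0.10             # microphone spacing (m), assumed

# Synthesize a broadband burst arriving 8 samples later at mic 2.
rng = np.random.default_rng(0)
burst = rng.standard_normal(1024)
true_delay = 8
mic1 = np.concatenate([burst, np.zeros(64)])
mic2 = np.concatenate([np.zeros(true_delay), burst, np.zeros(64 - true_delay)])

# Cross-correlate and locate the peak to recover the sample delay.
xcorr = np.correlate(mic2, mic1, mode="full")
lag = np.argmax(xcorr) - (len(mic1) - 1)
tdoa = lag / fs

# Far-field approximation: the delay maps to an arrival angle via arcsin.
angle = np.degrees(np.arcsin(np.clip(c * tdoa / d, -1.0, 1.0)))
print(f"estimated delay: {lag} samples, bearing ≈ {angle:.1f} degrees")
```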

Why Join Us

  • Opportunity to serve as a key technical leader and founding contributor
  • Build real autonomous systems that move from concepts to field demonstrations quickly
  • High-trust engineering environment with fast decision cycles
  • Meaningful influence on technical direction, product strategy, and culture
  • Work on technology that directly contributes to protecting people and infrastructure
  • Competitive equity offering
  • Strong PTO and flexible working environment
  • Relocation support for candidates moving to Austin
  • Long-term growth and leadership opportunities as the company expands

Compensation Details

$120,000 - $210,000

Tags & focus areas

Full-time · Machine Learning · Robotics