Breaker
Sydney
Own and evolve Breaker's 3D localisation pipeline — turning 2D detections into accurate real-world coordinates using passive sensors only
Deep-stack work in camera-based localisation, sensor fusion, and real-time data processing on edge hardware
Looking for a mid-to-senior engineer who has deployed production localisation systems on moving robotic platforms
You'll build the metrics, tooling, and processes to continuously improve localisation accuracy across diverse hardware configurations
Must have working rights in Australia and be willing to participate in regular field testing
Join an exciting startup backed by globally recognised investors at the bleeding edge of physical AI
The way humans use robots is broken.
Modern problems demand more robots than we have operators. Every drone, ground vehicle, and maritime system requires dedicated training, manual control, and constant oversight. One operator per robot. One pilot per mission.
Breaker's AI agent breaks this constraint.
Our technology turns any robot into a truly autonomous, self-organising teammate. Operators command and query teams of robots through natural-language conversations over the push-to-talk radios they already carry: no laptops, no controllers, no additional gear.
Instead of manually flying search patterns across three different screens, you say, "survey this area and flag anything unusual." The robot team figures out how to divide the task, coordinate their movements, and report back what matters.
We're fundamentally changing the operator-to-robot ratio. Small teams become force multipliers.
Our software deploys directly onboard each robot, enabling real-time, intent-driven control even in contested environments with limited bandwidth. We're solving problems most AI companies never touch: sub-second inference on edge hardware with strict latency, power, and connectivity constraints.
We're backed by some of the best investors globally and growing our team across Austin, Texas, and Sydney, Australia. We're a small team of experienced engineers moving fast on technology that will define how humans and machines work together for decades to come.
Join us if you want to help build the robots we were promised 🤖
One of the most critical outputs of Breaker's product is putting accurate points on a map — taking 2D detections from camera systems and resolving them into precise 3D world coordinates. This is a hard, fascinating problem: we do it without LIDAR or active ranging, using passive sensors on moving platforms in unpredictable environments. You'll own this pipeline end-to-end.
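To give a flavour of the problem: the simplest passive approach casts a ray from the camera through the detection's pixel and intersects it with a terrain model. The sketch below is a minimal, illustrative version assuming a pinhole camera, a known camera pose, and a flat ground plane in a local metric frame; the function and parameter names are hypothetical, not Breaker's actual API.

```python
import numpy as np

def pixel_to_ground(pixel_uv, K, R_world_cam, cam_pos_world, ground_z=0.0):
    """Cast a ray through a pixel and intersect it with a flat ground plane.

    pixel_uv:       (u, v) pixel coordinates of the detection centre
    K:              3x3 pinhole intrinsics matrix
    R_world_cam:    3x3 rotation taking camera-frame vectors into the world frame
    cam_pos_world:  camera position in a local metric world frame (e.g. ENU)
    ground_z:       assumed ground-plane height in that frame
    """
    # Back-project the pixel into a camera-frame ray direction.
    uv1 = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray_cam = np.linalg.inv(K) @ uv1
    # Rotate the ray into the world frame.
    ray_world = R_world_cam @ ray_cam
    if ray_world[2] >= 0:
        raise ValueError("Ray does not intersect the ground plane")
    # Solve cam_pos_world + t * ray_world = ground_z for t, then walk the ray.
    t = (ground_z - cam_pos_world[2]) / ray_world[2]
    return cam_pos_world + t * ray_world
```

In practice the hard part is everything this sketch assumes away: pose uncertainty on a moving platform, timing offsets between sensors, terrain that is not flat, and cameras that do not match their nominal calibration.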
This is a deep ownership opportunity. You'll take an existing localisation system and make it yours by building the measurement infrastructure to quantify performance, researching and implementing new techniques, and expanding existing capability into areas like tracking moving objects from moving platforms and generalising across diverse camera hardware. You'll define the contracts between your system and the teams that feed into it (computer vision, hardware, field operations) and build the data workflows that turn real-world testing into continuous improvement.
You'll work at the deepest level of our software stack, where real-time sensor data meets geometric reasoning. If you're the kind of engineer who gets energy from digging into a dataset to find where localisation breaks, running experiments to prove a new approach works, and building the automation to measure it all, you'll fit right in.
Own the accuracy and performance of Breaker's 3D localisation pipeline: 2D bounding boxes in, lat/long/alt out
Build metrics, ground truth collection processes, and automated evaluation tooling to measure and track localisation performance
Research, prototype, and deploy new localisation and sensor fusion techniques to improve accuracy and robustness
Extend the system to track moving objects from moving platforms
Generalise the localisation pipeline to deploy reliably across different camera systems and hardware configurations
Define interface contracts with adjacent systems, such as what the CV pipeline must deliver and what camera hardware must provide
Process and analyse field data to identify failure modes, validate improvements, and feed findings back into the development loop
Design and specify field test scenarios (CONOPs-style) to stress-test specific localisation behaviours
Participate in regular field testing, including training field teams on data collection requirements
Conduct software-in-the-loop and hardware-in-the-loop testing and experimentation
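The metrics and evaluation work above starts with something simple: scoring predicted coordinates against surveyed ground truth. The sketch below is one hypothetical way to do that, using an equirectangular approximation (adequate for errors of tens of metres) and a circular-error-probable summary; the function names and the choice of metric are illustrative assumptions, not a description of Breaker's tooling.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def horizontal_error_m(pred, truth):
    """Horizontal distance in metres between two (lat, lon) pairs in degrees.

    Uses an equirectangular approximation, which is fine for the short
    baselines involved in scoring localisation error against ground truth.
    """
    lat0 = math.radians((pred[0] + truth[0]) / 2.0)
    d_lat = math.radians(pred[0] - truth[0])
    d_lon = math.radians(pred[1] - truth[1]) * math.cos(lat0)
    return EARTH_RADIUS_M * math.hypot(d_lat, d_lon)

def circular_error_probable(errors, fraction=0.5):
    """Error radius containing the given fraction of samples (CEP50 by default)."""
    ranked = sorted(errors)
    idx = max(0, math.ceil(fraction * len(ranked)) - 1)
    return ranked[idx]
```

Tracking a number like CEP50 per camera configuration and per test scenario is what turns field data into a regression signal rather than anecdotes.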
Demonstrated experience with camera-based localisation on moving robotic platforms. Monocular or stereo, not LIDAR-dependent
Strong sensor fusion fundamentals. You understand timing, calibration, and the realities of combining data from multiple real-time sensor streams
Experience processing real-time data on constrained or edge-deployed hardware
Proven ability to dig into robotic datasets, identify where systems fail, and systematically troubleshoot localisation problems
Familiarity with optimisation backends (Ceres, GTSAM, or similar) for geometric inference
Proficiency in Python and/or C++
Comfortable working in Linux environments
Have deployed a production system, not just research prototypes or thesis projects
Must have working rights in Australia
Willing to participate in regular field testing
GPS-denied navigation or localisation work
Photogrammetry or large-scale imagery processing pipelines
Robotic manipulation with visual servoing or camera-based feedback
Experience with gimballed camera systems and their associated complexities
Understanding of machine learning strengths and limitations as an input to geometric systems
Multi-platform data fusion (combining data from multiple robotic platforms)
ROS/ROS2 development experience
Experience with drones or other UAV platforms
Startup or scale-up environment experience
Australian citizenship (preferred but not required)
You'll be an owner, not a renter. We're at the stage where foundational decisions are still being made and entire systems need to be built from scratch. Your work won't be maintaining someone else's legacy — you'll be creating what comes next. The problems you solve and the systems you build will define how Breaker scales.
You'll work with people who've done this before. Our team has shipped production robotics systems, scaled infrastructure, and solved the kind of hard integration problems that only come up when software meets the physical world. You won't be the only person in the room who's debugged a sensor fusion pipeline or optimised inference on a Jetson.
You'll solve problems that don't exist anywhere else. Most companies are building incremental improvements on established technology. We're defining new categories — which means the work is harder, more ambiguous, and infinitely more interesting.
You'll work hard, together. We're in the office every day, grinding on hard problems alongside great people. We've built a workspace where the best work happens — access to hardware, quick decisions, real collaboration. We're flexible when life requires it, but we're looking for people who want to show up, get stuck in, and build something significant with a team they respect.
We're going global. Backed by globally recognised investors, we're growing teams across Sydney, Australia, and Austin, Texas. If you want exposure to international expansion and the opportunity to help build across regions, that path exists here.
You'll own what you build. Generous equity packages mean when Breaker wins, you win.
Location: Cicada Innovations, Eveleigh, Sydney, Australia (National Innovation Centre)
If you're excited about the opportunity to work at the bleeding edge of physical AI, we'd love to hear from you.