Fire-and-Forget? Dissecting the Thin Line Between ‘Self-Homing’ Missiles and Autonomous Drones
Advent Atum
Rapidly advancing defence and commercial industry capabilities, focusing on autonomy and sustainability.
By Andy Wilson
Autonomous systems are shaking up the battlefield, promising faster, smarter, and more adaptive solutions across air, land, and sea. But the defence world still hasn’t nailed down what “autonomy” actually means, especially when it comes to guided munitions. A Fire-and-Forget missile and a terminally guided drone perform nearly identical tasks in nearly identical ways—so why does one get a pass as "self-homing" while the other is branded autonomous and surrounded by scrutiny?
Our definitions are murky, full of legacy terms and outdated perceptions. By digging into the technology behind “self-homing” missiles and “autonomous” drones, we can see just how arbitrary these distinctions are—and why they need a major reality check.
Fire-and-Forget Missiles: So Close to Autonomous, But Just Not Called It
Fire-and-Forget missiles (fielded in Lock-On Before Launch and Lock-On After Launch variants), the battle-tested workhorses of precision warfare, are the OG autonomous systems in every way except in name. These systems lock onto a target at or shortly after launch and pursue it without any further input from the operator, using onboard sensors and pre-programmed algorithms to adjust dynamically to target movement all the way to impact.
The AGM-114L Longbow Hellfire, famous for its precision in all kinds of conditions, exemplifies this model. After locking onto a target with its active millimetre-wave radar seeker, it continues to track and hit that target without a shred of further guidance. There’s no human pilot making mid-course corrections: it’s on autopilot, self-homing its way to the target.
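That self-homing loop can be made concrete with a toy guidance law. The sketch below is a minimal 2D proportional-navigation loop in Python: the missile turns at a rate proportional to how fast its line of sight to the target rotates. The gains, speeds, and geometry are invented for illustration and bear no relation to any fielded seeker.

```python
import math

def pn_step(m_pos, m_vel, t_pos, prev_los, N=3.0, dt=0.1):
    """One step of 2D proportional navigation: turn at a rate proportional
    to how fast the line of sight (LOS) to the target is rotating.
    Angle-wrap handling is omitted for brevity."""
    los = math.atan2(t_pos[1] - m_pos[1], t_pos[0] - m_pos[0])
    los_rate = (los - prev_los) / dt              # LOS rotation rate, rad/s
    speed = math.hypot(m_vel[0], m_vel[1])
    heading = math.atan2(m_vel[1], m_vel[0]) + N * los_rate * dt
    m_vel = (speed * math.cos(heading), speed * math.sin(heading))
    m_pos = (m_pos[0] + m_vel[0] * dt, m_pos[1] + m_vel[1] * dt)
    return m_pos, m_vel, los

# Toy engagement: fast missile at the origin, slow target crossing ahead.
m_pos, m_vel = (0.0, 0.0), (300.0, 0.0)           # metres, metres/second
t_pos, t_vel = (1000.0, 200.0), (-20.0, 0.0)
prev_los = math.atan2(t_pos[1] - m_pos[1], t_pos[0] - m_pos[0])
min_dist = math.hypot(t_pos[0] - m_pos[0], t_pos[1] - m_pos[1])
for _ in range(60):                               # six seconds of flight
    m_pos, m_vel, prev_los = pn_step(m_pos, m_vel, t_pos, prev_los)
    t_pos = (t_pos[0] + t_vel[0] * 0.1, t_pos[1] + t_vel[1] * 0.1)
    min_dist = min(min_dist,
                   math.hypot(t_pos[0] - m_pos[0], t_pos[1] - m_pos[1]))
print(f"closest approach: {min_dist:.1f} m")
```

Note that nothing in this loop consults a human after launch: the seeker measurement (the LOS angle) feeds straight into the steering command, which is the whole point of the "self-homing" label.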
Functionally, this behaviour looks an awful lot like autonomy. It doesn’t rely on human input, it makes decisions on its own, and it’s every bit as capable of completing its mission as any so-called “autonomous” system. Yet, it’s not labelled autonomous. Why? Because it’s old tech, and the defence world is still wedded to outdated terminology that lets missiles off the hook simply because they don’t look like robots.
Autonomous Drones with Computer Vision: New Face, Same Functions
Now, compare that to the modern drone with computer vision for terminal guidance. These systems use AI-driven algorithms, often fed by real-time data from cameras or sensors, to recognise targets, adapt to last-second movements, and make smart decisions in complex environments. Loitering munitions, in particular, combine high-level autonomy with mission flexibility, following commands to search, track, and engage in highly dynamic scenarios.
Think of loitering munitions equipped with cutting-edge computer vision algorithms. They can visually recognise specific targets, differentiate based on shape or movement, and engage autonomously. They’re a self-sufficient weapon, capable of aborting a strike if conditions change or if the target no longer fits specific parameters.
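The engage-or-abort behaviour described above can be sketched as a simple gate on the vision system's track. Everything here, the `Detection` fields, the labels, and the thresholds, is hypothetical: a sketch of the decision structure, not any real loitering munition's logic.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str         # class predicted by the onboard vision model
    confidence: float  # classifier confidence, 0..1
    speed_mps: float   # estimated target speed from frame-to-frame tracking

def terminal_decision(det: Optional[Detection],
                      valid_labels=frozenset({"tank", "artillery"}),
                      min_confidence=0.9,
                      max_speed_mps=25.0) -> str:
    """Engage only while the track still matches the engagement criteria;
    otherwise abort. All thresholds are invented for illustration."""
    if det is None:                       # track lost
        return "abort"
    if det.label not in valid_labels:     # no longer the briefed target class
        return "abort"
    if det.confidence < min_confidence:   # vision model not sure enough
        return "abort"
    if det.speed_mps > max_speed_mps:     # behaviour outside parameters
        return "abort"
    return "engage"
```

Run once per frame during the terminal phase, a gate like this is what lets the weapon call off a strike when conditions change, which is exactly the capability the article credits to loitering munitions.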
In essence, these drones function just like a Fire-and-Forget missile—if anything, they offer even more autonomy. They analyse and react to more variables, and they have more degrees of freedom. But because they’re new, AI-driven, and fall into the category of “drone” rather than “missile,” we call them autonomous and subject them to a whole different layer of scrutiny.
The Traditional Computer Vision and AI Divide: Just Advanced Algebra?
Here’s where things get interesting. Traditional computer vision methods, like those used in early guided systems, rely on hand-crafted operations such as edge detection, thresholding, and template matching: essentially linear algebra applied directly to pixel arrays. But AI in computer vision is like jet-powered algebra: it goes way beyond the basics. Neural networks and machine learning apply stacked, learned transformations to extract complex patterns and context, giving drones more adaptability and intelligence.
In the simplest terms, AI-driven computer vision is just traditional computer vision on steroids. Both types of systems break down pixels into patterns, but AI models are designed to interpret far more and adjust dynamically based on the data. It’s still built on algebra at the core, just taken to a turbocharged level.
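That "algebra at the core" claim is easy to demonstrate. Below, a hand-designed edge kernel and a learned-style neural layer are implemented as the same multiply-and-sum operation; the only differences are where the weights come from and the nonlinearity between layers. The kernel and toy image are illustrative only.

```python
import numpy as np

# Traditional CV: a hand-designed 3x3 edge kernel, applied by multiply-and-sum.
sobel_x = np.array([[-1.0, 0.0, 1.0],
                    [-2.0, 0.0, 2.0],
                    [-1.0, 0.0, 1.0]])

def correlate2d(image, kernel):
    """Naive valid-mode cross-correlation (the CNN flavour of 'convolution'):
    each output pixel is a sum of products, i.e. plain linear algebra."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# "AI" CV: the same multiply-and-sum, except the weights are learned rather
# than hand-picked, and layers are stacked with a nonlinearity in between.
def dense_layer(x, weights, bias):
    return np.maximum(0.0, x @ weights + bias)    # ReLU(xW + b)

# Toy image: flat dark region on the left, bright region on the right.
image = np.array([[0.0, 0.0, 1.0, 1.0, 1.0]] * 5)
edges = correlate2d(image, sobel_x)               # strong response at the edge
```

Both functions reduce to sums of products over pixel values. What a neural network buys is that its weights are fitted to data and composed in depth, which is where the extra adaptability comes from.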
AGM-65F Maverick (Infrared Variant): This variant of the Maverick air-to-ground missile uses imaging-infrared guidance, locking onto the heat signatures of armoured vehicles, bunkers, and ships.
The Labelling Bias: If It Quacks Like a Duck…
Here’s where it gets ridiculous. Take a drone with terminal guidance, slap a sticker on it calling it a “slow missile,” and the conversation about autonomy changes instantly. The exact same system is now labelled “guided” instead of “autonomous.” This bias toward certain terminology comes from years of institutionalised thinking that equates “missile” with predictable simplicity and “drone” with the complexities of autonomy.
Labelling bias: A “missile” is generally seen as a predictable tool with a straightforward task. But the same technology in a drone makes us view it as an autonomous, high-tech platform. This discrepancy shows how much language, rather than technology, can define what we perceive as “autonomous.”
Consider a heat-seeking missile and a terminally guided drone. A heat-seeking missile detects and follows a thermal signature—independent of operator input, making its own adjustments as it homes in on a moving target. A terminally guided drone using computer vision does exactly the same thing, only with even more capability to adapt to what it “sees” in real time. In both cases, the target’s evasion capabilities are effectively nullified by the weapon’s own decision-making processes. Functionally, they’re the same, yet we treat them as categorically different systems.
Same Function, Different Labels: A heat-seeking missile and a vision-guided drone both operate autonomously in their final phases. They make independent last-minute adjustments based on their sensors, with zero human input after launch. But one is a missile, the other is a drone, and suddenly the policy and ethics get a lot more complicated.
Implications for Policy and Ethics: Defining Autonomy for Real
These labelling inconsistencies go beyond semantics. They affect how we regulate and deploy technology, shaping policy and ethical debates that are increasingly out of sync with the systems we’re fielding. Because “autonomous systems” are scrutinised more intensely than their so-called “self-homing” counterparts, we’re creating regulatory gaps that sidestep the reality of these systems’ capabilities. If a Fire-and-Forget missile can make its own decisions to hit a target, it deserves the same level of ethical and regulatory consideration as any autonomous drone.
Policy Gaps: Defence policies should reflect capability, not outdated terms. Treating systems differently based on what we call them instead of what they do means we’re missing crucial oversight on “self-homing” weapons that function autonomously. If autonomy means the ability to independently reach a target, Fire-and-Forget missiles are autonomous by every measure that matters.
If we’re going to regulate autonomy, we need to start by defining it correctly. The traditional mindset—that missiles are simple, predictable systems while drones are complex and autonomous—is obsolete. We’re using old labels to guide new policy, and it’s creating a legal framework that’s neither consistent nor comprehensive.
Conclusion
The defence industry is overdue for a serious reevaluation of what “autonomy” actually means in weapon systems. A Fire-and-Forget missile and a terminally guided drone are, at their core, performing the same function: pursuing and hitting a target independently. Yet, we apply the “autonomous” label inconsistently, skewing perception and policy in the process.
It’s time we let go of legacy terminology and recognise these systems for what they are. When the only difference between a “self-homing” missile and an “autonomous” drone is a name, we’re overdue for a reset. As technology evolves, our understanding of autonomy needs to evolve too—before our outdated definitions lead to policy gaps we can’t afford.