Military Robotics: The State of the Art of War

Although these stories have been covered individually, I think it's important to provide an overview of current military robotics applications: not to endorse this information, but to disseminate it. Here is the current state of the art of robotic war:


Recently, defense contractor Foster-Miller developed a versatile robotics platform called the "Special Weapons Observation Reconnaissance Detection System." The full phrase obscures the real intent of this project; its acronym, SWORDS, does not. The platform (pictured at the start of the article) is compatible with the DREAD weapons system, described below:

"Imagine a gun with no recoil, no sound, no heat, no gunpowder, no visible firing signature (muzzle flash), and no stoppages or jams of any kind. Now imagine that this gun could fire .308 caliber and .50 caliber metal projectiles accurately at up to 8,000 fps (feet-per-second), featured an infinitely variable/programmable cyclic rate-of-fire (as high as 120,000 rounds-per-minute), and were capable of laying down a 360-degree field of fire. [...] unlike conventional weapons that deliver a bullet to the target in intervals of about 180 feet, the DREAD's rounds will arrive only 30 thousandths of an inch apart (1/32nd of an inch apart), thereby presenting substantially more mass to the target in much less time than previously possible. This mass can be delivered to the target in 10-round bursts, or the DREAD can be programmed to deliver as many rounds as you want, per trigger-pull. Of course, the operator can just as easily set the DREAD to fire on full-auto, with no burst limiter. On that setting, the number of projectiles sent down range per trigger-pull will rely on the operator's trigger control. Even then, every round is still going right into the target. You see, the DREAD's not just accurate, it's also recoilless. No recoil. None. So, every "fired" round is going right where you aim it."
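For what it's worth, the in-flight spacing between consecutive rounds is just muzzle velocity divided by cyclic rate, so we can sanity-check the quoted figures ourselves. A quick back-of-the-envelope sketch (purely illustrative; the function is mine, not the manufacturer's):

```python
def projectile_spacing_ft(muzzle_velocity_fps, rounds_per_minute):
    """Distance between consecutive projectiles in flight (feet),
    assuming constant velocity and an evenly spaced firing cycle."""
    rounds_per_second = rounds_per_minute / 60.0
    return muzzle_velocity_fps / rounds_per_second

# At the quoted maximums: 8,000 fps and 120,000 rounds-per-minute.
print(projectile_spacing_ft(8000, 120_000))  # 4.0 (feet between rounds)
```

Even at the maximum quoted cyclic rate, the quoted velocity implies rounds about four feet apart in flight, not fractions of an inch; marketing copy deserves a calculator.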

U.A.V.: via Edge of Vision
"A small company called Neural Robotics has produced a robotic mini-helicopter armed with a rapid-fire shotgun. Based on their low-cost AutoCopter, the UAV (unmanned aerial vehicle) uses neural network-based flight control algorithms to fly in either a self-stabilizing semi-autonomous mode controlled by a remote operator, or a fully-autonomous mode which can follow GPS waypoints. A video of the AutoCopter Gunship is available."
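"Fully-autonomous mode which can follow GPS waypoints" boils down to a simple control loop: compute a bearing to the next waypoint, step toward it, repeat. A hypothetical minimal version (this is my own sketch and has nothing to do with Neural Robotics' actual flight code, which uses neural-network control):

```python
import math

def bearing_and_distance(pos, waypoint):
    """Flat-earth approximation: heading (radians) and distance to a waypoint."""
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    return math.atan2(dy, dx), math.hypot(dx, dy)

def follow_waypoints(pos, waypoints, speed=1.0, tolerance=0.5):
    """Step toward each waypoint in turn until within tolerance; return the path."""
    path = [pos]
    for wp in waypoints:
        heading, dist = bearing_and_distance(pos, wp)
        while dist > tolerance:
            pos = (pos[0] + speed * math.cos(heading),
                   pos[1] + speed * math.sin(heading))
            heading, dist = bearing_and_distance(pos, wp)
            path.append(pos)
    return path

path = follow_waypoints((0.0, 0.0), [(5.0, 0.0), (5.0, 3.0)])
print(path[-1])
```

The real problem, of course, is the inner loop this glosses over: keeping a helicopter stable between steps, which is where the neural-network flight control comes in.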

"The AutoCopter Gunship UCAV/UCAR can also be outfitted with a thermal/IR sight, giving the system day/night/all-weather capability, so the operator can engage the enemy in any conditions the AutoCopter can handle from a flight perspective, even in total darkness."
"The AutoCopter can reportedly fly forward at 60 mph, and sideways at 35 mph, and can handle sling loads in gusting winds without any problem(s)."

Big Dog:

One obvious problem for military applications is that most robots are capable of traversing only simple terrain. Enter "BigDog," from Boston Dynamics, whose slogan is "The Leader in Lifelike Human Simulation":

"sensors for locomotion include joint position, joint force, ground contact, ground load, a laser gyroscope, and a stereo vision system. Other sensors focus on the internal state of BigDog, monitoring the hydraulic pressure, oil temperature, engine temperature, rpm, battery charge and others."


Obviously, one of the great advantages of having an autonomous robotic infantry is that it can coordinate information and execute tactical strategies with much greater precision and speed than traditional units. Enter Raytheon and General Dynamics' AFATDS, the Advanced Field Artillery Tactical Data System, a command and control ("C2") system currently installed in "all U.S. Army echelons from weapons platoon to corps and in the Marine Corps from firing battery to Marine Expeditionary Forces. AFATDS is installed aboard the U.S. Navy LHA/LHD Class big deck amphibious ships to support Expeditionary Strike Groups (ESGs) for amphibious operations."

"It processes fire mission and other related information to coordinate and optimize the use of all fire support assets, including mortars, field artillery, cannon, missile, attack helicopters, air support, and naval gunfire. [...] During battle, AFATDS will provide up-to-date battlefield information, target analysis, and unit status, while coordinating target damage assessment and sensor operations."

AFATDS is capable of managing all aspects of warfighting, including the validation of targets, the management of weapon systems/munitions status, analysis of all fire support assets, automatic application of tactical guidance for targeting and attack, continuous prioritization of high-value and high-payoff targets, fully automated weapon-to-target pairing, the setup and coordination of all communications, and automated mission coordination in conjunction with established or evolving doctrine, tactics, techniques, and procedures. [This information is paraphrased from the data sheet linked to above.]
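To make "fully automated weapon-to-target pairing" a bit more concrete: at its core, this is an assignment problem. Here is a deliberately toy greedy sketch; every name and number below is hypothetical, and nothing here reflects actual AFATDS logic:

```python
def pair_weapons_to_targets(weapons, targets):
    """Greedy pairing: highest-payoff target first, nearest in-range weapon.
    Purely illustrative; a real C2 system also weighs doctrine, munitions
    state, fire plans, and much more.
    weapons: list of (name, position, max_range)   -- positions are 1-D here
    targets: list of (name, position, payoff)
    """
    available = list(weapons)
    pairs = []
    for tname, tpos, _payoff in sorted(targets, key=lambda t: -t[2]):
        in_range = [w for w in available if abs(w[1] - tpos) <= w[2]]
        if in_range:
            best = min(in_range, key=lambda w: abs(w[1] - tpos))
            pairs.append((best[0], tname))
            available.remove(best)  # one mission per weapon in this toy model
    return pairs

weapons = [("battery-A", 0.0, 30.0), ("battery-B", 50.0, 40.0)]
targets = [("t1", 20.0, 0.9), ("t2", 60.0, 0.5)]
print(pair_weapons_to_targets(weapons, targets))
# [('battery-A', 't1'), ('battery-B', 't2')]
```

The point is only that once targets, positions, and payoffs are data, the pairing decision is just an optimization loop with no human in it.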

Just to give you some idea of the destructive power available here, consider "the Crusader," a weapons system that AFATDS is capable of managing. The Crusader "consists of the next generation self-propelled howitzer (SPH) and [...] provides unprecedented lethality based on system responsiveness and rate of fire. The entire fire control and ammunition handling system for both the howitzer and RSV is automated. [...] A multiple round simultaneous impact (MRSI) capability allows one Crusader to deliver 4 to 8 rounds on a target within four seconds in a surprise fire-for-effect mode. Increased survivability is achieved by an integrated defense system that includes stealth design, lighter-weight armor protection [...] and the ability to make rapid survivability moves using shoot and scoot tactics to avoid enemy counterfire. The Crusader has a 30-40 km range." 30 to 40 km!
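The MRSI trick is worth unpacking: shots fired at higher elevation take longer to come down, so one gun can fire successive rounds at different angles, timed so they all arrive at once. A toy vacuum-ballistics sketch (no drag, flat ground, made-up muzzle velocity):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def flight_time(v, angle_deg):
    """Time of flight for a projectile over flat ground, no drag."""
    return 2 * v * math.sin(math.radians(angle_deg)) / G

def ground_range(v, angle_deg):
    """Horizontal distance traveled, no drag."""
    return v ** 2 * math.sin(math.radians(2 * angle_deg)) / G

v = 900.0  # muzzle velocity in m/s -- illustrative only
# Complementary elevation angles reach the same range with different flight times:
low, high = 30.0, 60.0
assert abs(ground_range(v, low) - ground_range(v, high)) < 1e-6
# Fire the high-angle round first; delay the low-angle shot so both land together.
delay = flight_time(v, high) - flight_time(v, low)
print(f"fire the low-angle round {delay:.1f} s after the high-angle one")
```

With drag, variable propellant charges, and real terrain the calculation is far messier, which is exactly why it's automated.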

Although I have not been able to find this particular fact on the web (probably for good reason), I know from a first-hand source who trains soldiers on AFATDS that it is fully capable of operating within a "closed loop," in which humans are not part of the target detection, fire planning, and fire execution cycle.

But without humans, who does the moment-by-moment tactical and strategic planning? Read on.


DARPA (the Defense Advanced Research Projects Agency) has recently announced a new initiative, Biologically-Inspired Cognitive Architectures (BICA), along with other biomimicry-oriented initiatives such as SWARM, aka Smart Warfighting Array of Reconfigurable Modules. Given the lavish funding for grant winners, this announcement has many cognitive neuroscientists falling over one another to get their applications in order.

"The goal of the Biologically-Inspired Cognitive Architectures Program via this BAA is to develop, implement and evaluate psychologically-based and neurobiologically-based theories, design principles, and architectures of human cognition. In a subsequent phase, the program has the ultimate goal of implementing computational models of human cognition that could eventually be used to simulate human behavior and approach human cognitive performance in a wide range of situations" (presumably, including war).

"Our goal is to create challenge problems relevant to military situations that will serve this program and others as cognitive simulations evolve. The set of problems must be progressively more challenging, and must involve both embedded and non-embedded problems. "Embedded" refers to problems in which an embodied cognitive agent is situated within, and interacts with, a physical environment which may or may not contain other cognitive agents." (Clearly, "other cognitive agents" is a term that may eventually include humans.)

Related Posts:
Emotional Robotics
Constraints and Optimality
Binding through Synchrony: Proof from Developmental Robotics
Giving the Ghost a Machine
A Mind of Its Own: Wakamaru
Imitation vs. Self-Awareness: The Mirror Test

Relevant Companies:
Neural Robotics
iRobot (the Roomba manufacturer recently won a defense contract)
Foster-Miller
General Dynamics Robotics Systems
Boston Dynamics Robotics

If you liked this, don't forget to digg it.


Blogger Chris Chatham said...

Doesn't this bother anyone else?

I know that new technology is always a double-edged sword ... but maybe here the edge pointed towards us is a little sharper than the other one ...

3/17/2006 11:22:00 AM  
Anonymous Anonymous said...

OK, honestly, don't call BigDog "Packbot"!!!. Packbot is a freakin' little treaded robotic tank by iRobot, with absolutely no intelligence, whereas BigDog is one of the most advanced, intelligent locomoting quadrupeds on the planet!

3/17/2006 02:03:00 PM  
Blogger Chris Chatham said...

haha, i didn't realize there was something else by that name.

3/17/2006 02:06:00 PM  
Blogger MathCogIdiocy said...

On what level does this bother you? That we are becoming more and more adept at removing the humanity from the act of killing? Or that cognitive scientists are scrambling for the big dollar projects?

As an aside, I've been engrossed in reading your blog since I stumbled over it a few weeks ago. The information you present is endlessly fascinating to me.

3/17/2006 09:44:00 PM  
Blogger Chris Chatham said...

thanks mathcog...:)

if i had to choose, it definitely bothers me on the level of killing, rather than big dollar projects. I guess I just don't believe that this is actually "defense" technology anymore.

3/18/2006 10:44:00 AM  
Blogger MathCogIdiocy said...

Chris -
I could argue that the last time defense technology was truly about defense, we were building castles with moats. It's unfortunate that since the Manhattan Project many of the most interesting "real-world" applications in physics, engineering, mathematics, and computer science have been funded by DOD projects. The only real chance we've had of changing that existed in the form of NASA projects - not an area the general public seems to think is worthy of the level of funding that defense gets. And that is the truly disturbing part of this - the public both actively and passively supports the development of new and better killing machines.

Okay, I've just deleted two paragraphs that were drifting into diatribe. Time to climb off the soap box.

3/18/2006 12:03:00 PM  
Blogger Chris Chatham said...

the original version of this post included just the kind of diatribe that I imagine you deleted ... ;)

in the end, I'm not sure what we can actually do about these trends. this is the best argument in support of working on these projects (to "be there when it happens"); when intelligent, autonomous war machines are a reality, it's better to have people involved that have moral problems with the endeavor.

of course, the manhattan project was actively driven by a threat - Einstein's involvement was predicated on the idea that the Nazis could be capable of developing atomic weapons - and could thus be plausibly construed as "defense."

In this case, I just can't imagine that Osama is working on a terminator too...

3/18/2006 12:41:00 PM  
Blogger Dan Dright said...

Yes, it bothers me. It also bothers me that I chose to work in the best-funded lab in my school because it had all the neat equipment, only to find out our biggest sugar daddy was DARPA. My wife calls them DIAPER, but that doesn't change the reality that my good-intentioned research ideas could turn into a better way for a guy in a remote location to hit a target with less training.

Thanks for the post. Mea Culpa, too.

3/18/2006 07:44:00 PM  
Blogger MathCogIdiocy said...

Sometimes diatribe is a good thing, but I think you made your point in a much stronger fashion when you presented the facts and then asked the one question.

For better or worse, the Manhattan Project was a necessary evil at that time. If I remember correctly, many of the scientists who worked on the bomb actively fought against nuclear proliferation after the war.

One of the things that concerns me is whether or not the people working on today's projects have any moral qualms. For a variety of personal reasons that don't need to be aired here, I don't see the new right, so called Christian, political environment of today as moral.

I agree with you that it is highly unlikely that Osama is working on a terminator. There are so many easier and cheaper ways to wreak havoc. There is also the question of where and how these new weapons are designed to work. It does not appear that they would be effective in the type of small scale warfare necessary to fight terrorists. Which also raises the question of whether or not you can actually fight terrorism. Is it a concrete object or a concept? Terrorism as the new communism?

You do realize that there is a better chance of my proving the Riemann Hypothesis during a coffee break than understanding why we need better and more efficient weapons of war. :)

3/18/2006 08:20:00 PM  
Blogger RFX said...

Hello Chris,
I stumbled across your blog and thought I might take the opportunity to introduce my company (Robotic FX, Inc. you posted our link at the bottom of your blog). If you have any questions about us please let me know. Also, if you would like a copy of our press release (or any future press releases) I would be more than happy to send them to you. Thanks!

6/28/2006 09:18:00 AM  
