It is doubtful if any of the people who are today writing on the effect of technology on warfare would consciously disagree with this statement. Yet, many of them tend to ignore the impact of firepower on dispersion, and as a consequence they have come to believe that the more lethal the firepower, the more deaths, disruption, and suppression it will cause. In fact, as weapons have become more lethal intrinsically, their casualty-causing capability has either declined or remained about the same because of greater dispersion of targets. Personnel and tank loss rates of the 1973 Arab-Israeli War, for example, were quite similar to those of intensive battles of World War II and the casualty rates in both of these wars were less than in World War I. (p. 7)
Research and analysis of real-world historical combat data by Dupuy and TDI has identified at least four distinct combat effects of firepower: infliction of casualties (lethality), disruption, suppression, and dispersion. All of them were found to be heavily influenced—if not determined—by moral (human) factors.
Again, I have written extensively on this blog about Dupuy’s theory of the historical relationship between weapon lethality, dispersion on the battlefield, and the long-term decline in average daily combat casualty rates. TDI President Chris Lawrence has done further work on the subject as well.
U.S. Army Major Amos Fox has recently published the first two of a planned set of three articles examining the nature of proxy warfare in the early 21st century and suggesting some ideas for how the U.S. might better conduct it.
Proxy environments dominate modern war… It is not just a Russian, Iranian or American approach to war, but one in which many nations and polities engage. However, the U.S. Army lacks a paradigm for proxy warfare, which disrupts its ability to understand the environment or develop useful tactics, operations and strategies for those environments.
His examination of the basic elements of proxy warfare leads him to conclude that “it is dominated by a principal actor dynamic, power relationships and the tyranny of time.” From this premise, Fox outlines two basic models of proxy warfare: exploitative and transactional.
The exploitative model…is characterized by a proxy force being completely dependent on its principal for survival… [It] is usually the result of a stronger actor looking for a tool—a proxy force—to pursue an objective. As a result, the proxy is only as useful to the principal as its ability to make progress toward the principal’s ends. Once the principal’s ends have been achieved or the proxy is unable to maintain momentum toward the principal’s ends, then the principal discontinues the relationship or distances itself from the proxy.
The transactional model is…more often like a business deal. An exchange of services and goods that benefits all parties—defeat of a mutual threat, training of the agent’s force, foreign military sales and finance—is at the heart of the transactional model. However, this model is a paradox because the proxy is the powerbroker in the relationship. In many cases, the proxy government is independent but looking for assistance in defeating an adversary; it is not interested in political or military subjugation by the principal. Moreover, the proxy possesses the power in the relationship because its association with the principal is wholly transactional…the clock starts ticking on the duration of the bond as soon as the first combined shot is fired. As a result, as the common goal is gradually achieved, the agent’s interest in the principal recedes at a comparable rate.
With this concept in hand, Fox makes the case that
[T]he U.S. Army is ill-suited for warfare in the proxy environment because it mismanages the fixed time and the finite power it possesses over a proxy force in pursuit of waning mutual interests. Fundamentally, the salient features of proxy environments—available time, power over a proxy force, and mutual interests—are fleeting due to the fact that proxy relationships are transactional in nature; they are marriages of convenience in which a given force works through another in pursuit of provisionally aligned political or military ends… In order to better position itself to succeed in the proxy environment, the U.S. Army must clearly understand the background and components of proxy warfare.
These two articles provide an excellent basis for a wider discussion about how to shape not just a more coherent U.S. Army doctrine, but a common policy/strategic/operational framework for understanding and operating successfully in the proxy warfare environments that will only loom larger in 21st-century international affairs. It will be interesting to see how Fox’s third article rounds out his discussion.
With the December 2018 update of the U.S. Army’s Multi-Domain Operations (MDO) concept, this seems like a good time to review the evolution of doctrinal thinking about it. We will start with the event that sparked the Army’s thinking about the subject: the 2014 rocket artillery barrage fired from Russian territory that devastated Ukrainian Army forces near the village of Zelenopillya. From there we will look at the evolution of Army thinking beginning with the initial draft of an operating concept for Multi-Domain Battle (MDB) in 2017. To conclude, we will re-up two articles expressing misgivings over the manner with which these doctrinal concepts are being developed, and the direction they are taking.
Contrary to what it says, the Army has always been a concepts-based, rather than a doctrine-based, institution. Concepts about future war generate the requirements for capabilities to realize them… Unfortunately, the Army’s doctrinal solutions evolve in war only after the failure of its concepts in its first battles, which the Army has historically lost since the Revolutionary War.
The reason the Army fails in its first battles is because its concepts are initially — until tested in combat — a statement of how the Army “wants to fight” and rarely an analytical assessment of how it “will have to fight.”
Starting with the Army’s failure to develop its own version of “blitzkrieg” after World War I, Johnson identified conservative organizational politics, misreading technological advances, and a stubborn refusal to account for the capabilities of potential adversaries as common causes for the inferior battlefield weapons and warfighting methods that contributed to its impressive string of lost “first battles.”
Conversely, Johnson credited the Army’s novel 1980s AirLand Battle doctrine as the product of an honest assessment of potential enemy capabilities and the development of effective weapon systems that were “based on known, proven technologies that minimized the risk of major program failures.”
“The principal lesson in all of this,” he concluded, “is that the U.S. military should have a clear problem that it is trying to solve to enable it to innovate, and it should realize that innovation is generally not invention.” There are “also important lessons from the U.S. Army’s renaissance in the 1970s, which also resulted in close cooperation between the Army and the Air Force to solve the shared problem of the defense of Western Europe against Soviet aggression that neither could solve independently.”
“The US Army is Wrong on Future War”
The other article, provocatively titled “The US Army is Wrong on Future War,” was published by West Point’s Modern War Institute. It was co-authored by Nathan Jennings, Amos Fox, and Adam Taliaferro, all graduates of the School of Advanced Military Studies, veterans of Iraq and Afghanistan, and currently serving U.S. Army officers.
They argue that
the US Army is mistakenly structuring for offensive clashes of mass and scale reminiscent of 1944 while competitors like Russia and China have adapted to twenty-first-century reality. This new paradigm—which favors fait accompli acquisitions, projection from sovereign sanctuary, and indirect proxy wars—combines incremental military actions with weaponized political, informational, and economic agendas under the protection of nuclear-fires complexes to advance territorial influence. The Army’s failure to conceptualize these features of the future battlefield is a dangerous mistake…
Instead, they assert that the current strategic and operational realities dictate a far different approach:
Failure to recognize the ascendancy of nuclear-based defense—with the consequent potential for only limited maneuver, as in the seventeenth century—incurs risk for expeditionary forces. Even as it idealizes Patton’s Third Army with ambiguous “multi-domain” cyber and space enhancements, the US Army’s fixation with massive counter-offensives to defeat unrealistic Russian and Chinese conquests of Europe and Asia misaligns priorities. Instead of preparing for past wars, the Army should embrace forward positional and proxy engagement within integrated political, economic, and informational strategies to seize and exploit initiative.
The factors they cite that necessitate the adoption of positional warfare include nuclear primacy; sanctuary of sovereignty; integrated fires complexes; limited fait accompli; indirect proxy wars; and political/economic warfare.
“Given these realities,” Jennings, Fox, and Taliaferro assert, “the US Army must adapt and evolve to dominate great-power confrontation in the nuclear age.” As such, they recommend that the U.S. (1) adopt “an approach more reminiscent of the US Army’s Active Defense doctrine of the 1970s than the vaunted AirLand Battle concept of the 1980s,” (2) “dramatically recalibrate its approach to proxy warfare,” and (3) compel “joint, interagency and multinational coordination in order to deliberately align economic, informational, and political agendas in support of military objectives.”
Future U.S. Army Doctrine: How It Wants to Fight or How It Has to Fight?
Readers will find much with which to agree or disagree in each article, but both provide viewpoints that should supply plenty of food for thought. Taken together, however, they can be read in a different light. The analysis put forth by Jennings, Fox, and Taliaferro can be read as fulfilling Johnson’s injunction to base doctrine on a sober assessment of the strategic and operational challenges presented by existing enemy capabilities, rather than on an aspirational concept for how the Army would prefer to fight a future war. Whether or not Jennings, et al, have accurately forecast the future can be debated, but their critique should raise questions as to whether the Army is repeating the past doctrinal development errors Johnson identified.
The U.S. Army Training and Doctrine Command (TRADOC) released draft version 1.5 of its evolving Multi-Domain Operations (MDO) future operating concept last week. Entitled TRADOC Pamphlet 525-3-1, “The U.S. Army in Multi-Domain Operations 2028,” this iteration updates the initial Multi-Domain Battle (MDB) concept issued in October 2017.
According to U.S. Army Chief of Staff (and Chairman of the Joint Chiefs of Staff nominee) General Mark Milley, MDO Concept 1.5 is the first step in the doctrinal evolution. “It describes how U.S. Army forces, as part of the Joint Force, will militarily compete, penetrate, dis-integrate, and exploit our adversaries in the future.”
TRADOC Commander General Stephen Townsend summarized the draft concept as follows:
The U.S. Army in Multi-Domain Operations 2028 concept proposes a series of solutions to solve the problem of layered standoff. The central idea in solving this problem is the rapid and continuous integration of all domains of warfare to deter and prevail as we compete short of armed conflict. If deterrence fails, Army formations, operating as part of the Joint Force, penetrate and dis-integrate enemy anti-access and area denial systems; exploit the resulting freedom of maneuver to defeat enemy systems, formations and objectives and to achieve our own strategic objectives; and consolidate gains to force a return to competition on terms more favorable to the U.S., our allies and partners.
To achieve this, the Army must evolve our force, and our operations, around three core tenets. Calibrated force posture combines position and the ability to maneuver across strategic distances. Multi-domain formations possess the capacity, endurance and capability to access and employ capabilities across all domains to pose multiple and compounding dilemmas on the adversary. Convergence achieves the rapid and continuous integration of all domains across time, space and capabilities to overmatch the enemy. Underpinning these tenets are mission command and disciplined initiative at all warfighting echelons. (original emphasis)
For a look at the evolution of U.S. Army and U.S. Marine Corps doctrinal thinking about multi-domain warfare since early 2017, see our previous posts on the subject.
Trevor Dupuy was skeptical about the role of technology in determining outcomes in warfare. While he believed technological innovation was crucial, he did not think that technology by itself decided success or failure on the battlefield. As he wrote in a work published posthumously in 1997,
I am a humanist, who is also convinced that technology is as important today in war as it ever was (and it has always been important), and that any national or military leader who neglects military technology does so to his peril and that of his country. But, paradoxically, perhaps to an extent even greater than ever before, the quality of military men is what wins wars and preserves nations. (emphasis added)
His conclusion was largely based upon his quantitative approach to studying military history, particularly the way humans have historically responded to the relentless trend of increasingly lethal military technology.
The Historical Relationship Between Weapon Lethality and Battle Casualty Rates
Based on a 1964 study for the U.S. Army, Dupuy identified a long-term historical relationship between increasing weapon lethality and decreasing average daily casualty rates in battle. (He summarized these findings in his book, The Evolution of Weapons and Warfare (1980). The quotes below are taken from it.)
Since antiquity, military technological development has produced weapons of ever increasing lethality. The rate of increase in lethality has grown particularly dramatically since the mid-19th century.
The average daily casualty rate in combat, however, has been in decline since 1600. With notable exceptions during the 19th century, casualty rates have continued to fall through the late 20th century. If technological innovation has produced vastly more lethal weapons, why have there been fewer average daily casualties in battle?
Dupuy’s answer was that armies have adapted to ever more lethal weapons in ways that reduce their exposure to them, most notably (as the simple sketch below illustrates):
greater dispersion of forces in frontage and depth on the battlefield;
the granting of greater freedom to maneuver through decentralized decision-making and enhanced mobility; and
improved use of combined arms and interservice coordination.
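To make the underlying logic concrete, the sketch below works through a toy version of the proportional argument: assume that the average daily casualty rate scales with aggregate weapon lethality divided by the area over which forces disperse. This is a minimal illustration, not Dupuy’s actual formulation, and the era labels and index values are placeholders chosen for the sketch rather than his data.

```python
# Toy illustration of the lethality-versus-dispersion argument.
# The indices below are illustrative placeholders, NOT Dupuy's data or model.

ERAS = [
    # (era, relative weapon lethality index, relative dispersion (area) index)
    ("Napoleonic era",      1.0,    1.0),
    ("World War I",        50.0,   50.0),   # dispersion roughly keeps pace with lethality
    ("World War II",      200.0,  500.0),   # dispersion outpaces lethality
    ("Late 20th century", 600.0, 2000.0),   # the gap continues to widen
]

def relative_daily_casualty_rate(lethality: float, dispersion: float) -> float:
    """Toy proxy: casualties per day scale with lethality per unit of occupied area."""
    return lethality / dispersion

for era, lethality, dispersion in ERAS:
    rate = relative_daily_casualty_rate(lethality, dispersion)
    print(f"{era:<18} lethality={lethality:7.1f}  dispersion={dispersion:7.1f}  "
          f"relative daily casualty rate={rate:4.2f}")
```

Under these assumptions the computed rate holds roughly steady while dispersion keeps pace with lethality and falls once dispersion outpaces it, which is the qualitative pattern Dupuy identified in the historical data.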
Technological Innovation and Organizational Assimilation
Dupuy noted that the historical correlation between weapons development and their use in combat has not been linear because the pace of integration has been largely determined by military leaders, not the rate of technological innovation. “The process of doctrinal assimilation of new weapons into compatible tactical and organizational systems has proved to be much more significant than invention of a weapon or adoption of a prototype, regardless of the dimensions of the advance in lethality.” [p. 337]
As a result, the history of warfare has been exemplified more often by a discontinuity between weapons and tactical systems than effective continuity.
During most of military history there have been marked and observable imbalances between military efforts and military results, an imbalance particularly manifested by inconclusive battles and high combat casualties. More often than not this imbalance seems to be the result of incompatibility, or incongruence, between the weapons of warfare available and the means and/or tactics employing the weapons. [p. 341]
In short, military organizations typically have not been fully effective at exploiting new weapons technology to advantage on the battlefield. Truly decisive alignment between weapons and systems for their employment has been exceptionally rare. Dupuy asserted that
There have been six important tactical systems in military history in which weapons and tactics were in obvious congruence, and which were able to achieve decisive results at small casualty costs while inflicting disproportionate numbers of casualties. These systems were:
the Macedonian system of Alexander the Great, ca. 340 B.C.
the Roman system of Scipio and Flaminius, ca. 200 B.C.
the Mongol system of Genghis Khan, ca. A.D. 1200
the English system of Edward I, Edward III, and Henry V, ca. A.D. 1350
the French system of Napoleon, ca. A.D. 1800
the German blitzkrieg system, ca. A.D. 1940 [p. 341]
With one caveat, Dupuy could not identify any single weapon that had decisively changed warfare in and of itself without a corresponding human adaptation in its use on the battlefield.
Save for the recent significant exception of strategic nuclear weapons, there have been no historical instances in which new and lethal weapons have, of themselves, altered the conduct of war or the balance of power until they have been incorporated into a new tactical system exploiting their lethality and permitting their coordination with other weapons; the full significance of this one exception is not yet clear, since the changes it has caused in warfare and the influence it has exerted on international relations have yet to be tested in war.
Until the present time, the application of sound, imaginative thinking to the problem of warfare (on either an individual or an institutional basis) has been more significant than any new weapon; such thinking is necessary to real assimilation of weaponry; it can also alter the course of human affairs without new weapons. [p. 340]
Technological Superiority and Offset Strategies
Will new technologies like robotics and artificial intelligence provide the basis for a seventh tactical system where weapons and their use align with decisive battlefield results? Maybe. If Dupuy’s analysis is accurate, however, it is more likely that future increases in weapon lethality will continue to be counterbalanced by human ingenuity in how those weapons are used, yielding indeterminate—perhaps costly and indecisive—battlefield outcomes.
Genuinely effective congruence between weapons and force employment continues to be difficult to achieve. Dupuy believed the preconditions necessary for successful technological assimilation since the mid-19th century have been a combination of conducive military leadership; effective coordination of national economic, technological-scientific, and military resources; and the opportunity to evaluate and analyze battlefield experience.
Can the U.S. meet these preconditions? That certainly seemed to be the goal of the so-called Third Offset Strategy, articulated in 2014 by the Obama administration. It called for maintaining “U.S. military superiority over capable adversaries through the development of novel capabilities and concepts.” Although the Trump administration has stopped using the term, it has made “maximizing lethality” the cornerstone of the 2018 National Defense Strategy, with increased funding for the Defense Department’s modernization priorities in FY2019 (though perhaps not in FY2020).
Dupuy’s original work on weapon lethality in the 1960s coincided with development in the U.S. of what advocates of a “revolution in military affairs” (RMA) have termed the “First Offset Strategy,” which involved the potential use of nuclear weapons to balance Soviet superiority in manpower and material. RMA proponents pointed to the lopsided victory of the U.S. and its allies over Iraq in the 1991 Gulf War as proof of the success of a “Second Offset Strategy,” which exploited U.S. precision-guided munitions, stealth, and intelligence, surveillance, and reconnaissance systems developed to counter the Soviet Army in Germany in the 1980s. Dupuy was one of the few to attribute the decisiveness of the Gulf War both to airpower and to the superior effectiveness of U.S. combat forces.
Trevor Dupuy certainly was not an anti-technology Luddite. He recognized the importance of military technological advances and the need to invest in them. But he believed that the human element has always been more important on the battlefield. Most wars in history have been fought without a clear-cut technological advantage for one side; some have been bloody and pointless, while others have been decisive for reasons other than technology. While the future is certainly unknown and past performance is not a guarantor of future results, it would be a gamble to rely on technological superiority alone to provide the margin of success in future warfare.
Last week, the Russian Ministry of Defense claimed that its military air defense assets had shot down 45 drones in attempted attacks on Khmeimim Air Base, the main Russian military installation in Syria. According to Major General Igor Konashenkov, the frequency of these attacks had been increasing since the first one in January. Five drones had been downed in the three days preceding the news conference.
Konashenkov asserted that although the drones appeared technologically primitive, they were actually quite sophisticated, with a range of up to 100 kilometers (60 miles). While the drones were purportedly piloted by Syrian rebels from Idlib Province, the Russians have implied that the rebels required outside assistance to assemble them.
The use of commercial off-the-shelf (COTS) or modified off-the-shelf (MOTS) aerial drones by non-state actors, for actions ranging from precision bombing attacks on combat troops, to terrorism, to surveillance of law enforcement, appears to be gaining in popularity.
Earlier this month, a pair of commercial drones armed with explosives were used in an alleged assassination attempt on Venezuelan President Nicolás Maduro. Daesh fighters in Syria and Iraq have been using drones for reconnaissance and to drop explosives and bombs on opposition forces.
In 2015, Reuters reported that a protester flew “a drone carrying radioactive sand from the Fukushima nuclear disaster onto the prime minister’s office, though the amount of radiation was minimal.” Mexican cartels have used drones to smuggle drugs and, in one instance, to land disabled grenades on a local police chief’s property. Last summer, a drone delivered an active grenade to an ammunition dump in Ukraine, which Kyle Mizokami of Popular Mechanics reported caused a billion dollars’ worth of damage.
Patrick Tucker reported for Defense One that a criminal gang employed drones to harass an FBI hostage rescue team observing an unfolding situation outside a large U.S. city in 2017.
As Joseph Trevithick reported in The Drive, the Russians have been successful thus far in thwarting drone attacks in Syria using air defense radars, Pantsir-S1 short-range air defense systems, and electronic warfare systems. These attacks have not involved more than a handful of drones at a time, however. The initial Syrian rebel drone attack on Khmeimim Air Base in January 2018 involved 10 drones carrying 10 bomblets each.
The ubiquity of commercial drones also raises the possibility of attacks on non-military targets unprotected by air defense networks. Is it possible to defend every potential target? Perhaps not, but Joseph Hanacek points out in War on the Rocks that there are ways to counter or mitigate the risk of drone attacks that do not involve sophisticated and expensive defenses. Among his simple suggestions are using shotguns for point defense against small and fragile drones, improving communications among security forces, and complicating the targeting problem for would-be attackers. Perhaps the best defense against drones is merely to avoid overthinking the problem.
A couple of years ago, a media report that the Chinese had claimed a technological breakthrough in stealth-busting quantum radar capabilities led me to muse about the possible repercussions on U.S. military capabilities. This was during the height of the technology-rooted Third Offset Strategy mania. It seemed to me at the time that concentrating on technological solutions to the U.S.’s strategic challenges might not be the wisest course of action.
The notion that stealth might be a wasting asset seemed somewhat far-fetched when I wrote that, but it appears to have become a much more serious concern. As the DARPA solicitation states, “Our acquisition system is finding it difficult to respond on relevant timescales to adversary progress, which has made the search for next generation capabilities at once more urgent and more futile.” (p. 5)
Armies have historically responded to the increasing lethality of weapons by dispersing mass in frontage and depth on the battlefield. Will combat see a new period of adjustment over the next 50 years like the previous half-century, where dispersion continues to shift in direct proportion to increased weapon range and precision, or will there be a significant change in the character of warfare?
One point of departure for such an inquiry could be the work of TDI President Chris Lawrence, who looked into the nature of historical rates of dispersion in combat from 1600 to 1991.
I am focusing on this because I really want to come up with some means of measuring the effects of a “revolution in warfare.” The last 400 years of human history have given us more revolutionary inventions impacting war than we can reasonably expect to see in the next 100 years. In particular, I would like to measure the impact of increased weapon accuracy, improved intelligence, and improved C2 on combat.
His tentative conclusions were:
Dispersion has been relatively constant and driven by factors other than firepower from 1600-1815.
Since the Napoleonic Wars, units have increasingly dispersed (found ways to reduce their chance to be hit) in response to increased lethality of weapons.
As a result of this increased dispersion, casualties in a given space have declined.
The ratio of this decline in casualties over area has been roughly proportional to the strength over an area from 1600 through WWI. Starting with WWII, it appears that people have dispersed faster than weapons lethality, and this trend has continued.
In effect, people dispersed in direct relation to increased firepower from 1815 through 1920, and then after that time dispersed faster than the increase in lethality.
It appears that since WWII, people have gone back to dispersing (reducing their chance to be hit) at the same rate that firepower is increasing.
Effectively, there are four patterns of casualties in modern war:
Period 1 (1600 – 1815): Period of Stability
Short battles
Short frontages
High attrition per day
Constant dispersion
Dispersion decreasing slightly after late 1700s
Attrition decreasing slightly after mid-1700s.
Period 2 (1816 – 1905): Period of Adjustment
Longer battles
Longer frontages
Lower attrition per day
Increasing dispersion
Dispersion increasing slightly faster than lethality
Period 3 (1912 – 1920): Period of Transition
Long battles
Continuous frontages
Lower attrition per day
Increasing dispersion
Relative lethality per kilometer similar to past, but lower
Dispersion increasing slightly faster than lethality
Period 4 (1937 – present): Modern Warfare
Long battles
Continuous frontages
Low attrition per day
High dispersion (perhaps constant?)
Relative lethality per kilometer much lower than in the past
Dispersion increased much faster than lethality going into the period.
Dispersion increased at the same rate as lethality within the period.
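A similar toy comparison can be applied to the periods Chris outlines above: if daily casualties per unit of area are assumed to scale with the growth of lethality divided by the growth of dispersion, the direction of the attrition trend falls out directly. This is only a sketch under that assumption; the scenario labels paraphrase the qualitative claims in the list, and the growth factors are illustrative placeholders rather than figures from the study.

```python
# Toy classifier for the dispersion-versus-lethality comparisons described above.
# The growth factors are illustrative placeholders, NOT figures from the study.

def attrition_density_trend(lethality_growth: float, dispersion_growth: float) -> str:
    """Implied direction of daily casualties per unit of area, assuming they
    scale with lethality growth divided by dispersion growth."""
    ratio = lethality_growth / dispersion_growth
    if ratio > 1.0:
        return "rising"
    if ratio < 1.0:
        return "falling"
    return "roughly steady"

SCENARIOS = [
    # (description, lethality growth factor, dispersion growth factor)
    ("Periods 2-3: dispersion slightly outpaces lethality", 10.0, 12.0),
    ("Entering Period 4: dispersion far outpaces lethality", 10.0, 40.0),
    ("Within Period 4: dispersion keeps pace with lethality", 10.0, 10.0),
]

for description, lethality_growth, dispersion_growth in SCENARIOS:
    trend = attrition_density_trend(lethality_growth, dispersion_growth)
    print(f"{description}: attrition density {trend}")
```

The same comparison could, in principle, be rerun with measured growth rates to test whether a given new technology would push the trend back toward rising attrition or be absorbed by further dispersion.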
Chris based his study on previous work done by Trevor Dupuy and his associates, which established a pattern in historical combat between lethality, dispersion, and battlefield casualty rates.
There is no way to accurately predict the future relationship between weapon lethality and dispersion on the battlefield, but we should question whether current conceptions of combat reflect consideration of these historical trends.
The U.S. Navy has uploaded video of a recent sinking exercise (SINKEX) conducted during the 2018 Rim of the Pacific (RIMPAC) exercise, hosted biennially by the U.S. Pacific Fleet based in Honolulu, Hawaii. As detailed by Tyler Rogoway in The Drive, the target of the SINKEX on 12 July 2018 was the USS Racine, a Newport-class tank landing ship decommissioned 25 years ago.
As dramatic as the images are, the interesting thing about this demonstration was that it included a variety of land-based weapons firing across domains to strike a naval target. The U.S. Army successfully fired a version of the Naval Strike Missile that it is interested in acquiring, as well as a half-dozen High Mobility Artillery Rocket System (HIMARS) rounds. Japanese troops fired four Type 12 land-based anti-ship missiles at the Racine as well. For good measure, an Australian P-8 Poseidon also hit the target with an air-launched AGM-84 Harpoon.
The coup de grâce was provided by a Mk-48 torpedo launched from the Los Angeles-class nuclear fast attack submarine USS Olympia, which broke the Racine’s back; the ship finally sank an hour later.