
Dupuy’s Verities: Fortification

The Maginot Line was a 900-mile long network of underground bunkers, tunnels and concrete retractable gun batteries. Its heaviest defenses were located along the 280-mile long border with Germany. [WikiCommons]

The sixth of Trevor Dupuy’s Timeless Verities of Combat is:

Defenders’ chances of success are directly proportional to fortification strength.

From Understanding War (1987):

To some modern military thinkers this is a truism needing no explanation or justification. Others have asserted that prepared defenses are attractive traps to be avoided at all costs. Such assertions, however, either ignore or misread historical examples. History is so fickle that it is dangerous for historians to use such words as “always” or “never.” Nevertheless I offer a bold counter-assertion: never in history has a defense been weakened by the availability of fortifications; defensive works always enhance combat strength. At the very least, fortifications will delay an attacker and add to his casualties; at best, fortifications will enable the defender to defeat the attacker.

Anyone who suggests that breakthroughs of defensive positions in recent history demonstrate the bankruptcy of defensive posture and/or fortifications is seriously deceiving himself and is misinterpreting modern history. One can cite as historical examples the overcoming of the Maginot Line, the Mannerheim Line, the Siegfried Line, and the Bar Lev Line, and from these examples conclude that these fortifications failed. Such a conclusion is absolutely wrong. It is true that all of these fortifications were overcome, but only because a powerful enemy was willing to make a massive and costly effort. (Of course, the Maginot Line was not attacked frontally in 1940; the Germans were so impressed by its defensive strength that they bypassed it, and were threatening its rear when France surrendered.) All of these fortifications afforded time for the defenders to make new dispositions, to bring up reserves, or to mobilize. All were intended to obstruct, to permit the defenders to punish the attackers and, above all to delay; all were successful in these respects. The Bar Lev Line, furthermore, saved Israel from disastrous defeat, and became the base for a successful offensive.[p. 4]

Will field fortifications continue to enhance the combat power of land forces on future battlefields? This is an interesting question. While the character of existing types of fortifications—trenches, strongpoints, and bunkers—might change, seeking cover and concealment from the earth might become even more important.

Dr. Alexander Kott, Chief Scientist at the U.S. Army Research Laboratory, provided one perspective in a recently published paper titled “Ground Warfare in 2050: How It Might Look.” In it, Kott speculated about “tactical ground warfighting circa 2050, in a major conflict between technologically advanced peer competitors.”

Kott noted that on future battlefields dominated by sensor saturation and long-range precision fires, “Conventional entrenchments and other fortifications will become less effective when teams of intelligent munitions can maneuver into and within a trench or a bunker.” Light dismounted forces “will have limited, if any, protection either from antimissiles or armor (although they may be provided a degree of protection by armor deployed by their robotic helpers…). Instead, they will use cluttered ground terrain to obtain cover and concealment. In addition, they will attempt to distract and deceive…by use of decoys.”

Heavy forces “capable of producing strong lethal effects—substantial in size and mounted on vehicles—will be unlikely to avoid detection, observation, and fires.” To mitigate continuous incoming precision fires, Kott envisions that heavy ground forces will employ a combination of cover and concealment, maneuver, dispersion, decoys, vigorous counter-ISR (intelligence, surveillance, and reconnaissance) attacks, and armor, but will rely primarily “on extensive use of intelligent antimissiles (evolutions of today’s Active Protection Systems [APSs], Counter Rocket, Artillery, and Mortar [C-RAM], Iron Dome, etc.)”

Conversely, Kott does not foresee underground cover and concealment disappearing from future battlefields. “To gain protection from intelligent munitions, extended subterranean tunnels and facilities will become important. This in turn will necessitate the tunnel-digging robotic machines, suitably equipped for battlefield mobility.” Not only will “large static assets such as supply dumps or munitions repair and manufacturing shops” be moved underground, but maneuver forces and field headquarters might conceivably rapidly dig themselves into below-ground fighting positions between operational bounds.

Comparing Force Ratios to Casualty Exchange Ratios

“American Marines in Belleau Wood (1918)” by Georges Scott [Wikipedia]

Christopher A. Lawrence

[The article below is reprinted from the Summer 2009 edition of The International TNDM Newsletter.]

There are three versions of force ratio versus casualty exchange ratio rules, such as the three-to-one rule (3-to-1 rule), as it applies to casualties. The earliest version of the rule as it relates to casualties that we have been able to find comes from the 1958 version of the U.S. Army Maneuver Control manual, which states: “When opposing forces are in contact, casualties are assessed in inverse ratio to combat power. For friendly forces advancing with a combat power superiority of 5 to 1, losses to friendly forces will be about 1/5 of those suffered by the opposing force.”[1]

The RAND version of the rule (1992) states that: “the famous ‘3:1 rule,’ according to which the attacker and defender suffer equal fractional loss rates at a 3:1 force ratio if the battle is in mixed terrain and the defender enjoys ‘prepared’ defenses…” [2]

Finally, there is a version of the rule that dates from the 1967 Maneuver Control manual that only applies to armor that shows:

As the RAND construct also applies to equipment losses, this formulation is directly comparable to it.

Therefore, we have three basic versions of the 3-to-1 rule as it applies to casualties and/or equipment losses. First, there is a rule that states that there is an even fractional loss ratio at 3-to-1 (the RAND version). Second, there is a rule that states that at 3-to-1, the attacker will suffer one-third the losses of the defender. And third, there is a rule that states that at 3-to-1, the attacker will suffer the same losses as the defender. These versions are highly contradictory, with the attacker suffering either three times the losses of the defender, the same losses as the defender, or one-third the losses of the defender.

Therefore, what we will examine here is the relationship between force ratios and exchange ratios. In this case, we will first look at The Dupuy Institute’s Battles Database (BaDB), which covers 243 battles from 1600 to 1900. We will chart on the y-axis the force ratio, as measured by a count of the number of people deployed for battle on each side. The force ratio is the number of attackers divided by the number of defenders. On the x-axis is the exchange ratio, which is measured by a count of the number of people on each side who were killed, wounded, missing or captured during that battle. It does not include disease and non-battle injuries. Again, it is calculated by dividing the total attacker casualties by the total defender casualties. The results are provided below:
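As a minimal illustration of these two calculations, the sketch below computes and plots the ratios for a few hypothetical battle records; the column names and values are illustrative assumptions, not entries from the BaDB itself.

```python
# Illustrative sketch only: hypothetical battle records, not the actual BaDB.
import pandas as pd
import matplotlib.pyplot as plt

battles = pd.DataFrame({
    "attacker_strength":   [30000, 12000, 45000],
    "defender_strength":   [20000, 10000, 15000],
    "attacker_casualties": [4000, 1500, 9000],   # killed, wounded, missing, captured
    "defender_casualties": [5000, 1500, 3000],   # excludes disease and non-battle injuries
})

# Force ratio: number of attackers divided by number of defenders.
battles["force_ratio"] = battles["attacker_strength"] / battles["defender_strength"]
# Exchange ratio: total attacker casualties divided by total defender casualties.
battles["exchange_ratio"] = battles["attacker_casualties"] / battles["defender_casualties"]

# Scattergram: exchange ratio on the x-axis, force ratio on the y-axis.
battles.plot.scatter(x="exchange_ratio", y="force_ratio")
plt.xlabel("Casualty exchange ratio (attacker/defender)")
plt.ylabel("Force ratio (attacker/defender)")
plt.show()
```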

As can be seen, there are a few extreme outliers among these 243 data points. The most extreme, the Battle of Tippermuir (1 Sep 1644), in which a Royalist force under Montrose routed an attack by Scottish Covenanter militia, causing about 3,000 casualties to the Scots in exchange for a single (allegedly self-inflicted) casualty to the Royalists, was removed from the chart. This 3,000-to-1 loss ratio was deemed too great an outlier to be of value in the analysis.

As it is, the vast majority of cases are clumped down into the corner of the graph with only a few scattered data points outside of that clumping. If one did try to establish some form of curvilinear relationship, one would end up drawing a hyperbola. It is worthwhile to look inside that clump of data to see what it shows. Therefore, we will look at the graph truncated so as to show only force ratios at or below 20-to-1 and exchange ratios at or below 20-to-1.

Again, the data remains clustered in one corner with the outlying data points again pointing to a hyperbola as the only real fitting curvilinear relationship. Let’s look a little deeper into the data by truncating it at 6-to-1 for both force ratios and exchange ratios. As can be seen, if the RAND version of the 3-to-1 rule is correct, then the data should show, at a 3-to-1 force ratio, a 3-to-1 casualty exchange ratio. There is only one data point that comes close to this out of the 243 points we examined.

If the FM 105-5 version of the rule as it applies to armor is correct, then the data should show that at a 3-to-1 force ratio there is a 1-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 1-to-2 casualty exchange ratio, and at a 5-to-1 force ratio a 1-to-3 casualty exchange ratio. Of course, there is no armor in these pre-WW I engagements, but again, no such exchange pattern appears.

If the 1958 version of the FM 105-5 rule as it applies to casualties is correct, then the data should show that at a 3-to-1 force ratio there is a 0.33-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 0.25-to-1 casualty exchange ratio, and at a 5-to-1 force ratio a 0.20-to-1 casualty exchange ratio. As can be seen, there is not much indication of this pattern, or for that matter any of the three patterns.
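For concreteness, the exchange ratios each construct predicts at the force ratios discussed above can be tabulated with a few lines of code; this sketch uses only the values stated in the text, not a reconstruction of the original manuals.

```python
# Predicted attacker-to-defender casualty exchange ratios at selected force ratios,
# taken directly from the three versions of the rule described in the text.
predictions = {
    # RAND (1992): equal fractional loss rates at 3-to-1, i.e., a 3-to-1 exchange ratio.
    "RAND (1992)":            {3: 3.0},
    # FM 105-5 (1967) armor rule: 1-to-1 at 3-to-1, 1-to-2 at 4-to-1, 1-to-3 at 5-to-1.
    "FM 105-5 (1967), armor": {3: 1.0, 4: 0.5, 5: 1 / 3},
    # FM 105-5 (1958): attacker losses in inverse ratio to combat power superiority.
    "FM 105-5 (1958)":        {3: 1 / 3, 4: 0.25, 5: 0.20},
}

for rule, points in predictions.items():
    for force_ratio, exchange_ratio in sorted(points.items()):
        print(f"{rule}: at {force_ratio}-to-1, predicted exchange ratio "
              f"{exchange_ratio:.2f}-to-1")
```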

Still, such a construct may not be relevant to data before 1900. For example, Lanchester claimed in 1914, in Chapter V, “The Principle of Concentration,” of his book Aircraft in Warfare, that there is greater advantage to be gained in modern warfare from concentration of fire.[3] Therefore, we will tap our more modern Division-Level Engagement Database (DLEDB) of 675 engagements, of which 628 have force ratios and exchange ratios calculated for them. These 628 cases are then placed on a scattergram to see if we can detect any similar patterns.

Even though this data covers the period from 1904 to 1991, with the vast majority of the data coming from engagements after 1940, one again sees the same pattern as with the data from 1600-1900. If there is a curvilinear relationship, it is again a hyperbola. As before, it is useful to look into the mass of data clustered into the corner by truncating the force and exchange ratios at 20-to-1. This produces the following:

Again, one sees the data clustered in the corner, with any curvilinear relationship again being a hyperbola. A look at the data further truncated to a 10-to-1 force or exchange ratio does not yield anything more revealing.

And, if this data is truncated to show only 5-to-1 force ratio and exchange ratios, one again sees:

Again, this data appears to be mostly just noise, with no clear patterns here that support any of the three constructs. In the case of the RAND version of the 3-to-1 rule, there is again only one data point (out of 628) that is anywhere close to the crossover point (even fractional exchange rate) that RAND postulates. In fact, it almost looks like the data conspires to make sure it leaves a noticeable “hole” at that point. The other postulated versions of the 3-to-1 rules are also given no support in these charts.

Also of note is that the relationship between force ratios and exchange ratios does not appear to change significantly for combat during 1600-1900 when compared to combat during 1904-1991. This does not provide much support for the intellectual construct developed by Lanchester to argue for his N-square law.
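For reference, the N-square law at issue is conventionally written as the pair of attrition equations below; this is the standard textbook form, not a quotation from Lanchester:

```latex
\frac{dA}{dt} = -\beta B, \qquad \frac{dB}{dt} = -\alpha A
\quad\Longrightarrow\quad
\alpha\left(A_0^{2} - A^{2}\right) = \beta\left(B_0^{2} - B^{2}\right)
```

Fighting strength under these equations scales with the square of force size, which is the formal basis of Lanchester’s argument for the value of concentration in modern warfare.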

While we can attempt to torture the data to find a better fit, or can try to argue that the patterns are obscured by various factors that have not been considered, we do not believe that such a clear pattern and relationship exists. More advanced mathematical methods may show such a pattern, but to date such attempts have not ferreted out these alleged patterns. For example, we refer the reader to Janice Fain’s article on Lanchester equations, The Dupuy Institute’s Capture Rate Study, Phase I & II, or any number of other studies that have looked at Lanchester.[4]

The fundamental problem is that there does not appear to be a direct cause and effect between force ratios and exchange ratios. It appears to be an indirect relationship in the sense that force ratios are one of several independent variables that determine the outcome of an engagement, and the nature of that outcome helps determine the casualties. As such, there is a more complex set of interrelationships that have not yet been fully explored in any study that we know of, although it is briefly addressed in our Capture Rate Study, Phase I & II.

NOTES

[1] FM 105-5, Maneuver Control (1958), 80.

[2] Patrick Allen, “Situational Force Scoring: Accounting for Combined Arms Effects in Aggregate Combat Models,” (N-3423-NA, The RAND Corporation, Santa Monica, CA, 1992), 20.

[3] F. W. Lanchester, Aircraft in Warfare: The Dawn of the Fourth Arm (Lanchester Press Incorporated, Sunnyvale, Calif., 1995), 46-60. One notes that Lanchester provided no data to support these claims, but relied upon an intellectual argument based upon a gross misunderstanding of ancient warfare.

[4] In particular, see page 73 of Janice B. Fain, “The Lanchester Equations and Historical Warfare: An Analysis of Sixty World War II Land Engagements,” Combat Data Subscription Service (HERO, Arlington, Va., Spring 1975).

U.S. Army Multi-Domain Operations Concept Continues Evolving

[Sgt. Meghan Berry, US Army/adapted by U.S. Army Modern War Institute]

The U.S. Army Training and Doctrine Command (TRADOC) released draft version 1.5 of its evolving Multi-Domain Operations (MDO) future operating concept last week. Entitled TRADOC Pamphlet 525-3-1, “The U.S. Army in Multi-Domain Operations 2028,” this iteration updates the initial Multi-Domain Battle (MDB) concept issued in October 2017.

According to U.S. Army Chief of Staff (and Chairman of the Joint Chiefs of Staff nominee) General Mark Milley, MDO Concept 1.5 is the first step in the doctrinal evolution. “It describes how U.S. Army forces, as part of the Joint Force, will militarily compete, penetrate, dis-integrate, and exploit our adversaries in the future.”

TRADOC Commander General Stephen Townsend summarized the draft concept thusly:

The U.S. Army in Multi-Domain Operations 2028 concept proposes a series of solutions to solve the problem of layered standoff. The central idea in solving this problem is the rapid and continuous integration of all domains of warfare to deter and prevail as we compete short of armed conflict. If deterrence fails, Army formations, operating as part of the Joint Force, penetrate and dis-integrate enemy anti-access and area denial systems; exploit the resulting freedom of maneuver to defeat enemy systems, formations and objectives and to achieve our own strategic objectives; and consolidate gains to force a return to competition on terms more favorable to the U.S., our allies and partners.

To achieve this, the Army must evolve our force, and our operations, around three core tenets. Calibrated force posture combines position and the ability to maneuver across strategic distances. Multi-domain formations possess the capacity, endurance and capability to access and employ capabilities across all domains to pose multiple and compounding dilemmas on the adversary. Convergence achieves the rapid and continuous integration of all domains across time, space and capabilities to overmatch the enemy. Underpinning these tenets are mission command and disciplined initiative at all warfighting echelons. (original emphasis)

For a look at the evolution of Army and U.S. Marine Corps doctrinal thinking about multi-domain warfare since early 2017, see the following posts:

Army And Marine Corps Join Forces To Define Multi-Domain Battle Concept

U.S. Army Updates Draft Multi-Domain Battle Operating Concept

 

Trevor Dupuy and Technological Determinism in Digital Age Warfare

Is this the only innovation in weapons technology in history with the ability in itself to change warfare and alter the balance of power? Trevor Dupuy thought it might be. Shot IVY-MIKE, Eniwetok Atoll, 1 November 1952. [Wikimedia]

Trevor Dupuy was skeptical about the role of technology in determining outcomes in warfare. While he believed technological innovation was crucial, he did not think that technology by itself decided success or failure on the battlefield. As he wrote in an article published posthumously in 1997,

I am a humanist, who is also convinced that technology is as important today in war as it ever was (and it has always been important), and that any national or military leader who neglects military technology does so to his peril and that of his country. But, paradoxically, perhaps to an extent even greater than ever before, the quality of military men is what wins wars and preserves nations. (emphasis added)

His conclusion was largely based upon his quantitative approach to studying military history, particularly the way humans have historically responded to the relentless trend of increasingly lethal military technology.

The Historical Relationship Between Weapon Lethality and Battle Casualty Rates

Based on a 1964 study for the U.S. Army, Dupuy identified a long-term historical relationship between increasing weapon lethality and decreasing average daily casualty rates in battle. (He summarized these findings in his book, The Evolution of Weapons and Warfare (1980). The quotes below are taken from it.)

Since antiquity, military technological development has produced weapons of ever increasing lethality. The rate of increase in lethality has grown particularly dramatically since the mid-19th century.

In contrast, the average daily casualty rate in combat has been declining since 1600. With notable exceptions during the 19th century, casualty rates have continued to fall through the late 20th century. If technological innovation has produced vastly more lethal weapons, why have there been fewer average daily casualties in battle?

The primary cause, Dupuy concluded, was that humans have adapted to increasing weapon lethality by changing the way they fight. He identified three key tactical trends in the modern era that have influenced the relationship between lethality and casualties:

Technological Innovation and Organizational Assimilation

Dupuy noted that the historical correlation between weapons development and their use in combat has not been linear because the pace of integration has been largely determined by military leaders, not the rate of technological innovation. “The process of doctrinal assimilation of new weapons into compatible tactical and organizational systems has proved to be much more significant than invention of a weapon or adoption of a prototype, regardless of the dimensions of the advance in lethality.” [p. 337]

As a result, the history of warfare has been exemplified more often by a discontinuity between weapons and tactical systems than by effective continuity.

During most of military history there have been marked and observable imbalances between military efforts and military results, an imbalance particularly manifested by inconclusive battles and high combat casualties. More often than not this imbalance seems to be the result of incompatibility, or incongruence, between the weapons of warfare available and the means and/or tactics employing the weapons. [p. 341]

In short, military organizations typically have not been fully effective at exploiting new weapons technology to advantage on the battlefield. Truly decisive alignment between weapons and systems for their employment has been exceptionally rare. Dupuy asserted that

There have been six important tactical systems in military history in which weapons and tactics were in obvious congruence, and which were able to achieve decisive results at small casualty costs while inflicting disproportionate numbers of casualties. These systems were:

  • the Macedonian system of Alexander the Great, ca. 340 B.C.
  • the Roman system of Scipio and Flaminius, ca. 200 B.C.
  • the Mongol system of Genghis Khan, ca. A.D. 1200
  • the English system of Edward I, Edward III, and Henry V, ca. A.D. 1350
  • the French system of Napoleon, ca. A.D. 1800
  • the German blitzkrieg system, ca. A.D. 1940 [p. 341]

With one caveat, Dupuy could not identify any single weapon that had decisively changed warfare in and of itself without a corresponding human adaptation in its use on the battlefield.

Save for the recent significant exception of strategic nuclear weapons, there have been no historical instances in which new and lethal weapons have, of themselves, altered the conduct of war or the balance of power until they have been incorporated into a new tactical system exploiting their lethality and permitting their coordination with other weapons; the full significance of this one exception is not yet clear, since the changes it has caused in warfare and the influence it has exerted on international relations have yet to be tested in war.

Until the present time, the application of sound, imaginative thinking to the problem of warfare (on either an individual or an institutional basis) has been more significant than any new weapon; such thinking is necessary to real assimilation of weaponry; it can also alter the course of human affairs without new weapons. [p. 340]

Technological Superiority and Offset Strategies

Will new technologies like robotics and artificial intelligence provide the basis for a seventh tactical system where weapons and their use align with decisive battlefield results? Maybe. If Dupuy’s analysis is accurate, however, it is more likely that future increases in weapon lethality will continue to be counterbalanced by human ingenuity in how those weapons are used, yielding indeterminate—perhaps costly and indecisive—battlefield outcomes.

Genuinely effective congruence between weapons and force employment continues to be difficult to achieve. Dupuy believed the preconditions necessary for successful technological assimilation since the mid-19th century have been a combination of conducive military leadership; effective coordination of national economic, technological-scientific, and military resources; and the opportunity to evaluate and analyze battlefield experience.

Can the U.S. meet these preconditions? That certainly seemed to be the goal of the so-called Third Offset Strategy, articulated in 2014 by the Obama administration. It called for maintaining “U.S. military superiority over capable adversaries through the development of novel capabilities and concepts.” Although the Trump administration has stopped using the term, it has made “maximizing lethality” the cornerstone of the 2018 National Defense Strategy, with increased funding for the Defense Department’s modernization priorities in FY2019 (though perhaps not in FY2020).

Dupuy’s original work on weapon lethality in the 1960s coincided with development in the U.S. of what advocates of a “revolution in military affairs” (RMA) have termed the “First Offset Strategy,” which involved the potential use of nuclear weapons to balance Soviet superiority in manpower and material. RMA proponents pointed to the lopsided victory of the U.S. and its allies over Iraq in the 1991 Gulf War as proof of the success of a “Second Offset Strategy,” which exploited U.S. precision-guided munitions, stealth, and intelligence, surveillance, and reconnaissance systems developed to counter the Soviet Army in Germany in the 1980s. Dupuy was one of the few to attribute the decisiveness of the Gulf War both to airpower and to the superior effectiveness of U.S. combat forces.

Trevor Dupuy certainly was not an anti-technology Luddite. He recognized the importance of military technological advances and the need to invest in them. But he believed that the human element has always been more important on the battlefield. Most wars in history have been fought without a clear-cut technological advantage for one side; some have been bloody and pointless, while others have been decisive for reasons other than technology. While the future is certainly unknown and past performance is not a guarantor of future results, it would be a gamble to rely on technological superiority alone to provide the margin of success in future warfare.

The Great 3-1 Rule Debate

[This piece was originally posted on 13 July 2016.]

Trevor Dupuy’s article cited in my previous post, “Combat Data and the 3:1 Rule,” was the final salvo in a roaring, multi-year debate between two highly regarded members of the U.S. strategic and security studies academic communities, political scientist John Mearsheimer and military analyst/polymath Joshua Epstein. In an exchange carried out primarily in the pages of the academic journal International Security, Epstein and Mearsheimer argued over the validity of the 3-1 rule and other analytical models with respect to the NATO/Warsaw Pact military balance in Europe in the 1980s. Epstein cited Dupuy’s empirical research in support of his criticism of Mearsheimer’s reliance on the 3-1 rule. In turn, Mearsheimer questioned Dupuy’s data and conclusions to refute Epstein. Dupuy’s article defended his research and pointed out the errors in Mearsheimer’s assertions. With the publication of Dupuy’s rebuttal, the International Security editors called a time out on the debate thread.

The Epstein/Mearsheimer debate was itself part of a larger political debate over U.S. policy toward the Soviet Union during the administration of Ronald Reagan. This interdisciplinary argument, which has since become legendary in security and strategic studies circles, drew in some of the biggest names in these fields, including Eliot Cohen, Barry Posen, the late Samuel Huntington, and Stephen Biddle. As Jeffery Friedman observed,

These debates played a prominent role in the “renaissance of security studies” because they brought together scholars with different theoretical, methodological, and professional backgrounds to push forward a cohesive line of research that had clear implications for the conduct of contemporary defense policy. Just as importantly, the debate forced scholars to engage broader, fundamental issues. Is “military power” something that can be studied using static measures like force ratios, or does it require a more dynamic analysis? How should analysts evaluate the role of doctrine, or politics, or military strategy in determining the appropriate “balance”? What role should formal modeling play in formulating defense policy? What is the place for empirical analysis, and what are the strengths and limitations of existing data?[1]

It is well worth the time to revisit the contributions to the 1980s debate. I have included a bibliography below that is not exhaustive, but is a place to start. The collapse of the Soviet Union and the end of the Cold War diminished the intensity of the debates, which simmered through the 1990s and then were obscured during the counterterrorism/counterinsurgency conflicts of the post-9/11 era. It is possible that the challenges posed by China and Russia amidst the ongoing “hybrid” conflict in Syria and Iraq may revive interest in interrogating the bases of military analyses in the U.S. and the West. It is a discussion that is long overdue and potentially quite illuminating.

NOTES

[1] Jeffery A. Friedman, “Manpower and Counterinsurgency: Empirical Foundations for Theory and Doctrine,” Security Studies 20 (2011)

BIBLIOGRAPHY

(Note: Some of these are behind paywalls, but some are available in PDF format. Mearsheimer has made many of his publications freely available here.)

John J. Mearsheimer, “Why the Soviets Can’t Win Quickly in Central Europe,” International Security 7, no. 1 (Summer 1982)

Samuel P. Huntington, “Conventional Deterrence and Conventional Retaliation in Europe,” International Security 8, no. 3 (Winter 1983/84)

Joshua Epstein, Strategy and Force Planning (Washington, DC: Brookings, 1987)

Joshua M. Epstein, “Dynamic Analysis and the Conventional Balance in Europe,” International Security 12, no. 4 (Spring 1988)

John J. Mearsheimer, “Numbers, Strategy, and the European Balance,” International Security 12, no. 4 (Spring 1988)

Stephen Biddle, “The European Conventional Balance,” Survival 30, no. 2 (March/April 1988)

Eliot A. Cohen, “Toward Better Net Assessment: Rethinking the European Conventional Balance,” International Security 13, no. 1 (Summer 1988)

Joshua M. Epstein, “The 3:1 Rule, the Adaptive Dynamic Model, and the Future of Security Studies,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, “Assessing the Conventional Balance,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, Barry R. Posen, Eliot A. Cohen, “Correspondence: Reassessing Net Assessment,” International Security 13, no. 4 (Spring 1989)

Trevor N. Dupuy, “Combat Data and the 3:1 Rule,” International Security 14, no. 1 (Summer 1989)

Stephen Biddle et al., Defense at Low Force Levels (Alexandria, VA: Institute for Defense Analyses, 1991)

Dupuy’s Verities: Initiative

German Army soldiers advance during the Third Battle of Kharkov in early 1943. This was the culmination of a counteroffensive by German Field Marshal Erich von Manstein that blunted the Soviet offensive drive that followed the encirclement of the German Sixth Army at Stalingrad in late 1942. [Photo: KonchitsyaLeto/Reddit]

The fifth of Trevor Dupuy’s Timeless Verities of Combat is:

Initiative permits application of preponderant combat power.

From Understanding War (1987):

The importance of seizing and maintaining the initiative has not declined in our times, nor will it in the future. This has been the secret of success of all of the great captains of history. It was as true of MacArthur as it was of Alexander the Great, Grant or Napoleon. Some modern Soviet theorists have suggested that this is even more important now in an era of high technology than formerly. They may be right. This has certainly been a major factor in the Israeli victories over the Arabs in all of their wars.

Given the prominent role initiative has played in warfare historically, it is curious that it is not a principle of war in its own right. However, it could be argued that it is sufficiently embedded in the principles of the offensive and maneuver that it does not need to be articulated separately. After all, the traditional means of seizing the initiative on the battlefield is through a combination of the offensive and maneuver.

Initiative is a fundamental aspect of current U.S. Army doctrine, as stated in ADP 3-0 Operations (2017):

The central idea of operations is that, as part of a joint force, Army forces seize, retain, and exploit the initiative to gain and maintain a position of relative advantage in sustained land operations to prevent conflict, shape the operational environment, and win our Nation’s wars as part of unified action.

For Dupuy, the specific connection between initiative and combat power is likely why he chose to include it as a verity in its own right. Combat power was the central concept in his theory of combat, and initiative was not just the basic means of achieving a preponderance of combat power through superior force strength (i.e., numbers), but also of harnessing the effects of the circumstantial variables of combat that multiply combat power (i.e., surprise, mobility, vulnerability, combat effectiveness). It was precisely the exploitation of this relationship between initiative and combat power that allowed numerically inferior German and Israeli combat forces to succeed time and again against numerically superior Soviet and Arab opponents.

Using initiative to apply preponderant combat power in battle is the primary way the effects of maneuver (to “gain and maintain a position of relative advantage“) are abstracted in Dupuy’s Quantified Judgement Model (QJM)/Tactical Numerical Deterministic Model (TNDM). The QJM/TNDM itself is primarily a combat attrition adjudicator that determines combat outcomes through calculations of relative combat power. The numerical force strengths of the opposing forces engaged as determined by maneuver can be easily inputted into the QJM/TNDM and then modified by the applicable circumstantial variables of combat related to maneuver to obtain a calculation of relative combat power. As another of Dupuy’s verities states, “superior combat power always wins.”
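As a highly simplified sketch of what such a relative combat power calculation looks like in spirit, the fragment below multiplies force strength by a handful of circumstantial multipliers; the factor names and values are illustrative assumptions, not the actual QJM/TNDM formulation or its calibrated values.

```python
# Highly simplified sketch, not the actual QJM/TNDM: combat power treated as force
# strength multiplied by circumstantial variables of combat.
def combat_power(force_strength, surprise=1.0, mobility=1.0,
                 vulnerability=1.0, effectiveness=1.0):
    """Return a notional combat power score for one side."""
    return force_strength * surprise * mobility * vulnerability * effectiveness

# A numerically inferior attacker that seizes the initiative, gaining surprise,
# superior mobility, and higher combat effectiveness...
attacker = combat_power(60000, surprise=1.5, mobility=1.2, effectiveness=1.3)
# ...against a numerically superior but more vulnerable defender.
defender = combat_power(90000, vulnerability=0.9)

ratio = attacker / defender
print(f"Relative combat power (attacker/defender): {ratio:.2f}")
# A ratio above 1.0 indicates a preponderance of combat power despite inferior numbers.
```

The point of the sketch is simply that the multiplicative circumstantial variables can outweigh a deficit in raw numbers, which is the mechanism this verity describes.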

What Does Lethality Mean In Warfare?

In an insightful essay over at The Strategy Bridge, “Lethality: An Inquiry,” Marine Corps officer Olivia Gerard accomplishes one of the most important, yet most often overlooked, aspects of successfully thinking about and planning for war: questioning a basic assumption. She achieves this by posing a simple question: “What is lethality?”

Gerard notes that the current U.S. National Defense Strategy is predicated on lethality; as it states: “A more lethal, resilient, and rapidly innovating Joint Force, combined with a robust constellation of allies and partners, will sustain American influence and ensure favorable balances of power that safeguard the free and open international order.” She also identifies the linkage in the strategy between lethality and deterrence via a supporting statement from Deputy Secretary of Defense Patrick Shanahan: “Everything we do is geared toward one goal: maximizing lethality. A lethal force is the strongest deterrent to war.”

After pointing out that the strategy does not define the concept of lethality, Gerard responds to Shanahan’s statement by asking “why?”

She uses this as a jumping off point to examine the meaning of lethality in warfare. Starting from the traditional understanding of lethality as a tactical concept, Gerard walks through the way it has been understood historically. From this, she formulates a construct for understanding the relationship between lethality and strategy:

Organizational lethality emerges from tactical lethality that is institutionally codified. Tactical lethality is nested within organizational lethality, which is nested within strategic lethality. Plugging these terms into an implicit calculus, we can rewrite strategic lethality as the efficacy with which we can form intentional deadly relationships towards targets that can be actualized towards political ends.

To this, Gerard appends two interesting caveats: “Notice first that the organizational component becomes implicit. What remains outside, however, is the intention–a meta-intention–to form these potential deadly relationships in the first place.”

It is the second of these caveats—the intent to connect lethality to a strategic end—that informs Gerard’s conclusion. While the National Defense Strategy does not define the term, she observes that by explicitly leveraging the threat to use lethality to bolster deterrence, it supplies the necessary credibility needed to make deterrence viable. “Proclaiming lethality a core tenet, especially in a public strategic document, is the communication of the threat.”

Gerard’s exploration of lethality and her proposed framework for understanding it provide a very useful way of thinking about the way it relates to warfare. It is definitely worth your time to read.

What might be just as interesting, however, are the caveats to her construct because they encompass a lot of what is problematic about the way the U.S. military thinks—explicitly and implicitly—about tactical lethality and how it is codified into concepts of organizational lethality. (While I have touched on some of those already, Gerard gives more to reflect on. More on that later.)

Gerard also references the definition of lethality Trevor Dupuy developed for his 1964 study of historical trends in weapon lethality. While noting that his definition was too narrow for the purposes of her inquiry, the historical relationship between lethality, casualties, and dispersion on the battlefield Dupuy found in that study formed the basis for his subsequent theories of warfare and models of combat. (I will write more about those in the future as well.)

Simpkin on the Long-Term Effects of Firepower Dominance

To follow on my earlier post introducing British military theorist Richard Simpkin’s foresight in detecting trends in 21st Century warfare, I offer this paragraph, which immediately followed the ones I quoted:

Briefly and in the most general terms possible, I suggest that the long-term effect of dominant firepower will be threefold. It will disperse mass in the form of a “net” of small detachments with the dual role of calling down fire and of local quasi-guerrilla action. Because of its low density, the elements of this net will be everywhere and will thus need only the mobility of the boot. It will transfer mass, structurally from the combat arms to the artillery, and in deployment from the direct fire zone (as we now understand it) to the formation and protection of mobile fire bases capable of movement at heavy-track tempo (Chapter 9). Thus the third effect will be to polarise mobility, for the manoeuvre force still required is likely to be based on the rotor. This line of thought is borne out by recent trends in Soviet thinking on the offensive. The concept of an operational manoeuvre group (OMG) which hives off raid forces against C3 and indirect fire resources is giving way to more fluid and discontinuous manoeuvre by task forces (“air-ground assault groups” found by “shock divisions”) directed onto fire bases—again of course with an operational helicopter force superimposed. [Simpkin, Race To The Swift, p. 169]

It seems to me that, writing in the mid-1980s, Simpkin predicted the emergence of modern anti-access/area denial (A2/AD) defensive systems with reasonable accuracy, as well as the evolving thinking on the part of the U.S. military as to how to operate against them.

Simpkin’s vision of task forces (more closely resembling Russian/Soviet OMGs than rotary wing “air-ground assault groups” operational forces, however) employing “fluid and discontinuous manoeuvre” at operational depths to attack long-range precision firebases appears similar to emerging Army thinking about future multidomain operations. (It’s likely that Douglas MacGregor’s Reconnaissance Strike Group concept more closely fits that bill.)

One thing he missed was his belief that rotary wing combat forces would supplant armored forces as the primary deep operations combat arm. However, drone swarms might conceivably take the place in Simpkin’s operational construct that he allotted to heliborne forces. Drones have two primary advantages over manned helicopters: they are far cheaper and they are far less vulnerable to enemy fires. With their unique capacity to blend mass and fires, drones could conceivably form the deep strike operational hammer that Simpkin saw rotary wing forces providing.

Just as interesting was Simpkin’s anticipation of the growing importance of information and electronic warfare in these environments. More on that later.

Richard Simpkin on 21st Century Trends in Mass and Firepower

Anvil of “troops” vs. anvil of fire. (Richard Simpkin, Race To The Swift: Thoughts on Twenty-First Century Warfare, Brassey’s: London, 1985, p. 51)

For my money, one of the most underrated analysts and theorists of modern warfare was the late Brigadier Richard Simpkin. A retired British Army officer and World War II veteran, Simpkin helped design the Chieftain tank in the 1960s and 1970s. He is best known for his series of books analyzing Soviet and Western military theory and doctrine. His magnum opus was Race To The Swift: Thoughts on Twenty-First Century Warfare, published in 1985. A brilliant blend of military history, insightful analysis of tactics and technology as well as operations and strategy, and Simpkin’s idiosyncratic wit, the observations in Race To The Swift are becoming more prescient by the year.

Some of Simpkin’s analysis has not aged well, such as the focus on the NATO/Soviet confrontation in Central Europe, and a bold prediction that rotary wing combat forces would eventually supplant tanks as the primary combat arm. However, it would be difficult to find a better historical review of the role of armored forces in modern warfare and how trends in technology, tactics, and doctrine are interacting with strategy, policy, and politics to change the character of warfare in the 21st Century.

To follow on my previous post on the interchangeability of fire (which I gleaned from Simpkin, of course), I offer this nugget on how increasing weapons lethality would affect 21st Century warfare, written from the perspective of the mid 1980s:

While accidents of ground will always provide some kind of cover, the effect of modern firepower on land force tactics is equally revolutionary. Just as we saw in Part 2 how the rotary wing may well turn force structures inside out, firepower is already turning tactical concepts inside out, by replacing the anvil of troops with an anvil of fire (Fig. 5, page 51)*. The use of combat troops at high density to hold ground or to seize it is already likely to prove highly costly, and may soon become wholly unprofitable. The interesting question is what effect the dominance of firepower will have at operational level.

One school of thought, to which many defence academics on both sides of the Atlantic subscribe, is that it will reduce mobility and bring about a return to positional warfare. The opposite view is that it will put a premium on elusiveness, increasing mobility and reducing mass. On analysis, both these opinions appear rather simplistic, mainly because they ignore the interchangeability of troops and fire…—in other words the equivalence or complementarity of the movement of troops and the massing of fire. They also underrate the part played by manned and unmanned surveillance, and by communication. Another factor, little understood by soldiers and widely ignored, is the weight of fire a modern fast jet in its strike configuration, flying a lo-lo-lo profile, can put down very rapidly wherever required. With modern artillery and air support, a pair of eyes backed up by an unjammable radio and perhaps a thermal imager becomes the equivalent of at least a (company) combat team, perhaps a battle group. [Simpkin, Race To The Swift, pp. 168-169]

Sound familiar? I will return to Simpkin’s insights in future posts, but I suggest you all snatch up a copy of Race To The Swift for yourselves.

* See above.

“Quantity Has A Quality All Its Own”: How Robot Swarms Might Change Future Combat

Humans vs. machines in the film Matrix Revolutions (2003) [Screencap by The Matrix Wiki]

Yesterday, Paul Scharre, director of the Technology and National Security Program at the Center for a New American Security, and prolific writer on the future of robotics and artificial intelligence, posted a fascinating argument on Twitter regarding swarms and mass in future combat.

His thread was in response to an article by Shmuel Shmuel posted on War on the Rocks, which made the case that the same computer processing technology enabling robotic vehicles combined with old fashioned kinetic weapons (i.e. anti-aircraft guns) offered a cost-effective solution to swarms.

Scharre agreed that robotic drones are indeed vulnerable to such countermeasures, but made this point in response:

He then went on to contend that robotic swarms offer the potential to reestablish the role of mass in future combat. Mass, either in terms of numbers of combatants or volume of firepower, has played a decisive role in most wars. As the aphorism goes, usually credited to Josef Stalin, “quantity has a quality all its own.”

Scharre observed that the United States went in a different direction in its post-World War II approach to warfare, adopting instead “offset” strategies that sought to leverage superior technology to balance against the mass militaries of the Communist bloc.

While effective during the Cold War, Scharre concurs with the arguments that offset strategies are becoming far too expensive and may ultimately become self-defeating.

In order to avoid this fate, Scharre contends that

The entire thread is well worth reading.

Trevor Dupuy would have agreed with much of what Scharre asserts. He identified a relationship between increasing weapon lethality and battlefield dispersion that goes back to the 17th century. Dupuy believed that the primary factor driving this relationship was the human response to fear in a lethal environment, with soldiers dispersing in depth and frontage on battlefields in order to survive weapons of ever increasing destructiveness.

TDI Friday Read: Lethality, Dispersion, And Mass On Future Battlefields

Robots might very well change that equation. Whether autonomous or “human in the loop,” robotic swarms do not feel fear and are inherently expendable. Cheaply produced robots might very well provide sufficient augmentation to human combat units to restore the primacy of mass in future warfare.