Category: Lessons of History

TDI Friday Read: Engaging The Phalanx

The December 2018 issue of Phalanx, a journal published by the Military Operations Research Society (MORS), contains an article by Jonathan K. Alt, Christopher Morey, and Larry Larimer, titled “Perspectives on Combat Modeling.” (The article is paywalled, but limited public access is available via JSTOR.)

Their article was written partly as a critical rebuttal to a TDI blog post originally published in April 2017, which discussed an issue of which the combat modeling and simulation community has long been aware but has been slow to address, known as the “Base of Sand” problem.

Wargaming Multi-Domain Battle: The Base Of Sand Problem

In short, because so little is empirically known about the real-world structures of combat processes and the interactions of those processes, modelers have been forced to rely on the judgement of subject matter experts (SMEs) to fill in the blanks. No one really knows whether this blend of empirical data and SME judgement accurately represents combat, because the modeling community has been reluctant to test its models against data on real-world experience, a process known as validation.

TDI President Chris Lawrence subsequently published a series of blog posts responding to the specific comments and criticisms leveled by Alt, Morey, and Larimer.

How are combat models and simulations tested to see if they portray real-world combat accurately? Are they actually tested?

Engaging the Phalanx

How can we know if combat simulations adhere to strict standards established by the DoD regarding validation? Perhaps the validation reports can be released for peer review.

Validation

Some claim that models of complex combat behavior cannot really be tested against real-world operational experience, but this has already been done. Several times.

Validating Attrition

If only the “physics-based aspects” of combat models are empirically tested, do those models reliably represent real-world combat with humans or only the interactions of weapons systems?

Physics-based Aspects of Combat

Is real-world historical operational combat experience useful only for demonstrating the capabilities of combat models, or is it something the models should be able to reliably replicate?

Historical Demonstrations?

If a Subject Matter Expert (SME) can be substituted for a proper combat model validation effort, then could not an SME simply be substituted for the model? Should not all models be considered expert judgement quantified?

SMEs

What should be done about the “Base of Sand” problem? Here are some suggestions.

Engaging the Phalanx (part 7 of 7)

Persuading the military operations research community of the importance of research on real-world combat experience in modeling has been an uphill battle with a long history.

Diddlysquat

And the debate continues…

Dupuy’s Verities: The Advantage Of The Offensive

Union assault on the “Mule Shoe” salient, 12 May 1864, by Thure de Thulstrup (1887) [Wikimedia]

The seventh of Trevor Dupuy’s Timeless Verities of Combat is:

An attacker willing to pay the price can always penetrate the strongest defenses.

From Understanding War (1987):

No matter how alert the defender, no matter how skillful his dispositions to avoid or mitigate the effects of surprise or the effects of flank or rear attack, a skillful attacker can always achieve at least a temporary advantage for some time at a place he has selected. This is one reason why Napoleon always endeavored to seize and retain the initiative. In the great battles of 1864 and 1865 in Virginia, Lee was always able to exploit his defensive advantage to the utmost. But Grant equally was always able to achieve a temporary superiority when and where he wished. This did not always result in a Union victory—given Lee’s defensive skill—but invariably it forced Lee to retreat until he could again impose a temporary stalemate with the assistance of powerful field fortifications. A modern example can be found in the Soviet offensive relieving Leningrad in 1943. Another was the Allied break-out from the Normandy beachhead in July and August of 1944.

The exact meaning of this verity is tricky to determine, as the phrase “willing to pay the price” does a lot of work here. History is certainly replete with examples of Pyrrhic victories, where the cost paid for battlefield success deprived the attacker of any clear benefit. (The U.S. Civil War Battle of Chickamauga in 1863 would be an example in line with Dupuy’s description above.) Perhaps “willing and able to pay the price” would have been a better way of stating this. And, of course, no attack is guaranteed to succeed.

What Dupuy had in mind here is probably best understood in the context of two of his other verities: “Offensive action is essential to positive combat results” and “Initiative permits application of preponderant combat power.” Even if the defensive may be the stronger form of combat, the offensive affords certain inherent potential advantages that can enable attackers to defeat the strongest of defenses, provided the attack is conducted effectively, sufficiently resourced, and determinedly pressed.

U.S. Army Releases New Iraq War History

On Thursday, the U.S. Army released a long-awaited history of its operational combat experience in Iraq from 2003 to 2011. The two-volume study, titled The U.S. Army in the Iraq War – Volume 1: Invasion – Insurgency – Civil War, 2003-2006 and The U.S. Army in the Iraq War – Volume 2: Surge and Withdrawal, 2007-2011, was published under the auspices of the U.S. Army War College’s Strategic Studies Institute.

This reflects the study’s unconventional origins. Under normal circumstances, such work would be undertaken by either the U.S. Army Combat Studies Institute (CSI), which is charged with writing quick-turnaround “instant histories,” or the U.S. Army Center of Military History (CMH), which writes more deeply researched “official history” years or decades after the fact.[1] Instead, these volumes were directly commissioned by then-Chief of Staff of the Army General Raymond Odierno, who created an Iraq Study Group in 2013 to research and write them. According to Odierno, his intent was “to capture key lessons, insights, and innovations from our more than 8 years of conflict in that country. [I]t was time to conduct an initial examination of the Army’s experiences in the post-9/11 wars, to determine their implications for our future operations, strategy, doctrine, force structure, and institutions.”

CSI had already started writing contemporary histories of the conflict, publishing On Point: The United States Army in Operation IRAQI FREEDOM (2004) and On Point II: Transition to the New Campaign (2008), which covered the period from 2003 to January 2005. A projected third volume was advertised, but never published.

Although the Iraq Study Group completed its work in June 2016 and the first volume of the history was scheduled for publication that October, its release was delayed due to concerns within the Army historical community regarding its perspective and controversial conclusions. Even after external reviewers deemed the study fair and recommended its publication, claims were lodged when its existence was made public last autumn that the Army was suppressing it to avoid embarrassment. Making clear that the study was not an official history publication, current Army Chief of Staff General Mark Milley added his own foreword to Odierno’s, and publicly released the two volumes yesterday.

NOTES

[1] For a discussion of the roles and missions of CSI and CMH with regard to history, see W. Shane Story, “Transformation or Troop Strength? Early Accounts of the Invasion of Iraq,” Army History, Winter 2006; Richard W. Stewart, “‘Instant’ History and History: A Hierarchy of Needs,” Army History, Winter 2006; Jeffrey J. Clarke, “The Care and Feeding of Contemporary History,” Army History, Winter 2006; and Gregory Fontenot, “The U.S. Army and Contemporary Military History,” Army History, Spring 2008.


U.S. Army Doctrine and Future Warfare

Pre-war U.S. Army warfighting doctrine led to fielding the M10, M18 and M36 tank destroyers to counter enemy tanks. Their relatively ineffective performance against German panzers in Europe during World War II has been seen as the result of flawed thinking about tank warfare. [Wikimedia]

Two recently published articles on current U.S. Army doctrine development and the future of warfare deserve to be widely read:

“An Army Caught in the Middle Between Luddites, Luminaries, and the Occasional Looney”

The first, by RAND’s David Johnson, is titled “An Army Caught in the Middle Between Luddites, Luminaries, and the Occasional Looney,” published by War on the Rocks.

Johnson begins with an interesting argument:

Contrary to what it says, the Army has always been a concepts-based, rather than a doctrine-based, institution. Concepts about future war generate the requirements for capabilities to realize them… Unfortunately, the Army’s doctrinal solutions evolve in war only after the failure of its concepts in its first battles, which the Army has historically lost since the Revolutionary War.

The reason the Army fails in its first battles is because its concepts are initially — until tested in combat — a statement of how the Army “wants to fight” and rarely an analytical assessment of how it “will have to fight.”

Starting with the Army’s failure to develop its own version of “blitzkrieg” after World War I, Johnson identified conservative organizational politics, misreading technological advances, and a stubborn refusal to account for the capabilities of potential adversaries as common causes for the inferior battlefield weapons and warfighting methods that contributed to its impressive string of lost “first battles.”

Conversely, Johnson credited the Army’s novel 1980s AirLand Battle doctrine as the product of an honest assessment of potential enemy capabilities and the development of effective weapon systems that were “based on known, proven technologies that minimized the risk of major program failures.”

“The principal lesson in all of this,” he concluded, “is that the U.S. military should have a clear problem that it is trying to solve to enable it to innovate, and it should realize that innovation is generally not invention.” There are “also important lessons from the U.S. Army’s renaissance in the 1970s, which also resulted in close cooperation between the Army and the Air Force to solve the shared problem of the defense of Western Europe against Soviet aggression that neither could solve independently.”

“The US Army is Wrong on Future War”

The other article, provocatively titled “The US Army is Wrong on Future War,” was published by West Point’s Modern War Institute. It was co-authored by Nathan Jennings, Amos Fox, and Adam Taliaferro, all graduates of the School of Advanced Military Studies, veterans of Iraq and Afghanistan, and currently serving U.S. Army officers.

They argue that

the US Army is mistakenly structuring for offensive clashes of mass and scale reminiscent of 1944 while competitors like Russia and China have adapted to twenty-first-century reality. This new paradigm—which favors fait accompli acquisitions, projection from sovereign sanctuary, and indirect proxy wars—combines incremental military actions with weaponized political, informational, and economic agendas under the protection of nuclear-fires complexes to advance territorial influence. The Army’s failure to conceptualize these features of the future battlefield is a dangerous mistake…

Instead, they assert that the current strategic and operational realities dictate a far different approach:

Failure to recognize the ascendancy of nuclear-based defense—with the consequent potential for only limited maneuver, as in the seventeenth century—incurs risk for expeditionary forces. Even as it idealizes Patton’s Third Army with ambiguous “multi-domain” cyber and space enhancements, the US Army’s fixation with massive counter-offensives to defeat unrealistic Russian and Chinese conquests of Europe and Asia misaligns priorities. Instead of preparing for past wars, the Army should embrace forward positional and proxy engagement within integrated political, economic, and informational strategies to seize and exploit initiative.

The factors they cite that necessitate the adoption of positional warfare include nuclear primacy; sanctuary of sovereignty; integrated fires complexes; limited fait accompli; indirect proxy wars; and political/economic warfare.

“Given these realities,” Jennings, Fox, and Taliaferro assert, “the US Army must adapt and evolve to dominate great-power confrontation in the nuclear age.” As such, they recommend that the U.S. (1) adopt “an approach more reminiscent of the US Army’s Active Defense doctrine of the 1970s than the vaunted AirLand Battle concept of the 1980s,” (2) “dramatically recalibrate its approach to proxy warfare,” and (3) compel “joint, interagency and multinational coordination in order to deliberately align economic, informational, and political agendas in support of military objectives.”

Future U.S. Army Doctrine: How It Wants to Fight or How It Has to Fight?

Readers will find much with which to agree or disagree in each article, but both provide viewpoints that should supply plenty of food for thought, and taken together they acquire added significance. The analysis put forth by Jennings, Fox, and Taliaferro can be read as fulfilling Johnson’s injunction to base doctrine on a sober assessment of the strategic and operational challenges presented by existing enemy capabilities, rather than on an aspirational concept for how the Army would prefer to fight a future war. Whether or not Jennings, et al, have accurately forecasted the future can be debated, but their critique should raise questions as to whether the Army is repeating the past doctrinal development errors identified by Johnson.

Dupuy’s Verities: Fortification

The Maginot Line was a 900-mile long network of underground bunkers, tunnels and concrete retractable gun batteries. Its heaviest defenses were located along the 280-mile long border with Germany. [WikiCommons]

The sixth of Trevor Dupuy’s Timeless Verities of Combat is:

Defenders’ chances of success are directly proportional to fortification strength.

From Understanding War (1987):

To some modern military thinkers this is a truism needing no explanation or justification. Others have asserted that prepared defenses are attractive traps to be avoided at all costs. Such assertions, however, either ignore or misread historical examples. History is so fickle that it is dangerous for historians to use such words as “always” or “never.” Nevertheless I offer a bold counter-assertion: never in history has a defense been weakened by the availability of fortifications; defensive works always enhance combat strength. At the very least, fortifications will delay an attacker and add to his casualties; at best, fortifications will enable the defender to defeat the attacker.

Anyone who suggests that breakthroughs of defensive positions in recent history demonstrate the bankruptcy of defensive posture and/or fortifications is seriously deceiving himself and is misinterpreting modern history. One can cite as historical examples the overcoming of the Maginot Line, the Mannerheim Line, the Siegfried Line, and the Bar Lev Line, and from these examples conclude that these fortifications failed. Such a conclusion is absolutely wrong. It is true that all of these fortifications were overcome, but only because a powerful enemy was willing to make a massive and costly effort. (Of course, the Maginot Line was not attacked frontally in 1940; the Germans were so impressed by its defensive strength that they bypassed it, and were threatening its rear when France surrendered.) All of these fortifications afforded time for the defenders to make new dispositions, to bring up reserves, or to mobilize. All were intended to obstruct, to permit the defenders to punish the attackers and, above all to delay; all were successful in these respects. The Bar Lev Line, furthermore, saved Israel from disastrous defeat, and became the base for a successful offensive.[p. 4]

Will field fortifications continue to enhance the combat power of land forces on future battlefields? This is an interesting question. While the character of existing types of fortifications—trenches, strongpoints, and bunkers—might change, seeking cover and concealment from the earth might become even more important.

Dr. Alexander Kott, Chief Scientist at the U.S. Army Research Laboratory, provided one perspective in a recently published paper titled “Ground Warfare in 2050: How It Might Look.” In it, Kott speculated about “tactical ground warfighting circa 2050, in a major conflict between technologically advanced peer competitors.”

Kott noted that on future battlefields dominated by sensor saturation and long-range precision fires, “Conventional entrenchments and other fortifications will become less effective when teams of intelligent munitions can maneuver into and within a trench or a bunker.” Light dismounted forces “will have limited, if any, protection either from antimissiles or armor (although they may be provided a degree of protection by armor deployed by their robotic helpers…). Instead, they will use cluttered ground terrain to obtain cover and concealment. In addition, they will attempt to distract and deceive…by use of decoys.”

Heavy forces “capable of producing strong lethal effects—substantial in size and mounted on vehicles—will be unlikely to avoid detection, observation, and fires.” To mitigate continuous incoming precision fires, Kott envisions that heavy ground forces will employ a combination of cover and concealment, maneuver, dispersion, decoys, vigorous counter-ISR (intelligence, surveillance, and reconnaissance) attacks, and armor, but will rely primarily “on extensive use of intelligent antimissiles (evolutions of today’s Active Protection Systems [APSs], Counter Rocket, Artillery, and Mortar [C-RAM], Iron Dome, etc.)”

Conversely, Kott does not foresee underground cover and concealment disappearing from future battlefields. “To gain protection from intelligent munitions, extended subterranean tunnels and facilities will become important. This in turn will necessitate the tunnel-digging robotic machines, suitably equipped for battlefield mobility.” Not only will “large static assets such as supply dumps or munitions repair and manufacturing shops” be moved underground, but maneuver forces and field headquarters might conceivably rapidly dig themselves into below-ground fighting positions between operational bounds.

Comparing Force Ratios to Casualty Exchange Ratios

“American Marines in Belleau Wood (1918)” by Georges Scott [Wikipedia]

Comparing Force Ratios to Casualty Exchange Ratios
Christopher A. Lawrence

[The article below is reprinted from the Summer 2009 edition of The International TNDM Newsletter.]

There are three versions of the three-to-one rule (3-to-1 rule) as it applies to casualties, each relating force ratios to casualty exchange ratios. The earliest version of the rule as it relates to casualties that we have been able to find comes from the 1958 version of the U.S. Army Maneuver Control manual, which states: “When opposing forces are in contact, casualties are assessed in inverse ratio to combat power. For friendly forces advancing with a combat power superiority of 5 to 1, losses to friendly forces will be about 1/5 of those suffered by the opposing force.”[1]

The RAND version of the rule (1992) states that: “the famous ‘3:1 rule,’ according to which the attacker and defender suffer equal fractional loss rates at a 3:1 force ratio if the battle is in mixed terrain and the defender enjoys ‘prepared’ defenses…” [2]

Finally, there is a version of the rule, dating from the 1967 Maneuver Control manual, that applies only to armor: at a 3-to-1 force ratio a 1-to-1 casualty exchange ratio, at 4-to-1 a 1-to-2 exchange ratio, and at 5-to-1 a 1-to-3 exchange ratio.

As the RAND construct also applies to equipment losses, this formulation is directly comparable to it.

Therefore, we have three basic versions of the 3-to-1 rule as it applies to casualties and/or equipment losses. First, there is a rule that states that there is an even fractional loss ratio at 3-to-1 (the RAND version). Second, there is a rule that states that at 3-to-1 the attacker will suffer one-third the losses of the defender (the 1958 version). And third, there is a rule that states that at 3-to-1 the attacker and defender will suffer the same losses (the 1967 armor version). These versions are highly contradictory, with the attacker suffering three times the losses of the defender, the same losses as the defender, or one-third the losses of the defender.
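To make the contradiction concrete, here is a minimal sketch (ours, not part of the original article) of the exchange ratio each version predicts at a few force ratios; the closed form used for the 1967 armor version simply interpolates the three data points cited in this article.

```python
# Minimal sketch comparing the exchange ratios (attacker casualties divided
# by defender casualties) predicted by the three versions of the rule.

def rand_version(force_ratio):
    # RAND (1992): equal fractional loss rates, so at 3-to-1 the attacker,
    # being three times as large, suffers three times the absolute losses.
    return force_ratio

def fm_1958_version(force_ratio):
    # FM 105-5 (1958): losses assessed in inverse ratio to combat power,
    # e.g., at 5-to-1 friendly losses are about 1/5 those of the opposing force.
    return 1.0 / force_ratio

def fm_1967_armor_version(force_ratio):
    # FM 105-5 (1967), armor only: 1-to-1 at 3-to-1, 1-to-2 at 4-to-1, and
    # 1-to-3 at 5-to-1; this closed form interpolates those three points.
    return 1.0 / (force_ratio - 2.0)

for fr in (3.0, 4.0, 5.0):
    print(f"{fr:.0f}-to-1: RAND {rand_version(fr):.2f}, "
          f"1958 FM {fm_1958_version(fr):.2f}, "
          f"1967 FM (armor) {fm_1967_armor_version(fr):.2f}")
```

At 3-to-1 this prints 3.00, 0.33, and 1.00, respectively, which is the contradiction in a nutshell.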

Therefore, what we will examine here is the relationship between force ratios and exchange ratios. In this case, we will first look at The Dupuy Institute’s Battles Database (BaDB), which covers 243 battles from 1600 to 1900. We chart on the y-axis the force ratio, measured by a count of the number of people on each side of the forces deployed for battle; the force ratio is the number of attackers divided by the number of defenders. On the x-axis is the exchange ratio, measured by a count of the number of people on each side who were killed, wounded, missing, or captured during that battle (it does not include disease and non-battle injuries); it is calculated by dividing the total attacker casualties by the total defender casualties. The results are provided below:
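(As a hedged illustration of the calculation just described, the sketch below computes and plots the two ratios. The record layout and values are invented stand-ins, not actual BaDB data.)

```python
import matplotlib.pyplot as plt

# Each record: (attacker strength, defender strength,
#               attacker casualties, defender casualties).
# Invented illustrative values; the real BaDB holds 243 battles.
battles = [
    (60000, 40000, 8000, 10000),
    (30000, 25000, 5000, 4000),
    (45000, 15000, 3000, 6000),
]

force_ratios = [att / dfn for att, dfn, _, _ in battles]
exchange_ratios = [att_cas / dfn_cas for _, _, att_cas, dfn_cas in battles]

plt.scatter(exchange_ratios, force_ratios)  # x: exchange ratio, y: force ratio
plt.xlabel("Exchange ratio (attacker casualties / defender casualties)")
plt.ylabel("Force ratio (attackers / defenders)")
plt.title("Force ratio vs. casualty exchange ratio")
plt.show()
```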

As can be seen, there are a few extreme outliers among these 243 data points. The most extreme, the Battle of Tippermuir (1 Sep 1644), in which a Royalist force under Montrose routed an attack by Scottish Covenanter militia, causing about 3,000 casualties to the Scots in exchange for a single (allegedly self-inflicted) casualty to the Royalists, was removed from the chart. This 3,000-to-1 loss ratio was deemed too great an outlier to be of value in the analysis.

As it is, the vast majority of cases are clumped down into the corner of the graph, with only a few scattered data points outside of that clumping. If one did try to establish some form of curvilinear relationship, one would end up drawing a hyperbola. It is worthwhile to look inside that clump of data to see what it shows. Therefore, we will look at the graph truncated so as to show only force ratios at or below 20-to-1 and exchange ratios at or below 20-to-1.

Again, the data remains clustered in one corner, with the outlying data points again pointing to a hyperbola as the only real fitting curvilinear relationship. Let’s look a little deeper into the data by truncating it at 6-to-1 for both force ratios and exchange ratios. As can be seen, if the RAND version of the 3-to-1 rule is correct, then the data should show at a 3-to-1 force ratio a 3-to-1 casualty exchange ratio. There is only one data point that comes close to this out of the 243 points we examined.

If the FM 105-5 version of the rule as it applies to armor is correct, then the data should show that at a 3-to-1 force ratio there is a 1-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 1-to-2 casualty exchange ratio, and at a 5-to-1 force ratio a 1-to-3 casualty exchange ratio. Of course, there is no armor in these pre-World War I engagements, but again, no such exchange pattern appears.

If the 1958 version of the FM 105-5 rule as it applies to casualties is correct, then the data should show that at a 3-to-1 force ratio there is a 0.33-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 0.25-to-1 casualty exchange ratio, and at a 5-to-1 force ratio a 0.20-to-1 casualty exchange ratio. As can be seen, there is not much indication of this pattern, or for that matter any of the three patterns.

Still, such a construct may not be relevant to data from before 1900. For example, Lanchester claimed in 1914, in Chapter V, “The Principle of Concentration,” of his book Aircraft in Warfare, that there is greater advantage to be gained in modern warfare from concentration of fire.[3] Therefore, we will tap our more modern Division-Level Engagement Database (DLEDB) of 675 engagements, of which 628 have force ratios and exchange ratios calculated for them. These 628 cases are then placed on a scattergram to see if we can detect any similar patterns.

Even though this data covers the period from 1904 to 1991, with the vast majority coming from engagements after 1940, one again sees the same pattern as with the data from 1600-1900. If there is a curvilinear relationship, it is again a hyperbola. As before, it is useful to look into the mass of data clustered into the corner by truncating the force and exchange ratios at 20-to-1. This produces the following:

Again, one sees the data clustered in the corner, with any curvilinear relationship again being a hyperbola. A look at the data further truncated to a 10-to-1 force or exchange ratio does not yield anything more revealing.

And, if this data is truncated to show only 5-to-1 force ratio and exchange ratios, one again sees:

Again, this data appears to be mostly just noise, with no clear patterns here that support any of the three constructs. In the case of the RAND version of the 3-to-1 rule, there is again only one data point (out of 628) that is anywhere close to the crossover point (even fractional exchange rate) that RAND postulates. In fact, it almost looks like the data conspires to make sure it leaves a noticeable “hole” at that point. The other postulated versions of the 3-to-1 rules are also given no support in these charts.

Also of note is that the relationship between force ratios and exchange ratios does not appear to change significantly for combat during 1600-1900 when compared to the data from combat during 1904-1991. This does not provide much support for the intellectual construct developed by Lanchester to argue for his N-square law.
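For reference, the standard textbook form of Lanchester’s “modern warfare” equations, from which the N-square law follows (the derivation is supplied here for context, not taken from the article): with attacker strength A and defender strength B attriting each other at rates proportional to the opposing force’s size,

```latex
\frac{dA}{dt} = -\beta B, \qquad \frac{dB}{dt} = -\alpha A
\quad\Longrightarrow\quad
\alpha \left( A_0^2 - A^2 \right) = \beta \left( B_0^2 - B^2 \right)
```

fighting strength thus scales with the square of force size, which is why Lanchester argued that concentration pays. The scattergrams above, showing essentially the same force ratio/exchange ratio relationship before and after 1900, give that construct little empirical support.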

While we can attempt to torture the data to find a better fit, or can try to argue that the patterns are obscured by various factors that have not been considered, we do not believe that such a clear pattern and relationship exists. More advanced mathematical methods may show such a pattern, but to date such attempts have not ferreted out these alleged patterns. For example, we refer the reader to Janice Fain’s article on Lanchester equations, The Dupuy Institute’s Capture Rate Study, Phase I & II, or any number of other studies that have looked at Lanchester.[4]

The fundamental problem is that there does not appear to be a direct cause-and-effect relationship between force ratios and exchange ratios. It appears to be an indirect relationship, in the sense that force ratios are one of several independent variables that determine the outcome of an engagement, and the nature of that outcome helps determine the casualties. As such, there is a more complex set of interrelationships that has not yet been fully explored in any study that we know of, although it is briefly addressed in our Capture Rate Study, Phase I & II.
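To illustrate the distinction, here is a deliberately crude two-stage sketch (ours; the functional forms, weights, and noise term are all invented for illustration and come from no study cited here), in which the force ratio influences a notional engagement outcome, and the outcome in turn drives the exchange ratio.

```python
import random

def outcome_score(force_ratio, surprise=0.0, effectiveness=0.0):
    # Force ratio is only one of several independent variables; the weights
    # and the Gaussian noise standing in for unmodeled factors are invented.
    return (0.4 * force_ratio + 0.3 * surprise + 0.3 * effectiveness
            + random.gauss(0.0, 0.5))

def exchange_ratio(score):
    # Casualties follow from the nature of the outcome, not directly from
    # the force ratio: more successful attacks yield lower attacker losses.
    return max(0.1, 2.0 - 0.5 * score)

for fr in (1.0, 3.0, 5.0):
    s = outcome_score(fr)
    print(f"force ratio {fr:.0f}-to-1 -> outcome {s:.2f}, "
          f"exchange ratio {exchange_ratio(s):.2f}")
```

Because the outcome, not the force ratio, sits next to the casualties in this chain, a scattergram of force ratio against exchange ratio would show exactly the kind of noisy, weakly structured cloud seen in the charts above.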

NOTES

[1] FM 105-5, Maneuver Control (1958), 80.

[2] Patrick Allen, “Situational Force Scoring: Accounting for Combined Arms Effects in Aggregate Combat Models,” (N-3423-NA, The RAND Corporation, Santa Monica, CA, 1992), 20.

[3] F. W. Lanchester, Aircraft in Warfare: The Dawn of the Fourth Arm (Lanchester Press Incorporated, Sunnyvale, Calif., 1995), 46-60. One notes that Lanchester provided no data to support these claims, but relied upon an intellectual argument based upon a gross misunderstanding of ancient warfare.

[4] In particular, see page 73 of Janice B. Fain, “The Lanchester Equations and Historical Warfare: An Analysis of Sixty World War II Land Engagements,” Combat Data Subscription Service (HERO, Arlington, Va., Spring 1975).

Trevor Dupuy and Technological Determinism in Digital Age Warfare

Is this the only innovation in weapons technology in history with the ability in itself to change warfare and alter the balance of power? Trevor Dupuy thought it might be. Shot IVY-MIKE, Eniwetok Atoll, 1 November 1952. [Wikimedia]

Trevor Dupuy was skeptical about the role of technology in determining outcomes in warfare. While he did believe technological innovation was crucial, he did not think that technology itself decided success or failure on the battlefield. As he wrote in a posthumously published 1997 article,

I am a humanist, who is also convinced that technology is as important today in war as it ever was (and it has always been important), and that any national or military leader who neglects military technology does so to his peril and that of his country. But, paradoxically, perhaps to an extent even greater than ever before, the quality of military men is what wins wars and preserves nations. (emphasis added)

His conclusion was largely based upon his quantitative approach to studying military history, particularly the way humans have historically responded to the relentless trend of increasingly lethal military technology.

The Historical Relationship Between Weapon Lethality and Battle Casualty Rates

Based on a 1964 study for the U.S. Army, Dupuy identified a long-term historical relationship between increasing weapon lethality and decreasing average daily casualty rates in battle. (He summarized these findings in his book, The Evolution of Weapons and Warfare (1980). The quotes below are taken from it.)

Since antiquity, military technological development has produced weapons of ever increasing lethality. The rate of increase in lethality has grown particularly dramatically since the mid-19th century.

In contrast, however, the average daily casualty rate in combat has been in decline since 1600. With notable exceptions during the 19th century, casualty rates have continued to fall through the late 20th century. If technological innovation has produced vastly more lethal weapons, why have there been fewer average daily casualties in battle?

The primary cause, Dupuy concluded, was that humans have adapted to increasing weapon lethality by changing the way they fight. He identified three key tactical trends in the modern era that have influenced the relationship between lethality and casualties:

Technological Innovation and Organizational Assimilation

Dupuy noted that the historical correlation between weapons development and their use in combat has not been linear because the pace of integration has been largely determined by military leaders, not the rate of technological innovation. “The process of doctrinal assimilation of new weapons into compatible tactical and organizational systems has proved to be much more significant than invention of a weapon or adoption of a prototype, regardless of the dimensions of the advance in lethality.” [p. 337]

As a result, the history of warfare has been exemplified more often by a discontinuity between weapons and tactical systems than effective continuity.

During most of military history there have been marked and observable imbalances between military efforts and military results, an imbalance particularly manifested by inconclusive battles and high combat casualties. More often than not this imbalance seems to be the result of incompatibility, or incongruence, between the weapons of warfare available and the means and/or tactics employing the weapons. [p. 341]

In short, military organizations typically have not been fully effective at exploiting new weapons technology to advantage on the battlefield. Truly decisive alignment between weapons and systems for their employment has been exceptionally rare. Dupuy asserted that

There have been six important tactical systems in military history in which weapons and tactics were in obvious congruence, and which were able to achieve decisive results at small casualty costs while inflicting disproportionate numbers of casualties. These systems were:

  • the Macedonian system of Alexander the Great, ca. 340 B.C.
  • the Roman system of Scipio and Flaminius, ca. 200 B.C.
  • the Mongol system of Genghis Khan, ca. A.D. 1200
  • the English system of Edward I, Edward III, and Henry V, ca. A.D. 1350
  • the French system of Napoleon, ca. A.D. 1800
  • the German blitzkrieg system, ca. A.D. 1940 [p. 341]

With one caveat, Dupuy could not identify any single weapon that had decisively changed warfare in and of itself without a corresponding human adaptation in its use on the battlefield.

Save for the recent significant exception of strategic nuclear weapons, there have been no historical instances in which new and lethal weapons have, of themselves, altered the conduct of war or the balance of power until they have been incorporated into a new tactical system exploiting their lethality and permitting their coordination with other weapons; the full significance of this one exception is not yet clear, since the changes it has caused in warfare and the influence it has exerted on international relations have yet to be tested in war.

Until the present time, the application of sound, imaginative thinking to the problem of warfare (on either an individual or an institutional basis) has been more significant than any new weapon; such thinking is necessary to real assimilation of weaponry; it can also alter the course of human affairs without new weapons. [p. 340]

Technological Superiority and Offset Strategies

Will new technologies like robotics and artificial intelligence provide the basis for a seventh tactical system where weapons and their use align with decisive battlefield results? Maybe. If Dupuy’s analysis is accurate, however, it is more likely that future increases in weapon lethality will continue to be counterbalanced by human ingenuity in how those weapons are used, yielding indeterminate—perhaps costly and indecisive—battlefield outcomes.

Genuinely effective congruence between weapons and force employment continues to be difficult to achieve. Dupuy believed the preconditions necessary for successful technological assimilation since the mid-19th century have been a combination of conducive military leadership; effective coordination of national economic, technological-scientific, and military resources; and the opportunity to evaluate and analyze battlefield experience.

Can the U.S. meet these preconditions? That certainly seemed to be the goal of the so-called Third Offset Strategy, articulated in 2014 by the Obama administration. It called for maintaining “U.S. military superiority over capable adversaries through the development of novel capabilities and concepts.” Although the Trump administration has stopped using the term, it has made “maximizing lethality” the cornerstone of the 2018 National Defense Strategy, with increased funding for the Defense Department’s modernization priorities in FY2019 (though perhaps not in FY2020).

Dupuy’s original work on weapon lethality in the 1960s coincided with development in the U.S. of what advocates of a “revolution in military affairs” (RMA) have termed the “First Offset Strategy,” which involved the potential use of nuclear weapons to balance Soviet superiority in manpower and materiel. RMA proponents pointed to the lopsided victory of the U.S. and its allies over Iraq in the 1991 Gulf War as proof of the success of a “Second Offset Strategy,” which exploited U.S. precision-guided munitions, stealth, and intelligence, surveillance, and reconnaissance systems developed to counter the Soviet Army in Germany in the 1980s. Dupuy was one of the few to attribute the decisiveness of the Gulf War both to airpower and to the superior effectiveness of U.S. combat forces.

Trevor Dupuy certainly was not an anti-technology Luddite. He recognized the importance of military technological advances and the need to invest in them. But he believed that the human element has always been more important on the battlefield. Most wars in history have been fought without a clear-cut technological advantage for one side; some have been bloody and pointless, while others have been decisive for reasons other than technology. While the future is certainly unknown and past performance is not a guarantor of future results, it would be a gamble to rely on technological superiority alone to provide the margin of success in future warfare.

The Great 3-1 Rule Debate

[This piece was originally posted on 13 July 2016.]

Trevor Dupuy’s article cited in my previous post, “Combat Data and the 3:1 Rule,” was the final salvo in a roaring, multi-year debate between two highly regarded members of the U.S. strategic and security studies academic communities, political scientist John Mearsheimer and military analyst/polymath Joshua Epstein. Carried out primarily in the pages of the academic journal International Security, Epstein and Mearsheimer debated the validity of the 3-1 rule and other analytical models with respect to the NATO/Warsaw Pact military balance in Europe in the 1980s. Epstein cited Dupuy’s empirical research in support of his criticism of Mearsheimer’s reliance on the 3-1 rule. In turn, Mearsheimer questioned Dupuy’s data and conclusions to refute Epstein. Dupuy’s article defended his research and pointed out the errors in Mearsheimer’s assertions. With the publication of Dupuy’s rebuttal, the International Security editors called a time out on the debate thread.

The Epstein/Mearsheimer debate was itself part of a larger political debate over U.S. policy toward the Soviet Union during the administration of Ronald Reagan. This interdisciplinary argument, which has since become legendary in security and strategic studies circles, drew in some of the biggest names in these fields, including Eliot Cohen, Barry Posen, the late Samuel Huntington, and Stephen Biddle. As Jeffery Friedman observed,

These debates played a prominent role in the “renaissance of security studies” because they brought together scholars with different theoretical, methodological, and professional backgrounds to push forward a cohesive line of research that had clear implications for the conduct of contemporary defense policy. Just as importantly, the debate forced scholars to engage broader, fundamental issues. Is “military power” something that can be studied using static measures like force ratios, or does it require a more dynamic analysis? How should analysts evaluate the role of doctrine, or politics, or military strategy in determining the appropriate “balance”? What role should formal modeling play in formulating defense policy? What is the place for empirical analysis, and what are the strengths and limitations of existing data?[1]

It is well worth the time to revisit the contributions to the 1980s debate. I have included a bibliography below that is not exhaustive, but is a place to start. The collapse of the Soviet Union and the end of the Cold War diminished the intensity of the debates, which simmered through the 1990s and then were obscured during the counterterrorism/counterinsurgency conflicts of the post-9/11 era. It is possible that the challenges posed by China and Russia amidst the ongoing “hybrid” conflict in Syria and Iraq may revive interest in interrogating the bases of military analyses in the U.S. and the West. It is a discussion that is long overdue and potentially quite illuminating.

NOTES

[1] Jeffery A. Friedman, “Manpower and Counterinsurgency: Empirical Foundations for Theory and Doctrine,” Security Studies 20 (2011)

BIBLIOGRAPHY

(Note: Some of these are behind paywalls, but some are available in PDF format. Mearsheimer has made many of his publications freely available here.)

John J. Mearsheimer, “Why the Soviets Can’t Win Quickly in Central Europe,” International Security 7, no. 1 (Summer 1982)

Samuel P. Huntington, “Conventional Deterrence and Conventional Retaliation in Europe,” International Security 8, no. 3 (Winter 1983/84)

Joshua Epstein, Strategy and Force Planning (Washington, DC: Brookings, 1987)

Joshua M. Epstein, “Dynamic Analysis and the Conventional Balance in Europe,” International Security 12, no. 4 (Spring 1988)

John J. Mearsheimer, “Numbers, Strategy, and the European Balance,” International Security 12, no. 4 (Spring 1988)

Stephen Biddle, “The European Conventional Balance,” Survival 30, no. 2 (March/April 1988)

Eliot A. Cohen, “Toward Better Net Assessment: Rethinking the European Conventional Balance,” International Security 13, no. 1 (Summer 1988)

Joshua M. Epstein, “The 3:1 Rule, the Adaptive Dynamic Model, and the Future of Security Studies,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, “Assessing the Conventional Balance,” International Security 13, no. 4 (Spring 1989)

John J. Mearsheimer, Barry R. Posen, and Eliot A. Cohen, “Correspondence: Reassessing Net Assessment,” International Security 13, no. 4 (Spring 1989)

Trevor N. Dupuy, “Combat Data and the 3:1 Rule,” International Security 14, no. 1 (Summer 1989)

Stephen Biddle et al., Defense at Low Force Levels (Alexandria, VA: Institute for Defense Analyses, 1991)

Dupuy’s Verities: Initiative

German Army soldiers advance during the Third Battle of Kharkov in early 1943. This was the culmination of a counteroffensive by German Field Marshal Erich von Manstein that blunted the Soviet offensive drive that followed the Soviet victory at Stalingrad. [Photo: KonchitsyaLeto/Reddit]

The fifth of Trevor Dupuy’s Timeless Verities of Combat is:

Initiative permits application of preponderant combat power.

From Understanding War (1987):

The importance of seizing and maintaining the initiative has not declined in our times, nor will it in the future. This has been the secret of success of all of the great captains of history. It was as true of MacArthur as it was of Alexander the Great, Grant or Napoleon. Some modern Soviet theorists have suggested that this is even more important now in an era of high technology than formerly. They may be right. This has certainly been a major factor in the Israeli victories over the Arabs in all of their wars.

Given the prominent role initiative has played in warfare historically, it is curious that it is not a principle of war in its own right. However, it could be argued that it is sufficiently embedded in the principles of the offensive and maneuver that it does not need to be articulated separately. After all, the traditional means of seizing the initiative on the battlefield is through a combination of the offensive and maneuver.

Initiative is a fundamental aspect of current U.S. Army doctrine, as stated in ADP 3-0 Operations (2017):

The central idea of operations is that, as part of a joint force, Army forces seize, retain, and exploit the initiative to gain and maintain a position of relative advantage in sustained land operations to prevent conflict, shape the operational environment, and win our Nation’s wars as part of unified action.

For Dupuy, the specific connection between initiative and combat power is likely why he chose to include initiative as a verity in its own right. Combat power was the central concept in his theory of combat, and initiative was not just the basic means of achieving a preponderance of combat power through superior force strength (i.e. numbers), but also of harnessing the effects of the circumstantial variables of combat that multiply combat power (i.e. surprise, mobility, vulnerability, combat effectiveness). It was precisely the exploitation of this relationship between initiative and combat power that allowed numerically inferior German and Israeli combat forces to succeed time and again against superior numbers of Soviet and Arab opponents.

Using initiative to apply preponderant combat power in battle is the primary way the effects of maneuver (to “gain and maintain a position of relative advantage”) are abstracted in Dupuy’s Quantified Judgement Model (QJM)/Tactical Numerical Deterministic Model (TNDM). The QJM/TNDM itself is primarily a combat attrition adjudicator that determines combat outcomes through calculations of relative combat power. The numerical force strengths of the opposing forces engaged, as determined by maneuver, can easily be input into the QJM/TNDM and then modified by the applicable circumstantial variables of combat related to maneuver to obtain a calculation of relative combat power. As another of Dupuy’s verities states, “superior combat power always wins.”
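A loose illustrative sketch of that relationship (ours, and emphatically not the actual QJM/TNDM implementation; the variable names and multiplier values are invented) follows the form in which Dupuy’s combat power relation is often summarized, P = S × V × CEV, where force strength is multiplied by the circumstantial variables of combat and by a relative combat effectiveness value.

```python
# Illustrative only -- not the actual QJM/TNDM. Combat power is modeled as
# force strength multiplied by circumstantial variable factors and a combat
# effectiveness value (CEV); all multiplier values below are invented.

def combat_power(strength, surprise=1.0, mobility=1.0,
                 vulnerability=1.0, cev=1.0):
    """P = S x V x CEV, with V expanded into a few example factors."""
    return strength * surprise * mobility * vulnerability * cev

# A numerically inferior attacker that seizes the initiative (modeled here
# as favorable surprise and mobility multipliers) can still bring
# preponderant combat power to bear:
attacker = combat_power(strength=100, surprise=1.5, mobility=1.2, cev=1.2)
defender = combat_power(strength=180)
print(f"relative combat power (attacker/defender): {attacker / defender:.2f}")
# Prints 1.20: superior combat power despite inferior numbers.
```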