
Dupuy’s Verities: The Complexities of Combat

“The Battle of Leipzig, 16-19 October 1813” by A.I. Zauerweid (1783-1844) [Wikimedia]
The thirteenth and last of Trevor Dupuy’s Timeless Verities of Combat is:

Combat is too complex to be described in a single, simple aphorism.

From Understanding War (1987):

This is amply demonstrated by the preceding [verities]. All writers on military affairs (including this one) need periodically to remind themselves of this. In military analysis it is often necessary to focus on some particular aspect of combat. However, the results of such closely focused analyses must then be evaluated in the context of the brutal, multifarious, overlapping realities of war.

Trevor Dupuy was sometimes accused of attempting to reduce war to a mathematical equation. A casual reading of his writings might give that impression, but anyone who honestly engages with his ideas quickly finds this to be an erroneous conclusion. Yet, Dupuy believed the temptation to simplify and abstract combat and warfare to be common enough that he embedded a warning against doing so into his basic theory on the subject. He firmly believed that human behavior comprises the most important aspect of combat, yet it is all too easy to lose sight of the human experience of war amid questions of who won or lost and why, and counts of weapons, people, and casualties. As a military historian, he was keenly aware that the human stories behind the numbers—however imperfectly recorded and told—tell us more about the reality of war than mere numbers on their own ever will.

Dupuy’s Verities: Combat Power =/= Firepower

A U.S. 11th Marines 75mm pack howitzer and crew on Guadalcanal, September or October 1942. The lean condition of the crewmembers indicates that they were not getting enough nutrition during this period. [Wikipedia]

The ninth of Trevor Dupuy’s Timeless Verities of Combat is:

Superior Combat Power Always Wins.

From Understanding War (1987):

Military history demonstrates that whenever an outnumbered force was successful, its combat power was greater than that of the loser. All other things being equal, God has always been on the side of the heaviest battalions and always will be.

In recent years two or three surveys of modern historical experience have led to the finding that relative strength is not a conclusive factor in battle outcome. As we have seen, a superficial analysis of historical combat could support this conclusion. There are a number of examples of battles won by the side with inferior numbers. In many battles, outnumbered attackers were successful.

These examples are not meaningful, however, until the comparison includes the circumstances of the battles and opposing forces. If one takes into consideration surprise (when present), relative combat effectiveness of the opponents, terrain features, and the advantage of defensive posture, the result may be different. When all of the circumstances are quantified and applied to the numbers of troops and weapons, the side with the greater combat power on the battlefield is always seen to prevail.

The concept of combat power is foundational to Dupuy’s theory of combat. He did not originate it; the notion that battle encompasses something more than just “physics-based” aspects likely originated with British theorist J.F.C. Fuller during World War I and migrated into U.S. Army thinking via post-war doctrinal revision. Dupuy refined and sharpened the Army’s vague conceptualization of it in the first iterations of his Quantified Judgement Model (QJM) developed in the 1970s.

Dupuy initially defined his idea of combat power in formal terms, as an equation in the QJM:

P = (S x V x CEV)

Where:

P = Combat Power
S = Force Strength
V = Environmental and Operational Variable Factors
CEV = Combat Effectiveness Value

Essentially, combat power is the product of:

  • force strength as measured in his models through the Theoretical/Operational Lethality Index (TLI/OLI), a firepower scoring method for comparing the lethality of weapons relative to each other;
  • the intangible environmental and operational variables that affect each circumstance of combat; and
  • the intangible human behavioral (or moral) factors that determine the fighting quality of a combat force.
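Expressed as code, the relationship is a simple product of a physical term and two multiplicative adjustment factors. The sketch below is a minimal illustration only; the function name and the example factor values are assumptions for demonstration, not values drawn from the QJM's actual tables:

```python
# Minimal sketch of Dupuy's combat power relationship, P = S x V x CEV.
# All names and example factor values here are illustrative assumptions,
# not values from the QJM's tables.

def combat_power(force_strength, variable_factors, combat_effectiveness):
    """Combat power as the product of force strength (an OLI-based score),
    environmental/operational multipliers (V), and a CEV."""
    v = 1.0
    for factor in variable_factors:
        v *= factor  # each circumstantial variable multiplies the whole
    return force_strength * v * combat_effectiveness

# Hypothetical engagement: the defender fields fewer "weapons points" but
# benefits from terrain (1.5), defensive posture (1.3), and a higher CEV (1.2).
attacker = combat_power(100_000, [1.0], 1.0)       # P = 100,000
defender = combat_power(50_000, [1.5, 1.3], 1.2)   # P ≈ 117,000

print(attacker > defender)  # False: the outnumbered defender prevails
```

The point of the multiplicative form is that intangibles compound: a 2-to-1 deficit in raw strength can be more than offset by the combined effect of two or three modest circumstantial multipliers.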

Dupuy’s theory of combat power and its functional realization in his models have two virtues. First, unlike most existing combat models, it incorporates the effects of those intangible factors unique to each engagement or battle that influence combat outcomes, but are not readily measured in physical terms. As Dupuy argued, combat consists of more than duels between weapons systems. A list of those factors can be found below.

Second, the analytical research in real-world combat data done by him and his colleagues allowed him to begin establishing the specific nature of combat processes and their interactions, which are only abstracted in other combat theories and models. Those factors and processes for which he had developed a quantification hypothesis are denoted by an asterisk below.

Dupuy’s Verities: The Inefficiency of Combat

The “Mud March” of the Union Army of the Potomac, January 1863.

The twelfth of Trevor Dupuy’s Timeless Verities of Combat is:

Combat activities are always slower, less productive, and less efficient than anticipated.

From Understanding War (1987):

This is the phenomenon that Clausewitz called “friction in war.” Friction is largely due to the disruptive, suppressive, and dispersal effects of firepower upon an aggregation of people. The pace of actual combat operations will be much slower than the progress of field tests and training exercises, even highly realistic ones. Tests and exercises are not truly realistic portrayals of combat, because they lack the element of fear in a lethal environment, present only in real combat. Allowances must be made in planning and execution for the effects of friction, including mistakes, breakdowns, and confusion.

While Clausewitz asserted that the effects of friction on the battlefield could not be measured because they were largely due to chance, Dupuy believed that its influence could, in fact, be gauged and quantified. He identified at least two distinct combat phenomena he thought reflected measurable effects of friction: the differences in casualty rates between large and small sized forces, and diminishing returns from adding extra combat power beyond a certain point in battle. He also believed much more research would be necessary to fully understand and account for this.

Dupuy was skeptical of the accuracy of combat models that failed to account for this interaction between operational and human factors on the battlefield. He was particularly doubtful about approaches that started by calculating the outcomes of combat between individual small-sized units or weapons platforms based on the Lanchester equations or “physics-based” estimates, then used these as inputs for brigade and division-level-battles, the results of which in turn were used as the basis for determining the consequences of theater-level campaigns. He thought that such models, known as “bottom up,” hierarchical, or aggregated concepts (and the prevailing approach to campaign combat modeling in the U.S.), would be incapable of accurately capturing and simulating the effects of friction.

Dupuy’s Verities: The Effects of Firepower in Combat

A German artillery barrage falling on Allied trenches, probably during the Second Battle of Ypres in 1915, during the First World War. [Wikimedia]

The eleventh of Trevor Dupuy’s Timeless Verities of Combat is:

Firepower kills, disrupts, suppresses, and causes dispersion.

From Understanding War (1987):

It is doubtful if any of the people who are today writing on the effect of technology on warfare would consciously disagree with this statement. Yet, many of them tend to ignore the impact of firepower on dispersion, and as a consequence they have come to believe that the more lethal the firepower, the more deaths, disruption, and suppression it will cause. In fact, as weapons have become more lethal intrinsically, their casualty-causing capability has either declined or remained about the same because of greater dispersion of targets. Personnel and tank loss rates of the 1973 Arab-Israeli War, for example, were quite similar to those of intensive battles of World War II and the casualty rates in both of these wars were less than in World War I. (p. 7)

Research and analysis of real-world historical combat data by Dupuy and TDI has identified at least four distinct combat effects of firepower: infliction of casualties (lethality), disruption, suppression, and dispersion. All of them were found to be heavily influenced—if not determined—by moral (human) factors.

Again, I have written extensively on this blog about Dupuy’s theory about the historical relationship between weapon lethality, dispersion on the battlefield, and historical decline in average daily combat casualty rates. TDI President Chris Lawrence has done further work on the subject as well.

TDI Friday Read: Lethality, Dispersion, And Mass On Future Battlefields

Human Factors In Warfare: Dispersion

Human Factors In Warfare: Suppression

There appears to be a fundamental difference in interpretation of the combat effects of firepower between Dupuy’s emphasis on the primacy of human factors and Defense Department models that account only for the “physics-based” casualty-inflicting capabilities of weapons systems. While U.S. Army combat doctrine accounts for the interaction of firepower and human behavior on the battlefield, it has no clear method for assessing or even fully identifying the effects of such factors on combat outcomes.

A Comment On The Importance Of Reserves In Combat

A German Army A7V near the Somme on March 26, 1918 [forces.net] Operation Michael was the first of a series of German Army offensives on the Western Front in the spring of 1918. In late March, 74 German divisions employing infiltration tactics created a breach in a sector of the line held by the British Army. The Germans advanced up to 40 miles and captured over 75,000 British soldiers, but the ability of the British and French to redeploy reserves via rail halted the offensive in early April short of strategic success.

In response to my previous post on Trevor Dupuy’s verity regarding the importance of depth and reserves for successful defense, a commenter posed the following question: “Is the importance of reserves mainly in its own right, or to mitigate the advantages of attacker surprise?”

The importance of reserves to both attacker and defender is as a hedge against the circumstantial uncertainties of combat. Reserves allow attacking and defending commanders the chance to maintain or regain initiative in response to the outcomes of battle. The side that commits its last reserves before its opponent does concedes the initiative to the enemy, probably irrevocably.

In Trevor Dupuy’s theory of combat, the intrinsic superiority of the defensive posture (as per Clausewitz) is the corollary to the attacker’s inherent advantage in initiative. When combined with the combat multipliers of favorable terrain and prepared positions or fortifications, the combat power of a defending force is greatly enhanced. This permits a defending commander to reap the benefit of economy of force to create reserves. When arrayed in sufficient depth to prevent an attacker from engaging them, reserves grant flexibility of response to the defender. A linear defense or improperly placed reserves concede this benefit to the attacker at the outset, permitting the attacking commander to exploit initiative to mass superior combat power at a decisive point without reserves to interfere.

A defender’s reserves are certainly useful in mitigating attacker surprise, but in Dupuy’s theories and models, surprise is a combat multiplier available to both attacker and defender. As perhaps the most powerful combat multiplier available on the battlefield, surprise in the form of a well-timed counterattack by a defender can devastate an attacking force. Even an unexpected tactical wrinkle by a defender can yield effective surprise.

Dupuy’s Verities: The Requirements For Successful Defense

A Sherman tank of the U.S. Army 9th Armored Division heads into action against the advancing Germans during the Battle of the Bulge. [Warfare History Network]

The eighth of Trevor Dupuy’s Timeless Verities of Combat is:

Successful defense requires depth and reserves.

From Understanding War (1987):

Successful defense requires depth and reserves. It has been asserted that outnumbered military forces cannot afford to withhold valuable firepower from ongoing defensive operations and keep it idle in reserve posture. History demonstrates that this is specious logic, and that linear defense is disastrously vulnerable. Napoleon’s crossing of the Po in his first campaign in 1796 is perhaps the classic demonstration of the fallacy of linear (or cordon) defense.

The defender may have all of his firepower committed to the anticipated operational area, but the attacker’s advantage in having the initiative can always render much of that defensive firepower useless. Anyone who suggests that modern technology will facilitate the shifting of engaged firepower in battle overlooks three considerations: (a) the attacker can inhibit or prevent such movement by both direct and indirect means, (b) a defender engaged in a fruitless firefight against limited attacks by numerically inferior attackers is neither physically nor psychologically attuned to making lateral movements even if the enemy does not prevent or inhibit it, and (c) withdrawal of forces from the line (even if possible) provides an alert attacker with an opportunity for shifting the thrust of his offensive to the newly created gap in the defenses.

Napoleon recognized that hard-fought combat is usually won by the side committing the last reserves. Marengo, Borodino, and Ligny are typical examples of Napoleonic victories that demonstrated the importance of having resources available to tip the scales. His two greatest defeats, Leipzig and Waterloo, were suffered because his enemies still had reserves after his were all committed. The importance of committing the last reserves was demonstrated with particular poignancy at Antietam in the American Civil War. In World War II there is no better example than that of Kursk. [pp. 5-6]

Dupuy’s observations about the need for depth and reserves for a successful defense take on even greater current salience in light of the probable character of the near-future battlefield. Terrain lost by an unsuccessful defense may be extremely difficult to regain under prevailing circumstances.

The interaction of increasing weapon lethality and the operational and human circumstantial variables of combat continue to drive the long-term trend in dispersion of combat forces in frontage and depth.

Long-range precision firepower, ubiquitous battlefield reconnaissance and surveillance, and the effectiveness of cyber and information operations will make massing of forces and operational maneuver risky affairs.

As during the Cold War, the stability of alliances may depend on a willingness to defend forward in the teeth of effective anti-access/area denial (A2/AD) regimes that will make the strategic and operational deployment of reserves risky as well. The successful suppression of A2/AD networks might court a nuclear response, however.

Finding an effective solution for enabling a successful defense-in-depth in the future will be a task of great difficulty.

Dupuy’s Verities: The Advantage Of The Offensive

Union assault on the “Mule Shoe” salient, 12 May 1864, by Thure de Thulstrup (1887) [Wikimedia]

The seventh of Trevor Dupuy’s Timeless Verities of Combat is:

An attacker willing to pay the price can always penetrate the strongest defenses.

From Understanding War (1987):

No matter how alert the defender, no matter how skillful his dispositions to avoid or mitigate the effects of surprise or the effects of flank or rear attack, a skillful attacker can always achieve at least a temporary advantage for some time at a place he has selected. This is one reason why Napoleon always endeavored to seize and retain the initiative. In the great battles of 1864 and 1865 in Virginia, Lee was always able to exploit his defensive advantage to the utmost. But Grant equally was always able to achieve a temporary superiority when and where he wished. This did not always result in a Union victory—given Lee’s defensive skill—but invariably it forced Lee to retreat until he could again impose a temporary stalemate with the assistance of powerful field fortifications. A modern example can be found in the Soviet offensive relieving Leningrad in 1943. Another was the Allied break-out from the Normandy beachhead in July and August of 1944.

The exact meaning of this verity is tricky to determine, as the phrase “willing to pay the price” does a lot of work here. History is certainly replete with examples of Pyrrhic victories, where the cost paid for battlefield success deprived the attacker of any clear benefit. (The U.S. Civil War Battle of Chickamauga in 1863 would be an example in line with Dupuy’s description above.) Perhaps “willing and able to pay the price” would have been a better way of stating this. And, of course, no attack is guaranteed to succeed.

What Dupuy had in mind here is probably best understood in the context of two others of his verities: “Offensive action is essential to positive combat results” and “Initiative permits application of preponderant combat power.” Even if the defensive may be the stronger form of combat, the offensive affords certain inherent potential advantages that can enable attackers to defeat the strongest of defenses if conducted effectively, sufficiently resourced, and determinedly pressed.

Dupuy’s Verities: Fortification

The Maginot Line was a 900-mile long network of underground bunkers, tunnels and concrete retractable gun batteries. Its heaviest defenses were located along the 280-mile long border with Germany. [WikiCommons]

The sixth of Trevor Dupuy’s Timeless Verities of Combat is:

Defenders’ chances of success are directly proportional to fortification strength.

From Understanding War (1987):

To some modern military thinkers this is a truism needing no explanation or justification. Others have asserted that prepared defenses are attractive traps to be avoided at all costs. Such assertions, however, either ignore or misread historical examples. History is so fickle that it is dangerous for historians to use such words as “always” or “never.” Nevertheless I offer a bold counter-assertion: never in history has a defense been weakened by the availability of fortifications; defensive works always enhance combat strength. At the very least, fortifications will delay an attacker and add to his casualties; at best, fortifications will enable the defender to defeat the attacker.

Anyone who suggests that breakthroughs of defensive positions in recent history demonstrate the bankruptcy of defensive posture and/or fortifications is seriously deceiving himself and is misinterpreting modern history. One can cite as historical examples the overcoming of the Maginot Line, the Mannerheim Line, the Siegfried Line, and the Bar Lev Line, and from these examples conclude that these fortifications failed. Such a conclusion is absolutely wrong. It is true that all of these fortifications were overcome, but only because a powerful enemy was willing to make a massive and costly effort. (Of course, the Maginot Line was not attacked frontally in 1940; the Germans were so impressed by its defensive strength that they bypassed it, and were threatening its rear when France surrendered.) All of these fortifications afforded time for the defenders to make new dispositions, to bring up reserves, or to mobilize. All were intended to obstruct, to permit the defenders to punish the attackers and, above all to delay; all were successful in these respects. The Bar Lev Line, furthermore, saved Israel from disastrous defeat, and became the base for a successful offensive.[p. 4]

Will field fortifications continue to enhance the combat power of land forces on future battlefields? This is an interesting question. While the character of existing types of fortifications—trenches, strongpoints, and bunkers—might change, seeking cover and concealment from the earth might become even more important.

Dr. Alexander Kott, Chief Scientist at the U.S. Army Research Laboratory, provided one perspective in a recently published paper titled “Ground Warfare in 2050: How It Might Look.” In it, Kott speculated about “tactical ground warfighting circa 2050, in a major conflict between technologically advanced peer competitors.”

Kott noted that on future battlefields dominated by sensor saturation and long-range precision fires, “Conventional entrenchments and other fortifications will become less effective when teams of intelligent munitions can maneuver into and within a trench or a bunker.” Light dismounted forces “will have limited, if any, protection either from antimissiles or armor (although they may be provided a degree of protection by armor deployed by their robotic helpers… Instead, they will use cluttered ground terrain to obtain cover and concealment. In addition, they will attempt to distract and deceive…by use of decoys.”

Heavy forces “capable of producing strong lethal effects—substantial in size and mounted on vehicles—will be unlikely to avoid detection, observation, and fires.” To mitigate continuous incoming precision fires, Kott envisions that heavy ground forces will employ a combination of cover and concealment, maneuver, dispersion, decoys, vigorous counter-ISR (intelligence, surveillance, and reconnaissance) attacks, and armor, but will rely primarily “on extensive use of intelligent antimissiles (evolutions of today’s Active Protection Systems [APSs], Counter Rocket, Artillery, and Mortar [C-RAM], Iron Dome, etc.)”

Conversely, Kott does not foresee underground cover and concealment disappearing from future battlefields. “To gain protection from intelligent munitions, extended subterranean tunnels and facilities will become important. This in turn will necessitate the tunnel-digging robotic machines, suitably equipped for battlefield mobility.” Not only will “large static assets such as supply dumps or munitions repair and manufacturing shops” be moved underground, but maneuver forces and field headquarters might conceivably rapidly dig themselves into below-ground fighting positions between operational bounds.

Comparing Force Ratios to Casualty Exchange Ratios

“American Marines in Belleau Wood (1918)” by Georges Scott [Wikipedia]

Comparing Force Ratios to Casualty Exchange Ratios
Christopher A. Lawrence

[The article below is reprinted from the Summer 2009 edition of The International TNDM Newsletter.]

There are three versions of force ratio versus casualty exchange ratio rules, such as the three-to-one rule (3-to-1 rule), as it applies to casualties. The earliest version of the rule as it relates to casualties that we have been able to find comes from the 1958 version of the U.S. Army Maneuver Control manual, which states: “When opposing forces are in contact, casualties are assessed in inverse ratio to combat power. For friendly forces advancing with a combat power superiority of 5 to 1, losses to friendly forces will be about 1/5 of those suffered by the opposing force.”[1]

The RAND version of the rule (1992) states that: “the famous ‘3:1 rule’, according to which the attacker and defender suffer equal fractional loss rates at a 3:1 force ratio when the battle is in mixed terrain and the defender enjoys ‘prepared’ defenses…” [2]

Finally, there is a version of the rule, dating from the 1967 Maneuver Control manual, that applies only to armor: at a 3-to-1 force ratio the casualty exchange ratio is 1-to-1, at 4-to-1 it is 1-to-2, and at 5-to-1 it is 1-to-3.

As the RAND construct also applies to equipment losses, this formulation is directly comparable to the RAND construct.

Therefore, we have three basic versions of the 3-to-1 rule as it applies to casualties and/or equipment losses. First, there is a rule that states that there is an even fractional loss ratio at 3-to-1 (the RAND version). Second, there is a rule that states that at 3-to-1, the attacker will suffer one-third the losses of the defender (the 1958 version). And third, there is a rule that states that at 3-to-1, the attacker will suffer the same losses as the defender (the 1967 armor version). These versions are highly contradictory: depending on which rule applies, at 3-to-1 the attacker suffers three times the losses of the defender, the same losses as the defender, or one-third the losses of the defender.
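Reduced to predicted attacker-to-defender casualty exchange ratios as a function of the force ratio, the three versions can be sketched as follows. Treating each rule as a formula is an interpretive simplification for illustration; the function names are mine, and the 1967 armor version is given only at its tabulated points:

```python
# Predicted attacker/defender casualty exchange ratios under the three
# versions of the 3-to-1 rule. Reducing each version to a formula is an
# interpretive simplification, not text from the manuals themselves.

def rand_version(force_ratio):
    # Equal fractional loss rates: each side loses the same fraction of its
    # strength, so the attacker's absolute losses scale with the force ratio.
    return force_ratio

def fm_1958_version(force_ratio):
    # Casualties assessed in inverse ratio to combat power: at 5-to-1,
    # the attacker suffers about 1/5 of the defender's losses.
    return 1.0 / force_ratio

# 1967 armor version, stated only at the points the manual tabulates.
fm_1967_armor = {3.0: 1.0, 4.0: 1.0 / 2.0, 5.0: 1.0 / 3.0}

for fr in (3.0, 4.0, 5.0):
    print(fr, rand_version(fr), fm_1958_version(fr), fm_1967_armor[fr])
```

The spread between the predictions is the point: at the same 3-to-1 force ratio, the rules disagree by a factor of nine on who bleeds more.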

Therefore, what we will examine here is the relationship between force ratios and exchange ratios. In this case, we will first look at The Dupuy Institute’s Battles Database (BaDB), which covers 243 battles from 1600 to 1900. We will chart on the y-axis the force ratio, as measured by a count of the number of people on each side of the forces deployed for battle. The force ratio is the number of attackers divided by the number of defenders. On the x-axis is the exchange ratio, which is measured by a count of the number of people on each side who were killed, wounded, missing, or captured during that battle. It does not include disease and non-battle injuries. Again, it is calculated by dividing the total attacker casualties by the total defender casualties. The results are provided below:

As can be seen, there are a few extreme outliers among these 243 data points. The most extreme, the Battle of Tippermuir (1 September 1644), in which a Royalist force under Montrose routed an attack by Scottish Covenanter militia, causing about 3,000 casualties to the Scots in exchange for a single (allegedly self-inflicted) casualty to the Royalists, was removed from the chart. This 3,000-to-1 loss ratio was deemed too great an outlier to be of value in the analysis.

As it is, the vast majority of cases are clumped down into the corner of the graph, with only a few scattered data points outside of that clumping. If one did try to establish some form of curvilinear relationship, one would end up drawing a hyperbola. It is worthwhile to look inside that clump of data to see what it shows. Therefore, we will look at the graph truncated so as to show only force ratios at or below 20-to-1 and exchange ratios at or below 20-to-1.

Again, the data remains clustered in one corner, with the outlying data points again pointing to a hyperbola as the only real fitting curvilinear relationship. Let’s look a little deeper into the data by truncating it at 6-to-1 for both force ratios and exchange ratios. As can be seen, if the RAND version of the 3-to-1 rule is correct, then the data should show at a 3-to-1 force ratio a 3-to-1 casualty exchange ratio. There is only one data point that comes close to this out of the 243 points we examined.

If the FM 105-5 version of the rule as it applies to armor is correct, then the data should show that at a 3-to-1 force ratio there is a 1-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 1-to-2 casualty exchange ratio, and at a 5-to-1 force ratio a 1-to-3 casualty exchange ratio. Of course, there is no armor in these pre-World War I engagements, but again, no such exchange pattern appears.

If the 1958 version of the FM 105-5 rule as it applies to casualties is correct, then the data should show that at a 3-to-1 force ratio there is a 0.33-to-1 casualty exchange ratio, at a 4-to-1 force ratio a 0.25-to-1 casualty exchange ratio, and at a 5-to-1 force ratio a 0.20-to-1 casualty exchange ratio. As can be seen, there is not much indication of this pattern, or for that matter any of the three patterns.

Still, such a construct may not be relevant to data before 1900. For example, Lanchester claimed in 1914, in Chapter V, “The Principle of Concentration,” of his book Aircraft in Warfare, that there is greater advantage to be gained in modern warfare from concentration of fire.[3] Therefore, we will tap our more modern Division-Level Engagement Database (DLEDB) of 675 engagements, of which 628 have force ratios and exchange ratios calculated for them. These 628 cases are then placed on a scattergram to see if we can detect any similar patterns.
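Lanchester's concentration argument rests on his "square law" for aimed fire, dA/dt = -b·B, dB/dt = -a·A. A short numerical sketch of it follows; the attrition coefficients and force sizes are arbitrary illustrative values, and simple Euler stepping is used rather than the closed-form solution:

```python
# Euler-step integration of Lanchester's square-law (aimed fire) equations:
#   dA/dt = -b * B,   dB/dt = -a * A
# The attrition coefficients and strengths are arbitrary illustrative values.

def lanchester_square(a0, b0, a_eff, b_eff, dt=0.01):
    """Fight until one side is annihilated; return surviving strengths."""
    A, B = float(a0), float(b0)
    while A > 0 and B > 0:
        A, B = A - b_eff * B * dt, B - a_eff * A * dt
    return max(A, 0.0), max(B, 0.0)

# Equal per-unit effectiveness; side A has a 2-to-1 numerical advantage.
survivors_a, survivors_b = lanchester_square(2000, 1000, a_eff=0.5, b_eff=0.5)

# The analytic square law predicts A's survivors at sqrt(2000^2 - 1000^2),
# about 1,732: A annihilates B while losing only ~270 of its own.
print(round(survivors_a), round(survivors_b))
```

This is the "N-square" effect referred to above: under aimed fire, fighting strength scales with the square of numbers, which is why Lanchester argued for concentration, and why the historical data's indifference to force ratios counts as evidence against the construct.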

Even though this data covers from 1904 to 1991, with the vast majority of the data coming from engagements after 1940, one again sees the same pattern as with the data from 1600-1900. If there is a curvilinear relationship, it is again a hyperbola. As before, it is useful to look into the mass of data clustered into the corner by truncating the force and exchange ratios at 20-to-1. This produces the following:

Again, one sees the data clustered in the corner, with any curvilinear relationship again being a hyperbola. A look at the data further truncated to a 10-to-1 force or exchange ratio does not yield anything more revealing.

And, if this data is truncated to show only 5-to-1 force ratio and exchange ratios, one again sees:

Again, this data appears to be mostly just noise, with no clear patterns here that support any of the three constructs. In the case of the RAND version of the 3-to-1 rule, there is again only one data point (out of 628) that is anywhere close to the crossover point (even fractional exchange rate) that RAND postulates. In fact, it almost looks like the data conspires to make sure it leaves a noticeable “hole” at that point. The other postulated versions of the 3-to-1 rules are also given no support in these charts.

Also of note is that the relationship between force ratios and exchange ratios does not appear to change significantly for combat during 1600-1900 when compared to the data from combat from 1904-1991. This does not provide much support for the intellectual construct developed by Lanchester to argue for his N-square law.

While we can attempt to torture the data to find a better fit, or can try to argue that the patterns are obscured by various factors that have not been considered, we do not believe that such a clear pattern and relationship exists. More advanced mathematical methods may show such a pattern, but to date such attempts have not ferreted out these alleged patterns. For example, we refer the reader to Janice Fain’s article on Lanchester equations, The Dupuy Institute’s Capture Rate Study, Phase I & II, or any number of other studies that have looked at Lanchester.[4]

The fundamental problem is that there does not appear to be a direct cause-and-effect relationship between force ratios and exchange ratios. It appears to be an indirect relationship, in the sense that force ratios are one of several independent variables that determine the outcome of an engagement, and the nature of that outcome helps determine the casualties. As such, there is a more complex set of interrelationships that have not yet been fully explored in any study that we know of, although it is briefly addressed in our Capture Rate Study, Phase I & II.

NOTES

[1] FM 105-5, Maneuver Control (1958), 80.

[2] Patrick Allen, “Situational Force Scoring: Accounting for Combined Arms Effects in Aggregate Combat Models,” (N-3423-NA, The RAND Corporation, Santa Monica, CA, 1992), 20.

[3] F. W. Lanchester, Aircraft in Warfare: The Dawn of the Fourth Arm (Lanchester Press Incorporated, Sunnyvale, Calif., 1995), 46-60. One notes that Lanchester provided no data to support these claims, but relied upon an intellectual argument based upon a gross misunderstanding of ancient warfare.

[4] In particular, see page 73 of Janice B. Fain, “The Lanchester Equations and Historical Warfare: An Analysis of Sixty World War II Land Engagements,” Combat Data Subscription Service (HERO, Arlington, Va., Spring 1975).

Trevor Dupuy and Technological Determinism in Digital Age Warfare

Is this the only innovation in weapons technology in history with the ability in itself to change warfare and alter the balance of power? Trevor Dupuy thought it might be. Shot IVY-MIKE, Eniwetok Atoll, 1 November 1952. [Wikimedia]

Trevor Dupuy was skeptical about the role of technology in determining outcomes in warfare. While he believed technological innovation was crucial, he did not think that technology by itself has ever decided success or failure on the battlefield. As he wrote in a work published posthumously in 1997,

I am a humanist, who is also convinced that technology is as important today in war as it ever was (and it has always been important), and that any national or military leader who neglects military technology does so to his peril and that of his country. But, paradoxically, perhaps to an extent even greater than ever before, the quality of military men is what wins wars and preserves nations. (emphasis added)

His conclusion was largely based upon his quantitative approach to studying military history, particularly the way humans have historically responded to the relentless trend of increasingly lethal military technology.

The Historical Relationship Between Weapon Lethality and Battle Casualty Rates

Based on a 1964 study for the U.S. Army, Dupuy identified a long-term historical relationship between increasing weapon lethality and decreasing average daily casualty rates in battle. (He summarized these findings in his book, The Evolution of Weapons and Warfare (1980). The quotes below are taken from it.)

Since antiquity, military technological development has produced weapons of ever increasing lethality. The rate of increase in lethality has grown particularly dramatically since the mid-19th century.

In contrast, however, the average daily casualty rate in combat has been declining since 1600. With notable exceptions during the 19th century, casualty rates have continued to fall through the late 20th century. If technological innovation has produced vastly more lethal weapons, why have there been fewer average daily casualties in battle?

The primary cause, Dupuy concluded, was that humans have adapted to increasing weapon lethality by changing the way they fight. He identified three key tactical trends in the modern era that have influenced the relationship between lethality and casualties:

Technological Innovation and Organizational Assimilation

Dupuy noted that the historical correlation between weapons development and their use in combat has not been linear because the pace of integration has been largely determined by military leaders, not the rate of technological innovation. “The process of doctrinal assimilation of new weapons into compatible tactical and organizational systems has proved to be much more significant than invention of a weapon or adoption of a prototype, regardless of the dimensions of the advance in lethality.” [p. 337]

As a result, the history of warfare has more often been characterized by discontinuity between weapons and tactical systems than by effective congruence between them.

During most of military history there have been marked and observable imbalances between military efforts and military results, an imbalance particularly manifested by inconclusive battles and high combat casualties. More often than not this imbalance seems to be the result of incompatibility, or incongruence, between the weapons of warfare available and the means and/or tactics employing the weapons. [p. 341]

In short, military organizations typically have not been fully effective at exploiting new weapons technology to advantage on the battlefield. Truly decisive alignment between weapons and systems for their employment has been exceptionally rare. Dupuy asserted that

There have been six important tactical systems in military history in which weapons and tactics were in obvious congruence, and which were able to achieve decisive results at small casualty costs while inflicting disproportionate numbers of casualties. These systems were:

  • the Macedonian system of Alexander the Great, ca. 340 B.C.
  • the Roman system of Scipio and Flaminius, ca. 200 B.C.
  • the Mongol system of Genghis Khan, ca. A.D. 1200
  • the English system of Edward I, Edward III, and Henry V, ca. A.D. 1350
  • the French system of Napoleon, ca. A.D. 1800
  • the German blitzkrieg system, ca. A.D. 1940 [p. 341]

With one caveat, Dupuy could not identify any single weapon that had decisively changed warfare in and of itself without a corresponding human adaptation in its use on the battlefield.

Save for the recent significant exception of strategic nuclear weapons, there have been no historical instances in which new and lethal weapons have, of themselves, altered the conduct of war or the balance of power until they have been incorporated into a new tactical system exploiting their lethality and permitting their coordination with other weapons; the full significance of this one exception is not yet clear, since the changes it has caused in warfare and the influence it has exerted on international relations have yet to be tested in war.

Until the present time, the application of sound, imaginative thinking to the problem of warfare (on either an individual or an institutional basis) has been more significant than any new weapon; such thinking is necessary to real assimilation of weaponry; it can also alter the course of human affairs without new weapons. [p. 340]

Technological Superiority and Offset Strategies

Will new technologies like robotics and artificial intelligence provide the basis for a seventh tactical system where weapons and their use align with decisive battlefield results? Maybe. If Dupuy’s analysis is accurate, however, it is more likely that future increases in weapon lethality will continue to be counterbalanced by human ingenuity in how those weapons are used, yielding indeterminate—perhaps costly and indecisive—battlefield outcomes.

Genuinely effective congruence between weapons and force employment continues to be difficult to achieve. Dupuy believed the preconditions necessary for successful technological assimilation since the mid-19th century have been a combination of conducive military leadership; effective coordination of national economic, technological-scientific, and military resources; and the opportunity to evaluate and analyze battlefield experience.

Can the U.S. meet these preconditions? That certainly seemed to be the goal of the so-called Third Offset Strategy, articulated in 2014 by the Obama administration. It called for maintaining “U.S. military superiority over capable adversaries through the development of novel capabilities and concepts.” Although the Trump administration has stopped using the term, it has made “maximizing lethality” the cornerstone of the 2018 National Defense Strategy, with increased funding for the Defense Department’s modernization priorities in FY2019 (though perhaps not in FY2020).

Dupuy’s original work on weapon lethality in the 1960s coincided with development in the U.S. of what advocates of a “revolution in military affairs” (RMA) have termed the “First Offset Strategy,” which involved the potential use of nuclear weapons to balance Soviet superiority in manpower and materiel. RMA proponents pointed to the lopsided victory of the U.S. and its allies over Iraq in the 1991 Gulf War as proof of the success of a “Second Offset Strategy,” which exploited U.S. precision-guided munitions, stealth, and intelligence, surveillance, and reconnaissance systems developed to counter the Soviet Army in Germany in the 1980s. Dupuy was one of the few to attribute the decisiveness of the Gulf War both to airpower and to the superior effectiveness of U.S. combat forces.

Trevor Dupuy certainly was not an anti-technology Luddite. He recognized the importance of military technological advances and the need to invest in them. But he believed that the human element has always been more important on the battlefield. Most wars in history have been fought without a clear-cut technological advantage for one side; some have been bloody and pointless, while others have been decisive for reasons other than technology. While the future is certainly unknown and past performance is not a guarantor of future results, it would be a gamble to rely on technological superiority alone to provide the margin of success in future warfare.