The Evolution of Weapons and Warfare

U.S. Army Mobile Protected Firepower (MPF) Program Update

BAE Systems has submitted its proposal to the U.S. Army to build and test the Mobile Protected Firepower (MPF) vehicle [BAE Systems/Fox News]

When we last checked in with the U.S. Army’s Mobile Protected Firepower (MPF) program—an effort to quickly field a new light tank, a lightweight armored vehicle with a long-range direct fire capability—Requests for Proposals (RFPs) were expected by November 2017 and the first samples by April 2018. It now appears the first MPF prototypes will not be delivered before mid-2020 at the earliest.

According to a recent report by Kris Osborn on Warrior Maven, “The service expects to award two Engineering Manufacturing and Development (EMD) deals by 2019 as part of an initial step to building prototypes from multiple vendors, service officials said. [An] Army statement said initial prototypes are expected within 14 months of a contract award.”

Part of the delay appears to stem from uncertainty about requirements. As Osborn reported, “For the Army, the [MPF] effort involves what could be described as a dual-pronged acquisition strategy in that it seeks to leverage currently available or fast emerging technology while engineering the vehicle with an architecture such that it can integrate new weapons and systems as they emerge over time.”

Among the technologies the Army will seek to integrate into the MPF are a lightweight, heavy caliber main gun, lightweight armor composites, active protection systems, a new generation of higher-resolution targeting sensors, greater computer automation, and artificial intelligence.

Osborn noted that

the Army’s Communications Electronics Research, Development and Engineering Center (CERDEC) is already building prototype sensors – with this in mind. In particular, this early work is part of a longer-range effort to inform the Army’s emerging Next-Generation Combat Vehicle (NGCV). The NGCV, expected to become an entire fleet of armored vehicles, is now being explored as something to emerge in the late 2020s or early 2030s.

These evolving requirements are already impacting the Army’s approach to fielding MPF. It originally intended to “do acquisition differently to deliver capability quickly.” MPF program director Major General David Bassett declared in October 2017, “We expect to be delivering prototypes off of that program effort within 15 months of contract award…and getting it in the hands of an evaluation unit six months after that — rapid!”

It is now clear the Army won’t be meeting that schedule after all. Stay tuned.

“Quantity Has A Quality All Its Own”: How Robot Swarms Might Change Future Combat

Humans vs. machines in the film Matrix Revolutions (2003) [Screencap by The Matrix Wiki]

Yesterday, Paul Scharre, director of the Technology and National Security Program at the Center for a New American Security, and prolific writer on the future of robotics and artificial intelligence, posted a fascinating argument on Twitter regarding swarms and mass in future combat.

His thread was in response to an article by Shmuel Shmuel posted on War on the Rocks, which made the case that the same computer processing technology enabling robotic vehicles combined with old fashioned kinetic weapons (i.e. anti-aircraft guns) offered a cost-effective solution to swarms.

Scharre agreed that robotic drones are indeed vulnerable to such countermeasures, but made this point in response:

He then went on to contend that robotic swarms offer the potential to reestablish the role of mass in future combat. Mass, either in terms of numbers of combatants or volume of firepower, has played a decisive role in most wars. As the aphorism goes, usually credited to Josef Stalin, “quantity has a quality all its own.”

Scharre observed that the United States went in a different direction in its post-World War II approach to warfare, adopting instead “offset” strategies that sought to leverage superior technology to balance against the mass militaries of the Communist bloc.

While effective during the Cold War, Scharre concurs with the arguments that offset strategies are becoming far too expensive and may ultimately become self-defeating.

In order to avoid this fate, Scharre contends that

The entire thread is well worth reading.

Trevor Dupuy would have agreed with much of what Scharre asserts. He identified a relationship between increasing weapon lethality and battlefield dispersion that goes back to the 17th century. Dupuy believed that the primary factor driving this relationship was the human response to fear in a lethal environment, with soldiers dispersing in depth and frontage on battlefields in order to survive weapons of ever-increasing destructiveness.

TDI Friday Read: Lethality, Dispersion, And Mass On Future Battlefields

Robots might very well change that equation. Whether autonomous or “human in the loop,” robotic swarms do not feel fear and are inherently expendable. Cheaply produced robots might very well provide sufficient augmentation to human combat units to restore the primacy of mass in future warfare.

Is The End Of Stealth Nigh?

Lockheed Martin F-22 Raptor [Creative Commons]

Michael Peck made an interesting catch over at The National Interest. The Defense Advanced Research Projects Agency (DARPA) is soliciting input on potentially disruptive technologies for future warfare. With regard to air warfare, the solicitation baldly states, “Platform stealth may be approaching physical limits.” This led Peck to ask, “Did the Pentagon just admit that stealth technology may not work anymore?”

A couple of years ago, a media report that the Chinese had claimed a technological breakthrough in stealth-busting quantum radar capabilities led me to muse about the possible repercussions on U.S. military capabilities. This was during the height of the technology-rooted Third Offset Strategy mania. It seemed to me at the time that concentrating on technological solutions to the U.S.’s strategic challenges might not be the wisest course of action.

The notion that stealth might be a wasting asset seemed somewhat far-fetched when I wrote that, but it appears to have become a much more serious concern. As the DARPA solicitation states, “Our acquisition system is finding it difficult to respond on relevant timescales to adversary progress, which has made the search for next generation capabilities at once more urgent and more futile.” (p. 5)

Er, yikes.

TDI Friday Read: Lethality, Dispersion, And Mass On Future Battlefields

Armies have historically responded to the increasing lethality of weapons by dispersing mass in frontage and depth on the battlefield. Will combat see a new period of adjustment over the next 50 years like the previous half-century, where dispersion continues to shift in direct proportion to increased weapon range and precision, or will there be a significant change in the character of warfare?

One point of departure for such an inquiry could be the work of TDI President Chris Lawrence, who looked into the nature of historical rates of dispersion in combat from 1600 to 1991.

The Effects Of Dispersion On Combat

As he explained,

I am focusing on this because I really want to come up with some means of measuring the effects of a “revolution in warfare.” The last 400 years of human history have given us more revolutionary inventions impacting war than we can reasonably expect to see in the next 100 years. In particular, I would like to measure the impact of increased weapon accuracy, improved intelligence, and improved C2 on combat.

His tentative conclusions were:

  1. Dispersion has been relatively constant and driven by factors other than firepower from 1600-1815.
  2. Since the Napoleonic Wars, units have increasingly dispersed (found ways to reduce their chance to be hit) in response to increased lethality of weapons.
  3. As a result of this increased dispersion, casualties in a given space have declined.
  4. The ratio of this decline in casualties over area has been roughly proportional to the strength over an area from 1600 through WWI. Starting with WWII, it appears that people have dispersed faster than weapons lethality, and this trend has continued.
  5. In effect, people dispersed in direct relation to increased firepower from 1815 through 1920, and then after that time dispersed faster than the increase in lethality.
  6. It appears that since WWII, people have gone back to dispersing (reducing their chance to be hit) at the same rate that firepower is increasing.
  7. Effectively, there are four patterns of casualties in modern war:

Period 1 (1600 – 1815): Period of Stability

  • Short battles
  • Short frontages
  • High attrition per day
  • Constant dispersion
  • Dispersion decreasing slightly after late 1700s
  • Attrition decreasing slightly after mid-1700s.

Period 2 (1816 – 1905): Period of Adjustment

  • Longer battles
  • Longer frontages
  • Lower attrition per day
  • Increasing dispersion
  • Dispersion increasing slightly faster than lethality

Period 3 (1912 – 1920): Period of Transition

  • Long battles
  • Continuous frontages
  • Lower attrition per day
  • Increasing dispersion
  • Relative lethality per kilometer similar to past, but lower
  • Dispersion increasing slightly faster than lethality

Period 4 (1937 – present): Modern Warfare

  • Long battles
  • Continuous frontages
  • Low attrition per day
  • High dispersion (perhaps constant?)
  • Relative lethality per kilometer much lower than in the past
  • Dispersion increased much faster than lethality going into the period.
  • Dispersion increased at the same rate as lethality within the period.

Chris based his study on previous work done by Trevor Dupuy and his associates, which established a pattern in historical combat between lethality, dispersion, and battlefield casualty rates.

Trevor Dupuy and Historical Trends Related to Weapon Lethality

What Is The Relationship Between Rate of Fire and Military Effectiveness?

Human Factors In Warfare: Dispersion

There is no way to accurately predict the future relationship between weapon lethality and dispersion on the battlefield, but we should question whether current conceptions of combat reflect consideration of the historical trends.

Attrition In Future Land Combat

The Principle Of Mass On The Future Battlefield

Recent Developments In “Game Changing” Precision Fires Technology

Nammo’s new 155mm Solid Fuel Ramjet projectile [The Drive]

From the “Build A Better Mousetrap” files come a couple of new developments in precision fires technology. The U.S. Army’s current top modernization priority is improving its long-range precision fires capabilities.

Joseph Trevithick reports in The Drive that Nammo, a Norwegian/Finnish aerospace and defense company, recently revealed that it is developing a solid-fueled, ramjet-powered, precision projectile capable of being fired from the ubiquitous 155mm howitzer. The projectile, which is scheduled for live-fire testing in 2019 or 2020, will have a range of more than 60 miles.

The Army’s current self-propelled and towed 155mm howitzers have a range of 12 miles using standard ammunition, and up to 20 miles with rocket-powered munitions. Nammo’s ramjet projectile could effectively triple that, but the Army is also looking into developing a new 155mm howitzer with a longer barrel that could fully exploit the capabilities of Nammo’s ramjet shell and other new long-range precision munitions under development.

Anna Ahronheim has a story in The Jerusalem Post about a new weapon developed by the Israeli Rafael Advanced Defense Systems Ltd. called the FireFly. FireFly is a small, three-kilogram, loitering munition designed for use by light ground maneuver forces to deliver precision fires against enemy forces in cover. Similar to a drone, FireFly can hover for up to 15 minutes before delivery.

In a statement, Rafael claimed that “Firefly will essentially eliminate the value of cover and with it, the necessity of long-drawn-out firefights. It will also make obsolete the old infantry tactic of firing and maneuvering to eliminate an enemy hiding behind cover.”

Nammo and Rafael have very high hopes for their wares:

“This [155mm Solid Fuel Ramjet] could be a game-changer for artillery,” according to Thomas Danbolt, Vice President of Nammo’s Large Caliber Ammunitions division.

“The impact of FireFly on the infantry is revolutionary, fundamentally changing small infantry tactics,” Rafael has asserted.

Expansive claims for the impact of new technology are not new, of course. Orbital ATK touted its XM25 Counter Defilade Target Engagement (CDTE) precision-guided grenade launcher along familiar lines, claiming that “The introduction of the XM25 is akin to other revolutionary systems such as the machine gun, the airplane and the tank, all of which changed battlefield tactics.”

The Army cancelled its contract for the XM25, a weapon similar in intended battlefield effect to the FireFly, in 2017 after disappointing results in field tests.

UPDATE: For clarity’s sake, let me re-up my contrarian take:

Will This Weapon Change Infantry Warfare Forever? Maybe, But Probably Not

U.S. Army Invests In Revitalizing Long Range Precision Fires Capabilities

U.S. Marines from the 11th MEU fire their M777 Lightweight 155mm Howitzer during Exercise Alligator Dagger, Dec. 18, 2016. (U.S. Marine Corps/Lance Cpl. Zachery C. Laning/Military.com)

In 2016, Michael Jacobson and Robert H. Scales amplified a warning that after years of neglect during the counterinsurgency war in Iraq and Afghanistan, the U.S. was falling behind potential adversaries in artillery and long range precision fires capabilities. The U.S. Army had already taken note of the performance of Russian artillery in Ukraine, particularly the strike at Zelenopillya in 2014.

Since then, the U.S. Army and Marine Corps have started working on a new Multi-Domain Battle concept aimed at countering the anti-access/area denial (A2/AD) capabilities of potential foes. In 2017, U.S. Army Chief of Staff General Mark Milley made rapid improvement in long range precision fires capabilities the top priority for the service’s modernization effort. It currently aims to field new field artillery, rocket, and missile weapons capable of striking at distances from 70 to 500 kilometers – double the existing ranges – within five years.

The value of ground-based long-range precision fires has been demonstrated recently by the effectiveness of U.S. artillery support, particularly U.S. Army and Marine Corps 155mm howitzers, for Iraqi security forces retaking Mosul, for Syrian Democratic Forces assaulting Raqqa, and in the protection of Syrian Kurds attacked by Russian mercenaries and Syrian regime forces.

According to Army historian Luke O’Brian, the Fiscal Year 2019 defense budget includes funds to buy 28,737 XM1156 Precision Guided Kit (PGK) 155mm howitzer munitions, which includes replacements for the 6,269 rounds expended during Operation INHERENT RESOLVE. O’Brian also notes that the Army will buy 2,162 M982 Excalibur 155mm rounds in 2019 and several hundred each in following years.

In addition, in an effort to reduce the dependence on potentially vulnerable Global Positioning System (GPS) satellite networks for precision fires capabilities, the Army has awarded a contract to BAE Systems to develop Precision Guided Kit-Modernization (PGK-M) rounds with internal navigational capacity.

While the numbers appear large at first glance, data on U.S. artillery expenditures in Operation DESERT STORM and IRAQI FREEDOM (also via Luke O’Brian) shows just how much the volume of long-range fires has changed just since 1991. For the U.S. at least, precision fires have indeed replaced mass fires on the battlefield.

U.S. Army Swarm Offensives In Future Combat

For a while now, military pundits have speculated about the role robotic drones and swarm tactics will play in future warfare. U.S. Army Captain Jules Hurst recently took a first crack at adapting drones and swarms into existing doctrine in an article in Joint Forces Quarterly. In order to move beyond the abstract, Hurst looked at how drone swarms “should be inserted into the tactical concepts of today—chiefly, the five forms of offensive maneuver recognized under Army doctrine.”

Hurst pointed out that while drone design currently remains in flux, “for assessment purposes, future swarm combatants will likely be severable into two broad categories: fire support swarms and maneuver swarms.”

In Hurst’s reckoning, the chief advantage of fire support swarms would be their capacity to overwhelm current air defense systems and deliver either human-targeted or semi-autonomous precision fires. The long endurance of airborne drones also confers an ability to take and hold terrain that current manned systems do not possess.

The primary benefits of ground maneuver swarms, according to Hurst, would be their immunity from the human element of fear, giving them a resilient, persistent level of combat effectiveness. Their ability to collect real-time battlefield intelligence makes them ideal for enabling modern maneuver warfare concepts.

Hurst examines how these capabilities could be exploited through each of the Army’s current schemes of maneuver: infiltration, penetration, frontal attack, envelopment, and the turning maneuver. While concluding that “ultimately, the technological limitations and advantages of maneuver swarms and fire support swarms will determine their uses,” Hurst acknowledged the critical role Army institutional leadership must play in order to successfully utilize the new technology on the battlefield.

U.S. officers and noncommissioned officers can accelerate that comfort [with new weapons] by beginning to postulate about the use of swarms well before they hit the battlefield. In the vein of aviation visionaries Billy Mitchell and Giulio Douhet, members of the Department of Defense must look forward 10, 20, or even 30 years to when artificial intelligence allows the deployment of swarm combatants on a regular basis. It will take years of field maneuvers to perfect the employment of swarms in combat, and the concepts formed during these exercises may be shattered during the first few hours of war. Even so, the U.S. warfighting community must adopt a venture capital mindset and accept many failures for the few novel ideas that may produce game-changing results.

Trevor Dupuy would have agreed. He argued that the crucial factor in military innovation was not technology, but the organizational approach to using it. Based on his assessment of historical patterns, Dupuy derived a set of preconditions necessary for the successful assimilation of new technology into warfare.

  1. An imaginative, knowledgeable leadership focused on military affairs, supported by extensive knowledge of, and competence in, the nature and background of the existing military system.
  2. Effective coordination of the nation’s economic, technological-scientific, and military resources.
    1. There must exist industrial or developmental research institutions, basic research institutions, military staffs and their supporting institutions, together with administrative arrangements for linking these with one another and with top decision-making echelons of government.
    2. These bodies must conduct their research, developmental, and testing activities according to mutually familiar methods so that their personnel can communicate, can be mutually supporting, and can evaluate each other’s results.
    3. The efforts of these institutions—in related matters—must be directed toward a common goal.
  3. Opportunity for battlefield experimentation as a basis for evaluation and analysis.

Is the U.S. Army up to the task?

The Effects Of Dispersion On Combat

[The article below is reprinted from the December 1996 edition of The International TNDM Newsletter. A revised version appears in Christopher A. Lawrence, War by Numbers: Understanding Conventional Combat (Potomac Books, 2017), Chapter 13.]

The Effects of Dispersion on Combat
by Christopher A. Lawrence

The TNDM[1] does not play dispersion. But it is clear that dispersion has continued to increase over time, and this must have some effect on combat. This effect was identified by Trevor N. Dupuy in his various writings, starting with The Evolution of Weapons and Warfare. His graph in Understanding War of battle casualty trends over time is presented here as Figure 1. As dispersion changes over time (dramatically), one would expect the casualties to change over time. I therefore went back to the Land Warfare Database (the 605-engagement version[2]) and proceeded to look at casualties over time and dispersion from every angle that I could.

I eventually realized that I was going to need a better definition of the time periods I was measuring, as measuring by year scattered the data, measuring by century assembled the data in too gross a manner, and measuring by war left a confusing picture due to the number of small wars with only two or three battles in them in the Land Warfare Database. I eventually sorted the wars into 14 categories, so I could fit them onto one readable graph:

To give some idea of how representative the battles listed in the LWDB were for covering the period, I have included a count of the number of battles listed in Michael Clodfelter’s two-volume book Warfare and Armed Conflict, 1618-1991. In the case of WWI, WWII and later, battles tend to be defined as a divisional-level engagement, and there were literally tens of thousands of those.

I then tested my data again looking at the 14 wars that I defined:

  • Average Strength by War (Figure 2)
  • Average Losses by War (Figure 3)
  • Percent Losses Per Day By War (Figure 4)
  • Average People Per Kilometer By War (Figure 5)
  • Losses per Kilometer of Front by War (Figure 6)
  • Strength and Losses Per Kilometer of Front By War (Figure 7)
  • Ratio of Strength and Losses per Kilometer of Front by War (Figure 8)
  • Ratio of Strength and Losses per Kilometer of Front by Century (Figure 9)

A review of average strengths over time by century and by war showed no surprises (see Figure 2). Up through around 1900, battles were easy to define: they were one- to three-day affairs between clearly defined forces at a locale. The forces had a clear left flank and right flank that was not bounded by other friendly forces. After 1900 (and in a few cases before), warfare was fought on continuous fronts, with a “battle” often being a large multi-corps operation. It is no longer clearly understood what is meant by a battle, as the forces, area covered, and duration can vary widely. For the LWDB, each battle was defined as the analyst wished. In the case of WWI, there are a lot of very large battles which drive the average battle size up. In the case of WWII, there are a lot of division-level battles, which bring the average down. In the case of the Arab-Israeli Wars, there are nothing but division- and brigade-level battles, which bring the average down.

The interesting point to notice is that the average attacker strength in the 16th and 17th centuries is lower than the average defender strength. Later it is higher. This may be due to anomalies in our data selection.

Average losses by war (see Figure 3) suffer from the same battle-definition problem.

Percent losses per day (see Figure 4) is a useful comparison through the end of the 19th century. After that, the battles get longer and the definition of the duration of a battle is up to the analyst. Note the very clear and definite downward pattern of percent losses per day from the Napoleonic Wars through the Arab-Israeli Wars. Here is a very clear indication of the effects of dispersion. It would appear that from the 1600s to the 1800s the pattern was effectively constant and level, then declined in a very systematic pattern. This partially contradicts Trevor Dupuy’s writing and graphs (see Figure 1). It does appear that after this period of decline the percent losses per day settled at a new, much lower plateau. Percent losses per day by war is attached.
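The metric behind Figure 4 is simple enough to sketch in a few lines; the numbers below are notional illustrations, not values drawn from the LWDB:

```python
def percent_losses_per_day(casualties, strength, days):
    # Daily attrition as a percentage of engaged strength (the Figure 4 metric)
    return 100.0 * casualties / (strength * days)

# Notional examples: a one-day Napoleonic-era battle vs. a ten-day modern operation
napoleonic = percent_losses_per_day(casualties=20_000, strength=100_000, days=1)   # 20.0
modern = percent_losses_per_day(casualties=10_000, strength=100_000, days=10)      # 1.0
```

Even with total casualties of the same order, lengthening battles by an order of magnitude drives the per-day rate down sharply, which is part of what the post-Napoleonic decline in Figure 4 captures.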

Looking at the actual subject of this study, the dispersion of people (measured in people per kilometer of front) remained relatively constant from 1600 through the American Civil War (see Figure 5). Trevor Dupuy defined dispersion as the number of people in a box-like area. Unfortunately, I do not know how to measure that. I can clearly identify the left and right of a unit, but it is more difficult to tell how deep it is. Furthermore, the density of occupation of this box is far from uniform, with a very forward bias. By the same token, fire delivered into this box is also not uniform, with a very forward bias. Therefore, I am quite comfortable measuring dispersion based upon unit frontage, more so than frontage multiplied by depth.

Note that when comparing the Napoleonic Wars to the American Civil War, the dispersion remains about the same. Yet if you look at the average casualties (Figure 3) and the average percent casualties per day (Figure 4), it is clear that the rate of casualty accumulation is lower in the American Civil War (this again partially contradicts Dupuy’s writings). There is no question that, with the advent of the Minié ball allowing for rapid-fire rifled muskets, the ability to deliver accurate firepower increased.

As you will also note, the average people per linear kilometer between WWI and WWII differs by a factor of a little over 1.5 to 1. Yet the actual difference in casualties (see Figure 4) is much greater. While one can just postulate that the difference is the change in dispersion squared (basically Dupuy‘s approach), this does not seem to explain the complete difference, especially the difference between the Napoleonic Wars and the Civil War.

Instead of discussing dispersion, we should be discussing “casualty reduction efforts.” This basically consists of three elements:

  • Dispersion (D)
  • Increased engagement ranges (R)
  • More individual use of cover and concealment (C&C).

These three factors together result in a reduced chance to hit. They are also partially interrelated, as one cannot make more individual use of cover and concealment unless one is allowed to disperse. Therefore, the need for cover and concealment increases the desire to disperse, and the process of dispersing allows one to use more cover and concealment.

Command and control is integrated into this construct as something that allows dispersion, while dispersion creates the need for better command and control. Therefore, improved command and control in this construct does not operate as a force modifier; it enables a force to disperse.

Intelligence becomes more necessary as the opposing forces use cover and concealment and the ranges of engagement increase. By the same token, improved intelligence allows you to increase the range of engagement and forces the enemy to use better concealment.

This whole construct could be represented by the diagram at the top of the next page.

Now, I may have said the obvious here, but this construct is probably provable in each individual element, and the overall outcome is measurable. Each individual connection between these boxes may also be measurable.

Therefore, to measure the effects of reduced chance to hit, one would need to measure the following formula (assuming these formulae are close to being correct):

(K * ΔD) + (K * ΔC&C) + (K * ΔR) = H

(K * ΔC2) = ΔD

(K * ΔD) = ΔC&C

(K * ΔW) + (K * ΔI) = ΔR

K = a constant
Δ = the change in….. (alias “Delta”)
D = Dispersion
C&C = Cover & Concealment
R = Engagement Range
W = Weapon’s Characteristics
H = the chance to hit
C2 = Command and control
I = Intelligence or ability to observe
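Treating each K as a simple multiplier, the chain of relationships above can be sketched as follows. The constants and input deltas here are purely illustrative, since the article leaves every K value undetermined:

```python
def delta_dispersion(delta_c2, k=1.0):
    # (K * dC2) = dD: better command and control enables dispersion
    return k * delta_c2

def delta_cover(delta_d, k=1.0):
    # (K * dD) = dC&C: dispersing permits more use of cover and concealment
    return k * delta_d

def delta_range(delta_w, delta_i, k_w=1.0, k_i=1.0):
    # (K * dW) + (K * dI) = dR: weapon characteristics and intelligence
    # together drive engagement range
    return k_w * delta_w + k_i * delta_i

def change_in_chance_to_hit(delta_d, delta_cc, delta_r,
                            k_d=1.0, k_cc=1.0, k_r=1.0):
    # (K * dD) + (K * dC&C) + (K * dR) = H
    return k_d * delta_d + k_cc * delta_cc + k_r * delta_r

# Example: a notional doubling of C2 effectiveness plus modest weapon/intel gains
dD = delta_dispersion(2.0)                 # 2.0
dCC = delta_cover(dD)                      # 2.0
dR = delta_range(0.5, 0.5)                 # 1.0
H = change_in_chance_to_hit(dD, dCC, dR)   # 5.0
```

The point of the sketch is only the structure: each box in the construct feeds the next, so each individual link could in principle be fitted to data independently before the overall change in chance to hit is computed.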

Also, certain actions lead to a desire for certain technological and system improvements. This includes the effect of increased dispersion leading to a need for better C2 and increased range leading to a need for better intelligence. I am not sure these are measurable.

I have also shown in the diagram how the enemy impacts upon this. There is also an interrelated mirror image of this construct for the other side.

I am focusing on this because I really want to come up with some means of measuring the effects of a “revolution in warfare.” The last 400 years of human history have given us more revolutionary inventions impacting war than we can reasonably expect to see in the next 100 years. In particular, I would like to measure the impact of increased weapon accuracy, improved intelligence, and improved C2 on combat.

For the purposes of the TNDM, I would very specifically like to work out an attrition multiplier for battles before WWII (and theoretically after WWII) based upon reduced chance to be hit (“dispersion”). For example, Dave Bongard is currently using an attrition multiplier of 4 for the WWI engagements he is running for the battalion-level validation database.[3] No one can point to a piece of paper saying this is the value that should be used. Dave picked this value based upon experience and familiarity with the period.

I have also attached Average Losses per Kilometer of Front by War (see Figure 6 above), and a summary chart showing the two on the same chart (see Figure 7 above).

The values from these charts are:

The TNDM sets the WWII dispersion factor at 3,000 (which I gather translates into 30,000 men per square kilometer). The above data shows a linear dispersion per kilometer of 2,992 men, so this number parallels Dupuy’s figures.
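The linear-density figure quoted above reduces to strength divided by frontage; a minimal sketch with notional inputs (not LWDB data):

```python
def men_per_km_of_front(strength, frontage_km):
    # Linear dispersion: troops per kilometer of front
    return strength / frontage_km

# A notional WWII-era force: 15,000 men holding a 5 km front
density = men_per_km_of_front(15_000, 5.0)  # 3,000 men/km, close to the 2,992 above
```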

The final chart I have included is the Ratio of Strength and Losses per Kilometer of Front by War (Figure 8). Each line on the bar graph measures the average ratio of strength over casualties for either the attacker or defender. Being a ratio, unusual outcomes resulted in some really unusually high ratios. I took the liberty of removing six data points because they appeared unusually lop-sided. Three of these points are from the English Civil War and were way out of line with everything else. These were the three Scottish battles where a small group of mostly sword-armed troops defeated a “modern” army. Also, Walcourt (1689), Front Royal (1862), and Calbritto (1943) were removed. I have also included the same chart by century (Figure 9).

Again, one sees a consistency in results over 300+ years of war, in this case going all the way through WWI, and then an entirely different pattern with WWII and the Arab-Israeli Wars.

A very tentative set of conclusions from all this is:

  1. Dispersion has been relatively constant and driven by factors other than firepower from 1600-1815.
  2. Since the Napoleonic Wars, units have increasingly dispersed (found ways to reduce their chance to be hit) in response to increased lethality of weapons.
  3. As a result of this increased dispersion, casualties in a given space have declined.
  4. The ratio of this decline in casualties over area has been roughly proportional to the strength over an area from 1600 through WWI. Starting with WWII, it appears that people have dispersed faster than weapons lethality, and this trend has continued.
  5. In effect, people dispersed in direct relation to increased firepower from 1815 through 1920, and then after that time dispersed faster than the increase in lethality.
  6. It appears that since WWII, people have gone back to dispersing (reducing their chance to be hit) at the same rate that firepower is increasing.
  7. Effectively, there are four patterns of casualties in modern war:

Period 1 (1600 – 1815): Period of Stability

  • Short battles
  • Short frontages
  • High attrition per day
  • Constant dispersion
  • Dispersion decreasing slightly after late 1700s
  • Attrition decreasing slightly after mid-1700s.

Period 2 (1816 – 1905): Period of Adjustment

  • Longer battles
  • Longer frontages
  • Lower attrition per day
  • Increasing dispersion
  • Dispersion increasing slightly faster than lethality

Period 3 (1912 – 1920): Period of Transition

  • Long Battles
  • Continuous Frontages
  • Lower attrition per day
  • Increasing dispersion
  • Relative lethality per kilometer similar to past, but lower
  • Dispersion increasing slightly faster than lethality

Period 4 (1937 – present): Modern Warfare

  • Long Battles
  • Continuous Frontages
  • Low Attrition per day
  • High dispersion (perhaps constant?)
  • Relative lethality per kilometer much lower than the past
  • Dispersion increased much faster than lethality going into the period.
  • Dispersion increased at the same rate as lethality within the period.

So the question is whether warfare of the next 50 years will see a new “period of adjustment,” where the rate of dispersion (and other factors) adjusts in direct proportion to increased lethality, or will there be a significant change in the nature of war?

Note that when I use the word “dispersion” above, I often mean “reduced chance to be hit,” which consists of dispersion, increased engagement ranges, and use of cover & concealment.

One of the reasons I wandered into this subject was to see if the TNDM can be used for predicting combat before WWII. I then spent the next few days attempting to find some correlation between dispersion and casualties. Using the data on historical dispersion provided above, I created a mathematical formulation and tested that against the actual historical data points, and could not get any type of fit.

I then looked at the length of battles over time, at one-day battles, and attempted to find a pattern. I could find none. I also looked at other permutations, but did not keep a record of my attempts. I then looked through the work done by Dean Hartley (Oak Ridge) with the LWDB and called Paul Davis (RAND) to see if there was anyone who had found any correlation between dispersion and casualties, and they had not noted any.

It became clear to me that if there is any such correlation, it is buried so deep in the data that it cannot be found by any casual search. I suspect that I can find a mathematical correlation between weapon lethality, reduced chance to hit (including dispersion), and casualties. This would require some improvement to the data, some systematic measure of weapons lethality, and some serious regression analysis. I unfortunately cannot pursue this at this time.
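As a minimal sketch of the kind of correlation test described above, one can fit casualties-per-kilometer against a crude dispersion index by ordinary least squares and inspect the R². The data points below are invented for illustration; the actual LWDB values are not reproduced here:

```python
# Minimal sketch of a dispersion-vs-casualties fit. All numbers are
# notional, invented for illustration only.
import statistics

dispersion = [1.0, 2.5, 4.0, 9.0, 50.0, 400.0]       # notional dispersion index
cas_per_km = [250.0, 180.0, 260.0, 90.0, 30.0, 25.0]  # notional casualties/km

def ols(x, y):
    """Slope, intercept, and R^2 of a simple linear least-squares fit."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1 - ss_res / ss_tot

slope, intercept, r2 = ols(dispersion, cas_per_km)
# A low R^2 here would mirror the "no fit" result reported in the text.
```

A simple linear fit is only one of many formulations one could try; the point is that a weak R² against the historical points is what "could not get any type of fit" looks like in practice.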

Finally, for reference, I have attached two charts showing the duration of the battles in the LWDB in days (Figure 10, Duration of Battles Over Time, and Figure 11, A Count of the Duration of Battles by War).

NOTES

[1] The Tactical Numerical Deterministic Model, a combat model developed by Trevor Dupuy in 1990-1991 as the follow-up to his Quantified Judgement Model. Dr. James G. Taylor and Jose Perez also contributed to the TNDM’s development.

[2] TDI’s Land Warfare Database (LWDB) was a revised version of a database created by the Historical Evaluation Research Organization (HERO) for the then-U.S. Army Concepts and Analysis Agency (now known as the U.S. Army Center for Army Analysis (CAA)) in 1984. Since the original publication of this article, TDI expanded and revised the data into a suite of databases.

[3] This matter is discussed in Christopher A. Lawrence, “The Second Test of the TNDM Battalion-Level Validations: Predicting Casualties,” The International TNDM Newsletter, April 1997, pp. 40-50.

Trevor Dupuy on Military Innovation

In an article published by the Association of the U.S. Army last November that I missed on the first go-around, U.S. Army Colonel Eric E. Aslakson and Lieutenant Colonel Richard T. Brown (ret.) make the argument that “Staff colonels are the Army’s innovation center of gravity.”

The U.S. defense community has settled upon innovation as one of the key methods for overcoming the challenges posed by new technologies and strategies adopted by potential adversaries, as articulated in the Third Offset Strategy developed late in the Obama administration. It is becoming clear, however, that a desire to innovate is not the same as actual innovation. Aslakson and Brown make the point that innovation is not simply technological development and identify what they believe is a crucial institutional component of military innovation in the U.S. Army.

Innovation is differentiated from other forms of change such as improvisation and adaptation by the scale, scope and impact of that value creation. Innovation is not about a new widget or process, but the decisive value created and the competitive advantage gained when that new widget or process is applied throughout the Army or joint force…

However, none of these inventions or activities can rise to the level of innovation unless there are skilled professionals within the Army who can convert these ideas into competitive advantage across the enterprise. That is the role of a colonel serving in a major command staff leadership assignment…

These leaders do not typically create the change. But they have the necessary institutional and operational expertise and experience, contacts, resources and risk tolerance to manage processes across the entire framework of doctrine, organization, training, materiel, leadership and education, personnel and facilities, converting invention into competitive advantage.

In his seminal book, The Evolution of Weapons and Warfare (Indianapolis, IN: The Bobbs-Merrill Company, Inc., 1980), Trevor Dupuy noted a pattern in the historical relationship between development of weapons of increasing lethality and their incorporation in warfare. He too noted that the crucial factor was not the technology itself, but the organizational approach to using it.

When a radically new weapon appears and is first adopted, it is inherently incongruous with existing weapons and doctrine. This is reflected in a number of ways; uncertainty and hesitation in coordination of the new weapon with earlier ones; inability to use it consistently, effectively, and flexibly in offensive action, which often leads to tactical stalemate; vulnerability of the weapon and of its users to hostile countermeasures; heavy losses incident to the employment of the new weapon, or in attempting to oppose it in combat. From this it is possible to establish the following criteria of assimilation:

  1. Confident employment of the weapon in accordance with a doctrine that assures its coordination with other weapons in a manner compatible with the characteristics of each.
  2. Consistently effective, flexible use of the weapon in offensive warfare, permitting full employment of the advantages of superior leadership and/or superior resources.
  3. Capability of dealing effectively with anticipated and unanticipated countermeasures.
  4. Sharp decline in casualties for those employing the weapon, often combined with a capability for inflicting disproportionately heavy losses on the enemy.

Based on his assessment of this historical pattern, Dupuy derived a set of preconditions necessary for a successful assimilation of new technology into warfare.

  1. An imaginative, knowledgeable leadership focused on military affairs, supported by extensive knowledge of, and competence in, the nature and background of the existing military system.
  2. Effective coordination of the nation’s economic, technological-scientific, and military resources.
    1. There must exist industrial or developmental research institutions, basic research institutions, military staffs and their supporting institutions, together with administrative arrangements for linking these with one another and with top decision-making echelons of government.
    2. These bodies must conduct their research, developmental, and testing activities according to mutually familiar methods so that their personnel can communicate, can be mutually supporting, and can evaluate each other’s results.
    3. The efforts of these institutions—in related matters—must be directed toward a common goal.
  3. Opportunity for battlefield experimentation as a basis for evaluation and analysis.

Does the U.S. defense establishment’s organizational and institutional approach to innovation meet these preconditions? Good question.

Trevor Dupuy and Historical Trends Related to Weapon Lethality

There appears to be renewed interest in U.S. Army circles in Trevor Dupuy’s theory of a historical relationship between increasing weapon lethality, declining casualty rates, and greater dispersion on the battlefield. A recent article by Army officer and strategist Aaron Bazin, “Seven Charts That Help Explain American War” at The Strategy Bridge, used a composite version of two of Dupuy’s charts to explain the American military’s attraction to technology. (The graphic in Bazin’s article originated in a 2009 Australian Army doctrinal white paper, “Army’s Future Land Operating Concept,” which evidently did not cite Dupuy as the original source for the charts or the associated concepts.)

John McRea, like Bazin a U.S. Army officer, and a founding member of The Military Writer’s Guild, reposted Dupuy’s graphic in a blog post entitled “Outrageous Fortune: Spears and Arrows,” examining tactical and economic considerations in the use of asymmetrical technologies in warfare.

Dr. Conrad Crane, Chief of Historical Services for the U.S. Army Heritage and Education Center at the Army War College, also referenced Dupuy’s concepts in his look at human performance requirements, “The Future Soldier: Alone in a Crowd,” at War on the Rocks.

Dupuy originally developed his theory based on research and analysis undertaken by the Historical Evaluation and Research Organization (HERO) in 1964, for a study he directed, “Historical Trends Related to Weapon Lethality.” (Annex I, Annex II, Annex III). HERO had been contracted by the Advanced Tactics Project (AVTAC) of the U.S. Army Combat Developments Command, to provide unclassified support for Project OREGON TRAIL, a series of 45 classified studies of tactical nuclear weapons, tactics, and organization, which took 18 months to complete.

AVTAC asked HERO “to identify and analyze critical relationships and the cause-effect aspects of major advances in the lethality of weapons and associated changes in tactics and organization” from the Roman Era to the present. HERO’s study itself was a group project, incorporating 58 case studies from 21 authors, including such scholars as Gunther E. Rothenberg, Samuel P. Huntington, S.L.A. Marshall, R. Ernest Dupuy, Grace P. Hayes, Louis Morton, Peter Paret, Stefan T. Possony, and Theodore Ropp.

Dupuy synthesized and analyzed these case studies for the HERO study’s final report. He described what he was seeking to establish in his 1979 book, Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles.

If the numbers of military history mean anything, it appears self-evident that there must be some kind of relationship between the quantities of weapons employed by opposing forces in combat, and the number of casualties suffered by each side. It also seems fairly obvious that some weapons are likely to cause more casualties than others, and that the effectiveness of weapons will depend upon their ability to reach their targets. So it becomes clear that the relationship of weapons to casualties is not quite the simple matter of comparing numbers to numbers. To compare weapons to casualties it is necessary to know not only the numbers of weapons, but also how many there are of each different type, and how effective or lethal each of these is.
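Dupuy's point in the passage above — that comparing weapons to casualties requires weighting each weapon type by some measure of lethality, not just counting weapons — can be illustrated with a toy computation. The weights and counts below are invented; this is not Dupuy's actual Theoretical Lethality Index calculation:

```python
# Hedged illustration: raw weapon counts versus lethality-weighted totals.
# All counts and lethality weights are invented for this example.
force_a = {"rifle": 5000, "machine_gun": 120, "field_gun": 36}
force_b = {"rifle": 4000, "machine_gun": 250, "field_gun": 24}

# Notional lethality weight per weapon type (arbitrary units).
lethality = {"rifle": 1.0, "machine_gun": 30.0, "field_gun": 150.0}

def weighted_firepower(force):
    """Sum of (count x lethality weight) over all weapon types."""
    return sum(count * lethality[w] for w, count in force.items())

fp_a = weighted_firepower(force_a)
fp_b = weighted_firepower(force_b)
# Force A has more weapons in total, yet Force B's lethality-weighted
# firepower is higher -- counting weapons alone would mislead.
```

This is exactly why, as Dupuy says, the relationship of weapons to casualties "is not quite the simple matter of comparing numbers to numbers."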

The historical relationship between lethality, casualties, and dispersion that Dupuy deduced in this study provided the basis for his subsequent quest to establish an empirically-based, overarching theory of combat, which he articulated through his Quantified Judgement Model. Dupuy refined and updated the analysis from the 1964 HERO study in his 1980 book, The Evolution of Weapons and Warfare.