Armies have historically responded to the increasing lethality of weapons by dispersing mass in frontage and depth on the battlefield. Will combat over the next 50 years see another period of adjustment like that of the previous half-century, with dispersion continuing to shift in direct proportion to increases in weapon range and precision, or will there be a significant change in the character of warfare?
One point of departure for such an inquiry could be the work of TDI President Chris Lawrence, who looked into the nature of historical rates of dispersion in combat from 1600 to 1991.
As he explained,
I am focusing on this because I really want to come up with some means of measuring the effects of a “revolution in warfare.” The last 400 years of human history have given us more revolutionary inventions impacting war than we can reasonably expect to see in the next 100 years. In particular, I would like to measure the impact of increased weapon accuracy, improved intelligence, and improved C2 on combat.
His tentative conclusions were:
- Dispersion has been relatively constant and driven by factors other than firepower from 1600-1815.
- Since the Napoleonic Wars, units have increasingly dispersed (found ways to reduce their chance to be hit) in response to increased lethality of weapons.
- As a result of this increased dispersion, casualties in a given space have declined.
- The decline in casualties over area has been roughly proportional to the strength over an area from 1600 through WWI. Starting with WWII, it appears that people have dispersed faster than weapons lethality has increased, and this trend has continued.
- In effect, people dispersed in direct relation to increased firepower from 1815 through 1920, and then after that time dispersed faster than the increase in lethality.
- It appears that since WWII, people have gone back to dispersing (reducing their chance to be hit) at the same rate that firepower is increasing.
- Effectively, there are four patterns of casualties in modern war:
Period 1 (1600 – 1815): Period of Stability
- Short battles
- Short frontages
- High attrition per day
- Constant dispersion
- Dispersion decreasing slightly after late 1700s
- Attrition decreasing slightly after mid-1700s.
Period 2 (1816 – 1905): Period of Adjustment
- Longer battles
- Longer frontages
- Lower attrition per day
- Increasing dispersion
- Dispersion increasing slightly faster than lethality
Period 3 (1912 – 1920): Period of Transition
- Long battles
- Continuous frontages
- Lower attrition per day
- Increasing dispersion
- Relative lethality per kilometer similar to past, but lower
- Dispersion increasing slightly faster than lethality
Period 4 (1937 – present): Modern Warfare
- Long battles
- Continuous frontages
- Low attrition per day
- High dispersion (perhaps constant?)
- Relative lethality per kilometer much lower than the past
- Dispersion increased much faster than lethality going into the period.
- Dispersion increased at the same rate as lethality within the period.
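The proportionality running through these conclusions can be illustrated with a deliberately simple sketch: if casualties per unit area scale with lethality divided by dispersion, then casualty density falls whenever dispersion outgrows lethality. The index values below are hypothetical, chosen only to make that relationship visible; they are not figures from Lawrence's or Dupuy's studies.

```python
# Hypothetical illustration of the dispersion-vs-lethality relationship
# described above. All numbers are invented index values, normalized to
# 1.0 in 1600 -- they are NOT data from Lawrence's or Dupuy's work.

# (period, lethality index, dispersion index)
periods = [
    ("1600-1815", 1.0, 1.0),         # stability: both roughly constant
    ("1816-1905", 5.0, 5.0),         # adjustment: dispersion tracks lethality
    ("1912-1920", 20.0, 25.0),       # transition: dispersion pulls ahead
    ("1937-present", 100.0, 500.0),  # modern: dispersion far outpaces lethality
]

for name, lethality, dispersion in periods:
    # Assumed relationship: casualty density ~ lethality / dispersion.
    # When dispersion grows faster than lethality, this ratio declines.
    casualty_density = lethality / dispersion
    print(f"{name}: relative casualty density = {casualty_density:.2f}")
```

The point of the sketch is only that the four-period pattern falls out of a single assumed ratio: equal growth rates hold casualty density constant, while faster dispersion growth drives it down.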
Chris based his study on previous work done by Trevor Dupuy and his associates, which established a pattern in historical combat relating lethality, dispersion, and battlefield casualty rates.
Trevor Dupuy and Historical Trends Related to Weapon Lethality
What Is The Relationship Between Rate of Fire and Military Effectiveness?
There is no way to accurately predict the future relationship between weapon lethality and dispersion on the battlefield, but we should question whether current conceptions of combat reflect consideration of these historical trends.