Category: Lessons of History

Too busy to read

A reposted email by retired General Mattis, who is being considered for Secretary of Defense. This is worth reading: general-james-mattis-email

Opening sentence:

“….The problem with being too busy to read is that you learn by experience (or by your men’s experience), i.e. the hard way. By reading, you learn through other’s experiences, generally a better way to do business, especially in our line of work where the consequences of incompetence are so final for young men.”

and

“Ultimately, a real understanding of history means that we face NOTHING new under the sun. For all the ‘4th Generation of War’ intellectuals running around today saying that the nature of war has fundamentally changed, the tactics are wholly new, etc, I must respectfully say….’Not really’…”

and

“‘Winging it’ and filling body bags as we sort out what works reminds us of the moral dictates and the cost of incompetence in our profession.”

War by Numbers is on Amazon

[Image: War by Numbers cover]

My new book, with a release date of 1 August, is now available on Amazon.com for pre-order: War by Numbers (Amazon)

It is still listed at 498 pages, and I am pretty sure I only wrote 342. I will receive the proofs next month for review, so will have a chance to see how they got there. My Kursk book was over 2,500 pages in Microsoft Word, and we got it down to a mere 1,662 pages in print form. Not sure how this one is heading the other way.

Unlike the Kursk book, there will be a Kindle version.

It is already available for pre-order from University of Nebraska Press here: War by Numbers (US)

It is available for pre-order in the UK through Casemate: War by Numbers (UK)

The table of contents for the book is here: War by Numbers: Table of Contents

Like the cover? I did not have a lot to do with it.

What Is The Relationship Between Rate of Fire and Military Effectiveness?

Over at his Best Defense blog, Tom Ricks recently posed an interesting question: Is rate of fire no longer a key metric in assessing military effectiveness?

Rate of fire doesn’t seem to be important in today’s militaries. I mean, everyone can go “full auto.” Rather, the problem seems to me firing too much and running out of ammunition.

I wonder if this affects how contemporary military historians look at the tactical level of war. Throughout most of history, the problem, it seems to me, was how many rocks, spears, arrows or bullets you could get off. Hence the importance of drill, which was designed to increase the volume of infantry fire (and to reduce people walking off the battlefield when they moved back to reload).

There are several ways to address this question from a historical perspective, but one place to start is to look at how rate of fire relates historically to combat.

Rate of fire is one of several measures of a weapon’s ability to inflict damage, i.e. its lethality. In the early 1960s, Trevor Dupuy and his associates at the Historical Evaluation and Research Organization (HERO) assessed whether historical trends of increasing weapon lethality were changing the nature of combat. To measure this, they developed a methodology for scoring the inherent lethality of a given weapon, the Theoretical Lethality Index (TLI). TLI is the product of five factors:

  • rate of fire
  • targets per strike
  • range factor
  • accuracy
  • reliability

In the TLI methodology, rate of fire is defined as the number of effective strikes a weapon can deliver per hour under ideal conditions, assuming no logistical limitations.

As measured by TLI, increased rates of fire do indeed increase weapon lethality. The TLI of an early 20th century semi-automatic rifle is nearly five times that of a mid-19th century muzzle-loading rifle, due to its higher rate of fire. Despite having lower accuracy and reliability, a World War II-era machine gun has ten times the TLI of a semi-automatic rifle because of its rate of fire. The rate of fire of small arms has not increased since the early-to-mid 20th century, and the assault rifle, adopted by modern armies following World War II, remains the standard infantry weapon in the early 21st century.
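Since the TLI is a straight product of the five factors, the arithmetic is easy to sketch. Here is a minimal illustration in Python; the factor values are hypothetical placeholders chosen to show the rate-of-fire effect, not Dupuy's published scores.

```python
# Minimal sketch of the Theoretical Lethality Index (TLI): the product
# of the five factors listed above. All values here are hypothetical
# placeholders, NOT Dupuy's published scores.

def tli(rate_of_fire: float, targets_per_strike: float, range_factor: float,
        accuracy: float, reliability: float) -> float:
    """Return the TLI as the product of the five factors."""
    return rate_of_fire * targets_per_strike * range_factor * accuracy * reliability

# With all other factors held equal, a tenfold rate of fire yields a
# tenfold TLI; lower accuracy and reliability only partially offset it.
rifle = tli(rate_of_fire=60, targets_per_strike=1, range_factor=1.0,
            accuracy=0.9, reliability=0.95)
machine_gun = tli(rate_of_fire=600, targets_per_strike=1, range_factor=1.0,
                  accuracy=0.7, reliability=0.85)
print(f"rifle TLI: {rifle:.1f}, machine gun TLI: {machine_gun:.1f}")
```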

[Figure: Attrition, Fig. 11]

Rate of fire is just one of many factors that influence a weapon’s lethality, however. Artillery has much higher TLI values than small arms despite lower rates of fire, for the obvious reasons that artillery has far greater range and that each round of ammunition can hit multiple targets per strike.

There are other methods for scoring weapon lethality, but the TLI provides a logical and consistent methodology for comparing weapons with one another. Through the TLI, Dupuy substantiated the observation that weapons have indeed become more lethal over time, particularly in the last century.

But if weapons have become more lethal, has combat become bloodier? No. Dupuy and his colleagues also discovered that, counterintuitively, the average casualty rates in land combat have been declining since the 17th century. Combat casualty rates did climb in the early and mid-19th century, but fell again precipitously from the later 19th century through the end of the 20th.

[Figure: Attrition, Fig. 13]

The reason, Dupuy determined, was that armies have historically adapted to increases in weapon lethality by dispersing in greater depth on the battlefield, decentralizing tactical decision-making, enhancing mobility, and placing greater emphasis on combined arms tactics. The area occupied by a force of 100,000 soldiers increased 4,000-fold between antiquity and the late 20th century. Average ground force dispersion increased by a third between World War II and the 1973 Yom Kippur War, and Dupuy estimated it had increased by another quarter by 1990.

[Figure: Attrition, Fig. 14]
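The compounding in those last two multipliers is worth making explicit. A quick sketch using only the figures stated above, with WWII dispersion normalized to 1 since absolute areas are not given here:

```python
# Dispersion growth using only the multipliers stated in the text.
# WWII dispersion is normalized to 1.0; absolute areas are not given here.
ww2 = 1.0
yom_kippur_1973 = ww2 * (1 + 1/3)       # "increased by a third"
by_1990 = yom_kippur_1973 * (1 + 1/4)   # "another quarter" by 1990
print(f"1973: {yom_kippur_1973:.2f}x WWII, 1990: {by_1990:.2f}x WWII")
# -> 1973: 1.33x WWII, 1990: 1.67x WWII
```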

Simply put, even as weapons become more deadly, there are fewer targets on the battlefield for them to hit. Through the mid-19th century, the combination of low rates of fire and relatively short ranges required massing infantry fires in order to achieve lethal effect. Before 1850, artillery caused more battlefield casualties than infantry small arms. This ratio changed due to the increased rates of fire and range of the rifled and breech-loading weapons introduced in the 1850s and 1860s. The majority of combat casualties in conflicts of the mid-to-late 19th century were inflicted by infantry small arms.

[Figure: Attrition, Fig. 19]

The lethality of modern small arms combined with machine guns led to further dispersion and the decentralization of tactical decision-making in early 20th century warfare. The increased destructiveness of artillery, due to improved range and more powerful ammunition, coupled with the invention of the field telephone and indirect fire techniques during World War I, restored the long arm to its role as king of the battlefield.

[Figure: Attrition, Fig. 35]

Dupuy represented this historical relationship between lethality and dispersion on the battlefield by applying a dispersion factor to TLI values to obtain what he termed the Operational Lethality Index (OLI). Because they account for these effects, OLI values are a good theoretical approximation of relative weapon effectiveness.
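As a rough illustration of how the dispersion adjustment works, here is a minimal sketch that assumes the dispersion factor divides the TLI, so that greater dispersion (fewer targets per unit area) yields a lower operational value. The numbers are invented for illustration and are not Dupuy's published factors.

```python
# Sketch of the Operational Lethality Index (OLI): a TLI adjusted for
# battlefield dispersion. Assumption: the dispersion factor divides the
# TLI, so greater dispersion lowers the operational value. All numbers
# below are invented for illustration, not Dupuy's published factors.

def oli(tli_value: float, dispersion_factor: float) -> float:
    return tli_value / dispersion_factor

# A large jump in raw lethality can translate into a much smaller jump
# in operational lethality once dispersion is accounted for.
print(oli(tli_value=10_000, dispersion_factor=100))     # hypothetical earlier era -> 100.0
print(oli(tli_value=500_000, dispersion_factor=3_000))  # hypothetical later era  -> ~166.7
```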

[Figure: Numbers, Predictions and War, Fig. 2-5]

Although little empirical research has been done on this question, it seems logical that the trend toward greater use of precision-guided weapons is at least a partial response to the so-called “empty battlefield.” The developers of the Third Offset Strategy postulated that the U.S. emphasis on developing precision weaponry in the 1970s was a calculated response to offset the Soviet emphasis on mass firepower; that response is now known as the “Second Offset.” The goal of modern precision weapons is “one shot, one kill,” where a reduced rate of fire is compensated for by greater range and accuracy. Such weapons have become sufficiently lethal that the best way to survive on a modern battlefield is to not be seen.

At least, that was the conventional wisdom until recently. The U.S. Army in particular is watching how the Ukrainian separatist forces and their Russian enablers are using new artillery weapons, drone and information technology, and tactics to engage targets with massed fires. Some critics have alleged that the U.S. artillery arm atrophied during the Global War on Terror and may no longer be capable of overmatching potential adversaries. It is not yet clear whether a real competition between mass and precision fires will develop on the battlefields of the near future, but it is possible that this signals yet another shift in the historical relationship between lethality, mobility, and dispersion in combat.

SOURCES

Trevor N. Dupuy, Attrition: Forecasting Battle Casualties and Equipment Losses in Modern War (Falls Church, VA: NOVA Publications, 1995)

_____. Understanding War: History and Theory of Combat (New York: Paragon House, 1987)

_____. The Evolution of Weapons and Warfare (Indianapolis, IN: The Bobbs-Merrill Company, Inc., 1980)

_____. Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles (Indianapolis; New York: The Bobbs-Merrill Co., 1979)

Tank Loss Rates in Combat: Then and Now

As the U.S. Army and the national security community seek a sense of what potential conflicts in the near future might be like, they see the distinct potential for large tank battles. Will technological advances change the character of armored warfare? Perhaps, but it seems more likely that the next big tank battles – if they occur – will resemble those of the past.

One aspect of future battle of great interest to military planners is probably going to be tank loss rates in combat. In a previous post, I looked at the analysis done by Trevor Dupuy on the relationship between tank and personnel losses in the U.S. experience during World War II. Today, I will take a look at his analysis of historical tank loss rates.

In general, Dupuy identified a proportional relationship between personnel casualty rates in combat and losses in tanks, guns, trucks, and other equipment. (His combat attrition verities are discussed here.) Looking at World War II division- and corps-level combat engagement data from 1943-1944 between U.S., British, and German forces in the west, and German and Soviet forces in the east, Dupuy found similar patterns in tank loss rates.

[Figure: Attrition, Fig. 58]

In combat between two division/corps-sized, armor-heavy forces, Dupuy found that tank loss rates were likely to be five to seven times the personnel casualty rate for the winning side, and seven to ten times for the losing side. Additionally, defending units suffered lower loss rates than attackers; if an attacking force suffered tank losses at seven times its personnel casualty rate, the defending force’s tank losses would run around five times its personnel rate.

Dupuy also discovered that the ratio of tank to personnel losses appeared to be a function of the proportion of tanks to infantry in a combat force. Units with fewer than six tanks per 1,000 troops could be considered armor-supporting, while those with more than six tanks per 1,000 troops were armor-heavy. Armor-supporting units suffered lower tank casualty rates than armor-heavy units.
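A back-of-the-envelope sketch of how these ratios might be applied. The multiplier bands come straight from the text; mapping attacker and defender to the high and low ends of a band is my simplification, and the input casualty rate is hypothetical.

```python
# Back-of-the-envelope use of the division/corps tank-loss ratios above:
# roughly 5-7x the personnel casualty rate for the winner, 7-10x for the
# loser, with attackers toward the high end of a band and defenders
# toward the low end. Picking band endpoints this way is a
# simplification of the text, and the input rate is hypothetical.

def tank_loss_rate(personnel_rate_pct: float, won: bool, attacking: bool) -> float:
    low, high = (5.0, 7.0) if won else (7.0, 10.0)
    return personnel_rate_pct * (high if attacking else low)

# Hypothetical winning attacker losing 1% of personnel per day:
print(tank_loss_rate(1.0, won=True, attacking=True))   # ~7% of tanks per day
print(tank_loss_rate(1.0, won=True, attacking=False))  # ~5% of tanks per day
```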

[Figure: Attrition, Fig. 59]

Dupuy looked at tank loss rates in the 1973 Arab-Israeli War and found that they were consistent with World War II experience.

What does this tell us about possible tank losses in future combat? That is a very good question. One reasonably certain guess is that future tank battles will probably not involve forces of World War II division or corps size. The opposing forces will be brigade combat teams, or more likely, battalion-sized elements.

Dupuy did not have as much data on tank combat at this level, and what he did have indicated a great deal more variability in loss rates. Examples of this can be found in the tables below.

[Figures: Attrition, Figs. 53 and 54]

These data points showed some consistency, with a mean of 6.96 and a standard deviation of 6.10, comparable to the division/corps loss rates. Personnel casualty rates at this level are higher and much more variable than those at the division level, however. Dupuy stated that more research was necessary to establish a higher degree of confidence in, and the relevance of, the apparent battalion-level tank loss ratio. So one potentially fruitful area of research with regard to near-future combat could very well be a renewed focus on historical experience.

NOTES

Trevor N. Dupuy, Attrition: Forecasting Battle Casualties and Equipment Losses in Modern War (Falls Church, VA: NOVA Publications, 1995), pp. 41-43; 81-90; 102-103

A Losing Record

[Image: win/lose/draw chart]

Spotted an article today on the History News Network (HNN): Win, Lose, or Draw?

This got my attention because I have outlined a book I may start work on next year (2017) called Future American Wars: Understanding the Next Twenty Years. This book is intended to complete a trio of books: one on understanding insurgencies (America’s Modern Wars), one on understanding conventional combat (War by Numbers — release date still August 2017), and this one covering the situation going forward.

My opening chapter is called: A Losing Record.

What they record in this article is that:

  1. For conventional conflict we have 3 wins, 1 loss and 1 tie.
  2. For other conflicts (what they call the “gray zone”) there are 9 wins, 8 losses and 42 draws.

Anyhow, I haven’t checked the individual cases, and in some of them it depends on how you interpret win, lose, and draw; but it does bring out a fundamental problem that I was partly trying to address in America’s Modern Wars: our track record in these conflicts is not great. My book primarily focused on Iraq and Afghanistan, which is why I felt I needed a third book to cover all the other cases of interventions, peacekeeping operations, and so forth.

Anyhow, the SOCOM briefing chart can be blown up to large size and is definitely worth looking at.

 

Technology, Eggs, and Risk (Oh, My)

Tokyo, Japan — Eggs in a basket — Image by © JIRO/Corbis

In my last post, on the possibility that the development of quantum radar could undermine the U.S. technological advantage in stealth, I ended by asking this question:

The basic assumption behind the Third Offset Strategy is that the U.S. can innovate and adopt technological capabilities fast enough to maintain or even expand its current military superiority. Does the U.S. really have enough of a scientific and technological development advantage over its rivals to validate this assumption?

My colleague, Chris, has suggested that I expand on the thinking behind this. Here goes:

The lead times needed for developing advanced weapons and the costs involved in fielding them make betting on technological innovation as a strategy seem terribly risky. In his 1980 study of the patterns of weapon technology development, The Evolution of Weapons and Warfare, Trevor Dupuy noted that there is a clear historical pattern of a period of 20-30 years between the invention of a new weapon and its use in combat in a tactically effective way. For example, practical armored fighting vehicles were first developed in 1915 but they were not used fully effectively in battle until the late 1930s.

The examples I had in mind when I wrote my original post were the F-35 Joint Strike Fighter (JSF) and the Littoral Combat Ship (LCS), both of which derive much, if not most, of their combat power from being stealthy. If that capability were to be negated even partially by a technological breakthrough or countermeasure by a potential adversary, then 20+ years of development time and hundreds of billions of dollars would have been essentially wasted. If either or both of these weapons systems were rendered ineffective in the middle of a national emergency, neither could be quickly retooled nor replaced. The potential repercussions could be devastating.

I reviewed the development history of the F-35 in a previous post. Development began in 2001, and the Air Force declared the first F-35 squadron combat operational (in a limited capacity) in August 2016; that squadron has since been stood down for repairs. The first fully combat-capable F-35s will not be ready until 2018 at the soonest, and the entire fleet will not be ready until at least 2023. Just getting the aircraft fully operational will have taken 15-22 years, depending on how one chooses to calculate it. It will take several more years after that to fully evaluate the F-35 in operation and develop tactics, techniques, and procedures to maximize its effectiveness in combat. The lifetime cost of the F-35 has been estimated at $1.5 trillion, which is likely an underestimate.

The U.S. Navy anticipated the need for ships capable of operating in shallow coastal waters in the late 1990s. Development of the LCS began in 2003, and the first ships of the two variants were launched in 2006 and 2008, respectively. Two of each design have been built so far. Since then, cost overruns, developmental problems, disappointing performance at sea, and reconsideration of the ship’s role led the Navy to scale back a planned purchase of 53 LCSs to 40 at the end of 2015, to allow money to be spent on other priorities. As of July 2016, only 26 LCSs have been programmed, and the Navy has been instructed to select one of the two designs to complete the class. Initial program procurement costs were $22 billion, which have now risen to $39 billion. Annual operating costs for each ship are currently estimated at $79 million, which the Navy asserts will drop when simultaneous testing and operational use ends. The Navy plans to build LCSs until the 2040s, which includes replacements for the original ten after a service life of 25 years. Even at the annual operating cost of a current U.S. Navy frigate ($59 million), a back-of-the-envelope calculation puts the lifetime cost of the LCS program at around $91 billion, all told; this is also likely an underestimate. This seems like a lot of money to spend on a weapon that the Navy intends to pull out of combat should it sustain any damage.
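The post does not show its arithmetic, but a plausible reconstruction using only the figures quoted above lands in the same ballpark as the roughly $91 billion cited; the exact assumptions behind that number are not stated.

```python
# Plausible reconstruction of the LCS back-of-the-envelope lifetime
# cost, using only figures quoted in the text. The post's exact
# assumptions are not stated, so this only approximates its ~$91B total.

procurement_total_b = 39.0       # program procurement cost, $ billions
ships = 40                       # planned class size after the 2015 cut
service_life_years = 25          # stated service life per ship
annual_operating_m = 59.0        # frigate-equivalent operating cost, $ millions

operating_total_b = ships * service_life_years * annual_operating_m / 1000.0
print(f"~${procurement_total_b + operating_total_b:.0f}B")  # ~$98B on these inputs
```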

It would not take a technological breakthrough as singular as quantum radar to degrade the effectiveness of U.S. stealth technology, either. The Russians claim that they already possess radars that can track U.S. stealth aircraft. U.S. sources essentially concede this, but point out that tracking a stealth platform does not mean it can be attacked successfully. Obtaining a track sufficient for targeting involves other technological capabilities that are susceptible to U.S. electronic warfare capabilities. U.S. stealth aircraft already need to operate in conjunction with existing EW platforms to maintain their cloaked status. Even if quantum radar proves infeasible, the game over stealth is already afoot.

Quantum Radar: Should We Be Putting All Our Eggs In The Technology Basket?

Corporal Walter “Radar” O’Reilly (Gary Burghoff) | M*A*S*H

As reported in Popular Mechanics last week, Chinese state media recently announced that a Chinese defense contractor has developed the world’s first quantum radar system. Derived from the principles of quantum mechanics, quantum radar would be capable of detecting vehicles equipped with so-called “stealth” technology for defeating conventional radio-wave based radar systems.

The Chinese claim should be taken with a large grain of salt. It is not clear that a functional quantum radar can be made to work outside a laboratory, much less adapted into a functional surveillance system. Lockheed Martin patented a quantum radar design in 2008, but nothing more has been heard about it publicly.

However, the history of military innovation has demonstrated that every technological advance has eventually resulted in a counter, either through competing weapons development or by the adoption of strategies or tactics to minimize the impact of the new capabilities. The United States has invested hundreds of billions of dollars in air and naval stealth capabilities and built its current and future strategies and tactics around its effectiveness. Much of the value of this investment could be wiped out with a single technological breakthrough by its potential adversaries.

The basic assumption behind the Third Offset Strategy is that the U.S. can innovate and adopt technological capabilities fast enough to maintain or even expand its current military superiority. Does the U.S. really have enough of a scientific and technological development advantage over its rivals to validate this assumption?

The Uncongenial Lessons of Past Conflicts

Williamson Murray, professor emeritus of history at Ohio State University, on the notion that military failures can be traced to an overemphasis on the lessons of the last war:

It is a myth that military organizations tend to do badly in each new war because they have studied too closely the last one; nothing could be farther from the truth. The fact is that military organizations, for the most part, study what makes them feel comfortable about themselves, not the uncongenial lessons of past conflicts. The result is that more often than not, militaries have to relearn in combat—and usually at a heavy cost—lessons that were readily apparent at the end of the last conflict.

[Williamson Murray, “Thinking About Innovation,” Naval War College Review, Spring 2001, 122-123. This passage was cited in a recent essay by LTG H.R. McMaster, “Continuity and Change: The Army Operating Concept and Clear Thinking About Future War,” Military Review, March-April 2015. I recommend reading both.]

Saigon, 1965

The American RAND staff and Vietnamese interviewers on the front porch of the villa on Rue Pasteur. Courtesy of Hanh Easterbrook. [Revisionist History]

Although this blog focuses on quantitative historical analysis, it is probably a good idea to consider from time to time that the analysis is being done by human beings. As objective as analysts try to be about the subjects they study, they cannot avoid interpreting what they see through the lenses of their own personal biases, experiences, and perspectives. This is not a bad thing, as each analyst can bring something new to the process and find things that others perhaps cannot.

The U.S. experience in Vietnam offers a number of examples of this. Recently, journalist and writer Malcolm Gladwell presented a podcast exploring an effort by the RAND Corporation initiated in the early 1960s to interview and assess the morale of captured Viet Cong fighters and defectors. His story centers on two RAND analysts, Leon Gouré and Konrad Kellen, and one of their Vietnamese interpreters, Mai Elliott. The podcast traces the origins and history of the project, how Gouré, Kellen, and Elliott brought very different perspectives to their work, and how they developed differing interpretations of the evidence they collected. Despite the relevance of the subject and the influence the research had on decision-making at high levels, the study ended inconclusively and ambivalently for all involved. (Elliott would go on to write an account of RAND’s activities in Southeast Asia and several other books.)

Gladwell presents an interesting human story as well as some insight into the human element of social science analysis. It is a unique take on one aspect of the Vietnam War and definitely worth the time to listen to. The podcast is part of his Revisionist History series.

Trevor Dupuy’s Combat Advance Rate Verities

One of the basic processes of combat is movement. According to Trevor Dupuy, one of the most important outcomes of ground combat is advance against opposition. He spent a good amount of time examining historical advance rates, seeking to determine whether technological change had led to increases in advance rates over time. On the face of it, he determined that daily rates had increased by about one-half, from about 17 kilometers per day during the Napoleonic Era to about 26 kilometers per day by the 1973 Arab-Israeli War. However, when calculated over the duration of an entire campaign, average daily advance rates did not appear to have changed much at all over 200 years, despite the advent of mechanization.

His research on the topic yielded another list of verities. He did not believe they accounted for every factor or influence on advance rates, but he did think they accounted for most of them. He was also reasonably confident that no weapons or means of conveyance then foreseen would alter the basic relationships in his list.[1]

  1. Advance against opposition requires local combat power preponderance.
  2. There is no direct relationship between advance rates and force strength ratios.
  3. Under comparable conditions, small forces advance faster than larger forces.
  4. Advance rates vary inversely with the strength of the defender’s fortifications.
  5. Advance rates are greater for a force that achieves surprise.
  6. Advance rates decline daily in sustained operations.
  7. Superior relative combat effectiveness increases an attacker’s advance rate.
  8. An “all-out” effort increases advance rates at a cost in higher casualties.
  9. Advance rates are reduced by difficult terrain.
  10. Advance rates are reduced by rivers and canals.
  11. Advance rates vary positively with the quality and density of roads.
  12. Advance rates are reduced by bad weather.
  13. Advance rates are lower at night than in daytime.
  14. Advance rates are reduced by inadequate supply.
  15. Advance rates reflect interactions with friendly and enemy missions.

NOTES

[1] Trevor N. Dupuy, Understanding War: History and Theory of Combat (New York: Paragon House, 1987), pp. 158–163.