
Logistics in Trevor Dupuy’s Combat Models

Trevor N. Dupuy, Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles (Indianapolis; New York: The Bobbs-Merrill Co., 1979), p. 79

Mystics & Statistics reader Stiltzkin posed two interesting questions in response to my recent post on the new blog, Logistics in War:

Is there actually a reliable way of calculating logistical demand in correlation to “standing” ration strength/combat/daily strength army size?

Did Dupuy ever focus on logistics in any of his work?

The answer to his first question is yes, there is. In fact, this has been a standard military staff function since before there were military staffs (Martin van Creveld’s book, Supplying War: Logistics from Wallenstein to Patton (2nd ed.), is an excellent general introduction). Staff officers’ guides and field manuals from various armies, from the 19th century to the present, are full of useful information on field supply allotments and consumption estimates intended to guide battlefield sustainment. The archives of modern armies also contain reams of bureaucratic documentation of logistical functions as they actually occurred. Logistics and supply remain a woefully under-studied aspect of warfare, but not because there are no sources upon which to draw.

As to his second question, the answer is also yes. Dupuy addressed logistics in his work in a couple of ways. He included two logistics multipliers in his combat models: one in the calculation of the battlefield effects of weapons, the Operational Lethality Index (OLI), and another as one element of the value for combat effectiveness, which is itself a multiplier in his combat power formula.

Dupuy considered the impact of logistics on combat to be intangible, however. From his historical study of combat, Dupuy understood that logistics impacted both weapons and combat effectiveness, but in the absence of empirical data, he relied on subject matter expertise to assign it a specific value in his model.

Logistics or supply capability is basic in its importance to combat effectiveness. Yet, as in the case of the leadership, training, and morale factors, it is almost impossible to arrive at an objective numerical assessment of the absolute effectiveness of a military supply system. Consequently, this factor also can be applied only when solid historical data provides a basis for objective evaluation of the relative effectiveness of the opposing supply capabilities.[1]

His approach to this stands in contrast to other philosophies of combat model design, which hold that if a factor cannot be empirically measured, it should not be included in a model. (It is up to the reader to decide if this is a valid approach to modeling real-world phenomena or not.)

Yet, as with many aspects of the historical study of combat, Dupuy and his colleagues at the Historical Evaluation and Research Organization (HERO) had taken an initial cut at empirical research on the subject. In the late 1960s and early 1970s, Dupuy and HERO conducted a series of studies for the U.S. Air Force on the historical use of air power in support of ground warfare. One line of inquiry looked at the effects of air interdiction on supply, specifically at Operation STRANGLE, an effort by the U.S. and British air forces to completely block the lines of communication and supply of German ground forces defending Rome in 1944.

Dupuy and HERO dug deeply into Allied and German primary source documentation to extract extensive data on combat strengths and losses, logistical capabilities and capacities, supply requirements, and aircraft sorties and bombing totals. Dupuy proceeded from a historically-based assumption that combat units, using expedients, experience, and training, could operate unimpaired while only receiving up to 65% of their normal supply requirements. If the level of supply dipped below 65%, the deficiency would begin impinging on combat power at a rate proportional to the percentage of loss (i.e., a 60% supply rate would impose a 5% decline, represented as a combat effectiveness multiplier of .95, and so on).
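Dupuy’s threshold rule lends itself to a simple illustration. The sketch below is a hypothetical implementation of the rule as described above (the function name and structure are mine, not drawn from Dupuy’s models): no penalty at or above the 65% supply threshold, and a point-for-point decline in combat effectiveness below it.

```python
def supply_multiplier(supply_rate, threshold=0.65):
    """Combat effectiveness multiplier from the supply rate.

    supply_rate: fraction of normal supply requirements received (0.0-1.0).
    At or above the threshold there is no penalty; below it, effectiveness
    falls point-for-point with the shortfall.
    """
    if supply_rate >= threshold:
        return 1.0
    return round(1.0 - (threshold - supply_rate), 4)

# The example from the text: a 60% supply rate imposes a 5% decline,
# i.e., a combat effectiveness multiplier of 0.95.
print(supply_multiplier(0.60))  # 0.95
print(supply_multiplier(0.80))  # 1.0
```

The rounding simply keeps the floating-point result tidy; the substance is the piecewise rule.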

Using this as a baseline, Dupuy and HERO calculated the amount of aerial combat power the Allies needed to apply to impact German combat effectiveness. They determined that Operation STRANGLE was able to reduce German supply capacity to about 41.8% of normal, which yielded a reduction in the combat power of German ground combat forces by an average of 6.8%.

He cautioned that these calculations were “directly relatable only to the German situation as it existed in Italy in late March and early April 1944.” As detailed as the analysis was, Dupuy stated that it “may be an oversimplification of a most complex combination of elements, including road and railway nets, supply levels, distribution of targets, and tonnage on targets. This requires much further exhaustive analysis in order to achieve confidence in this relatively simple relationship of interdiction effort to supply capability.”[2]

The historical work done by Dupuy and HERO on logistics and combat appears unique, but it seems highly relevant. There is no lack of detailed data from which to conduct further inquiries. The only impediment appears to be lack of interest.

NOTES

[1] Trevor N. Dupuy, Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles (Indianapolis; New York: The Bobbs-Merrill Co., 1979), p. 38.

[2] Ibid., pp. 78-94.

[NOTE: This post was edited to clarify the effect of supply reduction through aerial interdiction in the Operation STRANGLE study.]

Logistics In War

“Amateurs study tactics, armchair generals study strategy, but professionals study logistics,” as the old saw goes. While the veracity of this statement is debatable, there can be little doubt that the study and appreciation of the role of sustainment in warfare lags behind that of the sexier topics of strategy and tactics. A new blog, Logistics in War [also on Facebook (https://www.facebook.com/logisticsinwar/) and Twitter (@logisticsinwar)], is seeking to change that.

The anonymous and somewhat mysterious purveyor of the blog bills it as “a public, unofficial, ‘Professional Military Education’ site,” the purpose of which is “to instigate and inspire, continue and create, a discussion on military logistics that is so often sorely lacking (or if it does occur, does so behind closed doors).”

It seems safe to conclude that the blog’s owner is an Australian Army loggie, however: “Although the blog currently reflects an Australian and Army orientation, its vision is to become broadly applicable; to reflect the many different approaches to logistics as practiced by different military Services, the Joint domain, and militaries of all persuasions.”

The initial posts range in subject from a list of suggested readings about logistics, to the impact of sustainment in battle in recent history, to the challenges of supplying combat forces in the multi-domain battle construct. The writing is crisp, clear, and professional, and the questions and topics addressed are of undeniable importance. Logistics in War is a welcome addition to the online conversation about warfare, and is well worth the time to peruse. It will be very interesting to watch it progress and grow.

Army And Marine Corps Join Forces To Define Multi-Domain Battle Concept

U.S. Army Chief of Staff General Mark Milley and U.S. Marine Corps Commandant General Robert Neller recently signed a joint white paper, to be sent for review by Chairman of the Joint Chiefs of Staff General Joseph Dunford, Jr., outlining the collective views of their services on what has been termed “multi-domain battle.” The Army and Marine Corps have also established a joint task force to develop tactics applicable to the concept.

Multi-domain battle is a concept that has evolved in response to the challenges posed by the anti-access/area-denial capabilities fielded by potential U.S. military rivals, such as Russia, China, and Iran. Its proponents argue that, in its broadest application, the concept seeks to expand the principles of combined arms tactics beyond the traditional air/sea/land service boundaries and apply them to joint operations and newly emerging domains such as cyber warfare and information operations. Trevor Dupuy postulated that the employment of combined arms on the battlefield was one solution armies have historically adopted to adapt to increases in weapon lethality over time.

When the Army officially introduced the concept last year, General Milley said “This is pretty much the beginning of a new way of thinking.” General Neller echoed Milley’s comments. “We’ve been shoulder-and-shoulder on multi-domain battle and land concepts. We can’t afford to waste any resources on duplication when it’s not necessary. We see the problem the same way; we have the same conclusions.” U.S. Pacific Command (USPACOM) commander, U.S. Navy Admiral Harry B. Harris commented last fall that

We need a degree of jointness, in my opinion, in which no one military service dominates and no domain has a fixed boundary. A combatant commander must be able to create effects from any single domain to target in every domain in order to fight tonight and win. [I need] a true land-based cross-domain capability [that] offers us an integrated joint force capable of deterring rising powers by denying them the domains in which they seek to operate.

U.S. Army, Pacific (USARPAC) is currently working with USPACOM to finalize exercises scheduled for this spring to test multi-domain battle warfighting concepts. Similar exercises are being planned for Europe in 2018.

There is a sense of urgency regarding multi-domain battle in the Pacific, given ongoing tensions with North Korea and recent comments by Trump Administration officials regarding the South China Sea. USARPAC commander General Robert Brown recently stated, “This isn’t something 10 years from now. If Kim Jong-un goes south tomorrow, I will need some of this tomorrow.”

Even as the Army and Marine Corps move forward with integrating multi-domain battle into their combat doctrines, the concept is not without its discontents. Aside from Admiral Harris, the Navy has had little to say about multi-domain battle. The U.S. Air Force has also expressed skepticism that U.S. land combat forces will reduce their dependence on air power anytime soon. When the Army raised concerns last year about capabilities Russian forces had demonstrated in Ukraine, some in its sister services and the national security community accused it of alarmism in support of its lobbying for an increased share of the defense budget.

Whether or not multi-domain battle survives as an organic concept, it seems to be spurring useful thinking about warfare in the near future. In addition to stimulating new technological research and development (the Third Offset Strategy), it is leading to new ways of looking at command and control, planning, and notions of “jointness.”

Trevor Dupuy and Historical Trends Related to Weapon Lethality

There appears to be renewed interest in U.S. Army circles in Trevor Dupuy’s theory of a historical relationship between increasing weapon lethality, declining casualty rates, and greater dispersion on the battlefield. A recent article by Army officer and strategist Aaron Bazin, “Seven Charts That Help Explain American War” at The Strategy Bridge, used a composite version of two of Dupuy’s charts to explain the American military’s attraction to technology. (The graphic in Bazin’s article originated in a 2009 Australian Army doctrinal white paper, “Army’s Future Land Operating Concept,” which evidently did not cite Dupuy as the original source for the charts or the associated concepts.)

John McRea, like Bazin a U.S. Army officer, and a founding member of The Military Writer’s Guild, reposted Dupuy’s graphic in a blog post entitled “Outrageous Fortune: Spears and Arrows,” examining tactical and economic considerations in the use of asymmetrical technologies in warfare.

Dr. Conrad Crane, Chief of Historical Services for the U.S. Army Heritage and Education Center at the Army War College, also referenced Dupuy’s concepts in his look at human performance requirements, “The Future Soldier: Alone in a Crowd,” at War on the Rocks.

Dupuy originally developed his theory based on research and analysis undertaken by the Historical Evaluation and Research Organization (HERO) in 1964, for a study he directed, “Historical Trends Related to Weapon Lethality.” (Annex I, Annex II, Annex III). HERO had been contracted by the Advanced Tactics Project (AVTAC) of the U.S. Army Combat Developments Command to provide unclassified support for Project OREGON TRAIL, a series of 45 classified studies of tactical nuclear weapons, tactics, and organization that took 18 months to complete.

AVTAC asked HERO “to identify and analyze critical relationships and the cause-effect aspects of major advances in the lethality of weapons and associated changes in tactics and organization” from the Roman Era to the present. HERO’s study itself was a group project, incorporating 58 case studies from 21 authors, including such scholars as Gunther E. Rothenberg, Samuel P. Huntington, S.L.A. Marshall, R. Ernest Dupuy, Grace P. Hayes, Louis Morton, Peter Paret, Stefan T. Possony, and Theodore Ropp.

Dupuy synthesized and analyzed these case studies for the HERO study’s final report. He described what he was seeking to establish in his 1979 book, Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles.

If the numbers of military history mean anything, it appears self-evident that there must be some kind of relationship between the quantities of weapons employed by opposing forces in combat, and the number of casualties suffered by each side. It also seems fairly obvious that some weapons are likely to cause more casualties than others, and that the effectiveness of weapons will depend upon their ability to reach their targets. So it becomes clear that the relationship of weapons to casualties is not quite the simple matter of comparing numbers to numbers. To compare weapons to casualties it is necessary to know not only the numbers of weapons, but also how many there are of each different type, and how effective or lethal each of these is.

The historical relationship between lethality, casualties, and dispersion that Dupuy deduced in this study provided the basis for his subsequent quest to establish an empirically-based, overarching theory of combat, which he articulated through his Quantified Judgement Model. Dupuy refined and updated the analysis from the 1964 HERO study in his 1980 book, The Evolution of Weapons and Warfare.

An Additional Comment on the Link Between Operations, Strategy, and Policy In Russian Hybrid Warfare

A conclusion that Fox alluded to in his article, but did not state explicitly, is that in a sense, the Russians “held back” in the design of their operations against the Ukrainians. It appears quite clear that the force multipliers derived from the battalion tactical groups, drone-enabled recon-strike model, and cyber and information operations capabilities generated more than enough combat power for the Russians to decisively defeat the Ukrainian Army in a larger “blitzkrieg”-style invasion and occupy most, if not all, of the country, if they had chosen to do so.

This clearly is not the desired political goal of the Russian government, however. Instead, the Russian General Staff carefully crafted a military strategy to fulfill more limited political goals, and creatively designed their operations to make full use of their tactical capabilities in support of that strategy.

This successful Clausewitzian calibration of policy, strategy, operations, and tactics by the Russians in Ukraine and Syria should give the U.S. real concern, since the U.S. itself does not currently seem capable of a similar level of coordination or finesse. Now, the Russian achievements against the relatively hapless Ukrainians, or in Syria, where the ultimate outcome remains very much indeterminate, are no guarantee of future success against more capable and well-resourced opponents. They do demonstrate, however, what can be achieved with a relatively weak strategic hand through a clear unity of political purpose and military means. This has not been the U.S.’s strong suit historically, and it is unclear at this juncture whether that will change under the incoming Trump administration.

Linking Operations, Strategy, and Policy In Russian Hybrid Warfare

Map depicting the encirclement and withdrawal of Ukrainian forces in the Debaltseve area, 14 January – 20 February 2015 [Map by Goran tek-en (Wikipedia)]

U.S. Army Major Amos Fox, who is quickly establishing himself as one of the brighter sparks analyzing the contemporary Russian way of land warfare, has a new article, “The Russian–Ukrainian War: Understanding the Dust Clouds on the Battlefield,” published by West Point’s Modern War Institute. In it he assesses the linkage between Russian land warfare operations, strategy, and policy.

In Fox’s analysis, despite the decisive advantages afforded to the Russian Army and their Ukrainian Separatist proxies through “the employment of the semi-autonomous battalion tactical group, and a reconnaissance-strike model that tightly couples drones to strike assets, hastening the speed at which overwhelming firepower is available to support tactical commanders,” the actual operations executed by these forces should be characterized as classic sieges, as opposed to decisive operational maneuver.

Fox details three operations employing this approach – tactical combat overmatch enabling envelopment and the subsequent application of steady pressure – that produced military success leading directly to political results advantageous to the Russian government.

According to Fox, the military strategy of siege operations effectively enabled the limited political goals of the Russian government.

What explains Russia’s evident preference for the siege? Would it not make more sense to quickly annihilate the Ukrainians? Perhaps. However, the siege’s benefit is its ability to transfer military power into political progress, while obfuscating the associated costs. A rapid, violent, decisive victory in which hundreds of Ukrainian soldiers are killed in a matter of days is counterproductive to Russia’s political goals, whereas the incremental use of violence over time accomplishes the same objectives with less disturbance to the international community.

Fox believes that this same operational concept was applied by the Syrian Army and its Russian enablers to capture the city of Aleppo last month, albeit with somewhat different tactics, such as substituting airstrikes for long-range artillery and rockets.

He advises that the U.S. would be prudent to plan for and prepare to face the new Russian land warfare capabilities.

These new features of Russian warfare—and an understanding of them in the context of that warfare’s very conventional character—should inform US planning. The contemporary Russian army is combat-experienced in combined arms maneuver at all echelons of command, a skill that the US Army is still working to recover after well over a decade of counterinsurgency operations in Iraq and Afghanistan. This fact could prove troublesome if Russia elects to push further in Europe, infringing upon NATO partners, or if US and Russian interests continue to collide in areas like Syria. Preparing to combat Russian cyber threats or hybrid tactics is important. But the lesson from Ukraine is clear: It is equally vital to train and equip US forces to counter the type of conventional capabilities Russia has demonstrated in Ukraine.

UPDATE: An Additional Comment on the Link Between Operations, Strategy, and Policy In Russian Hybrid Warfare

Military Effectiveness and Cheese-Eating Surrender Monkeys

The International Security Studies Forum (ISSF) has posted a roundtable review on H-Diplo of Jasen J. Castillo’s Endurance and War: The National Sources of Military Cohesion (Stanford, CA: Stanford University Press, 2014). As the introduction by Alexander B. Downes of The George Washington University lays out, there is a considerable political science literature that addresses the question of military effectiveness, or why some militaries are more effective combatants than others. Castillo focused on why some armies fight hard, even when faced with heavy casualties and the prospect of defeat, and why some become ineffective or simply collapse. The example most often cited in this context – as Downes and Castillo do – is the French Army. Why were the French routed so quickly in 1940 when they had fought so much harder and incurred far higher casualties in 1914? (Is this characterization of the French entirely fair? I’ll take a look at that question below.)

According to Downes, for his analysis, Castillo defined military cohesion as staying power and battlefield performance. He identified two factors that were primary in determining military cohesion: the persuasiveness of a regime’s ideology and coercive powers and the military’s ability to train its troops free from political interference. From this, Castillo drew two conclusions, one counterintuitive, the other in line with prevailing professional military thought.

  • “First, regimes that exert high levels of control over society—through a combination of an ideology that demands ‘unconditional loyalty’ (such as nationalism, communism, or fascism) and the power to compel recalcitrant individuals to conform—will field militaries with greater staying power than states with low levels of societal control.”
  • “Second, states that provide their military establishments with the autonomy necessary to engage in rigorous and realistic training will generate armies that fight in a determined yet flexible fashion.”

Based on his analysis, Castillo defines four military archetypes:

  • “Messianic militaries are the most fearsome of the lot. Produced by countries with high levels of regime control that give their militaries the autonomy to train, such as Nazi Germany, messianic militaries possess great staying power and superior battlefield performance.”
  • “Authoritarian militaries are also generated by nations with strong regime control over society, but are a notch below their messianic cousins because the regime systematically interferes in the military’s affairs. These militaries have strong staying power but are less nimble on the battlefield. The Red Army under Joseph Stalin is a good example.”
  • “Countries with low regime control but high military autonomy produce professional militaries. These militaries—such as the U.S. military in Vietnam—perform well in battle but gradually lose the will to fight as victory recedes into the distance.”
  • “Apathetic militaries, finally, are characteristic of states with both low regime control and low military autonomy, like France in 1940. These militaries fall apart quickly when faced with adversity.”

The discussion panel – Brendan Rittenhouse Green, (University of Cincinnati); Phil Haun (Yale University); Austin Long (Columbia University); and Caitlin Talmadge (The George Washington University) – reviewed Castillo’s work favorably. Their discussion and Castillo’s response are well worth the time to read.

Now, to the matter of France’s alleged “apathetic military.” The performance of the French Army in 1940 has earned the country the infamous reputation of being “cheese-eating surrender monkeys.” Is this really fair? Well, if measured in terms of France’s perseverance in post-World War II counterinsurgency conflicts, the answer is most definitely no.

As detailed in Chris Lawrence’s book America’s Modern Wars, TDI looked at the relationship between national cost of foreign interventions and the outcome of insurgencies. One method used to measure national burden was the willingness of intervening states to sustain casualties. TDI found a strong correlation between high levels of casualties to intervening states and the failure of counterinsurgency efforts.

Among the cases in TDI’s database of post-World War II insurgencies, interventions, and peace-keeping operations, the French were the most willing, by far, to sustain the burden of casualties waging counterinsurgencies. In all but one of 17 years of continuous post-World War II conflict in Indochina and Algeria, democratic France’s apathetic military lost from 1 to 8 soldiers killed per 100,000 of its population.

In comparison, the U.S. suffered a similar casualty burden in Vietnam for only five years, incurring losses of 1.99 to 7.07 killed per 100,000 population between 1966 and 1970, which led to “Vietnamization” and withdrawal by 1973. The United Kingdom was even more sensitive to casualties. It waged multiple post-World War II insurgencies. In the two that it won, the casualty burden stayed low: the Malayan Emergency produced a burden of 0.09 British killed per 100,000 over its 13 years, while Northern Ireland (1968–1998) never got above 0.19 British soldiers killed per 100,000 during its 31 years, and for 20 of those years was below 0.025 per 100,000. The British also lost several counterinsurgencies with far lower casualty burdens than those of the French. Of those, the bloodiest was Palestine, where British losses peaked at 0.28 killed per 100,000 in 1948, which is also the year they withdrew.
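The casualty burden metric used throughout these comparisons is straightforward to compute. A minimal sketch follows; the function name and the illustrative figures are mine for demonstration, not taken from TDI’s database:

```python
def casualty_burden(killed, population):
    """Soldiers killed per 100,000 of the intervening state's population,
    the burden measure used in the comparisons above."""
    return killed / population * 100_000

# Hypothetical illustration: 2,000 soldiers killed in a year by a state
# of 50 million yields a burden of 4.0 per 100,000.
print(casualty_burden(2_000, 50_000_000))  # 4.0
```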

Of the allegedly fearsome “authoritarian militaries,” only Portugal rivaled the staying power of the French. Portugal’s dictatorial Estado Novo government waged three losing counterinsurgencies in Africa over 14 years, suffering from 1 to 3.5 soldiers killed per 100,000 annually, and between 2.5 and 3.5 killed per 100,000 in nine of those years. The failure of these wars also contributed to the overthrow of Portugal’s dictatorship.

The Soviet Union’s authoritarian military had a casualty burden between 0.22 and 0.75 soldiers killed per 100,000 in Afghanistan from 1980 through 1988. It withdrew after losing 14,571 dead (the U.S. suffered 58,000 killed in Vietnam), and the conflict is often cited as a factor in the collapse of the Soviet government in 1991.

Castillo’s analysis and analytical framework, which I have not yet read in full, appear intriguing and have received critical praise. Like much analysis of military history, however, they seem to explain the exceptions — the brilliant victories and unexpected defeats — rather than the far more prevalent cases of indecisive or muddled outcomes.

Mosul and ISF Combat Effectiveness

The situation in Mosul, 16-19 December 2016 (Institute for the Study of War)

After a period of “operational refit,” Iraqi Security Forces (ISF) waging battle with Daesh fighters for control of the city of Mosul launched a new phase of their advance on 29 December. The initial phase of the assault, which began on 17 October 2016, ground to a halt due to strong Daesh resistance and heavy casualties among the Iraqi Counterterrorism Service (CTS) troops spearheading the operation.

For the new offensive, the CTS was reinforced with additional Iraqi Army ground units, as well as an armored element of the Federal Police. Additional U.S. combat forces and advisors have also been moved closer to the front lines in support.

Although possessing an enormous manpower advantage over the Daesh defenders, the ISF managed to secure only one-quarter of the city in two months of combat. This is likely because the only ISF elements to have demonstrated any offensive combat effectiveness are the CTS and the Popular Mobilization Forces (PMF, or Hash’d al Shaabi), the Iraqi Shi’a militia mobilized by Grand Ayatollah Ali Sistani in 2014. PMF brigades hold the western outskirts of the city, but thus far have been restrained from entering it for fear of provoking sectarian violence with the mostly Sunni residents.

Daesh defenders, believed to number only from 3,000-5,000 at the outset of the battle, have had the luxury of fighting against only one axis of advance and within urban terrain filled with trapped civilians, which they have used as human shields. They mounted a particularly effective counterattack against the CTS using vehicle-borne improvised explosive devices (VBIEDs), which halted the initial offensive in mid-December. ISF casualties appear to be concentrated in the elite 1st Special Operations Brigade (the so-called “Golden Division”) of the CTS. An unnamed Pentagon source was quoted as stating that the Golden Division’s maneuver battalions had incurred “upwards of 50 percent casualties,” which, if sustained, would render it combat ineffective in less than a month.

The Iraqi government has come to rely on the Golden Division to generate reliable offensive combat power. It spearheaded the attacks that recovered Tikrit, Ramadi, and Fallujah earlier in the year. Originally formed in 2004 as the non-sectarian Iraqi Special Operations Forces brigade, the Golden Division was amalgamated into the CTS in 2007 along with specialized counterterrorism and national police elements. Although intended for irregular warfare, the CTS appears to be the only Iraqi military force capable of effective conventional offensive combat operations, likely due to its higher level of combat effectiveness relative to the rest of the ISF, as well as its interoperability with U.S. and Coalition supporting forces.

Historically, the Iraqi Army has not demonstrated a high level of overall combat effectiveness. Trevor Dupuy’s analysis of the performance of the various combatants in the 1973 Arab-Israeli War ranked the Iraqi Army behind that of the Israelis, Jordanians, Egyptians, and Syrians. He estimated the Israelis to have a 3.43 to 1.00 combat effectiveness advantage over the Iraqis in 1973. Dupuy credited the Iraqis with improved effectiveness following the 1980-88 Iran-Iraq War in his pre-war estimate of the outcome of the 1990-91 Gulf War. This turned out to be erroneous; overestimation of Iraqi combat effectiveness in part led Dupuy to predict a higher casualty rate for U.S. forces than actually occurred. The ineffective performance of the Iraqi Army in 2003 should not have surprised anyone.
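The kind of comparison Dupuy made can be illustrated schematically. The sketch below is a simplified stand-in for, not a reproduction of, his Quantified Judgement Model: relative combat power is treated as the raw strength ratio scaled by a relative combat effectiveness value (CEV). The function name is mine, for illustration only.

```python
def effective_power_ratio(strength_a, strength_b, cev_a_over_b):
    """Schematic force comparison: the raw strength ratio of side A to
    side B, scaled by side A's relative combat effectiveness value (CEV).
    An illustration only, not Dupuy's full model."""
    return (strength_a / strength_b) * cev_a_over_b

# At numerical parity, Dupuy's estimated 3.43 Israeli CEV over the
# Iraqis in 1973 implies a 3.43:1 effective combat power advantage.
print(effective_power_ratio(1.0, 1.0, 3.43))  # 3.43
```

The point of the CEV is visible in the arithmetic: a force can be outnumbered and still hold an effective combat power advantage if its CEV is high enough.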

The relative success of the CTS can be seen as either indicative of the general failure of the decade-long U.S. effort to rebuild an effective Iraqi military establishment, or as an exemplary success of the U.S. Special Operations Forces model for training and operating with indigenous military forces. Or both.

Tanks and Russian Hybrid Warfare

U.S. Army Major Amos Fox, currently a student at the U.S. Army Command and General Staff College, has produced an insightful analysis of the role of tanks in Russian hybrid warfare tactics and operations. His recent article in Armor, the journal of the U.S. Army Maneuver Center of Excellence at Ft. Benning, Georgia, offers a sense of the challenges of high-intensity combat on the near-future hybrid warfare battlefield.

Fox assesses current Russian Army tactical and operational capabilities as formidable.

Russia’s contemporary operations embody the characteristic of surprise. Russian operations in Georgia and Ukraine demonstrate a rapid, decentralized attack seeking to temporally dislocate the enemy, triggering the opposing forces’ defeat. These methods stand in stark contrast to the old Soviet doctrine of methodical, timetable-and echelon-driven employment of ground forces that sought to outmass the opposing army. Current Russian land-warfare tactics are something which most armies, including the U.S. Army, are largely unprepared to address.

Conversely, after achieving limited objectives, Russia quickly transitions to the defense using ground forces, drones and air-defense capabilities to build a tough, integrated position from which extrication would be difficult, to be sure. Russia’s defensive operations do not serve as a simple shield, but rather, as a shield capable of also delivering well-directed, concentrated punches on the opposition army. Russia’s paradoxical use of offensive operations to set up the defense might indicate an ascendency of the defense as the preferred method of war in forthcoming conflicts.

These capabilities will pose enormous challenges to U.S. and allied forces in any potential land combat scenario.

Russia’s focus on limited objectives, often in close proximity to its own border, indicates that U.S. Army combined-arms battalions and cavalry squadrons will likely find themselves on the wrong end of the “quality of firsts” (Figure 4). The U.S. Army’s physical distance from those likely battlefields sets the Army at a great disadvantage because it will have to hastily deploy forces to the region, meaning the Army will arrive late; the arrival will also be known (location, time and force composition). The Army will have great difficulty seizing the initiative due to its arrival and movement being known, which weakens the Army’s ability to fight and win decisively. This dynamic provides time, space and understanding for the enemy to further prepare for combat operations and strengthen its integrated defensive positions. Therefore, U.S. Army combined-arms battalions and cavalry squadrons must be prepared to fight through a rugged enemy defense while maintaining the capability for continued offensive operations.

Fox’s entire analysis is well worth reading and pondering. He also published another excellent analysis of Russian hybrid warfare with a General Staff College colleague, Captain (P) Andrew J. Rossow, in Small Wars Journal.

What Is The Relationship Between Rate of Fire and Military Effectiveness?

Over at his Best Defense blog, Tom Ricks recently posed an interesting question: Is rate of fire no longer a key metric in assessing military effectiveness?

Rate of fire doesn’t seem to be important in today’s militaries. I mean, everyone can go “full auto.” Rather, the problem seems to me firing too much and running out of ammunition.

I wonder if this affects how contemporary military historians look at the tactical level of war. Throughout most of history, the problem, it seems to me, was how many rocks, spears, arrows or bullets you could get off. Hence the importance of drill, which was designed to increase the volume of infantry fire (and to reduce people walking off the battlefield when they moved back to reload).

There are several ways to address this question from a historical perspective, but one place to start is to look at how rate of fire relates historically to combat.

Rate of fire is one of several measures of a weapon’s ability to inflict damage, i.e. its lethality. In the early 1960s, Trevor Dupuy and his associates at the Historical Evaluation Research Organization (HERO) assessed whether historical trends in increasing weapon lethality were changing the nature of combat. To measure this, they developed a methodology for scoring the inherent lethality of a given weapon, the Theoretical Lethality Index (TLI). TLI is the product of five factors:

  • rate of fire
  • targets per strike
  • range factor
  • accuracy
  • reliability

In the TLI methodology, rate of fire is defined as the number of effective strikes a weapon can deliver under ideal conditions in increments of one hour, and assumes no logistical limitation.
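The arithmetic of the TLI is straightforward to sketch. The factor values below are illustrative placeholders, not Dupuy's actual published scores (his tables in The Evolution of Weapons and Warfare and Attrition give the real values); the sketch simply shows how a large rate-of-fire advantage dominates modest penalties in accuracy and reliability, as in the machine gun vs. semi-automatic rifle comparison below.

```python
def theoretical_lethality_index(rate_of_fire, targets_per_strike,
                                range_factor, accuracy, reliability):
    """TLI = the product of the five weapon-lethality factors."""
    return (rate_of_fire * targets_per_strike * range_factor
            * accuracy * reliability)

# Hypothetical comparison: a weapon with 10x the rate of fire but lower
# accuracy and reliability still scores several times higher.
rifle = theoretical_lethality_index(60, 1.0, 1.0, 0.9, 0.95)
mg = theoretical_lethality_index(600, 1.0, 1.0, 0.7, 0.85)
print(mg / rifle)  # roughly 7x, despite the accuracy/reliability penalty
```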

As measured by TLI, increased rates of fire do indeed increase weapon lethality. The TLI of an early 20th century semi-automatic rifle is nearly five times higher than that of a mid-19th century muzzle-loaded rifle due to its higher rate of fire. Despite having lower accuracy and reliability, a World War II-era machine gun has 10 times the TLI of a semi-automatic rifle due to its rate of fire. The rate of fire of small arms has not increased since the early-to-mid 20th century, and the assault rifle, adopted by modern armies following World War II, remains the standard infantry weapon in the early 21st century.

[Figure: Attrition, Figure 11]

Rate of fire is just one of many factors that influence a weapon’s lethality, however. Artillery has much higher TLI values than small arms despite lower rates of fire, for two obvious reasons: artillery has far greater range, and each round of ammunition can hit multiple targets per strike.

There are other methods for scoring weapon lethality, but the TLI provides a logical and consistent methodology for comparing weapons to each other. Through the TLI, Dupuy substantiated the observation that weapons have indeed become more lethal over time, particularly in the last century.

But if weapons have become more lethal, has combat become bloodier? No. Dupuy and his colleagues also discovered that, counterintuitively, the average casualty rates in land combat have been declining since the 17th century. Combat casualty rates did climb in the early and mid-19th century, but fell again precipitously from the later 19th century through the end of the 20th.

[Figure: Attrition, Figure 13]

The reason, Dupuy determined, was that armies have historically adapted to increases in weapon lethality by dispersing in greater depth on the battlefield, decentralizing tactical decision-making, enhancing mobility, and placing a greater emphasis on combined arms tactics. The area occupied by 100,000 soldiers increased 4,000 times between antiquity and the late 20th century. Average ground force dispersion increased by a third between World War II and the 1973 Yom Kippur War, and he estimated it had increased by another quarter by 1990.

[Figure: Attrition, Figure 14]

Simply put, even as weapons become more deadly, there are fewer targets on the battlefield for them to hit. Through the mid-19th century, the combination of low rates of fire and relatively short range required the massing of infantry fires in order to achieve lethal effect. Before 1850, artillery caused more battlefield casualties than infantry small arms. This ratio changed due to the increased rates of fire and range of the rifled and breech-loading weapons introduced in the 1850s and 1860s. The majority of combat casualties in conflicts of the mid-to-late 19th century were inflicted by infantry small arms.

[Figure: Attrition, Figure 19]

The lethality of modern small arms combined with machine guns led to further dispersion and the decentralization of tactical decision-making in early 20th century warfare. The increased destructiveness of artillery, due to improved range and more powerful ammunition, coupled with the invention of the field telephone and indirect fire techniques during World War I, restored the long arm to its role as king of the battlefield.

[Figure: Attrition, Figure 35]

Dupuy represented this historical relationship between lethality and dispersion on the battlefield by applying a dispersion factor to TLI values to obtain what he termed the Operational Lethality Index (OLI). By accounting for these effects, OLI values are a good theoretical approximation of relative weapon effectiveness.
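The relationship can be sketched as a simple adjustment to the TLI. The numbers below are illustrative assumptions, not Dupuy's actual dispersion factors; the point is only that when dispersion grows at the same pace as theoretical lethality, operational lethality stays roughly flat, which is consistent with the declining casualty rates noted above.

```python
def operational_lethality_index(tli, dispersion_factor):
    """OLI = TLI adjusted by a battlefield dispersion factor.

    The dispersion factor shrinks as forces spread out, offsetting
    gains in a weapon's theoretical lethality.
    """
    return tli * dispersion_factor

# A weapon 10x more lethal on paper, fielded on a battlefield 10x more
# dispersed, delivers the same operational lethality:
oli_old = operational_lethality_index(1_000, 1.0)
oli_new = operational_lethality_index(10_000, 0.1)
print(oli_old, oli_new)  # identical operational values
```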

[Figure: Numbers, Predictions and War, Figure 2-5]

Although little empirical research has been done on this question, it seems logical that the trend toward greater use of precision-guided weapons is at least a partial response to the so-called “empty battlefield.” The developers of the Third Offset Strategy postulated that the emphasis on developing precision weaponry by the U.S. in the 1970s was a calculated response to offset the Soviet emphasis on mass firepower (i.e. the “second offset”). The goal of modern precision weapons is “one shot, one kill,” where a reduced rate of fire is compensated for by greater range and accuracy. Such weapons have become sufficiently lethal that the best way to survive on a modern battlefield is to not be seen.

At least, that was the conventional wisdom until recently. The U.S. Army in particular is watching how the Ukrainian separatist forces and their Russian enablers are using new artillery weapons, drones, information technology, and tactics to engage targets with massed fires. Some critics have alleged that the U.S. artillery arm has atrophied during the Global War on Terror and may no longer be capable of overmatching potential adversaries. It is not yet clear whether a real competition between mass and precision fires will play out on the battlefields of the near future, but it is possible that this development signals yet another shift in the historical relationship between lethality, mobility, and dispersion in combat.

SOURCES

Trevor N. Dupuy, Attrition: Forecasting Battle Casualties and Equipment Losses in Modern War (Falls Church, VA: NOVA Publications, 1995)

_____. Understanding War: History and Theory of Combat (New York: Paragon House, 1987)

_____. The Evolution of Weapons and Warfare (Indianapolis, IN: The Bobbs-Merrill Company, Inc., 1980)

_____. Numbers, Predictions and War: Using History to Evaluate Combat Factors and Predict the Outcome of Battles (Indianapolis; New York: The Bobbs-Merrill Co., 1979)