Briefly and in the most general terms possible, I suggest that the long-term effect of dominant firepower will be threefold. It will disperse mass in the form of a “net” of small detachments with the dual role of calling down fire and of local quasi-guerrilla action. Because of its low density, the elements of this net will be everywhere and will thus need only the mobility of the boot. It will transfer mass, structurally from the combat arms to the artillery, and in deployment from the direct fire zone (as we now understand it) to the formation and protection of mobile fire bases capable of movement at heavy-track tempo (Chapter 9). Thus the third effect will be to polarise mobility, for the manoeuvre force still required is likely to be based on the rotor. This line of thought is borne out by recent trends in Soviet thinking on the offensive. The concept of an operational manoeuvre group (OMG) which hives off raid forces against C3 and indirect fire resources is giving way to more fluid and discontinuous manoeuvre by task forces (“air-ground assault groups” found by “shock divisions”) directed onto fire bases—again of course with an operational helicopter force superimposed. [Simpkin, Race To The Swift, p. 169]
It seems to me that in the mid-1980s, Simpkin predicted with reasonable accuracy the emergence of modern anti-access/area denial (A2/AD) defensive systems, as well as the evolving thinking on the part of the U.S. military as to how to operate against them.
Simpkin’s vision of task forces (which more closely resemble Russian/Soviet OMGs than rotary wing “air-ground assault group” operational forces, however) employing “fluid and discontinuous manoeuvre” at operational depths to attack long-range precision firebases appears similar to emerging Army thinking about future multidomain operations. (Douglas MacGregor’s Reconnaissance Strike Group concept likely fits that bill more closely.)
One thing he missed was his belief that rotary wing combat forces would supplant armored forces as the primary deep operations combat arm. However, drone swarms might conceivably take the place in Simpkin’s operational construct that he allotted to heliborne forces. Drones have two primary advantages over manned helicopters: they are far cheaper and they are far less vulnerable to enemy fires. With their unique capacity to blend mass and fires, drones could form the deep strike operational hammer that Simpkin saw rotary wing forces providing.
Just as interesting was Simpkin’s anticipation of the growing importance of information and electronic warfare in these environments. More on that later.
For my money, one of the most underrated analysts and theorists of modern warfare was the late Brigadier Richard Simpkin. A retired British Army officer and World War II veteran, Simpkin helped design the Chieftain tank in the 1960s and 70s. He is best known for his series of books analyzing Soviet and Western military theory and doctrine. His magnum opus was Race To The Swift: Thoughts on Twenty-First Century Warfare, published in 1985. A brilliant blend of military history, insightful analysis of tactics and technology as well as operations and strategy, and Simpkin’s idiosyncratic wit, the observations in Race To The Swift are becoming more prescient by the year.
Some of Simpkin’s analysis has not aged well, such as its focus on the NATO/Soviet confrontation in Central Europe and its bold prediction that rotary wing combat forces would eventually supplant tanks as the primary combat arm. However, it would be difficult to find a better historical review of the role of armored forces in modern warfare, or of how trends in technology, tactics, and doctrine are interacting with strategy, policy, and politics to change the character of warfare in the 21st Century.
To follow on my previous post on the interchangeability of fire (which I gleaned from Simpkin, of course), I offer this nugget on how increasing weapons lethality would affect 21st Century warfare, written from the perspective of the mid-1980s:
While accidents of ground will always provide some kind of cover, the effect of modern firepower on land force tactics is equally revolutionary. Just as we saw in Part 2 how the rotary wing may well turn force structures inside out, firepower is already turning tactical concepts inside out, by replacing the anvil of troops with an anvil of fire (Fig. 5, page 51)*. The use of combat troops at high density to hold ground or to seize it is already likely to prove highly costly, and may soon become wholly unprofitable. The interesting question is what effect the dominance of firepower will have at operational level.
One school of thought, to which many defence academics on both sides of the Atlantic subscribe, is that it will reduce mobility and bring about a return to positional warfare. The opposite view is that it will put a premium on elusiveness, increasing mobility and reducing mass. On analysis, both these opinions appear rather simplistic, mainly because they ignore the interchangeability of troops and fire…—in other words the equivalence or complementarity of the movement of troops and the massing of fire. They also underrate the part played by manned and unmanned surveillance, and by communication. Another factor, little understood by soldiers and widely ignored, is the weight of fire a modern fast jet in its strike configuration, flying a lo-lo-lo profile, can put down very rapidly wherever required. With modern artillery and air support, a pair of eyes backed up by an unjammable radio and perhaps a thermal imager becomes the equivalent of at least a (company) combat team, perhaps a battle group. [Simpkin, Race To The Swift, pp. 168-169]
Sound familiar? I will return to Simpkin’s insights in future posts, but I suggest you all snatch up a copy of Race To The Swift for yourselves.
[This series of posts is adapted from the article “Artillery Effectiveness vs. Armor,” by Richard C. Anderson, Jr., originally published in the June 1997 edition of the International TNDM Newsletter.]
Table IX shows the distribution of cause of loss by type of armored vehicle. From the distribution it might be inferred that better protected armored vehicles may be less vulnerable to artillery attack. Nevertheless, the heavily armored vehicles still suffered a minimum loss of 5.6 percent due to artillery. Unfortunately, the sample size for heavy tanks was very small: 18 of 980 cases, or only 1.8 percent of the total.
The data are limited at this time to the seven cases.[6] Further research is necessary to expand the data sample so as to permit proper statistical analysis of the effectiveness of artillery versus tanks.
NOTES
[18] Heavy armor includes the KV-1, KV-2, Tiger, and Tiger II.
[19] Medium armor includes the T-34, Grant, Panther, and Panzer IV.
[20] Light armor includes the T-60, T-70, Stuart, armored cars, and armored personnel carriers.
[14] From ORS Joint Report No. 1. An estimated 300 German armored vehicles were found following the battle.
[15] Data from 38th Infantry After Action Report (including “Sketch showing enemy vehicles destroyed by 38th Inf Regt. and attached units 17-20 Dec. 1944”), from 12th SS PzD strength report dated 8 December 1944, and from strengths indicated on the OKW briefing maps for 17 December (1st [circa 0600 hours], 2d [circa 1200 hours], and 3d [circa 1800 hours] situation), 18 December (1st and 2d situation), 19 December (2d situation), 20 December (3d situation), and 21 December (2d and 3d situation).
[16] Losses include confirmed and probable losses.
[17] Data from Combat Interview “26th Infantry Regiment at Dom Bütgenbach” and from 12th SS PzD, ibid.
[11] Five of the 13 counted as unknown were penetrated by both armor piercing shot and by infantry hollow charge weapons. There was no evidence to indicate which was the original cause of the loss.
[12] From ORS Report No. 17.
[13] From ORS Report No. 15. The “Pocket” was the area west of the line Falaise-Argentan and east of the line Vassy-Gets-Domfront in Normandy that was the site in August 1944 of the beginning of the German retreat from France. The German forces were being enveloped from the north and south by Allied ground forces and were under constant, heavy air attack.
German Army 150mm heavy field howitzer 18 L/29.5 battery. [Panzer DB/Pinterest]
Curiously, at Kursk, in the case where the highest percent loss was recorded, the German forces opposing the Soviet 1st Tank Army—mainly the XLVIII Panzer Corps of the Fourth Panzer Army—were supported by proportionately fewer artillery pieces (approximately 56 guns and rocket launchers per division) than the US 1st Infantry Division at Dom Bütgenbach (the equivalent of approximately 106 guns per division)[4]. Nor does it appear that the German rate of fire at Kursk was significantly higher than that of the American artillery at Dom Bütgenbach. On 20 July at Kursk, the 150mm howitzers of the 11th Panzer Division achieved a peak rate of fire of 87.21 rounds per gun. On 21 December at Dom Bütgenbach, the 155mm howitzers of the 955th Field Artillery Battalion achieved a peak rate of fire of 171.17 rounds per gun.[5]
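For scale, the figures in note [4] below work out to a force-wide average on the order of

10,000 rounds ÷ 210 pieces ≈ 48 rounds per gun

on 21 December, so the 955th Field Artillery Battalion’s 171.17 rounds per gun represents a peak several times the average across all supporting pieces, not a typical rate.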
NOTES
[4] The US artillery at Dom Bütgenbach peaked on 21 December 1944 when a total of 210 divisional and corps pieces fired over 10,000 rounds in support of the 1st Division’s 26th Infantry.
[5] Data collected on German rates of fire are fragmentary, but appear to be similar to those of the American Army in World War II. An article on artillery rates of fire that explores the data in more detail will be forthcoming in a future issue of this newsletter. [NOTE: This article was not completed or published.]
Notes to Table I.
[8] The data were found in reports of the 1st Tank Army (Fond 299, Opis‘ 3070, Delo 226). Obvious math errors in the original document have been corrected (the total lost column did not always agree with the totals by cause). The total participated column evidently reflected the starting strength of the unit, plus replacement vehicles. “Burned” in Soviet wartime documents usually indicated a total loss; however, it appears that in this case “burned” denoted vehicles totally lost due to direct fire antitank weapons. “Breakdown” apparently included both mechanical breakdown and repairable combat damage.
[9] Note that the brigade report (Fond 3304, Opis‘ 1, Delo 24) contradicts the army report. The brigade reported that a total of 28 T-34s were lost (9 to aircraft and 19 to “artillery”) and one T-60 was destroyed by a mine. However, this report was made on 11 July, during the battle, and may not have been as precise as the later report recorded by 1st Tank Army. Furthermore, it is not as clear in the brigade report that “artillery” referred only to indirect fire HE and not simply to both direct and indirect fire guns.
The effectiveness of artillery against exposed personnel and other “soft” targets has long been accepted. Fragments and blast are deadly to those unfortunate enough to not be under cover. What has also long been accepted is the relative—if not total—immunity of armored vehicles when exposed to shell fire. In a recent memorandum, the United States Army Armor School disputed the results of tests of artillery versus tanks by stating, “…the Armor School nonconcurred with the Artillery School regarding the suppressive effects of artillery…the M-1 main battle tank cannot be destroyed by artillery…”
This statement may in fact be true,[1] if the advancement of armored vehicle design has greatly exceeded the advancement of artillery weapon design in the last fifty years. [Original emphasis] However, if the statement is not true, then recent research by TDI[2] into the effectiveness of artillery shell fire versus tanks in World War II may be illuminating.
The TDI search found that an average of 12.8 percent of tank and other armored vehicle losses[3] were due to artillery fire in seven cases in World War II where the cause of loss could be reliably identified. The highest percent loss due to artillery was found to be 14.8 percent in the case of the Soviet 1st Tank Army at Kursk (Table II). The lowest percent loss due to artillery was found to be 5.9 percent in the case of Dom Bütgenbach (Table VIII).
The seven cases are split almost evenly between those that show armor losses to a defender and those that show losses to an attacker. The first four cases (Kursk, Normandy I, Normandy II, and the “Pocket”) are engagements in which the side for which armor losses were recorded was on the defensive. The last three cases (Ardennes, Krinkelt, and Dom Bütgenbach) are engagements in which the side for which armor losses were recorded was on the offensive.
Four of the seven cases (Normandy I, Normandy II, the “Pocket,” and Ardennes) represent data collected by operations research personnel utilizing rigid criteria for the identification of the cause of loss. Specific causes of loss were only given when the primary destructive agent could be clearly identified. The other three cases (Kursk, Krinkelt, and Dom Bütgenbach) are based upon combat reports that—of necessity—represent less precise data collection efforts.
However, the similarity in results remains striking. The largest identifiable cause of tank loss found in the data was, predictably, high-velocity armor piercing (AP) antitank rounds. AP rounds were found to be the cause of 68.7 percent of all losses. Artillery was second, responsible for 12.8 percent of all losses. Air attack was third, accounting for 7.4 percent of the total. Unknown causes, which included losses due to hits from multiple weapon types as well as unidentified weapons, inflicted 6.3 percent of the losses and ranked fourth. Other causes, which included infantry antitank weapons and mines, were responsible for 4.8 percent of the losses and ranked fifth.
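These proportions are easy to re-tabulate. The snippet below simply restates the TDI figures quoted above (the category labels are my own shorthand) and confirms that the five categories account for all identified losses:

```python
# Causes of tank/AFV loss in the seven WWII cases examined by TDI,
# as percentages of all losses with an identifiable cause.
LOSS_CAUSES = {
    "High-velocity AP antitank rounds": 68.7,
    "Artillery": 12.8,
    "Air attack": 7.4,
    "Unknown (multiple or unidentified weapons)": 6.3,
    "Other (infantry AT weapons, mines)": 4.8,
}

# Print the causes in descending order of share.
for cause, pct in sorted(LOSS_CAUSES.items(), key=lambda kv: -kv[1]):
    print(f"{cause:<45} {pct:5.1f}%")

# Sanity check: the five categories should cover every identified loss.
print(f"{'Total':<45} {sum(LOSS_CAUSES.values()):5.1f}%")
```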
NOTES
[1] The statement may be true, although it has an “unsinkable Titanic” ring to it. It is much more likely that this statement is a hypothesis, rather than a truism.
[2] As part of this article, a survey of the Research Analysis Corporation’s publications list was made in an attempt to locate data from previous operations research on the subject. A single reference to the study of tank losses was found: Alvin D. Coox and L. Van Loan Naisawald, Survey of Allied Tank Casualties in World War II, CONFIDENTIAL ORO Report T-117, 1 March 1951.
[3] The percentage loss by cause excludes vehicles lost due to mechanical breakdown or abandonment. If these were included, they would account for 29.2 percent of the total lost. However, 271 of the 404 (67.1 percent) abandoned were lost in just two of the cases. These two cases (Normandy II and the Falaise Pocket) cover the period in the Normandy Campaign when the Allies broke through the German defenses and began the pursuit across France.
I have taken a look in previous posts at how the historical relationship identified by Trevor Dupuy between weapon lethality, battlefield dispersion, and casualty rates argues against this assumption with regard to personnel attrition and tank loss rates. What about artillery loss rates? Will long-range precision fires make ground-based long-range precision fire platforms themselves more vulnerable? Historical research suggests that trend was already underway before the advent of the new technology.
In 1976, Trevor Dupuy and the Historical Evaluation and Research Organization (HERO; one of TDI’s corporate ancestors) conducted a study sponsored by Sandia National Laboratory titled “Artillery Survivability in Modern War.” (PDF) The study examined historical artillery loss rates and the causes of those losses, drawing upon quantitative data from the 1973 Arab-Israeli War, the Korean War, and the Eastern Front during World War II.
Conclusions
1. In the early wars of the 20th Century, towed artillery pieces were relatively invulnerable, and they were rarely severely damaged or destroyed except by very infrequent direct hits.
2. This relative invulnerability of towed artillery resulted in general lack of attention to the problems of artillery survivability through World War II.
3. The lack of effective hostile counter-artillery resources in the Korean and Vietnam wars contributed to continued lack of attention to the problem of artillery survivability, although increasingly armies (particularly the US Army) were relying on self-propelled artillery pieces.
4. Estimated Israeli loss statistics of the October 1973 War suggest that because of size and characteristics, self-propelled artillery is more vulnerable to modern counter-artillery means than was towed artillery in that and previous wars; this greater historical physical vulnerability of self-propelled weapons is consistent with recent empirical testing by the US Army.
5. The increasing physical vulnerability of modern self-propelled artillery weapons is compounded by other modern combat developments, including:
a. Improved artillery counter-battery techniques and resources;
b. Improved accuracy of air-delivered munitions;
c. Increased lethality of modern artillery ammunition; and
d. Increased range of artillery and surface-to-surface missiles suitable for use against artillery.
6. Despite this greater vulnerability of self-propelled weapons, Israeli experience in the October war demonstrated that self-propelled artillery not only provides significant protection to cannoneers but also that its inherent mobility permits continued effective operation under circumstances in which towed artillery crews would be forced to seek cover, and thus be unable to fire their weapons.
7. Paucity of available processed, compiled data on artillery survivability and vulnerability limits analysis and the formulation of reliable artillery loss experience tables or formulae.
8. Tentative analysis of the limited data available for this study indicates the following:
a. In “normal” deployment, percent weapon losses by standard weight classification are in the following proportions:
b. Towed artillery losses to hostile artillery (counterbattery) appear in general to vary directly with battle intensity (as measured by percent personnel casualties per day), at a rate somewhat less than half of the percent personnel losses for units of army strength or greater; this is a straight-line relationship, or close to it; the stronger or more effective the hostile artillery is, the steeper the slope of the curve;
c. Towed artillery losses to all hostile anti-artillery means appear in general to vary directly with battle intensity at a rate about two-thirds of the percent personnel losses for units of army strength or greater; the curve rises slightly more rapidly in high intensity combat than in normal or low-intensity combat; the stronger or more effective the hostile anti-artillery means (primarily air and counter-battery), the steeper the slope of the curve;
d. Self-propelled artillery losses appear to be generally consistent with towed losses, but at rates at least twice as great in comparison to battle intensity.
9. There are available in existing records of US and German forces in World War II, and US forces in the Korean and Vietnam Wars, unit records and reports that will permit the formulation of reliable artillery loss experience tables and formulae for those conflicts; these, with currently available, and probably improved, data from the Arab-Israeli wars, will permit the formulation of reliable artillery loss experience tables and formulae for simulations of modern combat under current and foreseeable future conditions.
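The tentative relationships in conclusion 8 amount to straight-line rules of thumb relating artillery losses to battle intensity. As a rough illustration only, they might be sketched as follows; the slope values (0.45, two-thirds, and a 2x self-propelled multiplier) are my reading of conclusions 8b through 8d, not coefficients the study states with precision:

```python
def towed_losses_counterbattery(personnel_casualties_pct_per_day: float) -> float:
    """Percent towed artillery losses per day to hostile counterbattery fire.

    Conclusion 8b: varies roughly linearly with battle intensity (percent
    personnel casualties per day), at somewhat less than half the personnel
    rate. The 0.45 slope is an assumed, illustrative value.
    """
    return 0.45 * personnel_casualties_pct_per_day


def towed_losses_all_means(personnel_casualties_pct_per_day: float) -> float:
    """Percent towed artillery losses per day to all anti-artillery means.

    Conclusion 8c: roughly two-thirds of the personnel loss rate.
    """
    return (2.0 / 3.0) * personnel_casualties_pct_per_day


def sp_losses_all_means(personnel_casualties_pct_per_day: float) -> float:
    """Percent self-propelled artillery losses per day, all means.

    Conclusion 8d: consistent with towed losses, but at rates at least
    twice as great relative to battle intensity.
    """
    return 2.0 * towed_losses_all_means(personnel_casualties_pct_per_day)


# Example: a high-intensity day inflicting 2% personnel casualties.
intensity = 2.0
print(f"Towed, counterbattery only: {towed_losses_counterbattery(intensity):.2f}%/day")
print(f"Towed, all means:           {towed_losses_all_means(intensity):.2f}%/day")
print(f"Self-propelled, all means:  {sp_losses_all_means(intensity):.2f}%/day")
```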
The study caveated these conclusions with the following observations:
Most of the artillery weapons in World War II were towed weapons. By the time the United States had committed small but significant numbers of self-propelled artillery pieces in Europe, German air and artillery counter-battery retaliatory capabilities had been significantly reduced. In the Korean and Vietnam wars, although most American artillery was self-propelled, the enemy had little counter-artillery capability either in the air or in artillery weapons and counter-battery techniques.
It is evident from vulnerability testing of current Army self-propelled weapons, that these weapons–while offering much more protection to cannoneers and providing tremendous advantages in mobility–are much more vulnerable to hostile action than are towed weapons, and that they are much more subject to mechanical breakdowns involving either the weapons mountings or the propulsion elements. Thus there cannot be a direct relationship between aggregated World War II data, or even aggregated Korean war or October War data, and current or future artillery configurations. On the other hand, the body of data from the October war where artillery was self-propelled is too small and too specialized by environmental and operational circumstances to serve alone as a paradigm of artillery vulnerability.
Despite the intriguing implications of this research, HERO’s proposal for follow-on work was not funded. HERO used only easily accessible primary and secondary source data for the study. It noted that much more primary source data was likely available, but that compiling it would require a significant research effort. (Research is always the expensive tent-pole in quantitative historical analysis. This seems to be why so little of it ever gets funded.) At the time of the study in 1976, no U.S. Army organization could identify any existing quantitative historical data or analysis on artillery losses, classified or otherwise. A cursory search of the Internet reveals no other such research, either. Like personnel attrition and tank loss rates, artillery loss rates would seem to be another worthwhile subject for quantitative analysis as part of the ongoing effort to develop the multi-domain battle (MDB) concept.
There is probably no obscurity of combat requiring clarification and understanding more urgently than that of suppression… Suppression usually is defined as the effect of fire (primarily artillery fire) upon the behavior of hostile personnel, reducing, limiting, or inhibiting their performance of combat duties. Suppression lasts as long as the fires continue and for some brief, indeterminate period thereafter. Suppression is the most important effect of artillery fire, contributing directly to the ability of the supported maneuver units to accomplish their missions while preventing the enemy units from accomplishing theirs. (p. 251)
Official US Army field artillery doctrine makes a distinction between “suppression” and “neutralization.” Suppression is defined to be instantaneous and fleeting; neutralization, while also temporary, is relatively longer-lasting. Neutralization, the doctrine says, results when suppressive effects are so severe and long-lasting that a target is put out of action for a period of time after the suppressive fire is halted. Neutralization combines the psychological effects of suppressive gunfire with a certain amount of damage. The general concept of neutralization, as distinct from the more fleeting suppression, is a reasonable one. (p. 252)
Despite widespread acknowledgement of the existence of suppression and neutralization, the lack of interest in analyzing their effects was a source of professional frustration for Dupuy. As he commented in 1989,
The British did some interesting but inconclusive work on suppression in their battlefield operations research in World War II. In the United States I am aware of considerable talk about suppression, but very little accomplishment, over the past 20 years. In the light of the significance of suppression, our failure to come to grips with the issue is really quite disgraceful.
This lack of interest is curious, given that suppression and neutralization remain embedded in U.S. Army combat doctrine to this day. The current Army definitions are:
Suppression – In the context of the computed effects of field artillery fires, renders a target ineffective for a short period of time producing at least 3-percent casualties or materiel damage. [Army Doctrine Reference Publication (ADRP) 1-02, Terms and Military Symbols, December 2015, p. 1-87]
Neutralization – In the context of the computed effects of field artillery fires renders a target ineffective for a short period of time, producing 10-percent casualties or materiel damage. [ADRP 1-02, p. 1-65]
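Taken at face value, the computed-effects thresholds in these two definitions reduce to a simple classification by percent casualties or materiel damage. A minimal sketch (the function and its inputs are illustrative, not an actual fire-support tool):

```python
def computed_effect(casualty_or_damage_pct: float) -> str:
    """Classify a computed field artillery effect per the ADRP 1-02
    definitions quoted above: at least 10 percent casualties or materiel
    damage is neutralization, at least 3 percent is suppression, and both
    render the target ineffective only for a short period of time."""
    if casualty_or_damage_pct >= 10.0:
        return "neutralization"
    if casualty_or_damage_pct >= 3.0:
        return "suppression"
    return "no computed effect"


print(computed_effect(4.5))   # suppression
print(computed_effect(12.0))  # neutralization
```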
A particular source for Dupuy’s irritation was the fact that these definitions were likely empirically wrong. As he argued in Understanding War,
This is almost certainly the wrong way to approach quantification of neutralization. Not only is there no historical evidence that 10% casualties are enough to achieve this effect, there is no evidence that any level of losses is required to achieve the psycho-physiological effects of suppression or neutralization. Furthermore, the time period in which casualties are incurred is probably more important than any arbitrary percentage of loss, and the replacement of casualties and repair of damage are probably irrelevant. (p. 252)
Thirty years after Dupuy pointed this problem out, the construct remains enshrined in U.S. doctrine, unquestioned and unsubstantiated. Dupuy himself was convinced that suppression probably had little, if anything, to do with personnel loss rates.
I believe now that suppression is related to and probably a component of disruption caused by combat processes other than surprise, such as a communications failure. Further research may reveal, however, that suppression is a very distinct form of disruption that can be measured or estimated quite independently of disruption caused by any other phenomenon. (Understanding War, p. 251)
He had developed a hypothesis for measuring the effects of suppression, but was unable to interest anyone in the U.S. government or military in sponsoring a study on it. Suppression as a combat phenomenon remains only vaguely understood.
Scharre agreed that robotic drones are indeed vulnerable to such countermeasures, but made this point in response:
I think this is 100% correct! The genius of robotic vehicles is that they don't have to be survivable. They can be built cheaply and expendable, overwhelming the adversary with mass. 5/
He then went on to contend that robotic swarms offer the potential to reestablish the role of mass in future combat. Mass, either in terms of numbers of combatants or volume of firepower, has played a decisive role in most wars. As the aphorism usually credited to Josef Stalin goes, “quantity has a quality all its own.”
Numbers matter. For an adversary willing to treat individual units as expendable, swarming is a very appealing tactic. 9/
Overwhelming the enemy through sheer mass has been an effective military tactic throughout the ages. In fact, that's precisely how the Allies won World War II, by overwhelming the Axis through an onslaught of iron. 10/
As Paul Kennedy wrote, "No matter how cleverly the Wehrmacht mounted its tactical counterattacks … it was to be ultimately overwhelmed by the sheer mass of Allied firepower." 12/
Scharre observed that the United States went in a different direction in its post-World War II approach to warfare, adopting instead “offset” strategies that sought to leverage superior technology to balance against the mass militaries of the Communist bloc.
During the Cold War, the United States adopted an "offset strategy" to counter Soviet numerical superiority with qualitatively superior technology — first nuclear weapons then information-age precision-guided weapons. 13/
While offset strategies proved effective during the Cold War, Scharre concurs with the arguments that they are becoming far too expensive and may ultimately become self-defeating.
The logical conclusion of that strategy is the current death spiral of the U.S. military — rising platform costs and shrinking quantities leading to qualitatively superior weapons but in insufficient quantities to deliver operational results. 14/
And it's not about the budget. More money won't save the U.S. from this trap. From 2001-2008 the base (non-war) budgets of the Navy and Air Force grew by 22% and 27% respectively in real dollars. # of assets declined by 10% for ships and nearly 20% for aircraft. 16/
In order to avoid this fate, Scharre contends that
The United States needs to change the way it produces combat power, focusing on the most cost-effective way to accomplish its operational goals rather than building next-gen "X" programs at any price. 17/
Robots might very well change that equation. Whether autonomous or “human in the loop,” robotic swarms do not feel fear and are inherently expendable. Cheaply produced robots could well provide sufficient augmentation to human combat units to restore the primacy of mass in future warfare.