
Dupuy’s Verities: The Advantage Of The Offensive

Union assault on the “Mule Shoe” salient, 12 May 1864, by Thure de Thulstrup (1887) [Wikimedia]

The seventh of Trevor Dupuy’s Timeless Verities of Combat is:

An attacker willing to pay the price can always penetrate the strongest defenses.

From Understanding War (1987):

No matter how alert the defender, no matter how skillful his dispositions to avoid or mitigate the effects of surprise or the effects of flank or rear attack, a skillful attacker can always achieve at least a temporary advantage for some time at a place he has selected. This is one reason why Napoleon always endeavored to seize and retain the initiative. In the great battles of 1864 and 1865 in Virginia, Lee was always able to exploit his defensive advantage to the utmost. But Grant equally was always able to achieve a temporary superiority when and where he wished. This did not always result in a Union victory—given Lee’s defensive skill—but invariably it forced Lee to retreat until he could again impose a temporary stalemate with the assistance of powerful field fortifications. A modern example can be found in the Soviet offensive relieving Leningrad in 1943. Another was the Allied break-out from the Normandy beachhead in July and August of 1944.

The exact meaning of this verity is tricky to determine, as the phrase “willing to pay the price” does a lot of work here. History is certainly replete with examples of Pyrrhic victories, where the cost paid for battlefield success deprived the attacker of any clear benefit. (The U.S. Civil War Battle of Chickamauga in 1863 would be an example in line with Dupuy’s description above.) Perhaps “willing and able to pay the price” would have been a better way of stating this. And, of course, no attack is guaranteed to succeed.

What Dupuy had in mind here is probably best understood in the context of two of his other verities: “Offensive action is essential to positive combat results” and “Initiative permits application of preponderant combat power.” Even if the defensive may be the stronger form of combat, the offensive affords certain inherent potential advantages that can enable attackers to defeat the strongest of defenses, provided their attacks are conducted effectively, sufficiently resourced, and determinedly pressed.

Some Useful Resources for Post-World War II U.S. Army Doctrine Development

This list originated in response to a Twitter query discussing the history of post-World War II U.S. Army doctrine development. It is hardly exhaustive, but it does include titles and resources that may not be widely known.

The first two are books:

Benjamin Jensen, Forging the Sword: Doctrinal Change in the U.S. Army (Stanford University Press, 2016)

Jensen focused on the institutional processes shaping the Army’s continual post-World War II efforts to reform its doctrine in response to changes in the character of modern warfare.

Shimon Naveh, In Pursuit of Military Excellence: The Evolution of Operational Theory (Routledge, 1997)

In an excellent overview of the evolution of operational thought through the 20th century, Naveh devoted two chapters to the Army’s transition to Active Defense in the 70s and then to AirLand Battle in the 80s.

There are several interesting monographs that are available online:

Andrew J. Bacevich, The Pentomic Era: The U.S. Army Between Korea and Vietnam (NDU Press, 1986)

Paul Herbert, Deciding What Has to Be Done: General William E. DePuy and the 1976 Edition of FM 100-5, Operations (Combat Studies Institute, 1988)

John Romjue, From Active Defense to AirLand Battle: The Development of Army Doctrine, 1973-1982 (TRADOC, 1984)

John Romjue, The Army of Excellence: The Development of the 1980s Army (TRADOC, 1997)

John Romjue, American Army Doctrine for the Post-Cold War (TRADOC, 1997)

A really useful place to browse is the Army Command and General Staff College’s online Skelton Combined Arms Research Library (CARL). It is loaded with old manuals, student papers, and theses addressing a wide variety of topics related to the nuts and bolts of doctrine.

Another good place to browse is the Defense Technical Information Center (DTIC), which is a huge digital library of government-sponsored research. I recommend searching for publications by the Army’s defunct operations research organizations: the Operations Research Office (ORO), the Research Analysis Corporation (RAC), and the Special Operations Research Office (SORO). Also look at the Combat Operations Research Group (CORG), particularly a series of studies of Army force structure from squads to theater HQs by Virgil Ney. There is much more to find in DTIC.

Two other excellent places to browse for material on doctrine are the Combat Studies Institute Press publications on CARL and the U.S. Army Center of Military History’s publications.

Some journals with useful research include the Journal of Cold War Studies and the Journal of Strategic Studies.

If anyone else has suggestions, let me know.

Engaging the Phalanx (part 7 of 7)

Hopefully this is my last post on the subject (but I suspect not, as I expect a public response from the three TRADOC authors). This is in response to the article in the December 2018 issue of the Phalanx by Alt, Morey and Larimer (see Part 1, Part 2, Part 3, Part 4, Part 5, Part 6). The issue here is the “Base of Sand” problem, which is what the original blog post that “inspired” their article was about:

Wargaming Multi-Domain Battle: The Base Of Sand Problem

While the first paragraph of their article addressed this blog post, and they reference Paul Davis’ 1992 Base of Sand paper in their footnotes (but not John Stockfisch’s paper, which is an equally valid criticism), they do not discuss the “Base of Sand” problem further. They do not actually state whether it is or is not a problem. I gather from this notable omission that they do in fact understand that it is a problem, but being employees of TRADOC, they are limited in what they can publicly say. I am not.

I do address the “Base of Sand” problem in my book War by Numbers, Chapter 18. It has also been addressed in a few other posts on this blog. We are critics because we do not see significant improvement in the industry. In some cases, we are seeing regression.

In the end, I think the best solution for the DOD modeling and simulation community is not to “circle the wagons” and defend what they are currently doing, but instead to acknowledge the limitations and problems they have and undertake a corrective action program. This corrective action program would involve: 1) properly addressing how to measure and quantify certain aspects of combat (for example, breakpoints), and 2) validating these aspects, and the combat models they are part of, using real-world combat data. This would be an iterative process: develop the model, test it, develop it further, and test it again. This moves us forward. It is a more valuable approach than just “circling the wagons.” As these models and simulations are being used to analyze processes that may or may not make us fight better, and may or may not save American service members’ lives, I think it is important enough to do right. That is what we need to be focused on, not squabbling over a blog post (or seven).
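The corrective action program described above amounts to a classic calibrate-and-test loop. As a purely illustrative sketch (the toy attrition relationship, the engagement data, and every number below are invented placeholders, not any actual DoD model or combat database), it might look like this:

```python
# Purely illustrative sketch of the iterative develop-test-refine loop
# described above. The "model," the engagement data, and all numbers here
# are invented placeholders, not any actual DoD simulation or database.

def predicted_casualties(engagement, params):
    """Toy attrition relationship: losses proportional to enemy strength."""
    return params["rate"] * engagement["enemy_strength"]

def mean_abs_pct_error(data, params):
    """Score the model against real-world outcomes (the validation step)."""
    errors = [
        abs(predicted_casualties(e, params) - e["actual_casualties"])
        / e["actual_casualties"]
        for e in data
    ]
    return sum(errors) / len(errors)

def calibrate(data, params, step=0.0001, iterations=200):
    """The iterative step: nudge the attrition rate whenever it cuts error."""
    for _ in range(iterations):
        for delta in (step, -step):
            trial = {**params, "rate": params["rate"] + delta}
            if mean_abs_pct_error(data, trial) < mean_abs_pct_error(data, params):
                params = trial
    return params

# Notional historical engagements (fabricated for the example)
engagements = [
    {"enemy_strength": 12000, "actual_casualties": 310},
    {"enemy_strength": 8000, "actual_casualties": 190},
    {"enemy_strength": 15000, "actual_casualties": 420},
]

initial = {"rate": 0.01}
tuned = calibrate(engagements, initial)
print(mean_abs_pct_error(engagements, initial), mean_abs_pct_error(engagements, tuned))
```

A real validation effort would, of course, calibrate on one set of engagements and then test on an independent hold-out set, and would score more than one output measure, but the develop-test-develop-test cycle is the same.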

Has The Army Given Up On Counterinsurgency Research, Again?

Mind-the-Gap

[In light of the U.S. Army’s recent publication of a history of its involvement in Iraq from 2003 to 2011, it may be relevant to re-post this piece from 29 June 2016.]

As Chris Lawrence mentioned yesterday, retired Brigadier General John Hanley’s review of America’s Modern Wars in the current edition of Military Review concluded by pointing out the importance of a solid empirical basis for staff planning support for reliable military decision-making. This notion seems so obvious as to be a truism, but in reality, the U.S. Army has demonstrated no serious interest in remedying the weaknesses or gaps in the base of knowledge underpinning its basic concepts and doctrine.

In 2012, Major James A. Zanella published a monograph for the School of Advanced Military Studies of the U.S. Army Command and General Staff College (graduates of which are known informally as “Jedi Knights”), which examined problems the Army has had with estimating force requirements, particularly in recent stability and counterinsurgency efforts.

Historically, the United States military has had difficulty articulating and justifying force requirements to civilian decision makers. Since at least 1975, governmental officials and civilian analysts have consistently criticized the military for inadequate planning and execution. Most recently, the wars in Afghanistan and Iraq reinvigorated the debate over the proper identification of force requirements…Because Army planners have failed numerous times to provide force estimates acceptable to the President, the question arises, why are the planning methods inadequate and why have they not been improved?[1]

Zanella surveyed the various available Army planning tools and methodologies for determining force requirements, but found them all either inappropriate, only marginally applicable, or unsupported by any real-world data. He concluded:

Considering the limitations of Army force planning methods, it is fair to conclude that Army force estimates have failed to persuade civilian decision-makers because the advice is not supported by a consistent valid method for estimating the force requirements… What is clear is that the current methods have utility when dealing with military situations that mirror the conditions represented by each model. In the contemporary military operating environment, the doctrinal models no longer fit.[2]

Zanella did identify the existence of recent, relevant empirical studies on manpower and counterinsurgency. He noted that “the existing doctrine on force requirements does not benefit from recent research” but suggested optimistically that it could provide “the Army with new tools to reinvigorate the discussion of troops-to-task calculations.”[3] Even before Zanella published his monograph, however, the Defense Department began removing any detailed reference or discussion about force requirements in counterinsurgency from Army and Joint doctrinal publications.

As Zanella discussed, there is a body of recent empirical research on manpower and counterinsurgency that contains a variety of valid and useful insights, but as I recently discussed, it does not yet offer definitive conclusions. Much more research and analysis is needed before the conclusions can be counted on as a valid and justifiably reliable basis for life and death decision-making. Yet the last of these government-sponsored studies was completed in 2010. Neither the Army nor any other organization in the U.S. government has funded any follow-on work on this subject, and none appears forthcoming. This boom-or-bust pattern is nothing new, but the failure to do anything about it is becoming less and less understandable.

NOTES

[1] Major James A. Zanella, “Combat Power Analysis is Combat Power Density” (Ft. Leavenworth, KS: School of Advanced Military Studies, U.S. Army Command and General Staff College, 2012), pp. 1-2.

[2] Ibid., p. 50.

[3] Ibid., p. 47.

Historians and the Early Era of U.S. Army Operations Research

While perusing Charles Shrader’s fascinating history of the U.S. Army’s experience with operations research (OR), I came across several references to the part played by historians and historical analysis in the early era of that effort.

The ground forces were the last branch of the Army to incorporate OR into their efforts during World War II, lagging behind the Army Air Forces, the technical services, and the Navy. Where the Army was a step ahead, however, was in creating a robust wartime field history documentation program. (After the war, this enabled the publication of the U.S. Army in World War II series, known as the “Green Books,” which set a new standard for government-sponsored military histories.)

As Shrader related, the first OR personnel the Army deployed forward in 1944-45 often crossed paths with War Department General Staff Historical Branch field historian detachments. They both engaged in similar activities: collecting data on real-world combat operations, which was then analyzed and used for studies and reports written for the use of the commands to which they were assigned. The only significant difference was in their respective methodologies, with the historians using historical methods and the OR analysts using mathematical and scientific tools.

History and OR after World War II

The usefulness of historical approaches to collecting operational data did not go unnoticed by the OR practitioners, according to Shrader. When the Army established the Operations Research Office (ORO) in 1948, it hired a contingent of historians specifically for the purpose of facilitating research and analysis using WWII Army records, “the most likely source for data on operational matters.”

When the Korean War broke out in 1950, ORO sent eight multi-disciplinary teams, including the historians, to collect operational data and provide analytical support for U.S. forces. By 1953, half of ORO’s personnel had spent time in combat zones. Throughout the 1950s, about 40-43% of ORO’s staff consisted of specialists in the social sciences, history, business, literature, and law. Shrader quoted one leading ORO analyst as noting that “there is reason to believe that the lawyer, social scientist or historian is better equipped professionally to evaluate evidence which is derived from the mind and experience of the human species.”

Among the notable historians who worked at or with ORO was Dr. Hugh M. Cole, an Army officer who had served as a staff historian for General George Patton during World War II. Cole rose to become a senior manager at ORO and later served as vice-president and president of ORO’s successor, the Research Analysis Corporation (RAC). Cole brought in WWII colleague Forrest C. Pogue (best known as the biographer of General George C. Marshall) and Charles B. MacDonald. ORO also employed another WWII field historian, the controversial S. L. A. Marshall, as a consultant during the Korean War. Dorothy Kneeland Clark did pioneering historical analysis on combat phenomena while at ORO.

The Demise of ORO…and Historical Combat Analysis?

By the late 1950s, considerable institutional friction had developed between ORO, the Johns Hopkins University (JHU)—ORO’s institutional owner—and the Army. According to Shrader,

Continued distrust of operations analysts by Army personnel, questions about the timeliness and focus of ORO studies, the ever-expanding scope of ORO interests, and, above all, [ORO director] Ellis Johnson’s irascible personality caused tensions that led in August 1961 to the cancellation of the Army’s contract with JHU and the replacement of ORO with a new, independent research organization, the Research Analysis Corporation [RAC].

RAC inherited ORO’s research agenda and most of its personnel, but changing events and circumstances led Army OR to shift its priorities away from field collection and empirical research on operational combat data in favor of the use of modeling and wargaming in its analyses. As Chris Lawrence described in his history of federally-funded Defense Department “think tanks,” the rise and fall of scientific management in DOD, the Vietnam War, social and congressional criticism, and unhappiness among the military services with the analysis led to retrenchment in military OR by the end of the 60s. The Army sold RAC and created its own in-house Concepts Analysis Agency (CAA; now known as the Center for Army Analysis).

By the early 1970s, analysts such as RAND’s Martin Shubik, Gary Brewer, and John Stockfisch began to note that the relationships and processes being modeled in the Army’s combat simulations were not based on real-world data and that empirical research on combat phenomena by the Army OR community had languished. In 1991, Paul Davis and Donald Blumenthal gave this problem a name: the “Base of Sand.”

Validating Attrition

Continuing to comment on the article in the December 2018 issue of the Phalanx by Alt, Morey, and Larimer (this is part 3 of 7; see Part 1, Part 2).

On the first page (page 28) in the third column they make the statement that:

Models of complex systems, especially those that incorporate human behavior, such as that demonstrated in combat, do not often lend themselves to empirical validation of output measures, such as attrition.

Really? Why can’t you? In fact, isn’t that exactly the model you should be validating?

More to the point, people have validated attrition models. Let me list a few cases (this list is not exhaustive):

1. Done by the Center for Army Analysis (CAA) for the CEM (Concepts Evaluation Model), using data from the Ardennes Campaign Simulation Study (ARCAS). Take a look at this study done for the Stochastic CEM (STOCEM): https://apps.dtic.mil/dtic/tr/fulltext/u2/a489349.pdf

2. Done in 2005 by The Dupuy Institute for six different casualty estimation methodologies as part of Casualty Estimation Methodologies Studies. This was work done for the Army Medical Department and funded by DUSA (OR). It is listed here as report CE-1: http://www.dupuyinstitute.org/tdipub3.htm

3. Done in 2006 by The Dupuy Institute for the TNDM (Tactical Numerical Deterministic Model) using corps- and division-level data. This effort was funded by Boeing, not the U.S. government. It is discussed in depth in Chapter 19 of my book War by Numbers (pages 299-324), where we show 20 charts from such an effort. Let me show you one from page 315:

 

So, this is something that multiple people have done on multiple occasions. It was not too difficult for The Dupuy Institute to do. TRADOC is an organization with around 38,000 military and civilian employees, plus who knows how many contractors. I think this is something they could also do, if they had the desire.
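For illustration only, the kind of predicted-versus-actual comparison behind such validation charts can be reduced to paired values and a correlation statistic. All figures below are invented placeholders, not TNDM, CEM, or any other model’s output:

```python
# Illustration only: a predicted-vs-actual validation comparison reduced to
# paired values and a correlation statistic. All figures are invented
# placeholders, not output from TNDM, CEM, or any other real model.
import math

def pearson_r(xs, ys):
    """Pearson correlation between predicted and actual values."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# (predicted, actual) daily percent losses for notional engagements
pairs = [(1.2, 1.5), (0.8, 0.7), (2.5, 2.1), (3.0, 3.4), (0.5, 0.6)]
predicted = [p for p, _ in pairs]
actual = [a for _, a in pairs]

print(round(pearson_r(predicted, actual), 3))
```

A strong positive correlation across many engagements is the sort of evidence a validation report would present; a weak or absent one is the “Base of Sand” showing through.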

 

Validation

Continuing to comment on the article in the December 2018 issue of the Phalanx by Jonathan Alt, Christopher Morey and Larry Larimer (this is part 2 of 7; see part 1 here).

On the first page (page 28), at the top of the third column, they make the rather declarative statement that:

The combat simulations used by military operations research and analysis agencies adhere to strict standards established by the DoD regarding verification, validation and accreditation (Department of Defense, 2009).

Now, I have not reviewed what has been done on verification, validation and accreditation since 2009, but I did do a few fairly exhaustive reviews before then. One such review is written up in depth in The International TNDM Newsletter. It is Volume 1, No. 4 (February 1997). You can find it here:

http://www.dupuyinstitute.org/tdipub4.htm

The newsletter includes a letter dated 21 January 1997 from the Scientific Advisor to the CG (Commanding General) at TRADOC (Training and Doctrine Command). This is the same organization that the three gentlemen who wrote the article in the Phalanx work for. The Scientific Advisor sent a letter out to multiple commands to try to flag the issue of validation (the letter is on page 6 of the newsletter). My understanding is that he received few responses (I saw only one; it was from Leavenworth). After that, I gather no further action was taken. This was a while back, so maybe everything has changed, as I gather they are claiming with that declarative statement. I doubt it.

The issue to me is validation. Verification is often done; actual validations are a lot rarer. In 1997, this was my list of combat models in the industry that had been validated (the list is on page 7 of the newsletter):

1. Atlas (using 1940 Campaign in the West)

2. Vector (using undocumented turning runs)

3. QJM (by HERO using WWII and Middle-East data)

4. CEM (by CAA using Ardennes Data Base)

5. SIMNET/JANUS (by IDA using 73 Easting data)

 

Now, in 2005 we did a report on Casualty Estimation Methodologies (it is report CE-1, listed here: http://www.dupuyinstitute.org/tdipub3.htm). We reviewed the listing of validation efforts, and from 1997 to 2005…nothing new had been done (except for a battalion-level validation we had done for the TNDM). So am I now to believe that since 2009 they have actively and aggressively pursued validation? Especially as most of this time was a period of severely declining budgets, I doubt it. One of the arguments against validation made in meetings I attended in 1987 was that they did not have the time or budget to spend on validating. The budget during the Cold War was luxurious by today’s standards.

If there have been meaningful validations done, I would love to see them. The proof is in the pudding, so send me the validation reports that will resolve all doubts.

Engaging the Phalanx

The Military Operations Research Society (MORS) publishes a periodical journal called the Phalanx. In the December 2018 issue was an article that referenced one of our blog posts. This took us by surprise. We only found out about it thanks to one of the readers of this blog. We are not members of MORS, and the article is paywalled, so it cannot be easily accessed by non-members.

It is titled “Perspectives on Combat Modeling” (page 28) and is written by Jonathan K. Alt, U.S. Army TRADOC Analysis Center, Monterey, CA.; Christopher Morey, PhD, Training and Doctrine Command Analysis Center, Ft. Leavenworth, Kansas; and Larry Larimer, Training and Doctrine Command Analysis Center, White Sands, New Mexico. I am not familiar with any of these three gentlemen.

The blog post that appears to be generating this article is this one:

Wargaming Multi-Domain Battle: The Base Of Sand Problem

Simply by coincidence, Shawn Woodford recently re-posted this in January. It was originally published on 10 April 2017 and was written by Shawn.

The opening two sentences of the article in the Phalanx read:

Periodically, within the Department of Defense (DoD) analytic community, questions will arise regarding the validity of the combat models and simulations used to support analysis. Many attempts (sic) to resurrect the argument that models, simulations, and wargames “are built on the thin foundation of empirical knowledge about the phenomenon of combat.” (Woodford, 2017).

It is nice to be acknowledged, although in this case it appears that we are being acknowledged because they disagree with what we are saying.

Probably the word that gets my attention is “resurrect.” It is an interesting word choice that implies this is an old argument that has somehow or other been put to bed. Granted, it is an old argument. On the other hand, it has not been put to bed. If a problem has been identified and not corrected, then it is still a problem. Age has nothing to do with it.

On the other hand, maybe they are using the word “resurrect” because recent developments in modeling and validation have changed the environment significantly enough that these arguments no longer apply. If so, I would be interested in knowing what those changes are. The last time I checked, the modeling and simulation industry was using many of the same models it had used for decades. In some cases, it was going back to using simpler hex-based games for its modeling and wargaming efforts. We have blogged a couple of times about those efforts. So, in the world of modeling, unless earthshaking and universal changes made in the last five years have completely revamped the landscape, the decades-old problems still apply to the decades-old models and simulations.

More to come (this is the first of at least 7 posts on this subject).

What Multi-Domain Operations Wargames Are You Playing? [Updated]

Source: David A. Shlapak and Michael Johnson. Reinforcing Deterrence on NATO’s Eastern Flank: Wargaming the Defense of the Baltics. Santa Monica, CA: RAND Corporation, 2016.


[UPDATE] We had several readers recommend games they have used, or which would be suitable, for simulating Multi-Domain Battle and Operations (MDB/MDO) concepts. These include several classic campaign-level board wargames:

The Next War (SPI, 1976)

NATO: The Next War in Europe (Victory Games, 1983)

For tactical level combat, there is Steel Panthers: Main Battle Tank (SSI/Shrapnel Games, 1996- )

There were also a couple of naval/air oriented games:

Asian Fleet (Kokusai-Tsushin Co., Ltd. (国際通信社) 2007, 2010)

Command: Modern Air/Naval Operations (Matrix Games, 2014)

Are there any others folks are using out there?


A Mystics & Statistics reader wants to know what wargames are being used to simulate and explore Multi-Domain Battle and Operations (MDB/MDO) concepts.

There is a lot of MDB/MDO wargaming going on at all levels of the U.S. Department of Defense. Much of this appears to use existing models, simulations, and wargames, such as the Center for Army Analysis’s unclassified Wargaming Analysis Model (C-WAM).

Chris Lawrence recently looked at C-WAM and found that it uses a lot of traditional board wargaming elements, including methodologies for determining combat results, casualties, and breakpoints that have been found unable to replicate real-world outcomes (aka “The Base of Sand” problem).

C-WAM 1

C-WAM 2

C-WAM 3

C-WAM 4 (Breakpoints)

There is also the wargame used by RAND to look at possible scenarios for a potential Russian invasion of the Baltic States.

Wargaming the Defense of the Baltics

Wargaming at RAND

What other wargames, models, and simulations are there being used out there? Are there any commercial wargames incorporating MDB/MDO elements into their gameplay? What methodologies are being used to portray MDB/MDO effects?

Active Defense, Forward Defense, and A2/AD in Eastern Europe

The current military and anti-access/area denial situation in Eastern Europe. [Map and overlay derived from situation map by Thomas C. Thielen (@noclador) https://twitter.com/noclador/status/1079999716333703168; and Ian Williams, “The Russia – NATO A2AD Environment,” Missile Threat, Center for Strategic and International Studies, published January 3, 2017, last modified November 29, 2018, https://missilethreat.csis.org/russia-nato-a2ad-environment/]

In an article published by West Point’s Modern War Institute last month, “The US Army is Wrong on Future War,” Nathan Jennings, Amos Fox, and Adam Taliaferro laid out a detailed argument that current and near-future political, strategic, and operational realities augur against the Army’s current doctrinal conceptualization for Multi-Domain Operations (MDO).

[T]he US Army is mistakenly structuring for offensive clashes of mass and scale reminiscent of 1944 while competitors like Russia and China have adapted to twenty-first-century reality. This new paradigm—which favors fait accompli acquisitions, projection from sovereign sanctuary, and indirect proxy wars—combines incremental military actions with weaponized political, informational, and economic agendas under the protection of nuclear-fires complexes to advance territorial influence…

These factors suggest, cumulatively, that the advantage in military confrontation between great powers has decisively shifted to those that combine strategic offense with tactical defense.

As a consequence, the authors suggested that “the US Army should recognize the evolved character of modern warfare and embrace strategies that establish forward positions of advantage in contested areas like Eastern Europe and the South China Sea. This means reorganizing its current maneuver-centric structure into a fires-dominant force with robust capacity to defend in depth.”

Forward Defense, Active Defense, and AirLand Battle

To illustrate their thinking, Jennings, Fox, and Taliaferro invoked a specific historical example:

This strategic realignment should begin with adopting an approach more reminiscent of the US Army’s Active Defense doctrine of the 1970s than the vaunted AirLand Battle concept of the 1980s. While many distain (sic) Active Defense for running counter to institutional culture, it clearly recognized the primacy of the combined-arms defense in depth with supporting joint fires in the nuclear era. The concept’s elevation of the sciences of terrain and weaponry at scale—rather than today’s cult of the offense—is better suited to the current strategic environment. More importantly, this methodology would enable stated political aims to prevent adversary aggression rather than to invade their home territory.

In the article’s comments, many pushed back against reviving Active Defense thinking, which has apparently become indelibly tarred with the derisive criticism that led to its replacement by AirLand Battle in the 1980s. As the authors gently noted, much of this resistance stemmed from the perceptions of Army critics that Active Defense was passive and defensively-oriented, overly focused on firepower, and suspicions that it derived from operations research analysts reducing warfare and combat to a mathematical “battle calculus.”

While AirLand Battle has been justly lauded for enabling U.S. military success against Iraq in 1990-91 and 2003 (a third-rank, non-nuclear power it should be noted), it always elided the fundamental question of whether conventional deep strikes and operational maneuver into the territory of the Soviet Union’s Eastern European Warsaw Pact allies—and potentially the Soviet Union itself—would have triggered a nuclear response. The criticism of Active Defense similarly overlooked the basic political problem that led to the doctrine in the first place, namely, the need to provide a credible conventional forward defense of West Germany. Keeping the Germans actively integrated into NATO depended upon assurances that a Soviet invasion could be resisted effectively without resorting to nuclear weapons. Indeed, the political cohesion of the NATO alliance itself rested on the contradiction between the credibility of U.S. assurances that it would defend Western Europe with nuclear weapons if necessary and the fears of alliance members that losing a battle for West Germany would make that necessity a reality.

Forward Defense in Eastern Europe

A cursory look at the current military situation in Eastern Europe, along with Russia’s increasingly robust anti-access/area denial (A2/AD) capabilities (see map), should clearly illustrate the logic behind a doctrine of forward defense. U.S. and NATO troops based in Western Europe would have to run a gauntlet of well-protected long-range fires systems just to get into battle in Ukraine or the Baltics. Attempting operational maneuver at the end of lengthy and exposed logistical supply lines would seem dauntingly challenging. The U.S. Army’s 2nd Cavalry Regiment, a Stryker brigade combat team based in southwest Germany, appears very much “lone and lonely.” It should also illustrate the difficulties in attacking the Russian A2/AD complex; an act which, Jennings, Fox, and Taliaferro remind us, would actively court a nuclear response.

In this light, Active Defense—or, better, an MDO doctrine of forward defense—oriented on “a fires-dominant force with robust capacity to defend in depth,” and intended to “enable stated political aims to prevent adversary aggression rather than to invade their home territory,” does not really seem foolishly retrograde after all.