
Toward An American Approach To Proxy Warfare

U.S.-supported Philippine guerilla fighters led the resistance against the Japanese occupation of Luzon during World War II. [Warfare History Network]

U.S. Army Major Amos Fox has recently published the first two of a planned set of three articles examining the nature of proxy warfare in the early 21st century and suggesting some ideas for how the U.S. might better conduct it.

In “In Pursuit of a General Theory of Proxy Warfare,” published in February 2019 by the Institute of Land Warfare at the Association of the U.S. Army, and “Time, Power, and Principal-Agent Problems: Why the U.S. Army is Ill-Suited for Proxy Warfare Hotspots,” published in the March-April 2019 edition of Military Review, Fox argues,

Proxy environments dominate modern war… It is not just a Russian, Iranian or American approach to war, but one in which many nations and polities engage. However, the U.S. Army lacks a paradigm for proxy warfare, which disrupts its ability to understand the environment or develop useful tactics, operations and strategies for those environments.

His examination of the basic elements of proxy warfare leads him to conclude that “it is dominated by a principal actor dynamic, power relationships and the tyranny of time.” From this premise, Fox outlines two basic models of proxy warfare: exploitative and transactional.

The exploitative model…is characterized by a proxy force being completely dependent on its principal for survival… [It] is usually the result of a stronger actor looking for a tool—a proxy force—to pursue an objective. As a result, the proxy is only as useful to the principal as its ability to make progress toward the principal’s ends. Once the principal’s ends have been achieved or the proxy is unable to maintain momentum toward the principal’s ends, then the principal discontinues the relationship or distances itself from the proxy.

The transactional model is…more often like a business deal. An exchange of services and goods that benefits all parties—defeat of a mutual threat, training of the agent’s force, foreign military sales and finance—is at the heart of the transactional model. However, this model is a paradox because the proxy is the powerbroker in the relationship. In many cases, the proxy government is independent but looking for assistance in defeating an adversary; it is not interested in political or military subjugation by the principal. Moreover, the proxy possesses the power in the relationship because its association with the principal is wholly transactional…the clock starts ticking on the duration of the bond as soon as the first combined shot is fired. As a result, as the common goal is gradually achieved, the agent’s interest in the principal recedes at a comparable rate.

With this concept in hand, Fox makes the case that

[T]he U.S. Army is ill-suited for warfare in the proxy environment because it mismanages the fixed time and the finite power it possesses over a proxy force in pursuit of waning mutual interests. Fundamentally, the salient features of proxy environments—available time, power over a proxy force, and mutual interests—are fleeting due to the fact that proxy relationships are transactional in nature; they are marriages of convenience in which a given force works through another in pursuit of provisionally aligned political or military ends… In order to better position itself to succeed in the proxy environment, the U.S. Army must clearly understand the background and components of proxy warfare.

These two articles provide an excellent basis for a wider discussion on thinking about and shaping not just a more coherent U.S. Army doctrine, but a common policy/strategic/operational framework for understanding and successfully operating in the proxy warfare environments that will only loom larger in 21st century international affairs. It will be interesting to see how Fox’s third article rounds out his discussion.

TDI Friday Read: Engaging The Phalanx

The December 2018 issue of Phalanx, a periodical published by the Military Operations Research Society (MORS), contains an article by Jonathan K. Alt, Christopher Morey, and Larry Larimer entitled “Perspectives on Combat Modeling.” (The article is paywalled, but limited public access is available via JSTOR.)

Their article was written partly as a critical rebuttal to a TDI blog post originally published in April 2017, which discussed an issue of which the combat modeling and simulation community has long been aware but has been slow to address, known as the “Base of Sand” problem.

Wargaming Multi-Domain Battle: The Base Of Sand Problem

In short, because so little is empirically known about the real-world structure of combat processes and the interactions among those processes, modelers have been forced to rely on the judgement of subject matter experts (SMEs) to fill in the blanks. No one really knows whether the resulting blend of empirical data and SME judgement accurately represents combat, because the modeling community has been reluctant to test its models against data on real-world experience, a process known as validation.
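
As a highly simplified, hypothetical illustration of how this blend creeps into a model (the parameter names and values below are invented for illustration and are not taken from any actual simulation), consider an attrition-rate table in which gaps in the empirical record are quietly filled with SME estimates:

```python
# Hypothetical illustration of the "Base of Sand" problem: a model parameter
# table in which gaps in the empirical record are filled by SME judgement.
# All names and numbers are invented for illustration.

# Daily percent-casualty rates by posture, derived from (notional) historical data.
empirical_rates = {
    "hasty_attack": 2.1,
    "deliberate_attack": 2.8,
}

# Postures for which no historical analysis was done.
sme_estimates = {
    "meeting_engagement": 2.5,  # SME's best guess
    "withdrawal": 1.0,          # SME's best guess
}

# The model simply merges the two sources; downstream users cannot tell
# which rates rest on data and which rest on judgement.
attrition_rates = {**empirical_rates, **sme_estimates}

data_based = len(empirical_rates) / len(attrition_rates)
print(f"Share of rates grounded in historical data: {data_based:.0%}")
# Validation would mean testing the *combined* table against real-world
# outcomes, not just auditing the empirically derived entries.
```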

TDI President Chris Lawrence subsequently published a series of blog posts responding to the specific comments and criticisms leveled by Alt, Morey, and Larimer.

How are combat models and simulations tested to see if they portray real-world combat accurately? Are they actually tested?

Engaging the Phalanx

How can we know if combat simulations adhere to strict standards established by the DoD regarding validation? Perhaps the validation reports can be released for peer review.

Validation

Some claim that models of complex combat behavior cannot really be tested against real-world operational experience, but this has already been done. Several times.

Validating Attrition

If only the “physics-based aspects” of combat models are empirically tested, do those models reliably represent real-world combat with humans or only the interactions of weapons systems?

Physics-based Aspects of Combat

Is real-world historical operational combat experience useful only for demonstrating the capabilities of combat models, or is it something the models should be able to reliably replicate?

Historical Demonstrations?

If a Subject Matter Expert (SME) can be substituted for a proper combat model validation effort, then could not a SME simply be substituted for the model? Should not all models be considered expert judgement quantified?

SMEs

What should be done about the “Base of Sand” problem? Here are some suggestions.

Engaging the Phalanx (part 7 of 7)

Persuading the military operations research community of the importance of research on real-world combat experience in modeling has been an uphill battle with a long history.

Diddlysquat

And the debate continues…

Dupuy’s Verities: The Advantage Of The Offensive

Union assault on the “Mule Shoe” salient, 12 May 1864, by Thure de Thulstrup (1887) [Wikimedia]

The seventh of Trevor Dupuy’s Timeless Verities of Combat is:

An attacker willing to pay the price can always penetrate the strongest defenses.

From Understanding War (1987):

No matter how alert the defender, no matter how skillful his dispositions to avoid or mitigate the effects of surprise or the effects of flank or rear attack, a skillful attacker can always achieve at least a temporary advantage for some time at a place he has selected. This is one reason why Napoleon always endeavored to seize and retain the initiative. In the great battles of 1864 and 1865 in Virginia, Lee was always able to exploit his defensive advantage to the utmost. But Grant equally was always able to achieve a temporary superiority when and where he wished. This did not always result in a Union victory—given Lee’s defensive skill—but invariably it forced Lee to retreat until he could again impose a temporary stalemate with the assistance of powerful field fortifications. A modern example can be found in the Soviet offensive relieving Leningrad in 1943. Another was the Allied break-out from the Normandy beachhead in July and August of 1944.

The exact meaning of this verity is tricky to pin down, as the phrase “willing to pay the price” does a lot of work here. History is certainly replete with examples of Pyrrhic victories, where the cost paid for battlefield success deprived the attacker of any clear benefit. (The U.S. Civil War Battle of Chickamauga in 1863 would be an example in line with Dupuy’s description above.) Perhaps “willing and able to pay the price” would have been a better way of stating this. And, of course, no attack is guaranteed to succeed.

What Dupuy had in mind here is probably best understood in the context of two of his other verities: “Offensive action is essential to positive combat results” and “Initiative permits application of preponderant combat power.” Even if the defensive is the stronger form of combat, the offensive affords certain inherent potential advantages that can enable attackers to defeat the strongest of defenses if conducted effectively, sufficiently resourced, and determinedly pressed.

Engaging the Phalanx (part 7 of 7)

Hopefully this is my last post on the subject (but I suspect not, as I expect a public response from the three TRADOC authors). This is in response to the article in the December 2018 issue of the Phalanx by Alt, Morey and Larimer (see Part 1, Part 2, Part 3, Part 4, Part 5, Part 6). The issue here is the “Base of Sand” problem, which is what the original blog post that “inspired” their article was about:

Wargaming Multi-Domain Battle: The Base Of Sand Problem

While the first paragraph of their article addresses this blog post, and they reference Paul Davis’ 1991 Base of Sand paper in their footnotes (though not John Stockfisch’s paper, which offers an equally valid criticism), they do not discuss the “Base of Sand” problem any further. They never actually state whether it is or is not a problem. I gather from this notable omission that they do in fact understand that it is a problem, but, being employees of TRADOC, they are limited in what they can publicly say. I am not.

I do address the “Base of Sand” problem in my book War by Numbers, Chapter 18. It has also been addressed in a few other posts on this blog. We are critics because we do not see significant improvement in the industry. In some cases, we are seeing regression.

In the end, I think the best course for the DOD modeling and simulation community is not to “circle the wagons” and defend what it is currently doing, but instead to acknowledge the limitations and problems it has and undertake a corrective action program. This corrective action program would involve: 1) properly addressing how to measure and quantify certain aspects of combat (for example, breakpoints), and 2) validating these aspects, and the combat models they are part of, using real-world combat data. This would be an iterative process: develop the model, test it, develop it further, and test it again. This moves us forward. It is a more valuable approach than just “circling the wagons.” As these models and simulations are being used to analyze processes that may or may not make us fight better, and may or may not save American service members’ lives, I think it is important enough to do right. That is what we need to be focused on, not squabbling over a blog post (or seven).
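
A minimal sketch of what such an iterative develop-and-validate loop might look like in practice appears below. Everything in it, the breakpoint threshold, the data fields, and the engagement records, is hypothetical; it is not a description of any existing DOD model or dataset.

```python
# Hypothetical sketch of an iterative develop/validate cycle for one model
# aspect (a breakpoint rule). Data, parameters, and numbers are invented.

historical_engagements = [
    # each record: cumulative casualty fraction and whether the force broke (1) or held (0)
    {"casualty_fraction": 0.08, "broke": 0},
    {"casualty_fraction": 0.22, "broke": 1},
    {"casualty_fraction": 0.15, "broke": 0},
    {"casualty_fraction": 0.30, "broke": 1},
    {"casualty_fraction": 0.12, "broke": 1},
    {"casualty_fraction": 0.05, "broke": 0},
]

def predict_break(casualty_fraction, threshold):
    """Toy breakpoint rule: a force breaks once losses exceed a threshold."""
    return 1 if casualty_fraction >= threshold else 0

def validate(threshold, engagements):
    """Fraction of historical cases the rule gets right."""
    hits = sum(
        predict_break(e["casualty_fraction"], threshold) == e["broke"]
        for e in engagements
    )
    return hits / len(engagements)

# Iterate: adjust the model aspect, re-test against the real-world data,
# and keep the version that best matches history.
best = max((validate(t / 100, historical_engagements), t / 100)
           for t in range(5, 40))
print(f"Best accuracy {best[0]:.0%} at breakpoint threshold {best[1]:.2f}")
```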

Historians and the Early Era of U.S. Army Operations Research

While perusing Charles Shrader’s fascinating history of the U.S. Army’s experience with operations research (OR), I came across several references to the part played by historians and historical analysis in the early era of that effort.

The ground forces were the last branch of the Army to incorporate OR into their efforts during World War II, lagging behind the Army Air Forces, the technical services, and the Navy. Where the Army was a step ahead, however, was in creating a robust wartime field history documentation program. (After the war, this enabled the publication of the U.S. Army in World War II series, known as the “Green Books,” which set a new standard for government-sponsored military histories.)

As Shrader related, the first OR personnel the Army deployed forward in 1944-45 often crossed paths with War Department General Staff Historical Branch field historian detachments. They both engaged in similar activities: collecting data on real-world combat operations, which was then analyzed and used for studies and reports written for the use of the commands to which they were assigned. The only significant difference was in their respective methodologies, with the historians using historical methods and the OR analysts using mathematical and scientific tools.

History and OR after World War II

The usefulness of historical approaches to collecting operational data did not go unnoticed by the OR practitioners, according to Shrader. When the Army established the Operations Research Office (ORO) in 1948, it hired a contingent of historians specifically for the purpose of facilitating research and analysis using WWII Army records, “the most likely source for data on operational matters.”

When the Korean War broke out in 1950, ORO sent eight multi-disciplinary teams, including the historians, to collect operational data and provide analytical support for U.S. forces. By 1953, half of ORO’s personnel had spent time in combat zones. Throughout the 1950s, about 40-43% of ORO’s staff consisted of specialists in the social sciences, history, business, literature, and law. Shrader quoted one leading ORO analyst as noting that, “there is reason to believe that the lawyer, social scientist or historian is better equipped professionally to evaluate evidence which is derived from the mind and experience of the human species.”

Among the notable historians who worked at or with ORO was Dr. Hugh M. Cole, an Army officer who had served as a staff historian for General George Patton during World War II. Cole rose to become a senior manager at ORO and later served as vice-president and president of ORO’s successor, the Research Analysis Corporation (RAC). Cole brought in WWII colleague Forrest C. Pogue (best known as the biographer of General George C. Marshall) and Charles B. MacDonald. ORO also employed another WWII field historian, the controversial S. L. A. Marshall, as a consultant during the Korean War. Dorothy Kneeland Clark did pioneering historical analysis on combat phenomena while at ORO.

The Demise of ORO…and Historical Combat Analysis?

By the late 1950s, considerable institutional friction had developed between ORO, the Johns Hopkins University (JHU)—ORO’s institutional owner—and the Army. According to Shrader,

Continued distrust of operations analysts by Army personnel, questions about the timeliness and focus of ORO studies, the ever-expanding scope of ORO interests, and, above all, [ORO director] Ellis Johnson’s irascible personality caused tensions that led in August 1961 to the cancellation of the Army’s contract with JHU and the replacement of ORO with a new, independent research organization, the Research Analysis Corporation [RAC].

RAC inherited ORO’s research agenda and most of its personnel, but changing events and circumstances led Army OR to shift its priorities away from field collection and empirical research on operational combat data in favor of modeling and wargaming in its analyses. As Chris Lawrence described in his history of federally funded Defense Department “think tanks,” the rise and fall of scientific management in DOD, the Vietnam War, social and congressional criticism, and the military services’ unhappiness with the analysis led to a retrenchment in military OR by the end of the 1960s. The Army sold RAC and created its own in-house Concepts Analysis Agency (CAA; now known as the Center for Army Analysis).

By the early 1970s, analysts such as RAND’s Martin Shubik and Gary Brewer, as well as John Stockfisch, began to note that the relationships and processes being modeled in the Army’s combat simulations were not based on real-world data, and that empirical research on combat phenomena by the Army OR community had languished. In 1991, Paul Davis and Donald Blumenthal gave this problem a name: the “Base of Sand.”

Validating Attrition

Continuing to comment on the article in the December 2018 issue of the Phalanx by Alt, Morey and Larimer (this is part 3 of 7; see Part 1, Part 2).

On the first page (page 28) in the third column they make the statement that:

Models of complex systems, especially those that incorporate human behavior, such as that demonstrated in combat, do not often lend themselves to empirical validation of output measures, such as attrition.

Really? Why can’t you? In fact, isn’t that exactly the model you should be validating?

More to the point, people have validated attrition models. Let me list a few cases (this list is not exhaustive):

1. Done by Center for Army Analysis (CAA) for the CEM (Concepts Evaluation Model) using Ardennes Campaign Simulation Study (ARCAS) data. Take a look at this study done for Stochastic CEM (STOCEM): https://apps.dtic.mil/dtic/tr/fulltext/u2/a489349.pdf

2. Done in 2005 by The Dupuy Institute for six different casualty estimation methodologies as part of Casualty Estimation Methodologies Studies. This was work done for the Army Medical Department and funded by DUSA (OR). It is listed here as report CE-1: http://www.dupuyinstitute.org/tdipub3.htm

3. Done in 2006 by The Dupuy Institute for the TNDM (Tactical Numerical Deterministic Model) using Corps and Division-level data. This effort was funded by Boeing, not the U.S. government. This is discussed in depth in Chapter 19 of my book War by Numbers (pages 299-324), where we show 20 charts from such an effort. Let me show you one from page 315:

[Chart from page 315 of War by Numbers]

So, this is something that multiple people have done on multiple occasions. It is not so difficult that The Dupuy Institute could not do it. TRADOC is an organization with around 38,000 military and civilian employees, plus who knows how many contractors. I think this is something they could also do if they had the desire.
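
For readers unfamiliar with what an output-measure validation of this kind involves, here is a minimal, hypothetical sketch of the basic comparison: run the model against a set of historical engagements, then measure how far its predicted casualties diverge from the recorded ones. The figures below are invented; they are not drawn from the CEM, the TNDM, or any of the studies listed above.

```python
# Hypothetical sketch of validating an attrition output measure: compare
# model-predicted casualties to recorded historical casualties for a set
# of engagements. All numbers are invented for illustration.
import math

engagements = [
    # (predicted casualties, actual recorded casualties)
    (1200, 950),
    (430, 510),
    (2100, 1750),
    (760, 800),
    (1500, 2050),
]

errors = [(pred - actual) / actual for pred, actual in engagements]
mean_error = sum(errors) / len(errors)                      # bias
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))  # spread

print(f"Mean percentage error (bias): {mean_error:+.1%}")
print(f"Root-mean-square percentage error: {rmse:.1%}")
# A validation report would present comparisons like these across many
# engagements and assess whether the differences are acceptable for the
# model's intended use.
```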

 

Validation

Continuing to comment on the article in the December 2018 issue of the Phalanx by Jonathan Alt, Christopher Morey and Larry Larimer (this is part 2 of 7; see part 1 here).

On the first page (page 28), at the top of the third column, they make the rather declarative statement that:

The combat simulations used by military operations research and analysis agencies adhere to strict standards established by the DoD regarding verification, validation and accreditation (Department of Defense, 2009).

Now, I have not reviewed what has been done on verification, validation and accreditation since 2009, but I did do a few fairly exhaustive reviews before then. One such review is written up in depth in The International TNDM Newsletter. It is Volume 1, No. 4 (February 1997). You can find it here:

http://www.dupuyinstitute.org/tdipub4.htm

The newsletter includes a letter dated 21 January 1997 from the Scientific Advisor to the CG (Commanding General) at TRADOC (Training and Doctrine Command). This is the same organization that the three gentlemen who wrote the article in the Phalanx work for. The Scientific Advisor sent a letter out to multiple commands to try to flag the issue of validation (the letter is on page 6 of the newsletter). My understanding is that he received few responses (I saw only one; it was from Leavenworth). After that, I gather no further action was taken. This was a while back, so maybe everything has changed, as I gather they are claiming with that declarative statement. I doubt it.

The issue to me is validation. Verification is often done. Actual validations are a lot rarer. In 1997, this was my list of combat models in the industry that had been validated (the list is on page 7 of the newsletter):

1. Atlas (using 1940 Campaign in the West)

2. Vector (using undocumented turning runs)

3. QJM (by HERO using WWII and Middle-East data)

4. CEM (by CAA using Ardennes Data Base)

5. SIMNET/JANUS (by IDA using 73 Easting data)

 

Now, in 2005 we did a report on Casualty Estimation Methodologies (it is listed as report CE-1 here: http://www.dupuyinstitute.org/tdipub3.htm). We reviewed the listing of validation efforts, and from 1997 to 2005…nothing new had been done (except for a battalion-level validation we had done for the TNDM). So am I now to believe that since 2009 they have actively and aggressively pursued validation? Especially as most of that period was one of severely declining budgets, I doubt it. One of the arguments against validation made in meetings I attended in 1987 was that they did not have the time or budget to spend on validating. The budget during the Cold War was luxurious by today’s standards.

If there have been meaningful validations done, I would love to see the validation reports. The proof is in the pudding…send me the validation reports, and that will resolve all doubts.

Engaging the Phalanx

The Military Operations Research Society (MORS) publishes a periodical journal called the Phalanx. In the December 2018 issue was an article that referenced one of our blog posts. This took us by surprise. We only found out about it thanks to one of the readers of this blog. We are not members of MORS. The article is paywalled and cannot be easily accessed if you are not a member.

It is titled “Perspectives on Combat Modeling” (page 28) and is written by Jonathan K. Alt, U.S. Army TRADOC Analysis Center, Monterey, CA; Christopher Morey, PhD, Training and Doctrine Command Analysis Center, Ft. Leavenworth, Kansas; and Larry Larimer, Training and Doctrine Command Analysis Center, White Sands, New Mexico. I am not familiar with any of these three gentlemen.

The blog post that appears to be generating this article is this one:

Wargaming Multi-Domain Battle: The Base Of Sand Problem

Simply by coincidence, Shawn Woodford recently re-posted it in January. It was originally written by Shawn and published on 10 April 2017.

The opening two sentences of the article in the Phalanx read:

Periodically, within the Department of Defense (DoD) analytic community, questions will arise regarding the validity of the combat models and simulations used to support analysis. Many attempts (sic) to resurrect the argument that models, simulations, and wargames “are built on the thin foundation of empirical knowledge about the phenomenon of combat.” (Woodford, 2017).

It is nice to be acknowledged, although in this case it appears that we are being acknowledged because they disagree with what we are saying.

Probably the word that gets my attention is “resurrect.” It is an interesting word choice, implying that this is an old argument that has somehow or other been put to bed. Granted, it is an old argument. On the other hand, it has not been put to bed. If a problem has been identified and not corrected, then it is still a problem. Age has nothing to do with it.

On the other hand, maybe they are using the word “resurrect” because recent developments in modeling and validation have changed the environment significantly enough that these arguments no longer apply. If so, I would be interested in knowing what those changes are. The last time I checked, the modeling and simulation industry was using many of the same models it had used for decades. In some cases, it was going back to using simpler hex-based games for its modeling and wargaming efforts. We have blogged a couple of times about these efforts. So, in the world of modeling, unless earthshaking and universal changes made in the last five years have completely revamped the landscape…the decades-old problems still apply to the decades-old models and simulations.

More to come (this is the first of at least 7 posts on this subject).

What Multi-Domain Operations Wargames Are You Playing? [Updated]

Source: David A. Shlapak and Michael Johnson. Reinforcing Deterrence on NATO’s Eastern Flank: Wargaming the Defense of the Baltics. Santa Monica, CA: RAND Corporation, 2016.

[UPDATE] We had several readers recommend games they have used or that would be suitable for simulating Multi-Domain Battle and Operations (MDB/MDO) concepts. These include several classic campaign-level board wargames:

The Next War (SPI, 1976)

NATO: The Next War in Europe (Victory Games, 1983)

For tactical-level combat, there is Steel Panthers: Main Battle Tank (SSI/Shrapnel Games, 1996- )

There were also a couple of naval/air oriented games:

Asian Fleet (Kokusai-Tsushin Co., Ltd. (国際通信社) 2007, 2010)

Command: Modern Air Naval Operations (Matrix Games, 2014)

Are there any others folks are using out there?


A Mystics & Statistics reader wants to know what wargames are being used to simulate and explore Multi-Domain Battle and Operations (MDB/MDO) concepts.

There is a lot of MDB/MDO wargaming going on at all levels in the U.S. Department of Defense. Much of this appears to use existing models, simulations, and wargames, such as the U.S. Army Center for Army Analysis’s unclassified Wargaming Analysis Model (C-WAM).

Chris Lawrence recently looked at C-WAM and found that it uses a lot of traditional board wargaming elements, including methodologies for determining combat results, casualties, and breakpoints that have been found unable to replicate real-world outcomes (aka “The Base of Sand” problem).

C-WAM 1

C-WAM 2

C-WAM 3

C-WAM 4 (Breakpoints)
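
For readers unfamiliar with what “traditional board wargaming elements” look like in practice, here is a deliberately simplified, hypothetical sketch of an odds-ratio combat results table (CRT) lookup with a fixed breakpoint check. It illustrates the general board-game mechanic only; the table values and the 30% breakpoint are invented and do not represent C-WAM’s actual tables or rules.

```python
# Hypothetical sketch of a classic board-wargame combat resolution step:
# an odds-ratio combat results table (CRT) plus a fixed breakpoint rule.
# The CRT values and the 30% breakpoint are invented for illustration;
# they do not represent C-WAM or any real model.
import random

# CRT: odds ratio (attacker:defender) -> (attacker loss fraction, defender loss fraction)
CRT = {
    1: (0.20, 0.10),
    2: (0.15, 0.15),
    3: (0.10, 0.20),
    4: (0.05, 0.25),
}

BREAKPOINT = 0.30  # a unit "breaks" after losing 30% of its starting strength


def resolve_combat(attacker, defender, rng=random):
    """Apply one CRT-style combat round and check breakpoints."""
    odds = max(1, min(4, attacker["strength"] // defender["strength"]))
    atk_loss, def_loss = CRT[odds]
    # a die-roll shift, as in many hex games
    shift = rng.choice([-0.05, 0.0, 0.05])
    attacker["strength"] *= 1 - max(0.0, atk_loss - shift)
    defender["strength"] *= 1 - max(0.0, def_loss + shift)
    for unit in (attacker, defender):
        unit["broken"] = unit["strength"] < unit["start"] * (1 - BREAKPOINT)
    return attacker, defender


atk = {"start": 100, "strength": 100}
dfn = {"start": 60, "strength": 60}
resolve_combat(atk, dfn)
print(atk, dfn)
# The "Base of Sand" question is whether loss rates and breakpoints like
# these reproduce real-world outcomes -- something only validation against
# historical combat data can establish.
```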

There is also the wargame used by RAND to look at possible scenarios for a potential Russian invasion of the Baltic States.

Wargaming the Defense of the Baltics

Wargaming at RAND

What other wargames, models, and simulations are there being used out there? Are there any commercial wargames incorporating MDB/MDO elements into their gameplay? What methodologies are being used to portray MDB/MDO effects?

Active Defense, Forward Defense, and A2/AD in Eastern Europe

The current military and anti-access/area denial situation in Eastern Europe. [Map and overlay derived from situation map by Thomas C. Thielen (@noclador) https://twitter.com/noclador/status/1079999716333703168; and Ian Williams, “The Russia – NATO A2AD Environment,” Missile Threat, Center for Strategic and International Studies, published January 3, 2017, last modified November 29, 2018, https://missilethreat.csis.org/russia-nato-a2ad-environment/]

In an article published by West Point’s Modern War Institute last month, “The US Army is Wrong on Future War,” Nathan Jennings, Amos Fox and Adam Taliaferro laid out a detailed argument that current and near-future political, strategic, and operational realities augur against the Army’s current doctrinal conceptualization for Multi-Domain Operations (MDO).

[T]he US Army is mistakenly structuring for offensive clashes of mass and scale reminiscent of 1944 while competitors like Russia and China have adapted to twenty-first-century reality. This new paradigm—which favors fait accompli acquisitions, projection from sovereign sanctuary, and indirect proxy wars—combines incremental military actions with weaponized political, informational, and economic agendas under the protection of nuclear-fires complexes to advance territorial influence…

These factors suggest, cumulatively, that the advantage in military confrontation between great powers has decisively shifted to those that combine strategic offense with tactical defense.

As a consequence, the authors suggested that “the US Army should recognize the evolved character of modern warfare and embrace strategies that establish forward positions of advantage in contested areas like Eastern Europe and the South China Sea. This means reorganizing its current maneuver-centric structure into a fires-dominant force with robust capacity to defend in depth.”

Forward Defense, Active Defense, and AirLand Battle

To illustrate their thinking, Jennings, Fox, and Taliaferro invoked a specific historical example:

This strategic realignment should begin with adopting an approach more reminiscent of the US Army’s Active Defense doctrine of the 1970s than the vaunted AirLand Battle concept of the 1980s. While many distain (sic) Active Defense for running counter to institutional culture, it clearly recognized the primacy of the combined-arms defense in depth with supporting joint fires in the nuclear era. The concept’s elevation of the sciences of terrain and weaponry at scale—rather than today’s cult of the offense—is better suited to the current strategic environment. More importantly, this methodology would enable stated political aims to prevent adversary aggression rather than to invade their home territory.

In the article’s comments, many pushed back against reviving Active Defense thinking, which has apparently become indelibly tarred with the derisive criticism that led to its replacement by AirLand Battle in the 1980s. As the authors gently noted, much of this resistance stemmed from perceptions among Army critics that Active Defense was passive, defensively oriented, and overly focused on firepower, and from suspicions that it derived from operations research analysts reducing warfare and combat to a mathematical “battle calculus.”

While AirLand Battle has been justly lauded for enabling U.S. military success against Iraq in 1990-91 and 2003 (a third-rank, non-nuclear power it should be noted), it always elided the fundamental question of whether conventional deep strikes and operational maneuver into the territory of the Soviet Union’s Eastern European Warsaw Pact allies—and potentially the Soviet Union itself—would have triggered a nuclear response. The criticism of Active Defense similarly overlooked the basic political problem that led to the doctrine in the first place, namely, the need to provide a credible conventional forward defense of West Germany. Keeping the Germans actively integrated into NATO depended upon assurances that a Soviet invasion could be resisted effectively without resorting to nuclear weapons. Indeed, the political cohesion of the NATO alliance itself rested on the contradiction between the credibility of U.S. assurances that it would defend Western Europe with nuclear weapons if necessary and the fears of alliance members that losing a battle for West Germany would make that necessity a reality.

Forward Defense in Eastern Europe

A cursory look at the current military situation in Eastern Europe, along with Russia’s increasingly robust anti-access/area denial (A2/AD) capabilities (see map), should clearly illustrate the logic behind a doctrine of forward defense. U.S. and NATO troops based in Western Europe would have to run a gauntlet of well-protected long-range fires systems just to get into battle in Ukraine or the Baltics. Attempting operational maneuver at the end of lengthy and exposed logistical supply lines would seem dauntingly challenging. The U.S. Army’s 2nd Cavalry Regiment, a Stryker brigade combat team based in Germany, appears very much “lone and lonely.” It should also illustrate the difficulties of attacking the Russian A2/AD complex, an act which, as Jennings, Fox, and Taliaferro remind us, would actively court a nuclear response.

In this light, Active Defense—or, better, an MDO doctrine of forward defense oriented on “a fires-dominant force with robust capacity to defend in depth” and intended to “enable stated political aims to prevent adversary aggression rather than to invade their home territory”—does not really seem foolishly retrograde after all.