Category: Lessons of History

TDI Reports at DTIC

Just as a quick, easy test, I decided to find out which of The Dupuy Institute's (TDI) reports are at the Defense Technical Information Center (DTIC). Our report list is here: http://www.dupuyinstitute.org/tdipub3.htm

We are a private company, but most of these reports were done under contract for the U.S. government. In my past searches of the DTIC files, I found that maybe 40% of Trevor Dupuy's HERO reports were at DTIC. So, I would expect that at least a few of the TDI reports would be filed at DTIC.

TDI has 80 reports listed on its site. There are 0 listed on DTIC under our name.

https://publicaccess.dtic.mil/psm/api/service/search/search?&q=%22dupuy+institute%22&site=default_collection&sort=relevance&start=0

There are a significant number of reports listed based upon our work, but a search on “Dupuy Institute” yields no actual reports done by us. I searched for a few of our reports by name (combat in cities, situational awareness, enemy prisoner of war, our insurgency work, our Bosnia casualty estimate) and found four:

https://publicaccess.dtic.mil/psm/api/service/search/search?site=default_collection&q=capture+rate+study

These were four of the eight reports we did as part of the Capture Rate Study. So apparently one of the contract managers was diligent enough to make sure those studies were placed in DTIC (as was our Kursk Data Base), but since then (2001), none of our reports have been placed in DTIC.

Now, I have not checked NTIS and other sources, but I have reason to believe that not much of what we have done in the last 20+ years is archived in government repositories. If you need a copy of a TDI report, you have to come to us.

We are a private company. What happens when we decide to close our doors?

Basements

Basements appear to be very important in the world of studies and analysis. That is where various obscure records and reports are stored. As the industry gets grayer and retires, significant pieces of work are becoming harder to find. Sometimes the easiest way to find these reports is to call someone you know and ask them where to find it.

Let me give a few examples. At one point, when we were doing an analysis of Lanchester equations in combat modeling, I was aware that Bob McQuie, formerly of CAA, had done some work on it. So, I called him. It turns out he had a small file he kept of his own work, but he had loaned it to his neighbor as a result of a conversation he had. So… he reclaimed the file, two of our researchers drove over to his house, he gave us the file, and we still have it today. It turns out that much of his material is also available through DTIC. A quick DTIC search shows the following: https://publicaccess.dtic.mil/psm/api/service/search/search?site=default_collection&q=mcquie

Of particular interest are his benchmark studies. His work on “breakpoints” and his comments on Lanchester equations are not included in the DTIC listing because they were published in Army, November 1987. I have a copy in my basement. Neither is his article on the 3:1 rule (published in Phalanx, December 1989). He also did some work on regression analysis of historical battles that I have yet to locate.

Battle Outcomes: Casualty Rates As a Measure of Defeat

So, some of his work had been preserved. But, on the other hand, during that same casualty estimation methodologies study we also sent two researchers over to another “gray beard’s” house, and he let our researchers look through his basement. We found the very useful study called Report of the Model Input Data and Process Committee, referenced in my book War by Numbers, page 295. It does not show up in DTIC. We could not have found this study without a visit to his basement. He now lives in Florida, where they don’t have basements. So I assume the remaining boxes of materials he had have disappeared.

I am currently trying to locate another major study that was done by SAIC. So far, I have found one former SAIC employee who has two volumes of the multi-volume study. It is not listed in DTIC. To obtain a complete copy of the study, I am afraid I will have to contact someone else and pay to have it copied. Again, I just happen to know who to talk to in order to find out which basement it is stored away in.

It is hard to appreciate the unique efforts that go into researching some of these projects. But there is a sense at this end that as the “gray beards” disappear, reports and research efforts are disappearing with them.

Strachan On The Changing Character Of War

The Cove, the professional development site for the Australian Army, has posted a link to a 2011 lecture by Professor Sir Hew Strachan. Strachan, a Professor of International Relations at St. Andrews University in Scotland, is one of the more perceptive and trenchant observers of recent trends in strategy, war, and warfare from a historian’s perspective. I highly recommend his recent book, The Direction of War.

Strachan’s lecture, “The Changing Character of War,” proceeds from Carl von Clausewitz’s discussions in On War on change and continuity in the history of war to look at the trajectories of recent conflicts. Among the topics Strachan’s lecture covers are technological determinism, the irregular conflicts of the early 21st century, political and social mobilization, the spectrum of conflict, the impact of the Second World War on contemporary theorizing about war and warfare, and deterrence.

This is well worth the time to listen to and think about.

Isolating the Guerilla

The Vietnam War was significant in that it was the third bloodiest war in U.S. military history (58,000 U.S. killed), and the U.S. Army chose to learn no lessons from it! This last point is discussed in my book America’s Modern Wars: Understanding Iraq, Afghanistan and Vietnam.

In 1965 Trevor Dupuy’s HERO (Historical Evaluation and Research Organization) conducted a three-volume study called “Isolating the Guerilla.” It was an interesting survey of 19 insurgencies that included 26 experts on its research team, among them General Geoffrey Lord Bourne (British Army, ret.), Andrew C. Janos, and Peter Paret.

These guys:

https://en.wikipedia.org/wiki/Geoffrey_Bourne,_Baron_Bourne

http://www.nytimes.com/1975/04/26/archives/col-r-ernest-dupuy-88-dead-publicist-and-military-historian.html

https://en.wikipedia.org/wiki/Trevor_N._Dupuy

https://es.wikipedia.org/wiki/Andrew_C._Janos

https://www.goodreads.com/author/show/2793254.William_A_Nighswonger

https://en.wikipedia.org/wiki/Peter_Paret

http://www.legacy.com/obituaries/northjersey/obituary.aspx?pid=163090077

https://en.wikipedia.org/wiki/Theodore_Ropp

https://en.wikipedia.org/wiki/Gunther_E._Rothenberg

http://www.ur.umich.edu/9495/Oct03_94/29.htm

http://www.andersonfuneralhomeltd.com/home/index.cfm/obituaries/view/fh_id/12343/id/3994242

http://www.nytimes.com/1984/08/31/obituaries/frank-n-trager-78-an-expert-on-asia-dies.html

 

The first volume of the study, although developed from historical sources, was classified after it was completed. How does one classify a study that was developed from unclassified sources?

As such, the first volume of the study was in the classified safe at DMSI when I was there. I was aware of the study, but had never taken the time to look at it. DMSI went out of business in the early 1990s and all the classified material there was destroyed. The Dupuy Institute did not have a copy of this volume of the study.

In 2004 we did our casualty and duration estimate for Iraq. It was based upon a survey of 28 insurgencies. We then expanded that work to do an analysis based upon 89 insurgencies. This was done independently of our past work back in 1965, which I had never seen. This is detailed in my book America’s Modern Wars.

As this work was being completed, I was contacted in 2008 by Lt. Colonel Michael F. Trevett. It turns out he had an unclassified copy of the study. He found it in the Ft. Huachuca library. It was declassified in 2004 and was also in DTIC. So, I finally got a copy of the study after we had almost completed our work on insurgencies. In retrospect, it would have been useful to have from the start. Again, another case of disappearing studies.

In 2011, Michael F. Trevett published the study as a book called Isolating the Guerrilla. The book is the study, with many of the appendices and supporting data removed at the request of the publisher. It was a self-publishing effort that Michael paid for out of his personal/family funds. He has since left the Army. I did write the foreword to the book.

What can I say about this case? We did a study on insurgencies in 1965 that had some relevance to the wars we entered in Afghanistan in 2001 and Iraq in 2003. It remained classified and buried in a library at Ft. Huachuca, Arizona and at DTIC. It was declassified in 2004 and came back to light in 2008. This was through the effort of a single motivated Lt. Colonel who was willing to take the time and spend his own personal money to make it happen.

The SAIC Library

The story of the disappearing SAIC research library occurred in the middle of the 1990s, at the same time as the HERO Library was disappearing. SAIC had a “Military Operations Analysis Division” that for a time was a competitor to HERO/DMSI. In particular, around 1990, they hired three former HERO/DMSI employees and used them for studies that normally would have been done by us. Trevor Dupuy was on the outs with some people at the U.S. Army Concepts Analysis Agency (CAA). Sometime in the mid-1990s, SAIC decided to close down their military operations analysis division.

The early 1990s were a difficult time for defense contractors. The Warsaw Pact and the Soviet Union had disappeared and the defense industry was shrinking. SAIC got rid of the division that did analytical work for DOD as they realized it was a dying business (something that we could never get through our heads). Companies like BDM, one of the stalwarts of the industry since 1959, were sold off in the 1990s, and Trevor Dupuy’s old company, DMSI, also went out of business in the 1990s.

Anyhow, SAIC had a library for this division. It was the size of two double offices, maybe 400 square feet or more. It was smaller than the HERO Library. They decided to dissolve the library along with the division. They told the staff to grab what they wanted and dumped the rest. Having never had access to this library, I do not know if there were any holdings of value, but as SAIC had been around since 1969, it is hard to believe that there was not something unique there.

 

This post is related to:

The HERO Library

Missing HERO Reports

 

Combat Power vs Combat Power

In my last post, I ended up comparing Combat Effectiveness Value (CEV) to Combat Power. CEV is only part of combat power. In Trevor Dupuy’s formulation, Combat Power is P = (S x V x CEV)

This means that combat power (P) is the product of force strength, including weapon effects (S), operational and environmental factors (V) and human factors (CEV).
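As a minimal illustration of how this multiplicative formulation works, here is a short sketch in Python. All of the numbers in it are invented placeholders chosen only to show the arithmetic; they are not values from Dupuy's data or his Quantified Judgment Model.

```python
# Hypothetical illustration of Trevor Dupuy's combat power relationship
# P = S x V x CEV. Every number below is an invented placeholder, not a
# value taken from Numbers, Predictions and War.

def combat_power(strength, variable_factors, cev):
    """Combat power as the product of force strength (S), the combined
    operational/environmental factors (V), and relative combat
    effectiveness (CEV)."""
    return strength * variable_factors * cev

# Attacker: larger force, but attacking into defensive terrain (V < 1)
# and with an assumed effectiveness disadvantage (CEV < 1).
p_attacker = combat_power(strength=30000, variable_factors=0.8, cev=0.9)

# Defender: smaller force, favorable posture/terrain (V > 1), CEV = 1.0.
p_defender = combat_power(strength=12000, variable_factors=1.5, cev=1.0)

print(f"Attacker P: {p_attacker:,.0f}")   # 21,600
print(f"Defender P: {p_defender:,.0f}")   # 18,000
print(f"P ratio (attacker/defender): {p_attacker / p_defender:.2f}")  # 1.20
```

The point of the sketch is simply that a nominal 2.5:1 strength advantage can shrink to something much closer to parity once the operational factors and human factors are multiplied in.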

From his list of 73 variables on page 33 of Numbers, Predictions and War (NPW), the operational and environmental factors include terrain factors, weather factors, season factors, air superiority factors, posture factors, mobility effects, vulnerability factors, tactical air effects, other combat processes (including surprise), and the intangible factors (which are included in his CEV).

Again, it turns into a much longer laundry list of variables than we have from ADP 3.0.

What Makes Up Combat Power?

Trevor Dupuy used the concept of the Combat Effectiveness Value (CEV) in his models and theoretical work. This combat multiplier consisted of:

  1. Morale,
  2. training,
  3. experience,
  4. leadership,
  5. motivation,
  6. cohesion,
  7. intelligence (including interpretation),
  8. momentum,
  9. initiative,
  10. doctrine,
  11. the effects of surprise,
  12. logistical systems,
  13. organizational habits,
  14. and even cultural differences.
  15. (generalship)

See War by Numbers, page 17 and Numbers, Predictions and War, page 33. To this list, I have added a fifteenth item: “generalship,” which I consider something different than leadership. As I stated in my footnote on pages 17 & 348 of War by Numbers:

“Leadership” in this sense represents the training and capabilities of the non-commissioned and commissioned officers throughout the unit, which is going to be fairly consistent in an army from unit to unit. This can be a fairly consistent positive or negative influence on a unit. On the other hand, “generalship” represents the guy at the top of the unit, making the decisions. This is widely variable, with the history of warfare populated with brilliant generals, a large number of perfectly competent ones, and a surprisingly large number of less than competent ones. Within an army, no matter the degree and competence of the officer corps, or the rigor of their training, poor generals show up, and sometimes, brilliant generals show up with no military training (like the politician turned general Julius Caesar).

 

Anyhow, looking at the previous blog post by Shawn, the U.S. Army states that “combat power” consists of eight elements:

  1. Leadership,
  2. information,
  3. mission command,
  4. movement and maneuver
  5. intelligence
  6. fires,
  7. sustainment,
  8. and protection.

I am not going to debate the strengths and weaknesses of these two lists, but I do note that only two items (leadership and intelligence) appear on both. I prefer the 15-point list.

How Does the U.S. Army Calculate Combat Power? ¯\_(ツ)_/¯

The constituents of combat power as described in current U.S. military doctrine. [The Lightning Press]

One of the fundamental concepts of U.S. warfighting doctrine is combat power. The current U.S. Army definition is “the total means of destructive, constructive, and information capabilities that a military unit or formation can apply at a given time. (ADRP 3-0).” It is the construct commanders and staffs are taught to use to assess the relative effectiveness of combat forces and is woven deeply throughout all aspects of U.S. operational thinking.

To execute operations, commanders conceptualize capabilities in terms of combat power. Combat power has eight elements: leadership, information, mission command, movement and maneuver, intelligence, fires, sustainment, and protection. The Army collectively describes the last six elements as the warfighting functions. Commanders apply combat power through the warfighting functions using leadership and information. [ADP 3-0, Operations]

Yet, there is no formal method in U.S. doctrine for estimating combat power. The existing process is intentionally subjective and largely left up to judgment. This is problematic, given that assessing the relative combat power of friendly and opposing forces on the battlefield is the first step in Course of Action (COA) development, which is at the heart of the U.S. Military Decision-Making Process (MDMP). Estimates of combat power also figure heavily in determining the outcomes of wargames evaluating proposed COAs.

The Existing Process

The Army’s current approach to combat power estimation is outlined in Field Manual (FM) 6-0 Commander and Staff Organization and Operations (2014). Planners are instructed to “make a rough estimate of force ratios of maneuver units two levels below their echelon.” They are then directed to “compare friendly strengths against enemy weaknesses, and vice versa, for each element of combat power.” It is “by analyzing force ratios and determining and comparing each force’s strengths and weaknesses as a function of combat power” that planners gain insight into tactical and operational capabilities, perspectives, vulnerabilities, and required resources.

That is it. Planners are told that “although the process uses some numerical relationships, the estimate is largely subjective. Assessing combat power requires assessing both tangible and intangible factors, such as morale and levels of training.” There is no guidance as to how to determine force ratios [numbers of troops or weapons systems?]. Nor is there any description of how to relate force calculations to combat power. Should force strengths be used somehow to determine a combat power value? Who knows? No additional doctrinal or planning references are provided.

Planners then use these subjective combat power assessments as they shape potential COAs and test them through wargaming. Although explicitly warned not to “develop and recommend COAs based solely on mathematical analysis of force ratios,” they are invited at this stage to consult a table of “minimum historical planning ratios as a starting point.” The table is clearly derived from the ubiquitous 3-1 rule of combat. Contrary to what FM 6-0 claims, neither the 3-1 rule nor the table has a clear historical provenance or any sort of empirical substantiation. There is no proven validity to any of the values cited. It is not even clear whether the “historical planning ratios” apply to manpower, firepower, or combat power.
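To make concrete how mechanical this check is, here is a minimal Python sketch of a force-ratio comparison against a table of planning ratios. The ratios below follow the familiar 3-1-style values but should be treated as illustrative placeholders, not a verbatim reproduction of the FM 6-0 table, and the unit counts are invented; as noted above, the doctrine never says whether "force" means troops, weapons systems, or firepower scores.

```python
# Illustrative sketch of the force-ratio "sanity check" FM 6-0 implies but
# never specifies. Planning ratios and unit counts are illustrative values,
# not a reproduction of the FM 6-0 table. Friendly:enemy ratios are expressed
# as friendly divided by enemy.

PLANNING_RATIOS = {
    "attack_prepared_defense": 3.0,
    "attack_hasty_defense": 2.5,
    "counterattack": 1.0,
    "defend_prepared": 1.0 / 3.0,
    "delay": 1.0 / 6.0,
}

def meets_planning_ratio(friendly, enemy, mission):
    """Compare a friendly:enemy force ratio against the 'minimum historical
    planning ratio' for the chosen mission."""
    ratio = friendly / enemy
    required = PLANNING_RATIOS[mission]
    return ratio, required, ratio >= required

ratio, required, ok = meets_planning_ratio(friendly=27, enemy=10,
                                           mission="attack_prepared_defense")
print(f"Force ratio {ratio:.1f}:1 vs required {required:.1f}:1 -> "
      f"{'OK' if ok else 'short'}")
# Force ratio 2.7:1 vs required 3.0:1 -> short
```

Nothing in the sketch, or in the doctrine it mimics, establishes that those thresholds have any empirical validity; that is precisely the problem.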

During this phase, planners are advised to account for “factors that are difficult to gauge, such as impact of past engagements, quality of leaders, morale, maintenance of equipment, and time in position. Levels of electronic warfare support, fire support, close air support, civilian support, and many other factors also affect arraying forces.” FM 6-0 offers no detail as to how these factors should be measured or applied, however.

FM 6-0 also addresses combat power assessment for stability and civil support operations through troop-to-task analysis. Force requirements are to be based on an estimate of troop density, a “ratio of security forces (including host-nation military and police forces as well as foreign counterinsurgents) to inhabitants.” The manual advises that “most density recommendations fall within a range of 20 to 25 counterinsurgents for every 1,000 residents in an area of operations. A ratio of twenty counterinsurgents per 1,000 residents is often considered the minimum troop density required for effective counterinsurgency operations.”
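The troop density calculation, at least, is simple arithmetic. A minimal sketch, using the 20 to 25 per 1,000 range quoted above and a purely hypothetical population figure:

```python
# Troop-to-task density check using the 20-25 counterinsurgents per 1,000
# residents range quoted from FM 6-0. The population figure is hypothetical.

def required_security_forces(population, per_thousand=(20, 25)):
    """Return the (minimum, maximum) security force sizes implied by a
    counterinsurgents-per-1,000-residents planning range."""
    low, high = per_thousand
    return population / 1000 * low, population / 1000 * high

lo_force, hi_force = required_security_forces(population=750_000)
print(f"Implied force requirement: {lo_force:,.0f} to {hi_force:,.0f}")
# Implied force requirement: 15,000 to 18,750
```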

While FM 6-0 acknowledges that “as with any fixed ratio, such calculations strongly depend on the situation,” it does not mention that any references to force level requirements, tie-down ratios, or troop density were stripped from both Joint and Army counterinsurgency manuals in 2013 and 2014. Yet, this construct lingers on in official staff planning doctrine. (Recent research challenged the validity of the troop density construct but the Defense Department has yet to fund any follow-on work on the subject.)

The Army Has Known About The Problem For A Long Time

The Army has tried several solutions to the problem of combat power estimation over the years. In the early 1970s, the U.S. Army Center for Army Analysis (CAA; known then as the U.S. Army Concepts & Analysis Agency) developed the Weighted Equipment Indices/Weighted Unit Value (WEI/WUV or “wee‑wuv”) methodology for calculating the relative firepower of different combat units. While WEI/WUVs were soon adopted throughout the Defense Department, the subjective nature of the method gradually led it to be abandoned for official use.
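For readers unfamiliar with the methodology, the general shape of a WEI/WUV calculation was to give each item of equipment an effectiveness index, multiply by its quantity, weight the result by equipment category, and sum to a single unit value. The Python sketch below captures only that general shape; every index and weight in it is invented for illustration and is not one of CAA's actual values.

```python
# Sketch of the general shape of a WEI/WUV-style calculation: each item of
# equipment gets a weapon effectiveness index (WEI), quantities are
# multiplied by those indices, and category weights roll everything up into
# a single weighted unit value (WUV). All indices, weights, and quantities
# below are invented for illustration; they are not CAA's actual values.

CATEGORY_WEIGHTS = {"tanks": 1.0, "apcs": 0.4, "artillery": 0.9}

UNIT_EQUIPMENT = {
    # category: list of (weapon_effectiveness_index, quantity)
    "tanks": [(1.10, 58)],
    "apcs": [(0.55, 110)],
    "artillery": [(1.00, 24), (1.25, 8)],
}

def weighted_unit_value(equipment, weights):
    """Sum WEI x quantity within each category, then apply category weights."""
    wuv = 0.0
    for category, items in equipment.items():
        category_score = sum(wei * qty for wei, qty in items)
        wuv += weights[category] * category_score
    return wuv

print(f"WUV: {weighted_unit_value(UNIT_EQUIPMENT, CATEGORY_WEIGHTS):,.1f}")
# tanks: 63.8; apcs: 60.5 * 0.4 = 24.2; artillery: 34.0 * 0.9 = 30.6
# total WUV: 118.6
```

The subjectivity the methodology was criticized for lives in exactly those indices and weights: change them and the relative firepower of two units changes with them.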

In the 1980s and 1990s, the U.S. Army Command & General Staff College (CGSC) published the ST 100-9 and ST 100-3 student workbooks that contained tables of planning factors that became the informal basis for calculating combat power in staff practice. The STs were revised regularly and then adapted into spreadsheet format in the late 1990s. The 1999 iteration employed WEI/WUVs as the basis for calculating firepower scores used to estimate force ratios. CGSC stopped updating the STs in the early 2000s, as the Army focused on irregular warfare.

With the recently renewed focus on conventional conflict, Army staff planners are starting to realize that their planning factors are out of date. In an attempt to fill this gap, CGSC developed a new spreadsheet tool in 2012 called the Correlation of Forces (COF) calculator. It apparently drew upon analysis done by the U.S. Army Training and Doctrine Command Analysis Center (TRAC) in 2004 to establish new combat unit firepower scores. (TRAC’s methodology is not clear, but if it is based on this 2007 ISMOR presentation, the scores are derived from runs by an unspecified combat model modified by factors derived from the Army’s unit readiness methodology. If described accurately, this would not be an improvement over WEI/WUVs.)

The COF calculator continues to use the 3-1 force ratio tables. It also incorporates a table for estimating combat losses based on force ratios (this despite ample empirical historical analysis showing that there is no correlation between force ratios and casualty rates).

While the COF calculator is not yet an official doctrinal product, CGSC plans to add Marine Corps forces to it for use as a joint planning tool and to incorporate it into the Army’s Command Post of the Future (CPOF). TRAC is developing a stand-alone version for use by force developers.

The incorporation of unsubstantiated and unvalidated concepts into Army doctrine has been a long-standing problem. In 1976, Huba Wass de Czege, then an Army major, took both “loosely structured and unscientific analysis” based on intuition and experience and simple counts of gross numbers to task as insufficient “for a clear and rigorous understanding of combat power in a modern context.” He proposed replacing them with an analytical framework for analyzing combat power that accounted for both measurable and intangible factors. Adopting a scrupulous method and language would overcome the simplistic tactical analysis then being taught. While some of the essence of Wass de Czege’s approach has found its way into doctrinal thinking, his criticism of the lack of objective and thorough analysis continues to echo (here, here, and here, for example).

Despite dissatisfaction with the existing methods, little has changed. The problem with this should be self-evident, but I will give the U.S. Naval War College the final word here:

Fundamentally, all of our approaches to force-on-force analysis are underpinned by theories of combat that include both how combat works and what matters most in determining the outcomes of engagements, battles, campaigns, and wars. The various analytical methods we use can shed light on the performance of the force alternatives only to the extent our theories of combat are valid. If our theories are flawed, our analytical results are likely to be equally wrong.

The HERO Library

The first research library that I was aware of that was broken up and scattered was Trevor Dupuy’s library that was kept at HERO/DMSI (HERO was a division of DMSI at this stage).

One has to talk about corporate structure here for a moment. HERO (Historical Evaluation and Research Organization) had been established and built up by Trevor Dupuy. In an attempt to expand the business, he created a company called DMSI (Data Memory Systems Incorporated), of which he was just one of the owners (although with about 40% of the stock). In the 1987-1989 period, U.S. defense spending reached its peak, as did DMSI, which had 25 employees. With Glasnost, Perestroika, the Warsaw Pact dissolving, and finally the Soviet Union collapsing in 1991, the defense budget collapsed, and in the resulting downturn, so too did DMSI. With the business crashing, Trevor Dupuy had a falling out with the other management and quit his own company.

DMSI/HERO had an extensive library and an extensive collection of research files, dating back to its founding in 1962. They even had shared library privileges with the Library of Congress due to some unique material in the HERO library. The library took up a large room, and there were file cabinets full of research files.

This library was broken up. First, Trevor Dupuy took with him the report file he kept in his office. This was the entire collection of 120+ reports written by HERO. Those eventually ended up at TDI (The Dupuy Institute). The library remained at DMSI, except for those books the Dupuy family took from it, which I gather were considerable. These are still in the hands of the Dupuy family. Then DMSI went out of business around 1993, and the remaining files and library were scattered. The employees were invited to take what they wanted out of the files. After that, one employee decided to rescue the files that he thought were important and moved them to his barn in rural Virginia. These included most (but not all) of the Ardennes files and the Grace Hayes files (she was the original VP of research at HERO). A few years later, we arranged with that ex-employee to reclaim the files, and he graciously brought them to our office from his barn. After we blew the dust, dirt, hay, and mouse droppings off of them, we refiled them at TDI. The files taken by other employees were not recovered. Most of the remaining library was taken by a principal at the company and moved to his basement. The books were used for his business for a while. Eventually, he needed to clean out his basement, and the “HERO Library” ceased to exist.

We were able to save most of the critical files, meaning the reports, most of the Ardennes files, and the Grace Hayes files. The rest was lost, which was a shame, although not overwhelmingly critical. Still, the process got my attention, and this is potentially a problem with any private company. Unless someone goes through some extraordinary process to preserve the files and libraries of their work, then when a private company collapses (which most do at some point), those files are lost. I am aware of several other cases like this.

Missing HERO Reports

Back in 1987 I did a DTIC search for HERO reports (DTIC is the defense library of reports; HERO was Trevor Dupuy’s old company(s), which produced around 130 or so reports). My DTIC search ended up finding something like 40% of the HERO reports in their files. As almost all of these reports were done under contract for the government, the figure should have been something more like 100%.

Now, I guess if I was a responsible citizen, I would have made sure that all the missing reports were identified, copies made, and then sent to DTIC. I did not do this, because I was busy running a large project that was behind schedule and over budget (the Ardennes Campaign Simulation Data Base).

But… this little survey got my attention concerning what was preserved in the national report libraries (like NTIS, the National Technical Information Service) and what was not. It raises the question as to whether we are properly preserving the studies and results of all the analysis that various companies have done, or whether we are losing some of it. Unfortunately, I have found enough cases over the years of significant and important studies and files disappearing or becoming difficult to find that I have become concerned. Whether this is indicative of a larger problem I will leave to the reader to determine, but there will be a few blog posts about this subject over the next week or two.

All of our reports and studies are listed on our website here: http://www.dupuyinstitute.org/tdipubs.htm

It is some 130 HERO reports and 80 TDI reports.

It would be possible for someone to search the NTIS site and see how many of these reports can be found (and which ones cannot). I would be interested in knowing the results if anyone wants to spend a few days doing this. If the problem exists for HERO and TDI reports, then it probably exists for work done by a lot of other companies also.