Militarized Microprocessors

In the aftermath of the overwhelming US victory over Iraq in the 1991 Gulf War, US military leaders roundly postulated a narrative in which bleeding-edge American technological superiority would lift the fog of war and lead to a near-deterministic approach to military conflict. Across the technological landscape of the early 1990s, the microprocessor had finally become powerful enough that it could be installed in ruggedized combat systems and, in turn, transform analog technology into smart weapons. Stealth fighter-bombers delivered precision-guided munitions for the first time, aided by the Global Positioning System (GPS); on land, the M1 Abrams main battle tank employed a microprocessor-intensive laser rangefinder and fire control system for the first time. The lesson seemed all but certain – the proliferation of microprocessors and custom software would eventually lead to such a surgically precise, informationalized application of combat power that the perennial ‘fog of war’ would finally be lifted for US commanders. Accordingly, the US Army doctrine of Battle Command was postulated, in which technologically driven information hyper-awareness would (and should) be commanded in a top-down control model. If only Army leaders had understood the evolution of computing’s architectural design patterns, they would have realized they were living in the wrong era, assessing the fruition of digitized combat nearly thirty years too early. Let historical lessons be hard learned: you can’t lift the Fog of War until you develop the Cloud of War.

Mainframes and Maestros

Information technology has a curious history, in that its evolving architectural design patterns always precede the US Army’s doctrine of land warfare by five to seven years. That transformations in technology necessarily find application in warfare should be no surprise. What is curious is the US Army’s way of adapting its doctrine to, and even espousing, the same tenets – the themes – of mainstream computing roughly half a decade to a full decade after Silicon Valley has arrived at the same conclusion. One of the most central questions in computing has always been where processing power should be concentrated, and for decades the answer has ticked and tocked like a pendulum between centralized consolidation and dispersed coordination. The US military has typically experienced a similar see-saw effect: concentrating Command and Control (C2) power at the highest echelons during peace, gradually divesting that power down to the lowest ranks during actual, prolonged conflict, and then, as hostilities draw to a close, invariably reconsolidating C2 functions back at the rear-echelon command post. In computing, the dominant model from the 1950s to the mid-1980s was the mainframe and terminal. Computational power and storage were concentrated in a central location, heavily endowed with support resources, while the terminal had nearly no computational capability or memory of its own. The top-down C2 model of military planning and organization was thematically (and strikingly) similar to the mainframe-terminal model for those many decades, and for good reason. In the late 1980s and early 1990s, modern warfare was being imagined on a global and theater-wide scale.
No commander with less than a 250K:1 view of the globe could have seen and exploited, say, defensive lines of weakness in the heavily layered tank formations facing off with the Soviet Union on the European continent, or fathomed the coordinated strategic messaging required to make the very existence of nuclear weapons serve their paradoxical purpose: preventing their actual use. No, up to and including the cusp of the first Gulf War, defense postures worldwide followed a mainframe-terminal, traditional set-piece doctrine because doing so worked. General Colin Powell, then Chairman of the Joint Chiefs of Staff, understood relentless forward progress, but he also understood supply lines. He understood logistics, air mobility, and combat sustainment and support, and he espoused a doctrine of overwhelming attack that could never have been achieved by instilling decision-making at the lowest ranks of the military. Massive enterprises require leaders looking down from above more than they require problem-solvers on the ground. As such, the 1991 Gulf War was fought the way a mainframe computer was designed to operate: centralized, hierarchical, and deterministic. It assumed efficiency was only possible in broad-sweeping policy and usage decisions, rather than in individually decided, localized subroutines which in aggregate might achieve strategic objectives.


After the dissolution of the Soviet Union in the early 1990s, the doctrine of Battle Command – in a sense, a byword for the Powell Doctrine – was instantiated, yet it described only one end of the spectrum of human conflict: well-known, uniformed, strategic-thinking enemies, static alignments, set-piece battles, and an overwhelming preoccupation with brute strength, speed, and destructive capability – i.e., conventional war. As the so-called Peace Dividend was spent, the mid-to-late 1990s saw the rise of two seemingly disconnected phenomena: Military Operations Other Than War (MOOTW – e.g., Somalia, Haiti, the Balkans) and the ascendance of desktop computing. Once again, the pendulum had swung. MOOTW required a greater level of autonomy in US Army task forces due to the expanding mission set and the unpredictable nature of the counter-US forces that complicated those missions. Unlike Battle Command’s emphasis on a thinking enemy, the Hurricane Andrew relief effort in Florida, the stability operations in Haiti, and the military support to California law enforcement during the 1992 Los Angeles riots all involved macro-problems that could sway a population’s attitude for or against the US military, yet did not involve a strictly human adversary. In other words, in MOOTW the US military’s success could not be determined by destructive force and organization alone. In parallel, the dramatic increase in microprocessing power heralded by Intel’s x86 Pentium processor, together with the ubiquity and ease of use of Windows 95, drove the top-down mainframe-terminal model into obsolescence.
It was this profound exponentiation of technological advancement that had been on display in the Gulf War and the small wars of the 1990s; the same advancement in consumer technologies next allowed individuals to do everything from completing office tasks on their own home computers, to consuming media on ever-smaller platforms, to storing data long-term; it enabled autonomy and independent decision-making. Peer-to-peer file-sharing and P2P chat services further disrupted the old client-server model, placing greater onus on the client to send and receive content. The relationship between these two opaquely related trends is certainly curious: the rise of the desktop computer, and the way it empowered the basic end user (often without any specific computer training) to enter and engage in the digital revolution, turned out to be a near-perfect predictor of the kind of military doctrine needed to address the complexities of MOOTW and the small wars of the 1990s.

The Uncertainty Problem, a.k.a., ‘A slight detour into the late 90’s Marine Corps’

The US Army formally recognizes that “the uncertainty and complexity of future operations will demand forces that can operate in a decentralized manner…” Mission command has therefore evolved in conjunction with the central tenet of operational adaptability. However, this idea is certainly not new.

In 1999, Marine Corps General Charles Krulak postulated the concept of the “Three Block War,” in which a Marine task force would theoretically be called upon to conduct full-scale military operations on one block of an urban environment, then necessarily conduct peacekeeping operations on the next block, and on yet a third contiguous city block be required to conduct humanitarian aid missions. General Krulak was not writing metaphorically. Rather, he recognized the inherent extremes of urban conflict and counterinsurgency nearly five years before the Marines’ heralded battles for Fallujah, Iraq proved his assessment accurate.

General Krulak’s solution to tamping down mission extremes was the Strategic Corporal. He recognized that leadership in complex, rapidly evolving mission environments must be dispersed lower down the chain of command in order to better exploit time-critical information; it must ultimately land on the corporal. Otherwise, waiting on mission instructions from remotely located C2 structures would likely result in mission failure or avoidable casualties. In the contest of wills for not only military dominance but, just as importantly, the hearts and minds of the local populace, the lowest echelon’s tactical actions came to have a strategic effect. National public opinion could be galvanized for or against the US military based on inherently tactical decisions. As such, General Krulak’s follow-on recommendations for training and doctrine were to invest more time and resources in the training of even the lowest-ranking noncommissioned officers.


The wars in Iraq and Afghanistan in the 2000s and 2010s once again impelled the pendulum swing in US Army doctrine. Perhaps unsurprisingly, a parallel transition in computing to a decentralized, cloud model was likewise underway. But why the rise of the Cloud over the client-server/desktop model? With an annualized failure rate of up to 8.6%, individual servers cannot be trusted to bear the full burden of constant readiness on their own; there must be a system of mutually reinforcing components that work cohesively because they all follow a unified framework (e.g., Apache Hadoop, Microsoft Azure). Cloud architecture still gives primacy to the individual server, but it is designed to adapt to a server failure, act decisively based on a common architecture, and yet remain agile enough in implementation and reconfiguration that developers can take the initiative and apply the technology to evolving problem sets. Clearly, the Cloud is the smarter, more agile version of the mainframe-terminal model. The US Army’s current doctrine of Mission Command is likewise the smarter, more agile version of Battle Command. Both imagine Full-Spectrum Operations, and both seek to solve perennial problems as well as evolving problems not yet imagined.
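The logic of redundancy behind that failure-rate argument can be made concrete with a little back-of-envelope arithmetic. The sketch below is illustrative only: it assumes the 8.6% annualized failure rate cited above and treats server failures as independent (real clusters see correlated failures, so actual systems fare somewhat worse), and the three-copy replication factor is borrowed from Hadoop’s well-known default.

```python
# Back-of-envelope sketch: why replicating data across servers tames
# the individual-server failure rate. Assumes independent failures at
# the 8.6% annualized rate cited above -- an idealization, not a model
# of any real cluster.

def all_replicas_fail(annual_failure_rate: float, replicas: int) -> float:
    """Probability that every replica of a piece of data is lost in a year,
    assuming independent failures."""
    return annual_failure_rate ** replicas

one_copy = all_replicas_fail(0.086, 1)     # a lone server
three_copies = all_replicas_fail(0.086, 3) # Hadoop-style 3x replication

print(f"1 copy  : {one_copy:.4%}")      # 8.6000%
print(f"3 copies: {three_copies:.4%}")  # 0.0636%
```

Under these assumptions, triplicating data drops the chance of total loss from roughly one in twelve to well under one in a thousand – which is why a cloud framework can give primacy to the individual server while shrugging off its failure.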

The mission command framework may be the only model capable of providing not only a suitable structural design on a near-thematic level, but also the raw computing power needed to achieve, in total, some of the US military’s more ambitious goals. A global information grid providing real-time, actionable tactical intelligence to ground operators through easy-to-access heads-up displays or wrist-worn computing systems will require tremendous data-ingest capability and resilient dataflow avenues, plus processing and correlating power, dedicated satellite bandwidth on both the uplink and downlink, and robust, automated encryption and decryption at the user’s information terminal (no matter how small its form factor). Those requirements admittedly go beyond what a cloud architecture alone can provide. But then again, those interlocking technologies could never function on their own without the massive computational throughput of a cloud architecture backing them up.

In sum, the military planners of the early 1990s were not incorrect in their assessments of possibilities and commensurate command structures; they were just thirty years too early. Truly, had they known the timescale for the rise and then dissolution of the desktop as the dominant paradigm in computing, they could perhaps have predicted the efficacy of hybridizing the mainframe-terminal model with the desktop model and projected it into the mid-2010s. Only then would they have been able to fully appreciate that you cannot lift the Fog of War without first inventing the Cloud of War.