Computing Reviews
Carl von Clausewitz, the fog-of-war, and the AI revolution: the real world is not a game of Go
Wallace R., Springer International Publishing, New York, NY, 2018. 102 pp. Type: Book (978-3-319-74632-6)
Date Reviewed: Jan 14 2019

As artificial intelligence (AI) is increasingly used in many areas, including fully automated vehicle control and military applications, there are concerns about how effectively such systems deal with extreme conditions. Carl von Clausewitz, the 19th-century military theorist, coined the term “the fog of war,” a pithy phrase that parallels Helmuth von Moltke’s more modern observation: “No battle plan ever survives contact with the enemy” [1]. Military engagements are chaotic and communication is difficult, so leaders must deal with delayed, conflicting, or erroneous intelligence reports when making real-time decisions about how to proceed. In this brief monograph, the author compares the conditions of battle with how AI systems might behave when, for example, “driving a fast car on a twisting, pot-holed road at night.” Will performance degrade gracefully, or is there a threshold beyond which catastrophic failure occurs?

Following von Clausewitz, two primary conditions are explored using models drawn from physical analogs. The first is the fog of war, that is, limitations on the data available for making decisions. The second is friction, the difficulty of exerting control under adverse conditions. An AI system must operate a control loop in which it reads data from sensors, analyzes the data, and then decides what to do; there is a clear analog to a military commander in the field using reconnaissance to make battlefield decisions. As US military strategist John Boyd noted, getting “inside” the opponent’s decision loop (his observe-orient-decide-act, or OODA, loop) is the key to victory because it disrupts the opponent’s ability to proceed effectively. Outside of battle, what happens if a self-driving car’s input is missing crucial data due to sensor failure, if the data exceed the AI’s capacity to process them, or if, conversely, contention for bandwidth among many such vehicles limits the data available?
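
The monograph contains no code, but the loop described above is easy to picture. The following minimal Python sketch (every name and threshold is my own hypothetical illustration, not anything from the book) shows a sense-analyze-act cycle that falls back to a conservative action when its input is missing or stale, the software equivalent of fog:

import random
import time

STALENESS_LIMIT_S = 0.5  # hypothetical: readings older than this count as "fog"

def read_sensors():
    # Simulated sensor read with an arbitrary 20 percent dropout rate.
    if random.random() < 0.2:
        return None
    return {"obstacle_distance_m": random.uniform(0.0, 50.0),
            "timestamp": time.time()}

def decide(reading):
    # Analyze the data; degrade gracefully rather than guess.
    if reading is None:
        return "slow_down"  # missing data
    if time.time() - reading["timestamp"] > STALENESS_LIMIT_S:
        return "slow_down"  # stale data is also fog
    if reading["obstacle_distance_m"] < 5.0:
        return "brake"
    return "cruise"

def act(action):
    print("action:", action)

# The sense-analyze-act loop. In Boyd's terms, an adversary (or an
# environment) that forces dropouts faster than this loop can cycle
# is "inside" the loop.
for _ in range(10):
    act(decide(read_sensors()))
    time.sleep(0.1)

The book’s question is what happens as the fraction of missing or stale readings grows: does the fallback remain graceful, or does the system cross a threshold into qualitatively different failure?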

This question is first addressed with Feynman’s notion of “information as free energy,” and then with statistical concepts from thermodynamics, to model a system’s behavior as information flow is increasingly degraded. The model indicates that at a certain point an event similar to a phase change (water to ice) can occur, shifting the system from stable to unstable. Having built the model, the author applies it to several examples: bus service disruption due to passenger crowding, a military case in which the tactical situation changes too quickly for strategists to react, and what could happen with the introduction of autonomous weapons. In the last case, the author notes that graceful degradation can suddenly fail into an “all potential targets are enemies” condition, or, in more picturesque language, “kill them all and let God sort them out.” The monograph concludes with an investigation of the evolutionary properties of real-time systems and their possible pathological outcomes. The view expressed in the concluding chapter is that using AI systems for critical real-time applications will eventually produce outcomes in which “considerable morbidity and mortality are expected.”
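
The review does not reproduce the mathematics, but the shape of the argument can be sketched. In statistical mechanics, a phase transition appears as a non-analytic point in the free energy density; the analogous information-theoretic construction below is my own schematic of this style of reasoning, not necessarily the author’s exact notation:

% Free energy density: a point where F(T) is non-analytic marks a
% phase transition, e.g., water to ice.
F(T) = -T \lim_{V \to \infty} \frac{\log Z(T, V)}{V}

% Schematic analog: N(K, n) counts the meaningful signal paths of
% length n available to the control system, and K parameterizes how
% degraded the information inflow is (fog and friction).
H(K) = \lim_{n \to \infty} \frac{\log N(K, n)}{n}

On this analogy, a value of K at which H ceases to be smooth plays the role of the freezing point: past it, the system’s behavior does not merely worsen, it changes character.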

As noted above, the text is a brief 102 pages, including an appendix on the mathematical concepts used. The arguments are presented using the techniques of mathematical physics, and the derivations are similarly brief; readers need backgrounds in both calculus and physics to fully understand the models. I am disappointed by the author’s choice, in the last chapter, to direct an extreme vulgarity at AI developers; such language is inappropriate in scholarly work. Finally, one could debate how well the models describe the behavior of AI systems yet to be built, but they clearly suggest that developers must take great care to understand and test such systems under extreme conditions before releasing them to the public.

Reviewer: G. R. Mayforth
Review #: CR146379 (1904-0103)
1) von Moltke, H. As quoted in Donnybrook: the Battle of Bull Run, 1861, Harcourt Books, Orlando, FL, 2004.

Applications And Expert Systems (I.2.1)
Chaotic Systems (G.1.7 ...)
Command And Control (J.7 ...)
Learning (I.2.6)
Other reviews under "Applications And Expert Systems":
Institutionalizing expert systems: a handbook for managers
Liebowitz J. (ed), Prentice-Hall, Inc., Upper Saddle River, NJ, 1991. Type: Book (9780134720777)
Nov 1 1991
Verifying and validating personal computer-based expert systems
Bahill A., Prentice-Hall, Inc., Upper Saddle River, NJ, 1991. Type: Book (9780139574573)
Jun 1 1992
Knowledge-based systems: a manager’s perspective
Tuthill G., Levy S., TAB Books, Blue Ridge Summit, PA, 1991. Type: Book (9780830634798)
Dec 1 1991
