Computing Reviews
Computers in battle: will they work?
Bellin D. (ed), Chapman G. (ed), Harcourt Brace & Co., Orlando, FL, 1987. Type: Book (9780151212323)
Date Reviewed: Sep 1 1988

The book consists of eleven chapters about past, present, and future uses of computers in the electronic battlefield. The chapters and their authors are

  • Computers in Battle: A Human Overview by S. M. Ornstein

  • A History of Computers and Weapon Systems by P. N. Edwards

  • The New Generation of High-Technology Weapons by G. Chapman

  • Computer System Reliability and Nuclear War by A. Borning

  • Computers and the Strategic Defense Initiative by E. Roberts and S. Berlin

  • The Strategic Computing Program by J. Jacky

  • Computers in Weapons: The Limits of Confidence by D. L. Parnas

  • Artificial Intelligence as Military Technology by T. Athanasiou

  • High Technology and the Emerging Dual Economy by L. Siegel and J. Markoff

  • Role of Military Funding in Academic Computer Science by C. Thomborson

  • Computers and War: Philosophical Reflections on Ends and Means by J. Ladd

In addition, the book contains extensive references cited by each author and an explanation of the acronyms used. There is also a resource list of organizations for information about computers and weapons.

The general theme of the book is that we should not trust computers to protect us from nuclear annihilation. Each chapter presents a different perspective on the same theme. The following quotations have been drawn from the book.

In his chapter on the human overview of computers in battle, S. Ornstein states that

The real source of [a computer’s power] comes [not just from its speed, but] from the computer’s ability to make a choice based on conditions at the time the program is running--either the condition of information within the machine (Have I added up all the numbers yet?) or outside the machine (Are there any radar reflections from this direction?).

Not surprisingly, he indicates that

When something we’re using breaks, ninety-nine times out of a hundred, it is a hardware failure rather than a design failure. At least if it’s not a computer.

In conclusion, he warns

Computers provide some of the most stunningly clear evidence of how naive we are about ourselves. To anyone with any capacity for humility, they are a telling object lesson. Nonetheless, there are plenty of opportunists willing to encourage fantasies in exchange for some of the loot. Such fantasies are rubbish, dangerous rubbish. Not only will computer-controlled systems fail to provide protection, but if we really persist in relinquishing our responsibilities and putting more and more higher and higher levels of evaluation and decision making into their hands, we will be sealing our fate. If we continue on this course, that fate will be richly deserved.

The use of computers in weapon systems is traced in the chapter by P. Edwards, where he states that

The measures and countermeasures of the electronic battlefield have become so complex, so rapid, and so interlocked that the human element in war is increasingly difficult to discern. . . . The irony of the automation process is that it can never approach the asymptote of perfection that drives its progress. War is essentially human: it is about human conflicts and human problems, which automation cannot solve.

In his discussion about autonomous weapons, G. Chapman states that

Autonomous weapons subvert all international law concerning the just deployment of force because computers can never be punished. The basis of all law is responsibility and the prospect of retribution, which are meaningless to computers.

In his chapter on computer systems reliability, A. Borning indicates that

The standard of reliability required of a military system that can potentially help precipitate a thermonuclear war if it fails must be higher than any other computer system, since the magnitude of disaster is so great.

He goes on, however, to state that

most professional programmers today do not use such software engineering techniques as structured programming, modularity and information hiding, cooperating sequential processes, or formal program semantics.

In conclusion, he summarizes:

We must recognize the limits of technology. The threat of nuclear war is a political problem, and it is in the political, human realm that solutions must be sought.

E. Roberts and S. Berlin, in their chapter on computers and SDI, present the following conclusions:

At present, we must conclude that SDI, given its complexity, lies beyond the limits of software engineering. And even if we could build such a system, the fact that we could not test it under realistic conditions would make it impossible to achieve the necessary level of confidence in its effectiveness.

They cite the Eastport study, which notes that

an architecture that cannot be tested or that relies on software that does not work is of no value.

D. L. Parnas indicates in his chapter on the limits of confidence in computers in weapons that

It is in the self-interest of weapons manufacturers to pretend that there are no limits on our ability to develop highly sophisticated computer-based weapon systems. It is in their interest to promise that investment in such systems can increase our security. It is in the national interest to recognize that there are limits on our own ability to build such systems and limits on the ability of technologists to solve the political problems that make us insecure. We have been pursuing a technological solution to the problem of security throughout our history. It is time to recognize that this approach is both limited and, because of the power of modern weapons, unacceptably dangerous.

This reliance on military technology instead of on politics and diplomacy is echoed in the chapter by T. Athanasiou, in which he concludes:

Technology has a role to play in promoting peace, but its role is not a central one. Defensive military postures and their associated tools, important though they may be, cannot substitute for the social and political changes that must underlie any real movement away from the militarist brink. Technology cannot solve the arms race, just as it cannot by itself bring us peace. It is the militaristic dynamic that must be broken.

The chapter by L. Siegel and J. Markoff discusses the military takeover of university research. They state in their conclusion:

If the military attracts the best and brightest engineers, programmers, and computer scientists and puts them to work on focused military problems behind closed doors, then those segments of American high-technology industry that compete in the marketplace will be severely hurt.

This theme is echoed by C. Thomborson in his chapter, when he states that

If the DoD is allowed to maintain control of our R&D establishment, it will continue to sap our nation’s commercial and political vitality. The DoD is preventing us from developing our civilian and scientific networking capability. It has also damaged our nation’s ability to design, market, and sell commercial lines of digital circuits.

He concludes that we must force the DoD to relinquish control of our nation’s research and development policy through public pressure.

In the final chapter of the book, J. Ladd sums up by stating that

Technological certainty does not wipe out military and political uncertainty.

Neither does it eliminate accountability.

As warfare becomes more and more technological and technology becomes more computer controlled, accountability becomes less and less possible. Accountability and computer technology mix like oil and water. For accountability is essentially a human and social response to one’s fellows when the going is rough.

I found three errors that might lead the casual reader astray. The first is the use of the word break on page 22, when it should be clear that the author is talking about brake linings. The second error is in reference to the probability of the first Space Shuttle computers not being properly synchronized at launch time. On page 125, A. Borning indicates that the probability is 1 in 67, while on page 161, E. Roberts and S. Berlin state that the problem would “surface only once in every sixty-four attempts to initialize the system.” Interestingly enough, both chapters cite the same paper by J. Garman [1]. The third error is on page 220 in Parnas’s equation (x + 2y + z)y, which should be expanded to xy + 2y² + zy, not x + y + z.
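The third erratum is easy to confirm for oneself. The following spot-check (my own sketch, not from the book) verifies numerically that (x + 2y + z)y equals xy + 2y² + zy and differs from the book's printed x + y + z:

```python
# Check that (x + 2y + z) * y expands to x*y + 2*y**2 + z*y,
# and that the book's printed expansion x + y + z is wrong.
def product(x, y, z):
    return (x + 2 * y + z) * y

def correct_expansion(x, y, z):
    return x * y + 2 * y ** 2 + z * y

def printed_expansion(x, y, z):
    return x + y + z

# Spot-check a few integer triples.
for x, y, z in [(1, 2, 3), (4, 5, 6), (7, 8, 9)]:
    assert product(x, y, z) == correct_expansion(x, y, z)
    assert product(x, y, z) != printed_expansion(x, y, z)
```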

On the whole, I recommend this book to anyone who is concerned about the uncontrolled and growing use of computers in warfare.

Reviewer:  John Cupak, Jr. Review #: CR112511
1) Garman, J. R. The “bug” heard ’round the world. Softw. Eng. Notes 6, 5 (Oct. 1981), 3–10.
Military (J.7)
Performance (D.4.8)
Reliability (D.4.5)
Performance of Systems (C.4)
