This book presents the theoretical side of game-theoretic control problems, emphasizing the theory and analysis of various differential games. A differential game is one in which opposing players, each steering a system governed by differential equations, attempt to reach a satisfactory conclusion to the game; the approach-evasion problem is a typical example. Since a differential game is a minimax problem, the book also discusses solutions of minimax problems. The book consists of 11 chapters (approximately 500 pages), with the mathematical discussion presented at a rather abstract level. This is a book primarily for workers in theoretical control theory. Some examples are given, but the majority of the text is devoted to theory.
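As a rough sketch in standard notation (not necessarily the book's own), a differential game couples dynamics driven jointly by the two players with a quality functional that one player minimizes and the other maximizes:

```latex
% Dynamics jointly driven by the minimizing control u and the
% maximizing control v, with constraint sets P and Q:
\dot{x} = f(t, x, u, v), \qquad u \in P, \; v \in Q, \qquad x(t_0) = x_0 .

% The value of the game is the minimax of a quality (cost)
% functional \gamma over the players' admissible strategies U and V:
\operatorname{val} \;=\; \min_{U} \, \max_{V} \; \gamma\bigl(x(\cdot)\bigr).
```

In the approach-evasion problem, for instance, the functional measures proximity to a target set that one player tries to reach while the other tries to avoid it.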
Chapter 1 describes the minimax control problem in terms of the equations of motion, strategies, the quality (or cost) function, and the concept of differential games. Chapter 2 is devoted to the theoretical aspects of the approach-evasion game. The concept of stable bridges (solutions) is introduced, and the existence of such bridges is explored. Chapter 3 is concerned with the fixed-termination and fixed-duration problems of differential games.
Dynamic programming is discussed in Chapter 4, which covers the main equation of the theory of differential games. This equation is of Hamilton-Jacobi type and is frequently known as the Bellman-Isaacs equation of dynamic programming. Chapters 5 and 6 are devoted to a discussion and mathematical treatment of extremal aiming. The remainder of the book, Chapters 7 through 11, discusses further aspects of differential games.
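In standard modern notation (an illustrative sketch, not the book's own formulation), for the value function V(t, x) of a zero-sum game with dynamics dx/dt = f(t, x, u, v) and terminal payoff, this main equation reads:

```latex
% Bellman-Isaacs (Hamilton-Jacobi) equation for the value V(t,x),
% assuming Isaacs' condition so that min and max may be interchanged:
\frac{\partial V}{\partial t}
  \;+\; \min_{u \in P} \, \max_{v \in Q}
    \bigl\langle \nabla_x V(t,x),\; f(t, x, u, v) \bigr\rangle \;=\; 0,
\qquad V(T, x) = \gamma(x).
```

Since the value function need not be differentiable, the equation must in general be understood in a suitably generalized sense, which is part of what the book's theoretical machinery addresses.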
This book is the most complete treatment of differential games known to me. It is written at a rather high level of mathematical theory and will be of value only to the serious worker in control theory. Although the Russian version of the book appeared in 1974, the material is still current, as the theory of differential games has changed little in the past decade. The book is a must for the serious researcher in game theory.