Let f be a random polynomial of degree n; that is, its n+1 coefficients are random variables, not necessarily independent. Let B = ∪ Bi be a union of nonoverlapping intervals in ℝ. The author considers the following question: what is the probability that such a random polynomial has at least ni roots in each Bi, or exactly ni roots in each Bi? The author claims to reduce the computation of these probabilities to integrating the probability density function of the coefficients over certain polyhedra in coefficient space. The reduction proceeds via the Fourier-Budan theorem: since the value of each derivative of f at a fixed point is linear in the coefficients, every sign sequence of derivative values consistent with the Fourier-Budan count of sign changes translates into a system of linear inequalities that the coefficients must satisfy.
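To make the linearity observation concrete, here is a minimal sketch (the function name and setup are mine, not the paper's): for f(x) = Σ cj x^j and a fixed point a, the derivative value f^(k)(a) is a linear functional of the coefficient vector (c0, ..., cn), so prescribing its sign is a single half-space constraint in coefficient space.

```python
from math import factorial

def derivative_row(n, k, a):
    """Weights w with f^(k)(a) = sum_j w[j] * c_j, for f(x) = sum_j c_j x^j."""
    row = [0.0] * (n + 1)
    for j in range(k, n + 1):
        # d^k/dx^k of x^j is j!/(j-k)! * x^(j-k)
        row[j] = factorial(j) // factorial(j - k) * a ** (j - k)
    return row

# For n = 2 and a = 1: f'(1) = 0*c0 + 1*c1 + 2*c2,
# so the condition f'(1) > 0 is the half-space c1 + 2*c2 > 0.
print(derivative_row(2, 1, 1.0))  # [0.0, 1.0, 2.0]
```

Intersecting such half-spaces over all derivative orders and interval endpoints yields the polyhedra over which the density is to be integrated.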
At this point, I get lost. The Fourier-Budan theorem can never guarantee that a polynomial has “at least” a certain number of zeros, merely “at most”: the difference in sign-change counts may exceed the actual number of roots by any even number. The paper therefore seems to break down here. No experimental results are given, and despite the abstract saying “realization and compatibility of the algorithm discussed,” the paper concludes with: “Developing the software routines ... are the main fields of planned future research.”
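A minimal numerical illustration of the objection (my own example, not from the paper): the Fourier-Budan count for f on (a, b) is V(a) − V(b), where V(x) is the number of sign changes in the sequence f(x), f'(x), ..., f^(n)(x). Already for f(x) = x² + 1 on (−1, 1) the count is 2 while f has no real roots at all, so the count is only an upper bound.

```python
def derivatives(coeffs):
    """All derivatives of a polynomial, coefficients listed highest degree first."""
    seq = [list(coeffs)]
    while len(seq[-1]) > 1:
        prev = seq[-1]
        n = len(prev) - 1
        seq.append([c * (n - i) for i, c in enumerate(prev[:-1])])
    return seq

def evaluate(coeffs, x):
    val = 0.0
    for c in coeffs:  # Horner's scheme
        val = val * x + c
    return val

def sign_changes(values):
    signs = [v for v in values if v != 0]
    return sum(1 for s, t in zip(signs, signs[1:]) if s * t < 0)

def budan_fourier_bound(coeffs, a, b):
    """Upper bound V(a) - V(b) on the number of roots of f in (a, b)."""
    derivs = derivatives(coeffs)
    va = sign_changes([evaluate(d, a) for d in derivs])
    vb = sign_changes([evaluate(d, b) for d in derivs])
    return va - vb

# f(x) = x^2 + 1 has NO real roots, yet the Fourier-Budan bound on (-1, 1) is 2:
print(budan_fourier_bound([1.0, 0.0, 1.0], -1.0, 1.0))  # 2
```

Since the actual root count can fall short of the bound by any even number, no inequality derived this way can certify “at least ni roots.”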
This paper would benefit from more thorough editing, both of the English and of the clarity of expression. For example, the object defined at the top of page 2 is not a cube but a cuboid.