Consider a set of equations (not necessarily linear) with several unknowns, which is not over-determined. We could, of course, solve the whole set; however, it might be possible to solve a subset, plug the values thus obtained into the rest, and possibly find another solvable subset. In any case, we would have reduced the problem to solving smaller systems. In the case of linear equations, we are asking whether the system can be permuted so as to be block upper triangular; in the case of nonlinear equations, we are asking the same question of the incidence matrix.
If the equations are also not under-determined (so that there are as many equations as variables), then there is an efficient (certainly polynomial-time) algorithm for computing the best such decomposition. The authors cite Dulmage and Mendelsohn [1], the chronologically earlier reference, whereas a matrix theorist might be more familiar with Duff and Reid [2].
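For the square case, the classical construction proceeds by matching equations to variables and then taking strongly connected components of the induced dependency digraph. The following is a rough sketch of that construction (not the paper's own algorithm); all names are the reviewer's.

```python
# Sketch of the coarse Dulmage-Mendelsohn-style block decomposition of a
# square system: match each equation to a variable, then the strongly
# connected components of the dependency digraph are the diagonal blocks.
from collections import deque

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components, maximum_bipartite_matching


def block_decomposition(A):
    """Diagonal blocks (lists of equation indices) of a square incidence
    matrix A, listed in an order in which they can be solved."""
    A = csr_matrix(A)
    n = A.shape[0]
    # Perfect matching: match[i] is the variable assigned to equation i.
    match = maximum_bipartite_matching(A, perm_type='column')
    if (match < 0).any():
        raise ValueError("structurally singular system")
    var_to_eq = np.empty(n, dtype=int)
    var_to_eq[match] = np.arange(n)
    # Dependency digraph on equations: edge i -> k when equation i involves
    # the variable matched to equation k (k's block must be solved first).
    rows, cols = A.nonzero()
    e_src, e_dst = [], []
    for i, j in zip(rows, cols):
        k = var_to_eq[j]
        if k != i:
            e_src.append(i)
            e_dst.append(k)
    G = csr_matrix((np.ones(len(e_src)), (e_src, e_dst)), shape=(n, n))
    n_comp, labels = connected_components(G, directed=True, connection='strong')
    # Topologically sort the condensation so that dependencies come first.
    succ = [set() for _ in range(n_comp)]
    indeg = [0] * n_comp
    for i, k in zip(e_src, e_dst):
        a, b = labels[k], labels[i]  # block a must precede block b
        if a != b and b not in succ[a]:
            succ[a].add(b)
            indeg[b] += 1
    queue = deque(c for c in range(n_comp) if indeg[c] == 0)
    order = []
    while queue:
        c = queue.popleft()
        order.append(c)
        for d in succ[c]:
            indeg[d] -= 1
            if indeg[d] == 0:
                queue.append(d)
    return [sorted(np.flatnonzero(labels == c).tolist()) for c in order]


# Equations f0(x0), f1(x0, x1), f2(x1, x2, x3), f3(x2, x3): solve x0,
# then x1, then the coupled pair {x2, x3} as one 2-by-2 block.
A = np.array([[1, 0, 0, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]])
print(block_decomposition(A))  # [[0], [1], [2, 3]]
```

The matching fixes which variable each equation is "responsible" for; the irreducible blocks themselves do not depend on which perfect matching is chosen.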
If there are more variables than equations, we can fix some of the variables (or regard them as parameters) and then ask the same question about the resulting, now determined, system. Unfortunately, the answer depends on which variables we choose to fix: with equations f1(x1, x2, x3) and f2(x1, x2), fixing x1 leaves two blocks of size one, whereas fixing x3 leaves a single coupled 2-by-2 block. How hard is it to decide which variables to fix?
Alas, much harder. The authors show that the parameterized problem of deciding whether a system with n variables has a block decomposition with maximum block size k is W[1]-hard; that is, it admits no algorithm running in time f(k)·n^{O(1)} unless FPT = W[1].