Relative to its response to a storm pattern, the river basin may be considered a system: local watersheds respond to local rainfall patterns according to soil moisture, ground cover, average slope, etc., and contribute runoff to tributary streams. There the drainage channels combine the flows, and waves of discharge are propagated downstream along the main stem. Often the response has been modified by man: forests have been cleared, dams have been constructed across streams, and levees alter the flood wave propagation velocity. But whatever its current configuration, if left alone or subjected only to rigid operating rules for reservoir releases, it is in theory a deterministic system. With enough information and computing power, the response in terms of flood heights and consequent damages can be predicted for any of an infinite number of storm patterns.
This sounds like a large order, but it is just what engineers have been doing to the best of their ability for a long time. An important part of the design of a dam is the determination of the inflow expected from a "design storm" based on the best available statistics of rainfall and runoff. The downstream channel is then designed on the basis of the expected release rates from the dam.
However, in the design process only a few storm patterns can be used out of the infinity of possibilities. After the design process is complete and the physical works are constructed, the response to each storm pattern as it occurs is then subject to further calculations so that within the physical limitations the best operation is carried out. This is not a simple problem, even for a single reservoir. Consider the position of a decision maker in the midst of a moderate storm as he plans the release rates from a reservoir: his first responsibility is the safety of the dam, but within that restriction he must make releases so that long-term damage to the downstream area is minimized.
What information does he have on which to base his decisions? He knows the current inflow rate to his reservoir, and can project this for a day ahead with fair confidence; he knows how much of the reservoir volume remains vacant and available to take up the difference between inflows and releases; and if there are uncontrolled areas downstream between the dam and a potential damage area, he may know how much of a flood may be coming from such areas and regulate his own releases accordingly. The number of combinations and permutations of these few factors can be accommodated by a set of rules, charts, and nomograms in many cases. That is, there need be no ad hoc human judgment applied for this single reservoir problem.
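Such a rule set can be sketched, purely for illustration, as a short decision procedure. The thresholds, unit conversions, and names below are invented and stand in for the charts and nomograms an actual project would use; this is a minimal sketch of the idea, not any project's operating rule.

```python
# A minimal sketch of a fixed operating rule for a single flood-control
# reservoir.  Every threshold and name here is invented for illustration;
# an actual rule curve would come from the project's design studies.

def release_rate(inflow_cfs, forecast_inflow_cfs, vacant_storage_af,
                 uncontrolled_flow_cfs,
                 channel_capacity_cfs=30_000, safety_margin_af=50_000):
    """Choose a release (cfs) from current inflow, a one-day inflow forecast,
    vacant storage, and the uncontrolled flow entering below the dam."""
    # Dam safety first: if the forecast day's inflow would crowd the remaining
    # storage, pass the inflow through rather than risk overtopping.
    forecast_volume_af = forecast_inflow_cfs * 86_400 / 43_560  # cfs-day -> acre-feet
    if vacant_storage_af - forecast_volume_af < safety_margin_af:
        return max(inflow_cfs, forecast_inflow_cfs)
    # Otherwise release only what the damage reach can carry once the
    # uncontrolled local runoff is accounted for.
    room_cfs = max(channel_capacity_cfs - uncontrolled_flow_cfs, 0)
    return min(inflow_cfs, room_cfs)

print(release_rate(12_000, 15_000, 400_000, 8_000))  # -> 12000; the channel can take it all
```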
Were the effect of operating all reservoirs in a river basin strictly additive, the above ideas could be extended easily, perhaps with the help of linear programming. Unfortunately, no such simple methods are feasible; the interactions among elements of a multireservoir system cannot be neglected.
To illustrate: Assume that we can see, coming down the main stem of the river, a large flood wave that will arrive at our damage center in about two days. Meanwhile, a local tributary, equipped with a regulating reservoir, is threatening to produce damaging flows in about twelve hours unless we use up storage volume to prevent its doing so. By taking all the upstream information into account, we can arrive at a reasonable estimate of and probable lower limit to the damage from the main stem flood acting alone.
By appropriately manipulating the tributary releases, we can allow an initial heavy flow from a local area in order to save storage space, even though this will produce damages that would not otherwise have occurred so soon. Later this storage space will be used to allow the tributary to be completely shut down as the main flood goes by. Thus, by taking our damages ahead of time, we may be able to reduce the overall damage during the worst period.
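A made-up numerical example shows the trade. The flows, the damaging threshold of 100 units, and both release schedules below are hypothetical; they serve only to show that spending storage early can lower the peak at the damage center, here from 200 to 140.

```python
# Invented flows at the damage center, in arbitrary units; anything above 100
# is damaging.  The main-stem flood arrives late, the tributary flood early.
main_stem = [20, 40, 100, 140, 100, 40]
tributary = [80, 60, 30, 20, 10, 5]

schedules = {
    # Hold the tributary back at once; storage runs out as the main flood peaks.
    "hold early":  [30, 30, 40, 60, 35, 10],
    # Take damages ahead of time: pass the early flood, shut down for the peak.
    "pre-release": [80, 60, 30, 0, 0, 35],
}

for name, release in schedules.items():
    combined = [m + r for m, r in zip(main_stem, release)]
    peak = max(combined)
    periods_damaging = sum(q > 100 for q in combined)
    print(f"{name:12s} peak {peak:3d}, periods above 100: {periods_damaging}")
```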
In a large system there are likely to be many such opportunities during a flood emergency; however, each of them will be unique, and there is likely to be an astronomical number of possible situations ...
Linear and dynamic programming are capable of analyzing problems of astonishing complexity, so long as the independent variables enter in a linear way. However, when interactions and other non-linearities do enter, they compound the problem not in an additive way but in an exponential way. It appears that we could step up computer speed and memory capacity by a factor of a hundred and still not feel any more hopeful about the usefulness of these programming methods for the problem we are discussing.
An important factor in extending the number of variables is that each physical variable changes with time, and in the ordinary optimization schemes its value during each interval of time becomes a new mathematical variable. However, this new sequence is ordered, for later values cannot affect earlier ones; dynamic programming takes advantage of this fact. Simulation takes advantage of it too, in an implicit way, for it is a method of calculating at each instant the progress of phenomena that would take place in the prototype according to the parameters and inputs assumed.
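A toy dynamic program makes the time-ordering explicit: the least damage attainable from any period onward depends only on the storage then on hand, so the schedule can be built backward, stage by stage. Every number below (the inflows, the storage and release grids, the damage function) is hypothetical; this is a sketch of the principle, not a working reservoir model.

```python
# Minimal backward dynamic program for scheduling releases from one reservoir.
# Because later releases cannot affect earlier damages, the cost-to-go from any
# period depends only on the storage state at that period.

inflows = [40, 90, 120, 60, 20]          # hypothetical forecast inflow per period
storages = range(0, 201, 20)             # admissible storage states (coarse grid)
releases = range(0, 161, 20)             # admissible releases per period
CAPACITY = 200

def damage(release):
    return max(release - 80, 0) ** 2     # damage once the channel is overtaxed

# cost[s] = least total damage from this period onward, starting at storage s
cost = {s: 0 for s in storages}
for inflow in reversed(inflows):
    new_cost = {}
    for s in storages:
        best = float("inf")
        for r in releases:
            s_next = s + inflow - r
            if 0 <= s_next <= CAPACITY and s_next in cost:
                best = min(best, damage(r) + cost[s_next])
        new_cost[s] = best
    cost = new_cost

print("least attainable damage starting empty:", cost[0])
```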
Whatever the physical means of carrying out the simulation, there will be a branch point each time a decision must be made; and as in a chess game, the number of individual paths through all the branches quickly becomes astronomical. Thus it appears that introducing human judgment at critical points, within the simulation, will be very helpful; some would say essential.
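The arithmetic behind "astronomical" is simple enough; with invented but not unreasonable figures:

```python
# Order-of-magnitude arithmetic only; both figures are invented.
decision_points = 20   # times during the emergency a choice must be made
choices_each = 6       # plausible alternatives at each decision point
paths = choices_each ** decision_points
print(f"{paths:.3e} distinct paths through the decision tree")  # ~3.656e+15
```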
This leads us towards the concept of man-machine interaction in which the machine provides the high-speed computing or simulation under the currently prescribed conditions and decisions, and the human operator provides the decisions that steer successive simulations toward an optimum.
The analog simulators that the Hydraulics Laboratory at the University of California, Berkeley, has been building are designed for very fast operation and for easy communication to and from the human operator. The Kansas River analog model, for example, completes sixty simulations per second; consequently, the output display on a cathode ray oscilloscope is presented without any flicker apparent to the eye. This model, developed with the support of the Kansas City District of the Army Corps of Engineers and completed in 1964, represents the first application of analog simulation techniques to a river basin.
The basic idea of such simulators is that they obey the same equations, relative to the electrical currents and voltages involved, as govern the actual prototype variables. In this case, electrical current corresponds to discharge, and voltage corresponds to water levels above a local datum. The analog, which is a large network of special circuits, simulates all the pertinent properties of the flood control system: the ability to transmit waves having the correct celerity from one point to another, to attenuate these waves or to allow them to build up according to the channel properties, to store electrical charge in the way reservoirs store water, and to be subject to all the manipulations to which the flood control system is subject. This is quite an order, and has been achieved through a large number of ingenious circuit designs in which the physical features of the flood control system are simulated.
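The balance the analog enforces with capacitors and resistors can be written, for a single storage element, in a few lines of arithmetic. The sketch below is a digital stand-in, not the laboratory's circuitry: the coefficients and the inflow wave are hypothetical, and a linearized level-pool outlet stands in for the full set of channel properties.

```python
# A digital counterpart of one storage element in the analog.  Storage S obeys
# the continuity relation dS/dt = I - O, with outflow proportional to stage and
# S = area * stage; in the circuit, charge on a capacitor plays the part of S,
# current the part of discharge, and voltage the part of stage.

def route_level_pool(inflow, k=0.5, area=10.0, dt=1.0):
    """Explicit-step level-pool routing of an inflow hydrograph."""
    stage, outflows = 0.0, []
    for q_in in inflow:
        q_out = k * stage                    # linearized outlet: O = k * stage
        stage += (q_in - q_out) * dt / area  # continuity: area * d(stage)/dt = I - O
        outflows.append(q_out)
    return outflows

hydrograph = [0, 5, 20, 40, 30, 15, 5, 0, 0, 0]   # made-up inflow wave
print([round(q, 1) for q in route_level_pool(hydrograph)])
# The outflow wave is lower and later than the inflow wave: attenuation and lag.
```

Chaining many such elements, with the wave-propagating properties of channel reaches between them, gives the network behavior described above; the analog evaluates the whole network continuously rather than step by step.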
The operator then can see displayed before him in almost instantaneous fashion the results of whatever reservoir release schedule he has established. He can increase by an arbitrary amount the release from an upstream reservoir during a particular six-hour period, note that the resulting flood wave will overtop a levee fifty miles downstream, and reduce the release back to its original amount, all in less time than it takes to describe his actions. Because the machine is so responsive, he can try out all kinds of schemes, and can just as quickly reject the unworkable ones.
In looking back over our experience with analog simulators at Berkeley, it seems to me that the most important conclusion has been that the human mind, with its capacity to integrate and hold together a vast amount of information and to formulate decisions on that basis, must be brought into interaction with computing machinery, of whatever sort, if we are to optimize the operation of large systems successfully. At one time we were dazzled by the seemingly inexhaustible capacity of the computer for enormously extended analysis. Now we realize that as the degree of system complexity goes up, the permutations rise exponentially, and some commonsense limitations must be placed on the alternative paths toward an optimal solution. So far as I can see, there will never be a substitute for this function of the human mind; therefore we must search for ways to accommodate this important part of the man-machine team as a real equal, and not as a second-class partner off at a desk somewhere. In this search the analog simulators have made a significant contribution.
Adapted from "Analog Models for Stream Hydrology," by J. A. Harder.