On Adaptive Control Processes. Bellman, R. & Kalaba, R. IRE Transactions on Automatic Control, 4(2):1–9, November 1959.
One of the most challenging areas in the field of automatic control is the design of automatic control devices that 'learn' to improve their performance based upon experience, i.e., that can adapt themselves to circumstances as they find them. The military and commercial implications of such devices are impressive, and interest in the two main areas of research in the field of control, the USA and the USSR, runs high. Unfortunately, though, both theory and construction of adaptive controllers are in their infancy, and some time may pass before they are commonplace. Nonetheless, development at this time of adequate theories of processes of this nature is essential. The purpose of our paper is to show how the functional equation technique of a new mathematical discipline, dynamic programming, can be used in the formulation and solution of a variety of optimization problems concerning the design of adaptive devices. Although, occasionally, a solution in closed form can be obtained, in general, numerical solution via the use of high-speed digital computers is contemplated. We discuss here the closely allied problems of formulating adaptive control processes in precise mathematical terms and of presenting feasible computational algorithms for determining numerical solutions. To illustrate the general concepts, consider a system which is governed by the inhomogeneous Van der Pol equation
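For orientation, a minimal sketch of the setting the abstract alludes to, assuming the standard forced (inhomogeneous) Van der Pol form and generic dynamic-programming notation; the paper's exact symbols, forcing term, and cost criterion may differ:

    \ddot{u} - \mu\,(1 - u^2)\,\dot{u} + u = g(t)

The functional-equation technique then characterizes a minimal-cost function f over the system state x by

    f(x) = \min_{v} \bigl[ c(x, v) + f\bigl(T(x, v)\bigr) \bigr]

where v ranges over admissible control choices, c(x, v) is the cost incurred at the current stage, and T(x, v) is the resulting state; in the adaptive case the state is augmented with the controller's current information about the unknown system parameters, which is updated from observed behavior.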
@article{bellmanAdaptiveControlProcesses1959,
  title = {On Adaptive Control Processes},
  author = {Bellman, Richard and Kalaba, Robert},
  date = {1959-11},
  journaltitle = {IRE Transactions on Automatic Control},
  volume = {4},
  pages = {1--9},
  issn = {0096-199X},
  doi = {10.1109/tac.1959.1104847},
  url = {https://doi.org/10.1109/tac.1959.1104847},
  abstract = {One of the most challenging areas in the field of automatic control is the design of automatic control devices that 'learn' to improve their performance based upon experience, i.e., that can adapt themselves to circumstances as they find them. The military and commercial implications of such devices are impressive, and interest in the two main areas of research in the field of control, the USA and the USSR, runs high. Unfortunately, though, both theory and construction of adaptive controllers are in their infancy, and some time may pass before they are commonplace. Nonetheless, development at this time of adequate theories of processes of this nature is essential. The purpose of our paper is to show how the functional equation technique of a new mathematical discipline, dynamic programming, can be used in the formulation and solution of a variety of optimization problems concerning the design of adaptive devices. Although, occasionally, a solution in closed form can be obtained, in general, numerical solution via the use of high-speed digital computers is contemplated. We discuss here the closely allied problems of formulating adaptive control processes in precise mathematical terms and of presenting feasible computational algorithms for determining numerical solutions. To illustrate the general concepts, consider a system which is governed by the inhomogeneous Van der Pol equation},
  keywords = {adaptation,adaptive-control,control-problem,dynamic-programming},
  number = {2}
}
