Stochastic Control of Partially Observable Systems
The problem of stochastic control of partially observable systems plays an important role in many applications. All real problems are in fact of this type; deterministic control, as well as stochastic control with full observation, can only approximate the real world. This justifies the importance of having a theory as complete as possible, one that can be used for numerical implementation. This book first presents those problems that fall within the linear theory and may be treated algebraically. Later chapters discuss nonlinear filtering theory, in which the statistics are infinite-dimensional; approximation and perturbation methods are therefore developed.
Product details
November 2004
Paperback
9780521611978
364 pages
247 × 190 × 20 mm
0.643 kg
Available
Table of Contents
- Preface
- 1. Linear filtering theory
- 2. Optimal stochastic control for linear dynamic systems with quadratic payoff
- 3. Optimal control of linear stochastic systems with an exponential-of-integral performance index
- 4. Nonlinear filtering theory
- 5. Perturbation methods in nonlinear filtering
- 6. Some explicit solutions of the Zakai equation
- 7. Some explicit controls for systems with partial observation
- 8. Stochastic maximum principle and dynamic programming for systems with partial observation
- 9. Existence results for stochastic control problems with partial information
- References
- Index