Abstract
This chapter is devoted to basic control problem statements for stochastic systems, to the derivation of the Bellman equation, to a discussion of linear-quadratic problems under various conditions of a priori knowledge, to the investigation of stabilization problems, and to a discussion of various approximation techniques for determining optimal control.
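For orientation, the Bellman equation mentioned above can be stated in the standard controlled-diffusion setting. Assuming dynamics $dx = f(x,u)\,dt + \sigma(x,u)\,dw$, running cost $L(x,u)$, and terminal cost $\Phi(x)$ (a conventional formulation, not quoted from the chapter itself), the value function $V(t,x)$ satisfies the Hamilton-Jacobi-Bellman equation:

```latex
% Stochastic Bellman (HJB) equation for the value function V(t, x),
% assuming dynamics dx = f(x,u) dt + sigma(x,u) dw on [0, T]:
-\frac{\partial V}{\partial t}
  = \min_{u}\Big[\, L(x,u)
    + f(x,u)^{\top}\nabla_{x} V
    + \tfrac{1}{2}\,\operatorname{tr}\!\big(\sigma(x,u)\,\sigma(x,u)^{\top}\,\nabla_{x}^{2} V\big) \Big],
\qquad V(T,x) = \Phi(x).
```

The second-order trace term, absent in the deterministic case, reflects the diffusion part of the dynamics; linear-quadratic problems are the special case in which $f$ is linear and $L$, $\Phi$ are quadratic, so that $V$ is quadratic in $x$.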
Copyright information
© 1996 Springer Science+Business Media Dordrecht
Cite this chapter
Afanas’ev, V.N., Kolmanovskii, V.B., Nosov, V.R. (1996). Control of Stochastic Systems. Problem Statements and Investigation Techniques. In: Mathematical Theory of Control Systems Design. Mathematics and Its Applications, vol 341. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-2203-2_9
Publisher Name: Springer, Dordrecht
Print ISBN: 978-90-481-4615-4
Online ISBN: 978-94-017-2203-2