D-CIS Publication Database

Publication

Type of publication: Incollection
Entered by: MN
Title Optimal control theory and the linear Bellman equation
Bibtex cite ID
Booktitle Inference and Learning in Dynamic Models
Year published 2009
Publisher Cambridge University Press
Keywords Optimal control theory
Abstract
Optimal control theory is a mathematical description of how to act optimally to gain future rewards. In this paper I give an introduction to deterministic and stochastic control theory, partial observability, learning, and the combined problem of inference and control. Subsequently, I discuss a new class of non-linear stochastic control problems for which the Bellman equation becomes linear in the control and which can be efficiently solved using a path integral. In this control formalism the central concept of cost-to-go becomes a free energy, and methods and concepts from probabilistic graphical models and statistical physics can be readily applied. I illustrate the theory with a number of examples.
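The linearity mentioned in the abstract can be sketched concretely. The following is my own minimal illustration (not code from the chapter), assuming the discrete, linearly solvable setting: with desirability z(x) = exp(-J(x)), state costs q(x), and passive dynamics P, the finite-horizon Bellman recursion becomes the linear update z_t = diag(exp(-q)) P z_{t+1}, and the optimal controlled dynamics reweight the passive ones by z.

```python
import numpy as np

# Hypothetical toy problem: a 5-state chain with reflecting ends,
# cheapest cost in the middle state. All names here are illustrative.
n, T = 5, 20                               # states, horizon
q = np.array([1.0, 0.5, 0.0, 0.5, 1.0])   # state costs per step

# Passive (uncontrolled) dynamics: symmetric random walk on the chain.
P = np.zeros((n, n))
for i in range(n):
    P[i, max(i - 1, 0)] += 0.5
    P[i, min(i + 1, n - 1)] += 0.5

# Linear backward recursion for the desirability z = exp(-J):
# z_t = exp(-q) * (P @ z_{t+1}), with zero terminal cost (z_T = 1).
z = np.ones(n)
for _ in range(T):
    z = np.exp(-q) * (P @ z)

J = -np.log(z)                             # optimal cost-to-go

# Optimal controlled transition probabilities: p*(y|x) ∝ p(y|x) z(y).
P_star = P * z[None, :]
P_star /= P_star.sum(axis=1, keepdims=True)
```

The point of the sketch is that no minimization over controls appears anywhere: the Bellman update is a matrix-vector product, and the optimal policy falls out as a normalized reweighting of the passive dynamics by the desirability.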
Authors
Kappen, Hilbert J.
Editors
Chiappa, Silvia
Cemgil, Ali Taylan
Barber, David
Topics