6.231 Dynamic Programming and Stochastic Control

As taught in: Fall 2008

Level: Graduate

Instructors: Prof. Dimitri Bertsekas

Figure: Label correcting methods for shortest paths, in which nodes are inserted into or removed from a list of active nodes. See lecture 4 for more information. (Figure by MIT OpenCourseWare, adapted from course notes by Prof. Dimitri Bertsekas.)
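As an illustration of the active-node bookkeeping described in the figure caption, here is a minimal sketch of a generic label correcting shortest-path method (a Bellman-Ford-style scheme driven by a FIFO queue of active nodes). The function name, the adjacency-list graph representation, and the choice of queue discipline are illustrative assumptions, not the course's own code; the method assumes the graph contains no negative-length cycles.

```python
from collections import deque

def label_correcting(graph, source):
    """Illustrative label correcting method (assumed interface, not course code).

    graph: dict mapping node -> list of (neighbor, arc_length) pairs.
    Returns a dict of shortest-path lengths from source.
    """
    dist = {source: 0.0}
    active = deque([source])      # the list of active (candidate) nodes
    in_active = {source}

    while active:
        i = active.popleft()      # remove a node from the active list
        in_active.discard(i)
        for j, length in graph.get(i, []):
            # If the label of j can be improved through i, update it
            # and (re)insert j into the active list.
            if dist[i] + length < dist.get(j, float("inf")):
                dist[j] = dist[i] + length
                if j not in in_active:
                    active.append(j)
                    in_active.add(j)
    return dist

# Example: shortest paths from "a" in a small graph.
g = {"a": [("b", 1.0), ("c", 4.0)], "b": [("c", 2.0)]}
print(label_correcting(g, "a"))   # {'a': 0.0, 'b': 1.0, 'c': 3.0}
```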

Course Highlights

This course features a complete set of lecture notes, as well as assignments and exams with solutions.

Course Description

This course covers the basic models and solution techniques for problems of sequential decision making under uncertainty (stochastic control). We will consider optimal control of a dynamical system over both a finite and an infinite number of stages (finite and infinite horizon). We will also discuss some approximation methods for problems involving large state spaces. Applications of dynamic programming in a variety of fields will be covered in recitations.
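To make the finite-horizon case concrete, the following is a minimal sketch of the backward dynamic programming recursion J_N(x) = g_N(x), J_k(x) = min_u E_w[ g(x, u, w) + J_{k+1}(f(x, u, w)) ] for finite state, control, and disturbance spaces. The names (states, controls, f, g, gN, w_dist) are placeholder assumptions for illustration, not notation or code supplied by the course.

```python
def finite_horizon_dp(states, controls, f, g, gN, w_dist, N):
    """Backward DP recursion over N stages (illustrative sketch).

    f(x, u, w): next state (assumed to stay within `states`)
    g(x, u, w): stage cost        gN(x): terminal cost
    w_dist: list of (w, probability) pairs for the disturbance
    Returns the cost-to-go table J_0 and a per-stage policy list.
    """
    J = {x: gN(x) for x in states}          # terminal cost-to-go J_N
    policy = []
    for k in range(N - 1, -1, -1):          # proceed backward in time
        J_new, mu = {}, {}
        for x in states:
            best_u, best_val = None, float("inf")
            for u in controls:
                # Expected stage cost plus cost-to-go, averaged over w.
                val = sum(p * (g(x, u, w) + J[f(x, u, w)]) for w, p in w_dist)
                if val < best_val:
                    best_u, best_val = u, val
            J_new[x], mu[x] = best_val, best_u
        J = J_new
        policy = [mu] + policy              # policy[k] is the stage-k policy
    return J, policy
```

The infinite-horizon methods and the approximation techniques for large state spaces mentioned in the description build on this same recursion.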