Dynamical Systems (AM3.2)
LECTURER: R.M. Adams
PREREQUISITES: Linear algebra and advanced calculus. Further exposure to differential equations and/or mechanics would be advantageous.
This course is an elementary introduction to modern linear control from an application-oriented mathematical perspective. Control theory deals with dynamical systems that can be controlled; to control a dynamical object means to influence its behaviour in order to achieve a desired goal. Typically, the dynamics of a (deterministic, finite-dimensional, differentiable) control system are described by a set of ordinary differential equations (the so-called state equations) with the controls as parameters. Control systems with outputs constitute natural extensions. Linear control systems (with outputs) form an important class of control systems.
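In the linear case, the state equations described above take the form x' = Ax + Bu with output y = Cx, where u is the control. A minimal sketch of simulating such a system, assuming an illustrative double-integrator choice of A, B, C and a plain Euler integration scheme (none of which are prescribed by the course notes):

```python
import numpy as np

# Illustrative linear control system: x' = A x + B u, output y = C x.
# A, B, C here model a double integrator (position/velocity with force input).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
C = np.array([[1.0, 0.0]])

def simulate(x0, u, dt=0.01, steps=1000):
    """Euler-integrate the state equations under a constant control u."""
    x = np.array(x0, dtype=float).reshape(-1, 1)
    for _ in range(steps):
        x = x + dt * (A @ x + B * u)
    return (C @ x).item()  # scalar output y = C x

# Constant unit control from rest over 10 s: position ~ t^2 / 2 = 50.
y = simulate([0.0, 0.0], u=1.0, dt=0.01, steps=1000)
```

The Euler step is the simplest choice for illustration; any standard ODE integrator would serve equally well here.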
SYLLABUS: Mathematical formulation of the control problem; linear dynamical systems; linear control systems: controllability, observability, linear feedback, and realization; stability and control; optimal control: Pontryagin's principle, linear regulators with quadratic costs.
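As a taste of the controllability topic listed above: the Kalman rank condition states that a pair (A, B) with n states is controllable exactly when the matrix [B, AB, ..., A^(n-1)B] has full rank n. A sketch in Python, with the example matrices chosen purely for illustration:

```python
import numpy as np

def controllability_matrix(A, B):
    """Stack [B, AB, A^2 B, ..., A^(n-1) B] column-wise."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def is_controllable(A, B):
    """Kalman rank condition: controllable iff the matrix has full rank n."""
    n = A.shape[0]
    return bool(np.linalg.matrix_rank(controllability_matrix(A, B)) == n)

# Double integrator with a force input: controllable.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Same dynamics, but the input acts only on the position state; the
# velocity can never be influenced, so the pair is NOT controllable.
B_bad = np.array([[1.0],
                  [0.0]])
```

The analogous rank test on (A^T, C^T) gives observability, the dual notion also listed in the syllabus.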
PRESCRIBED TEXTS: None prescribed. Complete course notes will be provided.
TUTORIALS: ONE tutorial per week.
ASSESSMENT: TWO class tests and ONE final examination.