Luigi Piroddi

Nonlinear System Identification

Linear system identification has well-established foundations and is widely used in practical engineering problems. However, it may yield unsatisfactory results whenever the nonlinear nature of the system under consideration is significant. The goal of this course is to provide methods and tools for the identification of nonlinear systems, both static and dynamic. Various nonlinear model representations are presented, together with the corresponding identification techniques. The course has both a theoretical and a practical flavour, with examples and applications of the illustrated techniques in various fields.

List of lecturers: Luigi Piroddi, Simone Garatti, Simone Formentin, Lorenzo Fagiano, Giulio Panzani.


Course program

  1. Introduction and motivation. Problem definition. Main terminology and notation in learning problems: static vs. dynamic, linear vs. nonlinear, parametric vs. non-parametric, interpolation vs. fitting, basis functions, kernels. The dynamic system identification problem. Background on linear system identification. An example where linear identification fails. Data-driven nonlinearity tests. A taxonomy of model identification. Prediction error versus simulation error. The problem of overfitting and regularization methods.
  2. Nonlinear model classes and identification methods. Functional expansion series (Volterra series), block structured systems (Wiener-Hammerstein models), input-output recursive models (polynomial and neural network-based NARX/NARMAX). Identification of polynomial NARX models: OLS and FROE (a minimal illustrative sketch follows this list). Model structure selection: batch and recursive methods. Generalized Frequency Response Functions. Neural networks for nonlinear identification: multilayer perceptrons, RBF networks.
  3. Sequential MC methods for nonlinear identification. Problem formulation: white-box, state-space, nonlinear identification. Maximum likelihood vs. Bayesian identification. Marginalization vs. data augmentation. A brief review of nonlinear filtering and particle filtering. Metropolis-Hastings algorithm. EM algorithm. The Gibbs sampler. Computational aspects and numerical results.
  4. Nonlinear Set Membership identification. Motivation: models and model sets, application examples. Problem formulation and assumptions, feasible function sets, data (in)validation results. Optimal approximation, interpolatory approximations, properties, and computational aspects. Application to fast NMPC. Application to nonlinear filtering.
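
As a concrete complement to item 2 (and to the prediction/simulation error distinction in item 1), the following minimal Python sketch identifies a polynomial NARX model by plain least squares and compares one-step-ahead prediction with free-run simulation. The toy data-generating system, the chosen lags and polynomial degree, and the use of plain unregularized least squares rather than the OLS and FROE algorithms covered in the course are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear data-generating system (illustrative assumption)
N = 600
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N)
for k in range(2, N):
    y[k] = (0.5 * y[k-1] - 0.3 * y[k-2] + 0.8 * u[k-1]
            + 0.4 * u[k-1] * y[k-1] + 0.05 * rng.standard_normal())

# Polynomial NARX regressor vector: constant, lags 1..2 of y and u, all degree-2 products
def narx_regressors(y, u, k):
    base = [y[k-1], y[k-2], u[k-1], u[k-2]]
    quad = [a * b for i, a in enumerate(base) for b in base[i:]]
    return np.array([1.0] + base + quad)

K = np.arange(2, N)
Phi = np.vstack([narx_regressors(y, u, k) for k in K])
theta, *_ = np.linalg.lstsq(Phi, y[K], rcond=None)   # plain (unregularized) least squares

# One-step-ahead prediction uses the measured past outputs...
y_pred = Phi @ theta
# ...while free-run simulation feeds the model its own past outputs
y_sim = np.zeros(N)
y_sim[:2] = y[:2]
for k in range(2, N):
    y_sim[k] = narx_regressors(y_sim, u, k) @ theta

print("one-step prediction RMSE:", np.sqrt(np.mean((y[K] - y_pred) ** 2)))
print("free-run simulation RMSE:", np.sqrt(np.mean((y[2:] - y_sim[2:]) ** 2)))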

The lecture notes of the course will be made available. In addition, references to relevant papers and books will be provided for each topic.

Prospective students should preferably have followed basic courses in Linear Systems Identification and Probability Theory and Statistics.

A (provisional) list of suggested projects is available at this link. Contact me (or any other lecturer) for project assignments and further details. Personal project proposals are most welcome.


Course timetable

Day 1 (Monday, 28/01/19): Introduction and motivation
Lecturer: Simone Formentin

11:00-12:30 - Problem definition. Main terminology and notation in learning problems: static vs. dynamic, linear vs. nonlinear, parametric vs. non-parametric.
14:00-15:30 - Function approximation: interpolation vs. fitting, basis functions, kernels.
16:00-17:30 - The dynamic system identification problem. Background on linear system identification. An example where linear identification fails. Data-driven nonlinearity tests.

Day 2 (Tuesday, 29/01/19): Nonlinear model classes and NARX identification
Lecturer: Luigi Piroddi

09:00-10:30 - Nonlinear model classes: Functional expansion series (Volterra series), block structured systems (Wiener-Hammerstein models), LPV models, input-output recursive models (NARX/NARMAX). Generalized Frequency Response Functions.
11:00-12:30 - Identification of polynomial NARX models: OLS and FROE. Model structure selection.
14:00-15:30 - Simulation error versus prediction error. Regularization methods (LASSO).
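
As an illustration of LASSO-based regularization for model structure selection, the following sketch builds a redundant dictionary of candidate NARX terms and lets the l1 penalty shrink the coefficients of spurious terms towards zero. The toy system, the candidate term set, the penalty weight, and the use of scikit-learn's Lasso are illustrative assumptions, not the formulation used in the lectures.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)

# Toy NARX data in which only a few candidate terms are truly active (illustrative assumption)
N = 500
u = rng.uniform(-1.0, 1.0, N)
y = np.zeros(N)
for k in range(2, N):
    y[k] = 0.6 * y[k-1] + 0.7 * u[k-1] - 0.4 * u[k-1]**2 + 0.02 * rng.standard_normal()

# Redundant dictionary of candidate terms: lags 1..2 of y and u and all their degree-2 products
def terms(y, u, k):
    base = {"y(k-1)": y[k-1], "y(k-2)": y[k-2], "u(k-1)": u[k-1], "u(k-2)": u[k-2]}
    prods = {f"{a}*{b}": base[a] * base[b]
             for i, a in enumerate(base) for b in list(base)[i:]}
    return {**base, **prods}

K = np.arange(2, N)
names = list(terms(y, u, 2).keys())
X = np.array([[terms(y, u, k)[n] for n in names] for k in K])

# The l1 penalty drives the coefficients of spurious terms to (near) zero
lasso = Lasso(alpha=1e-3, fit_intercept=True, max_iter=50000).fit(X, y[K])
for name, coef in zip(names, lasso.coef_):
    if abs(coef) > 1e-2:
        print(f"selected term {name}: coefficient {coef:.3f}")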

Day 3 (Wednesday, 30/01/19): Block structured systems and neural networks
Lecturer: Giulio Panzani

09:00-10:30 - Identification methods for block structured systems.
11:00-12:30 - Neural networks for nonlinear identification (see the sketch below).
14:00-15:30 - Laboratory session.
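
Relating to the neural network slot above, here is a minimal sketch of an RBF network used as a static function approximator, with fixed Gaussian centres and output weights computed by linear least squares. The target function, the number of centres, and the kernel width are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)

# Static nonlinear function to approximate (illustrative assumption)
x = np.linspace(-3, 3, 200)
y = np.sin(2 * x) * np.exp(-0.2 * x**2) + 0.05 * rng.standard_normal(x.size)

# RBF network: fixed Gaussian centres on a grid, output weights by linear least squares
centres = np.linspace(-3, 3, 15)
width = 0.5                                        # common kernel width (assumed)
Phi = np.exp(-(x[:, None] - centres[None, :])**2 / (2 * width**2))
Phi = np.hstack([Phi, np.ones((x.size, 1))])       # bias term
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = Phi @ w
print("RMS approximation error:", np.sqrt(np.mean((y - y_hat)**2)))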

Day 4 (Thursday, 31/01/19): Sequential MC methods for nonlinear identification
Lecturer: Simone Garatti

09:00-10:30 - Problem formulation: white-box, state-space, nonlinear identification. Maximum likelihood vs. Bayesian identification. Marginalization vs. data augmentation.
11:00-12:30 - A brief review of nonlinear filtering and particle filtering (see the sketch below). Metropolis-Hastings algorithm. EM algorithm. The Gibbs sampler.
14:00-15:30 - Computational aspects and numerical results.
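
The following is a minimal bootstrap particle filter on a toy nonlinear state-space model, of the kind used as a building block in the sequential Monte Carlo identification methods above. The model, the noise variances, the number of particles, and the multinomial resampling scheme are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)

# Toy nonlinear state-space model (illustrative assumption)
def f(x, k):                      # state transition
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * k)

def h(x):                         # measurement function
    return x**2 / 20

T, q, r = 100, 1.0, 1.0           # horizon, process and measurement noise variances
x_true = np.zeros(T)
y_obs = np.zeros(T)
for k in range(1, T):
    x_true[k] = f(x_true[k-1], k) + np.sqrt(q) * rng.standard_normal()
    y_obs[k] = h(x_true[k]) + np.sqrt(r) * rng.standard_normal()

# Bootstrap particle filter (proposal = state transition prior, multinomial resampling)
Np = 1000
particles = rng.standard_normal(Np)
x_est = np.zeros(T)
for k in range(1, T):
    particles = f(particles, k) + np.sqrt(q) * rng.standard_normal(Np)   # propagate
    w = np.exp(-0.5 * (y_obs[k] - h(particles))**2 / r) + 1e-300         # weight by likelihood
    w /= w.sum()
    x_est[k] = np.sum(w * particles)                                     # filtered mean
    particles = particles[rng.choice(Np, size=Np, p=w)]                  # resample

print("state estimation RMSE:", np.sqrt(np.mean((x_true[1:] - x_est[1:])**2)))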

Day 5 (Friday, 01/02/19): Nonlinear Set Membership identification
Lecturer: Lorenzo Fagiano

09:00-10:30 - Motivation: models and model sets, application examples. Problem formulation and assumptions, feasible function sets, data (in)validation results.
11:00-12:30 - Optimal approximation, interpolatory approximations, properties, and computational aspects (see the sketch below). Application to fast NMPC. Application to nonlinear filtering.
14:00-15:30 - Exercise session on a given dataset.
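
A minimal sketch of nonlinear Set Membership estimation of a static function under a bounded-noise and Lipschitz-continuity assumption: each data point yields an upper and a lower bound on the unknown function, and the central estimate lies halfway between the tightest bounds. The target function, the noise bound, and the assumed Lipschitz constant are illustrative choices.

import numpy as np

rng = np.random.default_rng(4)

# Noisy samples of an unknown static nonlinearity (illustrative assumption)
eps = 0.05                                   # assumed bound on the measurement noise
x_data = np.sort(rng.uniform(-2, 2, 40))
f_true = lambda x: np.tanh(2 * x)
y_data = f_true(x_data) + rng.uniform(-eps, eps, x_data.size)

gamma = 2.0                                  # assumed Lipschitz constant of the unknown function

# Set Membership bounds: every data point constrains the function within a cone
def bounds(x):
    d = gamma * np.abs(x - x_data)
    f_up = np.min(y_data + eps + d)          # tightest upper bound implied by the data
    f_low = np.max(y_data - eps - d)         # tightest lower bound implied by the data
    return f_low, f_up

xs = np.linspace(-2, 2, 200)
lo, up = np.array([bounds(x) for x in xs]).T
f_central = 0.5 * (lo + up)                  # central (optimal) estimate

print("max half-width of the uncertainty interval:", 0.5 * np.max(up - lo))
print("max |f_central - f_true| on the grid:", np.max(np.abs(f_central - f_true(xs))))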

All lectures will take place in the Sala Seminari of the Dip. di Elettronica, Informazione e Bioingegneria of the Politecnico di Milano.


Lecture notes and course material

Contact me by e-mail to obtain lecture notes that are no longer available.