EE-662: Parameter & State Estimation


Instructor

Dr. Abubakr Muhammad, Assistant Professor of Electrical Engineering

Email: abubakr [at] lums.edu.pk

Office: Room 9-311A, 3rd Floor, SSE Bldg

Course Details

Year: 2012-13

Semester: Spring

Category: Graduate

Credits: 3

Elective course for electrical engineering majors

Course Website: http://cyphynets.lums.edu.pk/index.php/EE-662

Course Description

In this course we develop a hands-on yet rigorous approach to tackling uncertainty in the dynamical evolution of an engineering system. We learn about the main sources of uncertainty and how to model them statistically. We learn that installing sensors on an uncertain system can help reduce this uncertainty; however, sensors themselves introduce noise. Still, there are remarkably efficient algorithms that process sensor data and minimize the uncertainty due to both sensor and process noise. You will learn about the computer algorithm that navigated man to the moon and whose implementation requirements inspired the microelectronics revolution. Main topics of the course include Kalman filters, Bayesian estimation, particle filters, and Markov decision processes, with applications in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision, aerospace guidance and control, and many more.
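
To make the central idea concrete, here is a minimal sketch of a scalar Kalman filter in Python (the course homework uses MATLAB; this example and all of its model and noise parameters are illustrative choices, not taken from the course material). It fuses noisy measurements of an uncertain process and shows the estimation variance shrinking as data arrives.

    # Minimal scalar Kalman filter (illustrative parameters).
    # Process:      x_k = a * x_{k-1} + w_k,  w_k ~ N(0, Q)
    # Measurement:  z_k = x_k + v_k,          v_k ~ N(0, R)
    import numpy as np

    rng = np.random.default_rng(0)
    a, Q, R = 1.0, 0.01, 0.25         # dynamics, process noise, sensor noise
    x_true, x_hat, P = 1.0, 0.0, 1.0  # true state, estimate, estimate variance

    for k in range(50):
        # Simulate the true system and a noisy sensor reading.
        x_true = a * x_true + rng.normal(0.0, np.sqrt(Q))
        z = x_true + rng.normal(0.0, np.sqrt(R))

        # Predict: propagate the estimate and its variance through the dynamics.
        x_hat = a * x_hat
        P = a * P * a + Q

        # Update: blend prediction and measurement using the Kalman gain.
        K = P / (P + R)
        x_hat += K * (z - x_hat)
        P = (1.0 - K) * P

    print(f"estimate = {x_hat:.3f}, truth = {x_true:.3f}, variance = {P:.4f}")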

Objectives

  • To introduce an applied perspective on using estimation techniques in state space models of nonlinear non-Gaussian dynamical systems.
  • To introduce applications of state estimation in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision etc.

Learning Outcomes

  • To identify and model uncertainties in sensors and dynamics of engineering systems.
  • To learn a unifying mathematical framework for tackling a vast range of estimation problems.
  • To appreciate the common themes in uncertainty quantification that emerge across the seemingly diverse disciplines of mathematical statistics, machine learning, signal processing, inverse problems, and stochastic control theory.

Prerequisites

EE-561 (Digital Control Systems) and EE-501 (Applied Probability), or by permission of the instructor.

Textbook

The course will be taught from the following textbook.

  • Optimal State Estimation by Dan Simon (Wiley, 2006)

Other important references include

  • Probabilistic Robotics by Sebastian Thrun, Wolfram Burgard, and Dieter Fox (MIT Press, 2005)
  • Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory by Steven M. Kay (Prentice Hall, 1993)
  • Estimation with Applications to Tracking and Navigation by Yaakov Bar-Shalom, X. Rong Li, and Thiagalingam Kirubarajan (Wiley, 2001)

Grading Scheme

Homeworks: 20%

Project: 25%

Midterm Examination: 25%

Final: 30%

Policies and Guidelines

  • Quizzes will be announced. There will be no makeup quizzes.
  • Homework will be due at the beginning of the class on the due date. Late homework will not be accepted.
  • You are allowed to collaborate on homework. However, copying solutions is absolutely not permitted. Offenders will be reported for disciplinary action as per university rules.
  • Any appeals regarding homework, quiz, or midterm grades must be resolved within one week of the return of the graded material.
  • Attendance in lectures and tutorials is strongly recommended but not mandatory. However, you are responsible for any announcements made in class.
  • Many of the homeworks will include MATLAB-based computer exercises. Some proficiency in programming numerical algorithms is essential for both the homeworks and the project.


Schedule

Week 1 (Jan 21)

Lecture 1. Linear algebra review; matrix calculus review; application: how to estimate a constant vector

Week 2 (Jan 28)

Lecture 2. Introduction to linear systems theory; concept of a state; continuous-time to discrete-time conversion; matrix exponentials

Lecture 3. Introduction to stability; Lyapunov and asymptotic stability; eigenanalysis and stability

Week 3 (Feb 4)

Lecture 4. Jordan form; controllability; observability; stabilizability; detectability; system modes

Week 4 (Feb 11)

Lecture 5. Introduction to least squares estimation; weighted least squares; recursive least squares; unbiased estimation

Lecture 6. Recursive least squares (contd.); error propagation; optimal gain derivation; example of RLS

Week 5 (Feb 18)

Lecture 7. Alternate formulas for recursive least squares; matrix inversion lemma

Lecture 8. Stein's equation; solution of Lyapunov's equation; related theorems; noise in sampled-data systems

Week 6 (Feb 25)

Lecture 9. Sylvester's equation; related theorems; notation for a priori and a posteriori estimates; setup for the Kalman filter; Kalman filter as RLS; optimality of the KF

Lecture 10. KF tracking example; how to deal with colored noise

Week 7 (March 4)

Lecture 11. Correlated process and measurement noise; steady-state filtering; discrete algebraic Riccati equation (DARE)

Lecture 12. Steady-state KF (contd.); solution of the DARE; existence and uniqueness; elaboration of theorems by examples

Week 8 (March 11)

Lecture 13. Introduction to Bayesian state estimation; Markovian dynamics; derivation of the Bayes filter

Midterm.

Week 9 (March 25)

Lecture 14. KF vs. Bayesian filtering; elaboration on a scalar stochastic differential equation

Project proposal presentations.

Week 10 (April 1)

Lecture 15. Bayes filtering for linear dynamics and Gaussian noise; equivalence to the KF

Lecture 16. Nonlinear filtering; linearization of dynamics and sensing; extended Kalman filter (EKF)

Week 11 (April 8)

Lecture 17. Nonlinear filtering (contd.); unscented transform vs. linearization; graphical explanation

Lecture 18. Analysis of the unscented transform; mean and covariance under nonlinear transformations

Week 12 (April 15)

Lecture 19. Guest lecture: an applied introduction to particle filtering (Dr. Murtaza Taj)

Lecture 20. Guest lecture (contd.): applications of particle filters in computer vision (Dr. Murtaza Taj)

Week 13 (April 22)

Lecture 21. UKF vs. EKF; example

Lecture 22. Particle filtering and its derivation

Week 14 (April 29)

Lecture 23. Resampling in particle filters; importance resampling

Week 15 (May 6)

Lecture 24. A final overview of estimators; Cramér-Rao lower bounds; classical vs. Bayesian estimation

Project presentations 1.

Project presentations 2.
