
EE-662: Applied Parameter & State Estimation


Instructor

Dr. Abubakr Muhammad, Assistant Professor of Electrical Engineering

Email: abubakr [at] lums.edu.pk

Office: Room 9-311A, 3rd Floor, SSE Bldg

Course Details

Year: 2012-13

Semester: Spring

Category: Graduate

Credits: 3

Elective course for electrical engineering majors

Course Website: http://cyphynets.lums.edu.pk/index.php/EE-662

Course Description

In this course we develop a hands-on yet rigorous approach to tackling uncertainties in the dynamical evolution of an engineering system. We learn about the main sources of uncertainty and how to model them statistically. We learn that installing sensors on an uncertain system can help reduce this uncertainty; however, sensors themselves introduce noise. Still, there are amazingly efficient algorithms that process sensor data to minimize the uncertainty due to both sensor and process noise. You will learn about the computer algorithm that navigated man to the moon and whose implementation requirements inspired the microelectronics revolution. Main topics of the course include Kalman filters, Bayesian estimation, particle filters, and Markov decision processes, with many applications in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision, aerospace guidance & control, and more.

Objectives

  • To introduce an applied perspective on using estimation techniques in state space models of nonlinear non-Gaussian dynamical systems.
  • To introduce applications of state estimation in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision etc.

Learning Outcomes

  • To identify and model uncertainties in sensors and dynamics of engineering systems.
  • To learn a unifying mathematical framework for tackling a vast range of estimation problems.
  • To appreciate the common threads in uncertainty quantification across the seemingly diverse disciplines of mathematical statistics, machine learning, signal processing, inverse problems, and stochastic control theory.

Pre-requisites

EE-561 (Digital Control Systems) and EE-501 (Applied Probability), or by permission of the instructor.

Text book

The course will be taught from the following textbook.

  • Optimal State Estimation by Dan Simon (Wiley, 2006)

Other important references include

  • Probabilistic Robotics by Sebastian Thrun, Wolfram Burgard, and Dieter Fox (MIT Press, 2005)
  • Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory by Steven M. Kay (Prentice Hall, 1993)
  • Estimation with Applications to Tracking and Navigation by Yaakov Bar-Shalom, X. Rong Li, and Thiagalingam Kirubarajan (Wiley, 2001)

Grading Scheme

Homeworks: 20%

Project: 25%

Midterm Examination: 25%

Final Examination: 30%

Policies and Guidelines

  • Quizzes will be announced. There will be no makeup quizzes.
  • Homework will be due at the beginning of class on the due date. Late homework will not be accepted.
  • You are allowed to collaborate on homework. However, copying solutions is absolutely not permitted. Offenders will be reported for disciplinary action as per university rules.
  • Any appeals on the grading of homework, quiz, or midterm scores must be resolved within one week of the return of the graded material.
  • Attendance in lectures and tutorials is strongly recommended but not mandatory. However, you are responsible for keeping up with announcements made in class.
  • Many of the homeworks will include MATLAB-based computer exercises. Some proficiency in programming numerical algorithms is essential for both the homeworks and the project.


Schedule

WEEK TOPICS REFERENCES

Week 1. Jan 21

Lecture 1. Linear algebra review; matrix calculus review; application: how to estimate a constant vector;

Week 2. Jan 28

Lecture 2. Intro to linear systems theory; concept of a state; continuous-time to discrete-time conversion; matrix exponentials;

Lecture 3. Intro to stability; Lyapunov and asymptotic stability; eigenanalysis and stability;

Week 3. Feb 4

Lecture 4. Jordan form; controllability; observability; stabilizability; detectability; system modes;

Week 4. Feb 11

Lecture 5. Intro to least squares estimation; weighted least squares; recursive least squares; unbiased estimation;

Lecture 6. Recursive least squares (contd.); error propagation; optimal gain derivation; example of RLS (a MATLAB sketch of the RLS update follows this schedule);

Week 5. Feb 18

Lecture 7. Alternate formulas for recursive least squares; matrix inversion lemma;

Lecture 8. Stein's equation; solution of Lyapunov's equation; related theorems; noise in sampled-data systems;

Week 6. Feb 25

Lecture 9. Sylvester's equation; related theorems; notation for a posteriori and a priori estimates; setup for the Kalman filter; Kalman filter as RLS; optimality of the KF;

Lecture 10. KF tracking example; how to deal with colored noise;

Week 7. March 4

Lecture 11. Correlated process and measurement noise; steady-state filtering; discrete algebraic Riccati equation (DARE);

Lecture 12. Steady-state KF (contd.); solution of the DARE; existence and uniqueness; elaboration of theorems by examples;

Week 8. March 11

Lecture 13. Introduction to Bayesian state estimation; Markovian dynamics; derivation of the Bayes filter;

Week 9. March 25

Lecture 14. KF vs. Bayesian filtering; elaboration on a scalar stochastic differential equation;

Midterm.
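As a taste of the MATLAB exercises, here is a minimal sketch of the recursive least squares update covered in Lectures 5 to 7, applied to the running example of estimating a constant vector. The measurement model, noise variance, and dimensions below are illustrative assumptions, not taken from the course materials.

 % Recursive least squares (RLS) estimate of a constant parameter vector
 % from noisy scalar measurements y_k = H_k*x + v_k, with v_k ~ N(0, R).
 % All numbers here are illustrative assumptions, not homework values.
 rng(0);                          % reproducible noise
 x_true = [1.5; -0.7];            % unknown constant vector to be estimated
 R = 0.01;                        % measurement noise variance

 xhat = zeros(2, 1);              % initial estimate (no prior knowledge)
 P = 100 * eye(2);                % large initial covariance

 for k = 1:200
     H = randn(1, 2);                      % measurement matrix for step k
     y = H * x_true + sqrt(R) * randn;     % noisy scalar measurement
     K = P * H' / (H * P * H' + R);        % optimal gain
     xhat = xhat + K * (y - H * xhat);     % correct estimate with innovation
     P = (eye(2) - K * H) * P;             % propagate estimation covariance
 end
 disp(xhat')                      % converges toward x_true as k grows

The same gain, update, and covariance pattern reappears as the measurement step of the Kalman filter (Lecture 9), with a prediction step added to account for the state dynamics.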
