EE662
Revision as of 12:47, 6 January 2014
EE662: Applied Parameter & State Estimation

Instructor
Dr. Abubakr Muhammad, Assistant Professor of Electrical Engineering
Email: abubakr [at] lums.edu.pk
Office: Room 9311A, 3rd Floor, SSE Bldg
Course Details
Year: 2012-13
Semester: Spring
Category: Graduate
Credits: 3
Elective course for electrical engineering majors
Course Website: http://cyphynets.lums.edu.pk/index.php/EE662
Course Description
In this course we develop a hands-on yet rigorous approach to tackling uncertainties in the dynamical evolution of an engineering system. We learn about the main sources of uncertainty and how to model them statistically. We learn that installing sensors on an uncertain system can help reduce this uncertainty; however, sensors themselves introduce noise. Still, there are amazingly efficient algorithms to process sensor data and minimize uncertainty due to both sensor and process noise. You will learn about the computer algorithm that navigated man to the moon and whose implementation requirements inspired the microelectronics revolution. Main topics of the course include Kalman filters, Bayesian estimation, particle filters, and Markov decision processes, with many applications in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision, aerospace guidance & control, and more.
Objectives
 To introduce an applied perspective on using estimation techniques in state-space models of nonlinear, non-Gaussian dynamical systems.
 To introduce applications of state estimation in robot navigation, geophysical data assimilation, signal detection, radar tracking, computer vision, etc.
Learning Outcomes
 To identify and model uncertainties in sensors and dynamics of engineering systems.
 To learn a unifying mathematical framework for tackling a vast range of estimation problems.
 To appreciate the common themes in approaches to uncertainty quantification across the seemingly diverse disciplines of mathematical statistics, machine learning, signal processing, inverse problems, and stochastic control theory.
Prerequisites
EE561 (Digital Control Systems) and EE501 (Applied Probability), or by permission of the instructor.
Textbook
The course will be taught primarily from the following textbook.
 Optimal State Estimation by Dan Simon (Wiley, 2006)
Other important references include
 Probabilistic Robotics by Thrun, Burgard, Fox (MIT Press, 2006)
 Fundamentals of Statistical Signal Processing, Volume I: Estimation Theory by Steven M. Kay (Prentice Hall, 1993)
 Estimation with Applications to Tracking and Navigation by Yaakov Bar-Shalom, X. Rong Li, Thiagalingam Kirubarajan (Wiley, 2001)
Grading Scheme
Homeworks: 20%
Project: 25%
Midterm Examination: 25%
Final: 30%
Policies and Guidelines
 Quizzes will be announced. There will be no makeup quizzes.
 Homework will be due at the beginning of class on the due date. Late homework will not be accepted.
 You are allowed to collaborate on homework. However, copying solutions is absolutely not permitted. Offenders will be reported for disciplinary action as per university rules.
 Any appeals on the grading of homework, quiz, or midterm scores must be resolved within one week of the return of the graded material.
 Attendance in lectures and tutorials is strongly recommended but not mandatory. However, you are responsible for catching announcements made in class.
 Many of the homeworks will include MATLAB-based computer exercises. Some proficiency in programming numerical algorithms is essential for both the homework and the project.
Schedule
WEEK | TOPICS | REFERENCES
Week 1. Jan 21 | Lecture 1. Linear algebra review; matrix calculus review; application: how to estimate a constant vector. |
Week 2. Jan 28 | Lecture 2. Intro to linear systems theory; concept of a state; continuous-time to discrete-time conversion; matrix exponentials. Lecture 3. Intro to stability; Lyapunov and asymptotic stability; eigenanalysis and stability. |
Week 3. Feb 4 | Lecture 4. Jordan form; controllability; observability; stabilizability; detectability; system modes. |
Week 4. Feb 11 | Lecture 5. Intro to least squares estimation; weighted least squares; recursive least squares; unbiased estimation. Lecture 6. Recursive least squares (contd.); error propagation; optimal gain derivation; example of RLS. |
Week 5. Feb 18 | |
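As a taste of the discretization topic from Lecture 2, the zero-order-hold conversion of a continuous-time model to discrete time can be computed with a single matrix exponential (Van Loan's block-matrix trick). The homeworks use MATLAB; the sketch below uses Python/NumPy/SciPy instead, with a double-integrator plant chosen purely for illustration.

```python
import numpy as np
from scipy.linalg import expm

# Continuous-time system x' = A x + B u, sampled with period T.
# Zero-order-hold discretization: [F G; 0 I] = expm([A B; 0 0] * T).
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # double integrator (illustrative example)
B = np.array([[0.0],
              [1.0]])
T = 0.1                      # sample period

n, m = A.shape[0], B.shape[1]
M = np.zeros((n + m, n + m))
M[:n, :n] = A
M[:n, n:] = B
Md = expm(M * T)
F = Md[:n, :n]   # discrete-time state matrix
G = Md[:n, n:]   # discrete-time input matrix
print(F)  # [[1, 0.1], [0, 1]]
print(G)  # [[0.005], [0.1]]
```

The same result comes from MATLAB's c2d on the corresponding state-space object; the block-matrix form above makes the role of the matrix exponential explicit.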

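The recursive least squares topic of Lectures 5 and 6 can be summarized in a few lines: each new measurement updates the estimate through an optimal gain computed from the propagated error covariance. The sketch below is in Python/NumPy rather than the course's MATLAB, and the measurement model and noise levels are invented for illustration only.

```python
import numpy as np

# Recursive least squares: estimate a constant vector x from noisy scalar
# measurements y_k = H_k x + v_k, one sample at a time.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])   # unknown constant to be estimated

x_hat = np.zeros(2)        # initial estimate
P = 100.0 * np.eye(2)      # initial estimation-error covariance (large = uncertain)
R = 0.01                   # measurement noise variance

for k in range(200):
    H = np.array([[1.0, 0.01 * k]])            # 1x2 measurement matrix
    y = H @ x_true + rng.normal(0.0, np.sqrt(R))
    K = P @ H.T / (H @ P @ H.T + R)            # optimal gain (2x1)
    x_hat = x_hat + (K * (y - H @ x_hat)).ravel()  # estimate update
    P = (np.eye(2) - K @ H) @ P                # covariance propagation

print(x_hat)   # close to [2, -1]
```

Note that the batch least squares solution over all 200 measurements would give the same final estimate; the recursive form avoids storing and refactoring the full data matrix at every step.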