EE-565

From CYPHYNETS

EE-565/CS-5313: Mobile Robotics


Instructors

Dr Ahmad Kamal Nasir

Dr Abubakr Muhammad

Assistant Professor of Electrical Engineering

Email: ahmad.kamal [at] lums.edu.pk, abubakr [at] lums.edu.pk

Office: Room 9-313, Left Wing, 3rd Floor, SSE Bldg

Office Hours: Tuesday 1400-1500, Thursday 1400-1500

Teaching assistant.

Course Details

Year: 2018-19

Semester: Spring

Category: Grad

Credits: 3

Elective course for electrical engineering, computer engineering and computer science majors


Course Description

This course is designed to give students hands-on experience with real aerial and ground mobile robots. It provides an overview of problems and approaches in mobile robotics. Most of the algorithms covered are probabilistic in nature and deal with noisy data. Students will have the opportunity to implement state-of-the-art probabilistic algorithms for mobile robot state estimation, with a strong focus on vision as the main sensor.

This course is NOT about

  • Mechatronics or robot building.

Learning Outcomes

The students should be able to:

  • Understand basic wheeled-robot kinematics and common mobile robot sensors and actuators.
  • Understand and be able to apply robot motion and sensor models used in recursive state estimation.
  • Demonstrate inertial/visual odometry techniques for mobile robot pose estimation.
  • Use and apply at least one Simultaneous Localization and Mapping (SLAM) technique.
  • Understand and apply path planning and navigation algorithms.

Pre-requisites

Courses. CS310 or EE361, or by permission of the instructor.

Topics/Skills. Programming proficiency in C/C++; linear algebra and probability

Text books & Supplementary Readings

The course will be taught from a combination of the following textbooks.

Grading Scheme

  • Final Project : 20%
  • Midterm Examinations: 25% + 15%
  • Lab Tasks: 40%

Course Delivery Method

Lectures. Tues, Thurs: 0800-0850.

Lab. Fri: 0800-1050.

Schedule


Week 1

  • Jan 26
  • Jan 28

Lecture:

  • Introduction to mobile robotics and trends, course objectives
  • Short notes on Linear Algebra
  • 2D/3D Geometry, Transformations, 3D-2D Projections
  • Recap of Probability Rules

Tutorial: Introduction to ROS

Week 2

  • Feb 02
  • Feb 04

Lecture:

  • Wheel kinematics and robot pose calculation
    • Differential wheel drive (see the sketch at the end of this week's entry)
    • Ackermann wheel drive
  • Introduction to mobile robot sensors
    • Wheel encoders
    • Inertial Measurement Unit (IMU) and GPS
    • Range sensors (ultrasonic, 2D/3D laser range scanners)
    • Vision sensors (Monocular/Stereo camera)
  • Introduction to mobile robot actuators
    • DC Brush/Brushless motors
    • PID based velocity controller
    • PID based position controller

Lab Task: ROS Interface with simulation environment
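
A minimal C++ sketch of dead-reckoning with the differential-drive model covered this week; the wheel speeds, wheel base and time step below are illustrative values, not parameters of the course robots.

 #include <cmath>
 #include <cstdio>
 
 // Minimal differential-drive dead-reckoning sketch (illustrative only).
 // The wheel base and the wheel speeds passed in main() are hypothetical.
 struct Pose { double x, y, theta; };
 
 Pose integrate(Pose p, double vLeft, double vRight,
                double wheelBase, double dt) {
     double v = 0.5 * (vRight + vLeft);          // linear velocity
     double w = (vRight - vLeft) / wheelBase;    // angular velocity
     p.x     += v * std::cos(p.theta) * dt;
     p.y     += v * std::sin(p.theta) * dt;
     p.theta += w * dt;
     return p;
 }
 
 int main() {
     Pose p{0.0, 0.0, 0.0};
     // drive with slightly different wheel speeds -> gentle left turn
     for (int k = 0; k < 100; ++k)
         p = integrate(p, 0.9, 1.0, 0.5, 0.1);
     std::printf("x=%.2f y=%.2f theta=%.2f\n", p.x, p.y, p.theta);
     return 0;
 }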

Week 3

  • Feb 09
  • Feb 11

Lecture:

  • Motion Models
    • Velocity-based model (dead reckoning)
    • Odometry-based model (wheel encoders/IMU); see the sampling sketch at the end of this week's entry
  • Sensor Models
    • Beam model of range finders
    • Feature based sensor models
      • Laser scanner
      • Kinect
      • Camera

Lab Task: ROS Interface with low level control
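
The following C++ sketch illustrates the sampling form of the odometry-based motion model covered this week, in the spirit of the Probabilistic Robotics treatment; the noise parameters a1-a4 and the odometry increments in main() are made-up values.

 #include <cmath>
 #include <cstdio>
 #include <random>
 
 // Sketch of the sampling form of the odometry-based motion model
 // (after Probabilistic Robotics). Noise parameters a1..a4 are assumed.
 struct Pose { double x, y, theta; };
 
 Pose sampleOdometryModel(const Pose& prev, const Pose& odomPrev,
                          const Pose& odomCurr, std::mt19937& rng) {
     const double a1 = 0.01, a2 = 0.01, a3 = 0.05, a4 = 0.05;  // assumed
     double dRot1  = std::atan2(odomCurr.y - odomPrev.y,
                                odomCurr.x - odomPrev.x) - odomPrev.theta;
     double dTrans = std::hypot(odomCurr.x - odomPrev.x,
                                odomCurr.y - odomPrev.y);
     double dRot2  = odomCurr.theta - odomPrev.theta - dRot1;
 
     auto sample = [&rng](double var) {          // zero-mean Gaussian noise
         std::normal_distribution<double> n(0.0, std::sqrt(var) + 1e-9);
         return n(rng);
     };
     double r1 = dRot1  - sample(a1 * dRot1 * dRot1 + a2 * dTrans * dTrans);
     double tr = dTrans - sample(a3 * dTrans * dTrans +
                                 a4 * (dRot1 * dRot1 + dRot2 * dRot2));
     double r2 = dRot2  - sample(a1 * dRot2 * dRot2 + a2 * dTrans * dTrans);
 
     return { prev.x + tr * std::cos(prev.theta + r1),
              prev.y + tr * std::sin(prev.theta + r1),
              prev.theta + r1 + r2 };
 }
 
 int main() {
     std::mt19937 rng(1);
     Pose belief{0.0, 0.0, 0.0};
     Pose odomPrev{0.0, 0.0, 0.0}, odomCurr{1.0, 0.2, 0.1};  // fake odometry step
     for (int i = 0; i < 3; ++i) {               // draw a few pose samples
         Pose s = sampleOdometryModel(belief, odomPrev, odomCurr, rng);
         std::printf("sample: x=%.3f y=%.3f theta=%.3f\n", s.x, s.y, s.theta);
     }
     return 0;
 }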

Week 4

  • Feb 16
  • Feb 18

Lecture:

  • Recursive State Estimation: Bayes Filter
  • Linear Kalman Filter (see the one-dimensional sketch at the end of this week's entry)
  • Extended Kalman Filter

Lab Task: iRobot Create setup with ROS and implementation of the odometry-based motion model
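
To accompany the Kalman filter lecture, here is a one-dimensional linear Kalman filter sketch; the process noise, measurement noise and measurement sequence are invented values for illustration only.

 #include <cstdio>
 
 // One-dimensional linear Kalman filter sketch (constant-state model).
 // Process/measurement noise values q and r are illustrative assumptions.
 int main() {
     double x = 0.0, p = 1.0;          // state estimate and its variance
     const double q = 0.01, r = 0.25;  // assumed process / measurement noise
     const double z[] = {0.9, 1.1, 1.0, 0.95, 1.05};  // fake measurements
 
     for (double meas : z) {
         p += q;                       // predict (state assumed constant)
         double k = p / (p + r);       // Kalman gain
         x += k * (meas - x);          // correct with the measurement
         p *= (1.0 - k);
         std::printf("x=%.3f p=%.3f\n", x, p);
     }
     return 0;
 }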

Week 5

  • Feb 23
  • Feb 25

Lecture:

  • Non-parametric filters
    • Histogram filters
    • Particle filters (see the sketch at the end of this week's entry)

Lab Task: AR Drone setup with ROS and sensor data fusion using the drone's accelerometer and gyroscope
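
A tiny 1-D particle filter sketch to accompany the non-parametric filter lecture; the motion step, noise levels and landmark position are all assumed for illustration.

 #include <cmath>
 #include <cstdio>
 #include <random>
 #include <vector>
 
 // Tiny 1-D particle filter sketch: a robot moves along a line and measures
 // its range to a landmark assumed to sit at x = 0. All numbers are made up.
 int main() {
     std::mt19937 rng(7);
     std::uniform_real_distribution<double> init(0.0, 10.0);
     std::normal_distribution<double> motionNoise(0.0, 0.1);
     std::normal_distribution<double> measNoise(0.0, 0.5);
     const double measSigma = 0.5, step = 1.0;
 
     std::vector<double> particles(500);
     for (double& p : particles) p = init(rng);
 
     double truth = 2.0;
     for (int t = 0; t < 5; ++t) {
         truth += step;
         double z = truth + measNoise(rng);       // noisy range to the landmark
 
         // predict: propagate every particle through the motion model
         for (double& p : particles) p += step + motionNoise(rng);
 
         // weight: Gaussian likelihood of the range measurement
         std::vector<double> w(particles.size());
         for (size_t i = 0; i < particles.size(); ++i) {
             double d = z - particles[i];
             w[i] = std::exp(-0.5 * d * d / (measSigma * measSigma));
         }
 
         // resample in proportion to the weights
         std::discrete_distribution<size_t> pick(w.begin(), w.end());
         std::vector<double> resampled(particles.size());
         for (double& p : resampled) p = particles[pick(rng)];
         particles.swap(resampled);
 
         double mean = 0.0;
         for (double p : particles) mean += p;
         std::printf("t=%d truth=%.2f estimate=%.2f\n", t, truth,
                     mean / particles.size());
     }
     return 0;
 }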

Week 6

  • Mar 02
  • Mar 04

Lecture:

  • Inertial sensors models
    • Gyroscope
    • Accelerometer
    • Magnetometer
    • GPS
  • Inertial Odometry (see the gyro/accelerometer fusion sketch at the end of this week's entry)

Mid-Term Examination 1
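
As one simple example of fusing the inertial sensors above, the sketch below estimates pitch with a complementary filter that blends integrated gyroscope rate with an accelerometer-derived angle; the sample rate, blending factor and sensor stream are invented.

 #include <cmath>
 #include <cstdio>
 
 // Complementary-filter sketch for pitch estimation from a gyroscope and an
 // accelerometer. The sensor readings, sample time and blending factor are
 // all invented values, not AR Drone data.
 int main() {
     const double dt = 0.01;        // assumed 100 Hz sample rate
     const double alpha = 0.98;     // trust the gyro short-term, accel long-term
     double pitch = 0.0;            // estimated pitch angle [rad]
 
     // fake sensor stream: constant pitch rate plus gravity-derived pitch
     for (int k = 0; k < 200; ++k) {
         double gyroRate   = 0.05;                        // rad/s about y-axis
         double accelX     = -9.81 * std::sin(0.05 * k * dt);
         double accelZ     =  9.81 * std::cos(0.05 * k * dt);
         double accelPitch = std::atan2(-accelX, accelZ); // pitch from gravity
 
         // blend integrated gyro rate with the accelerometer pitch
         pitch = alpha * (pitch + gyroRate * dt) + (1.0 - alpha) * accelPitch;
     }
     std::printf("estimated pitch after 2 s: %.3f rad\n", pitch);
     return 0;
 }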

Week 7

  • Mar 09
  • Mar 11

Lecture:

  • Visual Odometry: camera model and calibration (see the pinhole projection sketch at the end of this week's entry)
  • Feature detection: Harris corners, SIFT/SURF, etc.
  • Kanade-Lucas-Tomasi Tracker (Optical Flow)

Lab Task: Inertial odometry using the AR Drone's IMU and calculation of the measurement covariance
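
A short sketch of the pinhole camera model used in the visual odometry lectures; the intrinsic parameters below are assumed, VGA-like values, not a real calibration.

 #include <cstdio>
 
 // Pinhole camera model sketch: project a 3-D point in the camera frame onto
 // the image plane using an assumed intrinsic matrix (fx, fy, cx, cy made up).
 struct Intrinsics { double fx, fy, cx, cy; };
 
 bool project(const Intrinsics& K, double X, double Y, double Z,
              double& u, double& v) {
     if (Z <= 0.0) return false;      // point behind the camera
     u = K.fx * (X / Z) + K.cx;       // u = fx * x/z + cx
     v = K.fy * (Y / Z) + K.cy;       // v = fy * y/z + cy
     return true;
 }
 
 int main() {
     Intrinsics K{525.0, 525.0, 319.5, 239.5};  // assumed VGA-like intrinsics
     double u, v;
     if (project(K, 0.2, -0.1, 1.5, u, v))
         std::printf("pixel: u=%.1f v=%.1f\n", u, v);
     return 0;
 }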

Week 8

  • Mar 16
  • Mar 18

Lecture:

  • Epipolar geometry for multi-view camera motion estimation (see the essential-matrix sketch at the end of this week's entry)
  • Structure from Motion (SfM): environment mapping (structure) and robot/camera pose estimation (motion)

Lab Task: Calibrate AR Drone’s camera and perform online optical flow.
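
To illustrate the epipolar constraint x2' E x1 = 0 with E = [t]x R, the sketch below builds an essential matrix for a made-up camera motion and checks the constraint on a synthetic point in normalized image coordinates; all poses and points are invented.

 #include <cmath>
 #include <cstdio>
 
 // Epipolar-geometry sketch: build E = [t]x * R for an assumed camera motion
 // and verify that x2' * E * x1 is (numerically) zero for a synthetic point.
 static void matVec(const double M[3][3], const double v[3], double out[3]) {
     for (int i = 0; i < 3; ++i)
         out[i] = M[i][0] * v[0] + M[i][1] * v[1] + M[i][2] * v[2];
 }
 
 static void matMat(const double A[3][3], const double B[3][3], double C[3][3]) {
     for (int i = 0; i < 3; ++i)
         for (int j = 0; j < 3; ++j)
             C[i][j] = A[i][0] * B[0][j] + A[i][1] * B[1][j] + A[i][2] * B[2][j];
 }
 
 int main() {
     // assumed motion of the second camera: 10 deg rotation about z, 1 m along x
     const double a = 10.0 * std::acos(-1.0) / 180.0;
     const double R[3][3] = {{std::cos(a), -std::sin(a), 0.0},
                             {std::sin(a),  std::cos(a), 0.0},
                             {0.0,          0.0,         1.0}};
     const double t[3] = {1.0, 0.0, 0.0};
     const double tx[3][3] = {{0.0,  -t[2],  t[1]},
                              {t[2],  0.0,  -t[0]},
                              {-t[1], t[0],  0.0}};
     double E[3][3];
     matMat(tx, R, E);                            // essential matrix E = [t]x R
 
     const double P1[3] = {0.3, -0.2, 4.0};       // 3-D point in camera-1 frame
     double P2[3];
     matVec(R, P1, P2);                           // same point in camera-2 frame
     for (int i = 0; i < 3; ++i) P2[i] += t[i];
 
     const double x1[3] = {P1[0] / P1[2], P1[1] / P1[2], 1.0};  // normalized coords
     const double x2[3] = {P2[0] / P2[2], P2[1] / P2[2], 1.0};
 
     double Ex1[3];
     matVec(E, x1, Ex1);
     double c = x2[0] * Ex1[0] + x2[1] * Ex1[1] + x2[2] * Ex1[2];
     std::printf("epipolar constraint x2'*E*x1 = %.2e (should be ~0)\n", c);
     return 0;
 }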

Week 9

  • Mar 30
  • Apr 01

Lecture:

  • Map-based localization
  • Markov localization
  • Kalman filter based localization

Lab Task: Using the AR Drone's camera, perform visual odometry with an SfM algorithm

Week 10

  • Apr 06
  • Apr 08

Lecture:

  • Mapping
    • Feature mapping
    • Grid Mapping (see the occupancy-grid sketch at the end of this week's entry)
  • Introduction to SLAM
  • Feature/Landmark SLAM
  • Grid Mapping (GMapping)

Mid-Term Examination 2
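
A minimal sketch of the log-odds occupancy-grid update that underlies grid mapping; the grid resolution, inverse sensor model increments and range readings are assumed values.

 #include <cmath>
 #include <cstdio>
 #include <vector>
 
 // Occupancy-grid sketch: 1-D log-odds update along a single range beam.
 // Cells before the measured range are updated as free, the cell at the
 // range as occupied. Resolution and sensor-model increments are assumed.
 int main() {
     const double cellSize = 0.1;                 // 10 cm cells (assumed)
     const double lFree = -0.4, lOcc = 0.85;      // assumed log-odds increments
     std::vector<double> logOdds(50, 0.0);        // 5 m of cells, prior p = 0.5
 
     const double ranges[] = {2.05, 2.0, 2.1};    // fake repeated beam readings
     for (double z : ranges) {
         int hit = static_cast<int>(z / cellSize);
         for (int i = 0; i < hit && i < (int)logOdds.size(); ++i)
             logOdds[i] += lFree;                 // beam passed through: free
         if (hit < (int)logOdds.size())
             logOdds[hit] += lOcc;                // beam endpoint: occupied
     }
 
     // convert a few cells back to occupancy probabilities p = 1 - 1/(1+e^l)
     for (int i = 18; i <= 21; ++i) {
         double p = 1.0 - 1.0 / (1.0 + std::exp(logOdds[i]));
         std::printf("cell %d (%.1f m): p(occupied) = %.2f\n", i, i * cellSize, p);
     }
     return 0;
 }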

Week 11

  • Apr 13
  • Apr 15

Lecture:

  • RGBD SLAM

Lab Task: Creating a grid map using an iRobot Create equipped with a laser scanner.

Week 12

  • Apr 20
  • Apr 22

Lecture:

  • Configuration/work spaces
  • Path Planning algorithms (see the A* sketch at the end of this week's entry):
    • Greedy Best First Search
    • Dijkstra
    • A*
  • Obstacle avoidance: Bug Algorithms

Lab Task: Create a 3D grid map using an iRobot Create equipped with a Microsoft Kinect
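
A compact A* sketch on a small 4-connected occupancy grid with a Manhattan-distance heuristic, to accompany the path planning lecture; the map, start and goal cells are made up.

 #include <cstdio>
 #include <cstdlib>
 #include <queue>
 #include <vector>
 
 // A* sketch on a small 2-D occupancy grid with 4-connected moves and a
 // Manhattan-distance heuristic. The map, start and goal cells are assumed.
 struct Node { int f, g, r, c; };
 struct Cmp  { bool operator()(const Node& a, const Node& b) const { return a.f > b.f; } };
 
 int main() {
     const int H = 5, W = 6;
     // 0 = free, 1 = obstacle (assumed map)
     const int grid[H][W] = {{0,0,0,0,0,0},
                             {0,1,1,1,1,0},
                             {0,0,0,0,1,0},
                             {1,1,1,0,1,0},
                             {0,0,0,0,0,0}};
     const int sr = 0, sc = 0, gr = 4, gc = 0;    // start and goal cells
 
     auto h = [&](int r, int c) { return std::abs(r - gr) + std::abs(c - gc); };
 
     std::vector<std::vector<int>> best(H, std::vector<int>(W, 1 << 30));
     std::priority_queue<Node, std::vector<Node>, Cmp> open;
     open.push({h(sr, sc), 0, sr, sc});
     best[sr][sc] = 0;
 
     const int dr[4] = {1, -1, 0, 0}, dc[4] = {0, 0, 1, -1};
     while (!open.empty()) {
         Node n = open.top();
         open.pop();
         if (n.g > best[n.r][n.c]) continue;      // stale queue entry
         if (n.r == gr && n.c == gc) {
             std::printf("reached goal with path cost %d\n", n.g);
             return 0;
         }
         for (int k = 0; k < 4; ++k) {
             int nr = n.r + dr[k], nc = n.c + dc[k];
             if (nr < 0 || nr >= H || nc < 0 || nc >= W || grid[nr][nc]) continue;
             int ng = n.g + 1;                    // unit cost per move
             if (ng < best[nr][nc]) {
                 best[nr][nc] = ng;
                 open.push({ng + h(nr, nc), ng, nr, nc});
             }
         }
     }
     std::printf("no path found\n");
     return 0;
 }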

Week 13

  • Apr 27
  • Apr 29

Lecture:

  • Exploration, Roadmaps

Lab Task: Set up and perform navigation using the ROS navigation stack and a stored map

Week 14

  • May 04
  • May 06

Lecture:

  • Recap, recent research work and future directions
  • Guest lecture by Dr. Haider Ali (DLR, Germany)

Lab Task: Hands-on introduction to sampling based planners via Open Motion Planning Library (OMPL)

Week 15

  • May 11
  • Project 1 - Human Detection and following using OpenCV and AR-Drone
  • Project 2 - Controlling AR-Drone With Simple Hand Gesture
  • Project 3 - Monocular Vision based Lane Following by AR-Drone
  • Project 4 - Fusion of Road Surface using RGB-D Sensors
  • Project 5 - Canal Following Using AR drone in Simulation Environment
  • Project 6 - Aerial tracking of ground based moving target
  • Project 7 - Surface Detection and Localization
  • Project 8 - NERC 2015 Gazebo simulation implementation