EE 527: Detection and Estimation Theory (Spring 2018)
- Updates/Reminders
- Prerequisites: EE 224, EE 322, basic calculus & linear algebra. Suggested co-requisite: EE 523
- Location, Time: Howe 1226, Tues-Thurs 2:10-3:30
- Instructor: Prof. Namrata Vaswani
- Office Hours: Wed 11-12, Thurs 11-12
- Office: 3121 Coover Hall
- Email: namrata AT iastate DOT edu; Phone: 515-294-4012
- Grading policy
- Homeworks: 10%
- Midterm Exam: 30%
- Final Exam: 40%
- Project / term paper: 20%
- Exam Dates, Project Details, and Deadlines
- Exam dates
- Project / Term Paper details:
- Pick a topic related to the course (it can be related to your research, but cannot be research that you have already done)
- Either pick one or more papers, implement the algorithm(s) for an application, and discuss the pros and cons / what else can be done
- Or pick a theoretical paper and present the problem, solution approach, guarantee, and proof (or most of the proof; carefully select a paper whose proof is not too long)
- Submission requirements: a report write-up and a presentation (at most an hour, can be shorter)
- Syllabus:
- Background material: recap of probability, calculus, linear algebra
- Estimation Theory
- Minimum variance unbiased estimation, best linear unbiased estimation
- Cramer-Rao lower bound (CRLB)
- Maximum Likelihood estimation (MLE): exact and approximate methods (EM, alternating max, etc.)
- Bayesian inference & Least Squares Estimation (from Kailath et al.'s Linear Estimation book)
- Basic ideas, adaptive techniques, Recursive LS, etc.
- Kalman filtering (sequential Bayes)
- Finite state Hidden Markov Models: forward-backward algorithm, Viterbi (ML state estimation), parameter estimation (forward-backward + EM)
- Graphical Models
- Applications: image processing, speech, communications (to be discussed with each topic)
- Sparse Recovery and Compressive Sensing introduction
- Monte Carlo methods: importance sampling, MCMC, particle filtering; applications in numerical integration (MMSE estimation or error probability computation) and in numerical optimization (e.g., annealing). A minimal importance-sampling sketch is given right after this syllabus outline.
- Detection Theory
- Likelihood Ratio testing, Bayes detectors
- Minimax detectors
- Multiple hypothesis tests
- Neyman-Pearson detectors (matched filter, estimator-correlator, etc.)
- Wald sequential test
- Generalized likelihood ratio tests (GLRTs), Wald and Rao scoring tests
- Applications
- The syllabus is similar to Prof. Dogandzic's EE 527, but I will cover least squares estimation, Kalman filtering, and Monte Carlo methods in more detail, and will also discuss some image/video processing applications. Note that LSE and KF are also covered in EE 524, but different perspectives are always useful.
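To give a concrete taste of the Monte Carlo unit, here is a minimal Python sketch of importance sampling applied to a tail-probability computation (the kind of quantity that arises in error-probability calculations). The target P(X > 4) for X ~ N(0,1), the shifted-Gaussian proposal, and all variable names are illustrative assumptions for this sketch, not material from the course notes.

```python
import numpy as np

# Importance sampling for a Gaussian tail probability P(X > t), X ~ N(0, 1).
# The target t = 4 and the proposal N(t, 1) are illustrative assumptions.
rng = np.random.default_rng(0)
n, t = 100_000, 4.0

# Naive Monte Carlo: almost no samples land beyond t, so the estimate is unusable.
x = rng.standard_normal(n)
naive = np.mean(x > t)

# Importance sampling with proposal q = N(t, 1): reweight by p(y) / q(y)
# (the 1/sqrt(2*pi) normalizers cancel since both densities have unit variance).
y = rng.normal(loc=t, scale=1.0, size=n)
w = np.exp(-0.5 * y**2 + 0.5 * (y - t) ** 2)
is_est = np.mean((y > t) * w)

print(f"naive MC: {naive:.2e}   importance sampling: {is_est:.2e}")
# For reference, the true value is Q(4), roughly 3.17e-5.
```

The design point: the proposal concentrates samples where the integrand is nonzero, while naive Monte Carlo wastes essentially all of its samples on the bulk of the density.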
- Books:
- Textbook: S. M. Kay, Fundamentals of Statistical Signal Processing: Estimation Theory (Vol. 1) and Detection Theory (Vol. 2)
- References
- Kailath, Sayed and Hassibi, Linear Estimation
- H. V. Poor, An Introduction to Signal Detection and Estimation
- H. Van Trees, Detection, Estimation, and Modulation Theory
- J. S. Liu, Monte Carlo Strategies in Scientific Computing, Springer-Verlag, 2001
- B. D. Ripley, Stochastic Simulation, Wiley, 1987
- Disability accommodation: If you have a documented disability and anticipate needing accommodations in this course, please make arrangements to meet with me soon. You will need to provide documentation of your disability to the Disability Resources (DR) office, located on the main floor of the Student Services Building, Room 1076, or call 515-294-7220.
- Homeworks
- Homework 1: due Mon Jan 29
- Newly added:
- Discuss and explain rank and spark
- Prove the interlacing theorem for the matrix (A + zz^T), where z is a vector
- Newly added: Prove all the if-and-only-if statements for jointly Gaussian random variables from this document
- Chapter 3 of Supplementary Problems for Bertsekas's Probability Text: Problems 5, 6, 8, 9, 10, 14, 18, 19, 20, 21
- Correction: Suppose X1 is N(0,1), X2 is 1 w.p. 1/2 and -1 w.p. 1/2, and X3 = X1 * X2. Compute the pdf of X3 and compute the joint pdf of X1 and X3 (a small simulation sketch for this problem appears after this homework's list).
- Practice problems: use this link to do selected practice problems from Chapters 1 and 2: EE 322 Fall 2007 homework sets (these do not need to be submitted)
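For the corrected problem above, a quick numerical sanity check can be run before attempting the derivation. This is a hedged sketch, not a solution; the sample size, seed, and choice of tests are arbitrary assumptions.

```python
import numpy as np
from scipy import stats

# Sanity check only: X1 ~ N(0,1), X2 = +/-1 w.p. 1/2 each, X3 = X1 * X2.
rng = np.random.default_rng(0)
n = 200_000
x1 = rng.standard_normal(n)
x2 = rng.choice([-1.0, 1.0], size=n)
x3 = x1 * x2

# Marginal of X3: flipping the sign of a symmetric random variable preserves
# its distribution, so X3 should look N(0,1); the KS test should not reject.
print(stats.kstest(x3, "norm"))

# Joint of (X1, X3): X1 + X3 = X1 * (1 + X2) is exactly 0 whenever X2 = -1,
# i.e. with probability 1/2, which no nondegenerate jointly Gaussian pair allows.
print("P(X1 + X3 = 0) ~", np.mean(x1 + x3 == 0.0))
```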
- Homework 2: due Mon Feb 12
- Problems 2.1, 2.4, 2.7, 2.9, 2.10 of Kay-I. Bonus: 2.8
- Homework 3: due Tues Feb 23
- Problems 5.2, 5.3, 5.4, 5.5, 5.7, 5.13, 5.16
- Compute the MVUE for a N(\mu, \sigma^2) distribution using N i.i.d. observations. Also compute the covariance matrix of the MVUE.
- Homework 4: due Thurs March 3
- Problems 3.1, 3.3, 3.9, 3.11
- Do the following sets of problems: practice set (will be graded for completion, ignore deadlines written on it)
- Homework 5: Due Tuesday March 9
- Problems 4.2, 4.5, 4.6, 4.10, 4.13, 4.14
- Problems 6.1, 6.2, 6.5, 6.7, 6.9, 6.16
- Extra credit: 6.8, 6.14, 6.10
- Homework 6: Due Thursday March 31
- Problems 7.6, 7.7, 7.14, 7.18, 7.19, and Problems 8.24, 8.26, 8.27 (skip the Newton-Raphson part)
- Practice problems (I suggest doing at least two): 8.4, 8.12, 8.28, 8.29
- Homework 7: Due Thursday April 7
- Course Handouts
- Introduction slides
- Linear Algebra and Probability Review and New Material
- Classical Estimation
- Sparse Recovery / Compressive Sensing
- Bayesian estimation
- MMSE and linear MMSE estimation and Kalman filtering
- Some extra things
- Graphical models
- Graphical models (Prof. ALD's handout): an approach for handling conditional dependencies in Bayesian estimation
- Hidden Markov Models (HMM)
- Detection Theory
- Monte Carlo
- Simple MC and Importance Sampling (IS)
- Markov Chain Monte Carlo (MCMC)
- Particle filtering
- Doucet et al.'s paper (2000)
- HMM model and other algorithms
- Importance sampling to approximate a PDF (sum of weighted Diracs)
- Sequential Importance Sampling (SIS)
- Resampling concept
- Particle filtering algorithm: SIS + Resample (a minimal sketch follows this list)
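Putting the pieces of this outline together, here is a minimal bootstrap particle filter sketch in Python: SIS with the prior as proposal, plus multinomial resampling at every step. The scalar linear-Gaussian model, its coefficients, and all names below are assumptions made for illustration; they are not taken from Doucet et al.'s paper or the handout.

```python
import numpy as np

# Bootstrap particle filter = SIS (prior proposal) + resampling.
# The toy scalar model below is an assumption for illustration only:
#   x_t = 0.9 x_{t-1} + w_t,   w_t ~ N(0, 1)       (state equation)
#   y_t = x_t + v_t,           v_t ~ N(0, 0.5^2)   (observation equation)
rng = np.random.default_rng(0)
T, N = 50, 1000                          # time steps, number of particles

# Simulate a ground-truth state trajectory and noisy observations.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.standard_normal()
y = x_true + 0.5 * rng.standard_normal(T)

particles = rng.standard_normal(N)       # initial particle cloud ~ N(0, 1)
estimates = np.zeros(T)
for t in range(T):
    # SIS step: propagate each particle through the state equation (with the
    # prior as proposal, the importance weight is just the likelihood) ...
    particles = 0.9 * particles + rng.standard_normal(N)
    # ... and weight by the observation likelihood p(y_t | x_t), normalized.
    logw = -0.5 * ((y[t] - particles) / 0.5) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # The weighted cloud is the sum-of-weighted-Diracs approximation of the
    # posterior; its mean approximates the MMSE estimate E[x_t | y_1:t].
    estimates[t] = np.sum(w * particles)
    # Resampling step: draw N particles with replacement according to the
    # weights, which resets all weights to 1/N and fights weight degeneracy.
    particles = rng.choice(particles, size=N, p=w)

print("RMSE of PF estimate vs. true state:",
      np.sqrt(np.mean((estimates - x_true) ** 2)))
```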
- Probability and Linear Algebra Recap
- Undergrad probability review (EE 322)
- Linear algebra review
Back to Namrata Vaswani's homepage