DETECTION AND ESTIMATION
Learning outcomes of the course unit
The objective of the course is to provide the student with the ability to understand and apply the basic principles of detection and estimation theory, and in particular:
- to apply the most common statistical tests in deciding among different hypotheses
- to synthesize the structure of the optimal receiver and analyze its performance in the context of digital transmissions
- to apply the most common statistical estimators
- to synthesize the structure of the optimal filters and analyze their performance in the context of digital transmissions.
The ability to apply the above-mentioned knowledge pertains in particular to the:
- design and performance analysis of the decision block in digital receivers
- design and performance analysis of the parameter-estimation blocks in digital receivers
Entry-level courses in probability theory and Fourier analysis for stochastic processes, such as those normally offered in the corresponding 3-year Laurea course, are necessary prerequisites for this course.
Course contents summary
1. Detection Theory
1.1 Bayes, Minimax, and Neyman-Pearson tests
1.2 Multiple hypothesis testing: MAP and ML tests
1.3 Sufficient statistics
Factorization, Irrelevance, Reversibility theorems
1.4 MAP Test with Gaussian signals. Additive Gaussian noise channel
1.5 Optimal detection of continuous-time signals: discrete representation.
Orthonormal bases and signal coordinates. Gram-Schmidt procedure.
Projection Theorem. Complete bases
1.6 Discrete representation of a stochastic process. Karhunen-Loève (KL) basis
1.7 Optimal MAP receiver in AWGN
1.8 Techniques to evaluate error probability
1.9 Composite hypothesis testing: partially known signals in AWGN.
Optimal incoherent MAP receiver structure
1.10 Detection in additive colored Gaussian noise: whitening, Cholesky decomposition
1.11 Detection with stochastic Gaussian signals: Radiometer
2. Estimation theory
2.1 Fisherian estimation
2.1.1 Minimum Variance Unbiased Estimation
2.1.2 Cramér-Rao Lower Bound
2.1.3 Maximum Likelihood estimation
2.2 Bayesian estimation
2.2.1 Minimum Mean Square Error estimation
2.2.2 MAP estimation
2.2.3 Linear MMSE estimation
2.2.4 Spectral Factorization and Wiener Filtering
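As a numerical illustration of topics 2.1.2-2.1.3 (a hedged sketch with assumed parameters, not taken from the course notes): for i.i.d. samples x_i ~ N(theta, s^2) with known variance, the ML estimator of theta is the sample mean; it is unbiased and its variance s^2/N attains the Cramér-Rao lower bound.

```python
# Sketch: ML estimation of a Gaussian mean and its CRLB (assumed example).
import random
import statistics

def ml_estimate(samples):
    """ML estimator of the mean of i.i.d. Gaussian samples: the sample mean."""
    return sum(samples) / len(samples)

def crlb(N, s2):
    """Cramér-Rao lower bound for the mean with known variance s2: s2/N."""
    return s2 / N

random.seed(0)
theta, s2, N, trials = 3.0, 4.0, 50, 2000
estimates = [ml_estimate([random.gauss(theta, s2 ** 0.5) for _ in range(N)])
             for _ in range(trials)]
emp_var = statistics.variance(estimates)
# The empirical variance of the ML estimates approaches the CRLB = 0.08.
print(crlb(N, s2), emp_var)
```

The Monte Carlo variance matches the bound because the sample mean is an efficient estimator in this model.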
Syllabus (every class = 2 hours)
First hour: Course organization, objectives, textbooks, exam details. Sneak preview of the course, motivations, applications. Second hour: basic probability theory refresher: total probability, Bayes rule in discrete/continuous/mixed versions, double conditioning. A first elementary exercise on binary hypothesis testing.
First hour: completion of proposed exercise. Second hour: Bayes Tests.
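The binary Bayes test of this lecture can be sketched numerically (an assumed Gaussian example, not from the course notes): under H0 the observation is N(0, s^2), under H1 it is N(A, s^2), and the minimum-error-probability rule compares the likelihood ratio to p0/p1, which reduces to a threshold on x.

```python
# Sketch: binary Bayes test for a Gaussian shift-in-mean (assumed example).
import math

def bayes_threshold(A, s2, p0, p1):
    """Threshold t such that deciding H1 iff x > t minimizes Pe."""
    return A / 2 + (s2 / A) * math.log(p0 / p1)

def Q(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def error_probability(A, s2, p0, p1):
    """Pe = p0*P(decide H1 | H0) + p1*P(decide H0 | H1)."""
    t = bayes_threshold(A, s2, p0, p1)
    s = math.sqrt(s2)
    return p0 * Q(t / s) + p1 * Q((A - t) / s)

# With equal priors the threshold sits at A/2 and Pe = Q(A/(2s)).
print(bayes_threshold(2.0, 1.0, 0.5, 0.5))   # -> 1.0
print(error_probability(2.0, 1.0, 0.5, 0.5))
```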
First hour: exercise on Bayes Test (Laplacian distributions). Second hour: Minimax Test.
First hour: exercise on Minimax. Second hour: Neyman-Pearson Test with example.
First hour: ROC properties. NP test with discrete RVs: randomization. Second hour: Exercise on Bayes, Minimax, Neyman-Pearson tests.
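Randomization for discrete observations can be sketched as follows (an assumed binomial example, not from the notes): for X ~ Binomial(n, p), H0: p = p0 vs H1: p = p1 > p0, the likelihood ratio is monotone in X, so the NP test decides H1 when X > k and randomizes with probability gamma when X = k, with (k, gamma) chosen to meet the false-alarm constraint with equality.

```python
# Sketch: randomized Neyman-Pearson test for a binomial observation.
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) for X ~ Binomial(n, p)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

def np_test(n, p0, alpha):
    """Return (k, gamma) achieving exact size alpha under H0."""
    tail = 0.0  # running P(X > k | H0)
    for k in range(n, -1, -1):
        pk = binom_pmf(k, n, p0)
        if tail + pk > alpha:
            # randomize on the boundary point X == k
            gamma = (alpha - tail) / pk
            return k, gamma
        tail += pk
    return -1, 0.0

k, gamma = np_test(10, 0.5, 0.05)
# Exact size check: P(X > k | H0) + gamma * P(X = k | H0) == alpha.
size = sum(binom_pmf(x, 10, 0.5) for x in range(k + 1, 11)) \
       + gamma * binom_pmf(k, 10, 0.5)
print(k, round(gamma, 4), round(size, 6))
```

Without randomization, no deterministic threshold test on this discrete X can achieve size exactly 0.05.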
First hour: Multiple hypothesis testing, Bayesian approach. MAP and ML tests. Decision regions, boundaries among regions: examples in R^1 and R^2. Second hour: exercise: 3 equally-likely signal "hypotheses" -A, 0, A in AWGN: Bayes rule (ML) based on the sample mean (sufficient statistic).
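The in-class exercise can be sketched as follows (a minimal simulation with assumed parameters): with equal priors the MAP rule is ML, the sample mean is a sufficient statistic, and the decision regions are separated by the two thresholds ±A/2.

```python
# Sketch: ML decision among the three amplitudes -A, 0, A in AWGN.
import random

def ml_decide(samples, A):
    """ML decision based on the sample mean (sufficient statistic)."""
    m = sum(samples) / len(samples)
    if m > A / 2:
        return A
    if m < -A / 2:
        return -A
    return 0

random.seed(1)
A, sigma, N = 2.0, 1.0, 100
tx = random.choice([-A, 0.0, A])                 # transmitted amplitude
rx = [tx + random.gauss(0, sigma) for _ in range(N)]  # noisy observations
# With N = 100 the sample-mean std is sigma/10, so the decision is
# correct with overwhelming probability.
print(tx, ml_decide(rx, A))
```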
First hour: Minimax in multiple hypotheses. Sufficient statistics: introduction. Second hour: Factorization theorem, irrelevance theorem. Reversibility theorem. Gaussian vectors refresher: joint PDF, MGF/CF.
First hour: Summary of known main results on Gaussian random vectors: Gaussian MGF, 4th order statistics from moment theorem, MGF-based proof of Gaussianity of linear transformations. Examples of Gaussian vectors: Fading Channel. Second hour: A: MAP Test with Gaussian signals. B: Additive Gaussian noise channel. Decision regions are hyperplanes.
First hour: examples of decision regions. Optimal detection of continuous-time signals: motivation for their discrete representation. Second hour: Discrete signal representation: definitions. Inner product, norm, distance, linear independence. Orthonormal bases and signal coordinates.
Gram-Schmidt orthonormalization. Detailed example. Operations on signals, and dual operations on signal images.
Unitary matrices in change of basis. Orthogonal matrices: rotations and reflections. Orthogonality principle. Projection theorem. Interpretation of the Gram-Schmidt procedure as repeated projections. Complete ON bases: motivations and definition.
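The Gram-Schmidt procedure of these lectures can be sketched numerically (a minimal version for sampled signals stored as lists of floats; the tolerance parameter is an implementation assumption):

```python
# Sketch: Gram-Schmidt orthonormalization of a set of sampled signals.
def gram_schmidt(signals, tol=1e-12):
    """Return an orthonormal basis for the span of the input signals."""
    basis = []
    for s in signals:
        r = list(s)
        for q in basis:
            c = sum(a * b for a, b in zip(r, q))   # projection coefficient
            r = [a - c * b for a, b in zip(r, q)]  # subtract the projection
        norm = sum(a * a for a in r) ** 0.5
        if norm > tol:  # skip linearly dependent signals
            basis.append([a / norm for a in r])
    return basis

# Third signal is the sum of the first two -> only 2 basis vectors survive.
B = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [2.0, 1.0, 1.0]])
print(len(B))  # -> 2
```

Each step is exactly a projection onto the span of the basis built so far, followed by normalization of the residual.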
First hour: exercises: 1. product of unitary matrices is unitary. 2. unitary matrix preserves norm of vectors. Projection matrices, eigenvectors, eigenvalues, spectral decomposition. Properties. Second hour: examples of complete bases in L2: the space of band-limited functions, evaluation of series coefficients, sampling theorem, ON check. More examples of complete bases: Legendre, Hermite, Laguerre.
Discrete representation of a stochastic process. Mean and covariance of process coefficients. Properties of covariance matrices for finite random vectors: Hermitianity and related properties. Whitening. Karhunen-Loève (KL) theorem for whitening of the discrete process representation (hint of proof). Statement of Mercer's theorem. KL bases.
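Whitening (here via the Cholesky decomposition mentioned in topic 1.10) can be sketched on an assumed 2x2 covariance: if the noise covariance is C = L L^T with L lower triangular, then w = L^{-1} x has identity covariance.

```python
# Sketch: whitening a 2-dimensional colored observation via Cholesky.
import math

def chol2(C):
    """Cholesky factor L (lower triangular) of a 2x2 positive-definite C."""
    l11 = math.sqrt(C[0][0])
    l21 = C[1][0] / l11
    l22 = math.sqrt(C[1][1] - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

def whiten(x, L):
    """Solve L w = x by forward substitution (i.e. w = L^{-1} x)."""
    w0 = x[0] / L[0][0]
    w1 = (x[1] - L[1][0] * w0) / L[1][1]
    return [w0, w1]

C = [[2.0, 1.0], [1.0, 2.0]]
L = chol2(C)
# Whitening undoes the coloring x = L z: recover z ≈ [1.0, 0.5].
x = [L[0][0] * 1.0, L[1][0] * 1.0 + L[1][1] * 0.5]
print(whiten(x, L))
```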
Summary of useful matrices: normal matrices and their subclasses: unitary, Hermitian, skew-Hermitian. If the noise process is white, any complete ON basis is a KL basis. Digital modulation. Example: QPSK. Digital demodulation with a bank of correlators or matched filters.
First hour: Matched-filter properties: maximum SNR, physical reason for the output peak at t = T. Second hour: back to M-ary hypothesis testing with continuous-time signals: receiver structure. With white noise, irrelevance of noise components outside the signal basis. Optimal MAP receiver in AWGN. Basis detector. Signal detector.
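The max-SNR property can be sketched in discrete time (an assumed example, with unit-variance white noise): among all linear filters, sampling the output of the filter matched to the signal maximizes the output SNR, which equals the signal energy.

```python
# Sketch: matched filter vs. a mismatched filter in discrete time.
def snr_gain(template, signal):
    """Output SNR of a correlator with the given template,
    for unit-variance white noise: (template . signal)^2 / ||template||^2."""
    num = sum(t * s for t, s in zip(template, signal)) ** 2
    den = sum(t * t for t in template)
    return num / den

s = [1.0, 2.0, 3.0, 2.0, 1.0]
matched = snr_gain(s, s)              # = ||s||^2 = 19, the maximum (Cauchy-Schwarz)
mismatched = snr_gain([1.0] * 5, s)   # any other template does worse
print(matched, mismatched)
```

The inequality matched >= mismatched is exactly Cauchy-Schwarz, which is the one-line proof of the matched-filter property.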
Examples of MAP RX and evaluation of the symbol error probability Pe. First hour: MAP RX for QPSK signals and its Pe. Second hour: MAP RX for generic binary signals, basis detector, reduced-complexity signal detector. Evaluation of Pe.
First hour: Techniques to evaluate Pe: rotational invariance in AWGN
Part I: Detection
- J. Cioffi, "Signal Processing and Detection", Ch. 1, http://www.stanford.edu/~cioffi
- B. Rimoldi, "Principles of Digital Communications", EPFL, Lausanne, Ch. 1-4.
- A. Lapidoth, "A Foundation in Digital Communication", ETH, Zurich.
- R. Raheli, G. Colavolpe, "Trasmissione numerica", Monte Universita' Parma Ed., Ch. 1-5. In Italian.
Part II: Estimation
- S. M. Kay, "Fundamentals of Statistical Signal Processing", Vol. I (Estimation), Prentice-Hall, 1998.
Classroom teaching, 63 hours. In-class problem solving, 9 hours. Homework assigned weekly.
Assessment methods and criteria
Oral exam only, to be scheduled on an individual basis. When ready, please contact the instructor by email at alberto.bononi[AT]unipr.it, specifying the requested date. The exam consists of solving some proposed exercises and explaining the theoretical details connected with them, for a total time of about one hour. You may bring a summary of important formulas on an A4 sheet to consult if you wish. Some sample exercises can be found on the course website. To get a userid and password, please send an email to alberto.bononi[AT]unipr.it from your account email@example.com.
Monday 11:30-13:30 (Scientific Complex, Building 2, floor 2, Room 2/19T).