# DETECTION AND ESTIMATION

## Learning outcomes of the course unit

Provide an introduction to the theory of Detection and Estimation, with applications mainly in the area of digital communications.

## Prerequisites

Entry-level courses in probability theory and in Fourier analysis of stochastic processes, such as those normally offered in the corresponding three-year Laurea programme, are necessary prerequisites for this course.

## Course contents summary

Detection and Estimation

## Course contents

CLASS 1:

First hour: course organization, objectives, textbooks, exam details. Sneak preview of the course, motivations, applications. Second hour: refresher of basic probability theory: total probability, Bayes' rule in discrete/continuous/mixed versions, double conditioning. A first elementary exercise on binary hypothesis testing.
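A toy numerical version of such an elementary binary test can be sketched as follows; the two unit-variance Gaussian hypotheses and their priors are illustrative assumptions, not the course's exact exercise:

```python
import math

# Illustrative binary hypothesis test (hypothetical example):
# H0: X ~ N(0, 1), H1: X ~ N(1, 1), with priors p0 and p1.
# MAP rule (Bayes with uniform costs): decide H1 iff p1*f1(x) >= p0*f0(x).

def gauss_pdf(x, mu, sigma=1.0):
    """Gaussian density with mean mu and standard deviation sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def map_decide(x, p0=0.5, p1=0.5):
    """Return 1 for H1, 0 for H0, by comparing posterior-proportional terms."""
    return 1 if p1 * gauss_pdf(x, 1.0) >= p0 * gauss_pdf(x, 0.0) else 0
```

With equal priors the rule reduces to thresholding the observation at the midpoint 0.5 between the two means; unequal priors shift the threshold toward the less likely hypothesis.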

CLASS 2:

First hour: completion of the proposed exercise. Second hour: Bayes tests.

CLASS 3:

First hour: exercise on the Bayes test (Laplacian distributions). Second hour: Minimax test.

CLASS 4:

First hour: exercise on Minimax. Second hour: Neyman-Pearson test with example.

CLASS 5:

First hour: ROC properties. NP test with discrete RVs: randomization. Second hour: exercise on Bayes, Minimax, Neyman-Pearson tests.

CLASS 6:

First hour: Multiple hypothesis testing, Bayesian approach. MAP and ML tests. Decision regions, boundaries among regions: examples in R^1 and R^2. Second hour: exercise: 3 equally likely signal "hypotheses" -A, 0, A in AWGN: Bayes rule (ML) based on the sample mean (sufficient statistic).
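The CLASS 6 exercise can be sketched numerically: with equal priors the MAP rule reduces to ML, the sample mean is a sufficient statistic, and the decision picks the level nearest to it. The code below is a minimal sketch under those assumptions:

```python
# Three equally likely levels -A, 0, A observed in AWGN (illustrative sketch).
# With equal priors, MAP = ML, and the ML decision with Gaussian noise
# reduces to choosing the level closest to the sample mean.

def ml_decide(samples, A=1.0):
    """Return the level in {-A, 0, A} nearest to the sample mean."""
    m = sum(samples) / len(samples)        # sample mean: sufficient statistic
    return min((-A, 0.0, A), key=lambda level: abs(m - level))
```

The decision regions on the sample-mean axis are the intervals separated by the midpoints -A/2 and A/2, matching the R^1 boundary discussion of the first hour.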

CLASS 7:

First hour: Minimax in multiple hypotheses. Sufficient statistics: introduction. Second hour: Factorization theorem, irrelevance theorem. Reversibility theorem. Gaussian vectors refresher: joint PDF, MGF/CF.

CLASS 8:

First hour: Summary of known main results on Gaussian random vectors: Gaussian MGF, 4th order statistics from moment theorem, MGF-based proof of Gaussianity of linear transformations. Examples of Gaussian vectors: Fading Channel. Second hour: A: MAP Test with Gaussian signals. B: Additive Gaussian noise channel. Decision regions are hyperplanes.

CLASS 9:

First hour: examples of decision regions. Optimal detection of continuous-time signals: motivation for their discrete representation. Second hour: Discrete signal representation: definitions. Inner product, norm, distance, linear independence. Orthonormal bases and signal coordinates.

CLASS 10:

Gram-Schmidt orthonormalization. Detailed example. Operations on signals, and dual operations on signal images.
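A compact numerical sketch of Gram-Schmidt on vectors in R^n (the discrete analogue of the signal-space procedure, where inner products would be integrals):

```python
# Classical Gram-Schmidt orthonormalization of a list of real vectors.
# Linearly dependent inputs produce a (near-)zero residual and are skipped,
# so the output is an orthonormal basis of the span.

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            # subtract the projection of v onto the already-found basis vector
            dot = sum(vi * bi for vi, bi in zip(v, b))
            w = [wi - dot * bi for wi, bi in zip(w, b)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > 1e-12:                   # skip linearly dependent vectors
            basis.append([wi / norm for wi in w])
    return basis
```

Each pass subtracts projections onto the basis built so far and normalizes the residual, which is exactly the "repeated projections" interpretation discussed in CLASS 11.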

CLASS 11:

Unitary matrices in change of basis. Orthogonal matrices: rotations and reflections. Orthogonality principle. Projection theorem. Interpretation of the Gram-Schmidt procedure as repeated projections. Complete ON bases: motivations and definition.

CLASS 12:

First hour: exercises: 1. the product of unitary matrices is unitary; 2. a unitary matrix preserves the norm of a vector. Projection matrices, eigenvectors, eigenvalues, spectral decomposition. Properties. Second hour: examples of complete bases in L2: the space of band-limited functions, evaluation of series coefficients, sampling theorem, ON check. More examples of complete bases: Legendre, Hermite, Laguerre.
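The two CLASS 12 exercises can be checked numerically in the real (orthogonal) special case of unitary matrices, using 2x2 rotations as a hypothetical test case:

```python
import math

# Real orthogonal matrices (the real case of unitary): a rotation preserves
# norms, and the product of two rotations is again orthogonal.

def rot(theta):
    """2x2 rotation matrix by angle theta."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def norm(v):
    return math.hypot(v[0], v[1])
```

Checking that the product P of two rotations satisfies P^T P = I, and that ||Qv|| = ||v||, reproduces exercises 1 and 2 numerically.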

CLASS 13:

Discrete representation of a stochastic process. Mean and covariance of process coefficients. Properties of covariance matrices for finite random vectors: Hermitian symmetry and related properties. Whitening. Karhunen-Loève (KL) theorem for whitening of the discrete process representation (hint of proof). Statement of Mercer's theorem. KL bases.

CLASS 14:

Summary of useful matrices: normal matrices and their subclasses: unitary, Hermitian, skew-Hermitian. If the noise process is white, any complete ON basis is a KL basis. Digital modulation. Example: QPSK. Digital demodulation with a correlator bank or a matched-filter bank.

CLASS 15:

First hour: matched filter properties. Maximum SNR, physical reason for the peak at T. Second hour: back to M-ary hypothesis testing with continuous-time signals: receiver structure. With white noise, irrelevance of noise components outside the signal basis. Optimal MAP receiver in AWGN. Basis detector. Signal detector.
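The peak-at-T property of the matched filter can be checked in discrete time, where the matched filter for s[n], n = 0..N-1, is h[n] = s[N-1-n]; the short test signal below is an illustrative assumption:

```python
# Discrete-time matched-filter sketch: the convolution y = h * s with
# h[n] = s[N-1-n] peaks at n = N-1, where it equals the signal energy.
# With white noise this sampling instant maximizes the output SNR.

def convolve(a, b):
    """Full linear convolution of two finite sequences."""
    n = len(a) + len(b) - 1
    return [sum(a[k] * b[i - k] for k in range(len(a)) if 0 <= i - k < len(b))
            for i in range(n)]

def matched_filter_response(s):
    h = s[::-1]                            # matched filter: time-reversed signal
    return convolve(h, s)
```

By the Cauchy-Schwarz inequality the autocorrelation of s is largest at zero lag, which is why the output peaks exactly at sample N-1 (the discrete analogue of t = T) with value equal to the signal energy.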

CLASS 16:

Examples of MAP RX and evaluation of the symbol error probability Pe. First hour: MAP RX for QPSK signals and its Pe. Second hour: MAP RX for generic binary signals, basis detector, reduced-complexity signal detector. Evaluation of Pe.

CLASS 17:

First hour: Techniques to evaluate Pe: rotational invariance in AWGN and signal image shifts.

## Recommended readings

Part I: Detection

[1] J. Cioffi, "Signal Processing and Detection", Ch. 1, http://www.stanford.edu/~cioffi

[2] B. Rimoldi, "Principles of Digital Communications", EPFL, Lausanne. Ch. 1-4.

[3] A. Lapidoth, "A Foundation in Digital Communication" ETH, Zurich.

[4] R. Raheli, G. Colavolpe, "Trasmissione numerica", Monte Universita' Parma Ed., Ch. 1-5. In Italian.

Part II: Estimation

[5] S. M. Kay, "Fundamentals of statistical signal processing", Vol.I (estimation), Prentice-Hall, 1998.

## Teaching methods

Classroom teaching. Homeworks.

## Assessment methods and criteria

Oral only, to be scheduled on an individual basis. When ready, please contact the instructor by email at alberto.bononi[AT]unipr.it, specifying the requested date. The exam consists of solving some proposed exercises and explaining theoretical details connected with them, for a total time of about one hour. You may bring a summary of important formulas on an A4 sheet to consult if you wish. Some sample exercises can be found here. To get a userid and password, please send an email to alberto.bononi[AT]unipr.it from your account nome@studenti.unipr.it.

## Other informations

Office Hours

Monday 11:30-13:30 (Scientific Complex, Building 2, floor 2, Room 2/19T).