Monday, January 14, 2019

Jan 14: Estimation and detection

In the third lecture, we looked more thoroughly at estimation theory, which in our context is closely linked with regression: predicting numerical values (regression) as opposed to predicting classes (classification).

We recapped the main principle: write down the probability of observing exactly the samples x[0], x[1], ..., x[N-1] given a parameter value (e.g., A). The task then reduces to maximizing this likelihood function with respect to the unknown parameter. In most cases the trick is to take the logarithm before differentiating; otherwise the mathematics becomes intractable.
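As a quick illustration (my own hypothetical example, not one from the lecture), consider the textbook model of a DC level A in white Gaussian noise, x[n] = A + w[n]. Maximizing the log-likelihood in closed form gives the sample mean; the sketch below checks this numerically by brute-force maximization over a grid:

```python
import numpy as np

# Hypothetical example: DC level A in white Gaussian noise, x[n] = A + w[n]
rng = np.random.default_rng(0)
A_true, sigma, N = 3.0, 1.0, 200
x = A_true + sigma * rng.standard_normal(N)

def log_likelihood(A, x, sigma):
    # log p(x; A) for i.i.d. Gaussian samples, additive constants dropped
    return -np.sum((x - A) ** 2) / (2 * sigma ** 2)

# Brute-force maximization over a grid vs. the closed-form MLE (sample mean)
grid = np.linspace(0.0, 6.0, 6001)
ll = np.array([log_likelihood(A, x, sigma) for A in grid])
A_grid = grid[np.argmax(ll)]
A_mle = x.mean()
print(A_grid, A_mle)  # both land close to A_true
```

The two estimates agree up to the grid spacing, which is the whole point: taking the logarithm turns the product of Gaussian densities into a quadratic that is easy to differentiate.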

In the second lecture, we saw an example MLE problem solved on the blackboard. The example was from the June 2017 exam, and goes as follows:

In this case, the maximum likelihood estimate for parameter theta is given as theta = 2N / sum(x[n]).
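The exam problem itself is not reproduced here, but one density consistent with this estimate is the exponential p(x; theta) = (theta/2) exp(-theta x / 2) for x >= 0; under that assumption (mine, not stated in the post), setting the derivative of the log-likelihood to zero indeed yields theta = 2N / sum(x[n]). A quick numerical sanity check:

```python
import numpy as np

# Assumed density (the exam sheet is not reproduced in the post):
#   p(x; theta) = (theta/2) * exp(-theta * x / 2),  x >= 0,
# for which the MLE works out to theta_hat = 2N / sum(x[n]).
rng = np.random.default_rng(1)
theta_true, N = 4.0, 100_000
x = rng.exponential(scale=2.0 / theta_true, size=N)  # E[x] = 2/theta

theta_hat = 2 * N / x.sum()
print(theta_hat)  # close to theta_true for large N
```
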

As the last item on parameter estimation, we looked at the German tank problem. In this case, it turns out that the maximum likelihood solution suffers from severe bias: the estimated total number of tanks is simply the largest ID of all tanks encountered so far. On the other hand, we saw that the minimum variance unbiased estimator (MVUE) solution was very close to the true number of tanks produced during the war. The MVUE is outside the scope of this course, but if you are interested, refer to my old SSP course.
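The bias is easy to see in simulation. The standard MVUE for this problem is N_hat = m(1 + 1/k) - 1, where m is the largest serial number observed and k is the sample size; the sketch below (parameters are my own choices) compares it against the ML estimate over repeated draws:

```python
import numpy as np

# German tank problem: N tanks with serial numbers 1..N, we observe k of them
rng = np.random.default_rng(2)
N_true, k, trials = 1000, 10, 5000

mle, mvue = [], []
for _ in range(trials):
    serials = rng.choice(N_true, size=k, replace=False) + 1  # serials 1..N
    m = serials.max()
    mle.append(m)                     # ML estimate: largest serial observed
    mvue.append(m * (1 + 1 / k) - 1)  # standard MVUE: m + m/k - 1

print(np.mean(mle), np.mean(mvue))  # MLE is biased low, MVUE averages ~N_true
```

On average the MLE undershoots by roughly N/(k+1), while the MVUE's small correction term m/k - 1 removes the bias.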

In the second lecture, we looked at detecting a sinusoidal signal (with known frequency) embedded in Gaussian noise. This can be posed as a hypothesis testing problem and solved optimally using a model of the data.
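For the special case of a fully known signal (known frequency, amplitude, and phase) in white Gaussian noise, the optimal Neyman-Pearson statistic is the replica correlator T(x) = sum(x[n] s[n]) compared against a threshold. The sketch below uses parameters of my own choosing to illustrate the idea:

```python
import numpy as np

# Sketch (assumed parameters): detect a fully known sinusoid s[n] in white
# Gaussian noise via the replica correlator T(x) = sum(x[n] * s[n])
rng = np.random.default_rng(3)
N, f0, A, sigma = 256, 0.05, 0.5, 1.0
n = np.arange(N)
s = A * np.cos(2 * np.pi * f0 * n)

def detect(x, s, threshold):
    # Decide "signal present" when the correlation exceeds the threshold
    return x @ s > threshold

# Under H0 (noise only), T is Gaussian with variance sigma^2 * E_s,
# so a 3-sigma threshold gives a low false alarm rate
E_s = s @ s
threshold = 3 * sigma * np.sqrt(E_s)

x_h1 = s + sigma * rng.standard_normal(N)  # signal present
x_h0 = sigma * rng.standard_normal(N)      # noise only
print(detect(x_h1, s, threshold), detect(x_h0, s, threshold))
```

When only the frequency is known (unknown amplitude and phase, as in the lecture's setting), the detector instead correlates against both a sine and a cosine and sums the squares, but the correlator above is the simplest case of the same idea.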
