Sunday, January 13, 2019

Jan 10: Estimation theory

Today we studied estimation theory (a recap of least squares and a first look at maximum likelihood).

The least squares part is assumed familiar from earlier courses and was discussed only through an example: predicting house prices in the Hervanta region. There is a database from which one can scrape a set of realized house prices together with the attributes of each apartment. The example is available as a Jupyter notebook. There is also a recent M.Sc. thesis on implementing house price prediction as a web service (on etuovi.com).
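The notebook itself is not reproduced here, but the idea can be sketched in a few lines. The snippet below uses synthetic data standing in for the scraped listings; the features (floor area, construction year) and their pricing coefficients are made up for illustration, not taken from the actual Hervanta data.

```python
import numpy as np

# Synthetic stand-in for the scraped apartment listings.
rng = np.random.default_rng(0)
n_apartments = 200
area = rng.uniform(25, 100, n_apartments)       # floor area in m^2 (illustrative)
year = rng.integers(1970, 2019, n_apartments)   # construction year (illustrative)

# A made-up "true" pricing rule plus noise, so we can check the recovered fit.
price = 2500 * area + 300 * (year - 1970) + rng.normal(0, 10000, n_apartments)

# Design matrix with an intercept column; solve min_w ||X w - price||^2.
X = np.column_stack([np.ones(n_apartments), area, year - 1970])
w, *_ = np.linalg.lstsq(X, price, rcond=None)

predicted = X @ w  # least-squares price predictions for the same apartments
print(w)           # recovered coefficients, close to [0, 2500, 300]
```

In the real service, the same solve is run on the scraped attribute matrix, and new listings are priced with the fitted coefficients.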

Next, we looked at maximum likelihood estimation. The main principle is to choose first a model for the data, e.g., a noisy sinusoid: 

x[n] = A * sin(2*pi*f*n + phi) + w[n]

Next, we write down the probability of observing exactly the samples x[0], x[1], ..., x[N-1] given a parameter value (e.g., A). The task is then simply to maximize this likelihood function with respect to the unknown parameter (e.g., A). In most cases the trick is to take the logarithm before differentiating; otherwise the mathematics becomes too hard to solve.
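For the sinusoid model above this can be carried out concretely. Assuming Gaussian noise w[n] and that f and phi are known (the parameter values below are illustrative, not from the lecture), the log-likelihood in A is a downward parabola, so setting its derivative to zero gives a closed-form estimate:

```python
import numpy as np

# ML estimation of the amplitude A of a noisy sinusoid
# x[n] = A*sin(2*pi*f*n + phi) + w[n], with Gaussian noise.
rng = np.random.default_rng(1)
N = 500
f, phi, A_true, sigma = 0.05, 0.3, 2.0, 0.5    # illustrative values

n = np.arange(N)
s = np.sin(2 * np.pi * f * n + phi)            # known signal shape
x = A_true * s + rng.normal(0, sigma, N)       # observed samples

# Log-likelihood (up to a constant): -sum((x - A*s)^2) / (2*sigma^2).
# d/dA = 0 gives the closed-form maximum likelihood estimate:
A_hat = np.sum(x * s) / np.sum(s * s)
print(A_hat)  # close to A_true = 2.0
```

The same recipe works for other parameters, though e.g. the frequency f enters the likelihood nonlinearly and generally needs a numerical search rather than a closed form.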

At the end of the week it turned out that the Friday exercise sessions are overcrowded. We will monitor attendance more strictly starting this week. Please go to the groups for which you have signed up.
