AACIMP 2012 - Introduction to Machine Learning
Konstantin Tretyakov
University of Tartu,
BIIT Research Group,
STACC
August 2012
Tutorials
- Tutorial materials: (zip)
In order to use the tutorial materials you need a Windows-based computer with Java installed. Simply download the zip archive and unpack it into a directory of your choice.
Remarks:
- If you are only interested in the Python-based part of the tutorial, you do not need Java.
- The limitation to Windows-based systems applies only to this particular software compilation (the package prepared for use in the KPI computer class). All the software used in the tutorials is otherwise cross-platform and may be installed on Linux or Mac as well. You'll need to install Python 2.7 along with the appropriate packages: at the very least, NumPy, SciPy, Matplotlib, IPython, Scikits-learn and their dependencies. A quick snippet for checking such a manual installation follows below.
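The following check is a minimal sketch (not part of the packaged materials); it assumes only the standard import names of the packages listed above - in particular, Scikits-learn is imported as sklearn. Run it in a Python 2.7 session:

```python
# If no ImportError is raised, the scientific stack needed
# for the tutorials is installed.
import numpy
import scipy
import matplotlib
import IPython
import sklearn    # Scikits-learn is imported under the name "sklearn"

for module in (numpy, scipy, matplotlib, IPython, sklearn):
    print module.__name__, module.__version__
```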
Lectures
"Homework"
There are some exercises I recommend practicing after the lectures. They are not compulsory, but will certainly deepen your understanding. Here is a non-exhaustive list.
- Tutorial sessions: All tutorial sessions are packed with more Notebook-style exercises than will fit into their timeframe. You are welcome to complete them all.
- Weka and RapidMiner: The Software folder used in the practice sessions contains, among other things, preinstalled versions of the Weka and RapidMiner GUI tools. Explore those packages on your own: for Weka, follow any of the tutorials available on the web; for RapidMiner, start with a Quick Tour video.
- Bonus: Stock price prediction: The "Google stock price prediction" exercise from the first tutorial session deserves special mention: a special prize will be awarded to the author of the best solution - a unique self-learning checkers-playing computer.
- Math exercises from Day 2: The second lecture (Optimization and Linear Regression) had some simple optimization exercises for you to practice, namely:
- Find the minimum of the objective function f(w) = Σᵢ ||xᵢ - w||². Derive an analytic, an iterative, and a stochastic gradient solution (a gradient-descent sketch follows this list).
- Find the minimum of the OLS objective function f(w) = ||Xw - y||². Derive an analytic and a stochastic gradient solution.
- Bonus (prize for the first to solve it: a chocolate bar): derive an analytical representation of the full set of solutions to the above problem.
- Derive an SGD algorithm for Ridge Regression.
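As a starting point for the first of these exercises, here is a minimal gradient-descent sketch (not from the lecture materials; the toy data, step size and iteration count are arbitrary illustration choices). It minimizes f(w) = Σᵢ ||xᵢ - w||² iteratively and compares the result against the analytic solution:

```python
# Gradient descent on f(w) = sum_i ||x_i - w||^2.
# The gradient is grad f(w) = -2 * sum_i (x_i - w).
import numpy as np

np.random.seed(0)
X = np.random.randn(100, 2)       # 100 toy data points x_i in R^2

w = np.zeros(2)                   # initial guess
eta = 0.001                       # step size (an arbitrary choice)
for _ in range(200):
    grad = -2 * (X - w).sum(axis=0)
    w = w - eta * grad

print w                           # the gradient descent result ...
print X.mean(axis=0)              # ... matches the analytic solution (the mean)
```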
- Math exercises from Day 3:
- Derive the maximum likelihood (ML) estimate for the Bernoulli distribution parameter p.
- Derive the MAP estimate for the Bernoulli distribution parameter p, if the prior is a Beta(2,2) distribution (a numerical check of both answers follows below).
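Once you have worked these two out on paper, you can check the closed-form answers numerically. The sketch below (toy numbers; the grid search is only a verification device, not the derivation itself) maximizes the Bernoulli log-likelihood and the Beta(2,2)-weighted log-posterior over a grid of p values:

```python
# Numerical check of the ML and MAP estimates for a Bernoulli parameter p,
# given n trials with k successes and (for MAP) a Beta(2,2) prior.
import numpy as np

n, k = 20, 7                              # toy data: 7 successes in 20 trials
p = np.linspace(0.0001, 0.9999, 99999)    # a fine grid over (0, 1)

log_lik = k * np.log(p) + (n - k) * np.log(1 - p)
log_prior = np.log(p) + np.log(1 - p)     # log Beta(2,2) density, up to a constant

print p[np.argmax(log_lik)]               # ML estimate: should equal k/n
print p[np.argmax(log_lik + log_prior)]   # MAP estimate: should equal (k+1)/(n+2)
```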
- Math exercises from Day 4:
- Prove that for a linear classifier, ||w|| = 1/m (where m denotes the margin).
- The Perceptron algorithm is just a stochastic gradient optimization method for some objective function. What is this objective function?
- Show that the matrix XXᵀ is positive semidefinite (see the numerical illustration below).
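For the last of these, the following numerical illustration (a sketch, not a substitute for the proof) probes the identity the exercise hinges on: for any vector v, vᵀ(XXᵀ)v = ||Xᵀv||², which can never be negative:

```python
# Random probing of the quadratic form v' (X X^T) v: it always equals
# ||X^T v||^2 and is therefore never negative.
import numpy as np

np.random.seed(1)
X = np.random.randn(5, 3)
G = X.dot(X.T)                                  # the matrix X X^T

for _ in range(1000):
    v = np.random.randn(5)
    q = v.dot(G).dot(v)
    assert abs(q - np.linalg.norm(X.T.dot(v)) ** 2) < 1e-9
    assert q >= -1e-12

print np.linalg.eigvalsh(G)                     # eigenvalues are >= 0 (up to round-off)
```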
- Math exercises from Day 5:
- Use the method of Lagrange multipliers to derive an analytical solution to the PCA objective (a numerical check follows below).
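To check your derivation: the Lagrange-multiplier argument should lead to an eigenvector problem. The sketch below (assuming the variance-maximization form of the PCA objective, i.e. maximize wᵀCw subject to ||w|| = 1) compares the top eigenvector of the covariance matrix with a brute-force search over unit vectors in 2-D:

```python
# The analytic PCA solution (top eigenvector of the covariance matrix)
# versus a brute-force maximization of w' C w over unit vectors w.
import numpy as np

np.random.seed(2)
X = np.random.randn(500, 2).dot([[3.0, 0.0], [1.0, 1.0]])   # correlated toy data
Xc = X - X.mean(axis=0)                                     # center the data
C = Xc.T.dot(Xc) / len(Xc)                                  # covariance matrix

vals, vecs = np.linalg.eigh(C)
print vecs[:, np.argmax(vals)]            # eigenvector with the largest eigenvalue

angles = np.linspace(0, np.pi, 10000)     # all unit vectors in 2-D, up to sign
ws = np.column_stack([np.cos(angles), np.sin(angles)])
variances = (ws.dot(C) * ws).sum(axis=1)  # w' C w for each candidate w
print ws[np.argmax(variances)]            # matches the eigenvector, up to sign
```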