Sergey Nikolenko
Teaching activities
Machine Learning for CS Club: Kazan 2014
This is a short introduction to machine learning presented as part of the
Computer Science Club program
in Kazan; see also the course page.
The course itself (all slides and lecture notes are in Russian):
- 1. Introduction. History of AI. Overview of different problem settings in machine learning.
Probability theory basics. Bayes' theorem and maximum a posteriori hypotheses.
- Slides ()
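The Bayes' theorem and MAP material above can be sketched on a toy discrete example; the hypotheses, priors, and data here are illustrative, not taken from the slides:

```python
# Toy example: which of two coins (fair or biased) produced an observed flip sequence?
# Bayes' theorem: P(h | D) is proportional to P(D | h) * P(h);
# the MAP hypothesis maximizes this product.

def likelihood(p_heads, data):
    """P(D | h) for i.i.d. coin flips; data is a string of 'H'/'T'."""
    prob = 1.0
    for flip in data:
        prob *= p_heads if flip == 'H' else (1.0 - p_heads)
    return prob

# Hypotheses: name -> (probability of heads, prior).  Illustrative values.
hypotheses = {'fair': (0.5, 0.7), 'biased': (0.9, 0.3)}
data = 'HHHTHHHH'

# Unnormalized posteriors P(D | h) * P(h), then normalize.
posterior = {h: likelihood(p, data) * prior for h, (p, prior) in hypotheses.items()}
total = sum(posterior.values())
posterior = {h: v / total for h, v in posterior.items()}

map_hypothesis = max(posterior, key=posterior.get)
print(map_hypothesis, posterior[map_hypothesis])  # 'biased' wins despite its lower prior
```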
- 2. Sample application of Bayesian ideas: Laplace's rule of succession. Priors. Conjugate priors.
Beta distribution as a conjugate prior for Bernoulli trials. Parametric and nonparametric models: nearest neighbors.
Curse of dimensionality.
- Slides (.pdf, 714kb)
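The Beta-Bernoulli conjugacy and Laplace's rule of succession from this lecture fit in a few lines; the observed counts below are made up for illustration:

```python
# Beta(a, b) is conjugate to Bernoulli: after observing `heads` successes and
# `tails` failures, the posterior is simply Beta(a + heads, b + tails).
# With a uniform prior Beta(1, 1), the posterior predictive probability of the
# next success is (heads + 1) / (heads + tails + 2): Laplace's rule of succession.

def posterior_params(a, b, heads, tails):
    """Conjugate update: prior Beta(a, b) plus Bernoulli observations."""
    return a + heads, b + tails

def predictive_prob(a, b):
    """Posterior predictive P(next flip = heads) = posterior mean a / (a + b)."""
    return a / (a + b)

a, b = posterior_params(1, 1, heads=3, tails=1)   # uniform prior, 3 H / 1 T observed
print(a, b)                    # Beta(4, 2)
print(predictive_prob(a, b))   # (3 + 1) / (3 + 1 + 2) = 2/3
```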
- 3. Linear regression. Least squares, polynomial curve fitting. Overfitting: ridge regression. Ridge regression as Gaussian priors.
Other kinds of regularizers: lasso regression. Linear classification: logistic regression.
- Slides (.pdf, 1673kb)
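A minimal sketch of ridge regression for one feature, solved in closed form via the normal equations; the data points are illustrative (for simplicity the intercept is penalized too, whereas implementations often leave it unpenalized):

```python
# Ridge regression for a 1-D feature with intercept: minimize
#   sum_i (y_i - w0 - w1 * x_i)^2 + lam * (w0^2 + w1^2),
# solved via the 2x2 normal equations (X^T X + lam * I) w = X^T y,
# where the design matrix X has columns [1, x].

def ridge_fit(xs, ys, lam):
    n = len(xs)
    # Entries of X^T X + lam * I.
    a = n + lam                           # top-left: sum of 1*1, plus regularizer
    b = sum(xs)                           # off-diagonal: sum of 1*x
    d = sum(x * x for x in xs) + lam      # bottom-right: sum of x*x, plus regularizer
    # Right-hand side X^T y.
    r0 = sum(ys)
    r1 = sum(x * y for x, y in zip(xs, ys))
    # Solve the 2x2 system by Cramer's rule.
    det = a * d - b * b
    w0 = (d * r0 - b * r1) / det
    w1 = (a * r1 - b * r0) / det
    return w0, w1

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]                 # exactly y = 1 + 2x
w0, w1 = ridge_fit(xs, ys, lam=0.0)       # lam = 0 reduces to ordinary least squares
print(w0, w1)                             # recovers 1.0 and 2.0
```

Increasing `lam` shrinks the recovered slope below 2, which is exactly the regularization effect the lecture attributes to the Gaussian prior interpretation.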
- 4. Graphical models. Directed graphical models, d-separation. Undirected graphical models. Factor graphs.
Message passing: sum-product on a chain, sum-product on a general tree. Overview of approximate inference algorithms.
- Slides (.pdf, 1504kb)
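Sum-product on a chain, as covered above, can be sketched for three binary variables; the factor values are illustrative, and a brute-force enumeration checks the messages:

```python
from itertools import product

# Sum-product on a chain x1 - x2 - x3 of binary variables with unary factors
# phi_i(x_i) and a shared pairwise factor psi(a, b).  Forward messages alpha
# and backward messages beta give every marginal in O(n) time instead of
# enumerating all 2^n configurations.

phi = [[1.0, 2.0], [1.0, 1.0], [3.0, 1.0]]   # unary factors phi_i(x_i)
psi = [[2.0, 1.0], [1.0, 2.0]]               # pairwise factor psi(a, b)
n = len(phi)

# Forward pass: alpha[i][b] = sum_a alpha[i-1][a] * phi[i-1][a] * psi[a][b].
alpha = [[1.0, 1.0]]
for i in range(1, n):
    alpha.append([sum(alpha[-1][a] * phi[i - 1][a] * psi[a][b] for a in range(2))
                  for b in range(2)])

# Backward pass: beta[i][a] = sum_b psi[a][b] * phi[i+1][b] * beta[i+1][b].
beta = [[1.0, 1.0] for _ in range(n)]
for i in range(n - 2, -1, -1):
    beta[i] = [sum(psi[a][b] * phi[i + 1][b] * beta[i + 1][b] for b in range(2))
               for a in range(2)]

def marginal(i):
    """Marginal of x_i, proportional to alpha[i] * phi[i] * beta[i]."""
    unnorm = [alpha[i][v] * phi[i][v] * beta[i][v] for v in range(2)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def brute(i):
    """Same marginal by summing over all 2^n configurations."""
    scores = [0.0, 0.0]
    for xs in product(range(2), repeat=n):
        w = 1.0
        for j in range(n):
            w *= phi[j][xs[j]]
        for j in range(n - 1):
            w *= psi[xs[j]][xs[j + 1]]
        scores[xs[i]] += w
    z = sum(scores)
    return [s / z for s in scores]

print(marginal(1))  # agrees with brute(1)
```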
- 5. The Expectation-Maximization algorithm: mixture of Gaussians for clustering, general case. Hidden Markov models: definitions, the three problems, the Baum–Welch algorithm.
- Slides (.pdf, 710kb)
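The EM algorithm for a mixture of Gaussians can be sketched in one dimension with two components; the data and initial means below are illustrative:

```python
import math

# EM for a two-component 1-D Gaussian mixture.
# E-step: responsibilities r_ik = pi_k * N(x_i | mu_k, var_k) / (normalizer);
# M-step: re-estimate weights, means, and variances from the soft counts.

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(data, mu, iters=50):
    pi = [0.5, 0.5]
    var = [1.0, 1.0]
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            w = [pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)]
            z = sum(w)
            resp.append([wk / z for wk in w])
        # M-step: update parameters from the responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)   # guard against degenerate components
    return pi, mu, var

data = [0.1, -0.2, 0.0, 0.3, 4.9, 5.2, 5.0, 4.8]   # two well-separated clusters
pi, mu, var = em_gmm(data, mu=[0.5, 4.0])
print(sorted(mu))    # means near 0 and 5
```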
- 6. Sample applications of probabilistic modeling: text categorization (naive Bayes), topic modeling (LDA), recommender systems (nearest neighbors and SVD), Bayesian rating systems (TrueSkill).
- Slides (.pdf, 3391kb)
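The naive Bayes text categorization mentioned above fits in a short multinomial classifier with Laplace smoothing; the training corpus and labels are a made-up toy:

```python
import math
from collections import Counter

# Multinomial naive Bayes with add-one (Laplace) smoothing on a toy corpus.
train = [
    ("spam", "buy cheap pills now"),
    ("spam", "cheap offer buy now"),
    ("ham",  "meeting schedule for tomorrow"),
    ("ham",  "project meeting notes"),
]

# Class frequencies, per-class word frequencies, and the vocabulary.
class_counts = Counter(label for label, _ in train)
word_counts = {c: Counter() for c in class_counts}
for label, text in train:
    word_counts[label].update(text.split())
vocab = {w for _, text in train for w in text.split()}

def classify(text):
    scores = {}
    for c in class_counts:
        # log P(c) + sum over words of log P(w | c), smoothed so that unseen
        # words get a small nonzero probability instead of zeroing the product.
        score = math.log(class_counts[c] / len(train))
        total = sum(word_counts[c].values())
        for w in text.split():
            score += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        scores[c] = score
    return max(scores, key=scores.get)

print(classify("cheap pills offer"))      # -> spam on this toy corpus
print(classify("notes for the meeting"))  # -> ham
```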
Selected references:
- Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, Information Science and Statistics series, 2006.
- Kevin Murphy. Machine Learning: A Probabilistic Perspective. MIT Press, 2012.
- David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge University Press, 2003.