In this course, we are going to follow a nice O'Reilly data science manual and, line by line, learn the meaning of terms like "feature", "multi-class classification", "training" and "cross validation" and, while doing so, acquire all the necessary prerequisites of "the sexiest job of the 22nd century".
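For a first taste of those terms, here is a tiny hedged sketch in Python with scikit-learn (the dataset and classifier below are just an illustrative assumption, not necessarily the tools we will use in the course):

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)                  # X: four "features" per flower; y: one of three species -> multi-class classification
classifier = KNeighborsClassifier()                # the classifier gets "trained" on part of the data in each fold
scores = cross_val_score(classifier, X, y, cv=5)   # 5-fold "cross validation": five different train/test splits
print(scores.mean())                               # mean accuracy over the five folds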
We start this Friday (24th April) at 10:00 am
it is not about neural networks *
it is not about "regression"
* well, it will also be about neural networks, but just a little bit ;)
about learning
about classification
about the main machine learning concepts: true positive / true negative / false positive / false negative, feature, feature extraction, classifier, accuracy, etc.
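And a second little sketch (plain Python, with made-up labels, purely for illustration) of what the four confusion-matrix counts and "accuracy" mean for a binary classifier:

true_labels      = [1, 1, 0, 0, 1, 0, 1, 0]   # what the examples really are
predicted_labels = [1, 0, 0, 1, 1, 0, 1, 0]   # what the classifier says

tp = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 1 and p == 1)  # true positives
tn = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 0 and p == 0)  # true negatives
fp = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(true_labels, predicted_labels) if t == 1 and p == 0)  # false negatives

accuracy = (tp + tn) / (tp + tn + fp + fn)    # fraction of correct predictions
print(tp, tn, fp, fn, accuracy)               # 3 3 1 1 0.75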
Coded in vim (front-end: D3.js; back-end: kastalia.medienhaus) by Prof. Daniel D. Hromada (UdK / ECDF).