Schedule: W 4:30-7:10 pm, University Hall 1203
Instructor: Igor Griva, email@example.com, (703) 993-4511
Office hours: W 7:30-8:30 pm, Exploratory Hall, Rm 4114
Prerequisite: Permission of instructor. Students are expected to be familiar with the basics of calculus, linear algebra, probability theory, and statistics, and to have an understanding of basic programming principles and skills.
Text: Tom M. Mitchell, “Machine Learning,” McGraw-Hill, 1997
Exams: There is one midterm exam and one final exam:
Midterm Exam: October 25 (points 0-100)
Final Exam: December 13 (points 0-100)
Final score: F = 0.3*(Midterm) + 0.4*(Homework / Projects) + 0.3*(Final Exam)
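The weighting above can be sketched as a small calculation (the function name and the sample scores are illustrative, not part of the syllabus):

```python
def final_score(midterm, homework, final_exam):
    """Weighted final score per the syllabus formula.

    Each argument is a 0-100 score; the weights sum to 1.0,
    so the result is also on a 0-100 scale.
    """
    return 0.3 * midterm + 0.4 * homework + 0.3 * final_exam

# Hypothetical example: midterm 80, homework/projects 90, final exam 70
print(final_score(80, 90, 70))
```

Note that homework and projects together carry the largest single weight (0.4), slightly more than either exam.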
The course surveys algorithms that enable computers to learn a concept or automatically improve their performance on some task with experience. The main goal of this class is to familiarize students with the basic concepts and algorithms of computational learning. Students who complete this course should be able to identify problems where computational learning algorithms can be useful and to apply these algorithms to find solutions.
We discuss the following topics: parametric/non-parametric learning, decision tree learning, neural networks, Bayesian learning, instance-based learning, bias/variance tradeoffs, Vapnik-Chervonenkis theory, support vector machines, and reinforcement learning. The class provides the necessary background by introducing basic concepts from statistics, optimization, and information theory relevant to computational learning. Some popular real-world applications of computational learning algorithms are also discussed.