Announcements (last updated 7th September 2006)

Course summary

Course announcement: Please see the course posting/announcement for an overview of the course content.
Course description: Teaching computer programs to improve their performance through guided training and unguided experience. Symbolic and numerical approaches. Concept learning, decision trees, neural nets, latent variable models, probabilistic inference, time series models, Bayesian learning, sampling methods, computational learning theory, support vector machines.
Course homepage: http://www.cse.buffalo.edu/faculty/mbeal/cse574
Location / Time: 4 Knox, 12:30-1:50pm, T R (Fall'05 schedule)
(first lecture Aug 29, last lecture Dec 7, no lectures Sep 19, Nov 23)
Instructor: Dr. Matthew J. Beal
Contact: Office: 210 Bell Hall, Phone: (716) 645 3180 x154, Email: mbeal [at] cse.buffalo.edu
TA: Harish Srinivasan (email: hs32 [at] cse.buffalo.edu)
Office hours:
(subject to change)
Matthew Beal: T/R: 9-10:30, Bell 210
Harish Srinivasan: M 12-1:30, W 3-4:30, Cedar-Room 3
Credits: 3.00 G / 4.00 UG
Recitations:
M: 11-11:50, 17 Clemens
W: 2-2:50, 146 Park
The recitations run in Monday-Wednesday pairs and review the previous week's material. They are not compulsory, but attendance is strongly advised: the example exercises you work through with the TA are representative of the questions that will appear in the assignments, the mid-term, and the final.
Prerequisites: 1. CSE 250 and any of EAS 305/308, STA 401/421, MTH 309; or permission of instructor.
2. A solid background in calculus and linear algebra.
3. Basic knowledge of probability will be assumed.
4. If you are not comfortable with 90% of the material in this crib sheet [pdf][ps], please do not take the class. You will not have time to "learn/relearn" this material once you start.
Computing: Some examples and homework will require Matlab, a straightforward but powerful programming language. You must therefore either already know Matlab or Octave, be taking a course in one of them, or be willing to learn Matlab (we will make use of this primer [pdf]). A small illustrative snippet is given below.
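To give a rough sense of the level of Matlab/Octave fluency the assignments call for, here is a minimal sketch (the data and variable names are made up for illustration and are not taken from the primer or the assignments) that fits a least-squares line to a handful of points:

    % Fit a least-squares line y = w(1)*x + w(2) to toy data.
    % (Illustrative only: data and variable names are not from the course.)
    x = [1; 2; 3; 4; 5];            % inputs, as a column vector
    y = [1.2; 1.9; 3.2; 3.8; 5.1];  % noisy targets
    X = [x, ones(size(x))];         % design matrix with a constant (bias) column
    w = X \ y;                      % least-squares weights via the backslash operator
    yhat = X * w;                   % fitted values
    mse = mean((y - yhat).^2);      % mean squared training error

If you can read and modify a snippet like this without much difficulty, you are at roughly the right level; if not, plan to work through the primer early in the semester.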
Required texts: T. M. Mitchell (1997) Machine Learning, McGraw Hill.
D. J. C. MacKay (2003) Information Theory, Inference, and Learning Algorithms, Cambridge University Press (I highly recommend this book; it is also very cheap and available free for online viewing).
Optional text: T. Hastie, R. Tibshirani, J. Friedman (2003) The Elements of Statistical Learning, Springer.
Grading:
(subject to change)
35% Homework Assignments (best 5 out of 6, worth 7% each)
15% Mid-Term Exam
20% Projects (10% each)
30% Final Exam (will also include topics covered before the Mid-Term Exam)
Handing in work: Homeworks and Projects are due at the beginning of the lecture on the due date.
Late homeworks will receive a 10% penalty immediately, and a further 10% for each weekday late, regardless of excuse. No credit will be awarded once the assignment solutions have been discussed in the recitations or lectures following the due date.
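For example, assuming the penalties accumulate additively, a homework handed in two weekdays after the due date would lose 10% immediately plus a further 2 x 10%, i.e. 30% of the marks in total.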
Auditing: The official department policy is that auditing a course is not allowed. However, you are welcome to sit in on my lectures (if there is space), provided you do not draw on the university's resources in any way. That includes asking me or the TAs questions related to the course material (whether during or after lectures or recitations), attending examinations, expecting assignments to be graded, and using the department's or university's computing facilities to tackle the assignments.
Academic Honesty: Group study and discussion are encouraged, but project and homework assignments must be your own work. For coding assignments, if you use a piece of code that you borrowed from elsewhere and did not write yourself, make sure you mark it with a comment saying so.
Zero tolerance on plagiarism/cheating: consult the University Code of Conduct for the consequences of academic misconduct, and see also Prof. Shapiro's Academic Integrity page for the CSE department: http://www.cse.buffalo.edu/~shapiro/Courses/integrity.html.