CS 594 - Neural Information Processing
Spring 1998 - Bruce MacLennan
If you want a graduate-level neural network course, you should take this one,
since another graduate- or undergraduate-level course may not be offered
before Fall 1999 or Spring 2000, if at all.
Contact Information
Instructor:
Bruce MacLennan
Phone: 974-5067
Office: 110-A Ayres
Hours: 2:10-3:25 TR, or by appointment
Email:
MacLennan@cs.utk.edu
Teaching Assistant:
Yushuai Lu
Hours: TBA
Email:
lu@cs.utk.edu
This page: http://www.cs.utk.edu/~mclennan/Classes/594-S98/
Description
This course will be a comprehensive introduction to neural networks
from an engineering and mathematical perspective. It will cover most of
the material essential to anyone intending to use neural networks in
their research.
Tentative List of Topics
A tentative list of topics is: overview of neural network models,
overview of learning, correlation matrix memories, perceptrons, LMS
algorithms, multilayer networks, RBF networks, models based in
statistical physics, self-organizing systems (Hebbian, competitive
learning, information theoretic approaches), temporal processing &
neurodynamics.
Tentative Schedule
With a few exceptions, we will try to cover one chapter (about 40 pages)
per week (the first half-week counts as week 1). You are expected to do the
chapter reading before class, so that class time can be devoted to
supplementary explanation.
- overview of neural network models (ch 1)
- overview of learning I (ch 2)
- overview of learning II (ch 2 cont'd); correlation matrix memories (ch 3)
- perceptrons (ch 4); LMS algorithms (ch 5)
- multilayer networks I (ch 6)
- multilayer networks II (ch 6 cont'd)
- multilayer networks III (ch 6 cont'd)
- RBF networks (ch 7)
- models based in statistical physics (ch 8)
- self-organizing systems I: Hebbian (ch 9)
- self-organizing systems II: competitive learning (ch 10)
- self-organizing systems III: information theoretic approaches (ch 11)
- temporal processing (ch 13)
- neurodynamics (ch 14)
- supplementary topics
Prerequisites
Basic calculus (including elementary differential
equations), linear algebra, probability and statistics.
No previous experience with neural networks is necessary.
Text
Simon Haykin: Neural Networks: A Comprehensive Foundation
(Macmillan 1994).
Grading
Grades will be based on homework and projects.
Meeting Time and Place
Tuesdays & Thursdays, 12:40 - 1:55 in BU 655.
Last updated:
Tue Apr 21 14:58:42 EDT 1998