Under construction (all semester)!
CS 594 - Mathematics for Neural Nets
This course will be a brief overview of the mathematics I consider useful for
neural networks and especially for field computation.
N.B. If you intend to do work with me in the areas of neural
networks or field computation, you should plan on taking this course or
at least sitting in on it.
It may be the only opportunity you have to learn this material in a
convenient format.
Tentative List of Topics
Click on a topic to see it in hypertext format, or you can view or print
PostScript versions.
(Since there is a great deal of mathematical notation, you will have to
use a graphics-capable browser to view the hypertext;
since it was created with latex2html, it's not a pretty sight.)
- Basic concepts of topology and metric spaces
- Banach spaces, including derivatives
- Hilbert spaces
- Basic complex analysis
- Convolution and correlation
- Fourier transforms
- Gabor transforms and the uncertainty principle
- Wavelet transforms
- Qualitative differential equations
- Lyapunov theorems and gradient systems
- Laplace transforms and linear systems
- Matched filters
- Autocorrelation, cross-correlation, and spectral density analysis
- Singular value decomposition
- Linear regression
Obviously I will cover these subjects at a very high level,
and I will limit myself to topics relevant to neural nets and field computation.
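To give a flavor of how these topics connect in practice, here is a small numerical sketch (not part of the course materials) of one result from the list above: the convolution theorem, which says that circular convolution in the signal domain corresponds to pointwise multiplication in the Fourier domain. The signal length and random test data are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
x = rng.standard_normal(N)
h = rng.standard_normal(N)

# Circular convolution computed directly from its definition:
#   (x * h)[n] = sum_k x[k] h[(n - k) mod N]
direct = np.array([sum(x[k] * h[(n - k) % N] for k in range(N))
                   for n in range(N)])

# The same result via the convolution theorem:
# FFT both signals, multiply pointwise, then inverse FFT.
via_fft = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)))

print(np.allclose(direct, via_fft))  # True
```

The direct sum costs O(N^2) operations, while the FFT route costs O(N log N) — one reason Fourier methods matter for convolution-heavy computations.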
Prerequisites
Prerequisites will include calculus through differential equations,
linear algebra,
elementary set theory,
and elementary probability and statistics.
Meeting Time and Place
5:05-6:20 TR in Ayres 127
Online Notes in PostScript
I will attempt (but will not promise) to have notes for the lectures available online.
The following chapters are currently available in PostScript form:
1, 2, 3, 4 (partial)
Return to MacLennan's home page
Send mail to Bruce MacLennan / MacLennan@cs.utk.edu
Last updated:
Thu Jan 16 17:48:02 EST 1997