Under construction (all semester)!

CS 594 - Mathematics for Neural Nets

This course will be a brief overview of the mathematics I consider useful for neural networks and especially for field computation. N.B. If you intend to work with me in the areas of neural networks or field computation, you should plan on taking this course, or at least sitting in on it. It may be the only opportunity you have to learn this material in a convenient format.

Tentative List of Topics

Click on a topic to see it in hypertext format, or you can view or print
Postscript versions. (Since there is a great deal of mathematical notation, you will have to use a graphics-capable browser to view the hypertext; since it was created with latex2html, it's not a pretty sight.)
  1. Basic concepts of topology and metric spaces
  2. Banach spaces, including derivatives  - Working on it!
  3. Hilbert spaces  - Updated!
  4. Basic complex analysis
  5. Convolution and correlation
  6. Fourier transforms
  7. Gabor transforms and the uncertainty principle
  8. Wavelet transforms
  9. Qualitative differential equations
  10. Lyapunov theorems and gradient systems
  11. Laplace transforms and linear systems
  12. Matched filters
  13. Autocorrelation, crosscorrelation and spectral density analysis
  14. Singular value decomposition
  15. Linear regression
Obviously I will cover these subjects at a very high level, and I will limit myself to topics relevant to neural nets and field computation.

Prerequisites

Prerequisites will include calculus through differential equations, linear algebra, elementary set theory, and elementary probability and statistics.

Meeting Time and Place

5:05-6:20 TR in Ayres 127

Online Notes in Postscript

I will attempt (but will not promise) to have notes for the lectures available online. The following chapters are currently available in Postscript form:

1, 2, 3, 4 (partial)


Return to MacLennan's home page

Send mail to Bruce MacLennan / MacLennan@cs.utk.edu
Last updated: Thu Jan 16 17:48:02 EST 1997