CS 420/594: Advanced Topics in Machine Intelligence
Fall 2007: Biologically-Inspired Computation
Bruce MacLennan
Office: 217 Claxton Complex
Hours: 3:40–5:00 TR, or make an appointment
Office: Claxton 124
Hours: 2:00–4:00 W, or make an appointment
Email: ytang AT eecs.utk.edu
Kristy Van Hornweder
Office: Claxton 122C
Hours: 1:00–2:00 TR, or make an appointment
Email: kvanhorn AT eecs.utk.edu
Classes: 2:10–3:25 TR in HSS 110
Handouts, Labs, etc.
This page: http://www.cs.utk.edu/~mclennan/Classes/420
CS 420 and CS 594 cover advanced topics in machine intelligence with an emphasis
on faculty research. In the Fall semester of 2007 the topic for my CS 420/594
will be biologically-inspired computation, including recent developments in
computational methods inspired by nature, such as neural networks, genetic
algorithms and other evolutionary computation systems, ant colony optimization,
artificial immune systems, swarm intelligence, cellular automata, and
multi-agent systems.
Fundamental to the understanding and implementation of massively parallel,
distributed computational systems is an investigation of the behavior and
self-organization of a variety of systems in which useful work emerges from the
interaction of many simple agents. The question we address is: How should a
multitude of independent computational (or robotic) agents cooperate to process
information and achieve their goals in a way that is efficient, self-optimizing,
adaptive, and robust in the face of changing needs and damage?
Fortunately, nature provides many models from which we can learn.
In this course we will discuss natural computational systems that solve some of the
same problems that we want to solve, including adaptive path
minimization by ants, wasp and termite nest building, army ant raiding,
fish schooling and bird flocking, pattern formation in animal coats,
coordinated cooperation in slime molds, synchronized firefly flashing,
soft constraint satisfaction in spin glasses, evolution by natural
selection, game theory and the evolution of cooperation, computation at
the edge of chaos, and information processing in the brain.
You will learn about specific computational applications of these
ideas, including artificial neural networks, simulated annealing,
cellular automata, ant colony optimization, artificial immune systems,
particle swarm optimization, and genetic algorithms and other
evolutionary computation systems. These techniques are also used in computer games and computer animation.
Since the goal of this course is for you to gain an intuitive understanding of
adaptive and self-organizing computational systems, the lectures make extensive
use of videos, simulations, and other computer demonstrations. Your grade will
be based on three or four moderate-sized projects, and I will consider group
projects (but you will have to do more!). There are no other assignments or
tests. (In the past, students who did all the work received an A or B+ in this
course.)
This is a project-oriented course and therefore
all students will be expected to have basic programming skills.
However, for non-EECS students (e.g., those in biology, ecology,
psychology, etc.) I will provide alternate non-programming
assignments. If you have any questions about whether you should
take this course, please send me mail.
Your grade will be based
on three or four projects, in which you will conduct and
write up experiments using the software associated with Flake’s
book, as well as conducting experiments with software that you program
yourself. (Non-EECS students can do alternative, non-programming
assignments.) There will be no exams or other homework.
Students taking CS 594 (i.e. the course for graduate credit) will be
expected to do specified additional work.
In the past, most students have earned A or B+ in this course.
CS 420 & 594: Flake, Gary William. The Computational Beauty of Nature. MIT Press, 1998. See also the book’s website (including software).
Evolving List of Topics
Chapter numbers refer to Flake unless otherwise specified. Slides
for each lecture will be posted in the course of the semester.
(Slides from the Fall 2004 and Fall 2003 versions of the course
are still available on their websites.) Note: An “*” after the
lecture number indicates that the slides were revised after class.
We will cover roughly one topic per week.
- Overview: course description, definition of biologically-inspired computing, why it is important
Lectures: 1, 2.
- Cellular Automata: Wolfram’s classification, Langton’s lambda, CA models in nature, excitable media (ch. 15)
Lectures: 3, 4, 5, 6, 7, 8, 9.
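The one-dimensional CAs in this unit (and in Project 1) are easy to sketch in code. Below is a hypothetical minimal simulator for Wolfram’s elementary (binary, nearest-neighbor) rules, written for illustration; it is not the course’s own simulator:

```python
def step(cells, rule):
    """Apply an elementary CA rule (0-255, Wolfram numbering) to one row."""
    n = len(cells)
    out = []
    for i in range(n):
        # Neighborhood: left, center, right bits (wrap-around boundary).
        idx = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's idx-th bit gives the cell's next state.
        out.append((rule >> idx) & 1)
    return out

row = [0] * 31
row[15] = 1  # single live cell in the middle
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row, 110)  # Rule 110, a class-4 ("edge of chaos") rule
```

Trying rules 30 (chaotic) and 90 (fractal) alongside 110 gives a quick feel for Wolfram’s classification.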
- Autonomous Agents and Self-Organization: termites, ants, flocks, herds, and schools (ch. 16)
Lectures: 10, 11, 12, 13*, 14*, 15, 16, 17, 18*, 19.
Information on Lattice Swarm simulations and source for simulator.
Some links on synchronized fireflies in the Smoky Mountains:
- The above site seems to have gone offline, but the files can also be found here.
Links on flocking & schooling behavior:
Yuhui Shi’s page on Particle Swarm Optimization (including demonstration).
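To give a concrete flavor of the swarm methods in this unit, here is a hypothetical minimal particle swarm optimizer (global-best form) minimizing a simple test function; the parameter values are illustrative, and this is not the code from Shi’s demonstration:

```python
import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimize f over [-5, 5]^dim with a gbest particle swarm."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]          # each particle's best-seen position
    gbest = min(pbest, key=f)[:]         # the swarm's best-seen position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, pull toward pbest, pull toward gbest.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda p: sum(x * x for x in p)
best = pso(sphere)
print(best, sphere(best))   # converges near the origin
```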
- Natural and Analog Computation: artificial neural nets, associative memory, Hebbian learning, Hopfield networks (ch. 18)
Lectures: 20*, 21, 22, 23.
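As a sketch of the associative-memory ideas in ch. 18, here is a hypothetical minimal Hopfield network with Hebbian (outer-product) weights over ±1 units; it stores one pattern and repairs a corrupted copy:

```python
def train(patterns):
    """Hebbian outer-product weights; no self-connections."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / len(patterns)
    return W

def recall(W, state, steps=5):
    """Update units asynchronously until (in practice) a fixed point."""
    n = len(state)
    state = state[:]
    for _ in range(steps):
        for i in range(n):
            h = sum(W[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

stored = [1, -1, 1, -1, 1, -1, 1, -1]
W = train([stored])
noisy = stored[:]
noisy[0] = -noisy[0]                  # flip one bit
print(recall(W, noisy) == stored)     # the memory repairs the corrupted bit
```

Recall works because the stored pattern is an attractor of the update dynamics; nearby (slightly corrupted) states fall back into it.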
- Genetics and Evolution: biological adaptation & evolution, genetic algorithms, schema theorem (ch. 20)
Lectures: 23 (dupl.), 24, 25.
Online genetic algorithm demonstrations:
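Alongside the demonstrations, the skeleton of a genetic algorithm fits in a few lines. This hypothetical sketch maximizes the number of 1 bits (the classic "one-max" problem); operators and rates are illustrative choices:

```python
import random

L, POP, GENS = 20, 30, 60          # genome length, population, generations

def fitness(bits):
    return sum(bits)               # one-max: count the 1 bits

def tournament(pop):
    a, b = random.sample(pop, 2)   # binary tournament selection
    return max(a, b, key=fitness)

def crossover(p1, p2):             # one-point crossover
    cut = random.randrange(1, L)
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=0.01):       # per-bit flip mutation
    return [b ^ 1 if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(POP)]
for _ in range(GENS):
    pop = [mutate(crossover(tournament(pop), tournament(pop)))
           for _ in range(POP)]
best = max(pop, key=fitness)
print(fitness(best))
```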
- Competition and Cooperation: zero- and nonzero-sum games, iterated prisoner’s
dilemma, stable strategies, ecological & spatial models (ch. 17)
Lectures: 25 (dupl.), 26.
Some useful links:
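The core of the iterated prisoner’s dilemma is simple to sketch. Here is a hypothetical minimal version pitting tit-for-tat against always-defect, using the standard Axelrod payoffs (T=5, R=3, P=1, S=0):

```python
# Payoffs: (my score, their score) for each pair of moves.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    return 'C' if not their_hist else their_hist[-1]

def always_defect(my_hist, their_hist):
    return 'D'

def play(s1, s2, rounds=10):
    h1, h2, score1, score2 = [], [], 0, 0
    for _ in range(rounds):
        m1, m2 = s1(h1, h2), s2(h2, h1)
        p1, p2 = PAYOFF[(m1, m2)]
        h1.append(m1); h2.append(m2)
        score1 += p1; score2 += p2
    return score1, score2

print(play(tit_for_tat, always_defect))   # (9, 14): TFT loses only round one
print(play(tit_for_tat, tit_for_tat))     # (30, 30): sustained cooperation
```

The comparison shows why tit-for-tat does well in tournaments: it never loses by much against defectors, yet reaps the full cooperative payoff against reciprocators.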
- Neural Networks and Learning: pattern classification & linear
separability, single- and multilayer perceptrons, backpropagation,
internal representation (ch. 22)
Lectures: 27, 28.
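As a taste of pattern classification and linear separability, here is a hypothetical single-layer perceptron learning the (linearly separable) AND function; the learning rate and epoch count are illustrative:

```python
def predict(w, b, x):
    """Threshold unit: fire iff the weighted sum exceeds zero."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                       # perceptron learning rule
    for x, target in data:
        err = target - predict(w, b, x)   # 0 if correct, else +1 or -1
        w[0] += lr * err * x[0]
        w[1] += lr * err * x[1]
        b += lr * err

print([predict(w, b, x) for x, _ in data])   # → [0, 0, 0, 1]
```

Replacing AND with XOR makes the same loop fail to converge, which is exactly the limitation that motivates multilayer perceptrons and backpropagation.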
- Complex Systems & Phase Transitions: summary (ch. 19)
- Adaptation: summary (ch. 23)
As time permits:
- Nonlinear Dynamics in Simple Maps (ch. 10)
- Strange Attractors (ch. 11)
- Producer-Consumer Dynamics (ch. 12)
- Controlling Chaos (ch. 13)
- Chaos, Randomness, and Computability (ch. 14)
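The flavor of the "simple maps" material can be conveyed by the logistic map x_{n+1} = r·x_n·(1 − x_n), whose long-run behavior moves from a fixed point through period doubling to chaos as r grows. A hypothetical sketch:

```python
def orbit(r, x0=0.2, skip=500, keep=4):
    """Iterate the logistic map, discard the transient, return a few values."""
    x = x0
    for _ in range(skip):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        out.append(round(x, 4))
        x = r * x * (1 - x)
    return out

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, orbit(r))   # fixed point, period 2, period 4, then chaos
```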
- Project 2 Information [pdf] — due Oct. 25 (extended to Nov. 1) (handout revised 10-23)
- CBN Programs
You can run most of these as applets from the website for Flake’s textbook, and you can also download sources and executables from there. Some of these are already available in the experiments/CBN subdirectory. The Unix executables are in experiments/CBN/cbn/code/bin.
- NetLogo Programs
You should be able to run the NetLogo programs as Java applets by clicking
their names below. However, they don’t seem to run in all browsers
(see the information with the programs). Also beware that if you are
running NetLogo over a slow connection, it will have to download the
NetLogoLite jar (1.5 MB). If you have downloaded the NetLogo system, you can
also download the programs (.nlogo files) directly from the NetLogo directory.
- Life — Conway’s classic Game of Life
- CA-1D-General-Totalistic — 1D Totalistic CA simulator (good for Project 1)
- B-Z-Reaction — Belousov-Zhabotinsky reaction (equivalent to Hodgepodge machine)
- SlimeSpiral — spiral aggregation of slime molds
- SlimeStream — streaming aggregation of slime molds
- SlimeAggregation — both spiral and streaming stages of slime mold aggregation
- Fur — uses small and large neighborhoods for activation-inhibition system
- Pattern — uses diffusion rates for activation-inhibition system
- Termites — simulation of Resnick’s termite model
- Pillars — simulation of the Deneubourg model of pillar construction by termites
- Vants — Langton’s Vants (virtual ants)
- Vants-Large-Field — Langton’s Vants on a large field (may take too much memory)
- Generalized-Vants — generalized vants, with a programmable rule
- Ants — simulation of Resnick ants
- Firefly — Camazine’s firefly synchronization model
- Fireflies-opt-mobile — Wilensky’s firefly synchronization model
- Flock — Huth & Wissel model of fish schooling
- Flocking — implementation of Reynolds’s “boids” flocking model
- PSO — demonstration of particle swarm optimization
- EIPD — ecological simulation of Iterated Prisoner’s Dilemma
- SIPD — spatial simulation of Iterated Prisoner’s Dilemma
- Artificial Neural Net — demonstration of back-propagation learning
Return to MacLennan’s home page
Send mail to Bruce MacLennan / MacLennan@eecs.utk.edu
Last updated: 2007-12-09.