## CS302 Lecture Notes - Dynamic Programming

• November 18, 2009
• Latest update: November, 2016
• James S. Plank
• Directory: /home/plank/cs302/Notes/DynamicProgramming

### Reference Material and Topcoder Problems

To augment this lecture, there are two excellent sets of tutorials on dynamic programming from Topcoder.

These are Topcoder problems where I give you hints, but no code. They are good practice for working on dynamic programming on your own:

• SplitStoneGame - This is a really straightforward dynamic program. Good practice.
• EllysSocks - This is another straightforward dynamic program.
• PartisanGame - I walk you through this one in a series of steps.
• QuickSort - This is a fairly straightforward DP where you use a string as the memoization key.
• Mafia - This is a dynamic program where you turn the state of your simulation into a string, and then make recursive calls on modified strings. Obviously, you memoize on the string too.
• CtuRobots - This is a wonderful dynamic program, where you have to reason your way to the solution.
• StringWeightDiv2 - This is a counting problem where again, the tricky part is using recursion to help you.

And finally, these are Topcoder problems where I give you explanations and code:

• CountGame - This is a small program, and is a good one for you to try on your own. You'll be tempted to think about the game strategy, but using dynamic programming is much easier (and less bug-prone)!
• CheeseSlicing - A pretty classic, straightforward dynamic program.
• SubsetSumExtreme - Dynamic programming with bit arithmetic to represent sets. How does it get any better than this problem?
• RemovingParenthesis - A straightforward dynamic program for counting strings.
• LastDigit - Dynamic programming with really large integers.

### Dynamic Programming in a Nutshell

Dynamic programming is nothing more than the following:
• Step 1: You spot a recursive solution to a problem. When you program that solution, it will be correct, but you'll find that it's incredibly slow, because it makes many duplicate procedure calls.

• Step 2: You cache the answers to recursive calls so that when they are repeated, you can return from them instantly. This is called memoization.

• Step 3: If you want to, you can typically figure out how to eliminate the recursive calls, and instead populate the cache with one or more for loops. This is faster than memoization, but usually not by much.

• Step 4: Sometimes, you can eliminate the cache completely. This makes the program even faster and more memory efficient.

I find that steps 3 and 4 are often optional; however, they usually represent the best solutions to a problem.

I will illustrate with many examples.

### Dynamic Programming Example Programs

Each of these has its own set of lecture notes. There is a makefile in this directory that makes all of the examples.