#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char **argv)
{
  int n;
  double f;
  double count;
  int i, j;
  long t0;

  if (argc != 2) {
    fprintf(stderr, "Usage: ex1 n\n");
    exit(1);
  }
  n = atoi(argv[1]);
  t0 = time(0);
  count = 0;
  f = 23.4;
  for (i = 0; i < n; i++) {
    count++;
    f = f * f + 1;
    j = (int) f / 1000;
    f = f - j*1000;
    /* printf("%lf\n", f); */
  }
  printf("N: %d Count: %.0lf Time: %ld\n", n, count, (long) (time(0) - t0));
  return 0;
}
What this does is compute n random-ish numbers between 0 and
1000, and if we uncommented the printf() statement, it
would print them -- try it out (uncomment the print statement).
Suppose we run this program with varying values of n. What do we expect? Well, as n increases, so will the count, and so will the running time of the program:
(This is on my machine at home -- I don't know how fast the machines here will be.)
UNIX> gcc -o linear1 linear1.c
UNIX> linear1 1
N: 1 Count: 1 Time: 0
UNIX> linear1 10000
N: 10000 Count: 10000 Time: 0
UNIX> linear1 10000000
N: 10000000 Count: 10000000 Time: 8
UNIX> linear1 20000000
N: 20000000 Count: 20000000 Time: 14
UNIX> linear1 100000000
N: 100000000 Count: 100000000 Time: 72
UNIX>

Just what you'd think. In fact, the running time is roughly linear:
Now, look at four other programs below. I will just show their loops:
linear2.c:
...
for (i = 0; i < n; i++) {
count++;
f = f * f + 1;
j = (int) f / 1000;
f = f - j*1000;
}
for (i = 0; i < n; i++) {
count++;
f = f * f + 1;
j = (int) f / 1000;
f = f - j*1000;
}
...
log.c:
...
for (i = 1; i <= n; i *= 2) {
count++;
f = f * f + 1;
j = (int) f / 1000;
f = f - j*1000;
}
...
nlogn.c:
...
for (k = 0; k < n; k++) {
for (i = 1; i <= n; i *= 2) {
count++;
f = f * f + 1;
j = (int) f / 1000;
f = f - j*1000;
}
}
...
nsquared.c:
...
for (i = 0; i < n; i++) {
for (k = 0; k < n; k++) {
count++;
f = f * f + 1;
j = (int) f / 1000;
f = f - j*1000;
}
}
...
In the five programs, the value of count will be the following as a function of n (note, I will expect you to be able to do things like tell me what count is as a function of n on tests):

| linear1 | linear2 | log      | nlogn      | nsquared |
| n       | 2n      | log2(n)  | n*log2(n)  | n*n      |

(The log entries are rough -- the doubling loop actually runs floor(log2(n)) + 1 times.)
Perhaps it's hard to gauge how much smaller each is than the next until you see it. Below I plot the same graph, but zoomed in a bit so you can get a better feel for n*log(n) and n*n.
Put graphically, it means that after a certain point on the x axis, as we go right, the curve for f(n) will always be higher than the curve for g(n). Thus, given the graphs above, you can see that n*n is greater than n*log(n), which is greater than 2n, which is greater than n, which is greater than log(n).
So, here are some functions:
That was easy. How about c(n) and b(n)? b(n) is greater, because for any value of n greater than 6, b(n) is 100 and c(n) is negative.
Here's a total ordering of the above. Make sure you can prove all of these to yourselves:
Some rules:
This means given the definitions of a(n) through j(n) above:
Read the book for the other definitions.