CS1101S
Lecture 4: Recursion and Order of Growth
24 August 2012
Martin Henz
Orders of Growth
The first version of fib runs in time
proportional to the argument n.
The third version of fib runs in time
that grows exponentially with the argument n.
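The versions in question are from the previous lecture; for reference, functions along these lines exhibit the two behaviours (fib_linear and fib_tree are names chosen here, not necessarily the lecture's exact code):
function fib_linear(n) {
    // iterative process: time proportional to n
    function iter(a, b, count) {
        return count === 0 ? b : iter(a + b, a, count - 1);
    }
    return iter(1, 0, n);
}
function fib_tree(n) {
    // tree recursion: time grows exponentially with n
    return n <= 1 ? n : fib_tree(n - 1) + fib_tree(n - 2);
}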
What exactly do we mean by this?
How can we be sure?
Rough measure
We are interested in a rough measure of resources
used by a computational process
Recap: Resources
Time: How long it takes to run the program
Space: How much memory do we need to run the program
Example: Double the argument
Compare the runtime of factorial(10)
and
factorial(20)
Compare the runtime of fib(10)
and fib(20)
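One rough empirical check (a minimal sketch, not from the slides; it assumes plain JavaScript with Date.now(), and that factorial and fib are already defined):
function time_it(f, n) {
    const start = Date.now();
    f(n);
    return Date.now() - start;   // elapsed time in milliseconds
}
time_it(factorial, 10);
time_it(factorial, 20);   // roughly double the work of factorial(10)
time_it(fib, 10);
time_it(fib, 20);         // vastly more than double the work of fib(10)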
A Definition of "Big Oh"
Let n denote the size of the problem, and
let R(n) denote the resources needed to solve the problem of
size n.
R has order of
growth O(f(n)),
if there is a positive constant k such that
R(n) ≤ k f(n)
for any sufficiently large value of n.
What do we mean by "sufficiently large"?
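A worked instance (not on the original slide): R(n) = 100n + 5 has order of growth O(n), since with k = 101 we get 100n + 5 ≤ 101n for every n ≥ 5. "Sufficiently large" means exactly this: the inequality only needs to hold beyond some threshold value of n.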
Example
function factorial(n) {
    return n === 1 ? 1 : n * factorial(n - 1);
}
Recursive Process
factorial(4)
4 * factorial(3)
4 * (3 * factorial(2))
4 * (3 * (2 * factorial(1)))
4 * (3 * (2 * 1))
4 * (3 * 2)
4 * 6
24
Note the build-up of pending operations: each deferred multiplication must be remembered until the innermost call returns, so the space needed grows with n.
A Definition of "Big Omega"
Let n denote the size of the problem, and
let R(n) denote the resources needed to solve the problem of
size n.
R has order of
growth Ω(f(n)),
if there is a positive constant k such that
k f(n) ≤ R(n)
for any sufficiently large value of n.
A Definition of "Big Theta"
Let n denote the size of the problem, and
let R(n) denote the resources needed to solve the problem of
size n.
R has order of
growth Θ(f(n)),
if there are positive constants k1 and
k2 such that
k1 f(n) ≤ R(n) ≤ k2 f(n)
for any sufficiently large value of n.
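A worked instance (not on the original slide): R(n) = 3n² + 10n has order of growth Θ(n²), since 3n² ≤ 3n² + 10n ≤ 4n² for every n ≥ 10, so k1 = 3 and k2 = 4 work.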
Do constants matter?
Let's say R has order of growth O(n²).
Does R also have
order of growth O(0.5 n²)?
A Definition of "Big Oh"
Let n denote the size of the problem, and
let R(n) denote the resources needed to solve the problem of
size n.
R has order of
growth O(f(n)),
if there is a positive constant k such that
R(n) ≤ k f(n)
for any sufficiently large value of n.
Do constants matter? Answer: No
Let's say R has order of growth O(n²).
Then R also has
order of growth O(0.5 n²).
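Why: if R(n) ≤ k n², then also R(n) ≤ 2k · 0.5 n², so any constant factor in f(n) can simply be absorbed into k.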
Do minor terms matter?
Let's say R has order of growth O(n²).
Does R also have
order of growth O(n² - 40n)?
Yes, we do.
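Why: for n ≥ 80 we have n² - 40n ≥ 0.5 n², so R(n) ≤ k n² implies R(n) ≤ 2k (n² - 40n); minor terms can likewise be absorbed into the constant.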
Some common f(n)
- 1
- n
- n²
- n³
- log n
- n log n
- 2ⁿ
Iterative Factorial Revisited
function f(n, k, f_k_minus_1) {
    // f_k_minus_1 holds (k-1)!, the product accumulated so far
    return (k > n) ? f_k_minus_1
                   : f(n,
                       k + 1,
                       k * f_k_minus_1);
}
f(5, 2, 1);
Time: O(n)
Space: O(1) in Scheme, O(n) in JavaScript
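For contrast with the recursive process shown earlier, the iterative process carries all of its state in the arguments, so nothing is pending (trace not in the original slides):
f(5, 2, 1)
f(5, 3, 2)
f(5, 4, 6)
f(5, 5, 24)
f(5, 6, 120)
120
Scheme implementations reuse the same frame for each such tail call, which is why the space is O(1) there; JavaScript (at the time of this lecture) does not optimise tail calls, so the chain of frames still grows with n.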
How do we calculate Big Oh?
Topic of algorithm analysis (CS3230)
For now:
- Identify the basic computational steps
- Try a few small values
- Extrapolate
- Watch out for "worst case" scenarios
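To illustrate the recipe above, here is a minimal sketch (not from the slides) that counts the basic step "one function call" for a tree-recursive fib and tries a few small values:
let count = 0;
function fib(n) {
    count = count + 1;   // each call counts as one basic step
    return n <= 1 ? n : fib(n - 1) + fib(n - 2);
}
fib(10);
// count is now 177
count = 0;
fib(20);
// count is now 21891: about 1.6 times more calls for each increase of n by 1,
// which extrapolates to exponential growth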
Some numbers
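The original slide presumably tabulated these functions for concrete values of n; a sketch of such a table (console.log, Math.log2, Math.pow and the chosen sizes are plain-JavaScript assumptions, not from the slides):
function show_growth(n) {
    console.log(n + ": log n = " + Math.log2(n)
                  + ", n log n = " + n * Math.log2(n)
                  + ", n^2 = " + n * n
                  + ", n^3 = " + n * n * n
                  + ", 2^n = " + Math.pow(2, n));
}
show_growth(10);     // 2^10 is about a thousand
show_growth(100);    // 2^100 is about 1.3e30
show_growth(1000);   // 2^1000 is about 1.1e301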
The world's oldest algorithm
Greatest Common Divisor (GCD)
Euclid of Alexandria, fl. 300 BC
function gcd(a, b) {
    return (b === 0) ? a
                     : gcd(b, a % b);
}
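For example, a trace (not in the original slides):
gcd(40, 24)
gcd(24, 16)
gcd(16, 8)
gcd(8, 0)
8
The remainder shrinks quickly, and the number of steps grows only logarithmically in the smaller argument, i.e. time O(log n).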
Power function
function power(b, e) {
    return (e === 0) ? 1 : b * power(b, e - 1);
}
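As with factorial, this generates a recursive process with e deferred multiplications, so both time and space grow as Θ(e); for example (trace not in the original slides):
power(2, 5)
2 * power(2, 4)
2 * (2 * power(2, 3))
2 * (2 * (2 * power(2, 2)))
2 * (2 * (2 * (2 * power(2, 1))))
2 * (2 * (2 * (2 * (2 * power(2, 0)))))
2 * (2 * (2 * (2 * (2 * 1))))
2 * (2 * (2 * (2 * 2)))
2 * (2 * (2 * 4))
2 * (2 * 8)
2 * 16
32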
Fast Power function
function fast_power(b, e) {
    if (e === 0) {
        return 1;
    } else if (is_even(e)) {
        return fast_power(b * b, e / 2);
    } else {
        return b * fast_power(b, e - 1);
    }
}
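The slides rely on an is_even helper; a minimal definition (an assumption, not shown in the original), followed by a trace:
function is_even(n) {
    // true exactly when n is divisible by 2
    return n % 2 === 0;
}
fast_power(2, 10)
fast_power(4, 5)
4 * fast_power(4, 4)
4 * fast_power(16, 2)
4 * fast_power(256, 1)
4 * (256 * fast_power(256, 0))
4 * (256 * 1)
4 * 256
1024
Since the exponent is halved on at least every other call, the number of multiplications grows as O(log e) instead of O(e).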