Algorithms
- Motivation
- Introduction to the Problem
- Iterative
- Recursive
- Iteration vs Recursion
- Overlapping Subproblems
- Dynamic Programming
- Algorithmic Patterns
Motivation
Why? Because it’s in everything. Or it will be. And it’s interesting.
- Live videos? Through compression algorithms.
- Maps? Dijkstra’s.
- ISS solar panel? Optimization and scheduling algorithms.
Look it up if you don’t believe me.
SHA-256 may be the most frequently executed algorithm on the planet (it is used in cryptocurrency mining)
Introduction to the Problem
What is the runtime of this function, fun_1()?

#include <stdio.h>
#include <math.h>

/* n is taken as a parameter here so the snippet compiles standalone */
void fun_1(int n)
{
    int i, j;
    for (i = 1; i <= n; i++)
        for (j = 1; j <= (int)log2(i); j++)
            printf("Hello World");
}
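For each i, the inner loop runs floor(log2 i) times, so the total work is the sum of floor(log2 i) for i = 1 to n, which is Θ(n log n).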
Iterative
An iterative algorithm repeats a block of work in a loop until a condition is met
Generally, if there are loops such as for, while, etc., it's probably iterative
Imperative languages tend to favor iteration
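For instance, a minimal iterative sum in Python (the function name is illustrative):

def iterative_sum(values):
    # A loop accumulates the total one element per iteration
    total = 0
    for v in values:
        total += v
    return total

print(iterative_sum([10, 20, 30]))  # 60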
Recursive
A recursive algorithm calls itself in its own definition to solve a problem
There are two parts to a recursive function:
- base case, the conditional branch that stops the recursion
- recursive case, the branch in which the function calls itself, typically on a smaller input
The classic problem you can run into with recursive algorithms: a call stack that grows until it overflows ("stack level too deep", stack overflow)
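In Python, for example, runaway recursion is stopped by the interpreter's recursion limit:

import sys

def down(n):
    return down(n - 1)          # no base case: recurses forever

print(sys.getrecursionlimit())  # commonly 1000 by default
try:
    down(0)
except RecursionError as err:
    print("stack level too deep:", err)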
Functional languages tend to use recursion
Where possible, write an iterative version as well
It is often helpful to derive the mathematical recurrence first, then translate it into the self-calling function
def recurse(parameter):
    if base_case:                            # stop the recursion
        return some_value
    else:
        return recurse(modified_parameter)   # shrink the input and recurse
Example recursion:
def recursive(n):
    if n < 1:
        print("exit")
    else:
        recursive(n-1)
        print(n)

recursive(3)
(Without running this program, what do you think gets printed first? Answer: "exit" prints first, from the deepest call; then 1, 2, 3 print as the recursion unwinds.)
Summing list items recursively:
A = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]

def test(A, n):
    # Sum (and print) the first n items of A recursively
    if n == 1:
        print(A[0])
        return A[0]
    s = test(A, n-1)   # sum of the first n-1 items
    print(A[n-1])
    s += A[n-1]
    return s

print(test(A, 4))      # prints 10, 20, 30, 40, then the sum 100
If you can divide the problem into similar subproblems, try recursion
Steps (illustrated in the sketch below):
- Recursive case - the flow
- Base case - the stopping criteria
- Unintentional case - the constraint on valid input; an assert can enforce it
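A minimal sketch tying the three cases together (factorial as an illustrative example):

def factorial(n):
    # Unintentional case: enforce the constraint with an assert
    assert isinstance(n, int) and n >= 0, "n must be a non-negative integer"
    # Base case: the stopping criteria
    if n == 0:
        return 1
    # Recursive case: the flow
    return n * factorial(n - 1)

print(factorial(5))  # 120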
Recursion uses stack memory
Tower of Hanoi uses simple recursion
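For instance, a standard move-printing solution (peg labels are illustrative):

def hanoi(n, source, target, spare):
    if n == 0:
        return                                # base case: nothing to move
    hanoi(n - 1, source, spare, target)       # clear the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)       # restack on top

hanoi(3, "A", "C", "B")   # 2**3 - 1 = 7 moves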
Iteration vs Recursion
It depends!
Infinite recursion tends to overload the memory.
Infinite iteration tends to overload the CPU.
Recursion seems to be better for mathematics problems, like ones found in Project Euler.
Recursion is trickier to get right; the call stack can grow very quickly. How do you debug a program caught in infinite recursive calls?
Memoized recursion is useful.
The memoization technique caches each result the first time it is computed (for example, in a dictionary keyed by the arguments, or a class holding a tracker variable), so repeated subproblems are solved only once.
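A minimal sketch using Python's functools.lru_cache as the cache (Fibonacci chosen for illustration):

from functools import lru_cache

@lru_cache(maxsize=None)   # remember every result, keyed by the argument
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(80))  # instant; naive recursion repeats subproblems exponentially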
Recursion is not recommended for low memory systems like mobile devices
Recursion is not recommended for critical systems where speed matters
Overlapping Subproblems
A problem has overlapping subproblems if finding its solution involves solving the same subproblem multiple times (e.g., a naive Fibonacci recursion recomputes fib(n-2) for both fib(n) and fib(n-1)).
Dynamic Programming
A classic dynamic programming example: “The Viterbi algorithm is an algorithm for finding the most likely sequence of hidden states in a hidden Markov model, which is a mathematical model used to describe a sequence of observations that are generated by a sequence of underlying states.”
- Write a recursive solution to solve a small problem.
- Memoize it.
- That’s it, in short.
Another way to put it:
- Solve it recursively with a brute force solution
- Identify the repeated work and what key captures it
- Memoize it and flip it around to solve it bottom up
Bottom-Up
Bottom-up is tabulation.
Going bottom-up is a common strategy for dynamic programming.
Going bottom-up avoids recursion, saving the memory cost of the call stack. Bottom-up algorithms start from the smallest subproblems and build upward, while top-down recursive algorithms start from the full problem and break it down.
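A minimal tabulation sketch (Fibonacci again, for comparison with the memoized version above):

def fib_bottom_up(n):
    if n < 2:
        return n
    # Table of subproblem answers, filled from the smallest up
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_bottom_up(80))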
Top-down
Top-down is memoization.
Algorithmic Patterns
Brute Force
Greedy
A greedy algorithm is a myopic algorithm: it processes the input one piece at a time with no lookahead
Great for scheduling and graph problems
An algorithm that, while executing, selects only the information that meets certain criteria.
A greedy algorithm generally has five components:
- a candidate set, from which a solution is created
- a selection function, which chooses the best candidate to add next
- a feasibility function, to determine whether a candidate can contribute to a solution
- an objective function, which assigns a value to a solution or partial solution
- a solution function, which indicates when we have a complete solution
Often used to find fast (usually non-optimal) solutions
Interval Scheduling
resources & requests 1, …, n (single resource) requiring time to correspond to resource s(i) is the start time f(i) is the finish time s(i) < f(i)
definition of compatibility: two requests i and j are compatible if they don’t overlap f(i) <= s(j) or f(j) <= s(i)
requests and intervals are interchangeable
- use a simple rule to select a request i
- reject all requests that are incompatible with i
- repeat until all requests are processed
The best rule found so far: always select the compatible request with the earliest finish time.
Its optimality is shown by a proof by induction (the greedy choice "stays ahead" of any other schedule).
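A minimal sketch of the earliest-finish-time rule (requests as (start, finish) pairs, an assumed representation):

def schedule(requests):
    # Sort by finish time, then greedily keep every compatible request
    selected = []
    current_finish = float("-inf")
    for s, f in sorted(requests, key=lambda r: r[1]):
        if s >= current_finish:      # compatible with everything chosen so far
            selected.append((s, f))
            current_finish = f
    return selected

print(schedule([(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9), (6, 10), (8, 11)]))
# [(1, 4), (5, 7), (8, 11)]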
Weighted Interval Scheduling
Earliest finish time fails here
No simple selection rule is known to work for every problem instance here (finding the maximum-weight compatible set)
Dynamic programming solution
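A bottom-up sketch of that DP (requests as (start, finish, weight) triples, an assumed representation):

import bisect

def weighted_schedule(requests):
    reqs = sorted(requests, key=lambda r: r[1])   # order by finish time
    finishes = [f for _, f, _ in reqs]
    n = len(reqs)
    opt = [0] * (n + 1)   # opt[j] = best weight using the first j requests
    for j in range(1, n + 1):
        s, f, w = reqs[j - 1]
        # p = how many earlier requests finish by time s (all compatible)
        p = bisect.bisect_right(finishes, s, 0, j - 1)
        opt[j] = max(opt[j - 1],      # skip request j
                     w + opt[p])      # take request j
    return opt[n]

print(weighted_schedule([(1, 3, 5), (2, 5, 6), (4, 6, 5), (6, 7, 4), (5, 8, 11), (7, 9, 2)]))  # 17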
Heuristics
Divide and Conquer
- Divide into subproblems
- Conquer subproblems recursively
- Combine into solution
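Merge sort is the classic instance; a minimal sketch:

def merge_sort(items):
    if len(items) <= 1:
        return items                      # base case: already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])        # divide, then conquer each half
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0               # combine: merge the sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]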
Backtracking
Backtracking explores candidates depth-first, undoing each choice that leads to a dead end; it underlies DFS-style graph search (BFS, by contrast, never backtracks)
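A minimal backtracking sketch (printing subsets that sum to a target, an illustrative problem):

def subset_sums(nums, target, chosen=None, start=0):
    if chosen is None:
        chosen = []
    if sum(chosen) == target:
        print(chosen)
    for i in range(start, len(nums)):
        chosen.append(nums[i])                     # choose
        subset_sums(nums, target, chosen, i + 1)   # explore
        chosen.pop()                               # backtrack (undo)

subset_sums([3, 9, 8, 4, 5, 7, 10], 15)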