Recurrences
Methods for solving recurrences
- Substitution: Guess and check
- Recursion tree: Draw a tree and sum the work done at its nodes (see the sum sketched after this list); very general
- Master Theorem: Formula for large class of recurrences
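A sketch of the recursion-tree sum for the generic divide-and-conquer recurrence T(n) = a·T(n/b) + f(n), where a, b, and f are my notation (a subproblems, each of size n/b, plus f(n) non-recursive work), not symbols defined in the notes above:

```latex
% Level i of the tree has a^i nodes, each doing f(n / b^i) work;
% the tree bottoms out at level l = \log_b n, where each leaf does O(1) work.
\[
T(n) \;=\; \sum_{i=0}^{\log_b n} a^i \, f\!\left(\frac{n}{b^i}\right)
      \;=\; \underbrace{f(n)}_{\text{root}}
        \;+\; a\, f\!\left(\tfrac{n}{b}\right)
        \;+\; \cdots
        \;+\; \underbrace{\Theta\!\left(n^{\log_b a}\right)}_{\text{leaves}}
\]
```

The Master Theorem below is essentially a classification of which term of this sum dominates.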
Master Theorem
- How work at the root compares to work at the leaves (which is just the number of leaves, since each base case has input of size 1 and does O(1) work)
- l = log_b n levels; a^l = a^(log_b n) = n^(log_b a) leaves
- Case 1: leaves dominate; so much work at the bottom that the other levels don't matter; the number of leaves grows polynomially faster with n than the work at the root does, so T(n) = Θ(n^(log_b a))
- Case 2: work at the root and at the leaves differ by only a log/poly-log factor, i.e. f(n) = Θ(n^(log_b a) log^k n); the total picks up one more log factor: T(n) = Θ(n^(log_b a) log^(k+1) n)
- Case 3: root dominates; the root work f(n) grows polynomially faster than the number of leaves (f(n) = Ω(n^(log_b a + ε)), plus a regularity condition a·f(n/b) ≤ c·f(n) for some c < 1), so T(n) = Θ(f(n))
- If f(n) is a polynomial, f(n) = Θ(n^c) for c ≥ 0, the cases simplify to comparing c with log_b a: c < log_b a gives Θ(n^(log_b a)), c = log_b a gives Θ(n^c log n), and c > log_b a gives Θ(n^c) (see the sketch after this list)
- Q: If the master theorem is about trees — comparing work at root vs at leaves — why are recursion trees more general? What restrictions do the assumptions of the master theorem add?
- "Work done" is number of operations we need to do
Exercises
- Use (strong) induction to verify a substitution guess T(n) = O(h(n)). Base case is usually trivial: T(1) = O(h(1)) = O(1). Assume the bound T(n′) ≤ c⋅h(n′) holds for all n′ < n, then show it for n: by the inductive hypothesis, T(n) ≤ a[c⋅h(g(n))] + f(n), where g(n) is the subproblem size; show that this is ≤ c⋅h(n). (Worked example after this list.)
- Some recursion trees are chains, i.e. when the branching factor is 1
- To find the number of levels when subproblems shrink by a factor of 2/3 each level: solve n⋅(2/3)^l = 1, i.e. (3/2)^l = n, so l = log_(3/2) n
- Omg, most people said "2" for the coefficient in the binary search recurrence and I said "1", so I thought I was wrong BUT I WAS RIGHT: binary search recurses into only one half, so T(n) = T(n/2) + O(1)
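As a worked example of the substitution check above (my choice of recurrence, not one from the notes): verify that T(n) = 2T(n/2) + n is O(n log n) by guessing T(n) ≤ c·n·log₂ n.

```latex
% Guess T(n) \le c\, n \log_2 n for some constant c > 0 and all n \ge 2.
% Inductive step: assume the bound for all n' < n, in particular for n' = n/2.
\begin{align*}
T(n) &= 2\,T(n/2) + n \\
     &\le 2\left[c\,\tfrac{n}{2}\,\log_2\tfrac{n}{2}\right] + n
        && \text{inductive hypothesis} \\
     &= c\,n\,(\log_2 n - 1) + n \\
     &= c\,n\,\log_2 n - (c - 1)\,n \\
     &\le c\,n\,\log_2 n
        && \text{for any } c \ge 1,
\end{align*}
% so T(n) = O(n \log n).
```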
Sequence Interface
- Collection of items in extrinsic order; each item has a rank because some external party put it there
- Generalization of stacks and queues, which support a subset of sequence operations
Operations
- Container:
build(X), len()
- Static:
iter_seq(), get_at(i), set_at(i, x)
- Dynamic:
insert_at(i, x), delete_at(i), insert_first/last(x), delete_first/last()
- We'll discuss three data structures to implement the sequence interface: arrays, linked lists, dynamic arrays
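A minimal sketch of the sequence interface as a Python abstract base class; the class name SequenceInterface is my own, and the first/last operations are shown derived from the general index-based ones:

```python
from abc import ABC, abstractmethod

class SequenceInterface(ABC):
    """Sequence interface: items stored in an extrinsic order, addressed by index."""

    @abstractmethod
    def build(self, X): ...            # initialize from iterable X
    @abstractmethod
    def __len__(self): ...             # number of stored items
    @abstractmethod
    def iter_seq(self): ...            # yield items in sequence order
    @abstractmethod
    def get_at(self, i): ...           # return the item at index i
    @abstractmethod
    def set_at(self, i, x): ...        # replace the item at index i with x
    @abstractmethod
    def insert_at(self, i, x): ...     # insert x so that it has index i
    @abstractmethod
    def delete_at(self, i): ...        # remove and return the item at index i

    # first/last operations follow from the general index-based ones
    def insert_first(self, x): self.insert_at(0, x)
    def insert_last(self, x): self.insert_at(len(self), x)
    def delete_first(self): return self.delete_at(0)
    def delete_last(self): return self.delete_at(len(self) - 1)
```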
Set Interface
- Collection of items based on intrinsic property, usually a unique key
- Generalizations of dictionaries and other intrinsic query databases
Operations
- Container:
build(X), len()
- Static:
find(k)
- Dynamic:
insert(x), delete(k)
- Order:
iter_ord(), find_min(), find_max(), find_next(k), find_prev(k)
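A minimal sketch of the set interface, here backed by a plain Python dict keyed on each item's key attribute (the dict backing and the name SetFromDict are my own; the order operations just scan or sort the keys, which shows the semantics but is not efficient):

```python
class SetFromDict:
    """Set interface sketch backed by a Python dict mapping item.key -> item."""

    def __init__(self):
        self._data = {}

    # container
    def build(self, X):
        self._data = {x.key: x for x in X}
    def __len__(self):
        return len(self._data)

    # static
    def find(self, k):                  # return the item with key k, or None
        return self._data.get(k)

    # dynamic
    def insert(self, x):                # add x, replacing any item with the same key
        self._data[x.key] = x
    def delete(self, k):                # remove and return the item with key k
        return self._data.pop(k)

    # order (inefficient here: each call scans or sorts all keys)
    def iter_ord(self):
        for k in sorted(self._data):
            yield self._data[k]
    def find_min(self):
        return self._data[min(self._data)] if self._data else None
    def find_max(self):
        return self._data[max(self._data)] if self._data else None
    def find_next(self, k):             # item with smallest key strictly greater than k
        keys = [key for key in self._data if key > k]
        return self._data[min(keys)] if keys else None
    def find_prev(self, k):             # item with largest key strictly smaller than k
        keys = [key for key in self._data if key < k]
        return self._data[max(keys)] if keys else None
```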
Sequence Implementations
Three data structures
Array Sequence
- Many processes may share the same main memory store, so OS assigns a fixed chunk of memory addresses to each active process
- Amount of memory allocated depends on needs of process and availability of free memory
- If we store an array A, then store B right next to it, and later want to add an item to A, we can't grow A in place; better to request a new range of memory, copy A over, add the new item at the end, and free A's original space
- get_at(i) and set_at(i, x) are O(1) because of our random access machine: the address of item i can be computed and accessed in constant time
- Deleting or inserting involve moving items and resizing array, so linear-time in worst case
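A minimal static array sequence sketch (class name ArraySeq is my own; a Python list stands in for a fixed-size memory chunk). insert_at and delete_at allocate a new array and copy everything over, matching the linear worst case above:

```python
class ArraySeq:
    """Static-array sequence: contiguous storage, reallocated on every size change."""

    def __init__(self):
        self.A = []                      # underlying "array", fixed size between operations
        self.size = 0

    def build(self, X):                  # O(n): allocate and copy
        self.A = list(X)
        self.size = len(self.A)
    def __len__(self):
        return self.size
    def iter_seq(self):
        yield from self.A

    def get_at(self, i): return self.A[i]       # O(1): random access
    def set_at(self, i, x): self.A[i] = x       # O(1)

    def insert_at(self, i, x):           # O(n): new array, copy, shift the tail right
        B = [None] * (self.size + 1)
        B[:i] = self.A[:i]
        B[i] = x
        B[i + 1:] = self.A[i:]
        self.A, self.size = B, self.size + 1

    def delete_at(self, i):              # O(n): new array, copy, shift the tail left
        x = self.A[i]
        B = [None] * (self.size - 1)
        B[:i] = self.A[:i]
        B[i:] = self.A[i + 1:]
        self.A, self.size = B, self.size - 1
        return x

    def insert_first(self, x): self.insert_at(0, x)
    def insert_last(self, x): self.insert_at(self.size, x)
    def delete_first(self): return self.delete_at(0)
    def delete_last(self): return self.delete_at(self.size - 1)
```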
Linked List Sequence
- Pointer-based or linked
- Stores the address of the head (the node storing the first element of the list) and the linked list's size, i.e. the number of items in it
- Stores each item in a node, a constant-sized container with two properties:
node.item storing the item, node.next storing the memory address of the node containing the next item in the sequence
- Adding new item at head takes O(1) time to relink pointers
- get_at(i) and set_at(i, x) take worst-case linear time, since we have to step through the nodes one by one from the head
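A minimal singly linked list sequence sketch (class names Node and LinkedListSeq are my own), showing O(1) insert_first/delete_first but O(n) get_at/set_at:

```python
class Node:
    """Constant-sized container: the item plus the address of the next node."""
    def __init__(self, item, next=None):
        self.item = item
        self.next = next

class LinkedListSeq:
    def __init__(self):
        self.head = None                 # address of the node holding the first item
        self.size = 0                    # number of items in the list

    def build(self, X):
        self.head, self.size = None, 0
        for x in reversed(list(X)):      # insert_first in reverse preserves X's order
            self.insert_first(x)
    def __len__(self):
        return self.size
    def iter_seq(self):
        node = self.head
        while node:
            yield node.item
            node = node.next

    def _node_at(self, i):               # O(i): walk i links from the head
        node = self.head
        for _ in range(i):
            node = node.next
        return node

    def get_at(self, i): return self._node_at(i).item      # worst-case O(n)
    def set_at(self, i, x): self._node_at(i).item = x      # worst-case O(n)

    def insert_first(self, x):           # O(1): just relink the head pointer
        self.head = Node(x, self.head)
        self.size += 1
    def delete_first(self):              # O(1)
        x = self.head.item
        self.head = self.head.next
        self.size -= 1
        return x
```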
Dynamic Array Sequence
See rec 3