This recitation began with everyone introducing themselves. I was mildly surprised by the mix of people, both year- and major-wise.
Most of my notes are in the form of iPad annotations on a PDF, but I'll note some retrospective thoughts here too.
Algorithms
Correctness
We did a proof by contradiction, whereas the notes used induction
Efficiency
Resources used by a program, e.g. storage space or running time
Asymptotic Notation
As opposed to absolute measures (downsides: they depend on the machine, and it's hard to describe speed on all inputs)
Big O
$g(n) \in O(f(n)) \iff \limsup_{n \to \infty} \frac{g(n)}{f(n)} < \infty$
or, equivalently, iff there exists a positive real number $c$ and a positive integer $n_0$ such that $g(n) \leq c \cdot f(n)$ for all $n \geq n_0$. $g$ doesn't outpace $f$: $g$ is upper-bounded by $f$
Big $\Omega$
$g(n) \in \Omega(f(n)) \iff \limsup_{n \to \infty} \frac{f(n)}{g(n)} < \infty$
or, equivalently, iff there exists a positive real number $c$ and a positive integer $n_0$ such that $g(n) \geq c \cdot f(n)$ for all $n \geq n_0$. $g$ is lower-bounded by $f$
Big $\Theta$
$g(n) \in \Theta(f(n)) \iff g(n) \in O(f(n)) \cap \Omega(f(n))$
The problems we'll look at fall into these:
 Constant: $\Theta(1)$
 Logarithmic: $\Theta(\log n)$
 Linear (graphs often $V + E$): $\Theta(n)$
 Quadratic (matrices): $\Theta(n^2)$
 Polynomial: $\Theta(n^c)$
 Exponential: $2^{\Theta(n^c)}$
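The $(c, n_0)$ form of the definitions can be illustrated numerically. A minimal sketch, using a made-up pair $g(n) = 3n + 5$ and $f(n) = n$ (these functions are just illustrative, not from the recitation):

```python
# Sketch: exhibit witnesses c, n0 for the claim g(n) in O(f(n)),
# with g(n) = 3n + 5 and f(n) = n (illustrative choices).
def g(n):
    return 3 * n + 5

def f(n):
    return n

# Claim: g(n) <= c * f(n) for all n >= n0, with c = 4 and n0 = 5.
c, n0 = 4, 5
assert all(g(n) <= c * f(n) for n in range(n0, 10_000))

# The limit form of the definition: g(n) / f(n) stays bounded
# (here the ratio approaches 3 as n grows).
assert g(10**9) / f(10**9) < 4
```

The same $c = 4$, $n_0 = 5$ also witnesses $g(n) \in \Omega(f(n))$ trivially (with $c = 1$), so $g(n) \in \Theta(f(n))$.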
Model of Computation
 We'll use a $w$-bit Word-RAM model of computation
 Computer as a random access array of machine words (a machine word is a sequence of $w$ bits representing an integer from $\{0, \dots, 2^w - 1\}$) called memory
 Processor can perform operations on the memory
 Constant time: add, subtract, multiply, integer division, modulo, bitwise operations, binary comparisons; read/write at address $a$
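These constant-time word operations can be mimicked in Python by masking every result down to $w$ bits, mirroring the Word-RAM's integers in $\{0, \dots, 2^w - 1\}$. A sketch with an illustrative choice of $w = 8$ (the helper names are mine, not from the notes):

```python
# Sketch of w-bit machine words: every arithmetic result is taken
# mod 2**w by masking, as fixed-width hardware would. w = 8 here is
# purely an illustrative choice.
w = 8
MASK = (1 << w) - 1  # 0b11111111, i.e. 255

def add(a, b):
    return (a + b) & MASK  # wraps around on overflow

def mul(a, b):
    return (a * b) & MASK

print(add(250, 10))  # 260 mod 256 = 4
print(200 >> 3)      # shifts and bitwise ops are also O(1) word ops: 25
```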
Data Structure
 Way to store a non-constant amount of data, supporting a set of operations to interact with that data

Interface: set of operations supported by a data structure
 Many data structures might support the same interface, but with different performance
 Many problems can be solved more easily by choosing the right data structure

Static array: most primitive data structure native to the Word-RAM; contiguous sequence of words reserved in memory, supporting a static sequence interface
StaticArray(n)
: allocate static array of size $n$ initialized to 0 in $\Theta(n)$ time
StaticArray.get_at(i)
: $\Theta(1)$
StaticArray.set_at(i, x)
: $\Theta(1)$
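The interface above can be sketched in Python. This is only a stand-in: a real Word-RAM static array is a contiguous block of machine words, while here a Python list plays that role.

```python
class StaticArray:
    """Sketch of the static sequence interface; a Python list stands in
    for a contiguous block of machine words."""

    def __init__(self, n):
        # Theta(n): allocate n words, each initialized to 0
        self.data = [0] * n

    def get_at(self, i):
        # Theta(1): read the word at index i
        if not (0 <= i < len(self.data)):
            raise IndexError("index out of bounds")
        return self.data[i]

    def set_at(self, i, x):
        # Theta(1): write x to the word at index i
        if not (0 <= i < len(self.data)):
            raise IndexError("index out of bounds")
        self.data[i] = x

a = StaticArray(4)
a.set_at(2, 7)
print(a.get_at(2))  # 7
```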
Running Time Analysis
 I appreciate the Python code showing an implementation of a static array!
 We did this the same way edX 6.0001 introduced it; I wonder how people approach this once they're more practiced, and how to get the hang of it efficiently
Asymptotics Exercises
 When I saw that we had to find a tight, simple asymptotic bound for $\binom{n}{6006}$, I immediately thought we'd have to use Stirling's approximation, but it turns out that the factorial terms cancel directly
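The cancellation gives $\binom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!} = \Theta(n^k)$ for fixed $k$. A quick numerical sanity check, using $k = 3$ in place of 6006 purely to keep the numbers small:

```python
from math import comb, factorial

# Check that C(n, k) ~ n^k / k! for fixed k, i.e. the ratio
# C(n, k) * k! / n^k tends to 1 as n grows. k = 3 is an
# illustrative stand-in for 6006.
k = 3
for n in (10**3, 10**6):
    print(n, comb(n, k) * factorial(k) / n**k)  # approaches 1

assert abs(comb(10**6, k) * factorial(k) / (10**6)**k - 1) < 1e-5
```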
 The last problem — show $(6n)! \notin \Theta(n!)$ but $\log((6n)!) \in \Theta(\log(n!))$ — does actually use Stirling's approximation:
$\log_2 n! = n \log_2 n - n \log_2 e + O(\log_2 n)$
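This approximation can be sanity-checked numerically via the gamma function, since $\ln n! = \mathrm{lgamma}(n+1)$. A sketch (the helper name is mine):

```python
from math import lgamma, log, log2, e

def log2_factorial(n):
    # Exact log2(n!) via the gamma function: lgamma(n + 1) = ln(n!)
    return lgamma(n + 1) / log(2)

# Stirling's leading terms: n*log2(n) - n*log2(e). The leftover gap
# should be only O(log2 n), consistent with the formula above.
for n in (10**3, 10**6):
    exact = log2_factorial(n)
    approx = n * log2(n) - n * log2(e)
    print(n, exact - approx)  # small compared to exact, O(log n)

assert abs(log2_factorial(10**6) - (10**6 * log2(10**6) - 10**6 * log2(e))) < 30
```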
Videos
Algorithms Lesson 6: Big O, Big Omega, and Big Theta Notation
 So Big O: upper bound, Big Ω: lower bound, Big Θ: their intersection
Introduction to Big O Notation and Time Complexity (Data Structures & Algorithms #7)
 Find the fastest growing term