

Read 02/17/2021

This recitation began with everyone introducing themselves. I was mildly surprised about the mix of people, year- and major-wise.

Most of my notes are in the form of iPad annotations on a PDF, but I'll note some retrospective thoughts here too.



We did a proof by contradiction, whereas the notes used induction


Resources used by a program, e.g. storage space or running time

Asymptotic Notation

As opposed to absolute measurements (downsides: depends on the machine, hard to describe speed on all inputs)

Big O

$$g(n) \in O(f(n)) \iff \lim_{n \to \infty} \frac{g(n)}{f(n)} < \infty$$

or iff there exists a positive real number $c$ and positive integer $n_0$ such that $g(n) \leq c \cdot f(n)$ for all $n \geq n_0$. $g$ doesn't outpace $f$; $g$ is upper-bounded by $f$
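As a sanity check on the witness-constant definition, here's a toy example of my own (not from the recitation): verifying a specific $c$ and $n_0$ numerically for $3n + 10 \in O(n)$.

```python
# Witnesses for 3n + 10 in O(n): with c = 4 and n0 = 10,
# the inequality 3n + 10 <= 4n holds exactly when n >= 10.
def g(n):
    return 3 * n + 10

def f(n):
    return n

c, n0 = 4, 10
assert all(g(n) <= c * f(n) for n in range(n0, 10_000))
```

Note the bound fails just below $n_0$ (e.g. $g(9) = 37 > 36 = 4 \cdot 9$), which is exactly why the definition only demands the inequality for all $n \geq n_0$.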

Big $\Omega$

$$g(n) \in \Omega(f(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} < \infty$$

or iff there exists a positive real number $c$ and positive integer $n_0$ such that $g(n) \geq c \cdot f(n)$ for all $n \geq n_0$. $g$ is lower-bounded by $f$

Big $\Theta$

$$g(n) \in \Theta(f(n)) \iff g(n) \in O(f(n)) \cap \Omega(f(n))$$
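A quick worked instance of these definitions (my own example, not from the recitation): to see that $3n^2 + 5n \in \Theta(n^2)$, exhibit constants for both bounds. Since $5n \leq n^2$ once $n \geq 5$,

$$3n^2 \leq 3n^2 + 5n \leq 4n^2 \quad \text{for all } n \geq 5,$$

so $c = 4$, $n_0 = 5$ witnesses the $O(n^2)$ upper bound, and $c = 3$, $n_0 = 1$ witnesses the $\Omega(n^2)$ lower bound.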

The problems we'll look at fall into these:

  • Constant: $\Theta(1)$
  • Logarithmic: $\Theta(\log n)$
  • Linear (graphs often $\Theta(|V| + |E|)$): $\Theta(n)$
  • Quadratic (matrices): $\Theta(n^2)$
  • Polynomial: $\Theta(n^c)$
  • Exponential: $2^{\Theta(n^c)}$

Model of Computation

  • We'll use a $w$-bit Word-RAM model of computation
  • Computer as a random access array of machine words (each a sequence of $w$ bits representing an integer from $\{0, \ldots, 2^w - 1\}$) called memory
  • Processor can perform operations on the memory
  • Constant time: add, subtract, multiply, integer division, modulo, bitwise operations, binary comparisons; read/write at address $a$
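Since Python integers are arbitrary-precision, the Word-RAM's fixed-width words can be mimicked by masking every result down to $w$ bits. A toy sketch (the names W, MASK, word_add, and word_mul are mine, not from the recitation):

```python
W = 64                # word size w in bits
MASK = (1 << W) - 1   # 2**w - 1, the largest representable word

def word_add(a, b):
    """Addition modulo 2**w, as a fixed-width machine add behaves."""
    return (a + b) & MASK

def word_mul(a, b):
    """Multiplication modulo 2**w; high bits past the word are dropped."""
    return (a * b) & MASK
```

For example, `word_add(MASK, 1)` wraps around to 0, the same overflow a real $w$-bit register exhibits.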

Data Structure

  • Way to store a non-constant amount of data, supporting a set of operations to interact with that data
  • Interface: set of operations supported by a data structure

    • Many data structures might support the same interface, but with different performance
  • Many problems can be solved more easily by choosing the right data structure
  • Static array: most primitive data structure native to the Word-RAM; contiguous sequence of words reserved in memory, supporting a static sequence interface

    • StaticArray(n): allocate a static array of size $n$ initialized to 0 in $\Theta(n)$ time
    • StaticArray.get_at(i): $\Theta(1)$
    • StaticArray.set_at(i, x): $\Theta(1)$
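The interface above can be sketched in Python. This is my own reconstruction backed by a plain list, not the course handout verbatim:

```python
class StaticArray:
    """Static sequence interface: fixed size chosen at allocation time."""

    def __init__(self, n):
        # Theta(n): allocate n words, initialized to 0
        self.data = [0] * n

    def get_at(self, i):
        # Theta(1): read the word at index i
        if not (0 <= i < len(self.data)):
            raise IndexError("index out of bounds")
        return self.data[i]

    def set_at(self, i, x):
        # Theta(1): write x to the word at index i
        if not (0 <= i < len(self.data)):
            raise IndexError("index out of bounds")
        self.data[i] = x
```

The explicit bounds check mirrors the Word-RAM picture: a static array is just a contiguous block, so any valid index is one constant-time address computation away.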

Running Time Analysis

  • I appreciate the Python code showing an implementation of a static array!
  • We did this the same way edX 6.0001 introduced it. I'm wondering how people analyze running time once they're more practiced, and how to get the hang of it more efficiently

Asymptotics Exercises

  • When I saw that we had to find a tight, simple asymptotic bound for $\binom{n}{6006}$, I immediately thought we'd have to use Stirling's approximation, but it turns out that we could cancel terms: $\binom{n}{6006} = \frac{n(n-1)\cdots(n-6005)}{6006!} \in \Theta(n^{6006})$, since the numerator is a product of 6006 factors, each $\Theta(n)$, and $6006!$ is a constant
  • The last problem — show $(6n)! \notin \Theta(n!)$ but $\log((6n)!) \in \Theta(\log(n!))$ — does actually use Stirling's approximation:

$$\log_2 n! = n \log_2 n - n \log_2 e + O(\log_2 n)$$
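Out of curiosity, the claim can be sanity-checked numerically (my own check, not part of the exercise) using `math.lgamma`, where `lgamma(n + 1)` equals the natural log of $n!$:

```python
import math

def log_factorial(n):
    # math.lgamma(n + 1) == ln(n!), avoiding the astronomically large (6n)!
    return math.lgamma(n + 1)

n = 10**6
# log((6n)!) / log(n!) stays bounded (it tends toward 6 as n grows),
# consistent with log((6n)!) in Theta(log(n!)) ...
ratio = log_factorial(6 * n) / log_factorial(n)
# ... while log((6n)! / n!) grows without bound,
# consistent with (6n)! not in Theta(n!)
gap = log_factorial(6 * n) - log_factorial(n)
```

This matches the Stirling picture: $\log((6n)!) \approx 6n \log(6n)$ versus $\log(n!) \approx n \log n$, a constant-factor gap in logs but a super-constant gap in the factorials themselves.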


Algorithms Lesson 6: Big O, Big Omega, and Big Theta Notation

  • So Big O: upper-bound, Big Ω: lower-bound, Big Θ: intersection

Introduction to Big O Notation and Time Complexity (Data Structures & Algorithms #7)

  • Find the fastest growing term