Quick Sort
Best: O(nlog(n))
Average: O(nlog(n))
Worst: O(n^2)
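As an illustration of where these bounds come from, a minimal in-place quicksort sketch with a random pivot (randomization makes the O(n^2) worst case unlikely in practice):

```python
import random

def quicksort(a):
    """In-place quicksort (Hoare partition) with a random pivot."""
    def sort(lo, hi):
        if lo >= hi:
            return
        pivot = a[random.randint(lo, hi)]
        i, j = lo, hi
        while i <= j:
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        sort(lo, j)   # recurse on both partitions
        sort(i, hi)
    sort(0, len(a) - 1)
    return a
```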
Selection Sort
Best: O(n^2)
Average: O(n^2)
Worst: O(n^2)
Bubble Sort
Best: O(n)
Average: O(n^2)
Worst: O(n^2)
Insertion Sort
Best: O(n)
Average: O(n^2)
Worst: O(n^2)
Merge Sort
Best: O(nlog(n))
Average: O(nlog(n))
Worst: O(nlog(n))
Heap Sort
Best: O(nlog(n))
Average: O(nlog(n))
Worst: O(nlog(n))
Balanced vs Unbalanced Trees
Balanced Tree
query: O(logn)
build: O(n)
Examples: AVL-tree, Red-Black tree
Unbalanced Tree
query: O(n) (worst case; the tree can degenerate into a linked list)
Hash Table
Average / worst-case complexities:
Space: O(n) / O(n)
Search: O(1) / O(n)
Insert: O(1) / O(n)
Delete: O(1) / O(n)
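A minimal separate-chaining sketch that illustrates these bounds (class name and method names are illustrative, not from the source); all keys hashing to one bucket is the O(n) worst case:

```python
class ChainedHashTable:
    """Hash table with separate chaining: average O(1) search/insert/delete."""
    def __init__(self, capacity=8):
        self.buckets = [[] for _ in range(capacity)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def search(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket.pop(i)
                return True
        return False
```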
What does it mean for a problem to be NP-complete?
the problem is in NP and every other problem in NP is polynomial-time reducible to it
What are the major steps in proving a problem is NP-complete?
Basic Strategy to prove Problem X is NP-complete
Prove X ∈ NP
Choose a problem Y that is known to be NP-complete
Prove that Y ≤p X, i.e. give a polynomial-time reduction from Y to X
Master Theorem
For T(n) = aT(n/b) + f(n) with a ≥ 1, b > 1:
Case 1: f(n) = O(n^(log_b(a) − ε)) ⇒ T(n) = Θ(n^(log_b(a)))
Case 2: f(n) = Θ(n^(log_b(a))) ⇒ T(n) = Θ(n^(log_b(a)) * logn)
Case 3: f(n) = Ω(n^(log_b(a) + ε)) and a*f(n/b) ≤ c*f(n) for some c < 1 ⇒ T(n) = Θ(f(n))
Draw NP = P vs. NP ≠ P
Stable Sorting Algorithms
Bubble Sort, Insertion Sort, Merge Sort, Count Sort
Unstable Sorting Algorithms
Quick Sort, Heap Sort, Selection Sort
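A quick demonstration of what stability means, using Python's built-in sorted() (Timsort, which is stable): records with equal keys keep their original relative order.

```python
# Sort by count; ties keep input order because sorted() is stable.
records = [("apple", 2), ("pear", 1), ("plum", 2), ("fig", 1)]
by_count = sorted(records, key=lambda r: r[1])
# ("pear", 1) stays before ("fig", 1), ("apple", 2) before ("plum", 2).
```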
Count Sort
Best: O(n+k)
Average: O(n+k)
Worst: O(n+k)
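A minimal stable count sort sketch, assuming integer keys in the range [0, k):

```python
def counting_sort(a, k):
    """Stable count sort of integers in [0, k): O(n + k) time and space."""
    count = [0] * k
    for x in a:
        count[x] += 1
    # Turn counts into starting positions via a running prefix sum.
    total = 0
    for v in range(k):
        count[v], total = total, total + count[v]
    out = [None] * len(a)
    for x in a:              # scanning in input order makes the sort stable
        out[count[x]] = x
        count[x] += 1
    return out
```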
Radix Sort
uses count sort as a stable subroutine, sorting digit by digit
O(d(n+k)), where d is the number of digits and k the base (range of one digit)
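A minimal LSD radix sort sketch for non-negative integers; the stable per-digit bucketing plays the role of the count sort pass, giving d passes of O(n + k) each:

```python
def radix_sort(a, base=10):
    """LSD radix sort: one stable bucketing pass per digit, O(d * (n + k))."""
    if not a:
        return a
    max_val = max(a)
    exp = 1
    while max_val // exp > 0:
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // exp) % base].append(x)  # stable within each bucket
        a = [x for b in buckets for x in b]
        exp *= base
    return a
```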
Time Complexities sorted
O(1)<O(logn)<O(sqrt(n))<O(n)<O(nlogn)<O(n^2)<O(2^n)<O(n!)
Definition Big O
Upper bound
O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c*g(n) for all n ≥ n0 }
Definition Big Omega
Lower bound
Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that
0 ≤ c*g(n) ≤ f(n) for all n ≥ n0 }
Definition Big Theta
Tight bound
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ n0 }
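The Big-O definition can be checked numerically for a concrete f; the witnesses c = 4 and n0 = 10 below are chosen just for illustration:

```python
# f(n) = 3n + 10 is O(n): with c = 4 and n0 = 10,
# 0 <= f(n) <= c*n holds for every n >= n0.
def f(n):
    return 3 * n + 10

c, n0 = 4, 10
assert all(0 <= f(n) <= c * n for n in range(n0, 10_000))
```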
DFS/BFS runtime
O(V+E)
Array
access: O(1), search: O(n), insert/delete: O(n)
Linked List
access/search: O(n), insert/delete at a known node: O(1)
Stack
LIFO
Queue
FIFO
Red-Black-Tree, AVL-Tree
self-balancing BSTs: search, insert, delete in O(logn)
Binary Heap
Heapify: O(logn)
Build Heap with Heapify: O(n)
Find Max/Min: O(1)
Extract Max/Min: O(logn)
Increase Key: O(logn)
Insert: O(logn)
Delete: O(logn)
Merge: O(n+m)
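Python's heapq module implements a binary min-heap on a plain list and covers most of these operations:

```python
import heapq

nums = [5, 1, 9, 3]
heapq.heapify(nums)                  # build-heap: O(n)
assert nums[0] == 1                  # find-min: O(1), minimum sits at index 0
heapq.heappush(nums, 0)              # insert: O(logn)
assert heapq.heappop(nums) == 0      # extract-min: O(logn)
merged = list(heapq.merge([1, 3], [2, 4]))  # merge sorted inputs: O(n+m)
assert merged == [1, 2, 3, 4]
```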
Binomial Heap
Find Max/Min: O(logn)
Merge: O(logn)
Fibonacci Heap
Increase Key: O(1)
Merge: O(1)
Different Proof Tactics
Proof by contradiction
Proof by induction (stay ahead)
Proof by exchange
log_x(x)
1
log_x(1)
0
DFS
not optimal for finding the shortest path
used in applications such as cycle detection, topological ordering, etc.
recursive algorithm that uses the idea of backtracking
edge (x,y) in DFS means: x or y is the ancestor of the other
BFS
optimal for finding the shortest path
used in various applications such as bipartite graphs, shortest paths, etc.
no concept of backtracking
edge (x,y) in BFS means: x and y differ by at most 1 layer
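A minimal BFS shortest-path sketch on an unweighted adjacency-list graph (the function name is illustrative); the first time the goal is dequeued, the path to it is shortest:

```python
from collections import deque

def bfs_shortest_path(graph, start, goal):
    """BFS in O(V + E); returns a shortest path in an unweighted graph."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nbr in graph.get(node, []):
            if nbr not in visited:   # mark on enqueue to avoid duplicates
                visited.add(nbr)
                queue.append(path + [nbr])
    return None                      # goal unreachable
```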
Amortized cost Aggregate Analysis
Show that a sequence of n operations takes worst-case time T(n) in total
The amortized cost per operation is then T(n)/n
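The classic aggregate-analysis example: n appends to a doubling dynamic array cost fewer than 3n element-copies in total (n writes plus 1 + 2 + 4 + ... < n copies at resize), so each append is O(1) amortized:

```python
def total_copy_cost(n):
    """Count element-copies for n appends to a capacity-doubling array."""
    copies, capacity = 0, 1
    for size in range(n):
        if size == capacity:      # array full: double and copy everything
            copies += size
            capacity *= 2
        copies += 1               # write the new element
    return copies
```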
Amortized cost Accounting method
Charge each operation an amortized cost; cheap operations are overcharged and the surplus is stored as credit, which pays for later expensive operations. The total credit must never go negative.
Dijkstra runtime
O(ElogV) using min priority queue (binary heap)
Fibonacci Heap: O(E+VlogV)
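A minimal Dijkstra sketch using heapq as the binary-heap priority queue; stale queue entries are skipped instead of using decrease-key, which keeps the O(ElogV) bound:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra with a binary heap: O(E log V).
    graph maps node -> list of (neighbor, weight), weights >= 0."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                 # stale entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist                      # shortest distances from source
```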
MST
Kruskal time: O(E * logE)
space: O(V+E)
Prim: O(V^2) (can be O(E * logV) with the help of a binary heap)
space: O(V)
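A minimal Kruskal sketch: sort the edges (the O(E * logE) term) and use a union-find to add only edges that join two different components. Path halving without union by rank is used here for brevity:

```python
def kruskal(n, edges):
    """Kruskal's MST for nodes 0..n-1; edges are (weight, u, v) tuples.
    Returns (total weight, list of chosen edges)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):           # O(E log E)
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst
```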
Max-Flow
what algorithm
runtime & space
Ford-Fulkerson
runtime: O(|E| * f), where |E| is the number of edges and f is the maximum flow value of the network
Edmonds-Karp: runtime: O(|V| * |E|^2)
space for both: O(V)
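A minimal Edmonds-Karp sketch (Ford-Fulkerson with BFS always choosing a shortest augmenting path); the capacity graph is assumed to be a dict-of-dicts:

```python
from collections import deque

def edmonds_karp(capacity, s, t):
    """Max flow via BFS augmenting paths: O(V * E^2).
    capacity[u][v] is the capacity of edge (u, v)."""
    # Residual graph, adding reverse edges with capacity 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    max_flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return max_flow          # no augmenting path left
        # Bottleneck capacity along the found path.
        bottleneck, v = float("inf"), t
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        # Augment: push flow forward, grow residual capacity backward.
        v = t
        while parent[v] is not None:
            u = parent[v]
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
            v = u
        max_flow += bottleneck
```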
KMP
runtime: O(n+m)
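A minimal KMP sketch; the failure table is what keeps the scan at O(n + m), since the text pointer never moves backwards:

```python
def kmp_search(text, pattern):
    """Return the index of the first match of pattern in text, or -1."""
    if not pattern:
        return 0
    # failure[i] = length of the longest proper prefix of pattern[:i+1]
    # that is also a suffix of it.
    failure = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = failure[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        failure[i] = k
    # Scan the text, falling back via the failure table on mismatch.
    k = 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = failure[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - k + 1
    return -1
```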
Suffix Trie vs Suffix Tree
Suffix Trie
create: O(m^2)
space: Ω(m^2)
search for pattern: O(n)
Suffix Tree
create: O(m) (McCreight's algorithm)
space: O(2n − 1)
n is number of leaves
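A naive suffix trie sketch using nested dicts, showing the O(m^2) build (every suffix inserted character by character) and the O(n) pattern search:

```python
def build_suffix_trie(s):
    """Insert every suffix of s into a dict-based trie: O(m^2) time,
    up to O(m^2) nodes for a text of length m."""
    root = {}
    for i in range(len(s)):
        node = root
        for ch in s[i:]:
            node = node.setdefault(ch, {})
    return root

def contains(trie, pattern):
    """Check whether pattern is a substring of the text: O(n)
    for a pattern of length n."""
    node = trie
    for ch in pattern:
        if ch not in node:
            return False
        node = node[ch]
    return True
```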
BWT runtimes
transformation: O(n) (build the suffix array with the skew/DC3 algorithm)
decoding: O(n)
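A naive BWT sketch for illustration only; it sorts full rotations, which is far slower than the O(n) suffix-array route via the skew algorithm mentioned above:

```python
def bwt(s, sentinel="$"):
    """BWT by sorting all rotations of s + sentinel (naive, not O(n))."""
    s += sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def inverse_bwt(last, sentinel="$"):
    """Decode by repeatedly prepending the last column and sorting (naive)."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith(sentinel))
    return row.rstrip(sentinel)
```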
Approximate String Matching
O(km+|C|n)
k = number of mismatches allowed
|C| = number of candidates