CSIS-385: Analysis of Algorithms Final Exam Review

Chapter 1: Algorithms

pp 1-2: 7 Properties of Algorithms *****

Don’t just memorize them; be able to identify them and give examples of each

pp 4-9: Pseudocode

Imprecise but makes writing complex algorithms easier

Chapter 2: Math Review

pp 18: Polynomials

Remember: the largest exponent dominates the growth rate, e.g., 5n³ + 100n² + n = Θ(n³)

pp 24-26: Logarithms
The base only changes a constant factor: log₂ n = (log₂ 4)·log₄ n = 2·log₄ n, so log₂ n = Θ(log₄ n)

Logarithms grow very slowly: n log n grows much more slowly than n²

pp 32-37: Induction Proofs

Used to prove the correctness of algorithms and to establish running-time bounds
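
A worked example of the template (base case, then inductive step), written in LaTeX; exam problems apply the same pattern to loop invariants and running-time claims:

    \textbf{Claim:} $\sum_{i=1}^{n} i = \frac{n(n+1)}{2}$ for all $n \ge 1$.
    \textbf{Base case} ($n = 1$): $\sum_{i=1}^{1} i = 1 = \frac{1 \cdot 2}{2}$.
    \textbf{Inductive step:} assume the claim holds for $k$; then
    $\sum_{i=1}^{k+1} i = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2}$,
    so it holds for $k+1$.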

Chapter 2.3: Analysis of Algorithms

pp 41-43: Order Notation - Memorize the definitions of Big-O, Ω, and Θ *****

Big-O is an upper bound, Ω is a lower bound, and Θ is a tight bound (both at once)
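
A compact statement of the standard definitions in LaTeX (the textbook’s exact wording and choice of constants may differ slightly, so memorize its version):

    \begin{align*}
    f(n) = O(g(n))      &\iff \exists\, c > 0,\ n_0:\ f(n) \le c\,g(n) \text{ for all } n \ge n_0\\
    f(n) = \Omega(g(n)) &\iff \exists\, c > 0,\ n_0:\ f(n) \ge c\,g(n) \text{ for all } n \ge n_0\\
    f(n) = \Theta(g(n)) &\iff f(n) = O(g(n)) \text{ and } f(n) = \Omega(g(n))
    \end{align*}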

pp 44-49: Examples

pp 50-51: Common growth functions

Chapter 2.4: Recurrence Relations

pp 55-56: Simple Examples

pp 56-58: How to solve

pp 59-60: Main Recurrence Theorem (Big-Hammer) Review Exam 1 *****
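
Two worked examples in LaTeX, using a common simplified reading of the theorem for $T(n) = aT(n/b) + f(n)$; the book’s statement of the Main Recurrence Theorem may be phrased differently, so check its exact cases before the exam:

    \begin{align*}
    T(n) = 2T(n/2) + cn &: \quad n^{\log_2 2} = n \text{ matches } f(n) \;\Rightarrow\; T(n) = \Theta(n \log n) \quad \text{(mergesort)}\\
    T(n) = T(n/2) + c   &: \quad n^{\log_2 1} = 1 \text{ matches } f(n) \;\Rightarrow\; T(n) = \Theta(\log n) \quad \text{(binary search)}
    \end{align*}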

Chapter 2.5: Graphs

pp 68-70: Overview (review only)

pp 70-71: Adjacency Matrix for graphs

Also, consider adjacency list as alternative
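
A minimal Python sketch contrasting the two representations on a small undirected graph (the vertex labels 0..n-1 are just for illustration): the matrix gives O(1) edge tests at Θ(n²) space; the list uses Θ(n + m) space and iterates neighbors quickly.

    # Undirected graph with vertices 0..3 and edges (0,1), (0,2), (1,2), (2,3).
    edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
    n = 4

    # Adjacency matrix: O(1) edge test, Theta(n^2) space.
    matrix = [[0] * n for _ in range(n)]
    for u, v in edges:
        matrix[u][v] = matrix[v][u] = 1

    # Adjacency list: Theta(n + m) space, fast neighbor iteration.
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    print(matrix[1][2])   # 1 -> edge exists
    print(adj[2])         # [0, 1, 3]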

pp 71-76: Complete and bipartite graphs; paths, cycles, and other terminology

pp 76-81: Hamiltonian and Euler Cycles

Chapter 3: Data Structures

pp 101-109: Stacks & Queues (review only)

pp 111-117: Linked Lists

Linked Lists vs. Arrays *****

Adjacency Lists for graphs
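
A short Python sketch of the main trade-off (illustrative only): inserting at the head of a linked list is O(1), while inserting at the front of an array shifts every element; arrays, in turn, give O(1) random access.

    class Node:
        """Singly linked list node."""
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    # O(1) insertion at the head of a linked list ...
    head = None
    for v in [3, 2, 1]:
        head = Node(v, head)          # new node points at the old head

    # ... versus O(n) insertion at the front of an array (elements shift right).
    arr = []
    for v in [3, 2, 1]:
        arr.insert(0, v)

    # Traversal is O(n) in both; arrays additionally give O(1) random access.
    node = head
    while node:
        print(node.value, end=" ")    # 1 2 3
        node = node.next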

pp 120-130: Binary Search Trees (BST)

A linked, node-based structure that supports efficient insertion and search
O(log n) search on average; O(n) search in the worst case
These flaws motivate balanced trees (AVL trees) and heaps
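
A minimal Python sketch (not the textbook’s code) of insert and search on an unbalanced BST; both run in O(h), where h is the height: O(log n) on average, O(n) when the tree degenerates into a list.

    class BSTNode:
        def __init__(self, key):
            self.key = key
            self.left = None
            self.right = None

    def insert(root, key):
        """Insert key; O(h) where h is the tree height."""
        if root is None:
            return BSTNode(key)
        if key < root.key:
            root.left = insert(root.left, key)
        elif key > root.key:
            root.right = insert(root.right, key)
        return root                     # duplicates are ignored

    def search(root, key):
        """Return True if key is in the tree; also O(h)."""
        while root is not None:
            if key == root.key:
                return True
            root = root.left if key < root.key else root.right
        return False

    root = None
    for k in [5, 2, 8, 1, 4]:
        root = insert(root, k)
    print(search(root, 4), search(root, 7))   # True False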

pp 133-147: Binary Heaps, Priority Queues, Heapsort
Heaps are the best structure for priority queues:
O(1) find-min; O(log n) worst-case insert and delete-min (insert is O(1) on average)
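
A short sketch using Python’s built-in heapq module, which stores a binary min-heap in a list; the textbook may present max-heaps instead, but the operation costs are the same.

    import heapq

    # Priority queue: a binary min-heap stored in a Python list.
    pq = []
    for priority, task in [(3, "low"), (1, "urgent"), (2, "normal")]:
        heapq.heappush(pq, (priority, task))   # insert: O(log n) worst case

    print(pq[0])                # (1, 'urgent') -- find-min is O(1): it sits at index 0

    # Heapsort sketch: heapify is O(n), then n delete-mins of O(log n) each -> O(n log n).
    data = [9, 4, 7, 1, 3]
    heapq.heapify(data)
    ordered = []
    while data:
        ordered.append(heapq.heappop(data))    # delete-min: O(log n)
    print(ordered)              # [1, 3, 4, 7, 9]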

Chapter 5: Divide and Conquer

pp 219-222: Merge Sort *****
Know the merge function and recursive mergesort function

Divide and conquer: split top-down, merge bottom-up; O(n log n) in the worst case
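
A compact Python version of the two functions (the textbook’s pseudocode may sort in place instead of returning new lists):

    def merge(left, right):
        """Merge two sorted lists into one sorted list in O(len(left) + len(right))."""
        result, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i]); i += 1
            else:
                result.append(right[j]); j += 1
        result.extend(left[i:])
        result.extend(right[j:])
        return result

    def merge_sort(a):
        """Recursive mergesort: T(n) = 2T(n/2) + Theta(n) = Theta(n log n)."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

    print(merge_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]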

pp 226-231: Closest-pair Problem

There are O(n²) pairs to consider, yet divide and conquer yields an O(n log n) algorithm

see the PowerPoint Presentation
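
A simplified Python sketch of the divide-and-conquer idea; it re-sorts the strip by y in every call, giving O(n log² n), whereas the lecture version reaches O(n log n) by keeping the points y-sorted throughout the recursion. The example points are made up for illustration.

    import math

    def closest_pair(points):
        """Smallest distance between any two points; points = [(x, y), ...], n >= 2."""
        return _closest(sorted(points))          # presort by x-coordinate

    def _closest(pts):
        n = len(pts)
        if n <= 3:                               # brute-force the small cases
            return min(math.dist(p, q)
                       for i, p in enumerate(pts) for q in pts[i + 1:])
        mid = n // 2
        mid_x = pts[mid][0]
        d = min(_closest(pts[:mid]), _closest(pts[mid:]))
        # Points within d of the dividing line, ordered by y.
        strip = sorted((p for p in pts if abs(p[0] - mid_x) < d), key=lambda p: p[1])
        for i, p in enumerate(strip):
            for q in strip[i + 1:i + 8]:         # only a constant number of candidates
                d = min(d, math.dist(p, q))
        return d

    pts = [(0, 0), (3, 4), (1, 1), (7, 2), (2, 3), (6, 1)]
    print(closest_pair(pts))                     # 1.4142135623730951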


Chapter 6: Sorting

pp 239-241: Insertion Sort

O(n²) in the worst case, but very efficient on partially sorted lists
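
A minimal in-place Python version for reference:

    def insertion_sort(a):
        """In-place insertion sort: O(n^2) worst case, O(n) on an already sorted list."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0 and a[j] > key:   # shift larger elements right
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    print(insertion_sort([4, 2, 5, 1, 3]))   # [1, 2, 3, 4, 5]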

pp 243-252: Quicksort *****

O(n log n) on average, but O(n²) in the worst case

Picking a good pivot helps avoid the worst case, but does not guarantee it
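
A short Python sketch using a random pivot and list partitioning; the textbook’s version likely partitions in place (e.g., Lomuto or Hoare), but the analysis is the same.

    import random

    def quicksort(a):
        """Quicksort with a random pivot: O(n log n) expected, O(n^2) worst case."""
        if len(a) <= 1:
            return a
        pivot = random.choice(a)
        less    = [x for x in a if x < pivot]
        equal   = [x for x in a if x == pivot]
        greater = [x for x in a if x > pivot]
        return quicksort(less) + equal + quicksort(greater)

    print(quicksort([9, 3, 7, 1, 8, 2]))   # [1, 2, 3, 7, 8, 9]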

pp 254-259: Lower-bound

Ω(n log n) is a lower bound: any sort based on direct comparison of values needs at least that many comparisons in the worst case

O(n) sorting is possible (counting sort), but it requires an auxiliary array and keys that map to a bounded integer range.
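
A minimal Python counting sort, assuming keys are integers in a known range 0..k:

    def counting_sort(a, k):
        """Sort integers in range 0..k in O(n + k) time and space (not comparison-based)."""
        count = [0] * (k + 1)
        for x in a:
            count[x] += 1                 # tally each key
        result = []
        for value, c in enumerate(count):
            result.extend([value] * c)    # emit each key c times
        return result

    print(counting_sort([3, 1, 4, 1, 5, 2, 2], k=5))   # [1, 1, 2, 2, 3, 4, 5]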

Chapter 7: Greedy Algorithms

pp 275-282: Kruskal’s MST Algorithm *****
sorts edges by weight and considers the smallest edges first; skips edges that would create a cycle
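
A compact Python sketch (not the course pseudocode): edges are assumed to be (weight, u, v) tuples, and cycles are detected with a simple union-find (path compression only; the class version may also use union by rank).

    def kruskal(n, edges):
        """Kruskal's MST: sort edges by weight, add an edge unless it forms a cycle.
        edges: list of (weight, u, v); vertices are 0..n-1."""
        parent = list(range(n))

        def find(x):                       # set representative, with path compression
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):      # smallest edges first
            ru, rv = find(u), find(v)
            if ru != rv:                   # different components -> no cycle
                parent[ru] = rv
                mst.append((u, v, w))
        return mst

    edges = [(1, 0, 1), (4, 0, 2), (2, 1, 2), (5, 2, 3), (3, 1, 3)]
    print(kruskal(4, edges))   # [(0, 1, 1), (1, 2, 2), (1, 3, 3)]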

pp 284-293: Prim’s MST Algorithm *****

grows the tree by repeatedly adding the cheapest edge that is one hop away from the tree
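
A Python sketch using a heap of candidate edges; the adjacency-list format {u: [(weight, v), ...]} is an assumption for illustration, and stale heap entries are skipped instead of using decrease-key.

    import heapq

    def prim(adj, start=0):
        """Prim's MST sketch on an adjacency list {u: [(weight, v), ...]}.
        Repeatedly adds the cheapest single edge leaving the tree, via a min-heap."""
        visited = set()
        heap = [(0, start, -1)]                 # (edge weight, vertex, parent); -1 = no parent
        mst, total = [], 0
        while heap and len(visited) < len(adj):
            w, u, parent = heapq.heappop(heap)  # cheapest candidate edge
            if u in visited:                    # stale entry: u already reached more cheaply
                continue
            visited.add(u)
            total += w
            if parent != -1:
                mst.append((parent, u, w))
            for weight, v in adj[u]:
                if v not in visited:
                    heapq.heappush(heap, (weight, v, u))
        return mst, total

    adj = {0: [(1, 1), (4, 2)], 1: [(1, 0), (2, 2), (3, 3)],
           2: [(4, 0), (2, 1), (5, 3)], 3: [(3, 1), (5, 2)]}
    print(prim(adj))    # ([(0, 1, 1), (1, 2, 2), (1, 3, 3)], 6)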

pp 295-301: Dijkstra’s Shortest Path Algorithm *****

nearly identical to Prim’s, but selects by total distance from the source rather than by the weight of a single hop

Prim’s and Dijkstra’s are made efficient through data structures:
O(1) vertex lookups using arrays, O(log n) extract-min using heaps
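
A Python sketch in the same style as the Prim sketch above; the only real change is that the heap is keyed by total distance from the source rather than by a single edge weight. The adjacency-list format is again an assumption for illustration.

    import heapq

    def dijkstra(adj, source):
        """Dijkstra sketch on an adjacency list {u: [(weight, v), ...]}.
        The heap is keyed by distance from the source; extract-min is O(log n)."""
        dist = {u: float("inf") for u in adj}
        dist[source] = 0
        heap = [(0, source)]                    # (distance from source, vertex)
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist[u]:                     # stale entry; a shorter path was found
                continue
            for w, v in adj[u]:
                if d + w < dist[v]:             # relax edge (u, v)
                    dist[v] = d + w
                    heapq.heappush(heap, (dist[v], v))
        return dist

    adj = {0: [(1, 1), (4, 2)], 1: [(1, 0), (2, 2), (3, 3)],
           2: [(4, 0), (2, 1), (5, 3)], 3: [(3, 1), (5, 2)]}
    print(dijkstra(adj, 0))    # {0: 0, 1: 1, 2: 3, 3: 4}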

pp 313-318: Continuous Knapsack Problem

A greedy algorithm is optimal because you can take a fraction (percentage) of each item. Sort by value density (value per unit weight). Note: very different from the general (0/1) knapsack problem.
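
A short greedy Python sketch; the (value, weight) item format is an assumption for illustration.

    def fractional_knapsack(items, capacity):
        """Greedy continuous knapsack: items = [(value, weight), ...].
        Take items in order of value density; take a fraction of the last one."""
        total = 0.0
        # Sort by value per unit weight, densest first.
        for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
            if capacity <= 0:
                break
            take = min(weight, capacity)            # whole item if it fits, else a fraction
            total += value * (take / weight)
            capacity -= take
        return total

    items = [(60, 10), (100, 20), (120, 30)]        # (value, weight)
    print(fractional_knapsack(items, capacity=50))  # 240.0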

Chapter 8: Dynamic Programming

pp 342-347: Edit Distance or Longest Common Subsequence Problem *****

see the PowerPoint Presentation
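
A standard dynamic-programming sketch in Python for LCS length (the lecture may use edit distance instead; the table-filling idea is the same).

    def lcs_length(a, b):
        """Longest common subsequence length via dynamic programming.
        dp[i][j] = LCS length of a[:i] and b[:j]; O(len(a) * len(b)) time and space."""
        m, n = len(a), len(b)
        dp = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if a[i - 1] == b[j - 1]:
                    dp[i][j] = dp[i - 1][j - 1] + 1             # extend a common subsequence
                else:
                    dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop a char from a or b
        return dp[m][n]

    print(lcs_length("ABCBDAB", "BDCABA"))   # 4  (e.g. "BCAB")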

pp 350-355: Floyd’s Shortest Path Algorithm *****

An O(n³) algorithm, but it finds the shortest paths between all pairs of vertices.
see the PowerPoint Presentation
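
A direct Python version of the triple loop; the input matrix format (INF for missing edges, 0 on the diagonal) is an assumption for illustration.

    INF = float("inf")

    def floyd(dist):
        """Floyd-Warshall all-pairs shortest paths.
        dist: n x n matrix, dist[i][j] = weight of edge i->j (INF if none, 0 on diagonal).
        Three nested loops over vertices -> O(n^3)."""
        n = len(dist)
        d = [row[:] for row in dist]                # work on a copy
        for k in range(n):                          # allow vertex k as an intermediate stop
            for i in range(n):
                for j in range(n):
                    if d[i][k] + d[k][j] < d[i][j]:
                        d[i][j] = d[i][k] + d[k][j]
        return d

    graph = [[0, 3, INF, 7],
             [8, 0, 2, INF],
             [5, INF, 0, 1],
             [2, INF, INF, 0]]
    print(floyd(graph))
    # [[0, 3, 5, 6], [5, 0, 2, 3], [3, 6, 0, 1], [2, 5, 7, 0]]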

Chapter 10: P and NP *****

pp 429-431: Overview

pp 458-459: P, NP, and Exponential

pp 466-473: NP-Complete Problems
see the PowerPoint Presentation

Special Topics not covered in Book

AVL Trees ****

A balanced binary search tree (BST); it keeps the height difference between each node’s subtrees to at most one, which eliminates the worst case in which a plain BST degenerates into a linear structure.
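
A minimal Python sketch of AVL insertion with rotations (assumes distinct keys; deletion is omitted). This is not the course’s code, just an illustration of how the balance factor triggers the rotation cases.

    class AVLNode:
        def __init__(self, key):
            self.key = key
            self.left = self.right = None
            self.height = 1

    def h(node):                        # height of a possibly-empty subtree
        return node.height if node else 0

    def update(node):
        node.height = 1 + max(h(node.left), h(node.right))

    def balance(node):                  # positive = left-heavy, negative = right-heavy
        return h(node.left) - h(node.right)

    def rotate_right(y):
        x = y.left
        y.left, x.right = x.right, y
        update(y); update(x)
        return x

    def rotate_left(x):
        y = x.right
        x.right, y.left = y.left, x
        update(x); update(y)
        return y

    def insert(node, key):
        """BST insert followed by rebalancing on the way back up; O(log n)."""
        if node is None:
            return AVLNode(key)
        if key < node.key:
            node.left = insert(node.left, key)
        else:
            node.right = insert(node.right, key)
        update(node)
        b = balance(node)
        if b > 1:                       # left-heavy
            if balance(node.left) < 0:  # left-right case: rotate the child first
                node.left = rotate_left(node.left)
            return rotate_right(node)
        if b < -1:                      # right-heavy
            if balance(node.right) > 0: # right-left case
                node.right = rotate_right(node.right)
            return rotate_left(node)
        return node

    root = None
    for k in range(1, 8):               # inserting 1..7 in order would ruin a plain BST
        root = insert(root, k)
    print(root.key, root.height)         # 4 3 -- the tree stays balanced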

Knapsack Problem ****

The book covers the continuous knapsack problem (take a fraction/percentage of an item), which can be solved with a greedy algorithm. We also studied the general (0/1) knapsack problem, where you must take each item whole or not at all; it is typically solved with dynamic programming.
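
A standard dynamic-programming sketch in Python for the 0/1 version, using a one-dimensional table; the class solution may use a full 2-D table, but the recurrence is the same.

    def knapsack_01(items, capacity):
        """0/1 knapsack by dynamic programming: each item is taken whole or not at all.
        items = [(value, weight), ...]; O(n * capacity) time (pseudo-polynomial)."""
        dp = [0] * (capacity + 1)          # dp[c] = best value achievable with capacity c
        for value, weight in items:
            # Iterate capacities downward so each item is used at most once.
            for c in range(capacity, weight - 1, -1):
                dp[c] = max(dp[c], dp[c - weight] + value)
        return dp[capacity]

    items = [(60, 10), (100, 20), (120, 30)]
    print(knapsack_01(items, capacity=50))   # 220 -- greedy by density would give only 160 here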