Dynamic Programming, Backtracking and Branch & Bound

Algorithm Analysis & Design (CST306) - Module 4 Study Guide


Dynamic Programming: Matrix Chain Multiplication

[SURE SHOT | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

Imagine you are a chef in a high-pressure kitchen with three tasks: peeling potatoes, chopping them, and frying them. The dish comes out the same either way, but the order changes the total effort: chop before peeling and it takes forever; peel first, then chop, and it is efficient.

The Analogy: Matrix Chain Multiplication is like finding the most efficient “recipe” for multiplying a long string of matrices. Multiplying matrices is expensive (it takes many scalar multiplications). Because matrix multiplication is associative, $(A \times B) \times C$ gives the same result as $A \times (B \times C)$, but one way might take 10 times more “effort” (scalar multiplications) than the other. We use Dynamic Programming to find the absolute cheapest way to parenthesize the chain.

Section B: Exam-Ready Theory (The Rigor)

1. Problem Statement: Given a chain of $n$ matrices $\langle A_1, A_2, \dots, A_n \rangle$, where matrix $A_i$ has dimensions $p_{i-1} \times p_i$, the goal is to fully parenthesize the product $A_1 A_2 \dots A_n$ in a way that minimizes the total number of scalar multiplications.

2. Cost Calculation: Multiplying a $p \times q$ matrix by a $q \times r$ matrix takes $p \times q \times r$ scalar multiplications.

3. Recursive Property: Let $m[i, j]$ be the minimum number of scalar multiplications needed to compute the matrix $A_{i..j}$.

$$m[i, j] = \begin{cases} 0 & \text{if } i = j \\ \min\limits_{i \leq k < j} \{\, m[i, k] + m[k+1, j] + p_{i-1} p_k p_j \,\} & \text{if } i < j \end{cases}$$

4. Complexity:

  • Time Complexity: $O(n^3)$
  • Space Complexity: $O(n^2)$
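The recurrence above can be filled in bottom-up, solving all chains of length 2, then length 3, and so on. A minimal Python sketch (the function name is illustrative):

```python
def matrix_chain_cost(p):
    """Minimum scalar multiplications for the chain A_1..A_n,
    where A_i is p[i-1] x p[i] (so len(p) == n + 1)."""
    n = len(p) - 1
    # m[i][j] = min cost to compute A_i..A_j (1-indexed; m[i][i] = 0)
    m = [[0] * (n + 1) for _ in range(n + 1)]
    for L in range(2, n + 1):              # chain length
        for i in range(1, n - L + 2):
            j = i + L - 1
            m[i][j] = min(m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                          for k in range(i, j))
    return m[1][n]
```

Each cell $m[i][j]$ depends only on strictly shorter chains, which is why the table must be filled by increasing chain length $L$.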

Section C: Step-by-Step Worked Example (The Application)

Example: Matrix Chain Multiplication (April 2025 University Question). Chain of 4 matrices: $A_1(5 \times 6), A_2(6 \times 4), A_3(4 \times 8), A_4(8 \times 10)$. Dimension vector $p = [5, 6, 4, 8, 10]$.

  1. $m[i,j]$ table initialization:
    • $m[1,1] = m[2,2] = m[3,3] = m[4,4] = 0$.
  2. $L=2$ (length-2 chains):
    • $m[1,2] = p_0 p_1 p_2 = 5 \times 6 \times 4 = 120$.
    • $m[2,3] = p_1 p_2 p_3 = 6 \times 4 \times 8 = 192$.
    • $m[3,4] = p_2 p_3 p_4 = 4 \times 8 \times 10 = 320$.
  3. $L=3$ (length-3 chains):
    • $m[1,3] = \min(k{=}1: m[1,1]+m[2,3]+p_0 p_1 p_3,\; k{=}2: m[1,2]+m[3,3]+p_0 p_2 p_3)$
    • $m[1,3] = \min(0+192+240,\; 120+0+160) = \min(432, 280) = 280$ (split at $k=2$).
    • $m[2,4] = \min(k{=}2: m[2,2]+m[3,4]+p_1 p_2 p_4,\; k{=}3: m[2,3]+m[4,4]+p_1 p_3 p_4)$
    • $m[2,4] = \min(0+320+240,\; 192+0+480) = \min(560, 672) = 560$ (split at $k=2$).
  4. $L=4$ (final chain):
    • $m[1,4] = \min(k{=}1: 0+560+300,\; k{=}2: 120+320+200,\; k{=}3: 280+0+400)$
    • $m[1,4] = \min(860, 640, 680) = 640$ (split at $k=2$).

Final Result: Minimum scalar multiplications = 640. Optimal Order: $(A_1 A_2)(A_3 A_4)$.
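The table result can be cross-checked with a top-down memoized version of the same recurrence that also records the best split and reconstructs the parenthesization; a sketch (names are illustrative):

```python
from functools import lru_cache

def optimal_order(p):
    """Return (min cost, parenthesization) for the chain with dimension vector p."""
    @lru_cache(maxsize=None)
    def solve(i, j):                       # best way to compute A_i..A_j
        if i == j:
            return 0, f"A{i}"
        best = (float("inf"), "")
        for k in range(i, j):              # try every split point
            left_cost, left_expr = solve(i, k)
            right_cost, right_expr = solve(k + 1, j)
            cost = left_cost + right_cost + p[i - 1] * p[k] * p[j]
            if cost < best[0]:
                best = (cost, f"({left_expr}{right_expr})")
        return best

    return solve(1, len(p) - 1)
```

For the April 2025 chain this yields cost 640 with the split at $k=2$, i.e. $(A_1 A_2)(A_3 A_4)$, matching the table.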

Section D: How to Write This in the Exam (The Strategy)

  • Start With: State the Principle of Optimality and the cost formula.
  • Body: Draw the triangular $m$ table. Show the $k$ split values clearly.
  • Traps: Do not forget to state the final parenthesized order.
  • Close With: Time complexity $O(n^3)$.

Dynamic Programming: Floyd-Warshall (All Pairs)

[SURE SHOT | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

Imagine you are designing a high-speed train network. You need a master table that shows the fastest way between every possible pair of cities simultaneously.

The Analogy: Floyd-Warshall is the “Master Schedule”. It starts with direct routes. Then it asks: “What if we allow a stopover in City 1? Can we find a shortcut?” It does this for every city until the table is perfectly optimized.

Section B: Exam-Ready Theory (The Rigor)

  • Recurrence: $D^{(k)}_{ij} = \min\big(D^{(k-1)}_{ij},\; D^{(k-1)}_{ik} + D^{(k-1)}_{kj}\big)$.
  • Complexity: $O(V^3)$ time, $O(V^2)$ space.
  • Property: Handles negative edge weights but not negative cycles.

Section C: Step-by-Step Worked Example (The Application)

Example: Floyd-Warshall Trace (University Question). Initial matrix $D^{(0)}$ (4 nodes, $\infty$ = no direct edge):

      1   2   3   4
1 [   0,  9, -4,  ∞ ]
2 [   6,  0,  ∞,  2 ]
3 [   ∞,  5,  0,  ∞ ]
4 [   ∞,  ∞,  1,  0 ]
  1. $D^{(1)}$ (via 1): $D_{23} = \min(\infty,\; D_{21} + D_{13}) = \min(\infty,\; 6 + (-4)) = 2$.
  2. $D^{(2)}$ (via 2): $D_{13} = \min(-4,\; D_{12} + D_{23}) = \min(-4,\; 9 + 2) = -4$ (no improvement); also $D_{14} = \min(\infty,\; 9 + 2) = 11$ and $D_{34} = \min(\infty,\; 5 + 2) = 7$.
  3. Continue with $k = 3, 4$; the final matrix $D^{(4)}$ holds the shortest paths for all pairs.
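The trace is easy to verify mechanically. A minimal Python sketch of the three-nested-loop algorithm, run on the matrix above (`INF = float("inf")` marks the absence of a direct edge):

```python
INF = float("inf")

def floyd_warshall(D):
    """All-pairs shortest paths; D is the adjacency matrix, INF = no edge."""
    n = len(D)
    D = [row[:] for row in D]              # work on a copy
    for k in range(n):                     # allow stopovers at vertex k
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

D0 = [[0,   9,   -4,  INF],
      [6,   0,   INF, 2],
      [INF, 5,   0,   INF],
      [INF, INF, 1,   0]]
```

Note the $-4$ edge: Floyd-Warshall handles it correctly because this graph has no negative cycle.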

Section D: How to Write This in the Exam (The Strategy)

  • Start With: Write the 3-nested-loop algorithm.
  • Body: Show every intermediate $D^{(k)}$ matrix.
  • Close With: Final optimized distance matrix.

Backtracking / Branch & Bound: TSP and N-Queens

[HIGH PROB | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

The Analogy: Backtracking is like walking into a tunnel, hitting a dead end, and turning back. Branch & Bound is like sending friends with walkie-talkies: they explore level-by-level and immediately “kill” any path whose cost bound already exceeds the best solution found so far.

Section B: Exam-Ready Theory (The Rigor)

  • N-Queens: Uses Backtracking (DFS).
  • TSP: Uses Branch and Bound (BFS/LC-Search) with Matrix Reduction.

Section C: Step-by-Step Worked Example (The Application)

Example 1: 4-Queens Problem

  1. Place Q1 at (1,1); the first safe spot in row 2 is (2,3).
  2. With Q1=(1,1) and Q2=(2,3), no column in row 3 is safe, so backtrack: try Q2 at (2,4), which also dead-ends, then move Q1 to (1,2).
  3. With Q1=(1,2): Q2=(2,4), Q3=(3,1), Q4=(4,3) are all safe.
  4. Final Solution (column per row): $(2, 4, 1, 3)$.
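The trace above can be automated with a depth-first search that places one queen per row and undoes a placement when it leads to a dead end; a minimal sketch (names are illustrative):

```python
def solve_n_queens(n):
    """Backtracking (DFS): place one queen per row, prune conflicting columns."""
    solutions, cols = [], []               # cols[r] = column of queen in row r

    def safe(row, col):
        # No earlier queen shares this column or either diagonal.
        return all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols))

    def place(row):
        if row == n:
            solutions.append([c + 1 for c in cols])   # report 1-indexed columns
            return
        for col in range(n):
            if safe(row, col):
                cols.append(col)
                place(row + 1)
                cols.pop()                 # backtrack: undo the placement

    place(0)
    return solutions
```

For $n = 4$ the first solution found is exactly $(2, 4, 1, 3)$; the only other solution is its mirror $(3, 1, 4, 2)$.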

Example 2: TSP Matrix Reduction. Cost(Root) = sum of row minima + sum of column minima (columns are reduced after the rows). To branch from city $i$ to city $j$: set row $i$ and column $j$ to $\infty$, set the entry $(j, \text{start})$ to $\infty$, re-reduce the matrix, and take child bound = parent bound + (reduced cost of edge $i \to j$) + (new reduction cost).
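The root-bound computation can be sketched as follows. The 4-city cost matrix here is illustrative (not from the question paper), and the function name is an assumption:

```python
INF = float("inf")

def reduce_matrix(M):
    """Row-reduce, then column-reduce; return (reduced matrix, reduction cost)."""
    M = [row[:] for row in M]
    n, cost = len(M), 0
    for i in range(n):                     # subtract each row's minimum
        r = min(M[i])
        if 0 < r < INF:
            cost += r
            M[i] = [x - r if x < INF else x for x in M[i]]
    for j in range(n):                     # subtract each column's minimum
        c = min(M[i][j] for i in range(n))
        if 0 < c < INF:
            cost += c
            for i in range(n):
                if M[i][j] < INF:
                    M[i][j] -= c
    return M, cost

# Illustrative symmetric cost matrix; the diagonal is INF (no self-loops).
M = [[INF, 10,  15,  20],
     [10,  INF, 35,  25],
     [15,  35,  INF, 30],
     [20,  25,  30,  INF]]
```

Here the row minima sum to 55 and the remaining column minima to 15, so Cost(Root) = 70; each child node applies the same reduction to its modified matrix.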

Section D: How to Write This in the Exam (The Strategy)

  • Start With: Define State Space Tree.
  • Body: Draw the pruned tree for 4-Queens. Show reduction steps for TSP.
  • Close With: Final tour path and min cost.