Algorithm Analysis & Recurrences

Algorithm Analysis & Design (CST306) - Module 1 Study Guide

Part A: High Frequency Definitions (Module 1 Focus)

Topic: Little-o notation $o(n^2)$ justification

[SURE SHOT | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

Imagine you are in a race against a rocket. No matter how fast you run, the rocket will eventually outpace you so severely that your speed becomes mathematically insignificant compared to its speed.

The Analogy: Little-o notation is the mathematical way of saying “I will get completely destroyed by this upper limit.” While Big-O means “I will be less than or equal to this limit,” Little-o strictly means “I will be strictly less than this limit as things get huge.”

If a function $f(n)$ is $o(n^2)$, it means the growth rate of $f(n)$ is strictly slower than $n^2$. As $n$ approaches infinity, $f(n)$ becomes arbitrarily small compared to $n^2$.

Section B: Exam-Ready Theory (The Rigor)

  • Definition: A function $f(n) = o(g(n))$ if and only if for ANY positive constant $c > 0$, there exists a constant $n_0 > 0$ such that $0 \leq f(n) < c \cdot g(n)$ for all $n \geq n_0$.
  • Limit Definition: The most rigorous way to justify little-o is using limits. $f(n) = o(g(n))$ if: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0$
  • Justification for $o(n^2)$: If you are asked to justify why a function like $n \log n$ is $o(n^2)$, you apply the limit definition: $\lim_{n \to \infty} \frac{n \log n}{n^2} = \lim_{n \to \infty} \frac{\log n}{n} = 0$. Since the limit is 0, $n \log n$ grows strictly slower than $n^2$, hence it is $o(n^2)$.

Section C: Step-by-Step Worked Example (The Application)

Question: Justify whether $2n = o(n^2)$.

  1. Apply the Limit Test: Set up the limit of $f(n)/g(n)$ as $n$ approaches infinity: $\lim_{n \to \infty} \frac{2n}{n^2}$
  2. Simplify the expression: $= \lim_{n \to \infty} \frac{2}{n}$
  3. Evaluate the limit: As $n$ grows infinitely large, a fixed constant divided by an ever-growing $n$ approaches zero: $= 0$
  4. Conclusion: Since the limit is 0, $2n$ is indeed $o(n^2)$.
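The limit test above can also be spot-checked numerically. A minimal C sketch (the helper name `little_o_ratio` is our own, purely illustrative) shows the ratio $2n/n^2$ shrinking toward 0 as $n$ grows:

```c
/* Numeric sketch of the limit test for 2n = o(n^2): the ratio
 * f(n)/g(n) = 2n/n^2 = 2/n shrinks toward 0 as n grows. */
double little_o_ratio(double n) {
    double f = 2.0 * n; /* f(n) = 2n  */
    double g = n * n;   /* g(n) = n^2 */
    return f / g;
}
```

Evaluating at increasing $n$ (10, 100, 1000, ...) shows the ratio strictly decreasing toward 0, exactly what the limit definition demands.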

Section D: How to Write This in the Exam (The Strategy)

  • Start With: Write down the formal limit definition: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0$.
  • Body: Substitute the given functions into the limit. Simplify the algebraic expression cleanly before taking the limit.
  • Traps: Do not confuse Little-o with Big-O. Big-O allows the limit to be a constant greater than 0 (e.g., $2n = O(n)$ because the limit is 2). Little-o requires the limit to be exactly 0.
  • Close With: Conclude with the sentence: “Because the limit evaluates to 0, $f(n)$ grows strictly slower than $g(n)$, satisfying the little-o property.”

Part B: Core Topics

Solving Recurrences (Master’s Theorem & Recursion Tree)

[SURE SHOT | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

Imagine you are tasked with counting all the leaves on a massive, sprawling oak tree. Counting them one by one would take forever. Instead, you recruit three friends and say, “Each of you take a main branch, count the leaves, and tell me the total.” Your friends do the exact same thing, recruiting their own friends for the smaller branches, until someone reaches a tiny twig with just one leaf.

The Analogy: Solving recurrences is like calculating the total time it takes for all these friends to finish counting. The work is split into smaller identical tasks (recursive calls), but splitting the tasks and combining the results also takes some effort.

In computer science, algorithms like Merge Sort or Binary Search work by splitting a problem into smaller chunks. A recurrence relation is simply a mathematical equation that describes the running time of these recursive algorithms in terms of their smaller inputs. We need methods like the Master’s Theorem, Recursion Trees, and Substitution to solve these equations and find the final Time Complexity in standard asymptotic notation (like $O(n \log n)$).

Section B: Exam-Ready Theory (The Rigor)

To analyze recursive algorithms, we represent their time complexity as a Recurrence Relation: an equation that describes a function in terms of its values on smaller inputs, such as $T(n) = aT(n/b) + f(n)$.

Here are the three primary rigorous methods to solve them:

1. The Master’s Theorem

This is a direct formula-based method for solving recurrences of the form $T(n) = aT(n/b) + \Theta(n^k \log^p n)$, where:

  • $a \geq 1$ is the number of subproblems (recursive calls).
  • $b > 1$ is the factor by which the input size is divided.
  • $k \geq 0$ and $p$ (any real number) together describe the cost $f(n) = \Theta(n^k \log^p n)$ of dividing and combining.

The 3 Master Cases:

  • Case 1: If $a > b^k$, then $T(n) = \Theta(n^{\log_b a})$
  • Case 2: If $a = b^k$, we check the value of $p$:
    • If $p > -1$, then $T(n) = \Theta(n^{\log_b a} \log^{p+1} n)$
    • If $p = -1$, then $T(n) = \Theta(n^{\log_b a} \log \log n)$
    • If $p < -1$, then $T(n) = \Theta(n^{\log_b a})$
  • Case 3: If $a < b^k$, we check the value of $p$:
    • If $p \geq 0$, then $T(n) = \Theta(n^k \log^p n)$
    • If $p < 0$, then $T(n) = \Theta(n^k)$

2. The Recursion Tree Method

This is a pictorial representation of the iteration method. It converts the recurrence into a tree whose nodes represent the cost incurred at the various levels of the recursion.

  • The root is the initial cost $f(n)$.
  • Its children represent the cost of the subproblems, e.g., $a$ branches each costing $f(n/b)$.
  • You sum the costs across each horizontal level of the tree.
  • Finally, you sum the costs of all levels from the root down to the leaves to get the total complexity.

3. The Substitution Method

This method involves two formal mathematical steps:

  1. Guess the form of the solution (e.g., guess that $T(n) = O(n \log n)$).
  2. Use Mathematical Induction to find the constants and prove that the guess is correct. This involves showing $T(n) \leq c \cdot g(n)$.

Section C: Step-by-Step Worked Example (The Application)

Example 1: Master’s Theorem. Solve $T(n) = 3T(n/4) + n \log n$

  1. Compare with the standard form: $T(n) = aT(n/b) + \Theta(n^k \log^p n)$
  2. Extract the constants:
    • $a = 3$
    • $b = 4$
    • $f(n) = n \log^1 n \implies k = 1, p = 1$
  3. Evaluate $b^k$:
    • $b^k = 4^1 = 4$
  4. Compare $a$ and $b^k$:
    • $a = 3$ and $b^k = 4$.
    • Since $3 < 4$, we have $a < b^k$. This falls under Case 3.
  5. Check $p$ for Case 3:
    • $p = 1$, which means $p \geq 0$.
  6. Apply the formula: $T(n) = \Theta(n^k \log^p n)$
    • $T(n) = \Theta(n^1 \log^1 n)$
    • Final Answer: $T(n) = \Theta(n \log n)$

Example 2: Recursion Tree Method (April 2025 University Question). Solve $T(n) = 3T(n/4) + cn^2$

  1. Level 0 (Root): The cost at the root is $cn^2$.
  2. Level 1: The recurrence branches 3 ways ($a = 3$). The input size drops to $n/4$.
    • Each node costs $c(n/4)^2 = cn^2/16$.
    • Total cost at Level 1 $= 3 \times (cn^2/16) = (3/16)cn^2$.
  3. Level 2: Each of the 3 nodes branches 3 ways (9 nodes total). Input is $n/16$.
    • Each node costs $c(n/16)^2 = cn^2/256$.
    • Total cost at Level 2 $= 9 \times (cn^2/256) = (3/16)^2 cn^2$.
  4. Level $i$ (General Case):
    • Number of nodes $= 3^i$.
    • Cost per node $= c(n/4^i)^2 = cn^2/16^i$.
    • Total cost at Level $i = (3/16)^i cn^2$.
  5. Total Cost Summation:
    • $T(n) = \sum_{i=0}^{\log_4 n - 1} (3/16)^i cn^2 + \text{Leaf Cost}$
    • This is a geometric series with ratio $r = 3/16$.
    • $T(n) \leq cn^2 [1 + (3/16) + (3/16)^2 + \dots]$
    • Using the infinite series formula $S = a/(1-r)$:
    • $T(n) < cn^2 \times \frac{1}{1 - 3/16} = \frac{16}{13} cn^2$
  6. Final Result:
    • $T(n) = O(n^2)$.
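The level-by-level sums can be verified numerically. This C sketch (the helper name `tree_cost` is an assumption of ours) adds the internal-level costs and lets us confirm the total stays below the $(16/13)cn^2$ geometric bound:

```c
/* Sketch: sum the per-level costs (3/16)^i * c * n^2 of the recursion
 * tree for T(n) = 3T(n/4) + c*n^2, one level per factor-of-4 shrink,
 * for comparison against the geometric bound (16/13) * c * n^2. */
double tree_cost(double c, double n) {
    double total = 0.0;
    double level_cost = c * n * n; /* Level 0 costs c*n^2 */
    double size = n;
    while (size > 1.0) {           /* internal levels until size 1 */
        total += level_cost;
        level_cost *= 3.0 / 16.0;  /* each level is 3/16 of the last */
        size /= 4.0;
    }
    return total;
}
```

For $n = 256$ and $c = 1$ the sum comes out just under $1.23\,n^2$, below the $16/13 \approx 1.2308$ bound, as the series predicts.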

Example 3: Substitution Method (April 2025 University Question). Solve $T(n) = 2T(\lfloor n/2 \rfloor) + n$ where $T(1) = 1$.

  1. Guess: Based on the form (similar to Merge Sort), we guess $T(n) = O(n \log n)$.
  2. Inductive Goal: Prove $T(n) \leq cn \log n$ for some constant $c > 0$.
  3. Inductive Step: Assume the bound holds for $n/2$, i.e., $T(n/2) \leq c(n/2) \log(n/2)$.
  4. Substitute into Recurrence:
    • $T(n) \leq 2[c(n/2) \log(n/2)] + n$
    • $T(n) \leq cn \log(n/2) + n$
    • $T(n) \leq cn(\log n - \log 2) + n$
    • $T(n) \leq cn \log n - cn + n$
  5. Final Condition: We need $cn \log n - cn + n \leq cn \log n$.
    • This is true if $-cn + n \leq 0 \implies n(1-c) \leq 0$.
    • This holds for all $c \geq 1$.
  6. Conclusion: $T(n) = O(n \log n)$.
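The proved bound can be spot-checked by evaluating the recurrence directly. In this C sketch the helper name `t_rec` is our own; it computes $T(n)$ so the values can be compared against $cn\log_2 n$ with $c = 2$:

```c
/* Sketch: evaluate T(n) = 2T(floor(n/2)) + n with T(1) = 1 directly,
 * so the proved bound T(n) <= c * n * log2(n) can be spot-checked. */
long t_rec(long n) {
    if (n <= 1) return 1; /* base case T(1) = 1 */
    return 2 * t_rec(n / 2) + n;
}
```

For powers of two, $T(2^k) = (k+1)2^k$, which sits comfortably under $2 \cdot n \log_2 n$ for $n \geq 2$. (Note the check starts at $n \geq 2$: at $n = 1$ the bound $c \cdot 1 \cdot \log 1 = 0$ fails, a standard subtlety of the substitution method's base case.)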

Section D: How to Write This in the Exam (The Strategy)

  • Start With: For Master’s Theorem questions (Even Path - Q12), immediately write down the standard form $T(n) = aT(n/b) + \Theta(n^k \log^p n)$ at the top of your page. Extract $a, b, k, p$ clearly in a list.
  • Body: Calculate $b^k$ and explicitly write down the comparison “$a$ vs $b^k$”. State exactly which case of the Master’s Theorem you are applying. For Recursion Trees, you must draw the tree. Calculate the sum of the first 3 levels and write down the general series equation.
  • Traps: Do not forget to check the condition on $p$ in Cases 2 and 3 of the Master’s Theorem. A common mistake is writing $T(n) = \Theta(n^{\log_b a})$ for Case 3 when the answer is actually determined by $f(n)$.
  • Close With: Write your final answer in a neat box: $T(n) = \Theta(\dots)$.

Asymptotic Notations (Big O, Omega, Theta)

[HIGH PROB | ↻ REPEATED PYQ]

Section A: What is this? (The Concept)

Imagine you are comparing two cars. Car A is a heavy truck that starts off really fast but hits a strict top speed. Car B is a sports car that stalls out for the first 10 seconds, but eventually accelerates infinitely. If you only test them on a 5-meter driveway, Car A wins. But if you test them on a cross-country highway, Car B destroys Car A.

The Analogy: Asymptotic notation is the “cross-country highway” test for algorithms. We do not care about small inputs (the 5-meter driveway) or minor hardware advantages. We only care about how the algorithm’s running time grows as the input size ($n$) gets massively large (approaches infinity).

Asymptotic notations are mathematical tools used to represent the Time and Space complexity of algorithms. They allow us to establish upper bounds (Worst case), lower bounds (Best case), and tight bounds (Average/Exact case), while ignoring machine-specific constants.

Section B: Exam-Ready Theory (The Rigor)

There are 5 primary asymptotic notations used to classify functions by their asymptotic growth rate:

1. Big-Oh ($O$) - Asymptotic Tight Upper Bound (Worst Case)

  • The function $f(n) = O(g(n))$ if and only if there exist two positive constants $c$ and $n_0$ such that:
  • $0 \leq f(n) \leq c \cdot g(n)$ for all $n \geq n_0$
  • Meaning: $f(n)$ grows no faster than $g(n)$. $g(n)$ bounds $f(n)$ from above.

2. Big-Omega ($\Omega$) - Asymptotic Tight Lower Bound (Best Case)

  • The function $f(n) = \Omega(g(n))$ if and only if there exist two positive constants $c$ and $n_0$ such that:
  • $0 \leq c \cdot g(n) \leq f(n)$ for all $n \geq n_0$
  • Meaning: $f(n)$ grows at least as fast as $g(n)$. $g(n)$ bounds $f(n)$ from below.

3. Big-Theta ($\Theta$) - Asymptotic Tight Bound (Average Case)

  • The function $f(n) = \Theta(g(n))$ if and only if there exist three positive constants $c_1, c_2,$ and $n_0$ such that:
  • $0 \leq c_1 \cdot g(n) \leq f(n) \leq c_2 \cdot g(n)$ for all $n \geq n_0$
  • Meaning: $f(n)$ and $g(n)$ grow at the exact same rate asymptotically.

4. Little-Oh ($o$) - Asymptotic Loose Upper Bound

  • The function $f(n) = o(g(n))$ iff for ANY positive constant $c > 0$, there exists a constant $n_0 > 0$ such that:
  • $0 \leq f(n) < c \cdot g(n)$ for all $n \geq n_0$
  • Equivalently, using limits: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0$
  • Meaning: $g(n)$ becomes arbitrarily large relative to $f(n)$.

5. Little-Omega ($\omega$) - Asymptotic Loose Lower Bound

  • The function $f(n) = \omega(g(n))$ iff for ANY positive constant $c > 0$, there exists a constant $n_0 > 0$ such that:
  • $0 \leq c \cdot g(n) < f(n)$ for all $n \geq n_0$
  • Equivalently, using limits: $\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$

Properties of Asymptotic Notations:

  • Reflexivity: $f(n) = O(f(n))$, $f(n) = \Omega(f(n))$, $f(n) = \Theta(f(n))$
  • Symmetry: $f(n) = \Theta(g(n))$ if and only if $g(n) = \Theta(f(n))$
  • Transpose Symmetry: $f(n) = O(g(n))$ if and only if $g(n) = \Omega(f(n))$

Section C: Step-by-Step Worked Example (The Application)

Example 1: Prove that $f(n) = 2^{n+1}$ is $O(2^n)$ (April 2025 University Question)

  1. State the definition: We need to find $c$ and $n_0$ such that $2^{n+1} \leq c \cdot 2^n$ for all $n \geq n_0$.
  2. Algebraic Manipulation:
    • $2^{n+1} = 2^1 \cdot 2^n = 2 \cdot 2^n$
  3. Identify the constants:
    • The inequality becomes $2 \cdot 2^n \leq c \cdot 2^n$.
    • This is true if we choose $c = 2$.
    • Since this holds for all $n \geq 1$, we set $n_0 = 1$.
  4. Conclusion: Since there exist positive constants $c = 2$ and $n_0 = 1$, $2^{n+1} = O(2^n)$.

Example 2: Is $2^{2n} = O(2^n)$? Justify your answer. (University Question)

  1. State the definition: We would need $2^{2n} \leq c \cdot 2^n$ for all $n \geq n_0$.
  2. Algebraic Manipulation:
    • $(2^2)^n \leq c \cdot 2^n$
    • $4^n \leq c \cdot 2^n$
  3. Analyze the ratio:
    • $4^n / 2^n \leq c$
    • $(4/2)^n \leq c$
    • $2^n \leq c$
  4. Conclusion: No constant $c$ can bound the function $2^n$ as $n$ approaches infinity. The ratio grows without bound.
  5. Final Result: Therefore, $2^{2n} \neq O(2^n)$.

Example 3: Arrange functions by growth rate (University Question): $n^3$, $2^n$, $\log n^3$, $2^{100}$, $n^2 \log n$, $n^n$, $\log n$, $n^{0.3}$, $2^{\log n}$

  1. Constant: $2^{100}$
  2. Logarithmic: $\log n$ and $\log n^3 = 3 \log n$
  3. Fractional Power: $n^{0.3}$
  4. Linear (via identity): $2^{\log_2 n} = n$
  5. Quadratic-Logarithmic: $n^2 \log n$
  6. Polynomial: $n^3$
  7. Exponential: $2^n$
  8. Super-Exponential: $n^n$

Final Order: $2^{100} < \log n < \log n^3 < n^{0.3} < 2^{\log n} < n^2 \log n < n^3 < 2^n < n^n$

Section D: How to Write This in the Exam (The Strategy)

  • Start With: If asked for definitions (Odd Path - Q11), write the formal mathematical inequality immediately. Do not just write English text: “$f(n)$ is $O(g(n))$ iff $f(n) \leq c \cdot g(n)$”.
  • Body: For proof questions, use the technique of replacing lower-order terms with higher-order terms to find your constant $c$. Clearly state “Let $c = \dots$ and $n_0 = \dots$”. If asked to illustrate graphically, draw the standard X-Y axis graph showing $f(n)$ crossing $c \cdot g(n)$ at the point $n_0$.
  • Traps: Do not forget the condition “for all $n \geq n_0$”. Without stating that the inequality holds for all inputs after a certain point, the definition is mathematically incorrect and will lose marks.
  • Close With: Conclude proofs with “Since there exist positive constants $c$ and $n_0$, the function belongs to the complexity class.”

Algorithm Analysis (Linear/Binary/Insertion Sort)

[HIGH PROB]

Section A: What is this? (The Concept)

If you hire a contractor to build a brick wall, you don’t calculate the time by measuring how fast his heart beats. You calculate the time by counting how many bricks he has to lay.

The Analogy: In Algorithm Analysis, we don’t measure the exact milliseconds a CPU takes (because a supercomputer is faster than a 10-year-old laptop). Instead, we use the “Frequency Count” method. We count exactly how many times the core statements (the “bricks”) inside loops are executed relative to the input size $n$.

By doing this, we determine the Time Complexity (how execution time scales) and Space Complexity (how memory usage scales). We analyze algorithms under three lenses: Best Case (the easiest input possible), Worst Case (the most difficult, punishing input), and Average Case (a mathematically probable random input).

Section B: Exam-Ready Theory (The Rigor)

1. Space Complexity: $S(P) = C + S_P(I)$

The total memory required by an algorithm to run to completion.

  • Fixed Part ($C$): Independent of input characteristics. Includes instruction space (code), space for simple variables, and constants.
  • Variable Part ($S_P(I)$): Dependent on the input instance. Includes space for dynamic arrays, referenced variables, and the recursion stack.

2. Time Complexity & Frequency Count

Time complexity is calculated by counting the number of times fundamental operations execute.

  • for(i=1; i<=n; i++): The initialization runs 1 time. The condition check runs $n+1$ times (it checks and fails on the final try). The loop body runs $n$ times.
  • If a loop variable grows geometrically (e.g., i = i * 2), the loop executes about $\log_2 n$ times.
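A doubling loop's iteration count can be verified by counting directly; a C sketch (the helper name `doubling_iterations` is our own):

```c
/* Sketch: count iterations of a loop whose counter doubles each pass
 * (i = 1, 2, 4, ..., <= n); the count is floor(log2 n) + 1. */
long doubling_iterations(long n) {
    long count = 0;
    for (long i = 1; i <= n; i = i * 2)
        count++;
    return count;
}
```

For $n = 16$ the counter visits 1, 2, 4, 8, 16, i.e., $\log_2 16 + 1 = 5$ iterations.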

3. Types of Complexity Analysis

  • Best Case ($\Omega$): The minimum number of steps executed for an input of size $n$. Occurs when the input naturally bypasses the algorithm’s heavy lifting.
  • Worst Case (OO): The maximum number of steps executed. The algorithm is forced to do every possible calculation.
  • Average Case (Θ\Theta): The expected number of steps averaged over all possible inputs.

Analysis of Classical Algorithms:

  • Linear Search:
    • Best Case: Item is at index 0. $\Omega(1)$.
    • Worst Case: Item is not present or is at the very end. $O(n)$.
    • Average Case: Item is found near the middle, about $(n+1)/2$ comparisons. $\Theta(n)$.
  • Insertion Sort:
    • Best Case: Array is already sorted. The inner while loop condition fails immediately every time. $\Omega(n)$.
    • Worst Case: Array is sorted in reverse order. The inner loop runs $1 + 2 + 3 + \dots + (n-1) = n(n-1)/2$ times. $O(n^2)$.
    • Average Case: Elements are partially sorted on average. $\Theta(n^2)$.
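The best/worst contrast for Insertion Sort can be demonstrated by counting inner-loop shifts. In this C sketch, the function name and the built-in ascending/descending test inputs are illustrative assumptions:

```c
/* Sketch: run insertion sort on 0..n-1 in ascending (best case) or
 * descending (worst case) order and return how many element shifts
 * the inner while-loop performs. */
long insertion_shift_count(int n, int reversed) {
    int a[1024]; /* assumes n <= 1024 for this sketch */
    for (int i = 0; i < n; i++)
        a[i] = reversed ? n - i : i;
    long shifts = 0;
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) { /* fails at once on sorted input */
            a[j + 1] = a[j];
            j--;
            shifts++;
        }
        a[j + 1] = key;
    }
    return shifts;
}
```

A sorted input produces 0 shifts (the $\Omega(n)$ best case), while a reversed input produces exactly $n(n-1)/2$ shifts (the $O(n^2)$ worst case).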

Section C: Step-by-Step Worked Example (The Application)

Example 1: Analyze the time complexity of the following code segment

for(int i=1; i<=n; i++) {
    for(int j=1; j<=n; j++) {
        // O(1) statement
    }
}
  1. Analyze Outer Loop: The outer loop i runs from 1 to $n$. It executes its body $n$ times.
  2. Analyze Inner Loop: For every single iteration of the outer loop, the inner loop j resets and runs from 1 to $n$, executing its body $n$ times.
  3. Total Frequency Count: The innermost $O(1)$ statement executes $n$ (from the outer loop) $\times$ $n$ (from the inner loop) times.
  4. Result: Total executions $= n^2$. Time Complexity $= O(n^2)$.

Example 2: Analyze an algorithmic segment with dependent bounds

for(int i=1; i<=n; i++) {
    for(int j=1; j<=i; j++) {
        // O(1) statement
    }
}
  1. Trace iterations:
    • When $i = 1$, the inner loop runs 1 time.
    • When $i = 2$, the inner loop runs 2 times.
    • When $i = n$, the inner loop runs $n$ times.
  2. Sum the frequencies: Total executions $= 1 + 2 + 3 + \dots + n$
  3. Apply the Arithmetic Progression Formula: Sum of the first $n$ natural numbers $= \frac{n(n+1)}{2} = \frac{n^2 + n}{2}$
  4. Result: Dropping lower-order terms and constants, the Time Complexity $= O(n^2)$.
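The closed form can be confirmed by counting executions directly; a C sketch (the helper name `dependent_loop_count` is our own):

```c
/* Sketch: count the inner-body executions of the dependent loops
 * above and compare with the closed form n(n+1)/2. */
long dependent_loop_count(int n) {
    long count = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= i; j++)
            count++;
    return count;
}
```

For $n = 10$ the count is $10 \cdot 11 / 2 = 55$, matching the arithmetic-series formula.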

Example 3: Determine Space Complexity for a Recursive Array Sum

int RSum(int a[], int n) {
    if (n <= 0) return 0;
    return a[n - 1] + RSum(a, n - 1); /* a[n-1]: last of the first n elements */
}
  1. Fixed Space: Negligible (the pointer and $n$ are passed by value).
  2. Variable Space (Recursion Stack): Each recursive call pushes a stack frame storing the return address and local parameters.
  3. Depth of Recursion: The function calls itself $n$ times (from $n$ down to 0).
  4. Result: Since the depth of the recursion is $n$, and each stack frame takes $O(1)$ space, the total Space Complexity is $O(n)$.

Example 4: Analyze the complexity of the following function (University Question)

void function(int n) {
    int count=0;
    for(int i=n/2; i<=n; i++)
        for(int j=1; j<=n; j=2*j)
            for(int k=1; k<=n; k=k*2)
                count++;
}
  1. Outer Loop ($i$): Runs from $n/2$ to $n$.
    • Number of iterations $= n - n/2 + 1 \approx n/2$.
    • Complexity: $O(n)$.
  2. Middle Loop ($j$): Starts at 1 and doubles each time ($1, 2, 4, 8, \dots, n$).
    • This is a geometric progression: the loop runs about $\log_2 n$ times.
    • Complexity: $O(\log n)$.
  3. Inner Loop ($k$): Same as the middle loop, starts at 1 and doubles until $n$.
    • Complexity: $O(\log n)$.
  4. Total Frequency Count:
    • $T(n) = (\text{Outer iterations}) \times (\text{Middle iterations}) \times (\text{Inner iterations})$
    • $T(n) = (n/2) \times (\log n) \times (\log n)$
  5. Final Result:
    • $T(n) = O(n \log^2 n)$.
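Running the snippet and counting confirms the product of the three iteration counts; a C sketch (the helper name `triple_loop_count` is our own):

```c
/* Sketch: count executions of count++ in the triple loop above and
 * compare with (n - n/2 + 1) * (floor(log2 n) + 1)^2. */
long triple_loop_count(int n) {
    long count = 0;
    for (int i = n / 2; i <= n; i++)
        for (int j = 1; j <= n; j = 2 * j)
            for (int k = 1; k <= n; k = k * 2)
                count++;
    return count;
}
```

For $n = 64$: the outer loop runs $64 - 32 + 1 = 33$ times and each doubling loop runs 7 times, giving $33 \times 7 \times 7 = 1617$ executions, consistent with $O(n \log^2 n)$.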

Section D: How to Write This in the Exam (The Strategy)

  • Start With: If given a code snippet (Odd Path), create a table with columns: Step/Execution, Frequency Count, and Total Frequency.
  • Body: Map each line of code to a row in the table. For a simple assignment statement, put 1. For a for loop, put $n+1$ for the loop statement itself and $n$ for the block inside it. Multiply nested loops carefully. Show the summation clearly (e.g., write out $1 + 2 + \dots + n = n(n+1)/2$).
  • Traps: When asked for the Best/Worst/Average cases of Insertion Sort, do not just write the final $O(n^2)$ answer. You must explicitly state the physical condition of the array that causes each case (e.g., “Best case occurs when the array is already sorted in ascending order”).
  • Close With: Box your final result in Big-O notation, e.g., Time Complexity $= O(n^2)$. Ensure you explicitly name the mathematical series used if applicable (Arithmetic, Geometric, Logarithmic).