Algorithm Analysis & Design (CST306) - Module 1 Study Guide
Part A: High Frequency Definitions (Module 1 Focus)
Topic: Little-o notation justification
[SURE SHOT | ↻ REPEATED PYQ]
Section A: What is this? (The Concept)
Imagine you are in a race against a rocket. No matter how fast you run, the rocket will eventually outpace you so severely that your speed becomes mathematically insignificant compared to its speed.
The Analogy: Little-o notation is the mathematical way of saying “I will get completely destroyed by this upper limit.” While Big-O means “I will be less than or equal to this limit,” Little-o strictly means “I will be strictly less than this limit as things get huge.”
If a function f(n) is o(g(n)), it means the growth rate of f(n) is strictly slower than g(n). As n approaches infinity, f(n) becomes arbitrarily small compared to g(n).
Section B: Exam-Ready Theory (The Rigor)
- Definition: A function f(n) = o(g(n)) if and only if for ANY positive constant c > 0, there exists a constant n₀ > 0 such that: 0 ≤ f(n) < c · g(n) for all n ≥ n₀
- Limit Definition: The most rigorous way to justify little-o is using limits. f(n) = o(g(n)) if: lim_{n→∞} f(n)/g(n) = 0
- Justification: If you are asked to justify why a function like 2n is o(n^2), you apply the limit theorem. Since the limit is 0, f(n) is strictly bounded above by c · g(n) for every c > 0, hence it is o(g(n)).
Section C: Step-by-Step Worked Example (The Application)
Question: Justify whether f(n) = 2n is o(g(n)) for g(n) = n^2.
- Apply the Limit Test: Set up the limit of f(n)/g(n) as n approaches infinity: lim_{n→∞} 2n/n^2.
- Simplify the expression: 2n/n^2 = 2/n.
- Evaluate the limit: As n grows infinitely large, any constant divided by infinity approaches zero, so lim_{n→∞} 2/n = 0.
- Conclusion: Since the limit is 0, 2n is indeed o(n^2).
Section D: How to Write This in the Exam (The Strategy)
- Start With: Write down the formal limit definition: lim_{n→∞} f(n)/g(n) = 0.
- Body: Substitute the given functions into the limit. Simplify the algebraic expression cleanly before taking the limit.
- Traps: Do not confuse Little-o with Big-O. Big-O allows the limit to be a constant value greater than 0 (e.g., 2n = O(n) because the limit is 2). Little-o requires the limit to be exactly 0.
- Close With: Conclude with the sentence: “Because the limit evaluates to 0, f(n) grows strictly slower than g(n), satisfying the little-o property.”
Part B: Core Topics
Recursion Solving (Master’s & Recursion Tree)
[SURE SHOT | ↻ REPEATED PYQ]
Section A: What is this? (The Concept)
Imagine you are tasked with counting all the leaves on a massive, sprawling oak tree. Counting them one by one would take forever. Instead, you recruit three friends and say, “Each of you take a main branch, count the leaves, and tell me the total.” Your friends do the exact same thing, recruiting their own friends for the smaller branches, until someone reaches a tiny twig with just one leaf.
The Analogy: Solving recurrences is like calculating the total time it takes for all these friends to finish counting. The work is split into smaller identical tasks (recursive calls), but splitting the tasks and combining the results also takes some effort.
In computer science, algorithms like Merge Sort or Binary Search work by splitting a problem into smaller chunks. A recurrence relation is simply a mathematical equation that describes the running time of these recursive algorithms in terms of their smaller inputs. We need methods like the Master’s Theorem, Recursion Trees, and Substitution to solve these equations and find the final Time Complexity in standard Asymptotic notation (like O(n log n)).
Section B: Exam-Ready Theory (The Rigor)
To analyze recursive algorithms, we represent their time complexity as a Recurrence Relation: an equation that describes a function in terms of its values on smaller inputs, such as T(n) = 2T(n/2) + n.
Here are the three primary rigorous methods to solve them:
1. The Master’s Theorem: This is a direct formula-based method for solving recurrences of the form T(n) = a·T(n/b) + f(n), where:
- a ≥ 1 is the number of subproblems (recursive calls).
- b > 1 is the factor by which the input size is divided.
- f(n) = Θ(n^k · log^p n), where k ≥ 0 and p is a real number, representing the cost of dividing/combining.
The 3 Master Cases:
- Case 1: If a > b^k, then T(n) = Θ(n^(log_b a))
- Case 2: If a = b^k, we check the value of p:
- If p > −1, then T(n) = Θ(n^k · log^(p+1) n)
- If p = −1, then T(n) = Θ(n^k · log log n)
- If p < −1, then T(n) = Θ(n^k)
- Case 3: If a < b^k, we check the value of p:
- If p ≥ 0, then T(n) = Θ(n^k · log^p n)
- If p < 0, then T(n) = O(n^k)
2. The Recursion Tree Method: This is a pictorial representation of the iteration method. It converts the recurrence into a tree where nodes represent the cost incurred at various levels of the recursion.
- The root is the initial cost f(n).
- Its children represent the cost of the subproblems, e.g., a branches each costing f(n/b).
- You sum the costs across each horizontal level of the tree.
- Finally, you sum the costs of all levels from the root down to the leaves to get the total complexity.
3. The Substitution Method: This method involves two formal mathematical steps:
- Guess the form of the solution (e.g., guess that T(n) = O(n log n)).
- Use Mathematical Induction to find the exact constants and prove that the guess is correct. This involves showing T(n) ≤ c · g(n) for a suitable constant c > 0.
Section C: Step-by-Step Worked Example (The Application)
Example 1: Master’s Theorem — Solve T(n) = 2T(n/2) + n^2
- Compare with the standard form: T(n) = a·T(n/b) + Θ(n^k · log^p n)
- Extract the constants: a = 2, b = 2, k = 2, p = 0
- Evaluate b^k: b^k = 2^2 = 4
- Compare a and b^k:
- a = 2 and b^k = 4.
- Since 2 < 4, we have a < b^k. This falls under Case 3.
- Check p for Case 3:
- p = 0, which means p ≥ 0.
- Apply the formula: T(n) = Θ(n^k · log^p n) = Θ(n^2 · log^0 n)
- Final Answer: T(n) = Θ(n^2)
Example 2: Recursion Tree Method (April 2025 University Question) — Solve T(n) = 3T(n/4) + n^2
- Level 0 (Root): The cost at the root is n^2.
- Level 1: The recurrence branches 3 ways (a = 3). The input size drops to n/4.
- Each node costs (n/4)^2.
- Total cost at Level 1 = 3 · (n/4)^2 = (3/16) n^2.
- Level 2: Each of the 3 nodes branches 3 ways (9 nodes total). Input is n/16.
- Each node costs (n/16)^2.
- Total cost at Level 2 = 9 · (n/16)^2 = (3/16)^2 n^2.
- Level i (General Case):
- Number of nodes = 3^i.
- Cost per node = (n/4^i)^2.
- Total cost at Level i = (3/16)^i · n^2.
- Total Cost Summation: T(n) = n^2 · [1 + 3/16 + (3/16)^2 + …]
- This is a geometric series with ratio r = 3/16 < 1.
- Using the infinite series formula Σ r^i = 1/(1 − r):
- T(n) ≤ n^2 · 1/(1 − 3/16) = (16/13) n^2
- Final Result:
- T(n) = O(n^2).
Example 3: Substitution Method (April 2025 University Question) — Solve T(n) = 2T(n/2) + n where T(1) = 1.
- Guess: Based on the form (similar to Merge Sort), we guess T(n) = O(n log n).
- Inductive Goal: Prove T(n) ≤ c · n log n for some constant c > 0.
- Inductive Step: Assume it holds for n/2, i.e., T(n/2) ≤ c · (n/2) log(n/2).
- Substitute into Recurrence: T(n) ≤ 2 · c (n/2) log(n/2) + n = c n log(n/2) + n = c n log n − c n + n (log taken base 2).
- Final Condition: We need c n log n − c n + n ≤ c n log n.
- This is true if n ≤ c n, i.e., c ≥ 1.
- This holds for all n ≥ 1.
- Conclusion: T(n) = O(n log n).
Section D: How to Write This in the Exam (The Strategy)
- Start With: For Master’s theorem questions (Even Path - Q12), immediately write down the standard form T(n) = a·T(n/b) + Θ(n^k · log^p n) at the top of your page. Extract a, b, k, p clearly in a list.
- Body: Calculate b^k and explicitly write down the comparison “a vs b^k”. State exactly which case of the Master’s theorem you are applying. For Recursion Trees, you must draw the tree. Calculate the sum of the first 3 levels and write down the general series equation.
- Traps: Do not forget to check the value of p in Cases 2 and 3 of the Master’s theorem. A common mistake is writing Θ(n^(log_b a)) for Case 3 when the answer should be determined by the f(n) term, i.e., Θ(n^k · log^p n).
- Close With: Write your final answer in a neat box, e.g., T(n) = Θ(n^2).
Asymptotic Notations (Big O, Omega, Theta)
[HIGH PROB | ↻ REPEATED PYQ]
Section A: What is this? (The Concept)
Imagine you are comparing two cars. Car A is a heavy truck that starts off really fast but hits a strict top speed. Car B is a sports car that stalls out for the first 10 seconds, but eventually accelerates infinitely. If you only test them on a 5-meter driveway, Car A wins. But if you test them on a cross-country highway, Car B destroys Car A.
The Analogy: Asymptotic notation is the “cross-country highway” test for algorithms. We do not care about small inputs (the 5-meter driveway) or minor hardware advantages. We only care about how the algorithm’s running time grows as the input size (n) gets massively large (approaches infinity).
Asymptotic notations are mathematical tools used to represent the Time and Space complexity of algorithms. They allow us to establish upper bounds (Worst case), lower bounds (Best case), and tight bounds (Average/Exact case) ignoring machine-specific constants.
Section B: Exam-Ready Theory (The Rigor)
There are 5 primary asymptotic notations used to classify functions by their asymptotic growth rate:
1. Big-Oh (O) - Asymptotic Tight Upper Bound (Worst Case)
- The function f(n) = O(g(n)) if and only if there exist two positive constants c and n₀ such that: 0 ≤ f(n) ≤ c · g(n) for all n ≥ n₀
- Meaning: f(n) grows no faster than g(n). c · g(n) bounds f(n) from above.
2. Big-Omega (Ω) - Asymptotic Tight Lower Bound (Best Case)
- The function f(n) = Ω(g(n)) if and only if there exist two positive constants c and n₀ such that: 0 ≤ c · g(n) ≤ f(n) for all n ≥ n₀
- Meaning: f(n) grows at least as fast as g(n). c · g(n) bounds f(n) from below.
3. Big-Theta (Θ) - Asymptotic Tight Bound (Average Case)
- The function f(n) = Θ(g(n)) if and only if there exist three positive constants c₁, c₂ and n₀ such that: 0 ≤ c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀
- Meaning: f(n) and g(n) grow at the exact same rate asymptotically.
4. Little-Oh (o) - Asymptotic Loose Upper Bound
- The function f(n) = o(g(n)) iff for ANY positive constant c > 0, there exists a constant n₀ > 0 such that: 0 ≤ f(n) < c · g(n) for all n ≥ n₀
- Equivalently using limits: lim_{n→∞} f(n)/g(n) = 0
- Meaning: g(n) becomes arbitrarily large relative to f(n).
5. Little-Omega (ω) - Asymptotic Loose Lower Bound
- The function f(n) = ω(g(n)) iff for ANY positive constant c > 0, there exists a constant n₀ > 0 such that: 0 ≤ c · g(n) < f(n) for all n ≥ n₀
- Equivalently using limits: lim_{n→∞} f(n)/g(n) = ∞
Properties of Asymptotic Notations:
- Reflexivity: f(n) = O(f(n)), f(n) = Ω(f(n)), f(n) = Θ(f(n))
- Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
- Transpose Symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n))
Section C: Step-by-Step Worked Example (The Application)
Example 1: Prove that 3n + 2 is O(n) (April 2025 University Question)
- State the definition: We need to find c and n₀ such that 3n + 2 ≤ c · n for all n ≥ n₀.
- Algebraic Manipulation: 3n + 2 ≤ 3n + 2n = 5n for all n ≥ 1 (replace the lower-order term 2 with 2n).
- Identify the constants:
- The inequality becomes 3n + 2 ≤ 5n.
- This is true if we choose c = 5.
- Since this holds for all n ≥ 1, we set n₀ = 1.
- Conclusion: Since there exist positive constants c = 5 and n₀ = 1, 3n + 2 = O(n).
Example 2: Is 2^(2n) = O(2^n)? Justify your answer. (University Question)
- State the definition: We need 2^(2n) ≤ c · 2^n for all n ≥ n₀.
- Algebraic Manipulation: 2^(2n) = 2^n · 2^n, so the inequality reduces to 2^n ≤ c.
- Analyze the ratio: 2^(2n) / 2^n = 2^n, which tends to ∞ as n → ∞.
- Conclusion: There is no constant c that can bound the function as n approaches infinity. The ratio grows without bound.
- Final Result: Therefore, 2^(2n) ≠ O(2^n).
Example 3: Arrange functions by growth rate (University Question)
- Constant: 1
- Logarithmic: log log n and log n
- Fractional Power: √n
- Linear (via identity): 2^(log n) = n
- Linear-Logarithmic: n log n
- Polynomial: n^2
- Exponential: 2^n
- Factorial-like: n! Final Order: 1 < log log n < log n < √n < 2^(log n) < n log n < n^2 < 2^n < n!.
Section D: How to Write This in the Exam (The Strategy)
- Start With: If asked for definitions (Odd Path - Q11), write the formal mathematical inequality immediately. Do not just write English text. “f(n) is O(g(n)) iff 0 ≤ f(n) ≤ c · g(n) for all n ≥ n₀”.
- Body: For proof questions, use the technique of replacing lower-order terms with higher-order terms to find your constant c. Clearly state the chosen witnesses, e.g., “Let c = 5 and n₀ = 1”. If asked to illustrate graphically, draw the standard X-Y axis graph showing c · g(n) crossing f(n) at the point n = n₀.
- Traps: Do not forget the condition “for all n ≥ n₀”. Without stating that the inequality holds for all inputs after a certain point, the definition is mathematically incorrect and will lose marks.
- Close With: Conclude proofs with “Since there exist positive constants c and n₀, the function f(n) belongs to the complexity class O(g(n)).”
Algorithm Analysis (Linear/Binary/Insertion Sort)
[HIGH PROB]
Section A: What is this? (The Concept)
If you hire a contractor to build a brick wall, you don’t calculate the time by measuring how fast his heart beats. You calculate the time by counting how many bricks he has to lay.
The Analogy: In Algorithm Analysis, we don’t measure the exact milliseconds a CPU takes (because a supercomputer is faster than a 10-year-old laptop). Instead, we use the “Frequency Count” method. We count exactly how many times the core statements (the “bricks”) inside loops are executed relative to the input size .
By doing this, we determine the Time Complexity (how execution time scales) and Space Complexity (how memory usage scales). We analyze algorithms under three lenses: Best Case (the easiest input possible), Worst Case (the most difficult, punishing input), and Average Case (a mathematically probable random input).
Section B: Exam-Ready Theory (The Rigor)
1. Space Complexity: The total memory required by an algorithm to run to completion.
- Fixed Part (c): Independent of input characteristics. Includes instruction space (code), space for simple variables, and constants.
- Variable Part (Sₚ): Dependent on the input instance. Includes space for dynamic arrays, referenced variables, and the recursion stack space.
2. Time Complexity & Frequency Count: Time complexity is calculated by counting the number of times fundamental operations execute.
- `for(i=1; i<=n; i++)`: The initialization runs 1 time. The condition check runs n + 1 times (it checks and fails on the final try). The inner body runs n times.
- If a loop increments geometrically (e.g., `i = i * 2`), the loop executes O(log n) times.
3. Types of Complexity Analysis
- Best Case (Ω): The minimum number of steps executed for a given input size n. Occurs when the input naturally bypasses the algorithm’s heavy lifting.
- Worst Case (O): The maximum number of steps executed. The algorithm is forced to do every possible calculation.
- Average Case (Θ): The expected number of steps averaged over all possible inputs.
Analysis of Classical Algorithms:
- Linear Search:
- Best Case: Item is at index 0. O(1).
- Worst Case: Item is not present or at the very end. O(n).
- Average Case: Item is found in the middle — about (n + 1)/2 comparisons. O(n).
- Insertion Sort:
- Best Case: Array is already sorted. The inner `while` loop condition fails immediately every time. O(n).
- Worst Case: Array is sorted in reverse order. The inner loop runs up to i times for each i. O(n^2).
- Average Case: Elements are half-sorted. O(n^2).
Section C: Step-by-Step Worked Example (The Application)
Example 1: Analyze the time complexity of the following code segment
for(int i=1; i<=n; i++) {
for(int j=1; j<=n; j++) {
// O(1) statement
}
}
- Analyze Outer Loop: The outer loop `i` runs from 1 to n. It successfully executes its body n times.
- Analyze Inner Loop: For every single iteration of the outer loop, the inner loop `j` resets and runs from 1 to n, executing its body n times.
- Total Frequency Count: The innermost statement executes n (from outer) × n (from inner) times.
- Result: Total executions = n × n = n^2. Time Complexity = O(n^2).
Example 2: Analyze an algorithmic segment with dependent bounds
for(int i=1; i<=n; i++) {
for(int j=1; j<=i; j++) {
// O(1) statement
}
}
- Trace iterations:
- When i = 1, inner loop runs 1 time.
- When i = 2, inner loop runs 2 times.
- …
- When i = n, inner loop runs n times.
- Sum the frequencies: Total executions = 1 + 2 + 3 + … + n
- Apply Arithmetic Progression Formula: Sum of first n natural numbers = n(n + 1)/2
- Result: Dropping lower-order terms and constants, the Time Complexity = O(n^2).
Example 3: Determine Space Complexity for a Recursive Array Sum
int RSum(int a[], int n) {
    if (n <= 0) return 0;              // base case: empty prefix sums to 0
    return a[n - 1] + RSum(a, n - 1);  // last element of the prefix + rest
}
- Fixed Space: Negligible (the array pointer a and the size n are passed by value).
- Variable Space (Recursion Stack): For each recursive call, the stack must store the return address and local parameters.
- Depth of Recursion: The function calls itself n + 1 times (from n down to 0).
- Result: Since the depth of the recursion tree is O(n), and each stack frame takes O(1) space, the total Space Complexity is O(n).
Example 4: Analyze the complexity of the following function (University Question)
void function(int n) {
int count=0;
for(int i=n/2; i<=n; i++)
for(int j=1; j<=n; j=2*j)
for(int k=1; k<=n; k=k*2)
count++;
}
- Outer Loop (`i`): Runs from n/2 to n.
- Number of iterations = n − n/2 + 1 ≈ n/2.
- Complexity: O(n).
- Middle Loop (`j`): Starts at 1, doubles each time (`j = 2*j`).
- This is a geometric progression. The loop runs k times where 2^k ≥ n, i.e., k ≈ log_2 n.
- Complexity: O(log n).
- Inner Loop (`k`): Same as the middle loop, starts at 1 and doubles until it exceeds n.
- Complexity: O(log n).
- Total Frequency Count: O(n) × O(log n) × O(log n)
- Final Result:
- Time Complexity = O(n log^2 n).
Section D: How to Write This in the Exam (The Strategy)
- Start With: If given a code snippet (Odd Path), create a table with columns: `Step/Execution`, `Frequency Count`, and `Total Frequency`.
- Body: Map each line of code to a row in the table. For simple assignment statements, put 1. For a `for` loop, put n + 1 for the loop statement itself, and n for the block inside it. Multiply nested loops carefully. Show the summation clearly (e.g., writing out 1 + 2 + … + n = n(n + 1)/2).
- Traps: When asked for Best/Worst/Average cases of Insertion Sort, do not just write the final answer. You must explicitly state the physical condition of the array that causes the case (e.g., “Best case occurs when the array is already sorted in ascending order”).
- Close With: Box your final result in Big-O notation, e.g., Time Complexity = O(n^2). Ensure you explicitly name the mathematical series used if applicable (Arithmetic, Geometric, Logarithmic).