U-1
1 Basic Concepts: Introduction to Complexity
1. What is Complexity? In computer science, complexity refers to the amount of resources an algorithm uses when solving a problem. The two primary resources are: - Time – how long the algorithm takes to run; - Space – how much memory the algorithm uses.
2. Why is Complexity Important? Algorithms are used to solve problems, and there are often many ways to solve the same problem. Some methods may be faster or more memory-efficient than others. Knowing an algorithm's complexity helps us: - Predict performance on large data; - Compare multiple algorithms; - Optimize programs for speed and memory.
3. Types of Complexity:
i. Time Complexity: Time complexity measures how the execution time of an algorithm changes with respect to the size of the input (usually denoted as n). For example: - A simple loop that runs n times has time complexity O(n); - A nested loop (a loop inside a loop) might have time complexity O(n²).
ii. Space Complexity: Space complexity measures how much memory is used by the algorithm, including: - Input storage; - Temporary variables; - Function call stacks. If an algorithm creates a temporary array of size n, the space complexity is O(n).
4. Big O Notation: To express complexity in a standard and simplified way, we use Big O notation. It describes the upper bound (worst-case scenario) of an algorithm's running time or space usage in terms of input size n. Examples of Big O Notation:
Big O       Name              Description / Example
O(1)        Constant time     Executes in the same time regardless of input size. Example: Accessing an element by index in an array
O(log n)    Logarithmic time  Input size is reduced in each step. Example: Binary Search
O(n)        Linear time       Time increases linearly with input. Example: Traversing an array
O(n log n)  Linearithmic      Efficient sorting algorithms. Example: Merge Sort, Quick Sort (average case)
O(n²)       Quadratic time    Nested loops. Example: Bubble Sort
O(2ⁿ)       Exponential time  Extremely slow, grows rapidly. Example: Solving the Fibonacci sequence with simple recursion
O(n!)       Factorial time    Very slow; often seen in brute-force solutions. Example: Solving the Traveling Salesman Problem using all permutations
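These growth rates can be made concrete by counting basic operations. The sketch below (function names are illustrative, not from the notes) counts how many times the loop body runs for a single loop versus a nested loop:

```python
def linear_ops(n):
    """Single loop: the body runs n times -> O(n)."""
    count = 0
    for _ in range(n):
        count += 1
    return count

def quadratic_ops(n):
    """Nested loop: the inner body runs n * n times -> O(n^2)."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

print(linear_ops(100))     # 100
print(quadratic_ops(100))  # 10000
```

Doubling n doubles the work for the linear version but quadruples it for the quadratic one, which is exactly the difference the table above describes.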
5. Best, Average, and Worst Case: Time complexity is often expressed as the worst-case scenario, but there are others:
Case          Description
Best Case     Minimum time taken (e.g., finding an item at the start of a list)
Average Case  Expected time over multiple inputs
Worst Case    Maximum time taken (usually used for analysis)
6. Real-Life Analogy: Imagine finding a book on a shelf: - If you randomly check every book until you find it → O(n) (linear time). - If the books are sorted and you check the middle one, then go left/right → O(log n) (binary search). - If you already know the exact position → O(1) (constant time).
7. Graphical View of Time Complexities: Here's how the time grows with input size n (conceptually):
Input Size (n) ➡
Time ↑
|
| O(2^n)
| O(n^2)
| O(n log n)
| O(n)
| O(log n)
| O(1)
|____________________________________
2 Applications of Data Structures
Applications of Data Structures: Data structures are essential tools in computer science that help store, organize, and manage
data efficiently. The choice of data structure impacts the performance of software systems and algorithms.
1. Arrays: Description: An array is a fixed-size, indexed collection of elements of the same type. Applications: - Storing large datasets: e.g., storing names, scores, temperatures; - Image processing: 2D arrays store pixel data; - Matrices and mathematical computations; - Used in databases to hold records in a table format.
2. Linked Lists: Description: A linear structure where elements (nodes) are connected using pointers. Applications: - Dynamic memory allocation (e.g., memory management); - Implementing stacks and queues; - Image viewer / music player: navigating forward and backward; - Undo/Redo functionality in editors; - Polynomial arithmetic in symbolic computations.
3. Stacks: Description: A Last-In-First-Out (LIFO) data structure. Applications: - Function call management (call stack); - Undo operations in text editors; - Expression evaluation (infix to postfix conversion); - Syntax parsing in compilers; - Backtracking algorithms, like maze solving or solving puzzles.
4. Queues: Description: A First-In-First-Out (FIFO) data structure. Applications: - Task scheduling in operating systems (CPU scheduling); - Print queue management; - Breadth-first search (BFS) in graphs; - Customer service systems: simulating real-world queues; - Streaming data (buffer management).
5. Trees: Description: A hierarchical data structure with a root and child nodes. Applications: - Hierarchical databases (e.g., file systems); - XML/HTML document representation (DOM tree); - Binary Search Tree (BST): fast searching and sorting; - Heaps: used in priority queues and heap sort; - Tries: efficient text search, autocomplete; - Game trees in AI (e.g., chess, tic-tac-toe).
6. Graphs: Description: A non-linear structure consisting of nodes (vertices) and edges. Applications: - Social networks (Facebook friends, LinkedIn connections); - Maps and GPS (shortest path, Dijkstra's algorithm); - Web crawlers: modeling the internet; - Network routing protocols; - Project planning (PERT/CPM).
- Dependency resolution in package managers.
7. Hash Tables / Hash Maps: Description: Stores key-value pairs for fast lookup. Applications: - Databases and caching systems; - Implementing associative arrays / dictionaries; - Indexing in search engines; - Symbol tables in compilers; - Password storage using hash functions; - Routing tables in networks.
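Python's built-in dict is a hash table, so the key-value operations described above can be tried directly (a sketch; the keys and values are made-up example data):

```python
# A Python dict is a hash table: keys are hashed to find their storage slot.
cache = {}

# Insert / update: average O(1)
cache["user:42"] = "Alice"
cache["user:43"] = "Bob"

# Lookup by key: average O(1)
print(cache["user:42"])      # Alice

# Membership test and deletion
print("user:43" in cache)    # True
del cache["user:43"]
print(cache.get("user:43"))  # None (key no longer present)
```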
8. Heaps: Description: A complete binary tree used to maintain a priority order. Applications: - Priority queues (e.g., CPU task scheduling); - Heap sort algorithm; - Finding the k largest/smallest elements; - Memory management (dynamic memory allocation); - Graph algorithms: e.g., Dijkstra's algorithm.
9. Tries (Prefix Trees): Description: A tree-like data structure used to store strings efficiently. Applications: - Autocomplete features (search engines, keyboards); - Spell checking and suggestions; - IP routing (longest prefix matching); - Storing a dictionary of words.
10. Disjoint Sets (Union-Find): Description: Keeps track of a set of elements partitioned into disjoint (non-overlapping) subsets. Applications: - Network connectivity checks; - Cycle detection in graphs; - Kruskal's algorithm for minimum spanning trees; - Image processing (connected component labeling).
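A minimal Union-Find sketch illustrating the connectivity-check and cycle-detection uses listed above (class and method names are illustrative; this variant uses path compression and union by size, common optimizations not spelled out in the notes):

```python
class DisjointSet:
    """Union-Find with path compression and union by size."""
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts in its own set
        self.size = [1] * n

    def find(self, x):
        # Path compression: point nodes closer to the root as we walk up.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already connected: adding edge (a, b) would form a cycle
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra  # attach the smaller tree under the larger one
        self.size[ra] += self.size[rb]
        return True

ds = DisjointSet(5)
ds.union(0, 1)
ds.union(1, 2)
print(ds.find(0) == ds.find(2))  # True  (0 and 2 are connected)
print(ds.find(3) == ds.find(4))  # False (3 and 4 are in different sets)
```

Kruskal's algorithm uses exactly this `union` return value: an edge whose endpoints are already in the same set is skipped because it would create a cycle.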
3 Basic Data Structures
Basic Data Structures: Data structures are ways to store and organize data efficiently so that operations like insertion, deletion, searching, and updating can be performed effectively.
1. Array: Description: A collection of elements, each identified by an index. All elements are stored in contiguous memory locations. Key Features: - Fixed size; - Fast access by index (O(1)). Common Uses: - Storing lists, tables, or matrices; - Lookup tables. Example (in Python):
arr = [10, 20, 30, 40]
print(arr[2]) # Output: 30
2. Linked List: Description: A linear structure where elements (nodes) are connected using pointers. Each node contains data plus a pointer to the next node. Types: - Singly Linked List; - Doubly Linked List; - Circular Linked List. Common Uses: - Dynamic memory allocation; - Implementing stacks/queues; - Efficient insertions/deletions.
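The node-plus-pointer structure of a singly linked list can be sketched in a few lines (class names are illustrative; Python references stand in for pointers):

```python
class Node:
    """One node: data plus a reference to the next node."""
    def __init__(self, data):
        self.data = data
        self.next = None

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, data):
        # O(1) insertion: the new node becomes the head.
        node = Node(data)
        node.next = self.head
        self.head = node

    def to_list(self):
        # O(n) traversal following the next pointers.
        out, cur = [], self.head
        while cur:
            out.append(cur.data)
            cur = cur.next
        return out

lst = SinglyLinkedList()
for x in (30, 20, 10):
    lst.push_front(x)
print(lst.to_list())  # [10, 20, 30]
```

Note how insertion at the head needs no shifting of elements, unlike inserting at the front of an array.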
3. Stack: Description: A LIFO (Last In First Out) structure. You can only access the top element. Operations: - Push – Add item; - Pop – Remove item; - Peek – View top item. Common Uses: - Undo features; - Expression evaluation; - Function call management.
4. Queue: Description: A FIFO (First In First Out) structure. The first element added is the first to be removed. Operations: - Enqueue – Add item; - Dequeue – Remove item. Variants: - Circular Queue; - Priority Queue; - Double-ended Queue (Deque). Common Uses: - Task scheduling; - Print queues; - Breadth-first search (BFS).
5. Hash Table (Hash Map): Description: Stores key-value pairs. Uses a hash function to map keys to indexes. Operations: - Insert, Delete, Search – typically O(1). Common Uses: - Fast lookup (e.g., dictionary); - Caching; - Indexing in databases.
6. Tree: Description: A hierarchical structure consisting of nodes. The top node is called the root. Common Types: Binary Tree; Binary Search Tree (BST); AVL Tree; Heap; Trie. Common Uses: - Representing hierarchical data (file systems); - Fast searching/sorting; - Syntax trees in compilers.
7. Graph: Description: A collection of nodes (vertices) connected by edges. Can be directed/undirected, weighted/unweighted. Common Uses: - Social networks; - Road maps; - Network routing; - Web crawling.
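Two of the structures above meet in breadth-first search: a FIFO queue drives level-by-level traversal of a graph. A sketch using `collections.deque` (the graph data is a made-up example):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first search: a FIFO queue yields level-by-level order."""
    visited = [start]
    queue = deque([start])          # enqueue the start vertex
    while queue:
        node = queue.popleft()      # dequeue from the front: O(1) with deque
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)  # enqueue at the rear
    return visited

# Graph as an adjacency list (example data):
graph = {
    "A": ["B", "C"],
    "B": ["D"],
    "C": ["D"],
    "D": [],
}
print(bfs(graph, "A"))  # ['A', 'B', 'C', 'D']
```

A plain Python list would also work as a queue, but `list.pop(0)` is O(n); `deque.popleft()` keeps dequeue at O(1).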
4 Data Structures and Data Structure Operations
1. What is a Data Structure? A data structure is a way of organizing and storing data so that it can be accessed and modified efficiently. It defines the relationship between data and the operations that can be performed on the data.
2. Types of Data Structures:
A. Primitive Data Structures: Basic data types supported by programming languages. Examples: - Integer; - Float; - Character; - Boolean.
B. Non-Primitive Data Structures: These are more complex and can store multiple values.
i. Linear Data Structures: Data is stored in a sequential manner.
Data Structure Description
Array Fixed-size indexed collection
Linked List Nodes connected via pointers
Stack LIFO structure (Last-In-First-Out)
Queue FIFO structure (First-In-First-Out)
ii. Non-linear Data Structures: Data elements are not stored sequentially.
Data Structure Description
Tree Hierarchical structure (e.g., BST, Trie)
Graph Nodes (vertices) connected by edges
Heap Complete binary tree used in priority queues
Hash Table Key-value mapping using hash functions
[Link] Structure Operations: All data structures support a basic set of operations that allow us to manipulate data. Common
Operations:
Operation Description
Insertion Add an element to the data structure
Deletion Remove an element
Traversal Visit each element (e.g., loop through)
Searching Find an element based on a key or value
Sorting Arrange elements in a specific order (ascending or descending)
Updating Change the value of an element
Merging Combine two data structures into one (e.g., merging two arrays or lists)
Examples of Operations in Different Structures:
Data Structure Insertion Deletion Searching Traversal
Array O(1) (end) / O(n) (middle) O(n) O(n) O(n)
Linked List O(1) (at head) O(1)/O(n) O(n) O(n)
Stack O(1) (push) O(1) (pop) O(n) O(n)
Queue O(1) (enqueue) O(1) (dequeue) O(n) O(n)
Binary Search Tree (BST) O(log n) avg O(log n) avg O(log n) avg (O(n) worst if unbalanced) O(n) (inorder, preorder, postorder)
Hash Table O(1) avg / O(n) worst O(1) O(1) O(n)
Graph Depends on implementation Depends on implementation O(V + E) O(V + E) (BFS/DFS)
4. Importance of Operations: Understanding operations helps: - Choose the right data structure for a problem; - Optimize for speed and memory; - Write efficient code.
5. Real-Life Analogies: - Array → Like a row of lockers, each with a number (index); - Stack → Like a pile of plates (LIFO); - Queue → Like a line at a ticket counter (FIFO); - Linked List → Like a chain of train cars; - Tree → Like a family tree or file system; - Graph → Like a map or social network; - Hash Table → Like a dictionary with keys and values.
5 Arrays: Introduction, Types of Arrays
1. What is an Array? An array is a collection of elements (values or variables), each identified by an index or a key, and stored in contiguous memory locations. It is one of the most basic and widely used linear data structures in programming.
Key Characteristics of Arrays: - Fixed size (in most languages like C, C++); - Elements are of the same data type; - Accessed using indexing (starts from 0 in most languages); - Supports random access (O(1) time to access any element).
2. Why Use Arrays? - To store multiple values using a single variable name; - Easy to traverse and manipulate using loops; - Useful in implementing other data structures (stacks, queues, etc.); - Efficient for searching and sorting algorithms.
3. Code Examples: In C: int arr[5] = {10, 20, 30, 40, 50}; In Python: arr = [10, 20, 30, 40, 50]
4. Types of Arrays: Arrays can be categorized based on: - Dimensions (1D, 2D, 3D, etc.); - Data type (int array, float array, char array); - Static or Dynamic allocation.
A. Based on Dimensions:
i. One-Dimensional Array (1D Array): - A simple list of elements; - Accessed using a single index. int arr[5] = {1, 2, 3, 4, 5}; // C arr = [1, 2, 3, 4, 5] # Python
ii. Two-Dimensional Array (2D Array): - An array of arrays (matrix form); - Accessed using two indices (row and column).
int matrix[3][3] = {   // C
    {1, 2, 3},
    {4, 5, 6},
    {7, 8, 9}
};
matrix = [             # Python
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
]
iii. Multidimensional Array (3D and higher): - More than two dimensions; - Used for complex data like 3D graphics, tensor computation. int arr[2][3][4]; // C
# Example of a 3D array in Python using lists:
arr = [[[0]*4 for _ in range(3)] for _ in range(2)]
B. Based on Data Type: - Integer Array: int arr[] = {1, 2, 3}; - Float Array: float arr[] = {1.1, 2.2}; - Character Array: char arr[] = {'a', 'b', 'c'}; - String Array: An array of strings (an array of character arrays in C).
C. Based on Memory Allocation:
i. Static Array: - Size is defined at compile time; - Memory is fixed. int arr[10]; // C
ii. Dynamic Array: - Size can be changed at runtime; - Uses dynamic memory (e.g., malloc() in C, ArrayList in Java, or just lists in Python).
int* arr = (int*)malloc(n * sizeof(int)); // C
arr = [] # Python dynamic list
arr.append(10)
6 Memory Representation of Arrays
1. What Does Memory Representation Mean? When an array is created, a block of contiguous memory (i.e., memory cells next to each other) is reserved to store its elements. Understanding how arrays are stored in memory is essential for performance and low-level programming.
2. How Arrays Are Stored in Memory: - Each array element is stored in a contiguous memory location; - Each element occupies memory according to its data type size; - The memory location of the first element is called the base address; - Every subsequent element is placed right after the previous one.
3. Indexing and Address Calculation: The address of any element in the array can be calculated using the formula:
Address Formula: Address(arr[i]) = Base_Address + (i × Size_of_Data_Type). Where: - arr[i] → Element at index i; - Base_Address → Memory address of arr[0]; - i → Index number; - Size_of_Data_Type → Size in bytes of the data type (e.g., int = 4 bytes).
Example: 1D Array: int arr[5] = {10, 20, 30, 40, 50}; // Assume base address = 1000
Assuming each int takes 4 bytes:
Index (i)  Value  Address Calculation  Final Address
0          10     1000 + (0 × 4)       1000
1          20     1000 + (1 × 4)       1004
2          30     1000 + (2 × 4)       1008
3          40     1000 + (3 × 4)       1012
4          50     1000 + (4 × 4)       1016
Example: 2D Array:
int arr[2][3] = {
    {1, 2, 3},
    {4, 5, 6}
}; // Base address = 2000
*Stored in row-major order (C, C++): All elements are stored row by row in memory.
Address Formula for a 2D Array (Row-Major Order): Address(arr[i][j]) = Base_Address + ((i × No_of_Columns) + j) × Element_Size
Element    Value  Address (assuming 4 bytes per int)
arr[0][0]  1      2000
arr[0][1]  2      2004
arr[0][2]  3      2008
arr[1][0]  4      2012
arr[1][1]  5      2016
arr[1][2]  6      2020
4. Row-Major vs Column-Major Order:
Order Type    Stored by      Used in
Row-Major     Rows first     C, C++, Python
Column-Major  Columns first  Fortran, MATLAB
5. Memory Layout Diagram (1D Array): If we visualize an array like: int arr[4] = {5, 10, 15, 20}; // Base address = 1000
It is stored in memory as:
Address → 1000 1004 1008 1012
Element → 5    10   15   20
Index   → [0]  [1]  [2]  [3]
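The address formulas above can be checked with a short script (a sketch; the function names are illustrative):

```python
def address_1d(base, i, elem_size):
    """Address(arr[i]) = Base_Address + i * Size_of_Data_Type"""
    return base + i * elem_size

def address_2d_row_major(base, i, j, n_cols, elem_size):
    """Address(arr[i][j]) = Base_Address + ((i * No_of_Columns) + j) * Element_Size"""
    return base + (i * n_cols + j) * elem_size

# Reproducing the tables above (4-byte ints):
print(address_1d(1000, 3, 4))                  # 1012  -> arr[3] in the 1D example
print(address_2d_row_major(2000, 1, 2, 3, 4))  # 2020  -> arr[1][2] in the 2D example
```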
7 Applications and Operations of Arrays
1. Applications of Arrays: Arrays are widely used in various real-world and technical scenarios due to their simplicity, speed, and direct-access capability.
A. Real-World Applications:
Application Description
Storing multiple items Arrays can store large volumes of similar data (e.g., student marks, salaries).
Image processing Images are represented as 2D arrays (pixels with RGB values).
Matrices and mathematical ops Arrays are used to implement matrices, vectors, and perform operations like addition and
multiplication.
Scheduling and timetables Used in apps like calendars, exams schedules, train timetables.
Gaming (boards/grids) Games like Sudoku, chess, and tic-tac-toe use 2D arrays for board state.
Database tables Arrays are used internally to represent rows and columns in RAM.
Buffers in I/O systems Arrays serve as buffers for data transfer in networking or file systems.
Data science & ML Arrays (via NumPy arrays, tensors) are fundamental in ML, used for data representation and
computation.
B. Technical Applications: - Sorting algorithms (e.g., bubble sort, quicksort); - Searching algorithms (e.g., linear search, binary search); - Hashing (using arrays as buckets); - Implementing other data structures: stacks, queues, heaps; - Dynamic programming (storing intermediate results).
2. Operations on Arrays: Arrays support several operations that are essential for data manipulation and algorithm design. Common Array Operations:
Operation Description Time Complexity
Traversal Accessing and processing each element one by one O(n)
Insertion Adding a new element at a specific position O(1) (end), O(n) (anywhere else)
Deletion Removing an element from the array O(n)
Searching Finding the location of an element O(n) (linear), O(log n) (binary search in sorted array)
Updating Changing the value of an existing element O(1)
Sorting Rearranging elements in order (ascending/descending) O(n log n) best (e.g., merge sort), O(n²) worst (e.g., bubble sort)
Merging Combining two arrays into one O(n + m)
Reversing Reversing the order of elements O(n)
Examples in Python:
# Traversal:
arr = [10, 20, 30, 40]
for i in arr:
    print(i)
# Insertion (at end):
arr.append(50)    # [10, 20, 30, 40, 50]
# Deletion:
arr.remove(30)    # removes 30 from the array
# Searching:
if 20 in arr:
    print("Found at index", arr.index(20))
# Updating:
arr[2] = 100      # change the third element to 100
*Important Notes: - In static arrays (as in C/C++), the size is fixed and operations like insertion require shifting elements. - In dynamic arrays (like Python lists or Java ArrayLists), the size can grow or shrink, making operations more flexible.
8 Stacks: Introduction, Memory Representation
1. Introduction to Stack: A stack is a linear data structure that follows the LIFO (Last In, First Out) principle. - The last element inserted is the first one to be removed. - Think of a stack of plates: you add plates to the top, and remove from the top.
2. Basic Stack Operations:
Operation  Description
Push       Insert an element at the top of the stack
Pop        Remove the topmost element
Peek/Top   View the top element without removing it
isEmpty    Check if the stack is empty
isFull     Check if the stack is full (only for a static stack)
Use Cases / Applications of Stack: - Undo/Redo functionality in editors; - Backtracking (e.g., maze, puzzles); - Expression evaluation (postfix, infix); - Function call management (call stack in compilers); - Web browser history navigation; - Recursive programming.
3. Memory Representation of Stack: A stack can be implemented in two main ways:
A. Array-Based Stack (Static Stack): - Uses a fixed-size array; - A top pointer/index keeps track of the top element. Example: int stack[100]; int top = -1;
*Push Operation: top++; stack[top] = value;
*Pop Operation: value = stack[top]; top--;
*Memory Layout (for a stack of size 5):
Index  Value
0      10
1      20
2      30
3
4
- Top = 2; - Next push goes to index 3; - Next pop removes the value at index 2.
*Limitations: - Fixed size (not flexible); - Risk of stack overflow if full.
B. Linked List-Based Stack (Dynamic Stack): - Uses a linked list; - The head (or top node) is considered the top of the stack.
*Structure:
struct Node {
    int data;
    struct Node* next;
};
- Each push() adds a node at the front; - Each pop() removes a node from the front.
*Memory Layout: Top → [30|Next] → [20|Next] → [10|NULL] - Each element is stored in a node; - More flexible: no size limit (until memory is full).
*Comparison of Stack Implementations:
Feature Array-Based Stack Linked List-Based Stack
Memory Allocation Fixed size (static) Dynamic (allocates as needed)
Overflow Risk Yes (if full) No (unless memory full)
Underflow Handling Yes Yes
Access Speed Fast (O(1)) Slightly slower (O(1), but with pointer overhead)
Space Efficiency May waste memory Uses extra space for pointers
4. Stack Memory Representation in the System (Call Stack): When a program runs, the system uses a special stack memory for: - Function calls; - Local variables; - Return addresses.
Example:
void A() {
    B();
}
void B() {
    C();
}
Call Stack:
Stack Frame  Function
Top          C()
             B()
Bottom       A()
Each function call pushes a frame to the call stack, and when a function returns, the frame is popped.
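The array-based representation above (a fixed array plus a top index) can be sketched in Python; the class name is illustrative, with a plain list standing in for the C array:

```python
class ArrayStack:
    """Fixed-capacity stack: a pre-sized list plus a 'top' index, as in the C sketch."""
    def __init__(self, capacity):
        self.data = [None] * capacity
        self.top = -1                    # -1 means the stack is empty

    def is_empty(self):
        return self.top == -1

    def is_full(self):
        return self.top == len(self.data) - 1

    def push(self, value):
        if self.is_full():
            raise OverflowError("stack overflow")
        self.top += 1                    # top++; stack[top] = value;
        self.data[self.top] = value

    def pop(self):
        if self.is_empty():
            raise IndexError("stack underflow")
        value = self.data[self.top]      # value = stack[top]; top--;
        self.top -= 1
        return value

s = ArrayStack(5)
s.push(10)
s.push(20)
s.push(30)
print(s.top)    # 2  (matches the memory-layout table above)
print(s.pop())  # 30
print(s.top)    # 1
```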
9 Applications and Operations of Stacks
1. What is a Stack? A stack is a linear data structure that follows the LIFO (Last In, First Out) principle. The last element pushed onto the stack is the first one to be popped off.
2. Operations of Stack:
i. Push: - Adds an element to the top of the stack; - If the stack is full (in a static array implementation), it causes overflow.
ii. Pop: - Removes the topmost element from the stack; - If the stack is empty, it causes underflow.
iii. Peek / Top: Returns the element at the top of the stack without removing it.
iv. isEmpty: Checks if the stack has no elements.
v. isFull (for fixed-size stacks): Checks if the stack has reached its maximum capacity.
*Time Complexities of Stack Operations:
Operation  Time Complexity
Push       O(1)
Pop        O(1)
Peek       O(1)
isEmpty    O(1)
isFull     O(1)
3. Applications of Stacks: Stacks are widely used in both real-world systems and computer programming.
A. Real-World Analogies:
Real-Life Stack Description
Stack of plates/books Last plate placed is the first one removed
Undo button in text editor Reverses recent actions
Browser back button Goes back to the most recently visited page
B. Computer Science Applications:
Application Description
Expression evaluation Evaluate postfix, infix, and prefix expressions using stacks
Expression conversion Convert infix to postfix/prefix and vice versa
Function calls (Call stack) Handles recursive and nested function calls
Undo/Redo operations In text editors, changes are pushed/popped using stack
Balanced parentheses checking e.g., ((a+b)*c) → uses stack to check if parentheses are balanced
Backtracking algorithms Maze-solving, Sudoku, recursion
Compiler syntax parsing Uses stacks to build abstract syntax trees
Memory management Stack memory is used for local variables and function calls
String reversal Push all characters and pop them to reverse
DFS (Depth-First Search) Uses stack explicitly or implicitly via recursion
4. Code Examples (in Python):
stack = []
# Push:
stack.append(10)
stack.append(20)
# Pop:
print(stack.pop())  # Output: 20
# Peek:
top = stack[-1] if stack else None
print(top)
# Check if empty:
if not stack:
    print("Stack is empty")
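One application from the table above, balanced-parentheses checking, fits in a few lines (a sketch; `is_balanced` is an illustrative name):

```python
def is_balanced(expr):
    """Check bracket balance with a stack: push openers, pop and match closers."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in expr:
        if ch in "([{":
            stack.append(ch)                 # push every opening bracket
        elif ch in pairs:
            # A closer must match the most recent unmatched opener (LIFO).
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack                         # leftover openers mean unbalanced

print(is_balanced("((a+b)*c)"))  # True
print(is_balanced("(a+b))("))    # False
```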
10 Recursion and Stacks
1. What is Recursion? Recursion is a programming technique where a function calls itself to solve a smaller instance of the same problem. It is commonly used for: - Factorial calculation; - Fibonacci sequence; - Tree traversal; - Backtracking problems (e.g., maze, Sudoku).
2. Relationship Between Recursion and the Stack: Every time a recursive function is called, the system uses an internal stack (call stack) to: - Store the current function call; - Save local variables and the return address; - Resume execution after the recursive call returns. Key Idea: Recursion uses the stack implicitly.
3. How the Stack Works in Recursion: Example: Recursive Function to Calculate Factorial
Recursive Function to Calculate Factorial
def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n - 1)
*Stack Representation (for factorial(4)):
Function Call  State Stored in Stack
factorial(4)   Waits for factorial(3)
factorial(3)   Waits for factorial(2)
factorial(2)   Waits for factorial(1)
factorial(1)   Waits for factorial(0)
factorial(0)   Returns 1 (base case)
Once the base case is reached, the stack starts popping: - factorial(1) = 1 * 1 = 1; - factorial(2) = 2 * 1 = 2; - factorial(3) = 3 * 2 = 6; - factorial(4) = 4 * 6 = 24.
4. Memory Representation of Recursion (Call Stack): Each function call stores: - Function parameters; - Return address; - Local variables. This is stored as a stack frame. When a recursive call is made, a new stack frame is pushed. When it returns, the stack frame is popped off.
*Visual Example (stack frames for factorial(3)):
| return 3 * factorial(2) | ← Top of Stack
| return 2 * factorial(1) |
| return 1 * factorial(0) |
| return 1                | ← Base case
5. Limitations of Recursion:
Issue               Explanation
Stack Overflow      Too many recursive calls can exceed stack memory
Slower Performance  Overhead of pushing/popping stack frames
Memory Usage        More memory needed for each recursive call
⚠️ In Python, recursion deeper than about 1000 calls raises a RecursionError.
6. Tail Recursion (Optimized Recursion): - A special kind of recursion where the recursive call is the last operation; - Some languages (e.g., Scheme, Haskell) optimize tail recursion by reusing the same stack frame; - Python does not support tail-call optimization.
7. Simulating Recursion with an Explicit Stack: Since recursion uses a system-managed stack, we can also write the same logic using our own stack (iteration). Example: Iterative Factorial Using a Stack
def factorial_iterative(n):
    stack = []
    while n > 0:
        stack.append(n)
        n -= 1
    result = 1
    while stack:
        result *= stack.pop()
    return result
11 U-4 Searching: Linear and Binary Search
1. What is Searching? Searching is the process of finding the position (index) of a given element (called the key) within a collection of data (e.g., an array or list).
2. Types of Searching Algorithms: The two most basic types are:
Search Type    Works on          Efficiency
Linear Search  Any list          Slower
Binary Search  Sorted list only  Faster
3. Linear Search: Linear Search checks each element of the array one by one until the desired element is found or the end of the array is reached.
*Algorithm Steps: 1. Start from the first element. 2. Compare each element with the target. 3. If a match is found → return the index. 4. If not found by the end → return -1.
**Time Complexity: - Best case: O(1) – element at the beginning; - Worst case: O(n) – element not found or at the end; - Average case: O(n). **Space Complexity: O(1).
*Python Example:
def linear_search(arr, key):
    for i in range(len(arr)):
        if arr[i] == key:
            return i
    return -1

# Example:
arr = [10, 20, 30, 40, 50]
print(linear_search(arr, 30))  # Output: 2
4. Binary Search: Binary Search is a more efficient algorithm that works only on sorted arrays. It repeatedly divides the search interval in half.
*Algorithm Steps: 1. Start with the entire sorted array. 2. Find the middle element. 3. If the middle element is the key → return the index. 4. If key < middle element → search the left half. 5. If key > middle element → search the right half. 6. Repeat until the element is found or the interval is empty.
**Time Complexity: - Best case: O(1); - Worst case: O(log n); - Average case: O(log n). **Space Complexity: - O(1) for iterative; - O(log n) for recursive (due to the call stack).
*Python Example (Iterative):
def binary_search(arr, key):
    low = 0
    high = len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == key:
            return mid
        elif arr[mid] < key:
            low = mid + 1
        else:
            high = mid - 1
    return -1

# Example:
arr = [10, 20, 30, 40, 50]
print(binary_search(arr, 40))  # Output: 3
5. Comparison Table:
Feature Linear Search Binary Search
Works on Unsorted or sorted array Sorted array only
Time Complexity O(n) O(log n)
Implementation Simple Slightly more complex
Use Case Small or unsorted data Large, sorted datasets
Space Complexity O(1) O(1) (iterative)
Speed Slower Much faster
6. When to Use What?
Situation Use Search Type
Data is unsorted and small Linear Search
Data is sorted and large Binary Search
One-time search on small data Linear Search
Multiple searches on same dataset Binary Search (sort once, search many times)
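The complexity notes above mention the recursive form's O(log n) call-stack cost, but only the iterative version is shown. A recursive sketch (the function name and default-argument style are illustrative):

```python
def binary_search_rec(arr, key, low=0, high=None):
    """Recursive binary search: each call halves the interval (O(log n) depth)."""
    if high is None:
        high = len(arr) - 1
    if low > high:
        return -1                     # interval empty: key not present
    mid = (low + high) // 2
    if arr[mid] == key:
        return mid
    if key < arr[mid]:
        return binary_search_rec(arr, key, low, mid - 1)   # left half
    return binary_search_rec(arr, key, mid + 1, high)      # right half

arr = [10, 20, 30, 40, 50]
print(binary_search_rec(arr, 40))  # 3
print(binary_search_rec(arr, 35))  # -1
```

Each recursive call adds one stack frame, and the interval halves each time, which is where the O(log n) space figure comes from.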
12 Sorting: Bubble Sort, Insertion Sort, Selection Sort, Merge Sort, Quick Sort
What is Sorting? Sorting is the process of arranging data (typically numbers or strings) in a specific order, either ascending or descending.
1. Bubble Sort: Idea: - Repeatedly compare adjacent elements and swap them if they are in the wrong order; - The largest element "bubbles" to the end in each pass. Example: [5, 2, 4, 1] → [2, 4, 1, 5] → [2, 1, 4, 5] → [1, 2, 4, 5]
**Time Complexity: - Best: O(n) (already sorted, with an early-exit check); - Average/Worst: O(n²); - Space: O(1). *Python Code:
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break  # no swaps in a pass: already sorted (best case O(n))
2. Insertion Sort: Idea: - Builds the sorted array one item at a time; - Takes one element, finds its correct position in the sorted part, and inserts it. Example: [5, 2, 4, 1] → [2, 5, 4, 1] → [2, 4, 5, 1] → [1, 2, 4, 5]
**Time Complexity: - Best: O(n); - Average/Worst: O(n²); - Space: O(1). *Python Code:
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
3. Selection Sort: Idea: Repeatedly find the smallest element in the unsorted part and move it to the front. Example: [5, 2, 4, 1] → [1, 2, 4, 5]
**Time Complexity: - Best/Worst/Average: O(n²); - Space: O(1). *Python Code:
def selection_sort(arr):
    for i in range(len(arr)):
        min_idx = i
        for j in range(i + 1, len(arr)):
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
4. Merge Sort (Divide and Conquer): Idea: - Divide the array into halves recursively; - Merge the sorted halves. Example: [5, 2, 4, 1] → [5, 2] & [4, 1] → [2, 5] & [1, 4] → [1, 2, 4, 5]
**Time Complexity: - Best/Worst/Average: O(n log n); - Space: O(n). *Python Code:
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        left = arr[:mid]
        right = arr[mid:]
        merge_sort(left)
        merge_sort(right)
        i = j = k = 0
        # Merge left and right
        while i < len(left) and j < len(right):
            if left[i] < right[j]:
                arr[k] = left[i]
                i += 1
            else:
                arr[k] = right[j]
                j += 1
            k += 1
        # Remaining elements
        while i < len(left):
            arr[k] = left[i]
            i += 1
            k += 1
        while j < len(right):
            arr[k] = right[j]
            j += 1
            k += 1
5. Quick Sort (Divide and Conquer): Idea: - Pick a pivot element; - Partition the array so that elements less than the pivot go to the left and greater go to the right; - Recursively sort both parts. Example (pivot = first element, as in the code below): [5, 2, 4, 1] → pivot 5 → [2, 4, 1] + [5] → pivot 2 → [1] + [2] + [4] → [1, 2, 4, 5]
*Time Complexity: - Best/Average: O(n log n); - Worst (already sorted): O(n²); - Space: O(log n) (recursion stack).
*Python Code:
def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[0]
    left = [x for x in arr[1:] if x < pivot]
    right = [x for x in arr[1:] if x >= pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)
Summary Comparison Table:
Algorithm       Time (Best)  Time (Worst)  Space     Stable?  Method
Bubble Sort     O(n)         O(n²)         O(1)      Yes      Compare & Swap
Insertion Sort  O(n)         O(n²)         O(1)      Yes      Incremental Build
Selection Sort  O(n²)        O(n²)         O(1)      No       Min Selection
Merge Sort      O(n log n)   O(n log n)    O(n)      Yes      Divide & Conquer
Quick Sort      O(n log n)   O(n²)         O(log n)  No       Partitioning
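Stability, one column of the table above, can be demonstrated directly: a stable sort keeps equal keys in their input order, while selection sort's long-range swaps can reorder them. A sketch on (key, label) pairs (example data is made up):

```python
def selection_sort(pairs):
    """Selection sort on (key, label) pairs: NOT stable (long-range swaps)."""
    a = list(pairs)  # work on a copy
    for i in range(len(a)):
        m = i
        for j in range(i + 1, len(a)):
            if a[j][0] < a[m][0]:
                m = j
        a[i], a[m] = a[m], a[i]
    return a

data = [(2, "a"), (1, "b"), (2, "c"), (1, "d")]

# Python's built-in sorted() is stable: equal keys keep their input order.
print(sorted(data, key=lambda p: p[0]))  # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]

# Selection sort may reorder equal keys ('c' can end up before 'a'):
print(selection_sort(data))
```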
13 Comparison of Various Searching and Sorting Algorithms
Part 1: Searching Algorithms Comparison:
Feature Linear Search Binary Search
Requirement Works on any data Data must be sorted
Best Case O(1) — first element match O(1) — mid element match
Worst Case O(n) O(log n)
Average Case O(n/2) = O(n) O(log n)
Space Complexity O(1) O(1) iterative, O(log n) recursive
Implementation Very simple Slightly more complex
Stability Not applicable Not applicable
When to Use Small or unsorted data Large, sorted datasets
Part 2: Sorting Algorithms Comparison:
Feature Bubble Sort Insertion Sort Selection Sort Merge Sort Quick Sort
Best Case O(n) O(n) O(n²) O(n log n) O(n log n)
Average Case O(n²) O(n²) O(n²) O(n log n) O(n log n)
Worst Case O(n²) O(n²) O(n²) O(n log n) O(n²) (bad pivot)
Space      O(1)                  O(1)                   O(1)                         O(n)              O(log n)
Stable?    Yes                   Yes                    No                           Yes               No
In-place?  Yes                   Yes                    Yes                          No                Yes
Method     Compare & Swap        Incremental Insertion  Min Selection                Divide & Conquer  Divide & Conquer
Use Case   Small data, teaching  Mostly sorted data     Memory-limited environments  Large datasets    Fast general-purpose sort
**Summary Diagrams: *Searching: Time Complexity Summary
Algorithm Best Average Worst
Linear Search O(1) O(n) O(n)
Binary Search O(1) O(log n) O(log n)
*Sorting: Time Complexity Summary
Algorithm Best Average Worst
Bubble Sort O(n) O(n²) O(n²)
Insertion Sort O(n) O(n²) O(n²)
Selection Sort O(n²) O(n²) O(n²)
Merge Sort O(n log n) O(n log n) O(n log n)
Quick Sort O(n log n) O(n log n) O(n²)
**When to Use Which?
*Searching:
- Use Linear Search if: the list is unsorted; only one or a few searches are needed.
- Use Binary Search if: the list is sorted; multiple searches will be performed.
*Sorting:
Situation                         Recommended Sort
List is almost sorted             Insertion Sort
You want a simple algorithm       Selection or Bubble Sort
You need performance              Quick Sort
You need a stable sort            Merge Sort, Insertion Sort
You want to minimize memory       Quick Sort, Insertion Sort
You are sorting very large data   Merge Sort (external sorting)
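As an illustration of why Insertion Sort suits nearly sorted data, here is a minimal Python sketch (the function name is illustrative, not from the source): each element is inserted into the already-sorted prefix, so when few elements are out of place the inner loop barely runs.

```python
def insertion_sort(arr):
    """Insert each element into the sorted prefix to its left.
    Nearly sorted input needs few shifts, giving the O(n) best case."""
    a = list(arr)  # work on a copy
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([2, 1, 3, 5, 4]))  # [1, 2, 3, 4, 5]
```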
4 U-3 Trees: Definition and Basic concepts
What is a Tree in Data Structures? A tree is a non-linear hierarchical data structure consisting of nodes connected by edges. Unlike arrays, stacks, queues, or linked lists (which are linear), trees branch out like a real tree.
*Tree Characteristics: - Consists of nodes and edges; - Has a root node (starting point); - Each node can have 0 or more child nodes; - No cycles (i.e., no loops); - A connected and acyclic graph.
*Tree Terminology:
Term         Definition
Node         Basic unit of a tree that contains data
Root         The topmost node of the tree
Parent       A node that has children
Child        A node that is descended from another node
Leaf         A node with no children
Edge         Connection between parent and child nodes
Sibling      Nodes that share the same parent
Ancestor     All the nodes from a node to the root
Descendant   All nodes below a given node
Degree       Number of children a node has
Depth        Number of edges from the root to the node
Height       Number of edges on the longest path from the node to a leaf
Subtree      Any node and its descendants (itself becomes the root of the subtree)
Level        The position of a node in the tree (the root is at level 0)

*Types of Trees:
1. General Tree: each node can have any number of children.
2. Binary Tree: each node has at most two children (left and right).
3. Binary Search Tree (BST): a binary tree where left child < node and right child > node.
4. Balanced Tree (e.g., AVL, Red-Black Tree): maintains minimal height for fast operations.
5. Complete Binary Tree: all levels are completely filled except possibly the last, which is filled from left to right.
6. Full Binary Tree: every node has 0 or 2 children.
7. Perfect Binary Tree: all internal nodes have 2 children and all leaves are at the same level.
Why Use Trees? Trees are used when: - Hierarchical relationships are needed (e.g., file systems); - Fast searching, insertion, and deletion are required (e.g., BSTs). - Used in: databases; compilers (syntax trees); routing algorithms; artificial intelligence (decision trees).
Tree Representation in Memory: A. Using Pointers (Linked Structure): each node contains: data; a pointer to the left child; a pointer to the right child.
struct Node {
int data;
struct Node* left;
struct Node* right;
};
B. Using Arrays (for Complete Binary Trees): root at index 0; left child of index i: 2*i + 1; right child of index i: 2*i + 2.
Tree Traversals (Basic Concept): traversal = visiting all nodes in a specific order. Types of Tree Traversals:
Type Description
In-order Left → Root → Right
Pre-order Root → Left → Right
Post-order Left → Right → Root
Level-order Level by level (uses a queue)
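The four traversal orders can be sketched in Python (a minimal illustration; the class and function names are not from the source):

```python
from collections import deque

class Node:
    def __init__(self, data):
        self.data, self.left, self.right = data, None, None

def inorder(n, out):          # Left -> Root -> Right
    if n:
        inorder(n.left, out); out.append(n.data); inorder(n.right, out)

def preorder(n, out):         # Root -> Left -> Right
    if n:
        out.append(n.data); preorder(n.left, out); preorder(n.right, out)

def postorder(n, out):        # Left -> Right -> Root
    if n:
        postorder(n.left, out); postorder(n.right, out); out.append(n.data)

def level_order(root):        # level by level, using a queue
    out, q = [], deque([root] if root else [])
    while q:
        n = q.popleft()
        out.append(n.data)
        if n.left:
            q.append(n.left)
        if n.right:
            q.append(n.right)
    return out

# Tree:   A
#        / \
#       B   C
root = Node('A'); root.left = Node('B'); root.right = Node('C')
out = []; inorder(root, out);  print('inorder:', out)     # ['B', 'A', 'C']
out = []; preorder(root, out); print('preorder:', out)    # ['A', 'B', 'C']
out = []; postorder(root, out); print('postorder:', out)  # ['B', 'C', 'A']
print('level-order:', level_order(root))                  # ['A', 'B', 'C']
```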
5 Representation in Contiguous Storage of trees
What is Contiguous Storage Representation? In contiguous storage, a tree is stored in a single block of memory (an array), where nodes are placed at specific index positions based on their relationships in the tree. This representation is efficient for certain types of trees, especially: complete binary trees; full binary trees; heaps.
Array Representation of a Binary Tree: in a binary tree, every node can have at most two children: a left child and a right child. Each node is stored at an array index, and its children and parent are computed using index formulas.
*General Formulas (1-based index):
Node Position                    Formula
Left child of node at index i    2 * i
Right child of node at index i   2 * i + 1
Parent of node at index i        i // 2

*For a 0-based index, the formulas change slightly:
Node Position                    Formula
Left child of node at index i    2 * i + 1
Right child of node at index i   2 * i + 2
Parent of node at index i        (i - 1) // 2
Example (Complete Binary Tree):

        A
       / \
      B   C
     / \  /
    D   E F

*Array Representation (0-based index):
Index   Element
0       A
1       B
2       C
3       D
4       E
5       F

*How it works:
- A (index 0): left child = 2*0+1 = 1 → B; right child = 2*0+2 = 2 → C.
- B (index 1): left child = 2*1+1 = 3 → D; right child = 2*1+2 = 4 → E.
- C (index 2): left child = 2*2+1 = 5 → F; right child = 2*2+2 = 6 → none.
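The 0-based index formulas can be checked with a few one-line helpers in Python (the function names are illustrative):

```python
def left(i):
    return 2 * i + 1          # 0-based left child

def right(i):
    return 2 * i + 2          # 0-based right child

def parent(i):
    return (i - 1) // 2       # 0-based parent

# Complete binary tree stored in an array
tree = ['A', 'B', 'C', 'D', 'E', 'F']
print(tree[left(0)], tree[right(0)])   # B C  (children of A)
print(tree[left(1)], tree[right(1)])   # D E  (children of B)
print(tree[parent(5)])                 # C    (parent of F)
```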
Limitations of Array Representation:
Drawback                          Explanation
Wasted space                      For sparse trees, many array elements remain unused
Fixed size                        Must define the size beforehand or resize
Hard to represent general trees   Only binary trees are efficiently represented this way
Ideal Use Cases: complete binary trees; heaps (min/max heap); binary search trees with few missing nodes.
Alternative: Linked Representation (Pointers), for trees such as general trees (any number of children) and sparse binary trees.
6 Binary Tree, Binary Tree Traversal, searching, Insertion and Deletion in Binary trees, Binary Search tree
1. Binary Tree: Definition: A Binary Tree is a hierarchical data structure where each node has at most two children, usually referred to as the left child and the right child.
*Key Terminology:
Term            Description
Root            Topmost node of the tree
Leaf            Node with no children
Internal Node   Node with at least one child
Height          Longest path from the root to a leaf

*Types of Binary Trees:
Type                   Description
Full Binary Tree       Every node has 0 or 2 children
Complete Binary Tree   All levels full except possibly the last (left-filled)
Perfect Binary Tree    All levels are completely filled
Skewed Binary Tree     Every node has only one child (left or right)
2. Binary Tree Traversal: traversal means visiting all nodes in a specific order.
*A. Depth-First Traversal:
Traversal Type   Order                 Output (for the tree below)
Inorder          Left → Root → Right   D B E A C
Preorder         Root → Left → Right   A B D E C
Postorder        Left → Right → Root   D E B C A

Example tree:
      A
     / \
    B   C
   / \
  D   E
*B. Breadth-First Traversal (Level Order): uses a queue to visit nodes level by level. Output: A B C D E.
3. Searching in a Binary Tree: in a regular binary tree (not a BST), you must search recursively or with level-order traversal (BFS) to find an element. *Time Complexity: worst case O(n) (you may have to visit every node).
4. Insertion in a Binary Tree (not BST): for non-BST binary trees (like complete binary trees), insertion is usually done using level-order traversal to find the first empty spot (from left to right).
*Algorithm (Level Order): 1. Use a queue to do level-order traversal. 2. Find the first node with an empty left or right child. 3. Insert the new node there. *Time Complexity: O(n) in the worst case.
5. Deletion in a Binary Tree (not BST): to delete a node: 1. Use level-order traversal to find the node to delete and the deepest rightmost node. 2. Replace the data of the node to be deleted with the deepest node's data. 3. Delete the deepest node.
6. Binary Search Tree (BST): A Binary Search Tree is a special kind of binary tree where: left child < node; right child > node. *Properties:
Operation   Time Complexity (Average)   Worst Case (Skewed Tree)
Search      O(log n)                    O(n)
Insertion   O(log n)                    O(n)
Deletion    O(log n)                    O(n)

BST Example:
        50
       /  \
     30    70
    / \    / \
  20  40  60  80

- Search 40: go left, then right.
- Insert 55: go right of 50 → left of 70 → left of 60.
- Delete 30: replace with its inorder successor (40).
*Traversal in BST:
Traversal Output (for BST above)
Inorder 20 30 40 50 60 70 80 (Sorted)
Preorder 50 30 20 40 70 60 80
Postorder 20 40 30 60 80 70 50
Inorder Traversal of a BST always gives a sorted list.
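A small Python sketch (illustrative, not from the source) confirms the BST properties above: inserting the example keys and taking an inorder traversal yields the sorted sequence, and search follows a single root-to-leaf path.

```python
class BSTNode:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Recursive BST insert: smaller keys go left, larger go right."""
    if root is None:
        return BSTNode(key)
    if key < root.key:
        root.left = insert(root.left, key)
    elif key > root.key:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Iterative BST search: one comparison per level."""
    while root and root.key != key:
        root = root.left if key < root.key else root.right
    return root is not None

def inorder(root, out):
    if root:
        inorder(root.left, out)
        out.append(root.key)
        inorder(root.right, out)

root = None
for k in [50, 30, 70, 20, 40, 60, 80]:   # the example BST
    root = insert(root, k)
out = []
inorder(root, out)
print(out)                        # [20, 30, 40, 50, 60, 70, 80] -- sorted
print(search(root, 40))           # True
print(search(root, 55))           # False
```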
7 Graphs: Introduction, Memory Representation, Graph Traversal(DFS and BFS )
1. Introduction to Graphs: A graph is a non-linear data structure consisting of: - a set of vertices (also called nodes or points); - a set of edges (connections between the vertices).
*Types of Graphs:
Type                       Description
Directed Graph (Digraph)   Edges have a direction (A → B)
Undirected Graph           Edges have no direction (A — B)
Weighted Graph             Edges have weights/costs (e.g., distances)
Unweighted Graph           Edges do not have weights
Cyclic Graph               Contains at least one cycle
Acyclic Graph              No cycles
Connected Graph            All vertices are reachable
Disconnected Graph         Some vertices are not reachable

2. Key Terminology:
Term        Meaning
Vertex      A node in the graph
Edge        Connection between two vertices
Degree      Number of edges connected to a vertex
Path        Sequence of edges from one vertex to another
Cycle       A path that starts and ends at the same vertex
Adjacent    Two vertices connected by an edge
Connected   There is a path between every pair of vertices
3. Graph Memory Representation: graphs can be represented in memory using:
A. Adjacency Matrix: a 2D array of size V x V where matrix[i][j] = 1 if there is an edge from vertex i to j, and 0 if there is no edge. For weighted graphs, store the weight instead of 1.
Example (Undirected): vertices A, B, C; edges A-B, B-C.
    A  B  C
A   0  1  0
B   1  0  1
C   0  1  0
*Space Complexity: O(V²).
B. Adjacency List: each vertex stores a list of its adjacent vertices; memory-efficient for sparse graphs.
Example (same graph):
A → B
B → A, C
C → B
*Space Complexity: O(V + E).
4. Graph Traversal: graph traversal means visiting all the vertices in a systematic order. The two most common types are: - DFS (Depth First Search); - BFS (Breadth First Search).
**A. Depth-First Search (DFS): *Idea: go as deep as possible from the starting node, then backtrack. *Method: use a stack (explicitly or via recursion); mark visited nodes to avoid cycles. *Steps: 1. Start from the source node. 2. Visit it and mark it visited. 3. Recursively go to unvisited neighbors.
*DFS Example: Graph:
A-B-C
|   |
D---E
DFS from A: A → B → C → E → D
*DFS Pseudocode (Recursive):
def dfs(graph, start, visited=None):
    if visited is None:
        visited = set()
    visited.add(start)
    print(start, end=' ')
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)

**B. Breadth-First Search (BFS): *Idea: visit all neighbors first, then move to their neighbors. *Method: use a queue; mark visited nodes. *Steps: 1. Start from the source node. 2. Dequeue, visit, and enqueue unvisited neighbors.
*BFS Example: Graph:
A-B-C
|   |
D---E
BFS from A: A → B → D → C → E
*BFS Pseudocode:
from collections import deque

def bfs(graph, start):
    visited = set()
    queue = deque([start])
    visited.add(start)
    while queue:
        vertex = queue.popleft()
        print(vertex, end=' ')
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
*Time & Space Complexities:
Traversal Time Complexity Space Complexity
DFS O(V + E) O(V)
BFS O(V + E) O(V)
Where V = number of vertices, E = number of edges
Feature DFS BFS
Uses Stack or Recursion Queue
Strategy Go deep, then backtrack Explore level by level
Memory Less memory in sparse graphs Can use more memory
Good for Pathfinding, Topological Sort Shortest path (unweighted graphs)
Time Complexity O(V + E) O(V + E)
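The DFS and BFS routines can be exercised on the example graph; this sketch (illustrative, not from the source) returns the visit order as a list instead of printing, which makes the result easy to verify.

```python
from collections import deque

def dfs(graph, start, visited=None, order=None):
    """Depth-first: recurse into the first unvisited neighbor."""
    if visited is None:
        visited, order = set(), []
    visited.add(start)
    order.append(start)
    for neighbor in graph[start]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited, order)
    return order

def bfs(graph, start):
    """Breadth-first: expand all neighbors before going deeper."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        vertex = queue.popleft()
        order.append(vertex)
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)
    return order

# The example graph: A-B-C across the top, D-E below, with A-D and C-E edges
graph = {'A': ['B', 'D'], 'B': ['A', 'C'], 'C': ['B', 'E'],
         'D': ['A', 'E'], 'E': ['D', 'C']}
print(dfs(graph, 'A'))  # ['A', 'B', 'C', 'E', 'D']
print(bfs(graph, 'A'))  # ['A', 'B', 'D', 'C', 'E']
```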
Applications of Graphs: social networks (friends, followers); maps and GPS (routing); webpage link analysis (Google PageRank); job scheduling (DAGs); AI (state graphs); computer networks (routers, protocols).
8 U-2 Linked List: Definition , type of Linked list: Singly, Doubly, Header, Circular linked List
Linked List: A Linked List is a linear data structure in which elements (called nodes) are stored non-contiguously in memory and are connected using pointers.
*Structure of a Node: each node in a linked list has two parts: 1. Data: the value/information stored in the node. 2. Next (Link): a reference to the next node in the list.
*Basic Node (in C-like syntax):
struct Node {
int data; // data part
struct Node* next; // pointer to next node
};
*Visual Representation (Singly Linked List): Head → [10|*] → [20|*] → [30|NULL]. - Head: points to the first node. - NULL: marks the end of the list.
*Key Features:
Feature Description
Dynamic size Grows or shrinks at runtime using dynamic memory allocation
Efficient insertion/deletion Especially at the beginning or middle
No random access Must traverse sequentially to reach a specific node
Why Use a Linked List Over an Array?
Arrays                         Linked Lists
Fixed size                     Dynamic size
Insertion/deletion is costly   Easier insertion/deletion
Fast access (O(1))             Slow access (O(n))
Contiguous memory required     Non-contiguous memory is fine

Types of Linked Lists: there are four main types: 1. Singly Linked List; 2. Doubly Linked List; 3. Circular Linked List; 4. Header Linked List.
1. Singly Linked List: each node contains data and a pointer to the next node. The last node points to NULL, indicating the end of the list.
*Structure: [Data | Next] → [Data | Next] → [Data | NULL]
**Characteristics: can be traversed in one direction only; simple and memory-efficient; cannot go backward.
**Use Cases: stacks, queues, adjacency lists.
2. Doubly Linked List: each node contains data, a pointer to the next node, and a pointer to the previous node.
*Structure: NULL ← [Prev | Data | Next] ⇄ [Prev | Data | Next] ⇄ [Prev | Data | Next] → NULL
**Characteristics: can be traversed in both directions; more flexible than a singly linked list; requires more memory (stores two pointers).
**Use Cases: navigation (back/forward), undo-redo systems, deques.
3. Circular Linked List: the last node points back to the first node, forming a circle. It can be: - singly circular: only a next pointer, circular; - doubly circular: both next and prev pointers, circular.
*Structure (Singly): [Data | Next] → [Data | Next] → [Data | Head]
**Characteristics: no NULL in any node; can be traversed indefinitely; useful for round-robin scheduling, playlists, etc.
4. Header Linked List: a header node is a special starting node that does not hold actual data, just metadata or a starting reference.
*Structure: [Header | Next] → [Data | Next] → [Data | Next] → NULL
**Characteristics: simplifies insertion/deletion (especially in empty lists); often used in academic or complex implementations.
*Summary Table:
Type Direction Last Node Points To Memory Use Traversal
Singly Linked List One-way NULL Low Forward only
Doubly Linked List Two-way NULL Moderate Forward and Backward
Circular Linked List One or Two First Node Depends Continuous Loop
Header Linked List Depends Depends Slightly More Simplified ops
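A minimal Python sketch of a singly linked list (class and method names are illustrative, not from the source) shows the node-and-pointer structure described above:

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None          # None plays the role of NULL

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def push_front(self, data):
        """O(1) insertion at the head: new node points to the old head."""
        node = Node(data)
        node.next = self.head
        self.head = node

    def to_list(self):
        """Forward traversal from head until None."""
        out, cur = [], self.head
        while cur:
            out.append(cur.data)
            cur = cur.next
        return out

lst = SinglyLinkedList()
for x in [30, 20, 10]:        # each push lands in front of the previous
    lst.push_front(x)
print(lst.to_list())          # [10, 20, 30]
```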
9 Operations - traversing, searching, inserting, deleting, operations on singly linked list and doubly linked list
1. Traversing a Linked List:
*Singly Linked List: start from the head; visit each node until NULL.
void traverse(struct Node* head) {
    struct Node* temp = head;
    while (temp != NULL) {
        printf("%d -> ", temp->data);
        temp = temp->next;
    }
}
*Doubly Linked List: same as singly, but can also traverse backward using the prev pointer.
void reverseTraverse(struct DNode* tail) {
    struct DNode* temp = tail;
    while (temp != NULL) {
        printf("%d <- ", temp->data);
        temp = temp->prev;
    }
}
2. Searching in a Linked List: start from the head and compare each node's data with the key.
int search(struct Node* head, int key) {
    struct Node* temp = head;
    while (temp != NULL) {
        if (temp->data == key)
            return 1;   // found
        temp = temp->next;
    }
    return 0;           // not found
}
The same logic applies to a doubly linked list.
3. Insertion in a Linked List:
*Singly Linked List:
a) Insert at Beginning:
void insertAtHead(struct Node** head, int data) {
    struct Node* newNode = malloc(sizeof(struct Node));
    newNode->data = data;
    newNode->next = *head;
    *head = newNode;
}
b) Insert at End:
void insertAtEnd(struct Node** head, int data) {
    struct Node* newNode = malloc(sizeof(struct Node));
    newNode->data = data;
    newNode->next = NULL;
    if (*head == NULL) {
        *head = newNode;
        return;
    }
    struct Node* temp = *head;
    while (temp->next != NULL)
        temp = temp->next;
    temp->next = newNode;
}
c) Insert After a Node:
void insertAfter(struct Node* prevNode, int data) {
    if (prevNode == NULL) return;
    struct Node* newNode = malloc(sizeof(struct Node));
    newNode->data = data;
    newNode->next = prevNode->next;
    prevNode->next = newNode;
}
*Doubly Linked List:
a) Insert at Beginning:
void insertAtHead(struct DNode** head, int data) {
    struct DNode* newNode = malloc(sizeof(struct DNode));
    newNode->data = data;
    newNode->prev = NULL;
    newNode->next = *head;
    if (*head != NULL)
        (*head)->prev = newNode;
    *head = newNode;
}
b) Insert at End:
void insertAtEnd(struct DNode** head, int data) {
    struct DNode* newNode = malloc(sizeof(struct DNode));
    newNode->data = data;
    newNode->next = NULL;
    if (*head == NULL) {
        newNode->prev = NULL;
        *head = newNode;
        return;
    }
    struct DNode* temp = *head;
    while (temp->next != NULL)
        temp = temp->next;
    temp->next = newNode;
    newNode->prev = temp;
}
4. Deletion in a Linked List:
*Singly Linked List:
a) Delete at Beginning:
void deleteAtHead(struct Node** head) {
    if (*head == NULL) return;
    struct Node* temp = *head;
    *head = (*head)->next;
    free(temp);
}
b) Delete by Value:
void deleteByValue(struct Node** head, int key) {
    struct Node* temp = *head, *prev = NULL;
    if (temp != NULL && temp->data == key) {
        *head = temp->next;
        free(temp);
        return;
    }
    while (temp != NULL && temp->data != key) {
        prev = temp;
        temp = temp->next;
    }
    if (temp == NULL) return;
    prev->next = temp->next;
    free(temp);
}
*Doubly Linked List:
a) Delete a Node by Value:
void deleteByValue(struct DNode** head, int key) {
    struct DNode* temp = *head;
    while (temp != NULL && temp->data != key)
        temp = temp->next;
    if (temp == NULL) return;
    if (temp->prev != NULL)
        temp->prev->next = temp->next;
    else
        *head = temp->next;
    if (temp->next != NULL)
        temp->next->prev = temp->prev;
    free(temp);
}
*Summary Table:
Operation         Singly Linked List    Doubly Linked List
Traverse          One direction only    Both forward and backward
Search            Linear (O(n))         Linear (O(n))
Insert at Head    O(1)                  O(1)
Insert at End     O(n)                  O(n)
Delete at Head    O(1)                  O(1)
Delete by Value   O(n)                  O(n), but easier to manage
Extra Pointer     No (1 pointer/node)   Yes (2 pointers/node)
Memory Use        Less                  More
10 Memory representation, applications of linked list.
Memory Representation of Linked List: Unlike arrays (which use contiguous memory), a linked list uses non-contiguous
memory allocation. Each node is stored anywhere in memory, and the nodes are linked using pointers.*Structure of a Node:
struct Node {
int data; // actual data
struct Node* next; // pointer to the next node
}; Each node contains: Data (value); Pointer to the next node’s address.*Visual Representation in Memory: Let’s say we
have 3 nodes with data: 10, 20, and 30. *Memory addresses: Node1: 1000 ;Node2: 3000 ;Node3: 7000.*Linked like this:
1000:[10 | 3000]→3000:[20|7000]→7000:[30|NULL]. - The first node contains 10 and a pointer to 3000 (next node);- The second
node contains 20 and a pointer to 7000;- The last node points to NULL, indicating the end.*Key Points:
Feature Description
Dynamic allocation Nodes are allocated in heap memory at runtime
No contiguous memory Nodes can be scattered in memory
Pointer linkage Each node stores the address of the next (or prev)
**Applications of Linked Lists: linked lists are widely used in real-life software and system implementations due to their dynamic and flexible nature.
1. Dynamic Memory Management: linked lists allow flexible use of memory without requiring a predefined size; nodes are allocated at runtime using malloc() or calloc().
2. Implementation of Data Structures: - Stacks and Queues can be efficiently implemented using singly or doubly linked lists; - Hash Tables: chaining (to handle collisions) uses linked lists; - Graphs: adjacency lists are often implemented using linked lists; - Trees: nodes in trees (like binary trees) are often implemented as linked structures.
3. Efficient Insertion and Deletion: insertion/deletion at the beginning is O(1), and elsewhere it avoids the element-shifting that arrays require. *Useful in applications where frequent inserts/deletes are needed: text editors; undo/redo functionality; browser history.
4. Memory Efficiency for Sparse Matrices: sparse matrices contain mostly zero elements; linked lists are used to store only the non-zero elements.
5. Round-Robin Scheduling: circular linked lists are used in: operating systems (round-robin scheduling); multiplayer games (turn-based actions); media playlists (continuous playback).
6. Polynomial and Big-Number Arithmetic: linked lists can represent polynomials and large integers digit by digit or term by term.
7. Real-Time Applications: music playlists, image viewers, and tab switchers can use doubly linked lists for navigation (forward/backward).
11 Polynomial manipulation in Linked List.
What is Polynomial Manipulation? Polynomial manipulation includes operations such as: creation of polynomial expressions; display of polynomials; addition of two polynomials; multiplication of two polynomials.
Why Use a Linked List for Polynomials? A polynomial is a set of terms, each with a coefficient and an exponent. Linked lists allow: dynamic memory allocation; efficient insertion of terms; easy merging of terms with the same exponent.
*Structure of a Node:
struct Term {
    int coeff;           // Coefficient
    int exp;             // Exponent
    struct Term* next;   // Pointer to next term
};
Each node represents one term of a polynomial.
*Example Polynomial: 5x^3 + 4x^2 + 2x + 7; linked list representation: [5,3] → [4,2] → [2,1] → [7,0] → NULL.
*Polynomial Operations Using a Linked List:
1. Creating a Polynomial: terms are inserted in descending order of exponent.
struct Term* createTerm(int coeff, int exp) {
    struct Term* newTerm = (struct Term*)malloc(sizeof(struct Term));
    newTerm->coeff = coeff;
    newTerm->exp = exp;
    newTerm->next = NULL;
    return newTerm;
}

void insertTerm(struct Term** poly, int coeff, int exp) {
    struct Term* newNode = createTerm(coeff, exp);
    if (*poly == NULL || exp > (*poly)->exp) {
        newNode->next = *poly;
        *poly = newNode;
    } else if (exp == (*poly)->exp) {   // merge with the head term
        (*poly)->coeff += coeff;
        free(newNode);
    } else {
        struct Term* temp = *poly;
        while (temp->next != NULL && temp->next->exp > exp)
            temp = temp->next;
        if (temp->next != NULL && temp->next->exp == exp) {
            temp->next->coeff += coeff;   // merge terms with the same exponent
        } else {
            newNode->next = temp->next;
            temp->next = newNode;
        }
    }
}
2. Displaying a Polynomial:
void display(struct Term* poly) {
    while (poly != NULL) {
        printf("%dx^%d", poly->coeff, poly->exp);
        if (poly->next != NULL)
            printf(" + ");
        poly = poly->next;
    }
    printf("\n");
}
3. Adding Two Polynomials:
struct Term* addPoly(struct Term* p1, struct Term* p2) {
    struct Term* result = NULL;
    while (p1 != NULL && p2 != NULL) {
        if (p1->exp > p2->exp) {
            insertTerm(&result, p1->coeff, p1->exp);
            p1 = p1->next;
        } else if (p1->exp < p2->exp) {
            insertTerm(&result, p2->coeff, p2->exp);
            p2 = p2->next;
        } else {
            insertTerm(&result, p1->coeff + p2->coeff, p1->exp);
            p1 = p1->next;
            p2 = p2->next;
        }
    }
    while (p1 != NULL) {
        insertTerm(&result, p1->coeff, p1->exp);
        p1 = p1->next;
    }
    while (p2 != NULL) {
        insertTerm(&result, p2->coeff, p2->exp);
        p2 = p2->next;
    }
    return result;
}
4. Multiplying Two Polynomials:
struct Term* multiplyPoly(struct Term* p1, struct Term* p2) {
    struct Term* result = NULL;
    for (struct Term* temp1 = p1; temp1 != NULL; temp1 = temp1->next) {
        for (struct Term* temp2 = p2; temp2 != NULL; temp2 = temp2->next) {
            int coeff = temp1->coeff * temp2->coeff;
            int exp = temp1->exp + temp2->exp;
            insertTerm(&result, coeff, exp);   // duplicate exponents are merged
        }
    }
    return result;
}
*Full Example: Add and Multiply Polynomials. Input: P1 = 5x^2 + 4x + 2, P2 = 3x^3 + x^2 + 6. Output: Addition: 3x^3 + 6x^2 + 4x + 8. Multiplication: 15x^5 + 17x^4 + 10x^3 + 32x^2 + 24x + 12.
*Applications of Polynomial Manipulation:
Application Area Use Case Example
Computer Algebra Systems Like MATLAB, Mathematica
Symbolic Computation Integration, Differentiation, Simplification
Graphics & Animation Curve modeling using polynomials
Signal Processing Polynomial filters
Scientific Computing Polynomial approximations in simulations
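As a cross-check of the worked example, polynomials can also be modeled as exponent-to-coefficient dictionaries in Python (a simplification of the linked-list scheme; the function names are illustrative):

```python
def add_poly(p1, p2):
    """Polynomials as {exponent: coefficient} dicts; merge like exponents."""
    result = dict(p1)
    for exp, coeff in p2.items():
        result[exp] = result.get(exp, 0) + coeff
    return {e: c for e, c in result.items() if c != 0}

def multiply_poly(p1, p2):
    """Multiply every term of p1 by every term of p2, summing like exponents."""
    result = {}
    for e1, c1 in p1.items():
        for e2, c2 in p2.items():
            result[e1 + e2] = result.get(e1 + e2, 0) + c1 * c2
    return result

P1 = {2: 5, 1: 4, 0: 2}     # 5x^2 + 4x + 2
P2 = {3: 3, 2: 1, 0: 6}     # 3x^3 + x^2 + 6
print(add_poly(P1, P2))     # coefficients of 3x^3 + 6x^2 + 4x + 8
print(multiply_poly(P1, P2))
# coefficients of 15x^5 + 17x^4 + 10x^3 + 32x^2 + 24x + 12
```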
12 Queue: Introduction, types.
Queue: A Queue is a linear data structure that follows the FIFO (First In, First Out) principle.
*The element inserted first is removed first, just like a real-world queue (e.g., waiting in line).
Basic Operations of a Queue:
Operation   Description
enqueue     Insert an element at the rear of the queue
dequeue     Remove an element from the front
peek        View the front element without removing it
isEmpty     Check if the queue is empty
isFull      Check if the queue is full (for fixed-size queues)

*Queue Visualization: Front → [10] [20] [30] [40] ← Rear. Enqueue 50: added at the rear; Dequeue: removes from the front.
Types of Queues: queues are classified into several types based on how they handle insertion and deletion:
1. Linear Queue: the most basic type of queue; works on the FIFO (First In, First Out) principle. *Structure: Front → [10] [20] [30] ← Rear. Enqueue at the rear; dequeue from the front. - Limitation: wastes memory as the front moves forward, and freed space can't be reused. - Use Cases: print queues; simple buffering.
2. Circular Queue: connects the rear back to the front to form a circle, making better use of memory. *Structure: [30] [40] [50] [--] [10] (front = 1, rear = 0 after wrapping around). *Features: the rear wraps around when reaching the end; solves the memory limitation of the linear queue. *Formulae: rear = (rear + 1) % size; front = (front + 1) % size. - Use Cases: CPU scheduling (Round Robin); real-time systems (ring buffers).
3. Deque (Double-Ended Queue): elements can be inserted or removed from both ends. *Types of Deque: 1. Input-restricted deque: deletion from both ends, insertion at one end only. 2. Output-restricted deque: insertion from both ends, deletion at one end only. *Structure: Insert ← [30] [40] [50] → Remove. - Use Cases: browser history (forward and backward navigation); palindrome checking; sliding-window problems.
4. Priority Queue: a special type where each element has a priority, and elements are served based on their priority, not just insertion order. Rules: higher-priority elements are dequeued first; if two elements have the same priority, FIFO order is followed. Example: enqueue(30, priority 2), enqueue(20, priority 3), enqueue(10, priority 1) → dequeue() removes 20 (highest priority). - Use Cases: task scheduling (OS); Dijkstra's algorithm (shortest path); event-driven simulations.
*Summary Table:
Queue Type       Insertion             Deletion            Special Feature
Linear Queue     Rear                  Front               Basic FIFO, simple but limited
Circular Queue   Rear (wraps around)   Front               Reuses space, efficient
Deque            Both ends             Both ends           Flexible insert/delete
Priority Queue   Based on priority     Based on priority   Elements ordered by priority

*Visualization Quick Recap: - Linear Queue: [10][20][30]; - Circular Queue: [30] [40] [50] [10] (rear wraps); - Deque: ⇄ [20][30][40] ⇄ (both ends usable); - Priority Queue: [Highest] → [Medium] → [Lowest].
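The queue variants above map directly onto Python's standard library; as a sketch, collections.deque covers both the FIFO queue and the deque, while heapq can act as a priority queue (it is a min-heap, so priorities are negated here to serve the highest first):

```python
from collections import deque
import heapq

# FIFO queue: enqueue at the rear, dequeue from the front
q = deque()
for x in [10, 20, 30]:
    q.append(x)              # enqueue
print(q.popleft())           # 10 -- first in, first out

# Deque: both ends usable
d = deque([30, 40])
d.appendleft(20)             # insert at the front
d.append(50)                 # insert at the rear
print(list(d))               # [20, 30, 40, 50]

# Priority queue: (negated priority, value) tuples in a min-heap
pq = []
heapq.heappush(pq, (-2, 30))     # priority 2
heapq.heappush(pq, (-3, 20))     # priority 3
heapq.heappush(pq, (-1, 10))     # priority 1
print(heapq.heappop(pq)[1])      # 20 -- highest priority (3) served first
```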
13 Memory representation and applications of Queue.
Memory Representation of a Queue: a queue can be implemented in two main ways:
1. Using Arrays (Static Implementation): fixed size; uses two pointers: - front: points to the element to be removed; - rear: points to the last inserted element.
Visual Example: suppose we have:
int queue[5];
int front = 0;
int rear = -1;
*Enqueue (insert 10): rear++ → rear = 0; queue[0] = 10.
*Dequeue (remove): front++ → front = 1.
*Problem: even if we remove elements, the space isn't reused (unless a circular queue is used).
*Using Circular Arrays: the rear wraps around to the beginning when it reaches the end; more efficient use of space. Formula for moving rear: rear = (rear + 1) % size. Visual (queue of size 5): front = 0, rear = 4 → the next insert goes at position 0; queue: [50, _, _, _, 40] (after a wrap-around insert).
2. Using a Linked List (Dynamic Implementation): no fixed size; each node contains data and a next pointer. Structure:
struct Node {
    int data;
    struct Node* next;
};
- front: points to the first node; - rear: points to the last node. *Enqueue: add a node at the rear. *Dequeue: remove a node from the front.
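A circular queue with the wrap-around formula rear = (rear + 1) % size can be sketched in Python (the class name is illustrative, not from the source):

```python
class CircularQueue:
    """Fixed-size circular queue; front and rear indices wrap with modulo."""
    def __init__(self, size):
        self.data = [None] * size
        self.size = size
        self.front = 0
        self.count = 0

    def enqueue(self, value):
        if self.count == self.size:
            raise OverflowError("queue full")
        rear = (self.front + self.count) % self.size  # rear wraps around
        self.data[rear] = value
        self.count += 1

    def dequeue(self):
        if self.count == 0:
            raise IndexError("queue empty")
        value = self.data[self.front]
        self.front = (self.front + 1) % self.size     # front wraps too
        self.count -= 1
        return value

q = CircularQueue(3)
for x in [10, 20, 30]:
    q.enqueue(x)
q.dequeue()        # removes 10, freeing a slot
q.enqueue(40)      # rear wraps to index 0 -- freed space is reused
print(q.dequeue(), q.dequeue(), q.dequeue())  # 20 30 40
```

This reuse of the freed slot is exactly what the plain linear-array queue cannot do.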
**Comparison:
Feature         Array Implementation     Linked List Implementation
Memory size     Fixed (static)           Dynamic (grows as needed)
Insert/Delete   Fast (O(1))              Fast (O(1))
Space reuse     Poor (unless circular)   Excellent
Overflow        Possible                 Not possible (unless memory is full)

*Applications of Queues: queues are used widely in both software and hardware systems:
1. Operating Systems: process scheduling (Round Robin); task queues in multi-threaded environments; buffer management in I/O systems.
2. Hardware Buffers: keyboard buffer; printer spooling; disk scheduling.
3. Network Systems: packet switching in routers; message queues in communication systems.
4. Simulation Systems: supermarket checkout systems; traffic simulation; call centers.
5. Tree and Graph Traversal: Breadth-First Search (BFS) uses a queue.
6. Job Scheduling: print jobs; batch processing; CPU scheduling.
7. Other Real-World Applications: customer service lines; ticket booking systems; order processing queues.