Algorithm Complexity
The average-case time complexity represents the expected time an algorithm takes on a random input; it is calculated by averaging the running time over all possible inputs.
Average Case (Space Complexity) is the expected space usage of the algorithm over all possible inputs. It is an averaged measure that weighs the different input scenarios by their respective space requirements.
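To make this averaging concrete, here is a minimal Python sketch (the function names and parameters are illustrative, not taken from the text) that estimates the average number of comparisons linear search performs over many random inputs; the empirical average approaches the theoretical (n + 1) / 2.

```python
import random

def linear_search_comparisons(arr, target):
    """Return the number of comparisons linear search makes."""
    for i, value in enumerate(arr):
        if value == target:
            return i + 1
    return len(arr)

# Estimate the average case by sampling many random inputs.
# (n and trials are illustrative choices, not from the text.)
n, trials = 1000, 10_000
total = 0
for _ in range(trials):
    arr = list(range(n))
    random.shuffle(arr)
    target = random.randrange(n)  # target is always present
    total += linear_search_comparisons(arr, target)

print(f"empirical average comparisons: {total / trials:.1f}")
print(f"theoretical (n + 1) / 2:       {(n + 1) / 2}")
```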
Worst Case (Space Complexity) represents the maximum amount of memory the algorithm would require for any input. It's a crucial metric, especially in memory-constrained scenarios where running out of memory is unacceptable.
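As a rough illustration (a sketch, not from the text), the following Python snippet uses the standard tracemalloc module to compare the peak auxiliary memory of a hypothetical unique_values function on a best-case input (all elements equal) and a worst-case input (all elements distinct).

```python
import tracemalloc

def unique_values(arr):
    """Auxiliary space grows with the number of distinct elements:
    the worst case (all distinct) uses O(n) extra memory."""
    seen = set()
    for x in arr:
        seen.add(x)
    return seen

# Peak memory is lowest when all elements are equal (best case)
# and highest when every element is distinct (worst case).
for data in ([0] * 100_000, list(range(100_000))):
    tracemalloc.start()
    unique_values(data)
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    print(f"peak auxiliary memory: {peak:,} bytes")
```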
Efficient algorithms matter for several reasons:
Efficiency: Faster algorithms that use less memory can handle larger datasets and more complex problems within practical time frames.
Resource Utilisation: Efficient use of time and space resources is particularly
important in systems with limited capacity, like embedded systems or mobile
devices.
Cost-Effectiveness: In many computing environments, particularly in cloud
computing and data centres, resource utilisation directly translates to
operational costs.
User Experience: For applications that interact with users, faster response times
lead to better user satisfaction.
Scalability: Efficient algorithms are more scalable, meaning they can handle
increases in workload (such as more data or users) more gracefully.
In short, the goal is to achieve as much as possible in as little time and space as possible.
Big-O notation indicates an upper bound on the time required by an algorithm for large input values; it is therefore used to describe the worst case of an algorithm's time complexity.
Big-O Notation can be defined as follows...
Consider function f(n) as the time complexity of an algorithm and g(n) as the most significant term.
If f(n) <= C g(n) for all n >= n0, for some C > 0 and n0 >= 1, then
f(n) = O(g(n))
Consider a graph of f(n) and C g(n), with the input value n on the X-axis and the time required on the Y-axis. In this graph, after a particular input value n0, C g(n) is always greater than f(n), which indicates the algorithm's upper bound.
Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as O(g(n)), then it must satisfy f(n) <= C g(n) for some constants C > 0 and n0 >= 1, for all n >= n0:
f(n) <= C g(n)
⇒ 3n + 2 <= C n
Choosing C = 4 and n0 = 2, we get 3n + 2 <= 4n for all n >= 2, so f(n) = O(n).
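As a quick sanity check (a sketch using the constants chosen above, which are not the only valid witnesses), the inequality can be verified numerically over a finite range:

```python
def f(n):
    return 3 * n + 2

def g(n):
    return n

# Verify f(n) <= C * g(n) for the witnesses C = 4, n0 = 2.
C, n0 = 4, 2
assert all(f(n) <= C * g(n) for n in range(n0, 10_000))
print("f(n) = O(g(n)) holds for C = 4, n0 = 2 on the tested range")
```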
Big-Omega notation indicates a lower bound on the time required by an algorithm for large input values; it is therefore used to describe the best case of an algorithm's time complexity.
Big-Omega Notation can be defined as follows...
Consider function f(n) as the time complexity of an algorithm and g(n) as the most significant term.
If f(n) >= C g(n) for all n >= n0, for some C > 0 and n0 >= 1, then
f(n) = Ω(g(n))
Consider a graph of f(n) and C g(n), with the input value n on the X-axis and the time required on the Y-axis. In this graph, after a particular input value n0, C g(n) is always less than f(n), which indicates the algorithm's lower bound.
Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Ω(g(n)), then it must satisfy f(n) >= C g(n) for some constants C > 0 and n0 >= 1, for all n >= n0:
f(n) >= C g(n)
⇒ 3n + 2 >= C n
Choosing C = 1 and n0 = 1, we get 3n + 2 >= n for all n >= 1, so f(n) = Ω(n).
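The lower bound can be checked numerically in the same way (again a sketch; the constants are one valid choice among many):

```python
def f(n):
    return 3 * n + 2

def g(n):
    return n

# Verify f(n) >= C * g(n) for the witnesses C = 1, n0 = 1.
C, n0 = 1, 1
assert all(f(n) >= C * g(n) for n in range(n0, 10_000))
print("f(n) = Ω(g(n)) holds for C = 1, n0 = 1 on the tested range")
```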
Big-Theta notation indicates that the time required by an algorithm is bounded both above and below by g(n) for large input values; it describes a tight bound on an algorithm's time complexity (sometimes loosely described as the average case).
Big-Theta Notation can be defined as follows...
Consider function f(n) as the time complexity of an algorithm and g(n) as the most significant term.
If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, for some C1 > 0, C2 > 0 and n0 >= 1, then
f(n) = Θ(g(n))
Consider a graph of f(n), C1 g(n) and C2 g(n), with the input value n on the X-axis and the time required on the Y-axis. In this graph, after a particular input value n0, C1 g(n) is always less than f(n) and C2 g(n) is always greater than f(n), which indicates the algorithm's tight bound.
Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n
If we want to represent f(n) as Θ(g(n)), then it must satisfy C1 g(n) <= f(n) <= C2 g(n) for some constants C1 > 0, C2 > 0 and n0 >= 1, for all n >= n0:
C1 g(n) <= f(n) <= C2 g(n)
Choosing C1 = 3, C2 = 4 and n0 = 2, we get 3n <= 3n + 2 <= 4n for all n >= 2, so f(n) = Θ(n).
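Both bounds can be verified together (a sketch over a finite range, using the constants chosen above):

```python
def f(n):
    return 3 * n + 2

def g(n):
    return n

# Verify C1 * g(n) <= f(n) <= C2 * g(n) for C1 = 3, C2 = 4, n0 = 2.
C1, C2, n0 = 3, 4, 2
assert all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, 10_000))
print("f(n) = Θ(g(n)) holds for C1 = 3, C2 = 4, n0 = 2 on the tested range")
```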