Algorithm Complexity


Algorithm complexity refers to the efficiency of an algorithm in terms of the
resources it consumes, such as time and space. In the context of algorithms,
complexity analysis studies how the time or space requirements of an algorithm
grow in relation to the size of the input. This growth is usually expressed
using asymptotic notation.

Asymptotic notation of an algorithm is a mathematical representation of its complexity.

There are two main types of algorithm complexity:


Time complexity measures the amount of time an algorithm takes to complete based
on the input size.
Space complexity measures the amount of memory (space) an algorithm uses based
on the input size.
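As an illustration, consider summing the first n integers — a minimal sketch in Python (the function names here are my own, for illustration only). A loop takes time proportional to n, while the closed-form formula takes constant time; both use constant extra space.

```python
def sum_loop(n):
    # Time complexity O(n): the loop body executes n times.
    # Space complexity O(1): only two variables, regardless of n.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # Time and space complexity O(1): a fixed number of operations.
    return n * (n + 1) // 2

print(sum_loop(1000))     # 500500
print(sum_formula(1000))  # 500500
```

Both functions produce the same answer, but their running times grow very differently as n increases.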

Best, Average and Worst Case


The best-case time complexity represents the minimum amount of time an algorithm
would take to complete its task under optimal conditions or with a favourable input.

The average-case time complexity represents the expected time an algorithm would
take for a random input. It involves calculating the average over all possible inputs.

The worst-case time complexity represents the maximum amount of time an
algorithm would take to complete its task for any input.
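Linear search is a standard example of these cases — a minimal sketch in Python (the function name is illustrative). The best case finds the target at the first position; the worst case scans the whole list.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i   # best case: target at index 0 -> 1 comparison, O(1)
    return -1          # worst case: target absent -> n comparisons, O(n)

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))  # 0  (best case: found immediately)
print(linear_search(data, 8))  # -1 (worst case: every element checked)
```

The average case, assuming the target is equally likely to be at any position, requires about n/2 comparisons, which is still O(n).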

Best Case (Space Complexity) represents the minimum amount of memory an
algorithm needs to complete its task under the most favourable conditions. It's the
least amount of space required by the algorithm for any input.

Average Case (Space Complexity) is the expected space usage of the algorithm over
all possible inputs. It's an average measure, considering different input scenarios and
their respective space requirements.

Worst Case (Space Complexity) represents the maximum amount of memory the
algorithm would require for any input. It's a crucial metric, especially in scenarios
where memory resources are limited, as it provides an upper bound on the memory
requirements.

NEHAL MEHDI, MCA, GOLD MEDALIST

The goal when optimising algorithms is to achieve the maximum outcome
(effectiveness, accuracy, functionality) while minimising the resources used,
specifically time and space. This optimisation is crucial in computer science and
software engineering for several reasons:

Efficiency: Faster algorithms that use less memory can handle larger datasets and
more complex problems within practical time frames.
Resource Utilisation: Efficient use of time and space resources is particularly
important in systems with limited capacity, like embedded systems or mobile
devices.
Cost-Effectiveness: In many computing environments, particularly in cloud
computing and data centres, resource utilisation directly translates to
operational costs.
User Experience: For applications that interact with users, faster response times
lead to better user satisfaction.
Scalability: Efficient algorithms are more scalable, meaning they can handle
increases in workload (such as more data or users) more gracefully.

In short, the aim is to achieve as much as possible in as little time and space as possible.

There are THREE types of Asymptotic Notations, as follows...

1. Big - Oh (O): upper bound, associated with the Worst Case
2. Big - Omega (Ω): lower bound, associated with the Best Case
3. Big - Theta (Θ): tight bound, commonly associated with the Average Case

Big - Oh Notation (O)


Big - Oh notation is used to define the upper bound of an algorithm in terms of Time
Complexity.

That means Big - Oh notation indicates the maximum time required by an
algorithm for all input values; in other words, it describes the worst case
of an algorithm's time complexity.

Big - Oh Notation can be defined as follows…


Consider function f(n) as the time complexity of an algorithm and g(n) as its most
significant term.
If there exist constants C > 0 and n0 >= 1 such that f(n) <= C g(n) for all n >= n0,
then we can represent f(n) as O(g(n)).

f(n) = O(g(n))
Consider a graph of f(n) and C g(n), with the input value n on the X-axis and the
time required on the Y-axis. Beyond a particular input value n0, C g(n) is always
greater than f(n), which illustrates the algorithm's upper bound.

Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n

If we want to represent f(n) as O(g(n)), then we must find constants C > 0 and
n0 >= 1 satisfying
f(n) <= C g(n)

⇒ 3n + 2 <= C n

This condition holds for C = 4 and all n >= 2, since 3n + 2 <= 4n whenever n >= 2.

Using Big - Oh notation, we can therefore represent the time complexity as follows...
3n + 2 = O(n)
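The claim 3n + 2 = O(n), with C = 4 and n0 = 2, can be checked numerically over a range of inputs — a small sketch in Python:

```python
def f(n):
    return 3 * n + 2   # the algorithm's cost function

def g(n):
    return n           # the most significant term

C, n0 = 4, 2
# Verify f(n) <= C * g(n) for every n from n0 up to 10000.
holds = all(f(n) <= C * g(n) for n in range(n0, 10_001))
print(holds)  # True
```

A finite check like this is not a proof, but it is a quick sanity test of the chosen constants.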

Big - Omega Notation (Ω)


Big - Omega notation is used to define the lower bound of an algorithm in terms of
Time Complexity.

That means Big - Omega notation indicates the minimum time required by an
algorithm for all input values; in other words, it describes the best case of
an algorithm's time complexity.
Big - Omega Notation can be defined as follows...

Consider function f(n) as the time complexity of an algorithm and g(n) as its most
significant term.

If there exist constants C > 0 and n0 >= 1 such that f(n) >= C g(n) for all n >= n0,

then we can represent f(n) as Ω(g(n)).

f(n) = Ω(g(n))
Consider a graph of f(n) and C g(n), with the input value n on the X-axis and the
time required on the Y-axis. Beyond a particular input value n0, C g(n) is always
less than f(n), which illustrates the algorithm's lower bound.
Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n

If we want to represent f(n) as Ω(g(n)), then we must find constants C > 0 and
n0 >= 1 satisfying
f(n) >= C g(n)

⇒ 3n + 2 >= C n

This condition holds for C = 1 and all n >= 1, since 3n + 2 >= n for every n >= 1.

Using Big - Omega notation, we can therefore represent the time complexity as follows...
3n + 2 = Ω(n)
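As with Big - Oh, the claim 3n + 2 = Ω(n), with C = 1 and n0 = 1, can be sanity-checked numerically — a small Python sketch:

```python
def f(n):
    return 3 * n + 2   # the algorithm's cost function

def g(n):
    return n           # the most significant term

C, n0 = 1, 1
# Verify f(n) >= C * g(n) for every n from n0 up to 10000.
holds = all(f(n) >= C * g(n) for n in range(n0, 10_001))
print(holds)  # True
```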

Big - Theta Notation (Θ)


Big - Theta notation is used to define a tight bound of an algorithm in terms of
Time Complexity.

That means Big - Theta notation bounds the time required by an algorithm from
both above and below for all sufficiently large inputs. It is commonly associated
with the average case of an algorithm's time complexity.

Big - Theta Notation can be defined as follows...


Consider function f(n) as the time complexity of an algorithm and g(n) as its most
significant term.
If there exist constants C1 > 0, C2 > 0 and n0 >= 1 such that
C1 g(n) <= f(n) <= C2 g(n) for all n >= n0,
then we can represent f(n) as Θ(g(n)).

f(n) = Θ(g(n))
Consider a graph of f(n), C1 g(n) and C2 g(n), with the input value n on the X-axis
and the time required on the Y-axis. Beyond a particular input value n0, C1 g(n) is
always less than f(n) and C2 g(n) is always greater than f(n), which illustrates
the algorithm's tight bound.
Example
Consider the following f(n) and g(n)...
f(n) = 3n + 2
g(n) = n

If we want to represent f(n) as Θ(g(n)), then we must find constants C1 > 0,
C2 > 0 and n0 >= 1 satisfying
C1 g(n) <= f(n) <= C2 g(n)

⇒ C1 n <= 3n + 2 <= C2 n

This condition holds for C1 = 1, C2 = 4 and all n >= 2.

Using Big - Theta notation, we can therefore represent the time complexity as follows...
3n + 2 = Θ(n)
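The two-sided claim 3n + 2 = Θ(n), with C1 = 1, C2 = 4 and n0 = 2, combines the Big - Oh and Big - Omega checks — a small Python sketch:

```python
def f(n):
    return 3 * n + 2   # the algorithm's cost function

def g(n):
    return n           # the most significant term

C1, C2, n0 = 1, 4, 2
# Verify C1 * g(n) <= f(n) <= C2 * g(n) for every n from n0 up to 10000.
holds = all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, 10_001))
print(holds)  # True
```

Because both inequalities hold simultaneously from n0 onward, f(n) is sandwiched between two multiples of g(n), which is exactly what Θ expresses.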
