
Algorithms Complexity and Data Structures Efficiency

The document discusses different data structures and their time complexities. It compares common data structures like arrays, lists, trees and hash tables. It then describes various tree data structures like binary search trees, AVL trees, B-trees and B+ trees. For each, it provides the time complexity for operations like search, insert and delete. Hash tables provide highly efficient O(1) search times on average using hash functions to map keys to array positions. Choosing the appropriate data structure impacts an algorithm's efficiency.

Uploaded by

Sri Hari
Copyright
© Attribution Non-Commercial (BY-NC)

Algorithms Complexity and Data Structures Efficiency

1. Fundamental data structures comparison: arrays vs. lists vs. trees vs. hash tables.
2. Choosing the proper data structure for a program increases the program's efficiency.
3. Data structures and algorithms are the foundation of computer programming.

Algorithms Complexity
Algorithmic thinking, problem solving and data structures are vital for software engineers.

Two measures of complexity: time complexity and space complexity.

Cases of analysis: best, average and worst case.

Worst-case: an upper bound on the running time for any input of a given size.
Average-case: assumes all inputs of a given size are equally likely.
Best-case: a lower bound on the running time.
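The three cases can be made concrete with a small sketch (illustrative Python, not part of the original slides) of linear search over n elements:

```python
def linear_search(items, target):
    """Scan left to right; return the index of target, or -1 if absent."""
    for i, value in enumerate(items):
        if value == target:
            return i   # best case: target is first, only 1 comparison
    return -1          # worst case: target absent, all n comparisons made

data = [7, 3, 9, 1, 5]
# Best case: searching for 7 stops after one comparison.
# Worst case: searching for a missing key examines all 5 elements,
# an upper bound for any input of this size.
```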

Why should we analyze algorithms?

To predict the resources that the algorithm requires:
Computational time (CPU consumption)
Memory space (RAM consumption)
Communication bandwidth consumption

The running time of an algorithm is the total number of primitive operations executed (machine-independent steps), also known as the algorithm's complexity.
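Counting primitive operations can be demonstrated directly (an illustrative sketch, not from the slides): instrument a loop with a step counter and observe how the count grows with the input size, independent of any particular machine.

```python
def count_operations(n):
    """Count the primitive operations executed by a simple nested loop."""
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1  # one primitive operation per inner iteration
    return steps

# The count grows as n * n, so this loop's running time is O(n^2)
# regardless of how fast the underlying CPU is.
```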

Big O notation
Time complexity is expressed in so-called "Big O" notation, which approximately describes how the time an algorithm takes grows with the size of the task. Time complexity is usually emphasized over space complexity, mostly because nowadays memory is comparatively cheap. Big O notation gives the time complexity in the worst-case scenario.

Lists or Tables
Structures covered: table, BST, AVL tree, B-tree, B+ tree, hashing.

The time complexity of searching a table is O(n): finding an element in a table may take up to n iterations for n elements. To overcome this problem we use a BST (binary search tree). The time complexity of a BST is O(log2 n).
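A minimal sketch of the contrast (assumed Python, not part of the original slides): a table scan may touch every element, while a BST search discards half of the remaining tree at each comparison.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def bst_insert(root, key):
    """Standard BST insertion: smaller keys go left, larger go right."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def bst_search(root, key):
    """Return (found, comparisons); about log2(n) comparisons in a balanced tree."""
    steps = 0
    while root is not None:
        steps += 1
        if key == root.key:
            return True, steps
        root = root.left if key < root.key else root.right
    return False, steps

# This insertion order yields a balanced 7-node tree of height 2:
root = None
for k in [8, 4, 12, 2, 6, 10, 14]:
    root = bst_insert(root, k)
# Finding 14 takes 3 comparisons; a linear scan of the same
# 7-element table could take up to 7.
```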

There is no difference between a table and a BST if the tree is not balanced: an unbalanced BST degenerates to O(n).

AVL trees
An AVL tree is a binary search tree with the following properties: the sub-trees of every node differ in height by at most one, and every sub-tree is itself an AVL tree. A height-balanced BST is an AVL tree.
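The height-difference property can be checked directly. A sketch (assumed Python; the Node class and tree shapes are illustrative, not from the slides):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height of a subtree; an empty tree has height -1 by convention."""
    return -1 if node is None else 1 + max(height(node.left), height(node.right))

def is_avl(node):
    """True if every node's sub-trees differ in height by at most one."""
    if node is None:
        return True
    balanced = abs(height(node.left) - height(node.right)) <= 1
    return balanced and is_avl(node.left) and is_avl(node.right)

# A height-balanced BST satisfies the AVL property...
balanced_tree = Node(4, Node(2, Node(1), Node(3)), Node(6, Node(5)))
# ...while a degenerate "linked list" BST does not.
degenerate_tree = Node(1, None, Node(2, None, Node(3)))
```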

B-trees
In a BST or AVL tree each node holds only one key, and the time complexity is O(log2 n). If each node can hold m keys, the time complexity becomes O(log_m n), where m is the number of keys per node.
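The effect of the branching factor m can be checked with quick arithmetic (an illustrative sketch, not from the slides):

```python
import math

def levels(n, m):
    """Approximate number of levels searched in an m-way tree with n keys:
    log base m of n."""
    return math.log(n, m)

# For one million keys:
#   m = 2   -> about 20 levels (a binary tree)
#   m = 100 -> about 3 levels (a wide B-tree node)
# Fewer levels means fewer node visits (e.g. disk reads), which is
# one reason wide multi-way trees are used for on-disk indexes.
```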

If a node holds an even number of keys, splitting it gives two cases: left-biased and right-biased. A B+ tree is an advanced version of the B-tree: when a leaf node splits, a copy of its key is kept in the successor (internal) node; when an internal node splits, it behaves the same as in a B-tree.

Hashing
Hash tables support one of the most efficient types of searching: hashing. Fundamentally, a hash table consists of an array in which data is accessed via a special index called a key. The primary idea behind a hash table is to establish a mapping between the set of all possible keys and positions in the array using a hash function. A hash function accepts a key and returns its hash code, or hash value. Keys vary in type, but hash codes are always integers.

With a good hash function the search complexity is O(1) on average; with a bad hash function (many collisions) it degrades to O(n).
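A minimal sketch of the key-to-position mapping (assumed Python; ChainedHashTable is a hypothetical name, not from the slides), resolving collisions by chaining entries in each array slot:

```python
class ChainedHashTable:
    """Minimal hash table with separate chaining, for illustration only."""

    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _index(self, key):
        # The hash function maps a key of any hashable type
        # to an integer position in the array.
        return hash(key) % len(self.buckets)

    def put(self, key, value):
        bucket = self.buckets[self._index(key)]
        for pair in bucket:
            if pair[0] == key:
                pair[1] = value  # overwrite existing key
                return
        bucket.append([key, value])

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)
```

With few collisions each bucket stays short, so put and get touch O(1) entries on average; if every key hashed to the same bucket, both would degrade to O(n).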
