The Term Time Complexity in DSA

Time complexity in DSA is the computational complexity that describes the amount of computer time it takes to run an algorithm. Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm are taken to be related by a constant factor.

Since an algorithm’s running time may vary among different inputs of the same size, one commonly considers the worst-case time complexity: the maximum amount of time required over all inputs of a given size.

The efficiency of an algorithm depends on two parameters:

  • Time Complexity
  • Space Complexity

Time Complexity: It is defined as the number of times a particular instruction set is executed rather than the total time taken, because the total time also depends on external factors such as the compiler used, the processor’s speed, etc.

Space Complexity: It is the total memory space required by the program for its execution.

| Name | Running time (T(n)) | Example algorithms |
| --- | --- | --- |
| constant time | O(1) | Finding the median value in a sorted array of numbers; calculating (−1)^n |
| inverse Ackermann time | O(α(n)) | Amortized time per operation using a disjoint set |
| iterated logarithmic time | O(log* n) | Distributed coloring of cycles |
| log-logarithmic | O(log log n) | Amortized time per operation using a bounded priority queue |
| logarithmic time | O(log n) | Binary search |
| polylogarithmic time | poly(log n) | |
| fractional power | O(n^c) where 0 < c < 1 | Searching in a kd-tree |
| linear time | O(n) | Finding the smallest or largest item in an unsorted array; Kadane’s algorithm; linear search |
| “n log-star n” time | O(n log* n) | Seidel’s polygon triangulation algorithm |
| linearithmic time | O(n log n) | Fastest possible comparison sort; fast Fourier transform |
| quasilinear time | n poly(log n) | |
| quadratic time | O(n^2) | Bubble sort; insertion sort; direct convolution |
| cubic time | O(n^3) | Naive multiplication of two n×n matrices; calculating partial correlation |
| polynomial time | 2^(O(log n)) = poly(n) | Karmarkar’s algorithm for linear programming; AKS primality test |
| quasi-polynomial time | 2^(poly(log n)) | Best-known O(log^2 n)-approximation algorithm for the directed Steiner tree problem |
| sub-exponential time | O(2^(n^ε)) for all ε > 0 | Contains BPP unless EXPTIME equals MA |
| sub-exponential time | 2^(o(n)) | Best-known algorithm for integer factorization; formerly best algorithm for graph isomorphism |
| exponential time | 2^(O(n)) | Solving the traveling salesman problem using dynamic programming |
| exponential time | 2^(poly(n)) | Solving matrix chain multiplication via brute-force search |
| factorial time | O(n!) | Solving the traveling salesman problem via brute-force search |
| double exponential time | 2^(2^(poly(n))) | Deciding the truth of a given statement in Presburger arithmetic |

What are the different types of time complexity notation used?

  1. Constant time – O(1)
  2. Linear time – O(n)
  3. Logarithmic time – O(log n)
  4. Quadratic time – O(n^2)
  5. Cubic time – O(n^3)

Constant time – O(1)

An algorithm is said to run in constant time, with order O(1), when its running time does not depend on the input size n. Irrespective of the input size, the runtime is always the same.
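
To make this concrete, here is a minimal Python sketch (the function name get_first is our own illustration): it performs a single indexing operation no matter how long the list is.

```python
def get_first(items):
    """Return the first element of a list.

    A single indexing operation runs no matter how long the
    list is, so the running time is O(1).
    """
    return items[0]
```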

Linear time – O(n)

An algorithm is said to have linear time complexity when the running time increases linearly with the length of the input. If the algorithm must examine every value in the input, its time complexity is of order O(n).
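
For example, the sketch below (our own helper, find_max) must visit every element once, matching the “smallest or largest item in an unsorted array” row of the table above:

```python
def find_max(items):
    """Find the largest value with a single pass over the input: O(n)."""
    largest = items[0]       # assumes a non-empty list
    for value in items:      # one pass: n iterations for n elements
        if value > largest:
            largest = value
    return largest
```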

Logarithmic time – O(log n)

An algorithm is said to have logarithmic time complexity when it reduces the size of the problem by a constant fraction (typically half) at each step. The number of operations therefore grows much more slowly than the input size: doubling the input adds only a constant number of extra steps. Algorithms with logarithmic time complexity are commonly found in balanced binary trees and binary search functions.
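
The toy sketch below (our own illustration) simply counts how many times n can be halved before reaching 1, which is exactly the log2(n) growth pattern that binary search exhibits:

```python
def halving_steps(n):
    """Count how many times n can be halved before reaching 1.

    The loop body runs about log2(n) times, so doubling n adds
    only one extra iteration: the signature of O(log n) growth.
    """
    steps = 0
    while n > 1:
        n //= 2        # the problem size shrinks by half each step
        steps += 1
    return steps

# halving_steps(16) == 4; halving_steps(1_000_000) == 19
```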

Quadratic time – O(n^2)

An algorithm is said to have a non-linear, quadratic time complexity when the running time increases quadratically (n^2) with the length of the input. Generally, nested loops come under this time complexity order: a single loop over the input takes O(n), and if the function involves a loop within a loop it takes O(n) * O(n) = O(n^2).
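
A classic illustration is comparing every pair of elements. The sketch below (our own example, named has_duplicate) uses a loop within a loop:

```python
def has_duplicate(items):
    """Compare every pair of elements: O(n) * O(n) = O(n^2)."""
    n = len(items)
    for i in range(n):                 # outer loop: n iterations
        for j in range(i + 1, n):      # inner loop: up to n iterations
            if items[i] == items[j]:
                return True
    return False
```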

Time Complexity of Insertion Sort

The time complexity of Insertion Sort in the best case (an already sorted input) is O(n). In the worst case (a reverse-sorted input), the time complexity is O(n^2).
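
Here is a standard textbook version sketched in Python; the comments point at where the best and worst cases come from:

```python
def insertion_sort(items):
    """Sort in place by inserting each element into the sorted prefix."""
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # On an already sorted input this inner loop never runs,
        # which is where the O(n) best case comes from. On a
        # reverse-sorted input it runs i times: the O(n^2) worst case.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]   # shift larger elements right
            j -= 1
        items[j + 1] = key
    return items
```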

Time Complexity of Merge Sort

This sorting technique has the same asymptotic running time in every case: the time complexity of Merge Sort is O(n log n) in the best case and O(n log n) in the worst case alike. This is because Merge Sort performs the same sequence of splits and merges regardless of the input order.
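
A minimal recursive sketch in Python: there are about log n levels of splitting, and merging at each level touches all n elements, giving O(n log n) in every case.

```python
def merge_sort(items):
    """Sort by splitting in half and merging the sorted halves."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])    # append whichever half has leftovers
    merged.extend(right[j:])
    return merged
```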

Time Complexity of Bubble Sort

The time complexity of Bubble Sort in the best case is O(n), provided the optimized version is used that stops early when a full pass makes no swaps. In the worst case, the time complexity is O(n^2).
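
A sketch of that optimized variant in Python; the swapped flag is what makes the O(n) best case possible:

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            # No swaps in a full pass means the list is sorted:
            # a single O(n) pass on already sorted input (best case).
            break
    return items
```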

Time Complexity of Quick Sort

The time complexity of Quick Sort in the best case is O(n log n). In the worst case, the time complexity is O(n^2). Quicksort is nonetheless often considered the fastest of these sorting algorithms in practice, due to its O(n log n) performance in the best and average cases.
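
A compact, not-in-place Python sketch; production quicksorts usually partition in place, but the recursive cost structure is the same:

```python
def quick_sort(items):
    """Partition around a pivot, then sort each side recursively.

    Balanced partitions give about log n levels of O(n) work, i.e.
    O(n log n). If the pivot is always the smallest or largest
    element, there are n levels instead: the O(n^2) worst case.
    """
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```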

Time Complexity of Searching algorithms

Let us now dive into the time complexities of some searching algorithms and see which of them is faster.

Time Complexity of Linear Search

Linear Search accesses elements sequentially. The time complexity of Linear Search in the best case is O(1). In the worst case, the time complexity is O(n).
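
A minimal Python sketch (the function name linear_search is our own):

```python
def linear_search(items, target):
    """Check elements one by one until the target is found."""
    for index, value in enumerate(items):
        if value == target:   # best case: target is first, O(1)
            return index
    return -1                 # worst case: all n elements checked, O(n)
```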

Time Complexity of Binary Search

Binary Search is the faster of the two searching algorithms, although it requires the input to be sorted; for smaller arrays, linear search often does a better job because of its lower overhead. The time complexity of Binary Search in the best case is O(1). In the worst case, the time complexity is O(log n).
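
An iterative Python sketch (our own naming); the comments mark where the best and worst cases arise:

```python
def binary_search(sorted_items, target):
    """Halve the search interval at every step: O(log n) worst case."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid            # best case: found at the first midpoint, O(1)
        if sorted_items[mid] < target:
            low = mid + 1         # discard the lower half
        else:
            high = mid - 1        # discard the upper half
    return -1
```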

Happy Time Complexity in DSA 🙂

