# Algorithms & Data Structures

| Data Structure | Subtype / Implementation | Reading | Insertion | Deletion | Note |
| --- | --- | --- | --- | --- | --- |
| Array | - | $O(1)$ | $O(n)$ | $O(n)$ | Allows random access. |
| List | Linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has a pointer to the next one. |
| | Doubly linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has pointers to the next and previous ones. |
| | Circular linked list | $O(n)$ | $O(1)$ | $O(1)$ | Head and tail are linked. |
| Hash Table | - | $O(1)$ | $O(1)$ | $O(1)$ | Assuming a good hash function and a table that is not too full. In the worst case, when many collisions happen, time complexity degrades to $O(n)$. |
| Set | Using hash tables | $O(1)$ | $O(1)$ | $O(1)$ | Some operations, such as finding the maximum or minimum element, take $O(n)$ in the worst case if the implementation does not track these values separately. |
| | Using a BST such as an AVL, red-black, or splay tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | ^ |
| Stack | Using linked lists | $O(1)$ (Last) | $O(1)$ (Push) | $O(1)$ (Pop) | The basic (LIFO) operations take constant time regardless of the number of elements. |
| | Using fixed-size arrays | $O(1)$ (Last) | $O(n)$ (Push) | $O(1)$ (Pop) | Insertion may be proportional to the size of the stack when a new array allocation is needed. |
| Queue | Using linked lists | $O(1)$ (First) | $O(1)$ (Enqueue) | $O(1)$ (Dequeue) | The basic (FIFO) operations take constant time regardless of the number of elements. |
| | Priority Queue | $O(1)$ (First) | $O(\log n)$ (Enqueue) | $O(\log n)$ (Dequeue) | Using a binary heap, peeking at the highest-priority element takes $O(1)$; insertion and removal take $O(\log n)$. |
| Tree | Binary Tree | $O(n)$ | $O(n)$ | $O(n)$ | |
| | Binary Search Tree | $O(\log n)$ (average) | $O(\log n)$ (average) | $O(\log n)$ (average) | Worst case is $O(n)$ when the BST degenerates into a linear chain of $n$ nodes. |
| | Red-Black Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Self-balancing keeps the height of the tree logarithmic in the number of keys. |
| | Trie (Prefix tree) | $O(m)$ | $O(m)$ | $O(m)$ | Where $m$ is the length of the string. |
| | B-Trees | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | The height of the tree remains logarithmic in the number of keys. |
| | Interval Tree | $O(\log n + k)$ | $O(\log n)$ | $O(\log n)$ | Where $k$ is the number of overlapping intervals reported by a query. |
| | M-ary / K-ary Trees | $O(\log_m n)$ | $O(\log_m n)$ | $O(\log_m n)$ | Assumes the tree is balanced; in practice, performance varies depending on many factors. |
| | Order-Statistic Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | A specialized red-black tree and therefore self-balancing. |
| | Van Emde Boas Tree | $O(\log \log n)$ | $O(\log \log n)$ | $O(\log \log n)$ | Requires integer keys drawn from a bounded universe. |
| Heaps | Binary Min/Max Heap | $O(1)$ (peek min/max) | $O(\log n)$ | $O(\log n)$ | |
| | Mergeable Heap | - | - | - | The time complexity of the basic operations depends on the underlying heap implementation, regardless of the added "merge" operation. |
| | Fibonacci Heap | $O(1)$ | $O(1)$ amortized | $O(\log n)$ amortized | |
| Graph | - | - | - | - | See the graph algorithms listed below. |

This is only a high-level overview of baseline implementations. In practice, you can often optimize a data structure for a specific use case by making additional assumptions and tweaks.
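
As a concrete illustration of the priority-queue row above, here is a minimal binary min-heap sketch in Python (the `MinHeap` class is hypothetical, written for this README rather than taken from a library): `peek` is $O(1)$ because the smallest element sits at the root, while `push` and `pop` are $O(\log n)$ because each walks at most one root-to-leaf path.

```python
class MinHeap:
    """Minimal binary min-heap: O(1) peek, O(log n) push/pop."""

    def __init__(self):
        self._items = []

    def peek(self):
        return self._items[0]  # O(1): the smallest element sits at the root

    def push(self, value):
        self._items.append(value)
        i = len(self._items) - 1
        # Sift up: swap with the parent while smaller, at most O(log n) levels.
        while i > 0 and self._items[(i - 1) // 2] > self._items[i]:
            parent = (i - 1) // 2
            self._items[i], self._items[parent] = self._items[parent], self._items[i]
            i = parent

    def pop(self):
        items = self._items
        items[0], items[-1] = items[-1], items[0]  # move the root to the end
        smallest = items.pop()
        # Sift down: swap with the smaller child, at most O(log n) levels.
        i = 0
        while True:
            child = 2 * i + 1
            if child >= len(items):
                break
            if child + 1 < len(items) and items[child + 1] < items[child]:
                child += 1  # pick the smaller of the two children
            if items[i] <= items[child]:
                break
            items[i], items[child] = items[child], items[i]
            i = child
        return smallest


heap = MinHeap()
for x in [5, 1, 4, 2]:
    heap.push(x)
assert heap.peek() == 1
assert [heap.pop() for _ in range(4)] == [1, 2, 4, 5]
```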

## Algorithm Design

## Algorithms

## Mathematics

## Cryptography

- Breadth-first search
- Depth-first search
- Dijkstra's algorithm (see the sketch after this list)
- Minimum Spanning Trees
  - Kruskal's algorithm
  - Prim's algorithm
- Single-Source Shortest Paths
  - The Bellman-Ford algorithm
  - Dijkstra's algorithm
- All-Pairs Shortest Paths
  - Floyd-Warshall algorithm
  - Johnson's algorithm for sparse graphs
- Maximum Flow
  - Ford-Fulkerson method
  - Edmonds-Karp algorithm
  - Push-relabel algorithms
  - Relabel-to-front algorithm
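
Since Dijkstra's algorithm appears twice in the list above, a minimal sketch may be useful. This version uses Python's `heapq` as the priority queue and assumes a graph represented as a dict of adjacency lists with non-negative edge weights (the representation is an illustrative choice, not prescribed by the list):

```python
import heapq


def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.

    graph: dict mapping a node to a list of (neighbor, weight) pairs.
    Returns a dict of shortest distances from source.
    """
    dist = {source: 0}
    queue = [(0, source)]  # (distance, node) pairs ordered by distance
    while queue:
        d, node = heapq.heappop(queue)
        if d > dist.get(node, float("inf")):
            continue  # stale entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            candidate = d + weight
            if candidate < dist.get(neighbor, float("inf")):
                dist[neighbor] = candidate
                heapq.heappush(queue, (candidate, neighbor))
    return dist


graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
assert dijkstra(graph, "a") == {"a": 0, "b": 1, "c": 3}
```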

## Problem-specific

### Searching and traversal

What is a Tree Traversal?
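
As a quick companion to the tree-traversal question above, here is a minimal sketch of the three classic depth-first traversal orders, using a hypothetical `Node` type defined just for this example:

```python
from dataclasses import dataclass


@dataclass
class Node:
    value: int
    left: "Node | None" = None
    right: "Node | None" = None


def inorder(node):
    """Left subtree, node, right subtree: yields a BST in sorted order."""
    if node is not None:
        yield from inorder(node.left)
        yield node.value
        yield from inorder(node.right)


def preorder(node):
    """Node first, then children: useful for copying a tree."""
    if node is not None:
        yield node.value
        yield from preorder(node.left)
        yield from preorder(node.right)


def postorder(node):
    """Children first, then node: useful for deleting a tree."""
    if node is not None:
        yield from postorder(node.left)
        yield from postorder(node.right)
        yield node.value


root = Node(2, Node(1), Node(3))
assert list(inorder(root)) == [1, 2, 3]
assert list(preorder(root)) == [2, 1, 3]
assert list(postorder(root)) == [1, 3, 2]
```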

| Algorithm | Time | Space | Notes |
| --- | --- | --- | --- |
| DFS: Depth-first search | $O(\vert V \vert + \vert E \vert)$ | $O(\vert V \vert)$ | Space is the recursion or explicit stack in the worst case. |
| BFS: Breadth-first search | $O(\vert V \vert + \vert E \vert)$ | $O(\vert V \vert)$ | Space is the queue, which can hold $O(\vert V \vert)$ vertices. |
| Binary search | $O(\log_2 n)$ | $O(1)$ | Requires sorted input. |
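
To make the binary-search row concrete, here is a minimal iterative sketch: each comparison halves the remaining interval, which is where the $O(\log_2 n)$ time and $O(1)$ extra space in the table come from.

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2  # halve the interval each iteration
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1  # target can only be in the right half
        else:
            hi = mid - 1  # target can only be in the left half
    return -1


assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```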
| Algorithm | Time | Space | Stable | Notes |
| --- | --- | --- | --- | --- |
| Selection sort | $O(n^2)$ | $O(1)$ | No | |
| Insertion sort | $O(n^2)$ | $O(1)$ | Yes | |
| Mergesort | $O(n\ \log n)$ | $O(n)$ | Yes | Standard implementations need $O(n)$ auxiliary space for merging. |
| Heapsort | $O(n\ \log n)$ | $O(1)$ | No | |
| Quicksort | $O(n^2)$ worst and $O(n\ \log n)$ average | $O(\log n)$ | No | |
| Counting sort | $O(n + k)$ | $O(k)$ | Yes | Where $k$ is the largest element in the array. The standard prefix-sum formulation is stable. |
| Radix sort | $O(d(n + k))$ | $O(n + k)$ | Yes | Where $d$ is the number of digits in the longest number and $k$ is the base (e.g., 10 for decimal). |
| Bucket sort | $O(n^2)$ worst and $O(n + k)$ average | $O(n + k)$ | Yes (if a stable sorting algorithm is used within the buckets) | Where $k$ is the number of buckets. |
| Random permutations | - | - | - | Complexity depends on whether the implementation sorts by random priorities (a comparison sort) or not. |
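
Stability is what lets radix sort reuse counting sort digit by digit, so a sketch of the stable prefix-sum formulation of counting sort may help. It assumes non-negative integer keys; the `key` parameter is an illustrative addition, not part of the textbook algorithm.

```python
def counting_sort(items, key=lambda x: x):
    """Stable counting sort for non-negative integer keys, O(n + k)."""
    if not items:
        return []
    k = max(key(x) for x in items)
    counts = [0] * (k + 1)
    for x in items:
        counts[key(x)] += 1
    # Prefix sums: counts[v] becomes the index just past the last slot for key v.
    for v in range(1, k + 1):
        counts[v] += counts[v - 1]
    output = [None] * len(items)
    # Walk the input right-to-left so equal keys keep their relative order.
    for x in reversed(items):
        counts[key(x)] -= 1
        output[counts[key(x)]] = x
    return output


# The two elements with key 1 keep their original left-to-right order: stable.
assert counting_sort([(1, "a"), (0, "b"), (1, "c")], key=lambda p: p[0]) == [
    (0, "b"), (1, "a"), (1, "c")]
```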

## Solved problems

### Well-known

### By Data Structure

#### Lists

## Coding platforms