Data Structure | Subtype / Implementation | Reading | Insertion | Deletion | Note |
---|---|---|---|---|---|
Array | - | $O(1)$ | $O(n)$ | $O(n)$ | Allows random access. |
List | - | - | - | - | - |
^ | Linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has a pointer to the next one. |
^ | Doubly linked list | $O(n)$ | $O(1)$ | $O(1)$ | Each node has a pointer to the next and previous one. |
^ | Circular linked list | $O(n)$ | $O(1)$ | $O(1)$ | Head and tail are linked. |
Hash Table | - | $O(1)$ | $O(1)$ | $O(1)$ | Assuming a good hash function is used and the table is not too full. In the worst case, where many collisions happen, time complexity can be $O(n)$. |
Set | - | - | - | - | - |
^ | Using hash tables | $O(1)$ | $O(1)$ | $O(1)$ | The worst case of some operations, such as finding the maximum or minimum element, can be $O(n)$ if the implementation does not keep track of these values separately. |
^ | Using a BST such as an AVL, red-black, or splay tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | ^ |
Stack | - | - | - | - | - |
^ | Using linked lists | $O(n)$ | $O(1)$ | $O(1)$ | The time complexity of the basic operations (LIFO push/pop) is constant, regardless of the number of elements (see the sketch after this table). |
^ | Using fixed-size arrays | $O(n)$ | $O(1)$ | $O(1)$ | Insertion may be proportional to the size of the stack if a new array allocation is needed. |
Queue | - | - | - | - | - |
^ | Using linked lists | $O(n)$ | $O(1)$ | $O(1)$ | The time complexity of the basic operations (FIFO enqueue/dequeue) is constant, regardless of the number of elements. |
Priority Queue | - | $O(1)$ | $O(\log n)$ | $O(\log n)$ | Using a binary heap, we can get the element with the highest priority in $O(1)$; insertion and extraction take $O(\log n)$. |
Tree | - | - | - | - | - |
^ | Binary Tree | $O(n)$ | $O(n)$ | $O(n)$ | Assumes no ordering of the keys, so a lookup may visit every node. |
^ | Binary Search Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Average case. Worst case is $O(n)$ when the tree becomes unbalanced (e.g., degenerates into a linked list). |
^ | Red-Black Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Self-balancing guarantees that the height of the tree remains logarithmic with respect to the number of keys. |
^ | Trie (Prefix tree) | $O(m)$ | $O(m)$ | $O(m)$ | Where $m$ is the length of the key. |
^ | B-Trees | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | The height of the tree remains logarithmic with respect to the number of keys. |
^ | Interval Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Where $n$ is the number of intervals stored in the tree. |
^ | M-ary / K-ary Trees | $O(\log_m n)$ | $O(\log_m n)$ | $O(\log_m n)$ | Assumes the tree is balanced, but in practice the performance can vary depending on many factors. |
^ | Order-Statistic Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | It's a specialized Red-Black Tree and is therefore self-balancing. |
^ | Van Emde Boas Tree | $O(\log \log u)$ | $O(\log \log u)$ | $O(\log \log u)$ | Where $u$ is the size of the key universe. |
Heaps | - | - | - | - | - |
^ | Binary Min/Max Heap | $O(1)$ (min/max) | $O(\log n)$ | $O(\log n)$ | Finding the minimum/maximum is constant time; searching for an arbitrary element is $O(n)$. |
^ | Mergeable Heap | - | - | - | The time complexity of the basic operations depends on the underlying heap implementation, regardless of the added "merge" operation. |
^ | Fibonacci Heap | $O(1)$ | $O(1)$ | $O(\log n)$ | Bounds are amortized; extracting the minimum is $O(\log n)$ amortized. |
Graph | - | - | - | - | - |
This is only a high-level overview of base implementations. Usually you can optimize a data structure for a specific use case by making some assumptions and tweaks.
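For a concrete instance of one of the rows above, here is a minimal sketch (Python is used for the examples in this document purely for brevity) of a stack backed by a singly linked list; the `Node` and `Stack` names are just for this illustration. Push, pop, and peek only touch the head node, so each is $O(1)$ no matter how many elements are stored.

```python
class Node:
    """A singly linked node: each node points to the next one."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next


class Stack:
    """LIFO stack backed by a singly linked list.

    push, pop, and peek only manipulate the head node, so their cost
    does not depend on the number of stored elements.
    """
    def __init__(self):
        self._head = None

    def push(self, value):
        # New node becomes the head and points to the old head: O(1).
        self._head = Node(value, self._head)

    def pop(self):
        # Detach the head node and return its value: O(1).
        if self._head is None:
            raise IndexError("pop from empty stack")
        value = self._head.value
        self._head = self._head.next
        return value

    def peek(self):
        # Read the head value without removing it: O(1).
        if self._head is None:
            raise IndexError("peek from empty stack")
        return self._head.value


s = Stack()
for x in (1, 2, 3):
    s.push(x)
print(s.pop(), s.pop(), s.peek())  # 3 2 1
```

An array-backed stack behaves the same way except that a push occasionally triggers a reallocation, which is the caveat noted in the table.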
- Amortized Analysis
- Divide and Conquer
- Recursion
- Dynamic Programming
- Greedy Algorithms
- Probabilistic Analysis
- Randomized Algorithm
- NP-complete problems
- Breadth-first search
- Depth-first search
- Dijkstra's algorithm (see the sketch after this list)
- Minimum Spanning Trees
  - Kruskal's algorithm
  - Prim's algorithm
- Single-Source Shortest Paths
  - The Bellman-Ford algorithm
  - Dijkstra's algorithm
- All-Pairs Shortest Paths
  - Floyd-Warshall algorithm
  - Johnson's algorithm for sparse graphs
- Maximum Flow
  - Ford-Fulkerson method
  - Edmonds-Karp algorithm
  - Push-relabel algorithms
  - Relabel-to-front algorithm
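As a quick reference for one of the algorithms listed above, here is a rough Dijkstra sketch that uses Python's `heapq` module as the binary-heap priority queue from the first table. The adjacency-list representation (a dict mapping each vertex to a list of `(neighbor, weight)` pairs) and the function name are assumptions made for this example; it requires non-negative edge weights and runs in $O((V + E) \log V)$ with this heap.

```python
import heapq


def dijkstra(graph, source):
    """Single-source shortest-path distances for non-negative weights.

    `graph` maps each vertex to a list of (neighbor, weight) pairs.
    Uses a binary heap (heapq) as the priority queue.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; u was already settled with a shorter path
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                 # relax the edge (u, v)
                heapq.heappush(heap, (nd, v))
    return dist


# Example: shortest distances from 'a' in a small weighted graph.
g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 5)], "c": [("d", 1)], "d": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3, 'd': 4}
```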
Algorithm | Time | Space | Notes |
---|---|---|---|
DFS: Depth-first search | $O(V + E)$ | $O(V)$ | Assumes an adjacency-list representation of the graph. |
BFS: Breadth-first search | $O(V + E)$ | $O(V)$ | Assumes an adjacency-list representation of the graph. |
Binary search | $O(\log n)$ | $O(1)$ | Requires the input to be sorted (see the sketch after this table). |
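A minimal sketch of the binary search row above, assuming the input is already sorted: the search interval is halved on every iteration, giving $O(\log n)$ time, and the iterative form uses only $O(1)$ extra space.

```python
def binary_search(sorted_items, target):
    """Return the index of `target` in `sorted_items`, or -1 if absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1


print(binary_search([1, 3, 5, 7, 11], 7))  # 3
print(binary_search([1, 3, 5, 7, 11], 4))  # -1
```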
Algorithm | Time | Space | Stable | Notes |
---|---|---|---|---|
Selection sort | $O(n^2)$ | $O(1)$ | No | |
Insertion sort | $O(n^2)$ | $O(1)$ | Yes | |
Mergesort | $O(n \log n)$ | $O(n)$ | Yes | |
Heapsort | $O(n \log n)$ | $O(1)$ | No | |
Quicksort | $O(n \log n)$ on average, $O(n^2)$ in the worst case | $O(\log n)$ | No | |
Counting sort | $O(n + k)$ | $O(n + k)$ | Yes | Where $k$ is the range of the key values. The standard implementation is stable, which is what radix sort relies on. |
Radix sort | $O(d(n + k))$ | $O(n + k)$ | Yes | Where $d$ is the number of digits and $k$ is the range of each digit; uses a stable sort (typically counting sort) on each digit. |
Bucket sort | $O(n + k)$ on average, $O(n^2)$ in the worst case | $O(n + k)$ | Yes (if a stable sorting algorithm is used within the buckets) | Where $k$ is the number of buckets; assumes the input is drawn from a roughly uniform distribution. |
Random permutations | - | - | - | Complexity depends on the approach: permuting by sorting random priorities costs $O(n \log n)$, while shuffling in place (Fisher-Yates) costs $O(n)$ (see the sketch after this table). |
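To make the last row concrete, here is a sketch of the two approaches it refers to: shuffling in place (Fisher-Yates, $O(n)$ time and $O(1)$ extra space) versus assigning each element a random priority and sorting by it ($O(n \log n)$, dominated by the comparison sort). The function names are just for this illustration.

```python
import random


def randomize_in_place(items):
    """Fisher-Yates shuffle: O(n) time, O(1) extra space.

    At step i, swap items[i] with a uniformly chosen element from
    items[i:]; this produces a uniformly random permutation.
    """
    for i in range(len(items) - 1):
        j = random.randint(i, len(items) - 1)
        items[i], items[j] = items[j], items[i]
    return items


def permute_by_sorting(items):
    """Assign each element a random priority and sort by it.

    Dominated by the comparison sort: O(n log n) time and O(n) extra
    space for the list of priorities.
    """
    prioritized = [(random.random(), x) for x in items]
    prioritized.sort(key=lambda pair: pair[0])
    return [x for _, x in prioritized]


print(randomize_in_place(list(range(10))))
print(permute_by_sorting(list(range(10))))
```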