**O(n^2)** | quadratic | **Kinda slow.** If you have 100 items, this does 100^2 = 10,000 units of work. Doubling the number of items makes it four times slower (because 2 squared equals 4). Example: algorithms using nested loops, such as insertion sort.
**O(n^3)** | cubic | **Poor performance.** If you have 100 items, this does 100^3 = 1,000,000 units of work. Doubling the input size makes it eight times slower. Example: matrix multiplication.
**O(2^n)** | exponential | **Very poor performance.** You want to avoid these kinds of algorithms, but sometimes you have no choice. Adding just one bit to the input doubles the running time. Example: traveling salesperson problem.
**O(n!)** | factorial | **Intolerably slow.** It literally takes a million years to do anything.
Some examples to better understand Big-O notation:
**O(1)**
The most common example of O(1) complexity is accessing an array by index.
```c++
int i = a[5];
```
Another example of O(1) is pushing onto and popping from a stack.
**O(log n)**
```c++
for(int i = 1; i < n; i *= 2) {
    cout << i << endl; // instead of simply incrementing, 'i' doubles each run, so the loop executes about log2(n) times
}
```
The binary search algorithm is an example of O(log n) complexity.
**O(n)**
```c++
for(int i = 0; i < n; i++) {
    cout << a[i] << endl;
}
```
Array Traversal and Linear Search are examples of O(n) complexity.
**O(n log n)**
```c++
for(int i = 0; i < n; i++) {        // linear
    for(int j = 1; j < n; j *= 2) { // log(n)
        // do constant time stuff
    }
}
```
Merge Sort and Heap Sort are examples of O(n log n) complexity.
**O(n^2)**
```c++
for(int i = 0; i < n; i++) {
    for(int j = 0; j < n; j++) {
        // do constant time stuff
    }
}
```
Traversing a simple 2-D array and Bubble Sort are examples of O(n^2) complexity.
**O(n^3)**
```c++
for(int i = 0; i < n; i++) {
    for(int j = 0; j < n; j++) {
        for(int k = 0; k < n; k++) {
            // do constant time stuff
        }
    }
}
```
**O(2^n)**
Algorithms with running time O(2^N) are often recursive algorithms that solve a problem of size N by recursively solving two smaller problems of size N-1.
The following example prints all the moves necessary to solve the famous "Towers of Hanoi" problem for N disks.
```c++
void solve_hanoi(int N, string from, string to, string spare) {
    if (N < 1) {
        return;
    }
    if (N > 1) {
        solve_hanoi(N - 1, from, spare, to);
    }
    cout << "move from " << from << " to " << to << endl;
    if (N > 1) {
        solve_hanoi(N - 1, spare, to, from);
    }
}
```
**O(n!)**
The most trivial example of a function that takes O(n!) time is given below.
```c++
void nFacFunc(int n) {
    for(int i = 0; i < n; i++) {
        nFacFunc(n - 1);
    }
}
```
Often you don't need math to figure out what the Big-O of an algorithm is but you can simply use your intuition. If your code uses a single loop that looks at all **n** elements of your input, the algorithm is **O(n)**. If the code has two nested loops, it is **O(n^2)**. Three nested loops gives **O(n^3)**, and so on.
Note that Big-O notation is an estimate and is only really useful for large values of **n**. For example, the worst-case running time for the [insertion sort](Insertion%20Sort/) algorithm is **O(n^2)**. In theory that is worse than the running time for [merge sort](Merge%20Sort/), which is **O(n log n)**. But for small amounts of data, insertion sort is actually faster, especially if the array is partially sorted already!