
Commit e1f32e0

Merge pull request kodecocodes#634 from nb2998/nb2998-bigO
Updated Big-O Notation.markdown
2 parents f75ebbd + 6cfff2f commit e1f32e0


Big-O Notation.markdown

Lines changed: 123 additions & 1 deletion
@@ -15,8 +15,130 @@ Big-O | Name | Description
**O(n^2)** | quadratic | **Kinda slow.** If you have 100 items, this does 100^2 = 10,000 units of work. Doubling the number of items makes it four times slower (because 2 squared equals 4). Example: algorithms using nested loops, such as insertion sort.
**O(n^3)** | cubic | **Poor performance.** If you have 100 items, this does 100^3 = 1,000,000 units of work. Doubling the input size makes it eight times slower. Example: matrix multiplication.
**O(2^n)** | exponential | **Very poor performance.** You want to avoid these kinds of algorithms, but sometimes you have no choice. Adding just one bit to the input doubles the running time. Example: traveling salesperson problem.
**O(n!)** | factorial | **Intolerably slow.** It literally takes a million years to do anything.

Below are some examples for each category of performance:

**O(1)**

The most common example with O(1) complexity is accessing an array index.

```swift
let value = array[5]
```

Another example of O(1) is pushing onto and popping from a stack.
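
Push and pop only touch the top element, so their cost does not depend on how many items the stack holds. A minimal sketch (this `Stack` type is illustrative, backed by a Swift array):

```swift
struct Stack<Element> {
    private var storage: [Element] = []

    // Appending to the end of an array is O(1) (amortized).
    mutating func push(_ element: Element) {
        storage.append(element)
    }

    // Removing the last element of an array is O(1).
    mutating func pop() -> Element? {
        return storage.popLast()
    }
}
```
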
**O(log n)**

```swift
var j = 1
while j < n {
  // do constant time stuff
  j *= 2
}
```

Instead of simply incrementing, `j` doubles on each pass, so the loop body runs only about log2(n) times before `j` reaches `n`.

The binary search algorithm is an example of O(log n) complexity.
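
A minimal sketch of binary search over a sorted `[Int]` (the function name and signature here are illustrative):

```swift
// Each comparison halves the remaining range, so a sorted array of
// n elements takes at most about log2(n) steps to search.
func binarySearch(_ array: [Int], for target: Int) -> Int? {
    var low = 0
    var high = array.count - 1
    while low <= high {
        let mid = low + (high - low) / 2
        if array[mid] == target {
            return mid          // found it
        } else if array[mid] < target {
            low = mid + 1       // discard the lower half
        } else {
            high = mid - 1      // discard the upper half
        }
    }
    return nil                  // not present
}
```
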
**O(n)**

```swift
for i in stride(from: 0, to: n, by: 1) {
  print(array[i])
}
```

Array traversal and linear search are examples of O(n) complexity.
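
For instance, a linear search sketch (illustrative) has to inspect up to all n elements to find a match:

```swift
// Worst case: the target is last or absent, so all n elements are checked.
func linearSearch(_ array: [Int], for target: Int) -> Int? {
    for (index, element) in array.enumerated() {
        if element == target {
            return index
        }
    }
    return nil
}
```
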
**O(n log n)**

```swift
for i in stride(from: 0, to: n, by: 1) {
  var j = 1
  while j < n {
    j *= 2
    // do constant time stuff
  }
}
```

OR

```swift
for i in stride(from: 0, to: n, by: 1) {
  func index(after i: Int) -> Int? { // multiplies `i` by 2 until `i` >= `n`
    return i < n ? i * 2 : nil
  }
  for j in sequence(first: 1, next: index(after:)) {
    // do constant time stuff
  }
}
```

Merge sort and heap sort are examples of O(n log n) complexity.
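
As a rough sketch of why merge sort is O(n log n): the array is halved about log2(n) times, and each level of recursion merges n elements in total. This minimal top-down version is illustrative:

```swift
// Top-down merge sort on [Int]: O(log n) levels of recursion,
// O(n) merging work per level.
func mergeSort(_ array: [Int]) -> [Int] {
    guard array.count > 1 else { return array }
    let middle = array.count / 2
    let left = mergeSort(Array(array[..<middle]))
    let right = mergeSort(Array(array[middle...]))

    // Merge the two sorted halves in a single O(n) pass.
    var merged: [Int] = []
    merged.reserveCapacity(array.count)
    var i = 0, j = 0
    while i < left.count && j < right.count {
        if left[i] <= right[j] {
            merged.append(left[i]); i += 1
        } else {
            merged.append(right[j]); j += 1
        }
    }
    merged.append(contentsOf: left[i...])
    merged.append(contentsOf: right[j...])
    return merged
}
```
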
**O(n^2)**

```swift
for i in stride(from: 0, to: n, by: 1) {
  for j in stride(from: 1, to: n, by: 1) {
    // do constant time stuff
  }
}
```

Traversing a simple 2-D array and bubble sort are examples of O(n^2) complexity.
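
A minimal bubble sort sketch (illustrative): the nested passes over the array are what make it quadratic:

```swift
// The outer loop runs n times and the inner loop up to n times per pass,
// so the worst case does on the order of n^2 comparisons and swaps.
func bubbleSort(_ array: [Int]) -> [Int] {
    var a = array
    for i in 0..<a.count {
        // After each pass the largest remaining value has bubbled to the
        // end, so the inner range shrinks by one.
        for j in 1..<a.count - i where a[j - 1] > a[j] {
            a.swapAt(j - 1, j)
        }
    }
    return a
}
```
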
**O(n^3)**

```swift
for i in stride(from: 0, to: n, by: 1) {
  for j in stride(from: 1, to: n, by: 1) {
    for k in stride(from: 1, to: n, by: 1) {
      // do constant time stuff
    }
  }
}
```

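The table above names matrix multiplication as the classic O(n^3) case. A naive sketch over square matrices stored as `[[Double]]` (illustrative; faster algorithms exist):

```swift
// Three nested loops over n rows, n columns, and n terms per entry: O(n^3).
// Assumes both inputs are n x n.
func multiply(_ a: [[Double]], _ b: [[Double]]) -> [[Double]] {
    let n = a.count
    var result = [[Double]](repeating: [Double](repeating: 0, count: n), count: n)
    for i in 0..<n {
        for j in 0..<n {
            for k in 0..<n {
                result[i][j] += a[i][k] * b[k][j]
            }
        }
    }
    return result
}
```
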
**O(2^n)**

Algorithms with running time O(2^n) are often recursive algorithms that solve a problem of size n by recursively solving two smaller problems of size n - 1.
The following example prints all the moves necessary to solve the famous "Towers of Hanoi" problem for n disks.

```swift
func solveHanoi(n: Int, from: String, to: String, spare: String) {
  guard n >= 1 else { return }
  // Move the top n - 1 disks out of the way, move the largest disk,
  // then move the n - 1 disks back on top of it: two subproblems of size n - 1.
  solveHanoi(n: n - 1, from: from, to: spare, spare: to)
  print("Move disk \(n) from \(from) to \(to)")
  solveHanoi(n: n - 1, from: spare, to: to, spare: from)
}
```

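A quick usage sketch (peg names illustrative): n disks produce 2^n - 1 printed moves, which is exactly where the exponential running time shows up.

```swift
solveHanoi(n: 3, from: "A", to: "C", spare: "B")  // prints 2^3 - 1 = 7 moves
```
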
**O(n!)**

The most trivial example of a function that takes O(n!) time is given below.

```swift
func nFactFunc(n: Int) {
  // The loop runs n times and each iteration recurses on n - 1,
  // so the call count is n * (n - 1) * ... * 1 = n!.
  for _ in stride(from: 0, to: n, by: 1) {
    nFactFunc(n: n - 1)
  }
}
```

Often you don't need math to figure out what the Big-O of an algorithm is but you can simply use your intuition. If your code uses a single loop that looks at all **n** elements of your input, the algorithm is **O(n)**. If the code has two nested loops, it is **O(n^2)**. Three nested loops gives **O(n^3)**, and so on.

Note that Big-O notation is an estimate and is only really useful for large values of **n**. For example, the worst-case running time for the [insertion sort](Insertion%20Sort/) algorithm is **O(n^2)**. In theory that is worse than the running time for [merge sort](Merge%20Sort/), which is **O(n log n)**. But for small amounts of data, insertion sort is actually faster, especially if the array is partially sorted already!
