Some examples to better understand the Big-O notation:

The most common example with O(1) complexity is accessing an array index.

```swift
let value = array[5]
```

Another example of O(1) is pushing onto and popping from a stack.
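
This can be sketched with a plain Swift `Array` used as a stack: `append` and `removeLast` only touch the end of the storage, so no elements have to shift.

```swift
var stack = [Int]()

stack.append(10)             // push: amortized O(1)
stack.append(20)             // push
let top = stack.removeLast() // pop: O(1), returns 20
```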
**O(log n)**

```swift
var j = 1
while j < n {
    // do constant time stuff
    j *= 2
}
```

Instead of simply incrementing, `j` is doubled on each pass through the loop.

Binary Search Algorithm is an example of O(log n) complexity.
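
As a sketch, binary search halves the remaining range on every iteration, which is where the log n comes from. The array must already be sorted; the function name here is just for illustration.

```swift
func binarySearch(_ a: [Int], for target: Int) -> Int? {
    var low = 0
    var high = a.count - 1
    while low <= high {
        let mid = (low + high) / 2  // halve the range each pass
        if a[mid] == target {
            return mid
        } else if a[mid] < target {
            low = mid + 1
        } else {
            high = mid - 1
        }
    }
    return nil  // not found
}
```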
**O(n)**

```swift
for i in stride(from: 0, to: n, by: 1) {
    print(array[i])
}
```

Array Traversal and Linear Search are examples of O(n) complexity.
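
A minimal linear search sketch: in the worst case every one of the n elements is inspected once. The function name is illustrative.

```swift
func linearSearch(_ a: [Int], for target: Int) -> Int? {
    // Visits each element at most once: O(n).
    for (index, element) in a.enumerated() {
        if element == target { return index }
    }
    return nil
}
```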
**O(n log n)**

```swift
for _ in stride(from: 0, to: n, by: 1) {
    var j = 1
    while j < n {
        // do constant time stuff
        j *= 2
    }
}
```

OR

```swift
for _ in stride(from: 0, to: n, by: 1) {
    // Doubles `j` until it reaches `n`, then ends the sequence.
    func index(after j: Int) -> Int? {
        return j < n ? j * 2 : nil
    }
    for j in sequence(first: 1, next: index(after:)) {
        // do constant time stuff
    }
}
```

Merge Sort and Heap Sort are examples of O(n log n) complexity.
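
A sketch of why Merge Sort is O(n log n): the input is halved recursively (log n levels), and merging the sorted halves costs O(n) per level. Names here are illustrative, not a production implementation.

```swift
func mergeSort(_ a: [Int]) -> [Int] {
    guard a.count > 1 else { return a }  // base case
    let mid = a.count / 2
    // Halve recursively: log n levels of splitting.
    let left = mergeSort(Array(a[..<mid]))
    let right = mergeSort(Array(a[mid...]))
    // Merge two sorted halves: O(n) work per level.
    var merged = [Int]()
    var i = 0, j = 0
    while i < left.count && j < right.count {
        if left[i] <= right[j] {
            merged.append(left[i]); i += 1
        } else {
            merged.append(right[j]); j += 1
        }
    }
    merged.append(contentsOf: left[i...])
    merged.append(contentsOf: right[j...])
    return merged
}
```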
**O(n^2)**

```swift
for _ in stride(from: 0, to: n, by: 1) {
    for _ in stride(from: 1, to: n, by: 1) {
        // do constant time stuff
    }
}
```

Traversing a simple 2-D array and Bubble Sort are examples of O(n^2) complexity.
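
Bubble Sort can be sketched as two nested passes over the n elements, giving O(n^2) comparisons in the worst case. The function name is illustrative.

```swift
func bubbleSort(_ a: [Int]) -> [Int] {
    var result = a
    // Outer pass fixes one element per iteration; inner pass compares neighbors.
    for i in 0..<result.count {
        for j in 1..<result.count - i where result[j - 1] > result[j] {
            result.swapAt(j - 1, j)  // bubble the larger value rightward
        }
    }
    return result
}
```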
**O(n^3)**

```swift
for _ in stride(from: 0, to: n, by: 1) {
    for _ in stride(from: 1, to: n, by: 1) {
        for _ in stride(from: 1, to: n, by: 1) {
            // do constant time stuff
        }
    }
}
```

**O(2^n)**
Algorithms with running time O(2^N) are often recursive algorithms that solve a problem of size N by recursively solving two smaller problems of size N-1.
The following example prints all the moves necessary to solve the famous "Towers of Hanoi" problem for N disks.

```swift
func solveHanoi(n: Int, from: String, to: String, spare: String) {
    guard n >= 1 else { return }
    if n > 1 {
        solveHanoi(n: n - 1, from: from, to: spare, spare: to)
    }
    print("move from \(from) to \(to)")
    if n > 1 {
        solveHanoi(n: n - 1, from: spare, to: to, spare: from)
    }
}
```
**O(n!)**

The most trivial example of a function that takes O(n!) time is given below.

```swift
func nFacFunc(n: Int) {
    for _ in stride(from: 0, to: n, by: 1) {
        nFacFunc(n: n - 1)
    }
}
```
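
For a more useful O(n!) computation, consider generating every ordering of n elements: there are n choices at the first position times (n-1)! orderings of the rest, so n! results in total. This sketch (the function name is illustrative) builds them recursively.

```swift
func permutations<T>(_ items: [T]) -> [[T]] {
    guard items.count > 1 else { return [items] }  // base case
    var result = [[T]]()
    for (index, element) in items.enumerated() {
        var rest = items
        rest.remove(at: index)
        // n choices here times (n-1)! permutations of the rest: n! total.
        for sub in permutations(rest) {
            result.append([element] + sub)
        }
    }
    return result
}
```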
Often you don't need math to figure out what the Big-O of an algorithm is; you can simply use your intuition. If your code uses a single loop that looks at all **n** elements of your input, the algorithm is **O(n)**. If the code has two nested loops, it is **O(n^2)**. Three nested loops give **O(n^3)**, and so on.
