Stability, in the context of sorting, refers to preserving the relative order of elements that compare as equal:
- In an **unstable** sorting algorithm, their order might be reversed in the sorted output.
- In a **stable** sorting algorithm, their relative order remains unchanged.
#### Example
Picture each element as a **programming-language name** followed by the 0-based position it held in the original (unsorted) list:
```
[C#0] [Python1] [C#2] [JavaScript3] [Python4]
```
#### Stable Sort
A **stable** sort keeps items that compare as “equal” in the same left-to-right order they started with.
I. Bring every `"C#"` to the front **without** changing their internal order (`0` still precedes `2`):
```
[C#0] [C#2] [Python1] [JavaScript3] [Python4]
```
II. Next, move the two `"Python"` entries ahead of `"JavaScript"`, again preserving `1` before `4`:
```
[C#0] [C#2] [JavaScript3] [Python1] [Python4]
```
So the stable-sorted sequence is:
```
[C#0] [C#2] [JavaScript3] [Python1] [Python4]
```
#### Unstable Sort
An **unstable** sort does *not* guarantee that equal items keep their original relative order.
I. While collecting `"C#"` items, the algorithm might emit index `2` *before* index `0`:
```
[C#2] [C#0] [Python1] [JavaScript3] [Python4]
```
II. Later, the two `"Python"` entries can also swap positions (`4` before `1`):
```
[C#2] [C#0] [JavaScript3] [Python4] [Python1]
```
So one possible unstable-sorted sequence is:
```
[C#2] [C#0] [JavaScript3] [Python4] [Python1]
```
This stability property matters when you chain sorts on multiple keys—for instance, first sorting bug reports by **severity**, then by **timestamp**—because each later pass can rely on ties already being in the correct internal order.
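
To see this in code, here is a minimal Python sketch (the record fields `severity` and `timestamp` are illustrative). Python's built-in sort is guaranteed stable, so sorting by the secondary key first and by the primary key second produces the combined ordering:

```python
# list.sort() is guaranteed stable, so two single-key passes
# compose into a correct multi-key sort.
reports = [
    {"id": 1, "severity": "high", "timestamp": 3},
    {"id": 2, "severity": "low",  "timestamp": 1},
    {"id": 3, "severity": "high", "timestamp": 2},
]

reports.sort(key=lambda r: r["timestamp"])  # secondary key first
reports.sort(key=lambda r: r["severity"])   # primary key second; ties keep timestamp order

print([r["id"] for r in reports])  # [3, 1, 2] -- the two "high" reports stay in timestamp order
```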
### Bubble Sort

#### Conceptual Overview

Imagine a sequence of numbers. Starting from the beginning of the sequence, we compare each pair of adjacent elements and swap them whenever they are out of order, so the largest remaining value "bubbles up" to the end of the array on every pass.

#### Steps

1. Compare the first and second elements; if the first is larger, swap them.
2. Move one position to the right and compare the next pair, swapping whenever they are out of order.
3. Continue comparing and swapping adjacent pairs until the end of the array is reached.
4. After the first pass, the largest item will be at the last position. On the next pass, you can ignore the last item and consider the rest of the array.
5. Continue this process for `n-1` passes to ensure the array is completely sorted.

An important optimization for bubble sort is to keep track of whether any swaps were made during a pass. If a pass completes without any swaps, it means the array is already sorted, and there's no need to continue further iterations.
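
As a sketch of how that flag fits in, here is a minimal Python version of bubble sort with the early-exit optimization (the function name is illustrative):

```python
def bubble_sort(arr):
    """Sort arr in place, stopping early once a pass makes no swaps."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in their final places.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # a full pass with no swaps: the array is sorted
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```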

#### Stability

Bubble sort is stable. This means that two objects with equal keys will retain their relative order after sorting.

#### Time Complexity
- In the **worst-case** scenario, the time complexity of bubble sort is $O(n^2)$, which occurs when the array is in reverse order.
- The **average-case** time complexity is also $O(n^2)$, as bubble sort generally requires quadratic time for typical unsorted arrays.
- In the **best-case** scenario, the time complexity is $O(n)$, which happens when the array is already sorted, especially if an optimization like early exit is implemented.
#### Space Complexity

$O(1)$ – Bubble sort works in place, requiring only a constant amount of extra space.

### Selection Sort

#### Conceptual Overview

Consider an array of numbers. The algorithm divides the array into two parts: a sorted subarray built up from left to right, and an unsorted subarray holding the remaining elements. Each pass selects the smallest element of the unsorted part and moves it to the end of the sorted part.

#### Steps

1. Assume the first element of the unsorted subarray is the minimum.
2. Scan the remaining unsorted elements; whenever a smaller one is found, remember its position as the new minimum.
3. Swap the minimum with the first element of the unsorted subarray.
4. Move the boundary of the sorted and unsorted subarrays one element to the right.
5. Repeat steps 1-4 until the entire array is sorted.

```
Start:
[ 64 ][ 25 ][ 12 ][ 22 ][ 11 ]

Pass 1: find min(64,25,12,22,11)=11, swap with first element
[ 11 ][ 25 ][ 12 ][ 22 ][ 64 ]

Pass 2: find min(25,12,22,64)=12, swap with second element
[ 11 ][ 12 ][ 25 ][ 22 ][ 64 ]

Pass 3: find min(25,22,64)=22, swap with third element
[ 11 ][ 12 ][ 22 ][ 25 ][ 64 ]

Pass 4: find min(25,64)=25, swap with fourth element (self-swap)
[ 11 ][ 12 ][ 22 ][ 25 ][ 64 ]

Pass 5: only one element remains, already in place
[ 11 ][ 12 ][ 22 ][ 25 ][ 64 ]

Result:
[ 11 ][ 12 ][ 22 ][ 25 ][ 64 ]
```
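
The passes traced above can be expressed as a short Python sketch:

```python
def selection_sort(arr):
    """Repeatedly select the minimum of the unsorted suffix and swap it into place."""
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # May be a self-swap, as in pass 4 of the trace above.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```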
#### Stability
Selection sort is inherently unstable. When two elements have equal keys, their relative order might change post-sorting. This can be problematic in scenarios where stability is crucial.
#### Time Complexity
- In the **worst-case**, the time complexity is $O(n^2)$, as even if the array is already sorted, the algorithm still iterates through every element to find the smallest.
- The **average-case** time complexity is also $O(n^2)$, since the algorithm's performance generally remains quadratic regardless of input arrangement.
- In the **best-case**, the time complexity is still $O(n^2)$, unlike other algorithms, because selection sort always performs the same number of comparisons, regardless of the input's initial order.
#### Space Complexity

$O(1)$ – Selection sort works in place, requiring only a constant amount of extra space.

### Insertion Sort

#### Conceptual Overview

Imagine you have a series of numbers. The algorithm begins with the second element, treating the first element as a trivially sorted subarray, and inserts each subsequent element into its correct place within that growing sorted portion.

#### Steps

1. Take the second element of the array as the current element (the "key").
2. Compare the key with the elements in the sorted portion to its left.
3. Shift each element that is larger than the key one position to the right.
4. Insert the current element into the correct position so that the elements before are all smaller.
5. Repeat steps 2-4 for each element in the array.

```
Start:
[ 12 ][ 11 ][ 13 ][ 5 ][ 6 ]

Pass 1: key = 11, insert into [12]
[ 11 ][ 12 ][ 13 ][ 5 ][ 6 ]

Pass 2: key = 13, stays in place
[ 11 ][ 12 ][ 13 ][ 5 ][ 6 ]

Pass 3: key = 5, insert into [11,12,13]
[ 5 ][ 11 ][ 12 ][ 13 ][ 6 ]

Pass 4: key = 6, insert into [5,11,12,13]
[ 5 ][ 6 ][ 11 ][ 12 ][ 13 ]

Result:
[ 5 ][ 6 ][ 11 ][ 12 ][ 13 ]
```
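
The same shift-and-insert procedure, written as a short Python sketch:

```python
def insertion_sort(arr):
    """Grow a sorted prefix by inserting each key at its correct position."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements one slot right to open a gap for the key.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]
```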
#### Stability
Insertion sort is stable. When two elements have equal keys, their relative order remains unchanged post-sorting. This stability is preserved since the algorithm only swaps elements if they are out of order, ensuring that equal elements never overtake each other.
#### Time Complexity
- In the **worst-case**, the time complexity is $O(n^2)$, which happens when the array is in reverse order, requiring every element to be compared with every other element.
- The **average-case** time complexity is $O(n^2)$, as elements generally need to be compared with others, leading to quadratic performance.
- In the **best-case**, the time complexity is $O(n)$, occurring when the array is already sorted, allowing the algorithm to simply pass through the array once without making any swaps.
#### Space Complexity

$O(1)$ – Insertion sort works in place, requiring only a constant amount of extra space.

### Quick Sort

#### Conceptual Overview

Quick Sort, often simply referred to as "quicksort", is a divide-and-conquer algorithm: it picks a pivot element, partitions the array around that pivot, and then sorts the two partitions recursively.

#### Steps

1. Choose a pivot element from the array.
2. Partition the array so that every element smaller than the pivot ends up to its left and every element greater than the pivot to its right; the pivot is now in its final position.
3. Recursively apply steps 1 and 2 to the left and right partitions.
4. Repeat until base case: the partition has only one or zero elements.
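
The steps above leave the pivot strategy open, so the following Python sketch uses one common choice (Lomuto partitioning with the last element as the pivot); treat it as an illustration rather than the only way to implement quicksort:

```python
def quicksort(arr, lo=0, hi=None):
    """In-place quicksort using Lomuto partitioning (last element as pivot)."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)  # pivot lands at its final index p
        quicksort(arr, lo, p - 1)   # sort the left partition
        quicksort(arr, p + 1, hi)   # sort the right partition
    return arr

def partition(arr, lo, hi):
    pivot = arr[hi]
    i = lo  # next slot for an element smaller than the pivot
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place the pivot between the partitions
    return i

print(quicksort([3, 6, 2, 7, 1]))  # [1, 2, 3, 6, 7]
```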

#### Stability

Quick sort is inherently unstable due to the long-distance exchanges of values. However, with specific modifications, it can be made stable, although this is not commonly done.
#### Time Complexity
- In the **worst-case**, the time complexity is $O(n^2)$, which can occur when the pivot is the smallest or largest element, resulting in highly unbalanced partitions. However, with effective pivot selection strategies, this scenario is rare in practice.
- The **average-case** time complexity is $O(n \log n)$, which is expected when using a good pivot selection method that balances the partitions reasonably well.
- In the **best-case**, the time complexity is also $O(n \log n)$, occurring when each pivot divides the array into two roughly equal-sized parts, leading to optimal partitioning.
#### Space Complexity

$O(\log n)$ – The partitioning is done in place, but the recursion stack consumes logarithmic space on average, and up to $O(n)$ in the worst case of highly unbalanced partitions.

### Heap Sort

Heap Sort is a comparison-based sorting technique performed on a binary heap data structure.
#### Conceptual Overview
1. The first step is to **build a max heap**, which involves transforming the list into a max heap (a complete binary tree where each node is greater than or equal to its children). This is typically achieved using a bottom-up approach to ensure the heap property is satisfied. *(Building the heap with Floyd's bottom-up procedure costs $\Theta(n)$ time, lower than $\Theta(n \log n)$, so it never dominates the overall running time.)*
2. During **sorting**, the maximum element (the root of the heap) is swapped with the last element of the unsorted portion of the array, placing the largest element in its final position. **After each swap, the newly “fixed” maximum stays at the end of the *same* array; the active heap is simply the prefix that remains unsorted.** The heap size is then reduced by one, and the unsorted portion is restructured into a max heap. This process continues until the heap size is reduced to one, completing the sort.
#### Steps

1. Build a max heap from the input array.
2. Swap the root of the heap (the largest element) with the last element of the heap.
3. Reduce the heap size by one, leaving the swapped-out maximum in its final position.
4. "Heapify" the root of the tree, i.e., ensure the heap property is maintained.
5. Repeat steps 2-4 until the size of the heap is one.

```
Initial array (size n = 5)       index: 0 1 2 3 4

        4                        [4,10,3,5,1]
       / \
     10   3
     / \
    5   1

  ↓ BUILD MAX-HEAP (Θ(n))  ->  heapSize = 5

       10                        [10,5,3,4,1]
       / \
      5   3
     / \
    4   1
```

**Pass 1 extract-max**

```
swap 10 ↔ 1          [1,5,3,4 | 10]   heapSize = 4
                     ↑ live heap ↑     ↑fixed↑
heapify (1↔5, 1↔4) → [5,4,3,1 | 10]

        5
       / \
      4   3
     /
    1
```

**Pass 2 extract-max**

```
swap 5 ↔ 1           [1,4,3 | 5,10]   heapSize = 3
heapify (1↔4)      → [4,1,3 | 5,10]

        4
       / \
      1   3
```

**Pass 3 extract-max**

```
swap 4 ↔ 3           [3,1 | 4,5,10]   heapSize = 2
(no heapify needed – root already ≥ child)

        3
       /
      1
```

**Pass 4 extract-max**

```
swap 3 ↔ 1           [1 | 3,4,5,10]   heapSize = 1
(heap of size 1 is trivially a heap)
```

**Pass 5 extract-max**

```
Done – heapSize = 0
Sorted array: [1,3,4,5,10]
```
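
Mirroring the trace above, here is a Python sketch with an iterative `sift_down`, which keeps the auxiliary space constant (see the note under Space Complexity below):

```python
def heap_sort(arr):
    """In-place heap sort; the live heap is always the unsorted prefix of arr."""
    n = len(arr)
    # Build a max heap bottom-up (Floyd's method), starting at the last parent.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(arr, i, n)
    # Repeatedly swap the max to the end of the shrinking heap prefix.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(arr, 0, end)
    return arr

def sift_down(arr, root, size):
    """Iteratively restore the max-heap property below `root` (O(1) extra space)."""
    while True:
        child = 2 * root + 1  # left child
        if child >= size:
            break
        if child + 1 < size and arr[child + 1] > arr[child]:
            child += 1        # prefer the larger child
        if arr[root] >= arr[child]:
            break
        arr[root], arr[child] = arr[child], arr[root]
        root = child

print(heap_sort([4, 10, 3, 5, 1]))  # [1, 3, 4, 5, 10]
```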
#### Stability
Heap sort is inherently unstable. Similar to quicksort, the relative order of equal items is not preserved because of the long-distance exchanges.
#### Time Complexity
- In the **worst-case**, the time complexity is $O(n \log n)$, regardless of the arrangement of the input data.
- The **average-case** time complexity is also $O(n \log n)$, as the algorithm's structure ensures consistent performance.
- In the **best-case**, the time complexity remains $O(n \log n)$, since building and deconstructing the heap is still necessary, even if the input is already partially sorted.
#### Space Complexity
$O(1)$ – The sorting is done in-place, requiring only a constant amount of auxiliary space. **This assumes an *iterative* `siftDown`/`heapify`; a recursive version would add an $O(\log n)$ call stack.**
0 commit comments