@@ -76,7 +76,7 @@ Insertion never needs more than 2 rotations. Removal might require up to __log(n)__ rotations.
## The code
-Most of the code in [AVLTree.swift](AVLTree.swift) is just regular [binary search tree](../Binary Search Tree/) stuff. You'll find this in any implementation of a binary search tree. For example, searching the tree is exactly the same. The only things that an AVL tree does slightly differently are inserting and deleting the nodes.
+Most of the code in [AVLTree.swift](AVLTree.swift) is just regular [binary search tree](../Binary%20Search%20Tree/) stuff. You'll find this in any implementation of a binary search tree. For example, searching the tree is exactly the same. The only things that an AVL tree does slightly differently are inserting and deleting the nodes.
> **Note:** If you're a bit fuzzy on the regular operations of a binary search tree, I suggest you [catch up on those first](../Binary%20Search%20Tree/). It will make the rest of the AVL tree easier to understand.
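A quick way to see what the balancing machinery tracks: every node's *balance factor* (the height of its left subtree minus the height of its right subtree) must stay in the range -1...1. The sketch below is hypothetical helper code, not the node type or functions from AVLTree.swift:

```swift
// Hypothetical minimal node type; these names do not come from AVLTree.swift.
final class Node {
    var value: Int
    var left: Node?
    var right: Node?
    init(_ value: Int) { self.value = value }
}

// Height of a subtree: an empty subtree has height 0.
func height(_ node: Node?) -> Int {
    guard let node = node else { return 0 }
    return 1 + max(height(node.left), height(node.right))
}

// Balance factor: the AVL invariant requires this to stay in -1...1.
// A value of 2 or -2 after an insert or delete triggers a rotation.
func balanceFactor(_ node: Node?) -> Int {
    guard let node = node else { return 0 }
    return height(node.left) - height(node.right)
}
```

A node with balance factor -2 (right-heavy) or 2 (left-heavy) is exactly the situation the rotations fix.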
Bounded Priority Queue/README.markdown (1 addition & 1 deletion)
@@ -26,7 +26,7 @@ Suppose that we wish to insert the element `G` with priority 0.1 into this BPQ.
## Implementation
-While a [heap](../Heap/) may be a really simple implementation for a priority queue, a sorted [linked list](../Linked List/) allows for **O(k)** insertion and **O(1)** deletion, where **k** is the bounding number of elements.
+While a [heap](../Heap/) may be a really simple implementation for a priority queue, a sorted [linked list](../Linked%20List/) allows for **O(k)** insertion and **O(1)** deletion, where **k** is the bounding number of elements.
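To make the **O(k)**/**O(1)** claim concrete, here is a hedged sketch of a bounded priority queue backed by a sorted singly linked list. The types and names are illustrative, not the ones from the repo's BoundedPriorityQueue implementation, and it assumes higher priority values dequeue first and `maxElements >= 1`:

```swift
// Hypothetical node type for the sorted list; highest priority at the head.
final class LinkedNode<T> {
    let value: T
    let priority: Double
    var next: LinkedNode<T>?
    init(_ value: T, priority: Double) {
        self.value = value
        self.priority = priority
    }
}

final class BoundedPQ<T> {
    private var head: LinkedNode<T>?
    private var count = 0
    private let maxCount: Int
    init(maxElements: Int) { maxCount = maxElements }

    // O(k): walk at most k nodes to find the insertion spot, then evict
    // the lowest-priority (tail) element if the queue overflows.
    func enqueue(_ value: T, priority: Double) {
        let node = LinkedNode(value, priority: priority)
        if head == nil || priority > head!.priority {
            node.next = head
            head = node
        } else {
            var current = head!
            while let next = current.next, next.priority >= priority {
                current = next
            }
            node.next = current.next
            current.next = node
        }
        count += 1
        if count > maxCount {
            var current = head!
            while current.next?.next != nil { current = current.next! }
            current.next = nil   // drop the tail
            count -= 1
        }
    }

    // O(1): the best element is always at the head.
    func dequeue() -> T? {
        guard let node = head else { return nil }
        head = node.next
        count -= 1
        return node.value
    }
}
```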
-Notice that the helper functions `leftBoundary()` and `rightBoundary()` are very similar to the [binary search](../Binary Search/) algorithm. The big difference is that they don't stop when they find the search key, but keep going.
+Notice that the helper functions `leftBoundary()` and `rightBoundary()` are very similar to the [binary search](../Binary%20Search/) algorithm. The big difference is that they don't stop when they find the search key, but keep going.
To test this algorithm, copy the code to a playground and then do:
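The playground code itself is elided in this diff, but the boundary idea can be sketched independently. The functions below are hypothetical stand-ins for `leftBoundary()` and `rightBoundary()` and may differ from the repo's versions; each one keeps halving even after it sees the key, so it lands exactly on the edge of the run of equal keys:

```swift
// First index whose element is >= key. Note: a[mid] == key does NOT stop
// the loop; it only narrows the right half, so we end on the left edge.
func leftBoundary(of key: Int, in a: [Int]) -> Int {
    var low = 0, high = a.count
    while low < high {
        let mid = low + (high - low) / 2
        if a[mid] < key { low = mid + 1 } else { high = mid }
    }
    return low
}

// Index just past the last element that is <= key.
func rightBoundary(of key: Int, in a: [Int]) -> Int {
    var low = 0, high = a.count
    while low < high {
        let mid = low + (high - low) / 2
        if a[mid] > key { high = mid } else { low = mid + 1 }
    }
    return low
}
```

Counting occurrences of a key is then simply `rightBoundary(...) - leftBoundary(...)`.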
-Where a [breadth-first search](../Breadth-First Search/) visits all immediate neighbors first, a depth-first search tries to go as deep down the tree or graph as it can.
+Where a [breadth-first search](../Breadth-First%20Search/) visits all immediate neighbors first, a depth-first search tries to go as deep down the tree or graph as it can.
Put this code in a playground and test it like so:
@@ -71,13 +71,13 @@ print(nodesExplored)
This will output: `["a", "b", "d", "e", "h", "f", "g", "c"]`
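The traversal that produces an ordering like this can be sketched without the repo's `Graph` and `Node` types. This hypothetical version walks a plain adjacency dictionary, always descending into an unvisited neighbor before moving sideways:

```swift
// Recursive depth-first search over an adjacency dictionary.
// Returns the vertices in the order they were first visited.
func depthFirstSearch(_ graph: [String: [String]], start: String) -> [String] {
    var visited: Set<String> = [start]
    var order = [start]
    func explore(_ vertex: String) {
        for neighbor in graph[vertex, default: []] where !visited.contains(neighbor) {
            visited.insert(neighbor)
            order.append(neighbor)
            explore(neighbor)   // go deep before trying the next neighbor
        }
    }
    explore(start)
    return order
}
```

With `["a": ["b", "c"], "b": ["d"], "c": [], "d": []]` the search dives `a → b → d` before backtracking to visit `c`.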
## What is DFS good for?
Depth-first search can be used to solve many problems, for example:
* Finding connected components of a sparse graph
-*[Topological sorting](../Topological Sort/) of nodes in a graph
+* [Topological sorting](../Topological%20Sort/) of nodes in a graph
* Finding bridges of a graph (see: [Bridges](https://en.wikipedia.org/wiki/Bridge_(graph_theory)#Bridge-finding_algorithm))
Deque/README.markdown (21 additions & 21 deletions)
@@ -9,43 +9,43 @@ Here is a very basic implementation of a deque in Swift:
```swift
public struct Deque<T> {
  private var array = [T]()

  public var isEmpty: Bool {
    return array.isEmpty
  }

  public var count: Int {
    return array.count
  }

  public mutating func enqueue(_ element: T) {
    array.append(element)
  }

  public mutating func enqueueFront(_ element: T) {
    array.insert(element, at: 0)
  }

  public mutating func dequeue() -> T? {
    if isEmpty {
      return nil
    } else {
      return array.removeFirst()
    }
  }

  public mutating func dequeueBack() -> T? {
    if isEmpty {
      return nil
    } else {
      return array.removeLast()
    }
  }

  public func peekFront() -> T? {
    return array.first
  }

  public func peekBack() -> T? {
    return array.last
  }
}
```
@@ -73,7 +73,7 @@ deque.dequeue() // 5
This particular implementation of `Deque` is simple but not very efficient. Several operations are **O(n)**, notably `enqueueFront()` and `dequeue()`. I've included it only to show the principle of what a deque does.
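To see where the **O(n)** cost comes from, compare the two ends of a plain Swift array; this is just an illustration, not code from Deque.swift:

```swift
// Front operations shift every remaining element; back operations don't.
var array = [1, 2, 3, 4, 5]
let front = array.removeFirst()   // shifts 2, 3, 4, 5 down by one slot: O(n)
let back = array.removeLast()     // just shrinks the array: O(1) amortized
```

This asymmetry is exactly why the improved version below reserves free space at the *front* of the array too.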
## A more efficient version
The reason that `dequeue()` and `enqueueFront()` are **O(n)** is that they work on the front of the array. If you remove an element at the front of an array, what happens is that all the remaining elements need to be shifted in memory.
Let's say the deque's array contains the following items:
@@ -92,7 +92,7 @@ Likewise, inserting an element at the front of the array is expensive because it
First, the elements `2`, `3`, and `4` are moved up by one position in the computer's memory, and then the new element `5` is inserted at the position where `2` used to be.
Why is this not an issue for `enqueue()` and `dequeueBack()`? Well, these operations are performed at the end of the array. The way resizable arrays are implemented in Swift is by reserving a certain amount of free space at the back.
Our initial array `[ 1, 2, 3, 4]` actually looks like this in memory:
@@ -120,26 +120,26 @@ public struct Deque<T> {
```swift
  private var head: Int
  private var capacity: Int
  private let originalCapacity: Int

  public init(_ capacity: Int = 10) {
    self.capacity = max(capacity, 1)
    originalCapacity = self.capacity
    array = [T?](repeating: nil, count: capacity)
    head = capacity
  }

  public var isEmpty: Bool {
    return count == 0
  }

  public var count: Int {
    return array.count - head
  }

  public mutating func enqueue(_ element: T) {
    array.append(element)
  }

  public mutating func enqueueFront(_ element: T) {
    // this is explained below
  }
```
@@ -155,15 +155,15 @@ public struct Deque<T> {
```swift
      return array.removeLast()
    }
  }

  public func peekFront() -> T? {
    if isEmpty {
      return nil
    } else {
      return array[head]
    }
  }

  public func peekBack() -> T? {
    if isEmpty {
      return nil
```
@@ -176,7 +176,7 @@ public struct Deque<T> {
It still largely looks the same -- `enqueue()` and `dequeueBack()` haven't changed -- but there are also a few important differences. The array now stores objects of type `T?` instead of just `T` because we need some way to mark array elements as being empty.
The `init` method allocates a new array that contains a certain number of `nil` values. This is the free room we have to work with at the beginning of the array. By default this creates 10 empty spots.
The `head` variable is the index in the array of the front-most object. Since the queue is currently empty, `head` points at an index beyond the end of the array.
@@ -219,7 +219,7 @@ Notice how the array has resized itself. There was no room to add the `1`, so Sw
> **Note:** You won't see those empty spots at the back of the array when you `print(deque.array)`. This is because Swift hides them from you. Only the ones at the front of the array show up.
The `dequeue()` method does the opposite of `enqueueFront()`, it reads the value at `head`, sets the array element back to `nil`, and then moves `head` one position to the right:
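That step can be sketched in isolation. The struct below is a hypothetical stand-in for the relevant pieces of `Deque`, showing only the `head` bookkeeping described above:

```swift
// Illustrative only: the array stores T? so emptied slots can become nil.
struct FrontBuffer<T> {
    var array: [T?]
    var head: Int

    // Read the value at head, clear the slot, move head one to the right.
    mutating func dequeue() -> T? {
        guard head < array.count, let element = array[head] else { return nil }
        array[head] = nil
        head += 1
        return element
    }
}
```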
@@ -250,7 +250,7 @@ There is one tiny problem... If you enqueue a lot of objects at the front, you're
If `head` equals 0, there is no room left at the front. When that happens, we add a whole bunch of new `nil` elements to the array. This is an **O(n)** operation but since this cost gets divided over all the `enqueueFront()`s, each individual call to `enqueueFront()` is still **O(1)** on average.
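Putting the pieces together, `enqueueFront(_:)` under this scheme likely looks something like the sketch below. The exact resizing policy in the repo's Deque.swift may differ; the doubling and the block of fresh `nil` slots are the important parts:

```swift
// Hypothetical stand-in for the front-growing part of Deque.
struct GrowingFront<T> {
    var array: [T?]
    var head: Int
    var capacity: Int

    mutating func enqueueFront(_ element: T) {
        if head == 0 {
            // No room at the front: double the reserved capacity and
            // prepend that many empty slots in one O(n) move.
            capacity *= 2
            let emptySpace = [T?](repeating: nil, count: capacity)
            array.insert(contentsOf: emptySpace, at: 0)
            head = capacity
        }
        head -= 1
        array[head] = element   // the common O(1) path
    }
}
```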
> **Note:** We also multiply the capacity by 2 each time this happens, so if your queue grows bigger and bigger, the resizing happens less often. This is also what Swift arrays automatically do at the back.
@@ -302,7 +302,7 @@ This way we can strike a balance between fast enqueuing and dequeuing at the fro
## See also
-Other ways to implement deque are by using a [doubly linked list](../Linked List/), a [circular buffer](../Ring Buffer/), or two [stacks](../Stack/) facing opposite directions.
+Other ways to implement a deque are by using a [doubly linked list](../Linked%20List/), a [circular buffer](../Ring%20Buffer/), or two [stacks](../Stack/) facing opposite directions.
[A fully-featured deque implementation in Swift](https://github.com/lorentey/Deque)
Graph/README.markdown (3 additions & 3 deletions)
@@ -28,7 +28,7 @@ The following are also graphs:

-On the left is a [tree](../Tree/) structure, on the right a [linked list](../Linked List/). Both can be considered graphs, but in a simpler form. After all, they have vertices (nodes) and edges (links).
+On the left is a [tree](../Tree/) structure, on the right a [linked list](../Linked%20List/). Both can be considered graphs, but in a simpler form. After all, they have vertices (nodes) and edges (links).
The very first graph I showed you contained *cycles*, where you can start off at a vertex, follow a path, and come back to the original vertex. A tree is a graph without such cycles.
@@ -42,15 +42,15 @@ Like a tree this does not have any cycles in it (no matter where you start, ther
Maybe you're shrugging your shoulders and thinking, what's the big deal? Well, it turns out that graphs are an extremely useful data structure.
-If you have some programming problem where you can represent some of your data as vertices and some of it as edges between those vertices, then you can draw your problem as a graph and use well-known graph algorithms such as [breadth-first search](../Breadth-First Search/) or [depth-first search](../Depth-First Search) to find a solution.
+If you have some programming problem where you can represent some of your data as vertices and some of it as edges between those vertices, then you can draw your problem as a graph and use well-known graph algorithms such as [breadth-first search](../Breadth-First%20Search/) or [depth-first search](../Depth-First%20Search/) to find a solution.
For example, let's say you have a list of tasks where some tasks have to wait on others before they can begin. You can model this using an acyclic directed graph:

Each vertex represents a task. Here, an edge between two vertices means that the source task must be completed before the destination task can start. So task C cannot start before B and D are finished, and neither B nor D can start before A is finished.
-Now that the problem is expressed using a graph, you can use a depth-first search to perform a [topological sort](../Topological Sort/). This will put the tasks in an optimal order so that you minimize the time spent waiting for tasks to complete. (One possible order here is A, B, D, E, C, F, G, H, I, J, K.)
+Now that the problem is expressed using a graph, you can use a depth-first search to perform a [topological sort](../Topological%20Sort/). This will put the tasks in an optimal order so that you minimize the time spent waiting for tasks to complete. (One possible order here is A, B, D, E, C, F, G, H, I, J, K.)
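As a hedged sketch of that idea (using a plain adjacency dictionary rather than the repo's `Graph` type): run a post-order depth-first search and reverse the finish order. Every task then appears before the tasks that depend on it:

```swift
// DFS-based topological sort for an acyclic directed graph.
func topologicalSort(_ graph: [String: [String]]) -> [String] {
    var visited = Set<String>()
    var order = [String]()
    func visit(_ vertex: String) {
        guard !visited.contains(vertex) else { return }
        visited.insert(vertex)
        for next in graph[vertex, default: []] { visit(next) }
        order.append(vertex)   // appended only after all its descendants
    }
    for vertex in graph.keys.sorted() { visit(vertex) }
    return Array(order.reversed())
}
```

For the task example: if A must finish before B and D, and both B and D before C, any returned order places A first and C last.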
Whenever you're faced with a tough programming problem, ask yourself, "how can I express this problem using a graph?" Graphs are all about representing relationships between your data. The trick is in how you define "relationship".
Heap Sort/README.markdown (1 addition & 1 deletion)
@@ -40,7 +40,7 @@ And fix up the heap to make it valid max-heap again:
As you can see, the largest items are making their way to the back. We repeat this process until we arrive at the root node and then the whole array is sorted.
-> **Note:** This process is very similar to [selection sort](../Selection Sort/), which repeatedly looks for the minimum item in the remainder of the array. Extracting the minimum or maximum value is what heaps are good at.
+> **Note:** This process is very similar to [selection sort](../Selection%20Sort/), which repeatedly looks for the minimum item in the remainder of the array. Extracting the minimum or maximum value is what heaps are good at.
Performance of heap sort is **O(n lg n)** in best, worst, and average case. Because we modify the array directly, heap sort can be performed in-place. But it is not a stable sort: the relative order of identical elements is not preserved.
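Here is a compact, self-contained sketch of the algorithm described above. It is not the repo's HeapSort.swift, but it shows the two phases: heapify, then repeatedly swap the maximum to the back and sift the new root down:

```swift
// In-place heap sort on an Int array.
func heapSort(_ a: inout [Int]) {
    // Restore the max-heap property for the subtree rooted at `start`,
    // treating indices >= end as outside the heap.
    func siftDown(_ start: Int, _ end: Int) {
        var root = start
        while true {
            var largest = root
            let left = 2 * root + 1
            let right = left + 1
            if left < end && a[left] > a[largest] { largest = left }
            if right < end && a[right] > a[largest] { largest = right }
            if largest == root { return }
            a.swapAt(root, largest)
            root = largest
        }
    }
    // Phase 1, heapify: sift down every internal node, deepest first.
    for i in stride(from: a.count / 2 - 1, through: 0, by: -1) {
        siftDown(i, a.count)
    }
    // Phase 2: move the max to the back and shrink the heap by one.
    for end in stride(from: a.count - 1, through: 1, by: -1) {
        a.swapAt(0, end)
        siftDown(0, end)
    }
}
```

Both phases work directly on the array, which is why no extra storage is needed; the swap in phase 2 is also what destroys stability.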