Commit 5c7175e

Parva9eh authored
Merge branch 'master' into patch-3
2 parents da0be01 + 52a266e commit 5c7175e

24 files changed: +946 −110 lines

AVL Tree/README.markdown

Lines changed: 2 additions & 2 deletions

@@ -53,7 +53,7 @@ For the rotation we're using the terminology:
 * *RotationSubtree* - subtree of the *Pivot* upon the side of rotation
 * *OppositeSubtree* - subtree of the *Pivot* opposite the side of rotation
 
-Let take an example of balancing the unbalanced tree using *Right* (clockwise direction) rotation: 
+Let take an example of balancing the unbalanced tree using *Right* (clockwise direction) rotation:
 
 ![Rotation1](Images/RotationStep1.jpg) ![Rotation2](Images/RotationStep2.jpg) ![Rotation3](Images/RotationStep3.jpg)
 
@@ -76,7 +76,7 @@ Insertion never needs more than 2 rotations. Removal might require up to __log(n
 
 ## The code
 
-Most of the code in [AVLTree.swift](AVLTree.swift) is just regular [binary search tree](../Binary Search Tree/) stuff. You'll find this in any implementation of a binary search tree. For example, searching the tree is exactly the same. The only things that an AVL tree does slightly differently are inserting and deleting the nodes.
+Most of the code in [AVLTree.swift](AVLTree.swift) is just regular [binary search tree](../Binary%20Search%20Tree/) stuff. You'll find this in any implementation of a binary search tree. For example, searching the tree is exactly the same. The only things that an AVL tree does slightly differently are inserting and deleting the nodes.
 
 > **Note:** If you're a bit fuzzy on the regular operations of a binary search tree, I suggest you [catch up on those first](../Binary%20Search%20Tree/). It will make the rest of the AVL tree easier to understand.
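The rotation terminology quoted in this hunk (*Root*, *Pivot*, *OppositeSubtree*) can be sketched on a minimal node type. This is a hypothetical illustration, not the repository's `AVLTree.swift`:

```swift
// Minimal sketch of a right (clockwise) rotation using the quoted terms:
// the unbalanced node is the root of the subtree, its left child is the
// Pivot, and the Pivot's right subtree (the OppositeSubtree) is re-attached
// as the old root's new left subtree.
final class Node {
    var value: Int
    var left: Node?
    var right: Node?
    init(_ value: Int, left: Node? = nil, right: Node? = nil) {
        self.value = value
        self.left = left
        self.right = right
    }
}

func rotateRight(_ root: Node) -> Node {
    guard let pivot = root.left else { return root }  // nothing to rotate
    root.left = pivot.right   // OppositeSubtree moves under the old root
    pivot.right = root        // the old root becomes the pivot's right child
    return pivot              // the pivot is the subtree's new root
}

// Left spine 3 -> 2 -> 1 is unbalanced; after the rotation, 2 is the root.
let unbalanced = Node(3, left: Node(2, left: Node(1)))
let balanced = rotateRight(unbalanced)
// balanced.value == 2, balanced.left?.value == 1, balanced.right?.value == 3
```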

Binary Search Tree/README.markdown

Lines changed: 1 addition & 1 deletion

@@ -195,7 +195,7 @@ For convenience, let's add an init method that calls `insert()` for all the elem
     precondition(array.count > 0)
     self.init(value: array.first!)
     for v in array.dropFirst() {
-      insert(v, parent: self)
+      insert(value: v)
     }
   }
 ```
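The convenience initializer in the hunk above can be exercised on a minimal tree type. This is a simplified, hypothetical stand-in for the repository's `BinarySearchTree`, showing the new `insert(value:)` call shape:

```swift
final class BinarySearchTree {
    var value: Int
    var left: BinarySearchTree?
    var right: BinarySearchTree?

    init(value: Int) { self.value = value }

    // Same shape as the quoted hunk: seed the root with the first
    // element, then insert(value:) the rest.
    convenience init(array: [Int]) {
        precondition(array.count > 0)
        self.init(value: array.first!)
        for v in array.dropFirst() {
            insert(value: v)
        }
    }

    func insert(value: Int) {
        if value < self.value {
            if let left = left { left.insert(value: value) }
            else { left = BinarySearchTree(value: value) }
        } else {
            if let right = right { right.insert(value: value) }
            else { right = BinarySearchTree(value: value) }
        }
    }
}

let tree = BinarySearchTree(array: [7, 2, 5, 10, 9, 1])
// tree.value == 7, tree.left?.value == 2, tree.right?.value == 10
```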

Bounded Priority Queue/README.markdown

Lines changed: 1 addition & 1 deletion

@@ -26,7 +26,7 @@ Suppose that we wish to insert the element `G` with priority 0.1 into this BPQ.
 
 ## Implementation
 
-While a [heap](../Heap/) may be a really simple implementation for a priority queue, a sorted [linked list](../Linked List/) allows for **O(k)** insertion and **O(1)** deletion, where **k** is the bounding number of elements.
+While a [heap](../Heap/) may be a really simple implementation for a priority queue, a sorted [linked list](../Linked%20List/) allows for **O(k)** insertion and **O(1)** deletion, where **k** is the bounding number of elements.
 
 Here's how you could implement it in Swift:
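The bounded-insert idea described in this hunk (walk at most **k** elements to find the slot, reject or evict when full) can be sketched on a sorted array stand-in. The repository uses a linked list; the type and method names below are hypothetical:

```swift
// Sketch of a bounded priority queue: elements stay sorted by priority
// (highest first), and the structure never holds more than `bound` items.
struct BoundedPriorityQueue {
    let bound: Int
    var elements: [(name: String, priority: Double)] = []

    mutating func enqueue(_ name: String, priority: Double) {
        // Walk to the insertion point: O(k) where k is the bound.
        let index = elements.firstIndex { priority > $0.priority } ?? elements.count
        guard index < bound else { return }       // worse than everything kept
        elements.insert((name, priority), at: index)
        if elements.count > bound {
            elements.removeLast()                 // evict the lowest priority
        }
    }
}

var queue = BoundedPriorityQueue(bound: 3)
queue.enqueue("A", priority: 0.9)
queue.enqueue("B", priority: 0.5)
queue.enqueue("C", priority: 0.7)
queue.enqueue("G", priority: 0.1)   // rejected: the queue is full of better items
// queue.elements.map { $0.name } == ["A", "C", "B"]
```

This mirrors the section's `G` example: an element whose priority is below everything already stored in a full queue is simply dropped.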

Count Occurrences/README.markdown

Lines changed: 3 additions & 3 deletions

@@ -36,7 +36,7 @@ func countOccurrencesOfKey(_ key: Int, inArray a: [Int]) -> Int {
     }
     return low
   }
-  
+
   func rightBoundary() -> Int {
     var low = 0
     var high = a.count
@@ -50,12 +50,12 @@ func countOccurrencesOfKey(_ key: Int, inArray a: [Int]) -> Int {
     }
     return low
   }
-  
+
   return rightBoundary() - leftBoundary()
 }
 ```
 
-Notice that the helper functions `leftBoundary()` and `rightBoundary()` are very similar to the [binary search](../Binary Search/) algorithm. The big difference is that they don't stop when they find the search key, but keep going.
+Notice that the helper functions `leftBoundary()` and `rightBoundary()` are very similar to the [binary search](../Binary%20Search/) algorithm. The big difference is that they don't stop when they find the search key, but keep going.
 
 To test this algorithm, copy the code to a playground and then do:
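The hunks above only show fragments of the function. Pieced together, the boundary-search idea looks roughly like this (a reconstruction consistent with the quoted context lines, not the repository file verbatim):

```swift
func countOccurrencesOfKey(_ key: Int, inArray a: [Int]) -> Int {
  // First index whose value is >= key.
  func leftBoundary() -> Int {
    var low = 0
    var high = a.count
    while low < high {
      let midIndex = low + (high - low) / 2
      if a[midIndex] < key {
        low = midIndex + 1
      } else {
        high = midIndex
      }
    }
    return low
  }

  // First index whose value is > key; unlike plain binary search,
  // finding the key does not stop the loop.
  func rightBoundary() -> Int {
    var low = 0
    var high = a.count
    while low < high {
      let midIndex = low + (high - low) / 2
      if a[midIndex] > key {
        high = midIndex
      } else {
        low = midIndex + 1
      }
    }
    return low
  }

  return rightBoundary() - leftBoundary()
}

let a = [0, 1, 1, 3, 3, 3, 3, 6, 8, 10, 11, 11]
let occurrences = countOccurrencesOfKey(3, inArray: a)  // 4
```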

Depth-First Search/README.markdown

Lines changed: 3 additions & 3 deletions

@@ -40,7 +40,7 @@ func depthFirstSearch(_ graph: Graph, source: Node) -> [String] {
 }
 ```
 
-Where a [breadth-first search](../Breadth-First Search/) visits all immediate neighbors first, a depth-first search tries to go as deep down the tree or graph as it can.
+Where a [breadth-first search](../Breadth-First%20Search/) visits all immediate neighbors first, a depth-first search tries to go as deep down the tree or graph as it can.
 
 Put this code in a playground and test it like so:
 
@@ -71,13 +71,13 @@ print(nodesExplored)
 ```
 
 This will output: `["a", "b", "d", "e", "h", "f", "g", "c"]`
-
+
 ## What is DFS good for?
 
 Depth-first search can be used to solve many problems, for example:
 
 * Finding connected components of a sparse graph
-* [Topological sorting](../Topological Sort/) of nodes in a graph
+* [Topological sorting](../Topological%20Sort/) of nodes in a graph
 * Finding bridges of a graph (see: [Bridges](https://en.wikipedia.org/wiki/Bridge_(graph_theory)#Bridge-finding_algorithm))
 * And lots of others!
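The traversal this section describes can be sketched with a plain adjacency dictionary, a hypothetical stand-in for the repository's `Graph` and `Node` types (the graph below is also made up, so its output differs from the section's):

```swift
// Depth-first search over an adjacency list: visit a node, then recurse
// into each unvisited neighbor before moving on to its siblings.
func depthFirstSearch(_ graph: [String: [String]], source: String) -> [String] {
    var nodesExplored = [source]
    var visited: Set<String> = [source]

    func visit(_ node: String) {
        for neighbor in graph[node, default: []] where !visited.contains(neighbor) {
            visited.insert(neighbor)
            nodesExplored.append(neighbor)
            visit(neighbor)        // go deep before touching the next sibling
        }
    }
    visit(source)
    return nodesExplored
}

// A small tree-shaped graph; the deep path under "b" is exhausted
// before "c" is ever visited.
let graph = [
    "a": ["b", "c"],
    "b": ["d", "e"],
    "e": ["h"],
    "c": ["f", "g"],
]
let explored = depthFirstSearch(graph, source: "a")
// ["a", "b", "d", "e", "h", "c", "f", "g"]
```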

Deque/README.markdown

Lines changed: 21 additions & 21 deletions

@@ -9,43 +9,43 @@ Here is a very basic implementation of a deque in Swift:
 ```swift
 public struct Deque<T> {
   private var array = [T]()
-  
+
   public var isEmpty: Bool {
     return array.isEmpty
   }
-  
+
   public var count: Int {
     return array.count
   }
-  
+
   public mutating func enqueue(_ element: T) {
     array.append(element)
   }
-  
+
   public mutating func enqueueFront(_ element: T) {
     array.insert(element, atIndex: 0)
   }
-  
+
   public mutating func dequeue() -> T? {
     if isEmpty {
       return nil
     } else {
       return array.removeFirst()
     }
   }
-  
+
   public mutating func dequeueBack() -> T? {
     if isEmpty {
       return nil
     } else {
       return array.removeLast()
     }
   }
-  
+
   public func peekFront() -> T? {
     return array.first
   }
-  
+
   public func peekBack() -> T? {
     return array.last
   }
@@ -73,7 +73,7 @@ deque.dequeue() // 5
 This particular implementation of `Deque` is simple but not very efficient. Several operations are **O(n)**, notably `enqueueFront()` and `dequeue()`. I've included it only to show the principle of what a deque does.
 
 ## A more efficient version
-
+
 The reason that `dequeue()` and `enqueueFront()` are **O(n)** is that they work on the front of the array. If you remove an element at the front of an array, what happens is that all the remaining elements need to be shifted in memory.
 
 Let's say the deque's array contains the following items:
@@ -92,7 +92,7 @@ Likewise, inserting an element at the front of the array is expensive because it
 
 First, the elements `2`, `3`, and `4` are moved up by one position in the computer's memory, and then the new element `5` is inserted at the position where `2` used to be.
 
-Why is this not an issue at for `enqueue()` and `dequeueBack()`? Well, these operations are performed at the end of the array. The way resizable arrays are implemented in Swift is by reserving a certain amount of free space at the back. 
+Why is this not an issue at for `enqueue()` and `dequeueBack()`? Well, these operations are performed at the end of the array. The way resizable arrays are implemented in Swift is by reserving a certain amount of free space at the back.
 
 Our initial array `[ 1, 2, 3, 4]` actually looks like this in memory:
 
@@ -120,26 +120,26 @@ public struct Deque<T> {
   private var head: Int
   private var capacity: Int
   private let originalCapacity:Int
-  
+
   public init(_ capacity: Int = 10) {
     self.capacity = max(capacity, 1)
     originalCapacity = self.capacity
    array = [T?](repeating: nil, count: capacity)
     head = capacity
   }
-  
+
   public var isEmpty: Bool {
     return count == 0
   }
-  
+
   public var count: Int {
     return array.count - head
   }
-  
+
   public mutating func enqueue(_ element: T) {
     array.append(element)
   }
-  
+
   public mutating func enqueueFront(_ element: T) {
     // this is explained below
   }
@@ -155,15 +155,15 @@ public struct Deque<T> {
       return array.removeLast()
     }
   }
-  
+
   public func peekFront() -> T? {
     if isEmpty {
       return nil
     } else {
       return array[head]
     }
   }
-  
+
   public func peekBack() -> T? {
     if isEmpty {
       return nil
@@ -176,7 +176,7 @@ public struct Deque<T> {
 
 It still largely looks the same -- `enqueue()` and `dequeueBack()` haven't changed -- but there are also a few important differences. The array now stores objects of type `T?` instead of just `T` because we need some way to mark array elements as being empty.
 
-The `init` method allocates a new array that contains a certain number of `nil` values. This is the free room we have to work with at the beginning of the array. By default this creates 10 empty spots. 
+The `init` method allocates a new array that contains a certain number of `nil` values. This is the free room we have to work with at the beginning of the array. By default this creates 10 empty spots.
 
 The `head` variable is the index in the array of the front-most object. Since the queue is currently empty, `head` points at an index beyond the end of the array.
 
@@ -219,7 +219,7 @@ Notice how the array has resized itself. There was no room to add the `1`, so Sw
   |
   head
 
-> **Note:** You won't see those empty spots at the back of the array when you `print(deque.array)`. This is because Swift hides them from you. Only the ones at the front of the array show up. 
+> **Note:** You won't see those empty spots at the back of the array when you `print(deque.array)`. This is because Swift hides them from you. Only the ones at the front of the array show up.
 
 The `dequeue()` method does the opposite of `enqueueFront()`, it reads the value at `head`, sets the array element back to `nil`, and then moves `head` one position to the right:
 
@@ -250,7 +250,7 @@ There is one tiny problem... If you enqueue a lot of objects at the front, you'r
   }
 ```
 
-If `head` equals 0, there is no room left at the front. When that happens, we add a whole bunch of new `nil` elements to the array. This is an **O(n)** operation but since this cost gets divided over all the `enqueueFront()`s, each individual call to `enqueueFront()` is still **O(1)** on average. 
+If `head` equals 0, there is no room left at the front. When that happens, we add a whole bunch of new `nil` elements to the array. This is an **O(n)** operation but since this cost gets divided over all the `enqueueFront()`s, each individual call to `enqueueFront()` is still **O(1)** on average.
 
 > **Note:** We also multiply the capacity by 2 each time this happens, so if your queue grows bigger and bigger, the resizing happens less often. This is also what Swift arrays automatically do at the back.
 
@@ -302,7 +302,7 @@ This way we can strike a balance between fast enqueuing and dequeuing at the fro
 
 ## See also
 
-Other ways to implement deque are by using a [doubly linked list](../Linked List/), a [circular buffer](../Ring Buffer/), or two [stacks](../Stack/) facing opposite directions.
+Other ways to implement deque are by using a [doubly linked list](../Linked%20List/), a [circular buffer](../Ring%20Buffer/), or two [stacks](../Stack/) facing opposite directions.
 
 [A fully-featured deque implementation in Swift](https://github.com/lorentey/Deque)
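The commit page only shows fragments of the efficient version, and `enqueueFront()` is "explained below" off-screen. Assembled from the described head-index mechanism, a trimmed sketch might look like this (a reconstruction under the stated assumptions, omitting `dequeueBack()`, the peeks, and the shrinking logic; not the repository file verbatim):

```swift
// Sketch of the head-index trick: keep free nil slots at the front of the
// backing array, move `head` left on enqueueFront and right on dequeue.
public struct Deque<T> {
    private var array: [T?]
    private var head: Int
    private var capacity: Int

    public init(_ capacity: Int = 10) {
        self.capacity = max(capacity, 1)
        array = [T?](repeating: nil, count: self.capacity)
        head = self.capacity   // empty queue: head points past the end
    }

    public var isEmpty: Bool { return count == 0 }
    public var count: Int { return array.count - head }

    public mutating func enqueue(_ element: T) {
        array.append(element)              // O(1) amortized, at the back
    }

    public mutating func enqueueFront(_ element: T) {
        if head == 0 {
            // No room left at the front: double the free space with nil
            // slots. O(n), but amortized O(1) per enqueueFront.
            capacity *= 2
            let emptySpace = [T?](repeating: nil, count: capacity)
            array.insert(contentsOf: emptySpace, at: 0)
            head = capacity
        }
        head -= 1
        array[head] = element
    }

    public mutating func dequeue() -> T? {
        guard head < array.count, let element = array[head] else { return nil }
        array[head] = nil                  // clear the slot
        head += 1                          // front moves one to the right
        return element
    }
}

var deque = Deque<Int>()
deque.enqueue(2)
deque.enqueueFront(1)
deque.dequeue()       // Optional(1)
deque.dequeue()       // Optional(2)
```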

Graph/README.markdown

Lines changed: 3 additions & 3 deletions

@@ -28,7 +28,7 @@ The following are also graphs:
 
 ![Tree and linked list](Images/TreeAndList.png)
 
-On the left is a [tree](../Tree/) structure, on the right a [linked list](../Linked List/). Both can be considered graphs, but in a simpler form. After all, they have vertices (nodes) and edges (links).
+On the left is a [tree](../Tree/) structure, on the right a [linked list](../Linked%20List/). Both can be considered graphs, but in a simpler form. After all, they have vertices (nodes) and edges (links).
 
 The very first graph I showed you contained *cycles*, where you can start off at a vertex, follow a path, and come back to the original vertex. A tree is a graph without such cycles.
 
@@ -42,15 +42,15 @@ Like a tree this does not have any cycles in it (no matter where you start, ther
 
 Maybe you're shrugging your shoulders and thinking, what's the big deal? Well, it turns out that graphs are an extremely useful data structure.
 
-If you have some programming problem where you can represent some of your data as vertices and some of it as edges between those vertices, then you can draw your problem as a graph and use well-known graph algorithms such as [breadth-first search](../Breadth-First Search/) or [depth-first search](../Depth-First Search) to find a solution.
+If you have some programming problem where you can represent some of your data as vertices and some of it as edges between those vertices, then you can draw your problem as a graph and use well-known graph algorithms such as [breadth-first search](../Breadth-First%20Search/) or [depth-first search](../Depth-First%20Search) to find a solution.
 
 For example, let's say you have a list of tasks where some tasks have to wait on others before they can begin. You can model this using an acyclic directed graph:
 
 ![Tasks as a graph](Images/Tasks.png)
 
 Each vertex represents a task. Here, an edge between two vertices means that the source task must be completed before the destination task can start. So task C cannot start before B and D are finished, and B nor D can start before A is finished.
 
-Now that the problem is expressed using a graph, you can use a depth-first search to perform a [topological sort](../Topological Sort/). This will put the tasks in an optimal order so that you minimize the time spent waiting for tasks to complete. (One possible order here is A, B, D, E, C, F, G, H, I, J, K.)
+Now that the problem is expressed using a graph, you can use a depth-first search to perform a [topological sort](../Topological%20Sort/). This will put the tasks in an optimal order so that you minimize the time spent waiting for tasks to complete. (One possible order here is A, B, D, E, C, F, G, H, I, J, K.)
 
 Whenever you're faced with a tough programming problem, ask yourself, "how can I express this problem using a graph?" Graphs are all about representing relationships between your data. The trick is in how you define "relationship".
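The task-dependency idea in this section can be sketched as a DFS-based topological sort over a plain adjacency dictionary. The four-task graph below is a hypothetical, smaller stand-in for the section's A..K example:

```swift
// Edge "A" -> "B" means task A must finish before task B can start.
// A depth-first post-order, reversed, yields a valid topological order.
func topologicalSort(_ graph: [String: [String]], nodes: [String]) -> [String] {
    var visited = Set<String>()
    var order: [String] = []

    func visit(_ node: String) {
        guard visited.insert(node).inserted else { return }
        for next in graph[node, default: []] { visit(next) }
        order.append(node)   // post-order: everything this task blocks is placed
    }
    for node in nodes { visit(node) }
    order.reverse()
    return order
}

// C waits on both B and D; B and D both wait on A.
let tasks = ["A": ["B", "D"], "B": ["C"], "D": ["C"], "C": []]
let taskOrder = topologicalSort(tasks, nodes: ["A", "B", "C", "D"])
// ["A", "D", "B", "C"]: A comes first, C last
```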

Heap Sort/README.markdown

Lines changed: 1 addition & 1 deletion

@@ -40,7 +40,7 @@ And fix up the heap to make it valid max-heap again:
 
 As you can see, the largest items are making their way to the back. We repeat this process until we arrive at the root node and then the whole array is sorted.
 
-> **Note:** This process is very similar to [selection sort](../Selection Sort/), which repeatedly looks for the minimum item in the remainder of the array. Extracting the minimum or maximum value is what heaps are good at.
+> **Note:** This process is very similar to [selection sort](../Selection%20Sort/), which repeatedly looks for the minimum item in the remainder of the array. Extracting the minimum or maximum value is what heaps are good at.
 
 Performance of heap sort is **O(n lg n)** in best, worst, and average case. Because we modify the array directly, heap sort can be performed in-place. But it is not a stable sort: the relative order of identical elements is not preserved.
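The swap-and-fix-up process this section describes can be sketched as an in-place heapsort on an `Int` array (a generic illustration of the technique, not the repository's `HeapSort.swift`):

```swift
// Heapsort: build a max-heap, then repeatedly swap the max (root) with the
// last unsorted element, shrink the heap by one, and sift the root down.
func heapsort(_ a: inout [Int]) {
    func siftDown(_ start: Int, _ end: Int) {
        var root = start
        while 2 * root + 1 < end {
            var child = 2 * root + 1
            if child + 1 < end && a[child] < a[child + 1] {
                child += 1                      // pick the larger child
            }
            if a[root] >= a[child] { return }   // heap property restored
            a.swapAt(root, child)
            root = child
        }
    }
    // Build the max-heap bottom-up from the last internal node.
    for i in stride(from: a.count / 2 - 1, through: 0, by: -1) {
        siftDown(i, a.count)
    }
    // Largest items make their way to the back, one per iteration.
    for end in stride(from: a.count - 1, through: 1, by: -1) {
        a.swapAt(0, end)
        siftDown(0, end)
    }
}

var numbers = [5, 13, 2, 25, 7, 17, 20, 8, 4]
heapsort(&numbers)
// numbers == [2, 4, 5, 7, 8, 13, 17, 20, 25]
```

Sorting happens entirely inside the input array, which is why heap sort is in-place; the two `swapAt` sites are also where stability is lost.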

0 commit comments