
What is a quick sort?

Updated: 8/11/2023

Wiki User

10y ago

Best Answer

Quicksort is a recursive algorithm.

Given an array and indices to the lower and upper bounds of a subset within the array, divide the subset into two subsets around a pivot value, then recursively apply the algorithm to each of these subsets until a subset has fewer than 2 elements.

The algorithm begins by checking the size of the given subset. If there are fewer than 2 elements then the subset is sorted. This represents the end condition for this instance of the algorithm since a subset of 0 or 1 elements can always be regarded as being sorted.

If the set has 2 or more elements, however, a pivot element is selected. The pivot is typically the median of the first, last and middle elements. The pivot is then swapped with the last element so it's out of the way.

Two indices, left and right, are then initialised: one referring to the first element (the lower bound), the other to the penultimate element (we ignore the pivot at the end for now).

While the left index is less than the right index, repeat:

1. While left is less than right and the value referred to by left is less than the pivot, increment left. (In other words, search for the first value from the left but no further than the right that is not less than the pivot).

2. While right is greater than left and the value referred to by right is not less than the pivot, decrement right. (In other words, search for the first value from the right but no further than the left index that is less than the pivot).

3. If the left and right indices are not equal, swap the values at those indices.

When the indices meet, or left crosses right, the loop ends.

Swap the value at the left index with the last element (the pivot).

At this point, the pivot is now in place and all values less than the pivot are to its left, with all others to its right. The values on either side are not yet sorted, thus we recursively invoke the same algorithm to sort each of these two subsets (excluding the pivot, of course).

Since the set is divided into two smaller subsets upon each invocation of the algorithm, each recursion takes less time because there are fewer elements to divide. The recursions effectively form a binary tree with one recursion per node. Some nodes will terminate earlier than others so the tree will probably be unbalanced. But since the sorting is done in-place, each recursion will unwind automatically leaving behind a sorted subset within the set. When the initial instance of the algorithm terminates, the entire set is sorted.
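For concreteness, the description above might be coded in C roughly as follows. This is only an illustrative sketch, not a canonical implementation; the names swap_int and partition_mo3 are invented for the example:

    #include <stdio.h>

    static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Partition a[lo..hi] around a median-of-three pivot and return the
       pivot's final index. */
    static int partition_mo3(int a[], int lo, int hi)
    {
        int mid = lo + (hi - lo) / 2;

        /* Order a[lo] <= a[mid] <= a[hi] so a[mid] is the median of the three. */
        if (a[mid] < a[lo]) swap_int(&a[mid], &a[lo]);
        if (a[hi]  < a[lo]) swap_int(&a[hi],  &a[lo]);
        if (a[hi]  < a[mid]) swap_int(&a[hi],  &a[mid]);

        swap_int(&a[mid], &a[hi]);          /* park the pivot at the end, out of the way */
        int pivot = a[hi];

        int left = lo, right = hi - 1;      /* right starts at the penultimate element */
        while (left < right) {
            while (left < right && a[left] < pivot)   left++;   /* first value not less than the pivot */
            while (right > left && a[right] >= pivot) right--;  /* first value less than the pivot */
            if (left != right) swap_int(&a[left], &a[right]);
        }
        swap_int(&a[left], &a[hi]);         /* drop the pivot into its final place */
        return left;
    }

    void quicksort(int a[], int lo, int hi)
    {
        if (lo >= hi) return;               /* fewer than 2 elements: already sorted */
        int p = partition_mo3(a, lo, hi);
        quicksort(a, lo, p - 1);            /* sort the values left of the pivot */
        quicksort(a, p + 1, hi);            /* sort the values right of the pivot */
    }

    int main(void)
    {
        int a[] = { 5, 2, 9, 1, 5, 6, 0, 3 };
        int n = (int)(sizeof a / sizeof a[0]);
        quicksort(a, 0, n - 1);
        for (int i = 0; i < n; i++) printf("%d ", a[i]);
        printf("\n");
        return 0;
    }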


Wiki User

9y ago
More answers

Wiki User

8y ago

Merge sort works by dividing and subdividing the set until every division has just one element. Every division can now be regarded as being sorted, because a set of one is always a sorted set. We then merge adjacent pairs of sorted divisions to create a larger sorted division. We merge by comparing the first elements from each division, removing the lower value and adding it to the new division, repeating until one division is empty and then appending the remainder of the other. We do this for all pairs of adjacent divisions, then merge these larger divisions together in the same way, repeating until there is only one division containing the sorted set.
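As a rough illustration only, a top-down (recursive) version of this idea might look like the following in C; the function names and the scratch buffer are assumptions for the example, and a bottom-up version that iteratively merges adjacent runs works just as well:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Merge the sorted runs a[lo..mid] and a[mid+1..hi] via the scratch buffer. */
    static void merge(int a[], int scratch[], int lo, int mid, int hi)
    {
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi)                 /* take the smaller front element */
            scratch[k++] = (a[j] < a[i]) ? a[j++] : a[i++];
        while (i <= mid) scratch[k++] = a[i++];     /* drain whichever run remains */
        while (j <= hi)  scratch[k++] = a[j++];
        memcpy(&a[lo], &scratch[lo], (size_t)(hi - lo + 1) * sizeof a[0]);
    }

    static void merge_sort(int a[], int scratch[], int lo, int hi)
    {
        if (lo >= hi) return;                       /* one element (or none) is already sorted */
        int mid = lo + (hi - lo) / 2;
        merge_sort(a, scratch, lo, mid);            /* sort the left half */
        merge_sort(a, scratch, mid + 1, hi);        /* sort the right half */
        merge(a, scratch, lo, mid, hi);             /* merge the two sorted halves */
    }

    int main(void)
    {
        int a[] = { 8, 3, 5, 1, 9, 2, 7 };
        int n = (int)(sizeof a / sizeof a[0]);
        int *scratch = malloc((size_t)n * sizeof a[0]);
        if (!scratch) return 1;
        merge_sort(a, scratch, 0, n - 1);
        for (int i = 0; i < n; i++) printf("%d ", a[i]);
        printf("\n");
        free(scratch);
        return 0;
    }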


Wiki User

7y ago

Quicksort is one of many sorting algorithms and is primarily used to sort large sets of data; however, it is only suitable when the entire data set can be loaded into memory at once. The algorithm uses a recursive divide-and-conquer technique, splitting the set into two partitions around a pivot value. One partition holds all the values that are less than the pivot, the other holds all the values that are greater than or equal to the pivot (logically, those that are not less than the pivot). The algorithm is then applied recursively to each partition. The end case is reached when a partition has fewer than two elements, because a set of one can always be regarded as being a sorted set, as can an empty set.

The algorithm can be expressed as follows (in pseudocode):

algorithm quicksort (A, lo, hi) is
    if lo < hi then
    {
        p := partition (A, lo, hi)
        quicksort (A, lo, p-1)
        quicksort (A, p+1, hi)
    }

Each time we invoke the partition algorithm, the pivot value (A[p]) is in its correct place. All other values are in the correct partition but are not necessarily sorted. Thus we recursively invoke the quicksort algorithm upon each of those partitions.

The partition algorithm does most of the actual work and can be expressed as follows:

algorithm partition (A, lo, hi) is
    pivot := A[hi]
    i := lo
    for j := lo to hi - 1 do
        if A[j] <= pivot then
        {
            swap A[i] with A[j]
            i := i + 1
        }
    swap A[i] with A[hi]
    return i

This particular scheme is attributed to Nico Lomuto and is simple to implement; however, the number of swaps required to place values on the correct side of the pivot makes it inefficient, particularly when the set is already sorted (every value is less than or equal to the pivot and will therefore swap with itself).
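A direct, purely illustrative C rendering of the Lomuto scheme might look like this (swap_int is a trivial helper, as in the earlier sketch, and no driver is shown):

    static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    static int partition_lomuto(int a[], int lo, int hi)
    {
        int pivot = a[hi];                  /* the last element is the pivot */
        int i = lo;                         /* next slot for a value <= pivot */
        for (int j = lo; j < hi; j++) {
            if (a[j] <= pivot) {
                swap_int(&a[i], &a[j]);
                i++;
            }
        }
        swap_int(&a[i], &a[hi]);            /* the pivot lands between the two partitions */
        return i;
    }

    static void quicksort_lomuto(int a[], int lo, int hi)
    {
        if (lo < hi) {
            int p = partition_lomuto(a, lo, hi);
            quicksort_lomuto(a, lo, p - 1);
            quicksort_lomuto(a, p + 1, hi);
        }
    }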

The partition algorithm is essential to efficient performance, thus there are many variations. In-place schemes such as these produce unstable sorts, where equal values are not guaranteed to come out in the same order they were input. Stability matters when data may be sorted by a choice of keys: the chosen key becomes the primary key for the sort, while the primary key from a previous sort becomes the secondary key. With stable sorts we do not have to consider secondary keys, but with unstable sorts we do, and the more keys there are to consider, the more complex the algorithm becomes.

C. A. R. Hoare proposed an unstable partitioning algorithm using two indices rather than one. Starting from each end of the unsorted array, we work inwards until we find two inverted values, which we then swap. We continue in this manner until the two indices meet, and the meeting index is then returned to the quicksort algorithm. Note that with this scheme the pivot does not necessarily end up at the returned index, so the recursive calls become quicksort (A, lo, p) and quicksort (A, p+1, hi). The algorithm can be expressed as follows:

algorithm partition (A, lo, hi) is
    pivot := A[lo]
    i := lo - 1
    j := hi + 1
    loop forever
        do i := i + 1 while A[i] < pivot
        do j := j - 1 while A[j] > pivot
        if i >= j then return j
        swap A[i] with A[j]

This algorithm performs around three times fewer swaps than Lomuto's scheme on average.
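Again purely as a sketch, Hoare's scheme might be written in C as below (swap_int as before). The recursive calls use (lo, j) and (j+1, hi), because with this partition the pivot is not guaranteed to sit at the returned index:

    static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    static int partition_hoare(int a[], int lo, int hi)
    {
        int pivot = a[lo];
        int i = lo - 1, j = hi + 1;
        for (;;) {
            do { i++; } while (a[i] < pivot);   /* scan right for a value not less than the pivot */
            do { j--; } while (a[j] > pivot);   /* scan left for a value not greater than the pivot */
            if (i >= j) return j;               /* the indices have met or crossed */
            swap_int(&a[i], &a[j]);             /* the pair is inverted, so swap it */
        }
    }

    static void quicksort_hoare(int a[], int lo, int hi)
    {
        if (lo < hi) {
            int j = partition_hoare(a, lo, hi);
            quicksort_hoare(a, lo, j);
            quicksort_hoare(a, j + 1, hi);
        }
    }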

Choosing a good pivot value is key to efficient performance. Ideally we want to split the array into two equal-sized partitions, which means the pivot needs to be the median value; however, to determine the true median we would have to sort the array, which rather defeats the purpose. We can approximate the median by selecting the median of the first, last and middle elements. We cannot guarantee this will always give an even split, but we can guarantee that there will always be at least one element in each partition.

Another improvement we can make is to place all values equal to the pivot in a third partition between the other two. We then return both the lower and upper bounds of this partition. This changes the quicksort algorithm as follows:

algorithm quicksort (A, lo, hi) is
    if lo < hi then
    {
        p, q := partition (A, lo, hi)    // note: partition returns TWO values
        quicksort (A, lo, p-1)
        quicksort (A, q+1, hi)
    }
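One common way to build such a three-way split is the "Dutch national flag" partition. The following C sketch shows one possible shape for it; since a C function returns a single value, the two bounds come back through pointer parameters, and the names partition3 and quicksort3 are invented for the example:

    static void swap_int(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Afterwards: a[lo..*p-1] < pivot, a[*p..*q] == pivot, a[*q+1..hi] > pivot. */
    static void partition3(int a[], int lo, int hi, int *p, int *q)
    {
        int pivot = a[hi];
        int lt = lo, gt = hi, i = lo;
        while (i <= gt) {
            if (a[i] < pivot)      swap_int(&a[lt++], &a[i++]);
            else if (a[i] > pivot) swap_int(&a[i], &a[gt--]);   /* re-examine the value swapped in */
            else                   i++;
        }
        *p = lt;
        *q = gt;
    }

    static void quicksort3(int a[], int lo, int hi)
    {
        if (lo >= hi) return;
        int p, q;
        partition3(a, lo, hi, &p, &q);
        quicksort3(a, lo, p - 1);       /* values below the run of pivots */
        quicksort3(a, q + 1, hi);       /* values above the run of pivots */
    }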

We can also improve performance further by recursing only over the smaller of the two partitions and looping over the larger one (eliminating the tail call), which keeps the recursion depth down:

algorithm quicksort (A, lo, hi) is
    loop while lo < hi
        p, q := partition (A, lo, hi)    // note: partition returns TWO values
        if (p - lo) < (hi - q) then
            quicksort (A, lo, p-1)       // recurse into the smaller (left) partition...
            lo := q + 1                  // ...then loop over the larger (right) one
        else
            quicksort (A, q+1, hi)       // recurse into the smaller (right) partition...
            hi := p - 1                  // ...then loop over the larger (left) one


Wiki User

10y ago

It's a recursive sorting algorithm, so you need to understand recursion before you can understand this algorithm! The Wikipedia article summarizes it as follows:

1. Pick an element, called a pivot, from the list.

2. Reorder the list so that all elements with values less than the pivot come before the pivot, while all elements with values greater than the pivot come after it (equal values can go either way). After this partitioning, the pivot is in its final position. This is called the partition operation.

3. Recursively apply the above steps to the sub-list of elements with smaller values and separately the sub-list of elements with greater values.

Read the Wikipedia article on "Quick sort" for more details. Also, if you don't know much about "recursion" yet, be sure to read about recursion first.


Wiki User

13y ago

Quickly.

The internal workings of quicksort (qsort) are up to the implementer of the C library that provides qsort.
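Whatever the internal algorithm, the portable way to use it is through the standard interface: qsort() declared in <stdlib.h>, which takes the array, the element count, the element size and a comparison function. A minimal example:

    #include <stdio.h>
    #include <stdlib.h>

    /* Comparator for ints; must return negative, zero or positive. */
    static int cmp_int(const void *pa, const void *pb)
    {
        int a = *(const int *)pa;
        int b = *(const int *)pb;
        return (a > b) - (a < b);       /* avoids the overflow a plain a - b could cause */
    }

    int main(void)
    {
        int v[] = { 42, 7, 19, 3, 86, 11 };
        size_t n = sizeof v / sizeof v[0];
        qsort(v, n, sizeof v[0], cmp_int);
        for (size_t i = 0; i < n; i++) printf("%d ", v[i]);
        printf("\n");
        return 0;
    }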


Wiki User

12y ago

Although quicksort has a worst-case time complexity of O(n^2), it is very efficient for sorting large amounts of data in practice, in part because of its good locality of reference.


Wiki User

10y ago

3

