Bubble sort is easy to program, slow, and iterative. It compares neighboring numbers, swaps them if required, and repeats this procedure until there are no more swaps.
Quicksort is a little harder to program, fast, and recursive. A pivot number is selected, the other numbers are compared with it and shifted to its left or right depending on the comparison, and then the same method is applied recursively to the lists generated on each side of the pivot, selecting a new pivot within each list.
First we will define how each of them works, and then talk more about their performance.
Selection sort
We have an array of 10 (N = 10) integers with random numbers ranging from 1 to 10, and we need to sort them. Selection sort looks for the lowest number among all N spots, then puts that lowest number in the first spot of the array, moving the value that was in the first spot to where the lowest number used to be. It exchanges the two values. Then it does the same, but this time only with the remaining N - 1 elements, and so on.
Example (with 5 numbers):
2 8 4 6 3
2 8 4 6 3 /* first iteration, 2 is already the lowest number and it is in first spot, nothing to do */
2 3 4 6 8 /* We checked next N - 1 [8, 4, 6, 3] and lowest was 3, we exchanged values 3 <--> 8 */
2 3 4 6 8 /* 4 is the lowest, nothing to do */
2 3 4 6 8 /* same */
2 3 4 6 8 /* same; only one element remains, we are done. */
Note: It would be wise not to do any sorting when N = 1; there is nothing to sort with one element.
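The procedure above can be sketched in Python; this is a minimal illustration, not a tuned implementation:

```python
def selection_sort(a):
    """In-place selection sort: find the minimum of the unsorted
    suffix and swap it into position i."""
    n = len(a)
    for i in range(n - 1):              # a 1-element suffix is already sorted
        lowest = i
        for j in range(i + 1, n):       # scan the remaining N - i spots
            if a[j] < a[lowest]:
                lowest = j
        if lowest != i:                 # skip the swap when already in place
            a[i], a[lowest] = a[lowest], a[i]
    return a

print(selection_sort([2, 8, 4, 6, 3]))  # [2, 3, 4, 6, 8]
```

Note that, as in the example above, the first pass over [2, 8, 4, 6, 3] performs no swap because 2 is already the lowest.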
Bubble sort
Again we have an array of 10 integers. Bubble sort compares neighboring elements in the array, and if element n is greater than element n + 1, their values are exchanged. We repeat this pass N - 1 times over the full array.
Note: n stands for the element's position in the array.
Example (with 5 numbers):
2 8 4 6 3
2 8 4 6 3 /* 2 > 8 = NO */
2 4 8 6 3 /* 8 > 4 = YES */
2 4 6 8 3 /* 8 > 6 = YES */
2 4 6 3 8 /* 8 > 3 = YES */
2 4 6 3 8 /* 2 > 4 = NO */
2 4 6 3 8 /* 4 > 6 = NO */
2 4 3 6 8 /* 6 > 3 = YES */
2 4 3 6 8 /* 6 > 8 = NO */
2 4 3 6 8 /* 2 > 4 = NO */
2 3 4 6 8 /* 4 > 3 = YES */
...
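The passes traced above can be sketched in Python. This version adds a common optimization not shown in the trace: a flag that stops early once a full pass makes no swaps.

```python
def bubble_sort(a):
    """Bubble sort: repeatedly compare neighbors and swap when out
    of order, stopping once a pass makes no swaps."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):      # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                 # no swaps means the array is sorted
            break
    return a

print(bubble_sort([2, 8, 4, 6, 3]))  # [2, 3, 4, 6, 8]
```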
Performance
As you can see, both methods give the same result but work completely differently. Now it is time to ask which one requires more time/operations or memory to finish sorting. We talk about time/operations mostly in Big-O notation.
Selection sort:
Worst case: O(n²);
Average case: O(n²);
Best case: O(n²);
Bubble sort:
Worst case: O(n²);
Average case: O(n²);
Best case: O(n²) as described here; O(n) if the sort stops early when a pass makes no swaps;
Summary
Both are equally bad performance-wise; neither is an optimal solution. The only difference is the method by which they sort.
Insertion sort is stable, but it is only suitable for small data sets.
Quicksort is unstable, but it is well suited to large data sets.
Stability relates to elements of equal value. A stable sort keeps such elements in the order they were input; an unstable sort may change that order. This can matter when sorting elements that have the same sort key but are otherwise different values.
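A quick illustration of stability, using Python's built-in `sorted()` (which is guaranteed stable) on records that share a sort key:

```python
# Records share a sort key (the first field) but differ in the second.
records = [(2, "a"), (1, "b"), (2, "c"), (1, "d")]

# A stable sort keeps equal keys in input order: "b" before "d",
# "a" before "c". An unstable sort would be free to reorder them.
stable = sorted(records, key=lambda r: r[0])
print(stable)  # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]
```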
There is no single sorting algorithm that is ideally suited to all types of data.
By understanding the time and space complexities of sorting algorithms, you will better understand how a particular algorithm will scale with more data to sort.
* Bubble sort is O(N²). For N = 512 the number of operations should come out <= 512 * 512 = 262144.
* Quicksort is O(N log N) on average but can degenerate to N²/2 in the worst case (try an already-ordered data set on quicksort). Quicksort is recursive and needs a lot of stack space.
* Shell sort (named for Donald Shell) is less than O(N^(4/3)) for a typical implementation. Shell sort is iterative and doesn't require much extra memory.
* Merge sort is O(N log N) for all data sets, so while it is slower than the best case of quicksort, it has no degenerate cases. It needs additional storage equal to the size of the input array, and it is recursive, so it needs stack space.
* Heap sort is guaranteed O(N log N), doesn't degenerate like quicksort, and doesn't use extra memory like merge sort, but its inner loop does more work, so on average it is not as good as quicksort.
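Of the algorithms listed, merge sort is the easiest to sketch briefly; this is a simple out-of-place version in Python (the O(N) extra storage mentioned above shows up as the `merged` list):

```python
def merge_sort(a):
    """Merge sort: split in half, sort each half recursively, then
    merge the two sorted halves. O(N log N) for all inputs."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:         # <= keeps the merge stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever half remains

print(merge_sort([2, 8, 4, 6, 3]))  # [2, 3, 4, 6, 8]
```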
Bucket sort is a sorting algorithm that works by partitioning an array into a finite number of buckets. Each bucket is then sorted individually, either with a different sorting algorithm or by recursively applying bucket sort. A variation of this method called the single buffered count sort is reportedly faster than quicksort and takes about the same time to run on any set of data.
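A minimal bucket sort sketch in Python, assuming for simplicity that the inputs are floats in [0, 1) so the value itself picks the bucket:

```python
def bucket_sort(a, num_buckets=10):
    """Bucket sort for values in [0, 1): scatter values into buckets,
    sort each bucket individually, then concatenate the buckets."""
    buckets = [[] for _ in range(num_buckets)]
    for x in a:
        buckets[int(x * num_buckets)].append(x)  # value picks its bucket
    result = []
    for b in buckets:
        result.extend(sorted(b))   # any sort works per bucket
    return result

print(bucket_sort([0.42, 0.31, 0.73, 0.12, 0.31]))
# [0.12, 0.31, 0.31, 0.42, 0.73]
```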
If there were a way, it would be the new insertion sort! In theory you could reduce the time by using a linked list, searching for the position where each element needs to be inserted, and inserting it there. In practice, however, you would be better off simply using a different sort, especially if you don't want your data in a linked list. Selection sort is better when writing is expensive; quicksort and merge sort are faster on large data sets.
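For reference, this is the classic array-based insertion sort the paragraph is contrasting against; the shifting in the inner loop is the writing cost that makes selection sort attractive when writes are expensive:

```python
def insertion_sort(a):
    """In-place insertion sort: grow a sorted prefix, shifting larger
    elements right to open a slot for each new element."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]    # shift right: one write per element moved
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([2, 8, 4, 6, 3]))  # [2, 3, 4, 6, 8]
```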
Linked lists are not well suited to the quicksort algorithm because they do not provide constant-time random access. The most efficient way to quicksort a list is usually to move all the elements into an array, sort the array with quicksort, then move the elements back into the list. The two copies add O(n) overhead, but this is more than compensated for by the improved efficiency of sorting an array.
Quicksort is a popular algorithm for sorting items in software, aiming to finish in the smallest number of steps (shortest time) possible.
Quicksort is faster in practice than most other algorithms, though it is a comparison sort and not a stable sort. It uses O(n log n) comparisons on average to sort n items, and it works well with caches.
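The pivot-and-partition idea described earlier can be sketched in Python. This is the simple out-of-place form, not the in-place partitioning a production quicksort would use:

```python
def quicksort(a):
    """Quicksort: pick a pivot, partition the remaining elements into
    smaller and larger lists, and recurse on each side."""
    if len(a) <= 1:
        return a                         # nothing to sort
    pivot, rest = a[0], a[1:]            # first element as pivot (simplest choice)
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([2, 8, 4, 6, 3]))  # [2, 3, 4, 6, 8]
```

Taking the first element as pivot is what makes an already-ordered input degenerate to the N²/2 worst case mentioned above: one side of every partition is empty.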
A stack is not a way to perform quicksort; it is a tool used to implement recursion.
The traditional bubble sort moves elements at most one position toward the front per pass, while selection sort places exactly one element in its final position per pass. Both sorts require a quadratic amount of time, O(n²), to produce their results.