Q: Worst case of Quicksort algorithm?


Best Answer

The worst case occurs when the data is already sorted (or reverse-sorted) and the pivot is always taken from one end of the partition, such as the first or last element. The partitions are then maximally unbalanced, and the complexity degrades to O(n^2) instead of the well-known average of O(n log n).
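As a rough illustration (a sketch assuming the naive first-element pivot described above, not part of the original answer), the following Python function counts how many elements get compared against a pivot; on random input the count grows like n log n, while on already-sorted input it is exactly n(n-1)/2:

import random

# Sketch: quicksort with a naive first-element pivot, counting pivot comparisons.
def quicksort_count(items):
    if len(items) <= 1:
        return items, 0
    pivot, rest = items[0], items[1:]
    less = [x for x in rest if x <= pivot]
    greater = [x for x in rest if x > pivot]
    sorted_less, c_less = quicksort_count(less)
    sorted_greater, c_greater = quicksort_count(greater)
    # len(rest) elements were compared against the pivot at this level
    return sorted_less + [pivot] + sorted_greater, len(rest) + c_less + c_greater

n = 200
print(quicksort_count(random.sample(range(n), n))[1])  # roughly 2*n*ln(n): a few thousand
print(quicksort_count(list(range(n)))[1])              # exactly n*(n-1)/2 = 19900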


Wiki User

14y ago

Related questions

What is the difference between best worst and average case complexity of an algorithm?

These are terms given to the various scenarios which can be encountered by an algorithm.

The best case scenario for an algorithm is the arrangement of data for which this algorithm performs best. Take a binary search for example: the best case scenario for this search is that the target value is at the very center of the data you're searching, so the best case time complexity for this would be O(1).

The worst case scenario, on the other hand, describes the absolute worst set of input for a given algorithm. Let's look at quicksort, which can perform terribly if you always choose the smallest or largest element of a sublist for the pivot value. This will cause quicksort to degenerate to O(n^2).

Discounting the best and worst cases, we usually want to look at the average performance of an algorithm. These are the cases for which the algorithm performs "normally."
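For a concrete picture of the binary search example above (an illustrative sketch, not part of the original answer): when the target happens to sit at the middle index, the loop below finds it on the first probe, which is the O(1) best case; otherwise it takes up to about log2(n) probes.

# Sketch: iterative binary search over a sorted list.
# Best case: the target is at the middle index and is found on the first probe, O(1).
# Worst case: about log2(n) probes before the search ends, O(log n).
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

data = [1, 3, 5, 7, 9, 11, 13]
print(binary_search(data, 7))   # 7 is the middle element: found on the first probe
print(binary_search(data, 13))  # near the end: several probes are needed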


Define worst-case of an algorithm?

The worst case of an algorithm is the input (of a given size) on which the algorithm performs the most work. It is usually described asymptotically, as an upper bound such as O(n^2) for quicksort.


What is quicksort?

Quicksort is a popular algorithm for sorting items in software, aimed at completing the sort in as few steps (as little time) as possible; on average it needs only O(n log n) comparisons.


What is the worst case analysis for matrix multiplication algorithm?

O(n^3) for the standard (schoolbook) algorithm, which multiplies two n x n matrices with three nested loops.
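To make the n^3 figure concrete, here is a sketch of the schoolbook algorithm (assumed here; faster methods such as Strassen's exist), whose three nested loops always execute n*n*n times regardless of the input values:

# Sketch: schoolbook multiplication of two n x n matrices.
# The three nested loops run n * n * n times, so the cost is Theta(n^3)
# in the best, average, and worst case alike.
def matmul(a, b):
    n = len(a)
    c = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c[i][j] += a[i][k] * b[k][j]
    return c

print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]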


What is recursive call in terms of algorithm?

A recursive call in an algorithm is when a function (that implements this algorithm) calls itself. For example, Quicksort is a popular algorithm that is recursive. The recursive calls are seen in the last line of the pseudocode, where the quicksort function calls itself:

function quicksort('array')
    create empty lists 'less' and 'greater'
    if length('array') ≤ 1
        return 'array'  // an array of zero or one elements is already sorted
    select and remove a pivot value 'pivot' from 'array'
    for each 'x' in 'array'
        if 'x' ≤ 'pivot' then append 'x' to 'less'
        else append 'x' to 'greater'
    return concatenate(quicksort('less'), 'pivot', quicksort('greater'))


What is the big-O worst-case complexity of this algorithm?

Can't say without some detail about the algorithm in question.


What is the worst case and best case of bubble sort?

Bubble sort's worst case is O(n^2), which occurs when the input is in reverse order, because every pass has to swap its way through the whole remaining list. With the common optimization of stopping after a pass that makes no swaps, the best case is O(n), which occurs when the input is already sorted. (Merge sort, by contrast, takes the same O(n log n) number of steps in its best, worst and average cases.)
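Here is an illustrative sketch (not from the original answer) of bubble sort with the no-swap early exit; reverse-sorted input exercises the O(n^2) worst case, while already-sorted input finishes after a single pass:

# Sketch: bubble sort with an early-exit check.
# Worst case (reverse-sorted input): about n^2/2 comparisons and swaps -> O(n^2).
# Best case (already-sorted input): a single pass with no swaps -> O(n).
def bubble_sort(items):
    a = list(items)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps in this pass, so the list is already sorted
            break
    return a

print(bubble_sort([5, 4, 3, 2, 1]))  # worst case input
print(bubble_sort([1, 2, 3, 4, 5]))  # best case input: one pass, then exit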


Which algorithm has some average worst case and best case time?

All algorithms have a best, worst and average case. Algorithms that always perform in constant time have a best, worst and average of O(1).
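As a small illustration (not from the original answer), indexing into a Python list does the same constant amount of work no matter what the list holds, so its best, worst and average cases are all O(1):

# Sketch: index lookup is O(1) for any input, so all three cases coincide.
def third_element(items):
    return items[2]  # one constant-time operation regardless of len(items)

print(third_element([7, 8, 9]))               # small list
print(third_element(list(range(1000000))))    # large list: same amount of work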


How does the sorting algorithm Quicksort work?

Quicksort works by picking a pivot element, partitioning the remaining items into those less than the pivot and those greater than it, and then recursively sorting the two partitions. It is generally faster in practice than other comparison sorts, though it is not a stable sort. On average it uses O(n log n) comparisons to sort n items, and because it scans the data sequentially it works well with caches.
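For a concrete picture of the partition step, here is an illustrative sketch of an in-place quicksort using the Lomuto partition scheme (the scheme and the last-element pivot are assumptions made for illustration, not something the answer above specifies):

# Sketch: in-place quicksort with the Lomuto partition scheme.
def quicksort_inplace(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[hi]            # assumed pivot choice: the last element
    i = lo
    for j in range(lo, hi):  # move everything <= pivot toward the front
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]          # place the pivot between the two partitions
    quicksort_inplace(a, lo, i - 1)    # recursively sort the left partition
    quicksort_inplace(a, i + 1, hi)    # recursively sort the right partition

data = [9, 3, 7, 1, 8, 2, 5]
quicksort_inplace(data)
print(data)  # [1, 2, 3, 5, 7, 8, 9]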


What is the worst case running time of algorithm to delete each element from the linked list?

Linear time. O(n).
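As an illustrative sketch (assuming a singly linked list, which the question doesn't specify), deleting every element is one walk down the list, visiting each node exactly once:

# Sketch: deleting every node of a singly linked list in one pass.
# Each node is visited exactly once, so the running time is O(n) in every case.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def delete_all(head):
    while head is not None:
        nxt = head.next
        head.next = None   # unlink the node (it is then reclaimed by the garbage collector)
        head = nxt
    return None            # the list is now empty

head = Node(1, Node(2, Node(3)))
head = delete_all(head)
print(head)  # None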


What do you understand by complexity of sorting algorithms?

By understanding the time and space complexities of sorting algorithms, you will better understand how a particular algorithm will scale with increased data to sort.

* Bubble sort is O(N^2). For N = 512, the number of operations should come out <= 512 * 512 = 262144.
* Quicksort is O(N log N) on the average (roughly 2N log N operations) but can degenerate to about N^2/2 operations in the worst case (try an ordered data set on quicksort). Quicksort is recursive and needs a lot of stack space.
* Shell sort (named for Donald Shell) is less than O(N^(4/3)) for this implementation. Shell sort is iterative and doesn't require much extra memory.
* Merge sort is O(N log N) for all data sets, so while it is slower than the best case for quicksort, it doesn't have degenerate cases. It needs additional storage equal to the size of the input array, and it is recursive, so it needs stack space.
* Heap sort is guaranteed to be O(N log N); it doesn't degenerate like quicksort and doesn't use extra memory like merge sort, but its implementation involves more operations, so on average it's not as good as quicksort.
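To illustrate the merge sort point (a sketch, not part of the original answer): a top-down merge sort always splits in half and merges, so it does O(N log N) work on every input, but the merge step allocates temporary lists proportional to the input size.

# Sketch: top-down merge sort.
# The split/merge pattern does O(N log N) work for every input (no degenerate case),
# at the cost of temporary storage proportional to the input size.
def merge_sort(items):
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves into a new list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 1, 4, 2, 8, 3]))        # [1, 2, 3, 4, 5, 8]
print(merge_sort(list(range(10, 0, -1))))    # same amount of work on any ordering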


Inventions that begin with the letter q?

The quicksort algorithm is an invention beginning with Q. It was developed by Tony Hoare in 1960 while he was a visiting student at Moscow State University.