A variant of the algorithm efficiently computes the length of a longest increasing subsequence in a given array.
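The pile-based length computation can be sketched as follows (a minimal Python sketch; the function name and test values are illustrative, not from the source). Only the top card of each pile needs to be kept, and the tops stay sorted, so each card's pile is found by binary search:

```python
import bisect

def lis_length(seq):
    """Length of a longest (strictly) increasing subsequence,
    computed as the number of patience piles."""
    tops = []                            # tops[i]: current top card of pile i
    for x in seq:
        i = bisect.bisect_left(tops, x)  # leftmost pile whose top is >= x
        if i == len(tops):
            tops.append(x)               # no such pile: start a new one
        else:
            tops[i] = x                  # place x on that pile
    return len(tops)

print(lis_length([3, 1, 4, 1, 5, 9, 2, 6]))  # → 4 (e.g. 1, 4, 5, 9)
```

Each of the n cards costs one binary search over at most n pile tops, giving the O(n log n) worst case.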
In fact, when the input array is already sorted, all values form a single pile and both phases run in O(n) time.
The average-case complexity is still O(n log n): any uniformly random sequence of values will produce an expected number of Θ(√n) piles, which take O(n log n) time to produce and merge.
An evaluation of the practical performance of patience sort is given by Chandramouli and Goldstein, who show that a naive version is about ten to twenty times slower than a state-of-the-art quicksort on their benchmark problem.[1]
They attribute this to the relatively small amount of research put into patience sort, and develop several optimizations that bring its performance to within a factor of two of that of quicksort.
A variant of the algorithm achieves an O(n log log n) worst-case running time for putting the cards into piles, relying on a Van Emde Boas tree.
The difference from the patience sorting algorithm is that there is no requirement to place a new card on the leftmost pile where it is allowed.
Aldous and Diaconis suggest defining 9 or fewer piles as a winning outcome for n = 52, which happens with approximately 5% probability.
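Greedy leftmost placement minimizes the number of piles, and that minimum equals the length of a longest increasing subsequence of the deal, so the quoted probability can be checked by Monte Carlo simulation. The trial count and seed below are arbitrary choices for the sketch:

```python
import bisect
import random

def piles_for_deck(deck):
    # Greedy play: each card goes on the leftmost pile whose top is >= it,
    # so the pile tops stay sorted and the final count is the LIS length.
    tops = []
    for card in deck:
        i = bisect.bisect_left(tops, card)
        if i == len(tops):
            tops.append(card)
        else:
            tops[i] = card
    return len(tops)

random.seed(0)                        # arbitrary seed, for reproducibility
deck, trials = list(range(52)), 10_000
wins = sum(piles_for_deck(random.sample(deck, 52)) <= 9
           for _ in range(trials))
print(wins / trials)                  # empirical win rate; Aldous and Diaconis put it near 5%
```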
S. Bespamyatnikh and M. Segal[3] describe an efficient implementation of the algorithm that incurs no additional asymptotic cost over the sorting phase, since storing, creating, and traversing the back-pointers requires only linear time and space.
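One way to realize the back-pointer scheme (an illustrative sketch, not necessarily Bespamyatnikh and Segal's exact data layout): whenever a card is placed, record the index of the card currently on top of the pile to its left; walking those pointers from the top of the last pile yields a longest increasing subsequence in reverse.

```python
import bisect

def longest_increasing_subsequence(seq):
    # tops[i]      -> value currently on top of pile i
    # top_index[i] -> position in seq of that top card
    # prev[j]      -> back-pointer: index of the top of the pile to the
    #                 left of the one where seq[j] was placed
    tops, top_index, prev = [], [], [None] * len(seq)
    for j, x in enumerate(seq):
        i = bisect.bisect_left(tops, x)
        if i > 0:
            prev[j] = top_index[i - 1]
        if i == len(tops):
            tops.append(x)
            top_index.append(j)
        else:
            tops[i] = x
            top_index[i] = j
    # Follow the back-pointers from the last pile's top card.
    result, j = [], top_index[-1] if top_index else None
    while j is not None:
        result.append(seq[j])
        j = prev[j]
    return result[::-1]

print(longest_increasing_subsequence([3, 1, 4, 1, 5, 9, 2, 6]))  # → [1, 4, 5, 6]
```

The pointers add one integer per input element and one step per output element, which is the linear extra time and space mentioned above.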
According to Aldous and Diaconis,[4] patience sorting was first recognized as an algorithm to compute the length of a longest increasing subsequence by Hammersley.
Within a series of measurements, the existence of a long increasing subsequence can be used as a trend marker.
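As a hypothetical illustration of this use (the threshold factor and the data are invented for the sketch): a series whose longest increasing subsequence is much longer than the roughly 2√n expected for a randomly ordered series suggests an upward trend.

```python
import bisect
import math

def lis_length(seq):
    # Patience-pile tops; the pile count is the LIS length.
    tops = []
    for x in seq:
        i = bisect.bisect_left(tops, x)
        if i == len(tops):
            tops.append(x)
        else:
            tops[i] = x
    return len(tops)

def looks_increasing(measurements, factor=1.5):
    """Crude trend marker: flag a series whose LIS is well above the
    ~2*sqrt(n) expected for a randomly ordered series.
    (factor=1.5 is an arbitrary illustrative threshold.)"""
    n = len(measurements)
    return lis_length(measurements) > factor * 2 * math.sqrt(n)

noisy_up = [i + (7 * i % 5) for i in range(100)]  # upward drift plus noise
print(looks_increasing(noisy_up))  # → True
```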