
Unconventional Sorting Algorithms

89 points | devev | 4 years ago | codingkaiser.blog | reply

54 comments

[+] nayuki|4 years ago|reply
My favorite is still slowsort. https://en.wikipedia.org/wiki/Slowsort

> It is a reluctant algorithm based on the principle of multiply and surrender. It was published in 1986 by Andrei Broder and Jorge Stolfi in their paper Pessimal Algorithms and Simplexity Analysis.

https://www.mipmip.org/tidbits/pasa.pdf

> The slowsort algorithm is a perfect illustration of the multiply and surrender paradigm, which is perhaps the single most important paradigm in the development of reluctant algorithms. The basic multiply and surrender strategy consists in replacing the problem at hand by two or more subproblems, each slightly simpler than the original, and continue multiplying subproblems and subsubproblems recursively in this fashion as long as possible. At some point the subproblems will all become so simple that their solution can no longer be postponed, and we will have to surrender. Experience shows that, in most cases, by the time this point is reached the total work will be substantially higher than what could have been wasted by a more direct approach.
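For the curious, the paper's algorithm translates almost line for line into Python. A sketch (heavily recursive, so keep inputs tiny):

```python
def slowsort(a, i=0, j=None):
    # Sort a[i..j] in place by multiply and surrender: slowsort both
    # halves, move the larger of the two maxima to position j, then
    # slowsort everything below it all over again.
    if j is None:
        j = len(a) - 1
    if i >= j:
        return
    m = (i + j) // 2
    slowsort(a, i, m)
    slowsort(a, m + 1, j)
    if a[m] > a[j]:
        a[m], a[j] = a[j], a[m]
    slowsort(a, i, j - 1)
```

True to the paradigm, the work multiplies: the final recursive call re-sorts a range that is only one element smaller than the one we started with.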

[+] Sohcahtoa82|4 years ago|reply
I like Intelligent Design sort:

> The probability of the original input list being in the exact order it's in is 1/(n!). There is such a small likelihood of this that it's clearly absurd to say that this happened by chance, so it must have been consciously put in that order by an intelligent Sorter. Therefore it's safe to assume that it's already optimally sorted in some way that transcends our naïve mortal understanding of "ascending order". Any attempt to change that order to conform to our own preconceptions would actually make it less sorted.

https://www.dangermouse.net/esoteric/intelligentdesignsort.h...

[+] kjeetgill|4 years ago|reply
Ohhhhhh... fascinating! Is there a good metric for how well sorted a list is? Maybe for every pair of elements, how many are in the correct order?

Given a random shuffle of a list, what are the chances that another shuffle would be more sorted? There's a life lesson in here, I know it...
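The pairwise measure suggested above does exist: the fraction of pairs in the correct order is the complement of the normalized inversion count (Kendall tau distance). A naive O(n^2) sketch:

```python
from itertools import combinations

def sortedness(a):
    # Fraction of index pairs (i < j) whose elements are in
    # non-decreasing order; 1.0 means fully sorted.
    pairs = list(combinations(range(len(a)), 2))
    if not pairs:
        return 1.0
    good = sum(a[i] <= a[j] for i, j in pairs)
    return good / len(pairs)
```

Counting inversions can be done in O(n log n) with a merge-sort variant, but the quadratic version makes the definition plain.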

[+] michaelmior|4 years ago|reply
Quantum bogosort is probably my personal favorite. Randomly permute the input using a quantum source of entropy. If the input is not sorted, destroy the universe. The only universe remaining is one where the input was already sorted. Runs in O(n) time.
[+] contravariant|4 years ago|reply
Even without destroying the universe you can sort your list with a quantum source of entropy, just use:

    def cosmicraysort(list):
        while list != sorted(list):
            pass  # wait for a cosmic ray to flip the bits into place
[+] Sharlin|4 years ago|reply
(the problem of destroying the universe left as an exercise to the reader)
[+] cevi|4 years ago|reply
Unfortunately, randomly permuting the input takes O(n log n) steps (you need at least this much time just to read in sufficiently many random bits). Perhaps a clever parallel architecture could reduce the runtime?
[+] forinti|4 years ago|reply
My contribution: Power Sort.

Start with p=0 and add 2^n for each element n of your list. Then successively subtract the largest remaining power of 2 to get your integers in order.

Con: p will be very big.

Pro: you don't need ifs!
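A sketch of the idea, assuming non-negative, distinct integers (Python's unbounded ints play the role of the very big p):

```python
def power_sort(a):
    # Accumulate 2^n for each element n; duplicates would collapse,
    # since 2^n + 2^n = 2^(n+1).
    p = sum(1 << x for x in a)
    out = []
    while p:
        n = p.bit_length() - 1   # largest power of 2 still in p
        out.append(n)
        p -= 1 << n
    out.reverse()                # powers come out largest-first
    return out
```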

[+] Paedor|4 years ago|reply
Fun fact, when you consider the bits required to contain p, this is equivalent to bucket sort with a bucket size of 1.
[+] Snild|4 years ago|reply
Ooh, that's a fun one.

Another con: you'd better hope your input contains no duplicates. :)

[+] dragontamer|4 years ago|reply
Bogosort is always a fun one, because I think it's one of the better introductions to NP-complete problems.

If you imagine that there's no "easy" algorithm to return a sorted list (but sorted lists do exist), then the only methodology possible is to try all combinations. And you do this either systematically, or randomly.

Random NP-complete algorithms, such as WalkSAT, are quite good in practice. Systematic NP-complete algorithms, such as DPLL-SAT, are more "obvious" in how they work, and are sufficient for smaller sizes.
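For reference, the randomized "try all combinations" approach described above, as a sketch (expected O(n * n!) shuffles, so only for toy inputs):

```python
import random

def bogosort(a):
    # Shuffle until the list happens to come out sorted.
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        random.shuffle(a)
    return a
```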

[+] scarmig|4 years ago|reply

  def everett_sort(list):
    shuffle(list)
    if is_sorted(list):
      return
    else:
      suicide()
[+] boothby|4 years ago|reply
I recently found a fun linear-time systolic sorting algorithm. I'd describe the serial implementation as unconventionally obvious. It's like insertion sort, without all the bother of swapping.

  def index_sort(a):
    n = len(a)
    output = [None] * n
    for i in range(n): #can run in parallel, damn the GIL
      index = 0
      for j in range(n):
        if a[j] < a[i]:
          index += 1
        elif a[j] == a[i] and j < i:
          index += 1
      output[index] = a[i]
    return output
[+] Snild|4 years ago|reply
> linear-time

But... there are two nested `range(n)` for loops. Typo?

[+] lifthrasiir|4 years ago|reply

    import ctypes
    
    def mutation_sort(a):
        for i, v in enumerate(a):
            ctypes.c_int.from_address(id(v) + 24).value = i # for x86-64, adjust the offset for your platform
        return a
    
    def is_sorted(a):
        prev = None
        for v in a:
            if prev is not None and prev > v: return False
            prev = v
        return True
    
    is_sorted(mutation_sort([3000, 1000, 7000, 8000, 2000])) # returns True
[+] fjfaase|4 years ago|reply
Usage sort: sorting a sequence of numbered books while picking books out of the sequence when you need them and inserting them back afterwards (moving the books in between). What is the best strategy for where to put the books back? For a discussion, see: https://www.iwriteiam.nl/Dpuzzle.html#usagesort
[+] nickcw|4 years ago|reply
Here is my offering

https://gist.github.com/ncw/5419af0e255d2fb62b98

It's a Go channel-based quicksort. It sorts a channel of ints using O(n) goroutines! Interestingly, it doesn't need to know how many ints are in the channel at the start.

Not a practical sort method but fun to see the Go concurrency primitives in use.

[+] unnouinceput|4 years ago|reply
I know this article is meant as a joke, but the first variant of bogosort, which is actually randomsort, will be the fastest algorithm once we start using quantum computing.

Having an array and allocating "n" == "array length" qubits for it, then implementing a parallel version of randomsort (no while; while is serialization!), will be the fastest sorting algorithm.

[+] axiosgunnar|4 years ago|reply
Slightly off-topic: the article was fine except for one nitpick...

I wonder what the author, "hilariously" making one sorting function print "<val> sent to gulag" and calling it "stalin-sort", would think of "hitler-sort" which prints "<val> burned in oven"?

Please don't normalize mass-murderers, even for memes.

[+] axiosgunnar|4 years ago|reply
If you downvote, you must write what is wrong with that statement.
[+] karaterobot|4 years ago|reply
It's an obvious joke, the point of which is that the Stalin Sort is destructive and doesn't actually get you what you want, and you'd never want to use it.
[+] NotAnOtter|4 years ago|reply
Isn't 'sleep sort' linear with respect to array size?

O(n) sort achieved?

[+] dfox|4 years ago|reply
It is if you ignore how the underlying OS implements multitasking, and sleep() in particular. The kernel-side implementation usually involves some kind of priority queue of sleeping threads. So taken as a whole, sleep-sort is just a convoluted implementation of heap-sort that outsources the actual sorting to the OS.
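For the record, a minimal sleep-sort sketch; it assumes non-negative values and per-value gaps large enough for the scheduler (the OS timer queue) to resolve:

```python
import threading
import time

def sleep_sort(a):
    # One thread per element: each sleeps proportionally to its value,
    # then appends. The OS scheduler's sleep queue does the sorting.
    out = []
    lock = threading.Lock()

    def worker(v):
        time.sleep(v * 0.02)
        with lock:
            out.append(v)

    threads = [threading.Thread(target=worker, args=(v,)) for v in a]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return out
```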
[+] boothby|4 years ago|reply
It's perversely linear with respect to the largest element in the array, if that's greater than the runtime of the underlying heapsort. Good luck with negative entries.

Response to dead comment:

> Couldn't this be solved by scaling everything by the largest entry. Of course that requires first a pass to find the largest entry but that's not as expensive?

One presumes that we don't have infinite precision timers, so you should only scale it down so the smallest gap between elements is a few clock cycles (unless you're satisfied with breaking the fiction that sleepsort is not heapsort). This seems like a mildly interesting question. My intuition is that finding the smallest abs(a[i]-a[j]) with i != j should be just as hard as sorting the list on a classical computer, but it seems like a quantum computer might find the smallest gap faster.
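The reduction in the other direction is easy on a classical machine: once the list is sorted, the closest pair must be adjacent, so the smallest gap costs one sort plus a linear scan:

```python
def smallest_gap(a):
    # After sorting, the minimum absolute difference between any two
    # distinct-index elements is attained by some adjacent pair.
    s = sorted(a)
    return min(b - x for x, b in zip(s, s[1:]))
```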

[+] throw3849|4 years ago|reply
Australia sorting algorithm:

Stay at your index!!!

[+] DeathArrow|4 years ago|reply
for i = 1 to n do

for j = 1 to n do

if A[i] < A[j] then

swap A[i] and A[j
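Transcribed into Python as a sketch, this appears to be the "I can't believe it can sort" algorithm: despite the seemingly backwards comparison, it sorts in ascending order.

```python
def cant_believe_sort(A):
    # Direct transcription of the pseudocode above. The full j range
    # (not j > i) and the reversed-looking comparison are what make
    # it come out ascending.
    n = len(A)
    for i in range(n):
        for j in range(n):
            if A[i] < A[j]:
                A[i], A[j] = A[j], A[i]
    return A
```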

[+] unnouinceput|4 years ago|reply
fastest swap implementation, assembler:

PUSH A[j]; PUSH A[i]; POP A[j]; POP A[i];

[+] fauscist1984|4 years ago|reply

[deleted]

[+] WJW|4 years ago|reply
Not related to sorting algorithms.