For real-time applications, the worst case can be much more important than the average case. However, not all computation is real-time. Sometimes the average rate is more important than the worst case.
If we're talking about practical applications like a video game, I'd still say that an algorithm that won't finish within my lifetime is strictly worse than one that might.
Lol.
If we're talking practical applications, then it doesn't really matter!
Agreed that sometimes the average rate is fine. That just legitimises my disagreement over the 'worst' sort: there is no worst sort.
Ultimately it's just down to semantics. The author stroked his beard and thought about the absolute slowest sort, and called it the worst. I stroked my (metaphorical) beard and found a different kind of worst. Both should be avoided in practical applications where possible.
His is cleverer, yes, and mine is stupid; that's probably the most relevant metric, and probably why no one submitted a sort that returns the wrong answer, which I would claim could also be 'worst'.
> Lol. If we're talking practical applications, then it doesn't really matter!
Then why did you bring up fps in video games?
> there is no worst sort.
Of course there is no worst sort. But some sorts are worse than others, and the disagreement here is that your sorting algorithm is clearly better than Worstsort.
If all randomized algorithms with unbounded runtime are worse than Worstsort, then flipping a coin until you get heads and then running a fast sort algorithm would be worse than Worstsort. If you instead take a more reasonable approach, like comparing expected runtime, Worstsort is clearly worse than bogosort.
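A minimal sketch of that coin-flip example (my own illustration, not from the Worstsort paper): the worst-case number of flips is unbounded, since the coin could in principle land tails forever, but the expected number of flips is 2, so the expected runtime is dominated by the fast sort.

```python
import random

def coin_flip_then_sort(items):
    """Flip a fair coin until heads, then run a fast O(n log n) sort.

    Worst-case runtime is unbounded (an arbitrarily long run of tails is
    possible), but the expected number of flips is 2, so the expected
    runtime is essentially that of the underlying fast sort.
    """
    while random.random() < 0.5:  # tails: flip again
        pass
    return sorted(items)          # heads: sort and return

print(coin_flip_then_sort([3, 1, 4, 1, 5, 9, 2, 6]))
```

Judged by expected runtime, this is about as fast as the underlying sort, which is the sense in which it's still better than Worstsort despite its unbounded worst case.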