SoME_Topics
Complexity of sorting
It is well known that sorting data takes O(n log n) time, but why? That is what I'd like to demonstrate.
A provisional outline would be:
- Explain the big-O notation
- Explain the two parts of sorting, using a recipe analogy: working out what to do (creating the recipe) and ordering the data (following the recipe). These two operations are somewhat interleaved in any standard sort
- Show that, once the recipe has been created, following it takes O(n) (e.g. swap the first element into place, then the second, and so forth)
- Show it takes O(n log n) comparisons to create the recipe:
  - There are n! ways of ordering n items, so we need to distinguish between n! scenarios
  - Each comparison splits the scenario space at best in half, so we need at least log2(n!) comparisons
  - log2(n!) = log2(2 x 3 x ... x n) = log2(2) + log2(3) + ... + log2(n)
  - Using rectangles of width 1 and height log2(k), show that this sum is at least integral(log2(x), x = 1 ... n)
  - The result follows easily
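The counting argument in the last step is easy to check numerically. A minimal Python sketch (the function names are my own, not part of the outline): the exact lower bound ceil(log2(n!)) sits between the integral estimate and n log2 n.

```python
import math

def comparison_lower_bound(n):
    """Worst-case comparisons any comparison sort needs: ceil(log2(n!)),
    since each comparison at best halves the set of n! possible orderings."""
    return math.ceil(math.log2(math.factorial(n)))

def integral_bound(n):
    """Integral estimate from the rectangle argument:
    integral of log2(x) from 1 to n = (n*ln(n) - n + 1) / ln(2),
    which is a lower bound on log2(n!) because log2 is increasing."""
    return (n * math.log(n) - n + 1) / math.log(2)

for n in (4, 16, 64):
    print(n, integral_bound(n), comparison_lower_bound(n))
```

Comparing the two columns for growing n shows both growing like n log n, which is the point of the final step.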
What are you looking for? I'd be happy to produce some animations demonstrating different sorting algorithms if you think you could use those. We could have it sort colored boxes into a rainbow, or bars of different lengths into a ramp.
Thanks for looking at this. I was more interested in using this as an introduction to complexity, demonstrating that no sorting algorithm could improve on O(n log n) complexity. It may be interesting to examine the complexity of other algorithms (e.g., showing that the Travelling Salesman Problem (TSP) is NP-complete?).
The outline I provided above was meant to demonstrate the O(n log n) lower bound.
Hope this makes sense.
When I was a student, 25 years ago, I wrote a framework for visualizing algorithms. https://chu.in-chemnitz.de/programmieren/java/ViA/ (Sorry for the German links, but any simple translator can certainly handle that.)
Then we used it to visualize all kinds of algorithms, but in particular sorting. https://chu.in-chemnitz.de/programmieren/java/ViA/Applets/Finals/Sort/index.html
If you like, I can support your idea. I have a degree in computer science and worked as a teacher for some time as well.
It might be worth noting that sorting can in fact be done in O(n) time, in common cases. Sorting only requires O(n log n) if you can only use comparisons.
For example, if you're sorting integers that don't have too many digits, radix sort is O(n).
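To make this concrete, here is a minimal sketch of LSD radix sort for non-negative integers (function name and `base` parameter are illustrative). It runs in O(d * (n + base)) time, where d is the number of digits, so it is O(n) when d is bounded.

```python
def radix_sort(nums, base=10):
    """LSD radix sort for non-negative integers: bucket by each digit,
    least significant first. Stable bucketing at each pass makes the
    final order correct without ever comparing two elements."""
    if not nums:
        return []
    result = list(nums)
    largest = max(nums)
    place = 1
    while place <= largest:
        buckets = [[] for _ in range(base)]
        for x in result:
            buckets[(x // place) % base].append(x)
        # Concatenating the buckets in order preserves stability.
        result = [x for bucket in buckets for x in bucket]
        place *= base
    return result
```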
Yes and no. Basically, radix sorts simply change the base of the logarithm. Take, for example, lexicographic sorting of words: the logarithm would be to base 256. This does not change the complexity class, just the constant multiplier.
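The change-of-base identity behind this point can be verified directly: log_b(x) = log2(x) / log2(b), so with b = 256 = 2^8 the two logarithms differ only by the constant factor 8.

```python
import math

# log_256(x) = log2(x) / log2(256) = log2(x) / 8, so
# O(n log_256 n) and O(n log2 n) are the same complexity class.
for x in (10, 1000, 10**6):
    assert math.isclose(math.log(x, 256), math.log2(x) / 8)
```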
I agree that it is worth noting that the $\Omega(n \log n)$ lower bound applies to comparison-based algorithms only: say, one can easily sort $n$ bits in time $O(n)$.
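For the bit-sorting example, a one-pass counting sketch (assuming the input really is 0/1 values; the function name is my own):

```python
def sort_bits(bits):
    """Sort a sequence of 0/1 values in O(n): count the zeros in one
    pass, then emit that many zeros followed by ones. No element is
    ever compared with another element."""
    zeros = sum(1 for b in bits if b == 0)
    return [0] * zeros + [1] * (len(bits) - zeros)
```

This is counting sort on a two-value alphabet, which is why the comparison lower bound does not apply to it.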
Ah, but don't forget to at least acknowledge spaghetti sort, which is in fact an O(n) sort.