This argument is very much in line with Mike Acton's data-oriented design philosophy [1] -- understand the actual specific problem you need to solve, not the abstract general problem. Understand the statistical distribution of actual problem instances; their parameters will fall in some range. Understand the hardware you're writing software for; understand its finite capacity & capability.
It's common for new algorithms or data structures with superior asymptotic efficiency to be less performant than simpler alternatives at smaller problem sizes. As always, it depends on the specifics of any given application (a toy illustration is below the footnote).
[1] see Mike's CppCon 2014 talk "Data-Oriented Design and C++" https://www.youtube.com/watch?v=rX0ItVEVjHc
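To make the small-problem-size point concrete, here's a toy micro-benchmark (mine, not from the article or the talk) comparing a pure-Python O(n^2) insertion sort against a pure-Python O(n log n) merge sort. On tiny inputs the asymptotically "worse" algorithm usually wins, which is exactly why production sorts like Timsort fall back to insertion sort for short runs:

    import random
    import timeit

    def insertion_sort(a):
        # O(n^2) comparisons, but tiny per-element overhead
        a = list(a)
        for i in range(1, len(a)):
            x, j = a[i], i - 1
            while j >= 0 and a[j] > x:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = x
        return a

    def merge_sort(a):
        # O(n log n) comparisons, but recursion and list allocation at every level
        if len(a) <= 1:
            return list(a)
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        return out + left[i:] + right[j:]

    for n in (8, 64, 2048):
        data = [random.random() for _ in range(n)]
        reps = max(1, 2000 // n)
        t_ins = timeit.timeit(lambda: insertion_sort(data), number=reps)
        t_mrg = timeit.timeit(lambda: merge_sort(data), number=reps)
        print(f"n={n:5d}  insertion={t_ins:.4f}s  merge={t_mrg:.4f}s")

The exact crossover point depends on the machine, the interpreter, and the data, which is the whole point: asymptotics alone don't tell you which one to ship.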
It's vexatious how good we are at it, and it's exactly the sort of problem that Science likes. We know it's true, but we can't reproduce it outside of the test subject(s). So it's a constant siren song to try to figure out how the fuck we do that and write a program that does it faster or more reliably.
Traveling Salesman was the last hard problem I picked up solely to stretch my brain and I probably understand the relationship between TSP and linear programming about as well as a Seahawks fan understands what it is to be a quarterback. I can see the bits and enjoy the results but fuck that looks intimidating.
I'm sure I read this article a few days ago, word for word, but it wasn't on the site in the OP? It stood out because when it mentioned textbooks and said "including ours," I looked at the site and thought to myself, "they do textbooks?"
https://www.quantamagazine.org/new-method-is-the-fastest-way...
Supposing one uses a trie as a priority queue, the inserts and pops are effectively constant: the cost scales with the key width rather than with the number of stored keys, and the key width is fixed.
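A minimal sketch of that idea, assuming fixed-width non-negative integer keys (my illustration of the general trick, not code from the article): a binary trie where insert and pop-min each walk at most BITS levels, so the per-operation cost is bounded by the key width rather than by how many keys are stored.

    class TriePQ:
        # Binary trie keyed on fixed-width integers. insert/pop_min each touch
        # at most BITS nodes, so for a fixed key width the cost is "effectively
        # constant" in the number of stored keys.
        BITS = 32  # assumed key width; keys must satisfy 0 <= key < 2**BITS

        def __init__(self):
            self.root = {}   # internal nodes map a bit (0 or 1) to a child dict
            self.size = 0

        def insert(self, key):
            node = self.root
            for i in range(self.BITS - 1, -1, -1):
                node = node.setdefault((key >> i) & 1, {})
            node['count'] = node.get('count', 0) + 1   # leaf stores a multiplicity
            self.size += 1

        def pop_min(self):
            if self.size == 0:
                raise IndexError("pop from empty TriePQ")
            node, path, key = self.root, [], 0
            for _ in range(self.BITS):
                bit = 0 if 0 in node else 1            # prefer the smaller branch
                path.append((node, bit))
                node = node[bit]
                key = (key << 1) | bit
            node['count'] -= 1
            self.size -= 1
            if node['count'] == 0:                     # prune now-empty branches
                del node['count']
                for parent, bit in reversed(path):
                    if not parent[bit]:
                        del parent[bit]
                    else:
                        break
            return key

    pq = TriePQ()
    for k in (17, 3, 3, 42):
        pq.insert(k)
    print([pq.pop_min() for _ in range(4)])   # [3, 3, 17, 42]

This is the same reason radix/bucket-style structures can sidestep the comparison-sort lower bound when keys are bounded integers; whether it actually helps depends on key width, memory behavior, and the workload.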
this was a lot of words that sum up to "I heard that new algorithm exists but spent zero effort actually evaluating it"
(side note: does anyone else get thrown off by the Epilogue font? It looks very wide in some cases and very narrow in others... it makes me want to override it with Stylus, except my employer has blocked browser extensions for security reasons, which raises the question of why I'm even reading the article on this computer....)
https://youtu.be/3ge-AywiFxs?si=TbcRsBNkzGhpOxQ4&t=842
(timestamped) He shows a derivation that the best a comparison-based sorting algorithm can do is O(n log n) for n real positive numbers.
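For anyone who doesn't want to scrub through the video, the standard decision-tree form of that bound (which I believe is what the talk derives; I'm paraphrasing) is:

    % A comparison sort must distinguish all n! input orderings, so its
    % decision tree has at least n! leaves and therefore depth at least
    \[
      \log_2(n!) \;=\; \sum_{k=1}^{n} \log_2 k \;\ge\; \frac{n}{2}\log_2\frac{n}{2} \;=\; \Omega(n \log n).
    \]
    % Hence Omega(n log n) comparisons are unavoidable in the worst case.
    % Note the bound only applies to comparison-based sorting.

That caveat matters here: as I understand it, the new shortest-path result gets around the "sorting barrier" precisely by not maintaining a full ordering of the vertices by distance.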
It's actually common for algorithms with a lower asymptotic complexity to be worse in practice; a classic example is matrix multiplication.
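Spelling that example out a bit (my summary, hedged): Strassen's algorithm already improves on the naive cubic bound, and the asymptotically fastest known methods are around O(n^2.37), yet the latter carry constants so large they're never used ("galactic algorithms"). The exponents fall out of divide-and-conquer recurrences:

    % Naive blocked multiply: 8 half-size multiplications per step
    T_{\text{naive}}(n) = 8\,T(n/2) + O(n^2) \;\Rightarrow\; O(n^{\log_2 8}) = O(n^3)
    % Strassen: 7 half-size multiplications, at the cost of extra additions
    T_{\text{Strassen}}(n) = 7\,T(n/2) + O(n^2) \;\Rightarrow\; O(n^{\log_2 7}) \approx O(n^{2.807})

In practice the extra additions, worse memory locality, and numerical behavior mean well-tuned cubic (BLAS-style) code wins until the matrices are quite large, and the O(n^2.37)-ish methods never win at any realistic size.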
Also please, please, can we stop with the "eww, math" reactions?
> The new approach claims order (m log^(2/3) n) which is clearly going to be less for large enough n. (I had to take a refresher course on log notation before I could even write that sentence with any confidence.)
I'm sure the author is just exaggerating (he's clearly very competent), but it's a sentence with the vibes of "I can't do 7x8 without a calculator."
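For context on the quoted bound (my arithmetic, not from the article): Dijkstra with a binary heap is roughly O(m log n), and log^(2/3) n grows strictly slower than log n, so the new bound must eventually win. Concretely:

    % With n = 2^{64} vertices:
    \log_2 n = 64, \qquad (\log_2 n)^{2/3} = 64^{2/3} = 16,
    % i.e. the per-edge logarithmic factor is 4x smaller at that (absurd) scale,
    % and the gap keeps widening as n grows.

Of course this ignores constant factors entirely, which is exactly the caveat other commenters are raising.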