I concur:
Nowadays, the basic building blocks are done. Good sorting, database indexing, O(1) set membership, many variants of lists, hash tables, and other higher-level data structures are well covered by modern libraries. Algorithms and data structures are the foundations of modern system design, and for general use they are largely commoditized. Certainly, working on general-purpose, clever-but-foundational algorithms is becoming increasingly niche.
I disagree, though, with the final conclusion that algorithms have somehow faded into obscurity.
Anyone writing any program has to consistently track their expected program performance as it shuffles data between commoditized library calls or between network endpoints. This is algorithm design. Many "foundational" algorithms like max-flow are themselves built from "foundational(er)" algorithms and shuffle data between them.
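To make that concrete, here is a minimal Python sketch of Edmonds-Karp, where "foundational" max-flow is little more than repeated calls into the humbler BFS, with residual-graph bookkeeping in between. The dict-of-dicts residual representation and the function names are mine, chosen for brevity; this is an illustration, not any library's API.

    from collections import defaultdict, deque

    def shortest_augmenting_path(residual, source, sink):
        # Plain BFS -- the "foundational(er)" building block.
        parent = {source: None}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    if v == sink:
                        return parent
                    queue.append(v)
        return None  # sink unreachable: no augmenting path remains

    def max_flow(edges, source, sink):
        # Edmonds-Karp: repeated BFS plus residual-capacity bookkeeping.
        residual = defaultdict(lambda: defaultdict(int))
        for u, v, cap in edges:
            residual[u][v] += cap
        total = 0
        while True:
            parent = shortest_augmenting_path(residual, source, sink)
            if parent is None:
                return total
            # Walk the BFS path backwards to find the bottleneck capacity.
            bottleneck = float("inf")
            v = sink
            while parent[v] is not None:
                bottleneck = min(bottleneck, residual[parent[v]][v])
                v = parent[v]
            # Push flow along the path, updating residual capacities.
            v = sink
            while parent[v] is not None:
                u = parent[v]
                residual[u][v] -= bottleneck
                residual[v][u] += bottleneck
                v = u
            total += bottleneck

    edges = [("s", "a", 3), ("s", "b", 2), ("a", "b", 1),
             ("a", "t", 2), ("b", "t", 3)]
    print(max_flow(edges, "s", "t"))  # prints 5

All the algorithmic content lives in how data moves between the BFS calls, which is exactly the kind of composition the point above describes.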
This is exactly what the modern programmer does: decompose a data-intensive problem into sub-problems, solve each with commoditized libraries, and re-compose those solutions.
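As an illustration of that workflow, take a hypothetical task like "find the k most frequent words." The sketch below decomposes it into two commoditized pieces of Python's standard library and re-composes them; the algorithm design lives in the complexity trade-off noted in the comments.

    from collections import Counter
    from heapq import nlargest

    def top_k_words(lines, k=5):
        # Stage 1: hash-table counting, O(n) expected over n tokens.
        counts = Counter(word for line in lines for word in line.split())
        # Stage 2: heap selection, O(n log k) -- cheaper than a full
        # O(n log n) sort when k is small.
        return nlargest(k, counts.items(), key=lambda kv: kv[1])

    print(top_k_words(["a b a", "c a b"], k=2))  # [('a', 3), ('b', 2)]

Neither stage required writing a clever algorithm from scratch, yet choosing the heap over a full sort is still an algorithmic decision.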
The push to catalog and archive a wide array of well-implemented algorithms and data structures tracked the emergence of personal computing and commodity hardware. The emergence of new hardware will require new solutions. Mobile, WebAssembly (perhaps), ASIC-based data centers, and CUDA are all exciting new areas where our "foundational" algorithms and canonical solutions need to be revisited, redesigned, and re-tuned. Or at least re-composed.