
My favorite blind spot is how our definitions of quality change over time. I knew someone with a mature codebase that a new developer made substantially faster by removing his old optimizations. He’d measured very real improvements back when he wrote that hand-rolled assembly code on early Pentium generations, but by the time we revisited it less than a decade later, the combination of compiler and processor improvements meant that the C reference implementation was always faster. (I was assisting with the port to PowerPC, and at first we thought it was just XLC being especially good there, but then we tested it on x86 with GCC and found the same result.)

Beyond the obvious lesson about experience and sunk costs, it was also a great lesson about how much time you assume you have for maintenance: when he’d first written that code as a grad student, he’d been obsessed with performance because it was a bottleneck for getting his papers out. As his career progressed he spent his time on other things, and since the code wasn’t broken he hadn’t really revisited it, because he “knew” where it was slow. Over time the compute costs eventually outweighed the original savings.



My compilers support inline assembler, but these days there doesn't seem to be much point to it. The compilers have simply gotten too good.


Yes, it’s funny how support has never been better or less necessary.



