My favorite blind spot is how our definitions of quality change over time. I knew someone with a mature codebase that a new developer made substantially faster by removing his old optimizations. He had measured very real improvements back when he wrote that hand-rolled assembly on early Pentium generations, but by the time we revisited it less than a decade later, the combination of compiler and processor improvements meant that the C reference implementation was always faster. (I was assisting with the port to PowerPC, and at first we thought it was just XLC being especially good there, but then we tested on x86 with GCC and found the same result.)
Beyond the obvious lesson about experience and sunk costs, it was also a great lesson about how much time you assume you have for maintenance. When he first wrote that code as a grad student he was obsessed with performance, since that was the bottleneck for getting his papers out, but as his career progressed he spent his time on other things, and since the code wasn't broken he never really revisited it because he "knew" where it was slow. Over time, the accumulated compute costs eventually outweighed the original savings.