Hacker News

Choosing the better asymptotic behaviour, provided the constant factor is acceptable, ought to be your default. This is because software tends to need to work with greater volumes of data over time, on machines with more memory and more, faster CPUs. If your algorithm's cost grows faster than linearly in the input size, your software will tend to get worse over time.
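A minimal sketch of the point, using a hypothetical duplicate-check task in Python: both functions give the same answer, but the quadratic one degrades much faster as the input grows.

```python
import timeit

def has_duplicates_quadratic(items):
    # O(n^2): compares every pair of elements
    return any(items[i] == items[j]
               for i in range(len(items))
               for j in range(i + 1, len(items)))

def has_duplicates_linear(items):
    # O(n): set membership is amortized constant time
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False

# No duplicates: worst case for both implementations.
data = list(range(2000))
quad = timeit.timeit(lambda: has_duplicates_quadratic(data), number=3)
lin = timeit.timeit(lambda: has_duplicates_linear(data), number=3)
print(quad > lin)  # the quadratic version is slower, and the gap widens with n
```

Doubling the input roughly quadruples the quadratic version's runtime but only doubles the linear one's, which is why the asymptotically better choice tends to win as data volumes grow.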

