
At CMU we had a mandatory stats class, but it didn't cover anything about choosing sample sizes. Ironically, the stats class I had to take when I was in the humanities dept. was more comprehensive. When I went into industry, I realized after two or three years that practically everyone I'd met knew jack shit about statistics. I myself learned most of what I know from humanities-major friends and books like Manning and Schütze's NLP text.

Having been through what Zed's been through (on both the giving and receiving end), I can empathize with why he'd get all sanctimonious about it, but I feel that it just makes the problem worse (at least in person). Programmers are already an egotistical lot, and I've learned that directly attacking their ego tends to make things worse.

What can be done, then? I don't know. It feels like this is, at its core, a personality problem, in particular one in which people tie their ability to know things to their sense of self, and personality problems are terribly hard to correct.



I'm currently doing a CS master's, and I did my undergrad in physics. I'm really glad I have the background in mathematics and stats that I have. In our physics classes they really drilled into us that although our theory and homework often used exact numbers, no measurement meant anything without a measurement of its error. "A number without error is meaningless," they would say.

One thing I never fully understood was how to calculate the propagation of error. There are many useful tricks for reducing the impact of error on your final result, and a few operations you need to watch out for because they can amplify it.
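For independent errors, the standard first-order rules are: absolute errors add in quadrature for sums and differences, and relative errors add in quadrature for products and quotients. A minimal sketch in Python (the function names and the rectangle example are mine, just for illustration):

```python
import math

def propagate_sum(dx, dy):
    # f = x + y (or x - y), independent errors:
    # sigma_f = sqrt(sigma_x^2 + sigma_y^2)
    return math.sqrt(dx**2 + dy**2)

def propagate_product(x, dx, y, dy):
    # f = x * y (or x / y), independent errors:
    # sigma_f / |f| = sqrt((sigma_x/x)^2 + (sigma_y/y)^2)
    f = x * y
    return abs(f) * math.sqrt((dx / x)**2 + (dy / y)**2)

# Example: area of a rectangle measured as 3.0 +/- 0.1 by 4.0 +/- 0.2
area = 3.0 * 4.0
area_err = propagate_product(3.0, 0.1, 4.0, 0.2)  # roughly 0.72
```

This is also where subtraction bites you: subtracting two nearly equal measurements leaves a small result but the same absolute error, so the relative error blows up.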



