It's tough though, because when you are a small company that expects developers to jump in and be productive, you do need some kind of process. That process can be referral based, in which case testing isn't as necessary because you can rely on past shared experience, but that obviously limits your talent pool.
When you are interacting with an unknown (or a less known) person, you have to have some kind of vetting process. And with a small company culture, typically false negatives are far less harmful than false positives.
You're making fair points. I think I understand the equation you're setting up here (harm of false positive vs harm of false negative, along with probabilities for each). I'm just starting to believe that the damage of so many false negatives to avoid the occasional false positive may be harming the industry more than we realize.
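A toy way to make that equation concrete. Every name and number below is an invented assumption for illustration, not data:

```python
# Toy expected-cost model for the false-positive / false-negative
# tradeoff. All numbers are illustrative assumptions.

def expected_cost(p_fp, cost_fp, p_fn, cost_fn, n_candidates):
    """Expected total harm across a candidate pool:
    P(false positive) * harm per bad hire
    + P(false negative) * harm per wrongly rejected good candidate."""
    return n_candidates * (p_fp * cost_fp + p_fn * cost_fn)

# A very strict filter: almost no bad hires, many good candidates rejected.
strict = expected_cost(p_fp=0.01, cost_fp=50_000,
                       p_fn=0.30, cost_fn=20_000, n_candidates=100)

# A looser filter: more bad hires slip through, far fewer good candidates lost.
loose = expected_cost(p_fp=0.05, cost_fp=50_000,
                      p_fn=0.10, cost_fn=20_000, n_candidates=100)

print(f"strict: {strict:.0f}, loose: {loose:.0f}")
```

With these made-up numbers, the strict filter is the more expensive policy in aggregate even though each individual false positive costs more, which is exactly the scenario described above: the per-incident harm of a bad hire can still be dominated by the accumulated toll of false negatives.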
Let's not forget, we're an industry where employers very consistently claim a serious and economically harmful shortage of talented developers. At the same time, we're an industry that accepts an interview process it openly acknowledges will result in the rejection of good candidates, because the perceived damage of a false positive is so high.
Is there a chance that there is a strong "negativity bias" at work here?
In other words, is it possible that the bad memories of a false positive are masking the thousand paper cuts inflicted on the high-tech workforce through grueling interviews? What overall harm does it do to an industry when a programmer who would have been successful gets rejected from a promising job due to a poor performance on a one-hour data structures problem at a whiteboard? I think there's the harm you see, and then the harm you don't.
That last line suggests the harm may be serious and extensive. I'm not making any particular prescription here; I'm just saying that when developers experience repeated rejections due to repeated false negatives, this may be vastly more harmful in a global sense than people currently realize.
> I'm just starting to believe that the damage of so many false positives to avoid the occasional false negative may be harming the industry more than we realize.
I think you have "false positive" and "false negative" backwards, but I agree with your point. I have long thought that the industry systematically understates the damage of false negatives and overstates the damage of false positives.
Hiring fits one of two broad categories: catch-as-catch-can, or specific need. In the catch-as-catch-can situation, the "always hiring" mode, the company can afford to place more emphasis on minimizing the false positive rate at the expense of the false negative rate, because the goal is to skim the cream off the top of the applicant pool as it comes to you. The company doesn't need new people per se; it finds slots for these highly capable people somewhere in the company after the hire is made.
In the specific-need situation, the company is hiring to fill a hole. Either work is not getting done, or other people are overburdened with the responsibilities this new hire will take on. In this case, every day that passes costs the company in some way, and the cost of a false negative depends on the quality and size of the applicant pipeline: a false negative could add anywhere from a few days (strong pipeline) to a few months (weak pipeline) to the timeline for filling the position. The "false negatives are cheap" position comes from the former situation, but I would say that very few companies actually have an applicant pipeline that strong. Furthermore, if you can go months without filling a position and not suffer any ill effects, you need to ask yourself why you are looking for someone in the first place.
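The pipeline point above can be sketched as a trivial model. The per-day cost and the delay figures here are invented assumptions, chosen only to mirror the "few days vs. few months" range:

```python
# Hypothetical sketch: the cost of one false negative, modeled as the
# extra days the position stays unfilled times the daily cost of it
# being unfilled. Both numbers are made-up assumptions.

def false_negative_cost(daily_cost_unfilled, extra_days_to_fill):
    # One rejected good candidate delays the fill date by some number
    # of days; each of those days has an ongoing cost to the company.
    return daily_cost_unfilled * extra_days_to_fill

# Strong pipeline: the next good candidate is only days away.
strong = false_negative_cost(daily_cost_unfilled=800, extra_days_to_fill=3)

# Weak pipeline: the next good candidate is months away.
weak = false_negative_cost(daily_cost_unfilled=800, extra_days_to_fill=90)

print(strong, weak)  # 2400 72000
```

Same daily cost, wildly different totals: whether false negatives are cheap is almost entirely a property of the pipeline, not of the candidate.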
You're on to something here. There's the specific, need-based interview, and there's the "sure, we're always looking for talent" interview.
The latter can lead to an unintended problem. I once went on an interview where I was passed from room to room, on the hour, and taken through my technical paces each time. Each interview was intensely technical. If the interview had been my only exposure to the company (i.e., if I hadn't visited the site, read press releases and articles, and so forth), I would have had no idea what the company does, other than that it seems to involve operations-research-style math (optimization and stochastic processes), data structures and algorithms, and complex SQL. I certainly would have had no idea what they planned on having me do from a business point of view. I didn't get an offer (I was busy, and hadn't refreshed my knowledge of tree traversal, Markov chains, and linear optimization to immediate post-exam undergrad/grad levels), but I wasn't hot on the company anyway, because the process left me with the impression that they wouldn't value my business input in any way.
So, why would they do this? Maybe they're just looking generally for people. They figure: if we find candidates who can rock an exam[1], sure, let's go ahead and make an offer; we'll find something for them to do. Because their need isn't immediate, their standards (no false positives) will be set unusually high.
The unintended consequence is that yet another developer in this space is now reluctant to interview, because who wants to prepare for and re-take their undergraduate/graduate math and CS exams?
[1] At this point, I don't even think we should call this an interview; it was an exam that lasted longer than my MS graduate exams, with far less information about what would be tested.
I'm a big fan of looking at developers' GitHub contributions. It's a good way to see that they can actually write code, and it doesn't involve a high-pressure whiteboard test.
On the other hand, it can be a negative for somebody working for a company that discourages or forbids open sourcing their work. Or people who don't have free time to spend on FOSS projects.
Don't confuse activity for productivity. Most of the activities people engage in when searching for a new employee are complete wastes of time, at best.
I personally find it is net-better to stop wasting that time, just try people at essentially random, and get back to my own work.
Only if you have a policy of not having positions for Junior programmers.
Fog Creek has the same hypocrisy as any other company that demands X years of experience from applicants but considers it somebody else's problem how they get that experience.
As a business owner who is a developer and hires developers: getting my experience was my responsibility. It's now yours. And you won't do it on my dime.
I'm not in the business of using money to make developers. I'm in the business of using developers to make money. The jobs I offer are not because I want to give someone a job, they're because I want someone to come do a job for me.
Until very recently, the vast majority of Fog Creek's devs came through the internship program. Which is restricted to people who are going back to school afterwards (and, with a few exceptions, is only for undergrads).
Worse: they demand that you can code, and don't consider it their problem to teach you how to do that.
Anyway, I thought it was well established that years of experience is a bullshit measurement and you should just apply no matter what. Worst case, you end up getting hired.