Oh that's fascinating! Can you talk about what questions are scientifically proven to make a difference, and/or the methodology of proving it? I've read bits and pieces here and there, but not seriously followed such research. I imagine that style, specific words, amount of follow-up, and even the mood in the room can be important factors in the "success rates" of certain interview questions...?
I just ask the prepared questions... That is a good question, but I force myself to concentrate on other interesting questions.
They have said that if we want a new question added to the list, we need a well-controlled experiment: ask it of a random sample of candidates, don't use the answer in the hiring decision, and then evaluate after 6 months whether there is a difference between the two groups' performance on the job.
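That protocol boils down to a randomized two-group comparison. As a minimal sketch of the final evaluation step, here is a mean-difference plus permutation test in plain Python (the performance ratings, the 1-5 scale, and the tiny group sizes are all made up for illustration; a real evaluation would need far larger samples):

```python
import random
import statistics

def mean_diff(asked, not_asked):
    """Difference in mean 6-month performance ratings between groups."""
    return statistics.mean(asked) - statistics.mean(not_asked)

def permutation_p_value(asked, not_asked, n_perms=10_000, seed=0):
    """Two-sided permutation test: how often does randomly re-labelling
    candidates produce a mean difference at least as large as observed?"""
    rng = random.Random(seed)
    observed = abs(mean_diff(asked, not_asked))
    pooled = asked + not_asked  # new list; inputs are not mutated
    n = len(asked)
    hits = 0
    for _ in range(n_perms):
        rng.shuffle(pooled)
        if abs(mean_diff(pooled[:n], pooled[n:])) >= observed:
            hits += 1
    return hits / n_perms

# Hypothetical 6-month ratings (1-5 scale), purely illustrative:
asked = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0]       # were asked the new question
not_asked = [3.6, 3.9, 3.7, 3.5, 4.0, 3.4]   # control group

print(f"mean difference: {mean_diff(asked, not_asked):.2f}")
print(f"permutation p-value: {permutation_p_value(asked, not_asked):.3f}")
```

The key point the commenter makes is baked into the design: the answer is recorded but not used in the hiring decision, so the two groups differ only in whether the question was asked.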
Care to back that up with any reasons? Engineers are absolutely not immune to cultural and gender biases; the evidence may be pointing the other way, that we currently have greater-than-average biases in tech. Isn’t oversight likely to make it better, not worse? While it may be a problem not to be able to ask some kinds of technical questions, as the GP comment alluded to, what is wrong with the idea of trying to limit questions to those that are proven to be relevant to candidate performance? This seems similar to how I’ve heard that graduate school performance and success in the sciences doesn’t correlate with GRE math & science scores anywhere near as much as it correlates with the language & writing tests... I wouldn’t be surprised if technical interview questions do not do a good job of identifying who you should hire...
Some technical questions are really sink or swim. [0]
I've seen interviews get totally derailed by a simple FizzBuzz question. [1]
I wonder if the GRE situation doesn't have more to do with selection bias, i.e., applications with a score below a certain threshold aren't considered at all, or folks who aren't convinced of their ability in math & science simply won't apply to graduate school.
I’d completely agree that asking some technical questions is pretty important, and my experience is that they don’t have to be particularly difficult at all: you can filter a lot of people with very basic questions. I have several first-hand stories that match the one you heard about the FizzBuzz question. People who refuse to answer easy technical questions and/or get angry about being asked them are a HUGE red flag. Sometimes you have to do mundane tasks at work, and anyone who thinks they’re above it doesn’t deserve the job. Getting angry about easy questions is short-sighted; I mean, they’re easy. It’s a sign the person doesn’t enjoy coding or work.
Anyway, you could be right, but I don’t think the GRE thing is selection bias. I looked this up a while back but can’t find the study now; I’ll add a link if I can find it. Choices and scores were controlled for, and the takeaway was that people who are good at language truly did perform better in grad school. I think it’s plausible, since the primary output of grad school is papers, writing, and a thesis. The same is true for working at a company: the majority of very successful people aren’t the coders (with some exceptions), but more often people who are good at communication, writing, planning, strategizing, and rallying others to work together.
I wouldn't jump to that conclusion. It might be true sometimes, but there are people who can code but get offended by such tests. I'm saying it doesn't matter if they can code or not, because getting angry about easy questions is a good reason to reject the candidate, aside from their technical skills. It's demonstrating a lack of willingness. They might be an amazing coder, but not a team player. They might be unfriendly. They might be defensive about their skills, or assuming a golden resume should exempt them from coding tests. They might not be able to code. No matter what the cause is, I have a reason to avoid hiring that person.
> What's missing from the picture is that a great communicator who isn't a chemist will never apply to a PhD program in chemistry.
While true, I'm not sure why that matters? PhD applicants do have a wide range of scores on their GREs, and you can still control for those scores and adjust, as well as look at all science fields and not just chemistry. If the correlation between outcomes and writing scores is higher than the correlation between outcomes and math/science scores, it doesn't really matter that you haven't measured people who weren't interested, does it?
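The comparison being described is just two Pearson correlations computed over the same admitted cohort. A small sketch in plain Python (the per-student sub-scores and the outcome measure are entirely made-up numbers, only there to show the shape of the comparison):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data for an admitted cohort: GRE writing score,
# GRE quant score, and some grad-school outcome (e.g. papers published).
writing = [4.0, 4.5, 3.0, 5.0, 3.5, 4.5, 2.5, 5.5]
quant   = [160, 155, 162, 158, 165, 150, 168, 152]
outcome = [3,   4,   2,   5,   2,   4,   1,   6]

r_writing = pearson_r(writing, outcome)
r_quant = pearson_r(quant, outcome)
print(f"writing vs outcome: r = {r_writing:.2f}")
print(f"quant vs outcome:   r = {r_quant:.2f}")
```

Note that both correlations are conditioned on people who already applied and enrolled, which is exactly the restriction-of-range point being argued over: within that group, the relative sizes of the two correlations can still be compared.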
BTW, googling around currently gets me a metric ton of results and studies concluding that GREs don't predict graduate school success beyond the first year. I'm not finding much on the subject tests vs math vs language, but it looks like the current consensus is that GREs are bad predictors and cause some gender/race/class bias problems.
> That's not really going to help fill technical positions.
I'm saying the successful coders are the ones who are better at communication, among the coders -- counting only coders who already have the job. And the successful grad students are the ones who are better writers, counting only grad students who are already enrolled in a science PhD. That's what I meant about controlling for the bias. Being able to communicate and write well is a major skill needed in technical positions, and the technical people who excel are the ones who are better at explaining, communicating, writing, publishing, etc.