Don't focus on the domain, focus on the coding. i.e. don't talk about what they were doing, talk about how they did it. Were there any performance considerations? Concurrency? What did they do for testing? Was it automated? How was the thing deployed? Do they maintain it? etc. etc.
It doesn't matter that in all of the above I'm talking about the code used to talk to little green men on Mars (which you know nothing about); what matters is how I used code to solve all the problems that are common across all kinds of software.
How much faith do you really have in these questions?
This could be your entire 45 minute interview:
Were there any performance considerations: Yes, the spaceship was extremely slow at first. After reverse engineering the launch protocol, we discovered that we could increase speeds by 5x simply by limiting the fuel cell usage. "Oh, tell me about your rocket fuel cell usage" - Well, we have this thing called a fuel cell. It takes 5 batteries. 10 minutes of business logic later...
* Now let's spend 25 minutes talking about infrastructure and team dynamics *
Concurrency: None needed here.
What did they do for testing: We used Jest.
Was it automated: Yes.
How was it deployed: GitHub / Heroku / AWS
Do you maintain it: Yes, we maintain it with my little green friends. We each take turns writing code and maintaining the rocket protocols. It's actually quite nice.
* Oh, well I guess we're out of time. *
Congrats, you just hired a guy who read a few blogs and made up a few stories.
> How much faith do you really have in these questions?
It depends entirely on how you direct the conversation.
> After reverse engineering the launch protocol, we discovered that we could increase speeds by 5x simply by limiting the fuel cell usage
Tell me HOW you reverse engineered it. What tools did you use? What source did it wind up as? What problems did you encounter?
> Oh tell me about your rocket fuel cell usage
(Don't ask that question, because you don't care about rocket fuel cell usage; you care about whether this person is a good coder. Ask them questions about code and their personal software process!)
> What did they do for testing: We used Jest. Was it automated: Yes.
Obviously it's on you to tease out more than one-word answers.
If you want to have a conversation with somebody and learn if they're capable of something, the onus is on you to direct the conversation and get what you need. If you're willing to accept one word answers, then I'm thinking this "informal chat over a few hours" approach is not for you.
Agree with all your points, I'm just coming from the perspective of how I embarrassingly hired a guy who could answer a lot of these questions but couldn't solve the problems we needed him to solve.
And that switching to the "Hello, nice to meet you, okay let's open up Coderpad and solve this problem" approach, as "inhumane" as it sounds (and I KNOW we will continue to see these forum threads for years to come), actually WORKED to find some seriously amazing candidates who could showcase their skills LIVE.
It's like, there's knowing your implementation details, and there's actually implementing something.
Honestly, as a candidate, I prefer the technical challenge now. Partly because my brain isn't equipped to even remember deep implementation details of specific projects. Think about it, how much can you really remember from the last project you worked on? Is that result going to give you more concrete details than actual code on a small problem? I think companies will continue to use Coderpad because it just gets to a clear result faster.
Obviously the way this goes down is different for each interviewee and each interviewer.
I personally remember a lot of details from some of my favourite projects, but I wouldn't hesitate to say "let me grab my laptop and I'll show you" because it will be clearer. Then I'd walk the interviewer through all the details of the code, deployment, testing, etc. etc. etc.
I don't think we can evaluate whether someone can do a decent job in 45 minutes. Or an hour. Sure, we can spot and confirm a hopeless case in 20. But beyond that, an hour is simply not enough.
I've been involved in hiring and interviewing for close to 15 years. The best results have been with candidates with whom I've arranged to have at least 90 minutes, and where that time ended up being well spent. It takes a while to get comfortable, to warm up, to establish a common ground. And it sure as hell takes time to actually discuss a technical problem, whether it's design or a programming problem, as the solution unfolds.
The desire to shoehorn an interview into 1-hour slots at most is not about finding the best candidates. From where I look at things, it's designed around the idea that most business meetings are booked for one hour each, and the cadence of the day must fit the business of, well, doing business. And it kind of works, because nearly everyone has the context and shape of the problem fairly clear in their heads.
But for an interview? A process where, by definition, you are dealing with people who are not well versed in your business? Companies book 1-hour interview slots because it's convenient - for their employees, including those who have no part in the interview process but who are expected to attend other 1-hour meetings with the people who are involved.
The gauntlet of 1-hour interviews feels like a very much intended consequence of the organisation thinking in terms of 1-hour slots for everything. The result is a grueling exercise very few like, and almost everyone with experience despises. It's bad for the candidates, it's bad for interviewers, and I'm pretty sure it's bad for the companies.
But it keeps getting done that way because the cargo-cult of 1-hour slots for everything cannot be reasoned with, or deviated from, results be damned.
Just think how well you would do the engineering and programming part of your profession if you had to carve everything into 1-hour slots. After all, PG wrote about it back in 2009: http://www.paulgraham.com/makersschedule.html
I’ve done over a hundred technical interviews in the past 5 years. You can tell a lot about someone by quickly glancing over a project’s source code, comments, documentation, automation, etc., versus talking to them for an hour or two. (To be clear, I’m not talking about live coding or take-home problems.)
There are a bunch of people who can sound smart just by referencing stuff they read online or in books.
I don’t focus on specific technologies or frameworks because everyone has a different background. However, I do focus on CS fundamentals and software engineering practices. These don’t say much about actual coding skills.
You can figure out whether they actually understand what they’ve done by asking the right questions. E.g. not “how did you do that” but “why did you do it this way and not that way?”