When we started Triplebyte, we thought there would be pretty much a straight line from bad programmer to great programmer, and we'd just have to figure out where to put the cutoff when deciding whether to work with an engineer. The biggest surprise has been just how much disagreement there is among companies about what a "great engineer" actually means.
That's when we realized we were actually working on a mapping problem, and the first step was figuring out a universal set of criteria that all companies care about. Then, if we could assign each company the right weight for each attribute, we could route engineers only to companies they'll be a strong technical fit for.
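As a rough illustration of that weighting idea, the fit between an engineer and a company could be a weighted sum of the engineer's scores over the attributes the company cares about. The attribute names, scores, and company names below are all hypothetical, just a sketch of the mapping:

```python
# Hypothetical sketch of criteria-weighted matching.
# All attribute names, scores, and companies are invented for illustration.

def match_score(engineer_scores, company_weights):
    """Weighted sum of the engineer's scores over the attributes a company weights."""
    return sum(company_weights.get(attr, 0.0) * score
               for attr, score in engineer_scores.items())

engineer = {"applied_problem_solving": 0.9, "professional_code": 0.5, "algorithms": 0.7}

companies = {
    "ProcessCo": {"professional_code": 0.8, "applied_problem_solving": 0.2},
    "HackerCo":  {"applied_problem_solving": 0.7, "algorithms": 0.3},
}

# Route the engineer to companies ranked by fit.
ranked = sorted(companies, key=lambda c: match_score(engineer, companies[c]), reverse=True)
print(ranked)
```

With these made-up numbers the fast-and-iterative engineer ranks higher for the company that weights problem solving over code polish, which is the routing behavior described above.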
It'd be great to get thoughts on the criteria we chose, and to hear from engineers who have done a lot of technical interviewing.
To make it even more complicated, most companies might not even know what they need. The classic example is algorithms: many companies will say they care about algorithms, but few of them actually need those skills.
"Back-end web understanding" seems oddly specific compared to everything else on that list.
Why that and not something more general that might encompass other kinds of domain-specific knowledge? There are a lot of companies on your list that seem like they might care more about other skills that don't really fit anywhere else. (Experience working with databases, for example, for one of the 7 or so database companies.)
I assume the (very broad) criteria listed on the blog are supersets of very specific sub-criteria. Maybe if you could elaborate on exactly what those sub-criteria are, it would make things clearer.
I still don't understand why I would want to go through the hassle of doing an onsite interview with TripleByte only to have to go through further onsite interviews at the hiring companies?
If TripleByte's onsite interview allowed me to skip the onsite at the hiring company, I'd be all for it, but as it stands it's just an extra layer of friction.
For the record, I've had zero problems applying to companies by either emailing them or getting contacted by them via LinkedIn, email, etc. I just don't understand what benefit they bring at the moment. Maybe if the job market tightened and they were exclusive providers for companies, then sure, but all the SV companies have teams of recruiters emailing people all day long. As a candidate, there's no reason I would want to go through their onsite.
Finding the right company to join is hard: you have to find which companies are doing interesting things that match your interests, and then narrow down to the ones where you'll be both a technical and a cultural fit. Failed interviews are a big time suck, and we see that most people only have the stamina to interview with a few companies; they'll often accept one of the first offers they get rather than optimizing for the companies they're most excited about. We have the data to match you with companies you'll be a strong technical fit for, which saves you from wasting time speaking to companies that don't value your particular engineering skills. The end result is a more efficient job search that gives you more options while speaking with fewer companies.
We also reduce the total amount of time engineers have to spend in technical interviews. Triplebyte candidates skip the technical phone screens, which usually take at least an hour per company. If you're speaking with at least 3 companies (which everyone working with us is), you've already saved time, as our technical interview is 2.5 hours.
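The break-even arithmetic in that claim is easy to sanity-check, using the figures from the comment (at least one hour of phone screen per company versus one 2.5 hour shared interview):

```python
# Sanity check of the time-savings claim: one shared 2.5 hour interview
# versus a (at least) 1 hour phone screen per company.
SHARED_INTERVIEW_HOURS = 2.5
PHONE_SCREEN_HOURS_PER_COMPANY = 1.0

def hours_saved(num_companies):
    """Net hours saved by replacing per-company screens with the shared interview."""
    return num_companies * PHONE_SCREEN_HOURS_PER_COMPANY - SHARED_INTERVIEW_HOURS

print(hours_saved(3))
```

At two companies the shared interview costs more time than it saves; at three or more it comes out ahead, matching the claim above.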
Happy to talk more about this, harj AT triplebyte.
While hiring is indeed a big problem that can be addressed with a data-driven approach, I'm not sure the approach of "we have data, just trust us" is fair to all parties.
The naming of Engineering Genome Project is styled after Pandora's Music Genome Project. The difference is that Pandora uses data to provide relevant and immediately verifiable results for the user, such as music in the same genre or by the same artist. In contrast, an Engineering Genome Project uses criteria such as "applied problem solving" and "professional code" that are impossible for a user to interpret intuitively.
Well, the engineers who go through our process are in a good position to verify the effectiveness of the matching. Granted, the bar to reach that point and check the quality is higher than it is for Pandora (readers can't go check right now what their matches would be). But I don't think that's an argument against trying to do a better job of matching engineers with companies. This is an important area that's been largely overlooked.
The categories that you mention (applied problem solving and professional code) really are important. Companies differ widely in how much they care about those two things (solving problems in the interview effectively vs. showing clean, well-structured code and a good testing process). When an effective but iterative (and sometimes sloppy) programmer interviews at a company that values process highly, the result is wasted time and pain for everyone.
I think this writeup is a tad lengthy. It's not until the fifth paragraph that I understand what's even going on.
>Intelligent matching with software is how hiring should work. Failed technical interviews are a big loss for both sides. They cost companies their most valuable resource, engineering time. Applicants lose time they could have spent interviewing with another company that would have been a better fit.
I feel like that should have been the headline for this. For a company that is meant to match people to companies, I think their external communication should be excellent not just good. How can I trust that this company will communicate my strengths and weaknesses in a way other people can understand if it's difficult for me to follow one of their flagship blog posts?
Whenever someone decides to hire, all of their criteria are heavily biased towards the skills candidates already possess.
I wonder if someone can come up with a reasonably accurate way to determine how well or easily a candidate can acquire particular skills.
I realize this line of thought might not be popular for most startups who would want someone to get going as soon as they start. But if you're having a tough time hiring a Machine Learning engineer and you get applications from a bunch of smart folks who want to gain experience in Machine Learning, would it be a good idea to give them a shot?
The traditional 'puzzle solving' in interviews was probably geared in this direction, but I'm wondering if there are better ways to gauge this.
This is something we're able to do too, by encouraging people to reapply and tracking how much they've improved between technical interviews. It would make sense for companies to do this as well, but they don't, mostly because it's never any single person's area of focus.
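Tracking improvement between reapplications could be as simple as diffing per-attribute scores across interview attempts. The engineer ID, dates, attributes, and scores below are all made up for illustration:

```python
# Hypothetical sketch: track per-attribute improvement across reapplications.
# Engineer IDs, dates, attributes, and scores are invented for illustration.
from datetime import date

history = {
    "engineer_42": [
        (date(2015, 1, 10), {"algorithms": 0.4, "professional_code": 0.6}),
        (date(2015, 7, 2),  {"algorithms": 0.7, "professional_code": 0.6}),
    ],
}

def improvement(attempts):
    """Per-attribute score delta between the first and latest interview."""
    (_, first), (_, latest) = attempts[0], attempts[-1]
    return {attr: latest[attr] - first[attr] for attr in first}

print(improvement(history["engineer_42"]))
```

A single shared record like this is exactly what individual companies lack, since no one there owns re-evaluating past candidates.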
I got rejected by Triplebyte and then hired by Google a while later. I prepared much more for the Google interview though, so I don't know if Triplebyte was at fault.
Employers don't actually (in most cases) have a very good grasp of what qualities they select for. It's a function of the engineers doing the interviews and the engineering culture, and most companies are not aware of how much this differs between them. We model what each company looks for by sending them candidates with specific attributes and reading their feedback (we get honest feedback from companies after interviews, which is pretty rare). We're then able to see how feedback on the same candidate differs between companies.
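One plausible (entirely hypothetical) way to turn that feedback into a model: treat a company's pass/fail decisions on candidates with known attribute scores as labeled data and fit per-attribute weights, for instance with a tiny logistic regression. The candidate vectors and labels here are synthetic:

```python
import math

# Hypothetical sketch: infer which attributes a company actually weights
# by fitting a logistic model to its pass/fail interview feedback.
# Attribute order: [applied_problem_solving, professional_code].
# Candidates and labels (1 = passed the onsite) are synthetic.
candidates = [
    ([0.9, 0.2], 1),  # strong problem solving, sloppy code -> passed
    ([0.8, 0.3], 1),
    ([0.3, 0.9], 0),  # clean code, weaker problem solving -> failed
    ([0.2, 0.8], 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_weights(data, lr=0.5, epochs=2000):
    """Plain stochastic gradient descent on logistic loss."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

w, b = fit_weights(candidates)
# This company's revealed preference: a much larger weight on
# applied problem solving than on professional code.
print(w)
```

Comparing the fitted weights for the same candidate pool across companies would surface exactly the disagreement described above, even when the companies themselves can't articulate it.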
This (determining how easily a candidate can acquire particular skills) is known as an "IQ test".
It's a shame it's limited to just engineers. I've been looking for a recruiter company like this for data science.
Edit to my own question....
The 7 genome dimensions look really reasonable. But hypothetically speaking, I still want it all!