
Technical Assessments Should be Open Source

The Hiring Problem

This isn't your typical rant about the cargo-culting of Leetcode interviews (although that is a related point). There is a problem today with the process of identifying technical talent.

The assessments we use to evaluate candidates are too heavily weighted against them. Algorithmic coding questions are biased against those without a CS background. Take-home assessments often demand too much time, or are suspected of being real engineering work outsourced for free. And pair programming interviews depend largely on the skill of the interviewer: how they deliver the prompt and how well they can help the candidate get unstuck.

Consider for a moment that even the most studious Leetcode practitioner fails to get consistent results across interviews. If these interviews really were standardized, it would stand to reason that an offer from one large company should mean offers from all of the others as well, yet you seldom see this in the industry.

All of this leaves interview performance a poor signal of individual ability. As a result, individuals have to apply to hundreds of jobs and companies have to spend engineering time filtering through hundreds of applications. Put simply, there is an inefficient allocation of scarce resources, mainly due to our dubious ability to identify talent.

Why are things like this? Why aren't more people working on fixing this problem? To be honest, I'm not quite sure. It's almost a rite of passage for budding engineers to complain about some aspect of the hiring process (this blog post is mine). The answer might be simple: a developer-first method for evaluating candidates either cannot be imagined or is not profitable enough as a venture. Of the two, and based on history, the latter seems more likely.

Some History

There was a recent Hacker News thread about StarFighter, a "recruiting CTF" game where users were tasked with writing bots to battle other players and AI bots in an online multiplayer environment. The players who wrote the best-performing bots were then referred to companies for open positions. The company and the game have long since shut down, and a user wanted to know if any retrospective had been shared on the reasons why.

In the thread, one of the creators of StarFighter explained that the main reason the idea never took off was that companies would often reject the candidates who were referred. In addition, despite being recommended, candidates still had to go through each company's hiring process anyway.

Another top commenter recounted their experience working at a similar company. They claimed that most "companies don't have a screening problem, they have a sourcing problem" and that it is tricky to build a platform that attracts seasoned engineers, which is "what all recruiters want most of all."

Tangentially related: another company, Sourceress, tried to use machine learning to automate sourcing candidates for companies. However, they shut down for one reason or another, citing problems with their business model. The founders (who are now the founders of Imbue) mentioned that the more value they delivered, the more they would lose their best customers: once a company had closed a role using Sourceress, it no longer needed them. This lack of stickiness meant that Sourceress always had to keep finding new companies to balance their high churn. That last point is speculative, but given that Sourceress ceased operations, it is likely they never achieved the mid-stage growth they needed.

Learning From History

There are a few patterns one can glean from these past examples. For one, these platforms are largely focused on the entry-level market, which is in some ways evidence of prioritizing companies over developers. Fresh graduates and career-changers are either naively unaware of bad interview experiences or consciously willing to put up with them, while seasoned developers can afford to be more picky. The more important point, however, is that many of these recruiting companies are missing out on delivering a disproportionate amount of value because they ignore the small subset of seasoned developers who are practically begging for something better in this space.

Another obvious pattern worth mentioning is the high churn among both customers and candidates. In theory this means the best sourcing companies have high throughput, but since these companies are all competing for the same valuable signal in the noise of early-career candidates, it is difficult to project sustainable growth.

Putting it all together, most platforms that focus on evaluating candidates eventually end up becoming sourcing pipelines in order to remain profitable. These companies all use essentially the same coding tests to filter for promising early-career engineers, who, at the end of the day, are not the most pressing need for smaller companies. Larger companies usually already have their own pipeline and process in place and therefore don't need to rely on outside platforms; even if they do, there are plenty of options to choose from. These factors make it difficult to grow, especially considering that the faster such a company grows, the more churn it experiences.

High competition. High churn. High volume. Low signal. Dubious growth. It is difficult to win in this space unless you rethink the whole approach.

A Potential Solution?

So if assessment companies become sourcing companies and sourcing companies suffer from a flawed business model, how does one build a business that fixes the problem with hiring in our industry?

Working backwards, we should create a platform that attracts seasoned developers. We can do this by designing trustworthy assessments that respect their time. Coding questions are out, since they over-index on algorithmic knowledge, which is more relevant for CS graduates than senior engineers. Take-home assessments are out too, since they ask for too much upfront investment. Pair programming interviews are potentially viable, but they require synchronous investment on both sides, making them expensive (though probably worth the cost for a good senior engineer).

I think there's a better solution: open source asynchronous debugging interviews. The idea is to take a piece of open source code, introduce some random bugs (perhaps with the assistance of AI), and ask the candidate to get the code back into a working state. Debugging as a task is shorter than building new features, assesses the ability to both read and write code, and is arguably a more interesting problem. Fixing bugs in a broken version of a popular open source library you've used in the past is probably more motivating than implementing features for a random CRUD app.
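To make the idea concrete, here is a minimal sketch of what automated bug injection could look like. It assumes a Python codebase and uses a single kind of mutation (flipping a comparison operator); the names and structure are mine, not an existing tool's.

```python
import ast
import random

# Comparison operators to swap when "breaking" the code.
FLIPS = {
    ast.Lt: ast.GtE, ast.GtE: ast.Lt,
    ast.Gt: ast.LtE, ast.LtE: ast.Gt,
    ast.Eq: ast.NotEq, ast.NotEq: ast.Eq,
}

class ComparisonFlipper(ast.NodeTransformer):
    """Flips exactly one comparison operator, chosen by index."""

    def __init__(self, target_index: int):
        self.target_index = target_index
        self.seen = 0

    def visit_Compare(self, node: ast.Compare) -> ast.Compare:
        self.generic_visit(node)
        for i, op in enumerate(node.ops):
            if type(op) in FLIPS:
                if self.seen == self.target_index:
                    node.ops[i] = FLIPS[type(op)]()
                self.seen += 1
        return node

def inject_bug(source: str, seed: int = 0) -> str:
    """Return the source with one comparison operator flipped, if any exist."""
    tree = ast.parse(source)
    total = sum(
        1
        for node in ast.walk(tree)
        for op in getattr(node, "ops", [])
        if type(op) in FLIPS
    )
    if total == 0:
        return source  # nothing to mutate
    random.seed(seed)
    tree = ComparisonFlipper(random.randrange(total)).visit(tree)
    return ast.unparse(tree)

if __name__ == "__main__":
    broken = inject_bug("def is_adult(age):\n    return age >= 18\n")
    print(broken)  # the >= becomes <, producing a subtle, findable bug
```

In practice you would want a richer catalog of mutations (off-by-one constants, swapped arguments, dropped error handling) plus a check that the injected bug actually makes the project's test suite fail.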

To go even further, why not create an open source HackerRank-like platform that hosts dozens of these debugging exercises, as a sort of library for engineers to practice their skills? Why not make that library open for any developer to contribute to? A true hiring platform for the developers, by the developers.
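What might a contributed exercise look like in such a library? Below is one hypothetical shape for an entry; every field name here is an assumption of mine about what a community catalog would need to pin down, not a spec from any existing platform.

```python
from dataclasses import dataclass, field

# Hypothetical shape for a contributed debugging exercise.
@dataclass
class DebuggingExercise:
    slug: str                   # stable identifier for the exercise
    upstream_repo: str          # the real open source project being used
    pinned_commit: str          # known-good commit the exercise is built from
    bug_patch: str              # path to the diff that introduces the bug(s)
    verify_command: str         # command that must pass once the bug is fixed
    time_estimate_minutes: int = 60
    tags: list[str] = field(default_factory=list)

# Example entry a contributor might add (all values are made up).
example = DebuggingExercise(
    slug="httpclient-timeout-bug",
    upstream_repo="https://github.com/example/httpclient",
    pinned_commit="0000000",
    bug_patch="exercises/httpclient-timeout-bug/bug.diff",
    verify_command="pytest tests/test_timeouts.py",
    tags=["networking", "intermediate"],
)
```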

The Business

Maybe I’m way off the mark. But for the sake of argument, let’s assume the above proposal is a hiring platform that seasoned developers would love. The main question is, how does one avoid the other pitfall of high churn?

In my view, there is really nothing you can do about churn with an assessment platform alone. The only way to curb it is to bundle assessments with other, stickier products. Most companies attempt this by rolling out their own applicant tracking system (ATS), but it is difficult to be compelling enough to compete with companies specializing in that one product category.

There are two product categories that I believe are potentially untapped opportunities and that bundle well with an assessment-based sourcing pipeline. But in an attempt to start some discussion on the topic, I leave it as an exercise for the reader to think about (and as a potential part two of this blog post for myself).

The general strategy for growth would be to use the new assessments platform as a pivot point to explore other customer needs that complement the initial offering. If you have a platform that developers love, you'll invariably attract the best developers, which for an open source product will not only strengthen the product itself but also create a unique competitive advantage that others will find difficult to replicate.

Conclusion

So there you have it, another rant to add to the pile on the topic of technical hiring. This is all just a small, cursory glance at a few companies and trends in the space. There are likely many profitable recruiting-focused companies doing things differently. Maybe my analysis is completely wrong, but I would be more than happy to hear why.

I'm thinking about building a product that aligns with some of the ideas in this post. If you’re particularly incensed, you can email me and convince me why this is a terrible idea. Please reach out!
