Stop Interviewing with Leet Code

  • All of these supposed "flaws" of leetcode are by design. Big companies want people who are smart enough to do the work, but obedient enough to put up with all the bullshit that comes with working at a big company. What person better matches that than someone who's able and willing to study for and pass a tech version of the SAT? Every anti-leetcode article I read is some version of "leetcode is bad because it measures the wrong things." No, we all know it measures those things, and those are exactly the things the measurers want to measure.

    You might ask, so why do startups do leetcode too? I heard startups are supposed to be, uh, innovating, developing new technology, and working on hard, meaningful problems? Shouldn't they want brilliant, super effective people, instead of smart-enough, obedient workers? Apparently not. Apparently they want the same workers bigcos want. The implication of this is left as an exercise to the reader.

  • Someone who builds a truly novel technology solution involving hundreds of hours of effort gets filtered out of an interview involving contrived scenarios. You may have built the next generation X, but given an array of strings and a fixed width, can you format the text such that each line has exactly maxWidth characters and is fully justified -- in the next 30 minutes? Maybe you should have cultivated that skillset instead, because around here we value parlor tricks more than real world accomplishments.
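
    For reference, here is roughly what that 30-minute ask boils down to - a minimal greedy full-justification sketch in Python (the function and variable names are mine, not from any particular interview):

        def full_justify(words, max_width):
            # Greedily pack words: a word joins the current line only if it
            # still fits with at least one space between each pair of words.
            lines, cur, cur_len = [], [], 0
            for w in words:
                if cur and cur_len + len(cur) + len(w) > max_width:
                    lines.append(cur)
                    cur, cur_len = [], 0
                cur.append(w)
                cur_len += len(w)
            lines.append(cur)

            out = []
            for i, line in enumerate(lines):
                if i == len(lines) - 1 or len(line) == 1:
                    # The last line and single-word lines are left-justified.
                    out.append(" ".join(line).ljust(max_width))
                else:
                    spaces = max_width - sum(len(w) for w in line)
                    q, r = divmod(spaces, len(line) - 1)
                    # Leftmost gaps absorb the remainder so the line is exactly max_width wide.
                    padded = [w + " " * (q + (j < r)) for j, w in enumerate(line[:-1])]
                    out.append("".join(padded) + line[-1])
            return out

    Getting the edge cases (single-word lines, the final line) right on a whiteboard in half an hour is exactly the parlor trick in question.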

  • In my team we do technical interviews in three steps:

    - an algorithmic challenge. It's related to what we do day to day. I work in domain names, so we ask the candidate to parse a domain name. There are oddities with domain names, so we check multiple things: does the candidate know what basic string manipulation functions exist? Do they ask questions to get more info? How do they react when we give additional info that breaks the code they've written so far? What we don't check: whether the code compiles or actually works. We don't care. We explicitly tell the candidate they can write pseudocode or comments defining the steps of the algorithm. We're interested in their reasoning. (A rough sketch of what this challenge looks like is at the end of this comment.)

    - an architecture challenge. We ask the candidate how they would scale an API worldwide. There's no code; it's an open discussion. They can talk about whatever they want: asynchronicity, statelessness, load balancing, replication, anycast, whatever. We can also guide the candidate to see whether they know some specific concepts. For example, I can ask "what would you do if you have a GET REST endpoint that returns the same thing every time" and expect "cache its result". Even with this question I get different answers (which is great): some will talk about HTTP cache headers, others will talk about Redis or in-memory caching; rarely do candidates talk about both.

    - a refactoring challenge. We work with tons of legacy code. So we show the candidate a crappy piece of code with performance issues and no tests, and ask them what their strategy would be. No writing code here, just thinking and discussion.

    So yeah, just a quick screening to check if the candidate can write basic code (you'd be surprised by the results), and open discussions on our day-to-day problems.
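
    (To make the first step concrete, here is a naive Python sketch of the kind of starting point we have in mind - hypothetical code, not our actual exercise. The domain-name "oddities" - multi-label public suffixes like co.uk, trailing dots, case - are exactly the extra info we feed in to see how the candidate reacts.)

        def split_domain(name):
            # Naive first pass: lowercase, strip a trailing dot, split on ".".
            labels = name.lower().rstrip(".").split(".")
            if len(labels) < 2:
                raise ValueError("not a registrable domain: %r" % name)
            tld = labels[-1]
            registrable = ".".join(labels[-2:])
            subdomain = ".".join(labels[:-2])
            return subdomain, registrable, tld

        # Fine for www.example.com -> ("www", "example.com", "com"), but wrong for
        # www.example.co.uk: "co.uk" is a public suffix, so the registrable domain
        # should be "example.co.uk". That extra detail is the kind of info that
        # forces the candidate to rethink the code they have written so far.
        print(split_domain("www.example.co.uk"))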

  • I just tell recruiters I simply won't. Turned down continuing with an interview process with a company just last week for this reason.

    I'll do take-home programming, I'll do collaborative debugging and coding in a shared IDE with something that looks like a real project, I'll do system design etc. interviews, talk to you about programming, etc. and I'll show you my GitHub etc. projects and you can judge from that, and my 20 year long resume, whether I might be a fit.

    Want me to write CS-class algorithm & data structure problems on a timed clock on a whiteboard or equivalent? You've just told me everything I need to know about your engineering culture.

    I worked at Google for 10 years and I hated their interview process; it needs to stop spreading to the rest of the job market. If enough of us say no to this process, it will end.

    And to hiring managers: most of you are not Google (thankfully) and don't have a bottomless pit of talent to choose from. Stop pretending otherwise. You'll get better results. If you feel you have to do it, save it for new grads and stop using it for senior talent. All you're testing for is whether people practiced leetcode or whether they're straight out of a CS program.

  • I interviewed hundreds of C++ developers as a freelance assessment interviewer. Most of the candidates I interviewed wanted to work in the automotive industry in Europe.

    Many candidates (maybe half of them) are fancy talkers without any skill in writing code. I really don't know why they are applying for dev jobs. It is easy to filter out these people with a very simple coding test.

    I agree with the article that 'leetcode' tests (find that complicated algorithm in 30 minutes while I am staring at you) are bad. But I think coding tests are good! Give the candidate a really simple coding task with stuff they normally do every day: create and delete objects, fill arrays, iterate over arrays, and so on. 50% of the candidates will fail! The rest are OK engineers.

  • Don't get me wrong, I hate LeetCode-style interviews as much as the next guy. In fact I really, really, really suck at them! Not sure if that says more about my ability than anything, but c'est la vie.

    In defence of LeetCode-style questions, I do think they work, and very well I may add - with the caveat that you need the throughput of candidates to make them work well. Their ability to filter out 'those who can't code' in an efficient manner, while sacrificing a small number of 'those who can code', greatly outweighs the alternatives - which also need to fit into a 1-hour timebox and be objective while favouring the positive cases (I think I got that the right way round).

    My two cents would be more around the way in which they are conducted. In my experience I've found conflict with the interviewer more than with the process itself - with interviewers in my past lacking... empathy (may not be the right word) for the person on the other end of the screen/table who is feeling flustered, nervous or downright stupid that they're struggling to solve a simple fizz-buzz/reverse-string problem. That leads to a snowball effect, and piling onto it can affect the candidate in quite a spectacular way. The best interviewer I've had asked if I was alright and got me a glass of water - props to that guy!

    I dunno - I've just come to terms with having to learn how to play the game, even if I find that part of the game really hard and in some ways unfair. Such is life.

  • In a previous job of mine, we would show candidates a printout of some buggy code, and ask them to find the bugs. We would leave the room and let the candidate work through it on their own. The code in question was basic algorithms and data structure stuff in C++, such as inserting into a doubly-linked list. I always thought it was a good exercise. Suits slow-thinkers and nervous people, and it's a good test of coding ability since if you can spot a bug then you can clearly read and understand the code.
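
    A Python rendition of the kind of planted bug such a printout might contain (the original exercise was C++; this is a hypothetical illustration, not the actual code we used):

        class Node:
            def __init__(self, value):
                self.value = value
                self.prev = None
                self.next = None

        def insert_after(node, value):
            """Insert a new node directly after `node` in a doubly-linked list."""
            new = Node(value)
            new.prev = node
            new.next = node.next
            node.next = new
            # Bug for the candidate to spot: if node.next was not None before the
            # insertion, its .prev still points at `node` instead of `new`, so the
            # backward links are now inconsistent. The missing line is:
            #     if new.next is not None: new.next.prev = new
            return new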

  • Considering the normalization of spending hundreds of hours grinding LC questions, and the industry built around whiteboard interview preparation, my (n=1) conclusion is that LC interviews are not about technical assessment at all.

    It's an assessment that's designed to find people who are ready to submit to an endless grind with little to no skepticism. Developers who question the technical usefulness of LC interviews are simply not the target audience anymore. The target audience seems to be potential employees that are hungry and without leverage.

  • Having conducted some interviews myself, I can see some value in having the interviewee do a small coding exercise when they do not have any personal projects to show. I think it can filter out some false positives: candidates that can talk well but can't actually write code (you'd be surprised how many candidates can't).

    What surprises me most is the lack of flexibility in the process. If a candidate shows up with a broad portfolio, I'd rather talk about that than do some random coding problem. Yet our HR manager insists on the fixed program. This is worse when the candidate is interviewing for a senior role, where I don't really care about the coding exercise. Then I am mostly interested in their past experiences and knowledge of how to build things that don't fall apart after six months.

    Again it is definitely process over people here… not sure if it is better in other places.

    I would also say that these exercises are most effective when they are quite simple. They let you test 'can this person write a function'. The complicated ones often filter more for people who have studied those types of problems. Harder problems != better coder. At least not for the projects I work on, which are more about integrating existing services than inventing novel, highly efficient code.

  • You know after going through like 10 or so interviews this year while trying to find a new job, I grew to appreciate leetcode tests as a candidate more.

    The first thing you start to realize when interviewing is that every company has their own unique process for interviewing. The second thing you realize is that you're not going to ace every process every time, it's a roulette wheel you're spinning to see if this specific process and this specific day and this specific set of interviewers and this specific mood you're in are able to align in a way that they feel confident giving you an offer (leetcode or no leetcode, doesn't really play in to this factor).

    The nice thing about leetcode tests as a candidate is that you can study for them, go through a few rounds of it with different companies, and get better at it and know how to improve for the next interview. When companies drop the leetcode tests you end up getting judged on the arbitrary criteria and testing that they devised in its place. If you come out of that interview not doing well, you can't really use that as practice for the next one -- because the next company you interview at may have a process/test that doesn't overlap at all with that previous interview. Now you don't have to practice leetcode and spend an hour doing something that's not directly applicable to your day to day job, but instead you're subjected more to the whims of randomness and whether you were prepared to satisfy the process they came up with instead -- which may not be shared by other companies.

    Leetcode isn't great, but it's also not that bad. At some companies (e.g. Google, Facebook) you will have to get very in depth with practicing and knowing data structures and algorithms to do well in the interviews. At a lot of other companies you can pass the leetcode with a lot less work; you just need a basic refresher on graphs, trees, linked lists, etc. Other companies won't ask you leetcode at all (but that doesn't mean the job/company is good in other areas either; it's always a tradeoff).

  • > So What To Use Instead?

    > Deal with ambiguity, Reviewing code, understanding what it does, finding gaps, Testing, Code structure, Cleanliness, Learning new concepts

    Yeah, no.

    A lot of these are culture. I'm fairly confident I can teach someone smart and competent to write clean code, add tests, and properly modularise the project.

    It's called training (progression from "junior" to "senior"), I think more companies need to invest in it.

    What you can't teach someone is how to (1) be smart, (2) understand how computers work, and (3) be passionate about tech. That's what interviews are supposed to test, and leetcode (plus some deep discussions, e.g. "how does a hashtable work", leading deeper into the details of CPU, memory, instruction scheduling, optimisation, ...) does that.

  • People who call for Leet Code either never tried to hire actually high-skill developers or fool themselves into thinking that they're the cool kid in town. You might think it's better than any other alternative. I agree - it's been an amazing filter for weeding out companies that are, on average, quite crap to work for. From my personal experience, they are overburdened with process to a degree that even if they hired the best devs out there, they wouldn't be able to deliver anything because of all the red tape.

    The best jobs I've had to date, I met the person leading the company/project/team, we had a chat, talked about what tech we like and dislike, how we'd structure a product, and what our preferences are for the process around everything. And that's the key thing - it was always a discussion, not a Q&A. The key is that the candidate is not the only one who needs to know their stuff - so does the lead.

    As a side effect, all of those jobs paid way above the market. Again, personal experience, but the higher up you go, the less BS like "we need leetcode to hire" you get. Unless you're Facebook and you have a genuine problem of too many qualified engineers constantly applying, you should aim to only disqualify truly hopeless cases.

    The company can't hide behind process and expect great hires. Early in my career, in the small city I was working in (where, in turn, the dev community knows what other devs are doing), our company turned down so many devs who within a year or two were among the top performers, just because of the leadership's insistence on take-home tests, Q&A interviews and gotcha-style questions...

    So please - do continue using leetcode, it makes filtering your company out so much easier and I don't need to go through bullshit stages to know that the leadership has no balls to make the hard calls when it comes to hiring & firing.

  • This is a tired topic at this point. The mistakes people make are:

    1. An interview process exists to fill a position. It doesn't exist to fairly assess an individual candidate. Candidates would like that. That's not the point. If there are 10 candidates and the employer fills the role successfully, they've achieved their goal even if someone great was filtered out;

    2. FizzBuzz came about because many people talked a great game but couldn't code a for loop (see the sketch at the end of this comment). Giving a simple coding problem is an excellent negative filter. Doing great at the problem means nothing; and

    3. Interviewers make the mistake of thinking FizzBuzz is too easy so they give harder problems. This is a mistake that defeats the entire purpose of the filter. Stop doing this.

    These points remain constant in every such engineering hiring or interviewing thread.
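
    For reference, the entire "negative filter" meant in point 2 is on the order of this (any working variant is fine):

        for i in range(1, 101):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)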

  • It's much better than the alternative and ideally only one part of the interview process.

    Ultimately the proof is in the pudding, I've had loads of candidates that could barely code and Leetcode-style problems are a great filter against that.

    Otherwise you risk just getting PM-style bullshitters as engineers, who talk persuasively about projects that other people actually implemented.

  • > A first alternative is to look at some of the candidate’s code to begin with.

    This is biased towards people who code for free in their free time. I'd call this terrible advice for most companies, since it filters out the vast majority of qualified candidates.

    I have some open source contributions on my GH account, but none of them are representative of how I code since they are all bug fixes and/or small enhancements which are shoehorned into an existing codebase, since my goal is to add a feature with the smallest number of code changes (in order to increase the likelihood of a PR getting accepted).

  • Every time this comes up, lots of people start with the assumption that coding problems have a large amount of false negatives.

    Because of this perceived "truth", I had the same worry when we started to implement a coding problem. Rather than guess, we decided to measure it: for the first six months, we used a wide filter (50% pass rate, actually like 65-80% of people who didn't cheat).

    What we found was that there were zero candidates in the bottom 66% who passed the rest of the interviews. The plagiarism detector also had no false positives (based on manual review). So at least on this sample, we found that we could screen out about 80% of applicants without having _any_ false negatives.

    I'm sure there are some bad employers misusing coding tests, just like with any tool, but I have to imagine many others have done similar experiments and found their tests to be effective.

  • Last xmas I worked on a side project that ended up requiring a moderately substantial algorithm; much more involved than anything I've ever had to do in a LeetCode interview - there's absolutely no way I'd be able to produce it under those circumstances. Yet, the algorithm wasn't the hard part.

    The hard part was all the exploration I had to do around the problem to get to the point where I understood the constraints well enough to solve it. I had to rewrite my attempted solution a number of times as my understanding grew. In my opinion, this represents the real thing we should be trying to test - a candidate's ability to unearth the true definition of the problem. When it comes to writing real world algorithms, whether you can solve it in 30 minutes or 3 days is (mostly) irrelevant, because it makes up such a small part of the overall engineering time.

    You could squint and make a case that this is what LeetCode challenges are doing, but I'd only agree if we removed the requirement/pressure to have working code passing all the tests within the time allocation; and to be honest, most of them just hand you all the constraints on a platter.

  • Just talk to the person, ask them about projects they’ve worked on, problems they’ve solved etc. you’ll learn far more about them that way than getting them to put on a dog and pony show at a whiteboard!

  • After hundreds of interviews for a FAANG I've become more and more wary of this format. In particular, I find the leetcode-style "problem solving" question to have the highest rate of false negatives. It's common for candidates to bomb that question and do well on the rest.

    My favorite way to judge candidates now is by asking a "clean code" question. This doesn't refer to Uncle Bob's Clean Code, but to code that is simple, maintainable, and extensible. I give candidates a simple and slightly ambiguous problem statement, usually revolving around "write a library that does X". I expect candidates to ask questions and clarify the ambiguities, then proceed to define the APIs, and finally write the code. The implementation is straightforward, with no tricks or logical puzzles - it only uses simple structures such as lists, hashmaps, and loops. Then I ask one or two follow-up questions with more requirements, such that they need to modify or extend their code. Depending on how their code is organized, this might be trivial or very complicated. (A hypothetical example of what I mean is sketched at the end of this comment.)

    I feel this format is the closest to on-the-job work and gives me a good feeling of what it would be like to work with these people. It also has a lot of freedom and allows one to peek inside the candidate's mindset. How do they deal with ambiguity? How do they approach API design? How do they handle incorrect values? Do they care about corner cases? It is also mostly devoid of what developers hate most, e.g. trick questions and obscure algorithms.
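
    A hypothetical instance of such a prompt - the example and all names below are made up for illustration, not an actual interview question - might be "write a small rate-limiter library". The first pass needs nothing beyond a dict and a list, and a follow-up like "now support a different limit per user" probes how extensible the API they designed really is:

        import time

        class RateLimiter:
            """Allow at most `limit` calls per `window_seconds` for a given key."""

            def __init__(self, limit, window_seconds):
                self.limit = limit
                self.window = window_seconds
                self._calls = {}  # key -> timestamps of recent calls

            def allow(self, key, now=None):
                now = time.monotonic() if now is None else now
                recent = [t for t in self._calls.get(key, []) if now - t < self.window]
                allowed = len(recent) < self.limit
                if allowed:
                    recent.append(now)
                self._calls[key] = recent
                return allowed

        limiter = RateLimiter(limit=3, window_seconds=60)
        print(limiter.allow("user-1"))  # True for the first three calls in any minute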

  • Been an interviewer and interviewee recently, so being on both sides of the track has given me some perspective.

    This is the current process that I think is fair and holistic:

    1. meeting with the candidate, our manager, and some devs talking about their past exp., our company, our team, and their wants

    2. Take home coding task based on our day to day work: This is linear with direct instructions for inputs and outputs; there is an optional part at the end for testing more tricky concepts. They are instructed to write clean and clear, no stress if they don’t finish, take their time with a week to do it (it’s a few hours work).

    3. Interview with them walking through their code on their machine and describing their thought process, field questions from them if any are left.

    Then we decide by a team discussion afterwards.

    Gives them their space to think, reduced pressure for candidates who are socially pressured.

    Thoughts?

    I personally detest leet code as a recruiting tool.

  • I used to work in a bank and wrote an interview script for my department (kdb+/q market data). It mainly consisted of having the candidate sit in front of an interpreter and walking them through a script which looked something like the following (a rough Python rendition of a couple of the steps is sketched at the end of this comment):

    1. Load a data file here

    2. Tell me some facts about the data

    3. Here's another dataset, can we use both to figure something out.

    4. One of the executions for this order is missing, how can we find which one.

    5. Here is a data feed, can you write a process to ingest the data and calculate something in real time.

    I by far preferred this system to the alternative which was to ask trivia questions and see if the candidate memorised the docs. There is of course some value in asking basics, or to elaborate etc. But on the spot algo questions are usually only useful in filtering people who either like leetcode problems, or have grinded them for the last 6 months.
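
    (A rough Python/pandas rendition of a couple of those steps, since the original was kdb+/q; the file and column names here are made up:)

        import pandas as pd

        # 1./2. Load a data file and state some facts about it.
        trades = pd.read_csv("trades.csv", parse_dates=["time"])
        print(len(trades), trades["sym"].nunique(), trades["price"].describe())

        # 3./4. Bring in another dataset and find the order with a missing execution.
        orders = pd.read_csv("orders.csv")
        executions = pd.read_csv("executions.csv")
        missing = orders[~orders["order_id"].isin(executions["order_id"])]
        print(missing)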

  • Totally against using LeetCode for interviewing engineers.

    However, when asking around about why people who use it do so, I found out it does have one irrefutable advantage: it stops people who can't code at all.

    From an engineer's point of view, LeetCode is a complete waste of everyone's time because it measures things that aren't factors in successful engineering (as TFA says).

    But from the non-technical manager's point of view it's awesome, because it gives a single, simple score for "how good is this engineer compared to the other ones?", and people who can't code at all can't complete it.

    The non-technical manager's worry when interviewing is that they hire a really expensive employee who can't do the job. But because they don't understand the tech, and the tech is complicated and even expert engineers spend lots of time fighting it to no apparent end, it's really hard to understand if an engineer is incompetent and bullshitting them, or actually good but the problem is hard. Having a nice, easy metric that stops the complete bullshitters from getting in solves a problem for them.

    What we need, obviously, is a professional association for software dev, that can then properly test us and verify that we can do the things we say we can do. But the industry has a lot of growing yet to do to get to a point where this is even possible.

  • I recently failed interviews at Google and Amazon, exactly at this sort of algorithmic problems. And that's quite far from my client application daily job. I knew it's not my strong side, I would have applied before they approached me otherwise. Now I can tell everyone - I knew quicker than Google, that I'm not the right fit ;)

  • I can relate so hard with the part which says it does not favour slow thinkers. I've faced this countless times - person at work asks me something, I reply "I'll think about it and get back to you", which inevitably leads to their disappointment.

    I am okay with disappointing people, but it can be unnerving when that disappointment means I miss out on a good opportunity.

  • To be honest, if I were an employer I probably would do leet code or similar. The logic would be that yes, many engineers are shy or nervous around people, but losing those candidates is worth the risk, since hiring someone bad whom you have to fire (wasting months of productive time at the company) and then re-hiring someone else is far more costly. Having made something impressive on GitHub is somewhat fakeable, since you can get a lot of help from the internet or even copy large chunks (that you may understand but didn't have to invent) from Stack Overflow. Someone who can do leet code at least has some hard-to-fake ability to reason and work under pressure.

    That said, I still don't like doing leet code interviews and I'm pretty bad at them, but that's the logic that I imagine goes on in an employer's mind (hence why the "they're missing out on some types of candidates" argument likely won't sway anyone reading these comments, I suspect).

  • Nice article - Leet code style interviews give minimal signal. The fact that we use the word Leetcode makes me think the company is hiring for the masses and it's going to be a boring job. I call such a company a Dinosaur.

    I think companies should offer candidates a choice of whether they prefer to submit code samples, do an at-home problem-solving exercise, or do an in-person exercise. This accommodates careful thinkers and adapts for anxiety during an interview.

    I do appreciate when companies ask relevant questions that they have come across rather than mundane Sudoku questions.

    I have interviewed with a few companies, and Stripe's interview style stands out. Coding questions are relevant day to day style questions.

    I would say Google, Amazon and Facebook set this trend and have spoiled it for all.

    Unfortunately, some companies cannot think on their feet to set a different approach. Maybe it's in your best interest to avoid these places.

  • Right now I get bombarded with headhunter requests; if you are going to ask me to leet code, my inbox is full of alternatives.

    Leet Code is only cargo culted because many candidates let it happen.

  • I've decided I'm going to start interviewing with a set of PRs for each role we hire for. Each PR will have obvious mistakes and complicated logic problems, as well as code that could be "refactored" once the whole PR has been read.

    This provides a few things: it gives us insight into how the interviewee problem-solves. Next, we can see how they respond to obvious fixes (would they be someone you'd want to send a PR to?). Finally, it tests their knowledge of the language and APIs, hopefully much better than Leet Code can. I would also like to see if the candidate can spot obvious bugs in the setup code (say, package.json, pyproject.toml, etc.).

    I am going to make an example PR for: - frontend (React/NextJS, TypeScript, CSS) - backend (Django, Python) - DevOps (potentially some Pulumi code for deploying to a Kubernetes cluster?)

  • Benefit #175 of having some kind of open source project in your company: you can open all types of different issues and user stories, and use them to discuss with candidates. You can even attach bounties to them so that candidates don't feel like they are doing free work, and it would still be cheaper than paying a recruiter that will likely just source candidates by spamming different sites.

  • I worked at a unicorn that did these types of interviews under the premise that they provided a way to reduce bias in the process. That seems reasonable if you believe that qualified engineers can walk in and complete these tasks without having practiced extensively.

    I don’t happen to believe that and thus believe it biases in favor of candidates who can devote considerable quantities of free time to preparation.

  • Instead, I offer to share a private GitHub repository (if not some open source) with the prospective employer. It offers them the ability to see how my code changed over time, how it ended up, and the quality and calibre I may or may not devote to projects.

    It invites an in-depth conversation about software construction, quality, and decision making from the point of view of real work. It also tells me if a company wants to simply filter. If the organization is unwilling to invest in interviewing a candidate the way I invest in being interviewed, I in turn learn a lot, and decline to pursue accordingly.

  • I once failed a leet code graph problem because I solved it with a genetic algorithm.

    The problem wasn’t that I was incapable of solving the problem, it was the narrow view of possible solutions. The interviewer was looking for the CS101 solution.

    My biggest gripe with leetcode is that it tends to filter out diversity of thought.

  • Leet Code tests can be a great way to find good engineers - after you've told them what complex puzzle you want them to code, you can reject any that don't ask you why you need it! :)

  • My favorite style of interviewing is reviewing an existing codebase, then asking questions about it. Then I'll ask candidates how they would improve that codebase (or someone else's code).

    If possible, I'll have them code some small functions and ask them how they would unit test them.

    To me, refactoring skill is a must, as most engineering work is refactoring.

  • I'm not sure. Yep, I tanked several interviews because even when I was told at the beginning that "we will look at your way of thinking and how you handle problems", in the end I heard "Yeah, it was OK, but there exists a better algo for it". But still, for sure those interview questions are close to checking raw IQ/algorithms, and somebody who shines at them will probably be an OK employee. And then it's a question of candidate pool size: if you are one of the sought-after employers like Google or Amazon, you can use this filter - you will lose a lot of good candidates who aren't good at algos, but those who do pass your interviews are still good, and you have a good amount of them. Smaller, less sexy companies may have a problem finding people with this approach, but bigger and better-known ones probably don't have a problem here.

  • This is crazy and needs to stop, indeed. It's like you have:

    - A degree (In which you've proven you can understand these algos and spent 4 years studying)

      I'm not going to redo all that in 1 week before your stupid puzzle!
    
    - Experience (That has to be worth something it's not like everyone is lying about it)

    - You may have open source contributions

    But no, some companies will not even start to look at any of that - or won't look at all - before they ask you this stupid puzzle.

    Personally I now filter those companies out. I mean, 20 years of experience, contributions to major open source projects - if you can't recognize that, why would I interview?

    Where I work now, we give code assignments. While they take longer to do, you can actually see structure, which I agree with the author is the top quality I'm looking for. They are also less stressful for the candidate.

  • I think the LC interview measures how determined you are. Are you willing to spend hours and hours solving boring/meaningless coding exercises in order to pass the interview, or not? Because even if you are smart and a good coder, if you haven't seen the problem before it is extremely unlikely you'll give the optimal solution to 2 LC medium or 1 hard problem within 45 minutes. Your only option is to practice.

    So if you're willing to spend your free time mindlessly practicing, you will be a good ant at the company. Which is very desirable, since most business programming is boring and repetitive and does not require creativity.

    Also, validating the solution is simple. It does not require much effort or creativity from the person conducting the interview.

  • that's an engineer's opinion about things beyond his understanding

    75% of fresh grads are below mediocre, to put it very mildly. 50% of candidates with a seemingly OK employment record or portfolio are too.

    leetcode filters them out right away. that's the purpose it serves. it's not there to get you good candidates, it's there to make sure that you only spend time interviewing potentially good candidates.

    I would agree that it would be ideal to use coding challenges suited for the job you're hiring for, but that would take a lot more time and effort to make and review

  • These articles always seem to have an underlying assumption that really great people are being denied jobs because they can't get through these interviews, implying that the people who do get through are not also great. But maybe they're simply less hassle and easier to measure - and so what if they have swotted up on LeetCode to help their application? That means they are driven, that they have learned things along the way, and that they are more likely to get something right the first time.

    I have been on the receiving end of applying to an agency and being told I didn't make the grade technically. I was disappointed because I know I am a good engineer but I didn't expect them to magically know this. I can also see how my approach to the technical tests might have made me look less than what they were looking for, which is fine.

    I am also not sure of any good alternatives because someone will always object to any alternative which they cannot achieve for some reason. A "take home" project is good for real life work but some people cannot (or will not) invest the time even if they are paid for it; discussions can be great for helping nervous people but that is not how work usually is, there are challenges, pressures etc. and the able people object that it is not fair that people are getting in too easily.

  • I don't think it will stop unless 1) It's illegal. 2) It's ineffective 3) It's not economical for the company. To stop the trend you have to achieve one of the three.

    Similar to algorithms itself, interviewing is essentially searching for people that the company wants.

    How a company interviews candidates defines its effectiveness (hiring the right person, fewer false positives, fewer false negatives) and cost (how much the company spends on the interview process per hire).

    Interviewing with LeetCode is acceptably effective: candidates.filter(leetCode) gives you a much smaller set of people who are good at algorithm brain teasers, and this set is an acceptable approximation of the set of ideal candidates.

    In other words, it's lazy but effective enough. The majority of companies will only switch to alternatives when the cost is lower, or when the alternative is substantially more effective. It's broken from the candidates' perspective, but not the companies'. Most companies will stick to "if it ain't broke don't fix it" unless we offer them 10x solutions - which are yet to be seen.

    But I can also see it the other way around: we can undermine the effectiveness or cost for all companies if we develop better courses and bootcamps, letting more people crack LeetCode problems quickly. Then companies will naturally go for harder problems. In the end, the problems will be so ridiculously hard that companies can only filter for people who memorize LeetCode, and that set of people has virtually no correlation with good hires (good problem-solving skills). And of course, it's also a profitable business to teach people this.

  • I think that a blanket rejection of the "Leetcode" style interview is too broad. They are a tool, and like any tool, one should try to understand when they are appropriate for the job. If "the job" is to provide a fairly level playing field on which to assess candidates on raw problem-solving ability and programming intuition, these are useful tools. There are the same problems as with any test of this kind (you can improve your results by learning the format of the test well), but there's still a lot of signal. If you're trying to hire based on the things these tests measure, you should use them. As the post points out, correctly, those are far from the only qualities required for many jobs, and anecdotally it seems like some companies are overusing this one type of interview, but the solution is for interviewers to assess what they are trying to select for, and make sure that their selection process measures it as well as possible. Part of that process may well still be puzzle-style interviews.

  • These threads are always so depressing. On one side you feel bad for the people that have to study leetcode so hard, but then again being good at leetcode offers you the ability to basically jump into a 6 figure software career that could very well change your life.

    Without the ability to get hired by just "being good at leetcode," does that make it harder for people to break into the industry?

  • I've come to the realisation that during those two hours at maximum that I get with the candidate the most I can do is:

    - Check their English

    - Confirm that they are not an impostor

    The former is an especially good predictor, because it tells me whether that person can read documentation.

    I suppose for native English speakers a reading comprehension test would do.

    The only thing Leet Code tests ever did for me is give false negatives.

  • The best interview we've used was sharing a simple but very non-idiomatic Python file with working (but slow) code and tests at the bottom. The task was to refactor it and speed it up. This lets you see the actual thought process and some basic skills, while at the same time being something any good dev could do in 20 minutes without much pressure.
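
    Something in the spirit of that exercise (a made-up miniature, not the actual file): working but quadratic code with a test at the bottom, where the expected refactor is as simple as swapping a list scan for a set:

        def common_items(a, b):
            # Works, but `item in b` rescans the whole list every time: O(n*m).
            result = []
            for item in a:
                if item in b and item not in result:
                    result.append(item)
            return result

        # The refactored version: build a set once, so membership checks are O(1).
        def common_items_fast(a, b):
            seen = set(b)
            return list(dict.fromkeys(item for item in a if item in seen))

        # Tests at the bottom of the file.
        assert common_items([1, 2, 2, 3], [2, 3, 4]) == [2, 3]
        assert common_items_fast([1, 2, 2, 3], [2, 3, 4]) == [2, 3]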

  • As an alternative to Leetcode, I have had a good experience with Byteboard. It's sort of a "project" based interview format, that comes with 2 parts.

    Part 1 is a design doc where a problem statement has been outlined with goals and notes from the development team. Your job is to respond to their questions and propose a high level architecture to solve the problem. Part 2 is the coding portion, you're dropped into a codebase that relates to the previous portion and are given a todo list of tasks to complete. It's basically feature implementation, ranging from trivial to somewhat involved.

    I really like this approach. As someone who has interviewed a lot of candidates using Leetcode style questions, I would love if my org moved towards this format. Unfortunately, it's pretty hard getting FAANG companies to drastically change hiring practices, but if smaller companies start adopting it maybe it will get some traction.

  • In my experience most companies using leetcode think their shit don't stink. So I'll leave this where it is.

  • I think for large companies where they want to reduce the number of false positives, leetcode style questions work well.

  • > All starts with showing some code, a class that does some stuff and its corresponding tests. The code is not glaringly bad, but it’s also not great on purpose. [And all that follows]

    This is worse than LeetCode in my opinion. Because all it is, is a shallow copy of LeetCode. You've constructed a puzzle by laying out a picture and cutting out particular pieces. It's "find the differences" between what you've given them and the image in your mind.

    > If I’m hiring a landscaper, I’m not gonna ask them to tell me about the classification of ficus in Fiji, or the specific reproduction period of Douglas Fir in the West Coast. I’m gonna ask them to trim a tree and see if the result suits me.

    And the landscaper will walk. They will not work for free. They'll give you references, they'll show you pictures of work they've done before, but they won't do work for you.

  • I'd kill for the industry to progress in this area. I really enjoyed that when I interviewed at Netflix they didn't put me through it as a manager. It was all about cross-partner collaboration, working with and coaching devs, technical design/vision, and handling customer/vendor relationships.

  • A lot of comments theorize about what characteristic - for the most part other than coding skill - these kinds of interviews select for. Mostly it's to justify their use. OK, then: is there any evidence that this style of interview effectively or efficiently selects for any particular characteristic that matters? If these companies were really as data driven as they all claim to be, they'd rigorously analyze predictors of success (whatever that means) within the system they've built, and then design an interview process that can rationally be expected to select for those characteristics. Having worked at one, interviewed at another, and heard lots of stories about the rest, I don't get the impression that any of them have actually done that.

  • They need us more than we need them. Don’t work for people who ask you to do party tricks for the privilege.

  • A task you solve with someone (as part of your team if you get hired) > take home task > leetcode interview

    in my opinion

  • I don't really understand the point; almost all of the things they list as downsides of using Leet Code are actually benefits. If someone can't manage to code some simple questions during an interview, I can't imagine they'd ever make real contributions.

  • Not arguing that LC problems are a poor, degrading way to filter, but every single HN post about how "interviewing is broken" assumes the process is targeted at the applicant's experience. I don't hear HR departments making this complaint.

  • Leet code definitely discriminates against slow thinkers and against those who might be more inclined to use libraries and interfaces. It is biased toward a certain type of developer, and that's usually not the best type of developer.

    Have you ever seen corporate codebases? Leetcode emphasizes the old way of thinking, which is/was: do what you are told, don't think, just do the task in the timebox allocated for the sprint, and always reinvent the wheel - and each axle - multiple times, for each wheel, in the same codebase.

    This "Jira waterfall, code now, don't think" approach might be fine for a unicycle, or even a bicycle, but it's bad for trucks and trains.

    Hint: given enough time - and all that tech debt - everything will become a truck or a train.

  • If people who had to study for the LSAT, or MCAT, or countless other high-intensity selection processes realized that we sat around bitching about LeetCode interviews as the only obstacle to making more annually than some of them ever do, oy vey…

  • Pointless. All they do is confirm that you can remember algorithms you’ll never use in real life. I won’t work with anyone who uses them… I’d hate to work with a group of people who got their jobs solving riddles rather than building software.

  • The issue with leetcode is that it's designed to filter out 95% of people; companies don't want to know you.

    They'd rather keep a set of leetcode, IQ and personality tests to remove you from the list of candidates.

    The root cause is bootcamps and the oversaturation of devs in the job market, especially in web.

  • Studies show that one of the most effective interviewing tools is a test for general mental ability: https://www.semanticscholar.org/paper/The-validity-and-utili...

    Now, IQ tests are of dubious legality, at least in the US, but algorithmic coding questions basically get you an IQ test crossed with a programming skill check: win-win.

    All the ire about how you don't actually invert binary trees or whatnot during your real job are rather missing the point.

  • I don't know, risky answer but I will give it nonetheless. This advice is given to many candidates, and I agree that Leet Code interviewing is bad and the industry needs to work on something better, but in the meantime, if you are starting out, think of it this way (and I have friends younger by 10 years or so and straight out of college and I say the same thing), a few months of studying gives you a huge salary. It's a win-win, and in the meantime, people will complain and hopefully the industry will fix it, but do not "Skip Leet Code" just because it's the cause du jour. That's just my opinion.

  • The issue I have with these arguments is that they assume that just because a certain thing has flaws there must be a better solution. What if predicting long-term effectiveness using a limited amount of time is inherently approximate at best?

    I'm sure interviewing can be improved, but I think it's worth remembering that we are one of the few industries that actually tries to do skills-based interviewing. When people say "get rid of leetcode" or what not, are they really saying that the alternative that the rest of the world uses (resume screen plus vibes check) is preferable?

  • > you can pick a ticket and pair program. Have them review an actual PR. Etc.

    This seems so obvious. If you pick a leetcode question, there’s always the risk that your candidate has memorized the answer to that particular question. But if you pick an actual bug/PR from your codebase, that problem disappears completely, and you get to see how they would perform on the actual job you’re hiring for.

    Can anybody think of any negatives here? The only thing that comes to mind is that it might be seen as the employer trying to get free labor if they use a bug report that hasn’t been resolved yet.

  • Dear Everyone,

    Please keep using Leet Code in interviews so I can continue to hire extremely talented developers with little competition from out of state companies that like to pay 60% more than local prevailing wages. Thanks.

  • Let's separate the idea from the implementation. It really depends on how such interviews are conducted. A good example is Google and Bolt: both use it, but differently. And for me personally, at both companies it clearly showed all my shortcomings, even when I knew how to solve the task in the best way and wrote code on a board that you could compile. And to be honest, I have the same shortcomings when I solve bigger or harder problems. You know, unlike many others, this experience was clear and useful for me. It needs practice, but I like it.

  • I still don't understand why more companies don't do interviews that include solving real problems that their engineers have had to solve (or will solve), modified by one or more of the following factors: 1) domain-specific knowledge removed, so someone without company proprietary data can grasp it in an interview; 2) code setup, library installation, and all the time-consuming fat removed; 3) some level of repeatability and enough algorithm space that different candidates' strengths can shine.

    How hard would this be to do?

  • Leetcode tasks are just a starting point for discussion. Okay, you wrote that SQL query correctly (if you didn't, stop whining and go learn SQL; it was a pretty easy query). So, if those tables grow over time, where will the performance bottleneck be? How will you avoid it? What types of indexes do you know? What is the difference between them? When would you use which? What do you know about data partitioning? Sharding? How do you avoid resharding data on a growing cluster?
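
    A tiny, self-contained illustration of that line of questioning, using Python's built-in sqlite3 (the table and column names are made up): the same query goes from a full table scan to an index search once the right index exists.

        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
        con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                        [(i % 1000, i * 1.5) for i in range(10000)])

        query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"
        print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())  # typically: SCAN orders

        con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
        print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())  # typically: SEARCH ... USING INDEX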

  • On the flip side, I'm about to start interviewing some engineering candidates, and I have no idea what to do. I've been in the field for 8+ years now but interviewing for technical positions is hard and I've never seen a company really get it right. How do you have high standards without a ton of false negatives? How do you avoid reductive coding exercises without selecting for charming incompetence? Genuinely asking, are there any good resources on this?

  • Strongly disagree.

    Leetcode interviews are not perfect, but they're vastly superior to the previous system of throwing away resumes which did not come from the right pedigree.

  • > Alternatively, look at under-performing people and find what they are lacking

    This is a great suggestion, while the "look at their github" one is a bad one. GitHub polishing is theatre, better suited to theatre majors than to people actually working with integrity before coming to your company. It's very similar to the issue with leetcode interviews, as it's geared towards people with time to optimize for that instead of a day-to-day job.

  • With how often this topic shows up I'm surprised HN hasn't banned it.

    There are lots of suggestions on how to better evaluate a candidate, but they either don't hold up or involve the person dedicating an enormous amount of time to the interview, which most people would not agree to. Coding exercises are a "least-worst" scenario in terms of evaluation versus time spent interviewing, and companies know that.

  • I hear this all the time. Don’t use programming tests, don’t use LC or HackerRank, etc. What do other professions that demand high skill do? As an example - medicine. How are doctors interviewed when they switch jobs?

    The beauty of LC type interviews is that it requires no validation by your existing employer or no public record or demonstration of work. In the absence of LC, I’m afraid we have to settle for some of that.

  • We keep seeing posts like these on HN/Reddit, yet companies keep using them. And companies like Faang (or is it Manga now??), whose engineers I would think would hang out on HN

    So it either means:

    1. HN's influence is even less than we thought -- even MANGA engineers / 10x Silicon Valley types don't hang out here

    2. Everyone agrees its a good idea, but no one cares. Like everyone agrees we should care about the environment etc

  • As someone who just went through this (finished up loop at fb and google) - I can honestly say at first I HATED the idea of leetcoding/having to learn this stuff, but after a while I started actually learning the real concepts behind the problems and started looking at them like puzzles. I enjoy puzzles so this approach made these problems more approachable and engaging.

  • I’ve kind of had it up to here with the deliberate proposal echoed by many folks here that Leetcode interview hazing has anything to do with either software quality or the safety of safety critical systems.

    Safety critical systems already have legally defined coding standards. No really.

    The former notion is just not worth defending, but this thread will continue to grow, regardless

  • "you’re skewing the data with somebody’s ability to prepare for the interview"

    And I love it, and use it to my advantage. It's so much easier to prepare for a round of interviews than it is to actually be good at your job. So this flaw makes it much easier to pass interviews, if you know it's there.

    ...and believe me, I've used it :)

  • The hubris of interviewing companies is just unreal. Someone can talk until they're blue in the face about the problems they've solved and things they've built, but no, I'm not going to believe any of it until I see that, what, they can traverse a binary tree post order? Give me a fucking break.

  • Pretty frustrating when, 2 years ago, you were solving leetcode problems with GPT-3 and could demonstrate it inside a Facebook interview, but the interview process doesn't factor in the 10x programmer and doesn't recognise any TRULY out-of-the-box thinking. Leetcode was dead when GPT-3 came out.

  • If you get a never-before-seen problem to solve, going through the thought process and communicating the steps of solving it might display the skill of problem solving - which should be what interviewers are looking for. The solution will probably not be the most efficient one, but it can be refined later.

  • If you want to break the $200k TC barrier - leetcode seems to be the way to do it.

    Does it suck? Yes.

    Is it basically the only way to make real money in tech, aside from toiling away at startup after startup? Also yes, from a person who spent five years thinking I was getting somewhere working for startups.

  • More appropriate interview would be "figure out how to get permission to change this setting"

  • Whether someone gets the correct answer on a code challenge should only be a single factor in hiring. You should also look at how they attempt to solve the problem. Do they ask further questions? Do they talk it out?

    There’s probably other factors I cannot think of too.

  • Such algorithmic tests are used by FAANG as a means to discriminate against candidates based on age:

    For example, if they want to get rid of older candidates, it's easier to do so by asking them to implement a BST algorithm, which a fresh college graduate could do more easily.

  • I've been in the industry for many years and I've never had to do this. Is it mostly an American thing?

    I had to do a coding test for the job I'm doing now but it was a "take home" test and was directly related to the work I would be doing.

  • There has to be some sort of hiring "gate", what would be the replacement? It used to be a CS degree (which is even more restrictive).

    FAANG jobs are in high demand due to salary, so now we have 5+ round interviews and leetcode as a low effort filter.

  • I've done a few interviews like this recently. I always thought I was an OK programmer, judging from the feedback I get from colleagues. But after these interviews, I felt terrible. Doing my daily Leetcode as we speak...

  • Leetcode tests how good your math is, and math is hard because handwaving and virtue signalling do not work with it.

    Algorithms should not be memorized but derived on the go from well-known mathematical models.

    This is what good schools like MIT used to teach.

  • Since some orgs don't seem to have time to interview without LeetCode (or just suck at interviewing), why aren't there professional interviewers that get paid to vet candidates? Is this just an untapped market?

  • Isn’t preparing for an interview a skill? Seems like unprepared candidates who know there is a simple way to prepare should be cut. They don’t want it enough to go to the trouble of spending their time.

  • The process won't change until a new Google/Facebook/Amazon arrives on the scene that doesn't use leetcode. Then everyone will cargo cult whatever process they use.

  • I think the bias against Leetcode is not specific enough. My bias against leetcode: stop screening at leetcode easy/medium. Screen instead at Hard/Extra hard.

  • As a student that practiced LeetCode for getting summer internships, I'm getting stressed reading these comments! It's not an enjoyable process at all.

  • No, you keep interviewing with Leet Code, and I will keep vacuuming up the fabulous sleeper candidates that cannot or will not pass your screens ;)

  • If you happen to find a landscaper that knows the reproduction period of Douglas Fir in the West Coast, odds are he is really into his job and very good at it.

  • My last Leet Code interview was a while ago, when some firm, RealNetworks, had the bright idea that they would inject their own codecs into the Android OS kernel and perhaps be able to sell them to OEMs.

    Needless to say I stopped the interviewer when they started asking Leet code questions and have refused to do any of Leet code interviews since.

    Life - and fun code and fun design - is way too short to waste on ineffective BS.

  • Do people actually pay for leetcode premium? If so, that whole business is a byproduct of the interview industry, built to defeat it, which is fun.

  • In addition to these cookie-cutter style questions, I am also against asking questions related to some obscure language feature.

  • Who is going to bell the cat? Unless at least 20% of FAANGs and unicorns stop using leetcode, this show will go on.

  • Maybe doing leetcode style interviews is just a bias that gets you more young applicants (recent graduates).

  • Is this a symptom that algorithmic/competitive coding will be considered the IQ test of the future?

  • The best programmers are the ones who can solve problems without writing a single line of code.

  • Was that blog post created with some kind of LaTeX-based tool? It sort of feels like it was...

  • Leetcode style questions are not a bad backdrop as long as you do a few things.

    1) Pick questions that are actually somewhat aligned with a problem that would come up in the role. Usually implementing a data structure of sorts is going to be way more predictive and relevant than a dynamic programming problem (a sketch of the kind of question I mean is at the end of this comment).

    2) Ensure the question requires a fair amount and complexity of code to complete.

    3) The question should just be a backdrop. Consider also how quickly and proficiently they can code. How intelligent they come across in conversation. Things they call out as side notes, testing, quality etc.

    Many interviewers seem to have forgotten that the purpose of the interview is to be predictive of on-the-job success, not to invent some separate funnel and gauge how well the candidate did on that funnel.

    In practice, at scale, you will likely have enough correlation between success on a contrived interview system and general competency, but you're going to get a lot of false negatives/positives using that as a yard stick.

    My experience has been that leetcode "theory" is very weakly correlated with competency for most roles, and quality and speed of coding much more highly correlated. One of my best hires was a guy who couldn't implement a tree traversal in the interview
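
    As a sketch of the "implementing a data structure of sorts" kind of question from point 1 (my illustration, not a question I am claiming anyone actually uses), something like a minimal LRU cache fits in an hour and still exercises real design decisions:

        from collections import OrderedDict

        class LRUCache:
            """Fixed-capacity cache that evicts the least recently used entry."""

            def __init__(self, capacity):
                self.capacity = capacity
                self._data = OrderedDict()

            def get(self, key, default=None):
                if key not in self._data:
                    return default
                self._data.move_to_end(key)  # mark as most recently used
                return self._data[key]

            def put(self, key, value):
                if key in self._data:
                    self._data.move_to_end(key)
                self._data[key] = value
                if len(self._data) > self.capacity:
                    self._data.popitem(last=False)  # evict the oldest entry

        cache = LRUCache(2)
        cache.put("a", 1); cache.put("b", 2); cache.get("a"); cache.put("c", 3)
        print(list(cache._data))  # ['a', 'c'] - "b" was evicted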

  • _Most_ of the people who complain about the hiring bar don't put in the effort to pass the hiring bar. That amount of effort depends on intelligence: really smart people can quickly grasp the patterns of leetcode questions, but for someone less intelligent it requires a lot more studying, so they are less likely to pass.

  • The writer forgot to say they discriminate against those who can't (or don't want to) practice leetcode questions in their free time. This includes people with families, caretakers, people with after-work hobbies, etc.

  • Leetcode is great for avoiding the bottom left of the von Manstein matrix.

  • This is good. Managers need to leetcode to prove their IQ.

  • Stop telling everyone to stop interviewing with Leet Code

  • I love this idea for checking how quickly a candidate can learn: if they're blocked on some coding/refactoring question, show the candidate how to do it, erase everything, and ask them to redo it alone.

  • We need to upvote this to the top of HN :-)

  • This is very similar to my process and I've found it to be very effective. We hire engineers to work with a well known dynamic language's MVC-ish framework building a pretty standard fare web platform.

    We tell candidates they can look things up on the condition they tell us when they are doing so (basically, "think out loud and walk us through the process -- knowing where to find answers is a valuable skill!") And in most cases they're permitted to use pseudocode if they want to, e.g., if the situation demands any kind of obscure syntax or boilerplate, they just have to note it.

    Exercise 1

    All candidates are shown some (poorly written) code and asked to pretend they're performing a code review for the author, who we describe as a novice programmer who is new to the language & framework. The code we use is a composite of real code pulled from many places in our system (basically, what the code would look like if all the mistakes we encounter were collected into one snippet). The functionality it implements is exactly the kind of functionality the candidate will be expected to implement on a daily basis.

    We ask them to identify antipatterns, suggest edits to make the code more idiomatic, discover bugs, point out security or performance flaws, improve names, etc, and reassure them by telling them that no one spots all of the issues.

    We're casual in demeanor and try really hard to remove stress from the situation, making jokes, etc. We help them if they get stuck.
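
    To give a flavor of that format (the commenter doesn't show their snippet or name the language and framework, so this is an invented Python stand-in), a composite review exercise might look something like the following, with issues planted deliberately for the candidate to find: a mutable default argument, string-built SQL, an N+1 query loop, cryptic names, and a connection that is never closed.

      # Invented review snippet -- the bugs and antipatterns are intentional.
      import sqlite3

      def get_stuff(user_id, tags=[]):                 # mutable default argument
          conn = sqlite3.connect("app.db")
          cur = conn.cursor()
          # SQL assembled by string formatting: injection-prone
          cur.execute("SELECT id, name FROM items WHERE owner = %s" % user_id)
          rows = cur.fetchall()
          result = []
          for r in rows:
              # one extra query per row: the classic N+1 pattern
              cur.execute("SELECT tag FROM item_tags WHERE item_id = %s" % r[0])
              tags.append(cur.fetchall())
          for r in rows:
              result.append({"i": r[0], "n": r[1]})    # cryptic key names
          return result                                 # connection never closed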

    Exercise 2

    We share a 90% working piece of code that is missing a single method. Without getting too detailed, something like "This will set up a form based on this model, but the way it is written right now does not provide a mechanism for allowing the options of the select box to depend upon which user is signed in. What would you change to enable that?" They don't even have to write the code (though they often do); they just need to understand conceptually why it doesn't work and then talk through a solution.
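
    The comment doesn't name the framework, but as a guess at what that exercise might look like in Django-flavoured Python (the Project model and its owner field are invented for illustration), the conceptual fix is to stop building the choices at class-definition time and instead narrow the queryset per request:

      # Hypothetical Django-style sketch; Project and its "owner" field are invented.
      from django import forms
      from myapp.models import Project

      class TaskForm(forms.Form):
          # Start with an empty queryset; it is narrowed per signed-in user below.
          project = forms.ModelChoiceField(queryset=Project.objects.none())

          def __init__(self, *args, user=None, **kwargs):
              super().__init__(*args, **kwargs)
              if user is not None:
                  # The select box now only offers this user's projects.
                  self.fields["project"].queryset = Project.objects.filter(owner=user)

      # In the view: form = TaskForm(request.POST or None, user=request.user)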

    Exercise 3

    A very simple test of their ORM knowledge. They need to utilize a technique that they'll have used dozens of times if they are being honest about their experience but that they probably wouldn't learn in the most basic of tutorials.

    ...And so on.

    For candidates applying for senior roles we have an additional live-coding exercise, but most of the same rules apply -- they can look up docs, we help them if they get stuck, etc. We give them a starting skeleton app and they have more than enough time to solve the problem. They can use their own editor, copy/paste sample code they find on stackoverflow or in docs, etc -- basically everything they do when they are actually coding.

    The problem is a very realistic one -- a simplified version of a feature that had at one time been on our roadmap but which we eventually abandoned. We encourage them to add comments to indicate what they'd do if they had more time, and when they're done, we discuss the overall approach and ask questions about their decisions.

    I've found the above approach to work far, far better than any "whiteboard coding" or "leetcode"-style unrealistic (for the places I've worked and the roles I hire for) interview problems. We rarely regret hires and people stick around for a (shockingly) long time.

    I really wish other tech decision makers would adopt this style, for their own sake and for the sake of those seeking jobs.

  • I've interviewed at three FAANGs.

    None of them asked any leet code questions (I was actually looking forward to the silly puzzle questions MS was notorious for at the time because I love those puzzles, even if I think they're bullshit in an interview. Alas it turned out all those questions were basically for PM positions :( ).

    The only question I got that seemed particularly bullshitty was at Google, where it was one of those questions where basically you're expected to work out "the trick", it seemed to me to be very gotcha like. But that was just one question among many, across many people.

    I've also interviewed many people over the years, and no one has asked any of those stupid questions. They are completely and utterly useless - I see a few comments here saying we're testing for conformance and that has never been involved in any of it, because again it doesn't provide any knowledge of technical skill. As an interviewer you're also aware that the person on the other side of the table is often extremely stressed or nervous. So we understand that you might make mistakes, or stumble on answers, etc - failing to account for such issues simply means potentially discounting good candidates.

    As far as whiteboard coding goes, for myself, and I believe many of my co-interviewers, a lot of what is actually being looked for is your thinking and problem solving - seriously, I cannot emphasize enough how much you should talk through all your reasoning as you write. That allows us to know whether a logic error is a failure to understand/do the correct thing, or just a standard typo-style mistake that everyone makes from time to time (again, recall we know you're stressed). Also, by and large we aren't looking for /perfect/ code (ok, some do, but in reality it's a worthless metric - I only got this from the gotcha interviewer at G).

    Personally, when I'm interviewing I often don't care about the language; I'm interested in the solution, and I generally accept pseudo code, or your preferred language.

    Just a few general tips as an interviewer:

    * When asked a coding question, repeat back what you're being asked - you want to confirm it (I've had people try to solve the wrong problem before) - and ask (where reasonable) some follow-up probe/clarification questions.

    * Following on from the above: if you're answering on a whiteboard, remember that the interviewers know that the nature of the format means you might make simple/silly/"stupid" mistakes. Listen for any feedback they give you while answering.

    * Another follow-on, and another point that cannot be emphasized enough: write test cases for the problem you're solving. Do it before you start the solution. It demonstrates that you understand the need for them, and provides another opportunity to ensure there's agreement on the problem being solved. It also lets you clarify things like the expected API - not part of the actual problem, but something needed for any implementation. Try to make your test cases cover "normal" and edge cases (a tiny made-up illustration follows after these tips).

    * Be aware that if you are nervous, stressed, or worried, the interviewers are aware of that, and know that it can cause errors you wouldn't normally make.

    * Try to have a reasonable awareness of the job that you're being interviewed for, and some of the relevant things the company does/how it does them [software dev, engineering, product management, etc], if at all possible. Either so you can ask questions that indicate you have some understanding, or so you can tie in what the company does as it relates to a particular question (if appropriate, don't just shoehorn things in).

    * Be polite - this is not a "be subservient" thing, it's just that if you act like an asshole the interviewer won't like you, and that will impact what they report. I believe it's consistent across FAANGs that immediately after the interview every interviewer sends an email that is basically "Yes/No. Reason: .."

    * Don't be sexist, racist, homo-/transphobic, or just generally a bigot - I am aware of one woman interviewer having a candidate assume she was an admin and treat her as such. In another case a woman was interviewing someone for a position reporting to her; the candidate asked who his manager would be, found out it was her, and then told her to her face that he didn't think he could work for a woman. That indicates not just incredible sexism, but also a complete lack of judgement and common sense. The latter alone would warrant a no hire.
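
    To make the test-cases tip above concrete, here is a tiny made-up example (nothing from the interviews described): for a hypothetical "run-length encode a string" question, jotting the expected cases down first pins down both the API and the edge cases before any solution code is written.

      # Hypothetical whiteboard question: run-length encode a string.
      # Per the tip above, the cases come first, covering normal and edge inputs.
      CASES = [
          ("", ""),            # edge: empty input
          ("a", "a1"),         # edge: single character
          ("aaab", "a3b1"),    # normal case
          ("abba", "a1b2a1"),  # runs are counted per stretch, not globally
      ]

      def run_length_encode(s: str) -> str:
          out = []
          i = 0
          while i < len(s):
              j = i
              while j < len(s) and s[j] == s[i]:
                  j += 1
              out.append(f"{s[i]}{j - i}")
              i = j
          return "".join(out)

      for given, expected in CASES:
          assert run_length_encode(given) == expected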

  • "I don't do riddles" (DHH)

  • As an interviewer I can tell you are absolutely missing the point.

    Why would I ever want to hire a developer without seeing them perform? And since being able to program a small piece of code to specification is such a basic, important part of development, why would it be bad for me to verify if you can do it?

    If your friends are such good developers, why would they have a problem reasoning about a relatively simple, toy problem? Is it possible that your evaluation of your friends' prowess is biased?

    Why do you think dealing with problems under pressure is not a valuable skill?

    Why do you think leetcode questions are supposed to tell about quality of code that the candidate will produce? Is it possible that you just don't understand what leetcode is for?

    It all seems to me like students complaining that the exam was hard. IT DOES NOT MATTER if the exam was hard. What matters is if you were better than other students. (And even that does not matter, because in the long run it only matters if you have learned something useful.)

    So what is the point here? I think people just complain too much rather than focus on figuring out how to succeed.

    Leetcode questions are supposed to tell me:

    - Can the candidate understand the question? Can they think about the problem analytically? (Somehow there are a lot of people who can hold a nice conversation but fail when they are supposed to apply hygiene to their thinking.)

    - Can they follow instructions? (I explain the rules of the task and am interested in seeing if the person is able to follow basic instructions.)

    - Can they program? (I have met a lot of people over the years who are able to fake their way through the process EXCEPT for when they have to actually write some code. For example, they learned the standard library by rote but do not have the ability to use those functions when needed.)

    - Can they plan? Are they organised? (A lot of people just do stuff at random, which might work for a very small change but will utterly fail for any larger task. Good developers inevitably have some kind of plan and organisation.)

    - Can they work with somebody else on the problem? (Some people don't know how to work with others even when offered help.)

    - Do they understand what the program they wrote is doing? (MOST people do not know whether their program works, or what it does; they need to run the program to be able to tell. All the best developers I ever worked with can tell what a program will do before they run it. Any person who can't do this is destined to create a huge number of bugs as they mindlessly retry code until it works, leaving behind every bug that did not stop it from working in their test environment.)

    - Are they intelligent (enough)? (Leetcode is a sort of intelligence test. You typically need to be at least at some intelligence level to solve the problem.)

    The problem with leetcode is all those interviewers who do not understand how to use it as a tool to learn things about the interviewee. And that is frequently because they want to get information that you can't easily get from leetcode.

    So what can't you learn from a leetcode question?

    - Will they write nice code? You can't learn this, because writing nice code is the ability to adhere to the body of code you are already working with. Everybody's programming style is different, and even the best style might still be incomprehensible to a person who is not used to it.

    - Knowledge. Do not ask stupid questions like how to transform a binary tree in a certain way, because the only thing you are testing for is whether the candidate is lucky enough to know the answer to your problem.

    - Can they solve complex problems? Do not give complex problems in an interview; it is just too noisy and luck-driven. The perfect question is just complex enough to be novel and to present some (but not too much) challenge to the candidate, but not so complex that a reasonable candidate risks running out of time.

  • I keep seeing this kind of article arguing that leetcode is not the answer. Yes, it is not the answer, but it is the best strategy for filtering out dumb, low-IQ people who are going to panic when you give them a vague, open-ended problem. There are many people who call themselves engineers while trying to talk everything out instead of diving deep into the problem.