I was interviewing this kid the other day. I can rightfully call him a kid, since I’m at least 15 years older than him. But because calling someone ‘the Kid’ gets old, I’ll call him John Doe. He was straight out of school; up to this point we had been recruiting senior engineers, and seeing someone this green was a new experience for me. I wasn’t expecting much, since most of my interviews with college kids had resulted in painful deer-in-the-headlights moments.
Long story short, John Doe had the best instincts I have seen in anyone, including people with much more impressive resumes. As a natural talent, he has the geek equivalent of a 54-inch box jump, a 90 VO2 max, a +5 ape index. And he didn’t have any semi-autistic behaviors (e.g. Bill Gates rocking); he actually seemed quite well adjusted.
I gave him my standard coding question, which goes like this:
Given one 20 GB disk of unsorted ints, an empty 20 GB disk, and a 1 GB memory buffer, move the unsorted ints to the empty disk, in sorted order.
I love this question because there are many good answers to it. There are also some quick observations to be made:
(1) Standard in-place sorting algorithms don’t work, because the full data set won’t fit in the 1 GB buffer.
(2) The buffer is best used as a sorting mechanism.
(3) When using the buffer, you will need to handle the case where it fills up.
(4) Extracting the data from the buffer is as important as inserting it.
You would be surprised at how many people take a long time to get past (1). They then have to be prodded into (2), and when they code an insertion algorithm, they don’t deal with (3). And if they get to (4), it is only as an afterthought.
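One common family of answers is an external merge sort: fill the buffer, sort it, flush it to the target disk as a sorted run, and merge the runs at the end. Here is a minimal sketch in Ruby, with a tiny buffer and newline-delimited ints in temp files standing in for the real disks; the names and sizes are illustrative, not part of the original question.

```ruby
require 'tempfile'

# Illustrative buffer capacity: in the real problem this would be roughly
# 1 GB worth of ints; here it is tiny so the run/merge behavior is visible.
BUFFER_CAPACITY = 4

# Observations (2) and (3): use the buffer as the sorting mechanism, and
# flush a sorted "run" to disk whenever the buffer fills up.
def sort_into_runs(input_path)
  runs = []
  buffer = []
  File.foreach(input_path) do |line|
    buffer << Integer(line)
    if buffer.size == BUFFER_CAPACITY
      runs << write_run(buffer)
      buffer = []
    end
  end
  runs << write_run(buffer) unless buffer.empty?
  runs
end

def write_run(buffer)
  run = Tempfile.new('run')
  buffer.sort.each { |n| run.puts(n) }
  run.rewind
  run
end

# Observation (4): extracting the data is a k-way merge of the runs,
# repeatedly writing out the smallest value among the run heads.
def merge_runs(runs, output_path)
  heads = runs.map { |r| (l = r.gets) && Integer(l) }
  File.open(output_path, 'w') do |out|
    while heads.any? # any non-nil head means a run still has data
      i = heads.each_index.select { |j| heads[j] }.min_by { |j| heads[j] }
      out.puts(heads[i])
      heads[i] = (l = runs[i].gets) && Integer(l)
    end
  end
end
```

The merge only ever holds one value per run in memory, which is what makes the whole thing fit in the buffer.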
Mr Doe verbalized all four issues within the first three minutes, including the fact that any data structure he used would decrease the efficiency of the algorithm, because pointers take up space. He then proceeded to pick a binary tree, define its insertion efficiency, and code the traversal as an elegant three-line Ruby solution. He then coded the insertion logic just as elegantly, catching and handling the case where the buffer’s maximum size is exceeded.
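I won’t reproduce his code here, but a solution of that shape (a binary search tree as the buffer structure, plus a short in-order traversal to pull the values back out in sorted order) might look something like this sketch; the names are mine, not his.

```ruby
# A minimal binary search tree. Each node carries two extra pointers,
# which is exactly the space overhead he flagged up front.
Node = Struct.new(:value, :left, :right)

def insert(node, value)
  return Node.new(value) if node.nil?
  if value < node.value
    node.left = insert(node.left, value)
  else
    node.right = insert(node.right, value)
  end
  node
end

# The in-order traversal really can fit in about three lines:
def in_order(node, out = [])
  return out unless node
  in_order(node.left, out) << node.value
  in_order(node.right, out)
end
```

Insertion averages O(log n) per value (degrading to O(n) if the tree goes unbalanced), and the traversal emits the buffer’s contents in sorted order, ready to append to the output disk.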
Even the candidates who made it this far tended to fill the board up with Java. I was struck by the conciseness and readability of Mr Doe’s solution. I remember coding the same solution in Java and using much more of the whiteboard, and I began to wonder whether John Doe was just that much better than me (totally possible), or whether my exposure to C++ and Java had made me incapable of expressing elegant solutions. It’s a thought I’ve had a lot lately, since I got pulled back into some Java development that involved Spring, where I spent a not-insignificant amount of time re-ramping up on how Spring config files work while watching other developers flail their way through the same process. After I had finished re-introducing myself to Java enterprise development and learning just enough about JPA to be extremely dangerous, I found myself reading the latest Yegge novella and nodding my head. Alex Vollmer says it much better than I ever could: when a language is by definition bloated and inconsistent, it becomes harder and harder to be elegant and concise.
I’m not entirely convinced that I can effectively blame a language for my over-complexity disorder (OCD), but there are two things I am fairly sure of. (1) John Doe is in, as far as I’m concerned; he is a rock star. (2) I’m going to spend the next little while focusing on coding Ruby like a Rubyist would, i.e. taking advantage of conciseness and readability, and hopefully producing something elegant.