I learned to code late in life, around age 30. Since then, I’ve not only become a happy veteran engineer, but I’ve also managed to become one of those guys who gets to interview developer candidates for junior positions. Sadly, most of the people I talk to have learned little beyond HTML and CSS.
I argue you’ll be a stronger candidate if you can understand the basics of backend code.
My whole life I danced around the idea of learning to code, but was always too afraid to actually commit to it. It was a scary prospect as it clearly has a steep learning curve. I recall watching a friend write some code in Ruby and being absolutely flummoxed as to how a bunch of random letters and symbols somehow amounted to a working program. At the time, I was working as a solutions architect at a technology company that sold middleware to large organizations.
Up until that point, I was able to work in tech without knowing anything about computer programming. I was focused on learning more about business and strategy, assuming that I’d definitely have a place in the future economy.
But at some point I hit a wall.
Developers in Silicon Valley, and elsewhere, were seeing enormous success building startups and technology, using the coding knowledge they had to create products and services that people wanted to use. I desperately wanted to be part of that revolution, and I knew that even if I never planned on programming professionally, I should at least learn how it’s done so that I might become a more effective business strategist and entrepreneur.
But what really set me over the edge was that the same friend who made writing code look easy was able to build and execute any idea he had. If one day he woke up and thought of a great app or web service, he was able to just sit down and hammer it out. To me, that was incredibly valuable. You don’t need solutions architects at a startup, you need people who can build things.
Overcoming Misconceptions About Programming
Setting about to learn to code at age 30 might seem crazy. And it was, but mostly because I had so many misconceptions about how computers worked. The last time I had done any coding was in the late 1990s. Everything was HTML with some CSS, and I assumed that was all I needed to really understand computer programming. I was vaguely aware of other programming languages, but I didn’t fully comprehend how they all related to one another. It turned out that was the most critical lesson for me to learn.
I don’t personally believe you can be a successful product manager, business development manager, or really any kind of manager in any type of company without understanding some fundamentals about how your tech works. This comes into play when validating customer use cases and thinking about how you’re going to solve them. Your team won’t respect you if you’re asking them to build things that don’t have some basis in reality.
The startup I joined is called GrayMeta. At the time, we were building a complicated service that extracted searchable metadata from file-based assets to help customers manage their data, and as the company’s first hire, it was incumbent upon me to understand the technology through and through. Throughout my tenure at this startup I learned a great deal more about programming, deployment, containers, DevOps, RESTful services, webhooks, and more, all of which helped me become a more effective solver of problems. I even wrote a tool that the sales team used to help demonstrate the value of our platform. It wasn’t going to win any awards, but it worked, and any real developer could have improved upon it.
After a year, it was time to grow our development team (then a handful of backend developers) and start interviewing front-end developer candidates.
One candidate in particular really stood out in my mind, and in fact became the inspiration for this post.
Get the Code Fundamentals, Get the Tech
He was young, perhaps in his early twenties, and quite eager to break into startups. He had just graduated from a developer bootcamp, and was seeking a junior front-end developer position at our startup to put his new skills to good use. I started digging into his previous projects, of which there were a healthy handful (certainly more than mine at that point), when I realized there was something missing.
I’m the last person to suggest we all become experts on everything, but what I saw here was a fundamental flaw in his learning process. I certainly don’t blame him for it. He was essentially stuck where I was before I started to learn how to code. There was no way a junior developer at a company that built a platform for users to manage data would be able to contribute without understanding the fundamentals of how data gets onto a page.
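To make "how data gets onto a page" concrete, here is a minimal sketch of the flow a front-end developer is expected to understand: a backend returns JSON, and the front end turns those records into markup. The endpoint path and field names below are hypothetical, invented purely for illustration.

```javascript
// Minimal sketch: turning backend JSON into markup on a page.
// The /api/assets endpoint and the field names are hypothetical.

// Convert an array of records (the shape a backend might return)
// into an HTML unordered list.
function renderAssetList(assets) {
  const items = assets
    .map((a) => `<li>${a.name} (${a.sizeBytes} bytes)</li>`)
    .join("");
  return `<ul>${items}</ul>`;
}

// In a browser, the data would arrive over HTTP first, e.g.:
//   const assets = await (await fetch("/api/assets")).json();
//   document.querySelector("#assets").innerHTML = renderAssetList(assets);

// Sample data standing in for a backend response.
const sample = [
  { name: "intro.mov", sizeBytes: 1048576 },
  { name: "logo.png", sizeBytes: 2048 },
];
console.log(renderAssetList(sample));
```

Understanding even this much tells you that the page is only half the story: someone has to design the endpoint, shape the JSON, and serve it, and the front end has to know what to ask for.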
Even if you don’t become a developer, but do plan on working in software, I strongly encourage a learning process that incorporates computer science. In my experience it has been vital that I understand how the various pieces fit together, even superficially.
As a product manager, I’m never asked how to accomplish something, nor do I suggest it, but I do know how to gauge whether something is feasible, roughly how difficult it might be to do, and what challenges may arise.
As a business development manager, product manager, and chief evangelist (I wore many hats back then), understanding how to integrate the value of our platform into other technologies was critical to my success. It isn’t just about being able to answer customers’ questions, it’s also about being able to talk about the value certain technology can bring to a partner, as every customer will have their own unique set of challenges.
Learning the basics of computer science has been critical to my success as an entrepreneur, product manager, business development manager, technology executive, and general all-around problem solver.