Over the last few weeks I’ve been wrestling with the question of what to do next, career-wise. In doing that, I’ve been re-evaluating most of what I’ve been working on over the last few years, trying to figure out what actually made me happy, what worked to advance my career, and what held me back.
One of the things I consistently identified as a positive was being in a situation where I had the opportunity to develop mastery in a particular subject. I think anyone who’s driven by the need to learn would identify with that.
A new and interesting point (to me, anyway) is the idea that mastery itself is relative. I’d always thought of it as an absolute: that there’s a known limit to a given subject, and if you can reach that limit of knowledge, you’re a master of it. Sink 10,000 hours into something and you’re the best.
That doesn’t really seem to be the case, though. To truly develop mastery in anything, you need to keep surrounding yourself with people who are better than you, and learn from them. There’s a quote that, like most quotes, has a fuzzy origin:
If you’re the smartest person in the room, you’re in the wrong room.
It’s really obvious in hindsight. If you’re the smartest developer at your company, that doesn’t mean you’ve mastered software development – just that you’ve hit the limits of what you can learn there. To keep growing, you need to find smarter developers to learn from, and inevitably, teach other developers what you know.
And even then, the goalposts keep moving. For instance, being a ‘master’ software developer 30 years ago required command of far fewer tools and languages. Being even a half-decent full-stack developer in 2016 requires you to understand a bit of everything, from servers to UX, and all the different languages those are expressed in.
Which means mastery is an unattainable goal – but by far the worthiest one to pursue.