One of the challenges that we, as programmers, constantly face is learning a new programming language or whatever the latest and greatest technology happens to be (and I use the term ‘greatest’ only for the sake of the cliché).
Then again, a lot of us get into this field for that exact reason, don’t we? I mean, we enjoy learning new technologies, applying them within the context of programming and problem solving, and then doing it again.
But after even just a few years in the business, it can get a little exhausting. That doesn’t make it any less rewarding, but it does mean that when something new comes along, especially from one of the larger players like Facebook or Twitter, other people are going to want to use it in their projects because a big company uses it, and then we’re the ones tasked with learning it.
For what it’s worth, I don’t consider that a bad thing, but it does add to the amount of time that goes into a project. That particular idea is better suited to a business-related post, though, so I’ll set it aside for now.
But there’s another side to this:
A segment of programming culture claims that we should keep our skills sharp by learning something new every year or so: to stay relevant, to keep up with those coming up behind us, and to keep pace with the direction the web and software are headed.
And I don’t necessarily disagree with this. But if you’re raising a family, or you’re a little older and have obligations and responsibilities outside of work such that you can’t spend as much time at a computer as you used to (and I know that feeling!), then this comes off as an intimidating and frustrating demand on your time.
So why do we do it to ourselves?