I think one of the biggest challenges that developers face – especially new developers – is this feeling of having to keep up with the newest, latest, and greatest technologies that come out almost monthly (if not weekly), and the guilt that follows when peers attempt to “hold them accountable” (for lack of a better phrase) for not knowing or not using something.

Now, for the record, when it comes to the technologies that people opt to use for their projects, I don’t really care – to each his or her own. If something makes you more productive and gets you solving problems faster, then awesome.

Sure, there’s something to be said for a team using a unified set of tools, but that’s content for another post.

Generally speaking, can’t we stop espousing every latest-and-greatest language, tool, framework, or whatever as the next best thing?

Because it’s probably not.

The Best Programming Language Is Here

In recent years, there has been a very interesting shift in the culture of programming: every time a new technology comes out, there’s a mad dash for people to learn it quickly, to claim how much they love it, to swear it’s the best thing since the compiler, and … then what?

Move on to the next thing that comes along in a few weeks.

1. We’re Speed Dating

" So Whatcha Up To This Week?"

” So Whatcha Up To This Week?”

It wasn’t that long ago that we didn’t have libraries like jQuery (or Prototype before it) to help us with writing JavaScript.

In fact, it’s only been within the past few years that JavaScript has surged in popularity among developers – so much so that we’re now writing JavaScript not only on the client side, but on the server side as well.

And now we’ve got a variety of build tools cropping up – all of them built on JavaScript – that are meant to help us automate certain tasks for our web-based projects.
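To make that concrete, here’s a minimal sketch of the kind of chore these tools take off our plates – minifying scripts as part of a build. It assumes Gulp and the gulp-uglify plugin (both real npm packages, installed via npm install gulp gulp-uglify); the task name and file paths are just illustrative.

    // gulpfile.js – an illustrative sketch, not a recommendation
    var gulp = require('gulp');
    var uglify = require('gulp-uglify');

    // Pick up every source script, minify it, and write it to a build folder.
    gulp.task('scripts', function () {
      return gulp.src('src/js/**/*.js')  // hypothetical source path
        .pipe(uglify())                  // minify each file
        .pipe(gulp.dest('dist/js'));     // hypothetical output path
    });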

But just as soon as one takes hold and gets anywhere close to mass adoption, something new comes along and is espoused as the best thing ever.

2. Inversion of Thinking

We’ve got it upside down.

The thing is, it wasn’t that long ago that people appreciated certain languages and tools that had been around for years – decades, even – because they knew those tools would be robust.

Developers had been building software, hacking away at compilers, and continually iterating to work out the kinks in their languages and tools until they became extremely powerful – so much so that they could catch errors before you even compiled the program (think of background compilation in VB.NET, which isn’t even that old – that is, if you can get past the acronym “VB”).

Now, we’ve got this weird inverted perspective that goes something like this: If something new comes along that can replace something I’ve been using for a while, then it’s probably better.

Maybe it is, maybe it isn’t, but it’s weird how quickly our perspectives have shifted.

3. “Kids Today…”

"I remember writing code up hill both ways in the snow."

“I remember writing code up hill both ways in the snow.”

At this point, I can’t help but wonder what the previous generation thinks about the current generation of programmers.

These are the developers who ended up having to do a lot more “manual labor” – writing lower-level code, managing memory, and so on – to build their programs, and here we are doing a significant portion of our work in, say, browsers, or in incredibly polished environments that give us simulators for our computers, phones, tablets, and so on.

And rather than working diligently to get incredibly good at solving problems for a certain platform, we’ve fallen prey to some sort of programming ADD where we just look for the next thing that’s available and trust that it’s going to be the best thing to use.

4. Ah, The Irony

Isn’t it ironic? Don’t you think?

But there’s a bit of irony in all of this:

This is coming from a generation that wants everyone to be able to code, yet shuns people when they don’t use the language, framework, platform, toolset, database system – or whatever other programming-related term fits – that it deems to be the best for the job.

And if you think the latter part of that statement is false, then read nearly any blog post, Hacker News thread, or random Twitter discussion about a given language, framework, or set of tools, and you’ll find people literally insulting a person’s intelligence or personality for not using a particular technology.

For those who are in the WordPress world, you likely experience this on a monthly – sometimes weekly – basis.

How often have we heard:

  • The WordPress codebase is a mess. OMG it makes my eyes bleed.
  • WordPress has a low barrier to entry – that’s why so many people are using it.
  • It uses PHP? Nasty.
  • …and so on.

And you know what? All of those arguments may hold true for the person making them, but that doesn’t mean they’re all 100% correct, or that they’re purely objective truths.

Some of them, perhaps, but all of them? Hardly.

Replace ‘WordPress’ with whatever your platform of choice is, and I’m sure you’ve at least witnessed some type of religious war or crusade as to why it’s the end-all, be-all of development. Some of you may have even participated.

You’re Not Guilty

Whatever the case may be, I remember a time in my career when fellow developers would ask me if I knew a certain library, language, or whatever, and if I said I didn’t, then they’d gush about how much they loved it, and I’d be left feeling guilty – like I was doing something wrong – because I wasn’t an expert in it.

But that feeling goes away (and I hope it does for everyone), because becoming an expert in any area of technology takes years of investment, and I find it highly unlikely that you “love” something after spending a few hours with it.

Or maybe you can – and if so, more power to you – but I can’t.

As programmers, how can we – in good conscience – invite other people to learn to code and to step into our field, and then chastise them when they don’t use whatever we think they should be using, or deem them less-than-capable because they aren’t using something that was made available last week?

Whatever the case may be, at least be consistent and pick a side: Do you want people to learn to code and solve problems, or do you want people to skip along the surface of technologies as they appear, spending more time learning how to use the tools than actually getting stuff done?