As an industry, computer science – or software engineering, if you prefer – is comparatively young. The field moves fast, no doubt, but it hasn’t been around as long as many of the other industries our parents or our peers work in.

I think that’s just one part of what makes all of this interesting.

One thing that I’ve begun to notice about the industry is that the more new technology becomes available for web development, the less people understand about the work that came before it.

It’s kind of sad, but it’s also natural, right? To use a cliché example: each time something new is added to a car, we don’t all necessarily understand what was in place before it. We’re generally just happy that the new technology exists and that it makes our lives easier.

But in the field of web development (and perhaps other types of development – I don’t really know), other developers – specifically those who are newer to the field – call into question some of the ways that we do things, some of the things that we write about, and some of the things that we try to explain, because “there’s an easier way” and, as such, it supposedly makes no sense to share such antiquated tips.

That's Borderline Inappropriate

Bummer.

Assume You Know Nothing (This is a Problem?)

But why does this come off as a problem? I mean, do these developers have a point?

Absolutely.

I mean, if there’s a newer technology available that makes, say, connecting the front-end to the back-end a little bit easier, then why would we not use it? If you’ve been involved in development for, say, just ten years (let alone more than that), then you likely know the answer: it’s the result of what has become a legacy application.

The technologies that were used to build said application aren’t necessarily capable of playing nicely with what’s now available. Alternatively, perhaps the architecture of the application doesn’t lend itself to including the new technology, or maybe the business can’t justify the cost of refactoring the code into something more compatible.

For Example

Let’s say that you have a web application built in a prior version of .NET that uses jQuery and loads a ton of information into the front-end.

You’ve used a combination of vanilla JavaScript, something like jQuery UI, Ajax, and a caching mechanism in order to make displaying the data, paging the data, modifying the data, and so on a little bit easier for the user.

The challenge is that each time the page loads, you have to perform at least one – if not several – loops through the data in order to handle it. And we’re not talking, say, a few hundred records; at this point in time, that’s next to nothing in terms of computational cost.

But perhaps we’re talking on the order of hundreds of thousands. And yes, there are a variety of ways to solve this problem:

  • We could grab the first page of the data from the cache and then use Ajax to retrieve each subsequent page (see the sketch after this list)
  • We could load up the entire set of data and cache it on the first hit
  • We could optimize the tables in which the data is stored, as well as the server-level and client-level algorithms that are used to parse the information
  • …or all of the above, some of the above, and so on
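
To make the first option a bit more concrete, here’s a minimal sketch of paging the data over Ajax with a small in-memory cache. It assumes jQuery is already loaded on the page, and the /api/records endpoint, the parameter names, and the page size are all hypothetical – they stand in for whatever the real application exposes.

```javascript
// A small in-memory cache keyed by page number and size.
var pageCache = {};

function loadPage(pageNumber, pageSize, onLoaded) {
    var cacheKey = pageNumber + ':' + pageSize;

    // Serve the page from the cache if we've already fetched it.
    if (pageCache[cacheKey]) {
        onLoaded(pageCache[cacheKey]);
        return;
    }

    // Otherwise, ask the server for just this page of records.
    $.ajax({
        url: '/api/records',               // hypothetical endpoint
        data: { page: pageNumber, size: pageSize },
        dataType: 'json'
    }).done(function (records) {
        pageCache[cacheKey] = records;     // remember it for next time
        onLoaded(records);
    }).fail(function () {
        onLoaded([]);                      // degrade gracefully on error
    });
}

// Usage: render the first page on load; later pages are fetched on demand.
loadPage(1, 50, function (records) {
    console.log('Loaded ' + records.length + ' records for page 1');
});
```

The same idea extends to the second option: prime the cache with the full result set on the first hit and serve every subsequent page straight from memory.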

But the point that I’m trying to make is this:

When building an application, there are times when you cannot foresee how users will employ what you’ve built. You don’t know how much information the application is really going to manage, so you have to make the most educated guess possible as to how to best implement your solution and then go from there.

Coming from the position of someone who has had to do this more than once: sure, refactoring and improving the speed of things can be a lot of fun. But is the business willing to justify the cost? That depends on several factors (that aren’t relevant to this post).

Similarly, as someone who has had to work on legacy applications, I know there are constraints within which we need to work that don’t always allow us to simply use the latest and greatest tools in order to achieve the best performance possible.

So we have to work with what we have.

Don’t Think You Know

The bottom line of what I’m trying to say is this: it’s a common theme to see people leaving comments on the web – especially on technical articles – that start with:

  • “If I were you, I would’ve just done…”
  • “Why did you not use [this library], it offers [these advantages]…”
  • “I’m so tired of seeing these inexperienced developers do [whatever]…”

When the truth is, you don’t know what you would’ve done, because you weren’t at that place and time, using the technologies and tools that were available when the application was built.

I Know Everything

Furthermore, they may not be inexperienced developers at all – instead, they may actually be more experienced than you are and may have had very logical, wise, and business-driven reasons for making the calls that they made with respect to building what they’ve built.

So maybe this is a better approach:

We ask why a given decision was made versus the one that we thought would be better, and we try to gain an understanding of why something was designed and built the way it was.

Ultimately, it may make you a better developer, help you come off as less pretentious, help you appreciate the tools that are available to you now, and help you understand the constraints a person had when they were building the application, and how and why they made the decisions that they did.

This doesn’t mean you have to follow suit, but it can help you appreciate where things have come from, where they’re heading, and maybe even learn a new strategy for how to implement a solution of your own.

We don’t know it all, so we really should stop commenting on others’ posts in the way that we do (though that’s probably just a pipe dream).