Software Engineering in WordPress, PHP, and Backend Development


Personal opinions and how-to’s that I’ve written both here and as contributions to other blogs.

Move Fast but Understand Things

In Beware of the Makefile Effect, the author defines the phrase as such:

Tools of a certain complexity or routine unfamiliarity are not run de novo, but are instead copy-pasted and tweaked from previous known-good examples.

If you read the article, you’ll see a number of examples of what the phrase means.

Originally, makefiles were files used by C (or C++) build tools to help assemble a program. This is not unlike what’s happening today:

Just as developers have long been susceptible to the ‘Makefile Effect’ when it comes to configuration files, the rise of generative AI tools brings a new risk of compounding a lack of understanding. Like copy-pasting Makefiles, using AI-generated code without fully following how it works can lead to unintended consequences.

Though it absolutely helps us move faster in building The Thing™️, it’s worth noting: many of these configuration files are the result of taking a working version, copying and pasting it into our project, tweaking a few things until it works, and then deploying it.

As it currently stands, we may not be copying and pasting pre-existing files, but generative AI may be close enough (if not a step further): It produces what we need and, if it doesn’t work, we can just tell it to keep tweaking the script based on whatever error is returned until we have something that works.

It’s obviously not limited to configuration files, either. The same goes for functions, classes, libraries, and even full programs.

Again, the advantage this gives us now versus just a few years ago is great, but failure to understand what’s being produced has compounding effects.

To that end, prompt the LLM to explain what each line, block, or function is actually doing and then consider adding comments in your own words to explain it. This way, future you, or someone else, will have that much more context available (versus needing to feed the code back into an LLM for explanation) whenever the code is revisited.

Perhaps this will help resist the Makefile Effect as well as the lack of understanding of whatever code is being produced and ultimately maintained.

Strategies for Locally Developing Google Cloud Functions

For the last few months, I’ve been doing a lot of work with Google Cloud Functions (along with a set of their other tools such as Cloud Storage, PubSub, and Cloud Jobs).

The ability to build systems on top of Google’s Cloud Platform (or GCP) is great. Though the service has a small learning curve, the UI looks exactly like what you’d expect from a team of developers responsible for creating such a product.

An Aside on UIs

Remember how UIs used to look in the late 90s and early 00s? The joke was something like “How this application would look when designed by a programmer.”

UX Planet has a comic that captures this:

If developers were responsible for UIs.

I can’t help but think of this whenever I am working in the Google Cloud Platform: Extremely powerful utilities with a UI that was clearly designed by the same types of people who would use it.

All that aside, the documentation is pretty good – using Gemini to work with it is better – and they offer a CLI which makes dealing with the various systems much easier.

Commentary aside, there are a few things I’ve found useful in each project I’m involved in that utilizes features of GCP.

Specifically, if you’re working with Google’s Cloud Platform and are using PHP (I favor PHP 8.2 but to each their own, I guess), here are some things that I use in each project to make sure I can focus on solving the problem at hand without navigating too much overhead in setting up a project.


Locally Developing Google Cloud Functions

Prerequisites

  • The gcloud CLI. This is the command-line tool provided by Google for interacting with Google Cloud Platform. The difference between this and the rest of the packages is that it’s a utility that connects your system to Google’s infrastructure; the rest of the packages listed here are PHP libraries.
  • vlucas/phpdotenv. I use this package to maintain a local copy of environment variables in a .env file. This works as a local substitute for anything I store in Google Secret Manager.
  • google/cloud-functions-framework. This is the Google-maintained library for interacting with Cloud Functions. It’s what gives us the ability to work with Google Cloud-based functions locally while also deploying code to our Google Cloud project.
  • google/cloud-storage. Not every project will serialize data to Google Cloud Storage, but this package is what allows us to read and write data to Google Cloud Storage buckets. It lets us write to buckets from our local machines just as if our code were running as a cloud function.
  • google/cloud-pubsub. This is the library I use to publish and subscribe to messages when writing to Google’s messaging system. It’s ideal for queuing up messages and then processing them asynchronously.
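To show how the dotenv piece fits in, here’s a minimal sketch of loading local environment variables with vlucas/phpdotenv. The variable name (STORAGE_BUCKET) and the fallback value are illustrative, not from any specific project:

```php
<?php
// A sketch of local configuration loading (assumes vlucas/phpdotenv is
// installed via Composer and this file lives alongside the .env file).
require_once __DIR__ . '/vendor/autoload.php';

use Dotenv\Dotenv;

// safeLoad() is forgiving: it won't throw if no .env file exists, which is
// exactly what happens once the function is deployed and values come from
// Secret Manager or the runtime's environment instead.
Dotenv::createImmutable(__DIR__)->safeLoad();

// Read a value with a local-friendly default.
$bucket = $_ENV['STORAGE_BUCKET'] ?? 'local-test-bucket';
```

The same code runs unchanged locally and in the cloud; only the source of the values differs.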

Organization

Though we’re free to organize code however we like, I’ve developed enough GCP-based solutions that I have a specific way I like to organize my project directories so there’s parity between what my team and I will see whenever we log in to GCP.

It’s simple: The top level directory is named the same as the Google Cloud Project. Each subdirectory represents a single Google Cloud Function.

So if I have a cloud project called acme-cloud-functions and there are three functions contained in the project, then the structure might look something like this:

acme-cloud-functions/
├── process-user-info/
├── verify-certificate-expiration/
└── export-site-data/

This makes it easy to know which project I’m working in, and it makes it easy to work on a single Cloud Function by navigating directly to its subdirectory.

Further, those subdirectories are self-contained such that they maintain their own composer.json configuration, vendor directories, .env files for local environment variables, and other function-specific dependencies, files, and code.

So the final structure of the directory looks something like this:

acme-cloud-functions/
├── process-user-info/
│   ├── src/
│   ├── vendor/
│   ├── index.php
│   ├── composer.json
│   ├── composer.lock
│   ├── .env
│   └── ...
├── verify-certificate-expiration/
│   ├── src/
│   ├── vendor/
│   ├── index.php
│   ├── composer.json
│   ├── composer.lock
│   ├── .env
│   └── ...
└── export-site-data/
    ├── src/
    ├── vendor/
    ├── index.php
    ├── composer.json
    ├── composer.lock
    ├── .env
    └── ...
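For reference, each function’s composer.json might look something like this (the version constraints are illustrative; use whatever Composer resolves for your project):

```json
{
    "require": {
        "php": "^8.2",
        "google/cloud-functions-framework": "^1.1",
        "google/cloud-storage": "^1.30",
        "google/cloud-pubsub": "^1.40",
        "vlucas/phpdotenv": "^5.5"
    }
}
```

Because each subdirectory has its own file like this, functions can upgrade dependencies independently of one another.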

Testing

Assuming the system has been authenticated with Google via the CLI application, testing the function is easy.

First, make sure you’re authenticated with the same Google account that has access to GCP:

$ gcloud auth login

Then set the project ID to match the GCP project:

$ gcloud config set project [PROJECT-ID]

Once done, verify the following is part of the composer.json file:

"scripts": {
  "functions": "FUNCTION_TARGET=[main-function] php vendor/google/cloud-functions-framework/router.php",
  "deploy": [
    "..."
  ]
},

Specifically, for the scripts section of the composer.json file, add the functions command that will invoke the Google Cloud Functions library. This will then, in turn, allow you to run your local code as if you were writing it in the Google Cloud UI. And if there are errors, notices, warnings, etc., they’ll appear in the console.

To run your function locally, run the following command:

$ composer functions

Further, if you’ve got Xdebug installed, you can even step through your code. (And if you’re using Herd and Visual Studio Code, I’ve a guide for that.)
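For completeness, a minimal index.php entry point might look like the sketch below. The function and parameter names are illustrative; the framework’s router autoloads dependencies and invokes the function named by FUNCTION_TARGET with a PSR-7 server request:

```php
<?php
// index.php — a minimal HTTP entry point (a sketch; names are illustrative).
// With FUNCTION_TARGET=main, the functions-framework router invokes this
// function for each incoming request.

function main($request): string
{
    // When invoked by the framework, $request is a
    // Psr\Http\Message\ServerRequestInterface.
    $name = $request->getQueryParams()['name'] ?? 'World';

    return sprintf('Hello, %s!', $name);
}
```

With `composer functions` running, hitting http://localhost:8080/?name=Tom in a browser would exercise this function locally.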

Deployment

Next, in composer.json, add the following line to the deploy section referenced above:

"deploy": [
  "gcloud functions deploy [function-name] --project=[project-id] --region=us-central1 --runtime=php82 --trigger-http --source=. --entry-point=[main-function]"
]

Make sure the following values are set:

  • function-name is the name of the Google Cloud Function set up in the GCP UI.
  • project-id is the same ID referenced earlier in the article.
  • main-function is whatever the entry point is for your Google Cloud Function. Oftentimes, Google’s boilerplate generates helloHttp or something similar. I prefer to use main.

Then, when you’ve tested your function and are ready to deploy it to GCP, you can run the following command:

$ composer deploy

This will take your code and all necessary assets, bundle it, and send it to GCP. This function can then be accessed based on however you’ve configured it (for example, using authenticated HTTP access).

Note: Much like .gitignore, if you’re looking to deploy code to Google Cloud Functions and want to prevent deploying certain files, you can use a .gcloudignore file.
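As an example, a minimal .gcloudignore might look like this (the entries are illustrative; adjust them for your project):

```
# Don't upload local-only files when deploying.
.gcloudignore
.git
.gitignore
.env

# Cloud Build runs composer install during deployment,
# so the local vendor directory can be excluded.
vendor/
```

Keeping .env in this file is especially important since it’s the local stand-in for values that should live in Secret Manager.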

Conclusion

Ultimately, there’s still provisioning required in the web console to set up certain aspects of a project. But once the general infrastructure is in place, it’s easy to run everything locally, from testing to deployment.

And, as demonstrated, it’s not limited to functions: the same approach works with Google Cloud Storage, PubSub, Secret Manager, and other features.


Finally, props to Christoff and Ivelina for providing some guidance along the way in setting up some of this.

Review and Highlights of 2024

I usually don’t write a full “year in review” type of post, but I do sometimes highlight various milestones, goals, and/or notable things that have happened in the last year. And this year, I’ve both the desire and time to write about exactly that.

When drafting the last post, I re-read some of the posts I’d published in the past. While it’s fun to see how things evolve over the years, it also provides a guide for how to write these kinds of posts even when I feel out of the habit.

So here’s a summary of the highlights from this year.


Highlights of 2024

Most Popular Posts

Books

For the past couple of years, I’ve been trying to read two books simultaneously – one fiction and one non-fiction. I don’t participate in book clubs, I don’t try to accomplish a certain number of books per month (or year or whatever other unit of time), and I don’t always try to grab whatever the most recent best seller is.

Instead, I try to read the things that I want and that seem relevant, interesting, and/or helpful. I read a total of 20 books this year (10 fiction, 10 non-fiction).

Here are the things I enjoyed the most:

Omission from this list doesn’t mean I didn’t like a book or that it wasn’t educational. I tried to limit this list to one book from each category, but I couldn’t do it, so I arbitrarily decided to include two from each instead.

Fitness

Over the years, I’ve tried to make exercise a consistent part of my day-to-day. On the whole, I’ve been good about it even though the type of fitness I do each year tends to change.

Some years, I’ve done nothing but run. Other years, I’ve incorporated some type of guided program. And there are other times where I’ve mixed it up between the two.

This year was kind of like the latter: I was running two to three 5Ks a week and lifting weights every other day. Unfortunately, I pinched a nerve in my back in September and that brought everything to a grinding halt.

I started walking every day once again in November but that’s about the extent of what I’m doing. My goal is to get back to both cardio and basic weight lifting in January, but we’ll see.

Lastly, if you work out and have an Apple Watch or an iPhone, I recommend Gentler Streak. It’s far and away my favorite fitness app, primarily because it aims to keep you moving and in a healthy state without having you just blindly try to close your rings.

Music, TV, and Podcasts

My favorite music from 2024 includes the following albums:

  • Moment of Truth by the Red Clay Strays (and their Live At The Ryman album is absolutely worth it, too). If there’s a band that captures 50s rock and roll, 70s southern rock, and timeless blues lyrics all at once, this is it.
  • Deeper Well by Kacey Musgraves. I’ve been a fan of hers for a long time. Golden Hour is still my favorite by her and I haven’t really been a fan of anything since, but Deeper Well is a bit of a return to form.
  • Rebel Diamonds by The Killers. This is more of a greatest hits collection, but if you’ve never listened to the band or are looking to hear how their sound has changed over the years, it’s a good listen.
  • I started listening to Wild Rivers this year and am a fan of what I’ve heard so far. I can’t recommend any single album since most of their songs came up in a recommended playlist.

Most of the shows I watch during the year are watched either while I’m on the treadmill or during the window after the kids are done for the day while Meghan and I are still up.

  • Only Murders in the Building. I thoroughly enjoy Steve Martin and Martin Short’s comedy in this show (and Selena Gomez holds her own with them while also balancing them out). We’ve not watched the most recent season yet, but very much enjoy this show so far.
  • From. It’s hard to succinctly describe this show. If you’re into sci-fi horror, then read up on the premise on Wikipedia. It’s a shame how much time passes between seasons, but that seems to be the norm in the age of streaming. I wish this show was available on a platform with a wider reach.
  • Shrinking. I didn’t start watching this until October but am glad I did so much so that I watched it once through on my own then and immediately watched it through again with Meghan. If you’re a fan of Scrubs, you’ll likely love this show.

I was going to do a Music, Movies, and TV section, but I can count the number of movies I watched this year on one hand, so I’m mixing it up and adding the podcasts I enjoyed the most this year.

This is not an exhaustive list nor is my sharing this saying I’ve listened to every single episode (unless I mention it, obviously). But they are the ones that kept me coming back a few times a month.

To 2025

Since the majority of what I write here on a daily, weekly, or monthly basis has to do with my day-to-day, I try to cover anything outside of that in posts like this.

And these are the highlights for 2024. Like most, I have things that I’m planning to do in 2025 though I’ll wait until this time next year to share how everything went.

If the last couple of years have shown me anything, it’s that this stage of life – while great – has all kinds of ways of making it difficult to make concrete plans. So beyond the high-level goals of reading, working out, listening to music, and writing, there’s not much more to add.

Whatever it is you’ve planned for 2025, here’s to it all going well. And if not, here’s to having the fortitude to push through.

Merry Christmas and Happy Holidays 2024

Over the years, I’ve usually written some type of end of the year post centered around Christmas that also talks about what’s happening and what happened:

And the closest I came to doing something like this last year was an article about The Most Useful (Or Popular) Articles from 2023.

For the first set, it’s fun to look back at how things have changed, and for the latter, it’s neat to look back to see what caught attention over the last year.

These posts are the closest I get to the ‘end of the year’ type of posts and I’d like to eventually get one done for 2024 even if I don’t complete it before the start of the year.

For today, though, it’s a short post to say Merry Christmas and Happy Holidays.


Merry Christmas 2024

Whether or not you’re celebrating Christmas, Hanukkah, Boxing Day, something else, or nothing at all, may the week (or weekend) be good to you.

As for my family and me, we’re celebrating Christmas and spending time with extended family over the next few days.

It’s my favorite time of year and, as cliché as it may sound, I dig spending it with those who are near and dear. And I think everyone should be so lucky.

With that, here’s to the end of the year and the beginning of the next.

Catch Outgoing Emails From WordPress in Laravel Herd

Earlier this year, I swapped my local development environment over to Herd (along with a couple of other changes such as DBngin which is worth covering in another post).

There’s a lot to like about it; one highlight is how easy it is to begin capturing outgoing emails from whatever application you’re using.

From the docs:

Herd Pro provides an SMTP mail server on your local machine that catches all outgoing emails instead of sending them to the world. It displays them in Herd’s own email client and provides rich debugging capabilities for all types of emails.

Emails From WordPress in Laravel Herd

If you’re using WordPress and you’re looking for an extremely quick way to add this functionality to your local installation, add the following code to an mu-plugin:

<?php
/**
 * Initializes the PHPMailer instance before it is used to send an email.
 *
 * This action hook is used to configure the PHPMailer instance with the necessary
 * SMTP settings, such as the host, authentication, port, username, and password.
 *
 * @param PHPMailer $phpmailer The PHPMailer instance being initialized.
 */
add_action('phpmailer_init', function ($phpmailer) {
    $phpmailer->isSMTP();
    $phpmailer->Host = '127.0.0.1';
    $phpmailer->SMTPAuth = true;
    $phpmailer->Port = 2525;
    $phpmailer->Username = 'WordPress';
    $phpmailer->Password = '';
});

For example, I have a file – herd-mail.php – located in mu-plugins. Once this is added, any outgoing email from WordPress will be immediately captured and funneled to Herd’s email inbox for review.
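To verify the capture works, one quick option (assuming WP-CLI is installed) is to fire a test email from the command line and then check Herd’s inbox. The addresses and subject line here are placeholders:

```shell
# Send a test email through wp_mail(); it should appear in Herd's
# email client rather than being delivered anywhere.
wp eval 'wp_mail( "test@example.com", "Herd test", "Captured by Herd?" );'
```

If the message shows up in Herd, the mu-plugin is wired up correctly.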

Notes

  • PHPMailer is part of WordPress core, so there’s no need to install a third-party library.
  • phpmailer_init is a native WordPress hook.
  • It’s also really easy to set up Xdebug in Visual Studio Code to work with Herd. If you’re interested in learning how, review this article.

© 2025 Tom McFarlin
