Software Engineering in WordPress, PHP, and Backend Development

Author: Tom

WP Privacy, Attestation, Git Updater Lite, and More

For years, I’ve kept track of various resources that I’ve found useful. Having them here makes it easy to refer to them in the future should the need arise (don’t you refer back to your old posts? /s).

It also makes it easy for others to find them if they’re searching for them either in traditional ways or via some of the new ways we have to search (the latter of which is why I find value in still sharing content).

Anyway, over the last two weeks, there have been four things I’ve found that I hope to look into more in the future. And if not, at least they’re here for posterity.


  • WP API Privacy. The default WordPress installation from wordpress.org automatically transmits extraneous information via various HTTP calls that occur in the admin. Some of this data may be cause for concern from a privacy perspective. This plugin seeks to limit that information, attempting to further protect your privacy in the process (via Duane Storey). There’s a rough sketch of the idea after this list.
  • WordPress Plugin Attestation. Add this action to your deployment workflow to generate a build provenance attestation of the plugin ZIP file on WordPress.org (via John Blackbourne). For what it’s worth, “attestation” is just the verification that the software comes from where it claims to originate.
  • RAVE for WordPress. RAVE for WordPress is an automated tool which compares the contents of published packages of WordPress with the canonical source code to verify they have not been tampered with (via John Blackbourne).
  • Git Updater Lite. “Since Git Updater already gathers and parses this data, Git Updater Lite only needs to query an update server run by the developer” (via Andy Fragen).
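
As a rough sketch of the idea behind that first item (this is my own illustration, not what WP API Privacy actually does), WordPress exposes an http_request_args filter that lets you trim what gets sent along with calls to api.wordpress.org. By default, for example, the user agent includes your site’s URL:

    <?php
    // A minimal sketch: replace the default user agent, which includes the site URL,
    // on requests headed to api.wordpress.org.
    add_filter( 'http_request_args', function ( $args, $url ) {
        if ( false !== strpos( $url, 'api.wordpress.org' ) ) {
            $args['user-agent'] = 'WordPress/' . get_bloginfo( 'version' );
        }

        return $args;
    }, 10, 2 );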

And if you stumble across this post and are interested in anything I’ve written in the past week, you can find that below.

    Until the next time there’s a backlog of stuff for me to share, that’s it for now.

    Catch Outgoing Emails From WordPress in Laravel Herd

    Earlier this year, I swapped my local development environment over to Herd (along with a couple of other changes, such as DBngin, which is worth covering in another post).

    There’s a lot to like about it, one of which is how easy it is to begin capturing outgoing emails from whatever application you’re using.

    From the docs:

    Herd Pro provides an SMTP mail server on your local machine that catches all outgoing emails instead of sending them to the world. It displays them in Herd’s own email client and provides rich debugging capabilities for all types of emails.

    Emails From WordPress in Laravel Herd

    If you’re using WordPress and you’re looking for an extremely quick way to add this functionality to your local installation, add the following code to an mu-plugin:

    <?php
    /**
     * Initializes the PHPMailer instance before it is used to send an email.
     *
     * This action hook is used to configure the PHPMailer instance with the necessary
     * SMTP settings, such as the host, authentication, port, username, and password.
     *
     * @param PHPMailer $phpmailer The PHPMailer instance being initialized.
     */
    add_action('phpmailer_init', function ($phpmailer) {
        $phpmailer->isSMTP();
        $phpmailer->Host = '127.0.0.1';
        $phpmailer->SMTPAuth = true;
        $phpmailer->Port = 2525;
        $phpmailer->Username = 'WordPress';
        $phpmailer->Password = '';
    });

    For example, I have a file – herd-mail.php – located in mu-plugins. Once this is added, any outgoing email from WordPress will be immediately captured and funneled to Herd’s email inbox for review.
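
    To confirm everything is wired up, one quick sanity check (the address here is just a placeholder) is to send a test message through wp_mail() and watch it land in Herd’s inbox:

    <?php
    // A quick test: with the mu-plugin above in place, this message should show up
    // in Herd's email client instead of being delivered anywhere.
    wp_mail( 'test@example.test', 'Herd capture test', 'If this shows up in Herd, the mu-plugin is working.' );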

    Notes

    • PHPMailer is part of WordPress core, so there’s no need to install a third-party library.
    • phpmailer_init is a native WordPress hook.
    • It’s also really easy to set up Xdebug in Visual Studio Code to work with Herd. If you’re interested in learning how, review this article.
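
    One more note, and this is an assumption on my part rather than part of the snippet above: since the code reroutes all outgoing mail to a local SMTP port, it’s worth guarding it so it only runs in a local environment. wp_get_environment_type(), available since WordPress 5.5, makes that straightforward:

    <?php
    // Only reroute mail when WordPress reports a 'local' environment
    // (i.e. the WP_ENVIRONMENT_TYPE constant or environment variable is set to 'local').
    if ( 'local' === wp_get_environment_type() ) {
        add_action('phpmailer_init', function ($phpmailer) {
            $phpmailer->isSMTP();
            $phpmailer->Host = '127.0.0.1';
            $phpmailer->SMTPAuth = true;
            $phpmailer->Port = 2525;
            $phpmailer->Username = 'WordPress';
            $phpmailer->Password = '';
        });
    }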

    Thanksgiving 2024

    At this point, it’s more of a tradition to post on Thanksgiving Day than anything else. I’ve been doing it for 12 years now.

    Happy Thanksgiving

    But the general sentiment is still the same as it was 10 years ago:

    We’re celebrating Thanksgiving today in the United States, so I’m taking a day off of the typical routine.

    If you’re in the United States and/or are celebrating today, may it be a good one. And if not, may your day still be just as great.

    I’m looking forward to ending the year strong and with a few more posts as I try to get back in the habit.

    Use Static Variables in Plugin Bootstrap Files

    As nice as event-driven programming can be within the context of WordPress’ hook system, one of the challenges is preventing code from executing every single time the hook is called.

    For example, say you’re writing a function that fires during the init action, but something happens in WordPress core that triggers init to fire again, consequently causing your code to run a second time even though it’s unneeded.

    Multiply this across however many callbacks, files, and pieces of general functionality are registered with the hook, and you may end up affecting performance and/or executing code that has no need to run.


    Static Variables in Plugin Bootstrap Files

    How an LLM thinks this post would look as an image.

    One way this can happen is in a plugin’s bootstrap. Case in point: Say your plugin is registered with the init action and then it sets up a registry which in turn instantiates a set of subscribers that register services. Repeating this every single time init is fired is unnecessary.

    Here’s a way to manage this:

    add_action( 'init', 'tm_acme_function', 100 );
    function tm_acme_function() {
      // The static flag persists between calls, so everything after the guard
      // below runs only the first time this callback fires.
      static $initialized = false;
      if ( $initialized ) {
        return;
      }

      $initialized = true;

      // ... set up the rest of the function.
    }

    If you know how static variables work, you may already be doing this, you can follow the above code, or both. And if that’s the case, there’s nothing else to see here.

    But if not, static variables can be useful in this scenario because static variables maintain state between calls whereas regular variables are reinitialized every time the function fires. This means a static variable retains its value across multiple calls to the function.
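
    If that’s still abstract, here’s a tiny, self-contained illustration (the function names are made up for the example):

    <?php
    function tm_regular_counter() {
      $count = 0;    // Reinitialized on every call.
      $count++;

      return $count; // Always returns 1.
    }

    function tm_static_counter() {
      static $count = 0; // Initialized once; the value persists between calls.
      $count++;

      return $count;     // Returns 1, then 2, then 3, and so on.
    }

    tm_regular_counter(); // 1
    tm_regular_counter(); // 1
    tm_static_counter();  // 1
    tm_static_counter();  // 2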

    Static Variables and Plugins

    So having a static $initialized flag works like this:

    • $initialized starts as false.
    • When the function runs for the first time, the $initialized variable is set to true.
    • On subsequent calls, the condition if ( $initialized ) prevents the rest of the function from executing, effectively short-circuiting it.

    And because of that, this:

    • prevents redundant execution,
    • optimizes performance by avoiding running unnecessary code (especially as it relates to registering duplicate functionality, running multiple queries, or trashing data unintentionally).

    If your plugin’s bootstrap registers a callback with a WordPress hook, consider using static variables to prevent code from being called unnecessarily more than once.
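
    As an aside, and this is a related pattern rather than something from the code above, WordPress also keeps count of how many times a hook has fired via did_action(), which can serve the same purpose inside the callback itself:

    <?php
    // A related approach: did_action() returns the number of times a hook has fired,
    // so the callback can bail whenever init runs more than once.
    add_action( 'init', 'tm_acme_function', 100 );
    function tm_acme_function() {
      if ( did_action( 'init' ) > 1 ) {
        return;
      }

      // ... set up the rest of the function.
    }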

    Maybe ChatGPT Didn’t Wreck Our Type of Content

    To say that 2024 has been a year would be an understatement. Though I’m talking about things that have happened offline, the same can be said for the WordPress economy at large, too.

    On a regrettable level, the amount I’ve written has decreased more this year than in likely any other year since I’ve been writing. Some of this can be attributed to stage of life, some can be attributed to work, and some can be attributed to the rise of AI in our industry.

    AI taking a bite out of WordPress (or something like that).

    Over a year ago, I wrote ChatGPT Wrecked Our Type of Content, in which I claimed:

    Though the goals of this question are not mutually exclusive, I think getting an answer fast often outweighs the “I’m looking for an answer but it was neat to also read about someone else’s situation while searching for it.” And this is why ChatGPT has “wrecked” some of the content a bunch of us typically write.

    But, as stated, it’s been over a year since this was written. And since I work in R&D in my current role, we’ve done – and continue to do – a lot of work with the various systems, applications, utilities, and so on.

    Given that, I – like many of you – have recalibrated my perspective on how this changes the work we do.


    ChatGPT Didn’t Wreck Our Type of Content

    Improved Productivity

    First, it’s undeniable that when used properly, AI assistants can vastly improve productivity. I run both Copilot and Cody in my editor as I’m consistently evaluating which one performs best for a given use case. At the time of this writing, I’m partial to Cody though I also know Copilot is going to support multiple LLMs in the coming months (or weeks?).

    So, sure, AI assistants have changed the way we work in our day-to-day but, as the months have passed, I’m no longer convinced it’s “wrecked” our type of content so much as it’s “drastically altered” how we explain – for lack of a better word – our content.

    One of which is more neutral than the other.

    Large Context Windows but Lacking Context

    Secondly, for as much as I typically work with ChatGPT, Gemini, and/or Claude (is there a clever acronym for all of these, yet?) on a daily basis, I find myself continuing to enjoy well-written content either in newsletters (see The WP Minute, The Repository, or Within WordPress) or blogs (see Brian Coords, what Mike is doing over with Ollie, and so on). Though I’m but one person, each of these properties or people continue to publish even though LLMs are available for any of us to use.

    And that brings me to the final point: There are reasons AI hasn’t completely wrecked the type of content I – and others – have often published:

    • AI hallucinates. Recommendations provided by a given LLM are presented with an authoritative tone even when they reference hooks, function names, or language features that don’t exist.
    • Lack of context. LLMs do not have the context as to how a given developer arrived at a solution and why one was chosen over another. Sure, you can ask for a variety of solutions and tradeoffs but there are times in which it’s still faster to read from someone who’s had the same experience, shared it, and provided contextual information as to how and why they arrived at a solution.
    • Aggressive Autocomplete. I’m a fan of using coding assistants within my IDE. As I said, the level of productivity and speed of solving problems has definitely increased, but that doesn’t mean its attempts to autocomplete a piece of functionality are always helpful. It still takes a critical eye to review what’s being proposed and determine whether or not it’s worth integrating.

    There are likely more, and your experience likely varies from mine – though I suspect not by much.

    The Why Behind the How

    The reason I share all of this is because one of the fundamental things that is missed when working solely with AI is the value that human beings bring to the table when sharing the why behind the how.

    This is not me taking a position on whether or not AI will, can, or should replace humans (or whatever the current hot topic argument happens to be). Instead, it’s me saying that although I appreciate the value AI has brought to our industry and I recognize it alters the need for certain types of content, I no longer think it completely negates or replaces the type of content I – and others – used to write.

    Sure, our approach may need to be tweaked but there’s still plenty of ways to regularly share what we’re working on, how to solve a certain problem, and why one solution was chosen versus another.

    Finishing 2024, Into 2025

    Given that 2024 is drawing to a close in the coming weeks and that we seem to have accepted the role AI plays in the day-to-day work of software development, perhaps I can start writing somewhat regularly once again.

    There’s no shortage of things I’ve built, learned, saved, and archived. And while others have continued to publish their stuff, I’ve missed doing the same. Perhaps the coming weeks – and coming year – are a time in which those of us who so frequently wrote about development can find our way back to doing exactly that.

    Maybe with a few alterations, though.

    © 2024 Tom McFarlin
