Lee Willis

Background processing for WordPress

I’m the maintainer of a reasonably well-used WooCommerce plugin that generates a full set of product data from a store. This is currently done on-demand to make sure the information is up to date. On smallish stores, everything works just fine. It’s even OK on larger stores as long as your hosting is ‘modern’ (think reasonable PHP versions, execution limits, a persistent object cache, etc.).

However, some client hosting can’t always manage to pull the products, build product objects, apply business logic, and construct the output for all products within execution-time and memory limits. This is especially the case for stores with large inventories but not necessarily high levels of site traffic. These people have had to make do with generating multiple partial feeds, which increases complexity for them and can cause other issues around re-using the output data with multiple integration partners.

There are a few obvious approaches to solving this:

  1. Make the processing less complex
  2. General code optimisations
  3. Pre-generate the output and cache it

I’ve looked at, and worked on, several of these avenues over the years. [1] and [2] have given me some pretty good results, and have even proved a learning opportunity in WordPress internals at times…

https://twitter.com/leewillis77/status/768859124339736577

It’s now reached the point that the only way to eke out more performance would be to stop using WordPress to generate the content, and to pull it out with straight database queries instead. This would no doubt be a lot quicker, and a lot lighter on CPU.

However, it misses out on a lot of the point of WordPress – that plugins can add, update, and remove information programmatically.

From my support work on the plugin over the years, it’s clear that many people rely on plugins / hook customisations to get their final data to be what they want. Raw SQL queries would most likely result in incorrect data, so it’s not really an option.
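To make that concrete, here’s a hypothetical example of the kind of customisation I mean. The filter name and meta key are invented for illustration – they’re not hooks from the actual plugin – but the point stands: this logic only runs when the data is built through WordPress, not when it’s pulled out with raw SQL.

```php
<?php
// Hypothetical example of a site-specific customisation: a filter that
// tweaks an item's feed data. The filter name and meta key are invented
// for illustration; this code never runs if the feed is built from raw
// SQL queries that bypass WordPress.
add_filter( 'my_feed_item_data', 'my_override_feed_title', 10, 2 );

function my_override_feed_title( $data, $product_id ) {
	$custom_title = get_post_meta( $product_id, '_custom_feed_title', true );
	if ( $custom_title ) {
		$data['title'] = $custom_title;
	}
	return $data;
}
```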

That leaves us with item [3] – pre-generating the output and caching it. Due to the scalability concerns above, though, we can’t assume that we can generate the full output in a single request. Nor do we want to have to invalidate the full output if a single product or category changes.

The solution I’ve settled on generates cached output for each item included in the full output. This gives a few benefits:

  1. We can generate the final output with a simple SQL query that pulls all of the fragments, which is quick. We also don’t have to worry about the output missing input from other plugins on the site, because full WordPress processing was used to build each fragment (see the sketch after this list)
  2. We can invalidate and rebuild single products, or sets of products, without having to invalidate everything
  3. We can build up the cached output in parts; we don’t have to generate it all at once
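As a rough sketch of what the output step then looks like – the table name, columns, and XML wrapper here are assumptions for illustration, not the plugin’s real schema – the request boils down to one query and some concatenation:

```php
<?php
// Sketch of assembling the final feed from pre-generated fragments.
// Table and column names are assumptions for illustration only.
global $wpdb;

$table = $wpdb->prefix . 'product_feed_fragments';

// One cheap query pulls every cached fragment; no product objects are
// built and no business logic runs at output time - all of that happened
// when each fragment was generated.
$fragments = $wpdb->get_col( "SELECT fragment FROM {$table} ORDER BY product_id ASC" );

header( 'Content-Type: text/xml; charset=utf-8' );
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo "<items>\n" . implode( "\n", $fragments ) . "\n</items>";
```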

I was left with a requirement for background-processing infrastructure that the pre-generation and batched invalidation could be handed off to.

I’ve been working a lot with Laravel recently. One of its strengths, and something I’ve used on most (if not all) of my Laravel projects, is its Queue implementation. It allows you to easily add jobs to one or more queues and have them processed by workers in the background. You can set up multiple queues, and you’re in control of how many workers you run and which queues they process. The library is usable standalone – it can be used outside of a Laravel project. Unfortunately, this plugin supports WordPress’ minimum requirements, which are still anchored at PHP 5.2, while Laravel’s Queue component requires PHP 5.6.4 as a minimum, making it a non-starter.
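For comparison, this is roughly what a queued job looks like in a Laravel project. The class name, queue name, and handle() contents are illustrative rather than taken from a real codebase:

```php
<?php
// Sketch of a Laravel queued job. Names are illustrative only.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class RegenerateFeedItem implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    protected $productId;

    public function __construct($productId)
    {
        $this->productId = $productId;
    }

    public function handle()
    {
        // Heavy lifting happens here, on a background worker,
        // not during the web request.
    }
}

// Elsewhere in the app, pushing work onto a named queue:
// RegenerateFeedItem::dispatch(123)->onQueue('feeds');
```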

Fortunately, I came across and mentally noted the WP Background Processing library from Ashley Rich / Delicious Brains a while ago.

It’s the perfect solution to this problem. It’s simple, and lets you build background jobs that will run automatically through WordPress’ existing cron infrastructure, taking care to balance running as many jobs as possible within each execution against getting through the queue as quickly as possible. It’s also compatible with PHP 5.2.

Using the library has let me build out some pretty neat invalidation / pre-generation jobs without having to worry about putting together the cron logic myself. Definitely worth looking at if you need to do any background processing in WordPress plugins.
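To give a flavour of what that looks like, here’s a rough sketch of a pre-generation job built on the library’s WP_Background_Process base class. The subclass name and the regenerate_feed_fragment() helper are my own inventions, not code from the plugin:

```php
<?php
// Rough sketch of a pre-generation job using WP_Background_Process.
// The subclass name and regenerate_feed_fragment() helper are
// hypothetical, invented for illustration.
class Feed_Fragment_Process extends WP_Background_Process {

	// Used by the library to build its hook and option names.
	protected $action = 'feed_fragment_regen';

	/**
	 * Process a single queued item (here, a product ID).
	 * Return false to remove the item from the queue, or return the
	 * item to have it retried on the next pass.
	 */
	protected function task( $item ) {
		regenerate_feed_fragment( (int) $item ); // Hypothetical helper.
		return false;
	}
}

// Queueing a batch of products for regeneration:
$process = new Feed_Fragment_Process();
foreach ( $product_ids as $product_id ) {
	$process->push_to_queue( $product_id );
}
$process->save()->dispatch();
```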

