
Performance optimisation strategies - starting with xhprof in ddev

Open finnlewis opened this issue 1 year ago • 6 comments

We're keen to help people deploy super-fast sites, which is good for user experience, reduces server resource usage and minimises energy consumption, cutting the carbon footprint of the website.

While developing, lots of the default configuration is set for ease of development rather than performance. I'm keen to gather information on options and best practices for improving performance when going live.

Some quick initial thoughts on areas I want to look at:

  1. PHP performance optimisation in views and other heavy pages.
  2. Best practice for composer configuration for performance when deploying to production.
  3. Best practice Drupal cache settings for production.
  4. Reverse proxy caching with Varnish.
  5. CloudFront / Cloudflare / other hosted solutions.
  6. Purging strategies when we have a reverse proxy cache in front of the production site.

1. PHP performance optimisation in views and other heavy pages.

I've just made a start by using ddev to analyse the performance of the PHP call stack with xhprof.

DDEV has made this refreshingly easy.

ddev xhprof on

To which it responds:

Enabled xhprof.
After each web request or CLI process you can see all runs, most recent first, at
https://localgov.ddev.site/xhprof

We can now hit the page or view we want to optimise and then return to https://localgov.ddev.site/xhprof where we can see a list of reports like this.

[Screenshots: xhprof run reports]
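For anyone wanting to repeat this, the whole profiling loop is only a few commands. A minimal sketch, assuming a ddev project at localgov.ddev.site; the /directories path is just a placeholder for whichever page or view you want to profile:

ddev xhprof on                                                 # enable xhprof in the web container
curl -sk https://localgov.ddev.site/directories -o /dev/null   # request the page to record a run
ddev xhprof off                                                # switch profiling back off when done
# Browse the recorded runs, most recent first, at https://localgov.ddev.site/xhprof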

So for this particular view (a directory channel), this is pointing me to:

a) composer autoload
b) database queries

I am now reading up on https://getcomposer.org/doc/articles/autoloader-optimization.md
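For reference, the optimisation levels that article describes boil down to a couple of commands. A sketch only, not specific to LocalGov Drupal:

composer dump-autoload --optimize                            # Level 1: convert PSR-4/PSR-0 rules into a class map
composer dump-autoload --optimize --classmap-authoritative   # Level 2/A: treat the class map as authoritative, skipping filesystem checks for unknown classes
composer dump-autoload --optimize --apcu                     # Level 2/B: cache class lookups in APCu (needs the APCu extension)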

finnlewis · Jan 29 '24 19:01

Notes from Merge Tuesday:

On production, best to run:

composer install --no-dev

It sounds like for Drupal there should be no difference between running

composer install --optimize-autoloader

and

composer install
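If you'd rather not rely on remembering the flag, the setting can be pinned in composer.json instead. A sketch, assuming you want the optimised autoloader applied on every install:

composer config optimize-autoloader true   # writes the setting into composer.json's "config" section
composer config optimize-autoloader        # read it back to confirm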

@Adnan-cds @stephen-cox @ekes @andybroomfield @rupertj any other suggestions?

finnlewis · Jan 30 '24 13:01

PHP config recommendations:

; - Turn *off* opcache file revalidation.  PHP files are not going to change in these environments.
opcache.validate_timestamps = Off
opcache.revalidate_freq = 0
; - Allow enough room in the opcode cache for a full Drupal codebase.
opcache.max_accelerated_files = 20000
opcache.interned_strings_buffer = 16
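To try these out in a ddev environment, here is a sketch assuming ddev's .ddev/php/ override directory. Note that with validate_timestamps off you need a restart to pick up code changes, so this really belongs on production-style environments, where the equivalent would be set through the host's PHP configuration:

mkdir -p .ddev/php
cat > .ddev/php/opcache.ini <<'EOF'
opcache.validate_timestamps = Off
opcache.revalidate_freq = 0
opcache.max_accelerated_files = 20000
opcache.interned_strings_buffer = 16
EOF
ddev restart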

finnlewis · Jan 30 '24 15:01

This is interesting @finnlewis, and something I'm looking at in more detail as part of a post series on my personal blog (that I hope to publish at some point). It would be great to catch up with you on this more.

What would be useful from a PHP/SQL point of view is some way we can collate data from LocalGov Drupal users that have New Relic running in the background (I know Pantheon does).

New Relic has very detailed Drupal breakdowns, from slow-running modules right down to slow code and SQL execution, which once we have enough data may highlight areas for improvement.

millnut · Feb 01 '24 08:02

Looking at New Relic, it's the node that embeds the directories or events view that has the slowest performance.

[New Relic screenshot, 2024-02-01]

@millnut Anything specific you'd look for on New Relic? It's not something I'm that familiar with, but Acquia bundles it with our hosting.

andybroomfield · Feb 01 '24 10:02

@andybroomfield yeah, I'll post some specifics in here later today, but happy to jump on a Slack huddle if you want a run-through of things to look at in New Relic.

millnut · Feb 01 '24 11:02

Hi @andybroomfield, with New Relic I generally view 7 days of data, or 30 days if there have been deployments, just to check for regressions, negative impacts on performance and/or an increase in errors.

I then start with transactions and look into the slow ones by clicking on them. If there are common ones, I check a batch to see if there are any common patterns in the function calls and the areas it highlights as slow.

For Drupal-specific investigation, I use the New Relic Drupal functionality; this is normally the "Drupal" item below the "External services" menu in the navigation.

Within that, you can get an overview of Drupal calls (views, response time and throughput), hooks and modules. In each section (e.g. hooks, modules) it breaks down the information further and you can filter by:

  • Most time consuming
  • Slowest average call time
  • Function call count (this one is useful for finding inefficient functions and poorly behaving loops)

Hope that helps, and I'm on the LocalGov Drupal Slack under Lee (TPXimpact) if you have any specific questions around New Relic.

millnut · Feb 01 '24 13:02