MediaWiki file cache update after page edit

I run MediaWiki 1.35 on a dedicated server (24-core, 32 GB RAM, RAID 1) to which I have root access.

The instance contains 21 million+ articles, some of which are large and contain multiple script calls that cannot be optimized further. Consequently, some pages take 30+ seconds to generate, and sometimes the scripts time out and exit with incomplete output, which is obviously unacceptable from a UX perspective.

To mitigate the problem I enabled gzipped file caching as per the official documentation, and it works very well. By running multiple instances of the official file cache generation script (maintenance/rebuildFileCache.php) I can cache every page in about 30 days, as a one-off exercise. However, the default policy is to drop a page's cached copy on edit (reasonably, since the data is then stale) but not to regenerate it when the edit is saved. The first anonymous reader to request the page therefore has to wait for the caching to take place, with the problems described above. The wiki has few editors but many anonymous readers.
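For reference, my setup is just the documented file cache configuration, roughly this in LocalSettings.php (the cache path is a placeholder for my actual directory):

```php
// LocalSettings.php — gzipped file cache for anonymous page views
$wgUseFileCache       = true;
$wgFileCacheDirectory = "$IP/cache"; // placeholder; mine sits on its own volume
$wgUseGzip            = true;        // store and serve gzipped copies
```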

I would like to automatically regenerate a page's cache file on article edit. Is there a fairly simple way to hook into the end of the edit process and trigger regeneration of the cache file, or some other simple method? I would prefer not to add the complication of (say) Varnish or another reverse proxy cache if possible, but will consider any viable option. A sketch of the kind of thing I have in mind follows.
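To make the question concrete, here is a rough, untested sketch of what I was imagining for LocalSettings.php. It assumes the PageSaveComplete hook and warms the cache with a deferred anonymous GET back at the page; the 120-second timeout is an arbitrary value I picked to cover the slow renders:

```php
use MediaWiki\MediaWikiServices;

// Sketch only: after a successful save, request the page anonymously so
// the file cache is rebuilt before a real reader hits the uncached page.
$wgHooks['PageSaveComplete'][] = function (
	WikiPage $wikiPage, $user, $summary, $flags, $revisionRecord, $editResult
) {
	$url = $wikiPage->getTitle()->getFullURL();
	// Defer until after the editor's response is sent, so the editor
	// doesn't sit through the 30+ second render themselves.
	DeferredUpdates::addCallableUpdate( static function () use ( $url ) {
		// No cookies are sent, so this counts as an anonymous view,
		// which is what populates the file cache.
		MediaWikiServices::getInstance()->getHttpRequestFactory()
			->get( $url, [ 'timeout' => 120 ] );
	} );
};
```

The obvious caveats are that the web server must be able to fetch its own public URL and that the render still ties up a PHP worker for its duration. If there is a cleaner way, e.g. invoking the file cache machinery directly or pushing the work onto the job queue, that would be even better.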

TIA