There are many articles on the Internet with dozens of tips on how to reduce server load. What don't they suggest: lazyload (Google PageSpeed recommends it), gzip compression, directives in robots.txt, browser-cache rules for static files in .htaccess (even though it's 2019 and 95% of hosts already do this for you), and many other little things. Don't get me wrong: lazyload is great, processing images on the server side is also cool, and by all means combine all your styles into one CSS file and put scripts in the footer with async. But in terms of reducing the load, none of this will help. There are really only a few working methods.

The first is page caching. For Joomla I use the Jotcache component, and for WordPress I recommend the W3 Total Cache plugin. Everyone decides for themselves what cache lifetime to set. Since I run a news site, I set 5 minutes. But if the speed of adding and updating information does not matter much (on article sites, for example), you can set 30-90 minutes. PS: the standard built-in caching components will not help here. Page caching should reduce the load by a factor of 2-3.

The second is the number of database queries. Every enabled component, module, plugin or extension means additional database queries, especially if it is related to content. Badly written modules can make dozens of queries. For example, a "latest posts" module can run not 1 query but as many queries as there are records displayed; the one I checked made about 20 queries (a small sketch of this one-query-per-record pattern is at the end of the article). What is the solution? Remove the module, look for another one, or experiment to reduce the number of queries. You can also handle this with the W3 Total Cache plugin or any other caching plugin; just make sure the database caching option is enabled in the plugin settings. This alone should reduce the load on your database by 50-70%.

The third is bots. Go to /your_site/access_log and see who is there and what they are requesting. If you find overly active bots that bring no benefit, block them. To do this, open .htaccess and add the following lines:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} PycURL [OR]
RewriteCond %{HTTP_USER_AGENT} Aport [OR]
RewriteCond %{HTTP_USER_AGENT} ia_archiver
RewriteRule .* - [F,L]

PycURL, Aport and ia_archiver are the bot names (the user-agent substrings to match). Each condition in the list ends with [OR] except the last, which is simply the bot name, and the closing RewriteRule answers any matching request with 403 Forbidden. By the way, some still suggest keeping bots out through robots.txt, but most of them bypass those directives. If your site consists of 10k+ pages, believe me, this will significantly reduce the load. On a large site, even with little traffic, bots alone can create a load that goes beyond the limits of your tariff.
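If scrolling through the raw log by hand is tedious, you can tally the requests per user agent with a few lines of code. Below is a minimal sketch, assuming the log is in Apache's standard combined format and sits next to the script as access_log (adjust the path for your host); the bot-name substrings are the usual user-agent tokens of the crawlers mentioned in this article, so check your own log for the exact strings:

# count_bots.py: rough tally of requests per user agent in an Apache access log.
# Assumes the "combined" log format, where the user agent is the last quoted field.
from collections import Counter

LOG_PATH = "access_log"  # adjust, e.g. /your_site/access_log

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        parts = line.rsplit('"', 2)  # ... "referer" "user agent"
        if len(parts) < 3:
            continue  # skip lines that do not match the expected format
        user_agent = parts[-2].lower()
        # Collapse full UA strings into short labels so bot variants group together.
        label = next((name for name in ("SemrushBot", "Googlebot", "bingbot", "DotBot",
                                        "Mail.RU_Bot", "AhrefsBot", "YandexBot")
                      if name.lower() in user_agent), "other / real visitors")
        counts[label] += 1

for name, total in counts.most_common():
    print(f"{name}: {total}")

Run it next to your log and you get a breakdown like the one below.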
My site is out of season right now, with traffic of about 200 people a day. But access_log contains 10,400 entries. Of them:

Semrush – 4000
GoogleBot – 700
Bing – 300
DotBot – 500
Mail.RU_Bot – 400

The remaining 2000 are other bots, real people, and so on.
Semrush used to be more modest, but a constant 30% of all daily requests is also a reason for a ban. As for YandexNews, which requests the RSS feed: the feed only contains the five latest news items, so the load stays unremarkable even at 4600 requests per day. Semrush, Ahrefs, Megaindex, Linkpad, Moz and the like simply parse your site to find external links and where they point. Whom to block and whom to leave is your choice. But in any case, with these methods you can reduce the load and, quite possibly, bring it back within the limits of your current tariff.
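And, as promised, a small illustration of the one-query-per-record problem from the section on modules. This is a minimal sketch in plain Python with SQLite, not code from any real Joomla or WordPress module; the table, column names and counts are invented for the example:

# n_plus_one.py: why a careless "latest posts" module multiplies database queries.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO posts (title) VALUES (?)",
                 [(f"Post {i}",) for i in range(100)])

# Careless version: one query for the IDs, then one more query per post.
# For a block that shows 20 posts this is about 21 queries, the pattern described above.
ids = [row[0] for row in conn.execute("SELECT id FROM posts ORDER BY id DESC LIMIT 20")]
titles_slow = [conn.execute("SELECT title FROM posts WHERE id = ?", (post_id,)).fetchone()[0]
               for post_id in ids]

# Sensible version: a single query returns the same 20 titles.
titles_fast = [row[0] for row in conn.execute("SELECT title FROM posts ORDER BY id DESC LIMIT 20")]

assert titles_slow == titles_fast  # identical result, ~21 queries vs 1

The output is the same either way; the difference is how many round trips the database has to serve, and that is exactly what shows up as load on shared hosting.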