[00:25:01] lol, cirrussearch indexing maxes out my cpu, ram, and swap
[00:25:08] not optimal
[00:29:48] Search is memory hungry
[00:30:45] lucene doesn't seem to use so much ram on my old server
[00:30:54] about 700 MiB
[00:31:39] i probably have it configured differently
[00:34:33] It seems like a truism that the newer the software, the more ram it takes
[00:35:24] maybe
[00:46:49] ah, it allocates 2 gigs of ram as soon as it launches
[00:46:52] that would do it
[11:16:27] Hello and a BIG Thank You Thank You Thank You for the most awesome wiki engine in the Laniakea Supercluster
[11:25:08] I'm trying to speed up two wikis. First I'll deal with the less crucial one and then the important one.
[11:26:16] The suggestions given by https://gtmetrix.com/ are to enable gzipping of content and to utilize browser caching. If I understand correctly, those are done with webserver configs and not in MediaWiki. Is this correct?
[11:28:29] !cache
[11:28:29] General information about caches can be seen at , for configuration settings see
[11:35:53] mediawiki can compress cached pages itself, but afaik it's incompatible with the compression that most people configure their web servers to do
[11:36:00] which is maybe why it's off by default
[11:36:30] thanks for the info p858snake|L
[11:37:11] okdana: but I read somewhere that the cached pages increase speed only for anonymous users
[11:37:58] yes, the file cache does
[11:38:42] Currently I have in LocalSettings.php:
[11:38:45] $wgMainCacheType = CACHE_ACCEL;
[11:38:47] $wgMemCachedServers = [];
[11:39:40] so if I understand that correctly, there is no caching, since the array $wgMemCachedServers is empty
[11:39:48] Does memcached require a lot of RAM?
[11:40:13] $wgMainCacheType controls object caching, which is usually done in-memory (though there are some other options)
[11:40:36] CACHE_ACCEL uses apcu (a php extension) if it's installed
[11:40:50] memcached can use as much ram as you configure it to
[11:40:55] I'll check if apcu is installed on either server
[11:41:37] apcu is not a debian package.. I'll google for how it is installed
[11:41:54] should be in apt as php-apcu
[11:42:05] ok, thanks okdana. I'll check now
[11:43:48] Yeah, it is installed. Is apcu a good choice? 'apt show php-apcu' says it doesn't have something called "opcode caching". Do I want to stick with apcu, or is there something better I could use?
[11:44:39] the opcache extension is included with php now
[11:44:49] ok. good.
[11:45:02] the original apc extension had its own implementation; apcu is just apc with that stuff taken out, since php handles it now
[11:45:21] i haven't had any issues with apcu, and the manual recommends it for smaller sites
[11:45:23] is it automatically included in the MediaWiki conf, or is modifying LocalSettings.php required?
[11:45:54] just setting CACHE_ACCEL is all you need to do in your mw config
[11:46:05] you might need to mess with your php ini settings to configure apcu though
[11:47:59] 'apt show php-apcu' says that apcu can be used together with memcached
[11:48:08] "for optimal caching"
[11:48:19] Should I install memcached?
[11:49:47] it means they can be used together if you need them to
[11:49:50] but you probably don't
[11:50:08] I don't. The wikis are really small, with barely any other users besides me
[11:50:50] i'm sure apcu is fine then
[11:51:13] they're just different ways to cache things in memory
[11:51:50] Ok..
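(Editor's note: the object-cache setup discussed above boils down to two lines in LocalSettings.php — a minimal sketch, assuming the php-apcu extension is installed and enabled:)

```php
// LocalSettings.php (fragment) — minimal object-cache setup.
// Assumes php-apcu is installed; with CACHE_ACCEL, MediaWiki uses
// the PHP accelerator (APCu) for the main object cache.
$wgMainCacheType = CACHE_ACCEL;

// No memcached servers are needed when APCu handles the object cache.
$wgMemCachedServers = [];
```

As noted in the channel, APCu's own limits (e.g. `apc.shm_size`) are tuned in php.ini, not in the MediaWiki config.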
next item is gzip compression
[11:52:22] It seems it has been included in the default apache2 configuration on the more recently installed server, but the older server does not have it
[12:13:36] irritating that styles for common page components like toc, thumbnails, and tables are kept in an 800-line mega style sheet with incredibly specific selectors on them
[12:14:11] Ait.. managed to enable the gzipping with these instructions: https://gtmetrix.com/enable-gzip-compression.html .. gtmetrix.com seems like a rock-solid service
[12:15:48] Next up: leveraging browser caching by adding proper expiry headers
[12:17:33] If I am understanding correctly, the expire headers may cause problems if an image is uploaded over an old image.. the problem being the old image lingering on
[12:21:07] that could happen, yeah
[12:26:14] https://gtmetrix.com/leverage-browser-caching.html recommends setting the expiry of images to one year and of CSS and javascript to one month. Seems a tad bit aggressive.. Does setting these mean that if the files change, the browsers will not see the changes?
[12:26:59] It is 15:26 .. I better eat my morning porridge
[12:30:11] it's common to set very long expiry times on images, yeah
[12:32:22] wikipedia seems to just use etags for 'content' images though
[12:33:08] okdana: is there any way to invalidate the changed images from the browser cache? I mean, does the browser cache check for modification times or changed size?
[12:34:15] time-based expiry headers like expires and cache-control tell the browser to not even request the file unless it's past its expiry date
[12:34:32] there's no way to remotely tell it the file has changed unless you actually change the url
[12:35:12] with etags, the server can check to see if there's a match, and if there is, it will tell the browser to use the version it has cached, instead of sending it again
[12:35:13] that's not good.. what other alternatives are there to utilize browser caching besides the time-based method?
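(Editor's note: the gzip setup enabled above typically amounts to turning on Apache's mod_deflate. A hedged sketch of a .htaccess fragment — the directives are standard Apache, but Debian normally configures this in /etc/apache2/mods-available/deflate.conf instead:)

```apache
# Compress text-based responses with mod_deflate.
# Images are already compressed, so only text types are listed.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE text/javascript application/javascript
    AddOutputFilterByType DEFLATE application/json application/xml
</IfModule>
```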
[12:35:57] okdana: how do I set up a similar arrangement as WMF uses for browser caching?
[12:36:10] ... using the 'etags' (whatever those are)
[12:37:48] i don't know what the wmf server configuration is, but it seems like they're doing the equivalent of setting a one-year expiry on images in general, but then no expiry on images under /images/ or whatever
[12:52:17] Ok.. I'll set the browser caching only for static images with a .htaccess in the root dir
[12:52:26] thanks for your info and help okdana
[13:00:39] Hmm... seems .htaccess is applied to subdirectories automatically
[13:01:11] I better not touch this browser cache thingy.. the static images are only few and small
[23:43:58] Hello, I got this error on the RecentChanges page when I tried to upgrade my MediaWiki site to 1.27.4 from 1.26. ------------ Declaration of Flow\Data\BagOStuff\BufferedBagOStuff::set($key, $value, $exptime = 0) must be compatible with BagOStuff::set($key, $value, $exptime = 0, $flags = 0) in {path}/extensions/Flow/includes/Data/BagOStuff/BufferedBagOStuff.php on line 427 ------------- any help would be appreciated.
[23:45:19] doubletropius, looks like your Flow extension is broken. You may need to upgrade it
[23:46:15] Is there an SQLite extension to access the MediaWiki API?
[23:47:15] doubletropius: you need to update your Flow extension to the 1.27 version as well
[23:47:19] zzo38: what do you mean by sqlite extension?
[23:48:00] legoktm: I mean like a virtual table that can access remote MediaWiki installations
[23:48:25] not that I'm aware of
[23:48:25] N3X15, legoktm, thank you, will test that.
[23:51:14] I wrote a SQLite extension to download data from the internet with libcurl, although it is a generic extension without a specific API for specific services; those could be separate extensions, implementing virtual tables or functions as applicable to the remote service being accessed.
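(Editor's note: the per-directory browser-caching idea above — a long expiry for static images only — would look roughly like this in a .htaccess file. A sketch assuming mod_expires is enabled; and, as observed in the channel, .htaccess rules also apply to subdirectories unless overridden there:)

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Long expiry for static images; note the caveat discussed above:
    # browsers will not re-request these until the expiry passes,
    # so a file replaced under the same URL can linger in caches.
    ExpiresByType image/png  "access plus 1 year"
    ExpiresByType image/jpeg "access plus 1 year"
    ExpiresByType image/gif  "access plus 1 year"
</IfModule>
```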
(In the case of MediaWiki, a virtual table makes sense because it has one row per page and many columns to indicate the title, contents, last-changed date, etc.)
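(Editor's note: to make the virtual-table idea concrete, each MediaWiki page can be fetched through the standard api.php query API and mapped onto the columns described above — title, contents, last-changed date. A hedged Python sketch that parses a canned, API-shaped response instead of hitting the network; the JSON layout follows the documented `action=query&prop=revisions` format of this era, but the sample data is invented for illustration:)

```python
import json

def page_rows(api_response: str):
    """Map a MediaWiki action=query&prop=revisions JSON response
    onto (title, content, last_changed) tuples -- the columns a
    hypothetical SQLite virtual table might expose."""
    data = json.loads(api_response)
    rows = []
    for page in data["query"]["pages"].values():
        rev = page["revisions"][0]
        # With rvprop=content|timestamp, the wikitext appears under
        # the "*" key in the classic JSON response format.
        rows.append((page["title"], rev["*"], rev["timestamp"]))
    return rows

# Invented sample response, shaped like the MediaWiki API output.
SAMPLE = json.dumps({
    "query": {
        "pages": {
            "1": {
                "title": "Main Page",
                "revisions": [
                    {"timestamp": "2017-11-01T12:00:00Z",
                     "*": "Welcome to the wiki."}
                ]
            }
        }
    }
})

print(page_rows(SAMPLE))
```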