[01:33:19] giby, I'm not sure, but I don't believe there are any built-in pageview analytics tools. This /might/ contain what you want https://www.mediawiki.org/wiki/Category:Web_Analytics_extensions (but I'm dangerously close to https://meta.wikimedia.org/wiki/Cunningham%27s_Law here ;)
[01:36:07] Note: none of those, that I know of, have been updated to use Google Tag Manager (GTM).
[01:41:45] quiddity: they used to have a page view count, my backup script would run once a week and store previous values in a table
[01:41:49] THEN THEY FUCKED IT UP
[01:41:56] It wasn't brilliant, yeah, but it was easy
[04:43:59] by going through LocalSettings.php I found the culprit for "Cannot modify header information" was "if (!ini_get( 'zlib.output_compression')) @ob_start('ob_gzhandler');"
[06:20:07] the install script still fails without an error message and I can't see why. All hints are appreciated: https://paste.debian.net/996344/
[07:58:23] huh, weird, so variant only affects the page contents, which is fine, but I'd kind of expect it to switch the interface language to the right thing at the same time too
[12:16:30] Depression https://phabricator.wikimedia.org/T155678#3772102
[12:39:58] can anyone have a glance at this paste before I write to the list, please? https://paste.debian.net/996344/
[12:43:40] traumschule: you can't use install.php if you have a LocalSettings file in place (which would mean the wiki is already installed)
[12:43:50] what are you trying to do?
[12:56:45] Thanks Vulpix, that already helps a lot! We are installing MW with Ansible and have a LocalSettings.php that is copied over to the newly installed site, but I lack knowledge about the correct order of steps. This is our playbook: https://github.com/traumschule/hitchwiki/blob/ansible/scripts/ansible/roles/hitchwiki/tasks/mediawiki.yml
[12:58:42] if you need to create the database from scratch, you should use install.php *before* copying the LocalSettings.php, then copy LocalSettings.php and then run update.php to create the required tables for additional extensions
[12:59:05] sadly there's no maintenance script to create the database with an existing LocalSettings.php
[13:06:28] Vulpix: will do as you said. We also have other extensions like AntiSpoof and SMW which get installed here: https://github.com/traumschule/hitchwiki/blob/ansible/scripts/ansible/roles/hitchwiki/tasks/mw_import.yml
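A minimal sketch of the ordering Vulpix describes above (install.php first, then the project's LocalSettings.php, then update.php), assuming a standard MediaWiki checkout and MariaDB. The database name, credentials, wiki name and file paths are placeholders, not the actual Hitchwiki values:

    cd /var/www/mediawiki

    # 1. Create the database and core tables; this also writes a fresh LocalSettings.php.
    php maintenance/install.php \
        --dbtype=mysql --dbserver=localhost \
        --dbname=wikidb --dbuser=wikiuser --dbpass=changeme \
        --server="https://wiki.example.org" --scriptpath=/w \
        --lang=en --pass=AdminPassword "Example Wiki" "Admin"

    # 2. Overwrite the generated file with the project's own LocalSettings.php.
    cp /path/to/playbook/files/LocalSettings.php LocalSettings.php

    # 3. Create the extra tables required by the extensions enabled in that file.
    php maintenance/update.php --quick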
[16:27:14] hello
[16:27:20] some days ago the special lint errors page was updated every time a page was edited
[16:28:08] but now the errors remain even when I delete the error on the page
[16:28:40] i tried to purge the page too but that doesn't help
[16:29:26] what's wrong with this? what can i do to update the special lint errors page?
[16:32:38] leoncastro: couldn't it be that the page still has lint errors?
[16:32:58] no
[16:33:14] the error was a font tag, and i deleted all font tags
[16:34:00] yesterday i deleted about 3000 errors of tidy-font-bug
[16:34:18] but the counter and the list weren't updated till today
[16:35:45] some special pages are updated on a daily basis (or similar), not in real time. This one may be one of them (or became one of them recently)
[16:36:43] but can i do something to force the list to update?
[16:37:28] maybe some api call
[16:38:23] no
[16:39:18] not on wmf wikis, if you're referring to one of them. If you have shell access you can update it
[16:39:40] :( well thanks for your answer
[18:40:33] hey, i run a wiki on sqlite and it has become super slow. is there an official guide for migration to postgres?
[18:40:42] Nope
[18:40:51] Postgres isn't very well supported, so it wouldn't generally be advised
[18:41:31] oh, mariadb then? :)
[18:42:07] That'd probably be better, yeah
[18:42:13] There's not much of a migration path though :(
[18:43:21] hm, i guess it is like restoring from an export then? i.e. only pages and stuff is easy, user accounts are another story?
[18:43:47] Depends how familiar you are with sql I guess
[18:43:57] *if* you get mw to create the mysql schema...
[18:43:58] i don't want to touch it :)
[18:44:15] And then get an sql dump from sqlite... import that to mysql without the tables
[18:49:19] how about dumpBackup.php, install fresh from scratch, importDump.php, copying files/images and then manually restoring the user and user_groups tables? any other tables that are not in the xml dump but important for restore?
[18:49:53] ouch, edit logs would also get lost
[18:51:25] but revisions are included. does that mean i would lose the information about _who_ did the edits, or what are those edit logs?
[18:52:24] There's no "edit logs"
[18:52:30] that's in the revision table
[18:52:42] Logs are page moves, user blocks etc etc
[18:56:00] ah, thanks
[19:46:32] lol wat: importDump says (0.27 pages/sec 1.09 revs/sec)
[19:49:53] sqlite is slow? :P
[19:50:34] no, this is the mariadb import
[19:57:52] "PHP Fatal error: Call to a member function getId() on boolean in /srv/www/w/includes/filerepo/file/LocalFile.php on line 1299"
[19:57:53] :(
[20:42:02] Hello! How do I retrieve via the API the ?action=info of a page? I'm interested in the "Wikidata entities used in this page" section.
[20:44:55] https://en.wikipedia.org/w/api.php?action=query&prop=wbentityusage&titles=Paris
[20:45:02] https://en.wikipedia.org/wiki/Paris?action=info
[20:51:42] <3
[20:52:02] aspects?
[20:52:49] Seemingly
[20:52:59] I've no idea why the terminology is completely different
[21:01:38] I mean that I don't know that an "aspect" is :)
[21:01:49] that → what *
[21:02:03] https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bwbentityusage
[21:02:40] (as usual) the result is not documented :|
[21:07:06] valerio-bozzolan, https://www.mediawiki.org/wiki/Wikibase/Schema/wbc_entity_usage
[21:12:22] I've added a link to that page, from https://www.mediawiki.org/wiki/Wikibase/API#wbentityusage which might help.
[21:33:49] valerio-bozzolan: "aspects" are things like "label" or "sitelinks". They say what *part* of the item is used. If you don't care about that, just ignore aspects
[21:35:03] Yep, I've learned that part thanks to quiddity's improvement to the API doc :)
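A rough sketch of the dump-and-reimport route discussed at 18:49, assuming the standard maintenance scripts and the same extensions enabled on both wikis; this is not an official migration path, so test it on a copy before relying on it:

    # On the old SQLite wiki: full-history XML export of all pages.
    php maintenance/dumpBackup.php --full > full-history.xml

    # On a freshly installed MariaDB wiki (see the install.php sketch above):
    php maintenance/importDump.php full-history.xml
    php maintenance/rebuildrecentchanges.php
    php maintenance/initSiteStats.php

    # Not covered by the XML dump: uploaded files (copy the images/ directory),
    # user accounts, groups and preferences (user, user_groups, user_properties),
    # and the logging table (page moves, blocks, ...). Those would have to be
    # migrated by hand from the SQLite database.

Revision attribution itself survives the XML round trip, since each revision in the dump carries its contributor; what gets lost without extra work is the account data and the separate log entries mentioned at 18:52.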
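For the wbentityusage question at 20:42, this is the same query as the URL given at 20:44, just asking for JSON explicitly; the aspect codes in the result (e.g. S for sitelinks, L.xx for labels) are described on the wbc_entity_usage schema page linked at 21:07:

    curl 'https://en.wikipedia.org/w/api.php?action=query&prop=wbentityusage&titles=Paris&format=json&formatversion=2'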