[08:31:23] Hello, I am using MediaWiki 1.33
[08:31:55] I used to use the older version 1.25, where I could find the popular pages under the special pages section
[08:32:28] after I updated MediaWiki, the "popular pages" statistics did not show anymore
[08:33:06] I need the statistics of the most visited pages and the number of viewers
[08:33:15] could you help?
[08:37:52] !e HitCounters | Mariam
[08:37:52] Mariam: https://www.mediawiki.org/wiki/Extension:HitCounters
[08:38:59] thank you!
[13:33:13] Hit counters! I've been missing them since they disappeared
[13:33:17] Thanks!
[13:33:54] ... to Mariam for asking and p858snake for giving the right trigger to the bot
[13:45:00] I need to export-import a useful set of Wikipedia templates in preparation for moving a few pages from a wiki of low importance to a wiki of importance. Last time I tried it I did not get it to work, because the system also pulled in modules. I did not keep notes that time; this time I promise to keep notes
[13:45:26] I just need the citation templates
[13:46:49] I am guessing that https://www.mediawiki.org/wiki/Manual:Importing_Wikipedia_infoboxes_tutorial has applicable parts for the citation template export-import I need to do
[13:50:51] yes it seems to
[13:52:01] (i.e.: get the list of templates per infobox by looking at the list at the bottom when you view source/edit the templates, stuff the template names into the form for Special:Export, etc.)
[13:52:11] you will likely want the modules shown there too
[14:04:15] Thanks apergos. But all I want is the citation templates. I'm going to run backups of the servers and start researching. Actually I need to upgrade a few wikis and this time do it more intelligently and document it.
[14:05:03] ok! good luck, and looking forward to seeing the docs sometime :-)
[14:37:30] Most of the citation templates depend on/use the modules these days
[14:37:37] So unless you grab ooold versions...
[14:59:00] I need to update the MediaWikis to the latest versions before doing other stuff
[15:00:25] I'm looking at https://www.mediawiki.org/wiki/Download_from_Git#Using_Git_to_download_MediaWiki_extensions, but when I clone an extension on the REL1_34 branch and then try to run 'git submodule update --init --recursive', it complains that there is no git repository
[15:01:37] I'm trying to find a quick, clean and efficient way to update the wikis. Do I get this right: if I clone a specific branch, then I can pull the upgrades and just run 'update.php', and that will upgrade to the latest?
[15:08:33] Uh-oh. 1.34 requires PHP 7.2.9+. The Debian 9 machine does not have that
[15:08:55] The Debian 10 machine has that. Maybe I should upgrade the OS of the older machine
[16:00:50] After upgrading the OS to Debian 10, one wiki works, but a wiki where Elastica is installed complains "The Elastica extension requires the cURL PHP extension to operate, but it is not installed on this server." I have probably installed it with something hardcoded to the old PHP, because 'php7.3-curl' is the latest version
[16:02:48] In Special:Version of the wiki that works I find the PHP version is '7.0.33-0+deb9u6 (apache2handler)'. What to do?
[16:03:28] Yeah.. they are using the old PHP, not 7.3
[16:16:59] Got it
[16:17:01] sudo a2dismod php7.0 && sudo a2enmod php7.3 && sudo service apache2 graceful
[16:17:08] that did the trick
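A minimal sketch of the branch-based upgrade asked about at 15:01:37, assuming the wiki root is a git checkout of MediaWiki core; the path is an assumption:

    cd /var/www/wiki                          # wiki root (assumed path)
    git fetch origin
    git checkout REL1_34                      # switch core to the release branch
    git submodule update --init --recursive   # only works inside a git checkout
    composer install --no-dev                 # refresh PHP dependencies (skip if using the vendor submodule)
    php maintenance/update.php                # apply any schema changes

Extensions and skins still need to be updated separately, as noted later in the discussion.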
[16:32:32] Hi guys, what is a good service for hosting and hacking a MediaWiki? I want to edit code, try different MediaWiki versions, switch git branches, ... and then see the changes immediately. But I don't want to run a server on my machine. Are AWS, Azure, Heroku, ... any good, or is there another service? What was your best experience? What would you recommend?
[16:59:40] ohd: https://www.mediawiki.org/wiki/Hosting_services lists a few. Cannot tell how good they are
[17:02:48] ohd: I'd recommend using Vagrant for that purpose. https://www.mediawiki.org/wiki/MediaWiki-Vagrant
[17:03:14] it does run on your machine, but in terms of getting something hackable that you can develop on, it's by far the easiest method
[17:14:22] mobijubo: thank you, but that list I do know :)
[17:15:11] Skizzerz: thank you, it sounds very interesting, I didn't know that. Will check it out.
[17:30:40] extensions/Scribunto does not contain Scribunto.php, what to do?
[17:31:43] mobijubo: that's correct; PHP entry points for extensions were deprecated a long time ago and have since been removed from a lot of extensions
[17:31:51] they aren't needed anymore
[17:32:07] you load extensions by calling wfLoadExtension() in your LocalSettings.php file
[17:37:42] Thanks Skizzerz. That was the problem. Wiki #1 upgraded now
[17:54:56] Upgrading the next wiki I run into the problem that extensions/cldr/extension.json does not exist. How do I generate it?
[17:56:30] !e cldr
[17:56:30] https://www.mediawiki.org/wiki/Extension:cldr
[17:56:33] check to see if a more recent version of that extension is available
[17:56:42] generally, as you update MediaWiki you will also need to update all extensions you use
[17:58:15] old versions of extensions with new versions of MediaWiki may not work as expected or may even cause the entire wiki to cease working.
[17:58:42] mobijubo: ^
[17:58:54] I got all extensions with git: 'git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/ --branch REL1_34'
[17:59:13] I did a shell loop substituting in each extension needed
[17:59:49] it seems the loop failed for this one somehow
[18:00:02] ok, I'll try again
[18:00:02] looking at the git source, the REL1_34 branch for cldr does indeed have an extension.json file
[18:00:04] extension.json exists on that branch: https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/cldr/+/REL1_34
[18:00:48] note that `git clone` will *fail* if the target directory exists; you use that to download a fresh extension, not to switch branches on an existing (already-downloaded) extension
[18:01:40] some extensions may have additional dependencies or requirements, so it's good to check their individual documentation pages to see if there's other setup needed
[18:12:29] Now I'm missing extensions/ConfirmAccount/extension.json. Which command generates these, or are they supposed to come with the package?
[18:19:11] I moved the directory away and cloned into it with 'git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/ConfirmAccount --branch REL1_34' and there is no extension.json. What to do?
[18:23:39] * mobijubo is confused and goes to get some coffee
[18:27:56] When I changed 'wfLoadExtension('ConfirmAccount');' to 'wfLoadExtension('ConfirmAccount');' the upgrade ran just fine. Perhaps the repo is missing the extension.json?
[18:30:17] oh sorry, my bad, copypasted the same thing twice. What I meant is that if I load it the old-fashioned way, the upgrade runs fine
[18:31:14] mobijubo: read the extension documentation pages
[18:31:25] https://www.mediawiki.org/wiki/Extension:ConfirmAccount seems to indicate that ConfirmAccount still requires the old way of loading
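A rough sketch of the per-extension loop described at 17:59:13, assuming a REL1_34 wiki and a purely illustrative extension list; since `git clone` refuses to reuse an existing directory, existing checkouts are moved aside first:

    cd /var/www/wiki                               # wiki root (assumed path)
    for ext in Scribunto cldr ConfirmAccount; do   # example list only
        [ -d "extensions/$ext" ] && mv "extensions/$ext" "extensions/$ext.old"
        git clone --branch REL1_34 \
            "https://gerrit.wikimedia.org/r/mediawiki/extensions/$ext" \
            "extensions/$ext"
    done

Each extension is then loaded from LocalSettings.php with wfLoadExtension(), except the ones (such as ConfirmAccount here) whose documentation still calls for the old loading style.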
[19:21:18] How does the --doshared switch work? Last time I upgraded the wiki family I got it to pass by disabling the shared tables (ipblocks, user_groups and interwiki), dumping them from the master wiki and loading them into the testwiki (the 2nd one in the family)
[19:21:48] and after upgrading the 2 wikis in the family I re-enabled the shared tables
[19:23:05] shared tables aren't updated by update.php unless you run it with --doshared
[19:23:17] that's the theory
[19:23:45] so should I try the --doshared on the wiki that contains the shared tables or the wiki that borrows the tables?
[19:24:02] the one that contains them
[19:25:41] (your shared table list also looks a bit odd -- you're sharing things that key into user, like ipblocks and user_groups, but not sharing the user table itself?)
[19:26:56] Skizzerz: How does the main wiki know about the tables being shared by the test wiki? The shared tables are inside of 'elseif ($wikiId == 'testwiki') { ... }'
[19:28:33] Skizzerz: Yes, I am a bit confused about that: I'm not sharing the user table, yet all the users show up in https://test.consumerium.org/wiki/Special:ListUsers
[19:29:24] generally all wikis in the farm (including the "master" one) would use the same config for $wgSharedDB, $wgSharedTables, etc.
[19:29:41] if your setup varies from that, then you don't need --doshared when upgrading the master
[19:30:15] So I try upgrading the master wiki first and then the testwiki with '--doshared'?
[19:30:42] the shared tables only need to be upgraded once
[19:30:54] mobijubo: how are you defining the shared tables in LocalSettings?
[19:31:01] like what's the "..." bit above
[19:31:21] ok. And update.php will detect that the shared tables have been upgraded already
[19:32:08] update.php should ignore shared tables entirely unless you pass --doshared
[19:32:16] I say "should" because that may not be a guarantee
[19:35:18] Skizzerz: So you figure that I should share the "user" table too?
[19:35:32] I figure you should answer the question I asked above
[19:36:50] Skizzerz: https://paste.debian.net/1122747/
[19:37:20] mobijubo: thanks. That method of setting it up keeps the default values and just adds to them. By default, the 'user' and 'user_properties' tables are shared
[19:37:38] so you are sharing those two tables as well, even though LocalSettings doesn't explicitly list them
[19:38:13] (this is just meant as an FYI, there is nothing inherently good or bad about sharing the user table)
[19:38:23] Skizzerz: Ok, thanks for the info. I've been wondering how single login works without manually sharing user and user_properties
[19:38:59] anyway, with your setup, simply running the regular updater (WITHOUT --doshared) should be fine
[19:39:24] ordering-wise, you'll likely want to update the master first before updating any of the others
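A sketch of the update order Skizzerz describes, with the wiki paths and names as assumptions; with the shared-table setup above, the regular updater without --doshared is said to be sufficient:

    php /var/www/masterwiki/maintenance/update.php   # master wiki first (assumed path)
    php /var/www/testwiki/maintenance/update.php     # then the wiki borrowing the shared tables
    # --doshared would only be passed on the wiki that owns the shared tables,
    # and only if you want update.php to touch them at all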
[19:44:22] mobijubo: what version of MediaWiki are you upgrading to?
[19:44:51] Skizzerz: 1.33 -> 1.34
[19:46:19] mobijubo: you will likely also want to add the following to the list of shared tables: 'actor', 'ipblocks_restrictions', 'user_former_groups'
[19:46:26] do that BEFORE doing any upgrades
[19:46:52] well, it may actually be too late with actor since you're already on 1.33
[19:47:24] in which case I'd suggest un-sharing ipblocks instead and using an extension that allows for global blocking
[19:49:27] (the issue with sharing ipblocks but not sharing actor is that the other wikis may show incorrect results to a blocked user when they explain who blocked them, and may possibly cause database errors depending on who set the block)
[19:50:46] sharing ipblocks_restrictions is necessary if partial blocks are in use while ipblocks is shared. Sharing user_former_groups is recommended if user_groups is shared, to prevent certain loopholes with autopromotion rules
[19:51:38] however, since you're already on 1.33, the different wikis likely already have divergent actor tables, so flipping sharing on for that may cause a lot of issues in terms of attribution of edits
[19:52:02] so the best course of action may be to stop sharing ipblocks instead
[19:52:12] (and keep actor unshared as well)
[19:59:04] Now I'm running into a missing extensions/LinkedWiki/extension.json
[20:02:47] Trying to clone into it I get "fatal: Remote branch REL1_34 not found in upstream origin"
[20:03:55] I'm going to see what the Extension Distributor says
[20:05:47] Yeah, there's only a "master" branch at https://www.mediawiki.org/wiki/Special:ExtensionDistributor/LinkedWiki
[20:06:08] So... do I clone the master branch, or how should I proceed?
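LinkedWiki has no REL1_34 branch, so the only thing available from git is master; a hedged workaround sketch (master tracks current MediaWiki development, so compatibility with 1.34 is not guaranteed), with the path as an assumption:

    cd /var/www/wiki/extensions                # assumed path
    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/LinkedWiki
    # no --branch flag: this checks out master, the only branch available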
[20:08:47] Now I'm running into "Class 'Wikibase\DataModel\Entity\ItemId' not found". I'm guessing some unmet dependencies. Full output: https://paste.debian.net/1122750/
[20:19:18] https://www.mediawiki.org/wiki/Extension:LinkedWiki does not seem up to date, as it suggests running 'yarn install --production=true', to which yarn complains 'yarn install --production=true'
[20:19:47] sorry, copypaste fail again... It complains "yarn: error: no such option: --production"
[20:31:05] I have a faint recollection that yarn is two different programs, and I guess 'sudo apt install yarn' installed the wrong one. Now trying with npm
[20:38:39] I installed npm and ran 'npm install --production' in the extensions/LinkedWiki directory, but I'm still getting the problem "Class 'Wikibase\DataModel\Entity\ItemId' not found"
[20:40:36] I only know of yarn as part of the oozie-yarn-hadoop ecosystem
[20:42:15] I guess there are composer dependencies you need to install
[20:43:13] running npm again I notice it is complaining about a too-old Node.js and "bootstrap@4.3.1 requires a peer of jquery@1.9.1 - 3 but none is installed."
[20:43:34] on that I can't help you
[20:43:49] the other bit looks like it's missing some Wikibase dependency
[20:45:06] if you're trying to use wikidata ids or something, you'll need to follow the Wikibase extension installation bits
[20:45:12] for the client end I suppose
[20:47:18] The Debian package 'nodejs' and this are different, I gather: https://nodejs.org/en/
[20:50:05] https://wikimedi.ca/wiki/Sp%C3%A9cial:Version hm, they are using LinkedWiki without Wikibase, so maybe there is something else triggering that error
[20:50:10] some other extension
[20:50:38] and it's quite likely that whatever version Debian ships is behind the nodejs site one
[20:50:44] for npm etc
[20:50:58] yeah. I'm looking at instructions to install a more recent one
[20:51:05] ok!
[20:51:32] it's late here so I'm going to wander off... hope you get everything sorted out
[20:58:13] mobijubo: I recommend you use nvm or similar
[20:58:40] then lock the node version with .nvmrc (and/or .node_version)
[21:56:33] maybe my composer is outdated and that's why I get the following errors when running 'php maintenance/update.php --doshared': https://paste.debian.net/1122759/
[22:03:21] mobijubo: is your Wikibase extension updated? Because that setting is deprecated: https://www.mediawiki.org/wiki/Manual:$wgSquidMaxage
[22:03:38] Composer was slightly old. Now it is version 1.9.1, but still no worky
[22:08:51] Vulpix: I just moved extensions/Wikibase out of the way and recloned into it, but I'm still getting the same error
[22:09:47] Ah, I got distracted by the "Undefined index: wgSquidMaxage", but that's only a notice
[22:09:57] Yeah
[22:11:39] apparently, you need to run composer there
[22:13:27] I ran composer in extensions/Wikibase after installing jquery, popper.js and bootstrap with yarn
[22:28:21] so 'yarn install --production=true --scripts-prepend-node-path' is having a problem with an npm thing, it seems. Here is what it logs: https://paste.debian.net/1122761/
[22:57:51] 'yarn add jquery' fails, complaining that make exited with code "2"
[22:59:01] You really shouldn't need to be running node package managers to get MW working
[22:59:18] But Wikibase..
[23:00:26] You don't need to for Wikibase either
[23:00:43] Class 'Wikibase\DataModel\Entity\ItemId'
[23:00:46] is a PHP class
[23:01:01] `composer install --no-dev` in the Wikibase folder should take care of that
[23:20:09] Reedy: Thanks for your help. Got it to work now
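The fix Reedy points to, spelled out: Wikibase ships its own composer.json, so its PHP dependencies (which include the Wikibase\DataModel classes) have to be installed inside the extension directory; the wiki path is an assumption:

    cd /var/www/wiki/extensions/Wikibase       # assumed path
    composer install --no-dev
    cd ../.. && php maintenance/update.php     # then re-run the updater from the wiki root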