[08:15:38] Hmm, I've run both the mysql dump from before the 1.34->1.35 upgrade through update.php for 1.39 and the dump from after, which I thought was broken. Both end up with revision tables where rev_comment_id is always 0. [08:20:04] select count(1) from revision where rev_comment_id=0; and select count(1) from revision; result in 37754. [08:20:35] This is not a problem in it self of course, the problem is that history pages never load. [08:35:11] Unless they are trivially short. [11:31:39] I'll upgrade to 1.43 and hope it goes away... [11:53:31] That sounds like a maintenance script isn't being run [11:59:24] The 1.43 update script is migrating away from the revision_comment_temp table so this should probably fix the issues. Reading that the the temp table was removed being removed with 1.40 was what prompted me to upgrade go ahead and upgrade 1.43. Hopefully it works. It's running the migration now. [12:02:29] migrateRevisionCommentTemp.php exists in 1.43 but not in 1.39. Yet 1.35 and 1.39 were running queries not against the temp table but the revision table for rev_comment_id and and rev_actor_id. [12:03:39] I've got a feeling we were running in a mode of "write to new, read from old", so forced migration wasn't actually needed [12:03:58] To me it looks like failed detection on whether to use the revision_comment_temp and revision_actor_temp tables or not. [12:04:33] It seems a little weird that no one has apparently reported it before though [12:05:09] Yeah, though this wiki has lived quite a long life. It started out in 2008 I believe. [12:05:51] And probably started on whatever Debian had packaged at the time. [12:21:37] Most likely it is due to me holding MW wrong in some way, but I've not figured it out yet. [13:05:19] And with that forced migration the revisions are back in action. [20:36:11] Hello, I've been newly assigned the task of maintaining a MediaWiki install.  One of my latest tasks has been to mass-edit a bunch of links to change the hostname part of the URL.  Any ideas of tools I can use or info I should read for how to automate (or semi-automate) this? [20:38:30] I assume that's a followup to https://www.mediawiki.org/wiki/Project:Support_desk#how_to_mass-edit_links_on_my_wiki [20:38:37] Heh... good find. [20:38:49] I'm being maybe a bit impatient. [20:39:13] I wonder if Pywikibot as an editing tool could work with some regexes. But I have no experience in using that. [20:39:29] It does sound like a task for a bot though, or your own code using the MediaWiki Action API [20:39:34] Hmm.. never heard of pywikibot. [20:40:04] https://www.mediawiki.org/wiki/Manual:Pywikibot/replace.py [20:41:35] The number of edits that needs to happen is in that annoying middle-zone where doing them manually is annoying, but may be quicker than learning a new programming language/api and writing a script/bot from scratch. [20:42:32] Is this something I could do with SQL if I knew the database schema? [20:48:51] there is https://www.mediawiki.org/wiki/Extension:Replace_Text [20:48:57] (but have never used it) [20:49:45] When I tried it (an hour ago), it would modify the _text_ of the page, but did/could not modify the _links_ within pages. [20:50:10] Quite willing to believe I was doing it wrong.