[04:51:24] hello [04:52:08] why do i get spam from mediawiki on my own talk page? [04:59:25] hi [05:00:30] cronolio: Can you be more specific please? [05:01:00] andre__: https://www.mediawiki.org/wiki/Topic:V7071nacrksgi0o1 [05:01:49] cronolio, That is not "from mediawiki" but it has a sender signature [05:02:21] (but the user name of the recipient is broken in that message, it seems) [05:02:30] https://www.mediawiki.org/wiki/Extension:MassMessage [05:02:35] so, can i ban this user? [05:04:09] cronolio: If you're a 'normal' user then you cannot "ban" other users. Your settings allow you to mute users though. [05:04:39] i'm a spam warrior from time to time [05:08:13] is there anyone who can explain more about freenode? [05:12:58] arad: Nobody knows until you post your followup question anyway [10:41:13] hi all [10:41:46] is there a way, plugin or core, to show a brief "modification history" at the top of every page? [10:52:59] What would that look like? [10:56:49] andre__: something like this: https://i.imgur.com/6sQw3MS.png [10:58:04] (I think this is stupid and that the history page is way better... but I don't really have a choice here and would like to automate this table somehow) [12:41:57] Has anyone here had success with importing file (image) history when importing files? I am trying to follow the instructions online, but it seems to simply not work [12:49:56] which exact instructions online? [12:51:12] andre__: I can use importImages.php without any problems, but if I do Special:Export on a given file's history page and then Special:Import, it wants an interwiki prefix. If I use importDump.php it doesn't ask for a prefix and imports seemingly fine, but I can't actually see the history page... [13:33:09] I seem to be encountering poorly documented or buggy features in the maintenance scripts :-( [13:47:24] fdarling: https://www.mediawiki.org/wiki/How_to_report_a_bug [13:47:45] andre__: have you ever exported/imported images/history successfully?
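The export half of the workflow fdarling describes can be scripted. A minimal sketch, assuming a standard `/w/index.php` path; the wiki URL and page title are placeholders, and note that Special:Export returns wikitext revisions only, not the binary file contents:

```python
# Sketch: build the Special:Export URL for one page's full revision
# history and fetch it. index.php location and title are assumptions.
from urllib.parse import urlencode
from urllib.request import urlopen

def build_export_url(index_php: str, title: str) -> str:
    """Return the URL that exports `title` with all revisions as XML."""
    params = urlencode({
        "title": "Special:Export",
        "pages": title,
        "history": "1",   # include full history, not just the latest revision
    })
    return f"{index_php}?{params}"

def export_history(index_php: str, title: str) -> bytes:
    """Fetch the export XML for one page (network call)."""
    with urlopen(build_export_url(index_php, title)) as resp:
        return resp.read()

# Usage (commented out because it hits the network):
# xml = export_history("https://www.mediawiki.org/w/index.php", "File:Example.jpg")
# open("export.xml", "wb").write(xml)
# ...then feed export.xml to importDump.php on the target wiki.
```

The resulting XML is what importDump.php consumes without demanding an interwiki prefix, which is why the conversation below turns to that script instead of Special:Import.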
[14:01:19] Don't think I ever tried every corner of the MediaWiki software, no :) [14:47:38] fdarling: i don't think there is a way to import/export the upload history of files... [14:47:56] duesen_: Of course "history of files" can mean two things [14:47:59] One, the wiki page history [14:48:02] And two, the actual file history [14:48:19] duesen_: I am having some success doing it: if you use importDump --uploads with an XML file that has embedded images/history, it works, but the script is buggy and emits errors where it's unclear whether they are fatal [14:48:55] duesen_: I am still trying to figure out what's going on, it has to do with the JPEG image being parsed as XML and throwing (non-fatal?) errors; the feature was marked as "experimental", which means "janky" :-P [14:49:28] i wasn't even aware it existed, and I rewrote part of the export code recently :P [14:49:33] so yeah, "experimental"... [14:49:58] embedding binary file data in the xml sounds... like a brave approach [14:50:17] not to say adventurous [14:50:33] duesen_: it is base64 encoded, but then it's thrown into a temporary file and loaded as XML (incorrectly) [14:50:38] duesen_: this script is janky [14:52:02] duesen_: the irritating thing is that the --debug option doesn't seem to actually work, I can't yet figure out why [14:52:41] duesen_: I think even worse, I'm pretty sure it's both experimental and from like 12 years ago [14:53:25] I would honestly just recommend doing files separately from pages (including the file description page as a page) [14:53:38] import the xml containing pages, and then run importImages.php for all the images [14:53:46] probably would work better [14:54:45] fdarling: the output of --debug goes to a log file, see $wgDebugLogFile [14:55:12] bawolff: but that doesn't give you upload history. [14:55:27] well, it gives you dummy revisions for the uploads.
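To make the "base64 in the XML" discussion concrete, here is a small sketch of reading embedded upload data back out of a dump. This is illustrative only: the element names are assumptions about the dump schema, so check your wiki's actual export XML before relying on them.

```python
# Sketch: pull base64-embedded file contents out of a dump-style XML.
# Element names (<upload>, <filename>, <contents>) are assumptions.
import base64
import xml.etree.ElementTree as ET

SAMPLE = """<mediawiki>
  <page>
    <title>File:Example.jpg</title>
    <upload>
      <filename>Example.jpg</filename>
      <contents encoding="base64">aGVsbG8gd29ybGQ=</contents>
    </upload>
  </page>
</mediawiki>"""

def extract_uploads(xml_text: str):
    """Yield (filename, raw_bytes) for each embedded upload revision."""
    root = ET.fromstring(xml_text)
    for upload in root.iter("upload"):
        name = upload.findtext("filename")
        b64 = upload.findtext("contents") or ""
        yield name, base64.b64decode(b64)

for name, data in extract_uploads(SAMPLE):
    print(name, len(data))
```

The key point duesen_ and fdarling are circling is that the decoded bytes are binary and must never themselves be handed to an XML parser, which is exactly the bug described further down.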
but not the old versions of the files, right? [14:55:28] oh hmm [14:55:47] but yea, i can't think of a way that actually works :) [14:56:16] yeah, i think the main choice is someone rewrites the xml support so it actually works [14:56:59] Or i guess manually import the SQL and set up the file structure (not for the faint of heart, I don't really recommend this) [14:58:46] bawolff: using MCR to replace the oldimage table would partially solve this. but only partially. [14:59:44] it boggles my mind that exporting/importing images with history isn't a standard feature [15:00:08] fdarling: it's impractical for any but trivially small sites [15:00:13] file data is just too big [15:00:41] also, upload history tends to be pretty worthless. [15:01:04] I am actually more worried about the upload date and who uploaded it [15:01:24] for the latest version, or all versions? [15:01:37] for the latest version [15:02:10] hm... importImages can set that for you. but I'm not sure there is an export feature that would write a file in the appropriate format. [15:02:29] I added this feature to importImages for bulk imports from 3rd party sources such as public archives [15:03:37] ah no, sorry. that only works for the description, not other metadata [15:03:47] it was a long time ago that i wrote this... [15:05:38] so, i suppose importImages.php should have the ability to fetch metadata about the uploads from a file. perhaps a standard xml dump even. it would base the metadata on revisions. it would have to somehow figure out which, though... [15:05:45] sounds like a nice little feature request :) [15:05:50] so I managed to get it working by putting a try {} catch () {} block around an attempt to use XmlReader->open() on invalid XML data. They were checking the return value to see if it worked, but apparently it throws an exception and never makes it down to the "graceful" handling code...
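The bug fdarling just described is a classic pattern: the calling code checks a return value for failure, but the parser actually signals bad input by raising, so the "graceful" branch is dead code. A miniature of the fix, in Python rather than MediaWiki's actual PHP (`try_parse_xml` is a stand-in, not MediaWiki's API):

```python
# Sketch of the fix: catch the parse exception instead of relying on a
# return-value check that never fires. Useful when raw JPEG bytes end up
# in a temp file that the importer then tries to open as XML.
import xml.etree.ElementTree as ET

def try_parse_xml(data: bytes):
    """Return a parsed tree, or None if the data is not valid XML."""
    try:
        return ET.fromstring(data)
    except ET.ParseError:
        return None  # the "graceful" path is actually reachable now

assert try_parse_xml(b"<ok/>") is not None
assert try_parse_xml(b"\xff\xd8\xff\xe0 jpeg bytes") is None
```

Wrapping the call site is the minimal patch; duesen_'s "easy fix" below (don't feed a JPEG to the XML loader in the first place) would remove the need for it entirely.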
[15:06:35] duesen_: if you are a developer on this stuff, perhaps I should speak with you more later about fixing this functionality? [15:06:51] duesen_: I'd love to help and submit patches, but I am jumping into MediaWiki's code here with little experience :-S [15:07:45] fdarling: for me to work on this, you'd have to convince someone in wmf that this is worth spending resources on :) it's not trivial. [15:08:04] if you want to attempt to fix it, i can try to help and review code [15:08:49] but I don't know how responsive I'll be. depends on other commitments and workload [15:09:37] I understand it's all volunteer work, and I would love to volunteer myself, it's just tough when you don't have someone to orient you to the existing codebase [15:09:42] fdarling: i suppose the "easy fix" to try first is not to load a jpeg as xml :) [15:10:06] development is mostly done by paid staff (including myself) [15:10:32] volunteer contributions are very welcome, but it can be hard to get help and code review. [15:10:39] this channel here is a good place for that, though [15:10:54] if you have concrete questions, i can try to answer them [15:12:04] oh huh, looks like brion wrote this in 2008 :P [15:12:07] * duesen_ pokes brion [15:14:53] the relevant class is ImportableUploadRevisionImporter [15:15:03] but i suppose you had already found that. i don't know anything about that code [15:15:47] oh, ah - addshore might :) [15:16:11] duesen_: imported file history lists the user as "Imported>OtherWikiUsername", and it doesn't show the usual (talk | contribs | block) links next to the name [15:16:20] hey addshore, another question for you here ;) apparently ImportableUploadRevisionImporter doesn't work as expected [15:16:28] duesen_: is it because it's detecting that the user doesn't exist? or is it a special "comment in place of user" placeholder?
[15:16:39] *reads up* [15:17:01] duesen_: I wasn't looking at that class specifically [15:17:06] Oh wait, going into hotel room now, back in a sec [15:17:08] fdarling: that's because there is no way to know whether that user is the same as the local user. it's intended to prevent misattribution during cross-wiki imports. [15:17:27] there may be options for considering users local... that's actually something addshore can probably help with [15:17:30] duesen_: I understand, especially in the general case, but is there a way to fix it up afterwards? [15:17:54] i'm not sure, to be honest [15:18:13] that functionality (separating "actors" from "users") is quite new. anomie is the expert [15:25:12] duesen_: importDump has a --no-local-users option that is off by default, the default behavior being to assign revisions to local users where available. It doesn't seem to actually be working though [15:28:43] is it working for regular page revisions? [15:28:49] duesen_: no [15:29:03] hm.... that's odd. [15:29:37] duesen_: it seems that in WikiImporter.php there are two instances of "new ExternalUserNames()", the default having the $assignKnownUsers constructor argument hardcoded to false (for default behavior?) and one passing in a boolean from setUsernamePrefix() [15:29:44] duesen_: so I think it was never fully tested or something [15:30:36] duesen_: the problem is that for importDump the default is to assign local usernames, so the defaults seem inconsistent, and it's not propagating its different default behavior to the library it's using [15:31:50] duesen_: if $importer->setUsernamePrefix() is called then it'll propagate it, but it is only called if the --username-prefix option is set, so I am going to have to set that to some dummy value just to get local user assignment working [15:31:54] duesen_: I'd call this a bug :-P [15:32:46] sounds like a bug indeed.
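The inconsistency fdarling found boils down to a wrapper and its helper disagreeing on a default, with the correct value only propagated along one code path. A miniature of that pattern in Python; the class and method names echo the MediaWiki ones but are illustrative, not the real implementation:

```python
# Sketch of the defaults-propagation bug: the CLI default is "assign known
# local users", but the importer's helper hardcodes the opposite and only
# picks up the right value when set_username_prefix() happens to be called.
class ExternalUserNames:
    def __init__(self, prefix: str, assign_known_users: bool = False):
        self.prefix = prefix
        self.assign_known_users = assign_known_users

    def apply(self, name: str, local_users: set) -> str:
        if self.assign_known_users and name in local_users:
            return name                     # attribute to the local account
        return f"{self.prefix}>{name}"      # keep as an external actor

class WikiImporter:
    def __init__(self):
        # Bug: hardcoded False, even though the CLI default is "assign".
        self.users = ExternalUserNames("imported", assign_known_users=False)

    def set_username_prefix(self, prefix: str, assign_known_users: bool):
        # Only this path propagates the flag correctly.
        self.users = ExternalUserNames(prefix, assign_known_users)

imp = WikiImporter()
print(imp.users.apply("Alice", {"Alice"}))   # imported>Alice (unexpected)
imp.set_username_prefix("imported", True)
print(imp.users.apply("Alice", {"Alice"}))   # Alice
```

This is why passing a dummy --username-prefix works around the bug: it forces the one code path that forwards the assign-known-users flag.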
[15:33:17] or at least rather confusing [15:54:00] No WiFi in my room :( [15:54:03] *reads up* [15:54:57] In the wikiimporter stuff it should definitely be possible to import things and attach them to local users [15:55:50] addshore: I have it working, but I had to make a small patch and a workaround with command line options [18:21:08] Hi, I just upgraded to 1.33.0 and the page *contents* appear empty, while the text/page tables still seem ok [18:21:55] Have you run the database updater? [18:22:37] yes [18:22:50] the mariadb updater then update.php [18:22:55] both did a lot of work [18:23:30] if I try to edit a page it says that revision 0 is missing [18:24:03] What's the mariadb updater? [18:24:13] mysql_update [18:24:44] random page finds an existing page, but its contents still appear empty [18:25:42] Short pages do print the page sizes [18:26:09] it's fascinating [18:26:31] All pages lists all the page names [18:27:45] nothing in error_log [18:45:45] So no idea I guess? [19:09:05] Sarayan: I don't know too much about your case, my upgrades have always gone smoothly, but I have seen database issues in general before and I'd check to make sure the table structure makes sense
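One concrete way to act on that last suggestion: the "revision 0 is missing" symptom suggests page rows whose page_latest no longer resolves to a revision row. A hedged sketch of that check; the connection setup is an assumption (any DB-API driver works), while the column names match MediaWiki's core schema:

```python
# Sketch: find pages whose latest revision pointer is dangling, i.e.
# page_latest has no matching revision row. Column names follow the
# MediaWiki core schema; pass in any DB-API connection to your wiki DB.
ORPHANED_PAGES_SQL = """
SELECT page_id, page_title
FROM page
LEFT JOIN revision ON page_latest = rev_id
WHERE rev_id IS NULL
"""

def find_orphaned_pages(conn):
    """Return (page_id, page_title) rows whose latest revision is missing."""
    cur = conn.cursor()
    cur.execute(ORPHANED_PAGES_SQL)
    return cur.fetchall()
```

If this returns rows, the page table survived the upgrade but the revision side did not, which would match Sarayan's symptoms of visible titles with empty contents.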