[00:21:17] What's the best way to add text that readers can see on all pages? I know of SiteNotice, but users can close it. I'm aware of the possibility of editing skin code, but that needs shell access.
[00:22:00] Editing the sidebar is one possibility, but I'd like to add text above the page contents and title.
[02:00:19] Hey, can anyone help me with my problem with VisualEditor?
[02:00:24] I get "Unknown error, HTTP status 500" when saving or creating a page.
[02:04:51] Hello?
[02:15:08] Yo, is anyone there?
[02:32:28] Jo__: Yes, I just came back from a meal.
[02:33:02] Jo__: Please pastebin all the files you have in /var/log/nginx*
[02:46:43] Hey, I'm having a bigger issue here now. I'll let you know.
[02:51:38] !ask | Jo__
[02:51:38] Jo__: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[04:02:52] Jo__: What's the issue?
[09:05:44] I don't actually know how to use this platform. I just have a question regarding a page I would like to post on the wiki but don't know who to turn to. Sorry for bothering you, guys. I appreciate any help. :)
[13:34:34] Hi, I need some help creating a new special page. I'm not sure where to extend the autoloader classes. Any ideas?
[15:52:59] Hiya
[16:14:14] OliverUK: Hi.
[16:27:35] I have a bit of a weird problem with MediaWiki. I have copied the database, images, etc. to another machine, and the wiki works great as a backup.
[16:28:26] The problem is that, from the point I installed the backup, for any page I edit on the first installation the links on that page no longer work on the backup.
[16:29:50] So I can browse a page on the backup and all of the links work, no problem. Then I edit the same page on the original, make no changes, and save it. Once I run the script that copies it all to the backup location, none of the links on that page work on the backup install.
[16:31:17] The links before the save are like this: http://backupserver/mediawiki/index.php?title=Page_Title, then after I save and re-backup they are like this: http://backupserver/w/Page_Title
[16:47:09] Hey guys, I need some help. I'm having some problems exporting articles to PDF using mwlib. I have followed the MediaWiki guide to enabling PDF export, but it doesn't seem to work.
[17:04:27] mistawright: MediaWiki version? What is "mwlib to pdf"? Which guide (link please)?
[17:11:35] andre__, https://www.mediawiki.org/wiki/Extension:Pdf_Export#MWLib
[17:13:14] Aha, OK, so I have completely removed the backup wiki installation, reinstalled it, and re-copied the DB, images, etc. across, and all of the links work again.
[17:14:21] When I save a page on the original again and re-backup, the links break again. There must be a file in the wiki dir that maybe has a date on it or something for the install; any pages that are updated past that date have broken links.
[17:15:12] Any ideas, people? :-)
[17:15:46] andre__, I meant to say I was trying to export to PDF using mwlib.
[17:17:05] mistawright: Sigh, outdated docs.
[17:17:19] MWLib is NOT used on Wikipedia to generate PDFs... anymore.
[17:17:22] Thanks for spotting that.
[17:17:58] mistawright, which MediaWiki version?
[17:26:19] andre__, how can I check this? It's recent; it's using PHP 7.1 FPM.
[17:26:48] Sorry for the late reply; I was watching YouTube and in said server at the same time.
[17:27:04] mistawright: Special:Version
[17:27:50] Also note that Extension:Pdf_Export says its last release was six years ago. (I'd be surprised if it just worked without any issues.)
[17:29:05] andre__, 1.27.3. Hmmm, I was hoping it would have mentioned the MediaWiki version, and it didn't from what I saw.
[17:30:00] Otherwise I would have looked for a more recent solution. What do you suggest that can be used to export to PDF, ODF, or a Word doc?
[17:42:52] andre__, any suggestions?
[17:53:18] mistawright: The Collection extension, perhaps.
[17:53:41] https://www.mediawiki.org/wiki/Extension:Collection
[17:54:33] wmat, PDF is a story that still confuses me because of so many changes and names. The Collection extension is my best bet, yeah. And OCG is dead. And the latest stuff (Electron and Photon) are service-based backends, as far as I remember.
[17:55:26] Indeed, but it's a feature that enterprise users typically require.
[17:56:05] I find Collection a little unwieldy, tbh.
[17:56:11] The Electron stuff acts as a backend for Collection (sometimes, depending on config).
[17:56:39] So you still use Collection even if you are using OCG or Electron (never heard of Photon).
[17:57:02] Electron, of course, is basically just starting up Chrome and hitting print,
[17:57:14] which raises the question of why the user can't just hit print...
[17:57:28] Any ideas on my breaking-links problem? :-)
[17:58:11] bawolff: Collection lets you build a book from many, often unconnected, wiki pages all at once and download and save the PDF.
[17:58:23] Oh right, forgot about that.
[17:58:55] OliverUK: So $wgArticlePath changed between your new and old server?
[17:59:12] OliverUK: In theory, you should be able to kill all the cache by just making a dummy edit to LocalSettings.php.
[17:59:49] bawolff: So it's the edit date of LocalSettings.php that is screwing it up?
[18:00:17] [just a second, I skimmed your original messages. I'm re-reading them]
[18:00:21] bawolff: You are correct, it has changed. I'm not fussed about it on the backup system, as it is just running on a NAS enclosure.
[18:00:48] Oh, I see, you have them both running at the same time
[18:00:57] and you want links to work correctly on both.
[18:01:16] Yeah, on separate systems.
[18:01:33] The easiest way would be to have $wgArticlePath and $wgScriptPath be the same on both (if possible). $wgServer can be different, but you should set $wgCanonicalServer to the main server.
[18:01:45] If you really need the different $wgArticlePath / $wgScriptPath,
[18:02:07] The links do work fine after the installation, but when a page is updated on the real wiki and then copied over, the links break.
[18:02:13] You think this may be the cache though?
[18:02:14] you can set https://www.mediawiki.org/wiki/Manual:$wgRenderHashAppend to be something different between the two servers.
[18:02:22] Yes, this is the parser cache.
[18:02:49] Or I suppose you could set $wgParserCacheType = CACHE_NONE; on the backup server, and then it will only cache things on the main server.
[18:02:54] The LocalSettings.php file is not copied over, only the DB (in full), the images folder, and the logo file.
[18:03:34] $wgRenderHashAppend is the solution that Wikimedia used to use when they operated secure.wikimedia.org as an alternative way to access the site, and had the same issue with different article paths on different servers.
[18:04:52] Which would you suggest as the better solution?
[18:05:09] $wgRenderHashAppend or $wgParserCacheType?
[18:05:17] Or both, haha.
[18:09:04] If you don't expect anyone to really view the backup, I would say $wgParserCacheType = CACHE_NONE; (on only the backup server), as that will save the space used to store all the cached entries.
[18:09:51] bawolff: Thank you so much for your help. :-)
[18:09:55] Have a great day.
[18:10:07] You too.
[18:25:51] wmat, have you used the PediaPress render server before? I seem to be stuck on that portion of the Collection extension's PDF generation.
[19:26:22] Is MediaWiki still working with the mw render server? The default server doesn't appear to be working for me.
[19:36:12] mistawright: What is "mw render server"?
[19:36:29] (but in general, I'd expect documentation to be outdated)
[19:48:08] There was supposed to be a default server that Collection uses, and it doesn't work. It's a public render server.
[19:48:44] mistawright: I can't remember how I set it up, although I know it's all local.
[19:52:03] mistawright: Also, when the server goes down we have to restart the render server.
[19:52:15] * wmat makes a note to document this stuff
[19:52:37] Did you use Node with that?
[19:52:43] wmat,
[19:54:08] mistawright: Yeah, I believe so.
[19:57:02] mistawright: Tbh, getting it working was a pain.
[19:57:36] I'm scared to upgrade MW now.
[19:58:06] wmat: Documenting things! That's crazy talk.
[19:58:12] heh
[21:34:12] https://blog.readthedocs.com/read-the-docs-2017-stats/
[23:18:31] There's that long-lost Reedy.
[23:44:03] If I want the equivalent of MediaWiki:Gadgets-definition for each Wikimedia wiki, do I have some way of getting it that isn't doing action=query for each wiki separately?
[23:44:43] harej: No
[23:45:25] Alright, I suppose I'll make 910 API requests then.
[23:45:32] Enjoy.
[23:45:49] I'd suggest a for loop instead of web browser tabs.
[23:47:03] harej: There's a sitematrix API module if you want an input list for iteration.
[23:47:12] Ooh.
[23:48:35] It gives fully resolved URLs for each wiki. I like this.
[23:53:51] I wonder how many of these miscellaneous wikis will ever be closed.
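
A minimal sketch of how the backup-server advice above (the 18:01–18:09 exchange) might look when appended to the backup's existing LocalSettings.php. The hostnames are placeholders, and only one of the two cache options is needed; $wgParserCacheType = CACHE_NONE; was the one recommended for a backup that is rarely viewed.

    // Appended to LocalSettings.php on the backup server only (sketch; hostnames are examples).
    $wgServer          = "http://backupserver";        // this copy's own URL
    $wgCanonicalServer = "http://wiki.example.org";    // canonical URLs point at the main wiki

    // Option 1: skip the parser cache entirely on the backup, so HTML cached with the
    // main wiki's article path is never reused here.
    $wgParserCacheType = CACHE_NONE;

    // Option 2 (alternative): keep caching, but key it differently per server so the
    // two installs never share rendered output.
    // $wgRenderHashAppend = "backup";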
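
And for the Gadgets-definition question at the end, a rough sketch of the suggested approach: pull the wiki list from the sitematrix API module, then run one action=query request per wiki for MediaWiki:Gadgets-definition. PHP, to match the example above; the /w/api.php path, the rvslots/formatversion parameters, and the response shapes assume current public Wikimedia wikis, and beyond a small sleep there is no retry, maxlag, or rate-limit handling.

    <?php
    // Sketch: fetch MediaWiki:Gadgets-definition from every public Wikimedia wiki.

    function fetchJson( $url ) {
        $body = file_get_contents( $url );
        return $body === false ? null : json_decode( $body, true );
    }

    // 1. Get the list of wikis (with fully resolved URLs) from the sitematrix module.
    $matrix = fetchJson( 'https://meta.wikimedia.org/w/api.php?action=sitematrix&format=json' );

    $wikis = [];
    foreach ( $matrix['sitematrix'] as $key => $group ) {
        if ( $key === 'count' ) {
            continue;
        }
        // Language groups keep their wikis under 'site'; the 'specials' group is a flat list.
        $sites = ( $key === 'specials' ) ? $group : ( $group['site'] ?? [] );
        foreach ( $sites as $site ) {
            if ( !isset( $site['closed'] ) && !isset( $site['private'] ) ) {
                $wikis[ $site['dbname'] ] = $site['url'];
            }
        }
    }

    // 2. One action=query request per wiki for the page content.
    $definitions = [];
    foreach ( $wikis as $dbname => $url ) {
        $api = $url . '/w/api.php?action=query&prop=revisions&rvprop=content&rvslots=main'
             . '&titles=MediaWiki:Gadgets-definition&format=json&formatversion=2';
        $data = fetchJson( $api );
        $page = $data['query']['pages'][0] ?? null;
        if ( $page !== null && !isset( $page['missing'] ) ) {
            $definitions[ $dbname ] = $page['revisions'][0]['slots']['main']['content'];
        }
        usleep( 200000 ); // be polite: roughly five requests per second
    }

    echo count( $definitions ) . " wikis have a Gadgets-definition page\n";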