[02:50:32] Is there an easy way to download a site created using mediawiki for offline viewing? I could use wget but I'm afraid that would strain the server's resources
[04:19:00] Is there a generic MediaWiki app that can hook into any MediaWiki installation (probably via /api.php)?
[11:45:19] Is there an easy way to download a site created using mediawiki for offline viewing? (apart from wget)
[11:51:37] jrmu: not really. you can get the source wikitext of pages using Special:Export, but depending on the site that might not be easy to read.
[11:52:24] MatmaRex: ok, thanks. I know wikipedia provides dumps, but I guess that is a special service they offer. For others, I probably just have to use wget
[11:52:44] jrmu: dumps are in the same format as Special:Export output
[11:53:25] MatmaRex: ah. Can I get the entire site in one dump, or do I have to go to Special:Export for each and every page?
[11:55:20] jrmu: i guess you can copy-paste the list of pages from Special:AllPages into the text box on Special:Export ;)
[11:55:40] there's some config option that adds a checkbox "Export all pages", but i think it's off by default
[11:55:55] (don't ask me why)
[11:56:10] ($wgExportAllowAll)
[11:59:31] MatmaRex: Ah, thank you, that may come in handy
[20:40:01] Is there configuration documentation for RESTBase that is not the "this is how to set up a translation server" documentation? I have not found any.
[20:41:03] What're you trying to do?
[20:41:37] Set up RESTBase to sit in front of Parsoid, replacing the deprecated Extension:Parsoid that did that job.
[20:42:13] I have RESTBase up and running, configured the base config.example.yaml => config.yaml. (Correct URLs and all that.) Every endpoint is 404 Not Found though.
[20:42:26] Of course, there is nothing that actually lists what endpoints are available for testing.
[20:45:20] Ohhhh... Google prevails. Accidentally. https://www.mediawiki.org/wiki/Parsoid/Setup/RESTBase
[20:50:15] Oh, that is based on a really old version of RESTBase.
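
For the 04:19 question about a generic client: every MediaWiki installation exposes the same Action API at /api.php, so one client can talk to any wiki. A minimal sketch in Python, assuming the requests library and a placeholder wiki URL:

    import requests

    API_URL = "https://example.org/w/api.php"  # placeholder; point at any wiki's api.php

    # Ask the Action API to render a page to HTML.
    params = {"action": "parse", "page": "Main Page", "format": "json"}
    resp = requests.get(API_URL, params=params, timeout=30)
    resp.raise_for_status()
    html = resp.json()["parse"]["text"]["*"]
    print(html[:200])

The same endpoint can also list pages, fetch raw wikitext, and search, which is why a generic app only needs to be pointed at a wiki's api.php URL.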
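
As discussed around 11:51-11:56, Special:Export returns page wikitext in the same XML format as the full dumps. A sketch of exporting a list of titles in one request; the wiki URL and titles are placeholders, and if $wgExportAllowAll is enabled, the "Export all pages" checkbox on the form does this without any title list:

    import requests

    EXPORT_URL = "https://example.org/wiki/Special:Export"  # placeholder wiki

    # Titles can be copy-pasted from Special:AllPages, as suggested above.
    titles = ["Main Page", "Help:Contents"]

    # Special:Export takes a newline-separated title list and returns an XML
    # dump; curonly=1 limits it to the current revision of each page.
    data = {"pages": "\n".join(titles), "curonly": "1"}
    resp = requests.post(EXPORT_URL, data=data, timeout=60)
    resp.raise_for_status()
    with open("export.xml", "wb") as f:
        f.write(resp.content)

Fetching pages in batches like this, with a pause between requests, also keeps the load well below a full wget mirror, which speaks to the 02:50 concern about straining the server.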
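
For the RESTBase troubleshooting at 20:42, one quick check is to request a page through the Parsoid-backed HTML route and see whether the 404s come from a domain mismatch. This sketch assumes RESTBase's usual default port 7231 and a domain segment of "localhost"; both must actually match what is configured in config.yaml, so treat them as assumptions:

    import requests

    # Assumed local RESTBase: default listen port 7231, domain "localhost"
    # (the domain segment must match the domain configured in config.yaml).
    RESTBASE = "http://localhost:7231/localhost"

    # RESTBase proxies this route to Parsoid and returns rendered HTML.
    resp = requests.get(RESTBASE + "/v1/page/html/Main_Page", timeout=30)
    print(resp.status_code)  # a 404 here often means the domain segment is wrong
    print(resp.text[:200])

A domain that does not match config.yaml is a common reason every route returns 404 even though the service itself is up.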