[00:56:21] i think Extension:Cargo is more actively developed than SemanticMediaWiki at this moment
[00:56:33] so i'd put a vote in for Extension:Cargo
[05:09:21] hi all, i was wondering if anyone is in christchurch who can tell me the digital frequency for the Go Bus and Red Bus, thanks
[12:24:47] 11
[12:26:21] Hi, I need to get all interlanguage links for a bunch of titles (text page_title, not Title), but resolving redirects before doing so. My current approach is to use LinkBatch to check existence, then use that to separate the redirects. After that, I'm in the dark; the only redirect resolvers I could find are in ApiPageSet and in SMW's DeepRedirectTargetResolver, and I was told if I use an API module inside regular code,
[12:56:56] FreedomFighterSp: Your query cut off after "...regular code."
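One route that avoids calling an API module from inside regular code is to go through the web API itself: action=query accepts a redirects flag that resolves redirects server-side before prop=langlinks fetches the interlanguage links, so both steps collapse into one request. A minimal sketch in bash, assuming jq is installed; the wiki URL and the title batch are placeholders:

```bash
#!/usr/bin/env bash
# Sketch: interlanguage links for a batch of titles, with redirects
# resolved first by the API ("redirects" flag). Placeholders: the wiki
# URL and the pipe-separated title list.
titles="Main_Page|Some_Redirect"
curl -s -G "https://en.wikipedia.org/w/api.php" \
  --data-urlencode "action=query" \
  --data-urlencode "format=json" \
  --data-urlencode "redirects=1" \
  --data-urlencode "prop=langlinks" \
  --data-urlencode "lllimit=max" \
  --data-urlencode "titles=$titles" |
# .query.redirects shows which inputs were redirects and where they went;
# .query.pages holds the langlinks for the resolved targets.
jq '{redirects: .query.redirects,
    pages: [.query.pages[] | {title, langlinks}]}'
```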
[12:59:23] I have a question for the channel too - what's the best way to know the last date a given user was active on any wiki? i.e. what's the most recent time a user has made an edit on any Wikimedia project? So far the API has not proven helpful.
[13:20:19] Should I upgrade my 2 MediaWikis 1.29.1 directly to 1.30.0 or to 1.29.2? .. not much point in the latter if the procedure to upgrade is the same
[13:20:53] Niharika: check their contribs on each project; the latest contrib out of all of them is the last time they were active
[13:26:14] Zppix: I was hoping there's an easier way. There are 800+ projects.
[13:26:58] jubo2: I can't comment on the upgrade procedure (should be same) but upgrading to the latest version is always ideal.
[13:27:24] Niharika: i wonder if centralauth has an api for that?
[13:27:40] Zppix: Not that I could find.
[13:29:30] Niharika: i wonder if the mw team would have any ideas
[13:30:21] I am already awaiting their response. Thanks anyway.
[13:30:45] No problem
[13:35:51] Niharika: but it's not always wise to go with the latest version. MediaWiki software engineering seems really top notch, though
[13:36:23] jubo2: If you do face problems with 1.30.0, you can always switch to 1.29.2.
[13:38:35] I made and verified full backups of all databases and of /var/www so I can revert
[13:39:34] Niharika: (assuming this is about wm, not some other wikifarm) The guc-tool can sort by most recent, but it's web-only. They seem to have a sane class-separation of "get the data" and "make the web output" though, so it might not be that hard to get the data into the (already present) api.php of said tool. If you don't find anything, you might want to extend said api and use that:
[13:39:36] https://github.com/wikimedia/labs-tools-guc
[13:40:11] eddiegp: Yeah, I was just looking into that. Thanks!
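For the last-activity question, a brute-force sketch of Zppix's per-project suggestion: enumerate the wikis from meta's sitematrix and ask each one for the user's single most recent contribution (list=usercontribs returns newest first). Assumes jq is installed; the account name is a placeholder, and with 800+ wikis this really does mean 800+ requests:

```bash
#!/usr/bin/env bash
# Sketch: most recent edit by one user across all Wikimedia wikis.
# "Example_User" is a placeholder; closed/private wikis simply yield
# no timestamp and are skipped.
user="Example_User"
for wiki in $(curl -s 'https://meta.wikimedia.org/w/api.php?action=sitematrix&format=json' |
              jq -r '.. | .url? // empty'); do   # pull every site URL out of the sitematrix
  ts=$(curl -s -G "$wiki/w/api.php" \
         --data-urlencode 'action=query' \
         --data-urlencode 'format=json' \
         --data-urlencode 'list=usercontribs' \
         --data-urlencode 'uclimit=1' \
         --data-urlencode 'ucprop=timestamp' \
         --data-urlencode "ucuser=$user" 2>/dev/null |
       jq -r '.query.usercontribs[0].timestamp // empty' 2>/dev/null)
  [ -n "$ts" ] && echo "$ts $wiki"
done | sort -r | head -1   # ISO 8601 timestamps sort lexically, so newest wins
```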
[13:41:25] cscott: you recommended Extension:Cargo over SMW earlier, but that was simply on a question of structured input; do you know perhaps if it offers the same internationalization/translation capabilities?
[13:41:58] djr013: What do the docs say?
[13:42:02] !e Cargo
[13:42:02] https://www.mediawiki.org/wiki/Extension:Cargo
[13:43:19] djr013: It uses TranslateWiki for translations but you can use pretty much anything, it seems. It's structured in the same way as other MW extensions.
[13:43:29] https://github.com/wikimedia/mediawiki-extensions-Cargo/tree/master/i18n
[13:47:25] I meant translations of the content
[13:48:55] Admittedly it's been a while since I've reviewed and worked with these extensions; perhaps I should just try again
[13:51:34] https://www.semantic-mediawiki.org/wiki/Content,_page_and_user_language
[13:51:39] I'm getting ready to run 'php update.php' on wiki #1 (the one of minor significance first). This time I'm actually taking notes on the correct commands to go about this
[13:52:35] running it now. done in 25 s
[13:54:04] 'sudo service apache2 graceful' now?
[13:54:24] I'll try reloading the wiki with the browser
[13:54:32] won't work atm is my guess
[13:55:31] nope. Special:Version says the version is 1.30.0 without reloading the webserver
[14:01:07] I move to upgrade the more important wiki now, armed with my list of commands to give
[14:07:36] djr013: what kind of translation support do you need for your content?
[14:27:48] Nikerabbit: It's for a fully multilingual wiki; content defaults to a user's language(s) if such a translation is available, and it enables contribution of missing or incomplete translations.
[14:29:56] yeah.. all went smoothly. Big thank you to the devels
[14:32:58] And it should work just as well for file pages, category pages, and page titles.
[14:33:05] what is the page to alter the site-wide header thing?
[14:33:20] I'm trying to find it via the mediawiki.org search but I can't find it
[14:33:39] jubo2: sounds like a theme thing
[14:33:41] Site-wide message?
[14:34:03] djr013: No. I mean a textual message on every page of the wiki
[14:34:21] the footer?
[14:34:23] I can edit some page to remove or alter the message but I don't recall what it is called
[14:36:53] djr013: there isn't a complete solution. Especially page titles are difficult, because even though you can customize them they are not reflected in any listings.
[14:38:00] djr013: but you need at least ULS, and possibly {{int}} or some other solution for the content, and the Translate extension to provide the translation interface
[14:39:31] I recall SMW helping with page titles, but that might have simply been that the translated page names didn't have to be subpages, but could have any (URL) name.
[14:39:45] And it would just associate the pages with each other as translations
[15:01:15] djr013: I don't. I haven't actually used it; my recommendation is just from talking to its developer at last year's dev summit.
[15:02:25] ah
[20:59:27] Hello, I ended up creating this site to take part in a photography contest
[21:00:28] and today a company is using a photo of my work; I'm already getting my lawyers involved
[21:50:34] Hey guys, I'm trying to automatically export certain pages within one category as PDFs; can you guys think of a clever way to do this?
[21:50:51] Automatically?
[21:51:01] Reedy, cron probably
[21:51:05] like weekly or so
[21:51:36] I haven't seen that any of the PDF extensions have any "maintenance scripts" for doing it...
[21:51:57] I'm thinking about curling the pages and then dealing with them in my own way, but I'm not sure how to get all those links
[21:52:08] I have MySQL access too though
[21:53:11] The API will give you category members
[21:55:52] Reedy, think I might have found something https://www.mediawiki.org/wiki/Extension:PdfBook
[21:56:21] https://github.com/OrganicDesign/extensions/tree/master/MediaWiki/PdfBook.git is 404
[21:56:36] Seems they've moved it
[21:56:37] https://github.com/OrganicDesign/extensions/tree/master/MediaWiki/PdfBook
[21:57:05] Oh. No
[21:57:07] The template is stupid
[21:57:50] no, it's not :)
[21:58:05] Well, GIGO
[21:58:10] But my point still stands
[21:58:12] flying_sausages: That might work, yeah
[21:58:27] https://github.com/OrganicDesign/extensions/blob/master/MediaWiki/PdfBook/PdfBook.hooks.php#L226
[21:58:33] You might be able to build a URL...
[22:30:10] this tool does exactly what I need it to do, ha
[22:30:36] Finally I can use the wiki to make documentation that adheres to our backup policy
[22:32:00] Your backup policy is "it must be in PDF"? :P
[22:32:23] it's more like "it can't be an SQL dump if someone needs to quickly read something up"
[22:32:37] Pfft :)
[22:33:08] Also, this way we can Ctrl+F things even if the webserver is offline
[22:33:20] heh
[22:33:30] If the wiki is where I keep tabs on all my VMs, this becomes handy when all the VMs die
[22:33:44] one of which hosts the wiki
[23:14:35] flying_sausages: headless chromium + categorymembers API call + for loop in bash?
[23:14:54] probably the least effort
[23:15:44] tgr, curl to file using the plugin above gives me exactly what I need, but cheers :)
[23:16:36] yeah, but you end up with crappy PHP PDF generation
[23:17:01] the browser's "save as PDF" tends to be far superior
[23:17:46] tgr: to someone viewing it in Adobe, what would the difference be?
[23:19:21] looks uglier? not sure. it's not PHP-based actually; they use a command-line tool called htmldoc, no idea what that is
[23:20:19] "However, it does not support many things in “the modern web”, such as: ...CSS, tables, Unicode..."
[23:20:22] so that, probably
[23:20:27] https://michaelrsweet.github.io/htmldoc/index.html
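A sketch of tgr's suggestion as stated — categorymembers API call, headless Chromium, for loop in bash — which is cron-able as-is. The wiki URL and category are placeholders; it assumes chromium and jq are on PATH, and continuation paging (cmcontinue) is left out for brevity:

```bash
#!/usr/bin/env bash
# Sketch: export every page in a category to PDF via Chromium's
# built-in "save as PDF". Wiki URL and category name are placeholders.
wiki="https://wiki.example.org"
category="Category:Documentation"
mkdir -p pdf
curl -s -G "$wiki/w/api.php" \
  --data-urlencode 'action=query' \
  --data-urlencode 'format=json' \
  --data-urlencode 'list=categorymembers' \
  --data-urlencode 'cmlimit=max' \
  --data-urlencode "cmtitle=$category" |
jq -r '.query.categorymembers[].title' |
while IFS= read -r title; do
  # printable=yes strips the MediaWiki skin; titles with special
  # characters beyond spaces may still need proper URL-encoding.
  chromium --headless --disable-gpu \
    --print-to-pdf="pdf/${title//\//_}.pdf" \
    "$wiki/index.php?title=${title// /_}&printable=yes"
done
```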