[00:06:11] i just found: wiki/index.php/Special:Export/Gas_Cap
[00:07:04] although action raw is good i may have to do that
[00:07:14] thx
[00:50:22] so if i export raw from mediawiki, it contains the wiki formatting. is there a plugin that translates that to html? or does it have an export to html that only contains the body text?
[00:53:18] *cough*
[00:53:31] Louder.
[00:53:32] IPFS does some interesting things on the network level
[00:53:42] oops
[00:53:51] Sorry, wrong channel.
[00:54:05] * Lord-Simon takes the cough back
[01:02:44] snowkidind: you can try ?action=render
[01:03:10] yup i just concluded that as well
[13:58:56] Hi. I was going through the vagrant install on a laptop and it's been stuck on "Bringing machine 'default' up with 'virtualbox' provider..." for an hour with no further updates on the terminal.
[14:01:18] bd808: ^ (in case you're online already :)
[14:21:16] I destroyed it. Running another vagrant up now...
[14:40:12] Good morning everyone, what's up with mediawiki.org's certificate?
[14:40:20] I'm getting an error when I try to access the Downloads page
[14:42:20] Ulfr: Only in Firefox as far as I can tell
[14:42:24] Every site on the cluster is the same way
[14:42:30] -operations not responding
[14:46:56] marktraceur: Forgot to sacrifice a chicken to the server gods, I see
[14:47:04] Thanks, I'll try that evil browser
[14:52:58] hey, whilst I'm here it doesn't hurt to ask, is there any way to adjust how long pywikibot sleeps between API actions?
[14:54:33] wrong chan... but -putthrottle should work for that
[14:56:51] Unless you mean *read* API queries
[14:58:53] No, I mean I need to protect all pages in a given category, and I can't leave my computer running all night whilst I'm at work, so I wanted to wait half or a third as long
[14:59:09] protecting doesn't require that much oomph, and waiting 10 seconds between pages when there's a couple thousand makes things take time
[14:59:10] I haven't seen a setting for read actions
[15:01:19] wondering if there isn't some sort of --gottagofast option I can plug in so I don't have to reconfigure this bot to run on my home pc
[15:02:42] -putthrottle is a command line parameter, just use it or not when you want, nothing to reconfigure
[15:35:45] i'm trying to use mobilefrontend with varnish, but all the requests to load.php have a &version=2015.... parameter which is messing with the caching
[15:36:28] it's changing on every request, even though nothing is really changing
[15:37:10] o/
[15:39:46] i see a version key on en.m.wikipedia.org requests, but that one is some kind of hash and doesn't change
[15:40:28] is this the right channel to ask, or is there a mobile channel i can try? or should i try the -dev channel?
[15:47:45] hi UserProd
[15:47:56] Hi MatmaRex
[15:48:07] UserProd: if it's MobileFrontend-specific, try #wikimedia-mobile
[15:48:12] thnx
[15:48:59] UserProd: #wikimedia-dev is mostly the same people as here (but fewer), and it has a lot of noise, so here is probably better than there
[15:49:05] i don't know how to help with your issue, sorry :(
[15:49:35] np, thanks for the channel name
[15:54:33] I'M VERY NOISY
[16:15:40] *snicker* found a funny typo on: https://www.mediawiki.org/wiki/Extension:MobileFrontend
[16:15:57] $wgMFSpecialCaseMainPage promises special massaging for html, wonder if there's a groupon for that kind of thing
[16:30:28] Niharika: did you get your mw-vagrant pain sorted out?
[16:49:47] bd808: A restart of the laptop sorted most things - but I'm yet to ssh in and check if all's well.
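A quick illustration of the ?action=render suggestion from the export discussion above (00:50-01:03): requesting a page with action=render returns just the parsed body HTML, without the skin. This is only a sketch; the wiki base URL and page title below are placeholders, not anything confirmed in the log.

    # Sketch: fetch only the rendered body HTML of a page using ?action=render.
    # The wiki base URL and page title are placeholders.
    import requests

    WIKI_INDEX = "https://wiki.example.org/index.php"  # placeholder

    def fetch_rendered_html(title):
        # action=render returns the parsed page body HTML, with no skin/chrome around it
        resp = requests.get(WIKI_INDEX, params={"title": title, "action": "render"})
        resp.raise_for_status()
        return resp.text

    print(fetch_rendered_html("Gas_Cap")[:500])  # e.g. the page exported earlier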
[17:13:38] Hi, I logged into my wiki today and all the content is missing. The database is still there... what could have happened?
[17:15:09] how is it missing?
[17:15:57] it looks like a fresh install
[17:16:02] 'main page' is blank
[17:16:14] there are no links, etc
[17:16:35] the history still exists
[17:17:12] how blank?
[17:17:17] the whole page is blank?
[17:17:21] The content is missing..
[17:17:29] or is the interface there, but the content empty?
[17:17:34] right
[17:17:36] or does mediawiki tell you the page does not exist?
[17:18:19] SimbaLion: can you try editing any page that suffers from this problem, and check if that makes the content appear?
[17:18:41] ah hah, edit shows the content
[17:18:42] SimbaLion: if yes, i'd guess something weird happened to the parser cache
[17:18:57] although that's interesting, no idea what could cause this
[17:19:06] but preview is blank
[17:19:22] easy to fix?
[17:19:22] ah, and what you save, is it still blank?
[17:19:29] I haven't clicked save, I'm afraid to
[17:19:31] when you save*
[17:19:39] heh, okay
[17:20:44] SimbaLion: we've had this kind of issue in the past when your PHP installation uses a newish version of PCRE, and you use an old-ish version of MediaWiki. PCRE changed something in a minor release that broke stuff
[17:21:46] I did just upgrade PHP to 5.6
[17:21:49] assuming that somebody upgraded your PCRE or PHP version recently, you could try upgrading MediaWiki and see if it fixes the problem. (or just applying the patch for this issue, let me find it)
[17:22:38] Hey, I am a newbie and I need some help installing git-review, please help me.
[17:22:58] I will try updating, it probably needs an update
[17:23:02] SimbaLion: the issue was fixed in MW 1.23, the patch is https://phabricator.wikimedia.org/rMWb9f291e8cd5bb1450f7b1031aa17cf7775aa7e96 (should also work with older MediaWikis)
[17:23:07] I haven't upgraded mediawiki since 2013 I think
[17:23:31] $ git review -s gives me this error: "Cannot determine where .git directory is. Not a git repository (or any of the parent directories)"
[17:24:18] nisha: looks like you're not running `git review -s` in a directory that contains a git repository.
[17:24:46] MatmaRex: Thank you :)
[17:28:40] Ohh thanks MatmaRex, lemme try again
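As an aside on the blank-page issue above: one way to test the stale-parser-cache theory is to purge an affected page so MediaWiki re-renders it, along the same lines as the "try editing a page" check suggested in the log. Below is a minimal sketch using the API's action=purge; the API URL and page title are placeholders, and a purge only forces a re-parse - the actual fix for the PCRE/MediaWiki mismatch is the upgrade or patch linked above.

    # Sketch: purge a page through the MediaWiki API so its parser cache entry
    # is rebuilt. This only helps diagnose stale-cache symptoms; it does not fix
    # the PCRE/MediaWiki incompatibility itself. URL and title are placeholders.
    import requests

    API = "https://wiki.example.org/api.php"  # placeholder

    def purge(title):
        # action=purge is sent as a POST request
        resp = requests.post(API, data={
            "action": "purge",
            "titles": title,
            "format": "json",
        })
        resp.raise_for_status()
        return resp.json()

    print(purge("Main Page"))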
[17:55:26] Hey, I was trying to install MediaWiki through Vagrant and I'm facing issues while running 'vagrant provision': it fails with an error saying 'dpkg was interrupted'. I entered 'vagrant ssh' and ran 'sudo dpkg --configure -a', but when I executed 'vagrant provision' again it showed an error that composer.phar is missing. I tried to resolve that but couldn't. Can anyone help?
[18:03:35] adisha: where did you get your vagrant setup for mediawiki?
[18:04:45] I got the Vagrant setup for MediaWiki from this link: "https://www.mediawiki.org/wiki/MediaWiki-Vagrant"
[18:07:59] ok, and do you have more context for the composer.phar error?
[18:10:01] I read a post that said to create a file 'composer.json' in the project root directory. i created the directory
[18:10:09] I created the file
[18:10:38] But the error was still not resolved
[18:11:05] no, that's definitely not the problem
[18:11:09] adisha: something went wrong with the original setup of your vagrant. probably a connection dropped or something at a critical point
[18:11:58] Yes, I thought of that too, but when I run 'vagrant up' my system sometimes hangs
[18:12:17] So I tried to reinstall a few times
[18:13:11] first do a git pull in the vagrant dir, then ./setup.sh and then vagrant git-update
[18:13:58] ok I will retry it
[18:14:02] and after all that has finished and isn't reporting errors, then run vagrant provision
[18:17:18] Thanks @thedj I will try this.
[20:10:36] Hello all, looking for some guidance: we have a mediawiki install in place that had links pointing to a file share that has moved. What would be the best way to find and update all these links? Ideas?
[20:11:27] Special:LinkSearch can probably find those pages
[20:11:56] Oh ok, cool, I will give that a shot, much appreciated
[20:12:40] and maybe [[Extension:Replace Text]] can be used to do a mass search & replace
[20:12:43] @link
[20:12:43] https://www.mediawiki.org/wiki/Extension:Replace_Text
[20:15:44] Ok, that might just be perfect, thanks Vulpix
[22:00:40] asd
[22:00:58] asdf
[22:01:03] nice
[22:04:33] qwerty
[23:25:28] <[1]hypergrove> hi what's the channel for mediawiki
[23:25:37] <[1]hypergrove> s's translate bundle?
[23:27:18] <[1]hypergrove> oh yes -i18n
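On the moved-file-share question above (20:10): Special:LinkSearch has an API counterpart, list=exturlusage, which can be scripted to list every page still pointing at the old share before doing the mass replace with Replace Text. A minimal sketch follows; the API URL and the old link pattern are placeholders, not values from the log.

    # Sketch: list pages whose external links still point at the old file share,
    # using the API equivalent of Special:LinkSearch (list=exturlusage).
    # The API URL and the old share pattern below are placeholders.
    import requests

    API = "https://wiki.example.org/api.php"       # placeholder
    OLD_SHARE = "*.old-fileserver.example.com"     # placeholder, LinkSearch-style pattern

    def pages_linking_to(pattern):
        params = {
            "action": "query",
            "list": "exturlusage",
            "euquery": pattern,
            "euprop": "title|url",
            "eulimit": "500",
            "format": "json",
            "continue": "",
        }
        titles = set()
        while True:
            data = requests.get(API, params=params).json()
            for item in data["query"]["exturlusage"]:
                titles.add(item["title"])
            if "continue" not in data:
                break
            params.update(data["continue"])  # follow the API's continuation cursor
        return sorted(titles)

    for title in pages_linking_to(OLD_SHARE):
        print(title)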