[00:21:28] Just trying to implement short URLs with nginx: https://www.mediawiki.org/wiki/Manual:Short_URL/Nginx
[00:21:34] seems to be full of many errors
[00:21:36] ...
[00:23:08] anyone ever tried this before publishing?
[00:23:30] No, we publish docs that no one has tested
[00:24:27] i see
[00:29:20] Is there a complete list of URLs to php files used by mediawiki both for setup and daily usage?
[00:29:42] since I would have to build the config myself it seems
[00:34:34] first error is on line 51: rewrite ... ; kicks nginx out of that block, so all the settings in that location are irrelevant
[00:34:57] missing "break;" at the end of the rewrite rule
[00:39:41] http://nginx.org/en/docs/http/ngx_http_rewrite_module.html
[00:40:48] next error is on line 44: that SCRIPT_FILENAME will lead to "File not found" as it is constructed wrong
[00:41:36] should be "fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;" according to https://www.nginx.com/resources/wiki/start/topics/examples/phpfcgi/
[00:41:44] hi, anyone tried to export a cargo db? i'm exporting it with utf8 encoding and getting nothing understandable in the page titles (my wiki is not an english one)
[01:25:12] anyone willing to help me with an ImageMagick issue? I have ImageMagick and Imagick installed on my CentOS/RHEL 7 systems --- i have verified that phpinfo() has an ImageMagick section --- and SVGs now finally work on my MW 1.31 system --- but for (at least one) large image it says that there's a permissions error when writing the thumbnails --- I've tried setting $wgMaxShellMemory = 524288;
[01:25:13] verified that my thumbs folder in MW is writable by the server --- the best lead I have right now is that ImageMagick might not have the right temp folder path set --- anyone here good with troubleshooting ImageMagick?
[02:12:59] hello
[02:13:24] Need help
[02:13:37] yes please
[02:14:03] DGS_, are you asking for help or offering?
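Pulling the two fixes above together, a corrected location block might look roughly like the following. This is a sketch only: the docroot, socket path, and the /wiki/ prefix are placeholders, not values from the manual page, and per the rewrite module docs linked above the flag choice matters ("break" stays in the current location, "last" restarts location matching).

```nginx
location /wiki/ {
    root /srv/mediawiki;   # placeholder docroot

    # Without a flag, the rewrite restarts location matching and the
    # fastcgi_* settings below are never applied; "break" keeps the
    # request in this block.
    rewrite ^/wiki/(.*)$ /index.php?title=$1 break;

    include fastcgi_params;
    fastcgi_pass unix:/run/php-fpm.sock;   # placeholder socket

    # Corrected per the nginx PHP FastCGI example cited above:
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
```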
😁
[02:14:34] is there any way we can download the content of the wiki in OneNote or another format, all together?
[02:14:44] help
[02:16:05] ?
[02:16:19] ok. i have no insight about OneNote, sorry.
[02:17:11] can we get the resources (all data) in a zip file or any other format?
[02:17:13] to download
[02:21:36] any idea how to download all the material all together?
[02:34:31] DGS_: https://dumps.wikimedia.org/
[12:03:17] Hey all, I am confused, is there right now an extension that works and can connect to an Active Directory or LDAP?
[12:03:39] I am using 1.30.1 on SLES with Apache
[12:26:40] Hi, I'm new to the wiki world and need to host an offline version of the Wikipedia API - is it simply a case of downloading the Wikipedia data and importing it into a local MediaWiki?
[12:27:57] Potentially
[12:28:05] Along with numerous extensions and a lot of storage space
[12:29:30] Thanks, I just need EN content for now and am looking to host on AWS - any recommendations or things that I should look out for would be much appreciated
[12:29:56] How is hosting it on AWS offline?
[12:31:24] haha, 'offline' in as much as I need my own personal copy so as not to hammer the public one, as I'm planning to send a large number of requests
[12:31:51] AWS also isn't local :P
[12:32:05] You realise we get a lot of requests, from people probably bigger than you (google, facebook etc)
[12:34:20] my use-case is to look up textual n-grams in hundreds of documents to find out if pairs of words are 'recognised things', e.g. "money laundering", so it's going to be pretty high volume and it didn't feel like the right thing to do to hit up the official endpoint..
[12:35:20] https://www.mediawiki.org/wiki/API:Etiquette
[12:41:07] thanks - makes sense. I'll definitely be hitting it in parallel with a very spiky load - will probably do some prototyping on the public endpoint and then move to a private instance if needed.
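For the high-volume n-gram prototyping discussed above, API:Etiquette mostly boils down to two mechanical things: send a descriptive User-Agent with contact info, and pass maxlag so the servers can shed your requests when replication lags. A minimal sketch (the function name, endpoint choice, and contact string are mine):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"  # swap for your own mirror later

def polite_query(params, contact="tools@example.com"):
    """Build a MediaWiki API request URL and headers per API:Etiquette:
    a descriptive User-Agent with contact info, plus maxlag so the
    servers reject our requests when replication lag is high."""
    q = dict(params)
    q.setdefault("format", "json")
    q.setdefault("maxlag", "5")  # back off when replication lag exceeds 5 s
    url = API + "?" + urlencode(q)
    headers = {"User-Agent": "ngram-lookup/0.1 (%s)" % contact}
    return url, headers
```

A client using this should also retry with a delay when the API answers with a maxlag error, and batch titles into one request where possible rather than firing one request per n-gram.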
sounds from your first point like I could do that using mediawiki
[14:00:35] What's with the date filter on Special:Log being weird and trying to query from year 1
[14:18:37] bawolff, looks okay to me...
[14:19:46] Hmm. If i go to Special:Log on meta, put my name in the performer field and hit show, the page that reloads has an invalid date in the date field
[14:19:51] I should go file a bug
[14:22:19] https://phabricator.wikimedia.org/T209490
[14:23:37] oh, legoktm fixed this already server side with f198154d76782
[14:28:55] bawolff: that might be a browser bug/inconsistency? when i submit that form with no date, i get just "&wpdate=", rather than "&wpdate=0000-00-00"
[14:29:16] Maybe. I'm using an old version of firefox
[14:29:21] (and when i view your link, i get an empty date field, rather than an invalid value)
[14:29:45] we can probably work around this, but it would be good to know what exactly we are working around
[14:32:21] MatmaRex: It's the invalid value in the html source
[14:32:53] whoa, indeed
[14:33:09] curiouser and curiouser
[14:33:32] actually, the date 0000-00-00 is the same as -0001-11-30
[14:33:37] if you use some wonky math
[14:33:48] so at least that mostly makes sense
[14:34:32] Wait
[14:34:42] how is that possible
[14:35:39] Wouldn't it have to be like negative december, not negative november, to be the same?
[14:43:30] shrug. wonky math
[14:43:31] https://stackoverflow.com/questions/10450644/how-do-you-explain-the-result-for-a-new-datetime0000-00-00-000000
[14:44:25] well, that is a pretty poor explanation, actually
[14:44:40] the manual also mentions "-0001-11-30": http://php.net/manual/en/datetime.formats.date.php
[14:46:45] Hi, is this the right place to ask a question about database dumps? I'm trying to read plwiktionary-...-pages-articles-multistream with Python (but I think it's a language-agnostic question). The first line of the index file is "670:1:czytać".
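The "wonky math" behind 0000-00-00 becoming -0001-11-30 is just calendar normalization: month 0 is the month before January (so December of year -1), and then day 0 is the day before the 1st of that month (so November 30). A rough sketch of that rollback in Python, not PHP's actual code; the leap-year rule applied to negative years is my assumption:

```python
def days_in_month(y, m):
    # Proleptic Gregorian month lengths; February gets 29 in leap years.
    days = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    if m == 2 and y % 4 == 0 and (y % 100 != 0 or y % 400 == 0):
        return 29
    return days[m - 1]

def normalize(y, m, d):
    """Roll an out-of-range date forward into a valid one, PHP-style:
    month 0 becomes December of the previous year, then day 0 becomes
    the last day of the month before that."""
    while m < 1:          # month 0, -1, ... roll back through December
        m += 12
        y -= 1
    while d < 1:          # day 0 rolls back to the previous month's end
        m -= 1
        if m < 1:
            m += 12
            y -= 1
        d += days_in_month(y, m)
    return y, m, d
```

So December is involved, but only as an intermediate step; the day-0 rollback lands one month earlier, in November.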
But if I open the dump file like so: with open(bz2.BZ2File(path)) as f: and then f.seek(670), I'm not getting anywhere close to "czytać", which actually starts at the 2730th byte.
[14:47:59] eww, that is wronky math
[14:48:12] According to https://en.wikipedia.org/wiki/Wikipedia:Database_download#Should_I_get_multistream?, "The first field of this index is # of bytes to seek into the archive"
[14:51:06] bawolff: can you test if this helps? https://gerrit.wikimedia.org/r/473527
[14:51:30] alkamid: i don't know the answer, but just want to assure you that it's on-topic in this channel. Although other channels might get you a better answer (maybe #wikimedia-tech; at the very least some of the dump people idle in that channel)
[14:52:32] just a minute
[14:52:58] bawolff, thanks, I'll knock on their door.
[15:01:04] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @Thiemo_WMDE & @CFisch_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:51:04] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @Thiemo_WMDE & @CFisch_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:15:35] Hello everyone
[16:16:03] Is anyone here familiar with [[Template:Infobox country]] from Wikipedia? Or rather, with adapting it to other instances of MediaWiki?
[19:25:40] We just started updating all of Gamepedia to MW 1.31. :D @CindyCicaleseWMF
[19:53:58] hi, why was this section removed from the "upgrade via root access" part of the mediawiki page?
[19:53:59] tar xvzf PACKAGE_FILE_NAME -C PATH_TO_WIKI --strip-components=1
[19:54:28] what's wrong with this method?
[21:11:42] \o
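Going back to the multistream question from [14:46:45]: the 670 in "670:1:czytać" counts bytes in the *compressed* file, but seeking on a BZ2File seeks in the decompressed output, which is why it lands somewhere else. The trick is to seek the raw file and start a fresh decompressor at that offset; a BZ2Decompressor conveniently stops at the end of a single stream. A sketch (the helper name and chunk size are mine):

```python
import bz2

def read_stream(path, offset):
    """Read one compressed stream from a multistream bz2 dump.  The
    index offset is into the raw compressed file, so seek the plain
    file object, not a BZ2File, and decompress from there."""
    with open(path, "rb") as f:
        f.seek(offset)
        decomp = bz2.BZ2Decompressor()
        out = []
        while not decomp.eof:            # eof flips at the stream boundary
            chunk = f.read(64 * 1024)
            if not chunk:
                break
            out.append(decomp.decompress(chunk))
        return b"".join(out)
```

In a real multistream dump each stream holds a batch of `<page>` elements, so the returned bytes still need XML parsing to pull out the page titled "czytać".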