[00:08:06] I have a problem with pdfhandler and possibly diff3 [00:08:19] could anyone please help? [00:09:43] You'd be best asking questions :) [00:09:48] the thing is that I cant get any pdf to show a thumbnail on the wiki. [00:10:35] I have installed ghostscript and so on, but nothing happens. [00:11:31] I heard of a security issue with this, and maybe the wiki is somehow blocked from running commands? [00:15:34] MediaWiki 1.22.6 [00:15:34] PHP 5.5.13 (fpm-fcgi) [00:15:34] MariaDB 5.5.37-MariaDB [00:15:53] no one? [00:16:14] If shelling out was disabled, usually you'd get an error message to that extent [00:17:04] ok. hold on. I get something that i looked up to be a bug.. [00:18:02] nah it seems to be gone. [00:20:22] these are the relevant entries I have made to LocalSettings.php: [00:20:44] require_once "$IP/extensions/PdfHandler/PdfHandler.php"; [00:20:56] $wgFileExtensions[] = 'pdf'; [00:21:03] # for PdfHandler [00:21:03] $wgPdfProcessor = "/opt/bin/gs"; [00:21:03] $wgPdfPostProcessor = "/opt/bin/convert"; [00:21:03] $wgPdfInfo = "/opt/bin/pdfinfo"; [00:21:03] /Maximum amount of virtual memory available to shell processes under Linux, in KB. [00:21:04] $wgMaxShellMemory = 614400; [00:21:12] gute: If you enable debug logging, it will often indicate exact commands that were executed, which can help in situations like these [00:21:55] "but nothing happens" is not very descriptive. Are you getting an error message, a blank page, something else? [00:22:46] bawolff>> Ill try that. No error message, just a link to file. [00:23:38] blue link, red link? [00:23:59] What happens on the file description page (If wiki is public can you give me a link) [00:24:52] What you're describing isn't really consistent with what usually happens with an error executing an external program [00:24:58] usually there's big scary error messages [00:25:51] trying to find the $wg.. for error messaging.. no the file just shows as it normally would by just entering $wgFileExtensions[] = 'pdf'; [00:31:48] >>bawolff is adding this to localsettings enough to debug?: [00:31:49] gute: SyntaxError: Unexpected identifier [00:31:50] error_reporting( -1 ); [00:31:51] ini_set( 'display_errors', 1 ); [00:32:25] I meant the types of errors that would mediawiki would output regardless [00:33:19] ok. in that case, no error reports. [00:33:35] I have a private wiki, though... [00:33:52] for the family actually [00:34:02] Ok, to start with, verify that the extension gets registered properly (Just see if its listed at Special:Version ) [00:34:43] ok, here is that error i was talking about: [00:34:57] Notice: Uncommitted DB writes (transaction from DatabaseBase::query (LCStore_DB::get)). in /volume1/web/wikiprivat/includes/db/Database.php on line 3944 [00:35:04] I think this is a bug... [00:35:12] Ok, you can ignore that warning [00:35:30] It is a bug, but it doesn't actually cause any problems [00:35:41] Its supposedly fixed in 1.23 [00:36:11] Also check on the file description page, are dimensions of the file extracted properly [00:36:20] check. the extension is properly registrered. [00:36:47] e.g. Like on https://commons.wikimedia.org/wiki/File:BLV_242_Briefwechsel_zwischen_Gleim_und_Ramler_Band_1.pdf there's a line directly under the image saying Original file ‎(779 × 1,285 pixels, file size: 8.62 MB, MIME type: application/pdf, 407 pages) [00:37:50] no... 0 x 0 pixels. [00:38:10] ok. That could actually be a sign that pdfinfo couldn't be executed [00:38:18] Mine is in swedish though... 
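For reference, gute's LocalSettings.php entries combined with the debug-log line bawolff suggests a bit further down. This is only a tidy restatement of what is pasted in the conversation; the /opt/bin paths and the log location are specific to gute's Synology NAS, not defaults:

    require_once "$IP/extensions/PdfHandler/PdfHandler.php";
    $wgFileExtensions[] = 'pdf';

    # for PdfHandler -- binary paths as installed on gute's NAS, adjust for your system
    $wgPdfProcessor = "/opt/bin/gs";
    $wgPdfPostProcessor = "/opt/bin/convert";
    $wgPdfInfo = "/opt/bin/pdfinfo";
    # Maximum amount of virtual memory available to shell processes under Linux, in KB
    $wgMaxShellMemory = 614400;

    # Debug log, so the exact shell commands MediaWiki tries to run end up somewhere readable
    $wgDebugLogFile = "/somewhere/errors.log";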
[00:38:25] Busybox_commands.pdf ‎(0 × 0 pixlar, filstorlek: 112 kbyte, MIME-typ: application/pdf) [00:38:57] ok. [00:39:03] gute: Could you try setting $wgDebugLogFile to somewhere [00:39:16] It should say in the debug log file any commands that were executed [00:39:23] I changed the db user to root just to see if there was anything to be done there, but nothing happens. [00:40:46] what file extension? .log? [00:41:36] sure. It doesn't really matter [00:41:41] $wgDebugLogFile = "/somewhere/errors.log"; [00:41:48] yep [00:41:54] done [00:42:28] ok, visit the file description page of the pdf again, doing ?action=purge (So mywiki.com/wiki/File:Foo.pdf?action=purge ) [00:42:41] browse around a bit... trying to upload a new version of the pdf... [00:42:42] And then put whatever is in your debug log file into a pastebin [00:45:05] http://hagur.synology.me/errors.log [00:46:11] hmm you're right, it is a problem with wfShellExec [00:46:38] had a gut feeling... [00:48:10] At least for the job queue [00:49:33] alright... [00:49:40] Do you mind going to mywiki.com/w/index.php?title=File:Foo.pdf&action=purge and then posting what the output of the debug log for that page is [00:49:56] (Sorry, last time I told you with a ?action=purge, where it should have been an & ) [00:50:45] ok. the link is the actual log, so wait a moment. [00:51:26] done. check again. [00:51:41] gute: For the job queue, its possible $wgPhpCli is just set to the wrong thing, you might want to change that in LocalSettings.php [00:52:09] or actually maybe not [00:52:14] gute: you are on linux? [00:52:23] yep [00:52:44] You have a copy of bash that's executable by your web server user at /bin/bash ? [00:53:44] no on the server (nas) i have busybox. let me do a which bash... [00:54:05] nothing [00:54:36] ah. thats right. i dont think busybox has bash.. let me do a quick google. [00:55:06] hmm, mediawiki hard codes bash and /bin/bash as the path to it [00:55:38] in line about 2907 of includes/GlobalFunctions.php [00:56:10] hard-coding /bin/bash is not usually an issue? [00:56:11] busybox uses the ash shell.. [00:56:46] what do you mean [00:57:09] SamB: surprisingly not. This is the first time I've heard of this being a problem [00:57:31] SamB: We only hard code on linux. If you're using *nix or windows, its not hardcoded to bash [00:57:36] oh [00:57:49] well, I mean, you should of course only do that if you need *bash* [00:58:01] if e.g. dash is fine too, it's called /bin/sh [00:58:22] but i have a product from Synology, which uses busybox, so I guess I'm not the only one... [00:58:42] I'm not sure. maybe limit.sh uses some bash-isms [00:59:18] sooo..... a minor bug? [00:59:35] yeah I guess so [00:59:48] what can i do, if anything? [00:59:58] Hacky unofficial work around [01:00:05] sweet [01:00:05] Change the line [01:00:09] if ( php_uname( 's' ) == 'Linux' ) [01:00:16] to [01:00:19] if ( false ) [01:00:31] where? what file? [01:00:38] on about line 2894 of GlobalSettings.php [01:00:59] the line number might be a little different depending on your version of MediaWiki. Its in the function wfShellExec [01:02:17] where is GlobalSettings? found GlobalFunctions... [01:02:31] Sorry, I meant includes/GlobalFunctions.php [01:02:37] not sure why i said globalsettings [01:02:41] bawolff: how do they run that script on other OSes? [01:02:48] SamB: we don't [01:03:08] If a shell command takes up too much memory on a different OS, it just kills your server [01:03:14] SamB: https://bugzilla.wikimedia.org/show_bug.cgi?id=44568 [01:03:15] ah. 
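The hack bawolff describes lives in wfShellExec() in includes/GlobalFunctions.php, where MediaWiki wraps shell commands in bin/limit.sh on Linux; limit.sh needs bash, which BusyBox-based systems like Synology's don't ship. A rough sketch of that guard and the two ways it gets relaxed (line numbers and surrounding code vary by MediaWiki version, and the last variant only paraphrases the direction of the Gerrit fix mentioned below):

    // includes/GlobalFunctions.php, inside wfShellExec()

    // Stock behaviour: on Linux, wrap the command in the bash-only bin/limit.sh script
    if ( php_uname( 's' ) == 'Linux' ) {
        // ... builds the "/bin/bash ... bin/limit.sh ..." command line ...
    }

    // Unofficial workaround used here: never take the limit.sh branch
    if ( false ) {
        // ...
    }

    // Shape of the later fix (Gerrit change 140880): only use limit.sh when bash really exists
    if ( is_executable( '/bin/bash' ) ) {
        // ...
    }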
[01:05:03] whats the name of the function? [01:06:14] gute: wfShellExec [01:11:57] alright. finally found this. on line 2775. =) saving and trying again. [01:13:34] same same. [01:13:45] ahhh [01:13:50] YYYYEEEEESSSSS [01:13:57] took awhile... [01:14:16] >>bawolff Thanks alot! [01:14:17] gute: SyntaxError: Unexpected identifier [01:14:31] no problem [01:15:22] good thing I decided to try this IRC, otherwise I would have pulled the little hair i have left on my bald skull! :) [01:15:35] Guess you guys made a bug report? [01:16:24] gute: yes, I'm working on that now [01:16:30] well I didn't actually make a bug report [01:16:40] (PS1) Brian Wolff: Only use limit.sh if /bin/bash executable, instead of if linux. [core] - https://gerrit.wikimedia.org/r/140880 [01:16:59] is a fix for the issue that will hopefully be in the next version of MW (If it gets positively reviewed) [01:18:59] Sounds great! I'm happy! Nice work! I just started wiki-ing a couple of weeks ago. [01:19:32] a quick question though... [01:20:28] I plan to have more wikis for family and friends. Is GlobalFunctions.php always the same for this ver of MediaWiki? Can I just copy it to other folders as well? [01:21:41] what you can do is have one install of mediawiki power multiple sites [01:21:59] https://www.mediawiki.org/wiki/Wiki_farm unfortunately I think that guide is out of date [01:23:01] yeah, I got down that road and it didnt work so well. I just go with several folders instead. [01:23:19] Can I just copy this file over? [01:23:38] instead of manually editing every file? [01:24:05] I'm about 99% sure that I could, but just to be sure? [01:25:17] yeah you can [01:26:42] yeah, you can even just make all the files except LocalSettings.php be symbolic links to a central copy of Mediawiki [01:27:17] ok, thanks! [01:27:34] that will be all for me. Great support everyone! Bye! [03:37:22] Hello everyone! I want to install MediaWiki on a CentOS 6.5, but after the first screen, when I click the link, the next screen just stays blank [07:12:55] Hello everyone. [07:12:55] Looking for help regarding the installation and the configuration of the extension VisualEditor. [07:12:55] After several tests, I get nothing conclusive. [07:12:55] OS: GNU/Linux Debian jessie/sid | WM: 1.23 | VE: manual snapshot made for MediaWiki 1.23 [07:12:55] nodejs 0.10.29 and parsoid 0.1.12 installed with aptitude [07:14:03] N3oXid: it's a quarter past midnight here and I'm just heading off to bed, but I saw you ask yesterday, too, without getting a reply, and I want to make sure you are able to work it out [07:14:16] so could I ask you to poke me tomorrow (I'm in PST) if you still haven't resolved it? [07:14:22] I promise to take the time to help you work that through. [07:15:11] ori: yes, sure :) Thanks [07:15:29] ori: Sorry, it's 9:15 am here :) [08:23:51] Morning. I have two mediawiki databases, the current live one 'mediawiki' and an old backup that I just imported 'oldwiki'. How should I go about transferring the articles from the old one into the new one? [09:01:43] Reedy, hello! [09:02:54] I am making a form in my material database extension to add a new material. I wish to get all the traits in a dropdown which are present in my traits table. [09:54:22] Reedy, ping [10:01:03] Can any one help me please? [13:05:24] Whats the tag for creating an internal wiki link to another page? [13:08:47] sander^work: [[other page's title]]? [13:43:11] 6 [14:08:48] do Template changes propagate immediately?
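Going back to gute's question about running several wikis off the same code: besides symlinking everything except LocalSettings.php, the usual wiki-family trick is one shared checkout whose LocalSettings.php picks per-site settings by hostname. A minimal sketch with made-up hostnames and database names (not from the conversation):

    <?php
    // LocalSettings.php of a single shared MediaWiki install serving several wikis
    switch ( $_SERVER['SERVER_NAME'] ) {
        case 'family.example.org':
            $wgSitename = 'Family Wiki';
            $wgDBname = 'familywiki';
            break;
        case 'friends.example.org':
            $wgSitename = 'Friends Wiki';
            $wgDBname = 'friendswiki';
            break;
        default:
            die( 'Unrecognised host.' );
    }
    // ... settings shared by every wiki (extensions, PdfHandler paths, etc.) follow ...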
[14:09:12] I've changed a template and don't see the updates in the pages using that template [14:11:26] wmat: no [14:12:09] wmat: depending on wiki, it may take a couple hours [14:12:20] or even days [14:12:28] interesting, I didn't know that [14:13:05] !jobqueue [14:13:05] The Job Queue is a way for mediawiki to run large update jobs in the background. See http://www.mediawiki.org/wiki/Manual:Job_queue [14:13:28] https://en.wikipedia.org/wiki/Wikipedia:Job_queue is a more useful link [14:13:54] thanks! [14:25:00] hmpf, only because I removed lots of errors it contained [14:25:26] * Nemo_bis shouldn't contribute to en.wiki's docs autarchy [14:27:56] Nemo_bis: I mostly went with, only because the mediawiki one is still talking about MW 1.6 [14:31:38] lol [14:33:45] this makes little sense https://meta.wikimedia.org/wiki/Help:Job_queue [14:33:50] We should probably have an help page [14:34:25] https://meta.wikimedia.org/w/index.php?title=Help:Job_queue&diff=3671604&oldid=3346768 [14:34:40] was that edit really necessary? [14:34:55] ah, that was a bot [14:38:44] So the parts talking of 1.6 are probably still Tim's words [15:28:48] Anyone have any idea why the Lingo extension isn't able to annotate text returned by my own tag extension that's returning $parser->recursiveTagParse($wikitext)? [15:29:51] Lingo is working in other pages and in other parts of pages that use my extension. Seems the results of my tag are not getting processed by Lingo though. [15:31:39] retentiveboy: Perhaps Lingo isn't handled strip markers properly? [15:31:49] * bawolff never used lingo, doesn't know how it works [15:32:06] Or perhaps lingo is being run prior to parser tags being processed [15:32:17] Seems to hook into ParserAfterTidy [15:37:06] Well that hook is pretty near the end, it should be seeing your content [15:37:32] yeah, adding logging to see if I can figure this out. [15:49:09] bawolff: Lingo's parser appears to be getting input that looks like instead of the output of my extension. [15:59:33] retentiveboy: I would expect link holders to be replaced before ParserAfterTidy is called [16:00:10] Where can I learn about "link holders"? [16:00:17] Not sure what's going on there. [16:03:32] Umm, includes/parser/LinkHolderArray.php I guess. [16:04:02] basically grep through includes/parser/Parser.php for any time the $holders variable is used [16:05:49] The parser is not exactly the easiest piece of MW to just jump into [16:15:40] !performance | Lcawte [16:15:40] Lcawte: [16:16:25] jackmcbarn: fwiw, I had to switch away from luasandbox as it was preventing Users from saving any Template pages. All I was seeing was a timeout error. [16:16:47] wmat: are there any more detauls than that? [16:17:19] jackmcbarn: not at the moment, I haven't had time to dig deeper [17:16:32] Having some issues with ImageMagick in MW 1.23. Trying to use an alert template I imported from mediawiki with an svg image to convert to thumbnail. I instead get an error reading 'Error creating thumbnail: convert: no decode delegate for this image format `' @ error/constitute.c/ReadImage/501.' Running the convert command on an svg with png output outside of MW works without issue.. [17:17:58] !debug [17:17:58] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . 
A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging [17:18:05] You can find out what exactly MW is trying to run [17:18:55] theding0: Sometimes caused by $wgMaxShellMeory being too low [17:19:55] bawolff: I've already manually set $wgMaxShellMemory to 131072 in LocalSettings.php [17:21:00] That's lower than the default... [17:21:05] $wgMaxShellMemory = 307200; [17:21:23] Reedy: oh man. I pulled that number from an article. what should I set it to? [17:21:50] It's in KB [17:22:00] It defaults to 300MB, you set it to what, 128? [17:22:17] Try remove your override and try again :) [17:22:41] Removed the override with same error [17:25:07] you could try setting to $wgMaxShellMemory = 907200; just to rule it out, but 300 mb should be plenty [17:26:03] plenty? IIRC twn had to raise to 500 [17:26:26] I suspect twn doesn't have many svgs to scale [17:26:49] some [17:27:05] Nemo_bis: That's higher than what wmf uses (wmf is at 400 mb) [17:27:20] WMF is more frugal [17:27:52] Commons is just used to their svgs sucking :P [17:28:07] Indeed. [17:28:28] Well probably also rsvg uses less memory than image magick [17:28:58] Should rsvg be showing in my delegates? [17:29:24] theding0: That depends if rsvg support is compiled into image magick [17:29:32] * bawolff assumes anyways [17:30:02] theding0: You said earlier that you could convert the svg outside of mediawiki, so presumably your image magick is ok then [17:31:19] that's what I concluded too [17:31:39] Unless MW is passing some stupid parameter [17:31:42] that's broken in your version [17:34:53] theding0: windows or linux? [17:35:47] Reedy: MW being stupid? NEVER! (-; [17:36:06] :P [17:36:51] theding0: I've found a very similar issue: command works from command line, but it fails from PHP: it was a permission issue - http://www.multipole.org/discourse-server/viewtopic.php?f=1&t=20201#p80213 [17:39:17] Vulpix: it's linux. I'll check that article [17:39:51] I asked because in that article the problem was on Windows (with PDF instead of SVG to PNG) [17:40:20] ashley: Ever [17:40:32] Vulpix: oh ok [17:40:47] Vulpix: you can have permission errors on linux to [17:41:01] yeah i'm checking it anyways [17:41:11] i've had to do some permission editing already so worth a shot [17:41:33] theding0: you can use the su command to switch to the apache user (usually www-data) and try the command [17:42:09] but per what Reedy said earlier, try enable debug logging, see what the full command line (with limit.sh and everything), and try running that command outside MW and see what happens [17:43:26] really appreciate your guys's help. i'm extremely noobish [17:44:32] (In case it wasn't clear, by use su, I meant execute the command su to become root, then execute the command su www-data to become www-data (or whatever your apache is running as), and then run the image magick command [17:45:19] that being said, i added the error_reporting lines at the end of LocalSettings and don't know what I'm supposed to do next [17:45:57] error looks the same on the template page [17:45:58] theding0: you need to set the debug log to see the exact command being executed [17:46:11] For the debug logging, add a line like $wgDebugLogFile = '/path/to/somewhere/log.txt'; [17:46:49] then purge the image description page (add &action=purge to the end of the url) (which causes images to be attempted to be re-rendered) [17:47:02] And then look in the debug file for lines starting with wfShellExec [17:49:16] ooo. 
when I su to www-data I get permission denied when running convert [17:49:47] SElinux? [17:50:03] theding0: Do you become root first? [17:50:16] bawolff: i'm terrible and log in to the shell as root.. [17:51:52] oh i misread your message as being you get permission denied when changing to www-data [17:52:21] Ok, what are the permissions on convert? ( ls `which convert` ) [17:52:38] err i mean ls -l `which convert` [17:53:06] -rwxr-xr-x 1 root staff 27897 [17:54:51] well everyone should be able to execute it, so maybe what vulpix said (?) [17:55:44] yeah, apart from binary permissions, and maybe permissions on the files being converted, the only thing that could deny permissions that I think of, is SElinux [17:56:06] !selinux [17:56:06] SELinux is a linux variant that allows fine grained access control. If you are using it, you may have to adjust the access control rules in order for MediaWiki to work. See . [17:56:25] I mean, it's just the turnkey linux mediawiki provided [17:57:00] theding0: what does ls -laZ `which convert` [17:57:02] give you [17:57:10] discussing skin revamp in #wikimedia-office in 5 min https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-06-20 [17:57:37] bawolff: -rwxr-xr-x 1 root staff ? 27897 [17:59:48] * bawolff doesn't actually know anything about selinux [17:59:53] :) [18:00:02] theding0: what does getenforce command give you [18:00:03] #startmeeting Skins revamp | Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE). https://meta.wikimedia.org/wiki/IRC_office_hours [18:00:08] whoops! [18:00:11] :P [18:01:06] bawolff: command not found [18:01:35] Well that probably means that selinux isn't on... [18:01:44] you can also try the sestatus command [18:02:12] But if selinux isn't on, and there's global read and execute permissions on convert, I'm really not sure what's up [18:02:14] also not found [18:02:50] ugh. this is like the last freakin thing i wanted to implement lol [18:03:04] sestatus is in /usr/sbin/sestatus, usually not in $PATH for normal users except root [18:03:24] you may need to use the full path [18:03:33] Vulpix: ran sestatus from that directory with same result [18:03:54] ok [18:04:07] ok, we are talking about https://www.mediawiki.org/wiki/Separating_skins_from_core_MediaWiki in #wikimedia-office [18:10:04] theding0: so the only message you get is "permission denied"? [18:12:53] Vulpix: when running from command line as www-data, yes [18:14:02] well, that may not be relevant [18:14:04] hold on [18:15:59] is there a maintenance script that creates all the required tables? [18:16:19] MC8: the installer? [18:16:41] there's one that will create missing ones after you add an extension [18:16:54] update? upgrade? somthing like that [18:16:56] iirc [18:17:00] yeah, that's update.php [18:17:08] but i don't think it will get an empty schema working [18:17:18] it doesn't [18:17:28] I also miss a maintenance script for that [18:18:08] for example, you have the LocalSettings in place with DB credentials, but you're unable to create the database from command line, unless you use the install script, that requires *a lot* of parameters [18:20:27] maybe we should separate the command line installer into 2 steps. 
first build localsettings that takes a ton of params, then build the db which would read from localsettings and only need the db admin password as a param [18:20:48] MC8: maintinance/install.php [18:21:05] MC8: Or run tables.sql through sql.php, but no garuntees that actually works [18:25:18] oh okay, I was hoping to avoid the installer 'cos I already have a config [18:27:56] MC8: Well you could run the installer, then delete the config [18:28:05] and replace it with your old config [18:29:05] hi brion - wanna join us in #wikimedia-office? [18:29:19] finishing up talking about Bartosz's (MatmaRex's) work on skin revamp, moving on to Trevor's [18:29:24] ah great [18:29:33] sorry, slept poorly and my schedule’s all messed up today [18:29:59] my sympathies [19:10:00] Nemo_bis: do you like installation particularities, don't you? look at this one: http://cefresearch.ca/wiki/ (look before the automatic redirect) [19:12:28] lol :) [19:13:52] apparently, they forgot to check for new updates :( [19:15:54] They also didn't notice that the ConfirmAccount extension exists [19:37:15] I'm having trouble upgrading MW from 1.20.8 to 1.23: during the update.php script, I'm getting a db query error: "select page_namespace, etc. from page where ..." from the function Title::newFromID is erroring with "Unknown column page_content_model" in 'field list' -- I've seen a few similar results from googling but nothing I've done so far has helped. Any suggestions? [19:37:48] jcl: did you run the update script (web installer, or update.php from command line)? [19:38:01] ah, during the update script [19:38:05] I missed that, sorry [19:39:06] that's... really unfortunate. Why the update script is instantiating a new Title object? [19:39:15] yeah... [19:39:40] extensions? [19:40:04] well, there are multiple things happening here, potentially. i've pulled down new versions of all relevant extensions, we also have semantic mediawiki being upgraded from 1.8.0.5 to 1.9.something [19:40:07] could that be related? [19:40:31] i've read the upgrade notes for that i thought i'd accounted for everything, but i could be missing something [19:40:49] possibly. try disabling all extensions, running the update script, and then re-enabling extensions, and running update script again [19:41:12] that would be a good idea [19:42:00] I have lke 25 extensions [19:42:41] Well if they're all in a block, that's what multiline comments are for :) [19:42:42] jcl: just comment out their require_once from LocalSettings.php, you can use the block comment: /* */ [19:44:11] i can do that but i guess there's a lot I don't understand about what all update.php does [19:45:03] lots of things [19:45:08] hmm, same error [19:45:42] There's a $wg for this [19:45:57] $wgContentHandlerUseDB = false; [19:46:52] hmm [19:46:53] so should i try disabling that variable and re-enabling extensions? or leave them commented out? i do like making one change at a time when possible... [19:47:10] Reedy: wasn't the update.php script supposed to be setting that while it's running? [19:47:11] You might want to leave extensions disabled [19:47:43] MatmaRex: Looks like it [19:48:06] where is it doing that? 
i don't see it setting that anywhere in the maintenance directory [19:48:12] Supposedly as step 1 [19:48:21] just 2 references to the var in backupTextPass.inc [19:48:24] protected function disableContentHandlerUseDB() { [19:48:24] global $wgContentHandlerUseDB; [19:48:25] if ( $wgContentHandlerUseDB ) { [19:48:25] $this->output( "Turning off Content Handler DB fields for this part of upgrade.\n" ); [19:48:26] $this->holdContentHandlerUseDB = $wgContentHandlerUseDB; [19:48:28] $wgContentHandlerUseDB = false; [19:48:30] } [19:48:32] } [19:48:37] includes/installer [19:48:55] ah ok [19:49:10] ok let me change that var and try the update again... [19:50:05] nope same problem [19:50:47] MatmaRex: That code looks "new" in 1.23 [19:51:16] https://gerrit.wikimedia.org/r/#/q/I31078678e8939c897b1357bcb77eb2d26f806f29,n,z [19:51:41] jcl said he's using 1.23 [19:51:45] Do I dare ask why it was apparently backported to 1.21 but not 1.22? [19:51:56] When did 1.22 come out... [19:51:58] !1.22 [19:51:58] MediaWiki 1.22 is a legacy version of MediaWiki, released on Dec 6th 2013. See https://www.mediawiki.org/wiki/MediaWiki_1.22 [19:52:02] lol [19:52:03] Oh, is in 1.22 [19:52:11] i'm wondering if i should blow away the current database (doing this in my dev env) and recreate it from the backup I'm using for this testing and try again. Perhaps something broke during previous update attempts while extensions were enabled? [19:52:25] So does this mean we have essentially no automated tests for the installer at all? [19:52:34] you could just add the page_content_model fields manually [19:52:40] jcl: it shouldn't have [19:52:42] we were going to upgrade to 1.22.7 despite 1.23 being out, dot zero releases and all, but i did finally decide to go with 1.23.0 [19:52:55] bawolff: yes, unless Marks are hiding somethiing [19:53:02] oh forgot to mention... the field IS there, I verified it with the exact query it says is failing [19:53:16] wut [19:53:20] That's... [19:53:22] what [19:53:41] i even tried a suggestion i'd found about running update, during the countdown deleting the field, and then the update recreates the field, which I see it do, then it fails anyway [19:53:55] yeah [19:54:03] It shouldn't try and create it if it already exists (for obvious reasons) [19:54:16] right, it fails either way, whether or not i first drop the field [19:54:40] Ok. I'm going to try deleting that field on my install and see if things explode [19:54:46] if i drop it, it does create it, and like i said, i run the exact query update says is failing and it shows the field, even verify with show create table page. [19:55:06] jcl: unless something with transactions maybe (?) [19:55:14] do transactions even apply to alter statements [19:55:21] And you are looking at the correct database? [19:55:27] yes [19:57:28] ah! [19:57:38] ok this is something to do with the replication [19:57:55] i just noticed the field doesn't exist in the replica db's page table [19:57:56] hmm [19:58:08] aren't ddl statements replicated?! [19:58:11] I guess the reads are hitting the slave [19:58:14] pretty sure they are [19:58:32] yeah, that's to be expected, so i'm not sure why the field doesn't exist there, unless repl is broken... [19:58:32] Ah, that will do it [19:58:57] this is just a dev server so I can start from scratch, but i'm still confused about how the replica could be wrong [19:59:01] They should be replicated... 
Purely the reason we don't just do changes on the mysql masters for Wikipedia [19:59:24] my LocalSettings.php is set to use the master for writes and a single replica for the reads [19:59:28] as per normla [19:59:30] normal even [19:59:59] well there you go, the slave threads aren't running, gr... [20:00:04] SHOW SLAVE STATUS; [20:00:42] that must be my problem. i'll get replication fixed and this will probably go away. [20:01:14] just for clarification though, there's probably no reason, then, to disable extensions, right? since that's not part of the normal upgrade procedure [20:01:15] might take a little while to catch up [20:01:30] Right [20:01:36] Unless you know one is buggy or similar [20:01:45] Which at this point, there is no reason to believe so [20:02:13] i'll just recreate the replica db, change master, and should be good to go. replication is on the same host since i'm using a single server for the entire wiki-db stack, just wanted to reasonably mirror the production architecture [20:02:59] thank you all for the help. i'll follow up once i've rebuilt the replica db and tried with the slave threads running :) [20:13:14] jcl: FYI https://bugzilla.wikimedia.org/66887 [20:15:29] Hello all! I wish to get the userID of the logged in user. Is there any predefined function for this? I tried searching here https://doc.wikimedia.org/ but couldn't find. [20:16:08] albertcoder: Where in the code base do you want this? (E.g. are you in a special page?) [20:16:38] albertcoder: Inside a special page, it would be $this->getUser()->getId() [20:16:39] Yes, I am making an extension. I wish to use that in my extension. [20:17:10] If you're doing a tag extension, it would be $parser->getUser()->getId(); [20:17:20] I'm assuming you want the id for the current user? [20:17:30] yes. [20:18:32] Basically, if you have any object implementing IContextSource (Such as a SpecialPage subclass), you can do $thatObject->getUser()->getId(); [20:19:23] Alright, I will try. Thanks a lot bawolff [20:22:00] bawolff, I am having problem in locating the functions and resources already built in mediawiki to use in my extension. Where should I refer and what should be my approach to understand mediawiki thoroughly? [20:22:31] Don't hesitate to ask on the channel [20:23:09] https://doc.wikimedia.org/mediawiki-core/master/php/html/ is a good place to start, but can still be hard to find what you're looking for if you don't know where it is [20:23:20] I thought twice to ask the userID :P but I could find so I just asked. [20:24:22] Hello, Is there an possibility to make like this page in another wiki? [20:24:24] https://en.wikipedia.org/wiki/Special:NewPagesFeed [20:24:39] If yes, then how?. [20:25:19] albertcoder: If you want to read up on some of the main classes in MediaWiki, I'd reccomend https://doc.wikimedia.org/mediawiki-core/master/php/html/classTitle.html https://doc.wikimedia.org/mediawiki-core/master/php/html/classUser.html and https://doc.wikimedia.org/mediawiki-core/master/php/html/interfaceIContextSource.html [20:25:52] and maybe https://doc.wikimedia.org/mediawiki-core/master/php/html/classHtml.html [20:26:06] Ahmed_Sammour: Is your wiki english [20:26:44] Ahmed_Sammour: Due to silly design decesions, that page is very hard to adapt to communities that are not enwikipedia [20:26:46] bawolff: No, it's AR.wiki. [20:27:49] bawolff: I can translate it all if I could. 
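A minimal sketch of the two cases bawolff spells out for albertcoder above — getting the current user's ID from a special page versus from a parser tag callback. The class and function names are illustrative only:

    <?php
    // In a SpecialPage subclass the page proxies the request context (IContextSource),
    // so the current user is available directly:
    class SpecialMaterials extends SpecialPage {
        public function execute( $par ) {
            $userId = $this->getUser()->getId(); // 0 for anonymous visitors
            // ...
        }
    }

    // In a parser tag callback, ask the Parser for its user instead:
    function renderMaterialTag( $input, array $args, Parser $parser, PPFrame $frame ) {
        $userId = $parser->getUser()->getId();
        // ...
        return $parser->recursiveTagParse( $input, $frame );
    }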
[20:27:49] Ahmed_Sammour: Issue is tracked by https://bugzilla.wikimedia.org/show_bug.cgi?id=48552 although I don't think much is going to happen on that front [20:27:50] Thanks a lot bawolff , I will go through all these and will ask if I face any problem in extension-making. :) [20:28:33] albertcoder: If you're doing a php tag extension, the recursiveTagParse method of Parser object is important [20:29:02] Ahmed_Sammour: From what I understand, its not just "translation" per se, some of it is very hard coded to enwiki politics [20:29:09] * bawolff not overly following the situation [20:30:27] hi [20:30:35] bawolff: Ok, No problem. Thank you for your help :) [20:30:40] bawolff, would you please elaborate what php tag extension is? I am really a beginner. I am just making an extension in mediawiki which can store, display the properties of materials. [20:30:44] why I can see here the code of the link: https://cs.wikiversity.org/wiki/Wikiverzita:Diskuse_o_smaz%C3%A1n%C3%AD/Z%C3%A1mek_(za%C5%99%C3%ADzen%C3%AD) ? [20:31:46] albertcoder: Its a type of extension. It allows you to add new tags to wikimarkup (For example, on mediawiki.org, there's a php tag extension that implements the tag, which allows users to insert syntax highlighted code into the wiki) [20:33:01] albertcoder: The most commons types of extensions are: new special pages (Special:Foo on the wiki), new media handler (for different file formats), parser tags, which add to wiki syntax, and parser functions which add {{#bar:stuff here}} to wikisyntax [20:33:03] okay I got it. [20:34:41] But it will be more clear, if I go through these. I also wish to implement the features like if a normal user has logged in and inserted some data, the admin will review the data and if it is valid then only it should be inserted. [20:34:46] albertcoder: you may also want to read through https://www.mediawiki.org/wiki/Manual:Developing_extensions [20:35:45] That can get complicated. Its possible, but mediawiki is much more oriented towards an instantly available type of thing [20:37:58] hey anomie - you had a chance to talk with robla about REST Fest? [20:38:08] Ah!, I meant the data should be inserted instantly but it will be visible worldwide after installing extension only if the admin approves it. Is that possible bawolff ? [20:38:27] yes that's possible [20:38:38] But it involves a lot of extra steps [20:39:24] If you're entering the info into a special page, its not too bad, but if you enter data into a wiki page, the approval thing can become complicated [20:40:03] There's some examples of doing that sort of thing in the ApprovedRevs/FlaggedRevions extension (FlaggedRevisions is a complicated extension to understand, probably not the best one to learn from) [20:40:25] albertcoder: If you're mostly concerned with storing data in a wiki, you may want to read up on wikidata [20:41:39] Yes, I am mostly concerned with storage, display, import and export in CSV, JSON etc. and search. [20:43:01] sumanah: I mentioned it to him, but I don't think we made a final decision [20:43:25] got it [20:44:12] bawolff, do you recommend wikidata for storage, display, import and export in CSV, JSON etc. and search? [20:45:12] I don't know enough about it to say [20:45:19] Although its not really a search thing afaik [20:45:51] can someone help me with installation of mediawiki on debian jessie ? 
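To make the extension types bawolff lists concrete, here is a rough sketch of how a parser tag (<material>...</material>) and a parser function ({{#materialinfo:...}}) get wired up from an extension setup file. All names are illustrative, and the i18n magic-word declaration a parser function also needs is left out:

    <?php
    // MaterialDB.php -- illustrative setup file, included from LocalSettings.php
    $wgHooks['ParserFirstCallInit'][] = 'materialDBParserInit';

    function materialDBParserInit( Parser $parser ) {
        // <material>...</material> -- a parser tag
        $parser->setHook( 'material', 'materialDBRenderTag' );
        // {{#materialinfo: ... }} -- a parser function
        // (the 'materialinfo' magic word must also be declared in an i18n magic file, omitted here)
        $parser->setFunctionHook( 'materialinfo', 'materialDBRenderFunction' );
        return true;
    }

    function materialDBRenderTag( $input, array $args, Parser $parser, PPFrame $frame ) {
        // Let MediaWiki parse whatever wikitext the tag contains,
        // per the recursiveTagParse() advice above
        return $parser->recursiveTagParse( $input, $frame );
    }

    function materialDBRenderFunction( Parser $parser, $name = '' ) {
        // Parser functions return wikitext by default
        return "''" . wfEscapeWikiText( $name ) . "''";
    }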
[20:46:06] there are no real queries yet, hence probably not, albertcoder [20:47:03] I followe this link http://www.rosehosting.com/blog/how-to-install-mediawiki-on-debian-wheezy/ to do the install but when I run localhost/mediawiki I get a 404 [20:47:31] Speaking of which, I wonder if we should really be so hard with Debian's packages [20:47:34] https://www.mediawiki.org/wiki/Manual:Running_MediaWiki_on_Ubuntu [20:48:00] dforce: it would be wonderful if you could test some of our docs and remove all the sections which are not useful :) [20:48:34] Or we should perhaps kill all the Running MW on platform X docs as they're mostly outdated and wrong [20:48:45] And with the exception of windows, its pretty much all the same [20:48:48] bawolff: I killed 5 the other day [20:48:59] https://www.mediawiki.org/wiki/Thread:Project:Current_issues/Installation_guide_consolidation [20:49:11] Just go kill those you don't like, redirect and it's done [20:49:13] okay Nemo_bis , bawolff thanks a lot. I will show you the young extension soon and would really be grateful if you review and give suggestions. [20:49:28] albertcoder: extension for what? are you using SMW now or not [20:49:41] NotASpy, I am not. [20:49:46] (I remember the aim, minerals; but not the current approach) [20:49:53] ok [20:50:02] (weird autocomplete you have) [20:50:07] :P [20:50:14] pardon :) [20:50:20] lol " upload_max_filesize = 8M" [20:50:31] Wow, really pushing the limits on the php config in that doc [20:50:43] 8 megabytes ought to be enough for anyone! [20:51:45] tired [20:51:48] yawn [20:53:19] Nemo_bis, I have been able to make my extension workable to some extent without using SMW. I will show you once I install it on my college server. Hope you will be able to suggest better then. [20:53:52] Sorry, I don't have in my plans to try being smarter than SMW :) [20:54:04] lol bawolff [20:54:37] bawolff: otoh there are reports on commons that upload over 12 MB doesn't work, so :D [20:55:11] Well they should just stop using upload wizard then :P [20:55:26] Nemo_bis, I think I will really have to switch to SMW as soon as possible. :P [20:55:51] Or maybe we should fix chunked uploading. nah, that's crazy talk [20:57:31] <^demon|lunch> So many things to fix. [21:04:43] Hmm, the --layers optimizeTransparency looks pretty bad on https://commons.wikimedia.org/wiki/File:ROTARY_-_a_lady%27s_wrist_watch._Fellows-1438-91-A.gif [21:25:53] Hi. Is it ok to change the permission of bureaucrats to let them protect/unprotect articles? [21:26:40] pera: On wikimedia? That's usually the job of admins, and usually all crats are admins [21:26:52] I mean on mediawiki :) [21:27:02] ie my personal wiki [21:27:05] If its your own wiki, you can do whatever you want :) [21:27:50] yeah but I was worried about the rol of each group.. [21:28:38] *role [21:29:24] like, it might be a bad practice for some reason (first time using mediawiki) [21:29:37] but thanks then bawolff :) [21:35:45] hallo, quick technical question regarding extension and wikitext. I'm trying to build a very simple extension that provides a parserfunction allowing to include the intro (section 0) of another article into a page. [21:35:55] relevant code: http://fpaste.org/111632/30004714/ [21:37:10] unfortunately I stumbled upon a 'known problem': the section is properly transcluded, but each section title of the page gets a fancy UNIQ ... QINU [21:38:16] I know about Parser::recursiveTagParse but if you look at my code I'm not even using parse. 
My guess is, it happens when I'm using the getSection() method [21:39:07] so my question is - is there a way around or should I just code my own getSection method? [21:39:43] Bloupfuh: do you know of the TextExtracts extension [21:40:03] yeah. but it's a javascript API [21:40:34] can't you reuse the logic or underlying PHP code? [21:40:43] Bloupfuh: it's possible to call api.php modules from other PHP code :) [21:40:54] it's even possible that Lua might be able to call such functions [21:41:04] (but a bit clunky) [21:41:56] Bloupfuh: here's example code where i used that ability: https://gerrit.wikimedia.org/r/#/c/125565/3..2/includes/specials/SpecialContributions.php [21:42:33] (the problem with lua is, text returned by a lua module isn't parsed, so wikitext is displayed as plain text) [21:42:43] taking a look [21:42:47] MatmaRex: last i checked, calling the api internally still triggered uniq issues [21:43:03] bawolff: oh i have no idea, i was just saying this re: TextExtracts [21:43:24] i know nothing about uniq issues [21:44:31] Bloupfuh: slightly hacky work around [21:44:50] Bloupfuh: instead of doing $content = $content->getSection("0"); [21:45:30] Bloupfuh: do: global $wgParser; $oldParser = $wgParser; $wgParser = new Parser; $content = $content->getSection("0"); $wgParser = $oldParser; [21:45:46] "slightly" [21:45:51] :> [21:46:11] you know I submitted a patch related to this very issue just a couple hours ago [21:46:35] https://gerrit.wikimedia.org/r/141056 [21:46:35] for which bawolff deserves an applause [21:46:53] Except not this specific issue, just the OutputPage one's, but same approach could be used [21:47:43] can't wait [21:48:09] bawolff: I don't understand if your patch supersedes https://bugzilla.wikimedia.org/show_bug.cgi?id=65826#c5 or #c9 [21:48:13] I'm trying your work around right away [21:50:22] Bloupfuh: Another extension that does a similar thing is Babel - https://git.wikimedia.org/commitdiff/mediawiki%2Fextensions%2FBabel.git/224fe8975c781d564f0437ba52e8792f3be5177b [21:50:49] Nemo_bis: it supersedes both [21:50:56] Assuming it gets merged [21:52:59] well the hack is working wonderfully [21:54:12] :D [21:54:18] no wonder Babel is doing a similar thing, you're the author of the fix! [21:54:30] indeed [21:55:30] thank you. [21:59:55] I have a quick question. Since upgrading from 1.21.1 to 1.23.0 I've noticed that links for new pages are not going blue or purle, they are showing with red links and edit action in link, as if your trying to create a page that doesn't exist when it does. [22:00:01] This is for a work wiki. [22:00:24] s/purle/purple/ [22:01:10] Anyone have ideas as to the cause? I'm still poking around to see why it's happening at the moment. [22:04:31] CowboyPride: for any size of the pages in question? [22:04:43] Yes [22:04:46] The only change I can think of is the removal of the stub threshold preference [22:04:57] The one page I'm looking at is very small. [22:05:24] It is a page the holds links to other pages... [22:05:28] How about bigger ones [22:05:51] Kind of like a link directory, and on the page the link shows a red even though the page exists... [22:06:23] page with link in red is 13,090 bytes in size. [22:06:57] Havne't yet noticed if trend is toward larger pages.. I will look and see recent edits to see if that is the trend. 
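For reference, bawolff's "slightly hacky" workaround for Bloupfuh's UNIQ…QINU problem above, written out inside a sketch of a parser-function callback. Only the global-parser swap around getSection() comes from the discussion; the surrounding scaffolding is illustrative:

    <?php
    function renderIntroTransclude( Parser $parser, $pageName = '' ) {
        $title = Title::newFromText( $pageName );
        if ( !$title || !$title->exists() ) {
            return '';
        }
        $content = WikiPage::factory( $title )->getContent();
        if ( !$content ) {
            return '';
        }

        // getSection() goes through the global $wgParser; calling it in the middle of
        // rendering the including page is what leaves UNIQ...QINU strip markers behind.
        // Swapping in a throwaway Parser keeps the outer parse's state intact.
        global $wgParser;
        $oldParser = $wgParser;
        $wgParser = new Parser;
        $section0 = $content->getSection( '0' );
        $wgParser = $oldParser;

        if ( !$section0 ) {
            return '';
        }
        // Hand the intro's wikitext back so the including page parses it normally
        return array( $section0->getNativeData(), 'noparse' => false );
    }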
[22:08:16] The link page which has the red link is 2,884 bytes [22:08:39] the target size is what matters [22:08:52] but again, this is just speculation :) it's wrong in any case [22:09:11] The target pages is teh 13,000 bytes page... [22:09:25] looking for more to see if there is a trend here. [22:26:35] Well, while I've had reports of this, I'm only seeing the once instance... so this is weird... [22:28:13] And it's the only page over 10,000 bytes [22:28:26] At least that's been created in past 7 days [22:29:07] CowboyPride: If you make a dummy edit to the page that has the redlink on it (not the page being linked to, the page with the link on it), does the red link turn blue? [22:29:14] If so, run refreshLinks.php [22:29:21] Yes it does... [22:29:32] Ok, thanks I'll try that. [22:37:29] Wow! This is gonna take a while... But hopefully I'll only have to do it once. [22:38:34] sorry, should of mentioned that, the script is going to take probably about a day to run [22:39:17] 9992 page id's over here... [22:45:06] hmm, maybe not a day. Maybe more like an hour [22:47:57] bawolff: Still showing red... Tsk... [22:48:20] Hmm, well that should of been the equivalent of doing a dummy edit to all pages [22:48:26] Sorry in advance for the paste everyone... [22:48:29] 9900 [22:48:29] Retrieving illegal entries from pagelinks... 0..1 [22:48:29] Retrieving illegal entries from imagelinks... 0..0 [22:48:29] Retrieving illegal entries from categorylinks... 0..0 [22:48:31] Retrieving illegal entries from templatelinks... 0..0 [22:48:33] Retrieving illegal entries from externallinks... 0..0 [22:48:36] Retrieving illegal entries from iwlinks... 0..0 [22:48:38] Retrieving illegal entries from langlinks... 0..0 [22:48:41] Retrieving illegal entries from redirect... 0..0 [22:48:44] Retrieving illegal entries from page_props... 0..0 [22:48:50] !pastebin [22:48:51] To avoid overflowing the channel with inane amounts of text, use https://dpaste.org/ or other awesome pastebin sites to share code, errors, and other large texts. [22:49:59] Well didn't work so I guess I'll dig around so more or just make manual dummy edits on every page that is broken like this. [23:12:20] hello [23:12:57] is this the right place for questions about the wikipedia api? [23:13:17] Taggnostr: Its one of many right places [23:13:22] Taggnostr: what's your question? [23:13:38] I'm doing this query: http://en.wikipedia.org/w/api.php?action=query&prop=extracts&exsentences=1&titles=CPython&format=json&explaintext&redirects [23:13:47] this is the page http://en.wikipedia.org/wiki/CPython [23:14:00] I would like to get the first sentence of the first paragraph, not the note at the top [23:14:06] is there a way to do it? [23:14:44] Taggnostr: i know popups does that somehow. check their code [23:15:16] jackmcbarn, what popups? and do you know where I can find their code? [23:15:26] you may have to post process the results [23:15:50] https://en.wikipedia.org/wiki/MediaWiki:Gadget-popups.js [23:16:14] bawolff, if I use &exsentences=2 I get the second one too, but there's no way to tell that the first one should be excluded [23:16:34] maybe if I use html instead of plain text I will get some hint? [23:16:51] jackmcbarn, thanks [23:23:49] jackmcbarn, afaics popups doesn't use &exsentences or &exchar -- maybe it get the whole content and extracts it manually [23:23:57] probably [23:24:07] i know it does it; i'm not sure exactly how [23:25:25] jackmcbarn, do you know if there's an online demo for popups that I can try to check what it requests? 
[23:25:37] Taggnostr: just enable it in your wikipedia preferences [23:25:44] Taggnostr: i'd recommend fixling a bug against the TextExtracts extension :) [23:25:50] filing* [23:26:04] jackmcbarn, do I need an account on wikipedia to do it? [23:26:09] Taggnostr: yes [23:26:29] (there are ways you don't, but they're all harder) [23:30:43] jackmcbarn, looks like it's parsing the page using regex [23:49:46] ok, I solved by getting the html and looking for the first
<p>
[23:49:54] so far everything seems to work ok [23:50:26] this is the URL: http://en.wikipedia.org/w/api.php?action=query&prop=extracts&titles=CPython&format=json&redirects [23:50:47] exsentences doesn't work anymore, and I have to find the end of the sentence on my own
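A minimal sketch of the approach Taggnostr ends up with — request the HTML extract and keep the first <p>, since exsentences alone can't skip the note at the top of the article. PHP is used here to match the rest of the page; the parsing details (and the missing User-Agent a real client should send) are assumptions, not Taggnostr's actual code:

    <?php
    // Query from the discussion: full HTML extract for CPython
    $url = 'http://en.wikipedia.org/w/api.php?action=query&prop=extracts'
        . '&titles=CPython&format=json&redirects';
    $data = json_decode( file_get_contents( $url ), true );
    $page = reset( $data['query']['pages'] ); // single title, so take the only page
    $html = $page['extract'];

    // The hatnote at the top doesn't come wrapped in <p>, so the first <p> is the
    // lead paragraph; finding the end of the first sentence is still up to the caller.
    if ( preg_match( '!<p>(.*?)</p>!s', $html, $m ) ) {
        echo strip_tags( $m[1] ), "\n";
    }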