[00:59:36] Anyone here familiar with LST?
[00:59:38] User:ShakespeareFan00/Sandbox/Ruffhead
[01:00:09] It SHOULD be loading two sections (one in Latin and one in English), but it seems to sometimes only load the English portion..
[01:00:24] https://en.wikisource.org/wiki/User:ShakespeareFan00/Sandbox/Ruffhead
[01:00:37] I've pared the example down to the absolute minimum
[01:06:46] How many API requests per second could I do?
[02:08:34] !brain
[02:08:34] Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[02:08:35] !botbrain
[02:08:35] Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[02:08:35] !bot
[02:08:35] A bot is an automatic process which interacts with MediaWiki as though it were a human editor and is designed to simplify repetitive actions too cumbersome to do manually. An overview of how to create a bot and a list of known frameworks can be found at http://en.wikipedia.org/wiki/Wikipedia:Creating_a_bot
[02:08:39] 0_o
[04:05:18] I have an old MediaWiki database that I want to bring online. I've installed MediaWiki, but need some help understanding where the configurations need to be re-connected in Apache and in MediaWiki. Is there a good description somewhere of the files in /etc/apache, /etc/mediawiki and corresponding files under /var?
[04:11:43] pac1: are you using a package version of mediawiki from your OS repo?
[04:13:34] p858snake, yes, Ubuntu 15.10
[04:15:40] I'm trying to puzzle out how the /etc and /var directories relate to each other.
[04:16:13] i would recommend just using the tarball install compared to the package
[04:18:09] p858 thanks for the hint.
[04:18:30] It's late, I'm tired, I'll figure this out tomorrow.
[04:53:30] Is there a way to have brackets in external link titles? ie.
[[http://google.com [something] something else]]
[04:55:10] er, meant single-brackets on the outside of that, not double
[04:56:48] nvm, figured out it works in that too :p
[13:12:56] Thanks Nikerabbit for the summary https://phabricator.wikimedia.org/T113210#1900470
[13:36:30] morning
[16:33:54] Funny how nobody is interested in tables on weekends :) http://stats.grok.se/www.w/latest30/Help:Tables
[16:35:12] replace tables with * ?
[16:43:47] Vulpix: not really, for instance http://stats.grok.se/www.w/latest30/Help:FAQ doesn't follow the same pattern
[16:45:10] Nemo_bis: try the graph for 60 days
[16:55:36] is there a magic word for the user's language preference?
[17:06:58] n/m found it
[17:08:20] Vulpix: fair enough, still that's a factor of 2 and not a factor of 3 :)
[17:08:36] even less on pages like http://stats.grok.se/www.w/latest60/Download
[18:14:42] https://meta.miraheze.org/w/load.php?debug=false&lang=nl&modules=ext.echo.badgeicons%7Cext.echo.styles.badge%7Cext.uls.nojs%7Cmediawiki.feedlink%2Chelplink%2CsectionAnchor%7Cmediawiki.legacy.commonPrint%2Cshared%7Cmediawiki.skinning.interface%7Cmediawiki.special.changeslist%7Cmediawiki.special.changeslist.legend%7Cskins.vector.styles&only=styles&skin=vector
[18:14:42] gives all kinds of errors and missing modules
[18:15:12] Does anybody know how I could try to debug this issue?
[18:17:04] Is your cenodr repo in sync with core?
[18:17:45] !debug
[18:17:45] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[18:17:46] Too
[18:17:48] cenodr?
[18:17:54] Vendor
[18:18:09] it should be.. /me looks again
[18:18:40] It is in sync.
HHVM debug logs do not say anything suspicious
[18:19:10] hm, I guess I found the culprit
[18:19:35] [resourceloader] Generating module package failed: [Exception Less_Exception_Parser( /srv/mediawiki/w/vendor/oyejorge/less.php/lib/Less/Parser.php:447) Less.php cache directory isn't writable: /srv/mediawiki/w/cache] <- it is not cool this isn't in the HHVM error logs though
[18:19:57] File a bug
[18:20:19] should it be in the error log? :)
[18:21:04] Well, there's the error logs, and what's shown to the world
[18:21:28] The latter would be a security issue, as a path disclosure
[18:22:33] I don't see path disclosure as a security issue (like I don't see some other stuff as security issues too, but that doesn't matter), but yeah
[18:22:59] exceptions in RL usually don't give that information unless $wgShowExceptionDetails = true;
[18:23:41] People do though. And they report it
[18:23:47] For WMF we don't care either
[19:56:47] Right.
[19:57:03] Left.
[20:02:08] mhutti1: Can you not run tests locally?
[20:02:39] Reedy: I would like to, but have no idea how to run the zend test...
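The exception above is ultimately a filesystem problem: Less.php bails out when its cache directory isn't writable by the web server user. A minimal sketch of that precondition, using a throwaway temp path rather than the real /srv/mediawiki/w/cache (the fix on the affected wiki would be making that directory writable by the HHVM user):

```php
<?php
// Sketch of the precondition behind the ResourceLoader exception above.
// The directory here is a throwaway temp path, not the real wiki's cache.
$cacheDir = sys_get_temp_dir() . '/less-cache-demo';

if ( !is_dir( $cacheDir ) ) {
	mkdir( $cacheDir, 0775, true );
}

if ( !is_writable( $cacheDir ) ) {
	// Same condition Less.php reports, phrased as in the log line.
	throw new Exception( "Less.php cache directory isn't writable: $cacheDir" );
}

echo "cache directory ok\n";
```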
[20:03:16] If you've got zend php installed, you can run them
[20:03:28] All it means is whether they're run under hhvm or under zend php
[20:04:25] Reedy: I spent an hour looking stuff up, but from what I found the ubuntu php command already uses zend, yet on my end it works but jenkins keeps failing it
[20:04:37] Ah, that's interesting :)
[20:04:51] hi guys
[20:05:39] Reedy: I know debugging it in the cloud's a bad idea, but I really have no idea what's wrong :(
[20:05:54] YMMV
[20:06:01] I wasn't sure if you were having issues running them
[20:06:27] Let me try it
[20:07:34] Reedy: From my tests the problem is this array only has 1 namespace in it https://integration.wikimedia.org/ci/job/mediawiki-phpunit-zend/9995/consoleFull but in the hhvm version it's alright
[20:07:46] Reedy: and the xml string is right on both
[20:10:14] Seems my unit tests are broken too
[20:10:16] Helpful
[20:13:19] How do I run it using a phar again...
[20:13:26] * The_Photographer prefers using Jenkins with Java
[20:13:43] --with-phpunitdir
[20:22:31] mhutti1: ok, fixed that issue :P
[20:25:14] and the problem was...
[20:26:24] phpunit-old is needed for PHP 5.3
[20:27:03] https://gerrit.wikimedia.org/r/261159
[20:27:06] Screw renaming it
[20:29:05] Oh ok
[20:29:25] I understand the problem now
[20:29:31] Reedy: So it should work now if I recheck it? Or did you solve a problem so you could test my problem? I'm confused :)
[20:29:47] Yeah, I solved a problem that prevented me from running it :P
[20:29:53] Wait for it to merge, then rebase it ;)
[20:30:15] well done Reedy
[20:32:34] mhutti1: mind if I hit rebase on your patch?
[20:32:48] Reedy: Sure, I have never done that before
[20:33:59] You might want to go to master and then git review -d 260968
[20:35:38] Reedy: So zend should now pass it?
[20:35:47] Nope
[20:36:05] Reedy: So my test will now fail it?
[20:37:00] With the var_dump removed... It's marking your test as risky
[20:37:03] echo?
[20:37:16] Reedy: the var_dump was just to debug
[20:37:22] yeah
[20:37:29] but like I say, it's still saying it's risky
[20:37:45] Reedy: what do you mean by risky?
[20:37:52] OK, but incomplete, skipped, or risky tests!
[20:37:53] Tests: 2, Assertions: 4, Risky: 1.
[20:38:05] 1) ExportTest::testPageByTitle with data set #0 ('UTPage')
[20:38:05] This test printed output:
[20:38:09] It doesn't like your echo
[20:38:21] echo $xmlString;
[20:38:25] Is that debugging?
[20:38:27] Reedy: yeah
[20:38:40] Reedy: But without those it should work?
[20:38:46] yeah, it does
[20:39:33] * Reedy tries hhvm too
[20:41:29] Passes there too
[20:42:47] mhutti1: It looks like it's failing on hhvm (after my rebase) because of the var_dump and echo
[20:42:59] Reedy: Yeah, it was doing that on mine
[20:43:19] on jenkins that is
[20:44:01] Reedy: So should I run check zend?
[20:44:08] Could do
[20:44:36] Was going to suggest removing the debugging, and see what crops up then
[20:45:53] Reedy: Well, it takes me about 5 mins to work out how to get my local repository rebased
[20:46:19] mhutti1: you can just do...
[20:46:37] Are you not using a branch? Just straight onto master?
[20:46:52] You can just rebase the whole thing...
[20:46:56] git fetch --all
[20:46:59] git rebase origin
[20:47:02] or
[20:47:05] git reset HEAD~1 --hard
[20:47:07] git pull
[20:47:17] git review -d 260968
[20:49:43] Reedy: thanks
[20:49:49] Reedy: Will save this
[20:50:08] Just removed the CR -2 too
[20:50:12] As that was to block a merge it seems
[20:50:47] Reedy: Ty
[20:52:02] is is_object() better than gettype() === 'object'?
[20:53:11] Reedy: You tell me :) You are an expert programmer... I am a complete novice; everything I have learnt is from stackoverflow :P
[20:53:55] I think it would be...
[20:54:20] gettype() is mostly used in MW where we need the type in a string..
to output or whatever
[20:54:43] Reedy: kk, once zend hopefully accepts it I will change it to that
[20:54:48] :)
[20:55:57] Reedy: It's failed it again :(
[20:58:20] Patch Set 10: Verified+2
[20:59:05] Let's try again
[20:59:21] I wonder if it's a PHP 5.3 issue
[20:59:23] Reedy: It's the same error I was initially trying to debug
[20:59:53] Reedy: For some reason it's not recognising that the array $xmlNamespaces has anything in it.
[21:00:46] jenkins is known to be a dick too ;)
[21:01:00] let me try a machine with php 5.3 on
[21:01:26] Reedy: Ty :)
[21:01:34] aka debugging on the WMF cluster ;)
[21:03:06] lol, I heard that StackOverflow was blocked on some defense contractor networks
[21:03:28] According to @SwiftOnSecurity this is why the F-35 doesn't work
[21:03:43] OH-: Not surprising :P
[21:06:26] Fatal error: Call to a member function getNamespace() on a non-object in /srv/mediawiki-staging/php-1.27.0-wmf.9/includes/Export.php on line 166
[21:06:40] That looks... suspect
[21:07:29] Reedy: 166? Doesn't exist
[21:07:45] 'page_namespace=' . $title->getNamespace() .
[21:07:51] Null being passed to pageByTitle ?
[21:08:05] Oh
[21:08:06] $title = Title::newFromText( $pageTitle );
[21:08:18] duh
[21:09:13] Reedy: So the dataprovider's not working correctly or UTPage is not a test page
[21:09:19] No
[21:09:23] It's me live hacking stuff
[21:11:11] mhutti1: http://p.defau.lt/?aRHuKCNl7qzVeHE818rH4A
[21:11:43] Looks a LOT like a PHP bug to me
[21:12:00] Reedy: Yeah, but that makes no sense because the actual array contains like 19 elements
[21:12:13] PHP sucks :D
[21:14:11] mhutti1: the ->namespace at the end breaks it
[21:14:14] on 5.3
[21:15:07] Reedy: Yeah, I thought it might when I wrote it.
The workaround I came up with was doing a string replace to rename it and then using a different name, but it seemed rather crude
[21:16:42] Reedy: Unless there's some way to escape the special name
[21:16:50] Just looking
[21:20:29] There's gotta be a trick for this
[21:20:58] Reedy: I tried looking when I wrote it but couldn't find a single case of it being a problem before
[21:21:32] How do we import it?
[21:21:49] Using our xml reader
[21:22:23] Reedy: I'll try looking
[21:22:44] Not our reader... XMLReader
[21:24:25] mhutti1: Ah
[21:24:27] found it, I think
[21:24:52] nope
[21:24:54] UGH
[21:30:55] mhutti1: str_replace seems an easy, but hacky fix :P
[21:30:55] Reedy: It might just be worth me editing the string and then using the new name to reference it. I really doubt there's another way. Unless there's another bug to counteract the bug we found
[21:31:14] Reedy: I might as well do it and check it works
[21:31:21] hang on, might have a better fix
[21:31:31] kk
[21:33:31] $foo = (array)$xmlObject->siteinfo->namespaces;
[21:33:43] var_dump( $foo['namespace'] );
[21:33:58] The Special namespace is a bit iffy
[21:34:00] But that should work
[21:34:13] Leave a comment in here about 5.3 compat so we can fix it up at a later date
[21:34:54] ? how
[21:35:06] how for what bit?
[21:35:11] the comment
[21:35:30] // FIXME: PHP 5.3 support. When we don't support PHP 5.3, replace with...
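Reedy's better fix can be seen end-to-end in a standalone script. This is a minimal sketch, not the actual ExportTest code: the XML is a made-up two-namespace stub resembling a MediaWiki export, and the variable names just follow the snippets pasted in the channel. The point is that the `(array)` cast never spells out the `namespace` property name, which is what reportedly trips up SimpleXML on PHP 5.3:

```php
<?php
// Sketch of the workaround pasted above. The XML below is a hypothetical
// stub shaped like a MediaWiki export, not data from the real test.
$xml = '<mediawiki><siteinfo><namespaces>'
	. '<namespace key="-1">Special</namespace>'
	. '<namespace key="0"/>'
	. '</namespaces></siteinfo></mediawiki>';

$xmlObject = simplexml_load_string( $xml );

// FIXME: PHP 5.3 support. On 5.3, $xmlObject->siteinfo->namespaces->namespace
// breaks; casting the parent element to an array sidesteps the property name.
$xmlNamespaces = (array)$xmlObject->siteinfo->namespaces;
$xmlNamespaces = $xmlNamespaces['namespace'];

// Text-only children cast to strings, empty ones stay SimpleXMLElement
// objects - matching the var_dump output quoted later in the channel.
echo count( $xmlNamespaces ), "\n"; // prints "2"
echo (string)$xmlNamespaces[0], "\n"; // prints "Special"
```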
[21:35:40] kk cool
[21:36:36] A FIXME like that isn't your fault
[21:37:49] mhutti1: Looks like that should be good
[21:37:54] Ignore my comment about the special namespace
[21:38:04] We end up with string(7) "Special"
[21:38:04] [2]=>
[21:38:04] object(SimpleXMLElement)#247 (1) {
[21:38:07] Anyway, on newer PHP
[21:38:10] So should be all good
[21:42:11] Reedy: So I end up with 2 comments next to each other; do I do a block comment or 2 //'s?
[21:42:23] You can do either
[21:42:28] Doesn't matter too much :)
[21:43:08] kk
[21:46:26] Let me know when you fix that and the gettype
[21:46:31] See if we can get this merged tonight :)
[21:49:58] I'm still trying to find out why "namespace" breaks simplexml in php 5.3 :D
[21:50:44] lol
[21:50:48] mhutti1: You missed a line
[21:51:04] Reedy: I'll finish it, wikiradio
[21:51:09] $xmlNamespaces = (array) $xmlObject->siteinfo->namespaces;
[21:51:18] $xmlNamespaces = $xmlNamespaces['namespace'];
[21:51:32] FlorianSW: PHP sucks. Duh
[21:51:35] The_Photographer: Congrats!
[21:51:48] Reedy: I didn't need to do that 'cause I just added it in the str_replace, right?
[21:51:57] oh, duh
[21:52:02] yeah, should be good
[21:52:02] heh
[21:52:31] just need to add a new line to the comment
[21:52:36] PHP, my darling :P
[21:55:28] Yay, zend passed it
[21:55:48] 21:50:33 50 | WARNING | Line exceeds 100 characters; contains 107 characters
[21:55:49] heh
[21:56:40] mhutti1: You might want to fix FlorianSW's comment too this time
[21:56:42] ;)
[21:56:49] Reedy: Done
[21:56:58] :D
[21:57:09] Reedy: At least one of them. I can remove the dataprovider if you want?
[21:57:28] I don't know what our usual way is...
[21:57:37] Do we use dataproviders elsewhere for only one page?
[22:00:30] FlorianSW: ^^
[22:01:00] uhm, I'm not sure; in my opinion it's useless :)
[22:01:51] Well, yes or no :P
[22:02:06] mhutti1: remove the dataprovider, it can be added later, too :D
[22:02:16] FlorianSW: cool
[22:05:26] Now we wait ;)
[22:05:57] Reedy: Ok, I think it's done
[22:09:44] btw is there any reason why I can't find you guys on mediawiki-dev
[22:10:16] #wikimedia-dev
[22:10:30] I'm there and thought you were offline mhutti1 :D
[22:10:59] Idk, I tried an online version as well but I can only see 8 people in the channel
[22:11:17] wikimedia, not mediawiki
[22:11:24] ahh
[22:12:07] I'm on my laptop away from home so I didn't have the channels saved
[22:12:48] yay
[22:14:00] mhutti1: Did a GCI task get filed for it?
[22:14:33] Reedy: Yeah, ttos my mentor but he's on +11 or something
[22:14:49] Just tagging the bug as GCI then ;)
[22:15:47] Wait for check zend to finish, then I'll +2
[22:16:11] Reedy: Cool :)
[22:32:16] Reedy: Success :)
[22:39:10] mhutti1: The more I think about it, the more I think I've seen/discussed this issue with someone else before
[22:39:14] Maybe a few years ago
[22:39:55] Reedy: I'm just surprised there's nothing on the internet about it
[22:40:13] googling for namespace in xml is just false positives for this issue
[22:40:56] Reedy: Yeah. I would say I'm pretty good at googling, but google really falls down on false positives
[23:07:38] mhutti1: I wonder how much coverage you will have added? :P
[23:08:00] Reedy: coverage?
[23:08:10] https://integration.wikimedia.org/cover/mediawiki-core/master/php/includes_Export.php.html
[23:08:39] Wasn't that the point of the task? ;)
[23:09:03] "Currently, Export.php and SpecialExport.php have 0% test coverage"
[23:09:05] Reedy: Does it automatically compute it?
[23:09:13] Yeah
[23:09:17] I think it's updated every 24 hours or so
[23:10:00] Reedy: Ah. What exactly does it do?
[23:10:21] Shows you what lines/% of lines are covered by MW's phpunit tests
[23:10:30] ie what lines are executed during a run of all the core phpunit tests
[23:10:52] Reedy: Ah, so it tells you what code's executed and assumes that if it were to break, the test would fail
[23:11:54] Not quite
[23:12:02] It's so you can tell how much of your code is programmatically tested
[23:12:42] Reedy: Cool. There's an awful lot of red in mediawiki-core
[23:12:47] There is
[23:13:33] 7.31% of lines
[23:13:54] JS is probably better covered
[23:14:19] I didn't even know what a unit test was before starting this task :P How important are they?
[23:14:43] Importance is relative
[23:14:54] Ideally, all of MW core should be 70-90% covered
[23:15:10] Newer code generally tends to have coverage
[23:15:30] Some projects require unit tests to give coverage of any bugfixes or new features
[23:15:48] In the ideal world, with sufficient coverage, you either find code that you can remove as you don't need it
[23:16:00] Or know that your tests will catch most possible failures
[23:16:21] Yeah, I guess it's also pretty good for catching redundant code
[23:16:56] Some of the code will only really be used by extensions etc too... So numbers might be a bit higher
[23:20:41] mhutti1: Feel free to keep writing tests :P
[23:22:26] Reedy: I must admit it's quite fun. Having to anticipate where things might go wrong and writing concise code to catch as much as possible.
[23:22:41] It'll be interesting to see how much coverage you add
[23:23:00] I was just looking if I could trigger the CI system to update the report, but I can't see where it is
[23:23:22] mhutti1: unit tests are also useful when writing libraries, when you want a function to give you some result
[23:23:48] mhutti1: for mathematical or linguistic operations, you generally know what it should return before writing the code
[23:24:11] See also: Test Driven Development
[23:24:13] mhutti1: so if you start with the tests, it helps to plan all the cases, including the tricky ones like 0, null, ""
[23:26:08] Wow, that's really cool. I guess it's designed to force you to think through what you want before working out the best way to get it
[23:26:59] You can do that to some extent with new MW features, but it's a little harder with codebases like that
[23:27:08] A lot of the code is hard to test, because it was never designed to be
[23:28:27] https://scrutinizer-ci.com/g/wikimedia/mediawiki/
[23:30:25] That data appears to be old?
[23:31:48] Yeah
[23:31:54] Just looking how to get newer data out of it :P
[23:32:15] https://scrutinizer-ci.com/g/wikimedia/mediawiki-extensions-Wikibase/ is a better example
[23:33:10] Not free either
[23:33:52] well I guess they have some pretty cool algorithms for determining complexity
[23:34:08] There's numerous free tools about
[23:34:19] I wouldn't be surprised if they're mostly charging for the shininess etc on top
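The test-first planning described above (write the tricky cases - 0, null, "" - before the code) can be sketched in a few lines. normalizeTitle() here is entirely hypothetical, not a MediaWiki function; the assertions were written before the body, in the spirit of the TDD discussion:

```php
<?php
// Hypothetical test-first sketch: the assertions at the bottom were written
// before the function body, covering the edge cases mentioned in the
// channel (0-ish values, null, empty string). Not a real MW function.
function normalizeTitle( $input ) {
	if ( $input === null || $input === '' ) {
		return false; // reject genuinely empty input
	}
	return ucfirst( trim( (string)$input ) );
}

assert( normalizeTitle( null ) === false );
assert( normalizeTitle( '' ) === false );
assert( normalizeTitle( '0' ) === '0' ); // "0" is falsy in PHP but a valid title
assert( normalizeTitle( ' main page' ) === 'Main page' );
echo "all edge cases pass\n";
```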