[00:04:11] 03kaldari * 10/trunk/extensions/ProofreadPage/ProofreadPage.php: attempt to fix bug 33873 - localBasePath shouldnt end with a slash [00:10:10] 03jeroendedauw * 10/trunk/extensions/CreatePage/ (6 files): importing CreatePage extension [00:18:43] 03(mod) Email abuse - filter to stop? - 10https://bugzilla.wikimedia.org/33761 +comment (10john) [00:22:02] New code comment: Reedy; Tested locally, works fine in non debug mode :); [00:22:22] Zaran, ^ [00:23:05] Reedy : looks interesting [00:23:10] but it was already working locally [00:23:34] lol, it works fine for me locally in non debug mode [00:23:43] can you update it on labs? [00:24:04] not sure [00:24:06] hexmode can [00:26:21] ok Reedy : i just did [00:26:57] still the same problem [00:27:02] bleh [00:33:29] 03(mod) Email abuse - filter to stop? - 10https://bugzilla.wikimedia.org/33761 +comment (10rolandrance) [00:34:30] 03(mod) Proofread extension JS loading fully broken on 1.19 - 10https://bugzilla.wikimedia.org/33873 +comment (10Ryan Kaldari) [00:37:02] 03(NEW) Special:NovaProject needs a table of contents - 10https://bugzilla.wikimedia.org/34196 enhancement; MediaWiki extensions: OpenStackManager; (sam) [00:50:39] Zaran: Pushed your patch to labs, as well r110691. JS still isn't loading, but maybe you'll have more luck troubleshooting it now. [00:56:55] kaldari : which patch ? [00:57:11] I think it was already pushed yesterday [00:57:26] the one to make mediawiki.legacy.wikibits a dependancy in ProofreadPage [00:57:50] oh ok, I thought it was pushed [00:57:58] Maybe it was [00:58:10] r110691 is new though [00:58:29] Reedy reports that it's working for him locally [00:59:10] and of course it's extra hard to troubleshoot on labs since it only happens when you have debug off :( [00:59:16] right [01:29:56] is there a proper way to phrase this idiom in a way that mysql will accept? [01:29:58] cast( user_touched AS timestamp ) >= '20110905000000' [01:30:02] it doesn't like that :> [01:30:13] says syntax error at 'timestamp' [01:30:13] What are you trying to do? [01:30:31] You shouldn't need to cast the timestamp if you're just doing a time comparison. [01:30:37] get a timestamp from the user_touched value (which is binary(14)) and compare it against a minimum time [01:30:46] You can just do user_touched >= '20110905000000'. [01:30:49] ahh [01:30:51] ok [01:30:59] didn't know if binary() type supported collation [01:31:01] It's just important that you quote the value. [01:31:08] Otherwise it'll be slow as shit. [01:31:28] If you want to format the user_touched value in the select, you can use... [01:31:45] DATE_FORMAT() [01:31:56] https://wiki.toolserver.org/view/Queries [01:32:15] actually I'm aiming for a count of users since opening with a certain edit count threshold :> [01:32:54] user_touched isn't very reliable for anything. [01:33:03] It's an internal timestamp that's recalculated for all sorts of strange reasons. [01:33:14] hmm [01:33:27] You'd want... user.user_editcount and... [01:33:44] user.user_registration, I guess? 
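A minimal sketch of the user count discussed above, using MediaWiki's database wrapper; the 10-edit threshold is only an example, and the timestamp string is quoted as advised:

    // Count accounts registered since a cut-off date that have reached an edit threshold.
    // Run against a replica; user_touched is deliberately avoided, per the discussion above.
    $dbr = wfGetDB( DB_SLAVE );
    $count = $dbr->selectField(
        'user',
        'COUNT(*)',
        array(
            'user_editcount >= 10',
            'user_registration >= ' . $dbr->addQuotes( '20110905000000' ),
        ),
        __METHOD__
    );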
[01:34:05] the problem with that is that this database was imported [01:34:16] so there are a lot of old users who are not active in the new wiki [01:34:53] and on the other hand, people have reclaimed their old accounts by sending me their email address so they can set a new password, so there have been relatively few registrations [02:14:13] 04(REOPENED) Add support for non-Arabic number systems - 10https://bugzilla.wikimedia.org/34193 +comment (10Shiju Alex) [02:55:21] 03(mod) importScript, importScriptURI, and addPortletLink errors - 10https://bugzilla.wikimedia.org/34147 summary (10Mark A. Hershberger) [02:55:36] 03(mod) importScript, importScriptURI, and addPortletLink errors - 10https://bugzilla.wikimedia.org/34147 (10Mark A. Hershberger) [03:37:31] Is there a parser function that switches a piece of text with another piece of text? That is, if the {{{gender}}} value is Male, it will display Son, if it is Female, it will display Daughter, if nothing is entered, it displays Child. [03:37:33] Any ideas? [03:40:35] Nevermind, I think I found it. [04:19:07] 03(mod) Page translation with external link needing CAPTCHA - 10https://bugzilla.wikimedia.org/34182 +comment (10niklas.laxstrom) [04:27:40] 03(mod) RevisionDeletion-suppress and log entries don't play nice - 10https://bugzilla.wikimedia.org/33167 +comment (10niklas.laxstrom) [04:50:36] 03(mod) Add support for non-Arabic number systems - 10https://bugzilla.wikimedia.org/34193 +comment (10Jayanta Nath) [05:58:19] hi, is there anyway to block an IP that's abusing a wiki, except the IP is the original IP stored in XFF? [05:58:41] it looks like another IP, but when you checkuser it, the true IP shows up in XFF [06:00:18] hello guys, this is a non-related mediawiki question - anyone have hydrairc, how to i export the settings -- hmm maybe there is a hydrairc irc channel? [06:16:02] rum: have a look for a hydra folder under your app data settings (start -> run -> %appdata%) [06:17:57] thank you so much mr snake! [06:18:06] have a great weekend! [06:51:13] 03(mod) #time parser function can't read local language month names - 10https://bugzilla.wikimedia.org/19412 +comment (10jayantanth) [07:07:21] 03(NEW) Can't save text including the protocol https on wikimania2012, translating it into Korean - 10https://bugzilla.wikimedia.org/34197 trivial; MediaWiki: Internationalization; (qbphcqdw) [07:18:41] 03(mod) Can't save text including the protocol https on wikimania2012, translating it into Korean - 10https://bugzilla.wikimedia.org/34197 +comment (10niklas.laxstrom) [07:24:55] 03(mod) Adding options to Special:ListUsers (hide permanent and temporary blocks) - 10https://bugzilla.wikimedia.org/33545 +comment (10Aashish Mittal) [07:40:59] Hey guys, I'm trying to add a Traditional Chinese version of my wiki. It's a bit confusing, most of web uses zh-cn and zh-tw, but for my wiki do I need to use zh-hans and zh-hant in my localsettings for the language? [08:04:38] SideSw|pe: Does this help? http://www.mediawiki.org/wiki/Manual:LocalSettings.php/zh-cn [08:05:08] let me check it out, thanks. [08:05:43] Google found it for me with mediawiki zh-cn [09:02:30] hi! just tried latest svn mediawiki. It just return internal error. Exception caughtt inside exception handler. And nothing more, even with $wgShowExceptionDetails='true' [09:03:59] o hai [09:04:03] anyone awake??? 
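For the Traditional Chinese question above: MediaWiki ships message files under its own variant codes, so a LocalSettings.php sketch would look like this (assuming the shipped zh-hant messages are what is wanted; plain 'zh' would instead enable the built-in variant converter):

    // LocalSettings.php
    $wgLanguageCode = 'zh-hant';   // Traditional Chinese interface ('zh-hans' for Simplified)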
[09:04:07] yes [09:04:17] need some help with the SMWHalo install [09:04:34] Fatal error: Call to a member function addMessages() on a non-object in /var/www/mediawiki2/deployment/Deployment.php on line 35 [09:04:35] which issue you have? [09:05:14] which version you have? can you paste that line? [09:05:34] version of mediawiki? [09:05:51] SMWHAlo? [09:06:26] both i think [09:07:10] SMW 1.6.0 [09:07:45] mediawiki 1.18.1 [09:08:17] so, what about that line? [09:08:36] i am sorry , not understanding [09:09:14] line 35 from Deployment.php [09:09:18] oh ok [09:10:04] $wgMessageCache->addMessages($dfgLang->getLanguageArray(), $wgLang->getCode()); [09:10:24] $wgLang? [09:10:29] I have not seen it [09:13:32] i think there is no $wgMessageCache [09:15:36] xvilka: thanks i will hunt, [09:17:07] xvilka: yes ! http://www.mediawiki.org/wiki/Special:Code/MediaWiki/81027 [09:17:27] wgMessageCache is removed [09:22:15] ok i am still lost [09:22:26] Fatal error: Call to undefined method EmptyBagOStuff::addMessages() in /var/www/mediawiki2/deployment/Deployment.php on line 35 [09:23:32] what you've changed? [09:23:42] i have to chance the call to wgMemc [09:24:03] I used $wgMemc instead [09:24:10] bad idea? [09:24:19] looks like [09:25:41] http://www.gossamer-threads.com/lists/wiki/wikitech/262242 [09:44:22] ya $wgMemc ahs nothign to do with messages [09:44:30] has nothing* [09:45:32] 03(mod) PostgreSQL: pg_query() [function.pg-query]: Query failed: ERROR: syntax error at or near "ENGINE" - 10https://bugzilla.wikimedia.org/29555 +comment (10anton.kochkov) [09:50:37] Could someone help me with something regarding ParserFunctions? I don't know where I've gone wrong. http://camphalfblood.wikia.com/wiki/User:MakeShift/workbench/test (and the original code is without the /test) [10:08:12] Anyone? [10:31:05] i am not proficient enough to do this [10:33:30] 03(mod) Thumbnails of PDFs with non-standard page sizes are rendered incorrectly - 10https://bugzilla.wikimedia.org/33722 summary (10This, that and the other) [10:41:17] New code comment: Krinkle; Yeah, that's true. bug 33711/r110542 is not the same problem though (although it does result in the ; [10:50:10] 03(FIXED) developer email should state more clearly that it is public - 10https://bugzilla.wikimedia.org/33826 +comment (10Yuvi Panda) [10:50:53] 03(FIXED) PostgreSQL: pg_query() [function.pg-query]: Query failed: ERROR: syntax error at or near "ENGINE" - 10https://bugzilla.wikimedia.org/29555 +comment (10Anton Kochkov) [10:53:40] 03(mod) Sending multiple http status code headers kills fastcgi causing misery (and ugliness since RL does this) - 10https://bugzilla.wikimedia.org/12124 +comment (10jukey) [10:56:10] I guess i should file a bug report :( [11:14:17] 03(NEW) first item in the history shows no information about its size - 10https://bugzilla.wikimedia.org/34198 normal; MediaWiki: History/Diffs; (umherirrender_de.wp) [11:30:59] Hi all. I'm working on my extension and want to check via API if a page exists (external API). I've just tried this but I don't know exactly what should I do now ($pageexistence_data['pages']['...']...): $pageexistence = file_get_contents( $url . $prefix . '/api.php?action=query&format=json&titles=Help:' . 
$page->page_title ); $pageexistence_data = json_decode( $pageexistence, /*assoc=*/ true ); [11:32:08] Existent page: http://www.wmx-developing.org/w/api.php?action=query&titles=Help:Staff&format=json nonexistent page: http://www.wmx-developing.org/w/api.php?action=query&titles=Help:Sff&format=json [11:32:51] There's a "missing":"" and "pageid" is not included at nonexistent pages [11:38:45] 03(mod) Enable e-mail notifications for watchlist (EnotifWatchlist) on all small wikis - 10https://bugzilla.wikimedia.org/28026 +comment (10federicoleva) [11:41:50] 03(mod) login-progress message should be in past tense, not present - 10https://bugzilla.wikimedia.org/34185 +comment (10post) [11:44:22] 03jeroendedauw * 10/trunk/extensions/CreatePage/CreatePage.php: allow for inline display [11:47:51] Hi all. I'm working on my extension and want to check via API if a page exists (external API). I've just tried this but I don't know exactly what should I do now ($pageexistence_data['pages']['...']...): $pageexistence = file_get_contents( $url . $prefix . '/api.php?action=query&format=json&titles=Help:' . $page->page_title ); $pageexistence_data = json_decode( $pageexistence, /*assoc=*/ true ); Existent page: http://www.w [11:47:51] mx-developing.org/w/api.php?action=query&titles=Help:Staff&format=json nonexistent page: http://www.wmx-developing.org/w/api.php?action=query&titles=Help:Sff&format=json There's a "missing":"" and "pageid" is not included at nonexistent pages [11:52:24] 03(mod) Wikipedia Android app should have a setting to turn off images - 10https://bugzilla.wikimedia.org/33866 +comment (10tfinc) [11:59:41] 03ialex * 10/trunk/extensions/ (8 files in 6 dirs): svn:eol-style native [12:17:30] 03cervidae * 10/trunk/extensions/ReassignEdits/ReassignEdits_body.php: Removed bug-causing and redundant things at $\olduser [12:29:01] 03(mod) first item in the history shows no information about its size - 10https://bugzilla.wikimedia.org/34198 +comment (10umherirrender_de.wp) [12:32:50] 03(mod) WIkipedia Android app forward button at times does nothing - 10https://bugzilla.wikimedia.org/33280 +comment (10yuvipanda) [12:37:28] Krinkle: You have helped me in the past. I hope you can also help me today :-) I'm working on my extension and want to check via API if a page exists (external API). I've just tried this but I don't know exactly what should I do now ($pageexistence_data['pages']['...']...): $pageexistence = file_get_contents( $url . $prefix . '/api.php?action=query&format=json&titles=Help:' . $page->page_title ); $pageexistence_data = json_decod [12:37:28] e( $pageexistence, /*assoc=*/ true ); Existent page: http://www.wmx-developing.org/w/api.php?action=query&titles=Help:Staff&format=json nonexistent page: http://www.wmx-developing.org/w/api.php?action=query&titles=Help:Sff&format=json There's a "missing":"" and "pageid" is not included at nonexistent pages [12:38:42] Tim_Weyer: Yes [12:38:50] So looks like you already know [12:38:53] what are you asking ? [12:39:31] Timhttp://www.wmx-developing.org/w/api.php?action=query&titles=SomeOddPagename|Help:Staff&format=jsonfm&indexpageids [12:39:33] Tim_Weyer: http://www.wmx-developing.org/w/api.php?action=query&titles=SomeOddPagename|Help:Staff&format=jsonfm&indexpageids [12:39:50] (jsonfm = human readable preview, use json in the real request) [12:39:58] I want to check if an external page does exist or not. I've tried it with: $pageexistence = file_get_contents( $url . $prefix . '/api.php?action=query&format=json&titles=Help:' . 
$page->page_title ); if ( strpos($pageexistence,'missing":"') == false ) { but it didn't work [12:40:10] Tim_Weyer: Do not use strpos [12:40:36] yes, firstly it doesn't work and secondly I think it's the wrong way [12:41:30] how can I check "missing":"" ? [12:41:46] I'm a probie at APIs [12:43:43] No. of commits from the beach today: 1 [12:44:59] Tim_Weyer: http://pastebin.com/B02fBADZ [12:45:09] Tim_Weyer: isset( $page['missing'] ) [12:45:23] Tim_Weyer: I'd recommend using format=php / unserialize() when working in PHP [12:45:31] json work also, no problem [12:45:40] but is more optimized for javascript [12:46:11] json_decode format=json gives you exactly the same as unseralize format=php [12:51:12] I'm trying it all :-) [13:00:30] hi! I wanted to know if registration for mediawiki gsoc 2012 has begun.. [13:20:10] needed help to run mediawiki with eclpise-php [13:27:58] how to register [13:28:10] i need to register my nick name [13:28:21] i'm new to here can anybody help [13:29:06] 03siebrand * 10/trunk/translatewiki/Wikia/extensions.txt: Remove commented out code. [13:32:35] Krinkle: I overlooked your link and applied it now. It does work. Thanks! [13:56:03] 03raymond * 10/trunk/translatewiki/MediaWiki/WikimediaAgg.yaml: Variablepage unused per r110627 [14:00:48] New code comment: Raymond; Setting to FIXME to get a decision which method is better. I want to enable this file on Translatewi; [14:07:16] mysql not working in mediawiki when opened by Eclipse [14:07:21] Someone please help me [14:07:30] i m stuck with this for days :-| [14:14:07] 03siebrand * 10/trunk/translatewiki/Wikia/extensions.txt: Update extension support for Wikia. [14:21:35] 03(mod) Page translation with external link needing CAPTCHA - 10https://bugzilla.wikimedia.org/34182 (10qbphcqdw) [14:21:35] 03(mod) Can't save text including the protocol https on wikimania2012, translating it into Korean - 10https://bugzilla.wikimedia.org/34197 (10qbphcqdw) [14:29:41] 03(mod) Can't save text including the protocol https on wikimania2012, translating it into Korean - 10https://bugzilla.wikimedia.org/34197 (10qbphcqdw) [14:34:03] 03(mod) Gadgets stopped working on enwiki beta and commons beta - 10https://bugzilla.wikimedia.org/34195 summary; +comment (10Erwin Dokter) [14:36:32] 03jeroendedauw * 10/trunk/extensions/CreatePage/ (COPYING INSTALL README RELEASE-NOTES): added doc files [14:51:11] 03ialex * 10/trunk/phase3/includes/profiler/ProfilerStub.php: Added overrides of logData() and getCurrentSection() in ProfilerStub; just in case. 
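Putting Krinkle's advice above together, a sketch of the existence check (the URL pieces and the Help: prefix are taken from the question; error handling omitted):

    // Ask the remote wiki's API whether a page exists; a missing page carries a 'missing' key.
    $apiUrl = $url . $prefix . '/api.php?action=query&format=json&titles=Help:'
        . urlencode( $page->page_title );
    $data = json_decode( file_get_contents( $apiUrl ), true );

    $exists = true;
    foreach ( $data['query']['pages'] as $pageInfo ) {
        if ( isset( $pageInfo['missing'] ) ) {   // present (as an empty string) only for missing pages
            $exists = false;
        }
    }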
[14:53:45] [14:59:42] 03ialex * 10/trunk/extensions/DumpHTML/dumpHTML.inc: [14:59:42] * Use ProfilerStub instead of a special stub profiler; was throwing E_STRICTs about signature mismatches and isStub() was not returning true [14:59:42] * Pass an empty array to ProfilerSimpleUDP's constructor to not throw E_WARNINGs about missing parameter [14:59:42] * No need to call wfProfileIn( '-total' ); it's already done by the constructor [14:59:43] * Call ProfilerSimpleUDP::logData() to actually log the data; getFunctionReport() doesn't do anything [15:33:23] 03(NEW) Allow filtering action=sitematrix by project code - 10https://bugzilla.wikimedia.org/34199 normal; MediaWiki: API; (danny.b) [15:34:42] 03(NEW) Allow filtering action=sitematrix by language code - 10https://bugzilla.wikimedia.org/34200 normal; MediaWiki: API; (danny.b) [15:40:32] 03(mod) Allow filtering action=sitematrix by language code - 10https://bugzilla.wikimedia.org/34200 (10Sam Reed (reedy)) [15:40:33] 03(mod) Non Core API issues (tracking) - 10https://bugzilla.wikimedia.org/23855 (10Sam Reed (reedy)) [15:40:39] 03(mod) Allow filtering action=sitematrix by project code - 10https://bugzilla.wikimedia.org/34199 (10Sam Reed (reedy)) [15:40:42] 03(mod) Non Core API issues (tracking) - 10https://bugzilla.wikimedia.org/23855 (10Sam Reed (reedy)) [15:42:35] Reedy: what were you playing? [15:43:27] bf3 [15:44:00] Ah [15:44:37] still am actually [15:45:38] 3 sniper kills [15:45:40] go me [15:46:52] lol, and changing bug reports at the same time? :P [15:47:13] indeed [15:47:16] multitask ftw [15:48:38] multimonitor? [15:48:52] 2 machines [15:48:53] 3 screens [15:49:00] i just got knifed [15:49:01] boo [15:49:13] why 2 machines? [15:49:20] desktop and laptop? [15:49:44] yeah [15:49:50] not sure, something i started during my pc rebuild [15:51:00] Hmm, torrent, please hurry up, I'm in the mood to watch some stargate in hd.. :/ [15:51:42] And if by magic, I get a seed at 100kb/s [15:52:31] usenet ftw [15:52:36] ALL THE BANDWITHS [15:53:08] all your bandwidths are belong to us [15:53:21] lol, ok, I'll get usenet for all the seeders... :P [15:54:23] no, you don't have seeders [15:54:29] you have a server with a fuckload of bandwith [15:54:33] i bloody hate this keyboard [15:54:50] ^ #firstworldproblem [15:56:37] another sniper kill [15:56:38] wheeee [15:56:56] * Lcawte spots twitter [15:58:35] Wow, the UK twitter population.. trending currently, Michael Jackson songs, football scores, Lady Gaga, and of course, Justin Bieber jokes.. [15:58:42] more snow [15:59:40] Not more snow, any snow would be nice... [15:59:54] Atleast, just for the week, its half term after that ^.^ [16:03:39] sniper ribbbon :D [16:03:45] Darn... I got my new headset like, last week? And now the right ear piece is broken :/ [16:04:22] Hmm, now, where is that warrenty registration thing.. [16:14:21] hmm, ok, now, do I claim on the warrenty and ask for a new one... hmm [16:15:51] after a week i'd take it back to store of purchase [16:16:15] Any idea where I can find out my local Amazon.co.uk? :P [16:17:07] 2 years manifactorer warenty... with free postage label... (can't spell) [16:18:14] 03(mod) page search check box disappers - 10https://bugzilla.wikimedia.org/33020 +comment (10rahuldeshmukh101) [16:19:36] need 2 more sniper kills [16:19:39] manufacturer [16:20:11] 1 [16:20:30] 03(mod) Please create a Marathi Wikisource - 10https://bugzilla.wikimedia.org/33907 +comment (10rahuldeshmukh101) [16:20:48] Yeah, that.. 
[16:21:08] hi sumanah [16:21:50] hi Nikerabbit [16:22:15] mozilla is having a new wiki for managing localisation better way [16:22:40] link? [16:22:50] * sumanah relaxes by deleting spam comments [16:24:34] no more info [16:24:48] maybe google knows more than me [16:25:25] then how did you learn this? FOSDEM? [16:25:51] in a side note of a presentation [16:27:08] Nikerabbit: maybe you could send a one-line email to the localisation team to follow up? [16:28:14] Nikerabbit: https://wiki.mozilla.org/Special:Version perhaps? [16:29:00] developer.mozilla.org more likely [16:29:11] but if it is not done yet... [16:59:22] sumanah: [16:59:24] sumanah: hey [16:59:29] hi potter [16:59:48] sumanah: can you suggest me good IDE for mediawiki [17:00:10] potter: many of us use NetBeans [17:00:22] potter: I think Reedy might be able to suggest something [17:00:35] potter: many of us use emacs or vim or similar Unix-y text editors [17:01:02] sumanah: i tried eclipse but i m not able to make MW work on it [17:01:07] phpstorm [17:01:51] Reedy: You use phpstorm? [17:03:03] yup [17:03:08] and notepad++/nano [17:04:02] Does any body use Eclipse [17:04:27] not for php [17:04:57] Reedy: oh ok... but i work on linux..phpstorm is on linux? [17:05:16] it's java based, windows, mac os x or linux [17:06:00] i m getting http://dpaste.com/697499/-- error on eclipse - Unable to determine IP.. [17:06:43] It only gets defined at runtime [17:06:56] sumanah: btw can you tell me some Extension which is buggy .. i want something for the weekend [17:07:16] many of them [17:07:30] potter: you can do a bugzilla search [17:07:53] Reedy: many ppl here use phpstorm? cause it will be good if many use..so that if problem comes it can be solevd [17:08:17] potter: https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&list_id=84433&bug_severity=blocker&bug_severity=critical&bug_severity=major&bug_severity=normal&bug_severity=minor&bug_severity=trivial&product=MediaWiki%20extensions [17:08:23] there's a few of us that do, yes [17:08:49] sumanah: yeah.. but then it must be to a decent level.. i mean i have started [17:08:55] sumanah: recently :) [17:09:28] Reedy: Thanks man [17:09:30] potter: learning to use our Bugzilla will be a helpful thing for you to do -- with a little practice you can structure a query to give you, for example, MediaWiki extensions bugs with the "easy" keyword [17:09:57] 125 bugs: https://bugzilla.wikimedia.org/buglist.cgi?keywords=easy%2C%20&query_format=advanced&keywords_type=allwords&list_id=84435&product=MediaWiki%20extensions [17:10:12] and then if you click "comp" (short for "component") it'll group them by Extension [17:10:12] sumanah: or we could use a bugtracker that doesn't suck :) [17:10:21] johnduhart: you volunteering to transition us over? [17:10:35] johnduhart: I'm following Theodore Roosevelt's advice [17:10:40] doing what I can where I am with what I have [17:10:45] right now, I'm helping potter [17:10:46] if only it was that simple [17:11:01] Dantman had a project on labs to test other bugtrackers but we need a BZ dump [17:11:09] Reedy: orly? [17:11:42] potter: so, doing that, I see that AbuseFilter, CentralAuth, MobileFrontend, and UploadWizard would be good choices for you to try [17:11:52] not ca [17:11:55] no no no [17:12:04] Reedy: ? [17:12:15] it's not simple to setup [17:12:22] so doing work on it is even harder [17:12:23] potter: Reedy is about to explain why CentralAuth is not one to start with [17:12:33] Those are all pretty complex extensions sumanah. 
[17:12:52] johnduhart: ok, the traditional "let Sumana answer the question first then correct her" dance is now in its 2nd step [17:13:34] so what do you guys suggest? [17:13:36] * johnduhart looks at the bug list [17:13:52] potter, what extensions have you already tried playing with? [17:14:07] inputboxform [17:14:31] tried wikilove..but couldnt understand [17:14:44] read the simple_extensions_list [17:14:47] thats it [17:14:48] potter, did you already look at https://www.mediawiki.org/wiki/List_of_simple_extensions ? [17:14:49] ok [17:15:00] sumanah: yeah [17:15:13] potter, if you want, you could talk about WikiLove things here, things you didn't understand, and people could help you understand it [17:15:29] at some point learning has to occur :) [17:15:46] sumanah: yeah.. thats right. [17:15:59] sumanah: i did not understand the API usage [17:16:47] https://bugzilla.wikimedia.org/buglist.cgi?keywords=easy%2C%20&query_format=advanced&keywords_type=allwords&list_id=84441&columnlist=bug_severity%2Cpriority%2Cop_sys%2Cassigned_to%2Cbug_status%2Cresolution%2Cshort_desc%2Ccomponent&resolution=---&product=MediaWiki%20extensions [17:16:55] potter, as you can tell, people like Reedy, johnduhart, JeroenDeDauw, ashley, etc are going to be able to be more helpful than I am in giving you more concrete help [17:17:05] This one has non-completed and a component column [17:17:09] sumanah: and i m facing lots of problems with global function names.. and variables and there are lots of functions [17:17:22] sumanah: so each time i read about any function i go deeper and deeper into the code [17:17:29] sumanah: and then lost.. [17:17:37] sumanah: do you use any IDE? [17:17:49] potter: and then you come here and ask for help! which is the right thing to do! don't give up. [17:18:05] potter, I use gEdit or nano when I write software, which is infrequently [17:18:25] sumanah: yeah..ppl here help..which is very nice cause i did not get help in KDE.so i left it [17:18:29] hi! [17:18:31] johnduhart: I accidentally removed all the Resolved/Resolution parameters, no wonder I got 120+ results rather than 25 [17:18:47] sumanah: what do you develop on ? [17:18:59] potter, I do not work as a software developer. [17:19:11] hi ashley! potter is looking to get into MediaWiki via extensions. is SocialProfile a good entry point? [17:19:30] potter, see the link johnduhart posted a few minutes ago? that might be a good place for you to go [17:19:45] sumanah: yeah.. i am seeing it [17:20:00] potter: when I code at all, it's a little bit of novice JavaScript or Python to learn my way around stuff [17:20:25] SocialProfile is...interesting, but it's not exactly what I'd recommend for a new developer -- its codebase is still not the prettiest and the code logic (or the lack of thereof!) is not too obvious... [17:21:00] ashley: so what do you suggest ? [17:21:29] potter: let me find you some better examples ;-) just a moment [17:21:41] ashley: ok .thanks [17:24:35] potter: you might want to take a look at some of the extensions I've written -- see the list at http://www.mediawiki.org/wiki/User:Jack_Phoenix/extensions. many of those are simple yet effective and a good way to learn how MediaWiki's internal work, I'd say :-) I'd also recommend checking out the example extensions, in trunk/extensions/examples (IIRC), at least the FourFileTemplate should be in decent shape [17:25:19] ashley: you could suggest me a good IDE also?phpstorm is not free [17:25:26] potter, will you be going to the Pune hackathon and/or the Chennai event? 
[17:25:49] sumanah: chennai one.. [17:26:05] MW devs have some sorta license for phpStorm, I think...but alas, I don't use any IDEs, I just use Notepad++ and lots and lots of testing [17:26:14] https://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/examples/ [17:26:38] potter: http://wikimedia.7.n6.nabble.com/Fwd-Mediawiki-l-JetBrains-PHPStorm-License-for-MediaWiki-Developers-td804531.html in case you want a free phpStorm license [17:27:00] potter: Reedy can get it to you [17:27:07] its got a free trial to begin with [17:27:07] ashley: then how do me mange to know about functions and variables [17:27:25] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/LockDownEnglishPages/LockDownEnglishPages.php?revision=106136&view=markup & http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/Quantcast/Quantcast.php?revision=106136&view=markup are two really basic hook extensions that I've written -- simple but effective :-) [17:28:11] potter: I've managed just fine without an IDE, but then again I've been around for a while...but there are some online docs, generated from source code comments at svn.wikimedia.org/docs (IIRC) and then there's the on-wiki documentation on MediaWiki.org [17:28:14] ashley: then how do me mange to know about functions and variables ? [17:28:50] ashley: i tried them..but then i guess that i must be patient enought to go trhough them :) [17:29:45] ashley: you have writeten sixteen exentions. Which was has some bugs? [17:29:55] patience (or alternatively, an alcoholic beverage of your choice ;-) is the key to MediaWiki development [17:29:59] potter: let's see... [17:30:19] ashley: lol [17:32:37] most of my own, totally (or mostly) original extensions -- the 16 you refer to -- are rather simple and thus I can't think of any (major) bugs in them, or any reported bugs at all...some of the extensions, especially the social tools, that I've contributed to have bugs, but since those bugs give me a headache, I'd rather not confuse you with those :-) [17:32:58] then again you could try making http://www.mediawiki.org/wiki/Extension:ShoutWiki_Ads suck a bit less and/or maybe adding a new ad provider or two to it, since right now it supports only Google AdSense [17:34:22] ashley: What other ad services could you add? :P [17:34:38] potter: if you know some JavaScript, maybe you could try your hand with http://www.mediawiki.org/wiki/Extension:AJAX_Poll ? its codebase should be improved (i.e. class-based approach instead of global functions, ResourceLoader support instead of nasty inline JS...) [17:34:52] Lcawte: I'm sure that there are some non-Big Brother powered ones out there [17:35:34] ashley: i just know Javascript.. but i have no excellency with it? [17:37:29] bye for the day! [17:37:30] 03(mod) Please create a Marathi Wikisource - 10https://bugzilla.wikimedia.org/33907 +comment (10sam) [17:37:43] ashley: ok thanks a lot..but you have given so many stuff that i m confused now ! so can you tell me where to start off(i have complete weekend) [17:37:59] 03(mod) Allow filtering action=sitematrix by project code - 10https://bugzilla.wikimedia.org/34199 normal->15enhancement (10Sam Reed (reedy)) [17:37:59] 03(mod) Allow filtering action=sitematrix by language code - 10https://bugzilla.wikimedia.org/34200 normal->15enhancement (10Sam Reed (reedy)) [17:39:28] ashley: there ? 
[17:40:03] potter: yes, I was browsing MW.org and looking for some useful links for you and thinking of extensions with known bugs [17:40:19] ashley: ok :) [17:40:45] something I personally think is a great way of familiarizing yourself with MediaWiki and its internals is fixing a Wikia or wikiHow extension -- those two big sites have written some *awesome* MW extensions that unfortunately rely on their custom modifications to function properly...but since both sites release their code publicly, that's not much of an issue for a capable dev :) [17:41:01] (now Lcawte can tell just how wrong I am :P) [17:41:52] HMm, maybe... [17:43:24] That reminds me, I need to finish that cloakcheck patch don't I.. [17:47:58] something that would probably be rather rewarding would be to fix a bug in one of the many, many extensions used by Wikimedia wikis (excluding CentralAuth and whatever other royal clusterfucks there are) [17:48:06] ashley: can i ask you doubts on the extension and hooks ..when i dont understand them? [17:48:32] potter: you might want to read http://www.mediawiki.org/wiki/Manual:Hooks for an overview of hooks [17:48:32] ashley: or i dont understand the logic [17:48:51] ashley: i was on that page only :) [17:48:56] basically, hooks allow you to insert code into core files without actually having to edit core files [17:49:14] (that's probably not 100% accurate description, but it's how I like to think of it) [17:49:48] ashley: yeah even i have the same type of idea for hooks [17:52:44] potter: if you haven't checked out the example extensions yet, I'd suggest that you do: http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/examples/ -- they're very bare-bones versions of extensions; for example, there are many special page extensions (extensions that add new special pages); FourFileTemplate is a basic special page that demonstrates how to add a new special page from an extension without performing any magic in it [17:53:39] ashley: ok i'll look into these also [17:54:57] ashley: doubt1 - what is NS_MEDIAWIKI ?you have used -- $title->getNamespace() == NS_MEDIAWIKI [17:56:39] it's a namespace constant, defined in includes/Defines.php -- it refers to the MediaWiki: namespace on the wiki, which allows to edit the wiki interface (to a degree) on-wiki; see [[manual:system message]] on MediaWiki.org for details about the MediaWiki: namespace [17:57:32] ashley: ok dude [17:59:20] * Dantman wants to replace that [18:00:20] ashley, potter: Btw, if your extension is 1.19+ the new preferred pattern is $title->isNamespace( NS_MEDIAWIKI ); [18:00:39] ashley, potter: Sorry, I mean $title->inNamespace( NS_MEDIAWIKI ); [18:01:11] Dantman: I'm far from being 1.19 and so are my extensions...I haven't even fixed all the breakages caused by ResourceLoader yet... [18:01:52] * Dantman wants to make NS_ constants disappear. [18:02:26] Dantman: why? [18:02:48] vvv: Because they have to have hardcoded integers [18:03:05] And you do not like hardcoded namespace numbers? [18:03:35] vvv: The plan forward is a namespace registration system that will allow new namespaces for extensions, etc... to be dynamically registered [18:03:36] Dantman: remember, we expose that stuff to users [18:03:58] In templates [18:04:09] $wgMessageCache->addMessages($dfgLang->getLanguageArray(), $wgLang->getCode()); [18:04:13] oops [18:04:33] what is the replacement for the above method? 
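A sketch of the hook-plus-namespace pattern discussed above, loosely modelled on the LockDownEnglishPages example (the function name and the 'staff' group are invented):

    // Extension setup file: register a handler for the getUserPermissionsErrors hook.
    $wgHooks['getUserPermissionsErrors'][] = 'efExampleLockInterface';

    // Block interface-message edits for users outside an example 'staff' group.
    function efExampleLockInterface( $title, $user, $action, &$result ) {
        // On 1.19+ the preferred check is $title->inNamespace( NS_MEDIAWIKI ), as noted above;
        // older releases compare the namespace constant directly.
        if ( $action === 'edit'
            && $title->getNamespace() == NS_MEDIAWIKI
            && !in_array( 'staff', $user->getGroups() )
        ) {
            $result = array( 'badaccess-group0' );   // message key shown to the blocked user
            return false;                            // stop here: permission denied
        }
        return true;                                 // let the normal permission checks continue
    }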
[18:04:41] i18n files [18:04:53] 03(mod) #time parser function can't read local language month names - 10https://bugzilla.wikimedia.org/19412 +comment (10Siddhartha.Ghai) [18:07:34] http://www.mediawiki.org/wiki/Manual:Developing_extensions#Internationalization [18:07:53] Nikerabbit: thanks I will later a look [18:13:09] i have never unserstood the global functin which are used for messages example --- wfMsgForContent -- ?what are these set of functions? [18:13:22] can someonle help me out [18:14:11] pothos: wfMsg uses the user-specified language [18:14:14] potter: they are internationalization (i18n) functions, defined in includes/GlobalFunctions.php (although wfMsgForContent and its friends are somewhat deprecated nowadays, I think, and the preferred approach is the Message class, accessed via the wfMessage global function) [18:14:41] wgMsgForContent uses the site language [18:15:03] vvv , ashley :What is thier function? [18:15:16] They return a text of a message [18:15:20] By the message's name [18:15:28] 03(mod) tracking bug for RTL bugs in the Wikimedia Mobile Android app - 10https://bugzilla.wikimedia.org/34166 (10Amir E. Aharoni) [18:15:28] 03(NEW) words in place names in an RTL script appear in the wrong order on the maps - 10https://bugzilla.wikimedia.org/34201 normal; Wikimedia Mobile: android; (amir.aharoni) [18:15:52] vvv : wfEmptyMsg( 'quantcast-tracking-number', $message ) - what will this do ? [18:16:07] vvv: i mean just the overall inetrpretation of this code [18:16:40] I have no idea about wfEmptyMsg [18:16:46] I can tell you what wfMsg will do [18:17:14] wfEmptyMsg is (was) used to check if the given message is empty - its usage is somewhat confusing to be honest [18:17:31] Oh [18:17:50] Yep, that is the part about messaging system that sucks [18:17:53] I think I am in the same realm with my issue wiht the SMWHalo extension [18:18:02] vvv: :what will wfMsg do? [18:18:03] a better, more modern way to do that is: $msgObj = wfMessage( 'something' ); $isEmpty = $msgObj->isBlank(); (or $msgObj->isDisabled();) [18:18:32] ashley: that sucks too [18:18:36] $wgMessageCache->addMessages($dfgLang->getLanguageArray(), $wgLang->getCode()); [18:18:37] Because it is too long [18:18:53] potter: wfMsg( 'messagename' ) will give you the text of a message with name "messagename" [18:19:11] vvv: stuff sucks, but that's life [18:19:27] ashley: well, wfMsg() sucked less [18:19:33] IMO big rewrites that break stuff suck even more [18:19:40] and yeah, I like the name wfMsg() too [18:19:56] (but honestly, the usage of wfEmptyMsg() was confusing, to me at least) [18:20:05] Yep [18:20:06] vvv: so we pass the messages with names and then use it at some other point in code?or there is some specific logic behind uding thes [18:20:10] It should have returned false [18:20:30] potter: yes, the message text is in the language files [18:20:33] potter: http://www.mediawiki.org/wiki/Localisation & http://www.mediawiki.org/wiki/WfMessage() & http://www.mediawiki.org/wiki/New_messages_API [18:20:36] They are in language/messages [18:21:14] except when it comes to extensions, in which case the appropriate .i18n.php file is usually in the extension's own folder or subfolder [18:22:02] vvv: so they are like dictionary data type.. sort of ? [18:22:13] Yes [18:22:21] Except that you may do more advanced stuff [18:22:30] vvv: like? [18:22:56] Like, you can write $1 in message text, and wfMsg( 'messagename', 'param' ) will replace $1 -> param [18:23:45] vvv: ok now i m getting the feel of the msg.. 
they are like some string editing functions plus a dictinary ;) [18:23:59] They also depend upon the language [18:24:23] MW's i18n framework also supports things like plurals, gender of people (i.e. in English we can just say "user" but for French it's either utilisateur (m) or utilisatrice (f)), formatting of numbers, etc. [18:24:45] Oh yeah [18:24:47] ashley: i did not follow you.. [18:25:15] except when it comes to extensions, in which case the appropriate .i18n.php file is usually in the extension's own folder or subfolder ? i did not get this also :( [18:25:29] Yes [18:25:44] Extension have their own dictionary of messages [18:26:05] vvv: what happens when two messages get the same title [18:26:13] vvv: overridden? [18:26:23] Well, they are just PHP arrays [18:26:39] potter: http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/TitleKey/TitleKey.i18n.php?revision=108083&view=markup -- example of the message file [18:26:57] vvv: ok things are getting clear :) [18:28:46] ashley: in the Quantcast hook.. basically you check if there is "quantcast-tracking-number" .. if yes then you use that number ..else you use some default number? [18:29:18] potter: yep, exactly! [18:30:09] ashley: is quancast-tracking-number unique for a page?and when do you set such a number? [18:31:00] looks like I need to look at the same code? [18:31:53] quantcast-tracking-number is a message in the MediaWiki: namespace (i.e. a page called "MediaWiki:Quantcast-tracking-number" on the wiki); it's not unique per page, but per site (I think, I don't know much about Quantcast) [18:32:28] it's not actually set anywhere, hence why the installation instructions (see http://www.mediawiki.org/wiki/Extension:Quantcast) tell you to create that page first ;) [18:35:13] Blech... config in i18n [18:35:24] + analytics extensions are junk [18:38:02] ashley: in lockdownenglishpage i understood the code but..why is it?i mean --- We want to prevent editing of MediaWiki pages for users who have the editinterface right but who are not staff when the action is 'edit'--i did not get this line :-| [18:41:37] LockDownEnglishPages is quite a site-specific quick little hack...long story short, at ShoutWiki we have a wiki where people can translate MediaWiki messages into their local language. people who are not staff should not be allowed to edit the English MediaWiki: messages on that wiki, because there's a high risk that when editing, they'll mess up the messages and thus they mess up the interface on all of ShoutWiki's English wikis [18:46:41] ashley: translate MediaWiki messages ? [18:47:42] yes, because ShoutWiki has certain experimental/custom features which haven't been translated yet and the MediaWiki: namespace can be used to translate extensions etc. [18:47:43] ashley: which messages? do you mean articles? [18:49:45] no, MediaWiki messages; see http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/TitleKey/TitleKey.i18n.php?revision=108083&view=markup for an example of a messages file -- 'titlekey-desc' is a message on the MediaWiki: namespace on the wiki, i.e. it is a page called "MediaWiki:Titlekey-desc" [18:52:28] Does anyone know the status on the mobile app competition? I made an iPhone app a while ago and submitted it and never even got any acknowledgement. [18:53:59] No.. [18:54:01] (Back in October). I Submitted on time, and was told I would at least get a certificate. [18:54:06] And the people who might know aren't here [18:54:36] Do you know of any email addresses of those people who I can contact about this? 
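As a sketch of the i18n-file pattern that replaced $wgMessageCache->addMessages() (the extension and message names here are invented):

    // MyExtension.i18n.php — one array of messages per language code.
    $messages = array();
    $messages['en'] = array(
        'myextension-desc'     => 'Adds an example feature',
        'myextension-greeting' => 'Hello, $1!',
    );

    // MyExtension.php — register the file, then fetch messages through the Message class.
    $wgExtensionMessagesFiles['MyExtension'] = dirname( __FILE__ ) . '/MyExtension.i18n.php';
    $greeting = wfMessage( 'myextension-greeting', 'Potter' )->text();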
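And the Quantcast pattern just described — reading a MediaWiki:-namespace message as a per-site setting — done with the newer Message class (the fallback value is made up):

    // Use the on-wiki value of MediaWiki:Quantcast-tracking-number if set, else a default.
    $msg = wfMessage( 'quantcast-tracking-number' )->inContentLanguage();
    $trackingNumber = $msg->isDisabled() ? 'p-00example00' : $msg->text();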
[18:57:34] codingchallenge@wikimedia.org [18:57:39] in theory [18:57:43] No idea if it's still being monitored.. [18:57:54] Okay thanks [19:00:46] Are you sure it would be @wikimedia.org and not @mediawiki.org? [19:01:17] So, whats the best way to go forward now that gsoc is announced? [19:01:22] Yup [19:01:22] https://www.mediawiki.org/wiki/MediaWiki:Contests-October_2011_Coding_Challenge_Introduction [19:01:39] GP, no idea. You can start drafting a proposal if you wanted to do it [19:02:01] Thanks Reedy [19:02:10] again, the needed person isn't here :L( [19:03:08] Reedy, when could mediawiki guys release their project proposals? [19:03:20] We don't really put any proposals [19:03:25] We just give ideas what might be nice [19:03:40] You can suggest nearly anything you want [19:03:48] ok, but when might that happen? [19:04:51] If GSOC has now formally be announced, we'll start doing it soon I guess [19:04:59] I can start a project page in a bit if you want [19:06:22] That would be great, and yeah Gsoc is formally announced on 4th april [19:06:31] sorry 4th feb [19:07:27] I would expect more formal information to start appearing from monday then [19:07:29] Reedy: Whatever happened with that October contest? [19:07:41] 03ialex * 10/trunk/phase3/includes/profiler/ProfilerSimpleTrace.php: Removed declaration of ProfilerSimpleTrace::$mMinimumTime; already defined in parent class [19:07:42] Did someone win? [19:07:46] Joan, nfi, that's what benbenbob1 was asking about [19:07:54] which is really bad :/. [19:08:23] benbenbob1: Seems like more a mailing list question. [19:08:33] Oh, no. [19:08:35] Here we go: http://blog.wikimedia.org/2012/01/30/october-2011-coding-challenge-winners/ [19:09:25] Thanks Joan [19:09:48] I jfgi. ;-) [19:09:48] what is the function of '''Parser''' class? [19:10:04] It parses wikitext and converts it to HTML? [19:11:18] potter: https://www.mediawiki.org/wiki/Manual:Parser.php#Description [19:12:35] Joan: thanks [19:14:08] If any of you would like to look at my iPhone app, it mimics the Wikipedia app a bit, but has some other cool features. https://github.com/benbenbob1/Wikipedia-App [19:15:59] benbenbob1, people in #wikimedia-mobile might be interested [19:24:03] Okay [19:24:21] Do you know that are future scripting langauge has no exceptions or warnings at all? [19:36:01] suppressing all errors? That's gonna go down well vvv [19:36:48] Reedy: well, if you write in lua "return godforsakenvariable", it will return nil (null) [19:37:18] And I do not know whether it is possible to make it complain [19:37:48] And of course it thinks that strings are arrays of bytes, just like PHP [19:38:27] So Lua = deformed JS object model + PHP string handling - any adequate error handling [19:38:39] And + Pascalish syntax [19:39:38] But of course it has its niceties [19:39:46] GP, https://www.mediawiki.org/wiki/Summer_of_Code_2012 [19:39:56] And we actually have a working Lua sandbox [19:40:23] I wonder why nobody even thought about using Python as our embedded language [19:41:02] 03(mod) Adding options to Special:ListUsers (hide permanent and temporary blocks) - 10https://bugzilla.wikimedia.org/33545 +comment (10bawolff+wn) [19:41:10] 03(mod) words in place names in an RTL script appear in the wrong order on the maps - 10https://bugzilla.wikimedia.org/34201 +upstream (10Brion Vibber) [19:41:49] vvv: Can can that pythy language even be embedded? 
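A minimal sketch of driving the Parser class mentioned above (the page title and wikitext are arbitrary):

    // Turn wikitext into HTML with the core parser.
    global $wgParser, $wgUser;

    $title   = Title::newFromText( 'Sandbox' );            // context page for link resolution
    $options = ParserOptions::newFromUser( $wgUser );      // parser settings for the current user
    $output  = $wgParser->parse( "''Hello'' [[world]]", $title, $options );
    $html    = $output->getText();                         // rendered HTML fragment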
[19:41:56] Dantman: yes [19:42:11] tabs don't work so well in browser text fields [19:42:13] And it looks like it was designed for that [19:42:17] ...that, ;) and whitespace based syntax is shit on the website [19:42:31] Reedy: we are going to set up proper code editor anyway [19:42:43] vvv, I'll believe it when I see it :p [19:43:09] Reedy: well, Python programmers hate tabs [19:43:16] They are all fourspace-geeks [19:43:20] a couple we or so ago, people gave me a good site for learning js [19:43:30] but now I can't remember it [19:43:36] Besides, lack of tab support is a PITA for coding in any langauge [19:43:55] it started out with an interactive lesson and typing your name in quotes [19:43:59] Because you have to ident your code no matter whether your interpreter forces to do it or not [19:44:04] any idea what it was? [19:45:09] Oh great, Python even has its own memory manager [19:46:09] vvv: Can can that pythy language even be embedded? [19:46:10] what [19:46:38] Joan: which part? [19:46:49] The duplicate "can" and "pythy" confused me. [19:46:55] But I think it was a pun on "pithy"? [19:47:11] ;) Yeah, couldn't resist the temptation [19:47:17] Heh. [19:47:25] Python? In MY php? [19:48:00] Reedy: You'll be begging for snakes when I start lisping in your php... [19:48:29] We should make enwiki use whitespace [19:48:33] They'll be alright with that [19:54:05] codeacademy [19:54:10] that was it [19:54:16] thank you awesome bar [19:54:55] (> lisp php) [19:55:28] but php learns [19:55:36] at least, it has closures, now [19:56:22] lol [20:02:37] Are there utilities in core to autocomplete on articles in a certain ns? I want to create some input that autocompletes on articles in the file ns on commons [20:04:38] hotcat does that [20:05:59] Reedy: any idea where I can find that code? [20:06:37] it's a gadget [20:06:46] think it just does requests against the api [20:07:13] https://www.mediawiki.org/wiki/MediaWiki:Gadget-HotCat.js [20:07:20] commons.wikimedia.org/w/index.php?title=MediaWiki:Gadget-HotCat.js [20:07:23] JeroenDeDauw: didn't someone present somethign like this for categories at the hackathon? [20:07:31] yea, like hotcat, but in the editor [20:08:52] Reedy: does that allow for nice autocompletion - ie if I type "bar", that I get "foobar" and "foobarbaz" and not just stuff starting with "bar"? [20:09:04] Dunno [20:09:05] Daniel_WMDE: right, will have a look at the projects page [20:09:05] try it out [20:09:06] ;) [20:10:27] * Lcawte wonders why BF3 plays it a lot more than MW3 and vice versa [20:11:00] that makes no sense [20:11:08] Daniel_WMDE: does not look like it's on the showcase page - do you know where the sources are? [20:11:38] hexmode: You do recall that one of Lisp's advantages is that as a homoiconic language new features such as a foreach() loop can be written in the language itself instead of requiring the language itself to implement it and wait for it to be deployed everywhere... Also, last I checked Common Lisp already has lexical scope and lambdas (ie: closures)... [20:13:13] Reedy: I thought the uploadwizard also had something called hotcat, but gerping it is not giving any hits :/ [20:14:47] is anyone aware of an extension or setting that allows you to see your REAL wanted files listed with all the images taken from repos excluded? 
[20:15:38] I'm not sure the current SpecialPage serves as practical a purpose anymore [20:22:43] I thought hotcat was a gadget or something [20:23:08] yeah, it is [20:30:25] hello, i've got this error : mediawiki/bin/ulimit4.sh: xmalloc: ../bash/array.c:89: cannot allocate 32 bytes (57344 bytes allocated), does anyone can help me ? [20:32:13] prattcorto: What are you doing when that happens? [20:32:36] We need a new api for including extensions [20:33:53] It's a problem with generating thumbnails. i use imagemagick but it seems it doesn't work [20:34:13] i've got this error in the error.log of apache [20:34:22] Try raising the $wgMaxShellMemory limit [20:34:39] Our default never seams to be enough for imagemagick [20:34:54] ok thanks i try [20:36:57] I wonder if I should go `class ExtensionParserFunctions extends MWExtension {}` style, or `$e = ExtensionLoader::register( 'ParserFunctions', __FILE__ );` style. [20:42:06] thanks a lot Dantman it works fine now [20:47:48] Hmmm, Installer, Selenium, SiteConfiguration.php, StubObject, maintenance scripts, and tests seam to use GLOBALS in a way that doesn't work in HipHop [21:01:59] I am currently fixing this https://bugzilla.wikimedia.org/show_bug.cgi?id=34161 (WikiArticleFeeds not working with 1.19-svn) and found a problem with https://www.mediawiki.org/wiki/Manual:Hooks/UnknownAction . [21:02:01] When called with &action=feed , the value (svn trunk version!) is not $action == 'feed' but 'nosuchaction' I need your help. Either the https://www.mediawiki.org/wiki/Manual:Hooks/UnknownAction is wrong or the SVN. [21:03:28] (the extension as such work, when I let it hook upon $action == 'unknown" (which is not a fix) [21:03:37] s/work/works/ [21:05:52] Hi all. can somebody tell me why forking inside of an extension would be a bad idea? It seems to me to be a bad idea but I am to much of a novice to have a good reason at hand. [21:07:29] tr|nn|, why do you want to? [21:07:44] 03(NEW) pls. add Extension:PdfBook to the Bugzilla option list, as this is in MW SVN - 10https://bugzilla.wikimedia.org/34202 normal; Wikimedia: Bugzilla; (mail) [21:08:27] Reedy, process lots of data in the background in parallel.. which also takes ages [21:08:59] No [21:09:04] Oh [21:09:09] I misread [21:10:19] I suppose if you're explictly doing it when told to do some action, it's not so bad [21:10:28] forking on every page llload etc seems really weird [21:11:35] 03aaron * 10/trunk/extensions/DumpHTML/ (SkinOffline.php dumpHTML.inc dumpHTML.php): (log message trimmed) [21:11:35] * (bug 33878) Updated file copying functions to account for FileBackend in some key places.* Create temp dir so script does not fail on windows as soon as it starts. [21:11:35] * Hacked copyToDump() a bit to not assume that the thumb zone base dir is under the public zone base dir. [21:11:35] * Fixed bogus '$this->debug()' calls in copyToDump() which caused fatals. [21:11:36] * Added wfMkdirParents() so that every single file_put_contents() call in copyToDump() doesn't fail (used for foriegn APIs). [21:11:36] * Removed 'forceCopy' param. Files will always be copied, as symlinks break abstraction. [21:11:37] * Fixed E_STRICT notice in offline skin class. [21:11:42] 03(mod) HTML dump is missing all images due to new FileBackend code - 10https://bugzilla.wikimedia.org/33878 +comment (10Aaron Schulz) [21:13:11] Hm, is it possible to whiteout $wgOut in a specialpage? 
such that calling a specialapage returns almost nada (misusing the specialpage like an api-extension) [21:13:27] s/like/like it were/ [21:14:05] that sounds creepy to me somehow... [21:15:11] 03jeroendedauw * 10/trunk/extensions/EducationProgram/ (4 files in 3 dirs): some initial work on adding images in ambassdor profiles [21:15:29] 03(NEW) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 blocker; MediaWiki: General/Unknown; (mail) [21:15:40] 03(mod) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 (10T. Gries) [21:16:07] 03(mod) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 (10T. Gries) [21:16:18] tr|nn|: $wgOut->disable(); or $wgOut->setArticleBodyOnly( true ); maybe? [21:16:54] ashley: I'll have a look at both -- thanks alot! [21:17:03] you're welcome :-) [21:20:00] 03(mod) WikiArticleFeeds not working with 1.19-svn - 10https://bugzilla.wikimedia.org/34161 +comment (10T. Gries) [21:20:00] 03(mod) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 (10T. Gries) [21:20:51] 03(mod) WikiArticleFeeds not working with 1.19-svn - 10https://bugzilla.wikimedia.org/34161 (10T. Gries) [21:21:27] tr|nn: To clear the content, just use $wgOut->clearHTML(); and set your new content below with $wgOut->addHTML( 'text' ); or something you like [21:22:13] tr|nn|: also [21:22:43] Tim_Weyer: Thanks, that's also good to know I can always clear the contents already saved in OutputPage [21:24:02] OutputPage's html content is not outputted when you use ::disable() [21:24:08] I think the combination of ->clearHTML and ->setArticleBodyOnly will give me a kind of "api" functionality (i know it is a kind of raping SpecialPage) [21:24:25] Also, please do not use $wgOut if you have an OutputPage available through your parameters or class context [21:24:47] ->setArticleBodyOnly( true ) = just content, not more [21:24:50] tr|nn|: That's what we have ->disable() for. [21:25:26] Dantman: but wouldn't that also disable simple html-output? [21:25:58] Is that not what you're trying to do? [21:26:11] I would say just try it [21:26:12] with setArticleBodyOnly it would be possible to still output contents. [21:26:31] Why do you want to? [21:26:34] Dantman: well halfwise. It wouldn't be bad to return whether or not forking was successful. [21:26:51] Tim_Weyer: I will :) [21:26:57] We just use $out->disable(); and then start using echo. [21:27:13] When I used $wgOut->setArticleBodyOnly( true ); in a normal page I just got the content [21:27:17] Dantman: Oh that makes it even easier [21:27:41] tr|nn|: If I understand right, you want to export a special page's content? [21:27:49] I read not all [21:28:09] Tim_Weyer: no, I am trying to find a way to process data in parallel [21:28:21] We use it in core http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/includes/actions/RawAction.php [21:28:32] (not using the jobqueue and running several maintenance/runJobs) [21:28:37] There might even be a special page that uses it [21:29:02] tr|nn|: For what exactly? 
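The ImageMagick thumbnail fix suggested above is a one-line LocalSettings.php change; the figure below is only an example (the value is in KB):

    // LocalSettings.php — give ImageMagick more memory headroom than the default.
    $wgMaxShellMemory = 307200;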
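And a sketch of the $out->disable()-plus-echo approach for a bare special-page response, as suggested above (the page name is invented):

    // A special page that skips skin output and answers an XMLHttpRequest with plain text.
    $wgSpecialPages['ClusterTrigger'] = 'SpecialClusterTrigger';

    class SpecialClusterTrigger extends SpecialPage {
        public function __construct() {
            parent::__construct( 'ClusterTrigger' );
        }

        public function execute( $par ) {
            $this->getOutput()->disable();            // no skin, no HTML from OutputPage
            header( 'Content-Type: text/plain; charset=utf-8' );
            echo "queued\n";                          // minimal status for the caller
        }
    }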
[21:29:04] ...this is probably mostly moot, because you can't exactly fork from a web process [21:29:33] Tim_Weyer -> Clustering a wiki with k-means-clustering with a specialpage as the interface [21:30:14] Dantman: Yay -- now I remember why forking was bad. It didn't work out of the browser when I tried [21:30:26] If you just want to export pages, you could work with API [21:30:31] only from shell -- but that's probably my apache-build [21:31:24] If you understand what forking is the whole idea sounds insane [21:31:26] Tim_Weyer: Oh yeah, I know that it is possible, but it is much easier to use the classes already provided for me [21:31:48] Title and Page [21:32:03] Forking pretty much starts the same process you're running as a new process... it re-runs it [21:32:26] And inside php on the web side you're usually either in an Apache process or a FastCGI process [21:32:49] The process does not start up intending to run your script [21:33:02] It only gets to that point after serving out a specific page [21:33:33] So really what a fork would be telling the process, would be to start a brand new webserver worker or fastcgi process [21:34:15] Dantman: which doesn't sound too bad to me somehow. [21:34:47] Sure... ;) If you're writing webserver code [21:35:19] Dantman: hm, means that I'd be out of the php-interpreter? [21:35:28] uhuh... [21:35:50] And the process would have absolutely no indication that it's purpose is even to execute your script [21:36:19] It would be starting up with the intent to listen for new requests for webpages to serve out [21:36:52] And to top it off it wouldn't be setup the right way for anything to even send anything to it [21:37:33] Either it would sit there waiting for requests and not get them, or perhaps try to listen on a port that the parent process was already using, fail, and then die... [21:38:06] 03jeroendedauw * 10/trunk/extensions/EducationProgram/ (6 files in 3 dirs): work on ambassador pagers [21:40:53] Dantman: Hm, my brain experiment as of now is to call a specialpage that returns nothing (or a simple return value) and forks itself into a new process where it can go on using the MW-classes while the original process does nothing much but returning a minimal page to the browser/xmlhttprequest [21:41:52] I still say you should try to use the job queue... perhaps make it break jobs up if necessary [21:42:00] e.g. returning whether or not forking worked. the forked process would compute a fixed number of datasets and store its results in memcached [21:42:37] Dantman: the problem with the job queue was to monitor the jobs and trigger the next steps within the algorithm [21:42:55] Why do you need to monitor the jobs? [21:44:52] depending on the results at the end of each computation-lot i need to decide if another rerun through all pages is necessary [21:45:14] Then create a new job from within your job [21:45:18] SMW Does it [21:46:30] It spawns a single job at the start of a full wiki refresh. That job spawns a small set of jobs to refresh individual pages. And the job spawns another job like itself that will spawn the next set of jobs. [21:46:40] once i split up my run into several small jobs that sounds a little difficult to me as the job queue doesn't have a fixed sequence in which the jobs are processed .. at least thats what i got out of the code [21:46:51] Sure... [21:47:17] If you need a sequence, then instead of spawning the jobs at once, make each step of the process spawn the next step's job [21:47:21] Dantman: now that is an interesting idea. 
[21:47:29] Then the next step's job does not exist until the first job is finished [21:47:58] And if you do have anything that can be done in parallel instead of sequence, then you can spawn multiple jobs instead of just the next one [21:48:31] Then you can let the code take advantage of the possibility the wiki's servers may be setup with multiple servers running jobs separately [21:48:46] yes that sounds good. [21:49:34] ^_^ I only wish I had a self-interested project that handled enough data that I had a reason to write something like that myself ;) [21:49:37] lol [21:49:38] Dantman: And another good thing about it: I could still use memcached :) [21:49:45] Sure.... [21:49:52] we... to be specific $wgMemc [21:50:05] yeah [21:50:13] Which will cache in say apc if it's a single server that doesn't use memcached [21:50:27] Just be sure it's not CACHE_NONE. [21:50:31] Hm, you might be interested in the extension once it is running... [21:51:25] K-Means-Clustering of all pages [21:52:29] Maybe [21:58:20] 03(NEW) Unable to publish my article - 10https://bugzilla.wikimedia.org/34204 normal; MediaWiki: Installation; (dxlaragon) [22:03:47] 03(mod) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 +comment (10mail) [22:09:38] 14(INVALID) Unable to publish my article - 10https://bugzilla.wikimedia.org/34204 (10Sam Reed (reedy)) [22:09:46] Lol, now I have found a good reason: Doing my computation in a fork is bad because the process would be running on the servers that most of all are supposed to serve webpages... and they'd be stuck with computing stuff theyre not supposed to. [22:11:11] 03(mod) Unknown page action hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 +comment (10mail) [22:11:12] 03jeroendedauw * 10/trunk/extensions/EducationProgram/ (6 files in 2 dirs): work on ambassador pagers [22:11:13] Dantman: The way SMW uses the Job Queue (spawning new jobs before exiting one) -- can you point me to some code I may have a look at? [22:11:27] same way you'd add a job anyway [22:12:06] Reedy, simply making a new instance of Job [22:12:43] The idea is generally the simple, but http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/SemanticMediaWiki/includes/jobs/SMW_RefreshJob.php [22:13:44] Look at your job params, process your data, create a job with new params and insert [22:16:59] Reedy, Dantman: thx. I tend to ask more than necessary, sorry. It's somehow inside of me wanting to check first before doing while instead it is even a better learn curve to just do it. [22:17:41] Sorry, if I am taking up your time. [22:21:44] 03jeroendedauw * 10/trunk/extensions/EducationProgram/ (3 files in 2 dirs): added missing messages and set img width [22:29:04] 03jeroendedauw * 10/trunk/extensions/EducationProgram/EducationProgram.php: added missing wikieditor dependency [22:36:07] 03(FIXED) pls. add Extension:PdfBook to the Bugzilla option list, as this is in MW SVN - 10https://bugzilla.wikimedia.org/34202 normal->15enhancement; +comment (10Sam Reed (reedy)) [22:43:09] 03raymond * 10/trunk/ (80 files in 49 dirs): Localisation updates for core and extension messages from translatewiki.net [22:44:30] I've got a database from an old version of mediawiki, and I'm not sure which version it is. any way to figure that out [22:44:30] ? 
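A sketch of the "each step queues the next" job pattern described above (the class name, job type, and decision helper are invented; the type also needs its $wgJobClasses entry):

    // Extension setup file: map the job type to its class.
    $wgJobClasses['clusterStep'] = 'ClusterStepJob';

    class ClusterStepJob extends Job {
        public function __construct( $title, $params ) {
            parent::__construct( 'clusterStep', $title, $params );
        }

        public function run() {
            // ... process one batch of pages here, stashing intermediate results (e.g. via $wgMemc) ...

            if ( $this->anotherPassNeeded() ) {       // hypothetical helper deciding on a further pass
                // The follow-up job only exists once this one has finished,
                // which is what keeps the steps in sequence.
                Job::batchInsert( array(
                    new ClusterStepJob( $this->title, array( 'step' => $this->params['step'] + 1 ) )
                ) );
            }
            return true;
        }
    }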
[22:45:10] I tried coupling it with 1.18 and running the update.php but I'm getting errors which google does not have an answer to. [22:45:38] caimlas: well, you could make a list of tables [22:45:42] what errors? [22:45:50] And then check it against the database scheme docs [22:46:02] IIRC they have version number for when each table was introduced [22:46:16] !schema [22:46:16] http://www.mediawiki.org/wiki/Manual:Database_layout [22:46:18] Reedy, "...ipblocks table does not exist, skipping new field patch" [22:46:53] Oh [22:46:58] That must be really old [22:47:36] I want to say it's 1.7 but I'm not sure. [22:48:17] I also got "Set $wgShowExceptionDetails = true; in LocalSettings.php to show detailed debugging information." and did so, but I'm not seeing where that detailed debugging information is going (doesn't change the output of php update.php at any rate) [22:49:09] Cause it's not encountering any exceptions [22:49:48] We are using PCRE for regexes. I remember they had some problems. Can anyone remind me which problems were there? [22:53:04] 03jeroendedauw * 10/trunk/extensions/EducationProgram/ (3 files in 2 dirs): get rid of duplicate code and added min width [22:53:14] 03raymond * 10/trunk/tools/ToolserverI18N/language/messages/ (13 files): Localisation updates for ToolserverI18N from translatewiki.net [22:55:57] vvv, Reedy, can either of you recommend a way in which I can get this wiki db in use/upgraded to the most current? best idea I've got is to figure out which version it was, then upgrade in 'steps', but i suspect i'll encounter similar problems along the way. [22:56:20] Well [22:56:32] Does it have "cur" and "old" tables? [22:57:12] for some reason, we apparently don't always drop old tables [22:57:18] Okay [22:57:24] Does it have "revision" table? [23:02:41] based on the table history, looks like it's either 1.6 or 1.5 [23:03:01] no cur or old tables. [23:04:22] i've got validate and templatelinks [23:05:13] and pagelinks, so probably 1.6? [23:06:25] vvv, and yes, there's a revision table. [23:06:54] That's good [23:06:57] ah. :) [23:06:59] good to know. [23:07:05] i know 1.4 and prior is a bitch. [23:07:08] That means that the migration will not be as painful as it could be [23:07:22] vvv, recommendations on a general approach? [23:07:31] If update script does not work, try to apply patches manually [23:07:54] It sort of tedious, but then you can see the whole process [23:07:58] vvv, so, still better to go all in one leap? [23:08:14] instead of jumping to eg. 1.12 and then 1.17 or some such thing? [23:08:23] You could try updating incrementally [23:08:32] But I do not know which version you'd need [23:10:33] gotcha. [23:11:10] oh fudge I know what the problem is (may be) [23:11:14] table prefix. [23:11:16] *smack* [23:11:20] let's fix that. :) [23:20:41] Weird [23:21:09] It is possible to create situation in V8 when running time of regex has exponential time depending on the string which is checked [23:35:16] 03(mod) UnknownAction hook problem: the hook must pass the unknown action to the callee and not the value "nosuchaction" - 10https://bugzilla.wikimedia.org/34203 summary (10T. Gries) [23:46:19] has anyone used https://github.com/jpatokal/mediawiki-gateway ?
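For the upgrade trouble above, the fix spotted at the end is a LocalSettings.php prefix mismatch; a sketch (the prefix value itself is only an example):

    // LocalSettings.php — must match whatever prefix the imported tables actually carry,
    // otherwise update.php reports tables such as ipblocks as missing.
    $wgDBprefix = 'wiki_';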