[00:04:16] New review: MarkAHershberger; "> Is space significant around a Math equation?" [mediawiki/extensions/Math] (master) C: 0; - https://gerrit.wikimedia.org/r/4422
[00:10:57] New review: MarkAHershberger; "http://krt3.lsu.edu/training/latex/latex.html#spaces" [mediawiki/extensions/Math] (master) C: 0; - https://gerrit.wikimedia.org/r/4422
[00:22:17] (NEW) SVGs state "No higher resolution available" on file description page - https://bugzilla.wikimedia.org/35821 normal; MediaWiki: Images and files; (rd232)
[00:53:15] (mod) Cannot scroll low enough to see all content - https://bugzilla.wikimedia.org/35551 +comment (Killiondude)
[01:18:26] !seen bawolff
[01:18:28] bawolff (~bawolff@wikinews/bawolff) was last seen quitting from #mediawiki 23 hours, 47 minutes ago stating (Quit: ChatZilla 0.9.88.1 [Iceweasel 3.5.16/20110302220840]).
[01:18:37] !seen ialex
[01:18:39] ialex (~IAlex@mediawiki/pdpc.active.ialex) was last seen quitting from #mediawiki 9 days, 3 hours, 44 minutes ago stating (Quit: ialex).
[01:41:01] (mod) Cannot scroll low enough to see all content - https://bugzilla.wikimedia.org/35551 +comment (Amgine)
[01:53:04] Heh... looking deeper I think git REALLY doesn't work with diffs...
[01:54:14] From the sounds of it git doesn't store commits as deltas as I thought it would have...
[01:54:29] It stores the entire file contents in a blob.
[01:54:50] ;) It gets its efficiency because it has such an awesomely efficient packfile format.
[01:55:32] That might be fun to try adapting into an external storage format.
[02:07:37] Hello guys, are there any graphing extensions that create interactive graphs instead of just static pictures?
[02:13:42] hi all
[02:13:45] what do you think of this? http://wikipediafs.sourceforge.net/
[02:13:53] it looks like something I've been looking for and I'm wondering if someone got it working / experiences / ...
:)
[02:18:19] dcht00: https://www.mediawiki.org/wiki/Manual:External_editors
[02:20:17] What if I'd like an application to simply read & edit wiki pages?
[02:21:30] you think using the API would be better suited? http://www.mediawiki.org/wiki/API:Main_page
[02:23:15] I think it might just be something I did, but why is my active users -1 on this page: http://wiki0x10c.com/wiki/Special:Statistics
[02:24:39] (NEW) Output preview should go through Tidy if Tidy is used to process article - https://bugzilla.wikimedia.org/35822 normal; MediaWiki extensions: ExpandTemplates; (liangent)
[02:33:06] (NEW) Wikijunior and cookbook namespaces for the Vietnamese Wikibooks - https://bugzilla.wikimedia.org/35823 normal; Wikimedia: Site requests; (mxn)
[02:43:14] (mod) Watchlist doesn't show or count pages not in the main namespace when deleting them from the watchlist - https://bugzilla.wikimedia.org/35601 +need-review +patch; +comment (Sumana Harihareswara)
[02:48:54] T_T I've returned to proc hell
[03:06:41] <^Mike> I guess ipb-blockingself must mean I am being stopped because I'm blocking myself - but that's what I want. How can I override or avoid this API error?
[03:07:18] You want to block yourself?
[03:07:43] Joan: gosh, it's what all the cook kids do
[03:07:49] cool
[03:07:51] <^Mike> Yes, for testing.
[03:07:53] Hah! By proc_hell, hello git abuse!!!
[03:07:58] Bye*
[03:09:22] http://dpaste.org/sbyCx/
[03:09:57] ^Mike: I'm looking.
[03:10:13] <^Mike> awesome, thanks
[03:10:25] <^Mike> It isn't documented in the api help page, or the on-wiki docs
[03:11:15] >
[03:11:17] # Give admins a heads-up before they go and block themselves. Much messier
[03:11:20] # to do this for IPs, but it's pretty unlikely they'd ever get the 'block'
[03:11:23] # permission anyway, although the code does allow for it.
[03:11:25] # Note: Important to use $target instead of $data['Target']
[03:11:28] # since both $data['PreviousTarget'] and $target are normalized
[03:11:30] # but $data['target'] gets overriden by (non-normalized) request variable
[03:11:33] # from previous request.
[03:11:35] if( $target === $performer->getName() &&
[03:11:38] ( $data['PreviousTarget'] !== $target || !$data['Confirm'] ) )
[03:11:40] {
[03:11:43] return array( 'ipb-blockingself' );
[03:11:45] }
[03:11:48] >
[03:11:50] So it looks like you need to specify a confirm parameter of some kind.
[03:12:00] Which is a hidden value in the form.
[03:12:34] <^Mike> I'm using the API, I can't even look for hidden elements in the UI :)
[03:13:29]
[03:13:34] You might be able to specify one of those?
[03:13:36] Dunno.
[03:24:49] return array( 'ipb-blockingself' );
[03:24:52] deep
[03:25:16] you sure you do not want to put it in an infinite loop?
[03:28:25] New patchset: Eloquence; "Fix for chunked uploading support in API." [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4544
[03:29:42] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4544
[03:42:00] Do criminals get copyright protection? For example, this coin is a fake being advertised as genuine, and I stole the image so it could be annotated to protect potential victims: http://www.coincompendium.com/w/index.php/File:1333942588-6317.jpg
[03:42:31] I'm not sure if I should respect copyrights in such circumstances.
[03:42:49] You would get a better response asking in another channel that isn't dedicated to MW development
[03:43:03] oh, whoops, thought I was in #wikipedia
[04:23:36] (mod) Language and direction of first heading should depend on page content language instead of user interface language - https://bugzilla.wikimedia.org/34514 +comment (Amir E. Aharoni)
[04:27:30] * DanielFriesen wonders if he went overboard
[04:58:23] Aaaaagghhhhhh...
it's a tug-of-war of language advantages-disadvantages...
[05:02:57] Ruby: Piles of libraries. Great process api and piles of other things I'd need; No, Python+Django, it's got similar levels of good api, and django has good user handling built in; No, but Ruby has JRuby, if we need to we can easily start using JGit.
[05:04:49] DanielFriesen: also https://www.samba.org/~jelmer/dulwich/
[05:05:59] DanielFriesen: it is reasonably complete + stable (hg and bzr use it for their git interop, apparently)
[05:06:10] + pure python, so no install hell yay
[05:06:55] Would it be feasible to make a log of edits which are blocked by the local/global spam blacklist?
[05:11:11] (mod) Need a way to debug exceptions thrown from my ResourceLoader module "Uncaught RangeError: Maximum call stack size exceeded" - https://bugzilla.wikimedia.org/35814 +comment (Dmitriy Sintsov)
[05:17:11] [QUESTION] I just installed a fresh 1.18.2, imported an sql dump, confirmed import, restarted apache2 and mysql, and the wiki doesn't seem to have integrated any of the data... any takers?
[05:22:17] Sieva_: Is LocalSettings.php configured to read from the appropriate database?
[05:22:25] The same database where you imported the data?
[05:22:37] How'd you import the data?
[05:23:24] CLI
[05:23:51] imported to the db that the web installation was hooked to
[05:23:58] YuviPanda: I looked at that shortly ago... I haven't found a single acceptable git library yet... plenty of them are missing something as simple as "fetch commits from remote"
[05:24:08] DanielFriesen: ah
[05:24:23] At this rate I may go high level...
[05:24:25] Sieva_: You imported into an existing DB?
[05:24:27] DanielFriesen: you could write the app in bash :D
[05:24:33] lol
[05:24:38] @Joan: Yes
[05:24:40] Were you moving the wiki or something?
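An editor's note on the git-storage thread above: the observation is easy to verify. Git stores each revision as a complete blob whose id is the SHA-1 of a small header plus the full content; a loose object on disk is just that store, zlib-deflated, and the space savings come later from delta compression inside packfiles. A minimal Python sketch (the file content is a made-up example):

```python
import hashlib
import zlib

def git_blob_id(content: bytes) -> str:
    # Git stores the full file contents, not a delta: the object is
    # "blob <size>\0<content>", and its id is the SHA-1 of that store.
    store = b"blob %d\0" % len(content) + content
    return hashlib.sha1(store).hexdigest()

def loose_object(content: bytes) -> bytes:
    # On disk a loose object is simply the zlib-compressed store; real
    # compression wins come later, when `git gc` delta-compresses
    # similar objects inside a packfile.
    return zlib.compress(b"blob %d\0" % len(content) + content)

print(git_blob_id(b"hello\n"))
```

Running `git hash-object` on a file containing the same bytes prints the same id, which is a quick way to check the sketch against a real git install.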
[05:24:49] yes, moving to new box
[05:25:12] Well, if you're not seeing the new data,
[05:25:14] (REOPENED) "oversighter", "oversight", "unoversight", "hide" and "unhide" messages are very unclear - https://bugzilla.wikimedia.org/35026 +comment (Amir E. Aharoni)
[05:25:20] I'd say it's either reading from the wrong place.
[05:25:23] Or you're hitting cache.
[05:25:46] hmm... I'll clear APC. brb, thank you!
[05:26:26] is anyone around to explain create account, auto creation and interaction with the filter action == 'createaccount'
[05:26:45] I'd imagine the filter doesn't account for auto-creation.
[05:26:48] But that's just a guess.
[05:26:50] the subtleties are interesting, so I am after some clarification
[05:27:03] yep, that would be my guess
[05:27:27] well, that would be my expectation with the evidence to hand
[05:29:15] @Joan: nope, not cache. using show tables for the db given in localsettings, it has all the relevant tables shown. maybe the db structure didn't like the import, and the import just bloated the db with unreadable table structure?
[05:29:42] Sieva_: Can you select from the page table?
[05:29:48] Does it have page titles?
[05:29:57] select * from page limit 1;
[05:29:59] Or something.
[05:33:51] returns info on monobook.css :/
[05:34:41] should i import to a completely empty table and redirect localsettings?
[05:38:53] How'd you dump the database?
[05:38:55] mysqldump?
[05:39:07] (NEW) autocreation of accounts skips filter action == 'createaccount' - https://bugzilla.wikimedia.org/35824 normal; MediaWiki extensions: AbuseFilter; (billinghurst)
[05:39:24] I'd not do a fresh install DB merge with exported data.
[05:39:27] That seems like asking for trouble.
[05:39:36] I'd just dump and then import in and have the import make the new DB.
[05:39:54] Then copy your LocalSettings.php file over and make the appropriate changes to $wgDBname, etc.
[05:42:49] Sieva_: have you looked at our guide for moving your wiki?
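For reference, the move being debugged above boils down to three steps. This is a sketch only, following the dump-into-a-fresh-database advice given in the channel; the database and user names are placeholders:

```shell
# On the old box: dump the wiki database (names are placeholders).
mysqldump --single-transaction -u wikiuser -p wikidb > wikidb.sql

# On the new box: create an EMPTY database and import into it,
# rather than merging the dump into a fresh-install schema.
mysql -u root -p -e "CREATE DATABASE wikidb"
mysql -u wikiuser -p wikidb < wikidb.sql

# Then copy LocalSettings.php across and point $wgDBname, $wgDBuser
# and $wgDBpassword at the new database before loading the wiki.
```

If pages still don't appear afterwards, the `select * from page limit 1;` check suggested above confirms whether the import actually landed in the database the wiki is reading.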
[05:43:19] yes, was working from that.
[05:43:25] dumped from CLI
[05:44:26] attempting fresh db and redirect localsettings now.
[05:50:12] * DanielFriesen seriously does not understand why php requires a timezone setting
[05:50:49] * johnduhart shrugs
[05:51:17] DanielFriesen: I blame lack of design... ;)
[05:53:12] php does a lot of stupid crap up front that you really don't need to have done/waste a chunk of performance. Some of it requires timezones. Why? there is no reason.
[05:59:29] (mod) Need a way to debug exceptions thrown from my ResourceLoader module "Uncaught RangeError: Maximum call stack size exceeded" - https://bugzilla.wikimedia.org/35814 +comment (Dmitriy Sintsov)
[06:08:22] (NEW) fullurl arguments exposing nowiki stripmarkers - https://bugzilla.wikimedia.org/35825 normal; MediaWiki: Parser; (sumurai8)
[06:29:41] New patchset: Santhosh; "Fix Bug 33658 - Add support for {{GRAMMAR:}} to the mediawiki.jqueryMsg module" [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4078
[06:30:15] (NEW) Protool neutral link from https://labsconsole.wikimedia.org/wiki/Help:Access#Giving_users_Labs_access.2C_if_they_don.27t_already_have_SVN_access - https://bugzilla.wikimedia.org/35826 normal; Wikimedia Labs: Unspecified; (billinghurst)
[06:30:27] (mod) Protocol neutral link from https://labsconsole.wikimedia.org/wiki/Help:Access#Giving_users_Labs_access.2C_if_they_don.27t_already_have_SVN_access - https://bugzilla.wikimedia.org/35826 summary (billinghurst)
[06:30:59] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4078
[07:02:15] New review: Hashar; "(no comment)" [mediawiki/extensions/AntiSpoof] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/3410
[07:02:18] Change merged: Hashar; [mediawiki/extensions/AntiSpoof] (master) - https://gerrit.wikimedia.org/r/3410
[07:05:27] New review: Hashar; "(no comment)" [mediawiki/core] (master); V: 0 C: 1; -
https://gerrit.wikimedia.org/r/4551
[07:08:38] New review: Hashar; "(no comment)" [mediawiki/core] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/4521
[07:08:41] Change merged: Hashar; [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4521
[07:37:29] (mod) Protocol neutral link from https://labsconsole.wikimedia.org/wiki/Help:Access#Giving_users_Labs_access.2C_if_they_don.27t_already_have_SVN_access - https://bugzilla.wikimedia.org/35826 normal->enhancement (Peter Bena)
[07:39:47] (mod) CheckExtensionsCLI instantiates PremadeMediawikiExtensionGroups with incorrect constructor parameters - https://bugzilla.wikimedia.org/35804 +comment (Niklas Laxström)
[07:48:08] (mod) Gerrit uses flash - https://bugzilla.wikimedia.org/35800 +comment (Niklas Laxström)
[07:50:25] Does anyone understand how templates will interact with the wiki engine? I read http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2012-01-30/Technology_report and was surprised :-)
[07:53:04] Hello, is there any "Who's online"?
[07:53:58] In MediaWiki?
[07:54:09] No, at least not in core
[07:54:15] There may be extensions for that
[07:54:48] How?
[07:56:50] I mean, is it in the download?
[08:19:51] https://www.mediawiki.org/wiki/Extension:WhosOnline
[08:38:29] New patchset: IAlex; "Check that the result of Title::makeTitleSafe() is an object before calling a method on it."
[mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4552
[08:39:47] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4552
[08:43:03] (mod) [Regression] Edit summary shouldn't be parsed as wikitext into html in e-mail notifications - https://bugzilla.wikimedia.org/35019 +comment (Alexandre Emsenhuber [IAlex])
[08:43:04] (DUP) Templates used in edit summaries are expanded in e-mail notifications - https://bugzilla.wikimedia.org/34714 +comment (Alexandre Emsenhuber [IAlex])
[08:44:24] Joan uh oh is it working properly, the ping config?
[08:56:03] (mod) Introduce custom events for MediaWiki's front-end flow - https://bugzilla.wikimedia.org/30713 (joan.creus.c)
[09:07:55] New patchset: Santhosh; "Grammar rules ported to js" [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4554
[09:07:59] New patchset: Santhosh; "Fix Bug 33658 - Add support for {{GRAMMAR:}} to the mediawiki.jqueryMsg module" [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4078
[09:09:12] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4554
[09:10:24] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4078
[09:13:17] New code comment: Mihalienkelejd; http://www.emcollections.net/default_en.asp;
[09:43:39] (NEW) Special:UserRights does not accept GET params - https://bugzilla.wikimedia.org/35827 normal; MediaWiki: Special pages; (danny.b)
[09:44:56] saper: hi saper, some help!
[09:45:50] saper: how do I read a tag from wikitext using the parser?
[09:47:19] gwicke, hello
[09:47:41] hi chughakshay16
[09:48:04] gwicke: how do I read a tag from wikitext using a parser object..?
[09:48:33] chughakshay16: you can register a parser hook
[09:49:06] chughakshay16: http://www.mediawiki.org/wiki/Manual:Tag_extensions
[09:49:28] gwicke: yeah i went through this page..
but still not very clear about how to do it
[09:49:43] chughakshay16: did you try the sample code?
[09:49:56] (mod) Grant the abusefilter-log-detail right to patrollers on Commons - https://bugzilla.wikimedia.org/35545 +comment (Rainer@Rillke.eu)
[09:49:57] (mod) Enable $wgAllowCopyUploads (upload by URL) - https://bugzilla.wikimedia.org/20512 (Rainer@Rillke.eu)
[09:50:04] gwicke: what if i don't want to do it with a hook?
[09:50:33] chughakshay16: then describe what you actually want
[09:50:37] gwicke: i have stored a tag in a page's content.. and now i want to read its attributes.. how should i do it?
[09:51:03] (NEW) Sometimes the nearby layer can be scrolled up to reveal the web page beneath it. - https://bugzilla.wikimedia.org/35828 normal; Wikipedia App: Near by; (liangent)
[09:51:07] chughakshay16: a parser hook is pretty much the only supported way to do that
[09:52:00] gwicke: what if i extract the wikitext for a page from the text table.. and now i want to read the attributes of my tag..?
[09:52:11] gwicke: can't i do it any other way?
[09:52:40] chughakshay16: you can always parse the text yourself
[09:53:31] gwicke: yeah but for that i would have to develop new code to parse it.. isn't there any parser function already available which just extracts the tag i am looking for?
[09:54:36] chughakshay16: tag hooks do just that
[09:54:44] ideas on how to get all diffs from a specific wikipedia? right now i'm trying to get them from the revision table and searching with a regexp in rev_comment, but i wonder if there's something more effective (maybe looking at the recentchanges table?)
[09:57:17] gwicke: i don't know how to explain to you what exactly i am trying to do.. :)
[10:00:15] (mod) API imageinfo query returns no imageinfo data when asking for redirect and redirect target - https://bugzilla.wikimedia.org/31849 +comment (Rainer@Rillke.eu)
[10:11:08] varnent: hello
[10:33:07] or, maybe, a way to get the previous revision?
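To make gwicke's answer concrete: a tag extension registers a render callback, and the parser then hands that callback the tag's body and attributes whenever the page is parsed. A minimal sketch in the style of the Manual:Tag_extensions sample linked above; the tag name `<mytag>` and its `foo` attribute are invented for illustration:

```php
<?php
// Hook registration, e.g. from an extension setup file.
$wgHooks['ParserFirstCallInit'][] = 'wfMyTagInit';

function wfMyTagInit( Parser $parser ) {
	// After this, every <mytag ...>...</mytag> in wikitext is routed
	// to wfMyTagRender instead of being parsed as ordinary text.
	$parser->setHook( 'mytag', 'wfMyTagRender' );
	return true;
}

function wfMyTagRender( $input, array $args, Parser $parser, PPFrame $frame ) {
	// $args is the attribute array being asked about:
	// <mytag foo="bar">text</mytag> arrives as $args['foo'] === 'bar'
	// and $input === 'text'. Return HTML-safe output.
	$foo = isset( $args['foo'] ) ? $args['foo'] : '';
	return htmlspecialchars( $input . ' (foo=' . $foo . ')' );
}
```

Note this only fires when a page is parsed; reading raw wikitext out of the text table and scanning for the tag yourself would mean re-implementing that parsing, which is why the channel keeps steering back to hooks.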
[10:44:27] joancreus: I didn't read everything but the API might help you
[10:44:48] you should ask someone who knows how to deal with the API
[10:46:45] joancreus: all diffs? download a dump of the database with full history (it is an xml file)
[10:46:57] (mod) A general interwiki language link to oldwikisource is required - https://bugzilla.wikimedia.org/32189 +comment (Robin Pepermans (SPQRobin))
[10:50:03] (NEW) WikiStats comparison charts of (very) active users are broken, 0 is shown as missing - https://bugzilla.wikimedia.org/35829 normal; Wikimedia: Usage Statistics; (federicoleva)
[10:51:53] Tim_Weyer: Beau_ i've thought of using api/dumps/mysql. api definitely not, because there'd be too many requests. the problem with dumps/mysql is that i cannot find which are reversions, and which are not
[10:53:14] Beau: Have you changed the Proofread page code? http://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta is not behaving correctly
[10:53:30] joancreus: why not? just compare the revision you are processing with earlier ones; if it matches, then the current edit was a revert
[10:53:53] it has n^2 complexity, but it works ;)
[10:54:02] Beau: And I'm getting rather frustrated that this inconsistency of behaviour occurs EVERY single time I try to do section transcludes
[10:54:23] I don't like playing hunt-the-obscure-typo EVERY single time
[10:54:31] hmm ok
[10:55:12] joancreus: you can build a hash table to speed things up if it is too slow, but I would go with the simplest solution first :)
[10:55:31] (mod) Transwiki import to multilingual (old) wikisource broken - https://bugzilla.wikimedia.org/32411 +comment (Doug)
[10:55:52] the other way, Beau_, is to look at the recentchanges table, but it only lasts 30 days
[10:55:53] Qcoder00: I don't think there were any deployments of PP code recently
[10:56:00] OK
[10:56:15] So why does the transclude 'break'?
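Beau_'s suggestion above (compare each revision's text against all earlier ones, and use a hash table if the n^2 scan is too slow) can be sketched like this; the revision tuples are invented sample data:

```python
import hashlib

def find_reverts(revisions):
    # revisions: iterable of (rev_id, text) in chronological order.
    # An edit counts as a revert when its full text exactly matches
    # some earlier revision of the page -- the comparison suggested in
    # the channel, done in O(n) by hashing each text instead of the
    # naive all-pairs scan.
    seen = {}      # text hash -> first rev_id with that content
    reverts = []   # (reverting_rev_id, reverted_to_rev_id)
    for rev_id, text in revisions:
        key = hashlib.sha1(text.encode("utf-8")).hexdigest()
        if key in seen:
            reverts.append((rev_id, seen[key]))
        else:
            seen[key] = rev_id
    return reverts

history = [(1, "foo"), (2, "foo bar"), (3, "foo")]  # rev 3 restores rev 1
print(find_reverts(history))  # -> [(3, 1)]
```

This sidesteps the rev_comment regexp approach entirely, so it needs no localisation of edit-summary wording, at the cost of reading full revision texts from the dump.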
[10:56:19] joancreus: the recentchanges table does not contain information on whether an edit is a revert or not
[10:56:29] Beau_: rev_last_id?
[10:56:30] And include stuff that's CLEARLY after the end of the named section?
[10:56:38] joancreus: that's just the id of the previous revision
[10:56:54] :O what i can also do is rev_comment LIKE 'Revertides%'
[10:56:58] but that needs l10n
[10:57:07] and still, no oldid
[10:57:07] joancreus: compare revision texts
[10:57:10] i'll go with your solution
[10:57:13] of comparing
[10:57:15] thanks!!
[10:58:57] Qcoder00: hmm...
[10:59:17] http://en.wikisource.org/wiki/Page:Ruffhead_-_The_Statutes_at_Large,_1763.djvu/45 is what is being transcluded from
[10:59:38] And if you check the markup for the page the sections ARE clearly labelled
[11:00:02] It seems that Proofread Page can't cope with sections inside tables
[11:00:22] Or that there is an obscure typo that causes it to misbehave
[11:01:04] Qcoder00: I don't think that this is LST or PP's fault, it does not modify the transcluded text
[11:01:18] Well please explain why the transclude breaks
[11:03:07] Qcoder00: sure :-) https://en.wikisource.org/w/index.php?title=The_Statutes_at_Large_%28Ruffhead%29%2FVolume1%2FMagna_Carta&diff=3735805&oldid=3735796
[11:03:27] Well that now works
[11:03:41] but I'd had the exact same code as you before and it did NOT
[11:04:02] Qcoder00: that was the only change I made
[11:04:10] Strange
[11:08:59] Anyway how is the onlysection idea coming along?
[11:15:33] New patchset: Santhosh; "Introduce useosk=true as alternate method to enable osk feature and other bug fixes, cleanup" [mediawiki/extensions/Narayam] (master) - https://gerrit.wikimedia.org/r/4555
[11:15:34] New patchset: Santhosh; "Add initial version of optional onscreen keyboard support." [mediawiki/extensions/Narayam] (master) - https://gerrit.wikimedia.org/r/4070
[11:19:00] Platonides, "Could we magically convert $wgFilterCallback into hooks?"
- You want to change every call to $wgFilterCallback into a hook?
[11:19:56] Beau: Hmm
[11:20:30] https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta Can we take this to PM, because I still seem to be having issues getting stuff to transclude properly
[11:20:33] ?
[11:21:17] Fu... wha? `echo (binary) 'foo';`
[11:23:39] hey everyone! can someone help me on spam prevention?
[11:24:08] I want to hide external links on the wiki until they are approved
[11:24:27] pouyanster: I'm not sure that's currently possible
[11:24:51] Qcoder00: do you have any other suggestions?
[11:25:16] I don't
[11:25:36] Qcoder00: we want to avoid black/white list filtering and the idea was to take external links as a hint
[11:26:20] thanks anyway.
[11:32:26] Beau_: Next question - https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta The column widths should be the same...
[11:33:02] They aren't, despite setting up the table exactly the same as I did in - https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Statutes_of_Merton
[11:37:34] Also in https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta at the bottom there is a sidenote showing up where it shouldn't
[11:37:44] It's not present in the original at the location where it's rendering
[11:41:58] Hey all. I added a bug report to bugzilla two days ago.
It's just a request for a change on MediaWiki.org: https://bugzilla.wikimedia.org/show_bug.cgi?id=35794
[11:45:46] New code comment: SVG; [[Manual:$wgFilterCallback|$wgFilterCallback]] doesn't say it's deprecated, just outdated page?;
[11:52:54] cervidae * /trunk/extensions/Postcomment/Postcomment.php:
[11:52:54] Use an array to replace comma by 'and' if there are several authors
[11:52:54] Corrected descriptionmsg of $wgExtensionCredits after checking Postcomment.i18n.php
[12:00:50] https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta - In CAPI a sidenote is rendering twice?
[12:00:59] I can't find the typo causing this?
[12:01:01] Suggestions?
[12:04:55] ||
[12:04:55] ||
[12:05:02] everything is given twice
[12:07:38] (mod) Content (including categories) not showing on file redirects - https://bugzilla.wikimedia.org/27857 summary; +comment (Rd232)
[12:08:36] Tim_Weyer: One section is the English
[12:08:41] One section is the original
[12:08:50] The page scan is multi-col
[12:09:08] Tim_Weyer: http://en.wikisource.org/wiki/Page:Ruffhead_-_The_Statutes_at_Large,_1763.djvu/46
[12:09:17] And the sidenote is only CODED once.
[12:09:25] So it should not be rendering twice
[12:09:31] Where's the typo?
[12:09:32] yes, I see
[12:10:47] Sorry, Qcoder00, I don't know what the problem is. So many new things have come along in the last few years...
[12:13:17] (mod) A general interwiki language link to oldwikisource is required - https://bugzilla.wikimedia.org/32189 +comment (Doug)
[12:13:32] short URLs: fullurl doesn't use $wgArticlePath. Is that intended?
[12:13:49] (mod) Content (including categories) not showing on file redirects - https://bugzilla.wikimedia.org/27857 +comment (Rd232)
[12:15:50] New review: J; "(no comment)" [mediawiki/core] (master) C: 1; - https://gerrit.wikimedia.org/r/4544
[12:17:07] Subfader: Yes, it is intended.
But it's also possible to use /wiki/Article?action=edit
[12:18:30] https://www.mediawiki.org/wiki/Manual:$wgActionPaths
[12:18:33] I don't want plain URLs to own articles
[12:19:17] oh, I didn't know about $wgActionPaths
[12:20:48] https://en.wikisource.org/wiki/The_Statutes_at_Large_%28Ruffhead%29/Volume1/Magna_Carta - Hmm
[12:20:56] It cannot be this hard to do section transclusions
[12:21:04] $wgActionPaths['edit'] = $wgArticlePath.'action=edit'; should work
[12:21:17] EVERY time i do it I seem to end up with spurious bits of table that shouldn't be there
[12:21:19] erm, '?action=edit'
[12:21:25] and I check the source pages carefully
[12:21:27] New patchset: Liangent; "Add missing message "action-renameuser"" [mediawiki/extensions/Renameuser] (master) - https://gerrit.wikimedia.org/r/4556
[12:21:35] Can someone PLEASE explain where I am making typos?
[12:21:46] before I get even more frustrated?
[12:21:58] I don't understand the use of it. I can use plain URLs like http://www.mywiki.com/wiki/Article?action=edit ? I can do that without $wgActionPaths
[12:23:08] {{fullurl:/wiki/Article?action=edit}} ?
[12:23:14] I'd say if you are already using $wgArticlePath = '/wiki/$1'; you can set $wgActionPaths to make it descriptive
[12:23:16] New code comment: Platonides; DefaultSettings says "@deprecated since 1.17" \ And https://gerrit.wikimedia.org/r/4550 is attempting;
[12:24:54] (mod) A general interwiki language link to oldwikisource is required - https://bugzilla.wikimedia.org/32189 +comment (Nemo_bis)
[12:24:57] New review: Demon; "(no comment)" [mediawiki/core] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/4534
[12:25:33] It seems I can use http://www.mywiki.com/wiki/editCustomActionName/Article
[12:25:52] anyway.
not what I wanted
[12:25:57] Thanks tho
[12:26:43] Subfader: http://p.defau.lt/?rF4hHR02IfGW0rfY9tnZaw
[12:26:57] oh, okay
[12:28:52] I wanted {{fullurl:Article|foo=bar}} to return http://www.mywiki.com/wiki/Article?foo=bar and not http://www.mywiki.com/realWikiPath/index.php?title=Article&foo=bar
[12:29:54] then, you should use this: http://p.defau.lt/?JQAI2p_MA6iOH4Pwtz62Qw
[12:30:44] and configure the .htaccess or httpd.conf in the right way
[12:32:17] Subfader: Tried it now here, it works: http://international.wikiunity.com/
[12:32:57] for example: http://international.wikiunity.com/wiki/Wikiunity_International?action=history
[12:33:14] the links have been changed by: http://p.defau.lt/?JQAI2p_MA6iOH4Pwtz62Qw
[12:34:54] I can use a plain URL like /wiki/Wikiunity_International?action=history as well, but is {{fullurl:Article|foo=bar}} creating such?
[12:35:18] I'll try it, please wait a moment
[12:36:02] {{fullurl:Article|action=edit}}: http://international.wikiunity.com/wiki/Article?action=edit
[12:36:06] just for action links only
[12:36:15] for foo=bar, it doesn't work
[12:36:34] {{fullurl:Article|foo=bar}}: http://international.wikiunity.com/w/index.php?title=Article&foo=bar
[12:36:41] Ah ok, they would have to be defined in advance I guess. Too bad
[12:37:26] See. And my question is whether that is intended. It should use $wgArticlePath just like [[Article]] links do
[12:37:39] but you can already set action links, I'll look for a solution for every link
[12:38:16] No effort please. I can file a new bug report on bugzilla
[12:40:42] Subfader: Don't create a bug report if there is no bug, just let me check
[12:41:11] It is a bug imo. I see no reason for the different handling between fullurl and [[link]].
[12:41:50] you might just need to set $wgActionPaths['foo'] = ....
[12:42:02] Subfader: add action 'foo' to the $actions array
[12:42:07] then it might work
[12:42:28] no custom actions ;)
[12:43:02] yes, I've tried it now..hmm...
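Summing up the short-URL thread for LocalSettings.php: article links honour $wgArticlePath, action links can be made to honour $wgActionPaths, and {{fullurl:}} with an arbitrary parameter still emits the index.php form, which is the behaviour Subfader goes on to file as a bug. A sketch with example paths, following the form the channel itself suggests:

```php
// Pretty article URLs: /wiki/Article
$wgArticlePath = '/wiki/$1';

// Pretty action URLs: /wiki/Article?action=edit etc.
// ($1 is the placeholder the page title is substituted into.)
$wgActionPaths['edit']    = '/wiki/$1?action=edit';
$wgActionPaths['history'] = '/wiki/$1?action=history';

// Note: {{fullurl:Article|foo=bar}} with a parameter that is not a
// configured action still yields the index.php?title=Article&foo=bar form.
```

The web server rewrite rules (.htaccess or httpd.conf) still have to map these pretty paths back to index.php, as noted in the conversation.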
[12:43:07] have guests now
[12:43:09] (NEW) If set, fullurl should use $wgArticlePath to create URLs - https://bugzilla.wikimedia.org/35830 minor; MediaWiki: Parser; (subfader)
[12:48:04] And poof...
[12:48:24] I already have a replacement for my first impl of gareth that does the same things
[12:48:46] <^demon> Yay, already at Gareth v2? :)
[12:49:50] Heh...
[12:50:03] ^demon: I just ended up giving up and rewriting in Symfony
[12:50:16] <^demon> Aww, not ZendFramework?
[12:50:20] <^demon> :)
[12:50:24] ...still hate MVC though
[12:50:39] Nikerabbit?
[12:50:49] I've had to work around two or three bugs already...
[12:51:20] <^demon> Does Symfony tie you into arbitrary directory/naming conventions?
[12:51:53] Mhmm... they say you can use it without them... but that's only a subset of the codebase
[12:52:24] <^demon> *nod*
[12:52:38] chughakshay16: I don't know what you mean... is it your own tag? I am not sure you really need tags
[12:52:48] The parser creates