[00:02:34] (mod) Lag on history page in Firefox 3.5 - https://bugzilla.wikimedia.org/show_bug.cgi?id=19287 (matthew.britton) [00:03:41] is there a way to force the API to force namespaces to a specific language? [00:04:23] anything like lang=en or something [00:15:37] Betacommand: no, because namespaces only have one localisation, and that is the content language [00:15:42] so for each site, they are hard-set to that [00:16:24] Splarka: can you force them to the defaults? [00:16:26] (otherwise you'd have conflicts, and you'd have to alias /every/ namespace to every /other/ namespace) [00:16:44] how dya mean? [00:18:25] Betacommand: example query? [00:18:39] [[Datei:Ohio.JPG]] is the german localization but i am looking to get File:Ohio.JPG instead [00:18:57] the non-localized namespaces [00:19:30] ehm... [00:19:50] okay, here is an example [00:19:51] http://de.wikipedia.org/w/api.php?action=query&titles=Bild:O2-Logo.svg [00:20:01] <page ns="6" title="Datei:O2-Logo.svg" ... /> [00:20:05] ns="6" you see there [00:20:10] yeah [00:20:12] so load [00:20:12] http://de.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespacealiases|namespaces [00:20:16] Datei [00:20:27] and you can also see: Image and Bild [00:20:38] you now know pretty much everything there is to know about Datei: [00:20:43] Splarka: you cant force it to use defaults when returning results? [00:20:55] Datei is the default [00:21:06] that, for all intents and purposes, /is/ the namespace on de.wp [00:21:08] Splarka: I mean the MW default [00:21:16] that is the mw default [00:21:20] in german, /that/ is the namespace [00:21:28] the english canonical is an alias to that [00:21:47] but no, you can't force it to be 'File: [00:21:54] any more than you can force it to be Bild: or Image: [00:22:09] here, you can combine both queries: [00:22:10] http://de.wikipedia.org/w/api.php?action=query&titles=Bild:O2-Logo.svg&meta=siteinfo&siprop=namespacealiases|namespaces [00:22:24] dam I was hoping to make this easier [00:22:37] scraping the dom?
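Splarka's workaround (fetch namespaces and namespacealiases once via meta=siteinfo, then translate locally) can be sketched offline like this. It is a rough Python illustration: the two dicts are a hand-trimmed stand-in shaped like the de.wikipedia siteinfo response, not a live API call.

```python
# Map a localized title such as "Datei:Ohio.JPG" (or the alias
# "Bild:Ohio.JPG") to the canonical "File:Ohio.JPG".
# Example data shaped like siprop=namespaces|namespacealiases output.
NAMESPACES = {6: {"canonical": "File", "local": "Datei"}}  # ns id -> names
ALIASES = {"Bild": 6, "Image": 6}                          # alias -> ns id

def to_canonical(title):
    prefix, sep, rest = title.partition(":")
    if not sep:
        return title  # no prefix at all: main namespace
    for names in NAMESPACES.values():
        if prefix in (names["local"], names["canonical"]):
            return names["canonical"] + ":" + rest
    if prefix in ALIASES:
        return NAMESPACES[ALIASES[prefix]]["canonical"] + ":" + rest
    return title  # unknown prefix: it's just part of the page name

print(to_canonical("Datei:Ohio.JPG"))   # -> File:Ohio.JPG
print(to_canonical("Bild:O2-Logo.svg")) # -> File:O2-Logo.svg
```

In practice you would populate the two dicts once per site from the combined query shown above.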
[00:23:45] is there a template to get the full base page name? in other words, if i have User:DeFender1031/somepage, i would like it to return "User:DeFender1031". it seems that {{BASEPAGENAME}} only returns "DeFender1031". [00:24:03] DeFender1031: {{NAMESPACE}}:{{BASEPAGENAME}} [00:24:06] o_O [00:24:12] though that might fail in main [00:24:32] {{#if:{{NAMESPACE}}|{{NAMESPACE}}:}}{{BASEPAGENAME}} [00:24:39] something made a horizontal scroll, lol [00:24:44] Splarka, of course that would work, i was simply wondering if there was a built-in, as i ten to view using built-in functions where they exist as "the right way" to do something [00:25:05] tend* [00:25:09] DeFender1031: no, there isn't, and {{BASEPAGENAME}} will only work where subpages are enabled [00:25:17] maybe you want #titleparts [00:25:40] Splarka, it seems like {{BASEPAGENAME}} just returns {{PAGENAME}} where they aren't enabled [00:25:41] http://www.mediawiki.org/wiki/Help:Extension:ParserFunctions#.23titleparts: [00:25:50] as does {{SUBPAGENAME}} [00:25:59] yes, exactly [00:26:06] #titleparts works on an arbitrary string though [00:26:11] which makes total sense [00:26:13] it is namespace-agnostic [00:27:11] charitwo: 48 tabs? [00:27:15] nah, i can do what i need with {{NAMESPACE}}, {{BASEPAGENAME}}, and {{SUBPAGENAME}}, i was just wondering if there was a {{FULLBASEPAGENAME}} equivalent or if i had to implement it myself is all [00:27:23] ew [00:27:27] don't give MrZ any ideas [00:27:58] there are too many esoteric magic words already [00:28:11] like <onlyinclude> [00:28:49] what the heck is <onlyinclude>? [00:29:11] i know of <includeonly> [00:29:24] exactly!
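What {{#titleparts:{{FULLPAGENAME}}|1}} computes here (keep only the first slash-separated segment of the full title) is simple to mirror outside wikitext. A rough Python equivalent, with the real parser function's negative counts and offset argument left out:

```python
def titleparts(title, count):
    # Keep the first `count` slash-separated segments, like
    # {{#titleparts:title|count}} (simplified: no offset, no negatives).
    return "/".join(title.split("/")[:count])

print(titleparts("User:DeFender1031/somepage", 1))  # -> User:DeFender1031
```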
[00:30:00] esoteric extra aliases to code that can be done already with existing code are annoying [00:30:01] hahahaha touche [00:30:18] onlyinclude is like wrapping everything else in the page in noinclude [00:30:31] Splarka: dunno, couldn't find it [00:30:39] foo bar baz<onlyinclude>stuff</onlyinclude>foo bar baz [00:30:44] ah [00:30:53] <noinclude>foo bar baz</noinclude>stuff<noinclude>foo bar baz</noinclude> [00:31:05] it isn't like includeonly, because it also shows on the unincluded page [00:31:27] i got it, it's the opposite of noinclude, and confusing at that [00:32:28] DeFender1031: but if you have ParserFunctions you have #titleparts, it can do what ya want [00:35:41] Splarka, i have parser functions, but AFAIK they're usually slower than magic words, so it's smarter to use magic words wherever possible, no? [00:36:03] hmm [00:39:00] DeFender1031: yes [00:40:04] http://en.wikipedia.org/w/api.php?action=parse&forceprofile=true&title=User:DeFender1031/somepage&text=%7B%7BNAMESPACE%7D%7D%3A%7B%7BBASEPAGENAME%7D%7D -> 0.089457 1 - -total [00:40:28] http://en.wikipedia.org/w/api.php?action=parse&forceprofile=true&title=User:DeFender1031/somepage&text=%7B%7B%23if%3A%7B%7BNAMESPACE%7D%7D%7C%7B%7BNAMESPACE%7D%7D%3A%7D%7D%7B%7BBASEPAGENAME%7D%7D -> 0.089277 1 - -total [00:40:41] http://en.wikipedia.org/w/api.php?action=parse&forceprofile=true&title=User:DeFender1031/somepage&text=%7B%7B%23titleparts%3A%7B%7BFULLPAGENAME%7D%7D%7C1%7D%7D 0.073155 1 - -total [00:41:39] ran it again, #titleparts was fastest again [00:42:02] and again, it tied [00:42:06] it just depends [00:42:16] hmm [00:42:18] alright [00:42:20] thanks [00:42:21] Prodego ^_^ [00:42:36] Splarka: well someone should tell someone that [00:42:56] Prodego: I think it is because it only uses one variable and one parser function? dunno [00:43:08] instead of a parser function with three magic words [00:43:11] or two magic words [00:43:13] why even have the magic word [00:43:20] which?
[00:43:27] there must be a FULLPAGENAME or something to do this directly [00:43:43] like {{FULLBASEPAGENAME}}? there isn't [00:43:45] read scrollback [00:44:14] Prodego, FULLPAGENAME isn't what i want [00:44:25] i want only the full basepage name [00:44:50] but these two options both work quite nicely [00:45:54] if you wanna find out why exactly it seems faster, you could compare the traces, eg &forcetrace=true instead of forceprofile, in the above URLs [00:46:45] hmm I guess there is no option to do it directly [00:47:05] odd since we have like {{CURRENTMINUTE12}} or whatever [00:47:31] is weird too, titleparts calls > Parser::braceSubstitution a second time, compared to ns:pn [00:47:32] Splarka, i'm honestly not that concerned with why at this point... it'd be purely academic to research and i don't have the time currently [00:47:47] you can't spare 30 milliseconds? heh [00:48:20] well, then i'd have to analyze results etc... [00:48:25] *Splarka nods [00:49:24] here is one of each that exactly tied in execution time: http://p.defau.lt/?aJERWvh9VDWNZH7SB7OqIQ http://p.defau.lt/?o_HJSX5ihreOgAIw_D7P0w [00:49:35] (almost) [00:49:53] heh [00:54:24] (mod) Search: add "all namespaces" shortcut - https://bugzilla.wikimedia.org/show_bug.cgi?id=11036 +accessibility; +comment (jidanni) [01:13:02] How could I make a wiki farm? [01:13:20] Is there an extension that would allow me to make a wiki farm? [01:13:29] Like a wiki automatically created? [01:13:29] !farm [01:13:29] --mwbot-- To run multiple wikis, you can simply install MediaWiki in different folders, with different databases or in one with database prefixes. You can also have multiple wikis using a single installation: and . [01:13:51] What about like an automatic creation wiki? [01:14:05] Like Wikia has. [01:14:08] And Referata.
[01:14:26] that's more of a wiki breeding pen, than a wiki farm [01:14:32] (mod) Consider $wgRCMaxAge when generating limit links for recent changes - https://bugzilla.wikimedia.org/show_bug.cgi?id=9257 +comment (jidanni) [01:14:55] Does anyone know how I could do that? [01:15:34] Like this: http://referata.com/wiki/Special:CreateSite [01:15:53] http://www.wikia.com/wiki/Special:CreateWiki [01:16:16] When a user wants a wiki, it would automatically create one. [01:16:21] Something like that. [01:16:57] Guest20951: I don't know if that exists [01:17:00] let me see [01:17:07] Wikia has it. [01:17:17] https://svn.wikia-code.com/wikia/trunk/extensions/wikia/AutoCreateWiki/ [01:17:50] (mod) Revert recentchanges limit to 5000 (id.wikipedia) - https://bugzilla.wikimedia.org/show_bug.cgi?id=18197 (stanley) [01:17:54] Splarka: why isn't that on special:version [01:17:56] probably highly customized for their setup [01:18:11] Prodego: because they aren't the best at creating extensions [01:18:25] wikia is an awful awful site [01:18:34] I want to create a better one. [01:18:40] [no citation needed] [01:18:47] Which is why I need an AutoCreate feature. [01:18:52] Prodego: wfm [01:18:55] http://www.wikia.com/wiki/Special:Version [01:18:57] chuck is already doing that [01:19:00] > AutoCreateWiki (Version 1.0) Create wiki in WikiFactory by user requests Krzysztof Krzyz.aniak, Piotr Molski [01:19:15] it isn't on the other site though [01:19:20] what other site? [01:19:32] referata [01:19:48] Referata doesn't have it available. [01:19:48] so? what does that have to do with Wikia? [01:20:12] ah, ok [01:20:28] Thanks for the help. [01:20:32] I see, I was looking at where it was going to be and of course it wasn't there yet so [01:20:50] probably referata uses their own custom script [01:20:58] basically for this type of thing, you'd need to [01:22:15] wikia is still awful though :) [01:23:38] I just wish they would release some things.
:) [01:40:46] (NEW) Gadget usage statistics - https://bugzilla.wikimedia.org/show_bug.cgi?id=19288 enhancement; Normal; MediaWiki: User interface; (oq9pscb02) [01:42:02] Annemarie: code orage, bug summary without verb [01:42:07] ^orange [01:49:31] (mod) Consider $wgRCMaxAge when generating limit links for recent changes - https://bugzilla.wikimedia.org/show_bug.cgi?id=9257 (jidanni) [01:58:49] (mod) Gadget usage statistics - https://bugzilla.wikimedia.org/show_bug.cgi?id=19288 (herd) [02:25:17] is it possible to have two links on one line under the "navigation" on the mediawiki sidebar? [02:25:32] luna6: easy with css [02:25:33] if so would like to know the format when editing mediawiki:sidebar [02:25:44] through css... hmm [02:25:52] can't be done by editing the interface message tho [02:25:53] possible through mediawiki:sidebar splarka? [02:25:55] ahh [02:26:49] hmm then how about ... adding extra links under "toolbox" ? [02:26:56] javascript [02:26:59] or an extension [02:27:03] my navigation section is long and would like to shorten it [02:27:11] extension ok will look into that [02:27:13] you can split it [02:27:25] in the sidebar just specify a new headin [02:27:36] g, like *Community [02:27:55] compare: http://www.mediawiki.org/wiki/MediaWiki:Sidebar?action=edit [02:28:19] you can even stick a portlet (those boxes are called portlets), under the toolbox portlet [02:28:27] easily with the sidebar interface [02:28:33] (but you can't add links inside the toolbox) [02:30:30] ok thanks for the tips splarka [02:30:33] interesting stuff [02:30:43] hmm, as for sharing a line... [02:31:12] I have like a FAQ link and Contact link [02:31:19] would like to have it on one line [02:31:25] FAQ / Contact [02:31:27] etc... [02:31:38] pastebin the page source for that portlet [02:32:08] like for example.
on mw.org the download portlet is: http://p.defau.lt/?kYeh_5qgFPYL_oI27HA42w [02:33:46] hmm [02:34:03] i have a few links written with url and it trips off the pastebin spam detection [02:34:12] http://p.defau.lt/new.html [02:35:15] http://p.defau.lt/?Nt21oSIjvkDytq_k_TzATg [02:35:39] no, the html [02:35:46] er [02:36:43] html err... hmm from my skins .php ? [02:37:12] go to Special:Blankpage, view page source, copy the <div> for that navigation box [02:38:20] "Gadget usage statistics" seems sufficiently clear. [02:38:35] http://p.defau.lt/?BkcOd8l85mbzzrrZVTh9lw [02:39:05] Annemarie: think of the children! [02:39:15] I did. [02:39:20] They put me on parole. [02:39:32] !defaultsettings [02:39:32] --mwbot-- For the current version of DefaultSettings.php, see: . [02:39:51] luna6: ta, so Faq/contact? [02:40:00] yes [02:40:11] FAQ / Contact [02:40:22] I probably do the same with actors & actresses too... [02:40:34] #n-FAQ {display:inline;} [02:40:34] #n-Contact-Form {display:inline;} [02:40:37] however [02:40:46] this would break if you renamed 'Contact form' to 'Contact' [02:40:53] because the ID attribute is derived from the text [02:41:01] then it would be #n-Contact {display:inline;} [02:41:12] also, there is no background image, hmm [02:44:20] #n-FAQ {display:inline;margin-left:.5em;margin-right:.5em;} #n-Contact-Form {display:inline;} isn't too bad [02:45:01] ok stupid question but where would I input that? (#n-FAQ {display:inline;margin-left:.5em;margin-right:.5em;} #n-Contact-Form {display:inline;}) ? [02:45:15] or you could simply: [02:45:16] #n-FAQ {float:right;list-style:none !important;} [02:45:24] oh, MediaWiki:Monobook.css [02:45:36] edit that, insert: #n-FAQ {float:right;list-style:none !important;} [02:45:41] see how that looks [02:45:47] (purge browser cache) [02:47:31] ok... my monobook.css looks empty .. also I am using a customized skin of monobook which is renamed ... [02:47:52] bah [02:47:57] sorry :) [02:47:58] put it in MediaWiki:Common.css then [02:48:07] it'll work, just be a bit redundant for skins that don't see it [02:48:19] to be safe: .portlet #n-FAQ {float:right;list-style:none !important;} [02:48:23] I gotta go, bbl [03:07:08] <\s> Hey guys, I have a very important error (fatal on the main page) to report.
Fatal error: Call to a member function isCurrent() on a non-object in /home/groups/n/ne/neokiller/htdocs/w/includes/SkinTemplate.php on line 710 on bleeding edge svn [03:08:28] when just viewing the page or what? [03:09:26] <\s> OverlordQ: Just viewing [03:09:50] <\s> The overall wiki works, when logging in (bypassing the main page), but it's a big bug [03:10:10] any special extensions? [03:10:31] <\s> I have a custom home page (NeoKiller:Home) and run Extension:CodeReview [03:12:16] that happens on wikimedia too [03:13:00] <\s> TimStarling: Is it a simple problem, or something more serious? [03:13:48] so it happens when you view the main page? [03:14:52] <\s> TimStarling: Yes, http://neokiller.sf.net/wiki/NeoKiller:Home [03:18:44] can you try opening SkinTemplate.php in a text editor and then on line 710 replace this: [03:18:56] if ( ( $wgArticle->isCurrent() && $istalk ) || $wgOut->showNewSectionLink() ) { [03:18:58] with this: [03:19:04] if ( ( $wgArticle && $wgArticle->isCurrent() && $istalk ) || $wgOut->showNewSectionLink() ) { [03:19:41] <\s> TimStarling: One moment [03:23:47] <\s> TimStarling: Done, I'll just try it now [03:24:25] <\s> TimStarling: It appears to be working now [03:24:52] ok, I'll commit that [03:25:40] <\s> Wonderful :) [03:27:20] tstarling * r52149 /trunk/phase3/includes/SkinTemplate.php: Fix fatal error. Logged on WM but exact cause uncertain. [03:29:35] <\s> TimStarling: I just svn up'd it, and now "Parse error: syntax error, unexpected T_SL in /home/groups/n/ne/neokiller/htdocs/w/includes/SkinTemplate.php on line 710" [03:29:48] heh [03:30:00] <\s> Tsk tsk, syntax errors from a top dev :) [03:30:22] it's probably a conflict marker [03:30:35] you should open the file [03:30:59] <\s> What would it look like?
[03:31:06] >>>>>>>> or <<<<<<< or ------- [03:31:10] you'll know it when you see it [03:31:23] just look at the same line you just changed [03:32:28] <\s> Ah, all better [03:51:45] what's the creative commons rdf output supposed to look like? Article::getContributors has MySQLisms in it [04:20:56] Anybody around? [04:21:09] http://www.uncyclopedia.com is currently a GoDaddy parking page [04:21:26] i think you want #wikia [04:21:31] Oh, right [04:21:37] *EugeNetBooK is a few beers in [04:22:29] I think I may hang out here, anyway [04:34:41] can someone help with http://wiki.kahaf.com/index.php/Main_Page ? i had a look at: http://www.mediawiki.org/wiki/Project:Support_desk/Archive_11#Database_error and out of all those i just upgraded my Linux recently. [04:36:33] sounds like mysql was upgraded and the config changed [04:37:33] if you change innodb settings you'll get that [04:39:39] i disabled innoDB in mysql, there is a skip-innodb in my.conf somewhere... [04:40:05] uh, don't do that [04:40:22] although that doesn't look like a missing innodb error [04:40:35] (since that shouldn't cause connection failures..) [04:40:36] i just commented out that line and wiki starts working [04:40:39] great... [04:40:44] thanks everyone [04:56:20] (NEW) Make importDump.php handle bzip2 and 7zip - https://bugzilla.wikimedia.org/show_bug.cgi?id=19289 enhancement; Normal; MediaWiki: Maintenance scripts; (Emufarmers) [04:57:00] Emufarmers: do you have any caching running for fanhistory wiki? [04:57:08] chuck: of course [04:57:31] what do you use? [04:57:54] Memcached, APC, and file caching [04:59:32] How about you?
[05:02:20] Emufarmers: We use memcached, we have APC enabled, and we use Squid [05:02:55] Occasionally we have problems with Squid, though [05:03:58] I've tried setting up Squid a couple times, but it always proved to be too much trouble and there was never enough need to actually keep trying [05:04:55] Yeah, I think I might bypass Squid and have nginx serve the requests directly if I can't fix some of the really really strange slowness [05:05:18] pages will be slow, even after repeatedly trying to refresh them, then when I tack on ?prof=true (to force profiling), it loads perfectly... *facepalm* [05:05:27] Do you use nginx for everything, or just images? [05:05:52] everything [05:06:24] squid does cache images and skin assets, which probably helps [05:06:52] I use apache-worker; solved the memory shortage I was having with -prefork [05:07:17] on a VPS? [05:08:41] Yes? [05:09:32] do you have to use fastcgi with apache-worker? [05:10:01] Yes [05:10:03] (mod) Make importDump.php handle bzip2 and 7zip - https://bugzilla.wikimedia.org/show_bug.cgi?id=19289 (Emufarmers) [05:10:07] It's not as bad as it sounds [05:10:39] I have to use fastcgi with nginx, and it's kind of annoying [05:10:50] Actually, you probably *can* still use mod_php with -worker; it's just not supported (and apt-get made it easy to set it up with fcgi) [05:10:50] how many fastcgi processes can you spawn? (how much memory do you have total?) [05:11:04] 2GB [05:13:13] wow, how does laura afford that? [05:13:57] With prefork I had to keep MaxSpareServers at around 10, and still seemed to be having problems; with -worker now, StartServers is at 2, and ThreadsPerChild is at 25 [05:14:06] Ask her \o/ [05:16:18] How much memory do you have?
[05:17:57] we have 540MB, and 16 php-cgi processes running [05:18:01] mysql is running on another VPS [05:18:16] I'm thinking of switching from Linode to Mosso, it's a lot cheaper to get more memory there [05:19:44] Uh, they don't seem to say how much memory you get [05:20:36] it's on their "Cloud Server" page [05:20:48] http://www.mosso.com/pricing.jsp ? [05:20:58] Oh, I see [05:21:08] http://www.mosso.com/cloudservers.jsp too [05:21:44] It's paid by the hour, but the server with 1GB of ram translates to $43.20 a month [05:22:17] Sounds too good to be true [05:22:36] heh, well it's run by rackspace, which is a pretty reputable company [05:22:47] hi all [05:22:51] Yeah, but Slicehost is owned by Rackspace now too [05:23:11] yep [05:24:02] Ah, wait a minute [05:24:07] you have to pay for bandwidth on top of that? [05:24:15] May I ask a quick question? Who knows the command line to import an xml file please? [05:24:41] Mat__: importDump.php <file> [05:24:52] Emufarmers: oh, really? [05:24:54] didn't see that [05:25:04] Emufarmers thanks [05:25:18] chuck: I didn't see it at first either. I think that's the point. :D [05:25:48] they're good at hiding how much their service costs :D [05:26:09] Well, VPSes are good because they tend to be very up-front about costs [05:26:19] tried this but doesn't work: php importDump.php <file> [05:26:22] Cloud computing is probably going to be less so [05:26:44] because everything will be configurable, so there'll be 20 different things the price is computed from [05:26:53] Mat__: what's the error? [05:27:05] -bash: syntax error near unexpected token `newline' [05:27:15] Uh [05:27:26] You know you have to replace <file> with the name of the file, right?
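That bash "syntax error near unexpected token `newline'" is what the shell gives when the command line still contains its metacharacters, for example a literal <file> placeholder left in place; an unquoted filename with spaces causes related breakage. As an illustration, Python's shlex.quote shows the form the shell needs to see (the filename below is made up):

```python
import shlex

# Hypothetical dump filename containing spaces.
name = "My Page Dump.xml"
print("php importDump.php " + shlex.quote(name))
# -> php importDump.php 'My Page Dump.xml'
```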
[05:27:27] my file has spaces in its name [05:27:31] Oh [05:27:32] yes [05:27:35] Surround it in quotation marks [05:27:41] ok [05:27:52] Emufarmers: I think "cloud" is just a buzzword they're using, it looks like it's exactly the same as traditional VPS hosting [05:28:03] yowch [05:28:17] .22/GB out .08/GB in [05:28:28] chuck: well, in theory, you should be able to adjust everything on the fly. Of course, you can already do that with Slicehost (and linode, I assume) [05:28:47] although in theory you'd also be able to have whatever allotments you want [05:28:52] so 666MB or whatever [05:29:02] Although it looks like they don't let you do that :( [05:29:15] I'm not sure about SliceHost, but you have to migrate to a new host in the datacenter to upgrade to a newer plan. (smaller memory extras can be added without a migration, but it's not very desirable to do that) [05:29:17] but yeah, mostly buzzwordy [05:30:11] chuck: http://www.slicehost.com/questions#upgrade [05:31:17] that answer is a little vague, but I'm guessing that's exactly how Linode does it too [05:31:47] I think I'll stick with Linode for now, we're rewriting with Rails soon anyways, and we'll be hosting with lovely Heroku.com when that time comes [05:33:04] You're dumping MW? [05:33:16] Er, bad choice of words; ditching? [05:33:47] yeah [05:34:11] :( [05:34:19] Why? [05:34:43] It's a pain to upgrade, and the fact that we have to make an entire copy of the schema that has to be upgraded and all of that makes it really hard to implement some new ideas that we have for what we'd like to do next [05:35:06] :| [05:35:12] If it's that hard to upgrade, urdoinitrong [05:35:28] what do you mean? how can I do it any other way? :P [05:35:43] What do you use for your farm? [05:36:11] we have to keep a copy of the entire mediawiki schema for each site that we host. 
doing it any other way would require rewriting a lot of database infrastructure in mediawiki methinks [05:36:17] Emufarmers: our own scripts [05:36:39] Well, what about kim's thing? [05:36:57] Anyway, I don't see why that would make it hard to upgrade; just have a script that runs update.php for every site [05:37:09] kims thing would be even harder [05:38:26] that's what we do, but then when we want to install a new extension, we have to mess with the extension table variables. when we want to delete an extension, we have to do weird stuff to get rid of the scehma [05:38:30] *schema [05:38:58] Meh, that doesn't sound *that* hard [05:39:25] And if you're a slow like me, you'll just leave old schemata sitting in the database o; [05:40:50] *slob [05:41:36] Emufarmers: do I have to be in the maintenance folder to enter the command line importDump.php? [05:41:39] using multiple databases is just wrong in the first place. using rails will allow us to correctly make schema changes using migrations that will affect all the wikis at once without any fuss and extra variables that could go wrong that come with having 100s of copies of the schema laying around [05:42:04] Mat__: no, you just need to adjust the path accordingly [05:42:12] ok [05:42:22] php maintenance/importDump.php if you're in the directory above, for instance [05:42:28] Same goes for the target file [05:42:29] thanks [05:42:33] x2 [05:42:52] chuck: well, you could use prefixes, though those seem to be icky [05:44:24] shinjiman * r52150 /trunk/phase3/languages/messages/ (4 files): Localisation updates Cantonese, Chinese and Literary Chinese [05:45:23] that would just remove the need for new databases to be created, the schema would still be duplicated 100s of times unfortunately :( [05:45:56] Emufarmers: thanks [05:46:00] np [05:46:30] Well, I don't know enough (or really anything) about Rails, so I'll have to take your word for it that it would be better :p [05:48:49] Well, basically instead of having
multiple copies of the schema in different databases, I'd use keys like wiki_id that would separate the pages and stuff in the database [05:49:06] I don't know enough about DBA to know if that would perform well, though. I might ask domas about that. [05:49:57] chuck: different databases allow different indexing [05:49:59] I'm pretty sure every time I've seen DBA near MediaWiki it was marked with skulls and crossbones [05:50:24] also, easier to handle such things as partial replication, etc [05:50:47] domas: would it be bad, though, to have a table with many rows for all of the wikis? [05:50:50] er, actually, I was thinking of something else [05:50:56] oh, is that the fracturing thing or whatever it's called? [05:50:58] I've said already two reasons, you need more? [05:51:09] no [05:51:31] simply, different collations would not be available with single large table [05:51:42] it's easier to handle stuff like partial replication if I use different databases, or if I use one table? [05:52:16] if you use different databases [05:52:27] you cannot split replication streams by data in it, yet :)) [05:52:35] maybe could do with proxy [05:52:51] would need additional development [05:53:08] so it would be better to use a single database with a single set of the schema for all of the wikis, then use an index, right? [05:53:20] better for what? [05:53:50] well, better isn't the right word [05:54:04] i'm not really comparing it to anything, but would it be okay performance-wise to do that?
[05:54:09] yes [05:54:19] would be slightly faster, even [05:54:26] or not [05:54:27] depends [05:54:29] =) [05:55:05] haha [05:55:08] say, many tables avoid table open serialization issues [05:55:16] but many tables also don't hit auto-inc contention on single table [05:55:28] so it is version specific stuff :) [05:57:32] I even showed someone how to do this without changing code too much [05:57:46] you can create views that depend on session variables;-) [05:57:52] I don't think the auto-inc would get too far out of hand [05:57:53] and then use triggers to fill those fields [05:58:15] On Twitter, yes, it got so high that it went higher than the limit of an integer, but in reality on a wiki or something, it shouldn't be as bad :P [05:59:14] anyway, manageability and flexibility are the issues [05:59:26] tell me why multiple tables is bad [05:59:50] (in context of 1000 wikis, not 100000 ;-) [06:01:58] emumarkers: what is wrong inside MW schema? :) [06:02:12] why would developers mark with skull and crosses? :) [06:03:23] It gets tedious to maintain the schema of 100s of wikis, and it would only get worse as we grow... For example, we have a script pieced together that runs update.php on all of the wikis with the correct parameters, but if an update.php fails on a certain wiki for whatever reason, we have to go through the whole output and figure out what was wrong, fix it, then re run the updaters. As I said above, we have to mess with $wgExtNewTables to ... [06:03:29] ... install new extensions, and then we have to write new maintenance scripts to remove extension schema changes. it just seems like there are more variables to go wrong when the schema is to be duplicated 100s of times and then managed and upgraded after that [06:04:04] isn't that problem of your scripts? 
:) [06:04:33] domas: my bad; I was thinking of DBA for object caching [06:04:36] it doesn't matter, 2 or 200 or 2000 [06:04:56] once you have multiple, you automate :) [06:05:16] Well, the WMF seems to manage somehow [06:06:58] I assume you have some set of scripts cobbled together too? [06:07:37] for wiki in $(..) :) [06:08:08] domas: I just think it would be a lot more maintainable to just add a new row to the wikis table and be able to reference to that rather than having to create a new database, insert the schema, and then run the updaters to install extension schema [06:08:34] Of course then we have the problem of removing stuff, but that's DELETE FROM (pages|revisions|whatever) WHERE wiki_id = 666;... [06:08:43] chuck: how does it work when you need different collation for different wikis? [06:08:50] e.g. germans want their umlauts sorted properly [06:09:10] hmm... [06:09:19] Hmm, 'course the WMF uses $wgConf and CentralAuth and other scary stuff [06:09:25] MySQL can't magically do that? ;-) :-( [06:09:36] do what? [06:09:44] provide different indexing schemes in single index? [06:09:49] sort stuff nicely even with special characters [06:09:50] no, it is simply impossible [06:09:55] it can sort stuff nicely [06:09:59] but in one collation [06:10:14] spanish collation is different from german [06:10:23] and lithuanians treat 'y' as 'i' when sorting [06:10:33] darn those lithuanians [06:10:35] MySQL magically does a lot of things, but it's usually stuff it shouldn't be doing ;) [06:11:07] chuck: I assume you looked at Extension:Farmer? [06:11:25] yeah. [06:11:33] chuck: and of course, how do you manage sharding with such tables %) [06:11:45] Wasn't up-to-snuff at the time? 
(though I don't know if it is even now) [06:11:54] and in wikimedia scenario sharding is quite complicated due to data distribution [06:14:28] Emufarmers: It looks nice for smaller setups, but I don't think it's something that we want to piggyback off of and try to extend [06:15:20] Now that I think about stuff closely, I guess it would be remotely possible to do it with MediaWiki, but I don't think it's worth it to bend the software so far and make it do things that it was never designed to do, it will crack eventually [06:17:43] Wikia doesn't seem to have cracked [06:18:51] Wikia is different, that's not really bending MediaWiki far because MediaWiki was meant to work that way [06:19:48] What's the difference between what you do and what Wikia does? [06:20:19] Nothing right now, but we don't want to stay in that market because we don't think it's very profitable at all [06:20:36] What are you going to be doing? [06:21:24] My goal with a new wiki engine is to make it easy for people or groups to register an account and create spaces within their account for collaboration. We'll have different plans to limit the amount of spaces created, with a difference between private spaces and public spaces like you can see in sites like github and bitbucket [06:21:44] like pbwiki? [06:22:14] yeah, kind of like pbwiki [06:22:22] On wheels :D [06:22:43] we'd be organised more like how lighthouse is, where you have your own subdomain and then separate spaces/projects inside of that subdomain [06:30:17] chuck: myspace with a touch of wiki? 
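The shared-schema design chuck describes, one set of tables with rows keyed by wiki_id instead of a full schema copy per wiki, can be sketched with SQLite. Table and column names here are illustrative, not MediaWiki's real schema:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wikis (wiki_id INTEGER PRIMARY KEY, subdomain TEXT)")
db.execute("""CREATE TABLE pages (
    wiki_id INTEGER NOT NULL,          -- which wiki the row belongs to
    title   TEXT NOT NULL,
    body    TEXT,
    PRIMARY KEY (wiki_id, title))""")

db.executemany("INSERT INTO wikis VALUES (?, ?)",
               [(1, "alpha"), (2, "beta")])
db.executemany("INSERT INTO pages VALUES (?, ?, ?)",
               [(1, "Home", "alpha home"), (2, "Home", "beta home")])

# Every read is scoped by wiki_id:
row = db.execute("SELECT body FROM pages WHERE wiki_id = ? AND title = ?",
                 (1, "Home")).fetchone()
print(row[0])  # -> alpha home

# Removing a wiki is one scoped DELETE per table, as mentioned earlier:
db.execute("DELETE FROM pages WHERE wiki_id = ?", (2,))
print(db.execute("SELECT COUNT(*) FROM pages").fetchone()[0])  # -> 1
```

One schema to migrate, at the cost domas points out: a single collation per index and coarser replication and sharding options.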
[06:30:49] probably wouldn't be too many social networking features ;-) [06:31:11] I might throw in a Twitter-like clone for groups [06:43:22] sounds like a lot of work and alot of htaccessing [06:45:22] not really, ServerAlias *.ourdomain.com, and then a simple database query before the request is processed to make sure the account is in good standing [06:46:33] Oh i see, a flexible conf [06:47:56] Basically in Rails, we'd do: @current_account = Account.find(:subdomain => request.something.something) (I forget the rails specific stuff there), then I can just do stuff like @current_account.pages.find_by_name('Home') or whatever [06:48:00] and it's easy [06:48:04] i really like how apache handles the multiple languages in their conf, they use a rewrite rule based on a regexp w / sethandler on the type-map [06:48:45] you should look at the http-manual.conf, it is really nice remapping concepts [06:49:06] chuck: you can do that in any language.. and if you're considering your new project based on how you can write neat one-liners in your pet language, maybe you didn't consider it quite enough ;) [06:49:15] I thought he was using nginx? [06:49:52] flyingparchment: I'm just showing him that it wouldn't be a nightmare of hacked together scripts and "a lot of htaccessing" to make it work. ;-) [06:50:35] well, i would be wary if you werent doing some apache magic, because apache magic is how i would imagine a successful project would look like for this [06:51:06] Emufarmers: I think it's server_name *.ourdomain.com; in nginx [06:58:53] is there an easy way to put sessions in the db? [06:59:14] i have my own custom session handler but it doesnt seem to work 100%, i get the "loss of session data" error [07:00:05] $wgMainCacheType = CACHE_DB; $wgSessionsInMemcached = true; [07:00:12] (a bit hacky) [07:00:24] ialex: that actually works? 
[07:00:42] Emufarmers: it should [07:00:53] $wgMainCacheType should already be in LocalSettings.php [07:01:18] 03(NEW) French spacing matches where it shouldn't - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19290 minor; Normal; MediaWiki: Page rendering; (tstarling) [07:01:51] *Emufarmers greps [07:02:42] Emufarmers: includes/MemcachedSessions.php [07:03:07] and wfSetupSession() in GlobalFunctions.php [07:03:49] Yeah, that's where I'm looking [07:05:37] ah so memcached will have to be enabled? [07:05:43] It looks pretty Memcached-specific to me, but I'll take your word for it [07:05:47] can i just use php's built-in session overrides? [07:06:08] in conjunction with wfSetupSession() ? [07:07:08] i just took down my nfs server by using session_save_path for some reason. with a single user :p [07:11:13] 03ialex * r52151 10/trunk/extensions/ (6 files in 4 dirs): svn:eol-style native [07:30:10] hey domas: http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#Special:BlankPage [07:31:34] i guess i have to use memcached [07:43:20] mshadle_: Not necessarily; $wgSessionsInMemcached is misnamed, 'memcached' there really means any caching mechanism you have enabled [07:43:32] And with $wgMainCacheType = CACHE_DB; that's the database [07:44:13] I *think* this stuff ought to work with non-memcached caches as well, but I'm not sure [07:45:10] *ialex uses it with CACHE_ACCEL [07:45:46] memcached sessions don't seem to work very well for me [07:45:50] they get lost very quickly [07:45:52] werdna: is the error_handler now restored at all if no exception is thrown? [07:46:05] *Emufarmers makes a note on the manual page [07:46:13] morning btw [07:46:18] hoi [07:47:18] wait CACHE_DB will use mysql? [07:47:32] yes [07:47:37] what else would it use? [07:47:38] well that's damn simple [07:47:44] i dont know, what table does it use? 
[07:47:50] objectcache [07:49:24] ah [07:49:31] mshadle_: you'll probably get better performance out of an accelerator or Memcached, if they're options [07:53:12] im not worried aobut performance. it will be a small site [07:53:44] Eh, well, if you like things to load fast, having a bytecode cache helps a lot :) [07:54:01] bytecode cche != session storage [07:54:10] although they can both use apc, sure [07:54:15] I know; I'm just saying [07:54:30] but i dont have a problem with a single server, im on a multi-webserver environment so i need central sessions [07:55:04] Emufarmers: Is adding "extension=apc.so" to php.ini enough to enable APC's bytecode cache? [07:55:23] I think the whole point of Memcached is for multi-webserver environments [07:55:34] chuck: I think so [07:55:48] Emufarmers: Yes, that's pretty much true [07:56:14] Memcached works in a single-server environment but I've heard people say that APC cache is faster in that case [07:56:19] Which makes perfect sense [07:56:21] I wonder which is faster, APC cache in MediaWiki or memcached in mediawiki [07:56:30] that answers my question ;-) [07:56:39] apc is better for a single server [07:56:42] does $wgSessionsInMemcached work with CACHE_APC? [07:56:49] apc is different than memcached, anyway. [07:56:52] ialex says it does [07:56:58] apc is a bytecode cache with optional storage. [07:57:04] chuck: CACHE_APC doesn't exist [07:57:04] memcached is storage, period [07:57:07] I think when I tried them out, the accelerator was a little faster, but Memcached used fewer resources, as I recall [07:57:16] chuck: you need CACHE_ACCEL [07:58:00] isn't that for eAccelerator? [07:58:02] I should benchmark them side-by-side again sometime [07:58:23] APC is an accelerator for this purpose [07:58:38] I wonder if memcached vs APC would make a difference in requests per second [07:58:52] again, apples and oranges [07:59:08] how is it apples and oranges? 
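Pulling the settings from this exchange together into a LocalSettings.php sketch (for a 1.15-era MediaWiki; the variable names are from the log above, the comments are mine):

```php
// LocalSettings.php -- keep both the object cache and sessions in MySQL.
// CACHE_DB stores cached objects in the objectcache table.
$wgMainCacheType = CACHE_DB;
// Despite the name, this routes sessions through whatever cache
// $wgMainCacheType selects (here the database), not necessarily Memcached.
$wgSessionsInMemcached = true;
```

As noted above, an accelerator (CACHE_ACCEL) or real Memcached will usually be faster, but CACHE_DB gives central sessions on a multi-webserver setup with no extra daemons.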
[07:59:30] apc is a bytecode cache, that allows for apc_store and locale in-memory storage. [07:59:35] comparing two request per second benchmarks seems like an apple to apple comparison :P [07:59:41] bye it is a bytecode cache *first* [07:59:54] memcached is for distributed memory-based storage, not opcode caching [08:00:22] So you still need APC or eAccelerator or another bytecode cache if you have Memcached, right? [08:00:31] your best bet is apc for local bytecode caching (or xcache, or eaccelerator) and memcached for objects [08:00:33] You don't *need* it [08:00:45] But yeah, memcached won't do it for you [08:00:47] yes, if you want bytecode caching [08:00:52] which you do [08:00:53] mshadle_: even for one server? [08:00:54] :p [08:00:55] RoanKattouw: well, you still get the huge benefit from bytecode caching [08:00:58] it will be built in to php6 [08:01:07] unless someone figures how to share apc cache between cli and cgi it has big downside [08:01:10] this is so strange [08:01:33] the reason is taht PHP is made by Zend, and they want to sell their commercial accelerator [08:01:34] I got 15 requests per second on a page on YourWiki, but every time I re-run ab with the same settings, the requests per second gets lower and lower... [08:01:35] not a huge one. the web requests are typically multitudes higher than cli requests [08:01:45] as in.. not able to access object cache from command line scripts [08:02:14] mshadle_: no that's not the real problem [08:02:40] chuck: wouldn't it be hitting Squid anyway? [08:02:56] It is, but I don't think Squid likes to cache the main page for some really odd reason [08:03:06] never, ever, buy or pay money for any zend product/training/etc. fyi. [08:03:26] even rasmus today said zend framework was nuts [08:03:50] he liked the ability to load only what you wanted but still too interdependent. 
quote "spaghetti" [08:03:57] < Cache-Control: private, s-maxage=0, max-age=0, must-revalidate [08:04:06] 03catrope * r52152 10/trunk/phase3/ (3 files in 2 dirs): API: (bug 14200) Add user and excludeuser to list=recentchanges and list=watchlist. Requires the rc_user_text index, which was finally added on all servers with the recent schema changes. [08:04:08] Emufarmers: ^ that's on the main page from curl, so there's no cookies or anything [08:04:11] kidna strange [08:04:11] *kinda [08:05:43] Emufarmers: have you experienced any weirdness with the file caching? [08:06:01] like what? [08:06:40] well, how often do you run into problems with dynamic stuff that got put into a page that gets cached? [08:07:09] I guess stuff like DynamicPageList wouldn't work correctly anymore, but that's probably the same for Squid too though, right? [08:07:17] ugga [08:07:36] which brings me.. it should be clearly documented somewhere how extensions can disable caching [08:07:46] chuck: I assume so; we don't use DPL, though [08:08:04] So the only time there's an issue is if the skin or sitenotice changes [08:09:19] have any of you notice content loss on adding a namespace to the wgcontentnamespace? the existing pages in that namespace seem to get clobbered [08:09:33] Nikerabbit: I hate thinking about caching in general :P [08:09:37] so many variables to take into account [08:09:44] Really? I love thinking about it. :D [08:09:52] you're crazy in the head ;-) [08:10:38] 03ialex * r52153 10/trunk/extensions/LiquidThreads/LiquidThreads.php: Quick fix, seems that the file does not exist [08:11:15] rook2pawn: run the fixing script under maintenance [08:11:34] the namespace in the database just doesn't magically change [08:18:56] when you make a page say foo:bar does foo get added to some namespace area in the db or the script area? 
[08:19:25] if there is no namespace foo [08:19:41] it will get 0, (NS_MAIN) foo:bar tuple in the namespace [08:19:50] ops, comman in wrong position [08:19:54] comma* [08:22:06] 03(mod) Gadget usage statistics - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19288 (10niklas.laxstrom) [08:22:15] Will cookies set for .yourwiki.net get sent with a request to static.nocookie.yourwiki.net? [08:22:34] Pretty sure [08:23:28] 03(FIXED) Large images do not display in ProofreadPage extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18322 (10thomasV1) [08:24:53] Nikerabbit: I just made /Foo:bar , saved with "foo bar baz" as content. THen i added $wgExtraNamespaces[] = "Foo" and $wgContentNamespaces[] = "Foo", then reloaded, and /Foo:bar is gone [08:25:16] *Splarka pokles http://translatewiki.net/wiki/MediaWiki:Sectionlink/km at Nikerabbit [08:25:24] rook2pawn: sure, because not it is parsed (foo, bar) [08:25:30] and the old page is (0, foo:bar) [08:25:33] now* [08:25:38] where do all these typos come today [08:27:08] What url can i type to find my old /Foo:bar [08:27:53] rook2pawn: is it [[/Foo:bar]] or [[Foo:bar]] ? 
[08:28:42] rook2pawn: you can't [08:29:14] maintenance/cleanupTitles.php or namespaceDupes.php or something [08:29:35] or if you know the pageID you can delete it in the API [08:29:47] Or move it [08:30:16] deleting is more fun [08:30:17] Ah, thank you guys :p its like a wellspring of wiki knowledge [08:30:18] Or you can just remove the namespace, move it, and then re-add the namespace [08:30:36] don't forget to delete or suppress the redirect [08:31:07] Oh wow, i removed the namespace and the old page came back as well [08:31:41] this of course is the fault of having the pagename/namespace delimiter a legal title character [08:33:05] (as well as interwiki/interlang/parserfunction, and magical escape for categories/files, and mainspace transclusion) [08:33:18] 03ialex * r52154 10/trunk/extensions/PageCSS/PageCSS.php: Added URL to extension's credits [08:33:32] phew, mw's colon is way overused (insert gay porn star joke) [08:37:07] oh it was holy big but uncle sam saw it coming [08:47:04] 03catrope * r52155 10/trunk/extensions/UsabilityInitiative/PrefStats/ (PrefStats.sql SpecialPrefStats.php): PrefStats: Add ps_pref field to ps_duration_start index for better indexing and corrected query [08:47:39] *Emufarmers kicks chairtwo [08:47:39] no u [08:48:08] o: [08:48:16] abbussee [08:52:30] 03nikerabbit * r52156 10/trunk/phase3/ (3 files in 2 dirs): * (bug 19286) Correct commafying function in Polish (pl) [08:53:31] 03(mod) Correct commafying function in Polish - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19286 (10niklas.laxstrom) [08:54:26] that was (FIXED) goddamit [08:55:31] *u kicks wikibugs [08:56:18] Nikerabbit: I think adding a CC was more imporant than changing the status ^^ [08:57:12] any bugzilla admins? why don't we have a component for math? [09:00:32] texvc? [09:00:56] in extensions? 
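For reference, the custom-namespace registration rook2pawn was attempting usually looks like this in LocalSettings.php (a sketch for a 1.15-era MediaWiki; the NS_FOO constant and the ID 100 are illustrative, not from the log):

```php
// LocalSettings.php -- register a custom "Foo" namespace.
// Custom namespace IDs conventionally start at 100; even numbers are
// subject namespaces, odd numbers their talk namespaces.
define( 'NS_FOO', 100 );
define( 'NS_FOO_TALK', 101 );
$wgExtraNamespaces[NS_FOO] = 'Foo';
$wgExtraNamespaces[NS_FOO_TALK] = 'Foo_talk';
// Pages saved as "Foo:bar" before this existed remain stored as
// (NS_MAIN, "Foo:bar") rows -- exactly the clobbering described above.
// Run maintenance/namespaceDupes.php afterwards to move them over.
```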
[09:01:33] hmm: https://bugzilla.wikimedia.org/buglist.cgi?query_format=advanced&component=texvc&&bug_status=UNCONFIRMED&bug_status=NEW&bug_status=ASSIGNED&bug_status=REOPENED&bug_status=RESOLVED&bug_status=VERIFIED&bug_status=CLOSED [09:01:35] 03(mod) Display of lower case rho as ϱ rather than ρ on Safari Browser - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11663 (10N/A) [09:01:51] probably under the wrong product then eh? [09:03:02] or maybe easier to make math an extension! [09:03:08] http://www.mediawiki.org/wiki/List_of_extensions_to_be_merged_to_the_core#Things_to_split_out_from_core_into_extensions [09:07:41] 03(NEW) Mechanism to find usages of raw-html messages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19291 normal; Normal; MediaWiki: Page rendering; (niklas.laxstrom) [09:07:44] 03(mod) XHTML compliance (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=209 (10N/A) [09:09:33] 03(mod) Many MediaWiki: messages not safe in XHTML - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=212 (10niklas.laxstrom) [09:10:00] Splarka: are those extensions listed there being merged with the core proper or only their functionality included? 
[09:12:41] ugga [09:12:43] avar: I think it depends how much cleanup is required [09:14:42] what he said [09:15:45] It would be unfortunate to smear some of these on the code proper as opposed to making a directory called default-extensions or something and auto-loading everything there my default [09:15:50] but meh [09:16:16] Well, Tim was saying that things shouldn't be merged [09:16:20] and that they should just be bundled [09:16:31] which I suppose people have been saying for the last 5 years [09:16:48] well, the new user log [09:16:52] that definitely makes sense in core [09:17:12] and probably TitleKey [09:17:14] anyway, I was going to start working on bundling (should be simple enough that even I can do it), but he was redoing the installer or the updater or something and then I forgot about it [09:18:39] http://www.mediawiki.org/wiki/Extension:ExtensionInstaller helps that somebody's actually already written it [09:18:43] plus there's kim's thing... [09:19:01] Emufarmers: now should that be merged into core? heh heh [09:19:35] Splarka: yes, or a reimplementation of it [09:20:26] 03nikerabbit * r52157 10/trunk/extensions/CentralAuth/CentralAuth.i18n.php: Convert special page names to content language, per bug 17619 [09:21:18] 03(mod) No localize link by fullurl/localurl - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=17619 (10niklas.laxstrom) [09:21:54] hi all. I am currently in the process of creating a local Wikipedia database, but am unsure about import performance and required disk space [09:21:58] *Emufarmers wonders when/whether http://svn.wikimedia.org/viewvc/mediawiki/branches/new-installer/phase3/ will be merged [09:22:15] do you recommend to use InnoDB or turn it off in favour of MyISAM? [09:22:42] articles on the net seem to suggest to turn off InnoDB and just go with MyISAM [09:22:53] babilen: enwp? 
[09:23:19] Emufarmers: eventually yes, right now I want to import dewiki and zhwiki [09:23:28] Well, big difference in terms of size :) [09:24:05] Nikerabbit: yes [09:24:19] I want to use it for NLP and am absolutely not interested in any concurrent inserts and all that as i will just read data from the DB [09:24:31] Emufarmers: with InnoDB taking much more space? [09:24:40] NLP? [09:24:44] *werdna waves [09:24:54] And I meant a big difference between enwiki and the others [09:24:55] Emufarmers: natural language processing [09:25:08] Ah [09:25:30] Emufarmers: yes, i know that. Am just unsure about the best MySQL configuration for my needs [09:26:01] Well, InnoDB is recommended for serious traditional use, but if you're using it for something else MyISAM is probably good [09:26:12] especially since InnoDB has much slower imports, as I recall [09:26:33] though I don't know whether that applies to XML dumps [09:26:59] when importing into innodb, disable the binlog [09:27:01] Can I ask for help here? [09:27:01] that helps [09:27:08] BlueMaxima: you can try :) [09:27:22] *BlueMaxima Duesentrieb Sarcasm? [09:27:35] Spelled his name wrong. [09:27:43] !ask [09:27:43] --mwbot-- Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. See also: !gethelp [09:27:51] Duesentrieb: i already disabled that as i really don't need that. Question is: Do i gain anything by using InnoDB or will it just have a performance impact? [09:28:45] hehe [09:28:46] babilen: InnoDB is much faster for concurrent access. [09:28:52] babilen: ask domas :=) [09:29:08] I have a file from a Wikia (pages_current.xml) which I would like to be readable, because I open in Wordpad, and it's unreadable, because of all the XML surrounding the info. Any suggestions? [09:29:19] domas: There's a live hack in ApiQueryCategoryMembers.php which disables part of the functionality because of a missing index. 
Tim added that index recently, so that live hack should be safe to remove now, can you check that? [09:29:53] got sidetracked [09:30:11] domas: Any advice on InnoDB vs MyISAM for Wikipedia DB server only used for read access by less than 10 people? [09:30:35] myisam, probably [09:30:59] roan: don't see livehack [09:31:29] domas: probably? [09:31:29] domas: Weird; can you do svn status on the API directory to see if it might be in another file? [09:32:59] nope [09:33:01] clean [09:33:13] Hm [09:33:23] Must've been removed already then; thanks anyway [09:34:50] BlueMaxima: wiki XML files are designed for importing [09:35:45] Um...did anyone get my question? [09:36:18] Emufarmers did ^_^ [09:36:26] *Emufarmers cuddles Splarka [09:36:38] also: you know it is the raw wikicode in the dump, right? [09:36:43] eeeee, you smell like emu [09:37:15] Nothing gets the ladies excited like that emu dung smell [09:39:38] hi Splarkaaaa [09:39:40] 03(mod) move nds to nds-de, newly create nds locale - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=17592 (10niklas.laxstrom) [09:39:44] babilen: if you're going just to read articles, then myisam :) [09:40:10] domas: ok! thanks for that clarification. [09:51:53] 03thomasv * r52158 10/trunk/phase3/includes/DjVuImage.php: encode string to utf8 before converting to xml [09:54:22] hi [09:54:34] morning [09:54:48] hi LauraHale [09:54:58] is it possible to request a valid e-mail address for user registration (i.e. the user has to confirm account creation after receiving an e-mail) ? 
[09:55:17] steko: I think the confirmaccount extension might have that feature [09:55:19] It is possible to require e-mail confirm before a user can edit [09:55:20] otherwise I don't know [09:55:31] That's just a $wgSomething setting [09:55:59] RoanKattouw, I have already enabled that option, nevertheless I found annoying to have many spam users registered [09:56:53] werdna, at a first look confirmaccount puts all the burden on the sysop, forcing him to approve / deny account creation requests.. am I wrong ? [09:57:36] steko: I thought that was so, but maybe it has the option, or maybe it can be hacked in. [09:58:19] could do something silly like http://www.mediawiki.org/wiki/Extension:AuthBugzilla [09:59:06] (maybe not that specific one, but: use a separate auth system that requires a confirmed email) [10:00:32] we're also in the process of setting up an LDAP server, maybe we could wait for that [10:02:10] Is it possible to store all language versions of WP into one DB, or do I need one DB *per* language? [10:02:24] You can use table prefixes [10:02:52] But still, each language is gonna have its own page table, its own revision table, etc. [10:03:28] Guess i won't gain much by that except more work for the import then [10:05:03] If you're importing from an SQL file, yes [10:05:20] ok [10:06:01] I create the SQL files with mwdumper.. that tools also creates a page.sql file, but I could also download it from WP. Which one should I use? [10:08:26] why do you want them all in one DB? 
[10:10:00] werdna: I haven't thought about reasons in favour of that scheme yet, just wanted to check if it would be passible at all [10:16:33] ah [10:16:38] well table prefixes mean it is [10:17:17] Damn, our profile sure is hungry for memory [10:17:19] PHP Fatal error: Allowed memory size of 524288000 bytes exhausted (tried to allocate 1296 bytes) in C:\\Program Files\\Apache Software Foundation\\Apache2.2\\htdocs\\t\\includes\\Profiler.php on line 28, [10:17:55] *profiler [10:20:24] *werdna dies of hideous-code-itis. [10:20:26] RoanKattouw: usual DB profiler? [10:20:28] RoanKattouw: get a real platform ;) [10:20:34] hehe [10:20:51] Unfortunately my Linux box doesn't have 2 GB of virtual memory [10:21:02] Maybe you should run the profiler and find out where the bottleneck is >_> [10:21:04] werdna: are you install-Linuxing right now? [10:21:04] uh [10:21:08] why not? [10:21:21] linux can not be installed on 16 bit machines :P [10:21:41] RoanKattouw: so install it on Windows box? [10:21:43] What I mean is that RAM + swap < 2 GB [10:21:55] oh :) [10:22:34] vvv: I'm on my macbook :) [10:22:37] I could increase the swap of course but it only has a few hundred megs of RAM, so not much use [10:22:51] time to buy some more ram [10:23:55] dude, a 1G stick of RAM is like $50 [10:24:01] $50? [10:24:08] I think that's expensive ;) [10:24:21] *werdna googles 1 GB RAM [10:24:26] MART 1 GB Memory - SO DIMM 200-pin - 533 MHz ( PC2-4200 )from $13 - 23 stores [10:24:32] depends on what it's for of course [10:24:43] A very old box :) [10:24:56] werdna: so, you consider mac to be a real platform [10:24:59] so it probably accepts standard RAM chips. [10:25:06] vvv: certainly :) [10:27:42] werdna: old standard ram chips... then it gets more expensive again [10:35:46] werdna: Not an imaginary platform? [10:35:54] avar: hmm? 
[10:36:16] A play on vvv's question:) [10:38:21] avar: ah, right [10:47:22] hi, i'm doing [[Image:name.png|link=http://www.example.com|caption]] but it keeps linking to the image page, do i need to change something in localsettings or whatever? [10:47:46] Anarhist: what version of MediaWiki? [10:48:13] 1.13.3 [10:48:19] You need to upgrade [10:48:21] 1.14+ feature [10:48:24] i don't think you can use external links [10:48:27] as well [10:48:38] p858snake: pretty sure you can o: [10:48:41] http://www.mediawiki.org/wiki/Manual:Linked_images [10:48:43] alternatives [10:48:49] well maybe you can [10:49:02] i'll upgrade i guess [10:49:30] p858snake: http://distributedresearch.net/blog/2008/01/21/how-to-make-image-links-in-mediawiki claims to have discovered it before we wrote that manual (with external image syntax), heh [10:49:46] hello guys [10:50:43] 03daniel * r52159 10/trunk/WikiWord/ (4 files in 4 dirs): use database hints [10:52:39] Gee, a feature documented a year before it was implemented? Usually we have features implemented a year before they're documented. :D [10:53:10] emu: hmm? [10:53:19] Emufarmers: no a diff way to do it [10:53:34] Splarka: doesn't really work, I know; I just wanted to make that joke v: [10:53:45] wat [10:54:06] anyway, I first discovered the external image + external link trick on uncyc: http://uncyclopedia.wikia.com/index.php?title=User_talk:Insertwackynamehere&oldid=175829&useskin=myskin#Better_never_than_late.21_or..._something [10:54:30] "Why dont the mediawiki people have something simple built in, in the first place?" 
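For the record, the two approaches discussed above side by side (a sketch; all URLs and file names here are placeholders):

```
<!-- 1.14+ only: the link= parameter on a local image -->
[[Image:Name.png|link=http://www.example.com|Caption text]]

<!-- Pre-1.14 workaround from the linked manual page: an external image
     inside an external link (requires $wgAllowExternalImages = true;) -->
[http://www.example.com http://example.org/images/name.png]
```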
[10:55:46] 03(mod) Add 'autoeditor' group, remove 'autopromote', 'patroller' and 'autopatroller' groups on Russian Wikisource - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18591 (10irmvol) [11:00:13] 03(mod) Add 'autoeditor' group and remove 'autopromote' on Ukrainian Wiktionary - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18590 (10irmvol) [11:02:26] 03daniel * r52160 10/trunk/WikiWord/WikiWordBuilder/src/main/java/de/brightbyte/wikiword/builder/AlterDataset.java: interface fix [11:02:30] 04(REOPENED) Add 'autoeditor' group and remove 'autopromote' on Russian Wikiquote - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18594 +comment (10irmvol) [11:05:21] 03(mod) set $wgCategoryPrefixedDefaultSortkey=false for en.wiki - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16552 (10happy-melon) [11:13:30] ugga [11:20:10] <`butane> any plans to add a formal hooking mechanism (and hookpoints) to medawiki? [11:20:45] <`butane> I want to write an extension but its looking like my extension will be more of a patch to the src than an extension [11:21:23] we already have hook points. [11:21:27] and a formal hooking mechanism [11:21:28] !hook [11:21:28] --mwbot-- http://www.mediawiki.org/wiki/Manual:Hooks/`e1 [11:21:32] !hooks [11:21:32] --mwbot-- Hooks allow you to run functions at various points in the main MediaWiki code. For more information about using hooks in MediaWiki, see . 
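A minimal sketch of the hook mechanism mwbot points at, using the hook point quoted later in this discussion (MyExtension and onEditButtons are placeholder names, not a real extension):

```php
// In the extension's setup file: attach a handler to an existing hook point.
$wgHooks['EditPageBeforeEditButtons'][] = 'MyExtension::onEditButtons';

class MyExtension {
	// Invoked by wfRunHooks( 'EditPageBeforeEditButtons', ... ) in EditPage.
	public static function onEditButtons( &$editPage, &$buttons, &$tabindex ) {
		// Modify $buttons here; return true so other handlers still run.
		return true;
	}
}
```

So an extension only needs a patch when no hook point exists where it wants to act; otherwise registration like this is the whole mechanism.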
[11:22:36] ugga [11:22:40] <`butane> wfRunHooks( 'EditPageBeforeEditButtons', array( &$this, &$buttons, &$tabindex ) ); [11:22:42] <`butane> ah yes you do [11:22:54] <`butane> I was looking at the src code to the spellchecker extension [11:22:57] <`butane> it must be pretty old then [11:27:31] 03(mod) Support OpenID extension on all wikimedia projects - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9604 +comment (10alex.emsenhuber) [11:27:36] 03(mod) Implement memcached store for OpenID extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=17637 (10alex.emsenhuber) [11:29:07] 2questions. 1st: have there been some project to create ready-to-import dumps with templates, CSS and JS ? [11:29:37] 2nd: if none, how should I manage to initiate this project? [11:31:56] dumps of what? [11:32:35] dumps of basic-useful-content to import into MediaWiki after it has been configure [11:33:11] (I have corporate-use, industrialization process in mind) [11:34:57] and what is useful content? [11:35:41] gadgets javascripts; basic-template such as {{!}} and its associated CSS files [11:36:29] (to get some nice tables (sortable, etc.) ; etc.) [11:36:52] {{!}} does not work anymore [11:37:00] sortable is in the core, as far as I know [11:37:19] gadgets require an extension.. and many times I've requested somebody to set up a database of those [11:38:00] Ok Nikerabbit I would like your request to be done (I will help of course!) [11:38:29] [3rd: is there someone who would like to help me doing this? If so I will try to convince my company to contract with you] [11:39:47] Am I fully wrong? Isn't there a basic need of ready-to-import dumps? 
[11:41:03] dunno [11:41:16] in my opinion it should just be easier to install stufff that is wanted [11:42:06] Yeah I agree but I would like to build an industrialization process [11:42:50] of course there will always remain some afterward-configuration to get the wanted stuff [11:44:05] view also http://www.mediawiki.org/wiki/Category:Corporate-friendly_solution [11:49:36] 03(mod) Unicode combining characters are difficult to edit in some browsers - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697 (10niklas.laxstrom) [11:50:50] ok I 've to leave. [[user:al Maghi]] to contact me. [11:51:10] 04(REOPENED) Enable StringFunctions on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 summary; +comment (10happy-melon) [11:52:36] ooh *gets popcorn* [11:53:12] 03(mod) Must differentiate 'specialpage(s)' in all languages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18403 (10niklas.laxstrom) [11:53:44] 03(NEW) Copy updated StringFunctions code from ParserFunctions - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19292 enhancement; Normal; MediaWiki extensions: General/Unknown; (happy-melon) [11:54:12] 03(mod) Update StringFunctions with improved code from ParserFunctions - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19292 summary (10happy-melon) [11:56:43] > I'm with you all the way that the hack is just that, a horrible hack, but the fact that it's seeing widespread use is indicative of the need for this functionality to improve editing and viewing experience. 
[11:57:19] *nom nom* [11:58:07] 03(mod) Allow html tags inside page names in Special:SpecialPages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15336 (10niklas.laxstrom) [12:13:24] 03(mod) LanguageConverter should always use interface language for conversion - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14823 (10N/A) [12:19:17] 03(mod) Special pages: Allow using templates / adding interwiki links - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13489 (10niklas.laxstrom) [12:20:55] 03(mod) uselang=en translation string vs. zh-tw [LanguageConverter?] - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=17712 (10N/A) [12:28:00] 03(mod) Incorrectly truncated multibyte UTF-8 char - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12444 (10niklas.laxstrom) [12:35:15] I want to include Special:Statistics on an article page. I made it includeable in Local Settings (works for others), but {{Special:Statistics}} returns nothing (instead of link only) [12:35:28] 03(mod) Specify default language on a per-page basis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9360 (10niklas.laxstrom) [12:35:52] Subfader: you can't include Special:Statistics [12:36:26] generally not yes, but i made it includeable as it works for others [12:36:40] what did you do? [12:37:10] used a hook in localsettings http://dpaste.org/6ZiA/ [12:37:12] buh [12:37:18] ialex: can you help? 
[12:37:43] Nikerabbit: if less than 5 minutes, yes [12:37:54] https://bugzilla.wikimedia.org/show_bug.cgi?id=8732 [12:37:54] (will need to go soon) [12:41:33] 03(mod) LTR RTL keys on commons and other wikimedia projects - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8213 (10niklas.laxstrom) [12:42:16] Nikerabbit: sorry, really have to go [12:47:18] 03(mod) Enable translation for protection log message - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6367 (10niklas.laxstrom) [12:49:39] 03(NEW) Bad paging links in Special:Category - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19293 minor; Normal; MediaWiki: Categories; (niklas.laxstrom) [12:55:49] 03(mod) Simple multilingual wiki ability enhancement request - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6107 (10niklas.laxstrom) [13:00:28] 03(mod) Allow different directionality for user interface and wiki content - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6100 (10niklas.laxstrom) [13:03:28] 03gerardm * r52161 10/trunk/testing/util/environment/Environment_install.sh: Change from sh to bash [13:08:59] 03(mod) Facilitate bidirectional editing / readability via a trailing escape character (as "\") at the end of line - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=5121 (10niklas.laxstrom) [13:14:16] "Escape spaces in Erlang magic to avoid misidentifications, particularly PostScript files produced on Tuesdays" [13:14:33] That is actually not a joke, oddly. [13:15:11] Simetrical: I have no idea what to do with https://bugzilla.wikimedia.org/show_bug.cgi?id=3615 [13:15:59] there hasn't been any other bug reports for the esperanto auto conversion... should I assume it works good enough? [13:16:04] Hello gents. We are trying to migrate a wiki from point A to point B, but the import didn't work too well due to corruption and whatnot. Is there a way to undo an import? 
[13:16:21] badp: I do hope you have backups [13:16:25] Nikerabbit, why can't the Esperanto speakers use Unicode like EVERYONE ELSE IN THE UNIVERSE? [13:16:39] Not for the new empty wiki, Nikerabbit [13:16:52] badp: then re-install it form scratch? [13:16:54] The old wiki is still up live and running [13:17:13] Simetrical: no idea, but I haven't seen any other bugs... so I guess they are ok with it [13:17:23] eh, I guess it's just easier that way =/ [13:17:24] Or there are too few of them to bother with. :) [13:18:21] Simetrical: I'd like to mark it invalid and request opening new specific bugs if any [13:18:31] Seems reasonable. [13:51:43] 03(mod) Integrate gadget into standard pref sections - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=18113 (10subfader) [14:12:37] 03(NEW) Sp-contributions-footer-anon on IPs without edits - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19294 trivial; Normal; MediaWiki: Special pages; (AlexSm) [14:17:12] 03daniel * r52162 10/trunk/WikiWord/ (14 files in 4 dirs): sql syntax abstraction [14:19:52] 03(NEW) Navigation headings should not be lower-cased in German (and other languages) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19295 normal; Normal; MediaWiki: Modern skin; (raimond.spekking) [14:26:56] 03(NEW) Days to show in watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19296 enhancement; Normal; MediaWiki: User preferences; (koehler-liesel73) [14:27:31] 03(NEW) DNS problem or Proxy configuration missing - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19297 major; Normal; Wikimedia: DNS; (utwk) [14:33:10] 03(mod) Days to show in watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19296 15enhancement->normal (10raimond.spekking) [14:42:56] 03raymond * r52163 10/trunk/phase3/includes/Article.php: Add heading of the deletion log on the 'delete' form for consistency with 'move' and 'protect' forms [14:54:07] "Call to undefined method ParserOptions::getIsPrintable()" in 
.../includes/parser/Parser.php -- upgrading 1.14alpha using SVN update; [14:56:04] Updated to revision 52163 [15:01:44] 03yaron * r52164 10/trunk/extensions/ExternalData/ED_Utils.php: [15:01:44] Made http_number_of_tries a static variable, added David Macdonald's additions [15:01:44] for DB and LDAP handling [15:05:25] 03yaron * r52165 10/trunk/extensions/ExternalData/ (ED_ParserFunctions.php ExternalData.php README): Version 0.7 - David Macdonald's additions for accessing database and LDAP data [15:05:25] I'm working on a tag extension and I want the input to be a template, such as {{Main page photos}}. How do i retrieve the data from the template for the tag? [15:05:50] Because right now it is only just returning the text {{Main Page Photos}} [15:14:15] Hello, is there a kind of offline modus @ media Wiki? [15:14:54] so that admins see the whole page but visitors and so get a message that the wiki is in the moment not rechable? [15:16:02] TC-nbigelow: By data you mean the expanded template? [15:16:18] TC-nbigelow: Try doing {{#tag:mytagname|{{Main page photos}}}} [15:28:25] Is this the best place for mediawiki upgrade questions? [15:29:14] <^demon> !ask [15:29:14] --mwbot-- Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. See also: !gethelp [15:29:50] !gethelp [15:29:50] --mwbot-- Please read http://workaround.org/getting-help-on-irc for a good explanation of getting help on IRC. [15:30:11] <^demon> dont waste your time reading that. [15:37:48] 14(DUP) Unnecessary
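On the tag-extension question above: tag hooks receive their input unexpanded, which is why the literal {{Main Page Photos}} comes back. A sketch of the usual fix inside the hook callback, alongside RoanKattouw's #tag suggestion (mytag, wfMyTagInit and renderMyTag are placeholder names):

```php
// Register the tag via the ParserFirstCallInit hook (MediaWiki 1.12+).
$wgHooks['ParserFirstCallInit'][] = 'wfMyTagInit';

function wfMyTagInit( $parser ) {
	$parser->setHook( 'mytag', 'renderMyTag' );
	return true;
}

function renderMyTag( $input, array $args, $parser ) {
	// Expand templates and other wikitext in the tag body explicitly;
	// MediaWiki does not pre-expand input passed to tag extensions.
	return $parser->recursiveTagParse( $input );
}
```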
paragraph tags within extension output - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1772 +comment (10tstarling) [15:37:54] 03(mod) doBlockLevels inserts pre-tags in a text created by an extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1319 +comment (10tstarling) [15:38:07] 14(DUP) Parser hook output block level corruption - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8997 +comment (10tstarling) [15:38:10] 03(mod) doBlockLevels inserts pre-tags in a text created by an extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1319 +comment (10tstarling) [15:57:34] hey brion [15:57:38] 03daniel * r52166 10/trunk/WikiWord/ (6 files in 2 dirs): fiddeling with hsql and dbunit [16:13:02] whoa [16:13:04] wtf of the day: [16:13:17] if ( $this->request->wasPosted() && $this->methodApplies( 'mark_as_unread' ) ) { [16:13:20] [...] [16:13:24] } elseif ( $this->request->wasPosted() && $this->methodApplies( 'mark_as_read' ) ) { [16:13:29] is there a way i can configure bugzilla for my local mediawiki install? [16:13:36] is there an extension for it? [16:13:51] i have googled but no answer. [16:14:07] am I missing something in that code excerpt? [16:14:19] 03jdpond * r52167 10/trunk/extensions/DiscussionThreading/DiscussionThreading.php: Was not loading localization information correctly for versions of MediaWiki < 1.11 [16:15:14] What is DiscussionThreading? Different from LiquidThreads? [16:17:17] oh, unread v. read [16:28:32] 03thomasv * r52168 10/trunk/extensions/ProofreadPage/ (ProofreadPage.php proofread_index.js): fill header and footer fields with default values [16:53:02] Why does MediaWiki render ' as "'" and not as "'"? [16:54:39] 03(WONTFIX) Enable StringFunctions on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 +comment (10Simetrical+wikibugs) [16:59:01] i just updated to 1.15.0, i do not see any rivers on milk and honey? where are they? 
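The mark_as_unread / mark_as_read excerpt flagged above ([16:13:17]) repeats the wasPosted() check in both branches, which is the "wtf" being pointed out. A minimal sketch of the usual cleanup, hoisting the shared guard, written in Python with hypothetical stand-in names (Request and the string dispatch are illustrative, not MediaWiki's actual API):

```python
class Request:
    """Hypothetical stand-in for the request object in the excerpt."""

    def __init__(self, posted):
        self._posted = posted

    def was_posted(self):
        return self._posted


def handle(request, method):
    # Hoist the shared wasPosted() guard so each branch states only
    # what actually differs, instead of re-testing the same condition.
    if not request.was_posted():
        return None
    if method == "mark_as_unread":
        return "unread"
    if method == "mark_as_read":
        return "read"
    return None
```

With this shape, handle(Request(True), "mark_as_read") returns "read", while any non-POST request short-circuits to None before the dispatch is even consulted.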
[16:59:20] qbwdp: They have been delayed for 1.16.x [16:59:25] oh damn [16:59:35] ok then, thank you [16:59:53] qbwdp: No problem. There were some issues with the river flooding over, and some overflow issues. [17:00:00] Too much milk and honey all over. [17:00:04] A huge mess. [17:00:32] ah yes, the old seepage and leakage bug [17:00:44] Indeed. [17:12:46] 03(mod) E-mail notifications (enotifs) are not sent in the recipient's user interface language - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2592 +comment (10siebrand) [17:14:22] how do i use a parser function such as {#random:{{template}} ) to expand the transclusion for processing [17:24:25] 03sergeychernyshev * r52169 10/trunk/extensions/Widgets/Widgets.php: Release 0.8.7 - namespaces issues fixed by ialex. [17:24:58] 03sergeychernyshev * r52170 10/tags/extensions/Widgets/REL_0_8_7/ (. Widgets.php Widgets.php): Release 0.8.7 - namespaces issues fixed by ialex. [17:27:35] 03sergeychernyshev * r52171 10/trunk/extensions/Widgets/Widgets.php: All subsequent changes go under 0.8.8-dev version to avoid confusion. [17:33:52] Simetrical, [[bugzilla:6455#80]]: citation needed! [17:34:58] enhydra: why do you need it? [17:36:12] vvv, Because [17:36:14] enhydra, k [17:36:15] 03(mod) Enable StringFunctions on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10Simetrical+wikibugs) [18:35:21] 03(NEW) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 enhancement; Normal; Wikimedia: Site requests; (gtisza) [18:36:23] 03(mod) Enable StringFunctions on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10gtisza) [18:52:50] what's up [18:53:02] there is an lua extension? [18:53:10] not much. [18:53:14] No clue.
[18:53:43] yes indeed there is [18:54:13] weird, I haven't noticed it [18:54:31] it's even supported on twn [18:55:11] werdna: oh I was blind [18:55:24] werdna: I expected the restore clause to be after the block [18:55:45] Nikerabbit: :) [18:56:31] if ($xversion[0] <= "1" && $xversion[1] <= "11") { [18:56:39] 03(mod) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 +comment (10Simetrical+wikibugs) [18:56:43] I wonder if that does even work [18:58:55] 03(mod) E-mail notifications (enotifs) are not sent in the recipient's user interface language - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2592 (10niklas.laxstrom) [18:59:15] 03(mod) E-mail notifications (enotifs) are not sent in the recipient's user interface language - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2592 (10N/A) [18:59:25] 03(mod) Display changed content-language messages in preference to default interface-language messages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8188 (10N/A) [18:59:27] 03tparscal * r52172 10/trunk/phase3/skins/vector/main.css: Reduced padding above first heading [19:08:00] ugga [19:08:16] hi. I am a newbie. after installation of mediawiki i get the main page. can i just clear off all the links in it and create my own or should i move the file to some other name and clear the redirect and start editing the main page? [19:09:24] nascentmind: you can replace the text in it with anything you want [19:10:04] Nikerabbit: but i want those links so that it's handy to check things out.. so can i just move the page and remove the redirect to it? [19:11:31] werdna: can you assign this to yourself if necessary: https://bugzilla.wikimedia.org/show_bug.cgi?id=19296 ?
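The $xversion comparison quoted above ([18:56:31]) is the kind of code the "I wonder if that does even work" comment is about: it compares version components as strings, and whether that behaves depends on the language's coercion rules (PHP happens to compare two numeric strings numerically, so it may work by accident; in languages with plain lexicographic string comparison it breaks once a component reaches two digits). A hedged Python sketch of the pitfall and the robust alternative, parsing components into integer tuples (parse_version is an illustrative helper, not MediaWiki code):

```python
def parse_version(version):
    """Split a dotted version string into a tuple of ints for safe comparison."""
    return tuple(int(part) for part in version.split("."))

# Lexicographic string comparison gets multi-digit components wrong:
# character by character, "9" sorts after "11".
assert ("1.9" < "1.11") is False              # string compare: wrong answer
assert parse_version("1.9") < parse_version("1.11")  # int tuples: correct
```

Comparing tuples of ints also makes the compound "major <= 1 && minor <= 11" test unnecessary: parse_version("1.11") <= (1, 11) is a single, correct check.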
[19:11:40] nascentmind: you can do that too [19:12:31] 03(mod) E-mail notifications (enotifs) are not sent in the recipient's user interface language - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2592 (10mail) [19:13:51] Nikerabbit: thanks [19:15:18] 03(mod) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 (10gtisza) [19:18:23] 03(mod) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 (10Simetrical+wikibugs) [19:20:26] 03(mod) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 +comment (10agarrett) [19:20:47] 03tparscal * r52173 10/trunk/phase3/skins/Vector.php: Fixed printing bugs [19:21:36] Hello! Does anyone know why the limit in Special:RecentChanges has dropped from 5000 to 500 again? It's almost impossible to patrol recent changes with these settings (see, for example, https://bugzilla.wikimedia.org/show_bug.cgi?id=18197) [19:22:49] NickK: again? [19:24:27] it was dropped because long lists are slow [19:24:28] it became 500 about three months ago (here: http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=48735), but about a day or two later it came back to 5000 [19:26:00] well, but there is no other way to cope with vandals, for example in Ukrainian Wikipedia we have 18 out of 19 admins in the same time zone, so it's difficult to control vandalism at night [19:26:07] shouldn't, at least not in core [19:26:24] dunno if somebody override it in some wiki [19:27:00] abuse filter is there to help cope with vandals [19:27:11] no, it works so in all wikis [19:28:19] 03(mod) Enable Lua extension on WMF wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=19298 (10Simetrical+wikibugs) [19:29:44] 03nikerabbit * r52174 10/trunk/phase3/ (RELEASE-NOTES includes/specials/SpecialContributions.php): * (bug 19294) Always show Sp-contributions-footer(-anon) [19:30:35] 03(mod) Sp-contributions-footer-anon on IPs without edits - 
10https://bugzilla.wikimedia.org/show_bug.cgi?id=19294 (10niklas.laxstrom) [19:31:07] 03siebrand * r52175 10/trunk/phase3/includes/Article.php: Follow-up to r51579: fix copy paste error (time->date) per CR [19:31:28] Reedy, yes, abuse filter is good against edits like www.example.com etc, but it can't help with changing facts or something like that [19:34:24] lolsplit [19:34:45] Simetrical: perhaps some PHP-based MediaWiki-adapted parser will be useful [19:34:56] vvv, in other words, AbuseFilter? [19:35:08] Simetrical: yep, modified and optimized [19:37:31] 03nikerabbit * r52176 10/trunk/extensions/Translate/ (MessageChecks.php check-blacklist.php): Balance check was not working due to: wrong group id, spaces versus underscore [19:37:54] Simetrical: I think we may introduce tag for functions, {{#invoke}} to use functions and