[00:00:46] autoconfirmed or emailconfirmed? [00:01:46] Skizzerz: emailconfirmed - use case is we want to be able to administratively annoint someone has emailconfirmed in the case that they didn't get the email (for whatever reason) [00:01:56] ok [00:02:17] any help is appreciated [00:02:45] *Skizzerz goes to check out his db [00:02:46] do I need to set a field in the user_options blob? [00:03:00] or, better yet [00:03:17] *Skizzerz goes to check out the php file of the page that confirms email addresses [00:03:27] haha - thanks Skizzerz [00:06:30] hmm, should be possible to do it entirely via PHP [00:07:09] coo [00:08:37] jimbojw: I could whip up something really quick that'll let you confirm it for other users if that was your intent [00:09:43] otherwise, the thing you need is $user->confirmEmail(), where $user is a valid user object [00:10:02] mew. [00:10:06] would it be impolite to reassign a bugzilla report to the author of the affected mediawiki extension? [00:10:19] Skizzerz: no worries - i can take care of it, i should have just looked for the confirmEmail() function [00:10:23] jimbojw: do you want to give them emailconfirmed so that they can get and send email? or for other purposes? [00:10:23] i'm in PHP [00:10:28] ok :) [00:10:47] other - we limit what people can do until they've had their email confirmed [00:10:57] you could possibly fake it then, with a user group [00:10:59] but sometimes we need to get somebody important acess quickly [00:11:08] $wgGroupPermissions['trusted']['emailconfirmed'] = true [00:11:15] and then set them to group +trusted [00:11:17] right [00:11:28] that's a good idea too Splarka [00:11:30] thanks [00:11:35] although that would just give them the emailconfirmed right, not the group [00:11:56] so if you have something else limited to emailconfirmed by user rights, you'd have to add that to trusted too, I think [00:12:08] (is annoying to have groups and rights of the same name, heh) [00:12:43] there is a precedent for this btw: $wgGroupPermissions['bot' ]['autoconfirmed'] = true; [00:12:55] bots are magically autoconfirmed, even if they aren't old enough or don't have enough edits [00:16:21] can someone tell me how to add items to the sidebars in MediaWiki ? someone else set up the Wiki, I never figured out how they added the sidebar items [00:16:29] !sidebar | Samus_Aran [00:16:29] --mwbot-- Samus_Aran : To edit the navigation menu on the left, edit [[MediaWiki:Sidebar]] using its special syntax. For more details, see . [00:16:45] Skizzerz: thank you kindly [00:16:48] :) [00:16:56] *Skizzerz pets mwbot [00:17:25] bah the ratings extension has no postgres support [00:26:45] Anyone know of an extension (or existing functionality) that will generate a wiki site index by page name? [00:29:54] like Special:Allpages but static? [00:32:28] No - just like that but grouped alphabetically. [00:34:00] isn't Allpages alphabetic? well except for the namespace prefixes [00:34:25] 03siebrand * r29410 10/trunk/extensions/FlaggedRevs/Language/ (FlaggedRevsPage.i18n.li.php MakeReviewer.i18n.li.php): Localisation updates. Add proper eol-style [00:34:25] 03thomasv * r29409 10/trunk/extensions/ProofreadPage/proofread.js: minor [00:34:51] I'd like to see them grouped, with a header letter - A then all the A's, etc., etc. [00:37:56] I can make do with what I have here, though. Take a look at http://olr.freemason.com/wiki/index.php?title=Contents - Why do the links take the to an edit page? How do I prevent that? 
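A rough sketch of the $user->confirmEmail() approach suggested above, written as a one-off command-line script; the username and file name are placeholders, and this assumes the script is saved in MediaWiki's maintenance directory:

    <?php
    // confirmUserEmail.php -- hypothetical one-off script: administratively
    // mark another user's e-mail address as confirmed.
    require_once dirname( __FILE__ ) . '/commandLine.inc'; // maintenance bootstrap

    $user = User::newFromName( 'SomeUser' ); // placeholder username
    if ( $user && $user->getId() ) {
        $user->confirmEmail();   // set the e-mail-authenticated timestamp
        $user->saveSettings();   // write the change back to the user table
        echo "Confirmed e-mail for " . $user->getName() . "\n";
    } else {
        echo "No such user.\n";
    }

The group-based alternative mentioned right after ($wgGroupPermissions['trusted']['emailconfirmed'] = true; plus adding the user to the trusted group) avoids touching the user record, but as noted it only grants the right, not membership in the emailconfirmed group itself.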
[00:39:28] hi!do you guys know a cms like joomla/drupal that can integrate mediawiki?like unified log-ins? [00:40:09] Steev43230: the action=edit is because those linked pages don't exist [00:40:30] as far as the links table is concerned [00:40:42] So, what do I do? [00:40:47] create them ^_^ [00:40:55] or change the links [00:41:12] But they already exist - at http://olr.freemason.com/wiki/index.php?title=Special:Allpages for example [00:41:33] well, that isn't what you're linking to [00:41:38] [[Alphabetical List|Special:Allpages]] [00:41:40] maybe you mean [00:41:44] [[Special:Allpages|Alphabetical List]] [00:42:11] OIC. Crap on a crutch. "They" got these [[ and [ backwards from my point of view..... [00:42:17] heh [00:42:29] Wait a min! [00:42:30] DPL or DAL might have what you need, for an alphabetical list [00:42:32] I DO have them as [[Categorized List|Special:Categories]] [00:42:42] [[Link|Link text]] [00:42:44] That's an internal link [00:42:47] OIC [00:42:49] Never mind [00:43:36] [00:43:48] *Splarka tries something [00:47:04] Steev43230: looks like http://www.mediawiki.org/wiki/DynamicPageList will do what you want [00:47:20] although might be resource intensive [00:47:42] Thanks. I'll check that oot right now. I'm off to see the wizard /user leaves [00:56:12] hi guys!is there a cms like joomla/drupal that can integrate mediawiki?like unified log-ins if i already got mediawiki set up? [00:56:24] can anyone here bring me up to speed on the #ifexist issue? That is, what the current status is? And whether there is a solution for en:{{Archive}}/{{Archive box}}/{{archives}}, which really only use #ifexist to determine if archive subpages exist. [01:01:32] One possible option I'm considering for en:{{Archive list long}} (36 #ifexists) is the transclusion of Special:Prefixindex/pagename Any opinions/thoughts/advice? [01:04:25] Special:Prefixindex is nice and fast [01:04:29] if you can use it, then do so [01:09:35] hmm I see if Wikibooks and Wikiversity wants what its talked about wanting then SpecialSearch::goResult() would need some changing [01:14:57] Anyone have some suggestions on creating a portal/index splash page? [01:15:29] 03dale * r29411 10/trunk/extensions/MetavidWiki/README: updated basic documentation [01:17:39] look at other portal/index splash pages that you like? [01:18:40] TimStarling: *nod* Special:Prefixindex has a few negative side effects, but its a potentially good substitute for {{Archive list long}}. I also have a couple of ideas in mind for equally horrific {{Archive list}} (100 #ifexists), but need to know if the payoff of avoiding #ifexist is higher than having (what would effectively be) a very very long/complex template. [01:18:58] thus my questions above. [01:20:16] are these negative side effects fixable? [01:20:22] depending on the setup, you could make use of DPL, have all archives categorized and use DPL to get pages inside that category [01:20:40] we're not enabling DPL on en.wikipedia [01:20:49] that would be even worse than #ifexist [01:21:11] I said depend on the setup, I wasn't sure where this was for [01:22:19] TimStarling: Just prettiness issues. a) output of Prefix is alphabetic, so 'Archive 10' appears after 'Archive 1'. And, b) the output has the name of the page that subpages are being searched for, i.e 'basename/Archive NN' or whatever. [01:22:55] sounds like something could use dpl... 
[01:23:21] Tim already mentioned this was on WP, and DPL isn't going to happen there [01:23:50] I KNOW it's for wp [01:24:01] but I can't figure out why they're so against DPL >_> [01:24:08] so maybe we could have some kind of archive mode for Special:Prefixindex? [01:24:14] personally, I think "list of archives" is something that bots could just as well take care of. [01:24:15] Skizzerz, because it would cause the servers to melt. [01:24:25] yeah, there's that... [01:24:53] or maybe we could have a separate parser function that generates an archive list? [01:24:58] that'd be pretty easy [01:25:11] is every instance of a Special: transclusion a custom made hack? [01:25:29] {{Special:Prefixindex/pagename}} {{Special:Wantedpages/number}} {{Special:Recentchanges/number}} etc? [01:25:32] what do you mean? [01:25:34] as I see it, if the pages are being archived automatically by a bot, then the same bot should be updating the list of archives. [01:25:59] Mahlzahn: that would work too [01:26:08] and it would solve the cache problems [01:26:27] Special:Prefixindex isn't going to update at the right time, if the page isn't edited [01:26:55] true [01:27:13] How to change the default date-time format, e.g., 20:23, 7 January 2008 (EST) ? [01:28:15] #ifexist *does* update at the right time, at the expense of spamming the pagelinks table with invisible links [01:29:06] Tim: I mean, they take a somewhat intuitive parameter, and return results that seem unique... Prefixindex returns a , wantedpages returns a comma separated list (with penultimate 'and'), and recentchanges returns an
    block [01:29:52] is there a means to shove off the idea of 'invisible links' to view time? i.e. effectively make redlinks invisible (either non-links, or links that _look_ like plain text) [01:30:27] a.new {display:none;} ? [01:30:52] Splarka: it's up to the special page to decide what it wants to output [01:30:55] well, "display:none iff redlink" sort of thing. [01:31:10] *Splarka takes that as a yes, then [01:31:18] there's no reason they couldn't be standardised though, just like the special page itself is standardised [01:31:22] Mahlzahn: redlinks are class="new" [01:31:53] what I'm saying is, it's no more of a "custom hack" than the special page itself [01:31:57] so, for example, .archivebox a.new {display:none;} would hide redlinks in an object with class="archivebox" [01:32:04] heh, touche [01:32:24] and since you don't stab near as much as rob, here is another insane idea [01:32:47] Splarka: right, but I can't say [[a link|whatever]] can I? [01:33:16] what if you had an #ifeq or #switch mode that allowed parsing just the newline/xml stripped text? so you could do, {{#ifeqstrip:{{Special:Prefixindex}}||No subpages|{{Special:Prefixindex}}}} *grin* [01:33:26] Mahlzahn: that's valid [01:33:29] Mahlzahn: you can, but... [01:33:48] the underline setting in preferences will not operate as expected [01:34:16] you can hide red links, but you can't hide the separators [01:34:20] if a user chooses to see link underlines, the link underline will be red if you don't specify the inner span to be underlined [01:34:24] > How to change the default date-time format, e.g., 20:23, 7 January 2008 (EST) ? [01:34:38] For the entire site, not just he user [01:34:47] if a user chooses not to see underlines, and you do underline it, then they'll see that (ugly) [01:34:51] so unless your archive link list is separated by whitespace only, it doesn't work [01:34:56] better to use site-wide CSS [01:35:24] yes, I see the problem. [01:35:44] well, that css of course assumes a whitespace delimited list... [01:35:47] I think it is http://www.mediawiki.org/wiki/Manual:%24wgAmericanDates [01:36:12] Tim: you can do some limited in-tag magic [01:36:25] [[Foo|
    Bar]] [01:36:25] [[Foo|
    Bar]] [01:36:32] [01:37:05] that will have visible consequences in some browsers [01:37:09] a.new {display:none} that would hide the newline too, for example [01:37:14] heh [01:37:27] hello [01:37:31] *Splarka didn't write the parser, just likes abusing it [01:37:32] hi Nikerabbit [01:38:20] well, {{Archive list}} presently just emits "1, 2, 3, 4 ..." for #ifexisting "Archive 1" etc. If I make the "2, 3, 4" /appear/ as plain text, I doubt anyone would have reason to complain. [01:38:26] what if you had an #ifeq or #switch mode that allowed parsing just the newline/xml stripped text? so you could do, {{#ifeqstrip:{{Special:Prefixindex}}||No subpages|{{Special:Prefixindex}}}} *grin* [01:38:33] sure, it'd work [01:38:55] but every time you add a feature like that, you cause me about a day's extra work when I next try to refactor the parser [01:39:00] then you'd have a way to get the old behavior of #ifexist back (pre registered-link type, cacheable) [01:39:08] heh [01:39:19] You have too many bosses, please ignore me unless I say something sensible [01:39:32] I mean, it works now, we have this concept of stripping [01:39:34] but for how long? [01:39:49] and what happens when we want to get rid of it? [01:40:06] oh, does it? hmm, seems like it didn't when I tried it, but I didn't investigate too much [01:40:23] Mahlzahn: well, as plain text, that's easier [01:40:26] is there some other reason we need an unstrip concept? [01:42:13] doubtful ^_^ [01:42:18] you know, there isn't a strip pass anymore [01:42:29] it's been merged with template expansion [01:43:48] let's have interfaces that make common things easy, and that we can maintain [01:43:55] is #replace enabled on en. ? [01:43:59] no [01:44:08] Mahlzahn: well, you'd have to decide, if you'd want them to appear as plain text (black?), blue links, red links, or invisible I guess, and then apply for a tiny bit of site-wide CSS, probably one line. Note that it could be slightly open to abuse though as blue text, people could [[Missingpage]] [01:44:22] too many chances of a DoS, eh? [01:44:27] what would #replace do anyway? [01:44:36] can't think of any reason it would be useful here [01:45:07] just thinking out loud for stripping the output of Special:Prefindex [01:45:16] doesn't work [01:45:27] the output of Special:Prefixindex is a strip marker [01:45:36] k. [01:45:39] #replace can't do anything with it except skip it [01:45:48] actually, an unstrip concept... you could check for red links with #ifeq... {{#ifeq:Zzy|Red|Blue}}, of course #ifexist does this, but there may be other silly cases [01:46:13] er, {{#ifeq:[[Zzy]]|... even [01:46:29] well, that's different to unstripping [01:46:34] ahh k [01:46:40] so there's another reason why it's a bad idea [01:46:46] counterintuitive results [01:46:51] splarka say bad idea = ignore [01:47:51] [[Zzy]] is just [[Zzy]], it's main-pass syntax and is just those literal characters to the preprocessor [01:48:12] whereas {{Special:Prefixindex}} is expanded to a strip marker by the preprocessor [01:48:24] and that strip marker will later be replaced with the HTML output of the special page [01:49:33] Skizzerz: do you know how to make the sidebar headings Mixed Case ? it is forcing them to lowercase at the moment [01:49:57] the manual page only mentions lowercase links, not the headings [01:49:58] without hacking the code, no idea [01:50:03] that makes sense... is there any sort of "parse order for dummies" documentation other than the source? 
[01:50:13] :D [01:50:29] Samus_Aran: is css [01:50:45] Splarka: ah, I see [01:50:49] .portlet h5 { [01:50:52] well, there's my essay in RELEASE-NOTES [01:50:52] text-transform: lowercase; [01:50:57] but it's wrong now, needs to be updated [01:50:58] in monobook/main.css [01:51:10] is anyone currently making extension messages loading better? [01:51:10] just reverse it in MediaWiki:Monobook.css [01:51:11] documentation is a todo at the moment [01:51:51] haha [01:52:09] Nikerabbit: better than what? [01:52:13] Splarka: thank you kindly [01:52:55] better than just showing up all messages in at some random point? [01:53:29] showing up? [01:53:46] um, putting them into messagecache [01:54:20] Splarka: got it working the way I wanted, thanks again. it seemed kind of unprofessional to me, having the links in Title case and the headings in lowercase [01:54:56] well, obviously they can't be put in there on demand in MessageCache::get(), since you'd need a hashtable for that, and if you're creating a hashtable of all messages in memory, then you may as well load the message text as well [01:55:30] the extension can load them manually when they are needed, that is what $wgExtensionMessagesFiles is for [01:55:50] Samus: welcome to MediaWiki ^_^ [01:56:55] but we could load only needed languages (fallback language seems not to be respected currently) [01:57:13] fallback language for extensions? [01:57:58] well, we could merge only the needed languages [01:58:12] but I don't think having 250 files for each extension is the way to go [01:58:44] maybe each extension could have a single i18n file that contains all of its messages [01:59:01] that's what most currently do [01:59:10] I'm not finished yet [01:59:20] go on then [01:59:27] then the core message loader, via $wgExtensionMessagesFiles, could split the messages by language and cache them separately [01:59:55] merged across all extensions perhaps? [02:00:20] sounds reasonable [02:00:23] it'd probably make sense, sacrificing deferred loading [02:02:34] the hash table could be smaller though, many extension prefix all of their messages [02:04:42] are you saying that we could use that prefix to implement deferred loading? [02:05:05] maybe [02:05:18] how to find the extension which includes: class="taxobox" [02:05:22] for tables.. [02:05:27] mmm, it's an idea [02:05:48] I won't go so far as calling it good or bad just yet :) [02:05:53] i want to create such tables in my wiki: http://de.wikipedia.org/wiki/Hilfe:Tabellen#Geschachtelte_Tabellen.2C_Listen_in_Tabellen.2C_Bilder_in_Tabellen [02:06:25] is message loading a performance problem? [02:06:26] wouldn't the name of the extension be more reliable? unless start requiring extensions use a prefix for messages [02:07:14] TimStarling, domas complains about it. He got especially annoyed at my accesskeys change. [02:07:15] darkcode: that would be thousands of lines of code to migrate [02:07:24] Accesskeys/tooltips. 
[02:07:37] serialised hash table of extension supported by translate extensions is about 200 kilobytes [02:08:00] that's a big hashtable [02:08:15] TimStarling I mwan using the extension name as the key for the hash [02:08:16] but most people don't have that many extensions [02:08:49] it uses references so that the name of extension is stored only once [02:08:51] instead of the extension prefix [02:09:17] darkcode: the idea is to load the extension messages when wfMsg() is called [02:09:26] but wfMsg() doesn't know the extension name [02:09:47] so you'd have to migrate the extensions to call a version of wfMsg() where they can supply the extension name [02:10:13] and sometimes it's not even the extension that fetches the message [02:10:19] for instance, special page or group names [02:11:21] Where's that profiling data for the cluster again? [02:11:52] http://noc.wikimedia.org/cgi-bin/report.py [02:12:03] TimStarling well how would it know what the extension prefix is then? [02:12:21] darkcode: the extension could register it [02:13:48] so extensions already register there name don't they? eg $wgExtensionsCredit['parserhook'][] = array( 'name' => 'Foobar' ); [02:14:02] Well, 6% of cluster time is spent on wfMsgReal. 11-12% is on MessageCache::addMessages. [02:14:30] let me hit the clear button [02:14:31] I'm also a little worried about the memory usage :o [02:14:44] What's real/c, seconds per hundred calls? [02:15:09] milliseconds per call, wall clock time [02:15:21] What's the "/c" supposed to mean? [02:15:22] darkcode: that's not always the prefix of the messages. For example, the name part in the extension credits may contain spaces and other wiki code [02:15:31] per call [02:15:31] Ah. [02:16:40] 12% is quite a lot [02:16:50] While we're talking about this, is it a bug or a feature that recursive functions are counted many times over in the report? [02:16:50] hmm, can`t find taxobox downloads. any hints, where to get it? [02:16:57] Skizzerz I'm not saying it is. I'm suggesting that if prefixes aren't required for extension messages, that it could not be relied on to be there [02:17:13] ah [02:17:32] Or at least, I think they are, from looking at the code. [02:17:37] and that the extension name is more likely to be there [02:17:56] Simetrical: looks like a bug to me [02:18:25] TimStarling, which would imply that all this fussing about variable replacement taking 80% of CPU time was nonsense? [02:18:48] well, it wasn't my reason for fussing [02:18:51] Although still a problem for latency reasons in extreme cases, of course. [02:18:59] Which is probably a better reason anyway. [02:19:00] if someone was fussing because of that, then they were wrong, yes [02:20:05] Flightbase: it isn't an extension, it is a combination of templates, wikicode, and css [02:20:19] is there a setup howto? [02:20:36] all i found was usage - and the hint, that it is hard to setup ;) [02:20:38] hmm I just remembered http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/Title.php?r1=27774&r2=27934 [02:20:39] !infobox [02:20:39] --mwbot-- I don't know anything about "infobox". [02:20:45] bah *slaps mwbot* [02:20:48] !taxobox [02:20:48] --mwbot-- I don't know anything about "taxobox". 
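For context on the $wgExtensionMessagesFiles discussion above, a minimal sketch of how an extension of that era wires up a single i18n file holding all of its languages; the extension and message names here are hypothetical:

    <?php
    // MyExtension.php -- setup file registers the combined i18n file
    $wgExtensionMessagesFiles['MyExtension'] = dirname( __FILE__ ) . '/MyExtension.i18n.php';

    // MyExtension.i18n.php -- every language in one file, keyed by language code
    $messages = array();
    $messages['en'] = array(
        'myextension-desc' => 'An example extension',
    );
    $messages['de'] = array(
        'myextension-desc' => 'Eine Beispielerweiterung',
    );

The core loader can then pull one language out of that array on demand, which is the merge-and-cache-per-language trade-off being debated above.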
[02:20:57] I'm having second thoughts if that is good solution [02:22:03] Flightbase: http://en.wikipedia.org/wiki/Help:Infobox [02:22:15] I don't think page views of messages with language names are a serious problem [02:22:27] because you know the language, so you can load that cache entry [02:22:30] currently pages in mediawiki namespace show "randomly" as blue or red :/ [02:22:46] Splarka, thanks so much, again ;) [02:22:54] for links, yes, it could get onerous [02:23:08] but even then, loading the cache for a given language should be pretty quick [02:23:12] <10ms [02:23:19] basically you create a template (they didn't on de.wp example, but it is easier), with contents {| class="infobox" [02:23:34] TimStarling: it ate all memory in changes list if there was changes on many languages to mediawiki pages [02:23:51] and the more table elements and parameters, and then you create a set of css definitions in your sitewide css files to style them [02:25:57] we could use the DB [02:26:17] that would get around the memory limit problems [02:26:33] Splarka, actually that help is for endusers only [02:26:37] heh, or store it all in a DOM tree [02:26:48] Flightbase: well, learn what they are, before creating them ^_^ [02:26:51] did I tell you, DOM memory usage is not accounted for towards memory_limit? [02:27:04] heh [02:27:04] Nice. [02:27:09] so you can just pretend it doesn't exist until your server goes into swap [02:27:11] also, copy http://en.wikipedia.org/wiki/Template:Infobox and the css from http://en.wikipedia.org/wiki/MediaWiki:Common.css related to .infobox [02:27:15] a good start [02:27:17] i`m just root of that server. i don`t want to use them at all ;) [02:27:34] well, tell your users, there is no extension, just css/templates [02:27:45] although ParserFunctions is helpful for constructing them [02:28:40] maybe we could just drop the "mediawiki: page is always known if the message is defined in source" thing [02:28:55] could do [02:30:13] DOM trees are almost as inefficient as PHP hashtables, so it's a really serious problem [02:30:33] I might have to port all my new parser work over to some other data structure [02:31:14] so if we drop the condition from that function, the database will be queried for page existence, not? [02:31:22] TimStarling, why is the non-production-ready new parser the default in current trunk? [02:32:29] I haven't had single problem with the new parser after tim fixed the bugs with section editing [02:33:16] Okay, then why is it disabled on Wikipedia? I guess it doesn't matter, it's trunk. [02:33:19] gn8 all [02:33:30] Someone was complaining of some sort of weird behavior using trunk recently, on wikitech-l. [02:33:38] It was suggested they switch to the old parser. [02:34:01] had someone here also, can't remember what the issue was [02:34:26] something to do with extensions [02:34:47] the wikihiero issue? [02:35:35] hmm, normal people are waking up to go to work... I should go sleeping [02:35:35] there's no outstanding issues that I'm aware of [02:35:41] it's beta quality as far as I'm concerned [02:35:49] and that's good enough for trunk [02:37:01] as for wikipedia, well I'm waiting for the wheels to turn [02:37:16] no rush [02:37:53] does Men_Without_Hats still break or was that a silly template call to begin with that should have broken in the old parser? 
[02:38:30] once brion syncs the latest changes, we'll do some more testing against wikipedia articles and see what we get [02:38:53] sounds reasonable [02:40:02] Splarka: can't remember what issue that was now, but that was a while ago, I'm pretty sure I fixed everything that came up at that time [02:40:14] *Splarka investilooks [02:40:32] {{ambox [02:40:32] [02:40:32] |type = content [02:40:47] they'd commented out the first parameter, so the name of the template was a newline, a comment, and another newline, before the pipe [02:40:54] worked in old, broke in new [02:40:58] and silly either way [02:41:04] yeah, that works now [02:41:08] k [02:41:43] Anyone know how to get access to the information inside the variables in the special:allmessages? I want to append a ?action=purge to the link in the returnto message but the $1 seems to have the url in there in a way I can't get at it. [02:41:44] now comments are competely handled in preprocessToDom(), I don't think removeHTMLcomments() is used at all [02:46:45] frieze: depends on the message... [02:47:11] Splarka: the returnto message [02:47:24] by default it's "return to $1." [02:47:25] see http://en.wikipedia.org/w/index.php?title=MediaWiki:Movepage-moved&action=edit for some hacky examples, sometimes transclusion works [02:47:40] for example: [{{fullurl:Special:Whatlinkshere/$4}} check] [02:47:48] works if it is transcluded into the message [02:49:34] you might be able to... eg... [02:49:40] Return to {{Returntocheck}}. [02:49:45] and in Template:Returntocheck: [02:50:00] $1?action=purge or {{#switch:$1}} even [02:50:55] * not guaranteed to work [02:53:35] hmm [02:53:46] you should be able to use {{{1}}}, that'd be pretty cool [02:53:48] is fullurl: documented somewhere obvious? [02:54:04] searchign mediawiki.org is predictably unhelpful [02:54:07] $1 has lots of problems [02:54:08] frieze: http://meta.wikimedia.org/wiki/Help:Magic_words [02:54:21] but $1 in returnto is already a link, so you don't need it [02:54:57] *Splarka wonders why m:Help:Magic_words isn't on mw: [02:59:39] hmm [02:59:47] well transclusion doesn't seem to work. [03:00:00] there might not be much you can do then [03:00:39] since $1 is a pre-formatted link already [03:01:26] well it gets transcluded okay, just turns into a $1 when fullurl gets called [03:01:32] Splarka, can't be imported to mw:Help: because it's GFDL. [03:01:45] frieze, there's a bug about this somewhere on Bugzilla. [03:01:53] I could maybe write (yet another) parserfunction to edit it [03:02:39] sim: bah! [03:03:26] $1 gets turned into %24 by fullurl, which then isn't transformed [03:04:03] passing the message parameters through to the parser so that you can use {{{1}}} would solve that problem, among others [03:04:27] tim: http://en.wikipedia.org/w/index.php?title=Template:MediaWiki_move_check&action=edit [03:04:40] the transclusion hack works though? [03:04:49] or is that broken.. I haven't moved a page lately ^_^ [03:04:55] it depends on the message [03:04:58] ahh [03:05:05] thanks to another terribly ugly hack [03:05:06] returnto is called with... $link = wfMsg( 'returnto', $wgUser->getSkin()->makeLinkObj( $title ) ); [03:05:23] $2 could be created as just the $title I suppose? [03:05:44] hack++; [03:05:57] I have to go get some lunch [03:06:00] I tried copying that over with just $1 instead of Special:Whatlinkshere/$4 but that didn't appear to do anything helpful [03:06:20] frieze: oh, there is another possiblity [03:06:27] give me an example of a 'returnto' ? 
[03:06:54] the default is "Return to $1." [03:07:10] yes [03:07:17] but show me an example where mediawiki generates that [03:07:26] ahh [03:07:31] or, for the purposes that ?action=purge would be useful [03:07:37] haven't found the one for watches yet [03:07:46] because sometimes you can use magicwords [03:08:07] like, action=delete and delete success, they can use {{FULLPAGENAME}} [03:08:41] I need it because I wrote this ugly pile of code that allows me to create buttons to watch and unwatch pages from within a template [03:08:43] but that depends on if the page that is calling 'returnto' is at the page in question, or on a Special: page, or a bad title [03:09:08] the returnto i'm concerned with is the one that comes up when you add a watch [03:09:25] doesn't ajaxwatch make that pretty rare now? [03:10:15] *Splarka disables JS and tests [03:11:33] ahh, does work, for the watch, sweet [03:11:47] oop, spoke too soon [03:12:17] oh man [03:12:23] purge [03:12:27] works [03:12:36] but ugh! ugly [03:13:59] what other wikicode actions have a 'return to'? this might break other ones [03:15:00] lots of them [03:15:02] sadly [03:15:10] ugh [03:15:30] I really don't want to edit the php that makes the call when a watch is added/removed. makes upgrades complicated [03:15:46] hmm [03:16:05] well, here, I set that on http://test.wikipedia.org/wiki/MediaWiki:Returnto [03:16:24] try some other returntos and tell me if any of them show {{FULLPAGENAME}} as something catchable [03:16:33] I mean, you could do maybe: [03:16:42] {{#switch:{{FULLPAGENAME}}| [03:16:56] Main Page=Return to $1.| [03:17:02] Bad title=Return to $1.| [03:17:12] #default=purge}} [03:17:12] hmm [03:17:20] well it works mostly [03:17:33] it will break somewhere though, I assume ^_^ [03:17:46] if you can find some examples it will help [03:18:42] actually, that probably wouldn't work, if it allows raw html it probably won't like parserfunctions [03:20:08] oop, it *does* work [03:20:09] damn [03:20:57] while it does work I am confident it won't survive an upgrade. Some weird new regimen of hacks will break it in new and exciting ways [03:21:05] of course [03:21:10] still thanks a ton for the help [03:21:12] but I must say [03:21:17] I never would have thought of going that way [03:21:45] mixing magiwords, extension syntax, parserfunctions, and raw html in a MediaWiki: msg is really weird [03:21:45] I was about to try to us a hook to force page purges on watch add/removes [03:22:06] which would almost certainly destroy the space/time continuum [03:23:01] Return to {{#ifeq:{{FULLPAGENAME}}|Main Page|$1|{{FULLPAGENAME}}}}. [03:43:35] well, I'm off to sleep the sleep of the righteous. thanks for the help [04:31:21] thanks for the help all [04:31:28] I need to use special character "&" in article titles (1.11). Someone offered a hack for this before but I can't remember how to do it. Any help? [04:32:59] http://en.wikipedia.org/wiki/Wikipedia:Naming_conventions_(technical_restrictions) [04:33:03] no mention of it here [04:33:10] if you encode it it should be fine [04:34:33] %26 [04:58:12] Hello [04:58:22] Are there some kind of extension for creating tables easily? [05:02:03] TableEdit seems interesting :) [05:05:11] What SQL query is used to calculate size changes in the edit histories? I can only find revision.rev_len, so I assume it must compare that field for each set of new/old revisions, but I'm not sure how to find the previous revision ID. [05:10:17] order by rev_timestamp [05:12:01] Thanks; that would work with page_id. 
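A sketch of the rev_len comparison described just above, using MediaWiki's database abstraction layer: the size change shown in a history row is the difference between a revision's rev_len and that of the previous revision of the same page. The page id 123 is a placeholder, and this assumes MediaWiki is already bootstrapped:

    <?php
    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select(
        'revision',
        array( 'rev_id', 'rev_len' ),
        array( 'rev_page' => 123 ),            // placeholder page id
        __METHOD__,
        array( 'ORDER BY' => 'rev_timestamp' ) // oldest first
    );
    $prevLen = 0;
    while ( $row = $dbr->fetchObject( $res ) ) {
        $delta = $row->rev_len - $prevLen;     // the (+N / -N) shown in histories
        echo "rev {$row->rev_id}: " . ( $delta >= 0 ? '+' : '' ) . "$delta bytes\n";
        $prevLen = $row->rev_len;
    }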
[05:26:32] Nah, most Table extensions seems outdated or broken [05:26:51] It's possible to do this on Mediawiki 1.11? http://en.wikipedia.org/wiki/Help:Sorting [05:28:25] 1.11 > 1.9 [05:28:42] Splarka, OK. I tried and not works :P [05:28:56] link? [05:28:57] http://wiki.atarichat.net/wiki/Test2 [05:30:07] table sorting *is* in your wikibits.js, hmm [05:30:45] Splarka, another test http://wiki.atarichat.net/wiki/Test3 [05:30:51] (this one taken from http://qed.princeton.edu/main/Help:Tables#Sortable_Tables_with_wikitable_Syntax ) [05:31:33] hmm, I reloaded and it is working now [05:31:50] *browser voodoo* [05:32:09] Test3 works too [05:32:12] oops, that one is a different kind. I want the tiny buttons [05:32:38] I mean the tiny buttons, you push then and orders in a way or another. Order by Name, by Surname... [05:32:45] yep, I see them... [05:32:59] Not here, and I deleted the cache. WHat browser? [05:33:02] Name [><] Surname [><] Height [><] [05:33:11] Mozilla [05:33:41] oops, forgot to enable javascript. I use NoScript and deleted the alow [05:33:51] Splarka, thanks for all [05:34:02] *Splarka digs out the cluebat from storage [05:34:25] I almost made it so that
    <pre><!--foo--></pre> displays nothing, because I thought that's how it worked in the old parser [05:35:06] then at the last minute, I actually checked the old parser, and in fact it displays <pre><--foo--></pre>
    [05:35:13] +! [05:35:22] heh [05:35:28] Are there a table generator for dumbs or a csv to wiki extension? [05:36:12] so the new one was right in the first place [05:36:19] Finally... I actually managed to get a checkout of trunk... [05:36:22] Man this computer is slow [05:36:30] Tim: something else, whitespace:nowrap or whitespace:pre works on other (real XML) tags, but the excess whitespaces get stripped, only
     <pre> keeps the whitespaces
    [05:36:31] 	That took under a minute on my dad's
    [05:36:50] 	timofonic: just use HTML tables
    [05:37:05] 	there's plenty of software around that will generate them for you
    [05:37:11] 	and for the most part, they'll work in mediawiki
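A tiny sketch of the "just use HTML tables" suggestion: plain PHP that turns a CSV file into an HTML table that can be pasted into a wiki page. The file name is a placeholder, and this is a standalone helper, not a MediaWiki extension:

    <?php
    // csv2table.php -- hypothetical helper script
    $rows = array();
    $fh = fopen( 'data.csv', 'r' );            // placeholder input file
    while ( ( $cells = fgetcsv( $fh ) ) !== false ) {
        $cells = array_map( 'htmlspecialchars', $cells );
        $rows[] = '<tr><td>' . implode( '</td><td>', $cells ) . '</td></tr>';
    }
    fclose( $fh );
    // class="wikitable" assumes your site CSS styles it; a plain <table> works too.
    echo '<table class="wikitable">' . "\n" . implode( "\n", $rows ) . "\n</table>\n";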
    [05:37:13] 	would it be.. um.. hard to exclude real html tags from losing excess whitespace if they use a whitespace style definition?
    [05:37:15] 	TimStarling, there will be clueless people on the wiki and that will be rocket science for them :)
    [05:37:22] 	TimStarling, uhm
    [05:37:32] 	TimStarling, like this? http://area23.brightbyte.de/csv2wp.php
    [05:38:09] 	Splarka: is that something different from what we're doing?
    [05:38:54] *MZMcBride 	wonders if sort keys will ever be possible with sortable tables
    [05:39:15] 	damn, I need to test these things out before complaining from memory
    [05:39:35] *Splarka 	tests
    [05:40:53] 	Splarka: by the way, ampersands and short URLs are always a bitch
    [05:40:54] 	Tim: http://test.wikipedia.org/wiki/Whitespace
    [05:41:11] 	well, sure, but [[&]] works?
    [05:41:51] 	well there aren't any technical restrictions on Wikipedia because it's been fixed there; but elsewhere, it won't normally work without some apache screwing around
    [05:41:58] 	if you're rewriting URLs, that is
    [05:41:59] 	ahh
    [05:42:10] 	doh
    [05:42:14] 	mmm, strange
    [05:42:40] 	Tim: this behavior changed btw, in the last few years
    [05:43:22] 	could be a tidy bug
    [05:43:38] 	it used to just keep all whitespace/newlines in any html tag, IIRC
    [05:44:30] 	wait, my suggestion was dumb...  
    set via .foo {white-space:nowrap} wouldn't be predictable by the parser, it'd have to be all tags' whitespace (as before) or nothing (as now) [05:46:31] the obvious solution is to leave whitespace as-is [05:46:36] regardless of tag [05:46:38] TimStarling, the problem of csv2wp is that isn't possible to just making it usable to registered users [05:46:40] which is what we do already [05:46:59] ahh, bugs <3 [05:47:24] the thing is, this happened on Wikia too, and they don't use Tidy (or didn't when the problem started) [05:48:10] [0547][tstarling@zwinger:~]$ echo "
    x y
    " | tidy [05:48:16]
    x y
    [05:48:16] When was page.page_is_new added to the database? [05:48:27] are you sure they don't? [05:48:46] when we first noticed the problem, they did not have tidy, was about a year ago [05:48:59] but it might be an unrelated coincidence [05:49:05] > print $wgParser->tidy('
    x y
    ') [05:49:05]
    x y
    [05:51:05] hmm, space-indended (the other
    ) keep the whitespace, but also parse the wikicode...
    [05:51:27] 	how often does Tidy get updated? there's a lingering bug that's fixed in the current Tidy version...
    [05:51:32] 	http://test.wikipedia.org/wiki/Integers
    [05:51:58] 	does this settle the question? http://p.defau.lt/?YnD4bwE_Di3x4cbLxR6nGQ
    [05:52:10] 	MZMcBride: too often, last time we did it, it broke everything
    [05:52:19] 	we should make a stable fork
    [05:52:26] 	ah
    [05:52:49] 	Tim: okay, bug in tidy
    [05:52:52] 	Someone made a note on MediaWiki that made me create this. What does everyone think? http://bugzilla.wikimedia.org/show_bug.cgi?id=12530
    [05:52:55] 	we're on April 2007 at the moment
    [05:52:58] *Splarka 	gets pwned
    [05:53:56] 	makes sense that newer versions could screw things up
    [05:56:18] 	I'm coding a set of hooks for Special:Listusers
    [05:56:33] 	(Currently on one for ->formatRow)
    [05:57:03] 	Dantman|Coding: I think you spelt "too" wrong
    [05:59:00] *Dantman|Coding 	debates on if that's serious, or humor he can't catch...
    [05:59:30] 	Anyways...
    [05:59:43] 	MZMcBride: Yes, ampersands are a bitch... and just encoding them isn't working either
    [05:59:57] 	I'm giving the hook the output of wfSpecialList normally used, and also the $row passed to the function...
    [06:00:00] 	Sasoriza: are you using short URLs?
    [06:00:09] 	there's some good help pages on mediawiki.org
    [06:00:38] 	Yeah, and yeah, I thought so, just haven't found the page yet
    [06:01:11] 	Should I also include the $userPage/$name/$groups or some other variable used in that function, or leave that to be redone by someone implementing the hook?
    [06:01:24] 	((Using the data from $row))
    [06:01:57] 	Don't know if more variables is worse than redoing some medium-weight code
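A sketch of the kind of hook being described: the caller hands subscribers both the formatted output and the raw database row, so an extension can rewrite one using the other. The hook name, function name, and selected fields here are hypothetical, not what was actually committed:

    <?php
    // In the special page, where each row is formatted:
    wfRunHooks( 'ListusersFormatRow', array( &$item, $row ) );

    // In an extension:
    $wgHooks['ListusersFormatRow'][] = 'efDecorateListusersRow';

    function efDecorateListusersRow( &$item, $row ) {
        // $row is the raw result object; $item is the HTML built by wfSpecialList().
        if ( $row->user_editcount == 0 ) {   // assumes the query selected user_editcount
            $item .= ' <em>(no edits)</em>';
        }
        return true;                         // let other handlers run
    }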
    [06:02:46] 	ampersands are only a problem if you're using path-to-query rewrite rules, aren't they?
    [06:03:48] 	once upon a time, we used a path-to-query rewrite rule, and we had an apache patch to deal with the ampersands
    [06:04:16] 	but the proper way to deal with it is to use an alias
    [06:04:44] 	or if that's not possible, a path-to-PATH_INFO rewrite rule
    [06:06:51] *Dantman|Coding 	intends to come up with a nice way to use rewrites in NGINX at some point
    [06:09:20] 	in a choice between using a display:none hack or using 100 #ifexist calls, which option should i choose?
    [06:09:33] 	neither
    [06:09:36] 	(there's an editprotected request on en.wiki)
    [06:10:04] 	well, it's an archive list thing; currently uses the #ifexist scheme
    [06:10:53] 	well, you can just use a static archive list, and edit it when you add archives, someone suggested that earlier in this channel didn't they?
    [06:11:36] 	kinda defeats the template magic :(
    [06:12:30] 	yes, defeats it very well, I should say
    [06:12:36] 	morning!
    [06:14:02] 	hi domas
    [06:19:19] 	hello dear Oracle people!
    [06:23:59] 	domas: ever read any RM Hare?
    [06:24:09] *AaronSchulz 	is thinking about it
    [06:24:16] 	AaronSchulz: nope
    [06:24:51] *AaronSchulz 	puts the Philosopher Schulz hat on
    [06:25:31] 	domas: at any rate, http://www.phy.duke.edu/~rgb/Philosophy/axioms/axioms/node45.html is a fun read :)
    [06:28:45] 	TimStarling: is LP really part of QM?
    [06:30:27] 	haha http://www.phy.duke.edu/~rgb/Philosophy/axioms/axioms/footnode.html#foot10124
    [06:30:29] 	'By the way, it should become clear from my frequent use of this as a Universal Resource that in my opinion Wikipedia is well on its way toward becoming the crowning achievement of human civilization - literally an online, free repository for all non-encumbered human knowledge, such as it is.'
    [06:32:07] 	"LP is at the very root of certain interpretations of quantum mechanics"
    [06:32:17] 	certain interpretations of quantum mechanics are rubbish
    [06:32:37] 	namely the copenhagen interpretation and the many worlds interpretation
    [06:32:54] 	so I don't think it's saying very much to associate LP with one of those two
    [06:33:17] 	ok, good
    [06:33:37] 	I remember you said 'copenhagen interpretation' was crap, now I see you don't like many-worlds too :)
    [06:34:54] 	nope, there's only one I like, that's the null interpretation :)
    [06:35:09] 	many-worlds make me think of dualism a little
    [06:35:47] 	I always have trouble with coherent relations among universes
    [06:36:16] 	unless 'universe' is define so loosely it becomes 'dimension'...
    [06:36:46] 	I wasn't sure about many worlds for a long time, because none of my colleagues ever talked about it
    [06:37:05] 	everyone just switched between reality and copenhagen as the whim takes them
    [06:37:51] 	the reason I eventually decided I didn't like it is because it still implies discreteness, i.e. that the worlds are countable, and perhaps finite in number
    [06:38:50] 	yeah, that kind of doesn't work ;)
    [06:39:07] 	that was my first problem with it
    [06:39:38] 	if copenhagen mainstream?
    [06:39:41] 	*is
    [06:40:47] 	yes
    [06:41:05] 	but it's only an interpretation, everyone understands that
    [06:41:07] 	it's not predictive
    [06:41:29] 	it's a way to make sense of the numbers, once you have them
    [06:41:36] 	it's not a way to get the numbers
    [06:48:30] 	http://forums.philosophyforums.com/threads/cartesian-dualism-and-the-many-worlds-theory-of-quantum-mechanics-27946.html
    [06:48:39] 	TimStarling: lol, some of those comments are funny
    [06:53:30] 	many-worlds is fun to imagine, but seems far-fetched... kinda like junk food for the quantum soul
    [06:54:49] 	now if only i could make & work... grrr
    [06:55:00] 	is this the only connection between dualism and many-worlds?
    [06:55:35] 	this idea that maybe the world of the mind is diffferent from the world of matter in the same way that one quantum possibility is different from another?
    [06:56:22] 	that's kind of like collapsing the wave function, there, TimStarling
    [06:58:41] 	Damn it. Someone did a specialchars hack for 1.9, now I can't find it
    [06:59:01] 	not even sure it'll work but I'd like to try
    [06:59:06] 	ah, it was in new scientist
    [07:01:50] 	well, there's something to be said for popularising physics, I suppose it's nice to get these ideas out there
    [07:02:07] 	and popularity helps with funding
    [07:03:06] 	but if you're not willing to get into the nitty gritty of it, you're not going to understand quantum theory
    [07:05:04] 	TimStarling: Whom are you addressing?
    [07:08:43] 	http://en.wikipedia.org/wiki/Many-worlds_interpretation#Acceptance_among_physicists
    [07:08:49] 	'MWI is considered by some to be unfalsifiable and hence unscientific because the multiple parallel universes are non-communicating, in the sense that no information can be passed between them. Others[29] claim MWI is directly testable. Everett regarded MWI as falsifiable since any test that falsifies conventional quantum theory would also falsify MWI.[6]'
    [07:08:51] 	heh
    [07:09:19] 	I love that last sentence ;)
    [07:10:52] 	Someone thinks they communicate
    [07:11:27] 	the last sentence seems to imply that you can just add shit on to a theory and it remains scientific
    [07:14:42] 	that's a matter of POV
    [07:15:51] 	TimStarling: 'In fact, Peres questioned whether MWI is really an "interpretation" or even if interpretations of quantum mechanics are needed at all.'
    [07:16:05] 	there you go, the 'null interpretation' ;)
    [07:20:02] *Sasoriza 	thought this was MW, not MWI
    [07:20:08] 	if they can communicate, just wait until someone requests #ifexist for alternate universe queries
    [07:20:24] 	Splarka: lol
    [07:21:16] 	that's a scary thought... and actually intriguing
    [07:22:02] 	Splarka: I shouldn't be able to make alternate universe queries :)
    [07:22:31] 	they should be unreferencable and appear not to exist
    [07:22:44] 	I'm of the belief: If you can imagine it, then it's possible.
    [07:23:01] 	is that The Secret? :D
    [07:23:02] 	{{#ifexist:UniverseWhereHitlerWasNeverDefeated:w:de:Americans|Whew|Uhoh}}
    [07:23:30] 	Sasoriza: I should make my mind it's own universe for fun
    [07:23:48] 	AaronSchulz: you already did
    [07:25:13] 	{{#ifexist:UniverseWhereSasorizaGotTheDamnAmpersandToWork|Sasoriza|WouldBe|Happy}}
    [07:27:10] 	Splarka: http://en.wikipedia.org/wiki/Interpretations_of_quantum_mechanics#Summary_of_common_interpretations_of_QM
    [07:27:15] 	that looks interesting
    [07:32:54] 	the wave function does not apply to an individual system, or for example, a single particle, but is an abstract mathematical, statistical quantity that only applies to an ensemble of similar prepared systems or particles... why does that not make sense to me
    [07:34:03] 	sounds like Albert pulling an Everett
    [07:45:54] 	"the "mind" must be separate from the body"... makes sense
    [07:48:47] 	"the many minds interpretation advises you to get in the box with the cat, since it is certain that your only successor will emerge unharmed." now where's the sense in that
    [07:49:20] 	The yous that are harmed don't emerge
    [07:49:38] *Sasoriza 	won't be one of them
    [07:50:54] *Sasoriza 	is getting out of the box... with the cat
    [07:53:45] 	When all is said and done, we are part of the system. And yet, what determines the system, if not itself?
    [07:54:55] 	The only way to not be part of the system is to be the system
    [07:56:44] <_thom_>	deep
    [07:56:50] 	lol
    [08:00:51] 	a system determines the system determines the system, ad infinitum
    [08:01:27] 	Sasoriza: did you get ampersands to work?
    [08:06:00] 	nope. see this is what I don't get: $wgLegalTitleChars introduced in 1.6... I'm on 1.11
    [08:06:19] 	yet I get breakage
    [08:06:57] 	Apache issue?
    [08:08:02] 	if you're using rewrite rules, it gets complicated
    [08:08:08] 	personally, i just use "and" :)
    [08:09:20] 	that's what I've been doing
    [08:09:37] 	but I'm the type who has to have the cake and be able to eat it
    [08:11:51] 	you saw these links, right? http://www.mediawiki.org/wiki/Manual:Short_URL#Ampersand_.28.26.29_problem
    [08:12:49] *MZMcBride 	is away
    [08:17:55] 	I didn't. TY! Anyone know if Barrylb's solution works on 1.11?
    [08:18:53] 	http://www.mediawiki.org/wiki/Manual:Short_URL/Ampersand_solution_with_root_access#Another_option
    [08:46:27] 	Does anyone know what revision of MW 1.12 getUserPermissionsErrors was added in?
    [08:47:58] 	svn annotate!
    [08:50:42] <_wooz>	lo
    [08:53:52] *Splarka 	mumbles 26568 in case dan didn't find it already
    [08:55:20] 	if you mean the hook
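For reference, a minimal sketch of hooking getUserPermissionsErrors (the hook added in r26568 mentioned above); the callback name and the example rule are made up for illustration:

    <?php
    $wgHooks['getUserPermissionsErrors'][] = 'efCheckMyPermissions';

    function efCheckMyPermissions( $title, $user, $action, &$result ) {
        // Example rule: only sysops may edit pages in the Project namespace.
        if ( $action == 'edit'
            && $title->getNamespace() == NS_PROJECT
            && !in_array( 'sysop', $user->getGroups() )
        ) {
            $result = array( 'protectedpagetext' ); // reuse an existing message key
            return false;                           // stop: permission denied
        }
        return true;                                // no opinion, keep checking
    }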
    [09:01:22] 	Hey, I'm getting a "Database Error" and it says that the SQL query is hidden
    [09:01:30] 	I enabled all debugging features
    [09:01:48] 	but still getting it, I need to see what's wrong with that particular query
    [09:03:00] 	Actually those I've enabled are:
    [09:03:02] 	$wgShwoSQLErrors = true;
    [09:03:02] 	$wgDebugDumpSql = true;
    [09:03:02] 	$wgShowExceptionDetails = true;
    [09:04:25] 	Shwo?
    [09:04:33] 	:)
    [09:04:42] 	hehehe
    [09:05:43] 	pfff
    [09:05:44] 	:)
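For anyone copying the snippet above: the first variable is misspelled ("Shwo"), which is likely why the SQL query stayed hidden. The intended LocalSettings.php lines are:

    $wgShowSQLErrors = true;        // show the failing query instead of "(SQL query hidden)"
    $wgDebugDumpSql = true;         // write all queries to the debug log
    $wgShowExceptionDetails = true; // full backtraces for uncaught exceptions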
    [10:51:38] 	wikibugs is dead?
    [10:52:32] 	DDT ?
    [10:54:34] 	Hello?
    [10:55:28] 	it's alive !! :) 
    [10:55:52] 	I'm not sure if I am just yet, hang on
    [10:57:08] 	:))
    [10:59:04] 	is anyone subscribed to wikibugs-l?
    [11:01:07] 	hmm, not me.
    [11:01:26] 	there's no http://lists.wikimedia.org/pipermail/wikibugs-l/2008-January/thread.html
    [11:02:05] 	it's not archived, that's why I asked
    [11:03:27] 	TimStarling: that's the output of bugzilla, or is it more?
    [11:03:28] 	TimStarling: I am
    [11:05:48] 	have you gotten any emails recently that weren't reported on IRC?
    [11:11:24] *archivist 	has a gmail bin full of wikibugs-l
    [11:13:59] 	and last mail was a few minutes ago
    [11:28:28] 	timichal: yes. I got the mail when I said "wikibugs is dead?"
    [11:28:28] 	gah
    [11:28:29] 	TimStarling: ^^
    [12:12:54] 	is it possible to link to an image outside my images? something like the image tag to a different location
    [12:15:51] 	hi all
    [12:16:09] 	hi
    [12:16:47] 	hi
    [12:24:22] 	can mediawiki have üäöß in usernames?
    [12:24:29] 	yes
    [12:24:56] 	you can have a sanskrit user name if you like
    [12:25:02] 	hmpf. i got a vbulletin integration - and they map "jürgen" to "jrgen"
    [12:25:12] 	which drives me nuts ;D
    [12:25:20] 	blame them then :)
    [12:25:54] 	ohh i will ;)
    [12:27:36] 	=)
    [12:27:54] 	bonjour
    [12:28:03] 	Do you speak french ?
    [12:29:46] *amidaniel|away 	blinks
    [12:30:01] 	=)
    [12:30:35] 	lol
    [12:30:56] 	french guys always pretend to have 0 english skills. but they do.
    [12:31:03] 	true
    [12:31:05] 	:)
    [12:31:17] 	so i would pretend at any time, that i can`t understand one single french word ;)
    [12:31:31] 	sweet
    [12:33:32] 	i want to add a jabber status graphic to our local wiki...is there a way to add external images?
    [12:33:59] 	make extension
    [12:34:03] 	Yes.
    [12:34:15] 	If you want to know what the way is, then give me cookies.
    [12:34:29] *marvxxx 	passes a cookie
    [12:34:43] *amidaniel|away 	gobbles
    [12:34:45] 	sec
    [12:35:22] 	!man $wgAllowExternalImages
    [12:35:22] --mwbot--	http://www.mediawiki.org/wiki/Manual:%24wgAllowExternalImages
    [12:35:26] 	One option ^
    [12:37:40] 	amidaniel|away: this doesnt look bad
    [12:38:31] 	Depends upon how much you trust your users ... can be used for malicious things
    [12:38:51] 	The least hackiest of ways to do it, though
    [12:39:28] 	amidaniel|away: its a base of like 4 users
    [12:39:31] 	and i know all of them :)
    [12:39:46] 	Ah, then just use that :P
    [12:40:15] 	ok then i thank you alot
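The option pointed to above, as it would look in LocalSettings.php; as noted, it lets any user embed any external image, so it is only sensible on a small, trusted wiki:

    // LocalSettings.php -- allow bare image URLs (e.g. a Jabber status graphic)
    // to be rendered as inline images.
    $wgAllowExternalImages = true;
    // Some versions also offer $wgAllowExternalImagesFrom to restrict this to a URL prefix.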
    [12:41:08] 	Duesentrieb, is a query ok?
    [12:42:31] 	Flightbase: yes, though i don't have much time... and why not ask here?
    [12:43:26] 	ouh, serveral reasons ;)
    [13:27:04] 	do we have release notes or a "new changes" list for MediaWiki 1.12?
    [13:28:36] 	http://svn.wikimedia.org/svnroot/mediawiki/trunk/phase3/RELEASE-NOTES
    [13:28:45] 	that's the latest in head.
    [13:33:47] 	ah gracias
    [13:37:18] 	how do i get rid of "Donate page" - is it in every skin - or is there some default include for it?
    [13:43:18] 	hello guys. i setup about 12 wikis (http://nas-central.org and a bunch of subwikis based on vendors) without choosing the licence as i was unsure if i should use a CC or a GFDL licence. now i would like to configure the wikis to use the GFDL licence....i used this in the Localsettings.php: http://pastebin.org/14535  but somehow it does not display the GFDL icon on the bottom left. anyone has a clue how i can fix that? thx in
    [13:43:55] 	(latest mediawiki-package)
    [13:45:00] 	also i see no change in the metadata.
    [13:47:59] 	i don't think the license goes into the metadata at all
    [13:48:15] 	mediawiki is sadly not too great about metadata (yet)
    [13:48:54] 	mindbender: are you using a custom skin? what's the url of your wiki? do you see code related to the rights icon in the html source?
    [13:49:25] 	just go to http://nas-central.org to have a look
    [13:49:31] 	thx for your help
    [13:51:08] 	mindbender: the path you used for the icon is a full fs path, starting with /var. a browser cannot resolve that. use a web path.
    [13:51:18] *Morbus 	nods.
    [13:51:22] 	probably /skins/common/images/gnu-fdl.png
    [13:51:22] 	you've got /var/chroot/nas-central.org/srv/www/nas-central.org/nas-central.org/skins/common/images/gnu-fdl.png
    [13:51:27] 	yes. which is wrong
    [13:51:31] 	aye.
    [13:52:15] 	believe it or not... it's right.
    [13:52:27] 	ah!
    [13:52:27] 	Morbus: btw... did you think of the legal implications of imposing a license after the fact? at least in theory, you need the permissions of each and every contributor.
    [13:52:28] 	ok.
    [13:52:36] 	so i should use the a web link. ok
    [13:52:42] 	i thought i can use a relative link
    [13:52:48] 	Duesentrieb: mindbender.
    [13:52:54] 	actually i am the only contributor so far
    [13:53:11] 	Morbus: sorry - my client gave you the same color and added to my confusion :P
    [13:53:13] 	the only problematic wiki is http://buffalo.nas-central.org
    [13:53:34] 	all others were setup by myself only the last days
    [13:54:19] 	.oO(why do i keep reading "nsa central"?)
    [13:54:46] 	thx guys...its working!
    [14:09:11] 	hi
    [14:11:16] 	hmm, without any rewrite rules set, why would Mediawiki remove the ++ in this url: http://techbasetest.kde.org/index.php?title=Policies/Binary_Compatibility_Issues_With_C++
    [14:11:16] 	?
    [14:12:16] 	because + in URL means space
    [14:12:28] 	especially, it does in attribute values
    [14:12:31] 	domas: but it used to work
    [14:12:37] 	it would work if you use pathinfo
    [14:12:42] 	(depending on webserver setup)
    [14:12:42] 	danimo: use %2b%2b instead
    [14:12:50] 	pathinfo -> index.php/FooBar
    [14:12:55] 	+ might work on a broken webserver, but i'd be surprised
    [14:13:29] 	flyingparchment: this works on wikipedia: http://en.wikipedia.org/wiki/C++
    [14:13:48] 	yes, because that's not a request parameter
    [14:13:58] 	Duesentrieb: the rewrite tutorial specifically said to disable path info
    [14:14:18] 	flyingparchment: indeed - it's a bit unclear how this is handled in other parts of the url. afaik, implementations differ
    [14:14:19] 	danimo: nothing should generate a URL with a literal + like that anyway, where did you get it from?
    [14:14:19] 	that's only for sites that have an arguably-bad layout.
    [14:14:20] 	Duesentrieb: and enabling doesn't change anything
    [14:14:33] 	where your install directory is the same location as your wiki namespace.
    [14:14:40] 	flyingparchment: it just used to work in prior mediawiki versions
    [14:14:54] 	flyingparchment: upgrading to 1.11 broke it (right now it's on 1.12a)
    [14:15:03] 	danimo!
    [14:15:10] 	heya domas :)
    [14:15:12] 	didn't look at nickname :)
    [14:15:13] 	how are ya
    [14:15:26] 	hm. i just encountered a small problem again at http://nas-central.org.org ..... here is the part of Localsettings.php : http://pastebin.org/14539 ..... the link on the bottom right which say: "Content is available under GNU Free Documentation License v1.2" is linking to http://nas-central.org/index.php/Http://www.gnu.org/licenses/fdl.txt ....anyone knows how to fix that?
    [14:15:28] 	domas: fed up with urls for sure :)
    [14:15:32] 	haha
    [14:16:19] 	domas: and for KDE it kinda makes sense to get URLs that include C++ working again :)
    [14:16:43] 	mindbender: don't set $wgRightsPage if you want a url. the comment after it, in your local settings, even says that it's a *page*.
    [14:17:15] 	mindbender: set a page OR a url. not both.
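Putting that advice together, a LocalSettings.php sketch for the GFDL footer being described: a URL (or a page, but not both) plus an icon referenced by web path rather than filesystem path:

    $wgRightsPage = '';   // leave this empty when linking straight to an external URL
    $wgRightsUrl  = 'http://www.gnu.org/licenses/fdl.txt';
    $wgRightsText = 'GNU Free Documentation License v1.2';
    $wgRightsIcon = "$wgScriptPath/skins/common/images/gnu-fdl.png"; // web path, not /var/...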
    [14:17:59] 	domas: I wonder why it used to work in first place
    [14:18:01] 	danimo: this stuff is pretty fucked up :) perhaps read the comments in DefaultSettings.php wrt title chars
    [14:18:27] 	danimo: some stuff about pathinfo changed in 1.11... maybe that's it. or something. dunno :P
    [14:18:29] 	ah! thx
    [14:19:54] 	Duesentrieb: doesn't really help :)
    [14:20:10] 	Duesentrieb: I'd rather talk to whoever changed url handling in 1.11 :)
    [14:20:19] 	danimo: someone really wanted it to work once? :)
    [14:20:36] 	danimo: anyway, we use Alias /wiki/ ... on our sites
    [14:20:40] 	danimo: svn blame :)
    [14:21:05] 	regardless of who changed the handling in MediaWiki, treating + as a space in URLs is /correct/.
    [14:21:07] 	domas: I know, but I really want to avoid this. I also don't really see how that affects url handling
    [14:21:45] 	http://techbasetest.kde.org/index.php?title=Policies/Binary_Compatibility_Issues_With_C%2B%2B is the correctly encoded URL for literal +s.
    [14:21:46] 	Morbus: probably, but in case of mediawiki things are not so simple
    [14:22:23] 	Morbus: but that means to manually encode these strings each time
    [14:22:35] 	or not?
    [14:22:43] 	ideally, MW should do that for you, if you're linking to it.
    [14:23:37] 	danimo: no, [[C++]] should generate a url that uses %2B%2B, not ++
    [14:23:49] 	ok :)
    [14:23:54] 	try it, though
    [14:24:03] 	yeah, it does.
    [14:24:07] 	http://techbasetest.kde.org/index.php?title=Policies
    [14:24:12] 	on that page, it does the %2B correctly
    [14:24:54] 	ok
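A quick plain-PHP illustration of why the literal plus breaks in a query string while the encoded form survives:

    <?php
    // In urlencoded query data, "+" means a space.
    var_dump( urldecode( 'C++' ) );      // string(3) "C  "  -- the pluses became spaces
    var_dump( urldecode( 'C%2B%2B' ) );  // string(3) "C++"
    var_dump( urlencode( 'C++' ) );      // string(7) "C%2B%2B" -- what generated links should emit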
    [14:25:14] 	hm... firefox does *not* convert + to %2B in the %s of a keyword-bookmark. that's a bug.
    [14:25:41] 	Duesentrieb: does for me.
    [14:25:51] 	I just did "wp whee++" and it sent me to http://en.wikipedia.org/wiki/Special:Search?search=whee%2B%2B
    [14:26:00] 	wp being my keyword.
    [14:26:01] 	using 2.0.0.11 here
    [14:26:19] 	same here, os x.
    [14:26:23] 	huh
    [14:26:30] 	linux here. maybe that's it :)
    [14:28:45] 	ok, all working now. i want to use this opportunity to thank the creators of mediawiki....we used pmwiki before but mediawiki has much more opportunities! i will bring up the topic about a donation to you guys the next time we have a community meeting.
    [14:29:56] 	okok :)
    [14:30:34] 	(and as we are using 12 instances of mediawiki i feel a little guilty :D )
    [14:30:57] 	in case the others don`t like it i will donate myself
    [14:44:49] 	err, thanks i guess.
    [14:44:52] 	hm, gone :/
    [14:47:40] 	heh
    [14:48:25] 	domas: I still don't get why wikipedia handles + properly (or wrong, in the sense of mindbender's argumentation)
    [14:48:38] 	err, morbus' argumentation
    [14:48:47] 	dunno ;-)
    [14:48:54] 	eh?
    [14:49:20] 	ah, i see.
    [14:49:28] 	http://en.wikipedia.org/wiki/C++ and http://en.wikipedia.org/wiki/C%2B%2B
    [14:49:30] 	both work.
    [14:49:52] 	yepp
    [14:49:53] danimo: + in url *parameters* is the same as %20. + in the path of a url is... unclear. apparently, apache can be made to handle it literally, at least in some configurations
    [14:50:34] Duesentrieb: so you think it's all a matter of apache's configuration? how come then that it used to work with prior mediawiki versions?
    [14:51:06] 	danimo: well, it sounds like you disabled PathInfo.
    [14:51:31] 	prior to MediaWiki 1.11.0, the code was tolerant to "code and wiki namespace in the same location". in 1.11.0, it is now less tolerant.
    [14:51:37] danimo: it may well be that it's more a bug than proper configuration, and you have to take special care in mediawiki to use it. no idea, really. maybe it's presented in the pathinfo in some odd form, or the path info gets processed differently now.
    [14:51:42] 	i really don't know :)
    [14:51:50] 	Morbus: what does pathInfo do anyways?
    [14:51:53] 	and, per one of the wiki configurations on clean urls, it instructs you to /turn off pathinfo/
    [14:51:59] 	danimo: makes index.php/Whee mean something.
    [14:52:12] 	ah
    [14:52:34] 	i don't think you can use literal + with pathinfo off. unless mod_rewrite has some special feature to turn that into %2B.
    [14:52:50] actually, i can imagine mediawiki running a patched version of mod_rewrite to do just that :)
    [14:53:00] 	but it's all speculation
    [14:53:03] 	maybe mark would know
    [14:53:24] 	Morbus: enabling pathinfo doesn't change anything
    [14:53:45] 	...and also: "recommended" doesn't mean that this is how wikipedia does it
    [14:53:48] 	it wouldn't, if your scriptpath/articlepath isn't set properly to match, as far as i know.
    [14:53:51] 	ah, ok
    [14:53:54] 	nevermind, it does
    [14:54:04] 	hm?
    [14:54:05] 	ok, ok, I see the problem I guess
    [14:54:36] 	prior to the upgrade "$wgArticlePath      = "$wgScript/$1";" was set
    [14:54:43] 	right, now edit a page.
    [14:54:46] with the upgrade, that resulted in an infinite redirect
    [14:54:57] 	if you're going to run up against a bug here, you'll see yourself trying to edit Index.php over and over again.
    [14:55:04] 	Morbus: yes
    [14:55:40] 	Morbus: so I had to use the title=$1 form
    [14:55:47] 	?title=$1 form even
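A minimal sketch of the two article-path styles danimo is switching between; whether the path-info form works depends on the Apache/PHP setup, so treat the values as illustrative only.

        # query-string form: works even when PATH_INFO handling is off
        $wgArticlePath = "$wgScript?title=$1";
        # "pretty" form: needs working path info (see $wgUsePathInfo)
        # $wgArticlePath = "$wgScript/$1";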
    [14:57:38] *danimo 	has to ask brion it seems
    [15:02:36] 	danimo: i suspect he'll tell you that making your wiki namespace in the root namespace is a bad idea ;)
    [15:03:07] 	Morbus: I don't see no namespaces
    [15:03:15] Morbus: I thought namespaces are those separated by :
    [15:03:27] 	think up a higher level.
    [15:03:37] 	site.com/wiki/ <-- "wiki" namespace.
    [15:03:43] 	site.com/example/ <-- "example" namespace.
    [15:03:49] 	site.com/ <-- "root" namespace.
    [15:03:57] 	in your case: site.com <-- "root" AND "wiki" namespace.
    [15:04:10] 	ok
    [15:05:07] 	is he even using short urls?
    [15:05:37] 	ah, he turned them off
    [15:05:42] 	no, doesn't look like it.
    [15:05:56] anyway, there's nothing wrong with putting index.php in /
    [15:06:08] 	the problem is when you try to put articles in /
    [15:06:26] 	yeah.
    [15:06:32] 	i missed he wasn't using shorts.
    [15:06:47] 	hrm. ideally, he should be able to do index.php/Development then...
    [15:07:13] 	hi Jack_Phoenix
    [15:07:41] 	danimo: are you using apache 2?
    [15:07:50] 	Morbus: sure
    [15:07:51] 	heya Nikerabbit :-)
    [15:08:08] 	Morbus: yes, index.php works
    [15:08:10] 	hrm. and before the upgrade, index.php/Whee was working fine?
    [15:08:20] 	even without index.php
    [15:08:26] 	never bothered checking with
    [15:08:33] 	ah, so you were using articles in /.
    [15:08:43] 	es
    [15:08:44] 	yes
    [15:08:49] 	Morbus: the problem is that different urls would break the current scheme
    [15:08:59] 	sure.
    [15:09:26] 	uhm. whats the syntax for [del] in mediawiki?
    [15:09:38] 	like [del]  [/del] in most forums
    [15:10:26] 	 
    [15:10:47] 	thanks
    [15:10:49] 	
    [15:10:51] 	mmm
    [15:10:59] 	danimo: if you had articles like this: "http://example.com/Main_Page" then your config was screwed to start with and you need to change your URL scheme
    [15:12:01] 	!rewriteproblem
    [15:12:01] --mwbot--	1) Do not put the files into the document root; 2) Do not map the pages into the document root; 3) Use different paths for real files and virtual pages; 4) Do not set a RewriteBase; 5) Put all rules into the .htaccess file in the document root.
    [15:12:16] 	covers 90% :)
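A minimal LocalSettings.php sketch of a layout that satisfies mwbot's rules above: the MediaWiki files live under a real path (/w here, purely as an example) and articles are served from a separate virtual path, so nothing real ever sits under /wiki/. The matching Apache line is shown only as a comment, and the paths are assumptions rather than a verified config.

        $wgScriptPath  = "/w";          # real files (index.php, skins, images) live here
        $wgArticlePath = "/wiki/$1";    # virtual page path -- no real files under /wiki/
        $wgUsePathInfo = true;
        # Apache side (illustrative): Alias /wiki /var/www/w/index.php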
    [15:12:49] 	:D
    [15:13:06] 	Duesentrieb: it would be so nice if someone updated the short urls page with a hard-ass tact.
    [15:13:18] 	Duesentrieb: that page harms more than helps nowadays.
    [15:13:23] 	as it encourages bad configs.
    [15:13:30] 	Morbus: i used to do that and 5 minutes later some idiot would come and put back the broken stuff, so i gave up
    [15:13:39] 	heh
    [15:13:45] 	i suspected as much, actually.
    [15:13:56] 	flyingparchment, big warning and start banning!
    [15:14:07] 	Maybe I should look for a last good revision and enforce it.
    [15:14:14] 	This seems to be one of the biggest complaints.
    [15:14:18] 	i like the idea of protecting it
    [15:14:21] 	well, i'd heartily explain /why/.
    [15:14:31] 	it's not enough for /me/ just to know how to do it.
    [15:14:36] 	i want to know why it's this way.
    [15:14:42] 	it's basically being vandalised and causing huge problems for people trying to set up mediawiki
    [15:14:44] 	(I know why, I'm just saying)
    [15:16:06] 	ideally, i think half the configs that are listed up there are trial and errors, and people don't know /why/ they work
    [15:16:12] 	they just know it "worked for me!".
    [15:16:23] 	Morbus: the opensuse guys will prolly run into similar kinds of problems it seems.. once they upgrade from 1.5.8 :)
    [15:16:49] 	lol.
    [15:16:55] 	heheh. definitely :D
    [15:17:13] 	HELP: my mediawiki is open to everybody, only registered users should be allowed to write articles but after upgrading suddenly everybody can 
    [15:17:17] is that even still in security maintenance?
    [15:17:26] 	no, 1.5 is completely unsupported
    [15:17:28] 	danimo, no.
    [15:17:30] 	of course, i'd support root article namespace for wiki.sitename.com, but only if the install dir was outside of root.
    [15:17:49] 	!prevacc | Alex_ 
    [15:17:49] --mwbot--	Alex_ : http://www.mediawiki.org/wiki/Manual:Preventing_access
    [15:17:50] 	Morbus: if you really want to do that, you need two domains, one for index.php and the static files, and one for the articles
    [15:17:52] 	Alex_: you need to look into wgRoot.
    [15:17:56] 	er. 
    [15:17:59] 	wgGroupPermissions.
    [15:18:44] 	flyingparchment: yeah, wouldn't really work.
    [15:18:48] 	flyingparchment, robots.txt and favicon.ico won't work so well with that.  (Well, maybe robots.txt: isn't that shared for, e.g., all wikipedia.org?)
    [15:19:06] 	Simetrical: yes, but we don't put articles in the root, so /robots.txt is accessible
    [15:19:14] 	favicon you can do by hacking the skin
    [15:19:20] *Morbus 	has always done /wiki, and only recently has done /wiki and /w.
    [15:19:47] 	what do i do wrong if changePassword.php doesnt work? ;-)
    [15:19:59] 	run around screaming.
    [15:20:03] 	that's what you could do wrong.
    [15:20:05] 	i user the username out of the table "user" 
    [15:20:21] 	*use
    [15:20:44] 	but it tells me: No such user
    [15:21:00] 	AdminSettings.php works
    [15:21:53] 	flyingparchment, well, you don't, but I thought we were discussing doing so.
    [15:22:13] 	Simetrical: i don't see how robots.txt working on wikipedia.org is related to it working on sites with articles in /
    [15:22:35] 	haha ok...someone should hit me
    [15:22:45] 	i was in the wrong mediawiki
    [15:22:46] 	;-)
    [15:22:56] 	i should worry if this user would have existed there
    [15:23:16] *Simetrical 	goes to look up details about robots.txt
    [15:24:17] how do i set wgGroupPermissions so that only registered and logged in users are allowed to do anything
    [15:24:34] 	Alex_: its in the FAQ
    [15:25:11] 	how can I display the date/time an article was modified?
    [15:25:33] 	DigitallyBorn: besides the one shown in the footer?
    [15:25:39] 	yes
    [15:25:49] 	http://en.wikipedia.org/wiki/Help:Magic_words
    [15:25:56] 	look for REVISION...
    [15:25:58] do i have to set everything by hand
    [15:26:01] 	?
    [15:26:02] 	Morbus: Thanks .. I was looking for that
    [15:26:23] 	Alex, look for "Restrict editing by all non-sysop users" on http://www.mediawiki.org/wiki/Manual:Preventing_access
    [15:26:23] 	i mean restrict editing, viewing etc.
    [15:26:26] 	modify that to fit.
    [15:26:48] 	if you want to stop viewing too, do the same only replace 'edit' with 'view'.
    [15:27:04] 	sorry, 'read'.
    [15:27:05] 	not 'view'.
    [15:27:51] 	thank you very much morbus
    [15:29:09] 	Morbus: I still don't quite see your issue with root name spaces
    [15:29:14] 	Morbus: sorry :)
    [15:29:34] 	danimo: your articles should not be in the same location as real life files.
    [15:29:57] 	k everything is fixed
    [15:29:59] 	danimo: site.com/ArticleName is a virtual article, in the same "location" (namespace) as literal files README.txt, style.css, etc.
    [15:30:16] 	hi ; is it possible to oversight an image using the oversight extension ?
    [15:30:23] 	sorry for rushing in here but i was panicking
    [15:30:36] 	no worries.
    [15:31:08] 	Morbus: yeah, but mod_rewrite takes care of that
    [15:31:19] 	Morbus: (yes, I am aware of potential conflicts)
    [15:31:40] 	Morbus: and I'm all for fixing it :)
    [15:32:46] 	that is, finding a way to make it possible
    [15:32:55] 	danimo: whether mod_rewrite can or not (and it certainly can with -f and -d, sure), that's not the party-line at MediaWiki HQ ;)
    [15:33:11] 	Morbus: hehe
    [15:33:18] 	Morbus: anyway, gtg, thanks for the insight
    [15:46:14] 	03rotem * r29426 10/trunk/extensions/ExpandTemplates/ExpandTemplates.i18n.php: Update for he.
    [15:50:09] 	03siebrand * r29418 10/trunk/extensions/ (9 files in 2 dirs): 
    [15:50:09] 	* use wfLoadExtensionMessages for AntiSpoof
    [15:50:09] 	* rename AntiSpoof i18n file (replace "_" with ".")
    [15:50:09] 	* add version for AntiSpoof
    [15:50:09] 	* whitespace fixes
    [15:50:10] 	* update Translate extension
    [15:50:40] *siebrand 	greets CIA-40 back.
    [15:57:06] 	03grondin * r29420 10/trunk/extensions/ (3 files in 2 dirs): Optimisation of the UserMerge extension internationalization file for export
    [15:58:56] 	how do you suppress the category list at the bottom of a page? 
    [15:59:07] 	frieze, you . . . don't categorize it?
    [15:59:18] 	Or, you use some silly CSS trick?
    [15:59:31] 	is it possible to pass arrays to templates? i want to make a template for a "table of contents" with variable length
    [15:59:42] 	ah, thought there might be some magic word type deal to do that
    [15:59:58] 	didn't see it on the list, but I've missed stuff before
    [16:00:10] 	yeah, CSS hide would be easiest, i think.
    [16:00:15] 	don't know of any magic word.
    [16:00:44] 	Alp-, no arrays in wikitext.
    [16:02:13] 	Simetrical: maybe another solution to that?
    [16:02:33] 	Alp-, what do you want the template to do, exactly?
    [16:03:03] 	Simetrical: sec, i'll paste an example
    [16:04:49] 	Simetrical: http://pastie.caboo.se/136713
    [16:06:21] 	Alp-, can't be done, except by imposing a max number of parameters and doing something like {{#if:{{{param27|}}}|# {{{param27}}} }}\n{{#if:{{{param28|}}}|...
    [16:06:30] 	No for loops in wikitext.  This isn't a programming language.
    [16:06:46] 	. . . or technically it probably is, really, except for the memory limit.
    [16:09:04] 	ok
    [16:09:12] 	maybe there are extensions for that?
    [16:09:22] 	i dont want a max number
    [16:10:35] 	i don't know of anything that does loops.
    [16:14:25] 	http://www.mediawiki.org/wiki/Extension:LoopFunctions
    [16:20:47] 	You could do that, and hope it doesn't slow your wiki to a crawl.  Make sure there are limits on the number of iterations possible, otherwise someone can add for( i=0; i < 1000000000; i++ ), or while( true ), or equivalent.
    [16:22:54] that's right
    [16:23:01] but this is an internal wiki only
    [16:23:04] 	for 5 people
    [16:32:03] 	03catrope * r29427 10/trunk/phase3/includes/api/ (ApiProtect.php ApiQueryInfo.php): 
    [16:32:03] 	* (bug 12543) API should support new protected titles system
    [16:32:03] 	* Changing ApiProtect to return ISO 8601 timestamps
    [16:32:03] 	* This doesn't really need a RELEASE-NOTES entry, as the protected titles system is already mentioned in RELEASE-NOTES
    [16:35:54] 	Hello
    [16:37:00] 	03catrope * r29428 10/trunk/phase3/includes/api/ApiProtect.php: Changing name of error code to something making slightly more sense
    [16:37:33] 	in mediawiki, can you make an image a link?
    [16:38:44] 	Simetrical: "Per standard maximum number of iterations for both for and foreach is 100 cycles per session."
    [16:39:03] 	!imagelink | Dotel 
    [16:39:03] --mwbot--	Dotel : Image linking is not directly supported in MediaWiki at the present time for historical reasons.  You can use the ImageMap extension or Template:Click from Wikipedia.
    [16:39:36] 	Alp-, of course then it might give an error on one page view but not another, depending on the exact configuration of templates and conditional paths.  :)
    [16:39:46] thanks mwbot i'll take a look at those
    [16:39:54] 	Someone adds a few extra pages and some more #ifexists return true . . . whoops, error.
    [16:40:55] 	I've seen various google maps extensions ... but is there a way to embed a map with the markers it has on it at the google end?  rather than adding them to the wiki page ...
    [16:42:31] 	03catrope * r29429 10/trunk/phase3/includes/api/ApiQueryInfo.php: Returning protections more consistently
    [16:43:42] anyone have any experience of embedding google maps in mediawiki?
    [16:44:31] 	Generally you won't get so much help here with extensions not written by MediaWiki developers/used on Wikimedia sites/etc.  You're welcome to try, though.
    [16:44:42] 	You might be better off asking the ones who wrote the extensions.
    [16:44:46] 	alternatively, how do you embed raw html into a mediawiki page?
    [16:44:48] 	that would do fine
    [16:45:13] 	(i could just use the google maps embed code)
    [16:45:18] 	!html | bakert 
    [16:45:18] --mwbot--	bakert : For allowing any and all HTML, see . This is of course VERY DANGEROUS. Safer options include ,  and .
    [16:45:47] 	magic .. thanks Simetrical 
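For completeness, the "VERY DANGEROUS" route mwbot mentions comes down to a single LocalSettings.php switch; a sketch, only sensible on a wiki where every editor is trusted.

        $wgRawHtml = true;   # allows <html>...</html> blocks in wikitext, e.g. a Google Maps embed;
                             # any editor can then inject arbitrary script, so never enable it on a public wiki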
    [16:50:09] 	is there any way to make a linebreak in wikitext? i need that for my loop. example: http://pastie.caboo.se/136748
    [16:50:46] 	
    [16:50:56] flyingparchment: that doesnt work
    [16:51:02] that gives me another output, sec
    [16:51:16] <br /> is how you create a newline in wikitext, i guess your question is more complicated :)
    [16:52:27] flyingparchment: http://pastie.caboo.se/136748
    [16:52:29] see edit
    [16:54:33] !extensions
    [16:54:33] --mwbot-- MediaWiki has been built so it can easily be customized by adding extensions. This is usually a simple process. See for instructions to install extensions, as well as for writing them. See for an overview of known extensions.
    [16:54:46] :)
    [16:54:49] i like mwbot
    [16:55:08] whose baby is he? what's he written in?
    [16:56:59] don't worry - he told me :)
    [16:59:40] hmm i dont get it :/
    [17:06:07] is it possible to create a sortable table where the last row is not sorted? (eg for totals)
    [17:09:23] is there a way to add a header to all wiki pages? Our center has a banner/drop down menu navigation bar that goes above all the other pages, and would be nice to have that for the wiki as well...
    [17:11:25] yes, but no.
    [17:11:37] there's sitenotice (check the FAQ), but it's wiki-syntax only.
    [17:11:46] no js or raw html. if you need js/raw html, you'd have to modify the skin.
    [17:12:20] meh, k, thanks
    [17:12:23] if you're using the default Monobook, modify skins/Monobook.php.
    [17:12:50] MonoBook.php, rather.
    [17:25:52] 03siebrand * r29430 10/trunk/phase3/languages/messages/ (12 files): Localisation updates for core messages from Betawiki (2008-01-08 17:05 CET)
    [17:26:36] 03siebrand * r29431 10/trunk/extensions/ (38 files in 35 dirs): Localisation updates for extension messages from Betawiki (2008-01-08 17:05 CET)
    [17:27:54] 03midom * r29432 10/trunk/debs/memcached/ (16 files in 2 dirs): import memcached package, with ulimit -n tweak
    [17:52:34] 03vasilievvv * r29433 10/trunk/tools/planet/ru/config.ini: Update Russian planet config
    [17:53:21] *VasilievVV is ready to commit new features he made during this week
    [17:53:23] 03yaron * r29434 10/trunk/extensions/SemanticDrilldown/languages/ (7 files): Changed format for usage by Betawiki, the translation wiki
    [17:58:42] how can I increase the maximum value of the "limit" parameter
    [17:58:45] ?
    [17:59:01] the max is 5000... I want at least 20000
    [17:59:57] how to do it?
    [18:01:12] 03vasilievvv * r29435 10/trunk/extensions/TitleBlacklist/ (TitleBlacklist.i18n.php TitleBlacklist.list.php):
    [18:01:12] TitleBlacklist:
    [18:01:12] * Add title whitelist
    [18:01:12] * Add profiling
    [18:01:12] * Make cache version check really work
    [18:10:45] 03yaron * r29436 10/trunk/extensions/SemanticDrilldown/skins/SD_main.css: New look - categories list is in a right-hand-side box
    [18:11:11] 03vasilievvv * r29437 10/trunk/phase3/ (7 files in 3 dirs): API: add action=logout
    [18:12:21] hello again :)
    [18:12:31] brion: heya
    [18:12:48] yo yo
    [18:13:03] brion: we were wondering today: should mediawiki support root namespaces (foo.com/Main_Page) ?
    [18:13:28] danimo: i recommend against it, but there's plenty of sample rewrite rules about for it
    [18:13:41] note that 1.11.0 was a bit flaky in this configuration, though
    [18:13:51] brion: yepp, that's the point
    [18:14:24] brion: with 1.11, you cannot use pathinfo
    [18:14:38] brion: which in turn means I can't use the literal plus in titles
    [18:14:46] right, upgrade or apply the patch for that
    [18:14:52] brion: which is a pity for our case, being based on a language called C++ ;)
    [18:15:02] brion: I did upgrade to svn
    [18:15:41] 03yaron * r29438 10/trunk/extensions/SemanticDrilldown/specials/ (SD_BrowseData.php SD_ViewData.php): Moved 'ViewData' to 'BrowseData'
    [18:15:48] (trunk)
    [18:15:52] 03aaron * r29413 10/trunk/extensions/ConfirmAccount/ (ConfirmAccount.i18n.php ConfirmAccount_body.php): List user rights
    [18:17:31] 03siebrand * r29439 10/trunk/extensions/SemanticDrilldown/languages/SD_Messages.php: Whitespace fixes
    [18:18:00] 03yaron * r29440 10/trunk/extensions/SemanticDrilldown/ (5 files in 3 dirs):
    [18:18:00] Version 0.3 - new CSS look for header, 'View data' changed to 'Browse data',
    [18:18:00] language messages switched to Betawiki format, better handling of
    [18:18:00] property and namespace aliases
    [18:20:16] Is there a page that I can view that will list all of the pages in a namespace?
    [18:21:12] 03siebrand * r29441 10/trunk/extensions/Translate/ (MessageGroups.php Translate.php): Add support for SemanticDrilldown
    [18:21:18] 03vasilievvv * r29442 10/trunk/phase3/includes/BagOStuff.php: Introduce BagOStuff::keys() method for debugging/etc. purposes
    [18:22:42] brion: so is there anything in particular I should watch out for?
    [18:22:52] *shrug*
    [18:22:58] 03vasilievvv * r29443 10/trunk/phase3/docs/globals.txt: Add some variables to globals.txt
    [18:27:27] brion: because you mentioned a patch
    [18:28:15] if you're running current svn trunk, no need to patch
    [18:31:59] question: why does MediaWiki not handle its namespaces in the database?
    [18:33:06] funkju, because they weren't originally set up that way. If anyone ever makes a namespace manager, they'll have to be put in the database.
    [18:33:24] Unless we want to parse and modify LocalSettings.php, which we don't.
    [18:34:20] 03vasilievvv * r29444 10/trunk/ (7 files in 3 dirs):
    [18:34:20] * Add exception hooks to output pretty messages
    [18:34:20] * Also, there's an example of such extension
    [18:34:42] It doesn't seem like it would be hard to just have the namespaces stored and pulled from the db
    [18:34:50] do you know why they didn't set it up that way originally?
    [18:35:51] funkju, why should they have?
    [18:36:12] It's in PHP variables in the config file like all configuration settings.
    [18:36:15] I'd appreciate better namespace management from the UI, too.
    [18:37:38] Simetrical, it isn't a config setting, really - it's more like a super-category
    [18:37:53] You can look at it either way.
    [18:38:09] probably
    [18:38:18] i'm going to change my wiki to use the database
    [18:38:24] you're not meant to add namespaces all willy-nilly
    [18:38:35] so there was never any need to put it in the database
    [18:39:17] 03siebrand * r29445 10/trunk/extensions/SemanticDrilldown/languages/SD_Messages.php: Localisation updates for extension messages from Betawiki (yep, we held a race ;))
    [18:51:47] wow, they keep pushing up the base config (and price) of the mac pro :P
    [18:52:02] 8-core for $2799
    [18:52:36] yeah.
    [18:52:38] noticed.
    [18:52:50] but, it would suggest that they have more hardware announcements at the Pro next week.
    [18:52:55] since they're announcing these early.
    [18:53:14] i did a config build, though, and the only thing i wanted to increase was the HD - from 320 to 750, brought the total to $3049.
    [18:53:21] i'll probably pick up one late this quarter.
    [18:54:52] i've got no excuse to upgrade; my macbook pro and imac are doing me fine :)
    [18:57:52] my g5 is about 4 years old. time for me to get a new one.
    [18:58:58] used to have a lovely g5... replaced it with a macbook when i moved out of california
    [18:59:00] *Simetrical sneers at the Mac users, for the principle of the thing
    [18:59:20] *brion installs FreeBSD on Simetrical
    [18:59:24] heh
    [18:59:25] :(
    [18:59:36] brion: yeah, i have an intel mbp too.
    [18:59:44] "FreeBSD: enough like linux to have no benefits, and enough different to be a pain in the ass"
    [18:59:50] Simetrical: re the separate 'note' like functionality for (bug #6271), how are you dealing with the autonumbering issue?
    [18:59:50] but that's used differently than my desktop box.
    [19:00:19] at least freebsd taught me how to turn on colored ls on macos :)
    [19:00:31] Mahlzahn, well, I haven't really started on an implementation for current trunk. Good question, I hadn't really thought about it.
    [19:01:03] The initial implementation may well just mix the numbers together, so it will only be suitable for footnotes per section.
    [19:01:22] Allowing different display is a separate issue, implementation-wise, although most of the uses of this one depend on it.
    [19:01:33] see the second last point of the crosspost I added
    [19:01:38] CSS question here on how to force an inline font change. But this code ain't workin. Any thoughts??? Symbol Font \ should render as a three-dots symbol
    [19:01:42] I would like a general sort of way of doing that, that would also allow Harvard-style references.
    [19:01:52] yes, indeed.
    [19:01:59] Steev43230, do you have Symbol installed on your computer?
    [19:02:03] I do.
    [19:02:44] brion: why would anyone want an 8 core desktop?
    [19:02:45] In my former website I passed it as an image. I'd like to avoid doing that on the wiki
    [19:02:55] flyingparchment, graphics processing, probably.
    [19:03:07] *nod*
    [19:03:07] A lot of those apps can benefit heavily from parallelism.
    [19:03:32] Steev43230, web fonts are completely unreliable. You have no guarantee that any reader will have the font installed.
    [19:03:33] flyingparchment: well, i want one because they last me longer. i keep more apps open, more desktops, more dual displays.
    [19:03:43] apps are getting slower, not faster.
    [19:03:44] Yes, I know.
    [19:03:45] me too, but i only use one app at a time
    [19:03:51] Morbus, eight cores is dramatically excessive for just multitasking.
    [19:03:58] don't have one mouse for each hand yet :)
    [19:04:02] Be nice to TeX it but the server doesn't have TeX installed.
    [19:04:03] Simetrical: sure.
    [19:04:08] Two cores is usually enough there, four is overkill unless you run tons of computationally-intensive background processes.
    [19:04:21] Simetrical: I'm running four now, and my computer is what i'd consider pretty slow.
    [19:04:33] Morbus, and is CPU the bottleneck?
    [19:04:36] i could do with one core so eve doesn't take up all the cpu, but other than that, when i'm not using something, it's not using any cpu
    [19:04:41] no idea. don't much care. throw money at the problem.
    [19:04:51] :)
    [19:05:02] Simetrical, maybe it would be easier to install this one character as a template, then call it that way. What do you think about that?
    [19:05:12] Steev43230, maybe.
    [19:05:19] Morbus: you need to understand which problem to throw money at first
    [19:05:29] flyingparchment: /me sticks fingers in ears. la la la.
    [19:05:30] Morbus: "My car is too slow. I have lots of money, so I will buy two cars to go faster."
    [19:05:30] Still passing it as an image though.
    [19:05:40] hehe
    [19:06:15] Trouble is, I cannot upload any images right now - the server is being configured and I have no idea when it will be finished.
    [19:06:26] Is my CSS line "good"?
    [19:06:46] if you're making multiple trips shipping goods, two cars *will* double your speed. ;)
    [19:07:10] not if you only have one driver ;P
    [19:08:09] tow cable? :D
    [19:12:29] 03siebrand * r29446 10/trunk/extensions/ (5 files in 2 dirs):
    [19:12:29] * use wfLoadExtensionMessages for AjaxShowEditors
    [19:12:29] * add version for AjaxShowEditors
    [19:12:29] * update Translate extension
    [19:12:41] brion: (given your points in Wikipedia:HiddenStructure): what do you think of AllisonW's en:MediaWiki:Common.css use of display:none/speak:none to suppress redlinks (see also .hidden-redlink in meta's MediaWiki:Common.css), so avoiding the use of many instances of #ifexist?
    [19:13:21] huh? red links are supposed to always be visible
    [19:13:27] otherwise your text turns into illegible shit
    [19:13:37] and your user interface stops working
    [19:14:04] not in articles. When building lists, 'xyz exists' sort of thing.
    [19:14:39] for example in the automated listing of talk page archives
    [19:14:43] It's fairly ridiculous, to be honest.
    [19:14:50] I mean, an okay workaround.
    [19:15:00] But the software is already running a query to check for page existence there.
    [19:15:22] It shouldn't be so expensive to have a conditional based on that.
    [19:15:24] what are the docbook features/options/add-ons for mediawiki, if any?
    [19:15:29] Docbook?
    [19:16:28] www.docbook.org
    [19:16:50] Simetrical: yes, rel2abs-like, but returning nothing if the result doesn't also exist.
    [19:16:59] good evening^^
    [19:17:44] anybody here who's able to help me? I'm too stupid
    [19:17:54] Of course, I don't think the #ifexist actually *would* run another query if the redlink already established the page's nonexistence.
    [19:18:01] The result would be cached.
    [19:18:05] I'm trying to include more languages in my wiki - won't work *argh*
    [19:19:13] Simetrical: literally thousands of pages use archivebox, which each call -- all told -- 137 instances of #ifexist.
    [19:19:59] Well, redlinks are handled in a LinkBatch, now that I think of it, aren't they? So it's not one query per link.
    [19:20:05] Are #ifexist calls batched?
    [19:20:24] Anyway, sure, use whatever hacky CSS workaround that works for you. :)
    [19:20:30] damn I hate apple's X server, always crashing
    [19:21:40] anyway, are there any docbook import/export routines or features that are in use with mediawiki?
    [19:21:56] yes, it's terribly hacky. A hack for a hack really, since automated archiving should also cause the list of archives to be updated, which would really avoid all the #ifexist dependencies for archives
    [19:23:46] another possibility would be an unformatted (not-tabled) output of Special:Prefixindex, with the call to Prefixindex then being transcluded from the archivebox.
    [19:24:41] Mahlzahn, seriously, do people manually archive anymore? And if they do, why don't they just spend the extra two seconds to update the infobox thing?
    [19:25:19] not them. The automated ones. The bot doesn't update the list though.
    [19:26:35] Why not?
    [19:26:41] That sounds fairly trivial to implement.
    [19:26:51] Much easier than all this hocus-pocus with ifexist/CSS/redlinks/etc.
    [19:27:15] so I'd expect too. But not the way it's done.
    [19:29:58] Howdy all. I'm working on trying to make a file-icon template. Working on the switch statement for selecting which icon based on which file extension is parsed. I borrowed some of this code and I just can't seem to make it work. I was wondering if any of you had some suggestions or tips. Thanks. http://rafb.net/p/QRR8Qg84.html
    [19:31:09] http://eyeworry.com/stuff/wiki_langs.jpg
    [19:31:18] What do I need to get this^^
    [19:31:19] ?¿?
    [19:32:15] ShakataGaNai: you need to install http://meta.wikimedia.org/wiki/Help:ParserFunctions and http://www.mediawiki.org/wiki/Extension:StringFunctions
    [19:33:02] Simetrical: See auto= parameter in source for {{archives}} {{archive box}} {{archive box collapsible}}
    [19:33:12] ialex: Ah, interesting. I had the prior, but not the latter
    [19:34:23] mhm :(
    [19:35:24] No answer for my question?
    [19:35:25] :(
    [19:36:05] 03siebrand * r29447 10/trunk/extensions/ (3 files in 2 dirs):
    [19:36:05] * use wfLoadExtensionMessages for BackAndForth
    [19:36:05] * update structure in BackAndForth.php
    [19:36:05] * add version and description in Extension credits for BackAndForth
    [19:36:05] * update Translate extension
    [19:36:41] 03siebrand * r29448 10/trunk/extensions/Asksql/Asksql.i18n.php: Localisation updates. Update indentation and author credits
    [19:37:51] eYeWoRRy: http://www.mediawiki.org/wiki/Template:Languages
    [19:38:07] 03siebrand * r29449 10/trunk/extensions/Translate/MessageGroups.php: Bugfix
    [19:38:14] I read it^^ but it's not woekin
    [19:38:20] something is missing @ ialex
    [19:38:25] workin^^
    [19:39:15] http://www.mediawiki.org/wiki/Template:Languages/Lang
    [19:39:18] ialex: You are fantastic. Extension installed, works like a charm. Thanks much.
    [19:40:12] ShakataGaNai: no problem
    [19:40:41] ialex: I read it 100 times - it's not working - something is wrong - don't know what
    [19:41:30] eYeWoRRy: you need also http://www.mediawiki.org/wiki/Template:Languages/Lang
    [19:41:46] ^^
    [19:42:01] with which input?
    [19:43:49] this sub-template is automatically included by {{Languages}}
    [19:44:05] you don't need to use it directly
    [19:46:40] I don't understand that manual
    [19:47:00] 03thomasv * r29450 10/trunk/extensions/ProofreadPage/ProofreadPage.i18n.php: reversing 1 and 2 on de
    [19:48:54] I'll give you a screen of mine
    [19:50:08] http://eyeworry.com/stuff/wiki_lang1.jpg
    [19:50:09] ^^
    [19:50:12] @ ialex
    [19:50:43] ah, this : install http://meta.wikimedia.org/wiki/Help:ParserFunctions
    [19:51:09] Mhm Okay Thank You - I'll try it
    [19:51:29] 03yaron * r29451 10/trunk/extensions/SemanticDrilldown/includes/SD_GlobalFunctions.php: Fixes for SMW 1.0
    [19:51:40] how do I make a word point to a page which isn't the same word
    [19:51:57] [[ centos 4.1 ]] but points to a page called centos not centos 4.1
    [19:53:07] proprietarystink: use [[centos|centos 4.1]]
    [19:53:35] cool thanks
    [19:54:55] how can I link to a section within that page?
    [19:55:12] similar to html -> blah.html#wherever
    [19:55:49] proprietarystink: yes, use [[page#section|text]]
    [19:56:14] or just [[page#section]]
    [19:59:01] mhm - Okay now No symbols ;)
    [19:59:35] yay, sounds like the hd format war's winding down
    [20:00:42] ialex: and now? To get more languages?
    [20:01:28] wow how did you do colors?
    [20:01:49] eYeWoRRy: hmm?
    [20:02:31] ialex: I tried: /index.php/Testseite/de but {{Language}} shows only English
    [20:02:37] any other changes I need
    [20:02:59] eYeWoRRy: purge the cache of the page
    [20:03:04] mhm a sec
    [20:03:27] There's another error
    [20:03:28] Laughing Out Loud
    [20:03:56] Deutsch - is shown - but English not clickable *lol*
    [20:04:21] lol
    [20:04:22] ?
    [20:04:27] how do you do those colors
    [20:04:52] It's a script...^^
    [20:05:12] google - IRC Scripts
    [20:06:12] 03yaron * r29452 10/trunk/extensions/SemanticDrilldown/includes/SD_GlobalFunctions.php: Another SMW 1.0 fix
    [20:07:29] Hi
    [20:08:30] Interesting, Ubuntu desktop has /tmp ext3 by default, not tmpfs.
    [20:08:30] I have a little PHP script on my server (csv2wp.php) and I want it to be accessible only to registered Mediawiki users. Is it possible to do this? I don't want to get spam or "leechers"
    [20:09:14] eYeWoRRy: is Deutsch clickable?
    [20:09:48] Yes^^ it's clickable - but on the side Testseite/de - only English bold and not clickable - and no other link
    [20:10:01] Same stuff with Testseite/fr
    [20:10:02] ^^
    [20:10:05] can I rename a page?
    [20:10:14] from Centos to CentOS ?
    [20:10:43] proprietarystink: you should see a "move" button above every page
    [20:11:34] eYeWoRRy: do you have subpages enabled, so do you see a little "< [[testseite]]" under the title?
    [20:11:58] probably not
    [20:12:02] nope^^ No "< [[testseite]]" under the title
    [20:12:11] but i saw it toda
    [20:12:14] few hours ago
    [20:12:17] today
    [20:12:25] in the same namespace?
    [20:13:25] How can I get the "requested_articles" functionality in my mediawiki install?
    [20:13:27] how do I remove the move though, I just want to rename the page
    [20:13:34] now there's a redirect
    [20:13:36] on an other side @ SPQ
    [20:13:52] and how can I enable subpages?
    [20:14:07] eYeWoRRy: set the following in your LocalSettings.php:
    [20:14:23] $wgNamespacesWithSubpages[NS_MAIN] = true;
    [20:14:58] or how can I just remove a page?
    [20:15:22] proprietarystink: rename is not possible, it's the same as moving. you can, though, delete the redirect by clicking on "delete" (if you have admin rights)
    [20:16:36] it works ;)
    [20:16:40] Thank You a lot Robin
    [20:16:56] And now I'll try it at MediaWiki 1.6.0
    [20:16:57] ;)
    [20:17:08] Thank You a lot guys
    [20:17:16] which version do you use?
    [20:18:56] How can I get the "requested_articles" functionality in my mediawiki install? Wikipedia does it. Is it an extension/setting? I'm using 1.4.3
    [20:19:21] it's just a list maintained by hand, cselph
    [20:19:49] brion: ok, but is there an easy way to add that link to the page not found page?
    [20:20:14] eYeWoRRy: which version do you use? because you said "I'll try it at MediaWiki 1.6.0" but 1.11 would be better
    [20:20:24] cselph: go to Special:Allmessages, you'll find the complete list of all user-interface messages
    [20:20:30] find the one you want and edit it from there
    [20:20:35] brion: thank you
    [20:20:44] note also that 1.4.3 is wildly obsolete and probably full of security bugs :)
    [20:20:50] 1.11 on XAMPP and 1.6 on webspace - because we had a phpBB crash last night with php5 on the server
    [20:21:18] 03yaron * r29453 10/trunk/extensions/SemanticDrilldown/skins/SD_main.css: Slightly darker gray
    [20:24:03] brion: i've found it but it says "This page has been locked to prevent editing."
    [20:24:47] is it a privileges thing?
    [20:27:44] my user is "Admin, Sysop"
    [20:28:55] ok, damn error with alias, help plz
    [20:33:44] cselph: that's a warning that even though you can edit pages, other people can't
    [20:33:53] (if you actually can't edit, then you're not a sysop, even if you think you are)
    [20:38:27] flyingparchment, or the permissions have been modified in a strange way.
    [20:39:40] (in Russian) tell me how I can upload a video to the site
    [20:41:39] Mracobes: better join #wikipedia-ru and ask there
    [20:42:21] thanks
    [20:43:53] flyingparchment: It lists me as a sysop, but when i go to User Rights management, it says i'm not a sysop and i can't add myself, and neither can others
    [20:44:48] cselph, check LocalSettings.php for modifications to $wgGroupPermissions.
    [20:45:17] cselph, you said you're an "Admin" too? That's not a default usergroup. (Or is that the username?)
    [20:45:40] sysop may be "Admin" in some localizations
    [20:46:19] cselph: note that sysops can not assign user groups (rights). only bureaucrats can
    [20:46:47] the way to manually promote someone to some status is described in the faq
    [20:48:01] mhm
    [20:48:04] not working in 1.6.10
    [20:48:06] *argh*
    [20:50:46] Duesentrieb: got it. i had to take out a row from the database
    [20:51:01] o_O
    [20:51:05] take one out?
    [20:51:14] add one, ok, but take one out?...
    [20:53:46] hehe, it was my fault. earlier I didn't know how to make myself an admin so i tried to do it manually in the database. apparently there's more to permissions than the user_groups table
    [20:54:43] i like how he says this at the end
    [20:54:55] Hi all! Hi Duesentrieb! Hi flyingparchment! The Jira people opened http://jira.atlassian.com/browse/CONF-10401 about multiple encoding / decoding
    [20:56:05] gangleri: how does this concern us?
    [20:57:05] as soon as you have some real UTF-8 pages at your confluence it *may* concern you
    [20:57:06] cselph: no, there isn't, really. well, that table assigns people to groups. the wgGroupPermissions config variable assigns permissions to groups.
    [20:57:42] gangleri: we don't run under Resin
    [20:57:51] gangleri: shouldn't. because our confluence has the URIEncoding already set to utf-8. i didn't try it though. feel free to play with it.
    [20:57:54] ok so don't mind
    [20:58:35] thanks
    [20:59:12] Duesentrieb: Millosh tried yesterday to create an account using real UTF-8 characters; he could not and got an error message
    [20:59:27] 03yaron * r29454 10/trunk/extensions/SemanticDrilldown/skins/SD_main.css: Improved padding
    [21:02:11] can I mix named/numbered parameters in a template call if the named params are last? like {{NAME|a|b|c|display=none}}
    [21:02:53] i think yes
    [21:02:54] i'd like a template that can accept a maximum of 10 numbered params, but be tweaked by named params.
    [21:04:35] ok, good. i designed what i wanted on a piece of paper, but won't be able to implement for a few hours.
    [21:04:38] wanted to know if i should redesign ;)
    [21:20:19] mhm
    [21:20:21] :(
    [21:20:34] not working
    [21:20:35] *cry*
    [21:20:36] http://skichallenge.eyeworry.com/wiki/en/index.php/Testing/de
    [21:21:16] *grrr*
    [21:26:21] Hi millosh!
    [21:33:18] 03yaron * r29455 10/trunk/extensions/SemanticDrilldown/languages/SD_LanguageEn.php: Re-added namespace values
    [21:33:35] meoweo
    [21:35:42] anyone know a way to eliminate the login successful screen or replace it with a redirect?
    [21:35:56] putting a redirect in allmessages does not appear to work
    [21:39:10] ok let's svn up
    [21:44:56] 03catrope * r29456 10/trunk/phase3/includes/api/ApiQueryInfo.php: API: Temporary fix for broken XML rendering; XML formatter apparently doesn't like array(null). Will attempt to fix the real issue later
    [22:01:36] 03brion * r29457 10/trunk/phase3/includes/filerepo/ (FSRepo.php RepoGroup.php): Revert r29361 -- breaks upload, calls nonexistent functions
    [22:02:58] hi I just installed mediawiki on ubuntu, how do I "connect" it to my apache setup... gentoo has webapp-config, but I'm not clear how you do things like that with vhosts etc on ubuntu
    [22:04:19] axod: first, remove whatever package you installed
    [22:04:29] then, download the release tarball from www.mediawiki.org
    [22:04:33] uncompress it
    [22:04:39] stick it in your document root in a nice place
    [22:04:42] and follow the directions
    [22:05:11] hmm ok, I thought apt-get would be simpler
    [22:05:17] thx
    [22:05:52] 03catrope * r29458 10/trunk/phase3/includes/api/ApiQueryInfo.php:
    [22:05:52] Improving r29456:
    [22:05:52] * The XML formatting bug is obscure and not trivial to fix, working around it is easier
    [22:05:52] * Added missing braces
    [22:14:50] why do i get another skin if i made a redirect?
    [22:17:41] re
    [22:18:08] if i go to http:ip/wiki i have the new skin
    [22:18:16] I just reemerged apache and php to a newer version and got this on my mediawiki http://marchelly.org.ua/mediawiki/
    [22:18:22] but if i go to the fqdn i have the old skin
    [22:18:33] don't know what to do
    [22:19:15] check the path of php
    [22:19:42] check the rights of index.php
    [22:25:33] help
    [22:25:35] help
    [22:25:46] ip request new skin
    [22:25:51] fqdn old skin
    [22:25:52] ?
    [22:26:08] i want the new skin on ip and fqdn
    [22:26:13] please use good spelling and be as descriptive as possible
    [22:27:00] if i make a request to http:myip/wiki i have the wiki in the new skin
    [22:27:08] ok...
    [22:27:29] but if i make a request to http://mywiki.com.de i get the old skin
    [22:27:50] because they're different installations of mediawiki?
    [22:27:54] die7, thanks
    [22:28:41] Hi domas!
    [22:29:15] marchelly, work it now
    [22:29:31] Skizzerz no it is not, i have just one
    [22:29:35] die7, yes.
    [22:29:40] :)
    [22:29:59] hmm, did you try changing your preferences in the other wiki to the new skin
    [22:31:14] yes but same skin always
    [22:31:48] then I have no idea what's wrong :(
    [22:32:00] thanks
    [22:33:23] 03siebrand * r29459 10/trunk/extensions/ (5 files in 2 dirs):
    [22:33:24] * Reworked to work with 1.12alpha
    [22:33:24] * added special page class to seperate file
    [22:33:24] * using wfLoadMessages now
    [22:33:24] * updated extension credits URL and added version
    [22:33:24] * updated message group in Translate
    [22:36:16] Welcome back domas!
    [22:36:40] hiii!
    [22:57:18] 03siebrand * r29460 10/trunk/extensions/ (3 files in 2 dirs):
    [22:57:18] * use wfLoadExtensionMessages for CommentSpammer
    [22:57:18] * add version in extension credits
    [22:57:18] * update Translate extension
    [23:01:19] http://test.wikipedia.org/wiki/User_talk:Splarka#Sysop.2C_steward fail
    [23:09:51] 03siebrand * r29461 10/trunk/extensions/Translate/MessageGroups.php: Activate export for AskSQL
    [23:20:52] why do i have two different skins
    [23:21:08] on ip/wiki the new
    [23:21:36] on http://mysit,cm/wiki the old one
    [23:21:39] It's been a while since I've installed MediaWiki afresh, so I popped in so I can be guided along if I get stuck :)
    [23:23:29] Are there any good techniques for copying all pages in one wiki to a fresh install?
    [23:23:42] dump the database, import it, install
    [23:24:12] no, the first wiki is a hosted one, ad-supported
    [23:24:39] i mean.. X hosts it, not me
    [23:24:45] no DB access
    [23:25:51] surely you have something like phpMyAdmin?
    [23:26:02] 03siebrand * r29462 10/trunk/extensions/ (3 files in 2 dirs):
    [23:26:02] * use wfLoadExtensionMessages for Contributors
    [23:26:02] * add version in extension credits
    [23:26:02] * update Translate extension
    [23:26:09] what relevance is that?
    [23:26:18] Gary: if it's someone cool like wikia, they'll probably give you a dump if you ask
    [23:26:35] It is a freely hosted service, and no, it's a small service
    [23:26:49] 03siebrand * r29463 10/trunk/extensions/Translate/MessageGroups.php: Fix typo
    [23:29:30] *Splarka blinks... thought he saw flyingparchment call wikia "Cool"...
    [23:30:43] Splarka: indeed :)
    [23:31:17] *Splarka needs to lay off the absinthe
    [23:40:34] 03siebrand * r29464 10/trunk/extensions/ (4 files in 2 dirs):
    [23:40:34] * use wfLoadExtensionMessages for ExpandTemplates
    [23:40:34] * add version in extension credits
    [23:40:34] * update Translate extension
    [23:48:36] anything with AutoWikiBrowser?