[00:45:48] Any way to allow iFrames? [00:50:58] aBSDaemon, there is an extension for this [00:51:15] I'm searching through the matrix now [00:51:19] but beware XSS vulnerabilities. [00:51:47] It's for Amazon, working with that script now [00:56:54] Yet another extension to add to my repository [00:57:11] 30 and counting [00:58:10] :D [00:58:16] you like nonstandard functionality [00:58:58] More like, "Client says, site needs to do XYZ". XYZ is outside of that standard functionality. [00:59:40] So when I cut and paste code, I create a .zip file and upload to mwext.com [01:06:48] hi guys! how's the OpenID integration with mediawiki? is it stable? [01:07:11] dunno. [01:07:16] check the release-notes of it? [01:07:29] I think it's running on wikitravel, so it can't be too poor. [01:12:59] *amidaniel beats Werdna with a stick [01:13:28] ? [01:16:03] Werdna: ok thanks! :) [01:16:56] Any reason why the Bookmarks heading would display to the right hand side on the main page, but as it should on all others? http://www.ourpropertytaxes.com/taxinfo/index.php/Main_Page [01:17:46] Werdna: Just felt like hitting someone [01:18:07] [01:18:08] Working with Mediawiki makes me want to do that sometimes too... [01:18:12] it's the align="left" [01:18:21] Thanks. [01:19:49] you could do align="right" if you want them to be next to each other... [01:21:47] I just removed it and everything looks okay. So I'm not going to go experimenting tonight. I've got plenty of other things to do. [01:22:30] ok [01:23:29] What does this mean? [01:23:34] XML Parsing Error: no element found [01:23:36] Location: http://www.editthis.info/Amusement/Special:Export?title=Special:Export&action=submit [01:23:38] Line Number 95, Column 1: [01:24:13] you can't export a special page... [01:24:18] My main thing tonight is to tweak a couple templates and get this Amazon code working. Then tweak the templates for the latest data merge template. 6500 records from a CSV file...that is going to be fun [01:24:27] indeed. the error is a bit odd, though :P [01:24:50] also, i suspect the error happens while doing a transwiki import [01:25:18] I don't want to export a special page [01:25:29] *darkcode thinks both the import and export error reporting could use some improvements [01:25:50] Gary_Kirk: the url looks... odd. when exactly does the error happen? [01:26:09] it seems that it happens on large exports? [01:26:24] when i reduced the number of pages i was exporting from 6 to 3 it was fine [01:26:37] Well, there goes my internet pass. And I'm not buying another drink. Time to fire up the browser to the proxy I set up at home [01:26:39] depending on their size - i'm exporting full histories [01:26:43] Gary_Kirk: but when exporting, no xml is parsed. so you should not get a parsing error. oh... wait... this is your browser reporting a parsing error, maybe? [01:26:51] yes [01:27:00] Gary_Kirk: i suspect it happens if the export is aborted for some reason. like a timeout, if it's too large.
[01:27:06] yeah, cheers [01:27:11] i'll just keep it small :) [01:27:22] It's been a very long time since I set up MediaWiki :) [01:27:31] darkcode: indeed :) for the export part, see http://brightbyte.de/page/Extending_MediaWiki%27s_dump_format [01:27:34] what version was it where you could save the export as a .xml file without having to view the thing first [01:27:36] or do anything apart from stuff on meta to be honest [01:27:45] Gary_Kirk: to export more stuff, use the command line scripts [01:27:46] was that 1.11 or 1.12 [01:27:46] !backup [01:27:46] --mwbot-- http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki [01:28:10] i can't, as it's a free wikifarm thing [01:28:11] Skizzerz: i think even before that. 1.10 or something. [01:28:21] hmm, perhaps [01:28:32] Gary_Kirk: if it's a decent wiki farm, they should provide this as a service. like "mail me a dump" or something. [01:28:39] *Skizzerz never upgraded to 1.10, went straight from 1.9 to 1.11 [01:28:42] Gary_Kirk: just ask them [01:28:59] Dusentrieb: it isn't, which is why I'm asking :) [01:29:14] It's a fairly small wiki in terms of numbers of pages, I'm fine [01:29:22] *not asking. Doing it manually :) [01:34:13] import and export should probably instead tell you it won't be able to do exactly what was requested and ask if the person still wants to import/export, a bit like how moves are handled when the page trying to move to already exists... [01:34:50] *cry* [01:35:11] http://meta.wikimedia.org/wiki/Help:ParserFunctions [01:35:24] *Werdna runs off for a bit [01:35:34] in combination with mediawiki 1.6.10 and Templates:Languages - NOT WORKING *cry* [01:36:01] 1.6 was never meant to work with ParserFunctions [01:36:10] ?¿? [01:36:22] MediaWiki: > 1.6.8 [01:36:23] ^^ [01:36:30] except for the very first revision of the extension, but that was quite different to ParserFunctions today [01:36:44] are you saying I'm wrong? [01:36:52] mediawiki.org is right? [01:36:59] you should upgrade to 1.11 [01:37:02] Nope - you're right [01:37:11] HeHeHe it's difficult [01:37:19] no php5? [01:37:31] we had some problems with phpBB2 ^^ board crash after php4 to php5 update [01:37:48] 4hrs work - but we didn't find the BUG [01:37:54] the PHP group are dropping support for PHP 4 this year [01:38:20] ^^ i know [01:38:40] http://phpbb-php5mod.sourceforge.net/ [01:38:42] And yet, the CentOS folks still don't have a PHP5 distro in their core repo for CentOS 4 because "PHP5 is still beta" [01:40:03] Thank You darkcode - problem in our board is - too much (very) old mods [01:40:29] you can always install both, have them running side-by-side [01:40:49] then pick which one you want to use by directory, or use a .php5 file extension [01:41:38] MW has a .php5 stub file for all its entry points these days [01:41:40] Yes we know - trying to use that option with .php5 to find the BUG [01:41:58] are all these colors really necessary? [01:43:03] It's a script, can't change it - sry [01:43:12] thanks, appreciated [01:43:30] upgrade the mods too eYeWoRRy? [01:45:03] I would @ darkcode - very old BETA mods - discontinued mod^^ [01:45:04] mods [01:45:14] so - HF eYe [02:37:31] Hello. Is it possible to do sorting on a table like this? The problem is it's being ordered in a different way http://wiki.mobileread.com/wiki/E-book_Reader_Matrix [02:41:29] timofonic: "wikitable sortable" ? [02:41:55] MZMcBride, I know, but that doesn't work with tables with alternative ordering [02:42:49] oh, you mean going horizontal rather than vertical...
[02:42:57] MZMcBride, look at the example, it does it the wrong way. Sorry, my English is limited and I don't know how to explain it [02:42:59] MZMcBride, yep [02:43:27] no, it's not possible, as far as i know [02:43:36] MZMcBride, :( [02:43:45] the ability to sort has always presumed it was going to be vertical [02:46:18] MZMcBride, no parameters? [02:47:51] nope [02:56:47] if you're any good with javascript, you could modify it to allow vertical sorting as well [02:57:06] darkcode, I can't even code one line in BASIC :( [02:57:49] ok [04:32:26] Hello, I'm still getting a create directory permissions error when trying to display thumbs from commons. Can anyone help? [04:36:15] i get this error > Warning: mkdir() [function.mkdir]: Permission denied in /var/www/vhosts/mydomain.org/subdomains/en/httpdocs/wiki/includes/GlobalFunctions.php on line 1670 [04:37:01] TimStarling: http://en.citizendium.org/wiki?title=Church_of_Scientology&action=history ;) [04:43:44] I have 777 on all the dirs and sub dirs on all language wikis /images.. I've also looked through all 70+ bugs that I can find, no luck [04:52:53] I think I might have this config incorrect. > http://en.citizendium.org/wiki?title=Church_of_Scientology&action=history [04:53:07] sorry that was in my clipboard [04:53:52] this is what im talking about = $wgSharedUploadDirectory = '/(LOCALPATH)/POOL-FOLDER/images/'; [05:09:36] Ok I finally fixed it that was what was wrong with my local settings. [05:09:42] it needed: $wgSharedUploadDirectory = '/var/www/vhosts/mydomain.org/subdomains/global/httpdocs/wiki/images/'; [05:10:44] see you all in about 6 months.. hopefully Thanks for your help the other day [06:32:27] hi all [06:32:57] is there a way to allow MediaWiki to display external images ? [06:33:35] $wgAllowExternalImages = true; [06:35:17] and then any full URL ending with gif/jpg/jpeg/png will turn into an image [06:40:41] !externalimages [06:40:41] --mwbot-- To allow images from elsewhere to be included in your wiki, see . To limit this to some specific sources, see . [07:20:32] <_wooz> lo [07:23:30] +l [07:31:51] hey guys [07:32:21] i'm having a lot of difficulty downloading wikipedia xml dumps, as the server doesn't seem to support partial-content requests [07:32:41] am I doing something wrong, or is this a known issue? [07:32:52] or feature? ;) [07:46:48] Splarka: thank you [08:01:30] Hi I'm looking for a way to make my urls look like www.mydomain.com/Page_title [08:01:40] LuCypher, read FAQ first [08:01:42] topic ^^ [08:21:55] Sasa^Stefanovic , thanks I've found it and solved it [08:22:01] np [08:25:19] Can I ask you where I can find some templates/skins for mediawiki? [08:25:37] LuCypher, on meta too [08:25:44] that is on FAQ also [08:26:05] :-) [08:49:58] Hi, How can I restrict user-registrations only to those whose email-ids are from a specific domain? [08:53:12] re [08:56:54] nareshov: I think there are extensions to do that, and some manual hacks like: http://www.mediawiki.org/wiki/Manual_talk:Configuration_settings#Restrict_Registration_to_specific_Domain_via_email_address [09:07:14] Wow, Splarka +1 for finding a link to a solution to the *exact* question :) [09:08:03] *Splarka bites amidaniel [09:08:11] Wikia has an extension for this, but I can't remember the name [09:08:23] *Splarka bugs sannse [09:11:11] Ouch [09:11:18] *amidaniel bleeds in Splarka's mouth [09:11:26] mmmm, devilish [09:11:46] Yeah, you like that, don't you, bitch? [09:12:09] Hurray for kinkiness in a technical channel! [09:12:29] man... gotta love it...
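A minimal sketch of the hook-based answer to the email-domain question above (the same idea gets kicked around just below). The AbortNewAccount hook is real MediaWiki core; the function name, the example.org domain, and the message wording are placeholders, not anything from the channel:

    # LocalSettings.php -- sketch only
    $wgHooks['AbortNewAccount'][] = 'abortNonDomainSignup';

    function abortNonDomainSignup( $user, &$message ) {
        // allow only addresses ending in @example.org (placeholder domain)
        if ( !preg_match( '/@example\.org$/i', $user->getEmail() ) ) {
            $message = 'Account creation is limited to example.org email addresses.';
            return false;   // returning false aborts the account creation
        }
        return true;
    }

This only checks the string the visitor typed, so pairing it with required email confirmation (for example $wgEmailConfirmToEdit) helps against people entering a plausible address they can't actually read.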
test, crash, fix, compile, bundle, deploy, repeat [09:14:13] Yup, that's called programming. [09:15:00] ami: so how hard would it be to simply use the AddNewAccount hook and AbortNewAccount if the email address isn't matchy-happy? would that prevent new account creation properly? [09:15:07] *Splarka is not a PHPerson [09:15:33] Splarka: Sounds good. Not too familiar with the acct. creation code though. [09:15:39] or, could people fill up Special:Userlist by spamming the account creation form with the proper email address with no intention of being able to respond [09:16:00] *Splarka pokles nareshov [09:16:19] er Listusers [09:16:30] yeah [09:16:50] oh [09:16:57] Well, youc could send a ping out to the mailserver and test the addy [09:17:02] s/youc/you [09:17:46] nareshov: how good are you at php? [09:18:07] not very good, but I can sort of understand what the code says [09:18:20] heh [09:38:59] hi [09:39:14] Hello [09:45:51] we don't have any generic self-submitting HTML form class do we? [09:46:44] maybe we should fix that some day [09:58:20] you have to admit, the way we do forms is pretty crap [09:59:20] = manually building them everywhere? [10:00:08] yes, either with string concatenation or with the Xml module, both ways aren't very good [10:05:29] it doesn't help that HTML itself is so inconsistent [10:05:47] every time I have to write a form, I end up having to refer to the spec, even though I must have done it tens of times [10:07:34] e.g.
{{#if:1|}}
can breakify [11:34:07] oh dear, this looks tricky [11:34:44] well, what parts do you want to change? [11:34:50] the color and the message, but everything else static? [11:35:14] perhaps just don't use tables at all... [11:35:54] Duesentrieb: tables would be fine cause they look like this: [11:36:18] |product|product-id|who works on the product|status [11:36:32] and for like 100 things [11:36:46] ok then [11:37:00] i thought you were looking for a message-box type thing [11:37:10] which is often done using tables. but using plain divs is better [11:37:30] marvxxx: do you want an infobox-type system? [11:38:14] it's more like a big table [11:38:32] and one field in it should have a different color and the status written in it [11:38:46] i thought i would make it easier for my workers if they could use templates [11:38:48] would this table be used on more than one page? [11:39:52] no [11:40:13] so, the table would be mostly static, and only certain elements that might be repeated would be templates? [11:41:02] exactly [11:41:04] *Splarka notes wikitable syntax is sometimes simpler than transclusions (unless there are lots of styles involved) [11:42:20] i just thought it would be nice if they write in the status row {{done}} [11:42:29] and it would be green and "done" written in [11:42:45] sure [11:42:51] just make Template:Done contents: [11:42:55] |- [11:43:10] | style="background-color:#ddffdd;" | done [11:43:23] for example [11:43:51] http://p.defau.lt/?H0ZW9Ka_ARGldLgkKm0vZg [11:44:08] is easy with wikitables, and almost as easy with html tables if you have tidy [11:44:34] aaahhh nice [11:44:41] i prefer wikitables for wikis :) [11:45:17] just be careful of random extra newlines and whitespaces, wikitables are a bit sensitive to it [11:45:50] okidoki [11:45:59] like {{done}} might create
|-
, or |-{{done}} might just fubar royally [11:46:53] ah ok...mmhh i will have to check this out [11:47:03] i wonder how i can delete templates [11:47:43] find a sysop? [11:48:56] Splarka: it works like a charme [11:48:57] thank you [11:49:05] rar [11:53:59] where can i read about the mediawiki session cookie implementation? [11:57:08] hippietrail: afaik there is none. it uses whatever php provides. [11:57:21] really? [11:57:26] dues: aka "read /includes/User.php" ? [11:57:35] :P [11:57:37] maybe [11:57:48] *Splarka looks at the gibberish that is function setCookies() { [11:57:49] i don't even knowwhere the session stuff is handled [11:58:18] i think i need to leverage the session stuff in javascript [11:58:40] whyfor? [11:59:58] hmm, can you use an ajax call to the API login? [12:00:39] basically i'm storing some prefs type stuff in a user subpage which js gets using ajax. but xmlhttprequest results are always cached [12:01:25] you could append a &12345 random number or timestamp to the end of the request URL to decache it [12:01:28] so i need a way to know when to make sure i don't get stale stuff out of the cache, such as when the subpage has been edited [12:03:03] it's actually even more complicated. i'm thinking up a new js extension which will put some settings which are now cookies in another user subpage so that they will go with you when you move computers [12:03:43] hmm, I was working on something as a proof of concept, sort of similar [12:03:54] and so that the next person to use your computer doesn't inherit your prefs [12:04:01] it was basically a 3 level extensions preference for gadgets, not quite the same as for extra prefs though [12:04:08] Splarka: tell me! [12:04:14] first level would be plain cookies, and I got that working [12:04:16] in fact can i invite you to #wiktionary [12:04:23] bed soon [12:04:35] hippietrail: if you leave a computer to someone else without logging out, you deserve all the bad things that may happen to you. [12:04:40] second level would be user JS globals, which would be checked by the same system, got that semi working [12:05:00] third level would be a way to hard-save your cookie based prefs occasionally to the user JS, so as not to edit it unnecessarily [12:05:03] also... do we have settings in cookies? i though that was only session tokens and stuff [12:05:13] I have it running on test.wp a bit [12:05:19] Duesentrieb: our current js prefs will be transferred even if you do log out i believe [12:05:22] Duesentrieb: en.wt has loads of cookie hacks [12:05:32] or, planned to [12:05:33] hippietrail: huh? how would that even work? [12:05:43] hippietrail: http://en.wikipedia.org/wiki/Wikipedia:WikiProject_User_scripts/preferences [12:05:45] cookies are browser specific [12:05:53] they don't know about login and logout [12:05:55] they are browser *session* specific [12:06:04] Duesentrieb: when you make an extension in javascript you cannot access the db for extra prefs [12:06:05] not js hacked in ones [12:06:08] i'm talking about logging out of the computer'S account [12:06:10] not wikipedia [12:06:17] ah ok [12:06:46] anyway, I posted this to #mediawiki-scripts but it was during the holidays and had no replies ^_^ [12:06:52] *Splarka prods zocky [12:07:02] hippietrail: trie. an ajax based solution might be nice for that [12:07:11] yo, splarka [12:07:13] it would be nice... 
[12:07:26] we have users on wiktionary who strongly want our current extra prefs to continue to work for anons - but i for one think all prefs should work the same way - so i'm trying to find a solution that will make everybody happy [12:08:06] i could never even convince connel to put save and cancel buttons on WT:PREFS [12:08:08] if, as I'd planned, but got lazy, to make an ajax "save all current preferences (warning, this will create an edit on wgServer)", so that you can fiddle and hack your preferences via cookie, but still save them for other browsers [12:09:05] or, maybe we could save a preference object in the database, and receive it in javascript on every page view [12:09:13] Splarka: what i had envisaged was a prefs abstraction layer which would support prefs in the db, in a user subpage, and in cookies [12:09:27] yes zocky, that'd be nice, good luck with getting the devs to do that X_X [12:09:32] if we have it in the db, which is better in many ways, we don't really need a user sub page [12:09:35] it would be best, but until then... [12:09:48] having separate preferences in cookies and in your user profile sucks [12:09:52] there may be privacy concerns about certain prefs being accessible via ajax [12:10:07] well [12:10:08] they can be accessible only to you [12:10:13] no more so than any other way of storage [12:10:20] it could just load with gen=js as a set of globals [12:10:47] as one global, JSON-like [12:11:03] on wiktionary we're looking first for a way to do it which doesn't require changes to mediawiki or php extensions to be installed [12:11:07] anyway, please develop it, and keep the Gadgets' needs in mind [12:11:12] but later i think that is the best way [12:11:30] and something simple gadget creators can use, like zocky's search box... [12:11:35] if we aren't going to build an extension, then we would need to use the sub-pages for cross-browser stuff [12:11:43] the wiktionary people so far don't like gadgets - they prefer our own "system" [12:11:47] $wgPrefs={"myPref":true,foo:"bar",baz:"etc"} [12:11:50] that sort of thing [12:11:52] if(getPref(SearchBox-alwaysshowbydefault)) { foo } else { bar } [12:12:09] zocky: there is already a user options manager, so it would be best to add into that [12:12:25] Splarka: exactly [12:12:25] head melting, have fun storming the castle boys ^_^ [12:12:38] also if people really want prefs for anons, in cookies or otherwise, then we should give all prefs the possibility of working that way [12:12:52] that will be much harder to do though [12:13:23] well that would require core changes or a new extension to be installed [12:13:33] core changes I think [12:13:47] a rewrite of getUserOptions [12:13:54] or whatever the function is [12:14:14] or a hook in it. beside the point for now anyway (-: [12:14:33] yup [12:14:42] so, looks like we're back where we started [12:15:20] it would be possible to store prefs at the bottom of User:/monobook.js [12:15:21] cirwin: did you look at this?
http://en.wikipedia.org/wiki/Wikipedia:WikiProject_User_scripts/preferences [12:15:53] i would prefer a separate page but the beauty of abstraction layers is we could have both [12:15:59] looks even uglier than ours :) [12:16:47] but setjsPref and getjsPref are the functions we want [12:17:11] my brain is not active enough to read the code sadly (-: [12:17:21] we could then use autoedit to add an array to the bottom of monobook.js with the preference values [12:17:26] and it could read those too [12:17:28] i do better with pseudocode - especially for js [12:34:43] Hey, my mediawiki installation is missing the /Help:Contents page [12:34:49] there's no text in it :| [12:36:37] Help:Editing is what I wanted [12:36:42] try wikipedia [12:37:03] okay [12:37:06] http://en.wikipedia.org/wiki/Help:Wikitext_examples [12:39:03] nice [12:47:31] morning [13:35:13] nhm [13:35:37] maybe there should be a planet mediawiki feed, in addition to the planet wikimedia one. [14:06:18] I'm creating pages with subpages, like: http://wiki/course/year/assignments/1 and so on. How can I have easy navigation to any node in this URL ? [14:07:03] suppose I want to quick-navigate to the course page from the assignment1 page? [14:09:46] hey people, we have a wiki running in http://hostname/wiki/ and we need to be able to create URLs back to the main web site (i.e. http://hostname/) without specifying the name "hostname" [14:09:46] in other words, we need the equivalent of Homepage but on a Wiki page - any ideas? [14:11:52] fishsponge: there is a way, I'll tell you, just let me remember it! [14:12:32] fishsponge, does [[/]] work? [14:12:43] I think {{server}} is the thing [14:13:07] perfect! [14:13:08] thanks dude! [14:14:20] why isn't CIA-40 talking? [14:15:43] hey i want to change the layout of my articles, the articles should have headlines in the article for example (Instruction,Tips,Warning) [14:15:49] *Hojjat pings CIA-40 [14:16:12] bango, then you can add "header"s! [14:16:19] == Header == [14:16:34] no i want every article to have these headlines [14:16:55] i want 4 textfields not one [14:17:05] and every textfield should have its own headline [14:17:31] first textfield has got the headline instruction [14:17:37] the second warnings [14:17:39] and so on [14:19:35] any idea [14:19:36] = [14:20:30] bango: mediawiki does not really support that. it's possible to hack it in, with some effort - i know of one website that uses something like this, namely, wikihow. [14:20:45] i don't know if it's possible without hacking core code [14:20:51] yes [14:20:58] i know wikihow [14:21:07] ask the guy who wrote that, then [14:21:13] maybe he can put it up on mediawiki.org [14:21:21] i did it [14:21:24] but he didn't answer [14:21:39] really? too bad. did you write an email? [14:21:51] yes [14:22:10] hm :/ [14:22:13] no idea then [14:22:16] it looks so great :/ [14:22:39] mhm okay, i've got another question.. [14:23:00] how can i change the size of a picture in an article? [14:23:31] [14:24:38] i think the picture has to be on my server? [14:26:24] bango: if you want "nice" image features, then yes, they have to be uploaded to the wiki, through the wiki. [14:26:39] you can allow inclusion of external images, but for that, only basic inlining is supported. [14:26:59] how can i allow this? [14:27:12] !externalimages [14:27:12] --mwbot-- To allow images from elsewhere to be included in your wiki, see . To limit this to some specific sources, see . [14:29:36] !help [14:29:36] --mwbot-- Hi!
I'm mwbot, a bot that was quickly whipped up by Daniel Cannon (AmiDaniel) to help out around #mediawiki. Some quick help is at < http://www.mediawiki.org/wiki/Mwbot >, you can find all my source code at < http://amidaniel.com/viewvc/trunk/MWBot/?root=svn > [14:30:20] @trusted [14:30:20] --mwbot-- [wikipedia/Hashar, wikipedia/AmiDaniel, s23.org, wikimedia/Soroush83, wikia/.*, wikimedia/Tim-Laqua, wikimedia/Danny-B., cc1081997-b.harli1.fr.home.nl, clematis.knams.wikimedia.org, wikipedia/ialex, .*\.wikimedia\.org, wikipedia/MZMcBride, wikipedia/.*, fuchsia.knams.wikimedia.org, 222-153-0-95.jetstream.xtra.co.nz, wikimedia/Eagle-101, wikimedia/.*, wikia/Skizzerz, wikia/Jack-Phoenix, silentflame/member/pdpc.base.minuteelectron, wikimedia/VasilievVV, wikipedia/AzaToth, cpe-69-201-152-135.nyc.res.rr.com, wikimedia/Huji, pdpc/supporter/student/werdnum, wikimedia/Pathoschild, c-67-171-249-42.hsd1.wa.comcast.net, wikipedia/simetrical, wikimedia/Kalan] [14:30:51] @dump [14:30:51] --mwbot-- A dump is now available at http://tools.wikimedia.de/~amidaniel/botbrain.html [14:32:15] AzaTht: when we have "wikimedia/.*", isn't it a good idea to remove all other "wikimidea/" ones, to make the list cleaner? [14:32:27] just though about that [14:32:31] :) [14:34:23] @untrust wikipedia/Hashar wikipedia/AmiDaniel wikimedia/Soroush83 wikimedia/Tim-Laqua wikimedia/Danny-B. wikipedia/ialex wikipedia/MZMcBride wikimedia/Eagle-101 wikia/Skizzerz wikia/Jack-Phoenix wikimedia/VasilievVV wikipedia/AzaToth wikimedia/Huji wikimedia/Pathoschild wikipedia/simetrical wikimedia/Kalan [14:34:23] --mwbot-- Removed wikipedia/Hashar wikipedia/AmiDaniel wikimedia/Soroush83 wikimedia/Tim-Laqua wikimedia/Danny-B. wikipedia/ialex wikipedia/MZMcBride wikimedia/Eagle-101 wikia/Skizzerz wikia/Jack-Phoenix wikimedia/VasilievVV wikipedia/AzaToth wikimedia/Huji wikimedia/Pathoschild wikipedia/simetrical wikimedia/Kalan from trusted hostnames list. [14:34:29] @trusted [14:34:29] --mwbot-- [wikipedia/Hashar, wikipedia/AmiDaniel, s23.org, wikimedia/Soroush83, wikia/.*, wikimedia/Tim-Laqua, wikimedia/Danny-B., cc1081997-b.harli1.fr.home.nl, clematis.knams.wikimedia.org, wikipedia/ialex, .*\.wikimedia\.org, wikipedia/MZMcBride, wikipedia/.*, fuchsia.knams.wikimedia.org, 222-153-0-95.jetstream.xtra.co.nz, wikimedia/Eagle-101, wikimedia/.*, wikia/Skizzerz, wikia/Jack-Phoenix, silentflame/member/pdpc.base.minuteelectron, wikimedia/VasilievVV, wikipedia/AzaToth, cpe-69-201-152-135.nyc.res.rr.com, wikimedia/Huji, pdpc/supporter/student/werdnum, wikimedia/Pathoschild, c-67-171-249-42.hsd1.wa.comcast.net, wikipedia/simetrical, wikimedia/Kalan] [14:34:36] hmpf [14:34:48] thought you could place them all in one query [14:35:05] @untrust wikimedia/Soroush83 [14:35:05] --mwbot-- Removed wikimedia/Soroush83 from trusted hostnames list. [14:35:14] @untrust wikimedia/Tim-Laqua [14:35:14] --mwbot-- Removed wikimedia/Tim-Laqua from trusted hostnames list. [14:35:23] @untrust wikimedia/Danny-B. [14:35:23] --mwbot-- Removed wikimedia/Danny-B. from trusted hostnames list. 
[14:35:26] hmpf [14:35:27] @trusted [14:35:27] --mwbot-- [wikipedia/Hashar, wikipedia/AmiDaniel, s23.org, wikia/.*, cc1081997-b.harli1.fr.home.nl, clematis.knams.wikimedia.org, wikipedia/ialex, .*\.wikimedia\.org, wikipedia/MZMcBride, wikipedia/.*, fuchsia.knams.wikimedia.org, 222-153-0-95.jetstream.xtra.co.nz, wikimedia/Eagle-101, wikimedia/.*, wikia/Skizzerz, wikia/Jack-Phoenix, silentflame/member/pdpc.base.minuteelectron, wikimedia/VasilievVV, wikipedia/AzaToth, cpe-69-201-152-135.nyc.res.rr.com, wikimedia/Huji, pdpc/supporter/student/werdnum, wikimedia/Pathoschild, c-67-171-249-42.hsd1.wa.comcast.net, wikipedia/simetrical, wikimedia/Kalan] [14:35:38] hey how can i make spotlight articles like on wikihow [14:35:43] eh? [14:35:48] @untrust wikipedia/ialex [14:35:48] --mwbot-- Removed wikipedia/ialex from trusted hostnames list. [14:35:58] what's with the untrusts? [14:36:05] Jack_Phoenix: duplicates [14:36:10] ah, k :) [14:36:24] wikia/.* wikipedia/.* and wikimedia/.* is trusted [14:36:58] @untrust wikipedia/MZMcBride [14:36:58] --mwbot-- Removed wikipedia/MZMcBride from trusted hostnames list. [14:37:17] @untrust wikia/Jack-Phoenix [14:37:17] --mwbot-- Removed wikia/Jack-Phoenix from trusted hostnames list. [14:37:22] Jack_Phoenix: try now, it should work [14:37:27] still [14:37:38] !foo [14:37:38] --mwbot-- I don't know anything about "foo". You might try: !footer [14:37:43] try !foo is bar [14:37:54] !foo is bar [14:37:54] --mwbot-- Successfully added keyword: foo [14:38:03] *Jack_Phoenix hugs mwbot :D [14:38:20] !foo del [14:38:20] --mwbot-- Successfully removed keyword: foo [14:38:32] @dump [14:38:32] --mwbot-- A dump is now available at http://tools.wikimedia.de/~amidaniel/botbrain.html [14:38:38] guys, go to #mwbot please [14:38:46] ok [14:41:31] a bug in mwbot [14:41:43] pipe doesn't work when using params [14:41:50] !r 5678 | anyone [14:41:50] --mwbot-- http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=5678 [14:42:03] !r | anyone [14:42:03] --mwbot-- anyone: http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=`e1 [14:46:39] hey how can i make spotlight articles like on wikihow [14:51:14] mhm [14:51:43] what's the difference between the session cookie and the token cookie? [14:54:03] token cookie is embedded in forms [14:54:11] session cookie is a browser cookie? [14:54:53] hippietrail: session cookie is used to keep you logged in. token cookie is to avoid double submission of data [14:54:54] they are both used in User.php / loadFromSession() [14:55:12] token cookie is to avoid CSRFs too [14:55:23] CSRF? [14:55:34] cross site request forging [14:55:44] aha [14:55:51] http://en.wikipedia.org/wiki/CSRF [14:56:28] i'm wondering if i can use either of them to synchronise preferences in cookies with preferences stored on the site or potentially in a user subpage [14:56:48] (from javascript) [14:59:44] for instance on page load i would like to detect from javascript whether the current user was previously browsing from a different computer, or whether a different users is logged in since last time we checked, or if a long time has passed since this user last browsed the site [15:29:12] how well does mediawiki handle animated gifs? [15:30:40] Morbus: mediawiki doesn't care about the fact they are animated. the only question is how well thumbnailing works for them. if you use ImageMagick, it should work reasonably well, though not always perfect, iirc [15:30:58] ok. thanks. 
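For reference, the ImageMagick thumbnailing mentioned just above is a LocalSettings.php switch; a minimal sketch, where the path to convert is an assumption for a typical Linux host:

    $wgUseImageMagick = true;                           // use ImageMagick instead of PHP's bundled GD
    $wgImageMagickConvertCommand = '/usr/bin/convert';  // wherever ImageMagick's convert binary lives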
[15:31:02] there used to be massive problems with scaling animated gifs - not sure if imagemgaick fixed the bug, or if it's some config tweak [15:46:07] 03dale * r29486 10/trunk/extensions/MetavidWiki/includes/ (4 files in 2 dirs): stream page delete updates streams table & mvd index [15:50:13] we are at 29496, CIA-40 is reporting 29486 ??! [15:51:15] *Jack_Phoenix kicks CIA-40 [15:51:16] ow [15:51:23] better to get the FBI guys ;-) [15:55:13] Jack_Phoenix: One way to make CIA-40 work is to give it false information about Alqaida! [15:55:24] good idea :D [16:00:33] 03shinjiman * r29479 10/trunk/phase3/languages/messages/MessagesId.php: [16:00:33] * (bug 12469) Update Indonesian translations [16:00:33] Patch by Borgx [16:00:59] 03aaron * r29477 10/trunk/extensions/ConfirmAccount/ConfirmAccount_body.php: Bump rows [16:02:05] *Al-Qaeda-40 kicks CIA-40 [16:02:07] ow [16:02:29] It is working retrospective! [16:02:42] 29479 then 29477 !! [16:10:47] I am having difficulties backing up mediawiki site from debian machine to another debian machine...There are some incompatibilities in tables...I used mysqldump database > database.sql and then imported on a new installation of mediawiki, however it is refusing to display the site [16:12:50] what error message do you get? [16:13:44] yang: are both using the exact same version of mediawiki? if not, you have to import the db first, and *then* install mediawiki [16:13:45] Hi there. What would be the easiest way to password protect my wiki for viewing? [16:14:05] KenSentMe: the simplest way is using httpauth [16:14:13] other than that... [16:14:16] !access | KenSentMe [16:14:16] --mwbot-- KenSentMe: For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [16:14:36] Duesentrieb, thanks for both [16:14:56] yang: to transfer page content, you can always use an xml dump, which is language and charset neutral [16:15:10] yang: oh yea, and beware that mysqldump may fuck up charset-wise [16:15:12] !backup [16:15:12] --mwbot-- http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki [16:23:01] Hojjat: retroactively :) [16:24:29] http://en.wikipedia.org/wiki/Special:Contributions/ClueBot [16:24:38] Average edits per day: 2880 (since last active, for last 50 edit(s)) [16:24:44] http://en.wikipedia.org/wiki/Special:Contributions/VoABot_II [16:24:49] Average edits per day: 2666.67 (since last active, for last 50 edit(s)) [16:25:04] well, that is the rate as of the last few min [16:25:34] 5546 edits, or about 2770 reverts :D [16:27:08] Duesentrieb: When I first started the mediwiki site, it used to be an old version...later the release has upgraded and i upgraded via apt-get ... I think the first version was 1.7 and the new installation is 1.9 onthe other machine [16:28:07] yang: we generally recommend against using 3rd party distribution packages, since we have no idea how and if their installers/upgrades work... [16:28:36] yang: also, if you want to use the same database, you must have the exact same version of mediawiki - or run the updater, before doing anything else [16:29:03] yang: on the "broken" wiki, try running maintenance/update.php [16:29:07] run on all error reporting [16:29:09] !debug [16:29:09] --mwbot-- For information on debugging (including viewing errors), see . [16:29:25] oh, also: [16:29:28] !adminsettings [16:29:28] --mwbot-- AdminSettings.php is an additional configuration file for use with command line maintenance scripts. 
See AdminSettings.sample in your installation directory for details. [16:29:43] Good afternoon, all! May I pester someone for some help in resolving a compatibility issue with MySQL password hashing? [16:29:54] I'm a bit new at the whole MySQL administration thing... [16:29:58] 03siebrand * r29490 10/trunk/extensions/Translate/ (MessageGroups.php Translate.php): Add support for MetavidWiki [16:30:10] 03brion * r29473 10/trunk/extensions/AssertEdit/AssertEdit.php: * (bug 12541) Use of wrong var name in AssertEdit [16:30:11] Skimble: you can try [16:30:12] Skimble: maybe you should ask in #mysql then :) [16:30:31] Skimble: if it is remotely mediawiki related, can try here :) [16:30:31] Skimble: oh, you got domas' attention - use it while it lasts ! [16:30:37] Well, it's sort of a mySQL/Mediawiki incompatibility issue. [16:30:50] My hosting company doesn't support PHP 5, it's using PHP 4.4. [16:30:57] So I need to use 1.16.10 of Mediawiki, right? [16:31:00] go get another hoster then [16:31:02] right [16:31:05] or another hoster [16:31:09] Er, 1.6.10 even. [16:31:14] Okay, so that's great so far... [16:31:44] The problem is that the version of MySQL that my hosting company uses is 4.1.22-standard. [16:32:00] 03op_kai * r29497 10/trunk/extensions/SemanticMediaWiki/skins/SMW_tooltip.js: [16:32:00] bugfix: tooltip position in IE scroll container [16:32:00] enh: method '_smw_hideAllTooltips' to close all open tooltips [16:32:02] shouldn't be a problem [16:32:24] It turns out that MediaWiki is trying to use a version of password hashing to connect to the database that 4.1 doesn't support. [16:32:41] Skimble: it is not mediawiki, but the php client [16:32:48] that is compiled against an older version of the mysql client lib [16:33:00] Ah. =/ [16:33:01] Skimble: your hoster is incompetent, if it does set such passwords [16:33:09] anyway, mysql can be configured to use the old password scheme [16:33:25] (or just SET PASSWORD = OLD_PASSWORD('blah')) [16:33:26] Can I set that for my database, or would I have to shout at my hoster to do that? [16:33:40] 03siebrand * r29484 10/trunk/extensions/ (47 files in 36 dirs): Localisation updates for extension messages from Betawiki (2008-01-09 8:32 CET) [16:33:47] if you can connect to mysql, you can use SET PASSWORD [16:33:51] 03siebrand * r29483 10/trunk/phase3/languages/messages/ (17 files): Localisation updates for core messages from Betawiki (2008-01-09 8:32 CET) [16:33:51] to set the old hash of the password [16:33:52] Skimble: are you getting an exceptionally great deal at that place? [16:34:08] Skimble: it sounds like a "take your data and run" situation to me... [16:34:09] *domas points Skimble to http://dev.mysql.com/doc/refman/5.0/en/old-client.html [16:34:12] 85 dollars a year with 'infinite' webspace and bandwidth. [16:34:31] that's rather cheap yes. [16:34:42] With both ASP and PHP support. [16:34:50] shell access? [16:34:58] I've just added on a linux folder with MySQL so I can try and get MediaWiki working. [16:35:18] domas: I've tried to set the password, but I got a user access violation error in MyPHP. [16:35:25] bah, whatever [16:35:34] Sorry, PHPMyAdmin. [16:35:36] going to fix dinner [16:35:38] No shell access, no. [16:36:08] Skimble: that sucks for running mediawiki. [16:36:13] it's doable, but it sucks. [16:36:21] and it makes your deal sound not so good any more :P [16:36:24] I was rather hoping it would be a config and go situation.
=) [16:37:13] 03aaron * r29467 10/trunk/extensions/ConfirmAccount/ConfirmAccount_body.php: [16:37:13] * Trim name [16:37:13] * Use user accessors [16:37:13] * wgAuth stuff [16:37:37] I got an error when I tried to change the password in PHPMyAdmin of #1044 - Access denied for user 'bradthev'@'localhost' to database 'mysql' [16:37:57] Presumably meaning I don't have the access permissions to actually affect the password database on the main database? [16:38:25] domas: I like my current CSE study topics [16:38:34] Teh binary search trees [16:39:38] 03huji * r29498 10/trunk/extensions/ (3 files in 3 dirs): Added Persian translations [16:39:49] AaronSchulz: lol [16:39:55] today one guy wanted to tell me how much he kne [16:39:55] w [16:40:00] and started explaining btrees [16:40:10] AaronSchulz: the funny part was when he started telling me how they are used in mysql a lot [16:40:36] ;) [16:40:45] LOL [16:40:53] I was asking my prof about b-trees and b(+/-)-trees [16:41:17] domas: he said that I should google it or read the "Wikipedia" article [16:41:31] but that "of course, you can totally trust it, since anyone can edit" [16:41:48] "can"? [16:41:51] later someone told the guy where I work [16:41:54] he said 'it can't be' [16:42:06] but apparently started to believe, once found that I am 'support engineer' [16:42:16] Hojjat: *can't, of course [16:42:45] well, if people acting like they know more than you (the professional) is funny, then I have this fun every day! [16:43:07] ;-D [16:43:12] well [16:43:19] that guy is amazing [16:43:27] apparently he's trying to write his own ext2 drivers now [16:43:29] Hojat: that reminds me, I should edit CZ more [16:43:37] but every question I ask, false answer I get :) [16:43:54] he started telling me that something he read was different from what I say [16:44:22] AaronSchulz: you meant Citizendium, right? [16:44:41] Hojat: it's amazing how the freedom to boldy edit stuff grew on me after WP [16:44:53] CZ doesn't feel that same [16:44:54] =) [16:45:07] heh [16:45:08] so it gets hard to get motivated [16:45:08] at least they are different [16:45:24] and they're running better database engine too [16:45:25] %) [16:45:40] I really value their efforts to maintain the port [16:45:41] with fancy bitmap index stuff [16:45:55] do they use that? :) [16:46:00] all the better for MW postgres support [16:46:18] you know who did all the initial MW postgres support? :) [16:46:20] c'mon [16:46:42] domas: CZ's mediawiki used to have a trillion mods [16:46:53] they couldn't scap ;) [16:47:35] 03brion * r29465 10/trunk/phase3/maintenance/dumpTextPass.php: fix include path regressions with text prefetch [16:47:38] AaronSchulz: look at the bottom: http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/SearchTsearch2.php?view=log#rev23531 ;-) [16:47:42] I was one of the ones saying "WTF, don't branch off have to support it yourselves with like no devs!" [16:48:51] domas: much of the comments I read there on forums and on-site are attacking WP in passing [16:49:12] It seemed like in the beginning, the attitude was to burn the bridge with the core devs too [16:49:19] oh, I did read their comments too [16:49:39] http://dammit.lt/2007/03/29/some-thoughts-on-citizendium/ [16:49:48] they tried to rebuild bridges afterwards [16:50:15] I saw the one when their head something-or-other said MediaWiki was one of the worst-written pieces of software he knew about. [16:50:21] I recall the quote "Only Mozilla Thunderbird gets more disrespect from me." 
[16:50:29] well, read my entry above [16:50:30] ;-) [16:50:41] Yeah, there you go. [16:50:58] well there is some shitty code, but it is not that bad overall [16:51:17] when I finish revisiondelete, I'd like to attack the log code [16:51:26] By and large it's okay. [16:51:42] It just makes me feel that whether code is good or not is somewhat subjective. [16:51:45] (unless rob returns and does it as he was planning a while ago) [16:51:52] Although, not always. [16:52:00] Like, have you ever looked at the source code for NetHack? [16:52:26] Not just inconsistent tab and brace style, but they go by the old 1980's scheme of "memory is expensive, limit variable names to three letters or less". [16:52:37] Kind of hard to read a lot of the time. [16:52:47] do they have comments? [16:52:50] Sometimes. [16:53:11] Some of their functions have completely misleading names, too. [16:53:13] And variables. [16:53:22] domas: I'd still like to see what CZ can produce [16:53:25] 03siebrand * r29499 10/trunk/extensions/ (3 files in 2 dirs): [16:53:25] * use wfLoadExtensionMessages for SpecialCrossNamespaceLinks [16:53:25] * add version and url in extension credits [16:53:25] * update Translate extension [16:53:35] I recently saw a function call like gain_xp( 0, 10 ) and assumed it must be like ( levels, XP points ). [16:53:43] but my enthusiasm is really being tested [16:53:44] But no, it's ( XP points, score points ). [16:53:59] Why in heaven's name would you use a function named gain_xp (or whatever) to increase the player's *score*? [16:54:54] Now, why is it there's no way to set a message to the empty string again? [16:55:00] It's kind of silly. [16:55:29] Some enwiki people were complaining about the delete log default message. [16:55:34] That it couldn't be removed entirely. [16:56:24] domas: http://en.citizendium.org/wiki/Chiropractor#Efficacy [16:56:31] I don't like that article [16:57:04] 03dale * r29485 10/trunk/extensions/MetavidWiki/ (9 files in 4 dirs): [16:57:04] Better internationalization support matching SemanticForms messages implementation [16:57:04] (pre request of Betawiki / siebrand) [16:58:13] Simetrical: because it used to only have xp, then score was added later [16:58:17] (btw: wtf are score points?) [16:58:26] flyingparchment, then they should make a new function. [16:58:32] You know, like, adding points to your score. [16:58:36] Simetrical: yes. meanwhile, in the real world... [16:58:43] i never saw a game that had xp and a score [16:58:51] usually, xp/level _is_ the score :) [16:59:07] Really? [16:59:09] How peculiar. [16:59:23] domas: whenever I read an "alternative medicine" article I always wonder what Steven Novella would say [16:59:49] =) [16:59:51] AaronSchulz, does CZ's policies mean [[Chiropractor]] gets written by a chiropractor? [16:59:59] yes [17:00:09] not directly, but that was the case [17:00:13] . . . and [[Scientology]] gets written by a Scientologist? And [[pyramid scheme]] gets written by a pyramid scammer? [17:00:19] a chiropractor is an 'editor' and constable [17:00:33] Simetrical: they are written by 'experts' in the subject [17:00:47] so not necessarily a scientologist [17:00:47] Who of course may have a fairly dramatic vested interest in presenting their subject favorably. [17:00:48] Great. 
someone with a PhD in religion studies and such could approve versions of Scientology [17:01:27] of course, there is a demarcation problem [17:02:10] 03siebrand * r29500 10/trunk/extensions/ (3 files in 2 dirs): [17:02:10] * use wfLoadExtensionMessages for SpecialCrossNamespaceLinks [17:02:10] * add version and url in extension credits [17:02:10] * update Translate extension [17:02:12] a politics editor tried to tag [[Scientology]] as a politics article, which would let him approve content and get final word on disputes [17:02:27] Hello [17:02:42] Is bugZilla's issues there/ [17:02:44] ? [17:03:03] Simetrical: but yeah, having a chiropractor and acupuncturist have final say on the article scares me [17:03:18] it is just begging to get a pseudo-authoritative pseudoscience article [17:03:31] OsamaK, Bugzilla issues may deal with either the software or the Wikimedia servers. Your request dealt with the software. [17:03:56] OsamaK, I have marked the request in Bugzilla appropriately. Someone may get around to it sometime; it's not actually a very simple request to fulfill, for logistical reasons. [17:04:11] (that request being http://bugzilla.wikimedia.org/show_bug.cgi?id=12566, for the interested) [17:04:31] Thanks Simetrical :) [17:05:11] Simetrical: the rate of approval is like Nupedia at the moment [17:05:21] the only edge CZ has now is the lack of vandalism [17:06:17] Simetrical: but yes, I foresee a crapload of conflict of interest problems [17:07:03] Simetrical: dupe, that means they will ignore it? [17:07:32] OsamaK, no, it means it was a duplicate of another request. [17:07:48] Simetrical: But they aren't the same [17:07:52] OsamaK, so we'll look at the one it was marked duplicate of, instead of having two requests that are the same. [17:08:32] Simetrical: Until fixed. we need to count these articles which are without links.. [17:09:11] OsamaK, you have to change the software to do that too. [17:09:49] Maybe a minor edit in some files will count them, no? [17:10:53] Is there a way to let only "some" users edit a page (and its subpages)? [17:20:06] 03siebrand * r29501 10/trunk/extensions/ (3 files in 2 dirs): [17:20:06] * use wfLoadExtensionMessages for Nuke [17:20:06] * add version in extension credits [17:20:06] * update Translate extension [17:24:52] anyone have a good HOWTO on securing parts of a wiki by group or such? [17:24:59] !prevacc | Wanderer_ [17:24:59] --mwbot-- Wanderer_ : http://www.mediawiki.org/wiki/Manual:Preventing_access [17:25:08] thank you [17:25:37] nareshov, you can use page protection for this. However, it may not be ideally suited to your needs. Wikis are generally meant to be broadly editable, so MediaWiki does not have fine-grained access control. [17:25:52] nareshov, you may also be interested in http://www.mediawiki.org/wiki/Manual:Preventing_access, and perhaps the Lockdown extension. [17:26:06] Simetrical, yup, got to that page [17:26:07] thanks [17:26:09] 03grondin * r29470 10/trunk/extensions/ (5 files in 2 dirs): [17:26:09] Creating internationalization file for Purge Cache extension. [17:26:09] Updating Translate extension whit adding this new i18n.php file. [17:26:09] * Translate.php [17:26:09] * MessageGroups.php [17:26:36] I have 2 groups, sales and technical, which need to be blocked from viewing or editing each other's pages and sub-sections [17:28:10] Wanderer_, this is not what MediaWiki is designed for. You may be able to use the Lockdown extension. If not, you'll have to either rethink whether you really need technological barriers here, or make two separate MediaWiki installations, or use different software.
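The core-only part of that advice looks roughly like the sketch below. $wgGroupPermissions and $wgWhitelistRead are real core settings; the commented-out Lockdown line and the NS_SALES constant are assumptions meant only to illustrate the per-group idea, so check the exact syntax on the Lockdown extension's page:

    # LocalSettings.php -- lock the whole wiki down to logged-in users
    $wgGroupPermissions['*']['read']          = false;  // anonymous visitors can't view pages
    $wgGroupPermissions['*']['edit']          = false;
    $wgGroupPermissions['*']['createaccount'] = false;  // accounts get handed out by a sysop
    $wgWhitelistRead = array( 'Main Page', 'Special:Userlogin' );  // still readable when logged out

    # Per-namespace, per-group restrictions (sales vs. technical) need the Lockdown extension,
    # roughly along these lines -- verify the variable name against the extension's docs:
    # $wgNamespacePermissionLockdown[NS_SALES]['read'] = array( 'sales', 'sysop' );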
[17:28:55] Making three wikis would probably be least pain [17:35:07] Dashiva: 3> [17:35:09] *? [17:35:30] One shared and one for each group's private stuff [17:35:35] MinuteElectron One for each separate area and one for the common stuff [17:35:43] ok [17:40:31] 03dale * r29466 10/trunk/extensions/MetavidWiki/includes/ (4 files): better support for externally hosted stream images [17:40:42] 03tstarling * r29472 10/trunk/phase3/includes/filerepo/RepoGroup.php: Revert revert of setSingleton(), unrelated to broken, accidentally committed code in FSRepo.php. [17:40:55] 03brion * r29469 10/trunk/backup/worker.py: [17:40:55] * --force option to unlock a locked manually-selected wiki [17:40:55] * --date option to force a prior date, such as when recontinuing an aborted dump [17:40:55] * --checkpoint option to skip dump elements prior to the specified one. [17:40:55] Class name or class name.option (eg XmlDump.articles) [17:41:20] 03brion * r29468 10/trunk/phase3/maintenance/dumpTextPass.php: tweak paths more [17:45:45] 03aaron * r29471 10/trunk/extensions/ConfirmAccount/ConfirmAccount_body.php: Less crack this time ;) [17:48:07] 03brion * r29474 10/trunk/phase3/ (RELEASE-NOTES includes/EditPage.php): * (bug 12553) Fixed invalid XHTML in edit conflict screen [17:52:27] 03siebrand * r29502 10/trunk/extensions/ (4 files in 2 dirs): [17:52:27] * use wfLoadExtensionMessages for SiteMatrix [17:52:27] * add version and url in extension credits [17:52:27] * update Translate extension [17:54:45] 03yaron * r29503 10/trunk/extensions/SemanticDrilldown/languages/SD_LanguageEn.php: Re-added special property values [17:58:39] 03yaron * r29504 10/trunk/extensions/SemanticDrilldown/ (skins/SD_main.css specials/SD_BrowseData.php): More CSS improvements [17:59:20] 03yaron * r29505 10/trunk/extensions/SemanticDrilldown/ (INSTALL README includes/SD_GlobalFunctions.php): New version: 0.3.1 [17:59:34] 03tstarling * r29480 10/trunk/extensions/LabeledSectionTransclusion/ (lst.php lstParserTests.txt lsth.php): (log message trimmed) [17:59:34] MW 1.12 version of #lst/#lstx. Uses the document tree form of the template to [17:59:34] find
tags. No need for headingOffset since PPFrame::expand() adds the [17:59:34] appropriate heading markers. The new preprocessor lacks the capricious stripping [17:59:36] of trailing whitespace present in the old LST, so I updated the parser tests to [17:59:38] suit. My other parser test changes are trivial, and all tests pass. [17:59:40] I didn't use XPath in the end, because I really want to migrate away from DOM at [18:04:44] meh [18:04:46] Tim committing stuff [18:04:49] and not showing up here [18:05:40] ah, thats some old message [18:06:06] domas Got time for an off-topic question? [18:06:08] About performance of MySQL vs. MSSQL [18:06:19] I'm not that much of an expert of mssql [18:06:20] :) [18:06:48] Do you have any idea how the performance compares? [18:06:59] sure mysql is faster! [18:07:03] =) [18:07:21] but mssql is useful. ;) [18:07:25] I was in a project handover meeting the other day, and was a bit surpised to see the hardware requirements for MSSQL was so big. [18:07:55] well, I've seen big mysql boxes too [18:08:16] 2 DUAL-core Xeons and 4 GB memory to do exactly the same as a project I once made, running on a flat database on a P-II with 512MB [18:08:32] And the database was pretty stressed to perform as well. [18:08:56] So I'm wondering if the non-enterprise version of MSSQL is intentionally dumbed down somehow [18:09:41] or it was run by an idiot [18:09:52] I don't think so [18:10:04] i've configured small oracles instances on my desktop that perform comparably to mysql, i'd be surprised if mssql was different [18:10:08] -s [18:10:17] The project had used a consultant to get to that point [18:11:01] We're talking a database with 500k records tat are continiously changed. [18:13:25] if i indent inside a nest of ifs, am I going to get that whitespace considered as wikitext? [18:13:38] http://pastebin.com/m353e8102 [18:13:40] fr example. [18:13:45] Wegge, maybe the consultant was incompetent. Have you actually seen the table structures and queries? [18:14:00] Yes [18:14:19] And they're all perfectly correct and sane? [18:14:22] Seems surprising. [18:14:27] We've been doing this for ages, so I'd be surpirised if it was that bad. [18:14:35] You're not cynical enough. [18:14:45] Maybe not [18:14:57] Most people have no idea how to run databases efficiently, and that probably includes a high percentage of self-appointed consultants. [18:15:04] Experience doesn't necessarily equate to knowledge. [18:15:10] I'm tempted to make a trial on a MySQL instance, just to see if there is any real difference. [18:15:24] With the same code, syntax converted to MySQL? [18:15:33] Would be interesting. [18:15:39] when you say 'non-enterprise version', do you mean the developer edition, or something else? [18:15:47] Of course MySQL can be faster because it has no features, especially 4.0! [18:15:52] We don't need silly stuff like subqueries! [18:16:08] Yes, otherwise it would not be a usefull comparision [18:16:38] I mean the product you get when you just order a Microsoft SQL server 2005, without specifying extra words like Enterprise &c [18:16:45] Not the developer version [18:19:38] alalalala [18:19:41] I'm hunting for parser bug [18:19:55] hello [18:20:05] 03siebrand * r29506 10/trunk/extensions/ (4 files in 2 dirs): [18:20:05] * use wfLoadExtensionMessages for Filepath [18:20:05] * add version in extension credits [18:20:05] * update Translate extension [18:20:19] domas Which one? [18:20:25] what is the new log in german "Verschiebelogbuch"? [18:20:48] no. 
Vereinigungslogbuch [18:20:49] sry [18:21:13] heuler06 Sounds like a log for the merging of page histories [18:21:40] but it's not possible to do that with only one action? [18:21:48] It is now [18:21:50] could someone take a look at a template i just made and see if i can shorten it somehow? [18:22:08] ah [18:22:11] [[Special:Mergehistory]] or somesuch [18:22:34] http://pastebin.com/m79826672 [18:22:45] i wanted more whitespace to ease readability, but the whitespace became wikitext. [18:22:51] thx, wegge [18:22:54] cu [18:22:59] you call it like {{Death by|axe|monkey|lower=1}} [18:23:55] Morbus, you can put as much whitespace as you want around template parameters. That gets stripped. [18:24:02] Or should. [18:24:18] Simetrical: http://pastebin.com/m353e8102 [18:24:22] that was my first attempt, and it made it a pre [18:24:36] *Simetrical does have to wonder about someone who needs a template to accommodate all the different categories of images of Death by X [18:24:38] i was hoping whitespace was stripped between | | [18:24:49] Hmm. [18:24:55] It's supposed to be. [18:24:57] I think. [18:25:14] Simetrical: horror wiki. there's a lot of different deaths ;) [18:25:45] Simetrical: supposed to be stripped? hrm [18:25:52] I think so. [18:25:57] Maybe it's not done properly. [18:26:33] there's no shortening tips to my second pastebin though? that's just the way it's gotta be? [18:26:47] ok, the problem [18:26:57] any magic word used decreases parser cache time to 1h [18:26:58] i want to accept up to five descriptors for one death scene, and then offer named booleans of "lower[case]" or "display" [18:27:21] any solutions? :) [18:28:29] brion: would people be very unhappy if we had magic word-screwed parser cache increased to one day, instead of one hour? [18:28:49] ARE YOU KIDDING?! WHAT J00 SAY?! [18:28:51] *Morbus spittakes. [18:29:33] domas Do we have {{CURRENTHOUR}} ? [18:29:47] we have stupid stuff [18:29:57] currenttimestamp [18:30:12] currenthour [18:30:15] numberofedits [18:30:17] etc [18:30:38] ok, I add a feature to control that [18:30:40] caching [18:30:45] and set it to 1d on live site [18:31:00] \o/ [18:31:18] So things like {{CURRENTHOUR}} are actually never more than an hour off? [18:31:24] yup [18:31:30] that sucks, eh? [18:31:38] well, stuff is still cached in squids [18:31:53] and there magicwords don't matter [18:31:58] Hello [18:32:31] well, i think the difference is that squid is supposed to cache generated content, and mediawiki is supposed to generate as active a content as possible. [18:32:41] $wgParserCacheDynamicPageExpireTime; [18:32:42] \o/ [18:32:52] well, if someone is really unhappy [18:32:54] Don't you dare. [18:32:59] we can add exclusions [18:33:02] Half the length of that variable name, please. [18:33:17] $wgPCachDynPExp [18:33:20] nah [18:33:20] lol [18:33:22] ++ [18:33:23] I like long names [18:33:41] I like descriptive names, but not *excessively* long names. [18:33:49] domas $wgParserCacheExpireTime = 86400; [18:33:51] this is not excessively long [18:34:01] I have 23" monitor [18:34:03] Wegge, no, it's for dynamic pages only. [18:34:06] and my editor isn't even maximized [18:34:17] Wegge: wgParserCacheExpireTime exists, yes [18:34:27] the code now looks like: [18:34:27] if( $parserOutput->containsOldMagic() ){ [18:34:27] $expire = $wgParserCacheDynamicPageExpireTime; # 1 hour [18:34:27] } else { [18:34:27] $expire = $wgParserCacheExpireTime; [18:34:28] } [18:34:28] Simetrical: ok, so the difference seems to be the starting whitespace. 
Compare http://pastebin.com/m353e8102 to http://pastebin.com/m177a72be [18:34:31] $wgParserCacheDynamicExpire would be marginally better. [18:34:38] that's my first non-working, and my third working attempt. [18:34:48] the third working is marginally better, but forces the delimiters to the start of the line. [18:35:16] i can't get rid of the nested }}'s on line 6 though [18:35:18] domas, that variable name alone is half of an 80-char line. [18:35:48] Is it possible to create a new Article instance with some prepared content in it and launch edit page on that, from an extension ? [18:36:05] domas Ideally we do away with all that, and offload the work to some clientside .js [18:36:09] I'm pretty sure EditPage.php is hardcoded to use $wgArticle everywhere. [18:36:29] Wegge, relying on JS unnecessarily is not acceptable. [18:36:34] As it is now, the squids will happily keep an outdated page [18:36:49] Relying solely isn't [18:37:20] But oflloading that kin of stuff will make >95% of the users happy [18:37:32] And the rest is already screwed by squid caching [18:37:56] you can't put most uses of magic words into js, though [18:38:10] like the main use of the current date stuff is including per-day pages [18:38:10] If we wanted to, we could [18:38:10] Right, you can't do something like {{#if: something generated by JS}}. [18:38:12] That too. :) [18:38:31] Question: I just did an edit on a page, hit save, all that... in the DB, page.page_latest = 1409, however, text.old_id only is going up to 1408... where is this text being stored in the meantime? [18:39:17] Andrew: text.old_id does not correspond to page_latest [18:39:25] page_latest = rev_id, rev_text_id = old_id [18:40:16] hmm, k, looked like page_latest was a foreign key for text... [18:40:38] AndrewBuis, it's explained in maintenance/tables.sql. [18:42:18] building a program to go through and basically do a search on our web pages for words in our wiki, create a link to them as well as have title=(text of the wiki page) [18:42:59] 03catrope * r29507 10/branches/ApiEdit_Vodafone/includes/ (42 files in 3 dirs): [18:42:59] ApiEdit_Vodafone: Partial svnmerge of latest revisions from trunk. [18:42:59] * Merge split up to prevent hanging mail clients [18:43:07] domas: http://lists.wikimedia.org/pipermail/foundation-l/2008-January/037280.html [18:43:39] 'But now it was just taking a piss at people doing their job and trying to get projects supported, extended and elevated.' [18:43:50] 03catrope * r29508 10/branches/ApiEdit_Vodafone/ (12 files in 6 dirs): ApiEdit_Vodafone: svnmerge part 2 [18:43:54] hmm, not sure about that [18:44:02] certainly a possibility ;) [18:45:18] 03catrope * r29509 10/branches/ApiEdit_Vodafone/languages/messages/ (59 files): ApiEdit_Vodafone: svnmerge part 3 [18:46:18] 03catrope * r29510 10/branches/ApiEdit_Vodafone/languages/ (54 files in 3 dirs): ApiEdit_Vodafone: svnmerge part 4 [18:47:33] I thought API edit was already merged into trunk? [18:47:48] Ehh this was svnmerging from trunk TO apiedit_VODAFONE [18:48:03] Meh, even hangs my mail client for a while when I split it in four [18:48:11] Doesn't hang as long, though ... [18:48:38] get a better mail client ;) [18:49:07] Well it's pretty much Thunderbird's only disadvantage [18:49:27] Is it possible to create a new Article instance with some prepared content in it and launch edit page on that, from an extension ? 
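A rough sketch of what an extension could do for the question just asked, using class and member names from EditPage of roughly this era; treat it as illustrative and check it against the class docs linked in the replies below. The page title and comments are made up:

    $title   = Title::newFromText( 'Project:Prepared draft' );  // target page (placeholder)
    $article = new Article( $title );
    $editor  = new EditPage( $article );   // the constructor takes an Article, as noted below
    $editor->edit();                       // shows the edit form, or processes a submission
    # Pre-filling the textarea for a page that doesn't exist yet is usually done with the
    # &preload= URL parameter or the EditFormPreloadText hook rather than by poking
    # EditPage members directly.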
[18:49:29] Native support for multiple mail accounts is important for us (4-person family, each with a different account) [18:49:44] Telemac you can't create an Article with some content in it [18:50:03] Err to whoever said that: EditPage.php uses $wgArticle *nowhere* [18:50:05] i changed from TB to the bat recently, as well as being faster i find it a lot less annoying :) [18:50:23] RoanKattouw, oh, that's nice. [18:50:31] *Simetrical just uses Gmail's web interface [18:50:33] Telemac you can just create an EditPage object and set its $mArticle member [18:50:53] Oh wait you already have to pass an Article object to EditPage's constructor [18:51:07] Telemac: You can also check out the docs at http://svn.wikimedia.org/doc/classEditPage.php [18:51:50] RoanKattouw: thanx [18:53:23] ok, where was I [18:53:36] alternative is to maintain list of deterministic/nondeterministic magic words [18:53:36] mornin [18:53:41] hi brion! [18:53:53] I'd use Tim as authority, but you fit too. :) [18:53:59] \o/ [18:54:07] brion: {{MAGICWORDS}} make parsercache work just for one hour [18:54:20] and {{MAGICWORDS}} are so much abused in all templates, that generally whole enwiki has this problem [18:55:09] so, is bumping up that hour very bad thing [18:55:15] The downside is that caching content using magic words too long makes it outdated [18:55:26] thats upside! :) [18:55:40] domas: was this newly added? [18:55:46] It's kind of lame when some page tells you Election Day is 3 days away because the cache is only purged every week... [18:55:49] brion: probably not [18:55:52] i don't remember us actually having such a thing [18:55:55] RoanKattouw: allo [18:56:00] AzaTht: Hi [18:56:10] brion: I extended memcached arena to 160gb today [18:56:15] woot [18:56:17] from previous 50g [18:56:20] RoanKattouw, you can manually purge the page if necessary. [18:56:24] I know [18:56:30] With a bot or whatever. [18:56:33] and now I'm sad, as most of objects have insanely short TTLs because of deterministic magic words [18:56:47] RoanKattouw: have a couple of things regarding the api (as usual) [18:56:48] But that results in a lot of manual purges, and essentially accomplishes nothing save for a lot of annoyances ;) [18:56:49] Maybe we should allow some tag like that takes cron-style input, and purges at those intervals. [18:56:53] I thought you would [18:56:56] Hi I have a Mediawiki Site which has a problem. Users cant change existing pages anymore, but can create and change new ones? What could be wrong? [18:56:58] Except not more than, say, once an hour. [18:57:10] RoanKattouw: hey, you never here [18:57:19] petzlux: After they've created them, can they edit them a second time? [18:57:22] Even I as Admin has the same problem [18:57:38] RoanKattouw: Yes, a newly created page can be edited fine [18:57:39] RoanKattouw: prop=imageinfo, iihistory should be changed into iilimit imo [18:58:06] check out www.manipedia.eu , thats the wiki with the problem [18:58:26] RoanKattouw: I would be happy to be able to use backlinks as a prop in an generator [18:58:32] AzaTht: Change iihistory to iilimit? Why? [18:58:38] RoanKattouw: makes sense [18:58:47] Backlinks as prop? Backlinks is a list... What do you mean? [18:58:54] Its frustrating as there is no error message you just get bumped back to the main index page [18:58:58] AzaTht: You mean iilimit defaulting to 0? 
[18:59:16] yea, or 1 [18:59:46] would make sense to have same sort of limits as other revisions related lists [19:00:22] RoanKattouw: why would backlinks be more of a list than langlinks? [19:00:57] AzaTht: backlinks lists pages, like allpages and friends do. Langlinks lists links, like prop=links, prop=templates and friends do [19:01:00] Makes perfect sense [19:01:06] damn [19:01:19] Keep in mind that API modules are often categorized based on what they return (pages, revs, other stuff...) [19:01:27] goddamn FUCK [19:01:27] The iilimit thing is on my todo list [19:01:37] spotlight seems to reindex my drive *every fucking time* the machine dies [19:01:41] RoanKattouw: isn't a template a page? [19:01:44] *brion stabs apple [19:02:01] AzaTht: Yes, but a langlink isn't [19:02:19] domas: we should get Tim to make quantum magic words [19:02:37] Also, prop=links|langlinks|templates|categories is about stuff *on* the page, list=backlinks|embeddedin|imageusage is about stuff on other pages *referencing* the page [19:02:41] RoanKattouw: but templates is a prop [19:02:42] I guess that's the logic [19:02:56] RoanKattouw: can understand that [19:03:16] brion: enjoying your non-MS box? [19:03:18] petzlux: ALL clicks on preview or edit throw me back to the main page [19:03:19] 03brion * r29476 10/trunk/phase3/ (RELEASE-NOTES includes/RawPage.php): [19:03:19] * (bug 12505) Fixed section=0 with action=raw [19:03:19] PHP's fuzzy type comparisons strike again! A check for $this->mSection != '', [19:03:19] probably intended to protect against an unset value, matched for integer values [19:03:19] of 0 as well. (The fun part is that 0=='' and 0=='0' but '0'!='' :) Since the [19:03:20] parameter is validated through getIntOrNull(), only the null check is necessary [19:03:24] here. [19:03:25] come back to the dark side [19:03:27] but it's rather cumbersome when you have a genrerator that only return one page to be unable to connect list on that page directly [19:03:33] 03dale * r29475 10/trunk/extensions/MetavidWiki/ (9 files in 4 dirs): updates to web interface to manage stream files [19:03:34] It is the only way! buhahaha!!!! [19:03:36] Err brion didn't you commit that one earlier? [19:03:40] HUH?!? [19:03:44] CIA-40 has gone crazy [19:03:51] Reports 29476 before 29475 [19:03:52] RoanKattouw: Okay, but this is recent behaviour, thats what I dont understand ? [19:04:04] wow, yet another fuzzy comp. bug [19:04:06] RoanKattouw: didnt change anything knowlingly [19:04:16] petzlux did you change anything recently? Stuff in LocalSettings.php, updated to a new version, that sort of thing? [19:04:16] AaronSchulz, PHP is great that way, huh? [19:04:21] RoanKattouw: the thing I has is I from a category grabs 1 image [19:04:24] That's like what, four in the past year? [19:04:29] Not like we're a very big project, either. [19:04:44] *RoanKattouw is still angry at PHP for WONTFIXing $wgUser->getEffectiveGroups()[0] [19:04:46] Simetrical: heh, don't rank on PHP :( [19:04:53] it's lazy programming [19:04:59] No it's not. [19:05:02] It's a stupid language. [19:05:09] AaronSchulz: haven't had an MS box of my own in a long long time :) [19:05:22] RoanKattouw: would be rather useful to use lists as props in some sort of generator [19:05:38] *AaronSchulz hugs PHP :'( [19:05:46] Use lists as props? But there isn't really a *difference* between lists and props, is there (other than that props generally don't have limits) [19:05:50] AaronSchulz, have you tried Python or Perl? [19:05:58] Or Ruby, or any other scripting language? 
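For readability, the parser-cache expiry branch domas pasted a bit earlier in the discussion, reflowed as a sketch. The two globals are the ones named in the log; the enclosing function is not shown there, so the surrounding context is an assumption.

// Sketch of the quoted expiry choice, reformatted; not new behaviour.
if ( $parserOutput->containsOldMagic() ) {
    // page used a dynamic magic word such as {{CURRENTHOUR}}: short TTL
    $expire = $wgParserCacheDynamicPageExpireTime;   # e.g. 3600 (1 hour)
} else {
    // no dynamic magic words: normal, longer TTL
    $expire = $wgParserCacheExpireTime;              # e.g. 86400 (1 day)
}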
[19:06:18] AzaTht: Oh you mean like do generator=something&list=backlinks ? [19:06:21] and about the user thingi, what about an usergenerator, that for example can be used on revisions and allusers, and can have props like groups, contribs etc... [19:06:26] RoanKattouw: yea [19:06:29] That *used* to work, but doesn't anymore [19:06:35] I know :( [19:06:38] AzaTht: The userthingy is on the long-term todo list ;) [19:07:01] 'Cause backlinks only lists backlinks for one page, and a genny can return more than one page (or none at all) [19:07:03] brion: people should use === and such for strong comps. [19:07:08] *simetrical [19:07:33] RoanKattouw: the issue here is that I for the generator generates ONE page ツ [19:07:43] I know [19:07:47] But that's not always the case [19:07:49] brion: Come on, don't you want to tast some sweet Microsoft love again? [19:07:57] What should the software do when the genny returns multiple pages? [19:08:01] AaronSchulz, if I say $var == '', could that ever possibly reasonably be interpreted as including 0? [19:08:11] RoanKattouw: prop=categories, dunno if it's a bug, but the clprop doesn't work when using the prop as an generator [19:08:16] gclprop then [19:08:20] AaronSchulz, if I wanted to match 0 or '' or any variant thereof, I would just use if( !$var ). [19:08:26] AzaTht: That's right [19:08:35] PHP's handling of comparisons makes no sense, which is why it causes bugs like this on a regular basis. [19:08:43] damn [19:08:44] fuck [19:08:44] shit [19:08:48] RoanKattouw: ok [19:08:52] even {{cite}} breaks caching [19:08:57] argh [19:09:00] frustration :) [19:09:00] Simetrical: 0 == "" is fucked up, yeah [19:09:10] 'Cause if you do like generator=categories&gclprop=something&prop=links , all the ApiQueryCategories module returns is an array of pageids that prop=links works its magic on [19:09:21] AaronSchulz, that's the whole problem I'm objecting to. 0 == "any arbitrary string that doesn't start with a number" is even more insane. [19:09:24] There's no way to pass that clprop stuff along [19:09:31] ok [19:09:35] domas: try to keep all of your curse marbles in a row, kthx :) [19:09:37] I think MagicWord.php needs new method, telling the TTLs for each of them [19:09:40] fuck [19:09:41] shit [19:09:41] damn [19:09:42] :) [19:10:19] domas: do that at CZ [19:10:27] domas: none of those things touch caching unless someone added a new feature for that [19:10:29] RoanKattouw: for imageusage, would it possible to avoid bailing out if the given title isn't an image [19:10:30] so first find the feature [19:10:33] then look at it [19:10:34] they delete and replace all incivility with a fun template [19:10:42] Err AzaTht: why [19:11:01] imageusage only takes one title right? [19:11:02] as categorymembers doesn't bail out if the given title doesn't exists [19:11:09] Simetrical: huh? [19:11:12] brion: http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/ParserCache.php?view=diff&r1=14200&r2=14201 [19:11:22] oh wait, wrong one [19:11:28] an backlinks etc... [19:11:31] wrong [19:11:33] again [19:11:36] domas [19:11:37] AaronSchulz, aryeh@aryeh-desktop:~$ php -r "var_dump( 0 == 'x' );"bool(true) [19:11:38] strange, annotate shows that [19:11:43] domas: :D [19:11:44] AzaTht: That's because non-existent categories/pages can still be linked [19:11:46] Er, it mashed the output on the end. [19:11:55] Non-existent images can never be included, though, only linked [19:12:05] AaronSchulz, in comparing a string to a number, the string is cast to a number. 
Strings that don't start with a number are cast to zero. [19:12:16] RoanKattouw: true, but when it bails, it stops other queries as well [19:12:22] Wow, 14201, that's an oldie [19:12:27] Ergo, if you type $var == 0, it will return true for almost any string whatsoever. [19:12:27] I have list=imageusage|backlinks [19:12:31] Right [19:12:42] Hmm, I guess that could be done [19:12:42] Simetrical: really? [19:12:44] if imageusage bails, backlinks won't return anything as well [19:12:46] I can't believe that [19:12:52] AaronSchulz, look at the docs. [19:12:58] AaronSchulz, or try it yourself. [19:12:59] brion: does this heathen speak Teh truth? [19:13:06] AaronSchulz, just run php -r "var_dump( 0 == 'x' );" [19:13:09] brion: it was in virgin 2517 revision of ParserCache [19:13:10] AaronSchulz, bool(true). [19:13:11] AzaTht: I'm on the imageusage thing right now [19:13:34] ok [19:13:37] thanks [19:13:41] though back then it was [19:13:41] if( $parserOutput->containsOldMagic() ){ if ( $value ) { [19:13:41] 20 $expire = "1 HOUR"; # Delete if article has changed since the cache was made [19:13:41] 21 } else { if ( $value->getTouched() != $article->getTouched() ) { [19:13:42] 22 $expire = "7 DAY"; $wgMemc->delete( $key ); [19:13:44] 23 } [19:13:46] ;-) [19:14:01] oh, good old wfQuery() times [19:14:42] AaronSchulz, just run php -r "var_dump( 0 == 'x' );" [19:14:45] bool(true). [19:14:48] AaronSchulz, just run php -r "var_dump( 0 == '1x' );" [19:14:51] bool(false) [19:15:03] opps, extra cp stuff there [19:15:04] AaronSchulz, yup. '1x' == 1. [19:15:14] AaronSchulz: hehe [19:15:22] WHAT [19:15:23] THE [19:15:25] FUCK??? [19:15:26] '2007 was a great year for computing' == 2007. [19:15:33] Simetrical: yes [19:15:40] What about 2007 == '2007 rocked' ? [19:15:47] I am still having a hard time believing this [19:15:49] AaronSchulz, do you believe me now that PHP is a moronic language? [19:15:50] brion: so I can assume it is historical cruft and throw it out? :-D [19:15:51] RoanKattouw: don't think order matters [19:15:57] That was crafted by lazy idiots, for lazy idiots? [19:16:08] RoanKattouw, of course, that's true too. [19:16:09] I guess that when int and string are compared, the string is converted to a string [19:16:10] AaronSchulz: it's lazy evaluation [19:16:21] ", the string is converted to a string"? [19:16:23] RoanKattouw, the string is converted to the numeric type, yes. [19:16:26] Simetrical: Try '2007 rocked' - 2000 [19:16:27] plenty of good things are called "lazy", this is beyond that [19:16:30] To an int, sorry [19:16:34] hehe [19:16:37] '0' == 0 is good lazy [19:16:38] The rationale is that it makes sense to have '007' == 7. [19:16:41] " 2007" != 2007 [19:16:41] hehehe [19:16:42] http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/ParserCache.php?revision=2517&view=markup [19:16:52] 'meow' == 35 is not [19:17:00] 'meow' == 0, not 35. [19:17:05] thats how mediawiki used to look like :) [19:17:28] how do you set the default page to be something other than Main_Page [19:17:32] Simetrical: why isn't just removing forward zeros and then doing a '3' = 3 type of comp [19:17:35] AzaTht: Are you having trouble passing Image:Nonexistent to imageusage, or Talk:Nonexistent ? Cause the latter isn't an image [19:17:37] AaronSchulz, 0 == '0' is arguably good. Depends how strong you want your typing to be. [19:17:46] AaronSchulz, because they're morons. [19:17:49] Simetrical: opps, typo [19:18:00] And now, of course, it's too late to change it! Backward compatibility! 
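The behaviour being complained about is easy to reproduce; a quick demonstration matching the var_dump runs above (PHP 4/5-era semantics):

// Loose vs strict comparison in PHP of this era.
var_dump( 0 == 'x' );               // bool(true)  - 'x' casts to 0
var_dump( 1 == '1x' );              // bool(true)  - '1x' casts to 1
var_dump( 0 == '1x' );              // bool(false)
var_dump( 0 == '' );                // bool(true)
var_dump( '0' == '' );              // bool(false) - two strings, compared as strings
var_dump( 2007 == '2007 rocked' );  // bool(true)
// Strict comparison sidesteps the implicit casts:
var_dump( 0 === 'x' );              // bool(false)
var_dump( 0 === '' );               // bool(false)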
[19:18:01] RoanKattouw: probably just the latter [19:18:07] Let's add even more completely broken features! [19:18:09] domas: if you find actual active cache expiry in stuff we use, first *look at it*, then *see what it actually does*, then see if it needs to be adjusted [19:18:11] I mean '003' == '3' is pushing it, but ok [19:18:15] AaronSchulz, look at the __get and __set functions in PHP 5. [19:18:19] They're totally worthless. [19:18:19] but strings are just not zero [19:18:26] how do you set the default page to be something other than Main_Page ? [19:18:27] unless it is '0' [19:18:38] RoanKattouw: yea, the latter [19:18:38] Instead of == '' you should just use empty() [19:18:41] brion: well, say {{cite web}} uses {{NAMESPACE}} in it [19:18:43] Because if you have one __get or __set function nested inside another, the ones higher on the stack ALWAYS RETURN null. [19:18:48] AzaTht: Can't blame imageusage in that case [19:18:56] brion: and {{NAMESPACE}} immediately makes parser cache 1-hour long [19:19:05] RoanKattouw: true, but can you do it so that only the active list bails? [19:19:10] and not everything else [19:19:14] brion: most of enwiki templates use magic words [19:19:24] NAMESPACE should not have any affect on caching whatsoever [19:19:29] brion: indeed [19:19:31] brion: but it does [19:19:35] then find and fix it [19:19:36] jesus [19:19:39] AzaTht: Err well, since you're just requesting your stuff on one title, can't you check for yourself whether it's an image first? [19:20:13] brion: Do moves purge the parser cache? Cause {{NAMESPACE}} can be changed by a move... [19:20:16] That what I have to do now [19:20:22] wow. that 1 == '1x' is amazing. i know. i am too late. but thats hard. [19:20:31] i have 1.5.5 mediawiki installed when i try to make templates with with named parameters .. its does not work but when i do the numbered paramters its works .. does anybody know what am i missing here [19:20:32] brion: well, that means refactoring all magicwords \o/ thanks [19:20:37] RoanKattouw: they should. [19:20:39] PunkRock, not as bad as 0 == 'The cow jumped over the moon'. [19:20:41] RoanKattouw: http://en.wikipedia.org/wiki/User:AzaToth/twinkleunlink.js [19:20:52] since they update page_touched they'll invalidate any cache [19:21:01] Did 1.5.5 support named parameters to templates? [19:21:12] *AaronSchulz is still emotionally attached to PHP [19:21:12] argh. [19:21:29] AaronSchulz, try writing a nontrivial app in Python sometime. [19:21:34] PunkRock: actually, same in quite a few places [19:21:38] i mean how many ppl. use those comparisons. man. php sucks so much. [19:21:46] mysql> select "1x" = 1; [19:21:46] +----------+ [19:21:46] | "1x" = 1 | [19:21:46] +----------+ [19:21:46] | 1 | [19:21:47] +----------+ [19:21:50] RoanKattouw: it's still that you musn't specify Category: for categories [19:21:54] it's worse then i have ever thought. [19:21:57] you just have to know which types are stronger [19:21:59] which are weaker [19:22:03] AzaTht: Starting at if(wgNamespaceNumber == Namespace.image) , aren't you duplicating code? Can't you just run the else { } block and then do query['list'][1] = 'backlinks'; etc. ? [19:22:08] domas, or you could use strong typing all the time. [19:22:11] any dynamic environment has this problem [19:22:20] >>> print 0 == '0' [19:22:20] False [19:22:23] Python doesn't. 
[19:22:45] Simetrical: well, "dynamic typing" [19:22:49] or what is the term [19:22:57] RoanKattouw: perhaps, but still some sort of consistent behavour would be useful [19:23:20] RoanKattouw: list=categorymembers should also bail [19:23:20] Err consistent behavior? Is this about the Category: stuff, or about code duplication? [19:23:21] domas: I don't mind dynamic typing, but this is just stupid weak [19:23:25] yea [19:23:35] AzaTht: No, non-existent categories can have members [19:23:55] RoanKattouw: yea, but a category must also start with Category: [19:24:44] Well not if the Category: prefix is omitted, which categorymembers does (e.g. cmcategory=Physics rather than cmcategory=Category:Physics ) [19:24:49] and something more, I would recommend combining action=render and action=parse [19:24:57] AaronSchulz: well, as I've said - seen that elsewhere :) [19:25:07] can you not then ommit Image: prefix for images? ツ [19:25:11] then fuck elsewhere [19:25:12] AzaTht: Those are not my brainchilds, I'll look at them briefly [19:25:17] *AaronSchulz fucks elsewhere [19:25:17] ok [19:25:44] and yea, able to use action=render|parse|query ツ [19:25:47] how do you make a page redirect to another page? [19:26:11] topbloke: Create it with #REDIRECT [[Target page]] as its content [19:26:17] ok thanks! [19:26:28] *AaronSchulz gets annoyed [19:26:31] brion: grrr...did you ever get in the mood where you just want to "fuck everything"? [19:26:32] AzaTht: Yes, render should go, it's redundant, parse returns the same [19:26:37] RoanKattouw: you said list was for pages? [19:26:37] Perhaps parse should have a prop param [19:26:46] yea [19:26:48] Mostly [19:26:51] list=logevents [19:26:53] Sometimes users as well [19:26:59] And revisions (list=recentchanges0 [19:27:30] But I guess the difference is made by whether stuff is on the page (prop), or on other pages referencing the page (list) [19:27:40] ok [19:28:14] it would be extreemly useful to be able to generate lists from pageids [19:28:29] could somebody help me with named paramater templates [19:28:41] xerophytev: What's your question? [19:28:49] AzaTht: Can you give me a concrete example? [19:29:06] Simetrical: http://findmagiccards.com/Cards/LW/Pestermite.html [19:29:11] ooooh cute [19:29:19] RoanKattouw: i have 1.5.5 mediawiki installed when i try to make templates with with named parameters .. its does not work but when i do the numbered paramters its works .. does anybody know what am i missing here [19:29:19] RoanKattouw: http://en.wikipedia.org/wiki/User:AzaToth/twinkleimagetraverse.js [19:29:25] the basequery there [19:29:42] to be able to add a generated list of imageusage [19:29:51] RoanKattouw: do you know why would the named parameter does work [19:30:23] xerophytev: Maybe named parameters were added in a later version of MW, 1.5.5 is really ancient (we're at 1.10.0 now) [19:30:37] RoanKattouw: we are at 1.11 now, actually [19:30:38] RoanKattouw: hmmmm [19:30:44] Typo [19:30:48] :P [19:30:50] hehe [19:31:17] AzaTht: Again, what happens if someone forgets gcmlimit=1 and generator=categorymembers returns multiple images? [19:31:36] any comments on TTLs?: http://p.defau.lt/?wkcR8mm_V_AABHZCjKXcyg [19:31:36] ;-) [19:31:46] either it goes on with their buisness, bails or limits to the first image [19:31:53] or should I just use 3600 for anything dynamic [19:32:08] AaronSchulz: i'm a one-woman man ;) [19:32:16] booooring! 
[19:32:29] brion: Dr House kicks your ass [19:32:48] AzaTht: That's kind of a makeshift solution [19:32:51] RoanKattouw: and yea, I would recommend changing prop to gprop [19:32:58] Why's that? [19:33:23] so you could be able to have a ordinary prop and a genrator is the same query [19:34:46] so when using a glist, the gfoolimit is limited to 1 [19:36:21] So rather than saying generator=categorymembers&list=imageusage you want generator=categorymembers&glist=imageusage , do I get that correctly? [19:39:35] yea [19:39:59] and an request for imageinfo: [19:40:13] I would like to be able to get the url for specific size of an image [19:40:19] Well that's unnecessary parameter clutter. I could look into the list generator thing though [19:40:28] Hmm [19:40:37] I don't really know how that image scaling stuff works [19:40:51] But I'll look into it when I have time; noted on my ever-growing TODO list [19:41:03] Actually, I think you're responsible for my creating such a list in the first place ;) [19:41:04] RoanKattouw: it isn't much easier to generate such an url owsite ツ [19:41:24] RoanKattouw: hehe [19:42:14] RoanKattouw: and yea, how is edit going? [19:42:28] hello.. i find info about loadbalance in mediawiki with mysql !!! any link ?? [19:42:57] AzaTht: Pretty busy with non-programming stuff these days [19:43:01] ok [19:43:08] So not much happening with action=edit right now [19:43:14] :( [19:43:27] I have looked at the Vodafone code though (seems to be unmaintained), and it throws fatal erros [19:43:43] After fixing the stupidities, there's still an obscure one I can't locate within 5 minutes [19:44:06] The module looks like a mess anyway, so I intend to rewrite it. Their modifications of EditPage.php were all right though [19:44:14] ok [19:44:31] yurik isn't doing anything theese days? [19:44:44] Yurik hasn't done much for a long time [19:44:58] I see [19:45:02] Not that I blame him: he wrote query.php and the api.php framework [19:45:10] And most of the ApiQuery* modules [19:45:18] yup [19:45:56] there was someone pointing out that nolanglinks from query.php doesn't have an quivalent in aip [19:46:19] "nolanglinks", what's that? [19:46:40] "Enumerates pages without language links to the output list (automatically filters out redirects)" [19:47:12] Right [19:47:20] Sounds like an API moduel [19:47:21] apfilterlang [19:47:23] *module [19:47:28] perhaps [19:47:29] On the TODO list [19:47:51] No, allpages only enumerates one namespace [19:47:57] I assume nolanglinks enumerates all? [19:48:14] yea [19:48:35] I guess it's gonna be a list module then [19:49:23] and it was the txt format in query.php that is missing by some [19:49:33] txt format? [19:49:36] print_r() output [19:49:41] Right [19:49:51] *RoanKattouw opens TODO list again [19:49:55] and dbg [19:49:59] to some extent [19:50:06] PHP source code using var_export() [19:51:02] other than that I think query.php can be fully deprecated now [19:52:02] RoanKattouw: a question: prop=extlinks says no interwiki links [19:52:16] what can give interwiki links? [19:52:26] prop=links? [19:52:35] Maybe [19:52:54] prop=extlinks only returns like like [http://www.example.com/ this] [19:53:16] http://en.wikipedia.org/w/api.php?action=query&prop=links&titles=Main%20Page [19:53:22] doesn't seems it have iw links [19:53:49] oh damn [19:54:04] where is Tim when I need him :) [19:54:08] answer: sleeping [19:54:11] domas: hehe [19:54:42] RoanKattouw: perhaps an even new modlue is needed... 
[19:55:05] AzaTht: I'm not sure iwlinks are even stored in the DB [19:55:10] hmm [19:55:20] that's strange [19:56:01] RoanKattouw: there are a table "interwiki" [19:56:04] Yes [19:56:09] But that's interwiki *bindings* [19:56:19] Interwiki links aren't stored, only interlanguage. [19:56:19] Like mapping IMDB: to www.imdb.com/tt$1 [19:56:23] This has been discussed before. [19:56:42] ok [19:57:15] so normal links, language links and external links are stored in the DB, but not interwiki links? [19:58:01] Yep [19:58:12] I don't see the logic either, but that's how it is [19:58:20] so using interwiki links should be avoided as much as possible? [19:58:28] No [19:58:35] They're easier [19:58:50] hmm [19:59:05] I mean [[wiktionary:Dog]] is easier than [http://en.wiktionary.org/wiki/Dog] [19:59:36] fishy [20:01:43] RoanKattouw: could something be implemented to give size of an category? [20:02:00] There's a lot of count stuff that could be implemented [20:02:06] Currently you have to just walk it I guess [20:02:14] yea [20:02:22] TODO list... [20:02:25] and that's not really optiomal [20:02:28] haha [20:03:04] No, it's not optimal, but with limit=5000 on your side, you shouldn't need too many requests [20:03:20] Domas: Is COUNT(*) on a huge category (like Category:Living people) expensive? [20:04:20] RoanKattouw, O(N) in the size of the category. [20:04:22] So yes. [20:04:35] Category:Living people takes a few seconds on the main servers, as far as I've measured it. [20:04:53] We should have a category table that contains this type of metadata. [20:04:57] Yeah [20:05:07] schema_change ... [20:05:16] And maybe even uses category_id instead of having to store NS+title for every single category in categorylinks. :) [20:05:18] Means my TODO list shrinks, though [20:05:35] Simetrical: Yeah [20:05:49] categorylinks is one of the biggest tables at present [20:05:49] . [20:05:51] Previously, that couldn't be done because nonexistent cats don't have a pageid [20:05:54] Of course [20:06:07] Pagelinks is also huge I think [20:06:07] Probably the biggest, except for maybe templatelinks (not counting external storage). [20:06:14] Yeah, all of those. [20:07:23] RoanKattouw: but now it also includes download all the data [20:07:32] I know [20:07:46] But I agree with Simetrical that we should have counters in the DB [20:07:51] me too [20:07:54] Like we do with editcount [20:08:05] Well, edit count is recent. [20:08:11] But like we do with site_stats, say. [20:10:21] Simetrical: Not that recent, though, it was already there when I got here ;) [20:10:39] RoanKattouw, recent as in the past six months, as I recall. [20:10:41] *Simetrical looks [20:10:57] Only been here for like 5 months [20:11:10] 1.9. [20:11:16] So that's older than I thought. [20:11:26] Wow [20:11:44] Like a year. [20:11:52] I came in just after the release of 1.6, IIRC. [20:12:01] Huh, I thought it was newer. [20:12:16] my hack is so ugly it makes me cry: http://p.defau.lt/?RUsNgya8gq0lR804rBBHcQ [20:12:46] I didn't run this code yet [20:12:46] :) [20:12:52] so it must have syntax errors [20:13:18] It always has [20:13:35] I'm very surprised when freshly written code runs without fatal errors [20:13:48] And check that I really saved the file :D [20:14:06] domas, doesn't look so bad. 
[20:14:13] ok, it does have syntax errors [20:14:20] anyway, the major problem is that pcache objects aren't compatible [20:15:34] heh [20:15:37] I could do much easier hack [20:15:43] that would not change formats [20:15:48] and keep boolean oldmagic [20:15:56] then just have a set of dynamic mwords [20:16:06] and cache all for 1h [20:16:17] watch would be 1/3 of this one. [20:16:20] *patch [20:17:15] domas, make sure people who don't like caching can set timeout to 60 for currentminute! \o/ [20:17:29] they can entirely disable pcache %) [20:33:46] Is there any way to detect that a page is transcluded into another page from a tag extension used on the first page? [20:34:39] agtbaldrick: wich tag ? [20:34:54] any tag extension [20:35:49] Something like "which page contains this tag that is now on the aggregated page" [20:37:23] dunno if it's possible [20:40:33] yay [20:40:35] easymode patch [20:40:35] http://p.defau.lt/?t7uXOxdtybeFV4w4zxXQRw [20:40:36] ;-) [20:44:06] oh, it even works [20:45:04] committing! [20:45:27] agtbaldrick can you give a concrete example? [20:47:36] A page "A" contains a tag , and a page "B" transcudes page "A" into itself. Can the tag-extension detect that it was used on page "A" but is now on page "B" [20:47:58] \o/ [20:48:00] 03midom * r29511 10/trunk/phase3/includes/ (MagicWord.php Parser.php Parser_OldPP.php): [20:48:00] set OutputPage::mContainsOldMagic only for specific listed magic words [20:48:00] For now the suggested times are ignored and base 3600 is used for all dynamic ones [20:48:00] This makes templates more friendly parser-cache wise. [20:48:30] agtbaldrick I think so [20:48:54] It works for links: links on a template are listed for each transclusion of that template in Special:Whatlinkshere [20:49:50] It is possible to add a parameter identifying the originating page, but it should if possible be done a little more generic (read there are some stupid users that shall use this tag) [20:50:36] I don't think so [20:50:37] Yes, I found a way to list them, but it seems fairly inefficient as there would be several queries [20:50:44] domas: I can't wait till Tim finishes the new parser [20:51:07] we wouldn't have to support the old one anymore/could kill it or something [20:51:45] at least my changes are fastest to hit live cluster [20:51:47] \o/ [20:53:01] hi :) [20:53:21] hi [20:53:41] did anyone code anything to support SQLite ? [20:53:48] No [20:54:16] hashar: SQLite doesn't handle properly ALTER TABLE i think [20:54:31] \o/ 29511 is live [20:54:42] no very good for schema updates [20:54:56] brion-away: pushed the fix live \o/ [20:55:19] "It is not possible to remove a column from a table." <)) [20:56:04] domas: I still see r29455 on testwiki:Special:Version [20:56:38] only MagicWork.php has been updated ? [20:56:48] RoanKattouw: why do people believe Special:Version is any important? :) [20:56:53] RoanKattouw: it shows scap version [20:56:57] not individual file revisions [20:57:14] So then you only svn up'ed one file? [20:57:38] (or a few, not the whole deal anyway) [20:58:12] yup! 
[20:58:16] I never do full scaps [20:58:20] but I love individual revisions [20:58:25] or even individual diffs [20:58:28] thats teh most evil [20:59:29] Yeah, if it depends on a previous rev it could fail to apply or even break stuff [21:00:46] quick question, we have a mediawiki instance with a custom namespace which I would like to be the default if you browse to one URL and main for the other (manual.foo.com would do the Manual: namespace, www.foo.com would do the main) -- is this possible? [21:01:11] RangerRick: You could probably do that with RewriteRules [21:01:47] default how? [21:02:17] You just want the main namespace to be called Manual? [21:02:21] Dashiva no [21:02:42] Dashiva: I want the manual namespace to act as main on manual.foo.com, and the main namespace to act as main on www.foo.com [21:02:47] He wants manual.example.com/wiki/Foo to go to www.example.com/wiki/Manual:Foo [21:03:01] You can probably accomplish that with RewriteRules [21:03:28] yeah, I'll give that a shot, although I suppose I'd need to reverse-proxy to rewrite the content coming back out of mediawiki, if I really wanted it to all be seamless [21:03:38] but if it's not, it wouldn't be the end of the world =) [21:03:45] thats teh most evil [21:03:49] ergh [21:03:52] this had to go to shell [21:04:06] RangerRick: you probably shouldn't want seamlessness [21:04:18] The most evil would be to rename namespace id 0 to Manual on a second install using the same db :P [21:04:20] how does wikipedia handle (en|de|etc.).wikipedia.org? that's doing the same thing in concept [21:04:30] No [21:04:34] meh, title blacklist caching code is screwed [21:04:35] Those are different wikis altogether\ [21:04:52] RangerRick: hiii! [21:05:04] RangerRick: where is my native mac kcachegrind package! [21:05:22] RangerRick: and generally as LocalSettings is php script, different values can be set dynamically in it [21:05:23] domas: =) [21:05:28] isn't valgrind not ported to osx? [21:05:43] RangerRick: I need kcachegrind, the visualization tool [21:05:56] ah, does it just read dumps or something? I've never tried it [21:06:10] RangerRick: http://dammit.lt/2006/01/18/mediawiki-graphic-profile/ [21:06:16] RangerRick: running X is too much pain :) [21:06:31] RangerRick: and yes, kcachegrind reads dumps produced by cachegrind or xdebug [21:06:38] and xdebug dumps are very nice for PHP performance engineering [21:06:43] I NEED MY PACKAGE :) [21:07:00] domas: 100.4% total ?!? [21:07:12] RoanKattouw: its mediawiki, everything is possible [21:07:21] lol [21:07:28] hehe [21:07:37] How come php::fgets() takes so much time? When do we even *use* it? [21:07:42] RangerRick: anyway, in configuration script you can just read the $_SERVER stuff, then dynamically build db names, paths, etc [21:07:44] Oh memcached, nm [21:07:56] yeah, its from network [21:08:02] well, I'll look into the rewrite thing; it probably is best to just have manual.foo.com and just redirect to the Manual:Main_Page or whatever and not try to get tricky [21:08:02] I'm going to switch to PECL memcached client some day [21:08:10] PECL? [21:08:13] thanks [21:08:16] meh [21:08:23] I wanted to force him to do real stuff for me [21:08:25] and he left [21:08:27] lol [21:08:36] You need a .deb for kcachegrind? [21:08:52] Oh wait native Mac? [21:08:58] KDE on the Mac? [21:09:05] yes, linked against qt/mac [21:09:12] Right [21:10:23] Meh [21:10:50] domas ftp.debian.org doesn't have packages for PowerPC, that sucks; or do you have a 386 Mac? 
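A minimal sketch of the LocalSettings.php idea domas suggests to RangerRick above: LocalSettings is plain PHP, so configuration can branch on the requested hostname. The hostnames and values below are made-up illustrations, not a tested recipe for the namespace mapping itself.

// Hedged sketch: vary configuration per virtual host inside LocalSettings.php.
$host = isset( $_SERVER['HTTP_HOST'] ) ? $_SERVER['HTTP_HOST'] : '';
if ( $host === 'manual.example.com' ) {
    $wgSitename    = 'Example Manual';
    $wgArticlePath = '/manual/$1';
} else {
    $wgSitename    = 'Example Wiki';
    $wgArticlePath = '/wiki/$1';
}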
[21:10:57] Intel Mac sorry [21:11:00] yup [21:11:03] have a few :) [21:11:13] right now I have 4 macs at home :) [21:11:16] lol [21:11:40] Hmm, still, you'll probably have to compile it from source, can you live with that? [21:11:41] (C++) [21:12:04] well, I have one compiled against QT/X [21:12:15] believe me, KDE4 build against QT/Mac is a bit of pain [21:12:15] :) [21:12:18] lol [21:12:22] someone has to specialize [21:12:45] Yeah well I was surprised by how well Qt/Windows works [21:12:47] *RoanKattouw hides [21:13:02] and BTW, RangerRick is maintainer of KDE packages on mac :) [21:13:19] Qt code is actually fully portable between Windows and X (as far as I've tried anyway) [21:13:25] Ah that explains [21:21:39] if( '' != $right && !$user->isAllowed( $right ) ) { [21:21:39] $errors[] = array( 'protectedpagetext', $right ); [21:21:39] } [21:21:52] Is it just me, or is that message *totally* inappropriate for the condition? [21:21:56] *Simetrical looks pointedly at Werdna [21:22:14] Which source file and function, Simetrical? [21:22:27] Dashiva, includes/Title.php, line 1150ish. [21:26:50] Looks like it might error due to a lot of different permissions, yeah [21:27:05] Look at the text, though. [21:27:10] "This page has been locked to prevent editing." [21:28:18] Maybe the execution paths are so that it only happens for edit locking [21:28:34] Execution paths can change [21:28:38] Nope, this is a very general function. [21:28:52] is 1.4.14 too old to ask questions about? [21:29:05] Seems odd that it hasn't spawned a bunch of bugs, then [21:29:06] thats lighttpd version? :) [21:29:09] Maybe, maybe not [21:29:14] beekhof: I feel sorry for you [21:29:33] yeah - i'm stuck with ubuntu6 on this server [21:30:04] the links in my sidebar are fine, but links in the main body insist on including index.php [21:30:14] in the url i mean [21:30:21] and it kinda annoying :) [21:30:29] You mean like ugly URLs? index.php?title=Example [21:30:43] shorter but not shout [21:30:46] short [21:31:01] in the sidebar i have /mw/PageTitle [21:31:16] but in the body they come up with /mw/index.php/PageTitle [21:31:22] Right [21:31:27] They both work, right? [21:31:48] right [21:31:50] Can you check what $wgArticle and $wgScript are set to in LocalSettings.php ? [21:31:55] $wgScript = "/mw"; [21:32:08] $wgArticlePath = "/mw/$1"; [21:32:19] i dont see wgArticle [21:32:25] Sorry meant $wgArticlePath [21:32:31] What does the edit tab link to? [21:32:56] http://www.clusterlabs.org/mw?title=Main_Page&action=edit [21:33:08] Wrongue [21:33:17] $wgScript should be "/mw/index.php"; [21:33:28] Not sure if that fixes your problem, though [21:33:40] Alias /mw/index.php /var/lib/mediawiki/index.php [21:33:40] Alias /mw /var/lib/mediawiki/index.php [21:33:47] the edit works fine [21:34:44] i can access the page with every form of the url - i'd just like the [[PageName]] urls to not have index.php in them [21:35:01] I understand [21:36:08] beekhof: is /var/lib/mediawiki/ a directory exposed to the web? [21:36:18] yup [21:36:25] What's its URL? [21:36:35] http://www.clusterlabs.org/ [21:37:17] http://www.clusterlabs.org/index.php doesn't exist [21:37:27] Is index.php directly accessible? [21:37:33] Other than through the alias? [21:37:47] 'Cause it should be [21:38:04] i just aliased it now [21:38:16] What's the URL? 
[21:38:22] same [21:38:33] http://www.clusterlabs.org/index.php will work [21:38:50] Alright [21:38:54] i just added Alias /index.php /var/lib/mediawiki/index.php [21:39:06] Now set $wgScript = "/index.php" and $wgArticlePath = "/mw/$1"; [21:39:24] What happened to your CSS anyway? [21:39:30] Looks like $wgStylePath is wrong [21:40:13] Should be /skins , not /mw/skins [21:40:19] havent touched css yet [21:40:49] ok - both done [21:41:06] the script and article [21:41:32] seems i have $wgStylePath = "$wgScriptPath/skins"; [21:41:35] will fix [21:42:22] YES! It works now [21:42:29] Check the FAQ link on the Main Page [21:42:53] Some links might be slow catching up, you can force an update by editing the page without changing it (i.e. click Edit, then click Save) [21:43:40] i have no idea how you did it - but thankyou :) [21:44:10] Hey, skin's back up too [21:44:37] yup - fixed on your advice [21:44:45] but how can you tell? looks the same to me [21:45:00] Hold Shift and click the Refresh button in your browser [21:45:08] Hello (again) [21:45:31] beekhof you should really just do a grand upgrade and upgrade PHP to 5.2.5 (or at least 5.1) and upgrade MW to 1.11.0 ;) [21:45:41] i know :) [21:46:38] but i'm in the middle of getting this new project up and running - its consuming most of my energy at the moment [21:47:15] its the fork you have when you're not having a fork - loads of fun for everyone [21:47:34] thanks heaps for the help though - much appreciated! [21:48:31] I'm trying to programatically launch an EditPage instance, so that I can prepare Title, Article and some text. I'm testing that from a SpecialPage extension, but when I call ->edit() on my EditPage instance it tries to edit the special page (I think) and so fails displaying that page is protected. Can I launch such edition from a special page ? How can I do ? [21:50:26] Telemac how do you initialize the EditPage instance? [21:50:59] I create a new Title instance, than a new Article instance based on that Title and finally an EditPage using the article [21:51:11] Should work [21:52:03] ARGH! [21:52:22] Telemac: EditPage doesn't use $wgArticle, but uses $wgTitle all over the place [21:52:26] Which really sucks [21:52:55] Please file a bug at bugzilla.mediawiki.org saying that EditPage.php should have a $mTitle member rather than using $wgTitle [21:53:12] globals rock! [21:53:16] \o/ [21:53:24] who wrote title blacklist? :) [21:53:28] RoanKattouw: so it doesn't take care given title [21:53:28] I have few comments [21:53:30] VasilievVV [21:53:43] RoanKattouw: Ok I file that [21:53:46] Well it does have a $mTitle member (saw it just now), it just doesn't seem to *use* it [21:54:06] Hmm it uses a mix of $this->mTitle and $wgTitle [21:54:12] Now THAT'S ugly [21:54:18] domas: VasilievVV wrote it [21:54:25] are and synonyms or they have different behavior? meta help doesn't make me sure about either :-/ [21:54:39] domas: vasilvv@gmail.com [21:54:45] Danny_B different [21:54:53] FooBar [21:55:09] well, first I have to fix it [21:55:09] RoanKattouw, I think it uses the two for different things. [21:55:11] When you view the template: "Bar" when included "FooBar" [21:55:11] and I'm already lazy [21:55:13] too lazy [21:55:18] FooBar [21:55:20] RoanKattouw, vaguely recall that, anyway. Which, yeah, is even more confusing. 
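Pulled together, the short-URL combination beekhof ends up with above looks roughly like this; the /mw prefix and Alias lines are the ones from that conversation, and the Apache directives are shown as comments because they belong in the web-server config, not LocalSettings.php.

// LocalSettings.php - the values the discussion converges on (sketch):
$wgScript      = '/index.php';   // a URL where index.php is actually reachable
$wgArticlePath = '/mw/$1';       // pretty URLs like /mw/PageTitle
$wgStylePath   = '/skins';       // not /mw/skins, or the skin CSS breaks
// And in the Apache config, as pasted in the conversation:
//   Alias /mw /var/lib/mediawiki/index.php
//   Alias /index.php /var/lib/mediawiki/index.php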
[21:55:25] When displayed: "Foo" when included: "Bar" [21:55:31] s/onlyinclude/includeonly [21:55:33] onlyinclude is "pretend there's noinclude everything else" [21:55:41] * $wgTitle is the page that forms submit to, links point to, [21:55:41] * redirects go to, etc. $this->mTitle (as well as $mArticle) is the [21:55:41] * page in the database that is actually being edited. These are [21:55:41] * usually the same, but they are now allowed to be different. [21:55:47] or is there an ? [21:56:04] hmm, interesting. [21:56:04] Skizzerz: see my question above [21:56:06] Yeah [21:56:20] *Skizzerz didn't know it existed [21:56:21] There's both and , Dashiva got it right [21:56:26] certainly has a confusing name [21:56:28] RoanKattouw: isn't the second example vice versa? [21:56:29] wheee [21:56:30] growing! [21:56:31] STAT bytes 954754174 [21:56:33] skiidoo, blame avar. [21:56:44] Danny_B: Yeah [21:56:47] (in regards to there's another tag with almost the same name that has different functionality) [21:56:49] I will! [21:56:54] Er. [21:56:59] Not skiidoo, Skizzerz. [21:57:01] can onlyinclude be used more times? [21:57:02] what are you growing domas? plants? :o [21:57:09] Why do IRC clients always pick the wrong name to autocomplete to? [21:57:15] Danny_B maybe, why don't you try? [21:57:15] caches [21:57:19] Simetrical: But so does EditPage take care of given title (by ctor) for edit() method, or of wgTitle ? [21:57:24] siebrand: they always pick right for me! [21:57:31] LOL [21:57:41] Anyway, gotta go folks [21:57:46] *domas giggles [21:57:50] Telemac, the provided title is the one it actually changes. [21:58:00] But it uses $wgTitle too. [21:58:14] You might want to use the API functions for editing instead of EditPage's. [21:59:00] Simetrical: I need to be able to create a page (title/article) in memory and prepare some text content in it for user [22:00:24] Ok if I overide wgTitle so that mTitle and wgTitle are the same edit work (maybe "a little" tricky :) ) [22:04:56] damn [22:05:05] extension localization = teh suck [22:05:28] What's the proper way with EditPage/Article/Title to programatically set some text content, in order to prepare article ? [22:05:32] domas: which part of it? [22:05:43] this one: http://noc.wikimedia.org/cgi-bin/ng/report.py?db=all&sort=real&limit=5000&prefix=Setup.php-extensions [22:05:54] Nikerabbit: that every included extension immediately loads all messages [22:06:26] thats silly [22:06:27] domas: not every [22:06:43] and it also loads for every language... [22:06:55] though there're max 2 languages in context... [22:07:03] most of them was just recently converted to wfLoadExtensionMessages, but were not yet optimised for delayed loading [22:07:16] anyone know what i can subclass for auto login to mediawiki if i have a valid session from another app? [22:07:41] mzupan: there's AutoAuthenticate hook [22:07:51] mzupan: you can build user objects with it [22:07:55] check hooks.txt [22:08:10] sweet.. thanks [22:08:44] Nikerabbit: how many views need MakeBot? InputBox? Renameuser? LoadDeletedContribsMessages? UsernameBlacklistSetup? [22:08:50] real% 22.73? [22:08:52] Fun. [22:09:02] Simetrical: probably skewed by profiling, but still.. [22:09:17] this is stupid. 
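Telemac's setup, as described above, would look something like the sketch below from inside a SpecialPage extension. The page name is an arbitrary example, and overriding $wgTitle is the workaround Telemac settles on because EditPage of this vintage still consults the global in places.

// Hedged sketch of the approach described above, not a quote from the log.
$title   = Title::newFromText( 'Sandbox/Prepared_page' );  // example target page
$article = new Article( $title );

global $wgTitle;
$wgTitle = $title;   // keep EditPage's $wgTitle uses consistent with its $mTitle

$editor = new EditPage( $article );
$editor->edit();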
[22:09:36] every time I turn away, turn back - 1000 extensions loading all their messages for all languages at startup [22:09:42] :D [22:10:05] Simetrical: http://noc.wikimedia.org/cgi-bin/ng/report.py?db=all&sort=real&limit=5000&prefix=MessageCache [22:10:23] most of those are probably easy to fix [22:10:24] we call addMessages 1000 times [22:10:27] per single invocation of a script [22:10:36] yay? [22:10:37] yay! [22:11:34] this is so low hanging fruit [22:11:57] not some parser [22:12:10] domas, the $wgHooks['AutoAuthenticate'][] goes in the LocalSettings.php ? [22:12:23] mzupan: in your extension file :) [22:12:30] mzupan: you can include your extension in LocalSettings.php, yes [22:15:27] buahaha [22:15:35] CIA-40: you can work now [22:16:09] one fruit eaten [22:16:29] mails to cia seem to be delayed [22:17:32] I would guess from their end [22:17:38] how is the "Thanks to everyone who donated in the Wikimedia Foundation fundraiser!" banner set up in mediawiki? [22:18:04] MediaWiki:Sitenotice with the DissmissableSiteNotice extension [22:18:07] evilmoo, CentralNotice extension, but it might not be what you want to use. Try MediaWiki:Sitenotice. [22:18:17] Skizzerz, I don't think that's used here, this is across all sites. [22:18:18] or that :P [22:18:27] oh, yeah, that would make sense [22:18:28] thanks :) [22:20:05] How can I prepare some text content with an EditPage instance ? [22:40:35] Hello everyone. In advance I thank you for your help [22:40:47] We recently changed to a new web host [22:40:56] http://laacz.lv/f/img/avarija-krogu.jpg =) [22:41:00] and now our wiki directory and pages show up as blank [22:41:16] just plain ol' whitespcae [22:41:34] Bluto: probably error reporting turned off and some error was hit [22:41:47] Hello. I'm having an odd problem with the BreadCrumbs2 extension. Instead of creating a nice little trail it just repeats the title no matter how far into my wiki I am. Any suggestions? [22:42:09] Domas, thanks. Is my only option to delete and reinstall? [22:42:15] Bluto: no [22:42:36] domas: What would you suggest? [22:42:40] I just hit an error on wikipedia, let's delete it all and reinstall :) [22:42:53] oh, hush... :) [22:42:56] haha [22:43:05] mark: would be fast enough anyway [22:43:32] Bluto: try putting error_reporting(E_ALL); to your LocalSettings.php and refresh [22:43:40] stupid bot name [22:44:21] k I shall attempt that [22:44:25] brb [22:47:22] 04 01Hello. I'm having an odd problem with the BreadCrumbs2 extension. Instead of creating a nice little trail it just repeats the title no matter how far into my wiki I am. Any suggestions? [22:47:32] how do you prevent mediawiki from squashing whitespace? [22:48:52] domas (and smart alek mark): I'm sure I've made some stupid maistiake. But, at the bottom pf local settings, I put: $error_reporting(E_ALL); I then get (after a refresh): Fatal error: Call to undefined function: () in /homepages/40/d217707377/htdocs/wiki/LocalSettings.php [22:49:07] maistiake? yeah and um a mistake too [22:49:26] no need to $ [22:49:33] in front of function call [22:50:18] k removing the $ [22:50:23] I just get more whitespace [22:50:28] well, blank page [22:51:32] do you have access to error logs? 
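For mzupan's auto-login question a few lines up, a rough sketch of how the AutoAuthenticate hook might be wired up in an extension file. The hook name comes from the discussion; the session lookup is purely hypothetical, and the exact signature should be checked against docs/hooks.txt as suggested there.

// Hedged sketch only - the external-session lookup is a made-up placeholder.
$wgHooks['AutoAuthenticate'][] = 'wfExampleAutoLogin';

function wfExampleAutoLogin( &$user ) {
    // Hypothetical: the other application stored the user name in its session.
    if ( isset( $_SESSION['external_username'] ) ) {
        $user = User::newFromName( $_SESSION['external_username'] );
        if ( $user ) {
            $user->load();
        }
    }
    return true;
}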
[22:51:35] must be some stupid error ;-) [22:51:56] !errors | nikkiann [22:51:56] --mwbot-- nikkiann : To see PHP errors, add this to the very top of LocalSettings.php: error_reporting(E_ALL); ini_set("display_errors", 1); Fatal PHP errors usually go to Apache's error log - also check the error_log setting in php.ini (or use phpinfo). For more details in wiki error reports, set $wgShowExceptionDetails = true; and $wgShowSQLErrors = true; For full debug output, set $wgDebugLogFile to some path you like. [22:52:06] You may need the ini_set thing too to get it displayed. [22:52:39] Maybe this is a question for another channel but I can't seem to have more than one consecutive space not get squashed in wikitext anywhere [22:53:52] 03siebrand * r29512 10/trunk/extensions/ (4 files in 2 dirs): [22:53:52] * use wfLoadExtensionMessages for SpecialCite [22:53:52] * add version in extension credits [22:53:52] * update Translate extension [22:55:33] that or I am -v [22:55:46] btw, not ignoring, just trying it out [22:56:09] frieze: multiple spaces get collapsed, unless you use   or are in a
<pre> block or you start the line with blanks (which is kind of wiki-pre)
[22:56:38] 	Duesentrieb: but <pre> and leading spaces put all my text in an ugly blue box
[22:56:55] 	for some strange reason
[22:57:01] 	then use &nbsp;
[22:57:11] 	or change the CSS of the <pre> tag
[22:57:20] 	frieze`: the reason is that this stuff is intended for code snippets, etc. which should be visually separated
[22:57:28] 	you can of course change or override the styles
[22:57:46] 	03nikerabbit * r29515 10/trunk/extensions/inputbox/inputbox.php: * Delay message loading
[22:57:46] 	frieze`: what are you actually trying to do
[22:57:48] 	?
[22:58:05] 	for one thing, multiple spaces don't make much sense unless you also use a fixed-width font
[22:58:41] 	there's also the poem extension - nice for manual line breaks. not sure how it handles spaces. i could imagine that they are preserved
[23:00:38] 	\o/
[23:01:01] 	have words in bold be spaced a bit more widely so they stand out more
[23:01:06] 	so I would like to just type two spaces
[23:01:18] 	03siebrand * r29516 10/trunk/extensions/ (4 files in 2 dirs): 
[23:01:18] 	* use wfLoadExtensionMessages for Desysop
[23:01:18] 	* add version in extension credits
[23:01:18] 	* update Translate extension
[23:01:28] 	even fixing <pre> turns off all the line wraps and such
[23:02:18] 	woooo
[23:02:18] 	yes. that's what pre is intended for
[23:02:20] 	now thats work!
[23:02:50] 	frieze`: i suppose you could just alter the css style for <b> or whatever to add some spacing.
[23:03:35] 	heh
[23:03:40] 	why do people use require_once in extensions
[23:03:43] 	autoloader ftw \o/
[23:03:55] 	Duesentrieb: thanks. I'll just live with &nbsp;'s everywhere
[23:04:22] 	I thought there might be some trick out there to turning off whitespace squashing
[23:04:45] 	frieze`: no, that sucks. just put this on your MediaWiki:common.css page: b { padding-left:1ex; padding-right:1ex; }
[23:04:57] 	frieze`: then, all bold text will have spacing before and after it.
[23:05:11] 	autoloader is indeed nice <3
[23:06:41] 	K, I promise I'm not an idiot, but um, 1) how do I set $wgDebugLogFile to a file I want?  just $wgDebugLogFile = ?;  2) where is phpinfo or php.ini
[23:08:11] 	k you can all stop laughing now...
[23:10:42] 	<?php phpinfo(); ?>
[23:10:48] 	03siebrand * r29517 10/trunk/extensions/ (3 files in 2 dirs): 
[23:10:48] 	* use wfLoadExtensionMessages for DismissableSiteNotice
[23:10:48] 	* add version and url in extension credits
[23:10:48] 	* update Translate extension
[23:11:25] 	nikkiann: create a file phpinfo.php and put that code
[23:11:37] 	ok brb
[23:11:43] 	oh and my webhost : Since we don't provide access to Apache error logs on shared hosting packages for 
[23:11:44] 	technical reasons
[23:11:50] 	etc etc
[23:12:32] 	nikkiann: you can change any configuration values by editing LocalSettings.php
[23:14:59] 	okay I created a page called phpinfo.php
[23:15:11] 	I added the following to the top of local settings:
[23:15:14] 	error_reporting(E_ALL); 
[23:15:14] 	ini_set("display_errors", 1);
[23:15:14] 	$wgShowExceptionDetails = true;
[23:15:14] 	$wgShowSQLErrors = true;
[23:15:14] 	$wgDebugLogFile = phpinfo.php;
[23:15:40] 	(sorry for the flooding)
[23:15:54] 	I refresh the wiki index
[23:15:58] 	and get:
[23:15:59] 	Notice: Use of undefined constant phpinfo - assumed 'phpinfo' in /homepages/40/d217707377/htdocs/wiki/LocalSettings.php on line 18
[23:16:00] 	Notice: Use of undefined constant php - assumed 'php' in /homepages/40/d217707377/htdocs/wiki/LocalSettings.php on line 18
[23:16:00] 	Parse error: parse error, unexpected T_STRING, expecting T_OLD_FUNCTION or T_FUNCTION or T_VAR or '}' in /homepages/40/d217707377/htdocs/wiki/includes/Exception.php on line 160
[23:16:47] 	nikkian: that's wrong syntax
[23:17:44] 	phpinfo.php should be in quotes IIRC
[23:17:56] 	ooo
[23:18:20] 	better choose another name
[23:18:22] 	03siebrand * r29513 10/trunk/extensions/ (3 files in 2 dirs): 
[23:18:22] 	* use wfLoadExtensionMessages for DeletedContributions
[23:18:22] 	* add version in extension credits
[23:18:22] 	* update Translate extension
[23:18:48] 	03simetrical * r29514 10/trunk/phase3/includes/ (Article.php EditPage.php LoadBalancer.php): Clean up some braces, indentation; add profiling to LoadBalancer::reallyOpenConnection.
[23:19:12] <_das>	ok, i may be an idiot here, but how do I make an embedded image link to an external URL? is there a way?
[23:19:22] 	Simetrical: why that profiling section?
[23:19:27] 	no
[23:19:28] 	Simetrical: Database::open should be identical
[23:19:36] 	or perhaps there is
[23:19:36] 	_das: like an image embedded with [[Image:Imagename.ext]]?
[23:19:38] 	Simetrical: we even have per-server info there
[23:19:45] 	_das: no way per default to make it link anywhere except its own description, actually
[23:19:52] <_das>	Skizzerz: yes
[23:19:54] <_das>	ok
[23:19:56] 	domas, well, if you think it's silly, remove it.  :)
[23:20:07] 	You know better than I do.
[23:20:14] 	you could try the icon extension, but IIRC that's only for local links
[23:20:22] 	iirc, there's a hack that allows you to use an external image as an external link. if you have external images enabled, that is.
[23:20:51] <_das>	right...and we don't
[23:20:58] 	[http://wherever.org/something http://someplace/foo.png]
[23:21:03] <_das>	so i'll just do it from its description, thanks!
[23:21:27] 	ok so I changed the file name.  I also changed the phpinfo to the new file name in the new file
[23:21:35] 	no errors on refresh, just the blank page
[23:21:36] 	_das: there are a couple of extensions for this
[23:21:47] 	so I go to that page and I get:
[23:21:59] 	Fatal error: Call to undefined function: wikierrors() in /homepages/40/d217707377/htdocs/wiki/wikierrors.php on line 1
[23:22:52] 	why use the php extension?
[23:23:08] 	change it to .txt or something else
[23:23:13] 	ok
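For reference, the corrected form of what nikkiann is pasting above: string values need quotes, and the log file is better off as a plain text file rather than something ending in .php. The path below is only an example.

// Corrected LocalSettings.php debug block (sketch; pick your own log path).
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );
$wgShowExceptionDetails = true;
$wgShowSQLErrors        = true;
$wgDebugLogFile         = '/tmp/mediawiki-debug.log';  // quoted string, .log/.txt, not .php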
[23:23:59] 	Simetrical: removed :)
[23:24:07] 	03midom * r29518 10/trunk/phase3/includes/LoadBalancer.php: this profiling section is nearly identical to our already verbose Database::open and dbconnect ones
[23:25:34] 	blank pages on both
[23:26:19] 	Simetrical: generally a profiling section is needed around some code which does some clear piece of work and doesn't have single outstanding code path
[23:26:33] 	Simetrical: reallyOpenConnection always does Database::open() :)
[23:26:47] 	and thats nearly everything it does
[23:27:20] 	damn, need to write PECL profiler
[23:29:21] 	damn
[23:29:30] 	mediawiki in includes/ alone has 122kLOC
[23:30:24] 	... I'm screwed aren't I?
[23:30:48] 	if not by whatever error but by my not having a clue (:
[23:33:17] 	Hi My system is behind a reverse proxy... The wfGetIp(): function is returning the ip of the proxy and not the users ip.  I need the HTTP_X_FORWARDED_FOR.  I have set $wgSquidServer to my proxy IP?..Still doesn't work..Any thoughts?
[23:35:43] 	03siebrand * r29519 10/trunk/extensions/ (4 files in 2 dirs): 
[23:35:43] 	* use wfLoadExtensionMessages for LinkSearch
[23:35:43] 	* update Translate extension
[23:36:25] 	Amos: $wgUseSquid ?
[23:36:30] 	Amos: grep -i squid defaultsettings.php :)
[23:36:46] 	ya..$wgUseSquid = yes;
[23:37:01] 	oh
[23:37:11] 	and $wgSquidServers  is an array
[23:39:01] 	tnx..I do have it as an array but only one server
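What Amos is being pointed at above would look roughly like this in LocalSettings.php; the proxy address is an example. With the proxy listed, MediaWiki trusts its X-Forwarded-For header, so wfGetIP() should report the real client address rather than the proxy's.

// Sketch: trust one reverse proxy so X-Forwarded-For is honoured (example IP).
$wgUseSquid     = true;                  // a real boolean, not the bare word "yes"
$wgSquidServers = array( '192.0.2.10' ); // an array, even with a single proxy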
[23:39:06] 	I'm guessing that was a yes.  Well, thanks anyway people
[23:41:04] 	One more question actually.  If I did delete and reinstall would the articles people have done be gone or are they stored in the database?
[23:41:25] 	articles too
[23:42:37] 	articles are in database
[23:42:39] 	images are not
[23:43:14] 	I see the option to upgrade account to sysadmin, possible to downgrade a user?
[23:44:34] 	thanks very much!
[23:57:29] 	03siebrand * r29520 10/trunk/phase3/languages/messages/ (24 files): 
[23:57:29] 	Localisation updates for core messages from Betawiki (2008-01-10 0:38 CET)
[23:57:29] 	* fallback for 'lij' to 'it'