[00:01:27] ah... hrm... dang... is there a place I can see why the dump failed or when the next one will be made? [00:05:34] no [00:08:44] I'm getting a permissions error on /libraries/session.inc.php on line 89, any idea why? [00:08:54] Its telling me to make sure the sessions temp folder is set right, and it seemingly is [00:10:56] never heard of it [00:11:02] are you sure you're in the right channel? [00:56:21] 03(mod) Restrict page moves to autoconfirmed users on all WMF wikis - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12071 (10mike.lifeguard) [02:32:30] i need help with mediawiki [02:32:49] please [02:33:03] !ask | xCuber125 [02:33:03] xCuber125: Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [02:34:26] When I goto my install of MediaWiki, it doesn't let me access it from /wiki, i have to type in "/wiki/index.php5" it has the be .php5 too, .php doesn't work. [02:34:45] did you install it yet? [02:34:50] yes. [02:35:11] !ask [02:35:11] Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [02:35:13] ok, so you want it to be like /wiki/PageName [02:35:36] instead of /wiki/index.php5?title=PageName? [02:35:39] i want to see the main page when i go to /wiki [02:36:25] currently, when you go to /wiki, what do you get? [02:36:29] nothing [02:36:45] and when i go to index.php i get nothing, i have to go to index.php5 to see stuff [02:37:02] try renaming index.php to index.php~ or something, and then index.php5 to index.php [02:37:05] that might work [02:37:50] now i get a error 500 (internal server error) [02:38:15] ok... then I'm really not sure what your problem is [02:38:30] (oh, and undo the renaming... since that messes it up more apparantly) [02:38:54] thanks for trying [02:39:05] could i try a reinstall? [02:39:09] sorry I couldn't help :( [02:39:16] not sure if that will work but you can try it [02:39:22] ok, i'll try [02:39:36] it might be a problem with the php configuration though, but I really don't know [02:52:47] 03(mod) Allow images that link somewhere other than the image page ( using normal link syntax) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=539 (10rene.kijewski) [03:17:26] 03(FIXED) sep11 wiki yet listed on "Wiki does not exist" - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12140 +comment (10tstarling) [03:35:00] 03(FIXED) Change the template used for new created sites - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12139 +comment (10tstarling) [03:36:48] 03(FIXED) Allow blocked users to edit their talk page on Hebrew Wiktionary - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12141 +comment (10tstarling) [04:25:56] 03(NEW) CAPTCHA group exceptions only apply to edits - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12142 normal; normal; MediaWiki extensions: ConfirmEdit; (Emufarmers) [05:14:00] 03(NEW) New pages user interface presents link to patrol deleted page - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12143 trivial; normal; MediaWiki: User interface; (sco_scam) [05:48:43] hello [05:50:30] Looking for a way to show how many pages or major updates have been submitted by a certain user. 
Was wondering if there was an extension of sorts that would supply these numbers without having to parse through the 'contributions' list. [05:52:30] any given users? [05:52:36] Yes [05:52:39] or you want to know who contributed the most? [05:52:48] No, for any given user [05:54:37] 03(NEW) Request for "Refer A Friend" Extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12144 15enhancement; normal; MediaWiki extensions: General/Unknown; (moleculedude) [05:55:53] calivw78: http://svn.wikimedia.org/svnroot/mediawiki/trunk/extensions/Contributionseditcount/ [05:56:31] Thanks TimLaqua, I'll take a look. [06:26:51] 03raymond * r27941 10/trunk/phase3/ (RELEASE-NOTES includes/Article.php): * (bug 12143) Do not show a link to patrol new pages for non existent pages [06:27:17] 03(FIXED) New pages user interface presents link to patrol deleted page - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12143 +comment (10raimond.spekking) [06:36:38] 03raymond * r27942 10/trunk/phase3/languages/messages/ (4 files): [06:36:39] * (bug 12137) Update Chinese translations [06:36:39] Patch by Shinjiman [06:36:59] 03(FIXED) Update for Cantonese language (yue) #62, Update for Old Chinese / Late Time Chinese languages (och/ltc) , Update for Chinese localisation (zho series) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12137 +comment (10raimond.spekking) [07:01:42] 03raymond * r27943 10/trunk/phase3/includes/EditPage.php: [07:01:42] Cleanup for r27897: [07:01:42] * Title::getBaseText is misleading as this function returns the subpage of a subsubpage instead of the root page [07:01:42] (maybe my fault by misinterpreting "Get the base name, i.e. the leftmost parts before the /" [07:01:42] * Fixes an issue for the (rare?) case that subpages are disabled for user (talk) namespace [07:20:12] hello [07:22:43] I am looking for an extension : a search by multiple category ... [07:23:02] is there exist ? [07:25:40] objective is to search a set of category, or excluding a category in search [07:26:15] helo Ganon [07:26:33] hello cquad ;) [07:27:39] anyone know if this extension exists or not ? [07:33:17] Hi - Should logged-in users of a standard MediaWiki installation automatically get logged-out after a period of inacivity? They don't with my installation, but I'd like this to be the case, but can't find any info about how to do this. [07:40:39] DerrickFarnell: i dont think so [07:40:48] maybe there is a extension for that but i cant say [07:41:37] thanks [07:42:22] I thought such automatic logout after inactivity was a standard security feature of sites that people can log into. But is this not something I should be worried about then? [07:43:03] I'm sure I automatically get logged-out of Wikipedia after inactivity... [07:43:38] I sure don't get logged out after one week of inactivity. [07:44:12] So either you have an extension installed or there is an option I'm not aware of. [07:45:03] if there is a extension or option i would love to hear about it [07:46:01] But isn't this a security issue? [07:47:29] Why would you think so? [07:48:50] Because most other sites on the web that you can log into will automatically log you out after a period of inactivity, so there must be a good reason for that. [07:49:25] The only issue I see that could be reduced by automatic logout is a user who shares her system with a malicios user and keeps a web session open and the malicious one reuses it. 
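For the per-user contribution count question above, the linked Contributionseditcount extension is the ready-made route. As a rough illustration of what such a counter boils down to, here is a minimal sketch using MediaWiki's database wrapper; the function name and the choice of rev_user_text are assumptions for the example, not the extension's actual code, and it has to run inside MediaWiki (an extension, a maintenance script, eval.php).

    <?php
    // Minimal sketch: count revisions attributed to one username.
    // Not the Contributionseditcount implementation, just the core idea.
    function countEditsFor( $username ) {
        $dbr = wfGetDB( DB_SLAVE );                    // read-only replica handle
        return $dbr->selectField(
            'revision',                                // table
            'COUNT(*)',                                // value to fetch
            array( 'rev_user_text' => $username ),     // WHERE rev_user_text = ...
            __METHOD__
        );
    }

Restricting it to "major" updates would just mean adding 'rev_minor_edit' => 0 to the conditions; depending on the MediaWiki version there may also be a user_editcount column on the user table, though it only counts edits made after that column was introduced.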
[07:50:45] DerrickFarnell: a few hundred years ago most people thought women that know a lot about herbs are witches, so there must be a good reason for that, right? :-) [07:51:44] :-) [07:52:01] hehe [07:52:16] I guess I don't automatically get logged-out of my Gmail after inactivity... [07:52:33] today i will check out what extensions exist...which are well...not too much but which are useful for daily use [07:52:46] DerrickFarnell: you are right...me neither [07:53:42] Now that I think of it, I guess its mostly sites which hold your bank/credit card details which automatically log you out after inactivity - Amazon, my bank... [07:57:51] Actually, given Gmail doesn't have automatic log-out, I'm now not bothered about having this feature. It's obviously more convenient not to have it - I was just worried about security... [07:58:37] it would be nice to have a feature where you could shut down all other active sessions for your account [07:58:59] if you go to a public terminal and forget to log out of gmail, your account may be active there for anyone to use later [07:59:06] re [08:00:55] 03raymond * r27944 10/trunk/extensions/ (4 files in 4 dirs): [08:00:55] * Multiupload: consistent use of 'file' instead of 'image' [08:00:55] * Updates German [08:10:24] <_wooz> lo [08:18:39] A database query syntax error has occurred. This may indicate a bug in the software. The last attempted database query was: [08:18:39] (SQL query hidden) [08:18:39] from within function "Revision::insertOn". MySQL returned error "1223: Can't execute the query because you have a conflicting read lock (10.0.0.237)". [08:19:13] remove the lock ^^ [08:20:46] i have had this yesterday during maintanence time in amsterdam [08:29:57] 03nikerabbit * r27945 10/trunk/extensions/Translate/SpecialTranslationChanges.php: * Sort extensions too [08:32:39] is there may a extension to do well pdfs out of the wiki? [08:45:50] marvxxx: PdfExport [08:51:59] Caldrin: is it useful? [08:57:24] Yes, I think so. It works alright. [08:57:56] hmm [08:58:47] can someone tell me why these transclusions are broken? [08:58:48] http://en.wikipedia.org/wiki/Wikipedia:Requests_for_arbitration/Armenia-Azerbaijan/Enforcement_Log#Blocks_for_violation_of_the_temporary_injunction [08:58:54] at the bottom of the table [08:59:11] define "broken" [08:59:22] [[Wikipedia:Requests for comment/FadixTemplate:Highrfc-loop|rfc]] [08:59:25] it's exploding the page? it's sleeping with your girlfriend? it's not appearing? [08:59:30] werdan7 it doesnt transclude right [08:59:42] Werdna no template is going to sleep with my GF [08:59:56] werdan7 see that entry? [09:00:10] Fadix (talk · contribs · count · logs · block log · lu · rfa · rfb · arb · [[(boom) [09:00:20] next two entries arent even called properly [09:01:16] I dunno. I would imagine it's some restriction [09:01:19] on the software [09:01:50] Werdna against arbotration cases? [09:01:58] O_o [09:02:00] no, against including four zillion templates [09:02:11] how is that a zillion [09:02:16] 1 per blocked account [09:02:35] well, that's a hypothesis [09:02:39] could be a software burp, though [09:02:50] not very familiar with the parser.. TimStarling would be your best bet there [09:03:36] TimStarling: what are the reasons why a template would be linked instead of transcluded? [09:04:36] *Shiroi_Neko licks TimStarling :P [09:04:50] he probably wouldn't like that.. not very hygienic. 
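On the earlier question about being logged out after inactivity: stock MediaWiki has no idle-logout timer, as said above. The closest approximations live in LocalSettings.php; a hedged sketch follows (the values are examples, $wgCookieExpiration only shortens the persistent "remember me" login cookie rather than expiring an open browser session, and shared hosts may ignore the ini_set entirely).

    <?php
    # LocalSettings.php -- rough approximations, not a real idle-logout feature.

    # Lifetime, in seconds, of the persistent "remember me" login cookie.
    $wgCookieExpiration = 3600;

    # Make PHP discard abandoned server-side session data sooner.
    ini_set( 'session.gc_maxlifetime', 1800 );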
[09:06:12] ok, so there's a maximum include size [09:06:40] Werdna cats lick for hygiene :P [09:07:00] TimStarling is there a way to overide or increase the cap? [09:07:09] I am not even half way done with that arbcom case [09:07:11] Werdna: the prime reasons are: a) it doesn't exits and b) the include limit is exceeeded [09:07:45] Werdna: c) would be it's included from a namespace marked as non-includable [09:07:47] I want to {{Usercheck|}} each account [09:07:54] no sensible page should hit the limit [09:08:05] weeeellll :P [09:08:06] what's it set to on wmf, Duesentrieb? something insane like 256KB, hmm? [09:08:08] Werdna this is an arbitration case [09:08:12] well, it's a loop [09:08:25] we should just block all templates with "loop" in the name [09:08:26] Shiroi_Neko: since when were arbitration cases "sensible pages" by any use of the word? [09:08:30] and thats mere block log [09:08:35] Werdna ah [09:08:36] that'd save a lot of problems [09:08:39] yes you are right :) [09:08:48] Werdna: there are two limits: pre-expand and post-expand. i have no clue what they are set to - actually, i donÄt even know where this is configured. i tried to find it once, and failed. [09:08:56] TimStarling I need this for the log [09:09:00] Duesentrieb: I've been digging through code [09:09:07] I am sure there are lots of ways to fix it [09:09:11] TimStarling: where are the include limits configured? [09:09:17] #1: Fix the {{usercheck}} template [09:09:24] Werdna cant [09:09:26] notanadmin [09:09:31] and it works normaly [09:09:42] when used multiple times is the problem [09:09:42] yes, but it has a failure case [09:09:52] the limit is given in the comment [09:09:59] Werdna I am not very good with parser functions [09:10:21] TimStarling if you increased the limit by 4 times we wouldnt have problems [09:10:36] no, our problems would be 4 times worse [09:10:49] TimStarling think of the sockpuppets [09:10:51] if I reduced it by a factor of 2, then we'd be talking [09:11:01] TimStarling do as you please [09:11:09] I am merely pointing out a problem [09:11:14] the problem is not "You can't make an arbitration page that includes lots of templates", the problem is "You use templates which flood the parser with useless text" [09:11:15] the limits are there for a reason [09:11:20] weather you help me or not wont get me anyhting [09:11:39] TimStarling arbitration case needs link to involved parties [09:11:52] I didnt ask these users to be disruptive [09:12:15] and this is just the partial log of their activities [09:12:23] the limit is too restrictive [09:12:36] any solution that will make it work is welcome [09:13:05] alright, well let's talk about solutions... [09:13:10] sure [09:13:29] I need to link to the cases in question if they exist [09:13:35] a red link would be fine by me [09:13:38] it used to be like that [09:13:47] on failiure links are grey now [09:13:49] I dont know why [09:13:53] Template:Usercheck is unprotected [09:14:09] Werdna can you revert back to the last good version [09:14:36] there isn't a "last good version" [09:14:40] and that template doesn't look that bad [09:14:45] maybe it's another template doing it [09:14:49] okay [09:15:13] [[Template:Usercheck]] should be protected as it is a vandal target [09:15:35] meh [09:15:38] don't invent problems [09:15:44] it isn't regularly vandalised, is it? 
[09:16:02] first explain to me in a few words what the template does and how it works [09:16:12] if it is it would lock the servers probably [09:16:21] TimStarling okay [09:16:26] the template scans the users activity [09:16:36] for pas rfc rfar cases and etc [09:16:42] rfcu cases too [09:16:55] if such a case exists the template links to it with a different color [09:17:02] using a whole of #ifexist calls? [09:17:12] http://en.wikipedia.org/w/index.php?title=Template:Highrfc-loop&action=edit [09:17:14] the template only takes username as input [09:17:15] yes. [09:17:17] TimStarling probably [09:17:21] how many, per user? [09:17:26] TimStarling: can you please tell me where the limits are set? i'd need it for testing. i have looked for it several times - i must be blind i guess. [09:17:47] 7 or 9 times maybe [09:17:50] I am unsure [09:17:52] passed as a parser option [09:18:01] TimStarling: 20 for each rfc, rfar, and rfa [09:18:04] so 60, I think [09:18:04] at the end of the parser output: [09:18:06] [09:18:27] 60 #ifexist calls *per user* [09:18:32] ah [09:18:33] okay [09:18:34] 60 DB requests [09:18:35] actually make that 80 [09:18:46] probably ~3ms each [09:18:51] I didnt create this [09:19:07] http://en.wikipedia.org/w/index.php?title=Template:Highrfc-loop&action=edit -- that's the template [09:19:25] so say 240ms, but not of cheap app server CPU time mind you [09:19:41] this is 240ms of the time of a $12000 slave DB server [09:19:45] per user, per parse [09:19:56] yea perhaps [09:19:57] TimStarling: do you want me to limit calls to #ifexist to 3 per page or something? [09:20:08] in ParserFunctions [09:20:09] brb [09:20:24] TimStarling if you do that practialy all infoboxes on Wikipedia would break [09:20:36] and you're doing this until you hit the template inclusion limit? [09:20:50] you could fit hundreds of these things in a page that way, couldn't you? [09:21:07] *Werdna smells denial of service [09:21:09] it must take minutes to save [09:21:09] TimStarling I am listing blocked master accounts and their sockpuppets [09:21:22] TimStarling no not at all [09:22:00] *Shiroi_Neko saved it in a few seconds [09:22:07] how many instances? [09:22:24] TimStarling as many as the number of usernames [09:22:33] let me count [09:23:17] 25 [09:23:23] and I am not done listing these [09:23:58] TimStarling I reliase this is a ridiclous number of usernames [09:24:12] which is why I want to give reviewers access to all data with minimum effort [09:24:26] they need to review all that on a regular basis [09:25:11] so if I'm right about 240ms, 25 of them would take 6 seconds [09:25:19] probably [09:25:51] *Duesentrieb feels ignored [09:25:56] where is this list? [09:26:00] *Shiroi_Neko licks Duesentrieb [09:26:03] TimStarling [09:26:24] http://en.wikipedia.org/wiki/Wikipedia:Requests_for_arbitration/Armenia-Azerbaijan/Enforcement_Log [09:26:41] TimStarling the list isnt static [09:26:47] per each new block it gets longer [09:26:53] per sockpuppet [09:27:31] the listed isnt eve half of Artaxiad's sockpuppets [09:28:41] huh [09:28:45] $this->mMaxIncludeSize = $wgMaxArticleSize * 1024; [09:28:55] i guess that's it? 
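As the line just quoted shows, the include limit on that codebase is derived from $wgMaxArticleSize (a value in kilobytes), so there is no dedicated setting to hunt for; raising the include ceiling means raising the maximum article size with it. A hedged LocalSettings.php sketch, purely to show where the knob sits; the value is an example, and loosening it site-wide is exactly what the rest of this discussion argues against.

    <?php
    # LocalSettings.php -- $wgMaxArticleSize is measured in kilobytes.
    # The parser derives its maximum include size from it
    # (mMaxIncludeSize = $wgMaxArticleSize * 1024 bytes), so this also
    # raises the largest page the wiki will accept on save.
    $wgMaxArticleSize = 4096;   # default 2048, i.e. the 2MB mentioned below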
[09:28:58] the page isn't rendering at all for me [09:29:07] Duesentrieb: yes, that's it [09:29:12] 2MB [09:29:14] TimStarling it seems broken now [09:29:36] TimStarling: ah, so there's currenlty no way to set it directly - which is why i didn't find the setting :) [09:29:43] Shiroi_Neko: that's alright, I can use a smaller test case [09:30:17] guess not [09:30:26] I added a setting for PP node count though [09:32:53] TimStarling this problem isnt unique to this templae some infoboxes such as the ones on chemicals are suffering similar problems [09:33:25] 5.8s for 6 invocations [09:33:59] so 970ms, not 240ms [09:34:44] or roughly 1 secs [09:34:46] *TimStarling adds [[User:Ben]] to his hit list [09:35:00] *Shiroi_Neko hits ben [09:37:14] how about this... [09:37:21] each time you add a username, go to Special:Prefixindex [09:37:32] do a search for each of the 4 prefixes of interest [09:37:53] TimStarling the case can be created later [09:37:59] which is why I am using the template [09:38:21] well, when the case is created, add it to all the relevant places [09:38:47] TimStarling yes but that doesnt help if the case is created after I finish this list [09:38:53] that why I want to keep it dynamic [09:38:58] well you can't [09:39:14] any one of these users may have an independent rfcu [09:39:22] you can do automated searches of Special:Prefixindex if you like [09:39:24] say once a day [09:39:30] and then update the list to suit [09:39:44] TimStarling I have other priorities than this case :( [09:39:51] and we are talking about +25 accounts [09:40:03] I have other priorities too [09:40:10] yes I realise that [09:40:11] do you have any idea how long it would take to optimise this case? [09:40:22] TimStarling I have 3 ideas [09:40:33] days, surely [09:40:37] I'm not sure if it's even possible [09:40:38] 1) extending the cap at least temporarily [09:40:49] 2) simplifying the templates algorithm somehow [09:41:08] ie lowering the resources used by the parser functions [09:41:30] 3) create a more built in tool that links to such things [09:41:30] and the third? [09:41:43] 1) forget it [09:41:47] in other words rather than a trmplate this'd be something like an extension [09:41:57] TimStarling I know #1 is a no [09:41:58] you said so [09:42:00] the cap is too high in this case, you're abusing the system and creating unparseable pages [09:42:12] it's meant to give a graceful error before it hits the memory limit [09:42:19] 2) I said it would take days [09:42:19] *Shiroi_Neko is notanabuser [09:42:20] :P [09:42:21] I have other things to do [09:42:29] 3) you know where the source is [09:42:47] TimStarling I am no mediawiki dev [09:42:51] TimStarling: should we limit calls to #ifexists by number? [09:43:15] i.e. "Only 5 calls per parse", or something? [09:43:27] Werdna: maybe to about 100 per parse [09:43:50] *Werdna looks at the code. [09:43:51] this page is doing about 2000 per parse [09:43:59] with ambitions to do 8000 [09:44:19] *Shiroi_Neko likcs the parse [09:44:33] maybe even 100 is too low [09:44:45] it's hard to say without knowing the applications [09:44:55] isn't 100 a bit high? 
[09:45:15] I mean, that's 300ms of database time per page per user per parse [09:45:48] maybe [09:46:00] domas will back you up, I'm sure [09:46:29] TimStarling I am no mediawiki dev [09:46:46] you're not going to extort me into coding it for you by overloading the server [09:46:50] Allowing lots of ifexists for article applications is one thing, but for prettying up display of an administrative thing used sparingly by privileged users, it's madness [09:46:57] dinner, bbl [09:47:04] 03(CLOSED) Kazakh message updates - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=10367 +comment (10alefzet) [09:47:04] 03(NEW) Kazakh message updates - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12145 normal; normal; MediaWiki: Internationalization; (alefzet) [09:47:07] okay Shiroi_Neko I have a temporary solution [09:49:11] I'm doing it now [09:51:41] the old-fashioned special:prefixindex wway [09:51:53] TimStarling extort? [09:52:00] nah I may lick you to doing it maybe :P [09:52:24] *Shiroi_Neko licks TimStarling's dinner [09:52:47] Werdna well [09:55:20] doing what? [09:55:24] I've fixed your template. [09:55:58] mm... [09:56:05] *Shiroi_Neko licks Werdna [09:56:07] :P [09:56:35] yay! [09:57:41] it will work, but it won't do that fancy detection stuff. [09:58:32] Werdna it is woring perfectly [09:58:34] am I wrong? [09:59:18] click on the links. [09:59:44] *Shiroi_Neko clicks [09:59:45] oh [09:59:49] *Shiroi_Neko cries [10:00:43] why does it turn blue when nothing is linked [10:00:56] it should take you to a prefixindex page [10:01:27] yes [10:01:29] and itns blank [10:02:34] which link are you clicking on, on what page? [10:03:05] maybe he means blank as in no matches [10:03:17] which is how it's meant to be [10:03:17] Werdna okay [10:03:27] Anatolmethanol (talk · contribs · count · logs · block log · lu · rfa · rfb · arb · rfc · lta · checkuser · socks) [10:03:31] socks is blue [10:03:34] meaning it exists [10:03:36] yet it aint [10:03:45] deal with it [10:03:47] next item is grey [10:03:56] why is that? [10:04:15] if it was always blue, Id be fine with it [10:04:19] but its sometimes grey [10:04:25] whats causing that? [10:04:37] eh, some of the magic put in there [10:04:45] somebody else can tidy that up. I've fixed your problem. [10:05:07] I'm not really familiar with template syntax [10:05:14] especially when it's all crammed onto three lines. [10:05:35] yes well [10:05:55] Werdna the magic might be the problem why it is all breaking [10:05:56] TimStarling: of course, the issue itself needs to be resolved: the fact that this #ifexists could be used to mount a nasty denial of service attack [10:06:09] Shiroi_Neko: it was. that's why I removed the offending magic. [10:06:16] yes, it still needs a limit [10:06:31] *Shiroi_Neko likes to point out potential vunrabilities [10:06:36] *Shiroi_Neko hugs TimStarling [10:06:40] store that in the parser someplace? [10:06:46] TimStarling I hope you loved your food [10:07:16] yeah, and zero it in the clearState() hook, assuming we have one [10:07:31] I mean, it's an extension, so that's kind of ugly.. but if we generalised it to "extra database queries", then it might work [10:07:38] ah, a hook.. I see [10:07:39] I think there is a limit on #time already, so it would work like that [10:08:03] *Werdna svn ups and takes a look [10:08:04] you can add custom members to the Parser object, just prefix them with the extension name (or an abbreviation thereof) [10:09:10] $parser->pf_querycount++;? 
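A rough sketch of the pattern just discussed: keep a counter on the Parser object under an extension-specific prefix, reset it from the ParserClearState hook, and expose the ceiling as a global. The names here (efExampleResetIfexistCount, $wgExampleMaxIfexistCalls, the example_ prefix) are placeholders for illustration, not the code that actually went into ParserFunctions.

    <?php
    # Illustrative only: the throttling pattern, with invented names.

    $wgExampleMaxIfexistCalls = 100;   // per-parse budget, site-configurable

    $wgHooks['ParserClearState'][] = 'efExampleResetIfexistCount';

    function efExampleResetIfexistCount( &$parser ) {
        // Custom members on Parser get a prefix to avoid clashes.
        $parser->example_ifexistCount = 0;
        return true;
    }

    function efExampleIfexistAllowed( &$parser ) {
        global $wgExampleMaxIfexistCalls;
        if ( !isset( $parser->example_ifexistCount ) ) {
            $parser->example_ifexistCount = 0;
        }
        // Once the budget is spent the caller skips the existence check
        // and falls through to the "else" text instead of hitting the DB.
        return ++$parser->example_ifexistCount <= $wgExampleMaxIfexistCalls;
    }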
[10:09:13] TimStarling wouldnt it be better to fix parser functions themselves along with that safeguard [10:09:20] so they use fewer resources [10:09:27] that's what I'm doing/ [10:09:29] Werdna: yeah [10:10:00] btw, I blanked the offending template with an appropriate "this is developer activity. don't revert." edit summary [10:10:27] Shiroi_Neko: hey, good idea! [10:10:57] *Shiroi_Neko uses lick on TimStarling [10:10:57] let's just optimise the parser until it works at linespeed [10:11:15] then it won't be a problem anymore [10:11:18] wfRunHooks( 'ParserClearState', array( &$this ) ); [10:11:19] excellend [10:11:23] excellent* [10:11:39] TimStarling HAH [10:11:51] I am sure something else would break [10:12:08] can I assume that clearState will run before any parsing is done? [10:12:17] yes [10:13:21] if you don't call clearState() before calling the parser, all sorts of things break [10:14:15] heh, inclding this new amendment :p [10:14:39] the include size limits are done the same way [10:15:07] you don't have to call clearState() from outside, btw, parse() does it [10:15:23] but sometimes people try calling private members of Parser, or accessing member variables [10:15:27] rightio [10:15:30] and that sometimes breaks [10:16:56] TimStarling: there's already a clearState hook, in which something is stored inside the ParserFunctions class. Should I store the "number of ifexists calls done" there, or in the parser itself? [10:17:16] in the parser itself [10:17:21] really both should do that [10:17:29] Cite stores stuff locally as well [10:17:44] it breaks the differential fuzz tester [10:18:16] I didn't bother fixing it when the problem came up, but maybe I will some day [10:18:29] but for new code, put it in the parser [10:19:11] I should make this configurable, I guess. [10:20:22] do I just set up configuration variables the same way as in core code? [10:21:08] hang on [10:21:18] 03(mod) Kazakh message updates - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12145 +comment (10alefzet) [10:22:18] the usual way to configure extensions is to set the global variable at file level [10:22:33] okay, shall do [10:22:34] then the user will include the file, which sets the defaults, and configure it afterwards [10:22:40] same prefix? [10:22:49] guess so [10:23:11] but they have to configure AFTER they include_once(), correct? [10:23:22] yes, that's how it works [10:23:24] otherwise their settings are overridden by the defaults [10:23:38] I experimented with some other schemes, but that's the simplest one [10:24:41] some day we might load extensions strictly before the configuration is done, but currently, the user has to know when to do what [10:27:49] *Shiroi_Neko eyes [10:30:03] hmm, what's wrong with this syntax? [10:30:07] it's not actually /doing/ anything [10:30:08] {{#ifexist|Main Page|1|0}} [10:30:20] just passing straight through.. [10:30:48] #ifexist: [10:30:52] not #ifexist| [10:30:58] ah [10:32:38] <_AtH> It seems, that all my problems are because PHP has too low memory (16M). What MediaWiki version should I downgrade to? [10:36:10] _AtH: there's not a huge amount of difference in memory usage between versions [10:36:41] there's a bit of a difference between PHP versions, maybe that's what you're seeing? [10:37:05] TimStarling: how does this look? http://www.devanywhere.com/ViewPub.php?id=47 [10:37:14] <_AtH> PHP: 5.2.4 (apache) [10:38:28] <_AtH> I restored old MySQL database from backup, the same bugs. I even setup frest MediaWiki installation, bugs are on the first page! 
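The configuration convention described above, spelled out as a LocalSettings.php sketch: the include sets the defaults at file level, so any override has to come after it. $wgMaxIfExistCount is the setting named in the commit that follows; the value here is only an example.

    <?php
    # LocalSettings.php -- order matters for extensions configured this way.

    # The include defines the defaults...
    require_once( "$IP/extensions/ParserFunctions/ParserFunctions.php" );

    # ...so overrides must come afterwards, or the file-level default
    # silently wins.
    $wgMaxIfExistCount = 50;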
[10:39:28] Werdna: fine by me [10:39:42] _AtH: get more memory [10:39:45] and 100 is an okay limit? [10:39:54] yeah, ok with me [10:39:58] <_AtH> TimStarling: how I can do it? [10:40:04] *Werdna pokes domas [10:40:06] *Shiroi_Neko licks Werdna [10:40:13] ew. [10:40:15] what have I done :D [10:40:17] _AtH: is it shared hosting, or your own server? [10:40:24] <_AtH> hosting [10:40:25] Werdna cat licking is pleasant :P [10:40:28] <_AtH> commercial one [10:40:50] ask nicely [10:41:14] you need 100MB really, but 50 would probably get you out of trouble [10:41:26] <_AtH> 100MB??? [10:41:35] yeah, that's what we use, isn't it? [10:41:53] <_AtH> Installation script told I need 20MB of RAM [10:42:12] php5-x86_64.ini:memory_limit = 100M ; Maximum amount of memory a script may consume (8MB) [10:42:34] well, maybe that needs updating to account for the new version of PHP [10:42:49] <_AtH> Is there a way to change this setting outside of php.ini ? [10:43:00] 03werdna * r27946 10/trunk/extensions/ParserFunctions/ParserFunctions.php: (log message trimmed) [10:43:00] Prevent the parser function #ifexist being used more than a certain amount on [10:43:00] error pages. The maximum number of #ifexist queries can be set in [10:43:00] $wgMaxIfExistCount, which is, by default, 100. Any more uses over here will [10:43:00] default to the "else" text. Done to discourage templates like [10:43:03] Template:highrfc-loop on enwiki, which willingly does something like 50 database [10:43:05] queries for a template that is used for many users on one page. Clearly [10:43:13] you can try ini_set('memory_limit', '100M'); [10:43:17] but it probably won't work [10:43:31] domas: you owe me a cookie [10:44:09] <_AtH> Hmm... At first glance, it works. [10:44:40] <_AtH> Is there a way to check if it works? [10:44:53] in a new file: [10:44:54] hmm, scratch that... [10:45:25] you can just do this: [10:45:32] print ini_get('memory_limit'); [10:45:59] <_AtH> Encode this in what file? [10:46:15] after the ini_set in LocalSettings.php [10:46:22] LocalSettings.php has this per default: ini_set( 'memory_limit', '20M' ); [10:46:28] or was that changed recently? [10:46:46] probably should be changed [10:46:55] really? what eats all the ram? [10:47:05] <_AtH> Yes, 20M. Why 100M, I don't know. Currently I have 16MB. [10:47:09] code. [10:47:21] by the time we compile all the code, it becomes unwieldly [10:47:24] that's my understanding anyhow [10:48:06] yeah, code might take it over 20 these days [10:48:51] <_AtH> It still 16M even after ini_set( 'memory_limit', '100M' ); [10:48:59] <_AtH> So I can't use MediaWiki on this hosting? [10:49:19] <_AtH> print ini_get('memory_limit'); trick works -- thx, TimStarling [10:49:43] _AtH: what's the URL? [10:49:57] <_AtH> http://mgdvorec.ru/wiki/ [10:50:27] <_AtH> I already has about 20 users, that started to use MediaWiki. :( [10:50:32] <_AtH> *have [10:51:04] it *is* starting up [10:51:21] what crashes exactly? [10:51:30] <_AtH> templates and magic words [10:51:36] <_AtH> everything inside {{ }} [10:51:52] <_AtH> Then weird problems with MySQL database [10:51:53] strange [10:52:24] <_AtH> Bug #12073 [10:52:51] <_AtH> exists even on fresh installation with 16M of RAM for PHP [10:53:13] <_AtH> I wonder if I should downgrade to some old MediaWiki version [10:53:28] ah right, that's the bug that I had to downgrade the priority/severity for [10:54:21] <_AtH> downgrade priority??? [10:54:51] why do you think it's the memory limit? 
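Pulling the memory-limit advice together: the stock LocalSettings.php of this era ships ini_set( 'memory_limit', '20M' ), which can be too tight once all the code is loaded, and whether a raised value sticks depends entirely on the host, as _AtH found. A sketch of the two lines involved:

    <?php
    # LocalSettings.php -- ask for more memory and verify the host obeyed.
    ini_set( 'memory_limit', '100M' );

    # Temporary check; remove once confirmed.
    # print ini_get( 'memory_limit' );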
[10:55:27] yeah, if it's only your installation that's not working, and it's working just fine for everyone else, it's not critical [10:55:51] *_AtH is checking official Requirements [10:56:03] you want your money back? [10:56:24] is it giving memory exhausted errors? [10:56:26] <_AtH> May be I need a new hosting or something. [10:56:37] <_AtH> How to turn on this error checking? [10:56:42] <_AtH> No, no errors [10:56:52] so why do you think it's the memory limit? [10:57:56] <_AtH> The problem is that it worked fine for a couple of weeks. Then turns wild. [10:58:36] could be anything [11:08:44] *Shiroi_Neko wants TimStarling's money :P [11:08:51] huh? [11:09:00] Werdna free money! :) [11:09:08] shh [11:09:12] good morning fellows [11:09:16] okay [11:11:54] 03(NEW) maintenance/findhooks.php does not find all hooks - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12146 trivial; lowest; MediaWiki: General/Unknown; (ThomasBleher) [11:12:03] I'm not finding the options I need to create usergroups so I can restrict access - where do I start? [11:14:27] 03(ASSIGNED) Create patroller group on Russian Wikipedia - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12116 (10torin) [11:14:45] 03(NEW) Error deleting an uploaded file - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12147 major; normal; MediaWiki: Uploading; (jeremy.m.cook) [11:15:08] !access | Twinkletoes [11:15:08] Twinkletoes : For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [11:15:23] Werdna: Thank you :) [11:18:14] <_AtH> My hoster suggested to recompile PHP to get more memory. What PHP version is better for MediaWiki 1.11.0 -- 5.2.5 or older? [11:18:34] 03(mod) maintenance/findhooks.php does not find all hooks - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12146 (10ThomasBleher) [11:19:49] <_AtH> TimStarling: I think this problem is about memory limit, because configuration script warns me it need 20M of memory, and I has only 16M. [11:30:09] _AtH: I doubt it is the memory limit [11:30:19] I think it's some other problem [11:30:26] <_AtH> How could I trace it? [11:30:32] recompiling PHP might help [11:30:54] upgrading or downgrading [11:31:00] <_AtH> And what PHP version is the best for stable MediaWiki ? [11:31:18] <_AtH> 5.2.5 ? [11:31:28] none of them are stable [11:31:38] <_AtH> hmm... [11:31:42] just pick the one that works best [11:32:14] <_AtH> Do I need ldap ? [11:33:23] depends on what you want to do [11:33:26] <_AtH> configure can't find ldap.h [11:33:59] you don't need the LDAP module for PHP, no [11:34:07] <_AtH> Thanks. [11:41:01] is it possible to delete every mediawiki article complete? without removing mediawiki itself [11:41:22] cleaning it for a complete recover from a xmldump [11:52:42] mors [12:05:10] is there a way to remove all articles from a mediawiki installation to import a fresh xml dump? [12:07:20] this would be pretty useful for me...for like when you have two installations on two different servers [12:07:45] i would like to import every month a new xml dump to the "offline" version of my wiki [12:08:23] just drop the tables? [12:10:21] Nikerabbit: this would work? [12:10:37] btw how can i import a whole dump? [12:11:37] aaahh ok [12:11:41] importDump.php [12:14:02] oh ok...i have to empty some tables [12:14:05] okidoki [12:17:24] What does importDump.php do other that mysql -u $wikiuser -p wikidb <$wikidb? 
[12:23:42] 03(mod) Message transformation leaves strip markers - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12056 (10roan.kattouw) [12:23:55] Caldrin: its not a sql dump..its a xml dump [12:24:12] Caldrin: importDump is for xml dumps. very different from sql dumps. [12:29:32] i think the pro for the xml dumps are that they leave the sql table speceific stuff out...just the important things [12:32:12] Hi,How I can set me like sysop on my wiki? [12:33:00] I forgot it [12:33:11] Sevela_p: go to Special:Userrights [12:33:55] but this can use only bureaucrat [12:34:21] you don't have a bureaucrat account [12:34:23] ? [12:34:24] Sorry, my fault. [12:34:42] i instal my wiki and I don't know how to set me as bureaucrat and sysop [12:35:15] Sevela_p: did you create an account at setup ? [12:35:30] like WikiSysop [12:35:53] w8 [12:36:46] try to use this account [12:37:25] yes I created but I forget it. I would like to change it in source code, but I don't know where... [12:38:17] Sevela_p: you have to modify the database then [12:38:54] ialex:WikiSysop doesn't work [12:42:15] Sevela_p: go to Special:Listusers and see if there's an account that is a bureaucrat [12:43:36] hi there [12:44:21] anyone here had beem used mediawiki with AD, knows why a see a blank screen after try to login ? [12:53:47] klapzin: sorry i didnt tried that [12:53:50] <_AtH> TimStarling: recompiling PHP and setting memory limit to 32Mb helped. Thank you very much for support! [13:03:34] re [13:30:31] <^demon> Has Eagle been around recently? [13:35:20] Hi there. [13:36:48] I forgot the password of my WikiSysop but have access to the mysql-db. What could I do? [13:37:16] good queston...i would love to know that too [13:38:08] zwerg, http://e-huned.com/2006/08/15/reset-a-mediawiki-password/ [13:38:38] (I found that by typing in "reset mediawiki password" in the search box on www.google.ie) [13:38:40] reset the password using this SQL query: UPDATE user SET user_password = MD5(CONCAT('123-',MD5('newpassword'))) WHERE user_id=123; [13:38:59] where newpassword is the new password and 123 is the user's ID [13:39:24] <^demon> marvxxx: I haven't seen him in days. [13:40:12] 03wegge * r27947 10/trunk/phase3/languages/messages/MessagesDa.php: Added/updated translations for da [13:43:10] Thanks. [13:57:24] does anyone know a well opensource xml viewer or editor? [14:04:31] örörö [14:04:53] do well, but a good thing [14:05:56] and nope, i don't know any better than a good text editor [14:06:15] 03(FIXED) API for Extension:SiteMatrix - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11431 +comment (10roan.kattouw) [14:09:14] too bad but thank you Nikerabbit [14:17:13] 14(INVALID) Please, may we use square brackets in JSON callbacks? - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12136 +comment (10roan.kattouw) [14:18:00] is Roan Kattouw in this channel? [14:18:10] 03(WONTFIX) Patch that includes previous revision ID in prop=revisions - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=10297 +comment (10roan.kattouw) [14:22:10] TimStarling: PING [14:22:19] hi [14:22:43] so, basic test case, does x work? [14:22:56] Testing [14:23:26] UNIQf26fbe533c53ff4-nowiki-00000001-QINU [14:23:47] Hardcoded in the wikitext source :O [14:23:54] there's an interesting detail there... [14:23:58] Exactly [14:24:06] is there a delete character before the UNIQ? 
[14:24:12] Yes [14:24:15] Some weird char anyway [14:24:29] One non-printable before UNIQ and one after QINU [14:24:32] ok [14:24:35] Whether they're the same char I can't tell [14:24:43] no, just a guess [14:25:33] the fact that it corrupts text on save is not particularly surprising, I fixed lots of bugs like that during development [14:25:48] the fact that the entire strip marker system is broken is the strange thing [14:26:46] let's see if we can get some intermediate output... [14:27:10] what OS are you on? [14:27:16] Windows XP [14:27:19] *RoanKattouw hides [14:27:26] *shrug* same as me [14:27:44] My Linux PC started emitting smoke a while ago [14:27:56] So I set up WAMP [14:28:07] is it a public server? [14:28:11] No [14:28:15] And it's slow as hell [14:28:23] I do have Special:Eval installed though [14:30:05] TimStarling do you know Hamachi? [14:30:25] no [14:30:39] grrr, you think your computer is slow... [14:30:54] No, I think my *Apache* is slow [14:31:09] Anyway, Hamachi can be used to set up a fake LAN over the internet [14:31:21] That way you could connect to my Apache without me having to expose it to the world [14:32:00] let's just try a few traditional methods first... [14:32:20] brion-away [14:32:24] I got your stats [14:32:30] I observe a minor problem tho [14:36:27] 14(WFM) List of rights that can be granted/revoked broken in Special: Userrights - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11505 +comment (10roan.kattouw) [14:37:28] RoanKattouw: here we go: http://p.defau.lt/?qIVtemE8iLS5dVQCD_rQvA [14:37:42] save as /path/to/wiki/bug12056.php [14:38:13] execute it with a browser and paste me back the result [14:38:15] TimStarling: what do you thing about http://lists.wikimedia.org/pipermail/mediawiki-api/2007-November/000211.html ? [14:40:11] 1. It would be less expensive to fetch 500 (5000) rows on *one query* [14:40:11] than to do *10 times the same query. [14:40:14] TimStarling: http://p.defau.lt/?2HpEVCWW5MUqEdJr5xqMUw [14:40:16] that's not the two options [14:40:39] TimStarling my opinion in this matter is that the 500 limit is there to slow down nasty people [14:40:54] Raising it to 5,000 makes attacking a wiki exactly 10 times easier [14:41:07] right, the point is throttling [14:41:20] But don't forget about Special:Allpages that allows 5000 [14:41:20] I do agree with the second point [14:41:27] That's just insane [14:41:35] while your single request is running, it can't be throttled in any way, and the requester doesn't know how much time it is using [14:41:39] it's not even cancellable [14:42:21] I do agree with the highlimits part [14:42:29] apihighlimits should be a separate right [14:42:32] Looking into that right nwo [14:42:34] *now [14:42:52] *VasilVV will add it soon [14:43:03] *RoanKattouw is trying to add it right now [14:44:55] RoanKattouw: your output paste is exactly what it's meant to do [14:45:23] That doesn't mean much to me xD [14:45:30] It still doesn't work [14:46:24] with the advances in Pywikipedia, perhaps highlimits could be given to all accounts with Bot flag [14:46:56] 03(mod) SelectCategory: previewing clears the categories - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=9849 +comment (10slowder) [14:47:40] Hojjat will add highlimits for bot and sysop in DefaultSettings [14:48:00] ? [14:48:30] Hojjat are we talking about the same thing? APIhighlimits? 
[14:48:38] yes [14:48:47] I'm adding it right now [14:48:59] I'll have DefaultSettings.php say it's enabled by default for sysops and bots [14:49:11] oh okay [14:49:37] "Hojjat will add..." -> "Hojjat I will add" [14:52:11] 03catrope * r27949 10/trunk/phase3/ (4 files in 3 dirs): Adding apihighlimits permission. Users with this permission can request 10 times as many rows in API requests. Enabled by default for sysops and bots. [14:52:12] "Hojjat: will add" was intended [14:52:18] There is is [14:52:22] *it is [14:54:50] RoanKattouw thanks [15:03:43] RoanKattouw: I'm casting the net a little wider: http://p.defau.lt/?ZJBHOXwkrIskX9CV__JbfQ [15:04:02] it's another script that you can run, like before [15:04:25] [Thu Nov 29 16:00:14 2007] [error] [client 127.0.0.1] PHP Notice: Undefined variable: rv in C:\\Program Files\\Apache Software Foundation\\Apache2.2\\htdocs\\t\\bug12056.php on line 10 [15:04:26] [Thu Nov 29 16:00:14 2007] [error] [client 127.0.0.1] PHP Fatal error: Call to a member function getText() on a non-object in C:\\Program Files\\Apache Software Foundation\\Apache2.2\\htdocs\\t\\includes\\Parser.php on line 4063 [15:04:43] oops, sorry, fixed another version but not in the paste [15:04:54] change that $rv to $rvw [15:05:35] PHP Fatal error: Call to a member function getText() on a non-object in includes\\Parser.php on line 4063 [15:06:11] is there a way I can link to the upload page and specify the name of the file and the text of the link? [15:07:00] you can specify the text of the link as in any link [15:07:31] using external link syntax, you can probably also give the (target) name of the file as a url parameter [15:07:40] RoanKattouw: can you add var_dump($t) on line 5? [15:07:43] is it a title object? [15:08:06] *VasilVV implements limit=max [15:08:43] DigitallyBorn: just give wpDestFile=foo.bar in the url [15:09:07] TimStarling: it's a Title object, error still stands [15:09:17] object(Title)#29 (19) { ["mTextform"]=> string(1) "X" ["mUrlform"]=> string(1) "X" ["mDbkeyform"]=> string(1) "X" ["mUserCaseDBKey"]=> string(1) "x" ["mNamespace"]=> int(0) ["mInterwiki"]=> string(0) "" ["mFragment"]=> string(0) "" ["mArticleID"]=> int(-1) ["mLatestID"]=> bool(false) ["mRestrictions"]=> array(0) { } ["mCascadeRestriction"]=> NULL ["mRestrictionsExpiry"]=> NULL... [15:09:19] ...["mHasCascadingRestrictions"]=> NULL ["mCascadeRestrictionSources"]=> NULL ["mRestrictionsLoaded"]=> bool(false) ["mPrefixedText"]=> NULL ["mDefaultNamespace"]=> int(0) ["mWatched"]=> NULL ["mOldRestrictions"]=> bool(false) } [15:09:27] no need for that [15:12:08] put an exit; before this: [15:12:09] $editInfo = $article->prepareTextForEdit( $text ); [15:12:15] does the error still happen? [15:12:46] Yes [15:13:30] weird... [15:14:52] oh... [15:15:25] not sure why it didn't get an error on my wiki, maybe that's significant, but there is a bug [15:15:47] Duesentrieb: I tried "[/index.php?title=Special:Upload&wpDestFile={{PAGENAMEE}}.jpg Upload Image]" and "[[Special:Upload?wpDestFile={{PAGENAMEE}}.jpg|Upload Image]]" [15:15:56] Duesentrieb: neither worked .. suggestions? [15:16:33] DigitallyBorn [{{fullurl:Special:Upload|wpDestFile={{PAGENAMEE}}}} upload image] [15:17:02] DigitallyBorn: external link syntax only works with full urls [15:17:15] yea, do what RoanKattouw said [15:17:16] aha [15:17:52] sweet .. 
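The apihighlimits right added in r27949 above is an ordinary permission, so beyond the sysop and bot defaults it can be granted or revoked per group from LocalSettings.php. A small sketch; the "trusted" group name is an example, not something that ships with MediaWiki.

    <?php
    # LocalSettings.php -- adjust who gets the raised API row limits.
    $wgGroupPermissions['trusted']['apihighlimits'] = true;   // hypothetical group

    # Or take it back away from bots:
    # $wgGroupPermissions['bot']['apihighlimits'] = false;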
thanks :) [15:18:11] DigitallyBorn: make a template :) [15:19:18] Duesentrieb: That's what I'm doing :) [15:19:26] Duesentrieb: Lots of templating [15:20:08] 03vasilievvv * r27950 10/trunk/phase3/ (RELEASE-NOTES includes/api/ApiBase.php): * Add limit=max to API. It also adds limit info to output [15:21:52] RoanKattouw: try this version: http://p.defau.lt/?RyE2FaCMvoyV46pwju_IuA [15:22:47] using "{{{name|Name left blank}}}" doesn't seem to ever use the default [15:23:26] oh .. it only works if the param isn't specified at all [15:24:07] that's kinda lame .. I've gotta do parser function logic for parameters to make sure the default is shown correctly all the time? [15:24:12] TimStarling: same error [15:24:17] Oh wait [15:24:19] *RoanKattouw forgot to save [15:24:44] Nope, still same error :( [15:24:55] very odd [15:25:23] TimStarling does it work for you at your MW install then? [15:25:37] DigitallyBorn: yes. otherwise, it would be impossible to expicitly override a default with a blank [15:26:18] yes [15:26:25] can I have a backtrace at the point of the fatal? [15:26:42] print wfBacktrace(); should be fine [15:27:26] TimStarling sure [15:28:16] it's probably some trivial difference, $wgTitle not initialised or something [15:28:53] http://p.defau.lt/?MRJfWObG6327I1zyyCkqog [15:29:07] That's debug_print_backtrace();, wfBackTrace(); didn't output anything [15:29:21] 03thomasv * r27951 10/trunk/extensions/ProofreadPage/proofread.js: removing deprecated test [15:30:58] wfBacktrace() returns its output, it doesn't print it [15:31:26] Oops [15:31:48] Adding print wfBackTrace(); [15:31:55] * Parser.php line 4063 calls wfBacktrace() [15:31:57] * Parser.php line 4007 calls Parser::pstPass2() [15:31:57] maybe this is related [15:31:58] * bug12056.php line 11 calls Parser::preSaveTransform() [15:32:21] how's debugging going? [15:32:36] Nikerabbit I don't know, TimStarling tells me what to do and I do it [15:32:44] All I know is that it ain't fixed yet [15:33:52] Is there some sort of default throttling in mw? I'm trying to spider my own local instance of mw, and it seems to "pause" for > Apache timeout value after about 7-8 initial requests. [15:33:53] maybe I should try an upgrade to 5.2.4 [15:34:16] Fails with wget, other tools... but I can spider other sites on my same server to thousands of links, without any issues. [15:34:30] So it's not Apache, Apache is waiting for the next request... [15:34:31] 80% chance I'm just crazy and this is trivial [15:34:40] but if I'm not crazy, it's a PHP bug [15:34:50] Maybe it is [15:34:56] I'm on 5.2.2 [15:35:04] Would be very weird and very hard to narrow down, though [15:35:24] TimStarling 5.2.5 [15:35:43] http://translatewiki.net/w/?title=Process/tasks&diff=prev&oldid=164002 is this related? [15:36:29] looks like the same, but it doesn't happen every time [15:36:34] Nikerabbit that's *exactly* what we're researching now [15:36:47] All XML tags being replaced by UNIQ tokens [15:37:28] well, Nikerabbit's one is random, your one is reproducible [15:37:36] True [15:39:01] hrm, this is weird [15:39:10] it's definitely mediawiki throttling my local crawling [15:39:27] But I can't see anything in the DefaultSetup or LocalSettings that would control that [15:40:45] RoanKattouw: if you comment out lines 7-10, does it give an error? [15:41:01] that's $wgParser->startExternalParse to $usw = [15:41:14] I'm guessing not [15:41:38] Backtrace + error [15:41:55] well, so much for guessing [15:42:34] what if you construct a new parser? 
[15:42:37] $wgParser = new Parser; [15:42:44] Where? [15:42:51] before the $pst line will do [15:43:03] Do I still need to comment out 7-10? [15:43:09] doesn't matter [15:43:39] Doesn't help [15:43:40] this is mostly to rule out extensions [15:43:56] Shall I uncomment all extensions in LocalSettings? [15:43:58] new parsers don't have extensions [15:44:14] no, probably won't make a difference [15:44:44] if you've got a theory though on where this fatal comes from, feel free to test it out [15:45:06] Right now I'm just a brainless dummy reading a book and looking up every time someone says something at IRC [15:45:14] When you tell me to change something, I do that and tell you the results [15:45:17] I don't think ;) [15:45:36] TimStarling any luck on upgrading PHP at your end? [15:48:32] Hello [15:49:49] I found out that 5.2.5 is out and started reading the changelog [15:50:07] downloading 5.2.4 now [15:53:16] Anybody know if there's an easy way to include articles in the User: namespace in Special:Popularpages ? [15:53:27] code it yourself [15:53:41] Hi, does anyone know why at http://www.themoonwiki.org/wiki/Help:Formatting#Single_citation_of_a_reference_or_footnote the 4th citation uses numbers rather than letters? [15:55:10] DigitallyBorn: you've been asking that for two days. Did $wgContentNamespaces not work? [15:55:25] minute, Because you're using
    [15:56:32] !what | DigitallyBorn [15:56:32] DigitallyBorn: Do you understand the words that are coming out of my mouf? [15:56:32] setuid: How would I fix that? [15:56:58] XD [15:57:02] lol [15:57:24] anyhoo, I was pretty sure $wgContentNamespaces should do the trick, DigitallyBorn [16:00:39] have to restart [16:06:14] 03(mod) CAPTCHA group exceptions only apply to edits - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12142 (10ype.sun) [16:06:20] hi [16:07:15] Is there actually any documentation whatsoever for the cite extension? [16:07:16] clone(array()) leaks memory... ugh [16:09:45] Uh, I guess I have to RTFS. [16:10:39] setuid: It has nothing to do with that whatsoever. [16:23:15] I'm running on 5.2.4 now, no fatal error [16:23:27] no strip markers either [16:23:36] hi brion [16:23:38] Weird [16:23:45] yo [16:23:53] You do have it running as CGI, don't you? [16:23:55] in for at least a few hours today [16:24:38] no, apache2handler [16:24:53] Oh [16:24:56] but usually it's pretty consistent across SAPIs [16:25:01] PHP: 5.2.4 (cgi-fcgi) [16:25:18] presumably you could reproduce this problem on the command line [16:25:31] do you have the CLI version, or just CGI? [16:25:38] Dunno, will check [16:26:34] CLI also [16:26:58] so you could do... [16:27:08] cd \path\to\wiki\maintenance [16:27:10] php eval.php [16:27:24] then paste everything from my script before, except the WebStart line [16:27:40] probably only necessary to go as far as the fatal error though [16:28:52] avar: Hi, do you have a moment? [16:29:02] TimStarling: I'll do it line by line [16:29:50] when i do a xml backup of my mediawiki installation...are the pages for the images included? [16:29:58] or would i have to readd them? [16:30:00] Image descriptions, yes [16:30:03] yes, but not the iamges themselves [16:30:08] The images themselves aren't [16:30:14] cause else i just backup my images folder and the xml and im fine [16:30:30] so i just would need the images folder and the xml [16:30:35] brion-office: Roan's helping me with bug 12056, aka "my whole wiki is completely broken and it's all your fault" [16:30:40] but userinfo inst included neither right? [16:30:45] i would have to readd my user [16:30:56] marvxxx not in XML, no [16:30:56] heh [16:31:01] Do a database dump for that [16:31:06] !backup | marvxxx [16:31:06] marvxxx: http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki [16:31:21] I'm tired though, been up for a looong time [16:32:01] TimStarling: the backtrace is hit on line 11 (PreSaveTransform), but no fatal error [16:32:03] RoanKattouw: but the xml holds up who edited the pages....with a username...so i just would have to register a user again? with the same name..and it would get mapped up again? [16:32:05] I'll continue [16:32:09] wow that's a lot of strip markers [16:32:13] marvxxx no [16:32:26] XML stores user IDs [16:33:03] windows explorer has crashed, going to try to restart it... [16:33:54] RoanKattouw: no way i can get around with a xml backup then? [16:34:07] No [16:34:36] but nothing would happen if i would create user with the same name? [16:34:49] TimStarling: eval.php doesn't like HereDoc [16:34:56] or would it give problems? [16:35:00] marvxxx they'd also have to have the same userID [16:35:20] but i cant control that right? 
[16:35:30] Yes, by manually adding them to the DB [16:35:37] But then you might as well DB-dump the user table [16:35:43] Or the entire DB for that matter [16:35:43] i do that [16:35:53] but i thought it would be much easier with a xml dump [16:36:05] or more not so..."dirty" [16:36:21] hey, noone has a copy of the bibwiki plugin do they? the sites down, and i realllllly want a copy :p [16:36:35] kim: whats the bobwiki? [16:36:40] bibwiki [16:36:42] marvxxx: Nope, database dumps are the way to go. [16:36:44] its for amnaging bibtex data in mediawiki [16:36:50] *managing [16:36:59] it didn't die quietly [16:37:06] did I miss anything? [16:37:12] no [16:37:25] TimStarling: eval.php hates HereDoc so it's gonna be a little more trouble gathering those vars [16:37:27] But dinner's ready [16:37:36] So you'll have to wait for a bit [16:37:41] well, you've already proven it's different [16:38:02] minute: ok thank you [16:39:10] I'm going to bed [16:40:14] sleep well TimStarling [16:40:40] how can i solve this -> Wiki is set to not allow updates [16:41:09] this happen when i try to auth in ad [16:43:39] Hi Folks, i have a problem witrh my wiki, when i search VBM then i get my page, but when i search vbm then i get a page to create a new one... what must i change? [16:43:44] 03(mod) Error deleting an uploaded file - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12147 +comment (10theurge14) [16:45:30] VasilVV: why add 'limit=max'? wouldn't just not putting a limit do this? [16:47:02] brion-office: because it's much easier to add limit=max than to read user rights, then think if this query has high (500/5000) or low (50/500) limits, etc. [16:47:16] eh? [16:47:26] Hi there, Im trying to set the title of a page in mediawiki, but the documentation I found online isnt very clear on the subject. How do I set the page title? [16:47:26] there's going to be an inherent limit on the backend [16:47:33] asking for 'limit=max' is meaningless [16:48:00] why? [16:48:06] phoenixz: what kind of page and what kind of title? [16:48:22] VasilVV: i think the burden is on you to explain why saying "limit=max" is different from not saying anything. [16:48:56] By default limit=10 for most queries [16:49:06] brion-office, er, the main wiki page.. I'd like to give it some catchy title [16:49:07] Hello [16:49:28] phoenixz: hit the 'move' tab and change its title [16:49:38] brion-office, ahah [16:49:39] ok [16:49:42] then to skip the redirect for the links to the main page, edit MediaWiki:Mainpage on your wiki and change that to the new title [16:49:56] brion-office, I thought I had to specify that in the page itself [16:50:06] VasilVV: ahhh, so not asking for a limit gives you a different limit than the maximum? [16:50:09] that's a bit weird :) [16:50:12] ok that makes sense then. [16:50:55] how can i solve this -> Wiki is set to not allow updates [16:50:57] brion-office, anyway, this is like the name of the page.. Im more talking about the title that will show up in the browser window bar.. [16:51:52] how can i do to make my Wiki-seaech case-insensitive? [16:52:06] phoenixz: that's the same thing [16:52:11] title is title is title [16:52:33] Peperoni: wiki search is always case-insensitive [16:52:43] brion-office, I see.. I thought page title and window bar title could be separately specified [16:52:53] no. 
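A footnote on the browser-title exchange above: page title and window-bar title really are the same thing, but the window bar also carries the site name after a dash, and that part is set in LocalSettings.php rather than on any page. A small sketch with an example value:

    <?php
    # LocalSettings.php -- the "... - Sitename" suffix in the title bar.
    $wgSitename = "My Catchy Wiki";

The "$1 - {{SITENAME}}" arrangement itself is the system message MediaWiki:Pagetitle, so the suffix can also be reworded or dropped on-wiki, much like the interface messages discussed later in the log.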
[16:55:03] brion-office: The limits thing is not as weird as it sounds: we don't want people who accidentally forget to specify a limit to put too much load on the servers [16:55:22] brion-office, i get not my page they have i created unter VBM when i search vbm then it goes Vbm and its a blank page [16:55:39] RoanKattouw: yes it has some logic to it :) [16:55:51] Peperoni: that's because the wiki considers VBM different from vbm and vbM and all that [16:56:02] Everything except the first letter is case-sensitive [16:56:05] Peperoni: that's the 'go' which goes to exact matches. that checks several case variations. [16:56:13] if there's not an exact match, then it goes on to do a search. [16:56:18] search is always case-insensitive. [16:56:49] brion-office, i type it in der Adress Filed of my browser and press ENTER [16:57:26] Peperoni like /wiki/vbm ? That's gonna lead you to the Vbm page, which probably doesn't exist [16:58:22] To get to VBM you have to do /wiki/VBM [16:58:33] Or you can create a page called Vbm that says #REDIRECT [[VBM]] [17:00:11] hmm, can i nothing change to make the search over der adress field case-insensitive? [17:01:48] 03(mod) Message transformation leaves strip markers - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12056 (10brion) [17:02:21] Peperoni: URLs aren't searches :) [17:03:04] brion-office, yes but i used wiki all times so :) [17:03:28] well, make a redirect from Vbm as noted above :) [17:03:29] Peperoni are you German or Dutch? [17:03:36] German :) [17:03:40] Ah [17:03:46] du au? [17:03:52] Niederländer [17:04:09] ahhh i love it :) the law in NL is like heaven [17:04:14] Peperoni du kannst eine Weiterleitung machen von [[Vbm]] nach [[VBM]] [17:04:40] RoanKattouw, jo das wusste ich, aber des problem ist das ich dann ALLE seiten bearbeiten muss da ich immer grogeschrieben habe :( [17:04:53] Richtig [17:05:03] bzw. bei jeder die mit groenbuchstaben anfngt ein redirect zur richtigen seite machen muss [17:05:11] jop, das wollte ich mir sparen [17:05:45] finde ich aber doof das so eine "simple function" nicht eingebaut ist. $wgSearchcaseinsensitiveoverURL = true; wre doch fine ;) [17:06:05] Nein, das haben wir nicht [17:06:51] Was sollte passieren wenn auf eine Wiki [[VBM]] und [[Vbm]] Artikel sind, und $wgSearchcaseinsensitiveoverURL = true; zugefugt wird? [17:08:15] ja sogesehen schon, aber es macht doch kein unterschied ob ich Vbm oder VBM oder vbm schreibe, ist doch alles das selbe.. es hat ja auch keine andere bedeutung wenn ich Irc schreibe oder IRC ist immer das selbe... [17:09:05] Viele Programme denken nicht so [17:11:57] 03(mod) Message transformation leaves strip markers - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12056 (10roan.kattouw) [17:18:29] 14(WFM) SelectCategory: previewing clears the categories - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=9849 +comment (10samuel.gelineau) [17:18:39] minute: what do you need? [17:24:27] avar: I am having some trouble with the cite extension, instead of displaying "a b c" when there are multiple references with the same name it displays "4.0 4.1 4.2" - is there anyway to fix this? [17:29:18] yes there's a configuration setting for that, see the docs on meta [17:30:19] avar: Ahh, so that is where the documentation is. Been looking on MediaWiki.org and couldn't find it. Thank you. [17:31:09] Which documentation? [17:31:27] That which is at wikimedia? Or other? [17:31:29] avar: Can you give me a link to this documentation - I can't find it after searching meta. [17:32:30] Testing, testing. 
Do i have to register with nickserve to even show up here? [17:32:38] been years since I used IRC. lol [17:32:52] no [17:32:55] I can see you. [17:32:58] great [17:33:00] ty [17:33:21] minute: I don''t remember, use google:) [17:33:23] avar: In fact, looking at the code, no configuration variables are used apart from $wgUseTidy. [17:33:45] I am looking for simple tutorial for making minor layout changes to elements and messages on the basic Editpage. It must be easier than all the custom stuff I am finding. [17:34:36] I want to mainly supply my own buttons and messages. Is there some page somewhere that just deals with this" I've not been able to find it. [17:35:12] actually, I'd be content to just move things around a bit first. but there doesnt' seem to be a template for for this. [17:35:28] am I missing something obvious? [17:35:58] avar: May I ask, how can I find information on non-existant configuration variables? [17:38:47] I'll take that as "you can't". [17:40:09] lol [17:40:42] wheezer: I think the only way to do that would be to either use CSS, or modify the source code. [17:40:56] i was afraid you would say that [17:41:15] seems odd that with a template engine already there, someone would not theme the editor, no? [17:41:35] My gut keeps saying it must be easier than this. [17:42:01] That is rather unideal, however things generally aren't easy here - presumably no one has ever seen the need to do such a thing. [17:42:12] Of course it is simple to just change the messages on the buttons. [17:42:38] Simply find the relevant system message in Special:Allmessages and modify it. [17:42:38] where do I do even that? [17:42:44] ah [17:42:59] and you find the original message in the code? [17:43:02] and look it up? [17:43:05] no [17:43:21] You just look for the message which has the same message content as on the page in that special page. [17:43:43] oh.. and hope there are not two "Cancels"? [17:43:45] lol [17:44:15] e.g. If you wanted to change "Save page" to "Save me!" you would use a find tool (most browsers come with this) and look for "Save page" (without quotes) then modify the message that row corresponds to. [17:44:55] oh.. right. i remember now. It's just a big list [17:45:10] and it specifically shows the context. got it [17:45:19] :) [17:45:33] does it list them for the bulk text messages too? [17:45:46] Yep. [17:45:51] "PLease note that all contributions, etc" [17:46:23] and what I just want to squelch the message complete? Just kill it with CSS? [17:46:27] if i [17:46:39] yeah [17:46:59] or just replace the MediaWiki: page with a single space. [17:47:23] replace? You lost me [17:47:42] oh, you mean edit the text, and replace with a space? [17:54:35] wheezer: yes [17:54:42] Minute. Is there a way to "comment out" the message, so I don't have to delete it? [17:54:52] [17:54:57] cool [17:55:04] ty [17:55:24] sorry for being such a preemie. It's a lot to absorb in only a few days :) [17:57:09] I don't suppose things like the Save Page, Show Preview button bars (and summary input) are also in a special page, with variables, are they? [17:57:27] They should also be system messages. [17:57:52] Since they are definitely customizable and that is the only way to customize things. [17:58:33] so, you are saying that i can change each element.. so long as I don't want to change the order of anything in the flow of the page. If I want to do that, my only option is to customize Editpage.php? 
[17:59:24] or css, yes [18:00:55] I presume that we can make our own special pages, with variables, the same way, when we get to that? [18:02:44] wheezer: yep [18:04:05] To your knowledge, there is no extension, specifically designed for reconfiguring an edit page? Except such replacements like FckEditor, etc? [18:05:38] wheezer: To my knowlege, no. [18:05:44] dang [18:06:06] HELO [18:06:16] why do so many people lurk on this channel? Just to get help when they need it? [18:06:24] Some of them are AFK. [18:06:28] hi niker [18:06:31] Others aren't paying attention to IRC> [18:06:33] 03(NEW) Text that's new should be colored the same as text that' s changed - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12148 15enhancement; low; MediaWiki: History/Diffs; (natelroot) [18:06:35] wheezer: some of em just watch [18:06:55] And from now I'm not available because I have to restart my computer as for some reason I cannot access Google Mail; [18:13:54] back [18:14:37] pigin is a better client, i hope [18:29:33] i was wondering if there is a way to work with message bundles and mediawiki? [18:34:59] hi [18:35:02] anybody there [18:35:05] require help [18:39:53] !ask TuskerKing [18:39:56] !ask | TuskerKing [18:39:56] TuskerKing : Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [18:40:35] mediawiki is successfully installed how to remove installation message from Main Page [18:40:36] ?? [18:40:44] Edit it. [18:40:51] not editable [18:41:04] Then you've changed the configuration settings. [18:41:12] Do you have a link? [18:41:18] yes [18:41:21] indianfinearts.in [18:41:30] http://indianfinearts.in/index.php?title=Main_Page [18:41:54] Message: MediaWiki has been successfully installed. [18:41:56] Consult the User's Guide for information on using the wiki software. [18:42:15] TuskerKing, the account named "Admin" protected it. You have to log in as a sysop account to edit it. [18:42:18] http://indianfinearts.in/index.php?title=Special:Log/protect [18:42:24] yes [18:42:29] i am logged in here as admin [18:42:55] but message below main page is not editable [18:42:58] how this happened? [18:43:34] What does this say? http://indianfinearts.in/index.php?title=Main_Page&action=edit [18:44:21] oh [18:44:26] that was okey [18:44:32] but trouble with the default link [18:44:37] skin [18:44:44] *skin i have used [18:44:47] TuskerKing, ask whoever wrote/installed your custom skin, don't ask us. [18:45:18] okey thanks simetrical [18:45:28] good night! [18:47:01] nice... https://addons.mozilla.org/en-US/thunderbird/addon/4268 [18:47:11] email diff syntax highlighter :D [18:47:38] Heh, neat. [18:47:58] *Simetrical waits for Gmail to implement that . . . they must use SVN or something, right? [18:53:09] Simetrical: you could probably do a greasemonkey script to customize it :D [18:53:19] Good grief: http://www.entropiaforum.com/forums/wlwc/28871-things-you-cant-say-notepad.html [18:53:24] ok, anybody got objections to current svn state? [18:53:56] Not particularly, that I can think of. 
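To make the Special:Allmessages advice above a bit more concrete: every piece of interface text is a page in the MediaWiki namespace, so changing a button label or silencing a notice is just a (sysop) edit. The message names below are from memory, so verify them on Special:Allmessages before relying on them:

    MediaWiki:Savearticle        - label of the "Save page" button
    MediaWiki:Showpreview        - label of the "Show preview" button
    MediaWiki:Copyrightwarning   - the "Please note that all contributions..." notice on the edit page

Replacing the content of such a page with a single space (or hiding the element with CSS) effectively blanks the message, as discussed above.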
[18:55:09] ok, i'm going to pull out the SVN up [18:55:28] 03(FIXED) Set $wgAutoConfirmCount to 50 edits for the Arabic (ar) Wikipedia - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12123 +comment (10jeluf) [19:02:25] I'm going to take over a mediawiki 1.5.7 based site (via a database dump most likely), will I be able to use the database dump with mediawiki 1.11.0, or should I start by installing an older version of mediawiki? [19:02:35] anyone here knows how i set the wiki to auto create accounts when i use Ad [19:02:36] ? [19:04:02] Kaare, I *think* it will work if you install it on 1.11.0 and then run maintenance/upgrade.php. [19:04:18] In principle, it should work. [19:04:22] klapzin: "ad"? ad what? auto-create... how? [19:04:50] Kaare, if you install the database and then install the software, in fact, it might be nice enough to auto-upgrade everything for you without you needing shell access to run upgrade.php. [19:04:53] Active Directory [19:05:04] Kaare: in theory, the updater will convert the database. make an XML too though, for good measure. [19:05:07] klapzin, find an extension. [19:05:11] !ldap [19:05:11] http://www.mediawiki.org/wiki/Extension:LDAP_Authentication [19:05:19] Duesentrieb, XML dump is no substitute for a database. Doesn't include all sorts of things. [19:05:21] Duesentrieb, create the accounts in mediawiki automaticaly [19:05:27] Kaare: XML dumps are fully compatible across all versions (starting with mw 1.4) [19:05:33] TimLaqua, LDAP != Active Directory [19:05:34] siebrand, i already use this extension [19:05:36] Simetrical, Duesentrieb: thanks, I'll give it a try when I get the data, and cross my fingers :-) [19:05:43] Simetrical: i'm aware of that. it's a usefull addition though [19:05:49] Possibly. [19:06:06] it's also a nice thing to have for fallback [19:06:19] Simetrical: eh? [19:06:46] xml dumps aer good for long term backups [19:06:58] klapzin: when? based on what? [19:07:03] correct, a protocol is not a directory service... [19:07:24] TimLaqua, oh, does it? [19:07:26] *Simetrical looks it up [19:07:47] See, I know nothing about LDAP. [19:07:57] Okay, so you're right, the LDAP thingie should be just what he wants. [19:07:59] Duesentrieb, when the user try to login in wiki the wiki will check the account in AD and if ok ( user and pass ) the wiki create user on db [19:08:00] it's the protocol that is used to communicate with AD [19:08:12] *Simetrical vaguely knew AD was some MS thing, didn't know it used LDAP [19:08:16] TimLaqua, yes [19:08:43] klapzin: i missed the bit about "AD" = "Active Directory" :) well, talk to TimLaqua and Simetrical then :) [19:08:45] every LDAP comes from netscape [19:08:55] klapzin: http://www.mediawiki.org/wiki/Extension:LDAP_Authentication is the way to go. [19:08:57] hehe [19:08:59] klapzin: if you want users to login using their AD sAMAccountName and password, then use the LDAP Authentication example. It does auto-create MW accounts. [19:09:03] it has about a million ways to configure it [19:09:07] Duesentrieb, i read all of these .. [19:09:09] people have made it work with active directory [19:09:20] it works great on AD. [19:09:34] yes, but its not work with me, i dont know why [19:09:45] you configured it wrong. post in the extension's talk page [19:09:48] look the msg [19:09:50] Wiki is set to not allow updates [19:10:17] *TimLaqua pokes Ryan_Lane [19:10:37] klapzin: which message? [19:10:41] Wiki is set to not allow updates [19:10:48] !what [19:10:48] Do you understand the words that are coming out of my mouf? 
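Back to klapzin's Active Directory question: the LDAP Authentication extension linked above does auto-create local accounts when the directory accepts the credentials. A heavily trimmed LocalSettings.php sketch with placeholder domain and server names, listing only the settings I'm reasonably confident about (the extension has many more, and working AD examples live on its talk page):

    require_once( "$IP/extensions/LdapAuthentication/LdapAuthentication.php" );
    $wgAuth = new LdapAuthenticationPlugin();
    $wgLDAPDomainNames    = array( "EXAMPLE" );                        # placeholder domain label
    $wgLDAPServerNames    = array( "EXAMPLE" => "dc1.example.local" ); # placeholder AD domain controller
    $wgLDAPEncryptionType = array( "EXAMPLE" => "tls" );
    $wgLDAPUseLocal       = false;

As for the "Wiki is set to not allow updates" error, the channel's advice was to post a cleaned-up config on the extension's talk page rather than guess.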
[19:10:50] wiki show me this error [19:10:58] haioshdiaushdi [19:11:03] its rush hour 1 [19:11:10] very good! [19:11:11] lol! [19:11:39] Hi all, is there a way to only show a piece of the sidebar when you're not logged in ? [19:12:10] klapzin: post your problem, cleaned config, etc in: http://www.mediawiki.org/wiki/Extension_talk:LDAP_Authentication [19:12:28] qsz, I suspect not. [19:13:18] !sidebarex | qsz [19:13:18] qsz: http://www.mediawiki.org/wiki/Extension:SidebarEx [19:13:43] Tim-away, okok [19:14:45] Timlaqua thanks, that's what i'm looking for, [19:14:58] ;-) [19:21:54] !what [19:21:54] Do you understand the words that are coming out of my mouf? [19:22:06] LoL [19:23:13] o_O [19:23:14] !what del [19:23:14] Successfully removed keyword: what [19:25:46] spoil sport. [19:55:43] klapzin: ? [19:55:58] siebrand, hi [19:56:13] klapzin: [20:05] siebrand, i already use this extension [19:56:45] siebrand, do you have any experience with AD and MW [19:56:48] ? [19:56:54] klapzin: none. [19:57:08] hi, ok, i know you can create users, but how can i set up permissions and groups - aka groupA has r/w permissions with 5 users, groupBhas r permissions with 4 users, [19:57:16] can i do thatin mediawiki? [19:57:54] 03(NEW) "no pages or subcategories" not shown if hideroot="on" or depth="1" - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12149 normal; normal; MediaWiki extensions: CategoryTree; (daniel) [19:59:52] !rights | AlphaOmega [19:59:52] AlphaOmega : For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [20:00:37] AlphaOmega: Yes, you can easily do that in great detail. :) [20:04:35] thanks so much Pathoschild [20:04:40] thats exactly what i needed [20:05:02] can i get the version from the web front end ? [20:05:26] "get"? "version"? "web front end"? [20:05:31] i mean [20:05:38] the version of your mediawiki? [20:05:48] !version [20:05:48] To find out the version of your MediaWiki installation, visit the page Special:Version. Should the wiki be broken, but you have access to the program files, find $wgVersion in DefaultSettings.php. [20:05:49] i have media wiki installed, i dont know what version, so i was wondering if i could get the version of the mediawiki FROM the front end [20:06:03] ok ty [20:06:16] so the answer to my question is no? [20:06:27] hm? [20:06:36] the answer is "visit the page Special:Version". [20:06:37] you cannot obtain the version from the web pages? [20:07:08] AlphaOmega not directly from the HTML (I think) [20:07:18] awesome ty [20:07:30] yeah, Duesentrieb said its at Special:Version [20:08:27] oh [20:08:32] you said it and i didnt read it [20:08:37] b/c im a god damn retard [20:08:45] :) [20:09:14] hi all [20:11:24] minute: no it's not unconfigurable just because you don't see a php configuration variable, I said a configuration setting. It's a mediawiki message [20:11:43] minute: It's in the cite.php docs wherever they're hosted now. I'm sure you can find them [20:11:54] (I have no idea where they are now) [20:12:12] If you say so, I'll look around. [20:12:54] there's some mediawiki message where you can set symbols it'll use to enumerate references [20:13:20] They are set to a b c d... by default according to the source code. [20:13:31] I'm sure wikipedia will have the message set though. [20:13:31] Minute.. I've looked over special pages. pretty dense. Any ideas of how might find one example of someone overriding the edit page with a special page? Someone must have done it, no? 
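Back to AlphaOmega's read/write groups question above: per-group rights live in $wgGroupPermissions in LocalSettings.php, and users are added to groups through Special:Userrights. A rough sketch with an invented group name (note that MediaWiki's access restrictions are coarse and not a hard security boundary):

    # everyone may read, but only members of a custom "writers" group may edit
    $wgGroupPermissions['*']['edit']       = false;   # anonymous visitors
    $wgGroupPermissions['user']['edit']    = false;   # ordinary logged-in users
    $wgGroupPermissions['writers']['edit'] = true;    # invented group; populate it via Special:Userrights

Sysops keep their usual rights, and anything finer-grained than this generally needs extensions.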
[20:14:01] wheezer: I doubt there is an example, although you should be able to edit the piece of code that generates the tabs to do so. [20:15:04] you mean to launch the page? And the simply start with the original edit page, as our new page source? [20:15:28] :S [20:15:33] You are losing me... [20:15:43] or I am loosing you, whatever. [20:16:06] What was the point of hacking the tabs? [20:16:38] so that you could direct the edit request to your edit special page instead of the edit action. [20:17:10] 03kim * r27952 10/trunk/extensions/Wikidata/util/ (class.php class_missing.php classes.php): [20:17:10] Classes statistics support by Christophe Millet (kipcool) [20:17:10] (no multi-dc support yet) [20:17:21] right. but that new edit source still has to function . So you are suggesting we simply start with the existing editpage, and hack that, right? [20:17:52] but instead of editpage.php, it's using special_edit.php? [20:19:08] wheezer: May I ask why you want to use Special:Edit instead of editpage? [20:20:58] ami, we have 2 needs. 1) we want to have a customized look and feel to the edit form, with some custom fields from external sources, and 2) we need several different editors for different article "types" (determined by our code) [20:21:06] why not just attach to the edit hook? [20:21:21] well, perhaps we can. Trying to learn where to start [20:21:28] that seems a better approach [20:21:39] 03siebrand * r27953 10/trunk/phase3/languages/ (13 files in 2 dirs): [20:21:39] Localisation updates from Betawiki. [20:21:39] * af, be, crh-cyrl, crh-latn, ka, kaa, kk-kz, kn, nap, nl, qu, stq [20:21:49] does the edit hook allow us to completely rearrange the elements of the form, or replace them with our own widgets? [20:21:49] wheezer: NotACow's would be my suggestion. It's better to extend than to override... [20:22:10] 03kim * r27954 10/trunk/extensions/Wikidata/util/ (class.php class_missing.php classes.php): Classes support, now WITH dc \o/ [20:22:15] wheezer: There are multiple hooks into skin and editpage that will let you get away with just about any rearranging you want [20:22:25] Excellent [20:22:37] the AlternateHook hook in EidtPage.pho seems perfectly suited for this [20:22:41] er, AlternateEdit [20:22:42] can you think of a good tutorial to look at, where the edit page is being altered? [20:23:04] wheezer: None that I know of. [20:23:21] (I am not a great php coder, but am making notes for our main programmer) [20:24:02] mediawiki has a scad of hooks [20:24:02] and thank you. this is helpful. it seemed those hooks could only do so much.. but apparently that was a misimpression [20:24:18] i ripped them out of myrtle because they were complicating my life, i'll hve to put something back into replace them eventually :) [20:24:37] wheezer: There are of course limits to what you can accomplish through hooks, but mw is pretty extensible in this way. [20:24:41] What we really want to make, is a sort of template for the edit page.. so that we easily set up slightly different forms for different article types [20:24:43] At least compared to other webware [20:24:47] what are the consequences of renaming a user (as with the Renameuser extension)? Other than potentially broken signature links, is there any data loss? [20:25:02] jimbojw: no [20:25:18] minute: so it's pretty safe to use then? [20:25:20] jimbojw: It can place some fairly heavy load on your db servers and backup the job queue a bit, bot not generally any lasting consequences [20:25:29] s/bot/but [20:25:30] yep [20:25:43] great! 
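A minimal sketch of the hook route suggested above, using the AlternateEdit hook in EditPage.php that was mentioned (the handler receives the EditPage object; the function name here is invented, and if memory serves returning false suppresses the stock edit form so you can render your own):

    // loaded from LocalSettings.php as part of a small extension
    $wgHooks['AlternateEdit'][] = 'wfMyCustomEditForm';   // invented handler name

    function wfMyCustomEditForm( $editPage ) {
        global $wgOut;
        // decide per article "type" whether to take over; this check is purely illustrative
        if ( $editPage->mTitle->getNamespace() !== NS_MAIN ) {
            return true;   // true = let the normal edit page run
        }
        $wgOut->addHTML( '<form>... custom fields and widgets ...</form>' );
        return false;      // false = skip the built-in edit form
    }

For lighter rearranging, the skin/editpage hooks and CSS mentioned above are usually enough.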
thanks amidaniel, minute [20:25:50] np [20:26:49] is there a syntax highlighting option (internal OR external) for mediawiki? [20:27:15] @search .*(?:geshi).* [20:27:15] Results: [highlight] [20:27:21] !highlight | carambola [20:27:21] carambola : there are several extensions for syntax highlighting, see http://www.mediawiki.org/wiki/Category:Syntax_highlighting - the most popular one is at http://www.mediawiki.org/wiki/Extension:SyntaxHighlight_GeSHi [20:27:44] !geshi alias highlight [20:27:44] Successfully added alias: geshi [20:28:32] carambola: the GeSHi highligher that mwbot just linked to is good - I've used it a lot [20:29:27] Hey all, are they any statistics--or educated guesses-- about how many mediwiki sites are actually up and running? [20:29:40] download counts never tell the story [20:30:30] Not sure there are any particularly accurate estimates out there -- but it's surely a lot [20:31:05] 03(mod) Add third variant to Polish {{PLURAL}} like in Czech - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11796 (10cynic) [20:31:25] wheezer: maybe you can use a google search and find out. [20:31:47] tried [20:31:53] no such stats to be had [20:32:09] wheezer: inurl:/wiki/ [20:32:48] 03(mod) Add third variant to Polish {{PLURAL}} like in Czech - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11796 (10cynic) [20:32:49] how did you try? [20:32:52] there are 233 million references to mediawiki, hoever [20:32:55] lol [20:33:19] 03(FIXED) Add localisation settings for Wikispecies - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12119 +comment (10jeluf) [20:33:24] but lets of wikis use Wiki [20:33:26] no? [20:33:28] lots [20:33:44] likely 80-90% of the wikis out there today are mediawiki, methinks [20:33:50] ah. you want to know if it's mediawiki. [20:34:03] *amidaniel notes that 78.3% of all statistics are made up on the fly [20:34:54] *PunkRock trusts no statistic that he has'nt faked. [20:35:29] i think the downloads were like 85,000 for the latest version. that seemed pretty low [20:35:53] There are multiple, multiple download sources. [20:36:00] i suppose the file can come from many sources.. yes [20:36:01] A good number of the installs are running from svn [20:36:12] And not everyone keeps uptodate [20:36:18] right [20:36:47] if it were a game show question, would you guess.. 100k, 500k? 1m? more? [20:36:53] 03(FIXED) Switch Armenian Wikisource logo - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12113 +comment (10jeluf) [20:37:11] 100-500k [20:37:43] that feels right. [20:37:52] for no particular reason [20:38:55] 03siebrand * r27955 10/trunk/extensions/ (16 files in 16 dirs): [20:38:55] Localisation updates from Betawiki. [20:38:55] * Fixes and additions to 16 extensions for af, io, nl, pt [20:39:07] Besides this chat, where is the action, when it comes to the MW community? The mailing list/nabble? [20:39:13] I was at mwusers... [20:39:19] but it didn't seem to busy [20:39:57] where is brion ? [20:40:13] Either in cali or fl :) [20:40:15] away [20:40:40] amidaniel: sorry... that kind of syntax highlighting is not what i meant [20:40:52] i mean wiki-syntax highlighting [20:40:57] life is sad without him [20:42:01] Brion is the founding author of mediawiki? That brion? [20:42:44] (the braces get pretty unruly in templates :/ [20:42:55] is he ? [20:43:12] wheezer: Believe he's a cofounder .. one of the oldest contributors in any case, so yes. [20:43:20] *amidaniel will bbiab [20:44:18] wheezer: He basically became the lead developer after the author made it. 
For some reason the guy who wrote it decided he didn't want to maintain it. [20:44:27] ah [20:44:33] which name was that? [20:44:34] im going to potentially ask a really stupid question but humor me, does mediawiki support binary file uploads? like so i can upload any arbitrary file to mediawiki? [20:44:41] wheezer: Magnus Manske [20:44:46] ty min [20:44:47] i notice its image upload [20:44:50] AlphaOmega: If correctly configured. [20:44:52] but i want toupload pdfs and such [20:45:08] so brion was the one that refactored it a few years back? [20:45:15] or started to? [20:45:17] When was it ever refactored? [20:45:32] I thought I read it had been, when it was made far more scalable, etc [20:45:37] 03(mod) Namespace "Comentarios" for es.wikinews - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12051 +comment (10jeluf) [20:45:38] minute: i try to upload and it says .pdf is not a valid image [20:45:47] and i turned off image handling (i thought) [20:45:48] AlphaOmega: "if configured correctly" [20:45:58] There are lots of settings you must change. [20:46:38] is there any way to add wiki-syntax highlighting to the edit box? [20:47:39] if someone wants WYSIWYG, you'd suggest TinyMCE or FCKeditor... I don't want WYSIWYG, just syntax highlighting [20:47:47] wikied [20:48:22] http://en.wikipedia.org/wiki/Wikipedia:Wikied [20:48:40] sadly, it doesn't update the highlight whyile you type [20:48:48] hahaha [20:48:50] how lame [20:48:51] i'd consider it a "nice try" [20:49:00] well, it hogs enough cpu as it is :) [20:49:02] Will wiki ed support IE soon? [20:49:16] no idea. will IE support standards soon? [20:49:53] Not my job to know. but since it's still the browser most of the planet is using. I kinda have to ask [20:49:54] 03(FIXED) please create appendix ns on oc.wiktionary - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12066 +comment (10jeluf) [20:50:01] sadly [20:50:09] we're "supposed to" use IE [20:50:41] and actually had firefox.exe blocked... of course, renaming it to firefox2.exe fixed that :) [20:50:53] not anymore, though... IT seems to have come to their senses [20:53:32] Netcraft still says that MSFT has about 56% of the browser market. That's still better than half your user base [20:53:44] I wish it would hurry up and die [20:53:48] but until it does.. well [20:53:56] 03(FIXED) Create two new namespaces for the Arabic Wikinews - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12068 +comment (10jeluf) [20:54:12] At least it's not still 85% [20:54:30] Hey, I got a question, I had a problem where my mediawiki was always asking for you to enable cookies, even if they were already enabeled, well i found out its because im running the wiki on an intranet address, so is there any work around to get the cookies to be sent differently? [20:54:40] 03(mod) Kazakh message updates - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12145 +comment (10alefzet) [20:55:33] whoa. wiked is pretty nice [20:56:27] thanks, minute [20:56:57] So anyone. wikied looks like a nice extension. Is it planning on IE support, at least? [20:58:24] btw, despite not doing "constant updates" to the highlighting, you can just disable and re-enable highlighting to re-parse [20:58:39] no need to preview to refresh the highlighting [20:58:52] Jboucher: zones indeed. ;-) [20:59:22] well i can't make any changes to the browsers [20:59:32] why? 
[20:59:41] because they are all controlled by the Active Domain Controller and locked down [20:59:45] 03(mod) List of rights that can be granted/revoked broken in Special: Userrights - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11505 (10Prodego) [20:59:50] it's called "group policy" [20:59:57] and even if i unlock it i would need to goto 1,800 computes to change the zone [21:00:01] nope [21:00:06] yeah. well they company doesn't want to do that [21:00:10] you can deploy the IE settings/zones w/ GP [21:00:21] if you use AD, GP is pretty much implied [21:00:34] yup, but we can't make any changes to the Domain [21:00:39] unfortunally they won't let us [21:00:45] Is Wikied open source? [21:00:52] WikEd,. rather? [21:00:54] so i need a way to do it from the server [21:01:10] so have the server "Trick" IE or something [21:01:48] Jboucher: do you access it via FQDN? or just the NETBIOS name? [21:01:56] FQDN [21:02:04] the wiki is running on a Linux server [21:02:16] with a fully qualified domain name on the DNS Server we have [21:02:17] linux does netbios just fine. ;-) [21:02:38] hmm.... via IP does the same thing? [21:02:43] yup it does, but our company has an office in India, and some all over canada [21:02:49] which needs to access the server by the ip address [21:02:56] Netbios doesn't go through the gateway :) [21:03:04] no.. Via IP works just fine [21:03:11] but we can't allow pepole to use the IP [21:03:20] we have no choice but to use the domain name we were given [21:03:27] IS doesn't want it done ANY OTHER WAY :( which sucks [21:03:45] ummm, give it a new crazy DN? [21:03:48] *CN [21:04:06] the address is wiki_ts.cs.primus and it needs to stay that way [21:04:08] can't change [21:04:15] and we can't change the DNS Entries either [21:04:25] well, IE knows darn well that's on your intranet. [21:04:29] Jboucher: your network is broken. _ is forbidden in DNS names [21:04:39] (for A records, anyway) [21:04:40] Hi! I would like to know if there is a quick way to get the number of links on a Whatlinkshere special page [21:04:46] haha.. ive used _ in domain names for years [21:04:47] like http://nl.wikipedia.org/w/index.php?title=Speciaal:VerwijzingenNaarHier/Sjabloon:References&limit=4500&from=0 [21:04:49] wiki_ts is the node part [21:04:49] and hever had problems [21:04:51] it's fine [21:04:56] it's not fine [21:05:00] sure it is [21:05:02] it might happen to work, but it's wrong [21:05:05] bah [21:05:12] *TimLaqua wanders off to MSDN [21:05:28] well the computer name is Wiki_TS and when i try to access it useing the _ the cookies work fine [21:05:46] but the computer name isn't visible in our other offices because of teh gateways [21:05:56] (example places it doesn't work: the Squid proxy will reject all URLs containing _ by default...) [21:06:23] Jboucher: ok... so what CAN you chage? [21:06:25] change? [21:06:34] just the settings on the serve [21:06:34] r [21:06:38] i have FULL access to the server [21:06:43] hmm... [21:06:47] i was givein a domain name, and an ip address which i cant change [21:06:58] i know.. its a tough one.. I work in IS so i have a good understanding on how networks work [21:07:07] but im limited what they will let me do for this project [21:07:11] re [21:07:20] thats why i have narrowed it down on haveing to change something on the server or the wiki code to get it to work [21:07:58] Jboucher: I don't get it - intranet settings should always be more lax than internet zone settings [21:08:19] Jboucher: http or https? 
[21:08:23] http [21:10:33] you think if i do it https it would work? [21:10:37] nope [21:10:45] ok, figured it wouldent [21:10:49] thats why i havent tried [21:11:06] but we usually don't send passwords over the intranet in plaintext. ;-) [21:11:22] haha.. yah [21:11:44] what version of IE is your company on? [21:11:55] 6 [21:13:45] HA [21:14:06] flyingparchment ftw. [21:14:09] The cause is the presence of an underscore (_) in the host name of the url. In my case, the particular SharePoint application... [21:14:21] w/ regard to cookie explosions on SP apps [21:14:22] help, where the create account link? [21:14:34] i disabled create account, but admin still has rights, i cant find the link to create new accounts [21:14:55] ie6 is known to have flakey cookie handling when underscores exist in the fqdn. [21:15:08] ok, Thanx TimLaqua Ill try to get IS to remove the _ and see what happens [21:15:42] Historically, the underscore character has been allowable for NetBIOS names but not allowable as host names according to RFC 1132 [21:15:44] ;-) [21:15:56] but... I like underscores.... [21:16:20] haha.. yeah well i have never had a problem with underscores [21:16:22] Where do I change the "$wgLogo" What File is it in? [21:16:29] now i just have to try and get the IS department to remove it :) [21:16:30] hahahaha [21:16:43] !logo | xCuber125 [21:16:43] xCuber125: The logo that appears in the top left of each page is determined by the $wgLogo configuration setting in the LocalSettings.php file. To change this you simply need to change the value of $wgLogo to point to the URL of your own logo image. See for more information. [21:17:12] I didn't find it in the LocalSettings.php [21:17:17] add it then [21:19:54] just like this: "$wgLogo = ""? [21:20:24] i got it, thanks [21:21:08] hey [21:21:15] I need a little help here. [21:21:22] !ask [21:21:22] Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [21:21:48] TimLaque: I really appreacate your help... I think this will solve the Issue... [21:22:03] Jboucher: np, lemme know how it goes. [21:22:19] evil zones. [21:23:24] I'm trying to install media wiki on XAMPP. I just got to the part where it said "Database config". I went down to the part where it said to specifiy a DB username and DB password. I made up both of them, but it keeps saying to "Check username and password". [21:24:11] What should I put in there? [21:26:30] 03thomasv * r27956 10/trunk/extensions/ProofreadPage/proofread.js: adding empty lines to prevent paragraphs from breaking [21:28:43] Hey minute.. you here? [21:29:28] this *should* resolve to {{{var 2}}} right?... [21:29:30] I looked at wikihow's editor. And yes, I would like to add custom buttons to replace the defaults, as they did. Was that done with hooks? AlternateEdit? Or just a hack of editpage? [21:29:49] http://www.wikihow.com/index.php?title=Testpage&action=edit [21:29:52] {{#vardefine: i|2}}{{{var {{#var: i}}}}} [21:30:04] (Or anyone else who feels like answering this :) [21:30:50] I'm trying to run maintenance scripts. I am in the correct directory, and when I type for example "php language/transstat.php", I get an error message saying "C:\wamp\php\php.exe is no valid Win32-application" What now? 
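On the $wgLogo exchange a little further up: the line to add to LocalSettings.php is just the URL of the image (the path below is made up; roughly 135x135 pixels fits the MonoBook logo box):

    $wgLogo = "/images/my-logo.png";   # URL to your own logo image

If the variable isn't in LocalSettings.php yet, adding that line is enough; the default only lives in DefaultSettings.php, which shouldn't be edited directly.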
[21:31:01] 03(mod) Namespace "Comentarios" for es.wikinews - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12051 (10julian_pastine) [21:34:04] Anyone? [21:36:20] okie dokie! i have mediawiki 1.8.5 and i just downloaded 1.11, is it easy to upgrade? and where is the docs i should read about upgrading from 1.8.X to 1.11 ? [21:36:20] 03(NEW) Parser:: insertStripItem not generating unique markers per call - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12150 minor; normal; MediaWiki: Page rendering; (Jared.Williams1) [21:38:07] !upgrade [21:38:07] http://www.mediawiki.org/wiki/Manual:Upgrading [21:38:23] AlphaOmega: for you --- ^ [21:41:17] hi all [21:41:31] ty MZMcBride [21:41:35] im looking at a couble of extensions ive written [21:41:50] one seens to just use new SpecialPage("blah"); [21:42:03] the other creates a new class that extands SpecialPage [21:42:10] On Wikipedia, theres the Table of Contents on every page, How Do I get that on my Wiki? [21:42:25] the only page i can find semes to recomend the later, but waht exactly is the first one doing? [21:42:25] !table of contents [21:42:25] You don't have permission to do that. [21:42:56] On Wikipedia, theres the Table of Contents on every page, How Do I get that on my Wiki? [21:43:07] xCuber125: it's a standard feature [21:43:19] xCuber125: its a user option iirc [21:43:21] ok, how do i get it to show up? [21:43:25] xCuber125 either have more than three sections on the page [21:43:27] or just put __TOC__ at the top of the page [21:43:30] Or enter __TOC__ somewhere on the page [21:43:32] basically have three or more sections :) [21:43:35] xCuber125, if you have more than 3 sections, it shows up, or you can add the __TOC__ (or something like that tag) [21:43:37] or yeah, what those peeps said [21:43:39] ok [21:43:44] i'll try, thanks :) [21:44:44] anyone knwo about my extension question? [21:45:15] Dee subclassing SpecialPage is generally better [21:45:46] Dee: the first one is pointless. if php would know about purely abstract classes, it would be an error. [21:46:03] the first is a LOT simpler code wise [21:46:07] On the Bottom of the search page, there's that area called "Search in NAmespaces" how do i get rid of that? [21:46:15] just wondering what it is actually doing [21:46:20] RoanKattouw: instantiating a plain SpecialPage is nonsense, no? it creates a special page that does nothing? [21:46:30] Dee: creating a special page that does nothing [21:46:33] Yep [21:46:51] seemes to include_once("SpecialBlah.php") and calls fdSpecialBlah() [21:46:53] wf even [21:46:53] I was wondering if anyone here knew what function populates the externallinks table [21:47:03] Duesentrieb [21:47:09] On the Bottom of the search page, there's that area called "Search in NAmespaces" how do i get rid of that? [21:47:09] Duesentrieb: the point it that it IS doing something [21:47:28] that creates a special page fromt he value I gave it [21:47:40] On the Bottom of the search page, there's that area called "Search in NAmespaces" how do i get rid of that? [21:47:57] Dee: then your code does more than just creating that object. if you go about calling functions and stuff, that looks like an "old style" special page imple,entation, which is totally confusing. 
don't use it :) [21:47:58] *Dee checks the source [21:48:10] xCuber125: don't repeat your question, you'll only piss us off [21:48:11] it may be an old style [21:48:17] much easier and simpler though :) [21:48:20] xCuber125: if i knew the answer, i would have told you [21:48:27] It does create a special page, but how do you get it to do anything? [21:48:29] sorry... [21:48:46] You need to reimplement SpecialPage::execute() in a subclass, how else would you do it? [21:48:52] Using the wfSpecialWhatever() thing? [21:49:02] that's what it appears to be doing [21:49:30] ahhh [21:49:31] The parent class has an execute() method * which distributes the call to the historical global functions. [21:49:37] Right [21:49:48] The preferred way is to subclass and reimplement [21:49:53] yeah [21:50:34] xCuber125: recommended reading: http://workaround.org/moin/GettingHelpOnIrc [21:51:15] fuck that [21:51:30] uh [21:51:44] That attitude is generally frowned upon. [21:51:48] We are simply trying to help you. [21:51:50] generally [21:52:03] TimLaqua_away: ;) [21:52:18] wait [21:52:20] :S [21:53:46] Is there some standard set of help pages I can load into my site, or is there no choice but to make each one, and paste from mediawiki? [21:53:57] ok, using the new style, do I HAVE to use given file names or can they all go in a single file? [21:54:12] Dee you can use as many files as you want [21:54:21] Be sure to add the right stuff to $wgAutoloadClasses [21:54:43] RewriteEngine on [21:54:43] RewriteCond %{REQUEST_FILENAME} !-f [21:54:43] RewriteCond %{REQUEST_FILENAME} !-d [21:54:43] RewriteRule ^wiki/(.+)$ /wiki/index.php?title=$1 [L,QSA] [21:54:43] RewriteRule ^wiki$ /wiki/index.php [21:54:48] oops, crap, sorry! [21:55:03] wheezer, there isn't a standard set, but if you want to use pre-existing ones from another wiki, just export them there, then import them to your wiki, it's a lot easier than copy/pasting [21:55:03] ok, thanks RoanKattouw [21:55:27] On the Bottom of the search page, there's that area called "Search in NAmespaces" how do i get rid of that? [21:55:50] this is new to me. You export someone else's mediawiki? [21:55:50] by editing SpecialSearch.php? [21:55:59] !export | wheezer [21:55:59] wheezer: To export pages from a wiki, navigate to Special:Export on the wiki, type in the names of the pages to export, and hit "export". See for an example of this form. See also: !import [21:56:00] xCuber125: seriously dude. if someone knew, they'd tell you [21:56:14] ah [21:56:15] !import | wheezer [21:56:15] wheezer: To import pages into your wiki, export them using Special:Export (see !export). Then, navigate to Special:Import on the target wiki and select the dump file. You can also import pages directly from another wiki (see !importsources). For mass imports, use the dumpBackup.php and importDump.php scripts (or mwdumper). NOTE: when using content from another wiki, follow the LICENSE TERMS, especially, attribute source and authors! [21:56:22] k, bye [21:56:36] 03(mod) Parser::insertStripItem not generating unique markers per call - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12150 +comment (10roan.kattouw) [21:56:57] Now, do you do this one page at a time? [21:57:04] TimLaqua_away: What did you mean by "generally"? [21:57:13] wheezer Special:Export/Main_Page [21:57:21] if you go to Special:Export, you can type out an entire list of pages to export all at once [21:57:30] great!!
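Stepping back to Dee's special-page question above: a bare-bones sketch of the subclass style that was recommended (class, file, and page names are invented; the exact constructor idiom varies between MediaWiki versions, so treat this as a shape rather than copy-paste code):

    # in LocalSettings.php
    $wgAutoloadClasses['SpecialMyReport'] = "$IP/extensions/MyReport/SpecialMyReport.php";
    $wgSpecialPages['MyReport'] = 'SpecialMyReport';

    # in extensions/MyReport/SpecialMyReport.php
    class SpecialMyReport extends SpecialPage {
        function __construct() {
            parent::__construct( 'MyReport' );   # very old releases use a method named after the class instead
        }
        function execute( $par ) {
            global $wgOut;
            $wgOut->setPageTitle( 'My report' );
            $wgOut->addWikiText( 'Hello from [[Special:MyReport]].' );
        }
    }

The old wfSpecialWhatever() global-function style still works through SpecialPage::execute(), as explained above, but subclassing keeps everything in one place.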
[21:57:32] and then just import them all at once too :) [21:57:57] Don't make your dumps too huge, though: there is a maximum execution time [21:58:01] minute: what did you mean by 'generally?' [21:58:01] And which system would you suggest i take my first helps from? wikiMedia.org? [21:58:03] ;-) [21:58:06] So 50MB dumps aren't gonna be imported in full [21:58:11] wheezer: www.mediawiki.org [21:58:13] TimLaqua: *shrug* [21:58:19] thank you [21:58:31] RoanKattouw: if it's mediawiki.org, all of the help pages are PD, so you don't need to export entire histories [21:58:50] but wouldn't it make sense to have a standard set of 'starter helps'? [21:58:58] yes, generally :) [21:59:04] wheezer there's a lot of help @ mediawiki.rog [21:59:05] *org [21:59:18] Docs for the code are at http://svn.wikimedia.org/doc/ [21:59:22] @search doc [21:59:22] Results: [rewriteproblem, webrequest] [21:59:23] yes, but i meant, in my virgin install [21:59:30] !rewriteproblem [21:59:30] a) do not put the files into the document root b) do not map the pages into the document root c) use different pathes for real files and virtual pages d) do not set a RewriteBase e) but all rules into the .htacces in the document root. [21:59:31] ty again [21:59:38] !webrequest [21:59:38] http://svn.wikimedia.org/doc/classWebRequest.html [22:00:24] !documentation is For documentation of all classes and files in MediaWiki, see http://svn.wikimedia.org/doc/ [22:00:24] Successfully added keyword: documentation [22:00:31] !doc alias !documentation [22:00:31] com.amidaniel.mwbot.MWBotException: Unable to resolve keyword: !documentation [22:00:40] !doc alias documentation [22:00:40] Successfully added alias: doc [22:00:47] !docs alias documentation [22:00:47] Successfully added alias: docs [22:01:48] This room rocks :) [22:02:06] Good community [22:08:00] yup [22:10:50] wheezer: no doubt [22:11:24] Is there any way to find the length of the content of an encoded marker from within a parser function? strlen( $parser->recursiveTagParse ) doesn't work. [22:15:05] A lot of open source communities has IRCs.. but you don't get a lot of help in them [22:15:09] have [22:17:54] !doc [22:17:54] For documentation of all classes and files in MediaWiki, see http://svn.wikimedia.org/doc/ [22:18:01] !brion [22:18:20] :-( [22:18:39] I have a question, which I think is related to the API, but I'm not sure... [22:19:24] I noticed that on some wikis you get suggestions in the Search box when you start entering a search term. How can I impliment that on my own wiki project? [22:20:34] what version are you running? [22:20:48] 03(mod) BiDi problem at History/Diffs - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12079 (10gangleri) [22:20:52] 03(mod) Tracking bug (tracking) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=2007 (10gangleri) [22:20:55] 03(mod) RTL/bidirectional issues (tracking) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=745 (10gangleri) [22:20:57] 1.10.1 [22:23:09] Farrendel: set $wgAjaxSearch = true; in your localsettings.php file [22:23:53] (and set $wgUseAjax = true; as well, if it hasn't been set to true yet) [22:24:00] That sets the search to a box that extends across the whole page, though... What I want is something similar to what (was?) imlipmented on Wiki wikis, where the suggestions appear inline with the search box. [22:24:22] How do I get the different levels in the Table of Contents? [22:24:40] the
is the first level but what about the second and third and etc. [22:24:42] I *think* it has something to do with the OpenSearch portion of the API, but I have no idea how to implement that. [22:25:49] that's weird, the description said "enable auto suggestion for the search bar" [22:26:15] xCuber125: could you clarify a bit more? [22:26:17] xCuber125: == Heading 2 == [22:26:25] xCuber125: === Heading 3 === [22:26:35] etc. until 6, iirc [22:26:37] ok, thanks [22:27:05] xCuber125: http://meta.wikimedia.org/wiki/Help:Editing [22:27:11] ok, thanks [22:29:29] 03(mod) Add third variant to Polish {{PLURAL}} like in Czech - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11796 +comment (10cynic) [22:30:35] http://www.mediawiki.org/w/index.php?title=API%3AOther_services Seems to suggest that the function is part of the API. However, I've set $wgEnableAPI = true; but I'm not sure what else I need to do to enable it... [22:30:55] 03(mod) Add third variant to Polish {{PLURAL}} like in Czech - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11796 +comment (10cynic) [22:37:30] 03(mod) Namespaces localization in Ido Wiktionary - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12109 (10malafaya) [22:53:01] is it possible to put text in a table cell on the top + left of the cell? [22:53:34] valign="top" halign="left" ? [22:54:12] nice [22:54:25] I didn't see that in the user docs :) [22:54:31] thank you [22:55:12] bah, how 1990s [22:55:17] use proper css :) [22:55:39] gongoputch: tables and table cells support (most) html attributes. [22:56:18] hmmm css .... [22:56:27] 03(NEW) BiDi error (interfearing of field content) at « special:Renameuser » - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12151 normal; normal; MediaWiki: Special pages; (gangleri) [22:56:29] 03(mod) RTL/bidirectional issues (tracking) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=745 (10gangleri) [23:04:35] 03(NEW) possible bug - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12152 normal; normal; MediaWiki: Page rendering; (gangleri) [23:06:03] *amidaniel wonders why one would title a bug report "possible bug" [23:11:49] Hi amidaniel ! [23:12:16] Hello person who named a bug report "possible bug" :D [23:12:52] Hi, I installed MediaWiki on a free hosting server, but I have to add an advertisement to every page. The code is here, http://www.hostuju.cz/info_reklama. Is it possible? [23:14:33] Sevela_p: yes, edit skins/MonoBook.php and put that code uh... before the tag i guess. [23:15:05] Sevela_p: that code looks odd though - it's a script block that does nothing but injecting another script block. just putting that in directly would be much simpler. [23:15:07] Sevela_p: Or you can add the script to your MediaWiki:Common.css [23:15:23] either that's some very odd hack for something, or they have been smoking bad stuff [23:15:55] amidaniel: yes, true, but then he'd not have that exact code in his page. which they might check for using a spider. [23:15:55] *amidaniel assumes the latter [23:15:56] Thanks a lot, I'll try it. [23:16:04] Sevela_p: what are you up to? [23:16:47] I'm trying to get MediaWiki running [23:17:20] fingers crossed ;-) [23:17:48] Thanks, I was just asking out of curiosity [23:18:37] can somebody confirm, pls? - currently there's no way to search for an exact word/phrase/regexp? [23:22:37] Danny_B: Search using .. ? [23:22:52] on current wmf projects [23:22:58] so the current settings [23:23:17] You mean with Special:Search ?
[23:23:20] Danny_B: exact word should work [23:23:24] not sure about the others [23:23:36] Danny_B, cannot search for regexp [23:24:08] Ah, I get what you were asking now :) .. sorry [23:24:09] tried foo and "foo" got barfoobaz words in result too :-/ [23:24:38] Then try " foo " [23:25:07] also tried foo bar and "foo bar" got foobaz barqux in results as well [23:25:20] Danny_B: can you give url? [23:25:24] To copy images from one mediawiki instance to another, can I just copy the files under /images? [23:26:08] Danny_B: can't confirm. substrings are not found. [23:26:20] i'll try amidaniel's suggestion, sec [23:26:21] Danny_B: exact wor search seems to work as expected [23:26:32] Danny_B: don't look at highlighted text, its nonsense [23:26:53] ugh, so let's summarize: [23:26:54] Danny_B: exact phrase, not quite. it seems to do an "and" search, with adjusted score for occurrances in the right order [23:26:57] 03(mod) Support global (crosswiki) blocking - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=8707 (10meno25wiki) [23:27:05] Yeah, the highlightings generated by simply loading up the page text and doing a preg_match on the term you entered [23:27:17] It has nothing to do with the indexing or the acutal search being performed [23:27:41] if i type "foo" i should get only and only pages containing at least one foo itself but not pages where only foobar is? [23:27:58] 14(INVALID) Rendering of tags - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12152 summary; +comment (10cannon.danielc) [23:27:58] Danny_B: yes. and as far as i see, this is so. [23:28:18] and phrase? [23:28:26] Duesentrieb: umm, no.. phrases are parsed as phrases, not as AND queries [23:28:57] rainman-sr: really? didn't look like it. but i might have been fooled by the highliting stuff. the actual occurrance might have been further down in the text [23:29:12] Wow, wikibugs is lagged [23:29:12] Danny_B: try searching for "copyri" - no hits. [23:29:28] Can anyone teach me how to move images from one instance of mediawiki to another? [23:29:34] amidaniel: wikibugs reads [23:29:37] *mails* [23:29:39] Duesentrieb: yeah... highlighted text is higly misleading [23:29:45] Duesentrieb: hehe [23:29:47] Duesentrieb: searched for "jak" http://cs.wikibooks.org/w/index.php?title=Special:Search&search=%22jak%22&fulltext=Hled%C3%A1n%C3%AD&offset=160&limit=20 see the last result - the page doesn't have word jak in it [23:30:20] which page? [23:30:21] so as i said getting unwanted results [23:30:25] Praktická elektronika does [23:30:26] "označován jako atime - access time" [23:30:32] the last - http://cs.wikibooks.org/wiki/Linux:Adres%C3%A1%C5%99ov%C3%A1_struktura [23:30:38] Did you do an exact match or wordword search? [23:30:47] eh? [23:31:02] Danny_B: You searched for "jak" not jak [23:31:02] typed "jak" (incl. quot marks) and hit search [23:31:14] With the quotes, it'll match that, without it won't [23:31:50] amidaniel: sorry, you're not right - http://cs.wikibooks.org/w/index.php?title=Special:Search&search=jak&fulltext=Hled%C3%A1n%C3%AD&ns0=1&offset=160&limit=20 last item [23:31:54] Danny_B: hm... might be a problem with non-ascii chars. if they are not read as "letters": it page contains "nějakého" [23:32:08] Danny_B: right, we are using a stemmer for cs [23:32:18] also Jakýkoliv [23:32:23] i would guess it stemms jako (strong?) as jak [23:32:28] ah :) [23:32:34] that expains it more nicely [23:32:41] so... 
it's not a bug, it's a feature :) [23:32:49] 03(mod) Change site name for Ido Wiktionary to 'Wikivortaro' - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12109 summary (10pathoschild) [23:32:50] hell [23:32:52] how would one effect the background outside the content area? [23:33:03] jak_na_Unix.html [23:33:06] so no way to search for exact word/phrase [23:33:07] may be matching it [23:33:15] gongoputch: css. get firebug or something to find out what class to use [23:33:23] firebug [23:33:35] ok [23:33:38] gongoputch: firebug is great for messing with css in firefox, yes [23:33:42] where is the css kept? [23:33:47] Danny_B: not it the exact word is a stem root [23:33:58] To move images from one wiki to another, can I just move the files under /images? (it doesn't seem to work) [23:34:08] firebug is a gun on fly, go web developer extension [23:34:16] gongoputch: in several files - you can edit it online. go to MediaWiki:common.css (for content stuff) or MediaWiki:monobook.css (for skin stuff) [23:34:21] Adept: No. You need to run importImages or whatever it's called [23:34:25] Adept: Look in /maintenance [23:34:29] ok, thanks [23:34:32] great, thanks! [23:34:40] Danny_B: that does not help you if you want to find out which css class to use to influence some given area of the page. [23:34:55] it does [23:35:01] ctrl-shift-y [23:35:06] and mark the area [23:35:11] ah, that js overlay thingy? [23:35:24] *amidaniel notes that the old-school way of figuring that out was to hit "view source" .. it does work :) [23:35:46] amidaniel: so do mostly i ;-) [23:36:07] i suppose tag is enough to be modified in this case [23:36:14] Danny_B: that doesn't do inherited styles :) [23:36:29] Danny_B: actually, the last result does contain jak [23:36:38] jak_na_Unix.html [23:36:40] rainman-sr: which? [23:36:41] amidaniel: you have to follow the nesting of the tags closely... [23:36:44] _ is like a wordbreak [23:36:58] Duesentrieb: If firebug can do it, so can I [23:37:05] *amidaniel is a messymarkuplanguage-reading guru [23:37:14] amidaniel: sure - it just takes time :) [23:37:18] Danny_B, the last result from the query, Linux:Adresářová struktura [23:37:21] 14(INVALID) Do not strip name attribute before output - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11932 +comment (10pathoschild) [23:37:24] how do I go to MediaWiki:common.css ? [23:37:29] rainman-sr: i see now [23:37:33] is it a 'special page'? [23:37:38] gongoputch: if you're not admin, no way [23:37:39] gongoputch: Type it in the search box and hit "Go" [23:37:42] amidaniel: we all are messy markup gurus. otherwise we would not be able to understand each other [23:37:44] ok [23:37:52] gongoputch: no, it's in the MediaWiki namespace... but you have to be an admin to edit it [23:37:54] Duesentrieb: Hehehe :) [23:37:57] Danny_B: and I checked we don't actually use stemming for cs.. so you can search for exact words [23:38:14] ok, that did it, thanks again [23:38:50] well, the hilight's gonna kill me [23:39:16] it hilites totally different things than i need to have hilited [23:39:25] Danny_B: yeah, i know... hopefully we'll have a better system in couple of moths [23:39:29] *months [23:39:35] i'm getting an "expression error" on a Template page, when using parser function #ifexpr [23:39:44] but i think ti's fine, when transcluded [23:39:50] there [23:39:54] check eg. 
http://cs.wikibooks.org/wiki/Speci%C3%A1ln%C3%AD:Search?search=v%C3%ADce+ne%C5%BE&fulltext=Hledat [23:40:02] 's no error when it's transcluded [23:40:05] Danny_B: .searchmach { font-color: black; } :) [23:40:27] Danny_B: bit it's a fundamental problem: the search engine tries to be "smart" and "soft", while you want a hard "grep" like match, right [23:40:31] am i supposed to add something so it's not confusing? [23:40:35] amidaniel: :-P in some 20 % it's helpful though ;-) [23:40:37] when viewing the temp[late page directly? [23:40:38] ERm ... .searchmatch { color: black; font-weight: normal; } should do it :) [23:41:08] Duesentrieb: yep, we should have hard match (grep) as an option [23:41:13] no... because the excerpt is chosen on the wrong basis too [23:41:35] Danny_B: Well, a regex search would be awesome .. unfortunately it would take forever and murder our poor db servers :) [23:41:39] Danny_B: that would mean a LOT more indexing... [23:41:52] regex is out of the question [23:42:00] phrase matching is pretty bad already [23:42:03] no not regexp [23:42:12] having exact words in addition to "soft" words/stems might be doable [23:42:20] i mean no wildcards and stuff like that [23:42:37] otherwise there's now way how to correct common mistakes [23:43:47] Danny_B: download dump -> use grep [23:43:53] i wish there were searchable texts on toolserver :-/ [23:43:56] rainman-sr: :D [23:44:06] Danny_B: Don't we all :( [23:46:53] rainman-sr: well, if and only if the dumps were actual... [23:54:48] hi all again [23:55:11] is there any concept of parameters in the MW message strings? [23:55:24] ya [23:55:31] That's what all those $1 $2 dealios are [23:56:03] hmmm? [23:56:10] *Dee looks deeper [23:56:46] Dee: only used by some/few strings. indeed, look for $1 on Special:Allmessages [23:56:55] *Duesentrieb hates numbered parameters [23:57:37] it looks like they are to be replaced manually by my script? [23:57:52] it doesn;t say otherwise on: [23:57:52] http://meta.wikimedia.org/wiki/Help:System_message [23:58:25] How do I upload images to a wiki on my site? [23:58:36] unless im missing something [23:58:36] Dee: look at the various wfMsgXXX functions. some (all?) take parameters [23:58:48] !uploads | xCuber125 [23:58:48] xCuber125: File uploads are an often-used feature of MediaWiki, but are disabled by default in all current release versions. To enable them, first make the upload directory (default images) writable by PHP, then set $wgEnableUploads to true in LocalSettings.php (i.e. "$wgEnableUploads = true;"). See for more info.
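Pulling together the !uploads answer just above and AlphaOmega's earlier PDF question: enabling uploads and whitelisting extra file types are two separate LocalSettings.php settings. A short sketch (the extension list is only an example):

    $wgEnableUploads = true;          # turn the upload feature on
    $wgFileExtensions[] = 'pdf';      # allow PDFs in addition to the default image types
    # or replace the whole whitelist in one go:
    # $wgFileExtensions = array( 'png', 'gif', 'jpg', 'jpeg', 'pdf' );

The images/ directory still has to be writable by the web server, as the bot message says, and MediaWiki's MIME checks can reject some file types even when the extension is whitelisted.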