[00:12:10] *fajro is back. [00:18:35] Hey everybody...I have a MediaWiki installed, what's the path where I would find the sandbox? I need to do some training and didn't want people to goof up content that I need...thanks in advance for your help [00:19:17] RickZilla: the sandbox is any page you choose to make a scratch area [00:19:22] what's a sample URL from your site? [00:19:32] Hello, folks/ [00:20:07] http://www.sample.net/wiki/index.php?title=Main_Page [00:20:40] http://www.sample.net/wiki/index.php?title= [00:20:45] http://www.sample.net/wiki/index.php?title=Sandbox [00:20:49] so that could be the sandbox [00:21:05] that makes sense....I was looking for something more difficult :-) [00:21:08] Thanks for the help [00:21:26] don't worry, you'll find some other things in mediawiki to be difficult ;-) [00:21:34] I'm using the CategoryTree extension and I was wondering how to make a page list categories like the Category box on this page (below the green info box): http://www.mediawiki.org/wiki/Extension:CategoryTree [00:22:18] I want to make a list of categories and have the treestructure be available too, basically [00:24:43] dammit, I'm being a moron again [00:25:15] Why does it not make sense in the late evenings and perfect sense the next day? [sighs and blames tired eyes] [00:25:24] How do I "lock" a wiki? So no editing may occur? [00:25:35] by which users? [00:25:38] All users. [00:25:44] even sysops? [00:25:46] svip: set $wgReadOnly to some string [00:25:54] which becomes the message given when you try to edit ... [00:26:23] !userrights |svip [00:26:23] --mwbot-- For information on customizing user access, see < http://www.mediawiki.org/wiki/Help:User_rights >. For common examples of restricting access using both rights and extensions, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >.
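Pulling together the $wgReadOnly advice just given, as a one-line LocalSettings.php sketch (the message text here is made up; any non-empty string works and is shown to would-be editors, sysops included):

```php
// In LocalSettings.php, after the default settings.
// Any non-empty string puts the whole wiki into read-only mode;
// the string itself is shown to anyone who tries to edit.
$wgReadOnly = 'This wiki is temporarily locked for maintenance.';
```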
[00:30:03] 03aaron * r31768 10/trunk/extensions/FlaggedRevs/ (FlaggedRevs.php FlaggedRevsPage.i18n.php flaggedrevs.css): [00:30:03] * Shorten messages [00:30:03] * Some CSS tweaks [00:30:03] * UTF8 [00:31:39] if I upload a file with the same version as an existing file, the old version is preserved in a history, correct? [00:32:01] grr [00:32:38] saying "I bet this channel exists and can answer my question" ftl? [00:33:39] madewokherd: i think the image gets overwritten, but don't quote me [00:33:42] just try it and see what happens [00:33:54] madewokherd: yes [00:34:46] normally I'd expect that, but the warning makes it sound like it might not [00:35:50] ah right - looks like old versions of uploaded files are saved in an archive directory [00:40:55] Reeeaaalllly dumb question here....how do I break a line, it's with \\ correct? [00:41:15] RickZilla, no.
. [00:41:28] \\ is in WikiCreole, I think, but MW has never been a part of that. [00:41:40] Ah, ok....so I just use regular html code then? [00:41:41] <br>, <br/> and <br />
all work [00:41:58] Is there a function called internalAttemptSave ? [00:42:38] Someone here gave me some code (with faux request) [00:42:56] that uses this [00:43:15] but it says EditPage::internalAttemptSave is not defined [00:43:18] Works....I coulda swore there was something with slashes, though... [00:43:41] but maybe that is in another wiki software that I tried out before I settled on WikiMedia [00:43:46] Or MediaWiki [00:43:48] lol [00:45:29] hm [00:45:39] nobody? [00:47:07] Can't help you much, Asga [00:49:50] 03aaron * r31770 10/trunk/extensions/FlaggedRevs/FlaggedRevsPage.i18n.php: Small tweak [00:50:16] Asga, yes, look in includes/EditPage.php. [00:50:42] You need to make sure includes/AutoLoader.php is run so that classes will be loaded correctly. [00:50:57] okay now i am confused [00:50:58] Generally it's loaded through includes/Setup.php or some path like that. [00:51:35] i looked into EditPage.php and didn't find it [00:51:38] wait a sec [00:53:08] ah is it called AttemptSave ? [00:54:04] hm strange it should already be loaded at the time i use internalAttemptSave [00:56:19] Simetrical where do you see that internalAttemptSave function in EditPage.php ? [00:58:44] i only see attemptsave :/ [01:04:53] hm don't know my error... i think all the wiki pages should be loaded at the time i call internalAttemptSave (a button on a newly created page is pressed)... and [01:05:46] then again... i am confused by the fact that there is only a function called attemptSave in EditPage and not internalAttemptSave... [01:05:58] or is that the same function? [01:06:15] :(( [01:13:13] Simetrical ? [01:28:19] I've spent 5 hours tracking down blank page "meat". If I run [01:28:21] echo preg_replace('/[\s]/', 'WHITESPACE', "Test à"); [01:28:23] the agrave is garbled, it becomes [01:28:25] TestWHITESPACE�WHITESPACE [01:28:27] I think that's expected behavior, preg_replace() is not multibyte safe. 
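The garbling skierpage pins down later in this log is easy to reproduce outside PHP: a byte-oriented regex whose whitespace class also matches 0xA0 (non-breaking space in Latin-1, as some locale-aware PCRE builds do) eats the second byte of the two-byte UTF-8 sequence for à. A minimal Python sketch of the same corruption; adding 0xA0 to the class by hand is an assumption standing in for PHP's locale behavior, not a claim about what preg_replace literally does:

```python
import re

s = "Test à".encode("utf-8")  # b'Test \xc3\xa0': à is the two bytes C3 A0
# A byte-level "whitespace" class that also matches 0xA0 (NBSP in Latin-1)
# consumes the second byte of à, leaving a lone, invalid 0xC3 behind.
garbled = re.sub(rb"[ \t\r\n\xa0]", b"WHITESPACE", s)

print(garbled)                             # b'TestWHITESPACE\xc3WHITESPACE'
print(garbled.decode("utf-8", "replace"))  # TestWHITESPACE�WHITESPACE
```

The replacement character U+FFFD in the second line is exactly the � quoted in the log, and the invalid byte sequence is what later makes PCRE's /u-mode functions "quietly die".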
[01:29:51] But when that text goes through parseInternal, MagicWord's matchAndRemove() returns an empty string! [01:30:51] MediaWiki: 1.11.0 (r26292), PHP: 5.2.1 (apache2handler) on Windows XAMPPlite [01:37:17] No comment on this gripping saga?! [01:37:26] lol [01:38:42] I wonder if the bogus character (I think it's 0xFFFD) blows up certain implementations of PHP's preg_replace. [01:39:11] what exactly is the problem, again? [01:44:45] Two problems: 1) agrave becomes garbled after preg_replace of whitespace. 2) MagicWord's matchAndRemove() on this garbled string returns an empty string. [01:45:17] I'm working on a tiny reproducible test program, no worries. [01:48:59] well anybody made up his mind to answer me? xD [01:52:12] I'm trying to modify a skin to add CategoryTree by parsing an Article and putting the result in the skin, but I'm not sure how to include the required CSS/JS on every page without hardcoding them. [01:56:45] hm tried using AttemptSave, but it didn't work... [01:56:46] :/ [02:00:34] how can i check if autoloader.php was run correctly? [02:03:27] We have a winner! From http://php.net/manual/en/reference.pcre.pattern.modifiers.php#54805 , [02:03:28] Regarding the validity of a UTF-8 string when using the /u pattern modifier, [02:03:30] ... [02:03:31] 2. When the subject string contains invalid UTF-8 sequences / codepoints, it basically result in a "quiet death" for the preg_* functions, where nothing is matched but without indication that the string is invalid UTF-8 [02:03:33] 03(NEW) Broken links in special:sitematrix - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13320 trivial; normal; Wikimedia: wikibugs; (lilewyn) [02:03:40] :( [02:04:43] So garbled output from preg_replace will trash MediaWiki parsing. PHP hate hate hate. [02:14:32] Well now i tried everything... 
it doesn't seem to work with attemptSave and noone of you guys seems to know internalAttemptSave ;) (except the now maybe offline Symetrical) [02:30:11] Ok now i have the problem [02:30:34] my mediawiki installation includes/EditPage.php is different than the one in the generated online documentation and code [02:30:38] why is that? [02:30:52] is my mw version to old? [02:32:42] my EditPage.php is from 03.09.2007 [02:32:59] Asga, http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/EditPage.php has the history [02:34:08] Looks like internalAttemptSave showed up in 2007-10-30. [02:34:40] Okay that explains everything [02:34:57] but now... how do i get the newest version? [02:35:08] do i have to replace the whole wiki? [02:37:48] !upgrade [02:37:48] --mwbot-- http://www.mediawiki.org/wiki/Manual:Upgrading [02:39:18] thanks [02:39:29] ill do that and hopefully it solves my problems [02:39:30] :) [02:39:31] thanks [02:41:08] Glad to help! [02:42:39] Good night then.. (its late here) [02:43:03] Kakurady, I'm not sure the cleanest way. OutputPage has an addHeadItem() method and it may be accessible from the skin. "@todo document", so good luck ;-) [02:44:03] skierpage: I'm now not even sure if the page data can still be modified at this point. [02:50:26] Me neither. The extension I'm familiar with (SMW) hooks into ParserAfterTidy and BeforePageDisplay to insert links to the CSS and JS it conditionally requires. If you know you always need it there must be a simpler way. [02:54:27] I guess it has to be done at BeforePageDisplay. The contents of Skin::outputPage looks like voodoo to me though, I guess I didn't really learn PHP after all..? [02:55:18] hello [02:55:44] Hi. [02:56:08] may I ask a question...? [02:56:34] I get the error Parse error: syntax error, unexpected T_LNUMBER in wk/includes/OutputPage.php on line 629 [02:57:02] during install in wk/config/index.php [02:59:47] stephan, what's at that line of code? What MW version are you running? 
[02:59:51] I've never been here before. I'm new to wiki. I have a serious code problem that it'd be a life saver for somebody to talk to me about in a PM. The problem is template related, I believe. Templates not displaying correctly, etc. If somebody knowledgable could PM me and help me out on a one on one basis that would be very amazing. Thanks in advance. [03:00:30] Any takers? [03:00:38] ... why not post here in public? [03:04:39] huh? [03:04:59] Netsplit. [03:05:49] 03dale * r31771 10/trunk/extensions/MetavidWiki/ (4 files in 3 dirs): [03:05:49] link to original page in mvd rewrites [03:05:49] fixed display of close link while loading xml media download list [03:07:59] version is 1.6.10 ... [03:08:04] and the lines are: [03:08:08] function reportTime() { [03:08:10] $time = wfReportTime(); [03:08:12] return $time; [03:08:13] } [03:10:24] Alright. I'm involved in a wiki project to create document and record the entire history and every aspect of the Church of Scientology. The first step to this is transcribing the wikipedia Scientology portal. I'm trying to figure out how templates work, and I've realized that simply copypasta is not going to work even if I collect every template on every page. [03:10:49] I think I'm missing something. Especially in the reference section. Things are starting to look like: http://www.religionisfree.org/wiki/index.php/Dianetics [03:12:17] A few people from over and ED told me that if I get all the templates created then the pages should display properly. The articales are fixing themselves as I bring templates over and they aren't that messed up to begin with. It all goes back to the references. [03:12:53] I've only got a basic understanding of what I'm working with, though I've figured most out on my own through trial and edit. :D [03:14:09] sorry. one moment. [03:14:47] I'm aware, thank you for pointing it out though! One moment while I register. [03:15:23] *amidaniel eyes http://leuksman.com/maps/geosearch.php [03:21:22] ... 
why am I rejoining the channel?... [03:27:59] anon1532_ , it looks like the templates you've copied over use the ParserFunctions extension, but http://www.religionisfree.org/wiki/index.php/Special:Version doesn't have this installed. [03:28:08] Hence the {#if: stuff on that page. [03:30:14] 03(NEW) SMW mb unsafe regexps can produce garbled UTF-8 strings - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13321 major; normal; MediaWiki extensions: Semantic MediaWiki; (info) [03:30:24] wikibugs, that was me :) [03:33:32] I made a little test program, http://www.skierpage.com/tmp/mb_bug.txt [03:33:34] It fails on my Windows XAMPPlite, but works at my ISP and other sites. Can anyone imagine why the regexp [03:33:35] preg_replace('/[\s]/', 'WHITESPACE', "Test à") [03:33:37] would produce garbled UTF-8 in some PHP configurations and not others? [03:33:47] U see [03:34:10] I see* I will pass that information on to my sysop and have him test it. [03:39:28] anon1532_ , You may find you need other extensions in order for Wikipedia text to paste or import cleanly. Compare your Special:Version carefully with http://en.wikipedia.org/wiki/Special:Version , and good luck. [03:40:40] skierpage: It could depend upon the character encoding you used to create the php file [03:40:45] Awesome! Terms I can understand. I can totally do that. Thank you so much! [03:41:08] If that differs from the character encoding of the database, etc., it could cause garbage results [03:42:59] amidaniel, I'm pretty confident my test program is UTF-8, that's the only encoding under which it appears correctly in the browser. There's no database involved. But maybe there's some php.ini setting for character set handling. [03:44:02] skierpage: You might try escaping the á ... that is, use the unicode for the character instead of the character literal [03:46:53] amidaniel, Good point. 
The original bug is triggered by user-submitted wiki text so I wonder what determines the form PHP uses to represent it internally? [03:47:45] Hmm .. not sure [03:49:45] Maybe default_charset and mbstring.internal_encoding are different on Windows XAMPPlite vs UNIX-like (both are commented out in my settings). Thanks for the pointer! [03:50:39] heh, sure thing -- hope I didn't just send you off on a wild goose chase though :)) [03:58:16] FWIW, the answer might be in mb_get_info(). My ISP doesn't seem to have multibyte support configured. [03:58:56] Putting CategoryTree into sidebar isn't going very well - when there's no wikitext to parse in the page, the script won't load. [03:59:09] Goodnight everyone! Such great software built on such a disorganized hack language... [03:59:55] haha [04:10:06] hello, revision 31562 of Parser.php changed the behavior of ISBN links, and thus broke a userscript that is used by a little more than 700 users on the english wikipedia. [04:10:21] http://svn.wikimedia.org/viewvc/mediawiki?view=rev&sortby=date&revision=31562 [04:11:04] the userscript in question [04:11:06] http://en.wikipedia.org/w/index.php?title=User:Lunchboxhero/monobook.js&oldid=195446625 [04:12:16] would it be possible for a developer to help me contact the users of this userscript and explain what updates they need for their scripts. [04:16:01] I don't see how we would do that [04:16:09] it'd be the exact same process as for anyone else [04:16:52] http://en.wikipedia.org/wiki/Special:Search?ns2=1&search=externISBN&fulltext=Search [04:17:33] I'd use those search results, read through the lot of them, compile a list, write a bot, and make the necessary changes [04:18:41] I imagined that this scenario had come up a before and admins or developers had a tool to do something like that. 
[04:18:43] and keep in mind that it might change again tomorrow, so maybe you might want to pick a distribution method with easier updates, this time [04:18:46] nope [04:19:57] well, do you have any suggestions for an easier distribution method for userscripts? [04:20:01] I'm sure there are bot operators around with similar scripts set up, and the expertise to modify them [04:20:45] not off the top of my head [04:20:56] importScript? [04:21:03] Keep it in one place. [04:24:02] Hmm.. I'll play with that. Thanks. [04:26:35] meanwhile, if any bot operators want to do me and the 700 or so folks who use this script a favor... [04:27:39] do we have a channel for bots? [04:28:13] There's #wikipedia-BAG and #pywikipediabot [04:28:51] there you go fgregg, try those two [04:29:10] *AaronSchulz forgot all about the BAG channel [04:29:24] Thanks. If I can find a bot operator willing to help, will they have to get perimssion to run this operation? [04:29:29] no [04:29:34] not from me anyway [04:29:47] you can ask them about community policies [04:29:56] It looks as though it would be better to do it manually. [04:30:23] well, I care, but I my care does not multiply that high [04:30:48] to do it manually [04:30:51] Meh. A little meth and a couple of hours and you'll be all done. ;) [04:31:56] haha, well, thanks for the pointers. [05:30:58] motd/ [05:55:23] hello, can some one tell me the exact file to download for the latest meta, history and pages from the wiki dumps? I am on the download.wikimedia.org page but can't figure out [05:55:55] I have more than 40 gb reserved for the download so space is no big problem for me. [05:56:38] I want the name of the right file.tar.bz2 containing the meta, and current pages with their history, not just the current version. [06:00:01] I've popped on a couple of times to ask this but have not received an answer: [06:00:06] Does anyone know how to get mediawiki to assume the current namespace if no namespace is given. 
[06:00:17] In a link that is. [06:00:57] Like the "Default link prefix" in the namespace manager. [06:06:09] that's an interesting idea, but not one that I'm aware of [06:07:09] Thanks Werdna. [06:14:25] I may have to write my own extension if I don't find one soon. [06:15:04] how to play flash videos in mediawiki [06:15:12] extension. [06:15:37] yes [06:41:30] making a php bot for mediawiki - where should i start? [06:42:04] first, change language :) [06:42:23] ..? [06:42:33] php is designed for web applications, usually. [06:42:37] i know [06:43:06] i consider a bot running on a server shell interacting with wiki a web application even though it's not clearly server-client :-> [06:43:23] i could user ruby/perl whatever but i only found one php bot template for "easy start" [06:43:37] i wonder if there's some recommended api or framework for starting to build a bot [06:43:57] well, pywikipedia is fairly common. [06:44:45] let me rephrase, PHP is generally used as a serverside scripting environment for serving server-generated stuff. [06:45:03] you CAN use it for other purposes if you want, but you're probably better off with perl/python/etc [06:47:18] Werdna: well actually i was looking simply for something like pywikipedia, no matter the language, thanks :-) [07:28:11] #vikidia-fr [07:30:08] *Werdna stabs Grondin [07:30:23] hi werdan7 [07:30:29] Werdna, :-) [07:30:35] *Werdna stabs werdan7 [07:30:44] dev brutality! [07:30:46] get your own name >< [07:30:47] *Splarka films it [08:22:37] hello [08:23:32] hi [09:00:31] <_wooz> lo [09:38:09] 03tstarling * r31772 10/trunk/cortado/src/com/fluendo/plugin/HTTPSrc.java: [09:38:09] Fixed total breakage of the seek feature whenever the applet is running in a [09:38:09] browser. The major browsers cache requests made with URLConnection. They strip [09:38:09] the Range header, and even add their own If-Modified-Since. 
Fixed by reading [09:38:10] through the byte stream up until the requested byte offset, if there is no [09:38:14] Content-Range header. Hopefully this will be fast in most cases, due to the [09:38:16] client-side caching itself. [09:41:32] 03tstarling * r31773 10/trunk/cortado/src/com/fluendo/player/Status.java: Fixed a bug: there was a slight offset between the thumb rectangle as seen by intersectSeeker(), and the one drawn by paintSeekBar(). Fixed by factoring out the relevant geometry calculations. [09:54:01] Where is the "all about user management" help page again? I forgot to bookmark it... [09:54:51] jelle_: you mean about permissions? [09:55:51] Hello, can somebody tell me if it is possible to require a user login to view ALL wiki content? [09:56:27] yes it's possible [09:56:49] !access | blueandwhiteg3 [09:56:49] --mwbot-- blueandwhiteg3: For information on customizing user access, see < http://www.mediawiki.org/wiki/Help:User_rights >. For common examples of restricting access using both rights and extensions, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >. [09:57:07] !secrets | blueandwhiteg3: [09:57:07] --mwbot-- blueandwhiteg3:: MediaWiki was not designed with read-restrictions in mind, and may not provide air-tight protection against unauthorized access. We will not be held responsible should private information, such as a password or bank details, be leaked, leading to loss of funds or one's job. [09:57:39] great, thanks [09:57:44] i think i should learn those magic words as well, Duesentrieb :) [09:58:10] hm, i think .htaccess HTTP AUTH seems in order here :) [09:58:12] guillom: just guess, the bot will help [09:58:17] !perm [09:58:17] --mwbot-- I don't know anything about "perm". You might try: !newusergroup !parser [09:58:24] right [09:58:24] it would be nice if somebody wrote a quick tie-in to the wiki database for this.. [09:59:21] !groups [09:59:21] --mwbot-- For information on customizing user access, see < http://www.mediawiki.org/wiki/Help:User_rights >.
For common examples of restricting access using both rights and extensions, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >. [09:59:48] blueandwhiteg3: not sure if it exists that way around. it does exist the other way around (make mediawiki trust http auth) [09:59:53] !auth [09:59:53] --mwbot-- http://www.mediawiki.org/wiki/AuthPlugin [10:01:12] Hmm, not quite what i wanted :) [10:31:49] 03grondin * r31774 10/trunk/phase3/languages/messages/MessagesFr.php: Update french localization after rebuilding file [10:32:57] It seems some wiki themes have a tighter spacing when you make bulleted lists (asterisk) than others. Can anybody tell me specifically what the difference is? It also seems to affect behavior when double lines are passed along. [10:33:31] It also seems to affect the first blank line spacing [10:35:35] For example, the stock mediawiki theme has very tight spacing between bullets.. like a quarter aline [10:35:52] However, some themes have full line spacing, which I think looks much, much better [10:41:56] can you be more specific? [10:44:53] chick and monobook have: [10:44:53] ul { [10:44:53] line-height: 1.5em; [10:45:06] TimStarling: http://www.theage.com.au/news/web/more-woes-for-wikipedias-jimmy-wales/2008/03/11/1205125874243.html [10:45:15] Splarka: I don't know how... * points have a full line between them in some skins [10:45:17] Jeff Merkey at it again :) [10:46:05] blueandwhiteg3: I mean, more specific, which themes? [10:46:15] bluewiki has good spacing [10:46:30] default mediawiki does not [10:47:15] link ? [10:47:51] AaronSchulz: have you inserted or removed a BOM in .i18n in http://lists.wikimedia.org/pipermail/mediawiki-cvs/2008-March/052862.html ? [10:48:11] I can't find the link, grrr [10:48:40] and by stock, you mean vanilla monobook? [10:49:06] Raymond_afk: not deliberately, how would I tell? 
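For blueandwhiteg3's question above (require a login to view ALL content), a common LocalSettings.php sketch; the whitelisted page name is a version-dependent assumption, and per mwbot's !secrets warning this is not air-tight:

```php
// In LocalSettings.php: anonymous users can neither read nor edit.
$wgGroupPermissions['*']['read'] = false;
$wgGroupPermissions['*']['edit'] = false;
// The login page must stay readable, or nobody can ever log in.
$wgWhitelistRead = array( 'Special:Userlogin' );
```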
[10:49:45] you might try in your MediaWiki:Monobook.css (assuming that is your skin, if not mayb try Common.css) ==> #bodyContent ul, #bodyContent li { line-height: 2.25em; } [10:49:53] er, subst/li/ol [10:54:03] i can't seem to get definition lists to nest properly [10:54:30] I'm not having much luck getting it to work out right Splarka [10:54:43] It is late here and I need to dissect CSS structures to get this right, it seems [10:58:27] <[blackb]> hi [10:58:36] <[blackb]> wich file i must open to see may toolbar plz? [10:59:53] blueandwhiteg3: try this then:
<ul style="line-height: 1em">
  <li>this is a list item</li>
  <li>this is a list item</li>
  <li>this is a list item</li>
</ul>
[11:00:21] and copy it a bunch of times on a page, increase the "1em" a bit until you like the result [11:01:30] hippietrail: ;foo:bar\n:;foo:bar\n:;foo:bar\n::;foo:bar\n:;foo:bar\n;foo:bar doesn't do it for you? [11:02:36] Splarka: i just got it - didn't realize the : and ; are more or less different flavours of the same thing [11:03:25] i thought the inner lines had to start with the same sequence but they need to use : wherever the first line used ; [11:14:49] hippie: : can be used to indentedly nest lots of things [11:15:38] although a line starting with ; (even if the line is "indented" with more colons) will treat the first semicolon as the delimiter.. bah, who wrote this parser [11:21:40] hippietrail: it is all fun confusing parsing of various ascii into sensible
<ul>/<ol>/<dl> [11:22:18] the scary part is that i understand it now (-: [11:22:36] *Splarka backs away slowly [11:23:41] *ul>li #ol>li :dl>dd ;dl>dt ;dl>dt:dl>dd [11:32:26] <[blackb]> i have a problem [11:32:40] <[blackb]> to add this toolbar to all client on my wiki how can i do? [11:32:43] <[blackb]> http://en.wikipedia.org/w/index.php?title=Wikipedia:Toolbars&action=edit&section=3 [11:41:48] Hello, I have installed the IssueTracker plugin, but it writes just @ENCODEDsomethingreallylong when I add an tag to text?? My version is 1.11.1 from debian package [11:42:49] Hi, is there a separate IRC-chan for Semantic MediaWiki? [11:47:36] probably [11:49:37] Does anyone know the name? [11:51:01] 03(mod) Allow blank lines between list items - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13223 (10tstarling) [11:53:10] 03(mod) Truncation of audio files in Cortado - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11232 (10tstarling)
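Splarka's shorthand above (*ul>li #ol>li :dl>dd ;dl>dt) compresses how the parser turns line-start characters into HTML lists; expanded as a sketch:

```
* item            ->  <ul><li>item</li></ul>
# item            ->  <ol><li>item</li></ol>
; term : def      ->  <dl><dt>term</dt><dd>def</dd></dl>
;foo:bar
:;foo:bar         ->  a second <dl> nested inside the first one's <dd>
```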
[12:21:44] mithro: that's a dynamically generated bit if css that depends on user preferences [12:22:04] you can pretty much ignore it [12:22:04] Duesentrieb: ahh, I should have just gone to it :) [12:22:06] !css [12:22:06] --mwbot-- To change styles for your wiki, go to one of the MediaWiki:xxx.css wiki page and put your custom styles there (sysop/admin rights required). MediaWiki:common.css is for all skins and should be used for content styles. MediaWiki:monobook.css is for the MonoBook skin (default), etc. For more information, see !skins and [12:23:26] Duesentrieb: any idea where the page-specific class is mentioned in the source? [12:23:45] "the body tag" is what i said :) [12:23:50] there's only one body tag... [12:24:25] it's just page-whatever [12:24:57] all I see is [12:25:35] mithro: then you must be using a very old version of mediawiki, or a custom skin [12:26:51] using Monobook with 1.7.1 [12:27:11] 1.7 is ancient [12:27:15] !upgrade [12:27:15] --mwbot-- http://www.mediawiki.org/wiki/Manual:Upgrading [12:27:26] 1.12 is comming out soon [12:28:18] just using what comes with debian [12:29:26] to say it i robÄs words: "stable" in debian, "deprecated" everywhere else. [12:29:47] 1.7 has long become unsupported [12:30:03] i.e. no security patches. better hope debian keeps it patched. [12:30:16] you got packages for debian stable? [12:30:53] the official package is the tarball. [12:31:37] mediawiki is a copy-and-run app, there isn't much need for using a package manager. and debian etc tend to mess with the directory structure, scattering the wiki over half your system. ugly. [12:32:00] are there any major changes in 1.12 compared to 1.11.2? [12:32:06] !download [12:32:06] --mwbot-- The latest stable release of MediaWiki can be downloaded from . Files are supplied in a .tar.gz archive. MediaWiki can also be obtained direct from our Subversion repository. See for more information. 
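Duesentrieb's point above about the body tag carrying a page-specific class is how per-page styling works from MediaWiki:Monobook.css; a sketch, with the page name as an example:

```css
/* MonoBook emits e.g. <body class="... page-Main_Page ...">,
   so a rule can be scoped to a single page: */
body.page-Main_Page #bodyContent {
    background-color: #f8fff8;
}
```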
[12:32:14] piksi: read the release notes [12:32:33] piksi: http://svn.wikimedia.org/svnroot/mediawiki/branches/REL1_12/phase3/RELEASE-NOTES [12:35:19] I have to admit, I really dislike wikis [12:36:07] don't use them then. [12:36:12] the rigth place for admitting! [12:36:30] luckily, we don't have to sell you anything [12:38:49] Duesentrieb: sadly, my users seem to love them [12:45:41] I've tested the new version of cortado with Java 1.4-1.6 on Windows [12:45:57] not under a browser though, just with the standalone player [12:46:24] hmmm, maybe there's some way to test it in a browser... [12:48:56] we can probably set it up to use the new applet on test.wikipedia.org, would anyone be interested in trying it out there? [12:54:06] Is there an extension that does this: I want to make "red" links to be plain text unless you're logged in. Thus unabling "new visitors" to get confused by "create new page"-stuff. [12:54:35] no [12:56:30] [1255][tstarling@zwinger:/home/wikipedia/common/php-1.5/extensions/OggHandler]$ svn up [12:56:31] svn: This client is too old to work with working copy '.'; please get a newer Subversion client [12:56:35] ok. thanks. :) Then I'll stop looking, and get to making it myself, I guess. [12:58:56] *TimStarling wonders whose fault this is [12:59:34] it works on hume... [13:00:15] hello, can some one point me to a proper page for the documentation of the xml hierarchy for the wikipedia pages? I have downloaded the wikidump and will be doing a knowledge harvesting project in which I will be using python to extract data from the dump. [13:00:58] *TimStarling moves to #wikimedia-tech [13:02:52] Duesentrieb: thanks for your help [13:04:54] I am exactly trying to search any document which can explain me the nodes of the wikipedia pages and their node hierarchy [13:07:33] kk: it's fairly self-explanatory i guess. the documentation page also has a DTD. bot sure if we have a proper explanation of each node. [13:07:44] *not [13:09:10] Duesentrieb: thanks. 
I am trying to work my way through the same [13:09:27] was trying to guess if there is a documentation where I can confirm the things [14:02:37] hello [14:04:14] question: some of my users are asking for the option to upload big files, which should be linked into the wiki. From my point of view i dont like to change wiki-upload restrictions and let the users try to upload > 100 mb files...which they want to. So first idea was using ftp-server on the same box [14:04:29] any better idea than using an ftp server in such a case ? [14:06:06] 100mb file upload via a web interface seems like a bad idea to me. [14:06:29] FTP sounds like a better bet. [14:06:41] odb|fide1_: using ssh based uploads is probably a better idea. [14:06:49] mh [14:07:02] 03(NEW) Unused messages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13322 15enhancement; normal; MediaWiki: Internationalization; (robin_1273) [14:07:03] Duesentrieb: yes, but i dont want to confuse those users with an scp client [14:07:06] that is, scp or something based on that, like fish or the more universally supported sftp (which is not much in common with ftp) [14:07:33] Duesentrieb: and why better: you mean cause of security issues ? [14:07:47] yes. [14:08:16] also because ftp is quirky - depending on the client, it may for example use "ascii mode" and mess with the encoding of files [14:08:17] ok, i forgot to mention that its related with a closed network....so basically a private network [14:08:24] which more often than not breaks stuff [14:08:28] mh ok [14:08:48] well, if you have a private network, you can just use a shared drive [14:09:19] then again we end up in the question: what kinds of links do work inside a wiki-article [14:09:24] the issue is in any case to make the wiki know about the manually uploaded file, no matter how it got into the server's filesystem [14:09:33] we use mediawiki for our intranet and link to files in *cough* sharepoint.
as my users do want to upload and then add the link to the wiki [14:09:38] there's an importimage script, but you should be aware that it involves copying the file [14:10:26] odb|fide1_: my personal solution would go like this: set up a shared disk (mounted via smb to the windows boxes and nfs to the server/linux boxes) [14:10:41] odb|fide1_: set up a script on the server that checks for files on that drive every x minutes. [14:11:12] odb|fide1_: when it finds a file, it runs the image import thingy on it (which will copy it to the wiki's upload dir and register it in the db) [14:11:26] that sounds like a great idea [14:11:29] then delete the file on the shared drive (optional, but probably a good idea for huge files) [14:11:35] yes [14:11:47] can you tell me the name of the image thingie ? [14:11:55] as the rest should be manageable for me so far [14:11:56] maintenance/importImages, iirc [14:12:06] i think it expects a description in a text file [14:12:12] ah...so its already part of a default wiki install ? [14:12:18] it can/should also be told what user to act as [14:12:23] yes, it's there per default [14:12:35] lets me take a look into my wiki-files :D [14:12:59] if you want to go fancy, make your script use the file owner on the fs to determine what wiki user to associate with. or use one dir per user. or some such [14:13:22] and require a description in a separate text file, or generate that on the fly, as you like [14:13:56] got a importImages.inc.php in /maintenance [14:14:15] and: importImages.php [14:14:39] Duesentrieb: is this documented in the mediawiki docs ? any idea ? [14:14:45] 03raymond * r31781 10/trunk/phase3/ (2 files in 2 dirs): Follow up commit for r31775, thanks to Nikerabbit :-) [14:15:38] odb|fide1_: http://www.mediawiki.org/wiki/Manual:ImportImages.php [14:15:53] odb|fide1_: though that page doesn't seem to detail the available options.
run it without params to see usage info [14:15:54] i am wondering why i should use importImages for all files in general [14:16:01] ok [14:16:05] thank you very much [14:16:12] like always you are a big help [14:16:20] odb|fide1_: oh, mediawiki calls all uploads "images". never mind that [14:16:29] ok...good to know [14:16:40] if you upload an mp3 or doc file, it goes into the image namespace. [14:17:09] have fun [14:18:32] 03aaron * r31782 10/trunk/extensions/FlaggedRevs/FlaggedArticle.php: *Don't autoreview outside the reviewable namespaces [14:20:32] sounds like the best solution [14:20:40] 03(NEW) ArticleSave family of hooks has unused parameters - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13323 15enhancement; normal; MediaWiki: Page editing; (seventowers) [14:30:52] Hi, I'm trying to set up my own wiki. I'd like to be able to format album info as they do on wikipedia - for example, the way all of the info about The Dark Side of the Moon album is listed on the right at the top of the article. I have tried to copy and paste the info (beginning with "{{Infobox Album...") into my personal wiki, but, when I do the page preview it doesn't work. Any idea what I am doing wrong? Thanks much! [14:39:55] !templates | tomlowry [14:39:55] --mwbot-- tomlowry: For more information about templates, see . The most common issues with templates copied from Wikipedia can be fixed by installing ParserFunctions and enabling HTML Tidy . [14:41:02] tomlowry: my advice: make your own infobox. the ones from wikipedia tend to be overly complex. also, it's good to understand how it works, so you can adjust it to your needs [14:47:03] If I want to add Google Analytics to my site, where do I insert the javascript code? [14:47:18] Apparently, the code from Google Analytics does not work when inserted into common.js [14:47:29] Do I have to insert it manually? [14:47:39] If so, which file should I consult?
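The watch-folder scheme Duesentrieb describes at 14:10-14:13 fits in a few lines. Below is a rough Python sketch of that idea; the actual import command (something like `php maintenance/importImages.php <dir>`, run from the wiki root) is passed in as a parameter, and none of this has been tested against a live MediaWiki, so treat the paths and invocation as assumptions.

```python
import os
import subprocess

def import_new_files(watch_dir, import_cmd, delete_after=True):
    """Hand every file sitting in watch_dir to the wiki's importer.

    import_cmd is the command prefix to run, e.g.
    ["php", "maintenance/importImages.php"] (assumed invocation --
    run the script without arguments to see its real usage).  The
    watch directory is appended as the final argument.  Returns the
    file names that were handed over."""
    names = sorted(
        f for f in os.listdir(watch_dir)
        if os.path.isfile(os.path.join(watch_dir, f))
    )
    if names:
        # importImages takes a directory and registers each file in it
        subprocess.run(import_cmd + [watch_dir], check=True)
        if delete_after:
            # optional, but sensible when people drop 100 MB files here
            for name in names:
                os.remove(os.path.join(watch_dir, name))
    return names
```

Run from cron every few minutes; the fancier ideas above (one directory per user, mapping file ownership to wiki users, generated description files) can be layered on top.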
[14:47:49] 03shinjiman * r31783 10/trunk/phase3/languages/messages/ (4 files): [14:47:49] * Update Chinese translations [14:47:49] * Update Cantonese translations [14:47:49] * Update Old Chinese / Late Time Chinese translations [14:50:39] 03gri6507 * r31784 10/trunk/extensions/WhiteList/ (SpecialWhitelistEdit.php SpecialWhitelistEdit_body.php): fixed a minor bug in displaying the list; used to sometimes cause "Bad Title" error [14:50:47] ... well? [14:52:27] 03raymond * r31785 10/trunk/extensions/ (2 files in 2 dirs): Localisation updates German [14:56:22] 03raymond * r31786 10/trunk/extensions/FlaggedRevs/FlaggedRevsPage.i18n.php: Localisation updates German [14:56:31] 03(mod) Sort key in Template Output - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13318 summary (10roan.kattouw) [14:56:33] 03(mod) Value for Configuring Number Output - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13317 summary (10roan.kattouw) [14:58:26] Hi, is there an easy/fast way to let mediawiki do dummyedits on every single of its articles? [14:58:26] (I just implemented semantics into a template, but the factbox only shows after I do a dummyedit on such a page) [15:02:03] !jobqueue | HACK]Gara[ [15:02:03] --mwbot-- HACK]Gara[: The Job Queue is a way for mediawiki to run large update jobs in the background. See http://meta.wikimedia.org/wiki/Help:Job_queue [15:02:51] thanks [15:03:04] So MW does it automatically if I wait long enough? [15:03:14] yes. but you can have it faster [15:03:22] well, if enough people visit your site [15:03:40] the page i linked to misses a reference to the runJobs.php maintenance tool. [15:03:40] The "job queue length" on Special:Statistics is 0 though [15:03:59] hm... that's a bit odd if you just edited a much-used template. 
[15:04:07] though i don't know what "much" means in your case [15:04:19] anyway - you can also ditch all caches by touching localsettings.php [15:04:21] It's linked to ~90 articlepages [15:04:36] i don't think it bothers with the queue if usage is < 500 [15:04:47] it's intended really for cases where the usage is > 100000 :) [15:05:34] HACK]Gara[: just touch localsettings. that will not cause all pages to rerender right away, so links may not update. but all pages will rerender on view. [15:06:10] <[blackb]> hi all i have a problem [15:06:21] Is overwriting it with itself equivalent to touching? [15:06:27] <[blackb]> i personalized the toolbar for the root user of my wiki [15:06:29] Since I don't have shell access [15:06:40] <[blackb]> but other clients don't see that toolbar [15:06:41] HACK]Gara[: probably [15:06:53] <[blackb]> only root has the personal toolbar [15:07:09] [blackb]: what did you edit to personalize it? [15:07:46] <[blackb]> i tried first with [[Wikipedia:Wikipedia Toolbar|Wikipedia Toolbar]] [15:07:55] <[blackb]> and then with Mediawiki:Common.js [15:08:36] yes, that is the same for everyone [15:08:48] if others don't see it, then they need to clear their browser cache [15:08:54] they still have the old version loaded. [15:09:24] <[blackb]> the Toolbar or the Common.js? [15:09:55] Common.js [15:10:08] the toolbar as such doesn't exist.
[15:10:15] it's generated by different pieces of javascript [15:13:46] 03(NEW) strange behavior of revision history page when dir=prev - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13324 minor; normal; MediaWiki: History/Diffs; (charlottethewebb) [15:13:50] 03(mod) Make subproperties more usable - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13319 summary (10roan.kattouw) [15:17:59] 03(mod) strange behavior of revision history page when dir=prev - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13324 (10charlottethewebb) [15:18:26] bye guys and thanks for the great help [15:20:39] is anyone familiar with the extension? [15:23:59] does anybody have good knowledge of the programming language python? and maybe ever looked at the python scripts of the mediawiki extension wikipdf (http://sourceforge.net/projects/wikipdf/)? please i need urgent help :( i'm kind of an idiot in python and have some basic questions [15:25:36] don't you guys want to be a big hero? and rescue a little girl? ;) [15:28:48] 03mkroetzsch * r31787 10/trunk/extensions/SemanticMediaWiki/ (2 files in 2 dirs): Added quickfixes as suggested for Bug 13321 [15:33:36] <[blackb]> yes if i open [[Mediawiki:Common.js]] with another user after deleting all cookies [15:33:40] <[blackb]> i see the javascript code [15:33:53] <[blackb]> but the panel doesn't change [15:37:00] Can I change the default color of text in a comment indented with a colon? I.e., "blah blah blah \n : Comment-reply here in diff color." Is this in the stylesheet? [15:38:05] 03(ASSIGNED) SMW mb unsafe regexps can produce garbled UTF-8 strings - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13321 +comment (10mak) [15:39:06] 03(mod) performance: Special:Browse shouldn't translate if the user's language matches wiki - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13310 (10mak) [15:40:44] <[blackb]> Duesentrieb: could u say why i have this problem?
[15:41:25] [blackb]: first of all, cookies and page cache are two completely unrelated things. [15:41:47] [blackb]: for more, i'd have to look at your wiki. can you give me a url? [15:42:49] where can I ask the devs if they can try to avoid changing logs and stuff like that if there aren't any new functions? [15:43:32] because a lot of the logs that I use for my bots change their syntax without adding anything new... and this only breaks my bots -.-' [15:43:54] Filnik: that's why you should avoid scraping html pages to get information into your bots. [15:44:05] aren't all logs accessible via api.php? [15:44:45] Duesentrieb: I'm not the only one, almost all the functions "scrape" HTML pages because the api wasn't always available [15:45:06] then instead of adjusting to the new html, change the code to use api.php [15:45:09] Duesentrieb: and I remember that the API doesn't support Special pages and the "limit and offset" options [15:45:09] should be much easier, too [15:45:28] Duesentrieb: but I use special pages... [15:45:34] it should support the functions of most special pages. [15:45:53] and it does support limit and also a "start from" thing [15:45:58] Duesentrieb: uh, ok [15:46:15] not an "offset" though - special pages shouldn't either. offsets are terribly inefficient [15:47:26] Filnik: i dunno about "most" special pages. but requests to add a new query to the api are probably viewed in a much more favorable light than a request to not change the html so your bot doesn't break. [15:47:42] [blackb]: without looking at your wiki, i can't help you. [15:48:35] Duesentrieb: but adding something and asking not to do something are a bit different (the first takes time and maybe will be done in a few days or so, the second is just a "reminder" :) ) [15:49:05] send a mail to mediawiki-l if you like. [15:49:14] but i doubt you will get much of a positive response [15:49:25] i'd call it flamebait :) [15:52:06] Duesentrieb: yes, you're right.
But I wonder why people like to make changes without adding something new... isn't this useless? :S [15:55:30] Filnik: depends. it might fix issues with some browser, it might make css simpler, it might be that some stuff is unified and combined internally, etc. [15:56:20] 14(INVALID) Inline query tables print raw numbers in addition to expected display values - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12591 +comment (10mak) [15:57:01] *Filnik wonders what might change adding "redlink=1" or "editredlink" or whatever there was before.. [15:57:47] 03(mod) Transfer some Features from DPL to - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13151 (10mak) [15:57:53] 03(mod) article metadata in - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=10824 (10mak) [15:59:16] Filnik: maybe you should read the subversion logs if you want to know why changes were made [15:59:44] TimStarling: you're right, I already do it for pywikipediabot ;) [15:59:47] *Filnik goes to check [16:01:14] 03ialex * r31788 10/trunk/phase3/ (docs/skin.txt skins/SkinPHPTal.sample): [16:01:14] * Update docs/skin.txt. [16:01:14] * Removed skins/SkinPHPTal.sample as PHPTal isn't used for ages. [16:04:22] 03mkroetzsch * r31789 10/trunk/extensions/SemanticMediaWiki/maintenance/SMW_dumpRDF.php: Fix of Bug 13142 as suggested. [16:04:35] 03(FIXED) SMW_dumpRDF.php needs server definition - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13142 (10mak) [16:06:55] '/nick Squared [16:06:55] 03raymond * r31790 10/trunk/phase3/maintenance/language/ (splitLanguageFiles.inc splitLanguageFiles.php): Added into SVN 2005 and unmaintained since 2005. Looks like a forgotten test. [16:08:25] Is it possible to let mediawiki create thumbnails on the fly? [16:09:10] that's what it has been doing for years [16:09:24] !thumbs [16:09:24] --mwbot-- For information on configuring thumbnailing on MediaWiki, please refer to . [16:09:57] um, i kinda lost my innodb tables, can anything be salvaged using the text db?
:) [16:09:59] 03(ASSIGNED) Propertys does not properly work inside - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13011 major->normal; +comment (10mak) [16:10:10] cheers.. how can i inline an external image in mediawiki when the url doesn't end in a standard image suffix? [16:10:22] is there a way to force the media type? [16:10:35] no, the external image support is kind of lame [16:10:37] 03(mod) Create the Hungarian Wikinews, Erzya & Extremaduran & Gan Wikipedia, and Japanese Wikiversity - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13264 (10tomonaka) [16:10:45] you might be able to add ?=.png to the end :) [16:10:59] hmm.. wow, that's perverse, but it may work.. [16:11:24] anyone? The data seems to be intact in the text table [16:11:39] Tommie: but text is innodb too, i thought? [16:11:57] Tommie: everything is in innodb. [16:12:06] Tommie: anyway - you can get the full text of all revisions from that table. you get a rough chronological order from the ids. [16:12:10] nope, not here at least - i can read all the different revisions of each page in the text table [16:12:11] you have no chance to recover, restore your backup [16:12:26] Tommie: you will have to assign revision numbers and page titles by hand. then you can build an xml dump from that info [16:12:32] that can be imported into a fresh install [16:12:49] damn. does anyone else here use the confirmaccount & confirmedit extensions? [16:13:00] Tommie: restoring your backup *is* the best option. [16:13:05] the old_title field won't help me in any way? [16:13:17] piksi: the two are largely unrelated [16:13:35] there aren't any backups, stupid - i know. I just had too little knowledge about innodb and thought backing up the mysql-db folder was enough [16:13:58] Tommie: old_title? in the text table? [16:14:02] Duesentrieb: no they aren't, since i'm having a problem with the confirmaccount captcha which is provided by confirmedit [16:14:10] i dunno what you are looking at, but that doesn't exist in my version..
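Back on the earlier scraping discussion (15:43-15:47): querying api.php instead of parsing Special-page HTML is mostly a matter of building the right URL. A hedged Python sketch follows; the parameter names (`list=logevents`, `lelimit`, `letype`, `lestart`) are my recollection of the query API's log module, so verify them against your own wiki's api.php help output before a bot depends on them.

```python
from urllib.parse import urlencode

def log_query_url(api_base, letype=None, limit=50, start=None):
    """Build an api.php request for log entries, the machine-readable
    alternative to scraping Special:Log.  Note the continuation
    model: a "start from" timestamp (lestart), not an offset."""
    params = {
        "action": "query",
        "list": "logevents",   # assumed module name; see api.php help
        "lelimit": limit,
        "format": "json",
    }
    if letype:
        params["letype"] = letype   # e.g. "upload", "block"
    if start:
        params["lestart"] = start   # continue from this timestamp
    return api_base + "?" + urlencode(params)
```

When a bug-tracker request asks for a new api.php query module rather than frozen HTML, this is the shape of the client it unblocks.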
[16:14:23] so, was thinking about writing a script which takes the most recent version of each old_title and reinserts it into the page table and rebuilds it ... is that feasible? [16:14:31] Tommie: usually the best way to back up mysql is with 'mysqldump'. (although copying the files usually works with myisam (not always), it's much less likely to work with innodb) [16:14:43] Tommie: old_title should be empty [16:14:52] that dates from pre-1.4 versions [16:14:57] *Duesentrieb goes to make food [16:15:00] hm... food... [16:15:08] It isn't empty here actually, it contains the correct titles [16:15:17] even for pages created recently? [16:15:25] it's kinda freaky and i was hoping to salvage some info using this technique [16:16:15] SELECT count(*) FROM `text` where old_title != "" [16:16:16] gives me 2451 out of 3,550 rows [16:17:28] say i do this script and re-insert data into the page table, is there a way to rebuild the wiki based on the new data in page? [16:23:20] flyingparchment: thanks. that's ugly as hell, but it works fantastically! [16:24:16] 03(ASSIGNED) Support subproperties in #ask query printouts - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13319 summary; +comment (10mak) [16:24:52] 03yaron * r31791 10/trunk/extensions/SemanticForms/specials/SF_Forms.php: Removed 'Add data with this form' link [16:25:12] flyingparchment: ok, one more. can i give these images a size component like i can in straight html?
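If Tommie does go ahead with the recovery script, the "build an xml dump from that info" step Duesentrieb mentions could start from something like this Python sketch. The element names follow MediaWiki's export format, but a real importable dump carries more (a namespace declaration on the `<mediawiki>` root, revision timestamps, contributors), so treat this as a skeleton rather than the full schema.

```python
import xml.etree.ElementTree as ET

def build_export(pages):
    """pages: iterable of (title, wikitext) pairs, e.g. the newest
    text-table row recovered for each old_title.  Returns a minimal
    export document of the kind Special:Import / importDump.php
    consume in fuller form."""
    root = ET.Element("mediawiki")
    for title, text in pages:
        page = ET.SubElement(root, "page")
        ET.SubElement(page, "title").text = title
        revision = ET.SubElement(page, "revision")
        ET.SubElement(revision, "text").text = text
    return ET.tostring(root, encoding="unicode")
```

Importing the result into a fresh install, as suggested above, is far safer than hand-inserting rows into the page table.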
[16:25:23] 14(DUP) Support subproperties in #ask query printouts - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13319 +comment (10mak) [16:25:24] 03(mod) Display subproperties using * operator in queries - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12822 +comment (10mak) [16:26:00] 03yaron * r31792 10/trunk/extensions/SemanticForms/languages/SF_Messages.php: Removed 'sf_forms_adddata' value from all languages - no longer needed [16:26:01] 03raymond * r31793 10/trunk/extensions/HTMLets/ (HTMLets.i18n.php HTMLets.php install.settings): [16:26:01] * Add description message for [[Special:Version]]. [16:26:02] * Clarify path for install.settings [16:27:12] 03(mod) Special:Browse should translate object page and category names - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13309 (10mak) [16:27:38] flyingparchment: nevermind, i can do it on the other side. thanks again! [16:28:18] 03(mod) SMW mb unsafe regexps can produce garbled UTF-8 strings - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13321 (10brion) [16:29:43] 03ialex * r31794 10/trunk/phase3/skins/ (5 files): Fixed comments in skins/*.deps.php, I expect that this is the good mail, the other one returned a 404 not found. 
[16:32:08] 03(NEW) Special:Browse should use SMWStore::getSemanticData - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13325 normal; normal; MediaWiki extensions: Semantic MediaWiki; (mak) [16:34:07] <[blackb]> i have a problem with my wiki [16:34:24] <[blackb]> i have modified the toolbar of the root user [16:34:29] 03raymond * r31795 10/trunk/extensions/NewUserNotif/ (NewUserNotif.i18n.php NewUserNotif.php): * Add description message for [[Special:Version]] [16:34:38] <[blackb]> but other users can't see this modification [16:35:20] <[blackb]> i added in common.js the script extraeditbuttons.js [16:36:02] <[blackb]> http://en.wikipedia.org/wiki/User:MarkS/extraeditbuttons.js [16:38:22] hello, I might be slightly off topic but can someone tell me how I can parse the wiki xml dump in python [16:39:10] I want to use python and extract the content. the problem is that I am confused as to which library/python module I must use to achieve the task. [16:39:16] 03yaron * r31796 10/trunk/extensions/SemanticForms/specials/SF_AddData.php: Added handling for automatic page-name setting [16:45:07] 03yaron * r31797 10/trunk/extensions/SemanticForms/includes/SF_FormPrinter.inc: [16:45:07] Added support for automatic creation of page title, through new [16:45:07] $page_name_formula and $generated_page_name variables [16:47:52] 03yaron * r31798 10/trunk/extensions/SemanticForms/ (INSTALL includes/SF_GlobalFunctions.php): New version: 1.0 [16:48:42] 03mkroetzsch * r31799 10/trunk/extensions/SemanticMediaWiki/includes/SMW_QP_Template.php: Updated to work around bug 12906 [16:50:19] AaronSchulz: removing in rev:31724 was intentional? [16:52:49] 03(mod) New parser breaks some template expansion in Semantic MediaWiki - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12906 +comment (10mak) [16:55:22] 03mkroetzsch * r31800 10/trunk/extensions/SemanticMediaWiki/includes/SMW_QP_List.php: Workaround for bug 12906 as suggested in the bug report. [16:56:49] Heya.
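On the 16:38 question about parsing the wiki XML dump in python: the standard library is enough. `xml.etree.ElementTree.iterparse` streams the file, so even a large dump never has to fit in memory. A sketch over an inline, namespace-free sample; real Wikipedia dumps wrap every element in an export namespace, which is why the tag check below matches a suffix rather than an exact name.

```python
import io
import xml.etree.ElementTree as ET

SAMPLE = """<mediawiki>
  <page><title>A</title><revision><text>alpha</text></revision></page>
  <page><title>B</title><revision><text>beta</text></revision></page>
</mediawiki>"""

def iter_pages(fileobj):
    """Yield (title, wikitext) pairs one page at a time."""
    for _event, elem in ET.iterparse(fileobj, events=("end",)):
        if elem.tag.endswith("page"):  # tolerate namespaced tags
            yield elem.findtext("title"), elem.findtext("revision/text")
            elem.clear()  # release the finished subtree's memory

pages = list(iter_pages(io.StringIO(SAMPLE)))
```

For a real, namespaced dump the `findtext` paths would need the namespace prefix too; the streaming structure stays the same.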
I was wondering what sort of anti-vandalism methods I should implement on my site? I'm looking at the type of anti-vandal protection Wikipedia has [16:58:45] How can I get the number of articles in category X to be displayed in article Y? [17:01:08] !e ConfirmEdit | SuperGeek [17:01:08] --mwbot-- SuperGeek: http://www.mediawiki.org/wiki/Extension:ConfirmEdit [17:01:39] Thanks. [17:01:58] ialex: Well, I'm also looking towards user vandalism [17:02:06] i.e. a user blanking a page with "HAHAHAHAAH SPAM" or something [17:02:56] SuperGeek: these are bots run by some users, they aren't part of MediaWiki [17:03:20] ialex: Oh. Would you mind pointing me to some of these bots? [17:04:49] SysRq868, https://bugzilla.wikimedia.org/show_bug.cgi?id=6943 [17:05:47] Simetrical: ...meaning..? [17:06:10] SysRq868, not present in the software, there's a feature request open for it with a patch you can apply if you like. [17:06:17] Although I don't know if it will apply cleanly. [17:06:42] So if I apply this, I can accomplish this? https://bugzilla.wikimedia.org/attachment.cgi?id=2919 [17:07:41] SysRq868, if it works, yes. [17:08:03] SuperGeek: maybe http://en.wikipedia.org/wiki/User:ClueBot, there's a link at the bottom of the page for the script [17:09:02] ialex: Thanks [17:09:58] Simetrical: Thanks a bunch! :) [17:14:17] 03(mod) Show sort-key in tooltip for Category link - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8538 +comment (10Simetrical+wikibugs) [17:18:34] hey guys [17:19:23] why doesn't {{subst:PAGENAME}} work on German Wikibooks?
[17:20:31] 04(REOPENED) nbsp before colons on fr - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13273 summary; +comment (10alistrius) [17:22:43] ok goes well now [17:22:45] 03(mod) Special:Categories hides categories having 0 current members - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10915 +comment (10Simetrical+wikibugs) [17:22:48] was my browser [17:24:07] 03ialex * r31801 10/trunk/phase3/languages/classes/ (4 files): (in continuation of r31794) Fixed comments in languages/classes/*.deps.php. [17:24:27] 14(DUP) RFE: Display name for pages in category - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11490 +comment (10Simetrical+wikibugs) [17:24:28] 03(mod) Categories need piping feature to list by alternative name - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=491 +comment (10Simetrical+wikibugs) [17:27:17] 03(mod) Cortado speaker icon should become a volume control - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11248 (10brion) [17:27:18] 03(mod) Search by first letters or digits in [[Special:Categories]] - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13095 +easy (10Simetrical+wikibugs) [17:29:03] 03(mod) Transclude list of pages in a category - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13203 summary (10Simetrical+wikibugs) [17:29:13] 03(mod) Should show newly-added hidden categories for preview version - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13275 summary (10Simetrical+wikibugs) [17:30:05] 14(DUP) categories that should be populated by templates using parser functions remain empty more often than not - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12004 +comment (10Simetrical+wikibugs) [17:30:05] 03(mod) Category listings are not updated when deletion of or edits to a template change the category membership of transcluding pages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=5382 +comment (10Simetrical+wikibugs) [17:30:18] AlexSm: probably not [17:31:25] *Simetrical braces for bugstorm [17:31:37] 03(NEW) Things 
that could use a category table (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13326 15enhancement; normal; MediaWiki: Categories; (Simetrical+wikibugs) [17:31:38] 03(mod) Tracking bug (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2007 (10Simetrical+wikibugs) [17:31:54] AaronSchulz: that broke my script ... now I wonder if I can rely on all other classes and IDs [17:32:02] 03yaron * r31802 10/trunk/extensions/SemanticForms/specials/SF_AddData.php: Added initialization of some variables [17:32:10] 03(mod) On paged categories, article totals and subcat totals are incorrect - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1212 (10Simetrical+wikibugs) [17:32:12] 03(mod) Create the Hungarian Wikinews, Erzya & Extremaduran & Gan Wikipedia, and Japanese Wikiversity - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13264 (10kanjy.wm) [17:32:17] 03(mod) Magic word for number of items in a category - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6943 (10Simetrical+wikibugs) [17:32:26] 03(mod) Special:Categories hides categories having 0 current members - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10915 (10Simetrical+wikibugs) [17:32:30] 03(mod) Option to not display [+] if no sub-category exists - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11132 (10Simetrical+wikibugs) [17:35:31] 03(mod) Number of articles in subcategories shown on category page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=3834 (10Simetrical+wikibugs) [17:37:09] 03aaron * r31803 10/trunk/phase3/includes/PageHistory.php: lost, adding back [17:58:17] 03(FIXED) SMW mb unsafe regexps can produce garbled UTF-8 strings - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13321 +comment (10mak) [17:59:30] 03(FIXED) nbsp before colons on fr - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13273 +comment (10Simetrical+wikibugs) [18:00:20] 03simetrical * r31805 10/trunk/phase3/ (5 files in 4 dirs): [18:00:20] * (bug 13273) Fix another hardcoded colon [18:00:20] * 
Fix a hardcoded hyphen while I'm there [18:00:20] * catseparator should be optionally translated, not never translated (although in practice it may be never) [18:00:31] 03mkroetzsch * r31804 10/trunk/extensions/SemanticMediaWiki/ (13 files in 2 dirs): Protect all preg_* against UTF8 hassles using /u, see Bug 13321. [18:05:52] hi [18:08:24] hi what should i do, if i want to upgrade from 1.12 to trunk [18:08:41] 1.11.2 not 1.12 [18:09:12] alecli1, check out trunk on top of your existing install, then run maintenance/update.php. [18:09:15] Same as any installation. [18:09:17] can i just download the trunk files from svn, and do not change the db at all? [18:09:34] You need to run update.php to update the database. [18:11:15] i go 403 forbidden [18:11:17] got [18:11:33] when i try to run wiki/maintenance/update.php [18:11:43] !upgrading [18:11:43] --mwbot-- http://www.mediawiki.org/wiki/Manual:Upgrading [18:11:49] You need to run it from the command line. [18:12:03] Alternatively, I think you can do some funniness with running the install script again. [18:12:15] But if you're using svn, I assume odds are good you have access to a command line. [18:12:58] actually the trunk version is working now. should i run the update.php? [18:15:42] The answer is "yes", but you left. :( [18:16:14] I need a template that replaces é with e (and so forth). Is this possible without Extension:StringFunctions? (no.wikipedia.org) [18:20:03] anybody knows if the database api could connect to a db different to wikidb [18:20:19] Kagee, if it is, it will be so horrible I don't want to hear about it. [18:21:12] Dresden, you mean the Database class? [18:21:19] I'm pretty sure it has constructor parameters. [18:21:22] Well... then i must convince someone to install StringFunctions..... [18:22:52] hi there.. is there any way to raise the session expire time? 
[18:23:40] Simetrical: yes [18:24:24] igormorgado: session.gc_maxlifetime = 3600 [18:24:28] in your php.ini [18:24:31] i only see how to choose master db or slave db [18:24:55] TimLaqua: thank you. seconds based, right? [18:25:05] *TimLaqua shrugs [18:25:14] that's just what mine says [18:25:18] looks seconds based to me [18:25:19] TimLaqua: may I use: ini_set ? [18:25:35] dunno, look it up in the manual [18:25:47] as: ini_set( 'session.gc_maxlifetime', 3600) [18:25:53] ok.. I will check [18:25:56] ty [18:26:04] it depends, ini_set only works for certain vars [18:26:13] well... most vars. ;-) [18:27:10] cause I just want to raise the timeout of a specific wiki (not all of them) [18:27:20] ah, makes sense [18:27:24] my server hosts a few mediawiki pages. [18:27:43] s/pages/websites/g [18:27:46] Dresden, wfGetDB() probably only allows you to choose between server groups. You can call Database::__construct() directly if you want. [18:27:58] 03(mod) Patrollers right - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13118 +comment (10huji.huji) [18:27:58] I will try. Ty one more time TimLaqua [18:28:00] 14(DUP) fawiki requests patrol usergroup - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13304 +comment (10huji.huji) [18:28:09] couldn't hurt. ;-) [18:28:10] Dresden, that allows you to specify server, user, password, DB name, etc. [18:30:44] hi [18:31:02] *amidaniel waves to the channel [18:31:08] i see on some websites that wikipedia is thinking of using ads [18:31:20] *ialex waves back to amidaniel [18:31:27] :D [18:32:01] is this true? http://www.latimes.com/business/la-fi-wikipedia10mar10,0,3216071,full.story [18:32:07] Btw, has everyone seen this niftiness? :D http://leuksman.com/maps/geosearch.php [18:33:04] 03brion * r31806 10/trunk/phase3/ (RELEASE-NOTES includes/SpecialSearch.php): * (bug 2815) Search results for media files now use thumb instead of text extract [18:33:15] Are there instructions on how to trick robots with the e-mail output on the wiki site ?
[18:33:39] asmarin: is there something in particular you're asking, or a point-by-point about every sentence in the article? [18:33:46] somehow so that it appears visible but still not trackable by the robots? [18:34:12] brion: is this article true? or is it a personal opinion? [18:34:28] there's no way i can answer that, asmarin [18:34:37] Simetrical: i could change the location of the db and use it with the normal api? [18:34:58] well...some notices appear on the web [18:35:15] the article describes a lot of things, then states some opinions about those things [18:35:34] tsunami notice effect [18:35:36] asmarin, it has some inaccuracies. The figure of 300 servers is pretty outdated, AFAIK. [18:35:46] yeah i know [18:35:48] The last fundraising campaign featured a video of co-founder Jimmy Wales literally wringing his hands in desperation. <- THE HANDS! [18:36:21] but most important is including commercial banners on wikipedia pages [18:36:23] Haha, i forgot about the hands! [18:36:25] *amidaniel giggles [18:36:44] yauhhh weird english.... [18:37:07] asmarin, the summary of the attitude toward that seems accurate. Ads are not an option at the moment, and don't look likely to become one, because of editors' opinions if nothing else. [18:37:29] ah! this is what i needed to hear [18:37:37] it's an editor opinion [18:38:23] a campaign against wikimedia institutions? [18:38:31] asmarin, the article seems to be broadly accurate. [18:38:36] And fair. [18:38:39] It doesn't say ads will happen. [18:38:45] It cites some *outsiders* thinking they're inevitable. [18:39:04] Some people probably do think that, and it's fair to mention it. [18:39:08] ok ok [18:39:11] *flyingparchment completely supports ads [18:39:12] bloggers! [18:39:33] Is there a manual on how to trick robots with the e-mail output on the wiki site ? (anti spam) [18:39:36] The quotation of Gardner saying "never say never" on ads is also legit, AFAIK. But it's not up for discussion now.
[18:39:44] flyingparchment, for once we actually agree on something! [18:42:07] Simetrical: uh.. really? i meant to say, i totally oppose ads! [18:42:16] . . . [18:43:45] 03huji * r31807 10/trunk/phase3/languages/messages/MessagesFa.php: * Adding/updating Persian translations [18:47:29] 03raymond * r31808 10/trunk/extensions/FlaggedRevs/FlaggedRevsPage.i18n.php: [18:47:29] Localisation updates German. [18:47:29] Some tweaks from http://de.labs.wikimedia.org/wiki/Wikibooks:Bugs [18:49:52] 03(mod) Install the StringFunctions extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10post) [18:50:47] Could someone confirm if this plugin EmailTag is working well - http://www.mcmilk.de/wiki/Wiki_EmailTag [18:50:50] Simetrical: that constructor Database::__construct() - it's an internal class, it's used by the api, isn't it? to use it do i only have to include it ? [18:51:04] Dresden, what API are you talking about? [18:51:16] wfgetDB... [18:51:23] yang, you should ask the author, probably. Most people here will not know about unofficial extensions. [18:52:14] http://www.mediawiki.org/wiki/Manual:Database_access that [18:52:16] Dresden, the Database class is the class that's returned by wfGetDB(). You can instantiate it manually if you want instead of using that helper function. [18:53:57] booyah [18:53:57] http://commons.wikimedia.org/wiki/Special:Search?ns0=1&ns6=1&ns9=1&ns11=1&ns14=1&search=power+mac+g5&fulltext=Search [18:54:14] :D [18:54:27] i could see how wfgetdb calls the constructor method and change the location so i could use the api [18:54:46] Dresden, er, what is it you're trying to do here? [18:55:02] Simetrical: I would like an "official" e-mail anti spam [18:55:12] +filter [18:55:34] want to use the api only changing the location of the db [18:55:51] at the moment i don't want to use the class [18:56:12] Dresden, why do you want to change the location of the DB? [18:56:20] yang, What do you mean by e-mail anti-spam filter?
[18:56:48] 03evan * r31809 10/branches/openid_2.0_update/OpenID/ (7 files): [18:56:48] Sloppy mixed-up commit [18:56:48] Lots of bits and pieces in this commit. Fixed last of the errorpage() [18:56:48] uses. Added considerably longer documentation string for login (this [18:56:49] has been a huge issue so far). Changed login page so that [18:56:51] documentation goes ''after'' the login box. Added some more TODOs, [18:56:55] checked some off. [18:57:42] because i don't want to have wikidb and my tables together [18:58:32] Simetrical: something that doesn't display the email address to the robots [18:58:50] yang, where does MediaWiki ever display e-mail addresses anyway? [18:59:01] Simetrical: if i put mine on the wiki site [18:59:04] Dresden, what are your tables? [18:59:17] Simetrical: as contact [18:59:23] 03(FIXED) Image page search results should include thumbnails - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2815 +comment (10brion) [18:59:35] yang, I'm not aware of such an extension, but it would be easy enough to write. [19:00:08] Simetrical: i have mine now entered as yang AT domain DOT net - some plain users don't know how to resolve it [19:02:04] just in case i don't want to damage wikidb and for security reasons i think it's better [19:03:12] Dresden, you can probably just give table names like yourdbname.tablename when you use the query functions. You shouldn't need to hack wfGetDB().
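On yang's address-harvesting question: since no official extension comes to mind, here is the usual trick in outline, shown as Python purely for illustration (a MediaWiki extension would do the same in PHP while rendering the page). The address is emitted as HTML character references: browsers display and mailto-link it normally, but a harvester grepping raw HTML for user@host patterns misses it. It does not stop a robot that decodes entities, so it's a speed bump rather than real protection.

```python
def entity_encode(addr):
    """Encode every character as a decimal HTML character reference."""
    return "".join("&#%d;" % ord(ch) for ch in addr)

def mailto_link(addr):
    """Build a clickable mailto link with both the href and the
    visible text obfuscated; the raw address never appears in the
    emitted HTML."""
    encoded = entity_encode(addr)
    return '<a href="mailto:%s">%s</a>' % (encoded, encoded)
```

The "yang AT domain DOT net" spelling mentioned above trades the opposite way: robust against all robots, but confusing for the plain users yang mentions.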
[19:04:28] that would be great [19:06:42] grrr, my tabs are in the air [19:06:55] *AaronSchulz wonders why that's happening [19:09:43] 03brion * r31810 10/trunk/phase3/includes/SpecialSearch.php: Workaround for missing pages that get returned in search results (not-up-to-date index) [19:09:51] ?>\n\n [19:09:55] there we go [19:10:02] that was weird [19:10:13] *AaronSchulz kicks localsettings.php [19:10:59] Stupid question: I get an error about php.exe not being able to read ParserTests.txt [19:11:32] (this didn't happen a while ago, my ParserTests.txt is "NOT" on an NTFS drive, so it's not due to permissions error) [19:11:36] any idea? [19:12:14] Byte-Order Mark found in UTF-8 File. [19:12:16] hmm [19:12:41] Hojjat, what kind of drive *is* it on? [19:13:09] FAT [19:13:11] FAT 32 [19:13:19] And I'm on Windows XP Sp2 [19:14:28] brion: does the flaggedrevs msg file have a BOM? [19:15:19] sec [19:15:19] AaronSchulz: I think it doesn't [19:15:30] AaronSchulz, you don't have a hex editor handy? [19:15:45] AaronSchulz: or it "didn't" to be more accurate. My text editor had troubles with it [19:16:07] 03brion * r31811 10/trunk/extensions/FlaggedRevs/FlaggedRevsPage.i18n.php: kill BOM [19:19:43] 03evan * r31812 10/branches/openid_2.0_update/OpenID/ (OpenID.setup.php README TODO): A little more documentation. [19:27:29] hi, i have problems installing mediawiki on mac os x 10.5 [19:27:44] the error is: Failed to parse (PNG conversion failed; check for correct installation of latex, dvips, gs, and convert): [19:28:08] a11235: you may need to install ImageMagick [19:28:15] i have all of them [19:28:33] (i use latexrender in a phpbb forum) [19:28:36] where is your convert binary? [19:28:56] of what? 
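The byte-order mark that bit AaronSchulz above (fixed in brion's "kill BOM" commit, r31811) is just a three-byte EF BB BF prefix. A small standalone sketch for spotting and stripping one; the filename is only an example:

```php
<?php
// Detect and strip a UTF-8 byte-order mark (EF BB BF) from a file.
// The filename here is only an example.
$file = 'FlaggedRevsPage.i18n.php';
$text = file_get_contents( $file );
if ( substr( $text, 0, 3 ) === "\xEF\xBB\xBF" ) {
	file_put_contents( $file, substr( $text, 3 ) );
	echo "BOM stripped\n";
}
```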
[19:29:04] ah [19:29:06] one sec [19:29:09] which convert [19:29:17] /opt/local/bin/convert [19:29:41] i wonder if mediawiki can't find it for some reason [19:30:01] might have a different path for the Apache user [19:30:12] try setting http://www.mediawiki.org/wiki/Manual:%24wgImageMagickConvertCommand [19:30:19] a11235: usually the PATH doesn't include /sw/bin or /opt/local/bin at bootup [19:30:28] try manually doing a putenv call in your LocalSettings.php [19:30:40] the tex stuff requires its tools to be in PATH [19:30:49] *brion-lunch hit-and-run tech support, boo-yah [19:31:25] i had even changed the first lines in render.ml [19:31:37] let cmd_dvips tmpprefix = "/opt/local/bin/dvips -R -E " ^ tmpprefix ^ ".dvi -f >" ^ tmpprefix ^ ".ps" [19:31:38] let cmd_latex tmpprefix = "/opt/local/bin/latex " ^ tmpprefix ^ ".tex >/dev/null" [19:31:38] let cmd_convert tmpprefix finalpath = "/usr/local/bin/convert -quality 100 -density 120 " ^ tmpprefix ^ ".ps " ^ finalpath ^ " >/dev/null 2>/dev/null" [19:31:38] let cmd_dvipng tmpprefix finalpath = "/opt/local/bin/dvipng -gamma 1.5 -D 120 -T tight --strict " ^ tmpprefix ^ ".dvi -o " ^ finalpath ^ " >/dev/null 2>/dev/null" [19:32:10] Dear SVN: [19:32:12] I do not love you. [19:32:14] Signed, Evan [19:32:55] Better watch out for deadly neurotoxin now [19:33:19] wow [19:33:25] brion: it worked [19:33:35] thank you so much [19:33:42] Dear SVN, nevermind e_s_p's complaints, we all love and appreciate you! Don't go depressive! Signed, the rest [19:36:34] 03evan * r31813 10/branches/openid_2.0_update/OpenID/README: note changes for autoloading [19:37:01] Mbimmler: Quisling! 
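brion's fix above boils down to two LocalSettings.php lines; a sketch assuming MacPorts paths like a11235's (adjust the paths to your install):

```php
# LocalSettings.php sketch: tell MediaWiki exactly where ImageMagick lives,
# and put the TeX toolchain (latex, dvips, dvipng) on the web server's PATH,
# since texvc invokes those tools by bare name. Paths assume MacPorts.
$wgImageMagickConvertCommand = '/opt/local/bin/convert';
putenv( 'PATH=/opt/local/bin:/usr/local/bin:' . getenv( 'PATH' ) );
```

This avoids editing render.ml at all, since the real problem is that the Apache user's PATH omits /opt/local/bin.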
[19:37:42] 03(mod) Query does not work with Category:Cat:Cat - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13327 15enhancement->normal (10dasch_87) [19:39:02] Wow [19:43:45] 03(mod) Corrupted log entries - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13013 (10mike.lifeguard) [19:45:08] 03evan * r31814 10/trunk/extensions/OpenID/ (17 files): [19:45:08] Merge changes from the 2.0 update branch [19:45:08] Merge in changes from the 2.0 update branch. [19:50:57] 03evan * r31815 10/tags/extensions/OpenID/REL_0_8_0/: Tagging the 0.8.0 version of the OpenID extension [19:51:05] Man, I hoped I spelled that right [19:51:31] woohoo! It worked! [19:59:02] 03raymond * r31816 10/trunk/extensions/FlaggedRevs/FlaggedArticle.php: [19:59:02] No labels after event handler, see http://www.w3.org/TR/html401/interact/scripts.html#h-18.2.3 [19:59:02] Thanks to [[de:User:elya]] [20:01:11] 03(NEW) Query does not work with Category:Cat:Cat - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13327 15enhancement; normal; MediaWiki extensions: Semantic MediaWiki; (dasch_87) [20:02:26] anyone familiar with the namespace manager extension? [20:02:50] I'm wondering if there is some way to file things under a namespace but have the namespace: "hidden" on the page title [20:03:09] probably impossible, but it would be very helpful for me [20:03:49] 03raymond * r31817 10/trunk/extensions/FlaggedRevs/ (5 files in 2 dirs): [20:03:49] Some tweaks for the short UI box: [20:03:49] * Make the icons a little bit smaller [20:03:49] * Put the box on the h1 line (tested with FF 3b4, IE7, Opera 9.26, Safari 3) [20:03:51] Thanks to [[de:User:elya]] [20:41:55] dang [20:42:07] MW is ignoring my special page for some reason [20:42:13] Quite bothersome! [20:51:00] what is the ideal mediawiki hacking music? 
[20:52:58] I hate everything about you - Three Days Grace [20:53:08] It portrays the mood pretty well :D [20:53:13] :) [20:55:00] 03erik * r31818 10/branches/wikidata/includes/Namespace.php: make custom namespace handlers unmovable for now. [20:55:15] *brion cranks up Styx [20:59:07] was wondering if someone could help me with this --> http://rafb.net/p/26fyM242.html [20:59:33] if it doesn't work i dunno :) [20:59:34] basically, simple parser function for labeled section transclusion tags [21:00:57] {{#section:1|oneoneone}} [21:01:00] becomes [21:01:01] <section begin=1 />oneoneone<section end=1 /> [21:01:03] in article text, instead of "oneoneone" [21:01:08] cu [21:01:21] hrm [21:01:48] renders appropriately in Special:ExpandTemplates though, btw using 1.12 [21:01:57] ok, the problem with that thesizz is that i'm pretty sure the way lst works is to look for the exact <section>
      tags in the raw unparsed text [21:02:08] but.... i dunno :) [21:03:00] any suggestions? [21:04:16] well, i gues syou'd have to change the bit that goes and fetches out the sections to recognize this thing as well [21:05:43] 03evan * r31819 10/trunk/extensions/OpenID/ (5 files): Some post-release bugs (doh!) [21:06:18] 03evan * r31820 10/tags/extensions/OpenID/REL_0_8_0/OpenID/: Re-tagging new version 0.8.0. [21:06:42] 03(NEW) On Special: Newpages please add button to view oldest non-patrolled pages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13328 15enhancement; normal; Wikimedia: General/Unknown; (sbowers) [21:07:32] Hey everybody, I'm sure this has been answered before, but I couldn't find it in the FAQ's: I would like to have a url path like http://www.example.com/wiki/Page instead of http://www.example.com/wiki/index.php?title=Page. Is is possible to change this after MediaWiki is already installed? [21:10:06] http://www.mediawiki.org/wiki/FAQ#How_do_I_make_my_base_URLs_shorter.3F_.28i.e._.2Fwiki.2FArticle_Name_as_opposed_to_.2Fw.2Findex.php.3Ftitle.3DArticle_Name.29 [21:11:35] @brion, so my goal of using templates & parser functions to create sections readable by LST is basically impossible [21:12:06] thesizz: i'm pretty sure that technique won't do the job, no [21:16:03] Thank Brion...I'll check it out and then I'm sure I'll be back with more questions :-) [21:19:19] good luck! [21:23:32] 03(NEW) Field tab order broken in Firefox 3 beta - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13329 15enhancement; normal; MediaWiki: Special pages; (brion) [21:23:41] anybody know when WordPress 2.5 will be out? I thought it was supposed to be released yesterday [21:24:03] brion: could you activate roan for me? [21:24:11] dunno, i just wait until my brother updates his blog, then blogs about the new version of wordpress ;) [21:24:28] AzaTht: activate in what way? 
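For reference, a 1.12-era parser function like thesizz's {{#section:...}} is wired up roughly as below. The function and magic-word names are invented for this example, and, as brion notes above, this technique won't actually help: LST scans the raw wikitext, so markers emitted at parse time are never seen by it.

```php
# Illustrative sketch of a MediaWiki 1.12-style parser function.
$wgExtensionFunctions[] = 'wfSectionFuncSetup';
$wgHooks['LanguageGetMagic'][] = 'wfSectionFuncMagic';

function wfSectionFuncSetup() {
	global $wgParser;
	$wgParser->setFunctionHook( 'section', 'wfSectionFuncRender' );
}

function wfSectionFuncMagic( &$magicWords, $langCode ) {
	$magicWords['section'] = array( 0, 'section' ); // 0 = case-insensitive
	return true;
}

function wfSectionFuncRender( $parser, $name = '', $text = '' ) {
	// Emits LST-style markers -- but only into the *parsed* output,
	// which is exactly why LST never sees them.
	return "<section begin=$name />$text<section end=$name />";
}
```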
[21:24:41] activate his presence here ツ [21:25:30] 03siebrand * r31821 10/trunk/extensions/OpenID/ (12 files): [21:25:30] * use wfLoadExtensionMessages [21:25:30] * add descriptionmsg [21:25:30] * some reformatting in i18n messages [21:25:30] * remove EOL whitespace [21:25:31] * svn prop eol-style:native where not set [21:25:33] FIXME: Strict Standards: Non-static method SpecialOpenID::LocalizedPageName() cannot be called statically [21:28:00] there is portal, but it changes sometime page instead of /1 (which exist), it goto /2 or /3 or /4 or /5 who dont exist, why? [21:28:08] anyone know what i mean? [21:28:17] you know why it is like that? and if it can be fixed? [21:28:20] this is on Wikipedia [21:34:42] 03(mod) Allow title for new comments to come from the URL - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13100 (10AlexSm) [21:44:15] 03(mod) Search result page numbers should be rounded to nearest integer - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13297 +comment (10brion) [21:44:59] rainman-sr: i was trying to build lucene-search-2.1 earlier today [21:45:07] it seems to be missing the spell checker libs? [21:46:13] Wow [21:46:23] Thanks, whoever threw a lot of conflicts into the OpenID extension [21:46:26] question about user account creation [21:46:27] I appreciate your work [21:46:32] :) [21:46:53] once the data is entered in the field [21:47:05] and then submitted where is it handled by the server. [21:47:20] I saw that the action="index.php?..." but I can't find where that goes [21:47:26] ideas? [21:47:33] I have a security concern right now, with the account creation captcha [21:47:52] Soxred93: pm me about it [21:47:53] 03(mod) dsb: interwiki prefix missing on srv60 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13017 (10brion) [21:48:02] Oh, that'd be siebrand [21:48:04] Once again [21:48:10] hmmm [21:48:51] e_s_p: you're Evan? [21:48:55] siebrand: yes [21:49:01] e_s_p: what conflicts? 
[21:49:09] 03(mod) Add portal page at wap.wikipedia.org - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13020 summary (10brion) [21:49:10] All the changes you commited [21:49:12] *yomshleeshee twiddles thumbs [21:49:17] I was working on the same code [21:49:27] I mean, I appreciate the effort, but we've had this problem before [21:50:04] I mean, I'm out of luck, you're the faster gun [21:50:08] e_s_p: yes, then you said you were working on it. Now you committed the update, so I did some work to be able to add it to Translate. [21:50:14] I'll have to be the one to figure out the conflicts [21:50:23] Yeah, thanks [21:50:29] I was doing equivalent work [21:50:37] e_s_p: I installed it, by the way. There are a lot of strict issues in that openid php stuff. [21:50:53] e_s_p: and 1 FIXME in the extension itself. [21:50:56] brion, it's not really needed, you can just remove the imports/possible test files and it should work, i added that at some point for testing but never checked it in [21:51:01] yeah I couldn't find it either... [21:51:05] Yeah, I was fixing those [21:51:16] have another question -- does anyone know a way to do full-text search in mediawiki that will export to xml? [21:51:25] e_s_p: you also maintain the php_openid thingie? [21:51:40] no [21:51:48] or rss/atom/whatever [21:51:58] e_s_p: are you going to send a patch upstream? [21:52:09] the opensearch only searches for page titles it seems [21:52:44] rainman-sr: hmm, well can you fix the build file then? [21:52:48] it's nice if the code... works... :) [21:53:04] so no one knows huh? [21:53:10] 03dale * r31822 10/trunk/extensions/MetavidWiki/includes/ (MV_Title.php specials/MV_SpecialExport.php): relative url made absolute for roe export [21:53:19] brion: "Resolving WONTFIX... :)" :-P [21:53:37] e_s_p: 3x "Assigning the return value of new by reference is deprecated" is the worst, I think. 
4x "Declaration of x() should be compatible with that of y()", 1x "Non-static method xt() should not be called statically" [21:53:37] just because you can't fix it doesn't mean you shouldn't try ツ [21:53:54] brion, yeah, i'm working on that.. that's the last piece i need to do - test incremental updater, and make the updates/rebuilds/configuration nicer, and actually work without my work copy [21:54:19] brion: in theory, you should pass it upstream [21:54:30] AzaTht: feel free [21:54:44] *yomshleeshee feels ignored [21:54:53] you can get the missing files from /home/rainman/lucene-search-src-2.1.tar.gz on zwinger [21:55:11] siebrand: so, you're saying there are problems in the extension, or in the libraries it uses? [21:55:16] yomshleeshee: don't understand the question [21:55:30] ok great I'll restate it [21:55:32] rainman-sr: i'd rather see them in svn :) [21:55:38] brion: do you know when using the API::allpages and using apprefix, could it be that it will missa pages? [21:55:49] AzaTht: no clue what you're talking about [21:56:01] So when the form for the create user account page is submitted, how is it handled? [21:56:05] brion: like http://en.wikipedia.org/w/api.php?action=query&list=allpages&apfilterredir=nonredirects&apnamespace=4&apprefix=Articles%20for%20deletion/List%20of%20shopping%20malls%20in%20the%20United%20States [21:56:25] gotten reports that my script sometime overwrite pages, but I can't find why [21:56:25] the action in the form is set to index.php and i can't find where in the php code it handles the account creation [21:56:43] brion: and by the way, have you activated apiedit on test? [21:56:56] no, i have not activated apiedit anywhere and am in no hurry to do so [21:57:05] hmm [21:57:10] e_s_p: both. [21:57:11] test.wikipedia.org... [21:57:19] siebrand: OK, I'll get right on that. 
[21:57:26] e_s_p: I'll paste you the issues [21:57:36] siebrand: send them in an email [21:57:43] evan@prodromou.name [21:57:46] e_s_p: *nod* [21:57:56] e_s_p: still have your mail somewhere [21:58:02] so If I wanted to add a new field to the create account page, where would I add the action part of it? [21:58:58] yomshleeshee: Well, personally, if I wanted to add a field, I'd find an extension. [21:59:02] But that doesn't really answer your question... [21:59:28] I'm trying to make one [21:59:44] do you know how to install wikiwyg on mediawiki 1.11 ? [22:00:06] 03(mod) nbsp before colons on fr - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13273 (10alistrius) [22:00:19] I want to make account creation based on an invite code. So I can invite people to the wiki [22:00:27] 03brion * r31823 10/trunk/phase3/includes/SpecialExport.php: Fix fatal error in Special:Export -- invalid title when using Template inclusion [22:00:40] so I need to intercept, sort to speak, the create account information to make sure they have to invite code [22:00:54] 03brion * r31824 10/branches/REL1_12/phase3/includes/SpecialExport.php: Merge r31823 from trunk -- fix for Special:Export fatal erorr [22:00:54] I can't find here the account information is gathered [22:01:09] e_s_p: you've got mail. [22:01:27] yomshleeshee: that stuff's all in Special:Userlogin [22:01:35] poke around the code and the available hooks [22:02:48] 03evan * r31825 10/trunk/extensions/OpenID/ (OpenID.i18n.php OpenID.setup.php SpecialOpenID.body.php): [22:02:48] I was making similar changes to siebrand's; they seem to be roughly [22:02:48] compatible. [22:02:52] brion: I think i see it. I'll check it out. [22:04:43] 03brion * r31826 10/branches/REL1_12/phase3/ (RELEASE-NOTES includes/DefaultSettings.php): bump to 1.12.0rc1 [22:04:53] easy question where is "$this" defined? 
[22:05:09] yomshleeshee: $this is always the current object/class [22:05:16] lol I know [22:05:17] yomshleeshee: i think you need to start at http://us.php.net/manual/en/language.oop5.php [22:05:23] I mean when are the extra fields added [22:05:36] *yomshleeshee signs [22:05:40] *yomshleeshee sighs [22:06:00] oboy [22:06:02] I know what $this is i mean there are fields that are added in the mediawiki code [22:06:03] rc1 [22:06:15] yomshleeshee: scroll up until you find the class definition [22:06:18] ah found it [22:06:19] thanks [22:06:22] then slowly read down looking at the various code [22:08:51] 03brion * r31827 10/tags/REL1_12_0RC1/: tag 1.12.0rc1 [22:09:50] thanks for the help [22:10:17] brion: \o/ [22:13:51] ooh, we're back to release candidates? [22:14:01] we haven't had those since like 1.5 [22:15:11] ... we had them in 1.11. [22:15:57] siebrand: yikes! There are a ton of strict errors [22:16:12] *siebrand smiles mildly at e_s_p. [22:16:24] I think I've fixed the main one (static method in the SpecialOpenID class) in the extension itself [22:16:30] e_s_p: developers should enable that more often... [22:17:56] yeah [22:17:56] I'm not sure what the response is going to be from upstream, though [22:18:47] e_s_p: you could start by letting them know. Check if they are open to input. Then at least you will not invest the time for nothing. [22:21:07] 03evan * r31828 10/trunk/extensions/OpenID/SpecialOpenID.body.php: static method declaration [22:21:43] 03siebrand * r31829 10/trunk/extensions/WhiteList/ (4 files): [22:21:43] * change i18n file formatting for L10n project support [22:21:43] * add descriptionmsg [22:21:43] * fix indentation [22:22:28] 03brion * r31830 10/trunk/release-tools/make-release.sh: bash scripting sounds so easy at first, then you have to do some fucking string manipulation and it's like wtf jesus [22:22:57] ouch. That sounds like frustration, brion... 
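Back to yomshleeshee's invite-code question: rather than patching SpecialUserlogin.php directly, the usual route is a hook. A sketch using the AbortNewAccount hook (present in MediaWiki of this era); the `wpInviteCode` request field and `wfIsValidInviteCode()` are hypothetical, and you would still need to add the field to the signup form itself:

```php
# LocalSettings.php / extension sketch. AbortNewAccount lets you veto a
# signup; the field name and the validity check are invented for this example.
$wgHooks['AbortNewAccount'][] = 'wfCheckInviteCode';

function wfCheckInviteCode( $user, &$message ) {
	global $wgRequest;
	$code = $wgRequest->getText( 'wpInviteCode' );
	if ( !wfIsValidInviteCode( $code ) ) { // e.g. look it up in your own table
		$message = 'A valid invite code is required to create an account.';
		return false; // abort account creation
	}
	return true;
}
```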
[22:23:13] :D [22:23:25] hahaha [22:23:29] :o [22:23:30] nah, you should have heard me when i discovered i'd lost a year's worth of changes to the release script ;) [22:23:35] O_O [22:23:48] never bothered to check it in, and stuck it in a directory i forgot to back up ;) [22:24:22] e_s_p: you could get rid of loadMessages completely. Not really doing anything special.. Just use wfLoadExtensionMessages where needed. [22:24:42] e_s_p: now it is just another trail to be followed... [22:25:32] brion: ow. that's nasty, too. [22:26:01] but no worries, i rebuild it better... stronger... faster... [22:26:03] i had the technology [22:26:58] siebrand: yes, I'll do that eventually [22:26:59] brion: changing topic... THere's been talk on an image backend rewrite. Is someone working on it, or will someone in your team work on it? [22:27:23] siebrand: tim did a huge amount of stuff on that over the last year :) [22:27:35] there's still various things to consider as far as the actual _storage_ backend part [22:27:46] now that the middle classes are there to allow it [22:28:22] While on the topic, does anyone know the story behind using the first and then the first+second letters of the hash for file directories? Instead of just first, then second? [22:28:28] brion: the thing I remember was about not storing as uploaded name, but something hashy, making links to the files from Image: more easy. [22:28:45] anybody knows with the database api i could access to other db than wiki? [22:29:04] brion: so rename and multiple 'file names' would be easy. [22:29:26] siebrand: *nod* that's one thing i hoped us to be able to do [22:30:24] I'm writing a bot that uses api.php for queries. Is there a significant performance difference between multiple queries that handle only one page each and putting together some sort of batch processing in the bot to put together a single request that covers multiple pages? 
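As for the hashed file directories brion and siebrand were discussing above: the layout is derived from the MD5 of the (underscored) filename, using first one hex digit, then two. A standalone sketch:

```php
<?php
// Sketch of MediaWiki's hashed upload paths: the first hex digit of the MD5
// of the filename names the top directory, the first two the subdirectory.
$name = 'Example_image.jpg';
$hash = md5( $name );
echo $hash[0] . '/' . substr( $hash, 0, 2 ) . '/' . $name . "\n";
// prints something of the shape "a/ab/Example_image.jpg"
```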
[22:31:10] 03evan * r31831 10/trunk/extensions/OpenID/ (OpenID.setup.php SpecialOpenID.body.php): Delete SpecialOpenID::loadMessages completely [22:31:29] Pingveno: multiple requests tend to have greater overhead, but you may wish to test your particular case [22:31:30] Pingveno: I'd think there'd be a really significant performance benefit [22:31:43] okay [22:32:43] It's a bit difficult to do batch processing, so I'd rather not be spending extra time for only a very slight speedup. [22:32:56] 03siebrand * r31832 10/trunk/extensions/Translate/ (MessageGroups.php Translate.php): Add support for HTMLets, OpenID, WhiteList [22:33:52] brion: first impression of the new old search page layout: the actual search field is wedged uncomfortably at the bottom. [22:34:06] it kind of looks like it got dropped and no one bothered to pick it up :) [22:34:10] Duesentrieb: yes, that's always kind of sucked :) [22:34:30] the namespace checkboxes also suck big time, always have, but that's the same on the other plugin :) [22:34:34] especially after doing a search. "uh, where did it go?!" [22:34:53] hi I'm trying to put a background image to a div [22:34:59] brion: but thanks for the thumbnails. that rocks! [22:35:12] Edulix: background images are blacklisted from style attributes [22:35:25] Duesentrieb: :( [22:35:31] Duesentrieb: what can I do then? [22:35:34] Edulix: they are easy to abuse. use a css class and assign the background in the css page [22:35:42] !css [22:35:42] --mwbot-- To change styles for your wiki, go to one of the MediaWiki:xxx.css wiki page and put your custom styles there (sysop/admin rights required). MediaWiki:common.css is for all skins and should be used for content styles. MediaWiki:monobook.css is for the MonoBook skin (default), etc. 
For more information, see !skins and [22:36:05] 03siebrand * r31833 10/trunk/extensions/ (2 files in 2 dirs): Fix a couple of typos [22:36:57] Duesentrieb: it could be good if they were allowed but only for images stored locally in the server [22:37:31] Edulix: yes, that would be nice. [22:37:43] relatively simple to do too, i guess. [22:37:56] Edulix: file a feature request :) look first if it already exists [22:39:22] brion: another quick comment: in the individual rows of the result, the row number is bottom-aligned, and the title is centered vertically. that looks messy, nicer to have both either at the top. [22:39:40] Duesentrieb: it behaves different between different browsers [22:39:42] *Duesentrieb is pondering a "topic search" feature based on his thesis work [22:39:45] i didn't spend much time on it yet tho [22:39:59] yea, should be simple to fix with a bit of css tweaking [22:40:12] just spilling ideas [22:41:27] hm... for an image-centered place like commons, it would rock to show thumbs for "some" pictures from each gallery page and category too. but that's not so trivial... [22:42:02] *nod* [22:42:15] in theory it would be easy to pull a few items from categorylinks or imagelinks [22:42:45] the idea of a "caption image" for pages and categories, to be used in category listings, was about some time before. not sure where i saw it. would need some db work i guess. hm... wait... we have page_props now. that might be usable, if it was expanded to key->value instead of just flags. [22:43:21] nod [22:43:24] brion: yes, easy enough - but wouldn't that mean one more query to run for each item in the result list? [22:43:31] pulling from imagelinks runs the risk of including template icons :) [22:43:42] yes [22:43:44] what can be, I cant edit pages. 
Then I trying to edit always (in all browsers) I got dialog to download index.php file :/ [22:44:02] Min2liz: you have broken rewrite rules [22:44:08] !rewriteproblems | Min2liz [22:44:08] --mwbot-- Min2liz: 1) Try as a fail-safe method; 2) Do not put the files into the document root; 3) Do not map the pages into the document root; 4) Use different paths for real files and virtual pages; 5) Do not set a RewriteBase; 6) Put all rules into the .htaccess file in the document root. [22:44:32] Min2liz: most importantly, look into point 4 (and 2 and 3 in relation to that) [22:44:56] hm ok I'll try [22:46:11] o_O http://commons.wikimedia.org/wiki/Image:WTF_Sketch.JPG [22:47:38] 03siebrand * r31834 10/trunk/extensions/Collection/Collection.i18n.php: [22:47:38] * article -> page [22:47:38] * fix indentation [22:50:34] e_s_p: what is "openidusernameprefix" for? [22:50:47] e_s_p: should that be an 'optional' or 'ignored' message? [22:51:08] e_s_p: optional: only translate if script differs (roughly), ignored: should not be translated. [22:56:58] I really really need help, I've searched to the point of my brain exploding [22:57:10] does anyone know if you can hide namespace: on specific category pages [22:57:43] for example, I have a namespace for email archives, as well as a category for email archives, when you go to that category everything is indexed under E (for Email Archives) rather than the actual page name [22:58:15] so, I want the Category pages to ignore namespaces [22:58:19] ah. Just learned about Google Summer of Code. Might be worth looking, actually :) [22:59:09] munda: hiding it is a separate issue from whether it's used for sorting [22:59:26] Is there any documentation that lists all of the tags that show up in api.php?format=xml results? 
[22:59:42] munda: i don't know about hiding, but you can supply a sortkey when categorizing: [[Category:Foo|Einstein, Albert]] [22:59:50] 03dale * r31835 10/trunk/extensions/MetavidWiki/includes/ (MV_GlobalFunctions.php specials/MV_SpecialExport.php): category and search rss link to stream view for temporal metadata context [23:00:18] hmm, I will give that a try [23:00:29] I just want the category page to properly index regardless of namespace [23:00:39] I'm probably asking all wrong, I am a wikinewb [23:00:47] munda: to do it for multiple categories (on a single page) use the DEFAULTSORT magic word [23:00:59] thank you so much! [23:01:34] munda: no, you are asking it right, and it would be nice to be able to switch it per category. and, actually, ignoring the namespace should be the default. sadly, the only way possible currently is to specify the sortkey on each page using the category. [23:01:54] well as long as it's possible that works for me [23:03:17] munda: well, it's kind of silly to add the very same thing to each page: {{DEFAULTSORT:{{PAGENAME}}}} [23:03:24] that's ll do though :) [23:03:44] Duesentrieb not working. i dont user any htaccess or similar like that. Now I deleted all old files and added new, but still same. Want to download file :/ [23:03:45] you can even put it into a template i think. then you just put {{catfix}} on each page [23:04:13] user = use [23:04:43] Min2liz: URL? [23:05:32] http://85.232.143.6/test/index.php/Pagrindinis_puslapis [23:07:45] Min2liz: works fine for me: http://85.232.143.6/test/index.php?title=Pagrindinis_puslapis&action=edit [23:07:52] Min2liz: or what was your problem again? [23:08:20] I cant edit files, always get dialog to download index.php file [23:08:55] hm, it does work for me. very odd. oh wait. 
[23:09:11] go to your preference page, to the edit tab, and see if you have "external editor" enabled [23:09:15] uncheck it [23:10:08] (people have reported that it suddenly gets enabled on its own - though i have no clue how that could happen, and have never actually seen it happen) [23:10:33] maybe just a mis-click when changing "minor per default" [23:12:41] brion: hahaha - e.V.'s weblog says "Wikipedia beats Duke Nukem" (Flagged Revisions activated on test wikis) [23:14:07] :D [23:14:26] ok now works, but dont show FCKeditor just textarea with content :) [23:15:13] Min2liz: can't help you with FCKeditor, sorry [23:17:57] it could be possible with the database api access to other db than wiki i've been looking at web but it doesn't say anything [23:19:01] Dresden: yes. if the database is on the same db server, the simplest thing to do is to just write your db queries to qualify all table names with the database name - that should work at least on mysql. [23:20:23] Dresden: if you want a separate db connection, you can create an instance of the right wrapper class on your own. [23:21:03] Dresden: you can use Database::newFromParams to construct a new mysql wrapper, or DatabasePostgres::newFromParameters for PG [23:21:13] it depends on the difficulty [23:21:27] other databases are not supported. there's a wrapper class for Oracle, but it's unmaintained. maybe it still works, who knows [23:21:54] Dresden: hm? difficulty of what? [23:21:55] the easiest way it's to create the tables on the same wikidb [23:22:40] but i want to create a separate db in the same server [23:24:35] the problem is that i don't know if i use wfGetDB... then it connects to the wikidb [23:25:26] then if i want a query to another db i should do select ... from "database".table ?? [23:36:05] Dresden: yes, exactly [23:36:23] Dresden: wfGetDB doesn't connect, btw. it gives you the connection that is already there. 
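Duesentrieb's second option, a dedicated connection, looks roughly like this. The exact Database::newFromParams signature varies between MediaWiki versions, and all credentials and names here are placeholders:

```php
// Sketch only: open a second connection via MediaWiki's wrapper class
// instead of reusing the wiki's own. Placeholder credentials and names.
$extraDb = Database::newFromParams(
	'localhost',  // DB server
	'wikiuser',   // user
	'secret',     // password
	'mydb'        // the separate, non-wiki database
);
$res = $extraDb->query( 'SELECT code FROM invites WHERE used = 0', __METHOD__ );
```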
[23:37:38] 03brion * r31837 10/trunk/extensions/DidYouMean/DidYouMean.php: [23:37:38] Let the cleanup begin! [23:37:38] * Remove mystery UTF-8 decoding on debug stuff [23:37:38] * use rawurlencode, not urlencode; page names may legitimately include + [23:38:28] Is there some sort of documentation for api.php that focuses on descriptions for every type of tage in the api.php?format=xml output? [23:39:05] 03brion * r31838 10/trunk/extensions/DidYouMean/DidYouMean.php: Don't use generic function names in your extension, you may confict [23:40:07] I've gone through the description of the different types of queries provided by api.php, but getting all of the tags that are used is a little hit and miss. [23:40:30] Duesentrieb: thanks for all dude! [23:41:02] Pingveno: that is indeed a problem with the api. [23:41:19] That there is no documentation for all of the tags? [23:43:45] Pingveno: yes. [23:43:53] hmmm [23:44:01] and that too many different ones are used [23:44:06] and that it can't be changed now :) [23:44:15] Yeah, legacy :)