[00:02:02] Nervosa: true wysiwyg is not really possible, and even coming close is hard. the best option (but far from perfect) is probably the FCKEditor integration [00:03:00] Duesentrieb: The editor they're using looks pretty good. [00:03:14] Looks like JavaScript. [00:03:55] I'm not really concerned about the wysiwyg, more or less looking for an easy way for users to make pages and add them to categories [00:04:42] Nervosa: there's a number of extensions for page templates and category selection. haven't tried any [00:05:08] I also have seen kind of the same thing on wikia sites like http://halo.wikia.com/wiki/Special:Createpage [00:06:20] Duesentrieb: ah, guess I'll have to test them out there. Was wondering if there was a common one since I have seen it on a few other sites. [00:08:14] no idea, maybe it is :) [00:15:00] Wikia's looks like a custom extension; it's listed on Special:Version (with no documentation link), while armchairgm's isn't. [00:16:03] hey wiki fans... I'm back with more obscure questions [00:16:45] Is there a way to let anonymous users have read access from only a specific IP range/address, and make everyone else log in to read? [00:33:48] 03(mod) Enable Gadgets extension in cs.wikisource - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11994 +comment (10danny_b) [00:43:32] Did I stump the experts? [00:46:45] Klemo: that is not a feature built into MediaWiki [00:46:58] however, you could write an extension to meet that need [00:54:59] lol [00:55:21] I don't think I'm savvy enough to write an extension... [00:56:02] well maybe... but I definitely don't have the time to write one [00:58:48] sigh... [00:58:54] anywho... laters, wiki freaks... [00:58:55] love ya [01:30:34] Hi [01:32:02] I'm having problems with the svg mime type: when I upload an image of that type it is shown with a text saying "image/svg+xml". I have no idea how to remove it. Please, could you help me? 
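[Editor's note: the channel is right that IP-conditional read access is not a built-in feature, but a rough LocalSettings.php sketch can approximate it without a full extension. Everything here is an assumption for illustration: the subnet, and the simple string-prefix IP check (a real deployment would want proper CIDR matching).]

```php
<?php
// LocalSettings.php — sketch only, not a vetted extension.
// Assumption: the trusted range is 192.0.2.0/24 (placeholder subnet).

// Deny reading to everyone by default...
$wgGroupPermissions['*']['read'] = false;

// ...but re-enable anonymous reading when the request comes from the range.
// Naive prefix check; only works cleanly for /24-style ranges.
if ( isset( $_SERVER['REMOTE_ADDR'] ) &&
     strpos( $_SERVER['REMOTE_ADDR'], '192.0.2.' ) === 0 ) {
    $wgGroupPermissions['*']['read'] = true;
}

// Keep the login page reachable so everyone else can log in to read.
$wgWhitelistRead = array( 'Special:Userlogin' );
```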
[02:40:46] hello everybody [02:41:09] how do i configure mediawiki so that it allows user's own javascript? [02:41:41] like it happens on wikipedia: [[User:MySelf/monobook.js]] ? [02:42:54] is it some extension? [02:56:22] php nuke platinum [02:56:25] _ [02:56:27] ? [02:59:38] Xerol [02:59:45] hoțgeidn kardeț [03:06:32] index applications are up to ... adler [03:06:36] at least that's in s3 :) [03:08:01] yay [03:08:26] brion: log_id too? ;) [03:08:36] nope :) [03:08:43] :'( [03:26:14] :P [03:26:17] Pathoschild [03:26:41] No, bi. [03:26:58] masterpersona gay 2 [03:27:20] Oh no I am so deeply insulted. [03:27:38] :Pp [03:28:00] *AaronSchulz is surrounded by more gays than he though [03:28:18] brion: promise to to start trying to kiss me one day will you? [03:28:30] *to not [03:28:34] opps ;) [03:28:49] The usual Freudian slip. [03:28:57] *brion is straight and taken :) [03:29:07] *AaronSchulz wants a girl too [03:29:12] Pathoschild love you :D [03:29:18] *AaronSchulz thinks about stealing brion's [03:29:25] *Pathoschild-gay :D [03:29:29] No stealing. [03:29:30] *Pathoschild doubts brion would approve. [03:29:40] :P [03:29:43] I approve of Pathoschild stealing. [03:29:44] :DD [03:29:48] :o [03:29:50] no noo noo [03:29:51] :DDD [03:29:52] *Pathoschild steals masterpersona. [03:29:58] ;DD [03:30:00] :DD [03:30:11] ok, enough irc kiddie spam [03:30:11] ok they need to make mac os x work in parallels / vmware... [03:30:23] or else be easy to keep old copies of safari around actually using old copies of webkit [03:31:15] G_SabinoMullate: ! [03:31:45] G_SabinoMullane: http://en.citizendium.org/wiki/Special:ConfirmAccounts [03:31:47] wtf? [03:35:53] Checking... 
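[Editor's note: the per-user JavaScript question above ([[User:MySelf/monobook.js]]) needs no extension; it is controlled by core settings that are off by default. A minimal LocalSettings.php fragment:]

```php
<?php
// LocalSettings.php — allow users to keep personal scripts and styles at
// [[User:Name/monobook.js]] and [[User:Name/monobook.css]], as on Wikipedia.
// Both settings default to false in DefaultSettings.php.
$wgAllowUserJs  = true;
$wgAllowUserCss = true;
```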
[03:37:20] btw, I should make a separate list for "held" accounts [03:37:50] AaronSchulz: Line 994 of _body : $tbl_arch is not defined anywhere [03:39:36] ah, I see [03:41:43] ok, the first run-through of applications seems done [03:41:51] i'll have to do master switches then the next application [03:45:07] AaronSchulz: Rolled into place [03:45:26] G_SabinoMullane: gah, I need shell ;) [03:51:14] ok gonna swap on s3 [03:52:15] locking [03:53:48] bah [03:53:50] something borked [03:55:14] show master status\G [03:55:14] Empty set (0.00 sec) [03:55:16] the fuck? [03:55:59] ERROR 1186 (00000): Binlog closed, cannot RESET MASTER [03:56:28] weeee [03:57:10] ok i guess we won't be using db1 as master. [03:59:50] samuel looks happier [04:00:18] How do I download a database dump? [04:00:43] whenever I try as is I get 26% of the file (about a gigabyte and a bit) and then the download stops [04:00:47] at the exact same place each time [04:01:04] this is for the all pages, current versions only, 20071018 dump [04:02:01] the articles, templates, image descriptions dump stops after a couple kb [04:02:10] yelyos: double-check that your download tool handles large files [04:02:18] It's firefox - should it? [04:02:27] ought to [04:02:27] It shows me the expected file size properly [04:02:52] Just a quick question on mediawiki please [04:02:58] It's fine until it hits that magic 26% and then the download rate crashes to 0 [04:03:10] and refuses to restart [04:03:33] yelyos: link to file? [04:03:49] Opps sorry to interrupt i have a quick question on MediaWiki Please ? [04:03:50] http://download.wikimedia.org/enwiki/20071018/ [04:04:00] sakthi: shoot [04:04:03] Thanks [04:04:10] there's not really such a thing as interrupting in irc [04:04:27] imo anyway. [04:04:30] which file? 
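[Editor's note: for the dump download that stalls at 26%, switching to a resume-capable client, as yelyos tries below with wget, avoids re-fetching the first gigabyte. A hedged sketch, using the dump URL quoted in the log:]

```shell
# -c resumes a partially downloaded file instead of starting over,
# so the stalled portion does not have to be re-fetched.
wget -c http://download.wikimedia.org/enwiki/20071018/pages-meta-current.xml.bz2

# After it finishes, verify the archive is intact before importing:
bzip2 -tv pages-meta-current.xml.bz2
```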
[04:04:37] I have somehow got MediaWiki installed on a Win 2003 server [04:04:48] pages-meta-current.xml.bz2 [04:05:01] yelyos: all i can say is, if one tool fails, try another :) [04:05:05] I tried pages-articles.xml.bz2 too [04:05:11] and it downloaded some 9kb file onto my computer [04:05:48] But when someone tries to create a new user it comes up with an error; can someone tell me which file I need to tweak to talk to the proper mail server? [04:06:08] all right, i'll try and see if it works for me... [04:06:27] starting on s2... [04:06:57] MZMcBride: well, it worked fine for more than a gigabyte [04:07:06] so it'd be a while before you'd know whether you were getting the same problem [04:07:24] yeah, i realize; i'll see if it goes past 26% [04:07:40] Guys, what's happening to my question please [04:08:04] sakthi: what's the error? [04:08:35] is the e-mail address that you set up the wiki with valid? [04:08:58] Could not send confirmation mail. Check address for invalid characters. Mailer returned: SMTP server response: 550 5.7.1 Unable to relay for hsukumar@bond.edu.au [04:09:49] i know it is something to do with tweaking some file to talk to the proper mail server [04:10:12] can someone tell me what the file is please? [04:11:12] all settings are in LocalSettings.php [04:11:30] all available options for LocalSettings.php are found in includes/DefaultSettings.php [04:12:34] MZMcBride: so do you think if i change it to the proper mail server then i will not be getting this error? [04:12:46] the only way to find out is to try [04:13:01] ok i will try it thanks mate [04:13:17] I'll be back in a bit. [04:13:23] ok [04:13:25] I'll try wget [04:13:28] see if it works better [04:21:38] Can I check the values of 2 variables in an #if statement? [04:21:46] like, if either is given [04:22:12] I tried: #if: {{{var1|{{{var2|}}}}}} [04:22:27] but that seems to never display [04:22:51] What is the full {{#if:}} statement? 
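[Editor's note: the "550 Unable to relay" error above means the default PHP mail() transport is handing mail to a server that refuses to relay. MediaWiki can be pointed at an SMTP server that will relay for the wiki via the core $wgSMTP setting. Host names and credentials below are placeholders:]

```php
<?php
// LocalSettings.php — send mail through a relaying SMTP server
// instead of PHP's default mail() transport.
$wgSMTP = array(
    'host'     => 'mail.example.edu',  // placeholder: a server that relays for you
    'IDHost'   => 'example.edu',       // used to build the Message-ID header
    'port'     => 25,
    'auth'     => true,                // set false if the relay needs no login
    'username' => 'wikimailer',        // placeholder credentials
    'password' => 'secret'
);

// The sender addresses should also be valid on that mail server:
$wgEmergencyContact = 'wiki-admin@example.edu';
$wgPasswordSender   = 'wiki-admin@example.edu';
```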
[04:23:27] {{ #if: {{{mother|{{{father|}}}}}} | [04:23:38] then there's a couple of lines of .. oh wait .. wikitable stuff [04:23:49] I've had to fix this before [04:23:51] meh [04:24:28] Yeah, wikiML table syntax will mess with the ParserFunction syntax. [04:24:54] yeah .. it works with normal tr/td tags [04:25:14] i use {{ #if: {{{mother|}}}{{{father|}}} | [04:26:43] No difference, beyond style. :) [05:15:15] I changed $wgHashedUploadDirectory to false... how can I move all the images out of the hashed dirs without breaking them? [05:40:44] 03(mod) add "refresh this page" link - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11811 (10jidanni) [05:44:32] what is the simple way to add an inline image? [05:49:54] it seems like uploads are not enabled on my wiki [05:50:07] is there a setting I need to set? [05:52:34] gongoputch: see LocalSettings.php and DefaultSettings.php [05:52:53] ok [06:00:51] it wants 'convert' ... is that from ImageMagick? [06:02:27] nvr mnd I am stupid [06:06:19] !stupid | gongoputch [06:06:19] gongoputch : The wise man knows he is stupid. [06:12:42] :) [06:23:06] Things don't work in MediaWiki at all, it's hard, and there are so many easy ones on the market; why do people still use MediaWiki? [06:31:47] sakthi: I started on it because it has such a userbase, and wikipedia looks good to me [06:32:10] I wish postgres support was more solid for it tho [06:34:17] TimStarling: just a short question about ParserFunctions. In expr.php there are ~10 error messages which are not localizable. do you have any objections to moving these messages to ParserFunctions.i18n.php to make them localizable? [07:12:34] Raymond: you mean the expr_* messages? sure, you can move them [07:14:29] *AaronSchulz hugs TimStarling [07:15:33] TimStarling: yes. those ones. 
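[Editor's note: the two #if idioms discussed above, side by side, with parameter names taken from the log. Both display the branch when either parameter is non-empty; the real culprit in the original template was the wikitable syntax inside the branch, whose pipe characters collide with the parser-function pipes. Using HTML tr/td tags, as the log says, or the commonly used {{!}} escape template avoids that:]

```wikitext
{{ #if: {{{mother|{{{father|}}}}}} | shown if either parameter is set }}

{{ #if: {{{mother|}}}{{{father|}}} | shown if either parameter is set }}
```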
ok, I will move them :) [07:17:27] 03(NEW) A way to select which parts of articles to show - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11996 15enhancement; normal; Wiktionary tools: General; (hippytrail) [07:22:07] 03(NEW) Convert the WT: PREFS javascript to a PHP MediaWiki extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11997 15enhancement; normal; Wiktionary tools: General; (hippytrail) [07:33:14] http://en.wikipedia.org/w/index.php?title=Template:Loop&action=edit [07:34:36] 03(NEW) A way to show only the few languages a user is interested in - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11998 15enhancement; normal; Wiktionary tools: General; (hippytrail) [07:35:17] mysql> select count(*) from templatelinks where tl_namespace=10 and tl_title='Loop'; [07:35:21] | 58375 | [07:38:38] I hate users [07:38:50] hand up if you're a user, I want to kick you [07:42:43] no users here? how odd [07:44:25] never... /me is a writer, photographer, programmer, but never a user [07:44:50] what programming languages? [07:45:01] do you program in wikitext? [07:45:07] say 'no' [07:45:13] TimStarling: http://en.wikipedia.org/wiki/Talk:Chemical_imbalance#Response_to_.27Voice_of_All.27 [07:45:17] noooo. a little bit php [07:45:24] grr...I should just leave this guy [07:45:32] I really don't give a shit for this anymore [07:45:51] never written an email client in mediawiki templates? [07:47:27] you mean http://en.wikipedia.org/wiki/Template:Loop ? 
looks very evil [07:47:51] 03(mod) A way to select which parts of articles to show - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11996 +comment (10dmcdevit) [07:50:12] *Raymond_ offers user:Polonium to kick [08:03:28] 03(FIXED) ParserFunctions error message localisation - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=8773 +comment (10raimond.spekking) [08:03:41] <_wooz> llo [08:03:56] *Raymond_ kicks CIA-36 again [08:04:34] bah, someone is prolly DoSing msn hotmail now :( [08:05:46] Raymond_: dead for days [08:06:09] Nikerabbit: I know but sometimes kicking helps :) at least to me :) [08:07:25] or is that not what you meant? [08:08:01] better a kicked bot than a silent bot :) [08:09:02] is there an easy way to get a diff of one revision in svn? [08:16:30] + $messages = array_merge( $messages_base, $messages_base ); ?? [08:18:53] ?? [08:19:11] Duesentrieb_: ? [08:20:00] does anyone know how to set the message that is displayed when a MediaWiki installation is turned to read-only? [08:20:34] The message says "The administrator who locked it offered this explanation: $1"... but I don't know how to set the $1 [08:21:15] sean: add $wgReadOnly='your read only error message' to LocalSettings.php [08:21:43] thank you! [08:22:43] Have any of you ever had problems with replication in MediaWiki before? [08:22:56] I had replication going for more than a week... then this morning the site goes haywire [08:23:13] and reads and diffs are randomly choosing between current content and content that seems to be from before replication started [08:24:27] how many requests per second are you serving? [08:24:39] probably about 8 to 10 [08:24:57] I have a squid set up and memcached... so those are other possible points I could be screwing up [08:25:38] 8 to 10 at the frontend? so presumably not much more than that figure at the DB in terms of queries per second? 
[08:26:05] around 8 at the backend [08:26:06] but yeah [08:26:13] not much more at the back [08:26:41] so surely your DB servers couldn't be heavily loaded at that rate [08:27:00] they're actually fairly heavily loaded [08:27:18] master was running around 30% cpu yesterday and slave was 80-90% CPU [08:27:32] I'm on relatively cheap hardware I guess [08:27:42] the master has 4GB of RAM and the slave has 2GB [08:28:30] i ran "SHOW SLAVE STATUS" and there was no entry in seconds_behind_master... so I assume everything was caught up [08:28:57] our db9 at the moment is doing 3300 queries per second [08:29:08] ... but the weird thing is that the data isn't just slightly stale... on random requests it's like a week old (my gues is from before replication) [08:29:17] hmmm... I'm not sure how many queries per second I'm doing [08:29:22] how can I check that? [08:29:43] $ echo 'show status like "Questions"' | mysql -h db9 ; sleep 10 ; echo 'show status like "Questions"' | mysql -h db9 [08:29:58] $ echo $((3437651457-3437618449)) [08:29:58] 33008 [08:30:22] or 3300.8 qps [08:30:50] running that... [08:31:23] my point is that replication does require a lot of sysadmin work, especially if you're balancing read load [08:31:29] it's not particularly simple to maintain [08:32:22] hmm [08:32:23] so unless you've got a really big site, it might be easier to serve the whole load from the master, and keep the slave as a hot spare [08:32:46] I'm doing 100 queries per second right now [08:33:05] I think it's fairly big... 
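[Editor's note: TimStarling's counter-sampling trick above, written out as a small script. MySQL's "Questions" status variable is a cumulative counter, so two readings N seconds apart give average queries per second. The two readings and the 10-second interval below are the values quoted in the log; on a live server you would fetch them with `mysql -h db9 -e 'SHOW STATUS LIKE "Questions"'`.]

```shell
#!/bin/sh
# Average qps from two readings of the cumulative "Questions" counter.
q1=3437618449   # first reading  (from the log)
q2=3437651457   # second reading, taken 10 seconds later
interval=10
echo "qps: $(( (q2 - q1) / interval ))"
```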
[08:33:18] I couldn't handle the traffic on just the one server before [08:33:28] db9 is a quad-core Xeon with 16GB of memory [08:33:29] Over a million requests get PAST the squid per day [08:33:41] dang [08:34:05] your data set size is important though [08:34:21] if you're serving entirely from RAM, MySQL can serve a *lot* of queries [08:34:57] about 500,000 good pages [08:35:13] ok, so your site is bigger than you thought it was :) [08:35:32] I guess :) [08:35:41] ... that's not all going to fit into RAM I take it [08:35:43] lol [08:35:52] well... not how little RAM i have [08:36:20] try "show table status", that shows you the data set sizes [08:36:43] it's particularly important that you have enough memory to hold the indexes [08:36:47] oh... I'm runnin 100qps on my slave too apparently (so 200 qps)? [08:37:08] hey [08:37:16] hello domas [08:37:27] TimStarling: I realized it is very hot in North Pole! :) [08:37:33] yeah? [08:37:34] is it a common joke in Australia? [08:37:38] http://www.accuweather.com/world-index-forecast.asp?partner=accuweather&traveler=0&locCode=OCN|AU|WA|NORTH%20POLE&metric=1 [08:37:39] :) [08:38:10] re [08:38:10] some of these index sizes are... realllly big [08:38:10] mmm, should I bump up heating or dress more [08:38:25] Sean: sometimes by making the index bigger you make it faster. [08:38:46] and if anyone has single app server, running memcached is bad. [08:39:03] why is that? [08:39:03] anyone tried Sphinx on Windows? [08:39:06] domas: that's amusing, I wonder if that's an actual place [08:39:30] we had -12C here this night [08:39:37] now it warmed up to -5C [08:39:46] why does winter strike that fast. [08:39:53] *domas blames global warming [08:39:58] I only have one APP server... but 4 servers total... is that still bad to run memcached? [08:39:58] Sean: ok, so let's assume that you do actually need replication [08:40:01] Sean: because there're IPC-less methods for same. 
[08:40:26] ok, thanks :) [08:40:42] hmm [08:40:51] the MediaWiki load balancer will distribute load between non-lagged slaves [08:41:00] non-loaded too [08:41:00] ok [08:41:26] hmmm, I have to remember what the defaults are and what it is that we use... [08:41:31] TimStarling: I had a question for you as load balancer expert. Should we remove master_pos_waits() from "Threads_running" or not? :) [08:42:42] it's a bit tricky, because if there are a lot of threads waiting, they have the potential to overload the server once they come out of wait [08:43:11] indeed [08:43:20] that was what made me hesitate [08:43:24] on the other hand, they may not [08:43:40] because the lag can be caused by some really long single write statement [08:43:51] now that makes "all servers busy" kind of thing [08:44:00] mmm [08:44:42] I don't understand how week-old data could crop up all of the sudden from replication [08:44:49] is there something simple I'm missing? [08:45:02] (like does it roll-back under some circumstances?) [08:45:24] domas: maybe you could take them out of threads_running, but put them in another variable [08:45:41] TimStarling: that was the idea [08:45:55] that way we can use threads_running+threads_lag_wait at the application level if need be [08:46:21] TimStarling: nowai I'd silently remove that without adding elsewhere [08:46:24] that would be fine by me [08:47:06] the term "threads_running" implies they're running, which waiting threads aren't [08:47:30] if you add another variable, then you give more flexibility to the app [08:47:44] then the app can choose what the definition should be [08:48:30] Sean: sounds like MediaWiki was not using your slave server at all [08:48:31] mhm [08:48:40] then something changed and it started using it, so you started getting old data [08:49:35] the default maximum lag is indeed infinite [08:49:46] does anyone know how to add the default mediawiki tables to a database with an arbitrary prefix, without using the normal webinstall? 
[08:50:00] hmm [08:50:29] you have to use something like $wgDBservers[1]['max lag'] = 30; [08:50:51] then mediawiki will stop using it if replication stops [08:50:57] hmm [08:51:00] I'll add that in [08:51:06] how can I tell if the replication is working? [08:51:13] show slave status; [08:51:17] I ran queries and got WEIRD results [08:51:18] k [08:51:24] and also show processlist [08:51:46] show processlist will show you the replication threads, and their status gives you some useful diagnostic information [08:52:00] it says it's waiting for master to send event [08:52:39] well, you can sanity-check it by looking for the most recent events in other ways [08:52:52] such as "select max(rc_id) from recentchanges" [08:53:28] on a working slave it'll be incremented at each edit [08:53:43] on a stopped slave, the value will be significantly behind the master [08:53:50] the slave has a HIGHER value [08:53:55] :[ [08:54:08] that would do it, yeah... [08:54:13] I wonder if I made MediaWiki do WRITES to the slave [08:54:25] you need to have read_only set on the slave [08:54:36] TimStarling: I'm not sure that parameter is used if slave thread is stopped [08:54:37] 03(mod) Install the StringFunctions extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10ross) [08:54:37] where... in wgDBservers? 
[08:54:47] in mysql, it's a mysql global variable [08:54:52] set @@read_only=1; [08:55:02] *domas giggles [08:55:33] k [08:55:35] because if mediawiki writes to a slave for whatever reason, you've got a bit of a disaster [08:55:42] which is what I suppose you've got [08:55:45] haha [08:55:48] i certainly have [08:55:58] the site is totally fubar [08:56:10] mediawiki doesn't write to slaves, usually [08:56:19] cause that would be trapped by our dberror.log :) [08:56:19] it's happened a few times on wikipedia, due to misconfiguration [08:56:33] TimStarling: well, someone promoted slave to a master, in that misconfiguration [08:56:33] let me see if i'm doing this right [08:56:37] there might be some scripts in the maintenance directory to help you out [08:56:56] so I have wgDBuser, wgDB[whatever] all set to the master [08:57:08] first of all, set your site to read-only in MW [08:57:08] then I have wgDBservers as an array [08:57:10] and that array contains the master and the slave [08:57:21] so that the problem doesn't get any worse [08:57:22] yeah, it's read only at the moment [08:57:29] I set that right when I got in this room [08:57:35] http://lyricwiki.org [08:58:06] in wgDBservers both the master and the slave are there, but the slave is configured to take 70% of the load [08:58:25] next you need to look at MediaWiki's tables on each server and work out what has been written to the slave alone [08:58:43] ok [08:58:52] sometimes you can just promote the slave to be the new master, if the switch was clean [08:59:04] alrighty... I'll start digging [08:59:08] but if writes were mixed, some to the real master and some to the slave, then you'd have to merge them [08:59:08] I have a weird anomaly that might help me figure things out though... 
[08:59:20] so right before I turned the site to ReadOnly, I created a page [08:59:27] 'LyricWiki:Test' [08:59:36] then I went to each database to see which it was created on [08:59:58] and (maybe I'm just messing up the query here)... it looked like it wasn't in either database [09:00:03] even though the page shows up on the site [09:00:18] fun [09:00:43] does that make ANY sense? [09:00:47] here's the query I used: [09:01:01] SELECT * FROM wiki_page WHERE page_title='LyricWiki:Test' [09:01:16] oh okay... wtf [09:01:19] now the page isn't on the site [09:01:42] it's probably there, you're probably just not familiar with the schema [09:01:53] cache [09:02:03] try: SELECT * FROM wiki_page WHERE page_namespace=4 and page_title='Test'; [09:02:11] oh snap [09:02:14] that makes sense [09:02:18] I'll try it [09:02:57] sweet! [09:03:00] okay... it's on the master [09:03:05] unsweet... [09:03:15] lol... I think the writes were done to both database [09:03:16] s [09:03:34] I'm just surprised that it went for so long without doing anything weird on the site [09:03:35] until today [09:03:47] almost everything was messed up today [09:03:56] but yesterday everything was cool; [09:06:00] I did change the weights on the servers last night... that might have done it [09:06:07] I had it at 99% slave, 1% master [09:06:33] so my slave is probably in a better spot... and I just wasn't noticing the 1/100 requests that were getting messed up [09:07:37] how are you telling mediawiki which one is the master and which is the slave? [09:08:23] well, from the docs, I thought it just assumed that wgDBuser, wgDBpassword, etc. were for the master [09:08:29] and everything in wgDBservers were used as slaves [09:08:35] this is probably wrong, huh? [09:08:36] well, they're not [09:08:39] yep [09:08:40] OUCH [09:08:46] how am I supposed to tell it? [09:09:39] looks like it's not documented [09:10:03] hmm [09:10:06] how do you guys do it? 
[09:10:07] the first member in $wgDBservers is the master, the server with index 0 [09:10:17] WOW [09:10:21] OUCH [09:10:26] I had it the other way [09:10:34] if you set $wgDBservers, then the single-server variables are generally ignored [09:11:04] hmm [09:11:16] alright [09:11:27] looking at the data [09:11:37] it seems that my slave should now be the master [09:11:55] it's not entirely clean-cut since 30% of the writes were going to the master today [09:12:19] ...wait... why were any writes going to it if the first server is used as the master? Will MediaWiki send writes to all of the servers unless they are read_only? [09:12:38] writes will go just to first server in array [09:12:42] fun. :) [09:13:17] oh... okay [09:14:52] so I have to flip the master and the slave [09:15:10] but like... I have the real master (which was supposed to be the slave) on crappy hardware right now [09:15:47] you probably want to wipe the real master and copy the data from the slave [09:15:48] so if I do a mysqldump, overwrite the master with that data, then restart replication... [09:15:52] cool [09:15:57] use a data directory copy, with rsync [09:15:58] then life will be swell :) [09:16:02] it's faster than mysqldump [09:16:10] hmm [09:16:12] how do I do that [09:16:16] man rsync :) [09:16:17] lol [09:16:29] or scp or whatever [09:16:39] network copy tool of your choice [09:16:41] oh ok, cool [09:17:18] where are the data files? [09:17:25] depends on configuration [09:17:50] 'locate mysql' gave me tmi :) [09:18:25] rsync over rsync --daemon [09:18:32] \o/ [09:18:35] if you have gigabit network [09:18:49] and anything, if you have 100mbps or less. [09:18:49] :) [09:19:00] I have gigabit fortunately [09:19:02] mysqldump can saturate 100mbps ;-) [09:19:23] I found a bunch of .myd and .frm and .opt files [09:19:30] is that likely the correct directory? 
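[Editor's note: the configuration the conversation converges on, written out. The first entry of $wgDBservers (index 0) is the master and receives all writes; once $wgDBservers is set, the single-server variables ($wgDBserver, $wgDBuser, ...) are generally ignored. Hosts and credentials are placeholders; the 'max lag' key is the safeguard TimStarling recommends earlier so a stopped slave drops out of rotation:]

```php
<?php
// LocalSettings.php — replication-aware DB configuration.
$wgDBservers = array(
    // Index 0 = MASTER: all writes go here.
    array(
        'host'     => 'db-master.example.org',  // placeholder
        'dbname'   => 'wikidb',
        'user'     => 'wikiuser',
        'password' => 'secret',
        'type'     => 'mysql',
        'load'     => 1,        // relative share of read traffic
    ),
    // Index 1+ = slaves, reads only.
    array(
        'host'     => 'db-slave.example.org',   // placeholder
        'dbname'   => 'wikidb',
        'user'     => 'wikiuser',
        'password' => 'secret',
        'type'     => 'mysql',
        'load'     => 3,        // slave takes ~75% of reads
        'max lag'  => 30,       // stop using it if replication falls behind
    ),
);
```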
[09:19:54] in /var/lib/mysql [09:19:54] if you run "show variables like 'datadir';" it will show you [09:20:01] niiice [09:20:23] don't forget to move ib* files... [09:20:25] neato [09:20:29] ib* files? [09:20:35] ibdata, .. [09:20:41] heh, .myd [09:20:44] mediawiki on myisam [09:20:48] our most supported configuration, isn't it? [09:21:05] yeah, except image delete is apparently broken on it [09:21:15] you mean he's on myisam? [09:21:25] 200 qps on myisam? :) [09:21:29] last I checked I thought I was innoDb [09:21:55] yeah [09:22:02] if I do "SHOW CREATE TABLE wiki_page" [09:22:08] it says ENGINE=InnoDB [09:22:39] show variables like 'innodb_data_file_path'; [09:22:54] that will give you the innodb data file locations [09:23:09] but they're probably in the data directory, and you just want to copy the whole data directory [09:23:13] it will usually live at the top of datadir [09:23:17] ibdata1:10M:autoextend [09:23:28] TimStarling: 200qps on myisam is likely, if no writes ;-D [09:23:46] for smaller sites which don't use our perverse indexing tricks myisam can be faster [09:23:48] I have to go [09:24:08] thanks for all of the help man! [09:24:17] my life == saved :D [09:24:18] but domas will help you if you're nice to him [09:24:32] shweet [09:24:38] *gives domas a cookie* [09:25:41] I didn't wake up yet [09:25:46] he does actually know this stuff better than me, despite the fact that he's limited himself to occasional cryptic wry comments so far [09:25:56] hahaha [09:26:08] TimStarling: I'm frrrrrozzeennn [09:26:14] poor domas [09:26:14] and coffee doesn't help. [09:26:17] I will have to increase my briberies... maybe send domas a space-heater [09:26:24] *domas doing both - bumping up heating and dressing up. [09:30:19] I shut down both mysql servers during the transfer [09:30:26] I dunno if that's neccessary... but I guessed! [09:30:53] where in the world are you that's so cold domas? [09:32:52] Sean: shutting down is needed, yes [09:33:00] lithuania! 
[09:33:39] brrr [09:34:21] do you work for WikiMedia also? [09:34:42] (I'm assuming Tim does...) [09:35:13] nope, I work for mysql :) [09:35:21] oh cool [09:35:24] though my contract allows to do voluntary work for wikimedia [09:35:30] nice [09:35:34] mysql is l33t... good job :) [09:40:58] it would seem like I would have to change some setting after this transfer is complete in order to make this all work [09:41:03] ... like the log position [09:41:23] yup [09:41:38] after i do that, everything else should just fall into place, right? [09:41:43] mhm [09:41:50] look up the log on the master... set it on the slave and "START SLAVE" [09:41:54] wow, that's pretty cool :) [09:52:11] [10:40] ok, thanks :), that was a joke ==> I'm asking again : [10:39] anyone tried Sphinx on Windows? [09:54:18] negative... but I've heard good things about the Lucene extension for MediaWiki [09:54:58] i'm so far on Win ==> difficulties with Lucene (if tried) [09:55:53] i think i'm going to try Sphinx (just curious if someone already tried) [09:56:39] and i don't like (too much) about Java ;) [09:59:12] going to change to Linux after trying dualboot (Ubuntu 7.10 is on my list) [09:59:17] hi [09:59:23] hi emunkki [09:59:46] i know this must a be a known (solved) bug, but my rss/atom feeds get broken in mediawiki 1.9.3 [09:59:50] any solution? [10:00:59] it seems there is one empty line before the xml start tag and they become invalid feeds [10:01:15] and i can't figure out where it comes from [10:01:48] i'd rather not update mediawiki, so if there's any other solution.. [10:17:13] thanks domas and TimStarling! everything's working now: http://lyricwiki.org [10:17:19] g'nite! 
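[Editor's note: the rebuild procedure sketched across the conversation above, collected in one place. Hosts, paths, and binlog coordinates are placeholders to be read off your own servers; the MySQL statements are shown as comments because they run on the servers, not locally:]

```shell
# Rebuilding a broken slave by copying the master's raw data directory.

# 1. Stop MySQL on BOTH machines so the files are consistent.
#      (each box)  /etc/init.d/mysql stop

# 2. Copy the whole datadir — including the InnoDB ibdata/ib_logfile files —
#    which "show variables like 'datadir';" located earlier.
rsync -av /var/lib/mysql/ slave.example.org:/var/lib/mysql/

# 3. Restart both servers, make the slave refuse stray writes,
#    and point it at the master's current binlog position:
#      (slave) mysql -e "SET GLOBAL read_only = 1;"
#      (slave) mysql -e "CHANGE MASTER TO
#                MASTER_HOST='master.example.org',
#                MASTER_USER='repl', MASTER_PASSWORD='secret',
#                MASTER_LOG_FILE='mysql-bin.000123', MASTER_LOG_POS=4;
#                START SLAVE;"

# 4. Verify with SHOW SLAVE STATUS\G — Seconds_Behind_Master should reach 0.
```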
[10:30:15] 03(mod) Moving pages to their own redirects should be permitted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11948 +comment (10dungodung) [10:35:20] yay [10:35:28] I like how my presence helps a lot [10:35:33] even though I don't say anything [10:42:17] 03(mod) Internal Error when trying to show differences of a page - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11992 (10N/A) [10:47:01] 03(mod) Internal Error when trying to show differences of a page - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11992 (10N/A) [10:47:02] 03(mod) some Functions don't Work with PostgreSQL - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11331 (10N/A) [10:52:46] 03(mod) Multiple watchlists - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=1492 +comment (10dungodung) [10:59:20] 14(DUP) Page move vandalism handling - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11968 +comment (10dungodung) [10:59:22] 03(mod) Page moves should be throttled to say 1 per minute for non-admin users - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=1454 +comment (10dungodung) [11:42:34] 03(mod) uploaded files cannot be updated or deleted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11905 (10ecornely) [12:01:23] fuck. so much for testing. [12:01:33] Nikerabbit, Raymond_: sorry about the blooper [12:02:15] shit happens :) [12:04:41] anyway... the problem applies to all my extensions [12:05:21] if foo.i18n.xxx.php uses $messages = array(...), but doesn't specify all messages, there is no fallback. [12:05:37] my fallback mechanism isn't designed to handle that [12:05:49] i'd have to put the workaround i made for Gadgets into all me extensions [12:05:59] (or move that function into core) [12:06:20] Nikerabbit: whatcha think? [12:07:02] (and, btw - what happened to rob? burn out?) [12:07:50] dunno... maybe waiting for review of some of his branches? [12:09:10] you mean he didn't go away, just into deep hack mode? [12:10:33] dunno exactly... 
I know he has a few branches and waits for Brion's review. if he is frustrated or does other things... real life? who knows [12:15:02] 03(NEW) variable $comment not defined - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11999 15enhancement; normal; MediaWiki extensions: Inputbox; (bertrand.grondin) [12:15:11] oh no! rob went into real life! [12:15:18] 03(mod) variable $comment not defined - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11999 15enhancement->normal (10bertrand.grondin) [12:15:36] 03(mod) variable $comment not defined - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11999 normal->high (10bertrand.grondin) [12:28:20] hurro [12:29:51] say i want to edit MonoBook.php to change the layout [12:30:54] how does one include a template? [12:31:26] ie. php function to parse out wikitext? [12:31:37] mazzanet: templates would be tricky. It's much easier to include a system message. I.e. use MediaWiki:Foo, not Template:foo [12:31:57] i tried that first [12:32:09] and what's the problem? [12:32:31] i took a stab in the dark and called wfMsgHtml() [12:32:56] call wfMsgWikiHtml [12:33:04] ah! [12:33:06] MsgHtml expects the message to already be html [12:33:15] :P [12:33:18] MsgWikiHtml expects wikitext, and spits back html [12:33:21] ta [12:33:59] is there a way to create a static version of a mediawiki for like burning it to a cd? [12:34:18] marvxxx: maintenance/dumpHtml.php [12:39:07] Duesentrieb_: yay, but... [12:39:24] i get <> around the output [12:40:30] wfMsgHtml did the same around the text, i do remember [12:41:11] ie. 
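[Editor's note: the system-message approach from the exchange above, as a sketch. The message name 'sidebar-note' is hypothetical; the page [[MediaWiki:Sidebar-note]] must exist on the wiki, otherwise the call returns the message key wrapped in angle brackets, which is exactly the <> output discussed below:]

```php
<?php
// Inside a skin template such as MonoBook.php: render the system message
// [[MediaWiki:Sidebar-note]] (hypothetical name) as HTML.
// wfMsgHtml() only escapes the raw message text;
// wfMsgWikiHtml() runs it through the parser first and returns HTML.
echo wfMsgWikiHtml( 'sidebar-note' );
```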
&< in the final html [12:41:24] s/&// [12:42:11] 03(NEW) Automatic List on linked articles for namespace reference - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12000 15enhancement; normal; Wikimedia: Bugzilla; (gdgourou) [12:53:54] Duesentrieb_: sorry i got kicked out of the internet [12:54:38] dumpHTML looks fine, but it doesn't work independently of the mediawiki installation... like with the images and stuff [12:58:21] marvxxx_: yea, i think you have to do the images separately. maybe talk to TimStarling [12:58:36] mazzanet: uh, <> would indicate MediaWiki:foo was not found. [12:59:30] Duesentrieb_: thank you. but it looks like it would make more problems. like with an extension i use. imagemap [12:59:57] or i try wiki on a stick and burn it to a cd [13:00:30] Duesentrieb_: but it parsed out the wikitext [13:10:12] Duesentrieb_: ah, forgot to create a message, not a template [13:10:21] all good, cheers [13:13:34] is http://bugzilla.wikimedia.org/show_bug.cgi?id=11995 well put for editing LocalSettings.php of some wikipedia? [13:14:03] is there a way to disable the search field and the toolbox? [13:33:13] Duesentrieb_: does your i18n patch actually fix the <> problem we saw? or is that unrelated? [13:49:41] hi guys. How do I change the default set of preferences for new users? E.g. I'd like the 'Add pages you edit to your watchlist' pref. to be checked by default for each new user. [13:51:49] Raymond_: no idea what the problem was there [13:51:54] it shouldn't have anything to do with this [13:52:19] stas: $wgDefaultUserOptions [13:53:01] Duesentrieb_: thanks, looking at it right now [14:00:25] Duesentrieb_: perfect, just what I needed. Thanks! [14:03:49] is there a global variable for ParserOptions? [14:05:34] SportChick: why is $wgParser not benefitted with mOptions set? [14:05:38] erm. [14:05:46] why is $wgParser not benefitted with mOptions set?
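The $wgDefaultUserOptions answer above is a LocalSettings.php setting; a short sketch for the watchlist preference stas asked about ('watchdefault' is the preference key behind 'Add pages you edit to your watchlist'):

```php
# LocalSettings.php — defaults applied to newly created accounts;
# existing users keep whatever preferences they already saved.
$wgDefaultUserOptions['watchdefault']   = 1; // watch pages you edit
$wgDefaultUserOptions['watchcreations'] = 1; // watch pages you create
```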
[14:07:39] 03(NEW) update french translation - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12001 15enhancement; normal; MediaWiki: Internationalization; (bertrand.grondin) [14:10:05] and why is one of the most important variables not described in the wiki? [14:10:13] *PunkRock stops whining [14:10:20] sorry :) [14:11:59] giving constructive criticism != whining [14:12:55] you should see me....i am not sure if it's the cold, or the pain that makes my eyes wet :) [14:13:07] kidding. [14:13:39] working with the wgParser is anything but trivial. [14:14:36] i want to parse a little tiny bit of wiki markup. if the function that does it is called within a SaveArticle Hook everything is fine. [14:14:55] but within a SpecialPage i can't use this. [14:17:00] it throws a fatal error: Fatal error: Call to a member function getMaxIncludeSize() on a non-object in xxxxx on line 2828 [14:17:32] so i tried to pass mOptions to it. [14:30:46] ok. i passed empty ParserOptions and got the parser running ... somehow. [14:40:52] PunkRock: $wgParser->mOptions contains whatever options were last used for parsing [14:41:46] you shouldn't use it for input to parse(), because then the options will either be random (dependent on last caller), or not valid at all, if there was no last caller [14:43:42] 03(NEW) Adding database support for the Ingres DBMS - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12002 15enhancement; normal; MediaWiki: Database; (alexander.thiem) [14:44:12] Tim-away and how do i get rid of the Fatal Error then? [14:44:29] by constructing a new ParserOptions object [14:44:53] you know how to make new objects, right? [14:45:30] that is what i did! [14:45:44] ok. good news then. [14:46:25] 03(mod) creating internationalization i18n file - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11953 (10bertrand.grondin) [14:46:50] i wonder if one should work on mw without knowing how to instantiate a class.
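The getMaxIncludeSize() fatal above is the parser being invoked with no ParserOptions object. A rough sketch of the fix the channel converges on — construct fresh options rather than reusing $wgParser->mOptions (MediaWiki-of-the-era API, so verify against your version):

```php
// Inside a special page: parse a snippet of wikitext safely.
global $wgParser, $wgTitle;

$options = new ParserOptions();  // fresh options, not the stale mOptions
$output  = $wgParser->parse( $wikitext, $wgTitle, $options );
$html    = $output->getText();
```

The log also mentions $wgOut->parse() as a shortcut, which carries its own cached options object and avoids the problem entirely.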
[14:47:28] you can also use $wgOut->parse() as a shortcut, $wgOut has a cached options object that it wil use [14:47:42] oh! [14:47:55] thats maybe the best solution. [14:48:31] but. .... finally i got it running ... it would just be a time eater. [15:34:51] what do I need to edit to modify the menu on the left? [15:38:18] !sidebar | DigitallyBorn [15:38:18] DigitallyBorn : To change links in navigation bar on the left, simply edit [[MediaWiki:Sidebar]] in * list style. More details on . [15:47:33] 14(DUP) New accesskey for "Preview" button in edit pages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11864 +comment (10Simetrical+wikibugs) [15:47:36] 03(mod) Accesskeys conflict with browser keyboard shortcuts - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=477 +comment (10Simetrical+wikibugs) [15:49:48] 03(mod) Accesskeys conflict with browser keyboard shortcuts - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=477 +comment (10Simetrical+wikibugs) [15:54:47] Hello [16:00:09] 03(mod) Moving pages to their own redirects should be permitted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11948 (10N/A) [16:02:04] Hi everyone, question for you, but first thank you for mediawiki. [16:02:40] I've got a load of images ~120MB on my host server and want a quick way to link to them in my wiki. [16:03:22] Is there a bulk upload feature, or an easy wiki markup to link thereto? [16:03:42] I've no wish to manually upload 900 images using the upload dialogue. [16:07:23] hm. irc, she is not for the impatient. [16:07:56] hi there. is wgScriptPath available in javascript ? [16:08:15] have you looked? [16:09:21] yes, but it behaves strangely [16:09:44] strangely how? 
[16:09:53] it should be exactly the same as $wgScriptPath [16:10:04] however, beware execution order [16:10:04] oops sorry, it was me :-P [16:10:25] 03(mod) Moving pages to their own redirects should be permitted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11948 (10roan.kattouw) [16:10:47] I misinterpreted quotes... [16:11:49] 03(mod) Moving pages to their own redirects should be permitted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11948 (10dungodung) [16:13:06] ok guys, let's try an svn up :D [16:13:47] *Pathoschild quickly saves his edits before the servers crash. [16:15:42] *domas ducks [16:16:02] x3 [16:17:00] NotACow: You're too right about that article. [16:17:35] minute: thank you [16:17:53] minute: his response to my comment was laughably funny [16:18:00] No, thank you. [16:18:01] minute: he basically took his own argument and ripped it to shreds [16:18:04] lol [16:18:20] The article was actually rather patronising. [16:18:31] minute: these sorts of articles usually are [16:19:06] minute: it's another one in an ongoing series of rationalizations for nerds being jerks to those around them [16:20:11] Yeah and the article was being rude, obviously some nerds would read it, and it talked about them in the third person. [16:20:18] hmmm, didn't see any conflicts but something's broke. i'll peek in the err log [16:20:52] minute: the author is almost certainly an aspie. [16:21:00] *minute looks that up. [16:21:01] minute: not that he'll admit it [16:21:02] btw, brion, I hope checking anon group read permission is enough to guard rawpage on private wikis.. [16:21:27] NotACow: Ahh, yes. [16:21:41] probably [16:22:03] Sorry, my computer died, so if anyone answered you please be so kind as to do so again? I promise to add the info to the wiki :) [16:22:20] just popping in briefly to brag about my parser stuff...
[16:22:28] minute: i have an undiagnosed central auditory processing disorder (never been confirmed professionally, but my brother's has been and i have many of the same symptoms) [16:22:44] oh [16:22:54] minute: asperger's is the same concept, except instead of speech it's emotional cues from body language [16:23:13] right [16:23:15] I imported [[United States]] and all its templates into my test wiki, disabled image rendering since that throws things out a bit [16:23:26] minute: many aspies cope with their inability to interpret body language and facial expressions directly using cognitive processes, the same way i do for speech [16:23:54] I'm getting 76s for the old parser and 39s for the new parser [16:24:16] minute: so i have some understanding of how this works. and what he's describing is pretty much classic aspie behavior. [16:24:21] Them's good timings Tim. [16:24:35] NotACow: heh [16:24:37] and the new parser still has room for more memoization/caching [16:24:41] TimStarling: You wrote a new parser? [16:24:45] oooh [16:24:59] TimStarling: sweeeet [16:25:15] *NotACow will have to have a look sometime soon [16:25:17] *minute begs to god that it be able to be used without database access. [16:25:24] minute: heh [16:26:23] minute: well, it's a bit hush-hush at the moment since I don't want to join the vaporware brigade on wikitech-l [16:26:31] that's a big waste of time [16:26:32] hello [16:26:37] TimStarling: heh [16:26:45] TimStarling: But does it work without database access? [16:27:29] I'm just rewriting the preprocessing side of the parser, with full syntax backwards compatibility, in pure PHP [16:27:34] except faster and more flexible [16:27:48] oh, ok [16:27:50] yay [16:28:01] *minute hoped that it would be possible to use it without having a working database.
[16:28:02] not doing something useful like formalising the parser in pseudo-EBNF or anything [16:29:12] pseudo-EBNF because apparently it's not expressable in ordinary EBNF [16:29:30] "useful" [16:29:42] sorry, sarcastic tone doesn't carry well on IRC [16:30:18] "useful", like throwing out 80% of the features and writing a wikitext to XML converter in C that's *slower* than the PHP one [16:31:00] (sorry Magnus) [16:31:25] TimStarling: Is it public? [16:31:34] (the test wiki) [16:31:37] no [16:31:43] :( ok [16:32:12] I can't use my browser at the same time the tests are running, because it would slow it down a lot and throw out the results [16:32:18] let alone invite the whole internet in to play [16:33:31] ok think i got the fatals worked out [16:34:21] making the current parser work without a database is pretty simple, if you don't mind losing half a dozen features [16:34:30] obviously you can't have templates without a place for the templates to come from [16:34:46] yeah [16:34:47] ok scap time [16:35:22] minute: in theory one could abstract out a lot of stuff and use the same class in another environment [16:35:28] in fact i think domas has done that every once in a while ;) [16:36:17] TimStarling: you can have callback for template loading ;-) [16:36:21] TimStarling: at least that was my solution. [16:36:46] brion: I remember showing you the source, I don't remember where is it though. [16:37:15] you could probably use an FSRepo for filesystem-only images if you don't mind it being slow [16:37:17] I was working on rewriting the installer, and have been trying to replicate some of the main parsers features in a mini parser. But it would be nice to use the real parser because my mini one would suck. 
Nov 16 16:36:46 srv160 apache2[12355]: PHP Fatal error: Method name must be a string in /usr/local/apache/common-local/php-1.5/extensions/BotQuery/query.php on line 660 [16:38:04] now I realize why that one svn command is called "blame" ;-) [16:38:10] :) [16:38:30] domas: You made the parser work without a database? [16:38:45] I made a standalone parser class, yes [16:38:50] oh [16:39:00] just as proof of concept, that could be done in an hour or two. [16:39:05] is there a way to profile mediawiki memory usage? [16:39:13] how do i configure mediawiki to be one big free-for-all (no restrictions, anyone can sign up and post and edit) [16:39:23] Nikerabbit: xdebug [16:39:27] domas: does it use the current one as a base, and use class NewParser extends Parser - or was it a complete rewrite? [16:39:33] AlphaOmega: that is the default [16:39:44] minute: no way! awesome! ^_^ [16:39:48] ty [16:39:49] minute: current one [16:39:57] minute: all I did was supplying stub dependencies [16:40:21] if you know where that source is, tell me :) [16:40:24] domas: neato, would you care to share the code, or would you prefer i create my own? [16:40:25] oh [16:40:27] :( [16:41:23] that is most unfortunate [16:41:53] unfortunate to lose 3 hours of work? [16:42:07] yes [16:43:04] anyway, brb [16:47:09] bah [16:47:18] some of the CU slave updates still running on s3 [16:47:25] it got stuck doing waitforslave for... bacon o_O [16:50:02] TimStarling: Mainly because it seems unlikely domas will do it again (although I haven't asked him) - and I don't have enough experience. [16:52:18] mariusp: get out of here [16:52:27] :) [17:07:28] 03(FIXED) update french translation - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12001 +comment (10huji.huji) [17:08:52] any brazilian? [17:09:07] Nikerabbit: The betawiki sitenotice is out of date.
:P [17:10:59] minute: bah [17:16:50] 03(mod) Moving pages to their own redirects should be permitted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11948 (10N/A) [17:23:35] 03(FIXED) if page is protected proofread doesnt work - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11983 +comment (10thomasV1) [17:25:09] 03(NEW) Possibility to auto-recognise redirects in article space - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12003 15enhancement; normal; MediaWiki: Redirects; (phi1ipp) [17:25:16] bah, cia dead? [17:25:54] AaronSchulz I think so [17:25:57] AaronSchulz: For a day or so. [17:26:16] brion: can you sync CU, I made a tiny sort change [17:27:06] *brion looks [17:27:19] brion: http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/CheckUser/CheckUser_body.php?sortby=date&r1=27548&r2=27547&pathrev=27548 [17:27:26] brion what is the average donation now? [17:27:32] is that info publicly available? [17:28:22] http://donate.wikimedia.org/en/node/22 [17:32:24] 03(mod) Allow hiding of Cite.php references - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=8033 (10N/A) [17:46:06] 03(mod) Enable Gadgets extension in cs.wikisource - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11994 (10jkbwiki) [17:46:55] why is support for html constantly being removed from messages? at least, what is the reason &nbsp; can't be in messages? [17:49:11] $wgScriptPath = "/wiki"; [17:49:20] if I change that to / [17:49:25] will it break anything? [17:50:07] I am trying to set up an Apache VirtualHost and having a few problems with DocumentRoot and $wgScriptPath [17:51:43] brion, can I have some technical info. How "expensive" are our hard drives? [17:56:37] varies between about $100 - $600 each [17:56:43] roughly [17:58:45] MrCheerful - I had tried that once and it didn't break [18:02:49] Hi all! [18:02:51] mariusp so a 30$ donation would cover 1/20th of our most expensive drives?
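MrCheerful's DocumentRoot/$wgScriptPath question above comes down to keeping the two in sync. A minimal sketch under assumed paths (hostname and directory are hypothetical, not from the log), not a tested recipe:

```apache
# Hypothetical layout: MediaWiki unpacked directly into the DocumentRoot.
<VirtualHost *:80>
    ServerName wiki.example.com
    DocumentRoot /var/www/wiki
</VirtualHost>

# $wgScriptPath in LocalSettings.php is the URL path from the
# DocumentRoot to the wiki — here the wiki IS the root, so:
#   $wgScriptPath = "";
```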
[18:03:09] a cat [18:03:37] Danny_B, raw HTML is a problem as an attack vector and a way to screw stuff up generally. Entity references like   should probably be allowed anywhere, since you mention it. It's allowed in wikitext-enabled messages. [18:04:07] MrCheerful, it will break things if scripts aren't available from there. [18:05:02] *domas looks up [18:05:40] Shiroi_Neko: add electricity, cases for hdds, admin work, etc.. ;-) [18:06:01] domas oh indeed [18:06:08] my wine costs. [18:06:20] you drink wine in the server room? [18:06:30] no [18:06:32] but I drink wine [18:06:41] a server room isnt exactly the most eccentric and romantic place you know [18:06:48] I know [18:06:50] unless your GF is into servers more than yourself [18:07:06] she'd probably dump you for a server in such a case [18:07:12] italian wine? [18:07:23] Shiroi_Neko: you're not really being sane, I guess. [18:07:28] HelLViS69: nope :) [18:07:31] domas hey :D [18:07:38] *Shiroi_Neko is pattented insane [18:07:44] domas you don't know what you miss :P [18:07:50] http://en.wikipedia.org/wiki/Wikipedia:Requests_for_arbitration#Reply_to_Jack_Merridew [18:07:56] domas my inquiry was for the math [18:10:44] Shiroi_Neko, the math is essentially correct. There's no loss to Wikimedia on the server level from storing tons of stupid articles. [18:11:09] Simetrical: can plaintext messages be turned to plaintext + entities (or at least  )? [18:11:16] Of course you can make arguments about whether it helps or hurts the mission of Wikimedia, but it doesn't cost excessive server resources. [18:11:26] Danny_B, they should be, yes. Maybe I'll look at that in a minute. [18:11:40] thanks [18:12:41] Is there any known way of making a static backup of a mediawiki implementation? (browseable via filesystem) [18:12:49] Simetrical yes well... 
[18:12:59] I would like to set up wiki.private.company.com [18:13:18] I just wanted to silence the notion that there is a resource issue [18:13:23] but VirtualHost points to the wiki directory [18:13:40] then the wiki tries to look in a wiki directory that does not exist. [18:14:47] Simetrical: because there are typographical rules about connecting certain parts of text and it's impossible to follow those rules with no nbsp support (which sometimes leads to ugly, hardly legible results). and other entities should be allowed in plaintext as well... [18:15:09] Danny_B, I realize, yes. You can enter an nbsp literal if you want, mind (unless using a broken version of Firefox). [18:15:47] Solifugus: not at the moment, you could make one, however [18:16:01] Simetrical: did you mean i can enter nbsp as one unicode char? then i can't - it's being converted to a regular space [18:16:22] Danny_B, probably you're using a broken version of Firefox. [18:16:52] Simetrical: latest is broken? [18:16:54] !shorturl | MrCheerful [18:16:54] MrCheerful : To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at http://www.mediawiki.org/wiki/Manual:Short_URL. There are instructions for most different webserver setups. [18:17:01] Does anyone here know mediawiki, from a programming point of view? [18:17:04] and every single other before? [18:17:05] Danny_B, AFAIK yes. I think maybe it's fixed for 3.0, can't remember. [18:17:16] Solifugus, I do a fair bit, yes. [18:17:22] Danny_B, I believe that's the case, indeed. [18:17:45] Simetrical: is there a function that renders a particularly requested page? [18:17:46] Danny_B, https://bugzilla.mozilla.org/show_bug.cgi?id=218277 [18:18:02] Solifugus, Article::getText or somesuch, probably. [18:18:26] Simetrical: ok, then.
however,   is still more obvious as well as other entities (sometimes people don't have such support in font and just see square or question mark) [18:18:29] ok, thanks Simetrical [18:18:59] Danny_B, filed in 2003, fixed for 1.9, so it will be in FF3.0. Yes, I know it should be fixed in MW, I'm just suggesting a workaround for the time being. [18:18:59] Simetrical: ok.. I am wondering if I could just code something that requests the rendering of each page to a string, then save the results to a file.. and loop through everything in that manner.. [18:19:18] 03(mod) Change of upload url on hr wiki - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11995 +shell (10jeluf) [18:19:28] Simetrical: wget is not working, due to login... [18:19:45] Solifugus, that should be possible. Would take a while, though, because parsing an average page can take 0.5s+. [18:19:55] Solifugus, since we use a horrible evil sucky multipass PHP parser. [18:20:43] huh, mozilla should use monobook style for bugzilla :o) this one is unclear [18:20:47] When I try to use OpenSearch I get this message - Firefox could not download the search plugin from: http://...url for searchdesc. But when i go to that URL myself, I can download a file whose contents are http://rafb.net/p/b8Ddu928.html [18:20:56] Simetrical: I just need a perhaps weekly static snapshot--all our support documentation is in the wiki, but if the network goes down, we still need to get at the documentation (to fix the network problem) [18:21:08] Has anyone worked with OpenSearch? [18:22:59] "brion-leper"? o_O [18:25:37] Simetrical: doesn't appear to be a getText function in class include/Article.php's class Article [18:26:02] Solifugus, I don't know the exact name, there should be something like that. Or maybe it's the Revision class, that's probably it. [18:26:12] Could be you want $article->getRevision()->getText() or something. [18:26:21] I'm saying all this from memory, I don't know offhand. 
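Solifugus's plan above — loop over every page, fetch its wikitext, render it, and write the result to a file — can be sketched roughly as follows. The class and method names are the from-memory guesses discussed in the log, so check them against your MediaWiki version before relying on this:

```php
// Sketch: render each main-namespace page to a static HTML file.
// Assumes it runs inside a MediaWiki maintenance script context.
global $wgParser, $wgUser;

$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select( 'page', array( 'page_namespace', 'page_title' ),
	array( 'page_namespace' => NS_MAIN ) );
foreach ( $res as $row ) {
	$title    = Title::makeTitle( $row->page_namespace, $row->page_title );
	$revision = Revision::newFromTitle( $title );
	if ( !$revision ) {
		continue; // page row without a readable revision
	}
	$options = ParserOptions::newFromUser( $wgUser );
	$html    = $wgParser->parse( $revision->getText(), $title, $options )->getText();
	file_put_contents( $title->getDBkey() . '.html', $html );
}
```

As noted later in the log, parsing an average page can take half a second or more, so a full dump of a large wiki is slow; maintenance/dumpHTML.php exists for exactly this job.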
[18:26:42] 03(mod) Use transclusions to count articles as well - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11868 (10Simetrical+wikibugs) [18:29:00] 03(mod) upload file type black-/whitelist error message - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11984 +comment (10jeluf) [18:29:05] Simetrical: the Revision class has a getText() function, which then calls a getRawText() function--are you sure this provides a rendered version? Or just the wiki source? [18:29:49] Solifugus, if it provides the wiki source, call Parser::parse() on it. [18:30:31] ok.. [18:40:52] 03(mod) image position rendered incorrectly: overlaps main text; Ff 1.0. 7 - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=4643 (10N/A) [18:50:38] has anyone seen style guidelines for writing templates, to make them more readable/maintainable? [18:51:17] A simple 'address' template I'm making (http://wiki.digitally-born.com/wiki/Template:Address) isn't working correctly .. it performs the address2 logic, but it prints the brackets for the city logic, anybody help? [18:55:51] suddenly my Wiki gets like DDOSed by Wordpress blogs :p [18:56:17] tons of POST requests to trackback.php with User Agents like "Wordpress2.0" ,"Wordpress1.9" etc... [18:56:26] i have $wgUseTrackbacks = false; though and didnt change it [18:56:34] server load avg at ~ 20 :o [18:57:12] wonder why they suddenly all started to connect.. strange is also that i cant resolve the IPs to hostnames, though [18:58:40] Digitally-Born: {{ #if ... -> {{#if: ... (mind the colon) [18:58:46] yeah .. I just caught it [18:58:47] meh [18:58:49] thanks :D [19:00:54] mutante: delete trackback.php ? :P [19:01:01] Why would wikis trackback anyway oO [19:01:53] yeah, i moved it, and that dramatically dropped my load average :p [19:02:09] :) [19:02:11] just wonder what changed at wordpress today? 
that made them all connect [19:07:12] 03(mod) Add marker that forces a template to be always substituted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=2003 (10N/A) [19:13:12] 03(mod) Install the StringFunctions extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=6455 +comment (10Simetrical+wikibugs) [19:15:32] are there any conventions while using ajax in MW? [19:16:02] Simetrical: orps? [19:16:12] Nikerabbit, ? [19:16:23] Simetrical: what's your mail about signatures about? [19:17:08] You mean where Brion put the strings "User:" and "Special:Contributions" into messages that are marked not to be localized? [19:17:17] Is that considered okay by localization people? [19:17:22] Simetrical: why not? [19:17:25] Surely you want those to be translated? [19:17:27] they are inside links [19:17:31] Simetrical: those are canonical namespace names in link targets [19:17:35] they don't need to be localized [19:17:43] they are not shown to users [19:17:49] brion, they are if they edit the page . . . [19:18:14] boo hoo [19:18:16] they are in content language [19:18:24] they can customize it if they like, but why bother [19:18:52] which inherits from english... oh well [19:19:03] Because it's possibly gibberish to them when they read the wikitext? [19:19:36] And providing users with gibberish isn't helpful? [19:20:00] hmm [19:20:10] Nikerabbit, the messages are marked not to be localized, and it would be silly to try localizing them. Wouldn't make sense to duplicate so many times. [19:20:13] Makes things a PITA. [19:20:37] Anything wrong with {{subst:ns:user}}: and something similar for local special page name? [19:21:02] Actually, I don't know why we link to contribs for anons. Anons can have user pages, it's inconsistent. [19:21:03] I can't figure out anything else but changing it to {{ns:user}} and then parsing it with parsemag and then putting it into wikitext [19:21:17] It's parsed anyway, surely.
Simetrical: yeah but no {{}} thingies will show up in the page source then [19:21:58] Right, that's good. [19:22:03] Why should they? [19:22:13] ?? [19:22:22] I did say {{subst:}}. {{subst:}} works in custom signatures, right? Wouldn't it work here? [19:22:24] I haven't tried it. [19:22:43] Simetrical: no need to use subst [19:22:48] just pass it through parsemag [19:23:07] Hmm, right. [19:23:10] Forgot about that option. [19:23:21] (people can edit those messages too afterwards, not possible with subst) [19:23:21] Well, it's being parsed anyway, subst shoudl do the same thing. [19:23:23] should. [19:23:48] What do you mean, people can edit those messages afterwards? This is being saved in the page text, whatever gets saved isn't going to be globally changed later. [19:24:18] yeah but you can't edit the message and keep the {{subst}} in it [19:24:19] *Simetrical frowns at GROUP BY cuc_ip ORDER BY last [19:24:29] Oh, I see. [19:24:32] Yes, that's a good point. [19:24:35] parsemag is better, then. [19:25:24] *Nikerabbit imagines what happens when people put random magic words into the signature [19:27:21] *amidaniel hopes we don't allow magic words in sigs ... [19:28:00] only sysops can change it but still... [19:28:24] *amidaniel changes his sig to "__NOEDITSECTION__ {{Special:Recentchanges}}" [19:29:17] brion, can I delete the autolinking RFC/ISBN/PMID feature? It's kind of silly and outdated. [19:29:49] 03(mod) Accesskeys conflict with browser keyboard shortcuts - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=477 (10huji.huji) [19:30:06] Simetrical: no. [19:30:09] not at this time anyway [19:30:24] :( [19:30:24] likes having RFC and ISBN autolinked [19:31:37] It's a horribly narrow feature that could be replaced with templates at the cost of about four extra characters per call. [19:31:44] It would also allow customization of how they actually render. [19:31:52] five! [19:32:09] Nikerabbit, no, four. You replace the space with a pipe.
[19:32:26] Simetrical: noooo [19:32:32] that's ev0l [19:33:24] 03(NEW) categories that should be populated by templates using parser functions remain empty more often than not - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12004 normal; normal; Wikimedia: wikibugs; (esprit15d) [19:34:01] *Simetrical introduces esprit15d to the job queue [19:34:45] told #wikipedia i would mention this here, but had to "pastebin" it http://meta.wikimedia.org/wiki/User_talk:Mutante#redirect_to_local_language_pages_on_wikipedia [19:36:30] mutante, interwiki links need not be unique, you know. You can have multiple links from en:Foo to, say, dewiki pages. [19:37:07] Possibly in some wikis a concept is broken down over two or three pages with no one central gateway page, while on other wikis it's all condensed onto one page. [19:37:41] It would be easy enough to write a special page to do this, probably. Special:Interwikiredirect/de/Pagename or whatever. [19:38:25] hmm, but harder to get wikipedia to use it [19:38:28] i guess [19:41:52] non-wikipedia mediawikis could have a template {{wikipedia}} "further wikipedia links for {{PAGETITLE}}, that link to multiple languages at once, without having to add all the local page names in their template.. thats how i started that conversation [19:51:09] is there a way to url-encode a varable in a template? [19:51:29] {{#urlencode:{{{1}}}}} [19:51:30] Digitally-Born: Yup, {{urlencode: ...}} :) [19:51:42] hm... with or without the #` [19:51:43] *amidaniel stabs Duesentrieb_ for being faster [19:51:44] ? [19:51:53] Oh, actually I'm not sure [19:51:54] Sec [19:52:29] Yep, without the # [19:54:49] It's core, not ParserFunctions. [19:55:23] how about make a link open in a new window? [19:55:50] Digitally-Born: You need to toy with js to do that [19:56:12] I mean .. with wikitext? [19:56:17] Digitally-Born, deliberately impossible. 
[19:56:28] if I can add an attribute to it, it's easy with js [19:56:29] The user can open something in a new window or not as they choose. [19:56:36] You can't add JS attributes in wikitext. [19:56:57] Simetrical: I can live with that .. only because it's deliberate :) [19:56:58] Maybe if you enable $wgUseRawHTML or whatever it's called. But obviously that's not secure, unless your wiki is locked down. [19:57:36] Digitally-Born: Easiest way is to wrap up the links in 's of a certain class and then add the attribute through monobook.js [19:57:43] s/Easiest/Safest [19:57:51] I suppose that would work, too. [19:57:52] eh .. it's not that important [19:58:15] it's just a map link in my address template [19:58:20] no biggie [19:59:06] http://wiki.digitally-born.com/index.php?title=Template:Address&action=edit [20:00:14] Digitally-Born: if you just need it in one or two places, use HTMLets [20:00:27] Digitally-Born: otherwise, there's a hack somewhere on mediawiki.org [20:00:36] ok, this might be big .. is there a way to list the articles in a category on an article? [20:02:02] I'm making 'contact sheet' templates .. one for a person, one for a company [20:02:24] I'd like to list all the articles in Category:Company Employees [20:02:41] that possible without being really dirty? [20:03:12] look at the Dynamic Page List extension. Or CategoryTree [20:15:55] So apparently Wikibooks (English, at least) had a software update last night(?) [20:16:28] This gave us patrolling of new pages via Special:Newpages. This has since disappeared. Anyone know why? [20:17:09] Mike_lifeguard, because it hadn't been properly reviewed before being enabled. [20:17:21] It will be reviewed and reenabled as brion finds the time, probably soonish. [20:18:04] OK. But why was it included in the first place if it's not yet ready? [20:19:02] Mike_lifeguard: I believe it was enabled by default and it was forgotten to disable it before running updates [20:19:10] bingo [20:19:23] woohoo .. 
what do I win? :) [20:19:34] fair enough. "Soon" you say? [20:19:41] that sounds fine [20:19:44] everything is relative :) [20:20:25] FYI in Special:Specialpages, Special:Captcha is listed as "<captha>" [20:20:42] a cosmetic change, I think [20:25:28] *amidaniel stabs Mike_lifeguard for copying and pasting incorrectly and sending him on a wild goose hunt with grep :) [20:25:48] :( sry. i didn't copy and paste. that might have worked better [20:25:50] ok, so whats with all the yellow stuff at sp:np [20:25:51] captha? :P [20:26:17] yeah it's actually <captcha> [20:26:19] lol [20:26:20] i missed a c [20:27:01] I'm having an issue installing mediawiki, my server admin has installed an automatic configure script in the /config directory and i can not remove it; how can i work around this? [20:27:38] ok fixing it to unlisted [20:27:57] Soup_Can: i believe you can rename the directory [20:28:07] might have to tweak a couple instances of 'config/' in the index.php file tho [20:28:29] is that the only place it occurs? [20:28:44] are there other places i need to modify? [20:33:15] failed to open stream: No such file or directory in /homepages/4/d200985634/htdocs/test/includes/Database.php on line 2183 [20:33:15] Could not open "../maintenance/tables.sql". [20:33:17] is how it ends [20:35:11] well, does the file exist? [20:35:30] is it readable? [20:35:40] brion: bah, CU's linkbatch thing must be broken, can you check? [20:35:54] I'm getting 'redlinked' user/usertalk links to actual pages [20:36:17] :( [20:36:22] example? [20:37:39] AaronSchulz: do they have spaces? [20:37:46] brion: 'get edits' for 217.43.58.59 [20:38:52] Does anyone know whether the {{#if}} function is only available to certain versions of mediawiki?
[20:39:06] AaronSchulz: LinkBatch::add doesn't appear to do space normalizes, it assumes underscores [20:39:12] *normalization [20:39:16] but user_text fields have spaces [20:39:30] I'm trying to use {{#if}} it in a template on wiki.eclise.org, but it doesn't work [20:39:33] !parserfunctions [20:39:33] "parser functions" are a way to extend the wiki syntax. ParserFunctions is an extension that provides the basic set of parser functions (you have to install it separately!). For help using parser functions, please see . For details about the extension, see [20:39:40] Jimse: see above [20:39:50] *brion snuggles mwbot [20:40:01] Jack_Phoenix: Thanks [20:40:41] brion: how does special:protectedpages linkbatching work? [20:40:45] np [20:40:51] $lb->add( $row->page_namespace, $row->page_title ); [20:41:26] bah, page table must be normalized [20:41:33] but not CU? [20:41:54] AaronSchulz: it's the cu_user_text for the user and user talk pages [20:42:01] user_text fields have spaces [20:42:54] ahh [20:43:07] brion, is it intended that plaintext messages can't contain entity references like  ? [20:43:30] Simetrical: that's what plaintext means [20:44:13] brion, is it desirable? At least in the case of nbsp, some languages require it, and thanks to Firefox it can't be reliably input literally. [20:44:30] "Plaintext" could (should?) mean "no formatting". [20:44:35] 03(NEW) $wgUseRCPatrol on nl.wikiquote - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12005 normal; normal; Wikimedia: Site requests; (londenp) [20:45:42] Simetrical: it might or might not be desireable to be able to do it, but it would have to be very carefully checked to avoid borkage when you really do want &s [20:46:25] *Aaron|coding tries $lb->add( NS_USER, str_replace(' ','_',$row->cuc_user_text) ); [20:46:33] hmm, my local installation is still getting redlinks [20:47:07] brion, we have Sanitizer stuff to do that already. 
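The redlink bug being chased above comes down to LinkBatch expecting DB-key titles (underscores) while cu_changes stores user names with spaces. A rough sketch of the workaround Aaron|coding tries, not the final committed fix:

```php
// cu_changes stores user names with spaces ("Foo Bar"), but
// LinkBatch::add assumes DB-key form ("Foo_Bar"), so normalize first.
$lb = new LinkBatch();
foreach ( $rows as $row ) {
	$key = str_replace( ' ', '_', $row->cuc_user_text );
	$lb->add( NS_USER, $key );
	$lb->add( NS_USER_TALK, $key );
}
$lb->execute(); // primes the link cache so user links render blue/red correctly
```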
[20:47:50] yes, and be careful where you use it
[20:48:00] you wouldn't want to corrupt js and css output for instance
[20:48:02] Hello
[20:48:07] Hmm, true.
[20:48:09] or examples of entities in text
[20:48:18] Hello
[20:48:35] brion: grrr
[20:48:44] why must this be so hard? :(
[21:00:17] (REOPENED) Enable Gadgets extension in de.wikipedia - http://bugzilla.wikimedia.org/show_bug.cgi?id=11956 +comment (arnomane)
[21:01:23] wtf
[21:02:40] wtf what?
[21:03:02] why won't this shit work
[21:03:11] brion: str_replace doesn't work
[21:03:38] oddly, 'get edits' for one IP blue-links my name; for another, it's redlinked
[21:04:09] heh crazy
[21:04:16] lemme test locally before i scap other stuff
[21:07:53] I see that patrolling of new pages is disabled by default?
[21:08:20] (on hr wiki we have patrolling enabled)
[21:08:41] can somebody edit LocalSettings.php?
[21:09:03] or should I file a bug on bugzilla and wait.....?
[21:09:39] stemd: wrong channel, really... #wikimedia-tech would be better. and generally bugzilla is your best bet for this stuff if you don't catch someone to do it right away.
[21:09:39] Aaron|coding: r27560 ok for you?
[21:10:04] Duesentrieb: thanks for the pointer
[21:11:15] still redlinked
[21:11:50] hmmmm
[21:12:22] fine here
[21:12:47] then I guess you can scap
[21:12:55] my test wiki is filled with mods
[21:13:08] hmm, I'll try my trunk test wiki
[21:14:33] putting bacon back into rotation on jawiki...
[21:15:06] scappin
[21:18:59] ok, good
[21:19:11] all trunk is good
[21:27:02] what is quicker or less resource consuming: to have a message overridden but plaintext, or to have the default from PHP but using {{VARIABLE}}?
[21:37:09] what would be the regular expression for something that contains a "/" or ":"? would it be [.]*:[.]*|[.]*\/[.]* ?
[21:38:34] blue_asterisk: That depends entirely on what you're trying to match. That code would match "foo:bar blah blah rest of page content and anyway, he said: ", for example.
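(Editor's note on the redlink debugging above: the fix Aaron|coding is trying hinges on MediaWiki storing page titles in underscore "DB key" form while user_text fields hold the display form with spaces. A minimal sketch of that normalization; `titleToDbKey` is a hypothetical helper name for illustration, not a MediaWiki API.)

```php
<?php
// cu_user_text / user_text fields store names with spaces
// ("User talk page"); the link tables expect underscores
// ("User_talk_page"). Normalize before passing to LinkBatch::add().
function titleToDbKey( $text ) {
    return str_replace( ' ', '_', trim( $text ) );
}

echo titleToDbKey( 'User talk page' ); // prints "User_talk_page"
```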
[21:38:35] blue_asterisk: \ &lt;--- removes the special meaning of the character after it
[21:39:20] Pathoschild: is it at least syntactically correct?
[21:39:44] (NEW) Translation alternate names of special pages nds-nl - http://bugzilla.wikimedia.org/show_bug.cgi?id=12006 enhancement; normal; Wikimedia: General/Unknown; (servien)
[21:39:48] oh wait, hmm, that only matches one of ':' or '/'
[21:39:55] "[.]" can be written ".". :)
[21:40:47] blue_asterisk: [\/\:]*
[21:41:05] blue_asterisk: [\/\:]+
[21:41:25] or wait..
[21:41:30] Arnomane: ahh, thank you, that's much smarter
[21:41:33] ?
[21:41:58] the second one should work
[21:42:00] no?
[21:42:12] one moment
[21:42:16] nope
[21:42:30] what about other stuff around it?
[21:42:57] (mod) Translation alternate names of special pages nds-nl - http://bugzilla.wikimedia.org/show_bug.cgi?id=12006 (servien)
[21:43:03] it could be negated
[21:43:11] then reverse the logic?
[21:47:04] blue_asterisk: What are you trying to match? Any string of characters containing at least one ":" and one "/", even if it's the entire page content?
[21:47:55] blue_asterisk: [\/\:]+ yes it works ;)
[21:47:55] Pathoschild: yes, but I'm only matching an input field, not an entire page
[21:48:34] blue_asterisk: [ ] square brackets contain all the characters you want to match
[21:48:34] Arnomane: how does it account for other chars around it?
[21:48:39] oh
[21:48:52] blue_asterisk: + means one or more matches
[21:48:57] yeah
[21:49:03] blue_asterisk: \ escapes
[21:49:09] Arnomane's pattern will also match anything that has at least one character, whether or not it contains ":" or "/".
[21:49:17] I kinda know the rules, it just takes getting used to
[21:49:20] ...er, no, you removed the dot.
[21:49:22] Pathoschild: no
[21:49:31] yes ;)
[21:49:33] ^ means starts with and $ means ends with
[21:49:36] It will only match those two, but not anything else. :p
[21:49:50] Pathoschild: at first this was my mistake
[21:50:44] ? does it work?
[21:50:44] So it depends what you want to do; check if it contains one of those two characters, or manipulate it? If you only want to check, Arnomane's pattern is good (and you don't even need the +).
[21:52:04] yes, I want to return true if one or more of each character is present
[21:52:29] in a string?
[21:52:52] yes
[21:53:28] then it works... however you also have to consider the search mode
[21:53:38] ?
[21:53:43] in case you have returns/newlines
[21:53:51] in your string
[21:53:58] return (ereg('[\/\:]+', $name));
[21:54:04] ^^ will that work?
[21:54:55] hm this should work
[21:54:58] Arnomane: oh, it is only an input field I'm checking and it only has one line
[21:55:13] I think it should be fine
[21:55:14] ah never mind my comment
[21:55:23] thanks!
[22:02:10] Does the current version of MediaWiki have a feature that e-mails you when your talk page is changed?
[22:04:19] JohnReaves2: yes, but not enabled by default
[22:04:53] okay, thanks
[22:04:57] http://www.mediawiki.org/wiki/Enotif
[22:05:10] err, wrong, that's the outdated stuff
[22:05:17] here's the right one: http://www.mediawiki.org/wiki/Manual:Configuration_settings#Email_notification_.28Enotif.29_settings
[22:07:50] *blue_asterisk grumbles; it's not working
[22:08:56] why?
[22:11:55] try preg_match()
[22:13:22] preg_match('/[\/\:]+/m', $input)
[22:15:03] I hate how PHP is so loose; if $var is false, does `echo $var` print a blank string?
[22:15:37] Arnomane: what is the difference?
[22:16:08] blue_asterisk: preg is more powerful, and faster, and more standard :)
[22:17:35] what is /m ?
[22:17:47] multiline.
[22:18:01] so ^ and $ match at linebreaks, not only at the start/end of the input
[22:18:13] thanks, regex can be confusing sometimes
[22:18:54] read http://de2.php.net/manual/en/reference.pcre.pattern.syntax.php and http://de2.php.net/manual/en/reference.pcre.pattern.modifiers.php
[22:28:15] is it possible to program {{plural:}} the way it can have 2 or 3 options and not _exactly_ 3?
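(Editor's note wrapping up the regex thread above: the working check blue_asterisk arrives at via preg_match() can be written as below. Inside a character class neither "/" nor ":" needs escaping, and no quantifier is needed just to test presence; `containsSlashOrColon` is a hypothetical function name for illustration.)

```php
<?php
// Return true when the input contains at least one "/" or ":".
// preg_match() returns 1 on a match, 0 on no match, false on error,
// hence the strict comparison against 1.
function containsSlashOrColon( $name ) {
    return preg_match( '/[\/:]/', $name ) === 1;
}

var_dump( containsSlashOrColon( 'foo:bar' ) ); // bool(true)
var_dump( containsSlashOrColon( 'plain' ) );   // bool(false)
```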
(speaking about cs now)
[22:30:55] how would i do a shortURL setup for http://namespace.example.com/Page_title ?
[22:32:37] Not advisable, but read:
[22:32:39] !shorturl
[22:32:39] To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at http://www.mediawiki.org/wiki/Manual:Short_URL. There are instructions for most different webserver setups.
[22:34:49] Danny_B: maybe somehow with parserfunctions?
[22:35:07] yeah, i see the ones that move Page_title over, but not ones that strip the namespace out of title=foo:bar
[22:35:19] Danny_B: {{#ifexist or something like that?
[22:35:26] Arnomane: i mean exactly in the php source
[22:35:36] ah ok
[22:35:49] since i don't know if the # of params is fixed or not
[22:56:35] hello
[22:58:15] why
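(Editor's note on the short-URL question above: the Manual:Short_URL page that mwbot links covers the common case. A sketch of the usual Apache form, assuming MediaWiki is installed under /w/ and articles are to be served from /wiki/Page_title; the per-namespace subdomain variant asked about would additionally need VirtualHost-specific rules, which this fragment does not attempt.)

```apache
# Typical short-URL rewrite per Manual:Short_URL (paths are assumptions).
# /wiki/Foo is rewritten internally to /w/index.php?title=Foo.
RewriteEngine On
RewriteRule ^/?wiki/(.*)$ /w/index.php?title=$1 [L,QSA]
```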