[00:05:41] 03siebrand * r44743 10/trunk/extensions/MetavidWiki/ (2 files in 2 dirs): Rename message 'category-media-count' to 'mv-category-media-count'. 'category-media-count' is used in MediaWiki 1.12.
[00:22:46] 03werdna * r44744 10/trunk/extensions/Configure/Configure.obj.php: Allow array() to be saved. This will occur legitimately when you want to restore all settings to defaults.
[00:25:49] domas
[00:26:57] is $wgDefaultUserOptions the same as $wgGroupPermissions['*'] ?
[00:28:11] nevermind
[00:35:10] 03siebrand * r44745 10/trunk/phase3/languages/messages/ (4 files): Localisation updates from Betawiki
[00:36:48] How much effort would it be for some element on the delete form _other_ than Mediawiki:Excontent in the reason textbox to contain a portion of the source code of the page being deleted?
[00:37:57] on ENWP, we're trying to avoid having defamatory/etc stuff from the article show up in the deletion log, so having it in the default reason is a no-go, but a script to fill in the delete reason to the most likely CSD category depends on availability of that code
[00:42:13] 03werdna * r44746 10/trunk/extensions/Configure/Configure.page.php: Move wiki selector above main form
[00:47:55] 03(mod) Make *.wap.wikipedia.org URLs redirect to *.mobile.wikipedia.org - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16692 (10brion)
[00:50:47] brion: is 'crashier' a new word?
[00:53:20] AaronSchulz: yes!
[00:53:49] brion: http://www.gamefaqs.com/console/psx/review/R40128.html
[00:53:54] wow, 1/10
[00:54:23] ouch
[00:56:37] brion: http://hg101.classicgaming.gamespy.com/bubsy/bubsy3d-7.png
[00:57:22] Shanel == Pathoschild.
[00:57:23] I knew it!
[00:58:05] AaronSchulz: that's hillarious
[00:58:16] and the screenshot makes me lol
[00:59:07] eww?
[00:59:16] i saw better 3d done on 386s
[01:06:05] hi all !
[01:06:31] is the 1.7 version also affected by the recent security issues ?
[01:07:18] Yes, and you should update past 1.7 anyway.. it's really ridiculously old.
[01:07:25] Like, mid-2006.
[01:07:59] werdan7: 1.7 is the version of the current debian stable package...
[01:08:07] Is there a patch planned for this version ?
[01:08:18] If one was released, yes.
[01:08:42] But one wasn't, so no.
[01:08:43] no new tag for 1.7
[01:08:58] humpf
[01:09:10] I guess we will have to adapt from the other patches then
[01:09:23] probably the 1.6 one is not that different..
[01:09:29] You should update to a version that's actually recent.
[01:09:30] can't you just install it the normal way...
[01:09:47] nope, that's the way it works with stable packages
[01:09:52] Rather than relying on sudo apt-get install x for stuff as simple as PHP software?
[01:10:07] Generally speaking, we don't support distribution packages.
[01:10:22] well, security support is a concern with PHP software
[01:10:23] Providing secure packages of an application is the responsibility of debian.
[01:10:31] and it is not simple
[01:10:46] werdnum: yes sure
[01:10:57] so I said we will adapt :)
[01:11:10] you already did a great job with these releases
[01:11:14] I really wouldn't use the deb packages of mw imo
[01:11:30] I usually found mediawiki to have a very good security support compared to other PHP projects..
[01:12:06] You can probably just use the patch against 1.6.
[01:12:16] OverlordQ: probably you're also not running a server with multiple differents uses
[01:12:25] users
[01:12:48] *shrug* still wouldn't use the package
[01:12:55] *FunPika wouldn't use deb packages of any web software since they are always outdated it seems :P
[01:12:55] by the way, is there any sort of multisite support planned for MW or am I asking a FAQ ?
[01:13:21] By multisite support do you mean something like same user database for multiple wikis?
[01:13:43] or simply various site and DB configured from the same PHP files...
[01:14:05] about the package debate, I could word it another way:
[01:14:15] FunPika: some stuff it's fine, if you want it to 'just work' :P
[01:14:18] the popularity of the mediawiki package is fairly good
[01:14:34] meaning people seem to care :)
[01:15:20] about the multisite: how do you people do when you want several wikis running on the same machine ?
[01:15:30] toots: we support that, it's just kind of ugly right now :)
[01:15:35] (probably I don't know an obvious thing..)
[01:15:36] there are several alternate ways of doing it
[01:15:43] i think there's a faq page on it
[01:15:48] !wikifarm
[01:15:48] --mwbot-- To run multiple wikis, you can simply install MediaWiki in different folders, with different databases or in one with database prefixes. You can also have multiple wikis using a single installation: and .
[01:15:54] hey handy
[01:16:00] *brion rubs mwbot's belly
[01:16:03] brion: thanks !
[01:16:36] 14(INVALID) html div bug - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16694 +comment (10danny.b)
[01:17:22] brion: the trick with LocalSettings.php calling another file is very fine
[01:17:35] I fail to understand why I didn't think about it before :)
[01:18:12] so neat and easy !
[01:18:34] :D
[01:18:46] good good
[01:18:52] 03(mod) Enable Lucene 2.1 for all remaining wikis again - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16685 (10rainman)
[01:19:00] 03werdna * r44747 10/trunk/extensions/Configure/Configure.php: Add myself to extension credits...
[01:19:10] about the security issues, I am a bit worried for the delay to apply for the debian packages,
[01:19:16] 03werdna * r44748 10/trunk/extensions/Configure/Configure.obj.php: Add that awful InitialiseSettings hack for Wikimedia for Configure extension.
[01:19:24] they usually want security fixes to be as less changes as possible
[01:19:48] and the current patch is huge compared to the usual ones..
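The "LocalSettings.php calling another file" trick praised above can be sketched roughly like this (a hypothetical LocalSettings.php; the host names and per-wiki file names are made-up examples, not from the conversation):

```php
<?php
# Shared LocalSettings.php for several wikis served from one installation.
# Each virtual host is mapped to its own settings file (and database).
$wikiHost = isset( $_SERVER['SERVER_NAME'] )
    ? $_SERVER['SERVER_NAME'] : 'default.example.org';

$settingsByHost = array(
    'wiki1.example.org' => 'LocalSettings_wiki1.php',
    'wiki2.example.org' => 'LocalSettings_wiki2.php',
);

if ( isset( $settingsByHost[$wikiHost] ) ) {
    require_once dirname( __FILE__ ) . '/' . $settingsByHost[$wikiHost];
} else {
    die( 'Unknown wiki: ' . htmlspecialchars( $wikiHost ) );
}
```

Each per-host file then sets $wgDBname, $wgSitename and so on, while common settings stay in the shared file.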
[01:20:16] 03yaron * r44749 10/trunk/extensions/SemanticMediaWiki/includes/ (SMW_QP_RSSlink.php SMW_QP_iCalendar.php):
[01:20:16] Shortened parameters to just 'title' and 'description', with old values still
[01:20:16] supported for backward compatibility
[01:22:35] is it possible to work-out fixes that are probably less complete but minimals ?
[01:24:07] 03(mod) Watch pages for a few days only - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6964 +comment (10Sebastian)
[02:54:04] Can someone familiar with CSS help me for a moment? Trying to place an ad on the left column of hackepedia.org. Pastebin: http://pastebin.com/ma2ab97
[02:57:46] 03tstarling * r44750 10/trunk/release-tools/make-release:
[02:57:46] * Force re-export on every invocation, don't use existing directories. Otherwise
[02:57:46] a server-side or client-side abort might cause a partial dump to be created.
[02:57:46] Tested server-side aborts, they cause a non-zero exit status from svn and so
[02:57:46] should be dealt with gracefully.
[02:57:48] * Use svn+ssh for export instead of http
[03:00:02] hrm I see there is a second #content section which probably doesn't help http://pastebin.com/m4520653d
[03:15:29] Anyone that has went through the process of adding banners to mediawiki that can help me with this? I can't find any helpful documentation other than adding google adsense.
[03:15:49] I'm almost there as you can see, it's just a matter of adjusting the formatting
[03:23:43] <_aib> i upgraded to svn head and now the fckeditor mediawiki extension is borking on me.
[03:23:44] <_aib> i get ' Call to undefined method Parser::parser()' on line 29 of this file: http://pastebin.ca/1288290
[03:24:11] <_aib> i am trying to fix it so please let me know if you have any pointers
[03:54:48] _aib: you should contact the maintainer of the extension
[04:00:42] <_aib> the thing is, they say they want the community to take over the extension
[04:00:49] <_aib> so i think that's why they aren't fixing this stuff
[04:01:01] <_aib> why don't you guys put it into your revision control system and maintain it a tiny bit?
[04:01:05] <_aib> it's a great bit of work they've done
[04:01:09] <_aib> ping TimStarling
[04:02:22] <_aib> http://mediawiki.fckeditor.net/
[04:03:19] <_aib> code rot
[04:03:37] <_aib> [[Software rot]]
[04:04:08] <_aib> here's an idea - a new hire whose job is to keep 3rd party extensions safe and working
[04:04:12] <_aib> (not me)
[04:04:27] Here's an idea - learn PHP and fix it yourself. :-)
[04:04:40] zing
[04:04:48] 03(mod) Admins don't see when page creations match against the blacklist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13780 +comment (10mikelifeguard)
[04:05:19] if I'm bribed enough I might look at it :P
[04:05:40] <_aib> oh yeah?
[04:05:48] <_aib> 20 bucks, paypal.
[04:06:26] USD?
[04:06:36] <_aib> is that a bad thing? lol
[04:06:50] Nobody wants 20 rupees or 20 yen.
[04:06:53] how and/or where can I change what <code> does?
[04:06:57] <_aib> USD
[04:06:58] Or (gasp) Aussie money.
[04:07:27] MZMcBride: i could go for some rupees, its dangerous to go alone and i need a weapon from a vendor
[04:08:25] Max_-: <code> is whitelisted html
[04:08:53] you can style it in CSS
[04:09:02] I might throw in another $20. pretty good
[04:09:33] Splarka is it the "code" style?.. in the CSS
[04:09:46] if you mean
<pre> or <code> those are completely different, <code> is inline background-colored monospace, it is similar to say <tt>
[04:09:53] Max_-: no, just <code>
[04:10:02] 	browsers decide how to render it themselves
[04:10:24] 	that doesn't mean you can't style it more, it just means there is no default beyond html specs
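The "you can style it more" suggestion amounts to a few lines of site CSS (e.g. in MediaWiki:Common.css); a minimal sketch, where the color value is an arbitrary example and not the wiki's real one:

```css
/* Style the whitelisted <code> element: background + monospace,
   without changing its display, so it stays inline and does not
   break the surrounding paragraph. */
code {
    background-color: #e6f0fa;  /* light blue, example value */
    font-family: monospace;
}
```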
[04:11:16] 	Max_-: what do you want to do with it?
[04:11:20] 	ohh ok.. well I just want it to stop breaking paragraphs... just have the blue background and monospace
[04:12:52] 	breaking paragraphs? it shouldn't do that...
[04:12:57] 	got an example?
[04:13:05] 	humm sure, wait a sec
[04:13:12] btw, if you actually mean <pre> or <tt> you get stabbed
[04:13:16] *Splarka 	shows you the knife
[04:14:15] no.. I'm really talking about <code>
[04:14:21] 	here what I write
[04:14:22] here is some text <code>this is some codez</code> and some text again
[04:14:24] 	03(mod) New extension to enforce minimum password strength. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16435  +comment (10mikelifeguard)
[04:14:24] 	here's the result
[04:14:28] 	http://morphologix.com/wiki/Sandbox
[04:15:53] That's not <code> ........
[04:15:55] <code>this is some codez</code>
[04:16:09] MZMcBride: it is, but he's got some extension messing it up
[04:16:13] http://morphologix.com/wiki/index.php?title=Sandbox&action=raw&ctype=text/css
[04:16:23] Max_-: well, I can't view your http://morphologix.com/wiki/index.php?title=Special:Version
[04:16:31] but you've got something screwing it up
[04:16:41] <code> shouldn't do that
[04:16:56] but if you're desperate, try <tt> maybe
[04:17:30] Maybe GeSHiCodeTag is messing it...
[04:17:45] other than that I have EmailForm, GoogleAdsense and Lockdown installed
[04:17:51] that sounds likely
[04:18:01] it is probably what is turning <code> into <pre>
[04:18:13] which is kinda silly, you can just use <source>
[04:20:35] I'll try changing the GeSHi parser hook... maybe I'll get <code> restored and I'll use <source> maybe for GeSHi
[04:20:53] 	03(mod) The move reason field should be one line only - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13627  (10mikelifeguard)
[04:21:00] <source> is used by other syntax hilight extensions
[04:21:28] 	ohh.
[04:21:36] 	I guess I won't install two at a time
[04:21:54] 	heh
[04:25:21] 	works now... but .. I agree I will have problems if I install another extension dealing with this...
[04:27:13] 	03(mod) Please configure more global blacklists - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14322  (10mikelifeguard)
[04:27:19] 	<_aib> here's an idea - a new hire whose job is to keep 3rd party extensions safe and working
[04:27:52] 	I'm not sure if wikipedia readers would feel that that is the best use for their donations
[04:28:29] 	IIRC some fund drives allow you to choose where your contribution should go, you should try that one year
[04:28:36] 	yes, it's not exactly related to our core mission, is it?
[04:28:37] 	in this particular case, I imagine wikia will maintain the extension
[04:28:55] 	like say, break down a pie chart of 1/4 to servers, 1/4 to administration, 1/4 to code development, and let the donator choose where the last 1/4 should go, and let people submit suggestions
[04:29:15] 	03(mod) wgSpamRegex addition request: anontalk.com - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16597  (10mikelifeguard)
[04:29:34] 	Splarka: you'd need a new hire to actually make sure targetted donations go where they're supposed to.
[04:29:35] 	but generally, if nobody wants to maintain an extension, it should be left to die
[04:29:41] 	(x) booze and hookers ( ) third party extensions ( ) Jimmy's travel expenses ( ) fighting censorship
[04:30:06] *werdnum 	stabs Splarka.
[04:30:15] 	( ) stabbing Splarka
[04:30:16] 	Jimmy's travel expenses aren't paid by the foundation, AFAIK.
[04:30:35] 	http://en.wikipedia.org/wiki/The_Swimming_Hole#Reception
[04:30:39] 	This is one good article.
[04:31:15] 	oooh, hawt
[04:31:30] 	( ) moar nudes on Commons
[04:32:01] 	( ) less PETA-sponsored porn on Commons
[04:32:14] 	PETA-sponsored porn?
[04:32:59] 	yep...
[04:33:15] 	well, it's not porn
[04:33:20] 	I think
[04:33:24] 	'tasteful nudes in bad taste'
[04:33:25] *Mike_lifeguard 	hasn't seen it
[04:33:45] 	"I'd rather wear nothing at all, than wear fur"
[04:34:02] 	Ooh, sounds fun.
[04:34:10] 	and under no copyright restrictions, and insanely high resolution
[04:34:13] 	"I'd rather fake it." XD
[04:34:18] 	or whatever
[04:34:25] *Mike_lifeguard 	huggles PETA
[04:34:34] 	they are such hippo crates
[04:34:50] 	(that is a pun, since they pretend to like animals, and hippos are animals)
[04:35:29] 	http://tvtropes.org/pmwiki/pmwiki.php/Main/DontExplainTheJoke
[04:36:24] 	here are some of them, werdnum: http://commons.wikimedia.org/wiki/Category:People_for_the_Ethical_Treatment_of_Animals
[04:38:19] *werdnum 	will be fine, thanks.
[04:38:35] 	Jenna Jameson, Pamela Anderson
[04:39:07] 	so all the people who'd go naked anyway.
[04:39:15] 	Even if non-fur was an option.
[04:39:27] 	Eva Mendes, Janine Habeck, Lisa Fitz, Olivia Jones, Paul McCartney
[05:13:40] 	http://www.thelocal.se/16398.html  oh sweden
[05:14:00] 	anybody run a wiki using fastcgi?
[05:14:14] 	Moi
[05:14:45] 	FastCGI+nginx, with beautiful short urls
[05:14:52] 	wtc does it randomly decide to log me in/out, say my password works/doesn't work
[05:15:27] 	mw or php session issue?
[05:15:41] 	prolly PHP, i need to tweak php settings?
[05:16:26] 	Well check out the session path settings... make sure everything is ok with permissions and location... and that the fcgi instances use the same one
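The session advice above boils down to a couple of php.ini directives that every FastCGI instance has to agree on (the path below is a made-up example; whatever directory is used must exist and be writable by the PHP processes):

```ini
; php.ini fragment: identical across all FastCGI instances, otherwise
; requests hitting different backends will see different session stores
; and logins will appear to randomly work and fail.
session.save_handler = files
session.save_path = "/var/lib/php/sessions"
```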
[05:17:12] 	using mod_fcgid
[05:19:57] 	althiough might switch to something like lighttpd or nginx
[05:22:56] 	sessions are getting saved though
[05:37:09] 	03raymond * r44751 10/trunk/extensions/ (4 files in 2 dirs): 
[05:37:09] 	* Add i18n file
[05:37:09] 	* Remove ?> at end of files and vim:set comments
[05:37:10] 	* Add this extension to Translatewiki
[05:46:27] 	Dantman: any tips for switching to nginx :D
[05:47:46] 	Hmmm... ^_^ be prepared for a smaller config file
[05:48:08] 	I also have an init script and mw config you might want
[05:48:16] 	I might possibly :D
[05:48:19] 	And stay AWAY from the apt package
[05:49:26] 	Actually, if you're on a debian/ubuntu like system, I do have a nice shell script that handles everything from downloading, to building, to startingup/shutting down, and finding if it's alive
[05:49:40] 	Heh... you might also want to look into god
[05:49:42] 	yea I do happen to be on ubuntu
[05:49:46] 	*debian
[05:49:50] 	http://god.rubyforge.org/
[05:49:51] 	god eh?
[05:50:06] 	heh... sorry, I'm not religious
[05:50:36] 	:P
[05:50:48] 	clever lol
[05:52:00] 	but yea, pastebin those goodies :)
[05:54:11] 	God uses Ruby? That explains all the evil in the world, then. :p
[05:54:32] 	heh
[05:56:03] 	In reality, though, God uses C. :) http://www.gnu.org/fun/jokes/dna.html
[05:56:48] 	When I find myself in times of trouble, mother mary comes to me.
[05:56:52] 	Speaking words of wisdom, "Write in C"
[05:56:57] 	03raymond * r44752 10/trunk/extensions/ (7 files in 4 dirs): 
[05:56:57] 	Add some description messages
[05:56:57] 	Fix typo in mediawiki-defines.txt
[05:57:25] 	 * 0017-03-12 03:14  1.3  Added extra sex drive to male.h; took code from
[05:57:25] 	 *                        elephant-dna.c
[05:57:53] 	   /* G_spot *g;   Removed for debugging purposes */
[05:57:57] 	OverlordQ, firstly, my nginx shell script. I use ~/nginx to run it... I suggest putting it in /root and simlinking it to your home if you use sudo http://wiki-tools.pastebin.com/d7320e659
[05:58:57] 	I've got a wikimeet to go to
[05:59:00] 	see everybody later :)
[05:59:30] 	I suggest it being in /root cause I've tweaked my own init script to use that script to make things easier with god: /etc/init.d/nginx http://wiki-tools.pastebin.com/d1a576029
[05:59:49] 	http://xkcd.com/224/
[06:04:14] 	OverlordQ, and a cutout of the bits of nginx config: http://wiki-tools.pastebin.com/d5133f2da
[06:05:06] 	My own location line has a bit more than you normally need... I have some action paths for some extensions (SemanticForms and /formedit/), and I'm using action paths
[06:06:28] 	^_^ I load-balance between 3 fastcgi instances, all monitored by god. If something kills one of them, the others pick up the slack till god shortly brings it back to life
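Dantman's actual config lives in the pastebins above; as a rough idea of the shape of an nginx-to-FastCGI setup for MediaWiki (paths, port, and the upstream name are hypothetical):

```nginx
# Several php-cgi backends, load-balanced; if one dies, the others
# pick up the slack until a supervisor (e.g. god) restarts it.
upstream mediawiki_fcgi {
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
}

location ~ \.php$ {
    fastcgi_pass   mediawiki_fcgi;
    fastcgi_index  index.php;
    fastcgi_param  SCRIPT_FILENAME  /var/www/wiki$fastcgi_script_name;
    include        fastcgi_params;
}
```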
[06:07:11] 	O_O
[06:07:52] 	alright lets start with getting nginx running first :D
[06:07:52] 	heh
[06:07:53] 	Latest stable is 0.6.34
[06:08:19] 	~/nginx download 0.6.34; I believe
[06:08:25] 	And then ~/nginx build
[06:09:30] 	Hmmm... note to self... I need to do another server backup sometime
[06:11:07] 	yea, i just reinstalled, figured I'd check out my other httpd options :D
[06:12:06] 	03(mod) User CSS/JS should be parsed as it is rendered, with all wikicode ignored (treated as source/pre) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16683  (10random832)
[06:12:22] 	bad dantman, dont have LSB headers on your init script :P
[06:12:33] 	Hmmm?
[06:13:41] 	dependency based booting 
[06:16:03] 	03(mod) User CSS/JS should be parsed as it is rendered, with all wikicode ignored (treated as source/pre) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16683  (10mikelifeguard)
[06:16:58] 	03(mod) User CSS/JS should be parsed as it is rendered, with all wikicode ignored (treated as source/pre) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16683  +comment (10dan_the_man)
[06:58:25] 	03(mod)  Subpages in MediaWiki namespace should be parsed in respective languages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16669  (10niklas.laxstrom)
[07:20:04] 	hi all, I'm having some problems with references on my wiki
[07:20:06] 	http://wiki.command-q.org/index.php?title=FFMPEG#References
[07:28:54] 	03(mod) User CSS/JS should be parsed as it is rendered, with all wikicode ignored (treated as source/pre) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16683  (10herd)
[07:29:22] 	*sigh* another bug that I'll find when searching for all my comments
[07:29:50] 	(NEW) Shut up Splarka - https://bugzilla.wikimedia.org/show_bug.cgi?id=* (herd)
[07:31:45] 	03(mod) Oversighted edits still linked to in the watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16122  +comment (10jayvdb)
[07:35:27] <_wooz>	lo
[07:52:39] 	03nikerabbit * r44753 10/trunk/extensions/Translate/Stats.php: * Allow passing uselang
[08:13:59] 	03aaron * r44754 10/trunk/extensions/FlaggedRevs/FlaggedRevs.php: Typo
[08:21:59] 	http://wiki.command-q.org/index.php?title=FFMPEG#References
[08:22:03] 	can someone help me with this?
[08:25:12] 	do you have Cite installed?
[08:25:17] 	er
[08:25:40] 	how do I install cite :|
[08:26:16] 	!e Cite | antitab_
[08:26:16] --mwbot--	antitab_: http://www.mediawiki.org/wiki/Extension:Cite
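Installing an extension like Cite in that era of MediaWiki is a matter of unpacking it into extensions/ and adding one line to LocalSettings.php (entry-point file name per the extension page linked above):

```php
# LocalSettings.php: enable the Cite extension after unpacking it
# into $IP/extensions/Cite/
require_once( "$IP/extensions/Cite/Cite.php" );
```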
[08:26:32] 	Whats the magic word to display the time of a time zone
[08:27:14] 	03raymond * r44755 10/trunk/extensions/AjaxTest/AjaxTest.php: Add extension credits
[08:27:49] 	03aaron * r44756 10/trunk/extensions/FlaggedRevs/ (FlaggedRevs.hooks.php FlaggedRevs.php): 
[08:27:49] 	*Permission tweaks
[08:27:49] 	*Break long line
[08:28:35] 	03raymond * r44757 10/trunk/extensions/ (3 files in 3 dirs): Assign more special pages to $wgSpecialPageGroups
[08:40:02] 	Whats the magic word to get local time of a specific timezone?
[08:40:11] 	!magicwords
[08:40:11] --mwbot--	For more information about creating magic words and their inner workings, see . For a list of magic words, please see .
[08:41:31] 	03(NEW) Global watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16695 15enhancement; normal; Wikimedia: Site requests; (kevinjduke)
[08:43:00] 	Prom_cat: is there one? maybe you mean #time, a ParserFunction parser function
[08:43:26] 	!time
[08:43:26] --mwbot--	For help with configuring timezones in MediaWiki, please consult .
[08:43:33] 	dam...
[08:43:53] 	Basically, I want to display the time in GMT +9:30
[08:43:55] 	http://meta.wikimedia.org/wiki/Help:ParserFunctions#.23time:
[08:45:43] 	{{#time:H:i:s|+9 hours +30 minutes}}
[08:54:57] 	where mediawiki images are saved? When we parse it using mwlib? Which db should be used
[09:00:27] 	03raymond * r44758 10/trunk/extensions/ExtensionDistributor/ExtensionDistributor.php: Assign one more special page to $wgSpecialPageGroups
[09:07:58] Hi guys, I'm trying to analyze the code of your project with Fortify SCA, this is a project for an Italian university
[09:08:52] can someone help me figure out if the issues can be a security problem ?
[09:10:36] 	For example, Line 543 of DatabaseOracle.php sends unvalidated data to a web browser, which can result in the browser executing malicious code.
[09:10:49] 	it can be a security problem ?  
[09:11:44] 	if your doing this for a university project, you should already know what is and isn't a security issue
[09:12:06] 	that is the  problem
[09:12:35] Unfortunately your project is too big to analyze without someone who knows it
[09:13:10] if you could be so kind...
[09:16:03] 	03raymond * r44759 10/trunk/extensions/CentralNotice/CentralNotice.php: Delete dupe line
[09:16:48] 	1. Data enters a web application through an untrusted source, most frequently a web request or database.
[09:22:16] 	Hi, any one use mwlib for parsing wikitext. I am successful to parse wikitext but i can't get images? can anyone tell me how to do it?
[09:29:13] 		$memckey = $term_title ? wfMemcKey( 'ajaxsearch', md5( $term_title->getFullText() ) ) : wfMemcKey( 'ajaxsearch', md5( $term ) ); 
[09:29:51] 	md5 can be trusted, even if the function can be break ? 
[09:30:14] 	is there any mwlib user?
[09:31:48] 	Can anyone tell me where the images are stored in wikidb? It use the same db to store wiki images? or different database?
[09:33:56] 	03ialex * r44760 10/trunk/extensions/Configure/Configure.obj.php: 
[09:33:56] 	Fix for r44748: fixed this error:
[09:33:56] 	Notice : Undefined variable: IP in extensions/Configure/Configure.obj.php (line 221)
[09:33:56] 	Also globalized $wgConf to be able to use it in InitialiseSettings.php
[09:34:11] 	Ginovation: files are stored on the file-system, only metadata are in the db
[09:39:08] 	ialex: Thanks, Can u tell me how to  parse image, using mwlib.   
[09:39:58] 	no idea
[09:40:41] 	ialex: ok. Can u suggest , what can help in this?
[09:50:33] 	ialex hi
[09:51:39] 	in User.php on version 1.13.2 I've some security problem whith md5 and sha-1 algorithms.
[09:52:08] 	Can I trust this encryption or not ? 
[09:53:43] 	md5 should be good enough for passwords, especially with the salting scheme used by mediawiki. md5 shouldn't be used for "real secrets" though. sha1 is a bit better, but i don't think it makes much of a difference for this use
[09:53:50] 	read up on the algorithems on wikipedia, if you like
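For illustration, the salting scheme being referred to can be sketched in Python. This is my reading of MediaWiki 1.x's salted "type B" password format, md5(salt + '-' + md5(password)) stored as :B:salt:hash; treat the details as an assumption, and note that neither md5 nor this construction is advisable for new code:

```python
import hashlib

def mw_hash_password(password: str, salt: str) -> str:
    """Sketch of MediaWiki's salted 'type B' password hash:
    md5(salt + '-' + md5(password)), stored as ':B:salt:hash'.
    Illustrative only -- modern code should use a proper KDF."""
    inner = hashlib.md5(password.encode("utf-8")).hexdigest()
    outer = hashlib.md5((salt + "-" + inner).encode("utf-8")).hexdigest()
    return ":B:" + salt + ":" + outer
```

The point of the per-user salt: the same password hashed with two different salts gives two different digests, so a precomputed (rainbow-table) dictionary of plain md5(password) values is useless against the stored hashes.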
[09:58:12] 	03raymond * r44761 10/trunk/extensions/ContributionReporting/ (3 files): 
[09:58:12] 	* Localize some more text
[09:58:12] 	* Use formatnum
[09:58:12] 	* Introduce an own special page group for contribution
[09:59:41] 	03raymond * r44762 10/trunk/extensions/Translate/groups/mediawiki-defines.txt: Follow up r44761: Make message optional
[10:04:29] 	hey, is there any way to run update.php without access to a php binary?
[10:04:39] 	(ie, only accessible via the webserver)
[10:11:46] 	04(REOPENED) meta=siteinfo&siprop= namespaces|namespacealiases should return canonical names as well - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16672  +comment (10daniel)
[10:12:30] 	checkers: you can rerun he web-based installer
[10:12:40] 	!upgrade
[10:12:40] --mwbot--	http://www.mediawiki.org/wiki/Manual:Upgrading
[10:13:36] 	thanks
[10:21:24] 	I want to find all articles that had exactly one edit, how can I do that through SQL or the API?
[10:22:48] 	lch: try [[Special:Fewestrevisions]]
[10:23:18] 	thanks
[10:24:05] 	ironically, this only seems to show pages with 2 revisions and up
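The SQL for "pages with exactly one edit" is a simple GROUP BY over the revision table; a self-contained sketch against a toy sqlite copy of the schema (table and column names follow MediaWiki's page/revision layout, the rows are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE page (page_id INTEGER PRIMARY KEY, page_title TEXT);
    CREATE TABLE revision (rev_id INTEGER PRIMARY KEY, rev_page INTEGER);
    INSERT INTO page VALUES (1, 'Once_edited'), (2, 'Twice_edited');
    INSERT INTO revision (rev_page) VALUES (1), (2), (2);
""")

# Group revisions by page and keep only the groups of size one --
# the single-revision pages that Special:Fewestrevisions seems to skip.
rows = con.execute("""
    SELECT p.page_title
      FROM page p JOIN revision r ON r.rev_page = p.page_id
     GROUP BY p.page_id
    HAVING COUNT(*) = 1
""").fetchall()
print(rows)  # [('Once_edited',)]
```

The same query run against a real wiki's database (or its toolserver replica) would list every never-re-edited page.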
[10:33:23] 	=(  I've followed the Manual:Upgrading to the letter. First as root and now secondly with the webinstaller. But still the same error message when I click on an image: "Fatal error: Call to undefined method UploadForm::usercanreupload() in /var/www/web5/web/wiki/includes/ImagePage.php on line 573" 
[10:33:45] 	1.13.3 is not meant to be for me
[10:36:17] 	odd
[10:40:33] 	and you're sure you've overwritten the files?
[10:40:45] 	yes I am
[10:41:03] 	Special:Version says 1.13.3
[10:41:15] 	and the rest of the wiki is working
[10:43:45] 	Maybe 1.9.3 -> 1.13.3 is a too big gap?
[10:44:18] 	shouldn't
[10:45:42] 	well... I could at least try a lower version...or? What can you propose for version?
[10:47:41] 	*shrug* might wait to see if anybody else has an idea, weird error to get.
[10:47:53] 	03(mod) API parse results differ when JSON callback is used - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16616  (10lupo.bugzilla)
[10:49:06] 	is it unsafe to stay on 1.9.3?
[10:51:36] 	Quakeile: yes. no security patches are made for 1.9.
[10:52:06] 	Quakeile: you should upgrade directly from 1.9 to 1.13. it should not be a problem
[10:53:19] 	thanks Duesentrieb, but I've tried many times to upgrade, still the same error message
[10:54:40] 	Quakeile: same error message while upgrading?
[10:54:50] 	that error doesn't look like it should happen during an upgrade
[10:54:55] 	either it is loading that file from somewhere else, or it is not loaded at all (includes/Autoloaded.php, permissions)
[10:55:20] 	I noticed that the file user.php often use the function mt_rand, do you think that is sure ? Where it takes the seed to generate the random nubers ?
[10:55:49] 	scuffio: mt_rand is not for strong crypto. what are you looking for?
[10:56:27] 	mediawiki does not support strong security. we use hashed passwords, that's about it.
[10:56:34] 	user.php line 1566, I want to know if the mt_rand make it sure ? what do you think ?
[10:56:36] 	we don't even have encrypted login per default
[10:56:55] 	Duesentrieb:  before upgrade: no error message, after upgrade: there is an error message. during upgrade: no error messages 
[10:57:02] 	scuffio: secure enough for what?
[10:58:13] 	Quakeile: ok then. hm... i think i know what your problem could be. at some point, several files got moved from the includes dir into subdirectories. if you have them in both locations now, yit might be using the old version instead of the new
[10:58:16] 	Could anybody exploit that vulnerability ?
[10:58:33] 	understanding the hash ?
[10:58:53] 	Quakeile: make sure to remove all old files. i'd suggest to replace the include and skin directory by the respective dirs from the latest release
[11:05:22] 	I noticed that all the security about the hash is based on the salt,  Do you preview a way (time limit or something else) for limit an ipotethic attacker that want to exploit that information ?
[11:07:00] 	preview = foresee
[11:12:18] 	03(NEW) lijwiki has 3 identical namespacenames - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16696 15enhancement; normal; Wikimedia: Language setup; (bugzilla.wikimedia)
[11:15:59] 	In the line 337 of the file user.php what is the rule of the function selectField ?
[11:17:48] 	scuffio: select the user_id field from the user table
[11:22:18] 	03(FIXED) lijwiki has 3 identical namespacenames - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16696  +comment (10raimond.spekking)
[11:41:01] 	What is the security threat if an hacker break the newFromConfirmation function ?
[11:42:31] 	03siebrand * r44763 10/trunk/phase3/languages/messages/MessagesLij.php: (bug 16696) remove duplicate namespace definitions in MessagesLij.php
[11:42:49] 	03(mod) lijwiki has 3 identical namespacenames - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16696  (10siebrand)
[11:43:53] 	scuffio: user's email will be confirmed, but it won't be logged in
[11:51:12] 	ok so it's not a security problem .... what about the setToken (line 1566 file User.php) ? 
[11:55:37] 	03(FIXED) Localized namespaces for mt.wp - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16373  +comment (10siebrand)
[11:58:30] 	scuffio: tokens are used in persistant connection and are set in cookies
[12:00:03] 	03siebrand * r44764 10/trunk/phase3/languages/messages/MessagesAs.php: 
[12:00:04] 	(bug 15670)-ish: Update fallback from Hindi to Bengali. Assamese and Bengali
[12:00:05] 	share the same script, and the languages are spoken in the same region, albeit
[12:00:06] 	mainly in different countries (India and Bangladesh respectively).
[12:01:08] 	03(mod) Probable namespace change in the Assamese Wikipedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15670  (10siebrand)
[12:01:11] 	03(FIXED) Avoid lengthy repetitions in series of system messages. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15762  (10siebrand)
[12:06:39] 	moin
[12:13:46] 	03(NEW) MediaWiki does not support extended Latin script properly - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16697 04BLOCKER; normal; MediaWiki: User interface; (Gerard.meijssen)
[12:15:35] 	03(NEW) Typing error in namespacename in Cebuano wikipedia? - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16698 15enhancement; normal; Wikimedia: Language setup; (bugzilla.wikimedia)
[12:16:36] 	03ialex * r44765 10/trunk/extensions/Configure/ (5 files): Added support for APCOND_ISIP and APCOND_IPINRANGE conditions of $wgAutopromote
[12:17:44] 	is there a script somewhere to see what users are active (4 a specific period of time; lets say last month/week) on a certain wikipedia?
[12:22:36] 	03(mod) MediaWiki does not support extended Latin script properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697  +comment (10siebrand)
[12:22:54] 	03(mod) MediaWiki does not support extended Latin script properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697  (10siebrand)
[12:27:54] 	03siebrand * r44766 10/trunk/phase3/languages/messages/MessagesCeb.php: (bug 16698) Fixed typo in NS_FILE_TALK
[12:28:01] 	03(FIXED) Typing error in namespacename in Cebuano wikipedia? - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16698  +comment (10siebrand)
[12:30:32] 	The function setCookiePassword  (line 1573 file user.php) simply make the md5 of the password ... Don't you think that is not enough for the security ?
[12:31:40] 	03ialex * r44767 10/trunk/extensions/Configure/Configure.i18n.php: Fix newly added messages, per siebrand
[12:31:43] 	you mean it should be salted?
[12:32:52] 	03(mod) MediaWiki does not support extended Latin script properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697  (10Gerard.meijssen)
[12:33:30] 	scuffio: that function seems to be unused
[12:33:51] 	03(mod) MediaWiki does not support extended Latin script properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697  (10Gerard.meijssen)
[12:33:53] 	even User::$mCookiePassword seems to be unused
[12:39:05] *werdnum 	waves.
[12:39:19] 	hi ialex 
[12:39:25] 	ialex: Configure went live on test.wikipedi
[12:39:26] 	a
[12:39:26] 	hello werdnum 
[12:39:33] 	werdnum: I saw :P
[12:39:45] 	yay!
[12:39:50] 	maybe now I can make my rent in SF :P
[12:40:01] 	:D
[12:40:53] 	werdnum: but how it's installed in CommonSettings.php is a bit strange...
[12:41:38] 	03(mod) MediaWiki does not support extended Latin script properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16697  (10Gerard.meijssen)
[12:41:43] 	I haven't looked yet.
[12:42:10] 	werdnum: simply added at the bottom of the file
[12:42:18] 	yeah I just saw :)
[12:42:25] 	so settings are extracted twice
[12:42:34] 	hopefully a full deployment will involve installing into InitialiseSettings.
[12:42:48] *ialex 	expects it won't be like when live on enwiki
[12:42:52] 	*that
[12:42:54] 	yeah
[12:43:00] 	or servers will crash :P
[12:44:23] 	:D
[12:44:36] 	werdnum: or dispatched across config files ;)
[12:45:17] 	yeah, still been thinking about how to do it properly.
[12:46:05] 	werdnum: maybe by not calling efConfigureSetup() at all and doing it manually
[12:46:18] 	yeah, that's what I was thinking.
[12:46:21] 	(it's what I do on my test wiki)
[12:46:39] 	But DB handler imports settings in Setup.php (it's deferred with a hook)
[12:46:53] 	And some of the settings are relied upon.
[12:47:22] 	03werdna * r44768 10/trunk/extensions/Configure/ (Configure.obj.php SpecialViewConfig.php): Add re-initialisation after Wikimedia hack.
[12:48:14] 	werdnum: that's the problem
[12:48:37] 	(SetupAfterCache hook)
[12:48:57] 	Brion told me to defer to post-Setup.php
[12:49:25] 	but if we're gonna do that, we may as well defer all of CommonSettings.php to then.
[12:52:55] 	werdnum: putting all CommonSettings.php in an extension function o_O
[12:53:53] 	ialex: no, it would have to be SOME of it (DB stuff separate), deferred via a hook that calls require( 'CommonSettings.php' );
[12:55:26] 	werdnum: you'd need something like extract( $GLOBALS )?
[12:55:37] 	ugh that's so disgusting.
[12:55:59] 	:(
[12:56:10] 	srsly
[13:00:57] 	hello. after an update, i've had the "incomplete GD library configuration" message on a thumbnail. I've installed gd since then, and checked that apache finds it (with phpinfo). And i still have the message, even after restarting apache and doing a force reload from my browser.
[13:01:07] 	i guess there's some caching involved, but can't find what/where
[13:01:19] 	Do you have any advice?
[13:02:52] 	werdnum: and Special:ViewConfig is restricted to sysop :(
[13:02:57] *ialex 	can't see anything
[13:03:12] *werdnum 	can.
[13:03:16] 	Aren't you a sysop on test?
[13:03:19] 	werdnum: no
[13:03:20] 	there's an empty ./d/dc/ directory in thumbs/
[13:03:36] 	ialex: check again.
[13:04:27] 	werdnum: thank you! :)
[13:04:47] 	But it's messed up
[13:04:57] 	your code assumes that settings = array() means no settings at all.
[13:04:57] 	yeah
[13:05:06] 	Whereas in reality, it means no changes from defaults.
[13:15:18] 	Can nobody give me a hint? gd is installed and working, including png support, no error in logs, apache restarted, and mediawiki still displays "Incomplete GD library configuration: missing function imagecreatefrompng" for one image, not others.
[13:21:01] 	nothing on google that can find this. everything found is related to 'install gd'
[13:21:18] 	nothing on various faq
[13:47:05] 	orzel: tried deleting and reuploading the image?
[14:00:28] 	14(DUP) Whatlinkshere selection "redirects" is misleading - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16218  +comment (10cbm)
[14:00:29] 	03(mod) Multiple rows created in pagelinks table for a redirect page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=7304  +comment (10cbm)
[14:07:34] 	p858snake|sleep: i can't ask my users to check/change all images
[14:25:27] 	orzel: didn't you say it just happens with one image?
[14:33:38] 	 Quakeile: ok then. hm... i think i know what your problem could be. at some point, several files got moved from the includes dir into subdirectories. if you have them in both locations now, it might be using the old version instead of the new - I think you are correct Duesentrieb. Eg: I have SpecialAllmessages.php in both /includes and in /includes/specials
[14:34:13] 	 Quakeile: make sure to remove all old files. i'd suggest to replace the include and skin directory by the respective dirs from the latest release
[14:34:36] 	I will, brb
[14:44:32] 	hahaha, Encyclopedia Dramatica won the second Annual Mashable Open Web Awards for the wiki category
[14:45:08] 	O_o
[14:46:03] 	it beat wikipedia?
[14:46:34] 	Runner-up was WikiHow
[14:46:41] 	wow, that's even sadder lol
[14:46:46] 	http://mashable.com/2008/12/16/open-web-awards-2-winners/
[14:47:46] 	or rather http://mashable.com/openwebawards/the-winners/
[15:00:21] 	03catrope * r44769 10/trunk/phase3/includes/api/ApiQuerySiteinfo.php: API: Fix up r44676: convert underscores to spaces in canonical names.
[15:13:10] 	03(FIXED) meta=siteinfo&siprop= namespaces|namespacealiases should return canonical names as well - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16672  +comment (10roan.kattouw)
[15:26:16] 	Duesentrieb: i had the problem on one image on one page, but can't check them all. on this same page, others were ok.
[15:26:32] 	Duesentrieb: i've updated mediawiki, it seemed to be the problem.. it's now fixed.
[16:28:12] 	03(mod) Global watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16695  +comment (10mikelifeguard)
[16:29:27] 	03jhsoby * r44770 10/trunk/tools/planet/gmq/ (3 files in 2 dirs): Changing index.html to portal, adding Faroese translation
[16:30:07] 	hi,can any one point me to a good doc about adding google adsense in version 1.9.6
[16:30:17] 	http://paulgu.com/wiki/Google_AdSense
[16:30:24] 	the above link is not working for me
[16:30:29] 	i will try it again
[17:24:19] <^demon>	Morning brion.
[17:25:59] 	heheheh http://tinyurl.com/3tqyy3
[17:26:12] 	03(NEW) {{#language:... }} should support variable language **output** as well as input - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16699 15enhancement; normal; MediaWiki: Internationalization; (happy_melon)
[17:26:23] <^demon>	Haha.
[17:26:42] 	http://www.mofahaimages.com/b3ta/IEupdate_firefox.gif   wth was the point of the tinyurl for that?
[17:30:18] 	03(mod) {{#language:... }} should support variable language **output** as well as input - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16699  +comment (10brion)
[17:30:34] 	03(mod) {{#language:... }} should support variable language **output** as well as input (CLDR) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16699  summary (10brion)
[17:33:03] 	"someone pointed out that this content is being created"  s/someone/Nikerabbit
[17:34:15] 	the main problem I foresee with two-dimensional #language is what to do on failure, there are 3 obvious choices but none perfect...
[17:34:53] 	translation (if exists) > content language > english, translation > english, translation > failure message
[17:35:26] 	03(mod) Upload filter insufficient to stop XSS - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15895  +comment (10innocentkiller)
[17:35:29] 	Splarka, translation > native name?
[17:36:29] 	that is a fourth possibility yah
[17:36:55] 	but really, if you wanted that, you can just do it with a single parameter
[17:37:31] 	Nikerabbit said the CLDR extension had a configurable fallback, maybe a third parameter
[17:37:48] 	er, maybe a third for #language, that is
[17:38:07] 	letting you choose: english, content language > english, native, content language > native, or error
[17:38:24] 	( so #iferror can catch it)
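(The fallback orders Splarka lists above can be sketched in a few lines. A hypothetical Python illustration: the name tables and function are invented stand-ins for CLDR / Names.php data, not real MediaWiki code.)

```python
# Hypothetical name tables; real data would come from CLDR or Names.php.
NAMES = {
    "de": {"fr": "Französisch", "en": "Englisch"},
    "en": {"fr": "French", "de": "German"},
}
NATIVE = {"fr": "français", "de": "Deutsch", "en": "English"}

def language_name(code, in_lang, content_lang="en"):
    """Name of language `code` rendered in `in_lang`, falling back
    translation > content language > English > native name > error,
    roughly one of the chains discussed above."""
    for lang in (in_lang, content_lang, "en"):
        name = NAMES.get(lang, {}).get(code)
        if name:
            return name
    # Native name as a last resort, then an #iferror-catchable marker.
    return NATIVE.get(code, '<strong class="error">unknown language</strong>')

print(language_name("fr", "de"))  # direct translation: Französisch
print(language_name("fr", "nl"))  # no Dutch table here, falls back: French
print(language_name("xx", "de"))  # unknown code: error marker
```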
[17:39:25] 	03(mod) Upload filter insufficient to stop XSS - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15895  +comment (10brion)
[17:42:48] 	03(mod) Support X-Content-Type-Options: nosniff for IE 8 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15461  (10innocentkiller)
[17:42:58] 	03(mod) Support X-Content-Type-Options: nosniff for IE 8 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15461   +need-review +patch (10innocentkiller)
[17:44:44] 	Splarka: ?
[17:45:07] 	03(mod) Support X-Content-Type-Options: nosniff for IE 8 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15461  +comment (10innocentkiller)
[17:45:14] 	Nikerabbit: latest bug, 16699 
[17:45:46] 	14(DUP) lineStart invoked from {{PAGENAME}} breaks messages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13378  +comment (10mormegil)
[17:45:46] 	03(mod)  Newline added to template and parser function result breaks parsing - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12974  +comment (10mormegil)
[17:46:01] 	yes?
[17:46:32] 	br*on indicated possibly positive-leaning ambivalence to integrating cldr into #language
[17:47:07] 	what do you think the default fallback order should be though? native, content, or english? 
[17:48:59] 	Splarka: oh
[17:49:15] 	Splarka: well, #languagename from I18nTags almost does it already
[17:50:11] *Splarka 	nods
[17:50:37] 	does it require the cldr extension, or is it a parallel extension from the same data?
[17:51:03] 	Splarka: it uses cldr if available
[17:51:09] 	03nikerabbit * r44771 10/trunk/extensions/Translate/ (Stats.php _autoload.php utils/Font.php): * Use fontconfig if available
[17:52:27] 	Nikerabbit: how would you integrate it into #language as core, hypothetically? Would you create something like /languages/NamesExtended.php (or stick it into Names.php)?
[17:52:51] 	Splarka: just copy the code from I18nTags
[17:52:55] 	Splarka: Or add messages for them?
[17:52:58] 	it already has the logic
[17:53:02] 	RoanKattouw: NO!
[17:53:05] 	ouch
[17:53:15] 	messages = messyages
[17:53:25] 	03rainman * r44772 10/trunk/extensions/MWSearch/ (MWSearch.php MWSearch_body.php): 
[17:53:25] 	* add custom search timeout (defaults to 6s)
[17:53:26] 	* convert all namespaces to all: prefix for the backend
	* suppress redirect checkbox in special:search
[17:53:34] 	we are not going to put those for translators
[17:53:42] 	Nikerabbit: right, but.. extensions aren't the same as core, asking _where_ you'd put them in /languages
[17:54:22] 	(or even if you'd put them there, as you said yesterday, you wouldn't ask for translations but direct translators to unicode.org)
[17:54:32] 	Splarka: I'm not, people should just install the cldr extension if they want all possible alternatives
[17:54:57] 	otherwise they just get what mediawiki has, i.e. native names
[17:54:58] 	so you think it shouldn't ever be core?
[17:55:31] 	Splarka: well, that's not for me to decide
[17:55:40] 	brion: per -tech, [[:File:]] and [[:Image:]] links don't seem to be registered in the link or image tables, unless the image is a redlink
[17:55:44] 	but it should be read only data
[17:55:54] 	write, er, right
[17:56:08] 	Splarka: That sounds an awful lot like the inconsistent registration of [[Media:]] and [[Special:]] links
[17:56:22] 	Roan: That's exactly what I thought
[17:56:47] 	what revision exactly, started registering links to [[Special:Nosuchspecialpage]] etc?
[17:56:58] 	I don't know
[17:57:17] 	So track down what piece of the code registers them and annotate it ;)
[17:57:20] 	I'm trying to read titles directly from the _page table in the database and im running into some encoding issues. If you log in to wikibon.org with u: Hen p: temppass and go to the "Home" tab you will see what i am talking about in the community activity list... how do I fix this?
[17:57:42] 	AphelionZ: Ever heard of the API
[17:57:43] 	?
[17:57:48] 	http://www.mediawiki.org/wiki/API
[18:01:21] 	03rainman * r44773 10/trunk/phase3/ (2 files in 2 dirs): (log message trimmed)
[18:01:21] 	New search UI tweaks:
[18:01:21] 	* Show namespace checkboxes only for advanced search. Showing these for "Project Pages" searches
[18:01:21] 	does provide information for advanced users, but clutters UI for everyone else. User should be able to
	refer to help pages to find out which namespaces are in "Project Pages" and these are also in tooltip
[18:01:23] 	* put "did you mean.." inside the fieldset for increased visibility
[18:01:25] 	* missing 'advanced' req param was making advanced form behave strangely, fixed
[18:06:23] 	RoanKattouw: yes I'm using that extensively in the site
[18:06:49] 	I just need to know what i need to do to properly read the titles from the database
[18:08:23] 	03(mod) Support X-Content-Type-Options: nosniff for IE 8 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15461  +comment (10brion)
[18:16:22] 	03simetrical * r44774 10/trunk/phase3/ (CREDITS RELEASE-NOTES skins/common/wikibits.js): 
[18:16:22] 	(bug 16459) Use native getElementsByClassName
[18:16:22] 	Patch by Derk-Jan Hartman. Seems not to break table sorting in Firefox
[18:16:22] 	3, and he's tested it in other browsers, including IE6. (Of course it
[18:16:22] 	falls back to ordinary JS if the native version is unavailable.)
[18:16:42] 	03(FIXED) user native getElementsByClassName - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16459  +comment (10Simetrical+wikibugs)
[18:16:52] 	uuh
[18:17:18] 	?
[18:18:41] 	AaronSchulz: Any status on https://bugzilla.wikimedia.org/show_bug.cgi?id=15814 ? :-)
[18:19:41] 	03(NEW) Nesting templates lead to excess whitespace - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16700 normal; normal; MediaWiki: Page rendering; (mormegil)
[18:21:01] 	03(mod) Nesting templates lead to excess whitespace - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16700  (10mormegil)
[18:21:15] 	03(mod) Nesting templates lead to excess whitespace - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16700   +patch (10mormegil)
[18:22:52] 	Simetrical: cool
[18:26:36] 	hi Splarka
[18:27:11] 	rar
[18:35:53] 	03(mod) Upload filter insufficient to stop XSS - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15895  +comment (10abarth-wikimedia)
[18:44:45] 	03(mod) Page number attribute for  tags - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13127  +comment (10duncan.hill1)
[18:46:47] 	03(mod) Page number attribute for  tags - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13127  +comment (10happy_melon)
[18:48:50] 	Do you know of a way to extract data from wiktionary? Maybe by xml or so?
[18:50:06] 	hopeless if you ask me, but some are doing it anyway
[18:53:17] 	peleg: you can write parsers to get info out of the xml dumps. it's going to vary according to the particular wikt
[18:53:55] 	atglenn, according to the particular wikt, but not according to the specific definition in that wikt, right?
[18:54:02] 	well
[18:54:09] 	formats usually are a bit fluid
[18:54:19] 	:-)
[18:54:26] 	so when a change is instituted in format, it's not necessarily applied across the board to all pages by a bot
[18:54:37] 	because some changes aren't easy to do by bot. 
[18:54:54] 	the short answer is that with some hacking you can probably pull about 90% without a ton of work
[18:54:56] 	Do you think wiktionary's community thinks about making a useful non-fluid database that will be easily integrated with desktop tools, for example?
[18:55:01] 	it's the last 10% that will kill you
[18:55:12] 	people are always thinking about this
[18:55:18] 	and?
[18:55:32] 	it's why I and ht and cirwin and pologlot all have parsing tools for various things (and that's just from a couple of wikts)
[18:55:50] 	I will be back in a moment (reboot, new drivers)
[18:55:56] 	k
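(The parsing atglenn describes above is necessarily ad hoc, but the skeleton is always the same: stream pages out of the XML dump, then apply per-wikt heuristics to the wikitext. A minimal Python sketch; the sample XML is a made-up stand-in, and the real export format carries an XML namespace this omits.)

```python
import io
import xml.etree.ElementTree as ET

# A tiny stand-in for a wiktionary XML dump (invented sample data;
# real dumps use the MediaWiki export namespace, omitted here).
SAMPLE = b"""<mediawiki>
  <page><title>cat</title><revision><text>==English==
===Noun===
# a small domesticated felid</text></revision></page>
  <page><title>dog</title><revision><text>==English==
===Noun===
# a domesticated canid</text></revision></page>
</mediawiki>"""

def iter_entries(stream):
    # iterparse streams the file, so a multi-GB dump never has to
    # fit in memory; clear() frees each page element as we go.
    for _, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "page":
            title = elem.findtext("title")
            text = elem.findtext("revision/text") or ""
            yield title, text
            elem.clear()

entries = dict(iter_entries(io.BytesIO(SAMPLE)))
print(sorted(entries))                  # ['cat', 'dog']
print("===Noun===" in entries["cat"])   # True
```

The per-wikt "last 10%" atglenn mentions lives in what you do with each `text` blob after this point.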
[19:12:26] 	03(mod) Updating searchindex locks up database - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16664  +comment (10brion)
[19:13:34] 	Simetrical: $wgStyleVersion++ ;)
[19:14:30] 	ialex: it is a very passive change, probably not like, critical to be bumped
[19:16:09] 	Splarka: yeah, but it doesn't forbid to bump it ;)
[19:19:13] 	baa
[19:19:14] 	hmm am I the only one seeing 6 log entries in "block log" for today in http://test.wikipedia.org/wiki/Special:RecentChanges ?
[19:19:42] 	(with enhanced changes list, of course)
[19:19:43] 	imho wiktionarys are just useless, the data is useless for any kind of automatic processing
[19:22:00] 	why is enhanced "of course"?
[19:22:12] 	do you imagine everyone uses it, ialex? ^_^
[19:22:24] 	Splarka: for entries grouping ;)
[19:22:28] 	mau
[19:22:37] 	Splarka: nope, everybody uses clean changes
[19:22:51] *ialex 	slaps Nikerabbit :D
[19:23:18] 	(there needs to be a uri parameter for enhanced, damnit)
[19:23:42] 	Splarka: cleanchanges provides that too :)
[19:24:19] 	ialex: whoa, looks buuuuuuuuuggy
[19:24:23] 	MZMcBride's fault
[19:24:36] 	:O
[19:24:41] *MZMcBride 	did nothing!
[19:24:45] 	I wonder if the enhanced RC grouper/parser is just stupid
[19:24:51] 	(Except hide a block log entry.)
[19:24:57] 	someone take screenshot
[19:25:02] 	it sees MZM changing visibility of the block log, and assumes they're all block logs
[19:25:40] 	^maybe it
[19:25:42] *MZMcBride 	volunteers Splarka.
[19:25:44] *MZMcBride 	hides.
[19:26:14] 	http://test.wikipedia.org/wiki/File:Mzmfault.gif
[19:27:31] 	Should upload it as an attachment to the bug you file. :P
[19:27:51] 	eek, java RC
[19:27:52] 	George Bush doesn't care about enhanced recent changes
[19:28:03] *Splarka 	sighs, gets out the knife, educates Charitwo
[19:28:14] 	*carve into forehead* java != javascript
[19:28:25] 	wiki != wikipedia
[19:28:33] 	have care with abbreviation
[19:28:50] 	!rar
[19:28:50] --mwbot--	rar
[19:29:24] 	lol
[19:39:00] 	03(NEW) meta settings - main page in user lang - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16701 15enhancement; normal; Wikimedia: Site requests; (danny.b)
[19:46:14] 	03ialex * r44775 10/trunk/phase3/includes/ChangesList.php: 
[19:46:14] 	Moving check for "$rc_type == RC_LOG" before check for special pages since log
	entries can now have a target in NS_SPECIAL, this happens for log item deletion
[19:46:14] 	(rev_deleted stuff) and thus breaking recent changes display, (see the current
[19:46:14] 	version of http://test.wikipedia.org/wiki/Special:RecentChanges, where the
[19:46:16] 	deletion log appears as "block log" since the last entry is hiding an item from
[19:46:18] 	the block log)
[19:49:58] 	03(mod) Add Commons as an import source for Meta - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16665  (10brion)
[19:52:02] 	03(mod) {{FILEPATH:{{PAGENAME}} }} doesn' t work for filenames containing characters that get escaped to HTML entities - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16474  +comment (10brion)
[19:52:03] 	03(mod) {{#ifexist}} does not recognise URL encoded filenames - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14779  (10brion)
[19:54:12] 	03(FIXED) Using Linker::makeKnownLink("interwiki:Page#anchor") gives repeated anchor - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6158  +comment (10brion)
[20:01:52] 	TimStarling you around?
[20:06:05] 	I see under the API docs that I can edit/add page content, but can I attach files with it yet?
[20:10:01] 	brion you around?
[20:11:47] 	oy
[20:12:36] 	yay
[20:12:38] 	okay
[20:12:49] 	brion
[20:12:52] 	do you recall
[20:12:52] 	http://en.wikipedia.org/wiki/User:Brion_VIBBER/Cool_Cat_incident_report#Targeted_log
[20:14:31] 	brion I was wondering if you could use a similar method in checking out these:
[20:14:31] 	http://en.wikipedia.org/wiki/Wikipedia:Requests_for_arbitration/Scientology/Evidence#Update:_Additional_confirmations_in_COFS_checkuser_case
[20:14:48] 	comphappy: https://bugzilla.wikimedia.org/show_bug.cgi?id=15227
[20:15:01] 	White_Cat: anything of that sort should get run through the oversight group first
[20:15:07] 	er checkuser
[20:15:23] 	brion a checkuser isnt helping in this case because these people claim they are using the same public computers
[20:15:26] 	if they can't do it directly there, we can escalate as necessary to dig in for detail checks
[20:15:48] 	brion they did
[20:15:51] 	and IPs match
[20:15:56] 	then there's not really any check that can be done short of staking out the computer and stalking everyone who uses it
[20:15:58] 	but the people claim thats merely coincidental
[20:15:59] 	which i don't recommend
[20:16:08] 	this is creating some level of confusion
[20:16:45] 	http://en.wikipedia.org/wiki/User:White_Cat/CSI/Scientology#Shutterbug.C2.A0.28talk.C2.A0.C2.B7_contribs.29
[20:17:57] *MZMcBride 	imagines CheckUsers in camo.
[20:18:44] 	brion checkusers cannot handle this alone
[20:18:53] 	all they can get is a lot of "possibles"
[20:19:18] 	and all of these hordes of possible socks claim to be using likeminded but independent accounts
[20:19:33] 	and arbcom is seemingly buying it :/
[20:21:13] 	brion maybe you should try talking the arbcom over this, maybe they'd like it
[20:21:16] 	This doesn't really seem like a #mediawiki issue...
[20:22:24] 	03ialex * r44776 10/trunk/extensions/Configure/Configure.i18n.php: shame on me! This is CIDR and CDIR :(
[20:22:37] 	ialex, there's not really any point in gratuitously slowing down a page load for thousands of people by invalidating the cached copy of wikibits unnecessarily.
[20:23:17] 	White_Cat, so why do you think brion will be able to figure out anything more definitive than the checkusers?
[20:23:31] 	Simetrical brion can preform magic
[20:23:31] 	oops, forgot "not" in my commit message
[20:23:34] 	not likely to be anything i can do specific there :)
[20:23:42] 	White_Cat, no he can't.  :)
[20:23:44] *brion-lunch 	wnaders off for a bit
[20:23:50] 	brion-lunch I dont know they claim to be using their own laptops
[20:24:11] 	therefore Token shouldnt match
[20:24:37] 	or cookies
[20:24:39] 	I dont know
[20:24:51] 	you know the technical ids better than me and Simetrical combined
[20:25:18] 	I'm pretty sure I know the relevant info about as well as Brion.  Can't speak for you.
[20:25:56] 	Things like cookies are presumably stored in a form visible to checkusers if they're stored anywhere at all.
[20:26:15] 	If the checkusers think they could use extra info, I'm sure they're capable of asking the appropriate people.
[20:27:04] 	Simetrical I dont know any
[20:27:07] 	:P
[20:27:33] 	Simetrical see: http://en.wikipedia.org/wiki/User:Brion_VIBBER/Cool_Cat_incident_report
[20:27:55] 	in that case brion made some magic by storing info that is not normally stored on servers
[20:28:25] 	I guess I should convince arbcom first
[20:28:35] 	query FT2
[20:28:36] 	ack
[20:30:25] 	If two people are using the same public computer, there's no technical measure that will tell them apart.
[20:30:38] 	Assuming the same browser, user, etc. (which is likely).
[20:30:52] 	is there anything like HNP that can be used with a 1.12 or 1.13 version?
[20:32:07] 	Simetrical they claim they are using the same wireless hub
[20:32:12] 	but not the same computer
[20:32:29] 	mmcgrath, HNP?
[20:32:53] 	"I am working from two locations, one on the East Coast, one on the West Coast. In between I am logging in from airports or internet cafes. When using wireless I am going through a VPN/SSL connection (or something like that, hub, proxy, maybe there are different names for this). The idea is that the wireless line can be hijacked and using a SSL connection helps preventing that."
[20:33:01] 	White_Cat, then you could possibly distinguish them by OS or browser, using User-Agent and related things.  But CheckUsers have access to this, I believe.
[20:33:08] 	03(FIXED) Add Commons as an import source for Meta - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16665  +comment (10rhalsell)
[20:33:16] 	And it only works if their OS/browser happen to be different.
[20:33:24] 	Simetrical the thing is
[20:33:27] 	"Firefox 3 on Vista" is not very useful.
[20:33:40] 	last time brion managed to identify the Marmot IP spoofing by using a variety of methods
[20:33:50] 	among them was cookie matching
[20:33:53] 	That was different.  There he was exploiting a bug in MediaWiki's X-Forwarded-For handling.
[20:33:54] 	Simetrical: the ACL stuff. 
[20:34:06] 	Simetrical right
[20:34:10] 	I need to replace it ASAP with something that can do ACL's on the wiki because some people who use our wiki think its a CMS.
[20:34:12] 	mmcgrath, ACLs for what?
[20:34:19] 	but brion figured out it was him using cookies
[20:34:22] 	MW doesn't really support ACLs.
[20:34:24] 	page editing.
[20:34:27] 	White_Cat, cookies can be easily forged.
[20:34:32] 	it doesn't, HNP was a plugin that provided that.
[20:34:34] 	If you felt like it.
[20:34:40] 	Simetrical right
[20:34:47] 	but I doubt these guys are that smart
[20:34:48] 	Not even with difficulty.
[20:35:09] 	Marmot wasn't that smart.
[20:35:12] 	Log out, log in as another account, AFAIK cookies should be completely cleared.  (I might be wrong on that.)
[20:35:24] 	Simetrical didnt help marmot :)
[20:35:44] 	mmcgrath, I don't know.  I only know about the core software and some of the extensions used by Wikimedia, not third-party stuff.
	"The cookies appear to declare the attacker as a logged-out User:MARMOT."
[20:36:09] 	I'll get to some googling.
[20:36:15] 	03aaron * r44777 10/trunk/phase3/includes/specials/SpecialSearch.php: Fix odd syntax that made XHTML error
[20:36:20] 	 >:(
[20:37:07] 	Yeah, we store identifying cookies even when logged-out, it seems.
[20:37:12] 	I don't know if checkusers have access to those.
[20:37:35] 	And I don't know if they persist if you log in as a sockpuppet.
[20:37:43] 	MARMOT wasn't logged in when he was impersonating you, right?
[20:38:42] 	No, cu_changes doesn't store cookies.
[20:38:48] 	03siebrand * r44778 10/trunk/phase3/languages/messages/ (25 files): Localisation updates for core messages from Betawiki (2008-12-18 21:28 CET)
[20:38:50] 	moin
[20:38:53] 	Might be useful if it did, although you'd want to obfuscate them a bit, perhaps.
[20:39:10] *Simetrical 	shrugs
[20:39:28] 	Simetrical now we are talking about the same thing :)
[20:40:03] 	Simetrical if brion (or anyone with root access) made the necessary changes to store the data in question it could help with the arbcom case :)
[20:40:27] 	The ArbCom could ask that if it felt it would be useful.
[20:40:40] 	arbcom doesnt know about this
[20:40:50] 	I am going to make sure they are informed
[20:40:56] 	I doubt there's much benefit now that the user(s) has/have been alerted to the fact that they're under suspicion.
[20:40:58] 	but my technical knowledge is limited
[20:41:12] 	so I am not sure how to explain it to arbcom
[20:44:37] 	03siebrand * r44779 10/trunk/extensions/ (109 files in 101 dirs): Localisation updates for extension messages from Betawiki (2008-12-18 21:28 CET)
[20:46:44] <___wander___>	any complete pdf manual for mediawiki ????
[20:47:27] 	brion-lunch: Some time ago I ran an EXPLAIN on the query that the API's list=watchlist does, and found out that it filesorts the result set. So I looked at Special:Watchlist's query, and to my great amazement that one filesorts its result set too; also, it doesn't use a LIMIT in default mode (show one revision per page), only in allrev mode (show all revisions of all watched pages in the...
[20:47:29] 	...given timeframe). Are these issues known to you or should I open a bug?
[20:47:33] 	03(NEW) Creation of bilderberg-l - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16702 15enhancement; normal; Wikimedia: Mailing lists; (wikipedia)
[20:47:46] 	___wander___: Not that I know of. There's pretty extensive docs on www.mediawiki.org though
[20:48:25] 	extensive? lol
[20:48:38] 	Well....
[20:49:03] 	Oh wait I'm confused. The *API* is documented pretty extensively there, the rest of MediaWiki isn't :P
[20:49:15] 	Anybody ever tried to run mediawiki through a load balancer that does ssl-offload? I can't seem to figure out how to get mediawiki to stop redirecting me to http .. I've set  $wgProto, $wgServer, $wgServerName, and $wgCookieSecure appropriately, but it's still doing the redirect
[20:49:24] 	i've even tried hacking through the PHP code, but to no avail
[20:49:32] 	RoanKattouw, filesorts are not *necessarily* a problem.  It depends on how large the set being filesorted is.
[20:49:40] 	___wander___: https://bugzilla.wikimedia.org/show_bug.cgi?id=1
[20:49:47] 	doh
[20:50:00] 	Simetrical: How about all pages on your watchlist? Or, worse, all changes to those pages in the last 72 hrs
[20:50:26] 	That's not worse, it's possibly better.
[20:50:38] 	In the typical case, it will be a very small number of rows.
[20:50:41] 	03aaron * r44780 10/trunk/phase3/skins/Standard.php: Remove some less-useful redundant links from sidebar
[20:50:45] 	Simetrical: No. The result set is bigger and it'll still be filesorted
[20:50:59] 	Which result set is bigger than what?
[20:51:02] 	In the former case it's probably a small number, yes. In the latter case...
[20:51:27] 	Oh I see I misrepresented: the former case doesn't list all pages on your watchlist, but only those that have changed in the past 72 hrs.
[20:51:49] 	The latter lists all those changes separately, so is at least as large and possibly larger (by definition)
[20:52:48] 	So in the former case you've usually got a small result set, but in the latter case you won't have if you happen to be watching an often-edited page
[20:53:08] 	Oh?  What about pages that haven't changed in the last 72 hours?
[20:53:09] 	anybody on my redirecting issue?
[20:53:33] 	cronos-: $wgArticlePath OK?
[20:53:35] 	I have a couple hundred pages on my watchlist, only twenty or thirty usually change every three days.
[20:54:10] 	It's likely not many rows being sorted in most cases.
[20:54:30] 	There's a reason that there's a covering index for watchlists, anyway, with a dedicated slave.
[20:54:47] 	Covering index?
[20:54:47] 	Well, not dedicated to only that, I mean.
[20:55:13] 	hi, i'm looking for the actual code that defines what gets counted as an article vs a page, i found a place in the api that calls SiteStats::articles() but no idea where to look for that
[20:55:18] 	At least on Wikimedia there is, AFAIK, yeah.
[20:55:21] 	Simetrical: You seem to be right:         $this->selectNamedDB('watchlist', DB_SLAVE, 'watchlist');
[20:55:29] 	Yup.
[20:55:34] 	!class SiteStats | uberfuzzy 
[20:55:34] --mwbot--	uberfuzzy: See http://svn.wikimedia.org/doc/classSiteStats.html
[20:55:43] 	That slave has a special index on recentchanges to service the query, IIRC.
[20:55:48] 	RoanKattouw: $wgArticlePath is the default of false... I guess you could think of the load balancer as a "SSL proxy".. you talk SSL to the load balancer, it talks HTTP to the mediawiki server, then you get your response back in SSL.. but, mediawiki is trying to be smart and realizes that the request came in on HTTP, so tries to build all the links as HTTP no matter what I have the settings to be..
[20:55:49] 	In point of fact, it might not filesort with that index in place.
[20:55:55] 	uberfuzzy: Do you want to know what the criteria are?
[20:55:55] 	thanks RoanKattouw 
[20:56:03] 	sure
[20:56:12] 	Simetrical: Yeah, but I can't check that since I don't know what that index is :P
[20:56:19] 	uberfuzzy, there are different criteria in different places, actually, I think.  :D
[20:56:31] 	It's kind of buggy, I wouldn't assume it's sane.
[20:56:50] 	figures
[20:57:51] 	Simetrical, uberfuzzy: Article::isCountable() 
[20:58:18] 	return $this->mTitle->isContentPage() && !$this->isRedirect($text) && in_string($token,$text);
	Where $token is '[[' (or ',' if $wgCommaCount is true)
[20:58:38] 	RoanKattouw, yeah, but look at maintenance/initStats.php.  It doesn't call that function, does it?
[20:58:51] 	so, is in a content namespace, is not a redirect, and has atleast 1 outgoing link
[20:58:54] 	$good  = $dbr->selectField( 'page', 'COUNT(*)', array( 'page_namespace' => $wgContentNamespaces, 'page_is_redirect' => 0, 'page_len > 0' ), __METHOD__ );
[20:59:28] 	03(NEW) Allow everybody to view patrolled pages on eswiki - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=16703 15enhancement; normal; Wikimedia: Site requests; (Platonides)
[20:59:49] 	uberfuzzy: Pretty much. Actually, containing [[ is enough, which doesn't necessarily mean it has an outgoing link :P
[20:59:56] 	That version also requires that page_len > 0, which probably fails for old articles, and . . . wait, isn't there supposed to be a join to pagelinks there?
[20:59:59] 	Simetrical: Yeah, just saw that, it skips the [[ criterion
[21:00:05] 	There should be
[21:00:15] 	Oh right page_len isn't filled for very old pages :S
[21:00:33] 	Well a JOIN on pagelinks could replace that: empty pages can't have outgoing links :P
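(The three competing article-count criteria being compared above can be put side by side. A Python approximation for illustration only; the real logic is PHP in Article::isCountable() and the maintenance scripts, and the function names here are invented.)

```python
CONTENT_NAMESPACES = {0}  # stand-in for $wgContentNamespaces

def is_countable(namespace, is_redirect, text, comma_count=False):
    """Approximates Article::isCountable(): content namespace,
    not a redirect, and containing '[[' (or ',' if $wgCommaCount)."""
    token = "," if comma_count else "[["
    return namespace in CONTENT_NAMESPACES and not is_redirect and token in text

def initstats_countable(namespace, is_redirect, page_len):
    """Approximates initStats.php: it checks page_len > 0 instead of
    the link token, so the two counts can legitimately disagree."""
    return namespace in CONTENT_NAMESPACES and not is_redirect and page_len > 0

stub = "Just text, no links."
article = "See [[other page]]."
print(is_countable(0, False, article))           # True
print(is_countable(0, False, stub))              # False: no '[['
print(initstats_countable(0, False, len(stub)))  # True: disagrees
```

This is exactly the inconsistency noted above: a linkless stub is countable to initStats.php but not to isCountable(), before even getting to updateArticleCount.php's pagelinks join.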
[21:01:54] 	Simetrical: Do you know or can you find out what that extra watchlist index is?
[21:01:57] 	bit of an assumption
[21:02:03] 	(i.e. on which fields it is)
[21:02:12] 	Oh, and even better, we *also* have maintenance/updateArticleCount.php, which uses still a *different* method.
[21:02:13] 			return "SELECT COUNT(DISTINCT page_namespace, page_title) AS pagecount " .
[21:02:13] 				"FROM $page, $pagelinks " .
[21:02:13] 				"WHERE pl_from=page_id and page_namespace IN ( $nsset ) " .
[21:02:13] 				"AND page_is_redirect = 0 AND page_len > 0";
[21:02:37] *Simetrical 	wonders if COUNT(DISTINCT ...) works in non-MySQL environments
[21:02:42] 	That makes absolutely no sense
[21:02:51] 	RoanKattouw, I think it's a covering index, so on all fields.  Ask domas.
[21:03:08] 	RoanKattouw, sure it makes sense.  The join to pagelinks handles the question of whether it has any links.
[21:03:08] 	page_len > 0 is redundant because of the join on pagelinks, it'll only cause false negatives
[21:03:20] 	It's not redundant, it's flat-out wrong.  :)
[21:03:25] 	Well yeah, but the fact that page_len > 0 is in there makes no sense
[21:03:39] 	It was probably copy-pasted from initStats.php with the addition of the join for some reason.
[21:03:41] 	God knows why.
[21:04:13] 	Probably not, since initStats.php uses selectField() while updateArticleCount.php builds SQL like it's Lego
[21:04:50] 	Could be one was subsequently ported.
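The false-negative problem with `page_len > 0` can be demonstrated concretely. This is a small SQLite sketch (invented schema subset, single-column COUNT DISTINCT since SQLite lacks the multi-column form used in the original query): an old row whose page_len was never backfilled has a link but is dropped by the extra condition.

```python
import sqlite3

# Contrast the two counting queries discussed above. Pre-page_len-era
# rows may have page_len = 0 even though they have outgoing links, so
# the extra "page_len > 0" condition undercounts.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE page (page_id INT, page_namespace INT, "
            "page_title TEXT, page_is_redirect INT, page_len INT)")
cur.execute("CREATE TABLE pagelinks (pl_from INT)")
# An old article: has a link, but page_len was never filled in.
cur.execute("INSERT INTO page VALUES (1, 0, 'Old', 0, 0)")
cur.execute("INSERT INTO pagelinks VALUES (1)")

with_len = cur.execute(
    "SELECT COUNT(DISTINCT page_id) FROM page, pagelinks "
    "WHERE pl_from = page_id AND page_is_redirect = 0 "
    "AND page_len > 0").fetchone()[0]
join_only = cur.execute(
    "SELECT COUNT(DISTINCT page_id) FROM page, pagelinks "
    "WHERE pl_from = page_id AND page_is_redirect = 0").fetchone()[0]
# with_len misses the article; join_only counts it
```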
[21:04:52] 	RoanKattouw: sorry, it looks like you're busy, but do you have any other ideas on the redirect thing?
[21:04:54] 	Want to investigate?  :P
[21:05:26] 	cronos-, doesn't MediaWiki generate relative URLs?  Like /wiki/blahblahblah?
[21:05:43] 	Simetrical: Investigating that isn't really useful, fixing it is. But we should first agree to one standard query to use
[21:05:46] 	If the request is on http://, the browser will then use that.
[21:06:00] 	RoanKattouw, ideally we should actually make it consistent with the isCountable() thing too?
[21:06:17] 	Simetrical: Yes, but from what he says it sounds like https://www.example.com/wiki/ is redirected to http:// right away, in which case it's probably an Apache config issue
[21:06:38] 	Simetrical: Exactly. Which means $wgCommaCount has to go
[21:06:42] 	$wgServer = $wgProto.'://' . $wgServerName;
[21:07:01] 	cronos-: I think you need to find out who's doing the redirecting: MediaWiki or Apache
[21:07:17] 	cronos-, what actual links aren't working?  I'm not clear if it's everything, or links generated by MediaWiki, or what.
[21:07:38] 	it's not that the links aren't working.. they're just HTTP instead of HTTPS
[21:07:58] 	Going directly to the URL works?
[21:08:07] 	cronos-: When you visit the wiki through HTTPS, do you actually see an https:// URL in your address bar?
[21:08:34] 	I don't suppose this site is publicly viewable.
[21:08:37] 	RoanKattouw: I'm not a n00b..  no, i don't see HTTPS
[21:08:43] 	https://wiki.pittstate.edu/ois
[21:09:18] 	Taking forever to load.
[21:09:32] 	Timing out for me
[21:10:15] 	hmm.. doesn't time out for me
[21:10:56] 	Where's that site located? Australia?
[21:11:10] 	.edu would suggest US to me.
[21:11:16] 	yeah, its in the US
[21:11:26] 	Hmm ping reply in 130 ms, so latency isn't the problem
[21:11:37] 	Australia sites are usually in the 300 ms range
[21:11:53] 	Works: http://www.pittstate.edu/
[21:12:01] 	Different server, though, maybe.
[21:12:04] *Simetrical 	has to go, can't check
[21:12:06] 	yeah, different webserver
[21:12:10] 	Ah Pittsburg of course
[21:15:33] 	ugh.. i guess i need to go harass my friendly neighborhood unix administrator to figure out wtf is going on now
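One common fix for the SSL-terminating load balancer problem cronos- describes is to trust the protocol hint the proxy forwards rather than the protocol the app server itself saw. The sketch below is generic and hypothetical, not MediaWiki code (in MediaWiki the simpler route is usually to hardcode $wgServer with an https:// prefix); the `X-Forwarded-Proto` header name is a widespread proxy convention, and the proxy must be configured to set it.

```python
# Sketch: build the site base URL behind a TLS-terminating proxy,
# preferring the proxy-supplied X-Forwarded-Proto header over the
# protocol the backend request actually arrived on.

def detect_base_url(headers, server_name, request_is_https):
    proto = headers.get('X-Forwarded-Proto')
    if proto not in ('http', 'https'):
        # No trustworthy hint: fall back to what the backend saw.
        proto = 'https' if request_is_https else 'http'
    return f"{proto}://{server_name}"
```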
[21:15:36] 	03aaron * r44782 10/trunk/phase3/skins/common/wikistandard.css: Make quickbar fill in extra bottom bit
[21:22:25] 	When will it be possible to rename files without deleting them and reuploading?
[21:23:19] 	03catrope * r44783 10/trunk/phase3/RELEASE-NOTES: Break long line in RELEASE-NOTES
[21:23:21] 	svip: it's already possible, just experimental (and disabled by default)
[21:23:49] 	!wg AllowImageMoving | svip
[21:23:49] --mwbot--	svip: http://www.mediawiki.org/wiki/Manual:%24wgAllowImageMoving
[21:25:07] 	Hm, Skizzerz, wouldn't it be wise to give it a special permission?
[21:25:10] 	Or does it already have that?
[21:25:16] 	not sure
[21:26:01] 	probably could just be tied to the upload permission
[21:26:53] 	that is upload, reupload, reupload-shared...
[21:27:33] 	03(mod) Add rcuser= and rcexcludeuser= à la prop=revisions - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14200  +comment (10roan.kattouw)
[21:30:08] 	Splarka:  But it would be nice to have a specific one for it, move-file
[21:30:14] 	domas: Could you tell me what the special index ("covering index" as Simetrical calls it) on the recentchange table on the watchlist slave is? That way I'll be able to make sure my API watchlist queries don't filesort on that server (they do on mine; even the regular Special:Watchlist query filesorts, but I'm assuming your special index prevents that).
[21:30:53] 	03aaron * r44784 10/trunk/phase3/includes/ChangesList.php: Fix for r44775: remove debug line
[21:31:32] 	svip: and for moving one you uploaded vs someone else? and for moving from or to a shared repository title?
[21:31:55] 	you want an extra right for each? seems like the existing upload rights would work for such cases
[21:31:59] 	I would assume the same rules would have to apply to the move permission.
[21:32:01] 	IMHO
[21:32:08] 	I just one a move permission.
[21:32:14] 	Leave it to sysops to handle the deal.
[21:32:21] 	want*
[21:32:41] 	lots of people want move to be a sysop only function
[21:33:24] 	I just want a move-file permission, regardless of file.
[21:33:43] 	I think people who can upload should be able to move their own images, for example, but not other people's
[21:33:55] 	and not over shared repository titles
[21:34:21] 	But shouldn't sysops be able to move other people's files?
[21:34:40] 	sure, and they have 'reupload' and 'reupload-shared'
[21:35:07] 	actually... I think it might be tied to 'move'...
[21:35:20] 	not sure though, and a grep of /includes for AllowImageMoving didn't turn anything up
[21:47:09] <_Slick_Rick>	Is there a way to apply an alphabetical sorting to a link like ... Special:Whatlinkshere/{{PAGENAME}}
[21:47:23] <_Slick_Rick>	by default it seems to always sort by the date created, not alphabetical
[21:47:40] 	_Slick_Rick, for that page, no, not by default that I know of.
[21:48:14] 	I wonder why it does that though; the API manages to sort by (ns, title) just fine
[21:48:19] 	03raymond * r44785 10/trunk/phase3/includes/DefaultSettings.php: Bump $wgStyleVersion per r44774/r44782
[21:48:28] *RoanKattouw 	wonders how that stuff is paged
[21:48:39] <_Slick_Rick>	Simetrical: dang.... Thats no good.. maybe a plugin or something that can do it?
[21:48:49] 	Hmm, looks like it sorts and pages by page ID
[21:48:57] <_Slick_Rick>	ya i think by page ID
[21:49:08] <_Slick_Rick>	which is basically the order it was entered in
[21:49:22] 	RoanKattouw, you might want to Google "covering index", by the way, for your own education, if you don't know what that is.
[21:49:22] <_Slick_Rick>	but its confusing to look at the list that way.  Alphabetical is much nicer
[21:49:49] 	Simetrical: Right. I have some database performance education to catch up to, but I guess you knew that already
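The covering-index idea Simetrical mentions is easy to see in SQLite, where the query planner reports it directly: when an index contains every column a query touches, the engine answers from the index alone and, here, also avoids a separate sort. The table and index names below are invented for the demonstration.

```python
import sqlite3

# Demonstrate a covering index: the index holds (rc_user, rc_timestamp,
# rc_title), which is everything this query needs, so SQLite reports a
# COVERING INDEX search and needs no temp B-tree for the ORDER BY.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE rc (rc_user TEXT, rc_timestamp TEXT, rc_title TEXT)")
cur.execute("CREATE INDEX rc_cover ON rc (rc_user, rc_timestamp, rc_title)")
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT rc_title FROM rc WHERE rc_user = 'x' ORDER BY rc_timestamp"
).fetchall()
details = [row[-1] for row in plan]
```

MySQL's EXPLAIN shows the analogous "Using index" note; that is the property that keeps the watchlist query on the WMF slave from filesorting.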
[21:50:23] <_Slick_Rick>	yes i know about indexing databases
[21:50:34] 	I don't see why WhatLinksHere can't sort by (ns, title) either.
[21:50:39] 	Seems like the index is present.
[21:50:52] 	Simetrical: Because it would require the IndexPager class to be rewritten
[21:50:54] 	Maybe it could be ported to IndexPager.
[21:50:55] <_Slick_Rick>	Simetrical: are there any params you can pass to it?
[21:51:10] <_Slick_Rick>	like.... Special:Whatlinkshere/{{PAGENAME}}/order/name
[21:51:12] <_Slick_Rick>	or something lol
[21:51:16] 	I tried extending IndexPager to a two-field index (page_ns, page_title) once and gave up because it was too buggy
[21:51:23] 	_Slick_Rick, you could hack includes/(special/)SpecialWhatlinkshere.php to add an ORDER BY if you liked.
[21:51:29] 	_Slick_Rick: No, not to influence sorting anyway
[21:51:30] 	RoanKattouw, I thought I did that already.
[21:51:38] *RoanKattouw 	checks it out
[21:51:49] <_Slick_Rick>	Simetrical: ok ill try to do that perhaps
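The sorting difference under discussion is simply insertion order (page ID) versus the alphabetical (namespace, title) order _Slick_Rick wants; the data below is invented to illustrate it.

```python
# Whatlinkshere currently pages by page_id, i.e. roughly the order the
# linking pages were created. Sorting by (page_namespace, page_title)
# instead gives the alphabetical listing.
links = [
    (3, 0, 'Zebra'),   # (page_id, page_namespace, page_title)
    (1, 0, 'Apple'),
    (2, 1, 'Apple'),
]
by_id = sorted(links, key=lambda r: r[0])
by_ns_title = sorted(links, key=lambda r: (r[1], r[2]))
```

The equivalent hack in SpecialWhatlinkshere.php would be adding an `ORDER BY pl_namespace, pl_title` clause, though as noted that also means rethinking how the results are paged.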
[21:52:19] 	03(mod) wgSpamRegex addition request: anontalk.com - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16597  (10brion)
[21:52:22] 	No, I did something else, apparently.
[21:52:33] 	Support ordering by multiple different indexes, I guess, that was it.
[21:52:39] 	For Special:Categories.
[21:52:45] 	Although I think that code is actually unused now.
[21:53:49] 	"Uncomment the multiple-ordering stuff in Pager.php, since it works fine, but comment it out in SpecialCategories.php, since the index is not unique and it won't sort correctly.  IndexPager needs to support multiple-column sort, and the index should be extended to (cat_pages, cat_title)."
[21:53:57] 	03(FIXED) wgSpamRegex addition request: anontalk.com - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16597  +comment (10brion)
[21:53:58] 	(log comment on r32259)
[21:54:06] 	Right.
[21:54:20] 	It looks messy.
[21:54:37] <_Slick_Rick>	Thanks for the help
[21:54:42] 	Is there a standard way to store arrays in CGI parameters?
[21:54:43] 	morning
[21:54:54] 	Evening
[21:55:08] 	Simetrical: Yes, I fixed up wfArrayToCGI() to do that IIRC
[21:55:17] 	What's the standard way.
[21:55:19] 	?
[21:55:20] 	lol, symmetry in conversation
[21:55:28] 	?arr[]=foo&arr[]=bar&arr[]=baz
[21:55:42] 	Which means $_GET['arr'] = array('foo', 'bar', 'baz');
[21:55:47] 	Hmm, and will $wgRequest handle that?
[21:56:00] 	PHP handles it automatically
[21:56:10] 	It worked for me :P
[21:56:13] 	Yeah, but I'm talking about $wgRequest.  Is there a $wgRequest->getArray() or something?
[21:56:28] 	No, but IIRC getValue() worked for me
[21:56:38] 	Lemme check out that wfArrayToCGI() patch
[21:57:29] 	Simetrical: http://svn.wikimedia.org/doc/classWebRequest.html#f076fc7e191f18b44f1040d3b7373006
[21:57:47] 	People actually use doc/?
[21:57:49] 	I didn't know it existed.
[21:57:56] 	Simetrical: Sure, all the time
[21:57:58] 	Did you know about:
[21:58:02] 	!class WebRequest
[21:58:02] --mwbot--	See http://svn.wikimedia.org/doc/classWebRequest.html
[21:58:43] 	mornin TimStarling :)
[21:58:58] 	Simetrical: Added the wfArrayToCGI() support in r35298, fixed up in r35783
[21:59:13] *Simetrical 	doesn't feel like dealing with this
[21:59:18] 	Doxygen is cool because you can see reference to each function
[21:59:56] 	ialex: Yes, as long as it parses everything correctly. Something like 'blah?>' will make Doxygen choke
[22:00:02] <_Slick_Rick>	Simetrical: i found someone who asked the same questions here, http://meta.wikimedia.org/wiki/Help_talk:What_links_here
[22:00:04] 	And somehow it doesn't recognize the MWNamespace class
[22:00:11] 	RoanKattouw: yeah... :(
[22:00:12] <_Slick_Rick>	apparently there is no easy answer.  I think hacking the code is the only way
[22:00:47] 	Hmm I guess Brion showed some good foresight there: WebRequest::getArray() was added back in r6708 (dec 04) "for the rare times that's what we want"
[22:01:00] 	:D
[22:01:23] 	But since wfArrayToCGI() used to choke horribly on array arguments until I fixed it, it doesn't seem to have been used much (or at least not in conjunction with IndexPager)
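The `arr[]=` convention above is PHP-specific, but other languages can read the same query strings. A small Python illustration: `parse_qs` keeps the literal key (brackets included) and, like PHP, collects repeated keys into a list.

```python
from urllib.parse import parse_qs

# The PHP-style array convention: repeating a key with a [] suffix.
# PHP turns this into $_GET['arr'] = array('foo', 'bar', 'baz');
# Python keeps the raw key name but likewise returns a list of values.
query = "arr[]=foo&arr[]=bar&arr[]=baz"
params = parse_qs(query)
# params == {'arr[]': ['foo', 'bar', 'baz']}
```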
[22:03:48] 	ugh, slash parameter tyrrany
[22:04:00] 	?
[22:04:01] 	tyranny, even
[22:04:19] 	 Special:Whatlinkshere/{{PAGENAME}}/order/name
[22:04:45] 	Oh
[22:04:46] 	I think Special:Listusers is the only thing with multiple (and ambiguous) slashie parameters
[22:04:49] 	Luckily, that doesn't work
[22:04:52] 	Yeah
[22:04:57] 	03jhsoby * r44786 10/trunk/tools/planet/gmq/templates/index.nn.html.tmpl: Fixing Nynorsk translation
[22:04:58] 	Special:Code is good about it too
[22:05:01] 	but if there are to be MORE of these, ugh, please fix the associated bug
[22:05:24] 	(user scripts can't tell wtf the slashies are being parsed as)
[22:05:25] 	How about Special:Code/MediaWiki/authors/catrope
[22:05:34] <_Slick_Rick>	heh
[22:05:35] 	nobody cares about Special:Code ^_^
[22:05:57] 	Splarka: Sure they can, given a list of all groups on the wiki, which is available from the API
[22:06:06] 	but for example, Special:Log/block vs Special:Log?type=block vs title=Special%3ALog&type=block vs title=Special%3ALog%2Fblock
[22:06:10] 	Of course, you might as well just use list=allusers in that case :P
[22:06:15] 	Roan ^
[22:06:34] 	https://bugzilla.wikimedia.org/show_bug.cgi?id=16462
[22:07:10] 	Oh Javascript ...
[22:07:15] *RoanKattouw 	doesn't care
[22:07:24] 	you should
[22:07:40] 	the API would be limited to a few python noobs and a few live mirrors otherwise ^_^
[22:07:52] 	Yeah that's true
[22:08:11] 	But about all that wgTitle and wgTitleF crap and God knows what else, I don't
[22:08:58] 	well, fair
[22:10:10] 	but parsing slashes is obfuscatory, and dirty (especially compared to transclusion...), and if they are to continue to be used for convenience of wikilinking, they should be standardized darnit
[22:10:16] <|X|>	Is there someone here who speaks either English or German fluently, and is willing to do some translation for an independent tool?
[22:13:22] 	|X|: Translation from and to which languages?
[22:13:23] *Simetrical 	feels brain-fried, needs to play a computer game or something
[22:13:51] 	Splarka: BTW, the MediaWiki::API library for Perl also seems to be on the rise now
[22:13:52] <|X|>	RoanKattouw, I meant to say French or German. I need some English->another language
[22:14:09] 	|X|: How about Dutch?
[22:14:18] <|X|>	Well...sure, I guess :)
[22:15:13] 	perl? ew
[22:15:20] 	Splarka: My thoughts exactly.
[22:15:28] 	But if they wanna use the API, hey, I don't complain
[22:15:30] <|X|>	Perl is evil
[22:15:42] 	Just don't expect me to help them with MediaWiki::API
[22:15:45] <|X|>	They need a Perl equivelant of unserialize()
[22:15:56] <|X|>	instead of parsing xml
[22:16:10] 	Perl needs a good object structure, their current method of having "objects" and "classes" is laughable
[22:16:18] 	Actually that library was written by a guy who often throws API bugs at me
[22:16:24] 	Maybe it was that Jools Smith guy
[22:16:48] 	Yup, Jools Smyth
[22:17:22] 	|X|: XML is evil. The API has to do all kinds of nasty and/or ugly stuff just to keep XML from breaking
[22:17:30] 	Like setIndexedTagName()
[22:17:36] <|X|>	*shudder*
[22:18:10] 	TimStarling, does this patch look remotely sane?  Or if not, does it look insane but in an easily-fixable way?  http://pastey.net/104599
[22:37:03] 	what does \cancel do?
[22:37:17] 	am I just meant to know?
[22:39:28] 	TimStarling, stuff like this.  http://www.twcenter.net/~simetrical/images/cancel.png
[22:39:46] 	Basically allows crossing stuff out, which as far as I can tell isn't possible with the commands we have now.
[22:40:05] 	I'm not a LaTeX expert, though.
[22:40:23] 	Simetrical: Nice. But the arrow notation with the 0 is kind of weird, since you're only gonna be crossing out stuff that's 0 or 1 anyway
[22:40:51] 	RoanKattouw, that's \cancelto{}{}, you can also use just \cancel{} to have it only crossed out.
[22:41:01] 	OK good
[22:41:05] 	Thanks, didn't know about that
[22:41:11] 	Does it require a special package?
[22:41:17] 	Yes, "cancel".
[22:41:27] 	I didn't know about it either until someone in #math complained that Wikipedia didn't support it.
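For reference, a minimal document using the package being discussed (this assumes the "cancel" LaTeX package is installed, which is exactly what the committed patch requires of the TeX distribution):

```latex
% Minimal example of \cancel (strike out) and \cancelto (strike out
% with an arrow to a replacement value), from the "cancel" package.
\documentclass{article}
\usepackage{cancel}
\begin{document}
\[ \frac{\cancel{x}\,y}{\cancel{x}} = y
   \qquad
   x + \cancelto{0}{\epsilon} = x \]
\end{document}
```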
[22:41:27] 	Ah
[22:41:43] 	may as well add it
[22:41:49] 	The patch I gave works fine on my machine, which is Ubuntu Hardy.  I have no idea if it will work elsewhere.
[22:42:00] 	TimStarling, the patch doesn't look scary to you?  What happens if the package isn't available?
[22:42:15] 	And why is this practically the only package manually included like that?
[22:43:13] 	Ugh, you know you need new hardware when you feel it's necessary to sudo renice -19 $$ to do anything.
[22:43:26] *Simetrical 	had better be sure he doesn't start any services without renicing again
[22:50:13] 	there's lots of usepackage commands in texutil.ml, I don't know what you mean
[22:50:52] 	TimStarling, they seem to all be conditional.
[22:51:08] 	Either encoding-related or with if's beforehand.
[22:52:37] 	that's because it's enabling the modules depending on what macros it sees in the input
[22:52:57] 	you could add a function for it similar to tex_use_ams()
[22:53:06] 	Does it matter?
[22:53:14] 	no
[22:53:53] 	Is the package guaranteed to be available?
[22:55:59] 	Gah, I dislike yum.
[22:56:10] 	Crash with stack trace: UnboundLocalError: local variable 'result' referenced before assignment
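That yum traceback is a classic Python pitfall, easy to reproduce in a few lines: assigning to a name anywhere in a function makes it local to the whole function, so reading it on a code path that skipped the assignment raises UnboundLocalError. The function below is invented to illustrate the pattern.

```python
# Minimal reproduction of the UnboundLocalError above: 'result' is
# only assigned on one branch, but the assignment makes it local
# everywhere in the function.
def broken(flag):
    if flag:
        result = "ok"
    return result  # raises UnboundLocalError when flag is False

try:
    broken(False)
    failed = False
except UnboundLocalError as e:
    failed = True
    msg = str(e)
```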
[23:01:53] 	hmm, Configure doesn't like right navbar
[23:04:19] 	Simetrical: that's not an easy question to answer, I'd have to check all the distros
[23:04:28] 	TimStarling, so should I check it in?
[23:04:53] 	you could test it without the package and see if it fails gracefully
[23:04:56] 	Or could a conditional be added that would skip those bits if the package wasn't available?
[23:05:10] 	How do I remove the package?  I'd assume it would fail with a fatal error.
[23:06:25] *Simetrical 	can't get LaTeX working on his server to begin with, the PNG conversion fails even though latex, gs, dvips, and convert all work on the command line
[23:06:51] 	s/LaTeX/texvc/
[23:07:51] 	03(mod)  Invalid UTF-8 in percent-encoded links cause page rendering error - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11143  summary (10brion)
[23:08:49] 	03(mod) XML import does not update the redirect table - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12507  +comment (10brion)
[23:09:40] 	03(mod) Add RSS with all data dump items - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14794  +comment (10brion)
[23:10:05] 	03aaron * r44788 10/trunk/phase3/includes/Skin.php: Streamline links a bit
[23:10:14] 	03(mod) Include uncompressed sizes in dump file RSS info - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14631  (10brion)
[23:13:03] 	03(FIXED) Upload filter insufficient to stop XSS - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15895  +comment (10tstarling)
[23:14:10] *RoanKattouw 	hides from AzaTht
[23:14:33] 	have nothing to throw on you at the moment
[23:14:55] *RoanKattouw 	feels relieved
[23:15:09] 	but if you want, I can dig up something
[23:18:23] 	OK, but I won't be poking at it today
[23:18:39] 	Still working on putting the diff functionality back in prop=revisions
[23:18:50] 	And it's past midnight already, I should get some sleep after that
[23:19:22] 	TimStarling: Have you had a chance to work on the installer re-write lately?
[23:19:32] 	no
[23:19:48] 	currently I'm catching up on patches and review that I left off during the security work I did
[23:20:04] 	Ahh, okay.
[23:20:17] *RoanKattouw 	kindly reminds TimStarling of bug 16012
[23:20:46] 	16012 is so far down the list
[23:21:40] 	:-(
[23:21:42] 	:( 
[23:21:56] 	It's blocking all schema updates, so that kind of sucks
[23:23:31] 	no it's not
[23:23:47] 	you don't need to do a master rotation if you're changing a small table, or adding tables
[23:23:55] 	True
[23:24:02] 	03tomasz * r44789 10/trunk/extensions/CentralNotice/SpecialNoticeTemplate.php: Changing message lookup function to respect inc tags
[23:24:08] 	changing indexes or adding fields on primary tables, though...
[23:24:10] 	But adding indices or propagating existing indices to all slaves (bug 14200) does
[23:24:13] 	03simetrical * r44790 10/trunk/phase3/ (RELEASE-NOTES math/texutil.ml): (log message trimmed)
[23:24:13] 	Enable \cancel and \cancelto in texvc
[23:24:13] 	This works on Hardy, but may break all math support in the event that
[23:24:13] 	some platform doesn't have support for \cancel. I don't know much
[23:24:14] 	OCaml or LaTeX, so I'm not sure how to make it import the package only
[23:24:15] 	and those are the things that have been held up
[23:24:16] 	if it exists or something similarly useful.
[23:24:18] 	If this gets reverted, though, at least I'll have committed it and
[23:24:30] 	RoanKattouw, to large tables, anyway.
[23:24:45] 	Well which tables aren't large on the WMF servers? :P
[23:24:57] 	RoanKattouw, site_stats!
[23:25:17] 	Yeah but having indices on that table is pointless
[23:25:18] 	ipblocks
[23:25:28] 	Does this mean that in the near future we might actually be able to get schema changes done without months of holdups, though?
[23:25:30] 	ipblocks not large?
[23:25:38] 	nope
[23:25:43] 	Aren't the non-big-ten tables not large?
[23:25:44] 	Simetrical: I doubt the nearness of that future
[23:25:50] 	takes like 1 second for a schema change
[23:25:52] 	I mean, en.wiki is massive, but the smaller sites can't have too big tables...
[23:26:00] 	MZMcBride, the change has to be done on all wikis.
[23:26:04] 	So that doesn't help you.
[23:26:07] 	Right.
[23:26:13] 	would it be possible for you to update the Date&Time function
[23:26:15] 	Just sayin', most tables are small.
[23:26:21] 	to allow automatic DST etc...
[23:26:26] 	AzaTht: What Date&Time function?
[23:26:42] 	MZMcBride: Yeah, but the few tables that aren't small (far from it) kind of spoil it
[23:26:47] 	preferences
[23:27:03] 	Indeed. Damn you enwiki_p.revisions!
[23:27:09] 	and also, WTF is going on with the donation quotes?
[23:27:18] 	they are totoally stupid
[23:27:28] 	AzaTht: Err... manipulating preferences isn't possible through the API, and it won't be until someone rewrites the preferences system
[23:27:29] 	That's a #wikimedia-tech issue. ;-)
[23:27:44] 	MZMcBride: :-P
[23:27:45] 	Is anybody assigned to rewriting preferences?
[23:27:55] 	RoanKattouw: not talking to you about the preferences
[23:27:57] 	(the Vodafone branch does implement changing prefs, but I have no idea if it even works)
[23:28:10] 	was refering to MZ and TimStarling 
[23:28:14] 	MZMcBride: I don't think so. Whoever takes that up should read my rant about preferences at www.mediawiki.org
[23:28:18] 	AzaTht: Oh
[23:28:22] 	ツ
[23:29:20] 	!prefsrant is The current preferences system sucks and needs a rewrite. For a list of deficits and rewrite recommendations, see 
[23:29:20] --mwbot--	Successfully added keyword: prefsrant
[23:29:44] 	hehe
[23:30:07] 	I put it on MW.org because I got tired of repeating my rant on IRC, so this is the next logical step
[23:30:36] 	Indeed.
[23:30:49] 	!prefsrant
[23:30:49] --mwbot--	The current preferences system sucks and needs a rewrite. For a list of deficits and rewrite recommendations, see 
[23:31:16] 	haah
[23:31:45] 	What I want is 1: possibility to define local timezone instead of "fill in from browser", and two, based on (1), allow for automatic DST
[23:32:46] 	Yes, there's an ancient bug about that.
[23:32:56] 	solely based on (1)? and allow as in, a checkbox? or allow as in "you got no choice"
[23:33:10] 	https://bugzilla.wikimedia.org/show_bug.cgi?id=505
[23:33:33] 	Splarka: it's always good to give people choices
[23:33:34] 	Hmm there's actually a pretty recent patch there
[23:33:40] 	https://bugzilla.wikimedia.org/show_bug.cgi?id=505#c40
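The feature AzaTht is asking for amounts to storing a named timezone rather than a fixed UTC offset, so the tz database applies DST automatically. A sketch using Python's standard zoneinfo module (Europe/Stockholm is just an example zone; MediaWiki's actual preference stores a raw offset):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# A named zone carries its own DST rules: the same wall-clock time
# maps to different UTC offsets in winter and summer.
tz = ZoneInfo("Europe/Stockholm")
winter = datetime(2024, 1, 15, 12, 0, tzinfo=tz)
summer = datetime(2024, 7, 15, 12, 0, tzinfo=tz)
winter_off = winter.utcoffset()   # CET, UTC+1
summer_off = summer.utcoffset()   # CEST, UTC+2
```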
[23:36:00] 	brion: did it snow down there?
[23:37:01] 	bah, put in a