[00:00:56] 03gwicke * 10/trunk/extensions/VisualEditor/modules/parser/pegTokenizer.pegjs.txt: Accept IPv6 (and IPv4) addresses in the tokenizer, so another test passes. [00:03:37] 03(NEW) PhoneGap-based iOS app is missing 'About' - 10https://bugzilla.wikimedia.org/33776 normal; Wikimedia Mobile: iphone; (brion) [00:03:40] 03(mod) iOS PhoneGap app 1.0 release (tracking) - 10https://bugzilla.wikimedia.org/33673 (10Brion Vibber) [00:06:15] 03gregchiasson * 10/trunk/extensions/ArticleFeedbackv5/api/ApiViewFeedbackArticleFeedbackv5.php: AFT5 - short circuit gender thing for the time being, so we can test the layout. Known issue, will fix [00:11:02] New code comment: NeilK; There's a typo, here __MEHOTD__ -- marking fixme \ Seems weird to do this with two queries, isn't tha; [00:12:17] 03(mod) Translation of "Wikipedia" namespace in Assamese for Assamese Wikipedia - 10https://bugzilla.wikimedia.org/33507 +comment (10shijualex) [00:17:31] 03(mod) Excessive punctuation highlighting in wikidiff2 - 10https://bugzilla.wikimedia.org/33331 +comment (10mr.heat) [00:17:37] 03awjrichards * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Added logic for rudimentary zip code validation and provided a mechanism to make it possible to display an error on the zip code submission form in the event that the zip is invalid [00:19:21] 03awjrichards * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Made validation slightly stricter [00:22:00] 03bsitu * 10/trunk/extensions/MoodBar/ (3 files in 2 dirs): update memcache expiration time for top responder, remove escaping from plain text email and email copy update [00:22:12] 03(NEW) PhoneGap-based iOS app menus don't disable when things are disabled - 10https://bugzilla.wikimedia.org/33778 normal; Wikimedia Mobile: iphone; (brion) [00:23:08] 03gregchiasson * 10/trunk/extensions/ArticleFeedbackv5/ (3 files in 2 dirs): AFT5 - more HTML chanes [00:23:21] 03gwicke * 
10/trunk/extensions/VisualEditor/modules/parser/pegTokenizer.pegjs.txt: Eat '[[[' as plain text token, makes it 212 passing. [00:26:18] 03siebrand * 10/trunk/phase3/languages/messages/MessagesQqq.php: Revert r107881 per CR. [00:26:43] 03aaron * 10/trunk/extensions/CongressLookup/maintenance/ (. checkContacts.php): Added quick script to hit all the contact URLs in the DB and report the broken ones [00:28:34] Project MediaWiki-postgres-phpunit build #337: FAILURE in 7 min 25 sec: http://integration.mediawiki.org/ci/job/MediaWiki-postgres-phpunit/337/ [00:28:57] 03gregchiasson * 10/trunk/extensions/ArticleFeedbackv5/api/ApiViewFeedbackArticleFeedbackv5.php: AFT5 - followup to r109258 - bad HTML, should be fixed now. [00:32:50] 03awjrichards * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Fixing inversed logic for checking string lengths [00:33:16] 03(NEW) Allow access to wikimedia servers through an I2P router - 10https://bugzilla.wikimedia.org/33779 normal; Wikimedia: General/Unknown; (matthew.j.shea) [00:33:55] Yippie, build fixed! [00:33:56] Project MediaWiki-postgres-phpunit build #338: FIXED in 5 min 13 sec: http://integration.mediawiki.org/ci/job/MediaWiki-postgres-phpunit/338/ [00:34:51] 03awjrichards * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Fixing typo in function name call [00:36:50] 03khorn * 10/trunk/extensions/CongressLookup/scripts/zip_file_parser.php: More log parsing. This one deals with the split zipcodes. Still dying out of the box for safety. [00:37:49] 03gregchiasson * 10/trunk/extensions/ArticleFeedbackv5/SpecialArticleFeedbackv5.php: AFT5 yet more HTML changes on feedback page [00:44:16] 03awjrichards * 10/trunk/extensions/CongressLookup/ (CongressLookup.i18n.php SpecialCongressLookup.php): Fixed bad handling of empty zip param; first pass at adding error message display. 
[00:45:15] 03kaldari * 10/trunk/extensions/CongressLookup/data/cl_senate.sql: updating some senate data [00:46:10] 03rmoen * 10/trunk/extensions/MoodBar/modules/ext.moodBar.dashboard/ext.moodBar.dashboard.js: Added better InterfaceConcurrency tooltip ui handling for feedback dashboard responses. only showing concurrency notice once for each item. [00:46:48] ^demon: we can call in when you are ready [00:47:06] <^demon> I just glanced at the queue, the only e-mail is for a key update. [00:47:09] never mind, [00:47:11] yeah [00:47:15] Tim-away: never mind [00:47:17] no call today [00:47:19] sorry [00:47:27] I thought I saw something in the queue and was wrong. [00:47:36] ^demon: sorry for bothering you. [00:47:44] I'll take care of the key update. [00:47:57] <^demon> No bother :) [00:48:54] 03(mod) PhoneGap-based iOS app menus don't disable when things are disabled - 10https://bugzilla.wikimedia.org/33778 +comment (10brion) [00:49:26] 03(mod) PhoneGap-based iOS app is missing 'About' - 10https://bugzilla.wikimedia.org/33776 +comment (10Yuvi Panda) [00:50:07] 03awjrichards * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Adding minor documentation [00:55:43] I'm on my way to the airport, it's not really the best time for phone calls [00:56:49] 03gregchiasson * 10/trunk/extensions/ArticleFeedbackv5/ (ArticleFeedbackv5.i18n.php SpecialArticleFeedbackv5.php): AFT5 - sigh. Last boneheaded commit of the night, probably. [00:58:06] Tim-away: no prob, call was obviated [00:58:11] Great, otrs queue is now empty [00:58:30] Tim-away: hope you have a good flight. 
[00:58:49] 03(mod) PhoneGap-based iOS app is missing 'About' - 10https://bugzilla.wikimedia.org/33776 +comment (10brion) [00:59:07] 03aaron * 10/trunk/extensions/CongressLookup/maintenance/checkContacts.php: Disable SSL host/peer checks to avoid issues with local SSl certificate problems (like being out of date) [01:01:10] 03awjrichards * 10/trunk/extensions/CongressLookup/ (CongressLookup.i18n.php SpecialCongressLookup.php): Adding explanatory note for users who will be displayed contact information for > 1 representative [01:01:47] yuvipanda: ready? [01:02:05] sumanah: sure. [01:04:25] I will not be the first nor last person to try to wrap my head around this issue, but could somebody please help me figure out what exactly BASEPAGENAME does to apostrophes? [01:04:48] The relevant code appears to be: case 'subpagename': $value = wfEscapeWikiText( $this->mTitle->getSubpageText() ); [01:05:55] If I put {{#ifeq: {{BASEPAGENAME}} | Name of a Page with an Apost'rophe | equal | not equal}} it returns "not equal"If {{#ifeq: {{BASEPAGENAME}} | Eul's Scepter of Divinity | equal | not equal [01:06:20] Sorry, copy-paste-o. If I put {{#ifeq: {{BASEPAGENAME}} | Name of a Page with an Apost'rophe | equal | not equal}} it returns "not equal" [01:07:40] 'basepagename': $value = wfEscapeWikiText( $this->mTitle->getBaseText() ); [01:22:01] 03kaldari * 10/trunk/extensions/CongressLookup/data/cl_senate.sql: fixing some more senate data [01:30:12] <[Relic]> Q: if I am understanding this correctly a beauracrats delete only moves or tags an entry with a "deleted" tag; but how does one remove completely a page? [01:33:10] 03preilly * 10/trunk/extensions/MobileFrontend/ (3 files): add sopa banner to mobile [01:36:51] any OpenID ppl around? 
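The apostrophe puzzle above comes down to the `wfEscapeWikiText()` call quoted from the code: `{{BASEPAGENAME}}` returns the page title with wikitext metacharacters, including the single quote, replaced by numeric HTML entities, so a literal title with an apostrophe never compares equal in `{{#ifeq:}}`. A minimal Python sketch of that behavior; the replacement table here is a partial reconstruction of the 2012 PHP mapping, not the full thing:

```python
# Approximation of MediaWiki's wfEscapeWikiText() circa 2012: wikitext
# metacharacters are replaced with numeric HTML entities. This table is a
# partial, illustrative reconstruction of the PHP replacement list.
WIKITEXT_ESCAPES = {
    "'": "&#39;",
    "[": "&#91;",
    "]": "&#93;",
    "|": "&#124;",
}

def escape_wikitext(text: str) -> str:
    for raw, entity in WIKITEXT_ESCAPES.items():
        text = text.replace(raw, entity)
    return text

# Why {{#ifeq: {{BASEPAGENAME}} | Eul's Scepter of Divinity | ...}}
# returns "not equal": the left side is already entity-escaped.
basepagename = escape_wikitext("Eul's Scepter of Divinity")
print(basepagename)                                  # Eul&#39;s Scepter of Divinity
print(basepagename == "Eul's Scepter of Divinity")   # False
```

The same escaping explains why `{{SUBPAGENAME}}` behaves identically: both magic words run their title text through the same helper before the parser function ever sees it.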
[01:37:14] when users log in with openID, they only have the option to link to an existing account, not create new [01:38:16] The site is probably only a OpenID consumer and not a OpenID server [01:38:57] 03gwicke * 10/trunk/phase3/includes/Sanitizer.php: [01:38:57] Correct typo in comment, so that it reflects the spec and the actual value in [01:38:57] the regexp. [01:39:18] faceface: sorry, I'm not being clear, a user 'logs in' with their google open id account, for example, then they usually have the choice to create a new account on the wiki or link to an existing one [01:39:36] on my site, they only have the latter choice (preventing signup) [01:40:43] * faceface wonders about just updating openid plugin... [01:40:50] it was so buggy when I installed it [01:43:04] 03gwicke * 10/trunk/extensions/VisualEditor/modules/parser/ (3 files): [01:43:05] Add the start of a minimal sanitizer stage, that only strips IDN ignored [01:43:05] characters from host portions of links hrefs for now. This module needs to be [01:43:05] filled up with pretty much everything Sanitizer.php does, including tag and [01:43:05] attribute whitelists and attribute value sanitation (especially for style [01:43:05] attributes). [01:43:06] We'll also need to think about round-tripping of sanitized tokens. [01:47:27] 03kaldari * 10/trunk/extensions/CongressLookup/data/cl_zip5.sql: new zip5 data [01:53:57] hmm... I disabled account creation [01:54:08] Thinking I'd only have openID registered users [01:54:21] but those users need to create accounts... [01:54:47] how do I configure it so that regular accounts are disabled, but openID accounts are enabled? 
[01:56:14] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: dropping cybersecurity language, updating link [01:56:37] <[Relic]> wonder if it would just be better to wipe the whole wiki and start from scratch to deal with the massive size for something that should be quite tiny [01:57:13] * faceface doesn't want to :( [01:58:47] <[Relic]> trying to assist someone, but can't figure out how to permanently delete something cause they didn't add a captcha; so I am looking for any way to trim all the deleted stuff which should be permanently deleted out but simply can't find it in the documentation. would be a lot easier if I had direct server access to what I am trying to assist with but I don't atm [01:59:25] [Relic]: really powerfull spam fix tools are missing [01:59:47] they can be written, but once you know how to fix it, you aLready just fixed it... [02:01:11] 03preilly * 10/trunk/extensions/MobileFrontend/MobileFrontend.php: only set displayNoticeId to 2 if en [02:04:57] <[Relic]> problem with that it I can't seem to find the delete page permanently fix right now short of wiping the whole thing and starting over [02:08:09] <[Relic]> or maybe a way to move all the pages wanted to be kept into a temporary file so they can be restored easily would work [02:10:02] New code comment: Awjrichards; With this sql file, you're inserting values into an auto_increment field (clz5_id) - no good! Can yo; [02:14:32] 03preilly * 10/trunk/extensions/MobileFrontend/ (MobileFrontend.i18n.php SopaNoticeTemplate.php): fix message [02:29:13] how do I disable account creation except via openID? 
[02:31:54] 03preilly * 10/trunk/extensions/MobileFrontend/stylesheets/common.css: change notice to black background with white text [02:34:55] 03preilly * 10/trunk/extensions/MobileFrontend/stylesheets/common.css: make link underlined and bold the text [02:35:43] 03raindrift * 10/trunk/extensions/CongressLookup/data/cl_house.sql: [02:35:43] integrated more up-to-date phone and link data from http://www.contactingthecongress.org/downloads/ContactingCongress.txt [02:35:43] fixed a few with broken https by hand [02:35:43] tracked down james sensenbrenner's contact info manually [02:37:47] 03(mod) iOS PhoneGap app 1.0 release (tracking) - 10https://bugzilla.wikimedia.org/33673 (10Brion Vibber) [02:40:29] 03preilly * 10/trunk/extensions/MobileFrontend/stylesheets/beta_common.css: add styles to beta common stylesheet [02:42:13] 03preilly * 10/branches/wmf/1.18wmf1/extensions/MobileFrontend/ (5 files in 2 dirs): 1.18wmf: MFT r109276 - r109287 [02:43:59] 03raindrift * 10/trunk/extensions/CongressLookup/data/cl_house.sql: [02:43:59] this guy's CAPTCHA is broken with the other link. [02:43:59] followup to r109285 [02:44:42] 03liangent * 10/trunk/extensions/Gadgets/ (Gadgets_body.php SpecialGadgets.php): Followup r100509: Don't validate skin names written in the definition page. [02:49:22] 03khorn * 10/trunk/extensions/CongressLookup/data/cl_zip5.sql: Added in the new data without the zip5 index column. [02:50:03] New code comment: Liangent; I removed that line in r109290 and now if one write the following line: \ * GadgetX[skin=]|GadgetX.j; [02:51:33] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Update with new HTML from Brandon [02:58:18] 03khorn * 10/trunk/extensions/CongressLookup/data/cl_zip5.sql: [02:58:18] followup r109291 [02:58:18] Dumb! I killed all the commas. Rectifying the situation. 
[03:21:36] 03(NEW) Textboxes now unbearably narrow in text browsers - 10https://bugzilla.wikimedia.org/33780 normal; MediaWiki: Page editing; (jidanni) [03:21:46] 03(mod) "cols" preference is redundant and should be removed - 10https://bugzilla.wikimedia.org/24430 (10jidanni) [03:21:55] night [03:24:39] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: formatting fixes [03:26:20] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: color tweak [03:29:28] 03kaldari * 10/trunk/extensions/CongressLookup/data/cl_zip9.sql.gz: removing giant file for now [03:34:17] New code comment: Jpostlethwaite; This is no longer a problem. See r109291.; [03:35:17] New code comment: Jpostlethwaite; The field was removed.; [03:37:37] 03neilk * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/: removing CongressLookup to rebranch from trunk [03:39:29] 03neilk * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/: Rebranch from trunk @ r109297 of CongressLookup [03:41:03] 03awjrichards * 10/trunk/extensions/CongressLookup/maintenance/populateCache.php: Adding a maint script to prepopulate caches [03:44:22] * varnent is finally back - ugh [03:44:32] varnent: Welcome back [03:45:07] johnduhart: ugh - sumanah emailed me that you worked on that extension - thank you!!! I felt terrible about leaving it so quickly - but yeah - I promise my day sucked royally :) [03:45:36] It's no problem, you had other things to attend to :) [03:46:04] johnduhart: it happens - I wish I could say it was all a success - but I'll be back there all day tomorrow :( [03:46:46] I know it's like 11th hour - but need anything from me? 
:) [03:47:16] varnent: It's pretty much done, take a look and se if anything needs doing [03:47:21] 03raindrift * 10/trunk/extensions/CongressLookup/data/cl_house.sql: [03:47:22] changed links to mailto: for the two reps with actual email addresses [03:47:22] followup to r109285 [03:47:45] johnduhart: varnent: https://meta.wikimedia.org/wiki/English_Wikipedia_SOPA_blackout/Technical_FAQ I'm editing now [03:48:03] johnduhart: excellent - ty - I'll tidy up the wikipage for the extension and link into that technical FAQ [03:49:49] well, varnent, that FAQ only talks about how the WMF is doing stuff, right now [03:50:05] sumanah: okay - I'll just add it to the list of resources for folks [03:50:09] ok [03:50:29] sumanah: I don't have any sense yet of how much the extension will be in demand after tomorrow or how worth it is documenting long-term - maybe just a one shot deal? [03:50:39] I just don't know, varnent [03:51:02] I did hear it a lot on NPR and BBC tonight - so I wouldn't be surprised if there's a rush of wikis jumping on tmrw morning [03:51:06] it all depends on how often wiki owners will want to protest stuff. [03:51:31] sumanah: yeah - I'll probably de-SOPA it in a few days to be used for other things in the future [03:51:42] right [03:56:55] johnduhart: looks great - I'll start working on language and post-SOPA use - ty again - setting up for a few wikis right now :) [03:57:08] You're very welcome :) [03:57:11] I'm behind in email - would it be worth emailing the link out or was that done? [03:57:34] varnent: emailing the link to Extension:Blackout? [03:57:37] varnent: I emailed mediawiki-l [03:57:48] sumanah: excellent - ty! 
:) [03:57:53] varnent: :-) [03:58:27] varnent: hmmm, you may want to look at those mediawiki-l emails from me and see whether you wanna copy-n-paste to wikitech-l [03:58:43] sumanah: I'll go check them now [03:58:46] sumanah: ty for the email earlier btw - was happy to get something that I didn't either have to skip for later or read in relations to today's fun :) [03:59:20] varnent: :-) I'm grateful for and appreciative of your abundant work in the MediaWiki and activist communities [04:00:19] sumanah: always happy to do so - I can't complain about today too much actually - it sucked way more for my friend - I basically sat there all day providing support, wheels and fundage - but I would have much rather being working on SOPA :) [04:10:47] New code comment: Jpostlethwaite; Running populateCache.php now. This is going to take a while :); [04:10:55] 03awjrichards * 10/trunk/extensions/CongressLookup/maintenance/populateCache.php: Added the ability to just output a file of all URLs to load into wget or something. Also fixed the way in which we check to see if there are any more data results. [04:11:34] Krinkle: got a second/ [04:11:35] ? [04:11:44] sumanah: a second [04:12:13] Krinkle: so can you take a look at http://mobile-feeds.wmflabs.org/w/index.php/Special:ApiSandbox and see if it implements the bugs you filed regarding better descriptions of the parameters? [04:12:31] I never quite grokked what was desired there but you know because you asked for it :0 [04:12:33] ) [04:13:25] sumanah: yup, the buttons I had in mind are there and do the expected thing [04:13:30] that UI needs a lot of love though [04:13:38] I find my self scrolling back and forth [04:13:55] but it's dev targeted so UI is forgivable :) [04:13:59] Krinkle: ha! yeah [04:14:00] great work whoever did that [04:14:07] (the technique that is) [04:14:08] Krinkle: it's MaxSem, right? [04:14:15] Krinkle: well, have you seen the Etsy sandbox? 
[04:14:20] if you say so [04:14:29] not eyt [04:14:53] Krinkle: they call it the API Browser http://www.etsy.com/developers/documentation/resources/api_browser [04:14:58] * johnduhart is playing converting MobileFrontend to use skins [04:15:07] playing with* [04:15:40] sumanah: It's powered by a 3rd party service called apigee [04:16:04] johnduhart: accelerate my API strategy!!!1! [04:16:29] sumanah: reminds me of https://dev.twitter.com/console [04:16:51] I guess it's a re-usable framework for RESTFUL APIs. [04:17:04] MediaWikis' API doesn't' follow the Restful spec though [04:17:44] Krinkle: Maybe one day... [04:17:46] * johnduhart dreams [04:17:56] 03awjrichards * 10/trunk/extensions/CongressLookup/maintenance/populateCache.php: Added the rest of the necessary query string for the URL generation [04:19:02] New code comment: Jpostlethwaite; You need to put an equals sign for path: \ php populateCache.php --url http://wikimedia-commit.local; [04:28:42] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Added geo-targetted sharing options [04:28:58] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Removed fax information [04:30:29] 03aaron * 10/trunk/extensions/SwiftMedia/ (3 files in 2 dirs): Consolidated fake php-cloudfiles code a bit [04:30:39] sumanah: also nice; http://mayert-douglas4935.myshopify.com/collections/all [04:30:54] oh, wrong link [04:31:03] Krinkle: I think I got it [04:32:17] New code comment: Jpostlethwaite; The links need to be fixed on this. 
It all points to test.wikimedia.org; [04:34:46] mornin [04:38:55] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Fixed wrong hardcoded url with correct one [04:48:32] New code comment: Jpostlethwaite; This should work with production, but it does not work on a local machine.; [04:50:28] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Fixed jquery loader to use $wgScriptPath [04:54:05] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Minor style changes + a stray \n removed [04:55:35] 03aaron * 10/trunk/extensions/CongressLookup/maintenance/checkContacts.php: Made checkContactLink() handle email addresses better. Also send a user-agent. [04:59:41] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: New Text for Share Options [05:04:33] 03neilk * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: do proper twitter links [05:15:07] 03raindrift * 10/trunk/extensions/CongressLookup/data/cl_house.sql: [05:15:07] integrating first batch of changes from manual checks by Solitarius, et al. [05:15:07] checked up to line 55 [05:15:07] followup to r109285 [05:25:52] So, I'm told I should no longer be able to see the main page... but I can. And someone pointed out you can use wp if you turn off js, but I just see the normal site whether js is on or off... [05:29:06] didn't you clear browser cache? [05:29:10] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: hiding the other fax field [05:29:45] Yes. [05:30:10] 03khorn * 10/trunk/extensions/CongressLookup/scripts/zip_file_parser.php: Added db manipulation functions to the maintenance script. This way, we can edit locally, do a dump, and re-upload. [05:30:53] Amgine: Is the page taking a while to load, the JS might take a little bit on slow connections [05:31:18] Nope, quite snappy. And I've never seen the RC so slow since at least 2005. 
[05:31:52] lol [05:32:07] Amgine: https://meta.wikimedia.org/wiki/English_Wikipedia_SOPA_blackout/Technical_FAQ in case you haven't already seen it [05:32:10] Amgine, removing edit rights from everyone can do that [05:33:27] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: small tweaks [05:33:32] Amgine: some sort of anti-advertisement extension or something like that maybe? [05:33:52] Central banner is disabled, natch. Wasn't thinking about that. [05:35:08] yuvipanda: hey free? [05:36:10] potter: sorry, I don't think can help with anything tonight [05:36:41] yuvipanda: it's ok.. btw its is day here :) [05:37:00] potter: ah, am in SF now [05:37:31] yuvipanda: i know that.. can you tell me when will you be free? [05:38:25] bawolff: hey there? [05:38:32] potter: yes [05:38:42] for now anyways, I'll probably be going soon [05:39:15] mostly because its 1:30 am, and I'm mostly still awake for the sole purpose of seeing the sopa blackout in action [05:39:19] bawolff: whenevr i look through the code.. i always find it difficult to deal with global function and global variables [05:39:38] potter: Good, we're trying to kill them :) [05:39:52] bawolff: i didnot get you [05:40:09] potter: basically we're trying to phase out the use of most global variables [05:40:22] or the non-configuration one's anyways [05:40:27] bawolff: oh good.. but what is the best way to understand them? [05:40:48] For global functions, most of them are defined in includes/GlobalFunctions.php [05:40:58] The majority of which are small wrappers around something else [05:41:05] bawolff: i mean if i google i dont get complete information . 
sometimes there is no result at all [05:41:32] bawolff: so is it good to spend some time with GlobalFunction.phph and understand the functions cause it is too long [05:42:07] hmm, most of the functions in that file are kind of useless and mostly ther for compatability [05:42:22] I'd more wait until you come accross one you don't know, then look it up [05:42:53] bawolff: but each time i look up one.. it uses another and then it goeson like that [05:43:22] There's a couple global functions that are important - wfMessage is a wrapper around the Message class, and is rather important, wfMemKey is used a lot and just basically makes an id string [05:43:49] bawolff: hmmm... and what about global variables. [05:43:50] ? [05:44:10] That depends a lot more on the specific global variable. There's two types [05:44:32] Config type variables $wgSomeSettingName usually have an explanation in DefaultSettings.php [05:44:43] bawolff: yeah.. [05:44:50] as well as a doc page like manual:$wgSomeSettingName in mediawiki wiki [05:45:17] The other things like $wgOut , $wgParser , etc generally are just a global instance of an object [05:45:28] but sometimes its hard to figure out which object [05:45:50] bawolff: yeah..thats the problem evry newbie faces i suppose [05:45:56] bawolff: like me!! [05:46:01] admin naging me why a backup on a certain box hasn't ran in 5 days ... found out because he commented out the backup script in cron ... need coffee nao [05:46:06] The most common ones though are $wgOut is an instance of OutputPage, $wgUser is an instance of the User object representing current user, and so on [05:46:39] bawolff: yeah..those aere self explantory.. is there any proper documentation of all these [05:47:03] oh, goddes yes. In a manner of speaking. [05:47:19] potter have you seen doxygen? 
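bawolff's description of `wfMemcKey` (spelled "wfMemKey" above) as something that "just basically makes an id string" can be sketched as: prefix the wiki's ID and join the caller's components with colons. The `"enwiki"` prefix and the exact joining rule below are illustrative assumptions, not a verbatim port of the PHP helper:

```python
# Rough sketch of MediaWiki's wfMemcKey(): build a per-wiki cache key by
# joining the wiki ID and the caller's components with ':'. The real PHP
# helper also encodes unsafe characters; this shows only the shape.
def memc_key(wiki_id: str, *components) -> str:
    return ":".join([wiki_id, *map(str, components)])

print(memc_key("enwiki", "moodbar", "top-responders"))
# → enwiki:moodbar:top-responders
print(memc_key("enwiki", "user", 12345))
# → enwiki:user:12345
```

Namespacing keys by wiki ID is what lets many wikis share one memcached cluster without their cached values colliding.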
[05:47:29] bawolff: No [05:47:30] http://svn.wikimedia.org/doc/ [05:47:38] basically the code comments re-formatted [05:47:51] in particular it just goes through the methods of a class [05:47:53] for example [05:47:54] It eats my netbook and spits out sand. [05:47:56] !class User [05:47:56] See http://svn.wikimedia.org/doc/classUser.html [05:48:06] huh [05:48:28] svn.wikimedia.org/doc/classOutputPage.html <- that page in particular. Takes about 5 minutes to process. [05:49:06] Amgine: Right!! [05:50:05] is it possible to have docs copy on the comp.. i mean offline copy [05:50:22] [05:52:11] potter: yes, but i've never done it [05:52:34] bawolff: how ?is it huge? [05:52:40] you'd need to download doxygen. There should be isntructions in the docs subdirectory of your mediawiki install (possibly even a makefile) [05:52:56] It's not unreasonably huge (I assume) [05:53:40] bawolff: thats cool .. so doxygen has all the documentation? [05:54:10] Well not all the documentation. It really just turns source code comments into html pages [05:54:42] bawolff: sad.. there are very little comments [05:54:45] There's also some spotty docs on mediawiki.org and a couple other txt files in the docs subdirectory of the install (hooks.txt) [05:55:15] bawolff: where ? [05:55:58] spread out in various places. Most of it (esp low level stuff) on mediawiki.org is kind of crap [05:57:03] * bawolff has to go. ttyl [05:59:03] 03awjrichards * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/ (6 files in 4 dirs): MFT revs 109299 through 109315 [06:00:05] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: more text update [06:03:54] 03neilk * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: flip hash and bang [06:07:20] 03neilk * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/ (. 
SpecialCongressLookup.php): MFT r109318 - flip hash and bang for twitter [06:07:47] 03kaldari * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: formatting fix for notes [06:09:15] would someone tell me the best simple way to upgrade mediawiki? [06:15:45] !upgrade [06:15:45] http://www.mediawiki.org/wiki/Manual:Upgrading [06:22:36] 03raindrift * 10/trunk/extensions/CongressLookup/ (CongressLookup.php patches/CongressDataErrors.sql): added a new table for recording data errors in a more friendly way [06:22:44] 03awjrichards * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/ (. SpecialCongressLookup.php): MFT r109317, r109320 [06:25:12] 03erik * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: committing final messaging provided by Sue [06:28:05] 03awjrichards * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/ (. SpecialCongressLookup.php): MFT r109323 [06:29:38] 03varnent * 10/trunk/extensions/Blackout/skins/SopaStrike.php: modify urls used [06:37:54] mw-bot: Thank you [06:42:53] 03raindrift * 10/trunk/extensions/CongressLookup/ (ApiCongressLookup.php CongressLookup.php): API for congress lookup errort reporting [06:49:57] sumanah: hey still at the office/ [06:49:58] ? [06:51:27] hi yuvipanda [06:51:28] yeah [06:51:34] thought I might help with comment moderation if nothing else [06:51:40] sumanah: :D cool [06:54:33] 03raindrift * 10/trunk/extensions/CongressLookup/data/cl_house.sql: [06:54:33] Added second batch of corrected data, per Soli et al [06:54:33] up to line 99 [06:54:33] followup to r109285 [07:00:32] New code comment: Aaron Schulz; Module should include 'problem' in the name...; [07:01:59] 03santhosh * 10/trunk/extensions/WebFonts/ (resources/ext.webfonts.js tests/qunit/ext.webfonts.tests.js): [07:01:59] qunit tests for webfonts -- build menu [07:01:59] Fixes in extension javascript [07:01:59] - Before creating the menu check if one exists or not. if exists, remove it [07:01:59] - Correct return values of the method. 
[07:05:04] sumanah: do ping me before you leave I'd rather not get lost at night [07:05:17] yuvipanda: oh, are you waiting for me to leave with you? [07:05:19] I didn't know! [07:05:20] ok [07:05:29] sumanah: no no, am doing some work for the extension [07:05:35] oh ok [07:05:40] just that i'm hoping you'd leave late enough that I could tag along :) [07:06:17] 03khorn * 10/trunk/extensions/CongressLookup/scripts/zip_file_parser.php: [07:06:17] Wrote a table dumper that's actually useful for our purposes, that can be run through the command line. [07:06:17] Output from here will land in CongressLookup/data, and should be copied over the old data file and committed. [07:06:17] Note: You will probably have to remove one comma from the second to last line in the file... [07:07:12] yuvipanda: ok. what time you thinking? in 30 min? [07:07:17] New code comment: Nikerabbit; Only half?; [07:07:20] sumanah: sounds good [07:08:03] 03khorn * 10/var/ (6 files in 6 dirs): [07:08:04] Netbeans SVN client generated message: create a new folder for the copy: [07:08:04] Renaming an ever-more-badly named file. [07:08:11] 03khorn * 10/var/www/trunk/extensions/CongressLookup/db_manipulator.php/zip_file_parser.php: Renaming an ever-more-badly named file. [07:09:17] 03khorn * 10/trunk/extensions/CongressLookup/scripts/db_manipulator.php: Renaming an increasingly badly-named file... [07:15:32] New code comment: Khorn (WMF); Uhm... \ I'd revert this incredibly weird thing if I could find it locally. :/; [07:19:39] 03(NEW) invaliduser error is misleading - 10https://bugzilla.wikimedia.org/33781 normal; MediaWiki extensions: WikiLove; (liangent) [07:24:29] 03yuvipanda * 10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Implemented per-rep social media sharing [07:32:10] 03amire80 * 10/trunk/extensions/WebFonts/tests/qunit/ext.webfonts.tests.js: Added tests for application of web fonts to body, input, select and textarea. 
[07:43:07] 03awjrichards * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/ (. SpecialCongressLookup.php): MFT r109333 [07:43:23] 03neilk * 10/trunk/extensions/CongressLookup/ (CongressLookup.php SpecialCongressLookup.php): make report link go to Special:CongressFail [07:46:47] bestest sopa protest GIF on theoatmeal! lol [07:46:48] 03raindrift * 10/trunk/extensions/CongressLookup/ (CongressLookup.php SpecialCongressFail.php): adding stub special page for recording errors [07:52:22] 03raindrift * 10/trunk/extensions/CongressLookup/SpecialCongressFail.php: [07:52:22] wrong page name in parent constructor [07:52:22] followup to r109337 [08:00:17] New code comment: Siebrand; Hmm, I was sure I changed two files... Let's see...; [08:00:20] 03neilk * 10/trunk/extensions/CongressLookup/ (CongressLookup.php SpecialCongressLookup.php): get the url for the special page in a more correct way [08:02:50] 03(mod) margin-right:0 for table.wikitable - 10https://bugzilla.wikimedia.org/33445 (10fomafix) [08:03:06] hi MaxSem [08:03:55] morning sumanah [08:04:30] MaxSem: tomaszf asked me to look into what else is waiting on the API sandbox before it's all we wish for the hackathon this Friday [08:05:02] well, I need to tweak the UI a bit [08:05:33] 03siebrand * 10/trunk/phase3/languages/messages/MessagesEn.php: Follow-up r107881: revert. Adding forgotten part in r109260. [08:05:35] MaxSem: a little pasting from earlier: [08:05:35] Krinkle: so can you take a look at http://mobile-feeds.wmflabs.org/w/index.php/Special:ApiSandbox and see if it implements the bugs you filed regarding better descriptions of the parameters? 
[08:05:36] I never quite grokked what was desired there but you know because you asked for it :0 [08:05:36] ) [08:05:36] sumanah: yup, the buttons I had in mind are there and do the expected thing [08:05:36] that UI needs a lot of love though [08:05:38] I find my self scrolling back and forth [08:05:54] MaxSem: Timo also gave you props regarding the design/approach/technique. [08:09:26] MaxSem: I also contacted the API list to try to get feedback on the sandbox, both the deployed version and the version in your labs instance [08:09:57] MaxSem: and tomorrow when I am more coherent I aim to give you feedback on the way the examples are presented/described. I'm very glad they are implemented! [08:10:00] MaxSem: sound good? [08:11:22] 03varnent * 10/trunk/extensions/Blackout/ (6 files in 3 dirs): localisation [08:11:32] :) [08:11:55] varnent: :-) any specific interest from wikis in using that? [08:12:03] http://www.mail-archive.com/openstack@lists.launchpad.net/msg03330.html [08:12:17] hehe, caught me by surprise [08:12:22] sumanah: localisation? no clue :) [08:12:42] 03neilk * 10/trunk/extensions/CongressLookup/ (CongressLookup.alias.php CongressLookup.i18n.php): title for CongressFail page [08:12:46] but I fear the wrath of the localization team :) [08:12:48] varnent: I mean in using the Blackout ext [08:13:14] sumanah: not sure - been working on the code - still have a few hundred unread emails from today [08:13:18] ha! [08:13:21] good luck. [08:13:32] MaxSem: need anything more from me urgently or can it wait till "tomorrow"? 
[08:13:43] nope [08:13:53] ugh - it's just going to get worse - I have to go to bed soon and will just be moving furniture around all day tmrw [08:14:11] so it'll be 600 unread before I get to them :) [08:14:18] by then SOPA may be law - lol - j/k [08:14:20] aieeee [08:14:23] ok [08:14:50] 03raymond * 10/trunk/extensions/ArticleFeedbackv5/ (ArticleFeedbackv5.i18n.php SpecialArticleFeedbackv5.php): Fix r109271/r109272 in a better way to make translations less awful. [08:15:54] 03raindrift * 10/trunk/extensions/CongressLookup/SpecialCongressFail.php: contains a form, stores things in the database, is very web 1.0 [08:18:29] 03rfaulk * 10/trunk/udplog/filters/SOPA-filter.c: Added "learn-more" wiki article to filter [08:18:59] !doc [08:19:00] !docs [08:19:01] An overview of available documentation about MediaWiki can be found at [08:19:01] An overview of available documentation about MediaWiki can be found at [08:23:16] 03raindrift * 10/trunk/extensions/CongressLookup/SpecialCongressFail.php: fixed a db bug, added a success message. [08:25:02] New code comment: Raymond; Value for $1 is 0 always? \ + ), wfMessage( 'articleFeedbackv5-updates-since', 0 ) ); \ Otherwise the; [08:25:58] what is wfProfileIn [08:26:37] 03(FIXED) It's impossible to type Hebrew characters in the search box until some Latin characters were typed - 10https://bugzilla.wikimedia.org/33518 +comment (10Amir E. Aharoni) [08:28:39] potter: it's for profiling. it tells the profiler to start recording time. there's also a wfProfileOut [08:28:47] 03neilk * 10/trunk/extensions/CongressLookup/SpecialCongressFail.php: deal with lack of zip [08:29:06] raindrift :What is a profiler? [08:29:20] potter: it measures how long code takes to execute. it's for performance tuning. [08:29:26] Hello, someone speak French here ? [08:29:53] raindrift: so what kind of arguments are given to the function? 
[08:30:36] here is some documentation on using it: http://www.mediawiki.org/wiki/Manual:How_to_debug#Profiling [08:30:46] though, honestly, i do my profiling with xdebug. [08:30:52] raindrift: ok thanks a lot :) [08:31:04] np [08:31:17] New code comment: Siebrand; Let me have a go...; [08:33:32] 03siebrand * 10/var/: Revert r109330 and r109331. [08:34:02] New code comment: Siebrand; Reverted using TortoiseSVN's repo browser.; [08:36:50] 03neilk * 10/branches/wmf/1.18wmf1/extensions/CongressLookup/: remove CongressLookup for recopy from trunk [08:37:29] !sanitizer [08:38:48] what is sanitizer class used for ?i find it at may places [08:42:09] hashar: hey .. what is sanitizer class used for? [08:42:29] it is used to remove unwanted UTF-8 characters [08:42:35] and other things too. Cant remember [08:42:43] let me have a look at it [08:43:03] hashar: is there any documents for it? [08:43:33] almost sure we do not have any docu [08:43:35] documentation [08:43:57] so you will have to look at the methods comments [08:45:12] hashar: i was reading through code of a extension and i have reached the sanitizer class ... [08:45:35] well it is not that much about UTF-8, more about making some XHTML sane / safe [08:45:40] by stripping unwanted code [08:47:23] oh there is even my email validation function in it :-b [09:03:44] 03raymond * 10/trunk/translatewiki/MediaWiki/mediawiki-defines.txt: r109276: Ignore new message for translation. Propably for 1 day in use only. [09:49:30] hello [09:49:44] anyone know the nick of the community outreach person who was in here last year? [09:49:49] (terrible with names) [09:50:21] he offered me stickers and stuff for a conference? [09:52:08] last year?? [09:52:13] yeah [09:52:22] 360 days ago? [09:52:36] last calendar year [09:53:26] dbolser: Cary Bass? [09:53:32] nope [09:53:48] he's on the mailing list sometimes too... [09:53:52] * dbolser goes to search [09:56:15] lost best comment on blackout till now ... 
"Obviously Wikipedia decided to be up for the same number of days as last year ... and have a leap-day off" :D [09:56:38] :-D [09:57:34] varnent should get some publicity on https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/Projects/Page_Drive [10:00:41] Nikerabbit: just like all those "a word from jimmy/developer/editor/whoever" fundraiser sitebanners on wmf ... why aren't banners about stuff like this or "a word from our bugmeister" on mediawiki ... #justsaying [10:24:04] 03nikerabbit * 10/trunk/extensions/Translate/ (18 files in 3 dirs): Doc header updates and consistency and few small fixes [10:27:28] dbolser: Cary was the outreach guy before he left, And Sumanha(spelling?) has since taken over the position [10:28:46] Sumana [10:29:27] 03(mod) DNS configuration for Wikimedia Polska - 10https://bugzilla.wikimedia.org/33509 +comment (10Marcin Cieślak) [10:29:28] dbolser: Although jorm is a graphics guy so he might of offered stickers and such [10:31:25] Sumana! [10:32:44] dbolser: just a heads up, sumana is not a he [10:33:24] tys [10:33:32] that's still our community person [10:33:42] Sumana!Sumana Harihareswara [10:34:02] bah [10:34:09] is it S Harihareswara ? [10:34:18] * dbolser isn't sure how common a name it is [10:34:43] ok, there seems to be only one on th list [10:34:52] wikitech-l [10:35:11] that would be the one yes [10:35:19] thanks all [10:38:43] 03(mod) The Babel localisations for Haitian (ht) are not available on the WMF projects - 10https://bugzilla.wikimedia.org/33768 normal->major; +comment (10Gerard Meijssen) [10:48:48] New code comment: Nikerabbit;
 \ Expected: \ "\"Lohit Telugu\",Helvetica,Arial,sans-serif" \ Result: \ "'Lohit Telugu', Helvetica; 
[10:51:33] 	 04(REOPENED) formlink query string broke line with apostrophe in {{PAGENAME}} - 10https://bugzilla.wikimedia.org/33771  +comment (10Neill Mitchell)
[10:52:32] 	 New code comment: Nikerabbit; Output should be escaped.; 
[11:05:35] 	 New code comment: Hashar; Files did not get copied with their svn history .... So you should probably revert this rev and foll; 
[11:07:35] 	 New code comment: Santhosh.thottingal; Almost ok, except "@copyright Copyright © 2022-2012 Niklas Laxström"; 
[11:10:05] 	 03nikerabbit *  10/trunk/extensions/Translate/utils/MessageHandle.php: I really have to stop traveling to the future this often, ping r109353
[11:13:19] 	 New code comment: Nikerabbit; Darn those typing errors...fixed.; 
[11:15:31] 	 How do I add infoboxes to my mediawiki?
[11:21:00] 	 plantain: http://en.wikipedia.org/wiki/Help:Infobox
[11:21:24] 	 and http://www.mediawiki.org/wiki/Manual:Importing_Wikipedia_infoboxes_tutorial
[11:26:34] 	 hi, is this a common/known one that might happen after a (failed?) upgrade:   PHP Fatal error:  Cannot redeclare wfprofilein() in /var/www/w/includes/ProfilerStub.php on line 18
[11:27:36] 	 ignore me, it is in Upgrade instructions
[11:27:36] 	 mutante: http://www.mediawiki.org/wiki/MediaWiki_1.18/FAQ
[11:28:41] 	 p858snake|l: thanks, sry even, i should have seen that 
[11:30:30] 	 Hey guys, I have a problem with rendering pages which contain "special characters" like ë. If I preview a page with that character the whole layout is FUBAR; it happens in multiple browsers. At first I suspected something in the DB, but when I retrieve the BLOB which contains the page and decode it to text, I see the character as expected. Anybody an idea where I have to look?
[11:31:55] 	 Gotiniens: database character encoding perhaps?
[11:35:23] 	 Saruman: Didn't I rule that out by getting the right character when decoding the binary BLOB that contains the page?
[11:36:10] 	 04(REOPENED) Arabic numerals on Arabic Wiktionary - 10https://bugzilla.wikimedia.org/33758  (10Mohamed ElGedawy)
[11:36:39] 	 Saruman: I can't seem to get that to work, for example attempting to clone the Martin Luther King infobox yields this: http://thebedridden.org/index.php?title=Baterz
[11:39:19] 	 03santhosh *  10/trunk/extensions/WebFonts/tests/qunit/ext.webfonts.tests.js: Correct the testcases that were failing in r109334
[11:45:25] 	 03(mod) formlink query string broke line with apostrophe in {{PAGENAME}} - 10https://bugzilla.wikimedia.org/33771  +comment (10Pascal.Beaujeant)
[11:48:57] 	 anyone here still using ubuntu 10.4 x64 ?
[11:52:46] 	 plantain: "Importing templates into your third-party wiki from Wikipedia is often times a lengthy and challenging process."
[11:53:01] 	 mebby easier to make an infobox yourself
[11:53:51] 	 plantain: an ultrasimple infobox with only a single parameter: https://dya-knowledge.sogeti.nl/dir/Template:Maturity
[11:56:34] 	 how do I nuke all the pages I just imported then
[11:58:17] 	 plantain: that'd be http://www.mediawiki.org/wiki/Extension:Nuke
[11:59:22] 	 unfortunately that doesn't show files imported
[11:59:51] 	 [[Special:Log/import]]
[12:00:40] 	 that shows me a list of nearly 500, how do I get rid of them?
[12:02:13] 	 nevermind, I modified Nuke to query for imported pages
[12:02:42] 	 oeh, nice mod
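For anyone wanting the same trick: MediaWiki records imports in the logging table with log_type = 'import', so the modified Nuke presumably selects page titles from there. A sketch of that query against a toy in-memory copy of the table (the schema here is heavily simplified; the real table has more columns):

```python
import sqlite3

# Toy stand-in for MediaWiki's logging table (real schema is much larger).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE logging (log_type TEXT, log_namespace INTEGER, log_title TEXT)")
db.executemany(
    "INSERT INTO logging VALUES (?, ?, ?)",
    [
        ("import", 0, "Imported_page"),
        ("import", 10, "Template:Infobox"),
        ("delete", 0, "Some_other_page"),
    ],
)

# The pages a Nuke-style mod would offer for deletion: everything logged as an import.
imported = [row[0] for row in db.execute(
    "SELECT log_title FROM logging WHERE log_type = 'import' ORDER BY log_title"
)]
print(imported)   # ['Imported_page', 'Template:Infobox']
```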
[12:02:57] 	 Saruman: Also, I expect that when I click "Preview" the page doesn't go through the DB
[12:04:28] 	 03(mod) Ghost categories in wanted categories - 10https://bugzilla.wikimedia.org/15152  +comment (10Beta16)
[12:05:14] 	 Gotiniens: I couldn't say
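A broken layout around characters like ë while the stored BLOB decodes correctly usually means the bytes themselves are fine, but something between MediaWiki and the browser mislabels them; classically, UTF-8 bytes interpreted as Latin-1. This generic Python check (nothing MediaWiki-specific) shows the symptom to look for:

```python
# 'ë' stored correctly as UTF-8 is the two bytes 0xC3 0xAB.
raw = "ë".encode("utf-8")
print(raw)                    # b'\xc3\xab'

# If the HTTP Content-Type charset (or a <meta> tag) declares Latin-1 instead,
# the browser renders those two bytes as the classic two-character mojibake:
print(raw.decode("latin-1"))  # Ã«
```

So if the rendered page shows "Ã«" where ë belongs, check the Content-Type response header and the page's declared charset rather than the database.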
[12:11:03] 	 03ialex *  10/trunk/phase3/includes/ (4 files in 2 dirs): Pass some more __METHOD__ to DatabaseBase::begin() and DatabaseBase::commit()
[12:32:26] 	 03(NEW) adding __ONLYTOC__ to have only Table of contents - 10https://bugzilla.wikimedia.org/33782 normal; MediaWiki: General/Unknown; (reza.energy)
[12:41:38] 	 what is the wgOut object?
[12:42:22] 	 ?
[12:42:44] <^demon|zzz>	 It's a global instance of OutputPage
[12:44:36] 	 ^demon: i was going through some extension code .. i found that lots of global variables and global functions are used ... can you suggest the best way to deal with them?
[12:46:32] 	 New code comment: Siebrand; Please make the help page open in a new tab/window.; 
[12:48:26] 	 Localisation and internationalisation bug triage in 15 minutes in #wikimedia-dev. See http://etherpad.wikimedia.org/BugTriage-i18n-2012-01 for details.
[12:51:11] 	 Hey guys, I have a problem with rendering pages which contain "special characters" like ë. If I preview a page with that character the whole layout is FUBAR; it happens in multiple browsers. At first I suspected something in the DB, but when I retrieve the BLOB which contains the page and decode it to text, I see the character as expected. Anybody an idea where I have to look?
[12:58:01] 	 03santhosh *  10/trunk/extensions/Narayam/resources/ext.narayam.core.js: Open the Narayam help page in new tab/window.
[12:58:16] 	 Another interestingly annoying thing about SOPA. IP addresses for Canada are assigned by ARIN, which is in the US... in other words, not only is every generic TLD such as .com, .org, etc. revocable, but a site in the .ca domain controlled by Canada, running on a server hosted in Canada, only targeting a Canadian audience, can still have its IP address revoked and be shut down.
[13:00:51] 	 I wonder if the government is hosting any materials that are PD in Canada but not yet PD in the US because of its horrid copyright law.
[13:03:49] 	 potter: For $wgOut and some of the related stuff
[13:03:54] 	 !requestcontext | potter
[13:04:03] 	 :/
[13:04:16] * Dantman  sets a mine for mw-bot
[13:04:34] 	 potter: https://www.mediawiki.org/wiki/Manual:RequestContext
[13:06:26] 	 Dantman: thanks
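The RequestContext approach linked above is the usual answer to that globals question: instead of reading $wgOut, $wgUser and friends directly, code receives one context object and asks it for what it needs. A loose sketch of the pattern in Python (the class and method names only mirror MediaWiki's; this is not its API):

```python
class Output:
    """Stand-in for an OutputPage-like object ($wgOut)."""
    def __init__(self):
        self.html = []

    def add_html(self, fragment):
        self.html.append(fragment)

class RequestContext:
    """One object bundling per-request state, instead of scattered globals."""
    def __init__(self, output, user):
        self.output = output
        self.user = user

def render_greeting(context):
    # Depends only on what it is handed, so it is easy to test and reuse.
    context.output.add_html(f"<p>Hello, {context.user}</p>")

ctx = RequestContext(Output(), "potter")
render_greeting(ctx)
print(ctx.output.html)   # ['<p>Hello, potter</p>']
```

The payoff is the same in PHP: functions that take a context can be called from tests, jobs, or the API without faking globals first.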
[13:07:41] 	 03(NEW) Mechanism used for SOPA blackout is not effective if JavaScript disabled - 10https://bugzilla.wikimedia.org/33783 major; MediaWiki: General/Unknown; (billlalonde)
[13:07:53] 	 mw-bot wasn't accidentally hosting copyrighted content, was it?
[13:10:36] 	 03(NEW) Jenkins build are now 13 minutes long or so - 10https://bugzilla.wikimedia.org/33784 normal; Wikimedia: Testing Infrastructure; (hashar)
[13:10:58] 	 03(mod) Jenkins build are now 13 minutes long or so - 10https://bugzilla.wikimedia.org/33784  +comment (10Antoine "hashar" Musso)
[13:18:15] 	 03(mod) Mechanism used for SOPA blackout is not effective if JavaScript disabled - 10https://bugzilla.wikimedia.org/33783  +comment (10billlalonde)
[13:19:50] 	 03(WONTFIX) Mechanism used for SOPA blackout is not effective if JavaScript disabled - 10https://bugzilla.wikimedia.org/33783  +comment (10Chad H.)
[13:25:53] 	 03(VERIFIED) Mechanism used for SOPA blackout is not effective if JavaScript disabled - 10https://bugzilla.wikimedia.org/33783  +comment (10Antoine "hashar" Musso)
[13:26:03] 	 03johnduhart *  10/trunk/extensions/CongressLookup/SpecialCongressLookup.php: Fixing wording per email by Sue
[13:27:54] <^demon>	 johnduhart: We'll need to force an i18ncache run.
[13:28:05] <^demon>	 Otherwise it won't get picked up until this evening.
[13:28:15] 	 ^demon: It's not a localized string
[13:28:25] <^demon>	 Oh yeah en-only.
[13:28:31] <^demon>	 So just merge+deploy.
[13:29:00] 	 ^demon: do you know if our cli installer can DROP / truncate the existing database before doing installation? :b
[13:29:50] <^demon>	 That sounds likely to destroy data o_O
[13:29:57] 	 that is the point :b
[13:30:28] 	 the jenkins jobs using a Postgres backend create a fresh new database on each run
[13:30:34] <^demon>	 I can already see the bugs...
[13:30:45] <^demon>	 "MEDIAWIKI HOSED 15 YEARS OF CUSTOMER RECORDS YOU IDIOTS"
[13:30:49] 	 HAHAHAHA
[13:30:56] 	 hashar ... just add a DROP DATABASE *** as the first line in tables.sql :D
[13:31:47] 	 ^demon: only 15? You aren't doing a good enough job then >.>
[13:31:51] 	 hehehehe
[13:32:03] 	 ^demon: after that you can just say "we did tell you to backup before upgrading, right?"
[13:33:07] <^demon>	 That being said, I would love to see TRUNCATE support in the database subclasses.
[13:33:07] 	 hashar: go the full mile and add a rm -rf / just to be sure
[13:33:14] <^demon>	 It would make clearing the tables for testing way faster.
[13:33:23] 	 johnduhart: there is already a rm -rf /
[13:33:28] 	 Oracle doin' it already :D
[13:33:47] <^demon>	 freakolowsky: With a hack or with a ->truncate() method?
[13:34:12] 	 erm ... hack
[13:35:52] 	 just not sure if other DBMSes support truncate ... 
[13:36:02] <^demon>	 Mysql does. Dunno about Sqlite.
[13:36:06] 	 PG?
[13:36:43] <^demon>	 Ah, Sqlite does if you just omit the where clause in a DELETE
[13:36:58] <^demon>	 "When the WHERE is omitted from a DELETE statement and the table being deleted has no triggers, SQLite uses an optimization to erase the entire table content without having to visit each row of the table individually. This "truncate" optimization makes the delete run much faster"
[13:37:39] <^demon>	 PG manual says yes.
[13:38:04] 	 lol Ora goes a step further ... truncate is actually a low-level drop+create :D
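The SQLite behaviour ^demon quotes is easy to see with the sqlite3 module that ships with Python: a DELETE with no WHERE clause, on a table without triggers, empties the table in one optimized pass. That is the shape a hypothetical ->truncate() method could fall back to on engines without a real TRUNCATE:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (id INTEGER)")
db.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(1000)])

# No WHERE clause and no triggers on t, so SQLite takes its internal
# "truncate" shortcut instead of visiting every row individually.
db.execute("DELETE FROM t")

remaining = db.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(remaining)   # 0
```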
[13:38:28] 	 is there a way to check if mediawiki works ok with squid? it looks to me that the installation on labs just forwards everything to the webserver, because it's still under high load while the squid server is doing nothing
[13:38:51] 	 gazillion records can be lost in a microsecond ... and without undo :d
[13:38:57] <^demon>	 petan|wk: If MediaWiki didn't work with Squid then wikipedia would crash :)
[13:39:09] 	 I think they mean a specific instance
[13:39:23] 	 ^demon: i think he means "Is there a way to tell if squid is working on labs"
[13:40:08] 	 ah question is wrong demon
[13:40:24] 	 I mean, is there a way to check if a certain mediawiki works ok with squid :)
[13:40:27] 	 debug if it's ok
[13:40:46] 	 apergos helped me a bit, I will try to check headers
[13:40:52] 	 OrenBo: can you try committing something in the new lucene-search-3 SVN directory please?
[13:41:15] 	 OrenBo: that could wait after bug triage
[13:41:56] 	 question: if I keep pressing f5 on an article, should squid have high load, or apache?
[13:41:57] 	 03demon *  10/branches/wmf/1.18wmf1/extensions/CongressLookup/SpecialCongressLookup.php: MFT r109359
[13:42:18] 	 in my case apache has load 30+ and squid 0.0
[13:42:27] 	 so I guess it's wrong
[13:43:19] 	 btw apergos how do I get headers in firefox
[13:43:27] 	 hashar: sec
[13:43:36] 	 l8r
[13:43:40] 	 I'll make a new project
[13:44:51] 	 there used to be a plugin livehttpheaders
[13:45:22] 	 with the newer versions maybe there's something built in, like in the web console
[13:45:38] 	 I just use Firebug
[13:45:57] * apergos  smacks self upside the head
[13:46:02] 	 yeah of course that will work too :-D
[13:46:21] 	 In some specialized cases I also have HttpFox on hand
[13:46:22] 	 ok
[13:46:30] 	 I hope I won't need to restart ff
[13:46:35] 	 Mostly just for the case of http redirects
[13:46:36] 	 or I am dropping that :D
[13:46:41] 	 I usually resort to wget for stuff like this
[13:46:52] 	 wget shows it?
[13:47:01] 	 I could use that
[13:48:46] * Dantman  doesn't mind restarting FF much anymore
[13:49:19] 	 I do because I would need to reconnect chat
[13:49:28] 	 pft...
[13:49:30] 	 I can't use irc client here
[13:49:40] 	 all ports firewalled in work
[13:49:41] 	 Get a bouncer
[13:49:52] 	 I could use ssh proxy but this is easier
[13:49:58] 	 * tunnel
[13:50:04] 	 What OS?
[13:50:28] 	 I have win here, but I could putty to some my server to do that
[13:50:41] 	 this is not my pc
[13:57:22] 	 03yaron *  10/trunk/extensions/SemanticForms/includes/forminputs/SF_RadioButtonInput.php: Follow-up to r109076 - replaced openElement() and closeElement() with rawElement() - cleaner code, and works with MW 1.16
[13:57:48] 	 apergos: if I keep hitting f5 on a page should it have impact on squid or even apache?
[13:58:09] 	 because now it spawns an apache process on every hit
[13:58:20] 	 which maybe isn't necessarily wrong
[13:58:34] 	 problem is that I never saw a working mediawiki with squid :>
[13:58:48] 	 so I don't know how it should look
[13:59:19] 	 f5 forces a reload (tossing locally cached items)
[13:59:45] 	 so it should affect squid or apache?
[13:59:53] 	 I mean which server would have high load?
[13:59:57] 	 caused by that
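On the Squid question above: a cache hit is answered by Squid itself, and the response usually carries telltale headers, typically X-Cache: HIT and a nonzero Age, while a miss shows X-Cache: MISS and lands on Apache. Note also that F5 generally sends Cache-Control: max-age=0 (and Ctrl+F5 sends no-cache), which makes Squid revalidate or refetch, so Apache spiking under F5-hammering is not by itself proof that caching is broken. A tiny classifier over headers of the kind you would see from wget -S or curl -I (the exact header names are common Squid defaults and vary with configuration):

```python
def looks_cached(headers):
    """Rough check whether a response came from a Squid-style cache.

    `headers` is a dict of response header names to values, as printed by
    `wget -S` or `curl -I`. Header names here assume a typical Squid setup.
    """
    x_cache = headers.get("X-Cache", "")
    age = int(headers.get("Age", "0"))
    return x_cache.upper().startswith("HIT") or age > 0

# A hit served from the cache: backend Apache never sees this request.
print(looks_cached({"X-Cache": "HIT from sq-frontend", "Age": "125"}))   # True

# A miss (or a forced reload): the request went through to Apache.
print(looks_cached({"X-Cache": "MISS from sq-frontend"}))                # False
```

A practical smoke test is to fetch the same article twice without a forced reload: the second response should flip to HIT and the Age header should climb while Apache's load stays flat.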
[14:00:04] 	 03oren *  10/trunk/lucene-search-3/pom.xml: based a Maven archtype for SOLR deployment
[14:03:12] 	 Project MediaWiki-postgres-phpunit build #395: ABORTED in 1 min 52 sec: http://integration.mediawiki.org/ci/job/MediaWiki-postgres-phpunit/395/
[14:03:37] 	 I want to blackout my mediawiki site for the sopa blackouts, but I'm not sure where the header file is where I should put the