[00:01:24] Who decided that we needed different functions for different argument types anyway? [00:02:01] Is this C or something? What's wrong with if( !is_object( $title ) ) { $title = Title::newFromText( $title ); } ? [00:03:30] *Simetrical counts 320 lines of code that should be more like 50 [00:04:47] No, wait, more. [00:05:09] Something like a third of Linker is makeLink variants. I guess that explains the name of the file, huh? [00:06:01] 03(mod) Character reference link can generate unreachable non-NFC title - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14952 +comment (10brion) [00:06:08] 03(mod) Unicode (UTF-8, utf8) compatibility (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=3969 (10brion) [00:07:44]
Date and time [00:07:44]
Date format [00:07:58] *Splarka wonders when anyone last, like, cleaned up Preferences [00:08:38] 03simetrical * r38162 10/trunk/phase3/includes/Linker.php: [00:08:38] Linker.php cleanup: [00:08:38] * Allow makeLinkObj to accept an associative array of arguments for $aprops, so Brion's eyes can be saved from melting. [00:08:38] * Fail fast when various methods are passed non-Titles, don't just return some garbage and hope no one notices. [00:08:39] * Whitespace, wfDeprecated(). [00:09:19] zmog [00:17:13] 03(mod) Mass deletion of images - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8527 (10mike.lifeguard) [00:18:14] 19 previously passing test(s) now FAILING! :( [00:18:47] edit [00:18:53] *brion *cough*s at Simetrical [00:18:57] 03(mod) Expand functionality of Special:nuke on meta to all wiki' s within Wikimedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11069 +comment (10mike.lifeguard) [00:19:06] Erm. [00:19:15] I could swear I sort of tested, honest! [00:19:20] :D [00:19:21] Fixing, fixing . . . [00:19:32] i'd whinge but i broke at least two major skins yesterday ;) [00:19:42] Yeah, I heard. [00:24:39] major? nobody uses modern/simple ^_^ [00:24:45] brion, wow, this interface is complete crap. [00:26:41] heh [00:26:43] which? [00:26:48] Splarka: enough do to bitch when i tbreaks :D [00:27:35] damn, who left the dev door unlocked! [00:29:28] 03(mod) New revisions occasionally created with wrong text ( but correct rev_len) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14933 +comment (10jeroenvrp) [00:30:11] 03(mod) New revisions occasionally created with wrong text ( but correct rev_len) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14933 (10jeroenvrp) [00:32:46] 03(mod) Enable GlobalBlocking extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8707 (10mohsensalek) [00:35:56] 03simetrical * r38163 10/trunk/phase3/includes/Linker.php: [00:35:56] Partially revert r38162. Caused bugs due to incredibly incomprehensible [00:35:56] parameter list for makeKnownLinkObj. (Hint: $aprops and $style do virtually the [00:35:56] same thing, except that the latter has a completely misleading name and [00:35:56] documentation.) [00:35:56] There is no salvation possible for this method. It must die. [00:36:44] 03simetrical * r38164 10/trunk/phase3/includes/Linker.php: Uh, also I shouldn't introduce syntax errors while fixing regressions. [00:45:07] 03(mod) Expand functionality of Special:nuke on meta to all wiki' s within Wikimedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11069 (10meno25wiki) [00:50:55] hi, is there any method for adding google maps to a mediawiki ? [00:51:31] http://www.mediawiki.org/wiki/Extension:Google_Maps [00:51:37] http://www.mediawiki.org/wiki/Extension:Semantic_Google_Maps [00:51:37] 03simetrical * r38165 10/trunk/phase3/ (5 files in 2 dirs): Merge TitleArray and UserArray into one unified class, ObjectArray. Adding support for a new type of object will now just take a few lines. [00:51:44] http://www.mediawiki.org/wiki/Extension:Ricks_Google_Maps [00:51:51] thnxx [00:52:03] im on it :) [00:52:41] and whatever xkcd uses (doesn't show in Version) [00:52:46] http://wiki.xkcd.com/geohashing/Special:Version [00:53:51] oh, also http://www.mediawiki.org/wiki/Extension:GoogleMapExtension [00:55:05] thnx [01:02:04] 03simetrical * r38166 10/branches/REL1_13/phase3/ (4 files in 2 dirs): Merge r38165 and part of r38107 from trunk: I changed the interface of a hook added in 1.13, so the change should be in the 1.13 release. 
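A minimal PHP sketch of the pattern Simetrical is arguing for above — one entry point that takes either a Title or a string instead of parallel makeLink variants. The function name is hypothetical, not the real Linker API:

    // Hypothetical sketch: accept a Title or a string, normalize once,
    // and fail fast on garbage instead of returning nonsense.
    function makeLinkSketch( $title, $text = null ) {
        if ( !is_object( $title ) ) {
            $title = Title::newFromText( $title );
        }
        if ( !$title instanceof Title ) {
            throw new MWException( __METHOD__ . ': invalid title' );
        }
        // ... build and return the <a> element for $title ...
    }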
[01:21:17] ok i just added the map extension... it works great! BUT i have another question... while editing and adding a map to a wiki article, the default view is set to hybrid... how do i change this in localsettings.php to "maps" [01:21:22] this is my code in localsettings [01:21:41] $wgGoogleMapsDefaults = array( "lat" => 23.402765, "lon" => -102.041016, "zoom" => 5); [01:25:44] probably "type" => [01:27:41] yah, "type" => [01:28:12] accepted values are 'map','normal','hybrid','terrain','satellite','elevation','infrared' [01:28:24] boogieman ^ [01:35:13] ok thnx [01:35:15] let me check [01:36:01] thnxx it worked... [01:39:46] Would someone please take a look at http://en.wikipedia.org/wiki/User:Lady_Aleena/Sandbox#Tests and tell me why the #if:s are not working the way they should? For some reason one or both of them want to intrude on the row above. I need to go do a few things in my kitchen (load the dishwasher and dinner), so if you have any advice, would you please put it somewhere on that page or the talk page? [01:40:09] Thanks in advance! [01:45:16] Lady_Aleena: i got the first one working [01:54:14] are there any scripts around that sync pages between wikis regularly? [01:54:25] i'm particularly in syncing Common.css and Common.js among the wikis in our farm [01:54:37] i know how to do it with pywikipediabot but was wondering if someone already had [01:54:50] aib, I know this is done, sure. [01:54:51] aib: you could run a cron job [01:54:59] I don't know by whom. [01:55:01] aib: mind telling me how to do it with pywiki? I didn't know it was easy to do [01:55:34] well pywiki will allow you to just grab the "article" content from one, change wikis, then paste it into the other [01:55:43] aib, I'd suggest for your purpose, though, that you add an extra script load to all of your pages, so you don't have to sync manually. An extension a few lines long should do it. [01:56:03] That way it would just load your really-common.css/js before loading Common.css/js. [01:56:13] And the former could be stored centrally anywhere you like. [01:57:31] simetrical, are you suggesting having a master wiki and redirecting the others towards it when loading those files? [01:57:46] well [01:58:14] from that perspective..i could do it with htaccess possibly [01:58:20] more hackishly in each MediaWiki:Common.css have an @import [01:58:46] and in MediaWiki:Common.js have a document.write("<script src=…></script>") to every page. [01:59:13] You could also just do the @import thing, yeah, although you'd have to set that up manually when initializing the wiki. [01:59:20] right [01:59:25] those are great solutions, thanks [01:59:28] and you'd have to have it at the beginning of MediaWiki:Common.css [01:59:33] Using a hook to add the extra CSS/JS load to the <head> will be more straightforward. [01:59:42] because @import only works at the beginning of css (in most browsers) [01:59:54] but you can cascade it, heh [02:00:04] In *all* standards-compliant browsers. :P [02:00:08] Good night, all. [02:00:43] good night [02:00:44] you assume all browsers are standards-comliant [02:00:46] heh [02:00:47] heh [02:01:03] ^_^ Hook? Like the one brion reverted?
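Simetrical's "extension a few lines long" could look roughly like this — a hedged sketch where the central URL and function name are invented for illustration, and the BeforePageDisplay hook signature varies by MediaWiki version:

    // LocalSettings.php — hypothetical shared-CSS/JS loader.
    $wgHooks['BeforePageDisplay'][] = 'wfAddReallyCommonCssJs';

    function wfAddReallyCommonCssJs( $out ) {
        // Central wiki holding the shared files (invented URL).
        $base = 'http://central.example.org/index.php?title=MediaWiki:';
        $out->addLink( array(
            'rel'  => 'stylesheet',
            'type' => 'text/css',
            'href' => $base . 'Really-common.css&action=raw&ctype=text/css',
        ) );
        $out->addScript( '<script type="text/javascript" src="' .
            htmlspecialchars( $base . 'Really-common.js&action=raw&ctype=text/javascript' ) .
            '"></script>' );
        return true;
    }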
[02:01:24] aib: wikia has an extension that also does this [02:01:43] it loads MediaWiki:Global.css and .js from www.wikia.com for all wikis on the shared user database [02:02:04] and for logged in users, it also loads User:user/global.css / js from same [02:02:46] awesome [02:02:48] :) [02:03:36] http://dev.wikia.com [02:05:32] 03(mod) Wikimedia OpenID provider - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13631 +comment (10dzonatas) [02:07:02] wikia's extensions page has no entries and their trunk has pretty much every mediawiki extension ever created in it :/ [02:07:06] any pointers to this things name? [02:07:12] https://svn.wikia-code.com/wikia/releases/200807.5/extensions/ [02:08:43] Global CSS/JS (Version 1.0) [02:08:46] yay:) [02:09:03] A good place to look is the /extensions/wikia folder [02:09:16] Most of the custom stuff is put there [02:09:34] found it here, thanks Dantman: https://svn.wikia-code.com/wikia/releases/200807.5/extensions/wikia/GlobalCSSJS/ [02:10:25] be sure to change the hardcoded URLs ^_^ [02:11:10] hmm [02:11:27] Dantman: someone should go through the wikia extensions and fill out 'url' now that there is a public repository [02:14:16] the usage line in that doc has this line of code. it goes in LocalSettings.php? [02:14:17] $wgHooks['SkinTemplateSetupPageCss'][] = 'wfGlobalWikiaCSS'; $wgHooks['BeforePageDisplay'][] = 'wfGlobalWikiaJS'; [02:15:15] probably, yah [02:15:22] you might wanna change the name to remove 'Wikia', heh [02:15:46] and rewrite the extension of course, maybe specify the base URL in your local settings [02:15:53] and if you don't have a global user database, disable the user JS [02:16:11] 03aaron * r38167 10/trunk/extensions/FlaggedRevs/FlaggedRevs.php: bump version [02:16:45] mmm, looks like they disabled the MediaWiki:Global.* too? why... damn you wikia [02:33:08] They moved the system global into the software itself [02:33:31] Also, be wary of 1.13 changes for shared users [02:34:07] When you test $wgSharedDB also be wary that from 1.13 on the user table is only shared if it shows up in $wgSharedTables [02:35:36] if( isset($wgSharedDB) && ( !isset($wgSharedTables) || in_array('user', $wgSharedTables) ) ) [02:36:27] lovely [02:37:37] 03shinjiman * r38168 10/ (12 files in 4 dirs): [02:37:37] Update the Chinese conversion tables [02:37:37] Last minute changes for the release branch 1.13 [02:42:21] i'm using the FCKeditor's mediawiki extension but some of my users really want the standard toolbar. i tried two solutions to move the standard toolbar under the edit box: moving it with javascript and moving {$toolbar} in EditPage.php. both work when the FCKeditor extension is not enabled, but not when it is [02:42:27] here's the js for an idea of what i'm doing: document.getElementById("editform").insertBefore(document.getElementById("toolbar"), document.getElementById("editpage-copywarn")) [02:42:48] any idea how i can get them both, one above, one below? [02:43:05] you can run that in FF by prefixing javascript: [02:44:59] my wiki just got hit by a page move vandal [02:45:41] anyone know how to move 2-300 pages back to where they previoisly were? 
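Stepping back to the GlobalCSSJS setup above: combining its hook registration with Dantman's 1.13 shared-user test gives wiring along these lines (a sketch; the hook and function names are the extension's, the guard is from the discussion):

    // LocalSettings.php — register GlobalCSSJS, but only hook the user-JS
    // loader when the user table is actually shared (1.13+ semantics).
    $wgHooks['SkinTemplateSetupPageCss'][] = 'wfGlobalWikiaCSS';
    if ( isset( $wgSharedDB ) &&
        ( !isset( $wgSharedTables ) || in_array( 'user', $wgSharedTables ) ) ) {
        $wgHooks['BeforePageDisplay'][] = 'wfGlobalWikiaJS';
    }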
[02:47:01] with a bot :) [02:47:27] or FF + tabbed browsing + snaplinks + revert links [02:47:33] (which rhymes :) ) [02:48:06] very easy with a spider [02:48:12] have it click all the rollback links [02:48:35] can even use wget | grep | sed | wget [02:52:50] snaplinks [02:53:13] yeah thats basically my first suggestion https://addons.mozilla.org/en-US/firefox/addon/4336 [02:53:29] hope you have lotsa ram:) [02:56:34] gotta find a haxed version to work with new FF [02:59:00] 03aaron * r38169 10/trunk/extensions/FlaggedRevs/FlaggedRevs.php: Expand license header [04:18:29] 03aaron * r38170 10/trunk/extensions/FlaggedRevs/flaggedrevs.css: Size reduced [05:20:15] I tried installing OggHandler but it breaks trying to view/listen to Oggs complaining of a missing PEAR.php--- there is no extensions/OggHandler/PEAR/File_Ogg/PEAR.php [05:24:23] yea you need to install that seperately ala the Readme [05:24:48] o wait nm [05:24:50] it's included [05:30:59] you do need php-pear tho [05:33:08] apt-get install php-pear should fix that :) [06:01:07] *qsheets runs away [06:18:27] 03(mod) Character reference link can generate unreachable non-NFC title - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14952 15enhancement->minor (10tstarling) [06:35:15] anyone can give me a tip on how to disable the login system on front page??? [06:36:06] ...login system on front page? [06:36:20] a link to "login" at the top, or other? [06:37:11] its like you need to login before you can see its content [06:38:20] i want the public to read my content without logging in [06:42:01] The login link is in a perfectly inconspicuous location and doesn't mark itself out as something you need to use [06:42:28] Rather it sounds like you're mixing MediaWiki up with a CMS system which it is not [06:42:54] if you have actually disabled viewing of other pages, set: $wgGroupPermissions['*' ]['read'] = true; [06:43:13] if not, maybe the main page just seems to say something that sounds like you do? can you link to your main page or upload a screenshot of it? [07:03:00] hahahah thanks guys solved it by myself. Just need to set the menu to Allow anonymous users [07:03:24] O_o [07:28:55] Simetrical: abstract classes? [07:29:38] funny [07:48:56] I'd like to have a numbered list using '#', but to interrupt the numbered list with non numbered lines [07:49:04] can I do that without interrupting the numbers? [07:50:51] checkers: if you use
<ol> and <li> html manually you can [07:51:25] or if you do some nasty... [07:51:39] # foo bar baz<!--
--> la la la <!--
-->
    [07:53:45] https://bugzilla.wikimedia.org/show_bug.cgi?id=13223 <-- also see [07:58:44] checkers: http://test.wikipedia.org/wiki/Ordered_list_continue [08:01:10] ta [08:02:56] evil no matter what, oh well [08:03:23] Nikerabbit: Remember that database abstraction system? I've already moved it out into a library... Right now I just need to create a makeValues function and once that's done it should have full support for all the basics. [08:03:35] well, do you want the non-numbered lines in the source, or in the rendered page, checkers? [08:05:28] source [08:05:44] right, then <-- \n --> is your best option [08:05:46] I'd like to be able to use '#' (or something else easy) but interrupt it randomly [08:05:56] yeah [08:06:00] until tim fixes 13223 [08:08:05] Dantman: cool [08:24:53] Nikerabbit: Though be warned... there is no way this will ever be MW compatible ;) [08:25:41] Some of the things done with it put things in places dictated a bit more by objective logic... [08:26:47] For example... because there is a lot more handled inside of a DatabaseResult, $db->selectRow is actually a SELECT with a LIMIT 1, just like select it returns a DatabaseResult rather than a row object [08:27:21] That'll never play well with all the code in MW using selectRow [08:28:38] mm? [08:29:05] MW: $row = $db->selectRow( ... ); [08:29:26] Dantman: why would you want to do that? [08:29:54] LIB: $res = $db->selectRow( ... ); $row = $res->fetchObject(); [08:30:20] 03purodha * r38171 10/trunk/extensions/intersection/DynamicPageList.i18n.php: [08:30:20] "intersection", not "union", as per: [08:30:20] http://translatewiki.net/wiki/Support#MediaWiki:Intersection-desc [08:30:20] which will likely become# [08:30:20] http://translatewiki.net/wiki/Support/Archive/2008/3#MediaWiki:Intersection-desc [08:30:30] There's a lot more information inside of $res.. Things like the insertId, and other info, it's specially handled inside of the result [08:31:39] So actually... you could actually make multiple queries and things like $res->numRows() would have been pre-stored instead of having the same content because of the way it's handled by normal DB functions [08:32:17] you know that requesting insertId EVERY time you insert a row is a giant waste of resources, right? [08:32:40] You seen mysqli? [08:33:23] I am aware of it [08:34:27] It's already something grabbed, it's stored inside of $mysqli->insertid; after each query [08:42:04] 03siebrand * r38172 10/trunk/phase3/languages/messages/ (33 files): Localisation updates for core messages from Betawiki (2008-07-29 10:30 CEST) [08:46:45] 04(REOPENED) Skins "Simple" and "Modern" are broken - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14954 +comment (10pbirken) [08:50:13] 14(INVALID) TorBlock-Extension generates false positives - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14934 +comment (10Andrew) [08:53:01] 03siebrand * r38173 10/trunk/extensions/ (81 files in 75 dirs): Localisation updates for extension messages from Betawiki (2008-07-29 10:30 CEST) [08:53:16] How do I create a list of all pages in a custom namespace (similar to Special|All pages but without the top boxes and/or similar to a category listing). Cannot find documentation on this. [08:53:31] 03(NEW) Raise rollback limit on English WP - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14967 minor; normal; Wikimedia: Site requests; (catlow) [08:55:12] hi [08:55:30] where can i find mediawiki themes/styles/templates? 
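Spelling out the two selectRow call shapes Dantman contrasts above — the second is his hypothetical library, per his description, not MediaWiki's Database class:

    // MediaWiki style: selectRow() hands back the row itself.
    $row = $db->selectRow( 'page', '*', array( 'page_id' => 1 ) );

    // Library style: selectRow() is a SELECT ... LIMIT 1 returning a result
    // wrapper; rows and metadata (numRows, insertId after writes) live on it.
    $res = $db->selectRow( 'page', '*', array( 'page_id' => 1 ) );
    $row = $res->fetchObject();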
[08:56:06] 03siebrand * r38174 10/trunk/phase3/languages/messages/MessagesDe.php: [08:56:06] Localisation updates for extension messages from Betawiki [08:56:06] * Great to have German 'core' on board! [09:02:05] 03siebrand * r38175 10/trunk/extensions/Translate/Translate.alias.php: Localisation updates for special page aliases for extensions from Betawiki [09:09:33] hi here, i am looking for the best public wiki running mediawiki out there. I already know editthis.info but do you know others? Got a list? [09:10:26] wikia? [09:10:27] :) [09:11:22] {{fact}} [09:11:57] Wikia is probably the best... but it caters to what I call sector 2... basically community wiki owned by the global community... if you're going for a wiki you or a group want to 'own' then you'll need to find a sector 3 host... most of those are crap though [09:12:57] what is sector 1? [09:14:29] Sector 1: General wiki, owned by the global community; Sector 2: Focused wiki, owned by the global community; Sector 3; Private owned wiki; ^_^ In other words... Sector 1: Wikimedia; Sector 2: Wikia; Sector 3: Host it yourself cause most of the hosts in that sector are crap; [09:14:56] i see [09:15:18] It's not the point of the page but: http://nadir-point.com/wiki/Wiki_Hosting [09:15:58] Hmm... I still have references to the old project title there... [09:16:38] so you're trying to open a startup in sector 3? [09:16:58] At some point in time [09:16:59] or is it just a plan at this stage? [09:17:04] oic [09:17:06] It's a plan [09:17:06] nice [09:18:20] I can point out two somewhat good hosts in the 3rd sector... But none mark themselves as top, or provide firm competition to any group that gets to the quality level of Wikia or Wikimedia [09:19:27] I guess the problem with 3rd sector hosting is that it requires a high level of customizability by the users themselves - with sector 2 or there is often a dedicated team that you are supposed to go to, but I wouldn't want that [09:19:28] Scribble Wiki seams fairly stable, however they are lacking in some of the common features... There's another startup aimed at Semantic wiki... However I can't vouch for it's stability [09:19:46] ...in sector 3 [09:20:21] Dantman: this is all a marketing ploy for your business :P [09:20:26] Heh [09:20:46] I don't even have the money for registering a company right now... [09:21:01] :/ I don't even have the money to buy myself a laptop right now... [09:21:28] ;) As for customizability... I have a bit of useful code already [09:22:12] Actually that new 3rd sector host doesn't look that great to me [09:22:56] Last time I checked it was $250/mo for a 'Enterprise' wiki, the primary feature of that level is the ability to point your own domain name to the wiki... [09:23:26] IMHO it's pretty lame if you need to pay a high ammount of money to point something that costs $10/yr to a site [09:23:59] ;) For that kind of money you could hire yourself a full time mediawiki consultant and get your own VPS [09:26:46] hi, what can be the reason that img_bits, img_major_mime, img_minor_mime, img_width and img_height are set to ''. I can upload an image. It exists. 
But not on image page and can be displayed by [[Image:XYZ.png]] [09:27:57] sorry forgot version: MediaWiki: 1.10.0 PHP: 5.2.4-2ubuntu5.2 (apache2handler) MySQL: 5.0.51a-3ubuntu5.1-log [09:30:18] somehow only fileicon.png is displayed on image page :-( [09:37:04] 03nikerabbit * r38176 10/trunk/extensions/Translate/ (README SpecialMagic.php Translate.php): * 2008-07-29:1 bug fixes and enhanced magic word support for AdvancedTranslate [09:40:07] anyone here owning a wiki at wiki-site.com ? I can't setup my new wiki, i don't get what to enter into the database config section [09:47:51] baah. after upgrading a foreign wiki from 1.12 to 1.13.0.rc1 I get a lot of "expansion depth limit exceeded". any idea why? [09:47:56] *mhm* seems also that image is not stored in /images/thumb/ [09:48:32] can i list with semantic all articles of a namespace? [09:49:27] aftypes.cpp:81: error: no match for ‘operator>>’ in ‘ss >> * val’ [09:49:31] that's better [09:51:14] 03midom * r38177 10/trunk/phase3/includes/JobQueue.php: don't limit random range, no need, and not efficient [09:51:24] 03rotem * r38178 10/trunk/extensions/intersection/DynamicPageList.i18n.php: Localization update for he. [09:53:20] TimStarling: any idea why I get a lot of "expansion depth limit exceeded" after upgrading a wiki from 1.12 to 1.13rc1? [09:56:49] what the fuck [09:57:06] *domas runs 'svn blame' [09:57:35] it's a mario! [09:58:29] Nikerabbit: Btw... My insert/replace handles multi-inserts like array( array( 'foo' => 'bar', 'bar' => 'baz' ), array( 'foo' => 'bar', 'baz' => 'blah' ) ); [09:58:47] Tim! :) [10:02:11] 03midom * r38179 10/trunk/phase3/maintenance/runJobs.php: don't count if you don't use it! :) [10:06:11] 03midom * r38180 10/trunk/phase3/RELEASE-NOTES: jobqueue changes [10:06:54] 03(NEW) Specifying blank action parameter triggers WMF Error instead of API Error - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14968 trivial; normal; MediaWiki: API; (overlordq) [10:29:10] 04(REOPENED) TorBlock-Extension generates false positives - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14934 +comment (10gnu1742) [10:31:07] 03(mod) TorBlock-Extension generates false positives - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14934 (10raimond.spekking) [10:44:45] I'm trying to upload a file from a URL and it gives me an error about int_curl not working. [10:45:05] I didn't have curl installed, so I just apt-getted it, but it's still giving me the same error. [10:47:06] Fatal error: Call to undefined function curl_init() in /opt/mediawiki-1.13.0rc1/w/includes/specials/SpecialUpload.php on line 181 [10:49:43] Did you restart your webserver first? [10:49:51] Or php if you're using fastcgi [10:50:18] Nope, I didn't! [10:50:26] or he just installed curl and not the php-curl :) [10:50:50] s/first/after you installed it [10:51:02] Mmmm [10:51:09] Nikerabbit: apt-get install php5-curl [10:51:33] New php modules or configuration don't take effect until php is reloaded [10:52:00] If you are running fastcgi you restart the fastcgi instance, if you are using apache's mod_php you restart apache [10:52:10] Simetrical: you broke Special:ListUsers [10:52:32] Nice, it works! [10:52:50] Thanks a bunch Nikerabbit and Dantman [10:53:13] Common issue for new sysadmins [10:53:22] I didn't do anything :o [10:53:50] 99% of software won't reload config till you restart... some support a reload command in some way, but sometimes it doesn't work [10:54:33] otoh, a php app restarts for every page request :P [10:55:41] VasilievVV: .,.
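Dantman's multi-insert example above, put in context — again his library's hypothetical API, mirroring how mysqli pre-stores the insert id on each query:

    // Two rows in one INSERT; the result wrapper keeps per-query metadata.
    $res = $db->insert( 'mytable', array(
        array( 'foo' => 'bar', 'bar' => 'baz' ),
        array( 'foo' => 'bar', 'baz' => 'blah' ),
    ) );
    $lastId = $res->insertId; // pre-stored, mysqli-style, no extra query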
[10:56:26] Nikerabbit: ? [10:56:35] VasilievVV: indeed? :D [10:56:59] Nikerabbit: You told me I needed php5-curl and not just curl. ;-) [10:57:09] VasilievVV: fix your tables if there is invalid titles? [10:57:20] Nikerabbit: yes. Enable CentralAuth and check [10:57:50] VasilievVV: I have no need for centralauth [10:58:30] Nikerabbit: Wikimedia do. And r38165 causes PHP fatals in it [10:59:00] Hi Guys, I am been talking to the IT dept at my school and we are thinking about hosting our own local copy of en-wikipedia via db dump etc. Im wondering if there is any way we could get the images to work OR when you click on the red link, the wikipedia image page came up for that image? [10:59:21] VasilievVV: where? [10:59:51] Nikerabbit: in CentralAuthUserArray, since UserArrayFromResult was removed [11:00:19] VasilievVV: your parser is completely reimplemented :) [11:00:49] VasilievVV: UserArray::newFromResult is there [11:00:49] Wonderful! :) [11:00:56] May I take a look at it? [11:01:04] Sure, I'll commit it now. [11:01:11] Prom3th3an: Well you should really think about why you would want to do that first, its very big and there is little benefit (and your school could be made liable for any mailicious content it hosts); it is possible to do, but requires in depth knowlege - I believe you can download the images from somewhere (not sure though, and they take up litrally terabytes of space) or you can scrap the images off of the servers [11:01:49] VasilievVV: ah, but the hook broke [11:02:23] should be easy to fix [11:02:31] Nikerabbit: and also, autoloader entry is broken [11:02:39] Wait, I feel I fixed it [11:03:41] 03werdna * r38181 10/trunk/extensions/AbuseFilter/ (17 files in 2 dirs): Add native parser for AbuseFilter. Currently being made into a PHP extension. Also some PLURAL stuff that somehow wasn't committed previously [11:04:03] I go to the largest school in the southern hemisphere, and I personaly run 6 wiki's (all connected by central auth) some of which are already mirroring wikimedia projects (Such as wikiquote) so I think i have enough experiance. I also think it would benifit wikimedia by us hosting our own internal (and readonly) copy [11:04:58] Prom3th3an: you can avoid most of the problems if it is not viewable by the public web [11:05:14] IT WILL BE INTERNAL :P [11:05:25] you nkow, intranet :P [11:05:44] lol [11:06:10] Im trying to save bandwidth, not host a free for all wikipedia :) [11:06:46] I laugh when people say "largest in the southern hemisphere" [11:06:53] let's see.. what else is in the southern hemisphere? [11:07:05] Africa, the South Pacific, Australia, South America. [11:07:08] that's.. like.. it. [11:07:16] LOL [11:07:18] yeah [11:07:32] Also the largest in australia too :P [11:07:51] Werdna: You forgot antartica! [11:08:00] haha [11:09:02] 03(mod) Mass deletion of images - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8527 (10Bryan.TongMinh) [11:11:46] Ok, well heres the game plan guys, the school and I want to reduce costs and increase speed. So were going to host a private, internal, intranet- read only copy of Wikipedia. This will save us (and presumably wikimedia) big bucks. We have set aside to blade servers to host it, Our only issue is we do not want to store the images locally, is there away we can link it to wikimedia's copy OR... [11:11:48] ...when the redlink for imges is clicked, it redirects to the image page on Wikipedia?
[11:12:00] 2^ [11:12:42] The pros of doing this is reduced load on the proxy server and gateway, no more vandalism on wikipedia and of course less bandwidth being used (it will be quicker also) [11:12:59] can't you set up a squid or two for pictures? [11:13:19] We dont want to host them locally [11:13:23] Actually why not just a squid of the whole thing? [11:13:46] We could just get a static html dump [11:13:51] but they are large [11:13:59] because of the images [11:14:27] Honnestly I suggest a squid... or varnish [11:14:44] Do explain? [11:14:51] does it mean we have a local copy? [11:15:27] Basically you squid cache every /wiki/, disallow the other urls, and you can forward image urls to the original images [11:16:09] Rather than trying to go and save a huge copy all at once... You get a local version of the page which can last you as long as you want [11:16:10] So your talking almost like a local reverse proxy? [11:16:28] Mhmm... squid is a reverse proxy cache [11:16:41] Its also a forwared proxy [11:16:42] Werdna: can you check I didn't break anything? [11:16:45] which is what we have now [11:16:49] 03nikerabbit * r38182 10/trunk/extensions/AbuseFilter/SpecialAbuseFilter.php: * i18n fixes [11:16:58] at the moment a large preportion of wikipedia is cahced [11:17:02] we are running squid [11:17:17] but we would prefer just to have a 100% local copy [11:17:29] (except for imageS) [11:17:50] ^_^ You do know that images are what really eats up bandwidth [11:18:09] you're not going to save yourself that many bucks if you're only caching text [11:18:14] Yes, But the squid caches them [11:18:20] Bascially [11:18:24] mkay... [11:18:25] this is complicated :S [11:18:29] heh [11:18:36] Not to me... [11:18:39] The squid server as is caching all web traffic [11:18:44] Ok [11:18:55] we want to get the text local because it changes often [11:19:12] The images can stay remote because they are mostly cached in the squid proxy [11:19:57] What's wrong with it changing often? Thats the biggest reason you would want to cache it rather than saving... [11:22:16] 03(mod) Something about these images created by a nightly build of Inkscape prevents RSVG /MediaWiki from rendering the images properly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14953 (10will.pittenger1+wikibugzilla) [11:22:17] We would prefer to have our own, stable copy that didn't need to be re downloaded every time someone fixes a grammer error. Most the articles we use are good enough. Also having a local copy has the added advantage of teachers being able to edit articles and remove or alter stuff as they see fit [11:22:31] If your issue is you're recaching often... then make squid use longer cache times and not update as often... you'll lag behind a bit, but if you need to simply sending a "fetch from originating server" or whatever it was called header with the browser (in FF you just use CTRL+SHIFT+R) and it'll force update [11:22:46] Werdna: hmm.. Why is it D_NULL, D_NUMBER, D_FLOAT, D_STRING? Where is D_BOOL? [11:23:20] Prom3th3an: override the cache time then, make squid cache wikipedia for 1 month or so [11:23:23] "Bool... I thought those didn't exist in lisp?" heh... 
ooc [11:24:25] null, false -> NIL (which is actually equivliant to "()"), true -> T [11:24:32] ^_^ Don't you love it when things are simpler [11:24:47] Prom3th3an, you are also risking to keeping a vandalised version removed 5 minutes after [11:25:06] *Duesentrieb wants stable version dumps [11:25:13] ;) Which is why I suggest the browser purging [11:25:31] Ok I'll think about the whole squid cache angle, however i would still like to know if it is possible to host a local copy (DB dump) of a wiki media project and get the images of wikimedia when required [11:25:37] If someone sees vandalism they can force refresh and if it was ever removed it'll update [11:25:54] Also. my school is like one of the worse vandalising schools..... [11:26:09] Prom3th3an it is, but not too easy [11:26:29] Prom3th3an: If vandalism is an issue... ;) Just don't allow anyone to POST to Wikipedia [11:26:53] An innocent civilions? [11:26:57] and^ [11:27:01] Dantman: or just ask Wikipedia admins to block school's IP [11:27:02] blcok them to? [11:27:08] Yes it is blocked [11:27:38] But the issue is the ammount of vandalism done in a short time and of course the schools IP whilst static does change [11:27:57] Prom3th3an disable posting to wikipedia at the squid [11:28:07] that would fuck me over. [11:28:08] :) [11:28:16] Hmm, why? [11:28:20] you can probably exempt your itnernal ip ;) [11:28:22] I post on wikipedia [11:28:56] lol, theres like over 2000 workstations [11:29:00] i use multiple [11:29:30] Hmm... what was the name of that term [11:29:44] make it check the username [11:29:56] Ah... VPM [11:29:59] VPN* [11:30:03] mau [11:30:41] Platonides> Prom3th3an it is, but not too easy [11:30:43] continue [11:30:51] I want to persue this line for now [11:30:55] Setup something where a second proxy will allow for posting and the like... But only administrators can access it [11:31:28] That will allow you access, but dissallow it for the vandalizing students [11:31:37] Prom3th3an you go to downloads.wikimedia.org [11:31:42] download a page dump [11:31:51] install mediawiki [11:31:56] import that dump into a mediawiki [11:32:24] Use the ForeignFileRepo to use WP's and Commons images... [11:32:35] But it's honnesly a bad patch to the issue [11:32:41] Werdna: I successfully compiled it under Cygwin after throwing out regex-related part [11:33:02] Werdna: by the way, you accidently commited binary file itself [11:33:20] It'll be harder to deal with local vandalism already inside of the dbdumps, and you'll have out of date info after awhile [11:33:42] You could also alway try using a web crawler [11:33:55] the local copy should be read-only [11:34:03] Yes... [11:34:15] Dantman they already hace a squid, wwhich is much better than a web-crawler [11:34:16] Which is the issue if there is already vandalism inside of the db dump [11:34:18] and get my IP blocked from wikipedia..... [11:34:26] lol [11:34:28] no [11:34:29] VasilievVV: c'est un problem? [11:34:30] for using a web carwler [11:34:41] Nope... it's only bad if you don't throttle it right [11:34:59] Werdna: well, it's better not to commit them [11:35:28] i search for a skin with a forest/wood design [11:35:31] any ideas? [11:36:04] VasilievVV: why do you say that? [11:36:32] Just a note [11:37:21] VasilievVV: no, but why should you not commit binary? [11:37:48] Werdna: because it's not cross-platform [11:37:59] well, I figure that if somebody wants the binary, they can have it [11:38:04] nothing is lost by committing it. 
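Dantman's ForeignFileRepo pointer above, as configuration — a hedged sketch; the class and its keys vary by MediaWiki release, so treat the parameter names as illustrative and check your version's documentation:

    // LocalSettings.php — serve image pages/files from a remote wiki instead
    // of storing them locally (class/keys depend on your MediaWiki version).
    $wgForeignFileRepos[] = array(
        'class'            => 'ForeignAPIRepo',
        'name'             => 'commonswiki',
        'apibase'          => 'http://commons.wikimedia.org/w/api.php',
        'fetchDescription' => true, // render the remote description page
    );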
[11:38:21] VasilievVV: to get the regexes working, install libboost_regex [11:38:27] I see [11:38:55] It is 20 Mb, about 8 hours downloading on my current connection [11:39:02] ... [11:39:06] what kind of connection is that? [11:39:42] Radio-based connection which sometimes has delay with about 30 seconds [11:39:48] VasilievVV, usually source is smaller than binaries [11:40:28] Platonides: 20 Mb is source [11:41:53] VasilievVV: any suggestions for how to marshal the data to the binary? [11:42:05] PHP extensions are scaring the living crap out of me :) [11:43:13] Why do they? :) [11:43:17] I could write files, but that sounds awfully expensive. [11:43:34] Pipe? [11:43:37] well, it's all a little complicated, and I just want to get /something/ working. [11:43:50] brb [11:56:59] Werdna: also, it doesn't support Unicode [12:27:57] i search skin with wood/forest or similar with natural? [12:28:00] any idea? [12:28:54] Anyway to tell which pages use a particular extension? [12:53:13] 03(NEW) set correct Time Zone for hr.wikibooks - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14969 normal; normal; Wikimedia: Site requests; (dalibor.bosits) [12:55:06] 03(mod) TorBlock-Extension generates false positives - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14934 (10Andrew) [12:58:18] 03rotem * r38183 10/branches/REL1_13/phase3/ (RELEASE-NOTES includes/api/ApiQueryUsers.php): Backport minor bugfix: API change: Registration time of users registered before the DB field was created is now shown as empty instead of the current time. [13:14:34] 03(mod) Refactor upload code to split backend and interface - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14925 (10Bryan.TongMinh) [13:15:46] 03(mod) set correct Time Zone for hr.wikibooks - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14969 +shell (10raimond.spekking) [13:16:53] 03(mod) Raise rollback limit on English WP - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14967 +shell (10Bryan.TongMinh) [13:43:20] hi, i have a question: when i was editing my texts yesterday there was a bar on top of the editor - now there is nothing anymore. how can i get it back ? [13:50:56] How do I create a list of all pages in a custom namespace (similar to Special|All pages but without the top boxes and/or similar to a category listing). Cannot find documentation on this. [13:51:39] kassoe: write an extension [13:52:23] kassoe: http://www.mediawiki.org/wiki/Manual:Special_pages [13:52:27] or use DPL [13:52:29] !dpl [13:52:29] --mwbot-- The DynamicPageList (DPL) extension outputs reports based on criteria given in a special tag. For more information, see and . [13:53:12] VailievVV, oops - glad I asked then as I'm still planning the structure, mightl be much easier then to use categories. Will check your links. Newbie on the way up a steep learning curve ... :-) [13:54:00] mwbot, thanks to you too ... [13:57:15] I need a hint with skinning: I want to change the Modern skin to place an image in the right upper corner [14:11:41] hello [14:11:46] hi yannf [14:12:49] i had a DDoS attack on my wiki: 24,419 invalides requests by 93 different addresses [14:13:28] invalid? [14:13:44] 64.22.89.194 - - [29/Jul/2008:08:57:32 -0400] "GET [14:13:44] HTTP/1.1" 500 500 [14:14:13] /wiki/index.php/index.php?a=http://madebymantila.yoll.net/id.txt?? HTTP/1.1" 500 500 [14:14:23] 03greg * r38184 10/trunk/phase3/includes/SearchPostgres.php: Double-check that all old revision are set null. 
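On kassoe's namespace-listing question (asked again at [13:50:56]): the "write an extension" route boils down to a single query. A hedged sketch using era-appropriate calls, with namespace number 100 as a placeholder for the custom namespace:

    // Inside a special page's execute(): list every page in namespace 100.
    global $wgOut;
    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select( 'page', 'page_title', array( 'page_namespace' => 100 ) );
    $list = '';
    while ( $row = $dbr->fetchObject( $res ) ) {
        $list .= '* [[{{ns:100}}:' . $row->page_title . "]]\n";
    }
    $wgOut->addWikiText( $list );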
[14:16:38] i had to reboot the server: it was not responding anymore [14:20:22] can i do something to prevent it? [14:21:06] yannf, do you use Squid? [14:21:18] no [14:21:38] how would it make a difference? [14:21:55] Well, if they were requesting the same page it would do a lot better. [14:22:05] If they were all requesting different URLs it would have to forward it to Apache. [14:22:23] But it could still help, if they're doing something like holding the connection open for a really long time. [14:22:28] Squid handles that far better than Apache. [14:22:28] all different [14:22:42] lighttpd with FastCGI would also do better than Apache with mod_php. [14:23:09] You'd still have to run the PHP processes, but those should deal with their part of it quickly, just returning an error page. [14:23:14] hmm? [14:23:22] Hmm whaT? [14:23:26] t? [14:23:46] oh you weren't talking about the performance [14:24:06] Simetrical: VasilievVV complainted that you broke special:listusers because you broke UserArrayNewFromResult hook that Centralauth uses [14:24:09] *Simetrical notes to self that being able to make a commit to both trunk and branches at once does not justify checking out mediawiki/ [14:24:13] Yes, I'm reverting that now. [14:24:23] Except I thought I'd do it in one commit instead of two, which turns out to be a bad idea. [14:26:09] ssl-login-work? Who's ever heard of that? [14:26:48] Simetrical: would be easy to fix to retain the old hook for compatibility or fix ca [14:27:21] Nikerabbit, I can't debug CA changes because I don't have it installed. [14:27:32] So I'll revert and ask Tim what he thinks of it. [14:27:48] It was his class, anyway. [14:27:52] on 8711 requests, 893 different (invalid) urls [14:28:16] yannf, Squid would definitely help. [14:28:48] It would not only handle the duplicate requests, it would also handle the connections much more efficiently. [14:28:56] You could configure it to reject invalid URLs, too, if you liked. [14:29:01] Simetrical: what there is to debug :D [14:29:04] So it wouldn't even have to bother Apache. [14:29:11] Nikerabbit, you're tempting me. [14:29:28] I'm wondering if we would benefit from squid too [14:29:31] *Simetrical watches REL1_4beta2 get checked out [14:29:36] Simetrical, rejecting invalid urls, that would really help [14:29:43] *Simetrical is going to let this checkout finish out of morbid curiosity [14:29:54] yannf, it might not be needed, though, if you installed Squid to begin with. [14:30:11] <^demon> Simetrical: I tried it once. Took a very looooooooooong time. I suggest go getting a coffee or something. [14:30:23] Simetrical: are you familiar with caching? [14:30:41] A real DDoS is going to do stuff like open a TCP connection and then stall for a while, maybe before it even sends the full HTTP request. [14:30:53] Simetrical, any howto somewhere? [14:30:57] If you do lots of that, every request is going to have to spawn a new Apache, which is killer. [14:31:19] Squid probably doesn't even create a new thread for each request, I assume. [14:31:30] yannf, well, apt-get install squid is a good start (or yum install squid). [14:31:37] !squid [14:31:37] --mwbot-- I don't know anything about "squid". [14:31:40] Nope, no luck. [14:31:43] lol [14:31:46] Nikerabbit, I'm somewhat familiar with caching, yeah. [14:31:55] As a general concept, you know, or what? [14:32:04] well installing squid is the easy part [14:32:09] ^demon, oh, well, it's nice 19, I can just put it in the background. 
[14:32:09] Simetrical: well I have two problems at the moment [14:32:46] yannf, http://www.mediawiki.org/w/index.php?title=Special:Search&search=Squid [14:32:59] 1) what is the correct way to purge a page from the code 2) how do I set correct caching headers for images generated by a special page [14:33:44] 1) Try looking in LinksUpdate.php 2) OutputPage almost certainly has a method for this [14:34:32] 1) more specifically I need only to purge cached items for specific language [14:35:09] Nikerabbit, are you talking about parser cache or Squid cache? [14:35:26] Squid cache is URL-specific and cookie-specific, not language-specific. [14:35:26] can you do image previews on when mouse-over text with popup balloons? [14:35:35] Tronyx_work, what? [14:35:40] Simetrical: maybe both, I'm not sure [14:35:45] man i asked that like a nut. sorry i just woke up [14:36:10] i was wondering if it would be possible to use balloons to display a preview image of a link to an image [14:36:15] did that make more sense? [14:36:19] Simetrical: similar to template changes... but I don't really need to purge everything, just for if specific big in parser cache hash is true [14:36:31] Tronyx_work, yes, sure. Lupin's popups do that, for instance. [14:36:49] Nikerabbit, I don't know so much about this. I'd have to poke around. [14:36:57] ok [14:37:21] usually i configure squid the other way round: to cache out going requests [14:37:33] sorry Simetrical, lupin's popups? [14:37:35] yannf, yes, it's being used here as a *reverse* proxy. [14:37:56] Tronyx_work, http://www.google.com/search?q=Lupin's+popups [14:38:10] lupin's image previews are broken ircc [14:38:23] argh the upload code is one big mess [14:38:25] Well, they worked at some point, anyway. [14:38:39] the code is not updated for a year or something [14:39:18] hmmm [14:39:23] am I allowed to break compatibility with pre-1.14 extensions? [14:39:46] <^demon> Bryan: I was looking at your proposal yesterday, I like :-) [14:40:00] Bryan, you shouldn't do that if possible. [14:40:20] Bryan, if you're doing a rewrite, keep all the old public functions as wrappers for the new ones. [14:40:29] *^demon wonders how he gets on some mailing lists sometimes [14:40:39] ok [14:41:02] I should not worry about $mMemberVariables right? [14:41:13] because you are not supposed to access them from outside anyway [14:41:22] <^demon> If it's public, something somewhere might.... [14:41:26] Bryan, it depends. [14:41:26] Bryan: those are tricky if they are passed to hooks [14:41:47] but in some cases 100% compatibility may not be possible [14:41:56] Ideally all member variables would be private and only be changeable through accessors and mutators. [14:41:57] luckily upload code is vert scarcely hooked [14:42:14] If it's really impractical, you can just give up on compatibility, but try to avoid that if possible. [14:43:36] thanks for the 1.13.0rc1 guys :) [14:43:53] Simetrical: ideally it should be possible to easily add functions masking direct access to member variables afterwards [14:43:55] 4.9G mediawiki/ [14:43:57] not when it is coded [14:44:05] And I didn't even finish the checkout. [14:44:15] Nikerabbit, that's what __get and __set are for. Too bad they're horribly broken. [14:44:52] Simetrical: yes [14:45:13] Simetrical: but that doesn't really matter, even Java doesn't do it easily [14:45:25] Python does. [14:45:33] and C# I've been told [14:46:18] @property iirc in python [14:48:59] Python has __getattr__ and __setattr__. 
http://docs.python.org/ref/attribute-access.html [14:49:59] <^demon> Searching "__get" on the PHP website shows 2 entries about overloading, and over 100 bugs (not sure how many are open/closed :-p) [14:50:05] :D [14:50:18] giant __get with a switch isn't really what I had in mind [14:50:22] ^demon, the major point is it doesn't allow recursion. [14:50:36] No __get call can be nested in another __get call. [14:50:44] Even, apparently, of another class entirely. [14:50:53] If that happens, the latest call silently returns null. [14:51:05] Which is UNBELIEVABLY BROKEN. And working as intended. [14:51:13] *Simetrical stabs PHP some more [14:51:14] :D [14:51:46] <^demon> Features are great. [14:51:54] <^demon> Bugs labelled as features are even better. [14:53:23] 03simetrical * r38185 10/trunk/phase3/ (5 files in 2 dirs): Revert r38165 for now, breaks CentralAuth and I don't have that installed anywhere to debug. [14:56:39] 03simetrical * r38186 10/branches/REL1_13/phase3/ (4 files in 2 dirs): Revert r38166, per r38185 (breaks CentralAuth). [14:59:54] Hi all. Quick question: I am getting an error on en WP and need to find out whether its my fault, or WP itself. [15:00:31] I am using Popups, and whenever I hover on a picture, I get the error: imagepage preview failed. Is the query.php extension loaded? [15:01:02] sorry... loaded >>> installed. [15:05:11] query.php is broken and should be killed [15:05:21] actually, it should be rewritten to use the new api [15:08:16] Because we love trashing stable interfaces we create specifically for third-party tools to integrate with. [15:08:35] gee, "stable"? [15:08:41] Yeah, evidently not. [15:12:13] 03vasilievvv * r38187 10/trunk/extensions/AbuseFilter/parser_native/ (affunctions.cpp makefile): Support Unicode via ICU in parser_native [15:42:07] qsheets, you have a new mesage on [[en:User talk:Qsheets]]. [15:46:35] Now, I am about to go insane. I can not figure out why the {{#ifeq:{{{needs infobox|}}}|yes| ... is not staying hidden when it really should. [15:46:41] hi [15:46:48] VasilievVV: oh [15:47:44] 03purodha * r38188 10/trunk/phase3/languages/messages/MessagesDe.php: Reduces the potential to be misinterpreted for few German Exif messages. [15:48:36] BLAST! I wanted to keep each {{#ifeq: seperate, but I keep forgetting that wikicode is bad when it comes to mixing tables and parser functions. [15:49:36] I have to out the carrage return. *grr* [15:50:51] Nikerabbit: ?\ [15:50:59] VasilievVV: nothing :o [15:53:32] 03jhsoby * r38189 10/trunk/extensions/Gadgets/Gadgets.php: Adding
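Nikerabbit's two caching questions from [14:32:59] translate roughly as follows — a hedged sketch; the method names are from core of that era, so verify them against your version before relying on this:

    // 1) Purge a page from code: invalidate its parser cache entry and
    //    ask squid to drop the cached URL (a no-op if you run no squids).
    $title->invalidateCache();
    $update = new SquidUpdate( array( $title->getInternalURL() ) );
    $update->doUpdate();

    // 2) Cache headers for an image streamed by a special page: raw output,
    //    so plain header() calls; 3600 seconds is an arbitrary example.
    header( 'Cache-Control: public, s-maxage=3600, max-age=3600' );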