[00:00:07] as I said, mwbot is rather dumb
[00:00:13] here, his brain is: http://is.gd/hMRz
[00:00:26] Splarka: Sorry, that was a "macro" on the cheatsheet.
[00:00:27] :)
[00:00:49] !?!
[00:00:49] --mwbot-- !!!
[00:00:49] mwbot: !?!
[00:01:01] don't prefix "mwbot"
[00:01:01] Ah, there we go. Now I'm done with that.
[00:01:41] aaron * r49166 /trunk/extensions/FlaggedRevs/FlaggedArticle.php: Make latest entry of the stabilization log appears below the reviewing box when needed
[00:12:04] (mod) Spam not being deleted from OTRS system - https://bugzilla.wikimedia.org/show_bug.cgi?id=18042 (rjd0060.wiki)
[00:14:18] aaron * r49167 /trunk/extensions/FlaggedRevs/ (FlaggedRevs.class.php FlaggedRevs.hooks.php): Correct autoreview behavior with $wgFlagAvailability
[00:35:45] (mod) eliminate principal source of duplicated text table rows: unchecked undos - https://bugzilla.wikimedia.org/show_bug.cgi?id=18333 (jidanni)
[00:42:36] hi guys, does anyone know how to align a google map in a wiki entry to the right (like an image frame)?
[00:54:01] What needs to be done to rename a user, other than changing the rev_user_text in revision and user_name in user to the new username?
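The rename question above never got an answer in the channel. For the record, a minimal sketch of the usual answer, assuming a 1.14-era default schema: the username is denormalized into several *_user_text columns beyond the two mentioned, and all of them need the same UPDATE. The table list below is from memory and should be verified against maintenance/tables.sql; the Renameuser extension (Special:Renameuser) does all of this for you. Connection details are placeholders.

```python
import pymysql

# Denormalized username columns in a 1.14-era schema (verify against
# maintenance/tables.sql; prepend $wgDBprefix to the table names if set).
USER_TEXT_COLUMNS = [
    ("revision",      "rev_user_text"),
    ("archive",       "ar_user_text"),
    ("logging",       "log_user_text"),
    ("image",         "img_user_text"),
    ("oldimage",      "oi_user_text"),
    ("filearchive",   "fa_user_text"),
    ("recentchanges", "rc_user_text"),
]

def rename_user(conn, old_name, new_name):
    """Rewrite every stored copy of the username, then the user row itself."""
    with conn.cursor() as cur:
        for table, column in USER_TEXT_COLUMNS:
            # Table/column names can't be parameterized, hence the f-string;
            # both come from the trusted list above, not from user input.
            cur.execute(
                f"UPDATE {table} SET {column} = %s WHERE {column} = %s",
                (new_name, old_name),
            )
        cur.execute(
            "UPDATE user SET user_name = %s WHERE user_name = %s",
            (new_name, old_name),
        )
    conn.commit()

conn = pymysql.connect(db="wikidb", user="wikiuser", password="...")
rename_user(conn, "OldName", "NewName")
```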
[00:56:12] hi
[00:56:31] what is this room about?
[00:58:06] PeaceFul: It's about the MediaWiki software
[00:58:24] what does it mean
[00:59:11] is it a geeks room?
[01:03:42] is Ævar Arnfjörð Bjarmason on IRC?
[01:06:45] how do you add to the sidebars using an extension?
[01:07:15] xaxxon: read the hooks on mediawiki.org
[01:07:19] chuck: not atm
[01:07:42] xaxxon: http://www.mediawiki.org/wiki/Manual:Hooks/SkinBuildSidebar
[01:07:45] Skizzerz: what's his name?
[01:07:47] well, he is... but he hasn't talked in a while
[01:07:48] awesome
[01:07:50] avar
[01:07:51] chuck: avar
[01:08:04] on now...
[01:08:27] avar has been idle 3mins 22secs,
[01:08:31] avar: ping
[01:08:32] *Splarka pokes Skizzerz
[01:08:42] ah
[01:08:48] xaxxon: you might look at some extensions that do that, like Collection
[01:08:52] my client doesn't give the labels, so I thought it was 3hrs :P
[01:08:57] or google adsense
[01:09:17] 21:09:08 [freenode] CTCP VERSION reply from Skizzerz: do I know you? o.0
[01:09:18] :'(
[01:09:22] :)
[01:09:25] Skizzerz: what client do you use?
[01:09:34] you've probably asked me that at least 20 times by now
[01:10:04] have not!
[01:10:09] hint: it starts with the letter that is often found on fictional treasure maps
[01:10:43] xchat!
[01:10:49] :D
[01:11:01] hi guys, does anyone know how to align a google map in a wiki entry to the right (like an image frame)?
[01:11:17] wrap it in <div style="float:right">?
[01:11:28] (and </div> afterwards)
[01:11:50] but if I want a text to be on the left?
[01:12:07] put the text after the </div>
[01:12:19] will give it a shot. thanks
[01:12:39] the power of my limited grep skillz:
[01:12:40] charlie@serv1:~/irclogs/freenode$ grep "client" * | grep "Skizzerz\:"
[01:12:40] #mediawiki.log:21:09 < chuck> Skizzerz: what client do you use?
[01:12:40] #yourwiki.log:20:37 <+Alexfusco5> Skizzerz: what client are you using?
[01:12:40] #yourwiki.log:19:42 < glacierwolf> Skizzerz: What client would you suggest, then?
[01:12:54] heh
[01:13:07] I told you in #botters yesterday or the day before iirc
[01:17:47] thank you
[01:24:12] is there an extension that allows you to delete old revisions, or some part of the default installation that does? why i'm asking is that i have some really old revisions that happen to be compressed, and i'm not sure if i can get them decompressed efficiently....
[01:27:37] simonrvn: there is a maintenance script somewhere in /maintenance/
[01:27:53] Splarka: ok, looking - thanks
[01:28:09] deleteOldRevisions.php
[01:28:42] that might not be what you want
[01:28:50] the names there kinda suck
[01:29:05] that one is for deleting all non-top revisions, I think
[01:29:12] you might want deleteArchivedRevisions.php
[01:29:13] seems like it, i read the comments
[01:29:19] oh maybe
[01:29:41] deletes "deleted" revisions
[01:29:50] but they couldn't call it DeleteDeletedRevisions, heh
[01:30:09] heh ok :)
[01:31:06] thanks :)
[01:34:39] rar
[01:52:42] avar: ping
[01:52:58] kim * r49168 /trunk/wikiation/util/allextension.py: The script to test all extension: installs
[02:17:25] (mod) Special:Log should be able to list pages at revision level as well - https://bugzilla.wikimedia.org/show_bug.cgi?id=12083 (N/A)
[02:30:50] (NEW) Test implementation of FLaggedRevs for the English Wikipedia - https://bugzilla.wikimedia.org/show_bug.cgi?id=18334 enhancement; Normal; Wikimedia: Site requests; (cenarium.sysop)
[02:36:36] O-o
[02:36:42] How many bugs does enwiki need?
[02:38:35] Mike_lifeguard: at least 9000
[02:38:40] Mike_lifeguard: about flaggedrevs specifically? :P
[02:38:49] yes
[02:39:01] heh
[02:39:28] it's as if people think that the developers might have completely forgotten about flaggedrevs on enwiki and they need a reminder or something
[02:39:36] (NEW) Log entries for deleting log entries are opaque - https://bugzilla.wikimedia.org/show_bug.cgi?id=18335 normal; Normal; MediaWiki: Deleting; (mikelifeguard)
[02:40:06] (NEW) Stop opening bugs requesting the enabling of Flagged Revisions on English Wikipedia KTHX! - https://bugzilla.wikimedia.org/show_bug.cgi?id=18336 BLOCKER; highest; Wikimedia: Site requests; (herd)
[02:41:03] *Splarka still thinks there needs to be a top 10 most duped bugs list shown on the new bug form
[02:41:19] mine is possibly a dupe
[02:41:20] and not just open ones, "set focus on search box" for example
[02:41:24] you're a dupe
[02:41:26] Splarka: lol
[02:42:22] i wonder how many open ones are already done just never closed
[02:42:37] site requests ya mean?
[02:42:44] where do you think I got all my fixed bugs?
[02:42:49] *Mike_lifeguard sure as hell didn't fix them :D
[02:42:53] Splarka: no /all/ bugs
[02:43:23] oh, mike, those stats were for the reporter, whether or not they closed them themselves
[02:43:47] so for example you have a 60% success rate of having your bugs FIXED rather than WONT/DUP/INV/WFM by someone else
[02:43:59] I know
[02:44:10] but if you check, I've marked bugs as fixed too
[02:44:16] not as many :D
[02:44:56] (mod) Log entries for deleting log entries are opaque - https://bugzilla.wikimedia.org/show_bug.cgi?id=18335 +comment (mrzmanwiki)
[02:45:20] well, nobody cares about who is fixing bugs, just about reporting them
[02:45:42] so jidanni is winning then?
[02:46:26] shhh it's keeping him away from the mailing lists
[02:47:05] Hopefully this can be a quick and easy answer for someone here to help fix my 1st MediaWiki site that I broke - What table/file is the 'actual' content of a page stored in? - Is it 'text/php' files in the directory structure OR is it in a 'specific' table of the DB?
[02:47:43] database
[02:48:28] (mod) Log entries for deleting log entries are opaque - https://bugzilla.wikimedia.org/show_bug.cgi?id=18335 +comment (mikelifeguard)
[02:49:16] Thanks. Is each 'page' in a separate table, or are all pages in a single table? If you know the table name it would be greatly appreciated, as browsing tables in phpMyAdmin is giving me grief after the ISP upgrade from 4.x to 5.x
[02:49:29] hmm we should have an enwiki keyword >.>
[02:50:11] Maybe someone should code a little AJAX thing to pull a search with terms of the new bug's subject like GetSatisfaction does
[02:50:26] IIRC, each revision has a row in the table
[02:50:44] (the text table? the revision table? couldn't tell you without checking)
[02:50:49] so it's not per-page
[02:51:02] it's one table for all page content
[02:51:07] http://upload.wikimedia.org/wikipedia/commons/4/41/Mediawiki-database-schema.png ah, I guess text is stored in the text table (which is only /pointed to/ from the revision table)?
[02:52:19] dunno what the page table is then :D
[02:52:34] *Mike_lifeguard lazes
[02:52:41] someone who knows this should help them
[02:52:53] Many Many Thanks - Digging through the tables now :)
[02:53:07] that design was from 2007 though so it would be slightly different
[02:53:38] the true/current db layout can be found in maintenance/tables.sql
[02:54:07] The REVISION table holds metadata for every edit done to a page within the wiki. Every edit of a page creates a new revision row, which holds such information as the user who made the edit, the time at which the edit was made, and a reference to the new wikitext in the TEXT table.
[02:54:21] The TEXT table holds the wikitext of individual page revisions.
[02:55:20] The core of the wiki: each PAGE has an entry here [the PAGE table] which identifies it by title and contains some essential metadata. The text of the page itself is stored in TEXT.
[02:55:27] and there we have it :D
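The schema walkthrough just quoted is enough to pull page text straight out of the database, which is what Stephen-Netweb is attempting below. A minimal sketch of the join, assuming direct DB access, a 1.14-era default schema (page_latest points at rev_id, rev_text_id points at old_id), no External Storage, and no $wgDBprefix; connection details are placeholders.

```python
import zlib
import pymysql

conn = pymysql.connect(db="wikidb", user="wikiuser", password="...")
with conn.cursor() as cur:
    # Latest wikitext of every page: page -> revision -> text
    cur.execute("""
        SELECT page_title, old_flags, old_text
          FROM page
          JOIN revision ON rev_id = page_latest
          JOIN text     ON old_id = rev_text_id
    """)
    for title, flags, blob in cur.fetchall():
        flags = (flags or b"").decode()  # comma-separated storage flags
        if "gzip" in flags.split(","):
            # the 'gzip' flag actually means PHP gzdeflate(), i.e. raw
            # DEFLATE, hence the negative window-bits argument
            blob = zlib.decompress(blob, -15)
        print(title, len(blob))
```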
[02:57:49] Mike_lifeguard: now make a cs scheme diagram of it >.>
[02:57:58] dunno how
[02:58:02] (unless you're using external storage for page text)
[02:58:11] *Mike_lifeguard gets out of so much because of not knowing stuff XD
[02:58:26] In that case, revision -> text -> ES ?
[02:58:30] OK - I have the 'REVISION' blob data there and that makes sense and 'appears' fine - The 'PAGE' table has each of the ~30 pages and page_title etc all correct by the look of things
[02:58:33] something like that
[02:59:05] seems inefficient... why wouldn't you replace the text table with ES... instead of a two-level reference?
[02:59:30] Stephen-Netweb: is there something specific you're looking for? or a specific problem you need to resolve?
[03:01:28] The 'content' of the Wiki 'vanished' but categories/users/page_titles are there; just 'nothing' as to the content that was in each page is there anymore, and I'm hoping to find that content to cut & paste into a fresh install (or repair) if I can
[03:02:23] do you have a text table?
[03:02:47] So I have ~162 'BLOB' entries in the 'TEXT' table, and now to find the 'actual' text from these 'blobs' and fix that
[03:03:12] good luck
[03:03:54] Thanks ;)
[03:06:58] Just need to slap my hoster I think ;) As /wiki/Special:AllPages shows the pages but each page has zero/zip/none content :P
[03:07:48] mrzman * r49169 /trunk/phase3/ (RELEASE-NOTES includes/specials/SpecialSearch.php): * (bug 18316) Removed superfluous name="fulltext" from Special:Search
[03:08:40] (FIXED) In [[Special:Search]], there two named "fulltext" control tags. - https://bugzilla.wikimedia.org/show_bug.cgi?id=18316 +comment (mrzmanwiki)
[03:13:08] Done - I give up and throw my hands into the air - Each and every 'BLOB' in the 'TEXT' table has zero data, and aside from the actual content everything else is still there...... - Message to Wiki Users - Sorry, please start again :P
[03:13:40] who is your hoster?
[03:13:48] Site5.com
[03:15:07] And no backups - They did an upgrade and I 'thought' (the presumption of all...) that the MySQL phpMyAdmin backup I did before the 4.x to 5.x MySQL upgrade 'had' all the data, and didn't check, and now it returns to bite me ;)
[03:19:47] wait, was the upgrade to php or the phpMyAdmin installs?
[03:20:32] Over the past month or so they have upgraded PHP from 4.x to 5.x, MySQL 4.x to 5.x, and a Python upgrade
[03:21:46] Feel free to have a look at the site 'if anything is obvious standing out': http://bleedingedge.com.au/wiki and as a reference of the page content/views http://bleedingedge.com.au/wiki/Special:Statistics
[03:43:07] From a skin, is there a way to detect which categories the page being rendered is in?
[03:43:26] (With the hope of using a different template per-category, or to at least pass in a different stylesheet based on a category.)
[03:51:44] are there any available stubs that can be used upon a fresh install?
[03:51:55] I'm looking to put an indication that this current wiki page is under construction
[05:00:47] http://www.mediawiki.org/w/index.php?title=Download&diff=246967&oldid=246364 is this actually an improvement? :\
[05:13:10] Emufarmers: wat
[05:13:23] also the next edit removed the <br/>, so, no, it isn't any improvement without the <br/> (properly added)
[05:15:26] but that looks ugly too without a clear somewhere
[05:21:08] Well, yes, I meant with it fixed
[05:22:26] *Splarka shrugs
[05:22:42] adds more economic use of vertical real estate
[05:26:22] Are you being sarcastic?
[05:27:27] a bit
[05:27:50] everyone with a valid opinion on that subject is in Berlin drinking
[05:27:57] ah
[05:28:06] Well, I figured Tim would probably see it eventually and revert it
[05:28:11] except maybe Marybelle MZ/FF
[05:28:20] >_>
[05:28:24] <_<
[05:28:29] yah you
[05:28:35] fix the [[Download]] page
[05:28:53] FF?
[05:29:36] metahumor
[05:59:41] chuck: pong
[06:21:52] Any active mediawiki devs around? Because I have a bone to pick with you guys regarding the API and cookies that is potentially a problem
[06:22:16] Well, not potentially, it is a problem, that caused a lot of trouble for me, and possibly other users
[06:26:16] Well, since the channel is logged, I should put it on the record. The docs, until recently, suggested that the information provided on login by the API would be enough to construct the cookies needed to show that a bot is logged in. But since centralauth has been enabled, those cookies aren't even passed any more; some centralauth ones are instead. So now the login info is completely inaccurate and to construct the cookies you have
[06:26:47] I've written up a draft of some partly-baked ideas on "Continuous Integration for Wikis" at http://www.cfcl.com/rdm/weblog/archives/001673.html - Comments solicited...
[06:28:47] Atyndall: You cut off at "the cookies you have."
[06:28:55] *Lady_Aleena sets out cookies on the table for everyone.
[06:30:15] Marybelle: ...to scrape the Set-Cookie: headers.
[06:30:27] Which is silly and annoying
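The Set-Cookie scraping Atyndall describes is less painful if a cookie jar does it for you: the jar records whatever cookies the login response actually sets, CentralAuth's included, instead of reconstructing cookie names from the JSON body. A sketch under 2009-era assumptions: back then action=login succeeded in a single request, whereas current MediaWiki requires fetching a login token first. The URL and credentials are placeholders.

```python
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

jar = CookieJar()  # records every Set-Cookie header it sees
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

data = urllib.parse.urlencode({
    "action": "login", "format": "json",
    "lgname": "MyBot", "lgpassword": "secret",
}).encode()
opener.open("http://example.org/w/api.php", data)

# The jar now holds the real session cookies, whatever their names are
for cookie in jar:
    print(cookie.name, "=", cookie.value)
```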
[07:54:13] Hello :) I'm hoping to get some help with my MediaWiki config if anyone is around. Anyone around?
[07:55:37] Well, there is a developer get-together in Berlin right now, so that's where most of the tech folks are (me included), and the ones that didn't come are probably American, and it's like 2am over there
[07:55:59] It's 1am :) Should I drop my question anyway?
[07:57:31] Sure, feel free, but don't be surprised if no one answers for a while
[07:57:39] okay. :)
[07:59:06] So I'm looking to "trick" the categories page or make one myself. See, a page can have a category that is a date. Well, I have a page where a user can click on those years for those dates. I want a list of all articles that have a category with the year given. Assuming that the date is in wm_category
[07:59:33] I'm not sure how to join the SQL or how to trick the category page. So I came here :)
[08:00:37] I'd make a template like {{categorydate|2009|4|3}}
[08:01:09] that did [[Category:year {{{1}}}]] [[Category:date {{{1}}}/{{{2}}}/{{{3}}}]] or such
[08:01:37] (template and parameterize it, then you can do what you want)
[08:02:07] wow, I have no idea how to even start to do something like that. If you can tell me how to join the SQL, I can design the page myself.
[08:02:11] you could even use the #time ParserFunction and send it any old date, and categorize it thusly
[08:02:52] Thanks for the quick response though :)
[08:03:13] [[Category:year-{{#time:Y|{{{1}}}}}]] [[Category:month-{{#time:m|{{{1}}}}}]] [[Category:day-{{#time:d|{{{1}}}}}]]
[08:03:36] well, the point of this is there'd be no hacks needed
[08:03:45] you'd have each page categorized by as many components of the date string as you wanted
[08:04:27] so they'd be in all of: [[Category:Year 2009]] [[Category:Month 04]] [[Category:Day 03]] [[Category:Date 2009/04/03]]
[08:04:48] So there would be many categories then, right?
[08:05:06] yes, though you could hide the ones you didn't necessarily want to show, in article views
[08:05:12] a given article would have 4 categories.
[08:05:13] (__HIDDENCAT__)
[08:06:36] this I think would be easier than trying to string/regex/wildcard search for sublists of categories
[08:07:33] But I'd have to ask the user to specify a date and year separately. I'm not sure it would be easier than just executing the SQL to grab the title based off a category. That should be simple, I'm just not sure how to write the join.
[08:08:00] well, using #time you could accept almost any date string that wasn't ambiguous
[08:08:48] but they'd have to use templates rather than direct [[Category:date]] stuff
[08:09:11] I'm not sure what a template is, sorry
[08:10:01] http://www.mediawiki.org/wiki/Help:Templates if interested
[08:11:33] Okay. Can I ask, what is so bad about my way?
[08:12:51] no idea, just sounded like the native features in MediaWiki categories could work just as well
[08:13:25] (plus I don't know MySQL, someone else might wake up and answer you there)
[08:13:36] okay. Thanks for your help :)
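The join itself never got written out in the channel, so for the record, a minimal sketch: categorylinks maps pages to categories, with cl_from holding the page_id and cl_to holding the category name (underscores, no "Category:" prefix). Matching a year inside free-form date categories is done with LIKE here; the connection details and the category pattern are placeholders.

```python
import pymysql

conn = pymysql.connect(db="wikidb", user="wikiuser", password="...")
with conn.cursor() as cur:
    # Titles of all pages in any category whose name contains the year
    cur.execute("""
        SELECT page_title
          FROM page
          JOIN categorylinks ON cl_from = page_id
         WHERE cl_to LIKE %s
    """, ("%2009%",))
    for (title,) in cur.fetchall():
        # page_title is stored as a binary column, so decode if needed
        print(title.decode() if isinstance(title, bytes) else title)
```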
[08:17:50] hello
[08:18:42] my Interlinker.js is having a problem
[08:21:01] http://rafb.net/p/tSuTI118.html
[08:21:48] hmm so everyone is at a conference, it appears to be the best time to break something >.>
[08:22:49] heh
[08:24:31] Splarka, please can you take a look at http://rafb.net/p/tSuTI118.html for me?
[08:24:59] can you describe the symptoms other than 'having a problem'?
[08:25:24] the interwiki links show up as redlinks, Splarka
[08:25:38] any JS console errors?
[08:26:02] no JS console errors as far as I'm aware
[08:26:13] I transwikied the js from meta
[08:27:14] Error: missing } after function body
[08:27:14] Source File: file:///e:/temp/xx.htm
[08:27:14] Line: 106
[08:27:14] Source Code:
[08:27:22] ack, bad paste
[08:27:27] but I get that error running the code, hmm
[08:27:59] yah, it is missing a } just before var docobj
[08:28:27] also the returns in that top function shouldn't be full URLs
[08:28:48] Splarka, I did change the ones in the top to non-full URLs
[08:30:07] also you probably don't need that 'global' bit
[08:30:39] hmm
[08:30:54] Splarka, http://rafb.net/p/nBVfFm45.html
[08:31:05] Splarka, http://rafb.net/p/nBVfFm45.html
[08:31:34] ...
[08:31:49] Splarka, http://rafb.net/p/nBVfFm45.html
[08:32:08] yes, I saw the first two times
[08:32:25] those were my virtualhosts for testing on localhost
[08:32:58] this version has even more syntax errors
[08:33:20] oh dear. Please check out the version at mw.org
[08:33:37] sorry, test.wikipedia.org
[08:34:33] *Splarka just fixes them
[08:45:20] has anyone made the page/discussion/edit/etc... tabs use ajax?
[08:46:01] cuz jeez, that sounds really convenient! in fact, I'd like to see ajax sprinkled on many of the interface links
[08:46:35] maybe i could do it pretty easily using mwclient and MediaWiki's own ajax support
[08:47:52] annoying for linking
[08:48:06] what do you mean
[08:48:13] if the URL never changes
[08:48:20] might as well navigate in a flash frame X_X
[08:48:26] i see
[08:49:05] lots of things on a wiki can be (and some are) enhanced by ajax, but navigation doesn't seem like a good one
[08:49:07] perhaps the extension could provide an alternate toolbar to the user that displayed the actual url
[08:49:19] why? why not just click and be taken there?
[08:49:33] why reload the entire interface?
[08:49:40] we're getting a new server but mediawiki is slow
[08:49:42] cached, cheap, gzipped
[08:49:53] heh
[08:49:55] set up cache servers
[08:50:04] well, anyway
[08:50:16] ajax is perceptually faster
[08:50:22] there are some basic ajax tools (well, one init function really), and there is the api
[08:50:29] it can all be done in a few user scripts
[08:50:52] i think i know how, just from Common.css/js
[08:51:10] an addOnloadHook to usurp the default talk page tab, a function to fetch the talk page on click via api, and display it in a popup
[08:51:37] in a popup? oh god
[08:51:40] you must be kidding me
[08:51:45] to 'preview' it
[08:51:57] oh i see, a lightbox
[08:52:36] this? http://www.emanueleferonato.com/2007/08/22/create-a-lightbox-effect-only-with-css-no-javascript-needed/
[08:52:47] ew
[08:52:51] anyway: http://www.mediawiki.org/w/api.php
[08:52:51] or hovery css
[08:52:57] read that, if you wanna do it, the api is the way
[08:53:19] yeah i added parse() to mwclient recently, he committed it :)
[08:53:31] http://www.mediawiki.org/w/api.php?action=parse&page=MediaWiki
[08:54:04] that'll give you the text of a page in finalized HTML, you just need to unescape it and stick it in the body content
[08:54:17] and of course, replace all the hrefs with onclicks to do the same
[08:54:20] (yuck)
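Since mwclient, a Python client library, came up just above, here is a sketch of the same action=parse fetch Splarka links to, in plain standard-library Python to keep it self-contained. The page title and wiki URL are just the example from the channel; format=json is an assumption on my part, to avoid scraping the API's pretty-printed default output.

```python
import json
import urllib.parse
import urllib.request

API = "http://www.mediawiki.org/w/api.php"
params = urllib.parse.urlencode({
    "action": "parse",
    "page": "MediaWiki",
    "format": "json",
})

with urllib.request.urlopen(API + "?" + params) as resp:
    data = json.load(resp)

# The rendered page body as finalized HTML, ready to drop into the DOM
html = data["parse"]["text"]["*"]
print(html[:200])
```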
[08:54:32] hey Tim, are you in Berlin?
[08:54:38] tx splarka
[08:54:39] http://rafb.net/p/xFBQeH94.html - my common.js errors
[08:54:41] yes
[08:54:57] Sunwell5: http://test.wikipedia.org/wiki/Special:RecentChanges
[08:55:16] Tim: I have yet to meet an australian of this generation who likes Vegemite, do you?
[08:55:23] (unrelated question)
[08:55:24] yes
[08:55:28] freaky
[08:55:50] i eat vegemite
[08:55:57] I just did a side by side comparison of Marmite and Vegemite, the Veg does have the milder taste and less sour aftertaste
[08:56:02] (real british Marmite, not NZ)
[08:56:54] *Splarka wonders why Sunwell has such a sucky connection
[08:57:23] prefrontal: it is possible to make a live mirror using just javascript, offsite even, and serve ads