[00:05:22] (mod) retrieving passwords for accounts at locked database wikis - https://bugzilla.wikimedia.org/show_bug.cgi?id=12746 (tfinc)
[00:37:07] shinjiman * r38604 /trunk/phase3/languages/messages/ (4 files): Localisation updates Cantonese, Chinese and Old/Late Time Chinese
[00:41:30] (mod) Install News Channel extension on Wikinews - http://bugzilla.wikimedia.org/show_bug.cgi?id=14805 (codemonk)
[00:53:34] (mod) Requesting FlaggedRevs be enabled on en.wikinews - https://bugzilla.wikimedia.org/show_bug.cgi?id=15014 (JSchulz_4587)
[01:11:09] krimpet * r38605 /trunk/phase3/includes/PageHistory.php: Added an id to history-search, so it can be accessed easily via DOM
[01:20:47] dale * r38606 /branches/MetavidWiki-exp/MetavidWiki/skins/mv_embed/flvServer/ (MvFlv.php MvFlv_old.php README mvFlvServer.php): flvserver mostly working
[01:21:02] dale * r38607 /branches/MetavidWiki-exp/MetavidWiki/skins/mv_embed/flvServer/MvFlv.php: flvserver mostly working
[01:47:07] Is anybody who has worked on ConfirmEdit here? I'm working on a patch, but I've hit a little roadblock
[01:53:54] Emufarmers, I could try to take a look, what's the issue?
[02:13:55] is it possible to put a message at the bottom of all non-existent pages that lists the "what links here" links?
[02:14:14] instead of on the separate "what links here" page?
[02:37:42] demon * r38608 /trunk/phase3/ (3 files in 2 dirs):
[02:37:42] Refactor badaccess-groupX and friends to use User::getGroupsWithPermission().
[02:37:42] * Should be a bit cleaner, and remove those ugly ugly switch() statements.
[02:37:42] * Two messages are all that's needed here. Badaccess-group0, a generic
[02:37:42] "Permission denied" when no group is allowed access; and badaccess-groups which
[02:37:43] takes two parameters of $1 (groups required for access) and $2 (number of group
[02:37:47] matches, used for {{PLURAL:}}ing the message).
[02:38:17] whoa there, CIA-55.
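The r38608 commit message above describes a two-message scheme: a generic badaccess-group0 when no group grants the permission, and badaccess-groups taking the group list ($1) and a count ($2) used for {{PLURAL:}} selection. A minimal sketch of that selection logic follows; the message wording is invented for illustration, not MediaWiki's actual i18n text, and real {{PLURAL:}} handling is language-dependent.

```python
def badaccess_message(groups):
    """Toy model of the two-message scheme from r38608: a generic
    'permission denied' when no group grants access, otherwise a
    message built from the group list ($1) and the group count ($2),
    which MediaWiki would feed to {{PLURAL:}}."""
    if not groups:
        return "Permission denied."  # plays the role of badaccess-group0
    # stand-in for {{PLURAL:$2|group|groups}} in the real message
    noun = "group" if len(groups) == 1 else "groups"
    return ("The action you have requested is limited to users in the "
            f"{noun}: {', '.join(groups)}.")
```

The point of the refactor is that plural handling moves into the message text itself, so no switch() statement per group count is needed in PHP.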
:p
[02:42:03] <^demon> That would be my fault ;-)
[03:16:03] (mod) CAPTCHA group exceptions only apply to edits - https://bugzilla.wikimedia.org/show_bug.cgi?id=12142 (N/A)
[03:17:38] (mod) CAPTCHA group exceptions only apply to edits - https://bugzilla.wikimedia.org/show_bug.cgi?id=12142 (N/A)
[03:21:46] (mod) CAPTCHA group exceptions only apply to edits - https://bugzilla.wikimedia.org/show_bug.cgi?id=12142 (N/A)
[03:26:21] raymond * r38609 /trunk/phase3/maintenance/language/messages.inc: Follow up per r38608: Update messages.inc
[03:39:43] Does anyone know how to put advertisement on a wiki? Not on the actual article though.
[03:39:56] hi, i downloaded a giant .xml that's a backup of a wiki i want to use for offline use, do i have to do anything different when installing mediawiki? i just want to dump the whole thing into a folder and use it as an offline reference
[03:40:53] and only on this machine, not on a network
[03:41:06] so i don't need it as a server per se, at least not networkable, i just need to access it on this machine
[03:41:43] Does anyone know how to put advertisement on a wiki? Not on the actual article though.
[03:57:01] http://www.google.com/search?q=site%3Amediawiki.org+ads
[04:00:31] http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/GoogleAdSense/
[04:12:12] so i can just install mediawiki into a folder, put the wiki's .xml dump in another folder, and set it up like that?
[04:12:15] i just need it for offline use
[04:12:22] i don't need any sort of network rigamarole
[04:23:44] slvmchn, not quite, it requires a web server and database to run
[04:24:06] you could run it on a local webserver package like XAMPP, though. :)
[04:24:55] which wiki's dump is it? do they offer static html dumps (most wikimedia projects do)?
[04:25:30] anyone there?
[04:25:32] ?
[04:25:35] hello!!!
[04:25:47] !!!!!!!!!!!!!
[04:25:47] --mwbot-- I don't know anything about "!!!!!!!!!!!!".
[04:25:56] mwbot
[04:26:17] i'm not sure splarka, it's the nethack.wikia.com one
[04:26:57] i went here: http://www.wikia.com/wiki/Database_download and followed those instructions and got the .xml.gz
[04:27:11] there's no way i can just parse the .xml into .htmls or something?
[04:28:05] not without MediaWiki...
[04:29:38] and MediaWiki requires a webserver... hmm
[04:29:49] maybe i'll look into xampp
[04:30:02] not helpful for you, but just for reference: http://static.wikipedia.org/
[04:32:15] "It's 9:30pm, do you know where your dev is?"
[04:32:19] i suppose i could just wget the site recursively but that's ridiculous
[04:32:38] i already did the root folder but it's missing most of the content which is in subdirectories
[04:33:13] how big is nethack.wikia ?
[04:35:22] Splarka, pretty big, looks like
[04:35:58] hi guys
[04:36:07] heh, slvmchn: be sure to let Jsharp know before you wget so he can block you
[04:36:09] the .xml is about 100 megs
[04:36:11] i was wondering if it was possible to embed vimeo videos into a mediawiki instance
[04:36:18] haha yeah exactly, that's why i wouldn't do that
[04:36:36] indeed, not very nice on the servers
[04:36:42] generally it is nice to put in a user-agent with contact addy, and use &maxlag=5 to not tax the servers
[04:36:47] is there some xml browser or something i can use to just search through the file?
[04:37:07] the problem is, the xml are generally raw wikicode, not very useful without mediawiki
[04:37:17] ok if i resort to that i'll make sure to do that...
i guess it's not such a big issue at the moment, just sometimes i spend time in maine with no internet and nethack helps pass the time
[04:37:24] the parsing of such pages depends heavily on what other pages exist, templates, images, and full access to the database
[04:37:27] yeah
[04:37:45] sherrod: http://www.mediawiki.org/wiki/Extension:FramedVideo maybe
[04:38:19] Splarka: OMG YAY THANK YOU
[04:38:20] :D
[04:38:50] http://www.mediawiki.org/wiki/Extension_talk:EmbedVideo#Vimeo <-- only mentions it on the talk page though, might not support it out of the box
[04:38:56] (a second extension, that is)
[04:39:35] I know we can embed youtube, i wonder if there is just a different syntax to use vimeo
[04:39:36] FramedVideo looks to be the best bet
[04:40:08] I will give this a try right now and see what happens!
[04:43:06] "google for it: http://www.google.com/search?q=site:mediawiki.org" should be in the topic ^_^
[04:43:35] oh snap! i think i found something: http://wikitravel.org/en/Wikitravel:Offline_Reader_Expedition
[04:43:50] aww only for wikitravel
[04:43:51] sigh
[04:44:04] Splarka, it's kind of embarrassing, though, for us to admit Google is better than our own internal search for that ;p
[04:44:35] I consider it a source of pride
[04:44:43] do wikia wikis have a much different format than other wiki sites? if i can find some offline wiki app that reads a raw .xml dump is there any chance it'd work with a wikia wiki dump?
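The etiquette advice above (a user-agent with a contact address, &maxlag=5, and a throttled request rate) can be sketched as a small fetch helper. This is an illustrative sketch, not an official client: the user-agent string and example URL are hypothetical, and maxlag is the MediaWiki parameter that asks the server to refuse requests while replication lag is high.

```python
import time
import urllib.parse
import urllib.request

# Hypothetical identifying user-agent; the contact address lets admins
# reach you instead of just blocking the crawler.
CONTACT_UA = "offline-mirror-sketch/0.1 (you@example.com)"
DELAY_SECONDS = 5  # roughly the 12 pages/min pace suggested in channel

def polite_url(base_url, params):
    """Build a request URL with maxlag=5 appended, so MediaWiki can
    shed the request when its database replication lag is high."""
    params = dict(params, maxlag="5")
    sep = "&" if "?" in base_url else "?"
    return base_url + sep + urllib.parse.urlencode(params)

def fetch(url):
    """Fetch one page, identifying ourselves and pacing requests."""
    time.sleep(DELAY_SECONDS)  # space requests out; be nice to the servers
    req = urllib.request.Request(url, headers={"User-Agent": CONTACT_UA})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

A maxlag-rejected request comes back as an error response, so a real crawler would also back off and retry; obeying robots.txt, as noted below in the log, still applies on top of this.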
[04:45:02] slvmchn, all MediaWiki dumps should be more or less the same
[04:45:27] slvmchn: you still need it to be able to use the parser tag and parser function extensions wikia has installed
[04:45:42] i don't know what that means :-(
[04:45:42] otherwise it might not know what to make of weird syntax
[04:45:50] oh
[04:45:51] ok
[04:46:08] well, like Wikia has http://www.mediawiki.org/wiki/Extension:StringFunctions
[04:46:22] most wikis don't, so what would such a tool do when it saw #replace
[04:46:38] no chance there's any backup mirrors i can leech off are there?
[04:46:42] or something like that
[04:46:48] the parser is not really a parser, but an iterated replace system
[04:46:59] there's a couple of offline readers here: http://meta.wikimedia.org/wiki/Data_dumps#Wikipedia_Dump_Reader though I'm not sure to what degree they support ParserFunctions.
[04:46:59] heh, well, google's cache? ^_^
[04:47:18] ah well i give up for now, i can figure this out later, i just want to figure it out before ww3 so when the internet dies i can still play nethack and read my spoiler wiki lol
[04:48:30] you need access to the full database, for things like templates and even simple things like whether [[Foo]] is blue or red (class=New), you need the same parser extensions in Special:Version, you need pretty much the same LocalSettings.php (although you can guess at many), and the same system messages (generally), and if you want images you need the images too... it isn't a simple task to static dump without the same version and setup of mediawiki
[04:48:53] this is why wikiwyg is so hard
[04:49:31] you can only peripherally guess at simple wikicode, like '''foo''' being bold and [[abc:foo]] being a link of some sort (redirect? red? interwiki? namespace?)
[04:50:34] heck, abc: might be the localized version of Category:, which would have a completely different behavior than an interwiki
[04:50:40] brion * r38610 /trunk/phase3/ (4 files in 4 dirs):
[04:50:40] Revert r38587, r38589 for now ("(bug 2314) Add links to users custom CSS and JS into Special:Preferences")
[04:50:40] This clutters up the prefs UI with a lot of links which don't currently apply,
[04:50:40] most of which will never be used. It also isn't clear how it would apply to
[04:50:40] proposed non-skin-specific pages such as User:X/common.css/.js and
[04:50:42] User:X/print.css
[04:51:13] (REOPENED) Users custom CSS and JS not linked from Special:Preferences - https://bugzilla.wikimedia.org/show_bug.cgi?id=2314 +comment (brion)
[04:51:18] I'd say wget at maybe 12 pages/min with maxlag=5, until Jsharp blocks you ^_^
[04:51:28] it will take you forever
[04:51:33] obviously
[04:51:40] best to just install a local mysql instance and php
[04:51:56] the thing they hate the most is that IE5/6 mode "Make available offline", with absolutely NO time between requests
[04:52:47] it's blocked most everywhere
[04:53:04] also, always obey robots.txt :)
[04:53:21] brion: a very good point against any type of user/common.js is that changing skins is a refuge for people who put body {display:none;} in their monobook.css and setInterval(0,'alert:("yay!")') in their monobook.js
[04:53:44] Splarka: turning off javascript in the browser does a good job at that too
[04:54:05] mmm
[04:54:10] so some say
[04:54:19] but it doesn't help people at libraries and 'net cafes
[04:54:26] :)
[04:54:35] (of which I've had to help maybe 4-5 while at wikia, who couldn't disable JS)
[04:54:44] indeed
[04:55:08] it's rather annoying sometimes, JS load is way too high there
[04:55:16] unless the wiki has a rouge admin messing with people's JS, though, is that a problem?
:)
[04:55:25] mostly because of YUI *bleh*
[04:55:26] ;) Which is why people also proposed &nocss=1&nojs=1
[04:55:59] javascript: you have Heston's gun collection. You find 30 ways to shoot yourself in the foot.
[04:56:06] heh
[04:56:16] generally someone knowledgeable enough to lock themselves out with CSS/JS with no emergency exit probably means to do so :p
[04:56:23] heh
[04:56:32] Splarka, and you have eval() for ammunition
[04:56:38] right, but user/common.js is just so... I dunno, ookie
[04:56:45] &nocss/&nojs sounds like a good idea, though.
[04:56:48] brion * r38611 /trunk/phase3/includes/ (Linker.php Xml.php): (log message trimmed)
[04:56:48] Revert r38591 -- "Make good-faith effort to run BrokenLink hook in
[04:56:48] Linker::link(). No parser-test regressions, and it shouldn't change behavior
[04:56:48] unless Xml::expandAttributes() and Sanitizer::decodeTagAttributes() aren't
[04:56:48] inverses up to normalization."
[04:56:49] IMHO this is an excellent opportunity to kill a horrible interface and replace it with a sane one. Note the only use of the BrokenLink hook currently in our SVN is in SemanticForms:
[04:56:54] /**
[04:56:59] "In case of rougeness, break glass."
[04:57:16] Splarka: ^_^ Wikia already has something even more evil... global.js... You can lock yourself out of every last one of Wikia's wikis at once... rotfl
[04:57:22] but user/print.css sounds très sexy
[04:57:37] ;) And every skin at that
[04:57:38] Dantman|FS: I bribed Datrio personally for that
[04:57:44] heh
[04:57:45] global is really rather evil
[04:57:46] it was my favorite Wikia feature ever
[04:57:51] mhmm
[04:57:56] Splarka: assuming we wish to protect against a broken/rogue common.css/.js, then just hitting another skin doesn't help :)
[04:58:02] unless there's a no-css-no-js skin :D
[04:58:19] brion: well, most users won't create every user/skin.js
[04:58:27] and those that do...
well, they deserve it
[04:58:36] I tried writing a global CSS/JS greasemonkey script, since I figured something that global would be better on the client side...
[04:58:38] they were asking for it? tsk tsk
[04:58:44] my problem with user/common.js is that new scripts will have instructions to install them into common.js
[04:58:48] and people will of course do that
[04:58:53] little did I know greasemonkey's security model makes it nigh impossible -_-
[04:59:43] T_T You guys discussed embedding external videos without poking me...
[04:59:52] and not only are most scripts template-skin oriented (addPortletLinks for example), they are also mostly untested in the other skins
[04:59:53] * JSharp wonders why
[04:59:54] heh
[05:00:09] Greasemonkey apparently makes it impossible to inject .js into the page that runs completely within that page's scope.
[05:00:20] I do code mine for other skins, but almost never actually test them in other skins
[05:01:02] anyway, I just think user/common.js would be a pandora's box that would cause nasty regression if it was later removed
[05:01:16] yep, there's no taking it away for sure
[05:01:23] well, not without a lot of pain
[05:01:27] like ripping off a band-aid
[05:01:51] it isn't terribly helpful to people who stick with one skin 99.9% of the time, it would decrease usability slightly, and it would deprecate the skinspecific.js
[05:01:55] on the other hand, a user Common.js might encourage scripters to code cleaner
[05:02:09] heh, right...
[05:02:15] rotfl
[05:02:22] brion: you reverted Simetrical's statement of deprecation for MediaWiki:Skinname.js but it will be a long time before people believe that
[05:02:33] heh
[05:02:35] and it will carry over to user/common.js if implemented X_X
[05:02:46] mmm
[05:03:14] if common.js was to be used, would it use a combined load with another gen=js or yet *another*