[01:05:13] 03(NEW) Allow hyperlinks in edit summary/block log when it' s to a Wikimedia page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14892 15enhancement; normal; MediaWiki: History/Diffs; (TheEvilSpartan) [01:44:10] 03demon * r37929 10/trunk/phase3/docs/README: Typofix [01:46:18] hello everyone [01:47:06] I installed a skin for mediawiki script [01:47:20] but the proble is that when I preview i get an error in some parts of the page [01:47:49] it looks like: fatal error: Call to undefined method SkinMonoCMS::tooltipAndAccesskey() in [01:48:09] \xampp\htdocs\mw\skins\MonoCMS.php [01:48:14] any ideas? [01:48:15] Then there's something wrong with the skin [01:48:35] I thought so [01:48:51] and I checked the tooltipandaccesskey [01:48:59] it does not exist in the orginal wiki skins [01:49:19] but why was it added to this skin [01:49:25] if it does not work [01:49:27] ? [01:49:41] No clue, they must have been trying to do something but did it in a screwy way [01:49:54] that function was supposed to be a Linker -> Skin function [01:50:07] I expect they were trying to call it inside of a SkinTemplate [01:50:22] abdul, what version of MW? [01:50:31] I downloded it yesterday [01:50:36] so I guess it is the latest [01:51:08] There are actually 2 "latest" versions... [01:52:52] the file that I downloaded has 1.9.3 [01:53:20] I checked their website and it seems that they have 1.12 [01:53:35] so propable mine is outdated [02:03:02] 03aaron * r37930 10/trunk/extensions/FlaggedRevs/ (FlaggedRevs.php specialpages/RatingHistory_body.php): [02:03:02] * Graph fixes [02:03:02] * Tweak default params [02:28:13] What CSS class am I looking for if I want to modify the look of the left hand nav bars? .portlet is used by to much else to be what I'm looking for I think. [02:38:03] YurtleTheTurtle, name each of the id's individually, plus get the class for the navigation thingies. Look in skins/monobook/main.css for examples. [02:38:28] I think i've found what the offical skin is doing... but its just not behaving atm [02:49:13] 03aaron * r37931 10/trunk/extensions/FlaggedRevs/ (2 files in 2 dirs): UI tweaks [03:09:09] 03(mod) Special:Nuke does not list imported pages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9120 +comment (10cameronem) [03:10:54] 03(mod) Special:Nuke does not list imported pages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9120 (10cameronem) [03:39:44] 03sql * r37932 10/trunk/extensions/AntiSpoof/ (AntiSpoof.i18n.php AntiSpoof.php SpoofUser.php): [03:39:45] Try again at bug 12232 - Return more than one result on attempted spoofing. [03:39:45] Split the spoof message into five messages, one for each number of spoofs detected per the last revert. It's a little less flexible than I would [03:39:45] have liked, but, it works. [03:40:33] 03(FIXED) AntiSpoof should return more than one result - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12232 +comment (10sxwiki) [03:49:29] Server Error in '/' Application. [03:49:31] -------------------------------------------------------------------------------- [03:49:33] Object reference not set to an instance of an object. [03:49:34] Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. [03:49:36] Exception Details: System.NullReferenceException: Object reference not set to an instance of an object. 
[03:49:38] Source Error: [03:49:40] An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. [03:49:43] Stack Trace: [03:49:45] [NullReferenceException: Object reference not set to an instance of an object.] [03:49:47] IIS7Injector.InjectedContentStream.Write(Byte[] buffer, Int32 offset, Int32 count) +146 [03:49:49] System.Web.HttpWriter.FilterIntegrated(Boolean finalFiltering, IIS7WorkerRequest wr) +265 [03:49:51] System.Web.HttpResponse.FilterOutput() +80 [03:49:53] System.Web.CallFilterExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +54 [03:49:55] System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +64 [03:49:57] [03:49:59] -------------------------------------------------------------------------------- [03:50:01] Version Information: Microsoft .NET Framework Version:2.0.50727.1434; ASP.NET Version:2.0.50727.1434 [03:50:07] why came this error? [03:51:15] No idea. There's nothing in there that has any information about mediawiki. And, please, don't paste that kind of thing directly in the channel, use a pastbin [03:51:58] sorry friends [03:52:41] how to see a new user in the recent changes? [03:55:35] [[Special:Log/newusers]] [04:03:10] 03sql * r37933 10/trunk/extensions/AntiSpoof/ (AntiSpoof.i18n.php AntiSpoof.php): [04:03:10] Put requests inside double quotes, as OverlordQ pointed out to me a few seconds after the last commit, so that it is easier to distinguish exactly [04:03:10] which username the offending one is. Also moved a word from antispoof-name-conflict into the numbered ones, so that it can vary in plurality based [04:03:10] on the number of conflicts [04:06:34] holy newlines batman *slaps sql* [04:07:52] oh, christ, I forgot to run 'flip' on it, didn't I [04:48:48] Has anyone tried the Ogg Theora video streaming extension with high reslution video? ie 1920x1080? [04:50:18] Hello, how to enable mediawiki Template:Extension ?? [05:01:13] 03stipe * r37934 10/branches/MetavidWiki-exp/MetavidWiki/skins/mv_embed/ (26 files in 3 dirs): started implementing new skin with flowplayer [05:06:56] 03stipe * r37935 10/branches/MetavidWiki-exp/MetavidWiki/skins/mv_embed/mv_flashEmbed.js: disabled flowplayer controls [05:07:09] is it possible to show both the FCKeditor and the standard mediawiki edit toolbar? [05:07:31] i modified EditPage.php and I also wrote custom javascript to move the mediawiki toolbar below the textarea. both solutions failed [05:07:55] they work when the FCKeditor is not enabled, but when it's enabled they don't show up below it..which is bizarre! [05:28:58] i have installed confirmedit on mediawiki v1.12.0, but it doesn't seem to work.. [05:29:01] can someone help? [05:29:27] Is there a place to get non-wiki help from trusted wiki users? [05:31:23] njathan: In what way? [05:32:33] i presume that after installing the extension, a user who edits a page should be required to key in a confirmation code... right? it aint asking for any confirmation... [05:33:03] Not really... 
You're supposed to configure it first [05:34:03] 03(mod) AntiSpoof should return more than one result - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12232 (10overlordq) [05:34:05] The defaults only trigger captchas on url adding, account creation, and bad logins for anyone without the bot or sysop flags [05:34:26] You'll need to modify $wgCaptchaTriggers if you want anything more restrictive [05:35:14] err.. http://www.mediawiki.org/wiki/Extension:ConfirmEdit said add the line in localsettings.php "Thats All"... :P [05:36:15] ^_^ Ya, it'll work... but it'll be far from what you want [05:36:42] hey thanks Dantman... [05:36:54] Also. you may want to enable FancyCaptch [05:36:56] +a [05:37:08] By default ConfirmEdit uses simple math captchas [05:38:42] sure.. i'll try my hand at that too.. and come back with more questions to bug the channel.. ;) [05:39:19] anyone know if StatSVN's been run against phase3 anywhere? :) [05:39:30] (Curious, before I blow 4 hours running it myself) [05:39:42] well kinda [05:40:29] http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/39037 [05:40:59] OverlordQ: not exactly, but, I did get that mail :P [05:41:13] (and, it's a very cool video!) [05:42:58] *SQLDb runs a quick one on AntiSpoof before running it over extensions/ or phase3/ [05:46:12] ^_^ There's something you might find more interesting [05:46:42] I've got commit access to code_swarm and now there's MediaWiki support inside of code_swarm [05:48:04] yeah, code_swarm is quite awesome :) [05:59:29] ^_^ I already have a visualization of the Narutopedia [05:59:39] Though, I can't get x264 working [06:00:33] example: http://toolserver.org/~sql/svn-stats/AntiSpoof/index.html [06:07:56] has anyone used the extension boardvote and antibot.. theres not much info on them on mediawiki! [06:08:09] well boardvote is really MW specific [06:08:43] (I'm assuming) :) [06:10:22] there not much documentation on it.. [06:25:14] Hello, why the categorytree extension works properly with http://domain.com and not with http://www.domain.com ? [06:28:02] :-D [06:40:40] \join #apache [06:41:25] what? [06:42:28] heh [06:44:15] hello [06:44:36] phoenixsampras: hi [06:50:42] Dantman: Thanks!! confirmedit with fancycaptcha working cool.. :) [06:51:53] How to enable the template:extension , aint working? [06:52:27] anyone using emoticon extension for mediawiki 1.12 [07:04:42] Is anyone in here familiar with the DjVu extension? [07:05:33] before i submit a bug report: can somebody try to reproduce my problem (involves modifying MediaWiki:Anoneditwarning)? [07:08:14] i only have 1.11.2 installed, but i suspect what i found can be seen in higher versions [07:09:21] what you want changed? [07:12:05] transclude some page on MediaWiki:Anoneditwarning, and have that page uses and . Then look at ???MediaWiki:Anoneditwarning directly AND compare with what you see when you edit a page as anon. You should see a difference. [07:15:18] How to enable the template:extension , aint working? i mean the summary at the left of each page, i want to setup it [07:16:22] schnittche1: I'll transclude http://no.wikipedia.org/wiki/Mal:Anonedittest on nowiki. are you ready to test? [07:16:39] sorry it is not ;-) [07:17:06] uuh different language... [07:17:25] just change the template to what you want tested [07:17:42] ... 
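
A rough LocalSettings.php sketch of the ConfirmEdit setup Dantman describes above ($wgCaptchaTriggers plus FancyCaptcha); the captcha image directory and the exact set of triggers enabled here are assumptions, not the only valid configuration:

    require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
    require_once( "$IP/extensions/ConfirmEdit/FancyCaptcha.php" );
    $wgCaptchaClass = 'FancyCaptcha';
    # FancyCaptcha needs a directory of pre-generated images; the variable
    # name and path below are assumptions, check the extension's README.
    $wgCaptchaDirectory = "$IP/images/captcha";

    # The defaults only cover external-link additions, account creation and
    # bad logins; turning 'edit'/'create' on is the "more restrictive" setup.
    $wgCaptchaTriggers['edit']          = true;
    $wgCaptchaTriggers['create']        = true;
    $wgCaptchaTriggers['addurl']        = true;
    $wgCaptchaTriggers['createaccount'] = true;
    $wgCaptchaTriggers['badlogin']      = true;
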
and you can change the language ;) [07:18:19] and all do different things [07:18:33] this is not included [07:18:44] this only shows when transcluded [07:18:48] Zorolll: you setup Mal:Aninedittest, but this is not MediaWiki:Anoneditwarning [07:19:04] This makes the rest of the page act as if it were wrapped in noinclude [07:19:07] schnittche1: yeah, I'll include the template as soon as you are ready to test [07:19:10] MediaWiki:Anoneditwarning is displayed on top of the edit box when editing a page anonymusly [07:19:17] things show when not included, it is not like includeonly [07:19:23] oh i see [07:19:25] I don't want to have it there for longer than needed [07:20:04] Zorolll: it looks ok for me... uses both features [07:20:16] foo bar baz flibble blibble blip blop bloop [07:20:28] foo bar baz flibble blibble blip blop bloop [07:20:32] those behave the same (pretty much) [07:20:45] handy if you have a large page and only want to include a tiny bit [07:21:12] schnittche1: done, do your tests and let me know when you're finished [07:22:00] huh, it works! [07:22:12] ah i know [07:22:25] so I can revert? [07:22:30] sorry Zorrolll, it probably needs another indirection [07:22:43] not if you can wait for 5 minutes [07:22:53] ok, I'll hang [07:23:04] Hey does anyone know of a bot I can use to copy a cetegory of images from commons to my own personal mediawiki? [07:23:17] how do integrate custom google maps into my wiki page! [07:29:19] Zorroll: http://no.wikipedia.org/wiki/Mal:Anonedittest now transcludes a page using and so. can you copy it over? [07:30:24] Zorrolll: forgot an "l" on your name ;-) [07:31:52] schnittche1: that template is already included [07:31:59] weird [07:32:27] it works! [07:33:02] maybe because of the newer version? [07:33:15] This is Wikipedia. Wikipedia never fails! [07:33:21] so I can remove the test? [07:33:34] yes you can [07:33:37] thank you! [07:33:41] the parser was rewritten since 1.11 IIRC [07:33:48] i have to go to work now [07:34:00] the way messages handle transcluded messages was changed a bit [07:34:18] Splarka: this will be exactly it, i guess [07:35:12] we have 1.10 [07:36:54] thanks to both of you! cu [07:44:52] how do integrate custom google maps into my wiki page! [07:45:01] So does anyone know of a bot that can copy images from one wiki (say commons) and place them into another media wiki [07:45:04] ?? [07:46:23] moaning [07:47:57] so, is there a way to create your own "formatter" in MediaWiki, toghether with the usual ones such as ''' for bold, and so on? [07:48:22] like, lets say I wanna create something to act as ,blockquote... [07:48:30]
<blockquote>, sorry [07:50:16] MediaWiki already has colon for blockquote [07:50:20] :Indented comment [07:50:22] translates into [07:50:31] <blockquote>Indented comment</blockquote> [07:51:20] Is there anything like a pack of default templates I can import? For common stuff like {{stub}} and merge etc. [07:51:26] yeah, but lets say I have a whole, multiline section that I want indented? [07:51:32] harej: no it doesn't [07:51:41] what does colon do then?
[07:51:44] it turns into <dl><dd>Indented comment</dd></dl> [07:51:50] oh [07:52:19] because : is used with ; to create a definition list. it just indents by accident, so people use it for that as well [07:52:28] ; my term : a definition of my term [07:53:04] I think ! does the indentation [07:53:18] Help please, how to enable the template:extension , aint working? i mean the summary at the left of each page, i want to setup it, but it doesnt show anything [07:53:30] Drazha: : does indentation, just not with <blockquote>
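
A few lines below, flyingparchment points the "create your own formatter" question at adding new <tags> and new {{words}}; as a concrete illustration of the tag half, a minimal 1.12-era extension sketch (the tag name <myquote> and its output are invented for the example):

    $wgExtensionFunctions[] = 'wfMyQuoteSetup';

    function wfMyQuoteSetup() {
        global $wgParser;
        # Everything between <myquote> and </myquote> is handed to the callback.
        $wgParser->setHook( 'myquote', 'wfMyQuoteRender' );
    }

    function wfMyQuoteRender( $input, $args, $parser ) {
        # The returned HTML is inserted into the page output; the input is
        # escaped here, so wikitext inside the tag is shown literally.
        return '<blockquote>' . htmlspecialchars( $input ) . '</blockquote>';
    }
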
[07:53:43] flyingparchment: ok... [07:53:52] anyway, back to the original question...? [07:55:01] there isn't [07:55:10] you can add new and new {{words}} though [07:55:30] kewl... where can I read about that/ [07:55:55] look up mediawiki extensions on the website [07:56:23] morning [07:59:06] ok, another one, how do i add an image to a category [07:59:16] or viceversa, how do i assign a category to an image? [07:59:52] 03brion * r37936 10/trunk/phase3/ (3 files in 3 dirs): [07:59:52] Revert r37924 "(bug 14883) Create hook AlternateSkinPreferences to alternate skin section in user preferences" [07:59:52] This hook seems pretty unclear to me. What's it meant to accomplish? It passes no data to be customized, and has no code interface for building a sustainable, forward-compatible extension from it. [08:00:38] Heh, last 30d of commits (well, to /trunk) :) http://toolserver.org/~sql/svn-stats/wm-1month/ [08:01:24] 900k lines of code? omfg [08:01:35] when we hit a million, it's going to implode and greate asingularity [08:02:10] Duesentrieb: hehe It's probably counting *every* file in /trunk as being full of code [08:02:33] well, there are not many thatare not [08:02:34] pah, i can do that: https://fisheye.toolserver.org/chart/wikimedia/ [08:02:39] 03(mod) Create hook (AlternateSkinPreferences) to alternate skin section in user preferences. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14883 +comment (10brion) [08:02:40] readme, docs, changelog... [08:03:14] flyingparchment: that's cool, I didn't know that existed :) [08:04:00] ok, phase3 looks more reasonable [08:04:01] https://fisheye.toolserver.org/chart/wikimedia/trunk/phase3 [08:04:07] 600k lines ist still a friggin lot [08:05:35] only 150k of which are in /includes, about 450k for localization! wow! [08:05:52] heh [08:05:58] Drazha: Edit the image just like what you'd do to articles. [08:06:06] Duesentrieb: https://fisheye.toolserver.org/chart/~charttype=Pie,stacktype=Author/wikimedia/trunk/phase3 [08:06:09] *SQLDb debates on slowly importing the last year [08:06:11] stupid i18n ;) [08:06:37] 03nikerabbit * r37937 10/trunk/extensions/WikimediaMessages/WikimediaMessages.i18n.php: * Move a comment upper [08:08:24] up* [08:09:42] flyingparchment: that chart is no fair, siebrand and nikerabbit sync betawiki :) [08:09:52] actually... it would be nice to use a dedicated account for that [08:09:56] how about it, Nikerabbit? [08:10:08] wtf!?! [08:10:28] why are there 3 commits that change messages while we are in a messages freeze? [08:11:07] siebrand: because people don't know about it? [08:11:51] Duesentrieb: why are you asking me? [08:12:45] ignorance is no excuse. [08:12:55] Nikerabbit: because you are, after siebrand, the main committer :) [08:13:00] siebrand: but an explanation. [08:13:23] anyway - how about using a dedicated account for synking translations from translatewiki? [08:13:26] siebrand: Oh, I'm probably to blame for two of those (to antispoof) :/ Should I revert my changes for the time being? [08:13:28] just for neatness, you know [08:13:57] SQLDb: if you could hold those off until after 1.13 release, that would be great. [08:14:12] Duesentrieb: I haven't been for a long time :o [08:14:25] siebrand: yeah, no problem, sorry about that [08:14:50] SQLDb: you haven't committed in a while, or changed your committer name? [08:15:01] siebrand: I'm fairly new [08:15:58] SQLDb: ah, great. 
The more the merrier :) [08:16:00] 03brion * r37938 10/trunk/extensions/AntiSpoof/ (AntiSpoof.i18n.php AntiSpoof.php SpoofUser.php): [08:16:00] Revert r37932, r37933 -- this is just downright bizarre. o_O [08:16:00] Format the list as a *list*. There's no need for multiple messages with different numbers of parameters! [08:16:00] Also use the {{PLURAL:}} parser function where necessary, rather than creating separate messages for "account" and "accounts" etc. [08:16:22] oh, beat me to it [08:16:30] *siebrand nods at SQLDb. [08:16:37] (and, it appears I misread the last revert) [08:17:13] TimStarling_: ping? [08:17:30] meh, time for sleep :/ [08:20:10] hello? [08:20:35] i'd like some help regarding mediawiki [08:22:00] shoot [08:22:42] ...? what [08:23:41] theandric: what do you need help with? [08:24:52] this may be a newbie question, but is it possible to use the wiki on my single computer instead of uploading it to the web, or in a way that I can just access it to start off with? [08:25:23] how do i register myself with some registration nickname thing that this channel requires? [08:25:36] 03siebrand * r37939 10/trunk/phase3/languages/messages/MessagesEn.php: Just removing a few justs. [08:26:13] theandric: it's not required, freenode is lying to you. ignore it [08:26:16] theandric: u talking about windows, or linux os? [08:26:29] windows xp prof sp3 [08:26:35] you can, but it requires that you install a webserver with php support, as well as a supported database system [08:26:47] for example apache2 with php, and mysql [08:26:56] all exist in windows versions [08:27:11] i have dl'ed all the stuff via the media wiki website [08:27:15] the four things [08:27:22] what four? [08:27:56] mysql, php, mediawiki 1.12 or whatever and apache [08:28:02] ok [08:28:22] then you have to install those, make sure that it works, and then you can install mw 1.12 on the computer [08:28:27] and i probably shouldn't have, but i have also installed mysql via the installing package [08:28:43] shouldn't be a problem [08:28:45] it works [08:29:12] but, when installing apache, it has that setup screen which says domain name, server name etc... [08:29:25] doesn't matter [08:29:34] what do i enter there [08:29:45] you don't have to fill out those when you're only using it on a local computer [08:30:04] leave them blank [08:30:05] just leave them blank, use default values if there are, or use example.com or similar [08:30:07] ? [08:30:35] so it doesn't matter what apache is set up as, really, as long as it's installed? [08:30:58] i've got an icon in the taskbar saying running all apache services [08:31:08] ok [08:31:25] try to open a web browser, and type localhost in the address bar [08:31:36] see if you get a page or not [08:31:54] just hold a tick though, i haven't installed php yet because i don't know how [08:31:59] does that matter [08:32:11] nope [08:32:14] ok [08:32:27] apache should display a sample html page to let you know that it works regardless of php [08:32:49] it says it works [08:32:58] via firefox [08:33:18] Duesentrieb: What would be the point of a dedicated account; it would just reduce productivity (since siebrand\Nikerabbit would have to have multiple checked out working copies) and what for, making a chart look more accurate? 
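
Brion's revert of r37932/r37933 above relies on the {{PLURAL:}} parser function instead of one message per count; a hypothetical i18n entry of that shape (the key and wording are invented, not the actual AntiSpoof messages):

    $messages['en'] = array(
        # $1 = number of conflicting accounts, $2 = the formatted list
        'antispoof-conflict-example' =>
            'The name is too similar to the following existing {{PLURAL:$1|account|accounts}}: $2',
    );
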
[08:33:41] ok [08:33:45] then you need to install php [08:34:29] i dl'ed php-5.2.6.tar [08:34:34] if i remember correctly, the readme file for php/win is quite good, and tells you what to do [08:34:42] oh, that might be the wrong file [08:34:50] and unzipped it via winrar, now have a folder with random doc's [08:35:01] try this: http://www.php.net/downloads.php [08:35:20] which one [08:35:21] tar files are usually for non-windows [08:35:23] MinuteElectron: correctly attributing contributions. technically, when committing stuff from translatewiki under your own name, your are infringing on the contributor's copyright. using a dedicated account would make it more clear where to at least find the real contributors. [08:35:31] makes sense now [08:35:45] 03(NEW) Create hook (NotifyOnPageChangeComplete) to collect information of page's changes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14893 15enhancement; normal; MediaWiki: Email; (moli) [08:35:48] try the first one listed under windows binaries [08:36:06] download, extract, and you should have a readme file with all installation steps [08:36:23] give me 3 mins to d/l [08:36:37] not much to do, you just have to tell apache to serve files ending in .php, and to load and activate the php module [08:36:39] sure [08:37:42] what mediawiki version would you suggest is best for me [08:38:06] Duesentrieb: That is incorrect, all translators are correctly attributed above the translation themseleves; and commiting under a dedicated username would also suggest that that organisation owns copyright - which is inaccurate aswell. [08:38:18] mau [08:38:24] and where to download from, i've got... mediawiki-1.12.0.tar [08:38:26] Duesentrieb: And the commit message itself always includes a reference to Betawiki. [08:38:43] unless they forget to include it, heh [08:39:54] theandric: that one is ok [08:40:04] mw is mw no matter the os [08:40:08] 03siebrand * r37940 10/trunk/phase3/maintenance/language/messages.inc: [08:40:08] Fix for r37928. [08:40:08] So if you *do* decide to add messages during a freeze, then at least take the time to do it right. [08:40:17] theandric: and 1.12 is the best for you [08:40:30] in my case, is it possible to auto download from the SVN or whatever? [08:40:42] you can, but i wouldn't reccomend it [08:40:44] after i have it running? [08:40:49] ok [08:40:53] 1.12 is stable, while svn might contain various bugs [08:41:20] MinuteElectron: yes, you are right, attributing authors inline is good, and probably sufficient. and no, using a dedicated account is not *important*. I just think it would be nice. [08:42:18] the php file is a zip on my desktop, unzip it? [08:44:15] You may actually want to try just using something simple like XAMPP [08:44:15] theandric: depends on how you access your web space. that's where you want to have the files in the end, after all [08:44:49] how do i forward all requests to xyz.com to www.xyz.com in mediawiki i believe its to modify in .htaccess file and add some code! heard it help in seo! [08:45:00] oh... you want to set up a webserver on your desktop? use xampp [08:45:06] ABUT THINK ABOUT WHY YOU WANT TO DO THAT [08:45:10] oops, sorry [08:45:23] so, is it possible to activate the wiki so i can use it via firefox in some kind of offline mode? 
[08:45:27] i can't think of a good reason, unless you are developing web applications and need to test them locally [08:45:56] 03siebrand * r37941 10/trunk/phase3/languages/messages/ (23 files): Localisation updates for core messages from Betawiki (2008-07-23 10:30 CEST) [08:46:15] theandric: not offline, but you install it onto your computer [08:46:21] theandric: "offline" as in "on your own computer" yes. but then it's ONLY on your computer. and mediawiki is... not designed to be used as a personal scrap book [08:46:24] Duesentrieb: you have to talk to the ones who actually do the comitting about how hard it would be to alter the process that way :o [08:46:40] theandric: you are trying to fly a jubo jet to the grocery store... [08:47:12] 03(mod) Create hook (NotifyOnPageChangeComplete) to collect information of page's changes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14893 +comment (10moli) [08:47:14] Duesentrieb: if I understood him right, he wanted to install it locally to set it up before making public on a site [08:47:18] Nikerabbit: i thought you did that too, sorry. i just saw lots of i18n commits from you - but they might have been old. i was just looking at river'S pie chart [08:47:19] yes, that's exactly it [08:47:40] i don't see a good reason for that [08:47:47] although you can just set read permissions to users only, and disable registering if you want to [08:47:54] it just adds the pain of setting up a XAMP enviropnment [08:47:57] and then the pain of migrating [08:48:01] all a waste of time [08:48:02] TimStarling_: Your commit of the Werdna feature was broken, you forgot to add the DoubleRedirectsJob.php file [08:48:05] so where and what do I do with the php [08:48:15] RoanKattouw: White_Cat feature ;) [08:48:19] theandric: unzip and read the readme ;) [08:48:23] Oh whatever [08:48:28] theandric: you put them on your web space. into an extra directory under your web root. [08:48:33] 03tstarling * r37942 10/trunk/phase3/includes/DoubleRedirectJob.php: Missing file for r37928 [08:48:34] RoanKattouw: =P [08:48:34] The feature we've all been waiting for, anyway [08:48:36] 14(INVALID) Create hook (NotifyOnPageChangeComplete) to collect information of page's changes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14893 (10moli) [08:48:41] Wow, fast response xD [08:49:18] If this passes Brion's code slush and makes it into 1.13, a lot of end users would be very happy, I think [08:49:24] 03(NEW) Create hook (NotifyOnPageChangeComplete) to collect information of page's changes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14894 15enhancement; normal; MediaWiki: Email; (moli) [08:49:42] so if I want everything on my local computer, how do I do it [08:50:07] how do i forward all requests to xyz.com to www.xyz.com in mediawiki i believe its to modify in .htaccess file and add some code! heard it help in seo! [08:50:13] theandric: scrap that php download. get XAMPP. install it. read the instructions. set it up. [08:50:18] RoanKattouw: actually I was responding to wikitech-l [08:50:28] sorry for repeat jst flt it mite have missed admist d ongoing conv! [08:50:29] I think I saw it there before you started typing here [08:50:32] TimStarling_: Ah. Well that was me too, and only 2 mins earlier :P [08:50:34] theandric: then you have a web root on your computer. create a subdirectory. put mediawiki there. 
direct your browser to that location [08:50:48] so guys [08:50:51] new p2p wikipedia out there [08:51:21] theandric: if you want that, you will need to learn how to set up and use a webserver with php support, and a database server. [08:51:42] theandric: but again, i see no good reason whatsoever to set it up on your local comuter. it's just a waste of time. [08:51:48] 03siebrand * r37943 10/trunk/extensions/ (85 files in 80 dirs): Localisation updates for extensions messages from Betawiki (2008-07-23 10:30 CEST) [08:51:58] domas: wtf is p2p wikipedia? [08:52:05] mediawiki-l [08:52:10] anyway, scalaris is interesting for some stuff [08:53:08] Duesentrieb: I used to run MediaWiki on my computer, it's useful if you want to development\want to create something on your own\need a computerized notepad - especially if you don't have a VPS or web hosting. [08:53:43] Duesentrieb, i don't run a site or anything though [08:54:01] got it :D [08:54:43] MinuteElectron: if you develop, sure. if you just need a notepad, use something more lightweight. settign up a wiki locally and then transfering it to an online webspace is pointless, and if you have no clue about setting up a webserver or database its just a lot of needless pain. [08:55:07] theandric: err. but you said you want to set it up locally to make it available online later. [08:55:35] theandric: if you want it *only* for local use, i really recommend something more lightweight than mediawiki. again, mediawiki is a jumbo jet. it's designed for millions of users. [08:55:58] google for "personal wiki engine" or some such [08:57:35] it's ok, I'd just like to know how to do it [08:58:33] get xampp, read docs, set it up [08:58:45] ok, i'll try that [08:59:00] put mediawiki into your web dir, go there with your browser, follow instructions [08:59:21] where's my web directory [08:59:52] wherever you put it [08:59:53] That would depend on what server setup you're using [09:00:10] .. [09:00:14] theandric: as i said: read instructions, set up xampp (that is, apache, php, mysql, etc) [09:00:30] if youwant to know, you will need to learn... [09:00:43] and you will need to learn how these things actually work and interact. [09:00:58] will xampp work if I have already installed apache, mysql, will i need to download it again via xampp [09:01:17] if you already have a working setup, that's fine. [09:01:27] xampp just makes it much easier to set it up and manage it [09:01:36] less need to fiddle with config files, get versions right ,etc [09:01:39] xampp or xampp lite [09:01:46] lite probably does it [09:01:51] read what it contains [09:01:57] what you need is apache, php and mysql [09:03:54] i'm on the xampp dl screen, do i want the zip, the exe or the installer [09:05:03] ok, installer [09:30:47] can anybody help me? [09:30:53] 03(mod) Global deleted image review for Commons admins - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14801 +comment (10banyantreewp) [09:30:55] i've just installed xampp [09:31:07] theandric: You wanted to set up a wiki, right? 
[09:31:12] yep [09:31:46] Alright, so you installed Apache, MySQL and PHP [09:31:54] 03river * r37944 10/trunk/switchboard/ (fcgi_record_writer.h fcgi_server_connection.h fcgi_socket.h): (log message trimmed) [09:31:54] fcgi_record_writer needs to track alive_ to avoid race between socket close/cancel and async function call causing assert failure [09:31:54] switchboard: fcgi_record_writer.h:167: void [09:31:54] fcgi_record_writer::write_done(asio::error_code, size_t, [09:31:55] boost::function >) [with Socket [09:31:59] = asio::basic_stream_socket asio::stream_socket_service >]: Assertion `socket_.native() != [09:32:23] flyingparchment: what the heck is that? :) [09:32:32] domas: CIA fucking up the commit message formatting [09:32:38] hehe [09:32:39] switchboard: fcgi_record_writer.h:167: void fcgi_record_writer::write_done(asio::error_code, size_t, boost::function >) [with Socket = asio::basic_stream_socket >]: Assertion `socket_.native() != -1' failed. [09:32:49] i've installed mysql and apache, how can i test if php is installed? [09:32:57] flyingparchment: what is switchboard? [09:33:33] domas: it listens to fastcgi requests from a webserver and creates php processes on demand to handle requests. it caches old processes to avoid excessive fork()ing, and setuid()s to the owner of the php file (like suexec) [09:33:54] flyingparchment: ah. [09:34:00] flyingparchment: what... about... apc and stuff? [09:34:05] sysv shared memory? [09:34:37] domas: that'd work the same way it currently does with CGI PHP... not actually sure off hand. [09:34:42] i assume it'd use a temporary file [09:35:18] flyingparchment: it wouldn't, then [09:37:27] domas: hmm.. each fastcgi process executes 500 requests before it exits, so i think it should be able to do some caching [09:37:58] hmm? [09:38:13] apc works fine for me with fcgi [09:38:19] theandric: You verified that Apache is installed then? [09:38:27] Nikerabbit: does it do bytecode caching? [09:38:44] flyingparchment: both [09:38:51] good [09:38:53] yes [09:39:02] theandric: To verify that PHP is installed, download and try to install MediaWiki. That'll fail miserably if PHP isn't installed [09:39:09] flyingparchment: http://translatewiki.net/wiki/Special:ViewAPC [09:40:19] runtime 1101 hours :) [09:41:03] how do I install mediawiki? [09:41:14] i've downloaded the 1.12 [09:41:58] !install | theandric [09:41:58] --mwbot-- theandric: Installing MediaWiki takes between 10 and 30 minutes, and involves uploading/copying files and running the installer script to configure the software. Full instructions can be found in the INSTALL file supplied in the distribution archive. An installation manual can also be found at . See also: !download [09:41:59] flyingparchment: that would be lots of cache [09:42:02] i'll be back tomorrow [09:42:05] flyingparchment: 100MB/instance [09:42:11] domas: maybe for mediawiki, not so much for toolserver [09:42:17] ah [09:42:19] ye [09:43:03] i'll be back tomorrw [09:43:08] thanks for your time [09:44:25] 03(mod) Special:Import should have a hook - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11537 (10rv1971) [09:52:52] 03rotem * r37945 10/trunk/phase3/languages/messages/MessagesHe.php: Localization update for he. 
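
For the "how can i test if php is installed" question above: drop a one-line file into the web root and load it in a browser (the XAMPP path is an assumption, adjust to wherever htdocs actually lives):

    <?php
    // Save as C:\xampp\htdocs\test.php and open http://localhost/test.php
    // A long configuration report means Apache is handing .php files to PHP;
    // seeing the raw source (or a download prompt) means it is not.
    phpinfo();
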
[10:07:24] 03river * r37946 10/trunk/switchboard/ (async_read_fcgi_record.h fcgi_socket.h): [10:07:24] - async_read_fcgi_record: remove cout debugging [10:07:24] - fcgi_socket: writer_ might not exist at this point, assertion failure ensues [10:07:24] switchboard: /opt/boost/include/boost-1_35/boost/shared_ptr.hpp:375: T* [10:07:24] boost::shared_ptr::operator->() const [with T = [10:07:24] fcgi_record_writer asio::stream_socket_service > >]: Assertion `px != 0' failed. [10:32:21] 03rotem * r37947 10/trunk/phase3/languages/messages/MessagesHe.php: Localization fix. [10:35:32] 03(NEW) Other Table Layout - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14895 15enhancement; normal; MediaWiki extensions: Semantic MediaWiki; (dasch_87) [10:48:52] hello. [10:49:02] hi [10:53:25] hi Werdna [10:53:30] TimStarling: I have asked brion, but I think you've done more DB design than he has (as far as I remember). If I wanted to store a list of "flags" for each recent change (i.e. edit made through tor, edit considered suspicious by heuristic X, edit should be given extra scrutiny), added by software before the edit was submitted, what would be the best DB layout? [10:53:53] Werdna: add new row? [10:54:09] VasilievVV: you mean a column? [10:54:16] Yeah, sorry [10:54:32] There'd be multiple flags per recentchange [10:54:37] which would make filtering on them difficult. [10:55:01] No, add row per each flag [10:55:22] add a row to what? [10:55:43] an edit<->flag table [10:55:58] Ahem, I mixed up rows and columns again [10:56:05] so a separate table? [10:56:10] hmm [10:56:17] if you have an unlimited number of flags, then you need to [10:56:31] You can either add new column for each flag, and that would be the best [10:56:32] a tinyblob in the recentchanges table would probably be enough [10:56:40] just do a field with comma separated values.. like in e107 CMS. th3 best database design! [10:56:40] or a bitfield [10:56:47] Or do like it's done with page_props [10:56:55] VasilievVV: that sounds terrible. a schema change for every new flag? [10:56:57] a separate table wouldn't be much faster [10:57:00] No bitfields. I think they are expensive to filter [10:57:06] TimStarling: but how would filtering go with that? [10:57:16] Werdna: that's why it's an idealistic variant [10:57:38] well, you can't have filtering with that [10:57:45] right. We'd want filtering [10:57:55] Special:Recentchanges?filterflag=tor [10:57:59] gives tor edits, for example. [10:58:03] you'd need an index on each flag [10:58:03] Werdna: create rc_props table [10:58:12] yeah, a separate table [10:58:38] is the extra join going to be nasty? [10:58:48] yes [10:58:54] but less nasty than any other solution [10:59:03] varchar(255) please [10:59:06] you can order by the primary key in the new table [10:59:14] brion suggested caching the values themselves in recentchanges [10:59:18] or whatever [10:59:29] so when we display, they can be shown without a join, and we use rc_props only for filtering. [10:59:52] rc_props would likely include keys to logging/revision tables, so we can display the data in, for example, contributions. [11:00:22] ugh, so much denormalizing :( [11:00:34] you're talking about a pretty large, hot table [11:00:57] TimStarling: The idea is that the flags are set ONLY when the edit itself is made. [11:01:23] so like 5 times per second? [11:01:28] I think we need ORM [11:01:37] and fetch objects one by one [11:01:40] to the application [11:01:42] TimStarling: Well, I wouldn't imagine flags would be set very often. 
[11:01:49] I think it's going to have hardware costs no matter which way you do it [11:02:02] TimStarling: If an edit matched an abuse filter, if an edit was made through tor, all that kind of thing. [11:02:06] what will be stored in the flags ? [11:02:15] aah ok [11:02:30] see, storing is easy [11:02:33] now what are you going to query [11:02:35] is another issue [11:02:49] getting 'show me all tor edits' will be very expensive whatever you do [11:03:07] domas: well, if we store both in recentchanges and in rc_props table, it won't be too awful, will it? [11:03:11] well, like I say, you can have a (tor_flag,id) key [11:03:19] TimStarling: but still the data is quite sparse [11:03:34] it's not sparse in that index [11:03:37] TimStarling: so you end up either having complete dataset in memory, or doing lots of seeks [11:03:40] it's sparse when you load the other table, yes [11:03:42] yeah, but it still has to lookup all rows [11:03:49] from the table itself [11:04:19] what about adding more hardware? [11:04:20] mmm, the table would need pruning [11:04:25] and so a timestamp index too [11:04:28] Werdna: for Tor, you may want to introduce new column to RC. For AbuseFilter, you may join it with abuse log [11:04:33] wee, hardware [11:04:37] TimStarling: that solves it, yes, especially if we set it as querygroup [11:04:43] VasilievVV: after a more general solution. [11:04:52] or send to ariel, which essentially has recentchanges in memory :) [11:04:57] AaronSchulz: Actually, I'd like the data to stay around. [11:05:09] if we can store the whole table in memory, then we can do lots of fun stuff with it [11:05:35] Werdna: add it to cu_changes [11:05:42] thats just 2G [11:05:44] with 1 month of data [11:05:47] then it should be tor_flag,timestamp,id [11:05:48] or is it 2 months of data :) [11:05:52] You don't need general solutions [11:05:54] cu is 3 months [11:06:15] AaronSchulz: RC is one iirc [11:06:16] on the other hand [11:06:39] having 'tor flag' index for whole recentchanges is inefficient :( [11:06:44] see, plain RC is completely useless on the english wikipedia [11:07:11] cause you end up maintaining index entries for all the remaining data [11:07:11] we need things like efficient breakdown of RC into categories [11:07:17] VasilievVV: yes, I know, and CU is 3 [11:07:20] anyone wants to make mysql support partial indexing? :) [11:07:34] *AaronSchulz points to domas [11:07:46] Werdna: why do someone need to see "users editing from Tor"? [11:07:55] damn, would be so nice to have 'don't index NULLs' [11:08:09] VasilievVV: It's part of a plan to allow tor. [11:08:27] domas: I'm still proud of that RC union ;) [11:08:30] I think that if users have the fact that they are editing through tor exposed to the world, they will go back to proxies and such if they intend to do bad things. [11:08:35] you could just set the username to "tor" [11:08:43] true [11:08:44] heh [11:08:47] but thats just an example [11:08:48] Werdna: they have ipblock-exempt [11:09:06] TimStarling: logged-in editing? [11:09:11] http://en.wikipedia.org/wiki/Special:Contributions/Tor [11:09:18] yeah, not for logged-in editing obviously [11:09:36] VasilievVV: ipblock-exempt is not a catch-all solution to tor. [11:10:11] VasilievVV: has an exceptionally high barrier to entry, and requires somebody to "explain themselves" to a group of editors who are more than likely to say "Ah, I don't believe you/you don't really need tor, edit normally". [11:10:39] Tor is blocked because it causes problems. 
If we can come up with a better way to solve the problems it causes, then we can unblock it. [11:10:51] For example abusefilter (to a degree), and this work would make it far better. [11:10:51] 03(NEW) Hidden SVG layers should not be rendered - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14896 minor; normal; Wikimedia: General/Unknown; (marti) [11:11:29] is svn head broken? [11:11:49] """ [23-Jul-2008 12:11:08] PHP Fatal error: Call to a member function selectRow() on a non-object in /home/www/html/w/includes/User.php on line 826 """ [11:12:17] *Werdna looks [11:12:18] features don't necessarily have to have negligible performance impact to be accepted [11:12:41] *AaronSchulz hopes a Tor filter would be negligable [11:12:42] Of course, but minimum performance impact is important. [11:12:42] they have to be efficient, but beyond that, it's a trade-off between cost and value [11:12:55] AaronSchulz: it's already done. [11:13:18] cirwin: hmm [11:13:20] $dbr = wfGetDB( DB_MASTER ); [11:13:21] $s = $dbr->selectRow( 'user', '*', array( 'user_id' => $this->mId ), __METHOD__ ); [11:13:23] Werdna: do you get what domas was saying about the join being sparse? [11:13:28] I don't see an RC tor filter? [11:14:20] TimStarling: yes. He's saying that most of the time, the JOIN will look for tags, and find none. That's nasty, performance-wise, because it has to do a binary search on the index, which takes longer on average if it doesn't exist than if it does. [11:14:24] I think that's right. [11:14:29] AaronSchulz: TorBlock extension. [11:14:37] oh, RC filter. [11:14:39] no, that's coming. [11:14:42] *AaronSchulz sighs [11:14:55] not sure you're quite there... [11:15:25] an index groups rows with a similar index value together [11:15:48] so if you have an index on (rcp_tor,rcp_id), then the most recent tor edits are all together [11:16:02] so that part of the query is highly localised [11:16:08] Werdna: so you can create table with (rc_id, tagname, tagvalue(1 or 0)) and JOIN it [11:16:40] but the bulk data in the recentchanges table is going to be approximately ordered by the time it was inserted [11:16:57] VasilievVV: extra DB query on save, searching more difficult because there are more rows, waste of space, etc. [11:17:21] how are /newbies filtered so efficiently? [11:17:22] so loading the rows from the recentchanges table is going to be seeky, it'll hit lots of places on the disk [11:17:27] TimStarling: oh, so it's more about I/O seeks? [11:17:31] Werdna: each-flag-has-its-column is the only performance-wise mode [11:17:31] yes [11:17:46] if you have enough RAM to hold the whole table in memory, there's no problem [11:17:47] (in Contributions that is, own table?) [11:17:49] Splarka: We find the uid for the top 1% of users. [11:17:56] bah, too easy then [11:18:04] and filter on rev_user_id>that_val [11:18:32] domas says it's around 2GB at the moment for enwiki [11:18:43] thats without indexes [11:18:54] and there's lots of indexes [11:19:05] got to go grocery shopping, bbl [11:19:10] hehe, if we used SSD disks.. [11:19:11] ;-) [11:19:12] TimStarling: at 9:19pm? [11:19:25] yeah, I think they're open until 10 [11:19:36] have fun. [11:21:15] unless it is pruned, the timestamp needs to be in that tag index [11:21:25] otherwise, the filesort would just get larger [11:21:33] AaronSchulz: rp_rcid is in the index. [11:21:34] Werdna: sorry, my mistake - mysql had died [11:21:37] I have 24/7 shop nearby [11:21:40] big supermarket [11:21:45] I love shopping at 3am [11:21:45] domas: you weird europeans. 
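
A sketch of the separate-table ("rc_props") layout being debated above, written against MediaWiki's database abstraction; the table and column names are placeholders rather than an agreed schema, and the index, seek and pruning concerns raised in the discussion still apply:

    # At save time: record whichever flags applied to this change.
    $dbw = wfGetDB( DB_MASTER );
    foreach ( $flags as $flag ) {   // e.g. array( 'tor', 'abusefilter' )
        $dbw->insert( 'rc_props',
            array( 'rcp_rc_id' => $rcId, 'rcp_flag' => $flag ),
            __METHOD__ );
    }

    # Filtering RecentChanges on one flag: join on rc_id and walk an index
    # like (rcp_flag, rcp_rc_id) newest-first.
    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select(
        array( 'recentchanges', 'rc_props' ),
        'recentchanges.*',
        array( 'rcp_flag' => 'tor' ),
        __METHOD__,
        array( 'ORDER BY' => 'rcp_rc_id DESC', 'LIMIT' => 50 ),
        array( 'rc_props' => array( 'INNER JOIN', 'rcp_rc_id = rc_id' ) )
    );
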
[11:21:53] Werdna: thats unusual for most of europe though [11:22:12] domas: you're still weirdos. [11:22:30] domas: I have a Giant Eagle nearby [11:22:33] open 24/7 [11:22:51] Werdna: so you'd assume rc_id corresponds to rc_timestamp? I guess it should [11:23:17] AaronSchulz: indeed [11:23:19] *AaronSchulz has built up mistrust due to rev_id/rev_timestamp [11:23:27] Werdna: I'd still prune it [11:23:37] since the old rows would link to dead RC rows [11:23:41] AaronSchulz: why? then you lose data? [11:23:44] and therefore just take up space [11:23:46] AaronSchulz: It would also link to rev_id. [11:24:02] ok, that's better then [11:24:21] but then the table shouldn't be named "RC" something ;) [11:24:52] domas: I went there late last time. No traffic pretty much \o/ [11:25:23] AaronSchulz: I meet lots of friends after midnight :) [11:26:38] well in europe, I'm sure it's a party down there that late at the store [11:27:34] in winter I don't really care much [11:27:36] is it day or night [11:27:40] it is dark 20 hours a day anyway ;-D [11:31:24] domas: it's winter here. very annoying. [11:31:27] it's freezing~! [11:31:32] like 10 degrees [11:31:34] *Werdna whines [11:31:48] -10 degrees of Celsuim? [11:31:50] *AaronSchulz has his a/c on [11:31:54] ahhh [11:32:27] Werdna: thats like summer in san francisco! [11:32:33] Werdna: +10 or -10? [11:33:58] domas: +10 [11:34:00] celsius [11:34:15] well, I know there's no -10 in aus [11:34:23] domas: don't tell me that! I'm thinking of being in san francisco for a month in jan! [11:34:48] Werdna: Jan is +20C [11:34:51] Werdna: Jun is +15C [11:34:53] :) [11:35:07] Werdna: when it is hot in valley, it pulls cool air from ocean to san francisco :) [11:35:15] domas: hehe, sydney jan is +40C, sydney jun is +10C [11:35:29] +40C in sydney? thought it is much milder [11:35:40] a few days each jan anyway :P [11:35:44] *Werdna likes the +40C days [11:35:46] most often 30C [11:35:52] SF in january is fine really [11:35:57] SF in july is not ;-) [11:36:03] death valley, CA.. 56C [11:36:10] I'll be in England for July [11:36:12] that will be bad. [11:36:27] Truckee, CA, ... 10 meters of snow (not snowfall, not drift pack, snow on the ground) [11:37:03] the earthquakes and rain cause mudslides that put out the fires every fall, though <3 [11:37:07] Werdna: I more like -15C :) [11:37:24] 03aaron * r37948 10/trunk/extensions/FlaggedRevs/language/RatingHistory.i18n.php: typo [11:42:03] what is smorgasbord? [11:42:56] http://en.wiktionary.org/wiki/smorgasbord "a buffet" [11:43:20] but I think it can more generally mean, something with a selection of choices [11:43:51] 03aaron * r37949 10/trunk/extensions/FlaggedRevs/ (2 files in 2 dirs): [11:43:51] * Remove debug line [11:43:51] * Improve UI [11:44:56] Hi... how can i set an user as steward? :) [11:51:29] Alan042: maybe with [[Special:UserRights]], otherwise in the database [11:54:19] Or use createAndPromote.php [11:55:17] I didn't find that file, though I glanced through the maintenance scripts [11:55:26] it should have a name with user or rights in it :) [11:55:55] It create user and promotes it [11:57:38] where is that file VasilievVV ? [11:57:53] in the maintenance directory [11:58:22] so, my bosses want color in the wiki because they are "already bored by the seeming "black and white only"" wiki...I don't even know how to respond to this...suggestions? [11:58:24] so is runnable from the bash, not from browser right? 
[11:58:30] Yes [11:58:41] php createAndPromote.php username password [11:58:45] tomow: play around with [[Mediawiki:monobook.css]] [11:59:44] i try it now.. [12:00:48] how can i change the icons or rather make the icons bigger that appear when u edit a page.. can i add my own tools too? [12:02:43] When I have xdebug installed but disable it completely my scripts are still 2 times slower, anybody else having this problem? [12:04:12] VasilievVV, it says : [12:04:15] wikidbco: Creating and promoting User:Marco...account exists. [12:04:23] Ah [12:04:31] Do you have bureaucrat rights? [12:04:51] how can i know that? [12:04:58] the user has 1 as id [12:05:05] Enter Special:Listusers [12:05:34] Marco 200e(Bureaucrat, Sysop) [12:05:45] You have it then [12:06:05] yes...but i want Marco as steward.. [12:06:20] i need to merge user accounts in all wikis [12:06:43] Do you have installed CentralAuth? [12:06:59] yes [12:07:02] heh [12:07:14] Do you even have configured it using $wgConf? [12:07:49] yes... but i had to change something about the code in the page of Centralauth, otherwise nothing work.. [12:09:09] Alan042: what version of MediaWiki do you use? [12:09:23] 1.12 [12:09:28] *^demon waves to VasilievVV [12:09:36] *VasilievVV waves to ^demon [12:13:18] the problem with the wgconf setting wrote on the mediawiki page is that sitefromDB return null if you don't edit the code...more there is a mistake genereting the link of the primary wiki .... by the way..my main point today is to give the opportunity to th admin to merge accounts without leaving this task to user only... [12:14:47] The parsing code seems to have changed significantly in v1.12, causing an extension bug which I'm having a hard time fixing. In summary... [12:16:02] ...a parser {{#function}} which formerly parsed any data it was passed ( e.g. {{#function:something|{{#anotherfunction}}}} ) now no longer does so. [12:16:31] 1.12 got a brand new preprocessor [12:16:43] however, parsing of parameters to parser functiosn should work as before [12:16:48] can't see why it wouldn't [12:16:58] have you checked how it works with other parser functions? [12:17:07] The way it worked before, the {{#anotherfunction}} had to be called using escapes for the {{}} -- \o and \c -- but now those are just being rendered with no further parsing. [12:17:11] One of the main problem with CentralAuth is that migrateuser0.php and migrateuser1.php scripts are wrong, the second one seach for a table "migrateduser" not even created by centralauth.sql.. [12:17:46] I should try it with other parser function, yes, good thought. [12:17:55] If they work, then I should see how their code is different. [12:18:06] If not, then I'll be back... [12:21:45] VasilievVV, any idea how to do it? :| [12:22:20] Alan042: INSERT INTO user_groups VALUES (0,'steward'); should do it [12:23:01] ok...and then how i promote the user? with the script of before' [12:23:36] cirwin: sry for the delay. i had to take a call. anyway, they want to randomly inject color into the middle of a wiki, not change the font color. [12:23:51] ?? [12:24:03] run sql.php and type "INSERT INTO user_groups VALUES (1,'steward');" [12:24:09] as in have a random square that changes colour randomly? [12:24:16] no. [12:24:19] as in text colors. [12:24:30] like randomly add red or blue font fact to an article. 
[12:24:45] *font face [12:25:01] you could write some javascript that selects a random node in the article and colours it [12:25:18] but I'm not sure I'm understanding you properly [12:25:21] no, they want to control it. [12:25:38] like say stuff here and have it render on the page as red. [12:25:38] they can use Blah [12:26:28] {{yellow|text here}} [12:26:33] {{blue|text here}} [12:26:35] etc. [12:26:40] and then {{{1}}} in a template [12:26:47] and {{funfact|}} calling it [12:27:17] ah. good call. I'll propose that idea to them. I think random colors in an article without meaning something is probably the stupidest thing. ever. [12:27:51] *cirwin still likes the actual "random" colours idea :) [12:28:06] then I think you should make an extension. :) [12:28:16] zebra striping by section...yay or nay? [12:28:19] http://en.wikipedia.org/wiki/Main_Page [12:28:21] heh [12:28:25] Did you know... [12:28:31] '... that when the Congolese village of Bogoro was attacked in 2003, survivors were imprisoned in a room filled with corpses, and women and girls were sexually enslaved?' [12:28:41] geez, I guess not [12:29:01] wow. [12:30:07] things haven't changed since the tribal wars of year dot... [12:30:50] 03river * r37950 10/trunk/switchboard/ (13 files): [12:30:50] - add some missing svn:keywords [12:30:50] - never include directly, instead use specific headers [12:44:20] i'll admit, I haven't looked at it yet, but how easy is it to skin mediawiki? [12:45:10] look inside skins/MonoBook.php [12:45:18] good morning (night?), brion [12:45:21] just have to create a file that does that [12:46:24] mornin [12:46:55] my sleep schedule's still all messed up from egypt, normally i'd kill a man rather than be awake before 6am :D [12:51:24] sup [12:51:29] yo yo yo [12:51:46] *brion writing up some notes from oscon before heading out for breakfast [13:04:01] hey brion [13:04:43] hi brion, i suppose you're the author of CentralAuth right? [13:04:53] Alan042: it has many authors [13:05:09] Alan042: among others, including Tim Starling, a bit from myself, a bit from Vasiliev, etc. [13:05:26] oh... :) [13:06:00] well...another question about this useful extension... how can i merge an user account being administrator? [13:06:16] Special:MergeAccount [13:06:30] Alan042: You mean you want it to be administrator everywhere? [13:07:15] mmm Werdna that is a task that i want reach but i'm asking another thing... i'm the administrator of english wiki [13:07:28] a new user, Frank, register... [13:07:34] i want to merge his account... [13:07:47] you can't do that unless you're a steward. [13:07:54] i'm a steward [13:08:01] Special:CentralAuth [13:08:08] i go in Special:CentralAuth.. [13:09:05] Werdna: nobody can merge account for user except the user himself [13:09:11] i insert the name Frank but it says No unified account for this username. [13:09:17] ah, then you can't do it. [13:09:35] oh VasilievVV... so i can't? [13:09:53] Alan042: only Frank may merge his account [13:10:10] currently that is the sad case, yes [13:10:17] ok :) [13:10:21] you could do it manually in the database perhaps though :) [13:10:30] or via code into eval.php [13:10:54] another question about the merge process... [13:12:13] x.x.x.x/mediawiki-1.12.0c 20:39, 22 July 2008 login [13:12:24] x.x.x.x/mediawiki-1.12.0c 18:04, 22 July 2008 primary [13:12:32] x.x.x.x/mediawiki-1.12.0c 20:48, 22 July 2008 login [13:12:44] i've three wikis... common, italian, spanish.. [13:13:07] when i go in the merge page of an user i see those 3 links [13:13:13] mmhmm? 
[13:13:33] Alan042: it shows you all accounts of the user [13:13:41] i put the x manually now for hide the ip, but the directory in always the same... [13:14:19] why is not diffent? it must be mediawiki-1.12.0it for the first one, and mediawiki-1.12.0sp, for the third one [13:15:06] seem that it can't find the right directory from the DB name... [13:16:45] Alan042: you're using symlinks? [13:16:48] VasilievVV, yes, those are all the accounts, "primary" i suppose shows the root wiki where the user born, and "login" where the user can login...but i don't understand why the links are all the same.. [13:17:00] no Werdna.. [13:17:22] i hope there's not something wrong in my setting of wgconf [13:17:36] It is [13:19:07] ...oh really?..some hints where search the mistakes? [13:20:55] for those interested -- http://leuksman.com/log/2008/07/23/oscon-reports-open-mobile-exchange/ [13:21:13] " [13:21:13] It's unfortunate that a modern parser generator for a supposedly fast [13:21:13] language like Java can't match hand-optimised PHP for speed. It's not like [13:21:13] we've set a high bar here. [13:21:19] oops, damn line breaks [13:21:23] anyway... hehe :) [13:21:57] :D [13:22:14] tim hit it right on the head again [13:22:55] VasilievVV, can you give a look here http://rafb.net/p/qAGBz870.html and tell me please what could be wrong? or brion? [13:23:52] Well, CA doesn't support one-domain applications [13:24:28] does it want more subdomain? [13:24:35] VasilievVV: erm? [13:24:58] it works fine with multiple paths on a single domain [13:25:06] that what i use on my test machine in the office [13:25:08] oh fine.. [13:25:23] brion: how do it display them then? [13:25:53] VasilievVV: iirc as hostname/path [13:26:09] it looks like the problem is the $wgScriptPath setting in that conf array [13:26:17] they need to be listed explicitly [13:26:30] same with wgArticlePath [13:26:54] i had to change that line cause there was a var called $wiki, totally empty.. [13:26:55] otherwise it can't pick out the correct path for the variuos wikis [13:26:57] brion: for my machine I installed many virtual hosts and added wikis like en.wikipedia.loc to /etc/hosts [13:27:11] VasilievVV: that works too, i'm just lazy ;) [13:28:13] brion, so how i've to change it? :( [13:28:47] list out those script paths [13:28:59] or perhaps its the lang? [13:29:05] in which case 'default' => '$lang', will do [13:29:08] or rather [13:29:13] 'default' => '/$lang', [13:31:41] 'wgScriptPath' => array( 'default' => '/$lang',) [13:31:47] but nothing changed... [13:32:12] Alan042: do the article path too [13:32:12] bedtime, night [13:33:30] the var lang is empty [13:33:51] the link created is ip//index.php [13:35:02] look : http://$IP//index.php/User:Jack [13:36:43] Ok, on my parser {{#function}} issue... the problem is that the expression involves multiple layers of parsing. Prior to v1.12, I could put in escapes to defer the parsing, but now it doesn't seem to be parsing the output anymore. [13:37:37] The wikicode looks like this: {{#xploop:{{#var:keylist}}|\n* '''$s$''': "{{#var:$s$}}" looks like: [#var:$s$]}} [13:38:04] keylist is a demarcated list of strings, each of which gets plugged into $s$. [13:38:30] Alan042: well, set it manually for each or figure out a way to get the variables split right in the standard fashion [13:38:42] 03aaron * r37951 10/trunk/extensions/FlaggedRevs/FlaggedArticle.php: Index the current revision if it is the default [13:38:52] [#var:$s$] is coming out properly as "[test1], [test2]" and so forth... 
[13:39:27] ...but apparently {{#var:$s$}} is being parsed before the substitution takes place, so all I get is the value of a variable named "$s$". [13:39:27] TehWuzyl: what extension is this? [13:39:54] There are two extensions involved. [13:40:01] brion...i'l try..i thought btw this was the aim of sitefromDB function... [13:40:20] One is Extension:VariablesExtension [13:40:42] is it in svn? [13:40:43] The other is a slightly customized version of Extension:StringFunctions [13:40:56] why does it have extension twice? [13:41:11] I copied/pasted the title from mediawiki.org [13:41:21] Alan042: that assumes names consist of language prefix + site suffix [13:41:29] I'll get the names from my Special:Version if that will help... [13:41:33] you seem to be using something like site prefix + country suffix [13:41:36] are you the maintainer? [13:42:00] I'm the author of the custom tweaks to StringFunctions, but not the rest of it. [13:42:10] I should probably try to use the standard version first, shouldn't I... [13:42:32] The tweaks are not radical, but just to eliminate possibilities... [13:44:00] brion, can you make an example of a proper way to call db and directory of mediawiki, that make sitefromDB works properly please? [13:44:36] I don't see any #function [13:44:38] example: eswiki (where 'wiki' is one of the listed suffixes and 'es' is a language code) [13:44:56] TimStarling: I meant that in a generic sense. [13:45:21] so db and the directory should have the same name right? [13:45:23] The actual functions giving me a hard time are {{#var}} and my custom function, {{#xploop}} -- which isn't in SVN [13:46:01] is it the interaction between those two? or are they both broken? [13:46:25] It seems to be an interaction. The parsing is in a different order now. [13:46:38] damn right it is [13:46:42] the old order was broken [13:46:47] It used to be that the {{#var:}} substitution took place on return from {{#xploop}} [13:47:04] Now it takes place beforehand, which means that the $s$ substitution hasn't happened yet. [13:47:10] show me the code for #xploop [13:47:21] Thx. One moment... [13:47:27] TehWuzyl: just fyi... if you need to get this fixed *now* for a live site, you can simply switch to the old parser. [13:47:52] Are there instructions on that? Just swap out parser.php or something? [13:47:58] Duesentrieb: are you hinting that I should have something better to do? [13:48:04] no, simpler [13:48:08] TimStarling: nono :) [13:48:23] It's a live site, but low-traffic and nobody but me gives a flying spaghetti monster, really ;-) [13:48:30] there have been a couple of complaints from extension developers about the new parser, and I'd like to sort them out [13:48:34] TehWuzyl: TimStarling probably knows the setting, i have to look it up. wgParserClass or something [13:48:45] because I think the new parser is pretty general, and it should be able to do a lot of things [13:49:05] TimStarling: I'm sure there's a way to fix it. It'd be nice to have that mentioned in the v1.12 docs, so yeah... let's see if we can figure out what's going on. [13:49:08] if you have an extension that requires the old parser, it's going to conflict with extensions that need the new parser [13:49:10] *TehWuzyl goes back to pasting code... [13:49:17] like, say, ParserFunctions [13:49:17] TimStarling: sure, i'm all for that. i'm just telling him how to fix a live site, so the real issue can be sorted out in peace. 
[13:49:41] TehWuzyl: there was a whole section in RELEASE-NOTES about it [13:50:11] Alan042: they don't have to, but it's convenient if they are the same :) [13:50:12] TimStarling: btw - did you ever get around to looing at the PPCustomFrame stuff i did? that should have unbroken quite a few extensions. [13:50:20] TimStarling: http://rafb.net/p/nOI42D71.html [13:50:36] Duesentrieb: briefly, I think, did you have an application for it? [13:50:43] Oh, there was? Bother not having time to read all the docs, cuz I usually like to... /me goes to look that up. [13:50:44] hi [13:51:10] There is a noindex-Tags on very much Wikipedia pages... [13:51:19] Is there somthing wrong? [13:51:32] TimStarling: yes, my News extension. basically, it allows the replaceVariables method in the parser to also be called "old style", with a simple map of variable values. [13:51:43] TimStarling: some extensions are using that. it's broken in 1.12 [13:51:50] with no good way to work around it [13:52:27] thoth_: $wgParserConf['class'] = 'Parser_OldPP'; <-- consider that a nasty, temporary fix [13:52:31] err, dam [13:52:36] TehWuzyl: see above [13:52:44] Hey, I have a quick question, and google's not giving me anything to go on. I want to be emailed every time a new user signs up [13:52:58] Joe_CoT: use RSS instead. [13:53:23] Duesentrieb: ups , what ? [13:53:26] Joe_CoT: and... i think you need to install an extension, NewUserLog or something. that should give you an RSS feed too [13:53:32] thoth_: sorry, nick typo [13:53:35] :) [13:53:56] Duesentrieb: yeah, looks fine, pretty trivial really [13:54:40] *TehWuzyl saves Duesentrieb's fix for later... can't debug if everything's working ;-) [13:55:03] (Last resort, if I have to go before fixing it for real.) [13:55:17] TimStarling: yes, trivial... i was afraid i missed something :) [13:55:21] TehWuzyl: what's your test case? [13:55:53] Duesentrieb: the fact that you left out the whitespace hack is probably wise [13:56:03] if the application just wants a simple hashtable of arguments [13:56:04] TimStarling: Let me set up a page without lots of gunk on it... my sandbox is currently a mess ;-) [13:56:32] TimStarling: whitespace hack? oh, you mean treating empty params as missing? [13:56:52] TimStarling: yea, well, if i get an explicit map, i suppose i should not touch it. [13:57:22] 03aaron * r37952 10/trunk/extensions/FlaggedRevs/FlaggedArticle.php: Make $wgFlaggedRevsLowProfile work as expected if the draft is the default [13:58:08] maybe you missed it... the reason the base classes treat numbered parameters and named parameters differently is because named arguments trim leading and trailing whitespace and numbered arguments don't [13:58:23] it's just for backwards compatibility [13:58:39] arbitrary behaviour in the old code, keeping it the same minimises complaints [13:59:49] TimStarling: hah, yes, i did indeed miss that one :) [14:00:11] but again - if the params are passed programmatically, i guess they should be left alone.# [14:00:16] yeah [14:00:23] hi [14:01:25] TimStarling - test case: http://issuepedia.org/Help:Sandbox/debug [14:01:27] i have a question/problem about/with Extension:Google Custom Search Engine. Is someone familar with it? [14:04:14] *TehWuzyl continues reading about the parser changes, in case a solution leaps out... [14:05:17] I guess the question is, how can I call {{#var:}} in such a way that it will be parsed *after* {{#xploop}} is done? 
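For context, the temporary workaround quoted above is a single line in LocalSettings.php:

    # Nasty, temporary fix: run the whole wiki on the pre-1.12 preprocessor
    # until the extension conflict is sorted out properly.
    $wgParserConf['class'] = 'Parser_OldPP';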
[14:05:54] Or, even, is there some other variable substitution method I can use to accomplish the same thing (which might not solve anyone else's probblems, however)? [14:06:46] what does CharacterEscapes::charUnesc() do? [14:07:13] That does the replacement of \o and \c to {{ and }} [14:08:15] freaky [14:08:39] and \n? [14:08:39] It's a bit of a kluge. I keep thinking I should install something like Winter, for a more integrated text-processing language... [14:09:06] \n inserts a newline, so "*" will be parsed to a list item [14:09:34] (The \n substitution is also done by charUnesc()) [14:10:19] CharacterEscapes is documented here: http://www.mediawiki.org/wiki/Extension:Character_Escapes [14:10:28] you could use 1= instead [14:10:48] Where? [14:11:00] hmm, no, other way around [14:11:30] Maybe I just need to build a variable parser into #xploop... the code is pretty simple... [14:11:50] ...but again, that doesn't solve anyone else's problems. [14:12:03] 1= doesn't work with the old calling convention anyway [14:12:11] The problem, in generalized form, is when you need multipass parsing. Which this is. [14:12:34] multipass parsing sucks [14:12:38] Yeah. [14:12:38] 03guyvdb * r37953 10/branches/visual_diff/phase3/includes/Diff.php: performance improvements [14:13:07] It's inefficient, for one thing -- and you've probably removed much of that inefficiency with the new parser code, but in the process breaking anything which requires multipass. [14:13:53] So the question then becomes, is it within MediaWiki's scope to try and support these more elaborate uses of the parser? Or should that be a separate project -- something like Winter? [14:13:55] 03jojo * r37954 10/trunk/extensions/Collection/ (4 files): make list of writers configurable. iterate over available writers. [14:14:02] I think it's possible to do what you want to do, using SFH_OBJECT_ARGS [14:14:20] Hmm! I saw that in the ParserFunctions code, but couldn't find any doc for it. [14:14:41] So I don't know what it means or how it works. [14:14:56] I can write some docs [14:15:07] That's be awesome. [14:15:14] s/That's/That'd/ [14:15:30] there's one problem with SFH_OBJECT_ARGS which is that the first parameter is always expanded, that was a problem for other extension developers [14:15:34] but for you, it doesn't matter [14:15:42] Right. [14:16:17] basically SFH_OBJECT_ARGS will give you an array of arguments, where all except the first one is a parse tree object instead of text [14:16:32] Intriguing. [14:16:48] so when you do the loop, all you have to do is expand the parse tree in a different context each time [14:17:04] I'm hoping I'll be able to figure out how that works from examples. [14:18:23] it'll be easiest if you change the syntax slightly [14:18:49] instead of $s$, you could use say {{$s}} [14:18:55] Hm! [14:19:25] You think that would work without any other changes? [14:20:13] a few little changes [14:21:50] *TehWuzyl tries a few things... [14:22:24] if you register your parser function with SFH_OBJECT_ARGS, it'll be called with three arguments: func($parser, $frame, $args) [14:23:03] the argument in question will be $args[1] [14:23:20] ordinarily, you would expand it with $frame->expand($args[1]) [14:23:37] that's what is being done automatically for you, in the backwards compatible interface that you're using now [14:23:42] Ahh. [14:23:58] what you can do is use a custom frame [14:24:23] $myFrame = new MyCustomFrame( $frame, $extraArgs ); [14:24:29] $myFrame->expand($args[1]); [14:24:56] *TehWuzyl takes notes... 
[14:25:48] And what does MyCustomFrame need to accomplish, basically? [14:26:03] it's going to be a bit tricky if you need to support arguments that come from the enclosing scope [14:26:30] This may be a kluge upon a kluge... [14:26:48] there's a function called $myFrame->getArgument(), you can have it return the right variable value [14:27:33] the trouble is, the base class is an interface, not a class [14:27:40] and there's duplicated functionality between PPF_ [14:27:45] PPFrame_DOM and PPFrame_Hash [14:27:49] Sounds hairy. [14:28:23] Maybe I just need to write a mini-parser which does nothing but variable substitution, and call that within #xploop. [14:28:50] well, if you don't mind losing the enclosing scope, you can just use Duesentrieb's PPCustomFrame_* [14:29:10] Not particularly elegant, but I don't know enough about MW's parser design to come up with something elegant... PPCustomFrame? [14:29:36] like this: [14:29:46] $pp = $parser->getPreprocessor(); [14:30:21] $myFrame = $pp->newCustomFrame( array( '$s$' => $value ) ); [14:30:38] $out .= $myFrame->expand($args[1]); [14:30:48] well, if all you want is replace variables in a chunk of text, just use $parser->replaceVariables($text, array("foo" => "bar")); [14:31:18] no need to think about frames, it's all handled internally. don't know the details about scope handlign though :) [14:31:18] Duesentrieb: the issue is that if he doesn't use SFH_OBJECT_ARGS, the variables are already expanded [14:31:47] and he ants to expand them later? [14:32:04] he wants to do something like {{#foreach $x in a, b, c | {{#var:$x}} }} [14:32:11] Yah [14:32:19] so he wants to expand the #var in three different custom frames [14:32:24] hm... well, replaceVariables can still be used, but you'd have to supply a frame that actually makes the variable resolve to their golabl meanings, instead of some fixed array of values [14:32:25] It's a klunky way of implementing an array. [14:32:32] I'd take another method if there was one. [14:32:51] TehWuzyl: do you get what I mean about enclosing scope? [14:32:56] isn't there a loop extension? [14:33:11] Duesentrieb: if there is, it's probably broken by 1.12 ;) [14:33:14] I'm not sure about enclosing scope... you mean the variable values set elsewhere in the page? [14:33:26] TimStarling: possibly :) [14:33:28] http://www.mediawiki.org/wiki/Extension:LoopFunctions [14:33:54] "I was very happy with this extension, until I upgraded MW to 1.12. It doesn't work anymore." [14:33:55] :P [14:33:57] There's a loop extension which supposedly replaces the functionality which was in an extension which is replaced by the ParserFunctions extension for 1.12 but is no longer in the current version... or something like that. [14:34:00] TehWuzyl: if the {{#xploop}} call is in a template, then you might want to access the template arguments [14:34:20] I think I could do without that in my immediate usage. [14:34:25] I *think*. [14:34:35] you could use Winter... [14:34:35] http://www.mediawiki.org/wiki/Extension:Winter [14:34:37] Oh, wait... no. [14:34:41] Winter. [14:34:41] (not seriously) [14:34:42] Right. [14:35:02] I've been looking at that... but it's iffy? [14:35:40] oh wait, here we go [14:35:41] http://www.mediawiki.org/wiki/Extension:Loops [14:35:48] this one was explicitly written for 1.12 [14:35:52] Right. [14:36:02] I guess I should make sure it can't be adapted to do what I want... [14:36:04] *TehWuzyl reads [14:36:13] didn't look how it works [14:37:01] Tentatively, it looks like it might work... 
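And a sketch of the looping variant built from the newCustomFrame() lines quoted above: the unexpanded body is expanded once per value, each time in a fresh custom frame. It assumes the wikitext body refers to the per-pass value as a template-style argument (e.g. {{{x}}}); registration is the same as in the previous sketch, only the callback differs, and all names are placeholders.

    function wfExampleLoopRenderPerValue( $parser, $frame, $args ) {
    	$values = explode( ',', array_shift( $args ) );  // e.g. "a,b,c" (already expanded)
    	$body   = array_shift( $args );                  // unexpanded parse tree of the body
    	if ( $body === null ) {
    		return '';
    	}
    	$pp  = $parser->getPreprocessor();
    	$out = '';
    	foreach ( $values as $value ) {
    		// a custom frame maps argument names to fixed values for this pass
    		$myFrame = $pp->newCustomFrame( array( 'x' => trim( $value ) ) );
    		$out .= $myFrame->expand( $body );
    	}
    	return $out;
    }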
[14:37:35] The {{#var}} calls aren't inside the {{#while}} function call, so perhaps they are deferred. [14:38:20] Oh wait, they are. Confusing formatting. [14:39:01] It *looks* like it wouldn't work, but the example specifically uses #var substitution. [14:39:31] I'll have to muck around with all these ideas later; gotta go in a few minutes, so I'd better put in that hack to get the old parser back temporarily. [14:39:40] Thanks Tim & Duesentrieb for the ideas. [14:40:00] have fun [14:40:07] quit [14:43:08] at the moment extension:intersection cannot list page in alphabetical order [14:43:34] http://www.mediawiki.org/wiki/Extension:Intersection [14:43:49] Duesentrieb: which class do I plug in -- the one which processes #xploop, or the one which processes #var? Or both? [14:43:50] but it can do it simply adding 4 line of codes [14:44:37] I have tried and it works [14:45:06] TehWuzyl: plugin where how? [14:45:26] $wgParserConf['class'] = 'Parser_OldPP'; [14:45:35] Substitute for 'class' [14:45:57] no supstitute [14:46:00] that's literal [14:46:05] Oh. Doy. [14:46:06] both are, actually [14:46:06] Ok... [14:46:21] this simply switches the entire wiki to the old parser [14:46:29] no fuss with specific functions or anything [14:46:34] 03jojo * r37955 10/trunk/extensions/Collection/ (Collection.php README.txt): remove ODT in default value [14:47:16] where can I ask for get the revision published? [14:48:24] Ramac: post a feature request on bugzilla, and attach your changes, preferrably as a patch created by svn diff. [14:48:40] thank you :) [14:48:51] Yeah, that works. Thx again, Duesentrieb. [14:49:33] Ramac: this should work well enough if the extension is maintained by an active mediawiki developer. it may be wise to point the extension's author to your request otherwise. [14:49:42] TehWuzyl: no problem :) [14:50:20] well I can leave them a message :) thank you [14:51:53] 03tstarling * r37956 10/trunk/phase3/includes/parser/Parser.php: Some documentation for SFH_OBJECT_ARGS [14:54:05] What's the article on Wikipedia with the most lines of source? [14:54:13] is it Bush? [14:54:35] probably some deletion log from 2003 [14:55:03] I want to test my new diff code on a realistic problematic article [14:55:24] where is the pastebin in chatzilla? [14:56:16] the major city articles, and the country articles, are generally pretty beefy [14:57:33] e.g. [[United States]] [14:59:10] George Bush is bigger [14:59:28] 03(NEW) Javascript conflict between Headertabs and timeline inline queries - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14897 04BLOCKER; normal; MediaWiki extensions: Semantic MediaWiki; (laurent) [15:02:40] 03tstarling * r37957 10/branches/REL1_13/ (4 files in 4 dirs): Branching MediaWiki 1.13 [15:03:22] Hey [15:03:24] Im wondering [15:03:31] what perameters do i have to set in localsettings.php to get central auth to work? [15:04:07] you don't want to use CentralAuth [15:04:40] Yes i do [15:04:48] are you testing code for wikimedia? 
[15:04:49] Your not gonna talk me out of it [15:04:54] nope [15:05:03] sounds like you don't need it then [15:05:19] Im running three Wikis [15:05:22] soon to be four [15:05:25] Trust me i do [15:05:25] oh wow, four [15:05:35] wikia runs over 1000, they don't use centralauth [15:05:48] TimStarling: they use better mechanism [15:05:50] I cam here for help [15:05:54] not a lecture [15:05:54] and they have lots of features that wikimedia doesn't, like global user talk notification [15:06:34] "I want to shoot myself in the foot, I insist you help me with that" [15:06:39] Prom3th3an|AWAY: I'm afraid you're only paying me enough for a lecture [15:06:43] TimStarling: if we compare Wikia's global login system and ours, our will look like ugly hack. That's considering we are pioneers of wiki technologies :( [15:06:43] help is extra [15:07:06] Wikia didn't actually write theirs, JeLuF did [15:07:11] so we can take credit either way [15:07:43] they wrote the user talk notification thing though [15:07:44] Ok well, shared DB's just doesnt cut it for what im after. [15:08:06] Ive split the Db's up, and am hoping to get central auth installed [15:09:00] Prom3th3an|AWAY: have you tried reading the comments in CentralAuth.php? [15:09:13] Yes, they all failed [15:09:38] Im hopping ill sucede [15:09:47] im an optimist [15:10:50] Prom3th3an: have you set up $wgConf? [15:11:02] Bingo [15:11:07] thats what im missing [15:11:09] deh [15:11:11] lol [15:12:12] hi, I am new here and have a problem with my wiki - is there someone who can (try to) help me [15:12:20] !ask | barcalex [15:12:20] --mwbot-- barcalex: Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [15:12:39] ok, sorry, like i said - i am new :) [15:13:10] it is about time allocation. if someone says "i can try and help" before they know the problem, they might not, and yet they have then given up time etc [15:13:17] easier/quicker to just ask [15:13:27] and other reasons [15:14:17] i have the homepage http://www.allekochen.com and since some days i always have the same problem with the page. it is not showing the latest news i wrote in. but that is not always, some days it works perfectly and shows everything (sorry for bad english) [15:16:36] it seems that there are two wikis on my page - some days it shows the first, other days it shows the other [15:26:59] 03river * r37958 10/trunk/switchboard/ (fcgi.h process_factory.cc process_factory.h): [15:26:59] - fcgi.h: missing #include [15:26:59] - process_factory: should be n.uid/n.gid here, not uid/gid from released [15:26:59] process. need to track uid/gid in struct waiter. [15:30:58] u are not talking to me, are u ? [15:31:47] barcalex: it sounds like your hosting provider's fault [15:32:14] i already talked to them but they say they do not have any problems [15:54:09] 03demon * r37959 10/trunk/extensions/CheckUser/CheckUser_body.php: (bug 14528) Trim spaces on $user input. [15:54:26] 03(FIXED) Trim spaces on Checkuser IP input - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14528 +comment (10innocentkiller) [16:05:52] 03(NEW) Excessively large offset specified in {{#time: }} causes timeout - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14898 minor; normal; MediaWiki extensions: ParserFunctions; (ed.nr.drie) [16:07:35] TimStarling: why is there a lock on RELEASE-NOTES? 
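For anyone in the same boat: the pieces CentralAuth expects are roughly a filled-in $wgConf describing all the wikis (see the sketch earlier in this log) plus the extension include and its shared database. The values below are placeholders; the comments in CentralAuth.php remain the authoritative list of settings.

    require_once( "$IP/extensions/CentralAuth/CentralAuth.php" );
    $wgCentralAuthDatabase = 'centralauth';   // shared database holding the global accounts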
[16:09:37] because I'm about to blank it, wait 30 seconds [16:10:01] i was going to do it :) [16:10:13] so you know why it's locked [16:10:44] dunno why a lock is required [16:11:58] so that you don't waste your time and mine doing precisely the same work that I'm doing [16:12:09] 03tstarling * r37960 10/trunk/phase3/ (HISTORY RELEASE-NOTES): HISTORY/RELEASE-NOTES to 1.14 [16:12:15] it's not a race [16:12:38] find something to do which isn't instantly going to conflict with me [16:13:20] 03ialex * r37961 10/trunk/phase3/includes/DefaultSettings.php: $wgVersion too [16:13:29] asshole [16:18:11] 03tstarling * r37962 10/trunk/ (2 files in 2 dirs): More updates for 1.14 [16:23:43] 03tstarling * r37963 10/branches/REL1_13/phase3/ (RELEASE-NOTES includes/DefaultSettings.php): Version number update. [16:30:05] 03tstarling * r37964 10/branches/REL1_13/phase3/ (RELEASE-NOTES includes/DefaultSettings.php): 1.13.0rc1 not 1.13rc1 [16:30:43] brion: should I tag that now? [16:31:01] or is there something else to do? [16:34:31] TimStarling: pack and announce it? [16:36:39] Tagging comes before that. [16:37:14] oy [16:37:38] I know, review the entire codebase, right? [16:37:43] updated version and release notes? excellent [16:37:52] because released versions are stable, aren't they? [16:38:04] there's some installer upgrade regression tests lying around somewhere [16:38:20] in release-tools [16:38:33] doesn't hurt to run that :) [16:39:20] http://svn.wikimedia.org/viewvc/mediawiki/trunk/release-tools/ [16:41:40] gpg --detach-sign $f [16:41:53] guess I should put a key in there first [16:42:17] prolly :D [16:43:45] what is the signature for anyway? where would someone get your public key from? [16:44:52] from any pgp key server [16:45:19] but anyone can upload a key under your name [16:45:39] ah, but there's the web of trust! [16:45:47] i think like two people have signed my key :) [16:45:57] so... you're assuming that the downloader wil trust one of those two people? [16:46:09] or someone who signed one of those people's keys [16:46:14] ;) welcome to the wonder of pgp [16:46:27] or someone who signed the key of someone who signed one of those people's keys [16:46:41] why don't we just host the keys on our HTTPS server? [16:46:45] then it's authenticated via a CA [16:46:51] can do that too, sure [16:47:03] i think mine's on my web site somewhere [16:47:16] see, you're promoting an insecure solution ;) [16:47:40] downloading a public key from a keyserver and checking it against a package gives you a false sense of security [16:47:41] at least we don't have to pay $50 for a signed key [16:48:21] TimStarling: even though i don't have a trusted path to brion's key, i know it's his because i validate all his list posts with it. 
so, either the release was signed by brion, or someone has been impersonating him for the last 4 years [16:48:53] i've been paying a kid in india to post for me, frees up a lot of time [16:48:54] or someone has been acting as a MITM for the last 4 years, changing all his signatures [16:49:38] you know, when I had enigmail, it'd often fail on brion's posts, because gmane would munge the email addresses [16:49:47] invalidating the sig [16:49:54] hehe [16:50:20] i remember i used to post in pgp-mime format and nobody with outlook express could read my posts [16:50:22] those were the days [16:50:35] TimStarling: i think more likely than that is that someone compromised the https server you're about to put the key on [16:50:41] yeah, I used to use outlook express until you started doing that [16:50:57] see, i was encouraging open source adoption [16:51:02] well done [16:51:03] :D [16:52:09] flyingparchment: you're still thinking only of your own situation, most mediawiki users don't have 4 years of email to verify brion's signature [16:52:20] I just updated all my wikis, and some of my old ones have one more table than the others [16:52:26] Can I safely remove the extra table? [16:52:40] TimStarling: just put the damn keys on the server, we get it :) [16:53:49] It's ipblocks_old [16:53:49] Elvanor: which tabe? [16:53:55] ialex: ^^ [16:53:59] :) [16:54:19] It's empty anyway so I will just remove it [16:54:48] maybe it was used to update the ipblocks table, this is not used in MediaWiki [16:55:12] I was hoping someone might be able to help me out. I am trying to set up 3 columns similar to the setup on the mediawiki homepage and have them be side-by-side [16:55:30] tronyx_, tables. [16:55:33] when i enter the 3 tables, they will not line up. is there an easiest way to do this? I currently have them setup as 3 different tables [16:55:37] Or at least that's how I bet the MediaWiki homepage does it. [16:55:43] You want one table, three columns. [16:56:08] Either that, or some arcane 4 KB CSS library that will make your website look like Swiss cheese in IE5. [16:56:35] i'll take my chances with one table and 3 columns [16:56:35] thanks [17:04:30] Simetrical: any chance i can send you a pastebin of what I am trying to setup? this is my first day with mediawiki and i am having a hard time getting this page to display how I want [17:05:26] I'm not really a layout expert. Why don't you just copy the formatting from the source of the mw.org front page if that's what you want? [17:09:31] You can pastebin it here if you want. [17:11:31] AaronSchulz: a little discussion in #wikimedia about FlaggedRevs and DPL (classic DPL) for wikinews [17:11:34] you may be interested [17:12:04] TimStarling, something is buggy with the double redirect fixer. http://en.wikipedia.org/wiki/Special:MovePage/User:NicDumZ shows "Move associated talk page" and "Watch this page", whereas http://en.wikipedia.org/wiki/Special:MovePage/User_talk:NicDumZ shows "Update any redirects that point to the original title" and "Watch this page". I think that all three options are supposed to be shown altogether, right ? [17:12:49] NicDumZ: i believe it won't show the redirect fix checkbox if there aren't any redirects [17:12:57] right [17:12:59] and since it's a talk page, it won't show the 'also move talk page' box [17:13:42] You know, one thing I like about git is that you use git add on *every* changed file you want to commit. 
So then on the rare occasion you do want to commit a new file, you don't forget, because it's no different from committing any other file. [17:13:52] ah... I get it, sorry then :) [17:15:24] *qsheets wonders why #ifexist doesn't work with redirects [17:15:32] did you get my message Simetrical? [17:16:13] qsheets: in what way doesn't it work? [17:16:25] tronyx_, better to post in the channel. More people to see it and help here. [17:16:31] And also I actually notice. [17:16:58] tronyx_, you want those to be three columns, then? [17:17:05] I am trying to get all 3 of the tables setup in a row [17:17:06] yea [17:17:22] brion: {{#ifexist: PAGE | #REDIRECT [[PAGE]] | Page doesn't exist}} [17:17:24] well three colums or 3 separate tables, whichever works, but side-by-side with a little space in between [17:17:26] There are more than three . . . [17:17:39] qsheets: ewwwwwwwwwwwwwwwwwwwwww [17:17:46] I count eight tables. [17:17:56] Not sure which you want in columns. [17:17:58] no, redirects are parsed specifically as raw markup [17:18:12] strange wikitext constructs around them, of any kind, won't accomplish anything [17:18:15] qsheets, you can't ever have #REDIRECT imported by parser function or template. [17:18:23] SOPs, Webinar and policies Simetrical. 3 sections rather [17:18:24] It has to be the raw first bytes of the page. [17:18:30] i c [17:20:33] tronyx_, try this. http://pastebin.com/d22063553 [17:21:55] thank you so much Simetrical [17:51:14] 03tstarling * r37965 10/trunk/release-tools/upgradeTest.py: Less noisy svn commands. Added 1.13. [17:53:41] 03simetrical * r37966 10/trunk/phase3/includes/DefaultSettings.php: Documentation: note that $wgNamespaceRobotPolicies can't change the default for NS_SPECIAL. [18:14:51] hey there.. so looking into mw1.13 i saw that the monobook skin was altered to enable customBox and different settings to the sidebar, is there anywhere to get more info on this? [18:17:55] what is customBox? [18:18:43] what different settings? [18:18:55] TimStarling : $this->customBox( $boxName, $cont ); line ~160 [18:19:32] in the monobook.php [18:19:37] that's a refactor [18:19:53] custom boxes of that kind have been allowed (and used heavily on wikimedia) for years [18:20:33] heh, wasn't aware of that :) , so basically any extension can use it to create a box ? [18:21:16] nope [18:22:11] I'm looking to implement __INDEX__ and __NOINDEX__ magic words. How should that work in terms of OutputPage methods? It only has setRobotPolicy(), which sets the whole policy. I only want to set the "index" bit and leave the follow part untouched. [18:22:38] to add things there in extensions, you would need a hook in Skin::buildSidebar() [18:22:43] but there isn't one [18:23:04] TimStarling : in the actual function for custom box there is a note that says: # allow raw HTML block to be defined by extensions [18:23:28] i'm guessing this is where the hook would come into play, if it ever gets introduced? [18:23:34] that is true [18:23:59] ok, thanks TimStarling... [18:24:03] <^demon> Simetrical: Maybe setRobotPolicy() needs to be changed to allow setting the two terms (index/follow) independently of one another? [18:24:39] but I'm still not seeing a hook [18:24:41] ^demon, then I guess I'd have to parse input to setRobotPolicy(). Shouldn't be a problem, I guess. [18:24:46] maybe you could hack the template data [18:25:47] <^demon> Simetrical: Maybe refactor it? 
setRobotPolicy( $index = true, $follow = false ) [18:26:02] TimStarling, for what i need I might get rid of the sidebar all together.. thanks again :) [18:26:57] ^demon: bleh, how about setRobotPolicy( array( 'index' => true, 'follow' => false ) ) [18:27:00] ^demon, no, no good for backward compatibility. [18:27:14] then you could implement backwards compatibility [18:27:22] <^demon> TimStarling/Simetrical: Whatever works well. I was just throwing out ideas. [18:27:24] and the calling code would be readable [18:27:43] Well, why not just setRobotPolicy( $index = 'index', $follow = 'follow' )? [18:28:07] cute [18:28:28] Except actually not a good idea, since what happens with setRobotPolicy( 'follow', 'index' )? [18:28:35] I could use varargs, I guess. [18:28:52] Or just use setRobotPolicy( 'index,follow' ) as at present, and grep it. [18:28:55] That's what I'll do, I think. [18:29:14] there's an obvious alternative to shelling out to grep you know... ;) [18:29:23] you could use explode(',', $policy) [18:29:36] and then the result could be processed in the same way as func_get_args() in your vararg idea [18:29:46] Sensible. [18:29:52] it's not exactly the hardest format to "parse" is it? [18:29:55] That provides more error-checking. [18:29:57] No, no it's not. [18:36:13] TimStarling: remember to update the Main Page of www.mediawiki.org ;P [18:37:50] why should it be updated? :O I don't think rc1 has been tagged or put into tar.gz yet [18:38:26] FunPika: because rc1 for 1.12 was updated on the main page [18:38:31] wb brion [18:38:32] Hmm, this causes a regression if anyone is specifying just "index" or "follow" or whatever and is expecting that to be output literally, since the other bit would then formerly be implicitly reset to index/follow. [18:38:50] 03tstarling * r37967 10/tags/REL1_13_0RC1/: Tagged 1.13.0rc1. [18:39:03] and NOW it was tagged :P [18:41:11] yay [18:41:16] Nothing in core, but could be in extensions, or people setting $wgArticleRobotPolicies or something. [18:41:23] I guess I have to respect that. Let's think of a new syntax. [18:42:17] why does make-release.sh use svn co instead of svn export? [18:42:35] i probably never thought of it :) [18:42:44] that would be sensible probably [18:42:44] *Simetrical solicits brion's opinion on syntax for OutputPage::setRobotPolicy that would allow setting *only* the index status and leaving the follow status unchanged [18:43:05] Maybe I should do a separate setIndexPolicy/setFollowPolicy? [18:43:25] hmmm [18:43:52] that could work i suppose [18:43:57] keep the old one for compat [18:44:14] Also for if you *do* want to set both at once. [18:44:43] brion: when you wrote "sha1 $f" and "md5 $f", did you actually mean "sha1sum $f" and "md5sum $f"? [18:44:55] or is that a Mac OSX quirk or something? [18:45:58] mac osx has a 'md5' but not 'md5sum' [18:46:06] i think i wrote a one-line 'sha1' script myself :) [18:46:18] you could make it do a fallback or something [18:46:39] does mac osx have "which"? [18:46:48] it does [18:47:05] yes [18:48:18] Okay, so something about my refactoring was wrong? [18:48:19] TimStarling: http://rafb.net/p/T8sArx91.html <- my sha1 script :D [18:48:20] [Wed Jul 23 14:47:17 2008] [notice] child pid 26891 exit signal Segmentation fault (11) [18:48:21] . . . [18:48:51] or you could install sha1sum from ports ;) [18:48:56] Last time I observe tendantion to build bot-like tasks into software itself (or extension) [18:49:06] So, how to debug segfaults in PHP? 
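A sketch of the interface being discussed above (not the code that was actually committed): keep the old combined setter for compatibility, split its argument with explode(), and let the index and follow halves be set independently. The class and member names are illustrative only.

    class ExampleOutputPage {
    	private $mIndexPolicy  = 'index';
    	private $mFollowPolicy = 'follow';

    	# Old-style combined call, e.g. setRobotPolicy( 'noindex,nofollow' )
    	public function setRobotPolicy( $policy ) {
    		$policy = array_map( 'trim', explode( ',', $policy ) );
    		if ( in_array( 'noindex', $policy ) ) {
    			$this->setIndexPolicy( 'noindex' );
    		} elseif ( in_array( 'index', $policy ) ) {
    			$this->setIndexPolicy( 'index' );
    		}
    		if ( in_array( 'nofollow', $policy ) ) {
    			$this->setFollowPolicy( 'nofollow' );
    		} elseif ( in_array( 'follow', $policy ) ) {
    			$this->setFollowPolicy( 'follow' );
    		}
    	}

    	# Set only the index half, leaving the follow half untouched --
    	# what __INDEX__ / __NOINDEX__ need
    	public function setIndexPolicy( $policy ) {
    		if ( in_array( $policy, array( 'index', 'noindex' ) ) ) {
    			$this->mIndexPolicy = $policy;
    		}
    	}

    	public function setFollowPolicy( $policy ) {
    		if ( in_array( $policy, array( 'follow', 'nofollow' ) ) ) {
    			$this->mFollowPolicy = $policy;
    		}
    	}

    	public function getRobotPolicy() {
    		return "{$this->mIndexPolicy},{$this->mFollowPolicy}";
    	}
    }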
[18:49:21] Simetrical: gdb [18:49:40] https://wikitech.leuksman.com/view/GDB_with_PHP [18:49:45] (note a common source of segfaults in php is actually too-deep recursion which smashes the stack. installing xdebug can help catch those) [18:49:57] Simetrical: usually they are stack overflows iirc [18:51:13] btw, why PHP can't handle stack overflows nicely, like it's done with the memory overflow? [18:51:13] Hmm, apparently I shouldn't add magic words to MagicWords.php without actually adding code to handle them. [18:51:18] VasilievVV, because it's a piece of crap. [18:51:21] :) [18:52:05] *TimStarling removes the -v flag from tar and adds -q to svn [18:52:15] php devs don't seem to consider it a problem that it segfaults instead of erroring gracefully :) [18:52:22] I don't really need a list of all files in the tarball when I'm doing a release... [18:52:29] i'm sure you'll find it listed as BOGUS 50 times on bugs.php.net [18:52:52] file listing gives you the warm fuzzy feeling of knowing that tar works [18:53:10] the devs are going to mark my latest bug as "no feedback" unless I fix it for them [18:53:31] I think that's how it works anyway [18:53:34] Do they have a policy that they don't get paid if there are open bugs or something? [18:53:38] That would explain a lot. [18:53:56] they do have a bot that produces a list of open bugs on the mailing list once every week or so [18:54:04] maybe they don't want to download too much [18:54:37] Has anyone considered forking PHP under new, non-retarded management? I guess it would be too much to ask. [18:55:21] it's the community that's the problem, as much as the management [18:55:27] :) [18:55:28] you can't really have a successful fork without the community [18:59:22] It's a funny thing. The things I like best about git are mostly the totally incidental command-line interface details. [18:59:59] Like automatically piping the output of things like diff and log to less. [19:00:08] Of course, then I forget when I'm using SVN and get 5000 lines in my terminal. [19:00:50] hello [19:05:56] 03simetrical * r37968 10/trunk/phase3/includes/ (16 files in 2 dirs): [19:05:56] Refactor a bit preparatory to fixing bug 8068: rewrite the robot policy stuff in [19:05:56] OutputPage to allow index and follow policy to be set separately. Also now [19:05:56] validates input to setRobotPolicy(). And renamed setRobotpolicy to [19:05:56] setRobotPolicy, too. If anyone was accessing $mRobotpolicy directly they're out [19:05:59] of luck, though. [19:16:45] https://secure.wikimedia.org/keys.html [19:17:06] TimStarling, why do we often set member variable defaults in constructors instead of in the class definition? [19:17:20] TimStarling: oooohhh! [19:18:32] o.o [19:18:48] Simetrical: it was a habit of mine carried over from C++ [19:18:54] I don't usually do it anymore [19:20:22] there's a difference in how PHP implements it, but it's probably not important in PHP 5 [19:22:31] Is the difference a good thing or a bad thing? [19:23:35] if you do something like: var $a = array(); [19:23:36] hello people. I backed up my site and moved it to a new server, but now my search is missing allot. All the articles are there, but the search doesnt seem to be picking up article names [19:23:55] hey, anyone familiar with an extension that allows restricting viewing certain articles? 
[19:24:02] if you put it in the class def, each object gets a copy-on-write reference to the same zval hashtable [19:24:17] but if you put it in the constructor, each object makes its own zval [19:24:28] so the class def way is probably faster [19:24:51] *but* there was a little problem we encountered which is that PHP 4 used a 16-bit integer to store its reference counts [19:25:28] so if you try to put 100,000 such objects in an array, the reference counter for the default value overflows, and then it double-frees and segfaults [19:25:41] >_< [19:25:51] Oh, bugger. [19:25:54] I asked on php-dev whether they might consider fixing it [19:26:04] 03simetrical * r37969 10/trunk/ (31 files in 23 dirs): Follow-up to r37968: forgot to commit the change setRobotpolicy -> setRobotPolicy in extensions. [19:26:12] they said "that would be a binary compatibility break, why don't you just upgrade to PHP 5?" [19:26:23] Whoops, I buried my actual commit in there too. [19:26:24] I said "ok" [19:26:40] and upgraded our ~30 servers at that time to PHP 5 [19:26:52] heh [19:27:15] PHP 5 uses a 32-bit reference count [19:27:27] 03simetrical * r37970 10/trunk/ (30 files in 22 dirs): Revert last commit for a moment, committed lots of stuff I didn't mean to. [19:27:46] is mediawiki on wikimedia officially the most massively used single php app? or are there others in the same league? [19:27:59] Google is smart enough not to use it. [19:28:08] I don't know, what do places like Facebook and MySpace use? [19:28:27] yahoo is said to use PHP for various things [19:28:28] google isn't written in php? [19:28:33] and how many hits do they get? [19:28:40] google search isn't PHP [19:28:49] but they probably have some stuff in it [19:29:03] facebook uses php and something called fbml [19:29:54] let me guess. "facebook markup language" [19:29:56] HOW ORIGINAL [19:30:01] damb capslock [19:30:15] i need to figure out how to *permanently* map that to right-alt [19:31:23] you know, the cute thing is, SGML started the idea of using "markup language" to describe everything [19:31:50] and "SGML" happens to be the initials of the inventors [19:31:50] How many programming languages are Extensible Markup Languages nowadays? [19:32:02] so it was a contrived backronym [19:32:05] XSL, to begin with . . . [19:32:25] so if "markup language" sounds awkward, that's why [19:33:41] 03simetrical * r37971 10/trunk/extensions/ (26 files in 19 dirs): Follow-up to r37968: forgot to commit the change setRobotpolicy -> setRobotPolicy in extensions. (Not committing lots of other stuff this time.) [19:37:57] xoj [19:38:03] lot of things happening today [19:38:53] 04(REOPENED) Don't show box for default case - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14697 +comment (10pbirken) [19:42:18] 03aaron * r37972 10/trunk/extensions/FlaggedRevs/FlaggedArticle.php: lowProfileUI() tweak [19:42:28] 03(FIXED) Don't show box for default case - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14697 +comment (10JSchulz_4587) [19:49:59] 03simetrical * r37973 10/trunk/phase3/ (6 files in 4 dirs): (log message trimmed) [19:49:59] (bug 8068) New __INDEX__ and __NOINDEX__ magic words allow control of search engine indexing on a per-article basis. Remarks: [19:49:59] * Currently __INDEX__ will override __NOINDEX__ regardless of their relative positions, due to the way things are written. Instead, the last one on the page should win. This should be pretty easy to fix. [19:49:59] * __INDEX__ and __NOINDEX__ override $wgArticleRobotPolicies. 
This is almost [19:50:00] certainly incorrect, but it's not totally obvious how to fix it, because of the [19:50:02] way the code is structured. Probably not a big deal, but should probably be [19:50:04] fixed at some point. [19:50:24] 03(FIXED) Magic word to add noindex to a page's header - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8068 +comment (10Simetrical+wikibugs) [19:53:47] TimStarling, we need the version numbers in Bugzilla to be updated. [19:55:08] hm? do we have ignition, err, fork? [19:55:26] oh... we do :) [19:55:39] does that mean i can merge my development branch into trunk? [19:55:47] i hope it does :) [19:58:29] 03(NEW) When __INDEX__ and __NOINDEX__ both occur, the last one in the source should win - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14899 minor; normal; MediaWiki: General/Unknown; (Simetrical+wikibugs) [19:58:35] 03(mod) Magic word to add noindex to a page's header - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8068 (10Simetrical+wikibugs) [19:59:56] is there a way to index my site so that i get better search results? [20:00:11] get a sitemap [20:00:17] !sitemap | momelod [20:00:17] --mwbot-- momelod: I don't know anything about "sitemap". [20:00:20] Grr. [20:01:39] sorry, i dont mean like google searches. I mean when i search using mediawiki's search box i dont get the results i expect [20:02:13] for example if i have a wiki page named TestPage, when i search for "TestPage" i get 0 results [20:02:44] momelod, default search sucks. Try the LuceneSearch extension. (This requires you to have root access, which is why it doesn't work by default.) [20:03:26] If you're expecting Google-quality search results, though, either use Google or give up. They've spent billions of dollars over the course of a decade or more refining their search technology, random web apps can't possibly compete. [20:03:53] 03huji * r37974 10/trunk/phase3/languages/messages/MessagesFa.php: Localisation updates: Adding/updating Persian translations [20:04:03] Simetrical: why does lucene require root access? [20:04:10] Duesentrieb, to install Lucene, surely? [20:04:24] Simetrical: i moved this wiki install from an old server. On the old server, my searches would return results where the new install doesnt. [20:04:25] Or can it be just uploaded via FTP if you have exec() privilege? [20:04:29] it's just a java prog, no? [20:04:35] Hmm, I guess so? [20:04:40] you just need shell, not root [20:04:40] Isn't it a daemon or something? [20:04:44] Okay, shell access, then. [20:04:47] so, like, just copy the jars and run it? [20:04:55] well, i wouldn't want to try without shell access :) [20:04:56] Wait, what if Java's not installed? [20:05:02] cron and stuff would also be good [20:05:05] Could you install that too? :) [20:05:09] but *root* is probably not needed [20:05:13] Fair point. 
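Going back to the property-initialisation comparison a little earlier, here are the two styles side by side; the class names are arbitrary and the comments just restate the copy-on-write behaviour described in that discussion.

    class DefaultsInClassDef {
    	public $items = array();    // objects share a copy-on-write default zval
    }

    class DefaultsInConstructor {
    	public $items;
    	public function __construct() {
    		$this->items = array(); // each object builds its own zval
    	}
    }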
[20:05:19] openjdk is comming with latest linux distributions [20:05:25] well, cron tends to be there :) [20:05:32] so in the future it is more likely to be installed [20:05:59] but yes, you could install java as ordinary user [20:06:09] 03(NEW) __INDEX__ and __NOINDEX__ should not override $wgArticleRobotPolicies - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14900 minor; normal; MediaWiki: General/Unknown; (Simetrical+wikibugs) [20:06:15] 03(mod) Magic word to add noindex to a page's header - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8068 (10Simetrical+wikibugs) [20:06:20] *Simetrical stole the idea of follow-up bugs from Mozilla [20:06:23] *Simetrical rather likes it [20:06:28] Of course, it's fairly silly if I get reverted. [20:06:45] They don't have that problem, because they do strange and mysterious things like code review *before* commit. [20:12:19] 03(mod) option to protect pages from being indexed by search engines - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9415 +comment (10Simetrical+wikibugs) [20:24:17] 03vasilievvv * r37975 10/trunk/extensions/CentralAuth/ (7 files in 2 dirs): [20:24:17] * (bug 14556) Global groups defined for certain sets of wikis [20:24:17] This may be done via creating wiki sets and assigning them to specific global groups. [20:24:17] Still may need to do some cleanup, remove some strange logging-related notices and simplify SQL queries. [20:24:54] 03(FIXED) Global groups defined for certain sets of wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14556 +comment (10vasilvv) [20:49:06] http://www.mediawiki.org/wiki/User:Cortesea ? [20:51:51] TimStarling: see also http://www.mediawiki.org/wiki/User:Brion_VIBBER#Templates.2Fpages_to_change_on_release [20:54:21] ialex: yeah, I thought it would be in a user subpage somewhere [20:58:14] 03(mod) Global deleted image review for Commons admins - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14801 (10thogol) [21:07:16] 03(NEW) Email notification mistakes edit for new page creation - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=14901 minor; normal; MediaWiki: Email; (tracy.poff) [21:09:34] Simetrical: Will it possible to list pages that use __INDEX__ or __NOINDEX__, a category or Special: page, perhaps? [21:11:56] 14(INVALID) Interwiki links not using the protocol specified in interwiki. iw_url, breaking https servers - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14827 +comment (10hightowe) [21:12:40] MZMcBride: you could just wrap the __INDEX__ in a template so you get the WhatLInksHere [21:12:53] I worried about abuse. [21:12:57] I'm. * [21:13:02] Or misuse, rather. [21:13:08] yes, I can see why [21:13:21] So, I think a tracking mechanism is pretty important. [21:14:16] 03jeluf * r37976 10/trunk/extensions/SlippyMap/ (OpenStreetMap.js SlippyMap.class.php): [21:14:16] Changes by Grant Slater: [21:14:16] - Updated Tiles@Home urls to current [21:14:16] - Removed lonLatToMercator/MercatorToLonLat dependancy, use OL instead [21:14:16] - Minor quoting tweaks [21:15:49] hey [21:15:53] i got a question :) [21:16:22] will a bot-nulledit do the same work for the job queue as if I do the nulledit? [21:20:21] 03jeluf * r37977 10/trunk/extensions/SlippyMap/ (SlippyMap.class.php SlippyMap.i18n.php SlippyMap.php): GPL headers [21:21:01] MZMcBride, not as it's presently handled. It could be added at some point, undoubtedly. [21:21:22] In my opinion, it would be a necessity. [21:21:43] Could be. A simple list would probably be unmanageable, though. 
[21:23:47] I see it as providing a mechanism for two types of disruption. Vandalism to templates to wipe out search engine coverage of large amounts of articles. Or people wanting to remove certain individual articles for political purposes. [21:24:15] Something similar to Special:ProtectedPages would be ideal, I think. Sorting by namespace, etc. But that requires a new table. [21:25:17] *Simetrical nods [21:25:37] It might need to be disabled by default until this can be worked out. [21:25:57] *MZMcBride nods. [21:26:24] But the basics are written, at any rate, if a magic word is what's desired. [21:36:21] how do you show the newest articles on the main page? [21:36:55] kiba: you'll need an extension to do that (or do it by hand) [21:37:06] kiba: extensions that can do that are News and DPL [21:37:09] !e News [21:37:09] --mwbot-- http://www.mediawiki.org/wiki/Extension:News [21:37:12] !e DPL [21:37:12] --mwbot-- http://www.mediawiki.org/wiki/Extension:DPL [21:37:17] 03aaron * r37978 10/trunk/extensions/FlaggedRevs/specialpages/StablePages_body.php: Move header up [21:37:54] the News extension is needs a patch to work with 1.12 though (will be fine with 1.11 or 1.13) [21:38:23] thanks [21:38:28] AaronSchulz_: you have a DB patch for that yet? [21:39:17] 03aaron * r37979 10/trunk/extensions/FlaggedRevs/specialpages/ReaderFeedback_body.php: redirect to graph on submit [21:39:23] does ???FlaggedRevs work with 1.12? [21:39:41] not HEAD [21:40:50] is there an easy way to import all the templates from wikipedia [21:42:15] why would you want them all? [21:42:31] well maybe not all, but like I need all the cite related templates [21:42:31] cause you gotta catch em all! [21:42:56] and it looks like some kind of maze of templates using templates [21:43:43] it is [21:43:53] chances are you can get away with something much simpler [21:45:30] Cite_book looks really complicated [21:45:44] look at Wiktionaries version [21:46:28] that won't have so many dependancies [21:46:37] valan: copying templates from en wikipedia will get you crazy. it's better to make your own. or at least, copy somethign nice and simple. [21:47:58] alright [21:48:44] #join wikipedia [21:48:51] having a set of nice templates around for people to copy would be nice... [21:49:06] WP-Gast: close :P [21:49:57] hi [21:50:39] does wikipedia use a lot of extensions? [21:50:54] several. not a "lot" really [21:51:08] is there a list anywhere? [21:51:12] you definitly want parserfunctions of you are going to mess with templates [21:51:22] valan, Special:Version [21:51:28] thanks [21:51:30] indeed. [21:51:44] It uses 16. [21:51:49] Although MakeBot and MakeSysop could be dropped now. [21:52:59] not makebot [21:53:08] I see like 40 [21:59:08] Only 16 are extensions. [21:59:12] Well, probably. [21:59:22] There might be one or two secret ones. [22:07:28] Hello I am getting a database error when trying to add an image [22:07:29] from within function "LocalFile::loadFromDB". MySQL returned error "1054: Unknown column 'img_sha1' in 'field list' (localhost)". [22:08:00] stiv2k, did you recently upgrade and not run maintenance/update.php? [22:08:03] !upgrading | stiv2k [22:08:03] --mwbot-- stiv2k: http://www.mediawiki.org/wiki/Manual:Upgrading [22:08:05] I am using mediawiki 1.12.0 [22:08:33] Simetrical: let me check it out [22:09:38] I gotta give it superuser? 
[22:10:42] Simetrical: I don't get it, it doesnt tell me where to provide the superuser information [22:11:43] !adminsettings | stiv2k [22:11:43] --mwbot-- stiv2k: AdminSettings.php is an additional configuration file for use with command line maintenance scripts. See AdminSettings.sample in your installation directory for details. [22:12:23] Is it normal if I only have a .sample [22:12:36] My wiki's been that way for a long time :| [22:12:57] sure, you only need to configure that for using command line scripts [22:13:16] ah [22:13:18] but when upgrading, you *should* run update.php [22:13:21] !update [22:13:21] --mwbot-- http://www.mediawiki.org/wiki/Manual:Upgrading [22:14:56] does offsite image loading work [22:15:02] it doesnt seem to be working in preview mode [22:15:33] !externalimages [22:15:33] --mwbot-- To allow images from elsewhere to be included in your wiki, see . To limit this to some specific sources, see . [22:17:13] what file is that in [22:19:20] hi guys... [22:19:30] Duesentrieb: Why doesn't it say which file it's in? [22:19:50] !config [22:19:50] --mwbot-- All configuration is done in LocalSettings.php (near the end of the file). Editing other files means modifying the software. Default settings are not in LocalSettings.php, you can look in DefaultSettings.php. See , , , and [22:20:10] Duesentrieb: Oh so the variable isn't declared I have to enter it [22:20:15] is there a way to allow to view only the Main Page? No Special Pages at all? [22:20:26] !access | Alan762 [22:20:26] --mwbot-- Alan762: For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [22:20:44] !secrets | Alan762 [22:20:44] --mwbot-- Alan762: MediaWiki was not designed with read-restrictions in mind, and may not provide air-tight protection against unauthorized access. We will not be held responsible should private information, such as a password or bank details, be leaked, leading to loss of funds or one's job. [22:21:50] wtf, it still isn't working [22:23:47] i wrote this : [22:23:50] $wgGroupPermissions['*']['read'] = false; [22:23:59] I think images are disabled [22:24:05] $wgWhitelistRead = array ( "Main Page") ; [22:24:13] but i'm still able to see login page.. [22:25:33] Alan762: yes. that is the only exception (and hardcoded). how would *you* ever log in, if you can't see that page? [22:25:45] !images [22:25:45] --mwbot-- For instructions to use images in MediaWiki, see . For more technical details about image uploads, see and . Note that uploads are disabled per default (see !uploads). [22:26:14] Due, that wiki is just a common wiki, for upload images and for the homepage, nothing more... [22:26:27] I don't get it [22:26:34] I'm doing it right but it's displaying it as a link [22:26:40] Alan762: err? but you have to be able to log in somehow, no? [22:26:50] Alan762: a "common wiki" means everyone can see and edit everything. [22:26:56] [[Image:http://neoturbine.net/img/tango/32x32/categories/applications-development.png]] [22:27:16] stiv2k: single [ ] for external links, double for internal. [22:27:22] Duesentrieb: Oops [22:27:29] stiv2k: no Image: prefix for external links either [22:27:41] Duesentrieb: I don't want it to be a link, I want it to be an image [22:27:46] yes. [22:27:47] no matter [22:27:57] the image function is triggered by the file extension [22:28:02] (if you have external images on) [22:28:11] this is sucky and inconsistent [22:28:24] external images are a rather nasty feature. deprecated really. 
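The settings from the exchange above, tidied into a LocalSettings.php sketch. As noted, the login page stays visible because that exception is hard-coded, and a bare image URL only renders inline when external images are switched on; $wgAllowExternalImages is the relevant switch, assuming a stock 1.12-era setup.

    # Anonymous readers see only the whitelisted pages (plus the hard-coded login page)
    $wgGroupPermissions['*']['read'] = false;
    $wgWhitelistRead = array( 'Main Page' );

    # Render bare image URLs (http://example.com/foo.png) inline instead of as links
    $wgAllowExternalImages = true;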
[22:28:49] Due, i don't need to login in the common wiki, the common wiki is not going to be used by users....it's aim is just to allow common upload, images, ad homepage...the users of the others wiki use centralauth ... [22:29:28] ah [22:29:54] i'm afraid the login page will still be visible. when that rule was hardcoded, no one thought about centralauth i guess [22:29:59] what's the harm in that anyway [22:30:17] oh, and... if you want people to upload images, they should also be able to see the descriptions, etc [22:30:26] maybe change image info, organize stuff... [22:30:38] maybe someone needs to look at images to *delete* them?... [22:31:05] he point is: trying to hide it away is probably not a good idea [22:31:06] Duesentrieb: I see [22:31:26] Duesentrieb: Now to get the image and large text on the same line ;) [22:31:50] off to bed [22:31:52] have fun [22:32:49] How do I do it? [22:41:58] 03guyvdb * r37980 10/branches/visual_diff/phase3/includes/Diff.php: performance improvements and new LCS implementation [22:55:43] hello? it's my very first time in an IRC.. [22:56:13] I have a dubt about mediawiki, and i don't know if it's the best place (an hour.. ) [22:56:47] perhaps not.. ok.. i'll try tomorrow again [23:06:08] ok anyway i'll ask now too... does somebody know how to register readings of the people (one row for each person registered and page).. because i need to register every reading of every person, I mean, each page in the wiki that the people navigate.. and of course I don't know how, and i spent few hours few days and I cant get an extension or whatever to do it propertly [23:08:37] it would be easier to make a new table [23:08:52] of person title (and date/time) [23:09:04] and just insert a new row each time someone visits a page [23:10:18] mmm.. i see [23:10:25] so [23:13:11] my kwnoledges about php or wikis are.. quite poor (cause i never worked in it.. ) but I need it for my final project [23:13:40] it shouldn't be too hard [23:13:49] do you know anything about databases? [23:13:50] i'll study a little more.. [23:14:01] yes, i do [23:14:08] that's ok then [23:14:09] about DB i know [23:14:21] you just need to execute one SQL query in php [23:14:32] and you can do the rest wherever you are analysing this data [23:15:10] but where.. i have to "play" with the database? where is the point where i have to write my inserts and my stuffs.. is there a general place to do it for each page? [23:15:37] well all pages are handled by the same php [23:15:54] i like it :) [23:16:32] you will want to attach an event hook http://www.mediawiki.org/wiki/Hooks [23:16:51] ok.. i tried to not go inside php because it's not my project at all (it about social networking.. ) [23:16:58] ok.. i'm taking notes.. [23:17:42] probably to the OutputPageBeforeHTML Hook [23:18:13] which uses the globals $wgUser and $wgTitle [23:18:31] but i'll do because I need.. and last question. do you know some extension which makes it? perhaps in a more elegant way.. (because I have not so much faith in mine.. :) ) [23:18:47] many people seem to prefer logging stats offsite (via or javascript or such), can do plain text logging and aggregate it on demand [23:18:53] ok.. it's sounds quite easy.. [23:19:07] that's a good point [23:19:13] you could write it in javascript [23:20:18] thanks.. 
[23:21:11] if(wgUserName) addOnloadHook(webbug)
[23:21:29] function webbug() {
[23:21:36] var img = document.createElement('img');
[23:22:33] img.setAttribute('src','http://someserver.com/imagelogscript.php?username=' + wgUserName + '&page=' + wgPageName);
[23:22:44] document.body.appendChild(img);
[23:22:46] etc
[23:22:53] tomorrow I'll try.. and the day after.. if I'm not able to, I'll ask again.. and if I solve it I'll post it in the wiki.. even a little post about "how to take good stats to make social network stats with wikis"
[23:23:09] ok
[23:23:19] and in future, don't ask whether anyone is around - just ask the question
[23:23:30] thank you very much again
[23:23:58] is there a way to make a template for templates where you can [[Category:DataItem]]
[23:24:00] yes, sorry... I read it before but my fish-memory erased it too soon
[23:24:23] neurotope: I don't think so, I tried a while back
[23:24:39] so no way to include only, 2 pages down?
[23:24:55] I couldn't make one work
[23:24:57] in the old preprocessor you could split the tags...
[23:25:16] that's a hack, isn't it?
[23:25:31] clude> or something
[23:25:38] everything is a hack
[23:25:45] that didn't work for me
[23:25:52] but maybe I didn't do it right
[23:26:02] That's really cute... I'll play with it
[23:26:11] no warranty expressed or implied
[23:26:17] hurm
[23:26:23] no, I'm coming for you when it doesn't work
[23:26:24] stupid {{PLURAL}} not doing what it should :(
[23:26:48] heh
[23:28:07] only>
[23:28:44] used to might have worked too
[23:29:16] that's not fair, you can't use {{#tag: for noinclude
[23:29:40] grrr
[23:29:44] indeed
[23:30:44] there are 3 general styles of <> in wikicode: whitelisted html tags, xml parser tags, and transclusion magic... #tag only works with the xml parser tags
[23:31:02] I have a message in the form of: $2 {{PLURAL:$2|singular|plural}}
[23:31:22] and calling it with wfMsg but the output I get is: 1 plural
[23:31:36] which means I'm an idiot or I'm missing something
[23:31:40] this worked
[23:31:42]
[23:31:42] clude>[[Category:DataItem]]clude>
[23:31:42]
[23:31:53] haha
[23:31:56] hack
[23:32:01] yeah it is
[23:32:21] cirwin, there is your answer.... what a freakin mess :-)
[23:32:26] yay
[23:32:27] and my answer too
[23:32:33] now I have to remember why I needed it...
[23:33:06] how dirty we get our hands when we demand the tidy end user solution... now my code monkeys can just use the subst:
[23:33:12] bru ha ha ha ha
[23:33:33] *cirwin doesn't like subst
[23:33:44] but only 'cos it's frequently overused
[23:33:44] next I will make mediawiki download copyrighted material by sacrificing innocent kittens
[23:34:17] slippery slope, I know, but I guess my hands are dirty now
[23:34:22] thank you sooo much
[23:36:55] Hi guys, well, good news: I nearly have SUL / CentralAuth 100% set up. I just have one thing I've probably overlooked: do I have to modify the login page at all?
[23:43:47] 03(NEW) PHP/5.2.1 servers returning dual max-age headers on action=raw - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14902 trivial; normal; Wikimedia: General/Unknown; (herd)
[23:44:20] 03(mod) Unicode normalization "sorts" Hebrew/Arabic/Myanmar vowels wrongly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2399 (10ravi.chhabra)
[23:47:27] What do I do to install a mediawiki skin?
[23:47:34] Just place the files in the skins dir?
[23:49:31] They are pretty much either dead or dying at the moment......
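
On the {{PLURAL}} puzzle from earlier in the hour, the likely culprit is not the message text but how it is fetched: plain wfMsg() runs the magic-word transform before the $1/$2 parameters are substituted, so PLURAL sees the literal string "$2" instead of a number and picks the plural form, which matches the reported "1 plural". A sketch of the usual workaround, with a made-up message key 'mymsg' (whose text is: $2 {{PLURAL:$2|singular|plural}}) and illustrative variables $name and $count:

    # wfMsg() expands {{PLURAL:...}} while the text still reads "$2",
    # so the value compared against 1 is not a number: result "1 plural".
    $bad = wfMsg( 'mymsg', $name, $count );

    # wfMsgExt() with the 'parsemag' option substitutes the parameters
    # first and only then applies the transform, so PLURAL sees the real
    # count and "1 singular" comes out when $count is 1.
    $good = wfMsgExt( 'mymsg', array( 'parsemag' ), $name, $count );

The same explanation presumably applies to the later question about {{plural}} treating a 1 as not a 1.
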
[23:49:33] lol
[23:50:57] Help, I installed 2 skins and now my wiki is broken, I get Apache errors
[23:51:23] Well, uninstall them
[23:51:33] how do you "install" them anyway?
[23:51:39] I just put it in the skins dir
[23:51:57] !addingskins
[23:51:57] --mwbot-- I don't know anything about "addingskins".
[23:52:00] !skins
[23:52:00] --mwbot-- Overview: . Skin usage: . Gallery of CSS styles: . Writing your own:
[23:52:43] will that allow me to set it as a site default?
[23:52:53] err
[23:53:05] none of those explain installing skins
[23:54:09] either your skins are not compatible, or the file permissions are wrong, if I had to guess why you're getting Apache errors
[23:54:29] I removed the skins and it's still giving me errors
[23:54:32] *stiv2k frustrated
[23:54:50] [Wed Jul 23 19:53:52 2008] [error] [client 72.189.228.152] PHP Warning: dir(/home/neoturbine/mediawiki/mediawiki-1.12.0/skins) [function.dir]: failed to open dir: Permission denied in /home/neoturbine/mediawiki/mediawiki-1.12.0/includes/Skin.php on line 42
[23:54:55] [Wed Jul 23 19:53:52 2008] [error] [client 72.189.228.152] PHP Fatal error: Call to a member function read() on a non-object in /home/neoturbine/mediawiki/mediawiki-1.12.0/includes/Skin.php on line 45
[23:55:22] wtf
[23:55:48] looks like the permissions on the skins directory might have gotten changed
[23:55:57] yeah, but how
[23:56:09] Make sure that the www-user has read perms
[23:58:39] anybody got any ideas on why {{plural}} is treating a 1 as not a 1
[23:59:14] OMG
[23:59:19] the permissions got changed again
[23:59:32] *stiv2k looks at file-roller suspiciously.