[00:23:57] hi everyone [00:24:17] i need help configuring the thumbnails for images on my wiki page [00:24:33] the server it's hosted on has the passthru() function disabled [00:24:34] where are the links in the tabs made? Changing makelinkobj didn't seem to affect them. [00:25:38] so instead of the thumbnails appearing, an error message appears: [00:25:40] Error creating thumbnail:
[00:25:42] Warning: passthru() has been disabled for security reasons in /home/khsy/domains/khairul-syahir.com/public_html/wiki/ib/includes/GlobalFunctions.php on line 1839
[00:25:54] can anybody help me?? [00:26:25] i've tried asking the host to enable passthru(), but they seem to be reluctant as that would cause a security vulnerability [00:27:06] try exec? [00:27:37] if passthru is disabled, it seems unlikely exec would be allowed [00:27:52] silverks_: you need to use GD instead of imagemagick [00:28:31] ok, how do i change from ImageMagick to GD? [00:28:45] GD is the default, you should undo whatever you did to enable imagemagick [00:29:41] (probably, make sure $wgUseImageMagick is not set in LocalSettings.php) [00:29:55] ok, i'm commenting that now [00:30:46] ok, that fixed the thumbnailing for normal images [00:30:52] now i'm having one more problem [00:31:09] the thumbnailing for svg images still shows the error message [00:31:19] you can't render SVGs without an external program [00:32:36] i thought ImageMagick can do it, even though without great accuracy? [00:32:50] imagemagick is an external program [00:32:56] that's why it doesn't work when passthru is disabled [00:33:04] your host doesn't let you execute external programs [00:33:10] ah...i see.. [00:33:33] then, is there any workaround for this? 
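For anyone hitting the same passthru() wall: the fix arrived at above is just a LocalSettings.php change. A minimal sketch (assuming PHP was built with GD support, which is MediaWiki's default thumbnailer):

```php
# LocalSettings.php — fall back to the built-in GD thumbnailer so no
# external program (and therefore no passthru()/exec()) is needed.
$wgUseImageMagick = false;   # or just comment out the line that set it to true
# $wgImageMagickConvertPath = "/usr/bin/convert";  # unused once IM is off
```

As noted above, SVG thumbnails still require an external renderer, so they will keep failing on a host that blocks passthru().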
[00:33:44] like, installing svg renderer on cgi-bin for example [00:33:58] i did that for mimetex [00:35:52] 03dale * r39374 10/branches/MetavidWiki-exp/frontend_pages/index.php: auto complete search fix [00:45:20] by the way, thanks twincest for the help [00:45:32] at least i got the thumbnailing for the other images working fine now [00:48:55] 03dale * r39375 10/trunk/extensions/MetavidWiki/ (159 files in 20 dirs): (log message trimmed) [00:48:55] merged experimental branch: here is a draft of the official release notes: [00:48:55] == Unified Search == [00:48:55] * new unified search model groups and aggregates relevant semantic metadata per search [00:48:55] * advanced search improvements [00:48:56] * improved autocomplete suggestions/display [00:49:00] == Improved Skinning Support == [00:49:30] big merge committed ;) [01:16:34] 03(mod) List previously deleted pages as Recreated instead of Created ( N; New) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15131 (10N/A) [01:50:01] 03(mod) alphabetical order method for DynamicPageList - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14971 +comment (10mediazilla) [01:55:13] 03demon * r39376 10/trunk/phase3/ (4 files in 2 dirs): [01:55:14] * Bug 12976: Use $WebResponse->setCookie() rather than raw setcookie() calls. [01:55:14] * Moved all of the debugging/logic to WebResponse so it can be properly used elsewhere. [01:55:14] * A bit of cleanup so cookies set by $wgUser->setCookie() use $wgCookiePath as they should. [01:55:14] * Bug 14887: $wgEnablePersistentCookies has been added to allow for disabling of persistent cookies. 
[01:55:30] 03(mod) WebResponse->setcookie() should handle prefixes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12976 +comment (10innocentkiller) [01:55:38] 03(FIXED) WebResponse->setcookie() should handle prefixes - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12976 (10innocentkiller) [01:55:48] 03(FIXED) Configuration option to disable persistent cookies - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14887 +comment (10innocentkiller) [02:02:24] I get 404 errors when I try to get the short URLs to work (no root access)... I did exactly what they said in manual:short url, still get 404... Can someone help me? [02:02:44] try a different method? [02:03:25] well I don't know any other [02:03:27] ... [02:04:09] my mediawiki is hosted in a w/ directory, and i'm using some kind of fix for the php5 extension [02:08:56] no ideas anyone? [03:37:01] ???Hook InfoboxDataCapture::save failed to return a value, why does this happen with Infoboxdata.php???? [04:04:58] I tried to cancel my "away" status and xchat abruptly exited [04:05:10] probably segfaulted, that's not very nice, is it? [04:05:41] Chatzilla FTW :) [04:05:56] can anyone confirm Roan's story about the duplicate post to wikitech-l? [04:06:30] there's no duplicate in the archives or on gmane [04:07:55] i saw a duplicate release announcement on wikitech and on mediawiki-l [04:08:10] you mean two posts to each? [04:08:13] yes [04:08:19] maybe there's an alias or something [04:12:30] brion will know [04:22:12] gcc doesn't seem to behave much like the documentation [04:22:17] gcc -MMD file.cc produces no output [04:25:54] 03river * r39377 10/trunk/tools/switchboard2/: new directory [04:27:18] 03river * r39378 10/trunk/tools/switchboard2/ (17 files): switchboard 2. faster, more scalable and less crashy. [04:30:10] I'm having trouble configuring short URLs...
I followed the steps in manual:short URL but it doesn't work [04:31:58] 03river * r39379 10/trunk/tools/switchboard2/main.cc: compile fixes for linux/gcc4 [04:33:27] proletaire: which procedure did you follow? [04:33:59] The one with no root access: http://www.mediawiki.org/wiki/Manual:Short_URL/wiki/Page_title_--_no_root_access [04:34:47] my mediawiki is hosted in a w/ directory and I use some kind of script to use PHP 5 [04:34:59] hmm [04:35:11] what results are you seeing? [04:35:15] error 404 [04:35:23] can't find the page [04:35:45] have you confirmed .htaccess is enabled on your server? [04:36:10] I'm sure it is, I used it for converting php files to ph5 [04:36:27] *php5 [04:36:29] hmm [04:36:40] could you pastebin your current .htaccess? [04:37:42] one sec please [04:40:09] you got it? [04:40:32] yes - http://pastebin.com/ for future reference, BTW :) [04:41:03] sorry, didn't know about that pastebin thing [04:41:57] OK; where is your w/ folder located exactly? [04:42:13] is it in the document root, e.g. http://example.org/w/ ? [04:43:04] it's in the first directory (www.example.com/) while the rest is in www.example.com/w [04:43:10] as they said to do it [04:43:29] wait, what's in the root, exactly? [04:44:39] nothing except .htaccess and index.php that redirects to w/ [04:44:56] hmm, remove that index.php for one [04:45:02] .htaccess, as far as I know, affects all subdirectories [04:45:20] Uhm... just to note... [04:45:40] .htaccess stuff normally relies on the nonexistence of files [04:45:51] If you have a redirecting file it might skew things [04:46:13] Dantman: yes, that's why I suggested he remove index.php in his root :) [04:46:21] RewriteRule ^/*$ wiki/ [L,QSA] [04:46:23] ^_^ yup [04:46:28] covers it [04:46:33] eeek [04:47:24] null paths ftl... ^_^ action paths ftw!
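For reference, the no-root-access setup being debugged here boils down to two pieces. A sketch, using the /w + /wiki/ layout from this conversation (treat the exact rules as illustrative; the authoritative version is Manual:Short URL):

```apache
# .htaccess in the document root (MediaWiki itself lives in /w/)
RewriteEngine On
# /wiki/Page_title -> the real entry point
RewriteRule ^wiki/(.*)$ /w/index.php?title=$1 [PT,L,QSA]
# bare root -> main page
RewriteRule ^/*$ /w/index.php [L,QSA]
```

plus, in LocalSettings.php, `$wgScriptPath = "/w";` and `$wgArticlePath = "/wiki/$1";`.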
[04:47:40] deleted the redirect, didn't change a thing [04:47:54] proletaire: one thing to try; in the first RewriteRule, add a / before w/ [04:48:06] it's a php redirect like location: http://www.example.com [04:48:35] so it'll instead look like: RewriteRule ^wiki/(.*)$ /w/index.php?title=$1 [PT,L,QSA] [04:48:51] ^/wiki/ ? [04:49:14] Dantman: no, ^wiki stays [04:49:44] 'tis the way mod_rewrite works, it's confusing -_- [04:49:50] bleh [04:49:57] I'll stick with nginx [04:50:08] adding "/" almost worked... it seems that it now finds the page, but without the short URL... I'll look further into that [04:50:11] thank you [04:50:28] the URL is: elcq.net/w [04:50:44] Did you set $wgActionPath? [04:50:50] Ack no [04:50:54] $wgArticlePath [04:51:14] yep, $wgArticlePath should be all you need now. :) [04:51:39] articlepath is set, so is scriptpath [04:51:44] 03(NEW) User preferences: Date & Time - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15169 15enhancement; normal; MediaWiki: User interface; (AheunHana) [04:51:57] $wgScriptPath stays as /w [04:52:02] what's it set to? [04:53:18] w/ ... actually, it works now but the short URL redirects to the longer one... is there any way to keep that address short in the address bar? [04:53:38] Eeek... [04:53:44] norelatives [04:53:56] proletaire: $wgArticlePath = "/wiki/$1"; [04:54:32] set that right under $wgScriptPath = "/w"; [04:54:58] it is set like that [04:55:22] Bah...
then give out the right info when questions are asked [04:57:02] Thank you guys [04:58:27] 03river * r39380 10/trunk/tools/switchboard2/ (process_factory.cc process_factory.h version.h): [04:58:27] V-2.0.1 [04:58:27] re-implement missing features: [04:58:27] - kill old php-cgi processes after 30 seconds [04:58:27] - max-q-per-user config option [05:10:07] 03river * r39381 10/trunk/tools/switchboard2/ (request_thread.cc version.h): [05:10:07] V-2.0.2 [05:10:07] provide proper error messages instead of just closing the server connection [05:12:27] 03(NEW) Signature fails link to user_talk - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15170 15enhancement; normal; MediaWiki: User interface; (AheunHana) [05:17:56] 03river * r39382 10/trunk/tools/switchboard2/main.cc: ignore SIGPIPE [05:39:28] 03(NEW) New added pages aren't visible in the CategoryTree - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15171 major; normal; MediaWiki extensions: CategoryTree; (sebastian.zuercher) [05:40:09] Hi, are there any articles about setting up mediawiki with the default linux smtp server postfix? I installed mediawiki in openSUSE. Enabled all the mail options, but my mailbox still cannot receive any emails. btw, I didn't put any information about my mailserver. I assume mediawiki should use localhost:25 to send email by default. [05:41:01] correction: I assume mediawiki will use sendmail to send emails by default. I tried sendmail < mailbody.txt which works fine. [05:43:41] 03dantman * r39383 10/trunk/extensions/SemanticForms/ (12 files in 2 dirs): [05:43:41] Attempt fixing bug 14357. [05:43:41] Extension messages are now all loaded dynamically.
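On the Postfix question above: MediaWiki sends mail through PHP's mail() (typically sendmail) unless $wgSMTP is set. A sketch of the relevant LocalSettings.php knobs (the variable names are MediaWiki's real ones; the values are illustrative):

```php
# E-mail features in LocalSettings.php (values are examples).
$wgEnableEmail     = true;   # master switch for outgoing mail
$wgEnotifWatchlist = true;   # watchlist change notifications
$wgEnotifUserTalk  = true;   # user talk page notifications

# Optional: talk SMTP to the local Postfix instead of using mail():
$wgSMTP = array(
    'host'   => 'localhost',     # Postfix listening on localhost:25
    'IDHost' => 'example.org',   # domain used for message IDs
    'port'   => 25,
    'auth'   => false,
);
```

Note also that MediaWiki deliberately does not e-mail you about your own edits, which can make testing watchlist notifications confusing.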
[05:45:15] 03(mod) Semantic Forms short-circuts the new wfLoadExtensionMessages system and tests incorrectly - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14357 (10dan_the_man) [05:55:31] 03dantman * r39384 10/trunk/extensions/SemanticForms/ (16 files in 3 dirs): [05:55:31] Code style fixes: [05:55:31] * Remove remaining ?>'s [05:55:31] * Correctly indent Language classes [05:55:31] * Fix random leading spaces strewn around the code. [05:55:39] hi all [05:55:46] long time since ive been in here [05:58:12] question: on our categories page are many broken links [06:00:21] &action=edit [06:00:27] 03(NEW) Go button on Special:RecentChanges and Special:Watchlist - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15172 trivial; normal; MediaWiki: Page rendering; (MZMcBride) [06:00:34] is placed at the end which breaks it [06:10:36] 03(mod) Retiring the developer user group - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12569 (10N/A) [06:11:12] TimStarling: Could you comment on https://bugzilla.wikimedia.org/show_bug.cgi?id=12569 ? [06:11:47] 03dantman * r39385 10/trunk/extensions/SemanticForms/includes/ (SF_GlobalFunctions.php SF_ParserFunctions.php): [06:11:47] Finishing up applying MediaWiki's scalability standards to Semantic Forms: [06:11:47] * Move $wgExtensionCredits out of the way [06:11:47] * Autoload classes [06:11:47] * Use a StubObject for the $sfgFormPrinter global. Now we can autoload the SFFormPrinter object. [06:11:49] * Use ParserFirstCallInit so that SF parser functions are correctly applied to all parsers and they are offloaded. 
[06:19:23] 03river * r39386 10/trunk/tools/switchboard2/ (11 files): [06:19:23] V-2.0.3 [06:19:23] - support I/O timeouts [06:19:23] - support limiting the maximum request size [06:19:23] - don't try writing more than 64KB in one fcgi record [06:21:03] 03river * r39387 10/trunk/tools/switchboard2/util.cc: missing file [06:22:12] 03river * r39388 10/trunk/tools/switchboard2/request_thread.cc: fix I32LP64 compile [06:37:29] 03river * r39389 10/trunk/tools/switchboard2/ (fcgi.cc request_thread.cc request_thread.h version.h): [06:37:30] - nicer errors when a request fails [06:37:30] - remove debugging code [06:38:52] 03river * r39390 10/trunk/tools/switchboard2/ (21 files): svn:keywords, svn:eol-style [06:46:00] 03dantman * r39391 10/trunk/extensions/SemanticMediaWiki/ (16 files in 2 dirs): Autoload SMW_Language so that we do not need to include it in every single language file. [06:47:17] 03dantman * r39392 10/trunk/extensions/SemanticMediaWiki/includes/SMW_GlobalFunctions.php: *BOOM* Wrong filepath [06:55:45] 03dantman * r39393 10/trunk/extensions/SemanticMediaWiki/ (16 files in 2 dirs): [06:55:45] Undo my autoloading changes. [06:55:45] :/ SMW does direct inclusion of an arbitrary language file early on (even before [06:55:45] enable semantics is called where the AutoLoading is set [this is Poor Design [06:55:45] IMHO]) meaning that it is impossible to autoload the languages. [07:03:35] Hi all - can anyone help me hide tabs above pages (on monobook) for anon users not logged in? solution in FAQ at http://www.mediawiki.org/wiki/Help:FAQ#How_do_I_remove_the_article.2Fedit_etc_tabs_for_users_who_are_not_logged_in.3F does not work [07:03:46] Hi, I'm having some problems getting MediaWiki to cooperate with imagemagick. [07:04:06] When it tries to generate a thumb, I get this instead: [07:04:08] Error creating thumbnail: convert: unable to access configure file `colors.xml'. 
[07:04:21] (I've looked for that file; it doesn't exist, and I can't find anything in the docs about it). [07:04:37] convert: no decode delegate for this image format `/Users/Audacitor/Sites/wiki/images/e/e8/Xcodeicon.png'. [07:04:48] (delegates.xml does exist, however). [07:05:01] convert: missing an image filename `/Users/Audacitor/Sites/wiki/images/thumb/e/e8/Xcodeicon.png/180px-Xcodeicon.png'. [07:05:46] (I think this is essentially a 404; it couldn't find the thumb to display, which makes sense since the thumb was never generated). [07:07:22] 03dantman * r39394 10/trunk/extensions/SemanticCalendar/includes/SC_ParserFunctions.php: Use ParserFirstCallInit in Semantic Calendar [07:10:04] 03dantman * r39395 10/trunk/extensions/SemanticCalendar/includes/SC_GlobalFunctions.php: Autoload the only class in the extension [07:21:44] 03river * r39396 10/trunk/tools/switchboard2/ (6 files): [07:21:44] V-2.0.5 [07:21:44] - don't use lots of CPU doing nothing in main() [07:21:44] - a resource leak could occur if a failed failed. make it not occur. [07:38:51] 03siebrand * r39397 10/trunk/extensions/MetavidWiki/languages/MV_Messages.php: [07:38:51] * lower case [07:38:51] * revert removals [07:48:41] 04(REOPENED) Make Special:Watchlist look and behave like Special:RecentChanges - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15157 +comment (10siebrand) [07:52:07] <_wooz> lo [07:57:24] Duesentrieb, you were so kind to respond yesterday - i still seem to be stuck on mediawiki upgrade & character sets ;) [07:57:30] some pages have their titles mangled [07:58:01] that's bad... how did you import the dump? [07:58:04] i tried some tricks for mysql to convert tables/columns to utf8 (like converting it to binary first, to utf8 later), but that still failed [07:58:37] plain mysql import from commandline [07:58:51] mysql has ways to *convert* or to *redefine* - you definitely should NOT convert, because it's already utf8. [07:58:59] "plain" how exactly?
[07:59:09] as far as i understand, latest mediawiki with a recent mysql should work just fine with utf8 data types, and is in fact preferred [07:59:17] mysql < dump :) [07:59:41] and yes, converting to binary first was supposed to just redefine [07:59:57] then applying utf8 to it still fails with the mangled string & duplicate key error [08:00:51] latest mediawiki with a recent mysql should work just fine with utf8 data types, as long as you do not have 4 byte utf8 codes. which are still not supported by mysql. [08:01:52] it looks like mw has a mix of latin1 encoded utf and non-utf data... [08:02:05] it breaks on the Equipment/ string [08:02:09] if anything is encoded as latin1, it's broken [08:02:22] TimStarling: What do you think about rewriting replaceInternalLinks to use the preprocessor and work similar to braceSubstitution? [08:02:26] mediawiki does not use latin1 for anything, though it might declare some columns as latin1 for compatibility reasons [08:02:36] but it does not use latin1 since 1.5 [08:02:42] not for anything. ever. [08:02:49] yeah. i have a dump from 1.5.6 ;) [08:03:06] so if you see latin1 codes, it was mangled on export [08:03:18] don't rewrite replaceInternalLinks until I've committed my changes to it, ok? [08:03:29] oh heh... [08:03:58] Well... not like my local edits changed much there... [08:04:52] I'm trying to introduce a new concept... Link Hooks... A lot of extensions are starting to extend the link syntax... and normally it leads to a substandard implementation, or a regex that causes server errors. [08:06:33] Richlv: try using --default-character-set=latin1 with mysqldump as described in http://www.mediawiki.org/wiki/Backup - that's the only thing i can think of.
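The --default-character-set=latin1 trick recommended above, spelled out as commands (database, user, and file names are placeholders; --skip-set-charset additionally keeps the dump from re-declaring the connection charset on import). Dumping as latin1 makes mysqldump pass the stored bytes through untouched, so UTF-8 data living in latin1-declared columns is not double-converted:

```shell
# Charset-preserving dump and restore (names are placeholders):
mysqldump -u wikiuser -p --default-character-set=latin1 --skip-set-charset \
    wikidb > wikidb.sql
mysql -u wikiuser -p --default-character-set=latin1 wikidb_new < wikidb.sql
```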
[08:06:43] Duesentrieb, thanks, will try that [08:06:56] the most important thing with the parser is backwards compatibility [08:07:00] 03siebrand * r39398 10/trunk/extensions/intersection/DynamicPageList.php: (bug 14971) Added method for alphabetical ordering ('alphabetical') (patch by Ramac) [08:07:03] Richlv: also... if you converted from 1.4 previously, is it possible that that conversion didn't work correctly, and you have broken titles from that? [08:07:21] the syntax has to be precisely the same in every little edge case [08:07:37] 03(NEW) API Query Links not db agnostic - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15173 normal; normal; MediaWiki: API; (overlordq) [08:07:43] 03(mod) PostgreSQL/pgsql support (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=384 (10overlordq) [08:07:52] 03(FIXED) alphabetical order method for DynamicPageList - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14971 +comment (10siebrand) [08:08:01] Duesentrieb, as far as i know, it was converted from 1.4 - but titles work fine in the existing 1.5.6 installation [08:08:04] mhmm [08:08:50] obviously there are a few problems with that if you want to take the links into the preprocessor [08:08:59] Richlv: hm... well, try that latin1 trick then. [08:09:18] Richlv: fyi, here's the bug report on mysql not supporting full unicode: http://bugs.mysql.com/bug.php?id=14052 [08:09:22] Hmm... well maybe an offshoot [08:09:25] the preprocessor, after all, is called a preprocessor because it happens *before* something [08:09:31] will not be fixed before mysql 6, it seems. [08:10:25] TimStarling: Well, ParserBeforeStrip and all the other plaintext hooks still work fine even with your new preprocessor... there are no hooks working directly with links... [08:11:06] I'm not talking about hooks [08:11:07] Duesentrieb, thanks. what characters would require 4 byte utf8 encoding ?
[08:11:15] I don't mind breaking extensions, I just don't want to break wikipedia [08:11:26] Richlv: rare stuff. gothic, some obscure chinese... [08:11:51] Richlv: the problem is that people *can* input it. and if they do, bad things happen. [08:12:35] not sure if they'd just get stripped... when trying to import a dump with those chars, i had mysql fail on me before. [08:12:58] Well, when you preprocess something, you usually expand off that? I only see the difference being turning links into nodes then expanding rather than mucking with a pile of text... either way they should end up treated the same? [08:13:15] Richlv: according to the bug report, "utf8 truncates the string at the point the char appears" - which means data loss. [08:14:53] the difference is that preprocessing allows you to create links which didn't syntactically exist in the first place [08:15:17] hmmm? [08:15:21] {{openlink}} link {{closelink}} [08:15:34] where openlink = "[[" and closelink = "]]" [08:15:37] that's the simplest example [08:16:21] O_o I thought the preprocessor was created to stop that kind of stuff, not create it? [08:17:23] Duesentrieb, ok, we got relatively common cyrillic broken, so i guess it's not that issue :) [08:17:50] probably not [08:17:58] Well I guess I'm looking for something similar to the preprocessor then... just a small subprocessor for replaceInternalLinks [08:18:01] stop it? [08:18:14] Richlv: but... *some* non-english titles are ok, but others are garbled?... [08:18:28] no, the preprocessor was created to be exactly the same as a section of the old parser, except faster [08:19:12] Duesentrieb, i managed to break the db with alter table stuff ;), but i think all i saw were broken [08:20:39] did you verify the dump contained valid utf8?
[08:20:51] you can do that by running a utf8-to-utf8 conversion using iconv [08:21:28] Duesentrieb, at least non-english parts were ok, will run it through iconv like that right now [08:22:15] nope, that fails - so i guess it contains something fishy [08:24:40] even the dump created with --default-charset-whatever? [08:24:43] 03(mod) User preferences: Date & Time - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15169 +comment (10niklas.laxstrom) [08:24:45] that's bad... [08:26:32] Duesentrieb, no, i haven't yet gotten such a dump, the person who manages the db has not yet responded today (might be sleeping or something, i think we have different timezones :) ) [08:28:03] 14(INVALID) Signature fails link to user_talk - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15170 +comment (10lejonel) [08:31:15] TimStarling: ^_^ I think I love the current way the parser handles obscure links http://dev.wiki-tools.com/wiki/LinkHook#Old_Tests That works perfectly with my concept... [08:37:49] 03river * r39399 10/trunk/tools/switchboard2/ (6 files): [08:37:49] V-2.0.6 [08:37:49] - properly handle errors in the child php, should reduce the number of FastCGI protocol errors [08:37:49] - up the listen backlog a little [08:39:20] 03(mod) maintenance/updateSpecialPages.php fails with PostgreSQL - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14414 +comment (10overlordq) [08:46:42] TimStarling: Can you comment on the last testcase here? http://dev.wiki-tools.com/wiki/LinkHook#Old_Tests [08:47:20] I can understand not having a media embed within another media embed... but is killing both of them the optimal way to handle this here? [08:51:37] heyho, is it possible to include the text [[Category:Foo Bar]] verbatim in a page? [08:52:07] sure [08:52:15] [[Category:Foo Bar]] [08:52:15] ooor [08:53:05] No entry for terminal type "tty"; [08:53:05] using dumb terminal settings. [08:53:05] [[Category:Foo Bar]] [08:53:10] ...
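The utf8-to-utf8 iconv check used above makes a handy standalone validity test, since iconv exits non-zero as soon as it hits a byte sequence that is not legal UTF-8. A small self-contained demo (the file name is a placeholder for the real dump):

```shell
# Write a known-good UTF-8 sample, then validate it with iconv.
# (\303\251 is the two-byte UTF-8 encoding of "é".)
printf 'Equipment/\303\251\n' > demo.sql
if iconv -f UTF-8 -t UTF-8 demo.sql > /dev/null 2>&1; then
    echo "demo.sql is valid UTF-8"
else
    echo "demo.sql contains invalid UTF-8"
fi
# prints: demo.sql is valid UTF-8
```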
[08:53:17] ignore me :) [08:53:20] lol [08:53:22] cool, tnx :) [08:53:35] and sorry for being a lazy piece of sh*t :) [08:54:28] ^_^ If you're going to be lazy in the domain of links... at least give me a good testcase for stupid syntax that parser changes need to keep the same... [08:56:02] Dantman: that's the way it is [08:56:02] Dantman: Has anybody ever challenged you not to type '^_^' for a day? :) [08:56:12] heh [08:56:13] hmm, what about [[Category:Foo Bar|baz]] as a silly attempt to link to a category page? :) [08:56:19] [[/stupid_is as/stupid#does.|what he said]] [08:56:27] gmc: [[:Category:Foo Bar|baz]] [08:56:40] to link to a category, instead of using it, prefix the link with a colon, as I did in that example. [08:56:51] i knew that! [08:56:57] image link inside image link doesn't work [08:57:16] ie: Technical limitation... or... it should never work? [08:57:20] or was the problem the third nested link? [08:57:34] I know, don't embed an embed inside of an embed [08:58:18] I'm meaning, rather than [[Image:Filename.ext|[[Image:Filename.ext|[[link]]]]]] outputting the Image: stuff verbatim and the [[link]] being linked [08:58:39] you're asking hard questions [08:58:40] Can't the first Image: be an embed, the [[link]] a link, and the second Image: left verbatim [08:59:11] you have to do a lot of research if you want to change the parser [08:59:20] the research probably takes longer than the coding [08:59:42] it's not something I can just do for you on IRC [08:59:48] mhmm [08:59:58] well, I could, but it would take a long time [09:00:04] heh [09:00:14] yknow, Tim, when I helped ya debug the preprocessor rewrite I had insomnia for a month after? [09:00:20] and I never even touched the php, just the wikicode [09:00:26] Dan: avoid the parser, go do something safer [09:00:26] What are the pieces of things to research? [09:00:30] heh [09:01:46] Eeeekkkk... [09:02:01] http://dev.wiki-tools.com/wiki/LinkHook#Old_Tests 2nd to last...
[09:02:02] Research means looking at how things are used. [09:02:07] before you can change the parser, you have to know what the syntax is [09:02:11] Right... I believe that was a bug report [09:02:12] so that you know whether you're changing it or not [09:02:41] you have to know how the current parser works and why it's written that way [09:03:14] has anybody else noticed quite how odd the inheritance of Skins is? [09:03:59] ^_^ Which is why I'm asking... is the screwed up handling of nested embeds on purpose, or due to technical limitations? [09:04:11] Werdna: you mean SkinTemplate? [09:04:24] Dantman: when would you like an answer by? [09:04:35] heh... dunno [09:04:39] bah... i'm afraid my skin revamp has grown stale... need to redo that :/ [09:04:59] Time to track down link related bugzilla reports [09:05:27] I recall there was one directly on this screwed up image syntax... ^_^ I believe they wanted it fixed... rotfl [09:06:14] TimStarling: well, it looks like Skin inherits from SkinTemplate, which inherits from Linker [09:06:46] no, SkinTemplate inherits from Skin [09:07:00] class SkinTemplate extends Skin { [09:07:22] ah, so skins themselves inherit from SkinTemplate, which inherits from Skin, which inherits from Linker? [09:08:37] SkinTemplate should actually be called TemplateSkin [09:08:51] the misnomer has been bugging me for years now [09:09:13] the odd thing is that SkinTemplate doesn't even use a template [09:09:34] well, ideally we'd have a separate Linker class, loaded on demand from $skin->getLinker(), and a single parent class to inherit from to define a skin.
[09:09:41] so I think SkinTemplate should just not exist at all [09:10:05] right, a single skin class which you inherit from [09:10:11] I'm not sure why we have Linker [09:10:33] domas introduced that one, iirc, he never really explained it to me [09:10:56] btw tim, somebody just asked me about disabling Special:CentralAuth / Special:GlobalGroupPermissions / Special:GlobalGroupMembership everywhere except meta, to make sure logging is in the same place. [09:11:20] not right now [09:11:40] have to fix the parser [09:14:35] 03mkroetzsch * r39400 10/trunk/extensions/SemanticMediaWiki/ (README includes/SMW_GlobalFunctions.php): Modified parser function registration as per DanTMan's suggestion + added Daniel to the contributors' list [09:15:42] http://xkcd.com/463/ <--- and it's *true*! [09:20:03] what's that function I use to clean up a comment? [09:20:38] skin->commentBlock( [09:20:39] that's the one [09:29:48] hello all [09:36:29] 03(mod) maintenance/updateSpecialPages. php tries to INSERT invalid values - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14414 summary; +comment (10overlordq) [09:49:06] !bugzilla [09:49:06] --mwbot-- All bugs in MediaWiki should be reported at https://bugzilla.wikimedia.org. This is also the place to request site configuration changes, new features or enhancements to existing features, although bear the following points in mind before making a feature/enhancement request: 1) If the request is specific to a Wikimedia wiki, please discuss the issue on that wiki first. 2) Consider whether a custom extension would be more appropriate. [10:02:45] TimStarling: wasn't me [10:03:00] webresponse is me [10:03:01] webrequest is me [10:03:10] sorry domas [10:03:17] linker isn't me! :) [10:03:18] thats ok! [10:03:24] people blame me everywhere! [10:04:22] with great ego comes great culpability [10:07:58] *Werdna throws rocks at Splarka [10:08:42] I am a billion times more modest than you! 
haha [10:08:48] [10:09:02] In fact, I'm the LEAST competitive, so I WIN. [10:18:34] *Splarka pouts, moves to Penultimateville. Population: 1 [10:29:13] > $t = microtime(true); foreach ( explode( "\n", $test ) as $elt ); print microtime(true)-$t [10:29:13] 1.2810978889465 [10:29:13] > $t = microtime(true); foreach ( new ExplodeIterator( "\n", $test ) as $elt ); print microtime(true)-$t [10:29:14] 9.4570820331573 [10:29:26] is that cool or what? [10:29:56] eh? [10:30:04] PHP only 7.4 times slower than C, and uses 200 times less memory [10:30:41] they both do the same thing, but the ExplodeIterator class (which I just wrote), doesn't store elements as a hashtable [10:30:47] it just iterates through the string using offsets [10:33:31] > $t = microtime(true); substr_count($test, "\n"); print microtime(true)-$t [10:33:31] 0.031369924545288 [10:34:03] see, I can use substr_count(), which is insanely fast, to determine whether to use explode() or ExplodeIterator [10:34:18] and so choose the optimal trade-off between size and speed [10:34:29] awww [10:34:31] that's cute [10:34:38] foreach ( StringUtils::explode("\n",$lines) as $line ) { [10:34:59] hm [10:35:03] why isn't there some extension that adds memory-efficient algorithms and datatypes to php? [10:35:03] I can use this code [10:36:18] TimStarling: ^_^ A third preprocessor? Could you also come up with a method of editing the internals which suffer from utter code duplication... [10:36:45] Don't you wish PHP had multiple inheritance [10:36:45] who's writing preprocessors? [10:37:07] what's that? what's this little project for then? [10:37:22] this is reducing memory usage of existing parser code [10:37:35] I had another way to do it, but this is much better [10:37:38] heh... you mentioned hashtable and I thought PP_Hash [10:37:53] ^_^ PP_String [10:38:27] PP_parrot [10:38:40] I was especially led to that since you mentioned C and you mentioned you were going to write a C preprocessor.
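The offset-walking idea behind ExplodeIterator is easy to sketch with a generator. This is an illustration of the technique, not MediaWiki's actual StringUtils code; the function names and the threshold are made up:

```php
<?php
// Lazily split a string: walk it with strpos() instead of materializing
// every piece up front the way explode() does. Slower per element, but
// memory use stays constant no matter how many pieces there are.
function explode_lazy( $delim, $subject ) {
    $pos = 0;
    $len = strlen( $delim );
    while ( true ) {
        $end = strpos( $subject, $delim, $pos );
        if ( $end === false ) {
            yield (string) substr( $subject, $pos );  // last piece
            return;
        }
        yield substr( $subject, $pos, $end - $pos );
        $pos = $end + $len;
    }
}

// The substr_count() trick from the log: counting delimiters is cheap,
// so use it to pick explode() for small inputs and the lazy version for
// big ones (the threshold here is an arbitrary illustration).
function explode_adaptive( $delim, $subject, $threshold = 1000 ) {
    return substr_count( $subject, $delim ) < $threshold
        ? explode( $delim, $subject )
        : explode_lazy( $delim, $subject );
}
```

Both paths yield the same pieces in the same order, which is what makes the adaptive dispatch transparent to the caller.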
[10:41:33] TimStarling: that's pretty cool. [10:41:42] an 8-fold improvement. [10:44:28] not exactly [10:44:50] explode() is what we use now, it's 8 times faster than ExplodeIterator but uses 200 times more memory [10:44:56] it's a trade-off [10:45:18] how about doing it in batches? [10:45:44] that would work too [10:45:55] but the timing difference is actually really small [10:46:24] that 8x difference is only a few microseconds per loop, and we don't have any loops that tight [10:46:33] 03guyvdb * r39401 10/branches/visual_diff/phase3/includes/ (Diff.php DifferenceEngine.php): New diff implementation [10:46:40] oops, I should read :P [10:46:54] so by using explode(), I'm actually only saving a tiny amount, as a percentage [10:47:00] 03guyvdb * r39402 10/branches/visual_diff/phase3/ (includes/HTMLDiff.php skins/common/diff.css): bugfixed and improvements for the HTML differ [10:47:00] and it's not really worth implementing more complex solutions [10:47:31] TimStarling: so memory was being sucked up by explode()? [10:47:53] yes [11:08:26] TimStarling are numbered tables supported by mediawiki? [11:08:35] I want to have number indexes visible on tables [11:08:40] like how # lists work [11:09:28] no [11:12:11] *Splarka grins [11:12:34] White_Cat_Zzz: is anyone else allowed to answer? or did you ask T*m only and specifically? 
[11:12:56] anyone can answer [11:13:08] I just saw tim active [11:13:16] answer -> no [11:13:19] *Splarka just wanted to do that [11:13:29] (actually, you can probably do it in CSS2: http://www.w3.org/TR/REC-CSS2/generate.html#counters ) [11:15:54] but that doesn't work in IE6 or 7, and only in later Opera versions (partially buggy) [11:16:03] I actually have a way [11:16:06] [ ] linkage [11:16:33] http://meta.wikimedia.org/wiki/Steward_requests/Username_changes#Ownbot_.40_many_wikis [11:20:02] lame ^_^ [11:20:15] :P [11:20:35] Splarka I am also looking for a way to rename my accounts on wikis that have been closed [11:21:05] interfering with merging accounts? [11:21:54] I need an account rename first [11:21:57] then I can merge it [11:22:28] I have identified 11 bot accounts and a few others [11:23:18] I am also looking for a way to rename my accounts on wikis that have been closed <- ............................................................................. [11:23:36] sounds like a question for #wikimedia-tech [11:23:46] they are closed, forget it [11:23:47] I am banned from that channel by Domas [11:24:22] DarkoNeko if I needed your advice, I'd ask for it [11:24:35] hai, hai [11:24:50] that never prevented me from giving it anyway ^_^ [11:25:35] 03(mod) @ (at-sign) needs to be totally invalid for usernames - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12581 (10maba-mailings-bugzilla-wikimedia) [11:25:43] DarkoNeko when I do ask for your advice, help, or expertise I generally get a mildly offensive remark in return [11:25:50] I also really am not in the mood OK? [11:26:38] *DarkoNeko yawns [11:27:09] *DarkoNeko goes preparing breakfast, then [11:28:38] 03jojo * r39403 10/trunk/extensions/Collection/collection/collection.js: turn cursor into wait cursor during loading of collection [11:35:19] 03aaron * r39404 10/trunk/extensions/CheckUser/CheckUser.php: remove comment [11:35:38] Splarka btw [11:35:44] are bot edits hidden from watchlists?
[11:35:55] as in bot page moves [11:37:18] sounds very doubtful, but log events showing workably on watchlists is sorta new (last few months), better test it [11:37:24] hi, I installed mediawiki in linux. i am using default mail server setting, thus it's using sendmail right now. I tried the feature "E-mail this user", it works perfectly. But my watchlist sending mail dosen't work. I enabled "E-mail me when a page on my watchlist is changed". and in LocalSetting.php, I make wgEnableEmail, wgEnotifWatchlist UserTalk = true. Any ideas why it doesn't work? [11:38:01] Splarka can you check please? [11:38:38] no [11:40:29] 03guyvdb * r39405 10/branches/visual_diff/phase3/ (276 files in 15 dirs): merge with trunk [11:41:04] White: you should ask your questions less specifically, especially when changing topics [11:41:11] It this the right workflow: merging trunk with branch, comitting branch, copying branch to trunk and comitting trunk? [11:41:41] (instead of just asking the last person who talked) [11:46:33] please ignore my previous question. I found out why. it will not send email if the person edit the watched page by himself. how stupid I am . [11:46:49] I spent a whole day to find out why... [11:57:05] would be nice if URLENCODE http://meta.wikimedia.org/wiki/Help:Parser_function#URLENCODE didn't encode slashes. anyway to remedy this (like replacing the encoded slashes back with slashes ? [12:06:06] damn, too much duplicate information outdating itself on meta and mw [12:07:49] exobuzz: string functions :) [12:08:52] *Splarka stabs werdna, gently [12:08:58] ow [12:09:35] exobuzz: what is your specific application? 
there may be other workarounds [12:11:17] I mean, for example, if it is a local url, you can use fullurl instead: [12:11:32] 03(ASSIGNED) API Query Links not db agnostic - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15173 +comment (10roan.kattouw) [12:11:34] a download template "[http://files.exotica.org.uk/?file=exotica/{{urlencode:{{{file|}}}}} {{{name}}}] (ftp)" where you pass it a path to a file. [12:12:08] it works fine, but its much more readable on apache logs etc with / [12:12:16] apache logs... [12:12:21] heh [12:12:26] well. they get parsed by awstats [12:12:50] also it looks better on the browser when a user hovers over a link [12:13:01] its more obvious what they are downloading etc [12:13:53] my redirect script also url encodes to pass to an ftp site - but not slashes because encoded slashes dont work for ftp etc [12:14:41] or at least i had problems with encoded slashes. but anyway, i just want it for aesthetic reasons in this case [12:14:42] :-) [12:15:14] (also slashes dont generally need to be encoded) [12:16:12] which is more readable http://files.exotica.org.uk/?file=exotica/media%2Faudio%2FUnExoticA%2FGame%2FHuelsbeck_Chris%2FTurrican_3.lha or http://files.exotica.org.uk/?file=exotica/media/audio/UnExoticA/Game/Huelsbeck_Chris/Turrican_3.lha [12:16:14] for example [12:16:35] http://www.bash.org/?4281 [12:17:20] well, I believe encoding slashes in a URI component is standard [12:17:36] so actually that should be exotica%2Fmedia [12:18:27] go to google.com put in a slash, and hit enter, should show as q=%2F [12:18:43] you will find lots of examples where it breaks stuff too [12:19:01] but i never said it wasnt the proper way. i know according to rfcs it is [12:19:06] but i want slashes [12:19:07] ;-) [12:20:02] you might have to make your own parser function [12:20:07] also on php urlencode there are lots of references to it [12:20:17] ok.. 
then i guess i will do that [12:20:18] or use another bouncer that lets you rectify them [12:22:58] is it too specialised to get into the core code do you think if i submit it ? perhaps urlencode could have an include/exclude list [12:23:27] Splarka: [13:23] is it too specialised to get into the core code do you think if i submit it ? perhaps urlencode could have an include/exclude list [12:24:06] um... [12:25:18] also i wonder if urlencode should be using phps rawurlencode rather then urlencode but anyway [12:25:41] sigh [12:25:49] what? :) [12:25:51] *Splarka stabs self and then suggests StringFunctions to you [12:26:15] ive added my own parserfunction now.. so ill use that for the time being [12:27:12] https://bugzilla.wikimedia.org/show_bug.cgi?id=8254 [12:28:12] yup. there are some problems if you link to a subpage but urlencode the subpage slash etc too [12:28:22] closest bug I can find [12:28:31] it is apache's fault [12:29:38] i made "urlencodenoslash" [12:29:39] :-) [12:30:06] mmm... hmm [12:30:21] a cog just clicked in my brain [12:31:10] maybe i would be better adding a "replace" parserfunction [12:31:10] you mostly want to remove spaces, right? [12:31:13] Splarka I just checked [12:31:15] and combining it [12:31:19] bot moves are shown [12:31:20] i want to remove %2F [12:31:22] or change them to + or _ or such? [12:31:24] no [12:31:27] I mean when urlencoding [12:31:28] it should be hidden by default [12:31:32] your main problem is spaces? [12:31:36] 03siebrand * r39407 10/trunk/extensions/MetavidWiki/ (4 files in 4 dirs): [12:31:36] * relevent -> relevant [12:31:36] * update crappy formatting of files I touched for previous issue (end of line whitespace, replace ^SPACETAB with ^TAB, remove "?>" at end of file) [12:31:39] spaces and other non-ASCII
chars [12:31:43] ahh [12:31:48] well, why not add an interwiki link [12:31:53] and then use fullurl to it [12:32:04] try {{fullurl:rtfm:foo bar/baz}} on wikimedia [12:32:05] 03werdna * r39408 10/trunk/extensions/GlobalBlocking/GlobalBlocking.i18n.php: Add some missing messages [12:32:15] i used to use an interwiki link instead of my download template and there were other problems [12:32:24] but i cant remember what now [12:32:24] rtfm -> ftp://rtfm.mit.edu/pub/faqs/$1 [12:32:34] did you use fullurl: ? [12:32:44] can't remember. perhaps not [12:33:05] it does some peculiar things that might work to your advantage [12:33:10] can interwiki links handle multiple parameters ? [12:33:23] sorta [12:33:28] it just appends them after the $1 [12:33:32] {{fullurl:rtfm:foo bar/baz|action=foo}} [12:33:35] ftp://rtfm.mit.edu/pub/faqs/foo_bar/baz?action=foo [12:33:37] aah k [12:33:55] if only i could remember why it didnt work.. i might try again [12:34:10] note that you have to urlencode the additional parameters, but not the first one [12:34:29] 03(NEW) Watchlist - hiding bot moves - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15174 15enhancement; normal; MediaWiki: General/Unknown; (wikipedia.kawaii.neko) [12:34:31] {{fullurl:rtfm:{{{1}}}|action={{urlencode:{{{2}}}}}|}} [12:34:33] the problem was an encoding problem i remember. aha i remember now [12:34:39] it used to convert spaces to _ [12:34:44] will fullurl stop that ? [12:34:54] no, it'll probably continue to do that [12:35:02] that is no good then [12:35:05] but who cares, _ and / are cleaner than + and %2F ^_^ [12:35:06] *Splarka hides [12:35:18] you want to rename 370,000 files on my ftp ? [12:35:18] :-) [12:35:23] hey, you want visual prettiness over functionality [12:35:27] so yes I do [12:35:30] hehe [12:35:31] that or live with %2F [12:35:45] 03(mod) Please add cross-wiki talk page notification. 
- 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1066 (10wikipedia.kawaii.neko) [12:36:02] i think i will rename my urlencode parser function splarkaencode [12:36:11] as an tribute [12:36:14] hehe [12:36:20] meanie [12:36:29] besure to octothorpe it [12:36:45] that went over my head. [12:36:51] what is octothorpe ? [12:36:53] (hash, pound, numbersign, sharp) [12:37:02] octothorpe sounds sexier though [12:37:14] like circumflex sounds better than rabbitfood [12:37:30] http://en.wikipedia.org/wiki/Number_sign#Other_names_in_English [12:37:31] ok. so, how do you mean though ? [12:37:43] {{#splarkaencode}} instead of {{splarkaencode}} [12:37:51] oh i see [12:38:10] I won't stand for no possibly-ambigouous-template-namesake-function! [12:38:14] well ive been naughty anyway, and stuck it in the core code. ;-) i should make it an extension [12:41:22] hi [12:44:17] 14(WFM) Watchlist - hiding bot moves - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15174 +comment (10JSchulz_4587) [12:51:21] can someone do this for me? [12:51:21] $ php parserTests.php --regex 'category in language variants' [12:51:33] I'm having trouble believing my eyes [12:52:51] hello. [12:53:10] TimStarling: the category link is not generated [12:53:20] it's a fail, right? [12:53:20] i'm getting an 'error creating thumbnail' message that I can't figure out [12:53:49] I had it recorded as an expected pass [12:53:50] TimStarling: yes. [12:53:52] 'convert' is in the right path [12:54:00] not sure that happened since the test is obviously broken [12:54:13] joss_winn, check 1) is imageMagick or GD ar correctly installed 2) is there is the correct access right on that folder [12:54:22] TimStarling: dunno anything about that stuff, i just reporduced like you asked :) [12:54:25] thanks Duesentrieb [12:54:43] i'm off, bbl [12:55:21] DarkoNeko, the folder permissions are fine. 
To test I've chown -R apache:apache to everything in 'images' [12:55:39] hmm [12:56:28] have you altered the folder choice in your LocalSettings ? [12:56:36] DarkoNeko, i think imageMagick is ok, too, as 'convert' is installed in /usr/bin and LocalSettings.php is looking there for it [12:57:20] DarkoNeko, i can upload and display an image OK. It just has to be full size. Entering 250px for example, produces an error [12:57:54] and it says permission dened in the error creating thumbnail messsage? [12:58:20] it says 'error creating thumbnail' [12:58:36] in the grey square where i'd expect to see a thumbnail [12:59:04] hmm, sorry, I don't know then :) [12:59:09] Hi all. I'm running MW 1.11.1 and I wish to upgrade to 1.13. I've backupped everything (sql+files), what I need to do now is to upgrade all the new version files overwriting old ones? [12:59:20] that's odd, there's usually a error saying why like permision denied, etc [12:59:36] Shown, that should be explained/detailed at mediawiki.org [12:59:51] i'm reading :( [12:59:56] the code is: [[Image:picture.png|200px]] [13:00:27] if I remove the |200px, it displays OK full size. but adding the |200px gives a grey square 200px wide with that error message [13:01:50] *shrug* I dunno either, the error message normally includes a reason why [13:03:08] OK. Thanks OverlordQ & DarkoNeko [13:45:22] latest installment of DeleteQueue screenshots: http://en.wikipedia.org/wiki/User:Werdna/dev3 [13:53:52] 03(mod) Please add cross-wiki talk page notification. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1066 +comment (10Simetrical+wikibugs) [14:23:15] after upgrade wiki started going extremely slow [14:23:19] and LocalSettings.php is the same of version 1.11.1 even if now i've 1.13 [14:23:21] wtf??? [14:25:05] localsettings.php did not update when running "php update.php" [14:26:05] Shown, it's not supposed to. [14:26:13] Those are your customized settings that apply across versions. 
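Back on exobuzz's {{urlencode:}} thread from earlier: the whole disagreement reduces to which characters sit in the encoder's "safe" set. A Python illustration (the function names are invented here; MediaWiki's parser function itself goes through PHP's urlencode(), which additionally turns spaces into '+'):

```python
from urllib.parse import quote

def urlencode_strict(component: str) -> str:
    # RFC-style component encoding: '/' becomes %2F, space becomes %20.
    return quote(component, safe="")

def urlencode_noslash(component: str) -> str:
    # Same, but keep '/' literal so paths stay readable in Apache logs
    # and in the browser's link-hover tooltip.
    return quote(component, safe="/")
```

A hypothetical {{#urlencodenoslash:}} parser function would compute the second form, giving the readable .../media/audio/... links exobuzz wanted while still escaping spaces and accented characters.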
[14:26:36] The wiki should not be going any slower in 1.13 than in 1.11. If you can figure out what's making it slower, please say so so we can look into it. [14:26:37] may that the cause of a terrible wiki slow down? [14:27:47] it shouldn't [14:28:09] i did not find any particular instruction to upgrade from 1.11.1 to 1.13 so after db/fs backup, I overwrite all wiki folder and run "php upgrade.php", without errors on the shell [14:28:35] then looked at Special:Version page and it tells 1.13 as supposed [14:29:25] what's the php version, are you using mysql, postgresql and what version of that? [14:29:41] * PHP: 5.1.2 (apache2handler) * MySQL: 5.0.19-standard [14:30:01] looks good enough to me [14:30:17] are you using any extensions? maybe some need to be upgraded too [14:30:24] no one [14:30:47] in addition to the general wiki slowness, i noticed that all custom templates disappared [14:32:29] oh better, they're not working as in 1.11 [14:34:13] hi how are you people? [14:34:24] how can I put code now? [14:34:32] it used to be [html] [14:34:46] but now it needs to be on a different syntax, anyone can put it? [14:34:57] maybe http://www.mediawiki.org/wiki/Release_notes/1.13 can help you figure it out [14:35:11] Shown that is [14:35:27] i think tha tshould read also 1.12 RN [14:35:27] can u just tell me here [14:35:46] ??? [14:37:33] JZA its with the SyntaxHighlight GeSHi extension [14:37:46] darkcode did not find anything helpful on 1.13 and 1.12 RN [14:37:46] right... source [14:38:27] i was just running a clean 1.11.1 install [14:39:05] Shown did you download the 1.13 version using svn? [14:39:20] no a tar.gz from the site [14:40:06] darkcode: how can I get sub categories? 
[14:40:20] I've been looking all over the documentation but no luck [14:40:39] seems [:Category: text] [14:40:47] but it doesnt really work [14:41:18] subcategories are just categories where its been included from another category like Category:A might include [[Category:B]], so than B is a subcategory [14:41:59] how can I include one in another? [14:42:08] just edit the Category:A page? [14:42:21] Shown, maybe a bad tar? Other than that I got no other ideas [14:42:37] hm [14:42:37] yes [14:42:40] uh [14:42:46] i think i find out what the problem is [14:43:21] i remember that when i first installed the wiki, in order to make some template work as expected, i needed to add some things on some css files [14:43:33] yes JZA just edit the Category:A page [14:43:36] and those were def ones [14:43:49] so surely they we're lost to overwrite [14:44:35] Shown: if you added them to MediaWiki:Common.css for instance they shouldn't of been overridden [14:45:09] 03(NEW) JavaScript Regexp syntax highlighting does not support literal strings - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15175 15enhancement; normal; Wikimedia: wikibugs; (manishsmail) [14:45:40] no darkcode [14:45:48] i modified .css files [14:46:02] is there a possibility to add CSS infos in sql db? :s [14:46:08] Shown: than yes they were overridden, you aren't suppose to need to edit the css files [14:46:32] that's great, so where do i have to put additional infos? 
[14:46:35] Shown: you can edit the wiki pages MediaWiki:Common.css to customize css [14:46:41] ok [14:47:09] and the wiki page MediaWiki:Common.js if you want to add javascript [14:47:22] tnx i'll try [14:47:23] there are also pages for each skin if you want changes to only effect certain skins [14:47:43] like MediaWiki:Monobook.css and MediaWiki:Monobook.js [14:48:30] users can also add there own changes with User:Shown/monobook.css for instance [14:48:37] lol [14:48:43] great [14:49:05] working on .css files was a real pain in the ass [14:49:32] ya if you had down it in the wiki to begin with, would not of had that problem [14:49:40] when you upgraded [14:50:06] darkcode i've differences on the following files: [14:50:15] MonoBook.php, common_rtl.css, shared.css [14:50:23] the wiki page Special:Allmessages list all messages that can be customized on wiki without changing any files [14:51:16] I see, well those you can't customize with a wiki page [14:51:31] so you would need to back those up when upgrading [14:52:00] ok [14:52:12] the wiki now is not slow anymore [14:52:45] anyhow, dunno if there are somethings to add in LocalSettings.php since it's the old one the one i'm using now [14:53:01] you shouldn't really need to change shared.css though since MediaWiki:Common.css will effect everyone the same way, no alternative for common_rtl.css though [14:53:41] dunno what common_rtl.css stand for, maybe i just need common.css [14:54:07] it meant for languages written from right to left, rather then left to right [14:54:31] ok [14:54:48] so unless your using a right to left language, it won't have any effect [14:55:13] nice to know [14:55:32] so yes common.css and for that you don't need to change either since MediaWiki:Common.css will effect everyone [14:56:38] you should only need to edit MonoBook.php if your trying to change the elements on the page, rather then css styles [14:56:40] Someone has vandalized my MW 1.12 site so that every page says "Viewing disabled", 
with a long message -- does anyone know quickly what the vandal might have done to accomplish this? [14:57:15] I'm guessing a system message or something, but I don't even know how to re-enable viewing (yet)... [14:58:21] It doesn't seem to be a system message. [14:58:53] darkcode your help was really useful and appreciated, much thanks, all seems to work as expected [14:58:57] hello [14:59:10] TheWoozle: they shouldn't be able to edit system messages, unless the vandal had sysop rights, or you used a template within a system message that anyone could edit, and I'm not aware of any system message that can be changed to disable viewing [14:59:20] Right. [14:59:28] I'm grasping at straws here. [15:00:00] I'm hoping they didn't break in via FTP... [15:00:14] I'm not seeing any evidence of that yet, but I can't figure out how else they could cause this effect. [15:00:18] the only thing that comes to my mind is if set Special:Userrights up to allow anyone to add and remove rights from anyone [15:00:22] cya all, thanks again guys [15:00:33] yw Shown [15:01:51] and you had set up the wiki to require certain rights to be able to edit pages TheWoozle [15:02:10] or view pages rather [15:02:17] why would they disable editing if they compromised your account? [15:02:30] They haven't disabled editing, just viewing. [15:02:44] Or, rather, the *message* says they did this... and it seems to be on all pages... [15:02:58] ...and editing the page shows different content than what's actually displaying. [15:03:04] issuepedia.org [15:03:13] 03(mod) JavaScript Regexp syntax highlighting does not support literal strings - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15175 (10raimond.spekking) [15:03:54] hello guys... [15:03:58] cute [15:04:20] just upgraded my wiki to latest version and something went awfully wrong, anyone could help me straighten things out? [15:04:51] TheWoozle: aaah [15:04:53] now I get it [15:05:03] Oh? Do tell! [15:05:21] I'm still floundering. 
[15:05:38] I only see it on the main page [15:06:05] TheWoozle: it in some of your articles [15:06:24] http://issuepedia.org/Template:Data.pair is the source of your problem [15:06:24] So it isn't *really* disabled viewing... [15:06:27] Ah. [15:06:41] http://issuepedia.org/LOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOL_HAHAHAHAHAHAHAHA_AL_GORE_GIVES_GEORGE_BUSH_AND_BARACK_OBAMA_GOOD_BLOWJOBS this one [15:06:44] probably [15:07:05] Damn, I knew I should have protected those templates. [15:07:50] what I did when I noticed it wasn't on all pages was looked at the included templates [15:07:50] Yep, that's it. I knew I was panicking... [15:07:53] until I found which one [15:07:57] Right. [15:08:18] http://issuepedia.org/index.php?title=Template:Peppers&action=edit another one [15:08:20] FWIW, this is probably the same guy who has been doing the Brian Peppers Day vandalism. [15:08:45] I don't know how widespread that is, but I've been contacted by one other wiki owner about it. [15:08:51] yet another http://issuepedia.org/index.php?title=Template:Woozle.init&action=edit [15:08:52] Now to undo all the crap... [15:09:27] may want to check all of this: http://issuepedia.org/Special:Contributions/I_liek_mountain_dew [15:10:33] Yeah. Getting to that. [15:10:48] I saw that name in my feed and figured he *probably* did everything... [15:17:03] <^demon> TheWoozle: Tried to help out by reverting a few of his contribs to the last good versions. [15:17:40] ^demon: if you don't stop that you will get a user relations award ;) [15:17:52] <^demon> Does it come with a cookie? [15:18:05] <^demon> TheWoozle: Might want to block him too, (doesn't appear that you had) [15:18:28] Wow, the link he put in that vandalism goes to a really poisonous page -- both in content and in scripting. It grabs your browser window and moves it around so you can't close the tab. [15:18:37] I just had to close Firefox to get out of it. 
[15:18:58] And no, haven't gotten around to blocking him yet. High on my list. [15:19:53] ^demon: no, it comes with a apple pie slice [15:20:10] TheWoozle: I suggest you install noscript extension [15:21:13] how is that going to work Nikerabbit, the script was on an external website? [15:21:25] The scripts were on the linked site, which was external... right... [15:21:53] <^demon> Nikerabbit: I'll take it :-) [15:21:54] afaik, there's no way for a regular user to insert scripts on a wiki page unless I've somehow enabled that, and I don't think I have, so... non-issue. [15:22:17] <^demon> TheWoozle: "Template:Peppers" is an outright deletion, nothing worth reverting to. [15:22:37] Yeah. [15:22:49] darkcode: it works by not executing anything without permission [15:22:53] I'm still doing early clean-up and trying to figure out what the pattern was. [15:23:21] It used to be renaming pages to "$1 ENJOYING BRIAN PEPPERS DAY???", so I disabled renaming (for now)... this is presumably his upping the ante. [15:23:39] <^demon> TheWoozle: I've almost finished reverting all his stuff ;-) [15:23:49] Cool! [15:24:09] Firefox is still reloading my way-too-many tabs. [15:24:25] Is there a tool anywhere which will effectively undo everything done by a given user? [15:25:37] I'd write one, except I'm not sure how to move/delete a page from within PHP code. It's probably simple enough, but I haven't messed with that functionality yet. [15:26:26] <^demon> He's fully reverted now except for "User:I liek mountain dew" and "Template:Peppers" [15:26:37] <^demon> Which I can't delete, as I'm an anon :-) [15:28:05] Nikerabbit your talking about a firefox extension rather then a mediawiki extension than right? [15:28:20] darkcode: yes [15:28:27] <^demon> Back in a moment folks. [15:28:36] darkcode, TheWoozle: http://noscript.net/ [15:28:48] *TheWoozle looks... [15:28:54] ya I thought you were talking about a noscript mediawiki extension [15:28:59] Ahh, right, FFx extension. 
[15:29:00] *darkcode already knows about the firefox one [15:29:30] darkcode: nope, and I wasn't aware there is one [15:30:55] I think you have to install an extension if you want to *allow* scripts on MW. [15:30:58] I'm not aware of one either, I thought you were suggesting that a noscript mediawiki extension could somehow prevent scripts from being run on an external website which didn't make sense [15:31:08] Javascript etc. is stripped out by default. [15:32:04] true [15:32:18] but I was talking generally [15:39:19] Much thanks to ^demon; I'm double-checking, but so far it looks like you got it all ^_^ [15:39:55] 03rotem * r39409 10/trunk/extensions/Oversight/HideRevision.i18n.php: Localization update for he. [15:53:05] 03(NEW) Flood flag for Meta - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15176 15enhancement; normal; Wikimedia: Site requests; (mike.lifeguard) [15:53:10] 03(mod) Allow a user to add a specified group to themselves and remove it - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15126 (10mike.lifeguard) [15:54:09] hello [15:57:00] 03tstarling * r39410 10/trunk/phase3/maintenance/parserTests.inc: Avoid notice when no categories are present [15:57:29] 03tstarling * r39411 10/trunk/phase3/maintenance/parserTests.txt: Update for revert to old link trail [15:58:17] can you help me for sql infoboxdata [16:02:12] 03tstarling * r39412 10/trunk/phase3/includes/parser/CoreParserFunctions.php: Don't use $wgParser when {{int:}} is called, use $parser->replaceVariables() instead. Removes an unnecessary potential $wgTitle reference, and fixes Parser_DiffTest. [16:02:46] #1064 - You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '{{{Schema}}}.`{{{tag}}}_infoboxdata` ( [16:02:46] `ib_from` INT(8) UNSIGNED NOT NULL DEF' at line 1 [16:02:51] Why ?? [16:03:50] 03tstarling * r39413 10/trunk/phase3/includes/parser/Parser_DiffTest.php: Added shortOutput option. 
[16:06:19] Didoudu73: uh... [16:06:35] Didoudu73: and HOW are you making that statement [16:06:56] perfectly valid syntax, that [16:07:03] I don't know how it could cause a problem [16:07:17] with phpmyadmin [16:07:50] because I want to instal Extension:Infoboxl [16:08:04] http://www.mediawiki.org/wiki/Extension:Infobox_Data_Capture#Handling_Lists [16:08:08] TimStarling: you're joking, right? [16:08:14] yes [16:08:23] just making sure [16:08:49] to commit, or to do more testing [16:08:53] the eternal quandry [16:09:28] TimStarling: commit the bitch and add 'testme' keyword :) [16:09:28] is it normal for the `text` table to get to over 2GB in size? [16:09:50] depends whether you put 2GB of text in it [16:10:01] not intentionally.... [16:10:21] however the row size is 352,844 with 6000 rows [16:10:25] try running maintenance/compressOld.php [16:10:50] can that be run as nobody? [16:10:52] if it doesn't destroy your database entirely, it'll make it smaller [16:10:57] err the apache user? [16:11:09] TimStarling: so... make a backup first? [16:11:11] it can be run as whoever you like, as long as they have database access [16:11:17] k [16:11:29] we haven't tested it recently at wikimedia [16:11:36] but we've had no reports of it breaking thinigs [16:11:39] things [16:13:11] Didoudu73: here's a hint about your issue, by the way: wiki markup isn't valid in phpmyadmin [16:13:44] has anyone ever wondered why requires you to start every line with Image:? [16:13:48] Oki but the code sql isn't correct [16:15:09] TimStarling: yes, because my extension that extends gallery doesn't :) [16:15:48] kool [16:15:52] Simpler syntax? [16:16:27] MZMcBride: ? [16:16:53] Why requires to start every line with Image: ? 
[16:17:19] I don't necessarily see how that is simpler [16:17:22] pretty simple to fix [16:17:24] - $tp = Title::newFromText( $matches[1] ); [16:17:24] + $tp = Title::newFromText( $matches[1], NS_IMAGE ); [16:17:45] then you can use whatever namespace you want, but you get a suitable default [16:17:56] There's an ancient bug regarding renaming the Image: namespace... [16:18:06] I'm supposed to remind you all about it occasionally. [16:18:11] thank you [16:18:13] there s? [16:18:15] err is? [16:18:39] https://bugzilla.wikimedia.org/show_bug.cgi?id=44 [16:18:49] wow 44 :O [16:19:17] well, well, that *is* ancient [16:19:43] TimStarling: Your explodeiterator thing you were talking about last night sounds awesome. [16:19:54] ooooohhh yeah. i do actually remember that now [16:19:58] Look Skizzerz [16:19:59] CREATE TABLE {{{Schema}}}.`{{{tag}}}_infoboxdata` ( [16:19:59] `ib_from` INT(8) UNSIGNED NOT NULL DEFAULT '0', [16:19:59] `ib_datablock_order` INT(11) NOT NULL DEFAULT '7', [16:19:59] `ib_datablock_name` VARBINARY(255) NOT NULL DEFAULT '', [16:19:59] `ib_attribute_order` INT(11) NOT NULL DEFAULT '7', [16:20:01] `ib_attribute` VARBINARY(255) NOT NULL DEFAULT '', [16:20:03] `ib_value` BLOB, [16:20:05] `ib_isvalid` INT(1) UNSIGNED NOT NULL DEFAULT '1', [16:20:09] Bah. [16:20:09] Didoudu73: yes, you may stop pasting [16:20:09] `ib_comment` BLOB, [16:20:11] KEY `ib_from` (`ib_from`,`ib_datablock_order`,`ib_datablock_name`,`ib_attribute`), [16:20:12] o.O [16:20:13] Use a pastebin. 
[16:20:13] KEY `ib_datablock_name` (`ib_datablock_name`,`ib_from`) [16:20:15] ) ENGINE=MyISAM DEFAULT CHARSET=BINARY; [16:20:20] Skizzerz: he can't [16:20:28] ok [16:20:30] Didoudu73: the issue is with your first line [16:20:32] not the rest of it [16:20:42] {{{Schema}}} and {{{tag}}} are not valid [16:21:00] I put with the place [16:21:15] switch it out with your database name and your database prefix [16:21:31] if you don't know what those are, then you shouldn't be touching your database to begin with [16:22:08] what's up kids [16:22:23] hi brion [16:22:33] you want to see some cool iterator tricks? [16:22:54] oooh [16:22:56] sure [16:24:35] brion: Can you comment on this bug, please? https://bugzilla.wikimedia.org/show_bug.cgi?id=12569 [16:25:12] Skizzerz : I know my foundation of data it is sodalis_wiki [16:25:30] Didoudu73: what is your wiki's database named? [16:25:47] sodalis_wiki [16:26:58] it's a name for database [16:28:42] omg it's brion [16:28:59] i have a question for you [16:29:46] any idea why when i click on login or try to submit a page i just get a blank page? I am not getting any errors [16:30:13] in AntiSpoof in the function getConflict() is that isLegal call needed? since getConflict is only called in a conditional if isLegal is true [16:31:22] Didoudu73: and in your wiki, what is your "user" table named? e.g. is it "user" or "wiki_user" or what? [16:32:40] sodalis_wiki [16:32:59] no, the TABLE name [16:33:07] NOT the name of the database [16:35:19] The Table user ?? or for the Exetention [16:37:27] the table [16:37:29] no CIA today? [16:37:36] stop worrying about the extension for now [16:37:49] *MZMcBride kicks CIA-56 [16:37:49] ow [16:38:26] I don't think that helps [16:38:34] table name is user [16:38:38] maybe it makes you feel better though [16:38:38] ok [16:38:43] ok, then replace your first line with this: CREATE TABLE `sodalis_wiki`.`infoboxdata` ( [16:38:58] then re-run that query. 
it should work [16:39:04] http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=39414 [16:39:33] Tells you if the bot is still present and responding to pings. [16:41:09] good the table added [16:41:52] Your request SQL was successfully carried out (Treatment in 0.0085 dryness.) [16:47:52] 03tstarling * r39414 10/trunk/phase3/ (8 files in 3 dirs): (log message trimmed) [16:47:52] * In the parser: do link existence tests in batches of 1000. Avoids using excessive memory to store Title objects. [16:47:52] * Split link placeholder/replacement handling into a separate object, LinkHolderArray. [16:47:52] * Remove Title objects from LinkCache, they apparently weren't being used at all. Same unconstrained memory usage as the former $parser->mLinkHolders. [16:47:55] * Introduced ExplodeIterator -- a workalike for explode() which doesn't use a significant amount of memory [16:47:57] * Introduced StringUtils::explode() -- select whether to use the simulated or native explode() depending on how many items there are [16:48:00] * Migrated most instances of explode() in Parser.php to StringUtils::explode() [16:48:04] whee [16:48:21] better late than never [16:49:54] 03(mod) Rename the "Image" namespace to "File" - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=44 (10jhsoby) [16:51:32] 03(mod) Add MultiUpload extension. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15168 (10brion) [16:51:59] 03ialex * r39415 10/trunk/phase3/ (5 files in 2 dirs): [16:51:59] Fixes for r39406: [16:51:59] * Set missing properties svn:eol-style and svn:mime-type [16:51:59] * Use auto loader if possible [16:51:59] * Added $wgEnableHtmlDiff to DefaultSettings.php (todo: fix description of this setting) [17:06:32] I'm updating an auto authentication plugin, and I'm having an issue with checking to see if a user is already logged in. I'm doing exactly what is being done in the CentralAuth plugin, but it doesn't seem to be working for me. 
I'm using $user->isLoggedIn() [17:08:26] isLoggedIn() means that they're registered, as opposed to anonymous. [17:08:32] right. [17:08:49] and i'm using a logged in user, and it is returning false [17:09:55] the initialisation sequence in authentication plugins is a bit confused [17:10:02] agreed :) [17:10:20] it's not hard to get a half-initialised user object [17:10:36] and it's not hard to get into a state where the database is more up-to-date than the object's cache [17:11:06] how is it handled in CentralAuth? [17:11:09] 03(mod) Rename the "Image" namespace to "File" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=44 (10N/A) [17:11:20] I'm trying to avoid an ldap search on every page view [17:12:03] a worthy goal [17:12:08] heh. yeah [17:12:14] Hi, I'm using MediaWiki as a small CMS on my page, which restrictions do i need so no anonymous user can do anything but view the page? I disabled upload and edit, what did I forget? [17:12:25] bsm, createaccount? [17:12:37] !rights [17:12:37] --mwbot-- For information on customizing user access, see < http://www.mediawiki.org/wiki/Help:User_rights >. For common examples of restricting access using both rights and extensions, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >. [17:12:52] i'd also like to move some of the auto authentication stuff to the core, so that core-style stuff can be updated and maintained [17:12:55] User_rights has the complete list. [17:13:14] things like checking logged in users, and creating user accounts :) [17:14:06] thanks Simetrical, MZMcBride, I'll have a look at that [17:15:21] TimStarling: do you think extensions/ExtensionFunctions.php should be made obsolete? [17:15:40] only once all the extensions that still use it are transitioned [17:15:54] Skizzerz: yes, of course. 
[17:16:36] it's a backwards-compatibility interface for MediaWiki 1.6, it was never meant to be permanent [17:17:11] once an extension no longer wants to support 1.6, it can stop using ExtensionFunctions.php [17:17:18] eew. MetavidWiki uses it. [17:17:25] That shouldn't be. [17:17:29] I used to check for logged in users by creating a user with User::LoadFromSession(). It was reliable, but for some reason it was made into a private function... [17:17:40] It appears to be using a copy, even. [17:17:56] ./var/www/w/extensions/MetavidWiki/ExtensionFunctions.php [17:18:22] Ryan_Lane: maybe you wanted User::newFromSession()? [17:18:49] siebrand: it's not hard to find things wrong with metavidwiki [17:18:54] but we do have a guy working on it [17:19:01] TimStarling: i've tried that. It seems to cause me serious problems. Let me try that again right now so I can remember what the issue was [17:19:20] TimStarling: yeah. In his merge from trunk he also removed 50% of newly added localisation, and some other fixes. [17:19:36] merge to trun, even. [17:22:55] TimStarling: oh yeah, it calls the UserLoadFromSession hook [17:23:09] TimStarling: causing an infinite loop for me. [17:24:23] which awesomely enough causes php to seg fault :) [17:24:27] I think if you want to use User::loadFromSession() for some good reason, you should just make it public [17:24:32] <^demon> Ryan_Lane: There's a bug about infinite recursion with UserLoadFromSession (bug 14178) [17:24:48] but note in the doc comment that it's just meant for auth plugins [17:25:00] PHP doesn't have friend classes, it makes protection a bit difficult sometimes [17:25:15] 03(mod) Install the Babel extension on sv.wikipedia.org - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15165 (10Eugene.Zelenko) [17:25:34] ^demon: you sure that's the right bug? :) [17:25:41] TimStarling: ok, cool. thanks [17:26:02] TimStarling: what should I do about backwards compatibility? 
[17:26:24] distribute an old version [17:26:45] auto auth has been broken since that method turned private [17:27:00] i don't have a workaround [17:27:18] i could just tell people to upgrade i guess :) [17:27:19] I'd have to look at it more closely [17:27:31] but I'm going to bed instead [17:27:39] ok. thanks for the help [17:30:38] *Ryan_Lane mumbles. [17:31:04] heh. loadFromSession() actually calls the hook. newFromSession() just happens to call loadFromSession() [17:33:22] could it be that we are just calling the auto authentication plugins at the wrong time? it looks like the plugins are being called before the session is loaded. [17:56:42] 03(FIXED) Special:Allpages unconscionably large on giant wikis like en. wikipedia.org - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13902 +comment (10JSchulz_4587) [18:10:55] ???Help InfoboxDataCapture::save failed to return a value, why this happens with Infoboxdata.php ???? [18:11:00] Can someone give me a sanity check on my code? I want to be sure I don't release something insecure... [18:11:59] php6th: is it a hook? if so, is it not returning a value? [18:12:38] Ryan_Lane: yes its hook [18:13:24] php6th: it needs to return either true or false. false if failure of that hook should stop execution of other functions for the same hook [18:13:33] php6th: i'm betting it should return true [18:14:37] Ryan_Lane: one sec ill paste the error [18:15:01] \o/ CIA-56 [18:15:26] php6th: can't you just check the function to see if it is returning? 
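(An aside on the UserLoadFromSession recursion Ryan_Lane and ^demon discuss above: one common way out is a re-entrancy guard. The sketch below is an illustration under that assumption, not CentralAuth's or any plugin's actual code; the function name and the LDAP comment are hypothetical.)

```php
<?php
// Sketch of breaking UserLoadFromSession recursion with a static flag.
// If the handler itself causes the session/user to load, the hook fires
// again; the flag makes the nested call a harmless no-op.

function exampleAutoAuthFromSession( $user, &$result ) {
	static $busy = false;
	if ( $busy ) {
		return true; // nested invocation: let MediaWiki carry on normally
	}
	$busy = true;

	// ... consult the external auth source (e.g. LDAP) and set up $user ...

	$busy = false;
	return true;
}
$wgHooks['UserLoadFromSession'][] = 'exampleAutoAuthFromSession';
```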
[18:18:31] http://virtual.codebin.ca/1173227 [18:19:00] 03siebrand * r39420 10/trunk/extensions/UnicodeConverter/ (4 files): [18:19:00] * remove dependency on ExtensionFunctions.php [18:19:00] * add $wgExtensionAliasesFiles (no support in Translate because it is an example extension) [18:19:00] * method renamed for consistency with other extensions [18:19:00] * delayed message loading [18:19:07] when i enable this, i get that error: require_once("$IP/extensions/InfoboxData/InfoboxData.php"); [18:20:10] Ryan_Lane: http://virtual.codebin.ca/1173227 ???when i enable this, i get that error: require_once("$IP/extensions/InfoboxData/InfoboxData.php"); [18:20:15] php6th: yeah, I'm familiar with the error, and i'm telling you that the function needs to return some value. [18:20:34] Ryan_Lane: do you have a clue what is the workaround? [18:20:43] php6th, he's already told you several times. [18:20:50] php6th: go to the function, and add "return true" at the end [18:20:55] Simetrical: sorry i missed , i just returned home [18:20:59] ohhhh [18:21:02] php6th: or "return true;" to be more exact [18:21:02] ok 1 second [18:21:12] which function? [18:21:37] php6th: InfoboxDataCapture::save [18:22:30] php6th: which is probably going to be the save function in InfoboxDataCapture.php [18:22:57] http://virtual.codebin.ca/1173234 << i did that [18:23:37] php6th: and now what error are you getting? the same one? [18:24:26] Ryan_Lane: no more error, but the infobox is not shown :( [18:24:41] php6th: check your php error logs [18:24:43] {{#listsplit:synonyms|function;foo;foobar;method|;|1|All valid names for a function}} [18:25:16] i tested with that content, but no errors, just it displays text instead of the infobox [18:26:00] php6th: what extension are you using? can you post a link to the extension's page? 
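(For anyone following the "return true" exchange above: MediaWiki hook handlers signal success with their return value. A minimal sketch, using a hypothetical handler name and a trimmed parameter list:)

```php
<?php
// MediaWiki calls every function registered for a hook in turn; a handler
// must return true so the remaining handlers and default behaviour run.
// Returning nothing (null) produces the "failed to return a value" error
// php6th is seeing; returning false aborts further processing.

$wgHooks['ArticleSaveComplete'][] = 'exampleCaptureSave';

function exampleCaptureSave( &$article, &$user, $text ) {
	// ... do whatever the extension needs with the saved page text ...

	return true; // without this line, MediaWiki reports the hook as broken
}
```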
[18:26:08] one sec [18:26:31] http://www.mediawiki.org/wiki/Extension:Infobox_Data_Capture [18:27:04] i want a Box, like the blue one in the right side of that page [18:28:01] php6th: thats an infobox, not an inputbox [18:28:09] php6th: and it is done with templates [18:28:31] how i can implement that? pleaseeee [18:28:36] php6th: you don't need to install any extensions for it. well you *might* have to install ParserFunctions [18:29:06] 03aaron * r39421 10/trunk/phase3/includes/specials/SpecialAllpages.php: double-check with isLocal() to be safe [18:29:27] php6th: you just need to copy whatever templates are being used. let me see if there are any generic infoboxes around [18:30:01] Ryan_Lane: pleaseee, ill really thank you for that [18:34:11] php6th: see: http://en.wikipedia.org/wiki/Template:Infobox [18:35:00] php6th: you'll need to copy that template, and all the templates it uses (and all the templates they use) [18:35:23] Special:Export may have recursive export now. [18:35:33] oh. that would be nice :) [18:35:34] vandal needs a block: http://www.mediawiki.org/wiki/Special:Contributions/80.234.11.10 [18:35:53] wow i love you, if i were a girl ill give myself to you [18:36:30] MaxSem: Done. [18:36:38] kthx [18:36:45] MaxSem: By the way, feel free to use your admin rights there. [18:36:49] php6th: my girlfriend would likely kick your ass ;) [18:37:05] I can nearly guarantee nobody would care if stewards cleaned up there. [18:37:17] MZMcBride, Cometstyles said absolutely the opposite to us stewards not so long ago:) [18:37:26] php6th: go to: http://en.wikipedia.org/wiki/Special:Export [18:37:48] MaxSem: I've found a healthy dose of ignoring him is usually healthy. 
[18:37:49] 03siebrand * r39422 10/trunk/extensions/gis/ (14 files): (log message trimmed) [18:37:49] * remove dependency on ExtensionFunctions.php [18:37:49] * add $wgExtensionAliasesFiles [18:37:49] * use author array in extension credits [18:37:49] * method special page renamed for consistency with other extensions [18:37:50] * delayed message loading [18:37:54] * remove some obsolete, commented out code [18:37:59] php and type in Template:Infobox; make sure to select the "include templates" checkbox [18:38:01] *MZMcBride used healthy twice. >_> [18:38:13] ) [18:38:30] I'll post somewhere and see if we can't get that clarified... [18:39:47] That, or I'll just have a bureaucrat make all the stewards +sysop explicitly. ;-) [18:43:03] Ryan_Lane: how should i install that template? i cant figure it out [18:43:24] Templates are like articles or other pages. [18:43:26] php6th: once you have exported it, you'll need to import it into your wiki using Special:Import [18:47:47] Ryan_Lane: is this the page i need to export? http://en.wikipedia.org/wiki/Template:Infobox [18:48:01] php6th: yes. [18:48:15] 03siebrand * r39423 10/trunk/extensions/BoardVote/ (BoardVote.php BoardVote_body.php GoToBoardVote_body.php): [18:48:15] * remove dependency on ExtensionFunctions.php [18:48:15] * add $wgExtensionAliasesFiles [18:48:15] * minor changes for consistency with other extensions [18:48:15] * delayed message loading (and wfBoardVoteInitMessages()) [18:48:16] * remove some obsolete, commented out code [18:48:18] * EOL whitespace fixes [18:48:31] siebrand: top speed [18:49:15] Nikerabbit: trying not to rush it... [18:49:41] Nikerabbit: and there are still 8 extensions left in Subversion that use it :( [18:50:39] AaronSchulz: this was you, wasn't it? 
PHP Strict Standards: Declaration of SpecialPrefixindex::showChunk() should be compatible with that of SpecialAllpages::showChunk() in /var/www/w/includes/specials/SpecialPrefixindex.php on line 43 [18:51:07] 03(mod) Documentation is out of date, incomplete - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=1 (10tfinc) [18:51:25] omg. Updates to bug 1... [18:51:27] ugh, they are coupled [18:51:31] *AaronSchulz pokes [18:51:47] Ryan_Lane: it doesnt work, sorry i cant figure it out [18:52:08] Importing pages... Import failed: No pages to import. [18:52:11] php6th: go to: http://en.wikipedia.org/wiki/Special:Export [18:52:17] im there [18:52:18] php and type in Template:Infobox; make sure to select the "include templates" checkbox [18:52:22] siebrand: I'm barely able to read text from the screen :< [18:52:28] Hahaha. Tim reopened it. [18:52:47] php6th: once you have exported it, you'll need to import it into your wiki using Special:Import [18:52:53] Nikerabbit: how so? Sitting in the sun with your laptop? [18:53:30] siebrand: naah [18:53:34] I'm little sick [18:53:37] Importing pages... Import failed: No pages to import. << same error [18:53:53] Nikerabbit: I know you're sick. ;) Be well. [18:54:08] siebrand: how can you know that :O [18:54:09] php6th: you are giving Special:import the file you got from Special:Export on the english wikipedia? [18:54:44] Ryan_Lane: exactly Wikipedia-20080815184450.xml [18:54:49] Nikerabbit: http://dictionary.reference.com/browse/sick 4. :) [18:55:26] siebrand: you damn kewler [18:55:31] i forget what to add to the url so i can view action=raw in the browser [18:56:08] Nikerabbit: No results found for kewler. [18:56:36] php6th: on special:export, did you type exactly "Template:Infobox" without the quotes? [18:57:42] Template:Infobox 1 revision Template:! 
1 revision Template:Clear 1 revision Template:Documentation 1 revision Template:Documentation/docname 1 revision Template:Documentation subpage 1 revision Template:Infobox/doc 1 revision Template:Intricate template 1 revision Template:Nowrap 1 revision Template:Ombox 1 revision Template:Pp-meta 1 revision Template:Pp-template 1 revision Template:Tl 1 revision Template:Tnavbar 1 revision [18:57:46] worked! [18:57:49] :) [18:58:07] now you'll need to go look at the documentation for the template to figure out how to use it [18:58:09] Ryan_Lane: Thank you a lot... now ill test, dont move [18:58:36] php6th: I hope you know how to use templates well, because you'll need to [19:00:27] Ryan_Lane: it looks horrible [19:00:42] php6th: is your wiki public? [19:01:02] sure ,, ill pm you [19:01:57] brion: the end of monobook.php looks like it is missing a ?> http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/skins/MonoBook.php?revision=39082&view=markup [19:02:15] or some other dev. [19:02:22] php6th: you need to install ParserFunctions [19:02:24] they are removed intentionally [19:02:25] exobuzz: these days we usually skip the ?> at the end, it's not necessary [19:02:36] brion. oh.. ok. [19:02:44] how do i use the bot to link to an extension? [19:02:51] !ex ParserFunctions [19:02:51] --mwbot-- I don't know anything about "ex". 
You might try: !actions !bizzwiki !boilerplate !bundels !centralauth !dpl !e !editbuttons !extensionmatrix !extranamespace !feeds !filetype !interwiki !ldap !moderation !parserfunctions !renameuser !sidebarex !threads !userapproval [19:02:56] Ryan_Lane: !e [19:03:01] 03aaron * r39424 10/trunk/phase3/includes/specials/SpecialPrefixindex.php: Decouple from allpages a little to avoid E_STRICT notices [19:03:02] !e ParserFunctions [19:03:02] --mwbot-- http://www.mediawiki.org/wiki/Extension:ParserFunctions [19:03:09] thanks [19:03:14] it's also prone to error as extra whitespace after the ?> can mess up binary or xml output [19:03:18] brion: btw, a str_replace type parserfunction would be handy.. [19:03:21] Ryan_Lane: http://www.mediawiki.org/wiki/Extension:ParserFunctions ?? [19:03:22] :) [19:03:24] ah ok [19:03:29] Ryan_Lane: give me 5 mins [19:03:34] siebrand: thanks [19:04:06] !e StringFunctions | exobuzz [19:04:06] --mwbot-- exobuzz: http://www.mediawiki.org/wiki/Extension:StringFunctions [19:04:09] :) [19:04:31] 03(mod) Rename the "Image" namespace to "File" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=44 -need-review -patch ; +comment (10ssanbeg) [19:04:33] Skizzerz: crap and you know what. i wrote my own earlier.. damn ;-) [19:04:39] Skizzerz: i should search better [19:04:42] :P [19:05:15] can you do parserfunctions inside of parserfunctions ? [19:05:22] brion: i'm having trouble with auto authentication. It seems that checking $user-isLoggedIn() doesn't work [19:05:49] exobuzz yes you can [19:06:15] ok nice. then i can use replace and urlencode instead of my own urlencode which ignores slashes [19:06:38] brion: is that call actually working in CentralAuth? I see you check it as well, but it looks like the hook is called before the session is loaded, so checking $user-isLoggedIn() will always return false. [19:07:49] Ryan_Lane: it works!!! 
but i dont see the border [19:08:19] php6th: you'll need to apply css; see the documentation for infobox [19:08:45] php6th: http://en.wikipedia.org/wiki/Template:Infobox [19:09:44] 03(NEW) dammit.lt/wikistats stopped working - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15177 15enhancement; normal; Wikimedia: Usage Statistics; (Wiki.Melancholie) [19:10:00] i wish definition lists could be styled so that the width of the
<dt> is that of the max length of a term as well as having the <dd>
on the same line. so you can have nice clean lists no matter term lengths. [19:11:07] 03(mod) dammit.lt/wikistats stopped working - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15177 15enhancement->normal; normal->highest (10Wiki.Melancholie) [19:11:18] doing dt { float: left; width: 10em; } dd { margin-left: 10em } type thing, but with a non fixed width/margin [19:13:17] Ryan_Lane: i cant add a border :( [19:14:33] php6th: look at the very first example. it has a border [19:14:36] php6th: http://en.wikipedia.org/wiki/Template:Infobox#Examples [19:15:09] yeh, i copied it exactly, but doesnt show border [19:15:16] 03(mod) dammit.lt/wikistats stopped working - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15177 (10Wiki.Melancholie) [19:15:26] php6th: oh. that is probably done in the sitewide css [19:15:49] Ryan_Lane: where are those css's? [19:15:54] php6th: http://en.wikipedia.org/wiki/MediaWiki:Common.css [19:16:17] php6th: you'll need to take the infobox stuff, and put that on your wiki's MediaWiki:Common.css page [19:16:44] wow thanks, 5 mins [19:22:19] 03siebrand * r39425 10/trunk/extensions/ (8 files in 2 dirs): [19:22:19] * remove dependency on ExtensionFunctions.php [19:22:19] * add $wgExtensionAliasesFiles [19:22:19] * minor changes for consistency with other extensions [19:22:19] * delayed message loading, removed use of $wgMessageCache, added message to i18n file and mark it optional in Translate [19:22:20] * EOL whitespace fixes [19:25:21] Ryan_Lane: i love you... [19:25:35] php6th: everything working well? [19:25:39] is there a book or soemthing to learn all thsi tricks? [19:25:50] yeah its working as expected [19:26:07] php6th: I think someone might have published an oreilly book on MediaWiki recently, i'm not sure if this kind of stuff is in it [19:30:24] is there a $wgOut->addLink call for adding for example a css AFTER the main monobook css etc ? [19:30:41] an extension might want to override some styles for example.. 
i guess i can reorder my skin a bit but. [19:31:34] i was to reference my css a little later [19:31:35] want [19:33:42] exobuzz: as far as I can remember, i don't think you can order the css [19:34:10] AaronSchulz: your new version of Special:Allpages broke {{Special:Allpages/...}} [19:34:30] exobuzz: I haven't looked at that code in a while though, so maybe something has changed [19:35:19] because i have a default style for definition lists, but i want to use different styled definition lists in my extension, and the defaults are messing me up [19:35:34] the common.css comes after my extension css [19:35:53] exobuzz: what hook are you using to add the css? [19:36:24] $wgOut->addLink [19:37:23] ialex: should just refer to prefixindex if we want to keep that [19:39:16] exobuzz: oh. have you tried looking for a hook that may output after the other css? [19:39:39] i had a brief look [19:40:08] exobuzz: if you don't see one, you could always add one [19:40:20] 03siebrand * r39426 10/trunk/extensions/ListChangedArticles/ (ListChangedArticles.php ListChangedArticles_body.php): [19:40:20] * remove dependency on ExtensionFunctions.php [19:40:20] * add FIXME [19:40:20] * EOL whitespace fixes [19:41:15] ill try addextensionstyle.. see if that is any different [19:42:58] 03aaron * r39427 10/trunk/phase3/includes/specials/SpecialAllpages.php: Support {{Special:AllPages/x}} rather than defaulting to main index [19:45:06] weird. Call to undefined method OutputPage::addExtensionStyle() yet it is in the api documentation [19:45:12] no longer exists ? [19:45:55] grumble. [19:46:01] looks like it is gone. [19:46:06] update the api! 
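(A sketch of the late-stylesheet idea exobuzz is chasing, assuming OutputPage::addLink() takes an attribute array as in 1.13-era MediaWiki; the extension name and CSS path are hypothetical. As discussed above, addLink() just emits another <link> element, and whether it lands after the skin's own CSS depends on the skin's output order, which is why style overrides done this way can be unreliable.)

```php
<?php
// Append an extension stylesheet as a <link> element in the page head.
// Note: ordering relative to the skin's CSS is not guaranteed.
$wgOut->addLink( array(
	'rel'  => 'stylesheet',
	'type' => 'text/css',
	'href' => "$wgScriptPath/extensions/MyExtension/myextension.css", // hypothetical path
) );
```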
[19:50:17] 03siebrand * r39428 10/trunk/extensions/ (4 files in 2 dirs): [19:50:17] * remove dependency on ExtensionFunctions.php [19:50:17] * add $wgExtensionAliasesFiles and support in Translate [19:50:17] * method renamed for consistency with other extensions [19:50:17] * remove EOL whitespace [19:53:17] Ryan_Lane: i managed to style things differently now so one takes priority. thanks [19:53:26] exobuzz: no problem [19:58:03] 03dantman * r39429 10/trunk/extensions/SemanticForms/includes/SF_GlobalFunctions.php: Undo autoloading of SF_Language class. SF Seams to be doing something similar to SMW. [20:02:01] 03siebrand * r39430 10/trunk/extensions/ (4 files in 2 dirs): [20:02:01] * remove dependency on ExtensionFunctions.php [20:02:01] * delayed message loading [20:02:01] * add $wgExtensionAliasesFiles and support in Translate [20:02:01] * method renamed for consistency with other extensions [20:02:02] * remove EOL whitespace [20:03:24] how do i define default namespaces to be searched? [20:03:42] currently only (Main) is selected. 
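(For the default-search-namespaces question just above, the usual answer is $wgNamespacesToBeSearchedDefault in LocalSettings.php, keyed by namespace constant; the particular namespaces chosen here are just an example:)

```php
<?php
# LocalSettings.php fragment: search the main, Project and Help
# namespaces by default. Namespaces not listed default to false.
$wgNamespacesToBeSearchedDefault = array(
	NS_MAIN    => true,
	NS_PROJECT => true,
	NS_HELP    => true,
);
```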
[20:04:55] http://www.mediawiki.org/wiki/Manual:$wgNamespacesToBeSearchedDefault [20:05:02] ty [20:05:10] 03river * r39431 10/trunk/tools/switchboard2/ (Makefile switchboard.c): switchboard should run under a watchdog process so it's automatically restarted when it dies [20:05:21] 03siebrand * r39432 10/trunk/extensions/MakeDBError/ (MakeDBError.php MakeDBError_body.php): [20:05:21] * remove dependency on ExtensionFunctions.php [20:05:21] * method renamed for consistency with other extensions [20:06:07] 03river * r39433 10/trunk/tools/switchboard2/switchboard.c: svn:keywords, svn:eol-style [20:11:15] 03river * r39434 10/trunk/tools/switchboard2/switchboard.c: #include [20:13:16] 03siebrand * r39435 10/trunk/extensions/geoserver/ (7 files): [20:13:16] * remove dependency on ExtensionFunctions.php [20:13:16] * method renamed for consistency with other extensions [20:13:16] * remove EOL whitespace [20:14:25] 03(mod) Cite anchors should be numbered starting at 1 to correspond with on-screen labels - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10537 (10ssanbeg) [20:16:29] 03siebrand * r39436 10/trunk/extensions/geoserver/ (7 files): revert r39435. Not sure about this one. Need to look closer. [20:17:07] 04(REOPENED) Image, category pages list both draft and stable links, mixed together - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14557 +comment (10pbirken) [20:22:07] 03siebrand * r39437 10/trunk/extensions/geoserver/ (7 files): [20:22:07] Recommit of r39435. It contained the special page definition in the wrong file, and actual removal of ExtensionFunctions.php dependency was omitted. [20:22:07] * remove dependency on ExtensionFunctions.php [20:22:07] * method renamed for consistency with other extensions [20:22:08] * remove EOL whitespace [20:24:34] I cannot figure out why my extension class function isn't getting called.. 
The details are here: http://zuesworks.com/wiki/index.php/Extension_testing [20:26:15] what's so special about this, this $wgSpecialPages cannot do? [20:26:16] extAddSpecialPage( dirname(__FILE__) . '/specials/MV_SpecialMediaSearch.php', 'Search', 'MV_SpecialSearch' ); [20:26:29] Comment says: //don't override special search page: (requires ExtensionFunctions.php) [20:30:37] 03river * r39438 10/trunk/tools/switchboard2/ (8 files): fix warnings with Intel C++ [20:33:46] 14(DUP) JavaScript Regexp syntax highlighting does not support literal strings - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15175 +comment (10dan_the_man) [20:33:55] 03(mod) Upgrade to the latest version of GeSHi (1.0.8) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10967 +comment (10dan_the_man) [20:36:30] 03river * r39439 10/trunk/tools/switchboard2/ (Makefile main.cc process.cc util.cc version.h): [20:36:30] - fix errors with sun c++ [20:36:30] - remove unneeded link libraries [20:36:56] 03(mod) Rename the "Image" namespace to "File" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=44 +comment (10dan_the_man) [20:45:23] 03(mod) Cite anchors should be numbered starting at 1 to correspond with on-screen labels - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10537 (10webboy) [20:50:10] Thats weird, 1.12-1.13 and 1.13 is slower.. [20:51:06] uh, maybe the fact the mw1.13 has 100 more languages you've never heard of? [20:51:10] just a guess tho :p [20:51:26] that could be it ;) [20:51:37] that shouldn't make any difference [20:51:46] only the language in use is loaded [20:52:04] aye [20:52:28] (also, every release adds more language support, it just wasn't in release notes until now) [20:53:00] ya [20:53:40] its potentially something else tbh [20:53:47] Dantman: Re: Bug 44. Huh? [20:54:15] So we'd have Media:, Image:, File:, and a fourth namespace? [20:54:24] Huh? [20:54:43] "...and create a new Image: pesudo-namespace..." 
[20:54:51] The proposal there is change 'Image' into 'File' and alias 'Image' to 'File'. [20:54:55] What I said [20:55:07] What's wrong with making File: = Image: and being done with it? [20:55:44] Because if they have the same ID, then the parser will consider File: and Image: equivilant [20:55:54] And? [20:55:59] In other words... whenever you try to use [[File:Filename.ext]] the parser will try to embed an image [20:56:13] Right. [20:56:27] File's aren't just images [20:56:44] The correct handling would be Image embedding on Image: namespace, not on File: [20:56:45] If you do [[Image:Foo.ogg]], what happens right now? [20:57:06] Yes, but why should File: embed an image when the file could be an audio file? [20:57:16] if it's an audio file it should embed an audio file [20:57:24] isn't that what it does already? [20:57:28] File: should embed whatever Image: embeds currently. [20:57:43] i hate the naming Image: on mediawiki. its so irrelevent with the different media kinds you can upload [20:58:13] id love a change. [20:58:21] ok hate/love are a bit strong [20:58:39] http://test.wikipedia.org/w/index.php?oldid=62516 [20:58:55] If you use Image: currently with a non-image, it still embeds the media. [21:00:00] That's got nothing to do with the software... that has to do with the OggHandler extension [21:00:01] One sec [21:02:05] Does anyone know of an rpm out there for the latest stable release of mediawiki I am looking for an rpm built on fedora, centos, or, redhat [21:02:30] Helga: just install it from the tar.gz from mediawiki.ogr [21:02:35] Thats weird [21:02:40] it lives in basically one folder, you dont need a package of it imho [21:02:43] exobuzz, no I want an rpm no source [21:02:45] Unlocking the database speeds it up (even though i was just browsing) [21:02:53] Helga: why ? [21:02:57] Helga: err, php is a scripting language. there is only source. 
[21:03:09] exobuzz, cause I like to package everything yes even php that sits in a folder [21:03:32] what if I need to upgrade, or patch it or somethoing... [21:03:39] I just like RPMS for any software I install :) [21:03:43] you get a new version and unpack it over the old [21:03:47] Helga: well, i have not ever once seen a distribution package of the latest release of mediawiki. they are always at least one release behind, and tend to suck terribly [21:04:00] why do they sucl? [21:04:01] suck? [21:04:21] if something needs to have files in man/usr/share etc. a package is sueful also with dependencies, but something like mediawiki, it is pointless imho and actually makes live harder [21:04:30] life [21:04:31] Helga: i use package management for everythign but webapps. linux distris tend to try to treat them like deamosn or something ,and splatter files over half the system. while this is how webapps do NOT work. it's silly. [21:04:53] well thats not because of the package management system [21:04:57] Helga: they change everything around and make the documentation not work [21:05:00] Reedy: Logged in or logged out? [21:05:05] thats because the person who wrote the spec file for the rpm didnt know what they were doing... [21:05:06] logged in [21:05:07] 03siebrand * r39440 10/trunk/extensions/ (8 files in 2 dirs): [21:05:07] * remove dependency on ExtensionFunctions.php [21:05:07] * add $wgExtensionAliasesFiles and support in Translate [21:05:07] * methods renamed for consistency with other extensions [21:05:07] * delayed message loading [21:05:08] * removed EOL whitespace [21:05:12] * removed some commented out code ($wgExtensionFunctions) [21:05:12] exobuzz: they actually *make* mediawiki put files into /etc/ /share, whatever, for no good reason :) [21:05:24] MZMcBride, it was taking ages and not always loading [21:05:30] unlock DB == problem soved [21:05:31] Helga: right. which is true for all packages for mediawiki i have seen so far. or any web app. 
[21:05:32] Hmm. [21:05:35] Strange. [21:05:45] Duesentrieb: nice.. good to make the users job more difficult! and then the user comes here and asks for help, but you cant help them since they have the stuff in the wrong place etc [21:06:19] hmm [21:06:31] Helga: Use svn, it works far better for updates than any package manager will for you... [21:06:32] Helga, exobuzz: linux distributions tend to have policies about what files go where. those policies usually don't fit the way web applications are designed. suckage ensues. [21:06:48] no no [21:06:48] i use svn for managing my mediawiki install and do vendor loads/merges [21:06:53] there are no such policies :-/ [21:07:04] thats all determined by the person who built the package not the operating system [21:07:09] well not linux at least [21:07:13] i like it when someone asks for help you give them advice and they start arguing.. [21:07:14] :) [21:07:24] Helga: not the OS proper, no. but the distribution managers, afaik. [21:07:24] no no [21:07:29] exobuzz, i am not trying to argue [21:07:36] just laying down facts sir... [21:07:37] *Splarka points to the handmade wooden sign outside the saloon doors to the channel: "packages unsupported, namespace read protection unsupported, null article and script paths unsupported, say 'Semantic' too often and get a free stabbing" [21:07:41] Helga: whatever. if you want to make a good rpm for mediawiki, more power to you. [21:07:46] i have yet to see any good package [21:07:51] Helga: our advice is. dont use a package for mediawiki. take it or not ;-) [21:07:59] What file is supposed to be writable for lockdb? 
[21:08:01] exobuzz, I won't take it [21:08:03] instead [21:08:05] ok [21:08:07] Ill build a package [21:08:08] just for you two [21:08:11] one that is perfect [21:08:17] yay :) [21:08:17] and puts the files right where they should be [21:08:20] !wg LockDB [21:08:20] --mwbot-- http://www.mediawiki.org/wiki/Manual:%24wgLockDB [21:08:21] as if i just unpacked it with tar [21:08:25] Helga: Can you tell me what benifits a package serves you? [21:08:31] Helga: make sure to write about it to the mediawiki-l mailing list [21:08:39] !ml | Helga [21:08:39] --mwbot-- Helga: See http://lists.wikimedia.org/ [21:08:48] helga likes to see "Mediawiki" in yums package list or something [21:09:18] i can understand that. it just never worked for me wit ha webapp. [21:09:20] Dantman, well its more organized easier to managment specially if i want to install this package across say 100 machines or say if a differenyt sys admin has to upgrade that package or patch or or remove it its much easier to do through a package manager rather then things like tar and rm [21:09:47] its more practical to use package management for ALL software its just a better way to manage software and ANY software even web apps [21:09:54] Helga: svn covers that perfectly fine. [21:10:02] Helga: in theory, i agree. [21:10:16] MediaWiki can update perfeclty fine with svn switch, svn up, and update.php [21:10:18] yeh better to keep it on svn and do an export to the 100 machines [21:10:22] Dantman: not really. much harder to see for example what is installed on box nr 75 [21:10:31] MZMcBride, wiki/uploads//lock_yBgMBwiR [21:10:34] fixed, ty [21:10:42] aah true you can easily run it from the mediawiki svn too [21:10:47] well svn is fine [21:10:50] and thats okay sort of [21:10:52] actusally thats more useful than a package [21:10:52] The variable is wgReadOnly [21:10:55] if you are managing a lot of boxes, using package management for everythign is good. provided the packages don't suck. 
[21:10:57] not really the best way to manage software [21:10:59] !wg ReadOnly [21:10:59] --mwbot-- http://www.mediawiki.org/wiki/Manual:%24wgReadOnly [21:11:00] so you end up making your own packages [21:11:07] like wikimedia does :) [21:11:07] but what if i need to make a custom patch to my media wiki? [21:11:11] then I can't use svn anymore.... [21:11:15] MZMcBride, uploads folder wasnt fully writable ;) [21:11:22] Heh. [21:11:23] Helga: huh? sure you can. [21:11:33] didnt put write permissions on, no biggy :) [21:11:36] what if i have no internet access for the machines the mediawiki is installed on ? [21:11:37] :) [21:11:50] *Splarka idly wonders why one would want 100 separate unsynched medawiki installs... [21:11:53] ^_^ If you're making your own packages that's an internal thing and you're not relying on external package distributors [21:11:56] we use mediawiki on 300 machines that don't have internet access with a lot of local patches and complex configuration... works fine [21:11:57] ^i [21:11:58] Helga: i have my custom mediawiki in my own svn. i use svn_load_dirs to upgrade my vendor tree and then merge with my own custom one [21:12:04] (svn trunk, even) [21:12:15] exobuzz, sounds like a lot more trouble then creating an rpm :P [21:12:18] twincest: well, they *do* have access to the svn repos, though [21:12:21] also sounds like a lot of things can go wrong there [21:12:27] Duesentrieb: but the live patches don't go into svn [21:12:35] true [21:12:48] it is easier for me, so that i can upgrade mediawiki and have my changes intact [21:12:53] The issue is external package distributions who think they can offer more ease of use than svn and the update script [21:13:05] twincest: they are just managed by hand, and rsynced to all boxes? [21:13:13] yes, there's a script to do the sync [21:13:18] (scap) [21:13:24] yea [21:13:37] Synchronize common all PHP. 
[21:14:14] exobuzz, I don't really disagree with you [21:14:23] Helga: the 300 boxes twincest was talking about are running wikipedia :) but anyway... i do see your point. i just never saw a decent package. make one, share it :) [21:14:44] exobuzz, the difference is production vs home... if this was for a home environment i'd just use svn, simple and easy [21:14:45] and wikipedia never goes wrong! [21:15:00] exobuzz, but in a production environment it's far better to use a package management system [21:15:34] twincest: well, it never goes down. it has been amazingly stable for the last couple of years. when was the last major outage? i can hardly remember. [21:15:37] if the production environment has a customised mediawiki, then i don't see how. [21:15:57] probably the colo move [21:16:02] Duesentrieb: perhaps a bullet list of all the problems you've seen in packages would help, such as all the common mistakes made in the existing packages (regarding configuration, patches, oldness of base code, inability to upgrade easily or switch to svn, etc) [21:16:05] when someone shut down the machine that was serving the downtime page ;) [21:16:06] i mean, i have my local dev, a staging site and a live site for my wiki [21:16:27] managed via svn with hooks for staging, and a script to update the live site.. a package wouldn't help me [21:17:16] Splarka: i have not played with it myself enough. search the ml for "rpm" and it's going to turn up quite a bit of shit, i'm sure :) [21:17:40] Splarka: what people tell me in here about packages makes my skin crawl... [21:17:53] *Splarka points to the unsupported list, nods [21:18:16] we even have !package [21:18:19] !package [21:18:19] --mwbot-- Many Linux distributions provide MediaWiki in a packaged format for that distribution. The MediaWiki development team refers you to your Linux distribution for assistance with installing, configuring or using them.
The individual communities & companies who maintain such packages should provide installation instructions. [21:18:32] and dammit, amidaniel needs to make mwbot's logs greppable [21:19:34] Yeah, they're rather useless as is. [21:20:40] to have multiple languages on one wiki, is a subpage with a template the recommended way to go? [21:20:54] subpages per language [21:21:38] I suppose. [21:21:50] Depends how many languages you're dealing with, I suppose. [21:22:20] ok.. well it was really a theory question.. [21:22:52] Most multilingual wikis that use the same site use Page_title/en, Page_title/es, etc. [21:23:09] Though if you were only dealing with two or three languages, you could use namespaces. [21:23:50] if say you had some pages with /en /es, is it possible to show a navigation link only if the subpages exist? or you would have to manually put in a template for example [21:24:00] #ifexist [21:24:12] (parserfunctions extension) [21:24:26] ok cool and would it be possible to have that on every page automatically ? [21:24:38] hello, can anyone explain this result to me? http://biodatabase.org/index.php/Special:Search?search=TOBFAC&fulltext=Search+text [21:24:40] sure, if you put a template there, heh [21:24:57] I see 'no hits' but not only does a page of that name exist, that term is used on the page [21:25:03] maybe some skin modification then [21:25:11] exobuzz, header? [21:25:21] exobuzz, there is a message called 'site notice' [21:25:22] {{#ifexist:{{FULLPAGENAME}}/{{int:userlang}}|{{{{FULLPAGENAME}}/{{int:userlang}}}}|{{{{FULLPAGENAME}}/en}} }} [21:25:24] or something [21:25:34] facefaceface: aah ok. [21:25:35] ignore me [21:25:36] (where int:userlang is a mw message trick) [21:25:48] arg... hate that template-scripting [21:25:54] Splarka: how does that work then ? [21:25:56] sitenotice probably won't work for that [21:25:57] int:userlang [21:26:11] because it is cached in a way that makes page-specific magic words fail often [21:26:16] Splarka, sorry...
I came in half way throuhg [21:26:20] *Splarka nods [21:26:31] exobuzz: in theory, anyway [21:26:35] and I cant spell through in any language [21:26:44] it is a bug, a circumvention of a user-specific magicword ban [21:26:53] (there are probably extensions to do it more reliably) [21:27:07] did anyone click the link I pasted? is anyone else speechless with incredulity? [21:27:21] 03river * r39441 10/trunk/tools/switchboard2/ (Makefile.config.example README): add README [21:27:53] 03siebrand * r39442 10/trunk/extensions/MetavidWiki/ (69 files in 14 dirs): svn properties and whitespace fixes for Metavid [21:28:04] why should this fail? http://biodatabase.org/index.php/Special:Search?search=TOBFAC&fulltext=Search+text is my search database screwed up or is it expected? [21:28:05] faceface: what am i supposed to see ? [21:28:20] I would like you to see a big fat 'one page found' please [21:28:31] faceface: perhaps the search index is outdated? [21:28:39] not the 'No page title matches' which is a lie [21:28:46] Duesentrieb, how could that have happened? [21:29:06] i got a review. woo http://cpanratings.perl.org/dist/MediaWiki-API [21:29:14] I'll hit the maintenance script and see if it goes away [21:29:14] faceface: by not being updated? it doesn't update automatically under all circumstances. depends on config and engine used. [21:29:36] faceface: also, some indexers might ignore all-uppercase words. [21:29:38] using default both ... I think [21:29:50] when using all defaults, it should work. but who knows [21:29:57] oh... that'll prolly be it [21:30:14] exobuzz, cool [21:30:29] exobuzz, how does it compare to pywikibot? [21:30:37] it's more low-level. [21:30:41] faceface: dunno if that's really the case.. look at the docs for mysql's fulltext index [21:30:49] exobuzz, which uses the user interface I believe... yeah, is it easier to code? [21:30:55] it does a lot less depending how you look. or more..
it is a direct interface to the mediawiki api [21:31:15] exobuzz, yeah, does that make it better? easier to code? etc? [21:31:27] i.e. is it faster / more robust / more functional ;-) [21:31:36] it is quite fast [21:31:51] and perl > python [21:31:52] and it supports all functions that api.php supports [21:32:15] faceface: see some examples http://search.cpan.org/~exobuzz/MediaWiki-API-0.13/lib/MediaWiki/API.pm [21:32:18] exobuzz, I'm not so up on the api.... I mean I know it exists ... [21:32:27] thanks [21:33:35] you send a hashref, you get a hashref back, mostly. it handles edit tokens/timestamps for edits and loops for calling multiple times for large result sets [21:33:49] what is a token? [21:34:22] it is needed to do stuff to the wiki like edit.. it is to protect against someone linking to a "delete page" and someone logged in clicking it, type thing [21:34:35] there are also tokens based on page name [21:34:37] for rollback [21:34:54] basically it's a hidden field in the form when you edit normally on the wiki [21:35:09] exobuzz, OK, but with this interface I don't need to worry about them? [21:35:15] right [21:35:37] nice [21:35:40] there is an edit call which you can use which handles it. you could use the api call directly, in which case you would need to handle them [21:36:00] yeah, I remember seeing ppl talking about that in here [21:36:03] the list call handles getting data which might have a "continue" result, meaning there are more results to get, and it handles that [21:36:19] ok, can I list history? [21:36:27] you can also supply a hook for that, for example, so your bot can process 250,000 pages gradually [21:36:42] how do you mean hook? [21:36:44] you can list anything the mediawiki api supports http://www.mediawiki.org/wiki/API [21:36:49] history included i think [21:37:26] http://www.mediawiki.org/wiki/API:Query_-_Properties#revisions_.2F_rv [21:37:32] exobuzz, out of interest, do you know about the xml interface? i.e. are you good at xml?
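The "continue" loop and per-batch hook exobuzz describes aren't Perl-specific; any api.php client runs the same pattern. Here is a minimal Python sketch of it. The helper names (`query_all`, `fake_fetch`) and the canned responses are illustrative, not part of MediaWiki::API; the stubbed dicts just mimic the shape of a 2008-era `list=allpages` JSON response, and a real client would fetch them from api.php over HTTP with `format=json`.

```python
import json

def query_all(fetch, params, hook=None):
    """Follow MediaWiki-style 'query-continue' results until exhausted.

    `fetch` stands in for an HTTP call to api.php and must return the
    decoded JSON response as a dict. `hook`, if given, is called once
    per batch so huge result sets can be processed gradually.
    """
    params = dict(params)  # copy so we can merge continue params safely
    results = []
    while True:
        response = fetch(params)
        batch = response.get("query", {}).get("allpages", [])
        if hook:
            hook(batch)
        results.extend(batch)
        cont = response.get("query-continue")
        if not cont:
            return results
        # Merge the continuation parameters into the next request.
        for values in cont.values():
            params.update(values)

# Stubbed api.php responses (illustrative shape only):
pages = [
    json.loads('{"query": {"allpages": [{"title": "A"}, {"title": "B"}]},'
               ' "query-continue": {"allpages": {"apfrom": "C"}}}'),
    json.loads('{"query": {"allpages": [{"title": "C"}]}}'),
]

def fake_fetch(params):
    # Pop the next canned response; a real fetch would hit api.php
    # with format=json and json-decode the body.
    return pages.pop(0)

titles = [p["title"] for p in query_all(fake_fetch, {"list": "allpages"})]
print(titles)  # → ['A', 'B', 'C']
```

Edit tokens work the same way one layer up: the client first asks the API for a token, then includes it in the write request, which is the bookkeeping the module's edit call hides.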
[21:37:49] (I is just curious) [21:37:56] i know xml, but i don't use the xml interface. [21:38:06] i use the json interface, as the perl json::xs module is v.fast [21:38:08] sure [21:38:16] and it's less "bandwidth" than xml [21:38:27] hihi [21:38:39] *Splarka uses json with callback, no parsing or eval needed <3 [21:39:01] just define the callbacked function and iterate over the pseudoobject's properties [21:39:19] (that assumes you're using the One True unlanguage, javascript, of course) [21:39:24] hehe [21:39:29] PERL ftw! [21:39:36] exobuzz, I have (had) a major 'pywikibot' project on the horizon, I'll sure look into this pm :D [21:40:06] perl > 1/0 [21:40:10] perl is not made by oysters, python is not made by snakes, at least javascript is honest.. written with coffee [21:40:25] faceface: we manage a few thousand pages on our site through a bot which now uses my perl api. before we used another pm. http://www.exotica.org.uk/wiki/UnExoticA [21:40:37] Splarka: haha [21:41:21] exobuzz, cool [21:41:27] I like real exotica too [21:41:37] you listen to vegas vicks tiki lounge? [21:42:04] no.. i think you will find this site rather different from what you might think. it is a geeky retro computing site [21:42:06] repton! [21:42:15] full of exotic audio from amiga games etc [21:42:33] exobuzz, but don't ignore the lounge core [21:42:44] i will look it up ;-) [21:43:41] where do I search? [21:43:55] OK, got it [21:44:25] what is that dude's name, just under the I of exotica? [21:44:29] I thought it was repton [21:45:06] that is brendan macgyre.. in his turrican suit! [21:45:15] turrican! [21:49:07] What's up wiki people? Question: I have set up a parser that uses mediawiki formatting and the toolbar that is at the top of the wiki content creation window is not showing up on my staging server. It shows up when I am working with it on my local host, but not on stage. The codebases are exactly the same, but the bluish toolbar doesn't show up on stage... any ideas?
[21:49:44] ryno, proxy? [21:50:47] faceface: what do you mean? [21:51:03] is your 'staging server' serving via a proxy server? [21:51:37] I don't know much about ht...pd? architecture, but I had a headache using a reverse proxy Apache server setup the other day [21:51:56] if you don't understand then it prolly isn't [21:52:38] :-) Well I just shot our guy an IM to see [21:53:05] the thing with 'proxy servers' is that they can get paths screwed up [21:53:31] so the 'local' version works, but the 'outside' version (with identical backend) doesn't [21:54:34] exobuzz, Couldn't find package uade [21:54:46] how do I install a player on linux [21:55:04] brion [21:55:19] do you think you can be free on 25 jan - 1 feb ? [21:57:26] uade http://zakalwe.fi/uade/ [21:57:34] build from source [21:59:25] 03(mod) DynamicPageList - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11112 (10mohsensalek) [22:00:03] build makes fan go [22:00:47] exobuzz, wait now... is lha the music or the game? [22:04:37] Couldn't find package libao [22:04:45] *faceface hate at ubuntu [22:06:27] 03(mod) DynamicPageList - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11112 (10mostafa) [22:06:52] exobuzz, :-D [22:12:25] faceface: That looks like what is happening. I need it to reference sites/all/modules/mediawiki/application/skins/common and it is being referenced as /wiki/skins/common/ [22:13:00] ryno, hmmmm.... [22:13:20] this music is helping... but I am not sure my skills are up to it... [22:14:04] faceface: Do you think that is being set in the configuration for mediawiki or is the server just messing things up? I am a mediawiki newcomer... [22:14:16] ryno, lemme think [22:14:32] it's literally the same config inside and outward facing? [22:15:09] yeah....exactly the same. I am using subversion and I have removed and updated.... [22:15:30] you need to know how the reverse proxy is configured... i.e. what is the internal vs.
the external path, and what is the proxy doing [22:15:41] removed and updated what? [22:15:58] The files...through subversion [22:16:03] OK, subversion of your local project [22:16:07] yeah [22:16:20] ryno, well, basically yeah... this could be a MW config thing [22:16:30] because the paths are changing... [22:16:38] our staging server is not serving via a proxy server [22:16:40] yeah [22:16:46] do you configure any 'pretty url' stuff in MW, or have you not touched that? [22:16:47] I will just have to keep looking [22:17:02] I don't think I have messed with that [22:17:03] oh... I figured it was the proxy... [22:17:20] OK, you need to look at variables to do with 'server' [22:17:21] 03ialex * r39443 10/trunk/extensions/Configure/ (CHANGELOG Configure.php Configure.settings-core.php): Added $wgEnablePersistentCookies and $wgEnableHtmlDiff [22:17:23] wgServer? [22:17:51] also, depending on apache config, it could be something strange [22:18:27] if you can, paste your LocalSettings.php, and the relevant section of your ApacheConf, and I am sure ppl here can get you sorted [22:19:05] also provide details of URL structure and file system structure... put all that into one email / paste... [22:19:52] Let me round up the stuff on the stage... [22:24:20] OK, found it finally, check $wgScriptPath, $wgArticlePath, $wgUsePathInfo and $wgServer. [22:24:31] also make sure your css isn't using absolute URLs [22:24:36] night [22:36:34] 03dantman * r39444 10/trunk/extensions/SemanticMediaWiki/ (43 files in 12 dirs): [22:36:34] Initial phase of fixing SMW's message short circuiting. [22:36:34] Adding wfLoadExtensionMessages before all messages and on special pages. [22:36:34] Unfortunately a few special pages are still using functions instead of classes and because of this we cannot defer extension messages till they are fixed. [22:43:24] Hello, I have a simple question: is there a policy in mediawiki to disallow reuse of previous passwords, and how many?
like on Active Directory [22:44:10] 03dantman * r39445 10/trunk/extensions/SemanticMediaWiki/ (2 files in 2 dirs): Moving that commented out message into the actual message file. [22:48:34] Gdgourou: nope [22:48:45] brion : ok [23:00:47] 03dantman * r39446 10/trunk/extensions/SemanticForms/ (includes/SF_GlobalFunctions.php specials/SF_UploadWindow.php): Add groups to Special pages and make the UploadWindow unlisted. [23:03:47] 03(mod) Using UserLoadFromSession hook causes segfault - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14178 +comment (10rlane32) [23:05:31] 03dantman * r39447 10/trunk/extensions/SemanticForms/languages/SF_Messages.php: Add a message for the Semantic Forms specialpage group. [23:10:38] 03river * r39448 10/trunk/tools/switchboard2/ (util.cc version.h): [23:10:38] V-2.0.8 [23:10:38] - use poll() instead of select(), significant performance improvement at 200+ concurrent connections [23:12:12] 03dantman * r39449 10/trunk/extensions/ (SpamRegex/SpamRegex.php regexBlock/SpecialRegexBlock.php): Special page groups for Spam regex and Regex block [23:14:17] 03dantman * r39450 10/trunk/extensions/regexBlock/SpecialRegexBlock.php: Oh right, globals... nasty little things... [23:24:52] 03river * r39451 10/trunk/tools/switchboard2/ (request_thread.cc request_thread.h): timed out requests would retry instead of throwing an error [23:41:53] 03werdna * r39452 10/trunk/tools/planet/en/config.ini: Remove Scarian, because of this post: http://wiki-elysium.blogspot.com/2008/08/announcement-to-wiki-community.html [23:42:30] brion_: does something need to be done to sync the planet config? [23:42:47] you need to run the secret update incantation on the planet server [23:43:00] Werdna: it syncs every hour [23:43:23] AH. [23:43:26] T_T Ack.. curses... damn variable setup [23:43:39] *Werdna de-caps. [23:44:19] brion_: Am I treading on anybody's toes by removing something from there? [23:45:14] depends what you're removing [23:46:09] well.. 
the post was an image of a former Wikipedia contributor (who is transgender), and the text "I'm in love with the bra-less Kelly Martin... She lets it all hang out... What a catch!" [23:48:46] Although, the link to the post doesn't work now, it seems to have been removed. [23:49:33] well that sort of crap has no place on planet [23:49:57] that's why I removed it. [23:52:00] ... [23:56:24] anyway, time for me to go off and do some work :(