[00:01:37] 03skizzerz * r31332 10/trunk/phase3/skins/common/images/spinner.gif: updating spinner.gif image to work with non-white backgrounds [00:03:41] 03(mod) Categories need piping feature to list by alternative name - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=491 +comment (10t.d.lee) [00:08:46] how do you change the name of the main page? [00:11:38] newrkv: [[MediaWiki:Mainpage]] ? [00:11:40] *kibble thinks that's it [00:12:17] yeah, that's it [00:16:03] 03(mod) Though user is logged in, switching pages seems to force a logout - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13082 (10cwisniewski) [00:54:25] 03(mod) Though user is logged in, switching pages seems to force a logout - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13082 (10brion) [01:16:02] http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/includes/GlobalFunctions.php?r1=31327&r2=31326&pathrev=31327 [01:16:12] well that was a well tempered statement :D [01:22:13] hehe [01:24:01] lol [01:28:04] http://us3.php.net/manual/en/function.array-merge.php , makes you cry [01:33:14] 03(mod) Preference to toggle edit tools - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11130 (10N/A) [01:38:13] is there an easy way to get a list of pages that inadvertently link to themselves? [01:42:57] 03dale * r31333 10/trunk/extensions/MetavidWiki/maintenance/metavid2mvWiki.inc.php: maintenance updates [02:00:42] Hello, if I'm adding a namespace to a wiki in Spanish for example, should I use $wgExtraNamespaces[101] = "Idiomas_talk"; or _discussión ... discussión being talk in Spanish. Can anyone help please? [02:04:32] well I would assume you use _talk since you can actually navigate to the Talk:article page and it takes you to the Discussión:article page. [02:10:32] 03(mod) Pressing Enter in edit summary should not save the page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13065 +comment (103tx38k) [02:13:39] Ok thanks, that helps a lot have a nice evening!
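A minimal LocalSettings.php sketch of the custom-namespace question above. The namespace indexes and the localised talk-page label are illustrative assumptions; MediaWiki keys on the even/odd subject-talk index pair, not on the English "_talk" suffix, so any label works:

```php
# LocalSettings.php: custom subject/talk namespace pair (indexes are examples).
# Custom namespaces come in pairs: even = subject, odd = its talk namespace.
$wgExtraNamespaces[100] = "Idiomas";
$wgExtraNamespaces[101] = "Idiomas_discussión";
```

With this, [[Idiomas_discussión:Página]] resolves as the talk namespace of [[Idiomas:Página]]; the talk tab on those pages uses whatever label is assigned to index 101.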
[02:18:06] 03(mod) Pressing Enter in edit summary should not save the page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13065 +comment (103tx38k) [02:20:21] ehm [02:50:40] hi y'all. i'm kinda confused on passwording pages [02:50:54] anyone got a URL that 'splains it for the limited intellect? [02:50:56] Passwording pages? [02:51:04] You mean protecting? [02:51:15] yes [02:51:30] limiting access [02:51:37] Are you looking on how to protect them? [02:51:49] i'd like to limit access to them [02:51:53] First, click the tab at the top that says "protect" [02:52:46] 03(mod) EXIF handling ignores tags with Count >= 1 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13172 04CRIT->normal; +comment (10tstarling) [02:54:15] Soxred93: i don't think that's what i want [02:54:29] Oh, you mean restricting ALL pages? [02:55:09] Soxred93: no, i have some pages i'd like to limit to a specific group of people ( users ) [02:55:59] Ok, you only want a few of the pages protected? [02:56:11] i guess [02:57:32] On all the pages you want protected, click the "protect" tab [02:57:58] ok [02:59:15] Under the "Edit" column, select "Block new and unregistered users" for only users to edit, or "Block all non-admin users" for sysops only. [02:59:39] i need it more fine-grained than that :( [03:00:00] some of this group of people will be 'new users' [03:00:07] Hmm...I don't know how to put it into easier terms than that [03:00:27] i saw what you just said when you first mentioned "protect" [03:01:13] Hmm...I am not good at explaining this anymore. [03:01:15] Sorry [03:01:50] well, the options in the "protect" field aren't what i was looking for [03:02:18] do you want to protect from 'read' or just from 'write' ? 
[03:02:27] i need a access-limited-to-a-group-of-people protection scheme [03:02:35] some will be new [03:02:38] some will be ancient [03:03:34] 03(mod) Double-underscore magic words interpreted in links - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13164 summary (10tstarling) [03:04:13] nalioth: do you want to protect these pages from being readable or just editable? [03:04:20] nalioth: http://www.mediawiki.org/wiki/Category:Page_specific_user_rights_extensions http://www.mediawiki.org/wiki/Extension:Page_access_restriction [03:04:37] Splarka: readable only by a select group of folks [03:04:44] 03dale * r31334 10/trunk/extensions/MetavidWiki/ (10 files in 7 dirs): [03:04:44] * media search automatic *logical* subject grouping [03:04:44] * absolute links for metadata export [03:04:44] * maintenance updates [03:04:44] * stubs for json stream metadata export [03:04:44] * updates for semantic browser inline media links rewrite [03:04:55] nalioth: right, mediawiki is not a content management system... [03:05:09] what Soxred93 was describing was protection from editing [03:05:15] Splarka: ok [03:05:17] nalioth: you can look at the Lockdown extension, but mediawiki is really not designed for this [03:05:53] read: http://www.mediawiki.org/wiki/Security_issues_with_authorization_extensions [03:06:12] per flyingparchment, http://www.mediawiki.org/wiki/Extension:Lockdown nalioth [03:06:20] arr [03:07:59] flyingparchment: kirby: thanks [03:08:09] what would be the best software for this purpose? [03:08:10] np [03:08:27] the extensions are installed on top of the existing mediawiki software [03:10:16] right. 
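Before reaching for the Lockdown extension, core settings alone can close a wiki to everyone but logged-in users. A hedged LocalSettings.php sketch (whitelist page names are examples, and note this is wiki-wide, not per-page):

```php
# Only logged-in users may read pages.
$wgGroupPermissions['*']['read'] = false;
# Pages anonymous visitors may still see, so they can reach the login form:
$wgWhitelistRead = array( 'Main Page', 'Special:Userlogin' );
# Stop visitors from simply creating their own accounts:
$wgGroupPermissions['*']['createaccount'] = false;
```

This matches the "parallel wiki with restricted read access" approach suggested below; per-page restrictions within one wiki are exactly what the linked Security_issues_with_authorization_extensions page warns about.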
[03:10:17] nalioth: you could create some parallel wikis with all read access restricted except to logged in users, and restrict user creation except by you [03:10:31] Splarka: yeah, i've thought of that [03:10:43] or create some folders and .htaccess password lock them [03:10:52] Splarka: thought of that, too [03:11:00] *nalioth just needs a secure whiteboard [03:11:52] as you can see, there are at least 13 paths to data insecurity in a mixed-security mediawiki [03:12:04] anyone recommend a CMS ? [03:12:10] 03(mod) updaters.inc will never load interwiki.sql - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12200 04BLOCKER->normal; +comment (10tstarling) [03:12:39] you could always make local install and test it that way, then when you get it working the way you want it to. Make the extension(s) like on the wiki you're referring to. [03:12:56] s/like/live [03:12:58] kirby: yep, i kinda needed it sooner than later (and i have a local wiki ) [03:13:39] 03(new) Timstarling keeps taking away our blockers! - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12345 04BLOCKER +comment (10splarka) [03:13:59] heh [03:14:33] *MrZ-man notes bug 12345 is actually "Server should return a 410 HTTP status code for deleted pages" [03:14:57] eww, that'd be silly... [03:15:04] unless it was like action=raw or action=render [03:15:21] or maybe returnstatus=true [03:15:27] Resolution: WONTFIX [03:15:33] then it could do 404 and 410 and 505 and 8675309 [03:16:01] 402! heh [03:16:17] 403 for action=edit on pages you're banned from [03:16:27] 409 for edit conflicts [03:16:36] 402 Payment Required... [03:16:51] subst/payment/donation [03:17:33] 201 for new article creation [03:19:59] splitsville [03:37:46] I just installed Mediawiki and I want to change the URL from DOMAIN.com/mediawiki to DOMAIN.com/wiki, entirely. How can I do this? [03:38:03] I tried changing wgScriptPath but then it just breaks the php files... Where does it alias itself with Apache? 
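The .htaccess idea mentioned above, sketched for Apache. Paths and the password file location are assumptions, and the directory must allow `AllowOverride AuthConfig`:

```apache
# .htaccess in the directory to protect
AuthType Basic
AuthName "Restricted wiki"
# Password file created beforehand with: htpasswd -c /home/example/.htpasswd someuser
AuthUserFile /home/example/.htpasswd
Require valid-user
```

This protects everything served from that directory at the web-server level, independent of MediaWiki's own permission system.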
[03:41:57] well, thanks y'all [03:42:12] i just made a new subforum on my phpbb3 [03:42:19] seemed quicker and easier [03:42:41] y'all take care [03:44:27] 03(FIXED) Enable new parser on Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10tstarling) [03:45:28] 03(FIXED) Unclosed transclusion eats next section in rendering, but not in section editing - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2375 (10tstarling) [03:45:37] 03(FIXED) HTML headings point to nonexistent sections when transcluded - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=4034 (10tstarling) [03:45:51] 03(FIXED) Double-parse in parser function results - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=5678 (10tstarling) [03:46:09] 03(FIXED) Section edit links for transcluded templates are not affected by <noinclude> or <includeonly> - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6563 (10tstarling) [03:46:32] 03(FIXED) Section edit links generated from templates are wrong - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=7083 (10tstarling) [03:48:04] 03(FIXED) Unexpected output on bad input ("<") to {{#time: }} parser function.
- 10https://bugzilla.wikimedia.org/show_bug.cgi?id=7140 (10tstarling) [03:48:58] 03(FIXED) h1 Tag causes saved page to have missing data (section editing) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=7713 (10tstarling) [03:49:34] 03(FIXED) References don't hide under wrong #if - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9083 (10tstarling) [03:50:16] 03(FIXED) Invalid heading markup made valid by template expansion, causes ghost section - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9156 (10tstarling) [03:51:02] 03(mod) Install Labeled Section Transclusion extension on fr.wikipedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11766 (10tstarling) [03:51:04] 03(mod) Enable new parser on Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10tstarling) [03:51:18] 03(FIXED) Template section edit issues: ghost sections and lost sections - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11911 (10tstarling) [03:51:38] 03(FIXED) Impossible to rename or protect a page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12772 (10tstarling) [03:52:08] 03(mod) LabeledSectionTransclusion: Localized tags do not work with new parser - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13024 (10tstarling) [03:52:12] 03(mod) Enable new parser on Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10tstarling) [03:52:36] 03(FIXED) Wrong section from section edit button (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=4899 (10tstarling) [03:56:23] 03(FIXED) <references/> on a page doesn't list <ref> tags from included templates - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8693 (10tstarling) [03:57:23] 03(mod) Template parameters unavailable to XML-style parser tags - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=2257 (10tstarling) [03:57:24] 03(mod) <references/> on a page doesn't list <ref> tags from included templates - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8693 (10tstarling) [03:57:28] 03(mod) Enable new parser on
Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10tstarling) [03:58:31] 03(mod) Enable new parser on Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10tstarling) [04:00:47] not bad for a performance project eh? [04:01:29] *Splarka gives tim a cookie [04:13:04] 03(NEW) Add a lang="mediawiki" to syntax highlighting - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13173 15enhancement; normal; MediaWiki extensions: SyntaxHighlight (GeSHi); (md5) [04:16:22] Nadir_Seen_Fire: how far rewritten was GeSHi 2.whatever for not using broken regex? [04:16:59] I'm not an expert... But it's supposed to be more like an actual 'code parser' [04:17:19] What I can say, is unlike anything done in regex... It can actually handle nested languages [04:17:20] but it only supported like a handful of the modes the regex, right? [04:17:30] ^the regex does [04:17:33] ie: PHP nested inside HTML [04:17:42] I don't know to much about it [04:18:39] one sec [04:19:13] *Splarka tries to remember who was keeping him updated on it last year [04:21:54] Well here's one note... [04:22:04] One thing regex would never be able to do [04:22:19] GeSHi2 actually links certain things [04:22:46] nice [04:22:47] A number of the native functions are linked to their php-docs when you highlight their code [04:23:30] data types also have a title attribute which tells you what type of data they are when you hover over them [04:24:08] (ie: Hover over a $a to see php/php/varstart and over a 2 to see php/php/num/int) [04:24:56] Well... actually... 
nearly every part of code does that [04:26:59] as soon as text is involved, you start needing internationalisation [04:27:31] I bet the text that comes up on mouseover isn't translated into 150 languages [04:28:30] bbl [04:29:58] 03ivanlanin * r31335 10/trunk/extensions/ (4 files in 3 dirs): [04:29:58] Indonesian (id) localisation updates for 3 extensions: [04:29:58] * DoubleWiki [04:29:58] * FlaggedRevs [04:29:58] * ParserFunctions [04:34:05] 03(mod) @ (at-sign) needs to be totally invalid for usernames - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12581 +comment (10dan_the_man) [04:40:59] 03(mod) index.php does not honor the variant param with action=raw - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12683 (10jhecking) [04:53:38] 03brion * r31336 10/trunk/phase3/ (3 files in 2 dirs): (log message trimmed) [04:53:38] * (bug 12723) OpenSearch description name now uses more compact language code [04:53:38] to avoid passing the length limit as often, is customizable per site via [04:53:38] 'opensearch-shortname' message. [04:53:38] Note that the spec actually says ShortName should be trimmed to *16* chars, [04:53:39] which is pretty awful -- 'Wikimedia Commons' doesn't even fit. I haven't [04:53:43] changed the current trimming (to 24 characters). [04:56:42] 03brion * r31337 10/trunk/phase3/ (3 files in 2 dirs): Tweak up last commit about opensearch desc to be nicer. Cleaner message name, use wfMsgForContent, remove an obsolete line. [04:57:15] 03(FIXED) Trimmed name for Firefox search plugin - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12723 +comment (10brion) [05:11:31] 03(mod) Support crosswiki blocking - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12987 (10N/A) [05:11:33] 03(mod) Support global blocking - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8707 (10N/A) [05:47:36] 03raymond * r31338 10/trunk/phase3/ (3 files in 2 dirs): Add new message from r31337 to the maintenances scripts. 
[05:53:13] 03(FIXED) Images in dumpHTML broken - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12122 +comment (10tstarling) [06:10:03] 03tstarling * r31339 10/trunk/phase3/includes/Database.php: Bug 11980: don't complain on rollback of non-transactional tables. Not sure how to reproduce it, but this should fix it in any case. [06:11:44] 03(FIXED) Image deletion broken on MyISAM - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11980 +comment (10tstarling) [06:19:21] 03tstarling * r31340 10/trunk/phase3/ (4 files in 3 dirs): Replaced maintenance/dumpHTML.php with a message pointing to the DumpHTML extension. Deleted subsidiary files. [06:38:14] 03tstarling * r31341 10/branches/REL1_12/phase3/ (5 files in 4 dirs): Backporting r31339 and r31340: fixes for bugs 11980 and 12122 [06:40:46] 03ivanlanin * r31342 10/trunk/extensions/ (7 files in 7 dirs): (log message trimmed) [06:40:46] Indonesian (id) localisation updates for 7 extensions: [06:40:46] * CentralAuth [06:40:46] * DynamicPageList [06:40:46] * LabeledSectionTransclusion [06:40:47] * Nuke [06:40:49] * ProofreadPage [06:48:06] hi, i am installing media wiki on a usb drive, [06:48:32] i have it up and running and just want to know how to configure it [06:48:49] do i need to physically edit a php file [06:48:58] by that i mean manually [06:50:03] btw its 1.11. somthing on PHP5 [06:52:20] LocalSettings.php ? [06:53:10] yes [06:53:16] how do i edit it [06:53:35] open with a text editor? [06:54:01] ooh [06:54:06] i think that works [06:55:30] TimStarling: shouldn't RELEASE_NOTES in 1_12 be updated when backporting fixes? [06:59:21] absolutely [07:02:45] 03ivanlanin * r31343 10/trunk/phase3/ (RELEASE-NOTES languages/messages/MessagesId.php): Indonesian (id) localisation updates for core messages. [07:06:44] 03tstarling * r31344 10/trunk/phase3/includes/media/Bitmap.php: Partial solution (quick hack) for bug 12970. 
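For the portable-install question above: LocalSettings.php is plain PHP, so "configuring" the wiki really is just opening it in a text editor, changing variables, and saving; there is no compile or restart step. A sketch with example values (the names are real core settings, the values are assumptions):

```php
# LocalSettings.php is generated by the installer; edit and save.
$wgSitename      = "My USB Wiki";       # example value
$wgServer        = "http://localhost";  # example value, base URL of the wiki
$wgEnableUploads = true;                # turn on file uploads
```

Any setting not mentioned in LocalSettings.php keeps the default from DefaultSettings.php, which should never be edited directly.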
[07:09:36] 03siebrand * r31345 10/trunk/extensions/Translate/ (17 files in 2 dirs): Some whitespace fixes [07:11:38] where do i fing httpd.conf? [07:13:44] crap, Invitations doesn't show up on Special:Specialpages [07:13:48] I should fix that, eh? [07:22:21] found it [07:22:30] now where do i put this code [07:22:34] HELLO! [07:24:30] im doing http://www.mediawiki.org/wiki/Manual:Short_URL [07:26:47] Simetrical: around? [07:29:56] hello [07:32:45] 03werdna * r31346 10/trunk/extensions/Invitations/Invitations.i18n.php: Fix log-page, add Special:Specialpages entry for Invitations extension [07:33:06] the code is Alias /wiki /var/www/w/index.php [07:38:38] you all fucking suck [07:41:39] So there. [07:42:47] lol [07:44:02] why is linking between namespaces not recommended on mediawiki? [07:44:24] even wikipedia is full of them so why recommend not using them? [07:47:31] who says it's not recommended? [07:47:36] 03ivanlanin * r31347 10/trunk/extensions/ConfirmEdit/ConfirmEdit.i18n.php: Indonesian (id) localisation updates for ConfirmEdit extension. 
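Putting the short-URL pieces from the exchange above together: the Alias directive goes on the Apache side, and LocalSettings.php must agree with it. Paths follow Manual:Short_URL's example layout (scripts in /w, pretty URLs under /wiki), not necessarily this user's setup:

```php
# Apache side (httpd.conf):
#   Alias /wiki /var/www/w/index.php
# MediaWiki side, in LocalSettings.php:
$wgScriptPath  = "/w";        # directory where index.php really lives
$wgArticlePath = "/wiki/$1";  # the pretty URL shown to readers
$wgUsePathInfo = true;        # allow index.php/Page_title style requests
```

Putting the wiki's script directory and the article path at the same location (both /wiki) is the classic mistake here; they must differ for the Alias approach to work.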
[07:50:48] I was just listening to a couple of audio captcha examples: recaptcha and google [07:51:20] they're certainly formidable for any cracker, if I were trying to break in, I'd pick the image captcha any day [07:52:38] TimStarling: well for example this extension: http://www.mediawiki.org/wiki/Extension:CrossNamespaceLinks which is used on wikipedia [07:52:50] they both use recordings of spoken numbers as a starting point, not TTS [07:52:56] probably with a large dictionary [07:53:43] and they both seem to use small (sub-word) snippets of human voice as their noise model [07:55:54] the noise has a similar amplitude to the spoken numbers [07:57:07] humans have a quite extraordinary ability to pick out a single voice at a noisy party, I don't believe any speech recognition engine can come close to that [07:57:57] piksi: that extension is an aid to policy enforcement on wikipedia [07:58:30] if you don't have any such policy, then obviously you don't need the extension [07:59:47] TimStarling: ok, i was thinking it was rather a technical restriction, thanks for clarifying [08:13:01] can someone has the idea for a recommend for a pdf creating extension? [08:15:29] *TimStarling wonders if marvxxx is a lolcat [08:17:51] lolcat? ;-) [08:18:04] ja its early here...and my english gets even worse [08:18:18] when i didnt got my doubled espresso ;-) [08:19:17] can anybody explain to me how i upgrade my wiki°!? i'm quite new to this wiki stuff and want to upgrade my 1.10 to the in a half year version. how complicate is a upgrade. what about my extension? do i have to check if the are comptable with the new version? how big are usually the changes between the mw versions? [08:23:46] <[algo]> can I write retricted HTML in mediawiki ? 
[08:23:51] <[algo]> *restricted [08:25:06] 04(REOPENED) Support crosswiki blocking - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12987 +comment (10pathoschild) [08:25:39] mondschein: http://www.mediawiki.org/wiki/Manual:Upgrading_to_1.11 [08:25:44] [algo] see http://www.mediawiki.org/wiki/Manual:$wgRawHtml maybe [08:25:48] and the "related extensions" [08:26:24] 03(mod) Support global blocking - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8707 (10pathoschild) [08:27:50] <[algo]> ok [08:27:58] <[algo]> I want raw for specific users [08:28:15] <[algo]> seems I can do it [08:28:24] has anybody already uupgraded his own wiki!? does the upgrade work pay°? [08:29:51] i mean is it a lot of work....? [08:30:10] no it isn't [08:30:41] just unpack the .tar.gz to the dir and run the update script (there are detailed instructions on mediawiki site) [08:33:05] ah ok...thanks a lot for the informations and personal opinion...it's just that if i read instructions it's all clear and seems quite easy....but reality sometimes differ a lot...that's why i'm asking... :-) [08:33:08] and one last question: how often do mw release a new version°? [08:39:07] does anyone use the pdfexport extension with mediawiki 1.11.1? i get a error when i get on the special page for pdfprint [08:44:59] TimStarling: got a minute? page_props gave me an idea of implementing a "wikidata light" type of feature, and i'd like to know if you think it would work. [08:45:57] basically, the idea is to have a parser function like {{#property:key|value}} that would allow you to directly set key/value pairs in page_props (or a new page_data table with a similar structure). [08:46:06] would that be sane? [08:47:00] tim: what would you do if you got asked a lot of hypothetical questions? [08:47:11] *Splarka smirks [08:48:00] <[algo]> can I make links case-insensitive [08:48:01] <[algo]> ? 
[08:48:48] yes [08:49:28] $wgCapitalLinks [08:50:02] no [08:50:06] he says *in*sensitive [08:50:30] bah [08:50:42] [[Page|PaGe]] ? heh [08:51:18] Splarka: i'd be interested in your oppinion about this as well [08:52:01] hmm, who was working on a case-insensitive link system... [08:52:14] Duesentrieb: maybe [08:52:19] i started on it once, but never finished [08:52:26] the schema might have to be beefed up a bit [08:52:50] Dues: I know little of the actual database structure [08:53:26] we've had this idea before, but it never got to the user interface stage [08:53:47] TimStarling: well, my idea was to simply not have the ui problem by not using forms/masks [08:54:05] just use a parser function for settiong values, and assign them to the page. simple and flexible. [08:54:22] it would be easy enough to write an interface, but that could be done later [08:54:30] once it's clear what people are actually doing with the feature [08:54:32] also true [08:54:33] well, nobody ever did, last time around [08:54:49] here's one thing i would like to do: collect bibliography records. [08:54:50] write-only memory isn't particularly useful [08:55:02] they could be used as templates in [08:55:05] you'd need to have some initial application [08:55:49] collecting data that would would be smple enough, and i guess i could get a lot of support quickly. the next obvious thing would be to make the data useful. [08:56:02] that is, there would have to be a special page for querying it in a meaningful way [08:56:33] i guess the method of setting and maintining the properties could be generic, but the special page for querying it would have to be specific to the task [08:56:51] in the case of bibliographic records, "list all publications by X", for example [08:57:19] what do you think? [08:58:03] where X is a page title? [08:58:08] what kinds of properties will the page_props table store? 
[08:58:33] it stores one property currently: the hiddencat property [08:58:40] TimStarling: no, X would be a page property. the page titles would be listed (each page would be a book) [08:59:01] oh ok [08:59:05] <[algo]> I want to make wiki-manual [08:59:07] <[algo]> so that [08:59:13] TimStarling: yes, i know. and i'm not sure if "content" properties should share a table with "meta" properties (brion said it was "silly") [08:59:14] <[algo]> wiki.javascript.ru/getElementById [08:59:22] <[algo]> and wiki.javascript.ru/getelementbyid will be same [08:59:23] so you'd need indexing by value [08:59:35] TimStarling: by (key, value) [08:59:36] <[algo]> redirecting to canonical page is fine [09:00:17] TimStarling: and perhaps by (page, key, value) to quickly find pages with/wothout a specific property or peroperty value [09:00:25] err, wait, no. [09:00:35] that's the wrong way. anyway, you know what i mean. [09:00:40] well, currently page_props just has (page,key) [09:00:57] ah, so it's only flags currently. [09:01:06] the minimum for my application would be (page, key, value). [09:01:15] you can retrieve the value efficiently, indexed by key [09:01:21] but you can't retrieve the row indexed by value [09:01:38] you could go all crazy with something like (page, key, unit, value, comment) and also define the datatype of properties, and validate input [09:01:59] have you read about SMW? [09:02:36] not recently, but i was part of the panel that presented it at wikimania 2005 :) [09:02:43] smw seemed rather over-complicated when i looked at it [09:03:06] I think the syntax has changed since then [09:03:07] yes, smw doesn't scale to mw. 
wikidate (as used by omegawiki) is also very complex [09:03:13] they use double-colons now [09:03:15] i was looking for something quick and simple [09:03:29] something that could go live on wikipedia in a matter of months, something that can be introduced gradually [09:04:00] parser-functions would allow this to be done without thouching the syntax, or the core at all, actually. [09:04:22] but I was going to say, SMW uses typed links [09:04:40] which is a nice way to implement searches like "list of books written by [author]" [09:04:44] yes. in addition to properties. [09:04:54] smw is far more powerful, and far more complex [09:05:05] it's a lot harder to maintain, and asks a lot more of the editors. [09:05:33] now there's a 'shareduploadwiki' message? how would that be different than the 'sharedupload' message? [09:05:34] i like the concept, but i don't think it's feasable for wikipedia. i'm not sure it would scale on the db side, and i'm pretty sure it wouldn't scale on the editor's side [09:06:35] TimStarling: anyway, onece we have structured data, it's easy enough to convert it to a different system. just having it in the first place would be a huge step [09:06:40] the idea we had at wikimania 2005 was to use the information in infoboxes [09:06:50] that was one of the ideas [09:06:57] add metadata tags to infobox parameters [09:07:01] dbpedia has that pretty much covered [09:07:21] what i am suggesting can easily be used with infoboxes [09:07:34] anyway, I think if you want incremental development, it has to start off abstract [09:07:36] the infobox template would just forward parameters to the {{#property}} parser function [09:07:45] and it has to have an initial concrete application [09:08:45] yes. so my idea would be to have generic a page/key/value store, and a generic parser function to set that data. 
and apply it for bibliographic records, and perhaps for the "personendaten" stuff at dewp (which was also discussed at that panel btw) [09:09:01] perhaps i'll write a blog post about it. [09:09:47] bah, i should be working on my thesis, not blog abotu crazy ideas :) but then, my thesis is about extracting semantics from wikipedia, and i read a lot of papers about getting structured data... so this is kind of obvious [09:10:11] (even though what I'm trying to do is a bit different) [09:10:30] what do you mean by bibliographic records exactly? [09:11:38] <_wooz> lo [09:13:58] TimStarling: author, pub date, publisher, title, etc. what bibtex does. i'd suggest to have a namespace for that, and then use that info template style in citations. [09:14:36] right, so one page per citation? [09:14:44] TimStarling: each page in that namespace would be one such record, and use page-props/data to store the individual values. the "face text" of the page could be used as a template, like {{Book:Knuth1997}} [09:15:02] TimStarling: well, the same book may of course be cited multipel times [09:15:11] sounds pretty cool [09:15:18] \o/ [09:16:12] TimStarling: the backgroujnd is, among other thigs, my frustration with online bibliographies (see http://brightbyte.de/page/The_Bibliography_Thing) and my thesis work (see http://brightbyte.de/page/WikiWord) [09:16:21] good morning people [09:16:36] how can i access the text below the Headlines in Parser methods? i mean which method handles them? [09:16:42] i guess i'll indeed write a page about it. watch the ticker for it :P [09:16:59] add a review tab to the bibliographic record [09:17:03] emilsedgh_: "text below the headlines"? [09:17:09] then we'll take over the world [09:17:22] == Foo == [09:17:23] Baaaar [09:17:32] Dralspire: which method handles Baaaar ? [09:17:48] emilsedgh_: err, all of them? it's just, yoiu know, wikitext that has to be parsed? [09:17:59] hm [09:18:12] TimStarling: hehe... namespace manager. but no need for that. 
my idea would be to include reviews (or at least abstracts) in a section [09:18:29] Duesentrieb: i want to add collapse/expand feature to it, so how could i for example put a
<div> on it? [09:19:10] emilsedgh_: yes. neither wikitext nor html handles sections as nested blocks. headings are just dividers in flat text. if you want to change that, you have to place divs manually. [09:19:55] Duesentrieb: manually? you mean something like: [09:20:09] == Foo ==
<div>Baaar</div>
[09:20:10] ? [09:22:23] emilsedgh_: helps to specify.. you want to use javascript to scrape the rendered page for headlines, and make all sections collapsible with the headers as the delimiters? [09:22:51] yes [09:23:47] is there a known bug with using 2 wikis on the same database with different schemas? [09:24:28] nakee: as long as you use propert table prefixes, you should be fine. [09:24:46] nakee: you mean with postgresql? [09:24:54] Duesentrieb: the table names are the same, I just put it in different schemas [09:24:59] yes in postgres [09:25:02] ah PG [09:25:07] no idea, talk to flyingparchment. [09:25:08] i don't see why that wouldn't work [09:25:13] it suppose to be good enough unless wiki is doing something funny [09:25:15] (as long as you set $wgDBschema properly, of course) [09:25:23] i was thinking of schemas as in slightly different layouts for different versions of mediawiki [09:25:29] make sure your search_path isn't being set to something screwey [09:25:41] nakee: what is the actual problem you're seeing? [09:26:15] emilsedgh_: the headers generate (in monobook at least), a
<h2><span class="editsection">[edit]</span> \n <span class="mw-headline" id="title">title</span></h2>
[09:26:25] flyingparchment: well after installing a second wiki on the same database but in a different schema the first wiki stoped working [09:26:31] Duesentrieb: wanna fix #13168? i'm lazy [09:26:37] nakee: could you be more specific? [09:26:38] you can grab the innerHTML (ew) of the #bodyContent and iterate over all matches for that... [09:26:44] flyingparchment: telling me no content on all the pages [09:26:45] Splarka: i already modified headers [09:27:02] in javascript? [09:27:55] Splarka: yes.i added onclick too tag.but i have to put some
on the content [09:28:09] on the content below that header [09:28:12] emilsedgh_: so parse the page before you modify the headers [09:28:15] flyingparchment: i promised myself to not do any codeing until i'm done with the thesis :) the fact that i hang out here and make plans to revolutionize wikipedia is bad enough :) [09:29:59] flyingparchment: it uses 1.11.0 and the new one is 1.11.1 [09:30:11] flyingparchment: moving it to another database fixed the problem [09:30:30] Splarka: im not getting what do you mean, could you please explain more? [09:31:38] sec [09:31:54] why don't you come in to #mediawiki-scripts which is for javascript fun [09:32:23] as this channel seems to be mostly about carpentry... though I don't know why mediawiki has so much to do with tables (this is a joke) [09:33:04] oh [09:39:12] 03(mod) Categories need piping feature to list by alternative name - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=491 +comment (10roan.kattouw) [09:41:29] <[algo]> any plugin for case-insensitive matching of titles ? [09:42:33] 03(NEW) HIDDENCAT categorizes images and pages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13174 minor; normal; MediaWiki: Categories; (codeispoetry) [09:43:04] Duesentrieb, I scrolled back to read your {{#property:key|value}} idea. How do you know the datatype of key? Is its value always a page? [09:43:22] i would expect the key and value to be any string [09:45:00] Well, SMW's property::value syntax [[ISBN::12345]] defaults to 12345 being a page, but you can say it's a string or a wiki page, number, temperature, etc. [09:46:49] 03(mod) Enable new parser on Wikimedia (tracking) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12652 (10mediawiki) [09:47:59] skierpage: i'm not planning on implementing relatiosn between pages this way. if we want that, we should go for SMW. 
the value would be a simple literal property, a string in the database; I could imaging that a datatype could be specified in order to allow validating input. but that only makes sense if there's a separate fields for the unit, and another field for comments like "estimate" or "in 2003" etc. (think population numbers in [09:47:59] infoboxes, etc) [09:48:57] Duesentrieb, there are several extensions that do "SMW lite", see http://www.mediawiki.org/wiki/Extension:Semantic_MediaWiki#Similar_extensions , I'm not familiar with them. [09:49:23] skierpage: i have looked at a few a while back, non seemed really cool to me. but i'll have a nother look [09:49:56] i'm working with the spellcheck extension: http://www.mediawiki.org/wiki/Extension:Spellcheck but somehow it doesn't work [09:49:57] if i change a word which is within the "wordlist" and then click show changes the word doesn't change in the edit box and though the added_words.txt has chmod 666 add to dictionary doesn't work...i work with 1.10 mw...can somebody help me°? [09:50:51] Indeed, it's hard to wrap your head around more than one of them, I wish there was a compare and contrast review of them all ;-). http://www.mediawiki.org/wiki/Extension:Infobox_Data_Capture ... ? [09:55:51] 03(mod) HIDDENCAT categorizes images and pages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13174 (10N/A) [09:57:32] any pdfexport users here? does images work with that extension? [09:58:04] skierpage: Data and WikiDB seem quite similar to my idea. when i get around to write something, i'll look into them some more [09:58:52] marvxx1: i use pdf2wiki and i couldn't get images goin' :( [10:00:07] mondschein: the page tells that you should disable deleting the temp file...but i dont know any single thing about php [10:00:47] mhm...maybe you have to disable it in localsettings.php!? [10:01:26] what pdf exporter do you use!? 
[10:02:32] http://www.mediawiki.org/wiki/Extension:Pdf_Export [10:02:37] this one [10:02:41] with htmldoc [10:04:33] 03(mod) Pressing Enter in edit summary should not save the page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13065 (10huji.huji) [10:04:59] mhm don't know this one...does it put tex etc. in pdf as well!? how about if a page is in a namespace!? [10:05:30] do you have a tmp folder in your image folder!? [10:05:57] mondschein: yes [10:06:05] i have a temp folder in images [10:07:11] mhm...maybe you have to look thourg upload.php or the code that is responsible for the upload progress....maybe there [10:07:30] oh well can't say I didn't try [10:07:43] a better doc wouldnt be bad [10:07:52] if he says i should disable removing...but how [10:08:49] mhm and the discussion pages gives you no hint!? [10:10:04] sorry i'm no big help for you [10:10:28] im searching through it without success [10:10:34] if you find no help here try the mailing list...maybe somebody there can help you [10:11:15] i know i'm having a lot of troubles with my extension as well and it always seems that i'm the only one who has that problem... :( or i'm to ***** whatever ;) [10:13:01] username too long "acridfusion_acrid" (DB username) [10:13:46] maybe just acridfusion_a would work huh [10:14:09] config/index.php [10:14:42] welp i'll try it [10:16:26] sweet [10:16:30] thanks guys! ;) [10:19:11] i give up [10:19:46] as someone who never coded php i cant find code which removes a temp file or something [10:21:52] 03(mod) Problem creating image thumbnails in 1.11 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12970 (10ggt) [10:23:30] is there a LocalSettings variable to not delete temp files? [10:27:12] hmm [10:27:47] all good except when i go to ......com/wiki/ its just a blank screen [10:28:01] perhaps its a permissions factor? 
[10:28:16] 03catrope * r31348 10/trunk/phase3/ (RELEASE-NOTES includes/api/ApiQueryRecentChanges.php): (bug 12394) Adding rctitles parameter to list=recentchanges to facilitate easier retrieval of rcid. Also changing RELEASE-NOTES entries to be more consistent with each other [10:29:07] 03(FIXED) rcid cannot be easily retrieved - implementation seems patchy - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12394 +comment (10roan.kattouw) [10:29:48] is there another way to make pdfs out of a mediawiki? [10:30:17] is there a way to make it work after you install it? [10:34:01] perhaps if iput in this safe_mode = off in php.ini [10:34:06] i'll let ya know [10:37:34] where is php.ini found??????? [10:37:56] mostly in /etc/php/ [10:38:06] 03daniel * r31349 10/trunk/extensions/CategoryTree/ (CategoryTree.css CategoryTree.js CategoryTree.php): versioning (test for svn:keywords) [10:41:25] i dont have that folder [10:41:33] etc either [10:41:41] 03(mod) Allow from parameter on Special:Unwatchedpages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12272 (10waldir) [10:41:47] dang hostsnake [10:46:33] hello [10:47:29] on a multilingual wiki, is it possible that the "main page" links to the prefered language of the user? [10:48:41] i.e. if the user chooses French, the "main page" is called "Accueil", so i would like that the link changes to Accueil [10:56:33] yannf: the 'link' part of each MediaWiki:Sidebar line is not translated [10:57:05] isn't there an extension for that? [10:57:45] if anyone who knows somethinga about coding extensions for mediawiki could help: i have this output in an extension: $wgOut->addWikiText( wfMsgHtml( "requestaccount-text" ) ); [10:58:01] apparently it doesn't send anything through a parser since all wikisyntax gets stripped away [10:58:34] is there another call i should use instead of wfMsgHtml to parse the text ? 
[10:59:55] wfMsg [11:00:11] wfMsgHtml is used when you're outputting raw HTML and need to html-escape the message [11:00:30] oh [11:01:02] is using $wgout->addwikitext(parse()); incorrect? [11:01:18] that would parse it twice [11:01:27] as the name suggests, addwikitext will already treat its input as wikitext [11:01:34] oh [11:23:56] f [11:24:02] oh [11:24:13] hello! [11:25:14] does anybody know how to display the author's name on an article page? [11:28:48] hey Duesentrieb ^^ [11:29:19] ponyesk: _The_ author? [11:30:07] Dashiva the author of the actual watched version [11:31:40] The last person to edit, in other words. Might not even have changed the visible content. [11:32:26] Dashiva yes right .. what else he/she might have changed? [11:32:44] the name? [11:33:22] Categories, templates, markup improvement, etc [11:33:46] interlang links ^_^ [11:33:58] *Dashiva denies their existence [11:34:01] *Werdna turns scabbed's bs upsie-down [11:35:14] 03(NEW) API's action=help lists no example for list=blocks - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13175 trivial; normal; MediaWiki: API; (ais523) [11:42:45] moin [11:43:43] I'm thinking about kind a taxonomy support [11:43:49] <[algo]> I installed mediawiki [11:43:54] <[algo]> but no "upload file" link available [11:44:01] <[algo]> how to enable upload file feature ? [11:44:11] Are you logged in? [11:44:23] Dashiva okay .. this would be no problem .. so lets say: the last person who edit this edit .. [11:44:31] similar to categories but allowing to either import a tax. or read on the fly from db [11:44:38] without restricting "what is edited" [11:45:24] i thought about using categories but in the meantime i'd do it with a new NS [11:45:49] ie http://en.wikipedia.org/wiki/Special:Search?go=Go&search=Taxonomy:TheNode [11:46:18] hm [ [ Taxonomy: TheNode ] ] [11:46:23] <[algo]> Dashiva: yes [11:46:26] <[algo]> I'm logged in [11:46:34] i meant. anyone ideas bout that? 
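The wfMsg vs. wfMsgHtml exchange above, as a sketch of the 1.x-era MediaWiki message API; "requestaccount-text" is the message key from the question, and the surrounding context ($wgOut being available) is assumed:

```php
// Sketch of the message-output advice above (1.x-era MediaWiki API).
// "requestaccount-text" is the message key from the question.

// Wrong: wfMsgHtml() HTML-escapes the message, so the wiki syntax in it
// is shown literally instead of being parsed:
$wgOut->addWikiText( wfMsgHtml( 'requestaccount-text' ) );

// Also wrong: parsing the message first and then feeding the result to
// addWikiText() parses it twice:
$wgOut->addWikiText( $wgOut->parse( wfMsg( 'requestaccount-text' ) ) );

// Right: fetch the raw message with wfMsg() and let addWikiText()
// do the (single) parse:
$wgOut->addWikiText( wfMsg( 'requestaccount-text' ) );
```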
[11:47:35] [algo], $wgEnableUploads = true; [11:47:50] <[algo]> ah [11:47:51] <[algo]> thanks [11:48:04] <[algo]> can I specify upload path ? [11:48:11] <[algo]> root upload dir [11:48:51] i don't know that [11:49:21] Dashiva any idee for the problem? [12:06:10] steffkes: There's nothing built-in, but it's not that hard to create on your own. Might already be an extension that does it [12:16:36] !e PageBy | ponyesk [12:16:36] --mwbot-- ponyesk: http://www.mediawiki.org/wiki/Extension:PageBy [12:17:25] !upload | [algo] [12:17:25] --mwbot-- [algo]: File uploads are an often-used feature of MediaWiki, but are disabled by default in all current release versions. To enable them, first make the upload directory (default images) writable by PHP, then set $wgEnableUploads to true in LocalSettings.php (i.e. "$wgEnableUploads = true;"). See for more info. [12:21:35] Hi, i was wondering if it would be possible to combine a blog system like wordpress with mediawiki, so that users can write together on blogposts on the wiki and the blogposts are then exported to wordpress where they will be published to the public? [12:25:43] can i somehow specify that one user should receive all information about users that register on a wiki, it should automatically subscribe to all new pages etc ? 
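[algo]'s follow-up about the upload path goes unanswered above; a hedged LocalSettings.php sketch covering both questions (these are standard MediaWiki settings, but the paths here are examples, not required values):

```php
# LocalSettings.php — enable uploads and (optionally) relocate the
# upload directory. Paths below are examples.
$wgEnableUploads = true;

# Filesystem directory the uploaded files are stored in
# (default is "$IP/images"):
$wgUploadDirectory = "$IP/uploads";

# Web path corresponding to that directory:
$wgUploadPath = "$wgScriptPath/uploads";
```

Remember the directory must be writable by PHP, as mwbot notes below.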
[12:30:00] KenSentMe: if you write it, it will be possible :) [12:31:09] Duesentrieb, hehe, good one [12:32:15] I'm not a coder, but i'm looking for a way to combine the functions of blogging with the openness and collaboration of a wiki [12:36:18] KenSentMe: well, mediawiki isn't quite made for this, but i have written a few extensions that can turn a wiki into a pseudo-blog [12:36:41] KenSentMe: have a look at brightbyte.de - the extensions to use are primarily Lockdown and News [12:37:28] Richlv: you would want the Newuserlog extension (not sure why account creations are not logged per default, but they aren't), and then subscribe to the RSS feed on the Recentchanges page [12:37:34] hi Duesentrieb [12:37:39] hello Nikerabbit [12:38:06] did you see my chat with TimStarling about "wikidata light" earlier? any comments? [12:38:19] \o/ [12:38:50] Duesentrieb: nope, at which time? [12:38:52] oh, hi domas :) please scroll back and tell me why this is totally and utterly insane :) [12:39:25] *domas hugs small scrollback buffer [12:39:28] *domas gets back to work [12:39:32] 03(mod) Interwiki map update - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12763 +comment (10andrew) [12:39:34] Nikerabbit: 9:45 CET [12:39:56] Duesentrieb: it's insane because it should be spelled "lite" [12:40:08] *Duesentrieb threateningly waves a huge logfile at domas [12:40:48] Duesentrieb, cool, let me take a look, thanks [12:41:09] e_s_p: it should probably be spelled "2.0" [12:41:50] KenSentMe: but be advised that there are probably other systems out there that more neatly integrate wiki and blogging functions. i just hacked mediawiki to do it, because i know it well. [12:45:17] woohoo, duesentrieb [12:45:26] thank you very much. indeed! [12:51:16] Duesentrieb, thanks, will look into that [12:53:28] Duesentrieb, hmm, but that only seems to log new users (which is nice), but wouldn't notify a user of all new pages (and wouldn't "watch" all new pages for him), right ?
[12:54:06] Richlv: the extension wouldn't. subscribing to the rss feed on Special:Recentchanges does [12:56:41] Duesentrieb, hmm, i'd have preferred emails, but i guess rss is better than nothing :) [12:57:06] Richlv: since most mail programs support rss, where's the difference? [12:57:35] the reason that mail is not supported is basically that it would mean sending a *lot* of emails, if you have a busy wiki. [12:57:44] and several people subscribed to the service [12:58:33] Richlv: anyway, google gives me 3870 hits for "rss-to-mail" [12:59:01] Richlv: there *is* a raw udp interface, for use with an irc bot... [12:59:05] Duesentrieb, thunderbird doesn't have any sensible rss filtering capabilities, at least as far as i checked - but i guess i can live with that, thanks :) [12:59:09] !rcbot [12:59:09] --mwbot-- To create an IRC bot to display recent changes to your Wiki, follow the directions at . [13:11:08] Duesentrieb .. the PageBy Extension, i have to set a <pageby> tag manually .. should be possible to include this in _every_ article, no? [13:13:22] not possible without modifying the skin [13:13:47] well, it might be possible to inject it into a skin with some evil hackery. but not nicely. [13:15:14] Duesentrieb no problem .. its only an internal wiki, with only one skin .. no user changes possible [13:15:19] but, where to hack in the skin? [13:15:39] skins/monobook.php [13:15:58] you would have to take the code from the extension, adapt it, and place it in the skin output. [13:16:10] not pretty [13:16:17] (skins aren't easy to customize) [13:20:27] hi [13:21:33] Duesentrieb dirty is my second name ;) [13:21:55] When I include an image map in an article, the wikifier adds
 <pre> around the <map> tag and makes it badly formatted XML. How can I fix this ?
[13:26:32] 	I could rephrase my question as : how to include HTML in a page and ensure it's not wikified ?
[13:29:12] *dachary 	reading http://en.wikipedia.org/wiki/Wikipedia:How_to_edit_a_page
[13:29:57] 	dachary: <pre>
[13:30:09] 	dachary: unless your html contains <pre> too. then it gets tricky.
[13:30:25] 	it does not
[13:30:27] 	thanks
[13:30:34] 	geshi probably also has a html mode
[13:30:35] 	it does not contain pre
[13:30:37] 	!highlight
[13:30:37] --mwbot--	there are several extensions for syntax highlighting, see http://www.mediawiki.org/wiki/Category:Syntax_highlighting - the most popular one is at http://www.mediawiki.org/wiki/Extension:SyntaxHighlight_GeSHi
[13:31:16] 	dachary: for "inline" stuff, use <nowiki>
[13:32:44] 	hm, how to detect an article page .. o_O
[13:33:21] *dachary 	trying
[13:34:25] 	what would be the difference between pre and nowiki ? I'm not sure about the meaning of "inline"
[13:40:08] *dachary 	wonders if parserTest is not introducing some weird formatting (pre or nowiki are surrounded by …)
... [13:47:51] Just curious about the progress of the formal parser. I saw the related page was not updated almost one year. [13:52:04] <_yawn> organizational question: i'd want to create an index of names [13:52:07] Mountain: see http://www.mediawiki.org/wiki/Markup_spec/ANTLR and http://lists.wikimedia.org/pipermail/wikitext-l/ [13:52:15] <_yawn> people may or may not belong to a family (category page) [13:52:30] Mountain: please update whatever "related page" with links there. [13:52:35] <_yawn> and share some common traints (such as being dead or alive for instance) which are also possibly categories [13:52:36] thanks [13:53:02] <_yawn> is there a way to change to sort order of the name pages [13:53:05] 03(mod) Problem with Special:Import - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13068 (10moissinac) [13:53:20] <_yawn> in order to avoid naming a page e.g. "Einstein, Albert" [13:53:43] <_yawn> and/or are there patterns for maintaining a somewhat largish (1000+) index of names in mediawiki instances? [13:53:47] _yawn: not dynamically, but yes, when assigning a category. just say [[Category:Foo|Einstein, Albert]] [13:54:31] <_yawn> Duesentrieb: currently i am thinking of having a toplevel category 'names' to which all names belong [13:54:32] _yawn: however, if you need anything more flexible, look into semantic mediawiki or wikidata. both are poweful, heavy-weight extensions/modificatiosn of wikipedia [13:54:48] <_yawn> Duesentrieb: +categories for large families such as the einsteins [13:54:49] err, of mediawiki [13:54:50] damn! [13:55:03] fater 4 years, this *still* happens to me! [13:55:16] *after [13:55:21] <_yawn> Duesentrieb: mediawiki, wikimedia, wikipedia ... don't blame yourself :) [13:55:46] <_yawn> Duesentrieb: heavyweight sound too heavyweight for me however ... :-) [13:55:53] well, i deal with this stuff every day... i'M writing my damn thesis about it! 
you would think i'd get it straight at some point :P [13:56:22] <_yawn> Duesentrieb: about mediawiki or wikimedia? ^^ [13:57:04] wikipedia really. but i have to deal with mediawiki and wikimedia too, obviously. [13:57:39] <_yawn> phd or master/german diploma? [13:59:14] german diploma [13:59:40] http://brightbyte.de/page/WikiWord/Expose [14:00:34] <_yawn> ambitious! [14:00:52] <_yawn> i think i'll bookmark that and check back in march :) [14:01:08] Duesentrieb: just as i have seen that you are working on a new parser....You specify it in ANTLR, but why? I thought ANTLR is used to build Parsers on JAVA-code.... [14:01:28] Isnogud: i'm not writing that. [14:01:35] <_yawn> isnogud: not really - java is just one of the possible targets [14:01:51] Isnogud: anyway, ANTLR is a grammar specification language cum parser generator. java is one possible target language for generation [14:01:57] (php is currently not one) [14:02:05] why antlr? no idea. why not? [14:02:37] <_yawn> Duesentrieb: because it has some quite sophisticated debugging and development tools? good documentation? good support? [14:02:43] <_yawn> my .5 eurocents [14:02:49] I remember that there is a grammar generator built on php somewhere around pear. i tried it but failed. [14:02:58] maybe. i never really looked into it. [14:03:02] just *at* it, briefly :P [14:04:52] <_yawn> Duesentrieb: to summarize - do you think the way i think the index should work is correct? [14:05:05] Morning All! [14:05:10] I have a couple of quick questions [14:05:39] I recently moved my wiki from one server to another, everything so far has been running smoothly [14:05:55] _yawn: categories are probably not sufficient for what you want.
look for existing genealogy extensions, or write your own, or look into semantic mw or wikidata [14:06:04] (the latter is very much in flux, though, iirc) [14:06:17] but the one thing is in the top left hand corner the wiki logo I setup is not showing up all the time [14:06:30] It only appears on like 1/2 of the pages that I view [14:06:34] <_yawn> Duesentrieb: okay. any thoughts about timelines in general? are those relations covered as well by the semantic extensions? [14:06:45] whobutsb: make sure $wgLogo is an *absolute* web path (or a full url) [14:07:06] _yawn: if you need something like inheritance (attributes are inherited by another superconcept) then maybe those will not be sufficient. [14:07:16] /by/from/ [14:07:21] _yawn: should be. haven't tried [14:08:35] Duesentrieb: any test cases for people to write a parser? I want to have a try on this topic. [14:08:57] Duesentrieb: Thanks that did it! [14:09:06] Mountain: look at http://www.antlr.org/ [14:09:13] One more quick question [14:09:21] Mountain: there are a lot of parser test cases in the maintenance directory [14:09:43] Ok, I will have a look, thanks guys [14:09:59] I'm trying to set my wiki up so that only registered users can edit the pages, but everyone including anonymous are allowed to contribute in the discussion page of the articles [14:10:09] I can't seem to figure out the settings I need to set [14:10:23] Mountain: piece of advice: it's easy to cover 90%, it's hard to cover 99%, it's very very tricky to get the 99.9% we need for application on wikipedia. [14:10:55] Mountain: you need plugin-points for extensions that mess with the syntax, and you have to deal with bits of the syntax that depend on internationalization. [14:11:19] whobutsb: mediawiki can't do that per default [14:11:24] !Lockdown | whobutsb [14:11:24] --mwbot-- whobutsb: Lockdown is an extension for preventing read or write access by namespace and limiting access to special pages.
For more information, see < http://mediawiki.org/wiki/Extension:Lockdown >. For general information on preventing access to your wiki, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >. [14:12:02] mwbot: Thanks I will take a look at Lockdown [14:12:16] Duesentrieb: seems to be a very hard problem [14:12:31] Mountain: that's why so many have tried and failed [14:12:45] one general IRC question, is there any way to select a user and respond to them rather than type their name? [14:13:17] whobutsb: usually it's enough to type the first two or three letters and then hit tab. should work in any decent client [14:13:23] 03siebrand * r31350 10/trunk/extensions/ (66 files in 64 dirs): [14:13:24] * add svn:keywords 'LastChangedDate' [14:13:24] * replace text 'version' in ExtensionCredits with version based on 'LastChangedDate' for automatic version updating [14:13:42] whobutsb: and there's the private chat you can enter using /msg or /query - but you have to register with nickserv for that to work. [14:13:45] Duesentrieb: You're right, very cool! This is my first experience with IRC! [14:13:48] (on freenode, at least) [14:17:13] <_yawn> Duesentrieb: thanks, i'll have a look at it [14:17:46] vikidia-fr [14:35:33] mediawiki is not upgradeable with phpshell. you shouldn't mention it on the update instructions page [14:44:37] At http://ligamen.org/mediawiki/index.php/Hop i would like the <map> to be treated as regular HTML. Is there a way ? I've tried
 <pre> and <nowiki> already ;-)
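For whobutsb's read/write split above (registered-only article editing, open talk pages), a hedged LocalSettings.php sketch using the Lockdown extension mwbot points at; $wgNamespacePermissionLockdown is Lockdown's documented hook variable, but treat the exact details as an assumption to check against the extension page:

```php
# LocalSettings.php sketch (assumption: 1.x-era Lockdown config names).
require_once( "$IP/extensions/Lockdown/Lockdown.php" );

# Only registered users ('user' group) may edit main-namespace pages...
$wgNamespacePermissionLockdown[NS_MAIN]['edit'] = array( 'user' );

# ...while Talk: keeps the default (everyone may edit), so no
# lockdown entry is needed for NS_TALK.
```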
[14:53:28] 	!html | dachary
[14:53:28] --mwbot--	dachary: For allowing any and all HTML, see . This is of course VERY DANGEROUS. Safer options include ,  and .
[14:54:11] 	thanks!
[14:54:20] *dachary 	must learn to use mwbot
[14:54:28] 	!imagemap | dachery
[14:54:28] --mwbot--	dachery: ImageMap is an extension that allows you to override the default linking behavior of images. See  for more information.
[14:54:37] 	!help | dachery
[14:54:37] --mwbot--	dachery: Hi! I'm mwbot, a bot that was quickly whipped up by Daniel Cannon (AmiDaniel) to help out around #mediawiki. Some quick help is at < http://www.mediawiki.org/wiki/Mwbot >, you can find all my source code at < http://amidaniel.com/viewvc/trunk/MWBot/?root=svn >
[14:55:43] 	i've been reading the imagemap extension but still can't figure out why its output is not modified in the same way (i.e. no <pre> around the first <map> element).
[14:55:57] 	question.. I migrated several wikis into one using the phpDump procedure but none of the images exist even if I copy them over to the /images directory.. my guess is the links are not there which are defined in the DB.. exactly which tables would I have to export over from the old wikis to get the images to work again?  I currently have images and imagelinks.. are those the only two?
[14:56:34] 	dachary: because users are not allowed to use arbitrary html, but extensions are allowed to do that (it's their purpose, really)
[14:56:49] 	I get that now :-)
[14:57:13] 	Me trying to experiment manualy to understand formatting of extensions output does not make any sense.
[14:57:38] 	wolfspirit: yes. chances are the database is fine but your paths are configured wrong. double-check $wgUploadPath and $wgUploadDirectory
[14:58:10] 	.21
[14:58:17] 	dachary: indeed. well, you can do that on a flat html file, but not on-wiki
[14:58:20] 	Duesentrieb: so are you saying that I would have to migrate those two tables as well or the phpDump is fine?
[14:58:34] 	wolfspirit: wtf is phpDump?
[14:58:39] 	!backup wolfspirit
[14:58:39] --mwbot--	http://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki
[14:59:12] 	ah, better still:
[14:59:18] 	!move | wolfspirit
[14:59:18] --mwbot--	wolfspirit: http://www.mediawiki.org/wiki/Manual:Moving_a_wiki
[15:03:02] 	Duesentrieb: sorry I meant dumpBackup.php
[15:03:25] 	Duesentrieb: I didn't do a full backup of the databases because we were all on different versions
[15:03:52] 	wolfspirit: that gives you a content (text) dump only. that's good for a long-term backup, and stable across mediawiki versions, etc, but can't be used to restore an operational wiki
[15:03:55] 	it contains page text only.
[15:04:03] 	read the backup page
[15:04:08] 	use the script at the bottom, if you like
[15:04:10] 	Duesentrieb: Yes I know this
[15:04:17] 	Duesentrieb: but I'm past that point now
[15:04:37] 	Duesentrieb: everyone is migrated over to the same wiki.. so I have to figure out how to get the images to work
[15:05:03] 	well, you can restore images. you can *not* restore user accounts or logs.
[15:05:24] 	to restore images, use maintenance/refreshImages.php --missing
[15:06:35] 	Duesentrieb: what does that do?  I would have to copy over the images from their old instances of mediawiki first into the new /images and then run that?
[15:06:47] 	yes
[15:07:03] 	it looks at the files, and fills the image table from that
[15:07:07] 	I got rid of all the added <pre> by removing leading white spaces in the extension output ! :-)
[15:07:09] 	you will also have to rebuild link tables
[15:07:18] 	after importing the images, run maintenance/refreshAll.php
[15:07:57] 	Duesentrieb: nice so in theory I won't have to copy over any old tables from the old DBs
[15:08:38] 	wolfspirit: if you don't need user accounts or logs, then no.
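The image-restore steps above, as a command sketch; the paths are examples, and note that the maintenance script is actually rebuildImages.php (as corrected later in the channel), with rebuildall.php covering the link-table refresh:

```shell
# Copy the image files from each old wiki into the new wiki's
# upload directory (paths are examples):
cp -r /srv/oldwiki1/images/* /srv/newwiki/images/

# Fill the image table from the files found on disk:
php maintenance/rebuildImages.php --missing

# Rebuild link tables and everything else derived from page text:
php maintenance/rebuildall.php
```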
[15:09:01] 	dachary: oh, you are writing an extension? yes, that's a nasty problem. it got me badly once too. the problem is that extension output only bypasses *part* of the parser. which really sucks. see https://bugzilla.wikimedia.org/show_bug.cgi?id=8997 for details
[15:09:08] 	Duesentrieb: no I don't because I use the LDAP autentication for user accounts
[15:09:40] 	:-)
[15:09:52] 	I now have to figure out why it adds 
[15:10:11] wolfspirit: you should be fine then. except that all users will lose their user settings. [15:10:19] it's a *lot* more convenient to simply copy the full db [15:10:33] dachary: read the bug report. [15:10:41] dachary: then, give up :P [15:10:42] (I am) [15:10:50] ahahah [15:11:03] I'd be happy with a workaround really. [15:12:10] dachary: i had that problem when writing HTMLets. see http://www.mediawiki.org/wiki/Extension%3AHTMLets and the talk page there. [15:12:40] dachary: look at the source code, it includes a workaround. two, actually; the one used per default (called "bypass") is a really nasty hack, but works reliably. [15:13:03] (looking) [15:13:09] Duesentrieb: thanks that is a lot easier than I thought.. I was about to import SQL into the db heh [15:13:37] wolfspirit: hm? but that's exactly what importing a dump does. [15:13:55] but it's rally simple. a single command. you never look at the sql. [15:15:28] Duesentrieb: yeah I didn't know.. was about to do everything manually :) [15:15:42] that would suck :P [15:15:48] yes indeed [15:19:55] Duesentrieb: thanks. I've escaped the problem by outputing all I need on a single line. Not pretty but works. [15:19:57] Hello [15:24:22] Is there any way (hook) to modify wgAutoloadClasses in order to replace a class usually loaded by a custom one. In fact I'd like to replace 'ThumbnailImage' autoloaded class by mine in order to change html prepared this class. [15:24:46] no [15:25:21] but you can replace the media handler [15:25:55] then you can proxy doTransform() calls to BitmapHandler, and have it convert the thumbnail to whatever you want to convert it to, before returning it [15:29:18] Duesentrieb: you still there? I tried using "php refreshImages.php --missing" and it says I have an input file missing [15:29:39] TimeStarling: Can you point me to some doc about how to replace media handler ? [15:30:07] wolfspirit: if you imported the full database, you don't need to run that. 
you should touch your localsettings.php though, so the cache gets purged. [15:30:21] wait [15:30:30] wolfspirit: i don't know why it would complain about a missing input file, besides perhaps adminsettings.php [15:30:33] !adminsettings [15:30:33] --mwbot-- AdminSettings.php is an additional configuration file for use with command line maintenance scripts. See AdminSettings.sample in your installation directory for details. [15:30:34] it's rebuildImages not refreshImages [15:30:46] maybe that's the problem [15:30:48] oh, yea, that too [15:30:59] but again: if you copied the full db, you don't need that [15:31:01] no I didn't do the full copy of the DBs yet.. [15:31:32] why not? you don't need to do all the cruft if you simply do that [15:31:50] Telemac: I can give you some starting points in the code [15:32:11] Duesentrieb: because I already have this in production I can't play around with it much anymore.. people are using it already [15:32:33] look at $wgMediaHandlers in DefaultSettings.php, then have a look at includes/media/Bitmap.php [15:32:39] i'd call that rash :) [15:32:57] actually you can probably subclass BitmapHandler, rather than proxy to it [15:33:29] there's a handler for .dia files [15:33:32] !e Dia [15:33:32] --mwbot-- http://www.mediawiki.org/wiki/Extension:Dia [15:33:49] should work as an example [15:36:54] TimStarling: Ok I see what you mean. So I can change wgMediaHandler and plug a subclass of BitmapHandler. [15:37:36] yes [15:37:42] then it'll be like... [15:37:51] function doTransform(...) { [15:37:59] $ret = parent::doTransform(...) [15:38:04] ... do stuff with $ret [15:38:10] return $ret; [15:38:10] } [15:42:52] TimStarling: ok, thanx [15:44:44] 03(mod) Problem creating image thumbnails in 1.11 - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12970 (10tstarling) [16:00:40] Duesentrieb: thanks to you http://www.mediawiki.org/wiki/Extension:Silva now has image maps. [16:09:56] Duesentrieb?
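TimStarling's doTransform() outline above, filled in as a hedged sketch; the class name and the post-processing body are placeholders, while the $wgMediaHandlers wiring and the doTransform() signature follow the 1.x-era includes/media code the channel points at:

```php
# Sketch: replace the stock bitmap handler with a subclass that
# post-processes thumbnails. MyBitmapHandler is a hypothetical name.
$wgMediaHandlers['image/jpeg'] = 'MyBitmapHandler';
$wgMediaHandlers['image/png']  = 'MyBitmapHandler';

class MyBitmapHandler extends BitmapHandler {
	function doTransform( $image, $dstPath, $dstUrl, $params, $flags = 0 ) {
		// Let the normal bitmap scaler produce the thumbnail first...
		$ret = parent::doTransform( $image, $dstPath, $dstUrl, $params, $flags );
		// ...then "do stuff with $ret" here: e.g. convert the file at
		// $dstPath, or return a customized ThumbnailImage.
		return $ret;
	}
}
```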
the PageBy Extension uses an $wgContLang->timeanddate() .. my problem is .. the output is 1 hour earlier than the date mediawiki shows on the versions-page !? [16:10:01] whats wrong with? [16:12:10] steffkes: on the version page, mediawiki uses your personal timezone setting (see your user preferences). for signatures, and also for PageBy (i hope), it uses the global timezone [16:12:20] (which might be UTC per default, not sure) [16:12:48] ah okay .. hm, have to see how to change it .. thanks for the info :) [16:12:58] http://www.mediawiki.org/wiki/Manual:Timezone [16:14:01] living manual, hm? ;) [16:16:04] seems to have no effect? [16:16:58] ah .. hm, perhaps not working on history data [16:20:32] Anyone know if generateSitemap.php requires libmysqlclient.so.15 only because of the compression? [16:22:29] Wiredtape: it needs to access the database. [16:24:56] MinuteElectron, based on a test i just did.. the only reference to libmysqlclient.so.15 was for gzopen()-> the outcome of the test was that when --compress=no was used, there was no error regarding missing libmysqlclient.so.15 definition.. [16:26:16] 03(mod) Install Labeled Section Transclusion extension on fr.wikipedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11766 +shell; +comment (10ssanbeg) [16:35:08] ok [16:42:11] Hi folks. Anyone here know for which tasks GD and ImageMagick are used in MediaWiki? What is GD used for and what is ImageMagick used for? Must i install php5-gd _and_ php5-imagick? Its for an embedded system so I try not to install everything. [16:42:32] running debian [16:42:48] GD and ImageMagick are used for thumbnailing. [16:43:00] If ImageMagick is used then GD is not required. [16:43:10] If ImageMagick is not used then GD is fallen back to. [16:43:23] MediaWiki does not use the ImageMagick extension. [16:43:45] It uses the convert command from the command line; so you need to install imagemagick on its own or the PHP gd extension.
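The timezone fix discussed above, as a LocalSettings.php sketch following the Manual:Timezone recipe; the zone name is an example:

```php
# Make wiki-global timestamps (signatures, extension output such as
# PageBy) use local time instead of UTC. The zone name is an example.
$wgLocaltimezone = 'Europe/Berlin';
putenv( "TZ=$wgLocaltimezone" );

# Offset in minutes from UTC, used as the default user preference:
$wgLocalTZoffset = date( 'Z' ) / 60;
```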
[16:48:53] now we're on the subject: on my (Mac OS X 10.4.11) system both GD and ImageMagick is installed. Have lots of problems with .svg. Is there a way to force mediawiki to use ImageMagick and ignore GD? [16:52:04] Mal: [16:52:10] !wg ImageMagickConvertCommand [16:52:10] --mwbot-- http://www.mediawiki.org/wiki/Manual:%24wgImageMagickConvertCommand [16:52:12] i think [16:52:28] yeah [16:54:08] thanks! but i have that one defined. no dice :/ [16:54:53] tried $wgSVGConverter too, but it doesn't seem to do anything [16:57:48] !seen TimLaqua [16:57:48] --mwbot-- I don't know anything about "seen". [17:04:53] VRAG- : what's new ? [17:07:13] VRAG- : could you see my text in pvt [17:07:27] No, i couldn't [17:07:41] are you subscribed with freenode? [17:08:47] no [17:08:57] #exlomproject [17:14:55] can anyone please try and access http://he.wiredtape.com and tell me if it's a timeout? [17:15:56] Wiredtape: comes up for me [17:17:43] hmm :-\ (thanks jlerner) [17:28:50] MinuteElectron: sorry, was away for a while. thx for imagemagick/gd info [17:28:57] np [17:31:54] hello [17:32:23] hey Nikerabbit [17:49:46] hi, anyone knows why media wiki does not load images in skins other then monobook? i'm running the latest stable 1.11.1, my machine is php5. even in the default theme classic [17:50:15] umm not classic but simple, no images are shown. [17:50:27] bsm, no images, where? [17:50:34] like [[image:..]]? [17:51:07] umm sorry Wiredtape, images included in the skin [17:52:23] bsm, probably it's some sort of css setting.. like img {display:none;} [17:52:55] do the files exist? [17:53:02] file/folder permissions? [17:55:38] hm, which one do you prefer: turck mmchache, eAccelerator, APC or XCache? [17:57:17] hm. permissions are exactly like in the monobook skin. [17:57:43] can you see them putting their url with the browser? [17:59:00] heh... i didn't even notice, i got commit number 31337 :) [18:01:01] heh [18:03:30] gobble gobble.. 
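Mal's question above (force ImageMagick over GD, and rasterize SVGs) maps to a few standard settings; a hedged LocalSettings.php sketch, with binary paths as examples for a typical install:

```php
# Prefer ImageMagick over GD for thumbnailing; path is an example.
$wgUseImageMagick = true;
$wgImageMagickConvertCommand = '/usr/bin/convert';

# GD is never used for SVG; $wgSVGConverter picks one of the commands
# defined in $wgSVGConverters (rsvg here is an example choice).
$wgSVGConverter = 'rsvg';
$wgSVGConverterPath = '/usr/bin';
```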
[18:04:38] Platonides: thanks for input, seems like unix unzip made something... wrong. [18:04:43] 03(NEW) Namespace Request - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=13176 15enhancement; normal; Wikimedia: Site requests; (serhat.er.95) [18:12:45] Is there a way to move a page and all of it's subpages? [18:13:39] no [18:14:48] TimLaqua: make a bot [18:14:59] too many bots.... [18:15:06] evil things are gonna run rampant one of these days [18:15:25] hello [18:15:26] *dungodung pets his bots [18:15:30] *dungodung huggles Nikerabbit [18:15:32] did someone write an extension for cascading moves yet? [18:15:32] *Nikerabbit pets dungodung [18:16:25] TimLaqua, not yet, afaik, if you do let us know :) [18:16:46] ya, i'll put one together at some point, this is incredibly annoying. [18:18:29] why doesn't bugzilla have a "random bug" for open bugs.. it would be much more fun trying to stumble and fix.. [18:21:59] rand( 1, 13176 ); [18:22:06] that doens't take into account bug status though [18:22:54] 03(WONTFIX) Should record Accept header - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13101 +comment (10brion) [18:22:54] 03(mod) Problem with Special:Import - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13068 (10brion) [18:22:55] 03(mod) Pressing Enter in edit summary should not save the page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13065 +comment (10brion) [18:23:25] 03(mod) Allow editting sections of a page using form elements ( drop-downs etc) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=3849 +comment (10bredtape) [18:23:44] I edited Mediawiki:Sidebar as wikiadmin, but changes are not showing up, even after a browser cache purge. Halp! [18:24:08] MinuteElectron, :) [18:25:29] 03(mod) Parser functions to receive parameters like $1 in MediaWiki site messages (especially fullurl, urlencode, localurl, ...) 
- 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6261 +comment (10Wiki.Melancholie) [18:27:46] 03(mod) Parser functions to receive parameters like $1 in MediaWiki site messages (especially fullurl, urlencode, localurl, ...) - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6261 (10Wiki.Melancholie) [18:27:54] I have a strange problem here. i have defined a tag hook through setHook. I call this at ParserBeforeStrip and it works. All the stuff that is in there is replaced by a placeholder in the parse function. [18:28:28] But: i want to return WikiMarkup within this hook. [18:29:29] The problem is: at the point where this WikiMarkup would be replaced to be processed later, there is already a replacement. [18:31:18] linear algebra :-{ [18:31:26] internalParse does in fact not get the tag to parse, though! [18:32:07] Can namespaces be inherited so you could nest more sensitive, specific information at the lower levels of a wiki while still retaining linking capability to the levels above it? [18:32:22] later the replacement for the tag will be filled with the replacement that is WikiMarkup that will now NOT be parsed anymore. [18:33:00] hrm. i think i was too confusing, wasn't i? [18:35:44] In short: at the point where the WikiMarkup should be parsed it can't, because there is no wikimarkup. later the replacement for the tag, which is wikimarkup, cannot be processed, because the function that is responsible for that, internalParse, is already triggered. [18:38:30] well. i fixed it. i just called replaceInternalLinks in a second hook and after that replaceLinkHolders. that's it. thanks. [18:38:45] 03catrope * r31351 10/trunk/extensions/EditOwn/EditOwn.php: EditOwn: Fixing typo [18:39:12] I edited Mediawiki:Sidebar as wikiadmin, but changes are not showing up, even after a browser cache purge. The edit shows up in special:allmessages in green. What am I doing wrong?
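[Editor's note] The tag-hook problem above — wikitext returned from a hook is never re-parsed, because the hook's output is substituted back after internalParse has run — is more commonly solved by asking the parser to process the markup inside the hook itself, instead of manually re-running replaceInternalLinks. A hedged sketch with a hypothetical `<mytag>` (hook and method names are real for this MediaWiki era, but verify `recursiveTagParse` exists in your version):

```php
// Extension setup file — register a hypothetical <mytag> tag hook
$wgExtensionFunctions[] = 'efMyTagSetup';

function efMyTagSetup() {
	global $wgParser;
	$wgParser->setHook( 'mytag', 'efMyTagRender' );
}

function efMyTagRender( $input, $args, $parser ) {
	// Build whatever wikitext the tag should produce...
	$wikitext = "'''bold''' text with a [[link]]";
	// ...and parse it now; returning raw markup from a tag hook
	// would reach the output unparsed, as described above.
	return $parser->recursiveTagParse( $wikitext );
}
```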
[18:41:29] 03yaron * r31352 10/trunk/extensions/SemanticForms/specials/ (SF_AddData.php SF_EditData.php): [18:41:29] Removed includes for CSS and Javascript for Scriptaculous; YUI now responsible [18:41:29] for all autocompletion [18:42:15] 03yaron * r31353 10/trunk/extensions/SemanticForms/includes/SF_FormClasses.inc: Fixed incorrect text for SMW 0.7 [18:51:55] https://bugzilla.wikimedia.org/show_bug.cgi?id=13101 [18:52:01] Can't connect to the database. [18:52:02] Error: Can't connect to MySQL server on 'srv7.wikimedia.org' (111) [18:52:24] brion: "Is your database installed and up and running?" :D [18:53:52] :P [18:55:18] 03yaron * r31354 10/trunk/extensions/SemanticForms/includes/SF_FormPrinter.inc: [18:55:18] Attribute fields can now have property-based autocompletion - added [18:55:18] 'autocomplete' parameter for fields, and replaced getAllPagesForRelation_0_7() [18:55:18] with getAllPagesForProperty_0_7(); fixed setting of "yes" and "no" values for [18:55:18] booleans in SMW 1.0; re-added setting of important character in form inputs [18:55:20] that was accidentally removed in v0.9.3; removed all usage of Scriptaculous - [18:55:22] autocompletion now handled entirely by YUI. [18:56:05] 03yaron * r31355 10/trunk/extensions/SemanticForms/skins/SF_yui_autocompletion.css: Added some right padding on autocompletion values [18:57:08] 03yaron * r31356 10/trunk/extensions/SemanticForms/libs/SF_yui_autocompletion.js: [18:57:08] sf_autocomplete() can now be called for non-list fields, with a delimiter [18:57:08] of 'NULL' [18:58:37] 03yaron * r31357 10/trunk/extensions/SemanticForms/ (INSTALL includes/SF_GlobalFunctions.php): New version: 0.9.7 [19:08:00] I can't find Extension:Calendar (Barrylb) in subversion. [19:08:56] I tried cutting and pasting the files but get errors, so is it possible to copy the files for this extension from somewhere? 
[19:36:38] https://bugzilla.wikimedia.org/ not working [19:38:53] known problem [19:38:58] server down for a while [19:41:46] 03(NEW) modern skin, accessibility jump links - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13168 15enhancement; normal; MediaWiki: User interface; (random832) [19:41:46] 03(FIXED) CIDR ranges api.php - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13157 +comment (10roan.kattouw) [19:41:52] 03(mod) Namespace Request - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13176 +shell (10raimond.spekking) [20:01:15] 03nikerabbit * r31358 10/trunk/extensions/Translate/export.php: * Be less strict what can be exported [20:03:17] Are there scripts to create categories? I want to "import" a hierarchy from an external source (DB) as a category tree. [20:33:32] 03(mod) Images. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13116 (10brion) [20:35:12] 03(mod) Talk subpages link to main subpages even when they are turned off - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13119 (10brion) [20:36:18] 14(INVALID) "frame" image layout doesn't scale image down like "thumb" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13121 (10brion) [20:37:15] 03(WONTFIX) Internet Explorer should be added to javascript browser detection globals - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12766 (10brion) [20:42:32] 03(mod) wrong webserver error message - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13125 (10brion) [20:45:18] i am trying to force custom authentication into mediawiki and have a couple of questions/sanity check. has anyone here done that or could help me out? [21:03:46] i want to add text to an article on submitting the edit form in my extension [21:03:51] so i use the AlternateEdit hook [21:04:10] How to achieve that? [21:04:53] if ($wgRequest->getCheck( 'wpSave' )) { [21:05:13] $editpage->mArticle->updateArticle( ...) } [21:05:18] ? [21:06:28] you want to add changes to EVERY edit, flo_?
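[Editor's note] On the category-import question at [20:03:17]: there is no stock script for this, but a category tree can be created programmatically. A hedged sketch — the data is illustrative, and it assumes MediaWiki is bootstrapped (run via a maintenance script); check that `Article::doEdit()` matches your version before using it:

```php
// Create category pages from (name => parent) pairs pulled from an
// external database; a category "belongs" to its parent simply by
// carrying a [[Category:Parent]] link in its page text.
$tree = array(
	'Animals'  => null,
	'Mammals'  => 'Animals',
	'Primates' => 'Mammals',
);
foreach ( $tree as $name => $parent ) {
	$title = Title::makeTitleSafe( NS_CATEGORY, $name );
	if ( !$title || $title->exists() ) {
		continue; // skip invalid names and categories that already exist
	}
	$text = $parent === null ? '' : "[[Category:$parent]]";
	$article = new Article( $title );
	$article->doEdit( $text, 'Importing category hierarchy' );
}
```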
[21:07:03] i want to add a custom tag (googlemap extension tag) to articles which don't have it yet [21:07:24] It's created from the page title [21:07:53] well. do you really want to add this tag? Better to just add this tag on viewing the page, right? [21:08:17] 03raymond * r31359 10/trunk/extensions/ (5 files in 5 dirs): Localisation updates German [21:08:19] exactly no. I did that before. But [21:08:24] in the other case you have to check whether the tag exists or not. [21:08:47] to allow googlemaps extension edit mode reads the wiki text and loads that map for editing [21:08:50] which would mean you have to parse the wiki markup. [21:09:25] sure, but that should be easy. It would be at the beginning [21:10:18] And in addition the added text would show up in the wiki text, which would mean that people may change it. You then have to check if there is a changed tag in there. [21:10:29] and replace it by your version. [21:11:33] i guess i have to rely on people leaving it alone [21:11:43] at least for that first hack [21:12:00] 03(NEW) Enable Password Reset extension on Wikimedia private wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13177 15enhancement; normal; Wikimedia: Site requests; (guillom.pom) [21:12:48] well. but what if you just add it with ParserBeforeStrip? Then it will show up on every page. The WikiMarkup will be rendered. [21:13:59] and you don't have to fear that anybody will change it. [21:14:28] but googlemap ext. edit mode won't work then. [21:14:40] as far as i can see [21:14:48] flo_: Ah! OK. [21:15:27] yeah. It's stupid. The perfect goal would be google map support independent from wiki text [21:15:45] but that's too big for me for now. [21:15:57] (but i have huge plans :) [21:16:12] well. i am running out of time. [21:16:24] i have to leave. sorry! [21:16:29] so you know what i have to set to change wikitext in editform?
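[Editor's note] On flo_'s closing question ("what do i have to set to change wikitext in editform?"): rather than calling updateArticle on save, the edit box contents can be pre-filled when the form loads. A hedged sketch — the hook name and `EditPage` fields match MediaWiki-1.11-era code, but verify them against your tree; the tag syntax is illustrative:

```php
// When the edit form loads, prepend a <googlemap> tag derived from the
// page title if the page doesn't carry one yet.
$wgHooks['EditPage::showEditForm:initial'][] = 'efEnsureMapTag';

function efEnsureMapTag( $editPage ) {
	if ( strpos( $editPage->textbox1, '<googlemap' ) === false ) {
		$address = $editPage->mTitle->getText(); // map location from the title
		$editPage->textbox1 = "<googlemap>$address</googlemap>\n"
			. $editPage->textbox1;
	}
	return true; // let other hooks run
}
```

This keeps the tag visible (and editable) in the wikitext, which is exactly the trade-off discussed above: the googlemap extension's edit mode keeps working, at the price of trusting editors to leave the tag alone.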
[21:16:41] ok, [21:19:05] flo_: you could do an SQL quer [21:19:08] *y [21:19:32] I'm not sure of the exact tables, but if you poke about you'll find one. [21:27:29] 03raymond * r31360 10/trunk/extensions/OAI/ (OAIRepo.i18n.php OAIRepo.php): Add description message for [[Special:Version]]. [21:36:23] 03(mod) Enable Password Reset extension on Wikimedia private wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13177 +comment (10cbrown1023) [21:43:29] 03(NEW) Property::Value Bundle - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13178 15enhancement; normal; MediaWiki extensions: Semantic MediaWiki; (dasch_87) [21:51:25] 03greg * r31361 10/trunk/phase3/maintenance/postgres/compare_schemas.pl: Table page_props needed in tests after all. [21:54:45] 03(mod) Namespace Request for trwikisource - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13176 summary (10cbrown1023) [21:55:22] 03(mod) logevents query returns zeros for logid & pageid - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=10660 (10brion) [22:05:07] 03(FIXED) "Page protection" component for MediaWiki - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13138 +comment (10brion) [22:27:02] 03(mod) Talk subpages link to main subpages even when they are turned off - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13119 (10wikimedia) [22:36:32] 03(WONTFIX) updaters.inc will never load interwiki.sql - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12200 +comment (10brion) [22:39:05] <[algo]> how to remove/add menu items [22:39:09] <[algo]> from wiki left navigation ? [22:39:47] <[algo]> like Community Portal link [22:40:05] 03(mod) Add a lang="mediawiki" to syntax highlighting - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13173 (10brion) [22:40:25] <[algo]> I mean sidebar [22:41:49] 03(mod) index.php does not honor the variant param with action=raw - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=12683 (10brion) [22:42:05] brion: should interwiki.sql be checked and updated? 
First site I checked (http://www.ourpla.net/cgi-bin/pikie.cgi) has moved a while back and has been dead for a few months on its new location. [22:42:27] ("dead" as in no new content) [22:43:40] <[algo]> found it [22:45:59] 03(mod) Talk subpages link to main subpages even when they are turned off - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13119 (10brion) [22:57:51] <[algo]> how to change front page name ? [22:58:29] [algo]: edit [[MediaWiki:Mainpage]] [22:58:41] <[algo]> thanks [23:01:49] Hi. I wanted to ask one question - is it possible to configure wiki in such a way that when I have a redirect from domain.com/abbr to domain.com/definition the engine actually returns 301 to the definition page? [23:02:23] at the moment it simply stays on /abbr and serves content from /definition [23:04:20] <[algo]> can I make mediawiki detect case-insensitive title match and redirect to proper page ? [23:04:39] [algo]: didn't you ask that yesterday? the answer is still no [23:06:14] <[algo]> why not ? [23:06:18] <[algo]> when it matches title [23:06:29] <[algo]> I just need to query db for case-insensitive match [23:06:41] <[algo]> and if I found it => then redirect if needed [23:06:56] because no-one implemented it yet [23:07:03] <[algo]> any problems in it ? [23:07:15] <[algo]> what's the right place to put the fix ? 
[23:07:17] there's no index or column in the database that would allow checking for a case-insensitive match [23:07:29] there are several places in the code that would need to be changed, it's not trivial [23:07:59] <[algo]> well, make it collate the right way [23:08:00] <[algo]> so the query will become case-insensitive [23:08:06] <[algo]> the database itself compares text case-insensitively [23:08:26] <[algo]> so it will match the title [23:09:32] [algo]: go ahead, then [23:10:54] <[algo]> that seems to work [23:11:03] <[algo]> but I do not see possible problems yet [23:11:15] <[algo]> also I'd like to put redirection in there [23:11:26] <[algo]> where should I put it ? [23:11:39] <[algo]> should be just after we've got the page from database by page_title [23:11:43] flyingparc: how about my question :) ? [23:16:31] Bartek: i don't think we have a switch for forcing "hard" redirects. although it has been discussed before, iirc [23:17:08] Bartek: btw, don't map your wiki into the document root. it's hard to maintain, and also a conceptual fallacy [23:18:37] <[algo]> where should I hook redirect ? [23:18:45] <[algo]> ah sorry already no one knows [23:27:20] Duesentrieb: I have my wiki on a dedicated subdomain - only the wiki will be there, so making URLs shorter seemed to be better.. is there something I am unaware of? [23:28:28] and about the redirects - well that's unfortunate :(. If I try to satisfy possible abbreviations then I'll be forced to create 3-4 copies of each article. Google won't like it :/. [23:28:36] Bartek: yes. the wiki does not only serve "pages". it also serves image files, css files, js files, etc. it also wants to address index.php directly for several things.
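[Editor's note] What [algo]'s case-insensitive fallback could look like. As flyingparc points out, no index supports this, so the query below is a full table scan on every miss — fine for a small wiki, painful on a big one. A hedged, MySQL-only sketch; the function name is mine and where to hook it is exactly the open question in the log:

```php
// Find a page whose title matches case-insensitively, or null.
function efFindTitleCaseInsensitive( $text ) {
	$dbr = wfGetDB( DB_SLAVE );
	$row = $dbr->selectRow(
		'page',
		array( 'page_namespace', 'page_title' ),
		array(
			'page_namespace' => NS_MAIN,
			// page_title stores underscores; LOWER() forces the
			// case-insensitive comparison regardless of column collation
			'LOWER(CONVERT(page_title USING utf8)) = '
				. $dbr->addQuotes( mb_strtolower( str_replace( ' ', '_', $text ) ) ),
		)
	);
	return $row ? Title::makeTitle( $row->page_namespace, $row->page_title ) : null;
}

// After the exact-match lookup has failed, somewhere in the request path:
// $target = efFindTitleCaseInsensitive( $requestedTitle );
// if ( $target ) { $wgOut->redirect( $target->getFullURL() ); }
```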
all of that conflicts with the conceptual namespace of your pages if you map them to the docroot [23:29:22] Bartek: it can be made to work, by having a lot of exceptions to the rewrite rules (and keeping them up to date and tuned to extensions), or by excluding all existing files and directories from redirection (which means you can not have a page called FAQ, for example). [23:30:10] Bartek: also, you don't have to create copies. mediawiki does that. google will cope, it doesn't choke on wikipedia, which has about 50% redirects. [23:31:00] yes, but don't forget that wikipedia is the most trusted website across the whole internet :) it simply can have that many duplicates [23:31:00] it would be, in fact, a nice option to have. one of the problems would be how to supply an edit link back to the redirect page. [23:31:18] i don't think it's a real problem, seriously. [23:31:45] but as i said, it would be nice to have it as an option. search bugzilla, i'm sure we have something there [23:32:14] Bartek: in any case, do rethink putting pages into the document root. it's a Bad Idea (tm) [23:32:44] ok, thank you a lot for your input. [23:33:06] <[algo]> I have a Title [23:33:10] <[algo]> how to make a redirect from it ? [23:33:12] we even have a template for that http://www.mediawiki.org/wiki/Template:Wiki-in-docroot [23:33:18] <[algo]> how to redirect to the page for it [23:33:40] *[algo] will try getFullURL [23:37:40] <[algo]> nice [23:37:41] <[algo]> I did it [23:37:49] <[algo]> works for now [23:38:32] hi [23:39:19] is there something like html2wiki to dump a web page ( text *and* image ) ? [23:43:50] hello, is it possible to have users tick an agreement upon registration? [23:47:35] there's a shitload of people in this room but no chatter, WTF?
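[Editor's note] Duesentrieb's advice — keep the script path and the article path separate — is what the usual "short URL" recipe does, and it still gives Bartek short URLs without mapping pages into the docroot. A hedged .htaccess sketch (paths are the conventional choices, not requirements; see the mediawiki.org manual for the canonical version):

```apache
# Hedged sketch: MediaWiki installed under /w/, pages exposed as /wiki/Title.
# /wiki/ is a purely virtual namespace, so it can never collide with real
# files like skins/, images/, or index.php.
RewriteEngine On
RewriteRule ^/?wiki/(.*)$ /w/index.php?title=$1 [L,QSA]
```

Pair this with `$wgScriptPath = '/w';` and `$wgArticlePath = '/wiki/$1';` in LocalSettings.php so the wiki generates matching links.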
[23:49:47] 03(mod) process_bug.cgi hides "bugs" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13134 (10brion) [23:50:55] pdowling: since you didn't bother to stick around for more than 4 minutes, no one is able to answer your question. [23:51:03] too bad for you [23:51:30] please forgive us for not watching the channel like a hawk to pounce upon your question [23:51:47] doing our jobs, or helping people in other channels, is maybe something we should stop doing [23:51:49] just in case you come in [23:52:03] [23:54:12] Since you're in such a good mood, now must be a perfect time to bring up unified login ;) [23:54:20] \o/ [23:54:25] when i get to it :D [23:56:21] how about standardizing elements (ids, hooks, etc.) across all skins so Common.css/Common.js and a few extensions all work as expected ;) [23:57:21] newb question.... what file do i edit to make adjustments to the wiki? is it LocalSettings.php? [23:57:33] jaypro_: yes, you edit LocalSettings.php [23:58:00] is that the only file that you edit, as far as customization goes? or are there others [23:58:51] pretty much that's the only one you have to edit. If you install extensions, there may be others that need to be edited as well, but LocalSettings.php is usually it [23:59:34] 03brion * r31362 10/trunk/phase3/ (RELEASE-NOTES includes/SpecialUserrights.php): [23:59:34] * (bug 13135) Special:Userrights now passes IDs through form submission [23:59:34] to allow functionality on not-quite-right usernames [23:59:58] one last question. after i edit that file, how do i restart mediawiki so that those changes are incorporated?
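[Editor's note] To close the loop on jaypro_'s last question: LocalSettings.php is plain PHP that MediaWiki includes on every request, so there is nothing to restart — saving the file is enough (at most, clear any PHP opcode cache such as APC). A typical customization block, with illustrative values:

```php
// LocalSettings.php — illustrative values, not defaults
$wgSitename      = 'My Wiki';                       // shown in page titles
$wgLogo          = "$wgScriptPath/images/logo.png"; // 135x135px sidebar logo
$wgLanguageCode  = 'en';
$wgEnableUploads = true;                            // also check images/ perms
```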