[00:04:19] i am trying to find if it is possible to add a picture from the outside as an inline picture - is it possible? what is the syntax? where on mediawiki can i find it? any hints? [00:05:21] putlake: so, anyone editing a page that has such tags in wysiwyg mode is liable to destroy the tags, because he doesn't see them? are they even preserved if editing takes place somewhere else? [00:05:27] this seems pretty evil... [00:05:48] well, the absence of round-trip-compatibility with anything is the main reason there's no wysiwyg for mediawiki. [00:06:03] Duesentrieb: FCKeditor uses HTML front end so the tags are not visible [00:06:07] But they are preserved [00:06:50] as long as no one unwittingly deletes the space that contains them. [00:07:00] SvenG: http://www.mediawiki.org/wiki/Manual:%24wgAllowExternalImages [00:08:03] Duesentrieb: Yes that is definitely a limitation [00:08:45] putlake: to me, it's a reason never to consider it for anything serious, actually. [00:09:16] *putlake pretends to be hurt [00:10:00] :P [00:10:04] *stabstab* [00:10:17] Duesentrieb: It depends upon what you think your audience will be like [00:10:29] Duesentrieb: thanks! :) [00:10:56] hey there guys [00:10:58] putlake: well... if they are "non-tech" and want to use wysiwyg, then it has to be safe. [00:11:11] any way to add a background to mediawiki or know of any free skins/templates? [00:11:46] UK_Jimbow: put your custom styles into MediaWiki:Monobook.css (or MediaWiki:Common.css) [00:11:56] !skins | UK_Jimbow [00:11:56] UK_Jimbow : For information on creating or using skins, see . For simple CSS skins you can apply without editing the files, see . [00:12:05] has anyone tried to host a personal wiki... and synchronize it between hosts? [00:12:18] Duesentrieb: ah ok... so just sticking the url of the picture into that wiki does not make it show... apparently the feature is off. so.. what is the syntax to include an uploaded picture? "[[$file]]"?
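For reference, the feature behind the Manual:$wgAllowExternalImages link above is a LocalSettings.php toggle. A sketch (the whitelist variant is a companion setting on newer MediaWiki versions; the URL prefix is a placeholder):

```php
// LocalSettings.php -- let bare image URLs in wikitext render inline
$wgAllowExternalImages = true;

// Or, instead of a blanket 'true', limit inline images to trusted prefixes
// (available on newer versions; the prefix here is illustrative):
$wgAllowExternalImagesFrom = 'http://example.com/images/';
```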
[00:12:18] or at least the first part [00:12:19] but afaik there are really no ready-to-use 3rd party skins. [00:12:37] SvenG: [[Image:MyPicture.png]] [00:12:39] ah ok thanks [00:12:56] Duesentrieb: merci :) [00:12:59] SvenG: http://www.mediawiki.org/wiki/Help:Images [00:13:23] erflynn_: synchronize between hosts? what's that supposed to mean? [00:13:40] Duesentrieb: revelation! :) [00:14:06] Duesentrieb, err sounds vague doesn't it. It would be so cool to have a local copy of your wiki, make some changes, and later replicate the changes over to a remote copy [00:16:14] erflynn_: well... that's not really what mediawiki is designed for. maybe you should look at something more tailored for personal use / offline use. [00:16:35] erflynn_: you can use export/import to move pages around, but for day-to-day working, that would suck. [00:16:59] could you sync mysql databases? [00:17:04] mediawiki was written to serve LOTS of pages to LOTS of people and allow LOTS of edits all at once. [00:17:26] MZMcBride: if you use multiple masters (required for offline edit), you will get inconsistencies. [00:17:27] Duesentrieb, i'm sure there is a solution, since there's wikipedia and they need all kinds of database replication. so I guess this is more of a DB question. thanks for your help [00:17:35] MZMcBride: or your live copy is read only [00:17:45] in which case you should really use a CMS, not a wiki. [00:18:05] erflynn_: replication is one thing. offline editing is quite different. [00:18:18] Duesentrieb, that's true [00:18:19] erflynn_: replication means master/slave. i.e. you can only edit in one place. [00:18:40] mediawiki is specifically designed for that (wikipedia uses dozens of slave dbs) [00:19:33] wait how do you export a single page? [00:19:50] can you do that?
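The [[Image:...]] answer above takes extra options, documented on the Help:Images page linked in the same exchange; a few common forms (the file name is a placeholder):

```wikitext
[[Image:MyPicture.png]]                        <!-- full size, inline -->
[[Image:MyPicture.png|thumb|A short caption]]  <!-- framed thumbnail, floated right -->
[[Image:MyPicture.png|left|150px|alt text]]    <!-- scaled to 150px, floated left -->
```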
[00:20:38] use Special:Export - it will give you an xml file (you can get either the top revision, or everything, IIRC) [00:20:38] Special:Export [00:20:44] yep [00:21:32] sweet [00:21:33] thanks [00:23:02] is there any way of importing those xml files to WMF wikis these days? [00:23:19] or do you *have to* use special:import [00:23:38] wmf? [00:23:46] wikimedia foundation [00:25:48] Mike_lifeguard: hm? what other way would you like to have? [00:26:33] i'm just wondering for no particular reason [00:27:11] that doesn't make sense... over the web, admins can use Special:Import. From the command line, server admins can use the import scripts [00:27:13] as it should be [00:27:41] Yeah. that's fine. [00:27:43] generally, Special:Import is configured to only allow direct import from a set of specific sources [00:27:44] lol there's an article titled "Pity Sex" in wikipedia [00:28:20] but that reduces the places you can import from, and we have to bug the developers every time we want to add something [00:28:49] it's not an actual problem, and I only ask because erflynn's question reminded me about that old process [00:29:25] it's intentionally difficult/restrictive, because importing involves license issues. [00:29:38] actually, in the absence of SUL, it's pretty bad, because authorship is unclear. [00:29:45] yes. which is the problem of the importing admin. [00:29:56] re [00:30:00] in theory. [00:30:03] does someone talk C in here? [00:30:09] in practice, it's the problem of whoever has to clean up the mess. [00:30:13] that's why xml imports could still be useful; you can fiddle with the history to make usernames match up properly [00:30:33] Mike_lifeguard: i'd rather have SUL go live :) [00:30:38] someone was actually asking about that recently at en.wb [00:30:45] yeah so would I [00:30:52] i hacked a mediawiki-filesystem and now want to code it cleanly to release it.
It would be nice to have participants, since i'm really out of time :( [00:33:05] rigid: maybe you can excite flyingparchment for it :) [00:33:17] at least for giving it a look [00:33:34] yeah i'll idle here... if i see him [00:33:34] otherwise, file it with the ten million other almost finished oss projects on sourceforge. [00:34:04] heh... i really don't like to release the current version, the code is a shame :) [00:34:18] bah [00:34:20] be bold [00:34:28] anyway, going to bed. [00:34:30] have fun [00:34:37] heh.. nite [00:36:29] question: when you do an import using Special:Import, does it overwrite pages? [00:38:17] no. If you import to a page which already exists, it will history-merge [00:38:24] ok [00:38:25] great [00:38:30] this is perfect. [00:38:50] not really. you can "lose" the version you want to be on top at the end depending on when the revisions were made. [00:39:11] oh [00:39:12] make sure you have a copy of the wikicode in your clipboard so you can restore the version you want to be on top at the end of the day [00:39:41] so if you want to history-merge and have the pre-existing version still there, then [00:39:57] copy that wikicode to the clipboard, then import (which will automatically history-merge) [00:40:23] then edit the page, paste the clipboard contents and save, saying "restoring current version" or somesuch [00:43:00] clear as mud? [00:43:08] sry i was away hold on [00:43:41] 03(mod) api.php should honor maxlag - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11206 (10yuriastrakhan) [00:44:55] so i have a page dated nov 6, and i pull another version dated nov 5 (same namespace and title) then if i import over the nov 6 dated page, then it will sneak it in as an old revision? [00:46:28] all i want to do is update pages, don't really care about revisions so much [00:47:27] we always avoid importing to anything but the Transwiki: namespace specifically to avoid these issues.
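The Special:Export files shuttled around in the import workflow above are XML in the MediaWiki export format. A minimal Python sketch of pulling the title and newest revision text out of such a file; the sample document below is invented for illustration, but the element names and the XML namespace follow the shape of a real export:

```python
import xml.etree.ElementTree as ET

# Invented sample shaped like a Special:Export dump (content is illustrative).
SAMPLE = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.3/">
  <page>
    <title>Sandbox</title>
    <revision>
      <timestamp>2007-11-05T00:00:00Z</timestamp>
      <text>old text</text>
    </revision>
    <revision>
      <timestamp>2007-11-06T00:00:00Z</timestamp>
      <text>new text</text>
    </revision>
  </page>
</mediawiki>"""

# ElementTree wants namespace-qualified tag names.
NS = "{http://www.mediawiki.org/xml/export-0.3/}"

def latest_revision(xml_text):
    """Return (title, text) of the newest revision of the first page."""
    root = ET.fromstring(xml_text)
    page = root.find(NS + "page")
    title = page.find(NS + "title").text
    revisions = page.findall(NS + "revision")
    # ISO-8601 timestamps sort correctly as plain strings.
    newest = max(revisions, key=lambda r: r.find(NS + "timestamp").text)
    return title, newest.find(NS + "text").text

print(latest_revision(SAMPLE))  # -> ('Sandbox', 'new text')
```

This is also the stage where, as Mike_lifeguard notes earlier, one could fiddle with the history (e.g. contributor names) before feeding the file to Special:Import.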
[00:47:52] I *think* that your imported version will be in the history, and the pre-existing version will remain as the top revision [00:48:10] ok [00:48:15] if you're just updating a page maintained elsewhere, then you can import all versions to Transwiki:page [00:48:29] then move to Namespace:Page, deleting the version that is there [00:48:55] Simetrical, you about? [00:48:57] since you have all versions in your Transwiki:Page, you'll not lose anything [00:49:05] (assuming import doesn't fail) [00:49:07] setuid, yep. [00:49:08] I've posed the question to the wikitech-l list a few times, and I still don't see any valid response... [00:49:08] well i wouldn't be importing into anything on wikipedia, just personal wikis [00:49:16] What is all this fluff about broken dumps? [00:49:23] I've repeatedly proven they're not broken [00:49:41] At least as far as the latest public dump available via the rss feeds. [00:49:53] setuid, you've confirmed that all pages are present? [00:49:59] Er, wait, which RSS feeds [00:50:00] ? [00:50:03] Yes, all 5.27M rows [00:50:27] sure. but if it's MediaWiki, then it'll work (and if you're importing from WP etc, you still need to abide by GFDL requirements, which means you need a history) [00:50:53] Simetrical, I wrote a perl script which hits the feeds, grabs the url to the linked dump in the feed, fetches it, unpacks it, drops the tables in my db for that ${lang}_wiki, and then re-imports the whole shebang. [00:51:11] So I'm getting the latest dumps, via the rss feeds' link to the dump file [00:51:32] setuid, 5.27 M rows for what? [00:51:44] page table? Special:Statistics says there are 10.9M pages. [00:51:50] Simetrical, Sorry, enwiki's page table has 5.27M rows of articles [00:52:13] Right, someone answered that awhile ago, there _are_ 10.9M pages, but only 5.27M rows in the db, which _create_ those 10M pages. [00:52:22] Let me find the ref...
the 5.2M rows is correct [00:52:56] mysql> EXPLAIN SELECT COUNT(*) FROM page; [00:52:56] +----+-------------+-------+-------+---------------+---------+---------+------+----------+-------------+ [00:52:56] | id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra | [00:52:56] +----+-------------+-------+-------+---------------+---------+---------+------+----------+-------------+ [00:52:56] | 1 | SIMPLE | page | index | NULL | PRIMARY | 4 | NULL | 10910131 | Using index | [00:52:57] +----+-------------+-------+-------+---------------+---------+---------+------+----------+-------------+ [00:53:12] There should be one row per page. [00:53:23] Redirects and all that jazz have an entry in the page table. [00:53:50] Of course that's the complete dump, of all namespaces, and it's an up-to-date count. [00:54:12] If you're using an old or not-all-namespaces dump, that would have fewer by design. [00:55:58] 03(NEW) "All namespaces" in Special: Newpages does not work any more - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12031 normal; normal; MediaWiki: Special pages; (uv.wiki) [00:56:34] So these backups, even though the compressed file is correct, even though the row count has been roughly the same/slowly increasing for the last year, is 50% of what it should be? [00:56:43] If that's the case, then this has been broken for well over a year or more [00:56:47] I have no idea, I haven't looked at them myself. [00:57:48] how do you check the mediawiki version? [00:57:58] erflynn_, Special:Version [00:58:02] kk [01:00:19] Simetrical, So if I go to Special:Statistics for my _installed_ wiki, it's going to misreport the number of pages, right? (based on a possibly-incorrect number of inserted rows) [01:01:00] Simetrical, btw, I was the one who raised 'rsync' today on the list, if that didn't ring a bell from our discussions here about it a few years back. [01:01:02] setuid, if you don't use the dumped site_stats table, if we dump that. 
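A likely source of the 5.27M-vs-10.9M gap being argued above: the page table carries one row per page in *every* namespace, redirects included, while "articles" usually means non-redirect mainspace pages. A sketch of the two counts, using column names from the standard MediaWiki schema (Special:Statistics additionally requires an article to contain at least one internal link, so the second query is only an approximation of its figure):

```sql
-- Every row in page: all namespaces, redirects included (the ~10.9M figure)
SELECT COUNT(*) FROM page;

-- "Articles" in the rough Special:Statistics sense: mainspace, non-redirect
SELECT COUNT(*) FROM page
WHERE page_namespace = 0 AND page_is_redirect = 0;
```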
[01:01:10] setuid, I haven't been here a few years. [01:01:16] What's your name on the list? [01:01:21] Desrosiers [01:01:38] Ah, okay. [01:02:03] I'm mostly concerned with "cleansed", offline versions of ${lang}wiki, for the mobile projects I work with [01:02:16] I'm going to tinker with that python/c++ code and see where that gets me [01:12:25] question about calling special pages: is there a way on Wikipedia to call the page Whatlinkshere, that is, make it appear on a page, and not just link to it? I know it can be done with Prefixindex, among others… [01:12:47] Anyone use the CategoryCloud-like extensions? I started using "SelectCategoryTagCloud" and I've noticed that there appears to be a nasty bug in it. Whenever it parses the edited page it removes all whitespace from the front of the text in the lines. Which means that if you are creating sections by putting " " in front of the text it all disappears. Anyone know a fix for this? [01:21:22] 03(mod) Install the StringFunctions extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10ross) [01:30:05] would anyone be interested in links to Special:Userrights, on Special:Userlist [01:30:07] ? [01:37:33] ok, looks like today is Asking Day and not Answering… no problem :) [01:40:10] *amidaniel blinks [01:42:10] Simetrical, Something weird with using the same explain syntax on my server... I'll try a drop/reload and see what results I get. Takes about 1 hour to drop all tables and reload from the full dumps for the 9 languages. [01:42:27] It currently reports NULL for all fields, but select count(*) returns 5+M rows [01:42:38] setuid, probably not indexed. [01:42:38] mysql> select count(*) from ep_en_page\G; [01:42:39] *************************** 1. row *************************** [01:42:39] count(*): 5836166 [01:42:56] Well, it should still report a number of rows. [01:44:07] Actually, EXPLAIN should report a number of rows under any circumstance . . . anyway.
The point is: is that as many pages as there are supposed to be? Has the number of pages almost doubled since the dump? [01:44:32] Or not? [01:45:31] Over time, I've never seen a "doubling" (or halving) of pages, it's remained roughly constant, or steadily increasing, but no significant jumps... so that leads me to believe either I have the right number of pages, or... the dumps have been broken for > 1 year or more. [01:46:11] I just doubled the ram in the server here a few minutes ago, so I'll give it a go and see what happens [01:46:34] Hrm, or... maybe I can start with a smaller language wiki. Does this dump problem affect _all_ wikis? Or just enwiki? [01:46:57] I think just enwiki. I don't know anything much about it, though, honestly, you're asking the wrong person. [01:47:08] Doubling in a year certainly seems reasonable. [01:47:10] Has anyone implemented the Extension:SpecialForm? [01:48:27] Simetrical, sure, in a year's time, I'd agree, but that would be a gradual progression, not a jump of a million articles in a month [01:49:12] Er, if you gain five million pages in a year, you're going to be gaining not far from a million pages a month. [01:49:46] Anyway, there are stats somewhere on milestones. Check for the five- and ten-million page milestones and you'll see if the timescale is plausible. [01:50:10] bleh, ok, that came out wrong. What I meant was... if you graph the increase in articles, you don't have a huge spike, then a slow plateau upward... that's what I mean. Let's say the issue is a broken dump script, and it gets fixed, I'd expect to see a huge jump at the next dump... [01:50:15] Ok [01:50:30] I'll poke around, but you've given me some ideas to play with, with the :Statistics page [01:50:37] I'll compare my mirror with the live version and see [01:50:43] woo woo :P I fixed the TagCloud bug :P they were using php trim which strips all the blanks from the beginning and the end.
I changed it to rtrim; works like a champ it seems :) [01:51:24] setuid, note that the Statistics stuff is cached and may be incorrect if you load a dump. Try maintenance/rebuildStatistics.php first. [01:52:17] Hi everyone, I have a few quick questions [01:52:31] How can I make my URLs imitate Wikipedia's as in "/wiki/Article_name"? [01:52:40] Right now, I have the "index.php?title=Main_Page" style [01:52:40] Simetrical, No such beast on the current mw version or trunk [01:52:48] designs_703, mod_rewrite [01:52:50] setuid, feh, then I got the name wrong. [01:53:06] Simetrical, Nothing matching '*Statistics.php' at all, I'll grep around [01:53:27] initStats.php, maybe? [01:53:34] I need to patch a good chunk of maintenance/* anyway, so it takes params [01:53:35] !shorturl | designs_703 [01:53:35] designs_703 : To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at http://www.mediawiki.org/wiki/Manual:Short_URL. There are instructions for most different webserver setups. [01:53:46] because it doesn't work if you have table prefixes, or multiple wikis in one db [01:54:57] OK, thanks [02:04:45] Simetrical, What column should the index for page/text/revision be on? [02:04:54] just page_id? or page_title also? [02:05:15] setuid, check maintenance/tables.sql. I don't recall offhand. [02:05:18] There appear to be indices on page_random and page_len [02:05:21] Ok [02:05:28] page_title is almost certainly not part of the primary key. [02:05:41] Varchar primary keys are somewhat insane for InnoDB.
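Tambu's TagCloud fix above turns on the difference between PHP's trim (strips both ends) and rtrim (strips the right end only): with rtrim, the leading space that wikitext uses to mark preformatted lines survives. Python's strip/rstrip behave the same way, so here is the idea as a small Python analogue (the extension itself is PHP):

```python
# A wiki source line whose leading space is significant (preformatted block).
line = "  preformatted section\n"

# trim()-style: leading spaces are lost, so " "-indented wiki lines break.
assert line.strip() == "preformatted section"

# rtrim()-style: the trailing newline goes, leading indentation survives.
assert line.rstrip() == "  preformatted section"
```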
[02:05:55] UNIQUE INDEX name_title (page_namespace,page_title) [02:06:10] Looks like it is: [02:06:11] PRIMARY KEY page_id (page_id), [02:06:11] UNIQUE INDEX name_title (page_namespace,page_title), [02:06:46] 03(NEW) Include links to Special:Userrights on Special:Listusers - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12032 15enhancement; normal; MediaWiki: Special pages; (jlerner) [02:07:15] hrm, the index is already there in my version [02:16:22] 03(mod) Include links to Special:Userrights on Special:Listusers - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12032 +comment (10Simetrical+wikibugs) [02:20:39] is there a way to remove revisions? [02:21:01] erflynn: the oversight extension [02:21:52] you can selectively delete revisions, so that only admins can see them, but it requires oversight to remove them completely from the database [02:22:08] where do you get oversight [02:23:49] !oversight [02:23:49] Oversighting removes revisions from access by normal users and sysops. More information is available here: . [02:25:42] basically, i want to remove revisions from the database and from view [02:37:01] erflynn, Don't import that table then [02:37:13] import? [02:37:28] w/e [02:56:35] Anyone know a good faq for creating skins? I doctored the MonoBook skin up and i want to rename it to something else but I can't seem to find everywhere I need to rename it. [03:02:11] Tambu: I think you just rename it [03:02:17] then select it in your user preferences [03:02:44] jlerner: funny thing is I've renamed MonoBook.php MonoBook.deps.php monobook(dir) to the new name [03:03:06] it shows up in the Preferences but when I select it it's like it's not finding the files cause it loads some ugly default blank skin [03:04:12] there may be a few references in MonoBook.php [03:04:49] jlerner: yeah.. can't find it.. will keep looking. [03:05:36] 03(mod) Minor edit: change CSS class img.thumbborder to just . 
thumbborder - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11549 (10N/A) [04:00:25] Hmmm .. is the CIA bot still MIA? [04:04:31] I was just in #cia asking about it, no response yet [04:06:41] *setuid perks up [04:07:09] Tambu, The templating needs a retrofit, most of us agree... it's not a clean, self-contained, "drop-in" system as yet [04:25:15] conceptually, there's no reason why you couldn't extend Special:Export to redirect its output to Special:Import on another wiki - yes? [04:26:19] Why would you want to do that? [04:30:34] we're developing pages on a private site [04:30:45] then they get published to a more open site [04:31:21] setuid: i was hoping the publishing process could be one-click [05:47:34] CIA-MIA: O.o [05:48:12] jclerner: If you set up the remote wiki as an interwiki, you can do an interwiki import [05:48:24] And, I believe, batch interwiki import as well [05:50:07] amidaniel: great, I'll look into that [05:50:18] If I can help, please let me know. [05:58:43] 03(VERIFIED) Correct project namespace for Kannada Wikipedia - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11717 +comment (10shushruth) [06:15:54] what's the page that shows the extensions running again? [06:16:53] Special:Version [06:16:58] thanks [06:25:54] hey how/where do I go to enable nice urls [06:26:04] !faw [06:26:07] !faq [06:26:07] Before reporting a problem or requesting assistance, please check the FAQ first. The FAQ for MediaWiki can be found at http://www.mediawiki.org/wiki/Manual:FAQ [06:27:07] nevermind I had it haha [06:27:38] apparently PHP doesn't allow you to assign a constant expression to a class constant, it has to be a literal [06:27:58] so you can't split string literals over multiple lines using "." [06:40:08] TimStarling: can't you split string literals across multiple lines using double-quotes anyway? [06:53:48] Werdna: you mean with linefeeds in the string? or C-style? [06:54:24] linefeeds in the string [06:54:35] both work.. 
[06:54:36] well, the string in question can't have linefeeds in it [06:54:39] maybe you can escape the linefeeds, shell-style, that'd be pretty cool [06:55:07] why not? We do "this [06:55:14] all the time in Messages**.php" [06:56:03] nope, can't do either [06:56:39] o_O [06:56:43] c-style: PHP Parse error: syntax error, unexpected T_CONSTANT_ENCAPSED_STRING, expecting ',' or ';' in C:\htdocs\w\includes\Parser.php on line 62 [06:56:55] well.. that's kind of annoying [06:57:10] shell-style: PHP Warning: Unexpected character in input: '\' (ASCII=92) state=1 in C:\htdocs\w\includes\Parser.php on line 62 [06:57:25] Hi! Which is the most actual MessagesEn.php http://svn.wikimedia.org/viewvc/mediawiki/branches/REL1_9/phase3/languages/messages/MessagesEn.php?view=markup&sortby=file seems to be quite old [06:58:11] yes, that's because it's from 1.9 [06:58:24] sorry, shell-style error was incorrect [06:58:27] I'm not sure what you mean by the "most actual".. "most recent"? [06:58:36] actually shell-style gives you a literal \ and a literal LF [06:58:50] in which case you're after trunk/phase3/languages/messages/... [06:59:08] hmm.. what does linefeeds in the string give you? [06:59:24] it gives you linefeeds in the string [06:59:32] which as I already said, the string in question can't have [06:59:58] oh, I see what you're talking about [07:00:01] it being a regex without the /x modifier [07:00:26] I thought you wanted to put linefeeds in the string, but you somehow couldn't [07:00:33] Werdna do you have a link to a revision not older than one month ??
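The limitation Tim ran into above is real on the PHP of that era: a class constant's initializer had to be a compile-time literal, so even concatenating two string literals was a parse error (constant scalar expressions only arrived in PHP 5.6). A sketch of the failing form and the usual workarounds; the names are invented for illustration:

```php
class Example {
    // Parse error on PHP <= 5.5: the initializer is an expression,
    // not a literal.
    // const LONG_REGEX = '/first-half-' .
    //     'second-half/';

    // Works: a single literal, however long the line gets...
    const LONG_REGEX = '/first-half-second-half/';

    // ...or fall back to a static property, which CAN be assembled at runtime.
    public static $longRegex;
}

Example::$longRegex = '/first-half-' . 'second-half/';
```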
[07:01:21] gangleri: replace, in the url you have, branches/REL1_9 with trunk [07:01:35] ok looking [07:04:02] thanks it is http://svn.wikimedia.org/viewvc/mediawiki/trunk/phase3/languages/messages/MessagesEn.php?view=markup [07:04:59] yes [07:13:51] *amidaniel waves to CIA-37 [07:19:13] still dead, but there was an apparent system reset [07:57:09] 03siebrand * r27611 10/trunk/phase3/languages/messages/ (4 files): [07:57:09] Localisation updates from Betawiki. [07:57:09] * ca, kaa, sdc, ss [07:57:09] mediawiki: 03siebrand * r27611 10/trunk/phase3/languages/messages/ (4 files): [07:57:09] mediawiki: Localisation updates from Betawiki. [07:57:10] mediawiki: * ca, kaa, sdc, ss [07:57:56] uh, that's annoying [08:03:39] why? [08:03:41] :( [08:03:53] ah [08:03:54] two cias [08:08:11] I'm sorting out the CIA thing [08:36:02] 03siebrand * r27612 10/trunk/extensions/ (29 files in 26 dirs): [08:36:02] Localisation updates from Betawiki. [08:36:02] * Fixes and additions to 26 extensions for af, an, ar, el, eu, fr, hr, hsb, it, oc, pl, sv [08:49:18] 03siebrand * r27614 10/trunk/extensions/Blahtex/Blahtex.php: Update URL for documentation [08:49:43] Good morning [09:03:05] <_wooz> lo [09:13:38] I want to make a table that is wider than the screen get a scroll bar, how do I do this? Do I simply add something in the template, or in the article where the template is used, or do I have to do something with common javascript? [09:13:52] Example: http://af.wikipedia.org/wiki/2007 [09:15:18] better to redesign it [09:15:26] but there is overflow: scroll or something [09:16:39] i just moved my wiki and it's got no styles now, is there a config directive i have to change? [09:17:05] yes [09:17:09] flaccid: Look in LocalSettings, should be something there [09:18:21] i changed one, /wiki to / which fixed the infinite loop but not the styles [09:19:37] it's frustrating this. i don't know why it can't be coded portable.. [09:20:57] nope i can't find anything. no idea what it could be..
[09:22:20] i needed this $wgScriptPath = ""; didn't work as "/" [09:24:46] 03siebrand * r27615 10/trunk/extensions/Translate/ (MessageGroups.php Translate.php): Add support for translation of ChangeAuthor messages [09:25:47] 03siebrand * r27616 10/trunk/extensions/ChangeAuthor/ (ChangeAuthor.i18n.php ChangeAuthor.setup.php): [09:25:47] * update indentation of messages [09:25:47] * update nl messages [09:25:47] * change extension description [09:43:59] 03tstarling * r27617 10/trunk/phase3/includes/filerepo/RepoGroup.php: Fix bug in RepoGroup::getRepo(), 0 == 'local' [10:04:04] 03siebrand * r27618 10/trunk/extensions/ListChangedArticles/ListChangedArticles_body.php: Fix typo [10:04:31] 03tstarling * r27619 10/trunk/phase3/includes/filerepo/LocalFile.php: [10:04:31] Use setProps() to set properties in LocalFile::loadFromCache(), instead of DIY [10:04:31] property-setting. I must have forgotten to do this when I introduced setProps(), [10:04:31] the result was a bug (hereby fixed) causing zero cache-hit ratio for the image [10:04:32] cache. [10:15:38] 03(NEW) Parameter lost when using #ifexpr in another parameter - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12033 minor; normal; MediaWiki extensions: ParserFunctions; (gtisza) [10:22:43] 03tstarling * r27620 10/trunk/phase3/includes/filerepo/LocalFile.php: dataLoaded also needs to be set for negative cache hits. [10:23:09] 03(mod) Language::formatNum() should prefix negative values with − - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=8327 (10gtisza) [10:27:42] 03(NEW) Request: Different "MediaWiki:Spamprotectiontext" for local and global spam-blacklist - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12034 15enhancement; normal; Wikimedia: General/Unknown; (a9502784) [10:41:57] hey all [10:42:25] how can I display a list of most recent new articles (to include on the main page of sh wiktionary)? [10:48:44] does not validate. 
seems like two different templates have caused elements with duplicate @id's [10:50:47] cabrilo: You can transclude {{Special:Newpages}} [10:51:02] amidaniel: aha, thanks [10:51:19] chmod007: That's probably an issue to take up on the talk page, not really a mediawiki problem [10:51:21] amidaniel: but I'd only like names of the pages and links [10:51:47] amidaniel: mediawiki does guarantee document validity? [10:52:12] cabrilo: Then you're going to have to come up with a clever solution .. nothing in mediawiki will do that to my knowledge [10:52:19] 03(NEW) Text does not appear - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12035 15enhancement; normal; MediaWiki: Page rendering; (jeandavidrobert) [10:52:23] chmod007: I wish it were so :) [10:52:23] amidaniel: ok, thanks [10:52:58] amidaniel: oh, ok. I’ll try to work out the issue on wikipedia then [10:53:55] 03(FIXED) Text does not appear - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12035 +comment (10cannon.danielc) [11:15:23] 03(NEW) distinguish already deleted pages in Special:Newpages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12036 15enhancement; low; MediaWiki: Special pages; (pavel_vozenilek) [11:19:01] morning [11:56:19] I think there's a bug with calling __autoload() from call_user_func_array() [11:58:30] TimStarling: did you see the image caching stuff on #wikimedia-tech? [11:58:50] no [11:59:02] please read back. something is apparently very broken. 
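cabrilo's question earlier about listing recent new articles on a main page has a one-line wikitext answer: special pages can be transcluded like templates. A sketch (the parameterized form caps the entry count on MediaWiki versions that support it):

```wikitext
{{Special:Newpages}}      <!-- transclude the default Special:Newpages list -->
{{Special:Newpages/10}}   <!-- same list, limited to 10 entries where supported -->
```

Trimming the output down to bare page names and links, as cabrilo wanted, still needs the "clever solution" amidaniel alludes to (e.g. custom CSS or an extension).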
[12:00:04] 03(mod) Request: Different "MediaWiki:Spamprotectiontext" for local and global spam-blacklist - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12034 (10raimond.spekking) [12:04:38] re [12:06:59] 03(mod) Different error message when attempting to mark non-mainspace pages as patrolled - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12027 trivial->normal (10raimond.spekking) [12:10:23] 03(mod) import from oldwikisource to en.wikisource.org - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12022 +shell (10raimond.spekking) [12:11:08] 03(mod) New translation file for CategoryTree extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12018 (10raimond.spekking) [12:18:12] 03raymond * r27621 10/trunk/extensions/CategoryTree/CategoryTree.i18n.ms.php: [12:18:12] * (bug 12018) Add Malay translation [12:18:12] Patch by Aviator [12:18:32] 03(FIXED) New translation file for CategoryTree extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12018 +patch; +comment (10raimond.spekking) [12:21:43] 03raymond * r27622 10/trunk/extensions/ConfirmEdit/ConfirmEdit.i18n.php: [12:21:43] * (bug 12009) Update French translations [12:21:43] Patch by Bertrand GRONDIN [12:21:55] 03(FIXED) update french translation - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12009 +comment (10raimond.spekking) [12:40:37] 03raymond * r27623 10/trunk/phase3/languages/messages/MessagesNds_nl.php: [12:40:37] * (bug 12006) Translation alternate names of special pages nds-nl [12:40:37] Patch by Servien [12:41:59] i have the following problem [12:42:09] if i try to upload an 8MB or 10MB file i dont get any feedback and just see the empty upload page in mediawiki again [12:42:15] i have defined: post_max_size & upload_max_filesize in php to 15M [12:42:46] are there any other mediawiki values than $wgUploadSizeWarning ? [12:43:02] uploading 1M files works without problems [12:43:43] TimStarling: I committed a few changes to Linker.php and Parser.php yesterday...
don't know if you had a look at them [12:46:53] anyone got some minutes for my upload problem ? [12:47:02] Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 8388609 bytes) in /var/www/wikis/abt_edv/includes/SpecialUpload.php on line 1116 [12:47:36] reached that point with: post_max_size = 30M & upload_max_filesize in php.ini [12:47:42] Hello [12:49:46] increase memory size then? :) [12:49:57] domas: just doing that :P [12:50:19] seems like it is really a php problem right now [12:50:25] but if you got some minutes [12:50:45] i have a question regarding wgUploadSizeWarning [12:50:57] is it in bytes only ? or can i enter i.e. 10M ? [12:51:18] and am i forced to use a similar syntax to the example which is 150 x 1024 [12:58:12] and well. having changed: memory_limit = 30M && post_max_size = 30M && upload_max_filesize = 15M i have still that error with my 8MB testfile [12:58:19] Fatal error: Allowed memory size of 20971520 bytes exhausted (tried to allocate 8388609 bytes) in /var/www/wikis/abt_edv/includes/SpecialUpload.php on line 1116 [12:58:33] i dont understand where this value comes from [12:59:08] as textfile = 8M and the error/values havent changed after modifying php-values at all [12:59:39] i had the same problem with mediawiki 1.7, but seems i am still too ***** to get uploads > 1M working with mediawiki [13:02:44] 03thomasv * r27624 10/trunk/extensions/ProofreadPage/ (ProofreadPage.i18n.php ProofreadPage.php): make quality1 the default; restrict colours to page namespace [13:04:40] 03(NEW) Search feature within categories - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12037 15enhancement; normal; MediaWiki: Categories; (dunc_harris) [13:07:41] ok it really looks like it is related with the upload script which needs much more ram than filesize [13:08:04] so it would be interesting to see the php values of users who can upload i.e.
10MB as a fileupload in a mediawiki [13:08:33] using post_max_size = 30M, memory_limit = 30M and upload_max_filesize = 15M here with apache2 and php5 [13:08:53] heh, seems you are right on topic already :) [13:09:03] ? [13:09:10] i am asking atm :D [13:09:35] you wouldnt by any chance have the problem that you cant upload big files? [13:10:27] yeah [13:10:32] buuuut, my problem is right now that mediawiki is not making thumbnails for me when i use [[Image:Coffee.png|thumb|right|Coofofoffeeeeeee]] [13:10:35] just playing with php values right now [13:10:43] but when i use [[Image:Coffee.jpg|right|50px|]] it works [13:10:45] sorry wrong person to ask [13:11:09] odb|fidel_, you are trying to change the values in php.ini? they are probably not being used at all, try changing apache conf [13:11:28] for example php_value upload_max_filesize "25M" [13:11:54] why do you think it doesn't use php.ini values ? [13:12:26] odb|fidel_, i don't know the reason why it doesn't, but i just know it might not :) [13:13:31] 6mb works here, 8mb seems to be the magic frontier for uploads with my values [13:14:10] you should try changing your apache conf [13:14:45] in the php5 module of apache2 i guess right ? [13:14:56] yup, exactly [13:15:07] http://meta.wikimedia.org/wiki/Uploading_files#Frequently_Asked_Questions [13:15:44] dfp: gonna test this, thanks [13:15:54] i really expected php.ini as the magic location [13:16:14] thats what you would think :) [13:16:21] hehe [13:16:45] wondering about the notation in your link [13:17:02] shouldn't it be upload_max_filesize instead of upload max filesize [13:17:09] but, i gotta go, if someone has a solution to my thumbnail problem, dont hesitate to answer :) as long as you hilight the line with my nick [13:17:10] looks like missing ___ [13:17:17] :P [13:17:44] actually its just the html messing with you [13:17:46] they are there :P [13:18:04] ok, good luck with your problem dfp [13:20:49] damn [13:25:10] hi!
Is it possible to "hide" Special:Userlist from all users except bureaucrats and/or sysops? [13:27:25] yes [13:29:03] could you tell me where to find the manual or the instructions? I've been looking on MediaWiki and asked in the MWUsers forum, but I haven't found any how-to. [13:31:18] http://www.mediawiki.org/wiki/Manual:Preventing_access#Restrict_editing_by_all_non-sysop_users [13:31:53] except you want to restrict access to reading, not editing, of that special page [13:31:58] it's described below [13:38:15] anyone in here able to upload 10 mb files to his mediawiki and willing to compare some php values with me ? [13:40:05] 03(NEW) Extension:Assert Edit appears to have no effect - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12038 normal; normal; MediaWiki extensions: AdvancedRandom; (ais523) [13:41:57] Hello [13:42:09] lch, I'll read the manual more carefully when I have more time. If it isn't what I want, I'll continue searching or i'll ask here again. See you! [13:45:47] Hello [13:46:28] 03raymond * r27625 10/trunk/ (4 files in 4 dirs): * Updates German [14:40:25] hello i am receiving a Fatal Error: Allowed memory size of YYY bytes is exhausted. Ubuntu box [14:40:38] which values besides memory_limit could that be ? [14:41:26] i have changed memory_limit, post_max_size & upload_max_filesize already, but i still have that 20MB fatal error and i haven't found any 20mb setting in my php setup [14:54:48] odb|fidel_: this is per default in LocalSettings.php: ini_set( 'memory_limit', '20M' ); [14:55:12] odb|fidel_: 20MB should normally be plenty of memory for mediawiki. [14:55:45] hello Duesentrieb_ [14:55:48] yeah well [14:56:09] we have a docu-wiki, which is used i.e. to attach files [14:56:28] it works with 6M files, but 8M files fail with that limit [14:56:40] memory_limit is for ram. files don't go into ram. [14:56:44] haven't realized that there is a limit definition in LocalSettings too [14:56:48] at least, they shouldn't.
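The ini_set line Duesentrieb_ quotes is what caps uploads here; a hedged LocalSettings.php tweak (the 64M value is an arbitrary example) would be:

```php
# LocalSettings.php - MediaWiki's installer writes a 20 MB default:
#   ini_set( 'memory_limit', '20M' );
# raising it (or commenting it out) lifts the cap that the
# "Allowed memory size of 20971520 bytes exhausted" error points at:
ini_set( 'memory_limit', '64M' );
```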
[14:56:57] Duesentrieb_: well then i don't get my problem :D [14:57:19] i am trying to configure mediawiki, apache and php so that i am able to add 10M files to the wiki [14:57:35] i have defined a mem_limit of php of 64M [14:57:46] and you looked at max size for post and upload, for all three? [14:57:58] post_max_size = 25 & upload_max_filesize to 25 [14:58:16] Duesentrieb_: could you define "for all three" ? [14:58:19] ignore memory limit if you don't actually get errors related to that. as i said, it's for ram, and ram is not relevant for uploads [14:58:43] well i have no more ideas how to debug atm [14:58:45] odb|fidel_: apache, php, mediawiki. though mediawiki doesn't have a hard limit afaik [14:59:05] Duesentrieb_: gonna first test the mediawiki 20M limit, if you have some time [14:59:18] i gotta run in a couple of minutes [14:59:23] mh [14:59:32] odb|fidel_: one suggestion for debugging: simplify the setup. try with a php form that does nothing but letting you upload a file [14:59:34] any other information i can provide to you ? [15:00:06] the "php file" can actually be plain html, with a php extension. [15:00:13] Duesentrieb_: is it safe for testing purposes to change the 20M limit in LocalSettings to let's say 100 ? [15:00:37] just to be sure it's not that value and to see if i still get the same error ? [15:00:59] sure. or simply comment it out. [15:01:22] as it is part of LocalSettings.php there is no need to restart or similar, right ? [15:01:30] just try the upload once again [15:01:52] works [15:01:55] hrhr [15:02:15] Duesentrieb_: thank you for the input. It looks like in this special case the 20M limit caused the error [15:02:39] odb|fidel_: very interesting. you should find out why it's eating so much ram. what type of file are you uploading? [15:02:49] Duesentrieb_: thank you very much. have a nice day. [15:03:04] Duesentrieb_: i am using a windows command to generate .txt files with a fixed byte value [15:03:21] i.e.
created a 2mb file, 4mb, 6mb, 8mb, 10mb [15:03:42] ah... large text files... well, it might try to load them to check them for evil scripts. [15:03:48] it does that for unrecognized files [15:03:49] but i don't think it is related to the file-creation process [15:03:56] you probably wouldn't even have that problem with real data [15:04:06] Duesentrieb_: if you are interested, i could send you one tomorrow [15:04:08] it is related to the file contents [15:04:26] different file contents -> different checks -> different ram use. [15:04:29] Duesentrieb_: well i started with that topic as a colleague was unable to attach files [15:04:43] what type of files? [15:04:52] and he attaches several programming files from time to time [15:04:55] (btw, "attach" is not the right concept here) [15:05:06] like .h .hex or similar [15:05:19] sorry, not a native english speaker :D [15:05:21] yea. large text files [15:05:30] neither am i :) but it's a conceptual thing [15:05:31] in most cases its text files [15:05:34] attachments belong to a page [15:05:37] in some minor cases graphics [15:05:41] mediawiki uploads don't belong to anything [15:05:47] anyway... have fun. i gotta go [15:05:52] bye bye [15:05:57] thank you very much once again [15:06:07] np [15:06:26] you are always a big help in here. [15:08:31] 03vasilievvv * r27626 10/trunk/phase3/includes/api/ (ApiBase.php ApiHelp.php ApiMain.php ApiQuery.php): [15:08:31] * Use ApiBase::dieDebug() to render maxlag error properly [15:08:31] * Allow modules to ignore maxlag attribute [15:08:46] CIA works now. fine :) [15:24:08] hi [15:27:11] Am I missing something? Text seems to wrap just fine but if I use " " in front of text to put it into a box it stops wrapping. Is there a special command I need to use?
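Duesentrieb_'s isolation test from above — a page that does nothing but upload — might look like this minimal sketch (the filename and field name are made up for illustration):

```php
<?php
// upload-test.php - bypasses MediaWiki entirely, so any failure here
// comes from a PHP/Apache limit, not from a wiki setting.
if ( !empty( $_FILES ) ) {
    // 'error' is 0 on success; UPLOAD_ERR_INI_SIZE (1) or
    // UPLOAD_ERR_FORM_SIZE (2) mean a size limit was hit.
    var_dump( $_FILES['testfile'] );
}
?>
<form method="post" enctype="multipart/form-data">
    <input type="file" name="testfile">
    <input type="submit" value="upload">
</form>
```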
[15:32:32] Tambu: that's the markup for preformatted text [15:33:24] 03brion * r27627 10/trunk/phase3/ (11 files in 3 dirs): [15:33:24] Revert r27581, 27598, 27626 [15:33:24] format=raw is an HTML injection machine like action=raw but without any safeguards; it's trivial to create JavaScript exploits which hit at least Internet Explorer. [15:33:24] There's no reason to add a whole new danger point here when you've got machine-readable structure already... please do not add this raw formatter back. [15:38:07] 03raymond * r27628 10/trunk/extensions/ParserFunctions/ (Expr.php ParserFunctions.php): Use not , per hint of Simetrical on wikitech-l. Thanks. [15:38:38] h brion [15:38:59] yo [15:39:17] I did a bit of code review this morning, I had that format=raw down as a possible revert [15:39:22] didn't think of the XSS thing though [15:39:31] first thing i thought of :) [15:39:43] 03brion * r27629 10/trunk/phase3/includes/ (Linker.php Parser.php): [15:39:43] Reverting r27599 [15:39:43] * Uses hardcoded magic numbers extensively, which is poor practice [15:39:43] * Adds two hooks with no documentation [15:39:43] * Dropping $class unencoded into the HTML output feels like bad practice to me [15:39:44] * A link-by-link coloring plugin sounds like it could be very expensive to begin with; I'm a bit leery of adding in such overhead. [15:40:34] brion: and why did you revert bug 11206 fix? [15:40:46] VasilievVV|NA: i reverted the commits [15:40:59] brion sorry stepped away. So if thats preformated is there a way to create a "box" and have wiki inside it? [15:41:01] if you put an unrelated bug fix in with the security flaw, too bad for it [15:41:19] Tambu:
use a <div> and define whatever pretty style you like [15:43:11] brion: hardcoded magic numbers were there before... :-( [15:43:25] well don't spread em around dude :) [15:43:50] actually I think I reduced them... [15:44:20] brion: thanks.. I looked it up online, I must be missing something cause
<div>text</div>
doesn't seem to be doing anything. [15:46:42] brion: r27626 was unrelated to raw printer [15:46:55] they're all in a bunch [15:47:00] revert one, revert the following [15:48:00] please remember that the priority is to keep trunk working and secure [15:48:12] there's no special right to have commits stay there [15:48:31] now if part of it was unrelated and correct, then feel free to add it back [15:48:49] but i always reserve the right to revert anything with extreme prejudice. [15:49:06] the alternative is to *only* let a couple of people like me and tim ever commit anything [15:49:23] that would slow things down a bit. [15:49:48] *VasilievVV|NA imagined that way and started to manually restore his bugfix [15:49:54] super :) [15:50:20] scappin' [15:50:59] what happens to XML illegal characters in the edit box, do they get removed somewhere? [15:51:00] brion: that's fine with me, but maybe you could have a look at those colours and the way 'stub' links are coded [15:51:20] spread over 2 files, with magic numbers [15:51:32] that's not easy to work with [15:52:28] preg_replace( '/[\x00-\x08\x0b\x0c\x0e-\x1f]/', UTF8_REPLACEMENT, $string ); [15:52:33] I suppose that would do it [15:52:53] whenever i start a new page on my wiki installation, it automatically make the first letter capital. Is there any mechanism to make the first letter small ? 
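The preg_replace one-liner above can be mirrored in another language to see exactly what it strips; this Python sketch (the function name is mine, not MediaWiki's) replaces the control characters that XML 1.0 forbids with U+FFFD, just as the PHP snippet intends:

```python
import re

# Control chars below U+0020 that XML 1.0 forbids: everything except
# tab (\x09), LF (\x0a) and CR (\x0d) - the same class as the PHP regex.
ILLEGAL = re.compile(r'[\x00-\x08\x0b\x0c\x0e-\x1f]')

def strip_xml_illegal(text: str) -> str:
    """Replace XML-illegal control characters with U+FFFD."""
    return ILLEGAL.sub('\ufffd', text)
```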
[15:54:22] ThomasV: i'll take a gander in a bit [15:54:35] ok, thanks [15:55:12] someone plz help me out :) [15:55:18] I think the 'stub' test is replicated in three different places [15:57:38] *generalBordeaux wonders if anyone here gives a damn to his question :( [15:58:11] 03catrope * r27630 10/trunk/phase3/ (7 files in 3 dirs): Revert part of Brion's 27627: please don't throw away the child (maxlag) with the bathwater (format=raw) [15:58:25] RoanKattouw: hi [15:58:31] Hi VasilievVV [15:58:54] *VasilievVV|NA just commited his r27630, and saw a commit conflict [15:59:04] *RoanKattouw caused that [15:59:31] http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=27630 [15:59:56] *VasilievVV|NA saw report about it by CIA-37 [16:01:07] What were you trying to commit anyway? [16:01:57] can anybody direct me to a mediawiki support/help channel, if there is one ? [16:02:04] Right here [16:02:07] generalBordeaux: it's here [16:02:38] VasilievVV|NA: i think i asked a pretty simple n straight question with no crap .. [16:02:51] whenever i start a new page on my wiki installation, it automatically make the first letter capital. Is there any mechanism to make the first letter small ? [16:02:51] Let's hear it [16:02:58] Yes [16:03:06] http://en.wiktionary.org/ does it for instance [16:03:08] Lemme check [16:03:59] http://www.mediawiki.org/wiki/Manual:%24wgCapitalLinks [16:04:13] Basically, what you want to do is add $wgCapitalLinks = true; to LocalSettings.php [16:04:24] false, not true, sorry [16:04:37] RoanKattouw: thanks a ton ... [16:04:52] *generalBordeaux thinks he is a fool as he was googling about the damn thing :P [16:05:01] Forget Google [16:05:03] www.mediawiki.org [16:05:26] RoanKattouw: thanks man ... 
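RoanKattouw's corrected answer as a LocalSettings.php fragment:

```php
# LocalSettings.php - let page titles begin with a lowercase letter,
# wiktionary-style (the default, true, forces an uppercase first letter):
$wgCapitalLinks = false;
```

Note that the manual warns against flipping this on a wiki that already has pages, since links to existing capitalized titles can break; it is best set at install time.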
i acted stupidly :( [16:05:32] You didn't know [16:06:25] 04(REOPENED) Create an author namespace on huwikisource - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11425 +comment (10gtisza) [16:06:42] 03huji * r27631 10/trunk/phase3/languages/messages/MessagesFa.php: [16:06:42] * Fixes bugs 12023, 12024 and 12025. [16:06:42] * Translation of the monthname-gen months ending with "heh" were updated. [16:06:42] * Date format was updated and completed. [16:06:54] 03(FIXED) 'edit' message incorrectly translated to Persian - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12023 +comment (10huji.huji) [16:07:20] 03(FIXED) Correcting Persian date formats - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12024 +comment (10huji.huji) [16:07:23] 03(FIXED) Persian genitive month names incorrect - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12025 +comment (10huji.huji) [16:13:51] 03catrope * r27632 10/trunk/phase3/includes/SpecialBlockip.php: [16:13:51] APIEDIT BRANCH MERGE: [16:13:51] * Splitting UI and DB logic in SpecialBlockip.php [16:13:51] ** doBlock() does the actual work [16:13:51] ** doSubmit() wraps around it [16:13:52] ** Introduced BLOCK_* constants [16:13:54] 03huji * r27633 10/trunk/phase3/languages/messages/MessagesFa.php: Right-to-Left-Marker should appear in the beginning of all of the strings, to ensure the text is shown in correct order in left-to-right wikis. [16:22:08] hello [16:23:30] I'm trying to move 2 basic wiki installations onto one single server : I have issues with the DB migration : how should I do that ? I already mysqldumped the 2 original DBs, but they both have the same name : how can I integrate them without confusion ? [16:24:23] RoanKattouw: Just a piece of advice, since brion is so busy and there is lots of disk space it is best to commit different things separately. That way if one dies they both don't.
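For the clashing-database-names question above, one hedged approach (all names here are illustrative) is to restore each dump into its own, differently named database and point each wiki at the right one via $wgDBname in its LocalSettings.php:

```shell
# Create two fresh databases, restore one dump into each.
mysql -u root -p -e "CREATE DATABASE wiki_one; CREATE DATABASE wiki_two;"
mysql -u root -p wiki_one < first_wiki.sql
mysql -u root -p wiki_two < second_wiki.sql
```

Alternatively, both table sets can share a single database if each wiki uses a distinct $wgDBprefix.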
[16:24:49] I've just updated from 1.7.7 to the last stable version and I've got the following error message : "Notice: Uninitialized string offset: 0 in /var/www/wiki/includes/Parser.php on line 2198" (I've followed the update instructions) [16:25:01] electron I know, but it wasn't my commits Brion reverted, they were VasilievVV's [16:25:10] RoanKattouw: Oh, sorry. [16:25:22] VasilievVV|Busy: Just a piece of advice, since brion is so busy and there is lots of disk space it is best to commit different things separately. That way if one dies they both don't. [16:26:24] Could anybody help me with this error [16:27:03] Mazzu: in LocalSettings.php add error_reporting( E_NONE ); somewhere [16:27:30] electron bad idea [16:27:43] Errors should be fixed, not suppressed [16:27:56] Mazzu what's the error? [16:28:27] "Notice: Uninitialized string offset: 0 in /var/www/wiki/includes/Parser.php on line 2198" [16:28:37] ^^ [16:28:41] MW version? [16:29:45] the last stable one : 1.11 [16:29:53] Right [16:30:26] RoanKattouw: It could just be that his server has a high error reporting level in php.ini. [16:30:32] True [16:30:37] Oh it's just a notice [16:30:41] :) [16:30:46] Then set the error_level to something sensible [16:30:49] But not E_NONE [16:31:26] Mazzu: Add to LocalSettings.php: error_reporting( E_PARSE ); [16:31:28] ok, then E_ALL ^ E_NOTICE ? [16:31:35] E_PARSE is better [16:32:09] No it isn't. [16:32:16] error_reporting(E_ALL ^ E_NOTICE); [16:32:18] is good [16:32:31] ok [16:32:36] many many thanks [16:32:54] RoanKattouw: Since you want all error notices except E_NOTICE. [16:33:13] Because really E_NOTICE are the only non-bad things.
[16:33:22] True, but I very much doubt you want stuff like E_RECOVERABLE_ERROR and that kind of nitpickery if you don't even want E_NOTICE [16:33:34] With E_PARSE you'll only get stuff that's worse than E_NOTICE [16:34:17] true [16:35:40] 03mark * r27634 10/trunk/debs/squid/debian/ (5 files in 2 dirs): (log message trimmed) [16:35:40] squid (2.6.16-1wm1) edgy; urgency=medium [16:35:40] * New upstream release 2.6.STABLE16 [16:35:40] - Drop patch 23-storedir-minobjsize, now integrated upstream [16:35:40] - Drop patch 01-cf.debian, conflicts with upstream changes and serves [16:35:41] no purpose for Wikimedia use [16:35:43] * Add a build dependencies on libgoogle-perftools-dev, debhelper and [16:48:06] 03(mod) Extension:Assert Edit appears to have no effect - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12038 (10raimond.spekking) [16:50:31] 14(DUP) Search feature within categories - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12037 +comment (10raimond.spekking) [16:50:34] 03(mod) Restrict search by category, or set of categories (e.g. range of date categories) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=2285 +comment (10raimond.spekking) [17:12:50] 03(mod) List of footnotes separate from references in Cite.php - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11899 +comment (10random832) [17:32:44] electron: [18:47:08] revert one, revert the following [17:33:13] eh? [17:33:22] What does that mean..? [17:34:34] electron: it means that brion reverted code to last stable version [17:35:12] Yeah, I missed the conversation. However my point still stands, nonetheless. [17:35:34] And I saw that it has already been mentioned. [17:35:40] So sorry for repeating others. [17:35:51] *electron will read backscroll in future. 
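The consensus the channel reaches can be written as a LocalSettings.php fragment; note that E_NONE, suggested earlier, is not actually a PHP constant (error_reporting(0) is the real "silence everything" call, and a bad idea for the reasons given):

```php
# LocalSettings.php - report everything except notices:
error_reporting( E_ALL & ~E_NOTICE );
```

The chat's `E_ALL ^ E_NOTICE` behaves the same here, since the E_NOTICE bit is set in E_ALL; `& ~` is the more defensive idiom because XOR would re-enable the bit if it were ever clear.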
[17:36:08] VasilievVV: you committed a change to the format=raw thing together with the maxlag thing, which is why Brion just reverted the whole thing [17:36:22] Brion's pretty much the only one who gets away with laziness, since he's ever so busy [17:39:25] *setuid smirks [17:44:22] Hello, Could anyone help with the pdf extension please ? [17:44:42] I cannot figure out how to use it? [17:45:08] oops : PDF Book extension [17:50:33] 03(FIXED) Create an author namespace on huwikisource - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11425 +comment (10jeluf) [17:51:24] 03rotem * r27635 10/trunk/phase3/languages/messages/MessagesHe.php: Fixes. [17:52:00] 03rotem * r27636 10/trunk/phase3/includes/Article.php: Fixing a typo that broke historywarning. [17:52:47] 03rotem * r27637 10/trunk/phase3/languages/messages/MessagesHe.php: Typo. [17:56:54] Hello, Could anyone help with the pdf book extension please ? [17:57:10] Houka what's your problem? [18:01:09] 03(mod) Extension:Assert Edit appears to have no effect - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12038 +comment (10ssanbeg) [18:12:29] 03(FIXED) New mailing list for eu.wikipedia.org - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11774 +comment (10cbass) [18:18:17] 03(FIXED) Request for New ML: WikiJa-sysops - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11882 +comment (10cbass) [18:20:53] ok, quick question - is there a way to display the original author, last person to edit an article, and time of last edit? [18:22:05] i.e., a way to display this automatically at, say, the top of every page? [18:22:45] I know you get the time of last edit at the very bottom, but I want to make this slightly more obvious, and more detailed [18:29:28] hey, anyone know how I'd build mediawiki rpms for centos5? 
[18:30:04] let's suppose I was going to use mock [18:33:56] 03tlaqua * r27638 10/trunk/extensions/PasswordReset/ (PasswordReset.php PasswordReset_body.php): Changed required permission from 'userrights' to 'passwordreset' - bug 11914 [18:34:10] Trying to make a pdf book out of my wiki ... maybe the architecture of the wiki does not fit [18:34:19] why is a password reset extension needed? [18:34:43] 03(FIXED) Change the 'userrights' requirement by 'passwordreset' in the PasswordReset extension - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11914 +comment (10t.laqua) [18:34:52] when email is disabled f/ random resets. [18:35:32] don't hate. ;-) [18:35:42] I have a main page which is only a table of contents and links to articles... How to make a book out of it ? [18:36:14] Recursively looking for all pages... [18:38:06] 03straussd * r27639 10/trunk/fundcore/modules/fundcore/gateways/ (fundcore_moneybookers.module fundcore_paypal.module): Add one-dollar check [18:40:19] flyingparchment: For users who don't have shell access or are unable to follow basic instructions. [18:40:46] ;-) or if you work in higher ed and people can't be bothered with tasks like resetting their own password [18:41:11] *TimLaqua really dislikes adding new permissions. [19:24:57] 14(DUP) Seperate email notification for accounts created by other users ? - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=11616 +comment (10raimond.spekking) [19:24:58] 03(mod) Use a separate message for the email content when an account is created - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=3973 +comment (10raimond.spekking) [19:26:51] 03maarten * r27640 10/trunk/extensions/Wikidata/ (4 files in 2 dirs): Version 0.1 of the wikidata api module. [19:27:50] 03maarten * r27641 10/branches/wikidata/includes/ (5 files in 2 dirs): Version 0.1 of the wikidata api module. [19:30:02] In #1192 "We got RF5413/5414 to work", is that the same as the AR5007 ?
Is there a way to test it with a distro kernel or is that restricted to the -wireless git? [19:32:53] buggs: i suspect that was meant for a different channel :) [19:33:24] uhm sorry [19:33:44] madwifi mediawiki, so many chars in common ... [19:33:58] hehe :) [19:44:55] is this the right channel for help using mediawiki? [19:45:11] 03maarten * r27642 10/trunk/extensions/Wikidata/util/tbx.xsl: fixed xsl:param should be xsl:variable [19:47:18] what is the problem Hoki? [19:47:30] we talk here about software issues [19:48:25] I'm putting a table on a page but I want some text underneath and no matter where I put it in the edit box it appears on top [19:51:16] examples with the right table syntax: http://de.wikipedia.org/wiki/Wikipedia:Tabellen [19:51:29] Hoki; you probably made a small error in your syntax. If you don't properly close all tags the layout will be pretty messed up :) [19:51:56] Hoki; try if a very simple table works by copy/pasting some simple example from the manual [19:52:07] okies [19:54:27] well pasted table example works [19:59:58] ahh found it [20:17:03] when dealing with attachments is it easier to upload files and link them in a normal fashion or to edit the localconfig and add the file types needed ? [20:17:38] does he have an email address noon ? [20:19:31] anyone could help with the pdf_book extension please ? [20:20:05] Is there any way to use an image server or content distribution network with Mediawiki? It would be enough if I could add 'http://www.example.com/' in front of every image URL. [20:20:15] I3ooI3oo, I don't really understand your question? [20:20:27] Houka, please read the subject :) [20:21:29] oops sorry... [20:22:59] !externalimages | BartVB [20:22:59] BartVB : To allow images from elsewhere to be included in your wiki, see . To limit this to some specific sources, see . [20:23:05] !farm | BartVB [20:23:05] BartVB : To run multiple wikis, you do not need anything more than to run one wiki.
You simply install them in different folders, and if possible using seperate databases. If you only have one database, simply use a different table prefix. For more advanced setups, see [20:23:32] I3ooI3oo: "easier" depends on your use case. [20:24:14] Duesentrieb_, thanks but that's not really what I'm looking for :) I would like to use the upload features of Mediawiki but serve the images though a CDN for performance and bandwidth reasons. [20:26:53] BartVB: well, it's all http, right? so the obvious thing is to use transparent reverse proxies. wikimedia for example has > 100 squid servers for delivering content. [20:27:10] i think more like 40 [20:27:14] beyond that, you'll have to ask someone else. i don't know much about cdn [20:27:33] flyingparchment: really? i though it was more. isn't the total > 300 now? [20:27:56] Duesentrieb_, that's exactly what I want :) But to do that I need to change URLs like "/images/a/b/bla.jpg" to "http://www.mysquidserver.com/images/a/b/bla.jpg" [20:28:02] so... there are a lot of squids, a bunch of apaches, and some db slaves... [20:28:17] BartVB: no you don't. [20:28:41] Duesentrieb_, well, I don't if I want to put everything through squid but I don't (at this moment anyway :D) [20:28:44] BartVB: there's geo-dns, or you have the domain point to a load balancer [20:28:54] but talk to someone who knows more about this than i do :) [20:29:11] putting everything through squid is the way to go. [20:29:18] for some stuff, it can kjuts forward [20:29:24] *just [20:29:33] (which is, again, what wikimedia does) [20:29:48] alas not in my setup, some other applications on that site aren't really compatible with reverse proxies :\ [20:30:02] BartVB: anyway - "everything" is relative. you can configure the domain from which images are served (somehow). [20:30:10] so, it would only apply to your images. [20:30:33] ah, that's exactly what I'm looking for :) but I can't figure out how to do that [20:30:48] flyingparchment: 40? 
[20:30:51] I think I've found a way to do this for deleted images but that's not that useful :D [20:31:01] Duesentrieb_ well our site is used internally only, not viewable by non-registered members [20:31:06] BartVB: i'm not sure, but maybe $wgUploadPath can be a full url prefix... [20:31:12] we write custom software [20:31:12] mark: 30+10? or did i miss some? [20:31:15] 75 [20:31:16] BartVB: oh, do ask mark - he's the wikimedia network guru :) [20:31:19] and 90 within a few weeks [20:31:25] major releases will be linked via the wiki [20:31:43] *mark is not mediawiki guru [20:32:29] I3ooI3oo: i wouldn't upload software releases to the wiki. it's pointless. this would only make sense if it was hard for people to make the files available on the webserver by other means. [20:32:56] we our users are idiots [20:32:59] I3ooI3oo: in your context, i would expect your build/release system (or person) to do it. so just link them [20:33:01] *well [20:33:15] err, i thought your users are the developers writing this stuff. [20:33:26] 1/2 are [20:33:37] the other users are field service personnel [20:33:39] I3ooI3oo: well, who supplies the files? the idiots? [20:33:50] come on, developers are idiots :) [20:33:52] no the software departments [20:33:57] BartVB can't you somehow set up Apache to rewrite/alias/whatever those image URLs to your external repository? [20:34:35] I3ooI3oo: if you want random users to be able to upload random files, in a wiki-like, nice and unorganized fashion, then use wiki uploads. [20:35:03] I3ooI3oo: note that there's a limit to the size of files php accepts in uploads, the default being something like 2MB. [20:35:04] but alas some of my developers don't understand linux in the least and can't be relied on to upload files correctly [20:35:28] I3ooI3oo: this is why you should have a build manager... [20:35:48] I3ooI3oo: anyway: do whatever fits your needs. try it out.
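mark's $wgUploadBaseUrl suggestion as a hedged LocalSettings.php sketch (the hostname is illustrative):

```php
# LocalSettings.php - serve uploaded files from another host (a CDN,
# squid layer, or dedicated image server) while MediaWiki keeps
# writing them into the local images/ directory:
$wgUploadBaseUrl = "http://images.example.com";
```

When set, this is prepended to $wgUploadPath in generated image URLs, so only the serving hostname changes, not the storage location.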
[20:35:48] so i guess i can create a page to "help them upload the files" [20:35:55] sure [20:36:27] RoanKattouw, I can but that's really, really inefficient (and I loathe mod_rewrite) [20:36:39] BartVB: why doesn't setting $wgUploadPath work for you? [20:36:47] one that does the uploading creates a link for them to copy into the article [20:36:59] thanks for the help [20:37:00] (and implementing your extension is still on the todo list :\) [20:37:08] mark: $wgUploadPath can be a http://full url then? [20:37:14] I would assume so [20:37:18] I would at least try ;) [20:37:36] sounds like the logical thing [20:37:44] mark: err - isn't that the way wikipedia does it? [20:37:49] yes [20:37:57] As far as I understand wgUploadPath needs to be a local path? [20:38:03] But I'll take a look [20:38:04] did you actually *try* [20:38:24] $wgUploadBaseUrl = ""; [20:38:29] I think this is what you need, actually [20:38:36] No because it wouldn't really make much sense to upload to a URL :) [20:38:39] hmm, interesting [20:38:58] path != directory [20:39:37] BartVB: no, path -> web, dir -> diles. there's an upload path, and an upload dir. and, as mark pointed out, also the base url, which probably is in fact what you want :) [20:39:50] * dir -> files [20:40:21] hmm, nice. going to try that now :) [20:43:01] great! That did exactly what I want :) Thanks a lot! [20:50:49] is there any way to force no stretching of an image if the given dimensions are larger than the image itself? [20:51:30] afaik only |thumb ... but it adds a frame [20:51:48] right :( [20:52:52] Danny_B: no [20:53:03] |frame works too (with frame as well :-/) [20:53:42] *Danny_B wonders how difficult it would be to add a nostretch keyword [20:54:46] hey, i have installed mediawiki, and am a bit confused [20:54:52] how do i create an article? [20:54:58] can i have subcategories? [20:55:02] hello [20:55:19] say [[Image:Foo.png|400x600px-nostretch]] or something like that [20:58:36] anyone?
:) [20:59:27] dsmtuners [20:59:27] dsmtuners, you can probably create an article by reading the documentation :) [20:59:46] You create an article by just going to an article that doesn't exist yet, type something in the edit box and hit Save [20:59:54] And yes, you can have subcategories [21:00:11] Look at http://en.wikipedia.org/wiki/Category:China for instance [21:02:04] how do i change the background color of a cell in a table? [21:05:02] fuzzy, |style="background-color: #00ff00;"|cell text [21:05:12] ah ok [21:10:26] is there any way to customize the navigation page? [21:10:49] hrrrrm. [21:10:50] navigation page? [21:10:56] can't even figure out how to edit the main title. lol [21:11:03] MediaWiki:Sidebar [21:11:05] "Editing Main Page" [21:11:31] dsmtuners: edit the content, you mean? you can't edit a title - you can move pages. and you can tell mediawiki to use another page as the main page. [21:11:42] oh - "move" [21:11:44] :) [21:12:14] thanks TimLaqua [21:12:16] dsmtuners: yes. but you probably also want to edit MediaWiki:Mainpage after doing that to the main page. that's the system message that determines which pages *is* the main page. [21:13:17] fuzzy, there's a way to do just about everything in MW. ;-) [21:13:22] This page provides interface text for the software, and is locked to prevent abuse. [21:13:27] how do i unlock it? [21:13:27] Hi, does anyone know if the Usage_Statistics extenstion works in mediawiki 1.6.8 ? [21:13:38] fuzzy, login. [21:13:41] ah [21:13:44] fuzzy: you don't. you need to be logged in as sysop to edit [21:13:53] thanks [21:13:53] TimLaqua: as sysop. [21:14:11] Duesentrieb_, if he hasn't logged it, we can safely assume that only ID1 exists. [21:14:17] ;-) [21:14:24] well... probably :) [21:14:48] but it's good to know that not all logged in users can edit that [21:15:26] Duesentrieb_, but we musn't destroy the magic! [21:15:44] the easter bunny lives. 
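fuzzy's cell-styling answer, and Hoki's earlier text-below-the-table problem, both fit in one illustrative wikitext sketch; the closing |} matters, since without it following text ends up inside the table:

```
{| class="wikitable"
|-
| plain cell
| style="background-color: #00ff00;" | green cell
|}
This text renders below the table.
```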
[21:15:58] he was just abducted by aliens [21:16:07] good thing we still have superman. [21:16:44] So is FCKEditor being actively developed by the FCK guy? [21:16:50] er, the extension that is [21:17:08] I saw he got commit rights a few months back [21:18:09] ya [21:18:15] brion! :) [21:18:35] sweet. [21:18:41] *brion chmod u-s setuid [21:18:52] FCKEditor scares the fuck out of me. [21:18:55] *setuid looks for his 4 bit [21:19:02] Hi, does anyone know if the Usage_Statistics extension works in mediawiki 1.6.8 ? [21:19:09] Duesentrieb_, Is that the one they use in Wordpress? It's VERY broken [21:19:14] Duesentrieb_, ya, I tried hacking it into submission a few months back. I'm really happy he came on board. [21:19:23] apparently, it simply hides everything it doesn't understand. so editing in wysiwyg mode means potentially killing content without noticing [21:19:34] Duesentrieb_, That's the experience I have [21:19:39] It clobbers tags [21:19:54] Duesentrieb_, oh for sure - one big issue we had was the MW DD/DT syntax. FCKEditor did some funky stuff w/ it. [21:20:00] i dunno, i usually run as soon as i hear "wysiwyg". [21:20:07] WikEd is a nice try [21:20:25] it's slow, and the highlighting isn't live though. too bad. [21:20:29] I think FCKEditor *can* work well. [21:20:29] e [21:20:43] <blockquote><p>this para</p> <p>That para</p> <p>Third para</p></blockquote> is eaten by FCK as <blockquote>This para That para Third para</blockquote>
[21:20:51] TimLaqua: i don't think anything based on a round trip via html will ever really work [21:21:11] setuid: is <pre> inside <blockquote> legal? [21:21:11] why's that? [21:21:18] really, it's just rendering the HTML [21:21:25] the underlying code doesn't change [21:21:36] TimLaqua: you edit on the html level, afaik [21:21:40] Duesentrieb_, Same with <br /> at the end of each section [21:21:42] but i didn't look into it, really [21:21:51] setuid: too damn bad :) [21:21:55] hahaha [21:22:13] So blockquote means "smush everything together into one big run-on paragraph" [21:22:45] hm... timstarling sounded excited about some new parser thingy... i'm very curious.... [21:23:15] Duesentrieb_, eh? FCKEditor by default throws out html tags in one (hidden) box, and the other layer that you actually edit on is just a representation of the hidden code [21:23:33] but you can modify how fckeditor behaves and help it understand wikitext... sort of. [21:23:51] wikitext is a little too far removed from html and that's where fckeditor seems to freak out [21:23:53] TimLaqua: custom tags? parser functions? [21:24:33] 03raymond * r27643 10/trunk/phase3/ (5 files in 4 dirs): * (bug 3973) Use a separate message for the email content when an account is created by another user [21:24:33] Duesentrieb_, how on earth would we ever make that WYSIWYG? [21:24:59] TimLaqua, Play with the latest Wordpress in Admin mode (Write Post), and toggle between the WYSIWYG mode and the "Tag" mode, and see how it eats tags, lists, and other things. [21:25:14] Duesentrieb_: From what I can gather he has 1) rewritten the parser so that it takes 33ms to parse a test page instead of 72ms, and 2) run the current parser and the new one through a fuzz tester and fixed a variety of bugs. [21:25:24] setuid, yeah, the mediawiki.fckeditor.net extension does the same thing [21:25:36] setuid, but I think it's all configuration. you *can* make it understand [21:25:40] TimLaqua: you wouldn't. that's why the max you can get is WYSIWYM. But there has to be some representation for it - at least a grey block or something.
otherwise, it'll get accidentally deleted [21:25:42] Right [21:25:49] It's all 1's and 0's [21:26:05] 03(FIXED) Use a separate message for the email content when an account is created - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=3973 +comment (10raimond.spekking) [21:26:10] electron: sounds good. very good. if that also means we get a decent spec [21:26:37] setuid, ;-) I'm just sayin' that the problem w/ the wordpress and current mediawiki.fckeditor.net implementations is simply a configuration issue [21:27:10] Well, where "configuration issue" is "Rewrite the code to render the tags properly" [21:27:15] I've also seen fckeditor do "i don't know what this is" representation as well (those grey boxes) [21:27:32] Maybe implementing a Gecko backend [21:27:40] setuid, no, fckeditor has a solid JS-based configuration [21:27:46] I know [21:28:05] understanding tags is universal - start, stop. [21:31:12] anyhoo, i'm excited to see some more fckeditor extension releases. [21:33:05] TimLaqua, It's not that simple [21:33:26] <blockquote>foo</blockquote> is not the same as <blockquote><p>foo</p></blockquote>
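The structural point being argued here can be checked mechanically: the two markup variants render almost identically in a browser but produce different parse streams, which is exactly what a round-tripping editor must not normalize away. A minimal sketch with Python's stdlib `html.parser` (the exact tags were stripped from this log, so the `blockquote`/`p` contrast is an assumption from the surrounding discussion):

```python
from html.parser import HTMLParser

class TagEventCollector(HTMLParser):
    """Records the stream of start-tag, data, and end-tag events."""
    def __init__(self):
        super().__init__()
        self.events = []
    def handle_starttag(self, tag, attrs):
        self.events.append(("start", tag))
    def handle_endtag(self, tag):
        self.events.append(("end", tag))
    def handle_data(self, data):
        self.events.append(("data", data))

def tag_events(html):
    """Tokenize markup into a flat list of parse events."""
    collector = TagEventCollector()
    collector.feed(html)
    collector.close()
    return collector.events

bare = tag_events("<blockquote>foo</blockquote>")
wrapped = tag_events("<blockquote><p>foo</p></blockquote>")
# Different event streams, so an editor that rebuilds markup from its
# internal DOM has to remember which form the author actually wrote.
print(bare == wrapped)  # False
```

Note that `html.parser` is deliberately tolerant of the malformed input mentioned later (unclosed quotes and the like), which is also why tokenizing with a real parser beats grabbing tags with a regex.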
[21:33:39] And making sure there's no missing bits, unclosed quotes, etc. [21:33:47] on a tag level, it is [21:33:59] Like:

[21:34:28] If you can tokenize it out, by building a DOM tree, and keep the tag attributes intact _without_ parsing HTML, then yes, I agree. [21:34:30] most parsers would grab the tag, then parse quotes [21:34:50] *setuid is very familiar with this, having written hundreds of HTML parsers in the last 10 years [21:34:54] TimLaqua, It's not that simple [21:35:11] You're trusting that quotes, tags, and humans will write syntactically-correct HTML [21:35:29] *TimLaqua sighs [21:35:41] yes, we tidy things, verify, etc. [21:35:58] of course [21:36:04] Duesentrieb_: s/good/too good ;P [21:36:07] And it's easy to get garbage past tidy(1) [21:36:30] But as long as you flag/remove/comment-out garbage that can't be handled, and pass the rest to FCK, that's probably fine [21:36:44] setuid, I don't even know what we're talking about anymore [21:36:48] hahahaha [21:36:53] Ok, lay it to rest [21:37:03] setuid, but we have come full circle to fckeditor being workable. ;-) [21:37:35] so, new topic - who's working on mwbb? [21:38:21] sf.net has devs listed as stinkfly and iqbalo_cool [21:39:39] and i've never heard of either of them. [21:42:22] brion: concerning Linker.php, that link coloring plugin was not going to be used on a whole wiki, but only on a few pages. it generates one sql request per link. [21:42:53] and displaying the colors is intended to save page requests [21:44:37] HELP! Does anyone know if the Usage_Statistics extension works in mediawiki 1.6.8? [21:45:15] !tias | kryptt [21:45:15] kryptt: Try it and see. You learn much more by experimentation than by asking without having even tried. [21:46:00] mwbot: it didn't work for me... [21:47:11] kryptt: so the answer is no, apparently. [21:47:11] mwbot: ...already tried... i wanted to know if anyone knew what was the minimum version for it.. [21:47:17] kryptt: anyway - 1.6.8? why? [21:48:24] Duesentrieb_: we already have a whole bunch of pages...
it's in constant use, and i am not particularly fond of upgrading [21:48:35] *or risking the upgrade... [21:48:39] kryptt: your loss then. [21:48:51] !upgrade [21:48:51] http://www.mediawiki.org/wiki/Manual:Upgrading [21:49:37] kryptt: actually, you should at least keep up with the security patches. 1.6.8 has known exploits. and you really should upgrade to latest. [21:49:57] upgrade risk always exists, but it's minimal if you simply copy everything before doing it. [21:49:59] Duesentrieb_: it's in a private network... [21:50:18] ok, then just live with the general suckage, or upgrade [21:51:00] Duesentrieb_: yeah... I'll probably do a backup and try everything out during the weekend... oh well... [21:51:48] kryptt: read the backup page. beware mysql's charset issues. do an xml dump for good measure. [21:52:32] Duesentrieb_: Ok, thanks. [22:11:09] hm. how do i remove a user account? [22:11:47] (he has no edits). [22:13:40] Well, you can at least remove it from the database. [22:13:44] Of course, that's risky... [22:18:00] User Merge & Delete, baby! [22:18:11] mm... too much coffee. [22:25:00] 03(NEW) Fatal error (memory) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12039 15enhancement; normal; Wikimedia: Interwiki links; (marius) [22:27:32] 03(mod) Fatal error (memory) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12039 15enhancement->04CRIT (10marius) [22:29:52] 14(INVALID) Fatal error (memory) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=12039 +comment (10t.laqua) [22:38:28] 03raymond * r27644 10/trunk/phase3/ (RELEASE-NOTES includes/SpecialUserlogin.php): [22:38:28] * Do not force a password for account creation by email [22:38:28] set pseudo password, it will be replaced later by a random generated password [22:48:56] question: is it possible to provide my own implementation of EditPage::showEditForm in the form of an extension? [23:00:55] anyone?
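The backup advice in this exchange (copy everything, make a SQL dump, mind MySQL's charset issues, and take an XML dump for good measure) can be sketched as command builders. The database name, user, and install path below are placeholders, and `--default-character-set=binary` is the commonly recommended workaround for the charset problem mentioned, not something stated in the log:

```python
def mysqldump_command(db, user, out="wiki_backup.sql"):
    """Build a mysqldump invocation for the wiki database.

    --default-character-set=binary sidesteps MySQL charset
    conversion, the "charset issues" warned about above.
    """
    return ["mysqldump", "-u", user, "-p",
            "--default-character-set=binary",
            "--result-file=" + out, db]

def xmldump_command(mediawiki_dir="/var/www/wiki"):
    """Build a dumpBackup.php invocation for the XML dump.

    dumpBackup.php ships in MediaWiki's maintenance/ directory;
    --full exports every revision of every page to stdout, so
    redirect the output to a file when running it.
    """
    return ["php", mediawiki_dir + "/maintenance/dumpBackup.php", "--full"]

print(" ".join(mysqldump_command("wikidb", "wikiuser")))
print(" ".join(xmldump_command()))
```

Doing both dumps before an upgrade is the cheap insurance Duesentrieb_ describes: the SQL dump restores the exact installation, while the XML dump survives schema changes between MediaWiki versions.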
[23:19:00] 03siebrand * r27645 10/trunk/phase3/languages/messages/ (13 files): [23:19:00] Localisation updates from Betawiki. [23:19:00] * an, ay, br, ca, eu, fiu-vro, fo, fr, kaa, nl, qu, sdc, stq [23:27:53] hi ppl [23:28:20] how can i tell which mediawiki version is installed if i just have access to the web directory [23:28:37] i mean access to the server [23:28:57] i've been looking at the files but can't find where it says what version it is [23:28:58] :( [23:29:53] Visit the wiki page "Special:Version". [23:30:04] 03siebrand * r27646 10/trunk/extensions/ (4 files in 4 dirs): [23:30:04] Localisation updates from Betawiki. [23:30:04] * Fixes and additions to 4 extensions for ar, eu, fo, hsb, kaa, nl [23:49:24] 03(mod) Break messages used in Special:Statistics down further - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=5619 (10jidanni)
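Special:Version answers the question from a browser; with only filesystem access, the same information lives in the source tree, since MediaWiki of this era defines the version as $wgVersion in includes/DefaultSettings.php. A sketch of pulling it out (the standard file layout is an assumption about the install):

```python
import re

def mediawiki_version(defaultsettings_text):
    """Extract the version from the text of includes/DefaultSettings.php,
    which defines it as e.g.  $wgVersion = '1.11.0';"""
    m = re.search(r"\$wgVersion\s*=\s*'([^']+)'", defaultsettings_text)
    return m.group(1) if m else None

# A fragment of what the real file looks like around the definition.
sample = "<?php\n# ...\n$wgVersion = '1.11.0';\n"
print(mediawiki_version(sample))  # 1.11.0
```

In practice: `grep wgVersion includes/DefaultSettings.php` from the wiki's web directory does the same job in one line.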