[00:10:45] Hi all. [00:11:37] This patch fixes t/inc/Search.t - http://www.shlomifish.org/Files/files/code/mediawiki/mw-fix-inc-Search-t.patch - it is the product of hours of effort. [00:14:42] it's pretty dead in here tonight [00:15:26] if there's a bug in bugzilla for search.t, you may want to add the patch to that [00:15:40] carl-m: OK. [00:16:20] I don't know why the API is giving me NeedToWait errors half of the time [00:17:05] TheLetterX: Roan made some changes to fix that, but it will be a few days before they get to the live server [00:17:14] Ok [00:17:19] Why was it happening? [00:17:41] the main cause was that the clocks on the different servers are off by about three minutes [00:18:04] Ah [00:18:08] also, throttling for successful logins will be removed - when the patches get to the live servers [00:18:30] in the meantime, all you can do is not log in too frequently [00:19:26] 03brion * r40058 10/trunk/phase3/includes/EditPage.php: (log message trimmed) [00:19:26] Revert r40027 "Recommit my changes to EditPage without the regressions:" [00:19:26] This caused at least a regression in tab order which causes all my edits to fail: [00:19:26] 1) Click "edit" [00:19:26] 2) Hit "tab" to tab into edit field [00:19:27] 3) Type something [00:19:31] 4) Hit "tab" to tab into the comment field [00:21:31] hello [00:22:29] carl-m: doesn't seem to be anything of relevance - I'll try tomorrow. [00:22:32] Bye. [00:22:40] is anyone here familiar with querying the mysql database that mediawiki is hosted on directly? [00:22:48] zenkat, yes. [00:22:53] !ask | zenkat [00:22:53] --mwbot-- zenkat: Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [00:23:33] apologies. i would like to get the list of wpids of all active pages that would be output by the dumpbackup.php script. [00:23:37] what is the sql to do this? [00:24:04] wpids? [00:24:05] hmmm, i think i just apologized to a bot. :) [00:24:11] sorry. page_id [00:24:25] What's an "active page"? [00:24:42] anything that would be output by dumpbackup.php [00:25:43] it seems that this is not just: [00:25:51] select page_id from page; [00:30:32] 03dale * r40059 10/trunk/extensions/MetavidWiki/skins/mv_embed/ (23 files in 2 dirs): [00:30:32] fixes for playlists [00:30:32] added example_usage folder and more example clips (moved samples to that folder) [00:30:51] zenkat, dumpBackup.php dumps all pages unless you specify otherwise. [00:35:45] How does MediaWiki check whether a given user's IP address matches any blocked CIDR range? [00:38:20] @simietrical -- but "all pages" is not "select page_id from pages; [00:38:31] or am i mistaken? [00:38:39] Pathoschild: do you want a pointer to the code, or a description? [00:38:53] Both, or a description. :) [00:40:06] first, it takes the first 16 bits of the ID as a "range" [00:40:36] a rangeblock is entered in the database as that prefix, and the starting and ending values within the range [00:41:06] so the software makes a query for any entry in the db with the right range, where the starting address is <= the given address and the limit of the block is > the given address [00:41:19] the function is loadRange() in Block.php [00:42:08] zenkat, the logic is in includes/Export.php on line 166 ff. [00:42:31] Cool, thanks! Let me go look ... 
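
The range-block check described just above (00:40–00:41) boils down to a query along these lines. This is only a rough sketch of what Block::loadRange() in Block.php does, not the actual code: the ipb_range_start/ipb_range_end columns are the ones named a little further down, but IP::toHex() and the exact comparison operators here are assumptions.

    $hex    = IP::toHex( $ip );        // e.g. "C0A80001" for 192.168.0.1
    $prefix = substr( $hex, 0, 4 );    // first 16 bits, used as the "range" key

    $dbr = wfGetDB( DB_SLAVE );
    $block = $dbr->selectRow(
        'ipblocks',
        '*',
        array(
            'ipb_range_start LIKE ' . $dbr->addQuotes( $prefix . '%' ),
            'ipb_range_start <= '   . $dbr->addQuotes( $hex ),
            'ipb_range_end >= '     . $dbr->addQuotes( $hex ),
        ),
        __METHOD__
    );
    // $block is the matching range-block row, or false if no block covers $ip
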
[00:43:46] 03stipe * r40060 10/trunk/extensions/MetavidWiki/skins/mv_embed/mv_embed.js: CC button enabled, added a delay before processing plugin load callbacks [00:44:09] zenkat, every page appears to be dumped by default. [00:44:20] carl-m: Are the starting and ending ranges available from the toolserver, or only the CIDR notation in the ipblocks table? [00:44:32] starting and ending addresses, that is. [00:44:51] as far as I know everything is in the ipblocks table [00:46:03] which stores the addresses in packed form [00:47:10] Ooh, there it is: ipb_range_start and _end. :D [00:47:32] Thanks. :) [00:48:29] @sim - when i do the following query: [00:48:32] select page_id, page_title from page where page_title = 'The_Cleveland-Loretta_Quagmire'; [00:48:35] i get: [00:48:45] +---------+--------------------------------+ [00:48:45] | page_id | page_title | [00:48:46] +---------+--------------------------------+ [00:48:46] | 2070258 | The_Cleveland-Loretta_Quagmire | [00:48:46] | 3716857 | The_Cleveland-Loretta_Quagmire | [00:48:46] +---------+--------------------------------+ [00:48:48] 2 rows in set (1 min 19.03 sec) [00:49:18] oopos, nevermind ... [00:51:29] ls [00:52:09] @simetrical -- many thanks for your help ... i need to do some more research on my side ... [00:52:58] zenkat, those are in different namespaces, probably a page and its talk page. [00:53:06] Select page_namespace as well to see. [00:53:14] And don't paste a lot of text, use a pastebin. [00:53:26] yes, that was my "d'oh" moment :) [00:53:33] "pastebin"? [00:53:38] !pastebin [00:53:38] --mwbot-- Please do not paste more than 2-3 lines of text into the channel as it disrupts the flow of conversation. Instead please use a pastebin such as and post a link to your paste in the channel. [00:53:47] thx [00:54:15] @simetrical -- some issues we were tracking down here made it seem like not all pages were exported by default [00:54:25] but i think we must be mistaken [00:54:46] thanks for your help, we need to do more homework here :) [00:56:34] 03stipe * r40061 10/trunk/extensions/MetavidWiki/skins/mv_embed/mv_embed.js: callbacks only delayed while plugin verified to not exist [01:27:02] \ [01:56:18] Hmm... for some reason, my script is saying my (admin) account is not allowed to use rollback (when requesting an API rollback token) [01:57:01] see what you get for the user info query, which lists the rights that the api thinks you have [01:58:02] It says I can has rollback [02:01:44] Aha [02:01:48] I was using my old username [02:01:53] that would do it [02:06:10] @Simetrical ... many thanks for the link into Export.php [02:06:24] we have investigated a bit more, and it looks like it is not so simple [02:07:13] there appear to be "hidden" pages in the pages table that are named like "hidden@#######" [02:07:45] these pages are excluded from dumps when we use the --current flag [02:08:25] by the clause "page_id=rev_page AND page_latest=rev_id " [02:09:42] this explains the mysteries we were seeing ... [02:10:16] anyways, thanks again for pointing us in the right direction! [02:19:48] I forgot, how do you edit templates? [02:19:52] those called by {{}} [02:22:15] hola alguien habla espa�ol? [02:22:42] Lars_G: they are at a page such as Template:X [02:22:48] just edit that page for {{X}} [02:22:51] hmmm right [02:22:52] thanks [02:37:25] hi i is it possible to merge an old database to the new database of my current mediawiki? [02:38:55] merge or move? 
[02:38:58] hello [02:39:46] merge [02:46:52] 03aaron * r40062 10/trunk/extensions/FlaggedRevs/ (FlaggedRevs.php specialpages/RatingHistory_body.php): Limit use of READER_FEEDBACK_SIZE [02:55:24] 03tstarling * r40063 10/trunk/extensions/TrustedXFF/TrustedXFF.php: Fixed configuration section [02:56:49] hi [02:57:15] are there any admins / developers of mediawiki around? [02:58:02] Just ask. [02:58:14] !ask [02:58:14] --mwbot-- Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [02:58:35] *MZMcBride glares at Werdna. Brevity is the soul of wit. [02:59:44] Brevity is .. wit. [02:59:54] :-) [03:01:17] ok then [03:02:57] actually, i just want to know if you guys offer remote installation service... [03:03:43] my boss is just curious [03:05:45] we are planning to launch a wiki site... suing mediawiki [03:05:51] using mediawiki i mean [03:05:56] using, hopefully :P [03:06:07] :D [03:06:09] well, you can usually find someone experienced who'll do it for you [03:06:17] however, Wikimedia doesn't offer one officially. [03:06:25] (for a fee, of course) [03:06:37] yes, for a fee of course [03:07:07] I think Nick Jenkins set up a website with people willing to do freelance stuff, but I'm not sure what happened to that [03:07:51] i see [03:07:57] how about storage services? [03:08:02] wikihr.net, maybe? [03:08:09] chimera7562: as in somebody to host your site for you? [03:08:53] im not really sure what my boss meant but he wants to know if you have storage services [03:09:16] anyway, ill just ask him again later to clarify it [03:10:55] Werdna, thanks for the help :) [03:11:34] last question... im just curious... who started wikimedia [03:11:45] ooh, this is an interesting question. [03:11:59] Jimmy Wales and Larry Sanger started 'Nupedia', which was supposed to be done by experts [03:12:15] one or the other had the idea to allow anyone to contribute with a wiki model (Wikipedia) [03:12:21] and that started in January 2001. [03:12:58] Larry left because of issues with the project's treatment of experts (I think), and in 2004 the website became, not a project of Bomis (Jimmy's company), but its own entity called Wikimedia. [03:13:39] tell him about Bomis [03:14:14] wow... cant believe something this big started from just two people :) [03:14:22] anyway, thanks guys for the help :) [03:14:38] Bomis was a porno site :P [03:14:45] in a manner of speaking, anyway :P [03:19:42] Werdna: Larry left because Jimmy stopped paying him [03:21:41] TimStarling: o rly? [03:21:47] not over some policy issue [03:21:59] I wasn't there. [03:22:00] he only started writing about experts and policy years later [03:22:20] I wasn't there either, but it's all still there to read [03:22:24] apparently that's his way of talking up citizendium and hiding the real reason, then. [03:23:31] it's said that he had the same opinions about experts in 2001, but he didn't care to start a project until after wikipedia took off [03:24:15] http://meta.wikimedia.org/wiki/My_resignation--Larry_Sanger [03:24:18] ah, here we go. [03:24:36] I last read it years ago, so evidently my recollection is a little fuzzy. [03:25:44] Hmm...API move isn't giving a reason... 
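
The join quoted above at 02:08 ("page_id=rev_page AND page_latest=rev_id") is what restricts a --current dump to pages whose latest revision actually exists, which is why the "hidden@..." rows mentioned there drop out. Written as a standalone query it looks roughly like this; the surrounding code is illustrative only, not the real Export.php/dumpBackup.php internals.

    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->select(
        array( 'page', 'revision' ),
        array( 'page_id', 'page_namespace', 'page_title' ),
        array( 'page_id = rev_page', 'page_latest = rev_id' ),
        __METHOD__
    );
    while ( $row = $dbr->fetchObject( $res ) ) {
        // every $row->page_id here would be included in a --current dump
    }
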
[03:27:12] 03aaron * r40064 10/trunk/extensions/FlaggedRevs/specialpages/RatingHistory_body.php: add vote count to tables [03:27:31] Got it, nvm [03:32:34] 03tstarling * r40065 10/trunk/extensions/BotQuery/query.php: English. [03:32:54] !rev 40065 [03:32:54] --mwbot-- http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=40065 [03:33:24] *Werdna hard-disables Raymond_ [03:46:03] 03(NEW) tag cannot be rendered correctly when more than one tag is in the same line - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15333 normal; normal; MediaWiki: Page rendering; (liangent) [04:09:13] how do I list all users of my wiki? mysql or web doesn't matter but how can I do it? [04:10:25] Special:ListUsers usually [04:10:42] unless you want those with userpages: Special:Prefixindex/User: [04:12:37] Splarka: thank you !! [04:55:06] TimStarling: Brion mentioned something about perhaps needing a better test system for the ui... JSharp pointed Selenium out to me some time ago... it might actually be pretty usefull [04:55:39] HI I'm trying to import an image dump that I downloaded using the wikix tool and it isn't working. The images show up in the filesystem and they are in the database, but the links to them in the wiki articles aren't working. [04:56:15] I imported them using maintinence/importImages.php [04:56:39] is there some other script I should be running also or what? [04:58:15] you could try refreshLinks [05:04:24] 14(INVALID) deleteRevision.php does not erase version from history - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15291 +comment (10hugoborrell) [05:19:27] sigh, again I try to google for something and find something I did as the first result. Damn you google, stop being such a slut to wikipedia [05:20:33] suckups [05:20:47] Time to fix a regression... [05:27:04] is this a regression I should be worried about? [05:28:50] Dantman: trying that now [05:28:55] heres hoping it works [05:28:58] god its slow [05:29:29] No, just the one in the edit form brion spotted [05:30:18] Attributes inputted into EditPage::showTextbox were being overridden... so the tabinputs that I gave it were being ignored [05:30:31] tabindex* [05:30:54] I fixed it... and I'm doing a page diff at the moment to make sure there are no more ui changes in the rest of the page [05:31:22] Perhaps I'll create some sort of dom-diff tool to check for differences in the dom of the edit page [05:40:32] TimStarling: The editpage contains a small bit of markup to create spacing between a pagetop preview and the edit toolbar... Previously the markup was "preview...
..." but my changes make it "preview..." basically the break
block has been moved outside of the preview area [05:40:42] That ok, or would that be a regression? [05:41:17] sounds fine to me [05:41:40] mkay... that was the only other difference [05:42:17] There are no more ui regressions so I'll fix and recommit [05:56:47] 03dantman * r40066 10/trunk/phase3/includes/EditPage.php: [05:56:47] Recommit the EditPage changes: [05:56:47] Thanks for the heads up brion. $attribs was being overridden in EditPage::showTextbox rather than having individual things set which caused the tabindexes inputted to be ignored. Your test case is fixed. [05:56:47] I did a page diff, there are no more ui regressions. The only difference is the preview-toolbar
has been moved from the inside of the preview to outside of it. There's no real difference in that. [05:58:03] hello, i moved over a wiki from one server to another, anytime i click basically anything i get "Warning: session_start() [function.session-start]: open(/home/users/web/b872/ipw.insomnia-consulting/phpsessions/sess_1acd1e9c78ecd043156389f19f694cd9, O_RDWR) failed: No such file or directory (2) in /var/www/www.pghcodingdojo.org/htdocs/includes/User.php on line 627" [05:58:23] i can't seem to find the settings to fix this [05:58:50] looks like an issue with your php session settings [05:59:36] sessions work fine with everything else, just not media wiki [06:03:41] 03(mod) tag cannot be rendered correctly when more than one tags are in the same line - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15333 summary (10liangent) [06:04:25] Dantman: Parse error: syntax error, unexpected T_DOUBLE_ARROW in EditPage.php on line 1505 [06:04:43] Agh.. checking [06:05:10] Sorry bout that [06:05:23] I fixed on my webserver, then redid the same locally... [06:05:30] cept I forgot to remove the > this time [06:06:52] 03dantman * r40067 10/trunk/phase3/includes/EditPage.php: Minor error... sorry bout that, forgot to remove the > when I redid the fix locally. [06:07:16] fixed [06:08:17] :/ [06:13:28] 03dantman * r40068 10/trunk/extensions/StringFunctions/StringFunctions.php: Committing Juraj Simlovic's updated StringFunctions.php code. [06:13:52] Dantman: stringfunctions is deprecated. [06:14:03] Whatsthat? [06:14:19] AFAIK stringfunctions was merged into parserfunctions [06:14:29] brion reverted that [06:14:29] TimStarling: or did that get reverted? (stringfunctions being in parserfunctions) [06:14:33] o rly [06:14:35] why? [06:14:59] He took one look at the code, and said he found them to be etremely memory inefficient [06:15:07] That's what the stringfunctions update is about [06:15:18] everyone work on krimpet's lua extension instead [06:15:23] Someone did some work to try and make it more memory inefficient [06:15:59] Though, I'm not completely confvinced it's good... it's probably, better, but still not pfunc worthy [06:16:02] TimStarling: brion doesn't like that. [06:16:13] I talked to him the other day about it. [06:16:58] brion would like it if we had some kind of support for external mediawiki installations [06:17:11] like abuse filter :/ [06:17:34] he wants me to rewrite the parser before it goes live. [06:17:38] but if that's the only thing holding it back, it's going to become a pretty weak excuse [06:17:39] erm, the PHP one. [06:17:56] texvc doesn't work on safe_mode shared hosts either [06:18:27] you can't implement LUA in PHP.. that's insanity. [06:20:45] 03(mod) Install the StringFunctions extension - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=6455 (10dan_the_man) [06:24:40] <_wooz> lo [06:31:45] ^_^ heh... creating a dom-diff engine is going to be tricky when I have little knowledge of diff algorithims [06:33:34] look up longest common subsequence problem. [06:33:44] or just derive from a generic implementation. 
[06:44:33] 03(mod) some pages are delivering raw GZIP encoding - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15149 (10tk1b) [06:46:01] 03aaron * r40069 10/trunk/phase3/includes/DefaultSettings.php: Add $wgPhpCli var so shell outs know what exec name to use [06:47:11] Dantman: just do a text diff [06:47:21] if you want it to ignore whitespace, strip the whitespace first [06:49:16] hmmm [06:49:23] Well it's more than that' [06:49:33] I wan't to ignore attribute location differences [06:49:53] you can normalise that. [06:49:57] ie: type="text" accesskey="," vs. accesskey="," type="text" should output the same [06:50:02] but it's ugly. [06:50:02] 03(NEW) Bad rendering - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15334 normal; normal; MediaWiki: Page rendering; (liangent) [06:50:24] We'll see [06:50:26] yeah, normalising will be easier than writing a diff algorithm for the tree [06:50:47] hhmm... right... ^_^ doms are trees... [06:50:48] heh [06:51:05] Mkay.... [06:51:08] you can use one of the XML parsers to give you an appropriate view into the tree [06:51:13] Mhmm [06:51:14] and use that to output a normalised form [06:51:42] I'll use a xml parser to output into a kind of special format showing only the relavant data line by line and sorting things... then a simple text diff will show differences [06:51:57] sounds good [06:52:57] But first... the murder of a frozen block of ground beef needs my attention.... ((cue meniachal laughter...)) [06:53:26] 03aaron * r40070 10/trunk/extensions/FlaggedRevs/ (14 files in 2 dirs): [06:53:26] * Add some urlencode calls and refactor ugly link code [06:53:26] * Add some HTML escaping [06:53:26] * On windows, use NUL:, not NUL [06:53:26] * Use wfShellExec() [06:53:27] * Let $wgFlaggedRevsStatsAge be set to false to disable stats updates [06:53:29] * Move some message loading to constructors [06:54:09] thanks Aaron [06:57:14] 03(mod) DISPLAYTITLE doesn' t always work when current revision is sighted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15330 (10dapete.ml) [07:08:28] mmm... any chance to have a preference to set a max number if diff lines to attempt to render... [07:08:45] preferences need to be fixed. [07:08:54] (those damn overflow table cells can be so slow to load) [07:09:15] ((and there is little indication in the byte change to indicate how bigg the diff will be)) [07:09:58] morning [07:10:20] if/of,gg/g [07:20:07] simple question, but how do I link from an image to a different article (i.e. not to the image)? [07:20:18] !imagelink [07:20:18] --mwbot-- Image linking is intentionally not supported in MediaWiki, so access to the image's description page is always available. If you need image links, see for methods and information. One way to achieve this is . [07:20:32] omnivibe: ^^ [07:20:40] o_O thanks! [07:23:23] *Splarka still thinks that should be (default off) optional in core [07:23:59] [[Image:foo.png|link=[[bar]]|baz]] [07:36:46] PLEASE?!? And have it work with class="icon-click". [07:36:53] *Lady_Aleena gets on her knees and begs. [07:37:15] *AaronSchulz loves Apache [07:37:35] (the melody) [07:38:39] 03nad * r40071 10/trunk/extensions/SimpleSecurity/SimpleSecurity.php: bug fix: userCanReadTitle not accounting for $wgPageRestrictions [07:38:52] Splarka: I tried and implemented it once, but it didn't seem to work well, so I dropped that patch [07:39:32] doh [07:43:10] oooh icon extension works great! :) [07:43:13] thanks [07:45:28] time for scap? 
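
A minimal sketch of the "normalise, then text diff" approach discussed around 06:49–06:52: dump each element on its own line with its attributes sorted by name, so that type="text" accesskey="," versus accesskey="," type="text" stops registering as a difference, then feed the two dumps to any plain text diff. The function name and the use of PHP's DOMDocument are illustrative, not what was actually written.

    function normalizeDom( DOMNode $node, $depth = 0 ) {
        $out = '';
        if ( $node instanceof DOMElement ) {
            $attrs = array();
            foreach ( $node->attributes as $attr ) {
                $attrs[$attr->name] = $attr->value;
            }
            ksort( $attrs );   // attribute order no longer matters
            $out .= str_repeat( '  ', $depth ) . '<' . $node->nodeName;
            foreach ( $attrs as $name => $value ) {
                $out .= ' ' . $name . '="' . $value . '"';
            }
            $out .= ">\n";
        }
        if ( $node->hasChildNodes() ) {
            foreach ( $node->childNodes as $child ) {
                $out .= normalizeDom( $child, $depth + 1 );
            }
        }
        return $out;   // text nodes are skipped in this sketch
    }

    $dom = new DOMDocument();
    $dom->loadHTML( $editPageHtml );   // $editPageHtml: markup of one version
    echo normalizeDom( $dom->documentElement );
    // produce the same dump for the other version and run them through a plain diff
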
[07:46:08] as long as EditPage.php is ok [07:47:08] Dantman tells me it make the same HTML now as the old one [07:47:25] don't worry, I haven't committed anything since the last scap :) [07:47:59] I've got something big in a few days, though. [07:51:39] mm nice, MediaWiki:Loginlanguagelinks doesn't support html entities [07:51:54] Relatively the same [07:52:00] The DOM isn't really different [07:52:39] There might be a whitespace difference... the
has moved from inside the preview to the outside... and a few attributes may have their locations inside the tag reorganized [07:52:40] The MediaWiki: namespace makes me very sad. [07:52:45] say guys is there a way to edit the navigation block without manualy editing its html on the skin file? [07:52:57] kfir: MediaWiki:Sidebar [07:53:13] !sidebar | kfir [07:53:13] --mwbot-- kfir: To edit the navigation menu on the left, edit [[MediaWiki:Sidebar]] using its special syntax. For more details, see . [07:53:32] thanks :) [07:59:33] hi - is there a technical description of the mediawiki-syntax? [08:00:14] i like to implement the syntax in perl for my app [08:00:30] Well, there's Parser.php... [08:00:32] !syntax | jegade [08:00:32] --mwbot-- jegade: For help with MediaWiki's Wikitext syntax, please see . For an (incomplete) formal specification, see . [08:00:32] 'unportable' [08:00:42] !parsers | jegade [08:00:42] --mwbot-- jegade: For alternative parsers, see . A new formalization and parser for mediawiki syntax is being discussed on wikitext-l, see . For an (incomplete) specification of mediawiki syntax, see . [08:00:54] jegade: easy to get to 90%, impossible to get 100% [08:01:15] 90%? made up number? ^_^ [08:01:33] yes. 78% of all figures are made up. [08:01:35] I'd say, more like 5% (not counting plain text) [08:01:35] Duesentrieb: great list [08:01:48] [[Link]] is impossible to parse, for example, without the full database [08:01:59] Splarka: bullshit. [08:02:07] class="new" or not? [08:02:09] Heh. [08:02:11] image? category? [08:02:13] interwiki? [08:02:15] you can't tell [08:02:20] Splarka: that's not parsing. [08:02:21] Splarka: you can parse that, you just can't output necessarily. [08:02:30] exactly [08:02:31] Splarka: you need configuration, not database access. [08:02:31] Heh. [08:02:35] i need only basic text-formatting, links, images and tables [08:02:40] Bull [08:02:48] you need database to class="new" or not [08:02:53] config won't help much with that [08:02:57] yes, that's for *output* [08:02:59] Depends what the goal of the parser is. [08:03:01] not for *parsing* [08:03:02] Splarka: but that's not PARSING, that's RENDERING. [08:03:03] the two are distinct. [08:03:05] bah [08:03:16] well, the SHOULD be distinct :) [08:03:18] well what about {{foo}} ? [08:03:25] jegade: that should be doable. [08:03:38] if Template:foo exists, your *parsing* is different than if it isn't [08:03:44] because itself might contain code to parse [08:03:48] jegade: it gets complex when you get into templates, and impassible when you get into parser functions [08:04:09] and extensions muck it up nicely [08:04:18] jegade: also note that image links have a special syntax, and for recognizing image links, you need to know the wiki's config. [08:04:32] So do other odd variables like HTMLTidy and server versions... [08:04:43] jegade: but if you don't want to parse stuff from an existing mediawiki, but just want to implement a largely compatible subset, that's not so hard. [08:05:02] jegade: the nasty bit is the edge cases people rely on :) [08:05:12] 03(FIXED) Change configuration for autopromotion on de.wikipedia.org - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14975 +comment (10JSchulz_4587) [08:05:14] tag extensions, parser hook extensions, free text hooks, localizations for namespaces, unpredictable interwikis, partially unpredictable interlang, listed external link protocols... ugh [08:05:28] Splarka: yes. [08:05:57] Splarka: write a parser for existing wikipedia content? UGH! 
write a parser suporting a compatible subset for use with your own prog? not a biggy. [08:06:00] jegade: you might consider hooking into the API with action=parse of whatever wiki you want to do [08:06:01] Duesentrieb: i create the rules :-) - so people must live with my parser [08:06:17] what wiki is this for? it has to be for some wiki [08:06:36] Splarka: core discussion board. or cms. or something [08:06:37] otherwise you must be highly masochistic to want to mess with mediawiki syntax [08:06:49] *Splarka grins [08:06:51] Splarka: again, a compatible subset is no problem [08:07:30] sure, but the subset is vanishingly small [08:07:30] jegade: look at the (incomplete) formal spec if you want, but it's probably better to just go by the usage help/tutorials and cover the most used features. [08:07:34] 95% of text, maybe [08:07:41] 50% of used syntax maybe [08:08:01] it all depends on what you want to do with it [08:08:02] but on en.wp, quite a lot of individual markup is reliant on extensions, config, and db [08:08:22] yes, sure [08:08:38] though... not that much for parsing. for rendering, yes. [08:08:39] so, my made-up 5% might be '' ''' {| |- || [[, while every template, magic word, parser function, and tag extension is the rest fo the 95% [08:08:50] there are not that many templates that produce parts of syntactical constructs [08:08:52] arr [08:08:54] and those are the main problem [08:09:15] dues: I would say the difference in parsing [[Foo]] and [[Image:Foo]] exists, not just rendering [08:09:22] Splarka: for most people, paragraph separation, '', ''' and links is all they need. maybe * and # too. [08:09:23] but depends where you draw the line [08:09:40] Duesentrieb: thanks for your help [08:09:42] Splarka: yes, image links need to be parsed differently, because they can contain nested links. [08:09:48] exactly [08:09:58] Splarka: and to identify them, you need to know namespace names. this is one of the nastier bits. [08:09:59] so you can't predict what localisation the image namespace might have! [08:10:01] whenever you parse, you parse for a *purpose* [08:10:13] yah [08:10:16] Splarka: you don't need to predict it. [08:10:20] You can say "I'm parsing the text out of this text" [08:10:22] well [08:10:25] jegade: have fun. [08:10:27] Duesentrieb: I mean you can't assume it [08:10:29] $parsed_output = array( $text ); [08:10:33] you have to check, or know [08:10:35] Splarka: you don't have to assume it either. [08:10:43] you do for a generic parser [08:10:53] who said something about a generic parser? [08:10:53] which de-facto can't exist until major changes come about [08:11:09] why, I believe jegade did ^_^ [08:11:38] *Splarka wonders what convo Duesentrieb was having [08:11:40] you guys probably scared him off. [08:11:56] all I was doing was saying it was more impossible than you were saying it was [08:12:05] 5% possible vs 90% possible [08:12:07] no, he said somethign about implementing mediawiki syntax for his own use. not for usse with existing wikitext. and not necessarily without special config. [08:12:30] Splarka: you're just throwing up roadblocks, which can mostly be worked around, or aren't show-stoppers. [08:12:34] Splarka: depends on what you base the numbers on. [08:12:45] Werdna: road blocks are usually for a reason [08:12:54] prefer to have people drive into the steamrollers? [08:13:01] which is why I had the second clause of my statement [08:13:03] Splarka: number of pages that don't use any "nasty" features? number of syntactical constructs? whatever. 
[08:13:03] Dues: as said, all made up [08:13:07] they can be worked around, or aren't show stoppers. [08:13:28] hello [08:13:33] Splarka: sure. there's no measure here. only an impression. [08:13:35] whatever. [08:13:37] Werd: sure, but then they are only approx, unless you work from a vanilla install [08:13:48] from a specific revision... [08:14:02] and specific default configuration [08:15:15] Splarka: ... he just wants links, images and tables! [08:15:24] heh [08:15:33] Image: namespace localized? [08:15:38] that can make a big difference [08:15:49] Presumably, he won't be setting up his software with an 'image' namespace. [08:15:58] I would imagine that he'd use the image-url syntax. [08:16:05] anyway, I actually requested action=render in the API (later changed to action=parse, which did the same thing but with more info passed, which again confuses the difference between the two) so Wikia could make a better wikiwyg, based in wikicode and not html [08:16:20] even if it's localized, it's perl. just make it an option. [08:16:23] no problem. [08:16:30] and outlined to Bartek how it could be used [08:16:50] eg, parsing all [[links]] as some neutral color live and periodically batch checking them to render red/blue [08:16:58] 03nad * r40072 10/trunk/extensions/SimpleSecurity/SimpleSecurity.php: comments [08:17:06] and all {{}} as temporary placeholders waiting for next ajax hit to the action=parse [08:17:15] (I think he gave up on it though) [08:20:28] 03nad * r40073 10/trunk/extensions/SimpleSecurity/SimpleSecurity.php: remove debugging [08:20:55] Splarka: how's fckeditor at wikia coming along? [08:21:26] no idea, been about a year and a half since I was interested in that X_X [08:21:35] *Splarka pokles JSharp [08:22:01] eep [08:22:13] I once wished for a syntax hilighter mode that could be used along with it, and a raw code view, I think Datrio still worked there at the time though [08:22:20] indeed, thought it's getting some attention [08:22:36] I'm not on the project so I'm not quite sure [08:22:36] is it wikicode based or still html based? does it use the API? [08:23:35] nau [08:32:15] *Dantman|Food wanted a syntax highlight mode to [08:33:53] Tim tried to make an expandtemplates mode with associative mouseover twixt code <--> output, IIRC.. I wished it had worked:/ [08:34:56] Dantman|Food: WikEd does that, though you have to click abutton to make it update the highlighting. it's also a bit slow. btu it's a start. [08:35:13] Mhmm [08:35:25] I was just wanting something inside the FCKeditor [08:35:52] A nice compromise that could make keeping wysiwyg enabled nice for even advanced users [08:36:15] well, modifying FCKE towards wysiwym would be nice. you know, mixed markup, highlight and formatting. [08:36:27] Also... I wanted a keyword you could add to a page to force wysiwyg mode off on a page... [08:36:29] when you type == foo == it gets formatted as a heading, but you still see the == [08:36:52] I like the idea of a mix [08:37:42] would be nice if fck understood closing tags [08:38:09] ie... when you type the ]] on a [[link... it backtracks and checks to see if it can convert that into a link [08:41:52] one of the things that pisses me off with wysiwyg editors: the often automatically link stuff, but it's really tricky to unlink. and often impossible without using the mouse. [08:42:00] an editor that forces me to use the mouse is useless [08:42:28] i do use the mouse for some things, but i should never have to. 
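
For the "compatible subset" jegade was after (roughly 08:00–08:16 above: basic formatting, links and paragraphs), a converter can stay tiny as long as it ignores every edge case discussed — templates, images, namespace configuration and so on. A toy sketch follows, in PHP rather than the Perl jegade mentioned; the /wiki/ URL prefix is made up, and there is deliberately no red/blue link distinction, since as noted above that needs the database.

    function miniWikitextToHtml( $text ) {
        $text = htmlspecialchars( $text );
        // '''bold''' has to be handled before ''italic''
        $text = preg_replace( "/'''(.+?)'''/", '<b>$1</b>', $text );
        $text = preg_replace( "/''(.+?)''/", '<i>$1</i>', $text );
        // [[Target|label]] and [[Target]] internal links
        $text = preg_replace( '/\[\[([^|\]]+)\|([^\]]+)\]\]/', '<a href="/wiki/$1">$2</a>', $text );
        $text = preg_replace( '/\[\[([^\]]+)\]\]/', '<a href="/wiki/$1">$1</a>', $text );
        // blank lines separate paragraphs
        $paras = preg_split( '/\n\s*\n/', $text );
        return '<p>' . implode( "</p>\n<p>", $paras ) . '</p>';
    }
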
[08:43:12] <_mary_kate_> Duesentrieb: without using fck, i think basic syntax highlighting would be nice [08:43:17] <_mary_kate_> the way vim formats HTML [08:43:36] yes, something like that [08:43:55] foreground colors for different markups... [08:44:06] background colors for different depths (especially on {{ {{ {{ {{ {{) [08:44:11] maybe even a bit more. like formatting == foo == as an actual heading. [08:44:24] and possibly an indent preview mode for {{}} [08:44:34] Splarka: i like the idea of background colors. well, one color, but going darger when you go deeper, or something [08:44:35] <_mary_kate_> well, vim is limited by being a terminal app, but a textarea could do proper headings and stuff [08:44:44] yea [08:44:56] Dues: well, one for [[]], which is only two deep, in Image links.. [08:45:04] one for {{}}, say darker shades of red... and one for html maybe [08:45:08] WikEd goes into that direction, but isn't quite it [08:45:33] and if it did nothing but hilight it'd be sweet [08:45:48] Splarka: {{ and {{{ yes. and image links can be nested indefinitly (though whoever does that should be taken out and shot) [08:46:00] mmm, bang [08:46:46] just need to replicate the preprocessor in javascript [08:47:23] or do we... someone go rewrite the preprocessor in javascript, and then write a php interpreter to use it, then we can use it native, (cue Werdna throwing rocks at me) [08:47:39] *Werdna duly throws rocks at Splarka [08:47:44] ow [08:51:39] 03(mod) Technical updates to bs.wiki - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13451 (10demicx) [08:52:51] Hi [08:52:58] Can anyone tell me why via the API I get only a trunkated version of the comments? and not the hole text in there? [08:53:28] URL? [08:54:58] http://commons.wikimedia.org/w/api.php?action=query&titles=Image:Capillarity.svg&prop=imageinfo|categories&iiprop=timestamp|user|comment|url|size|sha1|mime|metadata|archivename|bitdepth&iiurlheight=200&iiurlwidth=200&format=xml [08:56:51] TimStarling: I do something wrong? [08:58:15] the comment there appears to be a truncated version of the content, which is normal [08:58:28] http://commons.wikimedia.org/wiki/Image:Capillarity.svg <-- truncated there too: ...Author=[[us) [08:58:54] 03(mod) diff messages i18n problem - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15299 (10bugzilla.wikimedia) [08:59:03] under "file history" [08:59:04] (autosummary) [08:59:36] anyone ever pondered giving image uploads a separate comment field? [08:59:53] seperate, even [09:00:32] bit counterintutive... the textarea is the comment and the content for new uploads, but only the comment for reuploads [09:00:54] Splarka: I see now that is trunkated also on the page ... [09:01:04] alecghica: isn't really truncated.. what happens is... [09:01:23] separate, right the first time [09:01:26] the summary field on uploads is used for the content, license info and such, but the first ~250 bytes of that is used for the upload summary [09:01:51] if everyone could spell, wiktionary would be pretty useless [09:01:57] oh wait, it is, except english [09:02:13] almost no other languages have such a thing as a "Spelling Bee" [09:02:23] although the french have a gameshow based on conjugating verbs [09:02:40] well, english is intermediate, as languages go [09:03:04] the spanish have a game called tortuga [09:03:06] some are phonetic, some are historical, some are so historical that it's like learning a different language when you write [09:03:15] between what? a language and random keystrokes? 
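
The depth-shading idea from around 08:44–08:46 — one background colour per level of {{ ... }} nesting — only needs a small scanner to mark the levels; a stylesheet can then make .tpl-depth-2 darker than .tpl-depth-1, and so on. The function and class names here are illustrative only, and it deliberately ignores {{{, [[ and unbalanced markup.

    function shadeBraceDepth( $text ) {
        $out   = '';
        $depth = 0;
        $len   = strlen( $text );
        for ( $i = 0; $i < $len; $i++ ) {
            if ( substr( $text, $i, 2 ) === '{{' ) {
                $depth++;
                $out .= '<span class="tpl-depth-' . $depth . '">{{';
                $i++;
            } elseif ( substr( $text, $i, 2 ) === '}}' && $depth > 0 ) {
                $out .= '}}</span>';
                $depth--;
                $i++;
            } else {
                $out .= htmlspecialchars( $text[$i] );
            }
        }
        return $out;
    }
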
[09:03:21] ahh [09:03:38] chinese people often have to learn a different language when they write, for instance [09:04:04] "Most languages borrow from other languages. English follows them down dark alleys, knocks them over, and goes through their pockets for loose grammar. [09:04:09] and in arabic there are several regional forms [09:04:35] TimStarling: huh? I thought the grammar is pretty much the same for Chinese [09:04:43] from what I've seen of it, french spelling is easily as hard as english [09:04:50] spoken language is quite natural for our species, no remote isolated tribe has ever been found without a spoken language, writing is counter-intutive [09:05:25] Nikerabbit: spoken chinese is divided into about a dozen mutually unintelligible dialects [09:05:58] Splarka: I understand now ... to rephrase my question ... is there a way via the API to get the full description from summary? :) [09:06:04] sure [09:06:08] but does written chinese have different grammar from spoken? [09:06:19] not if you happen to speak mandarin [09:06:29] http://www.shlomifish.org/Files/files/code/mediawiki/mw-fix-inc-Search-t.patch - please apply this patch - it fixes t/inc/Search.t . [09:06:32] but if you speak any of the other dialects, there's going to be differences [09:07:19] prop=revisions&rvprop=contents probably easiest [09:07:20] Splarka: you saw my API URL earlier? what i do wrong? [09:07:25] http://commons.wikimedia.org/w/api.php?action=query&titles=Image:Capillarity.svg&prop=revisions&rvprop=content [09:08:23] I am not sure if you can combine it with your previous query [09:09:25] heh, I managed to answer an email about wikimedia in french the other day. [09:09:31] somebody emailed me about toolserver in french. [09:09:47] Splarka: thx, thats what I needed :) [09:10:42] # 20:15 brion: set wgNewUserSuppressRC to true, was false unsure why it's annoying <-- brion has an ethical objection to commas. [09:11:04] Nikerabbit: this is one of the reasons why we have zh-min-nan.wikipedia.org [09:11:44] because some people in Taiwan think it's unfair that they have to learn a new grammar in order to write [09:22:00] Hi all. [09:22:01] http://www.shlomifish.org/Files/files/code/mediawiki/mw-fix-inc-Search-t.patch - please apply this patch - it fixes t/inc/Search.t . [09:24:06] rindolf: credited to who? [09:24:16] TimStarling: Shlomi Fish. [09:25:44] 03tstarling * r40074 10/trunk/phase3/t/Search.inc: Patch by Shlomi Fish to fix this test [09:26:15] TimStarling: thanks. [09:27:25] thanks for fixing it [09:27:40] but I'm sure you know, it's a drop in the ocean as far as testing goes [09:30:24] Hj, im newbie, can u help me install Mediawiki? [09:30:45] !installation | tuantrinh85 [09:30:45] --mwbot-- tuantrinh85: Installing MediaWiki takes between 10 and 30 minutes, and involves uploading/copying files and running the installer script to configure the software. Full instructions can be found in the INSTALL file supplied in the distribution archive. An installation manual can also be found at . See also: !download [09:31:01] ok [09:31:09] but i install again [09:31:20] and find bugg, [09:31:48] pls Can u see http://tracuu.tainguyen.vn/wiki [09:33:06] tuantrinh85: without a more detailed description of the problem you are running into, and how you got there, there is no way to help you. 
[09:33:23] *tuantrinh85 slaps siebrand around a bit with a large trout [09:36:02] I have got bug in wiki [09:36:10] im install again but can not find file index.php in dir config [09:36:17] but i see index.php and type parameter [09:36:24] pls help me [09:37:45] U can click button "Install Mediawiki" and see bug 406 [09:37:48] tuantrinh85: which webserver are you using? [09:38:00] i use linux [09:38:05] U can click button "Install Mediawiki" and see bug 406 [09:38:10] tuantrinh85: have you properly configured PHP for it? [09:38:43] im havent got server, i use hosting in server services [09:38:55] tuantrinh85: HTTP Error 406 - Not acceptable ? [09:39:33] I can install complete firts, but second, im istall again not complete [09:39:40] huhu i dont know [09:39:58] tuantrinh85: you are referring to "bug 406". What "bug 406"? [09:40:13] huhu i dont understand Error 406? [09:40:52] because on one server, i isntall complete firts [09:41:03] because on one host, i isntall complete firts [09:41:38] but install again, error 406? [09:42:23] i chmod 777 dir "config", not complete, [09:42:33] An appropriate representation of the requested resource /wiki/config/index.php could not be found on this server. [09:42:33] Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request [09:42:49] tuantrinh85: I am at work and I am having understanding the exact problem you are having. I hope someone else is able to help you. [09:43:09] huhu [09:43:35] sounds like an apache/php configuration error. but no clue [09:43:36] pls, Can u know some poeple know error? [09:43:38] (and lunch time) [09:43:51] oh [09:44:31] but im not system manager, i dont edit config [09:44:49] What doing now Duesentrieb? [09:45:53] What doing now Duesentrieb? [09:45:55] HEY [09:46:04] HELP ME, DUESEN [09:46:27] tuantrinh85: please don't beg like that. [09:46:42] It is unlikely to endear you to developers, who, after all, aren't being paid to help you. [09:47:11] As he said, he went to lunch. It would be unreasonable to expect a volunteer to delay their lunch to help you with your problems, especially if you shout at them for "ignoring" you [09:47:31] oh noes [09:49:18] yeah [09:53:51] How are links with class="external free" generated? [09:55:59] I am trying to output some HTML of the form and it's getting turned into http://X"> [09:56:03] And that is fucked up. [10:00:10] davidmccabe: what revision are you on? [10:02:54] 03mkroetzsch * r40075 10/trunk/extensions/SemanticMediaWiki/includes/ (SMW_GlobalFunctions.php SMW_Settings.php): move array of available formats from GlobalFunctions to Settings where they belong [10:04:06] Gah... [10:05:05] TimStarling: It's nice and fancy now... it works with xml quite nicely... cept.... aaagh, when I throw it output from MW it gives me xml errors that make no sense [10:05:26] davidmccabe: http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=39980 seems like some parser changes have been wheeled back and forth, so you just need to be in the right revision to not have it happen (?) [10:08:14] Thank you Splarka. [10:08:44] I am on 37740 fwiw. [10:08:53] hey davidmccabe [10:09:13] yo werdna. [10:09:21] david: might not be related then, these seemed to be in the last week [10:09:45] ooh, we're up over 40k. [10:09:55] davidmccabe: I rearranged liquidthreads to be more like a mediawiki extension, btw [10:10:06] Werdna: more like... a mw extension? 
[10:10:07] used the autoloader, refactored classes, etc [10:10:09] Dantman|Food: mmm can you give me some food too [10:10:18] heh... [10:10:20] werdna, sounds good. thansk for mentioning it. [10:10:22] ohwait... ack [10:10:26] damnit... the taco [10:10:38] Werdna: BTW I'm going to be working on LQT again starting this very week. [10:10:40] most of our extension's code is so horriibel... [10:11:02] we should have some quality standards and give badges to good extensions :) [10:11:44] Splarka: SMOOOOOCH [10:11:58] buy me dinner first [10:12:44] TimStarling: http://www.shlomifish.org/Files/files/code/mediawiki/mw-fix-Global-t.patch [10:12:51] TimStarling: another fix for the test suite. [10:13:51] Splarka: Stop by Portland and I'll take you out. [10:14:08] oh, I guess I just missed the joke. [10:14:10] but I will :) [10:14:15] haha [10:14:30] Oregon or back east, or other? [10:14:53] Oregon. [10:15:29] Splarka: don't worry, we don't have to kiss on the first date. [10:16:52] aww [10:19:22] davidmccabe: that's great! [10:19:34] Werdna: Are you using LQT for any sites? [10:19:52] davidmccabe: not currently, but I'm going to install it on a site I set up next week. [10:20:02] Neat. [10:20:38] haven't had a chance to have a good play with it, but it's something I'm very interested in seeing go live. [10:21:49] Hopefully we'll finally be taking it from beta to rock solid over the next not-too-long. [10:21:54] I wish I had time to work on it more steadily. [10:22:04] Oh, and collaborators :) [10:22:29] didn't wikia try to hire you fulltime for a while for that? [10:22:39] No. [10:22:44] They hired me part-time, for unrelated stuff. [10:22:52] (at least I think it was parttime; I forget) [10:22:55] anyways it wasn't for LQT. [10:23:19] bah [10:27:46] bah indeed [10:27:56] Yeah, LQT would have been much better :) [10:31:13] Hey, somebody want to sell me a high-res LCD for really cheap? [10:31:19] Like, because you have too many? [10:35:10] *Werdna jealously protects his dual 22s [10:35:22] *davidmccabe is on a 15" laptop and it hoits. [10:38:18] Hmmm. Dell is selling a 1920x1080 for $350. I wonder if it sucks. [10:39:21] hoits? [10:39:30] hurts, said in a cute way. [10:39:35] :o [10:39:36] why [10:39:56] not enough pixels man. [10:40:01] I'm perfectly comfort with my 1400x1050 15" laptop display [10:40:16] and this fucking MW source doesn't break at 80. [10:40:22] and it's unlikely my next laptop will have even that many pixels, probably less [10:40:35] mmm, sorry, what's the rating on this channel again? [10:40:56] TimStarling: Should I commit my mostly useless domdiff code into svn? [10:43:14] davidmccabe: http://svn.wikimedia.org/viewvc/mediawiki?view=rev&revision=34271 same as source prolly [10:44:06] umm, is this a commit whose sole purpose is to add curses? [10:44:21] ... [10:44:23] AWESOME. [10:44:39] see the two revisions it reverts for more [10:44:47] ...more backstory, that is [10:45:07] ohh, gotcha. [10:45:24] forincate you php, fornicate you aurally [10:45:29] *davidmccabe should read commit messages. [10:45:53] PHP, [10:46:03] hmm? [10:46:27] oh, I was just going to dream up a wild curse for PHP. [10:46:29] but I'm too tired. [10:46:52] Thanks for the tip, Splarka, patch, werdna. [10:47:04] I'm getting {{ t/inc/Parser....ALERT - script tried to increase memory_limit to 4294967295 bytes which is above the allowed value }} [10:47:16] What can I do about it? [10:56:07] *Dantman likes Wikia's page completion feature... [10:57:11] ^_^ Actually... 
I'd like to see that kind of feature in FCK [10:58:03] I'd like to see FCK not screw the existing page up. [10:58:16] well, that to... [10:58:30] But I'm thinking of good features for if it ever starts to work right [10:59:30] But thinkabout it... a sort of tabcomplete feature for links when you're editing [11:00:07] You start typing [[Sand and inside of a floating list is "Sandbox", you hit the down arrow, enter, and it completes the link [11:01:19] It's been invaluable on the Narutopedia... Especially with my forum topics I don't always remember the name of the page, I just start typing and it gives me an autocompletion for the title [11:08:41] 03(NEW) Enable FlaggedRevs on Ukrainian Wiktionary - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=3D15335 15enhancement; normal; Wikimedia: Site requests; (Ahonc.ua) [11:28:49] Dantman: I would like that feature. [11:28:53] for templates, too. [11:30:03] t/inc/Parser.t fails. [11:30:59] 03(NEW) Allow html tags inside page names in Special: SpecialPages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15336 15enhancement; normal; MediaWiki: Internationalization; (bugzilla.wikimedia) [11:45:20] Werdna: You could probably easily tweak it to accept {{ as a starting as well as [[... then you just need to remember Template as the default namespace... [11:45:36] And as a bonus, you could grab a magic word list, and even a parser function list. [11:46:32] mmm, delicious. [11:47:59] You know... that would be interesting to plugin to SMW as well [11:48:19] Start typing a property name... and for the enumerated types, it could give you the list of possible values [11:50:20] [insert "he said semantic" image here] [11:52:03] Splarka: "he said semantic" image? [11:52:40] http://img393.imageshack.us/img393/576/hesaidsemanticga2.jpg [11:54:14] Actually, I wish I could see a processed xml tree [11:54:28] Preprocess everything, then expand after... [11:54:53] Unfortunately, we can't really do that since it could never be compatible with current syntax [11:55:06] hi Dantman [11:55:57] The idea behind the xml tree, would be wysiwy(g|m) editors could get that from the API, use some minor transformations to present it, modify it, and then feed it back to the API and get the WikiText back [11:56:23] What should I do about the failing tests in t/inc/Revision.t ? [11:56:40] Dantman: preprocessing already does XML tree [11:56:47] Not completely [11:56:52] And it's not possible to go full [11:57:02] <_mary_kate_> i always thought it would be nice to store text as XML [11:57:08] <_mary_kate_> convert it to wikitext for editing and back on save [11:57:27] Because unfortunately {{Start}}FooBar{{End}} where Template:Start contains [[ and Template:End contains ]] produces a valid WikiLink [11:57:38] _mary_kate_: I found your photo on enwiki :) [11:57:39] That would be impossible if we converted completely to XML [11:58:10] <_mary_kate_> Werdna: onoes [11:58:24] Unless we parsed multiple times and found a way to deal with the fact that we've got parsed xml, mixed with WikiText, mixed with HTML [11:58:47] And even then, we'd have to parse over and over and over, and there's a technical limit to that [11:59:34] ^_^ We could always implement an alternate parser that doesn't follow the same rules as the normal one.... perhaps trial it arround and see how people take to it... 
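
The "[[Sand → Sandbox" completion described above (10:56–11:01) can get its candidate titles from the API every wiki already exposes: list=allpages with a prefix filter. A sketch of the lookup follows; the wiki URL is a placeholder, and a real editor would fire this from JavaScript as the user types rather than via file_get_contents().

    $prefix = 'Sand';   // whatever the user has typed after [[
    $url = 'http://example.org/w/api.php?action=query&list=allpages'
        . '&apprefix=' . urlencode( $prefix )
        . '&aplimit=10&format=php';
    $data = unserialize( file_get_contents( $url ) );
    foreach ( $data['query']['allpages'] as $page ) {
        echo $page['title'] . "\n";   // candidates for the completion dropdown
    }
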
[11:59:55] :o [12:00:04] Heh [12:00:19] hmm [12:00:20] Well think of it as premptive code for phase4 [12:00:41] Werdna: I thought it was on Board Elections somewhere [12:00:54] would it be possible to implement {{uilangoce}} if it left a placeholder that is ignored in links and templates? [12:01:04] Work on a setup now that is completely incompatible, and release it as the primary when we get to a point where we actually forcively break everything [12:01:13] ^_^ MW 2.0.0 [12:03:24] VasilievVV: Wikipedia:Facebook [12:04:42] That XML idea was one of the mw-incompatible ideas I wanted to try in writing my own wiki engine [12:05:04] Heh... Parser_XWT [12:20:05] 03raymond * r40076 10/trunk/phase3/ (2 files in 2 dirs): Remove unused messages, function deleted with r39717. [12:25:04] 03raymond * r40077 10/trunk/extensions/EditUser/ (EditUser.php EditUser_body.php): Remove option, function deleted in core with r39717. [12:26:05] 03(mod) arz.wikipedia has been approved - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15013 (10robin_1273) [12:26:54] Nikerabbit: http://www.youtube.com/watch?v=r99MiiLcLM0 ;-D [12:26:58] Nikerabbit: thats how sauna happens in lithuania [12:27:27] domas: hmm I can't watch youtube videos... [12:27:36] domas: sfw? :P [12:27:53] whoooa [12:28:08] hahahaha [12:28:11] *domas is crying [12:28:16] domas: men only? [12:28:17] jeez.. [12:32:50] domas: bah [12:33:17] =) [12:33:28] I'm crying [12:44:17] hey there, is there a manual with the tags to add boxes in the mainpage, for example to display the last articles, etc... ? [12:44:55] or is templates that i have to create first and then call them in the main page? [12:45:11] i dont really know how the wiki language works [12:50:19] DPL Could be used to display the articles [12:50:41] Other than that most stuf is just normal markup... mostly html with a bit of wikitext mixed in [12:55:33] hello [12:58:44] yeah, is there a wikitext dictionnary? [12:58:55] hello Nikerabbit wiki guru [12:58:58] ^^ [12:59:07] huh? [12:59:11] pierre_: there's a manual somewhere [12:59:11] ^^ [12:59:14] !wikitext [12:59:14] --mwbot-- For help with MediaWiki's Wikitext syntax, please see . For an (incomplete) formal specification, see . [12:59:22] thx [12:59:43] rawr [13:04:26] 03vrandezo * r40078 10/trunk/extensions/SemanticMediaWiki/ (4 files in 3 dirs): Rewrite of the Browse User Interface [13:08:37] do you know if dynamic article list extension is working under 1.13? [13:12:55] 03nad * r40079 10/trunk/extensions/SimpleSecurity/SimpleSecurity.php: bail from validateRow if $wgSimpleSecurity doesn't exist yet [13:14:21] http://www.mediawiki.org/wiki/Extension:Barack_Obama_Indexing_Service <- O_o [13:34:43] can the Api return results like Special:Recentchangeslinked? recentchanges cannot be used as generator [13:35:57] 03(NEW) Strange font size CSS in monobook - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15337 15enhancement; normal; MediaWiki: User interface; (random832) [13:36:52] 03btongminh * r40080 10/trunk/phase3/includes/ (3 files in 2 dirs): [13:36:52] * Fixup validation methods in UserEmailForm a bit so that they don't return arrays when it's not necessary. [13:36:52] * Add email errors to the API's message map [13:38:33] TimStarling: Do you ever have dreams where MediaWiki's WikiText syntax is a strict syntax and so the parser can really preprocess... [13:39:08] Bryan: + return $wgEnableEmail && $wgEnableEmail; [13:39:24] ... 
[13:39:27] fuc [13:39:29] shouldn't that be return $wgEnableEmail && $wgEnableUserEmail; [13:39:29] k] [13:39:57] you mean really parse? [13:40:00] Heh... [13:40:10] it does preprocess already [13:40:14] some might say, too many times [13:40:23] 03btongminh * r40081 10/trunk/phase3/includes/specials/SpecialEmailuser.php: Check on $wgEnableEmail && $wgEnableUserEmail [13:40:24] Well... I mean the whole thing... [13:40:32] Right now all we really do are the curly braces [13:40:39] We can't really do much else [13:40:57] nowiki? [13:41:13] it's a preprocessor in the sense that the C preprocessor is one [13:41:24] mhmm... nowiki is an issue I'm thinking of as well... [13:41:27] you input wikitext plus some extra bits, and it outputs wikitext [13:42:00] However, I think includeonly is going to be even more of an issue than nowiki [13:43:40] For nowiki, you can just find all tags beforehand, and turn them into nodes... you don't have to worry about anything nested inside of them... [13:44:09] what do you mean it's going to be an issue? when? [13:44:11] However, noinclude and includeonly tags are evil..... the operate outside of the entire dom side of the parser [13:44:19] ^_^ Oh... just my chatter [13:44:38] You know me... I'm the one who attempts interesting projects no matter how insane they are [13:45:51] Heh... this time it's experimenting with a parser for WikiText like syntax (read: like... there's going to be strong incompatibility) that parses everything from WT into XML which can then can be converted to whatever format you want... even back to the same WT as before [13:46:48] a tree doesn't describe parsed wikitext particularly well [13:47:00] I think you'd need a more generic structure [13:47:16] at the preprocessor level, we have a tree [13:47:36] but then you have a second grammar at the main pass level, which overlaps the first [13:48:17] you can track text from start to end, just like you can track text through two stages of C language parsing [13:48:50] I certainly don't think you can expand the preprocessor to handle all main-pass syntax and still call it a preprocessor [13:49:00] Heh... [13:49:10] the preprocessor is parse+expand [13:49:14] I know... derivitive syntax mostly... [13:49:16] not just parse [13:49:51] well, handing on the pp level seems likea good idea. [13:50:04] It's a idealist idea, where you could take something like WikiText, feed it through the parser and get a XML dom, modify that with say... WYSIWYG ;) and then feed it back to the parser to turn back into WT [13:50:38] I never said it was going to be compatible or completely like WikiText [13:50:54] Sure 90% of the syntax looks the same... [13:51:05] a parse tree, yes. an xml dom... why? [13:51:05] But it's far stricter [13:51:14] to get back to your original question [13:51:35] You got something else that can be handled by a wide variety of languages? [13:51:39] I dream about parsers that are as loose as human language and just seem to know what you mean when you type it [13:51:47] not about XML [13:52:14] What should I do about the failing tests in t/inc/Revision.t ? [13:52:33] Dantman: XML is a good exchange format, but not really the best internal representation. [13:54:19] You got another idea? Something even JavaScript can manipulate? 
[13:54:28] rindolf: all the tests pass for me [13:54:32] If you've got something interesting I'd hear it [13:54:50] Right now, the idea of even XSLT is rather interesting [13:54:54] 03(NEW) Allow blocked users to edit their own talk page on Russian Wikipedia - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15338 normal; normal; Wikimedia: General/Unknown; () [13:55:08] JSON? :) [13:55:32] 03(mod) Allow blocked users to edit their own talk page on Russian Wikipedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15338 +shell (10alex.emsenhuber) [13:55:33] That's still rather tree like [13:55:52] Hmm... but it does fit in [13:56:00] We have Hash/DOM so it's possible [13:56:58] Dantman: have you looked at the way FCKEditor does it? [13:57:03] Dantman: s-expressions?... [13:57:11] rotfl... [13:57:25] they're not far off, they don't get any of the minor details right, but it does look pretty [13:57:50] ^_^ Love that idea... cept a lisp processor isn't as simple as you think [13:57:55] Dantman: but usually, the *internal* representation is an object structure tailored specifically for the application. once you have that, serializing it to xml, json, s-ex or whatever isn't hard. [13:58:02] brion: Is there any cross-db compatible method for changing the type of a field? [13:58:09] XML DOM is just anextremlöy bloated way to keep a parse tree in memory. [13:58:11] e.g. from char to varchar [13:58:18] Skizzerz: there's no brion here [13:58:23] lol [13:58:29] *Skizzerz just tends to append that to his questions [13:59:07] Dantman: evaluating lisp isn't so simple, but parsing & manipulating s-expressions is pretty trivial. the hardest bit is really string literals, with all the escaping cruft. [13:59:27] ^_^ ' ` , @ [13:59:41] the only method I can think of that works cross-db is creating a new temp field with the correct type, migrating everything to that field, deleting and recreating the old (empty) field with the right type, then migrating everything back [14:00:06] in mysql it's a simple ALTER TABLE query, but the method used there only works for mysql and oracle (i think) [14:00:07] Oh right || was somewhere in there to [14:00:35] definately doesn't work in postgre though [14:00:54] Last time I checked FCK they were doing something like taking HTML and using id's or classes I think [14:01:08] 03(mod) Preferences Form Customization - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15235 (10jlerner) [14:01:28] 03(mod) A hook to enable putting options to the preferences tab - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14806 (10jlerner) [14:01:49] hello [14:02:03] ialex: hmmm.... [14:02:05] is there some way to do bi-directional links in mediawiki? [14:02:32] i.e. if it goes away on one side, it goes away on the other? [14:03:04] ialex: http://sial.org/pbot/32008 [14:03:26] I have a shared media pool situation set up and want to know if I am supposed to set the vars like $wgUseImageMagick = true; in the localsettings.php of the wiki that uses the pool or the wiki which is the pool [14:03:30] !smw | davidfetter [14:03:30] --mwbot-- davidfetter: SemanticMediaWiki is an extension that lets you conenct wiki pages with semantic relations. See and . Mailing lists are available at . [14:04:06] Duesentrieb, thanks :) [14:04:13] Skizzerz: it's quite simple, there are two steps: [14:04:30] 1. write a column modification routine separately for each kind of DBMS [14:04:41] 2. 
call the appropriate routine for the current DBMS [14:05:04] TimStarling: i just found that i can't positively answer Gary_Kirk's question... could you? How *does* thumbnailing work with a shared repo? [14:05:06] 0: don'te even attempt to support multiple dbms back-ends. it's too expensive no matter how you do it [14:05:30] TimStarling: k. [14:05:40] s/te/t [14:05:48] Duesentrieb: magic, requires kitten sacrifice [14:05:51] Duesentrieb doesn't it only with with 404 handlers? [14:06:22] the 404 handlers are actually required for a single wiki scenario [14:06:25] hehe :P [14:06:38] hello [14:06:41] the fact that nobody has noticed that the whole setup is full of bugs without them is a wonder [14:07:23] bwahaha! is the documentation on setting up 404 handlers? [14:07:41] i only have the vaguest of ideas [14:08:02] So anyway, TimStarling, do I set the path etc to ImageMagick in the pool wiki's localsettings or in the ones which use the pool? [14:08:14] both [14:08:50] and you have to make sure that you have the path to the foreign repo configured, so that the foreign MediaWiki knows how to find the source images and where to put the thumbnails [14:09:00] My host just told me "ImageMagick is /usr/bin/", so that is what I set the path as? No filename or anything? [14:09:11] /usr/bin/convert [14:09:19] At the moment the pool works fine, but SVG doesn't [14:10:11] I wanna use my wiki to document an API, do you have any recommendation for that? In particular, to display function nicely with colors and all, and I want function name and types to be clickable. so void myfunc(int a, int b); would provide a link to void, myfunc and int. [14:10:15] Right I am looking here: http://www.mediawiki.org/wiki/Manual:Image_Administration#SVG [14:10:17] I bet it doesn't work fine [14:10:38] I bet it's all broken, but mediawiki's clever fallbacks are fooling you into thinking it's working for ordinary images [14:10:50] Probably xD [14:11:25] well the wikis that use the pool images display images from the pool locally fine.. it would appear, anyway [14:11:36] <^demon> Whoo, spontaneous firefox freezing... [14:12:28] Gary_Kirk: is the wiki public? [14:12:45] yes [14:12:54] image URL? [14:13:15] of an svg or any? on the pool wiki or the wiki which uses it? [14:13:36] all of the above [14:13:47] http://external.xinki.org.uk/blrag/wiki/index.php?title=Main_Page [14:14:07] that is a wiki which uses the pool, and that has images hosted on the pool [14:14:23] *TimStarling does not believe in "information overload" [14:15:17] ^demon: imho your
arount watchlist options takes too much screen space [14:15:33] <^demon> AlexSm: I liked it :( Open to suggestions though. [14:15:50] charming 404 handler you've got there [14:16:12] TimStarling: me? [14:16:23] ^demon: http://en.wikipedia.org/wiki/Wikipedia:Village_pump_(technical)#.22go.22_button_in_watchlist (second half of the section) [14:16:33] the suggestion is to remove it :) [14:16:39] Gary_Kirk: yeah, redirecting from /images/ to the wiki if the image doesn't exist [14:17:04] <^demon> AlexSm: I did it to make it consistent with RC et. al. But reading the thread now, give me a moment... [14:17:14] Okay, I don't know what that means :) [14:17:21] TimStarling: is that script still running? [14:18:03] <^demon> AlexSm: I see. Want me to see if I can shrink it and make it less-crap? If not, I can certainly remove it. [14:18:24] In the config of that wiki that uses the pool I have this by default: [14:18:28] wgEnableUploads = true; [14:18:30] $wgUseImageMagick = true; [14:18:32] $wgImageMagickConvertCommand = "/usr/bin/convert"; [14:18:50] what am I supposed to add? $wgSVGConverter and $wgSVGConverterPath? [14:19:04] Gary_Kirk: it works fine for me [14:19:07] AaronSchulz: yes [14:19:18] ...doing page_id from 3544500 to 3544999 [14:19:37] those are all PNGs. this isn't: http://xinki.org.uk/wiki/Image:Nuvola_apps_korganizer.svg [14:19:47] I tried with tools.svg [14:20:13] ah [14:20:39] <^demon> Bah, this is the second time in a row Ctrl+F has caused Firefox to explode....this is getting old.... [14:20:44] yes that seems to work. So am I supposed to add $wgSVGConverter and $wgSVGConverterPath in the localsettings of both wikis? [14:21:05] or does $wgImageMagickConvertCommand = "/usr/bin/convert"; suffice? [14:22:20] the korganizer thumbnails look like crap because you're using imagemagick for svg rendering [14:22:32] you need to install rsvg instead [14:22:41] I am on a shared host :( [14:22:52] well, upload PNGs then [14:22:57] hi all [14:22:57] render them at home [14:23:11] is faq on mediawiki a plugin? [14:23:12] Ok, so basically some SVgs will look nice and others won't? Fine [14:23:23] yes [14:23:36] TimStarling: any ideas why do ImageMagick supports SVG rendering if its rendering is worse than crap [14:23:37] alright, thanks [14:23:45] or a function [14:23:46] ??? [14:23:52] VasilievVV: who knows [14:24:08] compare: http://www.xinki.org.uk/wiki/images/1/1f/Tools.svg http://www.xinki.org.uk/wiki/images/thumb/1/1f/Tools.svg/128px-Tools.svg.png [14:24:12] TimStarling: by the way, do Wikimedia use imgserv and are there any plans on its usage? [14:24:21] what's imgserv? [14:24:59] ^demon: the problem is,
has huge padding and margin from site CSS; also, the legend "Watchlist options" imho states the obvious [14:26:01] TimStarling: image thumbnailing daemon, see http://svn.wikimedia.org/viewvc/mediawiki/trunk/imgserv/ [14:26:28] I saw brion had put a bug about such daemon long time ago [14:26:42] <^demon> AlexSm: Fair enough, one moment. [14:26:48] no we don't use it [14:27:30] TimStarling: any plans on using such server? [14:27:48] maybe [14:28:05] well, I mean no, but it's a good idea [14:29:24] Should it support only image thumbnailing or it should be also capable to store files like WebStore extension to replace NFS? [14:29:50] I just remembered: it was Angela who recommended this shared host to me. Apart from them being hopeless in helping with MediaWiki stuff, they've been pretty good :D [14:30:11] even WebStore separates those roles [14:31:33] ok, why does the edit screen have this stupid extra space? [14:32:27] 03demon * r40082 10/trunk/phase3/ (3 files in 3 dirs): Remove fieldset from Watchlist that apparently only I liked :) [14:32:33] VasilievVV: all it needs is a fastcgi proxy that proxies 404s to the scaling cluster [14:32:37] like http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/upload-scripts/ [14:32:54] it doesn't matter how it works exactly [14:32:56] <^demon> AlexSm: Done in r40082 [14:33:13] How come if I put a bullet mark (*) in MediaWiki:Returnto, it just shows up as an asterisk when logging in/out? [14:33:20] also thumb.php needs to work somehow [14:33:28] 03siebrand * r40083 10/trunk/phase3/languages/messages/ (154 files): Remove messages and rebuild all messages files per r40076. [14:33:38] just the interface, not the script necessarily [14:34:24] TimStarling: brion mentioned that one of the points of havingsuch daemon is to avoid invoking ImageMagick via shell and don't suffer of escapeshellarg() bugs [14:34:59] ^demon: thank you; now I see that RC indeed had a for a long time; someone tried to remove it by Brion reverted [14:35:09] btw, RC filedset legend "Recent changes" (not even "...options") is even more useless [14:35:15] we don't suffer from escapeshellarg bugs [14:36:06]
[14:36:10] wtf is this? [14:36:40] Is there any way I can add some text to the footer on every page? [14:37:15] AaronSchulz: iirc someone rewrote EditPage and introduced that [14:37:22] AaronSchulz: talk to Dantman [14:37:25] and be nice [14:37:31] It was even mentioned in the commit summary [14:37:36] HTML in $wgCopyrightIcon appears on the left [14:37:45] TimStarling: https://bugzilla.wikimedia.org/show_bug.cgi?id=4854 [14:38:02] I didn't introduce it, I modified it's location [14:38:12] It used to be inside the preview, now it's right after it [14:38:31] That merely changed because I moved it outside of getPreviewText [14:38:52] well can someone please fix it? [14:38:57] I've been trying to organize the ui, backend, and various other categories of code that are scattered arround EditPage [14:39:00] In what way? [14:39:31] so that it doesn't make a newlines worth of space for no reason [14:39:38] It's not for no reason [14:39:44] It's commented right in the file [14:40:01] It's there to provide a blank space between the preview area and the edit toolbar [14:40:09] Though I have been thinking of replacing it with a margin [14:40:20] Probably need an extra class though [14:40:45] it should add it when needed [14:40:58] I'm not even previewing and I see all this space there for no reason [14:40:59] Oooh... [14:41:06] Ok... I just realized that [14:41:09] One sec [14:41:19] ok, tx :) [14:42:47] I'll give displayPreviewArea a $isOnTop input [14:43:00] That way it's also possible to easily migrate to using a margin in the future [14:43:18] awww , I like the watchlist fieldset :( [14:43:23] hi all, I enabled $wgMainCacheType = CACHE_MEMCACHED; on my server and performance seems to have decreased [14:43:33] how should I troubleshoot this? [14:44:14] loads are taking ~10secs with no load generated during this time [14:44:34] I've also set $wgMemCachedServers = array( "127.0.0.1:11000" ); [14:44:48] ^demon|class: huh? Fieldset removed? [14:45:05] oops, disregard me, port wrong >_> [14:45:09] thanks again! [14:47:43] T_T You do realize something [14:47:56] 03dantman * r40084 10/trunk/phase3/includes/EditPage.php: [14:47:56] Fix a minor regression pointed out by aaron. [14:47:56] The spacer for between the preview area and the edit toolbar was being outputted even when not previewing. [14:47:58] ^demon|class: then I think you should remove it from RC too. There the fieldset has the same padding, and also states the obvious... And for the record: I liked it. [14:48:26] If we output the preview area, but no spacer when not previewing... then when someone gets to a good ajax preview... the preview area won't be spaced when people are using ajax [14:49:34] ^demon|class: I like the fieldset too and now it [14:49:46] it is inconsistent with RC and other forms :( [14:51:13] Dantman: perhaps it could be display:none in those cases, and ajax can flip it? That or ajax add it to begin with as needed? [14:51:33] Hmmm... that is a interesting prosect [14:52:12] The margin would be ignored when the element is display: none; right? [14:54:20] Raymond_: what's the point in RC fieldset with the legend that again says "Recent changes"? [14:55:05] fieldsets are the best [14:55:14] they don't need to have a point [14:55:46] AlexSm: the legend text can be changed, i.e. to "Options" [14:56:39] fine, let's have RecentChanges that I have to scroll down to see any recent changes ... [14:57:06] *VasilievVV|NA likes fieldsets too [14:57:34] ^_^ Fun.. todays export... useless tools... [14:57:51] Heh... 
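For anyone hitting the same symptom as above (page loads of ~10 seconds after enabling memcached): it turned out to be nothing more than a wrong port. A minimal LocalSettings.php sketch, assuming memcached itself is listening locally on its default port 11211:

    $wgMainCacheType    = CACHE_MEMCACHED;
    $wgMemCachedServers = array( '127.0.0.1:11211' );   // default memcached port, not 11000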
[14:58:35] AlexSm: That can be fixed using CSS, you are suggesting that the majority of users should be disadvantaged based upon a minority who use unusually small screen resolutions. [14:58:55] http://dev.wiki-tools.com/extensions2/TestingGrounds/DomDiff.php I created a briliant DomDiff tool to help track down dom differences between two versions of an EditPage... :/ But the whole XML parsing thing doesn't like EditPage's output so it's pointless... [14:59:23] AlexSm: In this day and age it is not unreasonable to expect users to have a display of at least 1024x768 - and for mobile users the problem you point out will still be an issue even without the fieldset. [14:59:44] MinuteElectron: on ruwiki I saw many users complaining that Wikipedia or AWB doesn't work well under subnotebooks like ASUS Eee PC [15:00:11] They have resolutions like 320x200 [15:00:44] but not the EeePC. I have had one at Wikimania Alexandria. Nice working in Wikipedia [15:00:46] right, at that resolution the cited problem (having to scroll down to see RC) will be an issue even without the fieldset [15:01:13] MinuteElectron: ok, to be honest, I do see some recent changes without scrolling, but this options still take much more space than before [15:01:29] AlexSm: why should we care? [15:01:36] I know I can tweak that with CSS, I just think more compact would be better for most users [15:01:53] Why would it be better? [15:03:03] VasilievVV|NA: because useless extra space on the screen is, well, useless? [15:04:56] AlexSm: grouping control elements into fieldset is a good form [15:05:26] <^demon> I'm an idiot. Of all days for me to show up to class an _hour early_, it had to be on a day where it's pouring rain. I'm soaked now, and for no legitimate reason.... [15:05:54] ^demon: [15:05:56] 11:05:30 -rakkaus:#mediawiki-i18n- [27-Aug-2008 15:01:44] PHP Notice: Undefined variable: form in /var/www/w/includes/specials/SpecialWatchlist.php on line 272 [15:07:31] <^demon> MinuteElectron: Done. [15:07:45] 03demon * r40085 10/trunk/phase3/includes/specials/SpecialWatchlist.php: Can't add to a variable if it doesn't exist... [15:07:50] thanks [15:09:28] Hey, does anyone new if theres a way to upload files to another namespace than image:? maybe file: instead? [15:10:37] you can rename the Image namespace, or add an alias to it [15:11:06] *^demon toyed with trying to make multiple image namespaces once...didn't go well [15:12:32] <^demon> Trying to duplicate image and category namespaces just doesn't work :-p [15:14:25] not very encouraging ;-) [15:14:33] well ill try out later [15:23:03] 03demon * r40086 10/trunk/phase3/ (3 files in 3 dirs): [15:23:03] Reverting my reversions of the fieldset on Special:Watchlist. I liked it, and [15:23:03] apparently some other people did too (although some don't). Plus, it's been [15:23:03] pointed out that semantically, a fieldset belongs around fields. Who knew? Could [15:23:03] use some CSS tweaks maybe. [15:27:21] 03(mod) Wrong formlink when viewing an empty page - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15329 (10dasch_87) [15:28:49] *siebrand gives ^demon a cookie. [15:29:11] How do I use infoboxes on my own wiki? [15:29:18] <^demon> !templates [15:29:18] --mwbot-- For more information about templates, see . The most common issues with templates copied from Wikipedia can be fixed by installing ParserFunctions and enabling HTML Tidy . 
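The !templates advice from mwbot above usually amounts to two additions to LocalSettings.php. A sketch, assuming the standard extension layout; the tidy binary location is an assumption and varies per host:

    require_once( "$IP/extensions/ParserFunctions/ParserFunctions.php" );
    $wgUseTidy = true;
    $wgTidyBin = '/usr/bin/tidy';   // wherever the tidy binary lives on your server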
[15:29:21] I have parserfunctions installed; I thought that was enough [15:29:48] I'm on a shared host too [15:29:55] <^demon> Gary_Kirk: See the first link especially. Taking a look at some of the infoboxes on enwiki might help as well. [15:30:01] <^demon> You can certainly do it on a shared host. [15:30:14] yeah, this is a new wiki I have set up [15:30:22] <^demon> It's just tricks with templates, nothing you can't do with simple wikitext + parserfunctions. [15:30:33] ^demon, nothing? :P [15:30:54] <^demon> Reedy: Double negative, I know. [15:31:01] lol [15:31:05] depends on your pov [15:31:11] Alright, so I have copied from enwp Template:Infobox and Template: Infobox_Person [15:31:13] it either means a positive, or enforces the negative [15:31:38] on my wiki Template:Infobox displays blank and Infobox_Person displays as just code [15:31:46] meow [15:31:56] http://external.xinki.org.uk/blrag/wiki/index.php?title=Template:Infobox_Person [15:32:10] space is needed to give the page some structure and to be not too crowded [15:32:11] <^demon> You'll also need Template:Infobox (it's a meta template designed to help the simpler infobox person, infobox boat, etc) [15:32:42] <^demon> Reedy: Mean it as a positive. English teachers can bite me. :-) [15:32:43] yep, I copied template:infobox but think I left out the bottom ie {{documentation}} or something similar? [15:32:59] <^demon> {{documentation}} is an enwiki specific thing to add documents to a page. [15:33:12] Ok, so what have I done wrong? [15:34:02] <^demon> Let me take a look. There's a lot of sub-templates involved in in infobox person. [15:34:37] <^demon> You'll need..... !, Age, Birth date and age, Image class names, Infobox, MONTHNAME, MONTHNUMBER, Nowrap. [15:34:42] exactly which part of Template:Infobox Person and Template:Infobox am I supposed to copy? [15:34:46] <^demon> Maybe others? Those at the very least. [15:35:10] all those ^ are Template: names? [15:35:42] <^demon> Yes :( [15:35:53] if you look at the bottom of the edit page, shouldnt it tell you what templates the page uses? [15:36:03] <^demon> http://en.wikipedia.org/wiki/Special:Export/Template:Infobox_Person - might help you. You could import that, technically. [15:36:26] <^demon> Reedy: Yeah, but I removed some of the enwiki things like {{protected}} and whatnot, they shouldn't be needed. [15:36:32] ay [15:36:42] so what do I copy from Template:Infobox? View source and copy everything? :S [15:37:04] <^demon> Yes...but if you do it that way, you'll have to _manually_ copy _all_ the sub-templates it uses. [15:37:20] So I should export it [15:38:09] and tick include templates> [15:38:11] *? [15:38:13] <^demon> You're better off exporting (visit Special:Export http://en.wikipedia.org/wiki/Special:Export), and type in "Template:Infobox_Person" into the big box (without the quotes), check "Include Templates." When you get this XML, you can _import_ it to your wiki, and Infobox Person should work automatically. [15:38:15] <^demon> Yes. [15:38:35] Infobox Person and Infobox? [15:38:43] got it [15:39:10] <^demon> Not "Infobox", as it should be included as a sub-template automatically. [15:39:19] ok [15:41:31] Duesentrieb: Daniel, have you been made aware of this yet? http://www.mediawiki.org/w/index.php?title=Extension_Matrix&diff=207751&oldid=205103 [15:41:52] sorry peopple [15:42:07] ^demon: I've purged the page and still not working. 
[15:42:18] <^demon> Bah :( [15:42:23] (after having imported that) [15:42:49] i want to ad FAQ on my wikipedia portal [15:42:51] <^demon> One second. [15:43:08] k [15:43:22] just like support -> faq [15:43:27] in wikimedia site [15:43:34] that points to http://www.mediawiki.org/wiki/Manual:FAQ [15:43:44] is it possible ? [15:44:04] is this a standard feature or an extension/plugin [15:44:06] ??? [15:45:33] <^demon> Gary_Kirk: Those couple of anon edits from 209.x are from me. [15:45:39] <^demon> I'm trying to play with this. [15:45:55] feel free [15:46:10] <^demon> Hmm. [15:46:33] <^demon> I've got to run to class. I'll take a poke when I get back if you haven't already fixed it (just ping me if you do :)). Best of luck. [15:46:42] thanks :) [15:46:48] <^demon|class> For real this time guys, I swear. :-p [15:52:54] is there any point in submitting a patch for a hook in the "watchlist" section of the preferences? [15:53:32] or might as well just wait for bugs 14806 to 15235 to be closed [15:53:34] ? [15:54:34] https://bugzilla.wikimedia.org/show_bug.cgi?id=14806 and https://bugzilla.wikimedia.org/show_bug.cgi?id=15235 are more generalized solutions [16:00:42] 03(mod) DISPLAYTITLE doesn' t always work when current revision is sighted - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15330 (10dapete.ml) [16:21:49] Hello! I wonder if someone here could help me please. I posted this in the wrong channel earlier. [16:22:05] Hello all. I want to ask about a feature request for wiki software - specifically WP. On the button in the extended toolbar, can it be arranged so that when its clicked, it includes a {{reflist}} on the edited page? I used it and just had to get quanticle to correct my edits. [16:22:45] drivamgr2008Spri, that button is not part of the software. Ask whoever added it to Wikipedia, some sysop or other. [16:23:08] I don't know who provides the extended toolbar... [16:23:16] I dont even know how the heck I got it [16:25:31] Actually its not in the extended toolbar. Its in the main one. Where you get the image button, etc... [16:27:59] hello [16:28:53] i need help moving our mediawiki wiki from one server to another... is there an easy way to do this? [16:35:23] 03(mod) ISO Autoformatting accuracy error with Julian versus Gregorian calendar. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15311 +comment (10pmanderson) [16:40:41] 03catrope * r40087 10/trunk/phase3/includes/api/ApiBase.php: Fix fatal error caused by missing comma [16:42:16] How do I remove the Privacy and About {{SITENAME}} links like I have on this wiki: http://xinki.org.uk/lab/Home ? [16:43:45] !footer [16:43:45] --mwbot-- For changing the page footer, see the FAQ (!) . More information at http://www.mediawiki.org/wiki/Footer and http://www.mediawiki.org/wiki/Manual:Skinning#Footer [16:44:02] self-help ftw! [16:44:23] 03(mod) Strange font size CSS in monobook - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15337 +comment (10dan_the_man) [16:47:23] i need help moving our mediawiki wiki from one server to another... is there an easy way to do this? [16:48:44] 03catrope * r40088 10/trunk/phase3/includes/api/ (5 files): [16:48:44] API: [16:48:44] * Add titlePartToKey() and keyPartToTitle() which use the substr() hack to preserve trailing spaces [16:48:44] * Migrate function calls where needed. ??continue parameters still use titleToKey() because they're generated using keyToTitle() and therefore can't contain trailing spaces [16:49:55] Do I have to edit MonoBook.php? 
[16:52:11] 14(INVALID) Strange font size CSS in monobook - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15337 +comment (10hashar) [16:53:07] 03raymond * r40089 10/trunk/extensions/Renameuser/SpecialRenameuser_body.php: No need for a version_compare since we branch extensions too [16:53:31] <_mary_kate_> i really hate the idea of "branching extensions" being a magical solution to the version problem [16:53:46] <_mary_kate_> do extension authors even know we're doing that? there was no announcement about it, it just started happening one day [16:53:56] <_mary_kate_> do people fix extension security issues on every old branch? [16:54:49] major security issues, yes [16:55:07] like that one tim discovered [16:55:07] <_mary_kate_> so Tim applied his recenty fixes to every branch? [16:55:18] 03vrandezo * r40090 10/trunk/extensions/SemanticMediaWiki/languages/SMW_Messages.php: Changed opposite to inverse [16:55:32] he did apply a security fix using a tool he made to all branches [16:55:50] MinuteElectron: Are you a developer? I am trying to make sense of something... [16:56:06] go ahead [16:56:36] I asked earlier and spoke to simetrical, but I got my fact wrong. In mediawiki, above the edit box (with no extras) is a toolbar. [16:56:49] Is that part of the main mw software? [16:56:57] yes [16:57:14] Right. Does it have a reference button on it as standard? [16:57:35] no [16:57:39] Ah shoot. [16:57:42] Special:ExtensionDistributor explicitly says that many extensions work across versions, and that you can try the trunk version if you like [16:57:51] if you need the latest features [16:57:53] Do you know who added it in please? [16:58:16] drivamgr2008Spri: I don't know who, but it is done in MediaWiki:Common.js [16:58:27] see http://en.wikipdia.org/wiki/MediaWiki:Common.js [16:58:38] Ah. Well it needs amending, and I don't know how to do it. [16:58:45] I am hopeless with js [16:58:51] TimStarling: removing the version_compare was a bad idea? [16:59:34] When you click the button, it needs to add a {{Reflist}} on the page, otherwise the references it enters don't show up on the page. [16:59:38] most likely the extension is broken anyway for 1.8 [17:00:05] so it probably doesn't matter [17:00:09] ok [17:00:12] I do php... or bits of it, as Tim knows, cause I sent him something ages ago. [17:00:16] but I wouldn't make a rule of removing version checks [17:00:31] I managed to get all of the extended edit buttons working in IE6 [17:00:36] on my own wiki [17:01:21] 14(INVALID) Ghost categories in wanted categories - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15152 +comment (10hashar) [17:02:05] Am I allowed to edit MediaWiki:Common.js more to the point? [17:03:14] ok. no i am not... I get a View Source... [17:04:52] OK Then. Last question: How do I make a feature request for something? [17:05:05] cause in its current state, the ref button doesn't work properly... [17:17:07] 03(mod) Check Babel extension for compatibility with Wikimedia Wikis - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15308 (10robin_1273) [17:22:21] Is it possible to have your Section headers enumerate like the Content box does? [17:30:23] Can someone answer my q please? How do i make a feature request for the mediawiki software? [17:32:57] drivamgr2008Spri: bugzilla [17:33:16] it's not only for bugs but also feature requests and similar [17:33:26] thanks Bryan. 
Its not major, just something which needs doing to Common.js - and I can't edit it [17:34:14] One of the extended edit buttons attached to it is incomplete [17:34:34] 03(mod) Oversight of file histories including images - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=8196 (10cbass) [17:35:19] is there any wiki with query.php still up? [17:35:40] (mostly i just want the docs generated when you visit with no querystring) [17:38:48] 03btongminh * r40091 10/trunk/phase3/ (9 files in 5 dirs): [17:38:49] Splitting backend upload code from SpecialUpload. [17:38:49] * All common upload code resides in UploadFromBase. Then depending on the upload source, one of its derived classes is initiated by Special:Upload. [17:38:49] * SpecialUpload::ajaxGetExistsWarning now only returns warnings that are related to existence. [17:38:49] * Allow LocalFile::upload to attribute the upload to another user than $wgUser [17:38:52] This introduces breaking changes for upload extensions. [17:39:50] rewrites are scary [17:41:45] Ok, got a bugzilla a/c now so I will get the feature request done. Better search it first to make sure its not already been asked for. [17:43:06] 03(FIXED) Provide time sorting for OldReviewedPages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15293 (10JSchulz_4587) [17:48:38] write api is so cool 8) [17:51:08] <^demon|class> Bryan: So is your Upload rewrite. Thoughts on moving them to ~/includes/upload though, for consistency with other grouped packages (filerepo, api, diff, parser, etc)? [17:51:24] later [17:51:32] first see whether this gets through [17:51:41] <^demon|class> Looks good to me so far :) [17:51:58] <^demon> But I'm not the judge, hehe. [17:52:26] please test it :) [17:52:36] and put all kind of weird stuff into it [17:52:49] <^demon> A test suite for it might be cool :) [17:53:48] Bryan: is write api enabled for wmf? [17:53:55] yes [17:54:17] *jeremyb should port pywikipedia [17:55:13] *VasilievVV should write his own bot framework on PHP [17:55:51] <^demon> VasilievVV: Isn't there one? [17:56:14] writing it yourself makes you a better man :P [17:56:15] ^demon: mote than one :) [17:56:24] *more [17:56:59] http://xkcd.com/409/ [17:57:05] 03(NEW) Reference creation button produces incomplete entries in extended toolbar. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15339 15enhancement; normal; MediaWiki: General/Unknown; (hammer.of.thor) [17:57:57] hey guys.. are there any guides for helping move a mediawiki site from one server to another (also from CentOS to Gentoo) and from old MediaWiki to new version (i think 1.6 to 1.11) [17:59:03] <^demon> Bryan: So far so good :) [17:59:25] hi there. do you know a nice tool troff -> mediawiki syntax? i.e. to convert common unix manpages? [18:00:10] <^demon> Bryan: "Cancel upload and return to the upload form" generates a blank page. [18:00:17] <^demon> On uploading a dupe file. [18:00:38] ok checking [18:01:10] <^demon> Well, it says "Upload" but the page has no content (looks like a blank wikipage) [18:03:01] *AaronSchulz wonders why UploadForm doesn't extend UploadFromBase [18:09:31] 03btongminh * r40092 10/trunk/phase3/includes/ (UploadFromBase.php specials/SpecialUpload.php): [18:09:31] * Remove a debug statement [18:09:31] * Return a status on unsaveTempFile [18:09:56] because the UploadFromX things are backends [18:10:07] and UploadForm is frontent [18:10:16] 03mark * r40093 10/trunk/ubuntu/autoinstall/pmtpa.cfg: Update LVS service IP [18:11:56] Does API has action=upload? 
[18:12:43] not yet [18:12:49] too tired [18:13:16] There's a bug for it. [18:13:19] Does API action=move handles image moving correctly? [18:13:23] yes [18:13:28] (last time I checked) [18:13:59] Bryan: I found image moving broken recently [18:14:04] From what we see, how's the outlook for API edit? (As in, does it look like it will be permenent?) [18:14:41] 03(FIXED) Refactor upload code to split backend and interface - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=14925 +comment (10Bryan.TongMinh) [18:15:23] 03(ASSIGNED) action=upload should be added to the API - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15227 (10Bryan.TongMinh) [18:15:39] That's the bug :) [18:17:11] I'm doing an extension for my wiki which only serves the purpose for 'tagging' for backend db interaction, nothing should be output to the page, as such I just have a return true; however once the page is saved there's
<p><br/>
tags inserted thus pushing all the content down. Is there anyway so it doesn't insert these p/br tags? [18:17:48] brion: they make you log in on vacation? [18:21:12] brion-vacation: have nice vacations :) [18:21:52] - there isnt any parameters or anything for code tags is there? [18:23:16] 03raymond * r40094 10/trunk/phase3/ (3 files in 3 dirs): Move the timestamp into the message of the current revision link to make the link consistent with the link to the previous version on the left side [18:25:38] 03hashar * r40095 10/trunk/phase3/ (3 files in 2 dirs): Lets render BMP pictures to PNG. [18:26:03] 14(DUP) BmpHandler doesn't generate tags for .bmp images - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15144 +comment (10hashar) [18:26:06] 03(mod) $wgMediaHandlers is missing image/x-bmp in DefaultSettings.php - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13265 +comment (10hashar) [18:29:54] I'm doing an extension for my wiki which only serves the purpose for 'tagging' for backend db interaction, nothing should be output to the page, as such I just have a return true; however once the page is saved there's
<p><br/>
tags inserted thus pushing all the content down. Is there anyway so it doesn't insert these p/br tags? [18:30:59] Okaaaayyy.... something is wrong with my array_splice call, eugh... [18:31:39] tekmosis: Inside the content? [18:33:23] Agh... damn... stupid function [18:33:34] Dantman, correct. Even though while I'm editing I add in it will run through my function, even with a return of true or a blank string it will add in the p/br tags to the saved page [18:34:17] Can you give me a url? Also, what revision of SVN are you using? [18:37:32] Dantman, http://footballmanager.neoseeker.com/wiki/Category:Player_Guides you can see all the extra
<p><br/>
it is adding in as there is 8 implementations of the extension [18:38:15] So what version of MW [18:38:22] 1.12 [18:39:02] Oh, sorry n/m... [18:39:33] I thought you were comming in and saying that when you edited
<p><br/>
was being appended to the end of the text that you saved into the page [18:39:59] I just perked up about that because I've been doing some EditPage changes and part is related to a sequence similar to that [18:40:10] I just thought it was another regression I'd have to fix [18:40:14] ahh ok [18:43:53] ahh, UploadFromBase [18:44:01] I thought it was UploadFormBase [18:45:08] seems like an ackward name [18:50:42] 03aaron * r40096 10/trunk/extensions/ConfirmAccount/RequestAccount_body.php: now UploadFromBase [18:51:22] What was the url to get my mediawiki version again? I forgot. [18:51:38] solifugus: special:version [18:53:03] *Jack_Phoenix waves to ialex :-) [18:53:17] Raymond_: ok.. that's what i tried.. turns out i made a typo earlier in the url.. thanks tho [18:53:18] hello Jack_Phoenix [18:53:21] rar [18:58:25] hi [19:00:39] so, some api calls return timestamps as ISO 8601 and some return as one huge integer (like history offset) [19:00:44] what's up with that? [19:01:37] the last one is wrong [19:01:45] I think [19:03:21] jeremyb> maybe it is a UNIX timestamp ? [19:03:50] It still should be normalised [19:03:59] /standardised [19:04:18] timestamp for the win ! [19:04:51] jeremyb, log it as a bug [19:05:03] and make sure you give some examples :p [19:05:08] yaeh [19:06:02] it is probably a TS_MW [19:06:07] hashar: that just happens to start with 2008? [19:08:43] hmm [19:08:51] Strict Standards: Declaration of UploadFromStash::verifyFile() should be compatible with that of UploadFromBase::verifyFile() in /Library/WebServer/Documents/phase3/includes/UploadFromStash.php on line 32
[19:08:51] Strict Standards: Declaration of UploadFromStash::checkWarnings() should be compatible with that of UploadFromBase::checkWarnings() in /Library/WebServer/Documents/phase3/includes/UploadFromStash.php on line 32
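Going back to the tag-only extension tekmosis described above: a minimal setHook-style sketch looks like the following. The tag name <tagger> and the database write are placeholders, not his actual code. Note that whatever string the callback returns replaces the tag in the rendered output, and if the tag sits on a line of its own an empty return can still leave behind an empty paragraph, which is the <p>/<br/> effect he is seeing:

    $wgExtensionFunctions[] = 'wfTaggerSetup';

    function wfTaggerSetup() {
        global $wgParser;
        $wgParser->setHook( 'tagger', 'wfTaggerRender' );
    }

    function wfTaggerRender( $input, $args, $parser ) {
        // record $input / $args in a custom table here (omitted)
        return '';   // nothing visible in the rendered page
    }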
[19:09:01] not to mention [19:09:01] $ date -ur 20080325010316 [19:09:02] Tue Jan 19 03:14:07 GMT 2038 [19:11:11] 03btongminh * r40097 10/trunk/phase3/includes/UploadFromStash.php: Fixup the function declarations that we're overridden from UploadFromBase. [19:13:05] 03(mod) Commons EXIF extractor does not handle "flash" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11884 summary; +comment (10hashar) [19:13:17] jeremyb:> echo wfTimestamp( TS_DB, 20080325010316 ) [19:13:17] 2008-03-25 01:03:00 [19:13:38] ialex: so what? [19:14:27] this is not "Tue Jan 19 03:14:07 GMT 2038" ;) [19:15:39] ialex: ... and what did hashar say? what was I responding to? [19:16:18] i'm trying to change the font of my wiki, as i had some problems, i would like to know if anybody knows a turorial for it [19:16:40] *jeremyb points to [[MediaWiki:Common.css]] [19:17:06] and http://w3.org/TR/CSS21 [19:20:02] !style | maxxxxx [19:20:02] --mwbot-- maxxxxx: To change styles for your wiki, go to one of the MediaWiki:xxx.css wiki page and put your custom styles there (sysop/admin rights required). MediaWiki:common.css is for all skins and should be used for content styles. MediaWiki:monobook.css is for the MonoBook skin (default), etc. For more information, see !skins and [19:20:06] !skinning | maxxxxx [19:20:06] --mwbot-- maxxxxx: Overview: . Skin usage: . Gallery of CSS styles: . Writing your own: [19:23:12] 03(ASSIGNED) Commons EXIF extractor does not handle "flash" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11884 (10hashar) [19:25:47] 03(mod) Commons EXIF extractor does not handle "flash" - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=11884 (10hashar) [19:25:58] night [19:34:08] 03huji * r40098 10/trunk/phase3/languages/messages/MessagesFa.php: Localisation updates: Using a better Persian translation for "hide" [19:44:47] hello, I am trying to import some images from wikibooks into a mirror I am creating. I got the images downloaded by using wikix. I have checked them and they have all downloaded correctly. Now, when I import them using maintinence/importImages.php the images all end up in the file system and in the database, but the links to the images in the pages don't work. They just say 'this image hasn't been uploaded yet.' I hav [19:45:32] 03(mod) Bad rendering - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15334 (10huji.huji) [19:47:59] test [19:49:02] cooperq: qour question was truncated [19:49:07] after "I have" [19:49:22] 03raymond * r40099 10/trunk/phase3/ (3 files in 3 dirs): Add an own message for the fieldset legend instead of duplicating the special page name [19:49:30] but the first question is: did you clear the server side cache? [19:50:05] Duesentrieb: Thanks, how do I do that? Would just restarting apache be enough? [19:50:14] no [19:50:23] apache has nothing to do with it [19:50:31] parser cache resides in the db, per default [19:50:39] just touch your LocalSettings.php [19:50:56] all cached objects that are older than that file are considered invalid automatically [19:51:03] anyone know a live query.php service? (i'm porting) [19:51:30] ah [19:51:37] Duesentrieb: that didn't do it [19:51:42] here is the full question [19:51:44] hello, I am trying to import some images from wikibooks into a mirror I am creating. I got the images downloaded by using wikix. I have checked them and they have all downloaded correctly. 
Now, when I import them using maintinence/importImages.php the images all end up in the file system and in the database, but the links to the images in the pages don't work. They just say 'this image hasn't been uploaded yet.' I [19:52:02] that's not full :) [19:52:31] cooperq: it's too long. just post the second part :) [19:52:50] line limit on irc is what - 500 chars? [19:53:41] hehe, sorry :) [19:53:46] cooperq: hm, if the images are in the database, [[Image:...]] links should work. did you try adding a new one and hitting preview? [19:53:50] 03(NEW) section edit bug - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15340 04CRIT; normal; MediaWiki: Page editing; (PhiLiP.NPC) [19:54:11] tried rebuilding the site from scratch, running just about every maintinence script all to no avail. Can someone plese give me some pointers here? I feel like there is something obvious that I am missing. [19:54:22] ok there is the full image [19:54:22] Duesentrieb: no, it's variable. msg limit is 512 including all the overhead stuff. target channel, "PRIVMSG", etc. [19:54:33] no I haven't tried adding one manually [19:54:34] one sec [19:54:35] and that's bytes not chars i'm guessing [19:54:47] jeremy: yea [19:54:57] cooperq: link to your site? [19:54:58] brion-vacation: Please look at bug 15340, looks like a problem with the EditPage rewrite [19:55:44] he is on vacation, ask tim [19:55:48] Duesentrieb: and then when you receive it, it has the sender's nick in it. so each nick length+channel name length combo has a different max msg length [19:56:21] 03brion * r40100 10/trunk/extensions/OpenSearchXml/ (. ApiOpenSearchXml.php OpenSearchXml.php): [19:56:21] Add extension for the OpenSearch suggest XML variant supported by IE 8 beta 2. [19:56:21] Includes preliminary versions of some text & image extraction code which will be generalized to other search interfaces pending improved support: [19:56:21] * OpenSearch suggest JSON interface provides for text extracts, but I believe Firefox doesn't support it currently [19:56:24] * Image extractions could be used nicely for on-site search, and we'll want it for future geo-type searches. [19:56:27] * Improved text extraction could be used to update the abstract extension and various other list-like interfaces. [19:56:56] AaronSchulz: Really? :P [19:57:02] and i'm done :D [19:57:09] bye everybody, see ya next week [19:57:13] Crap [19:57:39] And Tim is probably in bed [19:58:35] Tim-away: Please look at bug 15340, looks like a problem with the EditPage rewrite. Some regressions in that rewrite were fixed recently, so svn up'ing *may* solve the problem [19:59:20] Duesentrieb: http://freetextbookpublishing.com [20:00:10] cooperq: hrm... wiki files in the docroot... not good. but not the current problem [20:00:41] Duesentrieb: It's not bad here, because he's not using pretty URLs [20:00:51] yes [20:01:05] it's just generally a bad idea [20:01:10] really, why is that? 
[20:01:11] Yeah, I know [20:01:19] Just in case you ever wanna move to pretty URLs [20:01:27] definitely will [20:01:34] cooperq: http://freetextbookpublishing.com/index.php/Image:Salsa_dancing.jpg http://freetextbookpublishing.com/index.php/Image:Antennariadioeca.jpg http://freetextbookpublishing.com/index.php/Image:Plate_tectonics_map.gif [20:01:36] !rewriteproblems | cooperq [20:01:36] --mwbot-- cooperq: 1) Try as a fail-safe method; 2) Do not put the files into the document root; 3) Do not map the pages into the document root; 4) Use different paths for real files and virtual pages; 5) Do not set a RewriteBase; 6) Put all rules into the .htaccess file in the document root. [20:01:47] cooperq: are those images you just imported? [20:02:28] yes they are [20:02:51] cooperq: works fine for me [20:03:01] cooperq: http://freetextbookpublishing.com/index.php?title=Free_Textbook_Publishing%3ASandbox&diff=1239083&oldid=1227951 [20:03:31] where do they not work? [20:04:01] http://freetextbookpublishing.com/index.php/Chess [20:04:12] is there a reason we shouldn't have wiki files in the document root? [20:04:46] cooperq: well, perhaps you simply didn't import that image? [20:04:54] cooperq: you are mirroring Wikibooks? [20:05:03] cooperq: have you looked for the respective file names in the dir you copied the images from? [20:05:04] Mike_lifeguard: yes [20:05:09] yay for reusing out content. may I ask why? [20:05:35] Duesentrieb: yes I have [20:05:46] cooperq: make sure to comply to the GFDL. YOu aren't, currently. [20:06:07] is there a reason we shouldn't have wiki files in the document root? [20:06:33] Mike_lifeguard: contract job for my girlfriends dad, he wanted a wiki where people could upload there own books and he wanted some content already on it [20:06:43] Duesentrieb: in what way? [20:06:59] cooperq: you don't attribut individual authors. [20:07:10] hello? [20:07:12] megagram: Yes. [20:07:24] ahh wasn't sure if i was being ' [20:07:32] heard' or not [20:07:42] megagram: naming conflicts aka namespace pollution [20:07:58] When you use pretty URLs, mapping stuff to the site root nearly always violates the most important rule: real and virtual parts must not overlap [20:08:01] megagram: resulting in troubles with redirects, styles, extensions, etc [20:08:09] does this matter if i'm using namevirtualhost on the server? [20:08:18] cooperq: anyway - are all non-working pictures SVG? [20:08:27] thats a good question [20:08:28] pardon my ignorance but what are 'pretty urls'? [20:08:30] megagram: yes. [20:08:59] megagram: /wiki/Foo instead of index.php/Foo or index.php?title=Foo [20:08:59] and what are the real and what are the virtual parts? [20:09:16] megagram: Real path = /w/index.php?title=Foo virtual path = /wiki/Foo [20:09:22] gotcha [20:09:25] Duesentrieb: no. pngs aren't working too: http://freetextbookpublishing.com/index.php/Opening_theory_in_chess [20:09:31] Lots of evil stuff happens when you try to map /Foo to /index.php?title=Foo [20:09:36] cooperq: hehe, that's what i was looking at right now too [20:09:43] unless it's in its own directory? [20:09:46] Such as infinite redirect loops if you don't set it up properly [20:09:52] but I do notice that some of the images are working now, that is a step in the right direction [20:09:54] :) [20:09:56] And the inability to create a page called index.php [20:10:08] cooperq: will the books all be freely-licensed (the ones people are going to be uploading)? [20:10:10] cooperq: well then. did that file get copied during import? 
is there a corresponding entry in the image table? [20:10:17] Mike_lifeguard: Definitely [20:10:19] 03(NEW) Option to show/hide disambiguation pages in Special: ShortPages - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15341 15enhancement; normal; MediaWiki: Special pages; (shijualex) [20:10:20] cooperq: btw, did you import the pages of the Image namespace too? [20:10:21] ok thanks for this information.. so basically there is a way to make it work but its best to keep wiki files in their own directories? [20:10:29] megagram: The preferred way is to put the files in one directory (/w) and the pretty URLs in another (/wiki). Under no circumstance should those paths overlap [20:10:52] i don't think we are using pretty URLs though.. assuming this is something that needs to be configured, right? [20:10:54] well, they canoverlap, but you have to be aware of the implications. [20:11:16] megagram: ut you may want to in the future. using "technical" urls sucks too :) [20:11:32] Duesentrieb: I imported all pages from the latest-pages-articles dump on wikibooks [20:11:34] 03(mod) Option to show/hide disambiguation pages in Special:ShortPages - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15341 (10shijualex) [20:11:39] cooperq: GFDL? or some-other-free-license? [20:11:40] megagram: Yes. [20:11:59] not sure if that would have imported the images namespace or not [20:12:04] Mike_lifeguard: GFDL [20:12:10] cooperq: hm... don't know if that includes images... [20:12:12] Duesentrieb: Being aware of the implications basically means that you're smart enough to know you shouldn't even try it [20:12:17] \o/ [20:12:24] shit I have to go to a doctors appointment [20:12:27] we will have to watch and grab content from you too [20:12:32] I will talk to you all when I gut back [20:12:37] RoanKattouw: thank you for the info... [20:12:39] what a shit time to leave [20:12:53] if any of you have any suggestions to email me my contact is cooperq@bitsamurai.net [20:12:57] thanks everyone for your time [20:13:10] Duesentrieb: I got the images from using wikix to parse the XML [20:13:24] Bryan: Am I right in assuming that you've finished the necessary upload rewrites and are now writing an API upload module? [20:14:02] currently I'm tired :) [20:14:09] and somewhere half way with that module [20:14:33] Bryan: OK, no pressure. The point is the hellish part is done, which is great. [20:16:22] phew, got a repreive [20:16:29] back [20:17:43] so Duesentrieb would it hurt to just : mkdir wiki; mv * wiki [20:17:52] to get my files out of the root? [20:18:42] cooperq: you have to asjust $wgScriptPath in your localsettings (and passibly other pathes too, but probably only that) [20:19:02] cooperq: it would currently be "/" and should then be "/wiki/" [20:19:25] ok cool [20:19:34] hmm Duesentrieb, here is something interesting [20:19:49] cooperq: If you ever want pretty URLs, you're better off putting the files in /w [20:19:51] http://freetextbookpublishing.com/index.php/Chess/Basic_Openings [20:19:59] RoanKattouw: thanks [20:20:12] Duesentrieb: some of the image files work and some don't [20:20:17] they are all pngs though [20:20:23] I Have a theory about this [20:20:54] uh... htere should be svgs too. and jpgs, no? 
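The /w plus /wiki separation being recommended above is, in LocalSettings.php terms, roughly this; the web-server rewrite rule that maps /wiki/$1 onto /w/index.php is assumed and not shown here:

    $wgScriptPath  = "/w";
    $wgScript      = "$wgScriptPath/index.php";
    $wgArticlePath = "/wiki/$1";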
[20:21:02] I started running refreshlinks.php last night and it crapped out before it totally finished [20:21:25] cooperq: You might wanna do $wgNamespacesWithSubpages[NS_MAIN] = true; so [[Chess/Basic Openings]] links back to [[Chess]] [20:21:31] Duesentrieb: sorry there are, what I meant was some of the pngs are working and some aren't so its not a file type specific thing [20:21:34] cooperq: oh, btw, listen to RoanKattouw - the path you move your files to is NOT the path your pages will be available under. that'S the point of having diofferent pathes for files and for pages. [20:22:18] cooperq: again, please check if the non-working images show in the image table [20:22:26] ah right [20:22:27] one sec [20:25:17] AaronSchulz: (or anybody else with a commit access): could you add one character to SpecialAllpages.php, line 279, or should I file a bugreport? ;-) [20:25:28] Mormegil: I'll do it [20:25:33] 03mark * r40101 10/trunk/ubuntu/autoinstall/pmtpa-internal.cfg: Fix LVS service IP for internal hosts as well [20:25:49] RoanKattouw: [20:25:51] - list( $namespace, $toKey, $to ) = $toList; [20:25:53] + list( $namespace2, $toKey, $to ) = $toList; [20:26:20] it fixes the inability to use http://en.wikipedia.org/wiki/Special:AllPages/User:Foo [20:26:31] (or, at least, I hope so ;-) ) [20:26:56] Duesentrieb: they are not [20:27:51] Mormegil: How do you want to be credited? Mormegil? [20:28:03] yes, please (and thanks) [20:28:07] cooperq: so they are not imported for some reason. [20:28:19] ok that makes sense [20:28:25] Probably the smallest patch ever [20:28:26] now we just have to figure out why I guess [20:28:27] 03catrope * r40102 10/trunk/phase3/ (CREDITS includes/specials/SpecialAllpages.php): Make Special:Allpages/User:Foo actually work. Patch by Mormegil [20:28:55] Mormegil: So for that one character, you're actually listed as a patch contributor in the credits for 1.14 [20:29:20] thanks again, it was much faster than going through MediaZilla (and I have some experience with _that_ ;-) ) [20:29:48] well: I've had some patches before, but without any credit (AFAIR) [20:29:57] Well don't I feel like a stupid jerk [20:29:59] Yeah it's new [20:30:17] Mormegil: Believe it or not, some bugs actually get fixed in 10 minutes [20:30:21] Duesentrieb: sorry for wasting your time, the missing images don't seem to have ever been downloaded [20:30:42] hrm, i did tell you to check for them in the directory... [20:30:43] the few that I checked (when none of the images were working) had been successfully downloaded I guess [20:30:50] Depends on how busy people are, whether they feel like bug-squashing at that time, etc. [20:30:52] RoanKattouw: this one was patched in 3 minutes ;-) [20:31:00] So some bugs stay in there for 3 years [20:31:24] hmmm ok [20:31:41] Duesentrieb: some that are NOT working are in the file system, for example documentPrint.svg [20:32:47] still not in the database though [20:33:05] ok so it looks like I have two basic problems here [20:33:24] cooperq: Did you know you can upload from a URL? [20:33:27] I don't have all the images I need, and all the images I have are in the database [20:33:48] RoanKattouw: I did not know that [20:33:56] cooperq: you'll need to set up svg rendering, for one thing. also... documentPrint.svg shouldn't be on the file system. 
it should be DocumentPrint.svg [20:33:57] Lemme look up hw [20:34:17] Duesentrieb: that was a typo, sorry [20:34:28] cooperq: Sysops should automatically be able to do that [20:34:38] cooperq: np, just looking for causes of trouble [20:34:49] RoanKattouw: i don't think so. [20:34:54] Oh wait [20:34:59] $wgCopyUploads = true; [20:35:25] Duesentrieb: I guess the first thing to do would be to download all the image files again, do you think? [20:35:31] to try and get the missing ones? [20:35:48] cooperq: Not necessarily. You can upload directly from a URL by setting $wgCopyUploads = true; in LocalSettings.php [20:36:16] cooperq: well, idealy, you make a list of missing ones and try to dl them. and if that doesn't work, find out why. [20:37:04] Hmm ok [20:37:19] *cooperq sets about making a list of missing images for a download script to parse [20:37:41] um, any way I can make mediawiki give me a list of broken image links? [20:37:53] or at least a list of all broken links [20:42:21] SELECT il_to FROM imagelinks LEFT JOIN page ON (page_namespace, page_title) = (6, il_to) WHERE page_title IS NULL; [20:43:23] sql-fu! [20:44:56] 03aaron * r40103 10/trunk/phase3/includes/GlobalFunctions.php: Set replace flag for headers [20:48:29] cooperq: huh, i have never seen a tupeled on-clause like this. cool, that works? I just use AND :) [20:48:40] but yes, that'S how i would have made that list, too [20:48:53] you may want to group it though, to remove dupes [20:49:10] or SELECT DISTINCT [20:49:35] Duesentrieb: Yeah I didn't know you could do that either. Really nice syntax for ns/title combos [20:50:03] looks a bit more pretty, yes. [20:50:46] 03(mod) sitemap-index doesn't include full location path - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9675 +comment (10mediawiki) [20:51:12] Is that portable? [20:51:38] i sql ever portable? [20:51:44] 03(mod) sitemap-index doesn't include full location path - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9675 (10mediawiki) [20:51:54] Simetrical: Does it matter? Our DB abstraction layer never produces that kind of SQL [20:52:16] Partly because it doesn't care about beautifying its SQL [20:52:19] 14(INVALID) Reference creation button produces incomplete entries in extended toolbar. - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15339 +comment (10Simetrical+wikibugs) [20:52:27] Duesentrieb, sure. [20:52:57] RoanKattouw, well, if we're only talking about our DB abstraction layer then the syntax is irrelevant. I was wondering more generally [20:52:57] . [20:53:24] I'm gonna wait to install imagemagick and get svg support working here and reimport all the images to see if I can't get some (or all) of these svg images working at least [20:53:26] Simetrical: OK. I thought you were in sceptical mode rather than in wondering mode ;) [20:54:15] cooperq: well, svg should improt even if it doesn't render [20:54:16] (ns, title) is so much more beautiful than AND 8) [20:55:07] 03aaron * r40104 10/trunk/extensions/FlaggedRevs/flaggedrevs.js: fix backwards coloring [20:56:36] Wow, hashar is alive? 
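Since the portability of the tupled ON-clause came up: the same list of missing images can be produced with a plain AND join, plus DISTINCT to drop the duplicates Duesentrieb mentions. A sketch using MediaWiki's DB layer, assuming it runs in a MediaWiki context (for example via maintenance/eval.php) with the default empty table prefix:

    $dbr = wfGetDB( DB_SLAVE );
    $res = $dbr->query(
        "SELECT DISTINCT il_to
           FROM imagelinks
           LEFT JOIN page ON page_namespace = 6 AND page_title = il_to
          WHERE page_title IS NULL",
        __METHOD__
    );
    while ( $row = $dbr->fetchObject( $res ) ) {
        echo $row->il_to . "\n";   // one missing file name per line
    }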
[20:58:03] 03(mod) Strange font size CSS in monobook - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15337 (10random832) [21:01:16] 03aaron * r40105 10/trunk/extensions/FlaggedRevs/specialpages/RatingHistory_body.php: set $plot->minY [21:04:05] 03mark * r40106 10/trunk/ubuntu/autoinstall/ (5 files): Fix occasional unassigned hostname problem, preseed NTP [21:04:36] Duesentrieb: actually it looks like none of the svg images imported [21:04:50] this is probably because I didn't have svg on my list of allowable images [21:05:29] cooperq: --extensions= Comma-separated list of allowable extensions, defaults to \$wgFileExtensions [21:05:38] that'S from the usage info :) [21:06:46] Cool! [21:06:59] now all of the svg images that I successfully downloaded work! [21:07:06] now I just need to download the rest and import them! [21:12:57] 03brion * r40107 10/trunk/phase3/includes/ (Article.php UserMailer.php): [21:12:57] Revert r40042 "* In Article::replaceSection(), actually return null when $section is bogus. Used this in my half-complete and now kind of abandoned attempt at rewriting EditPage.php" [21:12:58] This causes regression bug 15340 -- null-edits to a section destroy the rest of the page. [21:13:29] 03(FIXED) section edit bug - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15340 +comment (10brion) [21:13:38] brion-vacation: Oops. Sorry for blaming dantman for this. [21:15:54] 03catrope * r40108 10/trunk/phase3/includes/UserMailer.php: Revert part of r40107 (Revert r40042 because of regressions). The changes to UserMailer.php had nothing to do with that bug. [21:22:55] anyone know what would cause mediawiki and rc bot not work. [21:23:13] can someone help me out please.... i've installed mediawiki to a wiki folder in my apache documentroot but everytime i navigate to my server i just see a directory listing.. is there an easy way to redirect automatically to the wiki directory? [21:28:09] 03mark * r40109 10/trunk/pybal/ (5 files in 3 dirs): Implement "RunCommand" Monitor, which can run an arbitrary process and decide on server health based on the return code, or a timeout kill. [21:28:50] Duesentrieb: hmm, trying to figure out the best way to automate the downloading of all of these images, problem is some are in the wikipedia commons and some are specific to wikibooks and they are all in that weird image driectory structure [21:29:05] can't do it by hand either cus theres about 3000 of them [21:32:11] cooperq: http://en.wikibooks.org/wiki/Special:FilePath/Foo.jpg will redirect to the file in question [21:32:17] no matter where it is [21:32:24] :D thanks [21:32:48] So many idiosyncracys of mediawiki [21:32:53] *sic [21:33:04] megagram, DirectoryIndex index.php in a .htaccess file should do you [21:38:11] 03siebrand * r40110 10/trunk/phase3/languages/messages/ (51 files): Localisation updates for core messages from Betawiki (2008-08-27 23:24 CEST) [21:44:03] megagram: Do you use Apache? [21:44:25] Duesentrieb: How long does it usually take for cloak requests to be approved? [21:44:53] RoanKattouw: i have no clue really. a couple of days i guess. [21:45:12] just poke seanw ;_) [21:45:15] ;) [21:45:17] even [21:45:18] OK. So I can stop WHOISing myself every hour :P [21:45:49] 03siebrand * r40112 10/trunk/extensions/ (84 files in 71 dirs): Localisation updates for extension messages from Betawiki (2008-08-27 23:24 CEST) [21:45:50] RoanKattouw: it probably only applies when you log out and log in again anyway. 
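And a rough sketch of automating the re-download via the Special:FilePath redirect Duesentrieb points to. The input file missing.txt (one file name per line, for example the output of the query above) and the images/ target directory are assumptions for the example:

    <?php
    $names = file( 'missing.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES );
    foreach ( $names as $name ) {
        $url  = 'http://en.wikibooks.org/wiki/Special:FilePath/' . rawurlencode( $name );
        $data = file_get_contents( $url );   // the http wrapper follows the redirect
        if ( $data === false ) {
            echo "failed: $name\n";
            continue;
        }
        file_put_contents( 'images/' . $name, $data );
    }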
[21:45:51] make a script and when the cloak hits, make it send you an email :P [21:45:52] 03dale * r40111 10/trunk/extensions/MetavidWiki/skins/mv_embed/ (8 files in 3 dirs): [21:45:52] updates to mv_embed [21:45:52] better support for firefox native video [21:45:52] bug fixes [21:46:15] alnokta: I don't intend to learn IRC bot scripting just for that [21:46:25] I'll just whois myself when I login every day and I'll see [21:46:39] :) [21:47:54] for an rc->irc gateway, are those 3 lines enough? [21:48:04] and netcat -ulp should echo? [21:50:18] gn8 [22:10:21] 03(NEW) "Invert selection"-checkbox doesn't work when "(Main)" is selected - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15342 minor; normal; MediaWiki: Special pages; (jobjorn) [22:10:54] anyone knows something like a linklist extension? Maybe with voting possibility? [22:11:11] Phlogi: What do you mean with linklist? [22:11:31] A list with external links, that can be rated and assigned to categories [22:12:01] OK. No, I don't know any extensions that implement that. I do know how to get a list of all external links, though [22:12:05] 03(NEW) Subpage namespace activation request in french wikipedia - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=15343 15enhancement; normal; Wikimedia: Site requests; (pixeltoo) [22:14:04] n8 [22:18:24] 03(mod) Subpage activation for Help: namespace in french wikipedia - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15343 summary (10alex.emsenhuber) [22:23:16] 03ialex * r40113 10/trunk/phase3/ (RELEASE-NOTES includes/specials/SpecialWatchlist.php): (bug 15342) "Invert" checkbox now works correctly when selecting main namespace in Special:Watchlist [22:23:49] 03(FIXED) "Invert selection"-checkbox doesn't work when "(Main)" is selected - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15342 +comment (10alex.emsenhuber) [22:37:36] 1hello [22:38:01] i would like to use templates to rearrange this kind of information, but preserve the monospacing and easily add multiline info, is it possible? http://rodpedia.poromenos.org/wiki/The_large_steel_shield [22:38:51] can i have multiple lines in templates? [22:39:05] Stavros: Sure [22:39:42] Basically, everything you can put in a regular page can also be put in a template [22:40:06] !template | Stavros [22:40:06] --mwbot-- Stavros: For more information about templates, see . The most common issues with templates copied from Wikipedia can be fixed by installing ParserFunctions and enabling HTML Tidy . [22:40:27] oh that's a relief, i thought you couldn't do things like multiline or monospace (i tried and failed), thanks [22:40:53] hmm, those don't mention multiline anywhere... [22:41:41] right now we have this template, does it look sane? http://rodpedia.poromenos.org/wiki/Template:Item [22:41:43] Well apparently you're the first one who assumes templates can't be multiline then ;) [22:42:11] hmm [22:42:29] Stavros: Lines 7 and 8 don't look sane [22:42:42] if should probably be #if: [22:43:06] And {{ #if: {{{id1|}}} | {{{id1}}} is kind of redundant, {{{id1|}}} alone will do [22:43:11] hmm, it works though [22:43:12] oh [22:43:13] thanks [22:43:23] Line 24 is like 7 and 8 [22:43:47] i don't know much about that, but it has the two pipes, which is different from the other lines [22:44:02] right now, for multiple lines, we do id1, id2, etc [22:44:23] No need, you can merge those into one id parameter [22:44:28] What's {{{category|[[Category:{{{area}}}]]}}} supposed to do? [22:44:48] Or the two lines below that, for that matter? 
[22:45:08] Is it supposed to output [[Category:{{{area}}}]] if category is set? Because it won't [22:45:24] hmm [22:45:30] it's supposed to always output category:Area [22:45:38] It won't [22:45:51] why not? [22:46:10] unfortunately someone else wrote those, so i'm not quite clear on the details :/ [22:46:15] Also, the problem in the docs you mention about using | in categories (which will obviously fail) can be solved by creating [[Template:!]] with just | in it, and replacing troublesome | characters with {{!}} [22:46:35] Because {{{foo|bar}}} will output the value of foo if it's set, or "bar" if foo isn't set [22:46:58] So {{{category|[[Category:{{{area}}}]]}}} will output the value of category, OR [[Category:{{{area}}}]] if category isn't set [22:48:06] hmm [22:48:16] i'll have to talk with the person who made this [22:48:22] which problem did you mean in the docs? [22:48:40] They say you can't use [[Category:Foo|Bar]] in the categories parameter [22:48:51] 03dale * r40114 10/trunk/extensions/MetavidWiki/skins/mv_embed/ (4 files in 3 dirs): improved playhead support for firefox native video [22:48:57] While you can, as long as you replace | with {{!}} and create Template:! (which you probably already have) [22:49:01] oh, that's because it's determined with a parameter now [22:49:07] it's not that you can't, but you shouldnt [22:49:09] Yup, you've got it [22:49:24] it's improper wording if it says you can't, let me find it [22:49:44] No, it says don't [22:50:04] ah, that's correct then [22:50:18] most people did |item name, the [22:50:20] for every category [22:50:23] which they now shouldn't [22:50:44] by the way, what's the best way to have one id parameter that prints its contents monospaced? [22:50:45] Because of DEFAULTSORT [22:50:48] yes [22:51:07] What's monospaced again? [22:51:12] [22:51:28] {{{foo}}} ? :P [22:51:31] err [22:51:38] sorry, not :/ [22:51:44] the thing where you put a space before each line [22:51:47] to get the code display [22:51:48] Yeah that [22:51:53] same thing? [22:52:01] No [22:52:31]
<pre>{{{foo}}}</pre>
would probably work. Not 100% sure, but give it a try [22:52:37] hmm, sec [22:53:33] looks like i messed the "if" up, but pre works [22:53:40] Ok [22:53:42] i want to print that whole thing only if id is defined [22:53:48] but i'm not sure how to close the if [22:53:59] this language is a bit confusing :p [22:54:07] So should use {{#if:condition|ifempty|ifnotempty}} [22:54:29] And {{{foo|}}} is empty if foo isn't set (or is set to empty) [22:54:39] aha great, it works now [22:54:44] thanks [22:55:19] Still, you should use {{#if: instead of {{if [22:56:10] i will tell the person who wrote it and he'll change it, thanks [22:56:41] ah, the multiline thing is very not working :/ [22:58:01] do i need to put it all in one line and use
<br />? that's not very good :/ [23:02:17] Stavros: No, don't use <br />, use real line breaks within the <pre>
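To make the default-value behaviour described above concrete, here is a minimal sketch (Item, category and area are the names from the template under discussion; the surrounding markup is assumed, not quoted from the real template):

{{{category|[[Category:{{{area}}}]]}}}

A call like {{Item|area=Shields}} leaves category unset, so the fallback fires and the page lands in [[Category:Shields]]; a call that passes category= overrides the fallback entirely. To pass a sort key inside such a parameter, write [[Category:Shields{{!}}Steel shield]] after creating [[Template:!]] containing nothing but a single pipe, as suggested above.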
[23:02:42] 	RoanKattouw: it doesn't accept the parameter
[23:02:49] 	?
[23:02:55] 	look at this http://rodpedia.poromenos.org/index.php?title=Template:Item/doc&action=edit&preload=Template:Documentation/preload
[23:03:02] 	it doesn't render
[23:03:12] 	actually this http://rodpedia.poromenos.org/wiki/Template:Item/doc
[23:03:51] 	Argh
[23:04:18] 	Use {{{id|}}} instead of {{{id}}}?
[23:04:23] 	And finally replace {{if by {{#if:
[23:04:36] 	Not that it matters very much, but it's more consistent
[23:04:48] 	i told you i can't replace that until i talk to the guy who did it, he might have a good reason
[23:04:54] 	Right
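For reference, the #if parser function takes the non-empty branch first, so the construct RoanKattouw is recommending would look roughly like this (id is the parameter name from the template above; the wording of the branches is made up for illustration):

{{#if: {{{id|}}}
| output used when id was supplied, e.g. a line showing {{{id}}}
| output used when id was left empty or omitted
}}

The trailing pipe in {{{id|}}} is what makes the test come out empty when a caller omits the parameter; plain {{{id}}} would expand to the literal text {{{id}}} in that case, so the test would always look non-empty (and, as noted earlier, when the non-empty branch is just the value itself, {{{id|}}} on its own already does the same job).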
[23:05:30] 	{{{id}}} doesn't work either :/
[23:06:00] 	err, id|
[23:06:16] 	Hmm apparently <pre> is so strict that even {{{id|}}} isn't expanded
[23:06:38] 	Lemme look some stuff up
[23:06:46] 	oh :/
[23:06:49] 	hmm
[23:07:11] 	i probably need to have <pre> within the parameter then, still not very good
[23:08:56] 	Hmm leading spaces look like your best bet
[23:09:08] 	So you should tell users to input stuff with leading spaces
[23:09:17] 	You probably want to hard-code the leading space for the first line
[23:09:23] 	hmm, let me try that
[23:11:55] 	you are quite right, thank you
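Under the leading-space approach that was just agreed on, the relevant part of the template body would be something like the following untested sketch (the single space at the start of the line is literal and is what triggers the monospaced, preformatted rendering):

 {{{id|}}}

Callers then supply any additional lines inside the id value with their own leading spaces, as suggested above. One thing to watch: wrapping that line in {{#if: {{{id|}}} | ... }} to hide it when id is empty needs care, because #if trims leading and trailing whitespace from its branches and can eat the very space this relies on.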
[23:21:18] 	well, the ifs were for a reason, apparently header doesn't work with normal ones
[23:21:35] 	03(mod) Add NUMBEROFROLLBACKERS magic word - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13471  +comment (10soxred93)
[23:21:56] 	Yay, finally submitted a working patch
[23:26:50] 	can i see a list of pages that use a given template?
[23:26:58] 	"what links here"?
[23:27:14] 	Yes, just use whatlinkshere for that
[23:27:18] 	yes
[23:27:19] 	You'll see (inclusion)
[23:27:24] 	you can hide links to show only transclusions
[23:27:35] 	(In newer versions of MediaWiki.)
[23:27:39] 	ah, thanks
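For the record, in versions that have that filter the toggles are also plain URL parameters, so a link of the form

http://rodpedia.poromenos.org/index.php?title=Special:WhatLinksHere/Template:Item&hidelinks=1&hideredirs=1

should list only the transclusions (the hidelinks/hideredirs parameter names are from later MediaWiki releases and are worth double-checking against the installed version).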
[23:27:43] 	03(mod) Add NUMBEROFROLLBACKERS magic word - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13471   +need-review (10soxred93)
[23:30:55] 	guyvdb: ping?
[23:31:12] 	siebrand: pong
[23:31:26] 	if i convince a sysadmin to switch to MW, do i get a teeshirt?
[23:31:35] 	so
[23:31:54] 	best to start by distinguishing things
[23:31:56] 	wikipedia is the online encyclopedia
[23:32:02] 	guyvdb: compare http://translatewiki.net/w/i.php?title=Translating:Intro&diff=677804&oldid=651056&htmldiff=1 and http://translatewiki.net/w/i.php?title=Translating:Intro&diff=677804&oldid=651056&htmldiff=0 . Lots of stuff missing...
[23:32:02] 	mediawiki is the software that runs it
[23:32:20] 	mediawiki is an open source project, free to download
[23:32:33] 	and you can do pretty much anything you like with it
[23:32:38] 	So, what do I need to do, and can I convert the previously existing wiki to a mediawiki?
[23:32:42] 	siebrand: Whoa, it just drops everything past the first paragraph
[23:32:49] 	much more powerful, flexible, fully-featured than other wiki engines
[23:32:51] 	Xiong: we do distribute free bzip2-ed cookies. Will that suffice?
[23:32:52] 	antwaungomez: Depends what kind of wiki it is
[23:32:58] 	um
[23:32:59] 	wikispaces
[23:33:00] *siebrand 	nods at RoanKattouw.
[23:33:12] 	siebrand: can i hang them from my rearview mirror?
[23:33:31] 	antwaungomez: how much content does the existing wiki have?
[23:33:34] 	Xiong: you have to print the screendump.
[23:33:48] 	Xiong:  A very many posts, pages, and discussions.
[23:34:08] 	siebrand: must be a bug, yes, will look in to it
[23:34:10] 	is this mostly just long runs of text?
[23:34:27] 	or a mature site with carefully formatted text and images together?
[23:34:30] 	guyvdb: sure. I'll remove the span for a second and see if that's it.
[23:34:37] 	you said the site was new
[23:34:49] 	Xiong:  Lots of carefully formatted bullets and whatnot, not a lot of images
[23:34:56] 	hm
[23:35:27] 	you might get acceptable results on the first pass by copying the source from each page and putting it into a text editor
[23:35:29] 	guyvdb: oh yes, that is definitely it.
[23:35:29] 	http://translatewiki.net/w/i.php?title=Translating:Intro&diff=679492&oldid=677811&htmldiff=1
[23:35:33] 	do some global search and replace
[23:35:35] 	i see
[23:35:42] 	the span doesn't get through the XML parser then
[23:35:45] 	strange
[23:36:12] 	for instance, if wikispaces uses [link], change that to [[link]] using global find and replace
[23:36:14] 	then paste
[23:36:35] 	hi, no one is working on imagetagging extension?
[23:36:36] 	Where do I get hosting?
[23:37:14] 	it is possible to set up MW hosting on a free account but not recommended
[23:37:21] 	shell out $5 a month
[23:37:25] 	i've done it both ways
[23:37:35] 	this is a freehosted install:
[23:37:42] 	guyvdb: it's invalid html. The span is inside <p>
and not closed there, but only at the end of the last paragraph. [23:37:42] http://arms.x10hosting.com/ [23:37:48] this is a paid hosting install: [23:37:54] http://www.beyondeuclid.com/ [23:37:54] Is there an available person with commit access that can review a patch? [23:38:54] still vdiff behaviour is odd. [23:38:57] (zzz...) [23:39:14] siebrand: the xml parser probably discards it then? [23:39:18] TheLetterX: Me [23:39:22] Which patch [23:39:27] siebrand: that's a "no fix"? [23:41:09] guyvdb: dunno. Depends on what you think vdiff should diff: what it can process, or what the wiki engine renders. It does the former now. The latter seems more intiutive, as not everyone will start analysing html source to resolve it. This would get bugzillas over and over. [23:41:37] guyvdb: speaking of bugzilla, have you already created an account? Then I can add the vdiff component and you would be informed of new issues. [23:41:52] (better even: they would be assigned to you automatically) [23:42:36] guyvdb: Betawiki is running tidy. I woudn't expect tidy to let this through, btw. [23:42:43] antwaungomez: looking over wikispaces, i'd say you'd be better off to copy the rendered pages [23:43:03] siebrand: the xml parser does have a handler for parser errors, but even then, I'd have to parse it myself. That would be so inefficient it would take 10 or 20 seconds for realistic pages [23:43:07] it's very easy to add simple formatting, like bold text and bullet points [23:43:45] siebrand: even now, when using the parser module, the handlers are the most expensive functions [23:43:48] TheLetterX: Your patch seems to be out of date. Please provide a patch against SVN HEAD [23:44:00] I just barely generated it [23:44:08] Oh. Wait [23:44:32] the big payoff with MW is that if you use it to support latex, you can produce beautiful, arbitrarily complex formulas [23:44:36] see this: http://www.beyondeuclid.com/wiki/Circumscribed_Trapezoid [23:44:44] that's a simple example [23:45:07] guyvdb: otoh, if it's not XHTML 1.0 Transitional, what can you do... [23:45:16] notice the correctly typeset subscripts [23:45:28] guyvdb: that page had 4 errors. With my later fix, it had none, and vdiff is working ok. [23:46:21] siebrand: bugzilla account is guyvdb@gmail.com [23:48:58] The edit screen looks very, very goofy with the random
[23:49:55] TheLetterX: Your patch somehow has tabs replaced by spaces. Please generate a proper patch that doesn't have this problem [23:50:01] ? [23:50:13] That's not normal [23:50:26] No, it isn't. But it did happen, which is why it won't apply [23:54:19] Xiong: Thank you very much. So you recommend I just render my images in Latex and then upload them? [23:56:13] What's the code for generating a patchfile? [23:57:28] TheLetterX, "svn diff > /tmp/diff" [23:57:39] Hmm...I did that. [23:58:06] 03dale * r40115 10/trunk/extensions/MetavidWiki/ (5 files in 4 dirs): some updates for compatibility with new special pages [23:58:07] You didn't try copying off the terminal window, right? [23:58:10] That converts tabs to spaces. [23:58:16] That's what went wrong [23:58:22] That's why it was having problems [23:58:31] Simetrical: Well since he did convert tabs to spaces and dropped the last newline, I suspect he did [23:58:38] Could also be that I used < instead of > [23:58:53] Still want the patch? [23:59:04] No, I'm half-done hand-applying it [23:59:52] 03(mod) Add NUMBEROFROLLBACKERS magic word - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13471 +comment (10soxred93) [23:59:59] Oh well, there it is