[00:14:52] Firefox 3.5 forever!
[00:28:45] I miss the sensible version numbers
[00:34:28] me too
[00:34:39] I don't want my software to be older than me
[00:34:53] heck at this rate, soon firefox will be "older" than my parents
[00:35:04] heh
[02:44:14] is it intentional for all cells (on the same text line) following a header cell to become header cells as well?
[02:44:19] for example...
[02:44:38] ! header cell || also a header cell
[02:44:38] | also a header cell: Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[02:45:40] Gravis: I think so...
[02:45:50] but I don't really know
[02:47:20] i can't find any instances of it on the Help:Tables page on mw or wikipedia.
[02:47:54] thus why i ask
[02:48:18] bawolff: do you know who would know?
[02:48:36] jackmcbarn: you like wiki table syntax right?
[02:48:47] bawolff: i know it, but i don't like it
[02:49:05] hmm, does anyone like it?
[02:49:10] i doubt it
[02:49:13] doubt it
[02:49:15] lol
[02:50:05] Gravis: wiki table syntax is so convoluted, and so intertwined with the parser, I feel like its intended behaviour is simply whatever it does at this point, because nobody would ever want to change it ;)
[02:50:36] Maybe some of the parsoid people would be able to better talk about what the "intention" is
[02:50:48] bawolff: that example does sound like a bug to me
[02:51:02] but the intended behavior probably isn't what you want either
[02:51:52] !! isn't special on lines that start with a |, so i don't think || should be special on lines that start with an !
[02:51:52] , so i don't think || should be special on lines that start with an !: bla
[02:52:08] wtf @ wm-bot
[02:52:18] jackmcbarn: you started a line with !
[02:53:26] !!
[02:53:26] bla
[02:53:33] !botbrain
[02:53:33] Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[02:54:04] yeah, i'm writing my own mw DB compatible server and i suddenly found some of my tables rendered weirdly
[02:55:01] to my surprise the wikicode was representative of what was on the screen
[02:56:16] oh, i just understood how wm-bot gave that response (took me a while)
[02:57:31] bawolff: we normally make decisions on whether we can fix something based on how much existing content it would affect
[02:58:15] typically the answer is "can't change it", but sometimes (like some stuff around definition lists) it is actually very rare and fixing it up in wikitext is doable
[02:58:45] !
[02:58:45] Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[03:01:56] gwicke: i don't think this would affect much content but if it does, it can be corrected programmatically
[03:03:41] Gravis: you should grep
[03:03:53] we were often surprised about the results
[03:03:56] * jackmcbarn digs into doTableStuff
[03:04:21] gwicke: grep what, all wikipedia articles?
[03:04:36] a dump or two, yes
[03:04:49] we normally use https://github.com/wikimedia/parsoid/blob/master/tests/dumpGrepper.js for that purpose
[03:05:02] takes about 20 minutes for enwiki
[03:05:49] eewwww
[03:05:50] if ( $first_character === '!' ) { $line = str_replace( '!!', '||', $line ); }
[03:06:23] ;)
[03:06:24] mmm buggy
[03:06:42] Gravis: if you have a pattern, then I could also run it for you
[03:07:00] gwicke: ^!.*\|\|
[03:07:20] ^that
[03:07:42] hmm, I thought you were looking for !! ?
[03:07:49] nope
[03:08:01] I think the pattern you are looking for is pretty common
[03:08:04] but let me run it
[03:10:51] well the options here are to either document it or programmatically update pages that contain the pattern. i prefer correctness over "i cant be bothered to fix that"
[03:11:32] plus i don't want to change my code :P
[03:11:40] Gravis: if it can be fixed without major impact on existing content (with a bot for example), then yes that's a great thing to do
[03:11:59] keep in mind though that you might have to fix it in the entire history
[03:12:06] which can be expensive
[03:12:34] (if you don't, then old content will all of a sudden be broken)
[03:12:48] gwicke: why do you have to fix the history? why not just make a new entry with the note "bug fix" or something
[03:13:39] Gravis: we have a lot of content from all the way back to 2001
[03:14:00] people still view those old revisions for various reasons, be it research or curiosity
[03:14:27] if those old revisions render significantly differently from the way they should, then that's a problem
[03:14:39] in this case it sounds like the table would no longer render as a table
[03:15:08] no, in this case header cells may look like regular cells
[03:15:24] I'm running into some issue with libxml
[03:18:43] I'm afraid my grepper is in repair & I can't quickly run this for you
[03:20:46] well... how does one get a dump of wikipedia?
[03:21:02] !dumps
[03:21:02] For information on how to get dumps from Wikimedia Wikis, see http://meta.wikimedia.org/wiki/Data_dumps . For a how-to on importing dumps, see https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps .
[03:21:40] But keep in mind dumps are huge, actually doing something useful with it is not a trivial undertaking
[03:22:41] there are several parser tests covering this pattern
[03:23:14] it's not so bad really
[03:23:46] if it wasn't for annoying xml entities it would be fairly straightforward to use something like egrep
[03:25:13] why xml, why?
[03:25:48] AWB has a facility to search through a db dump
[03:26:39] AutoWikiBrowser?
[03:26:55] jackmcbarn: got a link?
[03:27:11] gwicke: http://enwp.org/WP:AWB
[03:27:53] thanks!
[03:28:56] sounds like it's windows-only
[03:29:14] not good for nix people like me :(
[03:29:48] but this pattern should be queryable via sql
[03:30:22] Gravis: it is, but there's no public access to enwiki's text table
[03:30:41] people would also not appreciate hogging the db for a long time
[03:30:52] it does work on non-Windows, see https://en.wikipedia.org/wiki/Wikipedia_talk:AutoWikiBrowser/Mono_and_Wine
[03:31:06] ugh... mono
[03:31:13] Wine is the preferred option
[03:32:02] * gwicke tries to fix the issue with libxml & resolves to publish the dumpgrepper as a separate utility
[03:32:23] yeah... but i have no idea which file to get.
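A minimal sketch of the kind of dump grep gwicke describes above, assuming the pages-articles XML dump is decompressed onto stdin (for example via bzcat). The script name is hypothetical and this is not the actual dumpGrepper/dumpgrepper code; it only approximates the `\n![^\r\n\|\-]+\|\|` pattern line by line:

```php
<?php
// grep-header-cells.php -- a hypothetical helper, not the real dumpgrepper.
// Usage: bzcat enwiki-*-pages-articles.xml.bz2 | php grep-header-cells.php
//
// Counts wikitext lines that start with '!' (a header cell row) and also
// contain '||', the pattern discussed above. Wikitext newlines are real
// newlines inside the dump's <text> elements, so a line-based scan is a
// reasonable approximation, though '<', '>' and '&' appear as XML entities.
$pattern = '/^![^\r\n|\-]+\|\|/';
$matches = 0;
$lines = 0;
while ( ( $line = fgets( STDIN ) ) !== false ) {
    $lines++;
    if ( preg_match( $pattern, $line ) ) {
        $matches++;
    }
}
fwrite( STDERR, "$matches matching lines out of $lines\n" );
```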
:) http://dumps.wikimedia.org/enwiki/20141208/
[03:33:11] Gravis: https://dumps.wikimedia.org/enwiki/20141106/enwiki-20141106-pages-meta-current.xml.bz2
[03:33:21] beware, that's 20GB compressed and a lot bigger uncompressed
[03:33:57] http://dumps.wikimedia.org/enwiki/20141208/enwiki-20141208-pages-articles.xml.bz2 should work too
[03:34:04] only 10.7G
[03:34:18] depends on how much you care about meta ;)
[03:34:48] *sigh* why can't i just get the latest raw wikicode for pages? :((((
[03:35:05] that's what that is
[03:35:36] ok... now why is my drive almost out of space? xD
[03:36:16] Gravis: you should tell those editors to stop writing all these articles
[03:37:00] gwicke: i'll be sure to request they put that on the next donation banner
[03:37:20] "we need your money... and stop making changes, you bastards!"
[03:39:01] ;)
[03:46:55] got the grepper running again, grepping now
[03:49:25] thanks
[04:02:12] I somehow have trouble narrowing the pattern down to match ! only
[04:03:59] \n![^\r\n\\|\-]+\|\| is what I'm currently trying
[04:05:34] gwicke: that seems fine. what's it doing wrong exactly?
[04:08:22] nm, it works now
[04:08:31] better with the -m flag
[04:08:39] there are many matches
[04:09:00] let me run it to completion to get you a count
[04:09:07] but my guess would be thousands
[04:11:14] in the first 1G of the dump there are already 64904 matches
[04:11:40] jackmcbarn, Gravis ^^
[04:11:56] gwicke: can you link me to one or two so i can make sure they're not FPs?
[04:12:39] https://gist.github.com/gwicke/d69956fec4fe727fe4bd
[04:12:51] eh, I lied
[04:13:03] 580 matches in 64904 revisions
[04:13:05] but still..
[04:13:28] yeah, this isn't easily fixable
[04:13:28] that's likely >10k over a 30G dump
[04:13:59] let me push that script
[04:14:51] jackmcbarn: it's just text replacement.
[04:15:11] Gravis: it's still a big deal
[04:15:45] and very doable
[04:16:52] Gravis: remember, over the entire history
[04:17:08] gwicke: i know
[04:17:49] I think it's much easier to create a cleaned-up version of wikitext at some point, and then use parsoid to serialize to it
[04:18:08] after adding a version field in the db to track which wikitext version each revision follows
[04:18:27] in the magical HTML-primary future that problem also goes away
[04:22:56] https://github.com/gwicke/dumpgrepper
[05:12:28] gwicke: whatever your solution is, as long as it gets fixed for new versions, i'm fine with it.
[05:14:55] gwicke: however, it's worth pointing out that you can have a bot running at low priority that fixes everything over the period of a week without any interruption to operations.
[05:15:34] gwicke: then when it's all done, swap the parser
[07:34:02] Gravis: It's possible to do a big fix-up, but it's a significant effort and risk. All third-party users of MediaWiki will need to do the same. With Parsoid we now have the option of doing such fix-ups on the fly and with much lower costs, which is why I see that route as more promising.
[09:58:23] Hi i am new to mediawiki ! I have a strong foundation in php and javascript. I am loving wikimedia and the work it does. Currently i m starting and have done minor changes in code. I am looking for some projects to work on that is interesting and the organisation needs help in. Is there a place where i can search all projects with their detailed workflow and description?
[09:58:59] !hacker | Abhi__
[09:58:59] Abhi__: http://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
[10:03:24] yeah i have read it. i have fixed few bugs too as said. I went through the translation extension, spelling Api, secure polling, fund raising crm etc. but want to know how they work very quickly. Like i m kind of lazy and want someone to explain me say for some 10 mins and then i will start coding my way through. Thanks!
[10:24:00] Abhi__ : check this, you may find some ideas, contacts here
[10:31:17] grus_ : this link ? How to become mediawiki hacker ?
[10:33:27] not a hacker, but it is a community, they can maybe help you where to continue
[10:33:45] if you could not find enough information on the site suggested by wm-bot
[10:34:40] or if you see some extension which is interesting for you and needs a maintainer you might ask the owner how you can contribute
[10:35:26] hmm. Thanks grus ! I will better go hard and take a look again ! Will read (y)
[10:41:31] it is always nice to have new contributors, good luck with it :-)
[10:47:54] Abhi__ ehh
[10:48:18] it seems I am too sleepy today, I did not copy the link properly :S http://www.mediawiki.org/wiki/Groups/Proposals/MediaWiki_Cooperation
[10:48:21] sorrry
[10:48:51] I referred to this site with my comment ' check this, you may find some ideas, contacts here '
[10:50:09] grus_ haha thanks a lot grus ! i just opened this and loving reading these short descriptions. https://gerrit.wikimedia.org/r/#/admin/projects/ Will finish all links today !
[10:54:17] hello
[10:54:41] i cannot find which template is generating the Filter bar in http://wiki.cyanogenmod.org/w/Devices
[10:54:45] could you please help me?
[10:55:57] I seem to recall looking for this before
[10:56:11] Not attempting to do this on my phone though. Haha sorry
[10:57:09] Reedy: can you check https://gerrit.wikimedia.org/r/#/c/180229/
[10:57:16] whether it was actually deployed
[10:58:23] What did roan do?
[10:58:32] I don't have my laptop atm
[10:58:58] Reedy: oh okay
[10:59:23] Sorry
[11:03:37] Reedy: nothing broken currently, just a risk that things will break
[12:09:16] it was in Common.js
[13:24:16] a peculiar parsing behavior i found was that table openings ("{|") can have whitespace before them but table rows cannot. is this intentional?
[13:31:11] Hi! I'm trying to import a mediawiki dump to mysql with mwdumper but I am getting table 'DBNAME.revision' or 'DBNAME.text' doesn't exist. I figured out a solution but it isn't quite effective: read the mediawiki wiki schema for these tables and then add those tables to the mysql database. Is there an easier way?
[13:41:39] I found tables.sql from the mediawiki installation source so everything is working now. :)
[13:56:32] Putti (y) great
[14:44:05] bd808 the manual clone fixed the problem. thank you!
[15:18:47] hi there
[15:19:21] Hi popopo, what's up?
[15:19:31] hi marktraceur
[15:19:38] well I might have a question
[15:19:42] Might?
[15:19:47] What does it depend on?
[15:21:35] I have been looking through the web to find a nice way to add a banner at the top of the mediawiki
[15:21:50] marktraceur: the easiest way was to modify a skin
[15:22:02] Define "easiest". :)
[15:22:09] and adding on top of the template the banner itself
[15:22:15] popopo: There's also CentralNotice or WikiBanner, I guess.
[15:22:19] !e CentralNotice
[15:22:19] https://www.mediawiki.org/wiki/Extension:CentralNotice
[15:22:21] !e WikiBanner
[15:22:22] https://www.mediawiki.org/wiki/Extension:WikiBanner
[15:22:28] I tried wikibanner but it failed
[15:22:38] Oh?
[15:22:47] I got a big "type undefined" and couldn't resolve it
[15:23:01] Aw. OK
[15:23:05] What about CN?
[15:23:11] also, wikibanner is for within a wikipage
[15:23:36] central notice just said it is not supposed to work if your tables are prefixed ...
[15:23:43] which I did when I set it up
[15:23:50] Huh, weird.
[15:24:05] Why would CN be built in a broken way
[15:24:09] Anyway.
[15:24:42] I'll try CN because they said it is not guaranteed to work, it means it could
[15:24:49] popopo: So, is your question about skins?
[15:24:52] Oh, OK
[15:25:00] I'd prefer that, I think, to hacing up a skin.
[15:25:03] hacking*
[15:25:09] marktraceur: according to the design of mediawiki, is modifying the skin to inject the banner the proper way ?
[15:25:23] CN would be cleaner
[15:25:41] Yeah, I think CN is nicer, but if it doesn't work, I guess a skin hack would "work"
[15:25:54] I hesitate to concede that it's a good way to go about this. :)
[15:26:08] marktraceur: understood
[15:26:16] me too, to be honest
[15:26:41] I don't want to do something screwed up because I know it will come back to me one day !
[15:27:05] That's a good attitude :)
[15:27:31] hehehe
[15:28:46] marktraceur: do you know any website using central notice ? I wish to see if the result is what I am expecting
[15:29:03] popopo: Well, WMF wikis use it.
[15:29:07] Let me see...
[15:31:34] marktraceur: thanks :)
[15:33:26] I don't see any obvious campaigns running right now, sorry
[15:34:14] alright no worries
[15:36:56] marktraceur: do you see this website : http://www.pouet.net/ the gfx banner in the middle, this is exactly what I want to do, is CN able to do that ?
[15:37:55] popopo: You mean the site's logo?
[15:38:38] yeah
[15:39:02] marktraceur: yeah sry
[15:39:06] popopo: OK, I misunderstood, that should totally be a skin edit.
[15:39:25] popopo: CN is for temporary banners giving people information, you want to change the layout of the site totally.
[15:41:05] marktraceur: sorry, I have not been clear enough, my bad
[15:41:14] It's OK :)
[15:41:15] marktraceur: so have to tweak the skin
[15:41:20] Yeah
[15:41:29] It'll be a bit more work than CN, sadly, but it'll be more stable
[15:41:42] marktraceur: alright
[15:41:53] marktraceur: thanks a lot for your help
[15:45:37] marktraceur: have to go, talk to you again soon, see ya
[15:45:49] Cheers!
[15:48:12] hey guys
[15:48:44] i am struggling with configuring a mediawiki after moving it to a new server
[15:49:01] at the moment i can access the wiki via browser but the style looks all messed up
[15:49:12] it seems like some kind of plain text version
[15:49:29] could anyone of you give me a hint on how to solve this issue?
[15:52:43] fucatus: Do you get a big yellow warning at the top about no skins being installed?
[15:53:21] at any of the sites? no i have not seen it
[15:53:26] Hm.
[15:53:34] also vector seems to be there
[15:53:59] fucatus: Can you open your browser's network panel and make sure all the requests you're making are being fulfilled?
[15:54:10] It may be that some requests are failing, more details would be helpful.
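For the failing requests marktraceur asks about here (pages loading as unstyled "plain text" usually means the ResourceLoader/CSS requests are returning errors), a common first step is to temporarily turn on MediaWiki's error reporting. A minimal sketch of additions to LocalSettings.php; the log path is a placeholder, and these should be removed again on a production wiki since they expose details to all visitors:

```php
// Temporary debugging additions to LocalSettings.php -- a sketch only.
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );

// Show full exception messages and backtraces instead of a bare
// "500 Internal Server Error" response on failing requests.
$wgShowExceptionDetails = true;
$wgShowDBErrorBacktrace = true;

// Write a detailed debug log somewhere the web server can write to.
$wgDebugLogFile = '/tmp/mw-debug.log';
```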
[15:54:43] let me google this^^
[15:55:18] No problem :)
[15:58:42] it just loads 3 things
[15:58:52] the html of the site and two small pictures
[15:58:58] one being the mediawiki logo
[15:59:10] and they all work
[15:59:35] oh no wait
[15:59:37] sorry
[16:01:51] the site itself as html is listed as 304 not modified
[16:01:59] this is also the case for the mediawiki logo
[16:01:59] Hm, OK
[16:02:20] the other picture doesn't load, 500 internal server error
[16:02:22] fucatus: Can you look at the source of the page and see if there are references to javascript or CSS anywhere? It's weird that nothing else is loading.
[16:03:09] no nothing there
[16:03:19] looks all very... plain^^
[16:06:34] anyone here working on hhvm to upgrade our mediawiki code-base to hhvm from php?
[16:07:03] fucatus: Weird.
[16:07:28] Abhi__: It's pretty much done :) WMF servers now use HHVM for all(?) non-API traffic.
[16:08:16] i have pretty good experience in hhvm. I want to know if the development has started someway?
[16:08:17] wow !
[16:08:19] :)
[16:08:32] marktraceur_
[16:09:34] marktraceur: can u link me to some ongoing projects ?
[16:13:34] well thanks anyway marktraceur
[16:15:08] Abhi__: Uhh, well, it's all happening in the core code
[16:15:12] !hhvm
[16:15:17] ...no?
[16:15:19] !hiphop
[16:15:23] !git
[16:15:23] MediaWiki development is using git, a distributed source control manager, starting on March 21st, 2012; details: https://www.mediawiki.org/wiki/Git_happens Instructions for using it: https://www.mediawiki.org/wiki/Git/Workflow To get an account: https://www.mediawiki.org/wiki/Project:Labsconsole_accounts
[16:15:25] Sure.
[16:17:49] <^d> !hhvm is HHVM is a thing. It's like PHP but faster. MediaWiki supports HHVM.
[16:17:50] Key was added
[16:18:10] ty ^d
[16:18:16] <^d> anytime :)
[17:59:59] superm401, could you please take a look at "consistent preferences icon" in Echo: https://gerrit.wikimedia.org/r/#/c/177269/?
[18:00:03] https://gerrit.wikimedia.org/r/#/c/177269/
[19:14:38] marcinlawnik_, yep, thanks for following up.
[19:57:20] Wow, these new language settings in the sidebar are superbad. Such usability makes me angry. 5 trials and I couldn't manage to display the links. Thanks for nothing.
[21:23:06] Hello, can anyone tell me what the 'file' attribute should be in the request to the upload module of the api while trying to upload an image file?
[21:23:28] as here
[21:23:29] http://en.wikipedia.org/w/api.php?action=upload&filename=Test.txt&file=file_contents_here&token=+\
[21:24:24] codezee: file contents
[21:24:48] the contents of the image you're trying to upload, basically
[21:25:07] so basically that means the binary data of the image I guess?
[21:25:22] yes
[21:28:21] also, can the stashing of uploads be done only through the api, or does the upload interface have some way to do that?
[21:28:57] Hello again
[21:29:05] if I have a template with a variable, let's say {{{ 1 }}}
[21:29:19] I know I can pass text there, even some simple html tags (for example: test)
[21:29:39] but I've noticed that passing breaks the template - the variable doesn't get resolved to anything
[21:29:43] am I doing something wrong?
[21:29:50] codezee: Special:Upload can do that only if it encounters a conflicting file (when it gives the option to proceed anyway or discard, the upload has been stashed)
[21:30:32] Vulpix: alright, thanks!
[21:30:39] samu: the equal sign in class="xxx" is confusing the parser. You should pass it as {{template|1=
ah. okay, thank you
[21:31:03] samu: the text preceding the equal sign is being interpreted as a variable name
[21:31:16] okay, I get it. Thank you Vulpix ;)
[21:31:21] it works fine ;)
[21:40:02] I've been trying to unsubscribe myself from the mediawiki-announce list now several times, but I'm still getting announce mails :-/ The mailman says "a confirmation message has been sent", but I never get such a message. who can I contact to stop that list spamming me with news I don't want to know?
[21:43:55] gr8: Have you checked to see if your spam filter is trapping the unsubscribe confirmation?
[21:45:00] bd808: I don't have a spam filter, so yes
[21:46:21] gr8: You could try sending an email to mediawiki-announce-owner@lists.wikimedia.org explaining your trouble and asking to be removed manually.
[21:46:52] Other than that... no ideas. The mailman interface is at https://lists.wikimedia.org/mailman/listinfo/mediawiki-announce
[21:47:17] ok thanks
[21:49:44] most mails come from a guy called Markus Glaser - is he one of the owners? I'd like to not post to a list for this
[21:50:15] i doubt it
[21:50:19] he's one of the release managers
[21:50:29] I think he might just be an authorised announcer...
[21:51:03] ok
[21:51:45] The administrator is Tim Starling. he should be able to manually remove you
[21:52:10] note that mediawiki-announce-owner@ is not the same email address as mediawiki-announce@
[21:52:22] the first one won't go to the public list
[21:52:27] Yeah, you won't be allowed to post to mediawiki-announce :)
[21:53:26] gr8, why do you no longer wish to receive announcements?
[21:54:23] that question m( why don't you want to get spam? don't you think it's interesting?
[21:54:36] because I don't want, that's why ;)
[22:00:08] gr8, nobody else considers it spam
[22:00:49] Krenair: yes I know. the point is just, I'm not interested in it any more, therefore it became annoying
[22:02:08] makes sense?
[22:03:23] I think there's no need to have a valid reason to unsubscribe from a mailing list. Everyone is free to do so without the need of a justification
[22:07:00] <^d> The mail subscriptions shall continue until morale improves.
[22:12:35] oh, the problem was some weird email forwarding I set up years ago on my shared hosting provider ^^ problem solved :)
[22:13:03] :)
[23:33:41] hi there, I have a little problem if someone can help me maybe
[23:35:09] kipp: go ahead and share your problem, someone might help when they see it
[23:35:24] ok thanks
[23:36:06] When I enable short url with mediawiki everything works fine except for special pages, which don't receive the full url but only point to the page name (example: Special:SpecialPages) and thus don't work
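On codezee's question above about the file parameter of action=upload: the file is sent as multipart/form-data in a POST, not as a query-string value. A rough PHP/cURL sketch; the wiki URL, file path and file name are placeholders, login and error handling are omitted, and it assumes an account that is already allowed to upload:

```php
<?php
// Sketch of an action=upload request. A real client must log in first
// and reuse the same cookie jar for the token and the upload.
$api = 'https://example.org/w/api.php';
$cookies = '/tmp/mw-cookies.txt';

// 1) Fetch a CSRF token (must share cookies with the upload request).
$ch = curl_init( $api . '?action=query&meta=tokens&type=csrf&format=json' );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_COOKIEJAR, $cookies );
curl_setopt( $ch, CURLOPT_COOKIEFILE, $cookies );
$token = json_decode( curl_exec( $ch ), true )['query']['tokens']['csrftoken'];

// 2) POST the file itself as multipart/form-data; 'file' carries the
//    binary contents of the image, not a URL or an inline string.
$ch = curl_init( $api );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_COOKIEJAR, $cookies );
curl_setopt( $ch, CURLOPT_COOKIEFILE, $cookies );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_POSTFIELDS, [
    'action'   => 'upload',
    'format'   => 'json',
    'filename' => 'Test.png',
    'token'    => $token,
    'file'     => new CURLFile( '/path/to/Test.png', 'image/png', 'Test.png' ),
] );
echo curl_exec( $ch ), "\n";
```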
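kipp's short-URL question is left hanging; one common cause is an incomplete rewrite configuration, where article links work but other paths fall through. A minimal LocalSettings.php sketch, assuming Apache with the wiki installed under /w/ and articles served from /wiki/ (the matching web-server rules are on https://www.mediawiki.org/wiki/Manual:Short_URL; adjust paths to the actual install):

```php
// Short URL settings for LocalSettings.php -- a sketch assuming the
// common /w/ + /wiki/ layout.
$wgScriptPath = "/w";
$wgArticlePath = "/wiki/$1";
$wgUsePathInfo = true;

// The matching Apache rule (vhost or .htaccess) would be roughly:
//   RewriteEngine On
//   RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
// Special pages use the same $wgArticlePath, e.g. /wiki/Special:SpecialPages,
// so links like the ones kipp describes should resolve once the rule covers
// every /wiki/... path, not just existing article titles.
```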