[00:00:46] Reedy: hmm, i actually would need the ircecho [00:00:58] http://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/wikibugs/start-wikibugs-bot?revision=67499&view=markup [00:01:48] I think that's in svn somewhere [00:02:13] it's bad, that bunch of tools are not available in repo so ppl can't improve them [00:02:25] https://svn.wikimedia.org/viewvc/mediawiki/tags/REL1_4_9/phase3/irc/mxircecho.py?revision=10836&view=markup -- some IRC tools were bundled with core MW ages ago [00:02:43] For gerrit and nagios you'd need to submit upstream [00:02:55] There's little point us having forks in our repos unless we're actively adding to them [00:05:08] 03(mod) SUL account recently created with no attached or unattached accounts, and undeletable - 10https://bugzilla.wikimedia.org/35792 +comment (10Jeff G.) [00:05:27] ashley: but that's obviously not the ircecho running for wikibugs [00:06:14] If you ask ops someone will know where the ircecho is [00:06:17] it might not be _the_ ircecho, but the ircecho used by wikibugs most likely is based on that, maybe with small or bigger changes [00:06:17] I'm sure it was in svn before [00:07:14] is there a way to delete the userrights log? [00:07:37] Why, pokeswap? [00:07:56] No, nevermind, forget I asked. [00:08:16] lol [00:09:36] because i have SO many usergroups and the log at the bottom is annoying... so? is there? [00:09:50] sudo forget I asked [00:10:38] is there? [00:10:49] yes if you delete them from the database table [00:12:39] * DanielFriesen wonders if he's going to dare starting another potential dead end project [00:12:53] * pokeswap asks how [00:12:53] 03(mod) iOS 4.2 Never loads an article - 10https://bugzilla.wikimedia.org/35771 +comment (10Yuvi Panda) [00:12:56] how? [00:13:26] remove the usergroups, if I understood Reedy correctly. [00:13:38] No, from the log table [00:13:41] !dbtable [00:13:41] http://www.mediawiki.org/wiki/Manual:`e1_table [00:13:45] !db table [00:13:45] See http://www.mediawiki.org/wiki/Manual:table_table [00:13:51] !db [00:13:51] See http://www.mediawiki.org/wiki/Manual:`e1_table [00:13:53] !schema [00:13:53] http://www.mediawiki.org/wiki/Manual:Database_layout [00:14:10] pokeswap: :/ What did I say about playing with the bot? [00:14:34] what query do i run to delete the logs? what table? [00:14:56] 03(mod) iOS 4.2 Settings menu never loads - 10https://bugzilla.wikimedia.org/35772 +comment (10Yuvi Panda) [00:15:34] !dbtable logging [00:15:35] http://www.mediawiki.org/wiki/Manual:logging_table [00:16:33] 03(mod) Gerrit email title truncation over eager - 10https://bugzilla.wikimedia.org/35802 +upstream (10Chad H.) [00:18:52] so, that is the table, but what query do i enter to "takeout the existing logs?" [00:20:32] Have you considered googling the answer to that question? after all, we don't know what database engine you're using so *WE DON'T KNOW*. [00:20:38] Why don't you see if you can work it out yourself? [00:20:57] tried [00:21:10] i am using PHPMYADMIN AKA MySQL [00:21:22] don't shout [00:21:31] [00:21:37] logies, event. [00:21:40] gah. [00:21:45] [00:27:45] 03(mod) iOS 4.2 Settings menu never loads - 10https://bugzilla.wikimedia.org/35772 +comment (10Tomasz Finc) [00:28:05] EH? Ugh... 
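On the earlier question about clearing entries out of the user-rights log: a minimal sketch of the direct-database route Reedy hints at, assuming a MySQL/MariaDB backend, the default table names with no $wgDBprefix, and placeholder database, DB user and wiki user names; editing the database by hand is generally discouraged, so back it up first.

  # sketch only: "wikidb", "wiki" and "Pokeswap" are placeholders
  # user-rights changes sit in the logging table with log_type = 'rights';
  # log_title is the affected user name, log_namespace 2 is the User namespace
  mysql -u wiki -p wikidb -e "
    DELETE FROM logging
     WHERE log_type = 'rights'
       AND log_namespace = 2
       AND log_title = 'Pokeswap';"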
NTP CRITICAL: Offset 30786.32944 secs [00:29:01] i am not [00:31:32] 03(mod) {{PAGENAME}} must not escape special chars, otherwise it makes {{#ifeq:}} unusable - 10https://bugzilla.wikimedia.org/35746 +comment (10Bawolff) [00:31:35] 05THIS IS ME SHOUTING :) [00:37:20] 03(NEW) Feedback submission button not working - 10https://bugzilla.wikimedia.org/35807 normal; MediaWiki extensions: ArticleFeedbackv5; (okeyes) [00:38:20] 03(mod) Feedback submission button not working - 10https://bugzilla.wikimedia.org/35807 +aftv5-2.0 (10Oliver Keyes) [00:42:58] In the abuse filter, is there a way like if someone adds the word "sex" to a page, that they will be blocked? [00:43:11] yes [00:43:38] how? [00:46:25] !e AbuseFilter [00:46:25] https://www.mediawiki.org/wiki/Extension:AbuseFilter [00:49:43] that does not show me what to put in abusefilter to make it so useris blocked if they put the word sex in a page [00:50:56] so, how do i do that? [00:51:36] https://www.mediawiki.org/wiki/Extension:AbuseFilter/RulesFormat [00:51:44] https://www.mediawiki.org/wiki/Extension:AbuseFilter/Actions [00:53:33] ??? [00:54:37] People don't have the time nor the inclination to spoon feed everything [00:57:41] 05I AM VERY CONFUSED WITH THE ABUSE FILTER [00:58:02] * pokeswap goes to abusefilter and punishes it  [01:01:09] * pokeswap than goes to reedy and hands him the broken abusefilter [01:04:39] pokeswap, acting like that in here won't get you very far. If you want to create abuse filters, then learn how to do it yourself. Reedy provided appropriate links for you to learn how to do so. [01:43:58] 03(mod) SUL account recently created with no attached or unattached accounts, and undeletable - 10https://bugzilla.wikimedia.org/35792 +comment (10Simon Walker) [02:02:34] Tch... already loosing motivation this far in [02:06:02] Chocolate. Chocolate will revive the motivation, DanielFriesen. Chocolate solves *all* problems. [02:33:03] 03(mod) extending Narayam Marathi support in wiktionaries across all languages other than Marathi Wiktionary - 10https://bugzilla.wikimedia.org/35790 +comment (10Mahitgar) [02:34:57] 03(NEW) localsettings.php,if insert:$wgExternalLinkTarget = ‘_blank’;some page will bad - 10https://bugzilla.wikimedia.org/35808 normal; MediaWiki: Page editing; (18671917188) [03:01:41] 03(NEW) GENDER magic word doesn't work in Serbian - 10https://bugzilla.wikimedia.org/35809 normal; MediaWiki: Internationalization; (theranchcowboy) [03:13:09] 03(NEW) ! N pages non-patrol-able - 10https://bugzilla.wikimedia.org/35810 normal; MediaWiki: Recent changes; (Technical_13) [03:14:06] 03(mod) ! N pages non-patrol-able - 10https://bugzilla.wikimedia.org/35810 +comment (10ShoeMaker) [03:30:37] for the Vector skin, what css class is it for just the link color of the unselected top nav tabs [03:30:47] Chrome inspector isnt helping much [04:09:50] JRWR, is this what you're looking for: div.vectorTabs li a { color } [04:11:46] !seen bawolf [04:11:46] potter, I don't remember seeing bawolf. [04:11:53] !seen bawolff [04:11:54] bawolff (~bawolff@wikinews/bawolff) was last seen quitting from #mediawiki 2 hours, 41 minutes ago stating (Quit: ChatZilla 0.9.88.1 {[Iceweasel} 3.5.16/20110302220840\]). [04:12:16] !seen ialex [04:12:18] ialex (~IAlex@mediawiki/pdpc.active.ialex) was last seen quitting from #mediawiki 8 days, 6 hours, 38 minutes ago stating (Quit: ialex). [04:28:35] lol... 
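Going back to the AbuseFilter question earlier (block anyone who adds the word "sex" to a page): the rule itself is only a condition written in the syntax described at Extension:AbuseFilter/RulesFormat. A minimal sketch using the standard added_lines and user_groups variables, with the word and the exempt group purely as examples:

  added_lines irlike "\bsex\b" & !("autoconfirmed" in user_groups)

The blocking part is not written into the rule: it is enabled per filter under the filter's actions (see Extension:AbuseFilter/Actions), and the wiki's AbuseFilter configuration has to allow the block consequence at all.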
`git init --bare && git init` creates a valid git repo where the working directory you can commit is another valid git directory [04:54:59] New patchset: Santhosh; "Notify the translators using the talk page." [mediawiki/extensions/TranslationNotifications] (master) - https://gerrit.wikimedia.org/r/4394 [05:05:59] 03(mod) Enable WebFonts extension on mr.wikisource.org - 10https://bugzilla.wikimedia.org/35426 +comment (10Shantanoo) [05:14:47] DanielFriesen: how is that unexpected? [05:15:00] huh? [05:15:07] oh [05:16:35] ^_^ Sometimes a vcs writes in checks to make sure you don't screw up by doing things like creating a vcs inside a vcs [05:16:54] More importantly... [05:17:13] seems contrived! [05:17:27] Why is it that when I search for php frameworks, every single one I find is a shit MVC framework... [05:17:53] anyone know symphony? i'm starting to bang my head against it [05:18:06] I want something to abstract away all the php crap... not force me to write something one way [05:18:37] maybe you're looking for not php? ;) [05:18:54] [10:06] If you ask ops someone will know where the ircecho is [05:20:27] jeremyb: T_T They won't let me use ruby. [05:20:29] my flight less than a week ago was sitting next to a jazz musician. he was familiar with django's music but doesn't play any of it. apparently there's more similarities between the framework and the namesake than i realized. (e.g. DRY) [05:20:37] DanielFriesen: i was thinking python [05:20:46] * DanielFriesen never got into pythong [05:23:05] DanielFriesen: following the lists? apparently C++ is a lang for making UI for mediawiki file uploaders! [05:23:16] * jeremyb is absolutely boggled by that [05:56:59] jeremyb: VB.net! then we can track the ip of users as well >.> [05:57:02] New review: Aaron Schulz; "(no comment)" [mediawiki/core] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/4513 [05:57:05] Change merged: Aaron Schulz; [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4513 [05:57:30] p858snake|l: i totally don't understand. can't we do the same with any HTTP based UI? [05:58:37] I would like to create a page for my company. Now I want to insert company logo in page. How to do it?? [05:58:52] Pramod: on which wiki? [05:58:54] jeremyb: https://www.youtube.com/watch?v=hkDD03yeLnU [05:59:22] wikipedia [05:59:45] p858snake|l: is that from a real episode? [05:59:55] p858snake|l: not a deleted scene or something? [06:00:07] yes its real [06:00:32] haha, it has an FUR! [06:00:41] how to change company logo?? [06:01:01] Pramod: Which wikipedia are you editing? [06:01:12] There are many languages. [06:01:19] english [06:02:03] Pramod: have you read https://en.wikipedia.org/wiki/Wikipedia:Notability and https://en.wikipedia.org/wiki/Wikipedia:Conflict_of_interest ? [06:02:21] Pramod: and https://en.wikipedia.org/wiki/Wikipedia:Notability_%28organizations_and_companies%29 [06:03:10] Pramod: http://en.wikipedia.org/wiki/Help:Visual_file_markup [06:03:38] You might also wish to join the #Wikipedian-en channel, and they can help you with the many issues you are about to face. [06:07:42] chughakshay16: i can't quite remember... are you a GSoC applicant? [06:08:06] jeremyb, yes i am .. :) [06:08:48] ahh, conference extension [06:09:08] jeremyb, yes [06:09:42] chughakshay16: i think the problem is that GSoC takes place in the wrong season. best would be to start ~6-10 weeks after wikimania i think. 
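On the earlier Vector-skin question about the link colour of the unselected tabs: the selector given above can simply be overridden from MediaWiki:Vector.css (or Common.css) on the wiki; the colour values here are only illustrative.

  /* unselected tab links vs. the currently selected tab */
  div.vectorTabs li a { color: #666; }
  div.vectorTabs li.selected a { color: #333; }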
not that it's possible [06:10:29] jeremyb, :), well it was quite hard for me to get in touch with wikimania people for the same reason.. [06:11:24] jeremyb, i mailed them a couple of times but was of no help.. [06:12:10] jeremyb, but later i got in touch with saper, who helped me out set the scope for the proposal [06:12:27] jeremyb, so all in all it turned out okay. [06:13:25] saper is already more than 1 year after his wikimania. so he has some chance of more free time [06:14:24] chughakshay16: so... text in pictures sucks [06:14:27] I'm doing mass-editing of my wiki with the command-line tool curl from Debian. When I access the edit forms with curl, I pass to the webserver all the required "" tags. These are tags like wpStarttime, wpEditToken, and like. I can correcly edit pages this way, under an anonymous user not logged in. But if I try the same when logged in, I get the error: "Sorry! We could not process your [06:14:29] edit due to a loss of session data." [06:14:38] hey davide [06:14:44] i like big posts [06:14:52] and you cannot lie? [06:15:38] jeremyb, u referring to the UI mockups .. ? :) [06:15:46] chughakshay16: i guess [06:16:01] chughakshay16: could you port to ascii art or something else? [06:16:20] chughakshay16: idk what other ppl use. could ask jorm [06:16:29] jeremyb, i would have to draw it all over again.. [06:16:38] well, I'm editing pages on my wiki by means of curl [06:16:43] jeremyb, i used comic sans as my font.. [06:16:54] davide_at_debian: that's not the most verbose msg... [06:17:07] chughakshay16: i don't follow [06:17:08] I can do that, passing all the
to webserver [06:17:42] chughakshay16: "i would have to X all over again" means you shouldn't have done what you did or shouldn't be doing what you're doing [06:17:43] davide_at_debian: have you considered using the API instead? [06:17:46] !api [06:17:46] The MediaWiki API provides direct, high-level access to the data contained in the MediaWiki databases. Client programs should be able to use the API to login, get data, and post changes. Find out more at . For client libraries in various languages, see . [06:18:10] chughakshay16: in this case, putting canonical data in raster images. [06:18:26] thanks. That may have been useful .. If I knew earlier.. [06:18:36] davide_at_debian: what's your preferred language? [06:18:38] if you have los of text, put your images into png and don't compress them [06:18:40] pywikipediabot also helps for automating tasks [06:19:02] davide_at_debian: there's dozens of bot frameworks/bindings written for that API [06:19:03] by the way, I'm nearly finishing writhing "that-so-special-software" that handles this editing fpr me [06:19:13] davide_at_debian: :( [06:19:19] jeremyb, I used an online tool to draw these images.. [06:19:22] AWB as well if you have a windows box and more comfortable on that [06:19:34] right, forgot about AWB [06:19:51] p858snake|l: i imagine he's not more comfortable on windows? [06:19:54] just a wild guess [06:20:24] I can succesfully edit pages by command like: curl --http1.0 -s -L --cookie cookies.txt --cookie-jar cookies.txt -F "wpStarttime=xx" ... http://wiki_place [06:20:29] jeremyb, i didnt get it when you said "could you port to ascii art or something else?" [06:20:36] that works right [06:20:42] that just looks... icky [06:20:50] chughakshay16: https://en.wikipedia.org/wiki/Ascii_art [06:20:55] Skizzerz: :o how could you forget reedy like that, is he not special enough? [06:21:19] davide_at_debian: you need to grab the edit tokens, which is why using the api is a much nicer method than screen scraping [06:21:22] but I get this error if I do that curl as a logged-in user: "Sorry! We could not process your edit due to a loss of session data." [06:21:39] p858snake|l: I already implementet grabbing in software [06:21:44] I have all the tokens in hand [06:21:47] no, I just never used AWB myself so I kinda forgot it exists, versus I did use pywikipedia (although a long while ago) [06:21:57] and I'm able to succesfully edit pages automatically [06:22:13] but I get login troubles, that's all [06:22:32] davide_at_debian: is any of this releasable? might be easier for someone to port it to an API library/framework for you... [06:23:01] seems like too much of a hack to actually be going into a package [06:23:18] so, idk if it's going to be DFSG free ;) [06:23:24] jeremyb: that's absolutely releasable :) but (yes jeremyb) it's mostly a hack [06:24:14] I think the whole troubles lies on mediawiki dirs permissions [06:24:14] davide_at_debian: anyway, obvious solution: tcpdump both curl and a web browser and compare [06:24:31] really?! [06:24:38] does it not work in a web browser? [06:25:00] in web browser it is perfect, both as anonymous editing, and as logged in user [06:25:14] this is it --> "Sorry! We could not process your edit due to a loss of session data." [06:25:21] i don't see where mediawiki dir perms could fit in [06:25:25] I get this when logged-in by curl [06:25:42] jeremyb: /var/php [06:25:52] still don't get it [06:25:53] davide_at_debian: Have you heard about the API? 
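To make that API suggestion concrete, here is a rough sketch of the same login-and-edit flow done against api.php with curl. The wiki URL, account and page are placeholders, jq is assumed for picking values out of the JSON, and the meta=tokens endpoints shown are the ones in current MediaWiki releases (older versions hand tokens out through slightly different query parameters):

  API=https://example.org/w/api.php   # placeholder wiki
  CJ=cookies.txt

  # 1. fetch a login token
  LOGINTOKEN=$(curl -s -c "$CJ" "$API?action=query&meta=tokens&type=login&format=json" |
    jq -r '.query.tokens.logintoken')

  # 2. log in (a bot password is the simplest thing to use from a script)
  curl -s -b "$CJ" -c "$CJ" \
    --data-urlencode "lgname=ExampleBot" \
    --data-urlencode "lgpassword=secret" \
    --data-urlencode "lgtoken=$LOGINTOKEN" \
    "$API?action=login&format=json" > /dev/null

  # 3. fetch a CSRF (edit) token with the logged-in session
  CSRF=$(curl -s -b "$CJ" "$API?action=query&meta=tokens&format=json" |
    jq -r '.query.tokens.csrftoken')

  # 4. make the edit
  curl -s -b "$CJ" \
    --data-urlencode "title=Sandbox" \
    --data-urlencode "text=edited via the API" \
    --data-urlencode "summary=API test" \
    --data-urlencode "token=$CSRF" \
    "$API?action=edit&format=json"

The practical win over scraping index.php is that the tokens come back as machine-readable JSON instead of having to be fished out of hidden form fields.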
[06:25:59] Amgine: we told him [06:26:00] jeremyb: actually that's mostly a webserver side [06:26:39] Amgine: coming to DC? [06:26:50] Amgine: yes of course. I'll try that in case. But, really, that software needs maybe just 1 line to get corrected [06:26:50] Only if I can get scholarship. [06:27:22] davide_at_debian: you have not set up cURL to manage the cookies correctly. Session is passed as a cookie. [06:27:28] I think I'm missing somwthing with curl itself, like wrong cookies or like. [06:27:42] davide_at_debian: well, tell us what the difference is between you and a web browser. or script up a self contained bash test case [06:27:47] I have also written screen scraper editors for mw. [06:27:50] Amgine: probably. I'll show you how curl is run: [06:28:17] for the login phase: [06:28:18] Thanks for the offer, but I've done it once or twice. [06:28:34] I first access the login page to get the "magic" editToken [06:29:05] curl --http1.0 -s -L --cookie cookies.txt --cookie-jar cookies.txt -o tmp mediawiki_page [06:29:24] it saves the page to tmp file (bad, I know) [06:29:40] then I grab the magic token from it [06:30:32] and then: curl --http1.0 -s -L --cookie cookies.txt --cookie-jar cookies.txt --data-urlencode"wpName=${mwiki_user}" --data-url-encode "wpPassword=${mwiki_pass}" [...] mediawiki_page [06:30:50] and that works great [06:31:17] i get succesful logged-in page (page shows me as logged) [06:31:45] you need to get the edit token for each page iirc [06:31:56] so then on I keep curl sticked to those cookies: they contain my login status. [06:32:07] p858snake|l: exactly, I do. [06:32:10] then: [06:32:11] Query: you're using php to manipulate the data? [06:32:23] just bash -_- [06:32:34] davide_at_debian: continue [06:32:39] sure [06:34:21] then I need to download an X page from mediawiki and compare it to some data into a database. This is handled in bash, but hasn't great importance to us now [06:35:01] again, the page I get shows I am logged in as the previous user, so login works, and curls keeps that status. [06:35:10] i hope it's not really bash. (awk or sed or perl something?) [06:35:21] anyway, keep going ;) [06:35:48] jeremyb: yes, bash is only the "container" script fof accessing systenm programs [06:36:37] davide_at_debian: well you could store the whole page html in a bash var and then use ${//}, ${%}, ${#}, etc. [06:36:48] davide_at_debian: but i'm *hoping* you're not that crazy [06:37:22] jeremyb: no, I made it simplier :) no need for super speeds. [06:37:25] [davide_at_debian(i)] then, finally my script wants to edit that X page on mediawiki. I downloads [06:37:48] lolwut? "Windows compatibility: stream_select() used on a pipe returned from proc_open() may cause data loss under Windows 98." [06:37:59] then, finally my script wants to edit that X page on mediawiki. it downloads the page and extracts 4 tokens: [06:38:43] davide_at_debian: btw, are you a DD? having trouble finding a unique match in LDAP. there's 3 davide's and none have davide as a username [06:39:02] wpStarttime wpEdittime wpAutoSummary wpEditToken -- the page is the "Edit" one, accessed from the same X page [06:39:26] jeremyb: (no, i'm not) [06:41:13] when I got these 4 tokens in hand, I point my curl to the editing form's "action=http://...", passing all the needed tokes as usual, beside the cookies. [06:42:20] davide_at_debian: so... 
i'm still thinking "script up a self contained bash script test case" or just show us your code as is [06:42:25] well, If I do that as an anonymous user, avoiding the login phase made by curl esrlier, it works [06:43:04] jeremyb: yes, I'm posting the script on a pastebin. Then I'll highlight the hot sections [06:43:06] IIRC, anon edits do require less CSRF checks? not sure exactly [06:44:05] but as logged in user, I get the famous git message ""Sorry! We could not process your edit due to a loss of session data." [06:44:10] (posting script now..) [06:44:13] been to DebConf? [06:44:32] davide_at_debian: s/git/MW/ ? [06:44:33] jeremyb: no [06:44:48] ok, then i guess i don't know you ;) [06:45:21] no lol, I mean the "piggy message", not to to with "git" itself! [06:49:22] ok, nearly done [06:49:58] never mind me on how stuff got bad inside here! --> pastebin.com/Sz4GvGaE [06:50:27] ahh, wikia [06:50:42] :) [06:50:46] dont cry! [06:51:10] we have a mwiki_login_f() function that performs login on mediawiki [06:51:30] so, have you actually tried it against some other mediawiki to be safe? [06:52:22] not really.. but I must say, my mediawiki works good when I do the stuff myself by browser [06:53:13] what do you mean "my"? this is for wikia, no? [06:53:49] well, it *crawles* pages from wikia, then it patches them agains a database, and puts them on "my mediawiki" [06:54:14] (we at freeciv are passing to new website structure) [06:54:52] why not just do a dump and set up a new mediawiki somewhere? [06:55:07] surely, they'd give a dump if asked [06:55:29] Or it would be easy enough to export all page titles. [06:56:33] yes. I wasen't sure about their permissiveness. But also, I need to convert wikia syntax to match mediawiki's one [06:57:01] anyway, I'll show you the hot parts .. [06:57:56] yep, all before line 195 is about crawling. And that douesn't matter to us [06:58:13] how does the syntax vary? [06:58:40] very little, you can check it from line 120 to 127 :) [06:59:05] (matter is that the script almost works!) [06:59:12] they may not be franklin street statement compliant (i've no idea if they are) but I doubt they'd stand in the way [06:59:36] I think they are commercial, and nothing more [06:59:49] they are more [07:00:30] they do mediawiki development and participate on development mailing lists. i think they contribute back and i think they open source some of their extensions. (partly guesses) [07:01:00] I'll reconsider that way sure, if I can't find that single line that isn't wrking! [07:01:47] yes, here is the part that wants to edit pages on mediawiki: [07:02:16] line 196 - it starts by downloading the page to edit, to extract magic tokens [07:02:28] (and it works) [07:03:04] morning [07:03:11] then, at line 219, it wants even to evict the mediawiki page in favour of its own [07:04:03] curls points to the form's "action=http://...", passing all the tokens, and the previous cookies [07:04:23] (and it works) [07:04:34] (... but only as anonymous :) [07:05:23] if you want to try, I'll gift you the password :) [07:05:46] it already works, all url are absolute - no internal IPs or such [07:07:08] will you be around tomorrow? it's 9am for you (i assume) but 3am here. i need sleep. in the mean time maybe you can be convinced to use API or just get a dump from them [07:07:08] then, after line 233 there is the "image posting" part, where curl is asked to upload images to mediawiki. But that's another story. [07:08:00] lol.. yep! I'll get here this evening too! 
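For reference, the screen-scraping flow being debugged above boils down to something like this self-contained sketch (not davide's actual script): fetch the edit form with the login cookies, pull out the hidden wpEditToken / wpStarttime / wpEdittime / wpAutoSummary fields, and post them back together with the new text. The URL and page title are placeholders, and the tr/grep/sed extraction is deliberately crude and will break if the form markup changes; the hidden-field and button names are the standard MediaWiki edit-form ones:

  W=https://example.org/w/index.php   # placeholder
  CJ=cookies.txt
  TITLE=Sandbox

  # fetch the edit form while logged in (cookies in $CJ)
  FORM=$(curl -s -b "$CJ" -c "$CJ" "$W?title=$TITLE&action=edit")

  # crude helper: value of a hidden field, regardless of attribute order
  gethidden () {
    printf '%s\n' "$FORM" | tr '>' '\n' | grep "name=.$1." |
      sed 's/.*value="\([^"]*\)".*/\1/' | head -n 1
  }

  TOKEN=$(gethidden wpEditToken)
  START=$(gethidden wpStarttime)
  EDITTIME=$(gethidden wpEdittime)
  AUTOSUM=$(gethidden wpAutoSummary)

  # post back to the form's target (index.php?...&action=submit)
  curl -s -b "$CJ" -c "$CJ" \
    -F "wpTextbox1=new page text" \
    -F "wpSummary=scripted edit" \
    -F "wpStarttime=$START" -F "wpEdittime=$EDITTIME" \
    -F "wpAutoSummary=$AUTOSUM" -F "wpEditToken=$TOKEN" \
    -F "wpSave=Save page" \
    "$W?title=$TITLE&action=submit"

The "loss of session data" error above is what MediaWiki reports when the posted wpEditToken does not match the session in the cookie jar, which is why the token has to be re-fetched from the form for the logged-in session rather than assumed to be empty.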
(morning for japanese :) [07:08:48] ewwww, wikia's violating the first rule of URL namespace separation for scripts vs. articles! [07:08:51] ;( [07:09:09] http://freeciv.wikia.com/wiki/Main_Page [07:09:10] http://freeciv.wikia.com/wiki/Index.php [07:09:14] err [07:10:09] ... that might be full of meanings to you! [07:11:31] here you go: http://freeciv.wikia.com/api.php [07:12:49] it's so easy! after I spent 5 days writing this script, I must admit I'm full of joy to undertake another 5 days in learning APIs! [07:13:31] the point is you don't have to learn the api [07:14:01] your choice of python, perl, php, ruby. i'm pretty sure all of those have mediawiki API libraries/frameworks/bindings [07:15:06] ciao a tutti [07:15:23] ciao jeremyb! [07:23:45] 03(NEW) the message moodbar-form-note-dynamic doesn't support PLURAL - 10https://bugzilla.wikimedia.org/35811 normal; MediaWiki extensions: Moodbar; (amir.aharoni) [07:27:43] 03(NEW) The tip of the MoodBar bubble points to the left in RTL environments - 10https://bugzilla.wikimedia.org/35812 minor; MediaWiki extensions: Moodbar; (amir.aharoni) [07:27:46] 03(mod) RTL/bidirectional issues (tracking) - 10https://bugzilla.wikimedia.org/745 (10Amir E. Aharoni) [07:32:33] 03(mod) RTL/bidirectional issues (tracking) - 10https://bugzilla.wikimedia.org/745 (10Amir E. Aharoni) [07:32:33] 03(NEW) The arrow near "What is this?" in the MoodBar bubble points the wrong way in RTL environment. - 10https://bugzilla.wikimedia.org/35813 normal; MediaWiki extensions: Moodbar; (amir.aharoni) [07:33:12] 03(ASSIGNED) The arrow near "What is this?" in the MoodBar bubble points the wrong way in RTL environment. - 10https://bugzilla.wikimedia.org/35813 (10Amir E. Aharoni) [08:08:06] Ahhh... I need a better understanding of process control [08:10:52] in case anyone needs an idea for a gsoc project: creating an anti vandal bot based upon past entries from the database; ie looking at all the editions which were reverted by humans and getting the most added/removed words. if nobody does it, i'll give it a try [08:12:53] doh! [08:13:02] I used 'r' for stderr instead of 'w' [08:13:13] :/ which is what caused the infinite loop of "" freads [08:13:31] Someone needs to build a really nice proc_open abstraction [09:32:38] 03(mod) GENDER magic word doesn't work in Serbian - 10https://bugzilla.wikimedia.org/35809 +comment (10Niklas Laxström) [09:39:41] New review: Nikerabbit; "(no comment)" [mediawiki/extensions/TranslationNotifications] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/4394 [09:53:46] New review: Nikerabbit; "This is not documentation." [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4516 [10:11:51] Reedy: If you were to write a review system for git, what would you name it? [10:18:49] Hmmm... I'll randomly go with Gareth [10:27:28] New review: Hashar; "(no comment)" [mediawiki/extensions/Math] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/4422 [10:31:07] 03(mod) Empty tag gives UNIQ - 10https://bugzilla.wikimedia.org/31824 +comment (10Antoine "hashar" Musso) [10:31:34] New review: Hashar; "(no comment)" [mediawiki/extensions/Math] (master); V: 0 C: -1; - https://gerrit.wikimedia.org/r/4422 [11:12:52] i have some javascript code, which needs severe review and probably will not get into core, it's just some test on a feature. how can i submit it for review without entering the main core? 
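Back on the freeciv.wikia.com migration discussed above: once the api.php endpoint is known, getting the content out without any scraping is mostly a matter of the query module. A sketch, with jq assumed and continuation (apcontinue / gapcontinue) left out, so each call only returns the first batch:

  API=http://freeciv.wikia.com/api.php   # endpoint mentioned above

  # list page titles
  curl -s "$API?action=query&list=allpages&aplimit=500&format=json" |
    jq -r '.query.allpages[].title' > titles.txt

  # or pull the pages themselves as MediaWiki export XML
  curl -s "$API?action=query&generator=allpages&gaplimit=50&export&exportnowrap" \
    -o freeciv-batch1.xml

The export XML can then be fed to Special:Import or importDump.php on the target wiki, which sidesteps the edit-form business entirely; any syntax tweaks can still be applied to the XML or after import.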
[11:13:47] * DanielFriesen wonders if it would be evil of him to take his review system, kill off the database, and instead use git's own plumbing to store all the review system data [11:15:08] *twitch* comments in git objects... lol [11:21:59] anyone? [11:31:00] joancreus: bugzilla? [11:32:22] DanielFriesen: ok [11:43:48] 03(mod) Introduce custom events for MediaWiki's front-end flow - 10https://bugzilla.wikimedia.org/30713 (10joan.creus.c) [12:01:12] to the sleepy jeremyb: Solved. Script finished. It can crawle all the wikia's articles related to topic Freeciv and put them in a mediawiki. [12:04:19] problem was, that when I manually looked at mediawiki's
special codes, I was not logged in. That made the wpEditToken always show empty, and hence I assumed it was permanently empty when passed by curl. Unfortunately that's not the case when a user uses the same
after a login. So, I just needed to add 1 more magic token to my curl. [12:08:27] Is there a way to add javascript and css to the head section? I'm using MediaWiki 1.18.1 [12:19:31] 03(mod) Vector "watch" star icon is low-resolution - 10https://bugzilla.wikimedia.org/35335 +comment (10Lejonel) [12:55:32] Reedy, turns out all of the extensions my $wgFilterCallback removal patch needs changed are not in Git [12:56:48] User59603: user script in js, or php code? [12:58:41] http://pastebin.com/Cmcvq4FX [13:07:11] joancreus: php code [13:08:05] User59603: not an expert, but maybe modifying the skin to your needs? ie: vector [13:08:29] skins/Vector.php [13:08:43] could somebody check if what i'm saying is true? [13:10:57] joancreus: thank you! but what do i need to insert here, to make it work between
and ? [13:12:18] 03(FIXED) API should allow edit using pageid - 10https://bugzilla.wikimedia.org/32492 (10Krenair) [13:13:33] User59603: hmm i don't know. what are you exactly trying to accomplish? [13:19:00] Krenair, what's your problem? [13:19:33] Platonides, ? [13:20:21] " Reedy, turns out all of the extensions my $wgFilterCallback removal patch needs changed are not in Git" [13:21:35] I made a patch a while back which removed the deprecated $wgFilterCallback and a few related things which only it uses [13:21:49] New review: Umherirrender; "Ok, I have to look at all and not at the least console line." [mediawiki/core] (master) C: 0; - https://gerrit.wikimedia.org/r/4503 [13:22:59] It modifies some extensions - but they're in SVN, and I don't have access to that [13:23:46] It's only a single patch so it's kinda pointless requesting SVN access [13:24:05] I can committ it for you [13:25:20] that'd be great [13:25:33] svn part is in that pastebin I posted [13:26:00] gah, be back soon. family wants me to do something [13:29:04] I'm not sure about removing the existing handling [13:29:25] if a wiki were using $wgFilterCallback, it would stop working, which could be quite bad [13:31:20] New review: Umherirrender; "(no comment)" [mediawiki/core] (master) C: 0; - https://gerrit.wikimedia.org/r/4507 [13:32:28] could somebody take a look at https://bugzilla.wikimedia.org/show_bug.cgi?id=30713 ? [13:33:15] joancreus, I'm not a jQuery guy [13:33:27] I prefer to leave it for Krinkle [13:34:42] ok. in comment 3 https://bugzilla.wikimedia.org/show_bug.cgi?id=30713#c3 i have a "solution" (in quotes, not ready for production probably) [13:40:16] Hi room, I've got a quick question : when adding an image to an article (on en wikipedia), how do i reference a file that is stored in another language (serbian - sh) of the wikipedia network? [13:40:27] you can't [13:40:32] quick reply. [13:40:36] :) [13:40:40] so i need to download the file and reupload it? [13:40:47] (it's copyleft) [13:40:49] the file should be stored in commons [13:40:58] yeah, sure. [13:41:02] it can then be used seamlessly on all our wikis [13:41:28] so i should just make a move or something? mark the old one as deprecated once i've updated all usage links? [13:41:39] -usage+used [13:43:47] you wouldn't need to update uses if using the same name [13:47:52] okay...not sure what that means as this image is stored in some personal profile space under a specific user name. [13:48:01] New patchset: Jeroen De Dauw; "work on making menu items configurable" [mediawiki/extensions/EducationProgram] (master) - https://gerrit.wikimedia.org/r/4539 [13:48:30] no, sorry. [13:49:44] 03platonides * 10/trunk/extensions/Configure/scripts/findSettings.php: Double reason to skip $wgFilterCallback now, as it's deprecated. [13:50:05] heh [13:50:07] New review: Jeroen De Dauw; "(no comment)" [mediawiki/extensions/EducationProgram] (master); V: 0 C: 2; - https://gerrit.wikimedia.org/r/4539 [13:52:16] Git workflow question: with SVN I used to be able to commit a bunch of stuff from one machine, and then could just get that and continue working on it from another, even if it was not reviewed yet. Is there a way to do this with gerrit? [13:52:40] JeroenDeDauw: push to another remote repo [13:53:21] Reedy: so you suggest working with another repo and then once the feature is fully complete push a pile of commits to gerrit? 
[13:53:24] You can still commit something, and put in the commit summary NOT READY FOR REVIEW [13:53:32] and then amend it [13:54:12] To be honest, I'm not sure what would be the best way... But that could work [13:54:16] New review: Jeroen De Dauw; "(no comment)" [mediawiki/extensions/EducationProgram] (master); V: 1 C: 2; - https://gerrit.wikimedia.org/r/4539 [13:54:17] could/should [13:55:53] New patchset: Reedy; "Title::moveToInternal doesn't return anything, but it does throw an exception" [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4517 [13:56:03] Reedy: I actually have no problems working with another repo - but I suspect people will go mad when I then push all the commits to gerrit, as they need to be in one patchset(?) I actually don't get how to do this at all without losing all my individual commit messages and stuff [13:56:25] Change merged: Jeroen De Dauw; [mediawiki/extensions/EducationProgram] (master) - https://gerrit.wikimedia.org/r/4539 [13:56:25] You can squash them so it's one commit if you wanted [13:56:38] Though pushing multiple commits isn't a big deal [13:57:11] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4517 [13:57:49] Reedy: every time I've done this so far people got mad at me and said that I should resubmit them as a single thing [13:58:04] lol [13:58:08] Hmm [13:58:14] Well, squashing should do what you want [13:59:08] well, I still think it's better to view unrelated changes as different commits [13:59:09] Y'know, when you think about it, if you say one editor is wiki-stalking you, it is usually likely that it can be anyone [13:59:23] Indeed [13:59:33] (not to the stalking) [13:59:57] ah [14:00:07] New patchset: Platonides; "Add option to rebuild message files on a different folder." [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4540 [14:00:42] Heck, even Reedy here could be guilty of wiki-stalking [14:00:45] ...right? [14:01:23] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4540 [14:01:26] (key word: could) [14:01:26] Is there a way to disable the whole review thing for a certain extension on gerrit? I want to be able to have my commits merged in directly without me having to approve them first. Makes no sense to have review overhead now, since review has not happened for all the other code on svn, and we'll have a full CR of the extension later on. [14:02:10] JeroenDeDauw: without editting the extension's code? [14:02:11] I think it's possible to allow direct push [14:02:54] Er, nvm [14:02:57] LL2|JedIRC: ?? [14:03:07] Platonides: any idea where the button is? I should have the rights to do this if possible I think [14:03:23] I realized just now that gerrit is probably software and not a mediawiki extension [14:04:23] 03(NEW) Uncaught RangeError: Maximum call stack size exceeded when loading ResourceLoader module with many dependencies - 10https://bugzilla.wikimedia.org/35814 normal; MediaWiki: Resource Loader; (questpc) [14:05:40] JeroenDeDauw, I think it has to be configured by ^demon [14:05:47] that it's a repository property [14:05:50] Yeah, people can direct push in some cases [14:05:52] either reviewed or direct-push [14:06:27] It's possible, but you would need a git admin to give you rights, I think. [14:06:42] * Reedy looks [14:07:05] Hmm, don't tink I've enough rights [14:07:24] thanks for the commons tip, ciao. 
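And for the squash-before-review half of that discussion: a sketch assuming the work sits on a local branch started from origin/master and that the gerrit remote is called origin. refs/for/master is gerrit's standard "submit for review" ref, and amending keeps the Change-Id, so later pushes become new patchsets of the same change rather than new changes:

  # squash everything on mybranch since origin/master into one commit
  git checkout mybranch
  git reset --soft origin/master
  git commit        # one combined message; gerrit's commit-msg hook adds a Change-Id

  # send it to gerrit for review
  git push origin HEAD:refs/for/master

  # address review comments, then update the same change
  git commit --amend
  git push origin HEAD:refs/for/master

Whether to squash at all is the judgement call made above; genuinely unrelated changes usually read better as separate commits, and therefore separate gerrit changes.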
[14:09:36] 03(NEW) ResourceLoader modules which define only CSS styles should be applied at server-side. - 10https://bugzilla.wikimedia.org/35815 normal; MediaWiki: Resource Loader; (questpc) [14:10:18] New patchset: Reedy; "Title::moveToInternal doesn't return anything, but it does throw an exception" [mediawiki/core] (master) - https://gerrit.wikimedia.org/r/4517 [14:11:35] New review: jenkins-bot; "Build Successful " [mediawiki/core] (master); V: 1 C: 0; - https://gerrit.wikimedia.org/r/4517 [14:14:41] New review: Siebrand; "It could be that the spec is not clear, or that I misread the code." [mediawiki/core] (master); V: 0 C: 0; - https://gerrit.wikimedia.org/r/4540 [14:16:39] 03(mod) SUL account recently created with no attached or unattached accounts, and undeletable - 10https://bugzilla.wikimedia.org/35792 +comment (10Jeff G.) [14:25:43] 03(mod) SUL account recently created with no attached or unattached accounts, and undeletable - 10https://bugzilla.wikimedia.org/35792 (10mabdul) [14:29:14] Hello. I'm looking for an IRC bot/script that works with the flaggedrevs extension, in order to notify when a page has a pending review. If there is any existing bot that I can base this on, something that has a more advanced filtering of recent-changes or something that simply checks the history of pages would be great. [14:33:23] 03(mod) Uncaught RangeError: Maximum call stack size exceeded when loading ResourceLoader module with many dependencies - 10https://bugzilla.wikimedia.org/35814 +comment (10Krinkle) [14:34:07] hi everyone, there used to be a link on the mobile site (m.wikipedia.org) to permanently turn off the mobile view (i'm not sure about the exact wording was in the english interface). that seems to have disappeared. is there any reason to this? [14:34:30] *-was [14:41:09] 14(INVALID) ResourceLoader modules which define only CSS styles should be applied at server-side. - 10https://bugzilla.wikimedia.org/35815 +comment (10Krinkle) [14:41:17] 03(mod) ResourceLoader modules that style server output should be loadable without javascript or FOUC - 10https://bugzilla.wikimedia.org/35815 summary (10Krinkle) [14:41:56] 03(mod) Need a way to debug exceptions thrown from my ResourceLoader module "Uncaught RangeError: Maximum call stack size exceeded" - 10https://bugzilla.wikimedia.org/35814 summary (10Krinkle) [14:54:25] minifying html server-side is too agressive? [14:59:36] are there any media handling extensions that detect the user's browser and return gracefully-degraded media? i notice that the ogg video handler (e.g. in media of the day at http://commons.wikimedia.org/wiki/Main_Page) initially passes back a .png snapshot of a frame in the ogg video (which all browsers support), but i'm looking for a media handler that initially passes back a fall-back... [14:59:38] ...version of the media based on what the requesting browser supports [14:59:44] joancreus, I don't think we're minifying html anywhere [15:00:15] Emw, which kind of media? [15:00:36] nono, i mean, adding it to mw as an extension or something [15:00:42] it might make a difference on mobile [15:00:44] , for example [15:00:59] also on server-side resources [15:01:44] i'm developing a media handler for WebGL-enabled interactive 3D models. IE doesn't support WebGL, and i'd like to pass back a static image representation of the model for IE. 
(it would be possible to pass back a static image representation for all browsers, then have the user, say, click on the image to enable interactivity, but i think this would significantly dilute the user experience.) [15:03:49] joancreus: how much smaller would a server-side minified HTML file be than the same, unminified but normally gzip-compressed HTML file? [15:04:47] davide_at_debian: hi [15:04:58] Emw: i believe the percentage would be pretty high [15:05:00] let me check [15:07:28] Emw, what does a WebGL produce for a browser? [15:07:38] would that require it to load a lot of data just to show it? [15:07:50] 4,857 (8.17%) on the english wikipedia main page using http://kangax.github.com/html-minifier/ [15:07:52] Emw: [15:08:45] Platonides: WebGL uses the HTML5 canvas element. it doesn't necessarily require a lot of data. [15:09:00] huh, thinking about it, maybe it's not necessary because there's gzip [15:09:04] i don't know [15:09:55] joancreus: ya, that's what i was thinking. does the 8.17% reduction in size compare the minified HTML file to the unminified but gzipped HTML file? [15:10:09] non-gzipped, i think [15:11:46] 16006 vs 15062 bytes [15:11:55] ie. 944 bytes of difference [15:12:23] probably just whitespaced? [15:12:26] 5.89% of the original gzipped [15:12:29] whitespace* [15:12:30] & comments [15:12:36] ah [15:13:23] what i think it'd make a difference is allowing the