[00:38:07] Hmm, a bot of mine is getting 503s when accessing wikipedia.
[00:38:32] a930913, known sitewide issue, being investigated
[00:40:46] Eloquence: Since when? :o
[00:40:57] 10-15 mins ago
[00:41:44] Hmm, did I miss notification of it?
[01:07:39] Wikimedia Labs / tools: Provide namespace IDs and names in the databases similar to toolserver.namespace - https://bugzilla.wikimedia.org/48625#c34 (Alex Monk) All I can do is get a list of tables in that DB. Everything else is denied.
[07:05:33] andrewbogott_afk: no links
[10:24:19] mutante: can you help me fix my dev account? or anyone else?
[10:29:43] jayvdb: currently i don't know how because the bot trigger that explained how seems gone.. and really busy
[10:32:44] jayvdb: What do you mean by that?
[10:35:00] jayvdb: i'm trying to figure out the old docs though.. nevertheless now, heh
[10:35:30] scfc_de: quote ". If you currently have SVN access, then you have an account, but need to have it linked to Labs (how-to for admins: !add-labs-user). "
[10:35:40] scfc_de: from that ^ , we need the how-to part
[10:35:52] how do i link his svn user to labs again?
[10:36:09] https://svn.wikimedia.org/viewvc/mediawiki/trunk/tools/subversion/user-management/add-labs-user?view=markup&pathrev=96446
[10:44:26] mutante: Not a clue.
[10:44:43] Weren't all SVN users imported to LDAP?
[10:45:20] scfc_de: https://wikitech.wikimedia.org/wiki/Help:Access#Anonymous_users
[10:45:48] hrmmm
[10:46:09] What was jayvdb's SVN name?
[10:46:34] jayvdb
[10:47:00] uidNumber: 1130
[10:47:17] loginShell: /usr/local/bin/sillyshell
[10:47:18] There's an LDAP record for that? So what's missing?
[10:47:31] the "linked to labs"
[10:47:40] so he can actually log in
[10:47:43] on wikitech
[10:47:48] jayvdb: right
[10:48:00] he gets an error when he wants to login
[10:48:25] i just forgot what step it was :p
[10:48:48] i have done it but maybe 1.5 years ago
[10:49:25] Ah, okay. The cn/sn for uid is lower-cased, so there can't be a wikitech user currently.
[10:49:56] Probably best to file a bug/RT.
[10:50:15] agree, because the docs say to use that trigger
[10:50:23] so the bug is that it doesn't work anymore
[10:50:38] !add-labs-user
[10:50:39] that
[10:55:29] it won't let me create a wikitech wiki account
[11:19:11] Wikimedia Labs / Infrastructure: !add-labs-user gone, fix or add docs to link SVN users to labs/wikitech - https://bugzilla.wikimedia.org/64596 (Daniel Zahn) NEW p:Unprio s:normal a:None this is about existing SVN users from the past who are in LDAP but can't use wikitech, because they need...
[11:19:17] jayvdb: scfc_de ^
[12:06:07] thx mutante and scfc_de
[12:18:21] scfc_de: are you there?
[12:26:22] Steinsplitter: Not really; not back until this evening.
[12:26:34] ah, too bad :(
[13:55:14] Coren: hi, the file description is not replicated @commonswiki?
[13:56:29] Steinsplitter: Not sure what you mean exactly? You mean the actual text of the File: page?
[13:56:43] yepp :)
[13:57:25] i can't find the table O_O
[14:20:20] Coren: Do you have MediaWiki-namespace editing capabilities on wikitechwiki? Looks like all the messagebox / notice templates on that wiki are broken because they're lacking css. I think they got lost in the migration
[14:21:00] If you can copy over https://wikitech.wikimedia.org/wiki/User:Krinkle/MediaWiki:Gadget-enwp-boxes.css and https://wikitech.wikimedia.org/wiki/User:Krinkle/MediaWiki:Gadgets-definition that'll fix it
[14:31:02] Krinkle: Yes. Hang on.
[14:32:03] Steinsplitter: That's normal. The WMF production doesn't use the text table to store revisions at all, and the mechanism to do so is not accessible from Labs, so you have to fetch text via the API. (And even if it were, it'd be /slower/ than hitting the API because of caching.)
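A minimal sketch of what Coren describes above — getting the text of a file description page from the API rather than from the (unreplicated) text table. PHP and file_get_contents are used purely for illustration; the page title and user agent are placeholders, not anything from the discussion.

<?php
// Sketch: fetch the current wikitext of a file description page from the
// Commons API, since the text table is not available from Labs.
// The title below is only an example.
$title = 'File:Example.jpg';

$url = 'https://commons.wikimedia.org/w/api.php?' . http_build_query([
    'action' => 'query',
    'prop'   => 'revisions',
    'rvprop' => 'content',
    'titles' => $title,
    'format' => 'json',
]);

// A descriptive User-Agent is good practice when hitting the WMF APIs.
$context = stream_context_create([
    'http' => ['header' => "User-Agent: example-labs-tool/0.1\r\n"],
]);

$data = json_decode(file_get_contents($url, false, $context), true);

foreach ($data['query']['pages'] as $page) {
    if (isset($page['revisions'][0]['*'])) {
        // '*' holds the revision wikitext in this (legacy) response format.
        echo $page['revisions'][0]['*'], "\n";
    } else {
        echo "(page missing or no text)\n";
    }
}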
[14:32:44] Steinsplitter: Alternately, you can fetch revision text from the dumps, but that has a lag.
[14:33:35] k, thx
[14:39:47] Krinkle: {{done}}
[14:51:22] Thx
[17:57:05] (PS1) MarkTraceur: Add analytics stuff to #wikimedia-multimedia [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/130397
[17:57:14] YuviPanda: :D
[17:57:46] marktraceur: ki?
[17:58:01] YuviPanda: Re that patch
[17:58:10] ki re?
[17:58:14] :P
[17:58:31] * marktraceur lobs YuviPanda a slow pitch with https://gerrit.wikimedia.org/r/130397
[17:58:48] (CR) Yuvipanda: [C: 2] Add analytics stuff to #wikimedia-multimedia [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/130397 (owner: MarkTraceur)
[17:58:51] (Merged) jenkins-bot: Add analytics stuff to #wikimedia-multimedia [labs/tools/grrrit] - https://gerrit.wikimedia.org/r/130397 (owner: MarkTraceur)
[17:58:51] * YuviPanda appeals for an LBW
[17:59:00] marktraceur: want me to deploy?
[17:59:24] YuviPanda: That would be awesomesauce
[17:59:24] on the way
[17:59:58] YuviPanda: None of the expansions I see of "LBW" make any sense. :P
[18:00:15] marktraceur: https://en.wikipedia.org/wiki/Leg_before_wicket
[18:00:42] Ah
[18:01:15] !log local-lolrrit-wm restarted to deploy patch to route cabal messages to cabal channels
[18:01:17] Logged the message, Master
[19:25:01] (PS1) coren: Front-end support for tomcat webservices. [labs/toollabs] - https://gerrit.wikimedia.org/r/130429 (https://bugzilla.wikimedia.org/54845)
[19:26:03] (PS2) coren: Front-end support for tomcat webservices. [labs/toollabs] - https://gerrit.wikimedia.org/r/130429 (https://bugzilla.wikimedia.org/54845)
[19:27:02] (CR) coren: [C: 2 V: 2] "Already tested to work." [labs/toollabs] - https://gerrit.wikimedia.org/r/130429 (https://bugzilla.wikimedia.org/54845) (owner: coren)
[19:28:39] Wikimedia Labs / tools: Add support for Java Servlets on Tool Labs - https://bugzilla.wikimedia.org/54845#c12 (Marc A. Pelletier) This provides, from a tool account: setup-tomcat to create a tool-local instance of tomcat and webservice -tomcat (start|stop|restart) to manipulate it.
[20:06:25] Wikimedia Labs / tools: mounting problems with /public - https://bugzilla.wikimedia.org/60276#c1 (Marc A. Pelletier) This is almost certainly caused by the 'autofs5' package being installed on the instance and conflicting with the fixed mounts. Normally, simply purging the package and rebooting will f...
[20:08:25] Wikimedia Labs / tools: automatic encouragements to use OGE instead of cronjobs, when appropriate - https://bugzilla.wikimedia.org/54720#c3 (Marc A. Pelletier) The new submit-host cron system forces jobs to be sent to the grid. While it is not currently the default (and I've had no time to push for t...
[20:10:10] Wikimedia Labs / tools: lighttpd redirects fail without a trailing slash - https://bugzilla.wikimedia.org/59926#c8 (Marc A. Pelletier) REO>RES/FIX That is unrelated. The proxy provides an X-FORWARDED-PROTO header which can be used by the application to construct URIs if it really must redirect b...
[20:12:24] Wikimedia Labs / tools: build and install tcl fcgi - https://bugzilla.wikimedia.org/56995#c14 (Marc A. Pelletier) I do not currently have the bandwidth to manually construct packaging. If someone volunteers to create the source deb, I'd be happy to build and deploy it.
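A hedged illustration of the X-FORWARDED-PROTO note in the lighttpd redirect comment above ([20:10:10]): a tool behind the Tools front-end proxy can consult that header when it really must construct an absolute URI. The tool path below is a made-up example, not an actual tool.

<?php
// Sketch: pick the scheme the client actually used, as reported by the
// front-end proxy, before building an absolute redirect URI.
// '/example-tool/' is a placeholder path.
$proto = isset($_SERVER['HTTP_X_FORWARDED_PROTO'])
    ? $_SERVER['HTTP_X_FORWARDED_PROTO']   // 'http' or 'https', set by the proxy
    : 'http';

$target = $proto . '://tools.wmflabs.org/example-tool/some/dir/';
header('Location: ' . $target, true, 301);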
[20:47:54] Wikimedia Labs / deployment-prep (beta): Use scap to deploy on apaches - https://bugzilla.wikimedia.org/63746 (Bryan Davis) PAT>RES/FIX
[20:52:24] Wikimedia Labs / deployment-prep (beta): Set up graphite monitoring for the beta cluster - https://bugzilla.wikimedia.org/52357#c2 (Bryan Davis) The deployment-graphite instance is intended to do this monitoring. I started to work on it but haven't finished making the puppet modules/roles usable in bet...
[21:06:55] Coren: can you do me a favour and check out the IP of that guy https://tools.wmflabs.org/paste/view/18b34b01
[21:07:06] Coren: and set him on hold ;)
[21:10:04] hedonil: Done, but I won't hold it too long since it's a dynamic IP
[21:10:20] Coren: 'k. thanks.
[21:14:54] Wikimedia Labs / tools: mounting problems with /public - https://bugzilla.wikimedia.org/60276#c2 (Tim Landscheidt) UNC>RES/FIX I think this was reported for Tools in pmtpa, so it will have been resolved by the migration to eqiad.
[21:48:55] hmm
[21:48:57] labs died?
[21:49:01] where has grrit-wm gone?
[21:49:05] wikibugs is up
[21:51:17] 10 minutes till we discuss Ryan's and Tyler's RfCs in #wikimedia-office https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-04-29
[21:55:28] Coren, do you have some spare time? I'm looking at the Tomcat webservice. (First of all thanks for implementing it. :-))
[22:07:19] we are now discussing dependency management in #wikimedia-office https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-04-29
[22:09:52] ireas: I can give a hand, but beware that I am no Java expert.
[22:09:56] Wikimedia Labs / tools: lighttpd redirects URLs of directories without a trailing slash from https to http - https://bugzilla.wikimedia.org/64627 (Tim Landscheidt) NEW p:Unprio s:normal a:Marc A. Pelletier +++ This bug was initially created as a clone of Bug #59926 +++ lighttpd redirects U...
[22:11:48] Coren: great :-) as far as I know, it should be sufficient to drop the .war in the webapps dir (that's what I did). where should I expect the tool to be available? looking at the server.xml, I'd guess tools.wmflabs.org:8080/«tool»/«war_name» (which does not work)
[22:12:23] Damianz: hey
[22:12:31] Damianz: is cluebot using huggle's whitelist too?
[22:14:54] Wikimedia Labs / tools: lighttpd redirects fail without a trailing slash - https://bugzilla.wikimedia.org/59926#c9 (Tim Landscheidt) I filed bug #64627 for the https -> http issue.
[22:15:00] ireas: You'll need to do more than just add the war, I think. I'm guessing you have to explicitly add some sort of redirection because the proxy leaves the path unchanged entirely. The "simple" solution would seem to be that you want your war to be named after your tool.
[22:15:20] So that the /toolname/ URI (which will always be there) matches.
[22:15:47] So I'd need to configure lighttpd to redirect to tomcat?
[22:16:57] Wikimedia Labs / tools: build and install tcl fcgi - https://bugzilla.wikimedia.org/56995 (Tim Landscheidt) ASS>NEW a:Marc A. Pelletier>None
[22:17:10] ireas: No, no, the proxying is automagic.
[22:17:19] ireas: You have tomcat /instead/ of lighttpd
[22:17:31] "ASS>NEW" does nobody think things through around here
[22:18:07] Or at least, you should. It's not clear which one the proxy will pick if you have more than one. :-)
[22:18:21] marktraceur: Que?
[22:19:51] Coren: ah, okay. I'll try to change the configuration …
[22:20:05] marktraceur: feel free to submit a patch to make it a four-letter abbr, or to change ASS into ASN, or whatever. I chose the 'we are all adults here' path :-p
[22:20:27] ireas: http://tools.wmflabs.org/javatest/
[22:20:36] valhallasw: Not necessarily
[22:20:50] All I did was drop the apache sample.war and move it to webapps/javatest.war
[22:21:33] 'adult' meaning 'mature enough not to point-and-laugh-at-everything-that-could-be-considered-a-'dirty'-word'.
[22:22:21] So yeah, what you want/need is normally for your app to be named like your tool. Alternately, I'm pretty sure you could do a subdir.
[22:22:40] valhallasw: This is demonstrably not true
[22:24:33] HEHEHEHE HE SAID DEMONSTRABLY
[22:24:34] HEHEHE
[22:24:49] Yeah, that. Whatever. I'll happily merge a patch.
[22:25:03] *nod* probably won't come from me in the near future but bug is filed
[22:25:21] YuviPanda: Gee thanks
[22:26:59] Coren: hmmm… I downloaded the sample.war and moved it to ~tools.isbn/public_tomcat/webapps/isbn.war, restarted tomcat and … 404
[22:27:08] marktraceur: HEHEHE YOU SAID GEE!
[22:27:13] ok, now I'll go back to writing code.
[22:27:21] YuviPanda: It's my duty
[22:27:29] marktraceur: essentially, any abbreviation can be considered rude in some language.
[22:27:30] marktraceur: saying GEE is your duty?
[22:27:36] marktraceur: you sound like sucheta now.
[22:27:45] "And are they trying really hard not to laugh at the word 'duty'?"
[22:27:56] YuviPanda: My voice has been getting a bit higher lately
[22:28:09] ireas: You have a lighttpd running. :-)
[22:28:17] marktraceur: :)
[22:29:36] Coren: oh ^^ webservice stop && webservice -tomcat restart, still 404 :-/
[22:37:06] petan: If you mean Wikipedia:Huggle/Whitelist, yes
[22:37:06] Damianz: lol that thing is deprecated and outdated for like 2 years
[22:37:06] then no
[22:37:06] anyway, today we switched to another new backend using postgresql
[22:37:06] so you could directly select the whitelisted users from db
[22:37:06] or add them if you like
[22:37:06] the db on tools?
[22:37:10] unfortunately there is no postgre yet on tools
[22:37:12] I'm not sure why. Lemme try something.
[22:37:12] ireas: Oh, it's because I'm an idiot.
[22:37:12] Patch incoming.
[22:37:18] once there is I will happily migrate it
[22:37:22] lol
[22:37:23] I really hate mysql
[22:37:44] or no, it's not like I "hate" mysql, mysql is fine
[22:37:52] but postgre is so much better :P
[22:37:58] that is phrased better I guess
[22:38:09] Coren: I'm sure you're wrong :-)
[22:38:16] https://github.com/DamianZaremba/cluebotng/blob/master/bot/misc_functions.php#L25 https://github.com/DamianZaremba/cluebotng/blob/master/bot/feed_functions.php#L101-102 will need changing I guess then... assuming you have some method of telling the list has changed
[22:38:17] marktraceur: bugzilla itself seems to use four letters, so switching to that makes sense, I guess.
[22:38:31] * Damianz adds to his todo list as it's nearly midnight and he's in the middle of a BGP config
[22:38:50] ireas: https://gerrit.wikimedia.org/r/#/c/130510/ says I'm indeed an idiot. :-)
[22:38:54] Damianz: http://huggle.wmflabs.org/data/wl.php this is how huggle accesses the new whitelist, for some time now
[22:39:27] basically you can run http://huggle.wmflabs.org/data/wl.php?wp=en.wikipedia.org&action=read which gives you a very nasty | separated list :P
[22:40:06] it's 1 huge line of all whitelisted users, more cute output (human readable) is here http://huggle.wmflabs.org/data/wl.php?wp=en.wikipedia.org&action=display
[22:40:12] ireas: Starting your tomcat anew should now work as expected.
[22:40:16] Coren: well, I now comprehend why you came to that conclusion, but … no :-)
[22:40:27] So there is no way to tell when that changes
[22:40:30] I could give you more formats if you need, but I still think that direct psql access is best for you
[22:40:36] petan: You can use the Labs PostgreSQL server, for the moment on a beta basis, if that's necessary. You'll need to ask akosiaris for access (there's a bug about that as well).
[22:40:39] Something tells me you'll get grumpy if I request that a few times a second
[22:40:43] marktraceur: not sure where you posted the bug, but https://github.com/valhallasw/pywikibugs/issues/25
[22:40:59] scfc_de`: cool, once that is stable I will move things there, I just hope for 1 thing
[22:41:12] that usernames and database names will not be as mad as on mysql
[22:41:24] ah, hello world is working …
[22:41:27] like u353646430962496582406 instead of "huggle_wl" etc
[22:41:31] You know what would be awesome for this shizzle - dsnlb style stuff
[22:42:35] Damianz: you can easily check if it was updated by running SELECT insertion_date FROM list ORDER BY insertion_date DESC LIMIT 1; that will return the last update time
[22:43:27] But I still have to query /that/ on every edit, or at some random time interval
[22:43:41] I'll add something in to reload the list every $x when I get a few
[22:43:50] petan: Well, in German we say: "Namen sind Schall und Rauch." ("Names are but sound and smoke.") :-) It's something in a configuration file, to be forgotten about.
[22:44:11] ok, the php script we use for whitelist could push updates somewhere like redis I guess, or whatever
[22:44:34] Damianz: The whitelist doesn't change that often, does it?
[22:44:36] * Damianz wonders what was wrong with using a wiki page :P
[22:44:43] a930913: Exactly
[22:44:49] So requesting it on every edit is pointless
[22:45:09] scfc_de`: not very nice when you want to access the database by running psql -d -U which in case of labs would be psql -d p49549696956942542682946 -U u349546540795369024960464 and that is hard to remember :P
[22:45:26] Coren: the tool isn't working yet, but I have an error log and an exception stack trace, and that's all I need at the moment :-D Thank you! :-)
[22:45:27] But yes, the whitelist not being on the wiki means the wiki can't access it easily.
[22:45:33] I prefer cute and short names
[22:45:54] ireas: That's a good starting point. :-)
[22:45:55] petan: Actually, does the whitelist api send CORS stuff for wiki access?
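A rough sketch of what Damianz's "reload the list every $x" could look like against the |-separated wl.php output described above. The cache file, the five-minute interval, and the example username are assumptions made purely for illustration — this is not how cluebot actually does it.

<?php
// Sketch: fetch huggle's whitelist (one huge '|'-separated line) and cache it
// locally so we don't hit wl.php on every edit.
$url    = 'http://huggle.wmflabs.org/data/wl.php?wp=en.wikipedia.org&action=read';
$cache  = '/tmp/huggle_whitelist.cache';   // placeholder path
$maxAge = 300;                             // refresh at most every 5 minutes

if (!file_exists($cache) || time() - filemtime($cache) > $maxAge) {
    $raw = file_get_contents($url);
    if ($raw !== false) {
        file_put_contents($cache, $raw);
    }
}

// Split into usernames, dropping empty fields left by trailing separators.
$whitelist = array_filter(explode('|', file_get_contents($cache)), 'strlen');

// Example check (username is a placeholder):
$user = 'Example user';
if (in_array($user, $whitelist, true)) {
    echo "$user is whitelisted\n";
}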
[22:46:00] also I would like to share this db with many people, I don't want to tell them just select this from p656548734682458405826
[22:46:13] a930913: what is that
[22:46:26] a930913: whitelist changes like every 5 minutes
[22:47:15] petan: For a website to access another, there need to be headers sent that allow domain1.com to access domain2.com
[22:47:37] a930913: it was producing zillions of revisions, gigabytes of useless mysql data and possibly megabytes of useless traffic as users had to always rewrite the old whitelist with the new whitelist (you can't easily append data to a wiki page)
[22:47:53] a930913 there are really many many reasons why using an sql database as backend is better than using a wikipage
[22:48:51] @seen dan-nl
[22:48:51] Steinsplitter: Last time I saw dan-nl they were quitting the network with reason: Client Quit N/A at 4/29/2014 6:19:20 PM (4h29m30s ago)
[22:49:46] !mysql is petan likes mysql, when he say he doesn't, he is lying
[22:49:47] This key already exist - remove it, if you want to change it
[22:49:50] meh
[22:50:39] a930913: I still don't know what you mean, maybe move this discussion to #huggle so that we don't bother others with this
[22:51:20] a930913: there are many flaws in the current whitelist implementation, but it can always be fixed, it's open source written in nice php :)
[22:51:36] petan: Put in the lighty config, setenv.add-response-header = ( "Access-Control-Allow-Origin" => "*", "Access-Control-Allow-Methods" => "GET, POST, OPTIONS" )
[22:51:59] Oh wait, it's served by the fcgi, isn't it?
[22:52:16] a930913: why? and I am using apache not lighty
[22:54:49] petan: On en.wikipedia > xhr=new XMLHttpRequest(); xhr.open("GET", "http://huggle.wmflabs.org/data/wl.php?wp=en.wikipedia.org&action=read", false); xhr.send();
[22:55:08] petan: XMLHttpRequest cannot load http://huggle.wmflabs.org/data/wl.php?wp=en.wikipedia.org&action=read. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'https://en.wikipedia.org' is therefore not allowed access.
[22:56:03] I would prefer php functions to modify the header
[22:56:04] changing it at the webserver level affects all pages
[22:56:43] petan: Get a better config then ;)
[22:56:58] But yeah, put it in the php for the page.
[22:57:52] a930913: ok
[22:58:09] a930913: you can show me at the hackathon how to do that, I have no idea what that is or why it's needed
[22:58:15] petan: add_real_header_not_to_be_confused_with_the_old_function() or something.
[22:58:21] Damianz: you going to zurich? :o
[22:58:31] petan: Which hackathon?
[22:58:39] a930913: zurich one
[22:58:57] !hackaton
[22:59:14] petan: Am I meant to be coming? :o
[22:59:33] !hackaton is http://www.mediawiki.org/wiki/Zürich_Hackathon_2014
[22:59:34] Key was added
[22:59:52] a930913: all devs who can should be :P
[23:00:07] a930913: we can't drink all that beer without your help
[23:00:19] petan: Unless you add access control headers, other sites (wikipedia) can't access the page.
[23:00:30] interesting
[23:00:36] how come huggle can do that :P
[23:01:36] petan: Because huggle is not a website.
[23:04:06] petan: Huggle doesn't have all my cookies and therefore can't act on my behalf. (Until I give it credentials such as wiki login.)
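The patch a930913 eventually pushes isn't shown in the log; a plausible minimal version of "put it in the php for the page" would be something like the following near the top of wl.php / index.php. The wildcard mirrors the lighty example quoted above, and see the end of the log for echoing back a specific Origin instead.

<?php
// Sketch: send CORS headers before any output, so browsers on wikipedia.org
// are allowed to read the whitelist response.
header('Access-Control-Allow-Origin: *');
header('Access-Control-Allow-Methods: GET, POST, OPTIONS');

// ... rest of the whitelist output continues as before ...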
[23:05:45] well you need to give it credentials because wmf doesn't offer any other way to login
[23:05:55] I would prefer using some other system if there was one
[23:07:21] Coren: I added two short paragraphs to the help page: https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Help#Java_webservice_.28Tomcat.29
[23:07:55] petan: The browser already has my credentials ;) But we digress.
[23:08:55] Wikimedia Labs / tools: Add support for Java Servlets on Tool Labs - https://bugzilla.wikimedia.org/54845#c13 (Robin Krahl) Thanks Coren. Tomcat seems to be working for me. See [0] for a short summary of the WAR deployment. [0] https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Help#Java_webse...
[23:09:07] tbh I trust huggle more than my browser when it comes to credentials, but maybe that is because I know the code :P
[23:10:39] petan: Anyway, can you add the headers?
[23:10:55] I am too lame for that
[23:10:57] but you can!
[23:11:06] https://github.com/huggle/wl
[23:11:25] * a930913 is not a git.
[23:11:50] you don't even need a git for that
[23:11:57] there is an "edit" button in github
[23:12:02] something I miss in phabricator
[23:16:13] petan: What file am I putting it into?
[23:16:39] petan: All this php oop is confusing me.
[23:17:01] a930913: best would be just to create some function in the whitelist.php class which I would use everywhere it's needed
[23:17:21] or if you want that header everywhere, just use index.php
[23:17:29] insert it somewhere at the top of index.php
[23:17:37] you will avoid OOP, there is none :P
[23:18:23] it's really late here, I will go sleep
[23:22:41] petan: I think I did it.
[23:22:58] thanks
[23:23:16] But bear in mind the last project I touched with git lost a month of work.
[23:24:05] I've been working on this since yesterday
[23:24:15] so in the worst case I lose 1 - 2 days of work
[23:24:29] petan: When will it take effect? Immediately?
[23:24:36] yes
[23:25:17] petan: No errors now on the wiki :)
[23:25:24] cool
[23:25:46] petan: The * means that any site can direct clients to access it.
[23:26:40] It's not likely to ever be an issue, but if you're worried, you can limit it just to wikimedia sites.
[23:27:34] Though you'll need to parse headers to send back the right site.
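A hedged sketch of that last suggestion — checking the Origin request header against Wikimedia domains and echoing it back instead of sending "*". The domain list and regex are illustrative only, not an exhaustive or official list.

<?php
// Sketch: only allow Wikimedia origins, and echo the requesting Origin back
// rather than using the '*' wildcard.
$origin = isset($_SERVER['HTTP_ORIGIN']) ? $_SERVER['HTTP_ORIGIN'] : '';

$allowed = '/^https?:\/\/([a-z0-9-]+\.)*(wikipedia|wikimedia|wiktionary|wikibooks|'
         . 'wikinews|wikiquote|wikisource|wikiversity|wikivoyage|wikidata|mediawiki)\.org$/i';

if ($origin !== '' && preg_match($allowed, $origin)) {
    header('Access-Control-Allow-Origin: ' . $origin);
    // Tell caches that the response varies by requesting origin.
    header('Vary: Origin');
}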