[13:00:13] (PS2) Dzahn: use new wikivoyage logo on stats.wikimedia.org [analytics/wikistats] - https://gerrit.wikimedia.org/r/88978
[13:00:30] (CR) jenkins-bot: [V: -1] use new wikivoyage logo on stats.wikimedia.org [analytics/wikistats] - https://gerrit.wikimedia.org/r/88978 (owner: Dzahn)
[13:01:54] (CR) Dzahn: [C: 2 V: 2] "manual rebase, fixed path conflict" [analytics/wikistats] - https://gerrit.wikimedia.org/r/88978 (owner: Dzahn)
[13:02:06] (CR) jenkins-bot: [V: -1] use new wikivoyage logo on stats.wikimedia.org [analytics/wikistats] - https://gerrit.wikimedia.org/r/88978 (owner: Dzahn)
[13:02:16] morning guys
[13:06:54] drdee: morning, i just fixed the path conflict on that with a manual rebase, but i can't override the unrelated Jenkins issue due to the manual pushes
[13:07:21] otto is off today, right?
[13:13:08] morning
[13:13:13] yes Snaps, otto is off
[13:13:16] yup he is
[13:13:34] mutante: email erik zachte :)
[13:14:03] but Snaps you are very welcome to join us in our IRC channel aka 'the coffee barista'
[13:14:10] or our bat cave
[13:14:12] where we tell jokes
[13:14:25] gossip about the kardashians
[13:14:36] and just have a good time
[13:14:38] lol
[13:14:57] thank you sincerely dear doktor, but it wouldn't feel right without andrew aka otto
[13:15:19] wow that's quite some loyalty you are displaying here
[13:16:04] you've got to pick your allies with care
[13:23:50] drdee: ok. done. cya around
[16:34:37] average around?
[16:35:04] yes
[16:35:14] I am here
[16:35:30] I am a watchful protector of the channel, reading the mailing list, reading the backlog
[16:35:38] awesome
[16:35:44] writing some code on and off
[16:35:48] k
[17:36:18] drdee: I posted some comments on https://www.mediawiki.org/wiki/Talk:Analytics/Hypercube
[17:36:28] ty!
[17:40:23] this whole big data problem might be solved better soon
[17:40:31] in canada there's a company called D-Wave
[17:40:37] they have some quantum chips out for sale
[17:40:51] apparently they got them like production ready or something and people are using them
[17:40:58] I saw that yesterday
[17:53:00] drdee: re: large uploads. metrics is sitting behind my proxy, and while it shouldn't cause problems with 'large file upload', i've not tested that :D so one thing to keep in mind while debugging
[18:07:18] hey YuviPanda -- have you had a chance to play with hadoop?
[18:07:26] hey tnegrin
[18:07:37] not yet, was waylaid by onset of CTS :(
[18:07:46] so taking care of it now and doing less work
[18:08:04] take care of yourself!
[18:08:20] sorry to hear it
[18:08:22] yeah
[18:14:54] milimetric: did you see YuviPanda 's remark about the proxy server?
[18:15:05] it *shouldn't* be an issue
[18:15:10] but still, a possibility
[18:15:13] yea
[18:15:21] i'm doubtful that's causing any issues
[18:15:37] i'll let you know
[18:16:11] k
[18:23:29] oh that can't be it YuviPanda / drdee_ because a 438-user cohort was uploaded on October 8th
[18:23:40] ah, right
[18:23:56] hm, bugger
[18:24:03] it's definitely giving me a 504 though
[18:24:07] and i see nothing in the logs
[18:24:11] means it's never getting to me ...
[18:24:12] hm...
[18:24:38] maybe related to labs maintenance ?
[18:25:13] has something changed with the SSL setup YuviPanda ?
[18:25:14] milimetric: 2013/10/11 18:23:11 [error] 1040#0: *944 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 68.43.214.146, server: , request: "POST /cohorts/upload HTTP/1.1", upstream: "http://10.4.0.75:80/cohorts/upload", host: "metrics.wmflabs.org", referrer: "https://metrics.wmflabs.org/cohorts/upload"
[18:25:21] so it got sent to you
[18:25:27] just too slow
[18:25:35] drdee_: nope
[18:25:40] drdee_: labs is back online too
[18:25:42] k
[18:31:31] ok, this is definitely some type of proxy thing, sorry
[18:31:41] I get this line in my log, AFTER the timeout:
[18:31:41] 10.4.0.214 - - [11/Oct/2013:18:29:31 +0000] "POST /cohorts/upload HTTP/1.1" 200 14521 "https://metrics.wmflabs.org/cohorts/upload" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/30.0.1599.66 Safari/537.36"
[18:32:00] milimetric: should I increase the timeout or something?
[18:32:05] yeah, definitely
[18:32:11] it should be like half an hour or something
[18:32:14] lol
[18:32:27] seriously, but we gotta prioritize card 818 drdee_
[18:32:36] (818 is what lets us make this damn validation async)
[18:32:57] milimetric: it's 60s by default
[18:32:59] hang on though YuviPanda
[18:33:05] gimme a sec to try one more thing
[18:33:07] milimetric: is it more than 60s now?
[18:34:07] yeah, YuviPanda, it's definitely the timeout
[18:34:12] milimetric: let me increase it
[18:34:13] moment
[18:35:06] milimetric: try now?
[18:35:06] but is it seriously stupid to increase it to 30 min?
[18:35:13] milimetric: temporarily upped it to 5m
[18:35:22] 30m does sound a little stupid :P
[18:35:29] or maybe not
[18:35:33] uploads *can* take that long
[18:35:54] well, it's not the upload
[18:35:58] it's validating all the users in the cohort
[18:35:59] sure
[18:36:01] so that's the dumb part
[18:36:05] can you test now?
[18:36:09] but yeah, maybe let's up it to 30 min. for now
[18:36:19] and we'll schedule 818 next Wednesday
[18:36:24] yep, trying
[18:40:13] yep, that worked for a 1000-sized cohort YuviPanda
[18:40:17] right
[18:40:21] milimetric: let me puppetize that
[18:40:21] I think maybe up it to 10 min. for now
[18:40:41] and I'll let you know as soon as we fix 818 to turn it back down to something more sensible
[18:40:52] ok :)
[18:40:54] moment
[18:40:57] though if people upload like 200k cohorts, the csv upload alone might break it
[18:42:36] upping to 6000s
[18:42:39] err
[18:42:40] wait
[18:42:47] that's like a lot more than 10m :D
[18:42:49] 600s
[18:43:51] :)
[18:43:53] thanks YuviPanda
[18:44:48] milimetric: https://gerrit.wikimedia.org/r/89247
[18:45:19] thank you very much
[18:45:48] not merged yet
[18:50:10] milimetric: merged
[18:50:14] milimetric: running puppet now
[19:50:44] (PS2) Milimetric: Adding import logic for dumps pagecounts [analytics/kraken] - https://gerrit.wikimedia.org/r/89125
[19:51:39] (CR) Milimetric: [C: 2 V: 2] "Merging this so others can work with it, but it's still a work in progress." [analytics/kraken] - https://gerrit.wikimedia.org/r/89125 (owner: Milimetric)
[19:54:14] (PS1) Milimetric: Adding import logic for dumps pagecounts [analytics/kraken] - https://gerrit.wikimedia.org/r/89327
[19:54:15] (PS1) Milimetric: merged cleanly [analytics/kraken] - https://gerrit.wikimedia.org/r/89328
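[Editor's note] The stopgap discussed above is an nginx proxy read timeout bump (nginx defaults `proxy_read_timeout` to 60s, matching the "upstream timed out" error in the log). A minimal sketch of the relevant directives, with the server name and upstream address taken from the log lines above; the actual puppetized configuration in Gerrit change 89247 may differ:

```nginx
# Sketch of the metrics.wmflabs.org reverse proxy (not the real puppet config)
server {
    server_name metrics.wmflabs.org;

    location / {
        proxy_pass http://10.4.0.75:80;
        # Default is 60s, which caused 504s while cohort validation ran;
        # temporarily upped to 10 minutes (the "600s" agreed on above).
        proxy_read_timeout 600s;
    }
}
```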
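[Editor's note] The real fix referenced as "card 818" is to make cohort validation asynchronous, so the POST returns before the proxy timeout. A hypothetical sketch of that pattern (the function and job-store names are illustrative, not Wikimetrics code): accept the upload, kick validation to a background thread, and return a job id the client can poll.

```python
import threading
import time
import uuid

# In-memory job store; a real service would persist this somewhere durable.
jobs = {}

def validate_users(cohort):
    """Stand-in for the slow per-user validation that caused the 504s."""
    return [u for u in cohort if u]  # pretend every non-empty name is valid

def upload_cohort_async(cohort):
    """Accept the upload immediately; validate in the background so the
    HTTP response returns well within any proxy timeout."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "pending", "valid": None}

    def worker():
        valid = validate_users(cohort)
        jobs[job_id] = {"status": "done", "valid": valid}

    threading.Thread(target=worker).start()
    return job_id  # client polls this id instead of blocking on the POST

job = upload_cohort_async(["Alice", "Bob", ""])
while jobs[job]["status"] != "done":
    time.sleep(0.01)
print(jobs[job]["valid"])  # → ['Alice', 'Bob']
```

With this shape, the proxy timeout can go back down to something sensible, since the upload request itself finishes in milliseconds regardless of cohort size.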