[10:49:24] New patchset: Erik Zachte; "new files for portal" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/71002
[10:51:16] Change merged: Erik Zachte; [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/71002
[14:01:23] tatataaaaaaaa
[14:04:25] yoyoyo
[14:10:54] how is my fellow east-sider doin?
[14:16:11] ori-l thanks for helping Snaps!
[14:16:53] doiiin alriiigh
[14:17:02] i'm running the deduplicate on the days in may
[14:17:11] also messing with jmxtrans kafka ganglia stuff
[14:29:01] cafe time, back in a bit
[15:08:54] ottomata!
[15:08:55] get out of here
[15:09:15] that will show him
[15:09:29] that guy is a jerk!
[15:29:37] New patchset: Milimetric; "fixes NamespaceEdits test" [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71073
[15:32:12] whoami
[15:32:18] ottomata!
[15:32:21] i win.
[15:45:11] Change merged: Milimetric; [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71073
[15:51:20] milimetric: morning. wtf __init__ stuff looks good. I'm working on celery stuff now
[15:51:34] cool
[15:51:50] yeah, i thought it was a little hacky but now I'm ok with it :)
[15:52:38] it made sense to me (with the comments), which i guess is a good test
[15:56:36] milimetric: btw, i just discovered that the flask app has a built-in logger object, so you can just go: app.logger.debug('test app log')
[15:56:42] and it prints it out in the server console
[16:32:56] sorry erosen, was in a hangout with toby
[16:33:17] so yeah, i think i saw that flask logger somewhere. but it wouldn't tell you what file/class the log is from, is that the problem?
[16:37:51] no, it will
[16:37:55] you just have to set the formatter
[16:38:14] i have some formats that I like
[17:01:15] average: scrum
[17:30:37] New patchset: Milimetric; "adds placeholder and fixes Metric inheritance tree" [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71080
[17:30:54] Change merged: Milimetric; [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71080
[17:58:18] erosen around?
[17:58:36] drdee: yeah, fixing last minute dashboard stuff with amit and kul
[17:58:37] sup?
[17:58:52] erosen i also sent you the june xcs data
[17:58:59] yeah, i looked at it and have no idea
[17:59:03] you can share with him as well, Dialog is weird though
[17:59:05] should I ask amit for ideas?
[17:59:06] k
[17:59:09] but Beeline looks good
[17:59:12] yeah for sure
[18:00:47] what's Beeline?
[18:02:43] average: it's a russian mobile carrier that has a partnership with wikipedia zero
[18:03:12] oh
[18:06:58] drdee: amit says he has no ideas
[18:07:13] i suggest looping in yurik and adam
[18:07:13] but he did mention that sri-lanka should only have zero data
[18:07:18] yeah
[18:07:38] my hypothesis is that they were incorrectly logging mobile traffic for the first 11 days
[18:07:45] k was he happy with the beeline data?
[18:08:35] drdee: yup
[18:08:39] he said it checks out
[18:08:45] but didn't tell the story he was hoping it would
[18:08:54] drdee: they ran an ad a few days ago apparently
[18:08:56] don't blame the messenger ;)
[18:08:59] hehe
[18:09:09] i will create a 1 point card for this
[18:09:44] nice
[18:23:29] thanks milimetric!
[18:24:02] you're welcome!
[18:24:05] what'd I do?
[18:24:05] :)
[19:02:01] hey drdee
[19:02:10] yoyo
[19:02:42] :) the varnishkafka repo is still owned by 'project creators', which is inappropriate -- i'm happy to change it, though. what should it be? 'analytics'?
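[editor's note] The Flask logger tip above (`app.logger.debug(...)` plus "you just have to set the formatter") boils down to standard `logging` configuration, since `app.logger` is an ordinary `logging.Logger`. A minimal stdlib-only sketch follows; the logger name and format string are illustrative choices, not the formats mentioned in the chat, and in a real Flask app you would apply the same handler to `app.logger` instead of `log`:

```python
import logging

# Stand-in for Flask's app.logger, which is a plain logging.Logger;
# the identical configuration works on app.logger directly.
log = logging.getLogger("wikimetrics")

# module/funcName/lineno fields answer the "what file/class is this
# log from?" question raised above. (Example format, not from the chat.)
formatter = logging.Formatter(
    "%(asctime)s [%(levelname)s] %(module)s:%(funcName)s:%(lineno)d %(message)s"
)

handler = logging.StreamHandler()  # writes to the server console (stderr)
handler.setFormatter(formatter)
log.addHandler(handler)
log.setLevel(logging.DEBUG)

log.debug("test app log")  # now prefixed with its source location
```

Each record is then printed with the module, function, and line number it came from, which is exactly the information the default setup omits.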
[19:03:02] well, there are actually two varnishkafka repos
[19:03:09] not sure who created them :(
[19:03:16] i didn't create either
[19:03:18] the operations/varnishkafka should be deleted
[19:03:33] operations/software/varnish/varnishkafka or something like that is the right one
[19:03:41] i guess it should be owned by ops
[19:03:48] https://gerrit.wikimedia.org/r/#/admin/projects/operations/software/varnish/varnishkafka,access
[19:03:52] kk
[19:08:15] ottomata: https://gerrit.wikimedia.org/r/#/c/71093/1
[19:16:14] eh?
[19:45:14] milimetric: another limn q
[19:45:19] sure
[19:45:30] is this all for 244 btw?
[19:45:51] no
[19:45:57] this is regarding asaf's request
[19:46:04] i tried to create a graph and it never loaded the datasources
[19:46:21] I was hoping you could take a look at gp.wmflabs.org/graphs/create
[19:46:25] and see what is going on
[19:46:56] erosen: which asaf's request?
[19:47:09] asaf bartov
[19:47:17] i mean what request
[19:47:29] hehe
[19:47:31] drdee: he just wanted to check in on whether it was possible to create a graph
[19:47:34] you always rub in my cutchness
[19:47:42] stupid autocorrect
[19:47:54] i think I just interpreted it wrong
[19:47:59] no clue what 'cutchness' means
[19:48:06] same
[19:48:20] i meant to say dutchness
[19:48:27] oh right erosen, I think it's that there are so many datasources
[19:48:29] yeah, i figured that part out
[19:48:33] and it searches the filesystem every time
[19:48:34] milimetric: that was my suspicion
[19:48:38] k
[19:49:41] but yeah, there are some nasty performance issues that we haven't gotten to. Those shouldn't be too hard to fix
[19:49:49] certainly easier than that damn x-axis tick spacing thing
[19:49:55] milimetric: cool
[19:50:16] milimetric: I guess I wanted to make sure that was the problem
[19:50:32] milimetric: because when I check my console after a bit i see the error: Uncaught TypeError: Object [object Array] has no method 'toLowerCase'
[19:50:32] i'll double check locally
[19:50:48] oh
[19:50:51] hm...
[19:52:26] milimetric: what's up?
[19:52:58] no that's odd, i'm trying something
[19:56:32] heh, oh yeah erosen, one of your datasources has a name that's in square brackets
[19:57:33] run it locally and check out http://localhost:8081/datasources/grants_spending_by_country.json?pretty=1
[19:57:52] http://gp.wmflabs.org/datasources/grants_spending_by_country.json?pretty=1
[20:00:21] thanks ori-l for double teaming me with erosen!
[20:00:28] ;)
[20:00:53] i just saw the backlog, am sitting in the couch area on 3 and trying not to laugh out loud
[20:01:07] i don't know why but 'cutchness' is incredibly hilarious to me
[20:01:23] i am very afraid that it is a very offensive term
[20:01:28] somehow
[20:02:40] milimetric: aah, thanks for realizing this
[20:02:44] i hadn't thought to run it locally
[20:03:29] np, glad to help - errors are pretty helpful in limn actually
[20:03:50] but yeah, i checked and if all the names are ok, this should render just fine
[20:04:04] since name is a required field, I'm hesitant to make it handle this as a special case
[20:04:16] yeah
[20:04:18] fair
[20:17:11] New patchset: Milimetric; "create_all is safe after all" [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71111
[20:17:21] Change merged: Milimetric; [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71111
[20:29:31] heya drdee, milimetric
[20:29:39] hi Andrew
[20:29:40] what's up
[20:29:44] i was just thinking about webrequest-all-sampled-1000 deduplicating
[20:29:53] i think I can't fix that dataset
[20:29:54] i can deduplicate
[20:29:55] but.
[20:30:07] there will be ~2x data anyway
[20:30:10] i think
[20:30:12] yes
[20:30:16] because it is sampled
[20:30:44] it is unlikely that the duplicate lines in the multicast stream will both be selected by sampling
[20:30:52] drdee, does that sound right?
[20:37:58] the only way you can 'deduplicate' the sampled dataset is by discarding 50% at random, assuming there is twice as much data as expected
[20:38:29] right
[20:40:17] ja, twice as much data
[20:40:33] about 1G vs 500MB in an import
[20:43:58] New patchset: Erosen; "fixes celery issues; celery tests now run and pass" [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71116
[20:44:19] Change merged: Erosen; [analytics/wikimetrics] (master) - https://gerrit.wikimedia.org/r/71116
[21:23:35] guys i am oututtyytytytyty, i launched zero jobs to fill /wmf/data with may 22-27
[21:23:51] hopefully the .tsvs will get the coalesced data after they finish and the next daily job runs
[21:27:43] laterz
[22:22:23] milimetric; erosen: have a great weekend guys!
[22:22:35] drdee: thanks, you too!
[22:22:45] you too drdee
[22:22:47] :)
[22:22:52] later everyone
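[editor's note] The deduplication discussion above reaches a clear conclusion: because the 1:1000 sampling almost never selects both copies of a duplicated multicast line, the sampled dataset cannot be truly deduplicated; the only correction is discarding lines at random under the assumption that there is ~2x too much data. A minimal sketch of that random-discard idea; the function name, parameters, and line-list interface are hypothetical, not from the actual pipeline:

```python
import random

def thin_sampled_log(lines, excess_factor=2.0, seed=None):
    """Randomly keep roughly 1/excess_factor of the input lines.

    Line-level dedup is impossible here (a line's duplicate was
    almost certainly not sampled), so we instead discard lines at
    random, assuming the stream carries excess_factor times as much
    data as expected (~2x per the discussion above).
    """
    rng = random.Random(seed)  # seed makes the thinning reproducible
    keep_probability = 1.0 / excess_factor
    return [line for line in lines if rng.random() < keep_probability]
```

With `excess_factor=2.0` roughly half the lines survive, bringing a ~1G import back toward the expected ~500MB; note this restores the expected volume but not the exact set of lines a clean sample would have produced.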