[06:38:39] (PS1) Rfaulk: add - Aggregator class. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80955
[06:38:40] (PS1) Rfaulk: rm - thin_client view method. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80956
[06:38:41] (PS1) Rfaulk: add - regex filtering in all_requests view. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80957
[06:39:18] (CR) Rfaulk: [C: 2 V: 2] add - Aggregator class. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80955 (owner: Rfaulk)
[06:39:31] (CR) Rfaulk: [C: 2 V: 2] rm - thin_client view method. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80956 (owner: Rfaulk)
[06:39:42] (CR) Rfaulk: [C: 2 V: 2] add - regex filtering in all_requests view. [analytics/user-metrics] - https://gerrit.wikimedia.org/r/80957 (owner: Rfaulk)
[13:20:04] muuuuuuuuuurning
[13:23:00] yoyooyo
[13:30:59] Morning :-)
[13:32:03] ok emails checked! moving to cafe for coffee and bagel
[13:32:04] nom nom
[13:36:35] morning morning everyone
[13:37:50] YO!
[13:38:23] hey drdee, let's talk :)
[13:38:29] gingle and bingle went to do some coding
[13:38:30] aight
[14:06:56] milimetric: hey Dan, I tried to switch the sqlite dbs to in-memory
[14:07:34] milimetric: in wikimetrics. now tests run ~4-5 times faster but they fail :|
[14:07:50] that's a bit weird not sure why it's happening
[14:08:31] oh, I forget too, but I had the same problem
[14:08:33] also, since the db connection string for sqlite in-memory is "sqlite://" I suppose it would always use the same db (although there are 2 calls to create_engine)
[14:08:37] it's nothing to be worried about
[14:08:59] yeah, I'm not pushing this to gerrit, but it would be nice if I would get it to work
[14:09:02] that's it!
[14:09:12] it's because when it's in memory i think it's volatile
[14:09:16] and can be deleted when it's not used
[14:09:27] oh, is there a point when it's not used ?
[14:09:33] I mean when nose runs stuff..
[14:09:41] does it run everything in one-process ?
[14:09:49] or does it make multiple processes for each test ?
[14:10:07] i think when all the open sessions are closed, it might delete it
[14:10:13] i have no idea though
[14:10:20] that was just my theory back then
[14:10:26] so we made it persist and everything was fine
[14:10:31] you could write a test to verify :)
[14:10:36] but it's not high priority honestly
[14:10:49] how's the new metric working? were you able to write any other tests?
[14:11:05] oh, now I know what the difference is between your computer and mine
[14:11:07] I have an SSD
[14:11:13] that's why it's so much faster in memory
[14:11:21] for me it's almost the same in memory as on-disk
[14:11:30] make sure the computer you ask for has an SSD
[14:18:31] ottomata: when you puppetized the geowiki's feeding of data from one db to the other, did that puppetized part work when you set it up?
[14:23:23] i think so
[14:23:32] oh yeah you need stats sudo
[14:23:36] lemme see if I can make that happen
[14:24:31] That would be awesome :-)
[14:35:37] heya qchris, what in particular do you need to do as the stats user
[14:35:40] just crontab stuff?
[14:35:42] or other things?
[15:44:43] ottomata: Yes. Crontab stuff, and assuring by hand that things are/should be working
[15:45:42] ottomata: First thing will be to try and see where the scripts currently fail for the stats user. The database is too empty.
[15:48:08] Done :-)
[16:23:44] hey drdee, my mingle wip view is messed up
[16:23:57] how do I get back the columns?
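
A minimal sketch of the in-memory SQLite issue discussed above (14:06-14:11): with SQLAlchemy (which wikimetrics uses), the default pooling can discard an in-memory database once all of its connections are closed, and pinning a single connection with StaticPool keeps it alive for the whole test run. This is only an illustration of that idea, not the actual wikimetrics test configuration:

    from sqlalchemy import create_engine, text
    from sqlalchemy.pool import StaticPool

    # "sqlite://" with no path is an in-memory database; by default each new
    # connection gets its own empty database and the data disappears when the
    # last connection closes.  StaticPool reuses one underlying connection for
    # the whole engine, so schema and data persist across sessions/tests.
    engine = create_engine(
        "sqlite://",
        connect_args={"check_same_thread": False},  # allow use from test threads
        poolclass=StaticPool,
    )

    with engine.begin() as conn:
        conn.execute(text("CREATE TABLE t (x INTEGER)"))
        conn.execute(text("INSERT INTO t VALUES (1)"))

    # A later connection still sees the table, because the pool hands back the
    # same underlying SQLite connection instead of opening a fresh one.
    with engine.connect() as conn:
        print(conn.execute(text("SELECT x FROM t")).fetchall())
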
[16:24:36] click on the half circle arrow pointing left (in the tab) not the full circle
[16:25:11] ooo, weird
[16:25:13] danke
[16:27:39] np
[17:01:08] average, ottomata, scrum
[17:40:25] average: when do you think you can push the latest patch set for 704?
[17:52:40] yo DarTar!
[17:52:47] hey
[17:52:47] https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc
[17:53:09] I'm there in 1 min
[18:01:57] (PS3) Stefan.petrea: Added metric for card 704 [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/80659
[18:02:04] drdee: ^^ we're go for review & deploy
[18:02:09] milimetric: ^^
[18:02:16] milimetric: let's discuss in a separate hangout the deployment
[18:02:28] milimetric: (scrum hangout is taken for the moment)
[18:03:03] milimetric: flake8 is happy
[18:25:02] hey milimetric; does the json response contain a permalink to the job itself?
[18:31:24] (CR) Milimetric: [C: 2 V: 2] Added metric for card 704 [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/80659 (owner: Stefan.petrea)
[18:39:35] (PS1) Stefan.petrea: Date range constraint for Pages Created metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81026
[18:45:10] drdee: are you admin on github for github.com/wikimedia ?
[18:45:20] drdee: if so can you flip on the travis switch for wikimetrics ?
[18:49:05] try again
[18:55:33] (PS2) Stefan.petrea: Date range constraint for Pages Created metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81026
[18:58:08] (PS3) Stefan.petrea: Date range constraint for Pages Created metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81026
[18:59:01] (PS1) Milimetric: added start/end date to namespaces also, more tests [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81031
[18:59:11] (CR) Milimetric: [C: 2 V: 2] Date range constraint for Pages Created metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81026 (owner: Stefan.petrea)
[18:59:23] (CR) Milimetric: [C: 2 V: 2] added start/end date to namespaces also, more tests [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81031 (owner: Milimetric)
[19:01:53] drdee: please
[19:01:54] :)
[19:02:05] average: worked?
[19:02:13] drdee: it worked
[19:02:19] awesome!
[19:02:23] oh wait, you were saying about the travis switch
[19:02:28] lemme try now
[19:03:24] drdee: we're deploying atm
[19:03:35] AWESOME! Great job average!
[19:07:52] deployment finished?
[19:11:10] average: ^^
[19:12:29] drdee: deployment done
[19:12:47] drdee: we also added start_date/end_date for metrics also
[19:12:53] nice!
[19:15:46] drdee: what's next !! :)
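
The "Date range constraint for Pages Created" change and the start/end dates deployed above roughly amount to filtering page-creating revisions by timestamp. A hypothetical sketch only, using MediaWiki's revision-table column names; this is not the actual wikimetrics code, and the function and parameter names are invented for illustration:

    from sqlalchemy import func

    def pages_created_counts(session, Revision, user_ids, start_date, end_date):
        """Count pages created per user within [start_date, end_date).

        A page creation is approximated as a revision with no parent
        (rev_parent_id == 0), i.e. the first revision of a page in
        MediaWiki's revision table.
        """
        query = (
            session.query(Revision.rev_user, func.count(Revision.rev_id))
            .filter(Revision.rev_user.in_(user_ids))
            .filter(Revision.rev_parent_id == 0)
            .filter(Revision.rev_timestamp >= start_date)  # start of the date range
            .filter(Revision.rev_timestamp < end_date)     # end of the date range
            .group_by(Revision.rev_user)
        )
        return dict(query.all())
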
[19:16:10] you can always check
[19:16:10] https://mingle.corp.wikimedia.org/projects/analytics/cards/grid?aggregate_property%5Bcolumn%5D=estimate&aggregate_type%5Bcolumn%5D=sum&color_by=project+name&filters%5B%5D=%5BType%5D%5Bis%5D%5BFeature%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BDefect%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BInfrastructure+Task%5D&filters%5B%5D=%5BRelease+Schedule+-+Release%5D%5Bis%5D%5B%28Current+Release%29%5D&filters%5B%5D=%5BRelease+Schedule+-+Sprint%5D%5Bis%5D%5B%28
[19:16:11] ent+Sprint%29%5D&group_by%5Blane%5D=development+status&group_by%5Brow%5D=class+of+service&lanes=Ready+for+Dev%2CCoding+and+Testing%2CSign-off%2CShipping%2CShowcasing%2CDone&tab=WIP+-+Features
[19:16:20] (WIP Features in mingle)
[19:16:26] and check the 'Ready for Dev' column
[19:16:33] there you will see card 1089
[19:16:44] please brain bounce with dan on how to do that
[19:16:48] i also have some thoughts
[19:16:53] milimetric: ^^
[19:17:26] average: ^^
[19:33:11] dudes, i'm taking hadoop offline!
[19:33:23] just reinstalling the base now, leslie and I are going to reinstall hadoop itself tomorrow
[19:35:30] ok drdee, average is up to speed with most of the codebase :)
[19:35:39] drdee , milimetric thanks :)
[19:35:55] brb
[19:36:10] you know guys, I'm really proud of wikimetrics having only 1400 or so meaningful python lines
[19:36:13] that's Really small
[19:47:56] cool ottomata!
[20:25:14] drdee: who owns your udp2log stuff from an operations point of view?
[20:25:37] ottomata
[20:26:03] ottomata: there are a lot of packets being dropped (or perhaps just with error but still being processed) on gadolinium (and the udp2log specific metrics are not being reported)
[20:28:10] ottomata: I also don't have good numbers on it yet; but comparing fundraisings banner impression numbers to the official page view stats results pre/post our move to gadolinium shows much greater daily variation post move
[20:28:19] which seems to me like we're seeing packet loss
[20:39:38] hey ottomata, two qs:
[20:39:51] 1) are you OK getting EventLogging alerts? (they should be few and far in between)
[20:39:57] 2) If so, can you review https://gerrit.wikimedia.org/r/#/c/81121/ ?
[20:39:57] sure!
[20:40:07] cool cool :)
[20:40:08] thanks
[20:40:25] (I'll innocently interpret the "sure!" as scoped to both 1 & 2 :P)
[20:53:31] ha totally, might not get to #2 today though :)
[20:53:51] oh this looks simple
[20:53:53] jaja reviewing
[20:55:40] ori-l, don't you need a new entry in checkcommands.cfg.erb?
[20:55:41] not sure
[20:56:21] ottomata: oh, probably. let me update the patch.
[20:56:23] ah no, it si nrpe
[20:56:24] hm
[20:56:44] i don't know how the nrpe/icinga puppet stuff works very well
[20:57:19] ah, yes, I think you do
[20:57:24] check_command => "nrpe_check!check_${title}",
[20:57:36] ori-l, i gotta run, but should be back on shortly
[20:57:47] oh i have one comment
[20:57:48] will add
[20:58:49] k ya, back in a little bit
[21:06:14] drdee, milimetric: a quick question on cohort upload in Wikimetrics
[21:06:21] yes
[21:06:25] real quick
[21:06:33] i'm around drdee, I got it
[21:06:39] i am teasing
[21:06:49] I'm creating a test cohort that I'd like to use for QA'ing individual reports between UserMetrics and WikiMetrics
[21:07:20] I want to specify three users in 3 projects
[21:07:33] so I added 9 rows in the CSV as per the cohort upload format specs
[21:07:50] arg damn, sorry, I need to run to a meeting
[21:07:52] bbl
[21:08:08] how's that for a cliffhanger?
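
For the cohort-upload question above: three users in three projects gives one row per user/project pair, nine rows in all. A hypothetical illustration, assuming the cohort CSV takes "username, project" rows as described; the usernames and project database names below are invented, not the real QA cohort:

    import csv

    # Hypothetical cohort members: 3 users x 3 projects = 9 rows.
    users = ["ExampleUserA", "ExampleUserB", "ExampleUserC"]
    projects = ["enwiki", "dewiki", "frwiki"]

    with open("qa_cohort.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for user in users:
            for project in projects:
                writer.writerow([user, project])  # one "username, project" row per pair
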
[21:08:12] :)
[21:08:16] i'm like and
[21:08:19] AND?!
[21:08:25] AND?????????
[21:08:31] come on DarTar
[21:08:31] he'll be back
[21:08:34] they're always back!
[21:08:39] you can't just walk away like that
[21:08:54] like a combine invasion
[21:09:03] not a combine invasion, a zombie invasion
[21:23:10] ottomata: updated the patch
[21:26:36] ok cool
[21:26:57] ori-l, doing that later is fine with me, but i've had annoyances with trying to change checks in icinga/nagios before
[21:27:25] maybe I just don't know the proper way, but obsoleting ones you don't want anymore has been annoying for me sometimes
[21:28:19] ottomata: i'd probably keep it as-is, then. it'd make sense to make the consumer checks independent, but the other instance types compose a single logical data processing pipeline
[21:28:49] hmm ok
[21:28:57] hokay!
[21:29:04] want me to merge?
[21:29:04] halright!
[21:29:08] NO DON'T!
[21:29:11] k
[21:29:11] um, sure.
[21:29:12] oh
[21:29:13] ha
[21:29:14] uhhh
[21:29:16] hokay!
[21:29:25] haproxy.
[21:30:14] ha
[21:30:26] hadoop!
[21:30:39] HAnamenode
[21:30:50] i'm all out
[23:16:08] (PS1) Milimetric: test coverage is now over 90 percent [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81153
[23:16:22] (CR) Milimetric: [C: 2 V: 2] test coverage is now over 90 percent [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81153 (owner: Milimetric)
[23:33:38] milimetric: that's a wow
[23:33:52] >90%
[23:33:54] great !