[13:03:11] (PS1) Milimetric: Merge branch 'release/October2013' [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87070
[13:03:36] (CR) Milimetric: [C: 2 V: 2] october update [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87069 (owner: Milimetric)
[13:03:51] (CR) Milimetric: [C: 2 V: 2] Merge branch 'release/October2013' [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87070 (owner: Milimetric)
[14:01:55] morning guys
[14:02:15] ooooooootoooooomaaataaaa can i ask you for a small favor?
[14:02:59] ottomata: ^
[14:03:00] :P
[14:03:15] yoyooy
[14:03:17] whaaasup?
[14:07:15] qchris: can you update your mingle cards for today's sprint showcase?
[14:07:20] average: can you update your mingle cards for today's sprint showcase?
[14:07:25] ottomata: can you update your mingle cards for today's sprint showcase?
[14:07:32] milimetric: can you update your mingle cards for today's sprint showcase?
[14:07:32] yes
[14:07:36] ty
[14:07:38] :)
[14:07:39] drdee sure.
[14:07:43] ty qchris
[14:08:15] average: when should the new report for http://stats.wikimedia.org/wikimedia/squids/TablesPageViewsMonthlySquidsMobile.htm be ready?
[14:08:36] he's not on irc drdee
[14:08:46] :(
[14:09:45] why why why?
[14:09:46] You must have the wrong version
[14:10:12] the wrong irc version hahah
[14:15:22] why? what?
[14:15:23] I haven't had the chance to run that code yet
[14:19:38] you see, CommanderData's responses are always eerie in their applicability
[14:20:45] milimetric: ok if i put #1109 in showcasing?
[14:20:45] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1109
[14:20:57] and #701 as well?
[14:20:57] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/701
[14:21:10] 1109 yes, but it's not very "showcase-worthy"
[14:21:20] sure ;)
[14:21:25] and I'm not sure about 701, since average isn't around
[14:21:30] we might not actually showcase it :)
[14:21:34] yeahhhhhhh :(
[14:21:41] I told him to push whatever progress he got and that I'd finish it up this morning
[14:21:53] but I don't see a new patchset so I'm hesitant to finish it
[14:22:01] bummer
[14:22:12] i could finish it I guess, and deal with whatever merge may come...
[14:22:13] but only the unit-tests were missing, right?
[14:22:27] yeah, we got the logic fixed up last night I believe
[14:22:46] hm, maybe I'll just do that
[14:23:07] qchris: what shall we do with #1137
[14:23:07] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1137
[14:23:08] i'll finish reading my email and see if he signs on. If not I'll finish it and deploy it
[14:23:12] k
[14:23:29] drdee: I am just working on it :-)
[14:23:47] I think it'll be in Done in 10-15 minutes.
[14:23:57] awesome, you rock!
[14:24:11] milimetric: #1180; did you notify EM?
[14:24:35] haven't pushed that live drdee
[14:24:41] ottomata, #1124 & #1152: what's the verdict?
[14:24:41] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1124
[14:24:41] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1152
[14:24:45] milimetric, k
[14:25:09] i'll push it since no concerns were raised
[14:27:05] so! milimetric
[14:27:06] k
[14:27:08] i got that data loaded
[14:27:10] and a hive table on it
[14:27:11] but!
[14:27:15] no but's !
[14:27:15] cool
[14:27:27] i had been using a slightly outdated version of varnishkafka, which did not automatically default num types to 0
[14:27:27] how much data did you load?
[14:27:29] the plural of but is buts
[14:27:36] but's is something belonging to a but
[14:27:40] magnus fixed this in a newer version
[14:27:55] in the version I have installed I have to specify the default when a value is not given
[14:28:02] ok
[14:28:07] ty milimetric
[14:28:08] i had seen a bad integer value for the response_size type
[14:28:12] so I had defaulted that to 0
[14:28:21] buts……
[14:28:31] but i had never seen a bad value for sequence or for time_firstbyte
[14:28:37] so I didn't default them to 0 explicitly
[14:28:47] and apparently there are times where varnish doesn't set those?
[14:29:03] so there are bad integer values in the data
[14:29:07] which means hive can't run queries
[14:29:14] SO aggghhhh
[14:29:32] so i've just explicitly defaulted the values to 0 on cp3003
[14:29:43] and turned off varnishkafka on cp1048 (to make looking for missing seqs easier)
[14:29:50] and i'll have to collect more data before we try again
[14:29:57] can we make this a faster failure-cycle
[14:30:05] where we import 1 hour of data
[14:30:08] try to run the hive query
[14:30:12] yeah we can do that
[14:30:19] fix import if necessary
[14:30:23] and then import more data
[14:30:25] yeah sure
[14:30:35] k
[14:31:11] ok so the path to varnishkafka deployment looks like:
[14:31:27] - verify validity of data that gets imported
[14:31:40] - pipe mobile stream into main udp2log stream
[14:31:54] - enable varnishkafka on mobile varnishes
[14:32:05] - install camus on kraken
[14:32:11] is that correct?
[14:32:14] and what's missing?
[14:32:53] the "install camus on kraken" can happen in parallel, but yeah
[14:33:05] sure :)
[14:33:05] i used camus to import all the data from last week, and it worked great!
[14:33:16] but, we need to do it better than the way I am
[14:33:20] git deploy kraken,
[14:33:24] have a camus target in kraken
[14:33:25] etc.
[14:33:28] ok
[14:33:42] drdee, did you see this?
[14:33:51] (PS1) Milimetric: manually updated [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87138
[14:33:52] (PS1) Milimetric: Merge branch 'hotfix/h1' [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87139
[14:33:53] install camus on kraken includes maven setup
[14:33:53] https://gerrit.wikimedia.org/r/#/c/86894/
[14:33:58] yes
[14:34:09] (CR) Milimetric: [C: 2 V: 2] manually updated [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87138 (owner: Milimetric)
[14:34:16] (CR) Milimetric: [C: 2 V: 2] Merge branch 'hotfix/h1' [analytics/reportcard/data] - https://gerrit.wikimedia.org/r/87139 (owner: Milimetric)
[14:34:55] hadn't seen it ottomata but's awesome!
[14:35:33] i'm worried that it won't be very efficient, we'll see
[14:37:26] #1074
[14:37:26] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1074
[14:39:06] qchris, milimetric: can we quickly talk in the bat cave about what we will showcase today?
[14:39:20] sure, coming
[14:39:23] Sure. Booting the machine.
[14:39:30] ottomata: do you want to demo something?
[14:39:48] hmm
[14:39:55] dunno
[14:40:10] brb, moving to cafe
[14:48:14] back
[14:48:33] so drdee, what would I demo?
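
An aside on the import problem described above (14:27-14:29): the varnishkafka build on cp3003 left numeric fields empty when Varnish did not set them, so bad integer values for sequence and time_firstbyte ended up in the data and Hive could not run queries over it. Below is a minimal Python sketch of the kind of sanitizing pass one could run over a tab-separated dump before reloading it; the column positions, the tab delimiter, and the script name are illustrative assumptions, not the actual webrequest schema.

    #!/usr/bin/env python
    """Default unparseable numeric fields to 0 in a tab-separated request log.

    Hypothetical sketch: the column positions below are assumptions, not the
    real webrequest layout.
    """
    import sys

    # Assumed 0-based positions of the numeric columns mentioned in the chat:
    # sequence and response_size as ints, time_firstbyte as a float.
    NUMERIC_FIELDS = {1: int, 8: float, 10: int}

    def sanitize(line, delimiter='\t'):
        fields = line.rstrip('\n').split(delimiter)
        for pos, cast in NUMERIC_FIELDS.items():
            if pos >= len(fields):
                continue
            try:
                cast(fields[pos])
            except ValueError:
                # Mirror what newer varnishkafka does: default num types to 0.
                fields[pos] = '0'
        return delimiter.join(fields)

    if __name__ == '__main__':
        for raw in sys.stdin:
            print(sanitize(raw))

Usage would be something like: python sanitize_numeric.py < webrequest.tsv > webrequest_clean.tsv (file names hypothetical). A newer varnishkafka makes this unnecessary by defaulting num types to 0 at the source, which is effectively what ottomata configured explicitly on cp3003.
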
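
An aside on the missing-sequence check behind the faster import/verify cycle discussed above (14:29-14:30) and card #1124: the real check runs as a Hive query over the imported data, but the idea is simple enough to sketch in Python for a small sample. Column positions are again assumptions; each host's varnishkafka sequence counter should be gapless, so the number of missing requests is the span of the counter minus the number of distinct values seen.

    """Count missing varnishkafka sequence numbers per host.

    Illustrative only: assumes column 0 is the hostname and column 1 the
    per-host sequence counter, which may not match the real layout.
    """
    import sys
    from collections import defaultdict

    HOST_COL, SEQ_COL = 0, 1  # assumed positions

    def find_missing(lines, delimiter='\t'):
        seqs = defaultdict(set)
        for line in lines:
            fields = line.rstrip('\n').split(delimiter)
            try:
                seqs[fields[HOST_COL]].add(int(fields[SEQ_COL]))
            except (IndexError, ValueError):
                continue  # unparseable rows are skipped
        # missing = span the counter should have covered minus what we saw
        return {host: (max(s) - min(s) + 1) - len(s) for host, s in seqs.items()}

    if __name__ == '__main__':
        for host, missing in sorted(find_missing(sys.stdin).items()):
            print('%s\t%d missing' % (host, missing))

Over the full dataset, a Hive query could compute the same per-host comparison of max(sequence) - min(sequence) + 1 against count(distinct sequence).
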
[14:48:43] i dunno; asking you ;)
[14:48:46] hm
[14:48:54] naw, not really, i guess
[14:59:09] ottomata, what's the status of #1124 and #1152
[14:59:09] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1124
[14:59:09] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/1152
[15:00:03] why?
[15:00:03] The third party documentation doesn't exist
[15:00:09] heh
[15:00:09] ok
[15:01:23] ottomata: ^^
[15:02:38] drdee, 1124, still want to check for missing seqs over long term of data
[15:02:42] that's why i'm doing the hive stuff
[15:02:43] That error means it was successful
[15:03:00] ok
[15:03:04] for kafka monitoring…hmm, i can repuppetize the prod kafka brokers now
[15:03:10] been meaning to merge something anyway
[16:24:50] (PS1) Erik Zachte: update dump server location [analytics/wikistats] - https://gerrit.wikimedia.org/r/87160
[16:34:10] drdee: hi
[16:34:21] surprisingly the reports magically updated and they're up-to-date
[16:34:31] september shows up now
[16:34:53] but I didn't have any way to predict that
[16:35:07] 701 is not finished, there are still some tests left to be done
[16:35:12] not ready for showcase
[16:44:34] average: are you sure we cannot showcase it?
[16:44:46] i mean we don't showcase the unit-tests :)
[16:44:59] i understand that some work is left to be done
[16:45:13] but if we can demo it from an end-user pov, then let's do it
[16:48:06] I need data in the demo controller to be able to showcase it
[16:48:27] and have to try it out before the showcase to make sure it works
[16:48:44] and we have to deploy it too
[16:48:44] please do that, the showcase is in 1 hour
[16:48:58] milimetric can you chime in?
[16:49:29] sure
[16:49:42] showcasing of #701
[16:49:42] drdee hopes that someone will have a look at https://mingle.corp.wikimedia.org/projects/analytics/cards/701
[16:49:42] I thought the demo was in 10 minutes
[16:49:52] 1 hour 10 minutes
[16:49:53] that's why I said that
[16:49:54] You can't use that version on your system
[16:49:55] it's close
[16:49:59] it was standup in 10 minutes
[16:50:03] no standup
[16:50:10] ok
[16:50:14] on sprint demo days, there are no standups
[16:50:19] ok we have time I think
[16:50:39] so how many cases are left untested?
[16:50:43] it's hard to schedule those exceptions with google calendar
[16:51:21] brb
[16:52:45] oh man i hope hope hope my internet does ok
[16:52:59] do you have the mifi ottomata?
[16:54:22] yup
[16:54:25] will fall back to it
[17:08:17] brb
[17:10:34] (CR) Milimetric: [C: 2 V: 2] "Merging this out of necessity, it's not yet ready" [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/86878 (owner: Stefan.petrea)
[17:31:05] b
[17:31:16] bb
[17:31:23] bbb
[17:31:34] you guys sound like my daughter
[17:31:49] :-D
[17:32:13] this IRC room feels very homey :)
[17:33:22] homely
[17:33:25] wait
[17:33:28] no homey :)
[17:35:27] The German/English dictionary has homey :-) : http://dict.leo.org/?search=homey#centerColumn
[17:48:40] (PS1) Milimetric: Merge "Adding censored property" [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/87173
[17:48:41] (PS1) Milimetric: survivors cleanup 1 [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/87174
[17:50:03] (PS1) Milimetric: Merge branch 'master' of ssh://gerrit.wikimedia.org:29418/analytics/wikimetrics [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/87176
[20:10:48] fyi qchris.
[20:10:53] it was an iptables problem on the consumer
[20:10:56] weird that I didn't see more errors
[20:11:04] but the consumer couldn't connect to one of the brokers
[20:11:11] which meant it only got half of the messages
[20:11:15] about half
[20:11:19] phew!
[20:12:38] Phew! :-)
[20:12:41] Yes indeed.
[20:12:48] Good to know it was just iptables.
[20:13:34] Strange though that no error went into the logs.
[20:15:08] yeah very strange, i'm also unsure as to what is wrong with my iptables rules
[20:15:13] especially since this was working the other day
[20:15:48] Are they simple enough that a non-op would understand them?
[20:16:06] ja
[20:16:06] https://gist.github.com/ottomata/6799866
[20:16:20] Interesting :-) Danke.
[20:17:16] refresh, updated with comments
[20:19:24] milimetric do you have the link for the expense report meeting?
[20:21:29] ottomata: sounds interesting ... I am curious what the final solution will be :-)
[20:27:22] milimetric: ?
[20:27:27] milimetric: you there ?
[20:29:37] "final solution"
[20:30:29] "final solution" as in "which adjustment makes it work again"
[20:35:17] yeah dunno what's wrong with it
[20:35:23] if i flush the rules it works fine
[20:36:08] Oh. So it's just an IP that's missing?
[20:39:54] dunno, shouldn't be
[20:42:31] So the packet count for the reject rule is 0? Strange :-/
[20:45:52] ?
[21:53:40] ottomata: hey Andrew
[21:53:53] ottomata: can I have a stream that webstatscollector has, to play around with it on stats1002
[21:55:40] *stat1002
[21:57:03] ok, running home, back in a bit
[22:41:14] (PS1) Milimetric: fixing query performance [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/87280
[22:41:25] (CR) Milimetric: [C: 2 V: 2] fixing query performance [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/87280 (owner: Milimetric)
[22:44:33] nice work milimetric
[22:44:52] nice work?
[22:45:03] true story :)
[22:45:09] stefan and I are working on this together
[22:45:14] you mean the query?
[22:45:16] it doesn't work yet :)
[23:00:21] i am just the cheerleader ;)
[23:00:34] and whenever i see a commit i cheer!
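
A closing aside on the iptables issue above (20:10 onwards): the consumer could reach only some of the Kafka brokers, so whatever was hosted on the broker it could not connect to was silently never fetched, which is why roughly half of the messages went missing without an obvious error in the logs. A blunt way to catch that failure mode is a plain TCP connect test from the consumer host to every broker; the broker host names in the sketch are placeholders, and 9092 is only Kafka's usual default port, not necessarily what this cluster uses.

    """Check that this host can open a TCP connection to every Kafka broker.

    The broker list and port are placeholders; substitute the real brokers.
    """
    import socket

    BROKERS = [('broker1.example.org', 9092),  # hypothetical host names;
               ('broker2.example.org', 9092)]  # 9092 is Kafka's default port

    def unreachable_brokers(brokers, timeout=3.0):
        bad = []
        for host, port in brokers:
            try:
                socket.create_connection((host, port), timeout).close()
            except (socket.error, socket.timeout):
                bad.append((host, port))  # likely firewalled, down, or misrouted
        return bad

    if __name__ == '__main__':
        bad = unreachable_brokers(BROKERS)
        if bad:
            print('unreachable brokers: %s' % (bad,))
        else:
            print('all brokers reachable')

Run from the consumer host, this separates a firewall or routing problem from a consumer-side bug in a few seconds; once rules like the ones in ottomata's gist are fixed or flushed, the check should report every broker as reachable.
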