[02:45:07] (PS2) Stefan.petrea: Implemented Survivor metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81421
[03:14:44] (CR) Milimetric: "(1 comment)" [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81421 (owner: Stefan.petrea)
[13:09:36] morning everybody
[13:12:32] Good morning :-)
[13:18:17] hi ! :)
[14:26:03] https://www.youtube.com/watch?v=NECvpNA97jU
[14:26:05] ottomata: ^^
[14:26:12] qchris: ^^
[14:26:14] milimetric: ^^
[14:26:53] What's that?
[14:27:28] qchris: it's a hangout about deployments/releases
[14:28:10] Ok.
[14:28:13] Thanks :-)
[14:34:42] np
[15:26:38] cool, thanks average, i'm in a meeting but will watch later
[15:56:42] are we doing the hangout nowish?
[16:04:06] anybody seen drdee today?
[16:05:59] ottomata: he briefly connected 3.5h ago
[16:06:10] ottomata: I mean 3h ago
[16:06:47] hm
[16:29:57] yoyo
[16:31:08] hey qchris, average, milimetric
[16:34:18] milimetric, around?
[16:39:37] Hi drdee
[16:39:44] heya
[16:40:10] ottomata was looking for you
[16:40:17] i am here :)
[16:40:17] Oh. Now he is not here :-)
[16:40:20] yup
[16:40:25] Hehe.
[16:40:40] There he is :-)
[17:11:14] (PS3) Stefan.petrea: Implemented Survivor metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81421
[17:12:02] (PS4) Stefan.petrea: Implemented Survivor metric [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81421
[17:14:45] milimetric: after scrum can we hangout to talk about the logic for the 701 card?
[17:25:26] qchris: now i know what the google doc of evan contains
[17:25:39] and i think you can drop that and use this as a replacement
[17:25:40] https://github.com/wikimedia/kraken/blob/master/kraken-generic/src/main/resources/country-codes.json
[17:30:06] qchris_away: ^^
[18:02:51] qchris: ^^
[18:38:06] drdee: Cool. Let me have a look :-D
[18:39:27] regardless we should drop the sep on the google doc
[18:39:43] and use the country-codes.json; maybe we might have to add a new field but that's okay
[18:41:10] qchris: what is the data repo for the gp.wmflabs.org dashboards? analytics/global-dev/dashboard-data ?
[18:41:22] (if so we need to rename that repo :) )
[18:42:41] Yes, currently, that's the repo.
[18:42:54] Why do we need to rename it?
[18:43:01] (Renaming is not possible)
[18:43:12] (We can only delete it and create a new one)
[18:44:04] we don't need to rename it but it's slightly confusing
[18:44:48] Yes, indeed.
[18:58:19] (PS1) Diederik: Name should be a string, not an array. [analytics/global-dev/dashboard-data] - https://gerrit.wikimedia.org/r/81547
[18:59:34] drdee: That repository ^ gets overwritten automatically by cronjobs
[18:59:40] (At least ideally ...)
[19:00:16] I'll fix that change's issue by hand for now
[19:00:20] yes i am aware
[19:00:50] but wanna make sure that is the fix for not being able to search datasources
[19:04:56] It did not fix the issue for me :-/
[19:12:35] did you run git pull on limn0 ?
[19:12:37] (CR) Milimetric: [C: 2 V: 2] Name should be a string, not an array. [analytics/global-dev/dashboard-data] - https://gerrit.wikimedia.org/r/81547 (owner: Diederik)
[19:14:18] Do not run git pull on limn0 :-)
[19:14:29] That will cause problems.
[19:14:33] why?
[19:14:45] (i won't run git pull)
[19:15:12] The issue is that we copied files over by hand as the repos did not match when regenerating the files
[19:15:43] If we pull, we'd kill the old local commit to that repo
[19:15:45] right
[19:15:47] i remember that now
[19:15:48] sorry
[19:15:58] np
[19:16:00] so drdee, you can just apply the same fix manually to the files there
[19:16:07] I applied the fix you suggested locally
[19:16:16] but it was not sufficient.
[19:16:32] It was needed in both datasources/grants_spending_by_country.json and graphs/grants_spending_by_country.json
[19:16:39] I just tested it.
[19:16:57] And doing the array -> string conversion in both files seems to fix the problem
[19:17:03] (At least search now seems to work)
[19:17:08] awesome
[19:17:18] i really could not find the cause in the script
[19:17:38] yes confirmed search is working again
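
A minimal sketch of the array -> string fix discussed above, assuming a Limn-style JSON definition; the field value shown is illustrative, not copied from the actual files:

    before (search broken):  "name": ["Grants spending by country"]
    after  (search works):   "name": "Grants spending by country"

Per the exchange at 19:16, the same one-line change was needed in both datasources/grants_spending_by_country.json and graphs/grants_spending_by_country.json.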
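A hedged sketch of how one might verify, before touching a working copy like the one on limn0, that it carries hand-made commits a pull could disturb (the branch name master is an assumption):

    # local commits that do not exist on the remote tracking branch
    git log --oneline origin/master..HEAD
    # uncommitted local edits
    git status --short
    # if either command shows anything, skip `git pull` and instead
    # apply fixes by editing the files in place, as suggested above
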
[19:35:56] average_ around?
[19:39:38] drdee: yeah
[19:39:47] pm'ed you
[19:39:55] looking
[19:46:19] qchris: regarding https://mingle.corp.wikimedia.org/projects/analytics/cards/1029
[19:46:28] where did you see these errors?
[19:46:34] can you point me to the log files?
[19:47:34] You are in a bug squashing mood today :-)
[19:47:46] Let me have a look
[19:48:55] yup
[19:50:44] drdee: the logs of the scripts do not seem to be written to disk
[19:51:06] So you'll have to run the commands by hand and pipe the output to files.
[19:51:15] ok
[19:51:23] can you give me the list of commands?
[19:53:26] See deploy_dashboard.sh in /home/erosen/src/geowiki/scripts
[19:53:37] They contain the relevant commands
[19:53:53] Basically all I know about it is in the card.
[19:54:17] Those scripts may or may not run for you
[19:54:40] They did not run for me, so I brought them over to my home directory and made them work during fire fighting
[19:54:59] If you have sudo powers that might help.
[19:57:41] k
[19:58:28] drdee: The card is not yet on the wall. Should I bring it in for you?
[19:58:38] no, i am doing the analysis
[19:58:42] i am not fixing it
[19:58:51] Oh. Ok.
[19:59:02] but want to add more details so we can fix it quicker
[20:45:44] filed an enhancement request to linkify mingle cards in gerrit -> https://bugzilla.wikimedia.org/show_bug.cgi?id=53499
[21:02:11] yo drdee
[21:02:19] hi
[21:02:22] you beat me by 1 card
[21:02:50] here's the hadoop job card -- https://mingle.corp.wikimedia.org/projects/analytics/cards/1114
[21:03:21] we realized that hadoop itself has benchmarks and running those would be easier than writing something ourselves
[21:03:28] plus they test all the components
[21:05:12] cool!
[21:05:19] you mean terasort?
[21:05:40] qchris: do you think the country-codes.json can replace evan's google doc?
[21:06:18] teragen (IO), sort (IO/MR), plus some name node tests that will be useful to have numbers for
[21:06:25] drdee: No idea :-)
[21:06:35] drdee: (I haven't looked at those parts, as different priorities have been forced.)
[21:06:38] we used this page: http://www.michael-noll.com/blog/2011/04/09/benchmarking-and-stress-testing-an-hadoop-cluster-with-terasort-testdfsio-nnbench-mrbench/
[21:07:08] drdee: But IIRC the file names for the different gcat use cases did not match exactly.
[21:07:32] yeah, michael noll has written some real nice blog posts about hadoop, but also avro and storm IIRC
[21:07:39] qchris: aight
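
A brief sketch for the "run the commands by hand and pipe the output to files" suggestion around 19:51, assuming the script path mentioned there; the log file name is illustrative:

    # run the deploy script manually, capturing both stdout and stderr
    cd /home/erosen/src/geowiki/scripts
    ./deploy_dashboard.sh > ~/deploy_dashboard.log 2>&1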
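A hedged sketch of the stock Hadoop benchmark invocations described in the Michael Noll post linked at 21:06; exact jar names and paths vary by Hadoop version, and the sizes and counts below are illustrative, not the team's chosen parameters:

    # TeraGen/TeraSort: generate ~10 GB of 100-byte rows, then sort them
    hadoop jar hadoop-*examples*.jar teragen 100000000 /benchmarks/teragen
    hadoop jar hadoop-*examples*.jar terasort /benchmarks/teragen /benchmarks/terasort
    # TestDFSIO: raw HDFS write/read throughput (file sizes in MB)
    hadoop jar hadoop-*test*.jar TestDFSIO -write -nrFiles 10 -fileSize 1000
    hadoop jar hadoop-*test*.jar TestDFSIO -read -nrFiles 10 -fileSize 1000
    # NNBench: stress the NameNode with many small file operations
    hadoop jar hadoop-*test*.jar nnbench -operation create_write -numberOfFiles 1000
    # MRBench: loop many small MapReduce jobs to measure job overhead
    hadoop jar hadoop-*test*.jar mrbench -numRuns 50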