[01:29:16] ottomata, milimetric, dschoon: I know you're busy, but do any of you know about changes to the supervisor node setup?
[01:29:29] no worries if you can't respond
[01:29:37] what about it?
[01:30:02] global-dev.wmflabs.org is failing agt 7+ express = require('express');
[01:30:38] s/agt/at:/
[01:30:52] Error: Cannot find module 'express'
[01:55:43] two problem left
[01:55:49] *Two problems left
[01:55:52] BV and IO
[01:56:18] according to http://en.wikipedia.org/wiki/ISO_3166-1
[01:56:20] these are
[01:56:24] Bouvet Island
[01:56:25] and
[01:56:34] British Indian Ocean Territory
[01:56:59] RegionCodes.csv show them as being in the south so they are counted
[01:57:05] in the Global South
[01:57:24] so the invariant is now -4 instead of being 0 because I still have these two regions left
[01:57:38] I'd need to talk to Erik now but I'm not sure whether he's available, he said there was a meeting
[01:57:47] probably the one mentioned above ^^
[01:58:04] average_drifter: what are you guys using to count something as global south?
[01:58:46] average_drifter: BV and IO --> unknown
[01:58:49] don't worry about it
[01:59:03] drdee: unknown as in X as in nowhere
[01:59:06] drdee: right ?
[01:59:09] XX
[02:03:24] https://gerrit.wikimedia.org/r/38483
[02:03:58] looks like there's a problem
[02:04:01] with the test
[02:04:04] it didn't find outputdir
[02:04:07] gotta look on it
[02:04:51] trying again
[02:05:08] ALL GREEN
[02:05:09] passing
[02:05:42] drdee: the invariant also includes ipv6 because these are unknown also
[02:05:56] drdee: https://integration.mediawiki.org/ci/job/wikistats/33/console
[02:06:21] my $country_code_invariant_1 = $world_total - $global_north - $global_south - $ipv6 - $unknown;
[02:06:26] this is supposed to be 0
[02:06:28] and it is now
[02:07:37] AWESOM!
[02:09:01] :)
[07:44:35] hashar: morning Antoine ! :)
[07:44:40] drdee: morning Diederik !
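The invariant quoted above in Perl is plain arithmetic, and can be sketched in shell. The totals below are made-up numbers; in wikistats they come from the per-country request counts and the RegionCodes.csv classification:

```shell
#!/bin/sh
# Hypothetical totals standing in for the real wikistats counts.
world_total=1000
global_north=600
global_south=350
ipv6=30
unknown=20   # XX codes; BV and IO land here once reclassified as unknown
# Mirrors: $world_total - $global_north - $global_south - $ipv6 - $unknown
invariant=$(( world_total - global_north - global_south - ipv6 - unknown ))
echo "invariant=$invariant"   # prints invariant=0 when every request is classified
```

A nonzero value (like the -4 mentioned above) means some country codes are being counted in neither bucket, or in both.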
:)
[07:44:49] hey
[07:46:05] hello
[07:48:16] average_drifter: there :-D so others can follow the discussion :D
[09:07:58] average_drifter: ping around ? :)
[09:08:29] I got all existing jenkins jobs migrated to https://integration.mediawiki.org/ci/view/Analytics/ :-D
[09:08:37] but webstatscollector does not work :)
[09:09:18] https://integration.mediawiki.org/ci/view/Analytics/job/analytics-webstatscollector/4/console
[09:09:23] it is missing config.h.in :/
[09:09:46] the jobs are generated from a yaml definition which is at https://gerrit.wikimedia.org/r/#/c/38497/1/analytics.yaml,unified
[09:09:57] I basically copy pasted from the old jobs but may have missed something
[09:10:10] hashar: well how come the webstatscollector job works and the analytics-webstatscollector does not ?
[09:10:20] hashar: if the first works, the latter should work also
[09:10:29] I will double check the commands so
[09:10:30] hashar: they must be doing different things
[09:10:34] please do
[09:10:52] also I am wondering, isn't make clean && make enough ?
[09:10:58] I would expect make clean to delete those files
[09:10:58] no
[09:11:04] and make to autoreconf / auto something for us
[09:11:15] unfortunately no. all the commands there are necessary
[09:11:38] we want to clean up and completely rebuild after each gerrit changeset
[09:11:51] because we may have modified some Makefile.am and stuff
[09:12:00] well the workspace is wiped out between each changes iirc
[09:12:07] so we scratch everything and make a clean new build each time
[09:12:08] will have to verify that
[09:25:11] average_drifter: so the old jenkins job has a config.h.in which is not in the repository
[09:25:22] average_drifter: it got generated at some point and has not been deleted
[09:25:42] average_drifter: should it be added to the repo .
[09:25:43] ?
[09:26:02] /* config.h.in. Generated from configure.ac by autoheader.
*/
[09:26:03] ahh
[09:26:08] maybe we need autoheader
[09:28:00] apparently autoreconf generates it
[09:29:41] average_drifter: can we replace autoconf && automake --add-missing by "autoreconf" ?
[09:29:46] that generates the missing config.h.in
[09:30:13] autoreconf runs autoconf, autoheader, aclocal, automake, libtoolize
[10:01:26] no it should be deleted
[10:01:32] any file generated should be deleted
[10:02:26] oh ? autoreconf runs those ?
[10:02:47] try it on 3 successive builds and if they work, fine by me
[10:03:12] hashar: can you keep a list of the commands in case stuff happens ? do not delete the old jobs yet.. leave them around for a while
[10:03:49] yup autoreconf should handle everything for you
[10:03:58] I am trying to figure out how to make it run libtoolize
[10:04:02] currently: $ autoreconf -i
[10:04:02] aclocal: couldn't open directory `m4': No such file or directory
[10:04:03] autoreconf: aclocal failed with exit status: 1
[10:06:34] hashar: you want to optimize the build-time ? :D
[10:06:49] I want to optimize the number of commands required to build :-]
[10:07:07] libtoolize && autoreconf --verbose --make
[10:07:09] that would be fine
[10:09:04] cool
[10:09:27] hashar: did you keep the first line ? "rm -rf Makefile Makefile.in configure m4/ aclocal.m4 Makefile.am.new"
[10:10:09] nop
[10:10:28] then how can you be sure your commands actually made a fresh new clean build ?
[10:10:45] faith ?
[10:10:46] :)
[10:10:50] the workspace is cleaned out between each build
[10:10:57] and recloned from the git repository
[10:11:03] going to check that Right now :)
[10:11:06] hashar: does jenkins guarantee that ? I didn't know
[10:11:46] yes; it'll actually complain if it can't clean up its workspace
[10:12:08] I think the build times would be much higher if that were actually true...
[10:12:21] since if you had a big repo, it would have to re-pull it or something...
[10:12:27] maybe it keeps copies somewhere ?
dunno
[10:12:39] I can confirm that the workspace directory is deleted :)
[10:12:45] great ! :)
[10:12:49] then all is good
[10:12:56] hashar: thanks ! :)
[10:12:56] but then I get some more failures :/
[10:12:57] https://integration.mediawiki.org/ci/job/analytics-webstatscollector/5/console
[10:13:08] hrm: https://wiki.jenkins-ci.org/display/JENKINS/Workspace+Cleanup+Plugin so maybe that's not the default behavior
[10:13:15] ahh automake --add-missing
[10:13:16] funn
[10:13:19] :)
[10:13:34] now I am wondering how to ask autoreconf to run automake with that option :-
[10:13:52] the autotools are a beauty
[10:14:05] I struggled with them for some time
[10:14:06] they are a huge pile of shitty old software that nobody should use anymore
[10:14:09] (end of rant)
[10:14:18] =))
[10:14:24] you should use scons
[10:14:25] o
[10:14:29] almost everyone uses them but I do agree that they're a pile of shitty tools
[10:14:34] or some other modern building tool
[10:15:09] hashar: I would like to use scons TBH. but then all the people who want to build stuff and work on it would have to learn scons also
[10:15:18] hashar: is a Makefile easy to write with scons ?
[10:16:05] hashar: anyone in WMF uses scons on a particular project ?
[10:16:26] if not.. I could break the ice :D
[10:16:28] ah autoreconf --install will run --add-missing
[10:18:09] yahhhheeaa
[10:18:10] https://integration.mediawiki.org/ci/job/analytics-webstatscollector/6/console
[10:18:14] that works :-]
[10:18:15] niiiiice :)
[10:18:56] I like that you did that submodule --iniut
[10:18:57] so yeah we just need libtoolize && autoreconf --verbose --install --force --make
[10:18:59] --init
[10:19:02] (on webstats collector)
[10:19:06] I've been meaning to do that myself...
[10:19:12] so yeah
[10:19:20] the jenkins git plugin is supposed to update the submodule for us
[10:19:31] but due to a "feature" that does not work in that particular setup
[10:19:40] I have to update them manually :/
[10:19:50] aka I configured Git plugin to NOT process any submodule
[10:19:56] why is that ?
[10:20:13] the clone is made from a non bare git repository which has the change submitted
[10:20:30] in such a case, the git plugin will happily rewrite the submodules URL to point to the non bare repository
[10:20:39] so if the non bare repo is in /git/repo
[10:21:00] the debianize submodule URL is rewritten from http://github.com/.../debianize to /git/repo/debianize
[10:21:04] which does not exist :-]
[10:21:26] long story short, we have to run the commands ourselves
[10:21:49] this ZUUL, I see there are a lot of parameters there
[10:21:58] as a developer, I would like things to be simple
[10:22:05] yup they are environment variables
[10:22:28] how easy is it for me to create a new job on jenkins ? do you have a template please ?
[10:22:34] for devs the idea is to migrate everyone to Jenkins Job Builder. Still have to figure out / write doc for you to use it
[10:23:08] the repository is integration/jenkins-job-builder-config.git
[10:23:17] that holds some YAML files to create the jobs
[10:23:32] I will make it so that Jenkins self-updates whenever a change is merged in that repository
[10:23:36] still need a bit more of integration first
[10:24:01] ok
[10:24:18] so who does the actual build now ? autoreconf ?
[10:24:25] cause I see no "make" in there
[10:27:21] hashar: why does it say "Reported result: ERROR" here ==> https://integration.mediawiki.org/ci/job/analytics-wikistats/2/ ?
[10:27:29] hashar: because the tests in that Pass
[10:27:50] ah
[10:28:08] I will fix that
[10:28:13] jenkins attempts to merge the change
[10:29:24] hashar: also, could you please answer my question about what a "trusted developer" is ?
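The URL the plugin rewrites lives in the checkout's .gitmodules file, which is why the jobs disable plugin-side submodule handling and run the update themselves. A small sketch of where that entry sits; the temporary file is invented for the example, while the debianize URL matches the wmf-analytics repository mentioned later in the log:

```shell
#!/bin/sh
set -e
# Write a .gitmodules like webstatscollector's (content invented for the example).
work=$(mktemp -d)
cd "$work"
cat > .gitmodules <<'EOF'
[submodule "debianize"]
    path = debianize
    url = https://github.com/wmf-analytics/debianize
EOF
# This is the value the plugin would rewrite to /git/repo/debianize:
git config -f .gitmodules --get submodule.debianize.url
# With the plugin hands-off, the job runs the update explicitly instead:
#   git submodule update --init
```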
[10:29:48] and beyond the security reasons, is there any other reason why ZUUL would improve upon a basic jenkins ?
[10:30:19] yup
[10:30:23] that is written in python :-]
[10:30:30] so it is easier to hack for me than a java plugin
[10:30:31] I'm a big fan of minimalism and over-engineering is a bit orthogonal with my ideas ( http://suckless.org/philosophy )
[10:30:32] annnd
[10:30:41] it is still actively maintained / reviewed etc
[10:31:05] hashar: I understand
[10:31:18] so you chose it because you could extend it
[10:31:25] hashar: so what was the java plugin which bothered you ?
[10:34:37] average_drifter: Gerrit Trigger Plugin
[10:49:38] average_drifter: I am disabling the old jobs :-]
[10:49:53] the new ones are all working :-D
[10:50:15] I will disable jenkins job builder for them so you can easily alter / tweak the jobs as you want
[10:50:41] alright
[10:50:45] I hope I won't have problems
[10:56:38] average_drifter: sent a mail to most people
[10:56:51] average_drifter: I have listed the Changes I have used to verify all jobs are running fine
[10:57:08] now I need to disable the analytics jobs in Jenkins job builders :-]
[10:57:16] so you can manually update jobs whenever you want
[11:01:20] sounds cool
[11:01:35] average_drifter: we could migrate to use jenkins job builder
[11:01:43] average_drifter: but that probably adds too much overhead
[11:02:21] average_drifter: the configuration is at https://gerrit.wikimedia.org/r/#/c/38497/
[11:02:44] average_drifter: I have disabled it so it is not going to override changes you are making in jenkins graphical interface
[11:08:33] hashar: it would be a good idea to study projects already using jenkins job builder
[11:08:39] and seeing their magnitude
[11:08:42] number of developers
[11:08:51] how often they send changesets(patches..)
[11:09:11] or zuul for that matter
[11:09:53] but I trust you have good judgement on this since you decided to implement it I'm sure it's mature
[11:10:01] average_drifter: Zuul has been written by OpenStack
[11:10:09] I am not sure how many people use that though
[11:10:16] it is definitely mature and stable
[11:10:24] with a well maintained code base :-]
[11:10:38] we could probably have used the java plugin
[11:10:50] but Zuul had some interesting concepts
[11:11:01] and its python <3
[11:11:08] I <3 python too :)
[11:11:15] i prefer python >= 3
[11:11:42]
[11:11:51] okay, i'll shut up now.
[11:18:31] I'm watchin this http://www.youtube.com/watch?v=o9pEzgHorH0
[11:18:35] it's awesome
[11:20:19] we have a few developers in mediawiki that tend to write overcomplicated code :/
[11:27:00] hashar: I have 2 more things I need in jenkins
[11:27:17] https://github.com/wmf-analytics/ua-parser
[11:27:24] https://github.com/wmf-analytics/debianize
[11:27:29] how can I add these ?
[11:28:01] ah
[11:28:10] average_drifter: you want to pull them from github
[11:28:14] yes
[11:28:15] which is done by libcidr job
[11:28:18] i think
[11:28:23] probably
[11:28:52] so you're suggesting to mirror them in internal git repo ?
[11:28:54] *repos
[11:29:39] average_drifter: have a look at https://integration.mediawiki.org/ci/job/analytics-libcidr/configure
[11:29:44] you could create a new job
[11:29:55] on creation, jenkins lets you copy an existing job
[11:30:00] so you could copy analytics-libcidr
[11:30:15] then edit the github URL to point to ua-parser / debianize
[11:30:39] I am still unsure why there is a https://github.com/wmf-analytics/ project though
[11:30:57] but you can do it :-]
[11:31:02] hashar: why would we want to have code locked up in a repo right ?
[11:31:04] jenkins will poll github from time to time
[11:31:14] hashar: I mean, why would we want to have it in a cage :)
[11:31:19] a cage ?
[11:31:41] hashar: well, some things are indeed private but these things should be public
[11:31:45] for example
[11:31:58] all our code is public already anyway :-]
[11:32:06] :)
[11:32:14] average_drifter: when creating the jobs, please prefix the job name with 'analytics-'
[11:32:20] alright
[11:32:30] watching your video still :-)
[15:38:30] morning guys
[15:49:52] morning everyone
[15:50:07] my computer decided to start crashing a lot and kill its own speakers
[16:33:53] drdee, we should have a standard name that we refer to the firehose stream as
[16:33:56] that actually describes the content
[16:34:11] web access logs
[16:34:11] web request logs
[16:34:11] request logs
[16:34:11] access logs
[16:34:12] something like that
[16:36:57] hmm, on data formats it says web request
[16:37:05] https://www.mediawiki.org/wiki/Analytics/Kraken/Data_Formats#Web_Request_Format
[16:37:17] ok, so web request it is :)
[17:00:57] Web Request Firehose
[17:05:16] :)
[17:08:29] ping average_drifter
[17:55:25] guys, can we do scrum 1 hour later
[17:55:26] ?
[17:55:31] hihi
[17:55:32] yes.
[17:55:33] dec and i have platform meeting
[17:55:41] ahhhh platform!
[18:02:49] to those who want to play the roadmap game from home: https://www.mediawiki.org/wiki/Analytics/Roadmap
[18:06:23] anyone interested in joining me for today's standup?
[18:06:32] dschoon, ottomata, erosen, drdee ^^
[18:06:45] i moved it to 11
[18:06:46] can we postpone it for 1 hour?
[18:06:48] oh
[18:06:50] sure
[18:06:57] because we have to do a platform roadmap update
[18:06:59] cool
[18:07:14] wonder why it didn't update my calendar
[18:07:36] ah, it did. my bad
[18:09:23] /win
[18:09:43] platform roadmap update?
[18:10:10] ottomata: i'm updating the kraken roadmap
[18:10:14] http://etherpad.wmflabs.org/pad/p/AnalyticsRoadmap
[18:12:37] ottomata: we still have more to do with jmx monitoring, right?
[18:13:10] if we want something fancier, yes, but as it is we can pretty easily pipe any jmx stat we want into ganglia
[18:13:25] right.
[18:13:28] true.
[18:13:35] okay, i suppose that doesn't count
[18:13:56] i'm not really sure what else we want to do, aside from maybe alerts for specific stats
[18:14:09] while not the most beautiful, ganglia works
[18:16:03] it'd be great to get graphite set up, though
[18:16:13] so querying these things could be done in a useful fashion
[18:16:38] hey milimetric, you should join us at http://etherpad.wmflabs.org/pad/p/AnalyticsRoadmap
[18:16:47] k
[18:16:49] so i don't weave blatant lies and overpromise about limn :)
[18:16:55] on
[18:17:17] gonna be hard since I can't hear you guys talking though
[18:18:21] we're not.
[18:18:30] analytics is just using irc to coord
[18:48:46] ori-l: that is a really good list, btw
[18:49:54] thanks; i'm trying my hand at the whole 'document what you're doing' thing, which is a bit new for me
[18:53:04] :D
[18:53:06] same.
[18:53:27] i said in the doc chat, but i moved that stuff under Kraken, since the doc is by project
[18:53:49] and since it's a chrono-sensitive doc (next two months) i copied the full lies to the pixel service page
[18:53:52] https://www.mediawiki.org/wiki/Analytics/Kraken/Pixel_Service
[18:54:04] that way nothing would be lost, even though not all of it is for dec or jan
[18:58:29] let's change the name!
[18:58:33] its not a pixel
[18:58:34] nor a gif!
[18:58:59] stand up now
[18:59:00] ?
[18:59:43] paste the link, we join soonish
[19:01:02] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90
[19:05:40] dan and I here
[19:05:42] drde dschoon
[19:05:44] drdee
[19:05:50] trying
[19:05:51] You are not allowed to join this hangout.
[19:05:55] yeah i got that too
[19:05:58] had to change authuser
[19:06:00] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90?authuser=1
[20:00:51] drdee, dschoon, ottomata, are we meeting?
[20:01:05] for arch review?
[20:01:07] it's been moved.
[20:01:34] oh
[20:01:36] whaaa
[20:01:38] to when?
[20:01:41] i thought it was now too
[20:01:44] to tomorrow
[20:01:47] asher can't make it
[20:01:58] ottomata: where is our cluster hardware breakdown/list?
[20:03:50] I never made an assignment wiki, its on my todo list
[20:03:53] but there is the email discussion
[20:04:41] http://lists.wikimedia.org/pipermail/analytics/2012-November/000220.html
[20:04:45] http://lists.wikimedia.org/pipermail/analytics/2012-November/000226.html
[20:05:01] yeah, just email
[20:05:01] ok
[20:05:05] i'll do it now.
[20:46:09] ottomata: https://www.mediawiki.org/wiki/Analytics/Kraken/Infrastructure
[20:49:34] nice thank you!
[20:49:55] oh, btw, nimbus is currently on an02
[20:49:59] not an27
[20:50:01] like that email says
[20:55:39] bb in 1h, going outside
[21:05:12] fix the wiki page :)
[21:05:17] ^^ ottomata
[21:06:27] haha, will do
[22:29:44] ping milimetric
[23:01:00] https://plus.google.com/hangouts/_/16010b2dc6fcc94159aad7ef068f686f78a94021
[23:02:07] milimetric: we're still in the hangout fyi
[23:02:12] if you still have time/patience
[23:06:51] many host down messages in #wikimedia-operations right now related to analytics boxen
[23:10:13] (03:08:00 PM) LeslieCarr: sorry guys
[23:10:13] (03:08:09 PM) LeslieCarr: i guess the blip caused some momentary packet loss
[23:10:19] so...nm