[03:21:36] New patchset: Ram; "Added notes in README on usage." [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[06:41:36] New patchset: Stefan.petrea; "Fixed 341" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62209
[08:35:37] New patchset: Stefan.petrea; "Fixing mingle 358" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62422
[12:37:46] ugh, why didn't i see this before
[12:37:55] and no dartar
[12:38:19] hrmmmm, rfaulckr1's 5 hrs idle
[13:17:53] moooorning
[13:17:57] average, around?
[13:18:25] milimetric, whhaaaaaaz up?
[13:20:54] drdee: hello sir
[13:21:06] drdee: 2 code reviews pending
[13:21:16] drdee: 1 acceptance criteria clarification pending
[13:21:29] drdee: 1 problem with geocoding that I bumped into
[13:22:45] drdee: 1 notification for going on vacation for 2 weeks starting this wednesday
[13:26:51] what's the geocoding problem?
[13:28:10] drdee: checking whether just the first line has pre-geocoded (88.88.88.88|US) data in it does not guarantee the whole file has the same format
[13:28:20] rfaulckr1: please test!
[13:30:36] average: actually it does :)
[13:30:58] ok
[13:35:03] ok 2 more problems
[13:35:18] shoot
[13:35:30] the udp-filter binary on stat1 is probably old, because it's attaching the country code in the style presented above
[13:35:40] so it turns 88.88.88.88 ==> 88.88.88.88|US
[13:36:27] not sure if the new one was installed on stat1
[13:36:32] or maybe I'm missing some parameters when running it
[13:36:46] zcat $file_path | udp-filter -F'\t' -g -b country | gzip > $OUTPUT_DIR/$geocoded_file_name
[13:36:50] zcat $file_path | udp-filter -F' ' -g -b country | gzip > $OUTPUT_DIR/$geocoded_file_name
[13:36:55] this is how I run it ^^
[13:37:29] what's the new style?
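The pitfall average describes above — deciding a file is pre-geocoded by inspecting only its first line — can be avoided by scanning every line for the `88.88.88.88|US` style. A minimal sketch; the field index, separator, and `ip|CC` pattern are assumptions for illustration, not the actual udp-filter output specification:

```python
import re

# Hypothetical pattern for a pre-geocoded IP field, e.g. "88.88.88.88|US"
GEOCODED_IP = re.compile(r"^\d{1,3}(\.\d{1,3}){3}\|[A-Z]{2}$")

def file_is_fully_geocoded(lines, ip_field=4, sep="\t"):
    """Return True only if *every* non-blank line carries an ip|CC field.

    Checking just lines[0] can mislabel a mixed-format file; the field
    index and separator here are assumptions, not udp-filter's real layout.
    """
    return all(
        GEOCODED_IP.match(line.split(sep)[ip_field]) is not None
        for line in lines
        if line.strip()
    )
```

A whole-file scan is linear anyway (the files are already being piped through `zcat`), so the stronger check costs little.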
[13:37:46] it should be appending the country code as the last field
[13:38:00] that's where wikistats is expecting it to be
[13:38:07] 16th field
[13:39:33] erikz always wanted to have it joined with the ip address for wikistats
[13:40:22] ok, my bad, we'll do it like that then
[13:41:55] then the two gerrit patchsets I mentioned above should be put on hold so I get a chance to update them with this
[13:42:34] nevertheless they can be reviewed (but not merged yet)
[13:42:55] hasher, around?
[13:43:05] hashar ^^
[13:43:56] drdee: yes
[13:44:06] yo! about jenkins and udp-filters
[13:44:09] aren't you supposed to sleep at that time of the day drdee ?
[13:44:16] i never sleep
[13:44:35] would it be possible to install libanon and libcidr on jenkins?
[13:44:57] we could add them as submodules to udp-filters but it feels like cluttering the udp-filters repo
[13:45:29] both libraries are already packaged and available in the apt.wikimedia.org repo
[13:47:01] iirc that is installed already
[13:47:03] let me check
[13:47:25] ooops no
[13:51:53] https://gerrit.wikimedia.org/r/62434
[13:52:12] drdee: https://gerrit.wikimedia.org/r/62434 will do it. I'll let you guys find someone from ops to merge it :-]
[13:52:28] did you check the package names?
[13:52:29] installing the package on the build server is probably just a workaround
[13:52:41] ideally the dependencies should be installed / built by the udp-filters software itself I guess
[13:53:29] can you also do apt-get install during a jenkins run?
[13:53:49] I pasted the output of apt-cache policy from gallium
[13:53:52] as a comment to the change
[13:53:56] will install the libs manually
[13:54:45] installed!
[13:54:58] ?
[13:55:00] drdee: jenkins can't apt-get install
[13:55:16] so what is the jenkins way to install all those dependencies?
[13:55:22] if not on the machine itself?
[13:55:30] you could have submodules and build them before building your app
[13:55:39] or use git clone + build each of them
[13:55:53] for libgeoip?
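The two styles under discussion above — the country code joined to the IP address (`88.88.88.88|US`, the form erikz wanted for wikistats, carried in the 16th field) versus appended as a separate trailing field — can be illustrated with a small parsing sketch. The separator and the 0-based field index are assumptions for illustration, not the actual wikistats parsing code:

```python
def split_geocoded_ip(field):
    """Split a joined "ip|country" value like "88.88.88.88|US".

    Returns (ip, country_code), with country_code None when the field
    carries a bare IP. Illustrative only, not wikistats' real parser.
    """
    ip, sep, country = field.partition("|")
    return ip, (country if sep else None)

def country_from_line(line, sep=" ", ip_field=15):
    """Pull the country code out of one log line.

    ip_field=15 reflects the "16th field" mentioned above (0-based);
    the space separator is an assumption.
    """
    return split_geocoded_ip(line.split(sep)[ip_field])[1]
```

Keeping the country joined to the IP means the line keeps its original field count, which is presumably why wikistats preferred that style over a new 16th field.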
libdb-dev, libanon, libcidr, libpcap?
[13:56:12] theoretically you could get the udp-filters change to be built against the libs at master + the libs at their latest stable version.
[13:56:42] # these are needed to build libanon and udp-filter
[13:56:42] package { ['pkg-config', 'libpcap-dev', 'libdb-dev']:
[13:56:47] we did install a bunch of them already
[13:56:56] the puppet manifest is modules/contint/manifests/packages.pp
[13:57:26] libgeoip is not there though
[13:57:42] line 92
[13:57:57] ahh yeah
[13:58:00] that is convenient
[13:58:15] this way whenever wikistats deps change in production, they are also made available on the contint server
[14:08:14] average: can you review https://gerrit.wikimedia.org/r/#/c/62198/ ?
[14:09:06] milimetric: around for a pm chat? :)
[14:13:36] New review: Stefan.petrea; "(1 comment)" [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[14:14:02] drdee: I'm looking over 62198 also now
[14:14:09] thank you!
[14:17:11] average: once you are done with the review, give me a ping, i want to fix jenkins for udp-filters today
[14:17:39] about mingle card 356, yes please wait with working on that until the acceptance criteria are done
[14:18:03] ok
[14:18:33] drdee: where is the stream coming from that is read by the udp-filters ?
[14:18:36] in production
[14:18:39] I mean in a real case
[14:18:40] emery ?
[14:18:56] udp2log
[14:19:02] ok, which is running on ?
[14:19:04] gadalinium ?
[14:19:09] and that runs on emery, oxygen and gadolinium
[14:19:55] alright
[14:22:10] drdee: where will the multiplexor runb ?
[14:22:11] *run
[14:22:29] it might run on one of the analytics machines
[14:24:55] drdee: does udp2log aggregate all the streams from emery, gadalinium and oxygen ?
[14:25:03] no
[14:25:16] udp2log sends the entire stream to each individual box
[14:26:50] average: continuous integration for wikistats is working in both jenkins and travis, right?
[14:27:07] if yes, can i close https://mingle.corp.wikimedia.org/projects/analytics/cards/581 ?
[14:27:43] i mean not close, i'll just put it in WIP
[14:28:19] drdee: yes you can
[14:28:25] what's WIP ?
[14:28:29] work in progress
[14:28:39] card 581 shows now in
[14:28:40] https://mingle.corp.wikimedia.org/projects/analytics/cards/grid?aggregate_property%5Bcolumn%5D=&aggregate_type%5Bcolumn%5D=count&color_by=blocked+reason&filters%5B%5D=%5BType%5D%5Bis%5D%5BFeature%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BDefect%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BTech+Debt%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BInfrastructure+Task%5D&filters%5B%5D=%5BType%5D%5Bis%5D%5BSpike%5D&filters%5B%5D=%5BRelease+Schedule+-+Sprint%5D%5Bis%5D
[14:28:41] 28Current+Sprint%29%5D&filters%5B%5D=%5BRelease+Schedule+-+Sprint%5D%5Bis%5D%5B%28Last+Sprint%29%5D&group_by%5Blane%5D=Development+Status&group_by%5Brow%5D=class+of+service&lanes=Queued+for+Dev%2CCoding%2CTesting%2CReady+for+Showcase%2CShipping&tab=WIP+-+Features
[14:28:49] nice permalink :D
[14:29:12] hah
[14:29:21] mingle mingle :)
[14:29:41] I could add some bit.ly support if I had the source heh
[14:30:06] once you are done with code review, give me a ping
[14:33:53] drdee: is this accurate ? http://i.imgur.com/NdNZZMh.png
[14:34:13] * average is unsure
[14:34:57] no, the multiplexor is in front of the udp-filter instances, consult with xyzram when he arrives
[14:35:16] nice diagram btw!
[14:35:25] thanks
[15:20:37] New review: Ram; "(1 comment)" [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[15:45:54] xyzram: hi
[15:48:30] hi
[15:49:29] New review: Stefan.petrea; "(1 comment)" [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[15:51:17] average: are you there ?
[15:51:29] yo
[15:51:31] xyzram: yes
[15:51:32] I am here
[15:51:57] xyzram: it's better to amend an existing commit than to push new commits
[15:52:07] we now have a bunch of commits that cannot be merged
[15:52:39] Your diagram is not right: input from udp2log comes directly to the multiplexor; it then fans out to multiple udp-filter child processes.
[15:53:09] xyzram: oh, thanks, I think I get it now
[15:53:11] drdee: Sorry, I forgot to do the --amend flag on the commit.
[15:53:33] np
[15:53:48] average: are you comfortable starting to merge the commits?
[15:54:21] also, will I be able to amend somebody else's change ?
[15:55:45] yes i think so
[15:55:49] ok
[15:55:55] you will first have to fetch the changeset
[15:56:13] but then you should be able to change it and commit it again (using --amend ;) )
[15:56:41] average: ^^
[15:57:02] xyzram: may I ask something ?
[15:57:52] So I fetched ottomata's change, made my changes, did the commit (forgetting the --amend) and then did git review; it then created a new changeset presumably because of the missing --amend ?
[15:57:58] average: surely
[15:58:54] xyzram: first of all I want to say it's a nice mechanism you made
[15:59:06] Thanks.
[15:59:10] xyzram: the sequence numbers will not be monotonic anymore, is that right ?
[15:59:27] strictly monotonically increasing
[15:59:43] xyzram: because you round-robin read them from the udp-filter instances
[15:59:54] but that's less important
[16:00:03] They will not be in order even from udp2log because it uses UDP; they can arrive out of order.
[16:02:27] But input order will be preserved for each child process running udp-filter, so if the input to the multiplexer is 1, 2, 3, 4, 5, 6, assuming 2 child processes P1 and P2, P1 will get 1, 3, 5 and P2 will get 2, 4, 6
[16:03:26] If input is 9, 10, 3, 4, P1 will get 9, 3 and P2 will get 10, 4
[16:08:33] xyzram: is it more like this ? http://i.imgur.com/q9wWldI.png
[16:09:54] Almost but not quite.
[16:10:32] The output from udp-filter does not come back to the multiplexer: it can go to files or can be piped to other processes.
[16:11:22] Also, the multiplexer runs on the same box as udp2log since it is intended to be started by udp2log via its config file.
[16:12:42] So either udp2log --> multiplexor --> udp-filter {multiple} --> output file
[16:12:43] or
[16:13:14] udp2log --> multiplexor --> udp-filter {multiple} --> kafka {or other process}
[16:13:22] New patchset: Diederik; "Simple bash script to generate seed.sql to load MW database with a small set of test data for UMAPI." [analytics/user-metrics] (master) - https://gerrit.wikimedia.org/r/62450
[16:14:02] milimetric, erosen: if you have a sec please review https://gerrit.wikimedia.org/r/62450
[16:15:30] average: nice pictures; what tool do you use to generate them ?
[16:16:54] average: also about your chart, the multiplexor could run on any box
[16:17:28] so more correct would be a pic that says network, with four arrows to oxygen, emery, gadolinium and an09
[16:17:40] and then the multiplexor diagram could be inside each host
[16:18:35] oh, alright, I understand now
[16:18:41] xyzram: just Dia
[16:21:44] average: can we merge xyzram's patch sets?
[16:22:57] drdee: yes sir
[16:23:07] I feel like I understand his mechanism
[16:24:22] great!
[16:24:30] maybe you can post your diagram on a wiki?
[16:25:37] ok this is my last version of it http://i.imgur.com/R257la2.png
[16:25:50] xyzram: would a linked list instead of **optr be slower ?
[16:25:55] xyzram: circular linked list
[16:25:57] just asking
[16:26:01] going to +1 the code now
[16:27:31] drdee: what wiki page ? I am planning on uploading it to a wiki page with the source also so it can be modified
[16:27:33] Generally a linked list will be slower than an array since it requires 1 extra memory access.
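The round-robin fan-out xyzram describes above (with 2 children, input 1, 2, 3, 4, 5, 6 gives P1 → 1, 3, 5 and P2 → 2, 4, 6) can be modeled in a few lines. This is an illustrative sketch of the distribution policy only, not the multiplexor's actual C implementation:

```python
from itertools import cycle

def fan_out(messages, n_children):
    """Round-robin messages across n_children workers.

    Per-child relative order matches the input order (the property
    discussed above), while the global sequence is interleaved.
    """
    buckets = [[] for _ in range(n_children)]
    for child, msg in zip(cycle(range(n_children)), messages):
        buckets[child].append(msg)
    return buckets
```

Note the model also reproduces the 9, 10, 3, 4 example: each child sees its share in arrival order, even when udp2log itself delivered datagrams out of order.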
[16:27:52] New review: Stefan.petrea; "(1 comment)" [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[16:28:14] average: http://www.mediawiki.org/wiki/Analytics/UDP-filters
[16:29:10] That diagram is still not completely accurate since it shows udp2log on a box different from the one running the multiplexor/udp-filter.
[16:29:56] New review: Diederik; "Ok." [analytics/udp-filters] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/62198
[16:30:55] New review: Diederik; "Ok." [analytics/udp-filters] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/61049
[16:31:26] Change merged: Diederik; [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62198
[16:31:45] New review: Diederik; "Ok." [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/61049
[16:31:46] Change merged: Diederik; [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/61049
[16:32:03] New review: Stefan.petrea; "Looked at the code. Talked to Ram. After receiving some explanations from him and Diederik I feel li..." [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62198
[16:32:09] New review: Diederik; "Ok." [analytics/udp-filters] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/62403
[16:32:10] Change merged: Diederik; [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62403
[16:33:07] New review: Diederik; "Ok." [analytics/udp-filters] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/62304
[16:33:08] Change merged: Diederik; [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62304
[16:40:37] drdee, xyzram http://i.imgur.com/pfDWRcO.png ?
[16:41:35] yes that looks good
[16:41:55] cool
[16:44:19] yes, agree. I'd suggest labelling the red arrow also "pipe", the same as the blue arrows.
[16:46:55] average: can you now look into jenkins for udp-filters?
[16:47:04] libanon and libcidr have been installed
[16:47:14] maybe we should just use xyzram's SimpleMake file
[16:47:23] https://integration.wikimedia.org/ci/view/Analytics/job/analytics-udp-filters/21/console
[16:50:54] erosen, milimetric, average, xyzram: heads up, scrum in 10 minutes
[16:51:06] usual place, so that would be: https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc
[16:51:18] :)
[16:51:22] sure, how long does it take usually ?
[16:51:26] sorry i was late last time
[16:53:16] 10 - 15 minutes
[16:53:47] average: comment about jenkins?
[16:58:27] New patchset: Diederik; "Simple bash script to generate seed.sql to load MW database with a small set of test data for UMAPI. Fix seed.sql Change-Id: I05daf93edcbbcdc3fa0ee42a6184e37623b4dd10" [analytics/user-metrics] (master) - https://gerrit.wikimedia.org/r/62450
[16:59:59] drdee: I was looking at it
[17:00:13] drdee: didn't get a chance to look more deeply into it
[17:00:25] modified the job two times and rebuilt.. but still have to look more at it
[17:00:42] average: scrum
[17:01:12] yes, on my way
[17:02:10] erosen: around?
[17:28:26] Trying to build kraken for the first time, I'm seeing this error during the test phase:
[17:28:33] Native code library failed to load.
[17:28:34] java.lang.UnsatisfiedLinkError: Can't load library: /usr/lib/libdclassjni.so
[17:28:34] java.lang.UnsatisfiedLinkError: Can't load library: /usr/lib/libdclassjni.so
[17:28:38] yes!!!
[17:29:31] average: please help xyzram
[17:29:46] xyzram: i believe we have a debian package somewhere
[17:30:05] xyzram: wget http://garage-coding.com/releases/libdclass-dev/libdclass-dev_2.0.12_amd64.deb
[17:30:14] xyzram: sudo dpkg -i libdclass*.deb
[17:30:26] drdee: sorry, not yet on github, will put it
[17:30:32] promise
[17:30:47] oh wait, you can't anymore
[17:30:52] drdee: why ?
[17:31:01] github no longer supports that
[17:31:13] drdee: it's just 159kb ..
[17:31:18] we should put the package in apt.wikimedia.org, or is it already there?
[17:31:20] drdee: I can add it to the repo as a regular file..
[17:31:46] drdee: the deb is not on http://apt.wikimedia.org/wikimedia/pool/main/
[17:31:49] drdee: I just had a look
[17:32:03] no, that's bad :(
[17:32:27] I think ottomata is doing that ?
[17:32:40] drdee: can I go in #wikimedia-operations and ask if they could put it there ?
[17:32:58] yes
[17:33:02] I installed the deb and ran "mvn package" again; it's running and downloading a lot of stuff from nexus.
[17:33:10] xyzram: nice :)
[17:37:57] Now seeing this:
[17:38:01] Tests run: 40, Failures: 0, Errors: 7, Skipped: 1
[17:38:01] [INFO] ------------------------------------------------------------------------
[17:38:01] [INFO] Reactor Summary:
[17:38:01] [INFO]
[17:38:01] [INFO] Kraken ............................................ SUCCESS [0.392s]
[17:38:02] [INFO] dClass JNI Wrapper ................................ SUCCESS [10.858s]
[17:38:02] [INFO] Kraken Base Library ............................... SUCCESS [23.577s]
[17:38:03] [INFO] Kraken EventLogging ............................... SUCCESS [0.987s]
[17:38:03] [INFO] Kraken Pig Library ................................ FAILURE [3:03.000s]
[17:38:04] [INFO] Kraken Funnel ..................................... SKIPPED
[17:38:04] [INFO] Kraken ETL ........................................ SKIPPED
[17:38:05] [INFO] ------------------------------------------------------------------------
[17:38:05] [INFO] BUILD FAILURE
[17:38:05] [INFO] ------------------------------------------------------------------------
[17:39:11] drdee: I'm seeing some sort of pig test failure:
[17:39:16] [ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.10:test (default-test) on project kraken-pig: There are test failures.
[17:39:21] mmmm
[17:39:32] ok what's the failure?
[17:39:37] geoip coding?
[17:40:19] [INFO] Kraken Pig Library ................................
FAILURE [3:03.000s]
[17:40:52] Yes, geoip it looks like
[17:41:07] probably because you don't have a local geoip database
[17:41:09] Tests in error:
[17:41:09] testExec1(org.wikimedia.analytics.kraken.pig.GeoIpLookupTest): /usr/share/GeoIP/GeoIPCity.dat (No such file or directory)
[17:41:09] testExec2(org.wikimedia.analytics.kraken.pig.GeoIpLookupTest): /usr/share/GeoIP/GeoIPCity.dat (No such file or directory)
[17:41:09] etc.
[17:41:17] right
[17:41:20] give me 5 minutes
[17:41:41] Ok.
[17:42:01] I'll try installing geoip if I can find the package.
[17:42:57] dpkg shows I have something like that installed:
[17:43:01] ii geoip-database 20120609-1 all IP lookup command line tools that use the GeoIP library (country database)
[17:43:01] ii libgeoip-dev 1.4.8+dfsg-4 amd64 Development files for the GeoIP library
[17:44:05] xyzram: don't worry i will help you soon
[17:46:20] xyzram: scp ram@stat1.wikimedia.org:/tmp/GeoIP_City-20130212.tar.gz . ; tar xzvf GeoIP_City*.tar.gz
[17:47:58] xyzram: sudo cp GeoIPCity.dat /usr/share/GeoIP/GeoIPCity.dat
[17:49:18] oh wait, not sure if you can access stat1
[17:50:01] average: Permission denied (publickey).
[17:54:01] xyzram: in the meantime: can you have a look at https://integration.wikimedia.org/ci/view/Analytics/job/analytics-udp-filters/33/console and give us a tip on how to solve that?
[17:58:06] xyzram: wget http://garage-coding.com/releases/geoip.zip
[17:58:09] Looks like it cannot find SimpleMake for some reason; is it using master branch or something else ?
[17:58:13] xyzram: password available through e-mail
[17:58:30] xyzram: you branched out of some other branch and SimpleMake is not present
[17:58:51] xyzram: can you add it to the gerrit patchset please ? or I can try to do that if it's ok
[17:58:51] SimpleMake is present in master
[17:59:01] xyzram: in master but not in the branch you have on gerrit
[17:59:44] average: true but it is in the cleanup patch that I pushed some time ago.
[18:00:30] should be in master
[18:00:39] i merged all outstanding patchsets
[18:05:24] average: [geoip.zip] GeoIP_City-20130212.tar.gz password:
[18:05:34] pm this
[18:06:02] average: that zip file wants a password to expand it.
[18:06:12] xyzram: I sent the password by e-mail
[18:06:21] xyzram: please check your e-mail
[18:06:38] Oh, sorry, got it now, thanks.
[18:06:41] np
[18:12:38] drdee: average: now "mvn package" reports "BUILD SUCCESS"
[18:12:53] that is great I think :)
[18:16:55] yes think so too :)
[18:30:49] drdee: gitweb's crashing when i try to access your seed.sql changes
[18:31:14] yup
[18:31:19] that happens always
[18:31:21] gerrit bug
[18:31:30] you need to pull the patch set
[18:39:51] milimetric: or you can emulate what the browser is doing
[18:39:53] milimetric: curl --header "Content-Type:application/json; charset=UTF-8" --header "Connection:keep-alive" --header "Accept-Encoding:gzip,deflate,sdch" --header "Accept-Charset:ISO-8859-1,utf-8;q=0.7,*;q=0.3" --header "Accept:application/json,application/json,application/jsonrequest" -d '{"jsonrpc":"2.0","method":"patchScript","params":[{"fileName":"scripts/seed.sql","patchSetId":{"changeId":{"id":62450},"patchSetId":2}},null,{"changeId":{"id":
[18:40:04] there's your seed.sql
[18:40:13] :D
[18:43:03] lol
[18:43:42] didn't you say that you love gerrit?
[18:53:39] average: check https://gerrit.wikimedia.org/r/gitweb?p=analytics/udp-filters.git;a=tree;h=refs/heads/master;hb=refs/heads/master
[18:53:55] the SimpleMake file is present
[18:55:48] drdee: yes, in master it is
[18:56:10] drdee: this is what the code jenkins tried to build https://gerrit.wikimedia.org/r/gitweb?p=analytics/udp-filters.git;a=tree;h=c64f6bc3403727a50c576e97cb1d95cbe74d606d;hb=c7256832bd4edce539c3ca46c46ba3cd87e7fc8d
[18:56:57] so it's not fetching HEAD?
[18:58:10] milimetric: let's use https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc
[18:58:15] k
[18:58:51] drdee: so build #30 of udp-filters was started by you, and the code was this for it https://gerrit.wikimedia.org/r/gitweb?p=analytics/udp-filters.git;a=tree;h=fd99f0eca68913916e75fd9ae450bcd03077a3ea
[18:59:07] drdee: the SimpleMake is not present in that one
[18:59:28] ok, let's push a simple commit to udp-filters and trigger a new build
[18:59:29] drdee: I think it was branched out from an older version where SimpleMake was not present
[18:59:36] drdee: sure
[19:00:02] rounce123: we are in the default scrum hangout
[19:00:44] New patchset: Stefan.petrea; "Abandon this" [analytics/udp-filters] (master) - https://gerrit.wikimedia.org/r/62471
[19:00:53] drdee: https://gerrit.wikimedia.org/r/62471
[19:01:10] drdee: build successful https://integration.wikimedia.org/ci/job/analytics-udp-filters/34/
[19:01:24] because I branched out from the most recent master where SimpleMake was present
[19:11:32] average: AWESOME!
[20:00:41] average: i fixed jenkins for webstatscollector: https://integration.wikimedia.org/ci/view/Analytics/job/analytics-webstatscollector/30/console
[20:01:53] drdee: nice :)
[20:02:40] xyzram: need any help?
[20:04:16] hashar: around?
[20:04:50] nop, conf call sorry
[20:09:06] drdee: just doing some background reading; breaking for lunch now; let me know if there is anything concrete to look at.
[20:13:59] diederik (not using irc handle on purpose because you're at lunch): https://mingle.corp.wikimedia.org/projects/analytics/cards/578
[20:21:09] drdee: got a problem
[20:21:26] drdee: how do I call udp-filter for tab-separated input ?
[20:21:29] udp-filter -F'\t' ?
[20:25:42] average: yes, that's how the config file on oxygen has it.
[20:27:23] ok
[20:32:15] drdee: i am back for the next 20 minutes or so
[20:32:26] drdee: and you are the first in the queue of pending requests hehe
[20:36:15] hashar
[20:36:27] Looking at the "Logging Solutions Recommendation" page, and related links, kafka looks very interesting !
[20:36:40] how can we enable jenkins for limn (a node.js app)?
[20:36:50] it needs npm to install dependencies
[20:36:53] hashar ^^
[20:37:21] ah
[20:37:26] how is it installed in production?
[20:37:35] milimetric ^^
[20:37:40] we do not use npm on Jenkins
[20:38:03] right, we just use the debian that average created
[20:38:04] no prob
[20:38:11] or we revisit the whole parsoid approach
[20:38:12] the way parsoid fixed it is that Jenkins queries a webservice hosted on labs which reports back the tests
[20:38:22] make our own blessed mirror
[20:38:22] yep
[20:38:22] dear lord
[20:38:40] well let's stick with travis in that case
[20:38:46] we have the same "issue" with all third parties repository
[20:38:49] repositories
[20:38:54] like gem for ruby, pip for python etc..
[20:39:03] you guys should be creating vm's :D
[20:39:12] for my python script, I package the python module for Debian (I got commit access in the debian repo)
[20:39:14] run the tests, delete the vm
[20:39:20] and get the python package in apt.wm.o
[20:39:26] which can then be installed via puppet
[20:39:49] drdee: yeah you are right, the idea is to get the tests in a vagrant box
[20:40:13] some of us talked about it while I was in SF in march. But we haven't progressed on that front yet
[20:40:30] that would work nicely for parsoid and limn I guess.
[20:40:48] I should probably get the Vagrant idea out of my pile of stuff to do and kick-start something
[20:40:49] definitely!
[20:41:15] ah if only loans were not that high in SF.
I would have relocated and we could have hacked that over a week :D
[20:41:44] * hashar falls back to mail
[20:42:31] drdee: the bug is https://bugzilla.wikimedia.org/show_bug.cgi?id=45499 Jenkins should run tests in disposable sandboxes
[20:44:42] nice! i will subscribe to that
[20:44:56] you can have a look at the previous comments
[20:45:01] I am not sure how to kick-start it
[20:45:07] maybe get the parsoid / limn teams involved
[20:45:16] adding in Ori who worked a lot on vagrant already
[20:53:28] hashar: like Travis ?
[20:53:36] nooo travis :-D
[20:53:40] I mean
[20:53:51] feel free to use travis in the meantime :-]
[20:54:01] but that is not going to solve the issue for us, which is that we need sandboxes
[20:54:06] hashar: I just meant the disposable sandbox idea..
[20:54:12] ahh
[20:54:19] I am not sure how Travis works
[20:54:25] but yeah, that is the idea, disposable sandbox.
[20:54:29] hashar: I can tell you how I'd do it :)
[20:54:51] hashar: template LXCs, copy template LXC, fire up, run tests, delete
[20:55:01] hashar: what do you think ?
[20:55:13] yeah LXC would be nice
[20:55:18] but iirc they can be escaped
[20:55:23] I mean,
[20:55:26] hashar: not really
[20:55:37] I think we discarded linux containers because they can be exited just like a chroot
[20:56:01] hashar: that was the case in 2011, and in 2012 they found a JVM JIT heap-spraying exploit
[20:56:17] hashar: but after that, LXCs were considered pretty secure
[20:56:44] hashar: is there anything safer than LXC ?
[20:59:04] oh, reading the bugzilla link..
[21:02:01] hashar: Travis uses Vagrant http://about.travis-ci.org/docs/user/ci-environment/
[21:02:02] ok mailed the eng list
[21:02:06] ah good find
[21:04:38] milimetric: ping
[21:04:45] hi ori-l
[21:05:50] hey :) so i noticed you ended up using /a/limn-public-data for the mobile stuff
[21:06:10] is the name now imprecise?
[21:06:10] um, i stopped any work on that
[21:06:18] ok
[21:06:36] * drdee mumbles name sounds fine to me ;)
[21:06:37] so i think it's just YuviPanda that's working on it
[21:06:54] average: drdee: I have pinged marktraceur (parsoid) and ori (who did a vagrant box for mediawiki). follow up on eng list :)
[21:07:19] i'm really not sure what the requirements are there, but it seems like it should be part of the talks that Dario's been putting together
[21:07:25] yes, saw that! thank you!
[21:10:18] hashar: I put it on in the background http://www.youtube.com/watch?v=JicPX7K_dJs
[21:10:35] hashar: I'm writing code while audio-grepping for the guy saying the word "security"
[21:11:07] average: crazy :-)
[21:11:36] average: all the travis talks/presentations I have seen are geared towards the devs, not the ops :/
[21:14:34] hashar: I remember talking to someone who did what you want to do with these boxes
[21:14:54] hashar: he created a distributed queue with zeromq with the stuff the vagrant boxes should run
[21:14:54] well Jenkins has a vagrant plugin
[21:14:59] not sure how well it works though
[21:15:07] ahh zeromq
[21:15:10] hashar: (well actually that person used LXC but it doesn't matter)
[21:15:21] hashar: yes, so this was the way he isolated the boxes from the hosts
[21:15:34] hashar: the only thing the boxes could use were these queues
[21:15:48] and part of the outside world
[21:15:55] so no connection back to the physical hosts
[21:16:24] so stuff flows through the queues
[21:16:30] a queue would probably be better than the jenkins plugin
[21:16:36] and each vagrant box picks up a job from the queue
[21:16:49] the Vagrant plugin basically wraps the test in a vagrant box, but it has to boot it up first which takes a bit of time
[21:17:50] I also need to have a look at http://gearman.org/
[21:38:35] New patchset: Diederik; "Simple bash script to generate seed.sql to load MW database with a small set of test data for UMAPI."
[analytics/user-metrics] (master) - https://gerrit.wikimedia.org/r/62450
[22:15:01] New patchset: Stefan.petrea; "Wikistats now expects ip field with country code" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62528
[22:15:04] drdee: ^^
[22:15:09] ty
[22:15:46] tests failed
[22:15:52] :)
[22:16:01] gotta look at what happened (on my machine they're passing)
[22:19:09] New patchset: Stefan.petrea; "Wikistats now expects ip field with country code" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62528
[22:19:30] drdee: build is green
[22:19:41] woot woot
[22:21:27] :)
[22:23:42] I want to raise the test count to 200
[22:30:57] New review: Diederik; "Ok." [analytics/wikistats] (master); V: 2 C: 2; - https://gerrit.wikimedia.org/r/62528
[22:30:57] Change merged: Diederik; [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62528
[22:31:10] average: ^^
[22:31:11] drdee: thanks
[22:31:19] drdee: now I can fix the other two
[22:31:24] kool
[22:36:09] New patchset: Stefan.petrea; "Fixed 341" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62209
[22:38:39] New patchset: Stefan.petrea; "Fix for mingle 356" [analytics/wikistats] (master) - https://gerrit.wikimedia.org/r/62206
[22:39:25] drdee: 341 and 356 are also ready now
[22:40:11] thx
[22:40:16] let's ez review those
[22:40:59] ok
[23:28:30] milimetric: ping (again)
[23:51:15] hi ori-l-away, i was tending to chickens but now i'm back - lemme know if i can help