[00:01:42] drdee: still doing some updates
[00:02:10] I want to make this thing more simple
[00:02:31] what you said about the separation into more files is imperative, udp-filters.c has reached 1000+ lines
[00:02:49] actually close to 2000
[13:02:53] hey drdee
[13:02:56] hey
[13:03:03] drdee: please check your e-mail :)
[13:03:12] yo
[13:03:22] btw, are you keeping track of your hours on odesk?
[13:03:41] drdee: yes
[13:03:52] drdee: I'm updating it as I work
[13:03:56] pefect
[13:05:31] drdee: if I add manual time for the previous day does that show up ?
[13:05:43] dont' know
[13:05:43] w
[13:05:51] ok
[13:09:07] drdee: forwarded that question to Joady
[13:09:22] drdee: today I want to also separate code for udp-filter like you told me yesterday, into separate files
[13:09:58] k
[13:10:00] drdee: afterwards will we debianize udp-filters and try it out somewhere ?
[13:10:13] yup, that's the plan
[13:22:25] hey drdee, wanna help me with Erik's thing?
[13:22:45] yes let's do it
[13:24:05] average_drifter: https://gerrit.wikimedia.org/r/#/c/26027/
[13:24:05] k, I have his email up
[13:24:36] clone this first https://gerrit.wikimedia.org/r/#/admin/projects/analytics/reportcard/data
[13:24:53] so gerrit.wikimedia.org:29418/analytics/reportcard/data.git
[13:26:24] drdee: should I clone that ?
[13:26:26] or milimetric ?
[13:26:31] i think me
[13:26:33] no milimetric
[13:26:35] permission denied drdee
[13:26:41] ???
[13:26:47] hold on
[13:27:10] embarrasing....
[13:27:17] you are not part of the analytics group
[13:27:34] because you don't have a gerrit account :)
[13:27:44] okay, go to gerrit.wikimedia.org (in browser)
[13:27:58] and create an account
[13:28:00] register?
[13:28:00] k
[13:28:19] i thought dschoon had guided you through all the account stuff
[13:29:10] brewing some coffee, how was your celebration last night?
[13:29:54] quite fun :)
[13:30:20] I registered for an account using a form almost identical to this but I don't remember what account that was.
There are a lot...
[13:30:34] I requested username milimetric btw
[13:32:00] no accounts yet
[13:32:30] but this is your git/gerrit account
[13:32:39] so basically your username will be Dan Andreescu
[13:36:40] still no account
[13:36:45] (I'll be fiddling with d3 until I get my account)
[13:37:16] i'll ping you the instant i get the email
[13:37:34] that should be instantanousley
[13:37:47] oh jeremyb, if you're around maybe you can help finish my gerrit account registration
[13:38:08] looking at the other ones it looks like a person has to do it manually
[13:38:51] no there shouldn't be any manual steps
[13:39:01] can you login to gerrit?
[13:43:43] milimetric: hangout and share screen?
[13:44:05] sure
[13:44:23] ahhhhh internet, morning!
[13:44:43] mmmmoooooooornnnning ottomata otttomata ottomata ottomata ottomata ottomata
[13:44:58] i've got some fresh coffeez
[13:45:14] milimetric: https://plus.google.com/hangouts/_/2b0f65aee1adb8ee53ca69328873c181abbb910a
[13:47:37] i'm making fresh coffeeeee
[15:20:35] drdee: please review
[15:23:33] average_drifter: is src/internal-traffic.c a good name for the collector output stuff?
[15:24:01] drdee: we can agree on a different one
[15:24:29] why not call it collector_output.c ?
[15:24:38] drdee: can do, yeah
[15:24:42] because internal traffic is not an accurate description
[15:27:11] drdee: should I git review again with the filename changed ?
[15:27:19] yes please
[15:29:48] drdee: ok, done
[15:30:52] ok
[15:43:08] average_drifter: merged
[15:48:40] drdee: ok, thanks
[15:54:04] drdee: there are most certainly manual steps
[15:54:12] milimetric: are you sure it wasn't finished?
[16:00:41] milimetric: on further inspection you do in fact have an account already
[16:00:55] milimetric: or were you looking to get the SVN part finished?
[16:01:52] drdee: ^
[16:02:15] oh, no you're not even the SVN guy anyway
[16:02:22] milimetric is all set, thx
[16:02:50] thx jeremyb, I misunderstood how the accounts work
[16:02:51] i'm confusing with stefan.petrea
[16:03:17] who is the same as average_drifter i guess
[16:03:18] jeremyb, I don't need an SVN account. thank you
[16:03:37] milimetric: right. i'm confusing you with average_drifter (who did ask for one)
[16:03:47] cool. thanks again, I appreciate it
[16:07:09] 02 16:06:27 <+wm-bot> Change on mediawiki a page Developer access was modified, changed by Jeremyb link https://www.mediawiki.org/w/index.php?diff=589762 edit summary: /* User:DAndreescu */ not done
[16:07:13] ;)
[16:25:50] milimetric: let's do another run with the report card data
[16:25:57] i am getting coffee first though
[16:26:03] ottomata, how is CDH4 behaving?
[16:26:15] drdee cool, one sec
[16:26:24] just getting everything back up to normal and puppetized
[16:26:27] drdee do i have time to run get a sandwich?
[16:26:29] yarn has more /different daemons
[16:26:41] milimetric : of course!
[16:26:46] my name linux raid partition on an01 is being funny
[16:26:58] ottomata: yep, yarn is quite different than MRv1
[16:27:02] but, it is installed! and Ihad it running for a sec!
[16:27:05] back in a sec
[16:27:06] great!
[16:27:16] trying to get the http proxy + iptables back up
[16:27:21] i reinstalled an01 with precise
[16:42:49] oook, drdee, if you have the proxy set up:
[16:42:50] http://analytics1001.wikimedia.org:8088/cluster
[16:44:13] proxy, how to again?
[16:46:26] ottomata ^^
[16:46:29] drdee, wanna do stats after standup? I just got back with my lunch
[16:46:35] yes
[16:48:04] in your browser, set a proxy to use
[16:48:10] analytics1001.wikimedia.org:8085
[16:48:44] :D
[16:49:03] i am loading in the november fundraising data now
[16:49:05] will take a while...
[16:50:12] so this proxy thing
[16:50:25] best is to have a separate browser configured with that proxy?
[16:50:43] or can you have the proxy work only for specific URL's?
[16:52:32] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90
[16:52:39] ottomata:
[16:52:40] drdee: best is to have a separate browser configured with that proxy?
[16:52:41] [12:50pm] drdee: or can you have the proxy work only for specific URL's?
[16:52:48] that's what I do
[16:52:53] you can, but i thikn you need an extension
[16:52:55] i never bothered
[16:52:57] foxy proxy
[16:52:58] for ffc
[16:52:59] ff
[16:53:02] i unno
[16:53:02] k
[16:54:14] ottomata: https://chrome.google.com/webstore/detail/proxy-switchy/caehdcpeofiiigpdhbabniblemipncjj
[16:55:38] ah nice
[16:59:11] mmmmmm it is not working… :(
[16:59:17] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90
[17:01:03] yeah not for me either
[17:01:05] with proxy switchy
[17:39:41] moving locs and gettin food, back soon
[18:36:43] average_drifter: all good?
[18:47:50] ungh, i chose cafes poorly
[18:47:54] music: awful and loud
[18:47:57] no outlets
[18:47:59] yarghhh
[18:48:03] this is going on my map!
[18:50:04] milimetric: can you paste hangout url?
[18:50:18] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90
[19:02:45] drdee: working on it, will have a new review soon
[19:03:00] cool cool
[19:58:00] ottomata, do we still have the fundraising files for 2011 and 2012 in kraken?
[19:58:58] no
[19:59:03] they are mounted on an03
[19:59:07] from the netapp
[19:59:15] i am loading in 10 days worth of data from nov 2011
[19:59:23] but it is taking av ery long time
[19:59:28] lots of data in tiny files
[19:59:44] how long is very long time?
[20:00:10] is it worthwhile writing a script to join these files?
[20:00:17] that might be faster overall
[20:01:31] that's what it is doing
[20:01:36] it is about half done
[20:01:42] i started it as soon as cdh4 was up
[20:01:47] so, around noon
[20:01:59] i'm zcating all of these into one big hdfs file
[20:02:25] 2011 nov 18 - 28
[20:02:28] i'm at 2011-11-24-01PM--15 right now
[20:02:32] so more than half done
[20:02:49] i've been cleaning up my puppet stuff, making a more robust cdh4 module
[20:05:10] perfect! and i didn't realize that, sorry
[20:19:38] drdee: I need values from udp-filter.h in both geo.c and udp-filter.c
[20:20:09] drdee: can I include udp-filter.h in geo.c ? I'll use a include guard in udp-filter.h
[20:20:21] sure, why not?
[20:20:37] ok
[20:23:01] ottomata, what is the HDFS block size?
[20:23:08] 64, 128 or 256Mb?
[20:23:16] (or even larger)?
[20:24:26] ottomata ^^
[20:28:05] 64
[20:28:12] configurable in puppet, but 64 by default
[20:30:33] drdee :)
[20:30:33] https://github.com/wmf-analytics/cloudera-cdh4-puppet
[20:46:08] cool ottomata
[20:47:09] can you make me and dan a member of wmf-analytics as well?
[20:49:41] yup
[20:49:56] dan, what's your same github un?
[20:50:06] oh he is a member
[20:57:24] ottomata, can you also monitor and graph swap usage of kraken using ganglia, maybe even part as the CDH4 puppet class?
[20:57:55] this will give us viable feedback on how to tune mapred.child.java.opts
[20:59:12] that is a good idea, not as the cdh4 puppet module, but yeah
[20:59:49] we do already ahve that
[20:59:49] http://ganglia.wikimedia.org/latest/?c=Miscellaneous%20eqiad&h=analytics1001.wikimedia.org&m=cpu_report&r=hour&s=descending&hc=4&mc=2
[20:59:59] at least on a per node basis
[21:00:26] buuut swap usage?
[21:00:31] i don't think these ciscos are going to swap
[21:00:33] they have almost 200G RAM
[21:00:52] right but for the dell machines :)
[21:01:11] aye
[21:01:33] this is going to be tricky actually, very heterogenous hardawre
[21:17:46] latas dudes, i might be back on a little later to work on some stuff..mayyyybe
[21:17:48] :)
[21:21:07] laterz
[21:26:39] what's up with git-review ?
[21:26:56] I just wrote git review and it's just standing there
[21:27:04] git review is from Mars
[21:27:44] drdee: tommorow I'm gonna attend saper's presentation, I talked with him on his SIP server(voip) and he said he's going to explain how to manage without git review
[21:28:08] :)
[21:28:26] can you still get your stuff in gerrit?
[21:29:00] there it is
[21:29:05] https://gerrit.wikimedia.org/r/#/c/26404/
[21:29:21] damn, it was my uber-focus script, I need to add rules for gerrit for that (I have this script written to block me from teh outside world so I can focus, it's based on iptables)
[21:29:31] drdee: yeah that's the review :)
[21:33:41] drdee: done
[21:34:17] t
[21:34:17] ty
[21:37:09] average_drifter: so let's leave this for ottomata, shall we have a look at asana?
[21:39:13] did you ever submit fixes to wikistats?
[21:39:25] if yes, what was the change set id?
[21:39:55] drdee: I55eb95747048a0e54959b72342a3ee94c1d9baf0
[21:40:04] that's the git hash
[21:40:17] what's the gerrit change set id (integer 5 digit)
[21:40:58] checking
[21:41:33] drdee: https://gerrit.wikimedia.org/r/#/c/25036/
[21:41:36] drdee: 25036
[21:42:13] drdee: there's a pending review there
[21:42:35] did you fix my comments?
[21:42:41] drdee: no
[21:42:46] drdee: but I am going to
[21:42:47] ok :)
[21:45:02] does gerrit know how to handle stuff like .. chunks of code moving from one file to another ?
[21:45:18] I mean git actually.. because gerrit only uses what git provides I guess..
[21:46:23] yes use git mv
[21:46:27] what I actually mean is, when I move a big chunk of code from file A to new file B , it's shown as if I just added that big chunk of code to B out of nowhere
[21:48:55] righ
[21:48:56] mmmmm
[21:48:59] good question
[21:49:00] google!
[21:49:45] git knows how to compress that operation when it pushes/pulls as far as I know
[21:50:20] but ignores it as far as tracking where it came from and crap like that
[21:50:22] http://stackoverflow.com/questions/4908336/can-git-really-track-the-movement-of-a-single-function-from-1-file-to-another-i
[21:51:10] apparently git blame is the way to do this
[21:59:22] drdee: cool, git blame -C
[21:59:31] yup
[21:59:46] this is awesome! I can feel myself growing out of my nooby git skin
[22:47:48] d3!!!!!! so close but so far
[22:48:04] btw, playing with SVG makes one feel like God
[22:48:23] dinnertime, tty guys later
[22:49:38] laterz
[23:10:29] hey ottomata
[23:11:33] @ drdee , @ ottomata some tests in udp-filters/run.sh were failing. I added one there, but there were some failing before I started writing code
[23:11:57] which ones, most likely because you added new more example lines to example.log
[23:12:09] the tests are very simple look in the bash script
[23:12:17] it just expects a number of lines
[23:12:18] drdee: I created a separate example.collector.log
[23:12:23] drdee: so I didn't modify the existing one
[23:12:30] so if you add new example lines then you also have to update the bash tests
[23:12:48] i fixed all the tests last week, they all passed
[23:17:05] average_drifter: crap
[23:17:11] this change was not merged https://gerrit.wikimedia.org/r/#/c/25408/1/run.sh
[23:17:17] that contains all the fixes
[23:17:32] if i merge now then we probably have to rebase
[23:18:03] shall i try to merge it?
[23:18:42] well right now I'm a bit fuzzy so if stuff goes wrong I'm not 100%
[23:18:48] can we do it tommorow ?
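[Editor's note] The `git blame -C` behavior discussed at [21:50:22]–[21:59:22] can be tried out in a throwaway repository. Everything below is invented for the demo (file names, the function body, the commit messages); it just reproduces the situation described at [21:46:27]: a chunk of code moved from file A into a brand-new file B.

```shell
#!/bin/sh
# Throwaway-repo demo: move a function from a.c into a new b.c, then
# show that `git blame -C` traces the moved lines to their original commit.
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q
git config user.email demo@example.com
git config user.name demo

# Commit a file containing the function plus something that stays behind.
cat > a.c <<'EOF'
int parse_udp_line(char *line) {
    int matched = 0;
    if (line != 0) { matched = 1; }
    return matched;
}
int untouched(void) { return 0; }
EOF
git add a.c
git commit -qm 'add a.c'
firstfull=$(git rev-parse HEAD)

# Second commit: move the function into a brand-new file.
cat > b.c <<'EOF'
int parse_udp_line(char *line) {
    int matched = 0;
    if (line != 0) { matched = 1; }
    return matched;
}
EOF
cat > a.c <<'EOF'
int untouched(void) { return 0; }
EOF
git add a.c b.c
git commit -qm 'move parse_udp_line() into b.c'

# Plain `git blame b.c` pins every line on the second commit; with -C,
# git detects the lines were moved from a.c (which changed in the same
# commit) and attributes them to the first commit instead.
git blame -C b.c
```

For a whole-file rename, `git mv` plus git's rename detection is enough, as drdee says at [21:46:23]; `-C` is what handles partial moves between files.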
[23:18:54] git fetch https://gerrit.wikimedia.org/r/analytics/udp-filters refs/changes/08/25408/1 && git checkout FETCH_HEAD
[23:19:00] let's do it tomorrow
[23:19:03] ok
[23:19:03] i tried to merge
[23:19:04] and it failed
[23:19:17] but that commit does contain the fixes for the bash tests
[23:19:45] drdee: oh alright, because this is the output of run.sh for the latest commit before I started writing stuff on udp-filters https://gist.github.com/e14702eea2892d8d86db
[23:20:06] yup, i fixed taht
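[Editor's note] drdee's description of the run.sh tests at [23:12:09]–[23:12:17] — "it just expects a number of lines" — amounts to a check like the sketch below. Everything here is illustrative: `grep` stands in for the real `udp-filter` binary, and the sample log lines and expected count are invented.

```shell
# Minimal stand-in for a run.sh-style check: filter a sample log and
# compare the number of matching output lines to an expected count.
cat > example.log <<'EOF'
1.2.3.4 GET /wiki/Main_Page
5.6.7.8 GET /w/index.php?title=Foo
9.9.9.9 GET /wiki/Bar
EOF

expected=2
# In the real run.sh this would be something like:
#   actual=$(./udp-filter ... < example.log | wc -l)
actual=$(grep -c '/wiki/' example.log)

if [ "$actual" -eq "$expected" ]; then
    echo "PASS (matched $actual lines)"
else
    echo "FAIL (expected $expected, got $actual)"
fi
```

This is also why adding lines to example.log breaks the suite, as drdee notes at [23:12:30]: the hard-coded expected counts must be updated in step with the fixture file.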