[00:06:52] mk [00:36:33] dschoon: could we catch up tomorrow? I need to debrief from a few meetings today before meeting. what time are you coming in? [00:36:46] sounds good. [00:36:58] kraigparkinson: i'll be in a little before 10 [00:36:58] thanks for the flexibility [00:37:32] cool [00:37:33] no worries. [01:43:08] ciao [14:18:13] guuuuuuud meuuuuuuning [14:19:37] ottomata!!!!!!!!!! [14:19:43] moriing! [14:19:47] whoomp there he is! [14:19:53] whooomp there he is! [14:20:21] i am pumping some beats this morning [14:21:03] my body is now utterly confused about what timezone it is in. Eugh. [14:21:04] hello drdee [14:22:26] morning guys [14:22:54] mooorning [14:23:18] so ottomata, before we get that puppet stuff approved by ops, can we try it on a labs instance? [14:23:25] yesssuhhhhh [14:23:26] hey SleepyPanda, [14:23:28] it could be our kripke migration [14:23:31] i can try to make that happen [14:23:33] MILIMETRIC!!!!!! [14:23:36] i tried for a minute on friday but got annoyed [14:23:43] what's up Dr dee? [14:23:46] oh ok [14:23:54] i think drdee has had his coffee [14:23:56] that is all that is up [14:24:01] i just feel like shouting :D [14:24:10] ah yes [14:24:15] and this is without coffee [14:24:17] I don't understand the coffee people [14:24:26] mornings are for somber reflection [14:24:27] :) [14:24:29] aha [14:24:40] just took my first sip of coffee [14:24:43] coffee is just yummy [14:24:53] food wakes me up more than coffee [14:25:06] it's more the ritual than anything else (IMHO) [14:25:24] yeah, i like eat sugar wrapped in as much fiber as necessary to prevent it from killing me :) [14:25:52] sooooo ottomata, how eager are you to deploy the filter component of webstatscollector with the replace function? We actually received an official bugzilla bug about this :) [14:26:02] so is there anything I can try with labs? do you just spin up an instance and run the puppet stuff on it manually? [14:26:12] oh no - you said there was some puppet master [14:26:14] drdee, suuure we can do that [14:26:23] we just revert if things break [14:26:28] milimetric [14:26:29] https://labsconsole.wikimedia.org/wiki/Help:Self-hosted_puppetmaster [14:26:31] but then at least we have tried :) [14:26:38] cool ottomata, I'll take a look [14:26:54] i have an instance with most of those instructions done already [14:27:08] puppet-dev.pmtpa.wmflabs [14:27:20] milimetric: about the limn sprint, very few tasks seem to have been finished so far, is that correct? [14:27:40] that's right [14:27:48] it's all puppet/deb that's in process [14:27:58] but i feel like a lot got done [14:28:18] how far are you guys with the debianization? [14:28:20] i will add a few tasks that got done in order to make that happen [14:28:24] ok [14:28:25] deb is done [14:28:31] nice! [14:28:36] without npm? [14:28:37] we just wanted to wait to test on puppet [14:28:39] without yea [14:28:42] k [14:28:57] did average_drifter do the debianization? [14:31:05] milimetric [14:31:22] i'm going to try puppet dev stuff now, i was just talking with hashar about the bit i wasn't sure how to do [14:33:18] cool [14:35:52] average_drifter: can you review https://gerrit.wikimedia.org/r/#/c/47827/ [14:47:46] agh, milimetric, i think i borked this instance when I was playing with it friday, i'm going to trash it and start a new one, unless you are interested in trying to follow those instructions…? 
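A minimal sketch of what "run the puppet stuff on it manually" could look like on a fresh labs instance, assuming a self-hosted puppetmaster set up per the linked Help page; the module path and the class name role::limn are placeholders, not the real manifest layout:

    # on the labs instance, after following Help:Self-hosted_puppetmaster
    sudo puppet apply --modulepath=/etc/puppet/modules -e 'include role::limn'   # role::limn is a made-up name
    # if anything breaks, revert the change or just trash the instance and rebuild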
[14:50:12] um, sorry ottomata - no I'm looking at something else atm [14:50:18] destroy away [14:51:13] mk [15:06:24] ottomata: wondering if you can help me with some issues building the kraken? [15:06:33] i would ask drdee, but he doesn't seem to be around [15:07:03] ottomata: have you successfully compile the kraken jars lately? [15:08:17] no [15:08:23] but i haven't tried very much [15:08:24] actually [15:08:24] yes [15:08:26] i got pretty clos [15:08:30] but I had to skip tests [15:08:31] hehe [15:08:34] mvn package -DskipTests [15:08:35] i think [15:08:39] yeah that ws my iussue [15:08:40] yay [15:08:44] i'll give that a try [15:08:46] k [15:08:58] or do you know a way to compile only specific parts? [15:09:41] actually nvm, i thiknk running mvn compile in the subproject directories worked cause they have their own poms [15:10:39] aye [15:22:52] ottomata: another kraken / maven q: do i need to do anything special to use the wmf nexus instance? [15:23:03] not that I know of [15:23:06] k [15:23:13] i'm getting this error: Could not resolve dependencies for project org.wikimedia.analytics.kraken:kraken-pig:jar:0.0.1-SNAPSHOT [15:23:32] and i suspected it had to do with not having the the right repositories available [15:23:39] yeah, i had that problem tooooooooo, but I thought drdee fixed it for me yesterday [15:23:48] interesting [15:24:45] i think i need to put a settings.xml file somewhere [15:24:53] i found this one in kraken: https://github.com/wikimedia/kraken/blob/master/maven/example.settings.xml [15:27:27] oo that sounds familar, maybe somewhere in ~/.m2 [15:27:28] hmm [15:27:35] yeah [15:27:38] put that in there [15:27:39] i think [15:27:59] trying now [15:31:51] guys, analytics1027 is being really weird with cron, and I'm suspcious of the recent ldap nss removal stufff [15:32:02] i'm going to reboot it, but this means oozie is going down for a min [15:32:49] ottomata: succes! settings.xml in ~/.m2 did it [15:35:50] yeehaw [15:40:31] drdee, re: that sql query for active editor skin usage [15:40:56] I feel like we should have an up to date SQL slave, whether or not we do anything with it [15:41:19] it would be useful to see what the rest of the org sees, and that seems to be how most people start thinking about analysis [15:41:28] so, what can I do to help make that happen? [15:42:33] milimetric: drdee isn't around afaik, but if you want access to the db slave, you could probably get access to the s*-analytics-salves [15:42:57] is that something that everyone else has? [15:43:04] not everyone [15:43:08] i mean our team [15:43:11] the "analysts" do, though [15:43:26] ok, yeah that sounds good. Any idea who to ask? [15:43:27] i suspect diederik has access, but otherwise, i might be the only one [15:45:38] i think Dario is a good contact, let me find the relevant wiki page [15:45:39] i'll ask Dario, no worries [15:45:42] can't find it. I think they are just credentials--not user specific accounts [15:45:42] k [15:45:42] cool [15:45:42] so in theory I could share mine with you, but I should check the norms [15:45:53] yeah, i'll go through Dario to make sure we're on the up and up [15:45:54] :) [15:57:32] dschoon: you know anything about the kraken build process? [16:08:06] milimetric, can you show me how to set up dev-reportcard with limn pointing at a tsv and pivoting? 
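The Nexus fix pieced together in the kraken build thread above (15:07–15:32), as one runnable sketch; it assumes kraken is checked out in the current directory and that the file to copy is the example.settings.xml linked at 15:24:

    mkdir -p ~/.m2
    cp kraken/maven/example.settings.xml ~/.m2/settings.xml   # makes the WMF Nexus repositories visible to Maven
    cd kraken
    mvn package -DskipTests                                    # note the capital D; tests are skipped for now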
[16:08:11] http://stats.wikimedia.org/kraken-public/webrequest_loss_by_hour.tsv [16:08:49] the pivot happens automatically when the datasource refers to a datafile and defines format: 'log' [16:08:57] but I can help [16:09:00] hangout? [16:09:07] standup link [16:09:42] k [16:37:52] HMM [16:37:53] milimetric [16:38:00] after installing the .deb (and apt-get isntal [16:38:00] so [16:38:03] I did [16:38:14] apt-get install nodejs rlwrap [16:38:20] sudo dpkg -i limn...deb [16:38:26] but [16:38:42] /usr/bin/env: coco: No such file or directory [16:38:45] average_drifter ^ [16:43:12] ottomata: looking at it [16:43:30] ottomata: can you link me to the deb you used please ? [16:47:09] uh, its the one milimetric just emailed me [16:47:12] i can email it to you [16:47:35] oops, was downstairs [16:47:48] also, you should be able to ssh into puppet1.pmtpa.wmflabs [16:47:50] where i'm working on this [16:47:53] cool [16:47:55] the .deb is in my homedir [16:47:55] doing so now [16:50:02] trying the dpkg -i myself now [16:51:11] ottomata: I know the reason [16:51:22] the reason is this [16:51:22] oh ja? [16:51:34] me and Dan talked about coco [16:51:41] he said "do npm install -g coco" [16:51:56] and I said "hey, but let's have it self contained within the node_modules" [16:52:02] I tried to have it in node_modules [16:52:15] oh I see the confusion [16:52:17] but then coco was complaining that it couldn't find server.js [16:52:24] I was saying install -g coco was ok on the Build box [16:52:27] so because we had to pull out a deb I decided to -g coco [16:52:32] but no coco sits in /usr/lib somewhere [16:52:38] and we haven't included it in the .deb [16:52:45] i think it is in the .deb [16:52:53] ottomata: let me check [16:52:54] just not in the path [16:52:57] it should be included by default in node_modules [16:53:10] moment, I'll check [16:53:23] /usr/lib/limn# ls -l node_modules/.bin/coco [16:53:23] lrwxrwxrwx 1 root root 22 Feb 20 16:18 node_modules/.bin/coco -> ../coco/lib/command.js [16:53:39] server.co does [16:53:43] #!/usr/bin/env coco [16:53:54] but node_modules/.bin/coco is not on the path [16:54:05] need wrapper script :p [16:54:12] wrapper script ? [16:54:31] ottomata: oh btw, remember you gave me a gist with the limn.conf ? [16:54:36] yes [16:54:39] ottomata: that one is in debian/limn.upstart now [16:54:46] ok [16:54:49] ottomata: did you build from develop branch ? [16:54:59] I thought we were just going to do $LIMN_INSTALL/node_modules/.bin/coco server.co to start it [16:55:08] milimetric: no, because it doesn't work [16:55:09] oh we were? [16:55:13] no [16:55:20] why doesn't that work? [16:55:24] because coco doesn't like that [16:55:25] works locally for me [16:55:34] it works i think too [16:55:37] milimetric: it works locally for you because you already have coco installled [16:55:37] i just tried [16:56:01] ottomata: did you do npm install -g coco on the machine where you tried that ? [16:56:12] because if so, then that means coco is able to find its stuff in /usr/lib... [16:56:22] mmm, no, I just npm remove -g coco and it still works [16:56:41] /usr/lib/limn/node_modules/.bin/coco /usr/lib/limn/server/server.co [16:56:42] path.existsSync is now called `fs.existsSync`. 
[16:56:42] Limn Server (port=8081, env=development, rev=8780a4f, vardir=./var, data=./var/data) [16:56:44] no i did not [16:56:45] ok, time to kill the coco myths [16:57:07] node_modules/.bin/coco is the same exact thing as the global coco you get when installing with npm i -g coco [16:57:26] milimetric: try removing all the cocos you have on your machine [16:57:29] if it's not working in some environment, let me know where and we'll figure out what the problem is [16:57:35] milimetric: and then try the node_modules one [16:57:38] I did average_drifter, they're all gone [16:57:40] milimetric: if that works then I'm wrong [16:58:00] I removed it and now if I go "coke server" it doesn't know what coke is [16:58:01] i never installed coco on puppet1 [16:58:03] and that worked [16:58:13] but node_modules/.bin/coco still works [16:58:18] milimetric: doesn't coco pull coke too ? [16:58:34] I remember dschoon told me coco pulls coke too [16:58:40] pull? [16:58:42] as a dep [16:58:55] npm i -g coco installs coke, yes [16:59:04] but coke isn't needed for running the server, just building and stuff like that [16:59:35] ok, I'll build from develop and install the deb [16:59:42] oh crap [16:59:47] milimetric: ? [16:59:49] but without coke we will have to manually link the data [16:59:55] because coke link_data does that for us [17:00:05] milimetric: so we need coke too right ? [17:00:16] inside the .deb [17:00:19] but that's not something we can help right? Ops doesn't let us do npm i -g coco right ottomata? [17:00:31] I don't think we can put coke inside the deb unfortunately [17:00:54] milimetric: but do we really need coke ? [17:01:10] no, it's not strictly necessary, it's just nice [17:01:19] ok [17:01:28] I'll try to build the deb and see if I get the same error [17:01:28] right [17:01:34] also [17:01:37] average_drifter [17:01:38] I'll let you know in a couple of minuts [17:01:41] when I remove the .deb [17:01:41] ottomata: yes [17:01:43] uninstall [17:01:45] yes [17:01:51] $ sudo dpkg --purge limn [17:01:51] (Reading database ... 38788 files and directories currently installed.) [17:01:51] Removing limn ... [17:01:51] Removing user `limn' ... [17:01:51] Warning: group `limn' has no more members. [17:01:51] Done. [17:01:51] Purging configuration files for limn ... [17:01:52] /usr/sbin/deluser: The user `limn' does not exist. [17:01:52] dpkg: error processing limn (--purge): [17:01:53] subprocess installed post-removal script returned error exit status 2 [17:01:53] Processing triggers for ureadahead ... [17:01:54] Errors were encountered while processing: [17:02:16] ottomata: I'll fix that too [17:02:22] ok cool [17:02:39] it's the deluser inside the postrm script [17:02:49] yeah it look slike it tries to remove it twice [17:03:08] maybe the deb is smart and it already removes it and then the postrm tries to do that again [17:03:09] so yeah, we should make the upstart script [17:03:18] start by calling node_modules/.bin/coco [17:03:18] ottomata: the upstart script is already made [17:03:25] instead of just server.co [17:03:29] right, but that will fix the coco problem [17:03:35] ottomata: what branch are you building the .deb from ? develop right ? [17:03:44] i htink so, whatever milimetric gave me [17:03:45] i'm not building it [17:03:55] milimetric: what branch have you built the .deb from ? 
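Two small sketches for the issues above. First, the start command settled on just below (17:04), written as a stand-alone wrapper script so nothing needs a globally installed coco; the script path is hypothetical, the real package drives this from debian/limn.upstart:

    #!/bin/sh
    # hypothetical /usr/lib/limn/run.sh -- start Limn with the coco bundled in node_modules
    cd /usr/lib/limn && exec ./node_modules/.bin/coco ./server/server.co

Second, the usual guard for the postrm failure at 17:01; the actual postrm isn't quoted in the log, so this is only the idiom, not a patch:

    # debian/postrm: only delete the limn user if it still exists,
    # so a second removal attempt does not make `dpkg --purge limn` exit non-zero
    if getent passwd limn >/dev/null 2>&1; then
        deluser --system limn
    fi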
[17:03:56] one sec average_drifter [17:03:57] I'll push [17:04:00] ok [17:04:16] ok, pushed [17:04:20] it's develop yes [17:04:27] just had one commit not on /wikimedia [17:04:33] yeah, average_drifter [17:04:38] this in the upstart inti works [17:04:38] exec ./node_modules/.bin/coco ./server/server.co [17:05:43] hi all [17:06:16] kraigparkinson: hello Kraig [17:08:00] milimetric: sorry I'm late. [17:08:15] no prob kraigparkinson, I'm on the standup hangout [17:09:10] if it doesn't work, you can start a meeting and invite me instead [17:20:12] ok guys, i'm moving to a cafe [17:20:23] fyi, aside from that coco startup issue, the .deb and puppetization seem to work! [17:21:24] ottomata: cool, we'll get the coco startup sorted out in the next hour or so [17:40:41] yo drdee: where did you end up putting that kraken jar yesterday? [17:40:51] no where :) [17:41:07] hehe, i suspected [17:41:16] i attempted to build it myself and ran into some issues [17:41:20] have a sec? [18:18:56] * brion waves hello [18:19:13] zomg a brion :D [18:19:17] I'm looking for some usage figures on iOS browser versions [18:19:27] http://stats.wikimedia.org/wikimedia/squids/SquidReportOperatingSystems.htm has comparative versions but was last generated in november [18:19:33] is there anything more up to date i can grab? [18:20:08] ideally, i want to decide that iOS 5 usage is low enough i don't have to support it for new apps ;) [18:20:16] brion: very alpha [18:20:17] http://test-reportcard.wmflabs.org/graphs/pageviews_mobile_by_manufacturer [18:20:20] love feedback [18:20:28] we are about to send this to the mobile team anyways [18:20:37] browser version are very high on the priority list [18:20:40] pretty! [18:20:48] data view is broken :( [18:20:53] but we love love feedback [18:20:57] data is 15 minutes old [18:21:19] yeah lots of NaN :) [18:21:29] we know :) [18:21:38] you can download the ison file [18:22:26] nice, i like being able to see different breakdowns by country [18:22:43] this is still vendor / manufacturer [18:22:45] that would be more useful to me on an OS and OS version basis than manufacturer [18:22:53] *nod* [18:22:57] drdee: Apple is... 20 times bigger than Samsung? [18:23:03] yes, we are on it [18:23:29] YuviPanda: we might need to tweak the device recognizer more [18:23:45] YuviPanda: rumor is apple users surf the web more than android users on average. don't know how true this is, that's a big difference indeed [18:23:51] hmm, and Samsung is only slightly bigger than apple in India [18:24:02] which definitely doesn't sound too right [18:24:12] brion: true, but 20 times? [18:24:28] yeah :) [18:24:31] remember: this is alpha and this is exactly the feedback we need [18:24:41] yeah, so giving :) [18:24:49] drdee: also I can't see the full list - my screen is too small, I guess? [18:24:51] (13") [18:25:08] average_drifter, where can I find lib-dclass .deb? [18:25:12] that could be a limn bug [18:25:13] gets cut off after Samsung on India. I don't see the end of the popup. [18:25:15] or how can I get it on my mac? [18:25:17] milimetric ^^ [18:25:37] right there is 'desktop' which I'm not seeing. [18:25:44] I can see it if I zoom all the way out [18:25:55] oh more (I found motorola after desktop) [18:26:18] drdee: and 'see a table' is very Batman-ish [18:26:25] 0NaN-NaN-NaN NaN:NaN [18:26:34] yes that's broken [18:27:06] ottomata: https://github.com/wikimedia/dClass/tree/package [18:27:15] yeah and clips too. 
like I can't see any numbers for australia [18:27:36] can I maybe jsut clone and make && make install? [18:27:40] average_drifter: can you review https://gerrit.wikimedia.org/r/#/c/47827/ [18:28:01] ooh this is an awesome version marketshare graph… http://thetechblock.com/wp-content/uploads/2012/11/share-count-10.jpg [18:28:13] drdee: yes sir, I'll have a look at it [18:29:28] hmm, doesn't look like dClass will compile on os x [18:29:36] ottomata: oh yes it will [18:29:41] oh? [18:29:45] ottomata: drdee was able to compile it on OSX [18:29:45] main.c:73: error: ‘CLOCK_REALTIME’ undeclared (first use in this function) [18:29:54] #I also compiled it on [18:29:57] OSX [18:30:25] that problem was long ago fixed [18:31:05] ottomata: check_1) have you switched to package branch ? [18:31:24] ottomata: check_2) look at https://github.com/wikimedia/dClass/blob/package/Makefile.am , it does not involve compiling main.c [18:31:39] ah i did not switch branches, sorry [18:31:47] ok [18:33:07] dschoon, when I am in SF this/next week [18:33:15] can you help me get setup with a good java env [18:33:20] every time I try to do anytihng java I just give up [18:33:37] just install intellij idea [18:33:39] all I want to do is edit java code and compile, now i'm down the rabbit hole again [18:34:08] i've got intellij [18:34:14] not using it though cause I don't know how :p [18:34:21] average_drifter [18:34:22] ok [18:34:24] i'm in pacakge branch [18:34:26] what do I do? [18:34:34] there's no info about compiling [18:34:36] just info on how to build .deb [18:34:41] which I obvi can't do on os x [18:34:55] ottomata: ok, do this [18:35:44] ottomata: libtoolize; [18:35:44] aclocal; [18:35:44] autoheader; [18:35:44] autoconf; [18:35:44] automake --add-missing; [18:35:54] make [18:36:01] now look in .libs/ [18:36:06] #you should find the .so you need there [18:36:13] now make install [18:36:19] -bash: libtoolize: command not found [18:36:20] :p [18:36:22] brew? [18:36:25] ottomata: brew [18:36:34] brew or port [18:36:40] k [18:36:44] whichever you prefer [18:38:34] make: *** No targets specified and no makefile found. Stop. [18:38:49] in src? [18:38:58] configure? [18:39:05] configuring [18:39:22] close [18:39:23] jni/dclass-wrapper.c:4:17: error: jni.h: No such file or directory [18:39:48] yes [18:39:50] moment [18:40:05] ottomata: if you look at Makefile.am [18:40:10] there's a JAVA_HOME there [18:40:15] you need to set that in your .bashrc [18:40:20] you probably don't have that variable [18:40:38] check_3) make sure $(JAVA_HOME)/include/linux exists [18:40:50] check_4) make sure $(JAVA_HOME)/include exists [18:40:52] echo $JAVA_HOME [18:40:53] /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home [18:41:13] no $JAVA_HOME/include/ dir [18:42:04] how do I get an include/ dir? [18:42:25] ottomata: moment [18:42:38] k [18:42:55] firing up OSX 10.7 vm [18:43:13] brion and YuviPanda - the clipping of the infobox is an annoying problem, we know about it [18:43:44] and the tabular data view is "coming soon" either this week or early next [18:43:58] i'm on 10.8, fyi [18:44:13] interesting, from a research standpoint: http://aws.amazon.com/redshift/ [18:44:49] milimetric: is the clipping problem a task in asana? [18:45:07] it used to be on dschoon's map todo list [18:45:17] does it still exist? [18:45:24] i thought that wasn't a real problem. 
[18:45:27] oh yeah, dschoon, now I remember [18:45:31] you said you fixed it at some point [18:45:48] but I never saw it change - it clips when the box is too big [18:46:19] i did? [18:46:43] i think probably the more likely narrative is that i don't understand the bug :) [18:46:54] ah, that might make sense [18:46:57] if somebody can screenshot what the issue is, i'll take a look sometime [18:47:04] we can reopen it for another sprint [18:47:12] but for example, i don't see any missing countries [18:47:14] i'll add a screenshot on a task in Asana [18:47:16] k [18:47:23] no missing countries, that was a different thing [18:51:21] k, added and made you guys follow it [18:51:39] preilly: hello, I don't have rights to delete branches on github.com/wikimedia/dClass [18:51:43] preilly: what can I do ? [18:56:29] i think chad (^demon) is the man to talk to [18:58:02] alright [19:02:13] ok average_drifter, I have admin [19:02:16] how can I help? [19:02:30] ah try now, analytics was not a team owner [19:02:48] i've got admin rights on github as well [19:02:55] what's your github username? [19:03:06] hmm [19:03:12] ok fixed [19:03:13] worked now [19:03:20] I wanted to delete that pesky debianize branch [19:03:29] ah yeah [19:03:30] you're there [19:03:30] since the package branch had all our debianization stuff [19:03:31] cool [19:03:34] aye ok [19:04:18] I'll also carefuly check some other leftover branches on [19:04:23] for example limn on github.com/wikimedia/limn [19:04:34] erm, there's one branch there that has to be deleted also [19:04:47] ottomata: anyway, going back to your compiling on OSX [19:05:00] :) [19:05:17] i will delete limn_vardir since it has been merged [19:05:57] yea, too many branches can create confusion, 'specially if they're old and not used anymore [19:06:15] i think we're all strongly in favor of deleting old branches :) [19:06:21] :D [19:06:41] :o [19:07:16] drdee, can you point me to a commit in GeoIPLookup that used getCacheFiles? [19:07:30] yes hold on [19:07:35] i went back as far as the logs had, but they start from when you reorganized the dir hierarchy [19:10:18] drdee, what is the advantage to having this.databases in that class? [19:10:29] can't the constructor just take in a file path? [19:10:32] to a .dat file? [19:10:46] oh i see validateDatabaseName [19:10:51] i guess... [19:11:13] that seems a little unnecessary though, if someone passes in a bad path that would fail just the same [19:11:17] it's a bit verbose [19:11:21] why does it matter what the file is named though? [19:11:49] an additional check on the input [19:11:55] but feel free to remove / imrpove [19:11:56] But it'll fail anyway [19:12:00] when it doesn't open [19:12:02] yes? [19:12:07] right [19:12:12] no real value in checking, as it's not like there's any cleanup either way [19:12:14] and it makes you have to hardcode paths [19:12:19] in the code [19:12:22] validation is important when unrolling the problem is hard [19:12:25] like /usr/share/GeoIP [19:12:29] but otherwise, it's better to just fail early [19:12:36] hardcoding seems bad. [19:13:40] LookupService throws an exception anyway [19:13:40] Parameters: [19:13:41] databaseFile the database file. [19:13:41] Throws: [19:13:41] http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/6-b14/java/io/IOException.java#IOException if an error occured creating the lookup service from the database file. 
[19:13:51] disagree, reason why i did was to make it easier for analysts to run the pig script and have to know where files are installed [19:14:07] i mean don't have to know [19:14:42] right, but hm, i see why you did it, but it makes the class hard to work in other places [19:14:43] like hadoop [19:14:48] sure [19:14:52] but can't we do what otto did [19:14:57] (what did I do?) [19:15:01] and make a special macro that contains the paths? [19:15:04] ah [19:15:06] so that way we can have our generic class [19:15:09] that's a good idea [19:15:20] and put all the conf in a macro, so the analysts can still focus on analysis? [19:15:25] hm. is that possible, yeahhhhhh, i think so [19:15:28] hmm [19:15:44] hmmm, but we might as well do that in the UDF [19:15:46] at that point [19:15:51] just check multiple paths [19:16:02] ottomata, getCacheFiles is implemented [19:16:05] yeah [19:16:06] i know [19:16:08] maybe it already gets callled [19:16:15] yeah, i like it. [19:16:21] but, this.dbPath is hardcoded [19:16:25] it is not called [19:16:26] i am trying to find the docs [19:16:34] let's call it :) [19:16:45] righhhhhht, I can [19:17:03] but i'm not sure how to use it [19:17:10] it then sets cacheFiles [19:17:12] that's fine [19:17:14] i'm going off to lunch, dschoon we can talk when i get back. I pushed an approach to having all these "smooth node", "scaling node", little widgets [19:17:15] then what? [19:17:21] it slides down from above the legend [19:17:28] (it's on my fork if you wanna see) [19:17:28] coolio [19:17:35] catch you after lunch [19:17:35] i like it. [19:17:36] oh if cacheFiles.hasKey(this.db) [19:17:38] word. [19:17:46] new LookupService ... [19:17:47] hmmm [19:18:40] ottomata look https://github.com/louisdang/kraken/blob/master/src/main/java/org/wikimedia/analytics/kraken/pig/GeoIpLookup.java [19:18:48] this was how it was previously working [19:18:59] there is no explicit call to getCacheFiles [19:19:09] i think it always gets called [19:19:15] the default implementation returns null [19:19:33] drdee , ottomata would you like to agree on where your JDK sits on OSX ? 
[19:19:56] it depends on the version of OSX :( [19:20:02] oh hm [19:20:31] ok, then please give me hostnames to your local machines so I can switch depending on that [19:20:37] haha [19:20:48] i'm using 10.8.2 [19:21:01] average_drifter: there's a cli utility [19:21:02] $ echo $JAVA_HOME [19:21:03] i believe [19:21:03] /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home [19:21:28] ottomata: ok, I'll use your $JAVA_HOME [19:21:30] /usr/libexec/java_home [19:21:32] run that [19:21:34] ok [19:21:36] it'll tell you where JAVA_HOME is [19:21:38] $ ls $JAVA_HOME [19:21:39] bin bundle lib man [19:21:48] $ /usr/libexec/java_home [19:21:48] /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home [19:21:52] the variable should only be set when you need to OVERRIDE it [19:21:57] but i have no $JAVA_HOME/include directory [19:21:59] like you were expecting [19:22:08] right [19:22:09] here's why [19:22:15] apple stopped shipping the headers [19:22:17] for the jdk [19:22:25] you need to install an update on OSX > lion [19:22:48] you wouldn't know this because you compiled jzmq in your VM :) [19:23:24] so ergh, drdee cacheFiles is a List, not a hash, I can't just check to see if the desired database has a key in it [19:23:31] really, I think the UDF shoudl jsut load a file path [19:23:35] whatever it is given [19:23:52] users of the UDF are supposed to make sure that the file exists wherever it is suppsoed to be [19:25:05] btw [19:25:17] does this exist? /Developer/SDKs/MacOSX10.*.sdk/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Headers/ [19:25:20] go ahead and fix it :) [19:25:57] haha, ok, but i'm going to take out the db hash :p [19:26:02] ok, could you also agree on everyone using the same version of JDK on OSX ? [19:26:02] and probably break everything [19:26:18] no [19:26:19] dschoon ^ [19:26:19] no ? [19:26:19] :D [19:26:24] it would help me a lot if you do [19:26:26] ottomata: yeah, you need to install some bullshit from apple then [19:26:32] average_drifter: just ignore supporting OSX :) [19:26:47] but we need to support it to be able to build kraken jars [19:26:47] dschoon: but we need it [19:26:52] on local envs [19:26:54] we already had fixed this [19:27:00] you told me to use intellij, remember :p [19:27:00] average_drifter: jdk 1.6 [19:27:09] [~]$ java -version [19:27:09] drdee: we fixed it for you and me [19:27:10] java version "1.6.0_37" [19:27:17] that should work [19:27:25] i think its the problem dschoon says [19:27:26] we do? [19:27:28] not versions [19:27:30] ah. [19:27:36] no. [19:27:38] yeah, mvn package fails [19:27:41] because I don't have dClass installed [19:27:47] the problem is that you need to install the headers [19:27:48] from apple [19:27:49] drdee: actually we fixed it when we were using the compile-dwrapper.sh script which had logic for Debian and Darwin in it [19:27:57] xcode? [19:27:59] drdee: but then we made a Makefile.am and we don't have logic for OSX in that [19:28:02] otherwise building anything involving JNI will fail [19:28:03] thought I had that crap [19:28:04] no, sadly [19:28:06] and it assumes a working java install ;) [19:28:12] it doesn't ship with xcode any more [19:28:16] as of lion [19:28:18] hold on [19:28:21] i'm looking for links [19:28:22] trust me :) [19:28:24] k [19:28:49] drdee, are there tests for GeoIPLookup that will run when I try to build? 
i really have no idea what will happen when I change this thing [19:28:50] ottomata: ok, if you can get standard jdk 1.6 then that would be great [19:28:59] ottomata: standard jdk 1.6 for OSX that is [19:29:06] i went through this when installing jzmq for storm on OSX [19:29:14] this is not standard? [19:29:14] [~]$ javac -version [19:29:15] javac 1.6.0_37 [19:29:36] ok, it is standard [19:29:43] you would have to write pig unit tests to test the getCacheFiles [19:29:48] then I will just use the JAVA_HOME and see how it goes [19:29:49] Install the Java Developer Package from Apple's ADC site: http://connect.apple.com/. You'll need it because for whatever reason, the included JDK install on Mac OS X moved the /include directory, which contains the jni.h file, to some other location besides the directory containing /bin and /lib. I suppose I could have symlinked it, but this seemed cleaner. [19:29:59] From their docs: [19:29:59] The Java Developer package puts an additional copy of the Java SE 6 bundle in /Library/Java/JavaVirtualMachines/. This copy is installable without disturbing the existing system JDK. [19:29:59] The download, since Apple doesn't like direct links, is under Downloads > Java. I used the Java for Mac OS X 10.6 Update 3 Developer Package download. [19:30:13] http://codeslinger.posterous.com/getting-zeromq-and-jzmq-running-on-mac-os-x [19:30:39] (use whatever the latest one is) [19:30:47] Set your JAVA_HOME to point to the new JDK. I usually stick this in .bash_profile. [19:30:47] export JAVA_HOME=$(/usr/libexec/java_home) [19:30:58] that's essentially what I did. [19:31:05] ...it still might not work. [19:31:07] but [19:31:10] this is somewhat off-topic, but why don't we have maven installed on stat1 or the analytics machines? [19:31:10] try that first. [19:31:20] we don't? [19:31:27] huh [19:31:35] i tried stat1, an01, an10, and an2 [19:31:39] an02, that is [19:31:56] i installed it on an02 today :p [19:32:02] i dunno man, because we're supposed to do this crap in labs? [19:32:06] which is annoying [19:32:08] i dunno [19:32:09] AGH [19:32:18] every time I try to do java stuff I spend hours not doing what I want [19:32:20] that's a reasonable answer [19:32:46] I am filling out an apple developer profile right now AGHHH [19:32:53] my implication about maven installs is that it just seems like we should be using linux as the build env [19:33:01] yes. [19:33:18] it shouldn't be as hard as it is, to be honest. [19:33:26] i would [19:33:27] this is why people have a build machine [19:33:32] and use CI to build all their jars. [19:33:32] except i'm supposed to use intellij, right? [19:33:36] ottomata: I did that too, don't worry, you just have to do it once [19:33:47] nobody should be using their local machine to deal with all this except for testing [19:33:59] we should have one machine, one env, which is set up like our production deploys [19:34:08] and it should be the jenkins machine [19:34:15] from there we draw all our release builds [19:34:21] and deploy them to the rest of our machines [19:34:48] but [19:34:57] lots of but [19:34:58] don't I need to be able to compile on my local in order to use intellij? [19:35:01] i mean, that's not reality. [19:35:04] yes. [19:35:05] i don't want to commit in order to run code [19:35:13] i'm just saying that intellij should make all this easier. 
[19:35:28] the problem you're having now uniquely involves building code that uses JNI [19:35:40] JNI (java native interface) requires the headers [19:35:46] the headers don't ship on OSX any more [19:35:51] haha, 150MB yay! [19:35:58] yep. [19:36:01] it's a whole JDK. [19:36:04] fun times. [19:36:40] can you paste that link again? [19:36:48] i need to do this same process, so I might as well start the download [19:36:55] ottomata: ^ [19:37:23] http://codeslinger.posterous.com/getting-zeromq-and-jzmq-running-on-mac-os-x [19:37:33] --> http://connect.apple.com/ [19:37:41] Install the Java Developer Package from Apple's ADC site: http://connect.apple.com/. You'll need it because for whatever reason, the included JDK install on Mac OS X moved the /include directory, which contains the jni.h file, to some other location besides the directory containing /bin and /lib. I suppose I could have symlinked it, but this seemed cleaner [19:37:51] just found it, thanks [19:37:59] you want [19:38:03] "Java for Mac OS X 10.6 Update X Developer Package" [19:38:07] 10.6? [19:38:09] where X is whatever the latest version is [19:38:12] I'm getting [19:38:18] whatever [19:38:18] Java for OS X 2013-001 Developer Package [19:38:21] sure [19:38:23] ok [19:38:23] sounds good. [19:38:30] These notes are from 2011 [19:38:36] but the basic outline is right [19:38:44] i followed them recently for jzmq [19:39:07] ok [19:39:09] so [19:39:17] i still ahve the same jni.h error, what do I need to do? [19:39:26] you set JAVA_HOME? [19:39:35] oh its in /Developer now [19:39:36] hmm [19:39:41] or where? [19:39:59] ls: /Developer/: No such file or directory [19:40:25] ls $( /usr/libexec/java_home )/../Headers/ [19:40:30] that should have stuff [19:40:36] if not, you're in trouble. [19:40:37] yes! [19:40:37] i do [19:40:44] okay. [19:40:47] ah, [19:40:50] the makefile is not linking to that [19:40:50] hmmm [19:40:56] export JAVA_HOME=$(/usr/libexec/java_home) [19:41:08] that needs to go in .profile [19:41:11] or whatever [19:41:28] oo is getting better [19:41:52] average_drifter [19:41:52] ld: unknown option: --no-undefined [19:41:53] collect2: ld returned 1 exit status [19:41:53] make[1]: *** [libdclassjni.la] Error 1 [19:41:53] make: *** [all] Error 2 [19:42:10] importantly! it found the lib :) [19:42:10] ottomata: if all else fails, I have a plan Z for you [19:42:14] yup! [19:42:17] which means it found the headers [19:42:17] we're getting close [19:42:20] jni.h is linking! [19:42:25] and thus, i will brb [19:42:25] :) [19:42:30] thank you dschoon! 
[19:42:46] ottomata: which project exactly are you trying to compile [19:42:46] dClass [19:42:59] not even a kraken target, but it is required for kraken udfs [19:43:21] interesting [19:43:37] i hit this error shortly after the profile.xml success [19:44:15] ottomata: I was going to give you explicit gcc commands to run to get the .so [19:44:26] but if it worked as dschoon mentioned then even better [19:44:31] well, it links now [19:44:32] so that's good [19:44:33] but [19:44:35] i'm on this error now [19:44:38] ld: unknown option: --no-undefined [19:44:52] libtool: link: gcc -dynamiclib -Wl,-undefined -Wl,dynamic_lookup -o .libs/libdclassjni.0.dylib .libs/libdclassjni_la-dtree_core.o .libs/libdclassjni_la-openddr_client.o .libs/libdclassjni_la-dclass_file.o .libs/libdclassjni_la-dtree_mem.o .libs/libdclassjni_la-dtree_util.o .libs/libdclassjni_la-dclass_client.o .libs/libdclassjni_la-dclass-wrapper.o -Wl,--no-undefined -install_name /usr/local/lib/libdclassjni.0.dylib -compatibility [19:44:53] ld: unknown option: --no-undefined [19:44:57] ottomata: can you push your changes so I can replicate please ? [19:44:57] ottomata: another q: which dir are you trying to build? [19:45:03] i didn't make any changes [19:45:11] ok [19:45:15] ? [19:45:18] erosen^? [19:45:30] i mean what are the commands you are using [19:45:33] in kraken? [19:45:36] or dClass? [19:45:36] yeah [19:45:40] kraken: [19:45:44] i think just pig [19:45:45] cd kraken-pig [19:45:46] ottomata: let's do a hangout [19:45:50] k [19:45:50] mvn package -dskipTests [19:45:51] k [19:45:52] ottomata: you wanna hangout with me ? [19:45:53] that is what I wanted to know [19:45:55] thanks [19:46:07] https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc?authuser=1 [19:46:16] oops [19:46:25] https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc?authuser=1 [19:46:26] ack [19:46:30] https://plus.google.com/hangouts/_/2da993a9acec7936399e9d78d13bf7ec0c0afdbc [19:46:31] there [19:57:54] back [20:01:49] why do you need -Wl,--no-undefined if you have -Wl,-undefined? [20:02:07] ^^ ottomata average_drifter [20:06:41] back [20:08:02] fyi, haven't looked yet, milimetric [20:08:05] going to lunch soon [20:08:20] well that's lower priority dschoon [20:08:31] we should work on the canvas thing, assuming you're done with the total thing [20:08:36] I can do the total thing as well [20:09:00] i've been helping these guys with their build problems. i'll jump on it after lunch, but if you want to add the total to the hack, that'd be fine [20:13:32] do do dooooo [20:13:40] we are uploading and downloading .dmg files all over the place [20:13:43] brb food [20:13:45] BAM [20:13:48] UP [20:13:48] DOWN [20:13:49] LOAD [20:13:51] BAM [20:13:54] brb food :) [20:14:05] we find this easier than stefan remembering his apple developer password [20:24:24] ottomata, milimetric: just curious, are you guys blocked on your build issues without dschoon's help? [20:26:44] don't thikn so [20:27:03] i just got average_drifter to reproduce my problem [20:27:12] which took 45 minutes! AGHHH JAVAAAAA [20:27:30] included downloading and uploading several .dmgs from apple [20:27:46] ~150MB dmgs [20:27:48] blablabla [20:28:47] good times... [20:29:28] ottomata: is it worth putting notes together somewhere in case we run into this again? 
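On the linker error above: --no-undefined is a GNU ld option, and the OS X linker rejects it, which is why the libdclassjni link step dies even once the JNI headers are found. A quick local workaround, assuming the flag really does come from Makefile.am and nothing else there needs it (the advice at 20:43 below is the same idea: just drop the flag):

    # strip the GNU-only flag, regenerate the build files, rebuild (BSD sed syntax on OS X)
    sed -i '' 's/-Wl,--no-undefined//g' Makefile.am
    automake && ./configure && make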
[20:37:52] hm, i dunno, no i think….i dunno [20:38:02] average_drifter is fixing the build files so it will compile on os x [20:38:45] but, wait, i'm confused, drdee, why can't I compile the GeoIPLookup class without dclass [20:38:49] oh maybe I can, i just can't use maven? [20:38:53] you can [20:39:00] just go to kraken-pig [20:39:02] and run mvn compile [20:39:06] that's what i'm doing [20:39:09] (well, mvn package) [20:39:13] but dClass is a pig udf [20:39:15] so it builds that too [20:39:19] ohh right [20:39:27] this is one time pain [20:39:33] sorry :( [20:39:35] how do you develop? [20:39:48] intellij and everything builds on my osx :) [20:39:52] haha [20:39:57] how's it build on yours but not mine!? [20:39:59] also [20:40:03] tell me how to use intellig [20:40:05] j [20:40:07] you could just delete the dclass pig udf [20:40:10] i'm still textmate and cli [20:40:11] hm [20:40:18] i have intellij opened [20:40:19] (not from git obviously)( [20:40:21] and i think i imported the project [20:40:27] I have 5 main directories [20:40:33] 6 actually [20:40:47] you should have one [20:40:47] http://cl.ly/image/2f3y3u1C0m2c [20:40:56] import the parent pom in kraken [20:40:59] the rest will follow [20:41:10] ahhm [20:41:16] oink [20:41:19] that looks odd [20:41:27] i think i imported the dir and not the .pom [20:41:28] uhm [20:41:33] anything to check in the import dialog? [20:41:38] automatically download Sources [20:41:39] ? [20:41:49] don't import the dir [20:41:51] import the pom [20:41:54] right [20:41:56] doing tha tnow [20:41:57] start fresh [20:41:59] ok [20:42:04] so [20:42:07] yes and download sources and docs [20:42:15] buuuuut [20:42:22] first go to kraken/maven [20:42:27] settings.xml [20:42:28] and setup settings.xml properly [20:42:28] got tha [20:42:30] t [20:42:33] ok [20:42:39] then just import the main pom [20:42:40] select nexus profile? [20:42:43] you should be good to go [20:42:44] yes [20:42:46] is .so actually called .dylib on OSX ? [20:43:01] http://cl.ly/image/1P3p132F0A3h [20:43:01] ? [20:43:01] yes i think so [20:43:10] why this means we're done [20:43:11] check mark sources, docummentation [20:43:14] import kraken SNAPSHOT [20:43:16] ottomata: just comment --no-undefined [20:43:19] hmmm [20:43:22] errrr [20:43:23] wait wait [20:43:28] and module groups for multim-module projets [20:43:36] the name is also important because it's then loaded from a specific path [20:43:38] and also import maven automaticaly [20:44:14] drdee [20:44:14] http://cl.ly/image/1P3p132F0A3h [20:44:15] yes/no? [20:44:40] average_drifter: name should be /usr/local/lib/libdclassjni.0.dylib [20:44:51] drdee: is that the name on your local machine ? [20:44:53] ottomata yes plus all my comments [20:45:03] average_drifter: yes [20:45:14] and module groups for multim-module projets [20:45:17] don't know what that means, drdee [20:45:26] check the third box [20:45:31] drdee: is there Java code specifically loading libdclassjni.0.dylib ? [20:45:36] oh on that screen [20:45:47] average_drifter: yes [20:45:52] ok [20:46:02] this looks better [20:46:14] can i see it? [20:46:29] ottomata: just hit make install then [20:46:33] ottomata: hit make install [20:46:53] do I need to re-automake or something? 
[20:46:55] ottomata i will send you my dclass osx file [20:46:58] hold on [20:47:00] and module groups for multim-module projets [20:47:17] ottomata: yes [20:47:36] ottomata: aclocal; autoconf; autoheader; automake --add-missing; ./configure; make ; sudo make install [20:48:01] ottomata: then ls /usr/local/lib/libdclassjni.0.dylib [20:48:05] woot [20:48:05] (package *)[cc9d9cf]$ ls -l /usr/local/lib/libdclassjni.* [20:48:06] -rwxr-xr-x 1 root admin 56164 Feb 20 15:47 /usr/local/lib/libdclassjni.0.dylib [20:48:06] lrwxr-xr-x 1 root admin 20 Feb 20 15:47 /usr/local/lib/libdclassjni.dylib -> libdclassjni.0.dylib [20:48:06] -rwxr-xr-x 1 root admin 926 Feb 20 15:47 /usr/local/lib/libdclassjni.la [20:48:11] AWESOME! [20:48:13] Cool Beans :) [20:48:30] so [20:48:30] how do I build with intellij? [20:48:31] .so :D [20:48:33] hah [20:48:38] just run [20:48:41] so many buttons! [20:48:44] but try first mvn package [20:48:54] make sure that works [20:49:08] oh run is greyed out [20:49:09] ok [20:49:25] do it from kraken/ [20:49:32] it will download a shit ton of jars [20:49:39] one time pain [20:49:44] Failure to find org.wikimedia.analytics:kraken-dclass:jar:0.0.1-SNAPSHOT in http://nexus.wmflabs.org/nexus/content/groups/public was cached in the local repository, resolution will not be reattempted until the update interval of nexus has elapsed or updates are forced [20:49:53] do I need to update my local cache somehow? [20:49:57] ottomata: just delete your .m2 [20:50:03] no [20:50:18] ah better [20:50:23] just had to run from kraken [20:50:23] hm [20:50:27] yes [20:50:39] but do first mvn install [20:50:42] then mvn package [20:50:50] sorry, didn't know. I thought if maven caches something and you can just delete ~/.m2 to "invalidate the cache" [20:51:04] mvn install? [20:51:09] i don't want to install it, right? [20:51:24] you do [20:51:28] sooooo [20:51:49] because kraken-dclass is not yet in the nexus repo [20:51:53] it can't' find it [20:52:02] mvn deploy ? [20:52:06] running mvn install will install it in .m2 [20:52:10] not mvn deploy [20:52:12] mvn install [20:52:13] then [20:52:15] mvn package [20:52:18] if that builds [20:52:21] your are good to go [20:52:31] drdee: how does one deploy to nexus ? [20:52:36] mvn deploy [20:52:46] drdee: can kraken-dclass be deployed to nexus ? [20:52:48] i'm not passing tests though [20:52:54] no /usr/share/GeoIP [20:53:00] but that will only work after successful package [20:53:03] can I add the free GeoIP db files to resources [20:53:04] ? 
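Two consolidations of the above, since 20:29 asked whether these notes should go somewhere. The dClass-on-OS-X sequence, exactly as it ran from 18:35 through 20:48 (the one-time Apple download is the "Java Developer Package" discussed at 19:29–19:38; if the linker still rejects --no-undefined, see the workaround sketched after 20:29 above):

    # one-time: install Apple's Java Developer Package so jni.h and friends exist again
    export JAVA_HOME=$(/usr/libexec/java_home)        # keep this line in ~/.profile
    cd dClass && git checkout package
    libtoolize; aclocal; autoheader; autoconf; automake --add-missing
    ./configure && make && sudo make install
    ls -l /usr/local/lib/libdclassjni.*               # should list libdclassjni.0.dylib

And the kraken side of it, per the advice just above (kraken-dclass is not on Nexus yet, so it has to reach the local repository first):

    cd kraken
    mvn install        # puts the kraken-dclass SNAPSHOT into ~/.m2/repository
    mvn package        # kraken-pig can now resolve it locally instead of asking Nexus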
[20:53:15] can't install if I don't pass tests :p [20:53:25] my adivce [20:53:32] just copy the files to /usr/share [20:53:35] haha [20:53:35] nope [20:53:37] i'm fixing this now [20:53:41] seriously [20:53:44] this is the whole reason I've spent the last hours on this [20:53:50] to fix the /usr/share/GeoIP thing [20:54:02] i had to clean the entire repo a couple of weeks ago [20:54:10] oh right, yeah [20:54:11] that's a big file [20:54:13] let's not commit those [20:54:20] can we make the test smarter or maybe the build phase [20:54:21] just put it locally [20:54:22] it needs fixture data [20:54:28] well i mean [20:54:31] low priority :() [20:54:33] i'm fine with putting it locally [20:54:35] but [20:54:49] not at /usr/share/GeoIP, the tests will have the location [20:54:59] i want to remove /usr/share/GeoIP from the UDF [20:55:15] from the UDF [20:55:24] but from the tests it's fine to have locally [20:55:39] and then with pig unit you can mimic getCachFiles() [20:55:46] hmmm, ok [20:58:26] hoi ottomata [20:58:33] hio [20:58:44] see you soon mr. Dario! [20:59:20] when are you in town? [20:59:30] tomorrow afternoon [20:59:40] fantastisch [21:02:14] hmm, drdee, SessionTest is failing [21:02:18] but it is all commented out [21:02:31] it should pass :) [21:02:38] test(org.wikimedia.analytics.kraken.pig.SessionTest) Time elapsed: 9.759 sec <<< ERROR! [21:02:39] org.apache.hadoop.mapred.InvalidJobConfException: Output directory not set in JobConf. [21:03:50] the jobconf is still validated on every startup, ottomata [21:03:58] so you need to give it defaults for all required params [21:04:11] just comment it out [21:04:17] comment what out? [21:04:23] SessionTest [21:04:29] it is all commented out [21:04:42] doesn't matter. [21:04:56] it probably validates on construction no matter what. [21:04:56] but i could be wrong :) [21:05:07] haha, i mostly don't know what you guys are talking about [21:05:11] and as for mvn install -- that refers to installing an artifact in the local repository [21:05:31] which is ~/.m2/repository [21:05:36] you never want to delete ~/.m2 [21:05:41] because that contains settings.xml [21:05:46] which has pointers to our nexus [21:06:03] Hm. I deleted the SessionTest.java file [21:06:16] but it is still running SessionTest? [21:06:26] mvn clean [21:06:31] it probably has a .class file [21:06:39] and it needs to be told not to load it [21:06:46] which requires a rebuild [21:06:51] great article -- http://perspectives.mvdirona.com/2013/02/04/ThePowerFailureSeenAroundTheWorld.aspx [21:06:52] mvn clean... [21:06:59] (i might have linked that already) [21:07:00] it compiles fine on my machine [21:07:07] it compiles [21:07:09] just test fails [21:07:12] Ah! [21:07:13] ok [21:07:13] yeah [21:07:17] yay clean [21:07:18] after clean and compile [21:07:25] tests pass [21:07:28] i deleted SessionTest.java [21:07:29] :) [21:08:07] drdee, can I laod files from local using file:/// [21:08:12] and from hdfs using hdfs:/// [21:08:12] ? [21:08:38] i think you [21:08:38] i say, "yes", but i will defer to drdee [21:08:42] can [21:09:50] what if I make dbPath [21:09:53] behave like an include path [21:09:57] make it an array [21:10:05] and make it search all the paths for .dat files when loading [21:10:27] that sounds cool [21:10:29] no reason not to. [21:10:41] people love -D for this. 
[21:10:55] -Dgeoip.include.path=foo,bar,baz [21:10:58] split on comma [21:11:34] String dbPaths = System.properties.get("geoip.include.path") [21:11:36] etc [21:11:43] (btw, I feel like this will take me 2 days to do, i keep floundering so much) [21:12:13] k i like that [21:13:23] dschoon, what will that be if it is not specified on cli? [21:13:25] null? [21:13:47] null [21:13:57] let's not use the cli for this but the constructor of the pig udf [21:14:21] we are running oozie as well :) [21:14:29] you can set properties there. [21:14:36] they should show up in the right place [21:14:45] the constructor is considered harmful :) [21:14:51] hence the builder pattern [21:14:52] ? [21:14:57] oh pass it in [21:15:05] (constructors that take arguments can't be build with the factory pattern) [21:15:06] yeah, i'd rather use cli/oozie param [21:15:14] right, because if we used constructor arg [21:15:16] and those should, in fact, be the same. [21:15:18] we'd have to modify pig scripts [21:15:19] to run them [21:15:23] correct. [21:15:28] tha'ts what i'm trying to avoid [21:15:34] i'm setting a default though [21:15:36] you just make sure you have a default when the param is unset. [21:15:44] file:///usr/share/GeoIP,hdfs:///libs [21:15:53] i think there is even a System.properties call [21:15:53] which lets you pass one [21:15:53] so hopefully we won't have to think abou tit [21:16:00] oh [21:16:21] http://docs.oracle.com/javase/7/docs/api/java/lang/System.html#getProperty(java.lang.String, java.lang.String) [21:16:48] String dbPaths = System.getProperty("geoip.include.path", DEFAULT_GEOIP_INCLUDE_PATH); [21:17:08] http://docs.oracle.com/javase/7/docs/api/java/lang/System.html#getProperty(java.lang.String,%20java.lang.String) [21:17:11] stupid space [22:12:44] ergh, drdee, [22:12:49] oh your are babying [22:12:50] dschoon [22:13:00] unreported exception java.io.IOException; must be caught or declared to be thrown [22:13:02] but I have [22:13:11] /** [22:13:11] * @param db [22:13:11] * @throws IOException [22:13:11] */ [22:14:01] oh need it in func decl [22:14:42] milimetric, how do I use Intellij to run a test? [22:17:12] right click on the project ottomata [22:17:22] on the right side [22:18:13] yeah, right click - Run All Tests [22:18:30] hmm, don't see that [22:18:32] on the right side? [22:18:42] you mean the project in the filebrowser on the left? [22:18:43] left [22:18:46] sorry ugh [22:18:59] ah ok think i found it [22:19:14] ergh it runs on cli but gets expection in intellij [22:20:42] you can debug through it to see where it's choking [22:20:58] but maybe a pom isn't processed so it doesn't have everything? [22:21:30] is this something I can try? [22:22:13] if someone has access to wikimedia-incubator can you please flip ON the switch for debianize ? [22:22:21] on Travis I mean [22:22:50] I tried but couldn't [22:23:12] back [22:24:01] yeah, you got it ottomata [22:24:14] is wikimedia-incubator on Travis ? [22:24:32] average_drifter, ottomata is the guy who has the the privs for that [22:24:55] ottomata: can you enable it on Travis please ? [22:25:12] the travis hook for github I think needs to be activated [22:26:34] uhhh [22:27:00] have no admin powers :( [22:27:18] back to ^demon! [22:27:24] asking him :) [22:28:17] i have github power [22:28:25] which repo? [22:28:31] yay! fixed the infobox [22:28:36] https://github.com/wikimedia-incubator/debianize [22:28:41] you have it in wikimedia-inc? 
[22:28:50] no [22:28:51] crap [22:28:55] only wikimedia [22:29:01] dear lord [22:29:06] nothing goes smooth [22:29:11] back to ^demon! [22:29:14] yup [22:30:10] i think I just got it, now what? [22:30:14] I need travis token? [22:31:16] hmmm, drdee [22:31:17] Caused by: java.lang.NoClassDefFoundError: org/wikimedia/analytics/kraken/schemas/JsonToClassConverter [22:31:29] ottomata: ^demon said you got full rights on github [22:32:01] ok, ungh, will continue java later, i gotta run very soon [22:32:02] yes [22:32:03] i got it now [22:32:11] but what do I do? [22:32:37] I need Travis user, token, and domain [22:33:45] milimetric set it up for limn back in the day [22:33:54] maybe he has some insight for ottomata and average_drifter? [22:34:02] kraigparkinson: https://plus.google.com/hangouts/_/27787d395d3286e9a6ee8b7a98fd8e7a05f342e4 [22:34:03] it was actually drdee [22:34:15] but instructions are on Travis's site [22:34:15] i think drdee will have to help you, i gotta run [22:34:28] i'm logged into travis-ci [22:34:29] well, he is a travis-lover. [22:34:35] but I do'nt have anything under my repositories [22:34:37] ooh, cool [22:34:41] http://about.travis-ci.org/docs/user/how-to-setup-and-trigger-the-hook-manually/ [22:34:44] you can do it manually [22:34:51] so if you guys don't have access you can try that [22:36:52] i did something [22:36:55] dunno if it worked [22:37:41] hm i don't see any incubator repos [22:37:42] hmm, [22:37:43] welp [22:37:45] i did it manually on github [22:37:49] dunno what else [22:37:57] i have to go now [22:38:09] bye guyyys!
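A closing note on the geoip.include.path design settled around 21:10–21:17: the UDF side is already spelled out there (System.getProperty with a default, split on comma), so the only sketch worth adding is the invocation side, and it is an assumption, not something done in this log. PIG_OPTS is the stock way to put -D flags on pig's client JVM; geocode.pig is a made-up script name, and whether a client-side property actually reaches the UDF on the task side is exactly the question dschoon and ottomata were weighing for the oozie case:

    PIG_OPTS='-Dgeoip.include.path=file:///usr/share/GeoIP,hdfs:///libs' pig geocode.pig
    # with the property unset, the UDF would fall back to the default include path discussed above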