[12:49:58] morning
[13:18:08] yooo ottomata
[13:20:41] mooorning
[13:42:46] i am still in the hangout
[13:43:18] trying
[13:43:30] k
[13:44:02] there is a problem connecting to this hangout
[13:59:42] brb, restarting
[14:02:10] ah, drdee, I re-read hashar's 2nd review comment
[14:02:20] i went back and forth on what he is suggesting too
[14:02:21] he might be right
[14:02:43] about yarn / mrv1
[14:02:44] ?
[14:09:33] ahah you are here!
[14:09:35] yup
[14:09:40] i just responded to your comments
[14:11:57] hashar, are you coming to the amsterdam hackathon?
[14:12:04] yes sir
[14:12:10] landing in Ams on thursday evening IIRC
[14:12:13] and departing on monday evening
[14:14:47] cool! i'll be there wed morning, leaving sunday or monday, not sure yet
[14:39:43] ori-l, you up this early?
[14:40:03] i want to deploy a change to the event logging relay, and I want to be able to confirm that it still works after
[14:47:17] milimetric! welcome to a bright sunny east coast day
[14:47:42] :)
[14:47:49] it's quite lovely
[14:47:56] so so so nice
[15:03:45] hey average
[15:03:47] what's up?
[15:03:59] talk about 353?
[15:21:40] drdee: hi, I didn't get to look at it more
[15:21:46] I'm stuck with stuff in the house right now
[15:21:53] cleaning and stuff
[15:21:55] it's a mess here
[15:51:30] drdee: I added a bit to: https://mingle.corp.wikimedia.org/projects/analytics/cards/586, fyi
[15:52:09] ty!!
[15:55:53] brb gonna pick up food
[16:35:56] ori-l haiiiii
[16:46:54] ottomata: thoughts on running an ipython notebook server on stat1?
[16:47:15] no good
[16:47:20] stat1 has public IP and still has private data
[16:47:29] through ssh tunnel though?
[16:47:53] hmmmm, it would be better to wait until the private data was moved to stat1002
[16:48:07] but aside from that if it was only accessible via ssh tunnel that would be fine
[16:48:25] k, well ironically, the use case I'm imagining is showing you or diederik the log data analysis
[16:48:36] which is the data we're talking about moving
[16:48:44] ha
[16:48:57] hm.
[16:49:06] i think if it is via ssh tunnel it is ok, especially if it is a temporary thing
[16:49:13] not a service that is expected to be up all the time
[16:49:29] yeah, i'm imagining it like a shared screen session
[16:49:34] or something like that
[16:51:46] ottomata: so i've actually given this tunnel option a try (per http://wisdomthroughknowledge.blogspot.com/2012/07/accessing-ipython-notebook-remotely.html) and I wasn't able to connect
[16:53:34] are you running it right now?
[17:00:24] standup ?
[17:00:32] ohohohoh
[17:00:45] are you guys in it ?
[17:01:01] ottomata: ?
[17:01:16] average: yes we are
[17:01:38] oh, I'm on my way
[17:01:47] kool, same day same time
[18:03:18] drdee what's the end-goal in using this vagrant mediawiki ?
[18:03:29] I mean I know it's a good thing because you can get a mediawiki really fast
[18:03:32] world domination
[18:03:53] yesss !!
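[Editor's note: the ssh-tunnel approach discussed around 16:51 above can be sketched as follows. The port number, username, and exact hostname are assumptions for illustration, not taken from the log.]

```shell
# On stat1: start the notebook server bound to localhost only, so it is
# never reachable through the machine's public IP
# (8888 is IPython's conventional default port).
ipython notebook --no-browser --ip=127.0.0.1 --port=8888

# On the local machine: forward local port 8888 over ssh to stat1,
# then open http://localhost:8888 in a browser.
ssh -N -L 8888:127.0.0.1:8888 username@stat1.wikimedia.org
```

This keeps the notebook off the public interface entirely, which matches the "only accessible via ssh tunnel" condition ottomata describes above.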
I need that
[18:05:42] so I'm going to sum up what I've noticed
[18:05:53] I installed vagrant 1.1.2, Virtualbox 4.2.8, ruby 1.9.3
[18:06:03] all from packages
[18:06:05] nothing by hand
[18:06:14] and vagrant up worked for mediawiki-vagrant
[18:06:17] it took a lot of time
[18:06:22] it trashed my system load
[18:06:30] but now it's almost done
[18:06:44] and inside the box that it made I do have interwebz access
[18:07:08] which means the Vagrantfile has the right NAT/Bridge thing for the eth interfaces it creates
[18:07:39] I'd also mention I'm running Ubuntu 13.04 (which comes with Vagrant 4.2.8 in the distro repo)
[18:08:27] drdee: when we get the world, I'd like a small island please :)
[18:08:27] I want to set up a small hut
[18:08:48] and go fishing in the morning and make a small campfire each night
[18:09:09] nice!
[18:09:18] can i join?
[18:09:27] sure :)
[19:01:37] ori-l if you have a sec: https://gerrit.wikimedia.org/r/62018
[19:17:07] milimetric, i ran into a bug using mysqldumper on vagrant, https://gerrit.wikimedia.org/r/62018 fixes that
[19:17:20] ah, cool
[19:17:34] asked ori-l to review it
[19:44:59] drdee:
[19:44:59] root@oxygen:/a/squid# uptime
[19:44:59] 19:44:46 up 3 min
[19:45:00] :)
[19:47:26] nice!
[19:47:29] no problems?
[19:53:22] nope! it came up real fast
[19:53:38] multicast relay 100% on gadolinium now
[19:55:28] awesome!
[20:02:44] erosen: do you have a google hangout?
[20:02:54] dario just created one
[20:02:57] k
[20:03:09] drdee: that patch is totally right.. i ran into that before but forgot to fix
[20:03:12] thanks, merging
[20:04:42] drdee shall we wait for you?
[20:11:03] ori-l, happy to help with something minor ;)
[20:24:42] can we delete https://github.com/wikimedia/user_metrics ?
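[Editor's note: the mediawiki-vagrant setup recapped around 18:05 above boils down to roughly the commands below. Package names and the clone URL are assumptions sketched from the log, not verified against that Ubuntu release.]

```shell
# Install the tooling from distro packages, nothing by hand
# (versions reported in the log: Vagrant 1.1.2, VirtualBox 4.2.8, Ruby 1.9.3).
sudo apt-get install vagrant virtualbox ruby1.9.3

# Fetch mediawiki-vagrant and bring the box up; as noted above this can
# take a long time and load the host heavily.
git clone https://gerrit.wikimedia.org/r/mediawiki/vagrant mediawiki-vagrant
cd mediawiki-vagrant
vagrant up

# The Vagrantfile's NAT setup should give the guest outbound access:
vagrant ssh -- ping -c 1 wikipedia.org
```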
[20:25:16] seems like we should check with dartar and rfaulckr
[20:27:57] the new repo is https://github.com/wikimedia/analytics-user_metrics
[20:28:06] the one that i mentioned hasn't been updated in 17 days
[20:30:40] i think it is clean, but I think we should still check with rfaulckr
[20:30:51] erosen: confirmed, makes sense to delete wikimedia/user_metrics
[20:31:07] ^^dartar
[20:31:48] I'm good to go with analytics-user_metrics on gerrit once I get my keys sorted
[21:12:17] drdee, rfaulckr, erosen +1
[21:12:52] rfaulckr: filing a ticket, I haven't heard back from leslie
[21:33:15] thanks dartar
[21:33:30] start waving those forms around ;)
[21:33:41] you're filed
[21:34:06] copying joady everywhere if people have questions on who you are
[21:54:59] erosen, drdee, milimetric: I think we have to adjourn to tomorrow for the cohort upload review
[21:55:12] oh really?
[21:55:20] I was testing against prod but I can't get it to work, so I need to set up a new test instance
[21:55:28] hmm
[21:55:32] what isn't working?
[21:55:37] well that's good to know at the very least
[21:55:39] 500 errors upon upload
[21:55:44] from prod
[21:55:50] did you clear cache and reload?
[21:55:57] it's probably just js being cached
[21:56:01] and there's no cache-busting
[21:56:04] k hang on
[21:56:13] and open up your console to see whether or not there are JS errors
[21:57:29] I also haven't managed to run a full test, we can chat about the notes that I've taken so far, but they might be incomplete
[21:57:31] first testing with an empty cache
[21:57:32] no you're right DarTar
[21:57:36] I'm getting an error now too
[21:57:39] ok
[21:57:41] this is new, I'll check it out
[21:59:02] so up to you guys, we can quickly touch base now for – say – 15 mins, but then maybe reconvene tomorrow or another day/time that suits you, after I've installed a fresh test instance from gerrit
[21:59:33] if you feel like you need more time to test it, I would opt for a later meeting
[21:59:37] yeah, same
[21:59:43] k sounds good
[22:00:25] would tomorrow same time work?
[22:00:32] yeah
[22:02:40] ok DarTar: I was wrong
[22:02:43] it works
[22:02:53] there are a couple of problems though
[22:03:06] one is that the deployment's improperly configured to check things like arabic wikipedia
[22:03:19] the second is that you're using a file that's not the right format and we're not handling that gracefully
[22:03:27] i'll email you a file that works to upload
[22:03:44] oh, I'm intentionally trying to break things - usertesting :)
[22:03:51] ok, sent
[22:03:59] cool, that's good
[22:04:16] so you broke it - bad files will kill the server, I'll add a try-catch
[22:04:48] if you could, please add the file that broke it into the user_metrics/test directory
[22:04:49] ok, will add that to my notes
[22:05:12] and if you get through testing within half an hour, I'm still ok with meeting for the last half hour we had scheduled
[22:05:22] ok cool
[22:05:22] ottomata, can you make settings.py owned by the wikidev group of user-metrics on stat1001?
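[Editor's note: the try-catch milimetric mentions at 22:04 ("bad files will kill the server") could look roughly like the sketch below. `parse_cohort_csv` is a hypothetical helper for illustration, not the actual code from the Gerrit patch.]

```python
import csv
import io


def parse_cohort_csv(file_bytes):
    """Parse an uploaded cohort CSV defensively, so a malformed file
    returns an error payload instead of crashing the request with a 500.
    (Hypothetical sketch of the idea behind the exception-handling patch.)
    """
    try:
        text = file_bytes.decode('utf-8')
        rows = [r for r in csv.reader(io.StringIO(text)) if r]
        if not rows:
            raise ValueError('empty or unparseable file')
        return {'ok': True, 'rows': rows}
    except (UnicodeDecodeError, csv.Error, ValueError) as e:
        # Bad upload: report it gracefully rather than letting the
        # exception propagate up to the web framework as a 500.
        return {'ok': False, 'error': str(e)}
```

A file that is not valid UTF-8 (like the one DarTar uploaded while "intentionally trying to break things") lands in the `except` branch instead of killing the request.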
[22:06:31] i am pretty sure that rfaulkner did not define s7 in the settings.py file
[22:06:42] that's why you get a KeyError
[22:06:46] and hence the 500 error
[22:07:29] milimetric, DarTar: ^^
[22:07:42] s7 contains eswiki huwiki hewiki ukwiki frwiktionary metawiki arwiki centralauth cawiki viwiki fawiki rowiki kowiki
[22:07:46] right
[22:07:50] that works
[22:08:00] i mean that *would* be the cause
[22:09:46] is the latest settings.py using get_project_host_map() ?
[22:11:06] drdee: you are referring to "connections" in settings.py, right?
[22:11:16] yes
[22:11:49] ok got it, so this is a result of the upload defaulting to arwiki and s7 missing from connections
[22:12:39] yup
[22:12:43] DarTar that's correct for the mapping - https://github.com/wikimedia/analytics-user-metrics/blob/master/user_metrics/config/settings.py.example#L136. However, in the example settings file several keys for host alias aren't present
[22:13:31] rfaulckr: it took me a while to get your IRC nick, neat
[22:13:45] hah ;)
[22:14:04] most of my flickr hacker friends have this problem of removing vowels from their apps
[22:14:38] enter UsrMtrics
[22:17:22] UsrMtrics
[22:17:56] *#FF00FF
[22:17:59] :P
[22:19:32] New patchset: Milimetric; "added exception handling to the csv uploads" [analytics/user-metrics] (master) - https://gerrit.wikimedia.org/r/62098
[22:20:44] DarTar: patch for exception handling ^
[22:20:55] lemme know if you got anything to work and if you still wanna talk
[22:21:35] yep, still busy testing :)
[22:21:47] frwiki also fails
[22:21:53] that's probably s6
[22:22:16] patch is here: https://gerrit.wikimedia.org/r/62099
[22:22:23] ottomata can merge this tomorrow
[22:22:52] yeah, i mean testing with wikis besides enwiki is useful only to see that it handles unicode properly
[22:22:59] ok sweet, I'll list all this stuff, let's review it later
[22:23:03] milimetric: right
[22:23:11] there's no logic that handles different wikis differently
[22:27:24] ottomata, still around?
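[Editor's note: the KeyError diagnosis above comes down to a shard alias missing from the `connections` map in settings.py — every shard that `get_project_host_map()` can return needs an entry there. A minimal sketch of the idea, with placeholder hostnames and only a few shards shown; the real mapping lives in the settings.py.example linked above.]

```python
# Hypothetical sketch: if a lookup like connections['s7'] raises
# KeyError, the cohort upload (which defaults to arwiki, on s7)
# fails with a 500.
connections = {
    's1': {'host': 'db1.example.org', 'db': 'enwiki'},
    's6': {'host': 'db6.example.org', 'db': 'frwiki'},
    # s7 serves arwiki among others, so it must be defined:
    's7': {'host': 'db7.example.org', 'db': 'arwiki'},
}


def connection_for(shard):
    """Fail with a clear message instead of a bare KeyError."""
    try:
        return connections[shard]
    except KeyError:
        raise RuntimeError(
            'shard %r is not defined in settings.connections; '
            'requests routed to it will 500' % shard)
```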
[22:41:30] yup, but kinda not working
[22:41:31] suuup?
[22:41:58] can you merge https://gerrit.wikimedia.org/r/62099
[22:42:06] it's a quirky :)
[22:42:11] i mean quicky :)
[22:43:46] done!
[22:43:47] ty!
[22:48:06] drdee: ping
[22:48:15] whhaazzzzuppp?
[22:48:21] i PM'd
[22:48:27] yup
[23:01:03] drdee: ottomata want to do x_cs spike tomorrow morning?
[23:02:11] yup
[23:02:30] DarTar, s6 and s7 should now work
[23:02:30] k, i'm thinking maybe the hour before scrum?
[23:02:35] yup
[23:02:47] drdee: great
[23:03:01] sending my early feedback in a sec