[01:02:49] Anyone who can review/merge https://gerrit.wikimedia.org/r/101111 would basically be my favourite
[01:04:16] marktraceur: what is this for?
[01:04:28] ori-l: For a script for handling EventLogging data for Multimedia
[01:05:08] ori-l: -operations seemed to assent, should still be in the scrollback there
[01:05:25] yeah, that's perfectly fine. it's not nice to add a class that does nothing but wrap a package resource, tho
[01:05:42] so we should ideally identify an existing class of analysis-related packages that we could add this to,
[01:05:48] or else create one
[01:05:49] * ori-l looks
[01:07:13] marktraceur: you could add it to misc::statistics::packages
[01:07:56] Oh, hrm
[01:08:03] That works
[01:08:04] marktraceur: so just add it as an element to the array that starts on line 85 of statistics.pp
[01:08:08] Yeah
[01:08:10] I didn't see that
[01:08:18] 'sokay, it's a pretty messy class
[01:08:22] lots of people use that box
[01:08:29] messy class hierarchy, i mean
[01:10:06] i'll be back in a few; can merge it if you like
[01:14:05] Ironholds: I'm inclined to merge Mark's patch. Nodejs is useful for running JavaScript code without a browser context. Do you object?
[01:14:14] I can hold off if you'd prefer to discuss it further
[01:14:24] ori-l, eh, not particularly; I'll +1 it just to confirm there are no particular issues.
[01:14:39] cool, thanks
[01:14:52] I've got a lingering reservation, which is exclusively "we shouldn't be adding things to do X when we already have things to do X", but I'm the last person to complain about that (we have how many hard-installed R packages now? ;p)
[01:15:46] generally I agree, but stats1 is a bit of a special case -- I don't mind if every person who uses that box has her / his own preferences
[01:16:33] Yeah
[09:37:30] hey qchris
[09:37:35] Hi average
[09:37:45] how you doin ?
[09:38:04] Good :-)
[09:38:16] How's life on your end?
[09:38:28] The new RAM doing fine?
[11:25:00] qchris: new RAM doesn't fit
[11:25:10] What? :-((((
[11:25:17] qchris: I mean, it fits, but it's not compatible
[11:25:17] I had to read my motherboard manual
[11:25:32] Oh :-(
[11:25:35] now another set of RAM is on the way
[11:25:37] for real
[13:29:04] incompatible RAM is one of the most hideous things
[13:36:15] Nemo_bis: yes it is, but I will get to the bottom of it, I will solve it, and then I'll be able to have proper gear to work on things
[13:36:27] :)
[15:43:00] ok, no dice
[15:43:08] so 3 sets of RAM, none is compatible
[15:48:39] next week, returning all sets of ram, I'm getting mobo, cpu, ram. I'm letting those guys assemble it. I'm obviously not a hardware person
[16:53:10] whoa, milimetric, teach me about python deep copy
[16:53:14] this is not working for me! :) :p
[16:53:27] in a loop
[16:53:44] i want to save the previous dict in a different variable
[16:53:49] and then update the current one
[16:53:53] right
[16:53:59] ok, lemme brush up on that
[16:54:07] so i'm doing
[16:54:08] self.flattened_stats_previous = copy.deepcopy(self.flattened_stats)
[16:54:15] and I update flattened_stats later
[16:54:19] but, it seems no matter what I do
[16:54:28] flattened_stats_previous is always the original empty dict
[16:54:39] even though that deepcopy gets called every iteration
[16:55:22] ok, trying some things
[16:56:05] seems like it should work hmm
[16:56:21] ok so why use deepcopy?
[16:56:27] i'm not familiar with that but I've always used copy
[16:56:38] like prev = now.copy()
[16:56:52] b/c copy is just pass by reference, right?
[16:57:04] oh wait wha
[16:57:06] hm
[16:57:14] ok maybe this is not my problem
[16:57:23] hm
[16:57:32] dict update() actually changes the values of the dict, right?
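The `update()` question at the end of this exchange has a definite answer in standard Python: `dict.update()` mutates the receiving dict in place (and returns `None`); it does not produce a new dict. A minimal sketch, with made-up example keys:

```python
# dict.update() merges the given keys into the dict in place.
stats = {'A': 1, 'B': 2}
alias = stats                     # a second name for the same object, not a copy
stats.update({'B': 20, 'C': 3})   # existing key overwritten, new key added
assert stats == {'A': 1, 'B': 20, 'C': 3}
assert alias is stats             # the original object itself was changed
```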
[16:57:50] yeah, it merges what you pass in with what you have
[16:58:09] and no, copy actually copies
[16:58:10] hm
[16:58:18] i just tried it for nested dictionaries and it seems fine
[16:58:24] so I just noticed that at the start of my method, both of my dicts are empty
[16:58:29] and that's what I use in wikimetrics and it definitely works for a few layers
[16:58:34] OHHHHH
[16:58:35] i know why
[16:58:36] doh doh doh
[16:58:45] sorry, unrelated to copy
[16:59:06] this guy's right too, python's built in dict.copy() is pretty damn fast: http://stackoverflow.com/questions/5861498/fast-way-to-copy-dictionary-in-python
[16:59:08] i'm calling metric_init in a loop for cli testing, which is reinstantiating my object every time, so getting fresh dicts
[16:59:35] that'll do it
[17:00:13] milimetric: I thought copy/deepcopy worked like http://stackoverflow.com/a/3975388/1188479
[17:12:27] you're totally right protonk, I didn't understand what you meant before
[17:12:29] http://stackoverflow.com/questions/5105517/deep-copy-of-a-dict-in-python
[17:12:50] so ottomata, deepcopy is for when your dictionary has pointers to objects in it
[17:12:56] if it has just values, you're safe to use copy
[17:13:04] but if it has, say, a list
[17:13:15] then it'll just copy the pointer to it
[17:14:47] *simple copy will just copy the pointer to it, that is
[17:15:37] structure is
[17:16:19] {
[17:16:19] 'A': {'a': 1, 'b': 2},
[17:16:19] 'B': {'c': 3, 'd': 4}
[17:16:19] }
[17:16:26] so i think i need deepcopy
[17:16:48] for that, copy seemed to work
[17:17:24] nope :)
[17:17:39] hmm, it worked
[17:17:49] prev = curr.copy()
[17:17:58] curr.update({'B': {'c': 4}})
[17:18:04] {'A': {'a': 1, 'b': 2}, 'B': {'c': 4}}
[17:18:09] prev
[17:18:09] {'A': {'a': 1, 'b': 2}, 'B': {'c': 3, 'd': 4}}
[17:18:23] yes, but only if you update the high level keys
[17:18:39] hm
[17:18:44] if you try to update curr['B']['c'], it'll update prev['B']['c'] as well
[17:18:47] i always call update on the whole thing
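The distinction the channel works out here can be condensed into one runnable sketch, using the same nested structure quoted in the chat: a shallow `dict.copy()` makes a new outer dict but shares the inner dicts, while `copy.deepcopy()` copies them recursively.

```python
import copy

curr = {
    'A': {'a': 1, 'b': 2},
    'B': {'c': 3, 'd': 4},
}
shallow = curr.copy()          # new outer dict, but shared inner dicts
deep = copy.deepcopy(curr)     # inner dicts recursively copied too

# Replacing a whole top-level value only rebinds a key in curr, so
# neither copy sees it -- the "update the high level keys" case.
curr.update({'B': {'c': 4}})
assert shallow['B'] == {'c': 3, 'd': 4}
assert deep['B'] == {'c': 3, 'd': 4}

# Mutating an inner dict that existed when the copies were made leaks
# through the shallow copy, because both still point at the same object.
curr['A']['a'] = 99
assert shallow['A']['a'] == 99   # shared inner dict: change is visible
assert deep['A']['a'] == 1       # deepcopy made its own inner dict
```

So in this use case, where the dict is always replaced wholesale via top-level `update()` calls, the faster shallow copy happens to be safe; any in-place mutation of the inner dicts would require `deepcopy`.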
[17:19:10] yeah, seems like in your case copy works but it's slightly dangerous
[17:19:14] hey, it's definitely faster
[17:19:42] but deepcopy seemed to work just fine too, so there must've been something else going on
[17:25:17] k, running to the office, bb in a bit
[17:56:28] qchris
[17:56:33] ay sorry
[18:00:25] ping ottomata
[18:02:54] hi yaa
[18:02:56] sorry was eating lunch
[18:02:59] nuria, hi!
[18:03:21] holaaa
[18:03:35] I was wondering if you can help me get my ssh keys set up in labs
[18:03:59] sure
[18:04:13] they sent me this https://wikitech.wikimedia.org/wiki/Help:Access#Prerequisites
[18:04:19] but the #1
[18:04:27] https://wikitech.wikimedia.org/w/index.php?title=Special:UserLogin&type=signup&returnto=Main+Page
[18:04:29] "sign up for an account and log in"
[18:04:44] is that what you are looking for?
[18:05:54] the wikitech page asks me for a "token" when i am logging in...
[18:06:30] but I never got a "token" from it
[18:09:49] let me ask for an account at wikitech
[18:10:24] token?
[18:10:25] hmm
[18:11:35] hm, i'm trying now
[18:11:36] am getting Enter your information below.
[18:11:36] Account creation error
[18:11:36] Incorrect or missing confirmation code.
[18:11:59] oh were you able to create an account?
[18:12:12] i don't think you need to enter a token
[18:12:20] nuria ^
[18:14:53] ok let me ask about it
[18:15:11] cause they created a couple wiki accounts for me, maybe they need to create this one too
[18:15:16] will get back to you
[18:15:23] thanks much for teh fast response
[18:15:26] *the
[18:16:03] what is christian's irc id?
[18:19:00] qchris
[18:27:12] hola nuria
[18:27:20] hola
[18:28:06] welcome to el canal analítico
[18:28:52] ja ja ja (spanish laugh)
[18:29:09] also, remember to send me your ssh key and preferred shell username (default: nruiz) so I can file an access request for stat1
[18:29:53] ja ja ja? interesting
[18:29:59] the dutch have hi hi hi, I believe.
[18:33:54] I believe that there's a small settlement near the north poll that laughs with "ho ho ho".
[18:34:01] pole
[18:35:20] halfak: okay, you win.
[18:35:30] :D
[18:40:11] DarTar: skip the stat1-specific request and request shell on all analytics machines, IMO
[18:40:56] yeah, good point
[18:42:22] ottomata: can I just mention "all analytics machine" in an RT ticket for Nuria or is there a list I should explicitly refer to
[18:42:29] >machines
[18:43:55] also check https://www.mediawiki.org/wiki/Analytics/Onboarding, which contains the list of servers
[18:44:01] stat* machines are separate from the concept of 'analytics' machines, even though they are all under analytics team usage
[18:44:20] stat machines are a bit more restricted (except for stat1) and access should only be requested if there is a need for it
[18:44:30] How do you create an RT ticket? i sent a request to tech support but it sounds like they do not deal with ssh keys
[18:44:33] analytics machines you probably only need access to if you need to do hadoop or related stuff
[19:03:05] nuria: I think you need an account :/
[19:03:08] thanks folks
[19:03:16] in fact, DarTar, I'd strongly suggest adding "can Nuria have an RT account" ;p
[19:03:29] I've spent... many days stymied by not having an RT account until a few months ago.
[19:03:49] yes, I just forgot the procedure (halfak is going through it recently)
[19:04:04] can you guys advise her how to get started?
[19:04:26] ottomata, will send you ssh keys in a sec
[19:04:34] tnegrin is pushing to get me an RT account, last I heard.
[19:04:38] drdee: https://www.mediawiki.org/wiki/Analytics/Onboarding is really useful
[19:04:56] I guess i just needed to create an account on wikitech, which i did not have
[19:04:59] In fact I felt like it was about time for another ping. It's been nearly a month since my request for an RT account went in.
[19:05:06] btw, Hi nuria :)
[19:05:11] ok so the procedure is: prod tnegrin
[19:05:24] if there is a better way, I'd like to know
[19:05:42] it hasn't been easy for me to figure out how to request an RT account
[19:06:11] I will update the onboarding wiki with this info today
[19:06:16] tnegrin: is the process blocking on someone else right now?
[19:06:18] from https://wikitech.wikimedia.org/wiki/RT#Adding_a_user:
[19:06:19] Can I prod that person?
[19:06:32] "Make sure they have the right approvals (what are those?) and have signed the right NDAs (which ones?)"
[19:06:40] all staff should have NDAs filed
[19:06:58] re: approvals, that depends on managers
[19:21:43] ok, got ssh keys, was able to clone the event logging repo
[19:38:57] cool
[20:29:41] milimetric: i am still working on this
[20:29:45] but would appreciate review of this
[20:29:45] https://gerrit.wikimedia.org/r/#/c/101431/1/files/varnishkafka_ganglia.py
[20:29:51] we might want to do that together later
[20:29:54] i have to do an interview soon though
[20:32:51] milimetric: thanks for following up on the mobile stats thread
[20:33:22] it's hard to answer a question about data quality for data that we're not generating ourselves
[20:33:49] my initial thought was a problem with deletions, but EL data is stateless
[20:34:24] i.e. there are no joins with the revision table that may explain this
[20:34:57] I also don't know of any EL outage that would be later recovered and explain the change kenan reported
[20:35:39] so at the moment my only explanation is a change in the query that was applied to historical data