[10:02:58] (CR) Nuria: [C: -1] "I shall comment on the precise file changes but from looking at the files we are missing UI changes." [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/122638 (owner: Csalvia)
[10:15:46] (CR) Nuria: "I think we need another patch with more testing plus UI changes that removes the code we do not need from the testing file." (4 comments) [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/122638 (owner: Csalvia)
[12:29:50] gi11es: ping me if you want help with limn
[12:37:59] gi11es: ^^
[12:39:30] gi11es: ^^ in relation to this I think https://github.com/wikimedia/limn-data/issues/1
[12:39:46] problem solved average_, thanks
[12:40:23] who's average_ ? I'm average
[12:41:01] :D
[12:52:18] (PS46) Milimetric: [WIP] Run recurring reports using the scheduler [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165
[12:52:48] nuria: both good points in your email, addressed in this patchset ^
[12:53:35] looking looking
[13:22:33] (PS47) Milimetric: [WIP] Run recurring reports using the scheduler [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165
[14:34:01] (CR) QChris: [C: -1] "I tried to run PS3 twice, but although both are marked as" [analytics/kraken] - https://gerrit.wikimedia.org/r/121531 (owner: Ottomata)
[14:35:35] qchris: just guessing about number of open files and why it might be different on PS3
[14:35:41] since timestamp field is probably set correctly in camus
[14:35:47] it would have to write to many more directories
[14:35:54] instead of just the current runtime hour
[14:36:07] anyway, i'm focusing on stat1003 for the moment, trying to test some stuff and get out an email
[14:36:10] then I will look at camus stuff
[14:36:14] ok.
[14:45:49] (PS1) QChris: Drop jobtracker from Hadoop tunnel configuration [analytics/kraken] - https://gerrit.wikimedia.org/r/123865
[14:49:23] ottomata: Did icinga notify you that stat1002's home partition went full during the night?
[14:49:39] (One of my jobs at 04:30 failed)
[14:49:59] I freed up some GB now.
[14:50:07] Is this worth notifying others?
[14:50:51] it did not tell me that, no
[14:50:58] geez :p
[14:51:22] 628G ./ironholds
[14:51:27] 315G ./qchris
[14:51:34] Ok. I'll file a bug ... because I should notify someone.
[14:51:46] ha you guys know there is a writeable 3TB directory on that machine?
[14:51:57] yes.
[14:51:59] make a /a/qchris directory and use that !
[14:51:59] :-D
[14:52:17] Those 315GB started as a small tmp dir.
[14:52:39] Then people wanted to cover more months ... more details ... boom 315GB :-)
[14:53:02] ha, maybe we should use quota on /home
[14:53:03] geeeez
[14:53:09] totally!
[14:53:58] i'm sending out an email about stat1003 today, i'll include a warning about using /home
[14:54:22] Not sure.
[14:54:28] Why is /home bad?
[14:54:34] its not bad, except it isn't big!
[14:54:48] i guess we could put /home on /a or something, dunno
[14:54:56] brb
[14:54:58] k.
[15:03:23] fyi: ottomata, I'm just about to kick off an rsync of my home directory
[15:03:38] onto stat1003
[15:04:25] * halfak fails to log into stat1003.
[15:04:27] hmmm
[15:05:04] Connection attempt hangs on "debug1: Connecting to stat1003.wikimedia.org [208.80.154.82] port 22."
[15:05:25] * halfak can connect to old stat1 just fine.
[15:05:58] same story when trying to connect to stat1003 from stat1.
[15:07:36] ok halfak
[15:07:37] yes
[15:07:40] sorry, more changes
[15:07:40] same for me.
[15:07:52] you can't direct ssh into stat1003 due to firewall rules
[15:07:58] you have to use bastions
[15:08:14] That must have changed in the last 20 minutes.
[15:08:16] yes
[15:08:18] it did
[15:08:29] i was trying to make fewer annoyances for you guys like this
[15:08:37] but they wanted me to not allow direct ssh
[15:08:43] i think they are right, but mehhhh
[15:08:56] halfak, I will be sending all this out in an email today too
[15:08:57] Yes.
Connecting to stat1003 through bast1001.wikimedia.org works for me. Thanks.
[15:09:16] but, for now, do you know how to setup .ssh/config to use bast1001 as proxy?
[15:09:38] Maybe. I set it up for labs (different bastion).
[15:09:44] I'll see if I can figure it out for prod.
[15:11:02] (PS7) Stefan.petrea: Fixed regression test. [analytics/wikistats] - https://gerrit.wikimedia.org/r/123603
[15:11:12] yup. Got it.
[15:11:16] ah great!
[15:11:27] * halfak is kicking off rsync.
[15:11:42] ok great
[15:12:02] everyone's home on stat1003 is rsynced except for you :)
[15:12:20] Would you mind running it for me since you have the command ready?
[15:12:42] I was just browsing the man page to refresh myself and realized that was silly.
[15:12:45] home rsync? sure ja
[15:12:49] i'll run it in a screen too
[15:13:07] (PS8) Stefan.petrea: Fixed regression test. [analytics/wikistats] - https://gerrit.wikimedia.org/r/123603
[15:13:12] Any way you could ping me right when it finishes?
[15:13:22] sure, will try to notice :p
[15:13:24] it will take a bit
[15:14:33] Hopefully the pipe is fat
[15:16:44] ottomata, Looks like I can track the process with ps, so no sweat.
[15:16:53] cool
[15:17:26] * halfak watches his clean new home dir fill up with all the old half-baked projects
[15:17:45] heheh
[15:19:20] (CR) Ottomata: [C: 2 V: 2] Drop jobtracker from Hadoop tunnel configuration [analytics/kraken] - https://gerrit.wikimedia.org/r/123865 (owner: QChris)
[15:25:53] heh. Permission denied on EVERYTHING.
[15:26:46] Looks like I'm drawing pictures and writing up docs today. No worries. I have that queued up due to wrist injury.
[15:32:42] permission denied!
[15:33:07] halfak: where, what?
[16:16:38] ottomata, sorry, was talking about all the rsync'd files. It looks like you'll need to chown post rsync.
[16:17:05] oh... wait.
[16:17:08] Looks good now
[16:17:52] hmm, ok, maybe it was root running rsync process?
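[Editor's note: the bastion setup halfak worked out above is not shown in the log. A minimal ~/.ssh/config sketch that matches the discussion, assuming older OpenSSH with ProxyCommand (on OpenSSH 7.3+ a single "ProxyJump bast1001.wikimedia.org" line replaces the ProxyCommand); the username is an assumption:]

```
# Hypothetical ~/.ssh/config sketch for reaching stat1003 via bast1001.
Host bast1001.wikimedia.org
    User youruser

Host stat1003.wikimedia.org
    User youruser
    # Open the connection through the bastion instead of direct ssh,
    # which the firewall rules above now block.
    ProxyCommand ssh -W %h:%p bast1001.wikimedia.org
```

With this in place, plain `ssh stat1003.wikimedia.org` works, and so do the command-line `sftp`, `scp`, and `sshfs`, since they all read the same config; GUI/editor-embedded sftp clients that ship their own ssh implementation (like the jedit plugin discussed below) may not.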
[16:17:56] hm,
[16:18:02] i see trash and www still owned by root
[16:18:19] but rsync is also still running
[16:23:42] oops
[16:23:44] ottomata, good news!
[16:23:48] I worked out how I broke stat1002.
[16:23:54] Email going out now. My apologies again :/
[16:24:49] bad news: it is incredibly stupid.
[16:25:10] ok!@
[17:09:34] (PS48) Nuria: [WIP] Run recurring reports using the scheduler [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165 (owner: Milimetric)
[17:43:26] (PS49) Nuria: [WIP] Run recurring reports using the scheduler [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165 (owner: Milimetric)
[17:57:58] (PS50) Milimetric: [WIP] Run recurring reports using the scheduler [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165
[17:58:30] nuria: I assumed you were done, so I pushed my stuff ^
[18:01:05] argh retardation on my side SORRY
[18:01:21] as i thought your change wasn't there. feel free to push again and override
[18:01:34] (CR) Milimetric: [WIP] Run recurring reports using the scheduler (1 comment) [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/112165 (owner: Milimetric)
[18:03:22] * milimetric going to lunch
[18:57:19] hey ottomata, it looks like rsync is done. can you confirm that there were no errors?
[18:57:38] du -hs *
[18:57:39] looks ok to me
[18:57:39] sent 390198 bytes received 451335068319 bytes 83495598.65 bytes/sec
[18:57:39] total size is 451278567773 speedup is 1.00
[18:57:42] woops
[18:57:48] Thanks
[18:57:55] perms look good too
[18:58:03] Agreed :)
[18:58:08] Thanks
[19:04:01] tnegrin, got a sec to respond to Heather re. jobvite for FR position?
[19:04:29] n/m I just saw that lzia emailed you already
[19:04:30] meeting 30 mins
[19:09:19] (CR) QChris: Fixed regression test. (1 comment) [analytics/wikistats] - https://gerrit.wikimedia.org/r/123603 (owner: Stefan.petrea)
[19:18:59] ottomata, I use sftp with my text editor to work files on stat1.
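[Editor's note: the ownership confusion above (files briefly owned by root, "you'll need to chown post rsync") matches how rsync's archive mode behaves: -a implies -o/-g, but owner and group are only preserved when the receiving rsync runs as root; otherwise files land owned by the invoking user. A minimal local sketch of an archive-mode copy, with invented paths:]

```shell
# Local sketch of an archive-mode copy like the home-directory rsync above.
# -a preserves permissions, timestamps, and symlinks; the owner/group flags
# it implies (-o/-g) only take effect when the receiving side runs as root.
src=$(mktemp -d) && dst=$(mktemp -d)
echo "half-baked project" > "$src/notes.txt"
rsync -a "$src/" "$dst/"        # trailing slash: copy contents, not the dir itself
cat "$dst/notes.txt"
rm -rf "$src" "$dst"
```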
Now, it doesn't work with stat1003 because I need to forward through bastion. Any pro-tips?
[19:19:13] I guess I could sshfs, but that's a problem with imperfect connections.
[19:23:16] hmm
[19:23:30] if your .ssh/config is right and working with ProxyCommand
[19:23:39] it should look the same to your sftp
[19:23:45] unless sftp doesn't read .ssh/config
[19:23:50] uses its own built in ssh or something
[19:23:55] It doesn't appear to.
[19:24:05] milimetric, or possibly others, is there a way of checking which link in a given article is clicked most often?
[19:24:16] oof, these are issues I was expecting when they wanted me to firewall this thing off :p
[19:24:16] heheh
[19:24:17] Hmm.. I could sshfs mount my project dirs on bastion.
[19:24:24] yeah but that is not ideal
[19:24:26] hmmm
[19:25:07] jgonera: yes, we capture referrer information
[19:25:20] though that's imperfect because of https, etc.
[19:25:36] anyway, in theory we could do something like:
[19:25:40] milimetric, but I assume this is not something easily accessible and probably time consuming to generate?
[19:25:59] select article, count(*) where referrer = 'something' group by article
[19:26:11] yeah, in theory we have the data for mobile
[19:26:16] we do not have it yet for desktop
[19:26:24] oh, why not for desktop?
[19:26:38] and in what way do you have it for mobile? as in calculated periodically?
[19:26:39] varnish-kafka is not enabled on the desktop varnishes yet
[19:26:49] we're basically blocked on a network issue in Amsterdam
[19:26:55] I see
[19:27:00] no, we have the data as in the raw request logs
[19:27:06] halfak: are you using a sftp gui?
[19:27:07] or cli?
[19:27:16] It's built into my editor: jedit.
[19:27:19] hm
[19:27:32] so it would still take some time to calculate this for a few articles, right?
[19:27:32] so from where we are now to answer your question, it would mean 1. figuring out the caveats around looking at referrer and 2.
running the analysis
[19:27:46] yes, it would take a while to get a good answer
[19:27:49] I'm using sshfs right now, but I like to minimize that since sshfs likes to lock my machine up.
[19:28:00] but jgonera and Ironholds might want to meet about this
[19:28:09] since Ironholds is now king of mobile data, right?
[19:28:17] oh, right
[19:28:24] and milimetric is shitty at data analysis :)
[19:28:48] I was interested in both mobile and desktop though, for a possible side project, but since it is a side project I'm not going to bother anyone for more than 5 minutes with this ;)
[19:29:12] I just wanted to know how feasible it is and if it's automatized in any way
[19:29:24] thanks for help milimetric
[19:29:53] oh yeah, no problem
[19:30:11] the other thing to do would be to turn on eventlogging on the articles you care about
[19:32:28] milimetric, don't call me that.
[19:32:30] ;p
[19:33:03] good point Ironholds, I owe you as many beers as unexpected projects that your new title brings you
[19:33:04] :P
[19:33:29] hahah
[19:53:54] (CR) QChris: [C: -1] Fixed regression test. (10 comments) [analytics/wikistats] - https://gerrit.wikimedia.org/r/123603 (owner: Stefan.petrea)
[20:17:01] * YuviPanda plops
[20:17:14] I've a small question about funnels and EventLogging and schema design.
[20:17:21] * YuviPanda wonders if he should just ask or just do it and then ask
[20:28:41] (PS9) Stefan.petrea: Fixed regression test. [analytics/wikistats] - https://gerrit.wikimedia.org/r/123603
[21:58:13] * milimetric runs away before he goes nutty checking email
[21:58:16] <3
[21:58:18] have a good weekend
[21:58:19] byebye
[22:03:36] byebyeyyyy
[22:40:30] (CR) Ottomata: "Thanks qchris!" [analytics/kraken] - https://gerrit.wikimedia.org/r/121531 (owner: Ottomata)
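[Editor's note: the ad-hoc query milimetric sketched at 19:25:59 ("select article, count(*) where referrer = 'something' group by article") would run over the raw request logs. As a toy illustration of the same filter-and-group-by against log lines, with an invented two-field article<TAB>referrer layout (real request logs carry many more fields):]

```shell
# Toy sketch: count requests per article for one fixed referrer.
# The data and field layout are invented for illustration only.
printf 'PageA\thttp://ref.example/\nPageB\thttp://ref.example/\nPageA\thttp://other.example/\nPageA\thttp://ref.example/\n' |
  awk -F'\t' '$2 == "http://ref.example/" { n[$1]++ }   # filter = WHERE, array = GROUP BY
              END { for (a in n) print a, n[a] }' |
  sort
# prints:
# PageA 2
# PageB 1
```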