[13:31:27] (PS1) Gilles: Clean up limn graphs [analytics/multimedia/config] - https://gerrit.wikimedia.org/r/143872
[13:42:02] note: I'm cleaning up the puppet setup on dev and staging ... now
[13:46:59] ottomata / qchris: I got this error several times when trying to pull latest from puppet, on dev, prod, and staging
[13:47:00] error: The following untracked working tree files would be overwritten by checkout:
[13:47:00] modules/varnish/files/ganglia/.pep8
[13:47:00] modules/varnish/files/ganglia/ganglia-varnish.py
[13:47:13] ... (it keeps listing all the files in modules/varnish)
[13:47:13] ohhhh
[13:47:14] yes
[13:47:16] what is that?!
[13:47:31] bblack got rid of the varnish submodule
[13:47:32] put it back into ops puppet
[13:47:36] Yup.
[13:47:36] oh
[13:47:39] rm -r modules/varnish
[13:47:40] ok, i've been doing this:
[13:47:44] right
[13:47:48] that's what i've been doing
[13:47:51] but felt dirty about it
[13:47:56] ok, good to know it's proper
[15:05:24] headed to the cowork space, back in a bit
[16:46:16] (CR) Nuria: ">I'm just starting out testing, but I think I see the problem. So, the links are >ONLY created for WikimetricsBot. There was no Wikimetr" [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/143040 (https://bugzilla.wikimedia.org/66087) (owner: Milimetric)
[16:59:34] milimetric: /var is full on wikimetrics staging. That's why processes that try to write to disk (like puppet) fail.
[16:59:42] ah!
[16:59:57] Not sure what I may/may not clean there.
[17:06:21] qchris_away: let me look
[17:06:31] Toooooo late. I am cleaning up already.
[17:06:50] 0 bytes free went to 900MB free.
[17:07:32] [travis-ci] wikimedia/mediawiki-extensions-EventLogging#226 (wmf/1.24wmf12 - d0181cd : Reedy): The build passed.
[17:07:32] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-EventLogging/commit/d0181cd006a0
[17:07:32] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-EventLogging/builds/29076777
[17:08:30] qchris_away: sounds good
[17:41:15] (PS7) Nuria: Fix wiki cohort display for report cohorts [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/142514 (owner: Milimetric)
[17:41:55] milimetric, i fixed a small bug on the patch, now i think it's ready to be merged: https://gerrit.wikimedia.org/r/142514
[17:42:14] good catch nuria!
[17:42:20] (just read the fix)
[17:42:25] ok, I agree it's ready to merge
[17:42:32] so I assume you were ok with my fixes?
[17:42:53] ya, the sql is a simpler one right?
[17:42:57] that can only be good
[17:44:24] I am going to try the other patch on the dev machine
[17:47:52] cool, sorry I was distracted nuria, I'm all yours now, this is #1 priority
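For anyone hitting the same checkout error: the varnish code moved from a git submodule back into the operations/puppet tree itself, so the leftover submodule directory shadows the in-tree files git now wants to check out. The recovery discussed above, spelled out as a sketch (the checkout path is assumed, and it presumes no local changes under modules/varnish worth keeping):

    cd /srv/puppet            # wherever the affected checkout lives (path assumed)
    rm -r modules/varnish     # drop the stale submodule contents
    git pull                  # the in-tree module files can now be checked out cleanly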
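Relatedly, on the full /var that broke puppet on wikimetrics staging: a generic way to confirm the problem and rank cleanup candidates before deleting anything (a sketch, not the exact cleanup qchris ran):

    df -h /var                                  # confirm the filesystem really is at 100%
    sudo du -xsh /var/* 2>/dev/null | sort -h   # rank top-level directories by size
    # /var/log is the usual offender; rotating logs reclaims space safely:
    sudo logrotate -f /etc/logrotate.conf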
[17:51:42] ok, milimetric: try https://gerrit.wikimedia.org/r/142514
[17:51:53] and merge if you think it's ready
[17:52:02] i will be testing symlinks in development
[17:52:13] nuria: what do you mean by try?
[17:52:25] like run tests, etc?
[17:52:41] I'll run them locally if you already ran them in dev
[17:58:59] milimetric: i mean i will try patch https://gerrit.wikimedia.org/r/#/c/143040/
[17:59:01] in dev
[17:59:07] machine on labs
[17:59:27] right, but you said:
[17:59:32] ok, milimetric: try https://gerrit.wikimedia.org/r/142514
[18:00:08] (CR) Milimetric: [C: 2] Fix wiki cohort display for report cohorts [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/142514 (owner: Milimetric)
[18:01:08] ok, I merged that after testing, it looked good
[18:01:14] so, the two things left:
[18:01:14] https://gerrit.wikimedia.org/r/#/c/143040/
[18:01:17] ok, good
[18:01:18] https://gerrit.wikimedia.org/r/#/c/142007/
[18:01:51] I am testing 143040
[18:01:55] (PS3) Milimetric: Remove limit on recurrent, add throttling [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/142007 (https://bugzilla.wikimedia.org/66841)
[18:03:43] ok nuria, once we merge https://gerrit.wikimedia.org/r/#/c/142007/ i think we can deploy to staging and try to set up a recurrent report of newly registered users in some small wiki
[18:04:02] we can even do that now though
[18:04:03] with a created= maybe
[18:04:22] that last patch removes the 30 day limit
[18:04:29] so we won't get the results we need without it
[18:04:35] ya, but we can test all but that
[18:04:54] oh, right - i thought that's what you were doing now, I meant after you're done
[18:04:55] meaning: we can test the first 30 days
[18:05:23] do you wanna hangout and test? I'm just writing an email so I can postpone if you want company
[18:05:23] and once it's all good (we will let the change bake)
[18:06:17] i cannot, cause i am in a little corner of the coworking space so they do not kick me out (sad...) .. without cable
[18:06:28] and the wifi sucks, i cannot do a hangout
[18:07:40] give me some mins to test the patch on the dev machine
[18:45:08] milimetric: just curious, how's the oozie stuff going?
[18:45:56] ottomata: I've not worked on it since this morning
[18:46:03] what I have submitted is my latest work (in gerrit)
[18:46:11] you can feel free to take a look but I haven't tested yet
[18:46:20] there were some concerns with testing if you take the cluster down
[18:46:30] but qchris was going to try to see if labs testing worked
[18:47:43] i have a cdh5 cluster up where it should work to test oozie
[18:47:52] the webrequest table isn't there
[18:47:58] need to make one and add data..
[18:48:25] milimetric: add me as reviewer?
[18:49:45] ottomata: added
[18:49:53] ooh, so we can test on CDH5 theoretically?!
[18:49:54] cool
[18:50:01] I can make the tables and stuff as long as I have rights
[18:50:02] ya, the 3 hadoop-e nodes
[18:50:04] you should
[18:50:07] i'd use hadoop-e-worker0
[18:50:11] log into that and use stuff from there
[18:53:09] ok, but i gotta finish this GIANT email I'm writing
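Since the webrequest table isn't on the cdh5 labs cluster yet, something along these lines could stand in for the oozie testing (a minimal sketch with assumed column names, not the production schema; run from hadoop-e-worker0 as suggested above):

    hive -e "
    CREATE TABLE IF NOT EXISTS webrequest (
      hostname    STRING,
      sequence    BIGINT,
      dt          STRING,
      ip          STRING,
      uri_host    STRING,
      uri_path    STRING,
      http_status STRING
    )
    PARTITIONED BY (year INT, month INT, day INT, hour INT);"
    # then insert a small slice of sample data into one partition so the
    # oozie coordinator has something to run against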
[19:13:34] Weird stuff in squid logs: User:....%22_class%3D%22resultLink/admin/record_company.php/password_forgotten.php
[19:18:02] sigh, thanks Nemo_bis, file a bug? but I think people have seen those in the past
[19:18:19] Nah, it's just a curiosity
[19:18:34] People can visit all sorts of weird URLs, doesn't damage anyone ;)
[19:19:52] yeah, it's weird tho :)
[19:20:20] Probably spambots for some other CMS going rogue
[19:20:49] What I'm not sure about are redirects from canonical namespaces, e.g. curl -I http://fi.wiktionary.org/wiki/User:Nemo_bis
[19:20:55] HTTP/1.1 301 Moved Permanently
[19:21:03] Location: http://fi.wiktionary.org/wiki/K%C3%A4ytt%C3%A4j%C3%A4:Nemo_bis
[19:21:55] And then in files like pagecounts-2013-06-views-ge-5-totals.bz2 I often see lines like X User:Y Z / X $ALIAS:Y Z
[19:22:56] When those numbers are the same, they make me suspect we're double counting that sort of redirect, because it produces a new HTTP request, which makes sense. Mostly negligible effects though
[19:23:26] yeah, i think that particular double counting has come up before
[19:23:56] I'm guessing Erik deals with it, but it's one of the reasons I'd love to get a solid pageview definition (or a set of definitions, as qchris prefers - and justly so)
[21:04:14] (PS2) Gergő Tisza: [WIP] Track opt-out ratio [analytics/multimedia] - https://gerrit.wikimedia.org/r/143501
[22:41:36] milimetric: wikibugs is here
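On the double counting Nemo_bis suspects above: one quick way to eyeball it in a pagecounts dump (a sketch; it assumes the usual "project title count bytes" line format with URL-encoded titles, and uses the fi.wiktionary User alias from the curl example):

    bzcat pagecounts-2013-06-views-ge-5-totals.bz2 \
      | grep -E ' (User|K%C3%A4ytt%C3%A4j%C3%A4):Nemo_bis ' \
      | sort
    # if the canonical and localized lines show the same count, the 301 hop
    # was probably counted once per request, i.e. twice per page view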