[00:10:04] drdee: where can I find an up-to-date definition for the UV figures that we use in the reportcard based on ComScore data? [00:10:33] comscore site ? :D [00:11:12] so we have no internal document describing what we call UV? I've only found this page from Stu so far: http://meta.wikimedia.org/wiki/User:Stu/comScore_data_on_Wikimedia [00:12:33] maybe erik zachte would know [00:13:16] k, I'll drop him a line [00:44:49] brb guys [01:48:37] drdee: finishing up that changeset, I'm adding a test to it [01:48:50] great! and then we merge! [01:49:02] yes [14:33:38] good morning! [14:41:30] mooorning! [14:58:06] hello :) [15:00:00] hiya [15:02:44] oh hi [15:02:59] hashar: good day sir [15:03:05] hey :) [15:03:14] I am migrating your Jenkins job to the new zuul system :) [15:03:19] https://integration.mediawiki.org/ci/view/Analytics/ [15:06:45] hashar: monsieur Antoine, can we expose certain files from jenkins on a web server that runs on jenkins? [15:07:21] * average_drifter is remembering some french [15:09:09] one more in the french cabal :-] [15:09:15] :D [15:09:15] your French is very good! [15:09:25] yes, but I never get the accents right :D [15:09:34] damn I'm forgetting it [15:11:02] average_drifter: what do you want to publish ? :) [15:16:05] hashar: testdata from wikistat's workspace [15:16:57] hashar: we're running tests (basically rendering reports using artificially generated squid logs) and if we can expose the testdata results that'd be cool [15:19:15] hashar: there will be no information leak if we expose just testdata/ [15:19:43] hashar: because everything in there is just generated data, or anonymized data, so no real data [15:27:36] hashar: should I paste what I wrote above ? [15:27:43] hashar: there was a netsplit.. [15:27:52] yuip :) [15:28:49] 17:16 < average_drifter> hashar: testdata from wikistat's workspace [15:28:52] 17:17 < average_drifter> hashar: we're running tests (basically rendering reports using artificially generated squid logs) and if we can expose the testdata results that'd be cool [15:28:55] 17:19 < average_drifter> hashar: there will be no information leak if we expose just testdata/ [15:28:59] 17:19 < average_drifter> hashar: because everything in there is just generated data, or anonymized data, so no real data [15:29:19] unless the set is not properly anonymized :-] [15:30:34] average_drifter: you could most probably publish them under /srv/org/mediawiki/integration [15:35:50] oh cool [15:36:05] hashar: what's the route that is associated to /srv/org/mediawiki/integration ? [15:36:27] hashar: also, will mediawiki people be ok with us exposing our stuff in their directory ? [15:36:43] hashar: would it make more sense to have a /srv/org/analytics/integration ? [15:36:48] depends what you want to expose I guess [15:37:45] re sorry [15:37:45] average_drifter: it depends which data you are going to show I guess [15:40:08] grmmnm [15:40:14] I got a bug in the Jenkins Git plugin :( [15:40:40] hashar: basically reports like this ~ [15:40:45] hashar: http://stats.wikimedia.org/archive/squid_reports/scrap/2012-10-new/SquidReportOrigins.htm [15:40:58] hashar: only they are generated from fake data [15:41:20] so I am not sure what the issue is [15:41:25] the report is already public, isn't it?
[15:41:38] hashar: that one is public yes, but that was manually put there [15:41:44] hashar: and manually generated [15:42:08] hashar: whereas I want to generate them while running our tests and expose them in /srv/org/mediawiki/integration :) [15:42:55] I'll expose them in /srv/org/mediawiki/integration/wikistats/ [15:43:52] hashar: is there an html validator on Gallium ? [15:44:05] hashar: I mean if there's one already installed [15:44:08] not that I know of [15:46:49] milimetric: hey [15:46:58] howdy average_drifter [15:47:01] milimetric: you guys don't generate any static htmls that need validation right / [15:47:04] ? [15:47:15] well, we probably should validate [15:47:29] but I think that's something we'll work on after we actually have a release [15:47:48] we've had some architectural struggles these past couple of months [15:47:48] milimetric: if you had to choose an html validator what would you choose ? [15:47:53] and are just now seeing the end of it [15:48:19] milimetric: we got many errors in some pages. I mean they're showing up alright in the browser but the js console is jammed with err messages and warnings [15:48:23] I think I'd just start with this: http://validator.w3.org/ [15:48:30] oh I see [15:48:42] well those most likely won't get caught by html validators [15:49:12] but i can help you debug some javascript stuff in a bit if you need [15:49:21] I'm just gonna finish something up real quick, 20 minutes. [15:50:25] average_drifter: don't spend too much time on having a validator :-) [15:50:37] average_drifter: it is something nice to have but I don't think it is that important :-D [16:03:19] out [17:47:39] yoyo [17:47:57] hey drdee [17:48:13] oh hello! [17:48:19] how is everyone! [17:48:21] i feel GREAT [17:48:33] cf. http://www.youtube.com/watch?v=Y6rE0EakhG8 [17:48:36] morning dschoon [17:48:43] hey milimetric [17:48:53] i am totally gonna read the hell out of that slide deck again! [17:49:28] that cool, drdee? [17:49:31] drdeeeee hia! [17:49:40] dschooon haiiii [17:49:44] BABIES EVERYWHERE [17:49:47] very cool indeed dschoon [17:49:47] that's what drdee says [17:49:49] drdee! [17:49:55] YES OTTOMATA! [17:50:00] what is a way to test hue email stuff? [17:50:04] WE ARE VERY EXCITED TODAY [17:50:09] EVERYBODY IS PUMPED [17:50:37] ottomata, i created an oozie workflow that should send an email if the job fails [17:50:47] not sure what triggers an email action within hue [17:52:50] hmmm oooozie [17:53:23] is the email thing part of the oozie job properties? [17:57:24] no, it's part of the oozie xml document [17:57:33] ? [17:57:37] is that different? [17:57:41] yes [17:57:51] can I see the conf? [17:57:52] an oozie job requires two files: [17:57:55] ah ok [17:57:57] sure, hold on [17:58:08] the job.properties file with system / hadoop settings [17:58:16] and the actual job specification, an xml document [17:59:41] https://plus.google.com/hangouts/_/2e8127ccf7baae1df74153f25553c443bd351e90 [18:03:07] dschoon ^ [18:03:14] kay! [18:03:14] sec [18:14:09] ottomata: check an01:/home/diederik/workflow.xml [18:14:16] that contains the example email action [18:14:38] i got kicked out of hangout [18:17:34] erosen: http://mlpy.sourceforge.net/ [18:17:48] built on top of scipy [18:20:03] so this plane does have wifi but no way of charging my battery :( [18:21:43] gonna get lunch, be back in a bit [18:22:54] erosen, really [18:22:56] hey [18:30:07] okay, not done with the deck, but i need to take care of interview stuff [18:30:13] i'll come back to it this afternoon.
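For reference, the notification being discussed here is Oozie's email action, which lives in the workflow XML (the "job specification" file above), not in job.properties. The following is only a minimal sketch of what such a workflow.xml could look like; the workflow name, node names, placeholder fs action, recipient address and ${nameNode} property are illustrative assumptions, not the actual contents of an01:/home/diederik/workflow.xml. Note also that the email is sent by the Oozie server itself, so it only works if SMTP is configured for Oozie (e.g. oozie.email.smtp.host in oozie-site.xml).

    <workflow-app name="example-wf" xmlns="uri:oozie:workflow:0.4">
        <start to="main-action"/>
        <!-- placeholder for the real work; on error, route to the email node -->
        <action name="main-action">
            <fs>
                <mkdir path="${nameNode}/tmp/oozie-email-test"/>
            </fs>
            <ok to="end"/>
            <error to="notify-failure"/>
        </action>
        <!-- email action: sent by the Oozie server, independent of Hue -->
        <action name="notify-failure">
            <email xmlns="uri:oozie:email-action:0.1">
                <to>analytics-team@example.org</to>
                <subject>Oozie workflow ${wf:id()} failed</subject>
                <body>Workflow ${wf:id()} failed at node ${wf:lastErrorNode()}.</body>
            </email>
            <ok to="fail"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>Workflow failed; email notification sent</message>
        </kill>
        <end name="end"/>
    </workflow-app>

Routing the email action's ok transition to the kill node keeps the workflow marked as failed after the notification goes out, which is the usual pattern for failure alerts.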
[19:18:04] ah ok, drdee [19:18:21] i see, the oozie email notification is sent by oozie [19:18:23] i think it should just work [19:18:24] does it not? [19:18:58] i don't think so, iirc that job failed a number of times but i never received an email [19:19:03] i could be mistaken [19:21:12] ahhh, i think i know [19:23:01] * drdee is waiting in suspense [19:28:02] ottomata, did you change the sub-sub domain names to make stuff work with ldap? [19:28:35] no? no [19:28:54] i need to change that when I enable https [19:29:02] so I think all of that is going to have to change [19:30:27] ok, shall i make an asana task? [19:34:22] done, anyways [19:36:26] okay guys i have only 8 minutes of battery left….. if you have a question shoot now :D [19:41:11] 5 minutes left and counting [19:44:08] you don't have an outlet at your seat? [19:44:15] NO :( [19:44:30] drdee: hey where you goin ? [19:44:30] this is so frustrating [19:44:39] there is wifi but no outlet [19:44:49] average_drifter: my battery is about to die [19:45:02] i'll be online in about 2.5 hours [19:45:12] hmm ok [19:45:25] send me stuff quickly if you need feedback :D [19:45:36] like 1 minute quickly [19:45:36] silly ol' Delta [19:46:15] :( git conflict [19:46:19] won't make it in 1m [19:46:19] ok, drdee [19:46:20] i have a q! [19:46:24] shoot [19:46:26] now [19:46:30] can I try to run that oozie job and have a failure? [19:46:31] hurry!! [19:46:31] :) [19:46:33] how can I check email? [19:46:36] i think it should work now [19:46:40] want to verify [19:46:50] make a small change to the oozie doc [19:46:54] change an [19:47:00] ah to make it busted [19:47:01] ok [19:47:04] and then to submit? [19:47:33] damn [19:49:01] ah, lost him [21:03:10] dschoon: heard anything about or looked at http://techblog.netflix.com/2012/12/hystrix-dashboard-and-turbine.html ? [21:04:42] interviewing, brb [21:36:31] !log log test [21:36:34] Logged the message, dummy [21:36:39] oooo [21:37:04] https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log [21:37:12] cool! [21:44:40] andrewbogott: thank you! that's great :) [21:44:50] Sure thing. Hope it works! [21:46:20] thank you! [21:48:55] good call milimetric [21:49:19] what's the channel is logged: <> thing? [21:49:22] though http://ur1.ca/a8gl3 [21:49:24] seems broken [21:49:26] yea [21:49:32] should we just take that off? 
[21:49:40] well it would be useful [21:49:44] or is that where Everything would be logged [21:49:51] well there are two logs [21:49:56] there is the complete IRC log [21:50:02] which is what the tiny url is supposed to point to [21:50:04] and the admin log [21:50:06] gotcha [21:50:21] http://bots.wmflabs.org/~petrb/logs/%23wikimedia-analytics/ [21:50:21] so no removing, just fix at some point [21:50:24] this one works fine [21:50:37] but the tiny url points to http://bots.wmflabs.org/~petrb/logs/%23wikimedia-analytics/?C=M;O=D [21:50:39] for some reason [21:50:58] they both give me a 403 but that may be 'cause i'm an outsider [21:51:12] http://ur1.ca/bzklp [21:51:18] try that ^ one [21:51:30] i guess it probably won't change anything if you could do the other one [21:51:38] yeah that points to the one you said worked but still not allowed from out here [21:51:45] interesting [21:52:14] weird [21:52:33] there :) better [21:52:57] something is still messed up [21:53:10] i have this link open in two tabs [21:53:11] http://bots.wmflabs.org/~petrb/logs/%23wikimedia-analytics/ [21:53:17] and one works and the other doesn't [21:53:39] and now it works [21:53:42] very confused [21:53:56] thanks for updating [22:08:57] ottomata: do you remember the file system set up on labs? [22:09:03] i.e. I can't write to my home dir [22:09:15] which partition is full? [22:11:17] erosen: labs homedirs are full and misbehaving at the moment. We have a long-term solution in the works but not so much a short-term one :( [22:11:45] gotcha, thanks [22:12:04] andrewbogott: is it the sort of thing I can help with by clearing out space? [22:12:12] or is it more complicated than tha [22:12:12] t [22:12:13] yes! [22:12:16] hehe [22:12:20] i'll see what I can do [22:12:40] In theory anything bigger than .bashrc should live in /data/projects rather than your homedir. [22:15:29] thanks [22:20:49] average_drifter: do you think you can move some of your files on the labs NFS to /data/project? [22:21:40] i'm not sure you are to blame for the full partition, but your home dir is readable so I know there is a little bit of stuff at least ;) [22:33:01] spetrea@stat1:~$ du -csh . [22:33:01] 39G . [22:33:02] 39G total [22:33:08] I'm using 39G [22:33:37] there's no /data/project on stat1 [22:33:43] oh not on stat1 [22:33:47] just on labs [22:33:49] build1 ? build2 ? [22:33:52] yeah [22:34:06] I think those have their own nfs [22:34:43] basically the analytics project has a total shared amount of home dir space on the labs machines [22:34:55] spetrea@build1:~$ du -csh . [22:34:55] 122M . [22:34:55] 122M total [22:34:57] yeah [22:34:59] I'm using very little space [22:34:59] not much [22:35:12] There are 18G available, shared among all homedirs in all projects. [22:35:23] So even Ms of storage will fill it up pretty quickly. [22:35:23] hmm [22:35:38] perhaps a bigger disk would solve it ? :) [22:35:39] andrewbogott: any way to tell who is using space from your end? [22:36:18] average_drifter: We're going to migrate away from that system but right now I need just, like, 1k of space in order to test my migration plan :( [22:38:45] average_drifter: 1kilobyte ? [22:38:56] what is 1k ? [22:39:05] 1kb ? [22:39:33] i think so [22:39:38] right now it is 100% full [22:39:46] so I (nor andrewbogott) can do anything [22:39:46] I can delete 1MB of data. Does that help ? [22:41:11] It might. Probably there's a process running wild someplace that's gobbling up any space that appears [22:41:41] that seems likely.
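As a rough sketch of the kind of checks being discussed above (not commands anyone actually ran; the paths, mount points and "analytics" project directory are assumptions), something along these lines shows which home directories are consuming the shared space and how to relocate bulky data to project storage:

    # How full is the shared homedir filesystem, and where is it mounted?
    df -h /home

    # Rank home directories by size to see who is using the space
    sudo du -sh /home/* 2>/dev/null | sort -rh | head

    # Move large data out of the home dir and leave a symlink behind
    # (assumes /data/project is mounted and an analytics subdirectory exists)
    mv ~/big-dataset /data/project/analytics/big-dataset
    ln -s /data/project/analytics/big-dataset ~/big-dataset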
I deleted a few k just now and it didn't seem to help [22:51:55] I think there's a utility, like htop/top, that tells you which processes are using I/O a lot [22:51:55] http://serverfault.com/a/9433/138688 [22:51:55] iotop [22:51:55] :) [22:51:55] http://guichaz.free.fr/iotop/ [22:59:47] back [23:00:01] !log kraken initial log entry [23:00:03] Logged the message, cap'n [23:00:13] tadaaa the logbot works! [23:00:28] dschoon, ottomata, milimetric, average_drifter ^^ [23:00:33] hehe [23:00:48] yeah, I changed our topic [23:00:49] the first 'word' indicates the project [23:00:49] :) [23:01:08] !log limn initial log entry [23:01:10] Logged the message, Master [23:01:15] drdee: ! [23:01:16] woa! [23:01:19] I'm master [23:01:44] :) [23:01:45] drdee: welcome back :) [23:02:05] drdee: uploaded two screencasts explaining how the tests worked [23:02:12] awesome! [23:02:13] url? [23:02:17] drdee: added a sample test setup for anyone who wants to write tests [23:02:33] drdee: check e-mail please [23:04:20] brb smoke [23:07:11] dschoon: got a sec for git deploy questions? [23:07:17] yeah, totally [23:07:21] that is, global-dev.wmflabs deploy [23:07:30] in person work? [23:36:05] drdee: https://gerrit.wikimedia.org/r/#/c/38157/ [23:36:33] drdee: I couldn't remove all the /home/ezachte paths; I need to ask Erik about it because some of them are Windows-dependent [23:36:49] drdee: so there's 3 ways Erik runs scripts [23:36:54] drdee: * on his Windows machine [23:37:00] drdee: * in his home directory on stat1 [23:37:07] drdee: * in production [23:37:36] if there were just two I could've done a simple check for just the $job_runs_in_production variable, but there are more than two