[00:33:42] drdee: may I ask another question ?
[00:34:12] sure
[00:35:10] it works now btw, with Geo::IP, it's a bit slower than before but it works (since geocoding's done at run-time)
[00:35:14] so my question is
[00:35:38] because jenkins is ruled by zuul which is a thing/bot that I have very little knowledge of
[00:35:55] I'd like to switch wikistats to travis somehow
[00:36:02] because there's bound to be a lot of tests coming up
[00:36:07] to fix all the bugs that are reported on it
[00:36:24] and jenkins/zuul requires too much overhead to control/tweak
[00:36:34] so I'd like to ask permission to do that
[00:37:40] https://github.com/wikimedia-incubator/debianize/blob/master/.travis.yml
[00:37:57] this is how a travis.yml looks for a Perl project
[00:39:36] average: I don't think you need permission
[00:39:52] it's OK to just do it. though if it's nothing to you, keep the setup in jenkins all the same
[00:40:03] that way it can be improved over time.
[00:40:18] (i mean, have both)
[00:40:33] ori-l: well actually, since hashar introduced zuul on jenkins, it got very complicated, so now I've lost control over jenkins
[00:40:52] due to lack of zuul knowledge
[00:41:29] having both is perfectly possible
[00:42:06] have you filed a bug?
[00:43:06] it's not a bug
[00:43:09] a bug report is useful for things like this -- there doesn't need to be an actual underlying application error. it's ok to just file a bug in integration and say "i don't know how to set up testing for my perl project because zuul is confusing"
[00:44:36] just look at the current crop of integration bugs and you'll see that this is quite normal to do: https://bugzilla.wikimedia.org/buglist.cgi?title=Special%3ASearch&quicksearch=zuul&list_id=194500
[00:46:48] average: if you like, I'd be happy to file one for you
[00:48:53] no, don't worry about it
[00:49:08] I'll make a travis.yml
[00:50:10] OK, but think about it. a bug report is easy to file and that way you are giving the integration team a chance to respond and make things easier to use. hashar and krinkle are both very nice.
[00:52:14] do you share my view about zuul ?
[00:52:34] just curious. maybe I'm wrong
[00:53:49] well, I was able to set up puppet linting for mediawiki-vagrant recently and it wasn't too hard. BUT, there was an existing puppet-lint job defined somewhere, so i just had to copy/paste and change the project name.
[00:54:02] your situation is a bit different, in that there's a different test runner involved
[00:54:11] I have no idea how complicated that is to set up, to be honest
[00:54:31] hey ori-l, you got my bug report :)
[00:54:48] which one?
[00:55:17] my bugzilla e-mail filters are wrong and i haven't fixed them yet so i may have missed it
[00:55:26] average: about jenkins/zuul/travis: let's make a new mingle card for continuous integration for wikistats and spec it out first
[00:55:35] drdee: ok
[00:55:37] ori-l: on github
[00:55:45] which repo?
[00:55:51] mediawiki-vagrant
[00:56:32] (wasn't sure how you wanted to handle bugs for that but i saw that you already closed one bug on that repo so i assumed you were using it)
[00:57:11] oh, I missed it. Just found it now. Thanks for the positive feedback. Things are on Bugzilla now, but it's confusing because the repository moved twice, so it's OK that it's there
[00:57:40] i really like vagrant and what you did!
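
For reference, a minimal .travis.yml for a Perl project along the lines of the one linked above might look like the sketch below. The Perl versions and the cpanm/prove commands are illustrative assumptions, not a copy of that file.

    language: perl
    perl:
      - "5.14"
      - "5.16"
    install:
      # install declared CPAN dependencies (assumes a Makefile.PL or cpanfile)
      - cpanm --installdeps --notest .
    script:
      # run the test suite under t/
      - prove -lr t

Travis then runs the install and script phases on every push, which covers the "lots of tests coming up" workflow average is after.
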
[00:57:43] it's really awesome
[00:58:19] i am trying to abuse the existing user metrics puppet module and get it to work with the mediawiki-vagrant setup
[00:58:27] if i sent you a pull request, would you consider it?
[00:59:36] yes, of course
[01:00:02] vagrant is pretty handy for puppet stuff, right? i've used it for that myself. labs isn't very convenient, though it looks like otto made it work well.
[01:00:25] used it for that == used it to develop and test puppet stuff that i later ported to operations/puppet
[01:00:33] about scribunto
[01:00:43] i just replied
[01:00:43] i think if core depends on it then it should be in core
[01:00:47] saw it
[01:00:51] i mean read it
[01:01:09] how can core depend on an extension? doesn't make sense to me
[01:01:14] I don't think core depends on it; it was referenced in the dump
[01:01:28] because it means that mw is broken when just using it
[01:01:52] well, it's not as crazy as you make it sound, but there's a problem there, sure
[01:01:56] :)
[01:02:05] as a total puppet noob
[01:02:16] what's the command to test a puppet manifest?
[01:02:18] I'd have to check to be certain but it looks like the XML of the dump just specifies a ContentHandler class for each article
[01:02:40] yeah i think it's something like that
[01:02:53] and your dump is a dump of a wiki that had scribunto enabled, so some of the pages specify lua module content type, and when you try to load it the wiki freaks out
[01:03:43] or importDump should warn me that i don't have all required extensions installed :)
[01:04:37] i'll answer the puppet q in a sec, brb
[01:09:14] drdee: https://bugzilla.wikimedia.org/show_bug.cgi?id=47270
[01:09:28] thanks :)
[01:09:56] anyways, for puppet stuff, you can just edit the puppet manifests in puppet/ using your favorite editor
[01:10:00] and then run 'vagrant provision'
[01:10:06] which will rerun puppet against the vm
[01:10:07] got it! ty
[01:55:44] ori-l, around
[01:55:46] ?
[02:25:01] New review: coren; "(3 comments)" [analytics/log2udp2] (master) - https://gerrit.wikimedia.org/r/58449
[02:41:22] drdee: about to put noam to bed; what's up?
[02:41:42] nothing important; first things first :)
[03:13:52] drdee: what's up?
[03:14:16] touché regarding the tab stuff btw :D
[03:14:25] buuut.....
[03:14:32] it happens way less than space :)
[03:14:37] and that was the most important thing
[03:14:45] but probably it's good to escape tabs
[03:14:47] anyways
[03:15:05] yeah, my e-mail was snarky, sorry
[03:15:13] just fooling around with puppet
[03:15:28] was wondering what the simplest way is to run git clone from inside a manifest
[03:15:29] hey, whatever you do in the privacy of your own home is none of my business
[03:16:14] but you don't need to help me now
[03:16:21] well, okay, i'll show you the simplest way first, then the way you probably want to do it if you want this merged to operations/puppet
[03:16:21] ryan faulkner is waiting for me
[03:16:29] to drink a beer
[03:16:31] he's in toronto?
[03:16:34] yup
[03:16:39] where are you guys going? :P
[03:16:47] ryan style :D
[03:17:10] which neighborhood do you live in, again?
[03:17:49] roncesvalles
[03:18:02] let's talk tomorrow
[03:18:48] sure
[03:18:53] send ryan my best
[12:59:27] morning
[13:03:07] morning
[13:27:38] morning
[13:28:58] I just read an article http://bit.ly/10LNAul . I can't say I learned a lot from it. I wish I'd read something more detailed
[13:29:12] it talks about a ground-up rewrite of an analytics solution
[13:29:15] ottomata: so kraken's settled on kafka?
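
A footnote on the earlier puppet question: the "simplest way" to run git clone from inside a manifest never actually makes it into the log. The usual quick-and-dirty pattern is an exec guarded by creates; the repository URL and target path below are invented for illustration.

    # clone once; later puppet runs skip the exec because the target already exists
    exec { 'clone_user_metrics':
      command => '/usr/bin/git clone https://gerrit.wikimedia.org/r/p/analytics/user-metrics.git /srv/user-metrics',
      creates => '/srv/user-metrics/.git',
    }

For something headed to operations/puppet one would more likely use something like a git::clone define or the vcsrepo module. And to round out the other question: puppet parser validate <file> syntax-checks a manifest, while 'vagrant provision' (as noted above) reapplies it to the VM.
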
[13:30:29] um? yes?
[13:30:48] settled? i guess, we like it, ops likes it, we just need to make it usable from frontends
[13:32:38] ok... i wasn't sure how familiar you had gotten with it yet
[13:33:10] we've been using it with 2 brokers, 4 producers, and one consumer since decemberish
[13:33:22] who are those?
[13:33:40] i.e. what data is it handling so far?
[13:36:15] mostly varnish access logs from the 4 mobile varnish servers
[13:46:57] ahh
[13:47:29] and for the foreseeable future everything also goes to udp simultaneously?
[13:49:16] for now, yeah, we'd like to send to kafka from varnish, but that will take some work
[13:58:45] milimetric: is there a standardized test for future telling?
[13:58:59] Change merged: Milimetric; [analytics/E3Analysis] (master) - https://gerrit.wikimedia.org/r/59264
[13:59:35] ottomata: i don't follow. i thought you were saying varnish was already going to kafka? so what do you mean would need some work?
[13:59:57] yeah jeremyb_, I think it's this: can I say "I told you so"? If yes, your future telling powers get +2 bonus points
[14:00:27] current flow
[14:00:35] then if you have 10 bonus points you get to be a master class future teller and you can order everyone around
[14:00:38] and they're always positive points?
[14:00:38] varnishncsa -> udp2log -> kafka -> hadoop
[14:00:41] ideal flow
[14:00:46] varnishncsa -> kafka -> hadoop
[14:00:51] or even more better
[14:00:52] right
[14:00:56] varnishncsa -> kafka -> storm -> hadoop
[14:00:58] no, if you are wrong, you get "cry wolf" points
[14:01:06] and if you have 10 of those, you get eaten alive by wolves
[14:01:32] do these cancel each other out?
[14:01:53] ottomata: so what all needs doing for that?
[14:04:14] well yea, if you get eaten alive by wolves you can't boss people around. But other than that, no.
[14:06:18] ottomata: can I ask what storm does ?
[14:07:10] distributed stream processing, kinda like hadoop but streaming data instead of batch jobs
[14:07:22] http://storm-project.net/
[14:07:38] http://www.mediawiki.org/wiki/File:Kraken_flow_diagram-2.png
[14:07:56] we'd like to use it for distributed geocoding, anonymization, user agent lookup, etc. before things get to hdfs
[14:08:07] that way some prelim processing has been done, and no private data is stored in hdfs
[14:08:29] jeremyb_, for the varnish -> kafka bit?
[14:08:41] we need a kafka producer in C that has zookeeper support, and a kafka varnish module
[14:16:36] is there one that does not have zookeeper support?
[14:17:19] and any special reason it should be in C?
[14:18:37] ottomata
[14:25:21] or c++? whatever varnish is in
[14:25:26] yes, the existing C ones do not have zookeeper
[14:25:40] changing locs, back in a sec
[14:32:55] ping average
[14:33:08] pong drdee
[14:33:48] Host Loss% Snt Last Avg Best Wrst StDev
[14:33:51] 1. 192.168.1.1 2.2% 1741 2.1 18.1 1.6 4511. 130.0
[14:34:00] there you go drdee :-)
[14:34:23] jeremyb_: I'm average :)
[14:34:36] ok I get it was a joke :)
[14:34:40] average: {{fact}}!
[14:34:46] :P
[14:34:51] :D
[14:43:11] drdee: https://mingle.corp.wikimedia.org/projects/analytics/cards/581
[14:43:20] yes i saw it
[14:43:21] ty
[14:43:39] I updated it just now
[14:44:37] i know, i saw it :)
[14:45:37] average: poke hashar and ask about how to set up zuul/jenkins for wikistats
[14:45:54] i want to have more information before deciding travis vs. jenkins
[14:46:09] average: about the mobile page view report
[14:46:22] what's left to do to publish it on stats.wikimedia.org?
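
One way to approximate the "ideal flow" sketched above without a native varnish module is to pipe varnishncsa straight into Kafka's console producer. This is a sketch only: the zookeeper address and topic name are made up, and the console producer's flags vary between Kafka versions (0.7 took --zookeeper, later releases take --broker-list).

    # stream varnish's NCSA-format access log, one request per line, into a Kafka topic
    varnishncsa | kafka-console-producer.sh \
        --zookeeper zk1.example.org:2181 \
        --topic webrequest_mobile

It lacks the reliability you'd want from a real producer (hence the C-with-zookeeper plan), but it is enough to prototype the kafka -> storm -> hadoop side.
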
[14:47:48] copying them to stats.wikimedia.org
[14:47:53] they are here atm http://analytics.wmflabs.org/reports/r52/out_sp/EN/
[14:48:12] do the other reports already contain links to the new mobile report?
[14:49:18] my understanding is that we overwrite two .htm files
[14:49:38] I think the links from the other reports to it will be valid
[14:51:39] no, we will keep the old mobile report (based on webstatscollector)
[14:51:44] and add your report as a new report
[14:51:53] so we can compare the two in the coming months
[14:52:24] let's make a final todo list: http://etherpad.wikimedia.org/publish-new-mobile-report
[14:53:36] didn't erik have a final comment on the text that we sent him the other day?
[14:54:35] milimetric, around?
[14:54:44] yep
[14:54:45] hi
[15:02:43] hey ottomata
[15:02:47] hihi
[15:40:41] drdee: updated the etherpad
[15:40:48] ty
[15:43:19] drdee: in terms of time/benefits, would it be a good idea to document wikistats? I mean how to run all the reports in a first phase (not the code itself).
[15:57:53] so I'm going to update the links for the reports now
[16:00:28] ping erik first
[16:00:31] and talk with him
[16:17:36] mornin
[16:18:26] drdee: e-mail sent, you are in Cc
[16:19:35] morning dschoon
[16:25:03] ottomata you about?
[16:25:15] i have all manner of good-news/bad-news dilemmas for you
[16:30:11] ?
[16:30:12] drdee: ?
[16:33:23] dschoon: nobody seems to be about
[16:42:24] dschoon: we'll have standup in 18m
[16:52:00] average: what do you mean by card https://mingle.corp.wikimedia.org/projects/analytics/cards/583
[16:53:13] hiii, was at lunch
[16:53:17] ssuuup? dschoon?
[16:55:02] drdee: I'm trying to spec it out now. I'm not 100% sure if it's something valid yet
[17:00:37] scrumy!
[17:00:58] average, milimetric, ottomata : ^^
[17:09:35] drdee, ottomata: I marked https://bugzilla.wikimedia.org/show_bug.cgi?id=47269 as blocker + highest priority
[17:09:48] see my last comment
[17:10:36] ok, DarTar, do you want me to kill and restart?
[17:10:39] that won't solve the bug
[17:10:44] but should keep things moving for now
[17:10:46] please do
[17:12:01] when you get a chance can you also take a look at the logs to see if there's any useful information we can use to troubleshoot?
[17:13:49] as usual, I can get data from the dev instance, so that doesn't affect me directly
[17:16:25] do you have access to stat1001?
[17:17:00] ok, DarTar, restarted stuff
[17:17:18] ottomata: I don't
[17:17:24] cool, checking it out
[17:18:56] ok, new requests go through and complete – thanks
[17:21:53] it's weird to have the urls in all these bug reports be 127.0.0.1 -- makes it difficult to know what you guys are talking about. is there any way to standardize access?
[17:32:18] grooming! https://plus.google.com/hangouts/_/e920454707b309ebf66331e55e9d557eb5ed1caa
[17:32:25] hygiene is important!
[17:33:43] milimetric, average, erosen, ottomata
[17:34:16] yes!
[17:34:20] ori-l: I was just referring to the fact that whoever has access to a local dev instance can generate that specific job really fast, the same requests can be used on any instance, including the production one
[17:34:32] huh?
[17:34:53] milimetric: grooming
[17:36:13] DarTar: that didn't make sense to me, but it might just be because i don't know enough about the project
[17:37:17] yeah, I was implicitly referring to a setup that is the same for all people working on the code, you're right that it's not particularly elegant
[18:22:37] ottomata: you should get sysop or something on wikitec h
[18:22:40] wikitech*
[18:36:33] ?
[18:36:47] (going to get coffee, brb)
[18:40:02] drdee: what do we do about tomorrow ? is Tomasz expecting to see the new reports on stats.wikimedia.org ?
[18:40:25] drdee: if so, I think I can do this right now and give you the .htm files so you can replace the ones on stats.wikimedia.org with them
[18:40:36] drdee: so that the links are present
[18:40:40] yes, that's why i have been pushing this for a week :)
[18:40:43] i need to figure out how http://upload.wikimedia.org/wikipedia/en/7/70/Bob_at_Easel.jpg is a metaphor for limn
[18:40:47] okay awesome, poke ottomata
[18:40:54] he can help you copy the file
[18:41:18] drdee: ok here goes
[18:42:12] dschoon: well, there is the soothing "it can be whatever you want it to be" thing that he would say
[18:43:26] ottomata: I am httrack-ing stats.wikimedia.org
[18:45:35] "and now, let's all make a happy little graph. let's just put some very active editors on here and... aw~ now doesn't that look nice! that's definitely a content, peaceful little graph. aw."
[18:46:35] "whoops, almost reached for some comscore there! boy would that have ruined things. but you know what? it wouldn't. don't worry. it's YOUR graph. don't let anybody tell you otherwise."
[18:48:14] it's your secret little graph. We won't tell anyone else about it
[18:48:14] if you tell anyone, I'll Cut you!
[18:48:14] (family guy Bob Ross)
[18:48:22] brb lunch
[18:48:41] haha
[19:02:50] https://github.com/blog/1472-new-github-logo
[19:15:06] drdee: just sent an invite for a reminder to review the grooming cards in preparation for the grooming meeting
[19:15:20] feel free to ignore, or tell me to delete it ;)
[19:20:25] average, how can I help you? :)
[19:51:50] I'm still mirroring stats.wikimedia.org
[19:51:59] eh?
[19:52:12] I know it's lame :( but I have to do it in order to find out where the lines I need to S&R are
[19:52:41] ottomata: so my idea is to finish mirroring it on my disk (hopefully it will be done soon)
[19:52:54] after that I make a oneliner with just the files that need to be changed
[19:53:01] I give the oneliner to you
[19:53:09] and you run it on stat1001
[19:53:40] um
[19:53:53] i don't understand what you need to do
[19:53:54] but
[19:53:54] q
[19:54:09] are you copying anything to /public/datasets right now?
[19:54:13] in labs?
[19:54:21] ottomata: no. why ?
[19:54:49] wikistats file in /public/datasets/public
[19:55:07] hmm, ok unrelated
[19:55:08] carry on
[19:55:08] nm
[19:55:20] ottomata: ok, so I want to explain what I want to do
[19:55:32] ottomata: TablesPageViewsMonthlyMobile.htm
[19:55:46] ottomata: this file contains the old mobile reports
[19:56:03] ottomata: wherever it appears inside any html as a link, a new link needs to be added to the new reports
[19:56:08] that's basically it
[19:56:41] ok…..i guess i don't know what I can help you with
[19:57:00] you need a subdir on stats.wikimedia.org to copy stuff to?
[19:57:43] ottomata: yes, please first copy this folder to a folder called new-mobile-pageviews-report http://analytics.wmflabs.org/reports/r52/out_sp/
[19:57:58] to the documentroot of stats.wikimedia.org
[19:58:23] does that exist on stat1?
[19:58:29] or just on analytics.wmflabs.org?
[20:07:57] ottomata: just there
[20:08:15] not on stat1
[20:11:05] ottomata: can you copy them to stat1001 please ?
[20:11:21] on it
[20:11:35] ottomata: did you see my pm?
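
For the record, the kind of httrack mirror average is running here is a one-liner; the output directory and the stay-on-host filter below are illustrative.

    # mirror stats.wikimedia.org into ./statsmirror, staying on that host
    httrack "http://stats.wikimedia.org/" -O ./statsmirror "+stats.wikimedia.org/*" -v

Grepping the finished mirror then yields the list of pages that link to TablesPageViewsMonthlyMobile.htm, which is all the S&R plan needs.
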
[20:12:16] http://stats.wikimedia.org/new-mobile-pageviews-report/
[20:12:18] yes, hehe
[20:15:17] thanks :)
[20:15:18] now
[20:15:27] the httrack is still at it
[20:15:30] :(
[20:19:49] average: you only need to add new links to the new report for the current month
[20:20:00] no need to do that for all the months
[20:26:19] drdee: yes, but I don't know which pages should have the link on them
[20:26:22] ottomata: grep -l "href.*TablesPageViewsMonthlyMobile" `find -name "*.htm"`
[20:26:34] ottomata: please run this on the documentroot on stats.wikimedia.org
[20:26:51] ottomata: grep -l "href.*TablesPageViewsMonthlyMobile" `find -name "*.htm" -o -name "*.html"`
[20:28:43] omg, ha
[20:28:48] wait, you know wikistats is on stat1, right?
[20:28:56] erik zachte rsyncs this over to stat1001 from stat1
[20:29:18] /a/stats.wikimedia.org/htdocs
[20:29:54] ottomata: so can I modify it myself on stat1 ?
[20:29:58] ottomata: and maybe afterwards you can rsync ?
[20:30:01] i think so, we might want to ask him
[20:30:04] i'm not sure how he does it
[20:30:09] drdee, do you know?
[20:30:10] yeah we could ask him if he was reachable
[20:30:43] let's only fix the current month :)
[20:30:46] average: check http://stats.wikimedia.org/wikimedia/squids/SquidReportCountryData.htm
[20:31:00] and only fix the links in the pages
[20:31:01] :
[20:31:07] drdee, do you know how ezachte deploys stats.wm.o? i can't remember
[20:31:07] See also: Requests by destination or by origin / Methods / Scripts / User agents / Skins / Crawlers / Op.Sys. / Mobile devices / Browsers / Google /
[20:31:07] Geographical info: Overview / User agents / Browsers / Operating systems / Projects / Trends , a
[20:31:10] r
[20:31:21] nope, bash script?
[20:31:26] i don't think it's automatic
[20:31:30] don't think so either
[20:32:50] ok, I'm going on stat1 in /a/stats.wikimedia.org/htdocs
[20:34:18] yeah i guess so…i'm really not sure how he updates stats
[20:34:21] am going to add the link to /a/stats.wikimedia.org/htdocs/wikimedia/squids/SquidReportCountryData.htm
[20:34:22] milimetric, do you know?
[20:34:36] reading
[20:34:54] ezachte creates the html files for stats.wikimedia.org on stat1
[20:34:58] then they are deployed to stat1001
[20:35:01] not sure how he deploys
[20:35:11] i'm thinking he copies them
[20:35:23] manually?
[20:35:23] a manual rsync each time?
[20:35:28] it's just a guess, but yes
[20:35:29] yes probably
[20:35:32] it's consistent
[20:35:33] you're probably right
[20:35:51] just do a find/grep for stat1001 :)
[20:35:51] ok average, that's fine then, i should be able to rsync
[20:35:57] where?
[20:37:06] ottomata: the directory you added earlier
[20:37:54] ?
[20:37:54] ok modified !
[20:38:03] no, my q was for drdee
[20:38:06] ottomata: can you rsync stat1 ===> stat1001 please ?
[20:38:08] find/grep where?
[20:38:11] ha, um...ok!
[20:38:21] drdee: modified only the link you told me
[20:38:39] oh, you said in all those pages
[20:38:44] yes :)
[20:38:55] ok, hopefully the rsync will tell us if I modified the right file
[20:39:46] is there only a single file you changed?
[20:39:49] or did you change a bunch?
[20:39:52] ottomata: yes, just one
[20:40:02] can I just copy that one file?
[20:40:02] ottomata: this one /a/stats.wikimedia.org/htdocs/wikimedia/squids/SquidReportCountryData.htm
[20:40:04] i'm scrrrd
[20:40:08] :( ?
[20:40:10] what happened ?
[20:40:15] now
[20:40:16] no
[20:40:16] ha
[20:40:18] not screwed
[20:40:20] scared
[20:40:22] ok
[20:40:23] hehe
[20:41:20] k i copied the one file
[20:41:21] http://stats.wikimedia.org/wikimedia/squids/SquidReportCountryData.htm
[20:42:10] hmm
[20:46:31] ok, I'll just modify the other ones too
[20:46:39] it looks a bit suspicious to me now
[20:47:03] considering the page has 2081 validation errors..
[20:47:05] anyway
[20:48:38] average: how come all the numbers are missing from http://stats.wikimedia.org/wikimedia/squids/SquidReportCountryData.htm
[20:50:33] drdee: I only added one link
[20:50:39] and all the numbers disappeared
[20:50:44] the page has 2081 validation erros
[20:50:46] *Errors
[20:50:51] sigh
[20:51:51] I am in the hangout, if you want to discuss with me what we should do
[20:53:50] guys, the file is now overwritten with another file
[20:54:24] ottomata, average: ^^
[20:54:32] yes
[20:54:38] why?
[20:54:53] because I added a link, and asked ottomata if he could rsync
[20:55:00] and he rsync-ed
[20:55:15] no; the country report no longer contains country data
[20:55:42] yes, one possibility is that I modified the wrong file
[20:56:05] milimetric, care to join?
[20:56:09] spetrea@stat1:/a/stats.wikimedia.org/htdocs$ find -name "SquidReportCountryData.htm"
[20:56:12] ./archive/squid_reports/2012-01/SquidReportCountryData.htm
[20:56:15] ./archive/squid_reports/2011-10/SquidReportCountryData.htm
[20:56:17] ./archive/squid_reports/2011-11/SquidReportCountryData.htm
[20:56:20] ./archive/squid_reports/2012-01b/2012-01/SquidReportCountryData.htm
[20:56:22] ./archive/squid_reports/2012-03b/SquidReportCountryData.htm
[20:56:25] ./archive/squid_reports/2012-03/SquidReportCountryData.htm
[20:56:27] ./archive/squid_reports/2012-02/SquidReportCountryData.htm
[20:56:30] ./archive/squid_reports/2011-12/SquidReportCountryData.htm
[20:56:32] ./wikimedia/squids/SquidReportCountryData.htm
[20:56:34] I modified the last one and asked Andrew to rsync
[20:56:37] but that may have been the wrong one
[20:56:40] there are many Country Reports
[20:56:44] apparently it was the wrong one
[20:57:03] I'm sure you guys are better equipped to deal with it. I've gotta focus on my #1 priority
[20:58:13] ottomata?
[20:58:18] wha?
[20:58:23] whatchu need?
[20:58:42] can we revert? the country report file has been overwritten with the wrong file
[20:58:42] drdee , ottomata can we try again ?
[20:59:08] hmm, ha, i don't think I saved it, but we only changed one file
[21:00:02] hangout?
[21:00:44] cmooon lemme work on other things! :p
[21:00:55] where does the file come from?
[21:00:58] can you regenerate it?
[21:02:10] it would be complicated
[21:02:17] we need to fix this :)
[21:02:36] ok, i mean, i know nothing about wikistats or how it is deployed
[21:02:41] or how it is created
[21:02:52] you tell me how I can help and I will do it
[21:02:53] k
[21:24:47] ottomata , drdee I'll be ready to rsync again in 4m
[21:25:27] k
[21:30:40] all righty guys (ottomata, milimetric, erosen, average) part 2 of the grooming session
[21:30:41] :D
[21:30:43] https://plus.google.com/hangouts/_/f34ac18b582d1f011365b31385129dc694eca04b
[21:31:24] drdee can't make part 2, for another 30 min
[21:32:54] oh man do i have to go
[21:33:37] alrighty
[21:33:44] do I have tooooooooo?
[21:33:51] ottomata: can you please please please
[21:33:56] ottomata: stay for another 10m ?
[21:33:57] pls ?
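
A sketch of the bulk search-and-replace being prepared here, with backups this time. The file pattern and the inserted link text are illustrative assumptions; -i.bak makes sed keep a copy of every file it touches.

    cd /a/stats.wikimedia.org/htdocs
    # every page linking to the old mobile report...
    grep -rl 'href.*TablesPageViewsMonthlyMobile' --include='*.htm' --include='*.html' . |
    while read -r f; do
        # ...gets a link to the new report appended after the old one (keeps a .bak copy)
        sed -i.bak 's|\(TablesPageViewsMonthlyMobile\.htm[^<]*</a>\)|\1 / <a href="/new-mobile-pageviews-report/">new report</a>|' "$f"
    done
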
[21:34:04] oh i can stay, i just don't want to do any more grooming
[21:34:25] ottomata: that's fine :)
[21:34:31] you are dismissed from grooming :)
[21:34:49] yes!
[21:42:40] ottomata: poke
[22:05:58] hi
[22:06:02] oh sorry
[22:06:03] yes yes
[22:06:07] i was about to do that
[22:06:09] then got distracted
[22:06:37] done
[22:09:55] ori-l ^
[22:10:04] ottomata: thank youuuuuuuuuuuuuu
[22:10:19] i'm just writing a report on the outage, i'll cc analytics@
[22:10:27] it's poetic justice for snarking drdee
[22:30:26] ottomata: hey, I finished
[22:30:35] ottomata: /home/spetrea/modified_squid_reports_to_contain_link_to_new_mobile_reports_2013_03
[22:30:36] drdee, for tomorrow: http://stats.wikimedia.org/kraken-public/webrequest/mobile/device/props/2013/
[22:30:40] ottomata: can you please rsync ?
[22:30:49] haha, ok should I back up this time?
[22:30:52] is it just the one file?
[22:33:33] ottomata: yes, please do
[22:33:39] ottomata: all the files in http://stats.wikimedia.org/kraken-public/webrequest/mobile/device/props/2013/
[22:33:47] ottomata: if you can sync them then it would be awesome
[22:33:54] wait
[22:33:56] what did I say
[22:34:16] ottomata: /home/spetrea/modified_squid_reports_to_contain_link_to_new_mobile_reports_2013_03/* ======>>> stats.wikimedia.org
[22:34:22] ottomata: that's basically it
[22:34:55] still unsure, wait guess what
[22:34:57] you can do this
[22:35:06] rsync has a ::a module
[22:35:07] so
[22:35:29] rsync -av balblabla/ stat1001.wikimedia.org::a/srv/stats.wikimedia.org/htdocs/..blbalbalab/
[22:35:35] add the -n flag to try it out first
[22:35:40] phaw?
[22:35:42] what is ::?
[22:35:47] rsync module
[22:35:55] rsyncd
[22:35:58] * dschoon man rsync
[22:36:09] ottomata: I don't have perms :|
[22:36:15] you should
[22:36:19] it's writable by wikidev
[22:36:24] it isn't ssh
[22:38:14] rsync -av /home/spetrea/modified_squid_reports_to_contain_link_to_new_mobile_reports_2013_03/ stat1001.wikimedia.org::a/srv/stats.wikimedia.org/htdocs/
[22:38:17] like this
[22:38:25] ?
[22:38:40] ottomata: huh! so do we keep the rsync daemon running on stat1/stat1001?
[22:38:42] where else?
[22:38:49] stat1001
[22:38:52] stat1 as well
[22:39:04] there are a few machines that are allowed to copy files between /a
[22:39:07] my read is that it's like ftp, but secure and fast.
[22:39:14] is that right or crazy?
[22:39:19] ottomata: can you please confirm ?
[22:39:26] ottomata: I am going to run that and I'm unsure
[22:39:29] I don't want to break anything
[22:39:36] average, first if you add the -n flag
[22:39:37] --dry-run
[22:39:38] it will tell you what it is going to do
[22:39:40] before you do it
[22:39:40] ja
[22:39:42] --dry-run
[22:39:43] 2nd
[22:39:46] which is -n :)
[22:39:52] ok, I'm going to try that
[22:39:54] my typical flags are:
[22:40:00] if you want modified_squid_reports_to_contain_link_to_new_mobile_reports_2013_03/ to be in the htdocs/ dir
[22:40:01] rsync -Cavz --progress
[22:40:03] that will work, yes
[22:40:43] heh, -a expands to -rlptgoD
[22:41:10] have you looked at the source to rsync before? it's somewhere between a holy vision and a nuclear wasteland
[22:41:14] same with bzip2
[22:41:32] it's like, beautifully concise and optimized... which makes it essentially unreadable
[22:42:26] oh, wow.
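
Pulling ottomata's advice together, the safe sequence is a dry run first, then the real transfer. The paths are the ones from the chat; --backup/--suffix are generic rsync options added here as a safety net, not something prescribed in the log.

    SRC=/home/spetrea/modified_squid_reports_to_contain_link_to_new_mobile_reports_2013_03/
    DST=stat1001.wikimedia.org::a/srv/stats.wikimedia.org/htdocs/
    # -n / --dry-run: list what would change without touching anything
    rsync -avn "$SRC" "$DST"
    # real run, keeping a .bak copy of anything that gets overwritten
    rsync -av --backup --suffix=.bak "$SRC" "$DST"

As ottomata notes, the :: syntax talks to an rsync daemon (rsyncd) on the remote host rather than tunnelling over ssh, which is why the module's own permissions, not shell access, are what matter.
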
[22:42:29] i did not know about --delay-updates ("put all updated files into place at end")
[22:43:41] drdee: done
[22:43:42] drdee: http://stats.wikimedia.org/wikimedia/squids/SquidReportScripts.htm
[22:43:49] ottomata: thank you
[22:44:01] drdee: http://stats.wikimedia.org/wikimedia/squids/SquidReportCountryData.htm
[22:44:15] drdee: http://stats.wikimedia.org/wikimedia/squids/SquidReportScripts.htm
[22:44:16] god, the day i found --bwlimit i was SO HAPPY
[22:44:26] hokay. brb lunch (oy)
[22:44:31] dschoon: the day I found out about unison I was very happy
[22:44:36] but rsync is good too
[22:44:37] unison?
[22:44:37] :)
[22:44:47] dschoon: you see ? today might be a happy day for you
[22:44:54] linky
[22:44:56] aptitude show unison
[22:44:57] i'll check it out when i get back
[22:45:02] <-- osx
[22:45:09] brew list unison
[22:45:19] it's written in OCaml
[22:45:23] correctness proofs and stuff
[22:45:25] :P
[22:45:34] ottomata: you tried unison ?
[22:46:03] neat.
[22:46:37] naw, looks cool
[22:48:24] ah-hahah ocaml
[22:48:48] dschoon: do you know ocaml or haskell ?
[22:49:01] I know you mention haskell quite a lot. I actually think it's a very interesting language
[22:49:09] for many reasons
[22:49:14] I've watched some video lectures on it in the past
[22:49:31] i was like, "okay, so this is a tremendously complicated change resolution system. this means it's probably a distributed state machine and uses a big pile of wacky logical-time algorithms."
[22:49:48] dschoon: which it probably does
[22:49:50] "this means it has to be written in ... haskell, ocaml, or like fucking mozart-oz"
[22:49:57] ocaml!
[22:49:59] dschoon: http://www.cis.upenn.edu/~bcpierce/papers/index.shtml#File%20Synchronization
[22:49:59] i win the prize!
[22:50:09] okay, food 4 realz
[22:51:09] average: i was pretty excellent with SML back in the day, but no, i'm rusty with all the Hindley–Milner languages
[22:51:35] i stopped trying to find projects for them a long time ago. *shrug*
[22:51:35] brb
[23:00:29] drdee: dschoon : yo, did the device class report finish today ?
[23:00:59] ja, tfinc
[23:01:08] same url as before -- i just haven't gotten to the email yet.
[23:01:23] some things were on fire last night, but none of your things burned :)
[23:01:47] much like its namesake, mobile reports are agile and clever
[23:02:05] tfinc: http://stats.wikimedia.org/kraken-public/webrequest/mobile/device/props/2013/
[23:02:14] thanks
[23:02:27] the data for 4/16 is unreliable right now.
[23:02:34] but that will resolve itself eventually.
[23:02:43] march is solid for sure.
[23:02:46] ^^ tfinc
[23:02:54] i need to (uh) eat a food. i missed lunch.
[23:02:56] brb!
[23:03:04] (will answer any questions whence i return)
[23:53:51] hahaha dschoon, I blame that python mistake I made *entirely* on Coco
[23:53:56] corrupting my brain
[23:53:57] back.
[23:54:00] haha
[23:54:09] yeah, it's weird though
[23:54:13] i totally forget what the rule is
[23:54:18] because if you open the python shell:
[23:54:18] I totally thought you could go "is 'blah'"
[23:54:26] >>> s = "foo"
[23:54:34] >>> s is "foo"
[23:54:35] True
[23:54:39] >>> "foo" is "foo"
[23:54:41] True
[23:54:47] ...which i swear did not always work.
[23:55:05] "is" is identity, not equality
[23:55:30] so "foo" then is a constant that always has the same identity?
[23:55:42] so s is "foo" just happens to work by accident of implementation, sort of?
[23:55:53] >>> [1,2,3] == [1,2,3]
[23:55:53] True
[23:55:53] >>> [1,2,3] is [1,2,3]
[23:55:55] False
[23:56:18] python interns strings as an optimization
[23:56:27] aha! yea, cool
[23:56:28] so 'foo' really *is* 'foo'
[23:56:29] makes sense
[23:56:30] thanks!
[23:56:58] in Coco, s is "foo" is translated to s === "foo"
[23:57:13] good to know, hate that I have to :)
[23:57:35] >>> print id('foo'), id('foo')
[23:57:35] 4525146760 4525146760
[23:57:46] see? same object.
[23:58:34] [travis-ci] develop/faa1c32 (#127 by milimetric): The build is still failing. http://travis-ci.org/wikimedia/limn/builds/6399200
[23:59:08] oh I believed ya :)
[23:59:29] hrm, gotta fix those tests at some point... grumble deliverables grumble
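
A footnote to the interning exchange above: interning only applies to strings the compiler sees (literals that look like identifiers), so identity can fail for equal strings built at runtime. A quick demonstration, assuming a CPython 2 shell like the one in the log; this is implementation behavior, not a language guarantee.

    >>> a = "foo"
    >>> b = "".join(["f", "o", "o"])   # same text, but constructed at runtime
    >>> a == b                         # equality compares contents
    True
    >>> a is b                         # identity compares objects; b was not interned
    False
    >>> a is intern(b)                 # forcing interning yields the canonical "foo"
    True

Which is why == (or === in Coco) is the right default, with "is" reserved for singletons like None.
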