[00:08:24] (CR) Gergő Tisza: Track loading time for MediaViewer and the file page (1 comment) [analytics/multimedia] - https://gerrit.wikimedia.org/r/148021 (owner: Gergő Tisza)
[00:12:06] (PS1) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:12:08] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:12:35] (CR) Yuvipanda: [C: 2] Added full bootstrap release [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149188 (owner: Yuvipanda)
[00:12:41] (Merged) jenkins-bot: Added full bootstrap release [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149188 (owner: Yuvipanda)
[00:12:42] (PS2) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:12:46] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:14:56] (PS3) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:14:58] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:15:09] (PS4) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:15:14] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:15:16] (CR) Yuvipanda: [C: 2] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:15:19] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:17:05] (PS5) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:17:07] (CR) jenkins-bot: [V: -1] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:17:30] (PS6) Yuvipanda: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197
[00:17:48] (CR) Yuvipanda: [C: 2] Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:17:55] (Merged) jenkins-bot: Add title attribute to queries [analytics/quarry/web] - https://gerrit.wikimedia.org/r/149197 (owner: Yuvipanda)
[00:44:25] (PS1) Gergő Tisza: More duration time clarification text [analytics/multimedia/config] - https://gerrit.wikimedia.org/r/149204
[00:44:44] (CR) Gergő Tisza: [C: 2 V: 2] "Self-merge more text changes." [analytics/multimedia/config] - https://gerrit.wikimedia.org/r/149204 (owner: Gergő Tisza)
[00:47:10] milimetric: ping? did you put your hack from the hackathon up somewhere?
[00:47:44] yes YuviPanda, hang on it's in a weird repo in gerrit
[00:47:49] (where code goes to be forgotten)
[00:47:50] milimetric: :D
[00:48:17] YuviPanda|zzz: https://git.wikimedia.org/tree/mediawiki%2Fextensions%2FLimn
[00:48:31] milimetric: that's an interesting name
[00:48:35] are you looking for something specific?
[00:48:46] haha, yeah, I lost a bet with Toby so he gets to name it going forward
[00:48:48] milimetric: no, was just going to check how much work it would be to productize
[00:48:52] that repo is just a Proof of Concept
[00:48:57] k
[00:49:07] it's ready if the contenthandler stuff can handle it
[00:49:45] it could use a json editor for the namespace it defines, and a rename I guess
[00:50:22] milimetric: does it store the data in the content handler?
[00:51:03] milimetric: I was thinking it should have a whitelist of domains it can hit for the data (wikimetrics, to begin with, and perhaps stat1003)
[00:52:02] I relied on data coming from the local server
[00:52:07] hmm, right
[00:52:20] data is hard :/
[00:52:30] milimetric: heh, yeah. I'll probably spend some time tinkering with it after quarry is in a stable state
[00:52:34] and I also am not sure what's possible to do with wiki
[00:52:39] milimetric: you might like: http://quarry.wmflabs.org/query/2
[00:52:45] milimetric: I don't think the raw data should be on wiki at all.
[00:52:54] yeah, agreed definitely
[00:53:08] it really should be in wikidata
[00:53:18] but they aren't quite there yet with their interfaces
[00:53:20] milimetric: no, not there either. WikiData isn't really built for largish things
[00:53:33] it is actually, I talked to Lydia extensively
[00:53:37] they're working towards any and all data
[00:53:46] but they say that so that people are very clear about what it can do *now*
[00:53:49] milimetric: right but it's the size of it that's going to be a problem for a long long time
[00:54:08] yeah, definitely not holding my breath, but that's where I'm looking at in the long term
[00:54:12] right.
[00:54:45] quarry's coming along Yuvi
[00:54:47] I love the name too
[00:54:56] milimetric: so Quarry's web interface is sortof complete - you can create new queries, view other people's, edit yours, and there's a status page :) I'll get to start it having running queries soon
[00:54:59] milimetric: :D
[00:55:21] milimetric: we talk about this + a lot more in -research, you should probably autojoin :D
[00:55:44] i had totally forgotten about that channel - done
[08:58:04] (CR) QChris: [C: 2 V: 2] "Works as expected." [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149123 (https://bugzilla.wikimedia.org/68534) (owner: Milimetric)
[08:58:13] (Merged) jenkins-bot: Improve connection pool configurability [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149123 (https://bugzilla.wikimedia.org/68534) (owner: Milimetric)
[10:40:35] (CR) QChris: [C: -1] Alter names of oozie jobs to make it easier to identify in Oozie UIs. (2 comments) [analytics/refinery] - https://gerrit.wikimedia.org/r/149107 (owner: Ottomata)
[10:40:49] (PS2) QChris: Alter names of oozie jobs to make it easier to identify in Oozie UIs [analytics/refinery] - https://gerrit.wikimedia.org/r/149107 (owner: Ottomata)
[10:40:58] (CR) QChris: [C: 2 V: 2] Alter names of oozie jobs to make it easier to identify in Oozie UIs [analytics/refinery] - https://gerrit.wikimedia.org/r/149107 (owner: Ottomata)
[10:46:03] (CR) QChris: Alter names of oozie jobs to make it easier to identify in Oozie UIs (2 comments) [analytics/refinery] - https://gerrit.wikimedia.org/r/149107 (owner: Ottomata)
[11:10:28] Analytics / General/Unknown: ULSFO post-move verification - https://bugzilla.wikimedia.org/68199#c13 (Mark Bergsma) Yeah, amssq47 had been used as a test server before, not receiving any traffic. Brandon reinstalled and put it back in production around that time, so that would explain it.
[13:00:48] (PS1) QChris: Document choice of constant in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149298
[13:00:50] (PS1) QChris: Drop unneeded imports in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149299
[13:00:52] (PS1) QChris: Be lighter on database when testing connection pools [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149300
[13:00:54] (PS1) QChris: Drop noop assertion in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149301
[13:00:56] (PS1) QChris: Bring connection pool tests into main test suite [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149302
[13:00:58] (PS1) QChris: Put focus on last connection in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149303
[13:01:41] (CR) jenkins-bot: [V: -1] Put focus on last connection in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149303 (owner: QChris)
[13:04:26] (PS2) QChris: Put focus on last connection in connection pool test [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149303
[13:06:05] (CR) QChris: "> I have a few nits, but will try fixing them as separate commits, to" [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149123 (https://bugzilla.wikimedia.org/68534) (owner: Milimetric)
[13:10:14] (PS1) QChris: Switch to CNAMEs in coding conventions [analytics/refinery] - https://gerrit.wikimedia.org/r/149306
[13:23:01] Analytics / Refinery: Make oozie use system's libpath per default - https://bugzilla.wikimedia.org/68568 (christian) NEW p:Unprio s:normal a:None Probably just needs adapting oozie.use.system.libpath in templates/oozie/oozie-site.xml.erb of operations/puppet/cdh repo.
[13:24:29] Analytics / Refinery: Make oozie use system's libpath per default - https://bugzilla.wikimedia.org/68568#c1 (christian) (See comments on line 50 of https://gerrit.wikimedia.org/r/#/c/143486/5/oozie/webrequest/compute_sequence_stats/job.properties )
[13:27:02] Analytics / Refinery: Make oozie write external stats per default - https://bugzilla.wikimedia.org/68569 (christian) NEW p:Unprio s:normal a:None Probably just needs adapting oozie.action.external.stats.write in templates/oozie/oozie-site.xml.erb of operations/puppet/cdh repo. Or otherw...
[13:29:16] Analytics / Refinery: Decide on job.properties vs. {workflow,coordinator,bundle}.properties - https://bugzilla.wikimedia.org/68570 (christian) NEW p:Unprio s:normal a:None See discussion on line 1 of https://gerrit.wikimedia.org/r/#/c/143486/5/oozie/webrequest/compute_sequence_stats/job....
[13:30:05] qchris, i had even thought that that file needed to be on all datanodes yesterday, but then I was like "of course it is there!"
[13:30:07] but it is not!
[13:30:09] DUUUUHHHH
[13:30:19] (CR) QChris: Coordinate computing sequence statistics through Oozie (9 comments) [analytics/refinery] - https://gerrit.wikimedia.org/r/143486 (https://bugzilla.wikimedia.org/67128) (owner: Milimetric)
[13:30:39] It is not?
[13:30:53] It worked for me ...
[13:31:01] Let me check again.
[13:31:33] analytics1011 has it.
[13:32:09] Most other nodes do not let me in.
[13:32:27] But it is puppetized. I had expected the file to be there ... why isn't it?
[13:32:29] Analytics / Wikimetrics: Story: Researcher has prototype for wikimania - https://bugzilla.wikimedia.org/68516#c1 (Dan Andreescu) https://metrics-staging.wmflabs.org/static/public/dash/
[13:33:56] i'm doing it now
[13:34:02] it's puppetized for hive clients
[13:34:07] hive client wasn't included on datanodes
[13:35:51] ah. ok.
[13:36:24] We could also put the file in hdfs.
[13:41:14] (Abandoned) QChris: Coordinate computing sequence statistics through Oozie [analytics/refinery] - https://gerrit.wikimedia.org/r/143486 (https://bugzilla.wikimedia.org/67128) (owner: Milimetric)
[13:42:19] naw i'd prefer to keep it simple, and not worry about that, trying file// soon
[13:42:28] Ok. Thanks.
[14:18:40] (CR) Gilles: Track loading time for MediaViewer and the file page (1 comment) [analytics/multimedia] - https://gerrit.wikimedia.org/r/148021 (owner: Gergő Tisza)
[14:21:26] oh, qchris, another thing, i had problems using even namenode.analytics.eqiad.wmnet for the name_node variable
[14:21:36] because it didn't match what was in core-site.xml
[14:21:41] i had to make them match
[14:21:48] i had to use hdfs://analytics-hadoop/
[14:23:18] (PS1) Gilles: Switch back to domComplete [analytics/multimedia] - https://gerrit.wikimedia.org/r/149314
[14:23:32] (CR) Gilles: [C: 2 V: 2] Switch back to domComplete [analytics/multimedia] - https://gerrit.wikimedia.org/r/149314 (owner: Gilles)
[14:24:18] ottomata: Really?
[14:24:19] (PS1) Ottomata: Use hdfs://analytics-hadoop for name_node property [analytics/refinery] - https://gerrit.wikimedia.org/r/149315
[14:24:23] * qchris is puzzled
[14:24:55] yes, uh, lemme getcha the error
[14:24:59] (CR) QChris: [C: 2 V: 2] Use hdfs://analytics-hadoop for name_node property [analytics/refinery] - https://gerrit.wikimedia.org/r/149315 (owner: Ottomata)
[14:25:03] No it's ok.
[14:25:09] I got a similar error at some point.
[14:25:20] ok
[14:25:20] :)
[14:25:22] But it was because I tried to understand what oozie does.
[14:25:34] And then switching back to the main namenode worked for me.
[14:25:50] Well hdfs://analytics-hadoop it is :-D
[14:28:15] ok running, fingers crossed!!!!
[14:28:34] add partition success!, sequence going...
[14:29:48] qchris: wanna check out the hue oozie ui?
[14:29:57] Sure
[14:30:11] Which port, which host?
[14:30:35] Or maybe ...
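The name_node mismatch above boils down to the Oozie job's name_node property having to match fs.defaultFS in core-site.xml exactly. A minimal sketch of the relevant job.properties lines, assuming a typical Oozie job.properties layout; only the hdfs://analytics-hadoop value and the libpath setting come from this log, the other keys are illustrative:

```properties
# name_node must match fs.defaultFS in core-site.xml exactly;
# the CNAME namenode.analytics.eqiad.wmnet did not work here.
name_node=hdfs://analytics-hadoop

# see https://bugzilla.wikimedia.org/68568 - ideally this becomes
# the server-side default instead of being repeated per job
oozie.use.system.libpath=true
```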
[14:31:01] https://analytics1027.eqiad.wmnet:8888
[14:31:12] you can log in with your labs ldap creds
[14:31:20] Trying ...
[14:31:51] Mhmmm ...
[14:32:05] woot, generate for mobile finished!
[14:32:09] so, i use sshuttle...
[14:32:09] I cannot tunnel ssh to it "Permission denied (publickey)"
[14:32:36] or, try this
[14:32:39] ssh -N bast1001.wikimedia.org -L 8888:analytics1027.eqiad.wmnet:8888
[14:32:39] ?
[14:32:43] OH
[14:32:51] * qchris is no ops :-)
[14:33:00] yeah maybe you don't have ssh access there
[14:33:01] hmm
[14:33:03] use sshuttle!
[14:33:05] you don't need it!
[14:33:15] https://github.com/apenwarr/sshuttle
[14:33:22] then do
[14:33:23] sshuttle works without ssh access?
[14:33:36] you ssh to bastion
[14:33:41] and it forwards packets around for you
[14:33:57] alias wmvpn='sshuttle --dns -vvr otto@bast1001.wikimedia.org 10.0.0.0/8'
[14:34:30] I guess no hue for me for now.
[14:34:43] I won't do sshuttle on production bastions.
[14:35:02] Sounds too scary.
[14:35:04] But!
[14:35:10] Maybe you can screenshare me along?
[14:35:17] ah we can get you access for sure
[14:35:17] I am in trap chat.
[14:35:20] i was thinking you should have it anyway...
[14:35:26] but i'll show ya
[14:35:44] here?
[14:35:44] https://appear.in/wmf-analytics-batcave
[14:36:29] I am alone in there ...
[14:36:32] Trying to reconnect
[14:37:52] qchris: to hangout?
[14:39:02] Trying to ...
[15:55:51] ok I need to say this publicly
[15:55:54] because I bash mysql so much
[15:56:18] SELECT seq FROM seq_1_to_3000 is awesome
[15:56:21] like, REALLY awesome
[15:56:35] I love springle extra much for telling us about that
[16:30:42] hey dan… I'm playing with the prototype some more
[16:31:18] milimetric: I'm looking at the NewlyRegistered users in the prototype dash
[16:32:42] it looks like every project is spiking up on the number of newly registered users today
[16:33:23] 50% more than the previous max. seems fishy
[16:58:29] kevinator: all the data is looking fishy to me
[16:59:01] i'm running queries now, trying to identify what's going on with that as well as create the timeseries backfilling
[16:59:13] ok cool
[17:00:02] I was trying to imagine what in the world would cause Germans to suddenly take to wikipedia and register new accounts
[17:00:12] haha
[17:02:04] yeah, i don't think that's normal kevinator, i'll let you know as soon as I find out anything
[17:02:42] thanks
[17:47:18] (PS1) Ottomata: Make check_sequence_statistics_workflow's name more descriptive [analytics/refinery] - https://gerrit.wikimedia.org/r/149347
[17:58:09] (CR) QChris: [C: 2 V: 2] Make check_sequence_statistics_workflow's name more descriptive [analytics/refinery] - https://gerrit.wikimedia.org/r/149347 (owner: Ottomata)
[17:58:25] qchris, I realized I forgot to submit this job to the standard queue!
[17:58:28] it's running in adhoc!
[17:58:28] :p
[17:58:34] which I guess doesn't really matter, since there are no other jobs running
[17:58:39] Right.
[17:58:46] But it also shows we picked the right default.
[17:58:48] but, i'll probably kill this one and start it again...maybe after this one catches up all the way
[17:58:50] haha
[17:58:54] it does?
[17:58:56] Yes.
[17:59:09] I'd certainly forget to
[17:59:21] set the right queue once in a while during testing.
[17:59:28] And then it's on a more limited queue.
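The SELECT seq FROM seq_1_to_3000 praised above ([15:56:18]) uses MariaDB's SEQUENCE storage engine, where a table named seq_1_to_N yields the integers 1 through N on the fly, handy for generating date ranges or filling gaps in time series. As a rough stand-in for readers without MariaDB, the same sequence can be produced with a recursive CTE; this SQLite sketch is an illustration, not the mechanism MariaDB uses:

```python
import sqlite3

# MariaDB's SEQUENCE engine exposes seq_1_to_3000 as a virtual table of
# the integers 1..3000. SQLite has no such engine, so a recursive CTE
# named the same way serves as a rough equivalent for demonstration.
conn = sqlite3.connect(":memory:")
rows = conn.execute(
    """
    WITH RECURSIVE seq_1_to_3000(seq) AS (
        SELECT 1
        UNION ALL
        SELECT seq + 1 FROM seq_1_to_3000 WHERE seq < 3000
    )
    SELECT seq FROM seq_1_to_3000
    """
).fetchall()
print(len(rows), rows[0][0], rows[-1][0])  # 3000 rows, from 1 to 3000
```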
[17:59:39] Being defensive rules ;-)
[18:47:59] (PS2) Ottomata: Switch to CNAMEs in coding conventions [analytics/refinery] - https://gerrit.wikimedia.org/r/149306 (owner: QChris)
[18:48:05] (CR) Ottomata: [C: 2 V: 2] Switch to CNAMEs in coding conventions [analytics/refinery] - https://gerrit.wikimedia.org/r/149306 (owner: QChris)
[19:13:07] (PS1) QChris: Update the database connection settings for alembic [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149383
[19:13:16] (PS1) QChris: Allow to run nosetests without coverage report [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149384
[19:14:33] (CR) QChris: "Not sure how others feel about it, but forcing the coverage" (1 comment) [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149384 (owner: QChris)
[19:38:40] (PS2) Hashar: Update the database connection settings for alembic [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149383 (owner: QChris)
[19:38:45] (PS2) Hashar: Allow to run nosetests without coverage report [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/149384 (owner: QChris)
[19:39:03] qchris: trivial changes in commit message to retrigger tests
[19:39:11] qchris: jenkins/zuul went dead somehow again :(
[19:39:52] Ah. Ok.
[19:40:16] Thanks.
[19:57:58] (CR) QChris: [C: -1] "Thanks!" (6 comments) [analytics/wikistats] - https://gerrit.wikimedia.org/r/147876 (owner: Chmarkine)
[20:59:35] hey milimetric: what's the secret with knockout/src/components directory when building w/bower?
[21:00:05] it doesn't appear to install on its own.
[21:00:23] is there some intermediate build step I'm not aware of in the case of knockout?
[21:36:32] halfak: would you just run a query then, or do you have a way to "nice" these longer running jobs?
[21:37:43] It's a script that runs the query for me. No nice necessary since all the computation happens on the SQL server.
[21:39:49] The script just makes sure that the query runs over all the wikis.
[21:40:32] Wanna help me review the new query? http://pastebin.com/jmGErwkY
[21:40:35] milimetric, ^
[21:40:54] k
[21:41:02] Found an error already
[21:41:21] user name and reg are not necessary, but I'm assuming you need those
[21:41:48] left(timestamp, 8) right?
[21:41:53] http://pastebin.com/GCDj9R5P
[21:41:57] yup
[21:42:22] probably "as editor_days"
[21:42:47] SUM(revisions * archived) AS archived, ?
[21:42:51] oh sure. Stupid aliases that aren't used.
[21:42:56] not SUM(revisions + archived) AS total, or something?
[21:42:59] Yeah. archived = 0 or 1
[21:43:04] oh!
[21:43:06] sry
[21:43:28] It's a confusing strategy
[21:43:32] interesting way of doing it :)
[21:43:39] I could also just do sum(if(...))
[21:44:04] yeah, i think this might be faster though, and it's cool
[21:44:31] don't you wanna group by wiki too?
[21:44:41] in the inner queries
[21:44:57] Oh yeah.. That's not necessary, but it wouldn't hurt.
[21:45:25] oh, doh, nvm heh
[21:45:33] http://pastebin.com/eZPqzGKF
[21:45:45] Still a confusing strategy. That last one is much more straightforward.
[21:46:02] does this get rid of any records? INNER JOIN USER USING (user_id)
[21:46:22] It shouldn't. I dropped it anyway.
[21:47:01] cool, looks good to me
[21:47:13] i'll run it a couple times manually to sanity check
[21:47:52] OK. I'll kick it off. It's no problem if we want to restart.
[21:50:13] What was the status on NRU? Should I run that too?
[21:52:33] oh, NRU is super fast, I think I can run that without any special table
[21:52:44] halfak: did you mean for revisions to include archived revisions?
[21:53:02] SUM(revisions) AS revisions
[21:53:02] instead of:
[21:53:02] SUM(IF(archived, 0, revisions)) AS revisions
[21:53:08] Yup.
[21:53:19] revisions are revisions whether they are archived or not.
[21:53:29] k, works for me, you can always subtract
[21:53:39] +1
[21:53:47] awesome aaron, thanks very much!
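The SUM(revisions * archived) trick being reviewed above works because archived is a 0/1 flag: multiplying by it zeroes out the non-archived rows, giving the same result as the sum(if(...)) alternative mentioned at [21:43:39] without a conditional. A small self-contained sketch in Python with SQLite; the table name and sample values are made up for illustration:

```python
import sqlite3

# 'archived' is a 0/1 flag, so SUM(revisions * archived) counts only
# archived revisions, while SUM(revisions) counts all of them -- the
# archived-only figure can always be subtracted out later, as noted
# in the discussion.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE day_counts (revisions INTEGER, archived INTEGER)")
conn.executemany(
    "INSERT INTO day_counts VALUES (?, ?)",
    [(5, 0), (3, 1), (7, 0), (2, 1)],  # made-up sample rows
)
total, archived_only = conn.execute(
    "SELECT SUM(revisions), SUM(revisions * archived) FROM day_counts"
).fetchone()
print(total, archived_only)  # 17 5
```

The flag-multiplication form and SUM(IF(archived, revisions, 0)) are equivalent; the review above suggests the former may be faster, though that is a guess about the optimizer rather than a measured claim.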
[21:54:58] with this and how fast NRU runs I think we'll get this filled up and looking good in time
[21:55:10] pizzzacat: welcome back - glad to talk knockout if you're having trouble still
[21:57:47] milimetric, woot on NRU. Back to the other stuff!
[22:02:15] thanks milimetric
[22:02:45] I wonder if I have the wrong version of knockout since I just don't have the src/components directory or any reference to it
[22:03:03] I think you're using the 3.2 beta too
[22:03:10] those might be related :)
[22:08:19] oh yeah pizzzacat, knockout 3.2 is required
[22:08:34] this is from my bower.json: "knockout": "~3.2.0-beta",
[22:09:06] 3.1 didn't have components and 3.2 is still beta so it won't be the default
[22:09:12] yeah
[22:09:31] I guess I assumed when I built it it'd be the latest n' greatest
[22:09:42] *downloaded it
[22:12:01] if you check it out from their github they have some separate branches for stuff that's not yet "released"
[23:12:49] pizzzacat: in case you're curious, I got to the bottom of the performance problems and glitches
[23:13:05] a lot of them are way more confusing than they will be in the final product
[23:13:19] because I'm just hardcoding a lot of crap, but it might be of interest anyway:
[23:13:19] https://github.com/milimetric/dashiki/commit/daa54074c21e08e72e743d1abf622754941f225a
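For reference, the bower.json pin quoted at [22:08:34] lives in the dependencies map, which is why a plain `bower install knockout` pulls the 3.1 release rather than the 3.2 beta with components. A minimal sketch; only the knockout line comes from the log, the surrounding fields are placeholders:

```json
{
  "name": "dashiki",
  "dependencies": {
    "knockout": "~3.2.0-beta"
  }
}
```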