[01:22:23] Analytics, Editing-Analysis: Move contents of ee-dashboards to edit-analysis.wmflabs.org - https://phabricator.wikimedia.org/T135174#2473771 (Neil_P._Quinn_WMF) a:Neil_P._Quinn_WMF>None
[08:39:55] elukey: o/
[08:40:03] joal: o/
[08:40:13] joal: elukey o/
[08:40:58] addshore: o/ !
[08:42:54] elukey, is there any chance you could take a look at https://gerrit.wikimedia.org/r/#/c/298931/? :D
[08:47:34] addshore: finally, what have you agreed with nuria_ on the old data?
[08:48:03] well, the old data is all there :) the job was running perfectly as far as I could tell yesterday, so I filled the gap
[08:48:04] https://grafana.wikimedia.org/dashboard/db/article-placeholder
[08:48:35] So now it's all just about the 2 new patches making the job use webrequest and the new header fields!
[08:48:36] addshore: great :)
[08:49:11] addshore: I did yesterday and this morning, we need Nuria's approval as per the Ops meeting follow-up
[08:49:24] after that we'll proceed
[08:49:40] elukey: are you sure we are talking about the same patch? :O
[08:50:16] ah no, you are right, sorry
[08:50:21] checking
[08:59:07] pcc looks weird, but only because of a hiera secrets problem: https://puppet-compiler.wmflabs.org/3379/stat1002.eqiad.wmnet/change.stat1002.eqiad.wmnet.err
[08:59:11] that was already present
[08:59:19] so I am not super familiar with the change and the context
[08:59:40] it looks good to me, but if it is not super urgent I'd say to wait for ottomata
[08:59:58] joal: two questions :)
[09:00:06] 1) do we deploy the refinery in labs/beta?
[09:00:10] elukey: more if you want
[09:00:16] 2) do I need to wipe the cluster ? :)
[09:00:47] 1) We don't - we don't have a stable test cluster in beta, so it is not easy to test the newly deployed code
[09:01:54] 2) No - I changed the compression yesterday, and given that almost all data needs to be reloaded, previously lz4-compressed data will end up recompacted and therefore recompressed (soooooo easy)
[09:02:08] elukey: Are those the answers you expected --^ ?
[09:03:00] yep thanks :)
[09:03:20] how did we end up with a different compression alg?
[09:03:21] np :)
[09:03:53] elukey: By default cassandra uses lz4 (faster read/write)
[09:04:41] (I asked 1) because I am preparing scap configs :)
[09:04:43] elukey: When urandom pointed out the compression algorithm difference yesterday, I recalled we had changed it on the old cluster manually on gwicke's advice (to use less space even at a perf hit)
[09:04:48] elukey: kk :)
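Some context for the compression exchange above: Cassandra tables default to LZ4 compression, and switching to Deflate trades some read/write speed for smaller SSTables; already-written data only picks up the new algorithm when it is recompacted, which is why the reload joal mentions ends up recompressing everything. A minimal sketch of such a change via the Python cassandra-driver — the host, keyspace, and table names here are made up for illustration, and the exact option key depends on the Cassandra version:

```python
from cassandra.cluster import Cluster

# Hypothetical contact point, keyspace, and table names.
session = Cluster(['aqs-test.example.org']).connect()

# Switch the table from the default LZ4 to Deflate to save disk space
# at some read/write performance cost. On Cassandra 2.x the option key
# is 'sstable_compression'; on 3.x it is 'class'.
session.execute("""
    ALTER TABLE local_group_default.data
    WITH compression = {'sstable_compression': 'DeflateCompressor'}
""")

# Existing SSTables keep their old compression until they are
# recompacted, so a full data reload (as discussed above) is what
# actually rewrites everything with the new algorithm.
```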
[09:04:54] elukey: I have seen you work on that
[09:05:16] elukey: thank you really a lot for doing this, it'll make our deploys easier (I think)
[09:07:06] joal: it is fun, but I am working with releng to improve the docs; I am not ottomata, and trying to figure out config files and where to put things is a bit difficult sometimes :D
[09:07:23] elukey: I can imagine !
[09:07:50] elukey: let me know if you think I can help, but I'm no ottomata either
[09:08:35] (PS1) Elukey: Initial basic configuration for the Refinery Scap repository. [analytics/refinery/scap] - https://gerrit.wikimedia.org/r/299714 (https://phabricator.wikimedia.org/T129151)
[09:10:43] (PS6) Joal: [WIP] Process Mediawiki page history [analytics/refinery/source] - https://gerrit.wikimedia.org/r/295693 (https://phabricator.wikimedia.org/T134790) (owner: Milimetric)
[09:11:28] (CR) jenkins-bot: [V: -1] [WIP] Process Mediawiki page history [analytics/refinery/source] - https://gerrit.wikimedia.org/r/295693 (https://phabricator.wikimedia.org/T134790) (owner: Milimetric)
[09:11:36] (PS5) Joal: [WIP] Process MediaWiki User history [analytics/refinery/source] - https://gerrit.wikimedia.org/r/297268 (https://phabricator.wikimedia.org/T138861) (owner: Mforns)
[09:12:33] (CR) jenkins-bot: [V: -1] [WIP] Process MediaWiki User history [analytics/refinery/source] - https://gerrit.wikimedia.org/r/297268 (https://phabricator.wikimedia.org/T138861) (owner: Mforns)
[09:12:57] (PS2) Elukey: Initial basic configuration for the Refinery Scap repository. [analytics/refinery/scap] - https://gerrit.wikimedia.org/r/299714 (https://phabricator.wikimedia.org/T129151)
[09:20:23] joal: there is one thing that we can discuss, namely how to test the refinery before prod
[09:20:41] elukey: we surely can, it would be a fantastic win
[09:20:57] elukey: That was the inner reason for ottomata trying to set up a beta hadoop cluster
[09:21:19] yeah, but I meant for the scap switch :)
[09:21:35] could be the same, or maybe we could use a different and simpler approach
[09:21:54] the main problem is that some permissions/file attributes will change
[09:22:11] for example, on analytics1027 the refinery is owned by root:root
[09:22:24] and it will become analytics-deploy:analytics-deploy
[09:22:42] not a big deal because ugo permissions will remain the same
[09:22:53] hm ... I don't see why it'd cause problems
[09:23:01] nono, I am not saying that
[09:23:07] but you know that I am paranoid
[09:23:08] :)
[09:23:12] elukey: And I don't see how we could test refinery without a hadoop cluster :S
[09:23:27] PaRaNoïAAAAAA !
[09:30:07] * elukey runs away screaming
[09:30:22] :D
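To double-check elukey's point above — that the scap3 switch changes the owner and group of the deployed tree while the ugo permission bits stay the same — one could simply compare these values before and after the migration. A minimal sketch; the deployment path is an assumption for illustration:

```python
import grp
import os
import pwd
import stat

# Assumed path of the refinery checkout on analytics1027.
path = '/srv/deployment/analytics/refinery'

st = os.stat(path)
owner = pwd.getpwuid(st.st_uid).pw_name   # expected: root -> analytics-deploy
group = grp.getgrgid(st.st_gid).gr_name   # expected: root -> analytics-deploy
mode = stat.filemode(st.st_mode)          # the ugo bits should stay identical

print(owner, group, mode)
```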
[09:40:56] (PS3) Elukey: Initial basic configuration for the Refinery Scap repository. [analytics/refinery/scap] - https://gerrit.wikimedia.org/r/299714 (https://phabricator.wikimedia.org/T129151)
[09:49:04] Analytics, Analytics-Cluster, Analytics-Kanban, Deployment-Systems, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2474657 (elukey) Summary: 1) External scap repository config: https://gerrit.wikimedia.org/r/299714 2) puppet changes: https://gerrit...
[10:40:13] (PS1) Joal: [WIP] Modify User History scripts [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299728
[10:40:15] (PS1) Joal: [WIP] Add Generic FixPoint code. [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299729
[10:41:05] (CR) jenkins-bot: [V: -1] [WIP] Modify User History scripts [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299728 (owner: Joal)
[10:54:15] addshore: my bad, I needed to add fake data to the fake private repo to allow a good puppet compiler run https://puppet-compiler.wmflabs.org/3384/stat1002.eqiad.wmnet/
[10:54:25] if this is what you expected, I think we can merge
[10:54:27] it looks fine
[11:03:36] (Abandoned) Joal: [WIP] Modify User History scripts [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299728 (owner: Joal)
[11:10:00] (PS2) Joal: [WIP] Modify MediawikiUserHistory [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299729 (https://phabricator.wikimedia.org/T139745)
[11:11:35] Hey milimetric, let me know when you're ready to be bothered :)
[11:12:18] I was born ready to be bothered
[11:12:29] huhu :)
[11:12:38] Makes me think of a song ;)
[11:12:40] (CR) jenkins-bot: [V: -1] [WIP] Modify MediawikiUserHistory [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299729 (https://phabricator.wikimedia.org/T139745) (owner: Joal)
[11:13:06] what's up?
[11:13:35] I have pushed updated code for the history work, and wanted to let you know
[11:13:59] also, wanted to ask if I can help, and/or if I can start the generification with the trait :)
[11:14:34] milimetric: --^
[11:14:39] Sadly, right now the status is that the whole algorithm won't work at all.
[11:14:43] :(
[11:14:57] Mooow :(
[11:15:00] how come?
[11:15:24] I can explain, just give me 20 minutes to get ready
[11:15:44] gotta shower and feed the cat and stuff :)
[11:15:45] milimetric: Take your time, we'll discuss later :)
[11:15:48] sure sure !
[11:17:18] (PS3) Joal: [WIP] Modify MediawikiUserHistory [analytics/refinery/source] - https://gerrit.wikimedia.org/r/299729 (https://phabricator.wikimedia.org/T139745)
[11:59:06] joal: https://hangouts.google.com/hangouts/_/wikimedia.org/ehr-batcave
[11:59:44] milimetric: OMW
[12:02:51] hi milimetric :]
[12:03:03] hey mforns, we're in the cave / board
[12:32:49] Analytics, Analytics-Cluster, Analytics-Kanban, Deployment-Systems, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2474875 (elukey) Deleted the keys from the keyholder private repo, I need to password-protect them properly as stated in https://wikitech....
[13:17:35] Analytics-Tech-community-metrics, Differential, Developer-Relations (Oct-Dec-2016): Make MetricsGrimoire/korma support gathering Code Review statistics from Phabricator's Differential - https://phabricator.wikimedia.org/T118753#2475001 (Aklapper) For Jul 2016 to Jun 2017, we agreed on migrating to th...
[14:21:31] milimetric, joal, trying to rejoin...
[14:21:51] I ejected you, if that helps :)
[14:52:38] milimetric: I finally think mforns is right in saying we can't use fixedPoint
[14:52:54] milimetric: I have an example proving him right :)
[14:53:13] milimetric, mforns : I'll keep it for later
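For readers unfamiliar with the fixedPoint pattern being ruled out above: the "Generic FixPoint" patch (Scala, in analytics/refinery/source) refers to the general technique of applying a transformation repeatedly until the result stops changing. A Python sketch of that general idea — this is an illustration of the pattern, not the patch's actual code, and joal's counter-example is not shown in the log:

```python
def fixed_point(f, x0, max_iterations=1000):
    """Apply f repeatedly until the value stops changing."""
    x = x0
    for _ in range(max_iterations):
        nxt = f(x)
        if nxt == x:
            return x
        x = nxt
    raise RuntimeError("no fixed point reached within max_iterations")

# Toy usage: repeatedly halving an integer converges to the fixed point 0.
assert fixed_point(lambda n: n // 2, 1000) == 0
```

The catch for the history-reconstruction work, as the exchange above suggests, is that the iterated transformation has to converge to a stable value for this pattern to apply at all.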
[15:11:18] Does anyone here have any idea how to debug mediawiki & statsv stuff?
[15:28:27] joal: do you want to do the ops sync?
[15:28:30] or should we skip it?
[15:29:07] elukey: I'm in a 1-1 with nuria_
[15:29:27] okok :)
[15:34:25] elukey: Ready !
[15:36:59] elukey: Arf, lost you :)
[15:37:39] joal: ah, I thought you wouldn't have been available
[15:37:45] huhu :)
[15:38:12] elukey: I am now, and actually I'd need some brain-bounce :)
[15:38:24] elukey: If you've not started something else, of course
[15:38:51] sure! I have been fighting with passwords and I lost
[15:38:55] so I could use a break
[15:38:58] :D
[15:40:55] milimetric, I'm back if you're here
[15:41:13] mforns / joal: let's hear that example :)
[15:41:30] cave?
[15:41:35] i'm there
[15:41:46] milimetric, mforns : in a meeting with elukey, will be there in a while
[15:57:00] nuria_: after discussing with elukey, he also wants us to wait another month-load before changing, so we'll go that way
[15:57:38] joal: and another month-load will be like a week of calendar time?
[15:57:57] correct nuria_, we should have results by the end of next week
[15:58:17] joal: ok, should we set up a meeting to discuss?
[15:58:29] joal: if not needed, np
[15:58:43] nuria_: I think elukey prefers to have a look at the results before even discussing :)
[15:58:48] joal: k
[15:59:15] nuria_: and it makes a lot of sense (I'm a bit stressed about that, so willing to go too fast I guess)
[15:59:38] joal: sounds good, let's touch base next week
[16:00:16] yup
[16:09:58] a-team: do we want to do the retro or skip it? :)
[16:26:08] hey ori. Let me know if I should schedule another meeting for continuing the discussion on recommendation-api. It makes sense to do it this week, if you have some time.
[16:40:48] elukey: no retro
[16:40:55] elukey: as we had the quarterly retro last week
[16:41:03] a-team: no retro
[16:41:16] ok, noted
[16:41:29] a-team: milimetric is the owner of the meeting I believe, so I cannot cancel it
[16:42:02] super, thanks :)
[16:43:29] leila: wfm
[16:48:16] will schedule then, ori. thank you!
[17:04:58] * elukey afk!
[17:04:59] byeeee
[17:14:00] a-team, hashar: My patch to jenkins job builder was merged!
[17:14:17] YAY madhu !
[17:15:02] :D
[17:17:24] nuria_: I managed to run the old oozie job and get the data into graphite that I needed. Now it is just a case of reviewing and moving forward with the new job!
[17:31:58] * leila claps for madhuvishy
[17:32:19] leila: :D in office?
[17:32:32] noo. :(
[17:32:45] where are you, madhuvishy? we can work remotely together from a Hangout session. ;)
[17:32:46] aah okay :)
[17:33:19] ha ha I'm in the office - I moved my desk to the other side, so I was wondering if I had missed seeing you lol
[17:33:57] :D
[17:34:03] okay, enjoy the new spot then.
[17:34:21] I may end up working from home all this week, madhuvishy.
[17:35:00] leila: ooh okay :) have fun!
[17:35:26] addshore: ok, let's abandon the older patch then, correct?
[17:35:39] addshore: you did document your additions in x-analytics, right?
[17:42:47] milimetric: having scanned https://en.wikipedia.org/wiki/Strongly_connected_component, it seems the connected subgraph can be found in linear time :)
[17:50:35] Yeah, for each page though
[17:51:12] That makes sense, but for all pages it's n*m^2, no?
[17:51:49] Anyway, this thing has been on since 2008 and it hasn't died yet :)
[17:54:39] nuria_: yes, the additions are in x-analytics!
[17:55:19] (Abandoned) Addshore: Ignore namespace when matching Special:AboutTopic [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298723 (https://phabricator.wikimedia.org/T138500) (owner: Addshore)
[17:55:22] and {{done}}
[17:56:43] I have just had a request to also track spiders in https://gerrit.wikimedia.org/r/#/c/298724/ however, so I will go and make that change now!
[17:59:16] addshore: I am not sure I understand, where is your change about spiders? I made some comments on https://gerrit.wikimedia.org/r/#/c/298724/ about the commit message and docs
[18:00:21] joal: I did that once using DFS and reversing the graph, I think that is linear: https://github.com/nuria/study/blob/master/algorithms1/week4/scc.py
[18:01:27] joal: will check in detail later
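On the graph discussion above: the DFS-plus-reversed-graph approach nuria_ links to is Kosaraju's algorithm, which finds every strongly connected component of a directed graph in linear time, O(V + E), in one pass over the whole graph rather than per page. A self-contained iterative Python sketch of the idea (see her linked scc.py for a fuller version):

```python
def strongly_connected_components(graph):
    """Kosaraju: DFS for finish order, then DFS on the reversed graph.

    graph: dict mapping each node to a list of successor nodes.
    Runs in O(V + E), i.e. linear in the size of the whole graph.
    """
    # Pass 1: iterative DFS on the original graph, recording finish order.
    visited, order = set(), []
    for start in graph:
        if start in visited:
            continue
        visited.add(start)
        stack = [(start, iter(graph[start]))]
        while stack:
            node, successors = stack[-1]
            advanced = False
            for nxt in successors:
                if nxt not in visited:
                    visited.add(nxt)
                    stack.append((nxt, iter(graph.get(nxt, []))))
                    advanced = True
                    break
            if not advanced:
                order.append(node)
                stack.pop()

    # Build the reversed graph.
    reverse = {node: [] for node in graph}
    for node, successors in graph.items():
        for nxt in successors:
            reverse.setdefault(nxt, []).append(node)

    # Pass 2: DFS on the reversed graph in reverse finish order;
    # each tree discovered is one strongly connected component.
    assigned, components = set(), []
    for start in reversed(order):
        if start in assigned:
            continue
        assigned.add(start)
        component, stack = [], [start]
        while stack:
            node = stack.pop()
            component.append(node)
            for nxt in reverse.get(node, []):
                if nxt not in assigned:
                    assigned.add(nxt)
                    stack.append(nxt)
        components.append(component)
    return components

# Toy usage: a, b, c form one cycle; d sits in its own component.
print(strongly_connected_components(
    {'a': ['b'], 'b': ['c'], 'c': ['a', 'd'], 'd': []}))
```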
[18:15:02] nuria_: It was just requested, so I am going to add it to the patch! I'll check the comments now too!
[18:16:03] addshore: what do you consider spiders? "self-reported bots", as in they tell you in the UA?
[18:16:17] addshore: cause if so, you do not need to add that to x-analytics
[18:17:57] Analytics, MediaWiki-extensions-CentralNotice, Operations, Traffic: Generate a list of junk CN cookies being sent by clients - https://phabricator.wikimedia.org/T132374#2476507 (AndyRussG) @BBlack What are the advantages to using a regex rather than an explicit list? [[ https://www.mediawiki.org/...
[18:22:42] nuria_: spiders as defined in the pageview definition!
[18:22:52] addshore: we already have that info
[18:22:53] but currently the job / query only looks at users
[18:23:00] yes, but the job needs to send it to graphite!
[18:23:23] addshore: if you are looking at traffic marked "user", that is not spider traffic
[18:23:34] addshore: your query was filtering on that, was it not?
[18:23:53] addshore: you want to also report non-user traffic?
[18:24:16] Yes, and I have had a request to also track spider traffic, so I'm going to change the patch ( https://gerrit.wikimedia.org/r/#/c/298724/ )
[18:25:16] addshore: ok, one thing to keep in mind is that "absolute counts" make not-so-good metrics
[18:25:54] addshore: if you are just counting requests to some special pages, it would be useful to normalize that data so you do not catch spikes that are beyond your control
[18:26:12] addshore: a possible normalization is using the total number of pageviews
[18:26:31] addshore: so you report a rate: "pageviews to page X" / "total pageviews on project"
[18:26:56] addshore: otherwise you will misidentify spikes that are due to spurious traffic increases overall
[18:28:45] Well, I have been asked for the absolute pageview counts! Hence the approach!
[18:29:15] addshore: numbers are meaningless if you cannot attach meaning to them, right?
[18:29:32] addshore: imagine you see a spike of 50% in your user traffic from one week to the next
[18:29:37] addshore: is it meaningful?
[18:29:53] addshore: how can you know, if you have no point of reference?
[18:30:58] Would the better approach not be to then add "total pageviews on project" to graphite, so that the viewer can choose to have the normalized version or the absolute version?
[18:31:27] One of the reasons this data is needed is for 'operational purposes', where I imagine exact counts have value.
[18:33:08] Also, interaction on the page is tracked using absolute counts, which is one of the things the pageviews will be compared with.
[18:34:14] addshore: how are "absolute counts" helpful for operational purposes?
[18:40:52] for example, the data displayed on the pages is drawn directly from wikidata & the wikidata databases on every load, so knowing what impact these pages have on the load wikidata causes is valuable.
[18:41:13] especially jumps in page views as it is deployed to further wikis.
[18:41:32] A % of total requests to a site would lose this value
[18:42:19] addshore: no, it would have equal value plus more information
[18:42:36] addshore: think about it, the scale is just different
[18:42:46] addshore: but it is just a scale
[18:43:14] Yes, but there is no 'easy' way to get back to the absolute number.
[18:43:59] But as said, including the absolute pageview number per day in graphite and then doing the calculation when presenting the data would perhaps be a way to solve that.
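To make the compromise above concrete: the normalized rate nuria_ suggests is just the page's views divided by the project's total pageviews for the same period, and as addshore proposes, both absolute series can be stored so the division happens only at display time. A toy Python illustration with made-up numbers:

```python
# Hypothetical daily values, as the two series might land in Graphite:
# absolute views of the special page, and total pageviews per project.
placeholder_views = {'2016-07-18': 1200, '2016-07-19': 2600}
total_pageviews = {'2016-07-18': 900000, '2016-07-19': 1950000}

# Normalized rate, computed at presentation time. Total traffic roughly
# doubled between the two days, so the rate stays nearly flat even
# though the absolute count spiked -- exactly the distinction being
# debated above.
for day, views in sorted(placeholder_views.items()):
    rate = views / total_pageviews[day]
    print(day, views, f"{rate:.5f}")
```

Graphite itself can do the same division at query time (for example with its divideSeries function), so storing both absolute series keeps the raw operational signal without giving up the normalized view.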
[18:44:50] (PS6) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[18:46:37] addshore: You can do as you think is best, but we see loads of numbers like "# of clicks on x" completely devoid of meaning, as clicks would increase if all of a sudden, say, there is a spike in traffic, which will go undetected in data presented in that fashion
[18:46:42] For example, if on one of the wikis the extension is deployed on, the number of page views across the whole wiki doubled, and the page views to Special:ArticlePlaceholder also doubled, that would not be visible when taken as a % of total views
[18:46:46] Analytics, MediaWiki-extensions-CentralNotice, Operations, Traffic: Generate a list of junk CN cookies being sent by clients - https://phabricator.wikimedia.org/T132374#2476603 (BBlack) It would just be simpler, but we can do a list. The data we have in Cookies_to_remove is just from Ori's 20-mi...
[18:48:26] (CR) jenkins-bot: [V: -1] Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500) (owner: Addshore)
[18:49:01] That is what will be done with, for example, the clicks on buttons on the page: they will be a % of views that result in clicks. But the raw / absolute data is still the data to be stored in graphite, and then the calculations can be done when viewing the data.
[18:49:35] addshore: that seems like an ok compromise
[18:54:03] (CR) Thcipriani: [C: 1] Initial basic configuration for the Refinery Scap repository. [analytics/refinery/scap] - https://gerrit.wikimedia.org/r/299714 (https://phabricator.wikimedia.org/T129151) (owner: Elukey)
[18:56:15] Analytics, Analytics-Cluster, Analytics-Kanban, Deployment-Systems, and 3 others: Deploy analytics-refinery with scap3 - https://phabricator.wikimedia.org/T129151#2476622 (thcipriani) >>! In T129151#2474657, @elukey wrote: > Summary: > > 1) External scap repository config: https://gerrit.wikimed...
[18:58:51] also nuria_, I have a request for https://phabricator.wikimedia.org/T125095, which could be expanded to add counts for all wikis to graphite, which would mean I could normalize the views to the special page for each site!
[18:59:22] addshore: sounds good
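For context on "the job needs to send it to graphite": Graphite accepts datapoints over a simple plaintext protocol, one `metric.path value timestamp` line per datapoint, conventionally on TCP port 2003. A minimal sketch of what the reporting step might look like — the hostname and metric paths are illustrative assumptions, not the actual production setup:

```python
import socket
import time

def send_to_graphite(path, value, timestamp=None,
                     host='graphite.example.org', port=2003):
    """Send one datapoint using Graphite's plaintext protocol."""
    timestamp = int(timestamp if timestamp is not None else time.time())
    line = f"{path} {value} {timestamp}\n"
    with socket.create_connection((host, port)) as sock:
        sock.sendall(line.encode('ascii'))

# Illustrative metric paths: absolute special-page views per agent type,
# plus the per-wiki total that makes later normalization possible.
send_to_graphite('daily.articleplaceholder.views.user', 1200)
send_to_graphite('daily.articleplaceholder.views.spider', 340)
send_to_graphite('daily.pageviews.total', 900000)
```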
[19:05:09] (PS7) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[19:13:03] Analytics, MediaWiki-extensions-CentralNotice, Operations, Traffic: Generate a list of junk CN cookies being sent by clients - https://phabricator.wikimedia.org/T132374#2476731 (AndyRussG) >>! In T132374#2476603, @BBlack wrote: > It would just be simpler, but we can do a list. Fantastic, thx!!...
[19:18:18] (PS1) Nuria: Normalize project parameter [analytics/aqs] - https://gerrit.wikimedia.org/r/299824 (https://phabricator.wikimedia.org/T136016)
[19:20:25] Analytics-Kanban, Patch-For-Review: Upgrade AQS node version to 4.4.6 - https://phabricator.wikimedia.org/T139493#2476769 (Nuria) Open>Resolved
[19:20:43] Analytics-Kanban: Consider SSTable bulk loading for AQS imports - https://phabricator.wikimedia.org/T126243#2476771 (Nuria) Open>Resolved
[19:21:01] Analytics-Cluster, Analytics-Kanban, Patch-For-Review: Cleanup terabytes of logs on hdfs - https://phabricator.wikimedia.org/T139178#2476772 (Nuria) Open>Resolved
[19:21:29] Analytics-Cluster, Analytics-Kanban, EventBus, Operations, and 2 others: Better monitoring for Zookeeper - https://phabricator.wikimedia.org/T137302#2476785 (Nuria) Open>Resolved
[19:23:01] milimetric, BTW, do you have permission to delete the data from the table? in case you need to?
[19:23:17] no worries, I can just import it into my db
[19:23:23] ok ok
[19:23:32] bye a-team! see you tomorrow
[19:23:53] nite mforns
[19:31:23] (CR) Nuria: Match title & ns using x_analytics header & get all agent_types (2 comments) [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500) (owner: Addshore)
[19:41:18] nuria_: I'm guessing you want the CPU time, not actual time? And would you like it as a comment in the code or just in the CR?
[19:42:13] addshore: well, both, just to have a gauge of how things are working. You can add it to the commit msg, which is normally what we do. In this case it is no biggie, but just for documentation purposes
[19:46:47] Analytics-Dashiki, Analytics-Kanban: Dashiki should load stale data if new data is not available due to network/api conditions - https://phabricator.wikimedia.org/T138647#2476909 (Nuria)
[19:59:47] (PS8) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[20:00:20] (CR) Addshore: Match title & ns using x_analytics header & get all agent_types (2 comments) [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500) (owner: Addshore)
[20:00:48] (CR) Addshore: "I have removed the content_type check as this resulted in more real and CPU time being used." [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500) (owner: Addshore)
[20:01:07] ahh, to the commit message (I'll put it there now)!
[20:02:33] (PS9) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[20:02:38] nuria_: ^^ :)
[20:03:51] addshore: and your oozie code was reviewed and such? if so, this looks ready
[20:04:35] Yes, but https://gerrit.wikimedia.org/r/#/c/298726/ will be needed as the table is changing!
[20:06:53] addshore: why not merge those two changes together?
[20:07:43] addshore: and note-to-self: not sure why table is a parameter on those, as the select depends on the table, so it could be hardcoded and things should be fine
[20:08:20] nuria_: indeed, I could hard-code it, but it was a parameter in the job that I had taken as an example!
[20:08:50] addshore: ya, I think that is a mistake too, cause it's not that the select and the table are interchangeable
[20:09:47] addshore: having parameters comes in handy when you run jobs in testing against a local db
[20:11:41] addshore: in this case I would remove the table parameter, unless you can think of a way in which it is helpful
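To illustrate the trade-off being weighed above: keeping the source table as a job parameter lets the identical query run against a small local table during testing, which is the one case nuria_ concedes it is handy for. A hypothetical sketch — the query body and table names are invented for illustration, not the actual articleplaceholder_metrics job:

```python
# The query text stays fixed; only the table binding changes per run.
QUERY_TEMPLATE = """
    SELECT uri_host, COUNT(*) AS views
    FROM {source_table}
    WHERE agent_type = 'user'
    GROUP BY uri_host
"""

def build_query(source_table='wmf.webrequest'):
    # In production the default is used; in testing you pass e.g.
    # 'my_user.webrequest_sample' to run against a small local copy.
    return QUERY_TEMPLATE.format(source_table=source_table)

print(build_query())
print(build_query(source_table='my_user.webrequest_sample'))
```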
[20:11:44] Analytics-Dashiki: Migrate Expense Report Template to Google Sheets - https://phabricator.wikimedia.org/T140818#2477135 (Milimetric)
[20:13:04] Will do, nuria_!
[20:13:33] Analytics-Dashiki: Automate or Simplify calculating Per-Diems - https://phabricator.wikimedia.org/T140819#2477149 (Milimetric)
[20:13:54] addshore: that way we remove boilerplate from the job too
[20:15:02] Analytics-Dashiki: Discuss Review Nominations Process - https://phabricator.wikimedia.org/T140820#2477166 (Milimetric)
[20:15:03] (PS10) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[20:15:56] Analytics-Dashiki: Improve Individual Travel Request Process - https://phabricator.wikimedia.org/T140821#2477179 (Milimetric)
[20:18:12] (PS2) Addshore: Use webrequest in wikidata/articleplaceholder_metrics [analytics/refinery] - https://gerrit.wikimedia.org/r/298726 (https://phabricator.wikimedia.org/T138500)
[20:18:19] How does that look?
[20:21:43] addshore: I *think* it is better, but in order to test it, did you test it against your local copy or against the webrequest table?
[20:24:11] addshore: cause I am looking everywhere and table is passed in. VERY SORRY. let's just merge your prior change
[20:27:40] I tested against the webrequest table! I'll put the patchsets back!
[20:28:20] addshore: sorry again
[20:28:40] (PS3) Addshore: Use webrequest in wikidata/articleplaceholder_metrics [analytics/refinery] - https://gerrit.wikimedia.org/r/298726 (https://phabricator.wikimedia.org/T138500)
[20:29:03] (PS11) Addshore: Match title & ns using x_analytics header & get all agent_types [analytics/refinery/source] - https://gerrit.wikimedia.org/r/298724 (https://phabricator.wikimedia.org/T138500)
[20:29:08] those should be the 2, nuria_
[20:30:59] addshore: ok, can we make those two one change? as they have to be run together
[20:31:14] they are in separate repos!
[20:32:36] Analytics-Dashiki: Improve Individual Travel Request Process - https://phabricator.wikimedia.org/T140821#2477311 (Milimetric) https://office.wikimedia.org/wiki/Travel_request_step_by_step_guide
[20:32:50] (PS3) Addshore: Use new config vars [analytics/wmde/scripts] - https://gerrit.wikimedia.org/r/298938