[02:14:41] elukey: ping on this CR if you have the time https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/535681/
[06:05:22] nuria: o/ - I tested the change on the fly on an-tool1007 and it reports an error, added to the code review
[06:05:28] it doesn't seem to find $bytes
[06:05:36] (among the measures)
[06:06:35] and super weird thing, turnilo now shows only "count"
[06:06:41] (after the restore/restart)
[06:13:17] could be caching, not sure
[06:29:51] interesting thing that I noticed on the Druid coordinator's UI
[06:29:58] some of the more recent segments
[06:30:12] for 'wmf_netflow' have 1 dimension and 3 metrics
[06:30:30] basically one segment is like that, and all the others are good
[06:31:31] this might be something still not perfect in the realtime handling, but now I am wondering if this causes Turnilo to miss metrics
[06:39:48] ok just tried with explicitly adding the dimensions in the turnilo config
[06:39:52] bytes/packets work
[06:53:39] all right, going to merge the code as it is
[06:53:44] with the dimensions listed
[06:53:53] then we'll discuss what to do
[06:56:07] done
[06:57:18] 10Analytics, 10Analytics-Kanban: Add more dimensions to netflow's druid ingestion specs - https://phabricator.wikimedia.org/T229682 (10elukey) With https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/535681/ I renamed `tag2` as `Direction` but I didn't find a quick and easy way to replace the values.
[07:01:05] Good morning team
[07:02:25] elukey: what is that with the druid segments? have we changed something lately?
[07:03:41] joal: bonjour!
[07:03:54] o/
[07:04:16] not that I know, but I noticed that it is related only to the more recent hours.. could it be a side effect of a still-not-perfect kafka supervisor?
[07:04:22] they keep improving it every release
[07:04:32] very possible elukey
[07:04:45] elukey: could it be that the supervisor doesn't include the metrics?
[07:05:08] joal: the interesting bit is that the segment shows 3 metrics but only 1 dimension
[07:05:16] MEH
[07:06:00] elukey: an idea: netflow has sent only 1 dimension for an hour?
[07:07:06] joal: but for the same hour, there is another segment with all the dims
[07:07:48] oh
[07:07:50] hm
[07:08:17] elukey: I have an idea (I think I have seen that behavior when monitoring the realtime task)
[07:10:44] elukey: also, it looks like the data has not been daily indexed for a few days
[07:11:16] yes I noticed as well, I wanted to check the daily job on an-coord1001
[07:11:20] but got distracted :)
[07:11:22] doing it now
[07:11:22] :)
[07:11:33] elukey: about the weird segments
[07:11:43] elukey: They are always empty (0 bytes)
[07:12:03] yep
[07:12:28] elukey: I assume those are leftovers from druid not correctly converting realtime segments into deep-storage, or something equivalent
[07:13:42] joal: the daily job is configured with --since $(date --date '-4days' -u +'%Y-%m-%dT00:00:00') --until $(date --date '-3days' -u
[07:13:45] etc..
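For context, that --since/--until pair selects a single day that deliberately trails the current date by a few days. A minimal illustration of how it expands, assuming GNU date; the run date in the comments is an arbitrary example, not taken from the log:

```bash
# Illustration only: what the daily Druid indexation window quoted above evaluates to.
# Assuming GNU date and a run on 2019-09-13 (arbitrary example date):
since_ts=$(date --date '-4days' -u +'%Y-%m-%dT00:00:00')   # 2019-09-09T00:00:00
until_ts=$(date --date '-3days' -u +'%Y-%m-%dT00:00:00')   # 2019-09-10T00:00:00
echo "indexing from ${since_ts} until ${until_ts}"          # one day, roughly 3 days behind
```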
[07:13:50] ah
[07:13:51] so it seems it's lagging on purpose
[07:14:04] ok - I wonder why though
[07:14:08] anyway
[07:14:18] IIRC I made those parameters configurable in puppet, I can tune them
[07:14:27] ohhhh :)
[07:16:11] in case you've not noticed it elukey, have a look at that link I pasted yesterday evening (https://blog.acolyer.org/2019/09/11/procella/)
[07:17:54] I didn't, thanks :)
[07:57:47] need to run an errand for 30/40 mins
[08:50:50] joal: wat
[08:50:51] https://github.com/wikimedia/analytics-aqs/blob/master/lib/aqsUtil.js#L72-L90
[08:51:01] I just realized this
[08:57:54] it's also funny that the function is not being used by anything
[09:06:46] oh wait no, the one that I removed by accident did use it
[09:07:19] Hi fdans - sorry, I missed the earlier ping
[09:07:33] nono just commenting as I go :)
[09:07:52] fdans: indeed - double page normalize function ??? WTW?
[09:08:32] fdans: ok with the comments I made?
[09:08:41] fdans: I'm super open to discussing them
[09:08:51] yea, will send changes now, almost done
[09:11:38] \o/ Thanks :)
[09:13:44] elukey: if you're around I could do with some help - I need a hammer please :)
[09:14:05] Or to be precise - I'd need you to use your hammer :)
[09:16:03] sure, what happened?
[09:16:25] joal: --^
[09:16:43] elukey: nothing happened, YET :)
[09:17:03] elukey: I need a git repo to be dropped, and another to be overwritten
[09:18:35] can you tell me more details? :)
[09:19:03] elukey: sure!
[09:19:31] elukey: as discussed yesterday, I'd like us to drop https://gerrit.wikimedia.org/r/#/admin/projects/analytics/ua-parser/uap-core
[09:19:50] elukey: and I also need to push force https://gerrit.wikimedia.org/r/#/admin/projects/analytics/ua-parser/uap-java
[09:20:25] I don't have a diagnosis yet for https://gerrit.wikimedia.org/r/#/admin/projects/operations/debs/python-ua-parser, but something similar might be needed
[09:23:33] ahh so in gerrit
[09:26:33] yessir
[09:26:55] elukey: I'm going to document the plan on the task first - give me a minute :)
[09:27:16] 10Analytics, 10Analytics-Kanban: Upgrade ua parser to latest version for both java and python - https://phabricator.wikimedia.org/T212854 (10JAllemandou)
[09:28:41] so I am not sure if I can drop the repo
[09:28:55] I think that I can set it read only and then add a flag to mark it as deprecated
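For the record, setting a Gerrit project read-only can be done from the project settings UI or through the project's refs/meta/config branch. A rough sketch of the latter, under assumptions: the project name is real, but the exact mechanism used here is not stated in the log and the deprecation wording is illustrative.

```bash
# Sketch only: mark a Gerrit project read-only via its refs/meta/config branch.
# Assumes project-admin rights; the project settings web UI achieves the same result.
git clone "ssh://$USER@gerrit.wikimedia.org:29418/analytics/ua-parser/uap-core" && cd uap-core
git fetch origin refs/meta/config
git checkout -b meta-config FETCH_HEAD
# Edit project.config so the [project] section contains:
#   state = read only
#   description = DEPRECATED, use upstream ua-parser/uap-core instead   (illustrative wording)
git commit -am "Mark repository as read-only and deprecated"
git push origin HEAD:refs/meta/config
```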
[09:31:24] joal: i disagree with the "a file stored in mediawiki databases" bit
[09:31:41] like, that's not where the files are stored, no?
[09:31:47] true
[09:32:04] But I think upload.wikimedia.org is really not explicit for most
[09:32:13] We should find a better explanation :)
[09:32:19] Mine was also not good
[09:32:24] joal: people will have to know the upload.wikimedia.org path to query this API, so they will have to be at least aware of its existence
[09:32:59] I don't agree fdans - People don't even know that images in pages are served through different requests
[09:33:10] (most, from my personal experience)
[09:33:35] So I think we should do a good job of explaining - the main metric, and also the referer notion
[09:33:39] joal: I'd say it's not the place of this API's documentation to explain that
[09:35:05] Makes sense - I still don't like using the `upload.wikimedia.org` - seems too technical and not functional
[09:37:20] joal: to me mentioning upload.blahblah makes it explicit that the user needs a direct path to the file to see this data, and not a commons file page, for example
[09:37:33] probably can be better worded though
[09:37:44] this implies users know about upload, which I think is mostly not the case
[09:38:01] hm
[09:44:09] joal: maybe change it to "Given a direct path of a media file..."
[09:44:37] but I do think not mentioning upload.wikimedia.org makes it more confusing
[09:44:46] because then it can be anything
[09:45:20] you can find out what upload.wikimedia.org is pretty quickly, but if you don't have that parameter you're going to be more lost
[09:48:04] I hear your point - I just don't know how to make things clearer - Maybe something along the lines of: a file served from upload.wikimedia.org (the mediawiki media endpoint)?
[09:51:05] joal: (the file storage for all media in every wiki)
[09:51:27] I'm trying to deliberately avoid using mediawiki, since it doesn't necessarily have anything to do with it
[09:51:37] for all media from any wiki?
[09:51:46] from every sounds better
[09:51:47] sounds good
[09:51:54] \o/ !!
[09:52:00] Thanks fdans :) A lot better
[09:59:05] (03PS1) 10Fdans: Add fake test data for mediarequests per file [analytics/aqs/deploy] - 10https://gerrit.wikimedia.org/r/536556
[10:01:44] (03PS12) 10Fdans: Add per file mediarequests endpoint to AQS [analytics/aqs] - 10https://gerrit.wikimedia.org/r/534824 (https://phabricator.wikimedia.org/T231589)
[10:22:36] 10Analytics, 10Operations, 10Traffic: Images served with text/html content type - https://phabricator.wikimedia.org/T232679 (10BBlack) The URL mentioned at the top isn't a media URL, it actually is HTML content and is a pageview. Try it in your browser: https://commons.wikimedia.org//wiki/File:Arm_muscles_b...
[10:24:32] joal: just to be on the same page, I am going to wait for your input before proceeding with the repos, right?
[10:28:47] elukey: I'm on it !!
[10:30:13] 10Analytics: Check home leftovers of atgomez - https://phabricator.wikimedia.org/T232821 (10MoritzMuehlenhoff)
[10:31:57] 10Analytics, 10Analytics-Kanban: Upgrade ua parser to latest version for both java and python - https://phabricator.wikimedia.org/T212854 (10JAllemandou) After yesterday's talk in tasking, here is the plan: - Use uap-core upstream master for regexes. This means we no longer need to maintain our own uap-core f...
[10:32:00] elukey: --^
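The plan above ("use uap-core upstream master for regexes"), together with the force-push of the wmf branch later in the day, boils down to git work along these lines. This is a sketch under assumptions: the submodule path, the upstream remote and the exact commands are not taken from the log.

```bash
# Sketch only, not the exact procedure used: point uap-java's uap-core submodule
# at upstream master, then re-align the wmf branch (which needs a force push).
git clone "ssh://$USER@gerrit.wikimedia.org:29418/analytics/ua-parser/uap-java" && cd uap-java
git submodule update --init
cd uap-core                                            # assumed submodule path
git fetch https://github.com/ua-parser/uap-core.git master
git checkout FETCH_HEAD                                # pin the submodule to upstream master
cd ..
git add uap-core
git commit -m "Bump uap-core submodule to master"
# Later, once reviewed, the wmf branch is reset to the updated master:
git push --force origin master:wmf
```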
[10:36:18] 10Analytics, 10Better Use Of Data, 10Product-Infrastructure-Team-Backlog, 10Epic: Client side error logging production launch - https://phabricator.wikimedia.org/T226986 (10fgiunchedi)
[10:40:53] so joal I don't think that we can delete a repo, but I can mark it as read-only and add a deprecation note
[10:41:10] work for me elukey :)
[10:41:24] elukey: Does the plan make sense?
[10:41:41] 10Analytics: Check home leftovers of atgomez - https://phabricator.wikimedia.org/T232821 (10elukey) ` ====== stat1004 ====== total 0 ls: cannot access '/var/userarchive/atgomez.tar.bz2': No such file or directory ====== stat1006 ====== total 0 ls: cannot access '/var/userarchive/atgomez.tar.bz2': No such file o...
[10:42:28] joal: https://gerrit.wikimedia.org/r/#/admin/projects/analytics/ua-parser/uap-core
[10:42:50] 10Analytics, 10Analytics-Kanban: Upgrade ua parser to latest version for both java and python - https://phabricator.wikimedia.org/T212854 (10elukey) Set https://gerrit.wikimedia.org/r/#/admin/projects/analytics/ua-parser/uap-core as read-only and added a deprecation notice in the description.
[10:42:51] Yay ! thanks elukey
[10:44:59] joal: https://gerrit.wikimedia.org/r/#/admin/projects/analytics/ua-parser/uap-java,access shows that you should be able to push there
[10:45:00] ok elukey I'm going to push the 23 commits needed
[10:45:00] for the python one you can't, but I can add perms
[10:45:00] I'll need nuria confirmation before updating the wmf branch
[10:45:00] elukey: sounds good for perms
[10:46:45] for python I'd need to triple check with Andrew/Nuria the gerrit perms of the group Analytics
[10:46:52] they seem a bit out of date
[10:47:07] ok, no rush :)
[10:47:36] just sent an email
[10:47:41] but you should be unblocked for java
[10:47:57] elukey: just pushed a bunch of commits :)
[10:48:11] super
[10:49:10] I'm gonna wait for nuria later on before updating wmf branch
[10:52:22] ack, going to have a quick lunch and then I'll be back
[11:09:34] I am about to reboot the vm that runs turnilo
[11:09:44] if anybody is using it speak up
[11:09:45] :)
[11:10:11] * joal speaks down
[11:42:33] (03PS1) 10Ladsgroup: Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536572 (https://phabricator.wikimedia.org/T232826)
[11:42:57] (03CR) 10Ladsgroup: [C: 03+2] Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536572 (https://phabricator.wikimedia.org/T232826) (owner: 10Ladsgroup)
[11:43:05] (03Merged) 10jenkins-bot: Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536572 (https://phabricator.wikimedia.org/T232826) (owner: 10Ladsgroup)
[11:43:52] (03PS1) 10Ladsgroup: Track *.jar files as git lfs [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536573 (https://phabricator.wikimedia.org/T230015)
[11:44:08] (03CR) 10Ladsgroup: [C: 03+2] Track *.jar files as git lfs [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536573 (https://phabricator.wikimedia.org/T230015) (owner: 10Ladsgroup)
[11:44:14] (03Merged) 10jenkins-bot: Track *.jar files as git lfs [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536573 (https://phabricator.wikimedia.org/T230015) (owner: 10Ladsgroup)
[11:46:47] (03PS1) 10Ladsgroup: Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536575 (https://phabricator.wikimedia.org/T232826)
[11:46:55] (03CR) 10Ladsgroup: [C: 03+2] Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536575 (https://phabricator.wikimedia.org/T232826) (owner: 10Ladsgroup)
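The "Track *.jar files as git lfs" change above amounts to the usual git-lfs setup. A generic sketch of that setup, not the literal content of the patch; the jar filename is hypothetical.

```bash
# Generic git-lfs setup for tracking built jars; not the literal content of the patch.
git lfs install                       # one-time per clone, sets up the lfs filters
git lfs track "*.jar"                 # records the pattern in .gitattributes
git add .gitattributes
git add toolkit-analyzer.jar          # hypothetical jar name; subsequent adds go through LFS
git commit -m "Track *.jar files as git lfs"
```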
[11:47:01] (03Merged) 10jenkins-bot: Rebuild the jar [analytics/wmde/toolkit-analyzer-build] - 10https://gerrit.wikimedia.org/r/536575 (https://phabricator.wikimedia.org/T232826) (owner: 10Ladsgroup)
[12:03:54] 10Analytics, 10Operations, 10hardware-requests, 10User-Elukey: eqiad: 1 misc node for the Kerberos KDC service - https://phabricator.wikimedia.org/T227288 (10faidon) a:05faidon→03RobH Approved. It sounds like our spare pools are being drained, so if that's the case feel free to open a task to replenis...
[12:04:28] 10Analytics, 10Operations, 10hardware-requests, 10User-Elukey: codfw: 1 misc node for the Kerberos KDC service - https://phabricator.wikimedia.org/T227425 (10faidon) a:05faidon→03RobH Approved.
[12:08:04] \o/
[12:10:33] (03CR) 10Joal: [C: 03+1] "Good to go for me :) Thanks for the changes :)" [analytics/aqs] - 10https://gerrit.wikimedia.org/r/534824 (https://phabricator.wikimedia.org/T231589) (owner: 10Fdans)
[12:19:50] (03CR) 10Joal: [C: 03+1] "\o/ Thanks :)" [analytics/aqs/deploy] - 10https://gerrit.wikimedia.org/r/536556 (owner: 10Fdans)
[12:20:15] (03PS17) 10Fdans: Add cassandra loading job for mediarequests per file metric [analytics/refinery] - 10https://gerrit.wikimedia.org/r/533921 (https://phabricator.wikimedia.org/T228149)
[12:20:25] thank you so much for the reviews joal and nuria
[12:21:17] (03CR) 10Fdans: [V: 03+2 C: 03+2] "Merging with +1s from nuria and joal" [analytics/refinery] - 10https://gerrit.wikimedia.org/r/533921 (https://phabricator.wikimedia.org/T228149) (owner: 10Fdans)
[12:56:04] (03CR) 10Fdans: [C: 03+2] Add per file mediarequests endpoint to AQS (0311 comments) [analytics/aqs] - 10https://gerrit.wikimedia.org/r/534824 (https://phabricator.wikimedia.org/T231589) (owner: 10Fdans)
[13:04:33] 10Analytics, 10Better Use Of Data, 10Product-Infrastructure-Team-Backlog, 10Epic: Client side error logging production launch - https://phabricator.wikimedia.org/T226986 (10fgiunchedi) >>! In T226986#5362943, @Nuria wrote: > I would like to suggest a deployment strategy for this code that I think would mak...
[13:55:25] Gone for kids - Back for standup
[14:29:17] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey)
[14:36:34] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey) Just sent an email to Research/Product-analytics/Analytics mailing lists explaining the problem.
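As an aside on the Python 2 audit (T204737), the per-host inventory can be approximated with a one-liner like the sketch below. The glob patterns are illustrative only; they lean on the fact that Debian's Python 3 packages use the separate python3- prefix.

```bash
# Rough inventory of installed Python 2 era Debian packages on a single host.
# Patterns are approximate: python-* and python2.7* predate the python3- prefix.
dpkg-query -W -f='${Package}\t${Version}\n' 'python-*' 'python2.7*' 2>/dev/null | sort
```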
[14:38:10] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey)
[14:38:39] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey)
[14:42:29] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey)
[14:46:22] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey)
[14:47:12] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10elukey) Explicitly adding @Gilles (due to https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/459735/) and @EBernhardson to triple check :)
[14:47:37] we have a ton of python2 packages :D
[14:50:46] taking a break before standup :)
[15:13:02] sorry a-team my internet was kaput until now
[15:16:50] 10Analytics: Can we add Parsoid and ORES data to mediawiki history? - https://phabricator.wikimedia.org/T232843 (10mforns)
[15:20:35] 10Analytics: Release wikimedia history dumps sorted and partitioned by user ID and page ID - https://phabricator.wikimedia.org/T232844 (10mforns)
[15:20:58] 10Analytics, 10Research: Recommend the best format to release public data lake as a dump - https://phabricator.wikimedia.org/T224459 (10mforns) Hi all! Here's a summary of the survey responses. I added the conclusions that we Analytics have drawn, inline. Please, feel free to comment on them. In a later post...
[15:22:42] 10Analytics: Can we add Parsoid and ORES data to mediawiki history? - https://phabricator.wikimedia.org/T232843 (10Nuria) I do not think it will be a wise decision (even if we could do it technically) to have mediawiki history be the monolith of all the things, with revision ids you should be able to retrieve ea...
[15:32:46] 10Analytics, 10Operations, 10Traffic: Add google weblight to the list of trusted proxies - https://phabricator.wikimedia.org/T232849 (10Nuria)
[15:47:39] 10Analytics: Client_IP and IP are always the same, even for proxied requests for opera mini or googleweblight - https://phabricator.wikimedia.org/T232795 (10Nuria)
[15:57:08] 10Analytics, 10Research: Recommend the best format to release public data lake as a dump - https://phabricator.wikimedia.org/T224459 (10mforns) So, this is the final format of the MediaWiki history dumps: ### Updates The job that generates the dumps will execute once a month, together with the release of eac...
[15:58:34] 10Analytics, 10Analytics-Kanban, 10Research-Backlog, 10Patch-For-Review: Release edit data lake data as a public json dump / mysql dump, other? - https://phabricator.wikimedia.org/T208612 (10mforns) See the final format of the dumps, chosen after the community survey, here: T224459#5491080
[16:00:32] 10Analytics, 10Research: Recommend the best format to release public data lake as a dump - https://phabricator.wikimedia.org/T224459 (10Nuria) Can we move the info to wikitech so we can access it easily when this ticket is closed?
[16:00:50] 10Analytics: Can we add Parsoid and ORES data to mediawiki history? - https://phabricator.wikimedia.org/T232843 (10JAllemandou) I mildly disagree with @Nuria for ORES scores - I think it could be very cool to have them (some models only, one model version only). Maybe in a separate table, or in different dumps....
[16:02:54] 10Analytics, 10Analytics-Kanban, 10Tool-Pageviews: Load media requests data into cassandra - https://phabricator.wikimedia.org/T228149 (10fdans)
[16:03:25] 10Analytics, 10Analytics-Kanban, 10Tool-Pageviews: Load media requests data into cassandra - https://phabricator.wikimedia.org/T228149 (10fdans) a:03fdans
[16:05:12] 10Analytics: Reload Hive2Druid datasources from already indexed data instead of raw data - https://phabricator.wikimedia.org/T232852 (10mforns)
[16:52:44] 10Analytics, 10Services (watching): Add mediarequests per referer endpoint to AQS - https://phabricator.wikimedia.org/T232857 (10fdans)
[16:53:43] 10Analytics, 10Services (watching): Add cassandra loading job for mediarequests per referer - https://phabricator.wikimedia.org/T232858 (10fdans)
[16:55:21] nuria: what we just discussed is summarized here https://phabricator.wikimedia.org/T212854#5490312
[16:56:53] 10Analytics, 10Operations, 10Traffic: We are not capturing IPs of original requests for proxied requests from operamini and googleweblight. x-forwarded-for is null and client-ip is the same as IP on Webrequest data - https://phabricator.wikimedia.org/T232795 (10Nuria)
[16:57:48] !log Reset ua-parser/uap-java wmf branch to up-to-date master using push force
[16:57:50] Logged the message at https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log
[16:58:40] 10Analytics, 10Operations, 10Traffic: Images served with text/html content type - https://phabricator.wikimedia.org/T232679 (10Nuria) I have started another ticket that, as you mentioned, better explains the rationale behind having "trusted proxies"; we really do not need them if we can capture the original i...
[16:59:01] 10Analytics, 10Operations, 10Traffic: We are not capturing IPs of original requests for proxied requests from operamini and googleweblight. x-forwarded-for is null and client-ip is the same as IP on Webrequest data - https://phabricator.wikimedia.org/T232795 (10Nuria) ping @Ottomata and @JAllemandou for thou...
[16:59:07] https://www.reddit.com/r/ProgrammerHumor/comments/1zu0p1/git_push_force/
[17:02:08] 10Analytics: Can we add Parsoid and ORES data to mediawiki history? - https://phabricator.wikimedia.org/T232843 (10Nuria) I do not disagree that ORES scores would be useful in a table accessible by revision, +1 to that. I just do not think that the process that retrieves them and maintains them should be related...
[17:07:36] (03PS1) 10Joal: Bump uap-core submodule to master [analytics/ua-parser/uap-java] - 10https://gerrit.wikimedia.org/r/536644 (https://phabricator.wikimedia.org/T212854)
[17:10:15] (03PS1) 10Fdans: Add daily and monthly jobs for mediarequests per referer [analytics/refinery] - 10https://gerrit.wikimedia.org/r/536646 (https://phabricator.wikimedia.org/T232858)
[17:59:30] 10Analytics, 10Operations, 10Traffic: We are not capturing IPs of original requests for proxied requests from operamini and googleweblight. x-forwarded-for is null and client-ip is the same as IP on Webrequest data - https://phabricator.wikimedia.org/T232795 (10BBlack) The problem stems from the "Trust" in "...
[18:16:24] 10Analytics: Verify what Python 2 packages deployed to Analytics hosts are needed - https://phabricator.wikimedia.org/T204737 (10EBernhardson) Everything in search should be running on 3, sadly that migration only happened in the last year. But it happened!
[18:40:44] 10Analytics, 10Operations, 10Traffic: We are not capturing IPs of original requests for proxied requests from operamini and googleweblight. x-forwarded-for is null and client-ip is the same as IP on Webrequest data - https://phabricator.wikimedia.org/T232795 (10Nuria) Right, I see the UA issue but in the abs...
[18:50:48] (03CR) 10Nuria: "Looks good, let's make sure to test job. I see we are using referer (one 'r') everywhere and while is a misspell in the http spec (referre" (032 comments) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/536646 (https://phabricator.wikimedia.org/T232858) (owner: 10Fdans)
[20:02:13] 10Analytics: Can we add Parsoid and ORES data to mediawiki history? - https://phabricator.wikimedia.org/T232843 (10JAllemandou) +1 to that :)
[20:10:43] (03PS1) 10Joal: Parameterize cache size in CachingParser [analytics/ua-parser/uap-java] - 10https://gerrit.wikimedia.org/r/536673 (https://phabricator.wikimedia.org/T212854)
[20:13:53] (03Abandoned) 10Joal: Bump uap-core submodule to master [analytics/ua-parser/uap-java] - 10https://gerrit.wikimedia.org/r/536644 (https://phabricator.wikimedia.org/T212854) (owner: 10Joal)
[20:14:05] (03Abandoned) 10Joal: Parameterize cache size in CachingParser [analytics/ua-parser/uap-java] - 10https://gerrit.wikimedia.org/r/536673 (https://phabricator.wikimedia.org/T212854) (owner: 10Joal)
[20:14:29] (03PS1) 10Joal: Bump uap-core submodule to master [analytics/ua-parser/uap-java] (wmf) - 10https://gerrit.wikimedia.org/r/536676 (https://phabricator.wikimedia.org/T212854)
[20:15:51] (03PS1) 10Joal: Parameterize cache size in CachingParser [analytics/ua-parser/uap-java] (wmf) - 10https://gerrit.wikimedia.org/r/536677 (https://phabricator.wikimedia.org/T212854)
[20:16:14] sorry for the mess team --^ Forgot to commit reviews to the correct branch
[20:27:55] (03PS1) 10Joal: Update pom.xml file to release v1.4.4-wmf [analytics/ua-parser/uap-java] (wmf) - 10https://gerrit.wikimedia.org/r/536678 (https://phabricator.wikimedia.org/T212854)
[21:02:07] 10Analytics, 10Research: Recommend the best format to release public data lake as a dump - https://phabricator.wikimedia.org/T224459 (10Isaac) this looks great -- thanks @mforns for writing this up so clearly!
[22:01:40] nuria: I mean it's a spelling mistake but it's used everywhere and yea let's not change it
[22:03:24] (03CR) 10Fdans: Add daily and monthly jobs for mediarequests per referer (031 comment) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/536646 (https://phabricator.wikimedia.org/T232858) (owner: 10Fdans)
[23:34:13] 10Analytics, 10Patch-For-Review, 10Performance-Team (Radar): Eventlogging processors are frequently failing heartbeats causing consumer group rebalances - https://phabricator.wikimedia.org/T222941 (10Krenair) Is puppet on deployment-eventlog05.deployment-prep.eqiad.wmflabs expected to be failing with errors...