[00:17:20] issue solved in case anyone is worried [00:37:31] quick question: I'm trying to run a query and I'm going on stat1007 and running beeline like I always did, but it tells me " GSS initiate failed". Did something change regarding how to do it? [00:37:46] Pchelolo: did you do >kinit? [00:37:57] Pchelolo: if so, do you get same error on hive? [00:38:23] oh, kinit is something new :) [00:38:29] it says Client 'ppchelko@WIKIMEDIA' not found in Kerberos database while getting initial credentials [00:39:51] Pchelolo: that is whan you ask elukey to create a kerberos user for ya' [00:40:08] oh. ok. will do that. thank you. [00:41:05] it's even in the docs! sorry... [00:45:10] 10Analytics: Requesting kerberos access for ppchelko - https://phabricator.wikimedia.org/T245091 (10Pchelolo) [00:45:13] 10Analytics, 10Operations, 10ops-eqiad: rack/setup/install kafka-jumbo100[789].eqiad.wmnet - https://phabricator.wikimedia.org/T244506 (10Jclark-ctr) [01:09:21] 10Analytics, 10Cloud-Services, 10Developer-Advocacy (Jan-Mar 2020): Further improvements to the WMCS edits dashboard - https://phabricator.wikimedia.org/T240040 (10srishakatux) [01:28:18] 10Analytics, 10Cloud-Services, 10Developer-Advocacy (Jan-Mar 2020): Modify ReportUpdater to support `YYYY-MM` dates for monthly reports - https://phabricator.wikimedia.org/T245096 (10srishakatux) [01:28:47] 10Analytics, 10Cloud-Services, 10Developer-Advocacy: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10srishakatux) @Milimetric It looks like there is no data generated for the `tabular` and `hierarchical` view for the month of January and no data at all for February. C... [06:34:57] 10Analytics, 10Analytics-Wikistats: Wikistats New Feature - bot edits / new articles - https://phabricator.wikimedia.org/T241922 (10fdans) @FocalPoint don't get me wrong, the big tables in Wikistats 1 are really valuable. They are part of the next phase for Wikistats 2, which is adding better data exploration... [06:35:13] heeelloo [06:46:11] foks: hi! were you using the test cluster?? [06:46:24] fdans: hola! [06:46:39] o/ [06:46:51] foks: ah yes I can see you on an-tool1006! [06:47:54] ahhhh the email message still refers to an-tool1006! [06:48:00] my bad foks sorry! [06:48:14] changing it now [06:49:16] the next ones will mention stat1007 [06:51:28] nuria: as FYI https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/571868/ [07:52:23] 10Analytics, 10Operations, 10serviceops, 10vm-requests, 10User-Elukey: Create a replacement for kraz.wikimedia.org - https://phabricator.wikimedia.org/T244719 (10elukey) My bad, today I reviewed irc2001 with Moritz and it turned up that a GNOME deployment happened. This is because when d-i was running an... [07:55:57] 10Analytics: Requesting kerberos access for ppchelko - https://phabricator.wikimedia.org/T245091 (10elukey) ` elukey@krb1001:~$ sudo manage_principals.py create ppchelko --email_address=ppchelko@wikimedia.org Principal successfully created. Make sure to update data.yaml in Puppet. Successfully sent email to ppch... [08:10:57] Hi folks [08:19:32] bonjour [08:20:00] 10Analytics: Requesting kerberos access for ppchelko - https://phabricator.wikimedia.org/T245091 (10elukey) 05Open→03Resolved [08:36:17] \o/ Pietr on the cluster :) [08:40:19] Hi elukey - Would you have a minute for a braindump on https://phabricator.wikimedia.org/T226663#5879290 please? 
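For anyone who hits the same "GSS initiate failed" error, the flow that falls out of the exchange above is roughly the following. It is a minimal sketch: the klist step and the placeholder username are illustrative additions, not quoted from the log.

    # on an analytics client such as stat1007
    kinit      # obtain a Kerberos ticket; prompts for your Kerberos password
    klist      # optional: confirm a valid ticket was granted
    beeline    # Hive queries now authenticate via Kerberos instead of failing with "GSS initiate failed"

    # "Client 'xxx@WIKIMEDIA' not found in Kerberos database" means no principal exists yet.
    # Per T245091 an admin creates one on the KDC host, e.g.:
    #   sudo manage_principals.py create <shell-username> --email_address=<user>@wikimedia.org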
[08:44:15] joal: in a meeting now, will be free in a bit [08:45:18] sure, ping me when you want :) [08:45:22] elukey: -^ thanks :) [08:45:44] * joal wouldn't be cafeinated enough for a meeting that early :) [08:52:07] dcausse: Good morning :) [08:52:55] dcausse: Would you please ping me when you have minute, I have questions regarding datasets on hdfs - cheers :) [09:41:55] 10Analytics, 10Analytics-Kanban, 10Security, 10security-related: `mediawiki_cirrussearch_request` and `mediawiki_api_request` events are not deleted - https://phabricator.wikimedia.org/T245124 (10JAllemandou) [09:42:07] 10Analytics, 10Analytics-Kanban: HDFS space usage steadily increased over the past three month - https://phabricator.wikimedia.org/T244889 (10JAllemandou) After a (not so) quick audit, growth usage is due to the problem described in T245124. [09:42:53] elukey: can you please confirm that the task I created is hidden ?? [09:45:44] 10Analytics: Delete raw events after some time even if not needed - https://phabricator.wikimedia.org/T245126 (10JAllemandou) [09:47:41] joal: I think it needs restrictions on visibility [09:47:50] just did that [09:47:52] okok [09:47:53] elukey: can you confirm? [09:48:16] what I usually check is what phab lists as people who can view [09:48:25] since I can view now by default [09:48:39] I flaged you on the ticket [09:48:45] maybe that's it? [09:49:38] I changed it again elukey [09:49:46] joal: I just hit the protect as security issue joal [09:49:49] ah [09:49:55] * joal should learn how to that properly [09:50:17] I mean it is maybe too much but just in case [09:50:27] yeah yeah [09:50:41] sounds good :) [09:50:45] Thanks elukey [09:51:07] np! [09:51:41] joal: possibly it would be great to add Erik/David to the subs? (if not there yet) [09:51:47] sure [09:52:27] done [09:52:49] Ah - David is holidays [10:00:54] elukey: late but yes lol, I was on the test cluster :D [10:01:03] Worked it out with Nuria :) [10:01:55] foks: my bad! I fixed the emails now, sorry! [10:02:08] It's all good :) [10:04:25] Hi foks - let me know if you need help with queries :) [10:06:32] thanks a lot! I got this one sorted out. Was just a case of working out the ideal way to filter it. [10:06:44] Nu_ria was a big help there [10:06:51] great :) [10:09:59] notebook1004 is alerting in Icinga for disk space on / [10:10:58] yep I am going to ack it [10:39:38] (03CR) 10Joal: "Answering to comments, patch to follow." (033 comments) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/569836 (https://phabricator.wikimedia.org/T209655) (owner: 10Joal) [10:46:44] (03CR) 10Joal: "Answer to comments, patch to follow." (033 comments) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/346726 (https://phabricator.wikimedia.org/T209655) (owner: 10Joal) [10:47:29] (03PS15) 10Joal: Add spark code for wikidata json dumps parsing [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/346726 (https://phabricator.wikimedia.org/T209655) [10:57:45] hey team! 
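The kind of check elukey and joal are adding to the "User responsibilities" / Analytics Client Nodes pages could look like the sketch below; the exact commands the wiki page ended up recommending are not quoted in the log, so treat these as illustrative.

    df -h / /srv                              # how full the host's partitions are (notebook1004 above alerted on /)
    du -sh ~                                  # how much of that space your own home directory uses
    du -sh ~/* 2>/dev/null | sort -h | tail   # your largest directories, to decide what to clean up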
[10:58:02] hi mforns - Hope you feel better :) [10:58:03] 10Analytics, 10Cloud-Services, 10Developer-Advocacy: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10mforns) Looking into this now [10:58:16] yes, feeling better :] [10:58:16] thanks mforns --^ :) [10:58:22] about the ticket [10:58:23] :) [10:58:28] np [10:59:10] I looked quickly with Luca earlier but as nothing appeared magically, we decided to let it ot knowledgeable people :) [10:59:41] I just added one thing to https://wikitech.wikimedia.org/wiki/Analytics/Data_access#User_responsibilities [10:59:53] to make sure that people check free space [10:59:58] I'll also add an example [11:00:17] Super great elukey :) [11:04:43] joal: https://wikitech.wikimedia.org/wiki/Analytics/Data_access#User_responsibilities - is it ok in your opinion? [11:04:46] if so I'll send it around [11:06:03] hello mforns :) [11:06:35] elukey: I would create a new page with the example (and possibly anther one using du to see personal space used), and link it from the main page - This would facilitate reading [11:06:54] heya elukey :] [11:07:23] joal: sure, under the data access namespace? [11:08:47] hm, should we actually do something like: Analytics/Users/{Data_Access,Data_Storage} etc? [11:08:50] I wonder [11:09:07] Cause here we're not really on "access" per say [11:09:52] maybe under https://wikitech.wikimedia.org/wiki/Analytics/Tutorials ? [11:09:59] Works for me :) [11:09:59] and then link it [11:10:09] ack :) [11:18:19] ok I started https://wikitech.wikimedia.org/wiki/Analytics/Tutorials/Analytics_Client_Nodes [11:18:22] with some info [11:20:45] or maybe better under https://wikitech.wikimedia.org/wiki/Analytics/Data_access#Analytics_clients ? [11:21:05] different things [11:21:06] mmmm [11:22:57] ok reworked a bit [11:24:59] yes seems reasonable, will ask for a review of the team and in case send it to more lists [11:29:11] (03PS1) 10Mforns: Add more delay to wmcs reports [analytics/reportupdater-queries] - 10https://gerrit.wikimedia.org/r/571947 (https://phabricator.wikimedia.org/T226663) [11:30:07] (03CR) 10Mforns: [V: 03+2 C: 03+2] "Self-merging to unbreak production." [analytics/reportupdater-queries] - 10https://gerrit.wikimedia.org/r/571947 (https://phabricator.wikimedia.org/T226663) (owner: 10Mforns) [11:35:42] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10mforns) I checked the report files, the logs, and the data in edit_hourly. I believe the problem was that the mediawiki_history data set was not... [11:42:16] * elukey lunch! [12:18:26] hi, I am trying to sync folders/projects across two different stat-machines using rsync (not very successfully); could anyone give a pointer or has an example how to make it work? thanks [12:20:17] Hi mgerlach - https://wikitech.wikimedia.org/wiki/Analytics/FAQ#How_do_I_transfer_files_between_stat_boxes? [12:22:35] joal: thanks (didnt see that) [12:22:46] hope it helps mgerlach :) [13:15:33] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10mforns) I checked in https://analytics.wikimedia.org/datasets/periodic/reports/metrics/wmcs/ and all files are now complete with latest data. The... 
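The FAQ entry joal links is the authoritative answer to mgerlach's question; the general shape is an rsync pull run on the destination stat host, something like the sketch below. The daemon module and paths here are hypothetical placeholders (direct ssh between the hosts is not assumed to work), so check the FAQ for the real ones.

    # run on the destination stat host; SOURCE_MODULE and the paths are placeholders
    rsync -av --progress stat1007.eqiad.wmnet::SOURCE_MODULE/path/to/project/ ~/project/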
[13:23:25] (03PS1) 10Fdans: Improve flexibility of media file paths [analytics/aqs] - 10https://gerrit.wikimedia.org/r/571968 [13:23:43] (03CR) 10jerkins-bot: [V: 04-1] Improve flexibility of media file paths [analytics/aqs] - 10https://gerrit.wikimedia.org/r/571968 (owner: 10Fdans) [13:24:50] (03PS2) 10Fdans: Improve flexibility of media file paths [analytics/aqs] - 10https://gerrit.wikimedia.org/r/571968 (https://phabricator.wikimedia.org/T244712) [13:24:58] (03CR) 10jerkins-bot: [V: 04-1] Improve flexibility of media file paths [analytics/aqs] - 10https://gerrit.wikimedia.org/r/571968 (https://phabricator.wikimedia.org/T244712) (owner: 10Fdans) [13:25:38] (03PS3) 10Fdans: Improve flexibility of media file paths [analytics/aqs] - 10https://gerrit.wikimedia.org/r/571968 (https://phabricator.wikimedia.org/T244712) [13:26:10] elukey: lucaaaa you mind merging this? https://gerrit.wikimedia.org/r/#/c/operations/puppet/+/571672/ [13:27:01] milimetric: hellooo the changes we talked about yesterday are ready I think [13:27:14] in the end I went all for it and added both things [13:31:56] 10Analytics, 10Multimedia, 10Tool-Pageviews: Allow users to query mediarequests using a file page link - https://phabricator.wikimedia.org/T244712 (10fdans) For some reason the bot is not adding the Patch-for-review tag: https://gerrit.wikimedia.org/r/#/c/analytics/aqs/+/571968/ This change should remove m... [13:54:36] fdans: checking! [13:55:56] elukey: thank you <3 [14:03:25] (03PS18) 10Fdans: Add vue-i18n integration, English strings [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/558702 (https://phabricator.wikimedia.org/T240617) [14:11:20] (03PS19) 10Fdans: Add vue-i18n integration, English strings [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/558702 (https://phabricator.wikimedia.org/T240617) [14:12:48] 10Analytics, 10Analytics-Kanban: Remove transui;ity and banner-impression from refinery-job - https://phabricator.wikimedia.org/T245151 (10JAllemandou) [14:12:57] 10Analytics, 10Analytics-Kanban: Remove transui;ity and banner-impression from refinery-job - https://phabricator.wikimedia.org/T245151 (10JAllemandou) a:03JAllemandou [14:13:06] 10Analytics, 10Analytics-Kanban: Remove transui;ity and banner-impression from refinery-job - https://phabricator.wikimedia.org/T245151 (10JAllemandou) [14:13:42] (03CR) 10Fdans: "Just rebased this change after the couple significant changes to the build" [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/558702 (https://phabricator.wikimedia.org/T240617) (owner: 10Fdans) [14:14:06] 10Analytics, 10Analytics-Kanban: Remove tranquility and banner-impressions streaming from refinery-job - https://phabricator.wikimedia.org/T245151 (10JAllemandou) [14:14:21] (03PS1) 10Joal: Remove BannerImpressions streaming job and deps [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/571981 (https://phabricator.wikimedia.org/T245151) [14:14:42] (03PS5) 10Joal: Add oozie job converting wikidata dumps to parquet [analytics/refinery] - 10https://gerrit.wikimedia.org/r/569836 (https://phabricator.wikimedia.org/T209655) [14:22:36] fdans: sweet, checking that and your i18n, but have some meetings [14:32:48] no rush! 
[14:37:19] (03CR) 10Ottomata: [C: 03+1] Remove BannerImpressions streaming job and deps [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/571981 (https://phabricator.wikimedia.org/T245151) (owner: 10Joal) [14:56:02] (03PS1) 10Fdans: Use different logic for chart axis number formatting from dashboard [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/571991 (https://phabricator.wikimedia.org/T242790) [15:06:20] mforns: i think i remember talking with you abou tthis [15:06:21] but [15:06:34] why doesn't the drop-el-unsanitized-events job not drop the non EventLogggint table data? [15:06:46] i'm reading the command and I dont' see why it wouldnt' [15:07:52] OH because of the underscores??? [15:07:58] ah! [15:07:59] yes [15:08:05] all of the non EL tables have underscores in the nam [15:08:09] and the regex doesn match that!~ [15:08:10] crazy! [15:08:11] :) [15:12:36] Underscores for the stay! [15:16:38] 10Analytics, 10Analytics-Cluster, 10User-Elukey: Upgrade the Hadoop test cluster to BigTop - https://phabricator.wikimedia.org/T244499 (10elukey) To avoid the https://issues.apache.org/jira/browse/YARN-8310 error in Yarn: ` yarn.resourcemanager.recovery.enabled - tr... [15:24:43] ottomata, back, yes :] [15:25:01] PROBLEM - Check the last execution of refinery-drop-apiaction-partitions on an-coord1001 is CRITICAL: NRPE: Command check_check_refinery-drop-apiaction-partitions_status not defined https://wikitech.wikimedia.org/wiki/Analytics/Systems/Managing_systemd_timers [15:26:12] this is andrew cleaning up --^ [15:26:29] when puppet runs on icinga1001 it will go away (we'll not see recovery, but np) [15:30:07] PROBLEM - Check the last execution of refinery-drop-cirrussearchrequestset-partitions on an-coord1001 is CRITICAL: NRPE: Command check_check_refinery-drop-cirrussearchrequestset-partitions_status not defined https://wikitech.wikimedia.org/wiki/Analytics/Systems/Managing_systemd_timers [15:30:12] same thing :) [15:30:46] ok [15:36:44] it made a long time I had not looked at piwik for wikistats, and I must say that 42K unique visitors in January 2020 is a heck of a score :) [15:37:22] for what site? [15:37:26] ah wikistats! [15:37:26] wow! [15:38:06] clearly all due to people pinged by fdans to check the website [15:38:07] :D [15:39:25] joal did you see the change when the link was added to the wikis? [15:41:55] fdans: not sure :-P [15:42:42] joal: half of the test cluster is running 2.8 now :O [15:42:51] \o/ !!!!n [15:43:01] joal: https://piwik.wikimedia.org/index.php?module=CoreHome&action=index&idSite=12&period=day&date=yesterday&updated=1#?idSite=12&period=week&date=2020-02-12&category=General_Visitors&subcategory=General_Overview [15:43:21] yes fdans - I have seen that :) [15:44:17] 10Analytics, 10Better Use Of Data, 10Performance-Team, 10Patch-For-Review, and 2 others: Switch mw.user.sessionId back to session-cookie persistence - https://phabricator.wikimedia.org/T223931 (10mpopov) a:05jlinehan→03mpopov [15:44:18] * joal is very impressed watching elukey climbing a bigtop on its own [15:49:35] 10Analytics, 10Better Use Of Data, 10Performance-Team, 10Patch-For-Review, and 2 others: Switch mw.user.sessionId back to session-cookie persistence - https://phabricator.wikimedia.org/T223931 (10mpopov) @Nuria: Can you please sign-off in the ticket description as the Analytics stakeholder if you agree wit... 
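The underscore discovery above comes down to a naming convention: EventLogging schema tables have CamelCase names, so after Hive lowercases them they contain no underscores, while the newer event tables (mediawiki_api_request, mediawiki_cirrussearch_request) do. A table-name pattern that allows no underscores therefore never selects them, which is why they were never dropped. A rough illustration follows; the actual allowlist regex used by the drop-el-unsanitized-events job is an assumption here, not quoted from its config.

    for t in navigationtiming editattemptstep mediawiki_api_request mediawiki_cirrussearch_request; do
      if [[ "$t" =~ ^[A-Za-z0-9]+$ ]]; then
        echo "$t: matched, would be considered for dropping"
      else
        echo "$t: not matched (contains an underscore), left untouched"
      fi
    done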
[15:49:36] don't jinx it joal :P :P [15:49:59] I had to disable automated recovery for Yarn to make it work, really weird [15:50:16] it seems that we hit https://issues.apache.org/jira/browse/YARN-8310 [15:50:20] fixed in 2.10 of course [15:54:35] if I temp disable it in the RM they bootstrap, and then the next restarts (with the option again to true) work [15:54:42] nodemanagers are more picky [15:56:30] it must be something that it reads from file [16:10:44] Gone to meetup - escrum sent [16:16:55] ah! found a way! it is a dir in /tmp! [16:16:56] * elukey dances [16:17:50] 10Analytics, 10Analytics-Cluster, 10User-Elukey: Upgrade the Hadoop test cluster to BigTop - https://phabricator.wikimedia.org/T244499 (10elukey) On nodemanager it is sufficient to execute `sudo rm -rf /tmp/hadoop-yarn/yarn-nm-recovery/*` to overcome the https://issues.apache.org/jira/browse/YARN-8310 error! [16:18:35] 10Analytics, 10Operations, 10Traffic: Wikipedia Accessibility, check false positives and false negatives of traffic alarms - https://phabricator.wikimedia.org/T245166 (10Nuria) [16:19:26] 10Analytics, 10Product-Analytics: Request for instructions for using DataGrip in the Kerberos paradigm - https://phabricator.wikimedia.org/T245040 (10mpopov) 05Open→03Declined Closing this request since there's nothing left to do here. Thanks all! Looking forward to a Presto+Airpal future :) [16:24:28] 10Analytics, 10Operations, 10Traffic: Wikipedia Accessibility, check false positives and false negatives of traffic alarms - https://phabricator.wikimedia.org/T245166 (10Nuria) We can also change the alarm to hourly and see events caught. Now, this might mean a super large number of false positives so we nee... [16:25:06] 10Analytics, 10Operations, 10Research, 10Traffic: Wikipedia Accessibility, check false positives and false negatives of traffic alarms - https://phabricator.wikimedia.org/T245166 (10Nuria) [16:27:14] 10Analytics, 10Better Use Of Data, 10Performance-Team, 10Patch-For-Review, and 2 others: Switch mw.user.sessionId back to session-cookie persistence - https://phabricator.wikimedia.org/T223931 (10Nuria) Are we using js-based cookies or non-js-based cookies? [16:27:34] 10Analytics, 10Better Use Of Data, 10Performance-Team, 10Patch-For-Review, and 2 others: Switch mw.user.sessionId back to session-cookie persistence - https://phabricator.wikimedia.org/T223931 (10Nuria) Question for @Krinkle mostly [16:27:48] sukhe, here's a link to the code that triggers the alarms: https://github.com/wikimedia/analytics-refinery/blob/master/oozie/data_quality_stats/workflow.xml#L214 [16:27:58] this part is the one that generates the email [16:28:28] please, ask any questions regarding that code :] [16:29:37] thakns mforns! I will look at it later today [16:29:45] k! [16:29:55] 10Analytics, 10Better Use Of Data, 10Performance-Team, 10Patch-For-Review, and 2 others: Switch mw.user.sessionId back to session-cookie persistence - https://phabricator.wikimedia.org/T223931 (10Krinkle) JS-based. `mw.user` is a JavaScript module that runs client-side and generates these IDs by computing... [16:33:50] mforns: is there a way to get refinery-drop-older-than to give me the shaw without doing the full dry-run dir search? [16:34:09] i'm running a dry run, and it seems to be taking forever doing the directory search [16:34:22] the table partitions it selected are correct [16:36:04] ottomata, no... [16:36:39] hm k i guess i'll jus wait [16:36:47] not sure if it is working though... [16:37:04] ottomata, how many partitions does it have? 
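For the record, the YARN-8310 workaround elukey describes (and pastes into T244499) amounts to the following; the systemd unit name is an assumption and the ResourceManager steps are paraphrased from the log rather than copied from the task.

    # ResourceManager: temporarily set yarn.resourcemanager.recovery.enabled=false,
    # restart it once so it bootstraps cleanly, then set the option back to true and restart again.
    # NodeManagers: clear the stale recovery state under /tmp before restarting:
    sudo rm -rf /tmp/hadoop-yarn/yarn-nm-recovery/*
    sudo systemctl restart hadoop-yarn-nodemanager   # assumed unit name for the Bigtop package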
[16:37:37] 2 tables [16:37:46] ~18000 partiions total [16:38:01] ~19000 [16:38:12] 18k should finish in a couple minutes [16:38:27] k ya has been running for 25 [16:38:33] hmm... [16:38:37] will ctrl c try again [16:38:38] to much no? [16:38:45] ok [16:40:55] hmm, mforns this has an extra non dt partition [16:40:57] datacenter [16:40:59] would that be a problem? [16:41:13] ottomata, no I don't think so [16:41:16] ok [16:41:27] ottomata, is it a new data set? [16:41:30] no [16:41:33] ottomata, so you want to pair? [16:41:37] *do [16:42:00] hm, i'm mostly just waiting [16:42:04] i thnk i'm doing things right [16:42:04] ok [16:42:06] (linked you ticket too) [16:42:13] it gets to [16:42:13] 2020-02-13T16:40:30 INFO No directories removed for tree depth 5. [16:42:18] but then nothing else happens [16:43:09] aha, and what does the path-format param look like? [16:43:25] --path-format='^(mediawiki_api_request|mediawiki_cirrussearch_request)(/datacenter=(?P\w+)(/year=(?P[0-9]+)(/month=(?P[0-9]+)(/day=(?P[0-9]+)(/hour=(?P[0-9]+))?)?)?)?)?' [16:43:25] and base-path? [16:43:38] --base-path='/wmf/data/event' [16:44:59] I think you only need to name the time partitions, the datacenter one needs no label in the regex [16:45:51] ok [16:46:28] OH [16:46:30] it finished this time [16:46:32] ok cool [16:46:35] got a sha [16:46:42] cool [16:46:49] will rerun one more time with your recommendation [16:46:55] ottomata, also... [16:47:35] if we split it in 2 jobs, one for mediawiki_api_request and one for mediawiki_cirrussearch_request, it will go faster [16:47:40] hm [16:47:54] wait... [16:47:55] maybe we can do the first purge that way? [16:47:58] and schdule them together? [16:47:58] maybe not [16:48:13] no no I think I'm wrong [16:48:24] k [16:48:31] ottomata, one last thing [16:48:54] I believe you don't want the datacenter=blah directory to be removed by the script [16:49:03] no? why not? [16:49:05] hm [16:49:06] i guess not [16:49:17] how do I avoid that, just not capture it? [16:49:22] so you can make it not optional [16:49:28] ah k [16:50:35] ^(mediawiki_api_request|mediawiki_cirrussearch_request)/datacenter=\w+(/year=(?P[0-9]+)(/month=(?P[0-9]+)(/day=(?P[0-9]+)(/hour=(?P[0-9]+))?)?)?)? [16:51:15] looks good! [16:52:21] 10Analytics, 10Analytics-Cluster, 10User-Elukey: Upgrade the Hadoop test cluster to BigTop - https://phabricator.wikimedia.org/T244499 (10elukey) We have a couple of files deployed via puppet, here's the diff when puppet runs (we'll probably need to adjust them): ` Notice: /Stage[main]/Cdh::Hadoop/File[/usr... [16:52:21] ottomata, maybe try removing the leading ^ [16:53:12] I'm surprised it worked with the leading ^ ottomata :O [16:53:26] OH does it join it to the base path? [16:53:32] yes [16:53:35] AH ha [16:53:37] i think it isn't working [16:53:41] no dirs are found [16:53:52] aaaaah, OK [16:55:06] ah yay working now [16:55:13] this is a great tool [16:55:46] :D [16:57:47] maybe the path-format param should be more extensively documented... [17:32:18] 10Analytics: Add SWAP profile to 1005 - https://phabricator.wikimedia.org/T245179 (10Nuria) [17:38:05] 10Analytics: Add SWAP profile to stat1005 - https://phabricator.wikimedia.org/T245179 (10elukey) [18:32:38] elukey, sorry for not having managed to do the netflow thing today so far, I guess I'll do it now and let you know how it went tomorrow [18:33:18] mforns: in a bit will you pair with me to check on this data drop stuff? [18:33:26] 2 eyes, you know :) [18:33:29] ottomata, sure! [18:33:31] for eyes? 
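Putting the pieces of the exchange above together: the working --path-format drops the leading ^ (the pattern gets joined onto --base-path, so an anchored match can never succeed), keeps the datacenter segment mandatory but unlabeled, and names only the time partitions. The paste above also stripped the group names out of (?P<...>), so restoring them below is an assumption. A quick, hedged way to sanity-check the reconstructed pattern against a sample partition path:

    pattern='(mediawiki_api_request|mediawiki_cirrussearch_request)/datacenter=\w+(/year=(?P<year>[0-9]+)(/month=(?P<month>[0-9]+)(/day=(?P<day>[0-9]+)(/hour=(?P<hour>[0-9]+))?)?)?)?'
    sample='mediawiki_api_request/datacenter=eqiad/year=2020/month=2/day=13/hour=16'
    python3 -c "import re, sys; m = re.match(sys.argv[1], sys.argv[2]); print(m.groupdict() if m else 'NO MATCH')" "$pattern" "$sample"
    # expected: {'year': '2020', 'month': '2', 'day': '13', 'hour': '16'}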
[18:33:33] four* [18:33:39] hehe yea [18:33:42] ping me whenever [18:33:45] k lemme make sure puppet is just what I want, then we can manually run the first purge [18:33:46] together [18:34:19] mforns: yes yes makes sense! [18:34:28] k! [18:34:35] mforns: https://gerrit.wikimedia.org/r/c/operations/puppet/+/572041 [18:34:45] commands look right to you? [18:35:45] lookin [18:36:31] o/ [18:36:32] * elukey off! [18:38:59] 10Analytics, 10Analytics-Cluster: Hadoop Hardware Orders FY2019-2020 - https://phabricator.wikimedia.org/T243521 (10Ottomata) @elukey ...apparently we are not refreshing the leased hardware this year? SRE is just buying the leased servers, and then we will refresh them in two years! [18:39:15] mforns: ready to pair when youu are [18:39:31] ottomata, I'm looking at the code, thinking about one detail... [18:39:35] k [18:39:38] we can discuss it in bc if you want [18:40:28] let's discuss here so i can fix and then we can pair [18:40:36] migiht have to rerun and get sha, could take a bit [18:41:13] mforns: ^ [18:44:03] ottomata, ok one question [18:44:23] did you use double $$ when getting the checksum for the second job? [18:44:30] no [18:44:35] :] [18:44:38] :) [18:45:00] and you sure double backslash is needed? [18:45:10] in \\w+ [18:45:11] puppet was unhappy... [18:45:29] i think it is for puppet to escape the \ [18:45:35] ok ok [18:45:35] i also did not run with that [18:45:48] oh ok [18:46:16] ottomata, yea, I think code looks good! [19:31:26] (03CR) 10Nuria: "+1 merge when ready, thanks for cleaning up" [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/571981 (https://phabricator.wikimedia.org/T245151) (owner: 10Joal) [19:41:11] 10Analytics, 10Product-Analytics: Get "edits hourly" on a daily basis - https://phabricator.wikimedia.org/T231938 (10MMiller_WMF) Hi! I just want to post here and say that since I've been using Turnilo, I definitely want this data to be updated more frequently -- even weekly would be a great improvement. My... [20:01:13] 10Analytics: debianize presto python package so it is available by default - https://phabricator.wikimedia.org/T245194 (10Nuria) [20:03:53] ok mforns review is here: https://gerrit.wikimedia.org/r/c/operations/puppet/+/572041 [20:22:50] ottomata, +1ed, was the last deletion successful? [20:22:59] yup! [20:23:02] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10srishakatux) @mforns thank you for looking into this! I still don't see February data in the reports. Is there a reason for that? [20:23:03] cool [20:23:05] can you read this now? [20:23:05] eventgate-analytics-canary-7477b599f9-jv8rj [20:23:09] oops [20:23:11] https://phabricator.wikimedia.org/T245124 [20:23:27] yes! [20:23:38] ok cool, ya posted the log there [20:23:52] great [20:25:41] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10mforns) @srishakatux no problemo! Yes, the month of February is still not over, so reportupdater will wait until then to calculate the correspond... [20:31:55] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10srishakatux) @mforns OOPS! 
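On the \\w+ question above: the reading in the log is that the doubled backslash is there for Puppet's own string escaping, so that the command the timer ultimately runs contains a single backslash. A tiny illustration with a hypothetical fragment (not the actual content of the patch under review):

    printf 'in the .pp manifest:     %s\n' 'datacenter=\\w+'
    printf 'in the rendered command: %s\n' 'datacenter=\w+'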
I forgot in all these months how this was supposed to work :D [20:33:45] 10Analytics, 10Cloud-Services, 10Developer-Advocacy, 10Patch-For-Review: Create a WMCS edits dashboard via Dashiki - https://phabricator.wikimedia.org/T226663 (10mforns) Hehe, no problem, I'm also constantly checking these in my brain, especially with weekly reports. [20:57:26] 10Analytics, 10Analytics-Kanban, 10serviceops: Create production and canary releases for existent eventgate helmfile services - https://phabricator.wikimedia.org/T245203 (10Ottomata) [21:45:37] (03CR) 10Milimetric: "only one question about the css bundling - it would suck if we end up with N identical copies of CSS files in the dist directory" (032 comments) [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/558702 (https://phabricator.wikimedia.org/T240617) (owner: 10Fdans) [21:46:33] 10Analytics, 10Analytics-Kanban, 10serviceops, 10Patch-For-Review: Create production and canary releases for existent eventgate helmfile services - https://phabricator.wikimedia.org/T245203 (10Ottomata) Just applied [[ https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/572095 | this ]] for eve... [22:18:21] 10Analytics, 10Analytics-Kanban, 10serviceops, 10Patch-For-Review: Create production and canary releases for existent eventgate helmfile services - https://phabricator.wikimedia.org/T245203 (10Ottomata) [[ https://gerrit.wikimedia.org/r/c/operations/deployment-charts/+/572106 | eventgate-main will change ]... [22:25:44] joal: you will be so PROUD , i found BY MYSELF what was going on with the bots stuff... [22:26:39] joal: so much time lost for a TYPO [23:45:31] 10Analytics, 10Multimedia, 10Tool-Pageviews: Allow users to query mediarequests using a file page link - https://phabricator.wikimedia.org/T244712 (10MusikAnimal) > I think an acceptable compromise might be to just report views of media in commons. What do people think? We have the data on local uploads, so...