[06:28:48] good morning
[06:48:02] 10Analytics, 10Analytics-Cluster, 10Analytics-Kanban, 10Patch-For-Review: Repurpose notebook100[3,4] - https://phabricator.wikimedia.org/T256363 (10ops-monitoring-bot) cookbooks.sre.hosts.decommission executed by elukey@cumin1001 for hosts: `an-launcher1001.eqiad.wmnet` - an-launcher1001.eqiad.wmnet (**PAS...
[06:56:13] completely removed an-launcher1001
[06:56:25] (the old vm)
[08:01:08] so I am checking archiva1002, and I don't see the recent artifacts uploaded before the last gerrit upgrade
[08:01:27] it is basically a collection of war files, but index/dir scanning doesn't pick it up
[08:01:32] but I thought it was working
[08:02:23] https://archiva-new.wikimedia.org/#browse~releases/com.googlesource.gerrit doesn't show the 3.2 version
[08:10:16] now it seems to be possible to delete the artifacts db etc. and re-scan everything
[08:10:21] but where is the db? :D
[08:22:53] https://archiva.apache.org/docs/2.2.4/adminguide/databases.html - might be the same as the user database, sigh
[08:45:40] but it seems that https://github.com/apache/archiva/commit/d5c048d09c59b1b1e2b93493fde40e534730682b indicates "no more database to store artifacts info"
[08:48:11] joal: o/, we're close to being able to automate the import of wikidata ttl dumps as a triples dataframe; something I haven't thought about is the cleanup of previously loaded dumps (we'll be importing them weekly)
[08:50:01] I was going to drop everything that is older than 1 month, basically keeping the dataset 4x, but would love to have your input on this (nothing urgent)
[09:30:54] ok so the quickest solution is to re-sync all of archiva1001's state and redo everything (repository group, etc.)
[10:33:32] done! archiva1002 is ready again, should be a perfect sync with 1001
[10:33:53] tried a build from stat1004 (removing all .m2 etc. caches) and it completed fine
[10:34:10] I am going on lunch break, will switch archiva after it
[11:13:23] Hi team
[11:13:43] Hi dcausse - automated dumps conversion is super good news :)
[11:14:16] dcausse: we use a script to delete data on the cluster on a time basis - maybe you can reuse it?
[12:15:15] joal: sure, good idea! thanks!
[12:37:50] dcausse: see, Joseph commented only on your entries, he doesn't like me anymore :D
[12:41:41] heh :)
[12:46:57] :-p elukey
[12:48:42] joal: bonjour :) ok if I switch archiva?
[12:48:57] Hi elukey - Please do!
[12:57:35] 10Analytics, 10Analytics-EventLogging, 10Analytics-Kanban, 10Event-Platform, and 2 others: Vertical: Migrate SearchSatisfaction EventLogging event stream to Event Platform - https://phabricator.wikimedia.org/T249261 (10Ottomata) THIS IS SO SO GREAT! Thank you so much Timo! I'll talk to @EBernhardson and...
[13:02:59] very funny: https://twitter.com/RyanMarcus/status/1277294558850813956
[13:09:54] !log archiva.wikimedia.org migrated to archiva1002
[13:09:55] Logged the message at https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log
[13:10:17] the other archiva is available at https://archiva-old.wikimedia.org/
[13:10:25] will keep it around for a few days
[13:26:06] 10Analytics, 10Analytics-Cluster, 10Analytics-Kanban, 10Patch-For-Review: Move Archiva to Debian Buster - https://phabricator.wikimedia.org/T252767 (10elukey) archiva.wikimedia.org now points to archiva1002, and archiva-old.wikimedia.org points to archiva1001. Will keep the latter around for a couple of da...
[13:30:59] elukey: We'll need to keep an eye on the next jenkins deploy :)
[13:32:19] joal: yep!
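(editor's note: the time-based cleanup joal suggests above is handled by refinery's existing deletion scripts; purely as an illustration of the idea dcausse describes, keeping roughly 4 weekly snapshots and dropping older ones, here is a minimal Scala sketch. The base path and the snapshot=YYYY-MM-DD layout are assumptions for this sketch, not the actual job's configuration.)

```scala
import java.time.LocalDate
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical base path holding weekly wikidata triples snapshots,
// laid out as .../snapshot=YYYY-MM-DD (an assumption for this sketch).
val basePath = new Path("/wmf/data/discovery/wikidata/rdf")
val cutoff   = LocalDate.now().minusMonths(1)

val fs = FileSystem.get(new Configuration())
fs.listStatus(basePath)
  .map(_.getPath)
  .filter(_.getName.startsWith("snapshot="))
  .foreach { p =>
    val snapshotDate = LocalDate.parse(p.getName.stripPrefix("snapshot="))
    if (snapshotDate.isBefore(cutoff)) {
      println(s"dropping $p")
      fs.delete(p, true) // recursive delete of the expired snapshot
    }
  }
```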
[13:32:30] going to add a note to the train's etherpad
[13:33:18] dcausse: quick question - how should I name the events generated from WDQS? wdqs-events? is there an already-used name?
[13:37:56] joal: hm.. the schema is /sparql/query/1.0.0 and the stream is wdqs-external.sparql-query, so wdqs.sparql-query perhaps?
[13:38:14] ack dcausse :)
[13:38:15] we rarely name these events
[13:44:36] dcausse: I hadn't looked in some time - looks like data in hdfs:///wmf/data/event/wdqs_external_sparql_query/datacenter=eqiad stops in May!
[13:44:58] There is data in hdfs up to the end of April, but no more in May
[13:44:58] ouch
[13:45:19] :S
[13:45:33] The refined flag is present, but no data :(
[13:46:01] ottomata: if you're nearby - your help could be needed
[13:46:24] The fact that it stops precisely on May 1st hour 0 is very bizarre
[13:47:01] joal: I see things in e.g. /mnt/hdfs/wmf/data/raw/event/eqiad_wdqs-external_sparql-query/hourly/2020/06/29/01
[13:48:06] There does seem to be data indeed dcausse - the problem must come from something else
[13:48:09] meaning, refine
[13:51:00] dcausse: will investigate with ottomata
[13:51:07] joal: thanks!
[13:52:34] hello!
[13:52:39] am nearby but about to start an interview
[13:53:00] yikes
[13:53:14] that does sound weird and nasty, and the fact that we don't get alerts is weird.
[13:53:51] will look after the interview
[13:54:01] indeed ottomata - Will try to understand more, let's talk once you're done
[14:02:58] ottomata: Webrequest.isWikimediaHost("query.wikidata.org") --> false
[14:07:31] possible idea
[14:07:38] Gone for kids - back in a bit
[14:48:43] 10Analytics, 10Analytics-Kanban, 10User-Ladsgroup: Add shnwiktionary to analytics whitelist - https://phabricator.wikimedia.org/T256013 (10Nintendofan885) 05Open→03Resolved It's now on line 699 of https://gerrit.wikimedia.org/g/analytics/refinery/+/master/static_data/pageview/whitelist/whitelist.tsv
[14:56:26] ottomata: here for a pre-standup?
[14:56:37] joal: in interview
[14:56:40] ack
[14:56:42] later
[15:00:23] a-team will be a few mins late
[15:01:10] fdans: ping
[15:01:13] ?
[15:02:34] sorriiii
[15:10:32] 10Analytics: Create a tool checking for data presence based on file-size - https://phabricator.wikimedia.org/T256644 (10JAllemandou)
[15:10:39] milimetric: --^
[15:21:47] (03CR) 10Joal: "One nit about code coherence - minimal" (031 comment) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[15:37:03] 10Analytics: Create a tool checking for data presence based on file-size - https://phabricator.wikimedia.org/T256644 (10Ottomata) p:05Triage→03High
[15:48:15] 10Analytics-Cluster, 10Analytics-Radar, 10Operations, 10ops-eqiad: Renamed notebook1003 to an-launcher1002 - https://phabricator.wikimedia.org/T256397 (10Ottomata)
[15:49:42] 10Analytics, 10Analytics-Kanban: Setup data-drop for pageview_actor_hourly - https://phabricator.wikimedia.org/T256362 (10Ottomata)
[15:49:44] 10Analytics, 10Analytics-Kanban: Delete pageview_actor_hourly data after 90 days - https://phabricator.wikimedia.org/T256417 (10Ottomata)
[15:52:02] 10Analytics, 10Research, 10WMDE-Analytics-Engineering: Please upgrade R on stat100** servers - https://phabricator.wikimedia.org/T256188 (10Ottomata) So, we won't be explicitly upgrading R. Instead, we will be deploying an anaconda environment ({T251006}) which includes an updated R, but also should allow y...
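(editor's note: T256644 above, filed after this incident, proposes exactly the kind of check that would have caught the empty-but-flagged wdqs partitions. A minimal Scala sketch of the idea follows; the `_REFINED` flag name, the example path, and the size threshold are assumptions for illustration, not the eventual tool's interface.)

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Flag partitions whose total data size is suspiciously small even
// though a refined-flag file is present (as joal observed above).
def looksEmpty(fs: FileSystem, partition: Path, minBytes: Long = 1024L): Boolean = {
  val hasFlag = fs.exists(new Path(partition, "_REFINED")) // flag name assumed
  val size    = fs.getContentSummary(partition).getLength  // bytes under the dir
  hasFlag && size < minBytes
}

val fs = FileSystem.get(new Configuration())
val p  = new Path("/wmf/data/event/wdqs_external_sparql_query/datacenter=eqiad/year=2020/month=5/day=1/hour=0")
if (looksEmpty(fs, p)) println(s"ALERT: $p is flagged refined but holds almost no data")
```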
[15:52:14] 10Analytics, 10Research, 10WMDE-Analytics-Engineering: Please upgrade R on stat100** servers - https://phabricator.wikimedia.org/T256188 (10Ottomata) 05Open→03Declined
[15:53:12] 10Analytics, 10Analytics-Kanban, 10Product-Analytics: Data missing in event_prefupdate in Druid - https://phabricator.wikimedia.org/T256178 (10Ottomata) a:03Milimetric
[15:53:17] 10Analytics, 10Analytics-Kanban, 10Product-Analytics: Data missing in event_prefupdate in Druid - https://phabricator.wikimedia.org/T256178 (10Milimetric) p:05Triage→03High
[15:56:00] 10Analytics, 10Better Use Of Data, 10Product-Analytics: Bug: 'Include Time' option in table visualization produces "0NaN-NaN-NaN NaN:NaN:NaN" - https://phabricator.wikimedia.org/T256136 (10Ottomata) @mpopov weird find. This sounds like an upstream superset bug. Could you make an issue with them? https://g...
[16:03:38] 10Analytics: Check home/HDFS leftovers of nathante - https://phabricator.wikimedia.org/T256356 (10elukey) Precise list of leftovers: ` ====== stat1004 ====== total 24 -rw-r--r-- 1 20110 wikidev 12682 Nov 18 2018 DwellTimeModels.R.r drwxrwxr-x 3 20110 wikidev 4096 Sep 25 2018 R drwxrwxr-x 2 20110 wikidev 409...
[16:04:32] 10Analytics: Check home/HDFS leftovers of nathante - https://phabricator.wikimedia.org/T256356 (10elukey) @Groceryheist anything that you want to preserve?
[16:33:56] (03CR) 10Ottomata: Make JsonStringMessageDecoder search for list of possible camus.message.timestamp.field (031 comment) [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607796 (https://phabricator.wikimedia.org/T256370) (owner: 10Ottomata)
[16:35:04] (03PS3) 10Ottomata: Make JsonStringMessageDecoder search for list of possible camus.message.timestamp.field [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607796 (https://phabricator.wikimedia.org/T256370)
[16:36:22] (03CR) 10Joal: [C: 03+1] "LGTM :)" [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607796 (https://phabricator.wikimedia.org/T256370) (owner: 10Ottomata)
[17:03:03] ottomata: o/
[17:03:10] do you have a minute for a question about netflow?
[17:04:36] ok, I'll write it and we can discuss anytime. Netflow data may need to be augmented with, say, GeoIP lookups for IPs etc. in the future
[17:05:14] if we move it to eventgate, is it possible to do it as an extra refine step, or would it be difficult?
[17:07:15] ottomata: do you have time now for us to understand the issue with events?
[17:16:57] logging off folks, tomorrow I'll log in early in the eu morning and work a bit before going afk
[17:21:59] ottomata: my investigation shows me that only wdqs events are impacted - it's been done manually, I hope I didn't miss any :S
[17:24:30] Going for dinner, back after
[17:48:07] ok great joal, and do you know if it was caused by the isWikimediaHost thing?
[17:48:20] elukey: hi, sorry, was lunching
[17:48:59] hm, yes, if you set the proper field and use the same refine job, Refine will geocode
[17:49:13] the field currently is http.client_ip, but perhaps we'll need to find a different one since these are not http requests!
[17:49:14] hmm
[18:12:17] ottomata: the problem is not from geocoding, but from domain acceptance AFAIU
[18:12:56] ah sorry joal, the geocode bit was an answer to luca's q
[18:13:02] Ah!
[18:13:11] so the problem is
[18:13:11] isWikimediaHost
[18:13:11] ?
[18:13:33] ottomata: I have not investigated further, but it seems very probable - do you remember when you switched config?
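(editor's note: the camus patch CR'd above makes JsonStringMessageDecoder try a list of possible timestamp fields, e.g. "meta.dt,dt". This is an illustrative standalone sketch of that fallback idea using Jackson; the helper name is hypothetical and this is not camus's actual internals.)

```scala
import com.fasterxml.jackson.databind.{JsonNode, ObjectMapper}

val mapper = new ObjectMapper()

// Given candidate timestamp fields in priority order, return the value
// of the first one present in the message ("meta.dt" is treated as a
// nested path, converted to the JSON pointer "/meta/dt").
def firstTimestamp(json: JsonNode, candidateFields: Seq[String]): Option[String] =
  candidateFields.iterator
    .map(f => json.at("/" + f.replace(".", "/")))
    .find(node => !node.isMissingNode && !node.isNull)
    .map(_.asText)

val event = mapper.readTree("""{"meta": {"dt": "2020-06-29T00:00:00Z"}}""")
println(firstTimestamp(event, Seq("meta.dt", "dt"))) // Some(2020-06-29T00:00:00Z)
```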
[18:17:58] joal:
[18:18:01] seems about right
[18:18:05] merged april 27
[18:18:05] https://gerrit.wikimedia.org/r/c/analytics/refinery/source/+/586447
[18:18:11] Right
[18:18:14] https://gerrit.wikimedia.org/r/c/operations/puppet/+/592756/
[18:18:16] april 30 on that one
[18:18:33] indeed
[18:18:50] ya, previously they just did deduplicate_eventbus
[18:18:51] HMMMM
[18:19:22] joal: what do you think the right thing to do is? fix isWikimediaHost?
[18:19:29] or not apply that for non-eventlogging?
[18:19:39] ottomata: Now the other question is: are there other domains we filter that shouldn't be, even if not all events are dropped
[18:20:39] I think it's good to fix isWikimediaHost, but it feels brittle
[18:22:38] ottomata: we wish to update the pageview-def this week, so a refinery-source deploy will be needed
[18:23:23] hm
[18:23:29] Interesting ottomata - We actually explicitly remove query.wikidata from the accepted domains
[18:23:34] oh weird
[18:23:44] perhaps we shouldn't apply this domain filtering at all, even to eventlogging
[18:23:44] nope - query.wikidata is not a pageview!
[18:24:01] e.g. we don't apply this filtering to webrequest
[18:24:04] ottomata: we do that to prevent bots from other domains polluting the data
[18:24:08] only for datasets derived from webrequest
[18:24:21] maybe a tagged field would be better?
[18:24:23] rather than removing them?
[18:24:25] ottomata: I don't think we apply this filtering to webrequest
[18:24:30] we don't
[18:24:35] but we do to pageviews
[18:24:43] i feel like events are more like webrequest
[18:25:02] they are the original source data; for some of them yes the data can be faked, but it could be filtered by users
[18:25:05] users of the data
[18:25:08] right, and refined events are more like pageviews :)
[18:25:09] rather than removing it from the source
[18:25:11] no
[18:25:16] ok
[18:25:17] refined events are like wmf.webrequest
[18:25:38] ok ok - Now we still shouldn't be accepting data for other domains
[18:25:49] i guess so?
[18:26:37] oh joal i misunderstood, you were saying query.wikidata.org should be accepted
[18:26:37] hm
[18:26:49] ok, gonna try to figure out why Refine doesn't choose these
[18:26:55] query.wikidata should be accepted for events, but not for pageviews :)
[18:27:02] joal: can we merge this?
[18:27:03] https://gerrit.wikimedia.org/r/c/analytics/refinery/source/+/607788
[18:27:32] So far we use the same function in both places
[18:27:39] (03PS2) 10Ottomata: [WIP] Overloaded methods to make working with default Refine related classes easier [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/607788
[18:28:17] right, but something else probably ends up filtering out query.wikidata for pageview counting
[18:28:31] isWikimediaHost should return true for that, right?
[18:29:15] ottomata: the function should be associated with Pageview, not Webrequest
[18:29:28] As the filter is closely bound to pageviews
[18:29:30] ?
[18:29:51] why? i mean it is a pretty generic function, so maybe it shouldn't be in webrequest
[18:29:52] but
[18:29:56] isWikimediaHost could be used for anything
[18:29:57] no?
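(editor's note: to make the debate above concrete, here is a purely hypothetical Scala illustration of the two behaviors being discussed, not the refinery-source code: a strict pageview-oriented check that excludes query./test. subdomains (as joal explains below, "query and test are taken out") versus a looser check suitable for refined events. The domain list and both function names are assumptions.)

```scala
val wikimediaDomains = Set("wikipedia.org", "wikidata.org", "wikimedia.org")

// Loose check: any host under a known Wikimedia domain is accepted.
def isWikimediaHostLoose(host: String): Boolean =
  wikimediaDomains.exists(d => host == d || host.endsWith("." + d))

// Strict pageview-oriented check: additionally drop query./test. hosts,
// which can never be pageviews.
def isWikimediaPageviewHost(host: String): Boolean =
  isWikimediaHostLoose(host) && !host.startsWith("query.") && !host.startsWith("test.")

isWikimediaHostLoose("query.wikidata.org")    // true  -- fine for refined events
isWikimediaPageviewHost("query.wikidata.org") // false -- excluded from pageviews
```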
[18:30:27] ottomata: the logic currently used should be bound to Pageview, and another logic with the isWikimediaHost name should be used in webrequest
[18:30:42] ottomata: I have not reviewed your patch yet
[18:30:46] k
[18:30:55] oh, and i still have WIP on there
[18:34:13] (03PS2) 10Bearloga: Label mobile-html endpoint requests as app pageviews [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514)
[18:34:36] (03CR) 10Bearloga: Label mobile-html endpoint requests as app pageviews (031 comment) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:37:40] (03CR) 10Joal: [C: 03+1] "LGTM - Thanks for the cleaning bearloga :)" [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:38:05] bearloga: Heya - Your patch is ready to be merged for me - Let me have milimetric or ottomata have a look at it, and then it's in
[18:41:03] (03CR) 10Ottomata: [C: 03+1] Label mobile-html endpoint requests as app pageviews [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:43:14] (03CR) 10Joewalsh: [C: 03+1] Label mobile-html endpoint requests as app pageviews [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:43:26] (03CR) 10Dbrant: [C: 03+1] Label mobile-html endpoint requests as app pageviews [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:46:18] wow bearloga - that's a lot of +1s :) Merging
[18:46:40] (03CR) 10Joal: [C: 03+2] "Merging for deploy this week" [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:46:58] joal: haha! :D thanks!
[18:47:53] 10Analytics, 10Analytics-Kanban, 10Patch-For-Review, 10Product-Analytics (Kanban): PageviewDefinition should detect /api/rest_v1/page/mobile-html requests as pageviews - https://phabricator.wikimedia.org/T256514 (10JAllemandou)
[18:48:39] joal
[18:48:42] scala> Webrequest.isWikimediaHost("query.wikidata.org")
[18:48:42] res7: Boolean = false
[18:48:54] Indeed - I tested that
[18:48:59] oh
[18:49:05] i thought you were saying that should be true
[18:49:18] this function should be named: isWikimediaPageviewHost
[18:49:20] OHHHHHH
[18:49:24] query and test are taken out
[18:49:26] I GET IT NOW
[18:49:29] i see
[18:49:30] ok
[18:49:31] :)
[18:49:32] weird
[18:49:52] and we should have Webrequest.isWikimediaHost be a less strict version of it
[18:50:09] so there is an explicit whitelist for refine's filter_allowed_domains
[18:50:12] but that feels brittle
[18:50:18] yeah i see joal, that makes the most sense
[18:50:48] joal: there are a lot of 'localhost' queries
[18:50:51] the advantage of doing it this way is, we keep the Webrequest function name for refine, and update the pageview one (only refinery-source), while moving functions to better places
[18:50:59] in this sparql query data
[18:51:10] hm?
[18:51:14] (03Merged) 10jenkins-bot: Label mobile-html endpoint requests as app pageviews [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608100 (https://phabricator.wikimedia.org/T256514) (owner: 10Bearloga)
[18:51:22] like the domain being 'localhost' ?
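(editor's note: a quick per-domain breakdown makes stray values like 'localhost' easy to spot in a session like the one below. This assumes `pdf.df` is the refined sparql-query DataFrame ottomata is inspecting in his spark-shell; that setup is an assumption, not shown in the log.)

```scala
import org.apache.spark.sql.functions.desc

// Count events per meta.domain, largest first; unexpected values like
// 'localhost' stand out immediately.
pdf.df
  .groupBy("meta.domain")
  .count()
  .orderBy(desc("count"))
  .show(20, false)
```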
[18:51:35] hm - possibly for the internal cluster?
[18:51:36] yes
[18:51:41] MEH
[18:51:41] in the external data
[18:51:47] 10Analytics, 10Product-Analytics: Re-process webrequests from 2020-05-18 so that page views from latest Wikipedia app releases are counted - https://phabricator.wikimedia.org/T256516 (10mpopov) In our sync with AE we did talk about the volume of data and computational + labor/time costs of such an undertaking....
[18:51:48] hm, weird
[18:51:50] this is for 06-01T00
[18:51:53] pdf.df.count
[18:51:53] res4: Long = 192277
[18:52:02] scala> pdf.df.where("meta.domain = 'localhost'").count
[18:52:02] res9: Long = 9988
[18:52:06] wow
[18:52:28] dcausse: if you're still working - any idea about the above? --^
[18:53:19] they all seem to have
[18:53:26] | query|format|params|
+----------------+------+------+
| ASK{ ?x ?y ?z }| null| []|
[18:54:05] maybe it's monitoring data?
[18:54:11] possible
[18:54:21] it looks pretty useless
[18:54:26] but it begs the question
[18:54:29] is filtering the right thing to do?
[18:54:32] ottomata: I'd rather double check with Search folks
[18:54:37] indeed
[18:54:37] almost certainly not for internally generated data
[18:54:53] i might argue not even for externally generated data
[18:54:55] not sure.
[18:55:05] i think maybe adding a field indicating that it is not a wikimedia domain
[18:55:05] is more the right thing to do
[18:55:08] Right - And then back to the question - is a single config for all the correct approach?
[18:55:17] just adding that during refinement
[18:55:27] makes sense
[18:55:32] is_from_wikimedia_domain
[18:55:33] or something
[18:55:42] that is different from what EL refine does
[18:55:47] yarhg
[18:55:48] yup
[18:55:54] I was thinking that as well
[18:56:00] and probably it would mess up some people's metrics if they expect not to have trash now
[18:56:19] I think it would
[18:56:30] eeef
[18:56:36] tricky :(
[18:56:47] might have to add another annoying conditional in the function
[18:57:02] yarhhhhg, but no, transform funcs don't know
[18:57:04] hmMmM
[18:57:04] we
[18:57:21] we'd have to make refine detect 'if eventlogging_ source' and configure different transform functions
[18:57:35] heh, or use a different job
[18:57:37] which i really don't like
[18:57:42] yargh
[18:57:51] probably should do that for now
[18:57:53] there are still two jobs
[18:57:59] use filter_allowed_domains only for el data
[18:58:11] we won't be able to differentiate by topic name in the future though
[18:58:28] so i guess we can rely on that for now to apply filter_allowed_domains
[18:58:34] but for all data also add a column
[18:59:50] for now:
[18:59:58] remove filter_allowed_domains from event_transforms
[19:00:12] and then configure the EL-specific refine job to do event_transforms + filter_allowed_domains
[19:00:23] works for me
[19:00:43] and add a task to add a new column to id non-wikimedia domains
[19:00:49] for new event stuff, rather than filtering
[19:00:51] ahead of time
[19:01:18] ottomata: I really don't know about that filtering
[19:01:32] well, we can make a task and discuss with nuria
[19:01:34] We shouldn't accept events not from "trusted" domains
[19:01:37] yup
[19:02:04] for the moment the quick fix is to make it different between internally-generated and externally-generated
[19:03:58] also ottomata: reviewing your patch now
[19:04:53] ty
[19:04:55] (03PS1) 10Ottomata: Remove filter_allowed_domains from common event_transforms [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608442
[19:05:49] (03CR) 10Joal: [C: 03+2] "LGTM" [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608442 (owner: 10Ottomata)
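(editor's note: the "tag, don't drop" idea ottomata and joal settle on above, tracked in T256677 below, could look roughly like this as a Refine-style transform. The column name echoes the chat's `is_from_wikimedia_domain` suggestion; the domain list and function names are assumptions, not the eventual implementation.)

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.{col, lit, udf}

// Keep every event and add a flag column that downstream users can
// filter on, instead of silently dropping rows during refinement.
val allowedDomains = Set("wikipedia.org", "wikidata.org", "wikimedia.org", "mediawiki.org")

val isWikimediaDomain = udf { domain: String =>
  domain != null && allowedDomains.exists(d => domain == d || domain.endsWith("." + d))
}

def tagWikimediaDomain(df: DataFrame): DataFrame =
  if (df.columns.contains("meta"))
    df.withColumn("is_from_wikimedia_domain", isWikimediaDomain(col("meta.domain")))
  else
    df.withColumn("is_from_wikimedia_domain", lit(null).cast("boolean"))

// Usage: tagWikimediaDomain(refinedDf).where("is_from_wikimedia_domain") ...
```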
[19:07:50] 10Analytics: Update refinery-core Webrequest.isWikimediaHost - https://phabricator.wikimedia.org/T256674 (10JAllemandou)
[19:07:57] ottomata: --^
[19:10:03] (03Merged) 10jenkins-bot: Remove filter_allowed_domains from common event_transforms [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/608442 (owner: 10Ottomata)
[19:14:02] (03CR) 10Joal: "Some questions and comments" (035 comments) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/607788 (owner: 10Ottomata)
[19:15:11] 10Analytics, 10Event-Platform: Refine should add field to indicate if event is from wikimedia domain instead of filtering - https://phabricator.wikimedia.org/T256677 (10Ottomata)
[19:15:15] joal: ^
[19:15:16] :)
[19:16:22] 10Analytics, 10Event-Platform: Refine should add field to indicate if event is from wikimedia domain instead of filtering - https://phabricator.wikimedia.org/T256677 (10JAllemandou) :)
[19:19:15] (03PS1) 10Joal: Update webrequest hive jar version for pageview-def [analytics/refinery] - 10https://gerrit.wikimedia.org/r/608447 (https://phabricator.wikimedia.org/T256514)
[19:21:06] ottomata: The patch I just merged means we need a deploy to start refining WDQS events, right?
[19:21:27] hmmmmmm
[19:21:40] ottomata: also, that patch needs its companion puppet change that applies the filtering function for EL, IIUC
[19:21:42] yes, but i guess we could temporarily list out all the transform functions
[19:21:48] instead of using the common event_transforms
[19:21:55] right
[19:22:07] ottomata: I plan on deploying tomorrow for pageviews anyway
[19:22:14] joal: ya
[19:22:14] https://gerrit.wikimedia.org/r/c/operations/puppet/+/608443
[19:22:23] ok cool, let's wait til tomorrow
[19:22:34] ack!
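(editor's note: the plan above, common event_transforms for all streams with filter_allowed_domains appended only for the eventlogging job, amounts to composing DataFrame transforms. A minimal illustration follows; names echo the chat, but the real Refine job wires transforms up via configuration, not like this.)

```scala
import org.apache.spark.sql.DataFrame

// Transforms are just DataFrame => DataFrame functions, applied in order.
type Transform = DataFrame => DataFrame

def applyTransforms(df: DataFrame, transforms: Seq[Transform]): DataFrame =
  transforms.foldLeft(df)((acc, t) => t(acc))

val deduplicate: Transform          = df => df.dropDuplicates()
val filterAllowedDomains: Transform = df => df // placeholder for the real domain filter

val eventTransforms: Seq[Transform] = Seq(deduplicate /*, geocodeIp, ... */)

// All event streams get the common transforms:
//   applyTransforms(df, eventTransforms)
// while the EL-specific job appends the domain filter:
//   applyTransforms(df, eventTransforms :+ filterAllowedDomains)
```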
[19:22:42] tomorrow early morning if ok for you
[19:22:54] k great
[19:23:11] (03CR) 10Ottomata: [WIP] Overloaded methods to make working with default Refine related classes easier (033 comments) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/607788 (owner: 10Ottomata)
[19:28:00] (03CR) 10Joal: [WIP] Overloaded methods to make working with default Refine related classes easier (034 comments) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/607788 (owner: 10Ottomata)
[19:47:15] (03CR) 10Ottomata: [WIP] Overloaded methods to make working with default Refine related classes easier (032 comments) [analytics/refinery/source] - 10https://gerrit.wikimedia.org/r/607788 (owner: 10Ottomata)
[20:06:13] Gone for tonight - See you folks
[20:11:40] lattrrrzzz
[20:16:41] (03CR) 10Ottomata: "Just tested with the webrequest_upload existing config, as well as with a couple of eventlogging topics with meta.dt,dt and it works as expect" [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607796 (https://phabricator.wikimedia.org/T256370) (owner: 10Ottomata)
[20:16:54] (03CR) 10Ottomata: [V: 03+2 C: 03+2] Improve log messages in KafkaReader to include topic and partition [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607795 (owner: 10Ottomata)
[20:17:22] (03CR) 10Ottomata: [V: 03+2 C: 03+2] Make JsonStringMessageDecoder search for list of possible camus.message.timestamp.field [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/607796 (https://phabricator.wikimedia.org/T256370) (owner: 10Ottomata)
[20:28:10] 10Analytics, 10Commons, 10Epic: Provide download statistics of files on Wikimedia Commons - https://phabricator.wikimedia.org/T218076 (10Ramsey-WMF) @Milimetric picking this up again. We want to measure "Save As..." ideally, but in previous discussions (it's been over a year, fuzzy) we were told it was hard...
[20:33:49] (03PS1) 10Ottomata: Rename avro-repo-bundle-1.7.4-SNAPSHOT to avro-repo-bundle-1.7.4 for mvn release [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457
[20:38:11] (03PS2) 10Ottomata: Rename avro-repo-bundle-1.7.4-SNAPSHOT to avro-repo-bundle-1.7.4 for mvn release [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457
[20:39:15] (03Abandoned) 10Ottomata: Rename avro-repo-bundle-1.7.4-SNAPSHOT to avro-repo-bundle-1.7.4 for mvn release [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457 (owner: 10Ottomata)
[20:39:26] (03Restored) 10Ottomata: Rename avro-repo-bundle-1.7.4-SNAPSHOT to avro-repo-bundle-1.7.4 for mvn release [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457 (owner: 10Ottomata)
[20:39:47] (03Abandoned) 10Ottomata: Rename avro-repo-bundle-1.7.4-SNAPSHOT to avro-repo-bundle-1.7.4 for mvn release [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457 (owner: 10Ottomata)
[20:40:34] (03CR) 10Ottomata: "Accidentally force pushed this one here: https://gerrit.wikimedia.org/r/plugins/gitiles/analytics/camus/+/10311cc83d8611b2fca9527d15485b35" [analytics/camus] (wmf) - 10https://gerrit.wikimedia.org/r/608457 (owner: 10Ottomata)
[20:55:57] (03PS1) 10Ottomata: Deploy camus-wmf-0.1.0-wmf10.jar [analytics/refinery] - 10https://gerrit.wikimedia.org/r/608460 (https://phabricator.wikimedia.org/T256370)
[22:01:37] 10Analytics, 10Commons, 10Epic: Provide download statistics of files on Wikimedia Commons - https://phabricator.wikimedia.org/T218076 (10Milimetric) Yeah, I'm not sure, it depends on exactly what you're trying to accomplish with the metric.
I'm not sure if you saw in the meantime that we exposed API endpoin...
[22:06:08] 10Analytics, 10Analytics-EventLogging, 10QuickSurveys, 10WMDE-Technical-Wishes-Team: QuickSurveys should show an error when response is blocked - https://phabricator.wikimedia.org/T256463 (10Milimetric) Hm, I dug in a little bit and see that sendBeacon might return "false" for adblock exceptions, at least...
[23:16:10] 10Analytics, 10Analytics-EventLogging, 10QuickSurveys, 10WMDE-Technical-Wishes-Team, 10Readers-Web-Backlog (Tracking): QuickSurveys should show an error when response is blocked - https://phabricator.wikimedia.org/T256463 (10nray)
[23:43:25] 10Analytics, 10VPS-project-codesearch, 10Patch-For-Review: Add analytics/* gerrit repos to code search - https://phabricator.wikimedia.org/T249318 (10Legoktm) https://codesearch.wmflabs.org/analytics/ is live now, except... uBlock Origin blocks `analytics/js` by default, so it doesn't work unless you disable...