[03:03:32] (03CR) 10Nuria: [C: 04-1] Replace numeral with numbro and fix bytes formatting (031 comment) [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/585725 (https://phabricator.wikimedia.org/T199386) (owner: 10Fdans)
[05:38:59] (03PS7) 10Fdans: Replace numeral with numbro and fix bytes formatting [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/585725 (https://phabricator.wikimedia.org/T199386)
[05:39:14] (03CR) 10Fdans: Replace numeral with numbro and fix bytes formatting (031 comment) [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/585725 (https://phabricator.wikimedia.org/T199386) (owner: 10Fdans)
[05:41:30] 10Analytics, 10Dumps-Generation: Document missing project types in pagecount dumps - https://phabricator.wikimedia.org/T249984 (10fdans) cc @Nuria and @JAllemandou
[05:48:53] (03PS8) 10Fdans: Replace numeral with numbro and fix bytes formatting [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/585725 (https://phabricator.wikimedia.org/T199386)
[06:37:40] goood morning
[06:37:44] very interesting:
[06:37:45] WARN RefineTarget: hdfs://analytics-hadoop/wmf/data/raw/eventlogging/eventlogging_SearchSatisfaction/hourly/2020/04/29/19 -> `event`.`SearchSatisfaction` (year=2020,month=4,day=29,hour=19) previously failed refinement and does not have new data since the last refine at 2020-04-29T20:11:10.128Z, skipping.
[06:50:45] interesting, I had to remove the REFINED flag to make it work
[07:14:24] !log correct X-Forwarded-Proto for superset (http -> https) and restart it
[07:14:25] Logged the message at https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log
[07:22:25] 10Analytics, 10User-Elukey: Secure Hue/Superset/Turnilo with CAS (and possibly 2FA) - https://phabricator.wikimedia.org/T159584 (10elukey) The `X-Remote-User` code is added in httpd conf via `RequestHeader set X-Remote-User expr=%{REMOTE_USER}`, so mod_cas should be fully compatible with Superset.
[07:47:28] (03PS3) 10Fdans: Fix language dropdown for ios devices [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/589606 (https://phabricator.wikimedia.org/T246971)
[09:06:14] addshore: hola, around_
[09:06:16] ?
[09:06:40] fdans: if you have a moment, I still have that JS issue :(
[09:07:11] elukey: bc?
[09:08:35] sure!
[09:08:39] hola!
[09:08:52] dam you pinged me the other day and I didnt read it *scrolls up*
[09:09:17] addshore: np! Sorry for the hassle :(
[09:09:23] the TL;DR is https://phabricator.wikimedia.org/T249754#6092542
[09:09:23] https://phabricator.wikimedia.org/T119070 right? :D
[09:10:05] addshore: yes correct!
[09:11:00] right now we do still use it
[09:11:26] it currently powers https://grafana.wikimedia.org/d/000000264/wikidata-dump-downloads?orgId=1&refresh=5m&from=now-90d&to=now
[09:11:58] currently via https://github.com/wikimedia/analytics-wmde-scripts/blob/master/src/wikidata/dumpDownloads.php
[09:12:14] would hdfs -> report updater work?
[09:12:21] (we havn't really used report updater yet)
[09:12:34] 10Analytics, 10Analytics-Kanban: Unify stat1007 puppet role with the rest of the stats cluster - https://phabricator.wikimedia.org/T249754 (10Addshore) 10:10 AM right now we do still use it 10:11 AM it currently powers https://grafana.wikimedia.org/d/000000264/wikidata-dump-downloads?orgI...
[09:19:19] addshore: yeah I think that we could make it work, but I'll cut a task for that.. for the moment I'll just move the rsync rules without dropping
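(On the RefineTarget warning at 06:37 above: Refine skips a target that previously failed and has no new raw data, and the workaround mentioned at 06:50 was to remove the refine done-flag by hand so the partition gets picked up again. The sketch below is only a hypothetical illustration of that workaround, not refinery code; the flag file name `_REFINED`, the partition path layout, and the presence of the `hdfs` CLI on the host are all assumptions.)

```python
import subprocess

def force_rerefine(partition_path, flag_name="_REFINED"):
    """Remove the refine done-flag so the next Refine run re-processes the partition.

    Hypothetical helper: the flag name and path layout are assumptions and should
    be checked against the actual Refine job configuration before use.
    """
    flag_path = f"{partition_path.rstrip('/')}/{flag_name}"
    # -f so a missing flag is not treated as an error
    subprocess.run(["hdfs", "dfs", "-rm", "-f", flag_path], check=True)

# Example partition path (assumed layout, for illustration only):
force_rerefine(
    "hdfs://analytics-hadoop/wmf/data/event/SearchSatisfaction/year=2020/month=4/day=29/hour=19"
)
```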
[09:19:29] and then we'll think about a better refactor
[09:19:33] thanks :)
[09:24:24] brb
[09:31:11] 10Analytics, 10Dumps-Generation: Document missing project types in pagecount dumps - https://phabricator.wikimedia.org/T249984 (10fdans) A couple of years ago @Akeron made this nice guide, which contains most of what I explained above: https://meta.wikimedia.org/wiki/Learning_patterns/Tips_for_reading_project...
[09:44:32] going errand for hopefully ~30 mins, need to buy some stuff for my parents, if needed call me on the phone :)
[09:52:44] joal: whenever you have a second can we talk in the bc? :)
[09:55:35] 10Analytics, 10Analytics-Kanban, 10Tool-Pageviews: Image files with quotes do not resolve on the mediarequest API - https://phabricator.wikimedia.org/T247333 (10fdans) Confirming that URLs with punctuation now work correctly. Thank you so much @JAllemandou for deploying and checking :) https://wikimedia.org...
[10:12:42] elukey: great!
[10:12:55] if you want to change the location of the rsync too that would be fine, will be an easy change from our side
[10:17:22] 10Analytics, 10Analytics-Kanban, 10Research: Proposed adjustment to wmf.wikidata_item_page_link to better handle page moves - https://phabricator.wikimedia.org/T249773 (10Milimetric) @Isaac btw regarding your query troubles before, I should've said but I run my queries in the superset SQL lab, with presto, a...
[10:20:14] addshore: one solution to avoid opening ports etc.. is to hdfs-rsync (now available, it was not at the time) the nginx logs to hdfs, then a cron on stat1007 pulls them
[10:36:20] !log run superset init to add missing perms on an-tool1005 and analytics-tool1004 - T249681
[10:36:22] Logged the message at https://www.mediawiki.org/wiki/Analytics/Server_Admin_Log
[10:36:23] T249681: Experiment with Druid and SqlAlchemy - https://phabricator.wikimedia.org/T249681
[10:39:17] fdans: I solved the heisenbug, turns out superset init needs to be run on every upgrade
[10:39:24] otherwise some perms are not set in tables etc..
[10:40:02] so the 401 is solved, but it seems that moving to druid tables on the current version (0.35.2) is not possible changing charts by hand, on 0.36.0 seems possible
[10:40:05] sigh
[10:53:40] 10Analytics-Kanban, 10Better Use Of Data, 10Product-Analytics: Experiment with Druid and SqlAlchemy - https://phabricator.wikimedia.org/T249681 (10elukey) Discoveries made today: * `superset init` needs to be run on every upgrade, otherwise some permissions may not be created on the db tables related to new...
[10:54:43] ok mistery solved, lunch!
[10:55:25] * RhinosF1 wonders how it got to lunch time already
[11:00:12] elukey: i just updated nodejs on an-tool1007.eqiad.wmnet is it safe to restart turnilo
[11:36:46] jbond42: yep
[11:37:12] thanks
[11:54:30] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10Reedy)
[12:09:36] Hi fdans - here I am !
[12:11:26] I've been fighting with my computer over an upgrade that broke me :(
[12:13:54] joal: were you updating your firmware via laptop? :D
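(On the Superset permissions issue above, 10:36–10:53: the discovery was that `superset init` has to be run after every upgrade so permissions for new views get created, otherwise users hit 401s. Below is a minimal sketch of what such an upgrade sequence could look like; it is not the actual puppet/deployment tooling, and the exact command list, including running `superset db upgrade` before `superset init`, is an assumption based on standard Superset practice.)

```python
import subprocess

# Hypothetical upgrade helper: apply DB migrations first, then `superset init`
# so role/permission rows for any new views get (re)created. The command list
# is an assumption; the real deployment tooling may differ.
UPGRADE_STEPS = [
    ["superset", "db", "upgrade"],   # apply schema migrations for the new version
    ["superset", "init"],            # re-create default roles and permissions
]

def upgrade_superset():
    for step in UPGRADE_STEPS:
        print("running:", " ".join(step))
        subprocess.run(step, check=True)

if __name__ == "__main__":
    upgrade_superset()
```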
[12:14:07] elukey: :-P
[12:14:40] elukey: a dist-upgrade broke my graphics-driver, and I had issue reinstalling it
[12:14:55] I had to force uninstall all rlated packages, then reinstall worked
[12:15:19] possibly a usual thing for people used to play with packages, for it was not intuitive (it usually works great)
[12:16:12] :(
[12:16:22] are you using sid or stable?
[12:17:11] buster
[12:17:30] * hashar throws a Windows 10 licence at joal
[12:17:42] ahahahah
[12:17:56] * joal throws a free license to hashar
[12:17:59] :)
[12:18:02] stop using broken software written by random nerds around the world!
[12:19:51] anyway, more seriously, I have deleted the old refinery-jar-updater jenkins job
[12:20:19] that's great hashar - We're full-docker in CI now - I guess you can also drop the old VMs
[12:22:57] yeah that is the idea ;)
[12:23:43] meanwhile gehel had some interesting tip to override the scm developerConnection URL
[12:25:05] yup, saw the CR
[12:25:21] so I guess I will change the job to do that
[12:25:27] ack hashar
[12:25:44] and potentially we can drop the property I have added to the refiniery pom.xml
[12:34:03] (03CR) 10Joal: "My convern with the "more generic" approach is drop data that shouldn't be. This concern is premature given we prefilter the data used as " [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593246 (https://phabricator.wikimedia.org/T244597) (owner: 10Lex Nasser)
[12:34:16] * gehel is happy to spend more time being the consultant on anything Maven related :)
[12:35:37] \o/ thanks gehel :)
[12:38:51] joal: bc for a minute?
[12:38:55] sure fdans
[12:40:26] gehel: joal: would you mind double checking the -DdeveloperConnectionUrl override? https://gerrit.wikimedia.org/r/#/c/integration/config/+/593300/
[12:40:36] will do it for the refinery job after
[12:41:18] looking
[12:42:24] hashar: looks ok, but I have no idea actually :)
[12:42:44] I dont either but can try it out live hehe
[12:43:53] looks OK to me, but not sure how to test it
[12:44:22] i will try it live. Thanks!
[12:54:21] 10Analytics, 10LDAP-Access-Requests, 10Operations: LDAP access to the wmf group for Antonino Hemmer (superset, turnilo, hue) - https://phabricator.wikimedia.org/T251123 (10DZierten) Approved thank you!
[12:55:42] gehel: turns out it does not work. When passing -DdeveloperConnectionUrl=scm:git:https://maven-release-user@gerrit.wikimedia.org/r/wikidata/query/rdf , it still tries to use ssh bah
[12:56:00] 00:05:54.779 [INFO] Executing: /bin/sh -c cd /src && git push ssh://gerrit.wikimedia.org:29418/wikidata/query/rdf refs/heads/master:refs/heads/master
[12:57:38] maybe that user property has to be defined in the pom.xml
[13:00:48] do you have the full log somewhere?
[13:01:22] I think that Maven forks during the release process, so it might not have that property in the forked execution
[13:01:24] lemme check
[13:12:10] I'm finding conflicting documentation (or at least I don't understand it)
[13:12:27] https://maven.apache.org/maven-release/maven-release-plugin/perform-mojo.html#connectionUrl -> looks like -DconnectionUrl could be used
[13:14:16] but that might be only about the initial checkout
[13:17:08] in the end, the best course of action is probably to replace the developerConnection in the pom with the HTTPS version
[13:17:38] it is only used during release. If we move releases to jenkins, there is no need to override it for individual developers
[13:18:01] also, username probably does not need to be in the pom
[13:18:22] this might need more time than I have to get to the bottom
[13:18:26] gehel: I am going to try with -DconnectionUrl ;)
[13:18:47] that hack with a property might be the easier way in the short term, even if it hurts my sensibilities
[13:18:54] hehe
[13:19:00] zpapierski or dcausse might have experience with all that as well
[13:20:23] actually... no :P
[13:22:44] I need to setup a test project to experiment
[13:31:20] gehel: that does not work. So I guess we can roll back to the hack I have made for refinery ;)
[13:31:42] hashar: sadly, that sounds like the best short term option :/
[13:31:58] I'll try to find some time to play a bit more with this. There has to be a better solution!
[13:32:26] we could also have the developerConnection use HTTPS in the pom, so there should be no need to override it
[13:34:14] I have restored my change : https://gerrit.wikimedia.org/r/#/c/wikidata/query/rdf/+/593298/
[13:34:33] the thing is developer would push over ssh
[13:34:54] so it is better to have ssh to be the default in case folks want to release from their local machine
[13:41:12] 10Analytics, 10Analytics-Kanban: Unify stat1007 puppet role with the rest of the stats cluster - https://phabricator.wikimedia.org/T249754 (10Ottomata) > and change the consumers of the data to pull from it. @addshore if these files are small enough (haven't checked), you might even be able to run your script...
[13:42:40] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10Ottomata) Where are you getting this error? In MW Vagrant?
[13:43:53] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10abi_) On the beta cluster - https://logstash-beta.wmflabs.org/app/kibana#/doc/logstash-*/logstash-2020.04.30/eventgate-logging?id=AXHKt8f7Cmr1...
[13:53:29] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10Ottomata) That's a strange error, I'm not sure where '../lib/factories/wikimedia-eventgate' comes from, that isn't the right configuration....
[14:08:12] joal: I am testing the new druid exporter in hadoop test, seems working!
[14:08:20] all metrics completely defined in json
[14:10:18] mforns: if you find time, would appreciate a review of https://gerrit.wikimedia.org/r/c/analytics/refinery/+/593047
[14:12:36] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10abi_) @Ottomata - Yup, the jobs seem to be executing again! Thanks for you help! Feel free to resolve this task if there is nothing further to...
[14:16:00] 10Analytics, 10Dumps-Generation: page_restrictions field incomplete in current and historical dumps - https://phabricator.wikimedia.org/T251411 (10ArielGlenn) ` wikiadmin@10.64.32.76(enwiki)> select * from page_restrictions where pr_page = 13856248; +----------+---------+----------------+------------+---------...
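(Regarding the new Druid exporter mentioned at 14:08, where "all metrics completely defined in json": the sketch below only illustrates that general pattern, a JSON file mapping emitted Druid metric names to exporter metric names so new metrics need no code changes. It is not the actual wikimedia druid exporter code, and the file format and metric names are invented for the example.)

```python
import json

# Hypothetical metric-definition file: maps Druid emitted metric names to the
# exporter's metric names and types. The format is invented for illustration.
EXAMPLE_CONFIG = """
{
  "query/time": {"name": "druid_query_time_ms", "type": "histogram"},
  "segment/count": {"name": "druid_segment_count", "type": "gauge"}
}
"""

def load_metric_definitions(raw_json):
    """Parse the JSON config so new metrics can be added without touching code."""
    definitions = json.loads(raw_json)
    for druid_name, spec in definitions.items():
        print(f"{druid_name} -> {spec['name']} ({spec['type']})")
    return definitions

load_metric_definitions(EXAMPLE_CONFIG)
```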
[14:16:08] 10Analytics, 10Beta-Cluster-Infrastructure, 10Event-Platform: Cannot find module '../lib/factories/wikimedia-eventgate' - https://phabricator.wikimedia.org/T251510 (10Ottomata) 05Open→03Resolved a:03Ottomata
[14:35:14] elukey: this is super great :)
[14:35:20] elukey: shall we plan on bumping druid?
[14:35:53] Hi dcausse - would ou have a minute for me?
[14:36:02] joal: sure
[14:36:41] dcausse: I'm trying to compile latest version of rdf-spark-tools, and I'm hitting an error :(
[14:36:51] meh :(
[14:37:12] joal: can you paste it somewhere?
[14:37:26] the project builds fine in jenkins
[14:37:26] sure dcause
[14:38:15] dcausse: https://gist.github.com/jobar/59a0b5948f62644c1fb7097b617961b3
[14:38:37] dcausse: Failure to find org.wikidata.query.rdf:blazegraph-service:war:0.3.24-SNAPSHOT
[14:38:48] ah
[14:39:17] sadly you can't just build rdf-spark-tool without having other modules installed
[14:39:41] dropping -pl rdf-spark-tools might help
[14:39:44] shouldn't the -pl -am do the thing?
[14:39:48] ack, testing
[14:39:52] it's 2/3 mins usually
[14:40:28] dcausse: The following artifacts could not be resolved: com.blazegraph:bigdata-cache:jar:2.1.6-wmf.1
[14:40:32] + others
[14:40:42] hm...
[14:40:44] for module: Blazegraph extension to improve performance for Wikibase
[14:41:20] :(
[14:41:39] dcausse: the -wmf modules
[14:41:58] joal: do you have specific maven repo setup in you .m2/settings.xml
[14:42:11] I don't think so, but checking
[14:42:32] something that would conflict with "wmf.mirrored" or "wmf.releases"
[14:42:35] dcausse: nope - http-proxy on that's all
[14:43:14] the artifact is here: https://archiva.wikimedia.org/repository/releases/com/blazegraph/bigdata-cache/2.1.6-wmf.1/
[14:43:22] I wonder why it's unable to find it :/
[14:43:42] dcausse: trying having dropped my m2 cache (not nice for network but eh)
[14:43:50] ouch
[14:44:04] * joal DOWNLOADS THE WOOOOOOOOOOORLD !
[14:44:18] (03CR) 10Lex Nasser: "joal: I have no real justification for the use of mw_private_directory, other than all the other geoeditors datasets are located there." [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593246 (https://phabricator.wikimedia.org/T244597) (owner: 10Lex Nasser)
[14:44:31] same error dcausse
[14:44:37] from stat1004
[14:44:44] Will try from another machine
[14:45:03] joal: perhaps try ./mvnw instead of mvn ?
[14:45:55] dcausse: what is the difference?
[14:46:12] mvn is the system one
[14:46:15] dcausse: it's doing something different
[14:46:22] ./mvnw is the wrapper
[14:46:37] to use the latest I assume?
[14:46:55] a specific version
[14:47:01] right
[14:47:18] The stuff is blocked on trying to download nexus-staging-maven-plugin-1.6.8.pom
[14:47:23] from maven-central repo
[14:47:32] joal: web proxy?
[14:47:45] normally it's set in my settings
[14:47:48] will enable it
[14:48:05] trying to build there as well
[14:51:45] dcausse: I think I have working using "./mvnw --settings my-settings.xml clean package"
[14:52:35] joal: cool, I could get it working with ./mvnw clean package -Dhttps.proxyHost=webproxy.eqiad.wmnet -Dhttps.proxyPort=8080
[14:52:44] right
[14:52:56] looks like maven does not read the env var https_proxy
[14:53:00] or java
[14:53:34] dcausse: we should ask ottomata how we should update your poms to use archiva only (with mirroring)
[14:54:06] sure
[15:05:16] oh ya that would be better
[15:05:41] this is how
[15:05:41] https://wikitech.wikimedia.org/wiki/Archiva#Development
[15:09:43] (03CR) 10Milimetric: [C: 03+1] "I'll only +1 to let Joseph respond. But here are my thoughts on both questions." [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593246 (https://phabricator.wikimedia.org/T244597) (owner: 10Lex Nasser)
[15:12:14] (03CR) 10Joal: "Ok for me with the "trim last 4" - Thanks for making me see the light Dan :)" [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593246 (https://phabricator.wikimedia.org/T244597) (owner: 10Lex Nasser)
[15:33:52] joal: re - druid upgrade, 0.18 is out, I think that we should upgrade next q to fill the gap
[15:34:03] ok elukey :)
[16:06:33] milimetric: thoughts on what Joseph said about mw_private? https://gerrit.wikimedia.org/r/#/c/analytics/refinery/+/593246/
[16:12:27] 10Analytics, 10Event-Platform, 10Services, 10Patch-For-Review, 10Sustainability: Create a class to make EventRelayer to send events to EventBus endpoint - https://phabricator.wikimedia.org/T134535 (10Krinkle)
[16:17:34] 10Analytics, 10Analytics-Kanban, 10Tool-Pageviews: Image files with quotes do not resolve on the mediarequest API - https://phabricator.wikimedia.org/T247333 (10Nuria) 05Open→03Resolved
[16:17:40] 10Analytics, 10Tool-Pageviews: Statistics for views of individual Wikimedia images - https://phabricator.wikimedia.org/T210313 (10Nuria)
[16:18:13] 10Analytics, 10Analytics-Kanban, 10Tool-Pageviews: Fix double encoding of urls on mediarequests api - https://phabricator.wikimedia.org/T244373 (10Nuria) 05Open→03Resolved
[16:18:19] 10Analytics, 10Tool-Pageviews: Add ability to the pageview tool in labs to get mediarequests per file similar to existing functionality to get pageviews per page title - https://phabricator.wikimedia.org/T234590 (10Nuria)
[16:24:36] 10Analytics, 10Dumps-Generation: page_restrictions field incomplete in current and historical dumps - https://phabricator.wikimedia.org/T251411 (10ArielGlenn) I've been looking at the en wiki dump files for part 18 (containing this page) and there are no entries for page restrictions in either the 2019-02 dump...
[16:26:10] ottomata: reviewed the camus change, +1! Didn't check all the details, if pcc is ok then feel free to deploy
[16:29:40] nuria: https://wikitech.wikimedia.org/w/index.php?title=Analytics%2FData_Lake%2FTraffic%2FPageview_hourly&type=revision&diff=1864757&oldid=1863969
[16:33:46] 10Analytics: Make anomaly detection correctly handle holes in time-series - https://phabricator.wikimedia.org/T251542 (10mforns)
[16:41:35] logging off o/
[17:03:28] thanks elukey !
[17:47:30] (03CR) 10Mforns: "LGTM overall!" (035 comments) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[17:52:17] 10Analytics: Remove North Korea from data quality traffic entropy reports - https://phabricator.wikimedia.org/T251546 (10mforns)
[17:58:06] (03CR) 10Nuria: [C: 04-1] "One remaining problem that I think is quite easy to fix." (033 comments) [analytics/wikistats2] - 10https://gerrit.wikimedia.org/r/585725 (https://phabricator.wikimedia.org/T199386) (owner: 10Fdans)
[18:28:54] 10Analytics: Remove North Korea from data quality traffic entropy reports - https://phabricator.wikimedia.org/T251546 (10Nuria) Since this is affecting production and kind of poluting our alarms let's just hotfix the job removing north korea explicitily on the select, does that seem OK as compromise?
[18:29:07] 10Analytics, 10Operations, 10Traffic: Remove North Korea from data quality traffic entropy reports - https://phabricator.wikimedia.org/T251546 (10Nuria)
[18:30:01] 10Analytics, 10Operations, 10Traffic: Remove North Korea from data quality traffic entropy reports - https://phabricator.wikimedia.org/T251546 (10Nuria) I would remove it from daily/hourly jobs both: https://github.com/wikimedia/analytics-refinery/blob/master/oozie/data_quality_stats/hourly/queries/traffic_e...
[18:43:26] 10Analytics, 10Analytics-Kanban, 10Operations, 10Traffic: Remove North Korea from data quality traffic entropy reports - https://phabricator.wikimedia.org/T251546 (10mforns) a:03mforns
[18:44:16] (03PS1) 10Mforns: Remve North Korea from traffic entropy reports [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593587 (https://phabricator.wikimedia.org/T251546)
[19:00:56] (03CR) 10Nuria: [C: 03+2] Remve North Korea from traffic entropy reports [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593587 (https://phabricator.wikimedia.org/T251546) (owner: 10Mforns)
[19:00:58] (03CR) 10Nuria: [V: 03+2 C: 03+2] Remve North Korea from traffic entropy reports [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593587 (https://phabricator.wikimedia.org/T251546) (owner: 10Mforns)
[19:23:28] 10Analytics, 10Analytics-Kanban, 10Patch-For-Review: Spike: POC of refine with airflow - https://phabricator.wikimedia.org/T241246 (10mforns) a:03mforns
[19:54:07] (03PS9) 10Mforns: Update the CX abuse filter statistics script for hive [analytics/reportupdater-queries] - 10https://gerrit.wikimedia.org/r/579022 (https://phabricator.wikimedia.org/T223958) (owner: 10Amire80)
[20:08:14] (03CR) 10Mforns: "I tested this job in stat1007 and after some minor tweaks it worked!" [analytics/reportupdater-queries] - 10https://gerrit.wikimedia.org/r/579022 (https://phabricator.wikimedia.org/T223958) (owner: 10Amire80)
[20:09:38] (03CR) 10Mforns: [V: 03+2 C: 03+2] "LGTM! and tested" [analytics/reportupdater-queries] - 10https://gerrit.wikimedia.org/r/579022 (https://phabricator.wikimedia.org/T223958) (owner: 10Amire80)
[20:12:20] (03PS4) 10Lex Nasser: Fix project field of geoeditors public monthly and semantics [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593246 (https://phabricator.wikimedia.org/T244597)
[20:18:52] nuria: do you want me to test this? https://gerrit.wikimedia.org/r/#/c/analytics/reportupdater-queries/+/593092/
[20:35:48] (03CR) 10Ottomata: Add python/refinery/eventstreamconfig.py and use in in bin/camus to build dynamic topic whitelist (032 comments) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[20:36:00] (03CR) 10Ottomata: Add python/refinery/eventstreamconfig.py and use in in bin/camus to build dynamic topic whitelist (031 comment) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[20:42:55] (03PS5) 10Ottomata: Add python/refinery/eventstreamconfig.py and use in in bin/camus to build dynamic topic whitelist [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241)
[20:53:28] (03CR) 10Mforns: [C: 03+1] "LGTM! I just left a comment on the format of the header." (032 comments) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[20:59:48] (03CR) 10Ottomata: Add python/refinery/eventstreamconfig.py and use in in bin/camus to build dynamic topic whitelist (031 comment) [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[21:00:07] (03CR) 10Ottomata: [V: 03+2 C: 03+2] Add python/refinery/eventstreamconfig.py and use in in bin/camus to build dynamic topic whitelist [analytics/refinery] - 10https://gerrit.wikimedia.org/r/593047 (https://phabricator.wikimedia.org/T241241) (owner: 10Ottomata)
[21:21:27] 10Analytics, 10Event-Platform, 10Inuka-Team (Kanban), 10KaiOS-Wikipedia-app (MVP), 10Patch-For-Review: Capture and send back client-side errors - https://phabricator.wikimedia.org/T248615 (10hueitan) Thanks @Ottomata, I can now see some data on the dashboard! 🎉
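(The refinery patch merged above, 593047, adds python/refinery/eventstreamconfig.py and uses it in bin/camus to build a dynamic Kafka topic whitelist from stream configuration. The sketch below is only a rough illustration of that idea, not the refinery code itself; the API endpoint, the request parameters, and the response field names are assumptions and would need to be checked against the EventStreamConfig extension's documentation.)

```python
import requests

# Hypothetical endpoint/params: the real values live in refinery's
# eventstreamconfig.py and the EventStreamConfig MediaWiki extension docs.
STREAM_CONFIG_URL = "https://meta.wikimedia.org/w/api.php"

def get_topic_whitelist():
    """Fetch stream configs and build a Camus-style topic whitelist regex."""
    resp = requests.get(
        STREAM_CONFIG_URL,
        params={"action": "streamconfigs", "format": "json", "all_settings": 1},
        timeout=10,
    )
    resp.raise_for_status()
    streams = resp.json().get("streams", {})

    topics = set()
    for settings in streams.values():
        # 'topics' is assumed to hold the concrete Kafka topic names per stream.
        topics.update(settings.get("topics", []))

    # Camus accepts a regex whitelist, so join all topic names with '|'.
    return "(" + "|".join(sorted(topics)) + ")"

if __name__ == "__main__":
    print(get_topic_whitelist())
```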