[00:06:56] I feel like it would be safe to assume that this isn't causal
[00:07:06] why would ores increase typical time to revert?
[00:07:22] it seems more likely that communities adopting ORES are getting overwhelmed by vandals
[00:11:00] I was curious so I took a stab at modeling it
[00:11:01] https://teblunthuis.cc/outgoing/test_ttr_model.pdf
[00:11:08] the models aren't great
[00:11:35] but there's a robust positive association with longer ttr after adopting ORES
[03:52:05] ok
[03:52:11] it isn't that robust actually
[03:52:19] with a good enough time series model it vanishes
[03:53:29] https://teblunthuis.cc/outgoing/test_ttr_model.pdf
[04:07:04] this is a pretty simple hierarchical AR(1) model with wiki-level terms for the number of reverts, the lag term, and whether ores is enabled, and a shared term for ores_bias
[04:07:08] sorry
[04:07:15] not "ores_bias", "has_ores"
[04:07:44] and that term is close to zero and has a low t score
[10:13:42] (CR) jenkins-bot: Localisation updates from https://translatewiki.net. [extensions/ORES] - https://gerrit.wikimedia.org/r/509293 (owner: L10n-bot)
[10:49:25] ORES, Scoring-platform-team, MediaWiki-Core-Testing, MediaWiki-extensions-MultimediaViewer, and 8 others: Audit tests/selenium/LocalSettings.php file aiming at possibly deprecating the feature - https://phabricator.wikimedia.org/T199939 (zeljkofilipin)
[13:55:50] Scoring-platform-team: Gerrit repo scoring/ores/editquality LFS broken (smudge filter lfs failed) - https://phabricator.wikimedia.org/T212544 (Halfak) Resolved→Open Looks like this work-around isn't working now. ` $ git lfs push gerrit master Locking support detected on remote "gerrit". Consider e...
[14:22:56] Scoring-platform-team: Gerrit repo scoring/ores/editquality LFS broken (smudge filter lfs failed) - https://phabricator.wikimedia.org/T212544 (Halfak) Nevermind. I needed to do `git lfs fetch --all` first.
[14:23:04] Scoring-platform-team: Gerrit repo scoring/ores/editquality LFS broken (smudge filter lfs failed) - https://phabricator.wikimedia.org/T212544 (Halfak) Open→Resolved
[14:27:44] RoanKattouw, when you get in, I have a quick question. It looks like we'll soon be deploying a model with improved performance for fiwiki. Is there any manual work necessary in order to adjust the thresholds used onwiki?
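The hierarchical AR(1) model described at [04:07:04] was not posted as code, so the following is only a minimal sketch of that structure in Python with statsmodels: a per-wiki panel (the column names `wiki`, `period`, `ttr`, `n_reverts`, and `has_ores` are hypothetical), wiki-level random intercepts and slopes for the lag term, the number of reverts, and `has_ores`, plus a shared fixed effect for `has_ores`.

```python
# Sketch of a hierarchical AR(1) model for time-to-revert (ttr), assuming a
# panel with one row per wiki per period. Column names are hypothetical; the
# analysis behind test_ttr_model.pdf was not shared as code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ttr_panel.csv")            # hypothetical input file
df = df.sort_values(["wiki", "period"])

# AR(1) structure: regress ttr on its own previous value within each wiki.
df["ttr_lag"] = df.groupby("wiki")["ttr"].shift(1)
df = df.dropna(subset=["ttr_lag"])

# Shared (fixed) has_ores effect, plus wiki-level (random) intercepts and
# slopes for the lag term, the number of reverts, and has_ores.
model = smf.mixedlm(
    "ttr ~ ttr_lag + n_reverts + has_ores",
    data=df,
    groups="wiki",
    re_formula="~ ttr_lag + n_reverts + has_ores",
)
result = model.fit()
print(result.summary())  # inspect the shared has_ores coefficient and its z score
```

Under this structure, the observation at [04:07:44] corresponds to the shared `has_ores` coefficient sitting near zero with a small test statistic once the lag term absorbs each wiki's own trend.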
[14:34:30] Scoring-platform-team, Diffusion, GitHub-Mirrors, Release-Engineering-Team (Backlog): LFS objects are not mirroring from Github through Phab to Gerrit consistently - https://phabricator.wikimedia.org/T212818 (Halfak)
[14:34:43] Scoring-platform-team, Diffusion, GitHub-Mirrors, Release-Engineering-Team (Backlog): LFS objects are not mirroring from Github through Phab to Gerrit consistently - https://phabricator.wikimedia.org/T212818 (Halfak)
[15:21:46] (PS1) Halfak: Adds svwiki articlequality and fiwiki editquality updates [services/ores/deploy] - https://gerrit.wikimedia.org/r/509430
[15:22:48] (CR) Halfak: [V: +2 C: +2] Adds svwiki articlequality and fiwiki editquality updates [services/ores/deploy] - https://gerrit.wikimedia.org/r/509430 (owner: Halfak)
[15:24:53] Going to beta
[15:40:06] ls
[15:40:09] oops
[15:51:22] https://teblunthuis.cc/outgoing/N_reverts_increasing_decreasing.pdf
[15:51:24] oops
[16:31:15] (PS1) Umherirrender: Pass User to Helpers::getThreshold() [extensions/ORES] - https://gerrit.wikimedia.org/r/509461
[16:32:57] (PS1) Umherirrender: Remove default value from argument list [extensions/ORES] - https://gerrit.wikimedia.org/r/509463
[16:55:49] PROBLEM - puppet on ORES-web01.Experimental is CRITICAL: CRITICAL: Failed to apply catalog, zero resources tracked by Puppet. It might be a dependency cycle.
[16:57:55] halfak: that's a new one ^
[17:24:45] RECOVERY - puppet on ORES-web01.Experimental is OK: OK: Puppet is currently enabled, last run 2 minutes ago with 0 failures
[19:09:39] halfak: Yes, we should at least review the thresholds to see if they still make sense
[19:10:21] In theory we are somewhat resilient to model changes; in practice, medium to large changes in model performance require config tweaks
[19:11:33] Yeah. This one will be large-ish. It won't go out until next week though. Should I file a task?
[19:11:35] RoanKattouw, ^
[19:11:41] Yes please
[20:24:23] (PS1) Halfak: Enables svwiki artilequality model. [services/ores/deploy] - https://gerrit.wikimedia.org/r/509516
[20:24:52] (CR) Halfak: [V: +2 C: +2] Enables svwiki artilequality model. [services/ores/deploy] - https://gerrit.wikimedia.org/r/509516 (owner: Halfak)
[21:13:27] Scoring-platform-team (Current), Wikilabels, articlequality-modeling, User-Sebastian_Berlin-WMSE, and 2 others: Build article quality model for svwiki - https://phabricator.wikimedia.org/T202202 (Halfak) This is now deployed to our beta (testing) service. See http://ores-beta.wmflabs.org/v3/scor...
[21:46:49] Reviews complete! I'm headed out for the evening. Have a good one, folks.
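For the threshold review discussed at [19:09:39]–[19:10:21], here is a sketch of how the updated fiwiki model's threshold statistics could be pulled from the ORES v3 API via its `model_info` parameter. The precision target and the exact JSON path used below are illustrative assumptions, not the values or shape guaranteed for the fiwiki configuration.

```python
# Sketch: fetch threshold statistics for the fiwiki damaging model from the
# ORES v3 API, as a starting point for checking whether on-wiki filter
# thresholds still make sense after a model update. The precision target is
# an example value, not the one configured for fiwiki.
import requests

url = "https://ores.wikimedia.org/v3/scores/fiwiki/"
params = {
    "models": "damaging",
    "model_info": 'statistics.thresholds.true."maximum recall @ precision >= 0.9"',
}
response = requests.get(url, params=params)
response.raise_for_status()

# Assuming the usual v3 response shape: context -> models -> model -> statistics.
info = response.json()["fiwiki"]["models"]["damaging"]
print(info["statistics"]["thresholds"]["true"])
```

Comparing these statistics before and after the model update is the kind of check that shows whether the existing on-wiki thresholds still hold up or need a config tweak.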