[11:04:01] lunch
[14:14:53] o/
[15:06:15] \o
[15:08:30] .o/
[15:15:37] inflatador: curious, is the elasticsearch plugins reboot far off on the priority scale?
[15:15:49] hoping to get it moving
[15:17:42] ebernhardson: sorry about that, let me find the ticket and I'll take a look
[15:20:17] looks like it's T407520 ?
[15:20:17] T407520: Deploy various plugins to fix various things - https://phabricator.wikimedia.org/T407520
[15:20:24] inflatador: ya that's the one
[15:20:49] OK, I can grab that one and get started
[15:22:51] just double-checked: wmf8 of opensearch-extra has the last merged patch and the master branch on plugins has wmf8, so we should be all set
[15:23:12] thanks!
[15:23:21] np
[15:24:17] I just merged the patch to bring in the upstream opensearch deb repos, so we should be unblocked for working on opensearch 3 images...haven't had time to verify the repos are working yet though
[15:25:23] oops, and wdqs lag is creepin' up again. bah
[15:27:59] nice! and also less nice :(
[15:54:25] OK, the opensearch plugins are installed everywhere, kicking off some roll-restarts now
[15:54:37] awesome
[16:56:19] cloudelastic/relforge are done, moving on to prod cirrussearch
[17:03:06] lunch, back in ~1h
[17:11:01] huh, just realized i didn't set up the interleaved part of the nearmatch 2x test on commonswiki
[17:11:08] should be fine, but oops
[17:23:59] overall, the 2x test looks like a win. Smaller than i had expected though, only 20k sessions per bucket over a week
[17:24:43] although it curiously has significant results for all three ZRR-related metrics, even though this was only a ranking change
[17:25:15] well, 2 of 3 are significant.
[17:50:58] (also need to wait till end of day UTC to have a full week of data collection, just getting it mostly going now)
[17:56:45] back
[18:35:26] on balance...i guess i have to suggest letting the test run a second week and running a second one-week analysis to compare; ZRR sessions decreasing from 12.4 to 10.8 (more than double the CI margins) is suspicious
[18:59:48] ebernhardson: ^that sounds like a good idea. The ZRR change is indeed suspicious.
[19:01:53] i started trying to write up the changes in clickthrough rates (they look good), but realized some of that is explained by the ZRR change, which isn't explained by the treatment...seems best to run a second week and verify that it's random variance
[19:12:33] also makes me wonder if clickthrough rate should be normalized against queries that return results...but making metrics more complex doesn't seem great
[21:24:20] meh, going to be some weird thing...assertEscapeExpansion("[😂]", "[\uD83D\uDE02]") passes, but on-wiki insource:/[😂]/ works while the \u-escaped variant does not
[21:25:36] i guess not just on-wiki; i added some rewrite tests for how it gets rewritten and they all pass, but running the full thing with the lucene regex evaluation fails
[21:39:32] oh...hmm. the way i implemented it means \u doesn't properly expand across multiple codepoints, because it injects a \ between them (which tells lucene to treat the resulting character as a literal)
[21:40:01] so..hmm. Maybe i just have to directly recognize surrogates
[21:48:04] i literally wrote tests that verify we emit illegal utf-8 :P I guess i just wasn't thinking that through
[21:48:47] * ebernhardson sadly is also looking at patching a java plugin while we are rolling out the new plugin :P
[21:49:15] sometimes when working with regexes, I *want* to use illegal utf-8... and patching plugins while rolling out plugins is the Java way...
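A rough standalone sketch of the surrogate problem described in the 21:39-21:48 messages above. This is not the plugin's actual code (the class and method names are made up); it only contrasts expanding each \uXXXX escape into an escaped literal char, which splits a surrogate pair into two escaped lone surrogates (illegal UTF-8 once encoded), with recognizing a high/low surrogate pair and emitting a single escaped code point:

```java
// Hypothetical sketch only -- not the actual plugin code. Assumes well-formed \uXXXX escapes.
public class EscapeExpansionSketch {

    // Naive expansion: each \uXXXX becomes "\" + the decoded char (escaped so the regex
    // engine treats it as a literal). For "\uD83D\uDE02" this emits two escaped lone
    // surrogates, which is not a valid character sequence once encoded as UTF-8.
    static String expandNaive(String regex) {
        StringBuilder out = new StringBuilder();
        int i = 0;
        while (i < regex.length()) {
            if (regex.startsWith("\\u", i) && i + 6 <= regex.length()) {
                char c = (char) Integer.parseInt(regex.substring(i + 2, i + 6), 16);
                out.append('\\').append(c);
                i += 6;
            } else {
                out.append(regex.charAt(i++));
            }
        }
        return out.toString();
    }

    // Surrogate-aware expansion: when the decoded char is a high surrogate and the next
    // escape decodes to a low surrogate, join them into one code point before emitting
    // the escaped literal.
    static String expandSurrogateAware(String regex) {
        StringBuilder out = new StringBuilder();
        int i = 0;
        while (i < regex.length()) {
            if (regex.startsWith("\\u", i) && i + 6 <= regex.length()) {
                char hi = (char) Integer.parseInt(regex.substring(i + 2, i + 6), 16);
                i += 6;
                if (Character.isHighSurrogate(hi)
                        && regex.startsWith("\\u", i)
                        && i + 6 <= regex.length()) {
                    char lo = (char) Integer.parseInt(regex.substring(i + 2, i + 6), 16);
                    if (Character.isLowSurrogate(lo)) {
                        out.append('\\').appendCodePoint(Character.toCodePoint(hi, lo));
                        i += 6;
                        continue;
                    }
                }
                out.append('\\').append(hi);
            } else {
                out.append(regex.charAt(i++));
            }
        }
        return out.toString();
    }

    public static void main(String[] args) {
        String escaped = "[\\uD83D\\uDE02]";               // escaped form of [😂]
        System.out.println(expandNaive(escaped));          // "[\<D83D>\<DE02>]" -- two escaped lone surrogates
        System.out.println(expandSurrogateAware(escaped)); // "[\😂]" -- one escaped code point
    }
}
```

Whether the combined code point still needs the leading \ depends on how the lucene regex syntax treats supplementary characters; the point is only that the pair has to be joined before any per-character escaping happens.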
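On the normalization idea in the 19:12 message, a quick sketch of the two clickthrough definitions (hedged notation for illustration, not the metrics actually used in the analysis):

\[
\mathrm{CTR}_{\text{all}} = \frac{\#\{\text{sessions with a click}\}}{\#\{\text{all sessions}\}},
\qquad
\mathrm{CTR}_{\text{results}} = \frac{\#\{\text{sessions with a click}\}}{\#\{\text{sessions with} \ge 1 \text{ result}\}}
\]

If the share of zero-result sessions drops, the numerator of CTR_all grows while its denominator stays fixed, so CTR_all rises even when CTR_results is flat; normalizing against result-returning queries would separate the ranking effect from the ZRR shift, at the cost of a more complex metric.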
[22:02:25] ebernhardson: the rolling restarts are finished, I'm still verifying, but all relforge/cloudelastic/cirrussearch hosts should have the new plugins now
[22:02:44] inflatador: excellent, thanks! Then we can get a reindex moving forward