[01:45:53] Reedy: could land https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/503759/ after rebase :)
[01:47:24] cherry pick onto master, git review, ???, profit
[01:49:39] yeah, sometimes Gerrit's java-git can be quite bad at rebases
[01:49:43] surprising when it just works locally
[02:37:08] Krinkle: That removes over 10% of the lines of .phpcs.xml :)
[13:55:06] duesen: T44886 has a question about how Wikidata sitelinks behave (or should behave) when a page is deleted and then another is moved to the title versus when the "delete page to allow move" feature is used on Special:MovePage to delete-and-move in one request. Who would be good to ask about that?
[13:55:07] T44886: No option to move over an existing non-redirect page in the API - https://phabricator.wikimedia.org/T44886
[14:45:49] anomie: Lydia owns the product, so she's the authority on the "should".
[14:46:19] afaik, the link on wikidata will just be broken when the page is removed. but maybe that has changed
[14:46:46] Leszek would know. Addshore as well.
[14:47:20] Hola
[14:47:34] que tal?
[14:47:56] addshore: ohai!
[14:48:01] (most of my spanish)
[14:48:02] > duesen: T44886 has a question about how Wikidata sitelinks behave (or should behave) when a page is deleted and then another is moved to the title versus when the "delete page to allow move" feature is used on Special:MovePage to delete-and-move in one request. Who would be good to ask about that?
[14:48:07] Bien gracias
[14:48:07] T44886: No option to move over an existing non-redirect page in the API - https://phabricator.wikimedia.org/T44886
[14:49:23] Interesting question, I would ask Lydia, I'd assume the expected is the same in both cases
[14:49:55] IE, the delete removes the site link and the move updates the sitelinks
[14:51:15] addshore: yea, but if you do a "delete-so-we-can-move", that would break...
[14:51:34] Why would that break?
[14:54:41] A delete so we can move is still just a delete and then a move right?
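[Editor's note: the "cherry pick onto master, git review" flow mentioned above, sketched as shell commands. The change number 503759 comes from the discussion; the patch-set ref and branch name are placeholders, and the real ref should be taken from Gerrit's download links.]

```shell
# Manual rebase when Gerrit's built-in rebase fails.
git fetch origin master
git checkout -B rebase-503759 origin/master

# "refs/changes/59/503759/1" is a hypothetical patch-set ref --
# copy the actual one from the change page in Gerrit.
git fetch origin refs/changes/59/503759/1
git cherry-pick FETCH_HEAD   # resolve any conflicts locally

git review                   # upload the rebased patch set back to Gerrit
```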
[14:56:03] addshore: but it would delete the sitelink
[14:56:16] the one that was pointing to the deleted page
[14:56:29] that may or may not be what the user expects
[14:56:57] if there are two sitelinks, you probably want to keep the one for the page that got deleted, right?...
[14:56:59] But also, the page you're moving from might have a sitelink, that should end up on the new entity
[14:57:09] should it?
[14:57:13] maybe... maybe not?
[14:57:27] Yes, if there are 2 I guess the desired behaviour would be keep the one on the entity and remove the one for the entity for the page you're moving
[14:57:48] I guess that means one less edit on Wikidata
[14:58:12] The resulting data ends up being the same, but how we got there is different
[14:58:48] It's probably easier to think about and code as just a single set of cases: on delete, remove sitelinks; on move, move sitelinks.
[14:59:41] addshore: want to comment on the ticket?
[14:59:48] Otherwise we might find the need to make a null edit on Wikidata for the entity? With a better description explaining what's gone on
[15:00:05] Yes, which ticket? I'm just about to check into a hotel but will comment soon!
[15:00:17] I'll poke Lydia in an email too, but I believe she is away for the week
[15:00:38] Aaah https://phabricator.wikimedia.org/T44886
[15:06:35] addshore: https://phabricator.wikimedia.org/T44886
[15:06:37] discussion at the end
[15:06:41] Ack
[15:36:07] addshore: It's possible that the delete part of delete-and-then-move schedules the sitelinks update as a DeferredUpdate, and by the time that runs the move has happened, so it thinks the page is no longer deleted after all.
[15:36:38] Resulting in the sitelinks for the deleted page not being deleted.
[15:37:47] * anomie tried to look in the Wikibase code, found that deletion hooks do seem to be using a DeferredUpdate, but gave up trying to find the sitelinks deletion code to see how it handles a recreation.
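[Editor's note: a toy Python model of the race anomie describes above. This is not actual Wikibase or MediaWiki code; it only illustrates how a deferred sitelink cleanup that re-checks page existence at run time can be defeated by a move that recreates the title within the same request.]

```python
# Toy model (not Wikibase code) of the deferred-update race.
pages = {"Foo": "old article", "Bar": "article to move"}
sitelinks = {"Q1": "Foo", "Q2": "Bar"}   # entity -> linked page title
deferred_updates = []

def delete_page(title):
    del pages[title]
    # Cleanup does not run now; it is queued until the end of the request.
    deferred_updates.append(lambda: cleanup_sitelinks(title))

def cleanup_sitelinks(title):
    # Re-checks existence at run time -- this is the source of the race.
    if title not in pages:
        for entity, linked in list(sitelinks.items()):
            if linked == title:
                del sitelinks[entity]

def move_page(src, dst):
    pages[dst] = pages.pop(src)
    for entity, linked in sitelinks.items():
        if linked == src:
            sitelinks[entity] = dst

# "Delete page to allow move" in one request:
delete_page("Foo")
move_page("Bar", "Foo")
for update in deferred_updates:   # deferred updates run last
    update()

# "Foo" exists again when the cleanup runs, so Q1's stale sitelink
# survives and now collides with Q2's updated one.
print(sitelinks)   # both Q1 and Q2 now point at "Foo"
```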
[20:04:58] MatmaRex: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/SpamBlacklist/+/526829 needs a rebase
[20:06:14] oh, huh
[20:07:08] anomie: it rebased with no conflicts locally, i updated the patch. (i can't test it now)
[20:07:36] gerrit rebase just sucking
[20:07:47] Who knew? ;-)
[20:08:33] it actually makes no sense that it would fail to merge. there weren't even any changes in those files
[20:08:56] and both Depends-On patches are merged
[20:09:07] Maybe it was confused somehow about the already-merged depends-on patches? If it does it again, I'll try editing the summary to remove them.
[20:34:51] MatmaRex: Looks like it's going to run to completion this time. Although Phan is going to fail because it's whining about a back-compat code path calling deprecated methods that were recently deleted: https://integration.wikimedia.org/ci/job/mwext-php72-phan-docker/11346/console
[20:36:10] anomie: hmm, that's going to prevent the merge, isn't it?
[20:37:11] MatmaRex: Yeah, I think so. If you want to put up the patch fixing it (adding @phan-suppress-next-line or dropping back-compat) I'll review it right away.
[20:38:49] anomie: i think we can drop back-compat – Revision::getQueryInfo() is marked with @since 1.31, and SpamBlacklist already has "MediaWiki": ">= 1.31.0" in extension.json
[20:42:45] anomie: https://gerrit.wikimedia.org/r/c/mediawiki/extensions/SpamBlacklist/+/535700
[20:43:51] * anomie +2s
[20:57:27] Oh, I'm glad to see https://www.mediawiki.org/w/index.php?title=Extension:SpamBlacklist&diff=prev&oldid=2645481
[20:58:13] MatmaRex: But I wonder whether extension.json needs bumping to 1.34 now. Did your patch break compat with older MW?
[21:14:51] anomie: sorry, i was away. i don't think it did?
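[Editor's note: the drop-back-compat reasoning above hinges on the version constraint MatmaRex quotes from SpamBlacklist's extension.json. A fragment of that file (other keys elided) would look roughly like this; only the `"MediaWiki": ">= 1.31.0"` value is quoted from the discussion.]

```json
{
    "name": "SpamBlacklist",
    "requires": {
        "MediaWiki": ">= 1.31.0"
    }
}
```

Since the extension already refuses to load on anything older than 1.31, and Revision::getQueryInfo() is @since 1.31, the pre-1.31 code path Phan flagged was dead weight.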
there was the patch in mediawiki/core that i put in Depends-On, but that was only because of the VE-related part of the patch
[21:15:26] i admit i only tested with master, though
[23:02:46] Krinkle, is there an easier way to create logstash dashboards instead of editing in the grafana ui?
[23:02:56] like say editing a json blob or something like that?
[23:05:44] subbu: logstash or grafana?
[23:06:49] sorry, logstash.
[23:07:18] subbu: right, I'm not aware of a JSON api for Logstash, it might exist but it's not available like a simple text editor the way Grafana supports.
[23:07:38] i see. ok.
[23:07:40] subbu: so I usually take a dashboard like mediawiki-errors and "save as" to create something new.
[23:08:28] i already have a parsoid-tests one for parsoid/js and i needed to add new blocks for parsoid/php .. and it was feeling so clunky and slow .. so was wondering if there were a text editor .. but ok .. i'll do it the hard way :)
[23:09:07] yeah, it's quite slow. especially the filter dropdowns - ref. https://phabricator.wikimedia.org/T189333
[23:09:24] ah .. so, it was not just me.
[23:10:36] kibana (the thing at logstash.wikimedia.org) does actually store the dashboards as json blobs in elasticsearch. So it's theoretically possible to download, edit, and upload.
[23:11:29] I have not done it, but I think I remember ebernhardson manually fixing saved dashboards at one of the upgrades
[23:11:53] ah, yes, i meant to say kibana, not grafana earlier. :)
[23:12:53] i will get to it tomorrow when i feel fresher ... i am only doing easy tasks right now.
[23:21:22] bd808: with each day having its own index, does that mean after 90 days, the number of fields should shrink if a change is deployed that uses fewer fields? Or are they indexed/stored elsewhere separate from the main logs?
[23:22:19] Krinkle: yes, should shrink over time if log events have fewer fields
[23:23:12] do we have retention up to 90 days again?
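[Editor's note: a sketch of the "download, edit, and upload" idea for Kibana dashboards discussed above. The document structure here is hypothetical (real saved-dashboard documents live in the `.kibana` index in Elasticsearch and their exact schema varies by Kibana version); this only shows that once you have the JSON blob, cloning a parsoid/js panel into a parsoid/php one is an ordinary local edit rather than a slow UI session.]

```python
import copy
import json

# Hypothetical shape of an exported saved-dashboard blob.
dashboard = {
    "title": "parsoid-tests",
    "panels": [
        {"id": 1, "title": "parsoid/js errors", "query": "program:parsoid"},
    ],
}

# Clone the parsoid/js panel and retarget it at parsoid/php.
new_panel = copy.deepcopy(dashboard["panels"][0])
new_panel["id"] = 2
new_panel["title"] = "parsoid/php errors"
new_panel["query"] = "program:parsoid-php"
dashboard["panels"].append(new_panel)

# Serialize for re-upload to the .kibana index.
blob = json.dumps(dashboard, indent=2)
```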
I haven't followed a lot of the changes that the SRE team has been making to the ELK stack
[23:23:31] * bd808 doesn't have anything useful logging to ELK these days for various reasons
[23:24:04] oh that's possible, I didn't check. maybe 30 or 60. I said 30 recently and then someone said it was longer than that.
[23:34:36] heh. that's how it goes I guess
[23:35:03] it once was 32 days of retention, but that was really only because of disk space