[03:56:10] TimStarling, should I +2 https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/450490/ .. and you can add test later? or do you want to do it before I +2 it? Want to get this into the train if possible.
[03:57:02] I can add a test now I guess
[03:57:20] k
[03:58:36] rebasing first
[04:05:30] I think the thing about nested p tags is out of scope
[04:05:53] the bug was corruption of the Serializer tree leading to a PHP notice and duplication of a tag, which is fixed
[04:14:32] done
[04:15:21] by the way, legoktm is on vacation for the rest of this week
[04:30:57] ah, ok reg legoktm .. i'm heading to bed now, but will look at the rest tomorrow morning. The updated remex can be pushed out next week, right? any reason to rush it into this one?
[04:32:54] it's probably fine
[04:33:36] if we start getting OOM errors then we can revert and push them out together, but I think it's unlikely
[04:35:00] sounds unlikely to me as well.
[12:08:54] tgr: 1+1+1= ? https://gerrit.wikimedia.org/r/c/mediawiki/core/+/450551
[12:09:52] DanielK_WMDE: I can merge, wasn't sure if you wanted Brad to have a look at it
[12:11:57] I went through the code a few times and all uses of the default content model seem to be covered, I haven't interacted much with content storage code in the past though
[12:12:29] well, I guess it's not any harder or riskier to look at it post-merge
[12:17:36] tgr: there really isn't any code that uses RevisionStore for cross-wiki access yet. The Wikibase patch is the first thing that does this.
[12:18:59] tgr: regarding your diff patch: the code looks fine to me, it's now just a matter of testing.
[12:19:42] should I start testing or will Piotr do that?
[12:20:44] I think the idea was for Piotr to do that. Perhaps ask him?
[12:22:26] tgr: what's the status of the ApiComparePages patch? Is it basically ready, just waiting for the diff stuff?
[12:26:22] DanielK_WMDE: no, I still need to review that
[12:26:49] will do some time today
[12:27:30] if I'm not doing testing, I suppose
[12:28:15] I'd say look at ApiComparePages first. Corey wanted Piotr to do the testing.
[12:28:37] If that doesn't work out, we can still do that between us on Thursday or Friday
[12:29:12] code looks fine at a first glance, but I'd have to step through to get the details. there are some oddities in there.
[12:31:47] I'll go find some lunch now, and then I'll poke a bit at RevisionRenderer.
[12:32:13] the main problem with that is really writing code for merging one ParserOutput into another.
[12:32:18] That's not trivial
[13:21:48] DanielK_WMDE: tgr: I spoke to Piotr yesterday, and he will be available for testing on Thursday. Is that soon enough.
[13:23:08] Is that soon enough? He said he could possibly be ready to start on Wednesday, but he most likely would not be able to complete the testing that day in any case.
[17:10:41] o/ CindyCicaleseWMF et al.
[17:11:15] I'm hoping to get a discussion moving about MW Platform WRT the apparent crisis we're in WRT mariaDB limitations and the page/revision tables.
[17:11:18] o/ halfak
[17:12:06] I've learned from Mark and the DBAs that "0% growth is too much" and that they'll be blocking all new types of content contributions (wiki pages/namespaces) that can't be put into an entirely separate wiki.
[17:12:20] Are you familiar with this issue?
[17:13:06] No, I'm not. Is there a ticket with background?
[17:14:06] There's not a general ticket as far as I know. It seems that JADE is the first to hit this issue, so our RFC is the best reference.
[17:14:10] * halfak gets
[17:14:28] https://phabricator.wikimedia.org/T183381
[17:14:34] That's the container task.
[17:16:22] awight, is TechCom officially going to talk about this?
[17:16:32] Or is Daniel just unofficially talking with us about it?
[17:16:55] halfak: It’s on their board to eventually discuss, but so far we haven’t been scheduled.
[17:17:14] Gotcha.
[17:17:36] CindyCicaleseWMF, we're stuck between https://www.mediawiki.org/wiki/Everything_is_a_wiki_page and database limitations.
[17:18:11] By making JADE use wiki pages, we get all of the benefits. But by making JADE use wiki pages, we're running against severe DB constraints.
[17:20:00] Well, ^ we’re not running into constraints technically, but SREs have strongly advised that they will block us until we store the data elsewhere.
[17:20:52] Here’s the RFC up for discussion, T200297
[17:20:52] T200297: Introduce a new namespace for collaborative judgments about wiki entities - https://phabricator.wikimedia.org/T200297
[17:24:23] Long story short, what we’d like is to have a threshold for a maximum rate of growth at which we can safely increase wiki content. Like halfak said, currently that seems to be “0%”, which makes our work impossible.
[17:25:24] halfak: Ah, yes, I remember discussing that with you.
[17:29:24] In a meeting now, but I will research and get back to you.
[17:31:32] Thank you!
[18:13:07] halfak: Corey's going to schedule a meeting to discuss this further.
[18:13:32] OK. Shall I assume that talking to Corey == keeping you in the loop?
[18:13:51] Yes, definitely.
[19:25:03] if I am running a script on command-line on maintenance host (e.g. terbium) is there an easy way to see which DB queries this script is running?
[23:21:03] so... actually this script doesn't restart cleanly, it takes a long long time to find its position again
[23:23:43] I was up to revision ID 60M on commonswiki, it looks like it would take an hour to get up to that position again, even doing nothing
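A possible answer to the 19:25 question, as a minimal sketch rather than anything said above: MediaWiki can write every SQL statement it runs to a debug log if the logging settings point at a file. The log paths below are hypothetical, the exact setting depends on the MediaWiki version, and on a shared maintenance host such as terbium you normally cannot edit LocalSettings.php, so this is really for a test install.

    // LocalSettings.php additions on a test install -- a sketch, not production config.

    // Older MediaWiki releases: dump every SQL statement into the main debug log.
    $wgDebugLogFile = '/tmp/mw-debug.log';   // hypothetical path
    $wgDebugDumpSql = true;

    // Newer releases send SQL to the 'DBQuery' log channel instead,
    // so route that channel to its own file.
    $wgDebugLogGroups['DBQuery'] = '/tmp/mw-dbquery.log';   // hypothetical path

Run the script as usual and tail the log file. Maintenance scripts also accept a --profiler option (e.g. --profiler=text), which reports where time went rather than the raw SQL.

For the restart problem at 23:21, one generic resumption pattern (again only a sketch; the batch iterator, worker function, and file path are hypothetical, not the actual script) is to persist the last processed revision ID and skip ahead to it on startup; if the script already exposes something like a --start option, that serves the same purpose.

    // Resume-from-checkpoint sketch: remember the last revision ID processed
    // so a restart can skip straight past already-handled revisions.
    // Assumes batches arrive in ascending revision-ID order.
    $checkpointFile = '/tmp/myscript.position';        // hypothetical path
    $resumeFrom = is_readable( $checkpointFile )
        ? (int)file_get_contents( $checkpointFile )
        : 0;

    foreach ( $revisionIdBatches as $batch ) {         // hypothetical batch iterator
        $last = end( $batch );
        if ( $last <= $resumeFrom ) {
            continue;                                  // whole batch already done
        }
        foreach ( $batch as $revId ) {
            if ( $revId > $resumeFrom ) {
                processRevision( $revId );             // hypothetical worker
            }
        }
        // Record progress once per batch, so a restart loses at most one batch.
        file_put_contents( $checkpointFile, (string)$last );
    }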