[13:13:39] Krinkle: A Heisenbug is where it mysteriously disappears when you try to test it because of some hidden state dependency. A Volkswagenbug is when it disappears because it depends on not-really-hidden state with code like "if ( isBeingTested() ) {"
[13:17:12] tgr: Since the unwrapping bug messes with TextExtracts and I've seen several complaints about that, I'm going to backport the patch.
[13:17:54] ack
[13:18:21] there were a bunch of complaints on StackOverflow, I didn't make the connection :(
[13:21:07] meh, sorry about this mess. i totally missed that the unwrap flag wouldn't work with old cached POs.
[13:21:30] I missed it in code review too
[13:21:37] speaking of cached POs: switching to JSON sounds really attractive, and shouldn't be hard, except for one thing:
[13:22:09] making sure that the serialization code is updated when fields get added or changed
[13:22:41] maybe a unit test that uses introspection to enumerate members or something
[13:24:01] I don't think JSON is really a magic bullet there. Either way you have to watch for stuff breaking because of serialization changes. The only thing that a switch to something like JSON would fix would be if we made everything that has to go into the JSON explicit, so adding a field *requires* messing with the serialization code.
[13:27:19] yes, that's what i meant: that's the tricky bit
[13:27:31] i definitely wouldn't want to do json_encode( $po );
[13:36:26] anomie: where are we at with this? https://phabricator.wikimedia.org/T203567#4568750
[13:36:51] I don't quite understand what the remaining issue is, or what we can do except wait, or go for json
[13:38:28] DanielK_WMDE_: It only happens if a PO serialized in wmf.20 gets unserialized and re-serialized in wmf.19 with HHVM. PHP7 loses data in the field in the same circumstance, but there mWrapperDivClasses will get its default [] instead, so that error won't be output.
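The idea discussed above — making serialization explicit so that adding a field *requires* touching the serialization code, backed by an introspection-based unit test — can be sketched roughly. This is an illustrative Python analogue only; `ParserOutputStub` and its field names are hypothetical stand-ins, not MediaWiki's actual ParserOutput API:

```python
from dataclasses import dataclass, fields
import json

@dataclass
class ParserOutputStub:
    """Illustrative stand-in for a ParserOutput-like object."""
    text: str = ""
    wrapper_div_classes: list = None

    # Explicit serialization list: adding a dataclass field above without
    # also adding it here makes the introspection check below fail.
    _SERIALIZED = ("text", "wrapper_div_classes")

    def to_json(self) -> str:
        return json.dumps({name: getattr(self, name) for name in self._SERIALIZED})

    @classmethod
    def from_json(cls, blob: str) -> "ParserOutputStub":
        return cls(**json.loads(blob))

def check_serialization_covers_all_fields(cls) -> bool:
    """Introspection-based guard for a unit test: every declared
    field must appear in the explicit serialization list."""
    declared = {f.name for f in fields(cls)}
    return declared == set(cls._SERIALIZED)
```

The point of the guard is exactly what the conversation asks for: a blind `json_encode( $po )` silently serializes whatever fields happen to exist, while the explicit list plus the introspection test turns a forgotten field into a failing test.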
[13:39:40] but krinkle said we were hitting that more than 5000 times per hour on saturday. and i can't imagine how it would happen at all.
[13:39:49] is there anything we can do about this right now?
[13:40:16] Purge any affected pages to clear the corrupted PO from the parser cache.
[13:41:01] ah right, tgr suggested that
[13:42:03] ok, that means there are no active development tasks for SDC right now
[13:42:13] everything is stuck on review, as far as I can see
[13:42:23] DanielK_WMDE_: BTW, he said 5300 hits over 24 hours, not per hour.
[13:42:36] right, per day, sorry
[13:44:25] Sigh. npm seems to be being flaky: https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-mysql-hhvm-docker/5862/console
[15:15:26] DanielK_WMDE_: 24 hours, not 1 hour
[15:15:32] * Krinkle re-read his comment to make sure
[15:16:04] Ah, thanks anomie
[15:27:23] DanielK_WMDE: Re https://gerrit.wikimedia.org/r/c/mediawiki/core/+/457450/9/includes/Revision/RenderedRevision.php#312, I think it'd be better if we didn't have a giant rambling comment that ends with recommending premature optimization for revisions having a subset of metadata unset depending on the specific content.
[16:29:26] CindyCicaleseWMF: the office wifi just died.
[16:29:47] we were wrapping up anyway and i have to leave the room, so i won't try and re-connect
[16:30:14] (writing this from the other laptop)
[16:36:29] DanielK_WMDE__: Yeah, we saw you freeze. We were done 30 seconds after anyway. You didn't miss anything.
[17:34:09] anomie: wtf happened there? why did this pass before? https://integration.wikimedia.org/ci/job/mediawiki-quibble-vendor-postgres-php70-docker/509/console
[17:34:35] different php version? system locale? some config change?
[17:34:55] DanielK_WMDE: postgres tests aren't run during the normal jobs, only on +2 or when you comment "check postgres" or the like.
[17:35:41] anomie: oh, i see.
it's probably missing a $db->timeThingy call
[17:41:19] anomie: hm, I don't see it, and I don't have PG to test... the failure is from assertRevisionExistsInDatabase, which uses revisionToRow, which does call $this->db->timestamp.
[17:41:26] DanielK_WMDE: It looks like the version of PG in CI is outputting "2018-09-10 16:59:04+00" instead of "2018-09-10 16:59:04 GMT".
[17:41:28] Also, testInsertRevisionOn_successes uses the same check, and it doesn't fail.
[17:41:47] or else something somewhere else is reformatting it
[17:42:06] yea, but what I don't get is that it's comparing the output of select to the output of $this->db->timestamp
[17:42:12] so it should be consistent, whatever it is...
[17:42:28] unless DatabasePostgres::timestamp is getting it wrong somehow?
[17:43:18] * anomie has PG and is digging into this.
[17:43:25] anomie: thank you
[17:48:13] DanielK_WMDE: ... testInsertRevisionOn_successes is being skipped, see line 398.
[17:50:00] oh. that wasn't me. git says kunal added that line
[17:50:06] maybe precisely because of this issue?
[17:52:03] Probably. It looks to me like the Timestamp library needs a fix to match the date format PG actually uses. I'll make that once I finish checking a few things.
[17:52:05] :o
[17:52:34] yes, I suppressed all the postgres failures to get them passing so I could make jenkins voting
[17:54:23] so, TS_POSTGRES just does the wrong thing?... wow
[17:54:27] may be a version issue
[17:54:56] if that's the case, DatabasePostgres may have to do something interesting based on the version...
[17:54:57] That's what I'm trying to look into.
[17:55:48] would be nice to make assertSelect resilient against this kind of thing, but i don't see how
[17:56:02] Although https://www.postgresql.org/docs/7.1/static/datatype-datetime.html#AEN3279 seems to indicate the style I'm seeing now was used as early as 7.1
[17:56:10] legoktm: were they predominantly about date format?
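The mismatch described above — PG emitting "2018-09-10 16:59:04+00" where the test expected "2018-09-10 16:59:04 GMT" — is two byte-for-byte different strings for the same instant, which is exactly what breaks a string-comparing assertSelect. A hedged Python sketch of a normalizer that accepts both styles (illustrative only, not the actual Wikimedia Timestamp library code):

```python
from datetime import datetime, timezone

def parse_pg_timestamp(s: str) -> datetime:
    """Parse both timestamptz output styles seen in this thread:
    '2018-09-10 16:59:04+00' and '2018-09-10 16:59:04 GMT'.
    Sketch only: assumes the input carries some zone designator."""
    s = s.strip()
    # Treat a trailing ' GMT' as a zero UTC offset.
    if s.endswith(" GMT"):
        s = s[:-4] + "+00"
    # Pad a bare two-digit offset ('+00') to '+0000' for %z.
    if s[-3] in "+-" and s[-2:].isdigit():
        s += "00"
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S%z")
```

Comparing the parsed datetimes instead of the raw strings would make an assertion tolerant of this kind of formatting drift, which is roughly the "resilient assertSelect" wished for later in the conversation.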
[17:56:57] anomie: i guess all our code is lenient against that difference, and PG is as well for input, but assertSelect compares strings
[17:57:55] DanielK_WMDE: https://phabricator.wikimedia.org/T195807 might be a bit out of date now
[17:59:01] legoktm: uh, the page archive test looks ugly. like it's selecting the wrong revision.
[17:59:22] patches welcome :)
[17:59:42] commenting "check postgres" (or "check sqlite") will run those tests, otherwise they only get run on +2, like anomie said
[17:59:48] i don't have PG handy ;)
[18:00:28] https://manpages.debian.org/stretch/postgresql-common/pg_virtualenv.1.en.html
[18:22:07] hm... https://gerrit.wikimedia.org/r/c/mediawiki/core/+/457513 will probably also fail on PG because of CONCAT
[18:32:13] DanielK_WMDE: I don't see any CONCAT in there?
[18:34:31] DanielK_WMDE: So… T198561 is marked as a blocker for T198308, but in its description it says "we'll want to have this enabled on the live systems for a while before releasing it as the default for 3rd party installs, see T198561" – so which way around is it?
[18:34:34] T198308: Enable MCR migration stage "write both, read new" on live systems - https://phabricator.wikimedia.org/T198308
[18:34:34] T198561: Make "write both, read new" the default MCR migration stage for fresh MediaWiki installs / for CI - https://phabricator.wikimedia.org/T198561
[18:36:48] anomie: sry, i meant COALESCE. But I guess that works?
[18:37:15] DanielK_WMDE: COALESCE is supported as such by all our DBs, as far as I know.
[18:38:23] James_F: oh, the first "see T198561" is wrong. we want to have this live and on CI before releasing 1.32.
[18:38:30] i'll fix that
[18:39:40] done
[18:39:51] Right. Note that there are only four trains until 1.32.0.
[18:39:55] Thanks!
[18:40:02] eek!
[18:40:23] 22, 23, 24, 26 (WMF-deploy-2018-10-16) and then we're off to 1.33.0-wmf.1
[18:40:59] i tagged it as a dependency
[18:41:05] should be able to make it, but just barely
[18:41:28] the dc switch may mean we can't put it live on *all* systems by that point
[18:41:43] DanielK_WMDE: So is https://gerrit.wikimedia.org/r/c/mediawiki/core/+/443831 mergeable?
[18:44:09] James_F: from my side, yes
[18:55:09] James_F: feel like ignoring the PG issue and forcing the merge of https://gerrit.wikimedia.org/r/c/mediawiki/core/+/455630 ?
[18:56:49] DanielK_WMDE: Yes? But unfortunately Certain People™ who I like (K.rinkle and l.egoktm) will shout at me. (And it would break every following patch.)
[18:57:00] DanielK_WMDE, James_F: No, do not force it. Either add `$this->markTestSkippedIfDbType( 'postgres' )` to the failing tests, or wait for the new tag of the Timestamp library.
[18:57:15] No train tomorrow, it's not like the extra 12 hours is critical.
[18:57:26] Skip for PG with a FIXME sounds best.
[18:57:44] And remove the skip with the new release tomorrow or whenever?
[18:58:36] hm, i just realized that forcing it would cause it to fail again whenever anyone tries to merge anything
[18:58:40] silly me, sorry :P
[18:59:20] James_F: or just wait until tomorrow. this patch isn't critical. i'd like to have it off the table, but waiting a day doesn't hurt
[19:02:34] legoktm, DanielK_WMDE, James_F: Hmm. Would this be v2.1.1 then, or is changing the output for TS_POSTGRES more significant?
[19:06:17] I'd consider it a bug fix
[19:13:04] anomie: I see it as a bug fix as well
[19:17:43] * anomie pushes the tag
[19:27:33] * anomie tries to figure out which version of composer is needed for the vendor repo... apparently 1.7.2 today.
[19:34:30] DanielK_WMDE: You could rebase your patch on https://gerrit.wikimedia.org/r/c/mediawiki/core/+/459619 now, I think.
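The `markTestSkippedIfDbType( 'postgres' )` suggestion above is a standard pattern: skip a known-broken database/test combination with a FIXME instead of forcing a merge that breaks CI for everyone. A rough Python/unittest analogue (MediaWiki's actual helper is PHP/PHPUnit; the names and the hardcoded db type here are illustrative only):

```python
import unittest

def current_db_type() -> str:
    """Hypothetical stand-in for the database type the tests run against."""
    return "postgres"

class RevisionStoreDbTest(unittest.TestCase):
    def mark_test_skipped_if_db_type(self, dbtype: str) -> None:
        # Analogue of MediaWiki's markTestSkippedIfDbType(): record a
        # skip with a FIXME rather than letting a known issue fail CI.
        if current_db_type() == dbtype:
            self.skipTest("FIXME: TS_POSTGRES format mismatch; remove "
                          "once the Timestamp library fix is tagged")

    def test_revision_timestamp_roundtrip(self):
        self.mark_test_skipped_if_db_type("postgres")
        self.fail("would fail on postgres until the library fix lands")
```

The skip carries its own removal condition in the message, matching the "remove the skip with the new release tomorrow" plan in the conversation.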
[19:36:50] anomie: done
[19:37:27] anomie: https://gerrit.wikimedia.org/r/c/mediawiki/core/+/459620
[19:47:47] anomie: i rebased https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/457450 on top of that just now.
[19:47:54] enough for today, going home
[21:59:37] bpirkle: before the API existed, the best way to get page text in a machine-readable way was to use Special:Export, which is why the parameters are so well documented :) I would be surprised if there are still automated clients using it (maybe really old bots that people haven't updated?)
[22:07:30] legoktm: Thanks. Do you know if there is a way using the API to invoke the code that I thought (hoped) was obsolete? I see only one usage of WikiExporter, in includes/api/ApiQuery.php, but it wouldn't be the first time I've missed something in the code. Today. :)
[22:08:52] pretty sure ApiQuery is the only API related thing that uses export related code
[22:09:51] https://codesearch.wmflabs.org/search/?q=WikiExporter&i=nope&files=&repos= might be useful?
[22:12:02] thanks, that's convenient. I've been searching via PhpStorm, but it is nice that the web based one also searches extensions in one operation.
[22:14:00] legoktm, was there no wgAjax* function for that?
[22:14:21] Krenair: nope
[22:16:26] I forget exactly how that AjaxDispatcher stuff worked but I remember it being difficult to find people willing to review a conversion from that to the modern API
[23:00:33] Krenair: indeed. I think the last prod one we ported was CategoryTree?
[23:03:05] there's still one open for Collection: https://gerrit.wikimedia.org/r/183228
[23:13:30] right