[00:02:20] 6MediaWiki-Core-Team, 7HHVM, 7Wikimedia-log-errors: error: request has exceeded memory limit in /srv/mediawiki/php-1.25wmf17/includes/specialpage/SpecialPage.php on line 534 - https://phabricator.wikimedia.org/T89918#1080501 (10greg) [00:05:35] 6MediaWiki-Core-Team, 10Wikimedia-Site-requests: InitializeSettings: decide upon 'wiki' vs. 'wikipedia' group usage - https://phabricator.wikimedia.org/T91340#1080523 (10bd808) I'd vote for using 'wikipedia' and 'special' separately and eliminating the funky 'wiki' dbname from our config entirely. [04:24:35] * TimStarling tries to remember how to log in to wikitech [04:25:06] LDAP credentials [04:25:18] same as Gerrit [04:25:54] plus 2FA [04:26:15] which involved running JAuth on my laptop, which IIRC could be done in some fairly obvious way on my old laptop [04:26:22] I use [04:27:02] I wonder if I had it in /usr/local [04:27:09] import onetimepass, subprocess [04:27:10] token = onetimepass.get_totp(SECRET) [04:27:10] subprocess.call('pbcopy <<<"%s"' % token, shell=True) [04:27:29] pbcopy being an OS X thing to copy to the clipboard [04:27:52] i spent half an hour with the cocoa bindings for python before i realized i was being a moron [04:28:08] that doesn't sound easier than figuring out how I used to do it and then doing that again [04:29:55] at the time, Ryan was very put out that I didn't have a smartphone to use for 2FA [04:30:18] which I suppose would be more secure [04:30:31] so if I wanted to set it up again, that would be an option [04:32:35] but I would presumably have to log in first to change my key [04:33:30] well, you can ssh into silver and mint yourself a session [04:33:33] anyway, I have my old root partition in a .tar.gz and I found a symlink in it from /usr/local/bin, seems I installed it in /opt [04:35:20] I'm musing over the idea of building HHVM in labs, have you ever tried that? 
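The snippet Tim pasted above leans on the third-party onetimepass module. For reference, the same 6-digit TOTP code can be produced with only the Python standard library; this is a hedged sketch of the standard RFC 4226/6238 algorithm, where `SECRET` is assumed to be the base32 seed issued at 2FA enrollment (not anything shown in the log):

```python
import base64
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the big-endian counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(b32_secret: str, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP over the number of 30-second steps since the epoch."""
    padded = b32_secret.upper() + "=" * (-len(b32_secret) % 8)
    return hotp(base64.b32decode(padded), int(time.time()) // step, digits)

# print(totp(SECRET))  # pipe to pbcopy on OS X, as in the log
```

`hotp` can be checked against the published RFC 4226 test vectors for the ASCII secret "12345678901234567890".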
[04:36:12] bd808: Checking in regarding error logging [04:36:18] yes, it takes a while [04:36:19] stacktraces etc [04:48:27] there's a lot of things about labs that seem to be designed to annoy me [04:50:56] maybe I should start doing something about some of those things [04:51:24] what kind of things? [04:52:19] like the sshd_banner "If you are having access problems, please see: https://wikitech.wikimedia.org/wiki/Access#Accessing_public_and_private_instances" [04:52:26] which spams your terminal twice every time you log in [05:07:42] oh, that's peanuts [05:08:17] there are way more awful things than that [06:27:00] TimStarling: ori do list them! I’m slowly working towards making labs less annoying [06:27:15] (slowly, all help happily welcome) [09:27:01] 6MediaWiki-Core-Team, 10GlobalUserPage: Wikilinks from global user pages should point to the central wiki - https://phabricator.wikimedia.org/T89916#1081400 (10Pikne) I wonder if the wanted behaviour is similar to Commons file description pages which are transcluded on local wikis and where links are still rel... [15:32:01] ^d: I so can't see those fucking groups in gerrit [15:32:14] I've tried a bunch of times to add jenkins to some groups but I can't even see that bastard [15:32:49] <^d> "Make group visible to all registered users." is checked [15:33:06] <^d> People -> Groups -> Type your group you're looking for in the filter box [15:33:24] <^d> Sadly our group scheme makes zero sense outside of the extension-* ones, so ymmv :( [15:37:00] ah [15:53:34] ^d: Any idea what db query might cause all front-end builds to consistently fail as of yesterday? [15:53:40] <^d> manybubbles: I also filed T91404 for doing the swift backups.
We did a bunch of testing but never set up regular backups for them [15:53:49] <^d> (I tagged operations on it, they should take the lead) [15:53:55] I'd normally ask Aaron, but might be too early [15:54:00] https://phabricator.wikimedia.org/T91399 [15:54:27] <^d> Hmm, not sure [15:54:31] all sqlite builds using concurrency (e.g. a browser making two load.php requests, hard to avoid) are broken as of yesterday [15:54:48] uncommitted database transaction [15:55:04] Happens with 6 unrelated extensions, so it's probably in core [15:56:39] <^d> https://gerrit.wikimedia.org/r/#/c/193261/ maybe? [15:57:03] <^d> Or https://gerrit.wikimedia.org/r/#/c/193285/, but the former seems more likely [15:57:12] Ah, yeah [15:57:18] quite possibly [15:59:27] I'll try and simulate a build with that reverted [16:03:51] <^d> Ah, so it was that [16:03:52] ^d: thx [16:03:54] <^d> Hmm [16:03:57] <^d> yw [16:04:23] Normally I wouldn't revert db changes since it's probably for the good of prod that it was changed [16:04:34] but I know this one AaronS worked on specifically to improve sqlite [16:04:44] so given it did the opposite, I've reverted it [16:05:52] <^d> Yeah master's always supposed to be working so back to status quo makes sense [16:05:56] <^d> Can always revisit :) [16:10:15] Krinkle: pong on error logging -- I haven't worked on it at all [16:11:36] I'm spending my time trying to help keep projects running and planning for future projects; product manager stuff [16:29:11] <_joe_> bd808: anything from mwcore yesterday that needs ops attention? [16:29:31] <^d> manybubbles: So I guess we'll just meet up at the office Monday? [16:30:22] ^d: I was thinking I'd try to check into the hotel first then go to the conference next. Would you like to meet up there around 2? [16:30:22] _joe_: not that I noticed. You could see what's up with quotes for new logstash hardware if you are really bored.
;) [16:30:59] It was moving for a while and then stalled out when we didn't get back to robh for a day or two about disk things [16:31:48] <^d> manybubbles: At the office or at the conference? There's nothing at the conference other than registration & the welcome reception monday [16:32:24] ^d: I figured it'd be better for me to do the welcome reception for a bit and try to meet up with you [16:32:28] is ok? [16:32:41] <_joe_> http://blog.fogcreek.com/dev-life-interview-with-salvatore-sanfilippo/ antirez is working on a proper distributed queue, may be interesting [16:32:44] I suppose I could also hit the office in the morning, practice presentation for an hour or two [16:32:50] then go to hotel and then conference [16:36:46] <^d> manybubbles: Reception isn't until 5 [16:36:50] <^d> I was planning to go too :) [16:37:01] ^d: ah - k. let's meet at the office then on monday [16:37:03] your plan [16:37:28] <^d> Ok [16:37:44] <^d> Registration is open from 10-5 so we can hop over there anytime [16:45:20] yeah [16:45:37] I wonder what I can do to be functional past 5pm that day [16:45:48] because my flight takes off at 6:45 am or something [16:45:54] which is nice because I can be in the office before lunch [16:46:01] but it's bad because I'm waking up super early [16:51:00] manybubbles: power nap [16:51:45] bd808: I'll need some good drugs for that, though [16:51:54] I've never found a good sleeping pill for me [16:53:15] I've never tried any. I'm one of the lucky folks who can mostly sleep whenever I decide to sleep [16:53:42] nice [16:53:52] I've had some success with a coffee nap [16:53:57] In trade I got gains-weight-by-looking-at-food [16:54:16] like, drink a cup of strong coffee then sleep then wake up when the caffeine hits in half an hour [16:54:20] but I can't control the other side [16:55:11] Brandon was talking about that routine one day on irc. Sounded interesting.
I think he even had some study data on it [17:01:15] 6MediaWiki-Core-Team, 10MediaWiki-Database, 5Patch-For-Review: Database constructor cleanups - https://phabricator.wikimedia.org/T90288#1082278 (10bd808) 5Open>3Resolved [17:04:37] 6MediaWiki-Core-Team, 5Patch-For-Review: Reduce SpoofUser deadlocks from rename jobs - https://phabricator.wikimedia.org/T90967#1082287 (10bd808) 5Open>3Resolved [17:27:43] 6MediaWiki-Core-Team: Reduce SpoofUser deadlocks from rename jobs - https://phabricator.wikimedia.org/T90967#1082363 (10Jdforrester-WMF) [17:27:45] 6MediaWiki-Core-Team, 10MediaWiki-Database: Database constructor cleanups - https://phabricator.wikimedia.org/T90288#1082364 (10Jdforrester-WMF) [17:28:35] 6MediaWiki-Core-Team, 10MediaWiki-extensions-SecurePoll: Fix SecurePoll_BallotStatus for Status/StatusValue changes - https://phabricator.wikimedia.org/T89475#1082418 (10Jdforrester-WMF) [17:28:37] 6MediaWiki-Core-Team, 10GlobalCssJs: Investigate DB connection spam from RL global modules - https://phabricator.wikimedia.org/T89507#1082415 (10Jdforrester-WMF) [17:28:38] 6MediaWiki-Core-Team: Try to avoid doCascadeProtectionUpdates() call on page view - https://phabricator.wikimedia.org/T89389#1082417 (10Jdforrester-WMF) [17:28:42] 6MediaWiki-Core-Team, 10MediaWiki-extensions-AbuseFilter: Slow AFComputedVariable users query - https://phabricator.wikimedia.org/T90036#1082416 (10Jdforrester-WMF) [17:29:11] 6MediaWiki-Core-Team, 10MediaWiki-General-or-Unknown: Exception if no accelerator (e.g. APC) is available - https://phabricator.wikimedia.org/T86162#1082452 (10Jdforrester-WMF) [19:05:46] <^d> legoktm: Ouch https://phabricator.wikimedia.org/P344 [19:25:58] 6MediaWiki-Core-Team, 10Wikidata-Query-Service: BlazeGraph Finalization: Validate AST rewrite - https://phabricator.wikimedia.org/T90128#1083530 (10Manybubbles) OK! I've validated AST rewriting. It works (of course) but it''ll be a huge pain to implement some things with it. I imagine it'll be just fine for... 
[19:54:44] lots of EditPage and Linker :/ [19:57:08] <^d> ya [19:57:48] bd808: does xhprof work properly on command line scripts? do you have to do anything other than enable it and run them? [19:58:55] also, how exactly it works in text mode - should it dump the info to stdout? [20:01:21] <^d> legoktm: Fixed the collection ones https://gerrit.wikimedia.org/r/194172 :) [20:07:35] 6MediaWiki-Core-Team, 10Wikidata-Query-Service: BlazeGraph Finalization: Validate AST rewrite - https://phabricator.wikimedia.org/T90128#1083908 (10Thompsonbry.systap) Do you have a pointer to your code? I'd like to understand where the pain is. [20:08:46] manybubbles: I think that's all it takes. [20:09:58] bd808: yeah I figured it out, but is there a way to make it output more data than a single number per function? [20:11:57] The text formatter just does wall time, number of calls and function name I think [20:13:05] The only data we really have other than that is memory and cpu usage [20:13:36] the old one used to do a huge wall of text [20:13:55] like this function had 57% of the time, this one 55% of the time, etc [20:14:09] bd808: how about in/out times, per function cuts, etc. like the xhprof gui has? [20:14:25] *nod* we compute that, it's apparently just not in the new text format output [20:14:54] <^d> legoktm: I rage-abandoned both changes [20:14:57] <^d> I hate collection [20:15:04] * bd808 is looking at the source to see what's up [20:15:31] ^d: :/ I think part of the problem is that it's using sajax instead of API modules [20:15:52] <^d> That too [20:16:20] 6MediaWiki-Core-Team, 10Wikidata-Query-Service: BlazeGraph Finalization: Validate AST rewrite - https://phabricator.wikimedia.org/T90128#1083940 (10Manybubbles) >>! In T90128#1083908, @Thompsonbry.systap wrote: > Do you have a pointer to your code? I'd like to understand where the pain is. https://github.com... 
[20:16:22] SMalyshev, manybubbles: here's the formatter that dumps the text report -- https://github.com/wikimedia/mediawiki/blob/master/includes/profiler/output/ProfilerOutputText.php#L56-L57 [20:17:06] The full data set that we have if you turn on all the possible flags is -- https://github.com/wikimedia/mediawiki/blob/master/includes/profiler/ProfilerXhprof.php#L96-L107 [20:17:36] we really really need to kill that :/ [20:18:29] bd808: I see. Is there any other way to get it closer to what xhprof UI usually does? [20:19:26] dump the raw data and feed it to xhprof gui tools I guess [20:19:31] https://phabricator.wikimedia.org/P345 Collection, FlaggedRevs, SemanticForms, SecurePoll [20:21:25] bd808: aha, ok, so where would be a good place to dump the data? [20:21:42] ^d: https://gerrit.wikimedia.org/r/#/c/190737/ simple cleanups [20:22:51] SMalyshev: The raw data is available inside our Xhprof class (includes/libs/Xhprof.php) but there's not an accessor for the data there [20:24:31] It wouldn't be too hard to make a Profiler subclass that does a raw dump I don't think [20:24:45] but it would take tweaking the Xhprof class too [20:24:47] AaronS: https://gerrit.wikimedia.org/r/194178 [20:25:06] bd808: ok, I'll try that [20:25:20] legoktm: did you see my comment about mLimit? [20:26:26] AaronS: yeah, it was unintentional, I'll change it to use min() in a little bit [20:26:36] I just copied that code from somewhere else [20:28:48] 6MediaWiki-Core-Team, 10Wikidata-Query-Service: BlazeGraph Finalization: Validate AST rewrite - https://phabricator.wikimedia.org/T90128#1083964 (10Thompsonbry.systap) Ok. That makes sense. In terms of turning this feature on and off automatically, there are some options. One is to add a query hint. See the... 
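bd808's suggestion above is to dump the raw data and feed it to the xhprof GUI tools. For orientation: the raw xhprof structure keys call edges as 'parent==>child' (plus a bare 'main()' root entry), each with counters such as 'ct' (calls) and 'wt' (wall time). A rough Python sketch of how the old-style percentage report could be recomputed from such a dump; `wall_time_report` is a hypothetical helper, not MediaWiki code, and it assumes the dump has already been decoded (e.g. from JSON) into a dict:

```python
from collections import defaultdict

def wall_time_report(raw):
    """Collapse an xhprof raw profile (keys like 'main()' or 'parent==>child',
    values with 'ct' call counts and 'wt' wall time) into per-function
    inclusive wall time, sorted descending, with percent of main()."""
    totals = defaultdict(lambda: {"ct": 0, "wt": 0})
    for key, data in raw.items():
        func = key.split("==>")[-1]          # 'main()' has no parent half
        totals[func]["ct"] += data["ct"]
        totals[func]["wt"] += data["wt"]
    grand = totals["main()"]["wt"] or 1      # main() carries the total wall time
    return [(fn, d["ct"], d["wt"], 100.0 * d["wt"] / grand)
            for fn, d in sorted(totals.items(), key=lambda kv: -kv[1]["wt"])]

# Tiny synthetic dump in that shape ('SomeClass::render' is made up):
demo = {
    "main()": {"ct": 1, "wt": 100},
    "main()==>SomeClass::render": {"ct": 2, "wt": 57},
}
report = wall_time_report(demo)
```

This reproduces the "this function had 57% of the time" style of report the old profiler printed; the real GUI tools consume the raw edges directly and can also show per-function in/out breakdowns.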
[20:34:47] legoktm: https://gerrit.wikimedia.org/r/#/c/194177/ [20:43:25] 6MediaWiki-Core-Team: Fix profiling endpoints and format - https://phabricator.wikimedia.org/T90623#1083980 (10aaron) a:3ori [20:46:57] legoktm: https://gerrit.wikimedia.org/r/#/c/190950/ [20:48:17] AaronS: the Aggregator isn't used by default? what uses it? [20:48:58] the job runner service...though that only talks to redis [21:04:23] I wonder if I was too harsh in reviewing https://gerrit.wikimedia.org/r/193349 [21:10:02] anomie: I've seen worse, but you probably could be a bit more encouraging in tone in some places [21:11:20] anomie: if you add ":-)" to the end of your message it makes everything ok :-) [21:11:53] legoktm: I'll have to remember that [21:12:17] legoktm: I hate you :-) [21:12:19] xD [21:12:27] RESOLVED WORKSFORME [21:12:39] :-) [21:14:41] <_joe_> bd808: we should factor wikipedia verification support into keybase [21:14:57] I think there was a request for that somewhere [21:15:01] It sounds familiar [21:15:18] Yup [21:15:18] https://github.com/keybase/keybase-issues/issues/704 [21:15:19] <_joe_> Reedy: need an invite, btw? [21:15:24] "Add support for Wikipedia and any MediaWiki website" [21:15:28] <_joe_> it's the social network for gpg keys [21:15:29] I've been on keybase for a while :) [21:15:41] <_joe_> in nodejs [21:15:46] https://keybase.io/tehreedy [21:15:46] <_joe_> WCPGW? [21:15:49] hahah [21:16:00] It's webscale [21:16:03] <_joe_> Reedy: did you gift them with your private key? [21:16:10] "If you know tehreedy, you can ask them for an invitation to Keybase." [21:16:16] I can't remember what I did [21:16:22] <_joe_> I hope not [21:17:04] I don't remember doing that...
[21:17:05] * Reedy looks [21:17:09] <_joe_> I would really like to add wikipedia support, but that means writing nodejs javashit again [21:17:38] "in the browser (unavailable for you)" [21:17:41] I guess I didn't :) [21:17:53] We've got enough node/js people we can poke to get it done [21:19:11] <_joe_> on wikipedia, not on restbase [21:19:41] <_joe_> Reedy: you know you will be everyone's buddy for the hackathon, right? [21:19:47] hahaha [21:20:00] Someone else (community member) asked me earlier today about that too [21:22:11] bd808: i realize i didn't ack earlier, but i added you as well to the atlas account [21:22:46] ori: oh! I forgot to ack too because I saw that and uploaded the base image [21:22:53] so thanks [21:22:53] cool [21:23:13] The lcx provider support is merged now [21:23:26] and at least manybubbles has tried it out [21:23:35] it fucking works [21:23:45] <_joe_> lcx or LXC? [21:23:59] LAX [21:24:08] <_joe_> btw, docker sucks badly, we'll never use it in production. LXC, maybe [21:24:11] _joe_: LXC [21:24:18] <_joe_> ori: LHC [21:24:24] :) [21:24:27] _joe_ wins [21:25:14] I didn't get a shiny feeling from the bit of playing I did with Docker [21:25:42] <_joe_> bd808: docker has a very very nice cli [21:25:47] <_joe_> and is dev-friendly [21:26:08] it had too many strong opinions for me. Felt like an ORM for containers [21:26:33] if you did the things they thought of allowing in the way they wanted then it would be nice [21:26:44] but if you had other ideas you would be in a hard place [21:27:05] <_joe_> makes you feel a badass devop, while you cranked together a glorified tar + rsync thing that requires you to a) run a daemon as root b) does not properly segregate c) has a buffoonesque security model for deploys [21:27:39] <_joe_> every time I see people pulling in prod from the dockerhub...
One of the problems with many devops projects is that they are started by devs who don't want to work as/with ops [21:28:02] curl https://foo.bar/script.lol | bash [21:28:05] PROFIT [21:28:37] So does that feeling extend to go _joe_? [21:28:41] <_joe_> the coreos guys have had the right ideas as usual, but rocket is way too raw [21:29:18] <_joe_> no, but I won't recommend us doing things in go tbh before we experimented with async hack [21:29:37] I read something about why go has deliberately not made a package manager [21:30:01] <_joe_> yes and no [21:30:21] <_joe_> well if you're google, the dependency model of go makes sense [21:31:05] <_joe_> they prefer to have static binaries to run in their real-world implementation of kubernetes :) [21:31:12] This list has an awful lot of options -- https://github.com/golang/go/wiki/PackageManagementTools [21:31:36] <_joe_> btw, google just released grpc, which looks like a very nice thing for microservices building [21:31:41] <_joe_> still very very young [21:31:52] <_joe_> but has some concepts that are really interesting [21:31:56] I peeked at it after you dropped a link a few days ago [21:32:04] but didn't really dig in [21:32:24] <_joe_> streams and timeouts seem really really neat features [21:32:58] <_joe_> basically you escape the idea of having to build the full response at once and you just stream it to the client [21:33:15] <_joe_> which can process them as they come in [21:35:47] Does http/2 make that better than chunked encoding?
[21:36:40] * bd808 skims https://http2.github.io/http2-spec/#StreamsLayer [21:37:15] http/2 makes me feel a bit like a cobol programmer in 1995 [21:37:40] I can see a big new learning curve ahead and trying to forget a lot of stuff I "know to be true" [22:06:25] 6MediaWiki-Core-Team, 10MediaWiki-extensions-CentralAuth, 10SUL-Finalization, 5Patch-For-Review: Expose users_to_rename table publicly - https://phabricator.wikimedia.org/T76774#1084551 (10Legoktm) 5Open>3Resolved [22:14:53] <^d> manybubbles: I spotted https://phabricator.wikimedia.org/P346 in production, saw highlighting mentioned [22:15:42] ^d: looks real [22:16:44] common? [22:17:02] <^d> No, just once that I saw [22:17:05] 6MediaWiki-Core-Team, 10CirrusSearch: CirrusSearch: Track down highlighting error - https://phabricator.wikimedia.org/T91459#1084628 (10Manybubbles) 3NEW [22:17:10] ^d: cool. filed [22:17:16] like, the bot says [22:17:47] <^d> No, a couple [22:17:50] <^d> "A_Filantrópica" [22:18:00] <^d> "List of populated places in Hungary (A–Á)" [22:18:14] <^d> "Enka İnşaat ve Sanayi A.Ş" [22:18:25] <^d> Something with diacritics [22:19:38] <^d> manybubbles: Need any other examples? [22:19:57] if you see any then dump 'em in! [22:32:40] <^d> wikibugs died again? [22:32:55] <^d> manybubbles: Pasted all of them from today's log into the bug [22:32:59] <^d> Should give you plenty of examples [22:33:04] thanks! [22:33:18] ^d: it's been having issues today....trying to fix [22:33:28] <^d> thx [22:35:35] <^d> manybubbles: Some are gibberish. Most are fairly sane search queries looks like [22:37:22] 6MediaWiki-Core-Team, 10CirrusSearch, 7Wikimedia-log-errors: CirrusSearch: Track down highlighting error - https://phabricator.wikimedia.org/T91459#1084701 (10Manybubbles) It's certainly _something_ to do with the multibyte character handling. We're butchering it somewhere along the stack.
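The diacritics pattern in those titles is consistent with an offset-counting bug: highlight positions counted in UTF-8 bytes but applied as code points (or vice versa). A minimal, hypothetical Python illustration of that class of bug (the real stack here is PHP and Elasticsearch; this only shows how the two offset systems diverge on one of the titles from the log):

```python
# Illustrative only -- not CirrusSearch code.
title = "A_Filantrópica"             # one of the affected titles from the log
data = title.encode("utf-8")         # 'ó' occupies two bytes here

# Slicing by code points is safe:
by_chars = title[2:10]               # 'Filantró'

# Treating the same numbers as *byte* offsets cuts through the middle of
# the two-byte encoding of 'ó', leaving an undecodable fragment:
try:
    data[2:10].decode("utf-8")
    broken = False
except UnicodeDecodeError:
    broken = True                    # the highlight fragment is mangled
```

Any layer that hands byte offsets from the engine to code that counts code points (or the reverse) will garble exactly the titles containing non-ASCII characters, which matches what ^d observed.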
[22:37:33] 6MediaWiki-Core-Team, 10CirrusSearch, 7Wikimedia-log-errors: CirrusSearch: Track down highlighting error - https://phabricator.wikimedia.org/T91459#1084702 (10Manybubbles) [22:38:15] ah yes, because after you add logging it magically starts working [22:38:36] <^d> That's like my multiversion bug [22:38:41] <^d> I add more logging, it happens less [22:38:56] yay [23:00:19] legoktm: Your new centralauth exception is a great hit [23:00:39] In other words: stuff's broken for stewards :S [23:00:54] hoo: which one? [23:01:08] 2015-03-03 10:15:02 mw1085 enwikinews: [a2bd5aa3] /wiki/Special:CentralAuth/Bareok Exception from line 2069 of /srv/mediawiki/php-1.25wmf18/extensions/CentralAuth/includes/CentralAuthUser.php: Bad user row looking up local user Bareok@metawiki [23:01:54] is there a lot? [23:03:33] I see 3 [23:04:14] hoo: we can have SpecialCentralAuth catch the exceptions so users won't see it [23:04:36] Why is it an exception? Wouldn't a warning have sufficed? [23:04:44] I mean we log all 'centralauth' logs anyway [23:04:56] <^d> By, ApiQueryLogEvents.php is the most noisy right now [23:05:00] <^d> *btw [23:05:07] because it should never happen? [23:05:08] <^d> 274 Notice: Undefined index: 5::duration in /srv/mediawiki/php-1.25wmf19/includes/api/ApiQueryLogEvents.php on line 327 [23:05:09] <^d> 273 Notice: Undefined index: 5::duration in /srv/mediawiki/php-1.25wmf19/includes/api/ApiQueryLogEvents.php on line 325 [23:05:09] <^d> 273 Notice: Undefined index: 5::duration in /srv/mediawiki/php-1.25wmf19/includes/api/ApiQueryLogEvents.php on line 321 [23:05:43] legoktm: Ok, I guess so [23:05:57] Do we know how many there are? [23:06:01] And how to fix them? 
[23:06:25] I see 3 in today's logs [23:06:53] and I think they're global rename related, localuser has already been updated, but unless the job has run the local `user` table won't be updated [23:06:56] I mean how many broken records [23:07:04] mh ok [23:07:14] there shouldn't be any, I ran migratePass0.php a few days ago [23:07:40] btw... will we delete all the SULF stuff from CA after the migration? [23:07:47] If not it should go into its own extension [23:07:57] the extension is even messier these days than it used to be [23:08:08] that's probably ok for a few months now, but shouldn't be forever [23:08:53] dunno, I think we need to just kill all of CA :P [23:09:29] If you can get time to work on that [23:09:41] parts of it we surely want to retain (global rights, locks, ...) [23:14:40] legoktm: Do you (or Keegan) have any strategy for accounts like "Abuse filter"? [23:15:11] I was thinking of just ignoring reserved accounts? [23:15:22] legoktm: Yeah [23:15:46] I don't remember if AF reserves the account [23:15:56] I don't think it really does [23:15:57] btw [23:16:03] you seem to have also broken global rename [23:16:14] 2015-03-03 22:55:25 mw1095 metawiki: [21bff5fb] /wiki/Special:CentralAuth/VirtLynx Exception from line 2096 of /srv/mediawiki/php-1.25wmf19/extensions/CentralAuth/includes/CentralAuthUser.php: Could not find local user data for VirtLynx@ruwiki [23:16:24] wrong one [23:16:34] 6MediaWiki-Core-Team, 7HHVM, 7Wikimedia-log-errors: OOM reported at SpecialPage.php:534 due to large output from Special:Book - https://phabricator.wikimedia.org/T89918#1084970 (10tstarling) [23:17:49] forget that [23:18:00] so..not broken? [23:18:15] No, not beyond it breaking the special page it seems [23:18:24] hoo: Yeah, I was just asking legoktm about that before I responded to the mailing list, it looks like it works https://meta.wikimedia.org/wiki/Special:CentralAuth/Madtechnerd [23:18:51] bd808: what do you think of this?
https://gerrit.wikimedia.org/r/#/c/194221/ [23:19:08] this would allow creating full xhprof dumps to use with xhprof gui later [23:19:26] (which is much more useful than text dump) [23:20:14] cool. I'll check it out in a bit (in a meeting now) [23:36:05] 6MediaWiki-Core-Team, 7HHVM: HHVM with FastCGI does not support streaming output - https://phabricator.wikimedia.org/T91468#1085043 (10tstarling) 3NEW [23:36:45] TimStarling: does that include readfile()? [23:37:06] yes [23:51:13] 6MediaWiki-Core-Team, 10CirrusSearch, 7Wikimedia-log-errors: CirrusSearch: Track down highlighting error - https://phabricator.wikimedia.org/T91459#1085113 (10greg) p:5Triage>3Normal [23:54:55] I guess it's good the scalers are on zend