[00:00:10] <^d> *texvccheck [00:00:27] I am probably just misreading this because this is surely too stupid to be true [00:00:59] misreading what? [00:01:20] HHVM is apparently sending out a multilingual error message with informative text buried deep inside it: "PHP fatal error: request has exceeded memory limit" [00:01:35] then varnish apparently replaces this entire error message with its own error message [00:01:53]
[00:01:53] [00:01:53] Request: GET http://en.wikipedia.org/w/index.php?title=Special:Book&bookcmd=download&collection_id=3fd9b5e50dd79cbed462e304f6eb69c3a6e38a04&writer=rdf2latex, from 10.128.0.118 via cp1066 cp1066 ([10.64.0.103]:3128), Varnish XID 4091838518
Forwarded for: 121.217.2.15, 10.128.0.117, 10.128.0.117, 10.128.0.118
Error: 503, Service Unavailable at Thu, 19 Feb 2015 23:59:15 GMT [00:01:53]
[00:01:54]
[00:01:57] with no useful information [00:03:16] since we apparently have no usable fatal logging of our own, it would be nice if our users were at least able to tell us something about the errors they are seeing [00:03:54] hhvm-fatal will be back as soon as puppet runs everywhere [00:04:07] will it tell us URLs? [00:04:44] argh [00:04:46] sometimes [00:05:01] ok I have a reproduction procedure for T89918 [00:05:36] HHVM was apparently buffering its error messages for 5 minutes before even writing them to a local file [00:05:39] trying to trick me [00:06:37] "message repeated 5 times [00:06:42] bd808: for the record, the warnings and notices stream was created before you added hhvm fatals to it I think? [00:07:21] it was never on for more than 5 minutes. Sam and I enabled it for a few minutes during last hackathon in Amsterdam. [00:07:34] <^d> TimStarling: This is the one I was referring to...."Lipsește executabilul <code>texvccheck</code>; vedeți math/README pentru configurare. in /srv/mediawiki/php-1.25wmf17/extensions/Math/MathInputCheckTexvc.php on line 64" (Romanian for "The <code>texvccheck</code> executable is missing; see math/README for configuration.") [00:07:43] At the time, there were quite a few notices on every request, so it grew to gigabytes' worth of data in minutes on fluorine and was turned off. [00:07:44] <^d> :) [00:08:38] Wow, localised fatals? That's... useful. [00:08:55] <^d> Yeah, from math [00:10:01] <^d> $msg = wfMessage( 'math_notexvccheck' )->inContentLanguage()->escaped(); [00:10:02] <^d> trigger_error( $msg, E_USER_NOTICE ); [00:10:07] <^d> wfDebugLog( 'Math', $msg ); [00:11:17] <^d> I'm fixing that, it's silly [00:12:53] 3MediaWiki-Core-Team: error: request has exceeded memory limit in /srv/mediawiki/php-1.25wmf17/includes/specialpage/SpecialPage.php on line 534 - https://phabricator.wikimedia.org/T89918#1051867 (10tstarling) Using the 5xx logs on oxygen I obtained a URL which reliably causes an error at SpecialPage.php:534 as d...
[00:24:07] <^d> Krinkle: https://gerrit.wikimedia.org/r/#/c/191820/ :) [00:24:57] ^d: cool [00:25:39] ^d: btw, do we need the wfDebugLog there? We do listen for those notices in mediawiki natively now and direct to a log group (regardless of whether the developer knows where to find the php/hhvm/apache log) [00:26:05] <^d> Possibly not, I did think it weird to be logging in two places [01:06:28] 3MediaWiki-Core-Team: Slow AFComputedVariable users query - https://phabricator.wikimedia.org/T90036#1051933 (10aaron) [01:06:42] 3MediaWiki-Core-Team: Slow AFComputedVariable users query - https://phabricator.wikimedia.org/T90036#1051936 (10aaron) a:3aaron [01:10:54] TimStarling: is there anything else on https://gerrit.wikimedia.org/r/#/c/190409/6/includes/cache/BacklinkCache.php ? [01:12:58] 3MediaWiki-Core-Team: error: request has exceeded memory limit in /srv/mediawiki/php-1.25wmf17/includes/specialpage/SpecialPage.php on line 534 - https://phabricator.wikimedia.org/T89918#1051946 (10tstarling) I think the issue is a bug in HHVM's ob_get_status(). ExecutionContext::obGetStatus() has ``` if (l... [01:15:19] nothing else, just change that query to a join and I will give it +2 [01:30:53] 3MediaWiki-Core-Team: error: request has exceeded memory limit in /srv/mediawiki/php-1.25wmf17/includes/specialpage/SpecialPage.php on line 534 - https://phabricator.wikimedia.org/T89918#1051979 (10tstarling) Filed upstream: https://github.com/facebook/hhvm/pull/4868 [01:57:30] AaronS: is there a bug for using the edit stash with VE? [01:58:01] I thought there was, but I can't find it [01:58:15] i don't think so [01:59:03] How's your current workload? Do you think you might have time for it sometime soon? [02:02:22] possibly [02:04:31] OK, filed and assigned to you. If you don't get to it, that's OK. [02:07:21] ori: https://gerrit.wikimedia.org/r/#/c/191830/ [02:09:07] * AaronS is tired of that data disappearing for some reason or another [02:09:25] btw what is the plan for section profiling?
[02:11:37] there are two problems currently [02:12:16] txStatsD insists on 1 datagram == 1 metric; it won't split on "\n" or any other control character [02:12:33] so the batching in MediaWiki is making it barf, which is easy enough to resolve, but [02:12:54] txStatsD is overloaded and dropping metrics [02:14:03] so godog asked that we disable it until the second problem is resolved, which will happen once he either (1) replaces txStatsD with something that performs better, like Statsite (https://github.com/armon/statsite), or (2) provisions another txStatsD instance. [02:14:43] txStatsD uses Twisted, so it is asynchronous, but single-threaded. [02:14:50] yastatsd splits datagrams on newlines :) -- https://github.com/bd808/yastatsd/blob/master/yastatsd.py#L272 [02:15:00] no idea if it would scale though [02:15:09] probably not [02:15:14] I doubt it [02:15:29] I profiled txStatsD yesterday and it's not doing anything crazy: https://gist.github.com/atdt/7af58c30b6e7b6ad5c92 [02:15:37] we throw a lot of packets at it to handle in a single thread [02:15:41] yes [02:29:37] TimStarling: I made the query for https://gerrit.wikimedia.org/r/#/c/190409/7 less cool [02:29:40] is it OK now? [03:15:52] yeah, well I don't know if it is less cool but it is definitely less clever [03:16:07] you know I'm an anti-intellectual at heart right? [03:17:16] of course you do, you probably remember the s/iff/if/ thing ;) [03:39:50] did I flood off or something? [03:39:52] I should really debug that some day [03:44:52] yes [03:44:59] * TimStarling has quit (Excess Flood) [03:44:59] * TimStarling (~tstarling@vps3.angtim.com) has joined #mediawiki-core [03:48:07] I've already disabled away status tracking [04:01:26] TimStarling: /set cmd_queue_speed 0msec , /set cmds_max_at_once 0 [04:01:31] if you're on irssi [04:01:38] I'm not [04:01:46] dircproxy + ?
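[editor's note: a minimal sketch of the newline-splitting behaviour discussed above. This is not yastatsd's or statsite's actual code; it only assumes the standard statsd line format `name:value|type[|@rate]`, with one metric per line within a batched datagram.]

```python
def parse_datagram(data: bytes):
    """Split one UDP datagram into individual statsd metrics.

    txStatsD assumed one datagram == one metric; splitting on
    newlines (as yastatsd does) lets a client like MediaWiki batch
    several metrics into a single packet.
    """
    metrics = []
    for line in data.decode("utf-8", errors="replace").splitlines():
        line = line.strip()
        if not line:
            continue
        # statsd line format: <name>:<value>|<type>[|@<sample-rate>]
        name, _, rest = line.partition(":")
        value, _, type_and_rate = rest.partition("|")
        mtype, _, rate = type_and_rate.partition("|@")
        metrics.append((name, value, mtype, float(rate) if rate else 1.0))
    return metrics
```

For example, `parse_datagram(b"jobs:1|c\nsave.time:32|ms|@0.5")` yields two metrics, where txStatsD would have rejected the whole packet.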
[04:01:53] xchat [04:08:35] I think it's because you're joining too many channels at once (and getting /who) when your client connects [04:08:53] ZNC has MaxJoins, I don't think dircproxy has anything equivalent [04:10:53] well, I tried reconnecting to dircproxy while looking at network traffic from dircproxy to freenode [04:11:02] it seemed pretty conservative, a join/who every 2 seconds or so [04:11:39] but then it didn't flood off those times [04:21:19] the max. number of lines that your client can queue for processing is pretty conservative -- the default is 20 (in ), and freenode's setting is possibly even lower [04:22:21] i also notice "TimStarling has left IRC (Changing host)" before you reappear with your cloak, which means you're probably authenticating with nickserv by automatically sending it commands upon connection [04:23:29] you could try logging in with SASL or a client SSL certificate. Or, even easier, since you always seem to connect from vps3.angtim.com, just whitelist that address. [04:27:15] dircproxy doesn't have SASL support [04:27:48] anyway it seems unlikely that the nickserv commands would make it flood off [04:28:38] and the IP address vps3.angtim.com is public, since it hosts http://tstarling.com/ , so it doesn't concern me that I'm not cloaked from the start [04:29:16] yeah, i'm just thinking of ways to reduce the number of commands you send upon (re-)connecting [04:51:06] google is now providing web applications running on app engine with access to their security scanner [04:51:35] i'm trying to get a mediawiki instance running on app engine so i can sic a scan on it [04:52:27] the XSS detection seems potentially useful [04:52:35] "The scanner's cross-site script (XSS) injection test simulates an injection attack by inserting a benign test string into user-editable fields and performing a variety of user actions. 
Custom detectors observe the browser and DOM during this test to determine whether an injection was successful and assess its potential for exploitation." [05:04:10] https://www.mediawiki.org/wiki/Extension:GoogleAppEngine :) [05:05:10] heheh: note that parsoid at present is mostly suitable for organisations with an unlimited hardware budget [05:05:14] 3MediaWiki-Core-Team: error: request has exceeded memory limit in /srv/mediawiki/php-1.25wmf17/includes/specialpage/SpecialPage.php on line 534 - https://phabricator.wikimedia.org/T89918#1052318 (10tstarling) There's more to it than that, unfortunately. If I request a script via FastCGI which does only: ``` it's good to give our users options [05:06:34] currently they have a choice between an incredibly slow parser and an even slower one [05:06:40] I bet they are really loving us [05:07:17] the slower of the two attempts to camouflage its slowness by doing requests back to the first parser with a concurrency of 50 [05:07:37] a clever scheme but it might not go down well if you try it on your AWS free tier instance [05:07:59] http://www.google.com/trends/explore#q=mediawiki&cmpt=q&tz= [05:09:21] SOA will kill mediawiki, IMO. I don't see how gwicke can be sincere with his deb proposal. [05:09:57] By SOA I mean the particular flavor we seem to have settled on [05:10:00] I'm fairly certain that he only cares about the Foundation cluster use-case [05:10:40] even the right to fork seems to be very low priority [05:11:33] well, what alternative are we offering?
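[editor's note: a deliberately naive toy version of the reflected-injection check the scanner quote describes. The probe string and function are hypothetical, not Google's; a real scanner drives the browser and observes the DOM rather than grepping returned HTML.]

```python
import html

PROBE = "<xss-probe-7f3a>"  # hypothetical benign marker string

def reflected_unescaped(page_html: str) -> bool:
    """Return True if the probe came back verbatim, i.e. the input
    was reflected into the page without being entity-escaped."""
    return PROBE in page_html

# An app that escapes its output reflects html.escape(PROBE)
# instead, which no longer contains the raw probe string.
```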
[05:11:59] api team and librarization [05:12:29] with enhanced services as optional components for large deployments [05:12:39] which is all still valid today I think [05:12:50] except supporting VE without parsoid [05:13:03] that's not going to make me recommend mediawiki to someone looking for a wiki engine [05:13:12] we'd have to do a lot better than that [05:13:19] agreed [05:13:52] I lost the debate to use MediaWiki at $DAYJOB-1 to DokuWiki :( [05:14:11] laravel's popularity is exploding, so i took a look to see what it offers [05:14:15] (http://laravel.com/) [05:14:18] the ops perspective was that requiring a database was too complex [05:14:39] mediawiki has a sqlite backend, you know [05:14:46] "Instant PHP Platforms On Linode, DigitalOcean, and more. Push to deploy, PHP 5.6, HHVM, queues, and everything you need to launch and deploy amazing Laravel applications." [05:15:20] "The official Laravel local development environment. Powered by Vagrant, Homestead gets your entire team on the same page with the latest PHP, MySQL, Postgres, Redis, and more."
-- we have that, but MediaWiki-Vagrant has gotten too bloated, complex, and unstable IMO [05:15:53] https://laracasts.com/ is pretty amazing [05:16:11] the core roles are stable I think but it is turning into a mess with all the roles as it grows [05:16:30] I've been thinking about what a reboot might look like [05:16:41] I think multiwiki was a mistake [05:17:21] central auth was a highly asked-for option but it has turned it into a wikifarm without a great management system [05:17:22] the role-per-extension system is covering up how bad our extension management is [05:17:30] +1 [05:17:40] yeah [05:18:16] https://www.mediawiki.org/wiki/Third-party_MediaWiki_users_discussion/Summary it usually comes down to ACLs, ease of upgrading, extension management, and custom skinning [05:18:57] we are also packing in things (like my scholarships and grantreview roles) that have no connection to the wiki itself [05:19:12] well, i know, but that isn't the issue [05:19:20] everyone more or less knows what mediawiki is missing [05:19:24] or what should be improved [05:19:50] the problem is that there isn't a coherent proposal for how to get there that has anybody on board [05:20:05] too many cooks [05:20:18] what do you mean? [05:20:23] well, actually, I should say, legoktm's extension management stuff is an improvement [05:20:23] everybody wants their custom workflow to be supported [05:21:17] much like debates about changing code review -- many voices saying "the new system must not deviate from what I'm used to" [05:21:21] i don't see that really [05:21:49] i mean, the services thing proves that you don't need anyone to agree with you, right?
:) [05:22:00] I think legoktm is building a suite of interlocking solutions for real problems [05:22:16] the general feeling I get is that people weren't able to get their ideas implemented in core, so they get built as hacky extensions that just happen to work, core changes something, and then you get complaints about breaking changes when you try and fix the underlying problem [05:22:33] I wasn't really joking the last time I said we should just give him a team of 4-5 people to get all his ideas built [05:23:33] I don't think the Foundation would go for that [05:23:48] I'm fairly certain they wouldn't [05:24:06] I don't necessarily fault anyone for that, either [05:24:13] but as you said services shows that it's not impossible [05:24:26] The Wikimedia Foundation is not the MediaWiki Foundation [05:24:32] *nod* [05:25:05] I wish I knew how to do something tactical towards a MediaWiki foundation [05:25:10] I think it's reasonable for management to expect that work on MediaWiki be directed at ends that benefit Wikimedia wikis [05:25:12] but it's a bit outside my realm [05:26:06] well, I gave you my pitch [05:26:57] partner with a hosting provider basically right? [05:27:13] wiki.wiki or similar [05:27:26] "hosting provider" isn't really ambitious enough, but sorta [05:27:27] I think that there is a lot of money to be made with MediaWiki-as-a-Service. It's not free money, but it's a domain that would reward entrepreneurship coupled with some technical ability. [05:28:00] We should try to cultivate interest in that and possibly incubate something internally [05:28:04] well, I think you'd run into similar issues. if you're offering a MW hosting service, why would you spend time on improving the installation/upgrading experience for users? 
[05:28:21] because you are one yourself :) [05:28:44] the WMF manages installs and upgrades for 600+ wikis and yet the upgrade process still sucks :P [05:28:48] but it would be easy to fall into the fork trap that wikia did [05:29:10] it sucks because we have built up a tolerance for it [05:29:24] I think twentyafterfour has a chance to break that cycle [05:29:46] but no one is going to use the web updater on a WMF server, ever. [05:29:46] I technically did as well but I fumbled the opportunity [05:30:09] https://phabricator.wikimedia.org/T89945 [05:30:47] I just don't believe that any internal effort will succeed unless it proposes to engage more developers than it diverts [05:32:20] MediaWiki-as-an-appliance would be interesting as well but probably not as easy to monetize [05:33:04] Mediawiki-as-a-microwave [05:33:30] MicroWave Foundation? [05:33:33] microwave-as-a-mediawiki-as-a-service [05:33:33] rather than the kitchen sink that mw-v is moving towards, something like the web builder model that ori wrote a ticket about oh so long ago [05:33:53] if it can cook a hot pocket you might have something [05:33:59] Microwave-as-a-cloud [05:34:19] I like MicroWave Foundation [05:34:30] It'd better make next quarter's top 5 [05:34:35] the beauty is not only can it cook a hotpocket, but it can also tell you how to cook a hotpocket, and where to stick said pocket. [05:35:06] twentyafterfour: Does the amount of time to cook the hotpocket have to be entered using a complex series of templates? [05:35:32] bd808: appliance? [05:35:53] heh.
I worked on a prototype campaign for HP in 1999 that was about the internet of things before we had a name for it [05:35:57] well first you have to deploy the wikiwave which involves about 15 steps and lots of branching [05:35:59] {{cook|hotpocket|very hot|120V}} [05:36:13] one of the mock things was a microwave and fridge that could talk to each other [05:36:44] Next you'll be telling me we need VisualWave, which uses Waveoid to translate those templates into things a microwave user can understand [05:36:49] legoktm: pre-built VM images that you run on AWS, vSphere, etc [05:36:49] That's the future of the MicroWave Foundation [05:37:20] * Deskana feels a bit bad for derailing this very interesting conversation [05:37:45] that's what PMs do ;) [05:37:52] I was waiting for you to say that ;-) [05:38:28] The only thing I'll miss about being a temporary PM is the loss of credibility in making PM jokes [05:39:41] I'll pretend to be an engineer [05:39:47] bd808: What you ask is impossible! [05:40:24] +1 [05:40:41] (I feel weird pretending to be a bad engineer when nobody in this room is actually like this) [05:41:05] I said crap like that a lot in my early 20s [05:41:19] I grew out of it [05:41:29] now I just ask for more time and people [05:41:35] +1 [06:04:49] bd808: could you review https://gerrit.wikimedia.org/r/#/c/183314/ ? It's the last thing blocking moving it to a separate library [06:05:31] legoktm: yeah. is tomorrow morning ok? [06:05:37] sure [06:05:44] I owe you a review on the merge plugin too :( [06:17:54] legoktm: is there an example I could follow which shows how Config/ConfigFactory are to be used to configure external libraries? Is the idea that I should subclass the main class of the external library? [06:18:37] I don't think anyone has used Config for that yet [06:19:01] you could subclass, or you could just have a wrapper around the library and pass it configuration through the constructor?
[06:19:48] symfony also has a "Config" component but I haven't had time to look into it yet [06:20:35] yeah I think this makes sense [06:24:29] legoktm: also, when should I use "main" and when not? What is the proper occasion for creating another factory instance? [06:24:36] or rather another Config instance [06:25:54] main is for any configuration setting that comes from core, each extension/skin should have their own Config instance [06:26:18] ok, but everything in core should use main, yeah? [06:26:22] yes [06:27:31] how would you feel about having a simple shortcut, ConfigFactory::main(), which just calls self::getDefaultInstance()->makeConfig( 'main' ) ? [06:27:44] "self::getDefaultInstance()->makeConfig( 'main' )" is a bit of a mouthful [06:27:59] or rather "ConfigFactory::getDefaultInstance()->makeConfig( 'main' )" [06:28:15] that's eight words :P [06:28:21] config factory get default instance make config main [06:29:22] I think Daniel kinda intended it to be like that, the idea was that you'd use dependency injection to pass around the Config object rather than going to the factory every time (only the top level would do that) [06:30:25] so special pages and API modules in core just use $this->getConfig() which gets them the right instance [06:30:39] ah, ok [06:30:58] in SkinVector for example, $this->config is set in the constructor, and that gets used everywhere [06:31:13] but what about top-level functions? 
[06:31:29] well, it's a minor point anyway [06:33:19] well ideally you'd only use the factory once [06:33:52] in load.php we create the Config object, pass it to ResourceLoader and I'm pretty sure that's the only time we use the ConfigFactory to get 'main' for the entire request [06:34:43] StatsCounter.php [06:36:39] StatCounter is a singleton so the "top" for it is the singleton() function which is where ConfigFactory is used [06:39:58] it mainly falls apart when you have an extension and each hook it uses needs to access Config, but it's static so you have to use the ConfigFactory [07:47:49] 3Librarization, MediaWiki-Core-Team: Move utfnormal functions into includes/libs/ - https://phabricator.wikimedia.org/T86069#1052440 (10Legoktm) [07:48:14] 3Librarization, MediaWiki-Core-Team: Move utfnormal functions into includes/libs/ - https://phabricator.wikimedia.org/T86069#960954 (10Legoktm) One patch left: https://gerrit.wikimedia.org/r/#/c/183314/ ! [07:48:31] 3Librarization, MediaWiki-Core-Team: Move utfnormal functions into includes/libs/ - https://phabricator.wikimedia.org/T86069#1052444 (10Legoktm) [09:27:26] godog: I worked on https://gerrit.wikimedia.org/r/191854 today [09:28:29] I also experimented with a statsd traffic generator, https://github.com/octo/statsd-tg , to test a few servers [09:28:33] ori: sweet! I should be ~done for this week with restbase so today I'm working on graphite/statsd and friends [09:29:27] ori: what servers did you test? IIRC statsite was on the cards, perhaps a few others [09:29:41] txStatsD is a sad case, drops metrics almost straight away [09:29:45] statsite was fast [09:29:51] those were the two [09:30:32] I also profiled txstatsd on graphite1001 for about a minute, just to see if there's any low hanging fruit for optimization [09:30:35] results: https://gist.github.com/atdt/7af58c30b6e7b6ad5c92 [09:30:50] tl;dr: no, there aren't. it's not doing anything insane. it's just python, single-threaded. 
[09:32:19] indeed it is just slow :( [09:32:28] or not very efficient [09:32:50] I don't have fancy numbers for statsite sadly, I only got around to trying it with statsd-tg with the default settings (invoked with no arguments) [09:33:00] it kept up; txstatsd did not [09:34:04] that's it from me, about to crash since i have a bit of a fever [09:34:11] anything you want me to look at tomorrow? [09:34:45] ori: oh ok! take care! we could swap one for the other, modulo figuring out what to do with the metrics that are named differently [09:34:53] i.e. the trailing .value that txstatsd adds [09:35:16] ori: anyway ping me tomorrow I should be around [09:35:42] could literally just rename the files; I don't think whisper files have any sense of what they're called [09:35:47] nod [09:35:53] good morning / good night [10:01:42] 3MediaWiki-Core-Team, wikidata-query-service: Investigate ArangoDB for Wikidata Query - https://phabricator.wikimedia.org/T88549#1052564 (10Smalyshev) 5Open>3Resolved [10:03:33] 3MediaWiki-Core-Team, wikidata-query-service: Port Wikidata-Gremlin to test against Neo4j - https://phabricator.wikimedia.org/T88821#1052568 (10Smalyshev) 5Open>3Resolved [10:06:46] 3operations, MediaWiki-Core-Team: Unexpected N4HPHP13DataBlockFullE - https://phabricator.wikimedia.org/T89958#1052584 (10hashar) Should we start monitoring the cold cache usage (maybe via diamond) and alarm on it?
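[editor's note: whisper files don't record their own metric name, so the migration ori and godog discuss could indeed be a plain file rename. This is an illustrative sketch, not the script actually used; the `.value` suffix is txstatsd's naming convention, and in practice you would stop the writer before renaming.]

```python
import os

def strip_value_suffix(root):
    """Rename foo.value.wsp -> foo.wsp under `root`, returning the
    (old, new) pairs renamed. Skips a rename that would clobber an
    existing file, since that metric already exists under the new name."""
    renamed = []
    for dirpath, _, filenames in os.walk(root):
        for fname in filenames:
            if not fname.endswith(".value.wsp"):
                continue
            old = os.path.join(dirpath, fname)
            new = os.path.join(dirpath, fname[:-len(".value.wsp")] + ".wsp")
            if not os.path.exists(new):
                os.rename(old, new)
                renamed.append((old, new))
    return renamed
```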
[10:24:18] 3MediaWiki-Core-Team, wikidata-query-service: Finish selecting BlazeGraph for wikidata query - https://phabricator.wikimedia.org/T90101#1052620 (10Manybubbles) 3NEW [10:25:18] 3MediaWiki-Core-Team, wikidata-query-service: Figure out if Neo4j is a possible alternative to Titan - https://phabricator.wikimedia.org/T88571#1052629 (10Manybubbles) a:5Manybubbles>3Smalyshev [10:27:44] 3MediaWiki-Core-Team, wikidata-query-service: Build a tool for synchronizing Wikidata changes into Blazegraph - https://phabricator.wikimedia.org/T89852#1052633 (10Manybubbles) [10:27:52] 3MediaWiki-Core-Team, wikidata-query-service: Investigate inference rules and truth maintenance in Blazegraph - https://phabricator.wikimedia.org/T89849#1052634 (10Manybubbles) [10:29:20] 3MediaWiki-Core-Team, wikidata-query-service: Investigate BlazeGraph aka BigData for WDQ - https://phabricator.wikimedia.org/T88717#1052640 (10Manybubbles) [10:31:25] 3MediaWiki-Core-Team, wikidata-query-service: Finish selecting BlazeGraph for wikidata query - https://phabricator.wikimedia.org/T90101#1052657 (10Manybubbles) Ping @Haasepeter and @Beebs.systap on the new task for confirmation. [10:31:27] 3MediaWiki-Core-Team, wikidata-query-service: Investigate BlazeGraph aka BigData for WDQ - https://phabricator.wikimedia.org/T88717#1052659 (10Manybubbles) >>! In T88717#1046235, @Beebs.systap wrote: >>>! In T88717#1045422, @Manybubbles wrote: >> @Beebs.systap - was reviewing code and saw some documentation typo... [10:31:39] 3MediaWiki-Core-Team, wikidata-query-service: Confirm selection of BlazeGraph for wikidata query - https://phabricator.wikimedia.org/T90101#1052660 (10Manybubbles) [10:35:21] 3MediaWiki-extensions-SecurePoll, MediaWiki-Core-Team: Board election SecurePoll type fails with exception on creation - https://phabricator.wikimedia.org/T89425#1052669 (10Aklapper) Could anomie's patch in Gerrit (14 lines) please get a review? 
[10:36:08] 3MediaWiki-Core-Team, wikidata-query-service: BlazeGraph Finalization: Operational issues - https://phabricator.wikimedia.org/T90103#1052673 (10Manybubbles) 3NEW [10:57:45] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Zookeeper - https://phabricator.wikimedia.org/T90109#1052761 (10Manybubbles) 3NEW a:3Joe [10:58:16] 3wikidata-query-service, MediaWiki-Core-Team: Investigate Openlink Virtuoso for WDQ - https://phabricator.wikimedia.org/T90110#1052768 (10Manybubbles) 3NEW a:3Manybubbles [10:58:23] 3wikidata-query-service, MediaWiki-Core-Team: Investigate Openlink Virtuoso for WDQ - https://phabricator.wikimedia.org/T90110#1052777 (10Manybubbles) p:5Triage>3Normal [11:01:10] 3MediaWiki-Core-Team: Investigate Apache Jena for WDQ - https://phabricator.wikimedia.org/T90112#1052797 (10Manybubbles) 3NEW [11:07:58] 3wikidata-query-service, MediaWiki-Core-Team: Investigate Openlink Virtuoso for WDQ - https://phabricator.wikimedia.org/T90110#1052818 (10Manybubbles) The big blocker here is that Virtuoso's HA features being enterprise only. At least, that is my understanding of it right now. [11:18:34] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Update performance - https://phabricator.wikimedia.org/T90114#1052844 (10Manybubbles) 3NEW a:3Manybubbles [11:18:54] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Update performance - https://phabricator.wikimedia.org/T90114#1052844 (10Manybubbles) Assigned to myself. Will talk to BlazeGraph developers about it in person on Tuesday of next week and update. 
[11:19:21] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Update performance - https://phabricator.wikimedia.org/T90114#1052854 (10Manybubbles) p:5Triage>3Normal [11:19:28] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Zookeeper - https://phabricator.wikimedia.org/T90109#1052856 (10Manybubbles) p:5Triage>3Normal [11:19:43] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Operational issues - https://phabricator.wikimedia.org/T90103#1052859 (10Manybubbles) p:5Triage>3Normal [11:30:38] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Security Review - https://phabricator.wikimedia.org/T90115#1052870 (10Manybubbles) 3NEW [11:32:07] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Machine Sizing/Shaping - https://phabricator.wikimedia.org/T90116#1052877 (10Manybubbles) 3NEW a:3Smalyshev [11:33:18] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Scale out plans - https://phabricator.wikimedia.org/T90117#1052885 (10Manybubbles) 3NEW [11:34:37] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Scale out plans - https://phabricator.wikimedia.org/T90117#1052885 (10Manybubbles) One option might be to keep the "truthy" dump on nice SSDs and make sure those are super duper fast but to support the more reified forms only on other machine... 
[11:34:56] 3wikidata-query-service, MediaWiki-Core-Team: Confirm selection of BlazeGraph for wikidata query - https://phabricator.wikimedia.org/T90101#1052895 (10Manybubbles) p:5Triage>3Normal [11:49:51] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: RDF Issues - https://phabricator.wikimedia.org/T90119#1052919 (10Manybubbles) 3NEW [11:51:02] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Prefix vs Suffix - https://phabricator.wikimedia.org/T90121#1052933 (10Manybubbles) 3NEW [11:51:38] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Prefix vs Suffix - https://phabricator.wikimedia.org/T90121#1052933 (10Manybubbles) This should be one of the topics for our Wednesday call between @haasepeter and Markus. [11:54:09] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: What about non-wikidata ontologies - https://phabricator.wikimedia.org/T90122#1052941 (10Manybubbles) 3NEW [12:10:25] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Value representation - https://phabricator.wikimedia.org/T90123#1052949 (10Manybubbles) 3NEW [12:11:59] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Value representation - https://phabricator.wikimedia.org/T90123#1052949 (10Manybubbles) Another point: units are identified by URIs. They usually point to wikidata. If they don't, do we just throw them out? What about dates? Dates have a... 
[12:13:13] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Verify features work - https://phabricator.wikimedia.org/T90124#1052957 (10Manybubbles) 3NEW [12:13:50] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Investigate inference rules and truth maintenance - https://phabricator.wikimedia.org/T89849#1052963 (10Manybubbles) [12:14:14] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Investigate inference rules and truth maintenance - https://phabricator.wikimedia.org/T89849#1046959 (10Manybubbles) [12:14:15] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Verify features work - https://phabricator.wikimedia.org/T90124#1052957 (10Manybubbles) [12:14:36] 3wikidata-query-service, MediaWiki-Core-Team: WDQ: Investigate Bitsy for single node users - assuming it supports TinkerPop 3 and we're still using TinkerPop 3 - https://phabricator.wikimedia.org/T88715#1052966 (10Manybubbles) 5Open>3declined [12:14:57] 3wikidata-query-service, MediaWiki-Core-Team: Figure out if Neo4j is a possible alternative to Titan - https://phabricator.wikimedia.org/T88571#1052969 (10Manybubbles) 5Open>3Resolved [12:15:13] 3wikidata-query-service, MediaWiki-Core-Team: Build a tool for synchronizing Wikidata changes into Blazegraph - https://phabricator.wikimedia.org/T89852#1052970 (10Manybubbles) p:5Normal>3Low [12:16:11] 3wikidata-query-service, MediaWiki-Core-Team: Build a tool for synchronizing Wikidata changes into Blazegraph - https://phabricator.wikimedia.org/T89852#1047037 (10Manybubbles) This is slightly less important than validating that BlazeGraph works. Lowering priority. 
[12:28:13] 3MediaWiki-Core-Team: Slow AFComputedVariable users query - https://phabricator.wikimedia.org/T90036#1052982 (10Aklapper) [13:18:37] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Investigate inference rules and truth maintenance - https://phabricator.wikimedia.org/T89849#1053064 (10Manybubbles) [13:18:37] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Verify features work - https://phabricator.wikimedia.org/T90124#1053065 (10Manybubbles) [13:19:03] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Investigate inference rules and truth maintenance - https://phabricator.wikimedia.org/T89849#1046959 (10Manybubbles) [13:19:04] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Verify features work - https://phabricator.wikimedia.org/T90124#1052957 (10Manybubbles) [13:23:53] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Validate AST rewrite - https://phabricator.wikimedia.org/T90128#1053068 (10Manybubbles) 3NEW [13:25:14] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Range Queries - https://phabricator.wikimedia.org/T90129#1053074 (10Manybubbles) 3NEW [13:25:51] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Geo - https://phabricator.wikimedia.org/T90130#1053080 (10Manybubbles) 3NEW [13:26:56] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Pluggable inline values - https://phabricator.wikimedia.org/T90131#1053087 (10Manybubbles) 3NEW [13:28:30] 3Services, MediaWiki-Core-Team: Authn and authz as a service - https://phabricator.wikimedia.org/T84962#1053099 (10Anomie) [13:28:31] 3MediaWiki-API, RESTBase, MediaWiki-Core-Team: Action API modules to support Restbase - https://phabricator.wikimedia.org/T88010#1053097 (10Anomie) 5Open>3Resolved [14:30:24] 3wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: RDF Issues - 
https://phabricator.wikimedia.org/T90119#1053239 (Smalyshev) Currently, we have two known issues with our RDF vs. BlazeGraph: # Date values (aka "13 billion BCE") # Geopoints notation
[15:20:09] MediaWiki-API, MediaWiki-Core-Team: ApiQueryBase::titlePartToKey doesn't work correctly with the # character - https://phabricator.wikimedia.org/T38358#1053420 (Anomie) a:Anomie
[16:04:36] hmm... no aaron
[16:05:43] MediaWiki-API, MediaWiki-Core-Team: ApiQueryBase::titlePartToKey doesn't work correctly with the # character - https://phabricator.wikimedia.org/T38358#1053700 (Umherirrender) Open>Resolved The error 'invalidtitle' will now be reported when using hashes, it will be part of MediaWiki 1.25wmf19, see https:...
[16:06:56] aude: pretty early for him
[16:08:11] just want him to be aware of https://gerrit.wikimedia.org/r/#/c/191870/
[16:08:19] hopefully easy fix and can be re-reverted
[16:13:51] :( Hopefully
[16:19:31] <^d> SF folks wake up too late
[16:20:04] no kidding
[16:24:33] MediaWiki-extensions-AbuseFilter, MediaWiki-Core-Team: Slow AFComputedVariable users query - https://phabricator.wikimedia.org/T90036#1053846 (Se4598)
[16:26:55] lazy SF folks wanting timezone relative work hours
[16:31:05] ori: I've started enumerating what to rename, sadly statsite doesn't support meters but it shouldn't be too bad https://phabricator.wikimedia.org/T90111#1053893
[16:32:34] back when we had more than 3 ops in SF, they wouldn't show up until at least 10am.
[16:35:19] still haven't decided if I prefer late or early meetings, perhaps the latter but I'm in the wrong part of the world then
[16:36:04] anomie: my inbox is dying :( (also :D)
[16:36:51] legoktm: How's the AuthStuff RFC doing?
[16:37:58] good! I should have it up in an hour
[16:38:44] good!
[16:38:56] godog: the latter as in earlier :)
[16:40:21] One thing I hate about Phabricator: finding the right project is a huge pain if you don't already know what the name of the project is.
[16:41:15] yeah, hierarchies were nice
[16:41:30] greg-g: hehe that too
[16:48:22] anomie: I feel you
[16:48:41] * Nemo_bis still uses https://old-bugzilla.wikimedia.org/describecomponents.cgi regularly
[16:51:43] anomie: is it with upstream yet? surely the dropdown could score results based on the project description too
[16:52:34] godog: No idea.
[16:55:49] Labs-Vagrant, MediaWiki-Core-Team: labs-vagrant list-roles fails after updating vagrant repo - https://phabricator.wikimedia.org/T90176#1053655 (bd808)
[17:05:26] <^d> manybubbles: So I was declined to speak on my own. Am I reading your e-mail right that they're offering us a dual-speaker thing, though?
[17:05:52] ^d: I have no fucking clue
[17:06:09] I didn't realize they didn't take your talk
[17:06:16] I was going on the assumption that you would be there too
[17:06:44] <^d> Yeah I got the "Thanks but due to the # of really good proposals..." e-mail
[17:13:09] <^d> manybubbles: They're putting you up in the Parc55? Fancy :)
[17:13:46] Wikipedia-App-Android-App, MediaWiki-API, Wikipedia-App-iOS-App, MediaWiki-Core-Team: Allow triggering of user password reset email via the API - https://phabricator.wikimedia.org/T32788#1054137 (Deskana)
[17:17:23] <^d> manybubbles: I'm going to e-mail him back to clarify. Assuming the answer is "yes" -- do you want to speak together or did you want to go solo?
[17:18:21] AaronS: if you can resubmit the patch, with handling for temporary tables in tests, i'm happy to review + merge
[17:18:30] * aude didn't have time to investigate today
[17:18:48] e.g. review tomorrow
[17:24:34] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Scale out plans - https://phabricator.wikimedia.org/T90117#1054174 (Jdouglas) > BlazeGraph doesn't support clustering - only high availability Does this refer to update scaling, or just query scaling? Blazegraph supports replication clust...
[17:33:00] aude: it's pretty annoying since it greatly limits what subqueries/union can do just due to the test being different
[17:33:50] AaronS: yeah :/
[17:34:28] or if particular tests can be skipped, perhaps?
[17:34:33] on mysql
[17:34:51] although not sure that's an option in this case
[17:36:40] hey godog
[17:36:51] yo ori
[17:37:04] aude: it should pass with --use-normal-tables
[17:37:16] hmmm
[17:37:47] not sure that's what we want unless we got rid of the temporary tables option entirely
[17:38:09] core tests should pass with default options on mysql, imho
[17:38:11] godog: yeah, lack of meters doesn't seem critical
[17:38:19] * bd808 just realized that "triplestore" is basically prolog rules & facts
[17:38:27] bd808: :)
[17:39:28] All I really remember about prolog was that the towers of hanoi solver is elegant and I barely passed the class
[17:40:10] ori: yup, also limited usage it seems
[17:43:16] godog: packaging *should* be straightforward; scons is the only build dependency iirc
[17:45:20] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Value representation - https://phabricator.wikimedia.org/T90123#1054261 (Jdouglas) > some archaic unit that can't be reliably converted to other units My car gets forty rods to the hogshead and that's the way I likes it.
[17:45:52] anything I can do to help?
[17:46:36] ori: sure, getting rid of meters in favor of counters would be nice
[17:47:09] godog: counters don't work in txstatsd for some reason, that's why we have meters everywhere
[17:47:53] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Value representation - https://phabricator.wikimedia.org/T90123#1054264 (Jdouglas) > If [units] don't [point to Wikidata], do we just throw them out? Could we instead add them as needed to Wikidata, and point to them accordingly?
[17:52:15] ori: I'm looking for counters; I'm fairly sure I've seen them used, but not 100% sure if they work or not
[18:08:45] ori: sigh, fail, they don't work indeed; as per the "statsd spec" they seem to behave as gauges instead
[18:09:51] godog: if there is an afterlife i am sure that i will pay for the choice of txstatsd
[18:09:54] if that is any consolation
[18:10:52] ori: hahaha I am happy enough that we have some traction to get rid of it now
[18:11:57] so I didn't really want to have the transition in lockstep with changing meters to counters
[18:12:31] yes, that would suck. i'm trying to think of a nicer way.
[18:12:40] we could just duplicate all the meters now and log them as counters as well
[18:12:50] but the name would be different
[18:12:53] nah, that wouldn't work
[18:17:15] bd808: Sorry to bug, but could we get code review on https://gerrit.wikimedia.org/r/#/c/189049/ please? Last review was on 9 February.
[18:17:49] anomie, bd808, csteipp: https://www.mediawiki.org/wiki/Requests_for_comment/AuthManager
[18:22:41] anomie: can you give https://gerrit.wikimedia.org/r/#/c/189049 another look? It looks to me like Alex has addressed your comments but I'm not 100% sure.
[18:22:54] James_F: ^ poke delivered ;)
[18:23:47] Who was it that pointed out that "Manager" in any object name is probably not a good name?
[18:24:20] bd808: I don't see any problems, although the stated purpose still seems weird.
[18:25:06] csteipp: AaronSchulz on https://gerrit.wikimedia.org/r/#/c/187840/, I think
[18:25:08] csteipp: not me. I used it to make manager jokes
[18:25:16] bd808: Ta. :-)
[18:25:37] James_F: can you help anomie understand the "why" of this patch?
[18:25:51] He rightly hates to see new hooks without good reason
[18:25:53] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Geo - https://phabricator.wikimedia.org/T90130#1054401 (Jdouglas) Perhaps some Frankensteinian creation from Blazegraph + Elasticsearch + GeoSPARQL?
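[Editor's note: the meters-vs-counters confusion above comes down to the statsd line protocol, where the metric type is a one- or two-letter suffix. A minimal sketch, assuming plain UDP transport; the metric names are made up for illustration:]

```python
import socket


def statsd_packet(name: str, value: int, metric_type: str) -> bytes:
    """Encode one metric in the plain-text statsd line protocol.

    Standard type codes: 'c' = counter, 'g' = gauge, 'ms' = timer.
    'm' (meter) is a txstatsd extension, which is why statsite and
    most other statsd implementations don't understand it.
    """
    return f"{name}:{value}|{metric_type}".encode()


def send(packet: bytes, host: str = "localhost", port: int = 8125) -> None:
    # statsd listens on UDP: fire-and-forget, no response expected.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(packet, (host, port))
    finally:
        sock.close()


# Hypothetical metric name; same event emitted as counter vs. meter:
assert statsd_packet("MediaWiki.edits", 1, "c") == b"MediaWiki.edits:1|c"
assert statsd_packet("MediaWiki.edits", 1, "m") == b"MediaWiki.edits:1|m"
```

[Since both forms share the metric name and differ only in the type suffix, "duplicate all the meters as counters" would indeed require distinct names, as noted above.]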
[18:26:04] and the linked bug is less than revealing
[18:26:09] bd808: https://gerrit.wikimedia.org/r/#/c/191221/ is the WikiEditor patch using the hook.
[18:27:13] ori: I have to go, I'll think it over too
[18:27:27] godog: rest well
[18:28:13] {{File:Sting.ogg}}
[18:28:19] hahaha
[18:28:22] that was not deliberate at all
[18:29:02] csteipp: btw, google unveiled https://cloud.google.com/tools/security-scanner/ yesterday. closed-source, but you can run it for free on google app engine apps. the most interesting thing looks like their XSS detection. i was messing around with getting a mediawiki instance up, but since so much of our JS code is in extensions, it'd be hard to reproduce prod on GAE
[18:29:09] hahaha SCNR! alright I'm off, have a good weekend
[18:29:10] might be worth pinging them to ask if they could run it against our site
[18:29:25] you too godog
[18:29:27] James_F: So I think part of the confusion is that https://meta.wikimedia.org/wiki/Schema:Edit says "Logs generic events related to editing activity. All events are logged client-side."
[18:29:34] but this is a server side logging hook
[18:29:44] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Range Queries - https://phabricator.wikimedia.org/T90129#1054422 (Jdouglas)
[18:29:56] wikidata-query-service, MediaWiki-Core-Team: BlazeGraph Finalization: Range Queries - https://phabricator.wikimedia.org/T90129#1054423 (Jdouglas) a:Jdouglas
[18:30:15] bd808: Yeah. ori and Krenair decided it was better to use server-side logging in this case, despite the original intent.
[18:30:32] ori: Nice! I've been building out a tool based on a demo I saw of that in Dec.
[18:31:02] bd808, that's not the only thing wrong with this schema :)
[18:31:13] *nod* anomie does any of that help?
[18:31:36] bd808: Meh. If people want to do crazy stuff with the event logging, I don't really care.
[18:31:43] heh
[18:33:40] csteipp: i thought about doing something clowny like having an app on app engine that acts as a thin reverse-proxy for WMF prod, but there is an explicit proviso in the terms telling you not to do that
[18:37:56] csteipp: The debate on http://c2.com/cgi/wiki?DontNameClassesObjectManagerHandlerOrData makes my head hurt. Class naming purists don't help me get work done.
[18:38:44] <^d> I was seeing the "recursion detected" bug and for a split second contemplated going after it
[18:38:48] <^d> Then I came to my senses
[18:38:56] ^d: DO IT!
[18:39:54] bd808: No worries, I'm not a purist. If we're spending the time doing a refactor, we should make sure it's good quality. They make some good points about how, long term, everything can get added to a "manager" class.. but we can document the scope of stuff it should handle even if the name is general.
[18:40:15] ^d: This might be the real answer to the recursion bug -- https://phabricator.wikimedia.org/T43103
[18:40:41] csteipp: AuthenticationOracle?
[18:40:44] <^d> Hmm
[18:41:13] AuthenticatedUserFactory
[18:41:44] AuthenticationCoordinator
[18:43:00] AuthenticationProviderCoordinator? Meh, then we'll have APC all over our code..
[18:43:16] AuthenticationBroker?
[18:44:02] "a person who buys and sells goods or assets for others" [18:44:17] But the verb form "arrange or negotiate (a settlement, deal, or plan)" [18:44:23] that actually sort of fits [18:44:32] <^d> http://www.thesaurus.com/browse/coordinator [19:12:54] <^d> Hehe [19:12:59] <^d> 48 Pool error on key CirrusSearch-Search:_elasticsearch_enwiki during full_text search for 'why home work should be banned': pool-queuefull [Called from {closure} in /srv/mediawiki/php-1.25wmf17/extensions/CirrusSearch/includes/Searcher.php at line 1063] in /srv/mediawiki/php-1.25wmf17/includes/debug/MWDebug.php on line 300 [19:13:18] <^d> That's poolcounter doing its job and keeping someone from mashing F5 on the same search again and again [19:26:01] 6MediaWiki-Core-Team, 10wikidata-query-service: BlazeGraph Finalization: Geo - https://phabricator.wikimedia.org/T90130#1054866 (10Manybubbles) Its in their roadmap. Maybe we can influence some. I'm not sure why but they seem to have some people already using it but maybe they rigged it up themselves. [19:27:59] 6MediaWiki-Core-Team, 10wikidata-query-service: BlazeGraph Finalization: Value representation - https://phabricator.wikimedia.org/T90123#1054872 (10Manybubbles) We could create a report of measures with nonlocal units and fix them during volunteer time. Or let someone write a not for it. Then we'd have to hav... [19:32:21] 6MediaWiki-Core-Team, 10wikidata-query-service: BlazeGraph Finalization: Scale out plans - https://phabricator.wikimedia.org/T90117#1054913 (10Manybubbles) Already asked to get statistics on current usage but I imagine we can add new nodes if we need to. I think we'll have to see what load is like once we get... [19:51:33] <^d> aude: 13 Failed to load latest revision of entity Q19322067! This may indicate entries missing from thw wb_entities_per_page table. 
[Called from Wikibase\EditEntity::fixEditConflict in /srv/mediawiki/php-1.25wmf17/extensions/Wikidata/extensions/Wikibase/repo/includes/EditEntity.php at line 492] in /srv/mediawiki/php-1.25wmf17/includes/debug/MWDebug.php on line 300
[19:55:09] ^d: we have a script that repairs those (that runs weekly, i think) and a known issue / bug for this
[19:55:15] i could run the script now
[19:55:36] <^d> Ok good there's a task :)
[20:03:53] i do see quite a few log entries for this, though
[20:21:01] legoktm: what's going on with https://gerrit.wikimedia.org/r/#/c/179516/ ?
[20:21:32] AaronSchulz: we haven't run into it lately so I never followed up on it...
[20:22:43] I think we should just not batch moves and have one job per move instead
[20:23:52] sounds fine to me
[20:45:49] MediaWiki-Core-Team: Triage Mediawiki-API tasks - https://phabricator.wikimedia.org/T90003#1055217 (Anomie) Open>Resolved
[21:21:42] MediaWiki-Core-Team, MediaWiki-Page-editing, I18n: Long edit comments get entirely removed instead of truncated (error in cutting multibyte chars?) - https://phabricator.wikimedia.org/T85700#1055309 (Umherirrender) a:Umherirrender
[21:57:22] <^d> anomie: I just filed T90287. I think rMWaafae192b09dc0418fc4a1bb18c90464268d9584 fixes that though?
[21:57:26] * ^d saw it seconds later
[21:59:34] MediaWiki-Core-Team: Database constructor cleanups - https://phabricator.wikimedia.org/T90288#1055448 (aaron) NEW a:aaron
[22:00:18] MediaWiki-Core-Team, MediaWiki-API, Wikimedia-log-errors: API blocks query module causes PHP undefined property notice if bkprop parameter does not include 'timestamp' - https://phabricator.wikimedia.org/T89893#1055459 (Krenair)
[22:02:02] <^d> Krenair: Ah, dupe.
Thx
[22:03:06] :)
[22:06:46] <^d> "Unable to jump to row 0 on MySQL result index 19 in /srv/mediawiki/php-1.25wmf17/includes/db/DatabaseMysqli.php on line 264"
[22:06:48] <^d> So weird
[22:10:22] <^d> AaronSchulz: On the subject of OOMs...we've got one that keeps popping up in SectionProfiler
[22:10:26] <^d> Line number useless, obvs.
[22:23:53] ^d: Yeah, same log message as T89893
[22:27:16] <^d> Yeah, thanks for already taking care of it
[22:43:46] ^d: https://gerrit.wikimedia.org/r/#/c/191823/
[22:46:06] legoktm: If you get bored and want to play with UI changes -- https://phabricator.wikimedia.org/T75062
[22:49:18] not really :/
[23:11:19] <^d> AaronSchulz: https://gerrit.wikimedia.org/r/#/c/192017/
[23:25:25] bd808: poke again for the merge-plugin patch? I think we need that to use composer for CI (instead of vendor)
[23:26:37] legoktm: *nod* I'll review now!!!!
[23:26:57] thanks :)
[23:27:18] MediaWiki-Core-Team, wikidata-query-service: BlazeGraph Finalization: Range Queries - https://phabricator.wikimedia.org/T90129#1055704 (Jdouglas) For dates [[ https://en.wikibooks.org/wiki/XQuery/SPARQL_Tutorial#Employees_hired_in_this_millennium | this ]] looks like it might work: ``` select ?ename ?h...
[23:54:42] legoktm: I think I'd like to add an option that disables recursive includes too but that can be a follow-up patch
[23:55:17] ok
[23:57:36] bd808: do custom config variables go in "extra" or "config"?
[23:58:19] It would go in extra as a sibling of our include key
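[Editor's note: a sketch of the composer.json layout being discussed, for the merge-plugin case. The `merge-plugin.include` key is the plugin's real configuration entry point; the custom key name below is hypothetical, shown only to illustrate "a sibling of our include key" inside `extra`:]

```json
{
    "require": {
        "wikimedia/composer-merge-plugin": "~1.0"
    },
    "extra": {
        "merge-plugin": {
            "include": [
                "extensions/*/composer.json"
            ]
        },
        "my-custom-config": "custom variables live in extra, next to merge-plugin, not in config"
    }
}
```

[Composer's top-level `config` section is reserved for Composer's own settings, which is why application-defined variables go in `extra` instead.]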