[00:00:06] the_nobodies: I thought you were death mëtäl [00:00:56] ÿęß śīr [00:01:48] (writing those diacritics is sooo much easier on macs) [00:02:54] opt- [00:04:34] Wę šhóùłd hævë â "ūßė dïãçrītįćś" dåÿ [00:12:08] ok, got data importing from json dump into titan... now I wonder how long it would take to import 16M records... [00:13:33] o.o [00:14:33] Importing all of Wikidata to search fresh takes a day-ish. [00:14:38] (ballpark) [00:14:48] http://www.panix.com/~eli/unicode/convert.cgi?text=The+secret+is+out.+ [00:15:09] 𝖀𝖓𝖎𝖈𝖔𝖉𝖊 𝖎𝖘 𝖜𝖊𝖎𝖗𝖉 [00:17:35] ǝɯ ƃuıןןǝʇ ǝɹ,noʎ [00:18:28] 1000 records took ~80 secs [00:19:14] but that's without bulk settings... I'll probably need to look into those [00:19:39] right now it uses the dumbest way possible [00:37:42] ori: are you in the office? [01:01:05] 3MediaWiki-Core-Team: Investigate memcached-serious error spam that mostly affects some servers - https://phabricator.wikimedia.org/T75949 (10aaron) 3NEW p:3Triage [01:04:44] 3MediaWiki-Core-Team: kafkatee security review - https://phabricator.wikimedia.org/T75950#786875 (10csteipp) [01:10:59] Reedy: AaronSchulz: TimStarling: Do one of you perhaps know a more elegant way to get unix timestamp of a date in PHP without causing php to flap its ears around date.timezone not being set? https://gerrit.wikimedia.org/r/#/c/175907/ [01:11:42] I was looking to somehow pass 'utc' as context parameter to strtotime, but it seems it doesn't support that. Or rather, throws anyway.
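For scale, the ~80 seconds per 1000 records quoted above extrapolates like this; a naive linear estimate, assuming no bulk settings and a constant per-batch cost:

```python
# Back-of-envelope for the import discussed above: ~80 s per 1000 records,
# extrapolated naively to 16M records (bulk settings would change this).
RECORDS = 16_000_000
BATCH = 1000
SECONDS_PER_BATCH = 80  # observed for one 1000-record batch

total_seconds = RECORDS / BATCH * SECONDS_PER_BATCH
print(f"{total_seconds / 3600:.0f} hours (~{total_seconds / 86400:.1f} days)")
```

At roughly two weeks for the full data set, the bulk-import settings look well worth investigating.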
[01:11:47] emits warning * [01:12:23] The upstream stance is that date.timezone should be set in php.ini [01:12:28] 3MW-RFCs, MediaWiki-Core-Team: RFC: MediaWiki HTTPS policy - https://phabricator.wikimedia.org/T75953 (10csteipp) 3NEW p:3Normal a:3csteipp [01:12:32] which I sort of agree with [01:12:39] 3MW-RFCs, MediaWiki-Core-Team: RFC: MediaWiki HTTPS policy - https://phabricator.wikimedia.org/T75953#786925 (10csteipp) [01:12:47] date_create() [01:12:53] bd808: Do any standard distros do that? [01:13:52] Krinkle: I'm not sure. I typically don't run a distro provided php.ini or at least not one on its own [01:14:12] I mean, install apache2 and php5 on a node, it should just work. [01:14:28] I installed via homebrew, default php.ini doesn't have it set. [01:14:40] but in this case it's just for UTC, so it's kind of irrelevant [01:14:59] TimStarling: cool [01:15:13] I like how the hhvm fatal log is all one entry [01:15:22] someone rendering a massive gallery [01:16:54] bd808: I don't agree with it, I said so at length on php-internals [01:17:11] TimStarling: I've read some of that [01:17:13] I wish it would just default to UTC. [01:17:17] like 80000 files [01:17:28] via api.php...maybe a max-file limit is needed for galleries [01:17:48] the idea of a timezone set in php.ini makes sense for almost nobody; they should just drop the idea, not require it [01:19:10] @strtotime( '2014-11-09 00:00:00 UTC' ) [01:19:19] gmmktime( 0, 0, 0, 11, 9, 2014 ) [01:19:24] date_create( '2014-11-09', new DateTimeZone( 'UTC' ) )->getTimestamp() [01:19:44] second is shortest [01:19:49] Those are the three ways I found that will produce 1415491200 without warnings [01:20:01] the third is probably the slowest as well as being ugly [01:20:03] (short of methods not related to dates) [01:22:03] I think the first could be quite slow too [01:22:26] This seems like such a simple thing.
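All three PHP expressions under discussion are expected to yield 1415491200 for 2014-11-09 00:00:00 UTC; a quick cross-check of that constant in Python (not the PHP under review, just a sanity check of the value):

```python
from datetime import datetime, timezone

# 2014-11-09 00:00:00 UTC, the date from the gerrit change under discussion.
# A timezone-aware datetime makes .timestamp() independent of the local zone,
# which is exactly the property the PHP discussion is after.
ts = int(datetime(2014, 11, 9, tzinfo=timezone.utc).timestamp())
print(ts)  # 1415491200, matching all three PHP expressions in the log
```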
[01:23:27] It seems we've only used gmmktime once before [01:23:30] obviously the solution is to not use UNIX timestamps in your configuration [01:23:32] wfTimestampTest.php [01:23:42] Yeah.. true. [01:23:45] $wgNoticeOldCookieApocalypse [01:23:51] What's that even for anyway [01:24:34] $wgCacheEpoch, $wgThumbnailEpoch, etc. use a string for a timestamp [01:24:37] TimStarling: Right. Parse it later. [01:25:01] yeah, which is also good if you ever want to set configuration in, say, JSON [01:25:56] TimStarling: And then parse with wfTimestamp ? [01:26:11] that returning a string is kinda annoying [01:26:39] yes [01:28:40] (int)wfTimestamp( TS_UNIX, '20141109000000' ) [01:30:03] TimStarling: Thx, https://gerrit.wikimedia.org/r/#/c/175907/ [01:30:51] uh-oh Fatal error: Class 'ProfilerSimpleText' not found in /vagrant/mediawiki/includes/profiler/Profiler.php on line 80 [01:33:21] the_nobodies: prod or locally? [01:33:30] vagrant [01:33:41] fresh mw and vagrant [01:33:48] 3MediaWiki-Core-Team: Refactor Title to make permission checking its own class - https://phabricator.wikimedia.org/T75958 (10csteipp) 3NEW p:3Triage [01:33:56] What's StartProfiler have in it? [01:34:13] nod. There is a new config format for that that should be documented in StartProfiler.sample [01:35:18] the_nobodies: You need something like: $wgProfiler['class'] = 'ProfilerStandard'; $wgProfiler['output'] = array( 'text' ); [01:35:23] blergh [01:35:29] just nuked it [01:35:33] thanks! :) [01:35:51] $wgProfiler['class'] = 'ProfilerXhprof'; is cool stuff too [01:39:41] bd808: Hm.. just got bitten by out of date vendor.git (class lessc not found) [01:40:21] 3MediaWiki-Core-Team: Implement file usage limits in parsing - https://phabricator.wikimedia.org/T75960#787032 (10aaron) [01:40:23] Krinkle: Are you using mw-vagrant? [01:40:25] bd808: It gets boring really fast when reviewing an older branch, too. Potentially conflicting versions between upgrades.
[01:40:34] bd808: No, I refuse to for the time being. I'm stubborn. [01:40:51] oh ok. So we have tagged branches that don't match? [01:40:56] Or ? [01:41:49] bd808: I mean 1) when I do git pull, I always update vendor too, but when I git-review a patch, I may be getting a branch newer than my master, thus my vendor may be too old, or even too new. [01:42:02] yeah. :( that's a pain [01:42:14] So in general that bug of "verify vendor and show useful message" would be cool. As I've wrongly suspected the patch to be wrong a few times over the past week. [01:42:22] the only way to fix it would be to put the submodule in which is ... worse [01:42:48] Yes. We need to work on that. -- https://phabricator.wikimedia.org/T74777 [01:43:07] I guess for local dev, I should just use composer. [01:43:14] That will down/upgrade accordingly each of the dependencies [01:43:29] bd808: does vagrant use composer straight up for mwcore, or vendor.git? [01:43:38] composer [01:43:42] coo [01:43:43] l [01:43:52] but it doesn't track the composer.json yet [01:44:04] sure, but one can just do that manually [01:44:10] *nod* [01:44:22] trying to rollback vendor/ to the right version is almost impossible [01:44:23] I'm using this right now -- https://phabricator.wikimedia.org/P107 [01:44:29] when reviewing random patches [01:44:35] simply re-running composer-update on the other hand is easy [01:45:28] My trigger works for `git pull`. I think we'd need a different trigger for `git review` [01:46:09] bd808: maybe post-checkout [01:46:30] It also makes me keenly aware of the pain of installing local composer changes. I lose my monolog every time composer.json is touched and have to put it back. [01:46:56] yeah we need to play with the trigger idea [01:46:59] bd808: so regarding extensions using composer [01:47:04] bd808: for wmf we stuff them into vendor [01:47:20] bd808: but what about local dev and third parties. Do we have a way of linking up a vendor/ dir in an extension dir?
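The `post-checkout` trigger idea floated above could look roughly like this; a sketch in Python rather than a shell hook, where `needs_composer_sync` and `changed_between` are hypothetical helper names of mine, not existing tools:

```python
import subprocess

# Files whose change between two checkouts should trigger a vendor/ re-sync.
MANIFESTS = {"composer.json", "composer.lock"}

def needs_composer_sync(changed_files):
    """True if any composer manifest is among the changed paths."""
    return any(path in MANIFESTS for path in changed_files)

def changed_between(old_ref, new_ref):
    """List the paths that differ between two git refs."""
    result = subprocess.run(
        ["git", "diff", "--name-only", old_ref, new_ref],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.splitlines()

# A .git/hooks/post-checkout script receives the previous HEAD, the new HEAD,
# and a branch flag as arguments; the hook body would be roughly:
#   if needs_composer_sync(changed_between(sys.argv[1], sys.argv[2])):
#       subprocess.run(["composer", "install"], check=True)
```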
[01:47:33] assuming that the expected way to do it is to run composer-install in the extension dir [01:47:41] It's up to the extension to support that but it's an easy code snippet [01:47:50] we'd need extensions to have something like i18nmessagedirs[] array [01:47:57] which we could feed autoloaders maybe [01:48:01] (if_exist) [01:48:02] not sure [01:48:29] Krinkle: https://github.com/wikimedia/mediawiki-extensions-BounceHandler/blob/master/BounceHandler.php#L87-L90 [01:48:44] bd808: well, ideally it'd be the other way around. the extension is imho considered broken if it only works when mediawiki-core has it in its vendor/. It should work on its own. [01:49:06] because when I install mediawiki core and run composer-install, BounceHandler won't work, right? [01:49:20] bd808: cool. So that's settled. [01:49:27] I think what we are pushing people towards here is using composer to install the extensions [01:50:03] Which may or may not be a solved problem at this point [01:50:09] (it's not) [01:50:38] 1.25 will have some bumps and bruises unless we work on these things more [01:51:02] But I think I can keep chipping away and hopefully enlist some community help [01:51:14] * bd808 has been meaning to write some emails on and off list [01:52:10] bd808: using composer to install extensions would fix the dependency issue for mediawiki development, not for extension development I think. You'd need to be able to work on the local git repo of said extension, change composer.json and it be possible to update vendor normally at that point [01:52:28] uhhh, I don't think we should be going in that direction.
[01:53:04] Well you could install the extension as a live git clone via composer [01:53:35] but they would probably have to do some git origin magic while we are still using gerrit for review [01:53:36] bd808: you'd have to commit your work, push it to the branch composer.json points at [01:54:03] (not a solved problem) [01:54:38] We should have a "how does this crap work" session at the dev conference in January [01:58:29] bd808: We should also document how to bring composer into an extension [01:58:35] e.g. that file_exists/require_once is the desired pattern [01:58:43] We wanna avoid people inventing their own stuff everywhere [01:59:08] yeah. File a phab task for it and tag it to librarization [01:59:24] we need to write a crap ton of new docs [01:59:37] I think that's my month of December basically [04:07:09] 3MediaWiki-Core-Team: Isolate throughput issue on HHVM API action=parse - https://phabricator.wikimedia.org/T758#787234 (10tstarling) [06:49:25] csteipp: I've confirmed that the Authorization header is lost [06:49:45] in fact there is no $_SERVER['HTTP_AUTHORIZATION'] in either HHVM or Zend [06:50:05] but it does appear in apache_request_headers() in zend [06:50:15] but HHVM is FastCGI so only has the server environment [06:50:29] it does implement apache_request_headers() but it necessarily has the same information as $_SERVER [06:50:39] apache is probably stripping that header [06:50:56] <_joe_> TimStarling: why should it? [06:50:58] Oh, so WebRequest::initHeaders [06:51:04] <_joe_> I mean it's probably doing that [06:51:35] <_joe_> TimStarling: so it's the 'Authorization' header?
[06:51:45] yes [06:52:14] <_joe_> http://stackoverflow.com/questions/17018586/apache-2-4-php-fpm-and-authorization-headers [06:52:32] also I just confirmed that raw post data is working on both servers [06:52:41] <_joe_> ok so instead of reverting [06:52:45] <_joe_> I'll fix apache [06:52:55] <_joe_> it will take me some time, but it's worth the effort [06:53:53] <_joe_> oh it seems it's a mod_proxy_fastcgi bug [06:54:04] <_joe_> TimStarling: where are you reproducing this on HHVM? [06:54:17] <_joe_> I'll patch apache there to check if it works [06:54:22] curl -d datadatadata -H'Authorization: blah' -x mw1243:80 'http://en.wikipedia.org/w/oauth-headers.php' [06:54:29] <_joe_> ok thanks [06:54:45] oauth-headers.php is the uncommitted test script I just deployed [06:54:52] <_joe_> nod [06:57:04] <_joe_> ok found a fix [06:57:15] <_joe_> confirm it works on mw1018 [06:59:30] yes, it works, are you deploying it everywhere? [06:59:38] <_joe_> 5 mins [07:00:27] my dinner is ready [07:00:34] <_joe_> go have dinner :) [07:00:47] ok, ttyl [07:00:51] <_joe_> when you're done, I should be ok [07:22:32] <_joe_> in ~ 15 minutes the change will be applied everywhere, I am taking a shower in the meanwhile, should be back by then [07:43:20] <_joe_> TimStarling: now the fix is applied everywhere [07:43:49] thanks, well done [07:44:05] <_joe_> well I didn't verify oauth is working :P [07:44:28] <_joe_> I mean we do pass that header now, but I'm not sure that's enough [07:44:29] I see there are no get_consumer() log entries since 07:09 [07:44:48] <_joe_> mh which is well before the change was live [07:45:03] what percentage of requests go to HHVM?
[07:45:22] <_joe_> about 44% [07:45:30] <_joe_> sorry 34% at the moment [07:45:48] <_joe_> btw oauth worked for me twice [07:46:35] worked for me 5 times in a row [07:46:41] and I was able to reproduce it before [07:47:46] chance of 7 requests all going to zend would be 5% [07:48:00] <_joe_> so well, "apache sucks as a fcgi proxy" doesn't really describe it well enough [07:48:26] <_joe_> I was able to see flaws in the logic of their code, with my knowledge of C [07:48:45] <_joe_> I gotta move us to nginx sooner than later [07:56:15] hmm, yay nginx [07:56:36] I wonder if we can move toollabs runners from lighty to nginx, but probably too entrenched now. [09:24:20] 3MediaWiki-Core-Team: Create core-team project for MW Core Team - https://phabricator.wikimedia.org/T649#787624 (10Qgil) [15:23:57] 3MediaWiki-Core-Team: Change old wmf deployment branches to tags - https://phabricator.wikimedia.org/T1288#788204 (10Krinkle) [15:27:17] * anomie boggles at https://www.mediawiki.org/wiki/Phabricator/Diffusion/Callsign_naming_conventions/Existing_repositories [15:34:11] we have too many repos :( [15:35:34] I proposed on the wikitech-l thread that we just treat callsigns as random base-26 integers. But even the repo names they're proposing there are awful, IMO. [15:39:38] 3MW-RFCs, MediaWiki-Core-Team: RFC: MediaWiki HTTPS policy - https://phabricator.wikimedia.org/T75953#788244 (10Qgil) This RfC is being discussed today. Details: https://lists.wikimedia.org/pipermail/wikitech-l/2014-November/079629.html [15:42:29] anomie: I like the callsigns as random crud. [15:42:40] I disagree with namespacing everything still. It's ugly and redundant. [15:43:07] plus, repo names are just names. they don't /matter/ and can change at any time in phabricator land. [15:43:14] they don't affect cloning paths or urls. [15:46:10] wingswednesday: Without looking, is "Blackout" a skin, an extension, or something else? How about "Normal"? "Synagonism"? "Release"? 
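The 5% figure above checks out: with HHVM taking ~34% of traffic at the time, the chance of seven independent requests (the two that worked for _joe_ plus Tim's five in a row) all landing on Zend is 0.66^7:

```python
# Probability that all 7 successful OAuth test requests happened to hit Zend,
# assuming requests are routed independently with HHVM at ~34% of traffic.
hhvm_share = 0.34
zend_share = 1 - hhvm_share
p_all_zend = zend_share ** 7
print(f"{p_all_zend:.3f}")  # ~0.055, i.e. the ~5% quoted in the log
```

In other words, seven straight successes were unlikely to be routing luck, which is decent evidence the apache fix actually worked.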
[15:47:23] People don't find skins or extensions by looking at repo names anyway, you find them on mediawiki.org. [15:47:37] None of those repos matter to me, so I don't care what they're named. [15:49:08] I'm fine with coming up with a naming convention that makes sense, but I refuse to replicate the pseudo-hierarchy all over again. That was a pain in the ass to manage for minimal benefit beyond developer ocd desire to organize. [15:49:25] wingswednesday: I was looking at Diffusion's API, and unless we namespace, I don't see a way for ExtensionDistributor to be able to get a list of extensions. [15:50:04] Ooooh. [15:50:06] I don't care if the naming convention looks like "mediawiki/extensions/$FOO" or something entirely different, as long as it's consistent. [15:50:09] What if we do MW-style naming. [15:50:13] Extension:Foo [15:50:15] Skin:Bar [15:50:24] someone proposed E-Foo and S-Foo on the talk page [15:50:34] Extension:Foo and Skin:Bar gets +1 from me [15:50:37] (not for callsigns, for names) [15:50:52] legoktm: Somehow I thought we could use - in callsigns. [15:50:58] oh, names can be anything? [15:50:59] Which derailed most of those ideas. [15:51:07] Names could be "Foosaiojsiaocn ()21u40rjsLNON !!!" [15:51:29] Names are just descriptions really. Not used for linking at all. [15:51:44] And searching, I suppose. [15:52:01] so uh, 🌻 [15:52:09] Repo name could be that, yeah [15:52:14] I think giving extensions a root name is a mistake [15:52:29] for Git repository path that is [15:52:34] e.g. Cite.git [15:52:38] Git repo path doesn't matter here. [15:52:51] (those will be set later when we're talking about CR and hosting them on phabricator) [15:52:55] Well, James added that to the table as well [15:53:56] So what we need is a name (anything) and callsign ([A-Z]+) for each repo. [15:54:15] He started with just batch converting gerrit repo names to phab repo names.
[15:54:20] probably keep mediawiki in there to avoid overly generic git repos (and because it depends/extends a platform, which typically is named that way, e.g. jquery-foo, node-foo, php-foo, grunt-foo). mediawiki-$name or mwext-$name seems like a good option. [15:54:27] But those I think are kind of ugly and I think we like the MW-style namespacing for that. [15:54:48] And then simply match callsign to repo name in uppercase? [15:54:52] (as default) [15:55:20] I like anomie's idea of just randomizing the callsigns. [15:55:23] Keeps them shorter. [15:55:32] (well, not random, but you know) [15:55:43] Krinkle: If you do that, you end up with silly and ridiculously-long callsigns like "EXTENSIONUNIVERSALLANGUAGESELECTOR" [15:56:10] 3MediaWiki-Core-Team: Change old wmf deployment branches to tags - https://phabricator.wikimedia.org/T1288#788263 (10Krinkle) I created tags for `wmf/1.23wmf*` and `wmf/1.24wmf*` and deleted the branches. Script: ``` #!/usr/bin/env bash remote='origin' version='1.24' git remote update $remote git remote prune... [15:56:25] anomie: No, you'd end up with MEDIAWIKI-CITE or MWEXT-CITE [15:56:34] Krinkle: Dashes aren't allowed in callsigns. [15:56:43] MWEXTCITE isn't awful. [15:56:51] I had originally proposed OPS for all ops-related repos [15:57:05] (OPSPUPPET, etc) [15:57:30] Phabricator's URL routing is outright stupid. https://phabricator.wikimedia.org/rMW164ac414fadd endless concatenation. Everything is global [15:57:47] Ugh, it's gonna be ugly no matter what we pick [15:57:56] Yes. [15:58:13] I'm also fine with acronyms on a first-come-first-served basis. [15:58:16] Keeps them short.
[15:59:00] 3MediaWiki-Core-Team: Convert old 1.23wmf* and 1.24wmf* deployment branches to tags - https://phabricator.wikimedia.org/T1288#788277 (10Krinkle) 5Open>3Resolved [15:59:42] (also, we'd already done a couple as examples, changing those becomes increasingly difficult by the day) [16:00:50] wingswednesday: OTOH, the advantage to random is that we don't have to worry about fighting over acronyms (e.g. does AdManager, AddMessages, or ActivityMonitor get "AM" as an acronym?) or coming up with acronyms for everything. And it's even shorter. [16:01:57] roll a dice :p [16:02:29] Even easier to just make it random. Maybe stick an "E" on the front for "extension". [16:03:19] E[A-Z]+ for extensions, S[A-Z]+ for skins. [16:03:48] Actually, I like that. Single letter identifier for the "group" it's in, then a random string nobody gets to fight over. [16:03:57] \o/ [16:04:29] Only exception is MW & VE because I'm not importing those again and it gets top-level in Phabricator :) [16:05:07] Yeah, let VE, MW, and CIRRUS be warts if need be. [16:05:19] We haven't linked CIRRUS, I'd be willing to redo it. [16:05:30] Phabricator ones too, probably. [16:05:37] * anomie saw the checkmark and thought that was "done" too [16:05:47] Krinkle: What do you think? [16:06:35] wingswednesday: also VE is not a mediawiki extension [16:06:48] I know :) [16:07:42] Well, there is mediawiki/extensions/VisualEditor... [16:07:49] I think E and S are too short and non-standard to not read naturally (not always a natural break between the "words"), but I don't care and it seems phabby [16:08:00] anomie: yeah, but that's not the repo VE is linked to in Phab [16:08:03] anomie: Yes, but that's not what the "VE" repo is. [16:08:08] it's linked to the actual software, not the MW bridge [16:08:30] Morning James_F :) [16:08:37] anomie: VE-MW is a MediaWiki extension. VE is entirely stand alone, and used by people without MediaWiki installed. [16:08:41] Morning. :-) [16:08:44] True.
But if we're doing first-letter-as-group then it'd be something like "XZZZ" (or whatever letter we chose for "miscellaneous") [16:09:23] anomie: I don't think we have to do stricter sorting for call signs than for repo names [16:09:29] One should be a subset of the other [16:09:44] e.g. if something has oojs.git or VisualEditor.git, it can have an unprefixed call sign, no? [16:11:50] <^d> I filed a task to have chase drop all repos other than VE and MW. [16:12:06] <^d> so those won't potentially get in the way of whatever we decide here [16:12:47] Krinkle: Well, if half the letters are used for prefixes then that limits the range of "unprefixed" call signs. May as well just pick a prefix for "miscellaneous". [16:13:32] <^d> James_F: For the name part (not callsign), we had an idea earlier that could work. Rather than flat-namespacing everything, we could do MW-style namespacing. [16:13:40] <^d> So like Extension:CirrusSearch, Skin:Vector, etc. [16:13:54] ^d: But without the colons? [16:14:09] <^d> Extension CirrusSearch? [16:14:12] Do we need to prefix misc repos? [16:14:13] ^d: So "MediaWiki", "ExtensionCirrusSearch", "SkinVector". [16:14:18] the cdb repo should be CDB imo [16:14:22] oojs can be OOJS. [16:14:25] ^d: Oh, sorry, for name. Yeah, that works. [16:14:28] ^d: Do the names have to "sort"? It'd be more natural imho to have "Vector skin", "Cirrus extension", "VisualEditor", "MediaWiki core" [16:14:36] <_joe_> so, who is the maximum expert on the OAuth extension? [16:14:43] * anomie would like it if http://www.mediawiki.org/wiki/$NAME would correspond to the repo named $NAME [16:14:54] or Vector-Skin, Cirrus-Extension, VisualEditor, MediaWiki-Core if they have to be project-y [16:15:07] they're not used in urls, right? [16:15:12] <^d> No, they're not. [16:15:17] I'd prefer prefixes over suffixes [16:15:18] _joe_: csteipp. Then probably either me or Aaron. 
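The "single group letter plus a random string nobody gets to fight over" scheme could be sketched as follows; the four-letter suffix length and the retry loop are arbitrary choices of mine, the only hard constraints from the discussion being the group prefix and that callsigns match [A-Z]+ with no dashes:

```python
import random
import string

# Group prefixes from the discussion: "E" for extensions, "S" for skins.
GROUPS = {"E", "S"}

def random_callsign(group, taken, length=4):
    """Generate a callsign like 'EQZXK': group letter + random A-Z suffix,
    retrying until it doesn't collide with an already-assigned callsign."""
    assert group in GROUPS
    while True:
        sign = group + "".join(random.choices(string.ascii_uppercase, k=length))
        if sign not in taken:
            return sign
```

Collisions are handled by regeneration rather than negotiation, which is the point: nothing about the name is worth arguing over.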
[16:15:23] I like Extension:Foo / Skin:Bar [16:15:49] anomie: We can restructure MW.org a lot more easily than we can restructure Diffusion, it feels like. ;-) [16:15:51] I'd like it if json_decode($NAME) doesn't throw an exception [16:15:58] <_joe_> I wanted to advise everybody against using apache_request_headers() whenever possible [16:16:00] <^d> Analytics:Baz, Operations:Stufsssss [16:16:04] I think it matching to a url is arbitrary and not useful for anything. Why would that have to work? [16:16:14] I'd like for a decision – whatever decision – to be reached. [16:16:23] I mostly want the shed painted. [16:16:46] <^d> Krinkle: I think it's more useful for mental connections between extension/skin doc pages and their repos. [16:16:57] <^d> Not actually that we could copy the links and reuse them or something. [16:17:31] _joe_: The only use of apache_request_headers I see in the OAuth extension is an imported 3rd-party library [16:17:59] <_joe_> oh ok :) [16:19:41] James_F: I'm told the repo names can be changed arbitrarily without any difficulty, so restructuring either doesn't sound like much of an issue. [16:19:48] ^d: But using a colon seems arbitrary, and lack of space after the colon doesn't help readability [16:20:06] it's a name. It should be readable and natural. Not some kind of mediawiki-ism that we all highly regret. [16:20:06] anomie: Oh, sure, if we're just talking about display names. [16:20:10] <^d> Repo names can change every friday if we want. [16:20:20] <^d> Krinkle: "Skin: Foo" is ok by me too :) [16:20:20] Randomly! [16:20:22] That'd be great. [16:20:31] "Skin Foo"? [16:20:48] There's plenty of colons and labels already in various places. Adding them in the name doesn't seem useful here. [16:21:12] * anomie would prefer "Skin: Foo" over "Skin Foo". [16:21:17] <^d> +1 to anomie [16:21:29] * anomie would prefer "Skin:Foo" even more, but he's used to MediaWiki titles. [16:21:44] This is for display labels in the interface [16:21:49] eg.
Project: <> [16:21:59] <^d> Yes, which is why I'm ok with the space :) [16:22:03] it'd say Project: Skin:Foo [16:22:05] it'd say Project: Skin: Foo [16:22:44] <^d> Project: [Skin: Foo] actually. It'd be a pretty labeled box :) [16:23:01] I usually call extensions "Extension:Foo" whenever just "Foo" might be ambiguous. [16:23:43] ^d: there are no pretty boxes in the JSON output :P [16:24:01] <^d> That seems like an obvious deficiency in json then. [16:28:14] <^d> By the way, here's what Cirrus would look like with Extension: CirrusSearch. [16:28:15] <^d> https://phabricator.wikimedia.org/diffusion/CIRRUS/ [16:28:39] ^d: Thanks! Examples help. [16:28:51] That looks sane to me. [16:30:02] <^d> anomie, legoktm, Krinkle? [16:30:05] Should we just make the change and see what people say, or propose it on the talk page and see what people say, or what? [16:30:19] +2 [16:30:35] <^d> If we all like it here, I'd jfdi :) [16:30:47] ^d: Fine with me [16:32:07] All bike sheds should be blue! [16:32:19] <^d> What about this one bd808? [16:32:32] <^d> (you spoke up, your mistake!!!) [16:32:32] Unless they also hold lawn mowers, then green [16:33:01] space or no space makes no difference to me. I do like the skin/extension indicator [16:33:42] I flipped a coin and it said the space is fine [16:33:50] <^d> can't argue there. [16:34:11] <^d> Ok, so repo /names/ will be namespaced, MW-style but with spaces. [16:34:25] 3Multimedia, MediaWiki-Core-Team: SHA-1 file name support - https://phabricator.wikimedia.org/T1210#788467 (10Gilles) [16:34:39] 3Multimedia, MediaWiki-Core-Team: SHA-1 file name support - https://phabricator.wikimedia.org/T1210#20954 (10Gilles) [16:34:46] 3Multimedia, MediaWiki-Core-Team: SHA-1 file name support - https://phabricator.wikimedia.org/T1210#788473 (10Gilles) p:5Triage>3Normal [16:35:00] We should fix MW whilst we're at it to chomp leading whitespace after namespace separators. [16:35:10] So [[Foo: Bar]] === [[Foo:Bar]].
;-) [16:35:28] <^d> {{scopecreep}} [16:35:33] What could possibly go wrong with adding more parser magic :/ [16:35:38] {{joke}} [16:35:59] bd808: Ask the Parsoid team how fun it's been tracking a moving target. :-) [16:36:27] Like writing a wiki with all human knowledge cataloged? [16:36:31] <^d> So, callsigns. [16:36:34] bd808: so, I'm trying to figure out how to install composer in a puppetized manner, I discovered that CI has a repo they clone (https://github.com/wikimedia/operations-puppet/blob/20aac946aa8e6854258d620c4d7dd1960eb725d8/modules/contint/manifests/slave-scripts.pp#L34), but I can't figure out how to install it...using php bin/composer says I need to do their curl url | php thing to install the dependencies... [16:36:47] James_F: It already does for recognized namespaces. [16:37:06] (but not for pseudo-namespaces, obviously) [16:37:57] legoktm: "how to install composer in a puppetized manner" like https://github.com/wikimedia/mediawiki-vagrant/blob/master/puppet/modules/php/manifests/composer.pp ? [16:38:00] anomie: It does? Hmm. Nice. [16:38:01] legoktm: I tried to install composer locally, but "apt-get install php-composer" didn't work so I gave up on it. [16:38:30] In that case, the repo names could actually be automatic links to MW.org. [16:38:36] legoktm: Or use it like -- https://github.com/wikimedia/mediawiki-vagrant/blob/master/puppet/modules/php/manifests/composer/install.pp [16:39:05] bd808: that feels...bad :/ [16:39:12] <^d> `brew install composer` [16:39:15] ^d: Could you add https://phabricator.wikimedia.org/diffusion/VE/ to the #VisualEditor project when you get a moment? [16:39:16] What feels bad about it? [16:39:17] <^d> {{done}} [16:39:40] Thanks. 
:-) [16:39:40] ^d: -bash: brew: command not found [16:39:44] composer is one phar file and it expects to be updated every 30 days or it gets sad [16:40:13] <^d> anomie: `ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"` [16:40:19] composer is a crazy hack basically and the upstream refuses to treat it as anything else [16:40:35] There is no such thing as a "stable composer release" [16:40:41] ^d: Eeew. [16:41:15] anomie: Yeah... that's the official way to install a rational package manager for OS X [16:41:19] well I don't like the idea of being dependent on a third-party website. But then again the whole thing requires packagist.org so...meh [16:41:57] legoktm: Like being dependent on the apt repos? Or pip, or gems or ... [16:42:51] legoktm: Don't start acting like a grey beard opsen when you jam to Taylor Swift ;) [16:45:36] bd808: We could easily enough mirror the whole apt repo, or just the packages we actually use. As for pip and gems (and npm), I don't care for them much either. [16:46:32] I've been gpg signing release tags in my composer project lately but that's about all I can do [16:46:59] bd808: https://imgur.com/a/1PDRJ [16:47:41] heh. I've seen some of those @SwiftOnSecurity quips retweeted by YuviPanda [16:48:50] https://i.imgur.com/efRNHXP.jpg has actually worked on a few of my friends. [16:49:12] Ha. [16:50:22] bd808: so, rather than just copying the composer module from vagrant, can we get it in the ops puppet repo (I guess as a submodule) so I can use it in the extdist module? [16:50:58] legoktm: .... we could try I guess but I think that Faidon would -2 it [16:51:26] Where is extdist deployed? [16:51:29] in labs? [16:51:34] yeah, just labs [16:51:50] Did YuviPanda ever get puppetception to work?
[16:52:03] That was his attempt at chaining puppet repos [16:52:30] Also submodules in puppet are teh suk [16:53:16] * bd808 thinks of many scope creeping projects related to puppet [16:54:09] legoktm: I would suggest copy pasta into a self-hosted puppetmaster or make your tool into a mw-vagrant module and deploy it using labs-vagrant [16:57:13] would I have to manually keep the self-hosted puppetmaster in sync with everything else? And I'd rather not tie this with mw-vagrant if we don't have to... [16:57:28] I'll ask hashar how he got composer to run on the jenkins slaves [16:58:14] There is a var you can set that will keep your self-hosted labs puppet master in sync with ops/puppet.git now. YuviPanda generalized a role I had made for beta. [16:58:38] So you turn on the flag and it will rebase your repo on ops/puppet once an hour [16:59:31] And if you are worried about backing up your local changes you can submit them to gerrit, cherry-pick locally and then abandon the gerrit change [16:59:45] then they are always in gerrit to be found again if needed [17:10:44] 3MediaWiki-JobRunner, Beta-Cluster, MediaWiki-Core-Team: beta cluster job runner keep running some periodic tasks - https://phabricator.wikimedia.org/T65681#788656 (10greg) [17:10:47] 3MediaWiki-Core-Team: Convert old 1.23wmf* and 1.24wmf* deployment branches to tags - https://phabricator.wikimedia.org/T1288#788657 (10Legoktm) Can this also be done for extensions as well? [17:15:37] 3MediaWiki-JobRunner, Beta-Cluster, Scrum-of-Scrums, MediaWiki-Core-Team: beta cluster job runner keep running some periodic tasks - https://phabricator.wikimedia.org/T65681#788663 (10greg) [17:55:52] can someone add me as an owner of the wikimedia github org?
I need to add some webhooks for the github-->gerrit thing [17:58:51] legoktm: I just invited you [17:59:01] thanks [18:00:19] 3MediaWiki-Core-Team: Convert old 1.23wmf* and 1.24wmf* deployment branches to tags - https://phabricator.wikimedia.org/T1288#788742 (10Reedy) I don't see any reason we couldn't. It would be running the same script essentially for all of our submodules [18:09:53] 3MediaWiki-API, MediaWiki-Core-Team: API list=tags continuation breaks with defined tags - https://phabricator.wikimedia.org/T76051#788775 (10Anomie) [18:11:05] 3MediaWiki-API, MediaWiki-Core-Team: API: list=tags should indicate whether a tag is defined - https://phabricator.wikimedia.org/T76052 (10Anomie) 3NEW p:3Triage a:3Anomie [18:25:36] Grr. How do you get Phabricator to attach a file? When I click the stupid little "upload" icon it just gives me a useless alert box saying to drag-and-drop. [18:27:52] anomie: There is form hidden somewhere... [18:30:04] anomie: heh. one horrible way would be with the conduit api -- https://secure.phabricator.com/conduit/method/file.upload/ [18:30:43] <^d> manybubbles, anomie: I have a WIP for offsets on prefix searching. I think it's about half-working :) [18:30:52] <^d> Some tests are failing, can't get it to work in Cirrus yet. [18:44:53] good day everybody! have another question [18:45:23] bd808: I wound up just copy-pasting the patch into the security bug :( [18:45:42] so wikidatawiki has about 1300 property things (namespace=120) [18:45:51] but the dump seems to have only about 300 of them [18:46:02] what is the reason for that? [18:47:00] (unless there's a bug in my code finding them in the dump... - so I'm trying to figure out if there is) [18:47:29] hoo: ^ [18:48:12] SMalyshev: What dump? [18:48:22] wikidata dump [18:48:25] the json or the xml one? [18:48:29] json [18:48:46] the last one (from Monday?) [18:48:49] ? 
[18:48:51] sorry, was omitting some context :) [18:49:05] 20141124.json.gz [18:49:25] ok, let me have a look [18:50:30] that will take a bit, as I don't have the dump locally [18:51:14] hoo: sure, no prob, thanks [18:53:28] the weird thing is that those 300 are within first 1000 objects in the dump and then there's no other P objects for a long while... so I'm not sure if there are others at the end and my tool just goes wrong somewhere or they aren't there at all [18:53:59] is there any particular order in the dump? [18:57:14] MediaWiki-Core-Team: [bug 64767] Optimistic save API/JS - https://phabricator.wikimedia.org/T1096#788900 (aaron) https://gerrit.wikimedia.org/r/#/c/174628/ [19:06:41] <^d> manybubbles: Got the prefix tests passing. [19:07:03] MediaWiki-API, MediaWiki-Core-Team: add an index field to show original order of queried titles - https://phabricator.wikimedia.org/T16859#788926 (Anomie) Open→Resolved It looks like this made it in time to hit WMF wikis with 1.25wmf10, see https://www.mediawiki.org/wiki/MediaWiki_1.25/Roadmap for the s... [19:08:22] ^d: wee! [19:36:16] legoktm: Are you interested in fixing T75985, or should I just do it? [19:37:05] if you could look into it, that would be appreciated.
I'm trying to figure out why global user merge is broken right now [19:42:37] ok [19:51:03] 3MediaWiki-API, MediaWiki-Core-Team: API allows suppression of redirects by users without the "suppressredirect" right - https://phabricator.wikimedia.org/T75985#789073 (10Anomie) a:3Anomie [19:51:10] 3MediaWiki-API, MediaWiki-Core-Team: API allows suppression of redirects by users without the "suppressredirect" right - https://phabricator.wikimedia.org/T75985#787688 (10Anomie) [19:52:59] SMalyshev: Looks good to me [19:53:00] hoo@tools-dev:~$ zgrep -cP '^\{"id":"P\d+",' 20141124.json.gz [19:53:00] 1318 [19:53:21] ok then, it must be some bug in my work then :) thanks for checking [20:02:39] <^demon|lunch> anomie: I'd missed your comments on PS1 before I wrote PS2. All amended now :) [20:02:43] <^demon|lunch> In PS3/4. [20:02:50] ^demon|lunch: I just +1ed [20:03:12] <^demon|lunch> \o/ thx [20:03:57] Catchable fatal error: Argument 1 passed to MediaWiki\Extensions\OAuth\MWOAuthHooks::onMergeAccountFromTo() must be an instance of MediaWiki\Extensions\OAuth\User, instance of User given in /srv/mediawiki/php-master/extensions/OAuth/backend/MWOAuth.hooks.php on line 9 [20:04:02] namespaces >.> [20:09:19] ^demon|lunch, manybubbles: php /srv/mediawiki/multiversion/MWScript.php runJobs.php --wiki='zerowiki' --type='cirrusSearchLinksUpdatePrioritized' --maxtime='60' --memory-limit='300M' --result=json [20:09:19] Fatal error: Call to a member function getRootText() on a non-object in /srv/mediawiki/php-master/extensions/ZeroPortal/includes/ZeroConfigView.php on line 178 [20:09:36] lol [20:09:40] fuck yeah, zero [20:09:46] <^demon|lunch> legoktm: Blame zero. [20:09:53] should I file a bug? [20:10:04] legoktm: yeah probably on us and them too? [20:10:10] <^demon|lunch> Against zero. [20:10:15] <^demon|lunch> I'm not sure how it's us. 
[20:10:27] ok [20:10:28] $configXcs = $wgOut->getTitle()->getRootText(); [20:10:33] ewwww [20:10:34] getTitle() can return null [20:10:45] because $wgTitle won't be set [20:11:00] With code like that, it's really not Cirrus [20:11:24] <^demon|lunch> We don't use $wgOut :p [20:13:05] https://phabricator.wikimedia.org/T76078?workflow=create [20:15:11] https://gerrit.wikimedia.org/r/176041 <-- easy merge [20:17:59] thanks Reedy [20:44:49] MediaWiki-General-or-Unknown, MediaWiki-Core-Team: mediawiki/core - https://phabricator.wikimedia.org/T76085 (hashar) NEW p:Triage [20:48:44] MediaWiki-General-or-Unknown, MediaWiki-Core-Team: Suspicious wmf tags on mediawiki/core.git repo - https://phabricator.wikimedia.org/T76085#789267 (hashar) [20:49:27] MediaWiki-General-or-Unknown, MediaWiki-Core-Team: Suspicious wmf tags on mediawiki/core.git repo - https://phabricator.wikimedia.org/T76085#789268 (Legoktm) [20:49:47] Scrum-of-Scrums, Core-Features, MediaWiki-Core-Team: Enable $wgContentHandlerUseDB everywhere (bug 49193) - https://phabricator.wikimedia.org/T1217#789272 (bd808) [20:51:01] MediaWiki-General-or-Unknown, MediaWiki-Core-Team: Suspicious wmf tags on mediawiki/core.git repo - https://phabricator.wikimedia.org/T76085#789275 (Chad) [21:01:58] MediaWiki-General-or-Unknown, MediaWiki-Core-Team: Suspicious wmf tags on mediawiki/core.git repo - https://phabricator.wikimedia.org/T76085#789296 (hashar) [21:05:52] MediaWiki-Core-Team: Convert old 1.23wmf* and 1.24wmf* deployment branches to tags - https://phabricator.wikimedia.org/T1288#22541 (hashar) That might break zuul-cloner which expects to checkout branches which is used for {T1350} among others. (link: https://www.mediawiki.org/wiki/RFC/Extensions_continuous_in...
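[Editor's note on the Wikidata dump thread above: hoo's zgrep at 19:53 confirmed 1318 P-entities in 20141124.json.gz, suggesting SMalyshev's tool stops scanning too early. A minimal sketch of the same count done by parsing each line rather than regex-matching is below. It assumes the dump layout that hoo's `^\{"id":"P\d+",` pattern also relies on: a JSON array with one entity object per line, each line possibly ending in a comma. The function name and file path are illustrative, not from the channel; since nothing in the channel confirms any ordering of entities, the scan reads the whole file.]

```python
import gzip
import json


def count_properties(path):
    """Count entities whose id starts with 'P' in a gzipped Wikidata
    JSON dump, streaming line by line to avoid loading 16M records."""
    count = 0
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            # Each entity sits on its own line, with a trailing comma
            # for all but the last; the array brackets get skipped.
            line = line.strip().rstrip(",")
            if not line.startswith("{"):
                continue
            entity = json.loads(line)
            if entity.get("id", "").startswith("P"):
                count += 1
    return count


# count_properties("20141124.json.gz")  # hoo's zgrep reported 1318
```

[The point of parsing instead of grepping is robustness: the count stays correct even if a serializer ever emits keys in a different order, which the anchored regex would silently miss.]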
[22:26:20] Scrum-of-Scrums, Wikimedia-General-or-Unknown, Core-Features, MediaWiki-Core-Team: Add ContentHandler columns to Wikimedia wikis, and set $wgContentHandlerUseDB = true - https://phabricator.wikimedia.org/T51193#789525 (Spage) [22:26:54] Scrum-of-Scrums, Wikimedia-General-or-Unknown, Core-Features, MediaWiki-Core-Team: Add ContentHandler columns to Wikimedia wikis, and set $wgContentHandlerUseDB = true - https://phabricator.wikimedia.org/T51193#789531 (Jdforrester-WMF) [22:34:09] happy thanksgiving whoever is still online! [22:34:36] manybubbles: happy tofukey day to too [22:34:42] tofurkey? [22:41:25] bd808: tofurkey isn't that good. mostly just stuffing and pie [22:41:41] :) pie is gooood [22:41:43] Stuffing in the US is strange. [22:42:05] It's still in beeta [22:42:12] * bd808 kids [22:42:23] "This stuffing is not ready for production." [22:42:26] Perhaps I will use that line tomorrow. [22:43:05] This is my first Thanksgiving in the US. Although I did have Thanksgiving dinner last year despite being in the UK, because it's one of the few American things my girlfriend likes to do. [22:43:34] Deskana: Are you talking about the sort of stuffing that people cook in the turkey vs in a pan? [22:43:57] The cooked-in-bird kind is gross in my opinion [22:44:10] I don't know what it was. I saw a box labelled "Stuffing" in Safeway and it looked like a bag of croutons. [22:44:11] But something like http://www.bbc.co.uk/food/recipes/sageonionstuffing_3580 is tasty [22:44:35] Ah yeah. we are too lazy to chop up stale bread [22:44:41] Yes! That is what I would call stuffing. [22:44:43] And it is delicious. [22:45:37] I guess US stuffing tends more to the "bread pudding made out of croutons with no egg" side