[00:43:35] bd808: Do you know if the request-id dashboard in Kibana should still work?
[00:44:09] It was a simplified version of the 'mediawiki' dashboard that filtered to entries where a field matched the value of a kibana url query parameter, for creating permalinks from on-wiki pages when X-Wikimedia-Debug logging is enabled.
[00:44:17] E.g. https://logstash.wikimedia.org/#/dashboard/elasticsearch/request-id?id=V@sRjwpAIDAAAA9TZtQAAAAF
[00:48:13] Krinkle: sadly, no. Kibana4 doesn't support templating like that.
[00:48:51] I have been meaning to update the plugins to do the same thing differently but haven't gotten around to it
[00:58:57] bd808: You know how the link is made for the Chrome WMDebug extension and where that code is maintained?
[03:58:20] Krinkle: https://github.com/wikimedia/ChromeWikimediaDebug/blob/master/content_script.js#L73
[03:59:50] what needs to happen is to send a complete kibana4 dashboard description. That's their horrible replacement for the old template functionality
[04:04:53] something like -- https://logstash.wikimedia.org/app/kibana#/dashboard/x-debug?_g=(refreshInterval:(display:Off,pause:!f,value:0),time:(from:now-1h,mode:quick,to:now))&_a=(filters:!((%27$state%27:(store:appState),meta:(alias:!n,disabled:!f,index:%27logstash-*%27,key:_type,negate:!f,value:mediawiki),query:(match:(_type:(query:mediawiki))))),options:(darkTheme:!f),panels:!((col:1,id:Events-Over-Time,panelIndex:14,row:1,size_x:12,size_y:2,type:visu
[04:04:54] alization),(col:1,columns:!(level,channel,host,wiki,message),id:MediaWiki-Events-List,panelIndex:15,row:3,size_x:12,size_y:11,sort:!(%27@timestamp%27,desc),type:search)),query:(query_string:(analyze_wildcard:!t,query:%27reqId:V@tAWQpAAEsAAZhU58MAAACN%27)),title:x-debug,uiState:())
[04:08:40] I wrote it up for the FF plugin: https://github.com/wikimedia/FirefoxWikimediaDebug/issues/15
[04:09:18] I'll see if I can get it patched tomorrow and then we can make the same fix for the Chrome plugin
[04:16:41] yay
[04:21:57] I think the last time I started to work on this I fell into a rabbit hole of thinking about merging the two plugins into a single codebase.
[04:22:13] I'm not going to do that again for a while :)
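The Kibana4 URL above is unwieldy, but the only part that changes per request is the reqId inside the query_string. A minimal Python sketch of how a debug plugin could assemble such a permalink from a request ID, assuming the x-debug dashboard, index pattern, and field names from bd808's pasted example (the real fix would live in the extensions' JavaScript; the panels section and exact percent-encoding are approximated here):

```python
from urllib.parse import quote

KIBANA_BASE = "https://logstash.wikimedia.org/app/kibana#/dashboard/x-debug"


def kibana4_debug_url(req_id):
    """Build a Kibana4 permalink that filters the x-debug dashboard on a reqId.

    Sketch only: dashboard name, index pattern, and field names come from the
    URL pasted in the log above; the panels section is omitted for brevity.
    """
    # Global state: refresh behaviour and a one-hour time window.
    g = ("(refreshInterval:(display:Off,pause:!f,value:0),"
         "time:(from:now-1h,mode:quick,to:now))")
    # App state: filter to mediawiki events and query on the request ID.
    # The state is rison-style; single quotes end up percent-encoded (%27),
    # as in the pasted URL.
    a = ("(filters:!(('$state':(store:appState),"
         "meta:(alias:!n,disabled:!f,index:'logstash-*',key:_type,"
         "negate:!f,value:mediawiki),query:(match:(_type:(query:mediawiki))))),"
         "query:(query_string:(analyze_wildcard:!t,"
         "query:'reqId:{}')),title:x-debug,uiState:())").format(req_id)
    rison_safe = "(),:!*@$"  # keep rison punctuation literal, encode the rest
    return "{}?_g={}&_a={}".format(
        KIBANA_BASE, quote(g, safe=rison_safe), quote(a, safe=rison_safe))


print(kibana4_debug_url("V@tAWQpAAEsAAZhU58MAAACN"))
```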
[04:23:39] legoktm: https://github.com/wikimedia/composer-merge-plugin/pull/123 is lonely and wishing that it was cool enough to get your review. It keeps sending me sad text messages wondering what it can do to get your attention. Pathetic really.
[14:24:35] Is there some magic comment to run mediawiki-phpunit-php55-trusty without actually +2ing a change?
[14:52:45] anomie: I *think* "check php5" does that
[14:53:05] bd808: Yeah. I asked on #wikimedia-releng and someone said that too
[14:55:36] "comment: (?im)^Patch Set \d+:\n\n\s*check (php53?|zend)\.?\s*$" -- https://phabricator.wikimedia.org/diffusion/CICF/browse/master/zuul/layout.yaml;0c17e9f6c3a2b13d3bd5f6ff811bb6fdb9802f5c$541
[15:30:11] bd808: we will need to add php7 there too :)
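That Zuul trigger is just a regular expression over the Gerrit comment body, so it is easy to check locally which comments will fire the php5/zend jobs. A small Python sketch using the pattern quoted above (the sample comments are invented):

```python
import re

# Pattern copied from the zuul/layout.yaml line quoted above.
TRIGGER = re.compile(r"(?im)^Patch Set \d+:\n\n\s*check (php53?|zend)\.?\s*$")

samples = [
    "Patch Set 2:\n\ncheck php5",   # matches: fires the php5 test jobs
    "Patch Set 2:\n\ncheck zend.",  # matches
    "Patch Set 2:\n\ncheck php7",   # no match: php7 is not in the pattern yet
    "Patch Set 2:\n\nrecheck",      # no match: a different trigger entirely
]

for comment in samples:
    print(bool(TRIGGER.search(comment)), repr(comment))
```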
[15:50:05] AaronSchulz: seen in fatals -- Destructor threw an object exception: exception 'InvalidArgumentException' with message 'LoadBalancer::reuseConnection: connection not found, has the connection been freed already?' in /srv/mediawiki/php-1.28.0-wmf.20/includes/libs/rdbms/loadbalancer/LoadBalancer.php:617#012Stack trace:#012#0 /srv/mediawiki/php-1.28.0-wmf.20/includes/libs/rdbms/database/DBConnRef.php(591): LoadBalancer->reuseConnection()#012#1
[15:50:05] (): DBConnRef->__destruct()#012#2 {main}
[15:50:27] not high volume, just 14 showing
[17:49:05] So Firefox has apparently made it really hard to develop and debug extensions with their signing stuff.
[17:49:34] there is an "xpi sign" command, but it gets mad at me because my extension is already published in their directory.
[17:49:36] Yeah, HTTPSE weren't happy a little while ago
[17:49:55] * bd808 has to read a bunch of docs apparently
[17:50:34] "jpm run" spawns the browser with the extension loaded but won't run it because it's not signed :/
[17:50:47] I'm sure there is some hidden config to turn that off
[17:51:05] but FFS the tool chain should do that for you, right?
[17:53:05] * bd808 decides to eat before falling into this dark hole
[18:55:05] I was just pastebin'ing that bd808
[18:55:15] You're beating me at my own game :p
[18:56:31] I try to make everyone look equally bad so we are all on the same footing
[18:57:12] I only noticed because I fired up fatalmonitor on fluorine this morning
[18:57:36] But I've been too chicken to deploy https://gerrit.wikimedia.org/r/#/c/313207/2 yet
[19:00:44] back to my prior rant about FF: the answer is apparently that I have to install a special browser build that will allow me to run unsigned code -- https://wiki.mozilla.org/Add-ons/Extension_Signing
[19:01:05] goofy shit FF
[22:50:34] tgr: do you have a minute for me to bounce an idea off of you?
[22:50:42] sure
[22:51:09] I need to unblock using 2fa in Striker and have been going over a zillion ways to do that
[22:52:08] my latest idea is to add some methods to the action api to let me query if oath is enabled and validate a token
[22:52:43] it would need to be protected with a permission since I would need to check this without being the user
[22:52:49] as in, using 2fa with the Wikimedia secret?
[22:53:17] yes. the way that horizon does this right now is gross.
[22:53:28] it reads the mediawiki db tables directly
[22:53:37] yeah, that would require a new API endpoint + permission + OAuth grant
[22:53:58] I don't have oauth either for wikitech
[22:54:10] it would just be a bot account that striker uses
[22:54:36] but yeah new endpoint and permission
[22:54:40] why do you need 2FA for a bot account?
[22:55:55] I need it to authn an LDAP account in striker so that I can authz changes that would be protected on wikitech (e.g. ldap password, ssh keys)
[22:56:10] Right now I just have password authn via ldap
[22:57:04] but I can't add ssh key management there until I can check to see if the account should have 2fa and also validate the token if it is
[22:57:38] because we use ssh access as a means of requesting OATH disable for lost phones
[22:58:38] the bot account part is that Striker the app would not have an OAuth grant for the authenticating user (wikitech doesn't have oauth)
[22:58:51] makes sense
[22:59:10] I still don't see why OAuth is not better for this though
[22:59:40] or you feel just the fact that someone is logged in as the user on a Wikimedia wiki is not proof enough for account ownership?
[23:00:12] OAuth probably would be ok for this use case, yes
[23:00:40] otherwise you could just register Striker as an OAuth consumer for Wikimedia wikis so users can grant it access to the 2FA check
[23:00:51] that would take adding oauth to wikitech and figuring out how to work that into the auth flow for Striker
[23:01:57] I don't think Striker would even need to care about 2fa then actually. We could just make them renew the oauth grant, which would prove they are logged in on wikitech
[23:02:04] wikitech does not need oauth for that, as far as I can see
[23:02:26] (although adding OAuthAuthentication there would be nice)
[23:02:42] ... but withtech is where the oath token lives
[23:02:48] *wikitech
[23:03:00] oh, right
[23:03:20] I thought you want to verify Wikimedia account ownership for users who have 2FA there
[23:03:42] no, I want to edit ldap for users who may have 2fa turned on
[23:04:15] another idea in the past was to move the oath secrets to ldap
[23:04:16] I am not really familiar with the security of 2FA
[23:04:22] but that's kind of gross
[23:04:29] as in, is it OK to just expose it to anyone out there?
[23:04:58] that's a very different threat model than only exposing it once the user provided a valid password
[23:05:10] no, the api would need to be protected so that only trusted users could ask to validate a user's token
[23:05:22] right
[23:06:02] the idea thing would be to move the secrets and validation into an internal network service that required strong auth to access
[23:06:06] *ideal
[23:06:31] but I don't want to build that service and I don't want to wait for 6 more months to see if someone else does
[23:07:00] (google says it's brute-force resistant with sane throttling)
[23:07:05] the quick thing would be to do what horizon is doing and go straight to the db
[23:08:32] so Striker would have its own password but use the same 2FA secret as wikitech? or would it use the wikitech password as well?
[23:08:50] it already uses the wikitech password (ldap bind auth)
[23:09:47] (Services has plans for moving 2FA into an internal REST service btw, not sure that would happen within 6 months though)
[23:10:18] yeah. I know the rough plan, but I need to do my stuff in Q2 :)
[23:11:15] that seems like a reasonable plan
[23:11:31] I'd actually be willing to write the http half of the auth service, but I don't have time/energy to do it all, I don't think
[23:12:33] I've never written an api module, so I guess my next challenge is figuring that out and then getting code/security review on it
[23:12:34] one thing to consider is that we might want to move to U2F at some point and that would not be shareable
[23:13:33] well it goes with the ldap account. at some point wikitech won't be ldap auth any more and I bet that happens before u2f
[23:14:17] firefox still doesn't support u2f so I wouldn't worry too much about us requiring it to access wikitech/ldap
[23:15:37] I don't think it would ever make sense to require it, not everyone can obtain a token easily
[23:15:52] but even allowing it as an option would be problematic
[23:16:01] *nod*
[23:16:15] still, fair point that the move from LDAP would happen sooner
[23:18:38] in a u2f world we will need it to happen for SUL accounts against loginwiki. I don't see wikitech ever going with u2f precisely because it's protecting things that are moving elsewhere (horizon and striker)
[23:19:08] anyway thanks for listening
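For reference, a rough Python sketch of what the idea floated above could look like from Striker's side: a bot account with a dedicated right calling a permission-protected action API module on wikitech to ask whether a user has OATH enabled and whether a given TOTP code is valid. The module name, parameter names, and response shape are all hypothetical; only the overall flow (bot credentials plus a protected endpoint, instead of reading the MediaWiki tables directly as Horizon does) comes from the discussion.

```python
import requests

WIKITECH_API = "https://wikitech.wikimedia.org/w/api.php"


def check_oath(session, username, totp_code):
    """Query a hypothetical protected API module for a user's 2FA state.

    `session` is assumed to be a requests.Session already logged in as the
    Striker bot account that holds the right needed to query other users.
    The module name (oathvalidate) and its parameters are illustrative only.
    """
    resp = session.post(WIKITECH_API, data={
        "action": "oathvalidate",   # hypothetical module name
        "user": username,
        "totp": totp_code,          # hypothetical parameter name
        "format": "json",
    })
    resp.raise_for_status()
    # Imagined response shape: {"oathvalidate": {"enabled": true, "valid": false}}
    result = resp.json().get("oathvalidate", {})
    return result.get("enabled", False), result.get("valid", False)


# Example flow inside Striker: only prompt for and verify a TOTP code when
# the wikitech account actually has OATH enabled.
# enabled, valid = check_oath(bot_session, "SomeUser", "123456")
```

The permission requirement is the key design point: unlike a user validating their own token, Striker has to ask on behalf of someone else, which is why the module would need to be restricted to trusted accounts.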