[20:59:10] https://phabricator.wikimedia.org/E171 coming up. TimStarling is chairing this week
[21:00:22] #startmeeting RFC meeting
[21:00:22] Meeting started Wed May 11 21:00:22 2016 UTC and is due to finish in 60 minutes. The chair is TimStarling. Information about MeetBot at http://wiki.debian.org/MeetBot.
[21:00:22] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[21:00:22] The meeting name has been set to 'rfc_meeting'
[21:00:47] #topic RFC: Overhaul Interwiki map, unify with Sites and WikiMap | Wikimedia meeting channel | Please note: Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE) | Logs: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-office/
[21:01:17] * DanielK_WMDE__ wibbles
[21:01:21] T113034
[21:01:22] T113034: RFC: Overhaul Interwiki map, unify with Sites and WikiMap - https://phabricator.wikimedia.org/T113034
[21:02:07] So, shall we start?
[21:02:15] * jzerebecki nibbles a bit
[21:02:37] jzerebecki: my word, just take a byte!
[21:02:37] yes
[21:02:42] so.... DanielK_WMDE, what are you hoping to accomplish in this meeting?
[21:03:04] I'm hoping to get more clarity on the next steps to take.
[21:03:25] so the last meeting was in October?
[21:03:26] We have a need for a more flexible, and easier to maintain, system of information about other wikis on the cluster, and elsewhere
[21:03:44] DanielK_WMDE: what's stopping you from writing a patch?
[21:03:55] The very first step to achieve this is now up for review: https://gerrit.wikimedia.org/r/#/c/250150/
[21:04:09] and non-wikis elsewhere
[21:04:26] which the interwiki map includes
[21:04:31] robla: so the next step, i think, would be to write an InterwikiLookup (and/or SiteLookup) based on nested arrays (aka JSON)
[21:04:36] For that, we should agree on a format.
[21:04:51] This is what I have in mind: https://phabricator.wikimedia.org/P3044
[21:04:53] s/json/php/
[21:05:28] One of the things to note is that any site can have several kinds of IDs (global, interwiki, etc), and several ID values for each kind (aliases, basically)
[21:06:01] Also, any site can be in a number of groups of various types. Sites can be grouped by language, or by family (wikipedia, wikibooks, etc), or by the database cluster they reside on
[21:06:12] * aude waves
[21:06:17] DanielK_WMDE: is https://gerrit.wikimedia.org/r/#/c/250150/ close to getting merged?
[21:06:30] bd808: yes, the idea is to read from .php or .json files. json is easier to maintain, php faster to read
[21:06:59] you mean json instead of CDB?
[21:07:04] robla: yes, i think so. addshore and legoktm are looking into it
[21:07:15] TimStarling: yes, exactly
[21:07:32] or, well, php instead of cdb, really, if we are talking about the fast option.
[21:07:44] in my mind, we would maintain the json on gerrit, and generate php for production
[21:07:56] why?
[21:08:11] Are we short of php editing experience?
[21:08:16] from last meeting there is an info item: "TimStarling wants a CDB backend which would use the files generated by dumpInterwiki.php"
[21:08:30] bd808: no, but if we have a generation step, we can pre-compute indexes.
[21:08:35] we'd need to maintain these by hand otherwise
[21:08:50] I guess you can have serialized files if you have an APC cache on top, is that the idea?
[21:08:54] didn't interwikis get migrated away from CDB?
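
For context, a rough sketch of the kind of per-site record being discussed, written as the equivalent PHP nested array. This is an illustration only: the field names (ids/groups/props) and all values are assumptions made for this example, not taken from the actual paste at https://phabricator.wikimedia.org/P3044.

<?php
// Hypothetical entry in the proposed interwiki/sites format (illustrative only).
return [
	'enwiki' => [                              // the canonical global ID doubles as the key
		'ids' => [
			'global'    => [ 'enwiki' ],          // further entries would be aliases
			'interwiki' => [ 'en', 'wikipedia' ], // interwiki prefixes for this site
		],
		'groups' => [
			'language' => 'en',
			'family'   => 'wikipedia',
			'cluster'  => 's1',
		],
		'props' => [
			'language' => 'en',
			'pagepath' => 'https://en.wikipedia.org/wiki/$1',
			'apipath'  => 'https://en.wikipedia.org/w/api.php',
		],
	],
	// '_by_id', '_by_group', ...: precomputed lookup indexes would follow here.
];
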
[21:09:11] TimStarling: the APC cache would be automatic for php files
[21:09:15] that's the idea
[21:09:31] Krenair: they now use a php array that still uses the structure of the cdb
[21:09:41] right
[21:09:49] TimStarling: the CDB backend you want - i think ClassicInterwikiLookup in my patch is just that. it's basically the old code.
[21:10:07] can you confirm that?
[21:10:53] hey TrevorParscal!
[21:10:54] #info question discussed: which backends should InterwikiLookup support?
[21:11:02] yes, fine
[21:11:42] robla: my plan: new structure in json and php. old structure via legacy code in php or cdb. old structure in the database.
[21:11:48] the patch already has support for the last three
[21:12:25] ok, so one question is whether the structure i propose is what we want.
[21:12:58] another question is whether we want to support the new features (multiple ids and groups, all kinds of extra info) in the db backend
[21:13:26] existing 3rd party wiki clusters may depend on a db based interwiki setup that can be edited from inside the wiki.
[21:13:29] DanielK_WMDE: which (single) question do you want this group to focus on first?
[21:13:33] should they miss out on the new features?
[21:14:03] robla: single question: give me feedback on https://phabricator.wikimedia.org/P3044
[21:14:29] does that structure seem sensible? is it missing something structurally?
[21:14:41] that's not how you spell paths
[21:15:07] bd808: at the bottom of the example are the indexes. i'd want to generate them. but that can be done from json to php, or php to php, or php to json...
[21:15:26] TimStarling: oops ;) it's a wiki...
[21:15:47] so there will be one such json file per source wiki?
[21:16:07] DanielK_WMDE: keep in mind the global ids can be renamed
[21:16:18] e.g. be-x-old to be-tarask
[21:16:30] you have enwiktionary.ids.interwiki[0] == 'wikt' which only makes sense if the source wiki is in the english language
[21:16:32] TimStarling: i imagine every wiki would read three files actually (and perform a deep merge): one with info shared across the family, one with info shared across the language, and one with local overrides for the specific wiki
[21:16:34] how would you handle such cases?
[21:17:11] if the array is indexed by global id? ("enwiki": {)
[21:17:22] TimStarling: so you would have only one file saying that "wiktionary" is the interwiki prefix for enwikt on all english language wikis. And one that says that "en" is the prefix for enwiki on all wikipedias
[21:17:32] actual wiki IDs (DB names) have almost never been renamed
[21:17:43] DanielK_WMDE: line 88 cancels out line 87
[21:17:48] I did a few in the early days but it got more complicated later on, I don't think anyone has tried it lately
[21:17:55] you can't have two keys in a dict with the same value
[21:17:56] in the past no, but in future I'd like to be able to do that
[21:17:56] aude: an entry can have multiple global ids. they act as aliases. only one of them would be used as a key in the file, making it the *canonical* global id.
[21:18:16] bd808: hm, when I try to edit the paste, i get an empty text box?...
[21:18:18] DanielK_WMDE: also can interwiki ids be renamed?
[21:18:18] silly
[21:18:37] aude: that would break existing page content, right? but you can add prefixes.
[21:18:52] DanielK_WMDE: that sounds fine, but it's not in the proposal you're asking me to review
[21:18:54] Krenair: sorry, what was that?
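
A minimal sketch of the "three files, deep merge" idea mentioned above, using array_replace_recursive as the merge step. The file names are invented for illustration, and the exact merge semantics (especially for list-valued fields such as alias arrays) are an open detail, not something the RFC pins down.

<?php
// Hypothetical layering for enwiki: family-wide, language-wide, then local overrides.
$layers = [
	'interwiki-family-wikipedia.json',
	'interwiki-lang-en.json',
	'interwiki-enwiki.json',
];

$merged = [];
foreach ( $layers as $file ) {
	$data = json_decode( file_get_contents( $file ), true );
	// Later layers win, key by key; list-valued entries would need more careful handling.
	$merged = array_replace_recursive( $merged, $data );
}
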
[21:19:02] renaming of wikis DanielK_WMDE
[21:19:13] ah, right
[21:19:21] DanielK_WMDE: maybe a site can have multiple interwiki ids?
[21:19:29] TimStarling: sorry, what isn't? the thing about combining three files?
[21:19:34] aude: sure
[21:19:49] DanielK_WMDE: yes, just one file on P3044
[21:19:49] P3044 interwiki.json - https://phabricator.wikimedia.org/P3044
[21:20:46] * DanielK_WMDE can't edit
[21:21:33] TimStarling: it's on the RFC page. it's not reflected in the JSON file, that's true. The "three files" thing isn't baked into the logic. It would just be the thing i'd do for the wikimedia cluster.
[21:21:37] Krenair: you mean renaming of the db name? in addition to renaming the language code which was what aude talked about.
[21:21:46] TimStarling: as far as the software is concerned, it reads any number of files, and deep-merges them
[21:22:04] do you think i should make three pastes for a more elaborate example?
[21:22:32] maybe dbname could be another type of id?
[21:22:34] # i imagine every wiki would read three files actually (and perform a deep merge): one with info shared across the family, one with info shared across the language, and one with local overrides for the specific wiki
[21:22:46] aude: it could, yes
[21:22:54] #info i imagine every wiki would read three files actually (and perform a deep merge): one with info shared across the family, one with info shared across the language, and one with local overrides for the specific wiki
[21:23:42] #info aude: also can interwiki ids be renamed? daniel: you can add prefixes.
[21:24:05] #info an entry can have multiple global ids. they act as aliases. only one of them would be used as a key in the file, making it the *canonical* global id.
[21:24:12] so...
[21:24:19] any comments on the ugly indexes at the bottom?
[21:24:24] prefixes or aliases?
[21:24:27] i stole that idea from how the cdb files work
[21:25:08] eew to maintain by hand
[21:25:09] aude: you can have any number of any "kind" of id. so you can have multiple global ids, multiple interwiki prefixes, etc.
[21:25:53] jzerebecki: yea :) The idea is that if they are not there, they are computed on the fly. and when we write the file back out (as php or json), we also output the indexes, for quicker reading
[21:26:01] DanielK_WMDE: do those indices really need to be precomputed? That seems like the sort of thing that could be cached in APC
[21:26:15] bd808: the file is cached in apc.
[21:26:15] DanielK_WMDE: ok, think i understand what you mean by "interwiki prefix"
[21:26:17] DanielK_WMDE: no need for a more elaborate example, I think the RFC more or less covers it
[21:26:27] ok
[21:26:51] bd808: to me the question is: would it be ok to just re-compute the indexes when reading (or when needed)?
[21:26:51] DanielK_WMDE: is "_by_id" -> global -> and then enwiki twice correct?
[21:27:01] "when we write the file back out" -- that I don't like if you mean via a specialpage
[21:27:02] aude: yes
[21:27:26] bd808: right. if you want on-wiki editing, you want the db backend. which doesn't support the nice new features.
[21:27:26] or should be "some-old-alias": "enwiki"
[21:27:35] and "enwiki": "enwiki"?
[21:28:02] bd808: so one of my questions for today is: do we need to add interwiki_ids, interwiki_groups, and interwiki_props in the database? perhaps that could be an extension.
[21:28:13] aude: exactly
[21:28:20] DanielK_WMDE: ok
[21:28:37] i can't edit the paste now, but think it should be fixed
[21:28:58] aude: yes... i wonder what's wrong with the edit feature :(
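
To make the "_by_id" index question above concrete, here is one way the index could be computed on the fly when it is missing from the file. This is a sketch under the assumption that every ID value (including the canonical ID itself) maps back to the canonical top-level key, as in the "some-old-alias": "enwiki" example; none of this is code from the actual patch.

<?php
// Compute a per-kind alias index: every ID value points at the canonical global ID.
function buildIdIndex( array $sites ) {
	$byId = [];
	foreach ( $sites as $canonicalId => $site ) {
		foreach ( $site['ids'] as $kind => $values ) {
			foreach ( $values as $value ) {
				// e.g. $byId['global']['some-old-alias'] = 'enwiki';
				//      $byId['global']['enwiki']         = 'enwiki';
				$byId[$kind][$value] = $canonicalId;
			}
		}
	}
	return $byId;
}
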
[21:29:08] i'll make a new one later
[21:29:10] DanielK_WMDE: well, what will need them? Are you making wikibase or some other extension dependent on a particular backend if you don't make them all the same?
[21:29:16] (DanielK_WMDE, jzerebecki, TimStarling and All: In what ways might this - https://phabricator.wikimedia.org/P3044 - help Content Translation become more precise in all of Wikidata/Wikipedia's ~300 languages?)
[21:30:32] there are security concerns with allowing web users to write PHP files, it's done by one of the template engines but they use cryptographic signatures to make sure nothing other than MW writes those files
[21:30:33] bd808: the current db backend doesn't support everything we need for wikibase (that's why we added the sites table). For wikidata, we can just use the file based backend. but 3rd parties that rely on on-wiki editing may want the db backend *and* the new features.
[21:30:57] TimStarling: my thought is: if you want on-wiki editing, use the db backend.
[21:31:12] do we want on-wiki editing?
[21:31:20] on the wmf cluster, no
[21:31:39] many other wiki farms do. especially the smallish ones
[21:31:41] ok, I'll relax now in that case
[21:31:47] :)
[21:32:20] another thing we should have is configuration for sorting order of interwiki ids (maintained in a sane place)
[21:32:43] for legacy reasons, this information is in wikibase but totally doesn't belong there
[21:32:45] aude: good point... not sure where to put that in this design, but i'll think about it
[21:32:49] we do have m:Interwiki_map on meta
[21:33:00] #info another thing we should have is configuration for sorting order of interwiki ids (maintained in a sane place)
[21:33:01] which has to be transferred to the cluster manually by a deployer
[21:33:11] there are pages on meta wiki where this information is maintained also for pywikibot and other stuff
[21:33:22] anyway, yes, the JSON format you propose looks very extensible and will presumably meet our needs
[21:33:23] https://meta.wikimedia.org/wiki/MediaWiki:Interwiki_config-sorting_order-native-languagename
[21:33:24] Krenair: files on gerrit would be a *lot* nicer
[21:33:26] https://meta.wikimedia.org/wiki/MediaWiki:Interwiki_config-sorting_order-native-languagename-firstword
[21:33:33] DanielK_WMDE, it goes via gerrit
[21:33:39] #info anyway, yes, the JSON format you propose looks very extensible and will presumably meet our needs
[21:33:41] then there is something special for serbian and another language
[21:34:03] west frisian
[21:34:04] there's a script that you run on tin to pull the page contents and generate the file, which you download and upload to the git repository, then commit and push it out as a wmf-config change
[21:34:21] #link https://meta.wikimedia.org/wiki/MediaWiki:Interwiki_config-sorting_order-native-languagename-firstword
[21:34:38] Krenair: I propose to just maintain the files in the git repo
[21:34:46] without an on-wiki page?
[21:35:00] Krenair: the script sounds good
[21:35:03] personally, yes. but if we want to, we can still pull info from that page
[21:35:11] but the information about each site is getting increasingly complex
[21:35:18] so it becomes nasty to maintain as wikitext
[21:35:39] anyway...
[21:35:53] I don't want to have m:Interwiki_map anymore
[21:35:59] \o/
[21:36:01] it was a bad idea (of mine) to start with
[21:36:19] there is some overlap between the interwiki/site info, and wgConf. Do we want to integrate them? Or keep them separate?
[21:36:27] #info I don't want to have m:Interwiki_map anymore
[21:36:36] wgConf is quite complex
[21:36:56] what do you mean by integrate?
[21:37:00] yea... i'm not proposing to replace it with the interwiki stuff completely.
[21:37:04] there are at least 3 places for the language: wgLanguageCode, groups:language, props:language. may all three be different?
[21:37:17] but e.g. which wiki uses which database could come from the interwiki/site json.
[21:37:28] * aude notes we also have sitematrix
[21:37:34] ^^
[21:37:50] aude: that uses wgConf, right?
[21:37:52] on WMF generating interwiki JSON will be scripted, so wgConf could be used as a data source by the script
[21:37:59] DanielK_WMDE: i think so
[21:38:13] and then we (currently) populate sites from site matrix
[21:38:17] a bit evil
[21:38:19] jzerebecki: they can, though they would usually be the same. Actually, wgLanguageCode in wgConf should always be the same as props:language in the interwiki info
[21:38:39] DanielK_WMDE: can we deup those two then?
[21:38:44] It has some hardcoded stuff in it :(
[21:38:46] (sitematrix)
[21:38:47] s/deup/dedup/
[21:39:26] e.g. this:
[21:39:29] if ( in_array( $lang, array( 'cz', 'dk', 'epo', 'jp', 'minnan', 'nan', 'nb', 'zh-cfr' ) ) ) {
[21:39:29] continue;
[21:39:34] jzerebecki: if we closely integrate wgConf with the sites info, maybe. But $wgLanguageCode will potentially need to be available very early during the init process. Not sure the interwiki info will be available early enough
[21:39:39] it's a bit of a chicken-and-egg issue
[21:39:51] (you need wgConf to know where to read the interwiki info from)
[21:40:22] some of the hard coded stuff in dumpInterwiki.php can be replaced with wgConf
[21:40:55] it's a chicken-and-egg issue to find the database that corresponds with a given language/domain?
[21:40:56] DanielK_WMDE: for wmf production both will be in mediawiki-config.git so that they can be changed atomically?
[21:41:31] TimStarling: i think it's solvable, but we have to think about initialization order, yes
[21:41:46] jzerebecki: i suppose so
[21:42:11] yeah, it is just an index inversion problem, you just iterate through wgLanguageCode and flip it
[21:42:29] * jzerebecki was thinking about what needs to be done for something like https://gerrit.wikimedia.org/r/#/c/277519/
[21:42:36] ok. i'd like to re-iterate the next steps i propose, so you all can tell me whether you approve, or have questions, or what.
[21:42:48] 1) implement array-based InterwikiLookup (loads from multiple JSON or PHP files)
[21:43:03] 2) implement maintenance script that can convert between different interwiki representations.
[21:43:11] 3) Provide a config variable for specifying which files to read interwiki info from. If not set, use old settings and old interwiki storage.
[21:43:20] bonus) split CDB from SQL implementation
[21:43:33] more info on these is on the ticket, https://phabricator.wikimedia.org/T113034
[21:43:44] 1. this is non-threatening since we're not required to use it in production, right?
[21:43:51] yes
[21:44:13] 2. also mostly non-WMF?
[21:44:14] that's the nice thing about dependency injection. just swap stuff out :)
[21:44:38] this is not the dumpInterwiki.php equivalent yet?
[21:44:59] TimStarling: the conversion script would be used for deployment, or at least when migrating from CDB. But by itself, it doesn't change anything about how things work
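
As an aside, the "index inversion" Tim refers to a few lines up could look roughly like the following, assuming the standard SiteConfiguration API ($wgConf->wikis and $wgConf->get()) and glossing over the suffix/tag parameters a real lookup on the WMF cluster needs.

<?php
// Derive a language => dbnames map by flipping wgLanguageCode from wgConf.
$byLanguage = [];
foreach ( $wgConf->wikis as $dbname ) {
	$lang = $wgConf->get( 'wgLanguageCode', $dbname );
	$byLanguage[$lang][] = $dbname;   // one language code maps to several wiki families
}
// $byLanguage['en'] would then list enwiki, enwiktionary, enwikibooks, ...
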
[21:45:32] (DanielK_WMDE, jzerebecki, TimStarling and All: In what ways might this session and this - https://phabricator.wikimedia.org/P3044 - help create a robust translator building on, for example, Content Translation, Google Translate and adding a sophisticated Wiktionary between all of Wikidata/Wikipedia's ~300 languages - and even lead to a Universal Translator between all 7,943+ languages? Thank you.)
[21:45:37] no, this doesn't replace dumpInterwiki.php. it takes multiple json (or php) files, combines them, indexes them, and then writes json (or php, or sql)
[21:45:53] you would use dumpInterwiki.php to generate CDB and then convert from CDB to PHP during deployment?
[21:46:19] Scott_WUaS: i don't think it would. we are discussing the management of meta-information about websites.
[21:46:48] DanielK_WMDE: Thanks
[21:47:06] 3. sounds fine and conventional
[21:47:18] TimStarling: i would use the conversion script to generate a JSON from the CDB, then split that manually, once, and put it into gerrit. I'd then use the conversion script to turn the json into indexed php during deployment
[21:47:48] like we do for wikiversions.json ?
[21:48:03] as part of scap?
[21:48:14] TimStarling: we may have a dumpInterwiki equivalent that builds interwiki info from config, or use the old dumpInterwiki, and then convert. but i'd prefer to just stop using it, and maintain the interwiki info directly
[21:48:26] not sure it needs to be regenerated that often
[21:48:31] so the JSON would then be human-edited configuration?
[21:48:32] aude: possibly. i'm blurry on the details
[21:48:39] TimStarling: exactly
[21:48:55] we could support yaml ;)
[21:49:11] we don't have much time to discuss this now, but I'm not sold on that part
[21:50:13] the reason for introducing rebuildInterwiki.php was to reduce the number of points in the configuration file that need to be simultaneously edited when a new wiki is introduced
[21:50:15] #info Tim is not convinced that interwiki info should be maintained by hand as json. Perhaps we still want dumpInterwiki (or equivalent)
[21:50:28] or when you introduce an alias or whatever
[21:50:57] TimStarling: yea, but does the config have a good place for all the extra info we want in the interwiki info? hm, perhaps it's all there.
[21:51:37] it's not all in wgConf, dumpInterwiki.php is canonical for some things
[21:51:40] TimStarling: the entire thing doesn't require us to stop using dumpInterwiki. It would *allow* us to stop using it, if we want to.
[21:51:51] But we can keep using it - and we don'
[21:51:52] you would have to work out which things and migrate them somewhere else
[21:52:05] is there some connection between dumpInterwiki and MW* setup?
[21:52:42] dumpInterwiki is a script in Extension:WikimediaMaintenance
[21:52:43] TimStarling: yes... do you think we need to have a detailed plan for that in order to move forward? or can we figure that out when we get to it?
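
The "turn the json into indexed php during deployment" step discussed above could be as small as the following sketch. The file names are placeholders, and buildIdIndex() is the hypothetical helper from the earlier sketch; the real conversion script (next step 2) would presumably also validate and merge.

<?php
// Convert hand-maintained JSON into a PHP file that returns the same data plus
// precomputed indexes; opcode/APC caching then makes reads cheap at runtime.
$sites = json_decode( file_get_contents( 'interwiki.json' ), true );
$sites['_by_id'] = buildIdIndex( $sites );   // hypothetical helper, see sketch above
file_put_contents(
	'interwiki.php',
	"<?php\nreturn " . var_export( $sites, true ) . ";\n"
);
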
[21:52:43] * aude has to run in a few minutes
[21:52:57] it creates our interwiki.php file that controls interwiki link prefixes
[21:53:03] we can figure it out later
[21:53:04] but also wants feedback on https://phabricator.wikimedia.org/T90617 (and wants to solve this more short term)
[21:53:10] I think the system I propose is flexible enough to allow us to use whatever bits and pieces are convenient
[21:53:18] what to do with the populate sites script
[21:53:19] yeah, you can go ahead with the implementation
[21:53:55] #info Tim thinks we need to figure out what information can be taken from wgConf, and what should come from elsewhere, and how to maintain it. But it's not a blocker for now, we can figure it out later
[21:54:44] aude: we have a file based implementation of SiteLookup, right? Can't we just stop using the sites table?
[21:54:52] then we wouldn't need that script at all.
[21:55:20] DanielK_WMDE: then how do we generate and maintain the files?
[21:55:32] file-based is just a caching layer
[21:55:39] at the moment
[21:55:46] in my mind, the files are the canonical info.
[21:55:50] or should be
[21:56:06] I'd want FileBasedSiteLookup to basically be the same as the file based InterwikiLookup. Could actually be the same class
[21:56:12] DanielK_WMDE: ok
[21:56:24] would this be easy for third parties to use?
[21:56:48] e.g. for development wiki, i want all wikimedia sites in my site store / interwiki
[21:56:49] easier to edit a json file than to manually insert stuff into the database, right?
[21:57:06] can't wg* be primary info at least for local wikis?
[21:57:15] aude: for a dev setup, we can just have a default sites.json on gerrit somewhere
[21:57:18] just download it, done
[21:57:19] * aude really has to run now
[21:57:21] #info next week's meeting: E184 RFC: Requirements for change propagation (T102476)
[21:57:21] T102476: RFC: Requirements for change propagation - https://phabricator.wikimedia.org/T102476
[21:57:29] primary info SMalyshev?
[21:57:57] SMalyshev: yes... but it's incomplete. So we'd have to somehow combine it with other info. That's a bit tricky. It's one of the questions I had for today.
[21:58:02] but I guess we are out of time.
[21:58:02] Krenair: primary source of information about what wikis are there etc.
[21:58:05] DanielK_WMDE: then we need a way to add new wikis to the sites.json (idally not manualy)
[21:58:09] ideally*
[21:58:15] as part of addWiki.php
[21:58:18] yeah, out of time now
[21:58:23] aude: i'd add them manually - and only there.
[21:58:32] Thank you, All!
[21:58:37] TimStarling: do you think the next steps are approved?
[21:58:58] thanks everyone :)
[21:59:02] approved or last call?
[21:59:04] SMalyshev, you're suggesting all the wg* config would be part of this?
[21:59:04] * aude runs away
[21:59:26] we can say informally approved I guess (pending code review)
[21:59:39] TimStarling: good question... do we do last calls for parts of rfcs? or should i split that bit out into a separate ticket, and we do a last call on that?
[21:59:43] I'm a little confused as to what we just approved
[21:59:53] Krenair: yes basically what I am wondering - we have a lot of info in wg* configs which MW* scripts are using, but then we seem to have a lot of this info duplicated, if I understand right
[22:00:02] robla: DanielK_WMDE's numbered points
[22:00:07] so I wonder if it can't be in one place only.
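
Step 3 from the list earlier ("a config variable for specifying which files to read interwiki info from") plus the dev-wiki idea above might end up looking something like this in LocalSettings.php. The variable name $wgInterwikiFiles and the file paths are placeholders invented here, not settings that exist.

<?php
// Hypothetical: point MediaWiki at one or more interwiki/site files.
// If left unset, the old interwiki settings and storage keep working.
$wgInterwikiFiles = [
	"$IP/cache/sites-default.json",      // e.g. a default sites.json pulled from gerrit
	"$IP/cache/sites-local.json",        // local additions/overrides for this wiki
];
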
[22:00:10] SMalyshev, some of wg* comes from PrivateSettings
[22:00:20] robla: a go-ahead for the three next steps i posted earlier, at xx:42
[22:00:50] we have wgConf->get to get data about other local wikis but it seems to have issues when you configure things per-dblist/family instead of per-wiki
[22:00:52] SMalyshev: yes. wgConf or the json file should be canonical, and one should somehow reference the other.
[22:01:05] Krenair: we don't need all wg*'s of course just those related to wiki definitions. Basically whatever would MW* see when it does its thing
[22:01:17] robla: do you think a formal last call for these three points, in a separate ticket, would be in order?
[22:01:27] or is an informal go-ahead sufficient?
[22:02:04] DanielK_WMDE I'm not sure.
[22:02:40] #info Tim thinks it's ok to go ahead with implementing the proposed next steps, as they are non-threatening. But should we have a formal last call?
[22:03:01] so....let's not have a last call
[22:03:07] I think last call is just for closing an RFC
[22:03:17] TimStarling: that sounds good
[22:03:30] robla: i tend to think we don't need one, since these steps don't actually incur any changes on existing wikis. they only add options.
[22:03:44] alright, that's it then
[22:03:48] yay :)
[22:03:50] thanks, all
[22:03:54] yup, thanks!
[22:03:54] #endmeeting
[22:03:55] Meeting ended Wed May 11 22:03:54 2016 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
[22:03:55] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-05-11-21.00.html
[22:03:55] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-05-11-21.00.txt
[22:03:55] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-05-11-21.00.wiki
[22:03:55] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-05-11-21.00.log.html
[22:04:45] thanks for making this RfC DanielK_WMDE
[22:05:30] should we switch to another channel for SMalyshev's concerns?
[22:05:52] * jzerebecki also wants to say something re that
[22:06:18] #wikimedia-tech?
[22:06:22] past midnight here, i'd prefer to do this async
[22:06:33] SMalyshev: sounds good
[22:06:34] ok that works too
[22:06:41] SMalyshev: comment on the ticket? i'll reply tomorrow. i'm about to go into youtube mode
[22:06:51] ok
[22:06:54] DanielK_WMDE: mostly SMalyshev just needs to be brought up to speed on how things are
[22:07:23] ah, i'm not the best person for that anyway. not for how dumpInterwiki works, or how it's used, anyway
[22:07:24] so some other people can help with that, you don't need to answer all the questions
[22:07:35] cool, thanks!
[22:07:39] * DanielK_WMDE shuts up now
[22:07:51] TimStarling: I'd appreciate a quick sync yeah... last time I looked into it I almost got my brain dislocated
[23:49:54] We will be starting video-based office hours with interim ED in about 10 minutes: https://meta.wikimedia.org/wiki/Wikimedia_Foundation_Executive_Director/May_2016_office_hours
[23:51:46] You can join us and ask questions during the session via Blue Jeans: https://bluejeans.com/198076339
[23:51:54] The session will also be available for streaming on YouTube: http://www.youtube.com/watch?v=XazXyL-Ybjo
[23:52:18] Etherpad that has been set up for notes: https://etherpad.wikimedia.org/p/May_2016_-_WMF_ED_May_2016_video_office_hours