[12:52:07] tgr|away: i'm trying to wrap my head around the wmfcloud/vagrant stuff.
[12:52:17] let me know when you are around, I'm a bit lost ;)
[13:05:33] DanielK_WMDE_: o/
[13:23:55] tgr: oh hey :)
[13:24:30] tgr: so. I'm completely failing at this :)
[13:24:49] current issue: I tried to make a sysop with createAndPromote. The script says it created the user, but i can't log in, and ListUsers doesn't show the user.
[13:25:52] i don't see it in the db either
[13:25:59] but when i re-run the script, it says the user already exists :)
[13:27:54] I'm using mwscript. I'm very confused about where it wants to have the wiki/db name
[13:28:51] mwscript maintenance/createAndPromote.php wiki daniel **** --sysop
[13:28:52] Account exists. Perhaps you want the --force option?
[13:29:27] so it exists... somewhere? but not in the DB called wiki.
[13:29:37] DanielK_WMDE_: is this on mcr-base?
[13:29:41] yep
[13:29:57] in theory createAndPromote is the right way, vagrant itself uses it for some roles
[13:30:13] createAndPromote.php usually takes the user name as the first param, not the db name. but if I try that, i get:
[13:30:14] Database daniel not configured
[13:30:27] we have a forked version of mwscript where you can omit the wiki name and it will use the standard wiki
[13:30:39] otherwise it should behave the same as production
[13:31:18] it seems this version doesn't do either
[13:31:23] i have no idea what it does, tbh
[13:31:47] i see a user in the database called "wiki". maybe it used the first param as the db name *and* as the first param to the actual script?
[13:32:43] hm... if i try to change the password, it complains that the password is too short, no matter what i do. so i assume it's using the second param as the password.
[13:32:48] this seems completely broken
[13:33:14] tgr: my guess is that this forked version is the problem.
[13:33:40] making the first param optional is going to make things extremely confusing, no?
[13:33:49] weird, this used to work
[13:34:15] well, most vagrant users don't use multiple wikis, so having to specify 'wiki' all the time is more confusing
[13:35:11] maybe https://gerrit.wikimedia.org/r/#/c/431751/ broke something
[13:35:13] it's annoying and completely pointless, since all scripts already have a --wiki parameter
[13:35:20] anyway, you can use --wiki=wiki
[13:35:21] just use that!
[13:35:47] so, the way it works now is that the script will use the first parameter as the wiki name AND as the first parameter of the script
[13:35:55] that's just fubar
[13:35:57] yeah, so it seems
[13:36:11] --wiki does not count as a positional parameter though, so that will work
[13:36:46] nope
[13:36:46] vagrant@mediawikivagrant:~$ mwscript createAndPromote.php --wiki=wiki Tgr 123 --sysop
[13:36:48] wiki: Creating and promoting User:Tgr into sysop...
[13:36:51] done.
[13:36:54] it still tries to use the first param as the wiki name, and fails
[13:37:16] mwscript maintenance/createAndPromote.php daniel xyz --sysop --wiki=wiki
[13:37:22] Database daniel not configured
[13:37:34] still has to be the first parameter
[13:37:42] ah. uh... why???
[13:38:04] that's how mwscript works, we try to keep the fork as close to production as possible
[13:38:43] *sigh* ok. i
[13:38:47] i'm filing a ticket
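A minimal sketch of the behaviour pieced together in the exchange above, for Vagrant's fork of mwscript (the user name and password below are placeholders, not from the log):

    # Works: --wiki=<dbname> is honored, but only as the very first argument.
    mwscript createAndPromote.php --wiki=wiki SomeUser 'a-long-enough-password' --sysop

    # Broken: a bare first positional argument is consumed as the wiki name
    # AND passed through to the script, so this creates a user literally
    # named "wiki" with "daniel" as its password:
    #   mwscript maintenance/createAndPromote.php wiki daniel xyz --sysop

    # Broken: --wiki anywhere but first is ignored, and the first positional
    # is still taken as the wiki name ("Database daniel not configured"):
    #   mwscript maintenance/createAndPromote.php daniel xyz --sysop --wiki=wiki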
[13:39:02] how should I refer to "the fork"? and how should i tag the ticket?
[13:41:04] Vagrant's version of MWScript.php
[13:41:21] it's stored somewhere in puppet/modules/mediawiki/templates
[13:43:14] my attempt at sanity was T177111, which didn't meet with unwavering support
[13:43:16] T177111: Librarize multiversion code - https://phabricator.wikimedia.org/T177111
[13:44:18] see that task for tags
[13:48:47] haha, i just posted a password on phab. luckily, a freshly made up one for a test instance ;)
[13:48:54] https://phabricator.wikimedia.org/T195772
[13:50:23] tgr: so, what's your intention with mcr-base vs mcr-full?
[13:51:50] mcr-base would be a plain MW core install and mcr-full a bunch of extensions
[13:52:01] tgr: i'd start with PageUpdater on mcr-base. once it's in, apply the RevisionStore patches, and test them with MIGRATION_WRITE_BOTH. Perhaps have another instance for testing with MIGRATION_NEW.
[13:52:33] extensions is also good... but we need to test the two migration modes
[13:52:54] yeah, that sounds like a good plan
[13:52:58] as to the sdc test instance: the mediainfo role seems to be broken. is there a good way to manually install extensions?
[13:53:07] the revisionstore patches are the DNM ones, right?
[13:53:44] they are DNM because they need to be tested on a vps
[13:54:11] we could rebase them on top of PageUpdater, but it's probably easier to just merge PageUpdater first
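A rough sketch of how the two migration modes discussed above might be toggled per instance. The setting name $wgMultiContentRevisionSchemaMigrationStage and the settings.d file name are assumptions; the actual name is defined by the unmerged RevisionStore patches, so verify it there before copying. MediaWiki-Vagrant loads extra configuration from /vagrant/settings.d/ automatically:

    # On the mcr-base instance: write both the old and the new schema.
    # (Assumed variable name -- check the RevisionStore patches.)
    printf '%s\n' '<?php' \
      '$wgMultiContentRevisionSchemaMigrationStage = MIGRATION_WRITE_BOTH;' \
      > /vagrant/settings.d/10-mcr-migration.php

    # On a second instance, switch to the new schema only:
    #   $wgMultiContentRevisionSchemaMigrationStage = MIGRATION_NEW;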
[13:54:17] so...
[13:54:49] is there a mediainfo patch? I haven't seen any
[13:54:57] s/patch/role
[13:55:00] yes
[13:55:08] wikibasemediainfo
[13:55:11] oh, ok
[13:55:40] not sure whether it sets up one or two wikis, though
[13:55:54] to *properly* test mediainfo, it needs to be federated with another repo
[13:55:57] oh, you are looking at mcr-base
[13:56:07] that's just something I started working on
[13:56:23] but it calls the wikidata role
[13:56:24] i'm just having a look around
[13:57:10] I created wikibasemediainfo.pp on mcr-base by mistake, and then forgot to delete it after copying over
[13:57:17] it's not an official role
[13:57:28] ah :)
[13:57:30] I'll do some manual testing on mcr-base to put PageUpdater through its paces.
[13:57:46] ideally, we'd get selenium tests to run against that. i have no clue how to do that... do you?
[13:58:02] there's a role for that
[13:58:09] browsertests or something like that
[13:58:34] the VMs have 2G memory, not sure if that will be enough
[13:58:53] but we can always set up a medium box for running selenium tests
[13:59:05] anyway I'll set that up
[14:01:45] as for the mediainfo role, if you know what exactly has to be installed that would be helpful
[14:01:49] https://github.com/wikimedia/mediawiki-vagrant/blob/master/puppet/modules/role/manifests/wikidata.pp
[14:01:52] https://github.com/wikimedia/mediawiki-vagrant/blob/master/puppet/modules/role/manifests/wikibase_repo.pp
[14:02:37] the wikidata role sets up a bunch of things that wikibase_repo doesn't, do any of those have to be present on the downstream wiki (which simulates Commons)?
[14:03:29] and we probably want to set up something like commons.pp I guess, where the wiki with the mediainfo is a file repo for other wikis
[14:04:04] how is that supposed to be handled anyway, should other wikis display the structured data part?
[14:05:17] tgr: that's actually an excellent question for the multimedia team :)
[14:05:33] I assume: not for now. maybe some day.
[14:05:52] that is: they'll show whatever is transcluded into the wikitext
[14:06:39] so... the wikidata role sets up a repo and a client wiki on the same host. the mediainfo role probably needs to do something similar: set up two federated repos on the same host.
[14:07:41] but it should also be possible to set up a non-federated mediainfo, with items and properties on the local wiki. that's much nicer for development.
[14:09:16] tgr: so, what has to be installed and how it has to be configured depends a lot on how close we want to be to production. "standalone mediainfo" is a lot simpler than "mediainfo-as-used-on-commons".
[14:09:47] (also note that i don't know shit about puppet/vagrant roles)
[14:10:09] btw - we are really running VMs inside VMs? that's the recommended setup? really?...
[14:19:03] yeah, that's the recommended way to host mediawiki wikis on Cloud VPS
[14:19:36] I'm not sure why, bd808 could probably tell (if he wasn't on vacation)
[14:22:20] Vagrant is a skunkworks project that's mostly done in personal time, so maybe it was just easy to get it working that way and no one had the bandwidth to do it in a proper way (and RelEng wants to throw it away eventually anyway and replace everything with docker containers)
[14:23:19] running docker in a vps makes a lot more sense.
[14:23:34] i wouldn't replace the vps with docker, though
[14:24:11] https://blog.cloudandheat.com/index.php/en/2016/04/20/docker-containers-on-openstack-vms/
[14:24:33] except docker does not have a role system, so it doesn't really provide a mechanism to install multiple extensions on a wiki
[14:25:12] also vagrant uses LXC in Cloud VPS, which is not that dissimilar from docker
[14:25:28] right, you'd still need puppet. docker itself assumes SOA
[14:25:32] (it could use docker, too, somebody would just have to test it)
[14:26:11] yea, whatever...
[14:26:58] so which way do we want the mediainfo box, standalone or commons-ish?
[14:27:18] I say let's go for standalone
[14:27:42] for testing mcr, the federation aspect is irrelevant.
[14:29:11] ack
[14:29:28] that means: one wiki, with wikibase and mediainfo installed.
[14:30:01] we don't need all the soft dependencies (ULS, CLDR, Cirrus, GeoData, etc)
[14:32:53] as for mcr-full, I think the list of extensions that are tested for core commits is https://github.com/wikimedia/integration-config/blob/78c28fb7f93a5c5bdd8d89f7f13f0a72205e9cd2/zuul/parameter_functions.py#L434-L470
[14:33:09] drop the ones that don't deal with content/editing and that seems like a good starting point
[14:59:08] tgr: we should also keep the ones that *react* to content being edited.
[15:01:39] tgr: https://etherpad.wikimedia.org/p/mcr-full?
[15:05:58] I removed GuidedTour and Zero* as I don't think those do anything interesting (ZeroPortal could be useful for testing JsonConfig but the zero stuff is not well-maintained, so probably not worth the effort)
[15:06:19] also TimedMediaHandler
[15:06:25] looks good otherwise
[15:37:02] tgr: I was thinking that GuidedTour uses ApiEditPage. But I may be wrong
[15:40:25] at a glance it doesn't
[15:40:52] enwiki has a tour that does a lot of it (Wikimedia Quest or something like that?)
[15:41:04] that would have to be imported by hand though
[15:41:45] I looked through the existing roles, mentioned on the etherpad the ones that seemed interesting
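One possible answer to the earlier question about manually installing extensions, sketched for the standalone mediainfo box agreed on above. Only the wikibase_repo role is confirmed in the log; the Gerrit clone path, the settings.d file name, and the assumption that the extension registers via wfLoadExtension are guesses:

    # From the host, in the MediaWiki-Vagrant checkout: enable the official
    # Wikibase repo role (linked above), then provision.
    vagrant roles enable wikibase_repo
    vagrant provision

    # Clone the MediaInfo extension by hand (assumed Gerrit path) and load it
    # via settings.d, which MediaWiki-Vagrant picks up automatically.
    cd mediawiki/extensions
    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/WikibaseMediaInfo
    echo "wfLoadExtension( 'WikibaseMediaInfo' );" > ../../settings.d/20-mediainfo.php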
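For the selenium setup mentioned above, the role name was remembered only as "browsertests or something like that", so it is worth confirming against the role list before enabling it:

    # List the available roles and look for the browser-test one, then enable it.
    vagrant roles list | grep -i browser
    vagrant roles enable browsertests
    vagrant provision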
[23:05:18] * legoktm :You must log in with services to message this user
[23:05:42] which is curious, since I use SASL; it's not meant to be possible to connect without logging in
[23:05:47] o.O
[23:06:00] I have that turned on so I don't need to deal with all of the random pm spam
[23:06:22] we don't need to do a meeting today, although I will still need to put together a weekly report for Cindy to use tomorrow
[23:06:45] ok, should I put my status update in the etherpad then?
[23:07:12] that would be useful if you have time, thanks
[23:08:40] will do then :)
[23:18:07] I sent out an email about meetings, just what I said above basically