[00:14:52] 2015-12-21 16:58:03 mw1224 thwiki CentralAuth-Bug39996 INFO: CentralAuthHooks::attemptAddUser: failed with message ชื่อผู้ใช้ที่กรอกมีผู้ใช้แล้ว [Thai: "the username entered is already in use"]
[00:14:55] ......sigh.
[02:32:00] Docker is not ready for managing complex infrastructure
[02:34:55] it does not integrate smoothly with any configuration management tool, and it does not provide an equivalent facility
[02:36:34] the way people work around it is so bad it beggars belief
[02:37:19] example: https://puppetlabs.com/blog/building-puppet-based-applications-inside-docker , from James Turnbull, who created Puppet and who now works at Docker (and wrote the Docker book)
[02:37:32] RUN apt-get -y -q install wget git-core
[02:37:33] ADD Puppetfile /
[02:37:33] RUN librarian-puppet install
[02:37:34] RUN puppet apply --modulepath=/modules -e "class { 'nginx': }"
[02:37:36] RUN echo "daemon off;" >> /etc/nginx/nginx.conf
[02:38:43] the other standard approach is to include a huge shell script
[02:38:52] parametrized, if you want flexibility
[02:39:30] for example: https://github.com/synctree/docker-mediawiki/blob/master/1.24/docker-entrypoint.sh
[02:40:49] I should say: the *sensible* ones use a shell script. There is also a fashion of trying to cram the installation of a complex application into a single, gigantic command line
[02:42:23] Dockerfiles are not described by any formal grammar or specification
[02:42:45] the docs (https://docs.docker.com/engine/reference/builder/) say that the format of Dockerfiles is "INSTRUCTION arguments"
[02:43:09] but that's OK, because it's such a simple DSL, we don't need a spec, right?
[02:43:35] it's not like we'll augment this with all sorts of terrible hacks in the future
[02:54:40] am I missing something?
[03:08:46] ori: I think your analysis and mine from a year ago are about in line
[03:09:41] The reason for the giant command line things in a dockerfile is that a filesystem snapshot is made for every command
[03:24:18] right
[03:26:54] I think that dockerfiles are probably workable for deploying internally managed containers and for showing how something works.
[03:27:08] They don't feel like a distribution format though
[03:27:58] there are a lot of mostly emerging ways to manage orchestration which are very confusing from an end user perspective
[03:28:36] yes, agreed
[03:29:13] bd808: I have a totally unrelated question, about process
[03:29:22] * bd808 is trying out hexchat on his barely working debian testing laptop
[03:30:03] I like totally unrelated questions
[03:30:17] what should be done with tasks like https://phabricator.wikimedia.org/T110120 ? That is to say: things that would be nice to do at some point but for which there exists no concrete plan
[03:30:33] and which do not mesh with any development arc currently underway
[03:31:08] like where do you park them so you don't lose track?
[03:31:43] there are many theories on tracking a backlog
[03:31:55] it's parked in "backlog" on the perf-team board, but I'd like to differentiate it from the items that represent things that we must actually get to
[03:32:06] I'm tempted to just delete it
[03:32:35] At my last job we had a "parking lot" category
[03:32:50] for things like that that were vague ideas
[03:33:33] occasionally (once a quarter or so) we'd look through it and kill things, drag them forward or just let them sit
[03:33:59] Lowest priority and exclude that from your queries?
[03:35:14] no, I'm going to close such tasks, I think.
What you suggest sounds completely sensible but also completely unsuitable for me.
[03:36:03] https://www.mediawiki.org/wiki/User:MZMcBride/Bugs#Philosophy
[03:36:39] yes, I know
[03:36:50] :)
[03:36:51] I have a line from that page in my fortune file
[03:37:06] but it's not an approach that works for me
[03:38:18] I'm very limited
[03:41:46] I appreciate the perspective, though. Like I said, I think it makes sense, and hearing it helped me crystallize my thoughts.
[03:51:56] stalled tasks are considered open, right?
[03:52:17] yes
[14:09:06] ori: If it were me, I'd make a "would be nice someday" column and put it there.
[14:10:47] heh
[14:10:57] Was it BZ that had the mystical future?
[16:32:06] hashar: Is the database-related unit test failure in https://phabricator.wikimedia.org/T122007 telling us that the test needs "@group Database", or is there a less-expensive solution?
[16:34:31] doh
[16:34:50] anomie: to be honest I have no idea how centralauth is tested
[16:35:10] but afaik, we only create a single database, something like $wgDBname = mwjenkins-XXX
[16:36:20] anomie: and the @group Database on a test causes it to be recognized by MediaWikiTestCase as needing a DB
[16:36:32] it will then set up a basic database / tables cloned from the reference one
[16:37:06] hashar: Does it set up the basic database for each test in the class, or does it somehow do it only once?
[16:37:08] seems like CentralAuth defaults to using the local wiki
[16:37:22] it is set up / torn down for each test
[16:38:16] anomie: https://github.com/wikimedia/mediawiki/blob/master/tests/phpunit/MediaWikiTestCase.php#L411-L424
[16:38:22] looks for @group Database
[16:38:42] which states yes / no
[16:38:50] then just before we run the test: https://github.com/wikimedia/mediawiki/blob/master/tests/phpunit/MediaWikiTestCase.php#L114-L137
[16:39:04] where the last highlighted line (137) is parent::run, which actually executes the test
[16:39:43] you can see that if we found @group Database, we might create a new db (setupTestDB using clone())
[16:39:57] then inject a bunch of mwcore data (addCoreDBData())
[16:40:07] and tests can use addDBData() to inject their own material
[16:40:08] hashar: Eew, it's even worse. CentralAuth has a special CentralAuthTestCaseUsingDatabase class that needs to be inherited from to get the database set up...
[16:40:44] neaaat
[16:41:09] so in this case
[16:41:13] that is done before / after the class
[16:41:15] not on each test
[16:42:10] hashar: Thanks for the help, looks like the bottom line is that for T122007 I'll have to hack the code that accesses the database to detect if it's being run from this core structure test and avoid the DB access after all.
[16:43:35] ohh
[16:43:40] the recently introduced ApiDocumentationTest::testDocumentationExists in mwcore
[16:44:08] I don't even know what it is doing
[16:44:20] seems to call the help for each api page/method/queries
[16:44:53] That's pretty much exactly what it's doing. TTO suggested it after the nth time of merging a patch without the i18n.
[16:45:15] sounds about right
[16:45:31] CI runs install.php
[16:45:52] maybe CentralAuth could set $wgCentralAuthDBName = $wgDBName
[16:45:54] and some prefix
[16:46:03] whenever it recognizes being run under jenkins
[16:47:00] hashar: That's a good idea, I'll try that.
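A rough sketch of the idea anomie agrees to try here: detect the CI environment and point CentralAuth's database setting at the single Jenkins-created wiki database. This is an illustration only, not the actual patch; it assumes the $wgWikimediaJenkinsCI flag hashar describes just below, and the setting name simply follows the chat (the real CentralAuth option may be named differently).

    <?php
    // Hypothetical LocalSettings.php-style snippet (illustration only):
    // when running under Wikimedia CI, reuse the Jenkins-created wiki DB
    // instead of a separate CentralAuth database.
    if ( !empty( $wgWikimediaJenkinsCI ) ) {
        // Setting name as used in the chat; the real option may differ.
        $wgCentralAuthDBName = $wgDBname;
    }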
[16:47:03] anomie: CI injects in LocalSettings.php the variable $wgWikimediaJenkinsCI = true;
[16:47:10] which got introduced for Wikibase folks
[16:47:41] so potentially, you can add a very dirty hack in CentralAuth which, when install.php runs, makes it install the centralauth db on the local wiki
[16:47:50] and set the global properly whenever on the Jenkins env
[16:47:52] that is dirty :(
[16:48:54] a Wikibase example https://github.com/wmde/WikidataBuildResources/blob/master/Wikidata.php#L7-L17
[16:50:11] anomie: oh seems we had the same issue for GlobalBlocking
[16:50:20] commit at https://gerrit.wikimedia.org/r/#/c/143253/ has some explanation
[16:50:24] and the code is straightforward: https://gerrit.wikimedia.org/r/#/c/143253/1/GlobalBlocking.php,cm
[16:50:37] if Jenkins && $wgGlobalBlockingDatabase = $wgDBname
[16:56:20] and if someone could look at a GlobalUserRename being stalled, that would be welcome: https://phabricator.wikimedia.org/T119696
[16:56:27] some rename jobs got interrupted
[16:56:31] so it is stalled
[16:56:32] Hmm. https://gerrit.wikimedia.org/r/#/c/260397/ with that test passed, but I'm wary of the fact that the failed test on PS1 ran 903 tests while the pass on PS2 ran only 670...
[16:56:44] no clue :(
[16:57:14] * anomie checks to see if someone went and reverted the problematic test
[16:57:54] Yep. Krinkle did. Sigh.
[16:58:26] you want to compare https://integration.wikimedia.org/ci/job/mwext-testextension-zend/17339/testReport/(root)/
[16:58:30] vs https://integration.wikimedia.org/ci/job/mwext-testextension-zend/17425/testReport/(root)/
[16:58:44] and yes apparently ApiDocumentationTest::testDocumentationExists is gone
[16:59:12] Any idea what "the CI gate extensions" are?
[16:59:53] https://github.com/wikimedia/integration-config/blob/master/jjb/mediawiki.yaml#L377-L406
[16:59:59] a bunch of repos sharing the same set of jobs
[17:00:03] phpunit / qunit
[17:00:15] * anomie uses the very nifty Depends-On to work around the revert
[17:00:20] the job clones all those extensions and is triggered by all of them
[17:00:22] :D
[17:00:25] anyway, gotta move out sorry :(
[17:01:21] kudos, that is a nice use for Depends-On :-}
[17:01:30] ... Are those the ones run by core mediawiki-extensions-hhvm? Because all those already pass or it wouldn't have been mergeable
[17:01:43] yup mediawiki-extensions-*
[17:01:53] it should really become all the extensions deployed on Wikimedia
[17:02:04] and get renamed to something like wikimedia-deployed-stuff-hhvm
[17:02:26] Darn. Now we get a different failure because just adding that didn't make CentralAuth create its tables in the main DB.
[17:02:30] it is a bit half-done now :-(
[17:02:50] maybe you can depends-on your other central auth patch
[17:02:58] not sure what Zuul will compose though
[17:03:23] anyway, gotta rush sorry :(
[17:14:08] legoktm: Ha! With hashar's help, https://gerrit.wikimedia.org/r/#/c/260397/ makes CentralAuth pass.
[18:52:55] what's the proper way to get the name of NS0 in $wgContLang for a given wiki?
[18:53:18] https://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&format=json&siprop=namespaces doesn't provide it, which is a bit strange
[18:59:02] ori: ns 0? you mean main namespace? it's always empty
[18:59:17] as in, there's no prefix
[18:59:30] right, i just want 'Article' in the content lang
[19:00:09] the namespace doesn't have a name itself, but there's some message that has this
[19:00:55] ori: is the 'blanknamespace' message what you're looking for?
(as seen in the "Namespace" dropdown on https://en.wikipedia.org/wiki/Special:AllPages etc.)
[19:03:22] yep
[19:03:39] what's the best way to query a wiki for the contentlang localisation of the blanknamespace message?
[19:04:02] * ori is mostly an MW api noob
[19:04:48] query=allmessages&ammessages=blanknamespace
[19:05:26] I don't know if that'll use userlang or contentlang though
[19:05:38] you can do uselang=content i think
[19:06:02] Yup, seems you can
[19:06:55] thanks guys
[19:25:03] hm
[19:25:12] https://he.wikipedia.org/w/api.php?action=query&meta=allmessages&ammessages=blanknamespace&amlang=content gives '(Main)', which is surely incorrect for hewiki
[19:25:19] so i must be doing something wrong
[19:26:20] Well, to test it, you swap content for he
[19:26:20] https://he.wikipedia.org/w/api.php?action=query&meta=allmessages&ammessages=blanknamespace&amlang=he
[19:26:24] "*": "(\u05de\u05e8\u05d7\u05d1 \u05d4\u05e2\u05e8\u05db\u05d9\u05dd)"
[19:26:29] So yes, something looks askew
[19:26:56] * Reedy slaps ori
[19:27:08] https://he.wikipedia.org/w/api.php?action=query&meta=allmessages&ammessages=blanknamespace&uselang=content
[19:27:19] heh. right.
[19:27:31] https://he.wikipedia.org/w/api.php
[19:27:38] uselang: https://he.wikipedia.org/w/api.php
[19:27:40] DAMN YOU
[19:27:47] uselang: Language to use for message translations. action=query&meta=siteinfo with siprop=languages returns a list of language codes, or specify user to use the current user's language preference, or specify content to use this wiki's content language.
[19:27:47] Default: user
[19:36:35] Reedy: see also https://phabricator.wikimedia.org/T97096
[19:37:00] tgr: heh
[19:37:12] but passing the right thing to the API makes it return the right thing too ;)
[19:37:14] well, some of the time
[20:43:00] legoktm: are you around?
[20:43:15] SMalyshev: kind of, what's up?
[20:43:33] legoktm: I was told you are the person to talk to about composer & mediawiki
[20:43:58] legoktm: basically, I wanted to ask about what's the right way of adding composer dependencies to an extension
[20:44:29] SMalyshev: Moving elastica to composer? Always meant to get around to that :P
[20:44:30] https://www.mediawiki.org/wiki/Composer/For_extensions
[20:44:59] ostriches: I thought elastica is already composer-enabled?
[20:45:04] Extension:Elastica uses "ruflin/elastica": "2.2.0" already
[20:45:19] Eh, but Extension:Elastica is a silly shim extension that shouldn't exist :)
[20:45:25] SMalyshev: https://www.mediawiki.org/wiki/Manual:External_libraries is the doc page
[20:45:28] Ideally Cirrus would just import Elastica directly :)
[20:45:37] Move the whole 1 file to CirrusSearch?
[20:45:50] I think ApiFeatureUsage depends upon Elastica?
[20:45:58] legoktm: aha, thanks! that looks like what I needed
[20:46:16] class ApiFeatureUsageQueryEngineElasticaConnection extends ElasticaConnection {
[20:46:29] class Connection extends \ElasticaConnection { (Flow)
[20:46:52] legoktm: would CI, etc. automatically install composer deps for extensions?
[20:47:10] Not yet, we're working on it.
[20:47:10] * ostriches grumbles about poor architectural decisions he made 2-3 years ago
[20:47:12] ostriches: not yet... Cirrus is not composer'ized yet.
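For reference, the Composer/For_extensions approach legoktm links above boils down to a composer.json at the extension's root that lists the library under "require". The file below is a hedged, made-up example (not the real Cirrus or Elastica manifest), reusing the "ruflin/elastica": "2.2.0" constraint quoted earlier; the package name and license are placeholders.

    {
        "name": "mediawiki/example-extension",
        "description": "Hypothetical extension-level composer.json declaring a library dependency",
        "license": "GPL-2.0+",
        "require": {
            "ruflin/elastica": "2.2.0"
        }
    }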
[20:47:33] SMalyshev: But you can submit a patch to add the library to mediawiki/vendor, and use the Depends-On: header in your commit message to make CI work
[20:47:36] ostriches: though it may be a good project to look at
[20:48:03] The mediawiki/vendor patch shouldn't be merged until the security review is complete though
[20:48:14] legoktm: hmm... the thing is the library is used only by Cirrus, so I'm not sure if I would want it in core
[20:48:46] legoktm: so is there any procedure for non-composer-deployed extensions to pull in composer dependencies?
[20:48:46] mediawiki/vendor is the collection of libraries needed for use by Wikimedia-deployment, not by core.
[20:49:00] legoktm: ahh! ok then, I'll look into that, thanks
[20:49:03] core has its own composer.json file
[20:49:38] legoktm: but there's nothing that checks composer.json for extensions and pulls the deps in?
[20:50:10] We have the composer-merge-plugin, but CI isn't using it yet (there's a bug for it)
[20:51:33] legoktm: I read in the docs that composer-merge-plugin just enables composer.local.json - so something still has to pull data from extensions to composer.local.json?
[20:51:35] legoktm: And in tarballs...
[20:51:41] Not just deployment
[20:52:37] legoktm: about mediawiki/vendor - are those just checkouts of the libs? so if I make a fix to the library it has to be copied there too?
[20:53:07] SMalyshev: you have to add extra.merge-plugin.include to your composer.local.json and include your extension's composer.json there. I meant to add a .sample version to core, but haven't gotten around to it yet
[20:53:17] SMalyshev: yes, it's just running composer and committing the result
[20:53:43] Which is really really bad
[20:53:45] Apparently
[20:53:57] Like, committing your wordpress credentials to github bad
[20:53:59] legoktm: ok, but on which composer.json? core's or extension's?
[20:54:19] * legoktm shushes Reedy
[20:54:45] legoktm: You saw the comment response I got on github composer, right?
[20:55:09] legoktm: i.e., in order to get stuff into mediawiki/vendor, which composer.json should I edit?
[20:55:13] https://github.com/composer/composer/issues/4410#issuecomment-166239587
[20:55:27] core if it needs to be in core
[20:55:34] extension if it needs to be in the extension
[20:55:38] vendor too based on the above
[20:55:43] SMalyshev: so, you'd add this new dependency to your extension's composer.json, and to mediawiki/vendor's composer.json. The whole point of mediawiki/vendor is that we don't want to run composer on the cluster, so people run it on laptops and we commit the result in a git repo
[20:55:58] Reedy: yeah, I need to respond, it's so silly >.>
[20:56:22] We'll send WMDE around
[20:56:28] legoktm: wait, but I don't see composer.json in my mediawiki/vendor. Am I looking at the wrong place?
[20:56:45] Are you using the repo?
[20:56:54] https://github.com/wikimedia/mediawiki-vendor/blob/master/composer.json
[20:56:54] ah, looks like I am
[20:56:57] SMalyshev: did you clone the mediawiki/vendor repo?
[20:57:17] ok, I looked at the mediawiki checkout, where it apparently comes from some other place
[20:57:22] yeah
[20:57:27] you probably ran composer yourself
[20:57:28] ok, now I got it
[20:57:46] Reedy: it's a vagrant install so probably vagrant did some magic on it
[20:58:21] lol. https://github.com/wikimedia/mediawiki/commit/0435407b88ad95f0e4d83793df6e761b619e6a0b#commitcomment-15056791
[21:02:33] Reedy: commented
[21:02:34] ostriches: haha!
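To make legoktm's composer.local.json explanation above concrete, here is a minimal sketch of the file he describes (the .sample he mentions wanting to add to core): the merge-plugin "include" list points at the extension's own composer.json so its dependencies get merged into the core install. The CirrusSearch path is only an example.

    {
        "extra": {
            "merge-plugin": {
                "include": [
                    "extensions/CirrusSearch/composer.json"
                ]
            }
        }
    }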
[21:02:55] legoktm: thanks, I think I got what I need now
[21:03:10] awesome :)
[21:03:13] * legoktm runs away for lunch
[21:13:51] * anomie reads backscroll about extensions and composer, and grumbles about extensions that require other extensions via composer and make composer screw up the symlinks he has in extensions
[21:19:43] 13 xtools issues closed, xtools-articleinfo works again
[21:36:25] ori: Do I dare ask what the fix(es) were?
[21:37:13] simple stuff https://github.com/x-tools/xtools/commits/master
[21:38:03] Their experienced PHP devs can't have been overly experienced...
[21:39:04] The Wikimania Conference is now open for scholarship applications and program submissions.
[21:39:04] جامعة الدراسات الإسلامية تهتم [[Jamia Shariyyah Malibagh, Dhaka|بنشر العلوم الشرعية والتربوية]] واللغوية برؤية منهجية وعلمية متكاملة تواكب العصر الحديث ، وتتجاوب مع ماوصل إليه العلم الحديث فى العصر الراهن، وتواكب التقدم العلمي وتسعى الجامعة إلى استيعاب
[21:39:04] سائر العلوم فى كافة المجالات ونشر العلم بالوسائل والتقنيات الحديثة بنظام التعليم المفتوح (التعليم عن بعد ) وتستهدف الجامعة كل من لديه طموح ورغبة ويتطلع إلى مستقبل وتعليم أفضل، فى اى مكان وفى أى بقعة من بقاع الأرض ومنهج الجامعة العلمى والعقدى
[21:39:04] هو منهج اهل السنة والجماعة واعتقاد السلف الصالح وتنهج الجامعة التدرج فى التعليم تحت إشراف مجموعة من العلماء والأساتذة والدعاة المتخصصين ويمثلها [[Egypt|مكتب جمهورية مصر العربية]] ويختص بشئون الدارسين.
[21:39:14] Why does this need to flow out of the box?
[21:40:59] https://phabricator.wikimedia.org/M133
[22:30:38] legoktm: yt?
[22:33:34] ori: what's up?
[22:33:41] re: https://gerrit.wikimedia.org/r/#/c/260425/
[22:34:32] I love the idea -- but I'm a bit confused. Isn't the schema for extension.json formally specified? And if so, how is it OK to add new attributes willy-nilly? Shouldn't they be subkeys of some specially-designated key for extension-specific metadata?
[22:36:05] legoktm: ^
[22:43:03] ori: the schema is defined, but we have additionalProperties: true, so people can add random properties (like @doc stuff). attributes are top-level because I envisioned that we could transition things from global state into attributes without requiring updates to extension.json (e.g. https://gerrit.wikimedia.org/r/#/c/180554/). At the time I stuck it in top-level because both core and extensions were going to use attributes, and I didn't want
[22:43:03] the extension usage to be different. Right now we don't have any schema validation for extension attribute properties, I have a few ideas on how to (sub-schemas added via a hook) but haven't gotten around to it yet
[22:53:05] I think additionalProperties: true is bad; schemas are only useful inasmuch as they describe the data completely. Eventually you'll want to standardize some property and find that it already exists in the wild because some extension author had a different idea about what it should contain
[22:53:24] but that's a minor point, and you can always cut a fresh schema version
[22:53:49] on the whole this is great work, well done (not just the EL stuff, the whole extension.json stuff in toto)
[23:08:52] ori: yeah you're right, this was before we introduced the versioning system. I have plans to change "config" and bump the version for that, so we should re-evaluate top-level attributes too.
[23:08:54] and thanks :)
[23:32:55] csteipp: could I get a +2 on https://gerrit.wikimedia.org/r/#/c/260423/ ? or hoo?
[23:33:48] thanks :)
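For context on the attributes discussion above, a minimal illustration of what a top-level "attribute" in extension.json looks like: because additionalProperties: true is allowed, a key outside the specified fields (the made-up ExampleParentPluginModules below) simply passes through and can be read back by other code, e.g. via the extension registry. All names here are illustrative, not from any real extension.

    {
        "name": "ExampleExtension",
        "manifest_version": 1,
        "ExampleParentPluginModules": [
            "ext.example.integration"
        ]
    }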