[09:44:12] Reedy: are you around? got a sec please?
[10:19:22] hi,
[10:19:59] I am getting this type of warning: Warning: OutputPage::transformFilePath: Failed to hash C:\xampp\htdocs\oim/skins/Cavendish/resources/colors/#F8F8F8.css [Called from OutputPage::transformFilePath in C:\xampp\htdocs\oim\includes\OutputPage.php at line 4037] in C:\xampp\htdocs\oim\includes\debug\MWDebug.php on line 309. Can anybody help me out please?
[10:49:21] Hi, and BIG thanks for the most awesome wiki engine in these parts of the Laniakea Supercluster
[11:09:24] hello
[11:14:51] I'm trying to get Extension:Translate installed. It needs a lot of other extensions to work, so I have tried to install this: https://www.mediawiki.org/wiki/MediaWiki_Language_Extension_Bundle
[11:15:33] But when I try to enable UniversalLanguageSelector I get a PHP error stating that "/var/www/ban-covert-modeling.org/wiki/w/extensions/UniversalLanguageSelector/extension.json does not exist!"
[11:16:20] I have been searching for a way to generate or manually write the extension.json, without any luck. I am guessing it could be generated by some other extension that I am still missing
[11:19:03] I got the ULS extension via Extension Distributor, but trying to enable it just leads to it complaining about the missing extension.json, even though the installation instructions at https://www.mediawiki.org/wiki/Extension:UniversalLanguageSelector make no mention of it
[11:19:25] Any help would be greatly appreciated. Got unmetered internet, will hang.
[11:28:42] Trying to run maintenance/update.php also fails with a complaint about extension.json not existing
[11:29:47] Should I maybe try getting ULS with git clone instead of the .tar.gz from Extension Distributor? I have really, really tried searching mediawiki.org and the Internet for where to get the extension.json from or how to generate or write it, with no luck at all
[11:33:21] Whoops. My bad: I did not have the extension installed at all.
Now downloaded and expanded the .tar.gz and will try running update.php
[11:39:57] Yeah.. my bad. After installing UniversalLanguageSelector, update.php runs fine
[11:52:37] https://www.mediawiki.org/wiki/Help:Extension:Translate/Installation says to install a bunch of optional enhancements, such as PHPlot, Syck or phpyaml, and ElasticSearch
[12:18:03] hi
[12:22:15] Hello, I am looking for @twentyafterfour. Actually, I am not able to sign up for a developer account as the page is showing "permission denied". Just want to know how I can get one.
[12:22:16] So I'm guessing Composer is the best way to install and enable the additional PHP packages for Extension:Translate
[12:26:05] jubo2, yea I'd use composer.
[12:27:36] abijeet: In which directory should I run composer to get PHPlot and Syck or phpyaml?
[12:27:47] .. or does it matter?
[12:27:58] jubo2, to make translation memory use ElasticSearch as the backend, I'd follow this guide - https://www.mediawiki.org/wiki/Help:Extension:Translate/Translation_memories
[12:28:37] jubo2, I would run it in the extensions/Translate folder.
[12:29:05] abijeet: Am I going to mess things up, given that I have installed extensions from tarballs.. should I delete them and install with composer again?
[12:29:44] I assume upgrading MediaWiki would be easier if Composer had been used to install the extensions
[12:31:37] jubo2, so you downloaded and installed the MLEB extensions via the tarball as described here - https://www.mediawiki.org/wiki/MediaWiki_Language_Extension_Bundle ?
[12:31:55] abijeet: Yes. Each extension individually
[12:33:15] That should be fine, I don't think you can install those extensions via composer. The dependencies of those extensions can be installed via composer.
[12:33:34] So I would go into the extensions/Translate folder, and then run composer install --no-dev
[12:34:18] abijeet: It says "Nothing to install or add"
[12:34:18] The --no-dev flag will avoid downloading the development dependencies; you can safely skip those unless you plan to hack on the extensions.
[12:34:40] I should install PHPlot and Syck as recommended here: https://www.mediawiki.org/wiki/Help:Extension:Translate/Installation
[12:36:42] jubo2, yup, those should be installed when you run composer install --no-dev. I can see them listed in the composer.json
[12:37:00] Do you see a vendor folder under extensions/Translate ?
[12:37:02] abijeet: composer suggests the packages to install should be in composer.json. Perhaps that file is included _only_ in the extensions installed with composer and not via tarball
[12:37:21] abijeet: Yes, there is vendor/
[12:38:21] Under that folder you should see mustangostang/spyc and davefx/phplot
[12:38:39] So I'm guessing that if I had installed the extensions with composer, it would have taken care of installing the dependencies and recommended packages as well
[12:39:17] abijeet: I don't see those directories
[12:39:42] AFAIK, I don't think there is a way to install the extensions using composer. You grab the tarball or get the extensions using git.
[12:40:55] abijeet: This would probably be the good way to go about it: https://www.mediawiki.org/wiki/Composer/For_extensions
[12:42:33] jubo2, oh, cool, I was unaware that the extensions were available on Packagist.
[12:42:43] np
[12:42:50] Glad to be of help
[12:43:36] I am wondering if installing extensions with composer will also install the dependencies of each extension, and preferably the recommended packages too
[12:44:27] jubo2, although I guess this method is preferable, downloading the tarball and then running composer install under the respective extension's folder should work fine as well.
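abijeet's advice above can be sketched as a small loop: run `composer install --no-dev` in every extension directory that ships a composer.json. The wiki root path is an assumption, and with DRY_RUN=1 (the default) the sketch only prints the commands instead of running them.

```shell
# Sketch, not a supported installer: visit each extension that ships a
# composer.json and install its (non-dev) dependencies there.
# MW_ROOT is a placeholder path; adjust it to your wiki root.
MW_ROOT="${MW_ROOT:-/var/www/wiki/w}"
DRY_RUN="${DRY_RUN:-1}"

install_ext_deps() {
  for manifest in "$1"/extensions/*/composer.json; do
    [ -e "$manifest" ] || continue            # glob matched nothing
    dir=$(dirname "$manifest")
    if [ "$DRY_RUN" = 1 ]; then
      echo "cd $dir && composer install --no-dev"
    else
      (cd "$dir" && composer install --no-dev)
    fi
  done
}

install_ext_deps "$MW_ROOT"
```

Set DRY_RUN=0 to actually run composer once the printed commands look right.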
[12:44:54] since each extension has its own composer.json, with the extension dependencies listed.
[12:45:18] abijeet: but running 'composer install --no-dev' in 'extensions/Translate' says pretty much "Nothing to do"
[12:46:40] oh, sorry.. I said I don't see the dirs 'davefx' and 'mustangostang' under 'extensions/Translate/vendor', but in fact they are there
[12:47:02] So should I maybe adjust the dir in which I run 'composer install --no-dev'?
[12:47:02] jubo2, cool, then they are installed.
[12:47:48] Or should I delete the manually installed extensions, make a composer.local.json, and get composer to install them?
[12:48:32] I think composer is done installing the dependencies for the Translate extension and you shouldn't have to do anything.
[12:48:38] I assume with the latter, when I need to upgrade MediaWiki, I wouldn't need to manually download and extract the tarballs for each extension that does not come bundled with MediaWiki
[12:50:03] So if you go with the composer approach of installing the extensions, running composer update would update the extensions as well.
[12:50:33] I'm gonna do that. It just seems a much better way to keep everything tidy and avoid manual tinkering
[12:51:24] from https://www.mediawiki.org/wiki/Composer/For_extensions : "This plugin and the MediaWiki core composer.json configuration allow a local deployment to add required extensions and libraries to a file composer.local.json inside the root MediaWiki directory"
[12:51:49] Now I'm wondering if it wants the webroot dir or the 'w/' inside it
[12:52:44] "root MediaWiki directory" would indicate 'w/'
[12:52:46] I'll try
[13:03:06] Moved the dirs from tarballs away and wrote composer.local.json.
Now when running 'composer install --no-dev' it complains that it cannot find package 'mediawiki/cleanchanges'
[13:04:19] The line in composer.local.json is:
[13:04:32] "mediawiki/CleanChanges": "*",
[13:05:02] Should I assume something has not been done to enable installation of Extension:CleanChanges with composer?
[13:05:46] I would check if that is available on Packagist.
[13:07:03] abijeet: how do I do that?
[13:07:24] https://packagist.org/?query=mediawiki%2Fclean
[13:07:55] Quoting from here, https://www.mediawiki.org/wiki/Composer/For_extensions
[13:07:57] > As of MediaWiki 1.22, you can use Composer[1] to install MediaWiki extensions which include a composer.json manifest and are published on the Packagist package repository.
[13:08:20] The Translate extension seems to be there, https://packagist.org/?query=mediawiki%2Ftranslate
[13:08:49] I could also find Babel, and UniversalLanguageSelector.
[13:09:44] It complained about UniversalLanguageSelector not being found
[13:10:26] now I'm getting errors from composer that in the CLI config for PHP there are no zip and unzip commands found
[13:12:24] Might have to install the php-zip package, along with the zip / unzip tools
[13:13:09] yeah.. just did that
[13:13:14] now to run composer again
[13:13:42] now it is installing stuff
[13:19:44] afk
[13:20:05] jubo2, best of luck, will be around in an hour or so.
[13:24:28] Thanks abijeet.. Getting the Translate extension isn't a picnic by a lake, but more like hill climbing in challenging geography
[14:05:24] https://www.mediawiki.org/wiki/Extension:ContentTranslation suggests further extensions. I'm not sure if Wikibase is needed when running a single multilingual wiki
[14:08:18] I'm adding extensions to 'w/composer.local.json', but running 'composer install --no-dev' does not install the extensions I've added. What could be wrong?
[14:11:11] I think it is running the bundled 'composer.json' instead of 'composer.local.json'. How do I fix this?
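For reference, the kind of composer.local.json the Composer/For_extensions page describes might look like the sketch below. The package names are illustrative only: only extensions actually published on Packagist can be resolved this way, and Packagist package names are lowercase (which is one plausible reason "mediawiki/CleanChanges" failed above).

```shell
# Write a minimal, hypothetical composer.local.json into the MediaWiki
# root ('w/'). MediaWiki core's composer-merge-plugin merges this file
# into the main composer.json.
cat > composer.local.json <<'EOF'
{
    "require": {
        "mediawiki/translate": "*",
        "mediawiki/babel": "*"
    }
}
EOF
```

Whether a given extension is on Packagist has to be checked case by case, as abijeet does above.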
[14:14:42] Running 'composer require wikimedia/composer-merge-plugin' complains about each entry in 'w/composer.local.json'
[14:15:16] Should I maybe have the 'composer.local.json' in the /w/extensions directory instead of the wiki root?
[14:20:41] In the composer.local.json I have lines like
[14:23:32] This is the output of 'composer require wikimedia/composer-merge-plugin': http://dpaste.com/3G6D891
[14:24:21] Most/a lot of extensions are not installable via composer
[14:24:25] Generally, we don't consider it a supported method
[14:25:20] My /w/composer.local.json is now like this: http://dpaste.com/07E9K0X
[14:25:51] As above, most extensions cannot be installed via composer
[14:26:16] Reedy: So you suggest checking which can and which cannot be installed via Composer. What is an easy way to do that?
[14:26:28] I don't suggest installing any MW extension via composer
[14:26:36] ok..
[14:27:14] Reedy: Do you suggest git cloning the extensions or getting them from the Extension Distributor? What are the upsides and downsides of each approach?
[14:27:43] git makes it easier to switch between branches and get updates if any come into the branch
[14:27:56] Extension Distributor makes things easier as you don't have to use git; just unzip the zip file and put it in place
[14:29:20] but instead of needing to download and unzip extensions by hand upon every MediaWiki update, it would suffice to just 'git pull' the extensions.
[14:29:40] Well, it depends
[14:29:48] You can clone the meta repo to give you all extensions
[14:29:56] But that takes quite a lot of time, and quite a lot of disk space
[14:30:07] But you can just git clone each as you need/want them
[14:30:20] How are the branches in MediaWiki extensions.. I assume git cloning an extension will give the latest stable?
[14:30:34] Depends what command you use
[14:30:39] By default, git will give you master
[14:31:01] Reedy: I want the stablest stuff..
not the one with the most detuned bells and whistles
[14:31:17] Generally we don't advise running master of an extension with a branched version of MW
[14:31:50] aha.. so I can tell git that I want the extension for 1.32.0? How?
[14:33:43] git clone --branch REL1_32 url
[14:34:07] Awesome, Reedy. I'll install the extensions with that
[14:35:07] Reedy: Should I remove the manually downloaded and extracted extensions first?
[14:35:59] If you want to keep things consistent, it's not a bad idea
[14:41:09] I should probably use a script to get all I need installed in one go
[14:42:26] Reedy: what ballpark of disk space would 'git clone https://gerrit.wikimedia.org/r/mediawiki/extensions --branch REL1_32' need?
[14:46:16] 5GB on mine
[14:47:00] Too much for me. I cloned the extensions individually
[14:47:14] and now ran update.php with no problems
[14:55:42] I ran 'composer require wikimedia/composer-merge-plugin'. It says that packages suggest additional packages. How do I make composer install those? Also got a warning in orange that "Package phpunit/phpunit-mock-objects is abandoned, you should avoid using it. No replacement was suggested."
[15:03:11] You don't need the dev packages (which is what phpunit-mock-objects is)
[15:04:13] Ok Reedy
[15:04:51] Reedy: How do I make composer install the packages suggested by the installed packages?
[15:18:21] 'composer suggests | xargs -i composer require {}' seems to do the trick
[15:20:27] seems running the same command for the 2nd time installs more stuff... probably the things suggested by the packages installed in the first run
[15:21:20] unless you need functionality from a suggested package, that seems very pointless
[15:21:42] the code itself won't use them, and by having extra packages installed you could be exposing yourself to additional security risks
[15:22:13] Skizzerz: Oh. I did not know that
[15:22:21] Skizzerz: Can I roll back somehow?
[15:23:01] no idea, likely not very easily.
You'd possibly have to nuke the vendor folder and your composer.lock and start over.
[15:23:51] Now I'm getting "ambiguous class resolutions"
[15:24:08] looks like it is installing some packages for compatibility with older PHP
[15:28:16] Okkkk... I moved the composer.lock and vendor/ to a safe place and am running 'composer install --no-dev'
[15:28:23] I hope this is the right command
[15:46:11] Hmmm... I'm not making notes meticulously enough. I installed some extensions because the extension page of some other extension said they would be beneficial, and now I can't find which extension suggested them
[15:47:17] I notice I have both 'cldr' and 'Cldr' in extensions/ ... this is not good
[15:49:29] Found the extension that suggested the packages, https://www.mediawiki.org/wiki/Extension:ContentTranslation, thanks to the very good search functionality of MediaWiki
[16:03:40] Can https://www.mediawiki.org/wiki/Extension:ContentTranslation and https://www.mediawiki.org/wiki/Extension:Translate be used in conjunction, or do they clash?
[16:06:36] They should work mostly fine together, I think
[16:10:22] What does the "Allow translation of page title" checkbox do if I leave it checked?
[16:10:53] Isn't the convention that the translations should appear at /pagename/fi (for Finnish)?
[16:18:20] I'm trying to translate https://wiki.ban-covert-modeling.org/wiki/Translation_test_article to Finnish, and when I hit save it doesn't actually save what I just translated
[16:19:11] Am I missing some configuration?
[16:19:20] I didn't touch the Translate.php
[16:20:48] There is a ton of stuff here: https://www.mediawiki.org/wiki/Help:Extension:Translate/Configuration
[16:21:14] hi all, once you install MediaWiki and import Wikipedia dumps using importDump.php or the Java program, how do you crawl the local copy of Wikipedia?
[16:22:09] is there a local URL, or directly from the database?
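On the question of a "local URL": a local MediaWiki can serve raw page text over HTTP, which may be the simplest route for harvesting text. The base URL below is an assumption, and with DRY_RUN=1 (the default) the sketch only prints the curl command rather than hitting the server.

```shell
# Sketch: fetch the raw wikitext of a page from a *local* MediaWiki over
# its web interface. index.php?action=raw returns the page source; the
# api.php action=parse module is another option for rendered text.
BASE="${BASE:-http://localhost/w}"     # assumed base URL of the local wiki
DRY_RUN="${DRY_RUN:-1}"

fetch_page() {
  url="$BASE/index.php?title=$1&action=raw"
  if [ "$DRY_RUN" = 1 ]; then
    echo "curl -s '$url'"              # dry run: print the command only
  else
    curl -s "$url"
  fi
}

fetch_page Main_Page
```

For bulk NLP work, parsing the XML dump directly is usually more efficient than crawling page by page, as noted below.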
[16:22:34] what about viewing pages
[16:22:48] alioui_: I think you should (maybe) state more precisely what you mean by "crawl", i.e. what do you want to achieve
[16:27:13] jubo2: I want to crawl Wikipedia for natural language processing
[16:27:32] so I want to get text data from Wikipedia pages
[16:28:34] since Wikipedia doesn't want bots to crawl the online version of Wikipedia
[16:29:08] so you need a crawler to crawl through the web UI, or directly the database, or maybe via the API
[16:29:41] alioui_: did you get the .sql dump or the .xml dump?
[16:30:34] I suppose it could be more efficient to crawl the .xml instead of a live database... but I need to state I don't know much about what you are trying to do, alioui_
[16:31:41] jubo2: thanks for the help then
[16:42:23] oh.. now the translation actually gets saved
[16:42:59] alioui_: are you looking to train a natural language neural network with Wikipedia, or what is your objective?
[16:48:27] Going to the article https://wiki.ban-covert-modeling.org/wiki/Translation_test_article it knows that "This page has changes since it was last marked for translation", but when I click on "Translate this page" it shows the old version of the English original, not the current one. Any ideas, Reedy?
Maybe there are some jobs that should be run but are not getting run
[16:50:44] https://www.mediawiki.org/wiki/Help:Extension:Translate/Installation says that https://www.mediawiki.org/wiki/Manual:Job_queue#Set_up must be set up, but this being an ultra-low-traffic wiki I'd like the jobs to be run "at the end of a web request"
[16:55:14] The manual says that by default the jobs created will be run after the client is served the page whose request created them, but this does not seem to be the case
[17:02:20] Running the jobs manually with 'php maintenance/runJobs.php' runs some jobs, but does not change the fact that the translation view shows the old version of the article to be translated
[17:02:31] so something somewhere is slightly wrong
[17:24:33] This is overly complicated for me: https://www.mediawiki.org/wiki/Manual:Job_queue#Job_execution_on_page_requests
[17:28:37] Extension:Translate is broken for me in such a way that the source text shown at https://wiki.ban-covert-modeling.org/w/index.php?title=Special:Translate&group=page-Translation+test+article&language=fi&action=page&filter= is not the latest version, even though when viewing the https://wiki.ban-covert-modeling.org/wiki/Translation_test_article page the Translate extension obviously sees that it has been changed since the last translation
[17:29:23] and it is not about jobs not having been run. I ran them manually with 'php maintenance/runJobs.php'
[17:33:10] ok.. now I get it.. I need to click on "marked for translation", and there is a view where I can set that this new version should be translated
[17:58:02] Now I'm unable to run 'php maintenance/update.php' and 'composer update --no-dev' because "ruflin/elastica: 6.1.1 installed, ^6.1 required."
[17:59:54] running 'composer update --no-dev' states Problem 1: "The requested package ruflin/elastica 5.3.0 is satisfiable by ruflin/elastica[5.3.0] but these conflict with your requirements or minimum-stability."
[18:02:11] So somewhere there is a constraint that the required ruflin/elastica should be 6.1, whereas I have 6.1.1
[18:03:41] editing composer.json to state 6.1.1 did not help
[18:05:38] Maybe I'm gonna try to install a newer ElasticSearch than the one found in Debian stable?
[18:16:21] jubo2: MediaWiki's Elasticsearch integration only supports version 5 at the moment
[18:36:17] Skizzerz: So I should install the 5.x branch?
[18:46:40] Skizzerz: I installed Elasticsearch 5.3.6, but 'composer install --no-dev' complains "The requested package ruflin/elastica 5.3.0 exists as ruflin/elastica[6.1.1] but these are rejected by your constraint."
[18:47:16] I'm not really fluent in using composer, so I don't know how to fix this. Will I break something if I change the composer.json?
[18:57:55] I changed the composer.json to ask for 5.3, and then 'composer update --no-dev' and 'composer install --no-dev' run without problems, but it seems something broke Extension:Translate
[18:58:28] When I try to translate the latest version tagged for translation I get the error "Saving the translation failed: This namespace is reserved for content page translations. The page you are trying to edit does not seem to correspond any page marked for translation."
[19:02:14] Hmm... now it is working
[19:02:20] and so is Elasticsearch
[19:18:42] trying to run 'php maintenance/update.php' lists a bunch of things where the installed version is newer than what is required, and suggests "Error: your composer.lock file is not up to date. Run "composer update --no-dev" to install newer dependencies".. when I run that it doesn't fix the situation
[19:20:32] composer.json just lists the package versions you aim for
[19:20:48] For example, one line that update.php complains about goes as follows:
[19:20:51] ruflin/elastica: 5.3.0 installed, ^5.3 required.
[19:21:07] oh dear
[19:21:43] is the composer.json a file you made, or is it from some dependency?
[19:21:43] So I'm guessing the hat '^' is a regex saying it wants a version that starts with '5.3'. Maybe some regexing tools are not installed?
[19:22:28] harmaahylje: I'm not sure.. I guess it existed and then subsequent runs of composer added stuff to it
[19:24:25] harmaahylje: here is what I have done (in reverse chronological order (mostly..)): https://wiki.ban-covert-modeling.org/wiki/User:Jukeboksi/Blog/2019#Monday_2019-04-01
[19:26:09] This isn't a regex
[19:26:33] https://getcomposer.org/doc/articles/versions.md#caret-version-range- for what the hat means
[19:27:09] but ^5.3 should match 5.3.0
[19:27:41] If it's update.php complaining though, maybe update.php is broken
[19:28:56] I haven't touched update.php
[19:29:51] Looking at composer.json, it is the MediaWiki original, but stuff has been added to it as I've installed new extensions and run composer
[19:32:17] Running 'composer update --no-dev' I get the following warnings (I think they are just warnings):
[19:32:34] Package egeloen/http-adapter is abandoned, you should avoid using it. Use php-http/httplug instead.
[19:32:36] Package zendframework/zendframework1 is abandoned, you should avoid using it. Use zendframework/zendframework instead.
[19:33:54] bawolff: here is the output of update.php: http://dpaste.com/1MQST9Z
[19:34:23] so running the 'composer update --no-dev' that update.php suggests does nothing to fix the situation
[19:34:59] moving the composer.lock away also does not help
[19:36:14] I think maybe aws/aws-sdk-php is the problem
[19:36:33] I don't need AWS stuff
[19:36:40] bawolff: How do I remove it?
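As an aside on the '^' discussion above, Composer's caret operator is a semantic-versioning range, not a regex. A tiny sketch of the composer.json form (the package and versions are just the ones from this conversation):

```shell
# Composer's '^' operator defines a semver range, not a pattern match:
#   ^5.3  means  >=5.3.0 and <6.0.0
# so 5.3.0 and 5.99.0 satisfy it, while 6.1.1 does not (new major version).
constraint='^5.3'
printf '"ruflin/elastica": "%s"\n' "$constraint"   # the composer.json line
```

This matches jubo2's later reading ("anything from 5.3 to 5.99") rather than bawolff's first guess.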
[19:37:35] As an aside, update.php can be overridden via --skip-external-dependencies, but I'm not sure I recommend that
[19:38:19] But it looks like all the things update.php complains about not being sufficient versions are that version or newer
[19:38:51] What does
[19:38:56] composer why aws/aws-sdk-php
[19:38:59] result in?
[19:39:19] jubo2: the version constraint isn't just that it's new enough; being too new also causes a failure
[19:42:07] bawolff: from what I read from your link, '^' will accept anything above as long as it isn't a new major version, i.e. ^5.3 will take anything from 5.3 to 5.99 (if I understood correctly)
[19:42:31] ^5.3 will take anything from 5.3 to 5.3.999999
[19:42:47] wait
[19:42:50] maybe you're right
[19:42:55] bawolff: can I somehow rebuild the composer.json to make sense? composer seems OK, but update.php is unhappy
[19:43:09] I definitely think there is a bug somewhere here
[19:43:21] If nothing else, update.php should be more clear on what it considers to be wrong
[19:44:05] bawolff: Do I break something permanently if I run 'php maintenance/update.php --skip-external-dependencies'?
[19:44:37] no
[19:44:45] I'll try that now
[19:44:56] It's mostly just to warn users that something is not right and their wiki might not work later
[19:45:05] but update.php mostly does not depend on any of the dependencies
[19:45:29] now it ran in 0.4 seconds
[19:47:57] Huh, I just looked at maintenance/checkComposerLockUpToDate.php; it seems like it just compares things for equality. Which is clearly wrong
[19:48:09] So I think this is all a giant MediaWiki bug, and nothing wrong with your install
[19:51:13] Oh.. ok..
Glad to have helped expose a bug, I guess
[19:52:21] Hmm, I think running composer require in the MediaWiki directory is not quite the fully recommended way
[19:52:28] But I don't know what the actual recommended way is
[19:52:36] * bawolff finds composer confusing
[19:52:45] Extension:Translate is now again giving me trouble. I edited a page, marked it for translation, clicked on "translate this page", added the translation for the new section, but I get the error "Saving the translation failed: This namespace is reserved for content page translations. The page you are trying to edit does not seem to correspond any page marked for translation."
[19:53:25] So something is wrong with how Extension:Translate is configured to use the namespaces?
[19:54:23] I filed https://phabricator.wikimedia.org/T219832
[19:54:51] ctrl-a, ctrl-c, ctrl-r, ctrl-v and hit save, and it works just fine. So this is a bug that only manifests sometimes
[20:06:19] There seems to be a bug in Extension:Translate... when a new segment has been added and the article flagged for translation, then going to translate the newest fragment (inconsistently) gives the error "Saving the translation failed: This namespace is reserved for content page translations. The page you are trying to edit does not seem to correspond any page marked for translation."
[20:10:14] So I'm guessing there is something wrong with the configuration of Extension:Translate. I didn't touch its config at all. Should probably look if there is something that needs manual changing
[20:13:46] I manually ran 'php maintenance/runJobs.php' and now the translation works again
[20:14:21] .. So I am guessing I need to set the job queues to be run in a timely manner. The page explaining the various techniques to do this was confusing
[20:15:23] I'd go for this: https://www.mediawiki.org/wiki/Manual:Job_queue#Job_execution_on_page_requests, but what if there are more jobs being created than pages are being requested? Is that possible?
[20:18:47] Nope..
setting $wgJobRunRate = 1; doesn't seem to have an effect on the Extension:Translate bug
[20:20:30] Adding the translation and hitting save leads to the translation being shown on the right, but in bold with a yellow background color, and clicking on it gives that same old reason: the system thinks that "The page you are trying to edit does not seem to correspond any page marked for translation."
[20:33:17] I can't use Extension:Translate if it errors out almost every time I try to translate a newly added section of the source document
[20:37:15] So if anyone wants to try to help me with getting Extension:Translate to work, here is the config I have set for it atm: http://dpaste.com/2CNF8PA and the problem seems to be that whenever a new section should be translated, the extension does something wrong, because it says "Saving the translation failed: This namespace is reserved for content page translations. The page you are trying to edit does not seem to correspond any page marked for translation."
[20:38:32] It is beyond my means to figure out why it thinks that the translation of the newly added section in the source document does not belong in the Translate namespace
[20:39:08] s/Translate/Translations/
[20:41:33] This is really annoying because I'm in a hurry to get wiki.ban-covert-modeling.org/wiki/ up and translated, as we have parliamentary elections on 2019-04-14
[20:53:19] Sometimes people at #mediawiki-i18n know more about Translate than this channel
[20:53:54] thanks bawolff
[20:54:10] It seems the problem has to do with not enough jobs being run
[20:54:29] ...
as the bug doesn't manifest when all the jobs have been run manually
[20:55:14] I tried increasing it to $wgJobRunRate = 20;
[20:55:35] I'm assuming this means that at max 20 jobs will be run after a request has been server
[20:55:39] *served
[20:56:13] I'd really like to have an option "Run all jobs whenever they are created", but cannot find this in the documentation
[21:04:59] try doing
[21:05:18] php runJobs.php --wait
[21:05:35] but the 20 one should also essentially work
[21:06:51] Yeah.. it seems the bug has to do with the job queue not being run in time. I did one cycle of 'add new text' -> 'mark article for translation' -> run 'php maintenance/runJobs.php' (jobs get processed) -> translate -> save => no error on save
[21:06:54] I'd suggest that instead of cranking up $wgJobRunRate, you add runJobs.php as a cron job
[21:07:15] Skizzerz: Does a once-per-second cron job sound ok?
[21:07:27] once per minute
[21:07:36] I don't think cron gets more granular than that
[21:07:41] Skizzerz: Then the translation will remain broken
[21:08:35] Hitting "mark this version for translation" creates a bunch of jobs, and if they are not run before trying to translate and save the translation, it fails
[21:09:04] best I can say is ask in #mediawiki-i18n then
[21:09:24] I don't understand why there is no option to "run jobs when they enter the queue"
[21:10:36] The --wait option for runJobs I think should accomplish that
[21:11:16] And in theory jobs from that request should be processed in the same request if $wgJobRunRate is big enough, I think, but I don't think people test that
[21:14:22] This does not offer any solution as far as I can see: https://www.mediawiki.org/wiki/Manual:Job_queue
[21:14:51] bawolff: '--wait 0'? Where do I put that?
[21:15:11] wait doesn't take an argument
[21:16:07] bawolff: ok.. why is it called 'wait', what does it do, and where do I put it to fix the problem of unprocessed jobs breaking Extension:Translate?
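The two approaches raised above (Skizzerz's once-per-minute cron job versus bawolff's long-running `runJobs.php --wait`) could be sketched like this; the paths, user name, and cron file location are assumptions to adjust for your system:

```
# Hypothetical /etc/cron.d/mediawiki-jobs entry (the cron approach):
# run up to 100 queued jobs once per minute as the webserver user.
* * * * * www-data /usr/bin/php /var/www/wiki/w/maintenance/runJobs.php --maxjobs 100

# Alternative (the --wait approach): one worker that never exits and
# picks up jobs as they are queued.
#   php /var/www/wiki/w/maintenance/runJobs.php --wait
```

The cron variant leaves up to a minute of latency, which is exactly the window in which the Translate save error above can still occur; the --wait worker avoids that but has to be kept running.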
[21:16:39] It just means runJobs will not exit when it's done, but wait for new jobs
[21:16:41] e.g. do
[21:16:45] php runJobs.php --wait
[21:16:56] ok.
[21:17:09] if that fixes the problem, how do I make it permanent?
[21:17:27] it's a separate command that just has to stay running
[21:17:58] Ok.. now running 'php maintenance/runJobs.php --wait &' and going to see if the Translate extension now starts to work as expected
[21:20:42] Yeah. Now Translate works, and when I hit the save button I see jobs flying by in the shell where I'm running 'php maintenance/runJobs.php --wait &'
[21:21:25] bawolff: Should I put this into a tmux session and detach from it? Should the user running it be www-data?
[21:21:40] yes
[21:22:59] There is no account called www-data
[21:24:12] It may be a different user
[21:24:18] e.g. nginx or apache
[21:24:41] Mostly it should be the same as whatever user your webserver runs as
[21:25:15] nginx doesn't run PHP. That's php-fpm
[21:25:30] sometimes people name the user nginx
[21:26:09] ok, that makes sense
[21:29:27] On my system apache2 has rights to edit parts owned by www-data
[21:30:05] I did not find a way to log in as www-data, but basically everything in the MediaWiki installation is owned by me, except of course images/
[21:31:09] Now with 'php maintenance/runJobs.php --wait &' running in a tmux, nothing much happens until I hit "save" in Special:Translate, and then a bunch of jobs fly by in the tmux
[21:32:10] and this is actually fixing Special:Translate... it does seem idiotic (could be due to me configuring something wrong too..) that tasks that must be done in order for the saving of the translation to work are deferred to the queue
[21:34:49] jubo2: you shouldn't *log in* as www-data, but execute that command as that user. Use sudo for that: sudo -u www-data php maintenance/runJobs.php --wait
[21:35:28] Vulpix: Ok.
Thanks for the tip
[21:37:59] Here btw http://dpaste.com/064ZGAA are the jobs that hitting "save" in Special:Translate triggers. The thing works _only_ if these are processed in a speedy way; otherwise it causes "Saving the translation failed: This namespace is reserved for content page translations. The page you are trying to edit does not seem to correspond any page marked for translation." ... in case someone wants to fix this... mmm.. stupid bug?
[21:48:31] Maybe the problem is with jobs run on the webserver ($wgJobRunRate > 0) somehow failing and being lost? That would explain why "running them quickly makes it work", because otherwise they'll be picked up by the webserver itself and fail
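One hedged way to make the tmux + `sudo -u www-data ... --wait` setup from this conversation permanent is a process supervisor unit. The sketch below assumes a systemd distribution; the unit name, PHP path, and wiki path are invented and would need adjusting:

```
# Hypothetical /etc/systemd/system/mw-jobrunner.service
[Unit]
Description=MediaWiki job queue runner
After=network.target

[Service]
User=www-data
ExecStart=/usr/bin/php /var/www/wiki/w/maintenance/runJobs.php --wait
Restart=always

[Install]
WantedBy=multi-user.target
```

After `systemctl enable --now mw-jobrunner`, the worker survives reboots and gets restarted if it crashes, unlike a backgrounded process in a tmux session.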