[00:04:57] Hi! I've created a web application for rich navigation through graph data structures, something similar in mechanism to the Special:CategoryTree page on Wikipedia.
[00:05:14] I want to integrate Wikipedia's category-article relationship graph into my app using an HTTP-based web API.
[00:06:03] What is the most streamlined API that Wikipedia offers that allows this?
[00:48:57] OK, I think I found what I'm looking for: the `categorymembers` API seems to work pretty well in my few requests, and has decent documentation. Gonna go with that for now :-)
[04:45:20] Hell yeah
[07:37:14] hi guys
[07:37:47] I'm planning to create a wiki farm with Puppet, and I found the following site with interesting information: http://www.shawndouglas.com/wikifarm.html
[07:38:10] but this seems outdated, as the version of MediaWiki I use (1.32.1) contains different directories/files …
[07:38:27] is there exact information about what is core and what could be instance-based?
[07:39:17] the point is to install in /opt/mediawiki and create instances in /var/www/html, with symlinks to /opt/mediawiki (for core stuff) and instance-specific directories/files (LocalSettings.php, cache, extensions, images, … others?)
[07:39:22] any help is welcome :)
[07:40:02] also, since I want to use Puppet, I can easily build a specific LocalSettings.php, but is there a script or some executable in MediaWiki to create the DB and its tables?
[08:12:49] Bardack: maintenance/install.php
[08:21:21] yep indeed, thx :)
[12:44:05] Does anyone know of a simple PHP curl example that gets a token and allows me to create a page? I mostly find full-blown libraries with dependencies, but as I just need to create a simple page, I was hoping to find a simpler curl example.
[12:48:44] finalbeta: https://github.com/wikimedia/mediawiki-tools-release/blob/master/make-deploy-notes/botclasses.php is pretty standalone
[12:49:09] And a very simple usage example at https://github.com/wikimedia/mediawiki-tools-release/blob/master/make-deploy-notes/uploadChangelog.php#L47-L63
[12:49:23] Thank you Reedy
[14:01:56] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @alaa_wmde & @nuria - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:51:12] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @alaa_wmde & @nuria - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:55:44] Anyone know why I would get NameTableAccessException?
[14:56:05] NameTableAccessException from line 42 of /opt/htdocs/wiki/mediawiki/includes/Storage/NameTableAccessException.php: Failed to access name from content_models using id = 1
[14:56:44] The error was from slot_roles, until I manually inserted a row in that table.
[14:57:00] I'm running 1.32
[14:57:25] with a wiki farm, and I don't have the error in the other wikis
[14:58:09] https://integration.familysearch.org/wiki/en/Main_Page
[14:59:06] The DB is imported from v1.27.1 and upgraded
[15:33:19] Is there a reason why content_models would have zero records?
[15:34:07] In my other wiki, content_models includes 'wikitext' and 'scribunto'
[15:35:15] The subject wiki has well over 500K articles, so it definitely should have some content models
[15:37:23] The table was added in 1.31...
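(Aside: a quick way to confirm what freephile is describing is to dump the table from maintenance/shell.php, which ships with 1.31+. This is only a sanity-check sketch: the table name comes from the error message above, and the rest is standard core database API.)

```php
// From maintenance/shell.php: list whatever rows exist in content_models.
// On a wiki that has started writing the MCR tables this should show models
// such as 'wikitext'; an empty result matches the NameTableAccessException above.
$dbr = wfGetDB( DB_REPLICA );
$res = $dbr->select( 'content_models', '*', [], __METHOD__ );
echo "content_models rows: " . $res->numRows() . "\n";
foreach ( $res as $row ) {
	print_r( $row );
}
```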
[15:37:55] showJobs.php reveals no jobs pending, and I'm not sure if there is some maintenance script I should try
[15:38:25] Just trying to spot where/when the rows get added
[15:38:30] update.php completes without complaint
[15:41:54] yeah, I don't think it's done by update.php
[15:44:26] duesen_: ^ Any bright ideas?
[15:46:43] if you append ?requestDebug to any URL you'll get the full backtrace
[15:47:20] e.g. https://integration.familysearch.org/wiki/en/FamilySearch_Wiki:FamilySearch_Research_Wiki?requestDebug=true
[15:48:41] freephile: Have you had a look in phab to see if anyone else has reported anything similar?
[15:53:00] Reedy: I see https://phabricator.wikimedia.org/T203860, which seems related
[18:45:18] HELLO
[18:45:29] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_loop.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101
[18:46:03] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_header.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101
[18:46:18] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_topiclist.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101
[18:47:03] Please help me
[19:08:33] SOKOT: how would you like us to help you?
[19:08:53] what are you trying to do, what kind of setup do you have, what seems to be going wrong?
[19:09:55] SOKOT: have you checked to ensure that the web content is owned by the web user?
[19:09:55] e.g. chown -R www-data:www-data /home/bayanmedia/domains/bayanmedia.ir/public_html
[19:10:51] (assuming you're on Debian/Ubuntu, otherwise the web user could be different)
[19:31:29] tgr: do you have any idea why / how to fix my NameAccessException error above?
[19:32:14] NameTableAccessException
[19:44:09] Hello! I'm interested in seeing how to use the Wikipedia HTTP API for querying deep subcategories of a given base category. I saw discussions about deepcat or CirrusSearch, but I'm not sure if those are available. Any help appreciated :-)
[20:02:19] freephile: (very late response) content_models will be empty if you are running in $wgMultiContentRevisionSchemaMigrationStage = SCHEMA_COMPAT_WRITE_OLD; mode
[20:02:30] for any other compat mode, it should not be empty.
[20:02:43] run update.php *after* changing the model.
[20:02:50] the mode, sorry
[20:03:12] * duesen_ is off to a meeting now
[20:08:13] if the wiki were still in SCHEMA_COMPAT_WRITE_OLD, there wouldn't be any content model ID to trigger a lookup, no?
[20:08:33] or maybe that 1 is a model name that got type-juggled into an int?
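(Aside: a hedged illustration of the setting duesen_ is referring to. The SCHEMA_COMPAT_* constants are defined in core's Defines.php in 1.32; which combination is right depends on where the upgrade actually stands, so treat this as a sketch of the setting's shape, not a recommendation.)

```php
// LocalSettings.php – possible stages of the MCR schema migration flag.

// Old schema only: nothing is written to the new MCR tables, so content_models stays empty.
// $wgMultiContentRevisionSchemaMigrationStage = SCHEMA_COMPAT_WRITE_OLD | SCHEMA_COMPAT_READ_OLD;

// Transitional: write both schemas but keep reading the old one; per duesen_'s note above,
// run maintenance/update.php after changing the mode.
$wgMultiContentRevisionSchemaMigrationStage = SCHEMA_COMPAT_WRITE_BOTH | SCHEMA_COMPAT_READ_OLD;

// New schema only, once the migration is complete.
// $wgMultiContentRevisionSchemaMigrationStage = SCHEMA_COMPAT_NEW;
```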
[20:11:27] harmaahylje: hello
[20:13:11] harmaahylje: chown -R www-data:www-data /home/bayanmedia/public_html gives: chown: invalid user: ‘www-data:www-data’
[20:45:08] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_loop.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101
[20:45:24] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_header.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101
[20:46:04] Warning: file_put_contents(/home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/handlebars/compiled/flow_block_topiclist.handlebars.php): failed to open stream: Permission denied in /home/bayanmedia/domains/bayanmedia.ir/public_html/extensions/Flow/includes/TemplateHelper.php on line 101. Please help me
[21:49:24] For different values of wgMultiContentRevisionSchemaMigrationStage I can get different error messages, but when I unset it I'm back to the original NameTableAccessException
[21:49:38] I've run update.php each time
[21:50:21] https://www.mediawiki.org/wiki/Manual:$wgMultiContentRevisionSchemaMigrationStage says to run migrateXXX.php
[21:50:53] but none of the maintenance/migrate scripts seem to apply (is it missing?)
[21:55:45] is update.php failing?
[21:59:06] I have a very small question. I use the Parsoid API. There is an option, "scrub_wikitext", which makes some normalizations to the parsed HTML. The problem is that whenever I set its value to false, it keeps making the normalizations.
[22:00:01] I need this option to not be enabled. Even the docs say that setting the value to false will stop the normalizations.
[22:04:35] It wasn't earlier
[22:08:35] Reedy: Sorry, could you please explain what you mean?
[22:10:45] freephile: I think "migrateXXX" is just a way of referring to any and all migrate-prefixed maintenance scripts
[22:51:58] Reedy: Thanks
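(Appendix: two small sketches for the standalone API questions asked during the day. First, the `categorymembers` query behind the 00:48 and 19:44 questions: it lists the direct subcategories and pages of a category. The Action API does not recurse by itself, so a deep traversal means repeating the call for each returned subcategory. The category name and user agent below are placeholders.)

```php
<?php
// Sketch: fetch the direct members of a Wikipedia category via the Action API.
$api = 'https://en.wikipedia.org/w/api.php';
$params = [
	'action' => 'query',
	'list' => 'categorymembers',
	'cmtitle' => 'Category:Graph theory',  // example category
	'cmtype' => 'subcat|page',             // subcategories and articles
	'cmlimit' => '500',
	'format' => 'json',
];

$ch = curl_init( $api . '?' . http_build_query( $params ) );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
// Wikimedia's API etiquette asks for a descriptive User-Agent with contact info.
curl_setopt( $ch, CURLOPT_USERAGENT, 'GraphNavDemo/0.1 (contact@example.org)' );
$data = json_decode( curl_exec( $ch ), true );
curl_close( $ch );

foreach ( $data['query']['categorymembers'] as $member ) {
	echo $member['ns'] . "\t" . $member['title'] . "\n";
}
// Continuation: if the response contains a 'continue' block, repeat the request
// with the returned continuation parameters (e.g. cmcontinue) to get the next batch.
```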
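(Second, a minimal sketch of the get-a-token-and-create-a-page flow finalbeta asked about at 12:44, using nothing but plain curl: log in with a bot password, fetch a CSRF token, then call action=edit. The endpoint, bot credentials, and page title are placeholders, and error handling is omitted for brevity.)

```php
<?php
// Sketch: create a page through the MediaWiki Action API with plain curl.
$api = 'https://example.org/w/api.php';                    // placeholder endpoint
$cookieJar = tempnam( sys_get_temp_dir(), 'mwcookies' );   // session persists across requests

function apiRequest( $api, $cookieJar, array $params, $post = false ) {
	$params['format'] = 'json';
	$ch = curl_init();
	curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
	curl_setopt( $ch, CURLOPT_COOKIEJAR, $cookieJar );
	curl_setopt( $ch, CURLOPT_COOKIEFILE, $cookieJar );
	if ( $post ) {
		curl_setopt( $ch, CURLOPT_URL, $api );
		curl_setopt( $ch, CURLOPT_POST, true );
		curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $params ) );
	} else {
		curl_setopt( $ch, CURLOPT_URL, $api . '?' . http_build_query( $params ) );
	}
	$result = curl_exec( $ch );
	curl_close( $ch );
	return json_decode( $result, true );
}

// 1. Fetch a login token, then log in (use a bot password from Special:BotPasswords).
$r = apiRequest( $api, $cookieJar, [ 'action' => 'query', 'meta' => 'tokens', 'type' => 'login' ] );
$loginToken = $r['query']['tokens']['logintoken'];

apiRequest( $api, $cookieJar, [
	'action' => 'login',
	'lgname' => 'ExampleBot@example',   // placeholder bot username
	'lgpassword' => 'bot-password-here',
	'lgtoken' => $loginToken,
], true );

// 2. Fetch a CSRF (edit) token for the logged-in session.
$r = apiRequest( $api, $cookieJar, [ 'action' => 'query', 'meta' => 'tokens', 'type' => 'csrf' ] );
$csrfToken = $r['query']['tokens']['csrftoken'];

// 3. Create the page; 'createonly' makes the call fail instead of overwriting an existing page.
$r = apiRequest( $api, $cookieJar, [
	'action' => 'edit',
	'title' => 'Sandbox/Curl test',     // placeholder title
	'text' => 'Hello from a plain curl client.',
	'summary' => 'API test edit',
	'createonly' => '1',
	'token' => $csrfToken,
], true );

var_dump( $r );
```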