[00:20:14] Newb here: I was thinking of helping develop a feature for commons.wikimedia.org that finds similar pics using machine learning. Any thoughts?
[00:20:29] For example, if I wanted to find pics similar to: https://commons.wikimedia.org/wiki/File:Tesla_Autopilot_Engaged_in_Model_X.jpg
[00:36:53] I found a mediawiki install with custom markup like this: {{Entry|foo=bar|baz=quux}}. What kind of "thing" is this? It's not a function, apparently?
[00:37:11] since a function would be {{#Entry|foo|bar|baz}} ...
[00:39:11] oh... maybe it's a template?
[00:43:28] {{Entry}} would be a template, yes
[00:43:33] ningu: Visit Template:Entry on that wiki
[00:44:18] Reedy: thanks, I am slowly figuring this out
[00:44:29] first step is knowing how to ask the question or where to look for things
[00:44:57] {{#entry}} would be a parser function
[00:45:10] yeah
[00:50:15] ok, so one of these templates has a parameter called Description, and it's being placed in the template as follows: {{{Description|}}}
[00:50:21] is the pipe doing anything useful there?
[00:52:06] yeah
[00:52:33] https://meta.wikimedia.org/wiki/Help:Advanced_templates#Branching_techniques_without_ParserFunctions
[00:54:29] I think you mean parameter defaults, not conditional values?
[00:56:46] so in other words, if there is no Description parameter, it will use an empty string
[00:56:59] not sure what happens if you leave that pipe out ... will it give an error maybe?
[00:57:07] if Description isn't defined?
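A minimal sketch of the behaviour discussed above. Template:Entry and the Description parameter come from the conversation; the rendering rules shown are standard MediaWiki template syntax:

    <!-- Inside Template:Entry, the trailing pipe gives the parameter an
         empty-string default, so a call that omits Description renders
         nothing here (no error): -->
    {{{Description|}}}

    <!-- Without the pipe there is still no error, but the fallback is
         different: if the caller omits Description, the literal text
         {{{Description}}} appears in the rendered page instead of an
         empty string: -->
    {{{Description}}}

    <!-- For comparison: {{Entry|foo=bar|baz=quux}} transcludes the page
         Template:Entry, while a parser function uses a hash and a colon,
         e.g. {{#if: condition | then | else }} from the ParserFunctions
         extension. -->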
[03:04:00] could someone explain this test failure to me? I'm at a loss... https://integration.wikimedia.org/ci/job/quibble-composer-mysql-php70-docker/5050/console
[03:52:50] davidwbarratt: maybe try changing L27 of src/GraphQL.php from $title = \SpecialPage::getTitleValueFor( 'Special:GraphQL' ); to $title = \SpecialPage::getTitleValueFor( 'GraphQL' ); and see if that makes the E_NOTICE (that's causing the browser tests and stuff to fail) go away? SpecialPage#getTitleFor (and #getTitleValueFor, for that matter) doesn't need the namespace, just the canonical special page name; if you ever need to bypass the NS_SPECIAL localization entirely, you'll want to resort to Title::makeTitle( NS_MAIN, 'Special:MyCoolNewSpecialPage' ) or somesuch (as per /includes/user/User.php)
[04:19:57] ashley: let me try that
[04:29:18] ashley: that did it, thanks!
[04:49:04] awesome :)
[05:38:24] Is there any way to set up a template that tracks ages against a custom calendar or year count? I have a wiki for a roleplaying community I run and would like to be able to automatically track characters' ages. The current year in the setting is 771, and it increments at the beginning of each month.
[07:05:22] how would you go about creating or duplicating the English Wiktionary?
[07:05:59] or at least the SQL portion of that
[11:43:52] hey
[11:44:31] i'm writing a mw extension and I have a question about a specific hook, is this the right place to ask?
[11:48:44] well, never mind, I think I got it
[13:26:26] Reedy: When was MediaWiki 1.32.1 released? o_O https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/493031/3/RELEASE-NOTES-1.32
[13:51:27] Vulpix: Tarballs were built... but it's been sitting in weird limbo with releng
[13:51:48] Omg, yes, I'm now reading T213595
[13:51:49] T213595: Release 1.32.1 as a maintenance release - https://phabricator.wikimedia.org/T213595
[15:00:03] Good day, gentle people
[15:02:47] I would like all the UI candy for viewers, to make https://develop.consumerium.org/wiki/ multilingual with [[mw:Extension:Translate]]
[15:04:18] i.e. it would be really nice to get the best possible UI solution from the perspective of wiki viewers, and also the completeness indicators (the small boxes Wikibooks has, iirc) would be nice
[15:04:36] ... for a multilingual wiki
[21:02:01] are the databases for Wiktionary replicated?
[21:13:26] yes
[21:14:26] or can you completely copy the database without using a dump?
[21:18:59] they're replicated internally for load balancing and resiliency against server failure, but those replicas are not directly accessible to the public. If you want all of the Wiktionary content, a dump is your best bet
[21:19:33] there are the public replicas with some metadata
[21:21:25] right, forgot about those (or didn't know they existed, not sure which)
[21:28:51] i have tried some of those public dumps
[21:29:22] Krenair: it seems like the instructions are inconsistent
[21:29:35] which instructions?
[21:29:49] dumps to live sql db
[21:30:11] I don't follow
[21:30:42] how do you take https://dumps.wikimedia.org/enwiktionary/latest/ to a database?
[21:45:15] black_13: https://meta.wikimedia.org/wiki/Data_dumps/Tools_for_importing
[23:08:35] the instructions on the page https://meta.wikimedia.org/wiki/Data_dumps/Tools_for_importing are cursory
[23:09:37] of the files in https://dumps.wikimedia.org/enwiktionary/latest/, which do you use, and is there an order?
[23:11:54] how do you create a basic mysql or mariadb database?
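On the dump-import questions at the end, a rough sketch of one common route, assuming a working MediaWiki install and a local MySQL/MariaDB server. The database name, paths, and credentials below are placeholders, not anything prescribed by the dumps:

    # The main content is in pages-articles: current revisions of all pages.
    # (The other files in the directory add full history, logs, abstracts,
    # or individual SQL tables; none of them are required for page text.)
    wget https://dumps.wikimedia.org/enwiktionary/latest/enwiktionary-latest-pages-articles.xml.bz2

    # Create an empty database for MediaWiki to use
    # ('enwiktionary' is a placeholder name)
    mysql -u root -p -e "CREATE DATABASE enwiktionary;"

    # Install MediaWiki against that database with the normal web installer,
    # then load the XML dump with the bundled maintenance script (it reads
    # .bz2 files directly):
    cd /path/to/mediawiki
    php maintenance/importDump.php ../enwiktionary-latest-pages-articles.xml.bz2

    # Rebuild derived data afterwards
    php maintenance/rebuildrecentchanges.php
    php maintenance/initSiteStats.php

    # Optional: the per-table *.sql.gz files from the same dump directory
    # (categorylinks, pagelinks, ...) can be loaded straight into the
    # database instead of being regenerated locally:
    zcat enwiktionary-latest-categorylinks.sql.gz | mysql -u root -p enwiktionary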
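And on the earlier question about tracking character ages against a custom calendar (the year-771 setting): one possible approach, assuming the ParserFunctions extension is installed. Template:CurrentYear and Template:Age are hypothetical names for this sketch, and January 2019 as the real-world month in which in-setting year 771 began is an invented epoch to adjust:

    <!-- Template:CurrentYear (hypothetical): derive the in-setting year
         from the real date so it increments automatically each month.
         Assumes year 771 began in January 2019; adjust the epoch to suit.
         {{CURRENTMONTH1}} is the unpadded month number. -->
    {{#expr: 771 + ({{CURRENTYEAR}} - 2019) * 12 + {{CURRENTMONTH1}} - 1}}

    <!-- Template:Age (hypothetical): current in-setting year minus the
         character's birth year. The pipe default (the same trick as the
         Description discussion above) makes a missing birthyear yield
         age 0 rather than a literal {{{birthyear}}} breaking #expr. -->
    {{#expr: {{CurrentYear}} - {{{birthyear|{{CurrentYear}}}}} }}

    <!-- Usage on a character page: -->
    {{Age|birthyear=745}}

While the in-setting year is 771, that last call would render as 26.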