[12:40:58] Hi, is there an easy way to get all "discussion_pages" which have content?
[12:59:34] nickthenamehere: Do you mean talk pages that actually exist? Special:AllPages and filter by namespace
[13:05:27] Vulpix yes and thank you. I'll try it out
[13:14:40] Vulpix nice that's working. Would that also work with the API?
[13:14:59] yes, there's an api for allpages
[13:17:33] Hello all... do you know how to get, via the API, all page names that link to a specific template?
[13:18:41] You mean transclude, not link?
[13:18:41] https://en.wikipedia.org/w/api.php?action=help&modules=query%2Btranscludedin
[13:19:16] i.e. {{MyTemplate}}, not [[Template:MyTemplate]]
[13:20:09] Reedy, on my site I have about 7000 pages with {{TemplateX}}... I would like to extract a list of those pages
[13:20:48] Yeah, it's the API module I listed above then
[13:21:00] https://en.wikipedia.org/w/api.php?action=query&prop=transcludedin&titles=Template:Uncat
[13:21:32] ok, I'll try it
[13:21:40] in my API sandbox
[13:29:19] Reedy, it works... Now I'm working on the limit :') :') :')
[13:29:46] set tilimit=max
[13:30:26] but max is 500 for bots, right?
[13:30:51] It seems to be 5000 for sysops
[13:30:53] * Reedy looks at the code
[13:32:30] Bots should have 5000 if they have the apihighlimits right
[13:32:47] ok... and for the next 5000?
[13:37:08] pagination
[13:37:19] Grab the continue string, append it to the query and get the next 5000
[13:45:19] thanks Reedy
[13:45:27] I'll try it with Python
[15:45:54] Hey all, I'm looking to resolve a potential namespace conflict in one of our wikis to help with a tangential issue. Basically SMW autoloads, and its namespace 102 & 103 values conflict with an existing pair of namespaces in that wiki. Would it be sufficient to run an UPDATE statement against the `page` table to set existing entries with namespaces of 102 or 103 to 122 and 123 respectively?
[15:46:38] Hmm, my wording there was less than ideal, but I think I got my idea across. :)
[15:48:13] You'd need to do updates on various other tables too
[15:48:47] As otherwise links and other tracking would be wrong
[15:49:38] I thought we had a script to "renumber namespaces"
[15:49:58] justinl: But yes, basically in theory. Then flush caches and such
[15:50:07] I figured as much. Was looking at https://www.mediawiki.org/wiki/Topic:Q6xah0k2h1e0drmb and saw they mentioned there was supposed to be a tool for such a namespace update long ago, but it never happened.
[15:51:56] I think this issue is then preventing me from having a single wiki farm/family, since 4 of my 7 wikis use SMW and one of them has this namespace ID conflict. So basically I could do a 4/3 split with two wiki families, e.g. standard and semantic, as it doesn't really seem feasible to have two different extensions directories with a single copy of MediaWiki being used by multiple wikis.
[15:52:07] https://phabricator.wikimedia.org/T180885
[15:52:07] lol
[15:53:38] Well, at least you tried. :)
[17:36:17] hi. i installed TimedMediaHandler on my MediaWiki wiki, but i can't get it to work. i'm getting an error about a non-existent relation "transcode" (full error here: https://pastebin.com/GDMWnNT5, some errors are in Polish - my system is in Polish, though the wiki is in English - maybe i can change the error language somehow?). i ran update.php, added the extension to composer and 'composer install'ed. running 1.34.1 with PostgreSQL 11.7.
[17:38:58] It sounds like update.php didn't create the table
[17:52:33] so a bug?
[17:52:59] oh, good. Restbase was able to detect something was wrong... "worker stopped sending heartbeats, killing.", tries to restart, good! "worker died, restarting". Seems to fail again: "worker stopped sending heartbeats, killing.". SNAFU. "worker died during startup, continue startup" :)
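On the allpages question from 13:14: a minimal query sketch, assuming a stock api.php and that "discussion pages" means the Talk namespace (ID 1); other talk namespaces have their own odd-numbered IDs:

    api.php?action=query&list=allpages&apnamespace=1&aplimit=max&format=json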
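And since Python came up at 13:45, here is one way the transcludedin pagination from 13:37 could look, assuming the requests library; the wiki URL and Template:TemplateX are placeholders:

    # A minimal sketch: list every page transcluding a template, following
    # the API's continuation tokens. Adjust API and the template title.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    params = {
        "action": "query",
        "prop": "transcludedin",
        "titles": "Template:TemplateX",
        "tilimit": "max",  # 500, or 5000 with the apihighlimits right
        "format": "json",
    }

    session = requests.Session()
    while True:
        data = session.get(API, params=params).json()
        for page in data["query"]["pages"].values():
            for t in page.get("transcludedin", []):
                print(t["title"])
        if "continue" not in data:
            break
        params.update(data["continue"])  # carry the continue string forward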
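On the namespace renumbering from 15:45: the core of the idea is the sketch below, but it is deliberately incomplete; as noted above, link and tracking tables (pagelinks, templatelinks, recentchanges, logging, archive, watchlist, and others) carry their own namespace columns and would need the same treatment, and caches must be flushed afterwards. Back up the database first.

    -- Partial sketch only: the page table is just the start.
    UPDATE page SET page_namespace = 122 WHERE page_namespace = 102;
    UPDATE page SET page_namespace = 123 WHERE page_namespace = 103;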
[17:53:23] grzesiek11: It's hard to say. There's not enough information
[17:53:43] According to systemd, restbase is still alive, but it's not. Nodejs is still alive...
[17:53:43] I suspect the issue is just a lack of PG support
[17:53:48] Which is fairly common
[17:54:39] https://github.com/wikimedia/mediawiki-extensions-TimedMediaHandler/blob/master/includes/TimedMediaHandlerHooks.php#L507-L508
[17:55:35] https://phabricator.wikimedia.org/T157424
[17:57:52] oh, ok
[17:58:05] so i should switch to mysql?
[17:58:18] or just wait a bit
[17:59:56] Well, the task hasn't been resolved in 3 years
[18:00:20] If you can help, patches welcome
[18:00:26] But MySQL is generally better supported
[18:03:10] i barely know PHP, and don't know SQL at all (i just use it for services that need it) - so i can't really help. and what about SQLite? it is supported in this particular plugin, but what about other ones? generally, i use PostgreSQL because it's lighter on resources (i don't have much) than MySQL, but i also use SQLite in a few programs.
[18:04:10] sqlite is ok depending on the traffic to the wiki
[18:11:31] ok so, i will try SQLite, my wiki is very small and has fewer than 20 unique visitors a month. and one thing about patches: i bodged a feature into one plugin, how can i submit a pull request? mediawiki uses gerrit (correct me, i just looked at the url) and i only have experience with gitlab.
[18:12:41] If you can make a patch, we have https://tools.wmflabs.org/gerrit-patch-uploader/ you can use
[18:12:46] Or use the gerrit web interface
[18:13:28] Gerrit isn't particularly hard... Just, rather than `git push origin`, you can do `git push origin HEAD:refs/for/master`, without necessarily using git-review
[18:13:32] thanks!
[18:32:15] Hi, should I use MediaWiki as a tool to translate longer texts into different languages, or do you know a different/better tool?
[18:33:42] I wonder if I can do it on a single wiki or with a wiki family; basically, I would like to use the moderation and openness to let many contributors translate the texts of a single author.
[19:56:34] A wiki of mine is moving into another wiki. I came across this: https://www.mediawiki.org/wiki/Template:Soft_redirect. Does a soft redirect have a good SEO effect, or should I just go for a permanent redirect in the Apache2 VirtualHost conf?
[19:57:14] I mean to do the redirecting on a per-page basis, since there are only under 50 articles in wiki.study
[19:57:39] damn, these .study domains are expensive... 43€/yr
[20:16:58] I'd say soft redirects are not very good for SEO. A small page with just one link will be seen as a "soft 404 error" and dropped from the index entirely
[20:17:37] redirect permanent is the way to go if you transfer the contents from one site to the other
[20:29:46] Ok, thanks for the info, Vulpix. And is there a way to do a permanent redirect from within MediaWiki, or should I just do it in the Apache VirtualHost conf?
[20:29:52] What's the best way to clear abandoned jobs from the queue? So far my googling basically shows manually deleting from the job table, but is there a safer and/or more preferred way?
[20:36:21] jukebohi: MediaWiki doesn't provide permanent redirects from wiki pages
[20:41:59] Hi. Is it possible to require specifying a reason when moving pages?
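For the Gerrit push at 18:13, the end-to-end shape is roughly as follows; the extension name is an example, and note that Gerrit also expects a Change-Id footer in the commit message, which its commit-msg hook adds:

    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/ExampleExtension
    cd ExampleExtension
    # make the change, then:
    git commit -a -m "Describe the change"
    git push origin HEAD:refs/for/master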
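For the permanent redirects discussed at 20:17 and 20:36: since MediaWiki doesn't provide them from wiki pages, something along these lines in the old wiki's VirtualHost would cover it (hostnames and paths are examples). With under 50 articles, one Redirect line per page is manageable, or a single RedirectMatch can map everything:

    Redirect permanent /wiki/Some_Article https://newwiki.example.org/wiki/Some_Article
    RedirectMatch permanent ^/wiki/(.*)$ https://newwiki.example.org/wiki/$1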
[20:46:01] nvm, think I found it in deleteBatch.php
[20:46:45] nvm, misread it
[20:47:22] manageJobs.php :)
[20:47:50] Esteban16: so the "easy" way to do that is to add some JS on the move page form that checks for a reason, but it's easy to bypass (disable JS or use the API)
[20:48:03] I don't recall any config variables that force you to require a move reason
[20:48:14] so you'd probably need to write your own extension
[20:48:14] Ok, thank you :)
[23:41:49] How can I @import CSS from a MediaWiki or user CSS page? I thought it was through index.php, but it's not working
[23:45:50] Found it, `index.php?title=MediaWiki:PageName.css&action=raw`. Is that the best/only way to do it, or is there something better I can do?
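The manageJobs.php script found at 20:47 is invoked along these lines; the job type here is an example, so check the script's --help on your version for the exact options:

    php maintenance/manageJobs.php --type refreshLinks --action delete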
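A minimal sketch of the JS approach from 20:47, e.g. in MediaWiki:Common.js; the selector for the reason field is an assumption and varies between skins and versions, and as noted it's trivial to bypass:

    // Block submitting Special:MovePage with an empty reason.
    // The #wpReason / wpReason selectors are assumptions; inspect your form.
    $(function () {
        if (mw.config.get('wgCanonicalSpecialPageName') !== 'Movepage') {
            return;
        }
        $('form').on('submit', function (e) {
            var reason = $('#wpReason input, input[name="wpReason"]').val();
            if (!reason || !$.trim(reason)) {
                e.preventDefault();
                alert('Please give a reason for the move.');
            }
        });
    });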
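One addition to the 23:45 finding: action=raw serves pages as text/x-wiki by default, and browsers will often refuse to apply @import-ed styles with the wrong MIME type, so appending ctype=text/css is usually needed:

    @import url('/index.php?title=MediaWiki:PageName.css&action=raw&ctype=text/css');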