[00:05:09] Hi all, I'm currently developing a plugin that has a dependency on a Symfony component. Normally I would load this in with Composer, but I'm unsure whether this dependency would be better off in extension.json?
[09:24:25] I'm trying to create a robots.txt and a sitemap... I want Google to be able to index articles, but not non-article pages
[09:24:57] the robots.txt examples I'm getting from the MediaWiki docs and from Wikipedia's robots.txt show a Disallow: /w/ and an Allow: /load.php?
[09:25:29] cool. so it only allows grabbing of pages by load.php, and not by the short URL
[09:26:08] but the sitemap that's generated by generateSitemap.php has short URLs in it
[09:26:26] is there some magic here where Google knows that a short URL = load.php?
[09:26:41] or do I need to reconfigure the sitemap generator provided by MediaWiki?
[09:26:51] https://www.irccloud.com/pastebin/P1zHgm2w/
[09:28:59] khalella: we at the TDF take another approach, see https://wiki.documentfoundation.org/robots.txt
[09:29:20] interesting, DennisRoczek
[09:29:22] so, more or less: allow only the short URL
[09:29:51] though I have to say, our pages look like https://wiki.documentfoundation.org/Name_of_Article
[09:30:01] so without any /w/ part
[09:32:29] that could work
[09:32:35] I want to understand Wikipedia's solution though
[09:32:44] is there some magic on the Google side of things?
[09:38:43] who knows? none of us can look at Google's code. but very likely Google doesn't have any special MediaWiki handling
[09:38:56] maybe some Wikipedia handling, but I doubt it
[09:43:49] Hi there, do we have a page on mediawiki.org detailing what is default when you install MediaWiki?
[09:45:52] Trizek: do whatnow
[09:59:02] khalella, I'm not sure I understand your reply.
[09:59:51] sorry.
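The Wikipedia-style robots.txt pattern discussed above might be sketched like this (a hypothetical minimal version, assuming short URLs like /wiki/Page and script files under /w/, as on Wikipedia):

```
# Sketch based on the pattern in Wikipedia's robots.txt
User-agent: *
# Let crawlers fetch the CSS/JS needed to render pages:
Allow: /w/load.php?
# Block index.php action URLs (edit, history, diffs, etc.):
Disallow: /w/
```

Under this layout the sitemap's short URLs need no special handling: /wiki/ paths are never disallowed, so articles stay crawlable; the Allow on load.php most likely exists only so crawlers can fetch page resources, not as an alternative page URL.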
I didn't understand the question
[10:00:15] I think most settings include defaults in the documentation
[10:13:08] DennisRoczek: maybe it's irrelevant, as presence in the sitemap means it doesn't need to index the page? https://webmasters.stackexchange.com/a/46684
[10:13:18] I dunno. I could experiment, I guess
[10:13:36] er, s/index/crawl
[10:14:43] * DennisRoczek is no SEO expert at all. I just wanted to show you another solution somebody else worked out...
[10:56:16] khalella, my question is: what features are default when you deploy a MediaWiki?
[11:09:11] Trizek: https://phabricator.wikimedia.org/source/mediawiki/browse/master/includes/DefaultSettings.php
[11:27:28] Trizek: if you want a high-level overview, https://en.wikipedia.org/wiki/MediaWiki is quite well written
[11:28:40] (PS: I answered just previously through the Matrix-IRC gateway but it seems it didn't reach IRC.)
[11:35:30] seb35, teach your Seb35[m] to identify to NickServ, possibly
[11:36:30] yes, indeed, will try
[12:29:44] thanks Sveta_ and Seb35[m]
[12:30:19] I note that mediawiki.org, the main site where to find that information, is not in your replies! :D
[13:05:44] hi
[13:06:04] could anyone here help me integrate collapsing into a template?
[13:32:08] hi, when you block a user on our internal wiki, autoblock of the IP is on by default
[13:32:17] can you change that default?
[13:34:14] it seems I can set $wgAutoblockExpiry very low to mitigate the effect, but I would like to disable it completely
[13:56:37] ok, found the whitelist, that will solve my issue
[14:13:56] raza_bulldog, no, some people speak other languages
[14:41:01] Most people probably speak English, yes
[14:49:11] Krenair: or Reedy
[14:49:16] do you guys have a minute?
[14:49:21] !ask
[14:49:21] Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply.
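The autoblock workaround mentioned above can be sketched in LocalSettings.php (the one-second value is a hypothetical extreme; $wgAutoblockExpiry defaults to one day, and nothing in the conversation identifies a single switch to disable autoblocks outright):

```php
# LocalSettings.php — sketch, not a complete configuration.
# Autoblocks can't simply be turned off here, but their lifetime can
# be made very short (the default is 86400 seconds, i.e. 1 day):
$wgAutoblockExpiry = 1; // hypothetical: autoblocks expire after 1 second
```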
Don't ask for help or for attention before actually asking your question; that's just a waste of time – both yours and everybody else's. :)
[14:50:43] Ulfr: I'm in a meeting atm, so it's hard to patronise you
[14:50:48] I have a .php file which parses a .csv, then a template pulls the name
[14:51:22] but the problem is the template pulls the whole team's details
[14:51:35] so we wanted a header shown and the rest of the team collapsed
[14:51:42] so a table
[14:52:00] the header being shown and the rows hidden by default
[14:52:05] how can I do this?
[14:55:47] Ulfr: I think there's a maintenance script (nukeNS.php maybe) that lets you blow away an entire namespace
[14:55:58] so you could use that to eradicate all of the Flow namespaces
[14:56:56] is it even possible to automate such a thing?
[14:57:23] show the class and hide its content rows?
[14:57:51] "You can add mw-collapsed after mw-collapsible to have the content collapsed by default when you load the page."
[14:57:57] is that what you're after?
[14:57:59] I tried that
[14:58:05] but there's an issue
[14:58:13] like, I have this "#team"
[14:58:25] it pulls the team name and its players
[14:58:33] I want to show the team name and hide the players
[14:58:58] yes, so if you format it as a table, with the team name as the table header and the players as table contents, then adding those two classes mw-collapsible and mw-collapsed to the table will do what you want
[14:59:12] can you give me an example?
[14:59:44] the page Ulfr linked you has a couple of examples of wikitables set up like that
[15:00:02] the thing is, it *has* to be a wikitable; you can't use a normal one
if you want that behavior (keeping the header and collapsing the contents)
[15:00:40] Skizzerz: Ulfr linked no site
[15:00:48] @biberao https://www.mediawiki.org/wiki/Manual:Collapsible_elements
[15:01:01] I didn't see such a thing, sorry
[15:01:18] np :)
[15:01:28] and he didn't link
[15:01:30] any site
[15:01:38] I did a /lastlog Ulfr
[15:01:43] only you linked it
[15:01:44] lol
[15:01:51] 16:00 <@Skizzerz> @biberao https://www.mediawiki.org/wiki/Manual:Collapsible_elements
[15:02:00] anyway
[15:02:03] I tried that
[15:02:10] I could show you
[15:02:17] how they did it and what I'm trying to do
[15:02:18] ?
[15:14:58] Error message?
[15:17:07] You might need to re-enable Flow temporarily
[15:18:53] What does update.php fail with?
[15:20:41] Did you delete the tables?
[15:20:49] i.e. were they created before, logged as completed, and now won't recreate?
[15:21:27] the nuclear option would be a manual DB query to change Flow content models back to wikitext
[15:29:36] Ulfr: Possibly do what Skizzerz suggested then... if you don't care about the Flow board data...
[15:29:46] force-change the content model, then just mass-delete them
[15:33:45] UPDATE page SET page_content_model = 'wikitext' WHERE page_content_model = 'flow-board';
[15:33:46] I think
[15:38:26] They'd be pages of potential garbage, I guess
[15:41:46] funny
[15:41:51] is Ulfr on my ignore list?
[15:42:15] I can't see his msgs
[15:42:47] biberao: unidentified users are under the +q
[15:42:50] ah
[15:42:52] only ops can see them
[15:42:54] see
[15:42:58] Skizzerz: :P
[15:43:01] oops
[15:43:10] after all, I wasn't being mean
[15:43:11] :D
[15:43:15] Hi!
[15:43:20] my feelers weren't hurt
[15:43:33] hi
[15:43:47] *accidentally dumps hot coffee over Skizzerz* oopsies.
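The collapsed-by-default wikitable suggested above might look like this minimal sketch (wikitext; the team name and players are made-up placeholders):

```
{| class="wikitable mw-collapsible mw-collapsed"
! Example Team Name
|-
| Player 1
|-
| Player 2
|}
```

With mw-collapsible the table header row stays visible and acts as the toggle; adding mw-collapsed makes the player rows hidden when the page first loads.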
:P
[15:43:52] guys
[15:43:53] *returns to being silent*
[15:44:00] take a look at my issue
[15:44:01] thanks
[15:44:01] :D
[15:44:30] biberao: you'll need to modify that parser function your extension to add those classes
[15:44:45] Skizzerz: ok
[15:44:47] *that parser function in your extension
[15:44:53] if I paste the code, can you help me modify it?
[15:45:00] not for free, sorry
[15:45:07] ok thanks
[15:45:07] :D
[15:45:12] sorry to have bothered you
[15:46:25] @Skizzerz any chance you could take a gander at my $64k question above? I'm going to have to run that query at oh-dark-thirty when this channel's mostly dead
[15:47:02] you'll probably have another puzzle to figure out :)
[15:48:03] how big is your job queue?
[15:48:37] I wonder if your frontend is misbehaving when MW is triggered to run a job at the end of a web request, and keeps the connection open until it times out even though MW is done sending data
[15:49:16] My queue is currently at 0
[15:49:35] and it's not a static amount of time between outages; it's maddeningly imprecise
[15:49:50] and it didn't start up until I had to turn on memcached, because y'all apparently hate people who don't cache
[15:50:44] we do :)
[15:51:20] check memcached memory usage and see if the OOM killer is sniping it, or if it's when cache entries are being invalidated and regenerated
[15:52:13] (debug logging in MediaWiki can give insight into the cache behavior)
[15:52:40] I'll have to look into it, I pretty much just apt-get'd memcached and called it a day
[15:52:59] would it affect multiple frontends simultaneously?
[15:52:59] default memcached configs are notoriously bad
[15:53:21] it sets the cache size to something really small, like 128 MB
[15:53:46] (small for MediaWiki, if your wiki is large and you're shoving everything in there)
[15:54:19] memcached having problems would affect the backend, and if the backend is having problems then all of your frontends will too
[15:54:42] oh good lord.
it was set to 64m
[15:55:32] you should also make sure it's not listening on 0.0.0.0
[15:55:41] (that's a security risk)
[15:55:58] well, that + no firewall blocking access to it from the outside world is a security risk
[15:56:00] it's set to 127.0.0.1
[15:56:08] good :)
[15:56:09] * Skizzerz afks a bit
[15:56:10] Memcached defaults be crazy
[15:56:35] anyway, try upping the cache size and see if that makes your 504s go away
[15:57:04] I might be missing context
[15:57:04] already done did dat, am crossing fingers
[15:57:24] not particularly. y'all decided that not caching comes with a 5-second page load penalty
[15:57:29] so I had to scramble and get one set up
[15:57:35] but 504s can sometimes be caused by localisation recache events
[15:57:39] and apparently those settings are craycray
[15:57:58] enabling $wgCacheDirectory can sometimes help with that
[15:58:21] noo, I had that problem probably 3 years ago. drove me ABSOLUTELY mental until I changed something l10n-related and suddenly my database usage plummeted
[15:58:33] I'm screwed
[15:58:34] :|
[15:59:06] biberao: what happened?
[16:14:09] @bawolff Skizzerz dipped on me before confirming — memcached listening on 127.0.0.1 is OK, right?
[16:15:07] Yep
[16:19:31] Thank you!
[16:19:41] ...and after applying that simple fix my CPU usage went down 80% -.-
[17:33:42] Is it possible to fetch from a template but just print parts of it?
[17:37:12] Sure. You'd do that with PHP
[17:38:27] damn
[17:38:40] :|
[18:19:46] tgr: Hey, the fix for it is deployed; I'm going to run it again on Commons and monitor everything
[18:21:03] Amir1: ack, thanks again for working on this!
[18:21:08] I'll be afk for a while
[18:21:13] Have fun!
[20:23:50] hi, are there any efforts to provide automatic (synthesized) IPA-to-speech conversion?
[21:38:49] JanSch, there has not been any development work, but some people have discussed it over the years. See https://phabricator.wikimedia.org/T33221 and all the links therein.
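The two memcached settings discussed above live in /etc/memcached.conf on Debian/Ubuntu-style installs; a sketch (the 512 MB figure is an example, not a recommendation):

```
# /etc/memcached.conf — sketch of the two options discussed above.
# Cache size in MB; the distribution default of 64 is far too small
# for a busy MediaWiki:
-m 512
# Listen only on localhost so memcached is not exposed to the network:
-l 127.0.0.1
```

On the MediaWiki side, $wgMainCacheType = CACHE_MEMCACHED together with $wgMemCachedServers is what points the wiki at this daemon.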
(especially the comments since 2016)
[23:24:24] quiddity, thanks! lexconvert + eSpeak was indeed what I was looking at. I don't think uploading a file for each IPA string is a solution that will scale. I think it would be easier to either allow people to load the existing JavaScript-based solution (downside: it's 2 MB) or run an IPA-to-speech API somewhere (with caching).