[00:30:42] when a wiki install has more than one slave, how does routing work for queries when someone visits a page? is it random?
[00:44:38] e.g. a wiki with one master and two slaves. the master has 20% load, the first slave is at 50% load, the second slave is over 90%. in terms of page load, if it's slow for one person but not for another, it makes it seem like queries are routed randomly.
[00:47:23] "This would configure one master and two slaves, each slave getting an equal amount of read access load."
[00:47:30] 'load' => 1,
[00:47:44] c: Should be weighted random I guess... based on number of queries
[00:48:00] Not taking into account "slow" queries
[00:48:17] https://www.mediawiki.org/wiki/Manual:$wgDBservers#Details
[00:48:55] load is documented as "ratio of DB_SLAVE load, must be >=0, the sum of all loads must be >0"
[00:50:52] well, it's set up as 0, 1, 1. but the second slave is getting more than 50%
[00:51:40] 20.58% 51.02% 93.75% --- somewhat imbalanced
[00:51:45] but 50% of what metric?
[00:51:49] CPU
[00:52:09] connections are fine
[00:52:54] yeah, I don't think MW can guarantee that the CPU time is going to be equal
[00:53:06] It'd have to have more information about the SQL queries, which it simply doesn't have
[00:53:27] so if person X queries a page which gets sent to the DB with the most CPU usage, that's just bad luck then
[00:53:43] pretty much
[00:54:03] WMF "works around" that kinda by pointing "slow" queries or queries of certain types to specific database servers
[00:57:22] how do you tell mediawiki to point slow queries to a specific db server? i don't see a relevant option on Manual:$wgDBservers
[00:57:56] You'd need to do it via $wgLBFactoryConf
[01:25:59] * c blinks
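To make the weighting concrete, here is a minimal $wgDBservers sketch for the one-master, two-slave setup discussed above; hostnames and credentials are placeholders. As guessed at [00:47:44], the replica is picked per connection by weighted random over 'load', so equal weights mean an equal share of connections, not equal CPU time:

    // LocalSettings.php, a sketch only; hosts and credentials are placeholders.
    $wgDBservers = [
        [
            'host'     => 'db-master.example.org',
            'dbname'   => 'wikidb',
            'user'     => 'wikiuser',
            'password' => 'secret',
            'type'     => 'mysql',
            'load'     => 0, // no DB_REPLICA reads; writes go to the master regardless
        ],
        [
            'host'     => 'db-slave1.example.org',
            'dbname'   => 'wikidb',
            'user'     => 'wikiuser',
            'password' => 'secret',
            'type'     => 'mysql',
            'load'     => 1, // 1/(1+1): roughly half of the read connections
        ],
        [
            'host'     => 'db-slave2.example.org',
            'dbname'   => 'wikidb',
            'user'     => 'wikiuser',
            'password' => 'secret',
            'type'     => 'mysql',
            'load'     => 1, // equal share of connections; one expensive query can still pin this box
        ],
    ];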
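As for pointing "slow" queries at a specific server ([00:54:03], [00:57:56]): the mechanism is query groups, not automatic detection. Code that expects to run something expensive asks for a connection in a named group, and per-server 'groupLoads' weights decide which replica serves that group. A hedged sketch extending the array above; the 'vslow' and 'dump' group names follow WMF convention and are not special to MediaWiki:

    // Send the 'vslow' and 'dump' query groups to the second slave only.
    // Callers opt in explicitly, e.g. wfGetDB( DB_REPLICA, 'vslow' );
    // ordinary page-view queries are unaffected.
    $wgDBservers[2]['groupLoads'] = [
        'vslow' => 1,
        'dump'  => 1,
    ];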
[05:58:33] how does mediawiki/wikipedia store edits? Does it store a full copy of the new page for every edit?
[06:04:23] narutowaifu: in the `text` table, https://www.mediawiki.org/wiki/Manual:Text_table
[06:13:14] samwilson[m]: I've read that but I'm still confused. When I make a new edit, does mediawiki store both the old page and the new page from top to bottom? Or is there something more clever going on, like storing only the changed lines?
[06:21:10] narutowaifu: yes, it's a full new copy
[06:21:22] narutowaifu: though there are ways to compress historical revisions to just store diffs
[06:21:58] and those historical revisions have to be decompressed before they can be used?
[06:22:20] I mean used from the normal UI
[06:22:36] MediaWiki handles it transparently
[06:28:23] ok
[08:18:32] another question: let's say a page is vandalized. I click "undo" to revert the page to an older revision. Does mediawiki delete the most recent revision from the database entirely, or does it create a new revision with the content of the original page?
[08:34:20] narutowaifu, the latter
[08:34:41] undo is not revision deletion
[08:36:20] andre__: is there anything like "revision deletion" at all?
[08:38:26] narutowaifu, https://www.mediawiki.org/wiki/Manual:RevisionDelete
[08:38:58] narutowaifu, https://www.mediawiki.org/wiki/Help:Deletion_and_undeletion
[08:58:23] Hello bros, I'm now using MediaWiki 1.32.1, how can I import the full dump of Wikipedia?
[08:58:53] mwdumper only supports sql:1.25
[08:59:34] I tried to load enwiki-20190520-pages-articles-multistream.xml.bz2 into MariaDB, but it failed.
[09:04:56] graphecoboy_: Sorry, not a "bro". See https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps
[09:07:13] Haha, thank you very much, I have read this page. it recommends importing large dumps with mwdumper, however mwdumper only supports 1.25
[09:08:50] graphecoboy_, I wasn't joking.
[09:08:53] graphecoboy_, Did you run into an issue with mwdumper in 1.32?
[09:12:46] yes
[09:13:48] I tried to import enwiki-20190520-pages-articles-multistream.xml.bz2 into MariaDB, it failed to import
[09:14:10] maybe the enwiki-20190520-pages-articles-multistream.xml was exported by 1.34?
[09:15:19] below is the error info:
[09:15:20] ERROR 1064 (42000) at line 503: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near ''{{About|the international environmental organization}}\n{{Use dmy dates|date=Ma' at line 1
[09:19:04] Is it possible that it's a problem with the XML dump file?
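A note on the failure above: pages-articles-multistream.xml.bz2 is an XML dump of wikitext, not SQL, so feeding it straight to MariaDB fails as soon as the first chunk of wikitext reaches the SQL parser, which is exactly the error at [09:15:20]. The schema-independent route is MediaWiki's own importer; a sketch, assuming the dump sits in the wiki's root directory (Manual:Importing_XML_dumps warns this is slow for a full enwiki dump):

    # Stream the decompressed XML into the core maintenance importer.
    bzcat enwiki-20190520-pages-articles-multistream.xml.bz2 | \
        php maintenance/importDump.php --conf LocalSettings.php

    # Rebuild derived tables afterwards.
    php maintenance/rebuildrecentchanges.php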
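And to put a name on the revision compression mentioned back at [06:21:22]: core ships a maintenance script for it, maintenance/storage/compressOld.php, whose concat mode re-stores runs of old revisions as diffs against a base text; reads stay transparent, as noted at [06:22:36]. A sketch only; check Manual:compressOld.php for the full options, and back up the text table first, since the rewrite is not easily undone:

    php maintenance/storage/compressOld.php --type=concat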
[13:25:37] Hi! I'd like to do a wikiwar-oriented project for fun (and to learn stuff), and I would like to monitor things such as the number of clicks, links followed, and time to get from one page to another. I found the Wikimedia REST API, which gives me exactly what I need (the content of a Wikipedia page without the search bar or sidebar), however X-Frame-Options is set to "SAMEORIGIN" and that prevents me from displaying it in an iframe.
[13:28:54] I searched all day for a solution but couldn't find another API, or a parameter to pass to the API to disable that. I'm only a student in IT, so I might not fully understand why it's there and why I can't get rid of it. Is there any solution to my problem? Thank you in advance.
[14:01:20] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @awight & @tgr - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[14:03:10] Dampfengine_: $wgEditPageFrameOptions $wgBreakFrames $wgApiFrameOptions
[14:06:44] Dampfengine_: it prevents clickjacking and similar frame-based cross-origin attacks
[14:10:04] Oh ok, I looked those up, but that means I will never be able to display a page from "https://en.wikipedia.org/api/rest_v1/" in an iframe
[14:19:38] Dampfengine_: that's kind of the point of it, yes
[14:20:14] you can fetch the page via JavaScript and render it yourself
[14:24:30] Oh, I did not think about that. Thanks, I will try it!
[14:27:51] hi! I'm trying to wrap the main content of the pages of my wiki in a specific div, but I can't find a way
[14:28:59] i tried ParserSectionCreate, but the hook gets run several times, not just when creating sections
[14:29:24] so i end up wrapping unwanted elements
[14:51:19] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @awight & @tgr - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:49:58] I noticed that MediaWiki JavaScript modules will not go through the built-in minifier if they contain the string "/*@nomin*/". is this documented anywhere?
[15:50:48] What sort of documentation are you looking for? :)
[15:51:11] anything really, other than having to crawl through lots of code...
[15:51:18] the minifier was messing up my React app in strange ways
[15:51:28] since it was already minified, I just added that
[15:51:34] There's not much code for it tbh
[15:51:51] I suppose, but I didn't even know there _was_ a minifier, haha
[15:51:55] /** @var string JavaScript / CSS pragma to disable minification. */
[15:51:55] const FILTER_NOMIN = '/*@nomin*/';
[15:51:56] that is documented, at least
[15:52:01] if ( strpos( $data, self::FILTER_NOMIN ) !== false ) {
[15:52:01] return $data;
[15:52:01] }
[15:52:10] Reedy: right, once I found that, it was obvious
[15:52:39] but googling things like "mediawiki disable minifier" turned up nothing
[15:52:50] debug mode does that
[15:52:52] ?debug=true
[15:53:01] or there should be a $wg for it
[15:53:02] ok, but then it would just fail in production :P
[15:53:06] there is not a $wg for it
[15:53:08] afaik
[15:53:16] yes there is
[15:53:16] * The default debug mode (on/off) for ResourceLoader requests.
[15:53:16] *
[15:53:17] * This will still be overridden when the debug URL parameter is used.
[15:53:17] */
[15:53:19] $wgResourceLoaderDebug = false;
[15:53:23] I found some deprecated ones for things like max line length
[15:53:31] oh, ok. it doesn't mention the minifier by name, though
[15:53:38] sure
[15:54:27] FWIW, I'm not saying our docs are perfect, or even good ;)
[15:54:37] https://www.mediawiki.org/wiki/ResourceLoader/Developing_with_ResourceLoader#Debugging
[15:54:46] "Debug mode is designed to make development as easy as possible, prioritizing the ease of identifying and resolving problems in the software over performance. Production mode makes the opposite prioritization, emphasizing performance over ease of development."
[15:54:58] minification is a performance improvement
[16:09:10] I am a little surprised it messed your code up, actually
[16:09:22] not entirely sure what happened, but I don't have time to look into it either, haha
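To flesh out the suggestion at [14:20:14] for the wikiwar project: instead of framing the page, fetch the REST HTML and inject it into your own markup. X-Frame-Options doesn't apply because nothing is framed, and the REST API serves permissive CORS headers, so the cross-origin fetch goes through. A minimal JavaScript sketch; the endpoint is the rest_v1 one from the discussion, while the page title and the #game-frame container are made up for the example:

    // Fetch the rendered HTML of one page and display it locally.
    const title = 'MediaWiki';
    fetch('https://en.wikipedia.org/api/rest_v1/page/html/' + encodeURIComponent(title))
        .then(resp => resp.text())
        .then(html => {
            // '#game-frame' is a hypothetical element in the project's own page.
            document.querySelector('#game-frame').innerHTML = html;
        });

One catch to plan for: links in the returned HTML are relative (href="./Some_Title"), so tracking clicks means intercepting them and routing them back through the same fetch, which is conveniently also where the click and timing measurements would live.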
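On the div-wrapping question at [14:27:51]: ParserSectionCreate runs once per section, which is why unwanted elements get wrapped. Since MediaWiki 1.30 the parser already wraps rendered page content in <div class="mw-parser-output">, which may be enough by itself; failing that, a late output hook fires exactly once per page view. A sketch, assuming the OutputPageBeforeHTML hook and a made-up class name:

    // LocalSettings.php sketch: wrap the rendered article HTML in a single div.
    // OutputPageBeforeHTML runs once, after parsing, so nothing is wrapped twice.
    $wgHooks['OutputPageBeforeHTML'][] = function ( OutputPage $out, &$text ) {
        $text = '<div class="my-content-wrapper">' . $text . '</div>';
        return true;
    };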
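For the record, the minifier thread above amounts to three switches of increasing scope, sketched here as config comments:

    // 1) Per file: ship the literal pragma /*@nomin*/ anywhere in the file, and
    //    ResourceLoader serves that file unminified (the FILTER_NOMIN check above).
    // 2) Per request: append ?debug=true to the URL; debug mode skips minification.
    // 3) Site-wide default, for development wikis only, in LocalSettings.php:
    $wgResourceLoaderDebug = true;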