[15:46:15] In my extension special page on wiki1 I want to run this piece of code on wiki2 (another wiki): WikiPage::factory( $page )->doEditContent( $textObject, $summary, EDIT_NEW, false, $userObj )
[15:46:40] how can I do that without having the page creation log entry on wiki1?
[15:47:31] page creation log? there's a revision history, but page creation doesn't generate a log entry
[15:48:35] sorry, meant recentchanges
[15:48:48] mark the edit as a bot edit?
[15:49:27] I tried $dbw->selectDB( 'wiki2' ); $dbw->selectDB( 'wiki1' ); but it would still end up in the recentchanges on wiki1
[15:49:46] the creation is done on wiki2, though
[15:50:47] changing DBs like this is very scary
[15:51:03] and the revision entry is properly inserted on wiki2. it's just that something is inserting the entry in RC on wiki1
[15:51:19] you can call the api on wiki2 from wiki1 to create the page, logging in first with a bot account, so the page is created as a bot. This is the safest way to go
[15:51:38] then it doesn't matter if the wiki is on the same server or a remote one
[15:53:11] I don't want to create an account for it. another option is to put the code in a maintenance script and use exec() in the extension
[17:39:24] hi
[17:39:32] I am new here
[17:39:48] how is it going here?
[18:10:31] hi peeps!
[18:10:47] I have a question regarding the dump files
[18:12:04] take metawiki-latest-pages-meta-history1.xml.7z, for example: does this file contain the edit history for the pages?
[18:15:54] atbe: Yes, it should.
[18:17:00] MTres19[m]: thanks for the confirmation. Is it required for me to build up all of the bz2 fragments before loading them into sql? Or can I download metawiki-20170101-pages-meta-history1.xml.bz2 without downloading 2, 3 and 4?
[18:17:59] atbe: I don't know for sure, but you probably do need all of them.
[18:18:57] MTres19[m]: sounds good.
[18:22:07] MTres19[m]: one last question: say I am downloading the dumps from this folder, https://dumps.wikimedia.org/metawiki/20170101/. This contains the dumps for the month prior, correct?
[18:22:39] or is it an aggregation of all data prior?
[18:23:17] I think it's an aggregation.
[18:24:02] Data isn't wiped from the databases after it's dumped, so it is also available in subsequent dumps.
[18:25:38] MTres19[m]: eeek. okay. and I'm assuming incremental data dumps for, say, month slices at a time would not be available? https://meta.wikimedia.org/wiki/Data_dumps#Summary talks about a "friendlier" sized dump? Where are those located? The bottom of the page did not have any info on that
[18:25:50] https://dumps.wikimedia.org/other/incr/
[18:25:52] ah, found it
[19:42:42] Hi
[19:42:57] I have a problem
[19:43:11] How do you do a #switch which is a range?
[19:44:31] you'd probably do an #ifexpr instead
[19:45:12] MatmaRex: What I want to templatise is this: http://copyright.cornell.edu/resources/publicdomain.cfm
[19:45:30] So I can determine what's okay for Commons :)
[19:45:53] yeahhh. you'll need #ifexpr, and a lot of them, alas.
[19:46:03] It's either going to need a LOT of sub-templates or a #range expr
[19:46:21] Where do I suggest new parser functions?
[19:49:09] why would you need a new parser function? :o
[19:50:11] this is just something like:
[19:52:02] {{#ifexpr: {{{date}}} < 19230101 | None | {{#ifexpr: {{{date}}} >= 19230101 and {{{date}}} <= 19771231 | Published without a copyright notice | {{#ifexpr: {{{date}}} >= 19780101 and {{{date}}} <= 19890301 | Published without notice, and without subsequent registration within 5 years | ........... }} }} }}
[19:52:05] and so on, and so on
[19:52:15] Okay
[19:52:31] I think we are going to have to work on this together
[19:53:03] Problem 1 - Force a sane value out of {{{year}}}: nil, empty string, -1
[19:53:28] 4 possible input conditions but ifeq only accepts 2 arguments ... hmmm....
[19:53:34] by that table, it looks like you need the full date, not just the year
[19:54:38] i'm not sure if it's worth it, trying to express that as a template, though… i'm sure there will be exceptions and it'll all get hopeless
[21:06:22] Hi. Where can I find the portletIds of a certain Wikipedia?
[21:09:00] Xxmarijnw: right-click and choose "inspect element"
[21:09:33] Thank you very much, Vulpix
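
The API approach suggested at 15:51:19 could look roughly like the sketch below. This is a minimal, hedged example, assuming wiki2 exposes its action API at a hypothetical https://wiki2.example.org/w/api.php and that a bot account named ExampleBot exists there; the URL, credentials, title, and text are all placeholders, not values from the conversation.

```php
<?php
// Sketch: create a page on wiki2 via its action API, logged in as a bot.
$api = 'https://wiki2.example.org/w/api.php'; // hypothetical endpoint

function apiRequest( $api, array $params, $cookies ) {
	$params['format'] = 'json';
	$ch = curl_init( $api );
	curl_setopt_array( $ch, [
		CURLOPT_POST => true,
		CURLOPT_POSTFIELDS => http_build_query( $params ),
		CURLOPT_RETURNTRANSFER => true,
		CURLOPT_COOKIEJAR => $cookies,  // persist the session between calls
		CURLOPT_COOKIEFILE => $cookies,
	] );
	$result = json_decode( curl_exec( $ch ), true );
	curl_close( $ch );
	return $result;
}

$cookies = tempnam( sys_get_temp_dir(), 'mwcookies' );

// 1. Fetch a login token, then log in with the bot account.
$loginToken = apiRequest( $api, [
	'action' => 'query', 'meta' => 'tokens', 'type' => 'login',
], $cookies )['query']['tokens']['logintoken'];

apiRequest( $api, [
	'action' => 'login',
	'lgname' => 'ExampleBot',      // placeholder bot credentials
	'lgpassword' => 'botpassword',
	'lgtoken' => $loginToken,
], $cookies );

// 2. Fetch a CSRF token and create the page as a bot edit.
$csrfToken = apiRequest( $api, [
	'action' => 'query', 'meta' => 'tokens',
], $cookies )['query']['tokens']['csrftoken'];

apiRequest( $api, [
	'action' => 'edit',
	'title' => 'Some page',
	'text' => 'Page text',
	'summary' => 'Created from wiki1',
	'bot' => 1,
	'createonly' => 1,  // fail instead of overwriting an existing page
	'token' => $csrfToken,
], $cookies );
```

Because the edit runs entirely against wiki2's API, nothing touches wiki1's recentchanges table, and the bot flag hides the edit from wiki2's default RecentChanges view.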
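
The maintenance-script alternative mentioned at 15:53:11 might look like the following sketch, written against the MediaWiki 1.27-era WikiPage API used in the conversation. The class and file names are invented for illustration; EDIT_SUPPRESS_RC is the core edit flag that skips the recentchanges insert altogether, which the conversation did not mention but which fits the stated goal.

```php
<?php
// Hypothetical maintenance script: create a page without an RC entry.
require_once getenv( 'MW_INSTALL_PATH' ) . '/maintenance/Maintenance.php';

class CreatePageQuietly extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->addDescription( 'Create a page without a recentchanges entry' );
		$this->addArg( 'title', 'Title of the page to create' );
		$this->addArg( 'text', 'Wikitext of the new page' );
	}

	public function execute() {
		$title = Title::newFromText( $this->getArg( 0 ) );
		if ( !$title || $title->exists() ) {
			$this->error( 'Bad or already existing title', 1 );
		}
		$page = WikiPage::factory( $title );
		$content = ContentHandler::makeContent( $this->getArg( 1 ), $title );
		$user = User::newSystemUser( 'Maintenance script' );
		// EDIT_SUPPRESS_RC keeps the edit out of recentchanges entirely.
		$page->doEditContent( $content, 'Automated creation',
			EDIT_NEW | EDIT_SUPPRESS_RC, false, $user );
	}
}

$maintClass = 'CreatePageQuietly';
require_once RUN_MAINTENANCE_IF_MAIN;
```

In a wiki-farm setup this could be run against the target wiki with something like php maintenance/createPageQuietly.php --wiki=wiki2 'Title' 'Some text', invoked via exec() from the extension on wiki1 as proposed; since the script then executes in wiki2's context, nothing is written to wiki1's tables at all.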