[01:49:55] !wg wgNamespaceRobotPolicies [01:49:55] https://www.mediawiki.org/wiki/Manual:%24wgwgNamespaceRobotPolicies [02:27:12] [02:27:38] !n00b | Amgine [02:27:43] !botnoob | Amgine [02:27:43] Amgine: I don't know everything about everything. I am mostly for lazy experienced users to echo quick answers to very common mediawiki questions. Please don't randomly experiment with me for help. Everything I know is at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm and you can visit #mwbot which shares the same db (read only) [02:27:49] !! [02:27:49] bla [02:27:55] [03:17:05] Does anyone have any idea what to do when someone creates an account on a site with CentralAuth and CentralAuth just doesn't create their global account? [03:23:08] Ask the stewards for help. [03:24:08] Amgine: I'm a steward and ops [03:24:13] I just don't understand CentralAuth very much [03:25:35] Ah, well, in that case, I suggest bugging Esther. Mostly because I haven’t a clue, and likely neither does xe. [03:27:05] Esther: help♥ [03:28:15] You know, this is a remarkably vanilla group. [03:28:16] PuppyKun: Where is this happening? [03:28:45] Esther: nenawiki.org (login.miraheze.org) [03:29:29] Don't use CentralAuth? [03:29:33] Is that an option? [03:29:38] I don't hear good things about it. [03:29:57] Esther: I think it's a little late for that. Although (for a completely unrelated reason) we might be moving that wiki to another setup, so maybe. [03:30:05] I've only like effed up one or two servers lately [03:30:22] but we were recently having a whole bunch of issues with our webservers not responding, so account creation probably goofed :< [03:30:46] Maybe. [03:31:52] You could insert the row yourself. [03:32:12] If it's actually missing and is supposed to be there. [03:32:53] PuppyKun: Or you could try one of these: https://github.com/wikimedia/mediawiki-extensions-CentralAuth/tree/master/maintenance [03:33:42] attachAccount.php maybe. [03:35:37] eew. [03:35:56] * Esther pets Amgine. 
[03:36:12] * PuppyKun likes petpets? [03:36:13] [03:36:31] Amgine: For waht? [03:36:33] what [03:36:34] Whaaaaaaaaaaat. [03:36:41] attachAccount.php [03:37:10] Are you a Miraheze user? [03:37:24] Or just interested in CentralAuth attachments?! [03:37:33] I doubt it has any documentation. [03:37:38] You can just read the code. [03:37:42] https://github.com/wikimedia/mediawiki-extensions-CentralAuth/blob/master/maintenance/attachAccount.php [03:37:46] watt [03:37:52] dammit. [03:37:58] It has a userlist argument. [03:38:06] Probably outputs something with --help. [03:38:11] Welp. [03:38:15] Let's see if that worked. [03:38:28] PuppyKun: Did you run attachAccount.php? [03:38:42] Amgine: Importantly, also has a --dry-run flag. [03:38:46] I ran migrateAccount, since that one actually creates the global user [03:38:51] attachAccount yelled at me [03:39:09] rude. [03:39:11] migrateAccount created a global user [and attached it? special:centralauth doesn't error] [03:39:17] Great. [03:39:21] then i did createLocalAccount --wiki loginwiki [03:39:46] And now here we are. [03:40:22] In purgatory. [11:05:06] <_Tuxedo> test [11:30:23] <_Tuxedo> I have a single instance wiki that has been running for some time and I would like to set up a second language version using the same sources (MW 1.27.3), symlinked to the same MW directory. [11:30:56] <_Tuxedo> While example.com is the existing wiki, example.fr should become the new. So I created a LocalSettings.php with the relevant redirects. 
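The repair sequence PuppyKun describes can be summarized as a command sketch. This is not verified against a live CentralAuth install: paths assume a standard checkout under extensions/CentralAuth, the username and list file are placeholders, and these scripts write to the centralauth database, so preview with --dry-run where the script supports it (the chat above confirms attachAccount.php does).

```shell
# 1) Try attaching existing local accounts to a global account.
#    attachAccount.php reads usernames from a file (--userlist); --dry-run previews.
php extensions/CentralAuth/maintenance/attachAccount.php --userlist users.txt --dry-run

# 2) If no global account exists yet, migrateAccount.php can create and attach one.
php extensions/CentralAuth/maintenance/migrateAccount.php --username 'SomeUser'

# 3) Finally, create the matching local account on the login wiki,
#    as PuppyKun did with createLocalAccount --wiki loginwiki.
php extensions/CentralAuth/maintenance/createLocalAccount.php 'SomeUser' --wiki loginwiki
```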
[11:31:29] <_Tuxedo> switch ($_SERVER["SERVER_NAME"]) [11:31:30] <_Tuxedo> { [11:31:32] <_Tuxedo> case "www.example.com": [11:31:33] <_Tuxedo> require_once "LocalSettings_en.php"; [11:31:35] <_Tuxedo> break; [11:31:36] <_Tuxedo> case "www.example.fr": [11:31:38] <_Tuxedo> require_once "LocalSettings_fr.php"; [11:31:39] <_Tuxedo> break; [11:31:54] <_Tuxedo> I was initially able to reach the installer at http://www.example.fr/wiki/mw-config/ [11:32:39] <_Tuxedo> There the first "MediaWiki 1.27.3 installation" page is presented and prompts selecting "Your language" and "Wiki language". [11:33:03] <_Tuxedo> However, when clicking next, I get the error: [11:33:46] <_Tuxedo> Warning: require_once(LocalSettings_fr.php): failed to open stream: No such file or directory in /usr/wiki/LocalSettings.php on line 9 [11:34:40] <_Tuxedo> In fact, there is no LocalSettings_fr.php, so this error can be expected! [11:35:08] <_Tuxedo> But how can I install a second wiki in this situation? [11:35:30] <_Tuxedo> What is the usual or a good procedure? [11:36:44] try creating a blank file as "LocalSettings_fr.php". [11:38:00] <_Tuxedo> Thanks for that tip :-) [11:39:35] <_Tuxedo> The installer now presents me with the message "An existing installation of MediaWiki has been detected. To upgrade this installation, please put the following line at the bottom of your LocalSettings.php: $wgUpgradeKey = 'da438d5b748e74bf';" [11:41:07] ugh.. [11:41:24] <_Tuxedo> I'm in a loop. 
[11:41:27] you can use "include_once" instead of "require_once" during setup: http://php.net/manual/en/function.include-once.php http://php.net/manual/ja/function.require-once.php [11:41:39] http://php.net/manual/en/function.require-once.php [11:49:28] <_Tuxedo> I tried includeonce [11:49:42] <_Tuxedo> I mean include_once [11:49:45] <_Tuxedo> Like this: [11:50:30] <_Tuxedo> switch ($_SERVER["SERVER_NAME"]) [11:50:33] <_Tuxedo> { [11:50:35] <_Tuxedo> case "www.example.com": [11:50:36] <_Tuxedo> include_once "LocalSettings_en.php"; [11:50:38] <_Tuxedo> break; [11:50:39] <_Tuxedo> case "www.example.fr": [11:50:41] <_Tuxedo> include_once "LocalSettings_fr.php"; [11:50:42] <_Tuxedo> break; [11:50:44] <_Tuxedo> default: [11:50:45] <_Tuxedo> echo "This wiki is not available. Check configuration."; [11:50:47] <_Tuxedo> exit(0); [11:50:48] <_Tuxedo> } [11:50:50] <_Tuxedo> ?> [11:51:30] <_Tuxedo> And a LocalSettings_fr.php with nothing but an upgrade key [11:52:13] try renaming "LocalSettings_fr.php" to something else [11:56:02] <_Tuxedo> I tried some other name but same result. The LocalSettings_something.php with a file containing only the $wgUpgradeKey = 'da438d5b748e74bf'; [11:56:36] <_Tuxedo> I get the same message: An existing installation of MediaWiki has been detected. ... etc. [11:57:45] <_Tuxedo> The upgrade key is displayed in the top-left corner of the screen, and I can click the "Continue ->" button but the same screen keeps showing each time. [11:58:23] <_Tuxedo> I made LocalSettings_something.php world-writeable in case that was necessary, but no difference. [11:58:58] <_Tuxedo> I'm not trying to upgrade an existing wiki, just install a new one. [12:00:06] <_Tuxedo> The wiki 1.27.3 sources already exist. So I'm just trying to install a second version with a different hostname, variables, database connection etc. [12:00:14] _Tuxedo: I forgot this: You must set up a new wiki for "www.example.fr" [12:04:30] _Tuxedo: rename "LocalSettings.php" to something else. 
and set up a new wiki for "www.example.fr"; [12:05:01] and the generated "LocalSettings.php" you should save as "LocalSettings_fr.php" [12:07:10] <_Tuxedo> I guess I must disable the existing wiki at example.com meanwhile. [12:07:47] <_Tuxedo> And not use a LocalSettings with the server switch statements. [12:08:11] <_Tuxedo> It seems to confuse the installer for the new host. [12:09:59] _Tuxedo: if you are able to access a shell, you can use this: https://www.mediawiki.org/wiki/Manual:Maintenance_scripts#MediaWiki_installs_that_use_symlinks [12:11:19] mistake: " https://www.mediawiki.org/wiki/Manual:Install.php " is correct [12:22:51] <_Tuxedo> Thanks. However, the documentation appears to be a bit thin on the install.php script. I think safer and maybe better is to disable the first wiki temporarily by removing the symlink to example.com, placing an under-maintenance notice at the domain, so it's not publicly open for anyone to click through the installers, and thereafter create the second (fr) installation. Thereafter save the [12:22:52] <_Tuxedo> LocalSettings_fr.php. Then add the symlink to the first again along with the host-switcher in LocalSettings.php. This is how I've created multiple instances of wikis sharing the same sources in the past, and it has worked. [12:26:06] <_Tuxedo> Or simply use server password protection for the multi-wiki directory while doing the installations. [12:33:24] <_Tuxedo> I think I'll give install.php a try. It exists in my maintenance folder. Maybe it's the better way. [12:33:40] <_Tuxedo> Many thanks for the tip. [12:37:04] Alternate way: [1] make a "setup-only source" and use it; [2] manually create a database from "source $IP/maintenance/tables.sql" and copy an existing LocalSettings.php as an alternative configuration. 
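The rename dance described here can be sketched end to end. This is a toy dry run in a temporary directory: the file contents are stand-ins for real configs, and the web-installer step (which would write the new LocalSettings.php) is simulated by writing a file.

```shell
set -e
wiki=$(mktemp -d)                                  # stand-in for the wiki root, e.g. /usr/wiki
echo '<?php // host-switching config' > "$wiki/LocalSettings.php"

mv "$wiki/LocalSettings.php" "$wiki/LocalSettings.php.bak"    # 1) hide the switch file so the installer runs fresh
echo '<?php // fresh fr config' > "$wiki/LocalSettings.php"   # 2) the web installer for www.example.fr writes this
mv "$wiki/LocalSettings.php" "$wiki/LocalSettings_fr.php"     # 3) keep the generated file under its own name
mv "$wiki/LocalSettings.php.bak" "$wiki/LocalSettings.php"    # 4) restore the host switch; both wikis now resolve

ls "$wiki"
```

The same four steps work on a live install; the only downtime is between steps 1 and 4.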
[12:38:10] [2] -> should use https://www.mediawiki.org/wiki/Manual:CreateAndPromote.php [12:39:52] _Tuxedo: easiest way is to temporarily rename LocalSettings.php to something else while running the installer, and then move it back after (if you're not worried about having a couple minutes of downtime) [12:42:55] <_Tuxedo> bawolff: Without having tested other alternatives, I think it's probably the best and quickest way. [12:43:19] I suspect install.php will also complain if you have an existing LocalSettings.php (not sure) [12:43:58] Doing maintenance/tables.sql by hand is a bit of a pain, as there are variables in it that mediawiki post-processes (albeit they don't matter all that much in many configurations) [12:51:18] <_Tuxedo> I'll definitely go for the tried-and-tested method of renaming LocalSettings.php, temporarily disabling the live wiki, while quickly creating the new wiki, and finally setting up the host-switch for both wikis in LocalSettings.php. [12:51:45] <_Tuxedo> Thanks again. I'm all set. [12:52:36] Ugh. Surely there must be an easy way to filter out l10n-bot from git log [12:55:55] ah, figured it out: git log --perl-regexp --author='^((?!Translation updater bot).*)$'. Thank you stack overflow [13:55:34] bawolff: Yay. [13:55:58] bawolff: git log --format=%ae | grep -v ;-) [16:18:24] Hello and Big Thanks Thanks Thanks for the awesome wiki engine [16:18:43] I have a problem with implementing SUL for consumerium.org [16:19:09] The user database was filled up with bot-generated accounts in 2008 and 2010 [16:19:28] So there are maybe like 150-200 genuine human accounts and 68k accounts in total [16:19:54] Is there a way to remove all accounts that have not made any edits? [16:20:24] coz it is looking like there is going to be en.consumerium.org and cmmns.consumerium.org [16:27:46] Any way I look at this.. 
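bawolff's git log filter can be checked against a throwaway repository. The `--perl-regexp` variant needs a git built with PCRE support, so a plain `grep -v` fallback (Akoopal's suggestion, filtering by name instead of email) is shown too; the author names below are made up, except the bot's real display name.

```shell
set -e
repo=$(mktemp -d)
cd "$repo" && git init -q
git -c user.name='Alice' -c user.email='alice@example.org' \
    commit -q --allow-empty -m 'real change'
git -c user.name='Translation updater bot' -c user.email='l10n-bot@translatewiki.net' \
    commit -q --allow-empty -m 'Localisation updates from https://translatewiki.net.'

# PCRE version: negative lookahead keeps commits whose author is NOT the bot
git log --perl-regexp --author='^((?!Translation updater bot).*)$' --format='%an'

# portable fallback: filter the author list after the fact
git log --format='%an' | grep -v '^Translation updater bot$'
```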
[16:27:48] !deleteuser [16:27:48] Deleting users is very messy and not recommended, because this breaks referential integrity in the database (they appear in many different tables like users, edit histories, recentchanges, preferences, etc). A safe solution is to block the users, and possibly rename them with . You can also try [16:28:17] starting the database from scratch.. not good, since then there are 2 databases, which is not what is wanted when we want a SUL solution [16:30:40] they won't appear in recent changes or edit histories [16:30:51] if there were no edits [16:31:27] it's still true that we prefer hiding users rather than deleting them [16:31:37] you could probably script that (hiding them) [16:31:51] so that non-admins aren't bothered with those usernames [16:33:44] https://www.mediawiki.org/wiki/Extension:UserMerge alternatively you could try merging all the bad ones and then hiding just that one... [16:33:56] no idea how well that extension works though [16:49:27] now a weird bug.. [16:49:45] The system stopped expanding http://en.wikipedia.org/wiki/Special:Search?go=Go&search=w:somearticlename| [16:49:57] prkl autoexpand. sorry my bad [16:50:44] [ [ w:Article name| ] ] no longer autocompletes the [ [w:Article name|Article name ] ] [16:50:48] what's going on? [17:12:07] I am interested in the localization support in mediawiki. My goal is to improve the support of BCP 47 language tags. For some reason, the documentation states that mediawiki uses ISO 639 language tags (which I think is wrong). I am new to the project though. Can you tell me where to start working on this issue? [17:27:52] problem: when I request "http://www.gutenberg.org/w/api.php ? action=query & format=xml & prop=revisions & rvprop=content & rvexpandtemplates=1 & generator=categorymembers & gcmlimit=max & gcmtitle=Category:Bookshelf", I get an XML doc where most of the elements have no content. [17:28:04] Is that supposed to be able to happen? 
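The "hide rather than delete" advice still needs a list of candidates. MediaWiki's `user` table carries a `user_editcount` column, so zero-edit accounts can be listed with one query; here is a toy SQLite stand-in (table and rows are invented — on a real wiki you would run the equivalent against MySQL, and note `user_editcount` can be NULL or stale until the initEditCount.php maintenance script has been run).

```shell
# candidates for hiding: accounts with no edits
out=$(sqlite3 :memory: <<'SQL'
CREATE TABLE user (user_name TEXT, user_editcount INTEGER);
INSERT INTO user VALUES ('HumanEditor', 42), ('SpamBot1', 0), ('SpamBot2', 0);
SELECT user_name FROM user WHERE user_editcount = 0 ORDER BY user_name;
SQL
)
echo "$out"
```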
[17:29:23] Note that some elements do have content: a element that contains a element that contains the page-text. [17:56:49] ximm: hi you might want to speak with the people in #mediawiki-i18n [18:36:31] anyone know how to get rid of -1 active users on Special:Statistics? initSiteStats.php doesn't fix it [18:38:16] update table manually?! [18:38:34] hm [21:49:26] On the mediawiki site I am working on, there are hundreds of pages that are almost identical. I have tried to reduce these pages by creating a template that would generate the page, once specific parameters are defined...but I can't figure out how to make the call to the template in order to generate a page. Is this possible? [21:53:42] I've tried to link to the template like this, [[{{Template:NameofTemplate|Param1=1|Param2=2|}}|Click here]], but that obviously isn't correct. [22:14:40] Amie: so rather than using a template to just generate the content of an existing page, you want to use a template to create a page that didn't even exist before? [22:15:14] jmdyck, yes, that is correct. [22:16:43] I suspect you can't do that with a template, but maybe with an extension. [22:17:46] you can prefill the edit box with the contents of a template by crafting a special link [22:17:55] extensions such as InputBox facilitate this process [22:18:48] for example, https://www.mediawiki.org/wiki/Nonexistent_page?action=edit&preload=Template:Help [22:18:59] the preload= parameter specifies what to fill the edit box with [22:19:12] Amie: does that suit your needs? [22:21:00] !inputbox [22:21:00] http://www.mediawiki.org/wiki/Extension:Inputbox [22:21:10] see also that link for a way to more easily automate it [22:22:26] also https://www.mediawiki.org/wiki/Manual:Creating_pages_with_preloaded_text for an overview of the preload feature itself (including how to specify template parameters) [22:26:00] -+ [22:27:36] Skizzerz, give me a minute to look through it - hopefully it will solve my problem. 
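The preload link Skizzerz describes can also carry template parameters: per Manual:Creating_pages_with_preloaded_text, `preloadparams[]` values are substituted for $1, $2, … in the preloaded template. A sketch of building such a URL (the parameter values are made-up examples; `%5B%5D` is the URL-encoded `[]`):

```shell
base='https://www.mediawiki.org/wiki/Nonexistent_page'
url="${base}?action=edit&preload=Template:Help&preloadparams%5B%5D=first&preloadparams%5B%5D=second"
echo "$url"
```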
[22:32:18] I don't think that is what I want, because I don't want the page to be created permanently. [22:36:13] If you have a few minutes, I can explain more in detail what I am trying to do. Maybe I'm going about it the wrong way. [22:55:24] Amie: sure, can you explain more what you're trying to do? [22:55:35] even if I don't know, by saying it here someone else may know something [22:55:54] Skizzerz...thank you. [22:57:37] I started working on a wiki site a few months ago that is about 8 years old. It has almost 85,000 pages. The problem is that the person in charge of the content on the wiki was not an engineer, and did not understand the purpose of templates... [22:58:47] Instead of using templates to design the pages, they created several templates for each page. In other words, no template is ever used more than once, and there are over 400,000 templates that have been created. [23:00:06] I'm not sure there is a way to fix this issue, but I am trying to solve the problem of users continuing to create individual pages and several templates for each topic. [23:01:23] They are currently trying to create a section for new users, that steps them through processes based on questions asked. I'd rather not have to generate a page for each of the set of questions. I'd like to create a temporary page, using templates, in order to get the user to the final page. [23:02:25] I've tried to use modals, but get very strange results. That is why I moved on to trying templates. [23:07:26] templates are meant for content re-use, they cannot (easily) be used to dynamically change page content [23:07:57] it sounds like you really want a wizard of some sort [23:07:59] Yeah, I am starting to get that. [23:09:04] Honestly, if the entire site could be completely redesigned, that would be the best option. But my hands are tied, since there is a lot of important data on the 84,000 pages. 
[23:09:05] as for the first thing, merging them to all use common templates is unfortunately just a pile of manual labor. Bots and global search/replace extensions can help somewhat though [23:10:34] I've thought of that, but 84k is a lot of pages. ugh [23:10:54] hence bots and search/replace extensions :) [23:11:05] provided the template names are at least somewhat consistent [23:11:36] like if Page A had Template:A_something and page B had Template:B_something, it should be easyish to script a thing that replaces {{PAGENAME}}_something with just something [23:11:38] they have been VERY consistent with naming conventions [23:12:16] I say easyish because I have no experience with doing anything like this myself, but others here are more experienced with bots [23:12:18] Yes, I agree. [23:12:42] I'm not experienced with bots at all. [23:13:20] If I get the number of templates reduced, that would speed up the site, right? [23:13:26] no [23:13:32] oh, okay. [23:13:57] Is there any reason to have one template used on 1000 pages, versus 1000 templates used on 1000 pages? [23:14:13] yes, editing the one template adjusts all 1000 pages that use it at once [23:14:19] rather than needing to make 1000 individual edits [23:14:41] right, of course. Redesign is easy. [23:15:15] Currently, the site is not mobile friendly, but I can't see fixing it without fixing the template issue. 
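If the naming really is that consistent, the per-page template calls can be rewritten mechanically before the one-off clones are deleted. A toy example of the text transformation itself (the `_infobox` names are invented; a real run would go through a bot framework such as Pywikibot or a search-and-replace extension acting on wikitext, not sed on a dump):

```shell
# each page currently calls its own clone, e.g. {{PageA_infobox}};
# rewrite any {{<something>_infobox call to the shared {{infobox
out=$(printf '%s\n' 'intro {{PageA_infobox|year=2017}} outro' |
  sed -E 's/[{][{][A-Za-z0-9]+_infobox/{{infobox/')
echo "$out"
```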
[23:15:18] but templates in and of themselves are probably not causing any performance impact on your site, given that page content is heavily cached by mediawiki [23:15:40] so it'll be evaluated once and then the cached version is served from then on out (until the page is edited again) [23:15:53] the caching is also why dynamic content is tricky [23:16:54] right now, I'm thinking the best solution for your other quandary is to make a new extension which defines a new Special page [23:17:00] that contains the questions and lets the users step through them [23:17:09] you'll need to dive into PHP coding for that, however [23:17:35] you could also do it with javascript (having all the questions on the same page but hidden via CSS display: none until JS reveals the appropriate questions) [23:17:40] but that seems more hacky [23:19:40] and for performance, see https://www.mediawiki.org/wiki/Manual:Performance_tuning -- you'll probably be most interested in the Object caching section. MediaWiki doesn't come with any by default, but adding it can greatly speed up the site [23:20:55] my recommendation would be APCu unless you're running a cluster with multiple webservers, in which case look into memcached instead [23:21:52] after object caching, tuning the database server would probably be step 2 if the object caching isn't impactful enough [23:26:11] We are running a cluster of multiple webservers. [23:27:22] What about storage...is a large number of templates going to require a lot of storage? [23:28:02] yes, each additional page uses up additional disk space in the database. However, deleting pages via the wiki does NOT remove their text from the db, as pages can be undeleted [23:28:13] so consolidating templates will not in and of itself save you any disk space [23:28:34] I assume there is a way around that? 
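The caching advice maps to a couple of LocalSettings.php lines; a minimal sketch assuming the memcached route for a multi-webserver cluster (the server addresses are placeholders):

```php
// Use memcached as the main object cache, shared by all webservers.
$wgMainCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = [ '10.0.0.10:11211', '10.0.0.11:11211' ];

// Single-webserver alternative: APCu via the local-server cache type.
// $wgMainCacheType = CACHE_ACCEL;
```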
[23:29:21] if you consolidate the templates, delete the old (unused) ones, and then run the deleteArchivedRevisions.php maintenance script, you'll recover all of the disk space used by those templates [23:29:41] excellent [23:29:53] "recover" [23:31:40] if you check the server's free disk space, it probably will not change, but this is due to how the tablespace files work. The tablespace will have a lot more free space in it for future text, but it won't return that disk space to the OS [23:32:27] (there are ways around this as well, particularly the OPTIMIZE TABLE statement) [23:36:32] * Skizzerz is going to be afk for a while, best of luck on your issues [23:36:48] if you have any other questions, just ask and wait around for an answer :) [23:38:13] Thank you so much, you have given me several things to start working on.
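The "recover" caveat can be seen in miniature with SQLite, where VACUUM plays roughly the role MySQL's OPTIMIZE TABLE plays for InnoDB tablespaces: deleting rows frees space inside the database file, but only rewriting the file returns it to the OS. (Toy schema only; a real MediaWiki text table lives in MySQL/MariaDB.)

```shell
set -e
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE text(old_text TEXT);
WITH RECURSIVE n(i) AS (SELECT 1 UNION ALL SELECT i+1 FROM n WHERE i < 2000)
INSERT INTO text SELECT hex(randomblob(512)) FROM n;
DELETE FROM text;                      -- rows gone, file size unchanged
SQL
before=$(wc -c < "$db")
sqlite3 "$db" 'VACUUM;'                # analogous to OPTIMIZE TABLE
after=$(wc -c < "$db")
echo "before=$before after=$after"
```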