[02:08:18] Any easy way to just get 20,000 or so wiki page links from my wiki? For benchmarking purposes
[02:09:50] CZauX: what do you mean by page links? URLs to pages?
[02:09:55] yes
[02:13:03] provided you have that many pages you can just {{subst:Special:AllPages}} (or Special:PrefixIndex/whatever if you only want what starts with something)
[02:13:46] You could query the page table to get a list of page titles, and concat the URL before the page title (you could do this in the same SQL query). Note, however, that the page table stores the namespace as a number, so namespaces other than the main one would require manual substitution
[02:14:32] that is if you meant to just post the pages, otherwise do what Vulpix said
[02:15:24] Basically I'm just importing the public XML dump of mediawiki.org, then throwing URLs at the Vegeta benchmark tool to see how well my stack holds up against non-cached, page-distributed loads
[02:15:33] I'll try that out, ty
[02:15:41] I think {{subst:Special:AllPages}} limits the list to less than 500 or so
[02:27:38] Vulpix, IIRC new wiki importers rely on that to easily get a list of pages to export from Incubator (prefixindex/wx/xx), so it's unlikely to be THAT limited
[02:28:57] arseny92: trust me, or else anybody could bring wikipedia down by doing that :)
[02:30:49] I mean, limited to about 2k (the norm for a new-project startup) would make sense, but 500 is kinda restrictive
[02:31:47] I don't remember that limit being configurable
[02:36:19] there's a non-configurable limit of 345 (yes, as random as it is)
[02:38:35] https://phabricator.wikimedia.org/diffusion/MW/browse/master/includes/specials/SpecialAllPages.php;000a3e1581d71760c089abbf9dbcfd79897159b1$37
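A minimal sketch of the page-table query described above, assuming MySQL, the default (empty) table prefix, and a placeholder hostname and article path; restricting to the main namespace sidesteps the namespace-number substitution issue:

```sql
-- Build full URLs for up to 20,000 main-namespace pages straight from the
-- page table. page_title is the DB key (underscores instead of spaces),
-- which is fine in a URL. Replace the base URL with your wiki's own.
SELECT CONCAT('https://wiki.example.org/wiki/', page_title) AS url
FROM page
WHERE page_namespace = 0
LIMIT 20000;
```

Each output line, prefixed with "GET ", is already in the targets-file format that Vegeta's attack command consumes.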
[03:41:08] Does anyone know what extension is being used to provide this create page on this wiki: http://gundam.wikia.com/wiki/Special:CreateTemplate ? I've been trying to find it but haven't had much luck :(
[03:44:26] Kalshion, looks like it comes from their version of the SemanticForms extension
[03:44:31] https://github.com/Wikia/app/blob/906a781ece15c8a5a76040bd941979ce3ab597db/extensions/SemanticForms/SemanticForms.php#L142
[03:45:05] It is in the base Semantic Forms as well. https://www.mediawiki.org/wiki/Extension:Semantic_Forms/Special_pages
[03:45:41] Hmmm, if it's in the base, then I'd assume it would be freely available and wouldn't need to be installed separately, so I'm wondering why it doesn't show up on mine - unless I made a mistake during installation. hmmm
[03:45:57] You may need to update your extension.
[03:49:56] Ah ok, thank you for the assist :)
[03:56:55] if I have a question about compiling mediawiki-php-luasandbox not working for HHVM, is this channel or #wikimedia-dev preferable?
[05:07:14] Hi toulouse. Either channel is fine.
[05:07:30] ah alright
[05:07:50] so basically, compiling for HHVM isn't working, in that it compiles fine but the symbols aren't right
[05:08:20] it looks like it does C++ name mangling of the luaopen_* functions, but inspecting the libraries, they have C linkage
[05:08:39] I think this can be resolved with an ifdef'd (for __cplusplus) extern "C" { ... } block around the lua imports
[05:08:44] which creates another issue
[05:08:48] at least, on CentOS 7
[05:09:10] I think the cmake script for finding Lua is specific to (probably) Debian
[05:09:29] in mediawiki-php-luasandbox
[05:09:43] and Macs, from the look of it
[05:15:51] LUA_LIBRARIES is empty even with liblua and libluajit installed, unless the find_library call is coerced in ways I'm not sure are correct
[05:16:09] the lib files on a 64-bit machine are in /usr/lib64
[05:16:33] it finds the includes just fine
[05:17:33] I added lib64 to PATH_SUFFIXES, /usr to PATHS, and liblua and lua to NAMES, but it's worth noting that it's finding a C library and linking it as such
[05:19:40] tl;dr: HHVM+CentOS => liblua has C linkage, but the mediawiki-php-luasandbox files are compiled as C++.
[05:21:17] the linkage is easily fixed; the cmake script needs to find the Lua library with the right linkage
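A minimal sketch of the two fixes described above. First, the ifdef'd extern "C" wrapper around the Lua includes (the standard Lua header names are assumed):

```c
/* Under a C++ compiler, declare the Lua API (including the luaopen_*
 * functions) with C linkage, so the linker looks for the unmangled
 * symbols that liblua actually exports. The guard keeps the same
 * header usable from plain C. */
#ifdef __cplusplus
extern "C" {
#endif

#include <lua.h>
#include <lualib.h>
#include <lauxlib.h>

#ifdef __cplusplus
}
#endif
```

Second, a rough sketch of the find_library() adjustment, reconstructed from what toulouse reports trying rather than from the upstream cmake script, so that LUA_LIBRARIES gets populated on a 64-bit CentOS 7 host where the libraries live in /usr/lib64:

```cmake
# Search lib64 in addition to the default suffixes, and try both plain
# Lua and LuaJIT library names (find_library adds the "lib" prefix itself).
find_library(LUA_LIBRARIES
  NAMES lua lua5.1 luajit-5.1
  PATHS /usr /usr/local
  PATH_SUFFIXES lib lib64
)
```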
[07:45:38] *scratches head* no activity at all in any of the three channels for the three hours since I asked. Is there a better time?
[10:16:18] Hello, I have a problem logging in with the bot password feature on a third-party wiki that uses MediaWiki. I'm using the simplemediawiki library for Python. When I try to log in, the MediaWiki API responds with result Aborted and reason "Cannot log in when using MediaWiki\Session\BotPasswordSessionProvider sessions"
[10:16:30] What could be the reason for this behaviour?
[16:34:31] Hello
[16:35:04] hi
[16:35:28] How's it going andre__?
[16:36:53] Would anyone here be able to help me install Scribunto on a MediaWiki 1.27 install? I've got the files on the server, but when I edit LocalSettings.php to require it, it just blanks the site
[16:37:21] I've already installed two other extensions, so I know I'm not just completely useless at the process
[16:38:41] Pongles, https://www.mediawiki.org/wiki/Manual:Errors_and_symptoms#You_see_a_Blank_Page
[16:38:53] ah, thank you
[16:44:09] ...and now for some reason there is no error and everything works perfectly
[16:44:27] andre__: can you do that for me, every time?
[16:44:38] just poof away the errors
[21:57:55] Hmm, can't get the --report option to work in importDump.php
[21:58:17] Doesn't show anything during the import, but I see the imported pages on the site
[22:00:12] Oh, had the quiet option on
[22:00:16] ignore me
[22:05:23] * Trela runs CZauX --quiet. :)
[22:05:51] :^)
[23:41:15] I am not sure where to look for help. Our admin is busy, so I thought I would fish around on the internets
[23:42:18] https://wiki.pumpingstationone.org/index.php?title=Special:UserLogin
[23:42:32] "Wiki uses cookies to log in users. You have cookies disabled. Please enable them and try again."
[23:42:49] CarlFK: Have you tried enabling cookies?
[23:43:09] yep
[23:43:29] Well, had to give it a shot
[23:43:35] CarlFK: What browser are you using?
[23:43:35] marktraceur: try with any user/pw you want, I expect you will get the same message
[23:43:47] both FF and Chrome
[23:44:11] Weird.
[23:44:23] It looks like the request had a cookie set, but the response doesn't
[23:45:37] Is your PHP session folder writable?
[23:46:02] (also wondering about the value of session.referer_check in php.ini)
[23:46:18] I am just a user - but it has been up for years. "Something happened" about a week ago, maybe two.
[23:47:06] https://www.mediawiki.org/wiki/Topic:Rg3w5u0e70fs8l4e might provide some more hints/ideas
[23:47:33] you need to ask the admin if they did anything around the time it happened (updated MW? updated PHP? updated the server distro? etc etc)
[23:48:27] I was hoping to hear something like "security update 123 is causing that, apply this other patch..."
[23:48:46] There's no way we've had 123 security updates
[23:49:04] "I tried to login today, and now it works :S. Makes no sense, but I'm going to export my wiki just in case." https://www.mediawiki.org/w/index.php?title=Topic:Rg3w5u0e70fs8l4e&topic_showPostId=rg5nq3ac4vv1y9b3#flow-post-rg5nq3ac4vv1y9b3
[23:49:34] CarlFK, do you have access and rights to edit your "LocalSettings.php" file?
[23:49:38] ohh ok
[23:50:01] <- dumb user.
[23:51:45] I see a few possible solutions in that thread. I'll pass them on. Thanks.
[23:54:36] Hi, I updated my wiki from 1.26.2 to 1.27, but InstantCommons (basic config $wgUseInstantCommons) does not work anymore (red links instead of images). I've read the news about hotlinking, but that should not have any effect on this config, right? Any clue?
[23:55:44] PHP 5.5.38
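For reference, the "basic config" mentioned above is a single line in LocalSettings.php; the setting itself is real, the surrounding context is just an illustrative sketch:

```php
<?php
// In LocalSettings.php: use Wikimedia Commons as a foreign file repository,
// so [[File:...]] links resolve against Commons when no local file exists.
$wgUseInstantCommons = true;
```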