[02:42:50] Hi there
[02:42:57] I'm about to tear my hair out
[02:43:35] I'm trying to import the infobox template from Wikipedia and I'm getting an error that apparently nobody has ever had
[02:43:53] (or survived)
[02:44:11] On import, I'm getting `Exception encountered, of type "Error"`
[02:44:32] And that's that. Nothing descriptive, at all.
[02:51:14] SmashShock: Add $wgShowexceptionDetails = true; to LocalSettings.php
[02:51:40] And you'll get a disruptive error
[02:52:27] descriptive*
[02:52:29] Nope, same error.
[02:52:45] I've tried error_reporting( -1 ); ini_set( 'display_errors', 1 ); too
[02:52:48] Odd, usually that would work
[02:53:11] SmashShock: check your webserver's error logs, they might contain something useful
[02:53:20] Tried error.log, nothing.
[02:54:06] I'm thinking it's a PHP7 problem.
[02:54:18] SmashShock: what MW version?
[02:54:24] Considering literally nobody has had this specific error as far as I can Google
[02:54:42] 1.26.3
[02:55:07] PHP 7 should not be a problem
[02:55:21] It could be that some templates on Wikipedia use Lua
[02:55:33] And maybe it needs you to install Lua
[02:55:50] I've installed Extension:Scribunto and as far as Special:Version tells me, it's got it :(
[02:56:23] Lua 5.1.5
[02:56:35] That is really strange, usually one debugging method works
[02:56:55] Yeah, that's what I was thinking
[02:57:00] It sucks to have a unique error.
[02:57:14] Indeed
[02:57:30] Did you set $wgshowexceptiondetails at the bottom of LocalSettings?
[02:58:11] is it $wgShow... or $wgshow...
[02:58:40] $wgShowExceptionDetails
[02:58:51] Ahaaa
[02:58:54] Casing
[02:59:06] Call to undefined function mb_check_encoding()
[02:59:19] Ah, install mbstring maybe?
[02:59:20] Perfect
[02:59:22] Yep
[02:59:30] Thanks dude, appreciate it :)
[02:59:37] No problem :)
[03:03:19] `Import finished!` is exactly what I wanted to see :p
[03:04:16] Great :D
[03:07:09] Using MW 1.28 really teaches you a lot
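For reference, the debugging recipe from the thread above, as a minimal sketch at the bottom of LocalSettings.php. The log file path is an assumption; the casing of the variable name was the whole problem here, since PHP variable names are case-sensitive:

```php
// Bottom of LocalSettings.php, so nothing later overrides these.
// Note the exact casing: $wgShowexceptionDetails is a different
// variable and silently does nothing.
$wgShowExceptionDetails = true;   // show the real message behind "Exception encountered"
error_reporting( -1 );            // report every PHP error...
ini_set( 'display_errors', 1 );   // ...and print them to the page

// Optional: write a full debug trace to a file (path is an assumption).
$wgDebugLogFile = '/var/log/mediawiki/debug.log';
```

With that in place, the opaque `Exception encountered, of type "Error"` resolved to `Call to undefined function mb_check_encoding()`, i.e. the missing mbstring PHP extension.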
[12:32:37] hello humble sirs, i was given the task to move a Kerberos-enabled MediaWiki installation to a new host. now that i have migrated everything i am running into a problem: i cannot log in through Kerberos anymore, even though i very much seem to be logged in according to what i can tell from the debug output: http://paste.ubuntu.com/17742946/
[12:33:51] i am kind of confused by the fact that PlugAuth Session Check 2 is empty, is that my problem?
[13:01:17] morning
[13:52:54] I'm trying to understand the MediaWiki API.
[13:53:14] I'm making this request:
[13:53:23] api.php?action=query&format=json&prop=revisions&continue=&generator=categorymembers&rvprop=content&gcmtitle=Category%3AEnglish+footballers&gcmprop=title&gcmlimit=500
[13:53:44] I want to get the content of all the pages in Category:English footballers.
[13:54:02] It's returning the content for some of them, but it's leaving it out for others. Why?
[13:54:26] probably due to limits
[13:54:34] To get every page in the category, you'll need to paginate
[13:54:38] It isn't just giving me the first ones.
[13:54:43] It seems random.
[13:54:50] {"gcmcontinue":"page|41524d5354524f4e472c204a494d4d590a4a494d4d592041524d5354524f4e472028464f4f5442414c4c45522c20424f524e203139303429|26392438","continue":"gcmcontinue||"}
[13:55:04] Also, the documentation says the limit is 500.
[13:55:16] in one request
[13:55:22] Generally, just use limit=max
[13:55:32] unless you only want a number less than the limit
[13:56:22] So it gives me a JSON node thing for each of the pages, but it chooses only some of them, up to the limit, to give me the content for?
[13:56:49] IIRC, with a generator, you might not get the limit either
[13:56:50] Is there a way to only get the pages I can get the content for, or maybe to get the first ones instead of random ones?
[13:57:07] you'll be able to get content for them all
[13:57:39] The first page looks to have content, looking at what it returned for me
[13:57:53] "Arthur Aaron (footballer)"?
[13:57:57] yeah
[13:58:01] not for me
[13:58:11] {"pages":{"25087680":{"pageid":25087680,"ns":0,"title":"Arthur Aaron (footballer)","revisions":[{"contentformat":"text/x-wiki","contentmodel":"wikitext","*":"{{Use dmy dates|date=June 2016}}\n{{Use British English|date=June 2016}}\n{{Infobox football biography\n| name = Arthur Aaron\n| listas = Aaron, Arthur Frederick\n| image = \n|
[13:58:33] "Nathan Abbey" is the first one that I'm getting the content of.
[13:59:04] https://en.wikipedia.org/w/api.php?action=query&format=json&prop=revisions&continue=&generator=categorymembers&rvprop=content&gcmtitle=Category%3AEnglish+footballers&gcmprop=title&gcmlimit=1
[13:59:07] Oh, it's different if I use "limit=max" instead of "limit=500".
[13:59:16] gclimit*
[13:59:24] gcmlimit**
[13:59:24] lol
[13:59:58] Is it better to make fewer requests for more pages each?
[14:00:03] Not really
[14:00:11] oh
[14:00:22] the limits have been put in for a reason
[14:00:55] API:Etiquette says "try to combine things into one request".
[14:01:05] yeah, exactly
[14:01:06] so limit=max
[14:01:09] and paginate :)
[14:01:15] it's very acceptable
[14:02:32] "limit=max" will return only some of the pages but the content for each of them?
[14:02:41] right
[14:03:07] Limits with generators get a little fuzzy
[14:03:20] doesn't look like it...
[14:03:34] It's just leaving out the content for a different subset of the returned pages.
[14:03:36] If you start asking for many different things
[14:04:45] "limit=50" seems to work
[14:05:06] I don't see that limit documented anywhere, though. Is it higher for accounts with the bot flag?
[14:05:17] it will be
[14:05:38] It will be documented, or it will be higher?
[14:06:10] documented
[14:06:14] and higher for bot-flagged accounts
[14:06:35] /** Fast query, standard limit. */
[14:06:35] const LIMIT_BIG1 = 500;
[14:06:35] /** Fast query, apihighlimits limit. */
[14:06:35] const LIMIT_BIG2 = 5000;
[14:06:36] /** Slow query, standard limit. */
[14:06:36] const LIMIT_SML1 = 50;
[14:06:38] /** Slow query, apihighlimits limit. */
[14:06:41] const LIMIT_SML2 = 500;
[14:07:20] ah
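Pulling the thread's advice together, here is a minimal pagination sketch in PHP (the User-Agent string is a placeholder; per API:Etiquette, identify your script and don't request faster than you need to). With a generator plus prop=revisions, a single batch can return page entries that have no `revisions` key yet; their content arrives in a later continuation of the same query, which is why the loop merges the returned `continue` parameters back into the request:

```php
<?php
// Sketch: fetch the content of every page in a category via the Action API,
// following the server's 'continue' parameters until the set is exhausted.
// Assumes PHP 7+; the User-Agent value is a placeholder.
$endpoint = 'https://en.wikipedia.org/w/api.php';
$params = [
    'action'    => 'query',
    'format'    => 'json',
    'generator' => 'categorymembers',
    'gcmtitle'  => 'Category:English footballers',
    'gcmlimit'  => 'max',   // 500 normally, 5000 with the apihighlimits right
    'prop'      => 'revisions',
    'rvprop'    => 'content',
    'continue'  => '',      // opt in to the modern continuation format
];

// Wikimedia sites expect a descriptive User-Agent header.
$context = stream_context_create( [
    'http' => [ 'header' => "User-Agent: CategoryDumpSketch/0.1 (you@example.org)\r\n" ],
] );

do {
    $url = $endpoint . '?' . http_build_query( $params );
    $result = json_decode( file_get_contents( $url, false, $context ), true );

    foreach ( $result['query']['pages'] ?? [] as $page ) {
        // Some pages in a batch may lack 'revisions'; their content
        // shows up in a later continuation, so just skip them for now.
        if ( isset( $page['revisions'][0]['*'] ) ) {
            printf( "%s: %d bytes\n", $page['title'], strlen( $page['revisions'][0]['*'] ) );
        }
    }

    // Merge the server-supplied continuation values into the next request.
    $params = array_merge( $params, $result['continue'] ?? [] );
} while ( isset( $result['continue'] ) );
```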
[14:56:58] I installed a MediaWiki months ago in such a way that you had to log in to read and edit pages
[14:57:12] But now I want to open it up, can someone give me some pointers?
[15:01:21] garo - you probably need to remove, or comment out, the lines in LocalSettings.php that start with "$wgGroupPermissions".
[15:03:28] ok, thanks!
[15:50:36] hi! i'd like to know how to make a footer/box with a list of article links in a "numbered list", think year numbers or episode numbers, like on the 1984 page: http://s.jay2k1.com/3Ggd.png
[15:51:06] it's probably some kind of template but i'm not sure what to google
[17:01:28] hey, I was wondering how exactly I can support IPv6 addresses on my installation?
[18:07:54] Is there a way to get the current value of the #wpTextbox1 textarea in the WikiEditor?
[18:54:15] Hey there, I’ve got an academic journal that wants to do a MediaWiki install and they want to use a lot of the extensions that are on Wikipedia (citations and VisualEditor mostly) - does anyone know of a bundle that has all that ready to go?
[19:01:26] citations is included in the default tarball
[19:01:51] VisualEditor is not, and it's a bit annoying to set up relative to other extensions
[19:02:03] Yeah, that's what I read.
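As a starting point for that install, a hedged LocalSettings.php sketch for a 1.2x-era MediaWiki might look like the following. Cite ships with the tarball as noted above; VisualEditor additionally needs a running Parsoid service, and the URL and domain values here are assumptions for a default local setup. Check the VisualEditor page on mediawiki.org for your exact version:

```php
// LocalSettings.php additions (sketch for a 1.2x-era install; consult
// mediawiki.org for version-specific instructions).
wfLoadExtension( 'Cite' );          // <ref>/<references> support, bundled with the tarball
wfLoadExtension( 'VisualEditor' );

// Enable VisualEditor by default for all users.
$wgDefaultUserOptions['visualeditor-enable'] = 1;

// VisualEditor talks to a separate Parsoid service; these values are
// assumptions for a Parsoid instance running locally on its default port.
$wgVirtualRestConfig['modules']['parsoid'] = [
    'url'    => 'http://localhost:8000',
    'domain' => 'localhost',
];
```

Note that Wikipedia's citation templates themselves also depend on Scribunto (Lua), as the import thread at the top of this log illustrates.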