[00:03:13] hmmm, it seems from googling that the 'No data received' error can be caused by many different things [00:05:16] hi [00:05:32] is it possible to specify which php version we can use on mediawiki install [00:05:40] if the system has a bunch of them» [00:05:41] ? [00:14:25] biberao: you'd have to set that up at the webserver level [00:15:21] like [00:15:32] it has php53 54 52 [00:15:42] i have made php54 for my shell [00:15:48] but it still gets 53 [00:17:06] ideas jackmcbarn ? [00:17:08] biberao: you have to set your web server up to do that. mediawiki has no control at that time [00:21:16] ok [00:21:17] thanks [00:29:28] what is the name of the extension that makes my mediawiki have the source editor that wikipedia has? [00:29:39] bugzee: CodeEditor? [00:33:52] hmmm, zend_mm_heap corrupted doesn't sound good [00:45:38] commenting out the Scribunto extension solves my timeout issue [00:48:05] MatmaRex: once I was able to save my SimpleNavbox Template with the mw-collapsible class, the Collapse/Expand feature just worked [00:48:30] wmat: yay [00:48:57] MatmaRex: well, half a yay, as I still can't save edits to templates without disabling Scribunto :/ [04:53:52] Anyone here familiar with the DatabaseTable wrapper functions? [06:09:09] Hi gbah. What's your question? [06:18:12] I'm wondering about aliasing table names in a select() with join conditions [06:19:11] Say I have tables $tblNameA and $tblNameB, and do a $dbw->select(array('A' => $tblNameA, 'B' => $tblNameB), ...) [06:19:41] for the join condition, do I do array($tblNameB => array('INNER JOIN', ...)) or array('B' => array('INNER JOIN', ...)) ? [06:21:29] 'B' => ... sounds like it would make more sense, but the docs weren't clear on this from what I saw [06:22:01] I think you can do ->select( array( 'A as tblNameA' [06:25:36] legoktm: Yeah, that should work as well. I prefer ->select(array('A' => $tblNameA, ...), ...) to let mediawiki sort out the specific DB syntax, but I'm wondering about the join condition argument in particular [06:26:01] yeah, I don't know about that [06:26:07] try it and see what happens? [06:26:36] Can't test it at the moment, so I thought I'd ask here, otherwise I would've tried :) [06:29:38] But I'll just try it out later of course, if no one knows off the top of their head [08:25:32] hello [08:26:00] how do i link to a subpage without the parent page's name to appear in the link? [08:26:24] cause [[/subpage/]] doesn't work for me [08:27:20] Then you didn't enable subpages [08:27:55] # Enable subpages in all namespaces [08:27:55] $wgNamespacesWithSubpages = array_fill( 0, 200, true ); [08:28:15] API will tell you if that worked, e.g. https://it.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=namespaces [08:30:24] http://joy.indivia.net/Wiki/api.php?action=query&meta=siteinfo&siprop=namespaces [08:30:37] subpages should be in the python lesson namespace.. [08:31:10] but its weird cause i can see the parent page in the content of the subpages when i open them.. [08:31:18] so it should be working [08:31:42] I don't see any such namespace [08:31:53] me neither [08:32:06] ... [08:32:21] but yes, looks like you have subpages everywhere, that's a good start :) [08:32:47] $wgNamespacesWithSubpages[NS_PYTHON_LESSONS] = true; ? [08:33:20] let's try it [08:35:21] no it doesn't work [08:36:51] http://www.mediawiki.org/wiki/Manual:Using_custom_namespaces ? [08:41:44] Hi ! 
I'm using the web api locally to insert data (using wikibase extension) and it's VERY slow (i can do about 10 calls/min), can I expect better results using internal api ? [08:43:54] Nemo_bis now the namespace is appearing in the list give from the api [08:44:29] but the trace of the parent page in the subpages is not appearing.. [08:45:30] do i have to use $wgNamespacesWithSubpages = array_fill( 0, 200, true ); and $wgNamespacesWithSubpages[NS_PYTHON_LESSONS] = true; in combination? [08:45:41] it's not enough just one of them? [08:49:14] olivd: define "locally" and with what framework, pywikibot [08:49:16] ? [08:49:49] softplay: you should first figure out why this namespace is not in the list of namespaces :) [08:50:10] now it is in the list of namespaces [08:50:10] i added it [08:50:15] Nemo_bis: framework : php but using http queries, locally : mediawiki is installed on the same machine which runs the api calls [08:51:53] Nemo_bis: I was wondering if there's some "cooldown" time between requests or something to avoid too massive inserts [09:28:56] olivd: there shouldn't be, at least not one that massive. [09:29:31] ok I'll try with the internal API if it's better [09:29:42] olivd: at first guess, i would suspect you are not using memcached on your server [10:14:35] hello, is anyone here? [10:15:00] anyone here? [10:17:47] wow, 2 minutes... [11:26:20] Nemo_bis the links to the subpages of the namespace doesn't work still [11:26:43] even if the namespace is registered and listed in the api file you've said to me [11:26:56] where am i wrong? [11:30:13] hey there, anyone here who has some time tp help me with some problems? [11:31:33] i've got following problems: everytime i want to open a special page, i get an internal server error [11:32:10] there's nothing suspecious in the wiki-log, and the apache-log just says "End of script output before headers" [12:04:35] anyone? [12:11:03] mitch: no idea, i guess you could try the debugging instructions we have? [12:11:04] !debugging [12:28:44] MatmaRex: tried it allready: no result... [12:30:38] (can't get any error message, wether from the apache neither from the wiki-log) [12:31:01] hey folks! I deleted *all* my MediaWiki files, EXCLUDING the DB. I didn't had any custom themes or uploaded files, just information in the DB. is there any way to get a fresh installation and put the files from there? note: I've lost the LocalSettings file also :( [12:31:28] except this tiny thing: "End of script output before headers: index.php [12:31:28] " [12:36:24] BUMP? [12:39:15] razva: backup your database [12:39:40] set up a new wiki [12:40:01] the installation routine will rcognize your database [12:40:27] run maintenance/update.php via commandline [12:40:31] thats it [13:11:04] mitch: It seems something broke in the latest release. You can just disable error reporting in LocalSettings.php [13:12:02] mitch: And it works. I use this extension Xdebug Output Toggler for chrome which toggles displaying the XDebug output [13:14:36] this is a pain in the arse ... "The search tab will soon be removed from user preferences. You will be able to set your search preferences on Special:Search." [13:17:14] mitch: unfortunately I don't have access to #bash (shared hosting) [13:25:36] sDrewth: link? [13:31:14] Amgine: https://meta.wikimedia.org/wiki/Special:MyLanguage/Tech/News/2014/23 [13:31:45] I will have to wait to see it, but I liked it where it was [13:31:52] * sDrewth shrugs [13:32:27] Agrees. User preferences should be in the User preferences. Period. 
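Returning to the select() join-aliasing question from earlier in the log: a minimal sketch, assuming invented table and column names. In the database wrapper's select(), the join-conditions array appears to be keyed by the alias given in the table list ('b' below) rather than by the underlying table name, since that alias is what the generated SQL uses; treat this as an illustration to check against the Database::select() documentation, not a definitive answer.

```php
<?php
// Sketch only: 'my_table_a', 'my_table_b' and the columns are invented for
// illustration. The join-conditions array is keyed by the alias ('b'), which
// is how the generated SQL refers to the joined table.
$dbw = wfGetDB( DB_MASTER );

$res = $dbw->select(
	array( 'a' => 'my_table_a', 'b' => 'my_table_b' ),       // alias => real table name
	array( 'a.id', 'a.title', 'b.value' ),                   // columns to fetch
	array( 'a.id > 100' ),                                    // WHERE conditions
	__METHOD__,
	array( 'LIMIT' => 10 ),
	array( 'b' => array( 'INNER JOIN', 'a.id = b.a_id' ) )   // join conds, keyed by alias
);

foreach ( $res as $row ) {
	// Each result row is a stdClass object with the selected columns as properties.
	echo $row->title . ': ' . $row->value . "\n";
}
```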
[13:32:50] personally would prefer to have seen it trialled as a beta preference [13:33:14] well personally, I was happy with it where it was [13:33:20] is [13:34:08] I don't think there is any possible logical argument - for interface - to move it. [13:35:32] Gloria: #52817 was really rather dumb. [13:39:08] I have made comment now, and had been going to let it go [13:39:21] sDrewth: we are slowly trying to make Preferences less complex [13:39:37] they are very confusing to users. [13:39:45] thedj: spreading them through the wiki is not more helpful though [13:40:07] well it's based on the idea that most ppl simply NEVER use them anyways [13:40:14] when I get a new wiki, I go and set up all my preferences [13:40:33] you are rather unique in that :) [13:40:34] though I look forward to the day that I can set them up at login.wmf and have them replicate to each new wiki [13:40:53] he says provocatively [13:41:18] sDrewth: anway, you could probably create your own JS based ui to access ALL prefs in one spot using the API. [13:41:20] The code has been written.. ;) [13:41:38] thedj: you vastly over estimate my coding ability [13:43:01] or we will have a Special:Preferences/All or something at some point. [13:43:41] and it would be allowed to be ugly and unnavigable :)[ [13:44:34] Reedy: I look forward to the day that I can cookie cutter each new wiki with my config and user pages [13:45:10] or just have it happen [13:46:01] hey there, i couldn't solve my problem. any ideas? (special pages leads to an internal server error 500) [13:47:46] we are still waiting to finally finish the unification of user accounts :( [13:48:07] [13:48:17] it is going to be a race [13:48:31] the next ice age, or SUL unification [13:49:17] * sDrewth stares at Deskana|Away [13:49:45] mitch: what php version, apache, mw version? [13:49:56] yeah well, some ppl are very attached to their names, and it's a complicate procedure that takes resources with experience to execute it. [13:51:18] I've migrated a mediawiki installation to a new server and it's 100% successful (not that it was difficult). Is there a way to tell the old MW installation to use the new server, essentially redirecting them? I'm using a php redirect in index.php but was hoping for something that would follow links on the old server, but use the new IP and leave the user on the new IP as well. [13:52:07] they will all still have their names [13:52:25] just appended with extra bits if they are not unified accounts [13:53:00] we are talking how many people? and the holding up of a whole system of improvement? [13:53:21] they will be the longest around, and they should be the most understanding [13:53:28] haha [13:53:36] no, in general they are the least understanding [13:53:45] most precious [13:54:30] reedy: MW:1.22.7 PHP:5.5.9-1ubuntu4 (cgi-fcgi) [13:56:12] thedj: so, anyone writing a special page extension should create and maintain a local-to-their-project interface for managing their special page's preferences. [13:57:39] Amgine: at this point, ProofreadPage components integrate into Preferences, on the relevant tabs [13:58:08] [13:58:13] works fine that users don't have to navigate PrP elsewhere [13:58:13] Amgine: not necessarily, it's just that once in a while some weeding is in order. [13:58:19] we only have a few components [13:58:27] oh, and some gadgets [14:00:10] Search preferences are weedy? but it still begs the question: if using a common interface is best practice, why pull search out of that common interface? 
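On the server-migration question above: a small sketch of a path-preserving redirect that could be dropped into the old installation's index.php, so links into the old wiki land on the same page on the new host and the visitor stays there. The host name is a placeholder, and this is one possible approach rather than the recommended one (a webserver-level Redirect or RewriteRule would do the same job).

```php
<?php
// Drop-in for the old server's webroot (e.g. its index.php): send a permanent
// redirect to the same path on the new server, so old deep links keep working
// and the visitor stays on the new host afterwards.
// 'new.example.org' is a placeholder for the new server's name or IP.
$target = 'http://new.example.org' . $_SERVER['REQUEST_URI'];
header( 'Location: ' . $target, true, 301 );
exit;
```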
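And since the subpage questions keep coming up (the NS_PYTHON_LESSONS wiki earlier, and the [[/subpage]] links again just below): a LocalSettings.php sketch of declaring a custom namespace and enabling subpages in it, along the lines of Manual:Using_custom_namespaces. The namespace ID 3000 and the names are examples only, and this is a guess at the intended setup rather than the actual configuration of the wiki in the log.

```php
<?php
// LocalSettings.php sketch. The ID 3000 is an arbitrary even number >= 100
// (the odd ID is the matching talk namespace); pick IDs that are unused.
define( 'NS_PYTHON_LESSONS', 3000 );
define( 'NS_PYTHON_LESSONS_TALK', 3001 );

// Register the namespace names so pages like "Python_lessons:Loops" resolve.
$wgExtraNamespaces[NS_PYTHON_LESSONS] = 'Python_lessons';
$wgExtraNamespaces[NS_PYTHON_LESSONS_TALK] = 'Python_lessons_talk';

// Enable subpages (and the breadcrumb) in that namespace only...
$wgNamespacesWithSubpages[NS_PYTHON_LESSONS] = true;
// ...or in the main namespace, if [[/Subpage]] links should work there too:
$wgNamespacesWithSubpages[NS_MAIN] = true;

// If pages were created with this prefix before the namespace was registered,
// running maintenance/namespaceDupes.php afterwards moves them into it.
```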
[14:06:34] No preferences are weedy, and search just found a better home [14:06:59] hello, i've a problem with linking subpages.. [14:07:30] [[/subpage/]] doesn't work [14:07:53] are you sure you actually enabled the subpages option for the namespaces in question ? [14:08:03] yes i did [14:08:08] (do you see the breadcrumb) [14:08:16] i do [14:08:30] but it's weird [14:08:34] let me paste the code [14:08:36] one moment [14:08:46] not in the channel, use a pasteboard plz [14:08:50] softplay: have you checked that they are configured through the API? [14:09:29] softplay: look at your equivalent to https://en.wikisource.org/w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|namespacealiases [14:10:02] yes they're configured in the API i've checked [14:10:06] http://pastie.org/9250219 [14:10:20] without line 2 I cannot see the breadcrumb [14:12:16] thedj http://pastie.org/9250219 [14:14:31] any hint of what's happening? [14:18:38] no idea? [14:20:10] no clue [14:20:37] there is something wrong with the code maybe? the order of the statements? [14:28:16] sDrewth how can i see al the subpages of a namespace with tha API? [14:28:20] the* [14:29:59] Does anyone know if it's common for mw email to trigger a web host's spam filters? [14:33:38] can I query wikimedia with POST instead of GET? [14:33:59] what sort of query? [14:34:47] this one for example https://commons.wikimedia.org/w/api.php?format=jsonfm&action=query&list=geosearch&gsprimary=all&gsnamespace=6&gsradius=500&gscoord=51.5|11.95 [14:35:12] should work fine [14:35:18] not sure why you'd really need/want to [14:35:40] ok special pages.. [14:35:41] found [14:35:51] but the namespace is empty... :o [14:36:00] i've moved a lot of pages in it [14:37:21] but is namespace the same thing as categorization? [14:38:48] and why isn't the breadcrumb appearing? [14:40:12] no it's not working, the link is still empty when i type [[/subpage/]] [14:42:11] litlen i've tryed ok? [14:46:59] hey there, after some testes and the deactivation of all my plugins, i've still got a problem to open the special pages (internal server error)... any idea? [14:48:45] testes? they are play hard in your workplace mitch [14:49:31] :D [14:49:59] *testing [14:51:06] is there a way to get mediawiki to stop capitalizing usernames? I manually changed the display name in the db, and I stopped being able to log in. Shouldn't login depend on the record primary key rather than the textual username? [14:55:18] is the Bar in Bar/Subpage a namespace? [14:55:43] softplay: no [14:56:02] what is it then? cause it isn't a category neither [14:56:25] softplay: it's just the base page name [14:56:42] and it is possible to link to Subpage without referencing Bar? [14:57:06] softplay: if you put [[/Subpage]] on Bar, and subpages are enabled in that namespace, it will go to Bar/Subpage [14:57:27] note that by default, subpages aren't enabled in the main namespace [14:57:44] but you've just said that it isn't a namespace... 
(i'm a bit confused) [14:58:02] namespaces are things like Talk: and User: and Project: [14:58:13] if there isn't one of those in front of the page name, it's in the main namespace [14:58:15] ok [14:58:23] great [14:58:25] and by default, subpages don't work in the main namespace [14:59:08] ok so i have to link with the full name, no option [14:59:15] understood [14:59:20] softplay: or, if it's your own wiki, you can enable them [14:59:35] it's my wiki [14:59:47] but i got lost in that thing of namespace [15:00:06] softplay: then add this to LocalSettings.php to enable them: $wgNamespacesWithSubpages[NS_MAIN] = true; [15:00:13] i just need the slash and the breadcrumb [15:00:29] you'll also get the breadcrumb if you do that [15:00:35] and it is happyly working with Bar/Subpage [15:00:48] and [15:00:48] $wgNamespacesWithSubpages = array_fill( 0, 200, true ); [15:01:02] i guess that's one way to do it [15:01:17] but it is the same old problem [15:01:24] is Bar a namespace? [15:01:26] no [15:01:32] the namespace is the "main" namespace [15:01:36] it is not in the list of namespaces of the api [15:01:39] yes it is [15:01:41] the very first one [15:01:51] it's just a name i've put in the pages [15:02:16] ok ok it is the main [15:02:49] that's enough for me till i'll have the money to buy a tutorial i guess [15:03:09] buy a tutorial? what? [16:13:59] bois plz help me, i'm gettin mad [16:14:50] mitch: Yes? [16:52:24] hey DanielK_WMDE_ - how's it going? [17:19:00] Brilliant logo! http://backyardwrestling.wiki-site.com/index.php/Main_Page [17:52:53] lo. been trying all sorts of things with MediaWikiLanguageExtensionBundle, but "available languages" and "missing languages" remain empty. what's the secret? [17:53:17] all i can select is interface language [18:05:14] tuxick: have you installed the various extensions in the bundle via the LocalSettings.php? [18:22:40] Amgine: followed http://www.mediawiki.org/wiki/MLEB#Configuring_LocalisationUpdate [18:25:53] tuxick: If you look at https://www.mediawiki.org/wiki/MediaWiki_Language_Extension_Bundle you will see the bundle is a group of Mediawiki extensions. They will need to be installed in your wiki's extensions/ folder, and enabled in the local settings file (along with any configuration values required.) [18:26:22] well this was all i found [18:26:33] For example, the UniversalLanguageSelector has installation and configuration instructions here: https://www.mediawiki.org/wiki/Extension:UniversalLanguageSelector [18:26:52] Yep! and you asked how to get things working. Which is what I am trying to help you to od. [18:26:54] do. [18:30:09] [18:31:24] hmm, i'll look inrto the ULS bits then [18:31:59] mind, i'm talking about pages in different languages [18:32:27] since there's a bunch of pages in french [18:33:04] It looks to me like there are at least 6 extensions in that bundle. You will need to look through and see which extensions you need to enable to do what you need it to do. [18:33:34] I'm not personally familiar with the bundle, or the extensions used, so I'm probably not the best person to help with your setup and wiki needs. [18:33:53] (Just a volunteer/slightly knowledgeable person) [18:35:59] languageselector seemed the one :) [18:36:33] I got lucky, for once! 
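On the MLEB thread above: switching the interface language is UniversalLanguageSelector, but having the same content page exist in several languages (the French pages mentioned above) is the Translate extension's page-translation feature, which needs a few permission lines beyond the require_once. A sketch of the relevant LocalSettings.php pieces, based on the Extension:Translate installation docs of that era; the group names and rights are the standard ones but worth re-checking against the current documentation.

```php
<?php
// LocalSettings.php sketch for page translation with MLEB-era extensions.
// Paths assume the extensions live in the usual extensions/ directory.
require_once "$IP/extensions/UniversalLanguageSelector/UniversalLanguageSelector.php";
require_once "$IP/extensions/Translate/Translate.php";

// Let logged-in users translate and review translations...
$wgGroupPermissions['user']['translate'] = true;
$wgGroupPermissions['user']['translate-messagereview'] = true;
// ...and let sysops mark pages for translation (enables the <translate> workflow).
$wgGroupPermissions['sysop']['pagetranslation'] = true;

// Standard setting from the install docs: 'qqq' holds message documentation.
$wgTranslateDocumentationLanguageCode = 'qqq';
```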
[18:37:47] i mean i thought that'd be the one [18:37:57] but i still only get interface languages [18:40:14] but i guess it's the Translate thing after all [18:42:29] but i did everything i could see to configure that as well [18:52:36] Hmmm, did you look at https://www.mediawiki.org/wiki/Extension:Translate for how to install/configure? [18:53:09] (sorry about the delay, tuxick, am trying to finish up a lesson.) [18:54:33] The configuration looks to be somewhat complex... [18:54:44] yes, did all as documented, afaict [18:56:17] Nemo_bis might be able to help you with troubleshooting. [19:05:42] Unfortunately I'm off to do errands, tuxick. Go ahead and ask any other questions in channel, and hopefully someone will be able to answer. [19:15:18] I don't know if I am at the right place (sorry). I need help with "coding" a simple template on Wikipedia...can someone help me? [20:14:49] ^demon|lunch: gerrit is being veeeeeeeeeeeeeeeeeeery slow atm [21:21:47] Hi guys. May I ask for an example of how to obtain a diff via api.php? Thanks in advance! [21:24:46] Since i managed to figure out how to obtain the content of a revision, but can't seem to understand how to get a diff... [21:44:02] Marcgal: There's an rvdiffto parameter [21:44:04] And rvdifftotext as well [21:44:33] I forget how they work exactly (which is embarrassing since I wrote that code) but you can find them in the api.php documentation [21:44:57] The basic way it works is you fetch one revision as if you're fetching its contents, then you specify something else (revision or text) to diff it to [21:45:03] RoanKattouw: y i know, but frankly, i have trouble with using them [21:45:07] for example: [21:45:14] i try http://pl.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=comment|content&rvdifftotext=prev&titles=Wikipedysta:Marcgal/brudnopis [21:45:23] but i only obtain the latest revision [21:45:25] not the diff [21:45:44] hah that's weird [21:46:17] Ha! [21:46:29] Marcgal: For whatever reason, if you drop rvprop=content, it works [21:46:33] That sounds like a god [21:46:36] *like a bug [21:46:45] (I have NO IDEA how that typo happened xD ) [21:47:05] https://pl.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=comment&rvdifftotext=prev&titles=Wikipedysta:Marcgal/brudnopis works [21:47:15] Although you probably want https://pl.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=comment&rvdiffto=prev&titles=Wikipedysta:Marcgal/brudnopis [21:51:02] RoanKattouw: well... many thanks! [21:52:43] RoanKattouw: besides... what does rvdifftotext exactly do? [21:52:59] Marcgal: It lets you specify arbitrary text to diff to [21:53:21] rvdiffto= is limited to things that have already been saved as a revision, rvdifftotext= lets you diff against things that haven't been saved anywhere [21:53:33] ah, that's why i received nonsensical output with rvdifftotext... thanks [21:54:05] I filed https://bugzilla.wikimedia.org/show_bug.cgi?id=66054 about the rvprop=content thing [21:54:14] It shouldn't mysteriously refuse to work when you pass rvprop=content, that's a bug [21:57:12] Whoa, i got my nick in bugzilla! How nobilitating! ;P [21:57:33] Besides, i have one last question, if you allow... [21:58:08] Sure [21:58:24] RoanKattouw: Is the diff fixed to contain those &lt;s and &gt;s instead of normal <'s and >'s? [21:58:51] That prevents XML parsing the diff...
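A sketch of fetching such a diff from a script, using the JSON output format that comes up in the next exchange, so the diff arrives as an ordinary HTML string instead of escaped markup inside an XML document. The endpoint, parameters and page title are taken from the log; the array layout matches the 1.23-era default JSON output and may need adjusting, and the User-Agent value is a placeholder.

```php
<?php
// Sketch: fetch the diff of the latest revision against its predecessor.
// Same query as in the log, but with format=json for machine consumption.
$url = 'https://pl.wikipedia.org/w/api.php?' . http_build_query( array(
	'action'   => 'query',
	'prop'     => 'revisions',
	'rvprop'   => 'comment',
	'rvdiffto' => 'prev',
	'titles'   => 'Wikipedysta:Marcgal/brudnopis',
	'format'   => 'json',
) );

// Wikimedia asks API clients to send a descriptive User-Agent.
$context = stream_context_create( array(
	'http' => array( 'header' => "User-Agent: diff-fetch-sketch/0.1 (contact: you@example.org)\r\n" ),
) );

$data = json_decode( file_get_contents( $url, false, $context ), true );

foreach ( $data['query']['pages'] as $page ) {
	foreach ( $page['revisions'] as $rev ) {
		// The diff body is a fragment of HTML table rows, not a standalone
		// XML document -- render it inside a diff table or parse it separately.
		echo $rev['diff']['*'] . "\n";
	}
}
```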
[21:59:37] Marcgal: Yeah you can't XML-parse it directly, you have to extract it and then XML-parse that [22:00:07] Also note that we have non-XML output formats where we can't embed XML :) e.g. https://pl.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=comment&rvdiffto=prev&titles=Wikipedysta:Marcgal/brudnopis&format=jsonfm [22:04:05] RoanKattouw: this brings me close to obtaining the diffs via special:diff rather than the api... [22:05:52] ah, idk. sorry, i'm just a programming newbie who is trying to get something he imagined to work. That's why i ask these dumb questions :) [22:06:11] I guess sooner or later I'll figure it out. Well thanks for your help so far! [22:06:24] No worries, there are no dumb questions :) [22:10:17] Comforted by this I'll ask RoanKattouw if he remembers something about groupOverrides2 in InitialiseSettings.php :) [22:10:34] aka will https://gerrit.wikimedia.org/r/#/c/134400/ work [22:10:49] with that magic + [22:11:22] No idea [22:11:29] ok :) [22:11:31] I never messed with that file enough to understand groupOverrides2 [22:11:37] lol [22:11:46] Maybe in 2009 I might have understood it :) but I haven't touched it much since [22:12:02] I should have asked you earlier! [22:12:24] Exactly! [22:12:37] Sorry, you're five years late :P [22:39:48] hi i have a Mojibake problem with my wiki, a live version of the problem can be found here: http://wiki.vega-strike.org/Ru:Vegastrike [22:40:50] does anyone know how i can fix this? [22:43:06] hello.. quick question. are you allowed to just copy texts from wiki and reuse them, for example putting them on your website, without citing the source? [22:43:15] seems like plagiarism to me [22:49:38] modeem: Depends on the wiki in question. [22:51:46] i noticed someone copy-pasted text from wikipedia on their webpage, without a reference / source link to the wiki itself [22:52:33] This is not relevant to #mediawiki. [22:52:42] ok [22:52:58] thanks [22:53:25] Meeting in #wikimedia-office in 7 min about the grid system RfC https://www.mediawiki.org/wiki/Requests_for_comment/Grid_system [23:00:58] Right now - #wikimedia-office We're discussing Pau Giner's proposal for "a Grid system in MediaWiki to simplify the creation of user interfaces and make them ready for multiple screen sizes." [23:08:38] hey sDrewth - how goes it? [23:09:06] okay, yourself? [23:09:09] Right now in #wikimedia-office, we're discussing Pau Giner's RfC, a proposal for "a Grid system in MediaWiki to simplify the creation of user interfaces and make them ready for multiple screen sizes." In case you wanna join us. [23:09:21] I'm enjoying running a meeting whilst waiting for dinner party folks to show up. :) [23:09:42] not even sure I understand the subject itself [23:10:10] https://pauginer.github.io/agora-grid/ is sort of a demo I think [23:10:41] I meant "dinner party" .... [23:10:47] :-p [23:10:49] hahaa! [23:11:14] the things that you city people do [23:19:16] hee [23:22:07] hello [23:22:14] Hi, dnj! [23:22:49] I'm having an issue with my mediawiki install [23:22:51] and google isn't being helpful [23:22:53] does anyone have a minute? [23:23:17] dnj: Hello! When you're on IRC it's usually good practice to jump right in and ask a question, as it saves time for everyone involved :) how can we help you? [23:23:34] I'm receiving this error [23:23:47] Catchable fatal error: Argument 1 passed to ScopedCallback::__construct() must be an instance of Closure, unknown given, [23:24:29] dnj: And this is on a clean install?
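To get the fuller traceback asked for below (and it would have helped with the earlier 500-errors-on-special-pages problem too), the usual MediaWiki debugging switches can be turned on temporarily in LocalSettings.php. These are standard 1.22/1.23-era settings; the log path is a placeholder, and they should be switched off again on a production wiki.

```php
<?php
// Temporary debugging block for LocalSettings.php -- remove or comment out
// again once the fatal error has been tracked down.
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );

// Show full exception details and database-error backtraces in the browser.
$wgShowExceptionDetails = true;
$wgShowDBErrorBacktrace = true;

// Write a verbose debug log somewhere the webserver can write to
// (path is a placeholder).
$wgDebugLogFile = '/tmp/mw-debug.log';
```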
[23:24:41] marktraceur: this occurred randomly [23:24:52] marktraceur: if I could track down what happened I'd be giving that information to you now [23:24:55] as far as I know, nothing changed [23:25:11] dnj: That doesn't really answer my question - have you installed any extensions, skins, or modified any parts of the code? [23:25:20] marktraceur: nope [23:25:23] marktraceur: just added content [23:25:55] dnj: Templates, media files, anything else? [23:26:01] noope [23:27:41] marktraceur: have you seen this error before? [23:28:25] I haven't [23:28:31] >_< oof [23:28:38] But it's OK [23:28:48] dnj: Can you find a bigger traceback, maybe, and pastebin it? [23:29:35] marktraceur: I'll see if I can dig something up [23:30:15] Knowing what's calling "new ScopedCallback()" and causing trouble would be the best way to fix it [23:34:35] lol I can't delete test 'translated' pages on mediawiki.org xD [23:38:26] marktraceur: this is the code that's throwing the error [23:38:27] http://pastebin.com/7dkGrEfX [23:38:33] I'm turning on debugging but not seeing anything currently [23:38:35] will keep on it [23:40:12] That's a bit odd [23:41:44] dnj: Try putting a space between "function" and "(" [23:41:48] I doubt it matters, but still [23:42:18] Ah, it doesn't matter [23:43:08] dnj: Can you tell me what version of PHP you're running?\ [23:43:12] hey, does anyone run mediawiki using MAMP on OSX? or is anyone generally familiar with MAMP? i'm having trouble but i'll admit it's not mediawiki-specific... just can't seem to get MAMP to give me apache access logs, which i need to diagnose my actual mediawiki problem [23:43:53] Skud: Have you tried using Vagrant? [23:43:55] !vagrant [23:45:00] yes, but it chews up an inordinate amount of my system resources and heats up my lap to a scary degree [23:45:16] i was hoping to avoid it [23:46:03] *nod* [23:52:27] marktraceur: I'm now getting this error [23:52:32] Could not acquire 'torontoc_mw:messages:en:status' lock. [23:53:23] dnj: You didn't answer my question - what PHP version are you running? [23:54:58] marktraceur: 5.4.27 [23:55:14] I...guess that should be fine? [23:55:35] I don't think we have egregious errors in it [23:55:52] dnj: And your MediaWiki version? 1.22 probably? [23:59:11] Is a users edit count stored in the centralauth database?
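Finally, on the two caching-related items in this log (the guess that the slow Wikibase API writes come from running without memcached, and the 'Could not acquire ... messages:en:status lock' message above, which comes out of the message cache's use of the object cache): a minimal sketch of pointing MediaWiki at a local memcached. It assumes memcached is already running on the default local port, and whether it actually cures either symptom is an assumption to verify rather than a promise.

```php
<?php
// LocalSettings.php sketch: use a local memcached for the object, parser and
// message caches. Assumes memcached is running on the default port 11211.
$wgMainCacheType    = CACHE_MEMCACHED;
$wgParserCacheType  = CACHE_MEMCACHED;
$wgMessageCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = array( '127.0.0.1:11211' );

// With no memcached available, CACHE_ACCEL (APC/XCache) or CACHE_DB are the
// usual fallbacks; CACHE_NONE is what makes every request re-do the work.
```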