[11:03:17] How could the community undo a lock?
[11:44:24] Wilfredor, do you mean a protected page?
[11:47:42] unicodesnowman: about the russavia case
[11:52:55] Wilfredor, this is the channel for mediawiki, the software that powers wikis. you will need to follow the local community processes of whatever wiki you have an issue with.
[11:55:01] Both are part of WMF control over the community. Currently mediawiki has ceased to be a free software community project
[12:47:06] #/join#mediawiki
[12:47:10] hello
[12:47:26] #wikimedia-dev
[12:48:45] christel_: stahp
[12:49:04] !ops christel_ impersonator
[12:51:25] its a common name ..
[12:52:54] Athanasius: wat
[13:14:52] Big thanks for the awesomest wiki software
[13:24:00] I once updated 8 major revisions with the only glitch being I needed to comment one line of SQL out
[13:36:58] !tell jubo2 about upgrade
[13:36:59] There is no such key, you probably want to try: !installmacosx, !useragent,
[13:37:30] hold on I .g it in mah channel
[15:07:40] I have a couple of CAPTCHA systems broken on a couple of MediaWikis
[15:07:56] I want to make them both reCAPTCHA
[15:12:50] jubo2: https://www.mediawiki.org/wiki/Extension:ReCaptcha does this help (I haven't used it myself)?
[15:14:25] Nikerabbit: It should be triggered by the ConfirmEdit extension
[15:14:45] I need to do some serious server migrating in the near future
[15:15:06] I've got this almost ancient Debian 7 in production that's got a busted git
[15:15:28] really lovely for keeping the extensions up-to-date with minimum hassle
[15:16:47] Nikerabbit: The situation seems to be bizarre.. There are 2 MediaWikis installed on that machine that have different CAPTCHAs, but both are broken
[15:16:48] well, nowadays you need composer as well
[15:17:11] jubo2: most captchas nowadays are pretty much broken
[15:17:26] Last I heard was "Btw: i've heard that ConfirmEdit update is on its way."
[15:17:33] on 2014-12-19
[15:17:44] Nikerabbit: I mean the user is getting a blank page when they should be seeing the CAPTCHA
[15:17:54] Damn this can of beer likes me too much
[15:18:12] jubo2: oh that's not good
[15:18:19] jubo2: be sure to enable php error logging
[15:18:31] that's usually a sign of a fatal php error
[15:18:54] Nikerabbit: and I'd look for this php error log in ..?
[15:20:05] now I look for the extension distributor, right?
[15:20:23] to get the extension for reCAPTCHA
[15:20:33] I wget it
[15:21:18] jubo2: at the php maintenance/eval.php prompt, type echo ini_get( 'error_log' );
[15:23:49] ah.. there is nothing to wget
[15:24:17] just the key and the secret, one require_once and switch CAPTCHA
[15:24:30] unless it is ConfirmEdit that is broken
[15:26:45] and the reCAPTCHA is activated!
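For reference, the key-plus-one-require_once setup jubo2 describes looked roughly like this in LocalSettings.php for ConfirmEdit of that era. This is a sketch, not a verified configuration: the key variable names varied between ConfirmEdit releases, so treat them as placeholders and check the Extension:ConfirmEdit documentation for your version.

    # LocalSettings.php — sketch of the require_once-era setup for
    # ConfirmEdit's ReCaptcha module (MediaWiki 1.24 vintage).
    require_once "$IP/extensions/ConfirmEdit/ConfirmEdit.php";
    require_once "$IP/extensions/ConfirmEdit/ReCaptcha.php";
    $wgCaptchaClass = 'ReCaptcha';
    $wgReCaptchaPublicKey  = 'your-public-key';   # placeholder
    $wgReCaptchaPrivateKey = 'your-private-key';  # placeholder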
[15:40:30] I get an error on the other MW installation when doing the CAPTCHA
[15:40:34] [453d60b4] 2015-01-19 15:20:56: Fatal exception of type MWException
[15:41:19] Something is wrong. The reCAPTCHA displays properly but leads to the above error in MW
[15:47:52] I'm not sure what I'm buying into by using reCAPTCHA
[15:49:03] at least I got the study wiki sorted with a functional one ( http://GloBBA12.si/wiki/ — basically the first 2 years of business school subjects at a not-so-bad University of Applied Sciences )
[15:53:29] Nikerabbit: echo ini_get( 'error_log' ); in the prompt prints an empty line and then a prompt again
[15:54:04] well, you could do well by defining the error log then ;)
[15:54:53] Nikerabbit: and I do that how.. Try to be like instructing a kid to cook
[15:55:16] jubo2: I'm a horrible cook, sorry
[15:55:31] jubo2: set it in php.ini or inside mediawiki itself with ini_set
[15:55:43] ♪ ♫ talk me through it in irc and hold my handdd in irccc ♪ ♫ ..
[15:57:08] so.. /etc/php/php.ini ?
[15:57:26] maybe... it varies per distribution
[15:57:39] Vanilla Debian 7
[15:58:21] I prefer raspberry
[15:59:30] I tried dropping a 32-bit Trisquel GNU/Linux into VirtualBox but it wouldn't run, complained something was missing
[15:59:51] Nikerabbit: I have now located the php.ini
[15:59:56] what do I put in it
[16:00:05] it looks all weird, starting with semi-colons
[16:00:32] jubo2: try searching for error_log (btw: moi!)
[16:02:45] now it should be logging to 'php_errors.log'
[16:03:20] wherever that is when not given an absolute path
[16:03:25] * jubo2 sips his lager
[16:03:29] now that I think of it... by default it should log to the apache/nginx error log
[16:03:36] Lager likes me too much
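A sketch of the two routes Nikerabbit mentions — php.ini or ini_set() from within MediaWiki. The log paths are illustrative, not prescriptive:

    // Option A: in php.ini (then reload the web server):
    //   log_errors = On
    //   error_log = /var/log/php/php_errors.log
    //
    // Option B: from PHP itself, e.g. near the top of LocalSettings.php.
    // Use an absolute path writable by the web server user; a relative
    // path lands wherever the server's working directory happens to be.
    ini_set( 'log_errors', '1' );
    ini_set( 'error_log', '/var/log/php/php_errors.log' );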
[16:05:52] I'm seeing a bunch of these:
[16:05:54] [Mon Jan 19 17:13:27 2015] [error] [client 91.153.146.159] PHP Fatal error: Call to undefined method MathRenderer::getRenderer() in /sites/consumerium.org/develop/w/extensions/ConfirmEdit/MathCaptcha.class.php on line 43
[16:06:06] I thought I removed the MathCaptcha
[16:06:11] I need to double-check this
[16:07:06] yes!
[16:08:22] Nikerabbit: could it be that the require_once of MathCaptcha.php is what is broken
[16:09:03] it's loading MathCaptcha for some reason
[16:09:29] jubo2: perhaps incompatible versions, I dunno
[16:12:20] Nikerabbit: [ae0e0baa] 2015-01-19 15:52:53: Fatal exception of type MWException ( in browser )
[16:12:53] during installation I get 'DB connection refused'
[16:13:09] I'll look at the log
[16:13:29] I had managed to install it just fine. But then I deleted and reinstalled mysql
[16:28:03] Nikerabbit: I enabled $wgDebugLogFile and here is what I think is the key: pastebin.com/U3RLkRhk
[16:29:39] I don't get it .. why is it trying to email me
[16:31:22] I was just trying to save an edit as an IP number
[16:34:30] installing the missing pear mailing package is probably one 'sudo aptitude install something' line
[17:20:02] bad file number error when configuring git
[17:23:16] pratikbsp: Did you google the error?
[17:24:33] Yes I did, I was able to connect to github but not able to connect to gerrit
[17:24:58] http://pastebin.com/4znMbEcH
[17:38:10] pratikbsp: is your git clone successful?
[17:58:26] codezee: repository not found http://pastebin.com/YWWA6yvV
[17:59:46] pratikbsp_: strangely, git is unable to find the repo. check your connection — are you behind a proxy?
[18:00:38] codezee: i was able to access the repository from github
[18:01:39] try 'ping gerrit.wikimedia.org' and see what you get
[18:04:27] codezee: request timed out
[18:06:20] pratikbsp_: then probably your dns is unable to resolve their server. this is their server, 208.80.154.81 — try to use this in git
[18:08:04] pratikbsp_: a better option will be to first troubleshoot your network
[18:11:50] codezee: troubleshooting couldn't identify any problems. besides, I checked and I am not behind any proxy
[18:12:54] pratikbsp_: is 'ping www.google.com' working?
[18:16:13] codezee: same result :( , request timed out
[18:16:54] pratikbsp_: check your network configuration settings, seems to be a dns problem
[18:18:42] pratikbsp_: this might help: http://stackoverflow.com/questions/22110622/ping-response-request-timed-out-vs-destination-host-unreachable
[18:21:29] pratikbsp_: my network is going off, so I'll not respond now
[18:25:41] What is the current way of turning "File:Blue marker.png" into the full URL of the file?
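The [18:25:41] question gets no answer in the channel. For the record, a sketch of one way in the PHP API of that era — wfFindFile() was the usual entry point (later versions moved this to MediaWikiServices' RepoGroup):

    // e.g. typed at a php maintenance/eval.php prompt; pass the name
    // without the "File:" namespace prefix:
    $file = wfFindFile( 'Blue marker.png' );
    if ( $file ) {
        echo $file->getFullUrl(), "\n"; // fully-qualified URL for the file
    }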
[19:02:31] what does this line mean: "If that's not the case, ask someone to verify from a labs instance whether the shell username is taken, by issuing the command groups $username on a shell." — while creating an account on wikitech
[19:05:35] siddhism: context? link?
[19:05:47] I can probably answer, but I'll give a better answer if I have context
[19:06:12] http://www.mediawiki.org/wiki/Special:MyLanguage/Developer_access
[19:53:17] I was thinking about the problem of access in countries under dictatorships, in addition to always having wikipedia offline. My skills do not go further, but I would leave this question in the air: a decentralized MediaWiki accessed over P2P networks, where each page is updated when reachable, but the local version is loaded. My question is how to decentralize the process of modifying the wiki. Where could I begin?
[19:54:21] MediaWiki source code
[19:55:44] ok, thanks for your help
[19:58:18] Wilfredor: that's a very interesting idea
[20:01:06] I wish to work on the T42322 bug (https://phabricator.wikimedia.org/T42322). I know what section of the code I have to edit for the fix. Could someone help me get started, as it is my first bug fix?
[20:05:51] UltrasonicNXT: My inbox dislikes you.
[20:07:00] Lcawte: ahah I'm sorry, goddam jslint is being really annoying!
[20:07:36] talking of jslint, anybody know how to fix jslint complaining about 'mw is not defined'? The code runs fine... (https://integration.wikimedia.org/ci/job/mwext-Comments-jslint/24/console)
[20:08:27] UltrasonicNXT: the laziest way is to add /*global mw */ at the top of the file
[20:08:48] oh sweet, I've seen that before, does that just fix it?
[20:08:51] ^^
[20:09:26] Doing so for "document" seemed to work for my only instance of jslint fails.
[20:09:39] UltrasonicNXT: it tells jslint to assume that a global variable 'mw' exists
[20:09:47] ah cool
[20:09:59] I'm guessing I can also do that with $ and jQuery?
[20:10:08] UltrasonicNXT: but generally, i think the pretty way is to use mediaWiki and jQuery as globals instead of mw and $
[20:10:35] and i think these are preset as globals in our jslint config
[20:10:41] oh, are mw and mediaWiki identical?
[20:10:43] ok
[20:10:50] yes
[20:11:00] and wrap your script in a closure like this: https://www.mediawiki.org/wiki/Manual:Coding_conventions/JavaScript#Closure
[20:11:44] the entire file?
[20:11:54] yep
[20:12:18] right now this doesn't matter in practice, but there are some wild plans of making it so that every script would actually get a slightly different mediaWiki and jQuery to work with, and this could be used for profiling them or detecting script errors or whatever
[20:12:33] Is there any other memcached+session loss troubleshooting to do other than the bit about checking your /etc/hosts?
[20:12:51] I've got some particular users who are *always* getting it, but I'm barely seeing it
[20:13:41] MatmaRex: oh right, funky
[20:18:12] MatmaRex: I'm getting "'mediaWiki' is not defined" now, even with this setup you've just described
[20:19:29] UltrasonicNXT: you might have to create a .jshintrc file like described here: https://www.mediawiki.org/wiki/Manual:Coding_conventions/JavaScript#Linting
[20:20:11] sorry, i've gotta go right now.
[20:20:21] ok sure, I'll have a play
[20:39:07] trying to set up an environment manually in ubuntu
[20:39:14] hello mediawiki
[20:42:09] Could somebody please help me? I run a wiki on http://hellomouse.tk/wiki/ and for some reason, whenever I go to the list of special pages (Special:SpecialPages) it shows this message: https://dpaste.de/KmHe. I am running MediaWiki 1.24.1 on PHP 5.4.14. The same problem happens with PHP 5.3. It had not occurred until I upgraded MediaWiki.
[20:43:29] siddhism: what problem are you having?
[20:43:54] jeffluo1011: It looks like you're possibly missing some files
[20:44:10] It looks like an extension didn't register something in wgAutoloadClasses
[20:44:18] The special pages list doesn't load. The error message is "Fatal error: Class 'SpecialBookSources' not found in /home/**********/public_html/w/includes/specialpage/SpecialPageFactory.php on line 410"
[20:44:20] Well, except that's in core
[20:44:27] includes\specials\SpecialBooksources.php
[20:44:41] I think the autoloader errors look like they're coming from core.
[20:44:48] jeffluo1011: Do you have any extensions installed?
[20:44:50] I have tried replacing that file with the original downloaded file, but that didn't fix anything
[20:44:53] yes
[20:44:57] Which ones?
[20:45:07] http://hellomouse.tk/wiki/Special:Version
[20:45:08] SpecialBookSources is in master in autoload.php
[20:45:10] it's a long list
[20:45:12] Oh.
[20:45:16] Weird
[20:45:34] Google said nothing
[20:48:06] solved, had to reconfigure php .. it said mysql not installed
[20:52:52] Apparently it happens here, too: http://www.ddlab.dk/wiki/includes/specials/SpecialBooksources.php
[20:53:25] never mind
[20:56:08] jeffluo1011: does SpecialBookSources appear in autoload.php?
[20:56:25] checking...
[20:57:01] Where is that file?
[20:57:47] http://www.ddlab.dk/wiki/includes/specials/SpecialBooksources.php — that shouldn't work
[20:58:01] it's different
[20:58:06] that wasn't my error
[20:58:07] google found it
[20:58:18] in /includes/
[20:58:38] it's in the root in HEAD
[20:59:14] found it
[21:00:07] yes it does
[21:00:13] 'SpecialBookSources' => 'includes/specials/SpecialBooksources.php',
[21:02:12] are there any sort of recommended memcached sizes for MW? (regular mw cache stuff + sessions)
[21:05:03] jeffluo1011: if you unpacked the new installation files over the old ones when you upgraded, I'd recommend unpacking the new files to a new, empty directory instead, and moving LocalSettings.php and other modified files over as needed
[21:05:12] Hmmm, SpecialBooksources.php has a size of 0. Fixing that...
[21:06:51] extracting new files over the old ones is usually the cause of such errors
[21:07:06] And it works.
[21:07:14] Thanks for all your help!
[21:07:32] maybe I shouldn't do that next time.
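Since the root cause turned out to be a zero-byte SpecialBooksources.php left behind by unpacking the new release over the old tree, a hypothetical one-off script like this (not part of MediaWiki) can flush out any other casualties:

    <?php
    // find-empty.php — hypothetical one-off check, not a MediaWiki tool:
    // list zero-byte .php files, a telltale sign of a botched unpack.
    $it = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator( '/path/to/wiki' ) // adjust to your install
    );
    foreach ( $it as $f ) {
        if ( $f->isFile() && $f->getExtension() === 'php' && $f->getSize() === 0 ) {
            echo $f->getPathname(), "\n";
        }
    }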
[21:26:57] I am playing with an extension that defines a tag, plus the visual editor. VE is treating other extensions' inner content as if it's alien and blocking it out, but for my extension it's just displaying the text and tags literally. Does anyone know how to make mediawiki treat the block as alien in VE?
[21:29:03] kstange: try asking in #mediawiki-visualeditor
[21:29:10] Vulpix: thanks
[21:43:06] hi! looking for some help transcluding wikipedia content into a private wiki. is the labeled section transclusion extension + editing the interwiki table the way to go? can't get it to work, for some reason...
[21:44:05] !scarytransclusion
[21:44:06] http://www.mediawiki.org/wiki/Manual%3A%24wgEnableScaryTranscluding
[21:44:41] ^ that may work, although all links in the transcluded content will point to wikipedia
[21:45:02] tried $wgEnableScaryTranscluding = true; & get a link to the site, but not the transcluded content.
[21:45:36] {{#lsth:wikipedia:Article name}}
[21:45:39] a link? no, you need to use template syntax: {{wikipedia::Page}}
[21:46:17] no, i'd like to get the wikipedia intro paragraph transcluded into my private wiki article
[21:46:20] and I doubt labeled section transclusion works with scary transclusion. It should be only for plain template syntax
[21:46:25] don't want a link
[21:47:03] hmmm, just tried {{wikipedia:Page}}. now I get [Template fetch failed]
[21:47:20] shouldn't need to get templates if i just want the intro/lead paragraph, right?
[21:47:47] you can't get just parts of a page with scary transclusion
[21:48:46] it goes to wikipedia and gets the rendered HTML. It doesn't parse wikitext
[21:49:13] hmmm... is there a better way to grab some wikipedia content? (i.e. just the lead paragraph of a few articles)
[21:49:27] doesn't have to be kept up to date, could be a one-time deal
[21:49:35] external data extension?
[21:51:11] no idea
[21:51:20] shouldn't the labeled section transclusion extension be able to just grab the intro?
[21:51:51] https://www.mediawiki.org/wiki/Extension:Labeled_Section_Transclusion
[21:52:08] "Transclude the introduction: To transclude the introduction of a page (i.e. the content before the first heading), use {{#lsth:pagename}}"
[21:53:37] no, it won't, because the content is external. And scary transclusion grabs the resulting HTML of the page, not the wikitext, so it doesn't have any notion of sections and such
[21:55:47] hmmm
[23:34:57] How can I clear the cache of Linker::link() for a page after editing?
[23:36:52] I'm adding class names to $attribs['class'] relevant to the page content. After editing, these sometimes don't update.
[23:38:50] nevermind. I'm posting on the support desk. https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Clear_the_cache_of_Linker::link()_%3F
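The support desk thread itself isn't reproduced here. For what it's worth, the usual suspect for link HTML that "sometimes doesn't update" is the parser cache of the pages containing the links. A heavy-handed sketch of one workaround follows — it assumes the classes are added from a link-rendering hook, and it is not the confirmed fix from that thread:

    // Sketch (assumption, not the thread's confirmed fix): on save,
    // invalidate the parser cache of every page linking to the edited page,
    // so their Linker::link() output is regenerated with fresh class names.
    $wgHooks['PageContentSaveComplete'][] = function ( WikiPage $wikiPage ) {
        $title = $wikiPage->getTitle();
        $backlinks = $title->getBacklinkCache()->getLinks( 'pagelinks' );
        foreach ( $backlinks as $linkingTitle ) {
            $linkingTitle->invalidateCache(); // bumps page_touched
        }
        return true;
    };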