[01:30:01] SPF|Cloud: can you paste your apache config somewhere?
[02:42:58] Hmm... for some reason one of my mediawiki installations is redirecting everything to https
[02:47:49] meh, it was the browser cache o_O
[03:51:50] in a skin for 1.23, how do I check whether the requesting user is logged in?
[03:52:08] $this->getUser()->isLoggedIn() I think.
[03:53:01] when I do that, the page stops loading at that point
[03:53:23] er
[03:53:24] !blankpage
[03:53:24] A blank page or HTTP 500 error usually indicates a fatal PHP error. For information on debugging (including viewing errors), see .
[03:54:51] Call to undefined method VectorTemplate::getUser()
[03:55:13] so I'm actually in a VectorTemplate. That I didn't know :D
[03:55:27] oh
[03:55:31] in the template use
[03:55:44] $this->getSkin()->getUser()->isLoggedIn()
[03:56:02] greetings
[03:56:02] ah
[03:56:18] works, thanks
[03:56:29] I wanted to ask about overriding table header styling
[03:56:50] instead of styling each of them separately
[03:57:30] I heard you can apply styles to |- style="color: ..." to color the whole row
[03:57:53] but apparently since it's the header row, all entries use ! Entry 1 Header
[03:58:25] so the wikitable header styling overrides it. Using !important on the row's CSS didn't help
[04:22:02] hi
[04:22:04] bye
[04:43:27] in a skin (or VectorTemplate), what's a good way to get the personal links (user page, talk, prefs, etc.)?
[04:44:33] I basically want to implement a custom version of $this->renderNavigation('PERSONAL') so that I can add an icon to each of the links
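To make the skin thread above (03:51-04:44) concrete: a minimal sketch of a 1.23 BaseTemplate subclass that checks login state via the skin context and walks getPersonalTools() to tag each personal link with an icon class. BaseTemplate::getPersonalTools() and makeListItem() are core 1.23 APIs; the class name, method name, and icon-class convention here are hypothetical.

```php
<?php
// Sketch only: a custom template in the spirit of VectorTemplate.
class MySkinTemplate extends BaseTemplate {

	function renderPersonalToolsWithIcons() {
		// The template itself has no getUser(); go through the skin context.
		$loggedIn = $this->getSkin()->getUser()->isLoggedIn();

		echo '<ul class="personal-tools' . ( $loggedIn ? ' logged-in' : '' ) . '">';
		foreach ( $this->getPersonalTools() as $key => $item ) {
			// Hypothetical convention: derive an icon class from the tool key
			// ('userpage', 'mytalk', 'preferences', ...); makeListItem() puts
			// $item['class'] on the generated <li>.
			$item['class'] = isset( $item['class'] )
				? $item['class'] . " icon-$key"
				: "icon-$key";
			echo $this->makeListItem( $key, $item );
		}
		echo '</ul>';
	}
}
```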
[09:54:06] ori: do you only need the actual VirtualHost?
[09:56:18] https://www.irccloud.com/pastebin/CIZjAGPV
[09:56:46] In the paste above, ignore the ServerName field
[10:51:58] Okay, so is there any way that edits can be made that will not be checked by AbuseFilter?
[10:52:07] Maybe via api?
[10:52:40] justin_zw: the api will check them as well
[10:53:07] So why is the abusefilter only stopping actual good contributors instead of the spam? If you test the filter against the spam edits, they match.
[10:53:16] Yet it still allows them through and doesn't log them.
[10:53:41] The only thing in the log is a legitimate person making his own userpage.
[10:54:54] I even went through the pain of installing memcached on load balanced servers.
[10:56:30] well, sometimes the match is tricky. For example, checking for article_articleid = 0 works on page creation, but edits that already went through won't have article_articleid = 0 when you test the filter against them
[10:58:15] Vulpix: Maybe I should get specific. So, the filter is supposed to stop anyone who isn't autoconfirmed from adding external links to their user pages. That's where our spam is coming in right now.
[10:58:31] They create an account and then make a userpage with nothing but an advertisement.
[10:58:35] It's clogging up the recent changes
[10:58:40] And it's really ugly.
[10:58:53] http://zeldawiki.org/Special:RecentChanges
[11:00:10] justin_zw: can you share that filter rule to see the logic being applied?
[11:01:09] added_links & (article_namespace == 2) & !("autoconfirmed" in user_groups) & !("patrol" in user_groups) & !("sysop" in user_groups) & !("bureaucrat" in user_groups)
[11:02:12] added_links is an array, apparently. I don't know how it would work used as a boolean expression
[11:02:48] Doesn't "in" return true if the string is "in" the variable?
[11:02:51] Doesn't "in" return true if the string is "in" the variable?
[11:02:57] Oops, sorry. Forgot this wasn't Slack.
[11:04:22] well, the documentation apparently doesn't say what kind of variable it is... it may be a newline-separated string with all external links
[11:06:02] I'm saying that these edits are matching the filter, though.
[11:06:09] I can screenshot the batch tests
[11:07:04] you should use count(added_links) > 0
[11:08:22] maybe the test for recent edits doesn't work exactly the same as the normal filters, and it matches where it shouldn't
[11:11:27] Vulpix: But that's absurd.
[11:11:43] One would assume that the testing would work the same way.
[11:13:41] that's the only possibility that comes to my mind
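Putting Vulpix's suggestion into the rule for clarity: since added_links is (apparently) an array, testing its length is safer than using it as a boolean. A sketch of the revised filter, built only from constructs already quoted in this discussion:

```
count(added_links) > 0
& (article_namespace == 2)
& !("autoconfirmed" in user_groups)
& !("patrol" in user_groups)
& !("sysop" in user_groups)
& !("bureaucrat" in user_groups)
```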
" [11:39:02] there are a lot of spam with that pattern [11:39:49] if you look for links, you won't never stop them, since they're mostly random [11:40:06] Have you ever had it this bad, Vulpix? [11:40:14] http://zeldawiki.org/Special:RecentChanges [11:40:20] It's just... everywhere. [11:40:51] justin_zw: yes :( http://awa.shoutwiki.com/wiki/Special:AbuseLog [11:41:28] Yowza [11:42:27] justin_zw: you should try to install https://www.mediawiki.org/wiki/Extension:QuestyCaptcha . On specific topics like your wiki it will be incredibly successful [11:43:25] Vulpix: Yeah, we use that already. [11:43:45] But they've cracked it somehow. Every question I add they just answer. [11:43:46] for account creation? [11:43:51] Yeah. [11:44:16] that should be impossible :S [11:44:18] Like I don't know if we have a security vuln and someone's sneaking peeks at my LocalSettings or what. [11:44:58] justin_zw: well, this question is very weak: "Please provide the eighth character from the sequence 04749a2c: " [11:45:18] I resorted to dynamic ones after the ones I came up with kept failing. [11:45:36] I suppose I can always change them again, but I'm running out of questions that I haven't used [11:46:50] Can using ConfirmEdit interfere with AbuseFilter? [11:47:28] no [11:48:14] Okay, so that's one theory out the window. [11:48:15] well, I'm really confused. That questy captcha shouldn't be failing. That's incredibly weird [11:52:05] I was wondering if creating accounts through api would succeed bypassing the captcha, but apparenty they don't [11:52:37] Maybe they've found an exploit in our particular version of MediaWiki? [11:52:42] I know we need to upgrade. [11:52:54] We're on an unsupported version. [11:55:23] ould be reasonable [11:55:40] *would be reasonable [12:07:57] Well, I changed the questions again to some more Zelda related ones. We'll see if that does anything. They beat my last ones quick. [13:21:03] action=purge redirects to page itself but the URL no longer reads action=purge. e.g. https://www.mediawiki.org/w/index.php?title=Manual:Parameters_to_index.php&action=purge Is that intended? [13:22:54] Never mind, we have Action::getActionName now instead of wgRequest. [13:23:09] Upgrading from 1.16 is hard :) [13:30:21] Vulpix: I changed the questions about 1.5 hours ago and still no spam so far... Maybe I reused an old question last time. [13:30:59] So, I'm pretty fluent in HTML, CSS, JS, PHP and MySQL. I've worked on several small projects before, sometimes in teams. I'd like to contribute to MediaWiki, but this is my first open source contribution. Could someone please help me out? [13:31:33] polybuildr: Depends on which project you'd like to help out. [13:31:44] Are you interested in MediaWiki itself? [13:31:47] I'd like to start by looking at MediaWiki core. [13:32:27] Well you're probably better off talking in #mediawiki-dev (I think that's it...?) [13:32:45] !start | polybuildr [13:32:45] polybuildr: https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker [13:33:16] Raight. Sorry, I'll be on my way then. :P [13:39:44] Sorry, I fail. I try to check if the page was purged but Action::getActionName( $this->getContext() ) is "not in object context" [13:47:21] Does action=purge work at all? https://www.mediawiki.org/wiki/Extension:Purge [13:54:09] action=purge is part of core, not an extension [13:57:11] yes, I mean I have these tabs and want to check if user used it. 
[15:18:05] hello, i have added the parenlink extension to my wiki; now on any page i generate, it shows: "=Internal Links= Parent Article: [[]]". what does it mean???
[15:29:32] could anybody answer???
[15:31:22] elahe: that could mean the parenlink extension is no longer supported by recent versions of MediaWiki
[15:32:01] you mean i can not have parent links?
[15:32:05] in my wiki?
[15:35:12] you mean i can not have parent links in my wiki?
[15:35:29] you can, but it will stay broken
[15:35:56] the author didn't keep it up to date for current versions of MediaWiki
[15:37:29] could you show me what will be affected if i use parent links?
[15:37:55] I don't use that extension, and apparently you're already using it
[15:38:19] err, I think that behavior is correct? the only thing that extension does is add a parent article link as boilerplate in a new article
[15:38:22] I'm getting the error: "connect to host gerrit.wikimedia.org port 29418: No route to host" on my mac. has anybody faced a similar issue?
[15:38:58] arpitachandra, works for me
[15:39:03] Have you managed to connect before?
[15:39:07] yes i use it, but you said it isn't working
[15:39:16] elahe: *what* is not working
[15:39:35] elahe: what you describe seems like correct behavior, although there is no page filled into the boilerplate text
[15:39:39] the parentlink extension
[15:39:58] elahe: I assume you mean Extension:ParentPageLink
[15:40:05] elahe: you told us it wasn't working, implicitly, when you said it just shows Parent Article: [[]]
[15:40:11] I have configured it correctly I think, but I can't connect to gerrit
[15:40:34] yes, it shows that on each page
[15:40:51] but you mean i should add each parent link manually
[15:40:52] elahe: define 'page'. Do you mean in the edit window when you create a new page?
[15:41:02] or is it configured automatically
[15:41:12] yes
[15:41:35] i thought it was configured automatically
[15:41:39] not manually
[15:41:51] again, define 'configured'
[15:41:59] now i should define the parent link for each page
[15:42:00] what is your expectation?
[15:42:12] it should fill in the parent page name between the [[ ]]'s, yes
[15:42:28] but guessing from the code, that page *must exist*
[15:42:32] i expect that on each page there is an icon, and by clicking it you go to the parent page
[15:42:44] like ftp
[15:42:47] elahe: no, it doesn't do that, and that's also what the information page tells you.
[15:42:55] elahe: it adds boilerplate text to each new page
[15:42:59] !e ParentPageLink
[15:43:00] https://www.mediawiki.org/wiki/Extension:ParentPageLink
[15:43:17] elahe: however, as far as I know, mediawiki already provides links to parent pages at the top
[15:43:26] arpitachandra: 'no route to host' suggests your computer is unable to find a way to connect to gerrit -- which is a bit strange
[15:43:28] ssh -T git@github.com gives me a similar error too, so i changed the port from 22 to another port
[15:43:28] no
[15:43:37] i do not see it
[15:43:41] could you show me
[15:43:43] ?
[15:43:52] i use mediawiki 1.23
[15:43:54] elahe: https://www.mediawiki.org/wiki/Manual:Pywikibot/Scripts
[15:44:01] the "< Manual:Pywikibot" under the page title
[15:44:02] Yes, how do I fix it?
[15:45:51] do you know which command i should use to have this?
[15:45:56] arpitachandra: not sure. does https://wikitech.wikimedia.org work for you?
[15:46:14] elahe: for what? that is standard in mediawiki, as far as I know.
[15:46:43] however, the parent page (in this case "Manual:Pywikibot") has to exist
[15:46:57] arpitachandra, please answer my question.
[15:47:09] I have tried a couple of things, but none of them work
[15:47:32] elahe: see https://www.mediawiki.org/wiki/Help:Subpages
[15:47:39] Krenair, sorry I missed your question, what did you ask?
[15:47:43] arpitachandra, Have you managed to connect before?
[15:47:49] Or is this your first connection?
[15:47:52] yes yesterday, via port 443
[15:48:03] Does that still work?
[15:48:09] no, it doesn't work now. I made changes to my ssh config file
[15:48:17] no it doesn't work anymore
[15:48:40] Well SSH config files shouldn't affect HTTPS (port 443)
[15:48:49] i added {{my page}} at first but it had no effect
[15:49:01] Can you ping gerrit.wikimedia.org?
[15:49:26] Host github.com / Hostname ssh.github.com / Port 443 . i added these three lines
[15:49:36] it doesn't work anymore
[15:49:52] elahe: I have no clue what you're trying to do or why.
[15:49:59] elahe: read the Help:Subpages page.
[15:50:02] ummm
[15:50:18] elahe: "Breadcrumb links will appear automatically at the top of the subpage, linking to each parent page that exists. These links do not appear, however, if the parent pages have not yet been created or if the subpage feature is turned off."
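A concrete version of the breadcrumb answer elahe was given: the automatic parent-page links only appear where the subpage feature is enabled, and it is off in the main namespace by default. A minimal LocalSettings.php sketch:

```php
// LocalSettings.php -- turn on subpages (and thus automatic breadcrumb
// links to existing parent pages) in the main namespace.
$wgNamespacesWithSubpages[NS_MAIN] = true;
```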
[15:50:20] I don't think you can SSH to port 443, arpitachandra
[15:50:48] ok, let me study that
[15:50:51] Oh, okay so Github lets you
[15:50:56] But that will not work with Gerrit.
[15:51:38] github doesn't let me either now
[15:51:43] how do I fix this?
[15:52:55] I am using a proxy, could that be a problem?
[15:53:02] Yes.
[15:53:03] could be
[15:53:19] I added them to my git settings
[15:53:25] still doesn't work
[15:53:38] Can you SSH to bastion.wmflabs.org ?
[15:53:43] standard port
[15:54:00] I'm not sure whether git allows you to run ssh over an https proxy, but at least https should work (and this is possible with both gerrit and github)
[15:54:14] or is it a socks proxy? in that case, it should not be an issue...
[15:55:08] the 'no route to host' suggests the connection doesn't pass through your proxy
[15:57:19] urgh, 10 open labs shell requests, one of which is yours
[15:57:27] so you probably can't ssh there yet
[15:58:05] Krenair: sure, but that should give a 'No authentication methods left' response, not 'No route to host'
[15:58:16] right
[15:58:45] (I was wondering about the idea of proxying ssh to gerrit via labs)
[15:58:53] but that's useful to test anyway
[15:59:22] one of the fixes tells me to use git remote set-url
[16:00:26] To set origin to... what exactly?
[16:02:41] git remote set-url origin git@github.com
[16:02:54] um
[16:02:58] I thought you were trying to connect to gerrit?
[16:03:23] git doesn't work either now. i've mentioned it earlier
[16:03:39] * Krenair shrugs
[16:03:49] connect to host github.com port 22: Connection refused
[16:03:54] I give up. It's difficult to understand exactly what is happening
[16:04:29] thanks
[16:12:28] arpitachandra: Are you sure you're aware of the difference between github and gerrit here?
[16:13:29] Never mind. Just read the older messages.
[17:36:28] connect to host github.com port 22: Connection refused. Any help?
[18:56:02] I think the first link shouldn't be there, because it's already in the bullet list: https://www.mediawiki.org/w/index.php?title=Template:MediaWiki_News&curid=4227&diff=1202815&oldid=1187725
[19:52:08] connect to host gerrit.wikimedia.org port 29418: No route to host: has anyone encountered this before?
[19:55:58] Yes, I am suffering from the same error too
[19:56:25] I am getting name or service not known
[19:56:40] those are probably different errors.
[19:56:55] ac16: maybe your network provider is blocking port 29418?
[19:57:07] alisha_: yours sounds like a DNS issue.
[19:57:13] ac16: assuming you're arpitachandra, have you figured out your proxy situation?
[19:59:52] legoktm: I'd expect port blocking to give 'Connection timeout' or 'Connection refused', not 'No route to host'
[20:00:12] hmmm right.
[20:00:35] I think when my school was blocking the port, I would get timeouts
[20:00:43] ac16: I think the issue is still that your connection doesn't pass over your proxy. However, I'm still not sure what kind of proxy you're dealing with.
[20:00:57] (nor am I certain how to configure a proxy for git...)
[20:01:41] legoktm: what should I do then?
[20:02:38] alisha_: can you connect to https://gerrit.wikimedia.org ?
[20:03:06] yes
[20:04:06] That suggests you have a typo in the server name in git, or there's something very strange with your DNS settings.
[20:04:23] (or your DNS requests are also proxied, but I'm not sure how that would work...)
[20:04:44] Ok
[20:07:41] but I am not sure what step I should take next to resolve this issue
[20:14:00] alisha_: check if the server name in git is correct? it should be shown in the error message
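For reference on the gerrit SSH troubleshooting: a typical ~/.ssh/config stanza for gerrit.wikimedia.org, using the port 29418 quoted in the errors above. The username and key path are placeholders, and this assumes a direct connection rather than the proxied setups being debugged here:

```
# ~/.ssh/config -- sketch; substitute your own gerrit username and key
Host gerrit.wikimedia.org
    Port 29418
    User your-gerrit-username
    IdentityFile ~/.ssh/id_rsa
```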
[20:14:26] can anyone tell me about the global $placeHolders
[20:14:28] ?
[20:14:32] Hi, could someone assist me in installing SMW via shell on the server? The SMW IRC isn't too active - I'm getting tripped up at the first step (Composer)
[20:14:46] I've got an extension that's throwing an error, saying it's undefined - it's an old extension
[20:14:59] sinix: What extension is it?
[20:15:21] it's a homemade one, for the client I'm working for
[20:15:32] about 4 years old i think
[20:15:37] You're saying you made it then?
[20:15:55] no, the previous steward made it
[20:16:18] I'm just skinning the wiki, and I just came across this error
[20:16:19] valhallasw`: the complete error message is: Could not resolve hostname gerrit.wikipedia.org: Name or service not known
[20:16:30] the lines in question are:
[20:16:42] function($i) {
[20:16:50] global $placeHolders;
[20:16:56] return $placeHolders[intval($i)];
[20:16:57] }
[20:17:06] I made changes to the ssh/config file and it works for git
[20:17:16] but I can't get through to my gerrit account
[20:17:34] sinix: Is $placeHolders defined anywhere else?
[20:18:16] If I recall correctly, an undefined variable (one that does not exist before) cannot change its scope like that.
[20:18:32] What does the extension do precisely?
[20:18:49] I'm searching for it now, but I'm guessing it was a deprecated mediawiki global - I've been running into a lot of errors in these old extensions that rely on globals that aren't in the current MW
[20:19:39] But how old? All MediaWiki globals now start with $wg
[20:20:12] "Placeholder" sounds like it does nothing anyway, by the looks of it.
[20:20:28] alisha_: gerrit.wikiMedia.org
[20:20:40] hm, yeah, good point - all the other ones have been $wgDeprecatedGlobal or whatever
[20:20:42] not .wikiPedia
[20:20:59] ac16: could you post your git config file in a pastebin?
[20:21:29] I'd suggest you comb through the code and determine its relevancy.
[20:21:55] Oh so silly of me
[20:22:15] well thanks valhallasw`
[20:22:22] alisha_: you're welcome
[20:23:56] thx Chron_ I think you're right
[20:24:09] I'm *honestly* not sure if this is possible, it's not anything I've tested or have seen to be evident -- Perhaps the old skin defined $placeHolders somewhere? I've never considered that anything could be passed between skins and extensions, but I guess that's technically a possibility.
[20:24:31] And if that's the case, I'm in some trouble myself!
[20:26:25] valhallasw`: I am getting a permission denied issue
[20:26:52] alisha_: https://www.mediawiki.org/wiki/Gerrit/Tutorial
[20:27:04] specifically, https://www.mediawiki.org/wiki/Gerrit/Tutorial#Set_Up_SSH_Keys_in_Gerrit
[20:27:51] I am following this link only
[20:28:12] but not able to run ssh successfully
[20:28:26] alisha_: in any case, your public key is either not sent to the server, or the server does not recognise it
[20:28:31] I have followed all these steps only
[20:28:53] Ok I will check for it
[20:29:10] sinix: Just be sure to comment it out with an explanation included - don't delete it! From a quick test I just ran, it seems as if passing data between the skin and an extension is not as easy as changing the scope to global, but I'd check the skin anyway!
[20:29:19] *(The old skin, that is)
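On the $placeHolders snippet quoted at 20:16: one way the callback could avoid the fragile global entirely is a closure use clause, capturing the array from the enclosing scope. A sketch under the assumption that $placeHolders is populated earlier in the same function (the sample data here is hypothetical):

```php
// Sketch: capture the array instead of declaring it global inside
// the callback. Requires PHP 5.3+ closures.
$placeHolders = array( 'first', 'second', 'third' ); // hypothetical data

$lookup = function ( $i ) use ( $placeHolders ) {
	// $i may arrive as a string (e.g. a regex capture), so cast it.
	return $placeHolders[intval( $i )];
};

echo $lookup( '1' ); // prints "second"
```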
[20:31:35] So, back to my question: could someone assist me with installing Semantic MediaWiki? I'm having issues with Composer.
[20:31:55] thx Chron_ good luck with Semantic MW :)
[20:33:31] wait, it looks like you're having trouble with composer? I've had some trouble with it, too, and figured a few things out with it - what's the problem? have you gotten composer installed?
[20:38:09] sinix: The installation itself is confusing! Is composer installed to your mediawiki directory, or..? Where does the json file go? In context, the official instructions aren't very clear.
[20:38:59] And when I think I do have it installed, and it's time to configure it to install SMW, it says something along the lines of "no changes" etc etc.
[20:39:37] yeah, the docs are a little bit terrible - it's a lot like npm or bower, if you know those - you install it globally and locally - so, the best idea is to do everything from the wiki's root directory
[20:40:38] what OS are you running?
[20:42:19] sinix: I've contacted you via PM.
[21:03:01] * hl3fx cries
[21:03:20] VisualEditor is driving me crazy
[21:03:34] been googling and trying things for about two days now
[21:03:51] not sure what else to do to get it to work on 1.23, about to just upgrade to 1.24
[21:03:55] unless anyone is willing to help?
[21:04:31] mediawiki 1.23, ubuntu 14.04
[21:04:50] getting parsoidserver-http-bad-status: 500
[21:05:02] was getting: 404
[21:05:14] checked and double-checked settings. ugh
[21:05:26] anyone else ever run into this issue?
[21:08:10] anyone around
[21:09:28] hl3fx: is parsoid logging anywhere?
[21:10:27] logging to /var/log/parsoid/parsoid.log
[21:10:42] although it's not outputting any helpful logs
[21:11:11] i was trying to figure out how to have node.js log to something, but did not get too far with that
[21:11:36] are you running vagrant?
[21:12:02] i'm not sure what that is? i read some stuff on that when i was googling around
[21:12:12] how do i find out?
[21:13:32] hl3fx: if you're not sure, you're almost certainly not. do you know what port parsoid is listening on, by any chance?
[21:13:38] ah, it's not vagrant
[21:13:44] i built it from scratch
[21:13:59] default port :8142
[21:14:32] what happens if you just curl it from the commandline? (i.e., if you run 'curl localhost:8142' on the server?)
[21:14:58] it works
[21:41:09] Is there a way to autocomplete category names in the search?
[21:43:59] I mean without typing "Category:" in front.
[21:45:08] add it to $wgNamespacesToBeSearchedDefault, perhaps?
[21:45:21] !wg NamespacesToBeSearchedDefault
[21:45:21] https://www.mediawiki.org/wiki/Manual:%24wgNamespacesToBeSearchedDefault
[21:46:23] It finds everything when I type "Categor..."
[21:47:51] But users don't do that. Also I have "GoToCategory", so when I type "foo" and hit "Go" it goes to Category:Foo. But people don't type Category:F to search it
[21:48:31] my cat ns is in $wgNamespacesToBeSearchedDefault
[21:48:46] and if you type foo what happens?
[21:48:56] it suggests nothing
[21:49:21] cos no article starts with Foo...
[21:49:39] Can't it search truncated?
[21:50:10] Subfader: can you paste the relevant part of your LocalSettings.php ?
[21:52:42] These might affect search: https://dpaste.de/gj4L
[21:55:21] Also disabled GoToCategory, no luck
[21:56:56] Doesn't even work on mediawiki.org. "New con..." doesn't suggest https://www.mediawiki.org/wiki/Category:New_contributors
[21:57:30] it's just a lame index listing from the API :(
[21:58:01] "assuran..." doesn't suggest https://www.mediawiki.org/wiki/Quality_Assurance
[21:58:41] But I would be happy with category names without the prefix already :)
[22:02:06] Subfader: might be related to bug https://bugzilla.wikimedia.org/show_bug.cgi?id=65753
[22:04:01] Hmh. Not sure what he aimed to do. Will open a bugzilla request, ok?
[22:08:49] Subfader: https://bugzilla.wikimedia.org/show_bug.cgi?id=24214 here it is
[22:10:13] Looks promising. Will try to hack it
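For completeness, the LocalSettings.php form of the $wgNamespacesToBeSearchedDefault suggestion from 21:45. Note this controls which namespaces full-text search covers by default, which is not necessarily the same code path as the suggestion dropdown Subfader is fighting with:

```php
// LocalSettings.php -- include the Category namespace (14) in
// default search, alongside the main namespace.
$wgNamespacesToBeSearchedDefault[NS_CATEGORY] = true;
```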
[22:10:47] user preferences influence that as well
[22:12:56] Hi, I'm making an extension with a special page and a form with HTMLForm... how can I show a text field when X option is selected in a "select" field?
[22:13:23] javascript?
[22:15:00] saper, but.. how can I insert Javascript into the HTMLForm?
[22:17:09] the best way is to use !resourceloader but you can also do it inline, at least for testing
[22:17:14] !rl
[22:17:14] ResourceLoader is the delivery system for JavaScript/CSS in MediaWiki. First released in MediaWiki 1.17. See also https://www.mediawiki.org/wiki/ResourceLoader , https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_%28users%29 and https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
[22:17:43] Subfader: looks like this code won't work with the new search suggest module
[22:18:34] 1.22 here
[22:20:23] yeah this was done in 1.21
[22:20:31] but it still has the "0" namespace hardcoded
[22:21:51] Where could I hack into it? I have no problem hacking core with comments
[22:26:18] Do you understand the function? I'd have no problem if it just searched all namespaces. They are just "suggestions" :)
[22:28:06] And why does it grab the NS in such a complicated way? The searchNs array is added in the DOM
[22:33:04] Hi there. We recently created a template for users to include a more detailed description of files when uploaded to the wiki, similar to what's used in Wikipedia (http://www.khwiki.com/Template:Aboutfile). I'm wondering if there's a way to make it appear on the Special:Upload form the way the license options appear on it. See https://lh4.googleusercontent.com/-hgO1eICy_dc/VDHGdYLlIDI/AAAAAAAAAcQ/bCTjZwikvIA/w958-h766-no/
[22:34:34] saper: I just added "namespace: 14" without the function and it suggests only categories. Cool enough
[22:34:47] Thanks for the hint
[22:36:26] dirty hack, yeah
[22:43:03] And how does it want multiple ns numbers? tried various forms, e.g. ["0","14"], but no luck
[22:46:43] Subfader: tried namespace: "0|14" ?
[22:48:03] then it only takes the first.
[22:49:01] let me check the mw.api docs
[22:49:57] namespace:14 also doesn't work. it suggested stuff from my browser history, I think.
[22:51:01] rly? strange
[22:51:35] Now! I had the pref "Enable simplified search bar (Vector skin only)" enabled. In Monobook...
[22:51:50] • namespace – namespace numbers to search (delimited by |, defaults to 0)
[22:52:11] Coolio! Will give that and the function another try
[22:53:38] https://www.mediawiki.org/wiki/API:Opensearch and https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.Api should take you somewhere
[22:55:41] to really reload all resourceloader stuff you need to use lots of magic voodoo
[23:06:14] cheers
[23:10:21] Subfader: watch this space, maybe I'll have a proper patch
[23:11:29] kk :) Just realized that the searchNs I saw in the DOM is the RL stuff the function grabs
[23:18:00] !rl
[23:18:00] ResourceLoader is the delivery system for JavaScript/CSS in MediaWiki. First released in MediaWiki 1.17. See also https://www.mediawiki.org/wiki/ResourceLoader , https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_%28users%29 and https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_for_extension_developers
[23:35:58] Is there any manual on working with databases (yes, plural, I have to work with the mediawiki database and with another one) from extensions?
[23:36:43] saper: I definitely don't get multiple NS values to work
[23:38:56] Subfader: ok I'm trying right now...
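The API side of what saper and Subfader are poking at, per the API:Opensearch parameter quoted at 22:51 (namespace takes pipe-delimited numbers). A quick JavaScript sketch, runnable from the browser console on a wiki page assuming the mediawiki.api module is loaded; it bypasses the search-suggest widget entirely:

```js
// Sketch: ask the opensearch API for suggestions from the main (0)
// and Category (14) namespaces directly.
var api = new mw.Api();
api.get( {
	action: 'opensearch',
	search: 'New con',   // whatever the user has typed so far
	namespace: '0|14'    // pipe-delimited namespace numbers
} ).done( function ( data ) {
	// opensearch returns [ query, [titles], [descriptions], [urls] ]
	console.log( data[1] );
} );
```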
[23:40:13] Great :) Just saying, if it doesn't work hardcoded there then you won't need to try a searchNs function
[23:48:55] The wiki doesn't describe any way to do it :\
[23:49:53] I just hardcoded the mysql stuff into it. You cannot use any MW functions for external dbs AFAIK
[23:50:12] Ok, I thought the wrapper could help me
[23:50:13] thanks!
[23:51:28] Polsaker: you should be able to, pretty easily
[23:51:48] wfGetDB( DB_SLAVE, array(), 'name of database' ); assuming the other one is on the same host and uses the same username/pw
[23:51:56] ohh
[23:51:59] if not, you'll need to set up some load balancer config iirc
[23:52:00] awesome
[23:52:37] you can look at Extension:GlobalBlocking for an example
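Expanding saper's one-liner into a fuller sketch: reading from a second database on the same server through MediaWiki's DB wrapper. The database name, table, and fields below are hypothetical; as saper notes, a database on a different host or with different credentials needs extra load balancer configuration instead.

```php
// Sketch (MW 1.2x): slave connection to another database on the same
// host, reusing the wiki's own credentials.
$dbr = wfGetDB( DB_SLAVE, array(), 'otherdb' ); // 'otherdb' is hypothetical

$row = $dbr->selectRow(
	'some_table',           // hypothetical table name
	array( 'id', 'name' ),  // fields to select
	array( 'id' => 42 ),    // WHERE conditions
	__METHOD__
);

if ( $row ) {
	// use $row->id, $row->name, ...
}
```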