[00:00:01] or should I alias the localized names to the namespaces?
[00:01:20] shadowphrogg3264: so the pages got imported into the main namespace with titles like "Whatever:Page title" instead of, say, "User:Page title"?
[00:01:29] exactly
[00:01:56] I looked for "namespace manager" on google
[00:02:11] and found a couple of seemingly dead addons
[00:02:32] shadowphrogg3264: if you're concerned about links in these pages to other pages still working, it might be simplest to indeed alias the names
[00:02:47] (and run the namespaceDupes.php maintenance script afterwards)
[00:03:12] what exactly does that do?
[00:03:27] the script?
[00:04:01] yea
[00:04:12] so, in general, the namespace number and the page title within the namespace are stored separately in mediawiki's database
[00:04:22] yep
[00:04:40] and the current 'offending' pages are in the main ns
[00:04:42] this script examines all pages and finds entries that correspond to missing namespaces, or where the stored namespace and page title conflict, and does its best to fix them up
[00:04:48] oh I see
[00:04:48] yes
[00:05:02] so it will change the entry from (0, Whatever:Page) to (Whatever, Page)
[00:05:18] just what I needed, awesome
[00:05:36] shadowphrogg3264: and if you're not concerned about preserving links/page names, then i'd try switching the wiki language temporarily to the one in the dump, running namespaceDupes.php, then switching the language back to english – this should place the pages in the right namespaces and then switch the namespaces' names
[00:05:50] but make a backup, i never actually tried doing that :P
[00:06:09] MatmaRex: I'd rather do the alias method
[00:06:18] it suits my needs better
[00:06:34] because this is only a part of the problem
[00:06:38] I mean
[00:06:58] it's usable this way, but we want to put the old content in another ns
[00:07:25] with write protection
[00:08:15] so all the main ns articles should be in a Legacy: namespace
[00:15:37] or can I protect it based on categories?
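The alias approach discussed above can be sketched in LocalSettings.php. This is a minimal sketch, not taken from the conversation: the namespace name "Whatever", the localized alias "LocalizedName", and the ID 3000 are placeholders — pick a free custom-namespace ID (100 or above) and the names from your own dump.

```php
// LocalSettings.php — sketch of the namespace-alias approach.
// NS IDs and names below are hypothetical examples.
define( 'NS_WHATEVER', 3000 );
define( 'NS_WHATEVER_TALK', 3001 );
$wgExtraNamespaces[NS_WHATEVER] = 'Whatever';
$wgExtraNamespaces[NS_WHATEVER_TALK] = 'Whatever_talk';
// Alias the localized name so existing links keep resolving:
$wgNamespaceAliases['LocalizedName'] = NS_WHATEVER;
```

After that, running the maintenance script moves the stranded (0, "Whatever:Page") rows into the new namespace; the --fix flag tells it to actually apply the changes rather than just report them: `php maintenance/namespaceDupes.php --fix`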
[00:15:52] AND mass-add categories, of course
[00:18:13] no, use namespaces for that
[00:18:25] * DanielK_WMDE should go to bed
[00:18:39] I would if I could
[00:19:59] anyways, bedtime here too, will come back later today
[00:20:08] thanks guys
[00:24:27] DanielK_WMDE: hey, i'm going to cherry-pick aude's revert (https://gerrit.wikimedia.org/r/#/c/159818/) to wmf21
[00:25:04] DanielK_WMDE: due to
[00:25:09] is that ok?
[01:01:28] howdy -- it looks like #if can have empty fields if the condition is true/false, can #ifexist have an empty field too?
[01:02:40] I'm trying to see if a page exists before adding a [[link]] to it, but if it doesn't, I'd prefer nothing show
[01:04:59] prech: you could test {{#ifexist:Link|[[Link]]|}}
[01:05:13] I think it works
[01:06:20] the last | might not be necessary (you should test it if you remove it though)
[01:16:43] PiRSquared42: thanks!
[01:17:05] You're welcome
[03:32:05] howdy, I sometimes see code with #if and many {{test conditions}}, like #if{{test1}}{{test2}}{{test3}}| does this mean all 3 test conditions must be true, or just one?
[03:36:12] prech: hi.
[03:36:13] https://www.mediawiki.org/wiki/Help:Extension:ParserFunctions#.23if This function evaluates a test string and determines whether or not it is empty. A test string containing only white space is considered to be empty.
[03:36:13] So I guess it evaluates all 3 things (you wrote them as templates, but I think you meant {{{}}}, which are params. Hm.)
[03:41:05] if string a+b is empty, then both a and b must be empty as well
[03:43:33] prech: if any of them are true, then the result is true
[03:43:41] true=non-empty
[03:45:11] ok, that's what it looks like in the code, too. Thanks!
[07:02:18] are you in here
[07:02:19] ?
[07:02:28] i want to add a sentence in my wiki footer
[07:02:34] how could i do that?
[07:06:25] could you help me?
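The two ParserFunctions behaviours discussed above can be illustrated in wikitext ("Some page" and the parameters are placeholder examples):

```wikitext
{{#ifexist:Some page|[[Some page]]}}
{{#if:{{{1|}}}{{{2|}}}{{{3|}}}|at least one parameter is set|all empty}}
```

The first line emits a link only if the page exists, and nothing otherwise (the trailing empty branch can be left off). The second shows why stacking several values acts as a logical OR: #if only tests whether the concatenated string is empty, so the result is non-empty as soon as any one value is.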
[08:21:17] could anybody help me with the footer
[08:21:31] i want to add a footer, such that i can have a text
[08:21:39] how could i add this
[09:19:05] hello
[09:19:20] i want to add a special text in my wiki page footer
[09:19:26] could anybody help me?
[09:26:36] could you help me?
[09:26:57] sasan: hi. why do you ask me specifically?
[09:27:32] because i know you, i asked you a question before
[09:27:36] and you answered me
[09:27:54] that does not mean that I am a personal support line though :)
[09:27:58] so let me google "mediawiki manual footer" for you...
[09:28:10] Alright, this is the first result on google.com: https://www.mediawiki.org/wiki/Manual:Footer
[09:29:35] in general, I'd appreciate not getting pinged when it's not a question specifically directed at me, as pings are disruptive. thanks :)
[09:29:49] today, mediawiki.org has a problem in my country
[09:30:03] because of this, i have a problem opening it
[09:30:38] this is why i asked on irc
[09:30:48] if you do not want to help me, no problem
[09:30:56] i will wait till someone helps me
[09:31:35] so you mean that you cannot access https://www.mediawiki.org/wiki/Manual:Footer ?
[09:31:58] yes, you can remote in and check my pc
[09:32:01] (I did help you by telling you where it is explained.)
[09:32:29] maybe today in my country it is blocked
[10:10:31] i use special software to search
[10:10:43] i downloaded the extension and pasted it in my extensions folder
[10:10:52] then i added this to LocalSettings.php
[10:11:04] require_once('extensions/Footer/Footer.php'); $wgFooterLinks['numberofwatchingusers']=false; $wgFooterLinks['credits']=false; $wgFooterLinks['copyright']=false; $wgFooterLinks['tagline']=false; $wgFooterLinks['privacy']=false; $wgFooterLinks['about']=false; $wgFooterLinks['disclaimer']=false; $wgFooterLinks['viewcount']=false; $wgFooterLinks['lastmod']=false;
[10:11:09] but my wiki cannot load
[10:11:11] why?
[10:11:44] sasan: hi
[10:12:05] what does it do instead of loading?
[10:12:19] https://www.mediawiki.org/wiki/Debugging
[10:13:51] hi
[10:13:56] could you look at my code
[10:14:07] yes, but I need more info from you
[10:15:23] which info do you want?
[10:16:48] I asked what it does instead of loading, and linked to a page which describes how to obtain additional information for troubleshooting the issue
[10:17:37] I'd be happy to walk through this, but I'm unfamiliar with that guide myself. I would personally start with checking webserver logs and looking at what it loads instead of the wiki.
[10:19:02] but i told you above, mediawiki cannot be opened today in my country
[10:19:03] :(
[10:20:09] does that stop you from explaining what you see instead of the wiki?
[10:20:37] no
[10:20:39] do you see a blank page?
[10:20:43] parsintelligent.com/wiki
[10:20:46] do you see an error message?
[10:20:48] * Svetlana looks
[10:22:25] : have you seen my wiki?
[10:22:39] I see, it's a 500 error.
[10:22:47] do you see anything in your webserver logs about it?
[10:24:20] how could i see the webserver logs?
[10:24:51] what webserver are you using?
[10:24:59] i use a host
[10:25:04] i am in cpanel
[10:25:49] I am unfamiliar with cpanel, but do you have SSH access in addition to that?
[10:27:23] i have an error log in the wiki
[10:27:27] can it help?
[10:32:42] of course
[10:36:04] this is the last one
[10:36:05] [12-Sep-2014 14:39:50 Asia/Tehran] PHP Fatal error: Call to a member function getMaxIncludeSize() on a non-object in /home/parsinte/public_html/wiki/includes/parser/Parser.php on line 3139
[10:36:05] [12-Sep-2014 14:49:06 Asia/Tehran] PHP Fatal error: Call to a member function getMaxIncludeSize() on a non-object in /home/parsinte/public_html/wiki/includes/parser/Parser.php on line 3139
[10:41:54] what happened?
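For a 500 error like the fatal one above, MediaWiki's debugging manual suggests temporarily surfacing errors from LocalSettings.php. A minimal sketch of the standard settings (the log file path is an example — pick any location that is not web-readable, and turn these off again on a production wiki):

```php
// LocalSettings.php — temporary debugging settings, per Manual:How to debug.
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );
$wgShowExceptionDetails = true;          // show full exception backtraces
$wgDebugLogFile = '/tmp/mw-debug.log';   // example path; keep it outside the web root
```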
[11:20:44] finally i could delete the footer
[11:20:55] i used this command to add a text
[11:21:00] but it shows me an error
[11:21:03] $wgFooterManagerLinks['eweq']=true;
[11:21:15] it shows me nothing
[11:34:16] could anybody help :(
[11:39:08] i want to add a footer to my wiki page
[11:39:27] i use this: $wgFooterManagerLinks['my desired text']=true;
[11:40:26] but nothing is shown
[11:48:55] please answer me :(
[12:06:28] could anybody help?
[12:17:32] Niharika: could you help me??
[12:18:55] sasan: When a person X joins this channel, he/she cannot see any conversation that happened before they joined.
[12:19:23] i want to add a footer
[12:19:30] i followed all the instructions
[12:19:36] i disabled the default footer
[12:19:46] but i want to add my desired text as the footer
[12:19:51] i use
[12:20:05] $wgFooterManagerLinks['my desired text']=true;
[12:20:19] it is not shown
[12:21:17] why did you use $wgFooterManagerLinks['my desired text']=true ?
[12:21:34] i want to add my desired text as the footer
[12:21:35] ?
[12:21:44] why did you use $wgFooterManagerLinks['my desired text']=true ?
[12:22:02] I did not ask what you want to do. I asked why you used that line.
[12:22:04] i want to save a text as my footer
[12:22:09] I did not ask what you want to do. I asked why you used that line.
[12:22:32] because i found that if i use false, it disables the footer
[12:22:42] that's not an answer to my question.
[12:22:49] I asked why you used that specific line
[12:22:59] i found it on google
[12:23:12] you can ask the website admin
[12:23:19] i do not know why he uses this
[12:23:23] where is "in google" exactly?
[12:23:36] some random website result is not a source. I'd like to know the source.
[12:23:56] maybe
[12:24:04] could you help me find the right one?
[12:24:06] ?
[12:24:19] sasan, I explained it to you two hours ago already?
[12:24:20] remember?
[12:24:28] google?
[12:24:37] sasan, no?
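A note on the lines being tried above: $wgFooterManagerLinks is not a MediaWiki core setting — it belongs to a third-party footer extension, which is why it silently does nothing without that extension loaded. In core (as documented on Manual:Footer for MediaWiki 1.2x), a custom footer item is usually added with the SkinTemplateOutputPageBeforeExec hook. This is a sketch adapted from that pattern, not a tested configuration; 'myfooteritem' is a made-up key, and the exact hook behaviour should be verified against your MediaWiki version:

```php
// LocalSettings.php — sketch: add a custom text item to the footer.
$wgHooks['SkinTemplateOutputPageBeforeExec'][] = function ( $sk, &$tpl ) {
	// Register the HTML for our hypothetical footer item...
	$tpl->set( 'myfooteritem', 'My desired footer text' );
	// ...and list it among the footer links so skins render it.
	$tpl->data['footerlinks']['places'][] = 'myfooteritem';
	return true;
};
```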
[12:24:54] https://www.mediawiki.org/wiki/Manual:Footer if you remember?
[12:25:13] i told you, i cannot open mediawiki
[12:25:16] 100 times
[12:25:21] so what do you propose?
[12:27:08] ask someone who knows
[12:28:52] sasan, http://imagebin.org/319426
[12:29:27] Thanks, i will verify it :)
[14:27:52] Can someone explain the job queue to me? I am using CirrusSearch on a 1.23 MediaWiki server. I just set it up a couple of days ago, and it seemed to be working well. But then I noticed that changes I made to pages yesterday had not shown up. I did a bit of research and found the runJobs.php script. That fixed it, but I'm wondering why the job queue isn't running itself. I believe it is turned on by default, with one job per request, correct?
[14:42:17] Hello! I have had a problem for quite some time getting my private and secure wiki to work with Parsoid. Someone here told me I had to use nginx for it to work. For someone familiar with nginx this is probably no sweat, but I am a total noob there. I have another wiki that is public, and that works fine, so there is no problem with Parsoid itself.
[14:42:17] My setup: a Synology NAS with MediaWiki 1.22.6, VisualEditor 0.1.0.
[14:42:17] Parsoid localsettings: https://dpaste.de/cMhM
[14:42:17] nginx setting: https://dpaste.de/piqd
[14:42:19] wiki LocalSettings.php setup: https://dpaste.de/O2fV
[14:42:21] When I try to edit a page I get: "parsoidserver-http-bad-status: 500"
[14:42:23] I am obviously missing something here... Could someone please point me in the right direction? (I have about half an hour before I must run off)
[14:52:49] https://www.mediawiki.org/wiki/Thread:Extension_talk:VisualEditor/parsoidserver-http-bad-status:_500
[14:52:53] https://www.mediawiki.org/wiki/Thread:Extension_talk:VisualEditor/VisualEditor_will_simply_not_work...
[14:53:08] gute: ^^
[14:55:02] keisetsu_: there's documentation. did you read it yet?
https://www.mediawiki.org/wiki/Manual:Job_queue#Job_execution_on_page_requests
[14:55:06] bbl
[14:55:30] jeremyb: yes, I read it.
[14:55:47] jeremyb: no information in any of those. the last one even says "No such thread"
[14:57:04] gute: those 3 dots are part of the URL
[14:57:16] gute: then your client didn't linkify the whole thing
[14:57:29] Well, anyway. I read the documentation.
[14:58:19] I don't really understand the reasoning behind doing it that way, but that's been discussed elsewhere. I guess I'm just going to have to run a cron job.
[14:58:41] keisetsu_: you can also try setting $wgRunJobsAsync = false;
[14:58:50] it solved problems on some installations
[14:59:08] Ok. is "true" the default?
[14:59:13] but it would be good to know why it's failing
[14:59:24] !wg RunJobsAsync
[14:59:25] https://www.mediawiki.org/wiki/Manual:%24wgRunJobsAsync
[14:59:31] yes, it is
[15:00:42] jeremyb: I've done all that. As you can see in the pastebins.
[15:01:20] Vulpix: it's not failing, but I don't get much traffic, and I'm doing a lot of template editing. I'm testing for wider use in the company, and I'm just wondering what kind of latency there is going to be. It will probably never be a high-volume site.
[15:02:18] Vulpix: meaning, as I understand it, if a lot of jobs are added, it will take a long time for updates to the search index to happen.
[15:02:37] gute: I think there's something wrong with your file -> pastebin pipeline. or your config is very wrong
[15:03:15] well, I just copy paste
[15:03:31] what's wrong with the config?
[15:03:36] maybe that's your problem
[15:03:49] e.g. lines that end in $
[15:03:50] keisetsu_: yes, you're right
[15:04:01] gute: and I think you haven't "done all that"
[15:04:43] bye
[15:05:17] ok, sorry if I sounded disrespectful. I am in a bit of a hurry. Nothing bad meant.
[15:06:57] no, it's fine. I just have to go afk. and your pastebin really is corrupt
[15:08:04] I see what you mean now. that's because I copied it from nano.
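The cron-job approach mentioned above is the usual fix for low-traffic wikis: run the runJobs.php maintenance script periodically instead of relying on page views to drain the queue. A sketch of a crontab entry (the install path /var/www/wiki is a placeholder, and the interval and job cap are arbitrary examples):

```shell
# crontab entry: process up to 300 queued jobs every 15 minutes.
*/15 * * * * php /var/www/wiki/maintenance/runJobs.php --maxjobs 300 > /dev/null 2>&1
```

With a cron job in place, $wgJobRunRate can be lowered so page requests don't do job work at all.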
Darn, my time is up. I'll have to get back in here later tonight. bye.
[15:13:57] Vulpix: Hmm. I have two servers running, a dev and a production version. Mostly the same. I ran runJobs.php manually on the dev version, and now the job queue is down to 129. I didn't run runJobs on production. I added "$wgRunJobsAsync=true;" to the production LocalSettings before you told me it was the default; now I see the queue go down every time I look at a page. But when I do the same on the dev version, the queue doesn't go down.
[15:14:37] Vulpix: The production version is at around 8000.
[15:14:40] did you mean $wgRunJobsAsync=false;?
[15:14:47] Nope, true
[15:15:03] true is the default, so you changed nothing
[15:15:25] Vulpix: I changed it to false, and it doesn't seem to do anything. You're right.
[15:18:20] Vulpix: When it's set to true (or left out, I guess), it does decrement each time I refresh the page. When I change it to false, I don't see any change (at least not in the time I've been looking at it)
[15:19:28] that's strange, because the usually problematic setting is leaving it at true, since it makes an additional HTTP request to itself to pick a job to run
[15:19:53] Vulpix: I guess what I'd like to know is, should I just set up a cron job? I'm likely not going to get enough traffic to empty these jobs out of the queue.
[15:19:57] setting it to false makes it run on the same request, while it serves the current page
[15:20:20] yes, setting up a cron job is always a good idea
[15:21:06] Vulpix: Yeah, I'm pretty sure it doesn't do anything at all when set to false. It's a better idea to use true anyway, right? So it doesn't mess with user interaction?
[15:23:10] if it doesn't cause problems, yes
[15:23:13] I can't do [[Media:some_long_file.pdf Foo]], is there any way to have 'Foo' display while still linking to the file?
[15:23:41] pi-: why can't you do that?
[15:24:03] Also, it would be [[Media:some_long_file.pdf|Foo]] (missing pipe)
[15:24:15] Super, that's what I was missing
[15:24:39] :)
[15:24:51] Vulpix: Thanks for your help
[15:25:14] yw
[17:15:34] So, using HTMLForm, is there any easy way to clear the form html after they click submit, other than hacky jquery?
[17:15:52] I have a new installation of mediawiki: a proxy node, two app nodes and two databases (master and slave configuration). I get an error when hitting save page; sometimes it works the first time and never a second time: "Sorry! We could not process your edit due to a loss of session data. Please try again. If it still does not work, try logging out and logging back in." I tried
[17:17:41] Sometimes it doesn't even save the page the first time I click on save page; only after clicking on show changes first and then save page will it save the edit. BTW, I'm on MediaWiki v1.23.3.
[17:18:33] Tom1900: make sure your requests are always being processed by the same app node
[17:19:13] I did some googling around and tried some of the suggestions but they didn't help, like editing the /etc/hosts files - Vulpix: how do I configure that? Which LocalSettings parameters should I be looking at?
[17:19:38] that's a setting of your proxy node, not of LocalSettings
[17:21:18] Vulpix, I'm new to this, so if you could help point out the file I need to edit and the parameters, I'd really appreciate it...
[17:22:57] well, if you managed to make an installation with a proxy node, 2 app nodes and master-slave replication, I'm sure you'll be able to find which setting can do that, because I have no idea
[17:33:14] how do I "Make sure your wiki includes ajax.js on all pages"? Is that referring to $wgUseAjax = true; ?
[17:33:44] wmat: ummm, that's referring to MediaWiki 1.16 or something similarly old…
[17:33:44] where did you read that? I think that's pretty outdated
[17:34:08] that statement is from the HotCat installation instructions
[17:34:14] heh
[17:34:21] hotcat?
lol
[17:34:23] it doesn't even depend on ajax.js
[17:34:26] for a few years now
[17:34:32] heh, lovely
[17:34:39] (it used to, though)
[17:34:44] wmat: I know dat feel
[17:35:12] soo, I have HotCat almost working, but the category suggestion feature doesn't work yet
[17:35:36] it's supposed to pop up and suggest categories when I start typing, right
[17:36:10] yeah, i think so
[17:36:15] wmat: the ajax you're trying to use is so old the nursing home is getting tired of it lingering ;P
[17:36:23] heh
[17:36:54] Have you popped open your handy dandy debug console to see if there are any errors?
[17:37:07] was just about to ;)
[17:37:31] wmat: the thing about hotcat is that it's written to support like ten MediaWiki versions back, and this resulted in the code becoming an absolutely uncontrollable growth that no one dares dive into
[17:37:50] crap
[17:38:13] well, i was using CategorySuggest, but it wasn't working correctly either
[17:38:31] maybe I'll go back to it though
[17:38:35] * Ulfr_ butts out of the conversation now, as he has no idea what hotcat is :D
[17:38:38] wmat: it usually works surprisingly well though
[17:38:57] MatmaRex: yeah, I like HotCat tbh
[17:40:58] Thanks Vulpix, you pointed me in the right direction and I figured out the problem: sticky sessions.
[17:41:26] Hm. Is there any way I can have an HTMLForm submit callback be set to a special page instead of another form?
[17:41:28] :)
[17:45:27] It's a sad day when I understand the SpecialPage class better than some other class :(
[17:46:52] I have another problem regarding wiki version 1.16.2 r0, which I'm trying to replace with 1.23.3. There is this one table called "current_edits" that somehow is being updated on my slave instance (not good). I inherited this existing setup, so now I'm supposed to fix it. Any idea where this could have come from?
[17:47:07] OH MAN I'M DUMB
[17:47:11] THANK YOU FOR NOT CALLING ME OUT ON IT.
[17:47:25] a few deprecated functions in this CategorySuggest extension :(
[17:49:59] Tom1900: current_edits is not a known table for MediaWiki core. Check if it's from an extension
[17:50:19] probably the EditWarning extension
[17:51:15] yep
[17:51:31] EditWarning seems to use editwarning_locks
[17:51:57] maybe old versions used that table
[17:55:02] it seems so: https://www.mediawiki.org/wiki/Extension:EditWarning/0.3.8
[17:55:15] The version I have installed is 0.3.4. Looks like the current one is .4.
[17:55:36] damn, there it is...
[17:56:13] is that extension even necessary?
[17:57:37] Wow, thanks a lot for your help, Vulpix and wmat. you guys rock! I don't know if it is necessary, but this is what I inherited from someone else...
[18:38:24] I asked about this yesterday, but the answer I got wasn't very satisfying, and it goes nicely with wmat's question earlier. I'm mostly happy with MsCatSelect, but it removes categories from the wikitext when you edit a page, then puts them back when you are finished with the page. This works for most cases, but not when the category needs to be inside certain text, for example, in noinclude tags. It takes the category out of the noinclude tags,
[18:38:24] then puts it at the end of the page, so then pages include categories for templates that they shouldn't. I've put in a request with the developer and might even try fixing it myself some day, but I'm guessing it'll be a while before that happens. Long story short (sorry), is there a way I could introduce a magic variable that I could use in pages where I don't want to run the extension?
[18:39:39] BTW, the answer given was to use HotCat, which I didn't find as useful
[18:41:13] ah, i was unaware of that extension, i think i'll try it out
[18:43:29] wmat: It's pretty user-friendly, but I had to modify it to work with custom namespaces.
[18:46:41] ugh, no git repo :(
[18:49:17] this is lovely though
[19:07:15] hi guys.
[19:08:27] Who here agrees with the statement that you need a reverse proxy set up in order to get a private wiki with https to work with parsoid?
[19:09:06] And if so, why is that?
[19:16:44] gute: have you tried asking in #mediawiki-parsoid or #mediawiki-visualeditor? those guys probably know a lot more about it
[19:18:07] oh? didn't know they existed. Thanks!
[19:27:40] gute: for the record, you don't need a reverse proxy for private wikis
[19:33:26] gwicke: alright
[19:33:30] ;)
[19:36:21] gute: but it might come in handy for something else ;)
[19:37:36] Hi all, I just did a fresh install of 1.23.3
[19:38:00] but I'm getting a bunch of exceptions when creating / editing pages
[19:38:18] MalformedTitleException: Empty title
[19:38:38] has anybody seen anything like this?
[19:46:19] hi
[19:46:36] i want to start working on mediawiki
[19:47:01] please guide me
[19:48:54] Hi sofat
[19:48:59] You want to contribute to the code?
[19:49:31] yes
[19:49:39] i want to contribute
[19:49:48] !start
[19:49:48] https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
[19:50:28] doanauog: do you have any extensions installed?
[19:50:54] Vulpix, no, just a straight stock installation
[19:51:08] yes
[19:51:17] that's weird, I've never seen something like that
[19:51:21] i developed one extension in mediawiki
[19:52:24] doanauog: maybe it would be helpful if you get a stack trace
[19:52:29] !debug | doanauog
[19:52:29] doanauog: For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[19:54:15] Vulpix, http://pastebin.com/MGYcU5q3
[19:56:15] that's thrown on the user page... but I get the same thing whenever I try to create or edit pages as well
[19:56:43] don't worry about it too much.
I can look deeper
[19:56:57] I just wanted to check to see if anyone had run across it before
[19:58:37] it's trying to construct a block object on an empty title, huh...
[19:58:44] okay
[20:01:46] Okay, I feel a little bit like a dog stuck behind a screen door right now.
[20:02:18] I have everything I need set up from this HTMLForm class, but all the values I need to access are protected! and the one place I can play with them doesn't like it if I try to sneak the values out.
[20:02:32] How does one access the chosen form values outside of the callback for submit?
[20:02:45] Okay, so I need some help with the CSS file on monobook. I'm trying to change the background color from white to black, and I'm going to be honest, it's been a while since I looked at the CSS, and I was wondering if anyone can help me
[20:03:11] smoss: the default background for monobook is an image
[20:03:43] Ulfr_: I don't know anything about HTMLForm, but if those variables are protected, I guess you'd need to create your own class that inherits from it, so you have access
[20:03:49] @Ulfr: Thanks, now to go find a black image. I hate when they change things, even though we need to keep up with tech
[20:04:21] smoss: Actually, only the top is. Past that you just need to go to special:monobook.css and change the background-image value
[20:04:24] or assign one
[20:04:29] Vulpix: You're kidding :|
[20:05:32] I'm not! protected variables can be accessed by child classes. Private variables can't
[20:12:49] so
[20:13:03] Hi, is it possible to add a dropdown, similar to "heading", to the wikieditor toolbars?
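Vulpix's suggestion above — subclassing to reach protected members — looks like this in plain PHP. The class and field names here are made up for illustration; this is not HTMLForm's real API, just the general language mechanism:

```php
<?php
// Sketch: a child class can read a parent's protected members.
class SomeForm {                      // stand-in for a class like HTMLForm
	protected $mFieldData = [ 'title' => 'Example' ];
}

class MyForm extends SomeForm {
	public function getFieldData() {  // expose the protected value
		return $this->mFieldData;
	}
}

$form = new MyForm();
echo $form->getFieldData()['title']; // prints "Example"
```

As noted in the chat, this works for protected members but not for private ones, which stay inaccessible even to subclasses.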
I can't find anything on this
[20:13:04] guide me
[20:18:23] Ulfr: Where do I find special:monobook.css
[20:20:39] smoss: It's a page on your wiki
[20:28:49] Okay, so Ulfr told me that I need to work on Special:Monobook.css, however my wiki is telling me that I don't have that page
[20:29:39] smoss: MediaWiki:Monobook.css, rather
[20:30:15] (probably, i don't know the context)
[20:31:54] MatmaRex: Working on the skin. I know the header is a background. Just need to change the color for everything below that
[20:33:40] I got it now
[20:42:35] does ANYONE have a workaround for parsoid to communicate with a wiki's api even though it is private? I use mw 1.22 and I suspect that that version of visualeditor does not have support for private wikis.
[20:48:30] the parsoid service is on the same server as the wiki. could one just permit a single ip-address to be able to work fully with the api?
[20:49:39] If I get a solution, I promise to write about it on the mediawiki wiki.
[20:50:58] Friday is probably a bad time to ask questions here, I suppose...
[21:18:19] gute: it's working fine in production
[21:18:45] I'm not sure about 1.22
[21:19:06] that's pretty old
[21:19:37] if you can, I'd upgrade to 1.23 at least
[21:37:03] gwicke, well, if I do an upgrade, do I have to reinstall the parsoid service as well?
[21:38:38] Can someone please help me set up short URLs? The wiki is at http://vrail.ratime.org.au/w and I need it to be reachable at http://vrail.ratime.org.au/wiki/Main_Page but everything I have tried is not working.
[21:40:38] gwicke, I'm worried that an upgrade may f**k everything up.
[21:55:31] Can someone please help me set up short URLs? The wiki is at http://vrail.ratime.org.au/w and I need it to be reachable at http://vrail.ratime.org.au/wiki/Main_Page but everything I have tried is not working.
[21:57:35] fusionoz: That's a rather general question.
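The usual short-URL recipe from Manual:Short_URL for a layout like the one above (wiki installed under /w, articles served at /wiki/...) is one Apache rewrite rule plus two LocalSettings.php lines. A sketch, not tested against this particular server:

```php
// LocalSettings.php
$wgScriptPath  = "/w";
$wgArticlePath = "/wiki/$1";
```

And in the Apache virtual host (or an .htaccess file, with mod_rewrite enabled):

```apache
RewriteEngine On
RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]
```

Note that the 403 reported below is the webserver denying access before MediaWiki ever runs — typically a Directory/Require block or file permissions — so the rewrite rule itself is not the first place to look.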
I guess you know this: https://www.mediawiki.org/wiki/Manual:Short_URL
[21:58:15] That is what I have been working from, and I even used the auto generator that is listed there
[21:58:25] what happens is it gives a 403 error
[22:00:35] gute: No, you can keep Parsoid. Do restart it after the upgrade though
[22:00:37] I have been researching this for about an hour, perhaps someone here can help? I've got a double-underscore variable set up for an extension, but can't for the life of me figure out how to see if a particular page has it set. Does the EditPage object or something have a method that lists the magic variables set in a particular page?
[22:03:44] fusionoz: Alas, I am really not an expert when it comes to setting up Apache. According to WP, "Status code 403 responses are the result of the web server being configured to deny access, for some reason, to the requested resource by the client."
[22:04:02] fusionoz: So make double sure that the server has the necessary read rights.
[22:04:45] But I have not blocked it, and I have used .htaccess files and my Virtual Host file to try and get this to work, and it keeps doing the same thing
[22:07:31] keisetsu_: this information is generally stored in the page_props table
[22:08:36] keisetsu_: if you have a ParserOutput object, you can access it using $pout->getProperty( 'notoc' ) etc.
[22:09:02] They all do it; it is a Windows Server running W8.1 and XAMPP
[22:09:14] MatmaRex: I think I got it: I compare MagicWord::get('myvariable') to $m_pgObj->textbox1. Oh, but I might take a look at that.
[22:10:31] keisetsu_: EditPage has an mParserOutput field that contains such an object, but (i think) only when the user is previewing the page
[22:11:54] MatmaRex: Ok.
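The ParserOutput lookup MatmaRex describes can be sketched like this ('myvariable' stands in for the questioner's unnamed double-underscore word, and how you obtain $pout depends on the hook you're in — this is a sketch, not a complete extension):

```php
// Sketch: checking whether a page set __MYVARIABLE__, per the advice above.
// $pout is a ParserOutput object, e.g. from a parser-related hook.
if ( $pout->getProperty( 'myvariable' ) !== false ) {
	// The magic word was set on this page — either directly
	// or by a transcluded template, which a raw text search would miss.
}
```

The same information persists in the page_props table, so it can also be queried for pages that aren't currently being parsed.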
I think I can get it with my method, thanks for your help
[22:12:34] keisetsu_: oh, you can call EditPage::getPreviewText() to populate that field, but i can't guarantee that it won't have funny side effects :)
[22:13:25] keisetsu_: if you only look for __NOTOC__ (etc.) in the page text, you'll miss magic words added by templates
[22:14:08] That's okay. I'm just trying to keep an extension from running on the page that I'm editing.
[22:14:32] MatmaRex: And I think I got it working!
[22:15:05] :)
[22:17:48] MatmaRex: Thanks for your help. Good to end the work week with a win!
[22:27:10] Does anyone know anything about the history of the user_registration field in the user table on WMF wikis?
[22:27:19] I know it was added in 1.6 (according to https://www.mediawiki.org/wiki/Manual:User_table#user_registration)
[22:27:31] Was it backfilled for any older accounts?
[22:27:37] If so, was this done uniformly?
[22:27:46] I'm asking because of https://bugzilla.wikimedia.org/show_bug.cgi?id=70759 .
[22:28:03] I'm wondering if it affected English Wikipedia differently, maybe because it was backfilled on enwiki.
[22:28:32] JeroenDeDauw: can you take a look at https://www.mediawiki.org/w/index.php?title=Extension_talk:Maps&offset=20140912194721#Can_a_custom_layer_be_used__instead_of_a_mapping_service.3F_47951 ? i'm not really familiar with the extension
[22:28:52] ^ cc Tim-away brion csteipp
[22:29:13] ?
[22:29:18] ah, user_registration
[22:29:32] superm401: yeah, it was added at some point, so enwiki (and maybe some others) will have
[22:29:35] a) some null entries
[22:29:39] ftr, I have no idea...
[22:29:42] b) some entries that have had values backfilled
[22:29:51] i suspect backfilling was *not* consistent
[22:30:08] Okay, I am going to suggest that the backfill be done everywhere.
[22:30:27] ok
[22:30:32] either backfill everywhere or code to assume null is legit :D
[22:31:20] superm401: There's an open bug about that.
[22:31:36] Carmela, oh, if you can find the number, I'd appreciate it.
[22:31:37] The backfilling was done one-time and was kind of inaccurate.
[22:31:50] Because it was based on first edit, not the guessed time of registration.
[22:32:00] Carmela, well, I think that was the best they could do.
[22:32:01] https://bugzilla.wikimedia.org/show_bug.cgi?id=22097
[22:32:19] I mean, if you didn't store registration time before, how else would you know? I guess other logged actions, but that's it.
[22:32:21] https://bugzilla.wikimedia.org/show_bug.cgi?id=18638
[22:32:30] The latter has a neat graph.
[22:32:38] Or you can approximate using the user ID.
[22:33:36] Yeah, that's actually clever (some users will have actions, some won't, but you can draw a curve).
[22:34:13] brion, yeah, null should be handled, but if it was backfilled it would be less of an issue in practice.
[22:34:39] yep
[22:35:02] "The user.user_registration field is NULL for all users predating its addition circa 2005 (except on enwiki)."
[22:35:03] Why?!
[22:35:15] I understand doing it on edits, not on all actions (including edits but also logged ones)
[22:35:16] https://bugzilla.wikimedia.org/show_bug.cgi?id=22097#c4
[22:35:17] But why only enwiki?
[22:35:18] jackmcbarn: sure
[22:35:33] I'm not sure why only en.wiki.
[22:35:42] https://bugzilla.wikimedia.org/show_bug.cgi?id=22097#c0
[22:35:45] Yeah.
[22:35:53] It's goofy.
[22:35:59] There are open bugs as a result. :-)
[22:36:51] Thanks for the info.
[22:50:59] Is anyone else having trouble connecting to Freenode? My client is failing 100%
[22:51:04] Had to use webchat
[22:51:13] People are pinging out all over the place
[22:51:22] Hrmph
[23:04:36] is anybody around?
[23:05:14] anybody?
[23:05:15] :(
[23:05:28] !ask
[23:05:28] Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's.
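The "approximate using the user ID" idea above works because user IDs are assigned sequentially: users with a known registration time (or first logged action) serve as anchor points, and the NULL rows between two anchors can be estimated by interpolation. A rough sketch of the anchor query — the table and column names are MediaWiki's real schema, but the interpolation step itself would live in application code:

```sql
-- Anchor points: known (user_id, timestamp) pairs on either side of a gap.
SELECT user_id, user_registration
FROM user
WHERE user_registration IS NOT NULL
ORDER BY user_id;

-- For a user whose user_registration is NULL, estimate it by linear
-- interpolation between the nearest known IDs below and above it.
```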
:)
[23:06:25] is there a way to research the most queried things that have no pages
[23:06:35] so not necessarily red-linked
[23:06:48] and non-existing pages that got page views from google queries
[23:08:34] sleighbells: Don't think so.
[23:10:05] there is no way on earth it can be done?
[23:10:35] sleighbells: You'd need to write an extension
[23:10:57] so it is possible, in theory, just nobody has done it yet
[23:11:09] sleighbells: Maybe there already is one doing that, but I certainly never heard of it.
[23:17:37] FoxT: i know wmf wikis can do that
[23:19:02] jackmcbarn, so it is something that can be done?
[23:19:05] jackmcbarn: Interesting. Do you know how they do it? Extension? Or something between webserver and db?
[23:22:22] not sure offhand
[23:45:33] Gulp... I'm doing an upgrade from MW 1.22.6 to 1.23.3... I got as far as mw-config (https://www.mediawiki.org/wiki/Manual:Upgrading#Web_browser), then I got this error: https://dpaste.de/Pj2G/raw
[23:55:10] gute, the error seems potentially related to SemanticForms, so try asking in #semantic-mediawik
[23:55:24] Err, #semantic-mediawiki
[23:55:59] thanks