[00:22:42] I am having a somewhat confusing issue with $wgUploadDirectory/$wgUploadPath and I am hoping someone here can offer some suggestions. (This might take a while to type up, so bear with me please.)
[00:23:42] Firstly: I am using short URLs and ImageMagick (though I do not believe that has a direct impact, following my previous investigations).
[00:24:27] If I leave $wgUploadDirectory/$wgUploadPath as default, everything works perfectly.
[00:24:59] However, if I attempt to change them, it:
[00:25:51] Uploads correctly and recognises the file is present (i.e. not a redlink), but does not render the file (either as a thumbnail or as the full-size image).
[00:26:05] The 'Original file' link is also broken.
[00:26:10] Any ideas?
[00:27:59] Basically I am attempting to upload into subdirectories within /images for each site (I run multiple sites off the same /core/ location). I.e. /images/site1, /images/site2, /images/site3, etc.
[00:30:35] is it possible to create a page with empty content (that is, just submit empty content and the page will be created)?
[00:30:52] Usually, yes.
[00:30:56] right now i have to type in something in order to have the page created (as i could see from the database)
[00:30:58] is it..
[00:31:26] because if i just type ABC then click on save page, that ABC does not appear in my database. only if i type some content in the page ABC then i see it is being created
[00:32:43] The database is a bit of a mystery to me; IIRC the page and page text entries are saved in (several?) different tables.
[00:32:52] hmm
[00:33:12] even so, the empty-content ABC page does not appear on my main page.
[00:33:21] currently, recently created articles will appear on my main page
[00:33:28] the empty-content one didn't appear so i assume it is not created :/
[00:33:32] Ah, that is a statistics thing... you can change that behaviour.
[00:33:43] how should i do that?
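The two settings in the upload problem above are easy to knock out of sync: $wgUploadDirectory is the filesystem path MediaWiki writes uploads to, while $wgUploadPath is the URL path browsers fetch them from, and the symptom described (upload succeeds, the file page exists, but thumbnails and the "Original file" link break) is exactly what a mismatch between the two produces. A minimal sketch of the per-site layout being attempted, with hypothetical site names:

```php
// LocalSettings.php (sketch; "site1" is a placeholder chosen per wiki).
$site = 'site1';
// Filesystem directory the uploads land in ($IP is the install path):
$wgUploadDirectory = "$IP/images/$site";
// URL path that must resolve to that same directory via the web server:
$wgUploadPath = "$wgScriptPath/images/$site";
```

If the web server cannot actually serve $wgUploadPath from $wgUploadDirectory (for instance, an alias or rewrite rule intercepting /images), files will upload fine but never render.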
[00:33:54] Give me a moment to look it up
[00:33:58] sure thanks
[00:34:04] im sorry i could not help you with your issue..
[00:34:08] newbie here :/
[00:34:33] https://www.mediawiki.org/wiki/Manual:$wgArticleCountMethod
[00:35:58] Not sure why you would want to count a blank page as an article, but set that to "any" in your LocalSettings.php, run maintenance/updateArticleCount.php, and it will.
[00:37:38] so i just set $wgContentNamespaces = any in my localsettings?
[00:38:02] No, set $wgArticleCountMethod = "any"
[00:38:07] oops
[00:38:08] thanks
[00:38:09] typo
[00:38:11] let me try
[00:38:32] then run the updateArticleCount maintenance script.
[00:39:33] is it through cmd?
[00:39:43] i am using windows, never tried a maintenance script before..
[00:41:00] yes, run your command prompt
[00:41:48] try the following in there after navigating to your installation directory (i.e. where your LocalSettings.php file is)
[00:41:59] php maintenance/updateArticleCount.php
[00:42:02] i just ran updateArticleCount.php in my cmd, it says
[00:42:16] to update the site stats table, run the script with the --update option
[00:42:25] php maintenance/updateArticleCount.php --update then :)
[00:42:27] is it considered updated already?
[00:42:36] oh
[00:42:53] it says done, thanks.
[00:42:58] let me try to see if it works :)
[00:43:06] now check your article count
[00:43:42] in my database?
[00:44:20] sure, you can check there if you want...
[00:44:55] Personally I would just check the site statistics page though. Much easier.
[00:45:27] is it through special pages?
[00:45:35] yup
[00:46:43] .. don't think it is working? :/
[00:46:53] is it because I am creating it in a namespace instead?
[00:47:02] namespace:ABC
[00:47:23] By default it only counts the Main namespace, so yes.
[00:48:15] ohhhhhhhh nooooooooooooooo
[00:48:29] so there is no way to create an empty-content page in a custom namespace? :/
[00:49:35] You have to register the namespace first.
[00:49:46] where?
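The steps walked through above amount to one setting plus one maintenance run; a sketch, assuming a default installation layout:

```php
// LocalSettings.php — count every valid page as an article, even an
// empty one ("any" instead of the default link-based counting method).
$wgArticleCountMethod = "any";
// Then recount the existing pages once, from the directory that
// contains LocalSettings.php:
//   php maintenance/updateArticleCount.php --update
```

As noted in the conversation, this only affects namespaces that MediaWiki treats as content namespaces (just Main, by default).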
[00:50:01] i did the namespace configuration in my LocalSettings..
[00:50:13] https://www.mediawiki.org/wiki/Manual:Using_custom_namespaces
[00:50:29] already did this
[00:50:50] i can create pages with namespace:abc, namespace:def but not with empty content :/
[00:51:13] If you defined and added it, then made it a content namespace, you should be able to.
[00:51:48] Page "ABC:Thisisapage" should be creatable as empty (though again, why you would want to... ;))
[00:52:27] i am referring to this link (https://www.mediawiki.org/wiki/Manual:$wgContentNamespaces) am i correct?
[00:52:40] then i put this $wgContentNamespaces = array( ABC ); in my LocalSettings
[00:53:58] you have done the define("NS_ABC", ####); and $wgExtraNamespaces[NS_ABC] = "ABC"; parts first, yes?
[00:54:29] define("NS_ABC", 508); define("NS_ABC_TALK", 509);
[00:54:44] $wgExtraNamespaces = array(508 => "ABC", 509 => "ABC_talk", );
[00:54:50] as so in my LocalSettings
[00:56:13] The next setting should be ContentNamespaces then
[00:57:21] There is a way of appending rather than replacing the existing list... ummm...
[00:58:25] I *think* it is.... $wgContentNamespaces = $wgContentNamespaces & array( );
[00:58:35] Been a long time since I've done it though.
[00:58:43] hmm so it is not as simple as $wgContentNamespaces = array( 508 ); ?
[00:59:08] No. Setting that means that *only* NS 508 holds content.
[01:01:19] ya, i would only want 508 to be able to create an empty page
[01:01:26] and mediawiki will display that empty page
[01:01:29] or am i confusing things lol
[01:01:34] As I said though, it has been a _long_ time since I have messed with custom namespaces.
[01:02:05] Why do you not want content in the Main namespace?
[01:04:02] because
[01:04:44] this function is a user request for a non-existing article
[01:05:14] i am giving it a namespace such that users can only create/edit pages that start with the namespace ABC: and not others
[01:05:17] so yeah :/
[01:05:41] Ohhh... Per-namespace permissions. Nice...
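The half-remembered `&` operator quoted above is not the appending syntax; in PHP you append to an array with `[]` (or merge lists with `array_merge()`). A sketch of the whole registration being discussed, using the namespace IDs from the conversation and keeping the default content namespaces intact:

```php
// LocalSettings.php — register a custom ABC namespace (IDs as above).
define( 'NS_ABC', 508 );
define( 'NS_ABC_TALK', 509 );
$wgExtraNamespaces[NS_ABC] = 'ABC';
$wgExtraNamespaces[NS_ABC_TALK] = 'ABC_talk';
// Append to the content-namespace list rather than replacing it;
// writing $wgContentNamespaces = array( 508 ); instead would make
// NS 508 the *only* namespace counted as content.
$wgContentNamespaces[] = NS_ABC;
```

With NS_ABC registered as a content namespace, its pages count toward the article statistics the earlier part of the conversation was chasing.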
[01:05:45] Been there, done that. Heh.
[01:06:24] I remember it being a royal pain to get set up and working.
[01:06:43] what if there was a special page that simplified group management?
[01:06:56] as in group permissions, not group membership
[01:07:44] I remember I ended up using one of the BizWiki extensions before (that put group permissions on a MediaWiki: namespace page).
[01:08:13] But the namespace part was still all manual, and a headache.
[01:08:32] are there such things?
[01:08:35] sigh.
[01:09:32] The trouble you are running into is that MediaWiki is not designed as a CMS, so per-namespace permissions are not built into the core design. (I believe this is still true, as it was back then.)
[01:10:41] i believe so..
[01:10:45] thanks for the assistance :)
[01:11:14] Sorry I couldn't be more help. It has been way too long since I have really got stuck into namespaces.
[01:12:36] it's alright, really, thanks for your assistance even though you have your own bug yet to be solved!
[01:15:31] It's confusing the heck out of me... heh.
[01:16:23] ohhhhhhhhhhh i would want to ask another question
[01:16:50] is it possible to prepend [[Category:XYZ]] into the content page once a new page is created?
[01:17:08] as in when a user creates a new page, the [[Category:XYZ]] is already in the editor?
[01:17:12] hope i am making sense?
[01:17:55] Not in the core, no.
[01:18:07] I think you can using a boilerplate extension though.
[01:41:31] hi. i need help, please.
[01:42:03] I don't speak English, only Spanish.
[01:42:42] Partially fixed my problem. The "Original file" link now works, but it still isn't generating thumbnails...
[01:51:12] Hello, I have a very important question. What permission bits should I set the extensions folder to? 655?
[01:51:34] I accidentally fucked up a chmod.
[01:52:06] Hmmm, depends what user/group owns it.
[01:52:16] Basically the Web server needs to be able to read files from there.
[01:52:27] Most extensions probably aren't writing there.
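One partial exception to "not built into the core" is worth noting for the per-namespace use case above: core MediaWiki can restrict *editing* (though not reading) a namespace to holders of a permission via $wgNamespaceProtection, the same mechanism that protects the MediaWiki: namespace. A sketch, where the "editabc" right and "abceditor" group are hypothetical names:

```php
// LocalSettings.php — only users holding the (hypothetical) "editabc"
// right may edit pages in NS 508. This is core behaviour for edit
// restrictions; broader per-namespace permissions (read, create,
// etc.) still need extensions, as the conversation says.
define( 'NS_ABC', 508 );
$wgNamespaceProtection[NS_ABC] = [ 'editabc' ];
$wgGroupPermissions['abceditor']['editabc'] = true;
```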
[01:52:43] I almost fear using 755.
[01:52:45] So give sufficient read permissions.
[01:52:47] Fear why?
[01:52:56] Too public for my taste.
[01:53:01] Call me paranoid.
[01:53:03] lol
[01:53:11] Shared hosting?
[01:53:15] Nah, VPS
[01:53:36] If you're paranoid, try the most restrictive settings and increment from there?
[01:53:45] I set it to 655 and everything seems fine.
[01:53:53] Resolved/fixed, then!
[01:53:56] Oh! I have another question! :D
[01:54:04] I have endless snark.
[01:54:42] Is there a guide for using HHVM on a wiki instead of straight PHP?
[01:54:58] I hear it's faster and my server isn't the fastest lol
[01:55:44] You'll need a pretty recent version of MediaWiki.
[01:56:09] And the HHVM PHP packages, I guess. Not sure how good the docs are.
[01:56:19] I am using 1.25.1 :D
[01:56:23] I'm not sure there's much to it beyond installing HHVM.
[01:56:27] Not really
[01:56:30] Like MediaWiki should just work.
[01:56:36] As far as I know.
[01:56:45] I've been considering switching to it as well.
[01:56:49] Maybe one day.
[01:56:49] !hhvm
[01:56:49] HHVM is a thing. It's like PHP but faster. MediaWiki supports HHVM.
[01:56:54] zomg
[01:57:05] That sounds like a bot snippet I would write.
[01:57:22] Yeah I was on that page the other day when I first discovered HHVM lol
[02:02:55] Seems I just done broke meh wiki while trying to get HHVM working.
[02:02:56] Hmm.
[02:02:58] Well then.
[02:03:08] It shouldn't be too difficult
[02:03:13] IIRC, on Debian etc. at least
[02:03:21] It sets itself up inside apache/lighttpd
[02:03:29] or is it nginx.. whatever
[02:03:30] Ah wait, I see my mess-up.
[02:03:36] Yeah I'm using Apache.
[02:03:46] What is your opinion on nginx, by the way?
[02:03:57] Can't really comment
[02:04:02] It's not Apache!
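On the permissions question above: mode 655 on a directory is an odd choice, since it removes the execute (traverse) bit from the owner while leaving it for group and other, so it "seems fine" only because the web server is not the owning user. The conventional, less surprising layout is 755 on directories and 644 on files, which exposes nothing writable. A sketch against a throwaway temporary tree (the extension name is a placeholder):

```shell
# Sketch: apply the usual read-only permissions to an extensions tree.
# Directories need the execute bit to be traversable; files only need
# to be readable by the web server.
root=$(mktemp -d)
mkdir -p "$root/extensions/SomeExtension"
touch "$root/extensions/SomeExtension/extension.json"
find "$root/extensions" -type d -exec chmod 755 {} +   # rwxr-xr-x
find "$root/extensions" -type f -exec chmod 644 {} +   # rw-r--r--
```

On a real install you would run the two `find` commands against the actual extensions directory instead of the temporary one.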
[02:04:16] Then it should just be a case of disabling PHP inside Apache, or switching .php files to be handled by HHVM by default
[02:04:19] gj Katie
[02:04:43] You can visit the page "Special:Version" on your wiki to verify which PHP you're running on the Web server.
[02:04:55] Can Zend and HHVM exist concurrently on the same host?
[02:05:16] Waiting for my server to restart at the moment.
[02:05:17] Katie: Probably.
[02:05:20] don't see why not, although they'd be separate PHP installs
[02:05:28] Well, like, which would /usr/bin/php launch?
[02:05:37] whichever the system default is
[02:06:04] generally if I do multiple installs I leave system PHP alone and put the actual installs I care about in /opt
[02:06:05] Taking for f****** ever to SSH into my server now.
[02:06:14] Are you in a race?
[02:06:16] Crap baskets....
[02:06:17] Have you tried restarting?
[02:06:22] klaas: You're hilarious.
[02:06:27] Err, Katie: ^
[02:06:32] Oh wait, I got it.
[02:06:33] Katie: update-alternatives
[02:06:37] Damn tab-completion.
[02:06:54] The canonical php could point to cowsay if you really wanted
[02:07:11] It'd have better typing, anyway.
[02:09:22] Sorry for the long wait. Was tweaking something.
[02:09:34] PHP version: 5.5.9-1ubuntu4.11 (apache2handler)
[02:09:38] We died of boredom
[02:09:46] RIP.
[02:10:06] <3
[02:11:01] Step 5: Restart all the things to make the switch from PHP to HHVM
[02:11:02] quite
[02:11:37] Quick Google shows people using nginx and switching PHP to HHVM
[02:15:36] I'm following this: https://runtimereflection.wordpress.com/2015/06/29/how-to-setup-hhvm-on-ubuntu-14-04-server-with-apache-2-4-part-1/
[02:32:24] is Parsoid still limited to amd64 machines, or does it work on i386 now?
[02:32:43] No idea
[02:32:56] Though I'd imagine non-64-bit support (if non-trivial) wouldn't be high on the todo list
[02:33:03] It used to clearly state it in the docs, but I can't find it now...
[02:34:31] Even more confused over my images issue now...
[02:35:07] Two files I have uploaded work fine now. Two other images are showing the problem from before though (all fresh uploads)...
[02:38:23] Ah, nope. Still only amd64. Oh well. Still need to figure out why images are all funky...
[02:54:29] I just don't get why thumbnailing isn't working when I use a non-standard $wgUploadDirectory/$wgUploadPath...
[02:57:52] ...and now the 'Original file' link is only working for two files (despite no changes being made)...
[03:29:54] why do i get this error in my Apache log
[03:29:56] The given path is misformatted or contained invalid characters
[05:08:25] any idea if single sign-on (using the auth_remoteuser extension) will work on a mobile/tablet device as well?
[05:08:38] it is working on my desktop in 3 browsers (IE, Chrome, Firefox)
[05:08:44] i am not sure if it will work on mobile
[06:44:37] Anyone know what happened to mediawikiwidgets.org and whether or not it's coming back? (or mirrored someplace?)
[09:31:52] hi, how can I set up short URLs (/wiki) with nginx?
[09:35:39] I found a .htaccess converter, trying that
[09:43:53] what the fuck?
[09:44:08] the main page offered some GeunEwMW.exe download
[09:45:31] that exe is just index.php
[09:51:44] ok, fixed by removing "break" from the nginx rewrite rules
[09:52:25] but how do I make http://flarar.cadoth.net/ go to /wiki/Main_Page and not the index.php URL? the guide suggests I shouldn't hardcode Main_Page there
[09:54:07] oh, by modifying LocalSettings
[13:58:54] Does anyone know of a Docker image with MediaWiki/VisualEditor?
[14:01:28] Leroy2: we have a Vagrant environment, if that helps. https://www.mediawiki.org/wiki/MediaWiki-Vagrant never heard of any Docker package.
[14:03:07] I am using MediaWiki. It seems the default Vector skin is not responsive. Wikipedia's own skin seems to be developed to be responsive or to have a mobile version. How can I do the same within my own MediaWiki installation?
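The nginx fix found above (removing "break") makes sense: with "break", the rewritten /w/index.php URI is served from the current location block as a plain file instead of being re-matched against the PHP handler location, which is why the browser was offered index.php as a download. A sketch of the usual short-URL shape, assuming the wiki's scripts live under /w, articles under /wiki, and a placeholder PHP-FPM socket:

```nginx
# Sketch of MediaWiki short URLs on nginx (paths are assumptions).
location /wiki/ {
    # "last" re-runs location matching so the PHP block below handles
    # index.php; "break" here caused the raw-file download problem.
    rewrite ^/wiki/(.*)$ /w/index.php?title=$1&$args last;
}

# Hand the bare root to MediaWiki rather than hardcoding Main_Page;
# the wiki itself sends visitors to whatever the main page is.
location = / {
    rewrite ^ /w/index.php last;
}

location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/run/php/php-fpm.sock;  # placeholder socket path
}
```

On the MediaWiki side this pairs with $wgScriptPath = "/w" and $wgArticlePath = "/wiki/$1" in LocalSettings.php, which matches the "by modifying LocalSettings" conclusion above.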
[14:10:38] Hrmm, interesting
[14:16:38] Not much going on here
[14:16:50] Hi everyone?
[14:20:13] hi
[14:20:23] Hi Leroy2
[14:21:17] yeah, Vagrant looks like it creates a VM using VirtualBox... pretty good for developers... not so good for ops
[17:00:01] Nikerabbit: Raymond_ https://translatewiki.net/wiki/Thread:Support/Adding_support_for_Semantic_Cite_%28SCI%29
[17:15:47] just testing:
[17:15:49] !blame
[17:15:49] Whatever happened, it's Domas' fault. For everything else, there's `git blame`.
[17:15:57] Okay, it actually works. :P
[17:19:43] howdy all, is there some magical MW make-this-input-safe-from-injection method I can encapsulate stuff in before cramming it into a database?
[17:22:45] Ulfr: https://www.mediawiki.org/wiki/Manual:Database_access#Database_Abstraction_Layer
[17:23:04] https://www.mediawiki.org/wiki/SQL_injection
[17:27:06] That helps when I'm working with integers, but what about strings?
[17:29:13] Eh, I'll figure it out. Thanks!
[19:33:58] Leroy2: Vagrant has multiple backends, some of which are better for ops.
[19:34:39] I didn't get a response to my question, probably because I posted it in the early morning hours: 19:24 PBLRD Rental Program‎ (diff | hist) . . (+130)‎ . . Kyalla ahashion (Talk | contribs | block) (Retired.) [rollback 2 edits]
[19:34:40] N 19:23 Rental Program‎ (diff | hist) . . (+147)‎ . . Kyalla ahashion (Talk | contribs | block) (placeholder)
[19:34:40] 19:22 (Move log)‎ . . [Kyalla ahashion‎ (2×)]
[19:34:40] 19:22 . . Kyalla ahashion (Talk | contribs | block) moved page Talk:Rental Program to Talk:PBLRD Rental Program without leaving a redirect ‎(PBLRD Rental Program is retired, make way for new rental program)
[19:34:42] 19:22 . . Kyalla ahashion (Talk | contribs | block) moved page Rental Program to PBLRD Rental Program without leaving a redirect ‎(PBLRD Rental Program is retired, make way for new rental program)
[19:34:45] oops, sorry
[19:34:54] wrong thing in clipboard :/
[19:36:11] hey.
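The string question above was left hanging, so for the record: the wrapper methods on the linked Database Abstraction Layer page escape string values just as safely as integers when they are passed through the array forms of select()/insert()/update(), and addQuotes() covers any fragment you must assemble by hand. A sketch against the MediaWiki API of that era (the table and field names are made up):

```php
// Inside a MediaWiki extension or maintenance script (sketch).
$dbw = wfGetDB( DB_MASTER );

// Values in these arrays are quoted and escaped automatically,
// strings included — no manual sanitising needed.
$dbw->insert( 'demo_table', [ 'demo_name' => $userInput ], __METHOD__ );

$res = $dbw->select(
    'demo_table',
    [ 'demo_id', 'demo_name' ],
    [ 'demo_name' => $userInput ],  // safe string comparison
    __METHOD__
);

// For a hand-built SQL fragment, quote explicitly:
$quoted = $dbw->addQuotes( $userInput );
```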
i would like to create an offline version of a third-party MediaWiki
[19:36:16] how do i do that? i mean, which tools to use
[19:37:59] Anyway - actual question instead of stupid spam: mediawikiwidgets.org seems to be down, does anyone know what's up with it or if it's mirrored somewhere - almost all the widgets were hosted there
[19:38:51] qknight, Special:Export or a db dump from that wiki.
[19:43:05] Special:Export would be good for importing later on another MediaWiki. If you want an offline mirror with just plain HTML pages, a website copier like HTTrack would do the job (for any website, not just MediaWiki). Be sure to configure it properly, though
[19:44:05] HTTrack seems more like what i need to use
[19:44:10] sdaugherty, Vulpix thanks!
[19:52:49] Vulpix: any idea how to reformat the MediaWiki pages when doing the HTTrack dump?
[19:53:04] reformat in what sense?
[19:53:06] Vulpix: i've been downloading the target wiki and it already looks pretty promising
[19:53:20] Vulpix: change the style so that the menu on the left is not there
[19:53:42] Vulpix: i need to rewrite it so that i can put the single pages into a Qt Assistant-like browser
[19:53:52] (at least that is what i think would be a good thing to have)
[19:55:00] well, that's not possible. The only way would be to use URLs with action=raw in the query string, like https://www.mediawiki.org/w/index.php?title=API:Parsing_wikitext&action=render
[19:55:45] you'd have to generate a list of all URLs of pages you want to download first, change them so they use action=raw, and feed HTTrack with them
[19:57:31] Vulpix: action=raw seems to be very interesting. maybe this can be used with pandoc
[19:57:37] sorry, I meant action=render, not action=raw
[19:57:41] Vulpix: do you happen to know how pandoc handles MediaWiki templates?
[19:57:48] (the URL I provided as an example is correct)
[19:58:25] I don't know what pandoc is
[19:58:38] Vulpix: ok, thanks anyway!
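The "generate a list of all URLs" step suggested above can be scripted against the standard MediaWiki API; a sketch, where the API endpoint is a placeholder for the target wiki, and the resulting URL list would then be fed to HTTrack:

```python
# Sketch: build action=render URLs for every main-namespace page on a
# wiki via the standard list=allpages API, for feeding to HTTrack.
import json
import urllib.parse
import urllib.request

API = "https://www.mediawiki.org/w/api.php"  # placeholder target wiki


def all_page_titles(api_url):
    """Yield every main-namespace page title, following API continuation."""
    params = {"action": "query", "list": "allpages",
              "aplimit": "max", "format": "json"}
    while True:
        url = api_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for page in data["query"]["allpages"]:
            yield page["title"]
        cont = data.get("continue")
        if not cont:
            break
        params.update(cont)  # carry the continuation token forward


def render_url(api_url, title):
    """Turn a page title into a body-only action=render URL."""
    base = api_url.rsplit("/", 1)[0] + "/index.php"
    return base + "?" + urllib.parse.urlencode(
        {"title": title.replace(" ", "_"), "action": "render"})


if __name__ == "__main__":
    for title in all_page_titles(API):
        print(render_url(API, title))
```

Redirecting the output to a file gives the URL list HTTrack can consume, and action=render returns just the rendered article body, without the sidebar the asker wanted to drop.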
[19:58:44] yw
[22:19:07] Is there a search that's been "blessed" by the project, i.e. what we'd expect to see best supported and active on WP in the future, etc.?
[22:21:10] CirrusSearch, SphinxSearch, etc.?
[22:38:10] I think CirrusSearch is what WP uses?
[23:34:21] Overand: WP uses Cirrus