[00:37:30] Hi, how do I enable pdf uploads?
[00:38:15] hi, you already have uploads enabled but they're not pdf, right?
[00:38:32] right
[00:38:40] $wgFileExtensions = array('pdf','gif','png','jpg','jpeg','svg','xls','doc');
[00:39:05] I tried adding pdf like that to my LocalSettings.php and it worked
[00:39:22] so?
[00:39:25] yup, that's what I was about to suggest; I'm assuming it's working now?
[00:39:55] Nope
[00:39:59] It works without pdf
[00:40:04] but when I add it, nope
[00:40:26] I tried adding it both at the front and at the end of the array list.
[00:40:47] or array, or whatever it is
[00:42:11] do you have strict uploads set?
[00:42:47] what happens when you try to upload? does it just say the file type is unsupported?
[00:43:08] Here's the error message it shows: Exception encountered, of type "LocalFileLockError"
[00:43:14] but when I add pdf in caps instead
[00:43:28] the page shows, and shows PDF in the list of accepted file types
[00:43:35] but still won't accept .pdf
[00:43:50] Platonides, I'm not sure, could you elaborate further?
[00:45:20] what does "won't accept" mean? what exactly happens when you try?
[00:45:26] which MediaWiki version is this about?
[00:45:39] ah, only that error message and nothing else?
[00:45:41] hmm.
[00:46:02] so the page does not even show when not using caps? or what does that mean?
[00:46:11] newest mediawiki version
[00:46:23] yes andre__
[00:46:26] what's that? 1.27.1?
[00:46:28] that's all that's shown
[00:46:34] on which page?
[00:46:38] oh god, the newest that was out yesterday
[00:46:44] upload page
[00:46:48] Special:Upload ?
[00:46:51] yes
[00:47:05] perhaps it's a permissions error
[00:47:11] but why does it show this when adding .pdf?
[00:47:17] I haven't even tried uploading anything
[00:47:31] well, there are several versions out there. So "newest" is always unclear. Plus you might use some version provided by your distribution instead of having downloaded it yourself.
[00:48:21] ok, it's 1.27.1
[00:48:36] as shown in my RELEASE-NOTES
[00:50:55] ok, can you try uploading a supported file type, like a .png, and see whether it works?
[00:53:04] interesting
[00:53:08] the upload screen shows
[00:53:21] but when I go to upload a .jpg, the same error shows: Exception encountered, of type "LocalFileLockError"
[00:54:37] ok :-) not a pdf problem here
[00:55:42] look at the perms for the 'images' subdirectory, `chown -R www-data images` or something should be able to help
[00:56:54] should I only do this to the images directory?
[00:57:00] For some reason I have staff as the group owner
[00:58:02] Chown it (both user and group) to the user and group which the webserver uses.
[00:59:07] i have 501 as the user?
[00:59:12] like where did this come from?
[00:59:42] It should be www-data
[01:00:52] hey
[01:00:55] chown -R www-data * worked
[01:00:57] of course
[01:00:59] lol
[01:01:13] any exception to that rule that maybe I should add?
[01:01:24] perhaps LocalSettings.php should be root-owned?
[01:04:34] No.
[01:04:41] You only need to chmod the images directory.
[01:04:48] Or chown.
[01:04:50] Or something.
[01:05:04] I don't have a mediawiki install at hand to tell you what the working permissions look like.
[01:05:20] But the idea is you have to make it writable by the wiki, and not writable by the world.
[01:06:07] If you simply chown it to the webserver user and group, I think that should work.
[01:06:23] Please tell us what you are doing and how it behaves in response.
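Pulling the pieces of that exchange together: a minimal LocalSettings.php sketch, assuming a stock 1.27 install whose webserver runs as www-data (the user/group name and the chown form are taken from the advice above, not verified against this particular server):

```php
<?php
// LocalSettings.php — uploads were already enabled on this wiki:
$wgEnableUploads = true;

// Either replace the whole list as pasted above, or just append 'pdf'
// (lowercase is the conventional form used in the default list):
$wgFileExtensions[] = 'pdf';

// The "LocalFileLockError" was a filesystem-permissions problem, not an
// extension-list problem: the webserver user must be able to write to
// the images/ directory. From a shell, something like:
//   chown -R www-data:www-data images/
```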
[01:29:15] Gryllida, Ok, so I have root owning everything
[01:29:27] except for the images directory, which the web server user owns
[01:29:39] How does it behave now?
[01:30:14] fine
[01:30:16] no problems at all
[01:30:39] even uploading works
[01:30:40] Okay. Very cool.
[01:30:49] indeed
[01:31:07] This is great considering only the root user can make changes
[01:32:03] I'm glad you asked, backnforth. I don't remember whether root ownership is proper, but as long as your config or user passwords aren't world readable (or writable) it sounds fair.
[02:30:44] Best message from a developer in the console: "Elasticsearch failed in an unexpected way. This is always a bug in CirrusSearch."
[03:23:25] Heh.
[03:57:28] Hi, another question: How do I auto-protect multiple pages? Or even all the pages I currently have created, but not the new ones?
[04:11:05] where did you go?
[06:46:50] Maybe someone can help me spot the cause of an error? I have an intranet wiki that worked fine yesterday, but today when trying to access it I receive some errors, without having made any changes to the setup. Now I can't even reach the Main_Page or any other pages. Instead I see a screen of various errors, one of them being: "MWException from line 118 of /usr/home/wiki/includes/cache/localisation/LCStoreCDB.php: Unable to move the new CDB file into place".
[06:48:08] Adding "$wgShowExceptionDetails = true;" to LocalSettings.php, I copied the complete output to:
[06:48:21] http://dpaste.com/3AV8FWA
[06:55:20] The same server has more than one wiki and, having searched a bit, I think I found the error: https://phabricator.wikimedia.org/T127127
[07:01:46] As suggested in the above Phabricator thread, simply setting $wgCacheDirectory to a value unique to each wiki fixed it, e.g.:
[07:01:47] $wgCacheDirectory = '/tmp/mw-cache-my_intrawiki';
[12:30:36] How can I get {{padright:}} to pad spaces onto something? I've tried ` `, ` `, ` ` as the append string, and none of those work.
[12:32:39] The purpose is quick repetitive copy-and-paste of whitespace for template parameters on a page.
[12:34:13] `| this1 = ` `| this12 = ` `| this123 = ` `| this1234 = ` etc...
[12:35:01] using `this{{SUBST:#vardefineecho:this|{{SUBST:#expr:{{SUBST:#var:this|0}}+1}}}} = `
[14:14:27] hey all, has anyone here worked on mediawiki with SOLR before? are there any issues so far? is it feasible to do so? or other tidbits?
[14:14:57] as I'm planning to implement it now, but I see from the repo that it's not updated anymore, so I was wondering why, thx!
[14:16:15] so before I do that, I would like to know the feedback, comments or anything from the community, thx!
[14:19:43] Never used it myself -- the built-in searching scales enough for me.
[14:22:32] ardnet: yes, the SOLR stuff is no longer really maintained. All the interest right now seems to be in the ElasticSearch/CirrusSearch extension.
[14:24:24] hey Yaron, thx for the response. So I believe what you meant by Elasticsearch is the one from AWS, is it? and CirrusSearch is actually this one? https://www.mediawiki.org/wiki/Extension:CirrusSearch
[14:25:10] just want to confirm those, thx!
[14:29:36] nvm, I just checked; CirrusSearch is actually a MW extension to integrate with Elasticsearch (https://www.elastic.co/products/elasticsearch)
[16:24:50] Hi, I would like to delete my wikipedia account.
[16:25:17] Can anyone help?
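A hedged generalisation of that one-line fix, for anyone running several wikis off one server: assuming each wiki has its own LocalSettings.php and a distinct $wgDBname, the database name can be reused to keep the cache paths unique (the use of $wgDBname here is an assumption, not something from the log):

```php
<?php
// LocalSettings.php — per-wiki localisation cache directory, so the CDB
// files of co-hosted wikis can't overwrite one another
// (cf. https://phabricator.wikimedia.org/T127127):
$wgCacheDirectory = "/tmp/mw-cache-{$wgDBname}";
```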
[16:31:13] LOL, he or she ^^ asked for their wikimedia account to be deleted and then left 30 secs later
[16:40:42] paladox it was a timed test. nobody was able to help them delete their wikipedia account in less than 41 seconds. test failed.
[16:40:57] LOL
[18:18:30] Hey folks. I'm trying to get my head around the MediaWiki JSON data API. I've made a simple call to Wikipedia for information about Stack Overflow using this endpoint: https://en.wikipedia.org/w/api.php?format=json&action=query&prop=extracts&explaintext=&titles=Stack%20Overflow
[18:18:51] My question is this: can I return the Markdown content broken down into further JSON?
[18:19:11] So a subtitle like "History" would be represented as a nested JSON object?
[18:19:23] Instead of the single view of all the text data on the page
[18:21:17] hazamonzo: this isn't an answer to your question, but FYI the MediaWiki syntax is not called "Markdown" but rather "wikitext".
[18:21:27] Yaron: That's right
[18:21:33] But to your question - I don't think so.
[18:21:35] I did a little googling for Markdown to JSON
[18:22:04] I found a couple of third-party libraries but I was hoping it might be possible with certain parameters directly to the wikimedia API
[18:22:11] Note that "Markdown" is the name of a type of syntax, it's just not what we use
[18:22:30] So googling for that probably won't help you that much
[18:22:41] Ahh okay
[18:22:45] hazamonzo: hmm, you're using prop=extracts – do you actually mean to get a short extract of article text (that API has a couple of options to control the length), or did you want the whole article?
[18:23:03] MatmaRex: The whole article
[18:23:26] hazamonzo: in that case you're probably better off with prop=revisions, which I think can also return the separate sections
[18:23:37] The ideal solution would be to return the whole article as a single nested JSON object that represents the headings, subheadings etc.
[18:23:45] hazamonzo: btw, here's all the automatic documentation for the API options: https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1 – very helpful :)
[18:23:52] MatmaRex: Ahh okay. I'll give that a shot
[18:24:00] hazamonzo: i also recommend https://en.wikipedia.org/wiki/Special:ApiSandbox :D
[18:24:01] or https://en.wikipedia.org/w/api.php?action=help&modules=parse with which you can specify the section you want
[18:24:28] MatmaRex: Much appreciated guys! I had about 30 tabs open from googling before I came here to ask silly questions :)
[18:25:08] hazamonzo: However, if you really want a syntax-tree type thing, you should try the Parsoid output. It's in XML instead of JSON, but it would be much easier to break apart into small chunks if that's what you're after
[18:25:35] (lol. we have too many options, heh)
[18:26:11] Hey lots of options is great!
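A rough sketch of the section-oriented approach suggested above, using the action=parse module from the linked help pages. The page title and section index are examples, and the bare file_get_contents() calls assume allow_url_fopen is enabled; a real client should also set a User-Agent and handle HTTP errors:

```php
<?php
// Sketch: break an article into sections with two action=parse calls.
$api = 'https://en.wikipedia.org/w/api.php';

// 1. List the sections of the article.
$url  = $api . '?action=parse&format=json&page=Stack_Overflow&prop=sections';
$data = json_decode( file_get_contents( $url ), true );
foreach ( $data['parse']['sections'] as $section ) {
    echo $section['index'] . ': ' . $section['line'] . "\n"; // e.g. "1: History"
}

// 2. Fetch the wikitext of one section by its index.
$url  = $api . '?action=parse&format=json&page=Stack_Overflow&prop=wikitext&section=1';
$data = json_decode( file_get_contents( $url ), true );
echo $data['parse']['wikitext']['*'];
```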
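And a speculative sketch of consuming the Parsoid output just mentioned, via the REST endpoint that comes up next in the discussion; the <section> wrapping is part of the Parsoid HTML spec linked below, while the choice of DOMDocument and the heading tag are assumptions:

```php
<?php
// Sketch: fetch Parsoid HTML and list the headings of its sections.
$html = file_get_contents( 'https://en.wikipedia.org/api/rest_v1/page/html/Dog' );

$doc = new DOMDocument();
libxml_use_internal_errors( true ); // Parsoid emits HTML5; quiet the parser
$doc->loadHTML( $html );

// Parsoid wraps each document section in a <section> element.
foreach ( $doc->getElementsByTagName( 'section' ) as $section ) {
    $heading = $section->getElementsByTagName( 'h2' )->item( 0 );
    if ( $heading !== null ) {
        echo $heading->textContent . "\n";
    }
}
```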
[18:26:32] hazamonzo: https://www.mediawiki.org/wiki/Specs/HTML/1.2.1 describes the Parsoid output so you can see if that's the sort of output that would be useful to you
[18:27:48] https://en.wikipedia.org/api/rest_v1/page/html/Dog for an example (if you look at the page source, it is more amenable to parsing out parts than the normal HTML of a page is)
[18:28:40] Hmmm all very interesting
[18:29:20] It might be an odd requirement but I have a client that wants to "scrape" data from a MediaWiki installation, from multiple pages that all have the same structure
[18:29:52] If I can get the data out in some sort of format that I can parse with my tool, then I can split the page contents into data fields
[19:00:06] MaxSem: hi max, are you there?
[19:43:24] Sophivorus, hey
[19:44:14] MaxSem: hi! I just committed a small change to TextExtracts and added you as a reviewer
[19:44:22] are you the right person to review this, or should I ask someone else?
[19:45:31] reviewed, needs tests
[19:46:23] MaxSem: hm, ok, I never added tests before, but I guess it's time for me to learn
[19:46:44] thanks
[19:47:17] it's really easy - just see that file and add more test cases
[19:47:42] will do
[19:48:45] if you're using vagrant, run them like this: cd /vagrant/mediawiki && php tests/phpunit/phpunit.php --wiki=wiki --testsuite=extensions --group=TextExtracts
[19:50:34] I'm not using vagrant unfortunately, but I'll find a way
[19:51:28] If you're not using vagrant, it's basically the same thing
[19:51:35] except you first have to install phpunit
[19:52:00] bawolff: cool, thanks
[19:52:40] Also, if you find the unit tests have random failures that don't seem to be your fault, it's probably safe to ignore them
[20:06:34] anyone know why I get "Fatal error: Class 'Elastica\Transport\Http' not found in /var/www/html/extensions/Elastica/ElasticaConnection.php on line 207" for the CirrusSearch and Elastica extensions?
[20:07:09] I googled that error for a bit; it seems to be due to missing curl, but I already have that installed on my machine
[20:10:24] ardnet, you need to install Composer dependencies
[20:14:06] MaxSem: thx for the response! is it like this one here? https://www.mediawiki.org/w/index.php?title=Topic:S6wby9ea7pcpkile&topic_showPostId=s6ylss8p8y947268#flow-post-s6ylss8p8y947268
[20:16:05] ardnet, basically yes
[20:16:22] you need to run composer from MW's root directory
[20:17:59] MaxSem: oh, actually I just ran composer install in the Elastica extension directory directly, and it seems to have solved the problem
[20:18:08] I think it's good now, thx!
[22:29:12] I need to make an app, could someone help me with this? I'm a complete newbie.
[22:29:47] how is that related to mediawiki?
[22:30:48] I need to create this using the mediawiki api. I just need some directions... how to start
[22:31:07] you need help using the mediawiki api or creating an app?
[22:31:18] and what is the target for your app?
[22:31:34] Android, iOS, Chrome, Nokia…
[22:31:48] It's simple, it should display market values from google.com/finance on my mediawiki page
[22:32:08] you mean a mediawiki extension?
[22:32:34] Yes, that's what I wanted to say
[22:33:23] Is there any extension which I can use for this task, or do I have to write one myself?
[23:25:34] I'm having trouble migrating a MediaWiki site from one server to another. After doing the migration, everything about the site seems to work properly, except that the page content doesn't render. Is anyone familiar with this dysfunction?
[23:38:01] Hi Guest8846.
[23:38:02] "the page content"? [23:38:12] Guest8846: Can you provide a link to your site? [23:38:13] what's the output? a blank page? [23:38:40] http://eucalyptusgrove.matsonconsulting.com [23:39:03] Hmmmm. [23:39:11] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Special:Version hmmmm. [23:39:14] The output is not a blank page. Everything looks correct except the wiki text doesn't render. [23:39:40] The version is 1.13.3 [23:39:49] there's an extra in the sking, btw [23:39:52] MediaWiki 1.13.3? [23:40:02] Yes. MediaWiki 1.13.3. [23:40:08] that's old [23:40:12] Yeah, that's pretty ancient. :-) [23:40:23] I intend to upgrade as soon as I get it working. [23:40:26] Sure. [23:40:28] Might be an extension. [23:40:35] Might be an incompatible version of PHP? [23:40:42] A MediaWiki extension, I mean. [23:40:47] You can try commenting them out in LocalSettings.php. [23:41:07] When you migrated servers, how did you keep running an old version of PHP? [23:41:22] It could also be the skin. [23:41:35] looks like a normal monobook [23:41:42] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Special:Version&useskin=modern [23:41:46] Yeah. [23:41:49] Probably not the skin. [23:41:55] I'm looking at LocalSettings.php now. If there are extensions, where would I find them? [23:42:13] you'd see lines starting with "require_once" (or "require"/"include"/"include_once") [23:42:21] at least http://eucalyptusgrove.matsonconsulting.com/index.php?title=Special:BlankPage shows correctly [23:42:23] When I migrated servers, I didn't think about the PHP version. I assumed that if this were a newer version, it would be reverse-compatible. [23:42:30] Like "require_once( "$IP/extensions/Newuserlog/Newuserlog.php" );" [23:42:44] Platonides: Hmmm, interesting. [23:42:58] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Special:UserLogin also works. So Special:Version is special? [23:43:02] something is runnign there when failing [23:43:09] Debra: no, Main Page also fails [23:43:10] I could find no instances of "require" in LocalSettings.php. So, probably no extensions. [23:43:16] Platonides: Sure. [23:43:28] Guest8846: Can you search the file for the word "extensions" to be sure? [23:43:37] Do you know how to tail a log file? [23:43:39] Guest8846: could you post somwhere a (redacted) copy of your LocalSettings.php ? [23:43:56] The error logs might have some clue as to what's happening. [23:44:04] For PHP/the Web server. [23:44:20] if it's the php version, it will be clear there [23:44:29] but I'm surprised it'd run so much [23:44:34] in that case [23:44:34] If page titles are working and message cache is working, [23:44:41] maybe the database table "revision" went wonky? [23:44:45] Err, text, I mean. [23:44:49] I searched for "extensions". There are lines "$wgCheckFileExtensions = false;" and "$wgStrictFileExtensions = false;", but that's it. [23:44:53] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Special:SpecialPages works [23:44:54] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Frequently_Asked_Questions&action=history revision seems fine. [23:44:57] action=purge gives a 500 [23:45:16] http://eucalyptusgrove.matsonconsulting.com/index.php?title=Frequently_Asked_Questions&oldid=273 Hmmm. [23:45:20] I do not know how to tail a log file. [23:45:23] oldid= gives a weird error. [23:45:37] Debra: also a 500 [23:45:39] Yeah. [23:45:42] that would indicate a PHP error of some sort [23:45:43] Skizzerz: Hi! 
[23:45:48] hi
[23:46:11] I had to whois you to figure out who you are :P
[23:46:18] oh
[23:46:20] (o:
[23:46:31] * Platonides realises just with that bit of information
[23:46:34] xD
[23:46:36] Heh.
[23:46:57] I'm guessing normal page displays are putting up tons of warnings, but the server is configured to not display them
[23:47:04] !debug
[23:47:04] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[23:47:05] and it is fataling only on specific things
[23:47:25] probably, updating mediawiki would fix the errors
[23:47:35] Yeah.
[23:47:45] most likely
[23:47:48] Or make new errors!
[23:48:05] If he updates MediaWiki, he'll need to reinstall skins, at least.
[23:48:21] Well, I guess not with the tarball.
[23:48:30] I'm too used to Git.
[23:51:46] I posted a redacted version of LocalSettings.php here: http://eucalyptusgrove.matsonconsulting.com/LocalSettingsRedacted.txt
[23:52:01] The formatting somehow got lost. Sorry about that.
[23:54:35] * Skizzerz recommends upgrading to a recent mediawiki version; make a backup of your database, save off the old files somewhere, extract a fresh tarball of 1.27.1, run the installer (point it at the existing database), use the fresh LocalSettings.php it generates, migrating over settings from your old LocalSettings.php as necessary
[23:54:50] I agree with Skizzerz.
[23:55:14] Going from 1.13.3 to 1.27.1 is less painful than you'd think.
[23:56:45] it doesn't look like you have any extensions, so unless you hacked the core files (something we strongly recommend against doing), there should be no issues with the upgrade process
[23:59:06] I turned on error reporting. The pages don't look much different.
[23:59:52] no you didn't
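For reference, the recipe from the Manual:How_to_debug page the bot linked above amounts to a few LocalSettings.php lines; only $wgShowExceptionDetails appears verbatim in this log, the rest is the standard documented sketch (development use only):

```php
<?php
// LocalSettings.php — temporary debug settings; remove on a public wiki.
error_reporting( -1 );           // report every PHP error and warning
ini_set( 'display_errors', 1 );  // render them in the page output
$wgShowExceptionDetails = true;  // show full details for MediaWiki exceptions
```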