[00:05:06] hey all, just upgraded from ubuntu 11.10 to 14.04 -- which includes a php upgrade from 5.3.6 to 5.5.9 -- something funky going on in resourceloader now, getting [00:05:13] PHP Warning: array_keys() expects parameter 1 to be array, null given in /var/www/ops_dev/includes/MessageBlobStore.php on line 350 [00:05:39] and things similar, all around MessageBlobStore [00:05:44] in apache error log [00:08:44] I have read the ResourceLoader documentation several times and I still have no idea how it works. [00:09:02] My problem is that I have an older skin, and I'd like to use the new jQuery features with it [00:09:13] Is this possible with minimal interference? [00:09:55] ObsessiveMathsFr & reeeve - you may find mediawiki-l answers your question [00:09:57] !lists [00:09:57] mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details. [00:10:02] * sumanah doesn't have an answer [00:10:13] k, will ask around [00:11:40] Thanks [00:14:30] reeeve: Hmm, that one is new to me [00:15:09] If all you did was a php upgrade, maybe some sort of caching error, although that sounds odd [00:16:08] reeeve: Sometimes running rebuildLocalisationCache.php --force can clear up i18n-related errors [00:24:06] bawolff: thanks, i think that must be it [04:18:54] Hi! How can I access the current user's ID through my skin? I need to display messages based on the actively viewing user, and I can't seem to get a hold of the ID. I've read this - http://www.mediawiki.org/wiki/Manual:RequestContext - Though it's not the same as special pages, so I'm at a loss. [04:20:06] Doing "$this->getUser();" doesn't work. [04:20:40] "Fatal error: Call to undefined method DefaultTemplate::getUser()" [04:29:04] I think $wgLang->getCode() . . . OK, too slow. [04:29:46] And that's the user language, not the user. $wgUser->getID() or something having got the global, but you're gone anyway. [05:30:22] Hello!
This is the extension I have made, used to store material properties http://202.164.53.122/~albertcoder/mediawiki-1.22.6/index.php/Special:Mat_ext . You may use username:mat password:material to login. I wish to add a new form from which new material properties (there might be various other forms too) can be added. How can I do that? [05:34:13] I tried to make a new class (Specialmat_ext1) which extends the SpecialPage class, and then included the name of the class in another file (mat_ext_join.php) like this $wgAutoloadClasses['Specialmat_ext1']=$dir.'mat_ext_join.php'; [05:39:04] But still the code in mat_ext_join.php does not work, I am having a problem referring to that file. [06:25:58] If only it wasn't broken for unregistered users, DismissableSiteNotice should probably be included in core https://wikiapiary.com/wiki/Extension:Main_Page [06:32:56] Nemo_bis, was the above text a reply to my question? [06:33:20] Pardon! I couldn't figure it out. [10:25:28] SPQRobin: https://meta.wikimedia.org/w/index.php?title=Special:NewPages&feed=atom&hidebots=1&hideredirs=1&limit=500&offset=&namespace=202&username=&tagfilter= doesn't match https://meta.wikimedia.org/w/index.php?title=Special:NewPages&hidebots=1&namespace=202 [10:25:41] Apparently it shows the last edit instead of the creation/page title? [10:46:28] Hi, how are you guys doing? [10:47:13] why is $wgNamespaceRobotPolicies not effective for NS_SPECIAL? [10:47:54] can anyone fix includes/specialpage/SpecialPage.php? [11:14:18] Nemo_bis: hmm, so these feeds are really crappy :-S [11:14:33] Shukran, shukran MediaWiki and other FLGOSS devels [11:15:19] I have a Debian6 that for some reason creates unnecessary and harmful amounts of apache2 threads or whatever they're called (processes) [11:15:35] I've now restarted the server and stopped apache [11:15:59] is there a way to search commons by file size ?
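For albertcoder's autoloading problem above: adding an entry to $wgAutoloadClasses only tells MediaWiki where to find the class; a special page additionally has to be registered in $wgSpecialPages, or it will never be reachable. A minimal sketch, with hypothetical class, page, and file names (SpecialMatExtForm, MatExtForm, mat_ext_join.php), not albertcoder's actual code:

```php
<?php
// --- In the extension's setup file (registration is configuration): ---
$dir = __DIR__ . '/';
$wgAutoloadClasses['SpecialMatExtForm'] = $dir . 'mat_ext_join.php';
// Without this second line the class is autoloadable but no special page exists:
$wgSpecialPages['MatExtForm'] = 'SpecialMatExtForm';

// --- In mat_ext_join.php: ---
class SpecialMatExtForm extends SpecialPage {
	public function __construct() {
		// The name here must match the $wgSpecialPages key.
		parent::__construct( 'MatExtForm' );
	}

	public function execute( $par ) {
		$this->setHeaders();
		$this->getOutput()->addWikiText( 'Form for adding new material properties goes here.' );
	}
}
```

One extension may register any number of special pages this way, so "add a new form" does not require a second extension.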
[11:16:17] I looked at 'tail -f access.log' [11:16:27] looking at that it's not a DoS attack [11:16:44] maybe I should look at the error.log [11:16:59] since there is erroneous behaviour [11:17:12] matanya: what are you looking for? [11:17:13] I am in with ssh now [11:17:29] pls halp / hold hand. have time [11:17:47] Betacommand: a user on he.wiki asked me, if he can search for files above 40mb [11:18:27] matanya: it's doable, just not through normal methods [11:18:35] SPQRobin: they are, but they'll soon be wonderful if you touch them ;) [11:18:37] jubo2: how is this related to mediawiki? [11:18:57] matanya: have them drop me an email (in English) with exactly what they want and I'll throw a report up [11:19:08] matanya: I'm about to go AFK for a bit [11:19:13] thank you Betacommand! [11:19:19] Nemo_bis: did you file a bug? :p [11:19:22] jubo2: what MediaWiki version? [11:19:32] matanya: it's what I do :P [11:20:43] Vulpix: I dunno.. 1.19 ? [11:21:13] jubo2: well, don't expect us to know that info... [11:21:34] Vulpix: yeah.. it's in http://en.wikipedia.org/wiki/Special:Search?go=Go&search=Special:Version but that is sorta down now [11:21:57] sry.. the automatic regex expansion surprised me too [11:22:09] I am in with ssh now [11:22:16] I look for the version in the MW [11:23:03] grep wgVersion includes/DefaultSettings.php [11:23:15] Reedy: 'k tnx doing now [11:23:37] jubo2: I ask because there would be an explanation in MediaWiki 1.23 and newer: https://www.mediawiki.org/wiki/Manual:Job_queue#Changes_introduced_in_MediaWiki_1.23 [11:24:19] 1.19.1 [11:24:38] I think this is an apache2 issue, not MW per se [11:24:43] well, that's not the cause, then [11:25:06] I didn't touch any configs to break them. Who/what did would be an interesting question [11:25:49] I am going to communicate the outage via d.consumium.org [11:26:59] SPQRobin: before you tell me it's a dupe as with 10***, I asked you :) [11:27:14] looks like I have 2 options..
#1 fix the current server and #2 migrate the sites to the new spare server [11:30:07] /etc/apache2/apache2.conf hasn't been touched in over 2 yrs [11:30:19] so nothing malicious there.. [11:33:51] Nemo_bis: I couldn't find one, though I found a useful tracking bug that isn't linked from some relevant bugs: https://bugzilla.wikimedia.org/show_bug.cgi?id=3646 and I found a duplicate bug there btw, fixing that... [11:37:41] oh, it was already linked from bug 10268 [12:22:49] hrm. My File: accesses seem to take a LooOOng time in the latest 1.23.0 [12:22:57] wasn't like this under 1.21 or 1.22... [12:23:19] even happens after the second or third view of File:, so it doesn't appear to be derivative-related. [12:25:50] anyone know how i'd begin to debug? [12:41:15] Can anyone help with some javascript stuff? I'm trying to use the hook system with postEdit, but nothing's happening... [13:00:17] Morbus: what sort of File pages, local files or remote files? [13:00:26] And do you have InstantCommons enabled at all [13:00:45] Nemo_bis: local, and no, dunno what Instant Commons is. [13:09:13] anyone have thoughts on enabling confirmedit by default in mediawiki? [13:11:13] Nemo_bis: you can fiddle at http://www.disobey.com/wiki/Main_Page [13:11:43] Nemo_bis: hrm... it might have been a server issue. [13:11:48] Nemo_bis: things seem fine now. [13:16:40] matanya: you still around? [13:18:01] yes Betacommand [13:18:31] matanya: I think I'm running into a translation issue with the user looking for large files. [13:18:45] can i help? [13:19:02] matanya: can you find out exactly what they want to search for or if they just want a list of large files? [13:19:41] I understood he is looking for a list, isn't that what he wrote you in the mail? [13:20:35] He used the term "I want to search all images, the volume of 30MB or more." [13:21:16] I'm trying to figure out exactly what he wants.
but I think his English is too poor to do that well [13:21:48] and my Hebrew is about as good as my Klingon [13:22:02] no idea what "enabling confirmedit by default in mediawiki" might mean [13:22:14] nice Morbus [13:23:39] Betacommand: i'll check with him [13:23:41] Nemo_bis: The ConfirmEdit extension, one of the primary anti-spam steps [13:25:02] Nemo_bis: https://www.mediawiki.org/wiki/Extension:ConfirmEdit [13:26:00] Betacommand: it's bundled with MediaWiki, you only need to mark the checkbox on the installer to have it [13:26:58] Vulpix: Given how often new wikis get swarmed by spam, should it be installed by default? [13:27:42] And what does that mean? [13:27:57] no, sysadmins should decide whether they want it installed, and what captcha mechanism they would like to enable [13:28:22] I think there is a bug about reducing the number of clicks needed to enable bundled extensions [13:28:33] note that the captchas that can be enabled by default without further configuration have a low rate of effectiveness [13:28:42] And one thing that would be really useful is a text area where the installer prompts for a QuestyCaptcha question [13:29:14] I would make that default for the case where one picks "public wiki", but it's not super-trivial [13:29:28] that would probably require adapting the installer so extensions can add installation steps [13:29:33] All the other captchas are useless and/or need complex installation steps [13:30:01] Vulpix: not if confirmedit was made an integral part of core :) but yes, probably would need that [13:50:13] when I edit a wikipedia article, does it archive the whole old article or does it store only the edit diff? [13:51:10] zPlus: the whole article [13:52:27] note that one can hide (delete) selected revisions of text, and even split them to other pages. Storing only diffs would be problematic in such situations [13:53:07] well, split and merge, actually [13:55:58] thanks Vulpix.
I've another question: how does mediawiki compare the current article with an old revision to find differences? [13:57:37] zPlus: it retrieves the text of both revisions, and uses an external diff utility (like GNU diff) to show the differences [14:00:51] wow thanks Vulpix, I really thought it was something built in specifically for mediawiki [14:02:11] <^demon|away> Vulpix: It only uses diff3 for edit conflict merging. Actual diffing of revisions is either handled by the wikidiff2 pecl extension or by the pure-php implementation we have. [14:04:52] oh ok so actually it's done by mediawiki [14:05:13] I wasn't aware of that. Good to know :) [14:09:49] Betacommand: he would like to have a list [14:11:33] matanya: Ah easily done, give me a few minutes [14:11:44] thank you [14:13:50] hi sumanah [14:14:01] hi matanya [14:16:58] hello! sumanah I am stuck at one point. I have made two extensions work separately. One can store material properties, and add new materials and their properties. And the other is capable of showing all the properties in tabular form. [14:17:31] In a tag extension, I want to parse wikitext (an internal link specifically, i.e. "[[foo]]") to get the resulting HTML (i.e. ) but not insert it into the page. Need to stick it in some Javascript actually. I'm wondering if using $parser->parse() messes with the state of the $parser and if I should be using something else. [14:18:35] I wish to include the working of the second extension in the same extension that I have made to store the material properties. Like I wish to give a link, say, "view all properties" and the user must see the contents which my second extension is capable of doing. [14:19:51] Please have a look here http://202.164.53.122/~albertcoder/mediawiki-1.22.6/index.php/Special:Mat_ext you will get an idea of my problem.
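The storage model zPlus asked about a few lines up is visible in the database schema of this era: every revision row points at a complete text blob, and diffs are only computed on demand when someone views them. A sketch against the pre-1.25 revision/text tables (the page ID 123 is a placeholder):

```sql
-- Each revision stores the full wikitext, not a delta:
SELECT r.rev_id, r.rev_timestamp, t.old_text
FROM revision r
JOIN text t ON t.old_id = r.rev_text_id
WHERE r.rev_page = 123
ORDER BY r.rev_timestamp;
```

This is also why hiding or splitting individual revisions, as Vulpix notes, stays cheap: removing one row never invalidates a chain of deltas.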
[14:20:38] I tried using $parser->recursiveTagParse("[[foo]]") but I get back a [14:20:42] retentiveboy: you can use Parser::parse fine if you need to, but Parser::recursiveTagParse is normally preferred (it's much simpler) [14:20:55] oh [14:23:28] albertcoder: sorry, I don't know enough to help you. Have you tried mediawiki-l ? [14:23:30] !lists [14:23:30] mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details. [14:24:55] Anyway thanks sumanah I will ask this on the mailing list. I have been stuck on this problem for a couple of days. :( [14:25:15] albertcoder: my sympathies. how often have you asked for help, here or in other places? [14:25:42] I have asked this maybe three times since the day before. [14:26:18] albertcoder: btw what country do you live in? is there a developer community there that has meetups? [14:26:33] It shouldn't be much work to put both extensions into one extension [14:26:37] Mostly copy paste [14:26:42] I live in India. [14:27:19] I've heard the http://hasgeek.in/ conferences there are good. [14:27:34] Reedy: got any ideas for albertcoder? [14:27:48] Reedy, I have copied the code of my second extension in the code of my first extension but that gets displayed on the same page. [14:28:05] Identify and register them as separate special pages then? [14:28:29] matanya: should I include ogg files? [14:29:12] I wish to make a link say "view all material properties" and then the code of my second extension should come into action. [14:29:56] Both the extensions work separately when I try to access them from special page separately [14:30:47] Linker::link() and SpecialPage::getTitleFor( 'Foobar' ) [14:31:13] but I wish to make all these features available in a single extension. Right now I can redirect to the URL of the special page of my second extension but that way two separate extensions need to be installed.
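Reedy's one-liner above (Linker::link() and SpecialPage::getTitleFor()) expands to something like the following inside a special page's execute() method. 'Joins' matches albertcoder's existing second special page; everything else is a sketch, not his actual code:

```php
// Build a link from one special page to another, instead of copying code
// between extensions or redirecting by URL.
$target = SpecialPage::getTitleFor( 'Joins' ); // Title object for Special:Joins
$this->getOutput()->addHTML(
	Linker::link( $target, 'view all material properties' )
);
```

Note that the second parameter of Linker::link() is raw HTML, so any user-supplied text placed there must be escaped first.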
[14:31:35] You're doing it wrong then [14:31:42] Many extensions have numerous special pages [14:32:51] yes Betacommand [14:33:03] any type, if i understand what he wanted [14:34:06] Hope I get my answer from this. Suppose I wish to make a separate form for adding a new material property. So how can I access this form by clicking on a link on my special page? [14:34:22] Output a link to your special page? [14:34:35] !class Linker [14:34:36] See https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html [14:34:45] yes [14:35:05] I think you might be overthinking it [14:36:01] And how can I copy the code of my second extension into the first such that it should only work if I click on view all materials? [14:36:37] matanya: http://tools.wmflabs.org/betacommand-dev/reports/commons_large_files.txt [14:36:38] Right now I have copied the code in my main extension but it gets executed without having to do anything. [14:36:48] thanks! [14:39:00] Reedy, if you can have a look here, I will be grateful. This is my first extension. http://202.164.53.122/~albertcoder/mediawiki-1.22.6/index.php/Special:Mat_ext [14:39:03] matanya: anything else? [14:39:12] just many thanks :) [14:39:16] albertcoder: It doesn't really help [14:39:28] You need to link to your code [14:39:50] Betacommand: does that contain video as well? [14:39:51] matanya: the hardest part was doing the math needed to find out what 30MB was :P [14:39:56] matanya: yes [14:40:01] :) [14:40:35] matanya: select CONCAT('* [[:File:',img_name,']]') from image where img_size > 31457280 ORDER BY img_name; [14:40:59] Betacommand: only 150 files? that seems a low amount [14:41:10] i thought there would be more [14:41:28] matanya: 30MB files are super massive [14:41:38] o_0 [14:41:48] I uploaded several that are 100+ [14:41:58] http://202.164.53.122/~albertcoder/mediawiki-1.22.6/index.php/Special:Joins this is second.
Reedy so how should I get over this problem? I have tried to create a new special class in the main file of my first extension and copied the code of my second extension there. [14:42:10] matanya: are those on the list? [14:42:51] no Betacommand [14:42:56] e.g. Ambassador Shapiro Visits Amphibious Assault Ship USS Kearsarge in Eilat.webm [14:43:18] I then gave the name of the new special class in this variable $wgAutoloadClasses but still it did not work. :( Any advice? please [14:43:27] matanya: let me take a look [14:43:34] As I said, it's hard to help without seeing what code you're trying to use [14:44:03] albertcoder: can you share the source code you have written? [14:44:07] Okay I will show the code, just a second. [14:45:45] Betacommand: i'm off for now. you can just leave a message, and i'll check scrollback [14:45:46] This is my first extension https://github.com/albertcoder/MaterialsDatabase/ and I have done all the coding in Specialmat_ext.php [14:45:52] matanya: figured out the good [14:45:55] *goof [14:46:06] update the page now [14:46:22] albertcoder: Don't use ?> at the end of files [14:46:29] oh my [14:46:31] I would suggest you explicitly remove them [14:46:41] ok, thanks. my browser just gave up [14:46:47] :) [14:47:03] matanya: 63553 rows [14:47:18] I was running it on enwiki the first time [14:47:28] the is the max of a signed int ? or real result ? [14:47:39] *that is [14:48:01] Alright Reedy I will remove ?> and this is my second extension https://github.com/albertcoder/SqlJoinsMediawiki where Specialjoins.php has all the code to be executed. [14:50:11] Reedy, Now I wish to make a link "view all properties" on the special page of the first extension and then the code of my second extension should execute. But I wish to make a single extension for many such operations [14:52:39] And Reedy I think I was careful not to use ?> at the end of files and I have not used them anywhere.
:) [14:56:14] albertcoder: You should also add a .gitignore, and not commit temporary files such as those that end in ~ [14:57:01] Thanks Reedy I will do it definitely! [16:40:28] Betacommand: the he.wiki user asked me to pass on to you his gratitude. [17:05:35] is it possible to change the color of wiki markup bullets in a table? [17:05:52] CSS maybe [17:06:23] i added inline css to change the font text, but i don't know what to use to make the bullets match [17:15:26] hello? [17:15:37] hi? [17:17:40] Basically, I would like to ask the following question, why doesn't this code return "02 B": {{#ifeq:{{#sub:2 B|1|2}}|{{#sub:2 B|-2|-1}}|02 B|2 B}} but this one does: {{#ifeq:{{#sub:2 B|1|2}}|{{#sub:2 B|1|2}}|02 B|2 B}} ? [17:17:51] How is one way of looking at whitespace different from another way? [17:19:56] Can anyone help? [17:20:08] I really don't understand what I'm doing wrong [17:22:06] Mox1: you could always try to put each {{#sub}} on different lines, surrounding them with dashes, to see the exact output from each one, and see if that makes sense [17:22:32] the dashes are to see if it outputs whitespace or whatever [17:25:32] Vulpix, could there be some way to simply represent white space? [17:25:55] I.e.:{{#ifeq:{{#sub:2 B|1|2}}|" "|02 B|2 B}} for example? [17:26:14] MatmaRex: https://translatewiki.net/wiki/Thread:Support/Text_in_English [17:26:15] (that code doesn't btw) [17:26:53] *doesnt work [17:27:44] Nemo_bis: hm, there was that bug for "More" not updating [17:28:13] https://bugzilla.wikimedia.org/show_bug.cgi?id=66524 [17:28:18] you think it's this? or what? [17:29:04] I've already reopened that [17:29:10] Mox1: use <nowiki> to delimit whitespace inside parser functions. Parser functions trim whitespace in parameters [17:29:17] MatmaRex: but I don't remember, was Vector already removed from core [17:29:25] Nemo_bis: no [17:29:33] hm [17:29:39] but it was poked around a bit [17:29:40] hmmmmm [17:29:45] hmmmmmmmmmmmmmmmmm.
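A likely explanation for Mox1's puzzle, assuming ParserFunctions' documented #sub semantics (1-based offsets, negative values counting from the end) together with the parameter trimming Vulpix describes:

```
{{#sub:2 B|1|2}}    →  "2 "   →  trimmed to "2"
{{#sub:2 B|-2|-1}}  →  " "    →  trimmed to ""
```

So the first #ifeq compares "2" with "" and fails (returning "2 B"), while comparing an expression with an identical copy of itself always succeeds. A literal space can survive trimming when written as the entity &#32;, since entities are expanded only after the parameter has been trimmed.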
[17:30:29] MatmaRex: hmhmhmhmhmhmhmhmhmh? [17:30:49] hmmmmm [17:30:59] mhm [17:31:01] MatmaRex: https://translatewiki.net/wiki/Thread:Support/Messages_removed_by_translation_updater_bot ? [17:31:04] Nemo_bis: i moved vector's l10n messages to a separate file [17:31:08] " [17:31:09] Change-Ids: If59db33f98025822dcb2f886599fbc8dae7f2513 and I9778079ddb92602259510e91574e63911bd17dac " [17:31:14] ah [17:31:14] Nemo_bis: so yeah [17:31:15] https://gerrit.wikimedia.org/r/#/c/141303/ [17:31:18] this should fix it ^ [17:31:34] unless someone at twn dun goofed and actually removed the contents for these… [17:31:58] Nope, but see Raymond's comment [17:32:03] the thing is, what if i wanted to simply represent whitespace on its own, instead of as part of a string, Vulpix? Because I am aware that <nowiki> and "" can stop the trimming of white space, but they can't represent the white space itself... :/ [17:32:38] Because, to check that there is a white space in the specific location of the string. [17:32:38] Nemo_bis: noted, thanks [17:32:50] argh, silly me, lost one lqt comment by opening a second reply [17:32:51] Mox1: <-- isn't that whitespace? [17:33:11] Nemo_bis: i read all bugmail and gerritmail :) [17:33:31] MatmaRex: when the mailserver is not down you mean :P [17:33:36] Mox1: ? [17:33:36] Nope. At least not the same white space as one space (" ") [17:34:03] Nemo_bis: well, yes. [17:34:43] bawolff, nope. :/ [17:35:01] MatmaRex: so the issue raymond mentioned needs to be solved by Thursday or all wikimedia wikis will have Vector in English, right?
[17:35:29] ah no because that's only for moving to another repo [17:35:59] yes [17:36:01] or so i hope [17:36:38] however, modern and cologneblue are a bit messed up right now :/ luckily they have no important messages (only skinname-) [17:36:56] (they were moved out, but fortunately i'm not the one at fault :P) [19:03:18] Documenting rollback procedure for an upgrade from 1.21 to 1.23 [19:03:37] database restore before or after changing code and images back? [19:05:43] restore database, and code (MediaWiki core+extensions+local modifications if any). Images shouldn't have been touched usually, but take them into account if they're inside the MediaWiki directory [19:07:35] Well right, but in what order.. I was thinking code first, then images, then data (the code I keep in separate directories, for a quick changeover via symbolic link) [19:38:13] hi [19:38:16] I'm trying to transclude a page, but mediawiki is rendering the link as "Template:PAGENAME" not transcluding PAGENAME. Any suggestions? [19:38:30] my mediawiki displays some inputboxes as "UNIQ5c03c3db8b482fba-inputbox-00000000-QINU" [19:38:34] it's really strange [19:38:40] what's the possible cause of that? [19:39:39] Strip markers! [19:39:44] chabibi: badly written parser extensions [19:39:58] chabibi: or ones written for a different MediaWiki version than you're using. have you recently upgraded? [19:40:13] yes, it was written by some guy for a previous version of mediawiki [19:40:33] example here on top of the page: http://wiki.arabeyes.org/القاموس_التقني [19:41:04] with the inputbox extension [19:41:31] Upgrade the extension? [19:41:44] yeah, that sounds like a good idea [19:41:50] I hope he didn't change it [19:41:53] thanks guys [19:48:14] It's much more likely to be a different extension that's broken, than inputbox needing an upgrade [19:48:22] Input box has barely changed in the last 8 years [19:49:07] <^demon|lunch> It got mostly rewritten like 6 years ago when Trevor started.
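The rollback order discussed above (code back first, then images if needed, then the database) can be sketched as a shell sequence. All paths, dump names, and the symlink layout are hypothetical, modeled on the "changeover via symbolic link" the asker describes:

```shell
# 1. Point the web root back at the 1.21 code tree (near-atomic symlink swap).
ln -sfn /srv/mediawiki-1.21 /srv/wiki

# 2. Restore images only if the upgrade actually touched them (usually it
#    does not, unless they live inside the MediaWiki directory):
# rsync -a /backup/images/ /srv/mediawiki-1.21/images/

# 3. Restore the pre-upgrade database dump last, so the 1.23 code is never
#    serving requests against the restored old-format database.
mysql wikidb < /backup/wikidb-pre-1.23.sql
```

Putting the site into read-only mode ($wgReadOnly) for the duration avoids edits landing in a database that is about to be overwritten.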
[19:49:11] <^demon|lunch> It was like his first thing. [19:53:04] tdannecy: That's odd. You have a link? [19:54:27] bawolff: Next to "Document 1" http://flnowarchive.org/mediawiki/index.php/Box_20_Folder_1#Document_list [19:54:57] bawolff: The page for Document 1 has an tag, so it should be transcluding over. Confused over here... [19:55:08] tdannecy: Oh I thought you meant it was literally writing out template:PAGENAME [19:55:21] tdannecy: You need an extra colon. Do {{:Box 20 Folder 1 Document 1}} [19:55:42] bawolff: Aha. Wow. Thanks. [19:55:51] bawolff: Feeling dumb. :p [19:55:53] by default curly braces mean include from pages starting with "Template:". If you start it with a colon, it means transclude from the main namespace [19:57:12] bawolff: Learned something new. Thanks for your help. [19:57:52] No problem. https://www.mediawiki.org/wiki/Help:Templates and https://www.mediawiki.org/wiki/Help:Namespaces have more information [20:02:41] Reedy: that solved the problem [20:02:52] * Reedy wins an internet [20:18:38] Nemo_bis: ping [20:20:13] pong [20:20:26] Nemo_bis: https://bugzilla.wikimedia.org/show_bug.cgi?id=67007#c5 [20:35:05] gwicke: I saw the comment but I don't understand what it means, should we change the wikitext like that? [20:37:13] Nemo_bis: basically it's invalid to have non-table elements directly inside a <table> element [20:37:49] in this case there's a
and other content directly inside a {| (rather than inside a td) [20:41:03] Nemo_bis: could you check whether https://it.wikipedia.org/w/index.php?title=Utente%3AGWicke%2Fmainpage&diff=66666719&oldid=66666189 does the right thing? [20:41:38] I am looking for a way to redirect a mediawiki page depending on the locale of the user [20:41:48] is there any way to do that within mediawiki? [20:41:59] Nemo_bis: it does fix the rendering issue: http://parsoid-lb.eqiad.wikimedia.org/itwiki/Utente%3AGWicke%2Fmainpage?oldid=66666719 [20:42:13] chabibi: Special:MyLanguage I guess. See translate extension [20:43:02] ULS [20:49:13] thanks bawolff [20:57:52] gwicke: thanks, I'll ask admins to fix [21:15:56] can anyone suggest a safe alternative to this extension http://www.mediawiki.org/wiki/Extension:NewsBulletins ? [21:17:24] demo wiki is down... [21:18:02] chabibi: semantic-mediawiki hosts their blog on the wiki itself; the MediaWiki feeds could probably be included in the same MediaWiki installation with the RSS extension [21:18:52] But really, transclusion, DynamicPageList or at worst MassMessage is more likely to be what you want [21:20:01] I see, thanks Nemo_bis [21:20:17] I really like what mediawiki has become, it's really powerful [21:20:43] power has never been its lack ^^ [21:25:06] ete: can you upgrade the wikiapiary host stats to answer this question? :) https://www.mediawiki.org/wiki/Thread:Template_talk:Extension/extra_parameter_%22require_shell_access%22/reply_(7) [21:26:20] i.e. add counts for each of the sections at https://wikiapiary.com/wiki/WikiApiary:Hosting_providers [23:54:25] Hi guys! I just installed the HTML5video extension, though this means that I have to put videos in the "wiki/extensions/HTML5video/videos/" folder. Is there a smart way to enable ordinary file uploads from within the wiki? [23:55:45] gute: That extension looks unmaintained... [23:56:34] Reedy>> ok. is there another alternative that will allow me to put up my own videos?
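For chabibi's locale question above, bawolff's Special:MyLanguage suggestion works as a wiki-side redirect: with the Translate extension installed, a link through that special page resolves to the translation subpage matching the viewer's interface language, falling back to the base page when no translation exists. "Main Page" here is just a placeholder target:

```
[[Special:MyLanguage/Main Page|Main Page]]
```

This redirects by the user's interface language rather than browser locale; ULS, also mentioned above, is what lets users pick that language.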
[23:57:19] Reedy: ok. is there another alternative that will allow me to put up my own videos? [23:57:26] What do you mean? [23:57:37] You don't need an extension to upload it.. To embed it for playing and such you would [23:57:47] TimedMediaHandler ? [23:58:19] Reedy: well yes. I want to embed it as well.
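To round off the last exchange: uploads are core functionality, and an extension like TimedMediaHandler only handles embedding and playback. A LocalSettings.php sketch for the 1.2x-era setups discussed here; the file-extension list and the require_once load style are assumptions to verify against the extension's own documentation:

```php
// Allow uploads via Special:Upload (configuration, not a hypothetical API):
$wgEnableUploads = true;

// Permit common free video formats in addition to the defaults:
$wgFileExtensions[] = 'ogv';
$wgFileExtensions[] = 'webm';

// Load TimedMediaHandler so [[File:Example.webm]] embeds a player:
require_once "$IP/extensions/TimedMediaHandler/TimedMediaHandler.php";
```

TimedMediaHandler also wants ffmpeg/ffmpeg2theora on the server for transcoding, which is worth checking before relying on it.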