[00:01:24] How can I make https://directory.fsf.org/wiki/Circus_Linux-sandbox change the border color ("|color=#CCCC00" in ) in https://directory.fsf.org/wiki/Template:Print_antifeature_intro-sandbox ?
[00:02:09] Pass it as a parameter?
[00:05:16] Reedy: I've added {{{color|#CC0000}}} in https://directory.fsf.org/wiki/Template:Print_antifeature_intro-sandbox but it doesn't work because I first have to pass it to https://directory.fsf.org/wiki/Template:Project_antifeature-sandbox but I don't know how.
[00:06:24] Does one template not include the other?
[00:07:01] Print_antifeature_intro is different from Project_antifeature
[00:07:17] The templates are used by https://directory.fsf.org/wiki/Circus_Linux-sandbox: https://directory.fsf.org/wiki/Template:Project_antifeature-sandbox, https://directory.fsf.org/wiki/Template:Antifeature-sandbox and https://directory.fsf.org/wiki/Template:Print_antifeature_intro-sandbox
[00:07:43] where is Print_antifeature_intro-sandbox
[00:07:45] called?
[00:09:37] Platonides: In https://directory.fsf.org/wiki/Template:Antifeature-sandbox (I just updated it but same problem)
[00:11:04] oh
[00:11:12] it is called through DPL
[00:11:49] not DPL, Semantic MediaWiki
[00:13:43] I changed https://directory.fsf.org/wiki/Template:Antifeature-sandbox back to using https://directory.fsf.org/wiki/Template:Print_antifeature_intro because it already contained {{{color|#CC0000}}}
[00:13:48] Platonides: What do you mean?
[00:15:14] I'm not familiar with how #ask passes the parameters
[00:15:30] hi, I'm trying to write a script that will add a lot of pages (each containing only a template at the start), and this script will maintain those pages (to check whether all of their params are up to date). It's very hard to work with templates in pywikibot. I saw 2 libraries: mwparserfromhell and wikitextparser (wtp). Anyone have any recommendation or reference to similar code to look at? tyvm :)
[00:16:47] Platonides: This answer says that it's impossible to pass parameters to templates: https://stackoverflow.com/questions/9893268/how-to-pass-variable-parameter-to-template#9893310
[00:23:49] that's a C++ question
[00:23:53] irrelevant for MediaWiki
[00:25:52] Platonides: Yes, I realized that now.
[00:26:11] In MediaWiki you can pass parameters to templates
[00:26:28] you just edit them and include the parameters
[00:26:33] but #ask is a parser function
[00:26:46] it could allow passing extra parameters… or none at all
[00:31:27] anyone have any experience with it?
[00:33:19] niso: Sorry, no.
[01:00:34] David_Hedlund: I don't know how Template:Antifeature gets included, but we could use userparam
[01:00:51] saper: Hi dear you.
[01:01:01] saper: How?
[01:01:18] now I have added a class
[01:01:33] class="black-antifeature"
[01:01:49] and we can set colors and stuff in MediaWiki:Common.css
[01:02:21] saper: Ok, please don't edit directly
[01:03:42] saper: Ok, you can go on and edit.
[01:03:51] saper: I was just thinking some.
[01:04:40] saper: https://directory.fsf.org/wiki/Circus_Linux has |color=#CCCC00
[01:05:17] that won't work
[01:05:25] saper: But it's not changing the border color to yellow yet.
[01:05:50] saper: How should https://directory.fsf.org/wiki/Circus_Linux be modified then?
[01:06:01] I have an idea
[01:06:06] saper: Cool
[01:06:46] saper: I've played with sandboxes for hours. I'm glad that you stopped by.
[01:07:21] saper: Because I have no idea how to solve this.
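As an aside for readers of this log: the mechanism under discussion is standard MediaWiki template parameter passing. A minimal sketch using the template and parameter names from the conversation (the table markup itself is invented for illustration):

    <!-- In Template:Print_antifeature_intro-sandbox: read the parameter, falling back to red -->
    {| style="border: 2px solid {{{color|#CC0000}}};"
    | Antifeature notice text here
    |}

    <!-- In Template:Project_antifeature-sandbox: forward the parameter down the chain -->
    {{Print_antifeature_intro-sandbox|color={{{color|#CC0000}}}}}

    <!-- In the article (e.g. Circus_Linux-sandbox): set the value -->
    {{Project_antifeature-sandbox|color=#CCCC00}}

The snag hit above is that the inner template is not transcluded directly but emitted by a Semantic MediaWiki #ask query, so there is no transclusion site through which to forward the parameter; that is what motivates the CSS-class approach saper proposes next.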
[01:12:18] David_Hedlund: https://directory.fsf.org/wiki/Circus_Linux-sandbox vs. https://directory.fsf.org/wiki/Circus_Linux
[01:12:24] is this what is required?
[01:12:58] saper: https://directory.fsf.org/wiki/Circus_Linux-sandbox is not required, it's just a sandbox.
[01:13:09] please compare visual effects
[01:13:14] saper: Ok.
[01:13:53] sandbox is now green, #cc0000 is the default
[01:14:10] saper: Cool!
[01:14:19] saper: Please set the default color to red.
[01:14:47] saper: Antifeature = Bad
[01:15:06] #cc0000 or red?
[01:15:36] Set red as the default border color.
[01:16:21] done
[01:16:33] https://directory.fsf.org/wiki/MediaWiki:Common.css this is where you can change the color per page
[01:16:56] saper: Thanks, but the border in https://directory.fsf.org/wiki/Circus_Linux-sandbox is not red.
[01:17:28] it is green just to demonstrate how to change it
[01:18:46] I don't know what your original intention was, but now the border color for all "antifeatures" is set in https://directory.fsf.org/wiki/MediaWiki:Common.css *and* you can change it per page if required
[01:19:19] saper: 1) I removed the custom background color from https://directory.fsf.org/wiki/Circus_Linux and it's still red, that's great.
[01:19:37] https://directory.fsf.org/wiki?title=MediaWiki:Common.css&diff=62454&oldid=62452
[01:19:53] this is a per-page customization
[01:20:20] (if needed)
[01:20:46] all visual attributes should go into the master CSS, not inline styles
[01:20:54] saper: The yellow color doesn't stick: https://directory.fsf.org/wiki?title=Circus_Linux-sandbox&diff=62455&oldid=62449
[01:21:11] it won't
[01:21:24] you can change it in https://directory.fsf.org/wiki/MediaWiki:Common.css , not there
[01:22:31] saper: Ok, I'll have a look.
[01:22:35] it is dirty yellow now after https://directory.fsf.org/wiki?title=MediaWiki:Common.css&diff=62457&oldid=62454
[01:23:04] need to log off now, have fun!
[01:23:27] saper: Thanks.
[01:24:36] saper: And thank you!
[05:54:54] anybody know what wfMsgExt(..., array('parse', 'content')) implies should be done in a modern setting?
[05:55:06] there's no example for that combination in the migration guide table
[05:58:19] Quasar`: wfMessage( 'messagename' )->inContentLanguage()->parse();
[05:58:28] thanks
[05:58:33] if my memory is right. Been a while since I saw a wfMsgExt ;)
[07:08:17] got an optional side rail with hot spots and community corner widgets going :) https://pasteboard.co/Hd2DkoB.png
[07:40:05] Hello everyone
[07:42:42] hiya
[07:43:05] ReferenceError: importArticles is not defined - what might cause this?
[07:44:52] something is trying to use a variable named importArticles. perform a search in your files to determine where it comes from
[07:44:56] only place I can find that name is in an extension
[07:45:15] An extension? Which one?
[07:45:27] https://www.mediawiki.org/wiki/Extension:ImportArticles
[07:45:38] thanks
[07:49:54] wait, it still has that error, even after installing said extension. Not that it does anything at all though.
[07:53:14] I figured you might have it installed and should try disabling it, but if that's not the case then I'm lost.
[07:54:41] Not like it affects anything, but oh well
[10:03:00] is there a way of injecting the i18n translations of the current language into the current page via a js variable?
[10:18:17] tkore: what does that mean?
[10:18:33] changing the language of the user interface of that page? or of some content on that page somehow?
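A sketch of the Common.css technique saper sets up above. MediaWiki adds a class derived from the page title (page-<Title>) to the body element, so a site-wide default can be overridden per page; the exact selectors and colors here are assumptions based on the class name and diffs mentioned in the log:

    /* MediaWiki:Common.css — site-wide default: antifeature boxes get a red border */
    .black-antifeature { border: 2px solid #CC0000; }

    /* Per-page override, keyed on the body class MediaWiki emits for the page title */
    .page-Circus_Linux-sandbox .black-antifeature { border-color: #CCCC00; }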
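The wfMsgExt() migration answered at 05:58, spelled out as PHP. wfMsgExt() was deprecated long ago and later removed; the message key is a placeholder:

    // Old style, from before the Message class existed:
    $html = wfMsgExt( 'some-message-key', array( 'parse', 'content' ) );

    // Modern equivalent: 'content' maps to inContentLanguage(),
    // 'parse' maps to a full parse that returns HTML.
    $html = wfMessage( 'some-message-key' )->inContentLanguage()->parse();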
[10:20:50] nope
[10:21:21] let's say I have an i18n file named he.json
[10:21:38] and I have a key named "some-message", that contains the value "this is some message"
[10:21:55] is there a way through javascript to get an i18n message by its key?
[10:22:12] for example mediaWiki.i18n.get('some-message')
[10:22:25] that will return "this is some message"
[10:22:31] andre__, ^
[10:35:31] tkore, what is the use case of getting that key value via javascript?
[10:36:01] You can pass uselang=qqx as a URL parameter to see the keys in the interface, and you can set uselang=he to see the he translations in the interface.
[10:57:32] Hi, which mediawiki page should I edit if I want to post an announcement on the Recent changes page?
[10:57:39] I want it to appear just above the recent changes feed
[11:09:20] Hello
[11:09:32] Can someone help me with my question? Thanks
[11:09:42] which mediawiki page should I edit if I want to post an announcement on the Recent changes page?
[11:09:49] I want it to appear just above the recent changes feed
[12:43:38] anomie: Hey, are you around for a quick char wrt https://gerrit.wikimedia.org/r/#/c/419798/ ?
[12:43:41] *chat
[13:43:21] Amir1: I am now
[13:44:21] anomie: okay, what should we do to get this merged? Should the schema change happen before that?
[13:44:55] Also I have this rather fast patch: https://gerrit.wikimedia.org/r/#/c/421277/
[13:46:28] Amir1: I'm wary about doing it without the changed index, but if you can get jcrespo to say it's ok without the index change then I'd be satisfied. For clarity: the index change would be to change the "rc_namespace_title" index from (rc_namespace, rc_title) to (rc_namespace, rc_title, rc_timestamp).
[13:47:43] I'll make a patch for that, it shouldn't be that hard
[16:08:55] Hello, is there a way for me to create a template that displays on all pages in a namespace... example... I want to include some text on all pages in the User: namespace
[16:09:13] when a new user is created and they go to their user page it would include text that I want displayed
[17:05:21] hello, a small internal wiki was on a server that was not backed up (oops.) I have access to all the files. It is an older install.. I was hoping to install the same version of mw (1.16.5) first, restore the db etc, then upgrade. I am having trouble finding the 1.16.5 release, can someone please help
[17:07:50] alu__: https://releases.wikimedia.org/mediawiki/1.16/
[17:08:35] that said, if you have the db backup
[17:08:57] you can restore that, run the installer, point it at the existing db, and it'll upgrade you to the latest version in one go
[17:09:02] no need to install 1.16 first and then upgrade
[17:09:59] I suggest having the installer generate a fresh LocalSettings.php for you and then porting everything over from your older LocalSettings
[17:10:06] since a LOT has changed since 1.16
[17:10:16] thank you skizzerz, that sounds much better
[17:25:01] alu__: you probably need to get the version from Git
[17:25:27] a while ago I was unable to get 1.19.24 from the releases page
[17:25:44] so clone the git repository and change the branch to the version you'd like
[17:30:24] I might have to go and get that old version, 1.30 is having problems upgrading the db
[17:36:18] alu__: what kind of problems?
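For the record, the thing tkore is asking for does exist, although nobody names it in the log: message keys shipped to the client through a ResourceLoader module can be read in JavaScript via mw.message(). A sketch, assuming the key from the example above has been exported:

    // extension.json: the module must list the keys it ships to the client, e.g.
    //   "messages": [ "some-message" ]

    // Then, in client-side JavaScript:
    var text = mw.message( 'some-message' ).text(); // "this is some message"
    // mw.msg( 'some-message' ) is a shorthand for the same call.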
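The index change anomie describes at 13:46, written out as SQL. This is only an illustration of the shape of the change; on Wikimedia production such a change goes through a DBA-reviewed schema-change process rather than a direct ALTER:

    -- recentchanges: extend rc_namespace_title so queries filtered by
    -- namespace/title can also sort by timestamp using the index
    ALTER TABLE recentchanges
      DROP INDEX rc_namespace_title,
      ADD INDEX rc_namespace_title (rc_namespace, rc_title, rc_timestamp);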
[17:36:38] I wouldn't recommend doing that, you should first verify it works with the version it was running on
[17:36:55] you could mess the db up (and I don't know if you can do a rollback with mediawiki migrations)
[17:46:57] there is no rollback functionality in mediawiki
[17:54:53] harmaa.. seems my problem is that just copying the data files is not enough when tables use innodb. my tables are imported but not usable (doesn't even work in phpmyadmin). I'll fix that first, then I'm sure mediawiki will be able to upgrade
[17:55:18] oh no
[17:55:26] you need to dump the old db, and then import the dump
[17:59:52] hi, is there a way to make a wikilink to a special page with some parameters set? something like [[Special:OrphanedPages&limit=500&offset=0]]
[18:04:47] auvajs: the {{localurl}} magic word takes actions as a 2nd parameter
[18:05:58] Hi anybody, I get Fatal error: Class 'HtmlFormatter\HtmlFormatter' not found in /public_html/wiki/extensions/TextExtracts/includes/ExtractFormatter.php on line 27
[18:13:24] wow Nate_ is quite impatiend
[18:13:29] impatient* 2 minutes!
[18:21:27] Hey all, anyone know why the extensions Semantic MediaWiki and Semantic Result Formats would show as installed on the Special:Version page even though they only exist in the extensions directory but have not actually been loaded via include() or require()?
[18:21:30] harmaa, the machine that the db was on died. Not sure how to restore the db, except for the copy-files-over method
[18:23:15] FWIW, I have other extensions in the extensions directory that aren't loaded and don't show in Special:Version, so I'm not sure why it is just these 2.
[18:24:52] justinl: check for wfLoadExtension in your LocalSettings.php
[18:25:04] that's the new way of loading extensions; doing an include_once/require_once is deprecated now
[18:25:10] justinl: maybe they were installed with composer?
[18:25:51] They were installed with Composer but they are not loaded in LocalSettings.php.
[18:26:10] justinl: they don't need to be
[18:26:20] if they were installed with composer
[18:26:23] alu__: in that case, you might be screwed. you can start mysql in innodb recovery mode and see how much data you can dump out, but chances are the innodb transaction log is corrupt so you may not be able to recover everything
[18:26:50] yep, doesn't sound promising for alu__ unfortunately :/
[18:27:00] Actually Maps shows up as well even though it's not require()d, though GraphViz does not show up in Special:Version and both of those were installed with Composer.
[18:27:37] justinl: I don't know how they are loaded, but at least semanticmediawiki is loaded automatically if installed with composer
[18:27:42] if an extension was installed with composer, it'll be automatically loaded without any entry in LocalSettings
[18:28:00] GraphViz doesn't show even though it was installed with Composer
[18:28:19] alu__: https://dev.mysql.com/doc/refman/5.7/en/forcing-innodb-recovery.html
[18:28:27] last time my server rebooted we lost a month's worth of files off of it, as if the ext4 journal hadn't been flushed in all that time; that was fun. Fortunately, no db corruption though.
[18:28:28] alu__: what kind of situation is it, do you have the OS files? was it in a virtual machine?
[18:28:34] try with a value of 1, and then see if you can do a mysqldump with that
[18:29:24] after recovering, delete all the databases and restore from the dumps you made
[18:30:04] (by "delete all the databases" that also includes deleting the innodb transaction log files for those dbs. If you don't have innodb_file_per_table set, then you'll basically be reinstalling mysql)
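The two loading styles skizzerz contrasts at 18:25, as LocalSettings.php lines; 'SomeExtension' is a placeholder name:

    // Deprecated style (pre-extension-registration):
    require_once "$IP/extensions/SomeExtension/SomeExtension.php";

    // Current style, for extensions that ship an extension.json:
    wfLoadExtension( 'SomeExtension' );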
[18:31:12] Ok, so is it the existence of a composer.json that includes an autoload/files entry that causes an extension to be automatically loaded? GraphViz has an autoload section but no files entry in it.
[18:32:16] Quasar`: tnx, but {{canonicalurl:Special:OrphanedPages|offset=500}}
[18:32:17] justinl: probably the lock file?
[18:32:28] Quasar`: still looks like an external link
[18:32:53] yeah, unfortunately there's no way to style it like an internal link that I know of
[18:33:09] skizzerz, thank you, I tried that and unfortunately it does not seem to make the tables accessible
[18:33:35] harmaa, unfortunately a poorly maintained non-virtualized machine, I only have the data files
[18:33:55] other than, possibly, CSS in a template that contains it
[18:34:02] keep trying with increasing values of the recovery int (so if 1 doesn't work, then try 2, then 3, and so on)
[18:34:20] if you get up to 6 and it still doesn't work, your data is unrecoverable
[18:34:39] note you need to restart mysql each time after changing that
[18:34:52] harmaahylje: So it seems like the composer.lock file is created when composer update is run; it includes the contents of composer.json and composer.local.json, plus the info from each extension in the composer*.json require sections
[18:35:14] justinl: yep
[18:37:36] harmaahylje: Can that cause any unwanted behavior? I have 7 wikis all served by the same web servers, each one with its own identical copy of MediaWiki and extensions as its virtual host DocumentRoot, but each wiki's LocalSettings.php only includes the desired extensions for that wiki.
[18:37:50] Only 4 of our wikis actually use SMW.
[18:38:59] thanks skizzerz and harmaa, I will keep working on this, bye.
[18:39:05] good luck!
[18:39:15] you may wish to ask a mysql channel for more help if you're still getting stuck
[18:40:42] justinl: well I'd expect that composer only installs everything locally, but I don't really know actually, might be worth checking
[18:43:33] I am not very familiar with the PHP stack
[18:44:56] well I don't see any hidden folders in my dev wiki at least
[18:44:57] :S
[18:45:06] I've been supporting these wikis for several years now and only recently noticed this, though I only provide the infrastructure (web servers, databases, etc.); it's our communities that actually produce the content. I would imagine that there really shouldn't be an impact of having an extension loaded, for the most part, as long as it's not being used, but I'll check with our main editors.
[18:45:34] Thanks for pointing me in the right direction! :)
[18:45:37] well, generally extensions just provide extra functionality
[18:45:55] but I'd remove every extension I can, personally
[18:46:37] I just recently updated like 10 wikis that used every 3rd-party extension, most of which were never really maintained
[18:47:10] One that snagged us just this week is NewUserMessage. Loading it without some configuration causes a poor user experience, essentially, so I've disabled it until our main editors provide some feedback on how they want it configured.
[18:48:13] I've actually worked a lot over the years to get to having all wikis' directories be identical (I'd like to move to a wiki family structure), so having different contents in the wikis' extensions directories is a step backwards, but I can probably come up with a clean way of managing it.
[18:50:58] I think that would be a good idea
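skizzerz's recovery procedure from the surrounding discussion, condensed into one place. Paths and service commands vary by distro and are assumptions:

    # /etc/mysql/my.cnf, under [mysqld] — start at 1; raise one step at a
    # time (max 6) only if the dump below still fails:
    innodb_force_recovery = 1

    # restart mysql after every change to that value, then try to dump:
    sudo systemctl restart mysql
    mysqldump --all-databases > all-databases.sql

    # After a successful dump: remove innodb_force_recovery, drop the damaged
    # databases (including ib_logfile*/ibdata* if innodb_file_per_table was
    # off), and restore from the dump.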
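The special-page link options discussed at 17:59-18:32, gathered as wikitext. Note auvajs's finding that the URL-based forms render as external-style links:

    [[Special:OrphanedPages]]
    <!-- plain internal link; query parameters cannot be added this way -->

    [{{canonicalurl:Special:OrphanedPages|limit=500&offset=0}} Orphaned pages]
    <!-- works, but is styled as an external link -->

    {{localurl:Special:OrphanedPages|limit=500&offset=0}}
    <!-- same idea with a relative path instead of a full URL -->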
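How a Composer-installed extension ends up "just loaded", per the thread above: packages like Semantic MediaWiki register themselves through Composer's autoload machinery, so an entry in composer.local.json is the whole installation. A sketch; the version constraints are placeholders:

    {
        "require": {
            "mediawiki/semantic-media-wiki": "~2.5",
            "mediawiki/semantic-result-formats": "~2.5"
        }
    }

Running composer update --no-dev in the MediaWiki root then installs the packages and writes composer.lock, which is why they appear in Special:Version with no wfLoadExtension/require_once line.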
[18:51:07] hi. sorry for that simple question: how do I recreate the MediaWiki database?
[18:51:18] I can only imagine what a nightmare it would be if you have everything separately
[18:51:47] I had some combined setting where all wikis were running under the same instance of mediawiki, but each wiki had its own config
[18:51:59] and a lot of repetition, I don't know why it was configured like that
[18:52:25] It took a looong time to migrate all the common settings into one settings file, and to then only have wiki-specific settings for each wiki in their own file
[18:52:57] Yeah, having 28 copies of MediaWiki sucks, but I have one main copy on a "wiki manager" server and a script that rsyncs that main directory to all of the web servers.
[18:53:05] just use a mediawiki docker image and kubernetes
[18:53:30] Coming up with an integrated LocalSettings.php that includes the proper files based on the Host header will take a bunch of work, but that's on my roadmap.
[18:53:40] Guest75535: what? you want to start maintaining docker as well?
[18:53:54] justinl: That's kind of similar to how WMF does it
[18:54:09] I recently moved my wikis from an on-prem datacenter to AWS and just upgraded them to 1.30, and now I can take a breather and rethink the entire architecture. Docker/ECS is definitely something I'm going to investigate.
[18:54:21] EFS/NFS is the biggest issue, though.
[18:54:57] oh, if your requirement is to deploy and move them anywhere, then docker might be a good idea
[18:54:59] Yeah, I just have to figure out how to make containers work for our needs, all of the Varnish, Apache (may move to Nginx), and PHP-FPM stuff, plus the NFS mounts.
[18:54:59] nothing to maintain (maybe a private docker repository)
[18:55:47] just do it
[18:55:53] hmm, well, if you move all the wikis under the same instance, I don't know what docker brings into that, actually
[18:56:02] Not really about moving stuff around, but more about being able to quickly and reliably recreate web servers if any go out in the weeds, and to more easily support version upgrades, both MediaWiki and OS apps like Varnish, etc.
[18:56:10] it works like a charm (in my case on bare metal)
[18:57:03] justinl: just don't put the db in docker ;P
[18:57:47] why not?
[18:57:56] Right now I have four web servers behind 3 AWS ALBs (using SNI for multiple SSL wildcard certificates and differing security group requirements amongst some of the wikis), plus 4 Aurora instances, 1 big ElastiCache Memcached instance, and a large EFS filesystem for the upload directories.
[18:58:21] Guest75535: doesn't feel like a good idea to me
[18:58:35] I'm using Aurora MySQL so that's not an issue. :)
[19:00:01] hard facts are better than feelings :) sounds like: elephants can't jump - no matter how hard they try :)
[19:00:21] I've heard horror stories from someone putting their db in docker
[19:00:22] I've run into numerous architectural concerns and constraints over the years given the performance and resource requirements of our wikis, which have definitely led to a more complex architecture than I'd like, but certain constraints are hard to work around, like the need for NFS.
[19:01:52] yeah, NFS is a hard constraint. The main alternative is Swift, but that's also hard to set up
[19:03:34] The hardest thing about NFS has always been backups. Currently I'm rsyncing from the EFS filesystem to an EBS volume on my wiki manager EC2 instance, but at 90 GB and 2.6 million files, it takes about an hour or so just to calculate the changes!
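A minimal sketch of the Host-header dispatch justinl has on his roadmap (and which is roughly how WMF runs its farm); every file name and the fallback behavior here are assumptions:

    <?php
    // LocalSettings.php shared by all virtual hosts of the wiki family.
    $host = isset( $_SERVER['SERVER_NAME'] ) ? $_SERVER['SERVER_NAME'] : '';

    // Settings common to every wiki in the family:
    require_once __DIR__ . '/CommonSettings.php';

    // Per-wiki overrides, e.g. Settings/wiki1.example.org.php.
    // basename() guards against path traversal via a forged Host header.
    $perWiki = __DIR__ . '/Settings/' . basename( $host ) . '.php';
    if ( is_readable( $perWiki ) ) {
        require_once $perWiki;
    } else {
        http_response_code( 404 );
        die( 'Unknown wiki.' );
    }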
[19:16:30] CindyCicaleseWMF: What do you think about the idea of spamming the EMWCon live feed to the mediawiki.org sitenotice?
[19:17:02] There are probably people browsing mediawiki.org who would be interested
[19:17:08] bawolff: Great idea!
[19:17:30] ok, I'll do that
[19:17:36] Thank you!
[19:22:21] It's up there now
[19:22:36] let me know if there's something I should do to make it prettier
[19:22:45] blink and marquee
[19:22:57] and comic sans
[19:23:43] I guess I could add the logo
[19:24:34] have it spin every once in a while
[19:24:37] :P
[19:26:39] I guess the events list got removed from Main_Page https://www.mediawiki.org/w/index.php?title=Template:Main_page&diff=2713220&oldid=2535194
[19:28:58] bawolff: may want to modify [[MediaWiki:Sitenotice id]] as well
[19:29:16] at least I assume that's why I'm not seeing it
[19:29:47] oh nvm
[19:29:51] I hid it with personal css
[19:29:54] :P
[19:31:09] Skizzerz: Well, good catch nonetheless
[19:31:21] I forgot about that aspect of the dismissable site notice
[19:36:51] ugh, external links in the site notice
[19:37:07] I think fundraising got a slap for linking to youtube once
[19:38:18] I was wondering how long until people would get mad at me for that :P
[19:38:54] But it's not like we have streaming infrastructure. I think fundraising got slapped more for choosing youtube over commons for a pre-recorded video
[19:39:30] I believe videos will eventually be on commons after the conference
[19:40:03] it was the whole privacy/tracking issue, from memory
[19:40:12] even the nocookie mode of youtube leaves cookies
[19:56:16] Hi, I already asked a steward and at wikimedia-tech, nobody had any idea:
[19:56:40] what happened here: https://www.mediawiki.org/wiki/Talk:Reading/Web/PDF_Functionality#What%20happened%20to%20my%20post???
[19:57:24] both of those links work for me
[19:57:59] Skizzerz: You mean, you can see the post?
[19:58:15] here: https://www.mediawiki.org/w/index.php?title=Topic:U7yyjlehi3rvc4b7&action=history
[19:58:21] perhaps I'm not understanding the question
[19:59:13] For me and all the others I asked, the two posts are not shown on the page, nor in the history link above
[19:59:37] ah
[19:59:47] in that case, yeah, I'm not seeing them in either of those spots
[19:59:59] perhaps the posts were deleted? idk
[20:00:11] I have a meeting in a minute but I can dig in more after I'm out of that
[20:00:11] but who deleted them?
[20:00:30] gimme like 30-40 mins and then I'll be able to check in more detail :)
[20:00:36] or perhaps someone else can look in the meantime
[20:42:19] I'm running MediaWiki 1.30.0 with php 5.6.33 (fpm-cgi) and a separate user pool in fpm. It's been working fine for over a month. Today I wanted to log in from another computer. The wiki displays fine but I couldn't log in. Tried restarting httpd, mariadb and resetting my pw.
[20:42:37] only after restarting php-fpm could I get logged in.
[20:44:58] I'm using php-fpm with a separate user and I haven't had that problem... Well, if it happens again, it may help to set up a debug log to see if there's something there that can give you a hint of what's happening
[20:45:51] sessions are stored in the database now (by default; it can be configured to be in redis, memcached, etc), so it shouldn't be a problem of being stuck in apc or similar
[20:47:25] I was wondering about sessions...
[20:54:59] Vulpix: I don't think that's true. Sessions will be stored in APC if $wgMainCacheType = CACHE_ACCEL; which is a fairly common config
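The debug log Vulpix suggests at 20:44 is a single LocalSettings.php setting; the path is a placeholder and should point somewhere not web-accessible, since the log can contain session and request data:

    // Verbose MediaWiki debug output, including session handling:
    $wgDebugLogFile = '/var/log/mediawiki/debug.log';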
[20:56:02] I think that if it couldn't access the session, mediawiki gives a quite descriptive message
[20:56:06] Hmmm, right, that was causing a lot of problems when sessions started to use the object cache
[20:56:24] GumShoe: what does "couldn't log in" mean?
[20:57:23] Platonides: Sometimes the issue is that if people have something like apc.shm_size = 1M, sessions will get bumped from the cache very fast, and it acts as if sessions are just thrown out
[20:58:31] Anyways, if you just want to check if this is causing your issue, just set $wgSessionCacheType = CACHE_DB; This will be slower, but if it works, then it means that you have an issue with your cache config that's causing sessions to fail
[21:00:06] I don't see a reason to have session caching in apc. If you have php-fpm, or any setup where the webserver uses multiple threads to run PHP, there's no guarantee of reaching the same thread on each request
[21:00:36] also, if you restart the webserver or php-fpm, everyone gets logged out
[21:01:14] Debenben|away: I looked, and it was not because the revs are deleted
[21:01:30] if I had to venture a guess, I'd say it's probably due to replication lag or some sort of replication issue
[21:01:31] Vulpix: that's an interesting point for multithreaded setups
[21:01:52] but isn't apc shared between threads?
[21:02:42] I do kind of think we should consider making the default be CACHE_DB for ease of use, and just suggest setting it to CACHE_MEMCACHED (or whatever) as a performance optimization people could do
[21:02:50] well, I can't say for sure, but that would be problematic and either cause contention locks or race problems
[21:03:08] because CACHE_DB basically always works
[21:04:00] I assume race conditions are handled gracefully. e.g. WMF uses redis for sessions, I'm sure that can have race conditions similar to apc
[21:04:18] Debenben|away: if it's still broken tomorrow, open a task on phabricator, as it may be an extension bug at that point
[21:04:32] or at least make the installer more "clever": if it suggests $wgMainCacheType to be CACHE_ACCEL, also set $wgSessionCacheType to CACHE_DB
[21:05:07] every php-fpm process shares the same APC cache
[21:05:17] *every php-fpm process in the same pool
[21:06:26] maybe the installer should test that CACHE_ACCEL works sanely (e.g. check apc.shm_size)
[21:07:34] T147161
[21:07:34] T147161: Installer suggests $wgMainCacheType = CACHE_ACCEL, potentially breaking sessions - https://phabricator.wikimedia.org/T147161
[21:10:16] Is there a way to add an html class attribute to an internal link from wikitext?
[21:10:32] interesting, I am sure it was there when I wrote it
[21:11:13] Debenben: which is why I'm thinking replication issue. When you write it, you hit the master db and get the fresh live data. When reading, you're hitting slave dbs. Theoretically they're up to date, but perhaps some of them aren't?
[21:11:29] unfortunately I can't investigate that myself; someone with more access to the WMF environment is needed to check that theory
[21:11:42] ok, we'll see tomorrow
[21:11:44] thanks
[21:11:47] FoxT: No, you can do [[bar]] or do [[baz|fred]]
[21:11:48] FoxT: no. You have to wrap the link in a <span> with a class
[21:12:00] ninja'd :<
[21:12:45] Debenben: if the comments were deleted, they shouldn't be showing on your contribs page, plus there would be a deletion log entry for them
[21:12:52] which is why I think it's not that
[21:12:56] Will wrap it then. Thx both. :)
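bawolff's diagnostic from 20:58 as a LocalSettings.php snippet; CACHE_DB and CACHE_ACCEL are core constants:

    // Temporary diagnostic: keep sessions in the database instead of the
    // object cache. Slower, but if logins become stable afterwards, the
    // APC(u) cache (e.g. a tiny apc.shm_size) was evicting session data.
    $wgSessionCacheType = CACHE_DB;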
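The wrapper workaround FoxT settles on, sketched in wikitext plus CSS; the class name and styling are placeholders:

    <span class="fancy-link">[[Some page|displayed text]]</span>

    /* MediaWiki:Common.css — style the link through its wrapper */
    .fancy-link a { color: #CC0000; font-weight: bold; }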
[22:06:51] Hello! I'm a n00b with MediaWiki ... I've got it up and running on Ubuntu, however I'm running into an issue with file uploads ... I believe I have it all set up correctly and can upload files, however when I click on the file to download it in the wiki, I'm sent to a page with the file path that says "There is currently no text in this page." - the file doesn't download. Just wondering if
[22:06:51] anyone's seen that before ... maybe a setting I'm missing? Been googling and checking the help docs but so far no joy... THANKS!
[22:13:02] morrison23: something wrong with https://www.mediawiki.org/wiki/Manual:$wgUploadPath or with rewrite rules if you have set up short URLs
[22:13:52] morrison23: which urls did you set up?
[22:13:55] Vulpix: I set $wgUploadPath = './images';
[22:14:05] (in LocalSettings.php)
[22:14:08] bad choice
[22:14:10] that . is wrong
[22:14:14] Aha!
[22:14:26] !wg UploadPath
[22:14:26] https://www.mediawiki.org/wiki/Manual:%24wgUploadPath
[22:14:45] the default should probably work for you, don't override it in your LocalSettings
[22:14:49] in fact, you shouldn't need to change the default
[22:14:57] how did you set up your urls?
[22:15:20] it's probably an issue of having clean urls everywhere
[22:15:32] I would recommend copying the wikipedia approach
[22:15:38] install mediawiki on /w/
[22:15:43] and put articles at /wiki/
[22:16:01] Didn't change any of the defaults associated with URLs... Installed into /var/www/html/wiki (ubuntu/apache)
[22:16:33] I like the /w/ idea tho ... gonna try commenting out the $wgUploadPath = './images'; now and see if that does it.
[22:17:19] (I've USED Wikipedia a lot ... and I'm familiar with open source/php stuff but this is my first try with MediaWiki)
[22:18:16] EGG AND MY FACE are occupying the same point in space...
[22:18:36] Commenting that line out got it to where it's downloading ... !
[22:19:07] good
[22:20:28] Off to learn more about export/import templates ... this platform is SO interesting! I knew there was a reason I've been donating to Wikipedia all these years... LOL. THANK YOU FOR THE PUSH.
[22:21:59] Platonides: Thx for the path suggestion... will check into that scheme.
[22:37:22] Hmm. MediaWiki:Sitenotice doesn't seem to apply to mobile mediawiki.org
[22:37:27] Need to spam all the people!
[23:49:56] I noticed the mobile skin also doesn't seem to have a place for page status indicators *grumps*
[23:57:31] Quasar`, why are you not using the Timeless skin? It is mobile friendly.
[23:57:51] just using what came with MobileFrontend
[23:58:03] You don't have to install MobileFrontend.
[23:58:09] oh.
[23:58:26] https://www.mediawiki.org/wiki/Skin:Timeless
[23:58:43] Timeless takes over!
[23:58:55] :)
[23:59:01] Isarra reigns supreme!
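Platonides's Wikipedia-style layout as a LocalSettings.php/Apache sketch, with $wgUploadPath left at its default per the fix above; the paths follow the standard short-URL recipe on mediawiki.org, but verify them against your own vhost layout:

    // LocalSettings.php — MediaWiki installed under /w/, articles served at /wiki/:
    $wgScriptPath = '/w';
    $wgArticlePath = '/wiki/$1';
    // $wgUploadPath is deliberately not set; the default "$wgScriptPath/images" works.

    # Apache vhost config:
    RewriteEngine On
    RewriteRule ^/?wiki(/.*)?$ %{DOCUMENT_ROOT}/w/index.php [L]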