[03:59:54] hi, I'm looking for a string inside the MediaWiki source, but it seems it's not in core.
[04:00:24] So how can I find the related extension using Gerrit?
[04:00:55] It's probably related to the localization for the Turkish language.
[04:01:04] Skizzerz: do you have any idea?
[04:01:47] what's the string?
[04:01:52] or extension?
[04:05:18] legoktm: I don't know the extension; it's CustomizeModificationsOfSidebar in Common.js
[04:05:37] MediaWiki:Common.js?
[04:05:41] yes
[04:05:47] that means it's not an extension, it's something your wiki configured
[04:06:30] legoktm: but it has to be coming from somewhere, right? I'm sure I didn't write this.
[04:07:59] legoktm: where can I get the tr language pack?
[04:10:17] also, even if it's coming from a localization package, there should be a script that includes that part. Do you know which file handles this kind of modification to Common.js?
[04:34:03] Good evening, I'm trying to install MediaWiki on my host and two errors appear
[04:34:19] 500 Internal Server Error
[04:34:22] and 404 Internal Server Error
[04:39:01] Good evening, I'm trying to install MediaWiki on my host and two errors appear: 500 Internal Server Error and 404 Internal Server Error
[04:39:59] The server's support team says they have no control over the installation and use of MediaWiki
[08:27:10] Hello all. Is it possible (perhaps via an extension) to search for links that are missing? I don't mean broken links; I mean keywords in the content of a page that exist in the titles of other pages. A contributor may not know about another page (or may just be lazy) and missed putting in a link.
[08:32:25] you mean analyze text, to look for matches with existing titles?
[08:32:40] OliverUK1: i doubt it..
[10:29:15] thedj: Oh well, just a thought
[10:29:22] thedj: Thanks anyway
[13:07:50] legoktm: I fell asleep last night, I'm so sorry.
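As legoktm points out above, a string found in MediaWiki:Common.js is an on-wiki page, not part of the source tree, so it won't show up in a checkout. For strings that do live in core or in an extension, a recursive grep over a local checkout is the usual first step. A minimal sketch, using a throwaway directory in place of a real checkout (the paths and the `ExampleExt` extension are fabricated for illustration):

```shell
# Sketch: locating a string across a local MediaWiki checkout plus extensions.
# The tree below is a stand-in for a real clone; paths are hypothetical.
mkdir -p /tmp/mw-demo/extensions/ExampleExt
echo 'function CustomizeModificationsOfSidebar() {}' \
  > /tmp/mw-demo/extensions/ExampleExt/ext.js

# -r: recurse into subdirectories, -n: print line numbers with each match
grep -rn 'CustomizeModificationsOfSidebar' /tmp/mw-demo
```

If grep finds nothing in core or extensions, the string almost certainly comes from an on-wiki page such as MediaWiki:Common.js, edited by a local administrator.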
[13:08:44] My question was: where or how do I find the source of language packs?
[13:47:07] I want to set up two separate MediaWiki instances on the same box. I have *mostly* managed to do so. However, when I try to log in on the separate host, which is at server.com/differentWiki, the Special:UserLogin link redirects to server.com, and when I try to access server.com/differentWiki/index.php/Special:UserLogin my Firefox refuses.
[13:47:11] How do I fix this?
[13:49:21] note: I did set $wgScriptPath to server.com/differentWiki
[13:49:35] so now the link points to the correct location, but my browser still won't complete the request
[14:04:28] $request->getFullRequestURL() has an extra differentWiki inserted, so it's trying to go to server.com/differentWiki/differentWiki/index.php/Special:UserLogin. Thus the redirect loop. How do I fix this?
[14:10:56] Ulfr: not sure, but setting $wgServer explicitly may help
[14:11:00] !wg Server
[14:11:00] https://www.mediawiki.org/wiki/Manual:%24wgServer
[14:11:26] Ulfr: also, make sure you always set $wgArticlePath as well as $wgScriptPath
[14:18:48] DanielK_WMDE: am I doing it right if $wgServer and $wgScriptPath are identical?
[14:22:23] Evidently not. This is beginning to irritate me. I set $wgScriptPath to /differentWiki/ and now the homepage is server.com/differentWiki/differentWiki/index.php/Home after I get the login prompt to work correctly.
[14:28:39] Hmm. The universe is messing with me. It's only doing this for the first redirect after someone logs in. Bloody hell.
[14:32:23] I am having a problem confirming my email with the Snapwiki page for creating game maps in Doom.
[14:32:38] I am prompted to confirm my email address so I can edit pages.
[14:33:38] I am shown an HTTP 500 error message when I try to have the confirmation code sent to my email
[14:33:52] Any suggestions? I've tried it on a Chromebook, a laptop, and a phone.
[14:37:47] xSOCCER_MOMSx: i don't think we can help you here.
you should contact the administrator of that wiki and let them know their account creation is broken.
[14:39:21] How would I be able to contact the admin of the site? I'm looking all over for a "contact us" page with no luck. I will send the link to the homepage.
[14:39:30] http://snapwiki.doom.com/index.php?title=Main_Page
[14:49:27] xSOCCER_MOMSx: I think what they're trying to say is that you need to contact them, because there's nothing the MW devs can do
[14:50:22] Oh, I know. I linked the homepage of the wiki page I was on. I can't find a way to contact the admins of that site.
[14:50:26] http://snapwiki.doom.com/index.php?title=Main_Page
[15:30:33] join
[15:44:59] hi, is there any way to run the SmiteSpam extension on a very large set of pages and users?
[15:45:28] maybe 25k is enough, I think
[15:45:49] I have a MediaWiki that was spammed years ago; it's patched nowadays but still needs cleaning
[15:46:05] the SmiteSpam AJAX interface shows me only 250 pages per query
[16:16:37] Hello
[16:17:39] How many images did you take to make the Facebook image?
[16:18:39] Jooin
[18:59:21] hi, where can I get the language packs?
[19:00:02] I mean, I'm looking for a page like https://download.moodle.org/langpack/2.0/ for MediaWiki
[19:01:29] tiddlywink: do you have any idea?
[19:01:44] no
[19:02:16] I'll have a look though
[19:02:47] Language packs? What are they supposed to do?
[19:03:02] FoxT: I'm looking for the Turkish translation.
[19:03:14] tiddlywink: OK, thank you.
[19:03:40] FoxT: I need to search for a string inside the package.
[19:03:43] mertyildiran: Of MediaWiki? Languages should already be included
[19:03:47] I'd guess you want to start at https://www.mediawiki.org/wiki/Languages
[19:04:04] and probably look through https://www.mediawiki.org/wiki/Category:Internationalization_extensions
[19:04:38] and https://www.mediawiki.org/wiki/Manual:Language
[19:05:39] Maybe https://github.com/wikimedia/mediawiki/tree/master/languages/i18n?
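The languages/i18n directory linked above holds core translations as per-language JSON files (Turkish would be tr.json), so finding which interface message contains a given translated string is a grep over those files. A sketch with a fabricated sample file standing in for the real tr.json (the keys shown are illustrative):

```shell
# Sketch: finding the message key for a translated string.
# Core translations live as JSON under languages/i18n/ (e.g. tr.json);
# this sample file is fabricated for illustration.
mkdir -p /tmp/i18n-demo
cat > /tmp/i18n-demo/tr.json <<'EOF'
{
    "mainpage": "Ana Sayfa",
    "search": "Ara"
}
EOF

# Which message key carries this Turkish string, and in which file?
grep -n 'Ana Sayfa' /tmp/i18n-demo/*.json
```

The matching line shows the message key, which can then be looked up on translatewiki.net or overridden on-wiki via the MediaWiki: namespace.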
[19:05:44] tiddlywink: I think it should be on this site: https://translatewiki.net/
[19:06:28] https://translatewiki.net/wiki/Translating:MediaWiki
[19:06:30] FoxT: OMG, thank you.
[19:06:59] np
[19:07:03] tiddlywink: a big thank you to you too :)
[19:07:23] no probs, always happy to try and help
[19:42:36] has anyone used the LDAPAuthorization plugin with success?
[20:38:56] hi, I have enabled apiThumbCacheExpiry with $wgForeignFileRepos, but now there are too many images cached under the images/thumb directory.
[20:40:25] apiThumbCacheExpiry was set to 60*60*24*90, so images will remain in the directory for 3 months
[20:40:35] But I want to remove these cached images; how can I remove them?
[20:41:34] I have tried 'apiThumbCacheExpiry' => 60 and 'apiThumbCacheExpiry' => 0, but no images were removed.
[20:43:40] !maintenance
[20:43:40] https://www.mediawiki.org/wiki/Manual:Maintenance_scripts
[20:44:33] saper: which script should I use?
[20:44:50] deleteImageMemcached.php?
[20:45:08] or pruneFileCache.php?
[20:47:00] I don't remember
[20:47:02] I don't want to make a mistake, because there are 200,000+ records
[20:47:10] I thought one of them might work
[20:47:14] hmm, OK, I will try to run php pruneFileCache.php
[20:47:19] are you using memcached?
[20:47:26] deleteImageMemcached.php is deprecated
[20:47:46] No, no; the problem is that the files are already there under the images/thumb directory
[20:47:57] nothing memory-related; the files are on the disk.
[20:50:24] saper: it said "Nothing to do -- $wgUseFileCache is disabled."
[20:51:09] 'apiThumbCacheExpiry' is a setting for the InstantCommons foreign-repo tunneling
[20:51:17] I think it's not FileCache
[20:51:44] while tunneling, it creates temporary images
[20:52:28] just purging old files might work
[20:54:05] saper: by which method?
[20:54:35] I have executed both php pruneFileCache.php --agedays 1 and php pruneFileCache.php --agedays 0
[20:54:47] and it said "Nothing to do -- $wgUseFileCache is disabled."
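For context, the setting being discussed lives inside a $wgForeignFileRepos entry in LocalSettings.php. A sketch assuming an InstantCommons-style ForeignAPIRepo setup (the repo name and values here are illustrative, not the user's actual configuration):

```php
// Sketch of an InstantCommons-style foreign repo in LocalSettings.php.
$wgForeignFileRepos[] = [
    'class' => ForeignAPIRepo::class,
    'name' => 'commonswiki',
    'apibase' => 'https://commons.wikimedia.org/w/api.php',
    'hashLevels' => 2,
    'fetchDescription' => true,
    // Locally cached thumbnails are considered stale after this many
    // seconds. Lowering it (or setting 0) affects future caching only;
    // as seen above, it does not delete files already on disk.
    'apiThumbCacheExpiry' => 60 * 60 * 24 * 90, // 90 days
];
```

This also explains why pruneFileCache.php reports "Nothing to do": that script manages the $wgUseFileCache page cache, which is a separate mechanism from the foreign-repo thumbnail cache.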
[20:57:53] I'd do it with find(1)
[20:58:18] with -atime you can even find recent files that have not been accessed
[21:01:33] saper: I don't understand. What command should I run? Could you give me the full command?
[21:02:38] saper: is it OK to remove the images/thumb directory?
[21:12:51] mertyildiran: > find . -type d -name "thumb" -exec echo rm -r {} \;
[21:13:06] remove the "echo" to actually do the removal :P
[21:13:35] that will remove all thumbs, though, not only the old ones that haven't been accessed recently
[21:19:16] Vulpix: OK, but is there any possibility of corrupting the database etc. by removing them manually? I mean, we are deleting them before the expiry. Maybe it could break something when the expiration date comes.
[21:21:29] I don't want to see unexpected consequences; this is why I'm asking, to be sure about it.
[21:21:41] mertyildiran: there's no harm in removing them if you have set up a 404 handler for thumbs: https://www.mediawiki.org/wiki/Manual:Thumb.php
[21:22:02] otherwise, you may see missing thumbnails until you purge the pages that use them
[21:23:54] Vulpix: how can I purge all the pages?
[21:24:35] changing https://www.mediawiki.org/wiki/Manual:$wgCacheEpoch should trigger a reparse of each page when anyone views it
[21:25:28] Vulpix: oh, OK, thank you.
[21:28:56] Vulpix: Is restarting Apache while running php importDump.php OK?
[21:29:11] yes
[21:33:05] Vulpix: OK, executing rm -rf under images/thumb and restarting Apache fixed the issue. Thanks a lot!
[21:33:36] yw :)
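Combining saper's -atime hint with Vulpix's find command gives a more selective variant that removes only stale thumbnails instead of everything. A sketch against a throwaway tree standing in for a real images/thumb directory (paths and the file name are hypothetical); the -print form is a dry run, and the destructive -delete form is left commented out:

```shell
# Sketch: pruning stale cached thumbnails with find(1).
# A throwaway tree stands in for a real MediaWiki images/thumb directory.
mkdir -p /tmp/wiki-demo/images/thumb/a/ab
touch /tmp/wiki-demo/images/thumb/a/ab/Example.png

# Dry run: list thumbnail files not accessed in the last 90 days.
# (Freshly created files will not match.)
find /tmp/wiki-demo/images/thumb -type f -atime +90 -print

# Destructive variant; uncomment only once the dry run looks right:
# find /tmp/wiki-demo/images/thumb -type f -atime +90 -delete
```

Note that -atime relies on the filesystem recording access times; on volumes mounted with noatime the timestamps never update, and an age-based prune with -mtime is the safer criterion there.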