[02:01:57] is there a hook for when somebody submits the form to create a new account and before it is created?
[02:05:11] hi Extrarius
[02:05:42] several hooks
[02:06:18] you can search in https://github.com/wikimedia/mediawiki/blob/master/docs/hooks.txt
[02:06:24] or there is the AuthManager doc perhaps
[02:07:17] https://www.mediawiki.org/wiki/Manual:SessionManager_and_AuthManager offers a cleaner solution than implementing several hooks
[02:07:50] for example, AbortNewAccount is deprecated
[02:19:53] how about modifying the new account form to include the realname field, make it required, and have basic validation (minimum and maximum character limits, limited character set)
[08:39:27] is this the place to get help for wikimedia api?
[08:39:50] yup
[08:42:25] OK I have this url but I'm not getting an image for amazon.com, am I doing something wrong?
[08:42:27] https://en.wikipedia.org/w/api.php?format=json&action=query&generator=search&gsrnamespace=0&gsrsearch=amazon&gsrlimit=10&prop=pageimages|extracts|images&pilimit=max&exintro&explaintext&exsentences=1&exlimit=max
[08:43:40] mamo: It's a disambiguation page
[08:43:47] https://en.wikipedia.org/wiki/Amazon
[08:44:06] Uh, you're searching
[08:44:20] yes, I'm searching
[08:45:01] You probably need to set some more limits for the other things
[08:45:12] &imlimit=max
[08:45:17] Gives a load more images
[08:47:32] let me try it, thanks for helping
[08:57:11] thank you guys!! it works for me
[12:32:49] who is in charge of copying strings from translatewiki to wikipedias? I'm interested in updating jbo.wikipedia.org. Can someone transfer localized strings?
[12:33:52] gleki: It's done automatically
[12:35:46] Reedy: but I can see some strings haven't been updated for years although Translatewiki has them
[12:35:59] Has the wiki got local overrides?
[12:36:20] https://github.com/wikimedia/mediawiki/blob/master/languages/i18n/jbo.json
[12:36:26] https://github.com/wikimedia/mediawiki/blob/master/languages/i18n/jbo.json
[12:36:28] ffs
[12:36:33] I know what it'll be
[12:36:35] Reedy: even if it had (I have no permission), the message says one should do translations at translatewiki.net
[12:36:44] The language will have dropped below a threshold to be exported
[12:37:01] Nikerabbit: ^^
[12:38:37] Reedy: sounds likely
[12:38:48] the export threshold is at 13% IIRC
[12:38:52] gleki: Translate more messages!
[12:39:18] about 350 translations in mw core
[12:39:54] I could do an export... if I could remember how to do it
[12:43:58] Reedy: translatewiki.net is exported into https://github.com/wikimedia/mediawiki/blob/master/languages/i18n/jbo.json ?
[12:44:06] No
[12:44:21] It should be, if you had enough translations to reach the threshold
[12:44:33] what is the threshold?
[12:44:41] oh, 350, I see
[12:44:45] well, no
[12:44:46] 13%
[12:44:53] https://translatewiki.net/w/i.php?title=Special%3ALanguageStats&x=D&language=jbo&suppresscomplete=1
[12:45:00] MediaWiki[expand] 30,457 29,882 1% 1%
[12:45:00] MediaWiki (most important messages) 585 317 45% 1%
[12:45:07] Note, the first one includes all extensions
[12:45:11] and then it would be exported automatically?
[12:45:22] Installer 305 300 1% 0%
[12:45:22] MediaWiki Action API 1,683 1,683 0% 0%
[12:45:23] MediaWiki core 3,827 3,462 9% 1%
[12:45:23] Yup
[12:45:30] I think you need 13% of MediaWiki core
[12:46:14] 13% out of 3,827 then
[12:46:36] yeah
[12:46:39] you seem to have 9%
[12:49:56] Reedy: thanks for the explanations! issue closed :)
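
Since the deprecated AbortNewAccount hook came up in the account-creation question at the top of this log, here is a minimal, untested sketch of the AuthManager-based way to veto a signup before the account exists: a pre-authentication provider. The class name, message key and real-name rule are made-up placeholders; it assumes MediaWiki 1.27+ and that the "real name" field is shown on the signup form.

<?php
use MediaWiki\Auth\AbstractPreAuthenticationProvider;

// Hypothetical provider: rejects account creation when the real name is
// missing or fails a simple length/character-set check.
class RealnamePreAuthProvider extends AbstractPreAuthenticationProvider {
	// Called when the signup form is submitted, before the account is created.
	public function testForAccountCreation( $user, $creator, array $reqs ) {
		$realName = $user->getRealName();
		if ( !preg_match( '/^[\p{L} .\'-]{3,50}$/u', $realName ) ) {
			// 'example-bad-realname' is a hypothetical i18n message key.
			return \StatusValue::newFatal( 'example-bad-realname' );
		}
		return \StatusValue::newGood();
	}
}

// LocalSettings.php: register the provider with AuthManager.
$wgAuthManagerAutoConfig['preauth'][RealnamePreAuthProvider::class] = [
	'class' => RealnamePreAuthProvider::class,
];
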
[13:44:44] So, dumb question: how, exactly, does a wiki get listed on the interwiki map by MediaWiki's standards?
[13:46:47] For everyone? Or for WMF wikis?
[14:11:42] Reedy: For the general MW releases (which may or may not be WMF's standards, honestly don't know)
[14:14:53] honestly, those haven't been updated in a very long time
[14:15:14] you could try starting a thread on wikitech-l, if you can convince most people there that your site should be added, I'd add it
[14:15:31] but mostly we don't add most sites to the default mw one, I think there's only like 10 in the default install
[14:20:16] Hi. Will the login centralisation issues with CentralAuth be fixed in 1.29?
[14:22:49] Reception123, what are "the login centralisation issues with CentralAuth"?
[14:23:02] is there a Phab task?
[14:23:43] andre__: https://phabricator.wikimedia.org/T141482
[14:24:02] I know they are not fixed, but I'm wondering if there's still time
[14:24:16] These issues are affecting WMF wikis as well as other wikis using CentralAuth
[14:24:26] Reception123, I don't see anything in that task that implies that anyone would work on this, no
[14:25:08] andre__: shouldn't the priority be High or even maybe "Unbreak now" considering that this is an important feature on WMF wikis and it's affecting logins?
[14:25:51] Reception123, I don't see why this task should suddenly be urgent - this was reported nearly a year ago.
[14:26:12] Reception123: also, do you know how many people have been impacted by this?
[14:26:36] Not sure on WMF, but on our wikifarm there's a lot
[14:26:55] since when?
[14:27:46] since 1.28 I think
[14:28:02] It used to impact me for a while after that but then it suddenly got fixed
[14:28:55] Reception123: at first glance it doesn't even seem to be a code issue
[14:28:58] if you have specific information or good reasons why you think this is an urgent task, feel free to add them as a comment to the task, along with why you propose a higher urgency/priority
[14:29:20] Given the root cause is not even fully diagnosed, the chance of a fix in 1.29 is almost zero
[14:29:49] existing comments seem to imply that it might be caused by sessions expiring too fast or something
[14:30:08] If it's happening at a high rate on your wiki, it may be a configuration issue
[14:30:31] bawolff: any specific configuration variables I could look at?
[14:30:40] Fixing this issue would be really great for us
[14:31:40] It might be that wherever your session is stored doesn't have enough space
[14:32:01] If you're storing in memcached, try giving it more memory
[14:32:17] redis
[14:34:11] I don't know much about redis, but if you're using redis as your session store I assume there's something similar, i.e. how much memory it gets and a cache eviction policy that can be adjusted
[14:42:41] bawolff Vulpix : thanks for your help! I will look into it
[15:42:39] Hi, I'm importing data from Wikipedia to my MediaWiki installation using this method: https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps#Using_importDump.php.2C_if_you_have_shell_access
[15:43:14] My question is how can I automatically get the media files?
[15:48:10] Do I have to manually get the images from here: http://ftpmirror.your.org/pub/wikimedia/images/wikipedia/
[15:49:30] Also it seems a bit outdated. Is there a more convenient way? I tried instant commons but I want to store the images inside my server and don't want to load the images from Wikipedia's servers to the browser.
[15:51:07] heey please help me
[15:51:12] anyone there???
[15:51:18] Is this room dead?
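
On the session-store hunch in the CentralAuth discussion above (sessions possibly being evicted too fast): a minimal, untested sketch of giving MediaWiki a dedicated redis-backed session cache, so memory and eviction behaviour can be tuned on the redis side instead of competing with the main object cache. The cache name and server address are placeholders, and it assumes the PHP redis client is installed.

// LocalSettings.php sketch
$wgObjectCaches['redis-sessions'] = [
	'class' => 'RedisBagOStuff',
	'servers' => [ '127.0.0.1:6379' ], // placeholder address
];
// Store sessions in the dedicated cache only.
$wgSessionCacheType = 'redis-sessions';
// On the redis side, make sure maxmemory is large enough that session keys
// are not evicted early (see redis maxmemory / maxmemory-policy).
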
[15:51:32] no
[15:51:39] @Vulpix oh great
[15:51:49] Did you read my question?
[15:51:55] wikipedia, and particularly commons, has terabytes of images, you probably don't want to download all images to your server :)
[15:52:10] But I'm not copying en.wikipedia.org
[15:52:31] I'm copying tr.wikipedia.org
[15:52:39] It's relatively small
[15:53:24] I don't know if tr.wikipedia.org has local uploads or it requires users to upload to commons.wikimedia.org
[15:53:41] anyway, it probably uses commons images as well
[15:53:50] No no let me explain
[15:54:00] I'm importing the articles from the XML dump
[15:54:16] Look at this page for example: https://lugat.org/Mustafa_Kemal_Atat%C3%BCrk
[15:54:40] There is an image called "Dosya:Ataturk-1905-Zubeyde-Makbule.jpg"
[15:55:01] How can I download/get this image from Wikipedia automatically?
[15:56:08] Is there a script for this?
[15:56:33] automatically? with instant commons
[15:56:33] You can use instantcommons
[15:56:37] !instantcommons
[15:56:37] InstantCommons is a feature for MediaWiki 1.16+ that allows any MediaWiki installation to use media on the Wikimedia Commons. This has basically been realized via $wgForeignFileRepos. See . If you're only looking to use images from Wikimedia Commons and no other wikis, you can use the shortcut setting $wgUseInstantCommons to true.
[15:56:52] Let's say I have only 5 articles in the database and I want to download the necessary images for these articles.
[15:57:03] No, wikipedia.org is blocked in Turkey.
[15:57:13] This is why we are trying to create an alternative.
[15:57:27] If I use instant commons, people still cannot reach the images.
[15:57:27] mertyildiran: instant commons can be configured to not hotlink
[15:57:43] @bawolff what do you mean by hotlink?
[15:58:09] I mean mediawiki downloads the image, instead of just doing an <img> pointing at the remote file
[15:58:09] Do you mean my server can tunnel the images?
[15:58:14] yeah
[15:58:32] @bawolff great! how can I configure it that way?
[15:58:43] it's the apiThumbCacheExpiry option in $wgForeignFileRepos
[16:00:05] @bawolff could you explain more, what exactly should I add to LocalSettings.php?
[16:00:14] just a minute
[16:00:27] @bawolff OK, I'm waiting.
[16:01:23] mertyildiran: Try maybe https://dpaste.de/wdkb
[16:01:27] I haven't really tested it
[16:01:50] There's also a bug with cascading file repos, where it will only fetch descriptions from local turkey images and not commons
[16:02:53] There aren't very many local files at turkey, it may make sense just to load from commons instead of the turkish wiki, so the descriptions work
[16:03:03] the config I gave you to try would load from turkish
[16:03:39] The apibase field would control whether it's the turkish wiki or just commons
[16:04:15] Firstly, what do you mean by "cascading file repos"?
[16:04:42] and secondly, both en.wikipedia.org and tr.wikipedia.org are blocked in Turkey
[16:04:59] actually the wikipedia.org domain is completely blocked
[16:05:01] hmm, that config was wrong
[16:05:17] https://lugat.org/Mustafa_Kemal_Atat%C3%BCrk
[16:05:21] This page is now working
[16:05:25] just a second
[16:05:31] my server is in the US
[16:05:43] by the way don't misunderstood
[16:06:19] I can reach en.wikipedia.org and tr.wikipedia.org from my server without any problem
[16:06:47] by the way don't misunderstand the situation
[16:06:52] *by the way don't misunderstand the situation
[16:09:07] oh, the config was right
[16:10:10] @bawolff your configuration is working flawlessly, it seems there is no problem.
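
The dpaste link above is no longer readable here, so as a rough sketch of the kind of config being discussed (an assumption, not necessarily what was in the paste): a ForeignAPIRepo entry where a non-zero apiThumbCacheExpiry makes MediaWiki download and serve thumbnails locally instead of hotlinking them. The repo name and expiry values are arbitrary examples.

// LocalSettings.php sketch
$wgForeignFileRepos[] = [
	'class' => 'ForeignAPIRepo',
	'name' => 'trwiki', // any unique name
	// Point apibase at tr.wikipedia.org to get both local trwiki files and
	// Commons files, or at https://commons.wikimedia.org/w/api.php for
	// Commons only (as discussed below).
	'apibase' => 'https://tr.wikipedia.org/w/api.php',
	'hashLevels' => 2,
	'fetchDescription' => true, // render file description pages
	'descriptionCacheExpiry' => 43200, // 12 hours
	'apiThumbCacheExpiry' => 86400, // non-zero: thumbnails are cached and served locally
];
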
[16:10:26] @bawolff can I ask you more questions?
[16:11:00] mertyildiran: What I mean by cascading file repos is: If you go to a page like https://lugat.org/Dosya:Example.png, there is no description. But images directly from turkish wikipedia are ok, e.g. https://lugat.org/Dosya:Ver_Elini_%C4%B0stanbul.jpg
[16:11:46] hmm OK I get it
[16:11:51] there are only a small number of images direct from tr.wikipedia.org, so it might make sense to use an apibase of https://commons.wikimedia.org/w/api.php which will only include images from commons.wikimedia.org and not tr.wikipedia.org, but then the images from commons would have correct description pages
[16:12:30] hmm OK I will try both
[16:12:42] err, nevermind, there are actually quite a lot of files locally in turkish https://tr.wikipedia.org/wiki/%C3%96zel:MediaStatistics
[16:12:52] feel free to ask any other questions
[16:13:08] my next question is my <ref> tags are not working. Is there an extension missing from my installation?
[16:13:33] !e cite
[16:13:33] https://www.mediawiki.org/wiki/Extension:cite
[16:13:57] great! thanks :)
[16:14:00] https://tr.wikipedia.org/wiki/%C3%96zel:S%C3%BCr%C3%BCm has a list, but not all of them are important and some are really hard to install
[16:14:01] one more question
[16:14:28] OK good
[16:14:36] also full config files are available at https://noc.wikimedia.org/conf/ (however the config files are for all of wikimedia, and really hard to understand/read)
[16:14:41] This page has a legend on the left: https://tr.wikipedia.org/wiki/Mustafa_Kemal_Atat%C3%BCrk
[16:14:54] But my version does not have a legend: https://lugat.org/Mustafa_Kemal_Atat%C3%BCrk
[16:15:04] How can I enable the legends?
[16:15:33] I mean that long vertical informative box on the left
[16:15:52] I don't know what the name of that part is
[16:16:00] looks like you're missing https://tr.wikipedia.org/wiki/%C5%9Eablon:Makam_sahibi_bilgi_kutusu and other templates
[16:16:02] infobox vcard
[16:16:06] they are usually called "infoboxes"
[16:16:11] !wptemplates
[16:16:11] To copy templates from Wikipedia, use Special:Export and check the "Include templates" option to get all the sub-templates, then upload the file with Special:Import on your wiki. You'll also likely have to install the ParserFunctions extension, Scribunto extension and install/enable HTML tidy. You also might need some CSS from Wikipedia's Common.css. You'll also need a lot of...
[16:16:41] umm, but if you're importing the whole dump, it's probably better to import a dump that includes templates (as opposed to importing them one by one with Special:Export)
[16:17:33] @bawolff I do not understand how to enable "infoboxes"
[16:17:54] Could you explain it again please?
[16:18:36] I assume you imported from a dump, right?
[16:18:38] e.g. https://dumps.wikimedia.org/trwiki/20170501/
[16:18:51] yes exactly this xml dump
[16:19:33] you maybe used a dump that didn't have templates in it
[16:19:41] you need one of the dumps that have all pages
[16:19:56] e.g. https://dumps.wikimedia.org/trwiki/20170501/trwiki-20170501-pages-meta-history.xml.7z
[16:20:20] or https://dumps.wikimedia.org/trwiki/20170501/trwiki-20170501-pages-meta-current.xml.bz2 if you don't want old versions of pages
[16:21:10] the problem is you're missing some pages. The infobox is actually the page named https://lugat.org/%C5%9Eablon:Makam_sahibi_bilgi_kutusu
[16:21:12] @bawolff so only these dumps have the infoboxes?
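
A minimal sketch of the extension setup the !wptemplates note above mentions for getting copied Wikipedia templates (infoboxes) to render; the exact tidy configuration is version-dependent, so treat these lines as assumptions to check against the docs for your MediaWiki version.

// LocalSettings.php sketch
wfLoadExtension( 'ParserFunctions' ); // #if:, #switch:, etc. used by most infoboxes
wfLoadExtension( 'Scribunto' );       // Lua modules that many templates depend on
$wgScribuntoDefaultEngine = 'luastandalone'; // needs a Lua binary on the server
// HTML tidy cleans up leftover markup; on the MediaWiki versions current at
// the time this was the classic switch for the external tidy binary:
$wgUseTidy = true;
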
[16:21:54] OK I get it
[16:21:59] So what should I do now?
[16:22:09] In theory I actually thought all of them had templates, but it appears that you don't have templates for some reason
[16:22:11] Do I have to reinstall the whole MediaWiki?
[16:23:05] umm, possibly
[16:23:11] or is it OK to just stop the import and continue with the dumps you have provided?
[16:23:32] OMG :((
[16:23:57] @bawolff are you on Gitter?
[16:24:51] I've never heard of it
[16:25:08] mertyildiran: Is the dump still running?
[16:25:23] it could be that you have the right dump, and it just hasn't gotten to the templates yet
[16:25:26] @bawolff yeah the dump is still running
[16:25:46] ok, I'd wait a little while then until it's done, it's possible the templates are at the end of the dump or something
[16:26:16] oh yeah it could be
[16:27:00] how do we know how many pages the dump has?
[16:28:22] I guess you could grep the element that introduces a new page in the xml file and pipe to wc
[16:28:42] https://tr.wikipedia.org/wiki/%C3%96zel:%C4%B0statistikler is the total for turkish
[16:28:54] but I'm not sure which dump you have, so it may or may not have all those pages
[16:29:02] https://www.mediawiki.org/wiki/User:Ciencia_Al_Poder/Extract_page_from_XML_dump
[16:37:54] @bawolff and @Vulpix thank you so much for this information. I could maybe ask more questions in the following days. Take care...
[17:23:17] Glad to help
[18:06:38] Is there any kind of guide on how to modify forms on a wiki, specifically the create account form, to add a field with basic validation?
[18:58:47] https://meta.wikimedia.org/w/index.php?title=Special%3ALog&type=gblrename&user=&page=&oldname=Fenerli1978
[18:59:08] is there a quick way to find the API equivalent of these results?
[19:46:06] HakanIST: there's https://meta.wikimedia.org/wiki/Special:ApiSandbox#action=query&format=json&list=logevents&letype=gblrename however there's no oldname parameter in the api
[19:53:53] Vulpix: that param is essential for my bot, do you think there's an alternative way?
[19:57:49] report a bug if none exists already. See if scraping the HTML of the Special:Log page is an option for you in the meantime :P
[20:00:04] good old scraping yea :-)
[20:57:01] @bawolff hey me again. While importing the XML dump of tr.wikipedia my logo suddenly changed to the original tr.wikipedia logo: https://lugat.org/Ana_Sayfa
[20:57:32] But I can't see such a change in LocalSettings.php. How is that possible?
[21:00:03] mertyildiran: maybe it's being overridden at MediaWiki:Common.css or MediaWiki:Vector.css
[21:04:44] @Vulpix where are these files? I mean the location/directory
[21:05:00] mertyildiran: it's not a file, but a page
[21:06:09] @Vulpix oh I just realized that yeah. So how can I change this line to accept LocalSettings.php: background-image: url("//upload.wikimedia.org/wikipedia/tr/b/bc/Wiki.png") !important;
[21:06:23] @Vulpix oh I just realized that yeah. So how can I change this line to accept LocalSettings.php:
[21:06:24] background-image: url("//upload.wikimedia.org/wikipedia/tr/b/bc/Wiki.png") !important;
[21:07:02] this is CSS code, not related to LocalSettings.php
[21:07:34] oh, so the effect of LocalSettings.php is irreversibly gone?
[21:08:14] edit the page and remove the line
[21:10:05] @Vulpix OK I fixed the logo, thank you very much: https://lugat.org/Ana_Sayfa
[21:10:09] I have one more issue.
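
For the logo question just above: the logo normally comes from $wgLogo in LocalSettings.php, and the imported MediaWiki:Common.css was simply overriding it with the quoted !important background-image rule, which is why editing that page (rather than LocalSettings.php) fixed it. A minimal sketch, with a placeholder path:

// LocalSettings.php sketch -- the path is a placeholder.
$wgLogo = "$wgScriptPath/images/site-logo.png"; // typically a ~135x135 PNG
// CSS on the MediaWiki:Common.css / MediaWiki:Vector.css pages can still
// override this, as it did after the import.
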
[21:11:05] I have installed the Cite extension by adding wfLoadExtension( 'Cite' ); to LocalSettings.php
[21:11:10] https://lugat.org/%C3%96zel:S%C3%BCr%C3%BCm
[21:11:28] But <ref> tags still don't work: https://lugat.org/Mustafa_Kemal_Atat%C3%BCrk
[21:12:20] mertyildiran: did you enable the extension before or after the import?
[21:12:39] @Vulpix the import of the data, you mean?
[21:12:53] yes
[21:12:57] after :(
[21:13:14] Actually that shouldn't matter
[21:13:38] unless page content is cached :)
[21:13:42] Since an edit to LocalSettings.php should kill cached entries, in theory
[21:13:45] I did not download any files to extensions/Cite because there are files already there
[21:14:20] Ctrl + F5 is not working either
[21:14:43] Should I restart apache?
[21:14:50] Although it's apparent that's not true, I guess
[21:15:08] No, restarting apache does nothing
[21:15:10] @bawolff what is not true
[21:15:27] That installing Cite clears all cache
[21:15:32] it's supposed to
[21:15:44] but that didn't happen for you
[21:15:53] so I guess it does not
[21:16:56] It's not cache related. ?action=purge did not fix it
[21:17:43] * bawolff not sure why
[21:19:29] I can't see any References section at the bottom of the pages either
[21:20:27] the wiki is actually open for editing
[21:20:40] previewing the page doesn't make them work
[21:20:55] I got a timeout on preview...
[21:21:45] mertyildiran: you should install the ParserFunctions extension (and probably Scribunto)
[21:23:09] doing a preview of only the <ref> works, though
[21:23:18] But <ref> does work in Special:ExpandTemplates...
[21:23:22] jinx
[21:24:37] works for me now
[21:26:10] I have added wfLoadExtension( 'ParserFunctions' ); still it does not work for me
[21:26:25] @Vulpix how does it work for you?
[21:26:36] @bawolff how can I add Scribunto?
[21:26:37] I mean in Special:ExportTemplates
[21:26:52] *ExpandTemplates
[21:26:56] The ParserFunctions suggestion was for a different issue
[21:27:31] yeah it worked for me in expandtemplates too. It's really weird
[21:27:56] it doesn't hit parser limits apparently
[21:28:28] @Vulpix could you tell me what I should do, step by step, please
[21:28:32] mertyildiran: some of the extra html junk (e.g. …) on the article you linked above is caused by no html tidy
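
One possible explanation for the stale <ref> output above (a guess, not a confirmed diagnosis, and the caching angle is picked up again just below): pages parsed before Cite was enabled were still being served from the parser cache. A minimal sketch of forcing those entries to be re-parsed by bumping the cache epoch; the timestamp is an arbitrary example.

// LocalSettings.php sketch: anything cached before this UTC timestamp
// (YYYYMMDDHHMMSS) is treated as stale and re-rendered on the next view.
$wgCacheEpoch = '20170530000000';
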
[21:28:34] because I do not understand
[21:28:56] using {{:Mustafa Kemal Atatürk}} on Special:ExpandTemplates displays them properly
[21:28:57] mertyildiran: we don't understand either
[21:29:04] lol
[21:29:14] huh.
[21:29:44] What do you mean with this "using {{:Mustafa Kemal Atatürk}} on Special:ExpandTemplates displays them properly"?
[21:29:48] Could you provide a link?
[21:30:03] hmmm, the output is definitely cached
[21:30:19] change anything in the wikitext and preview, and it works
[21:31:28] this is the preview caching introduced in 1.27 or 1.28, I don't remember
[21:31:41] @Vulpix yeah it worked in preview wtf :D
[21:32:04] Restarting apache clears the cache, I will do it after the import, thanks!
[21:32:08] setting https://www.mediawiki.org/wiki/Manual:$wgCacheEpoch may solve this
[21:33:06] $wgCacheEpoch = max( $wgCacheEpoch, gmdate( 'YmdHis', @filemtime( __FILE__ ) ) );
[21:33:22] This will clear the cache, did I understand correctly?
[21:33:32] That's the default
[21:34:12] It's supposed to clear the cache on any edit to LocalSettings.php, but I don't think it is
[21:34:21] so what should I assign to $wgCacheEpoch?
[21:34:22] it should mark cached pages as obsolete, so the cache should be discarded, but it doesn't seem to work for you
[21:35:03] What type of caching are you using (e.g. what is $wgMainCacheType)?
[21:35:36] ## Shared memory settings $wgMainCacheType = CACHE_ACCEL; $wgMemCachedServers = [];
[21:35:53] I chose the PHP blabla caching, I forgot :D
[21:36:10] that's probably the best choice
[21:37:13] I'm using VestaCP, there could be some different caching mechanism on my server that I'm not aware of
[21:37:32] Standard VestaCP installation, Ubuntu 16.04
[21:39:43] Hmm. You have a lot of pending jobs
[21:40:05] can you run the maintenance script runJobs.php?
[21:40:06] What do you mean?
[21:40:48] it's how mediawiki handles asynchronous actions
[21:41:04] https://lugat.org/api.php?meta=siteinfo&siprop=statistics&action=query
[21:41:13] I'm currently running "nohup php importDump.php < trwiki-20170501-pages-articles-multistream.xml &" in the background
[21:41:33] Is there a risk of conflict between these two scripts, runJobs.php and importDump.php?
[21:42:09] 56849, wow, even Steve Jobs has less jobs
[21:42:22] It should be ok to run them at the same time
[21:43:59] I don't actually think it will fix your issue, but it might help things
[21:44:09] should I run it with "nohup php runJobs.php &" or just "php runJobs.php", because it seems it's a long execution
[21:44:26] could it speed up the importing process?
[21:46:09] No, it won't speed up the import process
[21:46:48] you can start or stop runJobs.php at any time with no ill effects. So I wouldn't worry about nohup
[21:48:21] @bawolff OK I will try, thanks a lot! You guys are the best!
[21:56:16] mertyildiran: I note, if I go to Special:Random, most pages on your wiki seem to have citations working, just not that one
[22:11:54] installed apache and mediawiki from repos on a fedora box, when I go to http://boxname/wiki/mw-config.php I get php code on the screen. What's most likely wrong?
[22:12:33] new2ip: you probably have to install a package like apache-php5
[22:12:47] (I have no idea what it might be called on fedora, but it should be along these lines)
[22:17:40] thanks @bawolff I followed you on GitHub
[22:17:48] @Vulpix are you on GitHub?
[22:17:56] ok, looking for something along those lines
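
On the pending-jobs backlog near the end: besides running the runJobs.php maintenance script as suggested, the number of queued jobs executed during ordinary page views is controlled by $wgJobRunRate. A minimal sketch; the value is an arbitrary example.

// LocalSettings.php sketch: jobs popped from the queue per web request (default 1).
// Raise it temporarily to help drain a backlog, or set it to 0 if jobs are
// only ever run from the shell with "php maintenance/runJobs.php".
$wgJobRunRate = 5;
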