[03:01:07] hello. I was trying to join earlier but kept getting booted off
[03:02:29] I had two questions about the MediaWiki software: I'm using 1.23.6 and for whatever reason the <ref> tags aren't working
[03:04:23] And also page links will still be red when I update them elsewhere. I have to go in and edit the source and resave the page for existing hyperlinks to show up.
[03:04:47] Does anyone know how I can solve these problems? Thx all :)
[03:15:25] Anon2: <ref> tags are provided by the Cite extension
[03:15:27] !e Cite
[03:15:27] https://www.mediawiki.org/wiki/Extension:Cite
[03:15:49] Anon2: if the other pages are not getting re-rendered, that means your job queue is not working
[03:15:51] !jobqueue
[03:15:51] The Job Queue is a way for mediawiki to run large update jobs in the background. See http://www.mediawiki.org/wiki/Manual:Job_queue
[03:16:20] Matma can you help me, im stuck
[03:16:24] Anon2: also, MediaWiki 1.23 is no longer supported. i recommend upgrading :)
[03:16:26] !upgrading
[03:16:26] http://www.mediawiki.org/wiki/Manual:Upgrading
[03:16:29] !stable
[03:16:29] There is no such key, you probably want to try: !download,
[03:16:32] !download
[03:16:32] The latest stable release of MediaWiki can be downloaded from . Files are supplied in a .tar.gz archive. MediaWiki can also be obtained direct from our Git repository .
[03:16:56] I need to install the cite extension? I thought that wasn't necessary after 1.21
[03:17:32] reg_: perhaps. depends on what your problem is.
[03:18:10] I may be missing an extension. I've copied all templates
[03:18:10] Anon2: it's included with the main download, but the extension has to be enabled when you're installing the wiki
[03:18:47] trying to copy the brackets from http://wiki.teamliquid.net/warcraft/ASUS_Open/Spring/2006, this is what it's looking like http://23.94.38.188/index.php/Test
[03:18:55] I installed via Mojo Marketplace with Bluehost cuz I'm a noob. They only went up to 1.23.6
[03:19:13] boxes and columns are all off
[03:20:53] i'm bad at this, i exported and imported all templates but I still don't have what I want
[03:21:22] so I'll take any advice. may be an extension I need?
[03:21:43] possibly, but i don't immediately see which one
[03:22:19] I did set it up to use liquipedia's BracketContest from the git. And that may need the latest version of mediawiki. im using latest stable but not latest as in 1.28
[03:22:29] i'm still 1.27 should i upgrade?
[03:22:56] I have a feeling anything from the liquipedia git is going to need whatever version liquipedia uses
[03:29:06] I just downloaded the Cite tar.gz file and extracted it
[03:29:22] So that regular TAR file should be added to extensions?
[03:30:32] Or should I include the folder? The TAR file is the only thing in the folder
[03:30:51] put in extensions folder, will need to reference it in LocalSettings.php
[03:30:55] is there a readme?
[03:31:17] MatmaRex: how would I get the version of a wiki? and what is the easiest way to perform an upgrade?
[03:31:30] Nope, just a folder and the extracted TAR file
[03:32:05] the extensions go in the extensions folder, have a look at some of the existing ones for an idea of folder structure
[03:32:34] you will still need to check the readme, some extensions load with wfLoadExtension, some with require_once. and there may be additional settings
[03:32:57] The readme in extensions?
[03:35:44] no, the readme for the ext you downloaded
[03:37:39] I downloaded the one for 1.23 and extracted it and it just has a folder with a regular TAR file in it rather than the TAR.gz I downloaded
[03:47:04] extract the contents of that file the same way as you extracted the contents of the downloaded file
[03:47:22] !findversion
[03:47:39] I downloaded, upgraded. how can I check it upgraded correctly?
[03:51:11] Ok I've extracted, do I just replace now?
[03:51:28] I have a cite folder already but do I just reupload?
[03:53:51] if it's already there, why not use that one
[03:54:56] Well that's the thing. The extension already seems to be there.
[03:55:06] But the ref tags aren't working
[03:55:18] I have basically everything I extracted already in cite
[03:58:01] hello
[03:58:05] how are you ?
[03:58:12] anybody home?
[03:58:19] Yep
[03:58:25] awesome
[03:58:35] Do u need help
[03:59:00] I have a question, I have mediawiki running on a budget webserver that only allows 10 MB of SQL storage
[03:59:36] I'm a noob but other people will know how to answer your question
[03:59:46] I want to minimise the db size in general
[04:00:05] but my "easy" question is this
[04:00:36] rename old folder, then put new one
[04:01:00] @anon2
[04:01:13] i would've thought 10 MB was a lot
[04:01:58] sorry I'm back,
[04:02:29] ok my easy question is this: is there a way to force mediawiki to only save a few revisions or none at all?
[04:03:32] my group editing the page are all trusted and I only really need to save the most current version of every page
[04:04:09] any ideas?
[04:04:17] my anon2 friend?
[04:05:05] No idea
[04:05:15] But I have a question of my own :P
[04:05:37] For <ref> tags, they aren't working for me
[04:06:23] I tried adding wfLoadExtension( 'Cite' ); in LocalSettings.php but that crashes the site
[04:07:13] !debug | Anon2
[04:07:13] Anon2: For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[04:07:32] o.O
[04:12:14] well as long as you don't allow 'anyone' to update the wiki
[04:12:19] imagine you couldn't roll back changes
[04:13:14] Anon2: Well that's the thing. The extension already seems to be there. But the ref tags aren't working
[04:13:26] so the new one crashes your site, is the folder of the new one named Cite?
[04:14:10] I got it now
[04:14:33] I needed the require_once thing
[04:24:57] So my final question is: when I create a new page from the current page, if I refresh the current page the link is still red
[04:25:19] It shows I have visited it (a lighter shade of red) but it doesn't recognize that the page exists now
[04:29:22] reg_ yeah I'm not worried about rolling back changes
[04:29:58] all changes we do are trusted
[04:31:04] I feel like there must be a way to, let's say, throw out all revisions that are more than 1 day old, how about that
[04:31:45] so that if you accidentally delete a page, you can go back and fix it that day, but then poof, it's gone
[04:34:23] also, most of my DB space is taken up with the l10n_cache table
[04:34:35] which I can delete but it seems to just grow back
[04:34:44] I mean trunkate
[04:34:53] truncate
[04:39:02] Does anyone know about the links not working thing?
[04:39:11] Notably they do work if I resave the page
[04:40:58] what do you mean, links not working?
[04:41:34] you click the tag thing and it won't go to the linked page?
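To spell out the LocalSettings.php side of the Cite fix above: wfLoadExtension() was only introduced in MediaWiki 1.25, which is most likely why it crashed the 1.23 site, whereas the 1.23-era Cite bundle still ships the older Cite.php entry point for require_once. A minimal sketch; the exact Cite.php path is the conventional one for that era, so check the extension's README:

    # Sketch: enabling the bundled Cite extension in LocalSettings.php (use one line, not both).
    # MediaWiki 1.25 and later, where the extension ships extension.json:
    wfLoadExtension( 'Cite' );
    # Older releases such as 1.23, where the extension ships Cite.php:
    require_once "$IP/extensions/Cite/Cite.php";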
[04:42:01] oh never mind, I see your question now
[04:42:14] Well basically what happens is: if I'm on page A, have linked to page B which doesn't yet exist, and then create page B, page A doesn't show that it exists at all unless I resave it
[04:42:34] Refreshing it doesn't help. I have to reopen the editor and then resave
[04:42:42] oh, I see
[04:42:44] It's manageable but a pain in the arse
[04:42:56] I'll try it on my site, let me see
[04:45:01] ok
[04:45:06] I tried it on my site
[04:45:19] I don't seem to have the problem
[04:45:29] is it just that you have the page cached?
[04:45:39] How would I have done that?
[04:45:51] I mean the browser cached it
[04:46:08] press the "refresh" button on your browser
[04:46:13] or the F5 key
[04:46:26] ok lemme try
[04:46:32] and see if it does it without actually editing the page again
[04:46:59] Nope it doesn't work
[04:47:03] Refreshes don't work
[04:47:04] damn
[04:47:32] I wonder if you have caching turned on in mediawiki
[04:47:37] if it's something to do with that
[04:48:06] How does that work?
[04:48:25] I installed it using MojoMarketplace thru Bluehost
[04:48:31] Not a manual installation.
[04:48:51] I put the following in my LocalSettings.php to turn caching off: $wgEnableParserCache = false;
[04:48:51] $wgCachePages = false;
[04:49:28] $wgMainCacheType = CACHE_NONE;
[04:49:28] $wgMemCachedServers = array();
[04:49:28] Aight lemme try
[04:49:39] just check the LocalSettings.php file
[04:49:42] It might already be that but let's see
[04:50:00] if there is that setting then just change the value to the above
[04:50:12] but if it's not there, add the line
[04:50:18] lines I should say
[04:50:36] turning off caching seemed to speed up my site and fix many problems
[04:51:54] Ok I'll try
[04:53:12] That's all I have right now
[04:53:22] I see
[04:53:48] even the $wgEnableParserCache = false; and the $wgCachePages = false;
[04:53:59] and they aren't commented out with # signs?
[04:54:13] Oh wait hold on, I missed something
[04:55:09] Ok I'll put the $wgEnableParserCache in now
[04:55:16] And the cache pages one
[04:56:27] cool
[04:56:37] maybe it will help, I hope
[04:56:42] I'm a noob too
[04:57:51] Yes!
[04:57:55] Thanks so much
[04:57:58] It works
[04:58:01] awesome
[04:58:04] glad to help
[04:58:22] That's all my questions. Are you jeanmaster on MediaWiki?
[04:58:33] naw, I might sign up on there
[04:59:00] Well thanks. What's your wiki btw?
[04:59:00] I'm a total noob, but I've been fighting mediawiki for about 4 hours today
[04:59:29] if I sign up on the mediawiki site I'll be xor1337
[04:59:37] that's my usual handle
[04:59:56] Neat. I'll check in :)
[04:59:58] Cheers
[05:00:07] cool
[05:00:14] good luck
[06:01:41] I am new to MediaWiki and need some advice. I need to import around 400 webpages, which are mainly images and text. Any recommendations on the best way to import this external content? I've been looking at https://www.mediawiki.org/wiki/Manual:Importing_external_content but there is just quite a bit going on. Any help would be appreciated! Thank you :)
[06:02:42] scorpion: Does each page have history?
[06:02:47] And/or do you want to keep it?
[06:03:05] If you can do some basic scripting, using MediaWiki's `api.php` is probably safest and best supported.
[06:03:13] Specifically action=edit.
[06:05:41] These pages have no history. I'm simply copying the content. I would have no issue copying manually. Unfortunately, at 5 to 10 images per page x 400 pages, it gets tedious uploading each picture individually.
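For reference, here are the caching lines quoted earlier in the log, collected into a single LocalSettings.php sketch; the variable names come from the conversation itself, while the note about newer releases is an added aside. Disabling caching like this trades performance for freshness, so it is more of a workaround than a fix; a working job queue (mentioned near the top of the log) is the more usual way to keep red links up to date.

    # Sketch: turning MediaWiki caching off, as suggested above.
    $wgMainCacheType = CACHE_NONE;   # no object cache
    $wgMemCachedServers = array();   # no memcached servers configured
    $wgEnableParserCache = false;    # dropped in newer releases; there, use $wgParserCacheType = CACHE_NONE; instead
    $wgCachePages = false;           # don't let clients cache rendered pages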
[06:06:46] Yeah, images are annoying.
[06:07:17] There's a hack that allows you to embed raw <img> tags.
[06:07:27] You could maybe use that if the images are already on a Web server.
[06:07:35] That would save you uploading them to MediaWiki.
[06:07:58] But you'd also lose out on the benefits of MediaWiki, including categorization, file metadata visibility, revision history, etc.
[06:08:56] The unfortunate downside of convenience
[06:09:49] I'll remember that for future reference. That seems like it could come in handy from time to time.
[06:11:04] !wg EnableImageWhitelist
[06:11:04] https://www.mediawiki.org/wiki/Manual:%24wgEnableImageWhitelist
[06:11:11] !wg AllowImageTag
[06:11:11] https://www.mediawiki.org/wiki/Manual:%24wgAllowImageTag
[06:11:33] !wg AllowExternalImages
[06:11:33] https://www.mediawiki.org/wiki/Manual:%24wgAllowExternalImages
[06:27:24] Awesome! Thank you Yvette! :) I will bookmark these links.
[07:20:50] andre__: Hi, we have a GCI task in READY here: https://codein.withgoogle.com/dashboard/tasks/4880751277375488/. Could you please publish it?
[07:57:30] Hi #mediawiki!
[07:59:26] I have a huge bug with categories on my wiki, can somebody help me?
[10:46:47] Kelson: done. Thanks!
[10:47:10] andre__: thank you
[12:35:53] Hi guys,
[12:36:07] I am using a custom skin based on Vector.
[12:36:16] For every page view I get this debug message:
[12:36:19] ContextSource::getContext (SkinTemplate): called and $context is null. Using RequestContext::getMain() for sanity
[12:36:32] What should I do to fix that?
[12:51:25] anomie: could you have a look at https://phabricator.wikimedia.org/T153747 ?
[13:53:05] A philosophical question: a category page is a page, but not an article page. Why not just put the information from the analogous article page into the category page?
[13:53:58] For example, on Wikipedia there is a category for The Beatles, which consists of the line "The main article for this category is The Beatles." linking to the article of the same name.
[13:54:56] Why not just go ahead and, on the category page, go with "The Beatles were an English rock band, formed in Liverpool..." and so forth?
[13:56:47] the category page often has some metadata about the category specifically, rather than the subject of the related article.
[13:56:50] also, you might want to categorize the category and the article differently, if you care about categories being transitive (e.g. if [[Apple Records]] is inside [[Category:The Beatles]], and [[Category:The Beatles]] is inside [[English rock music groups]], would that imply that Apple Records is a music group?)
[13:57:11] also, category pages are not counter towards the number of articles displayed on the front page. ;)
[13:57:23] not counted towards*
[13:58:04] [[Category:English rock music groups]] *
[14:11:59] Thank you. The nesting categories failing to make sense sold me on keeping them separate. Category pages (the top halves, anyway) just tend to be underwhelming, but I know putting too much content in there is stepping on the article page's turf.
[14:20:51] I see the practical value, but it leads to things like Category:2 Skinnee J's albums, which helpfully informs readers that "This category contains albums by 2 Skinnee J's." How stupid is the target audience that they can't figure that out from the category name? (although I do notice that it's a convenient way to link to the "2 Skinnee J's" article, which otherwise wouldn't be linked.)
[15:11:02] Is formatversion=2 of the JSON API still considered experimental?
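A rough LocalSettings.php sketch of the three settings linked just above; they are alternatives, and each one gives up the upload-based benefits (categorization, file metadata, revision history) mentioned earlier in the exchange. The MediaWiki:External image whitelist page name is a detail taken from the manual pages linked, not from the conversation itself:

    # Sketch: letting wikitext display images hosted on an external web server (pick one approach).
    $wgAllowExternalImages = true;   # bare image URLs in wikitext render as images
    # or, to restrict it to URLs matching the regexes on MediaWiki:External image whitelist:
    $wgEnableImageWhitelist = true;
    # or, the raw <img> tag hack mentioned above:
    $wgAllowImageTag = true;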
[15:23:15] When something is not recommended for "large wikis", what's the general rule of thumb of how big that is?
[15:24:56] boblamont: probably somewhere between 10,000 and 100,000 pages, depending on how beefy your server is
[15:25:02] well, if your server is starting to strain under the load, maybe turn it off and see if it helps
[15:25:14] (the option, that is, not the server ;-)
[15:25:16] cgt: not really. it's mostly stable.
[15:25:44] Cool, thanks. API sandbox on dawiki still says "Experimental modern format. Details may change!"
[15:35:55] ok, thanks.
[15:37:35] cgt: details may change, but mostly in case of bugs where the fix breaks backwards-compatibility. e.g. https://gerrit.wikimedia.org/r/#/c/328190/ from a few days ago
[15:38:35] cgt: but it's already used by a bunch of production mediawiki code. in general, if some API call produces results and they don't look obviously wrong, you should be able to rely on it
[16:03:19] MatmaRex: Sounds fine for my purposes. Thanks again.
[16:10:41] Hi there. Running MW 1.18.0, PHP 7, with a postgres 9.6.1 backend, hosting a French translation of a book. I’m searching for an extension allowing us to export wiki content as a PDF… but it seems everything I found either doesn’t like non-English characters, or doesn’t like recent MW and/or PHP versions. Any advice?
[16:10:55] MW 1.28.0, that is.
[16:12:50] low: Wikimedia wikis use https://www.mediawiki.org/wiki/Extension:Collection and i think it mostly works
[16:13:23] MatmaRex: sweet, will look at it, thanks a lot!
[16:13:43] low: depending on what exactly you want to do, you could also try printing the pages to PDF from your browser. MediaWiki has good print stylesheets built in, and modern operating systems can print to PDF without extra tools.
[16:15:48] MatmaRex: well, we host a « pure » translation of the book, and are planning to host an adapted version for our use, so we need to be able to choose (from a category for example) which pages to render.
[16:16:28] Moreover, there are more than 100 pages, so it would be quite a burden to do it page by page ;)
[16:26:31] low: ha, indeed. although, if server-side PDF generation fails you, you could try building the versions for rendering with MediaWiki templates. there are a few limits, e.g. the length of input can't exceed 2 MB, but most books i know should fit below that. ;)
[16:27:39] MatmaRex: hmmmmmmm, it’s a 550-page book.
[16:28:25] (in printed/pdf form). we’re at something like 120 MW pages.
[16:29:21] MatmaRex: the wiki is on a bare-metal server I administer myself, so that should be ok ;)
[16:29:28] worst case, you can split it by chapters or something. but yeah, server-side rendering would be nicer, if you can get it to work
[16:29:30] ha :)
[16:58:32] Hey all, I'm seeing a performance issue with some old revisions of pages where there were large changes: PHP-FPM processes will hang apparently indefinitely waiting for the pages to complete.
[16:59:41] For example, right now there's a page /index.php?title=Chef&oldid=363770 that has several such hung processes trying to finish; sometimes those URLs will also have &diff=prev or &diff=next on them.
[17:00:34] In php.ini, max_execution_time is set to 60s, but many hung processes have been there for 30 minutes or more
[17:02:24] I could try setting request_terminate_timeout to something like 2-5 minutes, but that doesn't solve the underlying problem. I'd like to figure out how to improve the time it takes for the web server to generate the correct results.
[17:02:41] FWIW, this is MW 1.27.1 on Ubuntu 16.04, Apache 2.4, and PHP 7.
[17:06:27] Oh, I should add that the pages that are hanging are revisions that had huge changes. For example, the page that is currently getting a lot of hung processes was about 64k of text on the revision prior to the one getting stuck, the stuck one having had about 61k added to it.