[08:03:29] Anyone here involved in en wikipedia server administration?
[08:04:21] Or is there another channel I should try?
[08:04:23] !ops ^
[08:05:01] GoldenRing: give me 2 seconds
[08:05:56] Yes, no worries.
[08:08:51] Just wondering why API requests on wikipedia have gone from working to returning HTTP 503, if anyone knows.
[08:09:00] In the last few minutes.
[08:12:06] GoldenRing: it's a work in progress for ops right now; some servers are being restarted, which is probably why you're seeing 503s
[08:12:27] Ah, no worries. I was wondering if I'd done something to offend...
[08:12:32] :)
[09:15:52] Hi folks. I am working on getting my wiki up to date. It's currently on 1.22.
[09:16:07] Can I upgrade straight to 1.24, or do I need to upgrade to 1.23 first?
[09:24:50] captbunzo, yes you can
[09:30:21] thanks MaxSem
[09:31:40] another question. I've just migrated my wiki from one host to another
[09:33:01] and now pages are not rendering
[09:33:09] http://wiki.nyvaria.net/Main_Page
[09:33:33] well, specifically, the main content part of the pages is not rendering
[10:36:42] This is so painful. https://meta.wikimedia.org/w/index.php?title=User_talk%3ARobiH&diff=11735064&oldid=11725084
[10:59:36] i use 1.24.1 and have file caching activated for non-logged-in users. it works fine for regular pages, but if i add a category to a page, the category page is not updated; purging makes no difference
[10:59:38] any ideas?
[12:31:01] Hello... in a custom wiki I've created a form in which users can submit the title of a page of that wiki. I want to retrieve the text of that specific page. Can someone point me in the right direction as to how to get this text from the 'text' table in MySQL based on the page title? I already know how to do it using the API, but I don't know whether it's a good idea to make an HTTP request against your own wiki. Thanks :]
[13:17:07] Hi, I'm back with the same question.
[13:17:17] is there a way to force mediawiki NOT to add width and height attributes to images?
[13:17:25] other than manually writing your own tag?
[13:29:25] hi
[13:30:10] I'm hoping to get some help here; it's regarding my recent account rename on Commons
[13:31:31] I need to re-categorize a couple of hundred pictures and I'm not sure how to do that
[13:36:00] Hello
[13:37:50] Beata: you might find more Commons-related help in #wikimedia-commons
[13:38:22] Beata: but if you want to rename a category, see https://commons.wikimedia.org/wiki/Commons:Rename_a_category
[13:38:55] ok, category renamed, but I need to change the category in all pics, and the author too
[13:39:06] I'll go through the help files then
[13:41:09] hmm, then you probably need someone with AWB or a bot to apply search-and-replace on all these pages
[13:42:00] I am currently looking at how to get the text of a wiki page using the page title. I know the text data is stored in the 'old_text' column of the 'text' MySQL table, but how do I get there using just the page title?
[13:42:58] I've also looked at the API, and it works on my local machine, but since I have direct access to the tables of the custom wiki, it seems strange to me to use HTTP requests and curl
[13:43:25] sitic, thank you!
[13:45:16] is this actually the right place for this question, or should I go to #wikimedia-dev?
[13:52:46] blabliblo: see https://www.mediawiki.org/wiki/Manual:Page_table
[13:52:52] "To retrieve the text of an article, MediaWiki first searches for page_title in the page table. Then, page_latest is used to search the revision table for rev_id, and rev_text_id is obtained in the process. The value obtained for rev_text_id is used to search for old_id in the text table to retrieve the text."
[14:00:02] Thanks a lot. I'll see if I can get it to work :]
[14:09:11] A few of the page_latest numbers do not correspond to any numbers in the revision table, so does that mean that there is still another link in between?
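[Editor's note: the page → revision → text lookup chain quoted at 13:52:52 can be sketched as a single join. This is a minimal illustration against a toy sqlite3 schema, not MediaWiki's real one: the real tables have many more columns, and `old_text` may be gzipped or point to external storage, which is one reason the API route is often safer than raw SQL. There is no extra table between `page` and `revision`; if `page_latest` values don't match any `rev_id` (as reported at 14:09:11), the database copy is most likely inconsistent, e.g. from a partial migration.]

```python
# Sketch of MediaWiki's page -> revision -> text lookup chain,
# using an in-memory sqlite3 stand-in with made-up data.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE page     (page_id INTEGER, page_namespace INTEGER,
                       page_title TEXT, page_latest INTEGER);
CREATE TABLE revision (rev_id INTEGER, rev_page INTEGER, rev_text_id INTEGER);
CREATE TABLE text     (old_id INTEGER, old_text TEXT);

INSERT INTO page     VALUES (1, 0, 'Main_Page', 10);
INSERT INTO revision VALUES (10, 1, 100);
INSERT INTO text     VALUES (100, 'Hello, wiki!');
""")

def page_text(title, namespace=0):
    """Follow page_latest -> rev_id, then rev_text_id -> old_id."""
    row = db.execute(
        """SELECT t.old_text
             FROM page p
             JOIN revision r ON r.rev_id = p.page_latest
             JOIN text t     ON t.old_id = r.rev_text_id
            WHERE p.page_title = ? AND p.page_namespace = ?""",
        (title, namespace)).fetchone()
    return row[0] if row else None

print(page_text("Main_Page"))  # -> Hello, wiki!
```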
[14:19:09] is there an extension (or extensions) for finding how pages are linked to each other (maybe in a tree structure), so one knows how many pages there are in the wiki and how they are related?
[14:19:37] I also asked in https://lists.wikimedia.org/pipermail/mediawiki-l/2015-April/044215.html but haven't got any answers yet.
[14:20:50] if anybody knows of any other resource where I could ask the same, please let me know.
[14:43:41] hi, all. I have MassMessage rights on Meta. How can I target pages on mediawiki.org?
[15:07:41] Elitre: meta.wikimedia.org/wiki/template:target
[15:08:41] Nemo_bis: I know the template: it doesn't like mediawiki.org, for some reason.
[15:09:21] Elitre: Nemo_bis: do either of you know the answer to these questions?
[15:09:23] I also asked in https://lists.wikimedia.org/pipermail/mediawiki-l/2015-April/044215.html but haven't got any answers yet.
[15:09:23] if anybody knows of any other resource where I could ask the same, please let me know.
[15:09:38] is there an extension (or extensions) for finding how pages are linked to each other (maybe in a tree structure), so one knows how many pages there are in the wiki and how they are related?
[15:14:34] shirish: Google finds something with: mediawiki how pages are linked to each other
[15:15:18] tale: which extension are you talking about?
[15:15:51] tale: does it have some sort of visualization or graphics in it?
[15:15:58] shirish: The one that above google search sgiws as first gut,
[15:16:05] Sorry, first hit
[15:16:58] tale: sgiws, mediawiki extension?
[15:17:41] shirish: Sorry, typos. I will write my answer again, this time slowly.
[15:17:56] shirish: The one that the above google search shows as first hit.
[15:18:15] tale: sure, because I was unable to find any extension called sgiws.
[15:18:36] tale: can you share the link, because I don't know what you are talking about.
[15:19:40] shirish: Do you really say you do not know what google is?
[15:20:08] tale: I know what google is, but I don't know what keywords you used and what results came to you.
[15:20:27] tale: also, perhaps you do not know/realize that google does not give the same answers to everybody.
[15:20:52] shirish: http://lmgtfy.com/?q=mediawiki+how+pages+are+%20+%20+%20+%20+%20+%20+%20+%20linked+to+each+other
[15:26:45] tale: thank you.
[15:28:02] shirish: You are welcome.
[15:37:34] Hi, I'm going mad with port redirections. The problem: I cannot use port 80 for mediawiki (currently I'm using port 56080). If I forward internet:56080 -> server:56080, everything works fine; but if I redirect internet:80 -> server:56080 (to have a "normally accessible" server), it does *not* work (net timeout). Is there any reason for this behavior?
[15:38:12] mcon1: Some other server listening on port 80?
[15:38:20] in a template i want to declare some variable to be used by javascript, e.g.: var molecule = smiles("CC=CC(O)C");
[15:38:27] how can i do it?
[15:38:56] tale: Yes, that's why I had to move to 56080.
[15:50:33] hi, is there a way for a page to show more random numbers without manually seeding the generator?
[18:53:53] hi!
[18:54:27] I have a MediaWiki 1.24.0 installation with Vector as the skin.
[18:55:29] I installed the MobileFrontend extension, and everything runs fine from mobile; it's just that, if accessed from an iPad with the "Desktop view" mode enabled, the wiki shows the English Wikipedia logo instead of mine.
[18:55:49] That's really odd. I mean, MediaWiki doesn't even come with that logo by default.
[18:56:52] I've got a mediawiki in /var/www/html/ -- I've just tried moving /var/www/html/* into /var/www/html/wiki/ so that my web-root folder is free for a webpage that can then link to the wiki. I was expecting http://46.101.38.186/wiki/ to work now, but no content loads.
[18:56:56] Permissions of the `wiki` folder match those of the `html` one; any ideas what is wrong?
[18:58:26] pi-, make sure you update LocalSettings.php accordingly
[19:00:24] https://about.gitlab.com/2015/03/03/gitlab-acquires-gitorious/
[19:00:32] $wgServer = "http://46.101.38.186/wiki"; // I assume
[19:01:21] Still no luck
[19:01:54] Of course it is now /var/www/html/wiki/LocalSettings.php -- I don't know if that constitutes a problem?
[19:03:22] hmm
[19:03:28] pi-, I haven't done that before
[19:03:28] Could it possibly be some Apache setting?
[19:03:37] but keep in mind that there are all those $IP/ paths in the config file
[19:03:47] I guess you gotta add the given directory
[19:03:51] in the middle of that path
[19:15:15] The javascript page of oversight?
[19:15:59] Maybe relocating the wiki root is a particularly unusual thing to do..
[19:16:16] Seems reasonable enough tho.
[19:17:26] "$IP/cache" "$IP/skins/bla" "$IP/extensions/bla" -- that's all I've got, and none of it looks critical for loading the page
[19:17:49] well
[19:17:50] Seems reasonable for using oversight tools without having oversight rights. How do I copy oversight code to a .js page so that it can be used properly?
[19:17:53] if the skins aren't there
[19:17:56] the wiki crashes
[19:18:06] at least, that's what happened when I got the skin path wrong
[19:19:02] Not set yet
[19:19:26] pi-, if you haven't, try checking out this page http://www.mediawiki.org/wiki/Manual:Moving_a_wiki
[19:19:29] might be of help
[19:19:33] gtg, sorry
[19:20:33] I'm asking because I need oversight code copied to a .js page, to use oversight tools properly instead of having the oversight right.
[19:38:07] can anyone tell me what kinds of spam you all face while using mediawiki?
[19:45:37] This is turning into a disaster. I have no idea why moving my wiki to ./wiki/ just breaks everything.
[19:46:34] http://www.gossamer-threads.com/lists/wiki/mediawiki/212241 <-- looks like this guy did the same thing and `$wgScriptPath = "/wiki"; ` fixed it for him -- no such luck for me
[19:46:42] That was several years back..
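[Editor's note: for a wiki moved into a subdirectory like /var/www/html/wiki/, the usual LocalSettings.php shape is roughly the following -- a sketch assuming a stock Apache setup, with the IP and paths taken from the log. Note that $wgServer keeps the bare host; only $wgScriptPath gains the subdirectory, and $IP-based filesystem paths should normally be left alone because $IP is derived from where MediaWiki itself lives.]

```php
<?php
// LocalSettings.php sketch for a wiki served from http://<host>/wiki/.
$wgServer     = "http://46.101.38.186";   // bare host -- no "/wiki" here
$wgScriptPath = "/wiki";                  // URL path to the wiki's entry points
// $IP already points at the wiki's install directory (where
// LocalSettings.php lives), so "$IP/skins", "$IP/extensions", etc.
// do not need to change when the install moves.
```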
[19:47:12] Log gives "The character encoding of the HTML document was not declared. The document will render with garbled text in some browser configurations if the document contains characters from outside the US-ASCII range. The character encoding of the page must be declared in the document or in the transfer protocol."
[20:01:23] I seem to have fixed it. It required `$wgScriptPath = "/wiki";` in LocalSettings.php and `/etc/init.d/apache2 restart`
[20:04:09] $wgServer = "http://46.101.38.186/wiki"; turns out to be wrong -- I need to leave that as it was
[20:05:01] Changing `$IP/` to `$IP/wiki/` doesn't seem to break anything, so I'm not sure whether I should do that
[20:05:14] Wouldn't it be tidier to create a new $wikiroot variable?
[20:06:25] Shouldn't mediawiki have some such variable? Because surely many people won't have their wiki in the webroot..
[20:08:08] Aha, changing `$IP/skins` to `$IP/wiki/skins` DOES break it!
[20:08:29] $IP/cache -- changing it doesn't seem to have any effect
[20:08:47] So I'm going to assume that change is unwelcome everywhere
[20:36:11] How can I remove the jslint error in https://gerrit.wikimedia.org/r/#/c/201595/ ?
[21:02:10] phoenix303: What is "this" in that context?
[21:02:16] phoenix303: The rest of that function doesn't appear to use this
[21:03:59] 'this' is a problem here
[21:04:20] Thanks Nemo_bis
[21:06:28] * thanks RoanKattouw
[21:10:11] OMG, why does PageTriage have a phabricator component not named after the repo (it's called PageCuration instead)? That's going to be terribly messy
[21:14:12] Oh sorry, thanks roankattouw
[21:17:38] Howdy all. I'm trying to figure out how to pull the titles of all pages with a certain prefix out of the database for MediaWiki 1.23.8 (building a script to ensure prerendering of all Collections). I can't find the prefixes in the db, and am open to better approaches, if this one is crazy.
[21:22:02] Bah. Murphy's law - no sooner posted than solved.
[21:45:54] hello, httpd-2.4.10 mediawiki-1.24.1... I am trying the PlantUML extension... When I just edit a page by adding normal text, the edit gets saved. When I put a , the edit doesn't get saved, and I get a blank page. Can I turn on some debug somewhere to see what's going on?
[21:48:11] i was messing around with it
[21:56:11] it renders it as text, rather than a diagram
[23:50:18] Hi, how do I get access to Tool Labs? I just submit a Tools Access Request? or do I follow that up?
[23:50:26] *do I
[23:52:59] MathJax 2.5.1 is out, and claims significant performance improvements compared to pre-2.5 versions of MathJax. Wikipedia still uses MathJax 2.3, and I'm wondering when that's likely to be updated.
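[Editor's note: the prefix question at 21:17:38 was solved off-screen, but the usual approach is a LIKE match on page_title, with % and _ escaped so the prefix matches literally (MediaWiki stores titles with underscores in place of spaces). A minimal sketch, using an in-memory sqlite3 stand-in for the `page` table with made-up data; only the column names are real.]

```python
# Sketch: list page titles starting with a given prefix, as one might
# when pre-rendering all Collection pages.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page (page_namespace INTEGER, page_title TEXT)")
db.executemany("INSERT INTO page VALUES (?, ?)", [
    (0, "Collections/Handbook"),
    (0, "Collections/FAQ"),
    (0, "Main_Page"),
])

def titles_with_prefix(prefix, namespace=0):
    # Escape LIKE wildcards so the prefix is matched literally.
    escaped = (prefix.replace("\\", "\\\\")
                     .replace("%", "\\%")
                     .replace("_", "\\_"))
    rows = db.execute(
        "SELECT page_title FROM page "
        "WHERE page_namespace = ? AND page_title LIKE ? ESCAPE '\\' "
        "ORDER BY page_title",
        (namespace, escaped + "%")).fetchall()
    return [r[0] for r in rows]

print(titles_with_prefix("Collections/"))
# -> ['Collections/FAQ', 'Collections/Handbook']
```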