[00:01:47] marktraceur: I can't find a way to check it without accessing mediawiki, which is broken [00:01:59] Totally broken? [00:02:07] dnj: You can look at the RELEASE_NOTES file [00:02:29] marktraceur: great point! [00:02:31] it's 1.21 [00:03:34] Sweet [00:05:06] !centralauth [00:07:41] marktraceur: anything else can I find to help with this issue? [00:08:03] dnj: Oh, I was...uh...I guess I would suggest trying an upgrade and seeing what happens [00:08:17] :o [00:08:38] Unless this is a heavily inflexible wiki [00:08:46] I mean, staying up to date is nice anyway :) [00:08:49] would I lose my content? [00:09:02] Naw [00:09:04] !upgrade [00:09:34] For the most part single-version upgrades are pretty minor things, at least so I hear [00:09:41] If not, make sure you make a backup :) [00:09:58] okay [00:09:59] thanks for the help [00:10:18] Hm, maybe I scared him off [00:10:22] If I want to use WebRequest to get the value of an inputbox, how do I sanitize for null input or illegal chars [00:11:30] Withoutaname: Illegal characters for what context? [00:12:39] tags or things normally used by the URL (? ! =) etc [00:13:22] Hm [00:13:23] well anything that might cause the URL to behave weird [00:13:30] Withoutaname: For a URL? [00:13:35] Because there's url encoding for that. [00:14:11] well, I just want to remove any special chars [00:14:16] used by mw [00:14:29] like : should not be allowed [00:14:53] Withoutaname: Is this for a title? [00:14:55] anything in the shift+numberkey row [00:15:01] Like, a page title? [00:15:30] yeah, the page title is part of a special page though [00:15:54] Like Special:Whatever/? [00:16:12] exactly [00:16:16] OK [00:16:35] Probably you can use the existing title blacklisting regexes for that [00:18:38] Withoutaname: Is the subpage value going to be used to point at another page? Like Special:MovePage/Whatever points at the main namespace Whatever? 
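The character-filtering idea under discussion here (replacing anything MediaWiki-significant rather than enumerating every safe character, since spaces and non-Latin letters should survive) could be sketched like this in Python. The exact blocked set is an assumption; MediaWiki's real title rules live in Title.php and $wgLegalTitleChars:

```python
import re

# Characters that are illegal in MediaWiki page titles (# < > [ ] | { }),
# plus ':' and '/', which carry namespace/subpage meaning. Treating this
# as the complete blocked set for a group name is an assumption.
BLOCKED = re.compile(r'[#<>\[\]|{}:/]')

def sanitize_group_name(raw: str) -> str:
    """Replace disallowed characters with '-', keeping spaces and
    non-Latin letters (e.g. Chinese) intact."""
    return BLOCKED.sub('-', raw.strip())
```

For real validation, the Title machinery itself (e.g. Title::newFromText() returning null on invalid input) is the safer check, which is where the conversation below ends up.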
[00:19:23] no the subpage value is going to store input from the inputbox into the database [00:19:31] ....hm [00:19:51] Withoutaname: I think you're maybe doing it wrong...that data should probably go into a POST request field [00:21:02] i think i should show you what i have so far, makes it easier [00:21:17] Probably [00:26:57] marktraceur: http://imgur.com/wUGRuBV [00:27:56] Hrlm [00:29:16] Withoutaname: So, POSTing to a non-existent resource hurts my REST-bone [00:29:58] I would suggest just POSTing to your special page with a groupname=WHATEVER and avoid the issue partly [00:30:56] It means not having to deal with invalid mw title characters at least [00:33:39] marktraceur: how does that work differently from this setup? I have it currently set up so that Special:UserGroups/Foo displays an error if "Foo" doesn't exist, then points back to the "create new usergroup" form [00:33:59] which prompts the user to enter the new name [00:34:03] Uhhh [00:34:36] Withoutaname: I don't understand, why do you need to sanitize for a title...oh! You should just sanitize for the user group name, I think it's even more strict [00:35:09] Just replace things that aren't [a-zA-Z\-] with a - or something [00:36:00] ok, I'll try that [00:36:23] Sorry I tangented [00:36:44] Withoutaname: Oh, wait, you might be able to stick spaces in the group names but not the IDs? [00:37:19] sure, spaces or say Chinese chars [00:37:42] but not anything that might cause mw to flip out [00:37:50] Something tells me it's not just "all valid title characters" though [00:37:58] I think '/' might be off limits e.g. [00:38:36] * Withoutaname is looking at other special pages like special:userrights [00:47:01] well I guess we could use Title.php to sanitize [00:47:13] Probably a good idea [02:21:10] Hi folks!
Would anybody happen to know which extension is being used by Wikia on this page: [02:21:13] http://community.wikia.com/wiki/Blog:Wikia_Staff_Blog [02:21:31] Is it the BlogPage extension, or...? [02:21:58] http://community.wikia.com/wiki/Help:Blog_article [02:22:18] I'd suggest it's BlogArticles [02:22:26] But that's a guess based on Special:Version [02:22:31] [02:22:32] srsly wikia [02:22:43] I know, right :P [02:23:13] CityVisualization is registered twice [02:23:14] BlogArticles seems to be the name of whatever extension they used to insert those pages. [02:23:16] What is that I don't even [02:24:46] I've located "BlogPage" in Wikia's source on Github - But from experience, there's no way they're using that. [02:25:26] They probably don't even know [02:25:34] ._. [02:26:03] Could I ask them? Not sure how they operate. [02:26:19] Stern no? Lawsuit? I don't even know. [02:26:32] Could try ##wikia [02:26:52] They have an IRC? [02:26:54] Cool. [02:27:57] good luck finding staff there [02:28:17] I'm just shocked there's no good Blogging solution for Mediawiki. [02:28:29] well... blogs are kinda opposite of wikis [02:28:51] in a blog, a single author writes and owns a page, and others can comment on but not modify the content [02:28:58] wikis, however, allow everyone to collaborate on the page [02:29:31] anyway, ashley may know more about how wikia does the blogging things [02:29:33] True, but they're needed in some cases - Wiki family, for example, may need a decent hub. [02:29:52] (assuming he's still awake) [02:30:09] I'd be willing to pay ashley to up the BlogPage extension (Don't take my word on that) [02:30:28] lol [02:32:37] But really, a decent blogging solution, just for the sake of order is needed. My network needs something along the lines of that to distribute news, it's a large wiki family that needs to be more than just a pile of wikis. [02:33:55] Not only does BlogPage force you to use the "VoteNY" extension for each post, but it's so... Not a blog at all.
Wikilog is the same, sadly. [02:34:31] But if ashley does take donations, I'd be willing to give a bit just for the cause. [02:34:43] yeah, most of the wikia extensions are all dependent on each other as well as crappy core hacks sometimes [02:35:14] (at least, most of the ones that add the frontend "social" features) [02:35:45] Extension:VoteNY is available at least for regular mediawiki installs [02:36:09] I'm sure with some debugging you could solve that, it looks like they're using SocialProfile's "Comments" extension with their Blog-thingy. [02:36:23] yeah, that seems likely [02:36:36] what seems unlikely is that you'll find someone willing to decouple all of it :P [02:41:32] I have a bad feeling ##wikia is dead. [02:44:58] Skizzerz: Well, what about it looks complicated? The only Wikia dependency I see would be avatars, which could very well be swapped for SocialProfile's avatar system or perhaps no system at all. [02:46:22] * Skizzerz hasn't looked at the source code for any of it, I just know that ashley likes complaining about it every now and again when fixing up the code to work on third party wikis [02:47:58] Not the code, just the looks of it. I can't imagine what else could be Wikia specific. [02:49:44] Luma_, #wikia-dev has a few people in it. (Just searched the channel list. They might all be afk or bots.) [02:53:17] hi is there an extension manager for mediawiki? [02:57:45] junix6591: can you be more specific about what you mean? [02:58:38] is there an easy way to add extensions to mediawiki other than downloading the extensions and manually adding them? [02:58:49] is it built into mediawiki? [02:58:54] https://www.mediawiki.org/wiki/Requests_for_comment/Extension_manager [02:59:02] junix6591: some of them come in the tarball. you can just check a box and get them during installation [02:59:04] It's not built in, but is being considered. [02:59:32] ok [02:59:48] i'm trying to decide between xwiki, mediawiki, and dokuwiki...
[03:00:21] junix6591: what's your use case? [03:00:35] internal wiki system... [03:02:00] I'm trying to access the MediaWiki web API over HTTPS. Is this possible? I'm getting a weird error "network not reachable" [03:02:54] sumanah: it's possible [03:02:57] what url specifically? [03:03:25] https://en.wikipedia.org/w/api.php?action=query&prop=extracts&format=json&explaintext=&titles=Knitting&redirects= [03:03:41] HTTP works. HTTPS does not. [03:03:46] https works for me [03:04:02] OK, I will investigate further, maybe the problem is on my side [03:08:58] fhocutt: https://en.wikipedia.org/w/api.php?action=query&prop=extracts&format=json&explaintext=&titles=Knitting&redirects= is what I was trying [03:10:45] OK! I must have had a super intermittent problem. It works for me now, in the browser & using the Requests Python module [03:26:01] fhocutt: I'm gonna head to bed, myself - if you end up learning Things about the best way to work with MW API client libraries & HTTPS please summarize for me/the world later? [03:26:11] sure! [03:36:40] Amgine: :P [06:51:30] hello, how do I allow templates in common.css? [06:51:34] they do not parse [06:56:04] okay, already got it [07:00:04] Hi! [07:01:06] Is there a javascript equivalent of Linker::link() in php for creating links? [07:08:41] Hi [07:08:56] Is there any way to perform OAuth2 authentication with Google using mediawiki? [07:13:16] Hi, I am going to write QUnit tests for https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FTranslate/ca4b908a956df827f4ce8dadac0fca853bda6482/resources%2Fjs%2Fext.translate.special.pagemigration.js [07:13:53] Not understanding how to register the module as per https://www.mediawiki.org/wiki/Manual:Hooks/ResourceLoaderTestModules [07:14:11] so that I can see it under Special:JavaScriptTest/qunit [07:31:35] So ugly [08:25:22] hi, i got a template which contains a navigation based on dynamicpagelist and a navigationlayoutcontainer we created...
this navigation template only shows new entries when we explicitly edit and save it, any idea what we could do? [10:06:18] hey there, I've got several problems with my mediawiki installation: i can't access any special pages, all i get is an "internal server error - 500", the wiki logs seem to be ok, the apache log says "End of script output before headers: index.php"... plugins are deactivated, mediawiki-version 1.22.7, php 5.5.9-1ubuntu4 (cgi-fcgi), sql 5.5.37-0ubuntu0.14.04.1-log... [10:09:22] mitch: hi, seen you here yesterday, heh. [10:09:40] but didn't see you mention php 5.5.9. that's kinda new and i don't think anybody has tested MW on it yet – maybe php is segfaulting? [10:09:57] mediawiki tends to bring out nasty bugs in php [10:10:30] hi matmarex, yeah, but i'm still desperate about my wiki :) [10:11:11] mitch: are you able to downgrade php, just to see what happens? to 5.3 (preferably) or 5.4 [10:12:35] yeah, i could do that. never thought that this could be a problem... [10:12:47] (server is really up-to-date) [10:14:42] mitch: there's been a fair share of segfaulting bugs over the years MW has run into. https://bugzilla.wikimedia.org/buglist.cgi?quicksearch=ALL%20seg%20fault&list_id=318956 [10:36:19] !blank [10:56:54] mitch: i've gotta go now, if you try downgrading php and it fixes the issue, please file a bug about this so it's not forgotten: https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki [11:22:38] hrms... [11:23:18] does anyone have any pointers on how to do "user checklists", per se? I want to have a group of items (say 30 toys) that a user can then indicate, on his user page, that he has - with as little difficulty as possible. [11:23:29] looking around for "mediawiki checklist" didn't really bring up anything... [11:24:29] so what i was thinking of doing would be to add all the data for the toys into a template - and add support in the template for a flag on whether you own a particular toy...
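The template idea being sketched here (and elaborated just below) could look something like the following; the template and parameter names are made up for illustration, and {{#if:}} requires the ParserFunctions extension:

```wikitext
<!-- Template:Toy_checklist — hypothetical name. One row per toy;
     a user page passes own-<toy>=yes to mark that toy as owned. -->
{| class="wikitable"
! Toy !! Owned
|-
| Robot || {{#if: {{{own-robot|}}} | yes | }}
|-
| Yo-yo || {{#if: {{{own-yoyo|}}} | yes | }}
|}
```

The Main:Toy page would transclude it bare as {{Toy_checklist}}, while a user page would write {{Toy_checklist|own-robot=yes}}.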
[11:24:38] then the Main:Toy page would be just the template without any ownership... [11:24:55] and then users could embed the template in their user page, passing "own" flags as needed. [11:25:16] that way, the data isn't ever duplicated (i.e., users will always have the latest copy of the data on their user pages), but the flagging remains user-specific... [11:25:25] it's not as /nice/ as a checklist approach, but it seems like it would work. [11:26:46] that'd only ever really work if the grouping of toys or items was all on one Main:Toy product line though... [11:26:54] wouldn't really work if each Toy had its own individual page. [11:30:40] that seems to be the same idea behind http://wiki.teamfortress.com/wiki/Template:User_weapon_checklist ... [11:47:43] MatmaRex: hey thank you for your hints, i will try to fix it and file a bug. thanks a lot again [12:18:07] hi, i have a problem with my mediawiki installation: when the installation shows characters from the Chinese or Russian alphabets, it shows weird characters. [12:18:50] for completeness: i have moved from mysql 5.3 to 5.5 and also upgraded mediawiki. [14:43:16] I'm getting empty responses to my api calls for templatedata on localhost, i.e. http://localhost/api.php?format=json&action=templatedata&titles=Template%3ACite+web&lang=en-gb&redirects=1 [14:43:26] does anyone have any ideas on how to debug this? [14:43:44] TemplateData is installed in my Special:Version [14:44:04] and my templates were downloaded from wikipedia and the template data on them looks correct [14:54:13] mvolz, is the template data directly on the template page you're querying for (check the source of the page) [14:54:55] If they're transcluded from another page (e.g. documentation), TemplateData won't pick it up unless it's a very new version [14:56:31] mvolz, also you might try going to the page with action=purge and then querying again [15:02:38] Krenair: thanks!
I just updated templatedata a few hours ago to the most current version so I don't think that's it. [15:02:55] The template data appears on the source of the page; looks exactly like it does on wp [15:03:40] purge module requires a post request :( [15:04:05] mvolz, I mean from the UI [15:04:15] http://localhost/index.php?title=Template:Cite_web&action=purge [15:04:24] hey Krenair how's it going? [15:04:28] Mithrandir: hey there, how are you? [15:04:28] hi sumanah [15:04:31] got a sec? [15:04:43] things are going well actually [15:04:45] sumanah: hiya [15:04:50] how are you? [15:05:01] sumanah: good, good, enjoying a minor heat wave. You? [15:05:01] Krenair: tnx :) [15:05:10] still not working though :( [15:05:11] mvolz, did it work? [15:05:40] Mithrandir: fhocutt is with me in NYC and we're talking about API usability and "more than one way to do it" .... wanted to run something past you [15:05:46] the template data is in the documentation though [15:05:50] not directly on the page. [15:06:03] might try moving it [15:06:11] Mithrandir: kind of warm here in NYC too :/ [15:06:33] Krenair: cool! whatcha working on? (+ other nonwork things that might be going well, I should not presume) [15:06:57] mvolz, oh, um.... I know, try editing the template page [15:06:58] sumanah: sure. I need to pop out to pick up a pizza rsn, but I have a few minutes. [15:07:03] mvolz, make no changes, hit save [15:07:22] Mithrandir: so, we want APIs to be usable (and we want API client libraries to help with the usability, as sort of a shield/friendly interface to less intuitive APIs). And sometimes having "too many" ways to achieve very similar results can be .... confusing! and therefore less usable! [15:07:33] sumanah, am just helping mvolz, going to work on some VE/OOUI stuff in a moment [15:07:39] Sweet [15:08:18] Krenair: [15:08:19] omg [15:08:21] that worked [15:08:22] ??? [15:08:26] thank you so much [15:08:29] You're welcome [15:08:42] what was going on?
[15:08:45] sumanah: absolutely. [15:09:01] sumanah: it's the "there is more than one way to do it" criticism against perl [15:09:08] I had this issue on the English Wikipedia. Had forgotten how I solved it, but turns out it just needed a 'null' edit (edit making no changes) [15:09:14] Mithrandir: so, when you have various things you can do with an API that allow you to access different levels of abstraction.... it's important to make the most common actions, that users will want to do a lot, simple, and make sure the docs/tutorials/libraries also make those easy and show those methods and maybe even ignore/not mention the other ways [15:09:30] sumanah: yeah, or put them in separate modules. [15:09:49] "this is how you do oo access", "this is low level", "this is procedural", etc. [15:10:17] fhocutt: http://bots.wmflabs.org/~wm-bot/logs/%23mediawiki/20140603.txt [15:10:19] mvolz, So I think MediaWiki caches transclusions of things (e.g. {{documentation}}), a null edit makes it redo that [15:10:48] it's pretty common for python modules that wrap C libs to have one that's a pretty faithful imitation of the C library and one that makes it into a python module instead [15:10:49] Mithrandir: different modules of the *code* or of the docs? [15:10:52] oh [15:10:57] sumanah: both [15:11:01] Nod. [15:11:14] is there any way to flush cache programmatically? [15:11:19] ok, that confirms and deepens the understanding I had and was passing on to fhocutt so thanks Mithrandir [15:11:21] sumanah: they can even be separate packages you install, though whether that makes sense obviously depends on the size, complexity, etc. [15:11:33] right, yes. [15:11:57] ok, need to run to pick up pizza, back in 20-ish. [15:12:00] OK fhocutt! any further clarifications/ideas you wanna talk about now? or shall Mithrandir go get pizza? [15:12:03] oh ok :) thanks Tollef [15:12:10] thanks Mithrandir! [15:12:41] (i.e.
I can't imagine you edited every citation template on English wikipedia) [15:14:05] (why not, there are only a few dozen) [15:14:57] (ok never mind :)) [15:15:50] hrm... [15:15:59] how do i stop sidebar groups from collapsing? [15:16:09] moving from 1.21 to 1.22 just now, and my second group is auto-collapsing. [15:17:49] (for some reason I thought there were a billion citation templates but only the new CS 1 ones have template data so I guess it wasn't as many as I thought!) [15:17:52] fhocutt: http://hexm.de/mw-search is a thing I may not have mentioned to you before - it searches a lot of things! mailing lists, Bugzilla, repositories.... [15:18:53] you can also enable in your preferences https://www.mediawiki.org/wiki/Special:Preferences#mw-prefsection-gadgets "external search" - check the checkbox that says: [15:18:53] External Search: Add a selector to the search page allowing to use the Wikimedia technical search (those searches are processed using Google CustomSearch, subject to Google's Privacy Policy). [15:19:31] fhocutt: it's under "User interface gadgets" [15:20:12] oh, I see, it doesn't change the upper-right-hand-corner search box, it gives you a dropdown choice at https://www.mediawiki.org/wiki/Special:Search . [15:21:05] hi qgil! I am helping fhocutt move her API client "gold standard" into finalized status (it has been a draft). How are you? [15:23:50] hi sumanah "gold standard" :) I'm good, and ready to attack our quarterly review materials... [15:24:24] anyone know how to stop Mediawiki:Sidebar groups from becoming collapsed by default? http://www.disobey.com/wiki/Main_Page [15:24:32] the "Games and puzzles" section is being forced closed by JS, it seems. [15:25:26] :) [15:26:57] hrm. [15:27:00] might have been cache, i suppose. [15:27:09] i just edited the page i was on, and now the items are staying open. [15:54:35] [15:59:38] hi terrrydactyl how's it going?
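mvolz's question above about flushing the cache programmatically points at the API's purge module, which (as noted earlier in the conversation) must be sent as a POST. A sketch using only the Python standard library; the localhost URL matches the earlier examples, and forcelinkupdate asks MediaWiki to also re-parse the links/transclusion data, which is what the null edit accomplished for the {{documentation}} transclusion:

```python
import json
import urllib.parse
import urllib.request

API = "http://localhost/api.php"  # adjust for your wiki

def purge_params(title: str) -> dict:
    """Build the POST body for an action=purge API request."""
    return {
        "action": "purge",
        "titles": title,
        "forcelinkupdate": 1,  # also refresh links/transclusion data
        "format": "json",
    }

def purge(title: str) -> dict:
    """Send the purge as a POST, since the purge module requires it."""
    data = urllib.parse.urlencode(purge_params(title)).encode()
    with urllib.request.urlopen(API, data=data) as resp:
        return json.load(resp)

# e.g. purge("Template:Cite_web")
```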
[15:59:57] rmoen: hey there, I'm gonna be in Portland for Open Source Bridge + AdaCamp in late June, hope to say hi and grab coffee with you then [16:00:30] sumanah: for sure. We should get that coffee :) [16:00:36] yes! [16:00:46] hope PDX is treating you well and that the new team is as well :) [16:01:02] sumanah: it is! :) gotta do the standup. brb [16:01:10] I'm so excited rmoen -- I'm gonna give a talk about Python tips http://opensourcebridge.org/sessions/1329 [16:03:56] sumanah: ooohhh ;) That looks like a fun talk [16:05:36] rmoen: https://www.mediawiki.org/wiki/Meetings/2014-06-19 I will be giving it as a Wikimedia tech talk ahead of time as well [16:05:51] qgil: got a moment to talk about upcoming tech talks? [16:07:33] 1 minute to finish one thing pls sumanah [16:08:47] sumanah, ready [16:09:05] sure no prob qgil. Basically, you remember Filippo Valsorda, FiloSottile, who was at Zurich? He made https://filippo.io/Heartbleed/ . He is giving a talk about making it at an OWASP conference later this month, and I think it would also make a good Wikimedia tech talk http://www.meetup.com/OWASP-NYC/events/183358352/ [16:09:21] qgil: he's interested, and I think it'd be of interest to us [16:09:57] he was very Wikimedian in how he did it - open source, no ads, etc. and it was an interesting scaling challenge and I've heard the story and I think a lot of Wikimedians would find it interesting, from a security and performance perspective [16:11:09] he's a Wikimedian who makes bots/tools/etc and edits Italian Wikipedia [16:11:45] So I think it would be a good opportunity to showcase some interesting tech that a community member made. What do you think qgil? [16:11:53] ok for me to go ahead & have Rachel schedule it? [16:12:06] If e.g. csteipp and Krinkle see the point, and Heartbleed may be interesting for Wikimedia / MediaWiki in the future... 
[16:13:20] I mean our Tech Talk schedule is basically empty and welcoming sessions, but we should keep a link to Wikimedia / MediaWiki topics (I believe, and I'm not convinced) [16:14:17] qgil: ok, I'll check with Chris. Yuri already said "sure I'd be interested" so I have at least 1 more person who thinks it's of interest [16:15:32] When Wikimedia / MediaWiki organizes a tech talk, the expectation is that the subject is about our project or about projects we have a relation with. We can be as flexible as needed, but we just need to make sure we don't raise wrong expectations. [16:15:44] Thank you for your help filling the agenda of tech talks! We really need more topics. [16:21:52] qgil: Understood [17:00:07] Krinkle: ping https://gerrit.wikimedia.org/r/#/c/122838/ [17:00:07] Can haz review? [17:07:03] bawolff: Hi [17:07:10] kunalg: hi [17:09:01] bawolff: I was working on the Page Language selector, starting with a SpecialPage. Robin suggested that it might be considered even as a final implementation. [17:10:09] bawolff: I wanted to ask where should the code for changing language go? I don't find it too useful in the Title class. Should it rather be in the SpecialPage? [17:10:09] That's possible, its probably best to do that first, and after its done to evaluate if it effectively meets the use case [17:10:19] Yes, it should probably be in the special page [17:14:29] bawolff: Ok. [17:15:31] bawolff: Thanks, it would be great if you could review my commits. :) [17:16:11] I'll try to look at them soonish [17:16:30] Thanks [17:18:31] anomie: Robin wrote a patch for reducing the complexity in the languages part and making the page languages into a single array- https://gerrit.wikimedia.org/r/#/c/137033/ [17:21:24] hi [17:22:02] what would you guys do to improve the search for your wiki on google search? [17:32:48] what's the name of that popular WYSIWYG extension that helps new users get acclimated? [17:35:07] Do you mean VisualEditor? 
[17:35:40] !e VisualEditor [17:35:43] or [17:35:59] !e WikiEditor [17:40:22] ooh an irc bot [17:41:44] mvolz Amgine thank you! [17:44:50] hello, i've been trying to get visualeditor working on my internal wiki for a while now without much success. Parsoid is up and passes all tests but when i try to edit a page using visual editor i receive parsoidserver-http-bad-status: 500 [17:45:41] im running mediawiki 1.22.7 and what i think is the correct visual editor release but i am not sure whats up [17:46:01] ive tried upgrading mediawiki to rel 1_23 and then also visual editor but that failed [17:46:37] i also tried moving my mediawiki from centos to ubuntu server because it was much easier to install parsoid on ubuntu but that didn't work out either [17:47:25] jeffo, did you use the deb to install parsoid? [17:47:36] i did on my ubuntu instance [17:47:48] hmm [17:48:10] when using short urls should my entry points be /wiki/ or /w/ [17:48:17] the default port in that case is 8142 [17:48:53] yeah i fixed that in the localsettings.php on that instance of my wiki [17:49:33] for the api uri, you need to use whatever uri gives you the main api page [17:49:49] which might be /api.php with short urls [17:49:55] just try it with a browser [17:50:02] see https://www.mediawiki.org/wiki/Parsoid/Troubleshooting [17:50:11] parsoid is working great [17:50:53] okay, so you are able to load a page from your wiki through parsoid [17:51:00] no [17:51:19] parsoid itself when i browse to server:8142 works fine [17:51:27] and when i run the command line tests of parsoid [17:51:33] all expected success/fail are registered [17:51:45] did you configure your local wiki prefix & api url? [17:51:53] but when i "edit" a page in my wiki, i get parsoidserver-http-bad-status: 500 [17:52:07] basically check the troubleshooting page [17:52:26] k one sec [18:08:10] okay [18:08:10] nope [18:08:30] how can i turn on debug logging to determine why i am getting error 500?
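On jeffo's debug-logging question: MediaWiki's debug switches live in LocalSettings.php. A sketch (these are real core settings, but treat the exact choices as a starting point, and remove them once the 500 is diagnosed):

```php
// LocalSettings.php — temporary debug settings for chasing an HTTP 500.
error_reporting( E_ALL );
ini_set( 'display_errors', 1 );        // surface raw PHP fatals in the response
$wgShowExceptionDetails = true;        // full MediaWiki exception backtraces
$wgDebugLogFile = '/tmp/mw-debug.log'; // verbose debug log; path is an example
```

With these on, the body of the failing AJAX edit request (visible in the browser's network inspector) will usually contain the underlying error.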
[18:09:07] when i do curl to parsoid it works fine [18:09:22] not sure why the error 500 is generated from mediawiki [18:13:33] 500 status code can be caused by a lot of things. Sometimes it's a php fatal error, in which case check your php error logs [18:13:54] !500 [18:15:20] hmm [18:22:57] Hi! Is ashley online today? [18:28:43] ashley: If you happen to check here later, please ping me. I was told you'd be the one to ask about something. [18:35:50] jeffo, make sure you are setting the right port in the VE config [19:00:32] Luma_: hi! [19:00:33] also [19:00:35] !ask | Luma_ [19:28:01] bawolff: got a minute? [19:28:06] sure [19:28:19] can you look at my last comment on https://gerrit.wikimedia.org/r/#/c/123516/ and see whether it's mergable? [19:30:34] I think its ok to merge. I put a +1 on it for now, if no one objects in like a day or so, I'll merge it [19:31:01] kk [19:39:09] Which mediawiki extensions are necessary for correct displaying of pages imported from the simple english wikipedia dump? I enabled ParserFunctions, Cite, ImageMap and InputBox, but there are html tags as text on several pages and the design of some pages is broken. [19:41:07] Dmitry: Which html tags? [19:41:15] You probably also want Tidy [19:41:17] !tidy [19:41:27] and you also probably want scribunto [19:41:37] but I'm not sure either of those would leave tags in text [19:42:18] on MainPage [19:42:27] {{:Main Page/Article Division by zero.}} on MainPage [19:42:39] Dmitry: That could be tidy missing [19:43:21] I'm not sure what circumstances could cause "{{:Main Page/Article Division by zero.}}" to be on a page unless its in a block [19:43:48] on Wikipedia:Privacy_policy. And the page is visually broken. [19:45:08] that all sounds like html tidy not being set up [19:45:52] ok. I go try...
[19:52:30] bawolff: tidy solved the problem [19:52:42] :D [19:52:49] bawolff: thank you [20:13:01] where's the best resource for troubleshooting problems with my Visual Editor implementation on mediawiki [20:13:34] i can't even get debug logs to show why i am getting the error "parsoidserver-http-bad-status: 500" [20:13:38] when i click [20:13:40] "Edit" [20:13:45] i have parsoid working [20:14:06] but whenever i click "Edit" in my mediawiki i get a web browser message about error 500 [20:14:16] i have parsoid running in my terminal session and no message is displayed there [20:14:36] because it is an AJAX call to edit the page w/ visualeditor no other error is displayed, even if i turn on php errors [20:16:01] hi jeffo [20:16:53] jeffo: sorry to ask you to repeat yourself, but the #mediawiki-visualeditor room is often helpful and has the key developers in there. It'd be good for you to also tell them what versions of MW, Parsoid, and VE you are running [20:17:09] I wish you well [20:18:26] thanks pal [20:18:40] exactly what i was looking for [20:18:43] :) [20:37:04] fhocutt: https://www.python.org/psf/grants [20:37:15] "A related source of funding is the PSF Sprints Committee. The Sprints Committee provides organizational and financial support to Python-focused coding and hacking sprints. This includes work on modules or frameworks, Python 3 module ports, etc." [20:38:23] fhocutt: https://openitp.org/grants.html - "OpenITP provides material support to free and open source software projects that make tools for circumventing digital surveillance and censorship....We usually do a call out twice a year - once in the Spring and one in the Fall." [21:30:15] For my wikis, $wgUploadDirectory is the default $IP/images but it is an NFS mount. On top of that, each of my 5 wikis does this across 4 web servers. Is it possible to somehow move the upload directory outside of the document root so I can more easily upgrade MediaWiki software? [21:40:48] why would it be easier?
[21:42:10] Otherwise I have to do the shuffle of unmounting nfs, moving the mw directory out of the way, move the new mw directory into place and remount nfs [21:42:36] and that has to be done for each of 20 mw installations (until I can move to a wiki family architecture on each of the 4 servers) [21:43:11] Just replace the files? [21:43:44] yes, since you don't just extract a new MW tarball over the existing installation [21:45:20] basically, i extract the new MW tarball to a staging location, along with all extension updates in the extracted locations extensions directory, then for each wiki directory, unmount nfs, move old dir out of the way, copy new staged dir into place, remount nfs, copy back LocalSettings.php, etc. [21:45:46] (that also ignores Semantic MediaWiki updates, which has its own complexities) [21:48:16] So I was trying to figure out if there's a way to mount the NFS directory outside the mediawiki directories and point $wgUploadDirectory to that mount location, assuming I correctly configure Apache permissions, etc. [21:49:21] just update $wgUploadPath and that should work [21:49:41] that will alter the URLs to use the new location (I suppose you could do apache-level rewrites instead) [21:52:06] so if I have a wiki at /var/www/sites/wiki1 and I set $wgUploadDirectory for that wiki to /var/www/images/wiki1, what should $wgUploadPath be? [21:53:42] most likely /images/wiki (it was probably /sites/wiki1/images before) [21:53:58] According to the docs, $wgUploadPath defaults to /wiki/images, which should stay the same, so that's what confuses me [21:54:29] I don't currently set $wgUploadDirectory or $wgUploadPath [21:54:36] just take the defaults [21:55:40] right, the defaults would be /var/www/sites/wiki/images and /sites/wiki/images, respectively [21:55:49] It is possible I'm just doing it wrong. I don't currently have a better way to manage upload directory than through NFS, which has historically made the MW upgrade procedure more complicated. 
I'd love a better solution. [21:56:00] e.g. doing var_dump( $wgUploadDirectory ) inside maintenance/eval.php [21:56:32] I don't see why NFS should be complex [21:56:40] fhocutt: sorry, meeting went long [21:56:42] coming back now [21:56:52] because of the upgrade procedure for mediawiki [21:57:19] sumanah, not to worry, I'm looking at simplemediawiki [21:57:25] nod [21:57:42] extracting the tarball to a temp location, unmounting the nfs images dir, moving the old directory out of the way, copying the extracted dir into place, and remounting nfs [21:58:15] well, that's *your* process ;) [21:58:26] I'm all for a better process :) [21:58:58] As I mentioned, I do want to move to a wiki family architecture so I only need 1 mediawiki directory per server instead of 5, but that's a bit further down my roadmap [21:59:00] so yeah, using another directory is easier, also maybe just symlinking the images/ dirs would work (redoing them with each tarball, probably with a script) [22:00:06] I can test that symlink idea on my dev server. I am pretty sure I tried something like that long ago and had issues, but I don't remember what it would have been if that is what happened. :P [22:01:11] Hi, is there any way to make MediaWiki use URLs like /wiki/Main_Page?action=history instead of /w/index.php?title=Main_Page&action=history ? [22:01:25] I would try moving the directories first (and updating those config vars) [22:01:58] also, using thumb_handler/img_auth together can avoid the need for apache to serve the media as a third option [22:02:04] probably lots of ways to do this [22:02:15] I guess I'm still confused by $wgUploadPath. That's part of the URL and I don't want the URLs to change, just the on-disk locations. [22:02:33] I looked at the img_auth stuff and that seems more complex than I would like.
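Putting the advice above together, a hypothetical LocalSettings.php sketch using the example paths from this conversation: the on-disk location moves out of the web-app directory while the public URL stays the same via an Apache alias.

```php
// LocalSettings.php — uploads live outside the MediaWiki directory.
$wgUploadDirectory = '/var/www/images/wiki1'; // NFS mount, outside the app dir
$wgUploadPath = '/sites/wiki1/images';        // unchanged public URL path

// Apache then maps the old URL onto the new directory, e.g.:
//   Alias /sites/wiki1/images /var/www/images/wiki1
```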
[22:03:04] especially since on http://www.mediawiki.org/wiki/Manual:Image_Authorization it recommends against it [22:13:48] using it for secret wikis is not recommended (though we sort of do that ourselves anyway) [22:14:05] that wouldn't matter for what you were doing...in any case it would involve changing URLs [22:14:24] if you don't want those to change you need either redirects or aliases setup in apache so the old URLs work [22:23:09] hey, how can i remove the category information from parsed text like here http://en.m.wikipedia.org/w/api.php?&action=parse&format=json&disabletoc&disablepp&prop=text&page=treasure ... compared with http://en.m.wikipedia.org/wiki/Treasure ... [22:37:19] hey pginer - how's it going? got a moment? [23:23:18] herro? [23:23:44] I have a few questions, and i need a bit of help [23:24:43] anyone active? [23:26:10] !ask [23:29:09] ah, im trying to set up a wiki, i downloaded all the client but am unsure how to proceed, any help would be appreciated [23:29:56] i think we have a guide about this, are you having problems with any particular step? [23:29:58] !installing [23:31:03] ok i will poke at that, thank you, I have a work related project that i need to create 4 or 5 interrelated wiki pages [23:33:36] Hi guys! [23:34:32] Does anyone know of a way to add comments in the new JSON files? [23:35:01] I am switching some extensions to the new JSON i18n format and I would like to keep some comments. Is there a way? [23:36:53] my other concern with setting up my wiki is security, all pages i create need to have highly regulated access. [23:39:12] MediaWiki doesn't have great, per-page access controls [23:40:28] Joergi: you can't :( [23:40:56] you can create fake keys that start with "@", and they'll be ignored by mediawiki though [23:41:34] so once i start a page its public? [23:41:48] legoktm: Hmm, that's sad. Any ideas? I used to have those texts, which belong together, grouped together.
Then followed another group. And so on. [23:42:02] legoktm: Do you have an example? [23:42:41] an example is the @metadata key used to hold the list of authors... [23:43:08] hrm... why does the "mobile version" link of MobileFrontend use https://, when my site isn't https? [23:43:23] That sounds good. So something like "@section-navigation": "These are navigation texts", should do [23:44:13] legoktm: Will there be any drawbacks if I use such keys starting with @? [23:45:18] I don't believe so [23:46:22] Hmm, the language files also seem to continue working... [23:58:20] legoktm: Yes, seems to work. Thanks for the idea! :-) [23:58:30] awesome!
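The "@"-key workaround legoktm describes might look like this in an extension's i18n/en.json. The message keys below are made up; only @metadata is a standard key, and per the discussion above MediaWiki simply ignores the other keys beginning with "@":

```json
{
	"@metadata": {
		"authors": [ "Example Author" ]
	},
	"@section-navigation": "The following keys are the navigation texts",
	"myext-nav-home": "Home",
	"myext-nav-search": "Search",
	"@section-errors": "The following keys are error messages",
	"myext-error-notfound": "Page not found"
}
```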