[02:36:07] Hi, sorry to bother you guys again, but can anyone tell me where exactly I download the Refreshed skin?
[03:02:09] No Repton
[03:02:11] Sorry.
[03:03:53] Ok
[04:09:14] Hi, I have installed MediaWiki locally and installed the Cargo extension. I'm trying to work on the task T91222. But how can I run & check the extension on localhost? Is there any URL specifically for that?
[04:12:35] Hi prakhash.
[04:15:14] Hi Fiona
[04:21:19] @Niharika can you please help me to identify this
[04:21:21] Hi, I have installed MediaWiki locally and installed the Cargo extension. I'm trying to work on the task T91222. But how can I run & check the extension on localhost? Is there any URL specifically for that?
[04:22:03] prakhash: Firstly, check if the extension is installed. You can do that by opening Special:Version.
[04:22:23] prakhash: Link to the Cargo extension page?
[04:22:36] The page on MW.
[04:23:03] Ah, found it.
[04:23:08] Yes, on localhost how can I access those extensions?
[04:24:43] prakhash: Did you check if it's installed?
[04:25:15] prakhash: Does the Special:Version page show you the extension listed?
[04:28:01] I downloaded it & placed it inside the extensions folder, then specified it in LocalSettings.php and executed the update.php command. During that I was able to see the installation process, but I didn't find it on the SpecialVersions.php page
[04:28:46] No. Not SpecialVersions.php.
[04:28:54] In your browser.
[04:29:08] Go to /wiki/Special:Version
[04:31:23] no, I'm getting a 404
[04:31:43] that is not available
[04:32:16] The requested URL /core/Special:Version was not found on this server
[04:34:29] @Niharika the link has to be /wiki/index.php/Special:Version
[04:35:05] now I'm able to see a page & under installed extensions I can see Cargo
[04:35:25] prakhash: Good. It's been a while since I messed with all this.
[04:35:46] So now Cargo is enabled and is (hopefully) working as it should.
[04:37:37] Thanks Niharika, yes it is working. I will try to solve that issue.
[04:41:07] is this page crashing Chrome for anyone else? http://en.wikipedia.org/wiki/Manichaeism
[06:43:30] Hi there, can someone help me with a robots.txt question
[06:45:10] !ask | Guest7487
[06:45:10] Guest7487: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[06:45:46] User-agent: * Disallow: /w/ Allow: /w/sitemap/ Sitemap: /sitemap.xml Disallow: /wiki/Special:Ask/ Disallow: /wiki/Special:Browse/ Disallow: /wiki/Special:SearchByProperty/ Disallow: /wiki/Special:ExportRDF/ Disallow: /wiki/Special:PageProperty/ Disallow: /wiki/Special:Properties/ Disallow: /wiki/Special:UnusedProperties/ Disallow: /wiki/Special:WantedProperties/ Disallow: /wiki/Special:SMWAdmin/ Disallow: /wiki/Special:Types/ Dis
[06:46:05] hmm, that didn't work well.. first time here - sorry
[06:49:58] Guest7487: Don't flood, use https://dpaste.de
[06:50:13] Guest7487: You might want to read the rest of the channel topic, too.
[06:51:48] Sorry, I will leave
[06:59:59] https://dpaste.de/UDNX#L1,2
[07:00:46] Is this a suitable robots.txt? And if so, I would still like to allow images to be crawled. I need an Allow
[07:08:21] Guest7487: You can test robots.txt with Google's tool: https://support.google.com/webmasters/answer/6062598?hl=en
[07:10:50] Thx. What about allowing images?
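On the images question: a more specific Allow can carve the uploads directory out of the broad /w/ Disallow, at least for crawlers that honor rule precedence, such as Googlebot. A minimal sketch, assuming uploads live under /w/images/ (check the wiki's actual upload path first; example.org is a placeholder):

    User-agent: *
    Disallow: /w/
    Allow: /w/images/      # let crawlers fetch uploaded image files
    Allow: /w/sitemap/
    Sitemap: https://example.org/sitemap.xml   # should be an absolute URL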
[07:16:04] Guest7487: Test the images URL with that Google tool, see what it says.
[07:20:37] hi, I want to contribute to the Multilingual Semantic MediaWiki project. Could someone guide me as to where I can start?
[08:17:35] bd808: I did a git pull followed by a "vagrant up" and "vagrant provision". I got a lot of failures. You may have a look at them here: https://gist.github.com/anonymous/17bcbc5f0f6f06c628d6
[08:17:52] Any idea what's wrong?
[10:25:52] hi GSoC admins, qgil especially.
[10:26:18] I'm Jacob from #apertium. http://www.dtu.dk/Service/Telefonbog/Person?id=78778&tab=6
[10:26:31] Apertium didn't get into GSoC this year :-(
[10:26:42] Therefore our mentors have lotsa spare time :-| .... I was wondering if there would be a possibility to do a joint GSoC project, related to your machine translation plugins, where Apertium is currently used for Spanish-Catalan?
[10:57:38] Hi, vagrant up gives the following error: The guest machine entered an invalid state while waiting for it to boot. Valid states are 'starting', 'running'. The machine is in the 'poweroff' state. Please verify everything is configured properly and try again.
[10:57:40] and while starting the machine manually in VirtualBox it gives the following error:
[10:57:41] Failed to open a session for the virtual machine vagrant_default_1425637903749_41452. VT-x is disabled in the BIOS. (VERR_VMX_MSR_VMXON_DISABLED). Result Code: E_FAIL (0x80004005) Component: Console Interface: IConsole {8ab7c520-2442-4b66-8d74-4ff1e195d2
[10:58:12] any idea how to fix this?
[10:58:32] How can I enable VT-x in the BIOS?
[11:14:40] sakshi1: actually - turning that on would be specific to your computer's BIOS settings
[11:15:40] I am using Windows 8
[11:15:42] you will find something like "Enable virtualization" or something like "VT technology" in your BIOS settings. Enable that one - and boot up!
[11:15:57] sakshi1: you will have to change the settings from your BIOS
[11:16:16] sakshi1: something like http://superuser.com/a/179586
[11:18:35] Ok, I'll try
[13:45:00] Hello brave people
[13:45:31] I have edited something on translatewiki.net
[13:46:03] When will I see the change on www.mediawiki.org ?
[13:46:26] Cool, I just did too. I just can't figure out where the damn watchlist e-mails are on translatewiki... I can't find it and there's an evil grammar error in the Danish version.
[13:47:12] You know the trick "qqx"?
[13:47:26] what?
[13:48:25] To find the key, you may append ?uselang=qqx to the address
[13:49:12] Well, it's in an e-mail, not a MediaWiki site... but it is sent by MediaWiki, so I'd think it would be on translatewiki
[13:49:58] Oh, for an e-mail I don't know
[13:53:20] but thanks for the tip, it might come in handy another time
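An example of the qqx trick mentioned above (the page is arbitrary; any wiki page works):

    # Request any page with ?uselang=qqx to see message keys instead of text:
    https://www.mediawiki.org/wiki/Special:RecentChanges?uselang=qqx
    # Every interface string then renders as its key, e.g. (recentchanges-summary),
    # which you can search for and fix on translatewiki.net.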
[14:12:19] I was away
[14:12:31] Bye
[15:03:39] How can I force the installer? I'm trying to copy a MediaWiki-Vagrant instance up to AWS, but it loads missing.php even when I remove LocalSettings.php
[15:06:33] I could start with a fresh clone of MW, and clone all the extensions, but would rather understand why I can't just re-use the code from MediaWiki-Vagrant
[15:09:54] Maybe my question is better suited for #wikimedia-operations
[15:10:21] freephile: I think this is the right place, actually
[15:10:44] marktraceur: ok
[15:10:58] freephile: So you are uploading a Vagrant instance to AWS
[15:11:08] freephile: Surely there's a more performant way to host a MediaWiki site.
[15:11:22] right... rsynced the code
[15:11:43] client wants AWS
[15:11:50] freephile: Not to say that Vagrant is causing your problem, but why not use apache and friends on the server itself, and host MediaWiki without Vagrant?
[15:12:17] oh, I'm not using Vagrant.... I developed on MediaWiki-Vagrant locally
[15:13:08] I rsynced the mw codebase to AWS, and set up Apache, MySQL etc....
[15:15:46] But the mw-config/missing.php file that is getting loaded seems to be part of the wikimedia-operations puppet system
[15:15:51] Aha.
[15:16:03] freephile: What does that page say when it loads, exactly?
[15:16:47] No wiki found
[15:16:56] Sorry, we were not able to work out what wiki you were trying to view.
[15:16:56] Please specify a valid Host header.
[15:17:47] which is showGenericError()
[15:23:18] Huh.
[15:31:09] the file missing.php defines showGenericError(), which is displaying the output I see
[15:31:41] missing.php is only referenced by ./MWVersion.php and ./defines.php in /var/www/w/
[15:33:35] I just checked git for core, and those files aren't part of core
[15:34:11] so I think that answers the question.... rename/remove those files and the installer should run
[15:36:53] no, it appears that MediaWiki-Vagrant has a special version of MediaWiki that is multi-site
[15:38:58] 9 references to MWVersion.php, including index.php, api.php and load.php
[15:40:04] So, the correct answer might be to just clone MediaWiki core into /var/www/w
[15:49:34] freephile: Possibly, yes. Or maybe mwv is confused because the hostname of the site changed.
[15:52:03] marktraceur: I tried to avoid that scenario by removing LocalSettings and any database
[15:53:10] Not sure what the context is, but I can't imagine removing LocalSettings.php and the db is going to help things
[15:53:30] bawolff: fresh install of mw
[15:53:55] hmm, well I guess that would be the exception where that would help
[15:54:15] bawolff: I got the code off my MediaWiki-Vagrant development environment rather than downloading it from git
[15:54:31] * bawolff doesn't know about how vagrant is set up
[15:54:57] * freephile nods.... I just found out a lot I didn't know :-)
[15:55:06] freephile: You deleted your database and LS.php, why do you need to preserve the mwv setup again?
[15:55:46] marktraceur, I just wanted all the extensions
[15:55:47] Like, clearly there's an issue, but I'm 89% sure the "right" solution is for you to not use Vagrant in production.
[15:55:52] OK, well, they're all available.
[15:56:07] Stop using Vagrant, download MediaWiki and the extensions for real.
[15:56:22] freephile: Normal MediaWiki comes with common extensions included
[15:56:29] In tarballs, yes.
[15:56:30] (normal being tarball, not git)
[15:56:35] yep... I've got to stop trying to do things the 'easy' way... takes at least 4x longer :-)
[15:56:47] I have never thought of Vagrant as the "easy" way :P
[15:57:04] i mean rsync A -> B done
[15:57:17] As neat as it is to have fewer commands to install so many things, it seems to always cause me more trouble than it's worth.
[15:57:30] Slowness, timeouts, browser test failures
[15:58:17] ಠ__ಠ
[15:59:04] mw-vagrant does have a custom wikifarm setup modeled on the WMF multiversion code
[15:59:19] I'm more just stuck in my ways. My install from plain git works fine (well, "works" might be an exaggeration...), and I don't really want to break what isn't broken
[16:00:26] bawolff: I'd normally start with a git clone also... which lets me track any changes to any files, plus switch branches, pull etc.
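The "fresh clone" route they both describe looks roughly like this. A sketch only: /var/www/w and Cargo are illustrative choices, the clone URLs assume the current Gerrit layout, and the composer step applies to recent core checkouts:

    git clone https://gerrit.wikimedia.org/r/mediawiki/core.git /var/www/w
    cd /var/www/w
    composer install --no-dev    # git checkouts need core's external libraries
    git clone https://gerrit.wikimedia.org/r/mediawiki/extensions/Cargo.git extensions/Cargo
    # then run the web installer to generate a fresh LocalSettings.php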
[16:01:14] freephile: I'm trying to think of what would land you on the missing.php page. Does it list URLs that it thinks should be valid wikis?
[16:01:44] bd808: thanks. Good to know. I'm interested in wikifarms, so nice to learn about WMF multiversion
[16:02:43] It takes config from files in the vagrant/settings.d/wikis/* directories
[16:03:03] nope... it lists "devwiki", at 127.0.0.1, but I think that's only because it's a default
[16:03:08] one day I'll get vagrant working properly, I keep running into NFS errors
[16:03:36] You can try: vagrant config nfs_shares no
[16:03:47] That will disable nfs and fall back to vbox sharing
[16:03:56] bd808: I wrote an unwrapLocalSettings function to grab all the settings.d conf
[16:04:39] and stuffed that in LS.php without the CommonSettings.php, but still didn't succeed in getting it to work.
[16:04:55] What does your /var/www/w directory look like?
[16:05:24] I'm thinking that there may be problems transferring things directly via rsync, as there are files that only exist in the VM
[16:06:06] not all of the files generated by a Puppet run are directly found in the host computer's vagrant directory
[16:06:17] http://pastebin.com/VB1ixCbw
[16:07:13] ^ that's my directory tree
[16:08:00] and does dblist.php look like this? -- https://dpaste.de/9cs3
[16:08:20] That is the file that loads per-wiki data for our multiversion scripts
[16:08:46] and it has hard-coded expectations of where to find the bootstrapping files
[16:08:50] and like I mentioned, to make up for the fact that there is no '/vagrant' directory on my AWS host, I used a unified version of settings.d/*.conf
[16:09:26] the multiversion code works on files that are loaded before CommonSettings
[16:09:37] so you'll need to rip that out too
[16:10:12] I think you'll need to change all of the files in /var/www/w
[16:11:29] Running as a single wiki I think you can just `mv w w-vagrant; ln -s /path/to/$IP w`
[16:11:54] but you may have to tweak a few more things. I haven't tried that
[16:12:20] bd808: Thanks, now that I know more about MW-Vagrant, I'll just start with a fresh clone of core for the deployment
[16:13:00] The files in /var/www/w are stubs for files from $IP that invoke MWMultiVersion to determine which config should be loaded based on the URL of the request being handled
[16:13:26] That all happens before any LocalSettings files are loaded
[16:14:00] bd808: That's awesome. That's basically how I've always done Mass Virtual Hosting in Apache
[16:15:07] That's basically how the WMF production cluster works too
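Mass virtual hosting of the kind freephile compares it to is typically done in Apache with mod_vhost_alias; a minimal sketch (hostnames and paths are illustrative, not the WMF or mwv configuration):

    <VirtualHost *:80>
        ServerAlias *.example.org
        # SERVER_NAME must come from the Host header for %1 to work
        UseCanonicalName Off
        # %1 = first dot-separated part of the hostname, so a request for
        # devwiki.example.org is served from /var/www/wikis/devwiki
        VirtualDocumentRoot /var/www/wikis/%1
    </VirtualHost>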
[16:48:06] Hm, it's the second time already in a few days that I struggle to find a component for MediaWiki core backend stuff https://old-bugzilla.wikimedia.org/describecomponents.cgi?product=MediaWiki
[16:53:18] what needs to be done to fix https://phabricator.wikimedia.org/T76942 ? a decision on whether to rename MediaWiki-General-or-Unknown or create a new tag?
[16:54:40] Vulpix: direct database action, so we don't trigger hundreds of thousands of notification emails.
[17:02:35] does someone have a bot account on mediawiki.org to fix this SNAFU? https://www.mediawiki.org/wiki/Thread:User_talk:Shirayuki/Recent_edits_to_mw:Template:WikimediaDownload_seem_to_create_a_rendering_issue_on_mw:Extension:GraphViz
[17:03:42] it could be fixed with a simple replace, but there are a lot of pages to fix
[17:04:41] I could probably do it with AWB if someone told me what needs replacing :P
[17:05:54] Reedy: https://www.mediawiki.org/w/index.php?title=Extension:GraphViz&diff=1434881&oldid=1408016
[17:06:54] Oh, that should be easy
[17:14:09] Vulpix: fixing
[17:14:30] https://www.mediawiki.org/wiki/Special:Contributions/Reedy
[17:14:34] thanks!
[17:15:54] running under a bot account would be more optimal, though
[17:17:43] lol
[17:17:56] Let me flag my bot on mw
[17:18:36] well, Kghbln and Shirayuki already spammed recentchanges and fixed a bunch of them, so it shouldn't be a problem
[17:19:24] Ok, doing it as bot now
[17:26:16] I'm trying to find something I'm pretty sure exists, but I'm not able to describe it with the right search terms, apparently. I'm looking for something that would look at the links between pages on a wiki and construct a graphical map, 2d or 3d, either would be fine, showing which pages are most-connected, where the "lumps of interconnectedness" are, orphan pages floating out in space, etc.
[17:26:44] I'm finding plugins which allow you to embed graphviz markup in the page, for instance, but not any which will *generate* that from the wiki's internal structure.
[17:37:36] myself: Hi there.
[17:38:06] hi
[17:38:16] myself: Special:LonelyPages lists pages not linked to
[17:38:52] myself: Special:MostLinkedPages has a list of most-linked pages.
[17:39:08] myself: You could probably pull that data pretty easily and construct a graph, not sure if that extension already exists.
[17:39:16] marktraceur: I'm aware of all the lists, but I'm also aware that graphical presentations help humans work with data sets like that :) I'm pretty sure I've seen graphs like this for page relationships before, so I'm mostly looking for a term that might be used to describe that practice
[17:39:23] hmm, ok.
[17:39:38] Maybe hatnote has done something like that.
[17:40:18] myself: There's http://wiki.polyfra.me/ which you might be able to point at a different wiki, not sure
[17:40:40] oooh. That's exactly what I'm looking for.
[17:41:06] No links to source
[17:41:20] Yeah. :/
[17:41:26] https://meta.wikimedia.org/wiki/Connectivity_Project was wonderful (died with Toolserver)
[17:41:27] myself: Email the creator at owen.cornec@gmail.com and tell him to give it to you :)
[17:42:12] Not very graphical though, IIRC. No idea how reusable the backend is
[17:44:21] lol, I clicked on the toollabs link on that page and Firefox wants me to download go.sh
[17:44:23] Hmm, there's WikiMindMap...
[17:47:11] that also looks unmaintained. Hmm.
[17:47:58] eyeplorer and viswiki are both defunct
[17:48:34] did I step into some rift where data viz goes to die? Am I gonna vanish off the internet after contemplating this idea too loudly?
[17:49:02] lol
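Most of the visualizers above are dead, but the underlying link data is easy to pull yourself, as marktraceur suggests. A rough PHP sketch (not an existing extension; the wiki URL and 50-page sample are arbitrary) that builds an adjacency list one could feed to graphviz or d3:

    <?php
    // Fetch a sample of pages and their outgoing wikilinks in one API call.
    $api = 'https://www.mediawiki.org/w/api.php';
    $params = [
        'action'    => 'query',
        'generator' => 'allpages',
        'gaplimit'  => 50,      // small sample; follow 'continue' for a full map
        'prop'      => 'links',
        'pllimit'   => 'max',
        'format'    => 'json',
    ];
    $data = json_decode( file_get_contents( $api . '?' . http_build_query( $params ) ), true );

    // Build page -> [linked pages] edges; most-connected nodes and orphans
    // fall out of the per-page counts.
    $edges = [];
    foreach ( $data['query']['pages'] ?? [] as $page ) {
        foreach ( $page['links'] ?? [] as $link ) {
            $edges[ $page['title'] ][] = $link['title'];
        }
    }
    print_r( $edges );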
[17:52:24] Vulpix: That should be everything that isn't a translation page
[17:55:14] Reedy: yes, I suppose. Normal /lang pages that are not managed by the translation extension should be updated, though, since they're normal pages
[17:55:24] Right
[17:55:33] But AWB won't, due to some probable API issue
[17:55:36] Gonna debug that now
[17:56:48] I think you can't edit subpages managed by the translate extension
[17:56:55] at least not from the normal web interface
[17:58:03] "See https://www.mediawiki.org/w/api.php for API usage"
[17:58:09] You can't via the API
[17:58:34] The unknownerror is annoying
[17:58:55] https://phabricator.wikimedia.org/T37654
[17:58:59] Oh look, I filed it already :P
[18:12:40] I have created message groups by running processMessageChanges.php. How can I find a list of all message groups?
[18:15:34] qgil: Around? Comments on https://phabricator.wikimedia.org/T91748 ?
[18:43:32] How easy is it to set up an MW installation on a pen drive?
[18:45:10] Caliburn: probably pretty easy. The hardest bit would be to get a web server on it, but I'm sure people make Apache packaged for USB key type things
[18:45:25] Caliburn: But... you may want to check out TiddlyWiki, which might be better for your use case
[18:47:10] I'd prefer an MW installation to experiment with extensions etc.
[18:47:28] Unless you can do so with your suggested program
[18:48:05] no, TiddlyWiki is an entirely separate program (it's a wiki optimized to be savable simply as an HTML file on your local disk)
[18:50:10] Would XAMPP do the job?
[18:51:28] probably, all you need is PHP and Apache. You don't even need MySQL since MediaWiki can work with SQLite
[18:51:48] Ok, I'll look into it, thanks :D
[18:56:55] how do I use a colon in a component name with the bugzilla extension?
[18:58:15] Niharika, sorry, I was in a work meeting and now I have family meetings (dinner etc). I will reply later tonight.
[18:58:26] Niharika, ... and keep up the very good work you are doing!
[18:58:31] qgil: No worries at all.
[18:58:34] Thank you. :)
[19:01:45] isifreek: use a \ to escape
[19:02:05] err wait,
[19:02:11] you shouldn't need to do anything at all
[19:02:15] bawolff: tried that :/
[19:02:29] isifreek: You're using this bugzilla extension? https://github.com/mozilla/mediawiki-bugzilla/blob/master/README.md#usage
[19:03:21] bawolff: i get this: String foo: bar is invalid using regex /^[\w]*$/
[19:03:42] bawolff: yes, i believe so
[19:03:54] Hmm, I guess they're specifically disallowing it
[19:04:17] I'd recommend filing a bug with the extension maintainer
[19:04:26] bawolff: the way I'm using it does not look like the examples in the github readme tho
[19:05:00] bawolff: more like: {{#bugzilla: component=.....}}
[19:05:21] hmm, I think maybe that's a different bugzilla extension
[19:05:49] bawolff: is that a "template"? could the template->extension be the issue?
[19:06:14] no, things starting with a # sign that look like templates aren't templates (they're parser functions)
[19:06:36] bawolff: do I have to have privilege to view that code?
[19:06:57] (if not, where should I look?)
[19:07:02] similarly though, I'd recommend filing a bug with whoever is maintaining the extension. The error message you cite makes it sound like they are specifically rejecting anything that's not a word character (\w). A colon isn't considered a word character
[19:07:16] isifreek: Special:Version on your wiki should have a link to the home page of the extension
[19:07:26] OOOH I see, thank you much!
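To illustrate bawolff's point: in PCRE, \w matches only letters, digits, and underscore, so a pattern anchored on [\w]* can never accept a colon. A minimal PHP check (the pattern is the one quoted in the error message above):

    <?php
    // \w is [A-Za-z0-9_], so ':' and ' ' fall outside the allowed set
    var_dump( (bool)preg_match( '/^[\w]*$/', 'foobar' ) );   // bool(true)
    var_dump( (bool)preg_match( '/^[\w]*$/', 'foo: bar' ) ); // bool(false)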
[19:08:05] bawolff: http://www.mediawiki.org/wiki/Extension:Bugzilla_Reports
[19:08:06] yup
[19:09:01] so you'd probably have the best luck complaining at https://code.google.com/p/bugzillareports/issues/list
[19:09:19] bawolff: oops, wrong error message (probably the same problem tho?): String Foo: BAR is invalid using regex /^[\w,@\.\s\*\/%!()+-]*$/
[19:09:39] bawolff: and that is not our fault, it's the extension's, right?
[19:11:04] yeah
[19:11:54] Although you can always try Foo%25 BAR
[19:12:12] depending on if the extension is crap or not, it might let that work (it really is not supposed to)
[19:12:30] i mean
[19:12:34] Foo%3A Bar
[19:12:36] bawolff: I tried %3A before, I'll try that one
[19:12:37] not %25
[19:12:41] oh lol
[19:12:54] * bawolff looked at the wrong line on my ASCII chart. %25 is the code for %
[20:05:53] Can anyone see if I have this problem sussed correctly? I have a wiki that's been hosted in a subdirectory with no problem. Now we've added a hostname and suddenly PHP and the database connection are breaking. Has anyone run into this before?
[20:07:19] a hostname to what?
[20:10:50] Pennth: could you give us exact error messages or elaborate on how it's breaking?
[20:11:10] made a domain that's a CNAME to the hosting server. If I use hosting://home/wiki it works. If I use newdomain://wiki it fails with "The requested URL /cgi-bin/php56-fpm/wiki/index.php was not found on this server"
[20:11:46] :// ?
[20:12:18] yeah, bad shorthand. Just /
[20:12:50] I'm feeling a bit punchdrunk
[20:13:31] did you add to/update your webserver config to match?
[20:15:16] the apache config was updated to listen to the new domain, but it's still listening to the old config as well.
[20:16:03] can you pastebin it?
[20:23:12] http://pastebin.com/f7wzgQ8G
[20:59:14] Hey guys.. I was getting an error when I tried to authenticate my email on a new mediawiki account (running mediawiki on vagrant): "class undefined: Net_SMTP".
[20:59:59] It was suggested that I update the git clone and then run "vagrant provision"
[21:00:04] which I did
[21:00:14] But it led to many failures
[21:00:31] You may have a look here: https://gist.github.com/anonymous/17bcbc5f0f6f06c628d6
[21:00:45] ankita-k: apt-get update timed out
[21:00:49] Can anyone tell me what's the problem here?
[21:01:05] That cascaded to lots of other dependency warnings
[21:01:29] Are you on a network where an HTTP proxy is needed to access the internet?
[21:02:12] We have seen this in such cases (eg school networks with strict proxy rules)
[21:02:30] Yes
[21:02:34] I am behind a proxy
[21:02:54] but I have the env variables set
[21:03:50] Can you do this: vagrant ssh -- sudo apt-get update
[21:04:16] The virtual machine needs to be able to talk to your proxy
[21:04:16] I shouldn't be able to. I am on a Mac and I use brew to install
[21:04:28] oh
[21:04:35] that would send the apt-get command into the VM
[21:04:39] yes
[21:04:43] right. of course.
[21:04:46] let me try that
[21:05:40] I'm guessing you will need to set up vagrant-proxyconf -- http://stackoverflow.com/questions/19872591/how-to-use-vagrant-in-a-proxy-enviroment
[21:06:36] ah.. okay
[21:06:44] Let me do that
[21:07:26] Also, I ran the command and it's just frozen
[21:07:31] No output
[21:07:43] *nod* you need the plugin then
[21:08:15] We should really get some docs for this on https://www.mediawiki.org/wiki/MediaWiki-Vagrant
[21:12:25] Hey bd808! https://gerrit.wikimedia.org/r/#/c/193680/ <--- This is up for review now.
[21:12:56] Yes. There does need to be some documentation. I have been wasting a lot of time on all the errors which were caused by this. :P
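The vagrant-proxyconf setup bd808 links to comes down to a few Vagrantfile lines. A minimal sketch, assuming the plugin is installed (vagrant plugin install vagrant-proxyconf) and a proxy at proxy.example.edu:8080 (both placeholders):

    Vagrant.configure("2") do |config|
      # vagrant-proxyconf forwards these into the guest (apt, env vars, etc.)
      config.proxy.http     = "http://proxy.example.edu:8080/"
      config.proxy.https    = "http://proxy.example.edu:8080/"
      config.proxy.no_proxy = "localhost,127.0.0.1"
    end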
[21:13:34] ankita-k: If you could add instructions (assuming you get it to work) that would be awesome. I don't have a network with proxy requirements to test from
[21:13:52] Alright. Let me do that.
[21:13:56] Niharika: Cool! Does it all work? ;)
[21:14:18] I am looking for desktop software where I can draft a wiki article and save the draft locally.
[21:14:39] svetlana: Notepad
[21:14:41] Like a text editor?
[21:14:45] With a button to preview it in a given wiki and to download a certain article's markup from the wiki to edit it.
[21:14:52] A bit like a text editor, yes.
[21:15:04] ah
[21:15:27] bd808: Yep! I triple-checked for all silly stupid bugs. I dunno if you'll be happy with how I am updating questions. I'm dropping the existing questions for a campaign and reinserting them.
[21:15:35] I think there are browser extensions that will replace an editbox in a webpage with a local text editor, would that do something like what you're looking for?
[21:16:07] "itsalltext" is one
[21:16:19] Niharika: That may cause strange problems. What if the change is just to fix a typo in the middle of an active campaign?
[21:17:52] Niharika: we can probably start there but will need some followup changes
[21:18:08] I'll review "soon"
[21:19:02] bd808: Right. Hmm. I could not think of how the question ID could be used here. And on the reviews page, I imagined displaying all the questions from a campaign sequentially, not involving the question ID anywhere.
[21:20:07] I'm taking input in array[], much like reviewers.
[21:20:09] Hence.
[21:22:28] Niharika: You can put the question id in the [] (eg name="question[7]") and PHP will pass that through to you. Then you have an associative array of id=>question
[21:23:02] Questions are probably going to end up needing their own edit page, but maybe not. Not sure
[21:24:19] bd808: Hmm. Okay. I'd ask you to review this one anyway. I'll make adjustments for the question ID in a follow-up patch, if that's okay.
[21:25:41] Or I could do those in this one itself.
[21:25:52] * Niharika must go to bed
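A minimal sketch of the name="question[7]" idea from the exchange above (updateQuestion() is a hypothetical helper for illustration, not code from the patch under review): PHP exposes bracketed input names as an associative array, so each question can be updated in place by id instead of dropped and reinserted:

    <?php
    // Hypothetical helper, illustration only:
    // e.g. UPDATE question SET text = ? WHERE id = ?
    function updateQuestion( int $id, string $text ): void {
        echo "would update question $id to: $text\n";
    }

    // Form side: <input name="question[7]">, <input name="question[12]">, ...
    // PHP parses the bracketed names into an id => text map:
    foreach ( $_POST['question'] ?? [] as $id => $text ) {
        updateQuestion( (int)$id, trim( $text ) );
    }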
[21:36:10] myself: yeah, good thing, itsalltext, but it's a bit of a different workflow to what I am looking for.
[21:47:33] hi
[21:47:39] what is the module ns about?
[21:48:03] It's for Lua modules
[21:48:14] Similar to the templates
[22:06:33] for more context, I would like to be able to edit the article again after a reboot, which may be quite a few clicks with the itsalltext approach
[22:11:06] svetlana: Something like...
[22:11:10] !e Drafts | svetlana
[22:11:10] svetlana: https://www.mediawiki.org/wiki/Extension:Drafts_
[22:11:59] marktraceur: I have no control over that wiki instance though.
[22:12:56] Well
[22:13:02] There's your problem right there
[22:13:05] svetlana: Which wiki?
[22:13:47] I draft things for many wikis, including wikimedia projects, mozilla wiki, and a couple of small wikis.
[22:14:17] svetlana: Well, I hate to say "Come back when you've got consensus on-wiki", but come back when you've got consensus on-wiki
[22:14:21] emacs's mediawiki.el is a bit close, but I don't know whether it saves the draft locally. It at least can, I think, preview any local wiki markup file. I am looking for other things like that.
[22:14:24] At least for Wikimedia wikis.
[22:14:36] I would be unlikely to get consensus on all these wikis.
[22:14:41] True!
[22:15:04] But if you did, then I could kick off my potential GSoC project for cleaning up E:Drafts and maybe deploying it on the WMF cluster.
[22:16:05] I don't follow that. What does that project involve?
[22:19:47] svetlana: Basically, on the edit page, you can save a draft without publishing it.
[22:20:01] svetlana: My evil plans include giving it the power to save automatically every X minutes
[22:21:30] I'll look at it. I would rather be able to work on the draft while I'm offline though.
[22:21:53] I'm unlikely to ask people to deploy an extension which, while useful, makes them more dependent on the web.
[22:23:44] svetlana: Fair enough, fetching things via the API and uploading edits is probably not too difficult either
[22:23:56] svetlana: e.g. the wiki/git bridge
[22:24:07] I think you can set it to only fetch certain articles but I'm not certain.
[22:40:37] Reedy: ah, lua, thx
[22:40:40] anyway, gn
[23:01:48] i am working on a video game wiki where we have many game items that are associated with an image... would it be appropriate to create a template with a bunch of IFs that would return the associated image per item name? maybe something like {{Image Lookup|Item Name}} would return itemname123.jpg?
[23:02:16] just wondering if someone has done this before and had headaches
[23:02:41] Why the complexity?
[23:03:04] because i hate passing in both a file name and an item name to an icon template
[23:03:23] i'd rather just pass in the item name and let the template figure out where the icon file is
[23:03:43] I suspect the nicer way of doing that would be to have consistently named images
[23:03:59] So you can deduce the file name from the item name
[23:04:20] hm... yeah, that's probably a far more robust solution
[23:04:35] does there happen to be a way to remove spaces from a string? that would simplify this process quite a bit
[23:05:01] parserfunctions maybe?
[23:05:01] errm
[23:05:04] !parserfunctions
[23:05:05] "Parser functions" are a way to extend the wiki syntax. ParserFunctions is an extension that provides the basic set of parser functions (you have to install it separately!). For help using parser functions, please see . For details about the extension, see .
[23:05:35] hmm, the string functions of parserfunctions definitely do
[23:05:41] https://www.mediawiki.org/wiki/Extension:StringFunctions#.23replace:
[23:05:51] i totally thought i looked for this once
[23:05:52] http://www.mediawiki.org/wiki/Help:Extension:ParserFunctions#Stripping_whitespace
[23:05:54] i must be blind
[23:05:58] thanks so much for the help
[23:06:36] :)
[23:06:47] have a great day everyone
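A sketch of the convention-based lookup template discussed at the end (the template, item, and file names are made up; it requires ParserFunctions with string functions enabled). Since parser-function arguments are whitespace-trimmed, the space being searched for is written as the HTML entity &#32;:

    <!-- Template:Item icon (hypothetical): derive the file name from the item name -->
    [[File:{{#replace:{{{1|}}}|&#32;|}}.jpg|32px|{{{1|}}}]]
    <!-- {{Item icon|Iron Sword}} → [[File:IronSword.jpg|32px|Iron Sword]] -->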