[01:36:33] Could somebody please help me diagnose what's wrong with my wiki
[01:36:54] http://www.trivictoria.org/w/index.php?title=Main_Page&useskin=vector
[01:37:02] This is not using the Vector skin
[01:46:17] it's showing me this: http://prntscr.com/6uobor
[01:46:40] Exactly what I see
[01:46:42] Any thoughts?
[01:48:35] Hi - if I create a FauxRequest from a posted request, do I process it using a new ApiMain object? Thanks very much for any answers
[01:48:39] maybe your Vector skin is corrupted
[01:48:47] download it again
[01:48:51] Ok
[01:48:52] Thanks
[01:51:32] Hmm, redownloading doesn't seem to help
[01:51:39] I am using git clone
[01:51:45] Should I be using something else?
[01:53:47] or perhaps a DerivativeRequest, not a FauxRequest, because what I want to do is replace all processing of the current request with a different one
[01:54:47] mmm
[01:54:53] I guess I want to preserve the context from the current request
[01:54:56] download the rar/zip/7z
[01:55:07] so a DerivativeRequest seems more like it
[01:55:25] you need an extension also
[01:55:50] www.mediawiki.org/wiki/Extension:Vector, and www.mediawiki.org/wiki/Skin:Vector/es
[01:56:00] I'm sorry if the links are in Spanish, I'm from Chile
[01:56:11] www.mediawiki.org/wiki/Extension:Vector
[01:56:15] that's the first link
[01:58:55] Thanks
[01:58:58] That's fine
[01:59:37] The zip worked
[01:59:58] great
[05:06:05] o/
[05:06:15] I'm getting a "Fatal exception of type MWException" error when I try to upload a file
[05:06:17] :s
[05:12:10] aelevadoan, WMF projects?
[05:12:23] MaxSem: what?
[05:12:41] so on your own wiki?
[05:13:09] yes
[05:13:53] !debug
[05:13:53] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[05:24:29] tuxick: why would it, it's not its job to do it
[06:41:31] Nikerabbit: in the end I do get the option to select a language, but only for editing and the user interface; how am I supposed to see the translated pages?
[06:41:50] "language" is a bit ambiguous here
[06:43:14] tuxick: that's called content language, not interface language
[06:44:16] tuxick: assuming you have installed Translate and are using the language code subpages naming pattern, you can set wgTranslatePageTranslationULS https://www.mediawiki.org/wiki/Help:Extension:Translate/Configuration#Page_translation_feature
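A minimal LocalSettings.php sketch of that suggestion, assuming the Translate and UniversalLanguageSelector extensions are installed and translations live on language-code subpages; the loading style varies by MediaWiki version, and the behaviour described in the comment is my reading of the linked Help page, not something stated in the chat:

    # LocalSettings.php (sketch)
    require_once "$IP/extensions/Translate/Translate.php"; # newer MediaWiki: wfLoadExtension( 'Translate' );
    $wgTranslatePageTranslationULS = true; # use the reader's ULS language preference when serving translatable pages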
[06:46:34] dunno about "language code subpages naming", I adopted a database :)
[06:46:55] I know I can access a translated page by prepending Fr:
[06:47:10] sheesh, sorry, on "fast mobile internet"
[06:56:40] no idea what Translate offers to help actually showing those pages
[06:58:17] "something" is showing me "available languages" and "missing languages"
[06:58:21] but both are empty
[06:58:43] I'm expecting to see this "Other languages: " thing
[07:01:45] tuxick: so if the pages are translated using language code prefixes, then they have not used the Translate extension
[07:01:55] they probably have their own templates then, which seem to be broken
[07:05:00] oh great
[07:05:53] in that case I really hope there's a backup of the original site somewhere
[07:07:25] I don't know enough of the internals to fix that
[07:08:11] thanks anyway
[08:28:42] How do I even rollback, or at least undo, changes to translations at mw.o?
[08:28:48] the Translate extension confuses me
[08:46:34] what can I use to clone a public wiki to my computer?
[09:15:05] Question please - setting up a DerivativeRequest, wanting to change the target page (not URL) to a different one -- I can't see how DerivativeRequest supports this
[09:37:43] tuxick: how does a backup help? Either way, you should really move to using the language subpages, like Page, Page/fr, Page/it, which are supported by core. Is the wiki public?
[09:38:23] Nemo_bis: I only got the database dump
[09:38:43] Nemo_bis: wiki.davical.org
[09:48:43] tuxick: ok, you have only 60 or so pages to move, a matter of 10 minutes' work. :) http://davical.dhits.nl/index.php/Special:AllPages
[09:49:46] Nemo_bis: yes, once I know what I'm doing :)
[10:04:08] Nemo_bis: or better yet: make the French understand you can't work in IT if you don't master English :)
[10:04:27] tuxick: I prefer translation :)
[10:14:15] <3 translation...
[10:16:35] dunno, the first thing I do with any new device is switch it to English
[10:16:50] the Dutch interface confuses me too much
[10:18:12] meanwhile my new laptop got fucked up :/
[10:18:44] bought a new thermos flask, and it leaked
[10:18:52] now the screen has a panther design
[10:19:15] I think this will not fall under warranty or onsite support
[10:19:50] oops, sorry, wrong window :)
[15:02:46] Hello all
[15:04:52] does anybody know of MediaWiki functions that let you easily create a new page, with content? For example, if I wanted a page located at 'wikiroot/newpage', with content "This is a new page and I've created it automatically"
[15:06:09] blabliblo: there are various ways to do that... is the idea to create a whole bunch of pages at once, or to do it "on demand"?
[15:10:59] Hello Yaron
[15:12:20] On demand. Basically I'm working on an extension in which users can save content from a special page to a new wiki page
[15:13:02] the new wiki page will have a URL of the form wikiroot/user/time, and the data will be sent to this URL via an HTML POST form
[15:13:32] I've managed to grab the data on this new form... but I don't know how to make MediaWiki auto-create this page, and auto-insert this data on it
[15:13:55] The data is just a long string
[15:15:35] I've looked at transforming the string into XML and importing it with importDump.php, but isn't there an easy way to do this internally?
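For the "on demand" case there is an in-process route that avoids the XML export/import round trip. The chat never names it, but one standard way in this era of MediaWiki (1.21+, with the ContentHandler API) is WikiPage::doEditContent; the page name, content variable, and edit summary below are made-up placeholders, not the asker's actual code:

    // Programmatically create a wiki page with content: a sketch under the
    // assumptions above, run from inside an extension.
    $title = Title::newFromText( 'UserPages/Example/20150413' ); // hypothetical user/time page name
    $page = WikiPage::factory( $title );
    $content = ContentHandler::makeContent( $longString, $title ); // $longString = the posted data
    $status = $page->doEditContent( $content, 'Page created automatically by extension' );
    if ( !$status->isOK() ) {
        wfDebugLog( 'myextension', $status->getWikiText() ); // handle or log the failure
    }

For creating a whole bunch of pages at once, the edit API (action=edit) invoked internally, as discussed later in this log, is another option.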
[15:17:23] .. Yaron has quit... I'm not asking for a complete solution, just for someone to point me in the right direction... documentation...
[15:55:00] second attempt. Can somebody point me in the right direction as to how to create MediaWiki pages automatically? Documentation... books... guides... extensions... any information is welcome :]
[15:58:14] Versioned ResourceLoader modules... Having trouble understanding version-based invalidation. How do I indicate I want my module versioned? Is it purely automatic? How can I forcibly invalidate?
[17:43:28] widgets are neat
[20:18:46] bit of help please. I am in a special page, have a context, and I create a DerivativeContext, all OK, but I don't know how to specify the URL to GET (as I call ApiMain with the DerivativeContext)... any hint appreciated!
[20:20:01] in other words, how do I set/change a request's GET URL? I do not want to modify the server part of the path, I want to modify the pagename to GET
[20:23:40] do you want to make an API call hypergrove?
[20:23:46] is it just a matter of specifying a 'title' in the parameters array?
[20:24:16] I think you might want a DerivativeRequest
[20:25:54] Krenair, thx loads. yes, that is what I've created, but I don't know how to set the page title to 'get'!
[20:26:18] have you seen https://www.mediawiki.org/wiki/API:Calling_internally ?
[20:26:31] I understand that I need to call ApiMain with the DerivativeRequest to have it processed correctly
[20:27:28] yes, I have followed that as best as I can -- it doesn't contain a hint about setting the page title to 'get'
[20:30:56] I note that there's a DerivativeContext AND a DerivativeRequest. (A FauxRequest, I understand, is used when there is no context.) So a DerivativeRequest somehow gets ahold of a context from the 'fallback' request
[20:32:14] what I am trying to ask is this: to change the page title associated with a given request, is it true I need to change the page title within the context associated with the request? If so, then how does FauxRequest operate without a context?
[20:33:14] oh, just found a method, WebRequest::extractTitle, that could be useful
[20:35:48] hypergrove, ... would setting a 'title' parameter to send not work?
[20:35:57] I don't understand what problem you're trying to solve.
[20:37:00] ok, I'll try setting a 'title' param value
[20:58:26] wgRequest->getFullRequestURL works, but derivative->getFullRequestURL causes the wiki to hang
[20:59:04] this is with the title param specified...
[21:03:06] I am unhappy that derivative->getFullRequestURL() hangs the system.
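Here is roughly what the API:Calling_internally pattern looks like with the suggested 'title' parameter folded in. The action/prop/titles values are illustrative only, and $this->getRequest() assumes code running inside a ContextSource such as a SpecialPage; on older MediaWiki the last line would be $api->getResult()->getData():

    // Internal API call through a DerivativeRequest: the target page travels as an
    // ordinary API parameter, not as part of a rewritten request URL.
    $derivativeRequest = new DerivativeRequest(
        $this->getRequest(), // fall back to the current request for session, IP, etc.
        array(
            'action' => 'query',
            'prop'   => 'info',
            'titles' => 'Some other page' // illustrative target page
        )
    );
    $api = new ApiMain( $derivativeRequest );
    $api->execute();
    $data = $api->getResult()->getResultData();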
[21:33:01] hello
[21:34:01] I was thinking of setting up a MediaWiki server & wanted expert advice
[21:34:48] What Linux distro is most popular as a host? Any idea?
[21:35:21] I'd assume something like Red Hat, but that's just a guess.
[21:35:46] Anyone know what the Wiki folks run on?
[21:35:54] Wikimedia use Ubuntu
[21:37:10] <_`_> use whatever you feel most comfortable with for this use case.
[21:37:15] Just straight old Ubuntu Server?
[21:39:14] There are about 298 distros, I have done 30+ of them, but I'd prefer to use what is mainly used on professional wiki setups, as there is probably a good reason behind such a choice.
[21:41:17] Any gotchas with Ubuntu and MediaWiki?
[21:43:39] when in doubt, just run an unpacked MediaWiki tarball or git checkout
[21:46:19] good idea, just reading the doc.
[21:48:44] Is there any real benefit to running Instiki compared with the full-fat version?
[21:51:19] Instiki?
[21:52:17] http://en.wikipedia.org/wiki/Instiki
[21:53:12] Soso, running MediaWiki on Ubuntu just fine
[21:53:43] no gotchas, Nebraskka?
[21:53:54] didn't have any
[21:54:11] I guess MediaWiki would feel good enough on any PHP/Apache infrastructure
[21:54:27] nginx as a frontend is an option
[21:54:58] I don't mind technical hassle but prefer not to hit an unforeseen brick wall; always good to ask first.
[21:55:15] good idea about nginx, hadn't thought of that
[21:55:33] nice attitude :) in my experience, Ubuntu Server was enough, as I'm running it on all my projects, even a clustered one
[21:55:48] but it's a personal choice and the ability to operate one; some people like other distros
[21:55:58] and know about their gotchas or specific details
[21:56:19] what about replicating?
[21:56:19] in the case of a MediaWiki installation, I think it must be quite trivial
[21:56:37] SQL or file?
[21:56:46] suppose I am paranoid & want a second server as backup
[21:57:08] is that difficult with a standard install?
[21:57:27] basically I don't trust hardware
[21:57:36] I think periodic SQL dumps and rsnapshot (or bare rsync) would be enough as a start :)
[21:57:37] got plenty of kit
[21:57:53] ahh, just SQL dumps
[21:58:15] but if you would need online failover and availability of both nodes at the same time, try MariaDB Galera (it supports only InnoDB, so the search index must be handled separately, e.g. in Elasticsearch)
[21:58:38] rsnapshot of the wiki app folder*
[21:58:48] spot on, that's where my thinking is going
[21:59:08] just one, with a live (or semi-live) backup
[21:59:41] a live backup could be done as a MySQL slave installation, so it would replicate the master MySQL all the time
[21:59:55] I don't want to create a server, do all of the work, get the content, then have a PSU blow & suffer downtime
[22:00:04] understandable
[22:00:26] so Maria's a better bet?
[22:01:17] I am unhappy that derivative->getFullRequestURL() hangs the system.
[22:01:46] I guess I have to trace that problem too
[22:02:06] Soso, IMHO it's more actively developed than MySQL and has a variety of performance advantages. It's a fork of MySQL that appeared right after MySQL was bought by Oracle, so the original developer of MySQL is now working on MariaDB
[22:02:56] but again, both must be good enough as a start :) it's more a personal choice based on one's own experience
[22:03:03] indeed Nebraskka, so MediaWiki is not too fussy about which SQL DB you use?
[22:03:23] I think yes
[22:03:52] OK! what you say makes plenty of sense :)
[22:04:01] at least I had no problems with either =)
[22:04:16] wish you the best then, have a productive week
[22:04:33] thank you very much Nebraskka :)
[22:09:36] ok, derivative->getFullRequestURL() hangs the system because the request URL is not set... how do I set that?
[22:15:16] any hint about how to set the requestURL in a DerivativeRequest would be appreciated
[22:21:29] oh, public FauxRequest::setRequestURL exists
[22:24:49] hypergrove: why is it not set?
[22:32:14] saper, thx for commenting. I'm not sure yet
[22:33:27] but what I suspect is that new data values can be used to construct any FauxRequest, however FauxRequest (and DerivativeRequest, a subclass) makes no effort to change its URL for the new values;... I think that's a bug
[22:35:05] so I suspect I need to call setRequestURL($my_own_constructed_url) in addition to constructing the DerivativeRequest with an array of the same values I have to encode for the setRequestURL
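The workaround being floated there, sketched out. wfScript() and wfAppendQuery() are standard core helpers used here on my own initiative; whether this is the intended use of FauxRequest::setRequestURL is exactly the open question in the chat:

    // Hedged sketch: build the DerivativeRequest from the parameter array, then set a
    // matching URL by hand so getFullRequestURL() no longer fails on an unset URL.
    $params = array( 'action' => 'query', 'titles' => 'Some other page' ); // illustrative values
    $derivative = new DerivativeRequest( $this->getRequest(), $params );
    $derivative->setRequestURL( wfAppendQuery( wfScript( 'api' ), $params ) ); // e.g. /w/api.php?action=query&titles=...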
[22:36:33] it's not a breaking bug of course, but I am puzzled that no one has wanted to process a FauxRequest that has a different request URL than wgRequest
[22:37:07] but I need to check a few more things to be sure I am correct in this assessment...