[00:17:29] Well I found out it's because nginx is doing a rewrite to a proxy_pass and that doesn't contain the original url. So I just commented out the MW code for now until I can figure out how to fix that or redo the backend to not use proxy_pass [00:17:57] I had a similar problem with nginx [00:18:09] only with special characters, though [00:18:17] nginx wanted a normal form [00:18:21] and mediawiki the other :/ [00:20:59] Well mine is because I use pretty urls, nginx rewrites them and then they go through proxy_pass to the backend httpd servers. The original url is not preserved by nginx anywhere it seems and this causes the failure as MW 1.27 appears to try to normalize urls [00:21:58] Because of the rewrite match, nginx stops processing the try_files where I used @mwwrewrite and restarts its processing to find the .php match and that is why the original url is lost [00:22:47] I don't know how you configured nginx, but it's certainly possible to do that [00:22:52] !shorturl [00:22:52] To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at or try the new beta tool at . There are instructions for most different webserver setups. If you have problems getting the rewrite rules to work, see !rewriteproblem [00:23:28] did you look at https://www.mediawiki.org/wiki/Manual:Short_URL/Nginx ? [00:25:52] Platonides: http://pastebin.com/cC1GiBN6 [00:26:00] Relevant parts of the configs [00:54:23] So if $_SERVER['REQUEST_URI'] exists, $wgUsePathInfo is never used.. Makes sense... not really. [00:54:51] you could probably set some vars manually [00:56:06] Thinking that's what I am going to have to do. pass some X-vars and have mw trust those.
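A sketch of the idea being discussed (passing the original URL through in an X- header): this is an illustration only, not the poster's actual config — the location names, backend address, and header name are placeholders. The useful property is that nginx's $request_uri variable keeps the URI from the original request line even after an internal rewrite, so it can be forwarded across proxy_pass:

```nginx
# Sketch only: names and addresses are placeholders.
location / {
    try_files $uri $uri/ @mwrewrite;
}

location @mwrewrite {
    # Internal rewrite: from here on, $uri points at index.php...
    rewrite ^/wiki/(.*)$ /w/index.php?title=$1 last;
}

location ~ \.php$ {
    # ...but $request_uri still holds what the client originally asked for.
    proxy_set_header X-Original-URI $request_uri;
    proxy_pass http://backend_httpd;
}
```

The backend can then reconstruct $_SERVER['REQUEST_URI'] from the forwarded header before MediaWiki tries to normalize anything.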
in a way [00:57:40] But the fact if you have a request_uri set t hat pathinfo is completely ignored, doesn't make it easier, as otherwise I could have just put the proper information in those [01:17:51] For shorturls I typically just use http://shorturls.redwerks.org/ [01:17:55] d0ne [01:25:31] Anyone know if there's a way to cache content and have it be 'attached' to the page in such a way that action=purge also removes it? I can't find how mediawiki internally handles its caches, wether they're meant to expire or if you can make them persistent [01:26:02] Persistent meaning they'l stay until either my extension forces an update, action=purge is called, or the page is deleted [01:28:42] Basically, avoiding garbage data from being in a cache forever [01:33:44] CZauX_: MediaWiki essentially does that. [01:33:58] In order to render a page, MediaWiki has to know, for example, whether links should be blue or red. [01:34:13] If a page is created, red links need to turn blue. If a page is deleted, blue links need to turn red. [01:34:19] Same with template updates. [01:34:24] MediaWiki relies on a job queue to do this. [01:34:27] !jobqueue [01:34:27] The Job Queue is a way for mediawiki to run large update jobs in the background. See http://www.mediawiki.org/wiki/Manual:Job_queue [01:35:29] What I'm wondering though is if its possible to create a cache key that mediawiki can then handle if a page is deleted or a purge is required [01:35:57] Rather than my extension hooking into everything [03:34:50] Weird question. Mediawiki templates won't migrate CSS will they? IE: If I have an inline piece of CSS in a template, and a page calls that template, the CSS on the template page won't fill in right> [03:44:16] ... seems like it does, sweet action! [03:46:52] And the piece of CSS I am wanting to use (automatic image resizing based on the field/screen size) automatically snags normal mediawiki image links. 
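A sketch of the purge-aware cache CZauX_ was asking about, under one assumption: that keying on the page's touched timestamp is acceptable. action=purge (like deletion and template updates) bumps page_touched, so stale entries simply stop being referenced and age out of the backing store. MyExtension::computeExpensiveData is a hypothetical helper; the cache calls are from MW 1.27-era core:

```php
// Assumption-laden sketch for a MediaWiki 1.27 extension.
$cache = ObjectCache::getMainWANInstance();
// Including getTouched() in the key means a purge/delete/template update
// invalidates the entry implicitly: the key changes and the old value expires.
$key = $cache->makeKey( 'myext-pagedata', $page->getId(), $page->getTouched() );
$data = $cache->getWithSetCallback( $key, $cache::TTL_WEEK, function () use ( $page ) {
	return MyExtension::computeExpensiveData( $page ); // hypothetical helper
} );
```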
:D [07:23:18] anyone have any suggestions for a good mediawiki host service? I've been eyeballing mywikis.com for a little bit, but I don't really have a point of reference to judge them on [07:46:14] Hello guys, i'm having a pain setting up visualeditor on an nginx hosted private mediawiki (ssl-exclusive). i've detailed more about it here, https://www.mediawiki.org/wiki/Topic:Tcemtxt6obi8mtuw can anybody share some insight ? i'm on this for hours :( [07:59:46] Do you have the PHP Curl extension enabled? [08:03:21] umm i didnt [08:03:26] i do now, let's test out [08:11:17] oh boy that fixed it. unbelievable :( [08:12:25] very nice, would have loved to read about this in the docs :/ anyway, some pages only show the edit source button, can i fix that issue as well ? [08:28:04] hello [09:18:19] hexmode: on my way to mw stakeholder meeting, will be 10 mins late [09:18:53] hexmode: where can I find you guys? [11:04:36] i managed to change that boring /w/index.php?title=Main_Page into /wiki/Main_Page :D [11:05:00] thanks to that site: http://shorturls.redwerks.org/ [11:05:01] hehehe [11:05:05] lunch time for me [11:33:38] blah, i still don't know why i keep getting broken links... [11:34:15] especially on the images. [11:34:36] for example, here: http://www.allaze-eroler.com/wiki/Template:Ambox [11:43:48] hummm looks like i have some broken code in my ambox [11:44:09] Meneth: are you here? [11:46:18] Meneth: houston, we got a problem with the ambox system... [12:00:50] ok, still one icon to fix! [12:01:26] but i wonder where that icon comes from: http://central.paradoxwikis.com/images/thumb/b/b4/Ambox_warning_blue.png/40px-Ambox_warning_blue.png [12:09:06] hi [12:09:39] short question: can I create users in mediawiki with different rights for different topics? [12:11:34] i think it's possible but i have no idea.. oh i know how!
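For the record, the VisualEditor fix above (enabling PHP's curl extension) usually comes down to something like the following on a Debian-family server. The package and service names are assumptions that vary by distro and PHP version, so treat them as placeholders:

```
# Debian/Ubuntu sketch; package/service names vary (php5-curl, php7.0-curl, ...)
sudo apt-get install php5-curl
sudo service php5-fpm restart   # or restart apache2, depending on the setup
```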
[12:12:00] first, create the user group with the special rights you want to set up [12:12:27] secondly, go to the topics you want to restrict to certain users, and set them up as protected [12:12:29] some pages do have information that should not be readable by all. [12:12:39] with a special user group [12:13:04] that is possible by hiding it with an option, i don't know where [12:14:42] hmm ok [12:15:43] go to yoursite.com/wiki/Special:SpecialPages [12:16:04] and you might find a section related to user management [12:16:21] you might find a way to create a specific group [12:24:46] all fixed now! [12:24:56] it was just an error in a file name apparently [15:51:02] is there a way to call mediawiki:ambox.css? [15:51:12] call? [15:51:35] i mean, i know how to call common.css but how to call the ambox.css i created? [15:52:03] so, i could use a special class for styling the bow [15:52:08] box* [15:52:30] http://www.allaze-eroler.com/wiki/MediaWiki:Ambox.css i created this [15:54:18] similar to for HTML [15:56:11] Usually, you'd have it loaded on an RL module [15:56:26] RL module? [15:56:31] resourceloader [15:56:38] i see [15:58:10] i think i found a solution: http://matthewjamestaylor.com/blog/adding-css-to-html-with-link-embed-inline-and-import [15:59:36] i just added that @import "newstyles.css"; in the common.css [15:59:48] Try it? [15:59:52] I'm not sure if it'll work [16:01:25] i'm about to test it out [16:23:37] http://www.cssnewbie.com/css-import-rule/#.V-vuZnZH2Ht i'm reading this [16:24:27] The arbitrary path stuff won't work so well for MW [16:24:42] There's a way to do it... I think [16:24:46] Not seen it recently [16:25:05] do you know where the Mediawiki:Ambox.css is stored, for example? [16:25:08] hummm i see [16:25:40] if we had that @import feature for common.css, it would save a lot of headache [16:26:31] It's stored in the database, it doesn't exist as a physical file [16:26:38] oh [16:26:43] that explains why!
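A caveat on the advice above: page protection in MediaWiki only restricts editing, not reading. Group-based read restriction is done in LocalSettings.php, and per-namespace rules need the Lockdown extension; MediaWiki's own documentation warns that read restrictions are not watertight (transclusion, search, etc. can leak content). A minimal sketch, with placeholder page and group names:

```php
// Require login to read, except for a whitelist of pages.
$wgGroupPermissions['*']['read'] = false;
$wgWhitelistRead = [ 'Main Page', 'Special:UserLogin' ];

// Per-namespace read restriction via the Lockdown extension (if installed):
// $wgNamespacePermissionLockdown[NS_PROJECT]['read'] = [ 'sysop' ];
```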
[16:26:47] I think there is a way [16:26:53] Just need to ask the right person.. [16:27:14] RoanKattouw: About? Do you remember how to include another CSS page in MW? [16:27:48] There's no good way to do that [16:27:57] Other than @import with a full URL [16:28:55] that would be like: @import url("http://www.allaze-eroler.com/wiki/MediaWiki:Common.css") [16:29:06] right? [16:31:40] hum [16:32:03] doesn't work like i hoped.. [16:32:37] No [16:32:42] you need an ?action=raw IIRC [16:32:53] oh [16:33:39] usually, there are some extra parameters put on it too [16:34:44] like this? http://www.allaze-eroler.com/wiki/Mediawiki:Infobox.css?action=raw [16:35:09] Might want to try http://www.allaze-eroler.com/w/index.php?titleMediawiki:Infobox.css&action=raw [16:35:18] http://www.allaze-eroler.com/w/index.php?title=Mediawiki:Infobox.css&action=raw [16:38:31] doesn't work... [16:40:21] doesn't work :/ [16:43:38] guess i will give up then... [16:49:21] there might be a solution to that :/ [16:49:56] i noticed it had a message saying i have to use instead. is there a hint? [16:56:10] that is what i get as message: Info: @import prevents parallel downloads, use instead. [16:59:38] i will forget that for the moment then... [17:22:33] it's a shame that i can't use the @import because it was a good idea [17:22:58] There was a way for JS [17:23:13] js for javascript? [17:23:32] humm i guess i have to add these in common.js? [17:23:42] I don't think just including it in JS would work [17:23:51] fair point [17:24:01] i would rather not do it [17:24:02] You could create a correct RL module definition... And add it to every page in LocalSettings [17:24:06] There's gotta be a better way [17:24:19] because js is known to get easily hacked [17:24:25] true that! [17:29:25] i guess we could add a little feature that lets people use different css files [17:30:37] I can't think how this is usually gotten around [17:43:20] Heh. Love ghosting whoever keeps stealing my nick.
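What the chat never quite lands on: @import of a wiki CSS page generally needs both action=raw and a ctype=text/css parameter, because action=raw alone is served with a non-CSS MIME type that browsers refuse to apply as a stylesheet. A sketch using the page discussed above (untested against that particular wiki):

```css
/* In MediaWiki:Common.css — pull in another on-wiki stylesheet.
   ctype=text/css makes MediaWiki serve the raw page as CSS. */
@import url("/w/index.php?title=MediaWiki:Ambox.css&action=raw&ctype=text/css");
```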
Is it at all possible these days to enable the HitCount extension without going back to MW1.25? Now that I'm not caching in a way I don't understand I'd like to re-enable page view statistics [17:43:52] It's broken? [17:44:16] Reedy: Well I just installed the extension and ran update.php and the hit_counter table is empty still [17:45:05] hexmode: ^^ [17:45:53] I'm totally willing to admit fault for installing it wrong, but as far as I can determine, it's plop the hitcounter directory inside extensions, add the wfLoadExtension call, run update.php, pray that doesn't make my database break, and then it should go [17:46:12] the table also *exists* however it is empty [17:47:37] https://www.mediawiki.org/wiki/Extension:HitCounters [17:47:44] Currently it is only possible to migrate wikis using MW 1.25 to use this extension. A fresh install in MW 1.26 and higher is not possible! Attention: If you do not act according to the following instructions, an update to MediaWiki 1.26 or newer can permanently delete your hitcounter numbers! See task T120216. [17:47:45] T120216: Installation does not work for MW 1.26+ due to removal of hitcounter table - https://phabricator.wikimedia.org/T120216 [17:48:29] I don't want to migrate, the numbers I had before were all bogus, I'd assumed I can install the extension and have it begin working with 1.27 if I didn't care about my old numbers getting baleeted [17:48:38] anyone already tried to install that visual editor? 
[17:48:50] allaze-eroler: I've done it, it's pretty swish [17:48:57] i see [17:49:11] i noticed there are some files that need to be installed [17:49:35] gotta install the extension as well as parsoid [17:49:43] parsoid's pretty painless if you're using ubuntu 14.04 [17:50:03] i see [17:50:31] since i planned to install visual editor in my online wiki but it's on a shared server [17:51:25] besides, the parsoid extension is no longer supported [17:51:42] because it says "unmaintained" [17:52:04] it's in the documentation, parsoid is separate from your webserver, it's an entirely different service [17:52:22] ah i see [17:52:49] Reedy: So what you're telling me is I can't get page view stats without downgrading once, installing, and then upgrading again? [17:53:01] like, will not function otherwise? [17:53:14] Sounds like bollocks to me [17:53:26] Manually create the table [17:53:52] another problem is that my wiki is on version 1.27 [17:54:07] I have VE running in 1.27, works like a charm :D [17:54:18] Reedy: Care to share your create table statement? I wanna make sure mine matches [17:54:25] I don't have one [17:54:36] Just have a look at the sql files in the extension? [17:54:46] And for hit_counter... Just look at the tables.sql for an older version of MW? [17:54:47] fair enough. Because I had both tables described and they remained tragically empty [17:55:12] TBH, it sounds like yours is vaguely set up [17:55:18] The extensions could be broken [17:55:22] I mean, worst comes to worst I'll just downgrade and upgrade at like, 2am my time. [17:55:39] That sounds absolutely pointless and a waste of time [17:56:30] That's a fair description of my job most of the time I think [17:59:16] i see, all i do is just add the parsoid extension and it will work? [17:59:23] i'm running on debian [17:59:46] no, the parsoid service. [17:59:59] allaze-eroler: https://www.mediawiki.org/wiki/Parsoid/Setup [18:00:18] Once Parsoid is up and running you're good to go.
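For reference, the MW 1.27-era LocalSettings.php wiring between VisualEditor and a running Parsoid service looks roughly like this; the URL and domain values are placeholders and must match the Parsoid service's own configuration:

```php
wfLoadExtension( 'VisualEditor' );
$wgDefaultUserOptions['visualeditor-enable'] = 1;
$wgVirtualRestConfig['modules']['parsoid'] = [
	'url' => 'http://localhost:8000', // where the Parsoid service listens
	'domain' => 'localhost',          // must match Parsoid's localsettings.js
];
```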
[18:00:56] i tried that one, it failed to work because it hasn't been updated [18:01:02] as this message said: Parsoid switched package repositories on 2015-09-17. If you installed Parsoid prior to this date, you will need to follow the instructions below to add the new Parsoid repository to get updated packages. If you are installing Parsoid as a new package, follow the instructions below [18:03:28] So follow the instructions.. [18:03:49] i did [18:04:12] i said that page hasn't been updated because it failed to work [18:04:42] i got an error with it [18:05:36] allaze-eroler: you might have better luck in the parsoid IRC channel then bud [18:05:52] :| [18:06:28] another reason why i gave up on it and will wait for the next version of visual editor since it's planned for 1.28 [18:07:58] best of luck then! (you'll still need parsoid) [18:08:45] i know [18:09:17] what i noticed is there is only one file that is missing: localsettings.js [18:11:25] Reedy: Ah! A clue. Can you direct me to where I can find the bloody sql files for mw-1.25?
Specifically I think I need to change my page table [18:11:38] maintenance/tables.sql [18:12:01] https://github.com/wikimedia/mediawiki/blob/REL1_25/maintenance/tables.sql [18:12:06] You shouldn't need to re-add anything [18:12:49] I found an update.php file within the hitcounter extension that I ran, and it's saying I'm missing the page_counter field [18:13:02] which is presumably fairly important to keeping track of page views [18:14:14] Thanks for showing me where that was, I'll get back to plugging away [18:14:42] Ulfr: it shouldn't matter [18:14:47] It was removed from core MW [18:14:53] the extension should be adding it to its own table [18:15:04] CREATE TABLE /*_*/hit_counter ( [18:15:04] page_id INT(8) UNSIGNED NOT NULL PRIMARY KEY AUTO_INCREMENT, [18:15:05] page_counter BIGINT(20) UNSIGNED NOT NULL DEFAULT '0' [18:15:05] ) /*$wgDBTableOptions*/; [18:15:05] CREATE INDEX /*i*/page_counter ON /*_*/hit_counter (page_counter); [18:15:17] yeah, that's already a thing in my database [18:16:03] It's just very empty. [18:16:33] So, the code tries to do [18:16:34] INSERT INTO /*_*/hit_counter (page_id, page_counter) SELECT /*_*/page.page_id, /*_*/page.page_counter FROM /*_*/page; [18:16:43] I'd modify it... [18:16:52] INSERT INTO /*_*/hit_counter (page_id, page_counter) SELECT /*_*/page.page_id, 0 FROM /*_*/page; [18:16:53] Or something [18:16:57] To re-initialise it [18:17:04] Reedy: You're a mad genius. [18:17:54] Ulfr: Can I start invoicing you yet? [18:18:25] Reedy: Depends, I offer payment in heartfelt thanks and/or beverages [18:18:40] beverages work [18:20:00] Reedy: Let me know what you want and how you'd like it shipped xD [18:20:13] Will coffee arrive hot? [18:21:08] Hm. Probably not. I was more thinking of the sort that came in a bottle and could be refrigerated after arrival.
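Putting Reedy's suggestion together, a re-initialisation for a fresh HitCounters install (where the old page.page_counter column no longer exists in MW 1.26+) might look like this when run directly against the wiki database; the table prefix and $wgDBTableOptions placeholders are omitted here, and backing up first is assumed:

```sql
-- Seed one zeroed row per existing page instead of copying the
-- (removed) page.page_counter column.
INSERT INTO hit_counter (page_id, page_counter)
SELECT page.page_id, 0 FROM page;
```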
I'm reasonably sure coffee would arrive at room temperature [18:21:25] Hahaha [18:22:15] as i thought: when i tried to follow the instructions for my webserver, it failed because it's only in read mod [18:22:19] mode* [18:23:45] i have to run these command lines from ssh, right? [18:24:14] which i did [18:24:17] but it didn't work [18:25:50] Reedy: Well, fiddlesticks. It's still being hateful. I'm so very glad y'all obfuscated extension stuff so it's much trickier to figure out what's calling who where [18:26:05] What? [18:26:34] Only the entry points have really been "obfuscated" [18:27:42] Exactly! I dunno why it keeps failing when it tries to populate the hit_counters table :( there's a bloody statement in there without me mucking with it that assigns a default value of 0 [18:28:02] But nooo, an irrelevant field doesn't exist and so my table won't get populated [18:30:43] hi andre__ [18:30:52] how do I create a blocking task in phab, the option is gone? [18:31:04] now there are some new features like "story points" [18:31:23] or parent / subtask [18:31:38] I don't really want to make this "parent", it's more or less preventing some other task from being implemented [18:32:31] petan hi, it is now under Edit related tasks [18:32:53] paladox: I am aware of that menu but still I don't see "blocking tasks" in there [18:33:01] only subtask, or parent task [18:33:08] Yeh [18:33:11] those are it [18:33:13] what would a blocking task be, parent or subtask? [18:33:34] I mean if A blocks B, then which one is parent and which one is subtask [18:33:39] and like i said, i'm on a shared server [18:33:53] Please ask this in #wikimedia-devtools, where twentyafterfour will be able to explain which one is which [18:33:58] petan ^^ [18:34:17] but i presume subtasks is blocking [18:34:27] ok [18:34:41] I am sort of lazy to repeat this all in another channel so I will just do that [18:35:14] Reedy: LOL [18:35:16] Found it.
[18:35:16] http://www.wikidoc.org/index.php/Special:PopularPages [18:36:17] Any thoughts on where exactly that stuff is lurking and how I can clear it the blazes out? [18:36:28] petan but parent / subtask is the new blocking task feature, it was renamed from blocking to that [18:36:56] yes, it was too clear in the past so let's make it confusing :P I clearly understand this new trend in technology [18:37:05] i thought that #mediawiki-devtools was for me... [18:37:08] if it works, let's break it [18:37:14] working things are too boring [18:37:57] petan https://github.com/wikimedia/phabricator/commit/2cb779575dd593bfdae8d751da90022479d530a8 [18:38:12] It wasn't broken [18:38:18] Upstream did it [18:38:44] this isn't looking like the upstream repository to me :) [18:38:50] Yeh [18:38:59] we host a phab repo [18:39:04] that we merge upstream into [18:39:09] since we have customised code [18:39:28] oh I thought that chad is actually... [18:39:33] Chad Horohoe, lol [18:39:36] petan blocking tasks are subtasks and blocked tasks are parent tasks [18:39:46] Thats ostriches [18:39:48] now I see that this commit message is from upstream [18:39:54] yes [18:40:36] Upstream Phab has a Chad too. I am not him :p [18:40:50] Yep [18:48:07] Ulfr: Rebuild your localisation cache? [18:56:53] Reedy: Negative. I do that when I initialize a new webserver, and I promise this one was built before jan 2013 [18:57:23] Are you sure? [18:57:27] That's what that error comes from [18:57:36] error? Oh boy. [18:57:36] Did you do it after enabling the extension? [18:59:04] Reedy: Yup. It should show a bunch of bogus hit stats from 2013 [18:59:06] not an error [19:00:26] I just dunno how to purge that particular bit of old information [19:01:49] aaand I forgot what runJobs did.
That's gonna take a looong time [19:30:00] i used this for installing the node for parsoid but it's not for debian and shared server: https://gist.githubusercontent.com/x-Code-x/2562576/raw/5eb63fedde19e1855a9698eef70245ea73b0fd36/readme_install_node [19:30:14] nodejs [19:35:08] https://www.irccloud.com/pastebin/7rFgMEQv/ [19:35:29] could someone review this patch : https://gerrit.wikimedia.org/r/#/c/313021/ [19:35:29] all the subtasks have been closed already [19:36:12] I give up, thanks again for the help Reedy! [19:36:21] enigmaeth: I don't think that's how it should be done [19:36:58] ohh.. the deprecation warnings? [19:37:00] I think we should be getting most of the stuff migrated, so we can then change [19:37:01] protected static $enableDeprecationWarnings = false; [19:37:01] to [19:37:04] protected static $enableDeprecationWarnings = true; [19:37:35] migrate to ? [19:37:57] Remove/replace any other uses of deprecated functions that were replaced by contenthandler [19:38:10] okay [19:38:49] all subtasks have been closed which means all the uses of the function have been closed [19:38:55] No, it doesn't [19:39:02] I suspect, if I look, there'll be many more [19:39:21] oh yes...those were just the ones reported! [19:39:30] or added to phabricator [19:39:33] I think, the intention wasn't to fix them all [19:39:46] Just the ones that affect WMF... In a best case [19:39:50] But fixing all the stuff in doEdit [19:39:51] ffs [19:39:57] https://phabricator.wikimedia.org/T145728 [19:40:55] so how to proceed with this : https://phabricator.wikimedia.org/T145736 [19:41:14] Specifically, https://phabricator.wikimedia.org/T145728#2660115 [19:41:46] DanielK_WMDE: DanielK_WMDE__ about? [19:41:56] okay I'll check again [19:42:38] You could help by fixing up other usages of other deprecated CH related functions :) [19:43:16] okay! 
[20:02:48] i mean: i can't get it because of this message: fatal: unable to access 'https://gerrit.wikimedia.org/r/p/mediawiki/services/parsoid/': Couldn't connect to server [20:03:10] hello good people of the internet [20:03:13] i tried that command: git clone https://gerrit.wikimedia.org/r/p/mediawiki/services/parsoid [20:03:20] anyone have any suggestions for mediawiki hosting? [20:03:58] https://www.mediawiki.org/wiki/Hosting_services [20:03:59] here [20:04:06] yeah, I've seen that [20:04:10] ok [20:04:54] the problem is, I'm trying to find a new home for a wiki manual for an open source project that gets a lot of traffic (a few million page loads per month) [20:05:24] some hosters kind of back away from that [20:06:36] they're insisting on outsourcing it rather than let someone in the community manage it, which I think is crazy [20:06:43] it's not that hard to set up mediawiki [20:07:16] the only reason their own server is no good is because it's so outdated and crappy that no one wants to touch it ;) [20:07:34] NedScott: you could use cloudflare for a front-end cache, so the "real" host doesn't get so many direct requests [20:08:05] Vulpix: they have that set up now, actually. I just wasn't sure if it made more sense to find a host that handles everything vs host + CF [20:08:56] I know very little about actual server/backend stuff. I just handle actual wiki editing things, and they think I know what I'm doing ;) [20:09:03] NedScott: I wouldn't agree that mediawiki 'isn't that hard' to set up. Setting it up properly takes a lot of work [20:09:22] it's not turnkey, for sure [20:09:35] but these other community members have their own open source projects and their own mediawiki installations [20:09:54] so it's more like just sharing existing resources [20:11:48] Nginx+PHP-FPM will handle a lot of that load. You could potentially add varnish on top like this wiki: http://www.cgsecurity.org/wiki/PhotoRec.
And/or add the cloudflare free plan [20:11:50] I got a little excited when I saw that "mywikis.com" offered setting up the visual editor, and their prices seemed reasonable, but their "featured wikis" on their home page are half broken.. [20:11:51] for free SSL [20:11:54] and caching [20:12:23] Ah, yeah I'm not sure about wiki hosters [20:12:49] I tried to find one once but they were either pricey for my project or didn't exemplify their service [20:12:58] Visualeditor requires a good deal of resources [20:20:14] enigmaeth: I see that Daniel has been replacing them like that too... [20:26:00] back, sorry for not replying, i would suggest you www.ovh.com because it can handle heavy traffic really well [20:26:35] i pay mine 80€ per year which is good enough [20:27:22] ovh? I've read they suspend or even delete your account if you suffer a DDOS attack (yes, I mean YOU are the target of the attack, not the source) [20:28:48] they changed their servers and they even improved the DDOS protection since then [20:29:19] they even added DDOS protection for free [21:11:57] eh, OVH is spotty even for professional services. If it's not mission critical it can be quite nice, though [21:12:15] *cough*seedbox*cough* [21:12:20] Discord has switched from some OVH servers due to them just dropping randomly [21:12:40] For seeding Ubuntu :D [21:12:56] yup, lots of linux iso's to share [21:19:51] ah? what is a seedbox? [21:21:21] box where you store seeds [21:21:43] oh XD; [21:22:37] heh i see, though, ovh offers a large variety of os distros [21:23:18] ok, i have relaxed enough from TV [21:23:42] time to get back and try to fix the visual editor problem [21:27:06] hum, failed at the install part [21:29:06] here's what i did: https://dpaste.de/XV0N [21:29:15] sorry if it's french [21:31:23] it said that i can't use [21:31:53] because of this: /var/lib/dpkg/lock ? [21:32:09] hummm [21:32:43] sudo ?
[21:33:03] well, it's on debian, i use echo [21:33:07] humm [21:33:16] no [21:33:26] Sudo make me a sandwich? [21:33:27] sudo is for rights elevation [21:33:43] i see [21:33:44] apt-get update && echo apt-get install parsoid [21:33:51] I don't know why you've got an echo in there [21:34:42] if you have echo in front of everything... it's not gonna run anything? [21:34:59] hum [21:35:15] echo literally echoes what you typed [21:35:18] how can i know what linux version i have? i kind of forgot what the command is [21:35:19] literally echoes what you typed [21:35:28] i see [21:35:29] uname -a [21:35:37] gives you a reasonable idea [21:35:56] but you have dpkg, so it's debian based [21:36:20] Linux ssh1.240.ha.ovh.net 3.14.51-grsec-hosting-3.14-grsec-ipv6-64-vps #1 SMP Wed Jan 27 12:10:00 UTC 2016 x86_64 GNU/Linux [21:36:34] that is what i have [21:37:24] is it because i have that: sudo: unable to stat /etc/sudoers: No such file or directory [21:37:26] ? [21:39:18] besides, i'm on a shared host [21:39:27] i mean a shared webserver [21:41:39] Reedy? [21:41:55] are you actually allowed to install packages? [21:42:19] yes, since i managed to install node [21:46:47] i will look at other places than ssh [21:47:49] and see if i can change debian to ubuntu [21:50:15] i sent a question to ovh support [21:50:22] and will wait for an answer :) [21:54:52] for the moment, i will try the offline version [21:55:21] since i already have ubuntu installed [21:56:18] brb, switching to laptop [22:00:39] back* [22:00:48] but i will take a shower XD; [22:48:27] after update from 1.19.24 to 1.27.1 all pages in Russian are gone. $wgDBTableOptions = "ENGINE=MyISAM, DEFAULT CHARSET=utf8"; The update was via the web interface, no errors reported. Could anyone give an idea how to return (recode?) the pages? [22:49:53] gone as in...? if there are links to them, what do you see? what about Special:AllPages? [22:56:33] Special:AllPages shows only non-Russian pages [22:57:53] ...
and many empty lines, which I assume are for the pages in Russian [22:58:17] aha [23:00:19] did you touch $wgDBTableOptions ? did this upgrade involve a mysql upgrade/migration to a different server? [23:00:33] No and no. [23:01:46] I did not touch LocalSettings.php at all. Just copied it to the dir with the new engine [23:02:38] what table options are there for the actual DB tables? (SHOW CREATE TABLE page;) [23:03:33] CREATE TABLE IF NOT EXISTS `wiki_text` ( `old_id` int(10) unsigned NOT NULL AUTO_INCREMENT, `old_text` mediumblob NOT NULL, `old_flags` tinyblob NOT NULL, PRIMARY KEY (`old_id`) ) ENGINE=MyISAM DEFAULT CHARSET=utf8 MAX_ROWS=10000000 AVG_ROW_LENGTH=10240 AUTO_INCREMENT=12101 ; [23:04:29] this is from the mysqldump copy. I dumped the DB before making the update. [23:04:48] also, AAAA, MyISAM!!!1 [23:05:40] * MaxSem scratches head [23:22:36] Anyone know how to get a look at the ParserCache and ObjectCache for a wiki? [23:22:47] as in, all the keys and their values [23:28:32] Tried to do a simple var_dump(ParserCache::singleton()->get( $page, $out->parserOptions())); on the onOutputPageParserOutput hook, but i just get bool(false) [23:28:58] This is with the mediawiki vagrant image, so $wgMainCacheType = 'redis'; [23:36:30] ah, I see, redis-cli with the command 'keys *' will display them all [23:41:17] back [23:41:33] took me a while to shower up XD; [23:42:12] ok, now the serious thing: with the terminal, is it enough to do some ssh commands? [23:42:18] on ubuntu [23:44:27] nevermind, i will google it [23:44:49] what? [23:59:20] i'm trying to install SSH on my localhost [23:59:31] with ubuntu
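Expanding on the redis-cli discovery above, for a mediawiki-vagrant box with $wgMainCacheType = 'redis'. Note that KEYS scans every key and blocks the server, so it is only appropriate on a dev instance; the key patterns below are examples and the exact naming varies by MediaWiki version:

```
redis-cli keys '*'          # list every cached key (dev boxes only)
redis-cli keys '*pcache*'   # narrow down to parser-cache entries
redis-cli get <keyname>     # dump a single entry by its exact key name
```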