[00:06:27] Reedy: I'm back up. I needed to adjust a few lines provided to me earlier in the chat and now I'm back up... however, I'm having issues logging in.
[00:07:04] I connect using LDAP typically, however when the site restored it doesn't appear to be reconnecting. I'm not sure if I need to fix a plugin, but I should at least be able to log in using a local admin account on my wiki.
[00:07:33] But when I attempt to use that local admin account it doesn't connect. Is there a way for me to create an admin account via LocalSettings or some other method, since I cannot log in?
[00:24:03] Additionally, after restoring the site much of my page content is missing.
[00:24:18] The category pages and special page blue links show on their respective pages, but page content/text is missing.
[00:24:25] It will display only the page title.
[00:32:45] Hello, I'm attempting to run a maintenance script on my Windows server install of MediaWiki, attempting to run maintenance/update.php, but when I right-click and open with PHP it does not launch. When I attempt to run the command line of php maintenance.php it does not run either.
[00:33:35] You will need to go into the command line, then start PHP from there and then issue the command, from memory.
[00:35:23] p858snake: when I attempt to do that it tells me the following: http://hastebin.com/ayopekadet.tex
[00:35:48] nvm, wrong url
[00:36:16] Well actually, maybe not... do I need to list out the full path to maintenance?
[00:36:28] i.e. server-name\maintenance\maintenance.php?
[00:36:48] i.e. server-name\maintenance\maintenance\update.php?
[01:28:34] I just upgraded from git, and everything looks very plain.
[01:28:42] Like no CSS or something.
[03:31:51] Ah ok, forgot to update the skins. It's working now, but in the console I see an error like this: [3d860da5b14caa07d9fbeebc] 2016-03-29 03:30:05: Fatal exception of type MWException
[03:31:59] How do I dig down into that?
[03:55:15] Hi k-man.
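A minimal sketch of how the maintenance script can be run from a Windows command prompt, as the advice above suggests. The paths are assumptions based on the XAMPP install locations mentioned later in this log, and the script to run is maintenance/update.php (there is no maintenance.php):

```bat
rem Open cmd.exe, change into the wiki's maintenance directory
rem (path is illustrative; use your own install location), then
rem hand the script to the PHP CLI binary explicitly:
cd C:\xampp2016\htdocs\mediawiki\maintenance
C:\xampp2016\php\php.exe update.php
```

Double-clicking or "open with" does not work because the maintenance scripts expect to be run by the PHP CLI with the wiki's working directory available.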
[03:55:19] !debug
[03:55:19] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[03:55:41] Probably an extension?
[04:16:12] Worked it out.
[04:16:19] VisualEditor needed updating.
[04:16:39] And then I had a misconfiguration of Parsoid. Got there in the end, thanks.
[04:18:29] Running from git makes it a bit fiddly to upgrade.
[12:40:32] Morning.
[13:24:08] Hi all.
[13:28:32] We use MediaWiki for information about our servers, and I'd like to make some pages that are updated directly with info from the server. For example, a page that says: "I'm server toto and free disk space is XXX GB", where XXX is the real free space when the user visits the page.
[13:29:34] I think I should use a plugin, or maybe write one. Does this plugin exist?
[13:30:22] Depending on what your data source is...
[13:30:26] There are things like https://www.mediawiki.org/wiki/Extension:SNMPquery
[13:34:23] Fine, thanks Reedy. Do you know another one that's more scriptable? Like parsing backup log files and printing the result "Backup OK or Failed"?
[13:35:31] I don't know of anything.
[13:35:38] But it's possible it might exist.
[13:38:10] I've just looked at the SNMPquery source. It seems not so difficult to write a plugin.
[13:38:56] Hi - I will be updating from 1.21.2 to the latest. Should I be concerned with something breaking, or are upgrades usually fairly clean? I don't really use many extensions.
[13:39:48] From 1.21 should be alright.
[13:40:31] Can I just grab the latest and use the maintenance script, or should I grab each version between 1.21.2 and latest and install them in order?
[13:41:12] Latest should be fine.
[13:41:26] It's only when you're going from really old versions that you need to do it more incrementally.
[13:41:47] Ok, cool, thanks.
[13:43:05] Finally, is it possible to build a parser extension (such as ) and have it render markup that actually saves to the page, so the parser tag is deleted and replaced with the content it rendered? I'm not sure what that is called, so I am having problems googling for an example.
[13:46:00] wikiguy32000: You should be able to...
[13:46:04] Not sure I've ever seen an example.
[13:46:17] We do it for templates... where we call it subst (substitution).
[14:17:08] Anyone here?
[14:18:59] Is there an extension that allows multiple file uploads to a wiki?
[14:20:16] !uploadwizard
[14:20:22] Meh.
[14:20:25] Mmmyeessss?
[14:20:31] mihail_: https://www.mediawiki.org/wiki/Extension:UploadWizard
[14:26:31] Thank you very much.
[15:19:19] Hello, I'm getting the following error when attempting to register new accounts: [74705d95] /index.php?title=Special:UserLogin&action=submitlogin&type=signup&returnto=Main+Page MWException from line 291 of includes\mail\UserMailer.php: PEAR mail package is not installed
[15:20:07] Install PEAR, or disable email?
[15:20:30] You can get it via composer if you wish.
[15:20:43] I see a folder called pear in my PHP folder.
[15:20:58] So I would assume that I have it.
[15:21:34] PEAR is a big repository thing.
[15:21:48] So it may or may not have it.
[15:21:56] I believe it came preinstalled with XAMPP.
[15:22:47] I also get errors on my other wiki when I enable error reporting that reference PEAR.
[15:22:55] But it lists the old directory of XAMPP.
[15:23:11] So I'm unsure if I need to modify any other files outside of MediaWiki to enable it?
[15:23:12] php.ini?
[15:23:15] Possibly.
[15:23:20] Is it using some global php.ini?
[15:28:30] Yes.
[15:28:52] So PHP is located in c:/xamp2016/php
[15:29:07] MediaWiki is installed at c:/xampp2016/htdocs/mediawiki
[15:29:24] There's another wiki installed at c:/xamp2016/htdocs/2016
[15:29:40] I currently see pear at c:/xamp2016/php/pear
[15:30:46] But inside php.ini I see the following: ; UNIX: "/path1:/path2" include_path=C:\xampp2016\php\PEAR ; ; Windows: "\path1;\path2" ;include_path = ".;c:\php\includes" ; ; PHP's default setting for include_path is ".;/path/to/php/pear" ; http://php.net/include-path
[15:30:53] Open up a command prompt (Start->Run, type "cmd" without the quotes and hit enter). Then try executing the following two commands:
[15:31:48] cd C:\xamp2016\php\bin
[15:31:53] pear install mail
[15:32:11] (I'm not 100% sure on the first path, does that folder exist?)
[15:32:26] First path doesn't work.
[15:32:41] Does not exist in the php directory.
[15:32:52] Try the second, it may or may not work depending on if you said to include it in your path.
[15:32:56] *try the second anyway
[15:33:16] If it says "pear: command not found" or something like that, then we'll need to revisit where exactly it is.
[15:33:16] Okay.
[15:33:21] If it works, then it works ;)
[15:33:34] Says it did not download optional dependencies pear/net_smtp.
[15:33:41] But it did install.
[15:34:29] Now when I try and register I get: Fatal error: Class 'Net_SMTP' not found in C:\xampp2016\php\pear\Mail\smtp.php on line 346
[15:34:40] In MW 1.27 you can just get it via vendor.
[15:34:53] Well, actually, prior to 1.27 you can get it via composer/vendor if you wish.
[15:34:58] Yeah, you'll want to run pear install Net_SMTP
[15:35:07] Reedy: except composer isn't preinstalled with XAMPP, whereas PEAR is.
[15:35:14] Okay, just did.
[15:35:16] I'm just going with the easy route :)
[15:35:19] Skizzerz: Download one phar?
[15:35:27] php composer.phar update
[15:35:28] Profit.
[15:35:37] Well, the easy route is disabling mail.
[15:35:40] So the middle route.
[15:35:50] heh
[15:36:01] [16:20:07] Install pear, or disable email?
[15:36:36] If this is turning into dependency hell though, may just go with "install composer and have it do everything for you".
[15:36:37] Ayo!
[15:36:41] That worked.
[15:36:58] What directory do I need to cd to to install composer?
[15:37:27] Anyone here have much experience with Varnish? I'm trying to resolve a hit-ratio problem that came up after a recent upgrade to MW 1.26.
[15:37:31] You can put composer wherever you want.
[15:37:40] kevindank: not strictly necessary (unless you're downloading MediaWiki from git as opposed to the .tar.gz files)
[15:37:58] I just used: https://www.mediawiki.org/wiki/Composer
[15:38:06] Says it installed to htdocs/composer.phar
[15:38:59] Getting close to getting out of everyone's hair.
[15:39:31] Now that I've created a local user account again, how can I promote it to an administrator/sysop without a prior administrator account? phpMyAdmin? LocalSettings.php?
[15:39:33] What else isn't working besides mail?
[15:39:50] The initial account you create in the installer is automatically a bureaucrat and sysop.
[15:40:07] Also, when I updated from 1.17.1 to 1.25 it appears that my MediaWiki:Common.css changes are gone.
[15:40:13] Are you saying that you're unable to get at that account?
[15:40:17] Correct.
[15:40:29] When I rerun mw-config it doesn't ask me for one either.
[15:40:33] Just asks for db username and pw.
[15:40:41] If yes, in that command prompt do cd C:\xamp2016\htdocs\mediawiki\maintenance
[15:41:11] Then php createAndPromote.php --bureaucrat --sysop username password
[15:41:27] Replace username and password with the desired username and password.
[15:41:57] If the username already exists and you just want to promote it, omit the password and add --force instead.
[15:42:19] (--force would go before the username, not after; not sure if it matters, but just in case)
[15:42:38] Okay, awesome, that worked, I was able to log in. However, when I went to Special pages it produces this error: Fatal error: Call to undefined function wfLoadExtensionMessages() in C:\xampp2016\htdocs\extensions\SemanticQueryComposer\SpecialSemanticQueryComposer_body.php on line 6
[15:42:43] I'm assuming I need to look into SMW for that.
[15:42:48] Or disable the line.
[15:43:52] Did you update your extensions when you updated MediaWiki as well?
[15:44:00] If not, you should go do that.
[15:44:14] Yep, that resolved the issue.
[15:44:16] Looks good for now.
[15:44:23] Thanks for the help Skizzers and Reedy.
[15:44:29] yw
[15:52:52] So I'm using the Vector skin and have "$wgVectorUseSimpleSearch = true", however it's not producing autocomplete on search fields.
[15:52:57] Can anyone possibly help me diagnose why my Varnish hit ratio dropped from ~85% to ~37% after my upgrade from MW 1.24 to 1.26? I asked on the mailing list recently but no replies yet.
[15:54:04] kevindank: wasn't that deprecated?
[15:54:14] Oh, no.
[15:54:31] I also tried using $wgUseAjax = true; $wgEnableMWSuggest = true;
[15:54:47] Check your JS console?
[15:55:37] Uncaught Error: module already implemented: user.options
[15:56:49] Sounds suspect, to say the least.
[15:57:02] If not necessarily the cause.
[16:06:43] And on the new wiki I get this: Warning: include(C:\xampp\htdocs\2016/extensions/Widgets/smarty_plugins\modifier.validate.php): failed to open stream: No such file or directory in C:\xampp2016\htdocs\2016\extensions\Widgets\compiled_templates\766ffebae8039f55f3b2a63d9b61be5f9d4be0cb.wiki.Html5media.php on line 28 Warning: include(): Failed opening 'C:\xampp\htdocs\2016/extensions/Widgets/smarty_plugins\modifier.validate.php'
[16:06:43] justinl: right after the update, I would expect the hit rate to drop, since all cached data was (or should have been) invalidated by the upgrade. A lot of the misses you see may be from clients that now need to load the new version of all the relevant JS and CSS modules.
[16:06:45] Looking at it, it looks to be trying to access the old path, as it says include c:\xampp and not c:/xampp2016/
[16:06:50] This is just a guess, I'm no expert on the Varnish stuff.
[16:07:44] justinl: but the hit rate should soon recover. How long that takes depends on how many hits you get, when the cache expires, how often people re-visit your site, etc. If it doesn't recover at all, then yeah, something is fishy...
[16:07:44] It's been a couple of weeks since the upgrade. I expected a ramp-up, but what happens is that the hit ratio plateaus around 37%.
[16:08:01] Very consistently.
[16:08:07] Huh. That doesn't sound good.
[16:08:30] I've been doing all sorts of research and troubleshooting, trying tweaks to my VCL based on research, etc., but nothing is affecting the ratio.
[16:08:44] Do you have logs of the hits and misses? If you do, share them on the mailing list. Without any logs, it's really hard to guess what's going on.
[16:09:28] Well, the wikis (one in particular) get a ton of traffic, so Varnish logs get too big to save, even for a day, so I don't record them.
[16:09:38] However, I've been watching them a lot and can't seem to find a particular pattern.
[16:10:07] justinl: then record an hour, or, if that is too big, record a minute or so. Or sample every 100th request.
[16:10:11] I also just started tracking Varnish's LRU nuked object parameter in Graphite with the rest of my metrics, and I increased Varnish's cache from 4 to 8 GB.
[16:10:47] My guess is that someone who knows this stuff would need between 1k and 10k fairly sampled log entries.
[16:11:01] I changed the Varnish VCL last summer to strip cookies from thumbnail images, and that made the hit ratio increase from ~40% to ~87%.
[16:13:00] There is around 12 GB of thumbnails in the main wiki, and about 30 GB of thumbnails across all 5 wikis, but since that wasn't a problem at 4 GB caches and still got 87% just a couple of weeks ago before the upgrade, I doubt the hit ratio reduction is simply due to cache usage.
[16:14:38] That said, Varnish is nuking (n_lru_nuked) between 0 and 200 objects per minute on each web server, so it's definitely using all of its cache space.
[16:16:23] justinl: can you get separate stats by mime type? I mean js vs css vs html vs jpeg etc. That could give some hint at what the problem is.
[16:19:09] Hmm, let me see what I can do. I was just looking at the breakdown of cache misses by the start of the URL, e.g. /wiki, /index.php, /api.php, and /images/thumb, so let me look specifically at the mime types to see if anything stands out.
[16:20:52] justinl: that's probably more or less the same breakdown, actually.
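The thumbnail cookie-stripping change described above might look roughly like this in a VCL — a hedged sketch, not the speaker's actual config; the URL patterns are assumptions based on the paths discussed in this log (Varnish 3-style syntax):

```vcl
sub vcl_recv {
    # Thumbnails never vary by user: drop the Cookie header so
    # Varnish can serve them from cache (the change credited here
    # with raising the hit ratio from ~40% to ~87%).
    if (req.url ~ "^/images/thumb/") {
        unset req.http.Cookie;
    }
    # load.php output is designed to be cached aggressively and is
    # the same for all users, so session cookies can be dropped too.
    if (req.url ~ "^/load\.php") {
        unset req.http.Cookie;
    }
}
```

Stripping cookies on index.php requests, by contrast, is risky: logged-in page views and edit forms genuinely do vary by session.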
[16:21:16] You also want to look for /load.php though.
[16:22:26] I did leave that one out, but the .php ones typically have cookies and so would be cache misses anyway.
[16:23:30] I can't seem to figure out, though, how to tell varnishlog to show only results without cookies. That would require writing logs and post-processing, as far as I can tell.
[16:25:48] I've thought about removing the VCL code that strips cookies from thumbnail images just to see what effect that has, but it was that change last year that dramatically improved the hit ratio to begin with.
[16:28:19] justinl: the output of load.php is specifically designed to be cached aggressively. Not sure how it should interact with cookies. The output should be the same for all users though.
[16:28:54] justinl: actually, that might very well be your problem: not caching output from load.php due to session coocies.
[16:28:59] cookies even
[16:29:03] Yeah, I just ran a quick test, and out of about 900 misses, only 4 were load.php.
[16:29:29] Most were index.php (400), /wiki (240), and images/thumb (180).
[16:30:22] If you use pretty URLs with the /wiki/ prefix for viewing, you should not see many requests for index.php.
[16:30:33] Very few, actually.
[16:31:06] Do you have hit/miss ratios for each prefix (or each mime type)?
[16:31:25] (Again: I'm not an expert on this...)
[16:31:35] I'm looking at index.php x-original-url headers now for misses; most are diffs/oldids.
[16:32:11] No, I don't have anything like that. Varnish doesn't break it down that way and I don't save varnishlogs for post-processing.
[16:32:46] I could try stripping cookies from index.php requests...
[16:32:47] I recommend getting a full Varnish log of 10k requests.
[16:32:57] Right now, you are flying blind.
[16:33:03] Well, not all, some are edits, etc.
[16:34:04] I guess I'm just confused by why the sudden drop after the upgrade, since the Varnish and Apache configs didn't change.
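Sampling, as suggested earlier, keeps the log volume manageable. A minimal sketch of "sample every 100th request" using awk; the file names are made up, and in practice you would pipe real varnishncsa-style output in rather than the synthetic log built here:

```shell
# Build a synthetic 1000-line request log standing in for real
# access-log output, then keep every 100th entry.
seq 1 1000 | sed 's#^#GET /wiki/Page#' > requests.log
awk 'NR % 100 == 0' requests.log > sample.log
wc -l < sample.log   # 10 entries survive (1000 / 100)
```

The same one-liner scales: sampling 1 in 100 turns the ~18 GB/day mentioned above into a couple of hundred MB, which is enough to compute hit/miss ratios per URL prefix offline.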
[16:35:11] I'll log Varnish for a little while and try analyzing those results. With our traffic, it doesn't take long to get huge logs.
[16:35:54] justinl: that's why Wikimedia went without request logs for nearly ten years ;)
[16:36:16] Thankfully, we do have them now, though they are not exactly easy to handle.
[16:36:44] Logging > 100k requests per second is fuuuunnn....
[16:39:45] It only took about a minute or so for 1 server's log to hit 10 MB, so that's 14 GB per day, and it's not peak time yet.
[16:52:10] The log hit about 100k requests in about 10 minutes, about 18 GB log file at that point, and that's just for 1 of 4 servers.
[16:53:00] Sorry, I meant at that rate it'd be about 18 GB in one day.
[16:54:03] So I could set up something to do daily processing, since there's around 100 GB free on the servers, but obviously files that large take a lot of time and CPU to process, and these are the live wiki servers.
[17:19:28] Does anyone serve their wikis from multiple web servers? Ours get enough traffic to warrant load balancing across 4 web servers, but most of my googling when researching problems and good practices shows people typically using a single server.
[17:20:35] Anyone besides Wikipedia, that is? Their scale is way beyond ours, of course, so they solve problems very differently than we would and have the resources for.
[17:22:08] Still can't get the Vector skin to cooperate... the navigation items aren't collapsing any longer.
[18:08:02] Hello, does anyone know why, when I add a
or new heading on the Sidebar using Vector, it won't show up?
[18:13:08] Wrong syntax?
[18:13:38] Don't think so... it shows correctly in MediaWiki:Sidebar.
[18:14:43] That means absolutely nothing.
[18:14:49] It's parsed differently.
[18:14:54] !sidebar
[18:14:54] To edit the navigation menu on the left, edit page [[MediaWiki:Sidebar]] on your wiki using its special syntax (see https://www.mediawiki.org/wiki/Manual:Interface/Sidebar for details). If you need more control, you can use the SkinBuildSidebar hook (https://www.mediawiki.org/wiki/Manual:Hooks/SkinBuildSidebar).
[18:32:35] Thank you, Reedy.
[18:37:38] ohai Reedy, do you still review patches? (evil smiley)
[18:38:00] "Do you even review, brah?"
[18:39:03] Wassup?
[18:39:30] Reedy: remember the thing before Flow?
[18:39:48] MW talk pages?
[18:40:02] Reedy: well, the step in between.
[18:40:06] :)
[18:40:07] It has a TLA.
[18:40:30] And it likes to throw fatal errors and exceptions: https://gerrit.wikimedia.org/r/#/c/280207/
[18:40:44] Let me get a laptop.
[18:42:56] Does Common.css get stored anywhere in the MediaWiki directory?
[18:43:14] If I go to MediaWiki:Common.css, for example.
[18:43:17] No, it's a wiki page.
[18:43:22] In the db.
[18:43:56] Do I need to reprocess it somehow? Perform an update?
[18:44:14] Because most of the formatting changes to the Vector skin are not appearing, while the code does reside in the file.
[18:44:20] Also checked MediaWiki:Vector.css.
[20:51:55] Afternoon all.
[20:53:16] I have a weird question. So I have a piece of MediaWiki code that works to put a scrollbar in my Vector sidebar. However, when I put it there, it apparently makes the logo at the top disappear. I'm stumped as to why.
[20:53:26] Rather, a piece of CSS for the scrollbar.
[21:05:35] Oh, that may be the error...
[21:05:38] Hi guys, I'm having problems with the Math extension. It works, but doesn't seem to render certain symbols like \triangle. I'm using MediaWiki 1.23 on CentOS 6.7.
[21:05:50] Can you think of any suggestions?
[21:07:39] Not I. I'm trying to fix a sidebar issue.
[21:07:52] Math functions are a little beyond my normal use for MW.
[21:08:12] What sidebar issue do you have?
[21:08:44] MrD504, is your texvc up to date?
[21:09:27] It says it is, MaxSem.
[21:09:38] It worked on my last install, same version, same OS, 6 months ago.
[21:09:44] The RPM package?
[21:10:07] Straight from the yum repo.
[21:10:11] All dependencies?
[21:10:22] No errors while installing.
[21:10:22] /optional use packages
[21:10:30] You realise that all RedHat derivatives have outrageously old packages?
[21:10:34] Haven't checked the optional ones.
[21:10:39] I do, yes.
[21:10:56] Just don't understand why it would work 6 months ago, and not now.
[21:11:07] Surely how to draw a triangle hasn't changed.
[21:11:16] Various tex packages changed?
[21:11:32] No, I took a mirror of the old machine with the old database.
[21:12:47] I guess the best shot I have is to try your /optional use packages and then just get the latest RPM and deps.
[21:13:43] MrD504: I'm trying to put a scrollbar into a Vector sidebar. I got the CSS to work, but the logo at the top disappears when I have the code in place.
[21:14:27] Do you get any errors in your console?
[21:15:48] MrD504, was the Math extension itself updated?
[21:16:08] I don't see any ML changes that might have caused this.
[21:16:10] No, MaxSem, I used all packages for the 1.23 version of MediaWiki.
[21:16:38] So all PHP and texvc is literally the same?
[21:17:04] Yes, that's right.
[21:17:31] If nothing has changed, I recommend exorcism.
[21:17:37] lol
[21:18:48] I have no idea, I'm more of an end user on this.
[21:18:54] I have to admit I've never got MediaWiki working flawlessly; there's always some sort of Lucene error I have to suppress :)
[21:19:10] Do you report bugs?
[21:19:21] I've talked them over on forums.
[21:20:09] Is it best practice to use the same version of MediaWiki that the actual Wikipedia foundation uses?
[21:20:26] And keep the MySQL and PHP versions identical?
[21:20:44] No.
[21:21:31] Good, because I noticed they weren't using the LTS version.
[21:22:49] Usually, WMF have the engineering capacity to fix MW when it breaks :P
[21:23:05] Yeah, whereas at my place it's just lil ol' me.
[21:23:17] WMF uses master, updating every week.
[21:24:12] Right, I see. Well, cheers for the input, guys. I will come back tomorrow and let you know if and what fixed the problem.
[21:24:45] ElmerG, don't you have a console? (F12 on your keyboard)
[21:27:26] Derp. Yeah, I do. I've been using inspect on there to make sure I was editing the right div.
[21:27:34] Let me check by clicking Console.
[21:29:01] No error listed, but the code makes the logo disappear. I think it has to do with the p-logo being inside the div, maybe?
[21:29:39] What about if you inspect the logo itself and increase its z-index?
[21:31:26] I know you're an end user, but can you paste the code into pastebin or something?
[21:33:57] https://dpaste.de/9R9T
[21:34:59] And the HTML?
[21:35:28] Might be easier to use https://jsfiddle.net/
[21:36:57] I'm not doing anything other than putting that in the Vector.css. It seems to push the entirety of the sidebar up, and the logo is gone. Let me put it back in and inspect again, maybe I overlooked something.
[21:38:43] You can edit the CSS straight in the browser. Try removing position: absolute and changing it to relative or something like that. That's normally why my elements are out of position.
[21:40:01] * ElmerG nods. I'll give it a whirl. :)
[21:43:34] Hrm. Okay, that might be it. It looks like when I change the div, the actual logo image from p-logo is gone. I think if I add that in manually, then change the top (a top of 40px is causing it to push up), it might do it.
[21:45:10] Good, nice work.
[21:49:00] Let's see how this goes.
[21:52:29] And the result is ... ?
[22:00:20] Still messing with the CSS, but I think I've got it.
[22:00:39] Nice one!
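One way the scrollbar-plus-logo layout discussed above can be reconciled in MediaWiki:Vector.css — a hypothetical sketch based only on the symptoms described, not the actual dpaste contents; the selectors assume Vector's stock #mw-panel/#p-logo markup and the pixel values are guesses:

```css
/* Pin the logo at the top, outside the scrolling region, so the
   panel's scrollbar cannot push it off-screen (the small `top`
   offset was what made it vanish above). */
#p-logo {
    position: fixed;
    top: 0;
    z-index: 1;
}
/* Let the rest of the sidebar scroll below the logo. */
#mw-panel {
    position: fixed;
    top: 160px;      /* roughly the logo's height, not 40px */
    bottom: 0;
    overflow-y: auto;
}
```

The general pattern matches the advice given in the chat: keep the logo out of the overflowing container, then tune `top` and `z-index` in the browser's inspector.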
[22:01:27] * ElmerG had to change more than he was expecting, but the end result will be nice. :)
[23:35:09] Does anyone know if Squid3 caches only pages that it's told to cache, or does it cache everything? I want to replicate its behavior in Varnish, but Varnish caches everything, making edit pages, special pages, history, etc., all not updated on purge.
[23:35:26] It is purging, but MediaWiki only sends out 1 purge command when you edit a page.
[23:36:03] Uh, that sounds wrong.
[23:36:19] Have you looked at our (WMF) Varnish config?
[23:36:55] https://www.mediawiki.org/wiki/Manual:Varnish_caching — V4, I'm guessing?
[23:39:37] It wasn't written by our ops engineers :P
[23:39:58] Alright, not sure where your ops engineers put it then :P
[23:41:00] https://github.com/wikimedia/operations-puppet/tree/production/templates/varnish
[23:41:55] Holy mother of wood finishes, Batman.
[23:43:01] You mostly want to look in text and common.
[23:43:43] Well, you do put varnish on the repository tree!
[23:44:24] Trela: A proper 16 coats of it, too.
[23:48:42] Is there any sort of simplified version of this config for normal MediaWiki installs? I understand parts of this, but it looks like a lot of specifics for the Wikimedia Foundation, too.
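For a normal single-wiki install, the simplified shape of the purge handling is an ACL plus a PURGE branch in vcl_recv, along the lines of the Manual:Varnish_caching page linked above — a hedged sketch in Varnish 3 syntax, with placeholder addresses:

```vcl
# Only the MediaWiki app server(s) may purge; address is a placeholder.
acl purge_allowed {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.request == "PURGE") {
        if (!client.ip ~ purge_allowed) {
            error 405 "Not allowed.";
        }
        # Look the object up so vcl_hit/vcl_miss can discard it.
        return (lookup);
    }
}

sub vcl_hit {
    if (req.request == "PURGE") {
        purge;
        error 200 "Purged.";
    }
}

sub vcl_miss {
    if (req.request == "PURGE") {
        purge;
        error 200 "Not in cache.";
    }
}
```

This only helps if MediaWiki is configured to send those PURGEs (the $wgUseSquid/$wgSquidServers settings in LocalSettings.php), and it deliberately does not make Varnish skip caching edit/history/special pages; that takes separate pass rules on those URLs.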