[00:10:06] hey fellas and ladies, I need some help with css. I want the number and bullet lists to display one color and the text to be white but it seems like both are synced together. Refer to https://wiki.gamepaduniverse.com/wiki/DIY:Xinput_Arduino_Gamepad
[00:10:34] I want the bullet and number to be yellow and the text after to be white smoke. What changes do I need to make?
[00:12:04] I am using #mw-content-text > ul li{ color:yellow;} currently
[00:17:04] hello, I would like to ask if the LDAP authentication extension is compatible with Apache running on Windows 7? The documentation I see tends to guide towards setting up on an IIS server instead. Thanks for any advice
[00:45:20] hi
[00:52:22] when I upload files, the first letter is set to upper case, can I change this behaviour?
[00:54:31] Why did Gerrit just ask me to do a code review?
[00:55:21] APerson_: somebody (the change author, or somebody else) thought you might be a good person to review some change
[01:04:13] maxagaz: $wgCapitalLinkOverrides[ NS_FILE ] = false;
[02:04:12] My first wiki :) http://www.bksys.at/ecowiki
[02:06:05] Is it ok to use my admin account for normal page creating/editing etc., or should I make a more mundane account for that?
[02:18:45] darsie, well, if the admin account isn't literally "Admin" but has your own name/nickname on it, then I guess it would be OK, given that admins on Wikipedia don't have their own non-admin accounts
[02:19:05] ok
[02:19:25] congrats on your first wiki
[02:19:34] thx :)
[02:20:01] I think I have tried before, but this is the first working one.
[05:46:43] hi all, I managed to configure my wiki to log in using the AD domains
[05:47:10] next, i would like to have them automatically logged in to the wiki without showing the login page. is this possible?
[05:47:16] i am using apache and php
[07:18:10] hi, how do i put image captions on images in a gallery?
[07:51:39] anybody know how to configure apache to have auto-login? I've tried downloading mod_auth_sspi and configured it but it is still giving me an "Undefined index: REMOTE_USER" error :/
[08:19:08] XYZ_: "auto-login" .. do you mean single sign-on? .. if so then kerberos?
[08:37:44] currently i am able to log in using the windows credentials from my mediawiki and i have used the ldap extension as well. is kerberos compatible with that? thanks for your response! appreciate it. :)
[08:45:55] XYZ_: if you are using the ldap auth extension .. sso should be available
[08:47:14] i will say that the only time i ever set up a wiki to hook to ldap was with the server doing the authentication, not mediawiki
[08:47:26] i used the Auth_remoteuser extension for that
[08:54:13] you used the apache web server as well?
[08:54:45] so you meant you use Auth_remoteuser instead of ldap to do sso? thanks for your response :)
[09:04:47] XYZ_: yes, i used apache to integrate ldap authentication .. and then Auth_remoteuser to trust that REMOTE_USER has been authenticated by apache
[09:13:08] thanks for your advice! I will try that now. Greatly appreciated! Have a good day ahead :)
[10:15:22] Hey, is anyone around that I can ask a quick question about pulling data from a MediaWiki site into a JSON result?
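[editor's note] The list-colour question at the top of the log comes down to giving the list item one colour (the marker inherits it) and then resetting the colour on the text. A sketch for MediaWiki:Common.css; the `#mw-content-text` selector comes from the question itself, everything else is an assumption about the page markup. The first variant only works where the list text sits inside child elements (links, spans) — bare text directly inside the `<li>` would stay yellow; the second variant needs `::marker` support, which only newer browsers have:

```css
/* Option 1: colour the item (the bullet/number inherits it),
   then reset the colour on child elements.
   Caveat: bare text nodes directly inside the <li> stay yellow. */
#mw-content-text ul li,
#mw-content-text ol li {
    color: yellow;
}
#mw-content-text ul li > *,
#mw-content-text ol li > * {
    color: whitesmoke;
}

/* Option 2 (browsers with ::marker support): style only the marker. */
#mw-content-text li::marker {
    color: yellow;
}
#mw-content-text li {
    color: whitesmoke;
}
```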
[10:15:45] SirTrott: just ask your question
[10:16:02] if someone has an answer, they will respond (hopefully ;-))
[10:17:49] Okay, so I had this shown to me: http://wiki.teamliquid.net/hearthstone/api.php?action=askargs&conditions=Category:Cards|Has_image::%2B|Has_name::%2B|Has_class::%2B&printouts=has_name|has_image|has_class&parameters=offset%3D0|limit%3D10&format=jsonfm and that pulls the data from /hearthstone/Category:Cards and displays it, but I'm trying to do the same request for /dota2/Liquipedia:Upcoming_and_ongoing_matches, so I'm doing the following call http://wiki.teamliquid.net/dota2/api.php?action=askargs&conditions=Liquipedia:Upcoming_and_ongoing_matches&printouts=&parameters=offset%3D0|limit%3D10&format=jsonfm but it doesn't display the results, so I think I'm not meant to use askargs for this page, but I'm unsure what should be used to pull the data from that page
[13:06:55] I recently changed the URL to my mediawiki, but it's still redirecting to something with "/wiki" in the URL. Where can I check for where this is still set?
[13:07:10] I've checked LocalSettings.php but it doesn't look like it's in there
[13:08:05] could it be in one of the database tables?
[13:12:15] There's a $wgScriptPath variable in LocalSettings.php that's set to "/wiki" but if I change it to just "/" it seems to redirect to http://index.php/Main_Page for some reason
[14:09:39] Hello everybody. I have a question about $wgCompressRevisions. When I turn it on, I get gzinflate(): data error in /var/www/html/mediawiki/includes/Revision.php on line 1297 - is that configuration variable expecting another variable to be turned on?
[14:24:27] malaverdiere: the manual seems to suggest not .. it only says that zlib support in PHP must be available
[14:25:23] i can only suggest trying a debug log .. or maybe your web server logs / php logs .. to see if you can find the source
[14:25:42] that error sort of suggests to me that gzinflate() was handed some bogus information to inflate?
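[editor's note] The gzinflate() symptom above can be reproduced outside MediaWiki. PHP's gzdeflate()/gzinflate() operate on raw DEFLATE streams, which Python's zlib exposes via negative wbits; a sketch (the sample text is hypothetical) showing that a healthy stream round-trips, while inflating empty input fails with exactly this kind of data error:

```python
import zlib

def gzdeflate(data: bytes) -> bytes:
    # Raw DEFLATE, like PHP's gzdeflate(): negative wbits = no zlib header.
    c = zlib.compressobj(6, zlib.DEFLATED, -15)
    return c.compress(data) + c.flush()

def gzinflate(blob: bytes) -> bytes:
    return zlib.decompress(blob, -15)

text = b"some revision text"               # hypothetical sample
assert gzinflate(gzdeflate(text)) == text  # healthy round-trip

# If compression silently failed and an empty string was stored,
# inflating it raises -- the analogue of PHP's "data error".
try:
    gzinflate(b"")
except zlib.error:
    print("data error on empty input")
```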
[14:29:27] znx: it happens in the phpunit tests too
[14:32:22] malaverdiere: the error itself is only pointing to functions which compress/uncompress the text handed to them
[14:32:53] i wonder if the compression is baulking over some character it doesn't like
[14:33:21] znx: it's easy to reproduce, because it happens in PHPUnit - would the stack trace be of any help?
[14:35:09] malaverdiere: it may well help somebody else in here, but i'm not a mediawiki coder
[14:35:37] :)
[14:55:46] malaverdiere: seems like you have gzdeflate, but it fails to actually compress. Looking at the core, it seems that in this case, your data actually gets lost silently. you will only notice when you try to read it - then uncompressing will fail with an empty input string
[14:55:55] this is actually a nasty case of silent data loss. will make a patch
[14:56:15] malaverdiere: if you want to verify, find this in Revision.php: $text = gzdeflate( $text );
[14:56:44] then do something like if ( $text === false ) throw new MWException( 'gzdeflate is broken!' );
[15:01:23] DanielK_WMDE_: looking now
[15:05:24] malaverdiere: https://gerrit.wikimedia.org/r/#/c/215920/
[15:06:39] you're waaaay faster than me
[15:07:11] malaverdiere: well, don't feel bad, it's my job, and i have 10 years of practice ;)
[15:07:26] DanielK_WMDE_: By the way, is there a recommended/required version of phpunit for given versions of mediawiki?
[15:07:32] but the patch doesn't do exactly what I suggested to you for investigating
[15:07:55] didn't throw the exception
[15:08:01] that's
[15:08:11] err. that's odd, then i don't know what's going on
[15:08:37] phpunit version... there is a required version, but i'm not sure what it is, actually :)
[15:09:32] because I'm getting things like Class 'PHPUnit_Framework_ComparisonFailure' not found and I have a debian 8 setup
[15:10:14] malaverdiere: CI is using 3.7.37
[15:10:33] anything < 3.7 is likely to break
[15:10:50] but if you are using composer to install, it should be pulling in the correct version
[15:11:08] just use the phpunit that gets pulled into mediawiki instead of the one in your system path
[15:11:11] php -r 'echo function_exists( "gzdeflate" ) . "\n"; ' gives me 1
[15:11:31] woah too fast
[15:11:33] yeah, my idea was that it exists, but fails, and returns false, leading to data loss
[15:11:40] false gets stored to the database as an empty string
[15:11:51] uncompressing the empty string then fails, causing the error you described
[15:11:57] this was just a guess though
[15:12:06] perhaps look at what's actually in the text table
[15:12:15] and see if you can run it through inflate
[15:14:02] gonna try that
[15:15:59] Hi. I am part of an organization, and we want to use MediaWiki to chronicle our history. We have a timeline of connected and unconnected events, but I have no idea how to best format this for the reader.
[15:16:13] Is anyone aware of any wikis which have this same style so that I can get some inspiration?
[15:16:55] My goal is to make things very easily searchable, which would probably involve heavy use of the category system.
[15:18:26] orion: i'd recommend using SMW for this kind of thing
[15:18:29] !smw
[15:18:29] SemanticMediaWiki is an awesome extension that lets you connect wiki pages with semantic relations. See and . Mailing lists are available at
[15:18:58] orion: there's a bunch of extensions for visualization, including timelines, built on top of SMW
[15:19:34] Cool! Thanks.
[15:20:28] malaverdiere: btw... make sure the old_text field in the text table is BINARY.
[15:20:44] (or BLOB)
[15:22:11] DanielK_WMDE_: I might have done something very silly
[15:22:30] I ran maintenance/install.php, appended options to LocalSettings.php and then gave it a go
[15:22:37] was I supposed to run some other script?
[15:23:03] malaverdiere: not really. to make sure, you can run update.php too
[15:23:07] but it will probably do nothing
[15:23:24] * malaverdiere is finding his way with the psql cli
[15:23:27] On this page: https://www.mediawiki.org/wiki/Download the "MediaWiki 1.25.1 changes not including i18n (unified diff)" link is broken.
[15:23:30] update.php should only be needed after upgrading, or after installing (some) extensions
[15:24:35] orion: looks like there are no patch downloads for 1.25
[15:24:36] https://releases.wikimedia.org/mediawiki/1.25/
[15:25:54] :<
[15:27:18] orion: filed as https://phabricator.wikimedia.org/T101404
[15:33:14] orion: there's no patch downloads because 1.25.1 came out like 2 hours after 1.25
[15:35:27] legoktm: maybe so, but it is still a broken link ;-)
[15:36:44] I hid the link
[15:36:58] https://www.mediawiki.org/w/index.php?title=Download&type=revision&diff=1666139&oldid=1664049
[15:37:05] ace
[15:54:48] DanielK_WMDE_: which table were you wanting me to check?
[15:55:24] malaverdiere: text
[15:55:35] malaverdiere: page -> revision -> text
[15:56:16] I see page and revision
[15:56:21] not text
[15:57:07] malaverdiere: if you do not have a table called "text", i have no idea how your wiki works...
[15:57:24] this is a boring plain setup of mw 1.23
[16:01:40] I'm going to guess that 'pagecontent' is the deal
[16:03:57] DanielK_WMDE_: the content stored in pagecontent looks legit - just one non-printable character that I can't be sure of
[17:03:38] DanielK_WMDE_: So I managed to get the stuff from the DB. Hex from the DB: 4b 2c 4e 49 2b 4e 4c 4b 49 04 - hex from php: 4b 2c 4e 49 2b 4e 4c 4b 49 04 00 - extra 00 at the end
[17:06:02] though it doesn't look like it is affecting the gzinflate function
[17:47:14] malaverdiere: ah! "The text table holds the wikitext of individual page revisions. If using Postgres or Oracle, this table is named pagecontent."
[17:47:18] are you using postgres?
[17:50:15] DanielK_WMDE_: sorry, I was AFK
[17:50:26] I'm using Postgres in this setup
[17:50:36] I have various configurations in docker
[17:50:57] yeah, that's why the table name was different
[17:51:28] so, at first glance, it looks like the data is encoded properly in the DB
[17:51:28] anyway, that decompresses to asdfsafda
[17:51:50] no idea why it is complaining
[17:52:01] me neither
[17:52:21] I am using an 'old' version, so I guess I had it coming ;)
[17:52:44] naw, should work fine
[17:52:48] strange problem, that
[17:52:51] you mentioned using composer - I don't see the json file in 1.23 - is that something new added in 1.24?
[17:53:02] yes. or 1.25 even, not sure
[17:54:03] this is a head-scratcher. I am trying to create some environments for repeatable tests
[17:54:28] and it sounds like I'll need one environment per major version
[17:55:25] i was trying to clone mediawiki core onto my local system, but I get the following error: https://dpaste.de/0Etz
[17:55:29] DanielK_WMDE_: though you might find the outcome useful for automated testing eventually
[17:55:36] malaverdiere: maybe use something like travisci?
[17:55:43] can someone help me with the above issue?
[17:56:00] joyce: sometimes it can be the network - have you tried a few times?
[17:56:25] joyce: sounds like a network hiccup. does it happen every time?
[17:56:28] malaverdiere: i have tried around 3 times
[17:56:32] what command do you use for cloning?
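[editor's note] The hex dump quoted above can be checked independently: treated as a raw DEFLATE stream (what PHP's gzdeflate() emits), the PHP copy inflates cleanly to the test string, while the DB copy is missing the final byte, which leaves the end-of-block code incomplete, so strict decoders reject it. A sketch:

```python
import zlib

php_blob = bytes.fromhex("4b2c4e492b4e4c4b490400")  # "hex from php"
db_blob = php_blob[:-1]                              # "hex from the DB" (no trailing 00)

# -15 = raw DEFLATE, matching PHP's gzdeflate()/gzinflate()
print(zlib.decompress(php_blob, -15))  # b'asdfsafda'

try:
    zlib.decompress(db_blob, -15)
except zlib.error as e:
    print("truncated stream:", e)  # end-of-block code is incomplete
```

This also explains why a lenient inflater might still decode the DB copy (all nine literal codes are complete; only the trailing end-of-block marker is cut short), which matches the "doesn't look like it is affecting gzinflate" observation.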
[17:57:15] DanielK_WMDE_: git clone ssh://USERNAME@gerrit.wikimedia.org:29418/mediawiki/core.git
[17:57:49] joyce: try using the https url for cloning
[17:58:15] you'll have to change the remote to use the ssh url later if you want to push anything
[17:58:25] but it would be good to know whether ssh works.
[17:59:00] DanielK_WMDE_: I will try with https then, ssh failed all 3 times
[22:28:28] after upgrading to 1.24, http://mywiki.example.com/ 301 redirects to http://mywiki.example.com/Main_Page. Previously we were able to show /Main_Page at / with the help of an apache redirect, but now it seems that MW doesn't like the URL/title mismatch and redirects. Is there any way around this?
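[editor's note] The https-then-ssh suggestion above amounts to something like the following (USERNAME stands for your Gerrit account name, as in the log's own ssh URL):

```shell
# Clone over HTTPS first - needs no Gerrit account or SSH key,
# and sidesteps firewalls that block port 29418:
git clone https://gerrit.wikimedia.org/r/mediawiki/core.git mediawiki
cd mediawiki

# Later, to push changes for review, switch the remote to SSH:
git remote set-url origin ssh://USERNAME@gerrit.wikimedia.org:29418/mediawiki/core.git
```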