[00:35:09] Hey, anyone here familiar with .htaccess? I need some help.. it involves MediaWiki
[00:35:56] well, that's technically an Apache thing, but there probably are some such people anyway ;-)
[00:36:06] what was the question?
[00:36:50] Ok so.. I have a "Pretty URLs" mod on my forum. It alters .htaccess, and MediaWiki is installed in a subdirectory.
[00:37:00] Now... how can I have both use short URLs
[00:37:01] lol
[00:37:15] MediaWiki's short URLs are still redirecting to the forum >_<
[00:37:31] isn't there a way to tell .htaccess to ignore /w/ until it gets there O_o
[00:38:56] maybe you should pastebin the relevant .htaccess stuff along with the relevant (web-visible and fs-level) paths?
[00:39:53] This is the forum (root) .htaccess file http://pastebin.com/Gap78gtx
[00:40:28] and this is MediaWiki's
[00:41:47] one sec lol
[00:42:22] i broke it >_<
[00:43:06] ok here we go
[00:43:06] http://pastebin.com/4u3yYN2L
[00:43:37] that's the .htaccess in /w/ for short URLs
[00:44:01] but it still sends people to a page on the forum, instead of the wiki's installation o.o
[00:46:33] .php5?
[00:46:56] default installation, so no.. i believe
[00:47:16] actually let me double check if that's what Apache's running
[00:47:26] yeah, php5
[00:47:35] i was meaning using the .php5 extension
[00:47:46] Oh, no. index.php
[00:47:46] you'd have to be running PHP 5 to get MW to run
[00:47:52] right lol
[00:48:39] It's just that currently the rewrite rules for root are overriding the rewrite rules for /w/
[00:48:51] and i have no idea how to get them to work together.
[00:49:14] what directory/url structure are you using?
[00:49:20] /trying to use
[00:49:37] public_html is the forum, with custom rewrite rules.
[00:49:45] public_html/w/ is the MediaWiki install
[00:50:22] now i can add a line for .htaccess to ignore /w/ and it'll change the URL to mydomain.com/wiki/Main_Page
[00:50:32] but the page that loads is the forum again
[00:50:33] instead of the wiki lol
[00:51:25] it'd be simpler if you could just put the wiki on a separate subdomain or similar
[00:51:50] foo.com being a forum, but foo.com/w being a wiki is just confusing
[00:52:05] true.. would be easier
[00:52:33] or if you used public_html/forum and public_html/wiki or something as base
[01:06:17] wooh!
[01:06:18] fixed it.
[01:06:29] added the modification to the root .htaccess
[01:06:42] removed this line: RewriteRule ^/?$ %{DOCUMENT_ROOT}/w/index.php [L]
[01:06:53] now MediaWiki and the forum are playing nicely
[01:06:54] :)
[01:31:25] Hi. I'm testing VisualEditor on 1.23.1. Everything is OK except for inserting an image. When you insert media, you can't use the search function. The error message is 'error":{"code":"readapidenied","info":"You need read permission to use this module"}'.
[01:32:41] Could anyone give a clue? I've changed the permission $wgGroupPermissions['*']['writeapi'] = true; according to https://bugzilla.wikimedia.org/show_bug.cgi?id=67313#c16. But it doesn't work.
[01:37:53] I have a wiki that's set up with many nested templates. One page has ~108 templates and the page load takes ~3 seconds. (host is shared command line on HostMonster) Does anyone have any suggestions for speeding up the template generation, or caching the whole page so that loading happens faster?
[01:45:39] is there a means to push the API to return numbers in Arabic numerals, rather than the local language's?
[01:46:25] arabic = standard numbers
[01:48:23] Coiby: try asking in #mediawiki-visualeditor?
[01:48:32] sDrewth: which API module isn't?
[01:48:49] !fast
[01:48:49]
[01:48:58] looking at neWP, and naturally it gives Nepali figures
[01:49:01] isaaclw: ^ I recommend reading those
[01:49:07] sDrewth: link?
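For reference, the root-.htaccess arrangement the short-URL conversation above converges on can be sketched roughly as follows. The pastebinned rules themselves are not preserved in the log, so this is an assumption modeled on the common MediaWiki short-URL pattern; the forum's own rules are only indicated by a comment.

```apache
RewriteEngine On

# Send short wiki URLs to MediaWiki's entry point
# (the title reaches index.php via PATH_INFO)
RewriteRule ^wiki(/.*)?$ w/index.php [L]

# Leave the wiki's install directory alone so its own
# /w/.htaccess rules still apply
RewriteRule ^w(/.*)?$ - [L]

# ... the forum's generated pretty-URL rules follow here ...
```

The key point from the log is ordering: the wiki exclusions must come before the forum's catch-all rewrites, otherwise the forum rules consume /wiki/ requests first.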
[01:49:36] legoktm: Thanks! I didn't know there was such a channel.
[01:49:56] sidebar, it is long
[01:50:22] is your language set to english?
[01:50:34] I would think so
[01:50:47] hmm, weird.
[01:50:59] yep
[01:51:14] though, the api may not know who I am
[02:07:00] related question. I downloaded the APC module, and added it to my php.ini, but all the guides say that to install php-apc, you need sudo access? is that true? why does it seem like my php-apc module is working?
[02:07:32] by "downloaded the apc module" I mean I downloaded and compiled it locally, and linked it into the extensions folder.
[02:36:20] isaaclw: and you typically need root access to get it in the extensions folder
[02:42:52] valhallasw`cloud: ah, ok. I just made a new extensions folder and linked the system extensions to it, and referenced that folder in the php.ini
[02:43:44] right, and you then need root access to edit php.ini
[02:44:14] *unless* you're on a very atypical system, where you are running php as a per-user (f)cgi process; in that case, each user could have their own configuration and thus their own extensions
[04:27:24] 2/1
[07:17:55] is this related to directory permissions or some bug? so far google has not helped much. error: Could not create directory "mwstore://local-backend/local-public/1/1e"
[07:40:39] solved.
[07:41:06] was MediaWiki 1.23 and it was perms on images/ needing to be 775
[08:36:12] Hi!
[08:36:27] I am working on the BookManagerv2 extension
[08:37:04] the wmflabs server web interface gives me a 'No webservice' error
[08:37:26] It says: The URI you have requested, /bookmanagerv2/wiki/, is not currently serviced.
[08:37:48] Can anyone please help me with this?
[12:14:32] Hi, according to my provider, the file DefaultSettings.php is hackable and is used by hackers to generate spam and collect data. Is this a known problem? (MediaWiki versions of the mentioned sites: 16, 17)
[12:14:49] Say what?
[12:15:09] Anything less than 1.19 isn't supported at all
[12:15:53] Versions 16 and 17 aren't really that old.
[12:15:56] They are
[12:16:14] But as it's a PHP file, if it's left writeable, of course it could be used for any purpose
[12:16:37] 16 and 17 are 3-4 years old
[12:16:56] It's only writeable for owners
[12:17:11] But if they hack the server... all bets are off
[12:17:47] Of course. My guess is though that it's a false message.
[12:19:07] It's certainly possible
[12:19:09] Just unlikely
[12:19:19] Wordpress would be more likely to do that ;)
[12:19:41] GuidodB: LocalSettings.php is in this regard no different from *any* other PHP file.
[12:19:52] the only difference would be in file permissions (i.e. who can modify the file).
[12:19:57] It's DefaultSettings.php
[12:20:06] GuidodB: or that
[12:20:15] Only owners can modify
[12:20:27] ...or whatever. any php file will do
[12:20:33] if an attacker can modify files, you lose.
[12:20:44] always.
[12:21:06] The file has not been changed in any way. I checked.
[12:21:13] GuidodB: how secure that is depends on who the owner is, and what processes run as the owner.
[12:21:31] in any case, the alert seems silly.
[12:21:46] Thanks, I thought as much.
[12:21:58] "cars are unsafe, they can be stolen and used to rob banks". well, yeah...
[12:22:05] :)
[13:05:58] so i was working on something that will require a lot of data to be sent through a url. i was thinking of using gzdeflate and then url-encoding it before sending. Does that sound ok?
[13:06:51] Hi Nemo_bis. Sorry, just saw your ping.
[13:07:15] rohan013: gzipping is fairly standard practice
[13:07:45] Ok. Thanks!
[13:27:59] Has somebody tried using Phalanx ([[extension:Phalanx]]) on a "normal" wiki?
[13:38:42] Hi everyone, I was wondering how to get the language mappings between different Wikipedia pages. This information seems to be available on Wikidata, but I can't figure out how to get a dump. The "Wiki interlanguage link records" at http://dumps.wikimedia.org/wikidatawiki/20140705/ looked promising but that seems to contain user information if I'm not mistaken. Any thoughts?
[13:45:42] merpeltje: Try http://dumps.wikimedia.org/wikidatawiki/20140705/wikidatawiki-20140705-iwlinks.sql.gz
[13:45:50] "Interwiki link tracking records"
[13:53:44] Thanks reedy, however, how do I recognise the inter-language links there? I see for example 725 | commons | COM:VP | | 725 | commons | Category:Rio_de_Janeiro_(state) | | 725 | commons | User:Multichill/Commons_Wikidata_roadmap
[13:53:50] oh, that's not right
[13:54:21] https://dpaste.de/Hbhz
[13:55:13] it looks like it also lists "see also" in the same format
[13:55:33] making it difficult to filter out the "see in other language" links
[14:01:49] merpeltje: You might get a quicker answer asking in #wikidata
[14:04:12] ah thanks! Still new to using wikidata
[14:04:15] obviously :)
[15:46:29] i'm wondering too, is there a replacement for the OpenID extension and SocialLogin for MW 1.23
[15:47:14] E:OpenID is still maintained, so it shouldn't need replacing...
[16:16:53] Err, how do I find the canonical domain (if any) for a wiki of which I only know the IP? http://94.23.18.211/api.php
[16:18:02] http://ping.eu/nslookup/
[16:18:09] ks366720.kimsufi.com
[16:21:34] Yes, I know nslookup. :) I meant from the wiki itself, is it really impossible?
[16:21:58] Other than the which doesn't look like best thing
[16:23:35] Lol just teasing
[16:23:46] Was looking for a way, but cannot seem to find anything apparent
[16:24:40] Ok
[16:24:55] I feel less stupid then :)
[17:18:44] how are you supposed to urlencode/decode strings? I sent a urlencode() string but extracting it from request->getValues() on the other side returns some garbled string
[17:20:06] "supposed to"
[17:20:22] ragno: You sent a urlencode() string from the client side?
[17:20:35] Can you provide us with code, maybe?
[17:20:36] !paste
[17:20:36] To avoid overflowing the channel with inane amounts of text, use https://dpaste.org/ or other awesome pastebin sites to share code, errors, and other large texts.
[17:21:59] fhocutt: Hi!
[17:22:07] hi sumanah!
[17:22:34] fhocutt: How goes your Thursday?
[17:22:57] it goes! I have tea.
[17:23:31] marktraceur: i did urlencode(gzencode('some string')) as a parameter for a url
[17:24:04] fhocutt: is today correspondence day?
[17:24:18] sumanah, yesterday was mostly correspondence day
[17:24:22] ah
[17:24:32] marktraceur: now when i go to the url, and extract the parameter through request->getValues(), it returns something like �‹��������3��·ï܃����
[17:24:54] sumanah: there is more of course, but today I will finish up the Perl issues and get started on writing out the Ruby eval
[17:25:23] fhocutt: got it. how are you feeling?
[17:25:39] sumanah: waiting for the tea to kick in, right now
[17:25:42] nod
[17:25:52] but not half bad.
[17:26:03] i think urlencode will inflat that gzip data a lot though :)
[17:26:06] *inflate
[17:26:18] cool. fhocutt anything you need? anything you are blocked on from my end?
[17:27:01] nope, everything is pretty much the same as yesterday, except that 3/4 of the tabs in my mailbox are now smaller than they have been in a long while :)
[17:27:07] :D
[17:28:05] I will finish up my tea and get on that, then. Anything you need from me?
[17:28:34] I don't think so! Happy hacking fhocutt
[17:28:46] ty
[17:28:48] \o
[17:29:36] oh, sumanah, I'm going to meet up with amenking next week and talk about research tools
[17:29:43] Cool!
[17:43:29] ok, I'm about to triage some RfCs and move them around
[17:43:57] and schedule next week's meeting on the Composer stuff. bd808 I presume sometime Wed next week works for you?
[17:44:21] * bd808 looks at calendar
[17:44:53] sumanah: Yeah my Wednesday is wide open right now
[17:44:57] cool
[17:51:54] marktraceur: Reply whenever you are online. i will read it in the logs
[17:52:21] ragno, well, gzencoding it might be the issue
[17:52:30] ah https://www.mediawiki.org/wiki/Requests_for_comment/Data_mapper and https://www.mediawiki.org/wiki/Requests_for_comment/Dependency_injection are new
[17:52:31] Just a guess
[18:11:00] hidgw, is it possible to pass an array of tables in a MediaWiki join? Like here http://pastie.org/9400616, instead of the table 'user_properties', I wish to pass an array of tables.
[18:12:14] Just add all your joins in that same outer array
[18:12:46] Reedy, please make it clearer
[18:13:44] I wish to retrieve values from multiple tables but I don't wish to write all the table names in that query
[18:14:12] http://pastie.org/9400627
[18:14:16] What?
[18:14:16] marktraceur: ok yes, the UtfNormal class normalizes all strings. Is there a way to avoid get the raw string from the url ?
[18:14:32] marktraceur: ok yes, the UtfNormal class normalizes all strings. Is there a way to get the raw string from the url ?
[18:14:34] ragno: I don't *think* so
[18:14:41] hmmm
[18:14:46] I tried doing that a few times, it seemed a little nasty
[18:14:49] i forget if we modify $_REQUEST and friends directly or not
[18:15:03] we probably shouldn't if we do :)
[18:15:49] Alright Reedy thanks, let me try this.
[19:33:48] hi liangent
[20:08:33] hi - all js is minified by resource loader. when i 'prettify' a module, the lines are out of whack with what's being executed, so it's darn difficult to debug js. Is it recommended to somehow turn off minification during debugging? or is there something I am missing? thanks
[20:10:32] hi - all js is minified by resource loader. Using the FF debugger, when i 'prettify' a module, the lines are out of whack with what's being executed, so it's darn difficult to debug js. Is it recommended to somehow turn off minification during debugging? or is there something I am missing? thanks
[20:11:47] hypergrove: ?debug=true
[20:13:33] ah i see thanks
[20:23:43] sumanah: I have submitted all the issues for MediaWiki::Bot
[20:23:48] \o/
[20:23:56] :) :)
[20:23:58] all of them?
[20:24:00] That's a lot.
[20:26:21] Quick question. If I want an image in a template, like a stub with a div or something, how can I get it (the image) to stay in the template itself and not sit halfway out of it. (so it will extend the height of the template box to fit in it and not lie partially outside of it). If that makes sense, anyway.
[20:27:22] Reedy: all the ones I planned to, at least!
[20:27:38] feel free to add your own
[20:27:40] Guest13643: look into "css clearing floats"
[20:27:55] Guest74258: ^
[20:28:09] why do we have so many "Guests" here, anyway?
[20:28:24] Guest74258: for a practical example, see https://en.wikipedia.org/wiki/Template:Clear
[20:28:45] Awesome, thank you!
[20:34:56] MatmaRex: I presume it's because of the folks who come in here via freenode's webchat interface - that's a default I suppose?
[20:36:34] sumanah: dunno, i've seen this happen after heavy netsplitting
[20:36:42] nod
[20:36:46] but i've only seen one modestly sized netsplit earlier today
[20:36:50] :)
[22:50:02] fhocutt: hey there, I'm about to nip off for the day
[22:50:08] how are you, and do you need anything?
[22:50:11] (hi DanielK_WMDE)
[22:50:16] hi sumanah
[22:50:52] don't think so, I have a MW::Gateway eval to draft and Java resources to look at
[22:50:55] OK!
[22:50:57] thanks! have a good evening.
[22:51:07] You as well!
[23:16:09] does anyone know if the concept cache automatically rebuilds itself, or do I always need to run rebuildConceptCache?
[23:17:45] wth is the concept cache?
[23:18:20] semantic-mediawiki thing
[23:19:06] yep, is this the wrong irc to ask?
[23:19:51] yes
[23:19:52] there's #semantic-mediawiki
[23:20:12] Jango_____: says "pre-computed results for a concept query might become out of date, so the displayed results might no longer agree with the contents of the wiki"
[23:20:18] which makes me think it doesn't automatically rebuild
[23:20:23] but i'd ask on #semantic-mediawiki to be sure
[23:21:45] thanks, I saw that but thought it weird that it wouldn't automatically rebuild, i'll ask the other channel too
[23:34:24]
[23:34:26] oops.
[23:47:22] hello hello71
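The garbled-parameter problem discussed earlier in the log (raw gzdeflate output coming back mangled after MediaWiki's UTF-8 normalization) can be avoided by making the compressed payload URL-safe before it is sent. The chat concerns PHP's gzdeflate()/urlencode(), but the idea translates directly; here is a minimal Python sketch using zlib plus base64url encoding. The function names are my own for illustration, not from the log.

```python
import base64
import zlib


def encode_for_url(text: str) -> str:
    """Compress with DEFLATE, then base64url-encode so the payload is
    plain ASCII and survives URL transport and UTF-8 normalization."""
    compressed = zlib.compress(text.encode("utf-8"))
    return base64.urlsafe_b64encode(compressed).decode("ascii")


def decode_from_url(token: str) -> str:
    """Reverse the encoding: base64url-decode, then decompress."""
    return zlib.decompress(base64.urlsafe_b64decode(token)).decode("utf-8")


payload = "some long wikitext " * 50
token = encode_for_url(payload)
assert decode_from_url(token) == payload
```

Base64 inflates the compressed bytes by about a third, which echoes the "urlencode will inflate that gzip data" remark, but for repetitive text the round trip still comes out far smaller than the original, and nothing in the token needs percent-escaping.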