[00:15:36] hi,guys!what's the extension in this pic? [00:15:37] http://postimg.org/image/5jzemi463/ [00:18:35] zjhxmjl: It may not be an extension, it might be a gadget [00:29:44] marktraceur:thks [02:01:39] Hi! Could I possibly have some assistance moving my wiki to a new directory? I'm very confused. [02:02:00] My wiki was originally at 'wiki.example.com', but I want it at 'www.example.com/wiki' [02:02:23] Output: The requested URL /Wiki/ was not found on this server. Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request. [02:02:40] JubJub: You probably have associated Web server rewrite rules. [02:02:48] What kind of Web server is this? Apache? IIS? [02:02:53] Apache [02:03:11] So if you want example.com/wiki/, you need to name the directory something other than wiki/. [02:03:15] Like w/. [02:03:20] That's what Wikipedia does. [02:03:31] So you'll have example.com/w/index.php [02:03:51] And then you'll use a bit of Apache configuration to turn /w/index.php?title=Foo into /wiki/Foo [02:03:57] In addition to updating LocalSettings.php [02:03:58] !shorturl [02:03:58] To create simple URLs (such as the /wiki/PAGENAME style URLs on Wikimedia sites), follow the instructions at or try the new beta tool at . There are instructions for most different webserver setups. If you have problems getting the rewrite rules to work, see !rewriteproblem [02:06:53] Alright, so what should my $wgScriptPath look like (Assuming it's example.com) [02:07:04] (Forgot question mark there) [02:09:13] Gloria: I've changed the actual directory of the wiki's contents to something else - Now what should my $wgScriptPath say exactly? It was previously blank, and I assume that's because it was at the root of my server. [02:10:34] !wg ScriptPath [02:10:35] https://www.mediawiki.org/wiki/Manual:%24wgScriptPath [02:15:36] The requested URL /wiki/ was not found on this server.- May I post something like a pastebin to show my current settings? [02:18:00] Gloria: http://tny.cz/8e0c4936 [02:18:48] $wgScriptPath = "/example.com/w"; - "example.com" also happens to be the name of the directory. [02:20:10] JubJub: So generally Apache is configured to point the root domain (example.com/) at a particular directory [02:20:14] . [02:20:15] Such as /var/www/example.com/ [02:20:23] So your wiki won't know about that. [02:20:29] $wgServer should be... [02:20:46] !wg Server [02:20:46] https://www.mediawiki.org/wiki/Manual:%24wgServer [02:21:09] You can probably not set $wgServer. [02:21:26] Actually, you're supposed to. [02:21:51] So what am I doing wrong exactly? [02:22:08] JubJub: What happens when you go to "http://example.com/w/index.php?title=Main_Page" in a URL? [02:22:11] Does it work? [02:25:13] The requested URL /w/index.php was not found on this server. [02:26:56] Err, how about I go ahead and post my actual settings (Not using example.com)? This is seriously causing major headaches. [02:27:48] you can copy/paste your LocalSettings.php to something like pastebin.com -- REMOVE all usernames/passwords (from db connection settings) first [02:32:25] Skizzerz: http://tny.cz/4bbbca5c That's everything important. I'm sorry for being incredibly stupid with this - It's not my area of expertise. [02:32:37] it's fine :) [02:32:56] your $wgScriptPath is wrong [02:33:13] Obviously I'm just not "getting" something. [02:33:13] What should it be? 
[02:33:13] "/wockypedia" [02:33:18] the script path does NOT include the domain [02:33:32] also remove $wgServer entirely [02:34:05] Well the the folder titled "wockypedia" is in the folder "wockymedia.com" [02:34:10] yes [02:34:20] that doesn't matter [02:34:29] it's the path relative to the URL itself [02:34:42] Ah, I see. [02:35:12] So it doesn't need to reflect the folder it's in at all? [02:35:37] e.g. if you visit wockymedia.com/a/b/c, and that resolves to /var/www/wockymedia.com/a/b/c, then the script path would be /a/b/c [02:36:24] Okay, that makes much more sense, let me try. [02:36:56] also you're going to have a bad time with the article path being /$1 [02:37:11] I recommend having the article path point to a nonexistent directory [02:37:15] e.g. /wiki/$1 [02:37:42] http://www.wockymedia.com/wiki/ < Result [02:38:02] yes, that's what you want (before you add in the alias or rewrite rules, that is) [02:38:27] if it points to somewhere that exists, you'll be mixing wiki pages and actual files [02:38:41] if it points to somewhere that doesn't exist, you know that anything hitting that URL is a wiki page [02:38:54] you still need to set up URL rewriting to get that to work though [02:39:07] do you have root-level access to the server (e.g. can you modify the apache configuration)? [02:39:47] if no, it's not a big deal, just means we have to use a different method that works slightly worse [02:40:36] JubJub: ^ [02:41:00] Skizzerz: I'm so sorry! My browser went whack and died. [02:41:04] heh [02:41:10] what's the last thing you remember me saying? [02:41:16] I'll copy/paste everything after that [02:41:42] Last thing I saw at all was me posting the link to http://www.wockymedia.com/wiki [02:41:46] ok [02:42:37] I just have such a hard time with software I didn't design myself, it's shameful. [02:42:52] http://pastebin.com/s8C951M9 [02:42:58] is what I said [02:43:10] it was like 8 lines so I didn't want to re-paste it here (people tend to get mad at that) [02:44:26] No access to the apache configuration - I'm using a cheap host. [02:44:46] okay [02:45:00] It would be on a real server, but I'd rather that's away from business stuff. [02:45:04] in that case you'll need to set up some rewrite rules to make the pretty urls (and thus the article path) work [02:46:52] JubJub_alt: do you want it so that visiting www.wockymedia.com automatically goes to your wiki? [02:47:08] or do you plan on having some other site/landing page on the domain's root? [02:47:53] No, the wiki's meant to be a subsection of a later-to-be full site so www.wockymedia.com needs to be separate itself. [02:48:32] okay [02:49:57] make a new file called .htaccess (with the dot in front) directly in your web root (so the wockymedia.com folder), or edit the existing one if one is there, and insert the following code into it: http://pastebin.com/Rxke5WWE [02:50:15] if one already exists and already has a line saying "RewriteEngine On", then you don't need to copy/paste that line again, just insert the other 3 lines somewhere below it [02:50:23] that will make your article path work [02:50:49] (assuming a $wgArticlePath = '/wiki/$1' -- also note the single quotes there) [02:54:34] Ah, still 404 [02:55:09] I will note that due to my previous browser crash that I lost some changes made to localsettings.php. [02:55:37] $wgScriptPath = should be set to what again? [02:56:01] $wgScriptPath = "/wockypedia"; $wgArticlePath = "/wiki/$1"; remove $wgServer entirely [02:57:55] That's all what I had, but it still produces a 404. 
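For reference, a minimal sketch of the LocalSettings.php lines being settled on above, with editorial comments added; the values are the ones quoted in the channel, and the commented-out $wgServer value is only an assumption based on the domain being discussed:

    # Path to the folder holding the wiki's index.php, relative to the web
    # root -- per the advice above, it does not include the domain.
    $wgScriptPath  = "/wockypedia";

    # Pretty-URL form; deliberately points at a directory that does not exist.
    # Single quotes, as noted above, so the $1 is taken literally.
    $wgArticlePath = '/wiki/$1';

    # Left unset here so MediaWiki autodetects it, as suggested above; the
    # later discussion notes the manual recommends setting it, e.g.:
    # $wgServer = "http://www.wockymedia.com";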
[02:58:04] http://www.wockymedia.com/wiki/ [02:58:23] where did you put that .htaccess file? [02:59:11] In the root. [02:59:18] (Same as LocalSettings.php) [02:59:21] nono [02:59:25] it goes the folder ABOVE that [02:59:31] web root, not wiki root [03:01:49] The absolute root - Or the "wockymedia.com" folder? [03:01:58] wockymedia.com folder [03:02:10] Still producing a 404 error. [03:02:31] hmm [03:02:38] was there any .htaccess there previously? [03:02:44] or did you add the file [03:03:46] I had .htaccess previously - From a previous configuration that didn't work for now obvious reasons. [03:03:51] okay [03:04:01] did you overwrite it entirely with what I gave you? [03:04:07] or did you keep bits of the old one? [03:04:39] Kept bits of the previous as you instructed. [03:04:44] okay [03:04:48] can you copy/paste the full file? [03:05:01] (into whatever paste site you were using before) [03:05:30] http://tny.cz/52adcbbd [03:05:59] okay [03:06:05] remove everything that was there before, just leave the stuff I gave you [03:07:13] http://www.wockymedia.com/wiki/ - 404 [03:07:42] =\ [03:08:00] one last thing to try, add the line "RewriteBase /" (without the quotes) immediately after "RewriteEngine On" [03:09:35] Still 404 [03:10:15] Here's a look at the localsettings again: http://tny.cz/63fd47cd [03:10:30] that part is right [03:11:08] can you paste .htaccess again (with the edits you made)? [03:11:41] can you also provide me the full path to it versus the full path to your wiki's index.php (on the filesystem) [03:12:06] http://tny.cz/c6fe9e79 [03:12:16] oh [03:12:34] actually one more thing to try [03:12:34] now that I looked at mine [03:12:36] (root)/wockymedia.com/wockypedia/LocalSettings.php [03:12:44] Oh, really? What is it? [03:12:46] remove both of the RewriteCond lines [03:12:57] then change ^/wiki to ^wiki [03:13:19] and add in anoth... actually I'll just paste the changes [03:13:29] http://www.wockymedia.com/wiki/Main_Page ! [03:13:36] THANK-YOU 0_0 [03:13:59] RewriteRule ^wiki/(.*)$ /wockypedia/index.php?title=$1 [PT,L,QSA] [03:13:59] RewriteRule ^wiki/*$ /wockypedia/index.php [L,QSA] [03:14:03] that's what I'm using on my own wiki [03:14:27] the second rule makes visiting http://www.wockymedia.com/wiki not error out [03:15:35] So I should add the second one instead for stability? [03:15:46] you should have both [03:16:37] Now, what about those previous entries about files? [03:16:53] Does that not matter anymore? [03:17:10] it doesn't matter as long as you never actually create an actual wiki folder [03:17:32] (e.g. as long as /wockypedia.com/wiki never actually exists) [03:17:46] er, /wockymedia.com/wiki [03:18:12] I know, confusing naming convention. [03:18:46] Wow, what a nightmare. [03:18:49] Thanks a lot. [03:18:52] np [03:19:42] Now I have a second wiki that needs to be moved to the same folder - I'll closely follow your examples. [03:20:02] uh... [03:20:11] you mean you're merging two wikis into one? [03:20:23] or you have 2 seperate wikis [03:20:36] because if you have 2 seperate wikis, you'll need a different fake folder/article path for each [03:20:56] No, same folder as in the wockymedia.com folder, two separate wikis connected via shared tables. I understand. 
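Pulling the working pieces together: a sketch of the resulting .htaccess in the web root (the wockymedia.com folder, not the wiki folder). The two RewriteRule lines are the ones quoted above; RewriteEngine On and RewriteBase / come from the earlier steps in the exchange, and whether RewriteBase is still strictly required after the final change is not confirmed here:

    RewriteEngine On
    RewriteBase /

    # /wiki/Some_Page  ->  /wockypedia/index.php?title=Some_Page
    RewriteRule ^wiki/(.*)$ /wockypedia/index.php?title=$1 [PT,L,QSA]

    # Bare /wiki (no page name) -> the wiki's entry point, so it doesn't 404
    RewriteRule ^wiki/*$ /wockypedia/index.php [L,QSA]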
[03:21:12] okay, well best of luck ;) [03:21:26] I recommend trying to understand everything you just did before trying to set up a farm though [03:21:44] as understanding how rewrites work as well as the basics of wiki config will help you a lot with that [03:23:26] http://www.wockymedia.com/petpet/Main_Page [03:23:55] I probably won't forget that anytime soon. [03:23:59] Skizzerz: BTW, the manual page recommends to keep $wgServer set. [03:24:09] I thought the same as you. [03:24:21] huh [03:24:32] I mean, you'll get a small performance improvement by keeping it set [03:24:38] Ooh, it does? That's gone on both wikis now. [03:24:52] and setting it is necessary if you're running multiple wikis from one set of files [03:24:58] but beyond that the autodetection works just fine [03:25:05] JubJub_alt: when I say small, I mean < 1 ms [03:25:37] !wg Server [03:25:38] https://www.mediawiki.org/wiki/Manual:%24wgServer [03:25:46] It likely saves more not having to execute that code anyway :P [03:26:31] okay maybe like 10 ms [03:26:34] https://www.mediawiki.org/w/index.php?title=Manual:$wgServer&diff=674927&oldid=674744 [03:26:35] but still pretty insignificant [03:27:17] Now all I need to do is configure the old URLs to redirect to the new locations... [03:28:26] ...And done. [03:30:32] One last question - Would it be unsafe to rename the database tables - As in I'd have to go through and change a lot of settings in a lot of places? Not that's it's important, but I originally named the databases to correspond with the subdomain URLs. [03:32:04] I think the update script covers things like that? [13:45:17] Hi there [13:46:29] Somebody can help me to resolve a problem which is not referenced on forums ? [13:50:10] Schubby: Maybe [13:50:45] I would like to install mediawiki on my CentOS 6.5. It seems that the current version is the 1.22.6 [13:51:57] I downloaded the package unzipped move the folder to the right directory ... [13:52:10] It works, i have the first page telling me that "LocalSettings.php not found. Please set up the wiki first." [13:52:21] BUT when an going on the link "set up the wiki" i have a blank page. [13:52:40] I saw on some forum (like here : http://www.mediawiki.org/wiki/Manual:How_to_debug) that i need to write something in the "LocalSettings.php" file. But if 'm right, it's the link which launch the procedure to create my "LocalSettings.php"file. [13:52:50] Check your error logs [13:53:58] I'd presume you're running apache on the server so this might be useful if you can't find your error log location: http://serverfault.com/questions/189205/where-can-i-find-apache-error-log-on-centos [13:54:01] which logs ? the one for httpd ? [13:54:09] yeah [13:54:11] that'd be it [13:55:58] i have my error message : " PHP Fatal error: Class 'DOMDocument' not found in /var/www/mediawiki-1.22.6/includes/cache/LocalisationCache.php on line 587, referer: http://10.156.7.3/mediawiki-1.22.6/" [13:56:20] Google might help me so ;) [13:57:55] What version of PHP are you running Schubby? [13:59:12] PHP Version => 5.3.3 [13:59:57] and mysql 5.1.73 [14:00:46] Schubby, try yum install php-xml [14:01:54] works =) [14:02:30] yay :) [14:02:43] (I don't even run CentOS, I just found that suggestion on stackoverflow, heh) [14:02:49] Schubby, fyi, it was http://stackoverflow.com/questions/14395239/magento-class-domdocument-not-found [14:03:21] i saw somtinhg about that but i thought installed it ... but no ... [14:04:15] thank you for your time [14:08:28] you're welcome [14:49:51] Hi all. 
Is the User Statistics Extension discontinued? [14:50:15] Was reading about it here: http://www.mediawiki.org/wiki/Extension:Usage_Statistics , but I can't download it. [14:50:38] you can't download it? [14:50:53] huh. [14:50:56] oh, right. [14:51:10] locsmif, it's not been migrated to git. I think that means you have to clone it from svn [14:53:40] Krenair: ah, okay, I'm also seeing it hasn't been updated in a while, maybe I should try another extension instead? [14:53:44] I have a question regarding a text-rendering issue (choice of font) of an SVG file I uploaded, is this the right channel? [14:53:48] (Last in 2009 or so) [14:54:20] locsmif, yeah, stuff that hasn't been migrated to git won't have been maintained since august (when svn was made read-only) [14:55:18] most of those things have been abandoned for a long time [14:55:29] locsmif, so yes, much better to avoid it. it may not even work with modern versions of mediawiki [14:56:55] Yeah [15:00:18] Krenair: well, I decided to go straight to what I want to look at; I'm at the SQL prompt. Would you be so kind as to point me in the right direction? Looking for page edits, dates, times and user making edit [15:00:30] Of all pages in the entire wiki [15:00:46] I'm an SQL man so the query design and export won't be a problem :) [15:01:27] Well we have https://www.mediawiki.org/wiki/Manual:Database_layout [15:01:27] Small data set as of yet too, so :) [15:01:34] Krenair: awesome [15:01:49] but it doesn't show you a lot more than an Explain query [15:02:21] https://upload.wikimedia.org/wikipedia/commons/thumb/4/42/MediaWiki_1.20_%2844edaa2%29_database_schema.svg/2500px-MediaWiki_1.20_%2844edaa2%29_database_schema.svg.png <-- awesome [15:02:21] locsmif, you want the revision table [15:03:03] Krenair: thanks, going in [15:03:07] rev_user/rev_user_text for user info [15:03:20] (rev_user is the id of the user. rev_user_text is the name) [15:04:13] Hmm, why did they denormalize that? Performance? [15:04:29] Won't deny it simplifies the query :) [15:04:53] You mean why does it have _text when you can look up each user name from the table using the ID? [15:05:55] locsmif, ^ [15:08:46] Krenair: yeah :) btw, first export to have a look --> SELECT * INTO OUTFILE '/tmp/revision.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' FROM revision ORDER BY rev_id; [15:13:53] Krenair: thanks for the help [15:15:16] you're welcome [15:15:58] (I'm not actually sure of the answer to the question - I suspect the answer lies in phase 1/2 :/ [15:39:00] hi, i'd like to have link in a template like [[ {{PAGENAME}}&action=myCustomAction]]. Any ideas how i would do that? [15:40:10] [spam removed] [15:40:35] lotek4715, [{{fullurl:{{PAGENAME}}|action=myCustomAction}} text], I think? [15:40:36] [spam removed] [15:41:13] !ops [15:41:13] !op [15:41:13] There are multiple keys, refine your input: openid, opsschool, opw, [15:41:32] [spam removed] [15:42:08] [{{fullurl:{{PAGENAME}}|action=myCustomAction}} text] [15:43:36] @Krenair : thx ,that works [15:43:43] Krenair: !ops is correct and sends a message to #antispammeta [15:43:45] :) [15:43:57] Oh, hi ToBeFree [15:44:02] lol hi [15:44:03] !ops is correct and sends a message to #antispammeta [15:44:04] I was expecting something to go to #wikimedia-ops :/ [15:44:16] naaah bandera, now you triggered it... [15:44:33] [spam removed] [15:44:44] It does report to #wikimedia-ops as well.
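Going back to the export locsmif was building above: a sketch of a query that adds page titles by joining revision to page (rev_page -> page_id, per the schema linked earlier); the output path and filename are just examples:

    SELECT p.page_namespace,
           p.page_title,
           r.rev_timestamp,      -- stored as YYYYMMDDHHMMSS
           r.rev_user,           -- 0 for anonymous edits
           r.rev_user_text       -- username, or the IP for anonymous edits
    INTO OUTFILE '/tmp/page_edits.csv'
    FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
    FROM revision r
    JOIN page p ON p.page_id = r.rev_page
    ORDER BY r.rev_timestamp;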
[15:45:13] Apparently the ops are asleep though [15:45:27] freenode staff is on the access list, someone could report in #freenode if nothing happens, just saying [15:45:36] * ToBeFree wanders off [16:00:16] hi fhocutt [16:00:27] hi sumanah [16:00:46] *wave* how is Seattle? [16:01:48] do deprecations break the extensions that use them i.e. deprecation of sajax_do_call? [16:02:41] should I ask that in wikitech instead? [16:10:17] wmat: you might want to ask here again in an hr when more North American west coast people are online. or on mediawiki-l [16:10:20] !lists | wmat [16:10:20] wmat: mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details. [16:11:05] just found the tickets for removal of sajax from core [16:11:20] @seen Tpt [16:11:20] rohit-dua: Last time I saw Tpt they were quitting the network with reason: Quit: Tpt N/A at 5/20/2014 1:48:33 PM (2h22m47s ago) [16:29:16] Hi, I am testing MediaWiki on my PC using a single site for all languages. I have problem on translating a page from a native language, say fr to en. [16:30:15] hi KCLau - what happens when you try? are you getting an error? [16:31:14] The source language of the page is always set to the language of the site, say en when it should be fr. [16:32:40] The tag expands to English as the 1st (default). So I cannot actually translate into English. Translating into another language say zh is ok. [16:40:48] You can access my site http://61.18.209.180/wpw1/index.php/Ma_deuxi%C3%A8me_page and see what I mean. [16:45:34] http://61.18.209.180/wpw1/index.php/Ma_troisi%C3%A8me_page/fr shows another page (named with /fr at end) with a link to translated page in Chinese. [16:46:45] Looking for someone who knows how to use semantic forms for mediawiki [16:48:14] Is there a way to specify the source language of the page rather than letting it default to the site language $wgLanguageCode = "en"; ? [17:00:40] cgendron1234: I know how to use Semantic Forms (I'm the author). [17:00:44] qgil: https://www.mediawiki.org/wiki/Google_Summer_of_Code_2014#Status needs a bunch of clarification in my opinion. if some of those items have dates, then what is "prior microtasks completed"? [17:00:52] "community bonding report" - when? [17:01:04] Generally, the #semantic-mediawiki channel is the best place to discuss any SMW-based extensions. [17:01:32] sumanah, the microtasks done before being accepted [17:02:10] qgil: I mean, if you are giving due dates for some of these updates, then you should specify due dates for each of the updates, or say "no due date" or clarify temporal expectations [17:02:26] ok, I will [17:02:28] thanks [17:03:53] and if you ask "do you have a minimum viable product that you will build on?" on Day 1, then it's really confusing to the new person. "no! of course I do not! I think! I just started!" if you say something like "do you have a DESCRIPTION of what the mvp WILL BE, and that you should achieve by week 6?" then they'll understand better [17:04:27] (if they can even untangle the jargon at all) [17:04:40] (jargon like "deliverable" is bad enough) [17:07:58] Hi sumanah, my problem here is not error during translation. Rather, for translating a page into other languages, I would want to specify a source language rather than letting mediawiki defajult to the site language. 
[17:08:26] sumanah, we had discussed the minimum viable project with GSoC projects during the selection process, this is why it may come as a surprise to OPW non-coding projects like yours. [17:09:09] The default (English in this case) works fine when the source page is in English, but in our case the pages are usually written in a native language and then translated into other languages. [17:10:55] qgil, that makes sense, but since OPW is not code-only the language should be different for it or at least clarified. [17:11:08] hi [17:11:49] fhocutt, yes, I understand and I agree. This is a (solvable) side effect of using the same instructions for both programs, which is a good principle per se. [17:11:58] hi biberao [17:12:05] qgil, agree [17:12:16] fhocutt, this problem was reported 25 minutes ago, and it is almost solved by now. Not bad. :) [17:12:28] ok! [17:18:09] sumanah: https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_client_libraries/Progress_Reports [17:20:39] sumanah, I have about 30 min right now, want to talk current status and next steps? [17:20:46] fhocutt: sure [17:20:55] fhocutt: tell me about your past 20 hours! [17:21:03] * fhocutt grins [17:21:36] I added/sorted through the libraries on API:Client code [17:22:09] sorted only on whether they have been updated in the last year so far [17:22:24] nod [17:22:58] Java, JS, and Python each have several maintained libraries--I was surprised [17:23:51] wow! [17:24:00] today I am planning to go through and evaluate on features/open pull requests [17:24:08] nod [17:24:50] any thoughts on best format/place to put these evaluations? Notes on API:Client code, spreadsheet, separate page? [17:25:54] I think the API:Client code talkpage would be fine [17:26:14] ok, that works! [17:26:42] cool! [17:27:09] that will probably take most of the day; if it doesn't I'll start outlining blog posts [17:27:12] fhocutt: well it looks like Perl will be easy [17:27:19] (I mean, to decide what to shortlist) [17:27:21] yep. There is precisely one. [17:27:34] and Ruby might be pretty easy as well [17:27:51] agree. [17:28:41] Java and JS will be the more difficult ones [17:29:06] nod nod [17:29:15] ok, I think I'm good [17:29:18] anything from you? [17:30:08] I'm grumpy and caffeinating and generally a bad person right now [17:30:21] * sumanah listens to theme from Indiana Jones to see if it helps [17:30:37] (I have an album called "Greatest Marches" which I consider basically early techno) [17:31:04] sorry to hear it, sumanah [17:31:07] thanks [17:31:39] I am happy that even now, the lives of 3rd-party developers is better than it was 4 days ago, because of the work you have done in https://www.mediawiki.org/wiki/API:Client_code to help them find client libraries that will suit them [17:31:40] am also working on the caffeinating, 9 am is earlier than I usually do these days [17:32:07] ya mediawiki needs good info :X [17:32:10] nod [17:32:41] thanks sumanah! [17:32:43] Today I will do the meditative task of updating a bunch of RfCs [17:33:31] that sounds like a good grumpy task. [17:33:35] nod! [17:33:47] fhocutt: I should let you go now. Thanks for the checkin. Same time-ish tomorrow? [17:34:02] AaronSchulz: https://www.mediawiki.org/wiki/Talk:Performance_guidelines in case you have anything to reply to, especially around indexing [17:34:31] how about around 11 am/2 pm? [17:34:40] sure, sounds good! 
[17:34:47] great, talk to you then [17:58:48] i gotta say [17:58:55] the new client api seems pretty [17:59:16] now each lib needs a nice explanation [18:03:17] biberao: Hi! It sounds like you are saying that the page "API:Client code" looks good but could use more explanation, correct? [18:05:30] So, I'm playing around with 1.23 and found a couple of oddities. [18:05:38] 1) User preferences are all grayed out [18:05:59] sumanah: yes [18:06:43] 2) The expanding groups on the sidebar all stay either collapsed or expanded. Even if you click them. [18:06:59] biberao: great! So, that is a list, but it is not a new client API. It's a list of bits of software that interact with MediaWiki's existing API [18:07:01] !api [18:07:01] The MediaWiki API provides direct, high-level access to the data contained in the MediaWiki databases. Client programs should be able to use the API to login, get data, and post changes. Find out more at < https://www.mediawiki.org/wiki/API >. For client libraries in various languages, see < https://www.mediawiki.org/wiki/API:Client_Code >. [18:07:26] i mean new page [18:07:26] arr [18:09:39] ah cool :) [18:10:00] Rosencrantz: ! This is your own install of 1.23 correct? [18:10:04] yeah [18:10:17] I figure the sidebar thing is probably some wacky javascript somewhere I just have to find [18:10:42] I know that the collapsible javascript for tables I was using broke. So I started using mw-collapsible instead [18:10:56] But the preferences I'm really drawing a blank on [18:17:38] weird! [18:17:45] Rosencrantz: would you mind filing a bug in Bugzilla? [18:18:10] Sure thing. [18:19:02] Thanks! [18:20:21] Hey sumanah got a min.? [18:20:45] hi kishanio! yes. What's up? [18:22:27] sumanah: I finished documenting Proofread Page Extension API https://www.mediawiki.org/wiki/Extension_talk:Proofread_Page#API_Documentation_.26_Improvement [18:22:43] * sumanah clicks [18:23:15] and need some help to translate it before moving it onto Extension page. [18:23:52] sumanah: disregard the inputs section. [18:24:31] kishanio: ah. I think the "inputs" section is notes for you to use and clean up and integrate into the main part of the documentation - correct? [18:26:05] sumanah: Yes inputs section is just some notes i compiled from IRC & Mailing list. [18:26:11] kishanio: Remind me - what programming languages do you have at least a little bit of experience with? [18:34:33] kishanio: like, I think you know PHP, right? [18:35:32] sumanah: urm i have done couple of web dev using php/js at my old job including mainstream projects like seagate.com *flaunts* but this was almost 2 years back. My new work place is more driven towards philosophies we select technologies as per requirement. [18:35:49] Oh you worked at Seagate? Thank you! hard drives - they are necessary [18:37:23] sumanah: yes i do know php and not really i was part of outsourced team they hired to do frontend work. [18:37:38] kishanio: so, I would love it if you would give a little HOWTO for a common use case, e.g., "user wants to use the API to download everything in a particular Wikisource book" (example: https://en.wikisource.org/wiki/Index:Merry_Muses_of_Caledonia.djvu ) [18:37:54] or "user wants to use the API to proofread a page" - is that possible? [18:40:44] sumanah: We can do something like http://en.wikisource.org/w/api.php?action=query&generator=allpages&gapnamespace=104&gapprefix=Merry_Muses_of_Caledonia.djvu&prop=proofread [18:41:26] this will give us list of pages along with the proofread status. 
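As a concrete sketch of consuming that query from a script (PHP here, using plain file_get_contents); the exact JSON shape of the proofread property per page is an assumption, as it isn't shown in the channel:

    <?php
    // Page list for one Index: book, as in the query URL above.
    $url = 'http://en.wikisource.org/w/api.php?action=query'
         . '&generator=allpages&gapnamespace=104'
         . '&gapprefix=' . rawurlencode( 'Merry_Muses_of_Caledonia.djvu' )
         . '&prop=proofread&format=json';

    // Wikimedia sites expect a descriptive User-Agent.
    $ctx = stream_context_create( array(
        'http' => array( 'user_agent' => 'ProofreadDocsExample/0.1' ),
    ) );

    $data = json_decode( file_get_contents( $url, false, $ctx ), true );

    foreach ( $data['query']['pages'] as $page ) {
        $status = isset( $page['proofread'] )
            ? json_encode( $page['proofread'] )
            : 'n/a';
        echo $page['title'] . ' => ' . $status . "\n";
    }

    // Long books are paged: the 'query-continue' block (gapcontinue) would
    // be fed back into the next request to fetch the remaining pages.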
We could iterate over this to get entire book. [18:42:18] kishanio: so, explaining that would be helpful to new Wikisource API users [18:42:29] in the documentation you write [18:43:23] Alright. I would include the above example in it. Also im not sure if proofreading can be done in terms of just API itself. Tpt_? Is this possible [18:45:20] sumanah: I found the sidebar issue. I still had the Vector extension installed. [18:46:01] AHA. Glad you could clear it up Rosencrantz [18:46:14] I've still got the preferences issues, but I submitted a bug report about it [19:15:11] kishanio: Sorry, I was having my dinner. Yes it's possible to edit Page: pages with the API and so do proofreading. [19:16:02] hi [19:17:21] i'm looking for an editor extension that lets me set text-color, is there any? i tried the 'WikiEditor' but it doesn't have it [19:17:52] ckeditor seems not to support mediawiki [19:18:21] and i can't get VisualEditor to work [19:18:42] probably overkill anyway [19:19:27] Bombo: VisualEditor doesn't do text colour setting right now, sorry. [19:20:25] James_F: is there any ext that does? [19:20:50] Micru: hey there - https://meta.wikimedia.org/wiki/Requests_for_comment/How_to_deal_with_open_datasets - sometime this month, tomorrow or next week, want to talk about it in the IRC meeting? [19:24:10] Bombo: probably not. you can just use HTML markup [19:24:28] (mediawiki accepts a limited subset of HTML) [19:24:57] text goes here is simple and still works as well as it did in the nineties ;) [19:25:05] Bombo: It'd be relatively easy to write a plug-in for VE that did text colour annotation, but it's not been done yet, no. [19:26:28] Tpt_: Thats alright. Can you help me with this later tomorrow? [19:26:53] kishanio: Not tomorrow but maybe Thursday. [19:27:17] MatmaRex: sure, it just would be a lot easier if it'd be clickable [19:27:52] James_F: i couldn't get it to work anyway ;) [19:28:15] Tpt_: sure that works. thanks. [19:28:28] (Parsoid/Parsoid.php not found) [19:29:15] Bombo: You need to install the Parsoid extension as well as the Parsoid service. [19:30:14] James_F, btw: what's the ETA for not requiring the extension any more? [19:30:41] gwicke: Well, that will need the core LESS to get merged. After that, a few days. [19:30:56] gwicke: Ideally I'd like to get it done and pushed into REL1_23 to back-compat. [19:34:55] sumanah: do you have the link to Amir's blog post you sent yesterday around somewhere? For some reason my browser has years of history, but is unable to find a specific page from only a day ago >-< [19:34:59] >_<* [19:35:01] hey valhallasw [19:35:06] http://aharoni.wordpress.com/2011/08/24/the-software-localization-paradox/ [19:35:17] Yes, that one :-) Thanks! [19:35:20] No prob! [19:35:37] sumanah, sorry, I was away. Sure! I was thinking of a IRC meeting next week, maybe monday, what do you think? [19:36:01] Micru: I realized that this is something you should probably discuss with DarTar and those folks [19:36:29] hmm i wonder if i could add a color button to WikiEditor [19:36:30] as it's more policy (at least initially) than about changing Wikimedia infrastructure [19:36:42] valhallasw, sumanah has this data structure in her mind with thousands of blog posts on relevant topics [19:36:50] * sumanah laughs aloud [19:36:58] sumanah, yes, yes, there are many people involved, I'm trying to pull everybody in [19:37:13] hahaha [19:37:47] the Firefox awesome bar is invaluable. It is the memex that Vannevar Bush predicted. 
Or close, for me [19:37:47] aharoni: :D Yes, it's fascinating how the human mind works :-) [19:37:53] http://www.theatlantic.com/magazine/archive/1945/07/as-we-may-think/303881/ [19:40:24] Can I get the size of a wiki's database without manually running an SQL query? [19:40:41] James_F, if nobody objects in the next day or so we should just merge this [19:40:52] gwicke: Agreed. [19:40:59] Ca11um: for a Wikimedia site or for a wiki that you administer or a wiki you do not administer? [19:41:07] gwicke: Writing a justification in the commit message to make Krinkle happy now. [19:41:39] ok, cool [19:42:00] sumanah, for a wiki that I have total control over [19:42:19] Is there any form of 'Special' page that shows me the size of the DB in KB/MB/GB, or must I use MySQL command line? [19:42:55] Ca11um: good question! [19:43:15] * sumanah looks at https://en.wikipedia.org/wiki/Special:Statistics  [19:48:38] Ca11um: you know, I'm not seeing anything. I presume the thing that you want is something visible on the web so you don't have to go in on the command line, right? [19:49:23] Yeah, sumanah [19:54:48] Ca11um: I tried searching for relevant keywords in http://hexm.de/mw-search which searches the source code, mailing lists, the wiki, bug reports, etc. and I cannot find an answer for you. I suppose people either usually use the command-line tools or they choose to install a web-facing frontend for their databases [20:04:01] AndyRussG: hey there, how is it going? [20:04:16] hi sumanah! [20:04:28] Good! Just about to start a quick stand-up meeting, should be done in 15 min tops [20:04:31] :) [20:04:39] https://www.mediawiki.org/wiki/Requests_for_comment/Typesafe_enums - AndyRussG hope to chat with you about it [20:04:58] K, is in 15 min OK? Thanks!! [20:05:01] Cool! [20:25:16] sumanah: all done standup, thanks 4 waiting & apologies :) [20:27:10] Thanks, sumanah [20:29:51] AndyRussG: no prob. So, on typesafe enums - have you asked for feedback from any of the more longterm developers around, like aude and RoanKattouw? [20:29:57] * sumanah loads up https://www.mediawiki.org/wiki/Requests_for_comment/Typesafe_enums [20:30:24] I bet JeroenDeDauw and Tyler Romeo would also be interested in giving you some feedback on the topic [20:33:09] sumanah: no I haven't, that would be really cool though [20:33:57] AndyRussG: go ahead :) [20:35:01] I added the link to the RFC page not long ago--afterwards I wasn't sure if I should have put it at the bottom or the top of the list :) was going to check that [20:35:16] We have 2 more RFCs about changes to core coming in the pipeline, BTW [20:35:46] We started with the enum one 'cause it seemed like it would be the least... controversial! :) [20:38:00] sumanah: K, I'll definitely ask them, then :) [20:38:08] Cool! [20:40:44] Jeroen is aware of some of the direction we've been going with the Editor Campaigns project, in fact I have on my to-do list replying to an e-mail of his from a while back, which I just discovered (sadly it got lost in my flaky e-mail client filters) [20:41:09] * sumanah shakes a fist at unreliable and sneaky filters [20:41:25] * ragesoss reads along silently. [20:42:13] * AndyRussG can't help but add something using the IRC "me" command, too :) [20:42:31] * sumanah is a fan of /me! [20:42:47] which is nearly "sumanah is a fan of sumanah" which sounds a bit narcissistic [20:43:16] anyway AndyRussG would you like to have us chat about the enum thing in tomorrow's RfC meeting in IRC? 
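Since the RFC link above gives no inline summary, a minimal sketch of what a typesafe enum looks like in PHP 5.x terms: shared instances of a final class instead of loose integer constants, so function signatures can type-hint on them. The class and member names here are made up for illustration and are not taken from the RFC:

    <?php
    final class LogLevel {
        public static $DEBUG;
        public static $ERROR;

        private $name;

        private function __construct( $name ) {
            $this->name = $name;
        }

        public static function init() {
            self::$DEBUG = new self( 'debug' );
            self::$ERROR = new self( 'error' );
        }

        public function getName() {
            return $this->name;
        }
    }
    LogLevel::init();

    // The type hint rejects bare strings or integers at call time,
    // which plain class constants cannot do.
    function logAt( LogLevel $level, $message ) {
        echo '[' . $level->getName() . '] ' . $message . "\n";
    }

    logAt( LogLevel::$ERROR, 'Something broke' );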
[20:43:31] I thought I had a topic but the person turned out not to have a timing match on Wed [20:43:43] sumanah: :) Sure you bet [20:43:46] What time would it be at? [20:43:52] * ^d is a fan of ^d [20:43:58] (mm I shoul dknow) [20:44:10] AndyRussG: well, what times are good for you? are you in Mexico City currently? [20:44:26] yep! [20:45:14] I'd like something a bit early in the day, to make it possible for Europe to also participate, maybe 1100 UTC [20:45:14] wait [20:45:36] that is 6am [20:45:39] sorry, I am wrong [20:45:48] The only thing I have on my agenda so far for tomorrow is a meeting at 10 SF time (17 UTC) [20:46:02] I'm 5 hours behind UTC [20:46:07] like, 1500 UTC-15:30 or 1600 UTC [20:46:48] would nowish (2000 UTC) also be okay? since I do want to give like 24 hrs notice [20:47:02] * sumanah will make the NEXT meeting more Europe- and Asia-friendly [20:48:36] sumanah: 15:30 or 16:00 UTC are totally fine [20:48:56] cool! ok, this is gonna happen then AndyRussG [20:49:14] I can also do at this time tomorrow, though a bit later (mayb 20:15 ish UTC) might be better [20:49:17] AndyRussG: go ahead and ping the people I mentioned, and/or other people who you think would have helpful suggestions/feedback [20:50:18] cool! Thanks a lot! [20:51:04] cool! [20:51:32] So I'll just wait to hear back for confirmation about the exact time [20:52:49] AndyRussG: cool [20:54:09] * AndyRussG rejoyces in anticipation of RFC-feedback-fun [21:50:51] hey MatmaRex thanks for your email to the wikitech-l list. Do you think https://www.mediawiki.org/wiki/Requests_for_comment/Redo_skin_framework is something to include in the discussion? [21:52:17] sumanah: no, it's not. i said in my proposal that i do not aim to do this [21:52:24] hmm, maybe "rewriting the Skin and SkinTemplate classes" is less clear than it could be :) [21:52:52] but i'm definitely not touching that just now [21:52:55] I'm sorry for misunderstanding MatmaRex [21:53:08] :) [21:54:09] sumanah: do you know any skinning people who should be pinged about this thread? [21:54:37] i'm going through some skins that seem to be maintained and collecting email addresses, but apart from that :) [21:54:53] :) [21:55:19] MatmaRex: I fear that I would simply give you redundant information. Whom are you already considering? [21:56:47] let me just cc you on the email [22:02:06] oooh http://everytimezone.com/#2014-5-21,540,6bj - thanks Nikerabbit for the tip [22:05:12] experience of remote teams ;) [22:08:04] :) [22:08:27] I had been using worldtimebuddy and timeanddate.com [22:08:41] sumanah: cc'd [22:08:48] thank you [22:10:59] hi again [22:11:19] hi biberao [22:11:33] i have setup 2 wikis one in each dir but im not sure i can really control them by including a commons file [22:13:33] ashley: ^ got any advice for biberao? [22:13:59] hi! what's up? [22:14:55] hi ashley [22:17:41] so biberao, what are you trying to do? 
a wiki farm setup of some kind is always a tad bit tricky and depending on your goals, you might end up reinventing the wheel a bit; sharing users and images between two (or more) wikis is pretty simple, but everything beyond that...not exactly a piece of cake, but probably doable [22:20:22] MatmaRex: I'd additionally suggest: Reuben from WikiHow; someone such as Owen Davis from Wikia; Daniel Renfro from VistaPrint; Yury Katkov; Jason Lewis; Ingo Malchow [22:21:06] Links for the last 3: http://lists.wikimedia.org/pipermail/mediawiki-enterprise/2013-March/000209.html ; http://lists.wikimedia.org/pipermail/mediawiki-l/2013-May/041162.html ; http://lists.wikimedia.org/pipermail/mediawiki-l/2011-August/037870.html [22:21:09] sumanah: want to forward it to them? i admittedly don't know most of these names (and don't have the email addresses) [22:21:28] MatmaRex: sure, no problem, and will cc you. [22:21:37] thanks [22:24:07] MatmaRex: Done. [22:24:34] thank you [22:25:15] You're welcome. Thanks for working on this. [22:26:01] sorry ashley [22:26:08] i want to share permissions too [22:34:22] cscott: AndyRussG thanks for your flexibility. see email/GCal :) [22:48:00] :| [22:49:13] biberao: you might want to try emailing mediawiki-l or mediawiki-enterprise [22:49:17] !lists | biberao [22:49:17] biberao: mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details. [22:49:19] biberao: https://lists.wikimedia.org/mailman/listinfo/mediawiki-enterprise [22:53:17] biberao: the simplest possible way is to share the user_groups table, so admins on wiki1 will also be admins on wiki2 and so on; for anything more complicated, you could try [[mw:Extension:GlobalUserrights]] for example; it's a pretty good extension, we use it at ShoutWiki (and one of our developers originally wrote it, heh!) [22:58:46] ok will try [22:59:01] ashley: so wiki farm is out uestion [23:07:12] sumanah: it's the least i could do for type safety
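A sketch of the table-sharing setup ashley describes above, as it would look in the second wiki's LocalSettings.php; the database name and empty prefix are placeholders, and $wgSharedDB / $wgSharedPrefix / $wgSharedTables are the standard settings for this (see their manual pages for caveats before turning it on):

    # Point the second wiki at the first wiki's database for shared tables.
    $wgSharedDB     = 'wiki1_db';    # placeholder name
    $wgSharedPrefix = '';            # table prefix used by that database
    # 'user' and 'user_properties' share the accounts themselves; adding
    # 'user_groups' makes group membership (e.g. sysop) shared as well.
    $wgSharedTables = array( 'user', 'user_properties', 'user_groups' );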