[00:02:48] I'm getting a "ssh_exchange_identification: Connection closed by remote host" error while ssh-ing into gerrit. I'm behind a proxy, and have used corkscrew to get around it. I have added my SSH public keys to gerrit and wikitech. Help?
[00:09:17] Fixed ^. Never mind. :)
[00:35:05] hello everyone! I've heard talk that Wikimedia Gerrit will (eventually) be replaced with Phabricator Diffusion (hosting source code) and Phabricator Differential (reviewing code). Does anyone know an estimated date of when that will happen (or if it will happen)?
[00:59:35] codynguyen1116, it looks like this might be the main task: https://phabricator.wikimedia.org/T119908 Try this search for other (semi-)relevant tasks: https://phabricator.wikimedia.org/maniphest/query/lsyp7CtsXcX0/#R
[01:01:01] codynguyen1116: https://phabricator.wikimedia.org/tag/gerrit-migration/ <— There's your tag.
[01:01:57] Ah okay, thanks @Matthew_ and @quiddity!
[01:05:46] Yep
[04:35:06] Hi, guys. Anyone know how to turn these redundant tables? http://paste.ubuntu.org.cn/i3983276
[04:44:20] Hi, is anyone here familiar with IEGs? :)
[05:43:03] josephine_l: kind of
[05:43:26] legoktm, cool. :) Mind if I ask you a couple of Qs about it?
[05:44:13] sure
[05:45:33] legoktm, thanks! Do you know if there are flexible starting times for IEG projects?
[05:46:15] oh, I'm mostly familiar with the scope of IEGs and stuff, but let's read
[05:46:51] I've been writing a proposal for this round's IEG, but the schedule says the projects should start in "June". I can only start mine in July.
[05:47:14] I asked about it at https://meta.wikimedia.org/wiki/Grants:IEG/Questions but no responses so far.
[05:48:46] Personally I think that should be fine provided that's communicated clearly ahead of time and you still finish on time, but I pinged marti on the questions page: https://meta.wikimedia.org/wiki/Grants:IEG/Questions#Timing_of_grant_start
[05:49:36] Thanks legoktm !
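For reference, the corkscrew workaround mentioned at [00:02:48] is usually wired up in ~/.ssh/config via a ProxyCommand. A minimal sketch, assuming an HTTP proxy at proxy.example.com:3128 (both placeholders) and Wikimedia Gerrit's SSH service on its usual port 29418:

```
# ~/.ssh/config -- tunnel SSH to Gerrit through an HTTP proxy with corkscrew.
# proxy.example.com, 3128 and the username below are placeholders.
Host gerrit.wikimedia.org
    Port 29418
    User your-gerrit-username
    ProxyCommand corkscrew proxy.example.com 3128 %h %p
```

This only works if the proxy permits CONNECT to the Gerrit port; some proxies restrict CONNECT to port 443.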
I figure I could work it into my proposed schedule (just do fewer weeks) if I knew exactly when the grant was due to start, but 'June' is a bit nebulous.
[05:49:45] As in, I don't know if I need to dock 4 weeks off, or just 1.
[05:50:59] ahh, not sure. Maybe look at a past grant to see when they started?
[05:51:38] Okay, will do. :) Thanks for the help!
[10:09:09] hey everybody! I would like to get some statistics for all my articles (number of editors, length in words, number of edits, etc.). Can you recommend a specific MediaWiki extension that would do the job?
[10:23:15] Hi, guys. Anyone know how to turn these redundant tables? http://paste.ubuntu.org.cn/i3983276
[10:33:05] pity: turn?
[11:32:26] Hi, I'm trying to get the page's langlinks from inside a custom FormAction. They only seem to be available as part of OutputPage/ParserOutput, and so do not exist within the context of the FormAction. Any other ways of getting them, apart from querying the DB directly? Thanks!
[11:36:07] Dror_[FFS]: I'm afraid there is no nice abstraction for getting the effective langlinks of a page. I tried to introduce that a while back, but the change got stuck.
[11:36:17] so, you'll have to ask the database.
[11:38:20] @DanielK_WMDE_: I was afraid of that. It means code replication and will easily break, but hey, that's life on the MediaWiki frontier.
[11:38:33] And thanks!
[11:39:19] Dror_[FFS]: if you feel like it, write a nice DAO-style wrapper :)
[11:40:19] Dror_[FFS]: the thing that makes it tricky to write a re-usable storage layer abstraction is that you need paging. And since there is no primary key, you have to page using multiple columns. And there is sort order, and filtering... it's fun.
[11:40:44] DanielK_WMDE_: Exactly. If I did, my code would probably be a performance nightmare...
;-)
[11:41:29] that's exactly why this thing doesn't exist: it seems straightforward, but is hard to actually get right
[11:43:45] Well, from my viewpoint it would be better to have a not-great solution (split out the part in OutputPage/ParserOutput so it's reusable) than having to copy bits of MediaWiki core into every extension I write. But then again, this way of thinking is probably why I'm not a core developer - I'd make a mess of things.
[11:48:16] Dror_[FFS]: oh, the stuff in OutputPage/ParserOutput does not access the database. It extracts info from the wikitext. The info returned from ParserOutput is used to write links to the database.
[11:48:23] so, that code would be really different
[11:48:45] Dror_[FFS]: what could be abstracted out is code from the special page(s) and API module(s) that read sitelinks from the DB
[11:49:20] Yes, you're right. I'm looking at the API module right now.
[11:49:24] Not sure where the skin is currently pulling this from
[11:49:32] and the interaction with links defined in Wikidata is a bit complex too
[11:50:26] Dror_[FFS]: for reference https://gerrit.wikimedia.org/r/#/c/60034/
[11:50:49] oh wow, it has been nearly three years now...
[11:53:59] I have no idea how it works with Wikidata; with a regular wiki, it gets the links from the Parser (skintemplate->outputpage->parseroutput->parser). Shudder.
[11:54:37] with Wikidata, it still works that way, but additional links are mixed in just after parsing
[11:55:27] Well, if I used Wikidata, I could simply query it directly in this case, couldn't I?
[11:56:20] I see Reedy took a look at your change in Sep 2015, but nothing further happened.
[11:57:09] Oh, I guess if I just queried Wikidata I would miss potential local langlinks, but those shouldn't occur anywhere.
[11:57:51] yea
[12:38:55] p858snake: Sorry, I mean turn off the redundant tables.
[14:38:05] morning
[15:52:42] Is there an alternate way to install Parsoid, or a way to chunk it up?
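Since the langlinks thread above ends at "ask the database", here is a minimal sketch of what reading the langlinks table directly might look like, assuming a 2016-era MediaWiki environment where $title is a Title object for the page in question. As noted at [11:57:09], this misses links injected from Wikidata, and it deliberately skips the paging, sort order and filtering that make a proper reusable abstraction hard:

```php
// Read a page's language links straight from the core langlinks table.
// $title is assumed to be a Title object; DB_SLAVE was the replica
// constant at the time (DB_REPLICA in later MediaWiki versions).
$dbr = wfGetDB( DB_SLAVE );
$res = $dbr->select(
	'langlinks',                              // table
	[ 'll_lang', 'll_title' ],                // fields
	[ 'll_from' => $title->getArticleID() ],  // WHERE clause
	__METHOD__,
	[ 'ORDER BY' => 'll_lang' ]
);
$langlinks = [];
foreach ( $res as $row ) {
	$langlinks[$row->ll_lang] = $row->ll_title;
}
```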
npm install keeps running out of memory
[15:54:36] G_SabinoMullane: join #mediawiki-parsoid ?
[15:56:14] thanks, will do
[17:00:52] with the kerfuffle today going on with MailChimp / Mandrill (kicking developers into paid plans with short notice), are there mail delivery integrations that exist for MediaWiki? I ask because I'm evaluating which delivery options to use for Drupal, CiviCRM and WordPress
[17:02:33] freephile: what do you mean by "mail delivery integration"?
[17:03:07] MediaWiki uses php
[17:03:18] err, PHP's mailing functions (which in turn use sendmail)
[17:03:43] and it can use the PEAR mailing stuff, primarily for direct SMTP...
[17:03:46] is that what you mean?
[17:05:28] DanielK_WMDE MediaWiki can send email to users, and out of the box I think it just uses sendmail, but from some ISPs (Digital Ocean) delivery may not work well, so I was about to search for what extensions exist for offloading email delivery from MW to a service like SendGrid
[17:08:41] there is $wgSMTP to talk directly with an SMTP server https://www.mediawiki.org/wiki/Manual:$wgSMTP
[17:08:53] freephile: should be easy enough to configure, but I don't know whether there is anything bundled
[17:11:00] freephile: yes, you can use $wgSMTP. Or you configure PHP to use some sendmail-compatible script for sending mail.
[17:11:16] I would assume that most delivery solutions provide such a script
[17:12:46] freephile: ini_set( 'sendmail_path', '....' );
[17:12:56] should work for any PHP code
[17:13:46] oh, it's possible that you have to do that in php.ini, and can't do it inline in code.
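A minimal sketch of the $wgSMTP route discussed above, as it would appear in LocalSettings.php; all hosts and credentials are placeholders, and the full key list is documented at Manual:$wgSMTP:

```php
// LocalSettings.php -- deliver mail over SMTP instead of PHP mail()/sendmail.
// Every value below is a placeholder; a service like SendGrid would supply
// its own SMTP host and credentials.
$wgSMTP = [
	'host'     => 'smtp.example.com',
	'IDHost'   => 'example.com',  // domain used when building Message-ID headers
	'port'     => 587,
	'auth'     => true,
	'username' => 'smtp-user',
	'password' => 'smtp-secret',
];
```

The ini_set( 'sendmail_path', ... ) route mentioned at [17:12:46] is the alternative when the delivery provider ships a sendmail-compatible script instead of plain SMTP.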
[17:17:18] DanielK_WMDE: For SendGrid, there is an official PHP library for their API https://github.com/sendgrid/sendgrid-php
[17:18:06] But I imagine I would want to write an extension to properly interface between MW and SendGrid for email delivery, similar to what is done for Drupal https://sendgrid.com/docs/Integrate/Open_Source_Apps/drupal.html
[17:18:55] freephile: you could write an extension that makes MediaWiki use that. The relevant hook to implement seems to be AlternateUserMailer
[17:19:26] freephile: oh look! https://www.mediawiki.org/wiki/Category:AlternateUserMailer_extensions
[17:19:30] some examples there
[17:19:38] no idea if they are any good
[17:20:24] freephile: the code looks straightforward enough https://git.wikimedia.org/blob/mediawiki%2Fextensions%2FMailgun/master/MailgunHooks.php
[17:20:35] that's it
[17:22:44] DanielK_WMDE: Thanks, that's exactly what I was looking for, and simple too!
[17:23:00] freephile: had to dig a bit myself. Have fun :)
[17:23:33] I think I'll write an extension to integrate with SendGrid, or whatever I end up using for CiviCRM
[17:24:10] if you do, please make a page about it on mw.o, e.g. https://www.mediawiki.org/wiki/Extension:SendGrid or something
[17:24:31] Certainly!
[17:25:00] cool!
[17:30:23] hi
[17:30:43] i need to connect my existing mediawiki to my local host installation
[17:31:38] how can i connect to my existing wikimedia account while installing mediawiki on localhost
[17:33:59] pleas if someone can help
[17:34:06] please*
[17:34:10] what is "connect"?
[17:34:30] so i am installing mediawiki on my server
[17:34:40] and i already have a mediawiki account
[17:35:04] A mediawiki account where?
[17:35:16] on the mediawiki webpage
[17:35:17] There are thousands of MediaWiki installations out there.
[17:35:26] So you have an account on mediawiki.org.
[17:35:30] so how does it work
[17:35:35] It's unrelated.
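A hedged sketch of the AlternateUserMailer hook mentioned at [17:18:55], loosely following the shape of the linked Mailgun extension; sendViaProviderApi() is a hypothetical helper standing in for a real client library such as sendgrid-php:

```php
// Hand outgoing mail to an external delivery API instead of the built-in
// mailer. $to is an array of MailAddress objects and $from a MailAddress.
// sendViaProviderApi() is hypothetical -- substitute your provider's client.
$wgHooks['AlternateUserMailer'][] = function ( $headers, array $to, $from, $subject, $body ) {
	foreach ( $to as $recipient ) {
		sendViaProviderApi( $from->address, $recipient->address,
			$subject, $body, $headers );
	}
	// Returning false tells MediaWiki the mail was handled, so the
	// default sending path is skipped; a string would signal an error.
	return false;
};
```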
[17:35:51] this is my page wiki.wiredmessenger.com
[17:35:59] There are thousands of MediaWiki installations out there. Each of them on separate servers with their own user databases.
[17:36:35] There are MediaWiki extensions that allow using external OAuth providers as login.
[17:36:41] If that's your question. Not sure.
[17:36:52] as I still don't know what you want to achieve. :)
[17:37:00] yes so how can i get that extension
[17:37:34] to use external OAuth
[17:37:46] and how to install visual editor
[17:37:52] thanks andre
[17:39:10] Guest89127, https://www.mediawiki.org/wiki/Extension:OAuth and https://www.mediawiki.org/wiki/VisualEditor/Setup
[17:39:28] i really appreciate it thanks
[17:43:32] Guest89127: Extension:OAuthAuthentication might be your thing but it's still experimental
[17:46:00] i am trying to use MediaWikiAuth extension
[17:46:40] how can i configure it
[17:52:19] Guest89127, did you read the link that I posted?
[17:54:15] yes
[17:54:28] i am trying to configure it now
[17:55:08] If you have specific questions or specific problems, feel free to describe them so folks here can help. Thanks
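For completeness, the usual LocalSettings.php pattern for enabling an extension such as MediaWikiAuth; this is a sketch only, since whether a given extension supports wfLoadExtension or still needs require_once depends on the extension, and each one defines its own configuration globals (see its page on mediawiki.org):

```php
// LocalSettings.php -- enabling an extension (MediaWiki 1.25+ registration).
wfLoadExtension( 'MediaWikiAuth' );
// Older extensions are instead loaded with their PHP entry point:
// require_once "$IP/extensions/MediaWikiAuth/MediaWikiAuth.php";
// Extension-specific settings (variable names vary per extension) go here.
```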