[00:34:18] hi Amgine [00:34:48] Hi legoktm. I hate shorturls. Or I hate rewrite rules, I can't decide which. [00:35:20] o.O why? [00:36:57] because I get things like "The requested URL /wiki/Main_Page was not found on this server." [00:42:01] :P [01:25:42] biberao: perhaps you could make an extension for one of the wiki installations that jumps into a hook before retrieving files, and makes sure it fails? I also have no idea what the purpose for such a thing is--if their goal is content protection or a permissions system, there are likely much better ways to do it [01:45:55] redirects are indexed? [01:48:17] is that a rhetorical question? [01:52:14] well this seems kind of silly https://www.google.com/search?q=ffxiv+lancers+creed&rlz=1C1CHKZ_enUS438US438&oq=ffxiv+lancers+creed&aqs=chrome.0.69i59.3681j0j7&sourceid=chrome&es_sm=122&ie=UTF-8 [01:52:32] top two [02:26:06] ok, so I installed checkuser, uploaded the files and ran the install.php cron job, and when I went to select checkuser on my user rights, I got this: Fatal error: Call to undefined method RequestContext::newExtraneousContext() in /home/a2074436/public_html/wiki/extensions/CheckUser/CheckUser.hooks.php on line 25 [02:26:22] I looked at the file and the method seems to be there. [02:26:35] sean_kurth: what version of mediawiki are you running? [02:26:58] 1.19 lts [02:27:20] what version of the extension did you download? [02:28:19] REL1_19 [02:29:24] do you have access to the shell? [02:29:56] no, I can do cron jobs and phpmyadmin though [02:32:06] * Skizzerz is taking a look [02:32:28] sean_kurth: can you set a single-run cron to run /maintenance/update.php ? [02:32:56] !update.pho [02:32:58] !update.php [02:32:58] update.php is a script that updates the database tables. You usually need to run it after upgrading MediaWiki or installing certain extensions. For details, see [02:33:08] Betacommand: that error doesn't look update.php-related [02:33:29] if there were db errors due to a missing field or something, yeah, an update.php will fix that [02:33:29] Skizzerz: it may be db related [02:33:35] but that's a PHP error [02:34:04] should be able to, but I can't update past 1.19.x because my host's versions of php and mysql are old [02:34:27] Skizzerz: I've seen odd cases where something downstream causes PHP errors; in troubleshooting it's best to nail the low-hanging fruit [02:35:19] Skizzerz: if update.php errors out that will tell us something too [02:35:24] sure [02:37:25] looking at the code, that line is in a backwards-compat section based on presence of... something [02:39:03] Skizzerz: I doubt it's a bug with the actual code since it's been released for quite a long time [02:39:12] !version [02:39:12] To find out the version of your MediaWiki installation, visit the page Special:Version. Should the wiki be broken, but you have access to the program files, find $wgVersion in DefaultSettings.php. Please note that 1.15.0 > 1.5.0 (since nobody wants to go to 2.0). See http://www.mediawiki.org/wiki/Version_lifecycle for supported versions. [02:39:15] sure [02:39:21] but it could've been an issue with the branch point [02:39:38] e.g. the REL1_19 branch of CheckUser actually only supported 1.20+ or something [02:40:03] Skizzerz: if that was the issue we would have heard a lot more complaints, [02:40:13] and it would have been fixed [02:40:57] and he left.....
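(For reference, on a shared host like sean_kurth's — cron jobs and phpMyAdmin but no shell — update.php can be run as a one-time cron entry via the PHP command-line binary, e.g. php /home/a2074436/public_html/wiki/maintenance/update.php --quick, reusing the wiki path from his error message; the PHP binary name and exact path vary by host. The --quick flag just skips the script's interactive five-second countdown, which matters when no terminal is attached.)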
[02:41:09] yeah, that method definitely exists in 1.19 [02:41:34] Skizzerz: I suspect it was a user error [02:41:39] I do as well [03:14:43] hi,guys!how to achieve the action with extension?look this pic! [03:14:43] http://postimg.org/image/6qcbftz8j/ [03:23:02] zjhxmjl: Your question is hard to understand. Can you phrase it a different way? [03:24:38] marktraceur:i want to achieve the same format as the pic [03:25:53] Ah, well [03:25:55] but now:http://postimg.org/image/qzqsmd0vz/ [03:26:07] zjhxmjl: It looks like it's just a few divs with styling. [03:27:09] I have to manually add again and again? [03:34:06] Or you could use a template [04:03:16] marktraceur:thks [06:24:34] hi [10:44:41] what that tw channel? [10:44:44] !translatewiki [10:44:44] You can help translating MediaWiki to your language at translatewiki.net [10:44:51] meh [10:45:28] is there any common template for translate wiki that say "literally " [10:50:29] petan: as in, that "this word" shouldn't be transalted? [10:51:17] MatmaRex: no, imagine you would need to translate "yes" it would be just "yes" what would you say in documentation for that key? [10:51:35] "just translate it as yes with no context?" [10:51:47] isn't there something like {{generic|yes}} [10:51:50] or whatever [10:52:20] oh. hmm [10:52:57] actually, why would anyone translate a "yes" as something else than "yes"? [10:53:08] (when not given specific instructions to be creative, that is) [10:53:23] some languages a word can differ based on context [10:53:41] of course they can [10:55:20] but i don't see how this knowledge helps here :) [11:04:58] oops: you here? [11:15:54] pew pew [11:15:59] :) [11:16:29] plop [11:16:33] we probably need an in-house tool to enforce or semi-enforce the guideline [11:16:35] Moving https://www.mediawiki.org/wiki/Performance_guidelines discussion here. [11:16:58] I wouldnt want it to be yet another guideline people (particularly newer people) ignore [11:17:24] that's why I suggested Jenkins test job :) [11:17:32] ToAruShiroiNeko: I think this particular set of guidelines includes ones that we can check with automated tools and ones that are not possible to check that way. [11:17:33] ah [11:17:48] sumanah certainly [11:17:55] but any performance gain is a plus [11:18:24] If the benefit outweighs the cost of implementing the tool, yes. [11:18:54] I am assuming that isnt the case. :) [11:19:33] I would love for us to improve the profiling tools we have available - https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code is a guide to what we have right now [11:20:28] ToAruShiroiNeko: if you'd like to work on that, I would welcome that. Aaron Schulz and Ori Livneh are the people working most on performance right now, along with Tim Starling working on HipHopVM, Sean Pringle on database work, and others [11:20:49] Ah, unfortunately I am way outclassed for this probably [11:20:55] a profiling tool that one can run independently while on one's developer machine and that can run as a jenkins test would be cool [11:21:17] I think we can add for example Chrome web developer tools for frontend profiling... they came very handy during ULS optimization [11:21:44] One way that we can ensure that newer developers really absorb these principles is if we point to these principles when we are accepting or rejecting their code [11:21:47] also for mobile [11:22:12] "You ought to think about using the jobqueue here - see [link]" [11:23:52] hi here. coming a bit late [11:24:00] Nikerabbit: you mean to the "how to profile" page? 
I have a vague line "Look at your Chrome developer tools to check performance." but if you can specify anything further or link to a good guide I would love that [11:24:04] JobQueue isn't a magic wand though... its internals are very difficult to understand and until now most wiki installations ran jobs during web requests so it didn't actually solve anything, unless you configured the wiki like WMF does [11:24:05] hi liangent [11:24:09] nod nod [11:24:37] sumanah: yeah I can expand it a bit, I wrote something about that in http://laxstrom.name/blag/2013/12/09/performance-is-a-feature/ [11:24:59] cool! maybe just link ;-) [11:25:39] liangent: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-dev/20140519.txt has a little discussion too (of the performance guidelines) before we moved here [11:26:43] Nikerabbit: you know the answer to that? [11:26:49] is there any common template for translate wiki that say "literally " [11:26:53] sumanah: yeah I can read scrollback in my irc client [11:27:00] ok, no prob :) [11:27:02] petan: that what? [11:27:13] that word, just alone that word [11:27:18] like translate "yes" [11:27:23] are you looking for a [sic] or something? in qqq? [11:27:23] what would you write to qqq? [11:27:34] I need an example, not understanding [11:28:12] oh someone just fixed it [11:28:16] it's {{identical [11:28:20] that was what I was looking for [11:28:28] O_o [11:28:59] petan: hi there! you might like to take a look at https://www.mediawiki.org/wiki/Performance_guidelines and https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code [11:29:01] Nemo_bis: what would you write in qqq for [11:29:02] https://translatewiki.net/w/i.php?title=Huggle:No/en [11:29:10] that's what I'm in this channel to discuss for the next 30 minutes. :) [11:29:16] ok [11:29:19] petan: where it's used [11:29:28] anywhere you want to say "no" [11:29:37] like "yes / no" message boxes etc [11:29:45] it's just generic "no" [11:29:51] sounds like a bug [11:29:57] ? [11:30:10] bad message reuse [11:30:20] please list the cases in which it's used [11:30:38] alolita_: you've been at the Foundation longer than I have so maybe you remember more examples, good and bad, that I ought to link to, in this document or in the tutorial. [11:30:46] key-1: "do you want to exit huggle" key-2:"yes" key-3:"no" [11:30:48] I'd love anything you can narrate or link to [11:31:00] Nemo_bis: that is one use case [11:31:16] petan: good, start by adding this one; the more the better [11:31:34] adding where? [11:31:37] qqq [11:31:44] eh what? [11:31:51] what am I supposed to write to qqq [11:31:52] sumanah: yup - i am glad you've referred to article feedback [11:32:16] there are some other examples too which we can link to [11:32:45] Nemo_bis: I will just stick with that template, it makes sense and even mediawiki uses it [11:33:13] JeroenDeDauw: Krenair saper_ Yaron - for the next 30 min, I'm in this channel and available to talk about https://www.mediawiki.org/wiki/Performance_guidelines and the related architecture and security documents I am working on, in case you have thoughts [11:33:14] https://www.mediawiki.org/wiki/Performance_guidelines is a pretty long document, but the table of contents is pretty short :-) and a good guide to what's in it.
We describe several principles of writing high-efficiency code, explain what our goals are, and then, for each principle, explain how to implement it, and point to good and bad examples [11:33:22] are you planning to have a section where we can have some recommendations of acceptable ranges for performance timing [11:33:32] As I wrote this I also put together a very very rough thing: https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code which others might find useful. I think it is currently the best comprehensive guide to how to measure the performance of your code and use existing monitoring tools [11:33:58] sumanah: performance stuff isn't really my thing; sorry. [11:34:01] cool - this is helpful [11:34:02] petan: no idea why you think it's in contradiction [11:34:33] alolita_: I asked Faidon and others for that in Zurich. (that is, I asked for that set of ranges.) The answer I got was basically that it is SO context-dependent that experts are unwilling to set down a list of acceptable ranges of values. [11:35:14] Yaron: Understood. This is a page meant for you and for all the developers who ARE NOT usually interested in performance, to help you write code that performs well. [11:35:35] Oh, I get it. Okay, thanks. [11:35:37] it is context sensitive but on the other hand there are some ranges which can be acceptable for responsive user experience on wikimedia websites [11:35:48] so I would welcome "this makes no sense to me" or "this jargon is unexplained" or other feedback that helps me make this better for you [11:36:10] alolita_: I agree! please make this argument on the list so that we can actually get some ranges written down [11:36:18] petan: I don't find any "do you want to exit huggle" message [11:36:22] ok that works [11:36:28] Yaron: btw, please have a look at https://gerrit.wikimedia.org/r/#/c/133690/ :) [11:36:29] alolita_: maybe if more people say this then I will be able to get them [11:37:03] agreed [11:37:03] Nemo_bis: that was just sample use case, it's just used for any yes / no question, what's so hard on understanding that? [11:37:17] literally any [11:37:25] I already commented this [11:37:39] password of https://logstash.wikimedia.org/ ? [11:37:44] Nikerabbit: oops, I forgot about that one! Just approved it. [11:37:58] this is mentioned on https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code [11:38:12] Matt Walker removed a concrete goal https://www.mediawiki.org/w/index.php?title=Performance_guidelines&diff=1007057&oldid=1007037 alolita_ - "Our goal is reasonable up to 300 milliseconds [[:en:Round-trip delay time|round-trip time]] or so. If someone's on satellite with 2000ms RTT, then they can expect everything to be slow, but that's a small minority of users." - I may put that back in [11:38:17] liangent: only wmf staff [11:38:28] Nemo_bis: hm ok [11:38:31] liangent: so, I need to clarify that - logstash, like graphite.wikimedia.org, is right now only available to WMF staff [11:38:47] oh it does say that [11:38:50] "http://logstash.wikimedia.org/ is an exploratory data tool to view production log files, e.g., the logs of fatal errors, exceptions, slow parses, and so on. Only Wikimedia Foundation staff can view it, since the logs contain identifying information such as IP addresses." [11:39:17] sumanah: sorry didn't notice that [11:39:18] that may be good to put back in - RTT of <3s is considered responsive [11:39:40] but "WMF Labs (use wiki login name not shell)" is somewhat misleading... 
[11:39:43] 300 ms = 0.3 s [11:39:49] 300 milliseconds, so, 0.3 seconds. [11:40:04] good catch :-) [11:40:05] liangent: could you file a bug about that? [11:40:15] 0.2-0.3 secs [11:40:17] about the "WMF Labs (use wiki login name not shell)" message [11:40:17] there is already one [11:40:54] liangent: "WMF Labs (use wiki login name not shell)" - we should probably add a few words to indicate that it is for WMF staff only at the present moment [11:41:00] sumanah: ok after this session ends [11:41:03] thank you liangent [11:41:10] petan: still not finding any question in huggle [11:41:13] sumanah: one thing about the "Indexing" section: I've heard that sometimes the query plans might differ between WMF production and your local instance for many reasons: custom WMF indexes, different MySQL/MariaDB versions and just lack of data. I'm wondering how much of this is true and relevant and if there are ways to address those differences. [11:41:49] Nikerabbit: would you mind posting that question on the talkpage, and I will follow up with Sean Pringle? [11:41:57] sumanah: will do [11:42:00] Thank you [11:42:11] I'm very grateful for your detailed feedback Nikerabbit [11:42:28] I had a comment about the "Indexing" section too: it starts with the sentence "The tables you create will be shared by other code." What does that mean? [11:42:30] sumanah: is it possible to make jenkins or some other facilities yell when a slow operation is executed? [11:42:47] like what GAE does - database queries without index don't work at all [11:42:54] petan: is this one case in which you use yes/no messages? https://translatewiki.net/w/i.php?title=Huggle:Block-sensitivewarning/en [11:43:17] possibly [11:43:23] sigh [11:43:37] it could be used in literally any "yes" "no" question [11:43:51] even in questions that will be implemented in the future [11:45:46] Yaron: Hmmm. So, am I right in understanding that, when people change MediaWiki core or extensions, they sometimes change the database schema to add new tables? Maybe that is so rare that I should instead focus on simply the fact that we are all sharing those tables [11:45:50] petan: this is what we need https://translatewiki.net/w/i.php?title=Huggle%3ANo%2Fqqq&diff=5551322&oldid=5534713 [11:45:52] and should act responsibly in not locking them [11:46:21] Nemo_bis: petan - is there a huggle channel or a #mediawiki-i18n channel where your conversation would fit better for the next 15 min? [11:46:30] sumanah: ah, I guess this depends on whether the tables are being added to core MediaWiki or to an extension. [11:46:57] liangent: so, you are perhaps the 3rd person to bring this up, which means that several people want it. the question is: who will implement it? [11:47:40] sumanah: I think the conversation is over ;) [11:48:24] sumanah: your conversation would fit better in #-office. :) [11:48:26] sumanah: note it somewhere and wait for someone to implement it in the future :p [11:48:37] poor sumanah changing channels so often :p [11:48:40] MatmaRex: in this particular instance, I wanted to experiment with having it someplace where the developers already were. [11:49:19] liangent: ok, could you file that bug as well? or I can. [11:49:46] sumanah: i don't mind, and i'm sure no one does; but trying to evict the developers who are talking about something else here feels inappropriate to me [11:49:56] * MatmaRex shuts up now [11:50:09] MatmaRex: I hope literally *asking* whether there is another channel is not taken as trying to evict anyone!
[11:50:18] the answer "no" is a perfectly legitimate answer to my question [11:50:34] it's ok to ask for things and it's ok to be told no - I figure this is how adults negotiate [11:50:52] sumanah: and the next: the document encourages caching everywhere, but we had some bugs about cache pollution which must be avoided [11:51:20] e.g. https://bugzilla.wikimedia.org/show_bug.cgi?id=41354 , and I remember ULS for anons caused this at some time in the past [11:51:26] i have heard the term "aggressive caching" used often nowadays [11:51:43] it might be worth a few lines to give a warning about this [11:51:43] alolita_: what does "Aggressive caching" mean? preemptive caching? [11:52:15] liangent: so the cache pollution bugs are bugs that developers introduced when writing new code, correct? (rather than bugs inherently in the caching infrastructure) [11:52:38] liangent: ULS (language selection part) for anons was never enabled, so it shouldn't have caused any caching issues... but caching issues are the reason it hasn't been enabled [11:52:48] cache as often as needed [11:53:16] Nikerabbit: ok then it must be some 3rdparty wikis I've seen [11:53:23] alolita_: what would a different strategy look like? what would we call it? (this would help me understand what "aggressive caching" is and how to explain it) [11:54:06] sumanah: it happens more often when some developer is adding new features to a component which is already using some caching infrastructure [11:54:11] gabriel wicke has an excellent writeup for the parsoid [11:54:21] alolita_: link would be great [11:54:23] "aggressive caching" just means caching a lot of stuff, I think. [11:54:45] Yaron: yup as often as many times as needed [11:55:31] In our case (my opinion), we trade some interactive features to enable caching, allowing us to scale better [11:56:19] "aggressive" is the opposite of "needed" [11:56:22] of course it is not that simple, JavaScript on frontend is mixing things, and ESI is supposed to allow more granular caching so that interactive/customized parts can be included without a performance hit [11:56:29] sumanah: is the objective of the performance guidelines to focus on mediawiki optimizations or Wikimedia website performance optimizations [11:56:56] are we generally moving to heavy client side (javascript) processing to enable more server-side caching? [11:57:08] since the factors may be different accordingly [11:57:32] liangent: I don't think so... client side processing should also be kept fast [11:57:39] alolita_: the performance guidelines focus on MediaWiki core, extension, and gadget developers on Wikimedia sites. So many of these guidelines will not be applicable to other MediaWiki sites. [11:57:46] liangent: I just think it will get more granular and complex [11:58:00] sumanah: having read through https://www.mediawiki.org/wiki/Performance_guidelines now, I would actually recommend splitting it up into three pages - or at the very least three big sections: for WMF developers, for non-WMF core MediaWiki developers (if there are such people), and for extensions developers. [11:58:29] Yaron: (to answer quickly: yes, MatmaRex and thedj and several other people who do not work for WMF are core maintainers) [11:58:37] Alright, then. [11:59:05] For example, there's discussion of specific technologies like Swift and MariaDB, that doesn't make sense for non-WMF usage.
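(A concrete illustration of the cache-pollution class of bug liangent raises above: the classic form is a cache key that omits an input the output depends on, so one user's variant gets served to everyone. A minimal sketch in Python rather than MediaWiki's actual PHP caching layer; the function and key names are invented for illustration:

    cache = {}

    def render_page(title, lang):
        # every input that affects the output -- here the user's language --
        # must be part of the cache key; the polluting version omits lang
        key = ("page-html", title, lang)
        if key not in cache:
            cache[key] = "<h1>%s</h1> rendered for language %s" % (title, lang)
        return cache[key]

The anon-language bugs mentioned above were roughly this shape: language-dependent output cached under a key that did not vary by language.)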
[11:59:14] https://gerrit.wikimedia.org/r/#/admin/groups/11,members is a bit messy, because it explicitly names several WMF engineers who are core maintainers, but it also mentions several non-WMF people [11:59:42] Yaron: MariaDB does, Swift perhaps less so [12:00:16] Right - not that those can't be used, but rather that developers have no control over which technologies their code will run on. [12:00:45] Yaron: Yes, absolutely, this document is specifically focusing on WMF usage and is suitable for MediaWiki core, extension, and gadgets developers who are aiming to get their code deployed on Wikimedia sites. From your perspective, which sections are also currently suitable for people who are working on other MediaWiki installations? [12:00:47] Maybe, then, the distinction needs to be between developers and infrastructure people - this page seems to mix in advice for both. [12:01:31] sumanah: that makes sense thanks! [12:01:52] Yaron: I have aimed to only give advice to developers - if you can point to any advice that does not make sense for developers, please do tell me so I can alter it [12:02:15] sumanah: anything that mentions a specific DB system like MariaDB, for instance. [12:02:38] But if this page is WMF-specific (which is fine), maybe it should have a less generic name? [12:02:58] Nikerabbit: heavier doesn't mean slower... modern browsers are more powerful in javascript execution. and re ESI: isn't it cheaper to ESI a simple json than to ESI some fully rendered HTML? [12:03:15] Yaron: Hmm. So, sometimes I mention, for instance, MariaDB or memcached, because it can be easier for people to think about things as specifics instead of abstracts. But I could balance that as: "our db system (MariaDB)" or similar [12:04:15] Nikerabbit: you said "In our case (my opinion), we trade some interactive features to enable caching, allowing us to scale better" - I think this is often true. https://www.mediawiki.org/wiki/Architectural_vision_discussion - some people have been tossing around the idea of aiming at making logged-in browsing as fast as anon [12:04:24] which would be a step towards more interactivity? maybe? [12:04:25] Well, my response to that might depend on whether this is in fact intended as a guide for WMF-deployed code - before you mentioned it, tI was reading the page as a generic guide. [12:04:48] *I* was reading... [12:04:57] That was a weird typo. [12:06:18] liangent: that's getting a bit too much in details for my experience to say what is better. I just mean that 1) pages will be constructed from more pieces 2) we should also measure and limit frontend processing time [12:06:23] it's important to remember that when it comes to having your code deployed widely, only WMF infra really matters to be honest. [12:06:37] thedj: mmmmm. Wikia? [12:06:38] this is also why so much dev fails. people don't think about WMF scale [12:06:52] sumanah: who ? ;) [12:07:14] ashley: would welcome your perspective in this discussion of https://www.mediawiki.org/wiki/Performance_guidelines [12:07:48] Yaron: I've changed the second sentence of the doc to clarify and respond to your feedback. [12:07:51] now it says: "If you write code like that for Wikipedia and Wikimedia sites, please follow these guidelines so you don't slow down the site, whether you're writing a one-line bugfix, a new gadget feature, a whole new extension, or a big change to core." [12:08:20] Yaron: maybe it should have a less generic name, or live on wikitech.wikimedia.org. 
[12:08:42] sumanah: either one of those sounds good, I think. [12:08:48] Hmmmmmm. [12:08:54] * sumanah will ponder, welcomes suggestions [12:09:03] If the page lived elsewhere, the "we" in the first sentence would also make more sense. :) [12:09:04] not on wikitech [12:09:29] it's mediawiki conventions. like our CSS/JS and PHP code conventions [12:09:54] thedj: maybe you could help point out the generic parts and the WMF-specific bits and we could transclude appropriately [12:10:06] I think the page is useful for all developers. Some developers can ignore it, but for the code running on WMF it is a must to follow. [12:10:30] Yaron, that page also lists some WMF people who are volunteer core maintainers [12:10:41] Krenair: alright. [12:11:22] I would think the right solution is indeed to split up the page, potentially into three pages... [12:12:00] Nikerabbit: I feel like I should use what you just said, the way advertisements for movies use quotes from movie reviewers. "useful for all developers... a must to follow" says Nikerabbit! "Five stars" says the Signpost! "A thrilling ride for all ages" says sumana's sockpuppet! [12:12:19] :-) [12:12:31] lol [12:12:36] "I think we should split this into a trilogy" says Yaron (kidding) [12:12:43] LOL [12:12:54] Part I: The Two Tables [12:12:56] now this is getting to be fun : LOL [12:13:01] sumanah: i would also put it in our conventions navbox eventually. like accessibility guidelines and code conventions are as well. [12:13:02] Part II: The Fellowship of the Cache [12:13:18] "I couldn't stop commenting on it." [12:13:23] Part III: The RL that binds them [12:13:30] Part III: The Very Fast Round Trip Return of the King [12:13:36] :D [12:13:58] thedj: agreed - in fact if you feel like putting it in there now I think it is ready [12:14:13] With some animation and cool data graphs these parts could be rockstars at Wikimania's hackathon [12:14:26] sumanah: your tutorials will be a hit [12:14:30] :) [12:16:06] more seriously I am glad to have written something that people will find a useful reference - I would not be surprised if Yaron is right and it ought to be broken down in some way, but still, I'm glad to have Contributed To Knowledge And Wisdom [12:16:42] and yeah, the tutorial I write and the poster that Moiz designs will be helpful [12:16:42] thanks for the discussion - this was a great start and will add more to the mediawiki page. [12:16:42] cool [12:16:43] thanks [12:17:26] sumanah: in the intro, what do "we" in the first sentence and "the site" in the last sentence refer to? [12:17:36] Those should probably be clarified. [12:17:41] hmm, design team should probably spend a bit of time on https://www.mediawiki.org/wiki/Style_guide [12:17:57] andre__: congrats on http://blogs.gnome.org/aklapper/2014/05/19/wikimedia-phabricator/ [12:18:05] Actually, "the site" in 2nd-to-last sentence, and "we" again in the last sentence. [12:18:15] Yaron: I was sort of going for an Upgoer 5 style of writing. Will reconsider. [12:18:23] sumanah: oh thanks, appreciated! I felt this need to summarize the steps so far for myself and let the world know :) [12:18:34] thedj: we're pointing to https://www.mediawiki.org/wiki/Wikimedia_Foundation_Design instead I think [12:18:37] or trying to [12:19:00] sumanah: then it's probably a good idea to mark some of that other stuff as historic [12:19:02] Is that like a "Simple English" thing? [12:22:23] Yaron: Similar, yes. 
I wanted to write the first few sentences in very clear and accessible prose, using words everyone understands like "Fast" instead of words that sound jargon-y. But it might be unclear in not being specific. [12:23:30] Right, okay. Well, putting "we" on a mediawiki.org page is rarely clear, I would think. [12:23:35] I hear you. [12:23:40] sumanah: cc me if you're creating the bug about catching slow operations automatically [12:24:16] Yaron: it's sad. I wish "we" clearly meant "us, including you the reader - everyone who cares about MediaWiki." but that is not obvious and I should not pretend it is. [12:24:32] liangent: OK [12:24:49] sumanah: oh... I thought "we" here was the WMF, actually. [12:25:03] Just make it explicit "MediaWiki developers" [12:25:14] You're probably right thedj [12:26:08] This page seems intended for others too, though, like infrastructure people. [12:26:11] btw the Hour Of Sumana (as promised in the wikitech-l mail) is over now, but I will be hanging out for a while longer [12:26:33] Yaron: if you want to have your code deployed, you SHOULD think about infrastructure [12:26:39] the one doesn't go without the other. [12:26:47] Okay... [12:27:03] I am curious about what seems intended for "infrastructure people" and not MediaWiki developers [12:27:18] As I said, anything related to Swift, MariaDB... [12:27:38] Well, I haven't said anything about how to tune those tools [12:28:02] sumanah: also, make a point of noting "keeping asking for help if you are uncertain about one of these areas", "know what you don't know" [12:28:02] Yes, that's true... maybe I was just confused by the fact that they're even mentioned. [12:28:11] I have mentioned that they are different bits of our infrastructure that are available and that the reader as a MW developer will need to choose to use the right ones [12:28:37] thedj: hmmm. that's in "how to think about performance" and I don't want to be TOO redundant.... [12:28:46] ok [12:29:04] sumanah: how's this for both clarifying *and* simplifying: change "Redis job queue, MariaDB database, or Swift file store" to "job queue, database or file store". [12:29:42] Yaron: the thing is, to help both Sensing and Intuitive people (people who learn from concrete examples/names and people who learn via patterns and principles), it is very helpful to mention both [12:29:51] Oh man. [12:30:19] This is one reason that we offer principles (for Intuitive learners) and have a good and bad example of each principle (for Sensing learners) [12:30:46] Well, without getting into too high-level a discussion here, I'll note that the majority of concepts mentioned here don't in fact have a specific example attached. [12:31:02] It mentions extensions and gadgets, for instance, without giving an example of either. [12:31:41] There are concepts mentioned that we don't think people need to *learn* via this document. [12:31:48] For those, linking might be the best option. [12:31:56] Who's "we" now? :) [12:32:05] OK, I. [12:32:26] "we" just now meant "all the people who worked on this doc with me" [12:32:48] (and who agreed on the structure of the document and on balancing specific and abstract in the way the doc does right now) [12:32:49] but I am the main author. [12:33:14] Well, it's fine to mention Swift and the others - but I think the way it's currently done is a little confusing, in that it seems to switch to a WMF deployment-specific thing. [12:33:41] Nod. [12:33:43] ...even for the Sensing people among us. [12:33:59] Maybe especially for them. [12:34:24] OK. 
I'll do something to emphasize that there's an especially WMF-specific set of nouns involved in that section. :) [12:35:32] Yaron: btw the Sensing-Intuitive spectrum is one of the 4 Felder-Silverman learning styles - http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Learning_Styles.html I have found these very useful in thinking about how I improve my own engineering skills, and in teaching others [12:35:51] sumanah: cool, thanks. [12:36:08] http://www4.ncsu.edu/unity/lockers/users/f/felder/public/ILSdir/styles.htm is an overview in case that is of interest [12:36:08] sure [12:37:40] But it's great that there's a performance guide now - it's actually strange to think that there wasn't one before. [12:37:58] YES [12:38:47] Yaron: throughout writing this thing, I kept having this vertiginous "is it seriously actually true that I am writing this for the first time? am I not duplicating work?" feeling/thought. And we have had several conference talks before, which I link to at the bottom. But yeah, we have needed this for a while! [12:39:56] Right, that makes sense. [12:43:51] Hi, I have an English wiki set up, and I would now like to enable others to translate this. I would like to use the same database, so a single installation that handles multiple languages. What is the best way of doing this? [12:43:54] e.g. if there is /wiki/exposure then should a French person make /wiki/fr-exposure or /wiki/exposition or what? [12:44:09] DrSlony: translate extension [12:46:38] thanks Nemo_bis [12:47:53] There are many other systems and you don't want to look into any of those unless your objective is to ruin your life [12:59:04] hey Niharika [13:08:57] Hi sumanah! Sorry, didn´t see your ping before. [13:09:18] hi Niharika! No prob. [13:09:35] sumanah: Did you book the hotel for OSB yet? Should I do it sooner than later? [13:09:45] I need help adding blinking text to my wiki. [13:09:52] I already booked my hotel for OSB, yes. I suggest you do it as soon as you can Niharika because hotels do sell out [13:10:05] Alright. I will do that. [13:10:06] Niharika: http://opensourcebridge.org/wiki/2014/Lodging [13:10:23] Niharika: a hostel will be cheaper. [13:10:44] Specifically, the <blink> tag. [13:10:49] sumanah: Yeah. I´ll book one tonight, probably. [13:10:56] nod [13:11:41] brother_johnny: I don´t know how to do that, but, blinking text is mostly a bad design idea, from what all I know of web design. [13:12:44] Okay. [13:13:18] Oh, and it's only supported on some browsers. [13:13:44] Google Chrome not being one of them. [13:14:10] That eliminates a lot of audience, I think. [13:14:19] nod [14:40:23] yo [15:19:38] * thedj suddenly remembers that he was going to 'look for something' for sumanah ... [15:19:48] :) [15:19:51] * thedj checks phone's notepad... [15:20:20] accessibility organizations... [16:05:12] qgil: hi [16:05:19] you were frozen, and I quit and rejoined [17:03:44] biberao: here now [17:16:25] 6c6c/query oops [17:16:27] crap [17:16:41] you still here oops ? [17:27:09] biberao: yup [17:27:28] pm me ? [18:51:55] liangent, do you guys still need https://zh.wikipedia.org/wiki/MediaWiki:Common.js/1.16wmf4.js ? [18:56:20] if (window.jQuery === undefined) { [18:56:21] lol [19:00:41] hi fhocutt :) welcome [19:00:49] hi sumanah!
[19:01:13] * sumanah toots a horn of celebration [19:01:22] * fhocutt has confetti around here somewhere  [19:02:06] fhocutt: so, in reviewing https://www.mediawiki.org/wiki/Evaluating_MediaWiki_web_API_client_libraries#Deliverables I figure it's good to check what might not have gotten done out of the " [19:02:07] Community Integration Period (8-10 working days) " section [19:02:17] or, what you have already achieved [19:03:01] and then we can look at the "Week 1" section and see how to structure that work and what help you'll need [19:03:07] and then schedule our chats for this week :-) [19:03:12] sound good to you fhocutt? [19:03:12] ok, sounds good. [19:03:15] great [19:03:19] "Set up computer: install necessary languages and interpreters/compilers, install git, get everything working" - how is this? [19:04:16] I have git, I have python set up with virtualenv, I don't have the others fully set up (I think I have most of the languages though) [19:04:30] ...but I do have mostly-functional wireless, which was a challenge. :P [19:05:17] Important first step :) [19:05:24] YES [19:05:33] * fhocutt mutters about Linux and wireless drivers [19:06:03] fhocutt: if you write up what your current situation is (in a quick blog post) and what you have tried, I will dent and tweet about it and probably some expert will fling advice in response [19:06:27] hi valhallasw! [19:06:40] hi sumanah! [19:06:53] fhocutt: Are you an OPW person? [19:06:53] bawolff, I am! [19:06:58] she is MY OPW intern [19:07:01] * sumanah says proudly [19:07:08] bawolff: https://www.mediawiki.org/wiki/Evaluating_MediaWiki_web_API_client_libraries is her project [19:07:23] fhocutt: Well welcome to Wikimedia :) [19:07:26] (ok, not just mine, because Mithrandir is my co-mentor and valhallasw and anomie have volunteered as technical advisors) [19:07:29] thank you! [19:07:47] bawolff: she'll be emailing some on mediawiki-api in case you want to look at that list's archives once in a while [19:08:47] re: the drivers, there is someone who's actively working on getting this driver working and who'll be offering an updated patched kernel sometime after the 3.16 kernel comes out [19:09:03] Ah. Do we know around when the 3.16 kernel will come out? [19:09:17] so, fhocutt, it sounds like you still have a TODO to get yourself set up to develop in Perl, JavaScript, Ruby, and Java but we will get you into those languages more on a just-in-time basis (like, as you are about to evaluate them) [19:09:31] yes, that sounds right [19:09:38] "Search for the existing API client libraries, add these to API:Client code" - have you done this? [19:09:52] and no, I'm not sure when 3.16 comes out [19:10:03] I've searched for them but I haven't added them yet. [19:10:30] https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_client_libraries/Status_updates/Search_results [19:10:51] ooooh! [19:11:00] I had forgotten this page, sorry [19:11:34] re the bit that starts "Hi Sandra," I thought for a moment that you had gone undercover as "Sandra" to ask a question and get an answer [19:11:51] * fhocutt grins [19:11:52] * sumanah imagines fhocutt in a trenchcoat and wide-brimmed hat, skulking [19:11:59] nope, just not the first one to have this problem [19:12:17] Right [19:12:37] if I pick this back up again Asheesh mentioned how to actually address the problem is his PyCon talk [19:12:46] Oh cool [19:14:00] *in [19:14:05] how? 
I don't recall right now--it had something to do with how the package is installed and the installer thinking that some form of string must be an IPv6 address [19:15:10] but there is a way to get around that [19:17:34] ok, so would it make sense for one of your tasks this week to be to add links to those libraries to API:Client code ? [19:17:49] yes, sounds good [19:17:53] "Communicate with mentors and MediaWiki community" [19:18:19] I think I've done that [19:18:50] I am still getting used to the IRC/mailing list information flow [19:19:14] Nod nod [19:19:33] fhocutt: are there any particular bits of etiquette, taboos, rhythm I can help you with? [19:19:56] a list of taboos would be helpful [19:20:01] Hmmm! [19:20:25] Well, I'll start just braindumping, in a few moments when we've finished the sort of task accounting. Sound good fhocutt? [19:20:39] works for me. [19:20:52] Cool [19:21:02] "Work on a mini-project to explore the MediaWiki API" you did! and it is neat! [19:21:12] bawolff: https://twitter.com/AutoWikiFacts is by fhocutt and it is fun :) [19:21:22] yes! [19:21:55] question, how hard did you expect the getting wikidata claims part of it to be? [19:22:18] I did expect you would learn a lot. But I didn't have a time estimate. [19:22:38] I definitely learned a lot! [19:23:12] Hi. [19:23:16] Hi Piotrek [19:23:26] fhocutt: "Submit first pull request to a git repository" - I believe you did this, right? with DW? [19:23:31] Can Unix timestamps be used with the API? [19:23:51] not just submitted, had two accepted :D [19:23:54] Piotrek: no, all timestamps should be ISO formatted [19:23:58] fhocutt: nice! [19:24:02] ah [19:24:18] valhallasw, thanks! [19:24:44] Piotrek: most programming languages have something to convert formats though [19:24:57] oh neat! fhocutt I see you in https://github.com/dreamwidth/dw-free/graphs/contributors [19:25:06] https://github.com/dreamwidth/dw-free/commits?author=fhocutt [19:25:30] hey, yes! [19:25:50] Piotrek: You can even use the unix date command to do so if you need to [19:26:10] fhocutt: can you figure out how to navigate to that page yourself, given https://github.com/dreamwidth/ ? [19:26:27] I can tell/show you. It might help you as you do your research, to know how to do that in GitHub [19:26:37] yep, did that [19:26:44] ty [19:27:18] fhocutt: OK, if you can navigate yourself then I shall not spew redundant info at you :-) [19:27:30] fhocutt: join me in #wikimedia-research? I have good news for you :) [19:35:10] ok, after that interlude! fhocutt this week I believe you have "Select the best library/libraries in Python, Perl, JavaScript, Ruby, and Java." basically a shortlist for each language. and "Research and decide on criteria to evaluate these libraries in more depth." [19:35:19] the first being required and the second being a bit more smooshy [19:36:06] yes, the second carries over into (required) in week 2 [19:36:53] do you feel like you have what you need to start the "select the best" process? [19:37:22] I would like to talk about how to break that down [19:37:24] ok! [19:38:06] so, I can offer a few criteria that would help you make the shortlists - to distinguish "belongs on the shortlist" from "kind of not even worth bothering with" [19:38:08] also this sounds like I will need to get Java/JS/Perl/Ruby up and running asap if I'm to evaluate them [19:38:09] if that would help [19:38:12] that would help. [19:38:40] it may be possible to create the shortlists without needing to USE the libraries.
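(Picking up Piotrek's timestamp question above: the conversion bawolff and valhallasw describe is a one-liner in most languages. A Python sketch, with an arbitrary example timestamp:

    import datetime

    ts = 1400500000  # arbitrary Unix timestamp
    iso = datetime.datetime.utcfromtimestamp(ts).strftime("%Y-%m-%dT%H:%M:%SZ")
    print(iso)  # 2014-05-19T11:46:40Z

The API also accepts MediaWiki's own 14-digit yyyymmddhhmmss form, but ISO 8601 is the safe choice.)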
[19:38:50] 1) how recently were they maintained? [19:38:54] this feels super key to me. [19:39:07] And you can see this just by looking at their git or other source control logs. [19:39:42] what if they work fine and would theoretically be responsive if anyone found a problem? [19:39:48] It's also a good idea to cross-reference that with the amount of bugs/pull requests open. [19:39:55] Yes, exactly :-) [19:39:57] (does that ever actually happen?) [19:40:44] I think so. If it's a simple library that e.g. just directly connects a function call to an API call, I'd expect not much maintenance to be needed [19:40:44] fhocutt: Since MediaWiki changes, and since we have breaking changes to the MediaWiki API .... at least a few times a year, a healthy client library would need to get updated at least a few times a year [19:41:08] valhallasw: maybe you disagree with me? [19:41:41] I think it depends on the complexity of the library, really; most of the breaking changes recently were about the specific API return format [19:41:58] so if the library doesn't try to parse whatever the API returns, it wouldn't need to be updated [19:42:38] valhallasw: would the API breaking changes all be in the release notes for each version? I presume so - correct? [19:42:44] I think so, yes. [19:42:54] Otherwise, mediawiki-api-announce is a good list to check [19:43:00] for instance: https://www.mediawiki.org/wiki/Release_notes/1.22#API_changes [19:43:06] ( http://lists.wikimedia.org/pipermail/mediawiki-api-announce/ ) [19:43:12] https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce [19:43:13] :) [19:43:58] We can see in http://lists.wikimedia.org/pipermail/mediawiki-api-announce/ that not every month has a msg :-) in fact only ~7 of the last 12 months have any traffic on that list [19:44:02] Yes, if anyone breaks the api without mentioning in the RELEASE-NOTES, they should get a dirty look [19:44:06] ok, sounds good [19:44:24] also the HISTORY file has all the old release notes combined [19:44:33] fhocutt: a common pattern in Wikimedia-land and in open source in general is that there is a high-traffic list for discussion, and then there is a low-traffic list for announcements, that only a few people can post to [19:44:55] bawolff: if you were new to navigating the MediaWiki codebase, would you be able to find the "HISTORY" page? honest question [19:45:11] Its in the main directory [19:45:52] https://git.wikimedia.org/blob/mediawiki%2Fcore.git/master/HISTORY [19:46:32] ah. I wouldn't have necessarily expected them to be in the codebase itself. [19:47:11] Thanks bawolff - I was gonna find it and then I got distracted by Hacker School discussion (a fellow Hacker Schooler announced in the HS chat that he has been hired to be a model for MAC cosmetics in June) [19:47:11] I'd have gone looking on mediawiki.org. [19:48:08] ok, so red flag if a library hasn't been updated for ~6-12 mo? [19:48:34] valhallasw: what do you think? I think if a library has not been updated at all for a year then I think it goes off the shortlist [19:48:49] and investigate to find whether it's because it works and hasn't had breaking changes or whether it's neglected [19:48:54] sumanah: sounds reasonable. 
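(The recency signal just discussed, cross-referenced with open/closed pull-request counts as suggested, can be gathered mechanically for GitHub-hosted libraries. A rough sketch against GitHub's public API; owner/repo is whatever library is being shortlisted:

    import requests

    def repo_signals(owner, repo):
        """Last-push date plus open/closed pull request counts for one repo."""
        base = "https://api.github.com"
        info = requests.get("%s/repos/%s/%s" % (base, owner, repo)).json()
        counts = {}
        for state in ("open", "closed"):
            q = "repo:%s/%s type:pr state:%s" % (owner, repo, state)
            r = requests.get("%s/search/issues" % base, params={"q": q})
            counts[state] = r.json()["total_count"]
        return info["pushed_at"], counts

Unauthenticated requests are heavily rate-limited, so this suits a handful of candidate repos, not a crawl.)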
[19:49:09] especially if there are bugs open [19:49:17] a lot of open pull requests is definitely a red flag [19:49:44] a lot of open reasonable pull requests - in my opinion this depends on the ratio of open to closed [19:49:52] like, 10 open requests from the last year and none closed - bad [19:50:01] 10 open requests and 100 closed - good [19:50:09] (the open ones might be bad code or antifeatures, right?) [19:50:22] makes sense to me [19:50:48] ok, so that [19:51:00] that is (1) -- maintenance, recently maintained, etc. [19:52:10] (2) have you checked out the "support matrix"? https://www.mediawiki.org/wiki/API:Client_code#Support_Matrix [19:52:29] yes, I have [19:53:12] last updated http://api.wikia.com/wiki/Client_libraries?action=history :/ [19:53:19] sumanah: yes, you're right. [19:53:31] oh my. [19:53:40] valhallasw: about the ratio of open to closed pull requests? nod nod [19:53:48] fhocutt: yeah :/ [19:55:29] fhocutt: hmmm so that info, since it is so old, may be out of date and the libraries may have gained or lost capability since then [19:56:10] definitely [19:57:16] it says it's a list of "the best" ones but I don't know what criteria they were using [19:57:30] Mithrandir: valhallasw: so, I have a little argument to make and I want to hear your thoughts/argument back. :) basically I think that we should concentrate on libraries that actually try to translate MediaWiki's API into the idiom of the language, because if the library is just a way to send and receive HTTP requests, then it isn't actually helping the programmer any more than a generic equivalent of python's requests library [19:57:48] in terms of helping the programmer work *with MediaWiki* [19:58:07] and do interesting things with pages, users, and other MediaWiki-specific abstractions [19:58:38] sumanah: does requests handle logins/cookies/etc. gracefully? [19:59:05] and thus it will be easy, even without having to break out one's IDE and program in Ruby/JS/Java/etc., to see -- from comparing the code of two different projects -- if one is more like that than another is [19:59:09] fhocutt: lemme check! [19:59:16] sumanah: at that point, it's just synthetic sugar, sure. [19:59:34] Mithrandir: do you mean "syntactic sugar" or are you making a subtler point? [19:59:38] * sumanah looks at http://docs.python-requests.org/ [19:59:45] I mean syntactic sugar, yes. [19:59:57] syntactic sugar? [19:59:58] I think there's value in syntactic sugar libraries as well as libraries that try to map the API into something else. [20:00:04] value, yes! [20:00:11] sumanah: Hmhm. I'm not sure where to draw the line, though -- a library that just handles login/cookies for you? Once it handles continuations? [20:00:27] fhocutt: yes, requests handles cookies http://docs.python-requests.org/en/latest/user/quickstart/#cookies but I have not used it that way myself, yet [20:00:29] fhocutt: basically, a library that just makes the api less tedious to use by doing things like always passing the logged-in session you have for you [20:00:38] My gut feeling says no and yes to those two, but I'm not sure how to make a very clear definition of one vs the other [20:01:18] fhocutt: "syntactic sugar" is a programming term meaning: making it easier for the programmer to do a common thing by reducing the number of characters she has to type or allowing her to use an easy-to-remember name for it [20:01:22] less tedious is useful. [20:01:26] ah, ok. [20:01:59] and continuations as in "get more items from the same search"? 
[20:02:07] fhocutt: yes [20:02:45] like, in Python, you create a new class by doing: class nameofclass(thing-I-inherit-from): [20:02:48] and then defining the class [20:03:54] there are much more tedious things you COULD do to create a class, involving a bunch of __methods-that-look-like-this__ ("double-underscore methods" or "dunder methods") and explicitly stating various defaults [20:04:28] but using the "class nameofclass(thing)" syntax is "syntactic sugar" that lets you avoid the tediousness and gives you the defaults that you almost always want [20:04:55] this is not important to you right now fhocutt but I thought an example might help [20:05:08] sumanah, thanks [20:05:12] I may have distracted things too much though [20:06:14] I think Mithrandir and valhallasw and fhocutt and I agree that - for fhocutt's purposes - it would be better to concentrate more on documenting & improving the libraries that actually try to grapple with the abstractions MediaWiki presents, right? although we should also SAY to developers "if you just want something to deal with logins/cookies/continuations, here's one like that too" [20:07:24] Right. In the case of python, it actually seems it's simple, as there are no intermediate forms [20:07:26] I think it's worth flagging the simpler libraries that are actively developed/well-documented as I go along, but agree on documenting/improving [20:07:53] because that would be useful information to have on API:Client Code [20:09:21] sumanah: I think both are valuable, so it's up to fhocutt. The syntactic sugar ones are simpler to deal with, since they don't require you to know the language and the idioms, but because they can go across how things are usually done in a language, they can also feel "wrong" in a language. [20:09:57] They also require you to do some non-trivial stuff such as continuations and, for instance, edit tokens [20:10:33] Mithrandir: valhallasw - which of Python, Perl, JavaScript, Ruby, or Java would you be willing to pair/advise Frances in this week? Maybe this would be an activity where it makes sense for an expert to pair with her to help her look at the libraries that exist [20:10:38] I think, at the very least, a library should take care of logging in/out, continuations and edit tokens; whether the interface is json or really suited to the specific language doesn't really matter. [20:11:20] and help her generate the shortlist, by saying "oh this is obviously shit, we should cross it off the list, here's why I was able to say that" for a few of the libraries [20:11:21] I'm happy to help with Python, and I can try to help with Ruby and Java, at least. [20:11:25] sumanah: this week is really packed for me, so I don't really have much time now, but in terms of languages python/perl are the ones I can do. [20:11:40] I can evaluate ruby too, but I'm no ruby guru [20:11:53] java I haven't written for years, and JS I'm a novice in [20:12:01] I don't know how Perl OOP works, so I think Mithrandir would be better for that. I'm also not very used to the async JS structure. [20:12:43] anomie: hey there, you know JavaScript. would you have time to help Frances by looking at a few JS libraries and helping her understand which ones should go on the shortlist of libraries to evaluate & document? [20:12:51] in any case, I've had Ruby on my list of languages to take a better look at, so this sounds like a good reason ;-) [20:12:55] :) [20:13:22] fhocutt: btw, do you know what "OOP" or "async" mean? 
[20:13:33] OOP, yes, async no [20:14:12] fhocutt: "asynchronous" [20:14:16] Sorry; asynchronous requests/function calls. In JavaScript code, you often see code that uses callbacks; i.e. tells the browser 'please get this URL for me, and then call this function with the data you received' [20:14:36] hm, ok [20:14:50] sumanah: Time when? Not right now though. Although MediaWiki's built-in "mw.Api" stuff should probably go on the list. [20:14:50] as opposed to typical Python code, where you would have 'result = request(....)' and then on the next line you do something with result [20:14:58] anomie: this week [20:15:17] fhocutt: I'm terrible at throwing acronyms around, so if I say something that you don't know, please stop me and ask. I'm happy to explain, but I just suffer from "this is obvious to me, so it must be obvious to the entire world". [20:15:53] Mithrandir, I know how that goes [20:16:21] anomie: maybe Thurs or Fri - basically I think you would set aside 30 min to talk here in IRC with fhocutt, she would give you a list of the libraries and the criteria she has assembled so far, you would quickly look at those libraries and tell her which libraries are good or bad according to those criteria [20:17:21] sumanah: I'd be worried that evaluating a library would take longer than 30 minutes, depending on the criteria. But we could give it a try. [20:18:08] anomie: this is a first evaluation to check "which libraries go on the shortlist?" [20:18:36] anomie: so (a) 30 min might be enough, and (b) if you schedule it for Thurs then you can continue your discussion on Fri if it flows over. [20:20:15] sumanah: Sure. We can schedule it now, or else just feel free to claim a 30-minute spot on my calendar on Thursday, anything showing as "open" should be open. [20:20:29] Hello, I have a mediawiki installation being served over an apache reverse proxy. In order for the page to load correctly I had to set $wgServer to the name of the web proxy. [20:20:29] * anomie has to disappear in a few minutes [20:20:55] fhocutt: I'll set up a GCal event for sometime like noon your time and 3pm Brad's time, on Thurs, or something [20:21:01] ok sounds good. [20:21:04] But now when I load from behind the proxy the page obviously doesn't load. [20:21:26] akio: If you just remove the $wgServer line altogether from LocalSettings.php, it will be auto-detected [20:22:57] akio: Are you using the reverse proxy to cache things? If so, there's additional config stuff you can set to make mediawiki take advantage of a caching reverse proxy [20:23:38] invite sent. [20:24:00] valhallasw: do you want to set up a few similar meetings for Ruby and Python and Java? [20:24:00] bawolff: thanks, that works like i want it to [20:24:20] is that the useprivateips/squidserver options? [20:24:32] sumanah: I'd actually prefer e-mail, but if you think IRC is better, that works for me. [20:24:33] $wgUseSquid and friends [20:24:50] valhallasw, I can do email [20:24:54] yeah, i will do that if i notice page loads being crappy [20:25:03] valhallasw: fhocutt - you can figure that out between you both, yeah [20:25:19] akio: You also have to set up cache purging for that to work properly. MediaWiki can be set to send HTTP purge requests to relevant servers when needed [20:25:23] and fhocutt maybe you can also do work via email with Tollef for the Perl stuff. IIRC Perl is a language you can also get around in, right?
[20:25:45] if yes, it sounds like everything except for Java's covered [20:26:10] oh, fhocutt, valhallasw has offered to at least try the Java stuff :) [20:26:11] oh, ok! Sounds good. [20:26:14] yes [20:26:22] akio: One thing to caution you about removing $wgServer entirely. It can cause inconsistency on what value is displayed for the {{SERVER}} and {{fullurl: magic words, when used in wikitext (usually this doesn't cause too many problems) [20:26:38] * sumanah delegates :) (thank goodness that we have three mentors who are much more experienced programmers with varied skillsets!) [20:26:53] if those aren't used by default then we are good [20:27:04] relatively new wiki and people don't know how to use things like that [20:27:18] akio: It can also sometimes make links in email messages sent not work (if you run jobs from the command line instead of via web) [20:27:26] no email going on here either [20:27:32] sumanah: ok, so for criteria, we have "updated within 1 year" and "handles at least continuations, preferably translates API to language idiom/offers abstractions" [20:27:42] fhocutt: I'm pretty certain I can help with the python ones, but not completely sure about the other ones, so I think it would be best if we can do the java/ruby ones first [20:28:01] valhallasw, that works [20:28:14] it sounds like Python's covered and it's also the language I'm most comfortable in [20:28:26] fhocutt: yeah, I think those are the more objective criteria. There's also "is this code super shoddy" which I assume the more experienced programmers will be able to smell if it is [20:28:40] how about "has tests and documentation"? [20:28:45] I would have no clue how to smell that for ruby and java, though :-) [20:28:55] fhocutt: I ..... should have mentioned that earlier. you are absolutely right [20:29:14] yo [20:29:16] Well by that standard MediaWiki fails (partially) ;) [20:29:33] valhallasw: I think that it is ok for the first pass if we don't yet notice if code is shoddy. We can dive in more in the second pass [20:29:37] bawolff: Ha! [20:29:42] bawolff: :) [20:29:56] fails on the tests, or on the docs? [20:30:09] fhocutt: I think it should be, like, a yellow flag if there are no tests and really skimpy or no docs [20:30:15] hi [20:30:16] not a red flag [20:30:19] but a criterion [20:30:21] hi biberao [20:30:31] sumanah, ok [20:31:23] fhocutt: do you think this is enough to get started to try to do a first pass with valhallasw over email in Ruby and Java? and if not, tell us tomorrow and we can iterate and get more criteria for you? [20:31:49] sumanah: yes, I think so. [20:32:04] cool. so fhocutt now I will braindump a little at you about IRC/mailing list dos and don'ts [20:32:23] http://en.flossmanuals.net/GSoCStudentGuide/ch014_communication-best-practices/ [20:34:38] fhocutt: I think you may have already heard this rule: if you want to share a multi-line code snippet or error message or log or similar with people, so they can look at it with you, it's best to give a link to a paste -- e.g., http://tools.wmflabs.org/paste/ -- or a GitHub gist [20:34:59] * fhocutt nods [20:35:46] fhocutt: perhaps you have already read my "some help for new open source people" http://www.harihareswara.net/sumana/2014/02/26/0 which has a "how we talk" section [20:35:53] I did, yes [20:36:29] Cool [20:36:35] I wonder how much reading of the manual is necessary when there are many and/or confusing manuals [20:36:41] so im trying to do a multiwiki [20:36:49] as is the case with Mediawiki/Wikidata/etc. 
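(To ground two of the criteria fhocutt recaps above — handling logins/cookies and handling continuations — this is roughly the boilerplate a decent client library hides. A minimal Python sketch with requests, as discussed earlier; the wiki URL and credentials are placeholders, and the two-step action=login flow is the one current in this era's API:

    import requests

    API = "https://example.org/w/api.php"  # placeholder wiki
    session = requests.Session()           # cookies persist across calls

    def login(name, password):
        # step 1: ask for a login token; step 2: repeat with the token,
        # after which the session cookies carry the logged-in state
        creds = {"action": "login", "format": "json",
                 "lgname": name, "lgpassword": password}
        token = session.post(API, data=creds).json()["login"]["token"]
        creds["lgtoken"] = token
        assert session.post(API, data=creds).json()["login"]["result"] == "Success"

    def search_all(term):
        # follow continuations: "get more items from the same search"
        params = {"action": "query", "list": "search", "srsearch": term,
                  "format": "json", "continue": ""}
        while True:
            data = session.get(API, params=params).json()
            for hit in data["query"]["search"]:
                yield hit["title"]
            if "continue" not in data:
                break
            params.update(data["continue"])  # feed continuation values back in

A wrapper that also manages edit tokens and error handling is the difference between syntactic sugar and a real client library.)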
[20:37:03] !wikifarm [20:37:04] To run multiple wikis, you can simply install MediaWiki in different folders, with different databases or in one with database prefixes. You can also have multiple wikis using a single installation: If you run a farm or want to, join the mailing list: [20:37:12] ive made a link from the main dir to a second [20:37:28] same db [20:37:33] biberao: check out what wm-bot just posted btw :) [20:37:33] but can't get it to work with the 2 localsettings [20:37:43] i did sumanah but can't get it working properly [20:37:54] biberao: what error do you get? [20:38:06] it's the localsettings that makes me confused [20:38:14] i have 3 localsettings but it gives me a php error too [20:38:17] let me paste it [20:38:24] bawolff: so now that works perfectly, i was wondering if i could do something more complicated [20:38:29] i need a noobs guide :| [20:38:32] two reverse proxies [20:38:33] fhocutt: hopefully enough to be able to point out what is confusing :-) [20:38:36] fhocutt: a good point. Maybe you have seen my posts recently to wikitech-l? where I say "this is how I think it works - am I right?" [20:38:53] when I load the page now, it is proxied twice to get around a firewall [20:39:14] but in the browser status bar i see it trying to connect to the realserver ip [20:39:30] fhocutt: but it's definitely an issue - especially with open source documentation and maybe even extra on WM-related docs, that are often on a wiki and are not always actively maintained by those who maintain the code [20:39:41] im sure there is another setting i am missing to use the correct ip/host [20:39:53] sumanah: wanna guide me? [20:39:53] :X [20:40:04] sumanah: yes, but it's less useful when I'm just getting started, am very confused, or don't know what I don't know [20:40:12] biberao: i just did this [20:40:14] fhocutt: for instance, for pywikibot, there's a manual on most Wikipedias, there are a few wikibooks and there is the documentation on mediawiki.org -- and that's just the documentation on WMF soil! [20:40:19] biberao: sorry, I do not know enough to guide you, so I will let others guide you [20:40:37] akio: did what? [20:40:41] fhocutt: oh I did not mean "did you read those posts and get something out of them" - I first just wanted to check whether you had seen them, before talking more [20:40:50] aren't you splitting wikis? [20:40:55] akio: Do you know which resource is loading the real server's IP [20:41:08] sumanah, actually that's a pretty good example: there is so much documented there and it assumes a relatively high level of knowledge from the start [20:41:12] im just trying to bring up the main page [20:41:13] akio: You can usually use developer tools or firebug/etc to figure out [20:41:22] akio: i want them to use the same db and other resources except images [20:41:39] akio: I mean, is it a javascript file, an image, or something else
[20:43:57] each wiki should have its own storage for uploads and things [20:44:29] keep those things out of the usr directories and you can use your package-installed MediaWiki on the rest of the wikis you make [20:44:30] sumanah: ah, no, I meant valhallasw's example of the pywikibot docs [20:44:31] akio: Does the page load at all [20:44:31] yes [20:45:04] bawolff: it loads, the second GET request goes to the IP and tries load.php [20:45:20] sumanah: but your "here is how I think it works" is difficult when I am very confused or don't know what I don't know [20:45:26] akio: ok. load.php is for js and css files [20:46:03] akio: That's usually controlled by $wgScriptPath (there are other, more specific variables; they are usually not used) [20:46:06] fhocutt: absolutely, yes, my "here's how I think it works" is NOT an example of something I think you should be reading and learning from ABOUT THAT TOPIC (about, say, HTML templating, or the image thumbnail caches) [20:46:36] bawolff: so take out script path? [20:46:36] sumanah: I mean, a difficult strategy for me to take [20:46:54] fhocutt: I wanted to bring it up as an example of a tactic that you could use at a certain stage in your understanding AFTER you have an initial framework and have questions within that framework [20:46:57] akio: No, it needs to be there. just make sure it's set to something that sounds right [20:47:13] sumanah, sure, that makes sense [20:47:15] but to simply get an initial overview of the topic - yes, sometimes there are lots of confusing and conflicting manuals/docs and that is where talking with your mentors comes in [20:47:30] it is fine to grab us on IRC and say "what is this I don't even" [20:47:36] and we will help you see the big picture [20:47:50] ok [20:48:18] or join you in the "what is this I don't even" [20:48:22] That happens too [20:48:25] :) [20:48:26] * fhocutt grins [20:48:40] fhocutt: Reedy is a longtime MediaWiki contributor and used to be the main API maintainer IIRC [20:48:51] he lives in England [20:48:56] and loves to drive his car [20:49:09] * fhocutt waves to Reedy [20:49:22] he and I had a great time at the Manchester airport once looking at old planes at their open-air museum [20:50:06] fhocutt: one way to go here is for you to make a note of nouns or verbs or jargon of any kind that make your head swim, and ask very freely about them, here in channel, to me and others [20:50:38] I will happily give you a "what this means in context" braindump [20:50:50] sounds good. [20:51:24] the Wikidata stuff - were you able to get any help via IRC or mailing lists? [20:51:38] IRC, yes [20:51:50] so many of them are in Europe :/ a bit hard for you [20:52:18] it looks like wikidata-tech is active http://lists.wikimedia.org/pipermail/wikidata-tech/2014-May/thread.html so if you ask a question in IRC and don't get an answer within ~20 min, please feel free to email that list to ask [20:52:48] (going back to the IRC/list etiquette topic) [20:52:58] ok, sounds good [20:53:11] it is ok to cross-post initial questions as well [20:53:36] like, to wikitech-l (a general "wikimedia tech and developers and sysadmins" list) and to a specialized interest list, e.g., mediawiki-api or wikidata-tech [20:54:06] https://meta.wikimedia.org/wiki/Mailing_lists/Overview#MediaWiki_and_technical is a sort of inventory of mailing lists. [20:55:10] useful. [20:55:12] There are a lot of them, and there are a lot of IRC channels.
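Since $wgServer and $wgScriptPath keep coming up: one way to see which values a wiki is actually advertising, when debugging a proxy setup like akio's, is the standard siteinfo query. A small sketch (the wiki URL here is just an example):

    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php",
                        params={"action": "query", "meta": "siteinfo",
                                "siprop": "general", "format": "json"})
    general = resp.json()["query"]["general"]
    # 'server' reflects $wgServer and 'scriptpath' reflects $wgScriptPath.
    print(general["server"], general["scriptpath"])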
In my opinion we are more consistent about listing and inventorying our mailing lists than our IRC channels. (It takes sysadmin work by Wikimedia Foundation to create a new lists.wikimedia.org list. It takes one person's whim to make a new Freenode channel) [20:55:43] like, fhocutt, if you just try to join a nonexistent Freenode channel, Freenode will obligingly create it for you [20:56:07] I've done that. Will it keep it around? [20:56:16] so sometimes people will just sort of start ad-hoc one-time channels like #mw-core-may-17 or whatever [20:56:27] fhocutt: it disappears as soon as the last person leaves [20:56:30] basically [20:57:05] so it is not like an Etherpad [20:57:25] ok [20:57:47] btw, on the europe timezones -- it should not be too bad, as long as you keep in mind people will be going to bed around now. Most volunteers would not have time during the day to help, after all. [20:58:16] valhallasw: yes. and some of the Wikidata contributors are volunteers. I was thinking about paid Wikimedia Germany people, though..... [20:58:25] Oh, of course. [20:58:35] fhocutt: (WMDE is Wikimedia Germany - the nonprofit that is the home of the Wikidata technical team) [20:58:49] valhallasw, ok--also sometimes I am up/working late enough to catch them in the morning, but hopefully not too often :) [20:58:53] ha! [20:58:58] Wikimedia Deutschland* :) [20:59:02] yes, take care of yourself Frances! [20:59:39] fhocutt: remind me, do you speak German? [20:59:59] sumanah, often if I'm doing that I'll have spent the evening hours doing something else and I'll be poking at some project before bed [21:00:03] nod nod [21:00:06] nope [21:00:17] I have some Spanish and French, which are not helpful [21:00:21] *in this case [21:00:40] well, at least you can use those to check for localisation, in case that is something you are interested in [21:01:22] e.g., send a request to the API at fr.wikipedia.org and see if what you get back is appropriately labelled in French rather than English, in terms of, like, names of fields [21:01:36] not particularly but I suspect I will become more interested as I work here [21:01:41] that's off the top of my head and I do not think it is something you actually need to think about yet [21:01:42] sure [21:01:50] sumanah: api is not supposed to be localized [21:01:56] oh ok :) [21:01:57] mostly [21:02:01] there's exceptions [21:02:05] what are they? [21:02:05] it's kind of a mess that way [21:02:13] the error messages are translated, for instance [21:03:03] The success message for watch I think [21:03:11] the uploadwarnings field (maybe) [21:03:24] valhallasw: only some though [21:03:31] most aren't, I don't think [21:03:55] action=parse depends on language, but that's to be expected [21:04:25] fhocutt: so, my colleague Amir wrote http://aharoni.wordpress.com/2011/08/24/the-software-localization-paradox/ which is about how it would be better if all software were translated and localised, but right now in order to be a first-class citizen of the web you have to be fluent in English, and it's a vicious circle [21:04:36] because then people just use things in English and the gap widens [21:05:11] The reason the api isn't translated in general though, is because it's supposed to be read by computers, not humans, so we don't want strings mysteriously changing sometimes [21:05:17] This is true.
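A quick way to poke at the point being made here: the JSON structure of an API response keeps English field names even on fr.wikipedia.org, while the values themselves are localized content. A sketch (exactly which messages come back translated varies by module, as the discussion notes):

    import requests

    resp = requests.get("https://fr.wikipedia.org/w/api.php",
                        params={"action": "query", "meta": "siteinfo",
                                "siprop": "general", "format": "json"})
    general = resp.json()["query"]["general"]
    print(general["sitename"])   # localized value, e.g. "Wikipédia"
    print(sorted(general)[:5])   # but English key names like 'mainpage'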
[21:05:32] otoh I don't know how that applies to the help text [21:05:40] bawolff: except the *english* error messages can also change -- so really the keys are the only safe thing to use [21:05:45] yes. [21:06:19] that sounds just kind of ugh [21:06:33] fhocutt: remind me, do you know what "screen scraping" is? [21:06:44] btw, translated documentation is one of the reasons I'd like to keep pywikibot documentation on-wiki [21:06:52] yes, I was just talking about that with a friend last night [21:07:12] ok, cool. If you had not then I would have explained it [21:07:33] where you get data from the loaded webpage, generally because it's not available in other ways [21:07:42] fhocutt: do you know about pywikibot's two branches, "trunk" and "compat", and the history that has led to those branches? [21:07:47] (right) [21:07:56] I know very little about pywikibot [21:08:21] So, once upon a time, pywikibot was basically a screenscraper, because there was no MediaWiki API [21:09:01] -- after all, all those language links had to be kept up to date, as there was no wikidata yet [21:09:14] fhocutt: do you know what the "language links" are in this context? [21:09:52] something about keeping should-be-identical-data on the different wikis up-to-date? [21:09:57] fhocutt: not quite. https://en.wikipedia.org/wiki/Tulsi_Gabbard lefthand navbar, scroll down to "Languages" [21:10:25] (you're right that it is about synchronizing data on various wikis!) [21:10:51] ah, ok [21:10:51] you see how there are links to the Tulsi Gabbard article in German, Farsi, French, Italian, etc. Wikipedias [21:11:15] yes [21:11:40] these interrelationships used to live in the individual articles [21:12:10] at the bottom of an article's wikitext, just next to the category listings, e.g., [[Category:Fictional times of day]] or whatever, there would be a set of the links to the same article on other wikis [21:12:15] oh, ok [21:12:31] but this is structured data! and structured data should live in Wikidata! [21:12:35] and now it does! [21:12:41] :-D [21:13:04] if you hover your mouse over "edit links" in that navbar, you'll see that you would edit the links (and maybe add another) to a Wikidata thing [21:13:25] oh, I see. [21:13:28] (I say "thing" here because I am not sure whether it is an item, an entry, a thingamajig, a whatsit, or some other technical term) [21:13:40] entity. :) [21:13:44] thank you! [21:13:47] I think also an item? [21:14:03] :) So now it can just be edited once and then that will propagate to all n Wikipedia entries about that ..... thing. [21:14:09] Yay Wikidata! [21:14:29] but there was a dark time before Wikidata was the keeper of this information. A time when entropy roamed the land [21:14:39] dun dun dunnnnn [21:14:48] like, even more than now [21:15:32] so much unorganized data, flowing everywhere... [21:15:34] valhallasw and his fellow pywikibot developers, protectors and maintainers of the language links, did .... something I don't entirely understand. With pywikibot. 
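The contrast at the heart of this story, in two small sketches using the article mentioned above (the endpoints and the wbgetentities module are real; the HTML selector and exact output are illustrative, and BeautifulSoup is an assumed helper for the scraping half):

    import requests
    from bs4 import BeautifulSoup

    # The old way, screen scraping: parse interlanguage links out of the
    # rendered article HTML.
    html = requests.get("https://en.wikipedia.org/wiki/Tulsi_Gabbard").text
    soup = BeautifulSoup(html, "html.parser")
    scraped = [a.get("href") for a in soup.select("li.interlanguage-link > a")]

    # The structured way: ask Wikidata for the sitelinks of the item
    # connected to the English article.
    resp = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities", "sites": "enwiki",
        "titles": "Tulsi Gabbard", "props": "sitelinks", "format": "json"})
    entity = next(iter(resp.json()["entities"].values()))
    linked = sorted(s["site"] for s in entity["sitelinks"].values())

    print(scraped[:5])
    print(linked[:5])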
[21:16:51] Something I also don't understand :-) This is actually from waaaay before my time, 2003-ish; Rob Hooft wrote a script specifically to do the language link updating [21:16:53] (also, "Fictional times of day" is not yet a real en.wp category ("en.wp" or "enwp" is shorthand for "English Wikipedia")) [21:17:56] but I'll stop keeping sumanah from telling the story :-) [21:18:07] no no valhallasw I was kind of hoping you would jump in and tell the rest :) [21:18:14] ah, okay :-) [21:18:40] basically, that small project organically grew to something that could take on a whole range of tasks: the pywikibot framework [21:19:16] now, around 2006, the MediaWiki API was developed. This allowed people to edit wikis much more easily, without needing to screen-scrape [21:19:53] however, adding that to the code base of pywikibot was not easy, as it was organically grown, and thus much more complex than a typical well-designed project [21:20:16] ah. [21:20:33] so a plan was created to implement a 'new' pywikibot (which is now known as 'core'), with a much cleaner internal structure [21:22:02] then, while that project was under way, someone else hacked API support into the old version [21:22:16] which made the code even more of a mess, but at least the API was used [21:22:25] right. [21:22:48] which also meant there was no incentive for users to actually convert to the new framework [21:23:07] and then, the Lannisters -- wait, no [21:23:20] so in the end, we have two frameworks that are almost compatible, but not quite: a lot of people using the old framework, but almost all maintainers using the new one [21:23:30] (which really reminds me of python 2 vs python 3) [21:23:36] yes yes yes [21:24:40] then Wikidata came along [21:24:45] oh goodness. [21:24:59] which 'core' picked up much more readily than compat -- after all, it was much easier to implement [21:25:19] and this is finally pulling people towards core, I think. [21:25:32] interesting. [21:26:29] so, in the end, we're working on a 'real' deprecation plan, where we try to get the last missing bits into core, and finally declare compat as 'we will not maintain this anymore' [21:26:41] fhocutt: in fact they worked on this a bit in Zurich IIRC [21:27:02] nice. [21:27:44] valhallasw: rename it "incompat" [21:27:46] https://www.mediawiki.org/wiki/Requests_for_comment/Deprecate_pywikibot-compat [21:27:49] that should get the idea across [21:28:28] SamB: :D Well, renaming 'trunk' to 'compat' and 'rewrite' to 'core' also helped to make that point, I think. [21:28:48] so fhocutt if you end up working on pywikipediabot (now called "pywikibot" since after all it works with wikis like Wikidata, not just Wikipedia), then you would work on documenting and improving "core" [21:28:57] valhallasw: what, no "master"? [21:29:05] ok! [21:29:28] SamB: they are no longer branches, they are separate git repositories :-) [21:30:00] fhocutt: feel free to edit what Merlijn and I have just said into a blog post or similar [21:30:57] in fact, I think that would be a good idea - do you think that's something you could do in the next few weeks? [21:31:25] it is. I'm going to need to actually schedule time for blogging. [21:31:30] absolutely. [21:31:51] IIRC you are meant to blog at least once a week.
:-) so this could count as one of them if you want, or as an extra [21:32:17] I should really get myself to write down my ideas for the pywikibot docs :/ [21:32:21] once every two weeks, but yes [21:32:50] sumanah: do you know what the process (if any) is for keeping the MW docs up to date? [21:33:04] valhallasw: which ones? like, docs in general about MediaWiki? [21:33:19] Yeah, but also technical ones, such as, say, the database layout page. [21:33:23] The things at doc.wikimedia.org are automatically generated [21:33:30] The others are not kept up to date [21:33:59] bawolff: doc.mw.o is just API docs, right? [21:34:14] That's doxygen docs [21:34:38] In theory if you add a new global, you are supposed to make a manual: page to correspond to it [21:35:20] fhocutt: "doxygen" - we write specially formatted comments in the code, and then there's a tool we run, doxygen, which eats all that code, parses it looking for those comments, and spits them out into webpages [21:35:23] Sometimes people go through and look for missing manual:$wg... pages or missing manual:Hooks/... pages, but that's done very irregularly, and mostly by random people if they get bored. It's not organized very well [21:35:25] bawolff: Yeah. I'm actually trying to think of a way to link gerrit to mw.org, so manual changes and wiki changes would be one and the same commit. [21:35:35] sumanah: yep, thanks [21:37:38] btw fhocutt there is a summary at https://wiki.python.org/moin/Python2orPython3 about whether you should use Python 2 or Python 3, and an INCREDIBLY LONG page http://python-notes.curiousefficiency.org/en/latest/python3/questions_and_answers.html that I found fascinating as a software engineering document on the same topic [21:38:04] I am not telling you to read them anytime in the next couple weeks. But sometime this summer I think you would find them interesting [21:39:17] hi there sDrewth! how are you? [21:39:40] * sDrewth prises open an eye [21:39:52] sumanah: that is interesting! [21:40:09] fine thx sumanah, still waking [21:40:26] yourself? [21:40:30] sDrewth: may I introduce someone to you? fhocutt is my intern this season, and she is working on evaluating, documenting, and improving some API client libraries for MediaWiki. https://www.mediawiki.org/wiki/Evaluating_MediaWiki_web_API_client_libraries [21:40:36] I'm pretty good! [21:40:54] hello, sDrewth! [21:40:55] fhocutt: sDrewth, a.k.a. billinghurst, is a friendly and knowledgeable member of the Wikisource community. [21:41:05] good to know! [21:41:07] gday fhocutt [21:41:21] welcome [21:41:35] thank you. [21:41:36] good of sumanah to start on my faults [21:41:38] sDrewth: https://twitter.com/AutoWikiFacts is a project fhocutt made that tweets out little snippets from various Wikimedia sources, and she was able to get it to work with Wikisource, after a bit of fiddling [21:41:56] neat [21:41:58] yes! [21:42:20] wait, were you, fhocutt? [21:42:29] I didn't actually get wikisource working, unfortunately [21:42:33] * sumanah lies, sorry [21:42:56] sDrewth: have you ever worked with Wikisource via the API?
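As a taste of what Wikisource-via-API work can look like (anticipating the "Index:" idea in the next reply): Index pages live in their own namespace, so the API can list them directly. A sketch; the namespace id 106 is an assumption about English Wikisource worth verifying:

    import requests

    resp = requests.get("https://en.wikisource.org/w/api.php", params={
        "action": "query", "list": "allpages", "apnamespace": 106,
        "aplimit": 10, "format": "json"})
    for page in resp.json()["query"]["allpages"]:
        print(page["title"])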
[21:43:08] I think I could have hammered together something with searches for "Index: ", but didn't get to that [21:43:17] * sDrewth's other eye pops open, to join the other [21:43:50] sumanah: api only for queries [21:44:49] I need to sit down to learn APIs, however, it doesn't come naturally, and then other distractions/tasks/priorities have me doing something else [21:44:59] always another spambot to kill [21:46:01] an intern wrote a course or two - http://www.codecademy.com/courses/web-beginner-en-vj9nh/0/1 and http://www.codecademy.com/courses/web-beginner-en-yd3lp/0/1 - in case you like those sorts of things to help you learn [21:46:44] so many things to learn ... so little time and competence [21:47:03] sumanah: are you going to Wikimania this year? [21:47:10] No I'm not, sorry [21:47:22] k, :-( [21:48:41] fhocutt: so, do you feel like you have a next thing to do today/early tomorrow, or would you like to settle that? [21:50:00] adding lots and lots of library links to API:Client Code sounds like a useful thing, and also screening those for obviously-unsuitable libraries [21:50:11] ok! [21:50:12] yes! [21:50:15] I agree [21:50:27] I may also start making outlines/brainstorming for blog posts [21:51:24] I hereby nod. [21:51:33] nod.js [21:52:02] thanks, sumanah! I will head off, get lunch, and get started. [21:52:14] Cool! Thanks fhocutt, see you around the same time tomorrow? [21:52:19] sumanah: re twitter, I don't think that we have a good simple way for wikisource to pump particular feeds [21:52:30] :/ [21:53:06] sDrewth: Is that something you/someone could do on irc.wikimedia.org? [21:53:07] sDrewth, there should be lots of other people who can help you learn things during wikimania [21:53:14] * marktraceur bets slaporte would be into such things [21:53:28] it all requires bots bots and bots [21:53:33] sumanah: I will be up for the ECT meeting and we could chat after that [21:53:33] bots bots bots [21:53:43] OK! [21:53:59] see ya [21:54:03] later [21:54:07] and my linux days have had too big a gap, so I get onto my bot account, and then I logoff again [21:55:03] marktraceur: quite possibly, but to me, it is almost something that needs an OAuth right and a web interface [21:55:16] Hm [21:55:38] and twittercards on the wiki [21:55:44] sDrewth: Probably the service could just listen to the relevant feed and serve it in a different format [21:55:49] Yuck, twittercards. [21:57:01] sure marktraceur, my point was about trying to get something that could be more aligned to those who want to feed, but don't have the patience/knowledge to bot [21:57:09] Ah. [21:57:22] Gotta learn to build little bits at a time :) [21:57:29] yep [21:57:32] 's about the only way things get done around here [21:57:38] We should get a machine-readable RC feed soon(TM), right? [21:58:08] WSes have lots of passionate people, but low in the tech area [21:58:11] * marktraceur doesn't even know who maintains the current one [21:58:14] http://rcstream.wmflabs.org/ looks like it still just does beta :/ [21:58:58] kishanio: hey there, meet sDrewth. sDrewth, kishanio is a new developer who is interested in Wikisource. [21:59:30] I actually saw kishanio on enWS rc feed a day or so ago, so gday [21:59:36] * marktraceur should probably finish lochner...dunno how brandeis has been either [21:59:51] sumanah: hi sumanah how is it going? [22:00:22] sDrewth: Yes i'm documenting ProofreadExtension API hooks. Have a look.
https://www.mediawiki.org/wiki/Extension_talk:Proofread_Page#API_Documentation_.26_Improvement [22:00:27] so I am looking forward to Wikimania, so I can have some long wished for face-to-face conversations, and with a whiteboard [22:00:30] sumanah: ^ almost done. [22:00:34] kishanio: I'm all right! I wanted you to meet sDrewth a.k.a. billinghurst, a Wikisource person [22:01:09] Also hi sDrewth. :) [22:01:32] less WS and more stewardry at the moment. WS is more my relaxation at the moment [22:01:39] * marktraceur looks around for Micru, CristianCantoro, GorillaWarfare; fails [22:02:39] (you'll see billinghurst in https://en.wikisource.org/wiki/Wikisource:Scriptorium ) [22:03:22] their sleep time [22:03:22] only little bits at the moment sumanah, [22:03:28] true [22:03:41] Well not GorillaWarfare, she's just pretending not to be a programmer [22:04:08] i.e. is busy [22:04:46] * sDrewth goes to rouse daughter out of bed for school [22:04:53] micru was on a bit earlier. There was an interesting chat about music transcription [22:04:54] bye! [22:05:31] Thanks for the introduction sumanah [22:05:34] :) [22:07:28] anyway, I'm off to bed. Good luck writing & searching, fhocutt [22:07:39] thanks, valhallasw [22:30:33] heya guys.. i noticed today on English Wikipedia that if i click on edit while not logged in... i get a notice asking me to sign up... which extension or gadget is that? I think a tool like that would be very useful for the wikis i help manage.... [22:30:56] StevenW: ^^ [22:32:48] Wait...isn't that a bug? [22:33:09] Oh, no, never mind. [22:33:10] marktraceur: why would it be? we deployed it this morning [22:33:14] * marktraceur read it wrong [22:33:19] * greg-g nods [22:33:20] greg-g: Thought he meant a *sign-up form* [22:33:35] StevenW: answered in #wikimedia, ignore the ping [22:35:50] ok [23:27:31] fhocutt: hey, got a moment to PM about logistics? [23:31:33] sumanah: sure, can do