[00:49:14] hey fhocutt - just wanted to wave hi
[01:51:03] !tables.sql
[01:51:03] https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob;f=maintenance/tables.sql;hb=HEAD
[02:07:28] fhocutt: I'm writing a reply to you now re Wikidata - will send the email but also wanted to mention I'm here :)
[02:13:14] !seen GorillaWarfare
[02:13:14] Did you mean @seen GorillaWarfare?
[02:13:17] #seen GorillaWarfare
[02:13:19] @seen GorillaWarfare
[02:13:19] sumanah: Last time I saw GorillaWarfare they were talking in the channel, they are still in the channel #wikipedia-en-help at 5/14/2014 10:25:11 PM (1d3h48m7s ago)
[02:13:37] @seen GorillaWarfare #mediawiki
[02:13:38] sumanah: I have never seen GorillaWarfare #mediawiki
[02:15:05] Fatal error: Call to a member function merge() on a non-object in /vagrant/mediawiki/includes/parser/Parser.php on line 1965; Fatal error: Call to a member function getTargetLanguage() on a non-object in /vagrant/mediawiki/includes/parser/Parser.php on line 948
[02:19:10] Withoutaname: are you seeing that error on your own personal MediaWiki installation, or on a Wikimedia site, or somewhere else?
[02:21:11] Vagrant. :-)
[02:26:20] yeah vagrant
[02:26:25] local install
[02:26:44] should i file a bug? or is it intended?
[02:27:01] If I were you I would file a bug report.
[02:27:11] fhocutt: I mean Wikidata + Wikisource
[02:27:44] hmm, i tried to fix it with $mLinkHolders = new LinkHolderArray( $this );
[02:27:48] that threw up more errors
[02:36:27] hello
[02:36:46] hello jay
[02:37:09] how is life there?
[02:37:32] sumana hi to you
[02:37:51] I'm all right, jay. :) So, what sorts of things are you doing with MediaWiki?
[02:38:28] i don't know exactly, haha, i'm just curious about things here :)
[02:39:23] OK! Well, this is a channel where we talk about MediaWiki, the software Wikipedia runs on, and help each other fix and develop it.
[02:39:35] You can help. https://www.mediawiki.org/wiki/How_to_contribute
[02:40:01] Oh thank you so much... i will try my best :P
[02:40:03] fhocutt: msg sent. Best wishes
[02:40:06] :)
[03:02:45] * fhocutt_ waves to sumanah
[10:21:12] hi ....
[10:21:57] i want to contribute to mediawiki ... but i am new to the open source community ...
[10:22:20] and i am not sure how to start
[10:23:11] i want to start with solving some bugs
[10:23:19] !hacker | surbhi
[10:23:19] surbhi: http://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker
[10:23:20] can anyone help
[10:26:29] i have gone through the "annoying little bug" page but still don't have any clue how to proceed
[10:29:24] please help
[10:29:30] i have gone through the "annoying little bug" page but still don't have any clue how to proceed
[12:36:33] hello tonythomas, were you in Zurich in the end? I remember looking for you at some point but I wasn't sure you were there
[12:37:20] Nemo_bis: nope :( I applied for the scholarship, but it got rejected. couldn't make it
[12:39:05] Nemo_bis: sad I missed it
[12:39:05] Ok. I didn't have one either, but I was only a 22 € train ride away. :)
[12:39:30] There will be other opportunities in the coming years, as a GSoC student you won't lack mentoring. ;)
[12:39:47] Nemo_bis: lucky you. for me it was a continent away.
[12:40:29] Nemo_bis: yeah. my exams finish in a few days, and I will be able to resume the work
[12:41:15] Nemo_bis: even now, we lack consensus on the shift to Swift Mailer
[12:41:49] https://bugzilla.wikimedia.org/show_bug.cgi?id=63483
[13:06:49] hey pginer - you want to send out a note to wikitech-l about your updated RfC-related changeset?
[13:08:10] I would wait until the reviewers that were interested in the grid system confirm whether they have enough info to do the review and whether they found any issues.
[13:08:46] Nikerabbit: I'm at 1.96 TB of ZIM uploads now (ratio 39.4).
[13:09:48] Nemo_bis: 2500 GB
[13:10:18] Hmm.
[13:13:38] I wonder if we steal each other's leechers then, I hardly ever reached 8 Mb/s upload average. http://p.defau.lt/?oPO5oxWgQFdVAJxc14cnjA
[13:26:25] DanielK_WMDE: https://www.mediawiki.org/wiki/Performance_guidelines#How_to_think_about_performance now mentions the critical paths stuff
[13:28:19] thank you!
[13:43:13] Others should feel free to wontfix https://bugzilla.wikimedia.org/show_bug.cgi?id=65394 "A very big, long-term suggestion here: a real performance testing cluster."
[13:47:18] hey kishanio - want a big project? ;-)
[13:48:29] sumanah: Sure idm. :)
[13:50:19] kishanio: so, you like learning more about the MediaWiki API, right?
[13:52:10] sumanah: Oh yes.
[13:53:17] kishanio: have you ever visited Wikisource?
[13:53:22] sumanah: i do get individual parts but the overall architecture is still hazy.
[13:54:49] sumanah: No i haven't
[13:55:16] kishanio: I know you tend to learn better by writing code, but here I do want to mention a couple of resources. https://www.mediawiki.org/wiki/API:Tutorial has some text and a video. And perhaps you might find http://www.mediawiki.org/w/api.php useful.
[13:55:48] kishanio: Wikisource is one of Wikipedia's sibling sites, along with Wikimedia Commons, Wikiquote, Wikidata, and several others.
[13:56:21] Wikisource hosts freely licensed documents. Some of them are individual documents, like the speeches of important historical people, whereas some are entire books
[13:58:12] sumanah: Yes thanks for the links. I have read through them once actually.
[13:58:16] Ah cool
[13:58:40] and okay im on wikisource right now. I read it's like an online library
[13:58:45] yes!
[13:58:52] There are Wikisources in several languages
[13:58:59] they try to collaborate together
[13:59:12] kishanio: the Wikisource community is quite eager for new developers :-) there are some developers already
[13:59:51] They rely a lot on a MediaWiki extension https://www.mediawiki.org/wiki/Extension:ProofreadPage
[13:59:56] Alright that sounds good.
[14:00:30] some Wikisource documents are transcripts of uploaded scans ("ProofreadPage" lets people do side-by-side comparisons and edits between scan and transcript), whereas some sources were typed in without that help
[14:02:24] kishanio: there is a somewhat poorly understood ProofreadPage API. :-) a good first step to help improve their API would be to interview them over IRC, take notes, and then put those notes on https://www.mediawiki.org/wiki/Extension_talk:Proofread_Page
[14:03:00] kishanio: btw anomie (who just came online into this channel) is Brad Jorsch, the person who has been giving you code/bugzilla comments, in case you did not know :-)
[14:03:47] kishanio: the more technical people of the Wikisource community are listed in this list https://meta.wikimedia.org/wiki/Wikisource_roadmap#Tasks so you can reach out to them - also there is a #wikisource channel
[14:05:10] sumanah: Alright. So initially i'll ask them questions regarding the API, improve the documentation, and in the process get a better understanding of the API.
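Since the api.php endpoint came up just above: a minimal sketch of calling it from Python with only the standard library. action=query with meta=siteinfo is a safe read-only call; the User-Agent string is a placeholder you should replace with something descriptive.

    import json
    import urllib.parse
    import urllib.request

    API_URL = "https://www.mediawiki.org/w/api.php"

    params = {"action": "query", "meta": "siteinfo", "siprop": "general", "format": "json"}
    url = API_URL + "?" + urllib.parse.urlencode(params)
    req = urllib.request.Request(url, headers={"User-Agent": "demo-script/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)

    print(data["query"]["general"]["sitename"])  # e.g. "MediaWiki"

The same pattern works for any other action= module listed on the api.php help page.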
[14:05:23] kishanio: That seems like a good direction to go in, yes
[14:05:37] kishanio: and if you do not get answers via the IRC channel then you can try the wikisource mailing list
[14:05:46] https://lists.wikimedia.org/mailman/listinfo/wikisource-l/
[14:05:47] and yes last night i was looking for him here. anomie Hi how are you :)
[14:06:07] hi kishanio. Catching up on email, how are you?
[14:06:20] kishanio: anomie and I are both on the east coast of North America. You?
[14:08:51] sumanah: Oh great i can do that thanks. So i can start with an introductory email on the mailing list right now.
[14:08:57] :)
[14:09:00] Thank you kishanio
[14:09:52] sumanah: that's great. I'm from west coast India (Ahmedabad, Gujarat).
[14:10:01] My parents are from Karnataka
[14:10:13] I need to head off for a moment, but glad I could point you at a project idea :-)
[14:10:44] sumanah: That's alright. Thanks. :)
[14:12:19] Gooooooood morning everyone! Was wondering if anyone could help me out with trying to gauge exactly how much work my MySQL database is doing over a given period of time. I'm planning an upgrade to increase IOPS but I haven't the foggiest what my benchmark needs to be
[14:13:38] anomie: I am good. Yesterday i patched the other bug again. Working on the XML-JSON difference bug right now https://bugzilla.wikimedia.org/show_bug.cgi?id=55371
[14:14:02] i might seek you out again for the same. :)
[14:18:06] kishanio: I left a few nitpicks on https://gerrit.wikimedia.org/r/#/c/133441/ for you. Should be easy to fix
[14:20:26] kishanio: Bug 55371 looks slightly tricky due to backwards compatibility issues. What approach are you taking?
[14:26:32] anomie: Right now i was thinking of your comment 2 solution: to get rid of * in the XML format whenever other sub-elements are needed.
[14:27:40] kishanio: Yeah, that's probably the best way to go. Are you going to pick an arbitrary name (note "text" or the like might conflict in some cases) or require one be specified along the lines of setIndexedTagName()?
[14:27:41] anomie: i'm still researching this and need a bit more time. I'll leave a comment before approaching.
[14:27:46] kishanio: ok
[14:28:32] Vulpix: I choose you! :D
[14:28:48] lol
[14:32:53] Vulpix: But no seriously, you know anything about MySQL server stuff?
[14:34:18] I know something
[14:34:31] what's your problem?
[14:38:52] Vulpix: I have no idea what sort of IOPS my database needs, and I feel like this is an issue.
[14:39:12] AWS recommends a ratio of data size to IOPS, but I feel like that's not true if it's a high traffic site
[14:40:36] no idea about that, I don't have experience with that kind of performance level
[14:41:22] Vulpix: Yeah, me neither.
[14:41:40] Thanks anyways! :)
[14:45:09] hi Ulfr_
[14:45:16] https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code ?
[14:45:43] Ulfr_: like, it might help you - and if it doesn't, but you find an answer, add it there or on the talkpage? :)
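One rough way to answer Ulfr_'s question (how many IOPS the database actually does) is to sample the kernel's cumulative disk counters over a window; a sketch assuming the third-party psutil package, with INTERVAL as a placeholder:

    import time
    import psutil

    INTERVAL = 60  # seconds; lengthen this to catch bursts and outliers

    start = psutil.disk_io_counters()
    time.sleep(INTERVAL)
    end = psutil.disk_io_counters()

    print("avg read IOPS: ", (end.read_count - start.read_count) / INTERVAL)
    print("avg write IOPS:", (end.write_count - start.write_count) / INTERVAL)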
[14:46:22] sumanah: My problem is that I have three frontends that all poke the same database, so the profiling doesn't give me the whole picture
[14:46:41] believe me though, I use the crap out of that page when I'm trying to figure out why my users are whining about it being "slow"
[14:47:30] nod nod nod
[14:47:53] "buy better hardware" :P
[14:49:01] Vulpix: You know, I'd do that, but I don't buy any hardware, I just get funny cloud servers that I can't kick when I'm frustrated
[14:50:21] I AM however trying to figure out how beefy the servers need to be, but this magical metric called IOPS defines that, and I can't figure out how to measure that.
[14:50:39] * Ulfr_ wants to buy an IOPS gauge that works like a tire pressure gauge
[14:50:45] Ulfr_: the #wikimedia-tech folks may be more expert in this domain
[14:50:46] ha! yes!
[14:51:26] sumanah: I'll take a whack, but most sufficiently high level techs are all like, wait, you have this much load and don't know already? *blank stare*
[14:54:44] Ulfr_: you can monitor CPU and IO usage to see where the bottleneck is
[14:55:21] Vulpix: You'd think so, but my CPU load is near 0 and IO usage seems to operate based off of whimsy
[14:56:34] I'm using iostat but that only provides instantaneous snapshots every second or so. I need to know what my outliers are so I don't get flattened by some botnet
[14:57:12] Ulfr_: use iotop
[14:58:55] Vulpix: You're awesome.
[14:59:55] and who is the culprit process?
[15:05:21] I've got a rooted box with a mediawiki install. I guess this means the attackers got password hashes for our wiki. Are those salted by default?
[15:11:18] JordiGH: yes
[15:11:20] Vulpix: The culprit? It's mysql!
[15:11:44] JordiGH: but i think MW still uses a somewhat weak password hash algorithm
[15:12:53] JordiGH: salted md5 :(
[15:14:09] Ouch.
[15:14:33] Still, that means rainbow tables don't help, right?
[15:15:31] yes
[15:18:32] JordiGH: yeah, but weak passwords are still brute-forceable without supercomputers :)
[15:18:57] there's some work going on about making MW use a proper algorithm for this, but it's a bit stalled right now apparently
[16:56:19] hi, what is the suggested way to write a looong single shell command? pre and code doesn't seem to wrap the code and it overflows from the right edge of the page...
[16:57:23] Well, actually code wraps, but then it is not that obvious that there is code there... Indenting creates a better terminal code effect
[16:57:41] Can I combine both?
[16:59:15] Mahjongg: use <pre> and manually indent lines, adding a "\" at the end of each line to indicate that the command continues on the next line (basically, to escape the "newline character")
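For example, wrapped with <pre> as Vulpix suggests (the tar command and paths here are hypothetical; the trailing backslashes escape the newlines so the shell still reads it as one command):

    <pre>
    tar czf wiki-backup.tar.gz \
        /var/www/wiki/LocalSettings.php \
        /var/www/wiki/images
    </pre>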
[16:59:56] 	 alternatively: 
[17:11:25] 	 Vulpix, thank you.
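Picking up the password-hash thread from 15:12: a sketch of the legacy salted-MD5 check as I understand it. The ":B:salt:hash" layout with md5(salt + "-" + md5(password)) matches MediaWiki's old behaviour from memory, so verify against User.php before relying on it; the function names are mine. The per-user salt is what blunts precomputed rainbow tables, but MD5 is fast enough that weak passwords can still be brute-forced.

    import hashlib

    def md5_hex(s: str) -> str:
        return hashlib.md5(s.encode("utf-8")).hexdigest()

    def check_legacy_b(password: str, stored: str) -> bool:
        # stored looks like ":B:<salt>:<hash>" (assumed legacy format)
        _, _, salt, digest = stored.split(":")
        return md5_hex(salt + "-" + md5_hex(password)) == digest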
[17:30:58] 	 terrrydactyl: do you have 5 min to help me with something?
[17:31:08] 	 reading something and telling me whether it makes sense to you
[17:32:50] 	 Mithrandir: btw not sure whether you saw that I'm working on https://www.mediawiki.org/wiki/Performance_guidelines which will probably amuse you with their simplicity when it comes to things you are an expert in
[17:38:37] 	 brion: or other people who know - have we considered using something like Gearman as the jobrunner for our job queue?
[17:38:43] 	 I understand that we built our jobrunner in house
[17:39:01] 	 istr some talk of gearman years ago but we never got to it that i remember
[17:39:11] 	 maybe our needs are not fulfilled by other projects?
[17:39:16] 	 once Redis is being used to run the queues i guess the rest of the infrastructure doesn’t seem that hard maybe :D
[17:39:49] 	 it’s probably worth looking into a refresh of the jobs infrastructure beyond the queue server itself though
[17:40:46] 	 I used http://hexm.de/mw-search to search our repos/docs/lists/bugs/etc. for "gearman" and saw that it integrates with Zuul?
[17:41:14] 	 like, in our setup, present or future
[17:45:18] 	 separately - people who want to understand more about performance, "Scalable Web Architecture and Distributed Systems" by Kate Matsudaira helped me a lot! re caches, proxies, and more
[17:47:09] 	 nice
[17:47:18] 	 hey those are some nice graphs
[17:47:58] 	 http://www.aosabook.org/images/distsys/imageHosting4.png <- that starts to approach our arch :D
[17:48:41] 	 must… reheat… coffee
[17:57:01] 	 Hi, for JS, how do I log output? the build says that console and alert are not defined. what is the MediaWiki way to do this? :)
[17:57:45] 	 console.log should work on some browsers
[17:59:28] 	 sumanah: sorry, was afk for a second. i’d be happy to proofread something for you.
[18:05:13] 	 Vulpix, it works on the browser, but the build fails: https://integration.wikimedia.org/ci/job/mwext-Translate-jslint/3314/console
[18:05:47] 	 Hey. Anyone here knowledgeable on the new gallery tags?
[18:07:17] 	 !ask | Sven_Manguard
[18:07:17] 	 Sven_Manguard: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[18:07:20] 	 really, do just ask
[18:09:58] 	 BPositive: I don't know if you need to add jshint comments for that https://github.com/jshint/jshint/issues/1027#issuecomment-16950126
[18:10:11] 	 ask Krinkle 
[18:10:27] 	 or MatmaRex 
[18:11:18] 	 Vulpix, I checked out some other files and they seem to use mw.log ( .... ) and window.alert( ... )
[18:11:42] 	 window.alert O_O
[18:12:04] 	 MatmaRex: I've got a gallery of four images on a page. I want to have that gallery stretch the full width of the page, regardless of screen resolution 
[18:12:14] 	 mw.log maybe is the right thing to do, probably
[18:12:41] 	 okay I will just comment out alerts, not sure about that at the moment
[18:12:46] 	 mw.log seems fine
[18:12:56] 	 as an alternative to console.log
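For reference, a minimal example of the mw.log route discussed above (the message string is a placeholder). mw.log only produces output in debug mode, and since it hangs off the mw global rather than the browser's console, jshint has nothing to flag:

    mw.log( 'my debug message' );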
[18:13:08] 	 Sven_Manguard: hmm, i'm not sure if any of the gallery modes allows that
[18:14:45] 	 MatmaRex: is there a way to force the four images to stay on one line, always?
[18:15:17] 	 Sven_Manguard: no idea. that might be difficult to implement
[18:16:09] 	 Sven_Manguard: 
[18:17:05] 	 thanks, testing
[18:17:18] 	 hi
[18:18:29] 	 Vulpix: It didn't work. Three images on one row, one on the row below
[18:19:04] 	 the docs lie, then https://www.mediawiki.org/wiki/Help:Images#Optional_gallery_attributes
[18:19:12] 	 it used to work, though
[18:19:32] 	 sumanah: simplicity is good.
[18:19:35] 	 maybe you're using the mode attribute?
[18:19:39] 	 sumanah: low hanging fruit and all that. :-)
[18:20:51] 	 Vulpix: I am
[18:20:58] 	 do they not play nice with one another?
[18:21:25] 	 Sven_Manguard: maybe, that would explain why it doesn't work for you
[18:22:09] 	 Vulpix: most infuriatingly, it is four on one line in the view preview, but 3/1 when live
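The suggestion stripped from the log at 18:16 was presumably something like the perrow attribute from the Help:Images page linked above; a hedged wikitext sketch (file names are placeholders, and per the exchange above it may not combine well with the newer mode attribute):

    <gallery perrow="4">
    File:Example-a.jpg
    File:Example-b.jpg
    File:Example-c.jpg
    File:Example-d.jpg
    </gallery>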
[18:24:03] 	 terrrydactyl: hey
[18:24:13] 	 terrrydactyl:  https://www.mediawiki.org/wiki/Performance_guidelines#What_to_do_.28summary.29 and the section https://www.mediawiki.org/wiki/Performance_guidelines#How_to_think_about_performance_2 
[18:25:34] 	 Mithrandir: :D
[18:26:22] 	 terrrydactyl: basically, you are a mid-level (not novice, not senior) MediaWiki developer, and I want to see whether those 2 sections make sense to you or whether there is jargon you do not understand
[18:35:35] 	 i want to import a template straight from here: http://www.mediawiki.org/w/index.php?title=Template:Note. So I click "edit" and I don't see any relevant code that would make this template work. How am I supposed to import this?
[18:39:21] 	 Hi sauce 
[18:39:23] 	 !exporttemplate
[18:39:24] 	 exporttemplates
[18:39:37] 	 troll wm-bot 
[18:39:40] 	 hah
[18:39:46] 	 try !somethingelse
[18:39:50] 	 blargh
[18:39:53] 	 !template
[18:39:54] 	 For more information about templates, see . See also: !templateproblems , !wptemplates
[18:39:59] 	 !wptemplates
[18:39:59] 	 To copy templates from Wikipedia, use Special:Export and check the "Include templates" option to get all the sub-templates, then upload the file with Special:Import on your wiki. You'll also likely have to install the ParserFunctions extension, Scribunto extension and install/enable HTML tidy. You also might need some CSS from Wikipedia's Common.css. You'll also need a lot of...
[18:40:03] 	 ah, you were missing the last "s"
[18:40:10] 	 !exporttemplates
[18:40:10] 	 To copy templates from Wikipedia, use Special:Export and check the "Include templates" option to get all the sub-templates, then upload the file with Special:Import on your wiki. You'll also likely have to install the ParserFunctions extension, Scribunto extension and install/enable HTML tidy. You also might need some CSS from Wikipedia's Common.css. You'll also need a lot of...
[18:40:19] 	 mm, it's the same
[18:40:28] 	 sauce: the process described in that FAQ entry is similar for all Wikimedia sites, including mediawiki.org
[18:41:06] 	 i tried the export/import instructions but i had trouble, i'll try it again
[18:41:15] 	 oh, what error did you get sauce?
[18:41:37] 	 not an error. it imported all the templates, which i could see by going to Special Pages | All Pages | Templates
[18:41:56] 	 ohhhh
[18:42:00] 	 but Template:Note still didn't look right
[18:49:31] 	 Wow, so forward looking: https://www.mediawiki.org/wiki/Release_notes/1.24
[19:02:05] 	 sauce: you probably also need to copy any custom css that the template may be using
[19:02:13] 	 (and maybe even javascript)
[19:02:15] 	 !css
[19:02:15] 	 To change styles for your wiki, go to one of the MediaWiki:xxx.css wiki page (NOT a file) and put your custom styles there (sysop/admin rights required). MediaWiki:Common.css is for all skins and should be used for content styles. MediaWiki:Vector.css is for the Vector skin (default), etc. For more information, see !skins and 
[19:12:15] 	 terrrydactyl: do you think you'll have some time to look at that within the next hour or so?
[19:17:43] 	 sumanah: yeah, i have some time. :)
[19:17:51] 	 Thanks! Awesome
[19:18:07] 	 sumanah: for some reason colloquy isn’t pinging me when i get a ping. sorry for missing your messages
[19:18:31] 	 got it terrrydactyl - maybe I should rig up Twilio to SMS you every time I ping you ;-)
[19:18:36] 	 hello sumanah!
[19:18:41] 	 hi fhocutt_! :D
[19:18:46] 	 sumanah: hahaha, that might be overkill
[19:18:53] 	 gonna check my settings
[19:18:59] 	 or maybe you are sitting near someone else I could para-ping
[19:19:12] 	 hi fhocutt_! how’s the internship going?
[19:19:27] 	 it starts on Monday!
[19:19:35] 	 I'm writing a twitterbot in the meantime
[19:19:39] 	 sumanah: i work mostly from home so unless you can ping my dog. that won’t work. :)
[19:20:13] 	 fhocutt_: fun! what are you trying to do with the twitter bot, if you don’t mind me asking?
[19:20:17] 	 right now I am figuring out the wikidata api and going to ask for clarification on the structure of wikisource
[19:20:22] 	 it's @autowikifacts
[19:20:27] 	 tweet obscure things! 
[19:20:52] 	 cool!
[19:20:59] 	 the wikipedia option tweets the first sentence of an article's longest section
[19:21:45] 	 neat
[19:22:20] 	 i once made a twitterbot to track my dog’s accident streak lol
[19:22:36] 	 but then i stopped updating it. should probably stop the cronjob for it, but too lazy. :P
[19:22:56] 	 heh. :)
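A guess at how the longest-section trick described above might be done with the standard action=parse module; fhocutt_'s real bot may differ. The api_get helper and User-Agent string are mine, byte offsets are treated as character offsets (exact only for ASCII), and the "*" key is the old-style JSON output format:

    import json
    import urllib.parse
    import urllib.request

    API = "https://en.wikipedia.org/w/api.php"

    def api_get(**params):
        params["format"] = "json"
        url = API + "?" + urllib.parse.urlencode(params)
        req = urllib.request.Request(url, headers={"User-Agent": "demo-script/0.1"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def longest_section_first_sentence(title):
        parse = api_get(action="parse", page=title, prop="sections|wikitext")["parse"]
        text = parse["wikitext"]["*"]
        secs = [s for s in parse["sections"] if isinstance(s.get("byteoffset"), int)]
        if not secs:
            return None
        bounds = [s["byteoffset"] for s in secs] + [len(text)]
        i = max(range(len(secs)), key=lambda i: bounds[i + 1] - bounds[i])
        body = text[bounds[i]:bounds[i + 1]]
        # first non-empty line that isn't a == heading ==, then first sentence
        prose = next((l for l in body.splitlines() if l and not l.startswith("=")), "")
        return prose.split(". ")[0]

    print(longest_section_first_sentence("Ahmedabad"))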
[19:25:15] 	 sumanah: are you going to email me the prompt?
[19:25:31] 	 terrrydactyl: sorry, what do you mean?
[19:25:39] 	 terrrydactyl:  https://www.mediawiki.org/wiki/Performance_guidelines#What_to_do_.28summary.29 and the section https://www.mediawiki.org/wiki/Performance_guidelines#How_to_think_about_performance_2   basically, you are a mid-level (not novice, not senior) MediaWiki developer, and I want to see whether those 2 sections make sense to you or whether there is jargon you do not understand
[19:25:50] 	 terrrydactyl: maybe you didn't see that in backscroll
[19:28:39] 	 sumanah: I'm working on getting human-readable claims from wikidata
[19:29:29] 	 there are something like 7 datatypes that a claim value (the "object") can have and I'm trying to figure out how to handle those
[19:30:15] 	 fhocutt_: the #wikimedia-wikidata channel will know many of those things - aude and Lydia_WMDE and DanielK_WMDE are also Wikidata experts and can answer things as well
[19:30:20] 	 (they work at Wikimedia Germany on Wikidata)
[19:30:33] * fhocutt_  heads over there
[19:30:47] 	 they are in this channel as well
[19:30:49] 	 just fyi
[19:31:09] 	 you can ask here if you want, or there - OK either way
[19:32:04] 	 ok! 
[19:32:47] 	 so I'm trying to get a random claim from a recently-changed page
[19:33:05] 	 sumanah: sorry, totally missed it!
[19:33:09] 	 it's ok!
[19:33:17] 	 and print out (subject) property (object), all in English
[19:36:27] 	 it's easy to handle if the value of the claim is a string or number or something similar
[19:37:27] 	 but it's more difficult if it's a link (to another wikidata item or a media file)
[19:38:21] 	 first, is there a way to search for only claims where the value has a certain datatype in my API call? It doesn't look like it but would be convenient. 
[19:40:14] 	 second, is there an accessible description of which of those datatypes are links and which have usefully printable content?
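A hedged sketch of dispatching on those datavalue types with the wbgetclaims module (Q42 is just an example item; the api_get and render_value helpers are mine, and the type names are the ones the API returns - anything unrecognized falls through):

    import json
    import urllib.parse
    import urllib.request

    API = "https://www.wikidata.org/w/api.php"

    def api_get(**params):
        params["format"] = "json"
        url = API + "?" + urllib.parse.urlencode(params)
        req = urllib.request.Request(url, headers={"User-Agent": "demo-script/0.1"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def render_value(datavalue):
        t, v = datavalue["type"], datavalue["value"]
        if t == "string":
            return v
        if t == "time":
            return v["time"]
        if t == "quantity":
            return v["amount"]
        if t == "monolingualtext":
            return v["text"]
        if t == "globecoordinate":
            return "%s, %s" % (v["latitude"], v["longitude"])
        if t == "wikibase-entityid":
            # links need a second lookup to become human-readable
            qid = "Q%d" % v["numeric-id"]
            ent = api_get(action="wbgetentities", ids=qid, props="labels", languages="en")
            return ent["entities"][qid].get("labels", {}).get("en", {}).get("value", qid)
        return "<unhandled type: %s>" % t

    claims = api_get(action="wbgetclaims", entity="Q42")["claims"]
    for prop, statements in claims.items():
        snak = statements[0]["mainsnak"]
        if "datavalue" in snak:  # "novalue"/"somevalue" snaks carry no datavalue
            print(prop, render_value(snak["datavalue"]))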
[19:41:12] 	 sumanah: i think for the most part, the first link is okay
[19:41:56] 	 i was kind of confused about what “Performance is often related to other code smells; think about the root cause.” meant. i don’t know if “smells” would translate well for people who are not native speakers. i had to read it a couple times to understand what you meant.
[19:42:28] 	 you are right terrrydactyl 
[19:42:40] 	 "code smell" is a bit of software engineering slang. I will fix
[19:43:06] 	 i didn’t understand this: Defer loading modules that don't affect the initial rendering of the page, particularly the "above the fold" part of the page visible on the user's screen. So load as little JavaScript as possible with position 'top'; instead load more components asynchronously or with the default ResourceLoader behavior of loading at the bottom of the page. See loading modules for more information.
[19:43:11] 	 but i don’t know much javascript
[19:43:19] 	 so it might just be me.
[19:43:37] 	 you quote “above the fold” as if i should know what it is.
[19:43:49] 	 thank you terrrydactyl this is exactly the kind of feedback I need
[19:44:26] 	 “instead load more components asynchronously or with the default ResourceLoader behavior of loading at the bottom of the page.” this sentence made sense though, even with my basic javascript knowledge.
[19:45:28] 	 “The tables you create will be shared by other code. Every query must be able to use one of the indexes (including write queries!).” I’m not sure what “indexes” refers to. my first inclination is primary keys or something like that, but not sure if that is what you meant.
[19:45:43] 	 that’s all the feedback i have for that section. will take a look at the second link now.
[19:46:44] 	 well, databases usually have indexes, that's a basic feature of a database engine
[19:47:49] 	 Vulpix: well, I didn't say "database indexes" 
[19:48:12] 	 Vulpix: this is exactly why I wanted the point of view of someone like terrrydactyl, to tell me when I have not been clear
[19:48:37] 	 Vulpix: it could just be my lack of knowledge on that topic. but it’s good to know that i should know about it.
[19:49:19] 	 sumanah: okay, about "Every query must be able to use one of the indexes (including write queries!)", it's confusing. What do you mean by "including write queries"? queries that "write" (update) to the database use indexes? or use indexes when writing queries?
[19:50:07] 	 Vulpix: good question! I will clarify that. Will need to check with AaronSchulz to make sure of accuracy. :)
[19:52:27] 	 Maybe you mean the latter. Although indexes are normally for selects, unless you update or delete all rows, any WHERE clause on them can use indexes (and most probably should)
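A small example may make Vulpix's point concrete (table, column, and index names are hypothetical):

    -- An UPDATE is a "write query", but it still has to *find* the rows
    -- first: its WHERE clause can (and should) use the index. Without
    -- the index below, this write triggers a full table scan.
    CREATE INDEX prefs_user_id ON user_prefs (user_id);
    UPDATE user_prefs SET pref_value = 'x' WHERE user_id = 12345;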
[19:57:04] 	 sumanah: the second link is pretty clear. the only thing i’m confused about is: what’s the difference between page views and rendering page content?
[19:58:36] 	 good question terrrydactyl - I shall clarify that
[20:03:40] 	 overall, i thought the articles were pretty clear. :)
[20:03:57] 	 let me know if you ever want more proofreads. i actually really like proofreading.
[20:06:06] 	 thank you terrrydactyl!
[20:06:37] 	 terrrydactyl: ok! https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code - wanna skim this and see which bits seem most useful like "I wish I had known this 5 months ago"?
[20:11:48] 	 sumanah: looking!
[20:16:15] 	 sumanah: “IOW, you don't want half of your users to have a good experience, you want all of them to.” i don’t see why you need to use an acronym here. it makes it harder to understand.
[20:16:32] 	 terrrydactyl: got it. would you mind editing that to "in other words"?
[20:17:17] 	 terrrydactyl: the feedback I will find most useful right now would be -  which bits seem most useful like "I wish I had known this 5 months ago"?
[20:17:24] 	 on that doc
[20:17:47] 	 okay
[20:18:55] 	 it's super drafty, I know the prose needs lots of work and intend to do a big prose passthrough
[20:24:20] 	 sumanah: hmm, i don’t know if, at my current state, i’m the intended audience. i see that there are a bunch of graphs but i’m not sure how to read and understand them. i feel kind of lost while reading it. sorry, not sure if that’s the feedback that you need.
[20:24:51] 	 terrrydactyl: are you developing MediaWiki code?
[20:24:55] 	 (these days)
[20:24:59] 	 no, i’m not
[20:25:03] * sumanah  is diagnosing audience mismatch
[20:25:04] 	 ahhhhhh
[20:25:07] 	 i’m working on wikimetrics
[20:25:12] 	 which is a totally different project
[20:25:20] 	 ok, thought you were working on wikimetrics AND some MediaWiki extension stuff
[20:25:41] 	 ah, no, i kind of hung up my PHP skills for a bit, haha.
[20:25:44] 	 never mind then
[20:25:50] 	 thanks for your feedback on the general guidelines
[20:25:55] * sumanah  will let you go
[20:26:17] 	 fhocutt_: did you get any help from the Wikidata people, maybe in the other channel?
[20:27:13] 	 sumanah: you’re welcome. :)
[20:27:29] 	 wasn't sure on the etiquette of asking the same question, but no, I haven't
[20:28:03] 	 right now I'm putting together something that isn't elegant but which hopefully will work
[20:30:42] 	 "code smells" fixed. Linked to glossary re "module" to improve the "defer loading modules" thing. Clarified "above the fold"
[20:31:23] 	 decided the details re the "even write queries" bit clarified it sufficiently
[20:31:41] 	 fhocutt_: it is fine to ask the same question in another channel. I hereby approve/allow it
[20:31:46] 	 go ahead
[20:31:48] 	 :)
[20:34:38] 	 done :)
[20:56:40] 	 !cms
[20:56:40] 	 Wikis are designed for openness, to be readable and editable by all. If you want a forum, a blog, a web authoring toolkit or corporate content management system, perhaps don't use wiki software. There is a nice overview of free tools available at  including the possibility to try each system. For ways to restrict access in MediaWiki, see !access.
[21:02:17] 	 !access
[21:02:18] 	 For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see .
[21:02:36] 	 !potato 
[21:03:53] <^d>	 !friday
[21:04:05] 	 Surely there's a youtube link we could add.
[21:04:29] 	 !squeee
[21:05:34] 	 DanielK_WMDE: #wikimedia-kawaii
[21:09:36] 	 ^d: I think you mean !caturday
[21:10:04] <^d>	 I'm never having cats again.
[21:10:08] <^d>	 Lived with cats too many years.
[21:12:15] 	 cats are <3
[21:12:40] 	 I'd rather have tags. (they see me trollin')
[21:13:46] <^d>	 I want a puppy.
[21:14:30] 	 ^d: what colour?
[21:14:51] 	 Sob, https://www.mediawiki.org/wiki/Manual:$wgDefaultSkin is so outdated again
[21:15:23] 	 When did we kill modern?
[21:15:27] <^d>	 Hmmm, green and fuchsia.
[21:15:37] <^d>	 puppy, that is.
[21:15:41] 	 Ah no we didn't
[21:15:46] <^d>	 Not yet. Soon.
[21:15:57] 	 Let me think, I may have one like that
[21:16:10] 	 ^d: my upstairs neighbours at my brother's unit have a dachshund, so cute when he is running up the stairs
[21:16:53] 	 Hm, only for some questionable definition of green
[21:17:19] <^d>	 p858snake|l: My neighbors had a dachshund growing up. Cute dogs.
[21:18:03] <^d>	 My sister-in-law's coworker breeds Yorkies. Thinking of getting one once I move out of my current apartment.
[21:38:56] 	 I'm not able to upload tif files. After uploading a single paged/layered file, I get this error message: http://pastebin.com/rqg4EnSa
[21:39:32] 	 Could someone help me figure out the issue?
[21:41:26] 	 It may be that you don't have TIFF support on your system
[21:41:49] * marktraceur  guesses.
[21:55:03] 	 marktraceur: Aha! Fixed it. Just had to fix the ImageMagick location.
[22:03:32] 	 awww, Hacker School reunion week is ending. (just finished the last presentations)
[22:47:50] 	 ok, I am gonna make my best stab at translating these notes from Zurich
[22:47:50] 	 "think of resources as being the thing you deliver. That is the main rep of the content. Add indexing you need in ephemeral tables. You can directly retrieve/access the main request by getting resource - HTML, JSON, etc - and rearchitect your indexing layer separately...."
[22:48:20] 	 erm, context?
[22:48:26] 	 also, hi :)
[22:49:04] 	 sumanah: which doc is this for now?
[22:49:17] 	 https://www.mediawiki.org/wiki/Performance_guidelines
[22:49:25] 	 gwicke said it I think
[22:55:11] 	 ok, leaving it as a talkpage thing, I think it's not essential
[22:55:48] 	 yeah, i don't really get what it means
[23:02:21] 	 jeremyb: can you find for me the hash module mentioned in https://www.mediawiki.org/wiki/Performance_guidelines#ResourceLoader and edit the pg to add a link?
[23:02:23] 	 to the code?
[23:02:27] 	 if you can I'd appreciate it
[23:03:27] 	 maybe...
[23:04:05] 	 let's see how long this takes
[23:04:27] 	 mediawiki-core$ time git remote update
[23:06:25] 	 real	2m20.693s
[23:09:39] 	 sumanah: That bit about the ephemeral table indices above is definitely gwicke. I'm not sure I know exactly what he means either.
[23:10:24] 	 I think he's talking about an object store API 
[23:10:40] 	 it's about DB performance vs. normal form
[23:10:53] 	 and flexibility in development
[23:11:13] 	 hi
[23:11:14] 	 andre__: ^demon|away will you approve my wikitech-l email? thanks
[23:11:20] 	 how can i add an image on the images dir?
[23:11:33] 	 [[File:imagename]]?
[23:11:54] 	 gwicke: expand on it on the talkpage of [[Performance guidelines]] if you get a chance? thanks
[23:12:24] 	 sumanah, ^demon|away: done
[23:12:48] 	 thanks
[23:12:52] 	 it's on basedir/images/
[23:12:55] 	 some people may get it 2ce, oh well
[23:12:58] 	 or should it be anywhere else?
[23:13:02] 	 ok, Friday dinner, see y'all later
[23:13:10] 	 later sumanah!
[23:13:24] 	 fhocutt_: I gotta go but maybe end of day today send a status note to your mentors?
[23:13:27] 	 have a good night
[23:13:34] 	 sure, can do
[23:13:35] 	 please anyone?
[23:15:17] 	 biberao: you might ask mediawiki-l
[23:15:18] 	 !lists
[23:15:18] 	 mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See https://www.mediawiki.org/wiki/Mailing_lists for details.
[23:15:23] 	 ?
[23:15:50] 	 biberao: Just use Special:Upload?
[23:16:51] 	 There should be a link in the Tools section of the left hand sidebar on your wiki.
[23:16:56] 	 where does it store the images, bd808?
[23:17:04] 	 so i could add all of them manually, it's faster
[23:17:41] 	 It depends on your wiki configuration. The images are placed in $wgUploadDirectory
[23:17:44] 	 ok found it
[23:18:14] 	 But without a page associated I don't think you'll be able to reference them
[23:18:56] 	 thanks
[23:19:22] 	 biberao: The maintenance/importImages.php script may be what you want for bulk import
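A usage sketch for that script (run from the MediaWiki root directory; the source path is a placeholder, and the exact options vary by version, so check --help first):

    php maintenance/importImages.php --help
    php maintenance/importImages.php /path/to/images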
[23:20:21] 	 will do that
[23:20:22] 	 thanks
[23:41:55] 	 Gloria: phab=phabricator
[23:48:00] 	 jeremyb: Interesting.
[23:48:19] 	 I guess Bug for Bugzilla and Wiki for Wikimedia then.
[23:56:26] <^demon|away>	 Gloria: And I for IRC and Com for Computers.
[23:57:34] 	 Perf.