[00:49:49] how do I clear a password reset for a user?
[00:50:05] it was sent to the wrong email, I changed his email in the DB and now I can't send another password reset
[00:54:51] Skaag: you could just use changePassword.php and email that to him https://www.mediawiki.org/wiki/Manual:Resetting_passwords#Use_the_changePassword.php_maintenance_script
[00:55:38] If it's due to the throttle error, you can also just set $wgPasswordReminderResendTime to false temporarily, but PiRSquared's solution is probably better
[00:55:54] I prefer bawolff's solution. :P
[00:56:02] didn't know about that
[00:57:02] or you can always set user_newpass_time to 19700101000000 in the db :)
[01:57:43] thanks PiRSquared
[01:57:53] np Skaag :)
[01:58:33] Skaag: you told them to change it, right?
[02:00:10] eh, the user_newpass_time and re-reset would have worked too :P
[02:33:18] hi, guys! can someone give me help?
[02:35:22] these are my parsoid install steps:
[02:35:23] 1, git clone https://gerrit.wikimedia.org/r/p/mediawiki/services/parsoid
[02:35:23] 2, yum install nodejs npm (fedora 19)
[02:35:23] 3, in parsoid directory, "npm install "
[02:35:23] 4, configure parsoidConfig.setInterwiki( 'localhost', 'http://localhost/mediawiki/api.php' )
[02:35:23] 5, run the server node api/server.js
[02:36:48] what am i doing wrong?
[02:37:33] Dunno.
[02:37:40] #mediawiki-parsoid may be more helpful.
[02:37:46] Though it's a quiet time of day.
[02:37:58] Gloria: thanks, but nobody is there
[02:38:33] Gloria: i want to install Extension:VisualEditor
[02:38:51] but it requires installing parsoid
[02:39:20] i'm stuck on "install parsoid" :-(
[02:45:53] !debug
[02:45:53] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[04:19:17] Anyone know the default mediawiki message for signs?
[04:22:06] Revi: You mean user signatures?
[04:22:06] I think it's "MediaWiki:Signature", creatively.
[04:22:07] There's a separate message for anons.
[04:22:12] yes
[04:22:22] Thanks, Gloria
[04:22:43] No problem.
[04:23:06] https://en.wikipedia.org/wiki/MediaWiki:Signature-anon is the anon version.
[05:41:58] grrr, the "advanced toolbar" preference is not respected any longer?
[05:42:28] Hello #mediawiki, what's the difference between Project:Help and Help:Contents? Thank you :)
[05:46:13] CasperVector: Project is mediawiki.org-specific, Help is for all mediawikis
[05:50:29] Nemo_bis: Thanks, I discovered Wikipedia redirects Project:Help directly to Help:Contents...
[05:50:55] (I mean I wanted to do the same thing :)
[05:56:49] because in that case it's Wikipedia-specific
[06:14:06] hi
[06:14:58] I am looking for a geocode-based Wikipedia API
[06:15:21] http://www.geonames.org/maps/showOnMap?q=mumbai
[06:18:45] i am looking for something like http://www.geonames.org/maps/showOnMap?q=mumbai, a geocode wikipedia API
[06:27:48] hi
[06:28:15] i am looking for a geocode wikipedia api
[08:50:51] https://community.sugarcrm.com/sugarcrm/topics/where_is_sugarcrms_source_code_version_control_system_vcs?topic-reply-list%5Bsettings%5D%5Bfilter_by%5D=all&topic-reply-list%5Bsettings%5D%5Breply_id%5D=14283028#reply_14283028
[10:24:56] fhocutt: hi!
[10:25:08] (just got your note)
[10:25:38] Got a moment to chat?
[10:32:00] sumanah: urm yes
[10:32:48] hi kishanio! Sorry, I was talking to fhocutt
[10:34:11] but kishanio - thanks again for improving MediaWiki. :-) How is life going?
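A minimal sketch of the two workarounds suggested at 00:54–00:57, assuming shell access to the wiki and a standard LocalSettings.php; the username and temporary password below are of course hypothetical.

```php
<?php
// LocalSettings.php: temporarily lift the reset-email throttle mentioned at 00:55.
// Remove (or restore the previous value) once the new reset mail has gone out.
$wgPasswordReminderResendTime = false;

// Alternatively, skip email entirely and set a known password from the shell,
// as suggested at 00:54 (run from the wiki's root directory):
//   php maintenance/changePassword.php --user='ExampleUser' --password='temporary-password'
// Then ask the user to log in and change it immediately.
```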
[10:34:57] sumanah: oh, my irc got reconnected and that was the first message from there on.
[10:37:30] sumanah: Sure. And it's pretty good. I'm on holidays for 2 weeks so trying to grasp as much as i can.
[10:38:39] how about you
[10:39:43] I'm all right!
[10:39:56] My mind currently is on https://www.mediawiki.org/wiki/Talk:Performance_guidelines#TODO_for_Sumana.2C_14_May_2014
[10:40:20] kishanio: I, as senior tech writer for the Wikimedia Foundation, am working to write things that will help lots of developers make the right decisions
[10:40:50] to make the code secure, to make it run fast, and to make sure it fits well with our existing code base and infrastructure
[10:41:15] sumanah: i know you. You're famous, i have http://harihareswara.net bookmarked :p
[10:41:27] What?! How did you ever hear of me?
[10:41:51] btw you might find https://www.mediawiki.org/wiki/Performance_profiling_for_Wikimedia_code helpful if you're trying to ensure your code runs fast
[10:42:54] So I've had http://crummy.com bookmarked for a few months now. I follow it for movie reviews and all. So once i got hooked reading something there and i found your weblog.
[10:43:24] You have made Leonard and me laugh happily just now kishanio :)
[10:45:06] sumanah: Yay
[10:51:40] sumanah: how would one not have heard of the glorious you?
[10:52:19] Mithrandir: I laughed aloud, but Leonard said "That's how I feel!"
[10:52:52] :-)
[10:53:40] Mithrandir: so, Wikidata - have you checked it out at all?
[10:53:48] https://www.wikidata.org
[10:55:28] I haven't.
[10:55:40] it's basically all the various metadata from wiki* pulled out?
[10:56:54] In the far future, it will be, and then Wikipedia/Wikivoyage/Wikisource/etc. will have a lot of calls to the Wikidata API embedded in 'em, and this will make updating and multilingual stuff better
[10:57:50] interesting
[10:58:26] Like, instead of manually updated infoboxes on 20 different Wikipedia pages reflecting the current average yearly rainfall of Tanzania (on English/French/etc. Wikipedias), we can have that fact updated in 1 place and propagated
[10:58:29] Structured data!
[10:58:40] yeah, that's pretty cool
[10:59:10] sumanah: Regarding the todo for performance_guidelines: there is this point about improving API etiquette. What's it about? Because right now i'm more focused on APIs, solving bugs for the same and trying to learn the wikimedia architecture.
[11:00:10] kishanio: https://www.mediawiki.org/wiki/API:Etiquette is a page for people to read if they are writing clients that read from or write to our sites using the web API
[11:02:32] BTW kishanio what is your learning style? how do you learn best? from seeing examples or learning principles? from looking at what other people have done to plan out your own work, or from making your own mistakes first? Visually or with words? step by step or jumping into the deep end? etc
[11:03:40] Some people find this questionnaire helpful http://www.engr.ncsu.edu/learningstyles/ilsweb.html (all the answers are valid - it's just to help you understand how you learn, so if you get frustrated, you can try another approach)
[11:04:17] For instance, people who learn well by reading might learn more about the API by seeing discussions of what we need to improve in the API in the long term, such as https://www.mediawiki.org/wiki/Requests_for_comment/API_roadmap
[11:05:02] (more about engineering learning styles: http://www4.ncsu.edu/unity/lockers/users/f/felder/public/Learning_Styles.html )
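A side note on the API:Etiquette advice linked at 11:00: it mostly comes down to a few mechanical habits, such as identifying your client with a descriptive User-Agent, passing maxlag so lagged servers can turn you away, and backing off instead of hammering the API. A hedged PHP sketch; the endpoint, User-Agent string and retry policy are illustrative, not anything prescribed in the log.

```php
<?php
// Hypothetical "polite" API GET: descriptive User-Agent, maxlag, one simple retry.
function politeApiGet( array $params ) {
    $params += [ 'format' => 'json', 'maxlag' => 5 ];
    $url = 'https://www.mediawiki.org/w/api.php?' . http_build_query( $params );
    $context = stream_context_create( [ 'http' => [
        // Identify the client and a way to contact its operator.
        'header' => "User-Agent: ExampleBot/0.1 (https://example.org/bot; bot@example.org)\r\n",
    ] ] );
    $result = json_decode( file_get_contents( $url, false, $context ), true );
    if ( isset( $result['error']['code'] ) && $result['error']['code'] === 'maxlag' ) {
        // Replication is lagged; wait and retry once rather than looping aggressively.
        sleep( 5 );
        $result = json_decode( file_get_contents( $url, false, $context ), true );
    }
    return $result;
}
```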
[11:09:39] sumanah: Oh alright. And i prefer hands-on. I can read code but need a reason/use-case behind it. And that's the reason i started solving bugs from the first day. I did make a few mistakes at first but now i'm a bit familiar with the protocol for submitting patches.
[11:09:56] * sumanah understands
[11:10:33] So it's a mixed model: i do make mistakes but at the same time i research how people have done it in the past.
[11:14:31] But this is great, i'll read through it. Thanks.
[11:16:21] kishanio - glad to help!
[12:21:26] Hi, I have an English wiki set up, and I would now like to enable others to translate this. I would like to use the same database, so a single installation that handles multiple languages. What is the best way of doing this?
[12:22:37] e.g. if there is /wiki/exposure then should a French person make /wiki/fr-exposure or /wiki/exposition or what?
[12:49:31] DrSlony: /wiki/exposure/fr I think
[12:49:50] DrSlony: take a look at commons for how they do it
[14:19:17] Krenair: I hope my recent wikitech-l (just now) email does not embarrass you with praise :)
[14:30:10] Is anyone familiar with some of the resourceloader modules? I'm trying to test for page existence with mw.Title.exists() and it doesn't seem to be working.
[14:38:51] Rosencrantz: do you get a particular error?
[14:39:00] No, I just get null
[14:39:14] Like when testing for something I know exists, like the Main_Page
[14:39:20] Whoa
[14:39:33] have you successfully used RL in the past at all?
[14:40:24] Yeah.. just for some simple stuff
[14:40:33] Honestly I'm just wondering if I'm using this one wrong
[14:42:38] Rosencrantz: is https://www.mediawiki.org/wiki/ResourceLoader/Migration_guide_%28developers%29 or a related doc useful?
[14:43:49] Rosencrantz: mw.Title.exists is a utility for storing and retrieving that information through title normalisation
[14:43:57] it does not itself compute that information
[14:44:14] like mw.config and mw.messages, it is an interface for storing it, it doesn't go and fetch it from the server.
[14:44:34] Oh!
[14:44:35] See also https://doc.wikimedia.org/mediawiki-core/master/js/#!/api/mw.Title-static-property-exist
[14:45:28] So I'd have to do something through the API then to test if a page exists.
[14:50:29] Rosencrantz: Yep
[14:51:07] Rosencrantz: You can use $.ajax (or mw.Api to abstract it slightly more for convenience) and request action=query&titles=
[14:51:36] ok, awesome, thanks!
[14:53:18] Rosencrantz: https://www.mediawiki.org/w/api.php?format=jsonfm&action=query&titles=Main_Page|Sandbox&indexpageids
[14:53:58] ooh nice
[14:54:04] https://www.mediawiki.org/w/api.php?format=jsonfm&action=query&titles=Main%20Page|Sandbox|Non
[14:54:07] (better example)
[14:54:39] you basically, for each entry in .pages, go .set( mw.Title( page.title ), page.missing !== undefined )
[14:55:24] *nods*
[14:55:25] Thanks!
[14:55:35] the passing through mw.Title instead of plain page.title is because exists.set() wants to have "Title(s) in strict prefixedDb title form", not the display title (e.g. _ instead of " ")
[14:55:38] Okay :)
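The existence check described above (14:51–14:55) is just an action=query request in which pages that do not exist come back flagged with "missing". The log does it client-side with $.ajax/mw.Api; purely as an illustration of the same API call, here is a hedged server-side PHP sketch. The endpoint and example titles are illustrative.

```php
<?php
// Ask the action API which of the given titles exist (illustrative sketch).
function pagesExist( array $titles ) {
    $url = 'https://www.mediawiki.org/w/api.php?' . http_build_query( [
        'action' => 'query',
        'format' => 'json',
        'titles' => implode( '|', $titles ),
    ] );
    $data = json_decode( file_get_contents( $url ), true );
    $exists = [];
    foreach ( $data['query']['pages'] as $page ) {
        // A page that does not exist carries a "missing" key in the response.
        $exists[ $page['title'] ] = !isset( $page['missing'] );
    }
    return $exists;
}

print_r( pagesExist( [ 'Main Page', 'Sandbox', 'Some page that probably does not exist' ] ) );
```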
[15:02:12] hi! how may i disable the check for similar names when uploading a file? i just want to be able to upload files with the same name but different extensions.
[15:02:37] haven't found anything in the docs about it
[15:12:14] Hello - Just looking for some help where the support desk couldn't help me. I'm re-working the login page to include entering a birthdate, and I wish to load a system message along with that. The problem is it seems that when loading a system message through PHP, the wiki code and HTML is not pared and thus displays as text. Any known solution?
[15:13:03] *the wiki code and HTML is not PARSED - Sorry for that spelling issue.
[15:14:41] Elvana: Are you using ->parse()?
[15:16:15] marktraceur: Eh, no. I really don't know enough about PHP to know all the ins and outs - I'm coming from HTML5 and java so it's hard to adjust. Do I just enclose the function that loads the message within ->parse()?
[15:17:23] No no
[15:17:43] Elvana: When you get the message, you do something that looks like $out->msg( 'message-key' ); right?
[15:18:46] msg('birth_info') ?> - To be exact.
[15:19:11] Elvana: Do $this->msg( 'birth_info' )->parse() instead.
[15:20:26] Is there anything I should enclose within ->parse()?
[15:22:22] hi pidzero - this is on your personal wiki, right?
[15:22:30] marktraceur: Fatal error: Call to a member function parse() on a non-object in
[15:22:41] ...hm.
[15:22:49] Oh, probably msg() doesn't return a Message object.
[15:22:50] sumanah, well not my own but a personal wiki, yes
[15:22:56] Blargh one sec
[15:23:01] Dealing with deployment blues
[15:23:57] That's fine, I can wait.
[15:25:16] In the meantime - Are there any general "hooks" for adding to the "user tables" (Or whatever has to do with the user's info?) - I'm having to go some roundabout way with this.
[15:25:41] pidzero: am skimming https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads and yeah I do not see anything....
[15:26:15] Elvana: Probably you should add a new table for this; there are hooks for registering database updates
[15:27:23] marktraceur: Would that require running the update script again?
[15:27:43] I'm working with two wikis using a shared user database, by the way.
[15:28:16] sorry pidzero I don't know. You might want to try http://hexm.de/mw-search which searches source code + mailing lists as well
[15:28:40] sumanah, probably a hard coded requirement. thx anyway
[15:30:43] Also worth noting: I have three users already signed up (Me, two bots).
[15:32:22] pidzero: Are you using Special:Upload or the UploadWizard extension?
[15:32:23] Not exactly sure where the table is located, either, or whether I'd have to add the new entry myself or not.
[15:34:32] Elvana: https://www.mediawiki.org/wiki/Manual:Developing_extensions#Adding_database_tables
[15:34:53] Elvana: Where are you doing $this->msg()? In a skin file? Special page?
[15:35:31] Special page (In UserCreate.php)
[15:36:01] Elvana: I mean, what is $this ? Is it a SpecialPage subclass?
[15:37:15] Ah, yes it is. It's using "msg" - Should I add 'parse' as well?
[15:37:46] I *think* so.
[15:37:47] marktraceur, i guess Special:Upload
[15:38:04] pidzero: Weird, I thought that was possible. I think there's an open bug, sec.
[15:38:33] Fatal error: Call to undefined method UsercreateTemplate::parse()
[15:38:37] Huh
[15:38:43] Elvana: So that's not a SpecialPage.
[15:39:00] Elvana: Can you find UsercreateTemplate in the code and tell me what class it extends?
[15:39:35] QuickTemplate
[15:40:32] Well wait - It's listed in an order like: class name1 extends name2
[15:41:26] Yeah
[15:41:42] Huh
[15:41:46] Which of the two are you looking for? Both?
[15:42:09] I got what I needed
[15:42:19] But I'm not very happy about it...the QuickTemplate class is terrible
[15:42:45] Is there something else I should rather use?
[15:43:01] Probably not, but it means unravelling a bit more thread :) sec
[15:43:21] marktraceur, seems to be by design 'The following checks are performed [...] File exists with normalized extension [...]' - and i don't see any conditional in the code for affecting this behavior :\
[15:43:21] Note to developers: "An ugly, ugly hack" is NOT ACCEPTABLE DOCUMENTATION
[15:44:09] Hee hee
[15:44:11] Elvana: Try using msgHtml() or msgWiki() instead
[15:44:18] I'm not sure which will work, one of them should
[15:44:26] pidzero: Sorry :(
[15:45:01] The first one worked!
[15:45:08] It's a MIRACLE!
[15:45:27] The internet doesn't hate me! Yay! Thank-you!
[15:45:59] marktraceur, there's no reason for that! thanks for taking time
[15:47:02] No problem, Elvana and pidzero
[15:48:10] Something else I'm curious about: Is it possible to nuke the PHP layout and switch to an HTML based layout backed by PHP (For the login page?)
[15:50:42] Ehhh
[15:50:51] Elvana: There's some work on templating, but it's not complete I think
[15:51:10] Elvana: Pretty sure spagewmf is the person to talk to, he's not in yet
[15:51:36] Hello, how do I view the blobs (current page content) in my MediaWiki 1.22 install. I have to fix thousands of URLs in it.
[15:51:36] That's fine, it's just a bit more difficult having both the layout and functions behind co exist side-by-side.
[15:51:47] *behind it
[15:52:19] I plan to dump that table, take care of the URLs, and import that table again.
[15:52:42] Elvana: Yeah, I think the answer is "soon" :)
[15:53:35] That's good. I'm annoyed with the way things like that have to be done now.
[15:53:58] I think everything should be more based around stupid people like me :'D
[15:55:24] Gosh, I'm really afraid I'm going to screw something up if I attempt adding another entry to the user table.
[16:03:03] I can't seem to find the name of the user table either.
[16:08:35] Seriously, is the name of the table just 'user'?
[16:11:06] hi Elvana - this is in the db schema for MediaWiki?
[16:11:08] "User"
[16:11:23] https://www.mediawiki.org/wiki/Schema
[16:12:05] Ah, thank-you. It's confusing when it has such a plain name.
[16:14:11] Elvana: have you btw already read https://www.mediawiki.org/wiki/MediaWiki_architecture ? It does not mention the User table but it mentions other useful things
[16:15:01] BTW Elvana - nice to meet you. I'm Sumana Harihareswara, senior technical writer at the Wikimedia Foundation. Welcome to MediaWiki :)
[16:15:32] Reading now - Thank-you! Nice to meet you too. :)
[16:17:30] I guess the only part that's tripping me up here is adding another field - So far what I've read seems to only tell me how to add new tables and/or modify these tables as a whole.
[16:55:12] Okay, I'm viewing my MySQL database directly - Can I add the table that way instead?
[17:05:42] Elvana: Ehhhh, "yes", but it will mean you're not able to distribute the code you're writing as easily
[17:05:50] We write the schema loader hooks for a reason
[17:06:14] Elvana: You shouldn't add another field, you should add a new *table*.
[17:08:09] A completely new table..?
[17:08:27] But if it has to do with the user, shouldn't it be in the use table?
[17:08:56] *user
[17:09:50] This is something I'd need step-by-step help with - It was my mistake dodging PHP early on and going for things like Java and HTML5 that don't require any server connection at all.
[17:09:50] Elvana: why do you want to modify the user table?
[17:10:15] legoktm: To comply with COPPA laws, I need to collect the user's birthdate.
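To summarise the message-parsing fix that worked above (15:19–15:45) in code form: in a SpecialPage subclass, msg() returns a Message object you can ->parse() yourself, whereas UsercreateTemplate is a QuickTemplate, which exposes its own output helpers instead. A hedged sketch of the two contexts; the class name in the first half is hypothetical.

```php
<?php
// Context 1: a SpecialPage subclass. msg() returns a Message object, so any
// wikitext/HTML in the message can be parsed before it is sent to the output.
class SpecialBirthdateDemo extends SpecialPage {   // hypothetical name
    public function __construct() {
        parent::__construct( 'BirthdateDemo' );
    }

    public function execute( $par ) {
        $this->getOutput()->addHTML( $this->msg( 'birth_info' )->parse() );
    }
}

// Context 2: a QuickTemplate-based template such as UsercreateTemplate.
// There is no ->parse() there; the template's own helpers echo the message:
//   $this->msgHtml( 'birth_info' );  // outputs the raw message HTML (what worked at 15:45)
//   $this->msgWiki( 'birth_info' );  // runs the message through the wikitext parser first
```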
[17:10:31] I have no idea why such a feature hasn't been created yet.
[17:11:31] Hello, how do I view the blobs (current page content) in my MediaWiki 1.22 install. I have to fix thousands of URLs in it.
[17:11:34] yeah, I would have guessed something like that would already exist.
[17:12:26] Elvana: I would create a separate table, `birthdates`, with two columns, bd_user and bd_date, where bd_user is the user's id from the `user` table, and bd_date is the birthdate.
[17:12:37] so you can easily join with the user's id
[17:13:37] multiverse: the `text` table. https://www.mediawiki.org/wiki/Manual:Text_table
[17:14:29] multiverse: if you want to fix urls though, you can use Special:LinkSearch and the externallinks table
[17:14:35] And could I just do that with my SQL client, or should I try using these "hooks" to do so?
[17:14:52] The hook method asks me for data I'm not sure about.
[17:15:28] you should ideally use https://www.mediawiki.org/wiki/Manual:Hooks/LoadExtensionSchemaUpdates
[17:17:18] it really depends on if you're just writing the code for yourself, or want to eventually publish it as an extension for others to re-use (which would be totally awesome)
[17:19:11] multiverse: what do you mean by "fix URLs"?
[17:20:08] My URLs are images. I upgraded from one version of a gallery to another, and of course the htaccess redirects aren't working as advertised. So I have to do some global replacements.
[17:20:46] legoktm: It's possibly something I could publish, but at this point I still need help. I have a general understanding of the way things work, but it's the "little" things here that are really getting to me.
[17:21:18] multiverse: this may help:
[17:21:18] !e Replace_Text
[17:21:18] https://www.mediawiki.org/wiki/Extension:Replace_Text
[17:21:58] Can't I just add the table myself in my SQL client, then "pretend" it was made using the hook? I hardly see the difference, and if I were to publish it I would of course include the hook to auto-create the tables.
[17:23:59] Yaron: bingo
[17:25:40] legoktm: I'm also using MediaWiki 1.19, and I can't seem to find a function *specifically* designed to add a table; the article you linked to doesn't list my version at all, it cuts around it by saying "Before 1.18" and "After 1.20"
[17:26:01] heh.
[17:26:24] Elvana: the hook is just for update.php, if you're still writing the extension, just add it with your sql client and add the hook later on. mediawiki won't care
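A minimal sketch of the hook recommended at 17:15, wired up for the `birthdates` table suggested at 17:12. The function name and file path are hypothetical; the referenced .sql patch would hold the actual CREATE TABLE for bd_user/bd_date.

```php
<?php
// In the extension's setup file (1.19-era registration style):
$wgHooks['LoadExtensionSchemaUpdates'][] = 'efBirthdatesSchemaUpdates';

/**
 * Called when update.php runs; registers the patch that creates the
 * birthdates table (bd_user referencing user.user_id, bd_date for the date).
 */
function efBirthdatesSchemaUpdates( DatabaseUpdater $updater ) {
    $updater->addExtensionTable(
        'birthdates',
        __DIR__ . '/sql/birthdates.sql'   // hypothetical path to the CREATE TABLE patch
    );
    return true;
}
```

As noted at 17:26, this only matters when update.php runs; during development you can create the table by hand and add the hook before publishing.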
[18:26:02] csteipp: hi there, just waving at you *wave* to mention that I'm here for our chat in 5 min. Hope you are doing well.
[18:26:37] sumanah: I'm editing in examples right now :)
[18:27:43] :D
[18:34:23] Alright, sumanah: I made a few updates and reorganized https://www.mediawiki.org/wiki/Security_for_developers/Architecture a little big
[18:34:39] s/big/bit/
[18:35:00] bd808: https://www.mediawiki.org/w/index.php?title=Performance_guidelines&diff=1005878&oldid=1005843 Thanks to you, I can shorten the perf guidelines, because the media storage documentation lives somewhere else :D
[18:36:11] csteipp: for the threat modeling section, is there a freely available example image?
[18:36:34] If we can show people what a data flow diagram looks like, that will help them
[18:36:52] Hmm... that's a good question. I can always write one up, but there must be ones out there
[18:37:14] I wonder whether OWASP makes sure to openly license all their training materials?
[18:37:17] * sumanah checks
[18:38:15] https://www.owasp.org/images/8/8a/OWASP_SiteGenerator_V2_DFD.pdf
[18:38:45] mentioned on https://www.owasp.org/index.php/SpoC_007_-_OWASP_Site_Generator which says "Content is available under a Creative Commons 3.0 License unless otherwise noted." namely BY-SA. We're golden :)
[18:39:02] csteipp: btw you ok talking here or would you rather move someplace? I'm fine here
[18:39:15] I'm fine here
[18:39:24] OK. :)
[18:39:43] I'll make sure to make the first threat model I do with another team here freely available
[18:39:48] (for those wondering: Chris and I are talking about the security guidelines so that we can get them finished by the end of June)
[18:40:03] In this chat, we'll check in on https://www.mediawiki.org/wiki/Security_for_developers/Architecture , look down the checklist of examples or stories for each principle, and decide whether we're ready to give the list of principles to Design for the poster.
[18:40:10] That's great
[18:41:18] OWASP has https://www.owasp.org/index.php/Perform_security_analysis_of_system_requirements_and_design_%28threat_modeling%29 but I think the PDF I have linked to is the only thing on OWASP's site that is called a data flow diagram - I could be wrong, I do not quite trust search because all metadata is broken
[18:41:22] we live in a fallen world, etc., etc.
[18:42:30] csteipp: in "implementation" should there be something reflecting Risker's suggestion https://www.mediawiki.org/wiki/Talk:Security_for_developers/Architecture ?
[18:42:33] For the examples, I probably need to flesh out most of those.. Although as someone said in the perf discussion, it might be good to also let many of those be added organically.
[18:42:52] Ah, yeah, I should call that out there too.
[18:43:47] So, my personal bias is to *make sure* there is at least one good example or story for each principle, and to *prefer* that there is also at least one negative example for each principle, when we ship the doc.
[18:44:22] Because people who learn more by examples than by principles (in Felder-Silverman learning style language, sensing rather than intuitive learners) really need examples.
[18:44:48] Gotcha
[18:45:11] So I still need a good example of a security control that we've kept simple.
[18:45:28] (btw, I may as well take this moment to talk about how we're hitting the other three learning style axes. Visual-to-verbal: we'll have a poster as well as a text doc. Reflective-to-active: reflective learners can read, and for active learners, let's link to one of those interactive tutorials, Project Cow?)
[18:45:29] I'll hunt down that unicorn eventually ;)
[18:46:01] Project cow?
[18:46:49] I am blanking on the name, sorry. Google and OWASP each have different "download this super vulnerable CMS and learn how to crack into it" downloads
[18:47:14] Webgoat!
[18:47:21] Ah, yep :)
[18:47:32] phew https://www.owasp.org/index.php/Webgoat
[18:47:39] And google gruyere
[18:47:52] and then Google security folks made a different one - Gruyere, thank you, no way I would have remembered that
[18:48:13] * sumanah evidently thinks all livestock are in a giant quadrupeds-we-manage category
[18:48:26] I can remember which cheese it is, but getting the spelling always gets me..
[18:49:04] (And for the sequential-to-global folks, well, your HOWTO is pretty sequential, and we can link to the bits of our codebase that are especially security-centric, like AbuseFilter, that the more globally inclined folks can use to jump into the deep end and learn all at once.)
[18:49:48] A security control that we have kept simple. Login?
[18:50:26] * sumanah tries to think outside the box
[18:50:53] I cringe, only since the way we combine account creation, password reset, and user login in one special page is always causing problems
[18:51:08] We simply don't allow you to, for instance, edit on another person's behalf (author-committer separation like in Git). Although we do allow bots to edit and we sort of have bot owners
[18:51:17] But maybe one aspect of it..
[18:51:21] that prohibition is a kind of simplicity
[18:51:53] another kind of simplicity is that we choose not to write ACL granularity, thus making sure it cannot break
[18:51:54] Yeah, I might be able to pull a single aspect. Also our user groups are pretty simple
[18:52:49] csteipp: have you talked with Legal re the "What are we trying to protect?" section?
[18:52:55] Yeah, I'll probably add an example for our access control
[18:53:16] Not yet, I need to do another pass based on the draft privacy policy
[18:53:19] OK
[18:53:57] Hmm. Timing wise, I was also thinking we can wait until June 6th when the new policy is approved, but that might be too long for you.
[18:54:01] btw, I'm sorry I haven't explicitly said this yet - I like your "Developing Securely in MediaWiki" summary and then you go into most of those bullets in more detail
[18:54:21] csteipp: I think we can do most of the work on this doc without that particular thing
[18:55:15] because it feels unlikely that, in the next 3 weeks, we will add a wholly new category of data to protect that would not be protected by the assessment, principles, threat modeling, etc. sections
[18:55:22] we may need to add some bits to "implementation"
[18:56:08] Yep, that process is a bit of a work in progress right now
[18:56:57] Those are what most people check for in code reviews
[18:57:10] But I'd like to expand those a little.
[18:57:46] I'm also hoping that we'll find a static analysis tool that developers can run their code through as well, but I haven't been able to find a helpful one yet.
[18:58:34] :/ You've asked your l33t network of pals csteipp?
[18:59:24] Yep. Sadly, few other people use php.
[18:59:34] btw I know the ElasticSearch talk is about to start in case you want to stop now
[19:00:08] Oh, I did want to watch that. Anything else you need from me in the next week on this?
[19:00:24] Yes, but I'll email you. Go enjoy the talk! https://plus.google.com/events/cokipb2senmmvkvdjif7aq55kac
[19:00:41] (sorry for not making the best use of this time, will be more careful next time)
[19:00:42] Cool. Thanks!
[20:04:51] nooby question: is there an easier way to install than the lengthy https://www.mediawiki.org/wiki/Manual:Installation_guide ?
[20:04:54] https://www.mediawiki.org/wiki/MediaWiki-Vagrant is probably the simplest way that works. There is a MediaWiki .deb package for Ubuntu/Debian (and probably an RPM for Fedora/RHEL/CentOS) but those are often out of date or end up giving you things linked weirdly I think
[20:05:02] like, "sudo apt-get install mediawiki" or something?
[20:05:43] sorry, I should clarify - not for development, but, for example, a staging server?
[20:07:22] bitnami it is! :D
[20:07:22] https://bitnami.com/stack/mediawiki
[20:09:11] milimetric: I think I don't quite get it but if you're happy then yay!
[20:09:53] sumanah: bitnami is an open "packaging" project that makes it easy for dunces like me to install complicated software like mediawiki
[20:14:29] Thanks milimetric
[20:15:53] I'll update the docs, seems like people would want to know about this alternative
[20:16:02] good idea!
[20:16:57] ah, it's mentioned here: https://www.mediawiki.org/wiki/Software_bundles
[20:17:09] but a bit buried
[20:21:07] nod
[20:22:55] i was asking yesterday about moving my upload directories from NFS to a separate web server but I'm unclear on something... if I change $wgUploadPath for my wiki wiki.domain.com to something like http://upload.domain.com, how do I set $wgUploadDirectory? And do I need some kind of Apache rewrite rule?
[20:27:36] jcl: What do you mean by how do you set it?
[20:27:52] $wgUploadDirectory = '/mnt/uploads'; in LocalSettings.php
[20:28:41] but since I'm no longer going to be using nfs, instead the files will live on the other web server(s) pointed to by $wgUploadPath
[20:30:00] you can't point $wgUploadDirectory to a remote server
[20:30:13] right, that's $wgUploadPath
[20:30:27] so what would $wgUploadDirectory need to be? the docs aren't clear
[20:30:49] http://www.mediawiki.org/wiki/Manual:$wgUploadDirectory just says it needs to be coherent with $wgUploadPath
[20:30:50] $wgUploadDirectory: The file system path of the folder where uploaded files will be stored. https://www.mediawiki.org/wiki/Manual:$wgUploadDirectory
[20:31:07] all the apaches serving mediawiki need to be able to write to the path
[20:31:11] so it's the path on the remote filesystem?
[20:31:22] jcl: btw, if you would not mind pasting this conversation into the talkpage of the relevant doc on mediawiki.org once you have an answer, that would be great
[20:31:42] sumanah: will do!
[20:31:43] and ping me by mentioning [[User:Sharihareswara (WMF)]] so I see it and can take a stab at prosifying it and adding it to the page
[20:32:00] I don't expect to have a satisfactory answer for this one
[20:32:00] (no promises on when that would be! it would be a learning experience for me as well :))
[20:32:36] * sumanah goes off to grok BetaFeaturesHooks.php and UpdateBetaFeatureUserCountsJob.php better
[20:33:15] sumanah: Preemptively let me explain that those files are what happen when you let someone with "Frontend" in their title write PHP
[20:33:22] jcl: if images are on a remote server, and you expect your wiki to be able to upload new files, that won't work. You should use an extension to allow that, like S3 storage, Azure, etc
[20:33:26] so maybe I'm asking the wrong question: I just want to move from NFS to a more reliable mechanism, and a set of upload servers seemed a good idea. Is this a recommended approach or should I consider something else?
[20:34:05] jcl: look at https://www.mediawiki.org/wiki/Category:File_repository_extensions
[20:34:31] yeah, i saw that wikimedia moved to Swift but I have no experience with that
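For reference, the two settings being discussed (20:22 onward) as they would appear in LocalSettings.php. The hostname and paths are just the examples from the log, and, per the advice above, the directory has to be a filesystem path that every web server running MediaWiki can write to, so a separate upload host still needs a shared mount or one of the file-repository extensions behind it.

```php
<?php
// LocalSettings.php sketch (values are illustrative, taken from the log).

// URL prefix that browsers use to fetch uploaded files.
$wgUploadPath = 'http://upload.domain.com/images';

// Filesystem path where MediaWiki itself reads and writes those same files.
// It must refer to the same storage the URL above serves, and every Apache
// running MediaWiki needs write access to it (e.g. a shared mount).
$wgUploadDirectory = '/mnt/uploads';
```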
[20:34:39] marktraceur: ok. :-) This is a good example though - I am going to explain (in today's Hacker School presentations) how we choose to update the "how many users have opted in to this feature" # asynchronously
[20:34:56] and thus speed up user experience when they hit Save to update prefs
[20:35:18] Cool
[20:35:47] does anyone know when the variable names {{NAMESPACE}} etc were changed to be able to take one param
[20:35:57] and where those names now live in code
[20:36:10] Vulpix: so are they all cloud-based or can ones like swift be implemented internally?
[20:36:59] Withoutaname: /includes/parser/CoreParserFunctions.php
[20:37:05] jcl: MediaWiki doesn't just put files in the filesystem and make them remain unchanged. When new files are uploaded, they're moved in the filesystem, etc, and that makes it difficult to sync on remote filesystems or external storage servers
[20:37:19] ty
[20:37:20] *new versions of files
[20:37:34] MatmaRex: do you also know the revision that changed their behavior
[20:37:54] no, but i can find out
[20:38:06] Withoutaname: http://mediawiki.org/wiki/Special:Code/MediaWiki/46630
[20:38:24] MW 1.15, it seems
[20:38:42] Vulpix: so any solution needs to simulate a local filesystem
[20:39:11] jcl: basically, you need to use any of the existing extensions for that. Each remote-filesystem mechanism like this requires its own extension, unless you use one that makes it transparent for MediaWiki (like NFS)
[20:39:28] MatmaRex: so would they be considered https://www.mediawiki.org/wiki/Manual:Parser_functions or https://www.mediawiki.org/wiki/Manual:Variables
[20:41:08] Withoutaname: eh, the entire distinction is silly, but i guess they'd be parser functions
[20:41:09] I understand now. I think I just had a weird mental block. Since NFS has been very painful for us but we don't want to use a cloud-based solution, it seems from the link you provided that Swift would likely be a good extension-based approach.
[20:43:50] Vulpix: Thanks for the advice and helping me break through my mental block. I'll investigate the options you provided.
[20:46:30] yw :)
[20:46:55] maybe there's another extension for that, hosted outside mediawiki.org and not listed there
[21:45:10] what file produces the "MediaWiki has been successfully installed" message
[21:46:05] Withoutaname: define 'produces'. the text itself is probably in /includes/installer/i18n/en.json
[21:46:29] WebInstallerPage.php
[21:46:30] ./includes/installer/i18n/en.json:313: "mainpagetext": "MediaWiki has been successfully installed.",
[21:47:09] yeah, and the 'mainpagetext' message is used in includes/installer/Installer.php
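On the variables-versus-parser-functions question at 20:35–20:41: once {{NAMESPACE}} could take a parameter, it is handled like the other parser functions registered in CoreParserFunctions.php. For anyone curious what such a registration looks like from an extension, here is a hedged sketch; the function name, callbacks and output are all hypothetical, and the matching magic-word definition is only gestured at in a comment.

```php
<?php
// Register a parser function so that {{myecho:Some text}} works in wikitext.
$wgHooks['ParserFirstCallInit'][] = 'efMyEchoSetup';

function efMyEchoSetup( Parser $parser ) {
    // 'myecho' must also be declared as a magic word (e.g. in a *.i18n.magic.php
    // file) for the parser to recognise the name; that piece is omitted here.
    $parser->setFunctionHook( 'myecho', 'efMyEchoRender' );
    return true;
}

function efMyEchoRender( Parser $parser, $arg = '' ) {
    // Whatever this returns replaces the {{myecho:...}} call in the page.
    return 'You passed: ' . $arg;
}
```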