[00:00:40] Dunno. [00:15:02] I've a quick question, is anyone around? [00:15:09] !ask [00:15:09] Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :) [00:15:25] Great =P [00:16:06] My server has some subdomains on it, so when I go to www.example.com/wiki and example.com/wiki, there are two different sessions [00:16:06] :D [00:16:18] I can individually log in and log out on each site, even though they show the same data [00:16:55] I wish that weren't so. I've set my session.cookie_domain to ".example.com", but it doesn't fix it. [00:17:10] well, your configuration should have www. redirect to the default domain [00:17:10] one sec [00:17:40] Right, so then the problem becomes that if the user logs into the www., they are immediately redirected to the default and asked to log in AGAIN [00:18:18] Which is the behaviour my users first brought to my attention. [00:18:22] Phorcys: try setting the $wgCookieDomain option of MediaWiki [00:18:22] can you pastebin your /etc/apache2/sites-enabled/ files? [00:18:28] !wg CookieDomain [00:18:28] https://www.mediawiki.org/wiki/Manual:%24wgCookieDomain [00:19:55] your servername should be www.example.com, with your serveralias having simply example.com [00:20:26] though I suppose it could be done the other way around [00:21:09] ..the point being that this is supposed to be taken care of at the apache level, not the php level [00:21:50] Yeah, I'll check on that. [00:23:03] Currently my servername is example.com and my serverAlias is www.example.com [00:24:25] Actually, I think the $wgCookieDomain variable just solved my problem [00:24:48] No more double logins, and my session is preserved across domains [00:24:51] Thanks a ton!
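[Editor's note: for readers hitting the same issue, the fix that worked here is a one-line LocalSettings.php change; the domain name below is a placeholder.]

```php
# LocalSettings.php — share MediaWiki's session cookie across subdomains.
# The leading dot makes the cookie valid for both example.com and www.example.com.
$wgCookieDomain = '.example.com';
```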
[00:24:54] Yaron: If you were wondering what caused my login trouble earlier, it turns out the hard drive on the wiki's server was full! [00:25:06] *shrug* alright, cool :) [00:42:19] is anyone here willing to explain context and requests in mediawiki to me? :) [00:43:49] context is just an object that contains information about the situation MW is running in - what title we're dealing with, who the user is logged in as, etc. [00:44:37] the request object stores info about the http request php is running to complete [00:46:02] ah I see-- so they are different things, they are just often seen close to each other when work is being done [00:46:17] often, yeah [00:47:25] oops, though we have a RequestContext class, confusingly enough [00:47:37] mind explaining that one? :D [00:47:44] oh wait, that's self-explanatory [00:47:56] it's a context object [00:48:05] when I said 'the request object' I was referring to WebRequest [00:48:14] yeah..that was one of those moments where I spoke before I read :p [00:48:32] which you can get from a RequestContext - call getRequest() [00:53:30] oh good, got it. the question that it was leading to, is in SpecialUpload, it gets the request object, and then it needs to fill mSourceType, which determines whether the upload is of type file or url. it does a call of getVal (it's in class WebRequest) for wpSourceType, with a default result of 'file' [00:54:24] however, I'm not seeing the connection, where this could have actually been changed [00:54:37] so I'm not following why that request is done, when it'll end up being 'file' [00:55:59] oops, getVal just gets params from the URL [00:56:02] it's changed by the user [00:56:27] so somewhere, it could be changed, and it is sent along in post/get params? [00:56:30] (or possibly formdata etc.)
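[Editor's note: the SpecialUpload pattern discussed above, roughly sketched. This is a simplified illustration, not the exact core code.]

```php
// Inside a special page, the RequestContext gives us the WebRequest object.
$request = $this->getRequest();
// getVal() reads a request parameter, falling back to the given default.
// 'wpSourceType' arrives as an ordinary GET/POST parameter, set by the
// radio buttons in the upload form — hence 'file' unless the user picked 'url'.
$this->mSourceType = $request->getVal( 'wpSourceType', 'file' );
```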
[00:57:17] okay, thank you, that just connected the dots [00:57:28] oops, if you run grep you can see where we build a form field with name=wpSourceType [00:57:48] yes, it's a radio button [00:58:36] ..which is then set by upload-type in the form. excellent :D [00:58:49] bit of abstraction there but alright haha [08:39:45] Hey! [08:40:39] After upgrading my wiki from 1.17 to 1.22 I observe that changes made in Mediawiki:Sidebar and Mediawiki:Common.js no longer have any effect. I wonder why. [08:41:10] Have spent two days figuring it out. [08:42:28] Hi, is it possible as an admin with no shell access to manually change the password? [08:42:42] email requests not working for some reason.. [08:42:58] not sure if i am missing some plugin or there is a feature for this [08:50:41] miandonmenmian: core MediaWiki doesn't allow you to do that, but some extensions might [08:51:17] aurimai: is your wiki public? can you link it? [08:52:51] MatmaRex, the wiki is generally private, but has many whitelisted pages in the main namespace. The URL is http://trv.jbv.no/wiki . [08:53:44] MatmaRex, I can provide the content of some not-whitelisted pages on request, if you need it. [08:55:58] aurimai: hmm. the contents of common.js are loaded for me [08:56:05] http://trv.jbv.no/w/load.php?debug=false&lang=nb&modules=site&only=scripts&skin=vector&* [08:56:48] and i get no JS errors, so i think this code works well [09:00:10] MatmaRex, nice. And the contents is exactly what it must be. So why is neither the "wikiEditor( 'addToToolbar', ... 'math'" call working nor the changes in Mediawiki:Sidebar having any effect? [09:02:05] MatmaRex any suggestions on an extension to do password changing? or is this not recommended?
[09:02:49] miandonmenmian: i don't know of any, try searching mediawiki.org [09:03:42] miandonmenmian: MediaWiki is optimized for sites like Wikipedia, where "administrators" have a lot less privileges than on "regular" sites where there's just one administrator who runs the site [09:03:58] i think that's mostly the reason why this functionality is not included [09:04:02] same with renaming or deleting users [10:30:56] Hi all, is there any UML diagram for mediawiki development? (eg. sequence, use case) [10:31:45] I'm studying the source code and some diagrams would help [10:42:20] hello [10:42:27] i'm using mediawiki api [10:42:42] what does "Unrecognized value for parameter 'prop': extracts" mean? [10:43:24] the manual says: [10:43:47] api.php?action=query&prop=extracts [10:44:46] softplay: this API is provided by the TextExtracts extension, ensure you have it installed: https://www.mediawiki.org/wiki/Extension:TextExtracts [11:36:20] is there a way to obtain the content of a page without writing the page id in the mediawiki api? [11:37:08] i've an old solution index.php?title=Title&action=raw [11:38:09] but it's not documented in the mediawiki API documentation [11:39:10] to access the content of the page in this http://bpaste.net/show/aHAUgehXv2FvfSDvGeyk/ response i've to have the id number of the page [11:39:59] well maybe i can do what i want to do even with the page id.. [11:40:31] it's a little more complicated but.. [11:42:58] softplay: https://en.wikipedia.org/w/api.php?action=query&prop=revisions&titles=Main%20Page&rvlimit=1&rvprop=content ?
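[Editor's note: the revisions query suggested above can be assembled with nothing but the standard library; the endpoint and page title below are just example values.]

```python
from urllib.parse import urlencode

def latest_content_url(api_base: str, title: str) -> str:
    """Build an API URL asking for the newest revision's wikitext of a page.

    api_base is the wiki's api.php endpoint, e.g.
    "https://en.wikipedia.org/w/api.php" (placeholder).
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,   # page title — no page id needed
        "rvlimit": 1,      # only the latest revision
        "rvprop": "content",
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

print(latest_content_url("https://en.wikipedia.org/w/api.php", "Main Page"))
```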
[11:45:19] action=raw is standard [11:46:18] and is documented more than once [11:46:19] https://www.mediawiki.org/wiki/API:Query#Sample_query [11:46:23] https://www.mediawiki.org/wiki/API:FAQ#get_the_content_of_a_page_.28wikitext.29.3F [11:47:15] https://www.mediawiki.org/w/index.php?title=Special%3ASearch&search=action%3Draw&ns104=1 [11:48:51] ty Hello71 [11:58:48] render doesn't return plain html [11:58:56] it returns an object [12:14:52] softplay: indeed, it returns json, that embeds the html [12:15:21] action=raw is not API, thus not documented as api. [12:35:23] hi all. I am trying to do disaster recovery on a small MW installation that has two broken tables: revision and recentchanges [12:35:39] which leads to most pages not being accessible ("was not found... maybe deleted or moved") [12:35:59] is there any way to rebuild the page<->revision relations with the text that i currently have? [12:46:56] can anyone check if this http://joy.indivia.net/Wiki/api.php?action=query&list=categorymembers&cmtitle=Category:Command&format=json api call returns this http://joy.indivia.net/Wiki/index.php/Syntax_Error page which should be contained in the category according to this http://joy.indivia.net/Wiki/index.php/Category:Command page? [12:47:15] i mean the API call is omitting one entry? [12:47:24] how could it be? [12:54:34] can anyone help? [12:54:48] Hi. [12:55:54] softplay: By default, you're limited to 10 members or something. [12:55:55] softplay: http://joy.indivia.net/Wiki/api.php?action=query&list=categorymembers&cmtitle=Category:Command&format=json&cmlimit=100 [12:56:15] softplay: Or you can paginate through the results. For example... [12:56:33] http://joy.indivia.net/Wiki/api.php?action=query&list=categorymembers&cmtitle=Category:Command&format=json&cmcontinue=page|535452|62 [12:57:21] hm, is there any way to rebuild the revision history for a mediawiki install?
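[Editor's note: the cmlimit/cmcontinue pagination described above can be sketched as a small generator. It assumes the newer "continue" response format; fetch is pluggable (e.g. a urllib wrapper), so the demo runs against canned responses instead of a live wiki.]

```python
def category_members(fetch, cmtitle):
    """Yield every member of a category, following API continuation."""
    params = {"action": "query", "list": "categorymembers",
              "cmtitle": cmtitle, "format": "json", "cmlimit": 100}
    while True:
        data = fetch(params)
        yield from data["query"]["categorymembers"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # carries cmcontinue into the next request

# Demo with two canned "responses" standing in for real HTTP calls:
responses = iter([
    {"query": {"categorymembers": [{"title": "Sort"}]},
     "continue": {"cmcontinue": "page|535452|62", "continue": "-||"}},
    {"query": {"categorymembers": [{"title": "Syntax Error"}]}},
])
titles = [m["title"]
          for m in category_members(lambda p: next(responses), "Category:Command")]
print(titles)  # -> ['Sort', 'Syntax Error']
```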
[12:57:26] (sorry for repost) [12:57:45] ty Gloria [12:58:05] absynth: If you look in the maintenance directory of your MediaWiki installation, there are some rebuild and refresh scripts in there. [12:59:14] absynth: http://git.wikimedia.org/tree/mediawiki%2Fcore.git/21de2cffb6d1779823e40d2a6a3c6332676194cc/maintenance [13:02:20] absynth: If revision is busted, you may be screwed. recentchanges doesn't really matter. [13:02:28] revision is pretty important. [13:02:38] thought so [13:02:46] a couple pages still work, but most don't [13:03:06] rebuildall.php: * Rebuild link tracking tables from scratch. This takes several [13:03:15] this is not relevant to the revision history, right? [13:03:24] Not really, no. [13:03:40] Link tracking tables are to track category links, file links, external links, internal links, etc. [13:03:56] So that when you put [[foo]] on a page, you can figure out on which pages it appears. [13:05:29] well... it seems that only *newer* revisions aren't available anymore [13:05:38] i have a couple older ones [13:06:13] revision is just metadata. [13:06:18] The text itself is in the text table. [13:06:29] The page table is also mostly metadata. [13:06:29] yeah, i checked out the biiiiig entity relationship diagram [13:06:37] :-) [13:06:54] is there a feasible way of rebuilding the latest revision history by re-inserting revisions manually? [13:06:59] it's only, like, 80 pages or so [13:07:20] Maybe. [13:07:33] One of the maintenance scripts will create an XML dump/export. [13:07:51] But without working revision and page tables, who knows if it'll work correctly. [13:08:17] The text table keeps all revisions. [13:08:30] Without the revision table knowing which revision is latest, it'll be a nasty extraction. [13:08:55] Well, even more than that. [13:09:09] Without a revision table, you can't match text entries with particular pages. [13:09:22] And without a page table, you can't match revision entries or text entries with a particular page title.
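[Editor's note: the table relationships described above can be illustrated with a toy schema. These are heavily simplified stand-ins for MediaWiki's real page/revision/text tables (the real ones have many more columns), just to show why losing the revision table orphans the text rows.]

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Toy versions of the MediaWiki tables; illustrative only.
    CREATE TABLE page     (page_id INTEGER, page_title TEXT, page_latest INTEGER);
    CREATE TABLE revision (rev_id INTEGER, rev_page INTEGER, rev_text_id INTEGER);
    CREATE TABLE text     (old_id INTEGER, old_text TEXT);

    INSERT INTO text     VALUES (10, 'old wikitext'), (11, 'current wikitext');
    INSERT INTO revision VALUES (373, 1, 10), (374, 1, 11);
    INSERT INTO page     VALUES (1, 'Main_Page', 374);
""")

# The latest text is only reachable via page_latest -> rev_id -> rev_text_id -> old_id;
# without the revision table, the text rows are anonymous blobs.
row = con.execute("""
    SELECT t.old_text
      FROM page p
      JOIN revision r ON r.rev_id = p.page_latest
      JOIN text     t ON t.old_id = r.rev_text_id
     WHERE p.page_title = 'Main_Page'
""").fetchone()
print(row[0])  # -> current wikitext
```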
[13:09:33] mh, well, i enabled debug [13:09:43] and it looks like the wiki still knows which revision it _thinks_ is the latest one [13:09:49] Article::fetchContent failed to retrieve current page, rev_id 374 [14:56:06] qgil: someone told me good luck with my tech talk today. is it today or on thursday? [14:56:27] manybubbles, are you nick? [14:56:31] Nik [14:56:32] qgil: yeah! [14:57:59] In case of doubt https://www.mediawiki.org/wiki/Project:Calendar wins [14:58:18] manybubbles, I see that there is a calendar invitation for today, I will fix it. [14:58:38] qgil: thanks! [15:04:38] manybubbles, fixed. Note that the default time is 30 minutes, flexible. [15:04:46] thanks! [15:06:00] manybubbles, I was wondering whether you could or would want to invite someone from Elasticsearch upstream? We are trying to get upstreams more visible in our community [15:07:03] qgil: I _can_ but most of my contacts are pretty busy at this point. let me check [15:07:41] manybubbles, thanks. If it works, well. If not, no problem. [15:56:59] Hello [15:57:11] ? [15:57:16] hi [15:58:06] hi [15:58:48] can anyone possibly help me out with my help desk report at: http://www.mediawiki.org/wiki/Project:Support_desk#escapeshellarg_.28.29_disabled_during_installation_42922 [16:00:58] TMAC_Kratos: http://www.mediawiki.org/wiki/Thread:Project:Support_desk/_Warning:_escapeshellarg()_has_been_disabled_for_security_reasons [16:02:07] did you disable safe mode? http://www.mediawiki.org/wiki/Safe_mode [16:02:12] I have tried that, as stated in the report, safe mode is not enabled and the function is not disabled in the php.ini [16:02:26] "Safe mode is disabled and there are no restricted functions" [16:02:50] exactly [16:04:24] even with safe mode off and no restricted functions i still get that issue [16:04:32] TMAC_Kratos: but does it say "You can install MediaWiki" there?
(it's been a while since I used it, may be different now) [16:04:35] because of that issue the installation hangs [16:05:04] it stops barely after setting the server url [16:05:17] in the checking of the environment [16:05:34] the checking just stops. [16:09:20] quick pic of the issue: https://dl.dropboxusercontent.com/u/54314339/Help%20Photos/Media%20wiki%20installation%20hangs.png [16:51:53] qgil: I didn't quite catch the very end decision - is our office hour next week happening & talking about all our activities? I think that is what you said [17:48:01] hi varnent [17:49:49] sumanah: greetings [17:50:06] sumanah: how are you doing? [17:50:18] varnent: I am kind of ebullient re https://www.mediawiki.org/wiki/Performance_guidelines being like 85% of the way done [17:50:45] and you? [17:50:50] sumanah: wow - very cool! [17:51:09] YES [17:51:20] sumanah: a little irritated today - I’ve been working on WM Hackathon and was just told last night my help was no longer wanted - always a nice way to thank a volunteer :) [17:51:27] um. wow. [17:51:56] sumanah: yeah..so apparently I will not be in London as my way there evaporated [17:52:00] I hope the reason is because they have enough help from other people who are, like, already in England....
but still [17:52:05] :( [17:52:28] sumanah: yeah - they basically decided they’d rather run it themselves than cover my expense - but two weeks ago they were insisting to me they needed my help - so idk [17:52:36] I won't be there either, but that was already the case - I've been there 3 years in a row, I am not a community manager anymore, totally makes sense to use that money to give someone else that trip [17:53:03] sumanah: I tried to explain that the Wikimania Hackathon isn’t exactly like the others - but I lost that argument - so we’ll see [17:53:45] My sympathies - no matter how it ends up, I can see how this bit of communication hurts :( [17:54:06] sumanah: I mean if I really wanted - I could probably lobby AffCom into sending me - but we already assumed I was going based on what I had been told - and so we’re moving towards sending new affiliates - which I would rather do - but I’m still aggravated and feeling burned - I’ll be fine after another day of scowling [17:54:07] and the disappointment of suddenly finding out you won't be at WM after all :( [17:54:16] nod [17:54:43] sumanah: yeah..that sucked..I’m scrambling trying to find folks to fill in for me elsewhere - sounds like a couple of things are just gonna get dumped - oh well - maybe for the best [17:55:01] * sumanah nods [17:55:08] (in an understanding acknowledging way) [17:55:29] on a happier note, there's a cool new dataviz MW extension [17:55:43] yay! :) [17:55:51] that Dan Andreescu worked on at the Zurich hackathon - ingests JSON (from a wiki page) and displays a map or what have you [17:56:00] I need to play around with some extensions on WikiQueer soon - I’ve been focusing on AffCom stuff lately - so [17:56:00] it's called Vega [17:56:13] Extension:Vega ?
[17:56:16] I’ll check it out [17:57:01] I’m getting better with the translate extension - no clue what I’ll do with that knowledge yet - but as someone that sucks at foreign languages - I feel some odd obligation to understand the parts of that effort which I can help with :) [17:57:25] sure :) [17:57:33] maybe mutate WikiQueer into a translate based encyclopedia project as an experiment - idk [17:57:35] * sumanah seeks info from milimetric in another channel re Vega [17:58:29] if you need help getting https://www.mediawiki.org/wiki/Performance_guidelines translate ready - let me know :) [17:58:50] right now I’m trying to get all the AffCom, affiliate and procedural WMF content on Meta translate-friendly [17:58:54] *affiliates [17:59:15] like 1/2-2/3 done with affiliates [17:59:41] sumanah: are you going to be working on a format for the extensions pages? that whole topic is still something that piques my interest [18:00:06] varnent: https://github.com/milimetric/WikiViz is Vega [18:00:17] I still think we need a top-down review of all extensions on MW - it’s getting better - but I suspect there are still problems lurking we just haven’t found yet [18:00:20] varnent: that particular item is not in my backlog, no [18:00:38] ah well - maybe one day :) [18:00:55] I am happy that the cleanup started about a couple years ago is still used in some ways - so.. [18:01:24] so are you then focused on documenting technical parts of MW core that lack helpful documentation? [18:01:26] nod [18:01:28] no [18:01:43] Now till June 30: Security guidelines, Performance guidelines, & architecture guidelines [18:01:59] July till prob Dec: API/data/dev doc stuff [18:02:04] gotcha - so what’s the overall guiding focus for your job? to improve MW documentation?
[18:02:05] https://www.mediawiki.org/wiki/Data_%26_Developer_Hub [18:02:26] Now/continuing/onwards: architecture-related meeting coordination/shepherding, e.g., RFC reviews [18:03:03] gotcha - I think :) [18:03:08] My focus for the foreseeable future is to write/make things that are force multipliers for the whole Wikimedia technical community. qgil hope you agree. :) [18:04:10] lol - you scared him right out of the room [18:04:33] ha! now, I may end up, as a part of my role, yes, explaining bits of MW core or extensions that need and don't have explanations, e.g., the parser cache [18:04:38] well I’m still sad we don’t get to interact as much in your new gig - but I’m glad the foundation is giving that area the attn it needs [18:04:41] Thanks! [18:05:07] and I second your comments about the valuing of jumping around jumps within a large org [18:05:39] lol - in smaller nonprofits we just absorb more - I may be absorbing a third dept soon - but I’ll get more staff to help..so..but it’s a nice way to mix things up [18:06:02] otherwise I think adult ADD kicks in and we lose interest [18:07:29] Hi varnent. [18:08:05] huh: greetings :) [18:11:33] I like learning [18:55:06] hi [18:56:47] hi [19:03:14] hi [19:04:26] hi [19:04:40] <^d> hi [19:07:08] hi [19:08:12] hi [19:17:53] MultimediaViewer is no longer a BetaFeature, correct? [19:18:39] Meiko: it is still in beta on most wikis [19:19:22] Meiko: We're about two weeks away from it being totally not beta [19:19:37] Sweet. [19:19:42] Maybe three. [19:19:50] Unless things explode. 
:) [19:20:00] let's hope that doesn't happen :D [19:21:30] Meiko: Let's be honest, it's pretty certain to happen, we just need to shape the charges and make sure nothing important goes out [19:21:46] yeah [19:22:15] Brickimedia will be updating its MMV version as soon as MediaWiki 1.23 is released [19:22:17] so that should be fun [19:22:38] It also works well with our WIP skin redesign: http://georgebarnick.com/refreshed-beta/index.php?title=MultimediaViewer [19:23:16] Meiko: Out loud I said "Damn" [19:23:19] That's really pretty [19:23:30] The skin? Glad you like it! ^_^ [19:24:08] Before we deploy it we're going to try to make it VisualEditor-compatible [19:24:24] Which is hard to do since my server doesn't allow Parsoid to run [19:24:30] well, it doesn't allow node.js at all :( [19:24:36] Aww. [19:25:28] https://github.com/Brickimedia/brickimedia/issues/274 has a list of things we're planning to change from our old skin to the new skin [19:26:25] Meiko: I don't know if you've told the design list about your skin redesign, but I think you should, there are elements that might be useful for Winter and other design efforts :) [19:26:59] I'll probably do that soon. [19:27:15] Here's the source code: https://github.com/Brickimedia/Refreshed/tree/refreshed-beta [19:27:48] Meiko, there are some really cheap vms out there that let you run whatever you want [19:28:11] gwicke: I'll be changing my server later this year [19:28:22] right now it's just on a super cheap shared hosting provider [19:30:15] kk [19:35:36] marktraceur: I'll draft an email to design@lists.wikimedia.org in a sec. [19:35:40] Dunno what I should say :P [19:38:28] Meiko: May I offer suggestions? [19:38:37] sumanah: sure [19:39:44] Meiko: 'hi design folks!
thought you might want to see my skin redesign [link to code, link to demo page/site] - my goals when making it were [a few sentences here about what you were aiming for], you might find it useful" [19:39:51] er, s/"/'/ [19:40:20] thanks [19:40:30] and if you want feedback, "I'd especially welcome any feedback on [things I want feedback on - mobile-friendliness, accessibility, ease of use in general, colours...]" [19:41:35] When I am giving people something to look at, I find it useful to quickly specify who the target audience is, what I was aiming for (like, hospitality or innovation or whatever), and what particular bits I would like feedback on [19:41:56] if I want feedback at all, which sometimes is not my goal :) [19:42:21] Alright, sounds good [19:42:31] Anyway. Thank you for contributing to MediaWiki! [19:42:33] And using it! [19:44:02] :D [19:54:59] went ahead and sent that email to design@lists.wikimedia.org [19:55:46] Thanks Meiko! It has shown up http://lists.wikimedia.org/pipermail/design/2014-May/001823.html :) [19:55:53] :D [19:58:35] <3 yay [20:01:08] I wonder if Refreshed could ever be included in mediawiki/core :o [20:03:02] Meiko: I think we're trying to push away from bringing skins into core [20:03:24] hmm [20:03:35] * MatmaRex appears [20:03:52] MatmaRex: Well hi there! 
[20:03:59] Ever since a lot of the ancient skins were removed from core (thank god), I feel like there's not as many options as there used to be [20:04:01] it could go to mediawiki/extensions/skins [20:04:10] Meiko: basically the plan is not to include any skins in the mediawiki/core repository and instead bundle a few chosen ones with the tarball [20:04:26] MatmaRex: yeah, that sounds more like what I was thinking of [20:04:36] (we said "skins", so MatmaRex got pinged, apparently) [20:04:59] (or use crazy things like git submodules, maybe, if people get very angry that running mediawiki from git got a tiny bit more problematic) [20:05:06] Platonides: we actually are planning on moving Refreshed to Wikimedia version control (whether it be Gerrit or Phabricator) after we deploy our redesign to it [20:05:07] marktraceur: i just lurk too much. :D [20:05:08] [20:05:32] Meiko, good :) [20:05:38] we already have one of our extensions on Gerrit (mediawiki/extensions/MediaWikiChat) [20:05:46] Obviously skinning MediaWiki isn't the answer and people should just build Stylish themes.
:) [20:05:48] !e MediaWikiChat [20:05:48] https://www.mediawiki.org/wiki/Extension:MediaWikiChat [20:05:55] the problem is that making non-core skins was harder [20:05:58] (https://addons.mozilla.org/en-US/firefox/addon/stylish/) [20:06:34] !s test [20:06:34] There are multiple keys, refine your input: safemode, sal, sandbox, sanitizer, sarcasm, sb, scap, scarytranscluding, schema, search, secrets, security, seen, selfmerge, selinux, semantichelp, semanticmediawiki, sequel, serversuperglobal, session, sharedlogins, shell, shellrequest, shorturl, shorturls, sidebar, sitemap, sitenotice, siwae, skinning, skins, sleep, sm, smart, smtp, smw, sofixit, songs, sop, sortkey, sorttable, sourcecode, sourceforge, spam, specialpages, speed, sqlite, sqllog, sqlsearch, srv193, stalemaintpages, start, stash, stfu, stringfunctions, stroopwafels, submodules, subpages, subst, subversion, suggest, sul, sumanah, support, supported, svg, svnauthor, svnprops, svn-rev, svnsearch, syntax, syntaxhighlight, sysadmin, [20:06:39] :O [20:06:46] Is there a !skin [20:06:49] I would keep at least one skin [20:06:52] there should be since there's !e [20:06:53] !skin [20:06:53] General skin help: http://www.mediawiki.org/wiki/Manual:Skins / List of available skins: https://www.mediawiki.org/wiki/Category:All_skins / Creating a new skin: see !skinning [20:07:06] !e [20:07:06] https://www.mediawiki.org/wiki/Extension:$wiki_encoded_* [20:07:15] even if it's a bare one [20:07:26] !s is https://www.mediawiki.org/wiki/Skin:$wiki_encoded_* [20:07:26] Key was added [20:07:28] !s test [20:07:28] https://www.mediawiki.org/wiki/Skin:test [20:07:38] marktraceur: yay :D [20:07:46] !s Refreshed [20:07:46] https://www.mediawiki.org/wiki/Skin:Refreshed [20:07:46] Platonides: i'm definitely going to keep a "placeholder" skin in core so that you don't get fatals/exceptions [20:07:56] Platonides: but i'd like to move out all "real" skins [20:08:03] <^d> !lazy is what do you think this bot is for? 
[20:08:04] This key already exist - remove it, if you want to change it [20:08:08] <^d> huh? [20:08:10] <^d> !lazy [20:08:10] Don't rely on the bot answers too much, I am not always the best answer to every question. And don't immediately reach for the bot to answer a question, without at least checking the recent channel history to see if someone already botted them. Also, if someone says "can anyone help me", check that they haven't already stated the question and been told not to repeat themselves (reducing them to ask "can anyone help me?"). [20:08:16] <^d> lame answer. [20:08:20] Platonides: https://www.mediawiki.org/wiki/Separating_skins_from_core_MediaWiki if you haven't seen it :) [20:08:57] totes lame [20:09:01] !lazy del [20:09:01] Successfully removed lazy [20:09:06] I am quite distant now [20:09:33] brb I'm going to go get food but MatmaRex, thanks for linking that :o [20:10:39] I support your proposal [20:10:59] but please just use the existing way of skin extensions [20:11:10] instead of creating a new incompatible one :P [20:11:51] Platonides: which existing way? i know of at least three [20:12:06] I think it should mention somewhere that it's a gsoc project, btw [20:12:16] I think I know two [20:12:19] what's the third? [20:12:28] (skins folder and as an extension) [20:12:47] Platonides: skins folder like an extension (with require_once and stuff) [20:13:07] Platonides: hm, it has a category, and it used to be in my user namespace [20:13:26] ok, I didn't read the category [20:13:49] I read it thinking "this looks like a gsoc project", "but no, it's written by MatmaRex" [20:14:05] "ah, Nemo bis summary says it is" [20:15:05] Platonides: updated the page [20:15:10] if you are doing a require_once from localsettings, what's the point of using skins/ folder?
[20:16:19] hi qgil [20:16:27] Platonides: because the autoloading mechanism for skins/ sucks a lot [20:16:39] Platonides: that's how the most up to date tutorial says to do it [20:16:51] and doesn't detect files like skins/MySkin/MySkin.php, only skins/myskin + skins/MySkin.php (i think) [20:17:16] oh, for using subfolders [20:17:19] yeah, those won't work [20:17:25] Platonides: https://www.mediawiki.org/wiki/Skin:Chameleon does it this way, for example [20:17:32] https://www.mediawiki.org/wiki/Manual:Skinning/Tutorial#Installing_your_skin [20:17:34] (it also seems to use composer *shudder*) [20:17:42] but still, then there's no need to place them in skins/ [20:17:43] (it used not to, eh) [20:18:54] Platonides: i think there are two sane ways to go: either support that format and remove/deprecate autoloading (essentially turning skins/ into another directory working like extensions/), or remove/deprecate skins/ wholesale and stuff skins into extensions/ [20:19:29] i really should summarize this all soon and send some emails asking for comments [20:19:42] * wmat thought there was no more autoloader [20:20:14] as far as skins/ would be the same as extensions... [20:20:22] ie. no different format [20:21:01] after all, extensions can be installed anywhere [20:21:04] doing that right would also require adding some magic to ResourceLoader etc. [20:21:13] except for the resource path... [20:21:15] extensions/ has $wgExtensionAssetsPath and a bunch of related variables [20:21:23] skins/ doesn't [20:21:36] s/resource/assets/ above :P [20:22:41] hmm, well, actually there is a $wgLocalStylePath which does the exact same thing at a glance [20:22:59] back from when skins/ was called styles/ or something, i guess [20:23:08] it'd still need support for RL module definitions [20:24:43] I don't think it was ever called styles [20:25:31] hmmm...
skins folder created in 2004: fd758aaa3c619ed1f3058117b55e61a3b59f8a63 [20:25:54] there was a stylesheets folder [20:26:19] (containing javascript and images!) [20:26:53] Platonides: it was called style/. http://mediawiki.org/wiki/Special:Code/MediaWiki/5089 [20:27:35] and your commit id is http://mediawiki.org/wiki/Special:Code/MediaWiki/5015 [20:29:22] MatmaRex: Using composer gives you installation with one CLI command instead of having to install two extensions and two external packages to make the skin work [20:30:35] FoxT: yeah, and instead you need just a half a screenful of instructions! /s [20:30:39] why there's no link to the code there? [20:30:57] FoxT: does that actually still work with the recent composery changes in core? [20:32:35] MatmaRex: Not sure. [20:32:56] FoxT, I only see it needing mediawiki/bootstrap [20:33:31] True, it's only one extension. That extension relies on Twitter/bootstrap and oyejorge/less [20:41:43] Platonides: there is also one category :) [20:50:03] Nemo_bis, see above where I said "oh, I didn't read the category" [20:51:30] :) [20:55:45] :) [20:56:12] * Nemo_bis has too much latency right now [21:04:46] could anyone here (ori?) give me some advice on what I can do to get vagrant run-tests to complete? Right now it comes to a halt at ParserTests with "/usr/local/bin/run-mediawiki-tests: line 4: 4493 Killed php tests/phpunit/phpunit.php --testdox "$@"" [21:08:17] killed? that's a bit drastic [21:08:28] you're out of memory, perhaps? [21:08:32] possibly [21:09:02] if you run dmesg, do you see some text about "OOM looking for a process to kill" ? [21:10:01] I didn't change the virtualbox that vagrant set up.
I can try increasing its memory, but I'm not sure if that meshes well with how vagrant works (which I have no clue about really) [21:10:35] I don't quite know how I could get that information, since the "vagrant run-tests" invocation is from the host [21:10:37] run dmesg inside the VM, and see if there's some text like that [21:10:51] I have no idea how it connects to the vm [21:10:55] ssh? [21:11:03] neither do I [21:11:06] I can ssh in [21:11:10] and yes [21:11:24] [142012.432652] Out of memory: Kill process 4493 (php) score 780 or sacrifice child [21:11:24] [142012.433684] Killed process 4493 (php) total-vm:876656kB, anon-rss:592300kB, file-rss:0kB [21:12:20] the vm should have more memory in order to pass the tests [21:13:04] it is configured from vagrant with 768MB. What's a ballpark figure I could give a swing? [21:13:20] I don't know if the vagrant manifest is intended to give enough memory to support that use [21:13:45] or it is intended to reduce memory usage and just allow mw to run [21:13:56] well, running the testsuite seems to be a reasonable requirement for an environment intended for development ;) [21:14:38] one would think that :) [21:19:12] well, it's running the suite again, let's see what it makes of it this time :) [21:19:31] I run with 1GB memory and tests typically die after about 80% of a full run. I usually split database tests and databaseless tests, and they run just fine. [21:20:29] I'm fairly memory rich on the host, I'm trying with 1.5 now [21:21:29] also, I barely know what I'm doing, which is part of the reason I wanted to be sure I would be able to run the tests in the first place [21:27:40] Platonides, hola! [21:46:22] argh... how can i get to the image page?? [21:46:33] Platonides: Which one? [21:46:45] Oh, media viewer givin' you trouble?
yes, that one [21:46:59] Platonides: You can use ctrl or shift click like normal [21:47:02] I want to view the file [21:47:16] Platonides: You can also use the button with the commons icon in the lower right of the screen [21:47:31] I don't see a commons icon... [21:47:39] oh, it's there [21:47:41] really small [21:47:43] Platonides: It may be a different icon [21:47:51] Oh hm [21:47:58] Thought we fixed that [21:51:19] see http://imgur.com/4LWprPz [21:51:30] Yeah, I see the same thing [21:51:51] Platonides: What screen resolution is that? It's massive [21:52:02] Or maybe your browser is zoomed out? [21:53:21] not zoomed [21:54:26] the screenshot is 1792x921 [21:54:55] Hm [21:54:57] Anyway [21:55:06] Platonides: I'll file a bug about the size of the icons in that area. [21:55:27] thanks :) [21:56:16] fwiw I do find the text size smaller since the update to ff 29 [23:07:23] hi [23:07:49] what would be the best way to set up 2 wikis with access to the same content except images, stored in /wiki1 and /wiki2? [23:07:52] using apache [23:13:22] biberao, hmm.. that would mean a mostly shared wiki DB except for the file namespace [23:19:11] Krenair: yes [23:22:59] Krenair: please help :D [23:23:32] I'm not sure that's really possible to be honest... [23:23:39] it seems to be [23:23:45] but I'm not sure about the main .php [23:29:45] :| [23:53:44] qgil: so after Zurich... [23:54:09] My question is, what's the timeline expected for moving on from just task management to adding code review?