[00:08:43] WAT [00:09:15] Ten years, I've been using MediaWiki for ten years, and I'm finding out an odd thing. [00:09:21] (No, it's not the first one.) [00:09:48] Is there a way to tell MediaWiki that an [[Image:]] will appear in its natural size? [00:10:06] A PNG, in whatever the file's actual size is? [00:11:55] aharoni: |frame [00:12:17] as in, [[File:Foo.png|frame|Blah blah blah]] [00:12:19] MatmaRex: and if I need frameless? [00:12:26] It's for the VE user guide. [00:12:58] The images there are all defined as frameless for design purposes, and it's nice. [00:13:11] And I can't tell what a translated screenshot's size will be. [00:13:34] dunno. let's see [00:13:42] Looks like I can specify an arbitrarily large size, and then it will be shown in its natural size. [00:13:46] But that's lame. [00:13:50] Isn't there anything better? [00:14:17] MatmaRex: https://www.mediawiki.org/w/index.php?title=Help%3AVisualEditor%2FUser_guide&diff=1423864&oldid=1423444 [00:14:30] It does what I want, but it's silly. [00:22:32] aharoni: i see no better way. :( [00:25:46] unfrigginbelievable [01:56:02] MaxSem: thanks! i'm starting out, don't yet know what to grep ;) [01:56:02] Would disabling "Restrict editing by absolutely everyone" mean my account too? [02:04:15] MaxSem? [02:04:16] max: if I added "Restrict editing by all non-sysop users" to my LocalSettings.php, would that disable my account too? [02:04:16] MaxSem [02:06:55] is there a way to find out the globalId of a MediaWiki installation? [02:06:55] for example, can I do it via javascript on the site, or is there a special page showing it? [02:10:58] Anyone else here who can help us? [02:42:37] despens: what do you mean globalId? [02:42:46] *what do you mean by [10:39:33] Hi, I'm trying to make a static html copy of my wiki using httrack. The wiki needs a login to read and encryption is enforced. Is there someone here who can help me? 
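For reference, a minimal sketch of the oversize-frameless trick aharoni settled on above: MediaWiki never scales a raster image above its natural size, so giving |frameless| an arbitrarily large width makes the PNG render at its full resolution whatever the translated screenshot's dimensions turn out to be. The file name and caption here are hypothetical placeholders.

```wikitext
<!-- Hypothetical example: 3000px far exceeds any screenshot's real width,
     so the image falls back to its natural size instead of being upscaled. -->
[[File:VisualEditor-toolbar-screenshot.png|frameless|3000px|The VE toolbar]]
```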
[11:29:59] hello [11:30:09] is it possible to hook the UploadWizard extension with jquery after a file has been selected? [11:30:25] server-side php would be fine too [11:41:31] kthx_: for what purpose? [11:43:17] Betacommand: i want to sideload from another wiki and use the other wiki's api to fill in author and image source [11:45:41] kthx_: see https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads#Uploading_directly_from_a_URL_.28.22Sideloading.22.29 [11:45:45] and related code [14:41:51] qgil: lol - yeah - minor clarification - but yeah - AffCom is not taking on Trademark complaints :) [14:42:16] qgil: I can see how that came up though since Legal is helping us with the form and we’ve been using theirs as a basis - ours is in a Phab request (I think..) or will be soon [14:45:39] varnent, I thought, well, if someone is showing a trademark it's kind of showing an affiliation and... but I was not in "wikimedia-l mood", so I just continued doing whatever I was doing ;) [14:46:14] qgil: That happens to me often - the raised eyebrow but “meh - something else to do that’s actually in my wheelhouse” [14:46:44] qgil: AffCom has enough work - lol - I don’t want us to fill in for Luis :p [14:47:23] not for Luis anymore.... [14:47:27] exactly :) [14:47:35] I’ll admit - I’m pretty geeked about that change [14:48:12] It was surprising only because I didn't see it coming. Once I know it took me two seconds to think 'of course!' 
[14:48:25] knew [14:48:59] the new community dept, Luis taking the helm, all of it except Anasuya [14:50:14] that was my reaction - AffCom got a sense of some possible changes in community related roles when we were in SF last month - but obviously not the details - so I had my guesses - but was pleasantly surprised [14:50:33] I told Asaf it was a palmsmack “well duh” moment :) [14:52:21] qgil: mostly unrelated - the contact form extension isn’t working - aside from who Phab auto-adds for that extension - is there anyone in engineering I should add as a CC? Given that WMF uses the form for Legal - wasn’t sure if there was someone in particular that would want to know - I CC’d Stephen so Legal knows [14:53:14] no idea, but Stephen is the right contact [14:54:21] kk - I figured he’d know if there was an appropriate engineering contact beyond Andre..who I assume is the right one anyway :) [14:54:41] lol - so if it really is broke - we’ll want that Phab solution sooner :p [14:55:33] qgil: OH! before I forget - are you or someone from WMF possibly available to do a quick chat at some point (maybe in a month or two) or maybe in Berlin about Phab? 
[14:55:58] qgil: I’m intrigued by us knowing more about it with the idea of us using it for our processes and also encouraging Affiliates to do the same - I see the potential [14:56:19] Sure (online, I won't be in Berlin) [14:56:37] ((unless I have to be there, but nobody has asked)) [14:56:41] qgil: Cool beans - I’ll connect with you about that - I need to nudge the committee about it first, but wanted to check on possibility and such [14:57:11] qgil: I’d ask…but experience tells me that carries little weight as far as..you know..funding and such - lol [14:57:35] I'm fine with online :) [14:58:17] stupendous - let me know if you want to circle around on OTRS at any point - I LOVE the idea of the team using OTRS for a week or so to see what’s up [14:58:44] my opinion on it changes daily :p [14:59:33] some days I think “meh, Phab, MW.org, and mw-l have it covered” - other days I think “but what if I’m just not that MW savvy yet or having some VE related or other extension (like contact form) problem, can’t use the wikis, Phab scares me, and I don’t want to subscribe to mw-l” [15:00:36] qgil: that said - I’ve also never been convinced our IRC, e-list, on-wiki combo of troubleshooting has been as satisfying for folks we do not know as we think [15:01:34] “meh, Phab, MW.org, and mw-l have it covered” :) [15:02:13] varnent, are you suggesting use of OTRS for mediawiki support? [15:02:29] For Affcom requests :) [15:02:32] Krenair: chatting about it - I honestly cannot decide if I’m actually recommending it [15:02:46] qgil, he was talking about mediawikiwiki and mediawiki-l... 
[15:02:48] qgil: well actually that has come up - using OTRS for affiliate concerns [15:03:12] ah, wait, yes, he is recommending that [15:03:13] sorry, multitasking + short term memory == fatal [15:03:13] sorry - lol [15:03:13] I jumped around a bit there [15:03:19] bad habit :) [15:03:21] ADD [15:03:21] varnent, it sounds like a really silly idea given almost no developers touch OTRS [15:03:45] Krenair: well that’s sort of my thinking as well - which makes me wonder if step one should be looking into that and maybe that would solve the problem itself [15:04:21] it's for private tickets mainly [15:04:25] well - correction - something that may or may not be a problem [15:04:54] OSS support needs to be in public [15:05:13] and channels that you can actually have proper group discussions in. Not with OTRS's locking system [15:05:21] what got me thinking about it was I processed a few OTRS tickets that wound up being possible VE bugs - but I’m not sure how many would know that or how to process them [15:05:49] They need to go into phabricator, like all the other bugs. [15:05:50] so I agree the process should be public - it’s the intake of novice users I’m wondering about [15:06:54] Krenair, https://phabricator.wikimedia.org/T88402 [15:07:28] would having an OTRS email be a better method of taking bug reports for others to properly process from novice users who may increasingly be coming across our extensions and other more developer related issues that may not be realized by the end user as such - so even if it just exists for developers to process bugs into Phab from other OTRS emails - although again - maybe that could be solved by helping train OTRS folks on recognizing bugs and Phab [15:07:43] Adding more developers to manage currently existing things like tech-issues may be OK. 
[15:08:02] We should be looking to narrow the scope of things people attempt to move into tech-issues, not expand it [15:08:09] I agree - it may be a logical first step that helps solve a few issues - like the mobile queue [15:09:27] at some point when I’m caught up with AffCom work - I may look around older OTRS tickets and see if there are VE related bugs being missed - or if those were just a couple of random experiences I had early on [15:10:04] That provides us with no benefit and just splits things up into another restricted access system [15:10:44] The mobile queue is completely separate from info-en::Tech-issues, and needs to be fixed from within WMF Engineering management. [15:10:50] well - again - I don’t think the bugs should be processed with OTRS - but perhaps it’s a better method of intake from novice users - or a better place to route bugs that come in naturally via OTRS [15:11:17] right - but arguably having more developers using OTRS could help with the mobile queue issue is what I was saying earlier [15:11:40] Very specific developers, in mobile's case. [15:11:50] Like, the actual mobile apps team members. [15:12:07] fair enough [15:12:58] They don't have time to handle the backlog (currently standing at just under 700 open tickets) [15:13:59] well in any case - qgil is correct - Phab seems like the logical place to house this conversation [15:14:06] We (volunteers) on the mobile queue are not always able to directly address everything received [15:14:28] do you think having more dev volunteers would help with management of it as it is - or no? [15:15:11] Does mobile apps have any development volunteers..? [15:16:25] A lot of the feedback needs thought from a product perspective, not necessarily programmers [15:18:55] A lot of the android crash reports are duplicates but cleaned up/merged easily enough, but we still get a lot that make very little sense. 100 open. 
[15:19:36] iOS feedback - no one actively responding who owns a device able to run the app. 300 open. [15:20:50] android feedback - much more traffic, lots of suggestions that are difficult to group together into an actual request in phabricator for a feature. 300 open. [15:22:31] would volunteers be able to process them into Phab or no? [15:22:38] Simply throwing random developers at feedback won't help varnent, it needs people able to work with the mobile team(s) to get answers. [15:22:58] okay [15:23:06] Often, yes. Me and Florian have done a lot where we can [15:23:54] Thank you for that by the way :) [15:23:58] But this is a whole separate issue to your mediawiki bug submission process for newbies in OTRS idea. [15:24:14] right [15:24:38] I was just exploring what potential benefits adding more developers to OTRS may have [15:25:42] but obviously defer to you on the mobile queue issue [15:26:11] Are you simply interested in giving more people with technical knowledge full access to info-en for tech-issues? [15:26:32] Possibly [15:26:42] Or opening up another queue for gating bug submissions? [15:27:05] well - I’m not sure what, if anything, I’m suggesting yet - right now I’m just pondering ideas [15:27:42] The bug submissions idea would require more volunteers with technical knowledge. [15:27:48] So perhaps doing that is enough. [15:28:15] So at the very least, if there’s something that should be done, that seems like a logical first step to take. [15:28:43] Another possibility would be training existing OTRS on entering bugs into Phab. [15:29:06] In other words, help make the existing OTRS more tech knowledgeable [15:29:16] *existing OTRS volunteers [15:30:55] OTRS agents aren't really trained on anything right now. I think technical issues are not likely to be high up on the list. 
[15:31:04] and I admit that in part I am still trying to figure out if there are tech issues being missed or not being routed to Phab for developers to see - it seems to me from what I have seen there might be - but I am relatively new to OTRS - so cannot claim to be an expert :) [15:31:14] well - yes - training overall does seem to be an issue - I agree [15:32:14] a whole different topic [15:32:36] There are templates sitting around to direct people to mediawiki-l/wikitech-l. They seem to have been used by OTRS agents who couldn't answer such tickets to kick it out of OTRS in hopes they would go to the public list instead [15:33:20] am I correct that both require you to subscribe to send to them? [15:34:03] brb [15:34:16] For wikitech-l, yes. Not sure about mediawiki-l [15:52:43] okay - I don’t recall about MW-l either [16:03:17] No doubt it does [17:47:25] Nikerabbit: Are you maintaining CirrusSearch? [17:47:42] physikerwelt: no [17:48:51] physikerwelt: You'd probably want to talk to manybubbles or ^d [17:49:13] bawolff: thank you [17:49:21] <^d> whats up? [17:49:48] I'm having a hard time setting up CirrusSearch on my labs instance [17:49:55] https://phabricator.wikimedia.org/T90652 [17:50:36] <^d> Hmm [17:50:38] I cannot find any hint on how to get the index created [17:51:39] <^d> update.php? You need to run updateSearchIndexConfig.php [17:52:13] ^d: it has the same effect [17:52:17] <^d> Hmm, yeah [17:52:23] <^d> I've not ever done labs-vagrant [17:53:03] I have only one index, wiki_content_first, and I wonder where this comes from [17:53:05] it "should" work the same as mediawiki-vagrant [17:53:55] physikerwelt: I'm currently in the position that I have no clue what the error message means [17:54:14] <^d> I've got my vagrant up, lemme try to reproduce [18:07:40] <^d> Ah, there we go [18:07:43] <^d> That's no good [18:08:01] <^d> You can totally get in this situation with vagrant + Cirrus [18:09:15] ^d: I'd say it's good. At least it's reproducible [18:09:35] <^d> Yeah. 
So when we're creating the index with puppet it's not creating the aliases properly [18:09:57] <^d> Also it doesn't appear to make the general index at all, just content [18:11:05] Is there a way to fix it manually (independent of puppet)? [18:12:00] <^d> I think so, one moment [18:14:07] <^d> Not yet, hmmm [18:15:01] <^d> Hmmm, ES 1.3.8 in vagrant, 1.3.4 in prod [18:15:50] <^d> Our config file is bogus [18:15:51] <^d> wtf [18:16:03] <^d> bd808: Holp [18:16:11] wazzzup? [18:16:18] <^d> vagrant is only writing part of the config file for elasticsearch.yaml [18:16:30] that's ... non optimal [18:16:41] <^d> Oh wait, that's the whole file [18:16:44] <^d> I'm used to prod. [18:16:47] <^d> Weird [18:25:09] <^d> physikerwelt: Ok, to work around it for now you should be able to do this... 1) delete any messed up indices that were initially created (should just be `curl -XDELETE localhost:9200/wiki_content_first` but you can look for others with `curl localhost:9200/_cat/indices` [mw_cirrus_versions is fine]) [18:25:21] <^d> 2) Set $wgCirrusSearchMainPageCacheWarmer to false in your LocalSettings [18:25:29] <^d> 3) Run updateSearchIndexConfig.php again [18:26:38] <^d> You may hit some edge cases until the underlying problem is solved (I'll update the task in a moment) but that should get you mostly running in the right direction [18:28:51] andre__: Hi :) Who should I ping to get Phabricator tags created once the discussion is finished? [18:32:42] ^d: that's promising... i'll report back in a while; index creation naturally takes some time [18:33:12] <^d> I updated the task with more info [18:33:25] <^d> Should be able to find a fix now that it's isolated [18:42:07] Heiya, just a short question. As I can see on https://de.wikivoyage.org/wiki/Spezial:Version the slippymap tag is available. Now my question: Which extension actually provides it? 
Some documentation on the wiki refers to the SlippyMap extension, which is however not installed [18:44:27] Ah ok, it's the https://www.mediawiki.org/wiki/Extension:MapSources extension [18:44:37] Rather obvious I guess :) [18:46:14] Currently having a jolly good laugh with you. :) [18:46:35] :) [18:49:51] physikerwelt: no (but I guess you already found out) [18:50:33] guillom: anyone in Project-Creators can create projects. [18:51:43] legoktm: Yes, but no one is touching mine :) [18:51:51] guillom: link? [18:53:14] https://phabricator.wikimedia.org/T88470#1054821 and https://phabricator.wikimedia.org/T88468#1044279 [18:55:57] ^d: Indexing takes a while, since all the math elements are being rendered... the math extension implements the onMaintenanceRefreshLinksInit hook to avoid rerendering on refreshLinks... I wonder if there is a similar link for the search index [18:56:10] i meant a similar hook [18:57:00] Nikerabbit: Yes. Thank you. Have a nice day anyhow ;-) [19:03:42] <^d> Hmmm [19:17:25] ^d: now I get a different error: SearchPhaseExecutionException[Failed to execute phase [dfs], all shards failed https://gist.github.com/physikerwelt/dc195b0b56f1c1eca7a2 [19:17:58] <^d> Bah, same error [19:18:03] <^d> still need dynamic scripting [19:18:30] do I have to remove wgCirrusSearchMainPageCacheWarmer again? [19:19:28] but at least it looks like I have all the indexes [19:19:29] health index pri rep docs.count docs.deleted store.size pri.store.size [19:19:29] green mw_cirrus_versions 1 0 2 0 3.4kb 3.4kb [19:19:30] green wiki_content_first 4 0 131 0 5.8mb 5.8mb [19:19:30] green wiki_general_first 4 0 896 0 5.9mb 5.9mb [19:22:19] <^d> Yeah, the indexes are right but querying's going to be wonky [19:22:34] <^d> The cache warmer setting is only during initialization [19:23:32] guillom: https://phabricator.wikimedia.org/project/profile/1082/ [19:24:00] legoktm: Weee \o/ Thank you! [19:37:03] ^d: Do you have the same issue in your local vagrant instance? 
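Collecting ^d's workaround from above into one sequence (a sketch only, assuming a mediawiki-vagrant or labs-vagrant box with Elasticsearch on localhost:9200 and a standard CirrusSearch checkout under extensions/; the index names are the ones from this session, and the maintenance-script path is an assumption):

```shell
# 1) List existing indices; mw_cirrus_versions is fine to keep.
curl localhost:9200/_cat/indices

# 2) Delete the broken content index that puppet created without aliases.
curl -XDELETE localhost:9200/wiki_content_first

# 3) In LocalSettings.php, disable the cache warmer during initialization:
#      $wgCirrusSearchMainPageCacheWarmer = false;

# 4) Recreate the search indices from scratch.
php extensions/CirrusSearch/maintenance/updateSearchIndexConfig.php
```

This needs a live wiki and Elasticsearch instance, so treat it as a runbook sketch rather than something to paste blindly.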
[19:38:43] guillom, me or anybody who's in #Project-Creators [19:39:00] guillom, I might even get there tomorrow after doing a few other things :-/ [19:39:46] andre__: Thanks. Lego created #report; Have you made a decision regarding notice tags? i.e. #user-notice etc. vs #notice-users ? [19:40:07] I'm fine either way as long as I can start using them :) [19:42:23] guillom, I think both is fine, does not really matter because typeahead tokens work on dashes. Whatever you prefer [19:44:14] andre__: Ah, we're stuck in an indecisiveness loop :D Alright, let's go with #notice, #user-notice (with aliases #tech-news and #technews), #developer-notice and #sysadmin-notice [19:44:59] guillom, I love delegating decisions if I agree with them. :P [19:45:03] héhé [19:46:55] (And now for the most difficult part: Picking icons!) [19:47:53] Does anyone know of any extension, or maybe a pywikibot, that will update broken links after (or as part of) moving a page? [19:50:32] Rosencrantz: why don't redirects work? [19:51:13] Redirects work, but can easily clutter up search. [19:52:42] Well if they used to point to that page, wouldn't they be a legitimate response to the search query? [19:54:31] Not necessarily. For instance, if there's a misspelling in the title. Let me ask some of the folks I'm working with for some other examples. [19:55:35] Poor titles. [19:55:48] Poor and/or mislabeled. [19:55:58] I'm working with wikis that are primarily used for technical documentation. [19:58:15] So one of the common cases of page move is if a user puts up a page title like "Gitub repositories" which is specific to their group, but becomes confusing to a wider audience. [19:58:27] silly users [19:58:46] Hi [19:58:50] I have this weird error trying to run the install script in 1.23.8 [19:58:51] can I get help about the ldap extension here? 
[19:58:53] http://pastebin.com/C2aLHv7Q [20:05:16] !ldap [20:05:16] http://www.mediawiki.org/wiki/Extension:LDAP_Authentication To get support, open a new thread on the support page http://www.mediawiki.org/wiki/Extension_talk:LDAP_Authentication [20:06:02] ^d / manybubbles : Is it possible to do a search query with Cirrus like "incategory:Felis_silvestris_catus (intitle:Birds OR intitle:Dogs)" (Anything in the cat category that has either Birds or Dogs in the title) [20:06:59] bawolff: unfortunately we don't support OR there - it's rarely something people want but it's a gaping hole in the implementation. file a phab ticket! [20:07:05] it deserves to work [20:07:08] but it just doesn't [20:07:48] My real question was more, I wanted to do (intitle:ogg OR intitle:ogv OR intitle:webm) to do a pseudo search by file type [20:07:58] will file phab ticket [20:08:19] does OR work at any of the points in the query? [20:08:48] TIL: Going to https://www.mediawiki.org/ and writing "Wikimedia Product Development/Personae" in the search box goes to "Wikimedia Product Development/Personæ". Notice what happens in the last two letters. Automatic normalization? Super-smooth search? [20:09:27] But https://www.mediawiki.org/wiki/Wikimedia_Product_Development/Personae does not automatically redirect to https://www.mediawiki.org/wiki/Wikimedia_Product_Development/Person%C3%A6 . 
[20:13:04] aharoni: Well now it does :P [20:13:49] well yeah, but I'm wondering how the search was so smooth [20:17:23] I think there's an argument that if you go via direct url, you might not want to be redirected in case you want to make that specific page [20:17:44] whereas if you go via search, you're fine with case folding/unicode similar character folding [20:18:06] <^d> physikerwelt: Yep, I was able to replicate locally [20:25:33] ^d: I wonder why it behaves differently from production [20:26:20] if someone really wanted to be trolly, they should move that page to [[Wikimedia_Product_Development/Perſonæ]] [20:44:55] aharoni: yes, that's the new search being smart [20:46:08] If I add a new class, am I supposed to run and commit generateLocalAutoload.php, or does autoload.php get automatically regenerated via some sort of magic? [20:47:23] run and commit [20:47:34] it's just like previously, except with an additional useless indirection step [20:48:31] I notice when I run the script, it's hardly just the new class added, there's like 20 different things that are updated [20:50:11] meh, I'll manually edit it to just add the class I care about [20:50:23] <^d> physikerwelt: Different config :) [20:50:25] <^d> I'm more interested in how it ever managed to work on vagrant :p [21:04:18] I'm very much a fan of this idea https://phabricator.wikimedia.org/T76128 [21:06:33] bawolff: if it's adding extra classes, either you have local uncommitted stuff or someone before you forgot to run the script :P [21:06:48] my local copy is a bit of a mess [21:07:00] perhaps that's it [21:07:38] https://dpaste.de/DU36/raw [22:20:45] hello! can anyone recommend some steps for creating infoboxes? or some ground-up documentation for starters? [22:20:56] !templates [22:20:56] For more information about templates, see . 
See also: !templateproblems , !wptemplates [22:21:21] dtcrshr: Basically any introduction to CSS would probably be where I would suggest to start, documentation-wise [22:21:32] dtcrshr: but most people just copy from wikipedia [22:25:36] I see. This http://www.mediawiki.org/wiki/Manual:Importing_Wikipedia_infoboxes_tutorial shows just how to do this revamp from another wiki [22:26:06] well, I'll try this path, I need only 3 custom infoboxes, maybe there's one close to what I want.. if I just don't enter the fields, are they not displayed? [22:27:50] It depends on the infobox template [22:28:31] yeah... [23:33:26] How is the text index supposed to work when pages are edited? On my MediaWiki I have to run php rebuildtextindex.php manually; I doubt that it has to be done like this.
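On that last question: with the default MySQL full-text backend, the searchindex table is normally refreshed on every edit by a deferred SearchUpdate, so the rebuild script should only be needed for recovery, e.g. after an XML import or a period with $wgDisableSearchUpdate set to true. A sketch (assumes a standard core checkout; not verified against 1.23 specifically):

```shell
# Rebuild the MySQL searchindex table from the stored page text.
# Only needed when the per-edit deferred updates were skipped
# (imports, crashes, or $wgDisableSearchUpdate having been enabled).
php maintenance/rebuildtextindex.php
```

If the index goes stale on every normal edit, that points at deferred updates or the job queue not running rather than at the index itself.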