[00:00:46] willazilla: just delete anything on the template:screenshot page that looks like [[Category:Some cat name here]]
[00:00:50] ok. Thanks for trying!
[00:18:41] ^^bawolf^^ you totally rock, you are awesome. It efing worked! yay!
[00:22:41] bawolff does rock. This is a true statement
[00:22:57] yay me! ;)
[00:26:20] heh
[00:26:33] bd808: Did you hear that gwtoolset people are going for round 2? Given that you did much of the review/fix-up work last time around, you'd be in a good position to give some constructive criticism on their new grant proposal
[00:26:43] https://meta.wikimedia.org/wiki/Grants:PEG/Europeana/GLAMwiki_Toolset
[00:33:53] bawolff: oh neat. I had not heard about that. I'll look at the proposal
[00:34:26] There's also a thread on the mailing list https://lists.wikimedia.org/pipermail/glamtools/2015-February/000335.html
[00:34:42] i do hope it gets funded
[00:37:13] "confirmation of Code Review allocation from WMF technical department or provision of an alternative method for being trusted to review our own code, would be a requirement" ...
[00:37:23] there's the big sticking point
[00:37:27] yeah
[00:37:58] Tell them to add in +50% dev time and pay you to review ;)
[00:38:14] "or provision of an alternative method for being trusted to review our own code" is obviously not going to happen if they only have a single developer
[00:38:15] ;)
[00:38:39] nope. It's a non-starter
[00:38:49] I wish GWT had been built as a tool
[00:38:59] And I somehow imagine the multimedia team probably doesn't want to commit to that much review time (although nobody has said that)
[00:39:06] would have made a lot of people's lives easier
[00:39:29] and I can't imagine any other wmf team would even be willing to consider doing all the review
[00:40:49] mw-core did 90% of it last time around
[00:41:06] aaron and chris mostly
[00:42:18] that's true, but i imagine that was mostly because there wasn't even a multimedia team at that point
[00:42:58] *nod* It was handed off to t.gr and g.i11es pretty quickly
[01:30:03] what happened on february 6? http://stats.grok.se/www.w/latest30/MediaWiki
[01:33:14] I suppose we were popular
[01:34:13] that's such a spike, I wonder if it was some misbehaving bot
[04:23:45] GWToolset is interesting.
[04:24:13] Having volunteers upload giant datasets via slow and expensive HTTP requests is an interesting approach.
[04:24:55] The whole problem probably needs smarter approaches.
[04:26:15] My only experience with GWToolset is when we realised it was relying on PHP_SAPI to prevent indefinite job loops.
[04:26:16] in what sense is HTTP expensive?
[04:26:31] After the HHVM switch. Which changed PHP_SAPI.
[04:26:59] Fiona: i would bring that up with liam
[04:27:54] tgr: Maybe not the best word. Perhaps closer to inefficient.
[04:28:11] The broad goal seems to be to dump lots of content into Commons.
[04:28:23] Doing that via Web requests seems kind of rudimentary.
[04:29:49] Maybe we should set up a shipping service of some kind.
[04:29:50] I would say in any real-world-relevant sense having volunteers walk into GLAM headquarters with big harddisks is much, much more expensive than fetching publicly available data via HTTP
[04:30:13] Depends how much data you're talking about.
[04:30:25] You can get quite a lot of on a hard drive in-person in a very fast period of time.
[04:30:34] s/of //
[04:30:48] I would hope you get all of the hard drive.
[04:30:49] But the flip side is whether Commons is really ready to accept so much media.
[04:31:01] which is the part which is not relevant in practice
[04:31:19] maybe the upload will take weeks, who cares
[04:31:27] I think if a big organization wants to make a donation, shipping a hard drive is much cheaper.
[04:31:35] the bottleneck is Commons' ability to handle uploads anyway
[04:31:42] Well, it's expensive in the sense that you're using bandwidth.
[04:31:44] And CPU.
[04:32:07] Right, I'm not sure Commons wants/can handle a lot of data quickly.
[04:32:43] Pushing data through the public Internet makes sense in a lot of contexts. For these particular use-cases, it seems kind of silly.
[04:32:48] the expense that's relevant in practice is that someone has to buy the harddisk, someone has to arrange with the staff of the GLAM to get in there and get the data, someone has to get the disks to the WMF datacenter, someone has to actually upload files from them
[04:33:06] Right. We could make that process more efficient.
[04:33:12] which is a lot of barriers to people being motivated to do it
[04:33:28] Maybe, but you don't do a large act really quickly.
[04:33:32] That's true in all parts of life.
[04:33:35] compared to entering an URL and pressing a button
[04:33:38] If you want to upload a lot of data, it requires more work.
[04:33:53] it doesn't require more work
[04:34:00] it requires more time
[04:34:08] time is cheap, work is not
[04:34:24] You're just hiding the work.
[04:34:51] Or shifting it, I guess.
[04:35:07] shifting it to computers
[04:35:16] which is why we have them
[04:35:35] Eh, I'd stay call the system inefficient and expensive.
[04:35:49] s/stay/still/
[04:35:53] I should probably sleep.
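
To make the PHP_SAPI remark above concrete: PHP_SAPI reports the name of the server API the script runs under, and HHVM reports different values for some modes than Zend PHP does, so any behaviour keyed off it can change after a runtime switch. The snippet below is a hypothetical, simplified illustration of that failure mode, not GWToolset's actual code; the job list and the runner are stand-ins.

    <?php
    // Hypothetical illustration only, not GWToolset's real code.
    $jobs = array( 'job-1', 'job-2', 'job-3' ); // stand-in for a real job queue

    foreach ( $jobs as $job ) {
        echo "running $job\n"; // stand-in for the real job runner

        if ( PHP_SAPI !== 'cli' ) {
            // On a web request, stop after one job so the request stays fast.
            // If the runtime reports an unexpected SAPI string, this guard can
            // stop (or start) matching and the loop no longer behaves as intended.
            break;
        }
    }
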
[04:53:32] Why isn't my site map populating @ http://wiki.gamepaduniverse.com/w/sitemap.xsl? I am using the extension found here: http://www.mediawiki.org/wiki/Extension:DynamicWikiSitemap
[04:55:31] i applied the 1.24 fix found in the discussion section
[04:56:33] and adjusted $DEFAULT_SITEMAP_STYLE and $DEFAULT_SITEINDEX_STYLE in sitemap.php to reflect the proper directory the files are installed in
[10:10:30] Hi all !
[10:24:26] I found a strange behavior in SpecialSearch.php. Can someone help me to determine if it's a bug ?
[10:25:51] SpecialSearch.php, line 363, hook "SpecialSearchResults" is triggered, with titleMatches and textMatches parameters by reference
[10:26:56] But a few lines earlier (315-324), the numbers are copied
[10:27:12] numbers of "how many matches"
[10:28:24] So if I try to add results in titleMatches or textMatches in the hook, and there is no result by default, my results will be ignored...
[10:28:57] Seems strange to me.
[10:29:52] But if there is a right way to just add search results (and keep the current search engine), please tell me !!
[10:29:55] Thanks :)
[11:01:26] why does $wgAllowImageTag automatically allow external images?
[11:24:42] kthx_: Because you are allowing raw <img> tags to be placed.
[11:26:21] Zhaofeng_Li: and what do i need $wgAllowExternalImages for?
[11:27:34] kthx_: Inserting external images without any markup (an image URL placed in the wikitext will automatically be expanded)
[11:29:45] Zhaofeng_Li: so it's not possible to alter the size of the image? it's just rendered as a bare "<img>"?
[11:30:57] kthx_: I might be wrong, but I don't think so.
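
For the $wgAllowImageTag / $wgAllowExternalImages exchange just above, a minimal LocalSettings.php sketch may help. Both variables are real MediaWiki settings of this era; the comments only restate what was said in the channel, and whether you want either of them enabled on a public wiki is a separate security question.

    # Appended to LocalSettings.php (sketch).

    # Allow the raw HTML <img> tag in wikitext. This is what lets you set
    # width/height and other attributes yourself.
    $wgAllowImageTag = true;

    # Allow bare external image URLs in wikitext to be expanded into images.
    # No markup is involved, so there is no handle for setting a size.
    $wgAllowExternalImages = true;
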
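On the SpecialSearch.php question earlier in the log: a handler for the "SpecialSearchResults" hook receives the title and text result sets by reference, so in principle it can swap in or extend them. The sketch below is illustrative extension code, not anything from MediaWiki core; wfExampleSearchResults and ExampleResultSet are made-up names, and the counting problem described above (the match counts being read before the hook runs) would still apply.

    $wgHooks['SpecialSearchResults'][] = 'wfExampleSearchResults';

    /**
     * @param string $term The search term
     * @param SearchResultSet|null &$titleMatches Title matches, by reference
     * @param SearchResultSet|null &$textMatches Text matches, by reference
     * @return bool true so other handlers and the default display still run
     */
    function wfExampleSearchResults( $term, &$titleMatches, &$textMatches ) {
        // Replace (or wrap) the text matches with results from a custom backend.
        // ExampleResultSet is a hypothetical SearchResultSet subclass.
        $textMatches = new ExampleResultSet( $term );
        return true;
    }
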
[14:52:19] Why isn't my site map populating @ http://wiki.gamepaduniverse.com/w/sitemap.xsl? I am using the extension found here: http://www.mediawiki.org/wiki/Extension:DynamicWikiSitemap
[14:52:26] i applied the 1.24 fix found in the discussion section
[14:52:33] and adjusted $DEFAULT_SITEMAP_STYLE and $DEFAULT_SITEINDEX_STYLE in sitemap.php to reflect the proper directory the files are installed in
[14:53:59] hi. how to mark featured articles in mw 1.25?
[15:04:50] Heiya, just a quick one, hopefully :) MW generates the <title> tag by combining the page name with the site name. Is there a setting or some other way to manipulate this in core? I know that there are extensions around.
[15:06:30] <kghbln> I have done some search on mw.o but could not find anything. probably used the wrong words.
[15:13:51] <kghbln> ah yeah, it's "MediaWiki:Pagetitle". Sometimes just asking helps you find the solution.
[15:14:00] <kghbln> :) Cheers
[17:04:18] <aharoni> Hallo.
[17:04:29] <ty> Hi
[17:04:35] <aharoni> Can anybody please make me an admin at http://en.wikipedia.beta.wmflabs.org/ ?
[17:05:00] <aharoni> Oh, actually I should be able to do it myself.
[17:05:29] <aharoni> funny, I'm a crat there, but not an admin.
[17:42:00] <Betacommand> !centauth
[18:44:21] <Betacommand> how does wikimedia handle global user rights?
[18:48:51] <brion> Betacommand: global groups applied via CentralAuth iirc
[18:49:10] <brion> the group permissions are configured globally in $wgGroupPermissions in common config
[18:49:22] <brion> and then the memberships are conferred via CentralAuth’s auth plugin
[18:49:23] <brion> i think :D
[18:50:00] <Betacommand> brion: thanks, trying to help a guy on the mailing list with a wiki family who wants to add global rights
[18:52:37] <legoktm> Betacommand: there's also a GlobalUserGroups extension iirc. people really shouldn't use centralauth >.>
[18:55:16] <brion> ooh handy
[18:55:35] <brion> i replied with some general notes, didn’t know about that one, sounds handy too :D
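
For the wiki-family question above, a sketch of the "common config" idea brion mentions: a settings file that every wiki's LocalSettings.php includes, so the group and its rights are defined identically everywhere. The group name 'family-steward', the chosen rights, and the include path are placeholders; without CentralAuth global groups (or a similar extension), membership still has to be granted on each wiki.

    # CommonSettings.php (hypothetical shared file), pulled in from every wiki via
    #   require_once "$IP/../CommonSettings.php";

    # Define the group's rights once; every wiki that includes this file
    # gets the same definition.
    $wgGroupPermissions['family-steward']['delete']  = true;
    $wgGroupPermissions['family-steward']['block']   = true;
    $wgGroupPermissions['family-steward']['protect'] = true;

    # Membership is still assigned per wiki (Special:UserRights) unless something
    # like CentralAuth's global groups manages it centrally.
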
[19:06:46] <APerson_> Is there any way I can tell the API to return how many results would be returned from a query (like COUNT(*) in SQL) instead of the query results?
[19:15:39] <ty> APerson_: Most API queries have a limit parameter where you can specify how much you want
[19:15:59] <APerson_> ty: Is it possible to have the API just return how many results, though?
[19:16:09] <APerson_> I'm not interested in getting the actual results.
[19:16:31] <MaxSem> no
[19:16:50] <ty> Sorry I misunderstood your question
[19:17:47] <RoanKattouw> APerson_: No, the API doesn't offer that
[19:17:56] <RoanKattouw> Because we'd actually have to run a real COUNT(*) query for that
[19:18:16] <APerson_> RoanKattouw: Okay, thanks
[19:18:19] <RoanKattouw> Which gets crazy if you're asking how many pages are in [[Category:Living people]] or how many pages use [[Template:!]]
[19:18:40] <RoanKattouw> (IIRC the former is something like 700k, and the latter is about 2 million)
[19:18:51] <MaxSem> none should use ! though :P
[19:19:02] <RoanKattouw> Oh I guess {{!}} is a magic word now
[19:19:03] <APerson_> So there's no way other than actually running the query and getting the length property from it?
[19:20:13] <RoanKattouw> Exactly
[19:20:21] <RoanKattouw> Except in a few cases where numbers may be tracked separately
[19:20:27] <RoanKattouw> Like, we have counters for category membership
[19:21:07] <APerson_> okay, cool
[19:21:32] <APerson_> I was trying to figure out how many articles a user has created, and to do that, I'm just using the Usercontribs module
[19:21:44] <RoanKattouw> Oh, right
[19:21:57] <RoanKattouw> Yeah we have a counter of how many *edits* a user has, but not of how many *creations* they have
[19:22:36] <APerson_> Cool. By the way, would you happen to know of any JS wrappers for the API that do continues?
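
On APerson_'s questions above (no COUNT(*) in the API, and handling continuation): the usual approach is to page through the list yourself and count client-side. Below is a stand-alone PHP sketch rather than a JS wrapper, counting one user's page creations via list=usercontribs and the API's "continue" protocol. The endpoint and user name are placeholders, there is no error handling, and file_get_contents() over HTTP assumes allow_url_fopen is enabled.

    <?php
    // Sketch: count a user's page creations by paging through list=usercontribs.
    $endpoint = 'https://en.wikipedia.org/w/api.php'; // placeholder wiki
    $params = array(
        'action'   => 'query',
        'format'   => 'json',
        'list'     => 'usercontribs',
        'ucuser'   => 'ExampleUser',   // placeholder user name
        'ucprop'   => 'title|flags',   // "flags" includes the "new" (page creation) flag
        'uclimit'  => 'max',
        'continue' => '',              // opt in to the current continuation format
    );

    $created = 0;
    do {
        $url = $endpoint . '?' . http_build_query( $params );
        $data = json_decode( file_get_contents( $url ), true );

        $contribs = isset( $data['query']['usercontribs'] )
            ? $data['query']['usercontribs'] : array();
        foreach ( $contribs as $contrib ) {
            if ( isset( $contrib['new'] ) ) { // present only for page creations
                $created++;
            }
        }

        // Merge the returned continuation values into the next request.
        if ( isset( $data['continue'] ) ) {
            $params = array_merge( $params, $data['continue'] );
        }
    } while ( isset( $data['continue'] ) );

    echo "Pages created: $created\n";
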
" Buuut save_path is set and session files are being created... O_o [22:03:59] <ianj> I am looking at adding custom namespaces to my wiki, and I think I am missing something. [22:04:31] <ianj> one of the wikis is in Japanese, and the predefined namespaces look like this in XML: [22:04:35] <ianj> <ns id="-1" case="first-letter" canonical="Special" xml:space="preserve">特別</ns> [22:05:06] <ianj> but in the documentation for $wgExtraNamespaces and related, I don't see any way to set a canonical="$NAME" with the namespace name as $LOCALIZED_NAME [22:05:38] <ianj> I tried using wgNamespaceAliases as suggested, but I don't like the result because unlike the predefined namespaces, the text in the URL is converted to the canonical name when the page is loaded. [22:06:01] <ianj> is it possible to localize custom namespaces in the same manner as the predefined ones? [22:07:00] <RoanKattouw> ianj: You can't set different canonical names for extra namespaces. But you could define the extra namespace under the "canonical" name and make the other name an alias [22:07:07] <Vulpix> custom extra namespaces, as I understand, affect a single wiki with a specific language, and there's no point in localizing them [22:07:24] <RoanKattouw> But, yeah, you can't localize them in the same manner as built-in namespaces, unfortunately [22:07:29] <ianj> I see [22:07:34] <ianj> I'm mainly trying to be consistent. [22:08:53] <Vulpix> basically, there's no point in defining a canonical for them, because they're specific for that wiki. Canonical makes sense when that namespace is common on all wikis [22:16:29] <ianj> I've got an InterWiki multilingual setup like Wikipedia does, so I thought it would be nice [22:16:38] <ianj> but I can deal with it this way, sure [22:16:53] <ianj> t's not great but at least it looks the same on the surface. [22:18:01] <Vulpix> you'll have to go with aliases [22:21:06] <Nemo_bis> "If you have a small group (e.g. a small co-op), consider simply using [WWW]Google Docs" sigh http://wikispot.org/2015_Shutdown_Notice [22:22:01] * ianj sigh [22:22:30] <Vulpix> WikiTeam probably needs to archive those wikis :( [22:22:43] <ianj> I'm not familiar with wikispot, but I tire of the "let's just let google do everything for us!" mentality [22:23:28] <Vulpix> "For a small amount of information (< 1000 pages), simply copying and pasting your information into a new provider will probably work great" [22:24:02] <ianj> it looks like they've just replaced wikispot with localwiki, though [22:24:05] <Vulpix> ah, that's not a MediaWiki. I was about to suggest Special:Export :D [22:24:57] <Nemo_bis> yeah, otherwise I would have already downloaded it all ;) [22:25:23] <Nemo_bis> Nice thing is that they're working on a converted to MediaWiki. Maybe Lcawte is interested in absorbing all those wikis? :) [22:28:44] <Lcawte> ShoutWiki is more than happy to do MediaWiki XML imports if requested. [22:29:52] <Nemo_bis> Ok. :) Will try to get the dumps first ;)