[16:10:11] TimStarling: meeting in 3hr, right? [16:10:19] Updating the RFC [16:37:06] hexmode: isnt it a little later? [16:37:06] https://phabricator.wikimedia.org/T76575#806428 [16:37:48] 21:00 UTC (now it is ~ 16:30 UTC ) [16:38:21] I thought so as well Helder [16:41:51] Helder: ok.... well, better early than late. [18:00:10] :D [18:00:15] hey everyone :) [18:00:22] Hey [18:00:24] time for the wikidata office hour [18:00:26] Hey [18:00:27] hey [18:00:28] who's here for it? [18:00:34] <-- [18:00:34] yay! people! :D [18:00:51] We need to do the formalities first... [18:01:02] oh my :P [18:01:03] which ones? [18:01:20] Wasn't there a bot here or something? [18:01:36] yes but i fail at using it... [18:01:43] does anyone know how to use it? [18:02:00] also it seems it is gone [18:02:06] nvm then [18:02:12] :) [18:02:17] alright [18:02:27] so big stuff that happened lately: [18:02:29] I think we need to settle the current problem, deletion vs redirection. Some admins redirect and some delete. I think rfd should be consistent. Either redirect or delete [18:02:41] Ehm, deploy today? [18:02:56] heh ok then those first [18:03:06] Jianhui67|cloud: there is an open rfc that needs closing by an admin about this [18:03:11] not sure why it wasn't closed yet [18:03:22] sjoerddebruin: yes deploy today for statements on properties [18:03:26] Yep. It has been opened for 2 months [18:03:29] Maybe we don't have active uninvolved admins? [18:03:43] possibly [18:03:47] i didn't check [18:03:48] Oh, my name is no where on that page. [18:03:52] hah! [18:04:10] But never closed a RfC before. :/ [18:04:49] ohhhhh [18:04:51] but but but [18:04:54] then it can be your first [18:04:57] exciting! :D [18:05:17] Jianhui67|cloud: Have you asked people like Epìdosis why they are still deleting? [18:05:33] And Cycn. [18:05:53] I think its because they simply dont want to loose the rights (due to inactivity) [18:06:08] Bad reason... [18:06:15] Lol [18:06:27] yeah ... btw hi Lydia [18:06:30] Cycn already showed his disagreements [18:06:46] hey matej_suchanek :) [18:07:00] Oh yes, I can't find Cycn because of the stupid signature. [18:07:02] * DanielK_WMDE wibbles [18:07:02] Jianhui67|cloud: what did they say? [18:07:06] I told some of the people who request deletion to redirect instead of requesting deletion [18:07:45] cool [18:08:09] sjoerddebruin: do you feel like you can/want to close the rfc or should we see if we find someone else? [18:08:37] I don't think I'm neutral anymore. :/ [18:09:15] fair enough [18:09:36] Jianhui67|cloud: want to find another admin then? or post it on the admin noticeboard? [18:09:47] good idea [18:09:48] Stryn, why don't you do it [18:09:57] Or matej_suchanek [18:10:20] There is also a backlog regarding property deletions. [18:10:22] I was involed into it and I have never closed an RfC before [18:10:34] I think it's a good point to start 2015 clean. [18:11:53] sjoerddebruin: is the issue a lack of discussion about the properties or about making the decision and doing it? [18:12:33] There is enough discussion, but many people participated. [18:12:55] i see [18:13:39] And I'm very busy with all backlogs... [18:13:44] *nod* [18:13:50] thanks for working on those! [18:14:10] I sometimes think I'm the only one that is beating vandalism. [18:14:17] oh I have just reminded... 
[18:14:24] I'm lazy :p [18:14:36] Lazy to do anything lol [18:14:37] mh the merge gadged has "Request deletion for extra item on RfD" as a default, probably should be changed when that RfC is closed [18:14:45] Jianhui67|cloud: how about becoming unlazy and posting about the rfc on admin noticeboard? :D [18:14:54] Sure [18:14:57] \o/ [18:15:12] jzerebecki: adaption already in progress i think according to project chat [18:15:19] the filter 32 (Special:AbuseFilter/32) doesn't work anymore, same issue as with the redirects [18:15:35] matej_suchanek: which one is that? [18:15:49] tagging merging items [18:15:53] Uncompleted merges: https://www.wikidata.org/wiki/Special:ShortPages [18:16:14] it's been so since September 30th [18:16:36] matej_suchanek: is this something the dev team can/has to fix? what's the issue? [18:17:14] I don't know but there was a bug that the AF couldn't tag creating redirects [18:17:24] ahhhh [18:17:26] even if there was everything correct [18:17:26] i remember [18:17:32] let me search for that [18:17:42] and I see this is a similar issue [18:18:19] https://phabricator.wikimedia.org/T72715 [18:18:25] ok i'll bump this one [18:18:36] thanks [18:18:38] np [18:19:37] other things we should talk about? [18:19:52] maybe I have one more question [18:19:57] shoot [18:20:15] Done! [18:20:39] Jianhui67|cloud: \o/ [18:21:15] it's quite complex [18:21:33] it's related to enforcing constraints [18:21:37] we can handle it :P [18:21:39] ok [18:23:21] I also realise from RfD, there seems to be a problem with creating redirects from there. [18:23:23] basically you have once on the Project Chat stated that if there is an inverse constraint the system wouldn't handle it by making two same edits to both items [18:23:45] yeah [18:23:54] this was something I had imagined how it should work [18:24:29] for example both statements would have the same reference(s), qualifiers etc. [18:25:06] what was the reason not to go this way? [18:25:09] right [18:25:13] just asking [18:25:19] sure :) [18:25:29] this is what this office hour is for, right? ;-) [18:25:36] yeah [18:25:44] anyway: the reason is that the world is unfortunately complicated [18:26:10] and while in most cases inverse properties might actually be inverse there are still those special cases that are "different" [18:26:28] just like we don't specify that sex/gender can only be male and female [18:26:36] because obviously the world is more complex than that [18:26:39] however! [18:26:51] we can still build clever tools to help [18:27:18] ok .. by the way, how's the team doing? [18:27:23] for the sex/gender case we are going to build out the suggester so it suggests the most common values first while still letting you enter other things [18:27:38] may I add to the answer to the inverse question a bit? [18:27:58] yes [18:27:58] and for inverse properties we can simply give the user a small hint saying "this seems to be an inverse property. would you like to also add it to the other item?" or something like that [18:28:10] dennyvrandecic: please :) [18:28:20] imagine the inverse of "country" [18:28:31] or of "nationality", to be a bit more specific [18:29:09] if every statement with a country property added to any song, movie, building, etc. would lead to a respective edit on the object of that statement, ... [18:29:12] ... we would be in trouble [18:29:29] hah that too! 
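Lydia's hint at 18:27 ("would you like to also add it to the other item?") could, once the user confirms, boil down to something like the following rough pywikibot sketch. It assumes a hand-maintained table of inverse pairs; the property IDs are illustrative only, not anything Wikibase enforces.

```python
# Rough sketch of acting on the "also add it to the other item?" hint after the
# user confirms. INVERSE_OF is a hypothetical, community-maintained table;
# Wikibase itself does not know which properties are inverses.
import pywikibot

INVERSE_OF = {
    'P527': 'P361',   # e.g. "has part" <-> "part of" (IDs illustrative only)
    'P361': 'P527',
}

def mirror_statement(source_qid, prop_id, target_qid):
    inverse = INVERSE_OF.get(prop_id)
    if inverse is None:
        return False                          # no known inverse, no hint to act on
    site = pywikibot.Site('wikidata', 'wikidata')
    repo = site.data_repository()
    target = pywikibot.ItemPage(repo, target_qid)
    target.get()
    for claim in target.claims.get(inverse, []):
        t = claim.getTarget()
        if t is not None and t.id == source_qid:
            return False                      # mirrored statement already exists
    claim = pywikibot.Claim(repo, inverse)
    claim.setTarget(pywikibot.ItemPage(repo, source_qid))
    target.addClaim(claim, summary='add inverse statement (user-confirmed hint)')
    return True
```

Because the table is curated rather than computed, the "different" special cases Lydia mentions simply stay out of it.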
[18:29:37] the items for Germany, Russia, United States are already quite rich [18:29:47] but that's nothing compared to the number of inverses they would sport [18:30:16] (Hello All) [18:30:26] matej_suchanek: does that make sense? [18:30:29] I like the idea of the small hint [18:30:40] me too [18:31:04] sweet [18:31:20] I think there should be, once we have simple queries up and running, a link to "show me all items that have a statement with this as an object and the property P" [18:31:22] there are quite a few cases where we can be smarter like this [18:31:35] that would resolve your usecase in most cases [18:31:35] if you can think of some that would be useful do file tickets or tell me [18:31:59] not in all use cases, but in most - e.g. reflexive properties are not handled well, still [18:32:04] but these are fortunately rare [18:32:08] dennyvrandecic: the explanation makes sence for me but the mentioned nationality is not an inverse property so there can't be a problem ... or did I misunderstand you? [18:32:19] Lydia_WMDE: When will I be able to view Wikidata on mobile? [18:32:33] sjoerddebruin: ahhh asking the painful questions :D [18:32:34] no, but the property "nationality" would have an inverse, like "citizen" or sth [18:32:46] It's almost 2015. :) [18:32:47] matej_suchanek: ^ [18:33:00] sjoerddebruin: i can't give you a specific date but we'll make progress with the new UI on that [18:33:01] sjoerddebruin: as soon as there are devs to do it :P [18:33:39] why not start an Open Source alternative Wikidata app? I wanted to do some Android deving anyway :) [18:33:54] having an app would be cool [18:34:16] I'm also dreaming of a working nearby page on Wikidata. ;) [18:34:25] matej_suchanek: another issue is that a reasonable inverse might sometimes depend on the actual domain and range [18:34:28] sjoerddebruin: there is a tool for that [18:34:37] sjoerddebruin: let me see if i can find it [18:34:56] matej_suchanek: the property country is currently used on movies, songs, albums, buildings, mountains, etc. it is hard to figure out a good inverse for that [18:35:20] sjoerddebruin: http://www.johl.io/aroundme-wikidata/ [18:35:22] matej_suchanek: but it would be a pity to create incentives to create only properties that have a good inverse [18:35:34] I meant that not all properties need an inverse [18:35:36] Why is it German? :( [18:35:56] It's broken. [18:35:59] sjoerddebruin: you can poke johl on twitter to translate it ;-) [18:36:04] matej_suchanek: true, but there would be task forces to make inverses for all properties ;) [18:36:12] it is working for me [18:36:24] the tool is nice anyway [18:36:29] Will try the iOS-simulator then. [18:36:33] k [18:36:50] Nothing... [18:36:58] hmmmm [18:37:11] are you giving it permission to read your location? i missed that first [18:37:18] Yup. [18:37:27] dennyvrandecic: i'm totally for getting rid of all inverses ;) its redundant data [18:37:28] WONTFIX, works for me ;) [18:37:50] sjoerddebruin: hmm then i don't know sorry [18:38:24] But we have https://tools.wmflabs.org/wikidata-todo/around.html :D [18:38:47] ah that also works ;-) [18:39:02] The Netherlands is crowdy because of all the monuments. :) [18:39:31] nice! :D [18:39:46] I once had the idea to put locations of Subway restaurants, Starbucks, etc. on Wikidata... [18:40:29] So we can become a good POI-provider. [18:41:32] that might be a bit too much and better left to openstreetmap? 
:) [18:42:15] we are still missing a property to link to stuff on osm that is not a relation there [18:42:15] Me and my crazy ideas... [18:42:28] sjoerddebruin: <3 [18:43:04] Lydia_WMDE: otoh wikivoyage probably has many of those POIs already... [18:43:14] sjoerddebruin: but that would need an item for each starbucks? [18:43:27] dennyvrandecic: Probally. :P [18:43:39] I don't think that's a wacky idea [18:44:20] We also need a business hours property then. :/ [18:44:27] I actually think it would be pretty awesome. Although I would start not with Starbuckses, but "all schools", "all museums", etc. [18:44:50] we have a not too bad coverage for these in some places already [18:44:59] yeah [18:45:17] all libraries [18:45:58] that would be rather neat [18:46:04] long time ago, I got a crazy idea [18:46:14] matej_suchanek: I'd love to hear it [18:46:40] as a user, you can have a sandbox but that's only for wikitext [18:46:56] what about having a sandbox item in the user namespace? [18:47:08] I have a file with a lot of schools in the Netherlands, sadly no license is noted.https://data.overheid.nl/data/dataset/02-vo-adressen-alle-vestigingen--ministerie-van-ocw [18:47:11] there is a sanbox item already in the main namespace [18:47:33] I know [18:47:58] data licenses suck :( [18:48:10] as I said, it was a crazy idea [18:48:13] Sailing schools are public domain. https://data.overheid.nl/data/dataset/zeilscholen XD [18:48:31] matej_suchanek: not crazy ;-) [18:48:45] sorry, leaving for a while [18:49:01] The best public domain dataset I've found is about gas stations. With coords and all. [18:49:16] gas stations? which area? [18:49:23] Netherlands. [18:49:59] (10 mins left) [18:50:34] any other topics we should discuss still? [18:51:38] ok then i'll give a quick list of things the dev team has been working on lately: [18:51:57] * performance improvements - some are already live but more are coming [18:52:10] * UI redesign - currently the sitelink section is still being worked on [18:52:22] * statements on properties - to be rolled out tonight [18:52:36] * language fallbacks - first version to be rolled out in a week or two [18:52:55] * arbitrary access and usage tracking - coming early next year - rollout will be staged [18:53:57] * started working with the team of students to work on data quality tools - they are concentrating on constraint violations and reports as well as checking against 3rd party databases [18:54:04] back again [18:54:05] up next: [18:54:13] * finishing the above [18:54:18] rough idea on the stages for the arbacces? [18:54:19] * working with the Foundation on queries [18:54:41] * more work on data quality tools [18:54:47] * units [18:54:55] Units <3 [18:55:04] dennyvrandecic: commons, small wikipedias, big wikipedias, other projects maybe? [18:55:21] sweet [18:55:41] what about wikidata, commons, ... ? [18:55:49] ? [18:55:55] wikidata should be the most friendly place to start this arbaccess [18:56:02] and they won't need it that much [18:56:10] oh it already has it! (just without purging) [18:56:28] we endabled that a while ago because there were bad hacks to make it work anyway [18:56:37] ah, ok, so it will be the first place to get tracking and purging anyway? 
[18:56:39] good idea [18:56:42] yeah [18:56:56] so it can be tested there, and then it goes to commons, who will really benefit from this [18:57:00] and the smaller wikipedias [18:57:01] right [18:57:05] awesome [18:57:17] and agree with sjoerddebruin: units <3 [18:57:20] :D [18:57:27] all <3 units [18:57:29] Stryn closed the rfc [18:57:33] \o/ [18:57:34] so the arbacc deployment will be split? [18:57:45] wow!!! [18:57:46] matej_suchanek: yeah because we need to see how it scales [18:57:48] sjoerddebruin:, dennyvrandecic: Might I add please add / contribute at some point to this conversation some CC wiki link resources planned for first for large languages, focusing on Sailing schools, All Museums, All Libraries as part of a CC all 7, 106 languages, MIT OCW-centric open wiki World University and School, which is like Wikipedia with MIT OCW ? [18:58:08] Lydia_WMDE: right [18:58:24] Scott_WUaS: yes, sure, go ahead, it's an open forum [18:58:28] every big change goes first to small projects [18:58:53] maybe even small wikipedia before commons [18:59:00] *nod* [18:59:10] because commons can (and should!) go crayz with arbacc! [18:59:15] *crazy [18:59:16] hehe true [18:59:19] Stryn, many thanks to you [18:59:43] Thanks ... still only in English, but planned for large languages and for wiki teaching and learning - here's the beginning wiki Sailing subject / school - http://worlduniversity.wikia.com/wiki/Sailing [18:59:44] it wasn't so hard than I expected [18:59:54] thanks Stryn! [18:59:56] lol [18:59:59] it's time :D [19:00:13] yeah it was very short :/ [19:00:14] It's 3am over here xD [19:00:19] Library Resources planned to aggregate all open online libraries in each of all languages - http://worlduniversity.wikia.com/wiki/Library_Resources [19:00:38] thanks everyone for coming! :) as always if you want to chat more come to #wikidata [19:00:55] was good to chat with you all [19:00:57] <3 [19:01:01] Museums wiki subject page - planned to aggregate all open online museums in each of all languages http://worlduniversity.wikia.com/wiki/Museums [19:01:07] Thanks, Lydia! [19:01:17] thanks Scott_WUaS! [19:01:42] see you on #wikidata [19:01:47] thanks Lydia_WMDE [19:02:06] always [19:02:11] Thanks member:dennyvrandecic and member:sjoerddebruin [19:02:38] Now I feel like in a socialist cadre, being called a member :D [19:02:53] haha [19:03:12] I removed the colon and these technologies printed member :) [20:56:01] 5 minutes [20:56:26] till the Global Wiki discussion? [20:56:59] There's going to be a MediaWiki RfC meeting [20:57:17] ah, thanks [20:57:20] https://lists.wikimedia.org/pipermail/wikitech-l/2014-December/079713.html [20:58:54] #startmeeting RFC meeting [20:58:57] Meeting started Wed Dec 3 20:58:54 2014 UTC and is due to finish in 60 minutes. The chair is TimStarling. Information about MeetBot at http://wiki.debian.org/MeetBot. [20:58:57] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote. [20:58:57] The meeting name has been set to 'rfc_meeting' [20:59:13] #topic Shadow namespaces | RFC meeting | PLEASE SCHEDULE YOUR OFFICE HOURS: https://meta.wikimedia.org/wiki/IRC_office_hours | Please note: Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE) | Logs: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-office/ [21:00:51] #link https://www.mediawiki.org/wiki/Requests_for_comment/Shadow_namespaces [21:01:20] so this RFC has been renamed from "global scripts" and (entirely?) rewritten in the last day [21:01:48] yaaay [21:02:13] I see. 
I think the idea has merit [21:02:14] yeah, Fiona and I talked about it yesterday and updated it with where we'd like to see it go, it was kind of a mess before [21:03:02] Fiona? not just the RFC that was renamed then? [21:03:20] :P [21:03:20] Hi. [21:03:22] ah, we have Marybelle as well [21:03:38] Via telnet, even. ;-) [21:03:39] Global gadgets was part of the earlier RfC, but that is alreayd mostly implemented and was split out to https://www.mediawiki.org/wiki/Requests_for_comment/Global_user_preferences [21:04:08] Yeah, I didn't like the organization, so I re-did it. [21:04:15] help namespaces by remote loading is an interesting idea, I always assumed it would be done by XML export [21:04:39] Didn't we have Help pull from Meta-Wiki at some point? [21:04:43] The problem with XML exports is that you have to keep re-importing to stay up to date [21:04:54] I actually meant to note that in the RFC. [21:05:11] The other problem with XML is that it's XML. [21:05:31] and importing gets messy with usernames and attribution and if a local user has the same username as a user on mw.o or something. [21:05:32] I don't think anything ever actually pulled in help [21:05:50] I could swear we had some kind of hacky auto-tranclusion thing. [21:06:04] In any case, shadow namespaces could easily be adapted to the Help namespace, I think. [21:06:13] It's relatively easy to fetch a full Content object from another wiki's database. [21:06:23] Oo! so, could this be used to update help pages on non-WMF wikis? [21:06:40] ...if the database is directly accessible. [21:06:49] DanielK_WMDE: we'd want this to also work over the API [21:07:13] hexmode: yes [21:07:22] legoktm: both could work. but i see several issues. [21:07:29] one hard thing is purging [21:07:32] +1 [21:07:35] the idea is to get wikitext from the remote wiki and then do the parsing locally? [21:07:38] the other hard thing is not tripping over a thousand assumptions [21:07:46] hexmode: update is the wrong word though, it would always live on mediawiki.org, but be visible from the local wiki [21:07:58] TimStarling: I think you'd do parsing on the foreign side? [21:08:06] anywaaay.... if I click [edit], what will happen? [21:08:25] If you click edit, I think you'll get an edit notice saying "the page you're looking for is over here." [21:08:26] legoktm: ok, not ideal, but one step at a time [21:08:31] doing the parsing remotely is probably good for help pages, and certain kinds of templates [21:08:31] With optional Mario graphic. [21:08:41] but probably not so good for lua modules [21:09:07] I think you'd want stuff to be parsed locally so parser functions and things are run in local context, not foreign [21:09:34] maybe it should be configurable? [21:09:34] Doesn't that assume a sane local parser? [21:09:41] yeah, that is the difficulty [21:09:47] well, one of many difficulties [21:10:20] If we just spent 30 minutes listing the problems, I think that would be a good use of time. (Not kidding.) [21:10:28] That would allow Lego and I to re-work as necessary. [21:10:31] about the centralized help: https://strategy.wikimedia.org/wiki/Proposal:Wikimedia_Help_%28A_single_wiki_to_centralize_all_of_the_help_content%29#Summary [21:10:43] if you do local parsing, then what do you do when you see a template invocation in the fetched wikitext? 
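The two fetch strategies being weighed just above (render on the foreign wiki vs. fetch wikitext and parse locally) both map onto existing API calls. A minimal sketch, run here against mediawiki.org, with the requests library as an assumed dependency; it ignores purging and templatelinks entirely and only shows where the content would come from.

```python
# Minimal sketch of the two fetch strategies for shadow namespaces,
# using mediawiki.org as the foreign wiki.
import requests

API = 'https://www.mediawiki.org/w/api.php'
INDEX = 'https://www.mediawiki.org/w/index.php'
UA = {'User-Agent': 'shadow-ns-sketch/0.1'}

def fetch_rendered(title):
    """Foreign-side parsing: ask the remote wiki for ready-made HTML."""
    r = requests.get(API, params={'action': 'parse', 'page': title,
                                  'prop': 'text', 'format': 'json'}, headers=UA)
    r.raise_for_status()
    return r.json()['parse']['text']['*']

def fetch_wikitext(title):
    """Local parsing: fetch raw wikitext and parse it on the local wiki,
    which is where the question about template invocations bites."""
    r = requests.get(INDEX, params={'title': title, 'action': 'raw'}, headers=UA)
    r.raise_for_status()
    return r.text
```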
[21:11:07] DanielK_WMDE: for purging, I was thinking on the local wiki you'd have a continuous script listening to rcstream and if a page that your wiki depends upon, it queues up htmlcacheupdate jobs [21:11:26] Continuous listening sounds challenging. [21:11:32] or what do you do if you see a require() in a remotely-fetched module? [21:11:50] TimStarling: You could process it like any other template, check locally and then check foreign repos, so it could allow for sub-overrides and stuff. We would probably want some batch lookup process [21:11:50] those two questions are very closely related in the current Scribunto implementation [21:12:16] since require() actually works by calling the parser's template fetch function [21:12:22] ty helder [21:12:27] so that it will be registered in templatelinks [21:12:32] legoktm: i'd much prefer push notifications. keeping a socket open scanning a cvonstant feed is more maintenance than most people are willing to do, i suppose [21:12:38] but possible, yes [21:13:08] I don't think we need to worry at all about editing right now. [21:13:20] For now I'd assume this is read-only on the local side. [21:13:25] or people could poll on regular intervals. Push will require usage tracking [21:13:47] Not sure why we'd need polling. [21:13:57] legoktm: subscriptions, yes. sonme pub-sub mechanism [21:14:00] We could just use caching like we do with Commons/. [21:14:00] legoktm: you would need to be pretty careful to name modules and templates such that they don't get accidentally overridden [21:14:23] hm... local parsing sounded good at first, but i'm a fraid it's a horrible can of worms [21:14:43] well, for modules, local parsing is essential [21:14:45] Can we talk about specific use-cases? [21:14:57] For user pages, local parsing is probably not needed? [21:14:58] for viewing, the remote can do the rendering [21:15:17] for *using* JS/Lua, whatever, no parsing is needed, no link updates apply, and no templates are being resolved [21:15:32] TimStarling: well when you're creating a new template, you'd see that one already exists at that name on a foreign repo, and choose a different name. iirc you can't upload locally over a commons image unless you're a sysop [21:15:38] Link updates kind of apply if we want to track usage. [21:15:49] I think the mechansims for using code on wiki pages, and viewing wiki pages, are rather separate [21:16:00] I'm not worried about name conflicts. [21:16:07] Most wikis don't have much content. [21:16:15] <^d> legoktm: Yes you can. [21:16:21] <^d> (upload over commons) [21:16:48] You're not uploading over anything, exactly. [21:16:53] <^d> True. [21:17:07] And I think the Commons integration works quite well. [21:17:13] Which is why I'm not really concerned about conflicts. [21:17:18] You need the reupload-shared right for that, which is only sysops on enwp [21:17:50] Restricting page creation is trivial. [21:17:58] What are the hard parts here? [21:18:04] For help pages and user pages I think remote parsing is fine. [21:18:25] How do you track usage? [21:18:33] I don't know if I am totally sold on the idea of the local module namespace being shared with the remote repository [21:18:46] in previous design discussions I think I said that I preferred having a prefix [21:18:57] a global module namespace? [21:19:03] You mean a prefix other than "Module:"? [21:19:24] instead of module: or in addition to module: [21:19:47] I don't think the chance of collisions makes it worrth it. 
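On the purging question a few lines up: the "poll on regular intervals" option needs no open socket and no pub-sub plumbing. A rough sketch, where DEPENDENCIES is a hypothetical map that a real implementation would keep in templatelinks-style tables and the URLs are only examples.

```python
# Polling sketch for cache invalidation: the local wiki periodically asks the
# foreign wiki what changed and purges any local page that shadows it.
import requests

FOREIGN_API = 'https://www.mediawiki.org/w/api.php'
LOCAL_API = 'https://localwiki.example.org/w/api.php'
DEPENDENCIES = {'Help:Links': ['Help:Links']}   # foreign title -> local pages using it

def changed_since(timestamp):
    r = requests.get(FOREIGN_API, params={
        'action': 'query', 'list': 'recentchanges',
        'rcdir': 'newer', 'rcstart': timestamp,
        'rcprop': 'title|timestamp', 'rclimit': 'max', 'format': 'json',
    })
    r.raise_for_status()
    return [rc['title'] for rc in r.json()['query']['recentchanges']]

def purge_dependents(timestamp):
    titles = set()
    for foreign_title in changed_since(timestamp):
        titles.update(DEPENDENCIES.get(foreign_title, []))
    if titles:
        # A real job would batch this; the purge API caps titles per request.
        requests.post(LOCAL_API, data={'action': 'purge', 'format': 'json',
                                       'titles': '|'.join(sorted(titles)),
                                       'forcelinkupdate': 1})
```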
[21:19:58] I think the vast majority of Wikimedia wikis have 0 modules. [21:20:09] So you're kind of solving for a problem that doesn't exist, no? [21:20:52] https://bn.wiktionary.org/wiki/%E0%A6%AC%E0%A6%BF%E0%A6%B6%E0%A7%87%E0%A6%B7:AllPages/Module: [21:21:01] I think probably most of them have more than zero [21:21:27] and the consequences of using the wrong code in an include() are potentially severe [21:21:39] and hard to debug [21:22:06] I'm not too sure how usage tracking would work, we could have sites that use our templates/modules/etc could register in our db somewhere, we give them a token, they send authenticated API requests with said token saying which templates they're using, and we send them push updates in return. [21:22:12] <^d> I wonder how many are original creations and not copy+pasted from other wikis. [21:22:33] Why authentication? [21:22:52] So random spammers can't fake usage [21:23:05] ^d: right, a lot of them are copy-pasted [21:23:06] ...and how many are just reinventing the wheel [21:23:31] or forking due to the originals not working without local customizations [21:23:40] but you can't randomly replace copy-pasted modules with modules of the same name but a different version and expect it to keep working [21:24:06] <^d> I would do it like we do files. Local version takes precedence. [21:24:14] Right. [21:24:15] <^d> So you wouldn't be using the hosted copy until you got rid of your local hack. [21:24:32] ok, well I would like to see table of pros and cons [21:25:01] Pros and cons of what? [21:25:06] ^d: I think that is what most users would expect given how Commons files works [21:25:07] And make an #action item, please. :P [21:25:15] <^d> Helder: Indeed :) [21:25:21] shadow namespaces versus explicit remote transclusion for modules [21:26:43] action commands are meant to have an assignee, should that be Marybelle? [21:27:39] #action Marybelle or someone to give rationale for shadow namespaces versus explicit remote transclusion for lua modules [21:28:08] I think your bn.wiktionary.org link shows an anti-pattern, BTW. [21:28:27] It seems pretty clear that those were lifted from elsewhere. [21:28:34] #info some use cases seem to benefit from remote parsing, e.g. help pages, some seem to need local parsing, e.g. lua modules [21:28:35] Due to the current lack of sharing. [21:29:08] #info cache invalidation by pub-sub? socket? polling? existing rcstream? [21:29:54] Does rcstream exist now? [21:29:59] I thought it was still alpha or something. [21:30:19] #info global gadgets already mostly implemented, split off to a different RFC [21:30:41] it exists, but still in alpha [21:30:46] For Lua modules, Mr. Stradivarius has been doing a good work in creating scripts with subscripts which can be customized by other wikis, so that the main part of the code can just be copyied and only the config parts are edited locally [21:30:55] (I did that for some modules on ptwiki) [21:31:00] Helder: right, but we want 0 copying and pasting. [21:31:08] Copying and pasting is a bad hack. [21:31:16] sure. [21:31:20] yeah, but that work will help regardless of which way we do it [21:31:47] I think for this RFC, doing a first implementation might just be for user pages or help pages. [21:31:55] separating config/localisation from logic is a win whichever way you look at it [21:31:57] And then we could figure out the trickier part later. [21:32:11] Trickier part --> templates and modules. 
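The "local version takes precedence" behaviour ^d describes, mirroring how foreign file repos work, is essentially a lookup order. A sketch; the class and method names are hypothetical, not the eventual core API.

```python
# Sketch of the local-overrides-foreign lookup order for shadow namespaces.
class ShadowNamespaceResolver:
    def __init__(self, local_store, foreign_repos):
        self.local_store = local_store        # e.g. wraps the local page table
        self.foreign_repos = foreign_repos    # ordered list of remote sources

    def fetch(self, title):
        page = self.local_store.get(title)
        if page is not None:                  # a local page always wins,
            return page                       # so local hacks keep working
        for repo in self.foreign_repos:       # then ask each shadow source in order
            page = repo.get(title)
            if page is not None:
                return page
        return None                           # nothing local or foreign: red link
```

With plain dicts standing in for the stores, ShadowNamespaceResolver({'Help:Links': 'local copy'}, [foreign_dict]).fetch('Help:Links') returns the local copy, and only falls through to the foreign repo when no local page exists.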
[21:32:22] #info I think for this RFC, doing a first implementation might just be for user pages or help pages. [21:32:46] Yeah, so maybe restrict the applications of shadow namespaces for now. [21:32:56] The other question is whether we'd want to rejigger fForeignFileRepos. [21:33:00] To be less file-centric. [21:33:06] I think we should [21:33:09] ok, for time management purposes, please say something if you want to talk about "Opt-in site registration during installation" today [21:33:14] personally, I'm more interrested in global preferences than in a global user page... [21:33:15] Like create $wgForeignRepos and then have NS_FILE overridden. [21:33:30] in the channel or by private message to me [21:33:31] DanielK_WMDE: Yeah, that's a separate issue, though. :-) [21:33:46] wherever there's an if branch special casing NS_FILE, it should probably be generalized into any shadow namespace [21:34:35] legoktm: So maybe you and I can get together soon and figure out how to flesh out the RFC? [21:34:45] sure [21:34:49] DanielK_WMDE: I think global user page is mostly done in an extension (by legoktm ?) [21:34:58] It is, but the implementation is weak. [21:35:03] The RFC should be clearer about that. [21:35:15] It's fine for now, but would be redone. [21:35:23] Extesnion:GlobalUserPage is mostly done, but would significantly benefit from a strong implementation in core rather than hacking around hooks and stuff. [21:35:23] TimStarling : hexmode and I are here to talk about opt-in [21:35:45] (for management reasons) [21:35:49] :) [21:35:54] ok, anything else for the notes before we move on? [21:36:01] I think we're good. [21:36:09] We may need one more sanity check before Lego starts development work, though. [21:36:17] #topic Opt-in site registration during installation | RFC meeting | PLEASE SCHEDULE YOUR OFFICE HOURS: https://meta.wikimedia.org/wiki/IRC_office_hours | Please note: Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE) | Logs: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-office/ (Meeting topic: RFC meeting) [21:36:25] #link https://www.mediawiki.org/wiki/Requests_for_comment/Opt-in_site_registration_during_installation [21:36:56] <^d> So, we talked about doing this during GSoC one year and during new-installer. Just never got to it. [21:37:12] <^d> But I'm +1 to the idea and have been since the start, long as we keep it opt-in. [21:37:31] Don't think we could do anything *but* opt-in [21:37:31] I see hexmode made a few changes in the last 24 hours [21:37:37] Is the idea to have the installer ping the central server once? [21:37:40] we'd have a revolt [21:37:42] will there be another pin during updates? [21:37:45] <^d> DanielK_WMDE: Yeah, that was the idea. [21:37:49] DanielK_WMDE: kind of [21:37:54] there might be updates [21:38:01] Register once, be registered forever?... [21:38:03] e.g. after LocalSettings change [21:38:16] There should be a later opt-out [21:38:17] Dantman mentioned that on the talk page, I believe. [21:38:25] DanielK_WMDE: that was originally the idea. But comments show that maybe install isn't the right time for everyone [21:38:31] I was thinking that perhaps the central server could scan all registered wikis once in a while. [21:38:32] Initial wiki creation stats would likely be useless. [21:38:42] ask the api for installed version, extensions, etc (if exposed) [21:38:55] may want to include a secret token for this [21:38:56] wikiapiary already does that. 
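The ping-back on the table (register once at install time, with a secret token so later updates or polling can be authenticated) is a small amount of code. A hedged sketch; the registry endpoint and payload shape are pure assumptions, not an agreed interface.

```python
# Rough sketch of the one-time opt-in ping-back from the installer or a
# post-install job. The registry URL below is a placeholder.
import secrets
import requests

REGISTRY = 'https://wikiapiary.example.org/api/register'   # placeholder endpoint

def register_wiki(wiki_url, mw_version, php_version, extensions, private=False):
    token = secrets.token_hex(32)               # stored locally, sent with future pings
    payload = {
        'url': None if private else wiki_url,   # a "private" wiki can withhold its URL
        'mediawiki': mw_version,
        'php': php_version,
        'extensions': sorted(extensions),
        'private': private,
        'token': token,
    }
    requests.post(REGISTRY, json=payload, timeout=10).raise_for_status()
    return token
```

The same token could later authenticate the opt-in "send data on a regular basis" job, or let the central server poll the wiki's api.php where that is reachable.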
[21:39:06] Marybelle: at least, an installation would inform wikiapiary about its existence [21:39:07] DanielK_WMDE: We already have people scanning [21:39:20] wikiapiary could then include it in the scanning [21:39:23] i know. and a request to lock down special:version per default [21:39:32] Authentication and tokens are overhead. Wikis were designed for openness. [21:39:34] the idea is to also gather information about inhouse wikis [21:39:35] this would just collect a list that we could share with trusted parties (wikistats, wikiapiary, etc) [21:39:43] An automated directory of just wiki names and urls on mediawiki.org would be nice [21:39:55] For what purpose? [21:39:56] and then wikiapiary can grab dumps of the list and do their own thing [21:39:57] the data about this could only be collected if the wiki pushes info to wikiapiary [21:40:04] I think we need to look more closely at specific use-cases here. [21:40:06] <^d> lists of wikis aren't very interesting or useful. [21:40:07] mglaser: right. My update mentioned an option for marking your wiki private [21:40:14] hexmode: i could imagine wiki -> server ping during install / update, and regular server -> wiki sweeps later. The registration would set a token to use for the sweeps. [21:40:17] Marybelle usage statistics [21:40:23] ^d: they're useful for usage info :P [21:40:29] Usage statistics for what purpose? [21:40:34] Gather information about "our customers" [21:40:48] And do what with it? [21:40:50] So we have a good idea what sort of db are supported [21:40:50] Marybelle: how many people use an extension, use what php version [21:40:51] in order to know what kinds of developments to focus on [21:40:56] Make back-compat breaks [21:41:01] and what sort of php versions we should consider [21:41:04] Marybelle: https://phabricator.wikimedia.org/T34054 [21:41:10] i think it was closed prematurely [21:41:20] like the bump from 5.3.2 --> 5.3.3, we looked at how many people were actually using the latest stable with 5.3.2 [21:41:24] know if there are older MediaWiki versions that are used widely and should be supported [21:41:27] somehow [21:41:33] legoktm: especially for extensions, installation is not the best time. [21:41:42] (1.16) :P [21:41:57] DanielK_WMDE: I think we could track changes in LocalSettings [21:42:03] If there are widely used older versions, we should figure out how to get people to upgrade...not support them :/ [21:42:05] (by file date or so) [21:42:13] it's already used for cache invalidation [21:42:22] <^d> 1.6.x, duh. [21:42:29] <^d> only version still working on php4. [21:42:29] mglaser: yes, but what would do the actual ping? the next request? a cron job? [21:42:31] legoktm: agreed. Now, where is my kudgel [21:42:43] mglaser: also, updating an extension doesn#t change LocalSettings [21:42:49] DanielK_WMDE: could be in the job queue [21:43:00] hexmode: so, the next request (or some later) [21:43:03] DanielK_WMDE : how about a job in the job queue? that would also work without cron, right? [21:43:12] DanielK_WMDE: true [21:43:17] DanielK_WMDE: Thanks for the link. I re-opened the task and commented. [21:43:19] jinx! [21:43:31] we could also send data on a regular basis [21:43:38] if the admin opts in to that [21:43:38] Some wikis use cron to execute jobs. [21:43:47] should this be in core or in an extension bundled in the tarball? [21:43:52] I don't think it should be the resposibility of the wiki to send usage info [21:44:07] After opting in, you mean? [21:44:11] mglaser: why not *ask* for data on a regular basis? 
the central server already knows the wiki's url, right? [21:44:13] I think they should send their url and some other basic info (public/private) to the central server, and that server takes care of geting more detailed info [21:44:19] TimStarling : a bundled extension seems to be a nice solution [21:44:19] <^d> TimStarling: If we wanted the installer to be able to do it it'd likely need to be in core as the code stands. [21:44:23] legoktm: indeed. [21:44:24] <^d> Nowhere for an extension to hook in. [21:44:40] legoktm: I think it would be useful to send it if ok'd and necessary [21:44:42] DanielK_WMDE: there are a lot of the wikis that are behind firewalls and cannot be polled [21:44:42] How do we update their entry if they switch domains? [21:44:47] extension seems reasonable if it is a cron-like thing after install, to monitor extension usage [21:44:49] hexmode: why? [21:44:56] mglaser: so we would fail to poll them. nothing lost [21:45:05] A private wiki is private. [21:45:10] firewall. moving sites, etc [21:45:11] <^d> I don't like the idea of tracking the domain at all. [21:45:12] DanielK_WMDE : nothing but the information we want to get ;) [21:45:18] <^d> I'd much prefer the data be anonymized. [21:45:35] How else would you poll the wiki's api.php? [21:45:39] mglaser: you mean we shouldn't use polling *instead* of pushing. ok. so how about doing both? [21:45:48] both mechanisms are brittle. [21:45:54] both together are less brittle [21:46:01] DanielK_WMDE: , both is fine with me [21:46:02] <^d> Have MW's installer push its initial registration. [21:46:02] Lies, damned lies, and statistics. :P [21:46:08] <^d> Then mw.org can do he polling. [21:46:25] ^d or wikiapiary [21:46:41] ^d: ...except for firewalled wikis, as mglaser noted. so the wiki should push updates "sometimes". [21:46:59] What would mediawiki.org be polling? The anonymized hostname? [21:47:09] mglaser: the installer could actually let you choose. but then you'd get the data fragmented. [21:47:13] <^d> Marybelle: Yeah ;-) [21:47:20] :-) [21:47:23] <^d> Is the number of firewall'd wikis that want to opt into stats reporting non-zero? [21:47:36] <^d> I feel like most wikis running inside a corp intranet probably won't opt-in. [21:47:37] Marybelle: the full wiki url. supplying that would be optional of course [21:47:45] ^d definitely [21:47:53] ^d: Yes [21:47:59] ^d: I think for the purpose of making it possible, we should assume a non-zero number [21:48:13] ^d, mglaser: they might if they could get something in return. like specific security alerts. [21:48:14] I agree with Daniel about pushing and polling. [21:48:17] as an educated guess, say that's 20%-40% of all installations [21:48:24] I think we'll need to do both for this to work sorta well. [21:48:46] hm... tailored alerts for registered wikis seems an idea worth exploring [21:48:57] especially wrt extensions. [21:49:00] DanielK_WMDE: I think they are willing to give the data as long as it is clear what kind of info is sent [21:49:02] WordPress does that, kind of. [21:49:08] for core there's the mailing list, but what about critical fixes to extensions? [21:49:29] I mean, some of this kind of ties in to the idea of having MediaWiki say "you should update PHP" or "you should update MediaWiki". [21:49:38] If you're gonna be passing messages back and forth already... [21:49:38] <^d> DanielK_WMDE: You're assuming anything other than core matters :) [21:49:48] mglaser: that definitely helps. but... 
then it should also be visible *when* that data is being sent [21:49:51] Marybelle++ [21:50:00] like, mail a copy to root@localhost or something [21:50:07] If this integrated with something to tell you when your extensions / core was outdated, that would be very helpful [21:50:11] DanielK_WMDE : +1 [21:50:23] csteipp : also +1 [21:50:41] DanielK_WMDE: or the email given to user 1 when installing? [21:50:53] yes, even better [21:51:00] Informing users of out-of-date software is a tangiible, direct benefit/use-case, IMO. [21:51:21] could be done on-wiki to a special group ( [21:51:29] ...to which the initial user is added [21:51:32] but if you have mediawiki.org push that information then you have to authenticate it [21:51:42] but could also be done by email... [21:51:42] if you have the remote wiki pull it, then it is easier to authenticate [21:51:43] Authenticate what? [21:52:11] you can just do an API request to https://www.mediawiki.org [21:52:17] So you think email could be a medium? instead of api calls or such? [21:52:33] if mediawiki.org posts to the API, then how does the receiving wiki know it is mediawiki.org posting? [21:52:38] I'm lost about how e-mail entered the discussion. [21:52:41] that's surely more failsafe behind firewalls [21:52:56] q: should security notifications (or similar) be a pre-requisite for implementation of this? Or can we say, "we'll start doing this X time in the future"? [21:52:56] TimStarling: sure, the only trouble is automating that through a firewall, without a cron job, and in a way that doesn't scare admins of a coorperate intranet [21:53:01] Marybelle : no worries about the history :) how about the idea? [21:53:28] otoh, maybe we are making this too complicated. [21:53:36] mglaser: I'm not sure what the e-mail body would contain. [21:53:37] how about a special page, with a big "check for updates" button [21:53:41] Or who would be reading the e-mail. [21:54:02] which opt-in for the server to keep the info you send [21:54:13] The main idea is to get this info on wikiapiary.org in the first place [21:54:14] Sounds reasonable to me. [21:54:16] Initial implementation: special page to check for updates and installer note about going to that page first? [21:54:23] i think people would love that, and it's way easier to implement [21:54:25] Then the question becomes how to integrate the Special page with the installer. [21:54:33] and assures people that they are in control [21:54:55] Marybelle: either have a special case for that, or somly don't do that in the installer [21:55:03] just put a link on the default main page [21:55:16] "go here to check for updates. do that regularly". [21:55:36] +1 [21:55:37] So Special:Update or similar? [21:55:39] you could put it on the login success page of users in some group [21:55:42] (just trying to dumb this down to an MVP) [21:55:57] OK, but I still think new installations should register with wikiapiary [21:56:04] I think we eliminated the login success page for most users. [21:56:07] Because it was awful. [21:56:08] Marybelle: yea... except that it doesn't actually update :) [21:56:11] but I'd like a note like "It has been a month since you checked..." [21:56:16] Special:Version, then. ;-) [21:56:18] as of now, wikiapiary doesn't crawl, but just lists wikis that registered [21:56:20] yeah, just for users in the group [21:56:33] so the more wikis we get to register, the better our statistics [21:56:37] mglaser: not true. 
it does crawl [21:56:41] (wikiapiary's statistics) [21:56:56] or you could have a login success overlay on the next page view [21:57:00] hexmode : ok. Still registration helps, right? [21:57:11] Re: stats, we're dealing with a nasty long tail no matter what, I think. [21:57:26] mglaser: definitely. It only crawls wikis it has been told about. [21:57:40] tracks mw version over time [21:57:55] can we extract any principles from this discussion? [21:57:58] So this would be MediaWiki (the software package) sending date to mediawiki.org or some other random third-party site (wikiapiary.org?)? [21:58:10] Marybelle: mw.o [21:58:13] s/date/data/ [21:58:18] Okay. [21:58:20] the implementors should presumably lead the design process [21:58:31] Marybelle: intrgrating with Special:Version sounds nice, but we still need a way to poke people about going there [21:58:42] we've had some good brainstorming here, but it would be good if we had some general principles to guide hexmode and mglaser [21:58:48] I was mostly kidding that "Version" was already taken if "Update" is too unclear. :P [21:59:05] Marybelle: just put a big botton on Special:Version. why not? [21:59:27] With a bit of AJAX, you could put the up-to-date version number next to the current version number. [21:59:37] And do highlighting in red for big differentials. [21:59:42] also please do #info, #action etc. to wrap up since we are out of time [22:00:26] #action add "Check for updates" button to Special:Version for users in proper group [22:00:38] Not just the button, we need a checkbox as well. [22:00:50] ? [22:00:55] #info implementation must be opt-in [22:00:59] The checkbox was to opt in to something more regular, I think? [22:01:02] I forget, now. [22:01:05] ah [22:01:06] It's somewhere in scrollback. [22:01:51] #action add checkbox to opt-in to regular update checks without intervention [22:02:06] Will you all need design help? [22:02:15] If so, we can probably get that, we just need to know. [22:02:47] [x] Allow $server to collect statistics [22:02:47] #info installer ping back seems to be desired in most proposals here [22:02:47] [x] Allow $server to poll this wiki using $url [22:02:47] #info make clear who is being polled and what mw is sending [22:03:08] design help would help [22:03:13] #info polling of opted-in wikis via API would help to monitor extension usage of public wikis [22:03:14] :P [22:03:22] i think installer integration is cool, but not needed for version 0.1 [22:03:36] Agreed. [22:03:43] But I would like this to be somewhat prominent. [22:03:51] I liked the idea about putting the link on the main page, for example. [22:03:53] #info installer integration not needed for first implementation [22:04:11] I was less hot on the idea of re-introducing a post-login screen, but maybe it's not so bad. [22:04:29] #info poke admins every n weeks to check for updates (after login?) [22:04:43] a post-login *page* is probably old-fashioned [22:04:46] might want an extra permission for that [22:05:00] TimStarling: agreed [22:05:40] ok, thanks everyone [22:05:47] ty [22:05:53] next week we are going to post some candidates to wikitech-l [22:06:00] we are up to the very oldest entries in the backlog now [22:06:19] so we want to know if there is interest in reviving any of them [22:06:22] TimStarling: I'm hoping to clean up the RFC indices soon. [22:06:34] Like currently they're manual (the main page and the archive page). [22:06:40] Should we post a summary to wikitech-l? 
[22:06:46] Marybelle: sounds simultaneously useful and scary [22:06:48] It's annoying me enough that I'm probably going to make it non-manual. [22:06:59] I'll try not to be too disruptive. :-) [22:07:53] #endmeeting [22:07:54] Meeting ended Wed Dec 3 22:07:53 2014 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4) [22:07:54] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-12-03-20.58.html [22:07:54] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-12-03-20.58.txt [22:07:54] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-12-03-20.58.wiki [22:07:54] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2014/wikimedia-office.2014-12-03-20.58.log.html [22:08:00] Summary to wikitech-l is always welcome.
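The "check for updates" button discussed for Special:Version needs little more than a version comparison. A rough sketch, reading the local version from the wiki's own meta=siteinfo API and assuming a hypothetical JSON release feed (MediaWiki does not currently publish its stable release at this URL).

```python
# Sketch of the Special:Version update check: compare the running version
# against the latest stable release from an assumed release feed.
import re
import requests

RELEASE_FEED = 'https://www.mediawiki.org/release-info.json'   # placeholder URL

def local_version(api_url):
    r = requests.get(api_url, params={'action': 'query', 'meta': 'siteinfo',
                                      'siprop': 'general', 'format': 'json'})
    r.raise_for_status()
    generator = r.json()['query']['general']['generator']   # e.g. "MediaWiki 1.24.1"
    return generator.split()[-1]

def parse(version):
    return tuple(int(n) for n in re.findall(r'\d+', version)[:3])

def needs_update(api_url):
    latest = requests.get(RELEASE_FEED, timeout=10).json()['stable']   # assumed shape
    current = local_version(api_url)
    return parse(current) < parse(latest), current, latest
```

The same comparison could feed the AJAX highlighting mentioned in the meeting, colouring the local version red when it lags the latest stable release.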