[09:34:03] Lydia_WMDE: Why is https://phabricator.wikimedia.org/T128100 high priority? I suppose we/you knew this
[09:36:00] Adrian_WMDE: ah sorry - I thought the problem was that the link was actually showing up twice at the bottom
[09:36:15] it's ok then
[09:36:17] changing back
[09:36:30] Ok, that's what I thought :)
[09:36:49] I was thinking about this a bit more after the discussion with daniel
[09:37:06] Is there a long-term idea or design for terms?
[09:37:50] Adrian_WMDE: parts of it - we can look at what henning did on tuesday
[09:38:02] Ok
[09:39:16] I was wondering if we wouldn't want to give people basically a suggester where they can select a new language to add
[09:39:30] That would support https://phabricator.wikimedia.org/T126510
[12:26:22] Hello, wikibase tests are (again) blocking all merges for ContentTranslation: https://integration.wikimedia.org/ci/job/mwext-testextension-php55/2756/console
[12:30:12] Nikerabbit: we know :/
[12:30:18] working on it
[12:31:12] aude: I'll look into the Capiunto stuff
[12:31:16] tried to post that to wikidata-tech, but I can't from my wikimedia.de address
[12:31:22] hoo: yay
[12:31:22] Too many mail addresses *sigh*
[12:31:35] hopefully it's obvious and easy to fix
[12:32:05] aude: thanks
[12:35:08] aude: it is: https://gerrit.wikimedia.org/r/273446
[12:38:24] hoo: ok
[12:40:33] * hoo rages
[12:41:19] :(
[12:41:33] So painful to work around the Scribunto test base
[12:41:42] * aude has "interpreter for LuaSandbox is not available"
[12:41:48] especially as different phpunit versions do different things
[12:41:50] need to set that up
[12:41:58] nah, standalone is fine
[12:42:53] some tests are skipped, but I also see failures
[12:43:01] on master
[12:43:10] :(
[12:44:47] (I might time out in a bit... area with no cell phone reception ahead)
[12:44:58] * aude eating
[12:45:19] so can't 100% look at this stuff right now
[12:45:51] I can only really make my own phpunit happy... the one on jenkins can act differently
[12:46:11] but I think the version I uploaded now should work around all issues
[12:46:16] ok
[12:47:44] Wow... I'm still connected... weird
[12:48:16] :)
[12:52:28] hoo: I think your connection is more reliable than the office wifi :/
[12:52:41] * aude got disconnected and is on a wired connection now
[12:53:24] I haven't yet worked it out... but I think it depends on both the weather and the type of (train) car I'm in
[12:53:31] hmm
[12:53:56] Also they sometimes make the train stop in the middle of that area with no reception
[12:54:06] annoying if you want to work or read emails or so
[12:54:13] yeah
[12:54:32] * aude always has problems on the train or bus between nyc and dc, especially in new jersey :P
[12:55:18] hm... I had ok-ish reception with t-mobile on the new jersey turnpike
[12:56:28] aude, I'm back. Did you realize that https://gerrit.wikimedia.org/r/#/c/273244/ and the other changes were not merged?
[12:57:17] More Scribunto related failures... yay :S
[12:57:35] physikerwelt: we are working on it
[12:57:49] now hoo is fixing some stuff
[12:58:15] The Capiunto change should be fine to merge, passes locally and on Jenkins
[12:58:33] I love problems that are being fixed without me doing anything :-)
[12:58:57] :)
[13:00:06] I see the other failure
[13:00:25] ok
[13:00:31] * aude still eating
[13:00:49] Will look into it more when home
[13:00:53] k
[13:01:06] See you in a bit o/
[13:01:28] ok
[13:28:26] so..... a new build needs https://gerrit.wikimedia.org/r/#/c/273244/ etc.
[13:29:53] and I think https://gerrit.wikimedia.org/r/#/c/273244/ needs a new build???
[13:31:13] * aude gives jenkins another try
[13:40:16] ok, making a new build
[14:35:45] jzerebecki: Are you in the office and/or working?
[14:51:36] Thiemo_WMDE: yea, if you came back
[14:56:54] http://git.wikimedia.org/commitdiff/integration%2Fjenkins.git/d7bdc8f1a77eec58832bad3684b6c01e6720a639
[15:09:45] https://phabricator.wikimedia.org/T128191
[15:15:04] jzerebecki: https://gerrit.wikimedia.org/r/#/c/273458/
[15:34:03] jzerebecki: Thiemo_WMDE now we need https://gerrit.wikimedia.org/r/#/c/273448/
[15:34:09] then can make a new build
[15:34:58] thanks
[15:38:09] jzerebecki: https://gerrit.wikimedia.org/r/#/c/272952/
[15:46:00] Lydia_WMDE: Hi
[15:46:24] Good day, I received your email. Thanks for the work. I will reply to it soon.
[15:47:01] Lydia_WMDE: Do you think, since we have agreed to work on the project during the summer, we can move it to the GSoC 2016 work board?
[15:47:18] d3r1ck: yeah
[15:48:49] I just wrote a comment under the task. Should I do that or should I let you do it?
[15:50:49] please do :)
[15:53:45] Lydia_WMDE: should I leave the priority as low? Or step it up to normal
[15:53:46] ?
[15:54:06] it doesn't matter, I think
[15:55:28] Ok, as you can see, I have moved the task to the GSoC 2016 work board. Hope we are good to go. I saw another mentor you added in the mail
[15:55:39] That makes 4 mentors and 1 mentee :)
[17:54:46] hoo: Hi
[17:54:58] hi d3r1ck
[17:56:06] hoo: it's been a while, are you following our email concerning the project?
[17:57:06] I did, yes
[18:03:43] hoo: Ok, hope you are ok with it. :)
[18:04:13] I just need to get in touch with Stephen Laporte and Benedikt
[18:04:25] hoo: do you know them?
[18:04:35] Sounds good to me
[18:04:36] sure
[18:04:45] benestar|cloud is Benedikt
[18:04:56] hoo: ahhh, cool.
[18:05:21] benestar|cloud: Hi, hope all is good :)
[18:06:10] * d3r1ck will be right back
[18:20:46] hoo: Any idea where I can get an overview of the types of badges and the images in use? Seems to be hardcoded somewhere :-(
[18:23:00] multichill: The badges on Wikidata or the stuff we display on the clients?
[18:23:32] Wikidata
[18:24:50] gerrit.wikimedia.org/r/mediawiki/extensions/Wikidata.org
[18:25:37] Hmm, I see only good and featured article are good for me in this case. I'll just use https://commons.wikimedia.org/wiki/File:Cscr-featured.svg and the good one
[18:27:44] multichill: https://www.wikidata.org/w/api.php?action=wbavailablebadges
[18:27:58] that only lists them and not an overview, though
[18:28:17] Works, thanks :-)
[18:28:42] Just realized badges are not implemented in pywikibot yet :-(
[18:30:12] aude: That would be https://phabricator.wikimedia.org/T114473 I guess
[18:31:32] :/
[18:31:47] a special page wouldn't be difficult to add
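The `wbavailablebadges` module hoo points at above returns only the badge item IDs, so getting the overview multichill asked for takes a follow-up `wbgetentities` call to resolve labels. A minimal sketch in Python, assuming nothing beyond those two standard API actions (the helper name is illustrative):

```python
import requests

API = "https://www.wikidata.org/w/api.php"

def badge_overview(lang="en"):
    """Return {badge item ID: label} for the badges available on Wikidata."""
    # wbavailablebadges returns just the item IDs, e.g. ["Q17437796", ...]
    badges = requests.get(API, params={
        "action": "wbavailablebadges",
        "format": "json",
    }).json()["badges"]

    # Resolve those IDs to human-readable labels with wbgetentities.
    entities = requests.get(API, params={
        "action": "wbgetentities",
        "ids": "|".join(badges),
        "props": "labels",
        "languages": lang,
        "format": "json",
    }).json()["entities"]

    return {qid: e.get("labels", {}).get(lang, {}).get("value", "")
            for qid, e in entities.items()}

for qid, label in badge_overview().items():
    print(qid, label)
```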
[18:37:24] I'm still confused how it can be that the company revenues have not been imported to Wikidata yet
[18:38:29] Hmm https://www.wikidata.org/wiki/Wikidata:List_of_properties/Organization
[18:44:00] We're at 117 external id properties now
[18:44:05] I'll do further conversions tomorrow
[19:12:20] Adrian_WMDE: you pinged me the other day but then weren't around every time I was, wondering what you wanted :)
[19:18:26] multichill: the thing you mentioned on the bot requests page about adding instance of based on namespaces, do you check the namespaces of all the sitelinks? I sometimes come across things where namespaces got mixed up (e.g. where one sitelink is a template and the rest are categories) but I've not found a way to find things like that yet
[19:22:55] So easy to use https://www.sec.gov/dera/data/financial-statement-data-sets.html
[19:42:41] nikki: Nope, I only work on items with zero claims. The error rate is quite low
[19:42:53] And I have a report somewhere to find this kind of cross-namespace issue
[19:44:02] nikki: https://www.wikidata.org/wiki/User:Multichill/Cross_namespace , should probably update it again....
[19:44:31] that'd be nice :)
[19:46:31] Ok. It's running. Not sure how long it will take. Some language versions really like to mix up their links so the list is probably much much longer.....
[19:52:36] nikki: 1270 lines is the new list at https://www.wikidata.org/wiki/User:Multichill/Cross_namespace
[19:52:40] So knock yourself out ;-)
[19:53:05] thanks :)
[19:53:20] not sure how many I'll do, but I guess anything is better than nothing
[19:54:33] Some of them have 40 lines because it's 40 sitelinks
[19:54:42] Those are easy to score
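A rough sketch of the kind of check behind a report like multichill's Cross_namespace page: pull an item's sitelinks, ask each client wiki which namespace the linked title lives in, and flag items whose sitelinks don't all share one namespace. The API calls are standard MediaWiki/Wikibase ones, but the function names and the dbname-to-domain guess are made up for illustration; a real report would be built from dumps rather than live per-item requests:

```python
import requests

def sitelink_namespaces(qid):
    """Map each Wikipedia sitelink of a Wikidata item to its namespace number."""
    sitelinks = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities", "ids": qid,
        "props": "sitelinks", "format": "json",
    }).json()["entities"][qid]["sitelinks"]

    namespaces = {}
    for dbname, link in sitelinks.items():
        # Crude dbname -> domain guess ("enwiki" -> "en.wikipedia.org");
        # a real script would resolve this via the sitematrix instead.
        if not dbname.endswith("wiki"):
            continue
        domain = dbname[:-4] + ".wikipedia.org"
        pages = requests.get(f"https://{domain}/w/api.php", params={
            "action": "query", "titles": link["title"], "format": "json",
        }).json()["query"]["pages"]
        namespaces[dbname] = next(iter(pages.values()))["ns"]
    return namespaces

def is_cross_namespace(qid):
    """True if the item's sitelinks span more than one namespace."""
    return len(set(sitelink_namespaces(qid).values())) > 1
```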
[20:23:59] nikki: Linked the stats pages from https://www.wikidata.org/wiki/User:Addshore/Identifiers/2 and the likes
[20:24:08] Makes it easier to cherry-pick properties to convert
[21:40:30] nikki: still there? :)
[21:40:34] yep
[21:40:42] cool :)
[21:41:06] I wanted to talk to you about ko-kore
[21:41:19] ah
[21:42:30] because kore is already the default script for ko
[21:42:41] So I don't think it makes sense to add ko-kore
[21:47:21] I thought kore would be more suitable because the hanja name can still include hangul if there's no hanja for those syllables (so it ends up being han + hangul, which is what kore is), and I don't remember seeing anything other than hangul for normal Korean labels (etc)
[21:48:09] although I still personally think it would be better as a property, because it should be linked to the corresponding normal hangul name, but my proposal was opposed :/
[21:49:00] I see
[21:49:48] Then maybe we should rather introduce ko-hang?
[21:51:42] ko-hani is definitely used by other people
[21:52:30] ko-hang is used on enwp
[21:54:19] I don't think it would make much sense that way round because Korean is normally written in hangul and I don't think people would expect ko to be the hanja name
[21:54:40] so if we really can't use kore, hani would be better than hang
[21:55:57] But you said that the content could end up actually being han + hangul
[21:56:08] In that case, hani wouldn't be correct
[21:56:34] Why not keep both as just ko?
[21:58:37] because then you can't tell which is which
[21:59:16] like if you want to generate a table like https://en.wikipedia.org/wiki/Special_cities_of_South_Korea#List_of_metropolitan_cities
[22:00:38] You could use qualifiers for that
[22:03:48] it seems strange to use qualifiers to mark the script for some cases and not others...
[22:03:50] * nikki shrugs
[22:04:43] I guess I'm not very good at arguing that we should add it because I still think it should be a property
[22:06:20] but filceolaire wanted it to be a language code and nobody else commented on it, so I gave up
[22:06:54] I think the point is that this is not about what value it is, but about which role it plays
[22:07:33] And for that, using a specific property or a qualifier seems to be ok
[22:08:02] If you make the script part of the value, you also have to provide a reference for it (for example)
[22:08:18] I mean, it's not wrong to say it's 'ko' or even 'ko-kore'
[22:08:45] If your source says that's 'the Korean name' then using 'ko' is exactly the right thing
[22:09:15] If your source says that's 'the Korean name in Hangul' then 'ko-hang' would be the right thing, likewise for Hanja
[22:09:42] So, what you want to use in a specific place in your infobox shouldn't be determined by the language code used for the value imho
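To make the qualifier idea concrete, here is roughly what such a statement could look like in Wikibase's JSON model, sketched as a Python literal: both name forms stay plain `ko` monolingual text, and a qualifier carries the script. P1705 ("native label") and P282 ("writing system") are the properties I'd reach for, but treat them and the hanja item ID as illustrative choices, not a settled convention:

```python
# Sketch: the hanja form of a Korean name kept as a monolingualtext value
# tagged plain "ko", with a "writing system" qualifier saying which script
# it is written in. Property and item IDs are illustrative assumptions.
hanja_name_statement = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P1705",  # native label (illustrative choice)
        "datavalue": {
            "type": "monolingualtext",
            "value": {"language": "ko", "text": "釜山廣域市"},
        },
    },
    "qualifiers": {
        "P282": [{  # writing system
            "snaktype": "value",
            "property": "P282",
            "datavalue": {
                "type": "wikibase-entityid",
                "value": {"entity-type": "item", "id": "Q..."},  # placeholder: the item for hanja
            },
        }],
    },
    "type": "statement",
    "rank": "normal",
}
```

An infobox could then pick the hangul and hanja rows by qualifier value rather than by language code, which is the separation hoo argues for above.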
[22:16:00] in my experience, when hanja names are shown, they're normally written in brackets after the hangul name, or the hangul and hanja names are presented in two separate columns of a table (like the one I linked), or something similar along those lines (which is basically why I think it should be a property :P it's not really used independently of the hangul name as far as I can tell)
[22:17:43] Yeah, I understand that
[22:18:24] But I don't think you can determine which one is 'the hangul name' or 'the hanja name' based on the language code given in the monolingual text value, even if we added script subtags
[22:18:54] i.e. I also think you should have a specific property or a qualifier for this
[22:25:51] I do suspect that if we did have two values, ko would end up meaning the "normal" hangul name, but I'm not going to talk you out of what sounds like support for my proposal :P maybe I should repropose it at some point with more of an argument about why I think it should be a property and not just a language code
[22:26:05] but not right now
[23:01:03] nikki: I have to go to bed, but please ping me if you do anything on this topic
[23:01:14] will do :)