[04:53:11] Lydia_WMDE: I see no mail?
[04:54:22] about your message sent 9 hours ago
[05:07:37] if I write "Satyrus" one more time... lol
[05:41:27] Lydia_WMDE: also can you paste the content of those weekly summary pages into the announcing mail?
[05:41:29] like what [Wikitech-l] Deployment Highlights does
[09:56:02] Tpt_: have you read https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/ImplicatorBot ?
[09:56:35] legoktm: No, I didn't
[09:57:04] Ok well, just a heads up so you know what you might be in for :/
[09:58:20] Ok, thanks. But I think it's less problematic (very simple assertion)
[09:58:42] Agreed. I just hope everyone else thinks that :)
[10:32:06] multichill: are you there?
[10:32:13] Yes, I am
[10:32:24] hi
[10:32:34] see this: http://www.wikidata.org/wiki/User_talk:Dexbot#Your_creations
[10:33:56] That's only a few compared to the total number of articles I guess
[10:35:54] Amir1: Yeah, you have quite a few deleted contributions
[10:36:07] With tag 2013-06-25, I'm getting the message «This script must be run from the command line» when opening index.php on a fresh installation.
[10:36:09] you've got "735.843" articles? o.O
[10:36:09] Why is this?
[10:36:47] multichill: I'll check how many are left un-imported
[10:37:17] Amir1: What you could do is check in some wikis if an article with the exact same name exists
[10:37:44] DanielK_WMDE_: If you are there, any idea?
[10:38:00] multichill: which wikis?
[10:38:13] a user suggested I should skip this cat http://nl.wikipedia.org/wiki/Categorie:Wikipedia:Beginnetje_biologie
[10:39:06] That's good in the meantime
[10:50:17] (fixed)
[11:00:15] Change on mediawiki a page Extension:Wikibase was modified, changed by LivingShadow link https://www.mediawiki.org/w/index.php?diff=719945 edit summary: [+521] /* Installation */ Note: Wrong entries by mediawiki setup
[15:05:12] New patchset: LivingShadow; "Documentation for RepoApi.createClaim extended" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71175
[15:18:50] New review: Hoo man; "(1 comment)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71175
[15:28:59] i'm trying to enter a claim with the property 'EC number' in http://www.wikidata.org/wiki/Q420032, but the input box for the property's value isn't coming up. in other words, i'd like to enter 'EC number' = '3.4.21', but the box where i would input '3.4.21' doesn't appear when i select that i'd like to add a claim for the property 'EC number'. is anyone else able to add that claim to http://www.wikidata.org/wiki/Q420032?
[15:30:48] nevermind, the bug magically disappeared
[15:32:06] wait, no, it didn't
[15:46:45] Hello?
[15:47:51] Can anyone tell me if they think something is a hoax wikidata item?
[15:47:59] i don't know how to AfD an item
[15:48:19] I guess I can just delete it, since I'm a global sysop... is that ok?
[15:49:34] https://www.wikidata.org/wiki/Q13558847
[15:49:49] PiRSquared: We have no objections to GS deleting but we prefer local sysops to deal with it. I'll look now.
[15:50:37] PiRSquared: Why do you say that is a hoax?
[15:50:52] JohnLewis: look at this - https://sco.wikipedia.org/wiki/Kipter
[15:50:59] I can't find any other references to him online
[15:51:08] also, the image is of James II of Scotland
[15:51:33] And it was deleted on frwiki - https://fr.wikipedia.org/wiki/Kipter
[15:51:48] Yep. Seems like a hoax.
[15:51:57] and dewiki - https://de.wikipedia.org/wiki/Kipter
[15:52:15] not sure, though
[15:53:31] I've deleted it. Those two existing articles may need to be deleted as hoaxes. If it turns out to be a legit item (though I doubt it) it can be undeleted.
[15:54:00] I already deleted the xhwiki one.
[15:54:14] It wasn't written in the right language (seemed like Afrikaans or Dutch, not Xhosa)
[15:54:50] Is scowiki in the GS wikiset?
[15:55:11] Or since GS is opt-out, not in the opt-out wikiset.
[15:56:07] yes, I can delete there
[15:57:14] Mind deleting it then, purely because scowiki is not enwiki, so even if it is not a hoax it's in the wrong language.
[15:57:26] deleted
[15:57:39] Scowiki is in Scots
[15:57:44] which is like English
[15:57:44] Is it?
[15:57:53] * JohnLewis learned something new today
[15:58:00] https://sco.wikipedia.org/wiki/Main_Page - looks like English in a Scottish accent :P
[15:58:18] anyway, deleted
[15:59:23] https://sco.wikipedia.org/wiki/Auld_Lang_Syne
[16:19:15] JohnLewis: forgot to say thank you ;)
[16:19:41] PiRSquared: Welcome :)
[16:47:28] New patchset: Liangent; "New LanguageWithConversion class" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/67453
[16:47:49] New patchset: Liangent; "New Utils::getLanguageFallbackChain() function" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70871
[16:49:35] New patchset: Liangent; "New Utils::getLanguageFallbackChain() function" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/70871
[16:49:50] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072
[16:52:06] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072
[16:58:57] just out of curiosity: why is the first numeric id always 2 instead of 1? like Q2, P2 etc
[17:00:05] liangent: Q1 exists
[17:00:43] PiRSquared: on my local installation
[17:10:57] Q1 is the universe. who could have guessed :O
[17:51:50] what's the current version of wikibase?
[17:51:54] for the @since tag
[17:52:14] 0.4 still
[18:02:16] New patchset: Liangent; "Enable variant level fallback for {{#property: }}" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71072
[18:02:56] we care less about performance than readability and maintainability, right?
[18:03:07] or are there some performance-critical parts in wikibase?
[18:03:09] yep
[18:04:13] Well, there might be some parts which are performance critical, but I don't think there are many
[18:04:34] Rendering stuff and the like ... and maybe some client things
[18:28:40] one test file per class, or one test file per code file? (in case one code file contains multiple classes)
[18:30:14] legoktm: You around?
[18:30:22] for a few minutes
[18:30:44] liangent: Either is probably ok, although I prefer one test file = one code file
[18:31:30] I created two ItemPages from normal pages. Both are about the same item, but itemA == itemB is false
[18:31:48] So something seems to be wrong in the page object equality checking.....
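A minimal sketch of the fix legoktm proposes in the exchange that follows: a WikibasePage.__cmp__ override that forces the lazily loaded title to be resolved before comparing. This assumes the Python 2-era pywikibot rewrite of 2013; the class layout and tuple comparison are illustrative, not the actual patch:

    # Python 2 only: __cmp__ and the cmp() builtin were removed in Python 3.
    from pywikibot.page import Page

    class WikibasePage(Page):
        def __cmp__(self, other):
            # Mirror Page.__cmp__: in particular, return -1 if other is
            # None or not a WikibasePage at all.
            if not isinstance(other, WikibasePage):
                return -1
            # title() fills in the lazy link data that Page.__cmp__ reads
            # from self._link.title -- the same workaround multichill
            # applies manually below by calling item.title() on both
            # objects before comparing.
            return cmp((self.site, self.title()),
                       (other.site, other.title()))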
[18:32:57] oh
[18:32:59] i see why
[18:33:16] i was just using the Page.__cmp__ method
[18:33:17] wait
[18:33:20] bleh
[18:33:20] if not isinstance(other, Page):
[18:33:20]     # especially, return -1 if other is None
[18:33:20]     return -1
[18:34:05] hm
[18:34:05] that should be true
[18:35:06] multichill: i'm guessing it's because it uses self._link.title to check equality, and i hack around that a bit, so somewhere it's not set properly
[18:36:20] multichill: if you call item.title() on both of them before you check equality, does it work?
[18:36:37] That's how I changed it now, that works
[18:38:01] ok, basically WikibasePage needs its own __cmp__ method, i can write it later today
[18:39:48] Ah, right, I guess Page has one two?
[18:39:50] *too?
[18:41:46] yes
[18:42:00] legoktm: Playing around with a bot to add the links and maybe merge items later on
[18:42:02] hoo|away: do API requests from the GUI include &bot=1?
[18:42:13] neat :)
[18:44:46] Stuff like https://nl.wikipedia.org/wiki/Margarites_hickmanae shouldn't be too hard to merge
[18:47:17] legoktm: No, they usually don't
[18:47:44] unless you also append bot=1 to the gui url (eg. Special:Contributions supports stuff like that)
[18:49:49] hoo: So if someone is using a script that uses the wikibase JS library, the edits won't get marked as bot…. :/
[18:50:04] legoktm: Exactly
[18:50:25] https://www.wikidata.org/wiki/Wikidata:Bureaucrats%27_noticeboard#Flooding ugh.
[18:50:55] mh
[18:51:30] We could of course add that to wb.RepoApi easily... but there's a reason to not have these enabled by default
[18:51:46] what's the reason?
[18:52:23] Ask whoever implemented that in core initially, I can only guess
[18:53:10] hoo: phpunit only recognizes the first test case class in one test file?
[18:53:11] Er, no. If you're flagged as a bot and edit through the normal user interface, your edits get marked as bot no matter what
[18:53:36] legoktm: oh, ture
[18:53:37] * true
[18:53:57] liangent: You mean the first test class?
[18:54:14] right
[18:54:42] it's called class ...Test extends \MediaWikiTestCase
[18:55:07] liangent: Right... that depends on how you run the tests
[18:55:12] * legoktm goes afk
[18:55:22] legoktm: I'll think about that a bit
[18:55:26] is there a bug yet?
[18:55:27] ok :)
[18:55:30] probably not
[18:55:31] hoo: how should I run tests?
[18:56:00] The question probably is rather how jenkins runs tests
[18:56:07] and I don't really know that
[18:56:30] with one class = one file you're safe...
[18:59:29] hoo: so to achieve one test file per code file I'm prefixing test function names with class names?
[19:00:51] liangent: mh... do you really need two classes in one file? That's usually not considered good style
[19:01:47] hoo: I'm changing a existing file and it's already the case
[19:01:51] *an
[19:02:34] :/
[19:08:04] New patchset: Liangent; "Make MultiLangSerializationOptions aware of fallback chains" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71182
[19:24:34] New patchset: Liangent; "Make MultiLangSerializationOptions aware of fallback chains" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71182
[19:27:01] ouch this is generating a failure
[19:27:03] https://integration.wikimedia.org/ci/job/mwext-Wikibase-client-tests/1137/console
[19:27:08] Test method "testMultiLangSerializationOptionsLanguagesPreprocess" in test class "Wikibase\Test\SerializationOptionsTest" is not public.
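The failure above is PHPUnit's discovery convention at work: any public method whose name starts with "test" is treated as a test case, so a test-prefixed helper either runs as a test or, if it isn't public, produces exactly this error (liangent and hoo man work through this just below). Python's unittest discovers tests by the same naming rule; a small analogy in Python, since the real code here is PHP and this only illustrates the convention:

    import unittest

    class SerializationOptionsTest(unittest.TestCase):
        def testMultiLangOptions(self):
            # Collected and run: the method name starts with "test".
            self.assertEqual(self._preprocess(['en', 'de']), ['en', 'de'])

        def _preprocess(self, languages):
            # Not collected: no "test" prefix. (PHPUnit additionally
            # requires test methods to be public, hence the "is not
            # public" error in the Jenkins log above.)
            return list(languages)

    if __name__ == '__main__':
        unittest.main()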
[19:27:36] it's meant to be a helper function of testMultiLangSerializationOptionsLanguages and not a real test function
[19:27:51] liangent: Helper functions musn't start with test
[19:27:58] * mustn't
[19:28:41] well phpunit doesn't yell at me for this on my computer
[19:28:43] don't know why
[19:30:07] New patchset: Liangent; "Make MultiLangSerializationOptions aware of fallback chains" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71182
[19:46:42] DanielK_WMDE_: (If a user has the bot right) Do you think edits done via the Wikibase JS interface should be marked as bot edits by default like core does?
[19:47:27] aude: review review review
[19:47:40] or is it because it's the weekend these days?
[19:48:01] liangent: People are sometimes around during weekends, but that's of course not reliable
[19:48:25] I don't really expect a reply from Daniel above either
[19:49:18] legoktm: Spotted a bug in getSitelink()
[19:49:30] Documentation says it returns a Page object, but it returns a string
[20:24:58] multichill: is the bot working?
[20:25:47] Hi Amir1
[20:25:57] hi dear
[20:25:58] Which one? Yours or mine? I think both are at the moment
[20:30:44] Hi Amir1
[20:31:02] hi
[20:31:23] multichill: both are working? that's good
[20:31:38] @Amir1 You ready?
[20:31:48] Florence: PM
[20:32:01] and now I'm a little busy
[20:32:09] Hey, does anyone here know what nick Rschen uses?
[20:32:33] Maple__: he (or she) is not here
[20:32:36] Huh.
[20:32:42] /ns info Rschen7754 shows that they're still online somewhere...
[20:32:49] his/her nickname looks very much like his/her username
[20:32:51] I'll send them a memo.
[20:46:22] Amir1: Trying to bring down the number of pages without items
[20:46:34] Still quite a few pages do have it, but the table is lagging
[20:46:56] how many is it?
[20:47:31] last time I checked for Dutch it was 14 thousand
[20:48:13] 29187 for the nlwp
[20:48:51] No, that's the ones with local links
[20:48:58] 43975 is the number that need to be touched
[20:49:23] 94307 for the newp
[20:49:25] *enwp
[20:49:48] multichill: http://tools.wmflabs.org/addshore/addbot/iwlinks/raw.html?lang=nl
[20:49:53] I checked this
[20:50:27] and for en it was 100,000
[20:50:38] Amir1: Working on http://toolserver.org/~multichill/temp/queries/wikidata/nlwiki_touch_bot.sql
[20:51:48] Getting rid of the false positives at https://nl.wikipedia.org/wiki/Special:UnconnectedPages
[20:52:05] nice
[20:53:14] did you know PWB will finally migrate to git on 6 July
[20:55:16] i think biology articles (bot-made?) are a major part of the unconnected articles
[20:55:43] Amir1: I know, keeping an eye on it
[20:56:06] Yes, the bot-generated stubs are flooding the unconnected articles page
[20:57:28] you should add a policy to make bot ops do something about wikidata
[20:58:00] at least add [[en:article]] to the end of the page and a bot (addbot) adds those to wikidata
[20:58:02] [5] https://en.wikipedia.org/wiki/article
[21:05:00] Amir1: Catching up with some simple logic, see https://www.wikidata.org/wiki/Special:Contributions/BotMultichill
[21:05:29] hoo: if the user also has the bot right on wikidata, then yes
[21:05:54] hoo: but core doesn't mark bot edits automatically... at least not edits via the api, iirc
[21:06:18] DanielK_WMDE_: I know, but it marks edits done via the editing interface as bot edits
[21:06:24] So I'd suggest to just set bot=1 in our editing JS
[21:06:50] Can we do that unconditionally or will the API complain if the user hasn't got the right?
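What setting bot=1 unconditionally would look like in practice, sketched against the real wbcreateclaim API module. This is Python for illustration rather than the wb.RepoApi JavaScript actually under discussion; the helper name and example IDs are hypothetical:

    import json
    import requests

    API = 'https://www.wikidata.org/w/api.php'

    def create_claim(session, token, entity, prop, value):
        """Create a claim, always passing bot=1.

        If the logged-in user lacks the bot right the edit is simply not
        marked as bot; DanielK_WMDE_ suggests just below that the API
        will not complain either way.
        """
        return session.post(API, data={
            'action': 'wbcreateclaim',
            'entity': entity,            # e.g. 'Q420032'
            'property': prop,            # a property ID, e.g. 'P17'
            'snaktype': 'value',
            'value': json.dumps(value),  # the datavalue is JSON-encoded
            'token': token,
            'bot': 1,                    # set unconditionally, like core's UI
            'format': 'json',
        }).json()

    # Usage sketch: session = requests.Session(), log in, fetch an edit
    # token, then call create_claim(session, token, 'Q420032', 'P17', 'Q55').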
[21:06:57] hm, but when a bot moves a page... is that generally marked as a bot edit? it does get an entry in the revision table, right?
[21:07:49] i don't think the api will complain. iirc, it will just not set the bot flag.
[21:08:48] ok, the change itself should be very simple
[21:17:26] New patchset: Hoo man; "Unconditionally set the bot parameter to match the core behaviour" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/71246
[21:42:54] multichill: so fix it :)
[21:52:32] multichill: sorry I was AFK
[21:52:50] let me see
[21:56:36] multichill: very nice, after your bot finishes its work can you give me a list of the unconnected articles? e-mail me a txt file
[21:57:28] would it be possible to put more thought into that: http://www.wikidata.org/wiki/Wikidata:Project_chat#More_than_one_interwiki_per_language_.3F
[21:57:48] Wikidata is about to take care of interwikis for other projects, and it doesn't even support all Wikipedias; that doesn't make sense imo
[21:58:07] hoo|away is away :/
[21:59:06] I proposed a solution on the project chat, but every time the topic is archived without any thought even being put into it....
[21:59:54] no answer about feasibility or even somebody from the development team saying he is looking at it :(
[22:01:34] Amqui: tbh your particular topic needs an RfC
[22:01:44] rather than just a PC discussion
[22:03:05] tbh?
[22:03:16] Amqui: There is an RfC around the thing you are proposing.
[22:03:24] where is it?
[22:03:49] Amqui: = "to be honest"
[22:03:53] oh ok
[22:04:30] https://wikidata.org/wiki/Wikidata:Requests_for_comment/One_vs._serveral_sitelink-item_correspondence
[22:04:40] Hopefully it is the same thing Amqui.
[22:05:26] Wikidata's premise breaks with multiple sitelinks.
[22:05:31] That's what it comes down to.
[22:05:35] not exactly
[22:05:53] but why?
[22:06:09] Uh.... yes exactly. Jeblad stated almost precisely that, and he actually /does/ work on the software.
[22:06:18] JohnLewis all previous discussions have been about "a Wikipedia having two articles on two subjects, but another Wikipedia having only one article for both subjects", while my issue is straightforward: "a Wikipedia having two articles on the exact same subject"
[22:06:41] Then those articles should be merged.
[22:06:48] Sky2042 hum why?
[22:06:49] That's not Wikidata's responsibility.
[22:07:01] Amqui: Because That's Just Good Writing.
[22:07:12] you don't even know the issue
[22:07:29] that's an English Wikipedia policy
[22:07:35] maybe other Wikipedias don't have it
[22:07:45] Amqui: I'm curious how you get around NPOV otherwise.
[22:07:47] :)
[22:07:53] see...
[22:07:59] NPOV isn't even an issue here
[22:08:09] Yeah... Probably is. But whatever.
[22:08:27] Amquidiot is right … as usual
[22:08:36] I'm talking about Wikipedias that have articles on the same subject written in different scripts, because the language uses more than one alphabet
[22:08:55] or languages that are as standardized as English or German and have the same subject written in more than one dialect
[22:09:02] that aren't**
[22:09:13] Serbo-Croatian?
[22:09:24] Amqui: That's still not Wikidata's problem, even in that regard. That's really an issue for the devs to figure out how to deal with.
[22:09:26] :/
[22:09:39] the dev of what?
[22:09:45] the dev of Wikidata XD
[22:09:50] Uh, no.
[22:10:00] Presumably, the devs working on multi-language support.
[22:10:04] And there are those.
[22:10:05] Wikidata is handling the interwikis
[22:10:09] * Jasper_Deng pokes Lydia_WMDE ^
[22:10:11] Irrelevant.
[22:10:25] the Wikipedia is working
[22:10:31] only the interwikis aren't supported
[22:10:31] That MediaWiki poorly supports dialects does not mean it's Wikidata's job to fix that problem.
[22:10:36] I find this interesting. Here we say this is not Wikidata's problem, however in the RfC we say this is not Wikipedia's problem.
[22:10:49] JohnLewis: Well, apparently I differ in opinion. :)
[22:10:59] It's really a MediaWiki problem when it comes down to it.
[22:11:04] Sky2042: the 2 scripts on 1 wiki thing is a serious problem, though
[22:11:09] the problem is: Wikidata takes care of interwiki links and it only allows one link per project instead of one link per language/script/dialect
[22:11:14] how is that not a Wikidata problem?
[22:11:38] hey
[22:12:08] Amqui: please post links to example articles on the contact the dev team page
[22:12:18] i'll try to get you an answer on monday
[22:12:24] Lydia_WMDE there is an example on the project chat topic
[22:12:25] sorry - it is past midnight here
[22:12:30] ok
[22:12:31] which I just linked on the contact the dev team page ;)
[22:12:54] thanks
[22:15:22] Sky2042 some Wikipedias have been able to keep the article in both scripts on the same page because the conversion between scripts is straightforward, but when it is not, it's not very practical to have the articles in each script on the same page
[22:16:37] Amqui: I think I will need to disengage here, not least because it is difficult to tell you why I think you are wrong using IRC. :)
[22:16:43] (But I'm also hungry.)
[22:17:05] I tried using the project chat
[22:17:19] but they archived my discussion three times before any real answers :(
[22:17:30] yeah
[22:17:46] Amqui: It's automatic archiving, when the discussion goes stale. No-one is maliciously trying to deny you your answer.
[22:17:52] PC is always a bad place if you want to have answers
[22:18:11] Sky2042 only a week stale...
[22:18:38] Amqui: There are a lot of topics on the project chat page. A week isn't unreasonable.
[22:18:45] it is imo
[22:19:10] Amqui: Then we will need to agree to disagree.
[22:19:23] on the majority of Wikipedias two weeks without an answer is considered consensus fyi
[22:19:34] I don't really care.
[22:19:43] This is Wikidata, also, not "the majority of Wikipedias". :)
[22:19:46] if we archived everything after a week... consensus would be bad
[22:20:08] but it is intended to support all Wikipedias I believe
[22:20:26] As it is intended to support the entire world; as it is intended to support all the Wikimedia projects.
[22:20:37] The only way for Wikidata to do all that is to be its own project.
[22:21:01] interwikis aren't a "world" problem lol
[22:21:18] Amqui: I suspect you would be surprised!
[22:21:20] :)
[22:21:48] then it should support all world languages :P
[22:22:05] Amqui: I don't follow. But again, I don't want to debate here. :)
[22:22:05] anyway we are going nowhere lol
[22:22:19] Or rather, that doesn't follow, whatsoever.
[22:23:21] MediaWiki doesn't "support" all the world languages
[22:23:26] Neither do any of the other WMF projects
[22:23:38] Reedy it has the potential to
[22:23:42] * Sky2042 laughs at Reedy.
[22:23:43] Sure
[22:24:35] ;)
[22:26:49] Reedy: Incubator does support all languages with an ISO code :P
[22:27:17] (at least in theory)
[22:28:19] so does MediaWiki
[22:28:24] (also in theory)
[22:29:14] MediaWiki does even more (if you count stuff like de-formal etc.) :P
[22:29:26] hehe
[22:30:03] maybe interwikis on Wikidata should be listed by language ISO code instead of project name, after you assign a project name to each ISO code
[22:30:22] that way more than one ISO code could point to the same project
[22:31:46] Amqui: it will probably work that way soon
[22:32:02] (because of Incubator support)
[22:32:28] yep that would solve Incubator too
[22:33:17] and when the project moves out of the Incubator, just change the project name associated with the ISO code in Wikidata, easy
[22:59:45] Amqui: That's a good idea :D
[23:01:10] Hazard-SJ: You pinged me?
[23:02:30] hoo: Yes, I'm looking into the global js issue that I told you about
[23:02:58] Hazard-SJ: Sorry, more context please
[23:03:10] * hoo got a lot of global script things
[23:03:11] Hazard-SJ: see the dev noticeboard (@ Amqui's suggestion)
[23:03:40] hoo: Remember I said that some things were broken on items with my global js enabled
[23:03:56] hoo: It might be https://en.wikipedia.org/wiki/User:AzaToth/morebits.js
[23:06:36] possible
[23:08:04] Vogone: Reading, btw
[23:10:23] Hazard-SJ: https://meta.wikimedia.org/wiki/User:Hazard-SJ/global.js you don't include twinkle on wikidatawiki AFAIS
[23:16:19] hoo: I know, I just excluded it; it works with the twinkle script itself, but not morebits
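A toy sketch of the ISO-code idea Amqui floats above: sitelinks keyed by language code, with a separate code-to-project mapping, so several codes can share one wiki and an Incubator graduation only touches the mapping. Everything here (the 'xyz' code, database names, page titles) is hypothetical illustration, not Wikidata's actual data model:

    # Hypothetical mapping from language ISO codes to wiki database names.
    PROJECT_FOR_CODE = {
        'en': 'enwiki',
        'sco': 'scowiki',
        'xyz': 'incubatorwiki',  # made-up code for a language still in Incubator
    }

    # Sitelinks on one item, keyed by language code rather than project name.
    sitelinks = {
        'en': 'Universe',
        'sco': 'Universe',
        'xyz': 'Wp/xyz/Universe',  # Incubator's Wp/<code>/<title> convention
    }

    def resolve(code):
        """Return (project, page title) for a language code's sitelink."""
        return PROJECT_FOR_CODE[code], sitelinks[code]

    # When the language's wiki graduates from Incubator, only the mapping
    # changes; the sitelinks themselves stay keyed by language code.
    PROJECT_FOR_CODE['xyz'] = 'xyzwiki'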