[00:56:39] I've been talking to Sumanah, and she suggests that I should scope my project for 6 weeks instead of 12 [00:57:37] This is the state of my application as of now : http://www.mediawiki.org/wiki/User:Pragunbhutani/GSoC_2013_Proposal [00:58:24] I'm in no position to judge your idea, pragunbhutani, but I think it's a good one [00:58:40] (and I do think the use of JS should be avoided where possible) [00:59:06] keeping the six week scope in mind, I propose moving everything after the first milestone (and the corresponding deliverables) to an 'if time permits' category and make space for code review/bug fixes instead [00:59:25] Jasper_Deng: thanks, that's the general idea :) [01:00:26] I wanted to run it by this channel before doing it [01:00:32] Does it seem like a good idea? [01:01:06] it does, but again, I'm not a Wikidata developer [01:01:24] and I don't see one online atm (except possibly JeroenDeDauw) [01:04:18] ppl [01:04:49] https://www.wikidata.org/wiki/Wikidata:Project_chat#Abusefilter-view-private_and_abusefilter-log-private_for_rollbackers is bothering me :-) [01:05:47] odder: I think we need a full RfC for that, that's not enough consensus for a new right. I don't think that's a big disclosure. [01:06:40] https://bugzilla.wikimedia.org/show_bug.cgi?id=47503#c1 my thoughts exactly [06:36:57] hello is anyone on? [06:37:29] coooooooooooooooooooooooooool [06:39:53] seriously no one is talking [06:40:07] isnt that the point of irc? [06:40:28] xhelmsx: it's the middle of the night in the US, and early morning in europe [06:40:43] this is true [06:40:46] and i am in the us [06:41:04] but come on 2:40am isnt early [06:42:37] anybody in here from eastern europe? [06:45:04] * jeremyb_ isn't going to bother responding to this mail on wikidata-l now... [06:45:12] maybe someone else will :) [06:48:31] jeremyb_: link? [06:48:37] errr [06:49:19] slooooow [06:49:23] i need to close some tabs [06:52:48] legoktm: http://lists.wikimedia.org/pipermail/wikidata-l/2013-April/002176.html [06:54:11] ty [06:54:51] Hahc21: ok where do i see all RfP's on one page? [06:55:18] eeeeeh [06:56:01] ill just make one fast [06:57:21] https://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/All [06:57:21] I was about to do that... 
[06:57:24] I was doing that [06:57:25] xD [07:35:59] New patchset: Henning Snater; "Added QUnit tests for DataType constructor and instances" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60809 [07:36:00] New patchset: Henning Snater; "Introduction of qunit-parameterized for usage in QUnit tests" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60808 [07:36:00] New patchset: Henning Snater; "Added tests for jQuery.valueview.ExpertFactory" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60811 [07:36:00] New patchset: Henning Snater; "DataType constructor taking DataValue constructors as 2nd argument now" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60810 [07:37:09] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60808 [07:39:39] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60809 [07:49:09] New patchset: Henning Snater; "DataType constructor taking DataValue constructors as 2nd argument now" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60810 [07:50:06] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60810 [07:50:16] New patchset: Henning Snater; "Added tests for jQuery.valueview.ExpertFactory" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60811 [08:03:43] New patchset: Henning Snater; "Added tests for jQuery.valueview.ExpertFactory" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60811 [08:15:41] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/60811 [08:59:28] Lydia_WMDE: moin! uh, is denny there? should i dial in for the daily? what hangout? [09:00:01] DanielK_WMDE: denny sagt erst wenn abraham da ist und es organisiert ;-) [09:00:04] also nächste woche [09:00:50] :P [09:00:52] kk [09:00:55] moin Denny_WMDE [09:01:37] brief daily: got a bunch of reactions/reviews from wmf folks after the call yesterday. will reply to those, then work on Special:EditEntity [09:01:53] that's it from my side :) [09:12:18] DanielK_WMDE: will spend time on the rc patrol thing, namespace setting for clients, and then back to change dispatcher [09:14:45] Danwe_WMDE: and Henning_WMDE will be working on the time datatype [09:14:59] and i will make slides, answer emails, and am there for any question [09:20:18] yay :) [09:23:04] Hi. Can I have IP block exempt on wikidata? My IP when I am connected to VPN is blocked on Wikidata. I have IP block exempt on en.wiki since 2010 [09:23:51] ebraminio: sure give me a sec [09:25:46] ebraminio: done [09:27:53] legoktm: Thanks :) [09:28:11] np [10:07:12] ouch. the gigaom article on wikidata is great. but it's screenshot shows a bug… http://gigaom.com/2013/04/26/wikipedia-is-now-drawing-facts-from-the-wikidata-repository-and-so-can-you/ [10:07:14] :P [10:07:16] hehe [10:10:30] hehe [10:25:34] Lydia_WMDE: https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/A_need_for_a_resolution_regarding_article_moves_and_redirects#Clarification [11:06:08] is there a welcome template we have encouraging ips to create an account? i'd love to autopatrol some IP addresses... [11:15:45] legoktm: yeah, I'd love to have that too. Haven't checked, though [11:16:22] Moe_Epsilon: poke [11:16:28] wanna make a welcome template? 
:) [11:16:38] howdy [11:16:50] a Wikidata welcome template? [11:17:04] well one targeted towards ips and telling them to make an account [11:17:08] lemme find the enwiki version [11:17:26] Denny_WMDE: Lydia_WMDE https://bugzilla.wikimedia.org/47620 [11:17:34] maybe someone could cleanup the mainpage... [11:17:34] https://en.wikipedia.org/wiki/Template:Welcome-anon [11:17:38] can you comment there if all talk page namespaces should be excluded [11:17:44] for having the widget, etc. [11:17:54] lbenedix: so fix it [11:18:03] :) [11:18:08] legoktm: sure, I can do that :) [11:18:12] :D [11:18:34] there are 14 Links in the Contribute-section [11:18:43] lbenedix: https://www.wikidata.org/wiki/Q12000000 :D [11:18:46] er [11:18:51] that was just for everyone [11:19:03] lbenedix: i was gonna say if you draft a new main page in a userpage, that would be awesome [11:19:36] I'm not a designer... but I'll see what I can do [11:19:51] * Moe_Epsilon wonders what the prefix for the foundation website is [11:19:52] there should be a simple explanation what wikidata is... maybe a diagram [11:19:57] :foundationwiki ? [11:20:00] Moe_Epsilon: [[wmf:]] [11:20:02] [2] 04https://wikimediafoundation.org/wiki/ [11:20:05] that works :p [11:21:02] http://meta.wikimedia.org/wiki/File:Interproject_links_in_title_dropdown_(version_by_NaBUru38).jpg [11:21:53] legoktm: similar to http://en.wikipedia.org/wiki/Template:Welcome-anon [11:21:56] ? [11:22:01] Moe_Epsilon: yup [11:22:13] mmk :) [11:23:55] * Moe_Epsilon frowns searching for d:Template:Welcome and being brought to Q5611978 [11:25:29] yeah [11:25:38] theres a bug about that i think [11:33:46] DanielK_WMDE, hi, i partially agree with you re "if langlinks are stored externally, they should be requested from the source" argument [11:42:25] u"session-failure: * '''Sorry! We could not process your edit due to a loss of session data.'''\nPlease try again.\nIf it still does not work, try [[Special:UserLogout|logging out]] and logging back in.\n* \n" [11:42:28] hmmmmm [11:43:24] thats weird. [11:49:47] ugh [11:49:49] it wont go away [11:50:04] i can edit just fine logged out >.< [11:51:40] heyas [11:51:56] WD doesn't work. [11:52:59] I just created [[simple:Footbridge]] (while logged in), and wanted to add x-wiki links. The thing asks me to log into wd to do this. [11:53:00] [4] 10https://simple.wikipedia.org/wiki/Footbridge [11:54:22] hm [11:54:27] where were you trying to link it to? [11:54:38] https://en.wikipedia.org/wiki/Footbridge [11:54:39] When I do the exactly same (add a link, eg. to [[de:Fußgängerbrücke]] I don't need to log in to wd [11:54:40] [5] 10https://de.wikipedia.org/wiki/Fu%C3%9Fg%C3%A4ngerbr%C3%BCcke [11:55:13] legoktm: en:footbridge, but this doesn't matter, as I don't get the usual wd screen [11:55:53] ok this is weird [11:55:56] its not working for me [11:56:04] and hoo isnt here :( [11:56:23] actually [11:56:24] i got it [11:56:30] i just had to clear my cache a few times? [11:56:45] im not gonna link it though so you can try [11:57:12] legoktm: thats not a solution, nor explanation. thats a workaround at best [11:57:40] well im not sure [11:58:14] legoktm: guess Template:Welcome-anon/text should be marked from translation [11:58:24] have all the parameters for English up [11:59:04] marked for* [11:59:33] eptalon: its working for me first try on other simple pages [12:00:40] legoktm: http://www.wikidata.org/wiki/User_talk:195.75.73.1 [12:00:47] ok, guess I am wrong them [12:00:50] er then [12:01:03] Moe_Epsilon: nice! 
[12:01:10] :) [12:03:53] legoktm: you're translation admin right? Can you mark http://www.wikidata.org/wiki/Template:Welcome-anon/text for translation? [12:04:00] oh yeah [12:04:01] right [12:04:42] > The page Template:Welcome-anon/text has been marked up for translation with 7 translation units. The page can now be translated. [12:04:54] :) [12:05:45] ok this is seriously pissing me off [12:06:05] what is [12:06:45] im not sure why i keep getting session failures [12:06:50] u"session-failure: * '''Sorry! We could not process your edit due to a loss of session data.'''\nPlease try again.\nIf it still does not work, try [[Special:UserLogout|logging out]] and logging back in.\n* \n" [12:07:00] :< [12:07:49] you're not going back and forth between wikidata.org and www.wikidata.org are you? [12:08:12] or was that bug fixed? [12:08:18] nope [12:08:20] this is my bot [12:08:43] oh.. [12:08:56] 'wikidata': 'www.wikidata.org', [12:09:43] Hello! [12:09:49] hi! [12:11:11] it would be just easier to run this logged out or something >.< [12:19:55] ops, please delete http://www.wikidata.org/wiki/Q12012324 [12:20:36] ebraminio: done [12:23:34] Change on 12mediawiki a page Extension:Wikibase Client was modified, changed by Jens Ohlig link https://www.mediawiki.org/w/index.php?diff=678869 edit summary: [+36] /* Requirements */ Extension:Scribunto [12:24:00] scribunto is a requirement? [12:24:13] not just an option? [12:27:36] legoktm: TNX! :) [12:27:42] np :) [12:28:50] I've completed a draft of my GSoC proposal [12:29:55] may I have some feedback please? [12:31:38] akjshdfjd i figured out what i did wrong [12:42:31] legoktm: say so [12:42:53] huh? [12:43:04] legoktm: what you did wrong :) [12:43:09] oh [12:43:17] i was trying to pass a token from nlwiki onto wikidata :P [12:43:22] pragunbhutani: yes, you should. ping us if you don't get it today :) [12:43:32] legoktm: ah. yeah. ok :) [12:43:48] multichill: im running through the nlwiki category right now and attempting to create items for those that dont exist yet [12:43:51] Denny_WMDE: roger that, thanks :) [13:36:43] hi [13:36:57] how can i mark something to be fictional? [13:37:17] should I? [13:38:38] legoktm: Ok, great! :-) [13:38:55] Schisma_: im not sure if we have a property for that [13:39:05] maybe instance of --> fictional character? [13:39:12] multichill: ill run through enwiki afterwards :) [13:39:24] More hits at enwp already? [13:40:52] legoktm: but its not a character [13:41:03] oh what is it then? [13:41:07] not necessarily [13:41:16] http://www.wikidata.org/wiki/Q1110 [13:41:19] ► Commons category without a link on Wikidata‎ (113,514 C, 66,879 P) [13:41:30] fictional sovereign state ^^ [13:41:41] hahaah [13:41:56] Schisma_: im not sure :P [13:42:07] maybe someone should propose a boolean fictional property? [13:42:22] maybe there should be an "is fictional" property [13:42:36] or something like this [13:42:55] fictional characters could be "persons" and "is fictional" [13:44:29] propose it :) [13:44:53] maybe a fictional instance of? [13:44:57] does it make any sense? [13:45:08] yes... cool [13:45:10] well there should be something... [13:45:12] better [13:47:33] what about people whos actual existence is unknown? 
[13:47:33] http://www.wikidata.org/wiki/Q302 [13:48:49] like we need to start a debate about that >.< [13:48:50] um [13:49:15] we can use a the 'unknown' value [13:50:58] so the would also assume an is_fictional value [13:51:16] is_fictional=yes, is_fictional=no, is_fictional=unknown [13:51:25] doesn't look like wikidata logic [13:52:02] its unknown for the yeti [13:52:08] until you find one [13:52:28] well [13:52:28] idk [13:52:49] is it fictional until you find one? [13:53:51] lbenedix: not helping [13:54:04] Schisma_: please no booleans :P [13:54:17] ^^ [13:54:19] Denny_WMDE1: no booleans? [13:54:26] booleans suck [13:54:47] why that? [13:54:58] booleans suck == true ? [13:55:05] :D [13:55:12] LOL [13:55:29] otherewise for a boolean like "is fictional" someone could make the case that they should be on every persons page [13:55:39] true [13:55:40] Obama - is fictional = true [13:55:44] nah, you dont want that [13:55:53] I have never seen Obama [13:56:02] * legoktm slaps lbenedix  [13:56:09] legoktm: thx [13:56:14] :P [13:56:19] if(booleans_suck == true) { false = true; } [13:56:26] :D [13:57:15] there is actually quite some philosophical research on the topic of the ontology of the fictional [13:57:55] but for now, I'd guess, just "instance of -> fictional person" or sth like this could handle that issue [13:58:20] Denny_WMDE1: only if you want to describe a fictional person [13:58:31] or, maybe even better, a property "fictional universe -> Marvel universe" [13:58:40] that could be used anywhere [13:58:55] only if there is a universe article [13:59:09] want to bet they all have? :) [13:59:14] "fictional universe -> Christianity" :D [13:59:45] Schisma_: your name points to the idea that that would be too simple :) [14:00:15] and I could imagine that some people might find it offensive to read "Moses, fictional universe -> Torah" [14:00:27] whatever, please no booleans [14:01:03] Santa Claus - fictional? :P [14:01:25] aude: no spoilers! we want to remain kid friendly ;) [14:01:32] :) [14:04:00] Denny_WMDE1: what is the problem with booleans? [14:06:28] hmm, i thought i already said so [14:06:44] booleans have the notorious quality of being either true or false for everything [14:07:04] or unknown [14:07:15] well, sure, but that's not the point [14:07:28] one could always make an argument that it should be there on every item [14:07:49] and you would not want any such property, for obvious reasons [14:08:45] but wont the community decide that "is fictoinal = false" is not a good property for Q525 [14:09:20] btw. a click on "edit links" in wikipedia dont show the links-section of the wikidata-page [14:09:24] maybe a bug [14:09:56] definitely confusing [14:09:57] lbenedix: i think fixed in next deployment [14:10:05] ok [14:15:06] * lbenedix stops arguing for bool type... all bools can be replaced by some "is a"-property [14:24:28] i need to force one wikidata-item into another wikidata-item since it just is another language for the single one. ? http://www.wikidata.org/wiki/Q1049151 <-- http://www.wikidata.org/wiki/Q11976148 [14:24:50] could anyone do the trick here [14:29:16] howwddyyyyyy [14:32:08] hello anyone here... ? 
[14:33:18] Migrant: put the data from one in the other and put the other one here: http://www.wikidata.org/wiki/Wikidata:Requests_for_deletions [14:33:42] or just put it there, and tell them to merge it :) [14:34:51] Migrant: {{doing}} [14:34:51] [1] 10https://www.wikidata.org/wiki/Template:doing [14:35:20] done [14:39:35] New patchset: Jeroen De Dauw; "Fix issues in Entity::patch" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60872 [14:40:51] Denny_WMDE1: okay will to the comment on the linked page you gave me.. thanks [14:41:04] Migrant: errrr, read Stryn above [14:41:07] JeroenDeDauw : i also just figured out why the tests were failing [14:41:11] :) [14:41:15] oh [14:41:18] ok, I'll review [14:41:44] we need to include a link to https://www.wikidata.org/wiki/Help:Merge in the topic :P [14:41:51] New patchset: Jeroen De Dauw; "Import classes rather then using FQNs" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61008 [14:43:26] Stryn : did you merge the wikidata-items for me ? [14:43:34] yep :) [14:43:44] Thanks :) [14:43:49] you're welcome :) [14:46:11] hashar: please to merge https://gerrit.wikimedia.org/r/#/c/60211/ ? :p [14:46:26] ohhh [14:46:36] JeroenDeDauw: sorry haven't seen that one :D are they passing ? [14:46:53] hashar: probably not [14:47:00] hashar: this is not a real issue though [14:47:01] indeed http://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/? [14:47:10] Right now we do not know if they are failing or not, and if they are, why [14:47:12] that will make jenkins vote -1 isn't it ? [14:47:28] If the tests fail, it should -2 [14:47:37] 17:40:47 DB connection error: Unknown error () [14:47:38] :( [14:47:56] hashar: alternatively if we had a way to tigger the tests to run manually on jenkins, we could also further debug this [14:48:09] ah [14:48:15] so whenever you are on a build result like http://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/1465/ [14:48:23] on the left is a link "Rebuild" [14:48:37] that let you resubmit the job with all the parameters that were originally provided by zuul [14:48:46] gives you http://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/1465/rebuild/? [14:48:55] pressing [Rebuild] at the bottom craft a new build [14:48:56] :) [14:49:24] hashar: I guess this link only shows up when one has certain rights? [14:49:54] JeroenDeDauw: aren't you logged in ? [14:49:59] Is the login for Jenkins the same as for labs/gerrit, or is it separate? [14:50:02] I am not logged in [14:50:04] that is labs [14:50:22] I am looking at the rights matrix [14:50:53] hashar: logged in, still don't see the rebuild button [14:51:30] ah that is for the ldap group 'wmf' [14:52:08] hashar: well, would be good if the wd team could trigger rebuilds for our codes :) [14:52:39] Or if possible, everyone who can +2 for some repo, also tirgger rebuilds for the tests for that repo [14:53:05] I guess that is not easily feasible unless there is a jenkins gerrit plugin thinghy for that tho :p [14:53:51] Tobi_WMDE: revieeeeeeeeeeeeew!!! 
:p [14:54:39] JeroenDeDauw: I guess we need to create a wmde LDAP group [14:54:46] that would be useful for Gerrit projects too [14:54:53] I think your group is manually maintained right now [14:55:12] hashar: we have a group on gerrit [14:55:15] not a wmde one [14:55:17] JeroenDeDauw: almost finished with testing [14:55:17] a wikidata one [14:55:19] :-P [14:56:42] Tobi_WMDE: hashar: so how feasible is it to have the Selenium tests we have run on WMF Jenkins? [14:56:58] JeroenDeDauw: we don't run selenium :D [14:57:11] hashar: why not? [14:57:15] hashar: that's what I thought [14:57:22] JeroenDeDauw: for now that is done using a third party : browserstack / cloudbees [14:58:22] so we though it would cost too much to build a farm of virtual instances that runs a gazillon of different browsers / OS [14:58:32] so instead we get the service from two 3rd parties [14:58:35] hashar: sure. We have selemium tests now though and are relying on them. They are run on our own Jenkins. Would be better if this was linked to gerrit though. So is there anything that prevents us from moving our current selenium infrastructure to labs and have it run by WMF Jenkins? [14:58:47] zeljkof is maintaining them. I am sure he could give you an introduction course / get you an account [14:59:03] ahh [14:59:08] hashar: can we use these things to run Selenium tests we wrote? [14:59:12] Or is that just for QUnit? [14:59:15] JeroenDeDauw: oh please no! labs is way too slow [14:59:20] that is selenium [14:59:37] QUnit has been implemented by Timo. They are being run in a headless browser (phantom.js) [14:59:53] and labs is a firm NO. It is not reliable :/ [15:00:03] is it possble to give me a temporary sysop flag for 5 minutes (300 seconds) to translate http://www.wikidata.org/w/index.php?title=Special:Translate&taction=translate&group=translate-workflow-states&language=uk&filter=!translated&action=translate [15:00:07] hashar: well, we could also do better on the QUNit front. We have a lot of such tests, and it'd be awesome if they could be run by WMD Jenkins [15:00:14] Denny_WMDE1: i think there's essentially no public access to OAI [15:00:22] hashar: ok, no labs then [15:00:27] (in general, not specifically wikidata) [15:00:34] <^demon> jeremyb_: Yes. [15:00:43] if yes ask me when i will be ready because i'm afk for I while now [15:00:52] Barras: why do you need a sysop flag? [15:00:55] for that at least? [15:00:57] JeroenDeDauw: for QUnit the first step was to get them running for mw/core . The second step is having them running reliably for the Visual Editor extension (timo is part of that WMF team). [15:01:12] JeroenDeDauw: once they work fine on VisualEditor, we can generalize QUnit to other extensions. [15:01:13] er [15:01:19] hashar: Tobi_WMDE: so the two alternatives remaining for the Selenium tests then are: 1. we continue using our current infrastructure but have it linked up to gerrit. 2. we have the tests run by the 3rd party things you mentioned, if this is indeed possible [15:01:20] ^demon: you agree? [15:01:23] Base-w: why do you need a sysop flag to translate? [15:01:38] legoktm: I don't need any sysop flag... I've that access anyway. :-p [15:01:38] ^demon: did you see the thread about Instant data repo? [15:01:44] (wikidata-l) [15:02:01] <^demon> Huh? No. 
[15:02:07] Barras: wrong ping sorry :P [15:02:18] <^demon> I was just saying yes to "No you can't get to OAI without a user/password" [15:02:24] ok, good [15:02:31] well see the list if you feel like it :) [15:02:32] hashar: QUnit: well, it sucks for us to be blocked on that. What if we want to put some effort into this ourselves? [15:02:43] JeroenDeDauw: our selenium tests take about one hour to run once. do you think that would make sense to hook them to gerrit for each changeset? [15:03:21] Tobi_WMDE: perhaps not; however being able to request them to run for a single changeset would be great [15:03:33] <^demon> jeremyb_: I'm not on the list. Thread so I can search archives? [15:03:39] JeroenDeDauw: indeed, that would be nice [15:03:50] JeroenDeDauw: we could also discuss this on monday [15:04:01] I think we have a CI discussion then [15:04:01] <^demon> Ah, found it. [15:04:04] 26 06:52:47 < jeremyb_> legoktm: http://lists.wikimedia.org/pipermail/wikidata-l/2013-April/002176.html [15:04:32] Tobi_WMDE: a one hour run would not be very helpful for Gerrit reporting indeed :-] [15:04:55] Tobi_WMDE: yeah, we do have a meeting on this. It's good to get some of hashars thoughts before then though [15:05:14] hashar: presumably we can run some subset [15:05:24] JeroenDeDauw: Tobi_WMDE want me to join your monday meeting? or we can meet the three of us over skype/google hangout [15:05:54] hashar: in any case, even if we end up not running everything for each commit, it'd be very good to be able to trigger the tests manually. Right now only a few people have a selenium setup [15:05:57] <^demon> jeremyb_: tldr: Cross-wiki stuff is hard ;-) [15:06:09] hashar: that could work perhaps [15:06:16] hashar: JeroenDeDauw: sure. it's 3pm our time [15:07:58] ^demon: :-) [15:08:16] hmm [15:08:27] hashar: for now, it'd be great if we could get PHPUnit to work - so is it possible to give us the trigger build rights? [15:08:27] I am not sure I am willing to assist a meeting in German though :-D [15:08:41] hashar: I don't know German, so it'll be in English [15:08:47] oh my god [15:08:52] BREAKING NEWS [15:09:02] I thought you were german :) [15:09:13] lol :) [15:09:16] lulz [15:09:16] Nope, Dutch [15:09:19] <--- epic fails [15:09:41] (Permission is hereby granted to quote me and have fun) [15:10:07] hehe, what could possibly go wrong? [15:10:22] me revealing I am french? [15:10:25] oops [15:11:46] ^demon: oh, whoops, just now noticed that he /quit before i even started talking to him. and now the guy on the list figured out it's password protected and requested access [15:11:59] hashar: can you see this list? https://gerrit.wikimedia.org/r/#/admin/groups/32,members [15:12:00] JeroenDeDauw: Tobi_WMDE Iam dropping a mail and CCing Zeljko Filipin who is the selenium boss :-] [15:12:04] JeroenDeDauw: yeah [15:12:13] huh [15:12:14] heh [15:12:19] hasharok [15:12:19] hashar: well, if those people can get access... [15:12:24] JeroenDeDauw: will fill a RT ticket for ops to convert that to a LDAP group. Something like wmde [15:12:24] hashar: ok [15:12:30] hashar: or just me for now, then I'll stop shouting already :) [15:12:54] JeroenDeDauw: I prefer to have you shouting on the RT ticket for the benefit of everybody :-] [15:12:56] <^demon> Why does it need to be an ldap group? [15:13:08] <^demon> This is gerrit, please don't throw it in the RT black hole ;-) [15:14:42] ^demon: to add more rights to the wmde people in Jenkins, I need a ldap group :-] [15:14:59] <^demon> Ohhh, misunderstood. 
[15:15:15] <^demon> Well, if we make that an ldap group, we can do the same in gerrit so we can manage it in one place. [15:15:18] ^demon: then I guess we can migrate the Gerrit group to use ldap/wmde :] [15:15:36] <^demon> There'd be a one-time cost of cleaning up ACLs, but other than that it'd work just fine. [15:15:53] New patchset: Jeroen De Dauw; "Do not load Client in test entry point file as else it is completely broken" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61011 [15:16:04] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61011 [15:17:37] hashar: ^demon: beware, wmde is not the group we have now [15:17:41] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60872 [15:17:52] We currently have a group for people that activly work on the WD codebase [15:18:01] There is an intersection with WMDE people clearly [15:18:10] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/60931 [15:18:17] Though there are WMDE people not associated with what we are doing and non-WMDE people working on it as well [15:18:36] <^demon> Well, the wikidata group could inherit the ldap/wmde group, plus extras [15:18:37] <^demon> :) [15:19:03] <^demon> Like https://gerrit.wikimedia.org/r/#/admin/groups/1,members [15:19:06] That would work I guess [15:19:53] hashar: can you trigger a build now? [15:20:00] DanielK_WMDE: aude: is the fixed claim-patching something we want to backport already to wmf3? I guess so.. [15:20:03] It seems the builds are not triggered for each commit? [15:20:29] Tobi_WMDE: i think so [15:20:38] JeroenDeDauw: build link please? : -] [15:22:35] hashar: oh wait... one started now [15:22:41] hashar: did you start that? [15:22:53] JeroenDeDauw: nop [15:22:55] hashar: or does it just takes 5 mins before jenkins picks up there is a new commit? [15:23:22] aude: can you glimpse at the board please, and send me the related bugs for the claim-edit-conflict thing? [15:23:25] Tobi_WMDE: JeroenDeDauw : sent you a mail + Zeljko. Example of a Selenium run: https://saucelabs.com/jobs/5215e577f18a4255a7a4fdaf68c72573 :-D [15:23:31] I'll have to check them [15:23:51] JeroenDeDauw: https://integration.wikimedia.org/ci/ [15:23:56] JeroenDeDauw: jenkins is too busy right now :d [15:24:01] Tobi_WMDE: 44101 [15:24:08] 44547 47022 [15:24:41] hashar: thank. I_ve tried this some time ago, but it was terrible slow for our tests (written in ruby / watir) [15:24:45] but will check again [15:25:13] the cloudbees thing could be useful, at least for client testing on test2 [15:25:42] * aude seen it work for upload wizard, etc. [15:25:44] hashar: bluh, it's annoying Jenkins runs all tests in the directory and not just the registered ones [15:26:27] hashar: JeroenDeDauw: I think we would need a public accessable instance of wikidata for testing it with 3rd party [15:26:40] right? [15:26:43] aude: thanks [15:26:49] Tobi_WMDE: yeah indeed [15:26:55] Tobi_WMDE: but we can set one up on the beta cluste [15:27:05] Tobi_WMDE: and you can get one @ wmde office :D [15:27:36] tried to get one at wmde but answer was no [15:27:44] but could ask again [15:28:01] or use labs :D [15:28:12] legoktm: because massages is in ns8 [15:28:18] and wait five hours for the tests to finish? 
[15:28:29] Base-w: those messages should be translated on translatewiki.net then [15:28:33] New patchset: Jeroen De Dauw; "Another attempt to make the tests run on Jenkins" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61013 [15:28:37] and increase request timeout to two minutes? [15:28:40] legoktm: realy? hm [15:28:41] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61013 [15:28:41] better not [15:28:53] Base-w: yes, all interface messages should be translated on twn [15:28:57] why are they in the language stats on wd? [15:29:04] and on meta too [15:29:10] i'm not sure [15:30:50] legoktm: there is no message MediaWiki:Translate-workflow-state-progress and other on twn [15:30:59] I uhhhh [15:31:13] you should talk to someone in #mediawiki-i18n then [15:33:00] Tobi_WMDE: can you please resend the meeting invite to amusso@wikimedia.org ? Also add in zfilipin@wikimedia.org please :-] [15:34:34] hashar: done [15:34:38] thx [15:34:38] Danwe_WMDE: Henning_WMDE: any chance one of you can answer http://www.wikidata.org/wiki/Wikidata:Contact_the_development_team#JSON ? [15:36:22] New patchset: Jeroen De Dauw; "Another attempt to make the tests run on Jenkins" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61015 [15:40:26] aude: do you have any idea what is causing this error? https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/1470/console [15:40:32] Only happening when client tests are included [15:41:31] Lydia_WMDE: yeah [15:41:49] JeroenDeDauw: adding the wmde LDAP group is https://bugzilla.wikimedia.org/show_bug.cgi?id=47734 I have filled the RT ticket and you are on copy of both of them. [15:41:59] JeroenDeDauw: no [15:43:07] hashar: is there any way we can have a directory with tests excluded from the jenkins run? [15:43:26] no idea what settings are used [15:43:27] aude: any suggestion on where to start debugging? [15:43:37] aude: the example settings [15:43:41] New patchset: Tobias Gritschacher; "Introduced ValueComparer interface" [mediawiki/extensions/Diff] (mw1.22-wmf3) - https://gerrit.wikimedia.org/r/61017 [15:43:42] ugh [15:43:58] JeroenDeDauw: what is the use case? [15:43:58] aude: I also tried with the settings that work for me, as seen here https://gerrit.wikimedia.org/r/#/c/61015/1/Wikibase.php [15:44:00] no idea if they work for the client and repo together [15:44:15] aude: well, i get them to work together [15:44:24] aude: and why would they not work together? [15:44:31] aude: any code in particular that is suspect here? [15:44:36] * aude rages [15:45:14] hashar: the tests for wikibase client are very fragile, they fail if the environment is not just right - they are the only thing breaking the build on wmf jenkins right now [15:45:21] don't know about the prefix stuff [15:45:33] hashar: so I'd like to have the client tests just not run for now, untill we fixed them [15:45:41] They are not relevant to most changes being made anyway [15:45:43] JeroenDeDauw: possibly we could have split in two different jobs. One for server and one for client. [15:45:52] hashar: even if setup to run in a single wiki, then it would skip setup and use cases that we have in reality for production [15:45:53] nemo_bis sayed that "yes those must be translated locally" [15:45:58] legoktm: ^ [15:46:06] huh. 
[15:46:08] ok [15:46:10] I can just write messages here [15:46:18] aude: these are UNIT TESTS, not integration or acceptance tests [15:46:22] i can't give you sysop access to translate them, but if you tell me the message ill translate it [15:46:22] it would be good to make them work on a single wiki [15:46:22] or a variant with a flag [15:46:27] ok [15:46:35] aude: if we need a production like environment to run them , we are doing it very wrong [15:46:36] JeroenDeDauw: and they depend on settings [15:46:45] They should not [15:46:46] New patchset: Tobias Gritschacher; "Fix issues in Entity::patch" [mediawiki/extensions/Wikibase] (mw1.22-wmf3) - https://gerrit.wikimedia.org/r/61018 [15:47:10] something seems to depend on the settings [15:47:16] Base-w: if you just save it on a wiki page, that will be easiest [15:47:24] hashar: yeah sure, we only need the non-client part of the split to start with though :) [15:47:25] ok [15:47:45] hashar: so any idea how we go about excluding the client tests? It's not a problem they don't run just yet [15:47:56] New patchset: Tobias Gritschacher; "Cleanup of edit-conflict Selenium test" [mediawiki/extensions/Wikibase] (mw1.22-wmf3) - https://gerrit.wikimedia.org/r/61020 [15:50:24] New patchset: Jeroen De Dauw; "DO NOT MERGE" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61021 [15:50:50] legoktm: http://www.wikidata.org/wiki/User:Base/translate-workflow-states [15:51:38] aude: sure, I am not saying this is not the case [15:51:43] I am saying it should not be the case [15:52:10] hashar: can we have jenkins somehow add the "--exclude-group WikibaseClient" option to the phpunit call? [15:52:12] Base-w: awesome, all done [15:52:15] thanks :D [15:52:40] it would be nice if i could credit you as the translator somewhere... [15:52:49] legoktm: thanks you too :) [15:52:58] Change abandoned: Jeroen De Dauw; "(no reason)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61015 [15:53:17] legoktm: yep new translate interface do not allow edit summaries :( [15:53:55] oh good, you're already a translationadmin :) [15:54:08] yep :) [15:55:07] New patchset: Daniel Kinzler; "(Bug 42063, bug 44577) RDF for Special:EntityData" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [15:56:06] but in such situations it is useless flag :) [15:56:17] yeah :/ [15:56:19] JeroenDeDauw: possibly. I will have to find a way to craft that in jenkins [15:56:42] hashar: do commits for our repo require +2 before the tests are run even when submitted by people that have +2 rights? [15:57:05] JeroenDeDauw: that is handled differently unfortunately :( [15:57:17] hashar: it'd be very much appreciated if that could be done ; I can put in time now... [15:57:30] "that" pointing to the exclude-group [15:57:31] JeroenDeDauw: Zuul has a list of committer emails which are allowed to have test run for them. [15:57:41] hashar: I am in that list no? [15:58:22] JeroenDeDauw: your gmail is :) [15:58:48] I am only using my gmail... [15:58:49] aude: Have you don anything regarding the enhanced watchlist/ change mess yet? [15:58:58] hashar: ok, I guess Jenkins is just being slow on starting the test runs [15:59:14] hoo: not recently.... still have my patch stashed [15:59:32] JeroenDeDauw: nothing waiting right now https://integration.wikimedia.org/zuul/ [15:59:41] aude: What does it do again? 
[16:01:41] hoo: try to split up parts of the code so that extensions can inject stuff in both enhanced and old changes [16:01:56] heh, we have non-client tests failing for evil reasons as well >_> [16:02:16] we should have gotten an introduction to properly writing tests at the start of the project I guess :) [16:02:17] enhanced does some coalescing and collapsing, so it's not enough to just inject extra lines [16:02:37] * aude be around later..... [16:02:46] aude: DanielK_WMDE: sent the 3 backports for the claim-edit-conflicts to review. would appreciate if you could merge. [16:03:46] hi. How can I merge two entries? [16:04:01] Q8090781 and Q9889793 [16:05:41] Change abandoned: Jeroen De Dauw; "(no reason)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61021 [16:05:45] New patchset: Jeroen De Dauw; "DO NOT MERGE~" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61023 [16:06:50] hashar: yeah, its empty... [16:07:11] hashar: still, nothing triggered for https://gerrit.wikimedia.org/r/#/c/61023/ so far [16:07:22] New patchset: Daniel Kinzler; "(Bug 42063, bug 44577) RDF for Special:EntityData" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [16:07:54] Tobi_WMDE: sorry, family time. can't promise anything before Monday. [16:08:21] DanielK_WMDE: no prob [16:08:37] JeroenDeDauw: it did ? [16:09:01] hashar: where? I don't see it [16:09:01] JeroenDeDauw: or do you mean the unit tests ? [16:09:06] hashar: https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/ [16:09:09] yeah ofc [16:09:18] I'm only caring about the unit tests ATM :) [16:09:19] JeroenDeDauw: I haven't merged your change!!! [16:09:36] hashar: err, I'd hope so? :p [16:10:08] hashar: so do I need to +2 CR that commit before the tests get run? [16:10:11] Looks like it ... [16:10:22] JeroenDeDauw: that might merge it [16:10:27] JeroenDeDauw: let me apply your change [16:11:22] JeroenDeDauw: will make it non voting to avoid blocking other peoples [16:11:34] hashar: sure [16:14:01] hashar: I guess it is quite easy to add the exclude group to all test runs no? That's somewhat evil, though it ought to not cause problems, so could work fine as temporary hack [16:16:30] JeroenDeDauw: i have enabled tests for wikibase https://gerrit.wikimedia.org/r/#/c/60211/ [16:16:38] DanielK_WMDE: why the fuck do we have two EntityChangeTest? [16:16:48] needs Zuul to reload though [16:16:50] They are both testing the same class as well, unless the docs are wrong [16:18:52] DanielK_WMDE: the one in repo is newer, and the reason it does not fatal when we run the tests is because it is not registered? wtf x 2? [16:20:00] New patchset: Jeroen De Dauw; "Move test into other namespace to prevent fatal and added FIXMES" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61026 [16:20:08] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61026 [16:20:15] New patchset: Jeroen De Dauw; "DO NOT MERGE~" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61023 [16:20:36] hashar: when does it reload? has this happened yet? 
[16:21:15] JeroenDeDauw: it is reloading :-) [16:21:29] JeroenDeDauw: it waits for all currently running jobs to have finished [16:21:40] JeroenDeDauw: should work now [16:21:53] 61026,1 has been triggered [16:23:23] https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/1475/console [16:23:27] toonnnns of E :D [16:25:13] yeah... [16:25:17] those tests make me sad [16:28:56] and some are not working on sqlite :( [16:28:57] rror: 1 near "truncate": syntax error [16:29:15] I should really migrate to MySQL [16:31:39] ops, plz delete http://www.wikidata.org/wiki/Q12015044 [16:31:54] {{done}} [16:31:54] How efficient, legoktm! [16:33:53] JeroenDeDauw: I am off for the weekend. See you on monday :-] [16:34:04] legoktm: thanks again :) [16:34:16] np [16:34:24] ebraminio: we also have [[WD:RFD]] for these kinds of things [16:34:24] [2] 10https://www.wikidata.org/wiki/WD:RFD [16:34:25] hashar: thnx for the help and have a good weekend [16:34:28] but IRC works too [16:35:18] legoktm: Yes, I am using it when something is not my error :D [16:48:04] aude: you might find it amusing that a bunch of the test failures we are having now are coming from code I recently wrote :p [17:18:14] blah, would have noticed this waaaaaaaaaaaaaaaay sooner if the tests where actually running on jenkins [17:22:03] New patchset: Jeroen De Dauw; "Make Database component tests work without a MW setup and without MySQL" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61037 [17:27:01] hi! [17:27:25] hey [17:27:28] We're looking at ISO 3166-2 codes and were surprised to see only ~800 entities (out of a few thousands) have them in wikidata [17:27:37] Is there any bot moving them over? [17:27:53] (e.g. the code is here https://en.wikipedia.org/wiki/Tartu_County but not here http://www.wikidata.org/wiki/Q192370 ) [17:28:19] reosarevok: probably not. [17:28:26] wanna write one? :D [17:28:46] I should be writing code in my own open data project :p [17:29:27] Actually I was mostly hoping someone who has bots for moving data from WP was interested in doing it ;) [17:29:40] (if nobody is I *can* look into it, but it'd be slower :( ) [17:30:06] you can start a new request at https://www.wikidata.org/wiki/Wikidata:Bot_requests [17:30:37] a bunch of people are disappearing for the next few weeks due to finals :( [17:31:22] heh, that's *also* something I should be doing [17:31:23] But well [17:31:26] I'll look [17:32:25] if you're using python https://github.com/earwig/mwparserfromhell is a true mediawiki text parser that works near-perfectly with templates [17:34:03] New patchset: Jeroen De Dauw; "Made SetupTest run without actually accessing a real database" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61041 [17:35:16] New patchset: Jeroen De Dauw; "Made SetupTest run without actually accessing a real database" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/61041 [17:36:41] heh, pip install errored out [17:36:59] Let's try the dev version [17:52:10] DanielK_WMDE: local langlink api response seems to be wrong since about two weeks because of not cleaned local langlink table [17:52:19] is this known? [17:56:32] e.g. http://de.wikipedia.org/w/api.php?action=query&prop=langlinks&titles=Alawiten%20%28Begriffskl%C3%A4rung%29 should not return any langlink [17:59:02] that doesn't seem good.... [18:05:06] aude: Awake? [18:05:31] Time for wikimaps this weekend aude? [18:06:23] maps? 
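On reosarevok's ISO 3166-2 question above, legoktm points at mwparserfromhell for pulling values out of enwiki templates. A minimal sketch of that kind of harvesting is below; it is illustrative only — the template and parameter names ("Infobox settlement", "iso_code") are assumptions that vary per article, and an actual bot would go through the bot-requests process and pywikibot rather than raw requests, then write the value to the item on Wikidata.

```python
# Rough sketch, not a production bot: fetch an enwiki article and pull an
# ISO 3166-2 code out of its infobox with mwparserfromhell.
import requests
import mwparserfromhell

def iso_3166_2_from_enwiki(title):
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "prop": "revisions",
            "rvprop": "content",
            "titles": title,
            "format": "json",
        },
    ).json()
    page = next(iter(resp["query"]["pages"].values()))
    text = page["revisions"][0]["*"]

    code = mwparserfromhell.parse(text)
    for template in code.filter_templates():
        name = str(template.name).strip().lower()
        # "infobox" / "iso_code" are illustrative guesses at the names used.
        if name.startswith("infobox") and template.has("iso_code"):
            return str(template.get("iso_code").value).strip()
    return None

print(iso_3166_2_from_enwiki("Tartu County"))  # prints the code if the infobox carries one
```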
[18:08:38] multichill: maybe and wikidata coding [18:12:54] legoktm: Are you on maps-l? [18:13:03] nope [18:13:12] I already feel like I'm on too many mailing lists [18:13:15] aude / legoktm : http://lists.wikimedia.org/pipermail/maps-l/2013-April/001254.html [18:13:26] :) [18:13:49] So if you get an instance up and running we might be able to drag Tim in to help out [18:15:17] yay! [18:15:18] looks pretty cool [18:15:32] legoktm: I wrote https://commons.wikimedia.org/wiki/User:Multichill/Wikimaps some time ago. It a live of it's own and it got rebooted in London at the GLAMwiki conference [18:15:58] grmbl [18:18:53] I have installed mapwarper on my vps. Now I hate ruby ;) [18:19:08] seems like a fun project :) [18:19:40] if you dont have a labs project yet, you should poke in -labs now since it seems like both ryan and andrew are there [18:19:48] @seen Denny_WMDE [18:19:49] liangent: Last time I saw Denny_WMDE they were quitting the network with reason: Ping timeout: 268 seconds N/A at 4/26/2013 12:53:20 PM (05:26:28.2163610 ago) [18:59:23] Someone linked me a wikidata-related file they wanted vectorised a few days ago. Does anyone know what it might have been? [18:59:25] Yes [18:59:26] It was me. [18:59:31] The crat icon. [18:59:37] https://www.wikidata.org/wiki/File:Wikidata_bureaucrat.png [18:59:47] Oh. [18:59:51] Link? [19:00:02] 'Kah. [19:00:04] y [19:00:06] ǵ [19:00:12] Ack. [19:02:02] Wheeeee! [19:08:06] legoktm: What's the copyright? [19:08:46] either cc-by-sa 2.5 or 3.0 [19:09:06] Hmm... [19:09:32] also {{WMFTrademark}} or something [19:09:33] [3] 04https://www.wikidata.org/wiki/Template:WMFTrademark [19:11:09] Any idea who the author(s) of this are? https://commons.wikimedia.org/wiki/File:Human-preferences-desktop.svg [19:12:28] Hello! [19:12:48] I was wondering if it would be possible to receive some feedback on my proposal before the weekend? [19:13:49] legoktm: Para ti: https://commons.wikimedia.org/wiki/File:Wikidata-tools.svg [19:14:03] <3 [19:15:07] https://www.wikidata.org/w/index.php?title=Template%3AUser_bureaucrat&diff=31379226&oldid=30347215 [19:15:08] :DDDD [19:15:17] Congratulations or something. [19:23:38] Isarra: Any chance you can svg-ify this one too? https://commons.wikimedia.org/wiki/File:Wikidata_oversight.png [19:23:56] Sure. [19:24:14] But I'm going to need you to fix Zombiebaron for me. [19:26:59] aude: Why do I see add links at https://en.wikipedia.org/wiki/February_1933 ? [19:27:08] What's with the excess padding on the wikidata logo? [19:27:13] It's already at WIkidata. A null save doesn't seem to make a difference [19:27:30] multichill: thats the new link widget thing [19:27:40] Isarra: I'm not sure. it's just what Hahc does. [19:28:39] It was on the original. [19:28:45] Oh [19:28:48] Well then [19:28:51] I have no clue. [19:32:53] This new widget sucks :@ [19:34:04] Ok forget it [19:34:08] This sucks ass [19:34:35] legoktm: Do we already have a bot to merge items? I tried using the interface and that's not ready yet [19:34:54] I have a script that can do it [19:36:01] legoktm: https://commons.wikimedia.org/wiki/File:Wikidata_oversight.svg [19:36:07] I just want to say mergeshit.py -lang:en -page:Pagetitle -lang2:nl -page2:Pagina_titel [19:36:29] It will strip everything from page2 and move it to page [19:37:18] legoktm: Is that what your script does? Can you please publish it in Pywikipedia? 
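multichill's one-liner above ("strip everything from page2 and move it to page"), together with the outline legoktm gives a little further down (get the item, blank it through editentity, re-add the data), translates into roughly the sketch below. It is illustrative only: login, edit tokens, error handling and statements (whose GUIDs would have to be stripped before re-adding them to the target) are all left out, it overwrites any labels the target already has in the same languages, and the ordering matters — the source has to be blanked before its sitelinks are re-added, since a sitelink can only sit on one item at a time.

```python
# Very rough sketch of a "merge item B into item A" helper, following the
# outline in this log: read B, blank B via wbeditentity, then copy what it
# had onto A. Tokens and conflict handling are assumed to exist elsewhere.
import json
import requests

API = "https://www.wikidata.org/w/api.php"

def get_entity(qid):
    resp = requests.get(API, params={
        "action": "wbgetentities", "ids": qid, "format": "json",
    }).json()
    return resp["entities"][qid]

def merge_items(source_qid, target_qid, edit_token):
    source = get_entity(source_qid)

    # Blank the source first so its sitelinks are free to move.
    requests.post(API, data={
        "action": "wbeditentity", "id": source_qid, "clear": 1,
        "data": json.dumps({}), "token": edit_token, "format": "json",
    })

    # Re-add the sitelinks and labels on the target item.
    data = {
        "sitelinks": source.get("sitelinks", {}),
        "labels": source.get("labels", {}),
    }
    requests.post(API, data={
        "action": "wbeditentity", "id": target_qid,
        "data": json.dumps(data), "token": edit_token, "format": "json",
    })
```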
[19:38:13] oh no [19:38:44] mine was written to take all the "Films released in XXXX" categories in 5 languages and merge all the items [19:40:54] Hi everyone, I have a question that surely has been asked before: What about properties with item type whose potential values are a lot of people, which could or could not have their own item? [19:41:17] Examples, majors of all towns and villages in the world [19:43:16] Anyone watching? [19:45:14] https://www.wikidata.org/w/index.php?title=Q5395595&action=history <- wtf just happened here? [19:49:59] jem-: Thats qualifiers I guess [19:51:38] multichill: it shouldnt be too hard to write a mergeshit [19:51:47] you just need to first get the item [19:51:50] store it [19:51:57] then blank it through editentity [19:52:14] then readd everything, then re-add in the claims/refs [19:52:33] hello. I've a question about [[Q849295]] In the french version, the english version interwiki points on http://en.wikipedia.org/wiki/Template:Country_data_Sz%C3%A9kely_Land and not http://en.wikipedia.org/wiki/Sz%C3%A9kely_Land as described in wikidata. Do you know why ? [19:52:34] [4] 10https://www.wikidata.org/wiki/Q849295 [19:53:53] multichill: In which way would qualifiers help? I think I don't see it [19:54:33] New York. Major -> Bloomberg, Qualifier [19:54:42] And you have a long list of majors [19:54:47] Abaddon1337: I'm betting its a messed up template. Lemme see. [19:56:16] ok, thx [19:56:56] multichill: Ok, but if it's about a small town, Wikidata won't let me enter the name of the major in the first place, if he has no item [19:56:58] Abaddon1337: found it [19:57:04] https://fr.wikipedia.org/w/index.php?title=Mod%C3%A8le%3ACountry_data_Pays_sicule&diff=92501606&oldid=89706976 [19:57:30] So create an item for the major [19:59:06] legoktm: thanks a lot. I could have spent a long time on http://fr.wikipedia.org/wiki/Pays_sicule trying to find what's wrong in the article.. [19:59:19] np :) [20:00:09] Well, I don't know if possibly millions of such items with no associated articles would be allowed [20:07:49] jem-: We all decide on that [20:18:04] legoktm:der [20:18:06] ?? [20:18:13] hi [20:18:19] http://www.wikidata.org/w/api.php?action=wbcreateclaim&entity=q511968&property=p351&snaktype=value&value=1017&token=foobar [20:18:31] this is also showing same error [20:18:39] i dont think the prob is with module [20:19:11] well yeah [20:19:18] value should be wrapped in a " " [20:19:25] but json.dumps should take care of that [20:20:01] [03:19:11 PM] well yeah [20:20:01] [03:19:17 PM] value should be wrapped in a " " [20:20:01] [03:19:25 PM] but json.dumps should take care of that [20:20:34] [20:21:18] ya thanks [20:21:47] i'll pull request pywikibot rectifying this bug [20:21:48] :) [20:29:35] I think we hit the 12th million item. [20:30:33] we did yesterday [20:30:47] Yesterday? [20:30:53] or at least in my yesterday [20:31:38] Looks like it is your yesterday legoktm :) [20:45:36] Ok, thanks, multichill, I keep the idea [20:49:10] Hi, I've asked at forum but no one answered: can I turn off case-insensitive search (and use case-sensitive)? [20:50:56] I don't think thats an option per se. [20:51:08] aude: ^
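On the wbcreateclaim error near the end of the log: for a string-valued property the `value` parameter has to be the JSON serialization of the datavalue, i.e. a quoted string ("1017") rather than a bare 1017 — which is what json.dumps produces, as noted above. A minimal sketch (the entity and property IDs are just the ones from the pasted URL, and a valid edit token fetched from www.wikidata.org itself — not another wiki, per the earlier session-failure discussion — is assumed):

```python
import json
import requests

API = "https://www.wikidata.org/w/api.php"

def create_string_claim(entity_id, property_id, string_value, edit_token):
    # wbcreateclaim expects the datavalue as JSON, so the string must be
    # wrapped in quotes: json.dumps("1017") -> '"1017"', not bare 1017.
    return requests.post(API, data={
        "action": "wbcreateclaim",
        "entity": entity_id,
        "property": property_id,
        "snaktype": "value",
        "value": json.dumps(string_value),
        "token": edit_token,
        "format": "json",
    }).json()

# IDs taken from the URL pasted above:
# create_string_claim("Q511968", "P351", "1017", token)
```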