[08:54:23] Change merged: jenkins-bot; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64599 [09:03:33] DanielK_WMDE: daily? [09:10:48] Abraham_WMDE: oops [09:10:56] are you already done? [09:11:11] DanielK_WMDE: reviewing your patches for entity data [09:11:25] aude: great [09:11:50] Abraham_WMDE: sorry - that's what happens when I try to do "just a little coding while i wait" :P [09:11:54] https://gerrit.wikimedia.org/r/#/c/64037/ is up for review [09:11:56] anyway [09:12:11] if there is any additional feedback or whatnot [09:12:16] today for me: reviews, rdf content negotiation, more rdf mapping and testing [09:12:24] sounds good [09:12:44] and: oh god. more drilling in our house. just below me. gah. [09:12:47] * aude also working on enhanced changes this week, in prep for the hackathon (to get feedback and move that forward) [09:12:49] hey Denny_WMDE [09:12:53] ack [09:12:58] DanielK_WMDE: hey [09:13:26] DanielK_WMDE: i could use your help when you have time [09:13:27] let's see if in say 15 min, i can set up squid and really try to verify the patches :) [09:13:43] don't like to guess on the flush() one [09:18:58] aude: what guess? [09:19:21] Denny_WMDE: sure, what's up? [09:19:50] i am having a bit of trouble with this changeset [09:20:04] https://gerrit.wikimedia.org/r/#/c/64538 [09:20:25] i would appreciate it if you could help me with adapting the tests to the suggested changes [09:20:31] and also get your approval / agreement on the changes [09:20:43] i'd like to see this in before wednesday [09:20:56] there might be one or two things inside necessitating discussion though [09:22:36] DanielFriesen: according to the spec, NS_DATA would be /wiki/Special:EntityData/ [09:22:50] err, Denny_WMDE--^ Sorry DanielFriesen! [09:23:07] ah, ok [09:23:10] will change that [09:23:14] i was wondering about this [09:23:26] that was my biggest point for discussion :) thanks! [09:23:28] DanielK_WMDE: inability to verify flush() vs. the other thing we did [09:23:38] that we know works [09:24:37] aude: we don't know that "it" works. We did two things: remove flush(), and add ob_start .. ob_end_flush. We know that one of these, or the combination, worked. [09:24:58] according to the documentation, i do not see how ob_start .. ob_end_flush could have any impact on what we are trying to do [09:25:07] of course, it would be great if we could actually test this. [09:25:09] sure [09:25:29] one more reason to get test.wikidata.org up and running :) [09:25:46] but not great to experiment with production [09:25:51] * aude see if i can reproduce [09:26:02] and would be good to test squid stuff anyway [09:26:30] yes, testing squid stuff would be good [09:26:35] it's not difficult [09:26:56] aude: once we can configure the cache time for the data stuff, we should set it to an hour or so. [09:27:02] ok [09:27:45] ...until we have implemented purging for that stuff; i have submitted a patch to core for this [09:29:56] Denny_WMDE: NS_DATA would not be hardcoded to /wiki/Special:EntityData/ btw - it should get the Title object for the special page, and fetch the path from that [09:30:11] otherwise, installations using a different setup will break [09:30:12] nice point, thanks [09:30:13] will do so [09:31:55] btw is there anywhere a description of how to install wikibase w/ easyrdf?
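A minimal PHP sketch of the approach DanielK_WMDE suggests above for NS_DATA: resolve the Special:EntityData base path from the special page's Title object instead of hardcoding /wiki/Special:EntityData/. The variable names and the 'Q42' example are illustrative, not the actual Wikibase code; SpecialPage::getTitleFor() and Title::getCanonicalURL() are MediaWiki core methods.

```php
<?php
// Illustrative sketch only: derive the Special:EntityData base URI from the
// wiki's own configuration instead of hardcoding '/wiki/Special:EntityData/'.
$dataTitle   = SpecialPage::getTitleFor( 'EntityData' );
$dataBaseUri = $dataTitle->getCanonicalURL() . '/';

// Installations with a different article or script path now get the correct
// prefix automatically instead of breaking on a hardcoded '/wiki/' path.
$exampleDataUri = $dataBaseUri . 'Q42';
```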
[09:33:36] Denny_WMDE: i don't think so, no [09:34:02] Denny_WMDE: i suggest to maintain all such documentation in the repo, in the docs/ directory [09:34:17] easier to keep in sync with the code that way [09:34:40] we can use wiki format, so we can easily copy the documentation to a wiki [09:34:51] (hm, i think the options documentation needs some loving...) [09:35:23] i will update the data ns, but the tests still will fail, and i dont know how to fix that [09:35:34] maybe there is an error in the code, or the tests do not fit, dunno [09:38:11] Denny_WMDE: i'll have a look at the tests once you updated the change [09:41:01] ok [09:50:44] New patchset: Denny Vrandecic; "changed namespaces a bit, added license" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538 [09:54:19] DanielK_WMDE: done ^^ [09:56:51] New patchset: Tobias Gritschacher; "Fixed wrong decade precision" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64790 [09:59:44] Denny_WMDE: do you want feedback *now*, or is it ok in an hour? [09:59:53] then i could finish the content negotiation stuff first [10:00:57] DanielK_WMDE: later is fine [10:01:06] just at some point today would be nice [10:01:10] ok [10:01:11] !admin http://www.wikidata.org/wiki/Special:Contributions/54.244.58.84 [10:01:16] hi [10:01:22] hey [10:01:37] can one of you give this ip a kick? :D [10:01:41] 54.244.58.84 is already blocked. Do you want to change the settings? [10:01:45] hah [10:01:46] ok [10:01:48] thx [10:02:02] sorry i'm just quick on the draw :P [10:02:05] lol [10:03:26] Lydia_WMDE: can you please ban everyone from Commons or something like that? This is not acceptable: https://bit.ly/nyandata [10:03:36] New patchset: Tobias Gritschacher; "Fixed typo in test output" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64792 [10:03:42] gj JeroenDeDauw [10:03:43] 404 [10:03:54] Oh [10:03:54] JeroenDeDauw: *sob* [10:03:56] i know [10:03:59] You mean the 404 isn't acceptable? [10:04:09] Reedy: the wikidata nyancat was deleted [10:04:10] it got deleted :( [10:04:15] JeroenDeDauw: Upload it to wikidatawiki as fair use or some such [10:04:30] * legoktm slaps Reedy  [10:04:41] * Reedy uploads legoktm to wikidatawiki as fair use [10:04:46] We have uploads disabled I think - unless there would be an unexplained glitch in the config for a minute? :D [10:05:05] https://www.wikidata.org/wiki/Special:Upload [10:05:09] Local file uploads are disabled. Please upload files to Wikimedia Commons. [10:06:07] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64792 [10:09:44] New patchset: Tobias Gritschacher; "Fixed wrong decade precision" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64790 [10:23:35] New review: Jeroen De Dauw; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/64644 [10:24:12] Denny_WMDE broke the build! https://gerrit.wikimedia.org/r/#/c/64538/ [10:24:15] Change merged: Denny Vrandecic; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64790 [10:24:24] Now where is the USB rocket launcher I ordered? [10:24:39] JeroenDeDauw: but that wasn't merged! [10:24:43] that only caunts if it gets merged. [10:24:50] and whoever merged it gets the blame, no? [10:25:06] Denny_WMDE: for "known bad" code, you can use git review -D [10:25:14] that marks it as a draft in gerrit. 
[10:25:18] ah, good [10:25:24] jenkins doesn't run on those, i think [10:25:34] New patchset: Denny Vrandecic; "changed namespaces a bit, added license (draft, DO NOT MERGE)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538 [10:25:35] ...and drafts are only visible to reviewers, not public [10:25:48] i made a comment in the commit message now [10:25:52] the reviewers actually assigned to it, that is [10:25:57] DanielK_WMDE: jenkins does review drafts [10:26:06] ah. good. [10:26:11] wait, chaning the commit message removes all comments on the patch? [10:26:12] i hope it doesn't try to merge them :) [10:26:26] Denny_WMDE: yes, because it creates a new change set [10:26:27] no it does not merge [10:26:34] (facepalm) [10:26:36] Denny_WMDE: gerrit is pretty dumb about that [10:26:46] same with rebasing, etc [10:26:59] the comments are not gone, they are still in the earlier version [10:27:06] but you have to actually look there to see them [10:27:17] marumpfl. [10:27:24] :P [10:27:31] could be smarter [10:27:37] indeed [10:27:49] like everything :) [10:28:05] but denny, you don't like smart, remember?... [10:28:26] not in the first iteration :) [10:33:35] Change merged: jenkins-bot; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64541 [10:37:50] New patchset: Aude; "Add missing namespace use" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64794 [10:44:06] Change merged: jenkins-bot; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64542 [10:46:37] New patchset: Jeroen De Dauw; "Change README to README.md and added Travis CI build status" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64795 [10:46:50] Denny_WMDE: you can haz easy review https://gerrit.wikimedia.org/r/#/c/63677/ [10:46:58] New patchset: Jeroen De Dauw; "Remove some deprecated methods" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/63677 [10:47:09] New patchset: Jeroen De Dauw; "Added preliminary diff merger functionality" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/63690 [10:47:15] New patchset: Jeroen De Dauw; "Added ArrayComparer interface" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/63713 [10:48:45] Change merged: Denny Vrandecic; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/63677 [10:48:52] yay, like easy reviews [10:50:35] Change merged: jenkins-bot; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64543 [10:51:27] Change merged: Jeroen De Dauw; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64795 [10:51:45] New patchset: Jeroen De Dauw; "Actually add the build status image now" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64796 [10:52:09] Change merged: jenkins-bot; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64796 [10:58:44] New patchset: Jeroen De Dauw; "Added Travis CI file" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64797 [11:00:52] Tobi_WMDE: Denny_WMDE: https://gerrit.wikimedia.org/r/#/c/64797/ [11:03:50] Change merged: jenkins-bot; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64797 [11:04:32] Tobi_WMDE: https://travis-ci.org/JeroenDeDauw/Ask [11:05:09] Denny_WMDE: Abraham_WMDE: I'm taking the day off btw, to tired to do serious programming [11:09:12] JeroenDeDauw: ok [11:13:29] Change merged: jenkins-bot; 
[mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64544 [11:14:55] Change merged: jenkins-bot; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64545 [11:15:58] Change merged: jenkins-bot; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64546 [11:18:48] New patchset: Tobias Gritschacher; "(bug 48145) Moves "Time" data type out of experimental" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64547 [11:34:06] New patchset: Tobias Gritschacher; "Removed dead code" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64800 [12:25:34] Change merged: Denny Vrandecic; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64547 [12:36:01] New patchset: Jeroen De Dauw; "Improved ListPatcherTest" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64803 [12:43:35] Abraham_WMDE: this tool might be of use to us, and has good metrics for the monitor thing: http://phpundercontrol.org/images/0.3.5-metrics.png [12:43:51] WMF recently hired some guy that contributed to that thing as well [12:44:25] JeroenDeDauw: thanks for the pointer [12:46:22] New patchset: Jeroen De Dauw; "Add unhappy path test to CallbackListDifferTest" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64804 [12:48:43] New patchset: Jeroen De Dauw; "Add unhappy path test to MapDifferTest" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64805 [12:53:06] New patchset: Jeroen De Dauw; "Add unhappy path test to DiffTest" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64806 [12:53:46] Denny_WMDE: 4 more easy commits here https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/Diff,n,z [12:53:51] aude: how far did you get with reviewing the EntityData stuff? [12:54:59] it would be helpful to have Ic13fb09ad5a merged, the content negotiation stuff somewhat overlaps with that [12:56:14] Change merged: Denny Vrandecic; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64806 [12:56:53] Change merged: Denny Vrandecic; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64805 [12:57:14] Change merged: Denny Vrandecic; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64804 [12:59:04] Change merged: Denny Vrandecic; [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/64803 [13:11:33] Denny_WMDE: there? [13:12:15] DanielK_WMDE: yes [13:12:57] Denny_WMDE: so, the problem is $dataResource = $graph->resource( $entityUri . '.rdf' ); in RdfBuilderTest [13:13:19] that's just wrong, the data URI can't be derived from the entity URI. [13:13:28] but the root of the problem is deeper [13:13:38] ah, true, this one was a mistake [13:13:43] but i think there's more [13:13:44] you replaced the '#' with the full document URI [13:14:06] that is not semantically equivalent. using '#' serves one important purpose: [13:14:20] it says "this info is about THIS HERE document". [13:14:33] ...and then gives the URI of that document in addition. [13:14:47] well, a schema:url, at least [13:15:03] DanielK_WMDE: # is a very bad idea [13:15:22] especially if you want to use the same code for making a dump of all entities [13:15:30] ah, true [13:15:45] but it's very useful if the document is just that one document.
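Roughly what the distinction being discussed here looks like in code: the entity URI and the data-document URI are two independent URIs, and the document resource points at the entity it describes. This is only a sketch assuming the EasyRdf_Graph API (resource(), addResource(), EasyRdf_Namespace::set()); the schema:about statement and the $dataBaseUri variable are illustrative, not the actual RdfBuilder output.

```php
<?php
// Illustrative only: do not derive the data-document URI from the entity URI.
EasyRdf_Namespace::set( 'schema', 'http://schema.org/' );

$graph = new EasyRdf_Graph();

$entityUri = 'http://www.wikidata.org/entity/Q42'; // the thing itself
$dataUri   = $dataBaseUri . 'Q42.rdf';             // the document about the thing; $dataBaseUri is injected/configured elsewhere

$entityResource = $graph->resource( $entityUri );
$dataResource   = $graph->resource( $dataUri );

// Document-level statements hang off the data resource,
// the actual entity data hangs off the entity resource.
$graph->addResource( $dataResource, 'schema:about', $entityResource );
$graph->addResource( $dataResource, 'schema:url', $dataUri );
```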
[13:15:51] *sigh* [13:15:55] oh, well, never mind that for now [13:16:10] # only is a shortcut and means "use the url of this document" [13:16:15] when you move it, it moves along [13:16:20] when yo download it it moves along [13:16:30] yes, which is correct [13:16:33] just using a stable proper URI here is a better solution [13:16:41] because that is the name of the document [13:16:42] as long as the data is unchanged, the '#' refers to the right thing [13:16:46] which was the point of using it [13:17:03] # is just a bad shortcut [13:17:30] no, it's the only way to refer to the present document. [13:17:35] but let's not block on that [13:17:59] (actually, we could use #q123, and set xml:id attributes... but that only works with xml) [13:18:00] anyway [13:18:13] to fix the issue at hand (which causes all 4 failures): [13:18:53] give RdfBuilder a getDataURI( $suffix ) method, and use that in the test setup [13:18:55] let's discuss that f2f, but trust me, # is bad :) there is absolutely no reason to not use the name of the document proper [13:19:03] ok [13:19:07] or getDataURI( $entityId, $format, $revision ) or something [13:19:13] i dont know how to use it in test setup [13:19:59] in the line use modified there. line 110 in RdfBuilderTest. You have a $builder there, just call it to get the URI [13:20:22] using a dummy builder there is kind of ugly, since it's the thing under test, but i don't see a good way around that [13:20:26] hm, actually... [13:21:09] Denny_WMDE: i suppose we should avoid depending on global state for that URI. So my suggestion to use a Title object was bad (or incomplete) [13:21:16] this should not be in the builder: \Title::newFromText( 'Special:EntityData' )->getCanonicalURL() . '/'; [13:21:31] it should be wherever the builder is constructed, and injected into the builder [13:21:46] DanielK_WMDE: got it [13:21:46] that way, you can use some random string in the test, no need to use the dummy object [13:22:49] good [13:23:02] ain't that true for baseURI either? [13:23:33] sure - that's way RdfBuilder takes the base uri as a parameter [13:23:53] ... and the test sets it to 'http://acme.test' [13:24:07] you can define another constant in the test, alongside URI_BASE [13:27:47] for requesting rollbacker should I make new paragraph on http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Other_rights#Rollbacker ? [13:29:26] New patchset: Jeroen De Dauw; "Updated depenedencies in composer.json" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64810 [13:30:25] admins? [13:31:22] M4r51n: Yes. [13:31:33] You need to make a new section. [13:32:01] yeah a section not a paragraph [13:32:05] hehe [13:32:06] tnx :) [13:32:13] Not a problem. [13:32:59] Are we still on track for having dates in June ? [13:33:34] GerardM: yes [13:36:13] http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Other_rights#.7B.7BAnchor.7CRollbacker.7D.7D.7B.7B.23ifexist:Wikidata:Requests_for_permissions.2FRollbacker.2FHeader.2F.7B.7Bint:lang.7D.7D.7C.7B.7BWikidata:Requests_for_permissions.2FRollbacker.2FHeader.2F.7B.7Bint:lang.7D.7D.7D.7D.7C.7B.7BWikidata:Requests_for_permissions.2FRollbacker.2FHeader.2Fen.7D.7D.7D.7D [13:36:20] ops. [13:36:23] http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Other_rights [13:36:25] here is it [13:36:35] hmm [13:37:03] why it doesnt show the reason? [13:39:30] DanielK_WMDE: would it make sense to also have RdfSerializer get these parameters? [13:41:03] Denny_WMDE: it needs them, so it can construct a builder. 
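A rough sketch of the injection being discussed: the URI bases are computed where RdfBuilder is constructed and passed in, so the test can substitute plain constants. The constructor signature, the getDataURI() parameters and the URI_DATA constant are made up for illustration; only the Title::newFromText(...) expression and URI_BASE come from the discussion above.

```php
<?php
// Illustrative sketch: RdfBuilder gets its URI bases injected instead of
// reading global state (Title/config) itself.
class RdfBuilder {
	protected $baseUri;     // base for entity URIs
	protected $dataBaseUri; // base for data-document URIs (Special:EntityData)

	public function __construct( $baseUri, $dataBaseUri ) {
		$this->baseUri = $baseUri;
		$this->dataBaseUri = $dataBaseUri;
	}

	public function getDataURI( $entityId, $format ) {
		return $this->dataBaseUri . $entityId . '.' . $format;
	}
}

// Production code resolves the data base URI from the special page:
$dataBaseUri = Title::newFromText( 'Special:EntityData' )->getCanonicalURL() . '/';
$builder = new RdfBuilder( 'http://www.wikidata.org/entity/', $dataBaseUri );

// The test can simply pass constants and avoid the dummy Title object:
// const URI_BASE = 'http://acme.test/';
// const URI_DATA = 'http://data.acme.test/';
// $builder = new RdfBuilder( self::URI_BASE, self::URI_DATA );
```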
[13:41:15] alternatively, it could just take a builder, instead of constructing it. [13:41:28] would be somewhat niver, but also a bit inconventient. [13:41:36] *nicer. architecturally [13:43:01] Abraham_WMDE: crowdfunding... https://en.wikipedia.org/wiki/Travis_ci [13:45:46] Abraham_WMDE: looks like it might be easier for us to have the tests run in various environments there [14:00:33] what is the German person registry called again ? [14:00:46] (need it for a blogpost about Wikidata) [14:04:23] GND [14:05:41] GerardM: GND is people, organisations, places, etc. it replaces the old PND, which was just people. [14:05:55] tools and templates are still often using "PND" [14:10:13] thanks DanielK_WMDE [14:11:05] How I can use ui-entityselector-input? Is there any documentation for ui widgets? [14:13:00] or sample code [14:15:37] ebraminio: sorry, the UI team is not in today. best thing would be to either catch them on the weekend in amsterdam or from thursday on here [14:23:18] Denny_WMDE: https://gerrit.wikimedia.org/r/#/q/status:open+project:mediawiki/extensions/Ask,n,z [14:29:57] Denny_WMDE: ok. thanks [14:36:03] there are certain attitudes I hate ... http://ultimategerardm.blogspot.nl/2013/05/most-op-data-in-wikidata-is-curated.html is a response to one of them (to do with Wikidata) [14:40:21] GerardM: yes, most data in Wikidata needs curation. but most articles in Wikipedia need curation too. ;) [14:41:11] My point is that interwiki data has been the subject of a lot of scrutiny [14:41:47] Lydia_WMDE: y u no rage on twitter? [14:41:49] my point is that much input from for instance the GND has been curated [14:42:00] busy hiring assassins? [14:42:20] * Lydia_WMDE pokes JeroenDeDauw hard [14:42:44] no busy with all kinds of other stuff [14:44:47] GerardM: there I disagree. the GND is pretty much irrelevant to anything interesting you might wish to do with Wikidata. [14:45:12] and your argument is ? [14:45:46] trying to fit the world into 7 arbitrary categories dreamed up by German librarians is a process of discarding useful information. [14:45:55] any inferences you draw from that are likely to be wrong. [14:46:09] you can't use it for any interesting purposes because not everything actually fits a "main type" [14:46:10] it does not resonate with me [14:46:14] BECAUSE [14:46:26] it is a link to the German Librarian data [14:46:43] Yes. and if German librarians think they can boil the world down into seven categories, they are idiots. [14:46:51] So when we link to them we can either feed them with data or be fed from their data [14:47:09] data that has been validated [14:47:13] Wikidata has a property - P107 - called "GND main type" [14:47:20] it allows you to specify one of seven types of thing. [14:47:33] and your argument does not refute that [14:47:36] only the world doesn't actually fit cleanly into these seven arbitary types [14:47:46] so they have to fudge it. [14:47:51] tommorris from my pov... irrelevant [14:48:13] The Python programming language is a "term" under GND [14:48:21] but it's not a term, it's a programming language. [14:48:31] so we use it not for everything ... [14:48:42] it is great for people [14:49:02] no it isn't [14:49:11] the Medici family is a "person" [14:49:15] and we can state in Wikidata that pythin is a snake [14:49:18] but it isn't a person, it's a family. 
[14:49:20] python [14:49:39] but in order to fit it into an arbitrary categorisation system, we have to say it is a person, even though it isn't [14:49:58] we only have to do this where we want to [14:49:58] and if you ask the GND proponents why they care about it so much, they say it's so they can verify German Wikipedia against it [14:50:17] but you don't need "GND main type" [14:50:23] you can just use "instance of" [14:50:23] and we CAN have references to many systems [14:50:53] tommorris perfection is the enemy of the good [14:50:54] this isn't a reference to a system, this is a deliberately broken type statement that we have to fit in with a system. not the same. [14:51:00] let us use the good of the GND system [14:51:18] and use other systems as well [14:51:26] the best use of the GND system is to delete it from Wikidata as an irrelevant distraction, a strange fetish invented for German librarians rather than actual real-world data. [14:52:08] I've spent years and years working with semantic web data. nobody has yet to make a plausible case for why "GND main type" isn't complete madness. [14:52:31] your perfection is the enemy of a perfectly valid and good reason why you want it as well [14:53:10] what is the perfectly valid and good reason? [14:53:20] I can't see a valid or good reason for saying that the Medici family is a person [14:53:29] they want to use it to validate German Wikipedia articles [14:53:53] they want to use it to validate German Wikipedia articles... with data that's inherently broken. [14:54:05] let them live and learn [14:54:26] you do not make friends with rants [14:54:56] I invite you to write a blogpost about this and when it is well written, I post it on my blog [14:55:37] I've written about it on-wiki and the community is against me. [14:55:49] it will be a good blogpost when you address how you can use the GND identifier and not include the data in Wikidata [14:55:52] oh well, if German Wikipedia wants to do silly things with data, that's their lookout. [14:56:08] I sympathise [14:56:28] we're making good progress on starting to be able to use the data in English Wikipedia. ;) [14:56:28] and I present you with a positive way to vent your frustrations [14:57:36] New patchset: Daniel Kinzler; "(bug 44576) Select format based on Accept header." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64817 [14:57:41] tommorris can you show me a good example [14:57:51] preferably to do with a list [14:58:08] http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Other_rights [14:58:14] i need to wait very long or ?D [14:58:48] GerardM: https://en.wikipedia.org/wiki/Wikipedia:Wikidata/Wikidata_Sandbox - a version of Infobox person that pulls one property from Wikidata ('spouse') [14:58:58] M4r51n: look how long the other requests have been on the page [15:01:38] about 2 min [15:01:38] :. [15:02:22] oh :) [15:06:04] tommorris: GerardM: i think you were talking about different things. Tom doesn't like P107 (which I fully understand), Gerard was refering to P227, which provides us with the GN identifier [15:06:26] I agree with you both: P227 is cool, P107 is… erm… interesting [15:06:59] oh, P227 is fine. I'm okay with sameAs type relations. [15:07:19] that is sameAs-style relations. "type" is a somewhat confusing word in that sentence. ;) [15:08:48] the P227 allows for reference outside of Wikidata and as such is really valuable [15:09:55] it'd be better if we could just replace P227 with a generic sameAs identifier like OWL has. 
[15:10:10] aude: Did you poke someone for the jshint change? [15:10:51] hoo: only tobi was here today and asked [15:11:07] i think probably danwe needs to approve or maybe henning [15:11:55] Yes... that one is trivial... but the amount of changes makes it non-trivial again, I guess :P [15:12:02] * aude nods [15:12:04] tommorris: I agree in general, but it would be more confusing [15:12:30] tommorris: because it would require a URI on the other side [15:12:53] tommorris: example, what to do with an IMDB number like tt32378 ? :) [15:13:17] http://www.imdb.com/title/tt32378/ [15:15:15] nope, that's the website, not the movie. owl:sameAs'ing that would be bad [15:15:52] Denny_WMDE: I'm happy to cut the difference. [15:16:03] you are :) [15:16:22] next think is, Shakespeare's will be writing the Wikipedia article on Hamlet! [15:16:26] *thing [15:16:28] if the W3C work out httpRange13 sometime, we can rethink. [15:16:57] I'm just not a big fan of custom magic properties [15:23:57] Change merged: Hoo man; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/64800 [15:26:46] New review: Hoo man; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/64810 [16:19:31] New patchset: Denny Vrandecic; "changed namespaces a bit, added license" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538 [16:23:59] New review: Denny Vrandecic; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: 2; - https://gerrit.wikimedia.org/r/64817 [16:24:01] Change merged: Denny Vrandecic; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64817 [16:26:35] New patchset: Denny Vrandecic; "changed namespaces a bit, added license" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538 [16:27:57] DanielK_WMDE: want to take another look at the changeset for the rdf export? [16:30:09] DanielK_WMDE: still failing, but getting closer: https://gerrit.wikimedia.org/r/#/c/64538/ cheers [16:30:15] i am off for now, maybe later [16:52:38] New review: Nikerabbit; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/64810 [18:50:58] For heavens sake... maybe we should start a discussion on whether admins should hold editinterface on Wikidata... IMO they shouldn't [18:51:50] ? [18:52:11] rschen7754: Yet another gadget created and made default by the same user [18:52:15] without code review or anything [18:52:22] can you link? [18:53:12] rschen7754: I'll have a look myself, maybe it's security related [18:57:16] rschen7754: https://www.wikidata.org/w/index.php?title=MediaWiki%3AGadgets-definition&diff=44795229&oldid=43026301 [18:58:00] :/ [18:58:25] class UserMailer is exactly where i'd look to find where watchlist notificaiton timestamp gets updated :o [18:58:27] People cant just take text and create html from that with an regex [18:59:33] errr, EmailNotification :) [19:02:19] I'm going to do smth. about the gadget trouble now... 
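Putting aude's pointers together, a hedged sketch of what the namespace check might look like inside the client hook handler. The isWikibaseEnabled() call is quoted from the discussion above; how the NamespaceChecker instance and the context are obtained is assumed and depends on how the client wires things up.

```php
<?php
// Illustrative only: skip all Wikibase work for pages in namespaces that are
// not enabled for Wikidata links (per the client settings).
$title = $context->getTitle();

if ( $title->exists() && $namespaceChecker->isWikibaseEnabled( $title->getNamespace() ) ) {
	// Only now add the Wikidata-related output for this page.
}
```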
[19:30:43] https://www.wikidata.org/wiki/Special:AbuseFilter/21 this was about time [19:38:30] fyi, https://www.wikidata.org/wiki/Wikidata:Project_chat#Issues_with_new_gadgets [19:54:51] New patchset: Denny Vrandecic; "changed namespaces a bit, added license" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64538 [20:02:22] New review: Hoo man; "(1 comment)" [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/64083 [20:06:53] hey can someone in #wikidata-admin pleas change the settings so i can join without an invitation? [20:31:16] hallo! [20:32:45] aude: is it a good time to talk to you about the patch? [20:34:10] hi [20:35:04] I'm working on the updating the patch based on the comments [20:35:42] hoo said that I should use a namespace checker at the beginning of the function [20:36:08] can you tell me how to go about doing that? [20:39:22] it's in the client [20:39:49] the title object has a namespace [20:40:28] the checker helps determine if a namespace is "enabled" for having wikidata links [20:41:37] so how do I implement a checker? [20:45:38] there's a class for this [20:52:16] can you tell me the name of the class? [20:52:48] NamespaceChecker [20:55:22] oh I understand now [20:55:49] so what hoe's suggesting is that I should check if Wikibase is enabled and only then do things if need be [20:56:11] right [20:58:45] the function requires me to pass a namespace as a param [20:59:21] how can I get this namespace object? [21:00:26] from the title [21:01:23] $context->getTitle()->getNamespace()? [21:02:22] i think so [21:05:50] if ( $title->exists() && $namespaceChecker->isWikibaseEnabled( $title->getNamespace() ) ) [21:07:31] should work [21:11:33] also, JeroenDeDauw points out that singleton is deprecated code [21:11:52] Deprecate all of the singletons [21:14:10] so a simple Settings::get( 'siteGlobalID' ) will work? [21:16:00] pragunbhutani: that only gets the site id [21:16:03] not the site object [21:16:21] aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaah [21:38:10] how can I get a site object without using singletons? [21:43:43] in this case, i think we can just construct one of the site objects [21:43:52] (Site and related classes are in core) [21:44:00] then assign the global id [21:57:46] New review: Aude; "not necessarily a problem with this patch, but the html messages don't handle interwiki links correc..." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/64083 [22:01:56] New review: Hoo man; "[[enwiki:Foo]] is invalid. We would need w:en:Foo or smth. like that. But just $siteID . ':' . $page..." [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/64083 [22:02:21] to contract a site object, I need to supply $type as param [22:02:38] *construct [22:06:32] pragunbhutani: yes or we could use the MediaWikiSite class that extends Site [22:20:10] Can I change title and description of item via wbeditentity? [22:21:20] ebraminio: the label? [22:21:57] i think it supports labels, descriptions, aliases and site links [22:22:08] not yet claims [22:28:40] aude: thanks. so for claims, i must move them one by one? [22:30:05] dumb question, i really need help on http://www.wikidata.org/wiki/User:Ebraminio/merge.js [22:31:35] ebraminio: unfortunately yes, at least for now [22:32:35] aude: if I use Site::newForType ( $siteType ) [22:32:51] what should I pass for $siteType? [22:33:46] one of the site type constants in the Site class? [22:35:05] would $type=self::TYPE_UNKNOWN work? 
[22:35:39] what kind of site would a wikipedia be? [22:35:42] unknown? [22:36:22] (alright, have to "sign off" in a few minutes) [22:38:25] $type=self::TYPE_MEDIAWIKI [22:41:33] yes [22:41:56] or look at the MediaWikiSite class that should handle that [22:42:09] either would work [22:42:39] that's where I found this [22:44:46] so I can create an object of the class Site using TYPE_MEDIAWIKI and use the setGlobalId member function to set its global ID [22:48:43] should work
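For completeness, a short sketch of the Site construction discussed above, avoiding the deprecated singletons. Site::newForType(), the TYPE_MEDIAWIKI constant and setGlobalId() are the core pieces named in the chat; the 'enwiki' id is only an example (the client's own id would come from Settings::get( 'siteGlobalID' ), as noted earlier).

```php
<?php
// Illustrative: construct a MediaWiki-type Site object directly and set its global id.
$site = \Site::newForType( \Site::TYPE_MEDIAWIKI ); // a MediaWikiSite instance would work too
$site->setGlobalId( 'enwiki' );                     // example id only
```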