[00:00:50] (PS2) Hoo man: Normalize item ids before trying to remove doubles [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75533 [00:01:09] that's what I don't like about PHP... there's no such class, but it still doesn't complain [00:02:19] (CR) jenkins-bot: [V: -1] Normalize item ids before trying to remove doubles [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75533 (owner: Hoo man) [00:03:36] (PS3) Hoo man: Normalize item ids before trying to remove doubles [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75533 [00:04:43] hoo: do you know why the appendtext edit action may cause an edit conflict? theoretically there should not be any edit conflict on appendtext IMO [00:05:28] ebraminio: If you pass it without baserevid and anything it really shouldn't [00:05:51] hoo: yeah, i don't, but there is [00:06:00] mh [00:06:17] people are arguing why merge.js is trapped on WD:RfD edit conflicts [00:06:45] I am retrying 6 times with a 5 second delay though [00:06:45] ok, RfD is really high traffic [00:07:21] I wrote/ maintain the script for steward votes on meta and that one only does a single retry with appendtext and no one has complained yet [00:08:41] hoo: may it be because of the token? because the "Request deletion" gadget also had this problem [00:08:59] the edit token is session unique and required [00:13:07] hoo: http://meta.wikimedia.org/wiki/User:Hoo_man/stewardVote.js is the same as those wikidata gadgets [00:13:24] hoo: anyway, i hope someone files and reports it as a bug [00:14:56] * ebraminio 4:44AM, going to sleep! :) [00:15:32] I should as well ;) [01:06:58] el.wikivoyage is the first Wikivoyage to have 0 mainspace unconnected pages! [01:07:04] http://el.wikivoyage.org/wiki/%CE%95%CE%B9%CE%B4%CE%B9%CE%BA%CF%8C:UnconnectedPages [03:11:32] hi anybody [03:11:39] !helper [03:12:09] respond [03:12:19] If you have a question or need help, ask it directly, please.
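The retry scheme ebraminio describes above (append via the API, back off on edit conflict, 6 tries with a 5 second delay) can be sketched like this. This is a minimal sketch, not the gadget's actual code: `EditConflictError`, `append_with_retry`, and `do_append` are illustrative names, and `do_append` stands in for whatever callable performs the real `action=edit` request with `appendtext` and a CSRF token.

```python
import time

class EditConflictError(Exception):
    """Raised by the edit callable when the API reports an edit conflict."""

def append_with_retry(do_append, retries=6, delay=5, sleep=time.sleep):
    """Run an appendtext-style edit, retrying on edit conflicts.

    `do_append` is any callable that performs the actual action=edit
    request (with `appendtext` and a token, but no baserevid) and raises
    EditConflictError on failure. The 6 retries / 5 second delay mirror
    the scheme described above.
    """
    for attempt in range(retries):
        try:
            return do_append()
        except EditConflictError:
            if attempt == retries - 1:
                raise  # give up after the last attempt
            sleep(delay)
```

Since no baserevid is passed, the server should not raise conflicts at all, which is why the observed conflicts on WD:RfD were surprising; the retry loop is a workaround, not a fix.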
[03:13:02] wait 1 min [03:13:42] the article bill wray is a stub [03:13:57] should i delete it [03:15:08] wikidata.com/billwray [03:15:17] org [03:15:28] !helper [03:15:53] Wikidatahelper: you're not making any sense [03:15:55] respond [03:16:11] h [03:16:11] h [03:16:11] h [03:16:11] h [03:16:12] hg [03:16:12] f [03:16:12] d [03:16:13] d [03:16:13] f [03:16:13] d [03:16:14] f [03:16:14] h [03:16:15] g [03:16:15] g [03:16:27] rschen7754: Thank you. [03:16:30] np [03:16:42] people are strange... [03:16:57] Ye-es... [03:17:12] I have ops in here? [03:17:15] yup [03:17:45] Guess I should pay more attention to this channel. ;) [03:17:53] lol [06:16:24] (CR) Aude: "(1 comment)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75477 (owner: Hoo man) [08:37:52] (PS2) Addshore: Do not store empty aliases in the datamodel [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/75321 [09:26:01] Denny_WMDE: hi, any "news" on badges? ;) [09:26:17] no, meeting is today in the afternoon :) [09:26:27] so in 3-4 hours i can tell you [09:26:44] ok [09:33:30] aude >> funky? :D >> http://grab.by/oK6O [09:33:52] oh noes [09:34:02] i think its something to do with redirecting to items [09:34:03] huh [09:34:09] do you see it if you go to http://www.wikidata.org/wiki/Wikidata:SandboxItem [09:54:21] the implementation phase ( which will be at consumium.org/wiki/ ) will probably use both http://DBpedia.org #WikiData.org and whenever the server has free time it can go compare the values given by both systems respectively [09:55:13] So I access WikiData with SPARQL I would assume .. [09:55:34] I think at least DBpedia was accessed with SPARQL [09:56:06] Since it's somewhat semantic ain't it [09:57:04] if you have a semantic database based on subject/predicate/object triples ain't it best to query it with SPARQL ? [09:57:44] jubo2, we don't use sparql [09:58:26] Denny_WMDE: what do you use ?
[09:58:44] we don't provide any queries right now [09:58:49] you can get a dump [09:58:54] and do the sparqling yourself [09:58:58] Denny_WMDE: that's ait with me and my interests [09:59:04] or you can access the data item by item [09:59:08] ait? [09:59:16] shorthand for "all right" [09:59:31] ok [09:59:55] Denny_WMDE: do you want to know the etymology of the expression 'ok' ? [10:00:36] there is no widely accepted etymology, but a number of competing theories [10:00:58] It was used in the Civil War of 1860's ( urr thereabouts ) to mean "zerO Killed" to report zero casualties [10:01:20] if you have a good source for that etymology, i would be interested, otherwise it is just one more folk explanation :) [10:01:49] hence the expression "Are you OK ?" [10:02:01] indicating: "Are all the men alive?" [10:02:46] i have heard a number of competing explanations, but let's get back to the topic of wikidata? :) [10:02:55] Waht? Usian Army of Ammurica doesn't use convenient shorthands and codespeak w00t? w00t? [10:03:40] I certainly don't 10-4 that and neither should any of you as it's false [10:09:44] It'd be nice if an official canonical SPARQL endpoint existed, and official canonical RDF published as linked data rather than as an "export" [10:11:40] tommorris: the world would be awesome [10:12:20] * tommorris sits and waits patiently for the Wikidata folks to realise that this would be sensible. [10:14:34] tommorris: we do publish rdf [10:14:42] officially and canonically [10:15:00] http://www.wikidata.org/entity/Q42.rdf [10:15:06] ooh, nice [10:15:33] or actually, via LOD conneg here http://www.wikidata.org/entity/Q42.rdf but you will need to use something else than your browser to get the right representation [10:15:55] * tommorris has a couple of RESTish clients. [10:16:02] that should work [10:16:16] tommorris: SPARQL is a different story.
We would love to provide a SPARQL endpoint, but we simply are technically not able to [10:17:04] couldn't we just have a daily dump of changes go into, say, 4store [10:17:12] there's no open source solution that allows us to provide a sufficient quality of service over our quickly changing dataset and that requires sufficiently little work so that we can do it [10:17:46] well, we do provide a daily dump. go ahead and set the sparql endpoint up [10:18:02] but if we had a sparql endpoint, we would like it to be live, not on a dump [10:18:02] * tommorris adds it to his ever-growing list of things to do [10:18:30] there's no reason why we should provide sparql access to a day old dump [10:18:36] that can be done externally, obviously [10:20:15] day old is more useful than not-at-all [10:20:42] DBpedia is ridiculously useful to people with often months old data [10:21:05] I've helped other editors on en.wp use DBpedia to find inconsistencies in articles [10:21:13] well, someone can set that up [10:21:23] i don't see the advantage of us doing it [10:22:28] i.e. what i mean, I don't think that we can do this effectively, based on the expertise and resources we have in the organization [10:25:35] addshore: any clue why http://www.wikidata.org/w/index.php?title=Q20317&diff=61046707&oldid=61046611 shows incorrect edit summary? [10:25:54] i cannot reproduce that on my test wiki (master or on the deployment branch) [10:25:58] HAH! [10:26:11] well, that is interesting [10:26:15] yeah [10:26:58] could be an undo [10:27:04] errr [10:27:33] except undo has specific edit summary [10:27:36] how very very strange [10:27:42] http://www.wikidata.org/w/index.php?title=Q20317&diff=61130726&oldid=61130705 [10:28:06] :o [10:28:15] im trying to think of how that is even possible... [10:29:27] * aude thinks it's unlikely for an ip to be using editentity or some other api module [10:29:48] aude: not that unlikely. some bots don't notice that they got logged out [10:29:49] bot logged out?
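The LOD content negotiation Denny_WMDE described earlier (fetching http://www.wikidata.org/entity/Q42 with an RDF Accept header instead of relying on a browser's text/html default) can be sketched like this. Only `application/rdf+xml` is confirmed in the discussion above; the `json` mapping and the function name are assumptions for illustration.

```python
# Desired serialization -> Accept header for content negotiation against
# the /entity/ URIs. Only application/rdf+xml is confirmed above; the
# json entry is an assumption.
ACCEPT_HEADERS = {
    "rdf": "application/rdf+xml",
    "json": "application/json",
}

def entity_request(entity_id, fmt="rdf", base="http://www.wikidata.org/entity/"):
    """Build the (url, headers) pair for a conneg request like Q42's.

    A browser sends Accept: text/html and gets the wiki page, which is
    why "something else than your browser" is needed for the RDF view.
    """
    if fmt not in ACCEPT_HEADERS:
        raise ValueError("unknown format: %s" % fmt)
    return base + entity_id, {"Accept": ACCEPT_HEADERS[fmt]}
```

An HTTP client would then GET the returned url with those headers and follow the server's redirect to the concrete representation; `curl -H 'Accept: application/rdf+xml'` does the equivalent from the shell.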
[10:30:06] possible but unlikely [10:30:17] happens all the time [10:30:19] they did just 2 edits [10:30:27] true [10:30:28] but still, its an odd thing to happen [10:30:33] we should restrict editentity to logged in users :) [10:30:53] isn't edit entity used in some part of the ui? [10:30:54] maybe not [10:31:01] aude: ui uses setsitelink [10:31:08] yeah [10:31:13] oh, highly highly unlikely that was using the UI :P [10:31:16] and set alias [10:31:24] set description set label set claim [10:32:07] but [10:32:18] editentity still would just have the description 'Updated item' ? [10:32:26] ok [10:33:37] addshore: talk to tobi. i think if editentity ends up having only a single changeop, then it should use that changeop's summary. otherwise, a generic one. [10:33:51] (or of course, the summary supplied by the user) [10:33:59] DanielK_WMDE: yes yes [10:34:01] hmm, i dont think that has been merged yet [10:34:11] probably not [10:34:30] 8 hours before the edit we are looking at there was http://www.wikidata.org/w/index.php?title=Q20317&diff=61021946&oldid=54793399 which must have used editentity [10:34:47] as it is now, summary in set site link is set with $summary->addAutoSummaryArgs( $page ); [10:35:12] as far as i can tell [10:35:35] $summary = $this->createSummary( $params ); [10:35:42] oh, ok [10:37:08] will take more of a look in a second [10:37:18] * aude don't understand createSummary.... it accepts params, but does not use them [10:37:26] (suppose, unless it is subclassed) [10:38:20] ok, ModifyLangAttrib subclasses and uses them [10:38:28] attribute [10:39:00] gets the language and then later the page is given in set site link [10:39:17] how can that go wrong???? [10:43:35] ahh aude $summary->setLanguage( $params['linksite'] ); and $summary->setAction( 'remove' ); [10:43:42] etc [10:44:22] https://bugzilla.wikimedia.org/51953 [10:44:39] more bugs :) [10:44:43] set language!
[10:44:48] language != site [10:44:58] should be resolved when everything starts to use the changeops :) [10:45:06] i hope so [10:45:28] I feel like the comment after setting the language should mean something :P //XXX: not really a language! [10:45:46] :) [10:45:46] ahh, because it passes the site not the lang :) [10:50:05] (PS7) Daniel Kinzler: (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 [11:04:10] Denny_WMDE: the list is halfway through being added and this time it actually looks like it is doing it correctly! :) [11:04:15] http://tools.wmflabs.org/addbot/addbot/iwlinks/index.html?lang=bi&site=wiki [11:05:09] "Haus blong toktok"? [11:06:03] No idea what that page is, but it has some links in text (which unfortunately were matched when really they shouldn't have been ) - but they should all be removed after the first run [11:06:03] sounds korean-ish :) [11:06:19] the interwikilink on 'Rome' was removed [11:06:22] from what johl says, there's a bit of german influence in korean [11:06:28] just Dipsi left on biwiki! [11:06:35] but then it's in latin spelling [11:06:39] strange language :) [11:06:40] which is a section link :( [11:07:20] https://bi.wikipedia.org/wiki/Bislama [11:09:03] ha, it's actually an english-based pidgin that kind of took on a life of its own... fun :) [11:09:14] :D [11:09:27] if you squint and try to mouth the words to yourself, you can kind of understand half of it [11:09:32] hah [11:09:37] it's like reading dutch :P [11:10:23] DanielK_WMDE: you mean german ofc [11:10:32] aude: https://gerrit.wikimedia.org/r/75477 so for that to work the site_identifiers table has to be fixed in production?
[11:11:03] hoo: yes [11:11:06] ok, mh [11:11:26] maybe interim solution is do what you do + use language as fallback if no navigation ids are found [11:11:29] to match [11:12:07] (Abandoned) Jeroen De Dauw: DO NOT MERGE [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75150 (owner: Jeroen De Dauw) [11:12:23] Denny_WMDE: https://gerrit.wikimedia.org/r/75339 and https://gerrit.wikimedia.org/r/75340 [11:17:19] Denny_WMDE: huh, what kind of language is this? https://en.wikipedia.org/wiki/User:Wakebrdkid/Solar_cycle [11:18:36] JeroenDeDauw: hmm, don't know [11:18:38] Funny how Travis is now sending "You broke it" mails to the i18n bot [11:20:00] JeroenDeDauw: mathematica maybe [11:20:49] Denny_WMDE: isnt that one similar to octave? [11:21:28] dunno octave [11:22:07] Well, this thing does not look like octave [11:22:40] Denny_WMDE: wait, octave is similar to matlab, not mathematica [11:24:49] Denny_WMDE: those two commits are what me and addshore did together so far. Would be nice if they could be merged so we can continue building on them [11:28:47] (CR) Denny Vrandecic: [C: 2] Some cleanup of QueryEntity and its test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75339 (owner: Jeroen De Dauw) [11:30:18] (Merged) jenkins-bot: Some cleanup of QueryEntity and its test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75339 (owner: Jeroen De Dauw) [11:30:27] (CR) Denny Vrandecic: [C: 2] Ground work for QueryEntity serialization and deserialization [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75340 (owner: Jeroen De Dauw) [11:30:42] JeroenDeDauw: >> https://gerrit.wikimedia.org/r/#/c/75321/2 :) [11:31:43] (Merged) jenkins-bot: Ground work for QueryEntity serialization and deserialization [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75340 (owner: Jeroen De Dauw) [11:33:13] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#20 (master - 2954d59 : jeroendedauw): The build is still failing. 
[11:33:13] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/c446608b8ac6...2954d5937c0f [11:33:13] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/9429799 [11:34:54] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#21 (master - 4516849 : jeroendedauw): The build is still failing. [11:34:54] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/2954d5937c0f...45168493a469 [11:34:54] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/9429861 [11:43:12] (PS6) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [11:43:51] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [11:43:52] (PS7) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [11:44:10] (CR) Daniel Kinzler: "PS 7 is a rebase" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [11:44:32] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [11:46:19] (PS8) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [11:46:46] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [11:48:52] DanielK_WMDE, Denny_WMDE: I've read Daniel's email and followed up. I'll be available for another 8-9 hours. [12:06:31] DanielK_WMDE: hangout? [13:12:41] Denny_WMDE: did i leave too quickly? 
[13:13:01] nope [13:13:42] here's the link I spoke of [13:13:43] https://oc.wikipedia.org/wiki/Supetar [13:13:48] check the source [13:14:17] {{Infobox|tematica=|carta=}}{{Intro}} [13:14:19] HAH! [13:14:20] [1] 04https://www.wikidata.org/wiki/Template:Infobox13 => [13:14:22] [2] 04https://www.wikidata.org/wiki/Template:Intro [13:15:52] lots of q-ids in the infobox. i want inline editing for labels :) [13:16:06] but anyway: awesome! [13:17:48] What language is oc? [13:17:49] lazowik: you here? [13:17:52] occitan [13:19:01] What's this in German? [13:19:19] Or where is it from? [13:20:18] a language in spain [13:20:22] iirc [13:20:47] Ok! Sounds like Spanish and French mixed [13:32:40] (PS1) Jeroen De Dauw: Work on QueryEntity [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75591 [13:53:01] (PS2) Jeroen De Dauw: Work on QueryEntity [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75591 [13:53:20] Denny_WMDE1: https://gerrit.wikimedia.org/r/#/c/75591/ [13:54:05] yay, we're doing pair programming in the office! [13:55:43] Regarding the switch of wikivoyage to wikidata: [13:56:39] on he-voyage we use a Interwiki config-sorting order file to force the English interwiki links to be on top: [13:56:48] http://he.wikivoyage.org/wiki/%D7%9E%D7%93%D7%99%D7%94_%D7%95%D7%99%D7%A7%D7%99:Interwiki_config-sorting_order [13:57:05] But it seems to have no effect on the new wikidata links [13:57:29] Should it have? Or is there any alternative configuration? [13:57:54] Short question: My identification key now starts supporting different topics. Topics are defined in items. If I want a special page just for topic Q465, how do I best configure this? I have to use it on the client site (JavaScript), but the definition should go to LocalSettings.php I guess. [14:01:30] Denny_WMDE1: now I'm here [14:01:31] tzafrir: yes, we have some sorting configuration. it needs to be set in the config, then we can do that. 
[14:01:48] lazowik_: the team has decided (against my voice) to use items for the badges [14:02:00] so, items it is [14:02:10] eh [14:02:25] Granjow: ? [14:02:27] rewriting something which is not even finished yet... [14:02:32] :D [14:04:46] Denny_WMDE1: Basically I want to define $myTopic = 'q453' in LocalSettings.php and have this variable available on the client side in some JavaScript variable. [14:05:23] Denny_WMDE1: all items one could possibly imagine? [14:05:24] DanielK_WMDE, so what do I need to do? [14:05:31] lazowik_: aye [14:05:47] I can't imagine how UI needs to look like [14:05:55] especially for clients [14:05:59] but that's for later [14:06:16] Denny_WMDE1: Or maybe it is easiest to put it into an attribute of the
which I'm using anyway ... yes, will try that. [14:06:18] tzafrir: probably poke aude when she's online, she has the best understanding of this. or check the config file yourself and make a changeset :) [14:06:46] Granjow: I don't know how to do that, sorry :) [14:07:18] tzafrir: we support a handful of sorting orders, and there's some extra config for that... [14:07:27] ask aude, i'm not sure how to best set this up [14:07:45] (I know the same order works on he-wiki) [14:07:52] Thanks. I will [14:08:10] tzafrir: if it's already on hewiki (without any js hacks), we have a config for it [14:08:22] tzafrir: or make a bug to bugzilla and we will do it [14:08:22] maybe aharoni knows [14:08:36] DanielK_WMDE: we do have a setting for that [14:08:41] reading up [14:08:46] hi tzafrir , DanielK_WMDE [14:08:48] Denny_WMDE1: i know, but i do not know how flexible it is. [14:08:58] I have JS disabled, and it seems to work there, so it does not seem to be a JS hack (unless it's server-side - node) [14:09:00] it can do what hevoy needs [14:09:07] no js hack involved [14:09:12] aharoni: do you happen to know about the langlink sort order on hewiki? [14:09:17] it just prepends one wiki to the rest of the search [14:09:25] heh [14:09:57] DanielK_WMDE: tzafrir : AFAIK the Hebrew Wikipedia already has the same thing configured. [14:10:31] ok, great. [14:11:16] Denny_WMDE1, what exactly is "the config file"? [14:13:00] basically the localsettings for the wmf wikis [14:13:13] it's in gerrit… I'll take a look at the exact location [14:16:47] (CR) Daniel Werner: [C: 1] (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 (owner: Daniel Kinzler) [14:17:26] tzafrir: here https://git.wikimedia.org/blob/operations%2Fmediawiki-config/b190fb74e594d3ef9049a83ac78bfe37e70217c5/wmf-config%2FInitialiseSettings.php [14:17:36] need to add to line 12257 the hevoy setting [14:18:18] do you know how to do it, or shall i do it?
[14:19:16] (CR) Daniel Kinzler: "(2 comments)" [extensions/DataValues] - https://gerrit.wikimedia.org/r/70433 (owner: Daniel Kinzler) [14:21:36] tzafrir: ? [14:22:14] DanielK_WMDE, not really sure how to, and I guess I don't have the permissions anyway [14:23:21] tzafrir: you do :) but it's ok — can you make a bug on bugzilla [14:23:39] tzafrir: especially pointing to a page on the wiki that says the community wants this or expects this [14:23:53] tzafrir: like a page saying "language links are sorted like this" or whatever [14:24:06] do you know how to write bugs? [14:30:40] ok, trying to figure out this "you can fix it yourself" bit [14:31:25] tzafrir: this might help: http://www.mediawiki.org/wiki/Gerrit/Tutorial [14:32:33] or write a bug [14:32:33] (PS7) Daniel Kinzler: (bug 49264) Handle bad values using UnDeserializableValue. [extensions/DataValues] - https://gerrit.wikimedia.org/r/70433 [14:34:14] (PS8) Daniel Kinzler: (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 [14:34:53] (CR) jenkins-bot: [V: -1] (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 (owner: Daniel Kinzler) [14:37:00] (PS9) Daniel Kinzler: (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 [14:37:04] (CR) jenkins-bot: [V: -1] (bug 49264) Handle UnDeserializableValue gracefully. [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/70443 (owner: Daniel Kinzler) [14:38:45] (PS11) Daniel Kinzler: (bug 49264) Handle UnDeserializableValue gracefully. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/68002 [14:40:15] (CR) jenkins-bot: [V: -1] (bug 49264) Handle UnDeserializableValue gracefully. 
[extensions/Wikibase] - https://gerrit.wikimedia.org/r/68002 (owner: Daniel Kinzler) [14:40:57] (CR) Denny Vrandecic: [C: 2] "(2 comments)" [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75591 (owner: Jeroen De Dauw) [14:42:48] gggaaaahhhhhh!!!!!! [14:43:07] git review -d will silently overwrite unstaged changes?! [14:43:13] i just lost more than an hour of work! [14:43:14] wtf? [14:44:16] (Merged) jenkins-bot: Work on QueryEntity [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75591 (owner: Jeroen De Dauw) [14:44:17] i thought git would save your work... [14:44:26] johl is on #wikimedia-de-intern [14:45:03] git reset --soft HEAD@{1} or something ? [14:45:09] git reflot [14:45:12] reflog [14:45:28] although unstaged, don't know [14:47:12] aude: that only works for committed stuff [14:47:29] git-reset apparently uses reset --hard [14:47:34] which will kill all local changes [14:47:40] but i think i got it [14:47:53] PHPStorm's internal history still has most of it [14:48:46] oh really? [14:48:57] i'm sure there is a way somehow... [14:49:00] * aude thinks [14:49:01] (PS9) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [14:49:08] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values.
[extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [14:49:11] aude: i don't think there's a way in git [14:49:25] hmmmm [14:51:36] (PS1) Daniel Kinzler: InMemoryDataTypeLookup should throw PropertyNotFoundException [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75605 [14:51:55] yay, more exceptions :) [14:52:15] no, just better exceptions [14:52:18] ok [14:52:23] exceptions are your friends :) [14:52:36] can be [14:52:53] i do wonder if the MWExceptions, though, have special handling in the wikimedia world [14:53:01] something i've been meaning to look into [14:53:10] (CR) jenkins-bot: [V: -1] InMemoryDataTypeLookup should throw PropertyNotFoundException [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75605 (owner: Daniel Kinzler) [14:54:16] wtf, why does this fail? [14:54:17] grrr [14:54:25] gotta go, will follow up later or tomorrow [14:56:00] k [14:56:13] * aude got to organize wikimania stuff [14:59:50] * aude love that the wmf servers are named for elements :) [15:01:00] :) [15:04:51] Is it normal that a script only runs with debug=true? [15:08:15] Granjow: huh? where what? [15:11:09] aude: In my extension. If I run it without debug=true I get an error message about a variable being undefined, which can only happen if the scripts are evaluated in the wrong order. [15:11:15] Or if the minifier fails. [15:11:47] (Not related in this channel, I know ...) [15:27:42] By the way ...
adding statements is much improved thanks to the sorting :) [15:38:48] (PS1) Jeroen De Dauw: Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 [15:40:14] (CR) Jeroen De Dauw: "(1 comment)" [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75591 (owner: Jeroen De Dauw) [15:40:19] (CR) jenkins-bot: [V: -1] Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 (owner: Jeroen De Dauw) [15:45:01] (PS2) Denny Vrandecic: Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 (owner: Jeroen De Dauw) [16:51:39] http://ultimategerardm.blogspot.nl/2013/07/wikidata-is-multilingual-project.html [16:52:14] it is the first time I found an item without an English label :) [17:11:41] Anyone has experience with writing Java unit tests? DanielK_WMDE, I can't figure out how I'll write tests for classes that have Myrrix ClientRecommender objects as delegates (HAS-A style). The Myrrix objects depend upon a Myrrix instance running at a host:port. [17:12:36] And to *mock* that myrrix instance would be like...I don't know...how do I mock it anyway, it's a living collaborative filtering engine. [17:13:25] Writing tests for the PHP side seems intuitive, easy. [17:13:35] * nileshc needs to get his head around Java unit tests with live instances. [17:19:51] DanielK_WMDE: When Myrrix comes into the picture, it becomes an integration test doesn't it. So, to eliminate the myrrix service integration, basically I need to create a fake Myrrix ClientRecommender class that gives hardcoded suggestions to any random inputs (I have not the remotest idea why one should do something like this)? [17:20:28] aude: Any thoughts? 
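The host:port dependency nileshc describes above is what turns his Myrrix tests into integration tests. One common pattern for that case is to probe the service and skip the test when it is unreachable, rather than fail. A sketch in Python rather than Java; `MYRRIX_HOST`, `MYRRIX_PORT`, and the test class are invented names, not part of Myrrix.

```python
import os
import socket
import unittest

def service_reachable(host, port, timeout=1.0):
    """True if something is listening at host:port (e.g. a live engine)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

class RecommenderIntegrationTest(unittest.TestCase):
    # Connection details for the live instance, taken from the
    # environment so CI and developers can point at different hosts.
    HOST = os.environ.get("MYRRIX_HOST", "localhost")
    PORT = int(os.environ.get("MYRRIX_PORT", "8080"))

    def setUp(self):
        # Skip (not fail) when no live instance is available.
        if not service_reachable(self.HOST, self.PORT):
            self.skipTest("no recommender at %s:%d" % (self.HOST, self.PORT))

    def test_recommendations_not_empty(self):
        # Here the real client would be exercised against the live engine.
        pass
```

This keeps the integration test runnable where the service exists without blocking everyone else's test runs, and documents the dependency explicitly instead of hiding it in a failing test.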
[17:23:07] nileshc: i wasn't into doing java unit tests (yet) when i coded java [17:23:52] i suppose we need a mock class for testing [17:24:53] aude: Indeed. A mock class for Myrrix's client classes. Shouldn't Myrrix provide them? [17:25:30] aude: It doesn't, so I guess I'll have to write mock classes. Question is, how intelligent will those be? O.o [17:25:42] no idea what the best practice is for this [17:26:37] would something like https://code.google.com/p/mockito/ work? [17:26:46] Hmmm...I think I'll let the unit test have a dependency on the Myrrix service for now, until someone comes up with a better idea. :/ [17:26:54] Let me check it out. [17:27:26] seems similar to the issues we have for wikibase code that touches the api [17:27:29] in php [17:27:56] we try to isolate wikibase code and have (ideally) a thin wrapper around what deals with the api [17:30:23] aude: I see...the problem here is that I can't simulate the intelligence of a recommendation engine. So it's either specify in the docs that so and so service needs to be run for testing - or - write a mock class that gives hardcoded suggestions (yuck) [17:33:23] I'll try playing around with mockito and see if it tastes good for this use case too. :P [17:44:14] nileshc: i'd see what DanielK_WMDE thinks [17:44:21] he's done a bit more java [18:03:38] addshore: https://en.wikipedia.org/wiki/Command_%26_Conquer#Chronology [18:03:51] Denny_WMDE: we are hard to work clearly ^^ [18:07:04] ya, I guess you are entering the C&C chronology in Wikidata so you can import them for testing the query engine? [18:12:35] tzafrir: you proceeding? [18:12:55] DanielK_WMDE, I posted a review request [18:15:16] tzafrir: you are continually mixing up DanielK_WMDE and me :) [18:15:26] aude: Yes, I need to talk with him again about this, I'll shoot an email. [18:15:36] Sorry [18:15:44] tzafrir: you have a patch in gerrit? [18:15:47] for sort order? 
[18:16:12] i just made you a reviewer, aude [18:16:12] Denny_WMDE: ok [18:16:45] tzafrir: looks good to me [18:16:56] yeah, to me too [18:17:39] whenever Reedy is around and available, he can deploy it [18:17:44] https://gerrit.wikimedia.org/r/#/c/75617/ [18:32:10] (PS3) Jeroen De Dauw: Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 [18:40:26] can we put population of cities in Wikidata? [18:55:05] Paintman: not yet [18:55:08] Is there any ETA? [18:57:03] (CR) Denny Vrandecic: [C: 2] Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 (owner: Jeroen De Dauw) [18:57:47] Paintman: yes. checking the latest ETA... [18:58:08] september [18:58:45] (Merged) jenkins-bot: Finished implementation of QueryEntitySerializer and added integration test [extensions/WikibaseQuery] - https://gerrit.wikimedia.org/r/75625 (owner: Jeroen De Dauw) [19:00:19] (PS2) Daniel Kinzler: InMemoryDataTypeLookup should throw PropertyNotFoundException [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75605 [19:00:55] Thanks Denny_WMDE and the rest of the crew for the amazing work! [19:01:15] Denny_WMDE: btw, do we have separate ETAs for quantities without and with entities? [19:01:28] no we don't [19:01:32] we probably should [19:01:36] yea [19:01:43] but we'll refine this when we get closer to it [19:01:50] quantities without units could be realized a bit quicker [19:01:57] yeah [19:01:58] though august is going to be a slow month [19:02:23] is typically the holiday month [19:02:28] yeah [19:02:29] you guys deserve it! [19:02:31] let's see [19:02:35] [travis-ci] wikimedia/mediawiki-extensions-WikibaseQuery#23 (master - 8043ebc : jeroendedauw): The build is still failing.
[19:02:35] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-WikibaseQuery/compare/9573dccfac1d...8043ebc5dd02 [19:02:35] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-WikibaseQuery/builds/9444948 [19:02:38] it's an eta, not a promise :) [19:03:12] :) [19:03:43] DanielK_WMDE: Please help me out with this unit test dilemma. Shall I repost the IRC messages I posted a few mins/hours ago? [19:06:29] nileshc: "write a mock class that gives hardcoded suggestions (yuck)" [19:06:36] that's exactly how unit tests work [19:06:46] ideally, you are testing ONLY the code in ONE class. [19:06:55] everything else should be fixed (hardcoded). [19:07:02] DanielK_WMDE: D'oh. So that's it?! [19:07:06] yep [19:07:27] integration tests are good too. it'S a different approach. [19:07:41] Yes, to a different end. [19:07:45] there you'll need some code to set up your "fixture" (that is, feed the service with some data). [19:07:52] and you'll require the service to be there [19:07:57] DanielK_WMDE: Okay, so I'll just make a MockClientRecommender instead of using Myrrix's TranslatingClientRecommender. [19:08:05] yep [19:08:35] just keep in mind that all you are testing is the code in your service class. [19:08:40] you are not testing the recommendation logic [19:09:07] if we can get around to doing that too, much the better. i suppose httpunit would be a a good choice for testing the entire stack at once. [19:09:17] but start with the small stuff for now [19:10:29] DanielK_WMDE: Okie dokie. [19:17:41] DanielK_WMDE: I guess it's normal to have to modify the original class heavily to make it test-ready? I have this case where the Myrrix TranslatingClientRecommender class is being built (instantiated) inside the class to be tested. A MyrrixConfiguration is passed to the class being tested, that it uses to instantiate the Translating...... I can't even subclass TranslatingClientRecommender since it's declared final. 
[19:24:18] nileshc: ideally, everything the class under test needs to operate is "injected", that is, created outside the class itself. [19:25:07] it is indeed the case that the requirement that classes be testable in isolation has a heavy influence on the architecture, and has to be considered when designing classes [19:25:25] this is a lesson i have only learned over the last year, in the wikidata projects [19:25:57] i recommend another look at http://googletesting.blogspot.de/2008/08/by-miko-hevery-so-you-decided-to.html [19:26:23] i keep coming back to that post because it's nice and concise [19:30:18] DanielK_WMDE: Thanks for reminding me about that page, I'll give it another read. [19:30:32] nileshc: i would say that the constructor of AbstractClientRecommender should take an instance of TranslatingRecommender, not a MyrrixClientConfiguration. [19:31:02] The construction of the TranslatingClientRecommender should be left to the bootstrap code [19:31:13] DanielK_WMDE: Yes, then I won't be able to pass a MockClientRecommender instead of TranslatingClientRecommender during testing. [19:31:14] that way, in the test, you can use a mock implementation of TranslatingRecommender [19:32:17] DanielK_WMDE: Because TCRecommender is final. So technically, in order to support this unit testing properly I will have to write a whole decorator for TranslatingClientRecommender that delegates to it.. [19:32:19] the mock recommender could even be implemented by a Proxy object, if you only need a few of the many methods the interface defines [19:32:44] Sorry, I think I did mean Proxy when I said decorator. [19:32:55] a decorator? why? just implement the interface, no need to use the actual implementation class [19:33:25] nileshc: what i meant is this: http://docs.oracle.com/javase/6/docs/api/java/lang/reflect/Proxy.html [19:33:32] just an idea though [19:34:12] nileshc: if it delegates, "decorator" is the correct term.
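The constructor-injection refactoring DanielK_WMDE suggests can be sketched like this. SuggesterService is a hypothetical name standing in for the class under test; the slimmed-down TranslatingRecommender interface is likewise illustrative:

```java
import java.util.List;

// Slimmed-down, illustrative stand-in for Myrrix's TranslatingRecommender
// interface (the real one has far more methods).
interface TranslatingRecommender {
    List<String> recommend(String itemId, int howMany);
}

// Hypothetical service class showing the suggested change: instead of
// receiving a MyrrixClientConfiguration and constructing the (final)
// TranslatingClientRecommender itself, it takes the recommender as a
// constructor argument. The bootstrap code builds the real client;
// a test injects a mock.
class SuggesterService {
    private final TranslatingRecommender recommender;

    SuggesterService(TranslatingRecommender recommender) { // injected
        this.recommender = recommender;
    }

    List<String> suggestProperties(String itemId) {
        return recommender.recommend(itemId, 10);
    }
}
```

With the dependency injected, the class depends only on the interface, so the final TranslatingClientRecommender never has to appear in a unit test at all.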
A decorator can be implemented manually, or using java's proxy mechanism [19:34:41] nileshc: http://stackoverflow.com/questions/2993464/how-do-java-mocking-frameworks-work [19:34:44] DanielK_WMDE: Ah. Good. I see. [19:35:25] the stackoverflow post points to http://jmock.org/ [19:35:49] haven't used it, and i find mocking frameworks a bit annoying in general, but it might be what you need [19:37:16] DanielK_WMDE: I tried using mockit, but I repeatedly kept running into the "why would I do this thing in such a roundabout way" problem. [19:37:38] that's why i tend to manually write mock objects :) [19:37:56] but TranslatingRecommender has a gazillion methods, so that may be annoying too [19:37:57] DanielK_WMDE: I think I found a solution to this particular problem. I can derive MockClientRecommender from TranslatingRecommender (the class which final TranslatingClientRecommender derives from) [19:38:26] yes, that's what i suggested a few minutes ago :) [19:39:16] Oops, I thought something else of it. And therefore "TranslatingRecommender has a gazillion methods", true. So THIS is where a mocking framework shines! [19:40:28] I'll do a mock(TranslatingRecommender.class); instead of creating the big fat mock class, and I guess it'll let me make some method "overrides" (stubs?) ? [19:41:57] yea, but you'll have to find out the details yourself. [19:42:17] often, you can just specify a fixed value to be returned from each method, and you can specify how often you expect a given method to be called [19:42:24] DanielK_WMDE: Got it...now I actually got the reason for taking the trouble of using these mock frameworks. [19:42:30] :P [19:43:08] * nileshc just learnt testing!! Yeay! [19:44:28] nileshc: if you look at tests in mediawiki core, you'll notice that hardly any of them use this pattern, most rely heavily on global fixtures and database state.
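The java.lang.reflect.Proxy approach DanielK_WMDE links to can be sketched as follows. BigRecommender is a made-up interface standing in for one with "a gazillion methods"; only the method the test cares about gets real behavior, everything else falls through to a harmless default:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.Arrays;
import java.util.List;

// Illustrative stand-in for a wide interface like TranslatingRecommender.
interface BigRecommender {
    List<String> recommend(String itemId, int howMany);
    void refresh();
    boolean isReady();
    // ...imagine many more methods here
}

class ProxyMock {
    // Dynamic-proxy mock: one InvocationHandler covers the whole interface,
    // so there is no "big fat mock class" overriding every method.
    static BigRecommender create() {
        InvocationHandler handler = (proxy, method, args) -> {
            if ("recommend".equals(method.getName())) {
                return Arrays.asList("P31", "P279"); // hardcoded suggestions
            }
            if (method.getReturnType() == boolean.class) {
                return Boolean.FALSE; // default for primitive boolean returns
            }
            return null; // void and reference-returning methods: do nothing
        };
        return (BigRecommender) Proxy.newProxyInstance(
                BigRecommender.class.getClassLoader(),
                new Class<?>[] { BigRecommender.class },
                handler);
    }
}
```

This is essentially what mocking frameworks like jmock generate under the hood, which is why mock(TranslatingRecommender.class) can stub selected methods without a hand-written class.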
[19:44:40] the tests for wikibase are better about that, but many still have issues [19:45:06] we too are only slowly learning this [19:45:11] * DanielK_WMDE sends regards to JeroenDeDauw [19:45:50] DanielK_WMDE: I see. global fixtures and DB state is kind of where I was getting stuck at, that'd be doing it the easy way. That wouldn't be called a propa' 'unit test' now would it. :D [19:46:13] JeroenDeDauw built a lot of unit tests? [19:46:55] he's the one who most strongly insisted on designing our classes with testability in mind from the start, especially with respect to dependency injection [19:47:37] nileshc: you just discovered why dependency injection is good for testing: having the dependency on TCRecommender hardcoded in your class meant that you can't test it in isolation [19:47:54] to make it testable, you had to remove the dependency, and "inject" the instance into the constructor [19:48:00] your class now only depends on the interface [19:48:00] (CR) Jeroen De Dauw: [C: 2] InMemoryDataTypeLookup should throw PropertyNotFoundException [extensions/Wikibase] - https://gerrit.wikimedia.org/r/75605 (owner: Daniel Kinzler) [19:49:18] DanielK_WMDE: Exactly. This was my first practical application of what I learnt in theory till now...Dependency injections, all these patterns, proper code design. One never really can appreciate it until one discovers it for oneself. [19:50:05] nileshc: yea... because i'm lazy and don't read books, i got to discover a lot of them on my own - very educational, but slow. and of course, i missed some stuff :) [19:50:07] So writing testable code ~= writing well designed code. [19:51:00] Haha :D [19:51:04] it definitely helps: code that is testable in isolation is modular: it consists of small, largely independent blocks. the tests force you to code that way.
[19:51:19] that also makes maintenance and refactoring easier, and improves reusability [19:51:59] if you don't care about testability, it's probably possible to write well designed code that isn't testable... but why would you [19:52:01] ? [19:52:26] Yup! Wow. Tests *are* powerful. I was disgruntled with them an hour ago. They are basically a systematic framework to force you to write good code. [19:52:32] (code re-use via subclassing is a technique that comes to mind) [19:53:23] nileshc: indeed. and in a big project, where you often work with other people's code, you come to appreciate how much time they save in the long run, even though they slow you down a lot when coding "new stuff" [19:56:01] DanielK_WMDE: Yes. Like you said, while doing "brain dumps" it's okay to sacrifice a bit and code fast, but after that one must make it a point to clean it up with proper docs and tests. Or else in the future when people start extending/refactoring my code, they'd have no dearth of &##$@^** for me. :D [19:56:23] (CR) Daniel Kinzler: [C: -1] "Apparently, updating the commit message without reloading the page will undo any changes you made after you loaded the gerrit page. FAIL!" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [19:56:58] gerrit fail! ----^ [19:57:08] :/ [19:57:49] DanielK_WMDE: What would you suggest as next steps for https://gerrit.wikimedia.org/r/75477 ? This is only fixed on the deployment branch atm. [19:57:58] DanielK_WMDE: BTW, I've added you as reviewer to several changes in the WES repo, most of them to do with code docs. I just realized I need to make some improvements to them. Do I make new changes or should I append to the old changes (I don't know how to do that yet) ? [19:58:02] * hacked (rather than fixed) [19:58:49] (PS10) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values.
[extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [19:58:55] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [19:59:39] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#469 (master - 1637878 : daniel): The build was fixed. [19:59:39] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/a8aa68931c06...16378786a9e6 [19:59:39] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/9446468 [19:59:47] nileshc: i'll look in a minute, let me sort out the mess gerrit made out of my rebase... [20:03:21] DanielK_WMDE: :D Alright. [20:03:53] nileshc: are you using git-review? [20:04:43] Umm, yes? From the beginning. [20:05:39] I don't think I can do it without git review -R (direct push doesn't seem to work last time I checked) [20:06:29] nileshc: ok, good. so, to (in gerrit terminology) add a new "patch set" to your "change", use git review -d 12345 to check out your change locally [20:06:57] the gerrit "change" corresponds to a git "commit". to add a "patch set", you (in git terminology) amend the change: [20:07:02] git commit --amend [20:07:08] then you just push it again: [20:07:17] git review -R [20:07:25] (the -R is so git-review won't try to rebase) [20:07:45] hm... [20:08:23] DanielK_WMDE: OK, great. Thanks. :) I'll try it out in a few mins. [20:08:44] nileshc: WebClientRecommender takes a file name as a parameter. that's bad for testing. it should take some representation of the file's content - an array of IDs, or an InputStream or Reader, or some such [20:09:14] nileshc: it's really annoying that gerrit uses terminology different from git's own. [20:09:18] but you get used to it [20:09:34] the system works quite well once you get the hang of it, though it does have some quirks [20:10:09] hm... [20:10:13] DanielK_WMDE: Yeh.
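The amend-and-push cycle DanielK_WMDE describes can be sketched as a shell session. The Gerrit-facing commands are commented out, since they need git-review installed and a configured remote, and the change number 12345 is his own placeholder; the local amend step is real and runs in a throwaway repo:

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email dev@example.org
git config user.name Dev

# First version of the change (would become patch set 1 on Gerrit):
git commit -q --allow-empty -m "Fix the thing"

# git review -d 12345    # fetch the Gerrit change for local editing
# ...edit files, git add them...

# Fold the fix into the SAME commit instead of making a new one:
git commit -q --amend --allow-empty -m "Fix the thing (improved)"

# git review -R          # push the amended commit as patch set 2
#                        # (-R stops git-review from rebasing first)

git log --oneline        # still exactly one commit
```

Amending rewrites the existing commit in place, which is why Gerrit sees the push as a new patch set on the same change rather than a new change.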
I wonder why on earth they changed all these terms. [20:10:24] don't ask me :P [20:10:40] DanielK_WMDE: Haha! [20:11:45] nileshc: ah, you are not using topic branches, so all your changes depend on each other [20:12:04] that's annoying - that way, you can't amend or merge them independently [20:12:12] please make a separate branch for each change [20:12:20] DanielK_WMDE: That sounds like something I should've done but didn't do. [20:12:29] (branches are quick and cheap in git, i make a dozen a day) [20:12:46] yea - not a big deal. you could fix it by rebasing the changes, but never mind that for now [20:13:04] looks like it's all documentation stuff anyway [20:13:15] i won't review them tonight, but first thing tomorrow [20:13:37] DanielK_WMDE: Oh boy. I'm a bit miserly with branches. I can never predict when I might mess something up with gerrit and cry "Oi, I burnt down my repo agin'". [20:14:43] DanielK_WMDE: ok, so after you review them and they are done with, I'll make fresh changes for the doc improvements (I haven't explained the params yet, that needs to be done) [20:15:47] nileshc: don't overdo the param documentation. their name should be obvious (if not, change the name). add explanations only when there are special constraints on the values. [20:16:49] by the way: don't be scared to play with git. nothing that has been committed is lost (at least not for 2 weeks or so - then stuff that is on no branch may get garbage collected) [20:17:10] DanielK_WMDE: ok...actually they're quite obvious. I won't bother with param docs then. [20:18:02] nileshc: well, in the case of the idListFile param to the WebClientRecommender constructor, you'd have to specify the expected file format... but we are getting rid of that anyway, right?... [20:18:22] DanielK_WMDE: Oh..that's good news. Okay. So from now on I'll be making a separate branch before committing any new change.
[20:18:36] Yes, " WebClientRecommender takes a file name as a parameter" - I've pondered over that too. Actually it takes the name of an environment variable that's set in web.xml using [20:19:35] And that environment variable's value is the file name, set in web.xml. It's a file with one property name in each line... [20:19:59] Here's the thing. [20:20:36] If I remove value suggestions from this engine, I may not be needing TranslatingClientRecommender, this idproplist and a host of other things.. [20:21:01] If I don't, these will stay, and we have to do something about the idListFile. [20:21:02] yay, less code :) [20:21:32] i'd say: get rid of it for now. [20:21:46] if we want a real suggester for values, it'll have to be a separate engine anyway [20:22:07] Right...wait, gimme some time to think about something... [20:25:47] DanielK_WMDE: I'm having a thought...remember all the things I proposed in the email thread? item/property/value KV store blah blah...elasticsearch etc. ? [20:26:31] sure [20:27:00] i'll have to re-read it and think about use cases, but we settled for the simple baseline solution for now, right? [20:27:17] addshore: http://imgur.com/qH5U8IF :o [20:27:24] is that what you saw earlier? [20:27:39] * aude was making a new item [20:28:40] nileshc: what about it? [20:28:44] I think they may be unnecessary and may already be possible with this engine itself (but of course using two separate Myrrix models/instances, one for property-suggestion, one for value)....this just occurred to me. How about doing this: I leave the code for value suggestion just as it is in this engine. But I keep it disabled (so that all tests/demos etc. will be property-suggestion centric. Let the dormant code lie, with proper [20:28:44] documentation, but just use property suggestion for now) [20:29:32] In August, I'll enable it and ask you to test it...we can check if the value suggestions are any good.
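DanielK_WMDE's earlier point that WebClientRecommender should take "an InputStream or Reader" instead of a file name can be sketched like this. PropertyIdList and its method names are hypothetical, not the engine's actual API:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.util.ArrayList;
import java.util.List;

// Sketch of the suggested refactoring: the constructor takes a Reader over
// the ID list rather than a path resolved inside the class, so a test can
// inject content directly instead of needing a file on disk.
class PropertyIdList {
    private final List<String> ids = new ArrayList<>();

    PropertyIdList(Reader source) { // content injected, not loaded from a path
        try (BufferedReader br = new BufferedReader(source)) {
            String line;
            while ((line = br.readLine()) != null) {
                if (!line.trim().isEmpty()) {
                    ids.add(line.trim()); // one property name per line
                }
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    List<String> ids() { return ids; }
}
```

Production code can pass a FileReader (or, as suggested later in the log, a stream obtained via the ClassLoader); a test just passes new StringReader("P31\nP279\n").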
If not, I'll remove its code and move to the baseline implementation that we agreed on? [20:30:10] you really want to try that idea of yours :) [20:30:31] If it works, almost nil extra work will be needed for value suggestions. If it doesn't, we have that mapreduce-find-mysql thing to be done. [20:30:41] I'd say: park the code in a git branch. implement the baseline. if there's time, revive the old code, so we can compare it to the baseline. [20:30:59] The thing is that it's already there, the code for the value suggestions. [20:31:10] (PS11) Daniel Kinzler: (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 [20:31:15] Okay, that's a good idea too! [20:31:46] nileshc: dead wood in the code base is bad. luckily, git will keep the code around, so it's not lost. [20:32:02] (CR) jenkins-bot: [V: -1] (bug 49264) Make SnakValidator fail on bad values. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/69659 (owner: Daniel Kinzler) [20:32:28] grr, too many modules [20:32:48] JeroenDeDauw: this bad value stuff is annoying, it keeps me juggling 3 git repos [20:33:37] DanielK_WMDE: Very true. I agree. So, let me finish off with the code documentations and tests. After everything looks okay, I'll park the code in a separate branch. I'll remove the value suggestions part and setup the property suggester to be tested and demoed. [20:33:49] Sounds good? [20:34:18] yep :) [20:35:58] Cool. Please review the doc stuff in the CR box when you can. I'll resume pushing changes after that tomorrow. [20:38:26] i'll review tomorrow morning [20:40:21] DanielK_WMDE: Okay. And it seems I was wrong about idListFile. It's indeed a file path relative to the classes directory in WEB-INF. Dirty. The web.xml env variable is never used. Anyway, this'll be removed before August. [20:42:54] at the very least, it should be a URL - probably one you get from ClassLoader.getResources. But for testing, that would still suck. [20:45:23] Yes.
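The "park the code in a git branch" advice can be shown as a self-contained shell sketch. The branch name and file are made up for illustration; the point is that removing code from the main line loses nothing once a branch still points at it:

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email dev@example.org
git config user.name Dev

echo "experimental value-suggestion code" > ValueSuggester.java
git add ValueSuggester.java
git commit -q -m "Add value suggestion prototype"

# Park the experimental code on its own branch (name is illustrative)...
git branch valuesuggestion

# ...then remove the dead wood from the main line of development:
git rm -q ValueSuggester.java
git commit -q -m "Remove value suggestion code for now"

# Nothing is lost: the parked branch still has the file, and can be
# diffed against or merged back later.
git show valuesuggestion:ValueSuggester.java
```

This is also why a branch (reachable commits) is safer than relying on git's reflog alone: unreferenced commits can be garbage-collected after a while, as DanielK_WMDE notes above.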
But we don't need to worry about this for now since all that problematic code will move to a different branch anyway. [20:46:16] It'll be a concern while writing tests for value suggestion if we decide to keep it later. [20:48:10] nileshc: it'S probably easier to remove that stuff now, so it doesn't get in the way when writing tests... [20:48:26] well, try and see what works :) [20:51:17] DanielK_WMDE: Yes, that occurred to me too, but if we decide to use the other "valuesuggestion" branch later in August, there would be a problem and I'll have to manually bring these tests over to there... [20:54:51] Bit sleepy, will start pushing new changes tomorrow a'noon after I return from college. Going to hit the sack now. [20:57:03] nileshc: merging is easy with the right tools (and good tests!) [20:57:08] see you tomorrow! [21:06:20] DanielK_WMDE: Okay, then I'll go for it, make a new branch and delete the value stuff for now. [21:06:38] DanielK_WMDE: Yup, good night. :) [22:13:05] Has there been any discussion (or bug filing) about adding an option to wikipedias (on Special:MovePage) to automatically fix the link on wikidata? [22:13:56] Sid-G: goes live tomorrow [22:14:30] rschen7754: thanks. good to know :) [22:16:43] Sid-G: Plus it is already live on Wikivoyage. [22:58:03] rschen7754: \o/ [22:58:19] :D