[01:23:10] Lydia_WMDE_: Whatever autocomplete magic has been wrought of recent, it's really neat! :D
[01:36:04] (PS1) Aude: I am not sure if/why this patch is a problem for travis, but let's see if this makes travis happy [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84477
[01:36:51] (PS2) Aude: I am not sure if/why this patch is a problem for travis, but let's see if this makes travis happy [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84477
[01:39:09] (CR) Hashar: "No clue. Could be an issue in Wikibase or some incompatibility in one of the various extensions being tested." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/72226 (owner: Liangent)
[01:59:44] (CR) Aude: [C: 2] "let's see if travis build succeeds" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84477 (owner: Aude)
[02:01:20] (Merged) jenkins-bot: Revert "(bug 46366) Use SnakFormatter for diffs and summaries" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84477 (owner: Aude)
[02:07:07] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#826 (master - 81874d6 : jenkins-bot): The build was fixed.
[02:07:07] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/209ee8fbc667...81874d6391f3
[02:07:07] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11445779
[02:12:37] (PS1) Aude: Revert "Revert "(bug 46366) Use SnakFormatter for diffs and summaries"" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84478
[02:13:03] (CR) Aude: [C: -1] "somehow this breaks builds on travis" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84478 (owner: Aude)
[02:15:41] (CR) Aude: "it might be worth splitting this into two patches" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84478 (owner: Aude)
[02:57:33] Reedy: knowing Chinese is not really helpful there
[02:57:55] I did respond before he'd actually given context
[02:59:30] Reedy: ok anyway it requires a good understanding of historical and political stuff in both eastern and western cultures to translate it
[02:59:39] or I can just give some transliteration
[03:56:51] any German users still awake?
[05:17:08] nein
[08:52:16] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#828 (master - 27c447c : Tobias Gritschacher): The build was broken.
[08:52:16] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/dfc2ac91ad34...27c447c4300f
[08:52:16] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11455485
[08:52:18] romaine .... you only need ONE link
[08:52:18] that suffices
[08:52:19] (Merged) jenkins-bot: Debug phpunit tests on travis [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84500 (owner: Tobias Gritschacher)
[08:53:15] and the link can be within Wikidata
[08:53:40] that is the theory, in practice every Wikipedia article about a subject needs to be in the same item
[08:53:40] to use it, you need labels ... not interwiki links
[08:53:40] so when I have an article without a link and without a second article I am snookered ?
[08:53:40] hell no
[08:54:23] the core of Wikidata is that information is stored in one place
[08:55:11] I agree on that one
[08:55:37] but that does not mandate two articles or even one article
[08:56:55] to enable data - stored in 1 place - to be used, it needs as basis that the interwikilinks are set properly, all combined in one item
[08:57:07] that is the practice
[08:57:24] that is why many many users work very hard to repair interwikilinks
[08:59:14] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#829 (master - 718d8f9 : jenkins-bot): The build is still failing.
[08:59:14] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/27c447c4300f...718d8f930467
[08:59:14] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11455653
[09:03:51] Romaine: you are confusing the mechanism with making data available
[09:05:22] my bot is adding data FROM wikipedia in Wikidata ... by hand I am adding information lacking in Wikidata, often lacking in any wikipedia as well
[09:11:17] my problem is with articles that do not have an associated wikidata item
[09:20:28] by the way Romaine, the problem maintaining interwiki links, is this something you are doing ?
[09:20:52] also yes
[09:24:20] (PS1) Tobias Gritschacher: Exclude ClaimSummaryBuilderTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84502
[09:24:42] addshore: https://gerrit.wikimedia.org/r/#/c/84502/
[09:26:22] (CR) Addshore: [C: 2] Exclude ClaimSummaryBuilderTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84502 (owner: Tobias Gritschacher)
[09:31:36] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#830 (master - 1c985f6 : Tobias Gritschacher): The build is still failing.
[09:31:36] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/718d8f930467...1c985f6ef04b
[09:31:36] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11456762
[09:37:42] (PS1) Tobias Gritschacher: Revert "Exclude ClaimSummaryBuilderTest" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84503
[09:37:51] (CR) Tobias Gritschacher: [C: 2 V: 2] Revert "Exclude ClaimSummaryBuilderTest" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84503 (owner: Tobias Gritschacher)
[09:43:59] (PS1) Tobias Gritschacher: Exclude SetClaimTest on travis [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84504
[09:44:05] addshore: ^^^
[09:44:38] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#831 (master - 9195d8f : Tobias Gritschacher): The build is still failing.
[09:44:38] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/1c985f6ef04b...9195d8fedea7
[09:44:38] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11457189
[09:45:03] (CR) Tobias Gritschacher: [C: 2 V: 2] Exclude SetClaimTest on travis [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84504 (owner: Tobias Gritschacher)
[09:50:07] Lydia_WMDE: finally found that email you told me about this morning!
[09:51:57] addshore: ;-)
[09:52:49] [travis-ci] wikimedia/mediawiki-extensions-Wikibase#832 (master - f26f0c8 : Tobias Gritschacher): The build was fixed.
[09:52:49] [travis-ci] Change view : https://github.com/wikimedia/mediawiki-extensions-Wikibase/compare/9195d8fedea7...f26f0c8eda1d
[09:52:49] [travis-ci] Build details : http://travis-ci.org/wikimedia/mediawiki-extensions-Wikibase/builds/11457415
[09:54:59] Abraham_WMDE: fyi: https://lists.wikimedia.org/mailman/listinfo/teampractices
[09:57:19] Lydia_WMDE thx
[10:04:00] (PS1) Henning Snater: Made SnakList/Reference order-aware [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/84505
[10:05:14] (PS1) Henning Snater: Registered movetoolbar definition for references [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84506
[10:05:15] (PS1) Henning Snater: Triggering snaklistview's "change" event when moving items [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84507
[10:06:29] (PS1) Tobias Gritschacher: Remove full qualified namespace usages from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508
[10:06:36] (CR) jenkins-bot: [V: -1] Remove full qualified namespace usages from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508 (owner: Tobias Gritschacher)
[10:07:57] (PS2) Tobias Gritschacher: Remove full qualified namespace usages from SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508
[10:13:52] (PS3) Tobias Gritschacher: Cleanup SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508
[10:20:12] DanielK_WMDE: https://travis-ci.org/addshore/mediawiki-extensions-Wikibase/jobs/11458493
[10:35:54] (PS1) Tobias Gritschacher: Don't pass Language object to message [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84511
[10:36:12] (CR) Addshore: [C: 2] Don't pass Language object to message [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84511 (owner: Tobias Gritschacher)
[10:38:05] (PS1) Daniel Kinzler: (bug 52799) Filter JSON dump by shard or type. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84515
[10:38:18] (PS2) Daniel Kinzler: (bug 52799) Filter JSON dump by shard or type. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84515
[10:40:13] (Merged) jenkins-bot: Don't pass Language object to message [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84511 (owner: Tobias Gritschacher)
[10:41:46] (CR) jenkins-bot: [V: -1] (bug 52799) Filter JSON dump by shard or type. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84515 (owner: Daniel Kinzler)
[10:44:58] addshore: https://gerrit.wikimedia.org/r/#/c/84508/
[10:45:40] (CR) Addshore: [C: 2] Cleanup SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508 (owner: Tobias Gritschacher)
[10:47:09] (Merged) jenkins-bot: Cleanup SetClaimTest [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84508 (owner: Tobias Gritschacher)
[10:51:32] (PS1) Henning Snater: Implemented "snaks-order" parameter in reference serializer [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84516
[10:51:33] (PS1) Henning Snater: Implemented "snaks-order" parameter in SetReference API module [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84517
[10:52:02] (PS1) Daniel Kinzler: Fix WikibaseSnakFormatterBuilders getting messages. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84518
[10:54:59] (CR) jenkins-bot: [V: -1] Implemented "snaks-order" parameter in SetReference API module [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84517 (owner: Henning Snater)
[11:19:04] (CR) Addshore: [C: 2] Fix WikibaseSnakFormatterBuilders getting messages. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84518 (owner: Daniel Kinzler)
[11:20:33] (Merged) jenkins-bot: Fix WikibaseSnakFormatterBuilders getting messages. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84518 (owner: Daniel Kinzler)
[11:23:37] (CR) Jeroen De Dauw: [C: -1] "(10 comments)" [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/84505 (owner: Henning Snater)
[11:52:16] (PS3) Henning Snater: Using listview to group referenceview's snaks [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84326
[11:52:31] (PS2) Henning Snater: Registered movetoolbar definition for references [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84506
[11:52:42] (PS2) Henning Snater: Triggering snaklistview's "change" event when moving items [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84507
[12:03:00] (PS3) Henning Snater: Triggering snaklistview's "change" event when moving items [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84507
[12:12:17] (PS1) Henning Snater: Triggering claimview's "change" event when qualifier snaks are reordered [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84521
[12:17:58] (CR) Daniel Werner: "(3 comments)" [extensions/DataValues] - https://gerrit.wikimedia.org/r/84016 (owner: Daniel Kinzler)
[12:32:48] (CR) Tobias Gritschacher: [C: 2] Triggering claimview's "change" event when qualifier snaks are reordered [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84521 (owner: Henning Snater)
[12:34:18] (Merged) jenkins-bot: Triggering claimview's "change" event when qualifier snaks are reordered [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84521 (owner: Henning Snater)
[13:10:38] JeroenDeDauw: I noticed https://github.com/JeroenDeDauw/WikibaseDataModelPython from your email on the Pywikipedia list. Which branch did you look at?
[13:10:43] Compat or core?
[13:28:23] (CR) Addshore: "(1 comment)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84304 (owner: Addshore)
[13:29:30] aude: https://gerrit.wikimedia.org/r/#/c/71996/20/client/includes/parserhooks/PropertyParserFunction.php
[13:43:14] (PD1) Addshore: Remove unused method from Test [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84523
[13:46:14] (CR) Daniel Kinzler: [C: 1] "(7 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[13:48:54] can I search Wikidata to find people who are a) women, b) born in the 1790s, and c) don't have articles in sv.wikipedia ?
[13:49:32] LA2: not yet. the query interface is due by the end of the year
[13:50:09] so maybe catscan on en.wikipedia is the best approach for now?
[13:50:18] Also, 1790 would be a range query, which may take a bit more time.
[13:50:43] I could make 10 queries, one for each year, if there are years (and not individual dates)
[13:50:47] LA2: yes, or catgraph, if it's up again. much faster, but very experimental
[13:50:56] and combination of queries...
[13:51:11] i'm not sure when we will be able to combine multiple queries
[13:51:52] https://en.wikipedia.org/wiki/Category:1790s_births
[13:52:04] hashar: hey! You there? we have two changes on gerrit that are stuck on a mystery failure of jenkins.
[13:52:30] hashar: the one i asked you about a week ago or so is still failing: https://gerrit.wikimedia.org/r/#/c/83816/
[13:52:39] http://208.80.153.172/wdq/ (one of magnus tools) does some limited queries
[13:52:40] and the one by liangent you commented on is too
[13:55:23] hm...
[13:55:42] thinking about it, we will probably need range queries right away. otherwise, queries by year will be impossible.
[13:56:04] (technically, you would have to query every second)
[13:57:10] (PS4) Daniel Kinzler: (bug 52799) Dump JSON of entities listed in file. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84000
[13:57:17] (PS3) Daniel Kinzler: (bug 52799) Filter JSON dump by shard or type. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84515
[14:00:26] Abraham_WMDE: looks like magnus already has that
[14:00:38] don't know how difficult to implement property with wikibase query
[14:00:41] proper
[14:07:42] (CR) Liangent: "(3 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[14:07:52] hmm.... there are some 30-40 men for each woman, and no good categories for women on en.wikipedia ...
[14:12:19] (PD1) Addshore: Add missing @throws tags [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84525
[14:19:40] DanielK_WMDE: I think I manually retriggered the failing job of https://gerrit.wikimedia.org/r/#/c/83816/ and it was passing
[14:19:50] huh
[14:21:46] all three patchsets pass everywhere on travis DanielK_WMDE :P
[14:22:38] hashar: does jenkins run the tests with *all* the other extensions installed on the wiki?
[14:22:51] e.g. spam blacklist or whatever else there is
[14:23:13] * aude wonders if there is some interaction with some other extension
[14:23:25] hi
[14:23:38] is there a reasonably painless way to clear all descriptions from an article?
[14:23:41] hi MatmaRex :P
[14:23:59] MatmaRex: set it to an empty string?
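The "set it to an empty string" suggestion above could be sketched as follows. This is a hypothetical illustration only: it assumes the Wikibase `wbsetdescription` API module (which accepts an empty `value` to clear a description) and a hand-picked language list; a real run would also need an edit token and authentication, both omitted here, so the request is only printed, never sent.

```shell
# Hypothetical sketch: clear an item's descriptions by setting each
# language's description to the empty string via wbsetdescription.
# ITEM and LANGS are assumptions for illustration.
ITEM="Q115748"
LANGS="de it fr"
for LANG in $LANGS; do
  # An empty "value" clears the description for that language.
  QUERY="action=wbsetdescription&id=${ITEM}&language=${LANG}&value=&format=json"
  echo "POST https://www.wikidata.org/w/api.php  ${QUERY}"
done
```

In practice each POST would need `token=` from `action=query&meta=tokens` appended, which is why the gadget-less route is more work than it first appears.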
[14:24:00] https://www.wikidata.org/wiki/Q115748 this dude is swiss, not italian, yet he has 20+ descriptions that say he is
[14:24:02] i don't know
[14:24:10] :o
[14:24:20] because of that silly autoedit gadget
[14:24:28] which will not allow me to clear them, of course
[14:24:35] aude: only some extensions that are explicitly added in the jobs configuration: Ask, Serialization, Diff, DataValues, DataTypes, WikibaseDataModel, Validator
[14:24:57] $entityLookup = $wikibaseClient->getStore()->getEntityLookup();
[14:24:58] $propertyLabelResolver = $wikibaseClient->getStore()->getPropertyLabelResolver();
[14:24:58] $snaksFormatter = $wikibaseClient->newSnakFormatter();
[14:25:01] https://www.wikidata.org/w/index.php?title=Q115748&action=edit&undoafter=25175138&undo=25341275
[14:25:04] what're their versions to use in tests?
[14:25:07] says it can be partially undone
[14:25:10] hashar: you said it was passing, but gerrit marked it as failed.
[14:25:39] DanielK_WMDE: the retriggered job does fail (2 errors, 1 failure) https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/4551/console
[14:25:45] Travis has DataValues, DataTypes, Diff, Datamodel, potentially I should try adding the others to it and running :O
[14:25:46] hashar: i'm not sure what passed for you, the output I saw still had the mystery failure
[14:26:12] hashar: great, now i just need to see *what* errors and failures :)
[14:26:21] addshore: would be interesting
[14:26:24] blah it can't find the test result apparently junit-phpunit-allexts.xml
[14:26:36] i was about to hack our jenkins-specific wikibase.php entry point to inject --log-tap
[14:26:58] hashar: well, phpunit never finishes, it just crashes
[14:28:50] DanielK_WMDE: now running with --tap https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/4552/console
[14:29:07] DanielK_WMDE: maybe that should be the default, the issue is that it produces a super long console output
[14:29:28] aude: any idea where validator is on Packagist?
[14:29:30] DanielK_WMDE: maybe the debug log should be written somewhere and archived in the build result
[14:29:39] addshore: nope
[14:29:42] :<
[14:31:31] (CR) Hashar: "A run of testextensions-master with phpunit --tap is at https://integration.wikimedia.org/ci/job/mwext-Wikibase-testextensions-master/4552" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/83816 (owner: Daniel Kinzler)
[14:33:11] 00:01:17.652 not ok 6418 - Error: Wikibase\Test\Api\CreateClaimTest::testValidRequest O_o
[14:35:48] hashar: it could just skip all output beginning with "ok"? Nobody cares to see a list of everything that passes, right?
[14:36:08] (PS21) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[14:36:15] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[14:41:12] DanielK_WMDE: I hate your https://gerrit.wikimedia.org/r/#/c/82400/13 and https://gerrit.wikimedia.org/r/#/c/83090/8
[14:41:18] DanielK_WMDE: no idea if it can be done
[14:41:18] creating changes impossible to merge
[14:41:33] how can I undo your commits at those two files only?
[14:42:03] hashar: grep -v ^ok
[14:42:07] and redo it on renamed files in my patch
[14:42:15] (Abandoned) Addshore: Use EntityHelper for EntityContent [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84321 (owner: Addshore)
[14:43:21] well liangent you could checkout the version of those files from before they were merged
[14:43:27] what exactly are you trying to do?
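The TAP-filtering one-liner mentioned above (`grep -v ^ok`, 14:42:03) can be demonstrated with a toy log. The sample input below is made up for illustration, loosely modeled on the `not ok 6418` line quoted in the channel; the point is that `grep -v '^ok'` drops passing tests while keeping `not ok` lines (they start with "not") and diagnostic comments.

```shell
# Build a tiny fake TAP log (contents invented for this sketch).
printf '%s\n' \
  'ok 6417 - Wikibase\Test\Api\SetClaimTest::testRequest' \
  'not ok 6418 - Error: Wikibase\Test\Api\CreateClaimTest::testValidRequest' \
  '# some diagnostic output' > tap.log
# Keep everything except lines that start with "ok" (i.e. passing tests).
grep -v '^ok' tap.log
```

On a real jenkins console this would shrink thousands of `ok` lines down to the handful of failures worth reading.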
[14:44:18] liangent: something like: git diff > something.patch; patch -R something.patch; patch something.patch;
[14:44:30] addshore: Daniel changed some functions; I moved those functions around, to different files
[14:44:47] or, as addshore suggested, use git checkout instead of patch -R to restore the version before my changes
[14:45:27] liangent: if the files are very similar, patching will work. otherwise, you'll have to do it by hand...
[14:47:35] (PS1) Liangent: Revert 3fab17d4 and 61dacb15 for PropertyParserFunction [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84526
[14:48:41] (PS22) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[14:49:30] (CR) jenkins-bot: [V: -1] Revert 3fab17d4 and 61dacb15 for PropertyParserFunction [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84526 (owner: Liangent)
[14:50:40] commuting to coworking place for some conf calls.
[14:50:58] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[14:52:58] Does Wikidata have a sandbox entry?
[14:55:43] DanielK_WMDE: I can't really understand your changes... SnakFormatter was changed to OldSnakFormatter then changed back?
[14:56:33] liangent: no. SnakFormatter was renamed to OldSnakFormatter, and a new, different SnakFormatter was introduced. In the next change, all usages of the old SnakFormatter were removed or replaced by the new SnakFormatter
[14:58:40] (PS1) Liangent: Revert "Revert 3fab17d4 and 61dacb15 for PropertyParserFunction" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84528
[14:59:13] (CR) Liangent: [C: -2] "To create a reverse diff for easier read." [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84528 (owner: Liangent)
[15:00:17] DanielK_WMDE: so I need to apply https://gerrit.wikimedia.org/r/#/c/84528 to my patch
[15:00:23] is there anything I need to take care of?
[15:01:00] other than applying it on PropertyParserFunctionRenderer.php instead of PropertyParserFunction.php ?
[15:04:36] liangent: i don't think so, no. Basically, the way you get a SnakFormatter changes. Now you get it from a SnakFormatterFactory, which you get from WikibaseRepo or WikibaseClient
[15:06:15] https://www.wikidata.org/wiki/Q4115189
[15:06:16] :)
[15:07:00] zuzak: that is the sandbox
[15:07:00] zuzak: ---^
[15:09:54] thanks
[15:11:37] (PS23) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[15:13:59] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[15:14:05] DanielK_WMDE: Fatal error: Undefined class constant 'Wikibase\Lib\SnakFormatterFactory::FORMAT_PLAIN' in .../Wikibase/client/includes/WikibaseClient.php ?
[15:17:22] (CR) Daniel Kinzler: "@addshore: do you still want to refactor that one function jeroen commented on?" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/81671 (owner: Addshore)
[15:18:46] (CR) Addshore: "If I do it I will do it in a follow up and refactor all of the similar functions at the same time" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/81671 (owner: Addshore)
[15:19:56] (PS24) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[15:21:38] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[15:22:47] (CR) Liangent: "(3 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[15:24:59] DanielK_WMDE: is it SnakFormatterFactory::FORMAT_* or SnakFormatter::FORMAT_* ?
[15:26:56] changed to SnakFormatter::FORMAT_*
[15:27:10] SnakFormatterFactory::FORMAT_* shouldn't be used anywhere any more
[15:27:16] might show up during rebase though
[15:28:55] DanielK_WMDE: any chance that you're 1) German, and 2) a video gamer?
[15:29:31] Sven_Manguard: German yes, Gamer no. Why?
[15:29:36] (PS25) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[15:29:46] I need someone to help me with some questions about the Unterhaltungssoftware Selbstkontrolle
[15:30:02] Mainly, I want to know how people refer to USK ratings in conversation.
[15:30:25] Sven_Manguard: still need my reply to your question about Chinese ?
[15:30:38] liangent: if you remember it, sure
[15:30:45] I don't remember the question :S
[15:31:11] DanielK_WMDE: "Approved for children aged 12 and above in accordance with Art. 14 German Children and Young Persons Protection Act (JuSchG)." appears to be the official name for the middle rating, but I want to know what someone would actually refer to it as
[15:31:17] Sven_Manguard: no idea, sorry.
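The revert-specific-files workflow discussed around 14:44 (git checkout of the pre-change version, as an alternative to `patch -R`) can be sketched in a throwaway repository. The repo, file name, and commit contents below are all invented for illustration; note also that the quoted `patch -R something.patch` form would actually need the patch on stdin or via `-i`, e.g. `patch -R -i something.patch`.

```shell
# Sketch: restore one file to its state before the most recent commit,
# without touching the rest of the tree. Everything here is a toy setup.
git init -q demo && cd demo
git config user.email you@example.com && git config user.name you
echo v1 > a.php && git add a.php && git commit -qm "before"
echo v2 > a.php && git commit -qam "after"
# Check out the file as it was one commit earlier; other files are untouched.
git checkout HEAD~1 -- a.php
cat a.php
```

For liangent's case the `HEAD~1` would instead be the commit just before Daniel's 3fab17d4/61dacb15 changes, and the restored content would then be re-applied by hand to the renamed file.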
[15:31:23] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[15:32:27] Sven_Manguard: it was "Or Japanese. Either would work. Just want to know what the label for https://www.wikidata.org/wiki/Q10911535 should be in English"
[15:32:36] and I have no idea. I can only transliterate it
[15:32:39] Sven_Manguard: with movies, it's just "Ab 12", sometimes "FSK 12".
[15:32:40] liangent: yes, that
[15:32:43] I need a label
[15:34:37] DanielK_WMDE: https://integration.wikimedia.org/ci/job/mwext-Wikibase-client-tests/3080/console
[15:34:42] failure with no reason again
[15:35:05] DanielK_WMDE: that is https://gerrit.wikimedia.org/r/71996
[15:35:07] hashar: ^
[15:35:53] seems like you are hit by the same issue daniel is hit by in another change :/
[15:38:28] Sven_Manguard: "Vice Chancellor"? home-made translation based on what I read on https://zh.wikipedia.org/wiki/%E5%8F%83%E7%9F%A5%E6%94%BF%E4%BA%8B and https://zh.wikipedia.org/wiki/%E5%AE%B0%E7%9B%B8 and https://en.wikipedia.org/wiki/Chancellor_%28China%29 ...
[15:38:41] alright
[15:38:43] it's just some position
[15:38:44] thanks
[15:38:59] is it a Vice Chancellor for anything in specific?
[15:41:26] Sven_Manguard: per zhwiki article, it's something with a lower rank then https://en.wikipedia.org/wiki/Chancellor_%28China%29 , but has an effectively similar power
[15:41:34] s/then/than
[15:41:43] hashar: the place where it fails with --tap is DIFFERENT from the place where it fails without --tap :)
[15:42:57] DanielK_WMDE: what does the Ab in Ab 12 mean?
[15:44:07] DanielK_WMDE: :-(
[15:44:08] the line it fails on is 113 in PropertyInfoTableBuilderTest ;p
[15:44:44] Sven_Manguard: "From" or "Since". As in: ok to watch "from the age of 12"
[15:44:54] Ah, okay
[15:45:23] so then the English language label would be USK 12 and the alias would be From 12?
[15:45:35] DanielK_WMDE: client/includes/WikibaseClient.php: public function newSnakFormatter( $format = SnakFormatterFactory::FORMAT_PLAIN, FormatterOptions $options = null ) {
[15:45:40] hashar: random phpunit crashes are a huge waste of time :/
[15:46:01] liangent: on master?!
[15:46:37] DanielK_WMDE: yeah
[15:47:13] (PS1) Liangent: Change SnakFormatterFactory::FORMAT_* to SnakFormatter::FORMAT_* [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84534
[15:47:18] DanielK_WMDE: ^
[15:47:23] liangent: eek. how did i miss that?! I can fix it in a minute.
[15:49:13] (CR) Daniel Kinzler: [C: 2] Change SnakFormatterFactory::FORMAT_* to SnakFormatter::FORMAT_* [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84534 (owner: Liangent)
[15:49:19] thanks liangent
[15:50:07] DanielK_WMDE: there's another occurrence in PropertyParserFunction, but I fixed it in my patch
[15:50:22] don't fix it in master now or it conflicts again...
[15:51:13] (Merged) jenkins-bot: Change SnakFormatterFactory::FORMAT_* to SnakFormatter::FORMAT_* [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84534 (owner: Liangent)
[15:51:22] (PS2) Liangent: Revert 3fab17d4 and 61dacb15 for PropertyParserFunction [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84526
[15:51:39] (PS26) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[15:54:05] liangent: but it needs to be fixed on master by tomorrow (branch day)
[15:54:52] DanielK_WMDE: then review my code now :p
[15:55:05] (CR) jenkins-bot: [V: -1] Revert 3fab17d4 and 61dacb15 for PropertyParserFunction [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84526 (owner: Liangent)
[15:57:08] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[16:00:06] (PS7) Daniel Kinzler: (bug 52799) Introducing dumpJson. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/83816
[16:00:53] liangent: i'll try, but if it doesn't go in by tomorrow, we need a separate fix on master
[16:01:06] (PS2) Henning Snater: Made SnakList/Reference order-aware [extensions/WikibaseDataModel] - https://gerrit.wikimedia.org/r/84505
[16:01:19] (PS2) Henning Snater: Implemented "snaks-order" parameter in reference serializer [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84516
[16:01:48] DanielK_WMDE: ok then squash its reverse patch into https://gerrit.wikimedia.org/r/#/c/84526/
[16:09:02] (CR) Daniel Kinzler: "(4 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/79988 (owner: Addshore)
[16:10:04] (PS27) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[16:11:55] (PS28) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[16:12:20] (CR) Addshore: [C: -1] "(15 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/83816 (owner: Daniel Kinzler)
[16:14:00] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[16:15:47] (CR) Aude: "(2 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/83816 (owner: Daniel Kinzler)
[16:17:26] (CR) Daniel Kinzler: [C: 2] Remove unused method from Test [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84523 (owner: Addshore)
[16:19:08] (Merged) jenkins-bot: Remove unused method from Test [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84523 (owner: Addshore)
[16:20:28] (CR) Daniel Kinzler: [C: 2] Add missing @throws tags [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84525 (owner: Addshore)
[16:20:48] DanielK_WMDE: hmm is output already wfEscapeWikitext()'d when it's specified FORMAT_WIKI ?
[16:21:10] liangent: yes.
[16:21:13] DanielK_WMDE: it seems to be double escaped now
[16:21:14] ok
[16:21:48] USK IS GO!
[16:21:57] (Merged) jenkins-bot: Add missing @throws tags [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84525 (owner: Addshore)
[16:22:02] liangent: actually, you may want to get a SnakFormatter for a different output format depending on the current parser mode
[16:22:09] that would actually be kind of ne
[16:22:12] *neat
[16:22:18] maybe
[16:22:28] can't foresee all implications
[16:22:35] All German speaking video gamers everywhere will celebrate this day. Maybe.
[16:23:45] Sven_Manguard: yay
[16:24:30] There are 12 to do, I'm doing them in order of how easy it is to understand the rating system. Therefore Germany's system is the fourth least complicated
[16:24:41] although, honestly, that's not really true
[16:25:10] PEGI is more complicated because of Portugal, but Germany also has its blacklist, which Wikidata doesn't support yet
[16:25:13] so whatever
[16:26:20] anyways
[16:26:22] (PS29) Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996
[16:28:10] (CR) jenkins-bot: [V: -1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - https://gerrit.wikimedia.org/r/71996 (owner: Liangent)
[16:29:08] (PD6) Aude: (bug 51876) handle populates sites table interwiki ids correctly [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84520
[16:29:19] (PS7) Aude: (bug 51876) handle populates sites table interwiki ids correctly [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84520
[16:29:51] DanielK_WMDE: https://gerrit.wikimedia.org/r/#/c/84520/
[16:29:53] feedback please
[16:30:13] * aude struggles with where to put this stuff (+ needs tests)
[16:31:05] (PS8) Daniel Kinzler: (bug 52799) Introducing dumpJson. [extensions/Wikibase] - https://gerrit.wikimedia.org/r/83816
[16:31:28] populate sites could work in a number of ways
[16:31:44] like use $wgConf directly, if it's populating for own wiki farm
[16:32:08] could be part of site matrix extension, or core (although it's kind of specific)
[16:32:15] could be part of wikimedia maintenance scripts
[16:32:33] site matrix could be improved to use Site classes
[16:33:11] * aude proceeds to have this add special sites like commons :)
[16:34:07] DanielK_WMDE: are you still working on the jenkins issue?
[16:41:54] liangent: reviewing again. prioritizing stuff that needs to go into the branch.
[16:42:01] but you are high on my list :)
[16:43:46] DanielK_WMDE: btw I have never manually run full unit tests locally for my patches
[16:47:23] liangent: i always run the tests for what i touched - that saves a lot of time and trouble. i often also run --group Wikibase, especially when touching a lot of files or classes that are used a lot.
[16:47:38] i only run *all* tests when investigating odd jenkins failures
[16:49:52] DanielK_WMDE: I always `git review` a patch once I stop coding, even before I manually test it
[16:50:10] (CR) Daniel Kinzler: [C: -1] "(10 comments)" [extensions/Wikibase] - https://gerrit.wikimedia.org/r/84520 (owner: Aude)
[16:50:11] that's why my changes have so many patches stacked there
[16:50:39] liangent: why do you do that? it takes a lot longer for jenkins to test your patches than to run the test locally
[16:51:06] liangent: hint: if you run phpunit with the fill path of the file, it's *fast*. running with --group or --filter is a lot slower
[16:51:17] *full path to the file
[16:51:21] DanielK_WMDE: but they can run simultaneously in this way
[16:51:37] liangent: who cares if running them locally takes about 1 second?
[16:51:41] if I do it in the other way, I have to wait for jenkins
[16:51:55] why do you have to wait?
[16:51:57] DanielK_WMDE: so where to put populate sites?
[16:52:26] aude: you mean inside Wikibase? I'd just move it to core. [16:52:34] I never know the hint :p [16:52:37] DanielK_WMDE: hmmm [16:52:38] I always --filter= [16:52:55] i think the script is still ugly [16:53:01] and I feel it's slow so throw it to jenkins [16:53:32] liangent: i use that a lot, too - it's more convenient. but with --group or --filter, phpunit will first go through all tests, then run all providers (so it can show you a percentage), then apply the filter, then run your test [16:53:48] running just the one file is a lot faster [16:54:00] i have set it up as an external tool in my ide, so it's 2 clicks [16:54:21] no xdebug that way though - i use --group XXX for that. [16:55:02] liangent: i recommend to do quick preliminary testing locally. jenkins should be the final test for the patch set. [16:55:41] lovely how site matrix calls all the non-special sites "Wikipedia" [16:55:46] http://meta.wikimedia.org/w/api.php?action=sitematrix&format=json [16:56:40] DanielK_WMDE: I guess I'm even unaware of the "correct" way to do tests [16:56:49] I created a suites.xml for wikibase manually [16:57:16] and phpunit.php --configuration=wikibase-suites.xml [16:57:40] and created a wrapper script [16:57:44] liangent: just use mediawiki's main suite.xml with --group Wikibase [16:58:10] i just go into mw/tests/phpunit and call php phpunit.php --group Wikibase [16:58:25] that will run all wikibase tests [16:58:49] having a custom suite is probably handy to test all the dependencies (Diff, DataValues, etc) [17:02:55] liangent: you'll have to use OldSnakFormatter instead of SnakFormatter in https://gerrit.wikimedia.org/r/#/c/84526/2 [17:03:10] SnakFormatter is an interface now. [17:03:28] (03CR) 10Daniel Kinzler: [C: 04-1] "use OldSnakFormatter instead of SnakFormatter" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/84526 (owner: 10Liangent) [17:03:29] DanielK_WMDE: do not read anything at https://gerrit.wikimedia.org/r/#/c/84526 [17:03:40] liangent: hm?
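liangent's hand-written wikibase-suites.xml is never pasted into the channel; a minimal sketch of what such a custom PHPUnit suite file might look like, covering Wikibase plus the dependencies DanielK_WMDE mentions (all directory paths here are assumptions, not the actual file):

```xml
<!-- Hypothetical wikibase-suites.xml; paths are guesses, adjust to your checkout -->
<phpunit>
    <testsuites>
        <testsuite name="Wikibase">
            <directory>extensions/Wikibase/repo/tests/phpunit</directory>
            <directory>extensions/Wikibase/client/tests/phpunit</directory>
            <directory>extensions/Wikibase/lib/tests/phpunit</directory>
        </testsuite>
        <testsuite name="Dependencies">
            <directory>extensions/DataValues/tests/phpunit</directory>
            <directory>extensions/Diff/tests/phpunit</directory>
        </testsuite>
    </testsuites>
</phpunit>
```

It would then be run the way liangent describes: phpunit.php --configuration=wikibase-suites.xml.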
[17:03:49] but your other change depends on it [17:03:51] it's just a literal revert [17:04:06] but we can't merge it [17:04:32] DanielK_WMDE: force merge it at the same time as https://gerrit.wikimedia.org/r/#/c/71996/ [17:04:41] how? [17:04:43] https://gerrit.wikimedia.org/r/#/c/71996/ rewrote everything affected by https://gerrit.wikimedia.org/r/#/c/84526 [17:04:47] you'd have to squash it first [17:05:07] DanielK_WMDE: remove jenkins' comment? [17:05:24] hm squashing should work [17:05:38] and then merge broken code, causing jenkins to fail for everything until the other patch is in? [17:05:40] eek... [17:05:54] please squash :) [17:07:39] (03PS30) 10Liangent: Have labels shown in variants for {{#property: }} [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/71996 [17:07:45] (03Abandoned) 10Liangent: Revert 3fab17d4 and 61dacb15 for PropertyParserFunction [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/84526 (owner: 10Liangent) [17:08:47] DanielK_WMDE: ^ [17:09:44] (03CR) 10jenkins-bot: [V: 04-1] Have labels shown in variants for {{#property: }} [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/71996 (owner: 10Liangent) [17:10:12] DanielK_WMDE: I feel https://gerrit.wikimedia.org/r/#/c/71996/30/client/includes/parserhooks/PropertyParserFunction.php is completely unreadable as a diff [17:10:29] you can just read the new versions [17:10:57] liangent: and jenkins is still failing :/ [17:11:21] DanielK_WMDE: with no good reason [17:11:23] i managed to fix it for my patch, but the issue here seems unrelated [17:11:46] liangent: yes, and especially no good message. usually, there is a reason (i.e. a failing test or php error). 
[17:11:58] but phpunit then crashes instead of reporting it [17:14:15] DanielK_WMDE: in local run of wikibaseclient tests I got a database error immediately [17:15:18] Query: CREATE TEMPORARY TABLE `metawiki.sites` (LIKE `metawiki.sites`) [17:15:19] Function: DatabaseMysqlBase::duplicateTableStructure [17:15:19] Error: 1066 Not unique table/alias: 'metawiki.sites' (localhost) [17:15:47] that's annoying that I have to look at jenkins [17:16:42] liangent: that sounds very strange - it's supposed to add a prefix to the table name. [17:16:58] something must be odd about your database setup [17:19:25] DanielK_WMDE: I have no idea how "odd" it is [17:26:10] (03CR) 10Daniel Kinzler: "To investigate the strange jenkins failure, put this into the Wikibase.php file in the extensions root dir (not the one in the repo dir):" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/71996 (owner: 10Liangent) [17:27:00] liangent: see my comment for an ugly hack to make phpunit at least tell us *where* it is failing. [17:32:10] aude: ping [17:33:02] hi [17:33:22] hello! [17:33:57] will you have some time today? [17:34:07] some... [17:34:11] (03CR) 10Daniel Kinzler: "(3 comments)" [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/84016 (owner: 10Daniel Kinzler) [17:34:16] Danwe_WMDE: --^ [17:34:19] i am at the office for another ~30 minutes and then online from home later [17:34:29] okay then can you help me get started on that final bit? [17:34:32] ParserOutput -> skin [17:34:34] sure [17:36:03] how do I get started? [17:36:52] what we are putting in parser output / output page will be pretty much the same as what is currently in the $mJsConfigVars variable in output page [17:37:08] we may want to move away from putting stuff in that variable... i don't know [17:40:09] if it's already there in $mJsConfigVars can't we just use that? [17:40:19] oh we want to stop using that variable?
[17:40:36] we want to stop using it but [17:40:48] to allow you to proceed, it might be okay for now [17:41:03] i am working on a patch to do it more properly [17:41:33] see https://bugzilla.wikimedia.org/show_bug.cgi?id=54215 and https://bugzilla.wikimedia.org/show_bug.cgi?id=54216 [17:41:37] and https://bugzilla.wikimedia.org/show_bug.cgi?id=54217 [17:44:02] (03CR) 10Daniel Werner: "(1 comment)" [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/84016 (owner: 10Daniel Kinzler) [17:44:10] aude: Did you change something with the globes? [17:44:58] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q3783&props=claims&format=json now returns "globe": "earth" [17:45:23] (03CR) 10Daniel Kinzler: "(1 comment)" [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/84016 (owner: 10Daniel Kinzler) [17:45:28] I remember it as "globe": "http://www.wikidata.org/entity/Q2" [17:46:05] aude: when we put our entity data into parser output, we put it in the api serialization format? [17:46:25] multichill: that's odd [17:46:32] pragunbhutani: yes [17:46:50] Oh right, at https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q9920&props=claims&format=json it does return that [17:47:22] multichill: might just be a very old value, when this wasn't checked [17:47:32] aude: okay so we then access it in the skin, deserialize it, format it using value/snak formatters and then use it in the skin [17:47:36] am I correct? [17:47:39] pragunbhutani: the idea is that the api serialization should be more stable and we can build backwards compatibility into an entity deserializer [17:47:43] multichill: Q3783 is quite old, i'd guess [17:47:50] pragunbhutani: yep [17:47:59] multichill: you can use Special:Export to see what's "really" in there.
[17:48:01] multichill: it could also be just that item [17:48:06] or do you see it everywhere [17:48:09] * DanielK_WMDE thinks so [17:48:15] * DanielK_WMDE hopes not [17:48:17] DanielK_WMDE: we hit 5000 items within a day or two of the project going live. Four digits is really, really old [17:48:32] unfortunately i don't think we do much validation on the globe param, so bots could put in "earth" [17:48:35] Sven_Manguard: i know, but the statement might not be [17:48:48] https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q60&props=claims&format=json [17:48:50] Did a null edit to fix it [17:48:52] see for comparison [17:49:02] Oh. Well, I'm not paying attention to context, apparently [17:49:25] probably a bot can help fix the non-conforming globe values [17:50:07] aude: okay, so I guess I should get started with bug 54215 then. How do I go about it? [17:50:29] DanielK_WMDE / aude : Shouldn't that be '//....' instead of 'http://...' btw? [17:50:43] pragunbhutani: i would start with making use of the serialized data [17:50:55] for now, what is in mJsConfigVars would be okay [17:50:56] (03CR) 10Daniel Werner: "(1 comment)" [extensions/DataValues] - 10https://gerrit.wikimedia.org/r/84016 (owner: 10Daniel Kinzler) [17:51:33] okay let me just take a look at entity view code once to see where mJsConfigVars is [17:51:33] pragunbhutani: putting the data into parser output is the easy part [17:51:57] making use of it is harder [17:52:11] multichill: no. It's a URI. URIs (and URLs) can not start with //. [17:52:14] pragunbhutani: if you do view:source for any wikidata page [17:52:26] it is stuffed into the page as a json variable [17:52:34] multichill: these are URL fragments usable in links, not full URLs. For URLs, I don't care much, but for URIs I do.
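Since aude notes above that the globe param wasn't validated, so bots could store strings like "earth" instead of the entity URI "http://www.wikidata.org/entity/Q2", a cleanup bot would need roughly this kind of check. A minimal JavaScript sketch; isGlobeConforming() and the sample values are invented for illustration, with the value shape following the wbgetentities output quoted above:

```javascript
// Hypothetical helper for a globe-cleanup bot; isGlobeConforming() is made up,
// the datavalue shape mirrors the wbgetentities output discussed above.
function isGlobeConforming( coordinateValue ) {
    // Conforming globes are entity URIs like "http://www.wikidata.org/entity/Q2";
    // old bot imports stored bare strings like "earth" instead.
    return /^https?:\/\/www\.wikidata\.org\/entity\/Q\d+$/.test( coordinateValue.globe );
}

// Example values in the shape of a globecoordinate datavalue:
var good = { latitude: 52.5, longitude: 13.4, globe: 'http://www.wikidata.org/entity/Q2' };
var bad  = { latitude: 52.5, longitude: 13.4, globe: 'earth' };

console.log( isGlobeConforming( good ) ); // true
console.log( isGlobeConforming( bad ) );  // false
```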
[17:52:52] or in the js console you can do mw.config.get('wbEntity'); [17:53:01] righ [17:53:04] t [17:53:36] aude: okay I'll just check it out [17:54:10] ok [17:54:23] it's what the javascript uses to format stuff [17:59:13] aude: uh oh [17:59:15] I see what you mean [17:59:25] ? [18:01:24] making use of it is not going to be easy [18:01:38] that's why we need deserializers [18:02:07] i am almost done with my patch for that [18:02:18] so then you have entity objects [18:02:19] for implementing a deserializer? [18:02:23] yes [18:02:40] okay yes, then we'd have the objects [18:02:47] we could just put entity objects in the parser output but it's risky [18:03:00] in terms of handling when the structure changes [18:03:13] hmm yes [18:03:56] otherwise, perfectly okay to put objects there [18:05:41] so that leaves me with accessing the objects in the mobile skin? [18:05:46] yes [18:06:16] although without need for database access, etc. [18:06:21] which is too heavy at that level [18:06:37] (or having to do api calls etc.) [18:08:13] we wouldn't have to do that, would we? [18:08:37] nope [18:08:54] that's the idea of having stuff in parser output and output page [18:12:54] so should I wait for you to finish the deserializer before I try to access the data in the skin? [18:13:22] well, if you have an entity object [18:13:29] then what to do with it?
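Outside a browser there is no real mw.config, so this sketch stubs it with a cut-down entity in the API serialization format being discussed; the stub and the sample item are assumptions, but on an actual Wikidata page the mw.config.get( 'wbEntity' ) call is exactly the JS console example aude gives above:

```javascript
// Stub of MediaWiki's mw.config, for illustration only; in the browser console
// on a Wikidata page you would just call mw.config.get( 'wbEntity' ) directly.
var mw = {
    config: {
        values: {
            // Cut-down sample in the API serialization format; real entities
            // also carry descriptions, aliases, claims and sitelinks.
            wbEntity: JSON.stringify( {
                id: 'Q64',
                labels: { en: { language: 'en', value: 'Berlin' } }
            } )
        },
        get: function ( key ) { return this.values[ key ]; }
    }
};

var entity = JSON.parse( mw.config.get( 'wbEntity' ) );
console.log( entity.labels.en.value ); // Berlin
```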
[18:13:38] i think you can figure that part out [18:18:04] (03PS8) 10Aude: (bug 51876) handle populates sites table interwiki ids correctly [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/84520 [18:19:14] aude: I can see how the claims are stored [18:19:29] pragunbhutani: the claims are the tricky part [18:19:53] on the other hand, there already is some claim deserializer code [18:21:30] I think we have deserializers for all parts of entities, just not for the entity itself [18:21:41] the serializers are all in WikibaseLib [18:21:43] DanielK_WMDE: pretty much [18:21:58] is it possible to see the output of the claim deserializer in the console? [18:22:08] and putting together the labels, aliases, etc. but that's not difficult [18:22:25] pragunbhutani: these deserializers are in php [18:22:26] pragunbhutani: it's a claim object... how do you want to see it? [18:22:30] you could use var_dump [18:22:56] or there is a maintenance script console mode [18:23:00] in mediawiki [18:23:08] there is, but i have never used it [18:23:13] maintenance/eval.php where you can try stuff, [18:23:18] although tricky with all the namespaces we use [18:23:26] i don't know about deserializers in JS... there are some, but... [18:23:39] (03CR) 10Umherirrender: [C: 04-1] "(2 comments)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/81671 (owner: 10Addshore) [18:25:10] pragunbhutani: i am going home... back online in an hourish [18:25:17] * aude needs to eat [18:25:23] aude: okay [18:25:39] i can put my patch on gerrit for serializers [18:25:42] aude: feeling a little lost at the moment but I'll continue to look around till I can see some sense [18:26:04] I'll be right here when you return [18:26:05] hmmmm [18:26:24] the code that represents the entity, claim, etc. is all in WikibaseDataModel [18:26:38] i would look through that and you can see it has stuff like [18:26:44] Item::getLabels() [18:26:55] Item is a subclass of Entity [18:27:18] Item and Property, right?
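For comparison, a toy JavaScript analogue of the WikibaseDataModel accessors aude points at: Item::getLabels() is PHP, and this Item "class" is invented purely for illustration, working directly on the serialized form rather than on real entity objects:

```javascript
// Toy analogue of WikibaseDataModel's Item accessors; invented for
// illustration, not the actual JS API shipped with Wikibase.
function Item( serialization ) {
    this.data = serialization;
}

Item.prototype.getLabel = function ( languageCode ) {
    var label = this.data.labels && this.data.labels[ languageCode ];
    return label ? label.value : null;
};

Item.prototype.getLabels = function () {
    var labels = {}, data = this.data.labels || {};
    for ( var code in data ) {
        labels[ code ] = data[ code ].value;
    }
    return labels;
};

var item = new Item( {
    id: 'Q64',
    labels: {
        en: { language: 'en', value: 'Berlin' },
        de: { language: 'de', value: 'Berlin' }
    }
} );

console.log( item.getLabel( 'en' ) ); // Berlin
console.log( item.getLabel( 'fr' ) ); // null
```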
[18:27:20] yes [18:27:38] or getLabel( 'en' ) [18:27:39] I'll look through WikibaseDataModel then [18:27:40] i think [18:28:02] and then insert it into the skin (of course with proper escaping) [18:28:40] * aude with entity view was less of a mess :/ [18:28:46] wish [18:29:03] it's some of our oldest code [18:29:39] if we could do things like getLabel( $language ), where the $language is a changeable setting, that would solve another problem [18:29:45] but one thing at a time, I think [18:29:47] true [18:29:53] I'll read about WikibaseDataModel [18:30:07] alright, back in ~hour [18:30:12] see you then [18:54:38] aude: At https://www.wikidata.org/wiki/Q3211336 a property is deleted (P288). How does the interface know how to render it? How does it know P288 took an item? [18:55:07] Oh, nvm, found it [18:55:43] multichill: because the value itself also has a type. [18:55:59] Yeah, was looking at the api, but overlooked it [18:58:02] /lurk [18:59:40] Ah, found it. The fromJSON tries to do a lookup for a property, but it's deleted so it fails :P [19:00:04] hi JeroenDeDauw, did you take a look at compat or core for Pywikipedia and Wikidata? [19:00:18] Implementation in compat is a bit of a mess.... [19:01:06] addshore: is Addbot yours? [19:01:18] If so, please take this wet trout and smack yourself a few times [19:01:28] for bad geocoord importing [19:01:40] Sven_Manguard: just wanted to say that it is his [19:01:42] Sven_Manguard :O Really? [19:01:52] but now I'll better shut up :D [19:02:07] Let me see which Repo addbot is actually working on atm :p [19:02:09] JohnLewis: https://www.wikidata.org/wiki/User:Byrial/Globes#Coordinate_values_with_globe_given_as_a_string_instead_of_an_item [19:02:22] ALL OF THEM [19:02:43] Reedy: IMPOSSIBLE [19:03:06] Reedy: well, I'm not sure about the title of the section, but the addbot imports didn't specify a percicion (i.e. 1 arcsecond, 0.1 arcsecond, etc.) [19:03:18] which is, apparently, bad [19:03:37] OH percicion. 
[19:03:37] addshore: you're the worst [19:03:59] Reedy: fill his desk with toeads! [19:04:17] toeads: Toads with human toes [19:04:57] Sven_Manguard: Even worse; I don't know which Repo addbot is running on atm and whatever it is, I bet I don't have push access :p [19:05:18] JohnLewis: these are, apparently, old additions [19:05:22] all of the ones on that page are [19:05:29] Really? [19:05:34] let me check [19:05:54] I see July 31st atm. [19:05:55] 16:06, 31 July 2013‎ Addbot (talk | contribs | block)‎ . . (11,358 bytes) (+324)‎ . . (‎Created claim: Property:P625) (undo) (restore) [19:05:56] yeah [19:06:01] that's old [19:06:13] when did the geocoord datatype go live? [19:06:18] I'll just kill him :p [19:06:40] geocoord went live june 10 [19:19:04] Multichill: did you see https://www.wikidata.org/wiki/Wikidata:Administrators%27_noticeboard (latest thread) [19:19:47] Stryn: No, I didn't [19:19:57] oh, please could you stop your bot? [19:20:21] Or fix it. Stopping it isn't the only solution. [19:20:32] Stryn: What as asshole [19:20:36] *an [19:21:06] your bot added many wrong categories [19:21:11] Ivan bad idea [19:21:24] Stryn: No, Ivan is an idiot [19:22:00] He doesn't like the property so now he's trying to sabotage it [19:22:30] Stryn: https://www.wikidata.org/w/index.php?title=Q26540&diff=70987105&oldid=70739798 <- tell me what is wrong with this edit? [19:22:51] tbh; When Ivan comes across something they dislike, they do try to do anything they can to stop it. (My experience, not the truth) [19:23:02] nothing wrong with this one [19:23:54] one wrong: https://www.wikidata.org/w/index.php?title=Q7214956&diff=70976869&oldid=62120809 [19:24:40] this was also: https://www.wikidata.org/w/index.php?title=Q261&diff=70979229&oldid=69625081 [19:25:17] That was already a user fuck up [19:27:35] Stryn: Linkin Park one is right enough.
The songs were connected, that other one wasn't [19:33:46] multichill: I saw some stuff, though this was a while ago [19:34:05] multichill: did you see my email on having a standalone python implementation of the data model? [19:34:23] Yes, that's why I asked you what code you looked at [19:34:35] multichill: ok, I think I understand now how it works... e.g. in https://www.wikidata.org/wiki/Q189305 (Kyustendil) this https://www.wikidata.org/w/index.php?title=Q189305&diff=70995566&oldid=68671851 (Category:People from Kyustendil) is not correct, but your bot adds it, because Kyustendil is here: https://www.wikidata.org/wiki/Q6362637 [19:35:02] So it will just show up in the conflicts section so it can easily be corrected [19:36:09] Ooh, Psychonauts is only two dollars today... [19:43:04] Byrial ever come on IRC? [19:44:12] Sven_Manguard: Never seen them why? [19:44:48] https://www.wikidata.org/wiki/User:Byrial/Globes#Coordinate_values_with_globe_given_as_a_string_instead_of_an_item is several weeks old, could use an update [20:07:03] JohnLewis: I'm suddenly seeing a lot of new aliases, and I don't see who added them. Most of them are pretty awful [20:07:09] is this a feature? [20:07:46] Sven_Manguard: Hm? [20:08:58] What do you mean Sven_Manguard? [20:09:25] I mean that I've been editing pages and seeing no aliases, and now every page I see has several, most of which are bad [20:09:40] and I'm not seeing a contrib that would indicate that they were added [20:09:55] Example? [20:10:05] https://www.wikidata.org/wiki/Q287288 [20:10:47] Sk!dbot also imports (or imported) aliases [20:10:53] ^ [20:10:59] Stryn: from where? 
[20:10:59] (03PS10) 10Aude: Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 [20:11:07] from redirects [20:11:11] a lot of these imports are either weird as hell or outright wrong [20:11:17] ah, that explains it [20:11:47] (03CR) 10jenkins-bot: [V: 04-1] Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 (owner: 10Aude) [20:20:15] Stryn: Found the source of one of the problems: Hobbits [20:20:32] okay [20:21:09] Someone used the property the other way around [20:21:25] (03PS11) 10Aude: Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 [20:21:32] (03CR) 10jenkins-bot: [V: 04-1] Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 (owner: 10Aude) [20:21:49] Stryn: https://www.wikidata.org/w/index.php?title=Q74359&oldid=63951941 [20:27:47] (03PS12) 10Aude: Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 [20:29:43] (03CR) 10Hashar: "Congratulations on fixing the tests :-)" [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83816 (owner: 10Daniel Kinzler) [20:30:38] (03CR) 10jenkins-bot: [V: 04-1] Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 (owner: 10Aude) [20:46:33] (03PS13) 10Aude: Store serialized entity in parser output (DRAFT) [extensions/Wikibase] - 10https://gerrit.wikimedia.org/r/83846 [21:08:08] what's the globe object for mars? [21:08:34] round [21:09:07] ty Reedy [21:09:18] spheroid ? [21:09:38] lol [21:09:45] i wanted to know what the entity object was [21:19:09] legoktm: Q111 [21:19:15] thanks! [21:19:29] moon is Q405 :) [21:19:33] is there a way i can easily see a list of items being used as coordinate globes? 
[21:19:37] no [21:19:41] or do i have to parse a dump :x [21:19:45] not yet [21:20:22] i am poking at supporting non-earth globes better [21:20:26] ok [21:20:26] mainly on my weekend time [21:20:57] items being used as globes probably would be a database report [21:21:23] and at some point maybe restricted like calendars [21:24:39] aha [21:24:41] aude: https://www.wikidata.org/wiki/User:Byrial/Globes [21:24:47] yeah [21:24:58] very useful [21:32:09] ok, the pywikibot internal list is now more up to date [22:53:50] what the hell naming convention is https://en.wikipedia.org/wiki/Qazaxlar_%2840%C2%B0_17%27_N_47%C2%B0_07%27_E%29,_Barda
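The dump pass legoktm and aude discuss above (listing the items used as coordinate globes) could look roughly like this sketch; collectGlobes() and the inline sample entity are assumptions, and a real run would stream the JSON dump rather than hold all entities in memory:

```javascript
// Sketch of a dump pass collecting every globe used by a globecoordinate
// value; collectGlobes() and the sample data are invented for illustration,
// but the claim/snak shape follows the wbgetentities serialization above.
function collectGlobes( entities ) {
    var globes = {};
    entities.forEach( function ( entity ) {
        var claims = entity.claims || {};
        for ( var propertyId in claims ) {
            claims[ propertyId ].forEach( function ( claim ) {
                var snak = claim.mainsnak;
                if ( snak && snak.datavalue && snak.datavalue.type === 'globecoordinate' ) {
                    var globe = snak.datavalue.value.globe;
                    globes[ globe ] = ( globes[ globe ] || 0 ) + 1;
                }
            } );
        }
    } );
    return globes;
}

var sample = [ {
    id: 'Q64',
    claims: {
        P625: [ {
            mainsnak: {
                snaktype: 'value',
                property: 'P625',
                datavalue: {
                    type: 'globecoordinate',
                    value: { latitude: 52.5, longitude: 13.4, globe: 'http://www.wikidata.org/entity/Q2' }
                }
            }
        } ]
    }
} ];

console.log( collectGlobes( sample ) );
```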