[00:01:54] it should be possible to represent both [00:02:13] is this that GND entity type nonsense that the Germans get a big boner for? [00:02:18] because that shit is ridiculous brain cancer [00:03:25] the only correct way to deal with that is to tie a big chain around the whole idea, attach a few breezeblocks and chuck it in the ocean with all other formal controlled vocabulary nonsense. [00:04:45] tommorris: I am not sure that is exactly helpful [00:05:47] ignore entity type. it's silly and ridiculous [00:05:50] use instance of [00:07:34] cannot have an instance of "winery" or "vineyard" [00:13:53] sDrewth: why not? [00:18:12] * Jasper_Deng_busy hopes sDrewth got his PM, but imagines sDrewth is in a content, not policy, mood right now [00:18:57] Jasper_Deng_busy: it is morning, it is a non-interpretative time [00:19:16] PiRSquared: because it is not defined [00:19:48] and in some things I get sick of drilling down four more layers to fix one thing [01:13:08] * addshore wonders how much of a slap he would get if he just unblocked someone that is being classed as an LTA on enwiki.. [01:39:55] addshore: {{whale}} [01:39:55] [29] https://www.wikidata.org/wiki/Template:whale [01:53:30] PiRSquared: whale slap ? :P [01:59:21] yep [02:17:55] https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Bulletin_des_administrateurs#Wikimedia_Foundation_elaborates_on_recent_demand_by_French_governmental_agency_to_remove_Wikipedia_content. 
[02:18:18] sorry [02:45:34] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050 [03:01:47] New patchset: Daniel Werner; "Moves jQuery.ui.suggester from Wikibase into ValueView extension" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57449 [03:01:57] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57738 [05:47:46] New patchset: Jeroen De Dauw; "test commit" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58050 [05:48:03] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58050 [06:30:10] New patchset: Jeroen De Dauw; "test commit" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58052 [06:30:29] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58052 [06:34:17] !admin can one please delete a duplicate? [06:34:29] matanya: at your service [06:34:47] hi PinkAmpersand :) thanks. 
https://www.wikidata.org/wiki/Q6039428 should be deleted [06:34:55] it is a dup with https://www.wikidata.org/wiki/Q1470293 [06:35:03] then I can merge the conflicts [06:35:19] kk 1 sec [06:35:52] New patchset: Jeroen De Dauw; "test commit" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58054 [06:36:02] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58054 [06:36:30] matanya: {{merged}} [06:36:31] [30] https://www.wikidata.org/wiki/Template:merged [06:36:54] thanks a lot PinkAmpersand [06:37:01] no problem :) [06:37:21] jsyk, since you're a steward you should feel free to join #wikidata-admin [06:37:52] PinkAmpersand: it is invite-only [06:38:13] * PinkAmpersand invites matanya [06:38:24] thank you [06:38:39] what is the policy for stewards in wikidata? [06:38:54] deletion allowed or disallowed? [06:38:56] [[Wikidata:Administrators]] [06:38:56] [31] https://www.wikidata.org/wiki/Wikidata:Administrators [06:39:22] currently says you can only delete vandalism or spam [06:39:40] though I wasn't involved in the discussion that decided that [06:40:13] oh well. thank you for that too :) [06:40:18] :) [06:41:06] * PinkAmpersand redirects [[WD:Stewards]] to [[Wikidata:Administrators#Other_accounts_with_administrative_access]] [06:41:06] [32] https://www.wikidata.org/wiki/WD:Stewards => [06:41:08] [33] https://www.wikidata.org/wiki/Wikidata:Administrators%23Other_accounts_with_administrative_access [06:43:52] New patchset: Jeroen De Dauw; "test commit" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58055 [06:44:00] Change merged: Jeroen De Dauw; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58055 [06:45:29] Lydia_WMDE: around? 
[06:45:43] New review: Jeroen De Dauw; "test" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58055 [07:44:38] New patchset: Tobias Gritschacher; "Improved docs and code style" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57409 [07:57:21] New patchset: Jeroen De Dauw; "Improved docs and code style" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57409 [07:58:09] New review: Henning Snater; "InputAutoExpand functionality seems to be broken. Did you forget to attach it to the experts?" [mediawiki/extensions/DataValues] (master); V: -1 - https://gerrit.wikimedia.org/r/57738 [08:02:50] New patchset: Henning Snater; "(bug 44228) Exchanging "apply" with "call"" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57073 [08:18:55] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57073 [08:22:05] New patchset: Daniel Kinzler; "(bug 41573) Add entities to watchlist." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54704 [08:45:20] New patchset: Tobias Gritschacher; "Get rid of GenericArrayObject usage" [mediawiki/extensions/Diff] (master) - https://gerrit.wikimedia.org/r/57347 [08:49:42] DanielK_WMDE: ping [08:56:55] New patchset: Jeroen De Dauw; "Add StringFormatter" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57706 [08:57:00] Change merged: Jeroen De Dauw; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57706 [09:05:18] New patchset: Henning Snater; "(bug 44228) Cloning tooltip content" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58065 [09:16:37] New patchset: Henning Snater; "(bug 44228) Replacing "textContent" with "nodeValue"" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58067 [09:21:44] Abraham_WMDE: hi [09:22:58] DanielK_WMDE: is skype or a short phone call possible ? 
[09:23:54] Abraham_WMDE: we can skype (until the baby wakes up) [09:24:02] DanielK_WMDE: revieeeeeeeeeeew https://gerrit.wikimedia.org/r/#/c/57466/ [09:24:41] JeroenDeDauw: oh, it's only ~2200 lines of changes :P [09:24:55] DanielK_WMDE: exactly [09:25:00] sorry, but i have to get my presentations for tomorrow ready [09:25:09] DanielK_WMDE: it's way under 9000 [09:25:28] well, that's something, i guess :P [09:25:31] Abraham_WMDE: now? [09:25:41] DanielK_WMDE: yea [09:27:06] Abraham_WMDE: will try again, somehow skype doesn't want to work [09:28:17] ringing... [09:30:30] DanielK_WMDE: Hi! I've a question about phase 3: Will queries be static (like SELECT * WHERE id = 1) or will it be possible to put parameters in queries (like SELECT * WHERE id = $1) with $1 set at the call of the query? [09:32:23] Tpt_: they will be static, since they need to be pre-computed and cached. But if the need arises, I can imagine we could have a system of query "templates" that can be used to easily create new queries of a specific type. [09:33:25] DanielK_WMDE: Thanks :-) And another question: is it possible to add a getEntityFromID() method to the API? [09:33:43] * Lua API [10:01:38] Tpt_: sorry, was on the phone [10:02:20] Tpt_: getEntityFromID() would be possible, but we would need some way to track which page is using which entity, so we know which pages to purge when the entity changes [10:02:31] this is targeted for phase 3 [10:02:46] until then, you can only access the entity associated with the wikipedia page itself [10:03:29] DanielK_WMDE: Thanks :-) [10:03:31] Is it possible to review https://gerrit.wikimedia.org/r/#/c/56737/ that fixes a bug in the lua API? [10:05:10] New review: Daniel Kinzler; "Looks sane to me, but I'm not a Lua expert. This also seems a bit tricky to test." [mediawiki/extensions/Wikibase] (master) C: 1; - https://gerrit.wikimedia.org/r/56737 [10:05:30] Tpt_: put a +1 there. 
poke us again if that doesn't move forward [10:06:10] DanielK_WMDE: Thanks :-) [10:06:12] And as the change https://gerrit.wikimedia.org/r/#/c/51164/ looks abandoned, do you want me to work on it by integrating some code written for https://github.com/Tpt/BasePedia ? [10:06:59] Tpt_: it's not abandoned, but stalled since Anja left the team. It's something I was going to pick up this week. [10:07:16] Tpt_: feel free to work on it though [10:07:44] DanielK_WMDE: Ok. I'll try to do some work on it this evening. [10:07:47] Tpt_: note however that you should not modify the RDF mapping, or add to it, without consulting with denny. [10:09:22] DanielK_WMDE: Is there a specification somewhere? https://meta.wikimedia.org/wiki/Wikidata/Development/RDF doesn't deal with sitelinks and things like that. [10:14:19] Tpt_: i don't know of any besides that. ask denny what to do about stuff that is missing there. [10:14:48] Tpt_: you can make suggestions in code, of course, but i suggest to do that in small, separate changes [10:14:57] * DanielK_WMDE is off to lunch [10:15:12] DanielK_WMDE: Ok. Thanks :-) . I'll talk to him before doing anything. [10:15:54] New patchset: Jeroen De Dauw; "Added IriFormatter" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/58069 [10:16:40] Tpt_: you can also ask anja, i think she started to work on something for sitelinks based on skos [10:17:09] DanielK_WMDE: Ok. [10:40:54] Change merged: Tobias Gritschacher; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/58069 [10:50:01] are there plans to separate properties that require a source from others like viaf, sudoc, ...? [10:52:38] btw. 
there could be a tooltip when hovering over the statement's property [10:57:38] Change merged: Jens Ohlig; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/56737 [10:58:18] Change abandoned: Jens Ohlig; "Fixed in Change-Id: I220996203ac88890ea8a2c50193e0932a907ddcd" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57903 [11:28:55] Change restored: Hashar; "(no reason)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/32768 [11:28:55] New patchset: Hashar; "jenkins test" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/32768 [11:28:55] Change abandoned: Hashar; "(no reason)" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/32768 [11:29:53] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58065 [11:29:56] JeroenDeDauw ^^^ DataValues now triggers the jslint job :-]  Thank you for the patch [11:33:47] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58067 [11:51:30] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57509 [11:55:16] short question: what is the reason there is no birth-date property? [11:55:28] or other date-properties [11:59:40] holy shit @ http://www.wikidata.org/wiki/Special:DispatchStats [12:01:35] does that mean that an item edited today will be seen in the wikipedias tomorrow? [12:07:18] can someone tell me how to edit the data type of a property? [12:08:09] you can't [12:08:16] you have to delete it and create a new one unfortunately [12:08:23] sounds like fun [12:08:33] i can delete it for you if you need :) [12:08:39] the test-dataset has strange properties [12:08:44] ah [12:08:56] Atomic weight: Data type: Item [12:10:01] how do I delete a property? [12:10:20] Is there a specialpage for this? 
[12:10:26] no, it's just like normal [12:10:30] hit the delete tab [12:11:21] can't see one: http://lbenedix.monoceres.uberspace.de/screenshots/p8150v1wta_(2013-04-08_14.10.58).png [12:11:55] what userrights do I need? I'm in the administrators group [12:13:15] it's in the dropdown [12:13:19] the arrow pointing downwards [12:13:56] if you were using the monobook skin, it would be a tab [12:14:03] along with protect [12:14:11] okay [12:14:13] thanks [12:14:40] I think someone should create a short introduction to wikidata [12:14:45] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57726 [12:15:30] lbenedix: [[:en:WP:Wikidata]] [12:15:31] [34] https://www.wikidata.org/wiki/:en:WP:Wikidata [12:15:32] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57735 [12:18:51] Is this up to date? [12:22:02] reasonably [12:22:08] it doesn't mention phase 2 though [12:22:41] Maybe a screencast would be nice [12:23:27] maybe. [12:35:53] is it possible to remove several langlinks without clicken 10000 times? [12:36:00] clicking [12:36:46] not unless you use a script i guess [12:37:06] deleting the item and recreating it was good enough... 
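The script suggested above for bulk langlink removal could look roughly like this. This is a sketch, not a tested bot: it only builds the POST bodies, and it assumes the `wbsetsitelink` API module where an empty `linktitle` removes that sitelink; the exact parameters and token handling should be checked against the live API docs before running anything.

```python
import urllib.parse

# Sketch only: builds the POST bodies a removal script would send to the
# Wikidata API. Assumes the wbsetsitelink module, where an empty linktitle
# removes the sitelink for that wiki (an assumption to verify in the docs).
API = "https://www.wikidata.org/w/api.php"

def removal_body(item_id, site, token):
    """POST body that clears the sitelink of one wiki from one item."""
    return urllib.parse.urlencode({
        "action": "wbsetsitelink",
        "id": item_id,        # e.g. "Q2989719"
        "linksite": site,     # e.g. "dewiki"
        "linktitle": "",      # empty title = remove the link
        "token": token,
        "format": "json",
    })

# One request per language link; a real run would fetch an edit token first
# and sleep between requests to stay within bot speed limits.
bodies = [removal_body("Q2989719", s, "+\\") for s in ("dewiki", "frwiki")]
```

Each body would then be POSTed to `API` with a logged-in session; the loop replaces the "10000 clicks" with one request per sitelink.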
[12:43:43] New patchset: Tobias Gritschacher; "Typo in Selenium test description" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58078 [12:43:58] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [12:44:31] Change merged: Tobias Gritschacher; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58078 [12:44:41] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [12:48:55] New review: Aude; "there's no escaping of the parser function output now" [mediawiki/extensions/Wikibase] (master) C: -1; - https://gerrit.wikimedia.org/r/57466 [12:54:42] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [12:57:49] New review: Aude; "it's not possible to run ValueFormatters tests in the client" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [12:59:12] New review: Aude; "(2 comments)" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:06:25] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:09:54] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:15:57] New review: Aude; "unrelated to the patch but I get a fatal error when running tests in the repo:" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:27:58] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:28:20] New patchset: Jeroen De Dauw; "Cleanup 
of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:42:47] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:48:49] New patchset: Jeroen De Dauw; "Add registration of formatter argument as being used by Wikibase" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/58083 [13:50:10] is there any api where I can get statistics about wikidata? Like the number of edits per time by type (label description, statemend, ...)? [13:50:40] New patchset: Jeroen De Dauw; "make use of assertContainsOnlyInstancesOf" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/58084 [13:52:54] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [13:54:27] Change merged: Tobias Gritschacher; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/58083 [14:24:28] New patchset: Jeroen De Dauw; "Cleanup of DataType related formatting code" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57466 [14:28:07] is there any api where I can get statistics about wikidata? Like the number of edits per time by type (label description, statemend, ...)? [14:28:19] errr [14:28:21] not really [14:28:44] is it possible to get this stats anyhow? [14:28:50] http://stats.wikimedia.org/wikispecial/EN/draft/TablesWikipediaWIKIDATA.htm [14:28:55] you can stalk the irc feed [14:29:09] but there's no really good way without checking every edit [14:29:42] what is the channel? [14:30:22] irc.wikimedia.org/#wikidata.wikipedia [14:33:04] hmmmm [14:33:08] lots of edits... [14:33:23] https://wikipulse.herokuapp.com/ [14:33:35] weird [14:33:35] we're normally at 500 [14:33:57] http://www.youtube.com/watch?v=p5QHqkztF7Y [14:34:07] > This video is private. 
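On the statistics question above (edits per time, broken down by label / description / statement): one approximation is to classify edits by their autosummary rather than fetching each diff. The summary format used here is an assumption based on Wikibase-style autosummaries like `/* wbsetlabel-add:1|en */`; real summaries from the recent-changes feed should be checked before relying on it.

```python
import re
from collections import Counter

# Hypothetical sample summaries in the Wikibase autosummary style; a real
# script would read these from the IRC feed or the recentchanges API.
summaries = [
    "/* wbsetlabel-add:1|en */ Rotary converter",
    "/* wbsetlabel-set:1|fr */ convertisseur rotatif",
    "/* wbsetdescription-add:1|de */ electrical machine",
    "/* wbsetclaim-create:1| */ Property:P31",
]

def edit_type(summary):
    """Extract the Wikibase module name, i.e. the kind of edit."""
    m = re.match(r"/\* (wb[a-z]+)", summary)
    return m.group(1) if m else "other"

counts = Counter(edit_type(s) for s in summaries)
```

This avoids one API request per edit: the classification runs on the summaries that arrive with each change anyway.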
[14:34:20] it is a video of my wikidata-meter [14:35:10] oh right [14:35:16] should work now [14:36:02] today at 1am it was at about 1000 [14:36:21] waaaat [14:37:00] to get the stats I want i need to look at ~500 edits to see what part of the item was changed... [14:37:20] at most you can check the edit summary i guess [14:37:27] is it possible to add a comment like "label", "description", ...? [14:38:11] I think every edit is a separate request [14:39:05] New review: Daniel Werner; "Not, its there and a dependency of "jquery.valueview.experts.stringvalue"." [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57738 [14:41:28] New patchset: Daniel Werner; "(bug 45002) Using the new "ValueView" extension in Wikibase now." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57052 [14:42:01] New patchset: Daniel Werner; "(bug 45002) made EntityId data values work with the new jQuery.valueview." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57053 [14:42:09] New patchset: Daniel Werner; "Move jQuery.ui.suggester into ValueView extension" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57448 [14:56:14] Danwe_WMDE, Henning_WMDE: can you give me a list of user-interface-related bugs? [15:00:28] No, there's no common tag. Sorry. [15:01:40] I heard that you are kind of responsible for the user interface... [15:02:24] I deny everything. [15:03:29] I have it in my irc-log [15:03:32] lbenedix: I would ask Danwe_WMDE [15:03:54] i did [15:04:19] lbenedix: Danwe_WMDE wants to know what you are asking [15:04:36] I already know^^ [15:04:44] I'm looking for a list of all user-interface-related bugs in bugzilla [15:05:35] and I don't want to look at every single item in the list of >500 bugs... [15:05:49] well, [15:05:54] someone has to tag them all :P [15:06:14] someone is hopefully working on them [15:06:38] I hope this person can give me the list [15:30:52] hello. 
I'm new on wikidata and I have a question: how do I merge two items? [15:32:38] Abaddon1337_: you have to remove the sitelinks from one item, then add them to the other [15:32:44] Abaddon1337_: what do you mean by merge and what items are you talking about? [15:32:48] see also https://www.wikidata.org/wiki/Help:Merge [15:33:26] lbenedix: http://www.wikidata.org/wiki/Q7370264 and https://www.wikidata.org/wiki/Q2989719 are the same [15:34:06] and I wish to have the interwiki links correctly linked on wikipedia [15:34:32] delete the link from Q2989719 and add it to Q7370264 [15:34:46] then make a request for deletion of Q2989719 [15:34:54] Abaddon1337_: {{done}}, in the future just follow the instructions at https://www.wikidata.org/wiki/Help:Merge [15:34:55] How efficient, legoktm! [15:35:19] legoktm: great. Thank you so much [15:35:27] np :) [15:36:32] the new langlink is not visible on: http://en.wikipedia.org/wiki/Rotary_converter [15:37:27] one last question: each time an article is renamed on a given wikipedia, are the modifications automatically reported on wikidata? [15:37:54] Abaddon1337_: right now we have bots that take care of it, but in the future the software will [15:38:01] lbenedix: gotta purge the page [15:38:23] legoktm: ok. thanks [16:21:12] New patchset: Daniel Werner; "(bug 45002) made EntityId data values work with the new jQuery.valueview." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57053 [16:21:31] New patchset: Daniel Werner; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57738 [17:15:36] perhaps all of http://en.wikipedia.org/wiki/Category:Hydroelectric_power_stations_by_country should get p31 → Q80638 [17:15:36] ? 
[17:16:19] or not all but ones that have some template [17:16:23] perhaps infobox [17:19:30] Base-w: oh [17:19:33] that's really easy to do [17:19:49] https://www.wikidata.org/wiki/Wikidata:Tools/Array_properties_gadget [17:21:36] legoktm: perhaps I will try to teach my bot [17:21:45] my bot already does it :) [17:21:49] but i'm not sure it will succeed [17:21:56] you just need to file a request [17:22:14] I don't like to use another bot's service :) [17:22:26] lol [17:22:30] but maybe you're right [17:22:41] better file a request now [17:23:03] New patchset: Daniel Werner; "(bug 45002) jQuery.valueview is now a single Widget using composition rather than inheritance" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050 [17:25:02] New patchset: Daniel Werner; "Moves jQuery.ui.suggester from Wikibase into ValueView extension" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57449 [17:27:18] PinkAmpersand: i am around now :) [17:27:25] still anything i can help you with? [17:27:46] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57050 [17:28:08] New patchset: Henning Snater; "(bug 45002) additional experts for jQuery.valueview" [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57738 [17:28:19] Change merged: Henning Snater; [mediawiki/extensions/DataValues] (master) - https://gerrit.wikimedia.org/r/57738 [17:28:42] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57052 [17:29:09] Change merged: Henning Snater; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/57053 [17:30:41] lbenedix: about that list, perhaps Lydia_WMDE knows more. You can search for bugs tagged with JavaScript, I did that sometimes but not always and others didn't, I guess. [17:31:26] legoktm: does it take pages in the category only or in subcategories too? 
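The bot task being discussed (give every page in the hydroelectric-power-station category P31 → Q80638) amounts to two API calls per page: list the category members, then create the claim. Below is an illustrative sketch that only constructs the request parameters; the `categorymembers` and `wbcreateclaim` parameter names and the item-value encoding are assumptions based on the MediaWiki/Wikibase API and should be verified against the live docs. As asked above, `categorymembers` returns one level, so subcategories have to be walked manually.

```python
import json

def category_members(category, cmcontinue=None):
    """Query parameters for one level of a category listing."""
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,   # e.g. "Category:Hydroelectric power stations"
        "cmlimit": "500",
        "format": "json",
    }
    if cmcontinue:             # continuation token from the previous batch
        params["cmcontinue"] = cmcontinue
    return params

def create_claim(entity_id, prop, target_qid, token):
    """Parameters for wbcreateclaim adding e.g. P31 -> Q80638 to an item."""
    return {
        "action": "wbcreateclaim",
        "entity": entity_id,   # the item for the article, e.g. "Q123"
        "property": prop,      # "P31" = instance of
        "snaktype": "value",
        # item values are encoded as entity-type + numeric id (assumed format)
        "value": json.dumps({"entity-type": "item",
                             "numeric-id": int(target_qid[1:])}),
        "token": token,
        "format": "json",
    }
```

A real bot would also resolve each article title to its item id and respect the edit-speed limits discussed later in this log.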
[17:31:33] I asked Lydia yesterday... she said ask Danwe_WMDE and Henning_WMDE [17:31:39] hmm? [17:31:41] i did not [17:31:48] ah recursion parameter [17:31:50] perhaps [17:31:50] maybe it was hoo [17:31:50] i said someone needs to tag the bugs in bugzilla [17:32:21] and that isn't going to happen anytime soon by anyone on the team because everyone on the team is swamped already, sorry [17:32:41] lbenedix: They wrote most of the client JS, that's the only thing I said about them [17:33:17] s/client JS/repo and lib JS/ [17:33:33] sorry if I misunderstood... I thought the people who write code are responsible for fixing bugs... [17:33:52] What's broken [17:34:06] they are [17:34:13] but that is different from managing the bugs [17:34:32] and tagging them is not a necessity for the team right now [17:34:37] so it doesn't get done [17:34:45] that doesn't mean they don't get fixed [17:34:49] I thought they might have a list of bugs they work on [17:35:04] scanning this list may be faster than scanning all 500 bugs [17:36:36] the bug reports people plan to work on in that sprint are set to assigned usually [17:37:09] so it's not possible to get a list of all UI related bugs without reading all 500? [17:37:34] not right now, no [17:38:31] do you think you will get used to tagging the bugs? [17:40:13] i can bring it up in a few days when we have an all-hands meeting to discuss how we want to do things in the future [17:40:22] but tbh i think it is not very likely [17:41:14] isn't the tagging one of the good features of bugzilla? [17:42:27] we could open a bug for tagging the bugs :D [17:43:11] I agree with lbenedix here, we really should get used to tagging bugs [17:44:00] maybe someone could get through the old bugs as well... 
[17:44:15] lbenedix: we don't have the resources for that right now [17:44:44] lbenedix: it is a good feature but we did not have any use on the team for tagging the bugs as UI or not [17:44:48] so it isn't done [17:45:01] it's not like we're doing unneeded work for the fun of it ;-) [17:45:24] isn't the user interface one of the main points for wikidata? [17:45:24] we're not sitting around waiting for work to pop up suddenly :P [17:45:34] tagging bugs.... needs-volunteer :) [17:45:50] lbenedix: sure - as is the backend [17:45:57] it can't live without either [17:46:01] anyway [17:46:06] the discussion is moot [17:46:10] New bugs should be tagged by the person reporting them. IF that's someone from the team, that person should also tag them [17:46:10] the bugs are not tagged [17:46:46] or the one who is working on the bug could tag it [17:47:18] lbenedix: when the bug is assigned already, what's the big point in tagging them then? [17:47:19] I have no idea how much more work the tagging is [17:47:27] would you rather they spent time on administrivia or actually fixing the bug? [17:47:53] managing is not wasting time [17:47:55] it's not a lot of work but it is yet another thing we're asking the developers to do [17:48:03] and we should carefully consider if this is really worth it [17:48:30] I think it would be an interesting point to see if the frontend causes more problems than the backend [17:50:03] i don't think you can infer this from the number of bug reports [17:50:26] starting with the fact that users obviously see more bugs in the UI than in the backend [17:50:41] but users don't use bugzilla [17:50:55] and the number of bug reports is a better benchmark than nothing [17:50:59] some do and so do i for others [17:51:13] it's a misleading one - i consider that worse than nothing [17:51:15] but anyway [17:51:27] i really need to get an announcement out now [17:51:28] sorry [17:51:33] np [17:53:22] i take it WD phase 2 is still coming to enwiki? 
[17:55:38] Base-w: you need to set the recursive option if you want subcats [18:04:20] rschen7754|away: No [18:04:25] oh ok [18:06:20] New patchset: Henning Snater; "(44228) Handling $.ui.autocomplete's missing original event in IE8" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58116 [18:08:17] New patchset: Daniel Werner; "(bug 44228) Handling $.ui.autocomplete's missing original event in IE8" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58116 [18:09:06] Change merged: Daniel Werner; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58116 [18:10:41] rschen7754|away: we need to resolve technical issues with the change notification backlog [18:10:47] ok [18:11:22] deploying more to enwiki won't improve it, but we'll see about scaling it up more [18:11:42] apparently wikidata has more edits happening than english wikipedia :o [18:12:15] lol [18:12:20] bbl [18:12:36] k [18:12:39] aude: lol [18:12:57] * aude loves the bots though :) [18:17:37] erm Base-w [18:17:39] i dont think [18:17:46] https://www.wikidata.org/wiki/Q80638 is the best target [18:18:25] hmm [18:18:31] enwiki doesnt have a specific article [18:19:26] oh crap [18:19:30] https://www.wikidata.org/wiki/Q80638 is an interwiki mess [18:20:01] * hoo wonders whether the candy bowl in the office is empty :P [18:20:19] hoo: focus! [18:20:54] legoktm: Those candies are crucial for the success of Wikidata :P [18:21:06] :P [18:27:54] ah, okay.... why i don't see hoo's widget (not logged in) :) [18:28:05] * aude reviewing the patch [18:28:11] \o/ [18:28:43] only suggestions ... not easy, is see if we can reduce the amount of js dependencies [18:28:49] in a later patch [18:28:55] aude: mhm, why? 
[18:29:14] first the requirement was to reuse as much existing code :P [18:29:14] it takes quite a few seconds to load the widget on my externally hosted wiki [18:29:31] Let me check on WMF [18:29:36] not the fastest hosting in the world, but curious how it will be on test2, for example [18:29:39] ok [18:30:03] * aude in favor of reusing existing code at same time would like it to be more lightweight when possible [18:31:22] aude: 1298ms on enwiki [18:31:41] not so bad [18:31:46] var start = new Date().getTime(); mw.loader.using( 'wbclient.linkItem', function() { console.log( ( new Date().getTime() ) - start ); } ); [18:31:51] :) [18:32:26] I think it's bearable [18:32:33] probably works for now but when we get into more widgets, then some optimization would be nice [18:32:53] like site link widget doesn't need to know about snaks :) [18:32:58] Wikidata repo e.g. is rather "slow" as well... [18:33:08] Does it? [18:33:10] * aude nods [18:33:23] hoo: don't think so [18:33:54] It doesn't... the worst part probably is that the wikibase lib JS in itself is such a dependency hell [18:34:05] * aude nods [18:34:23] We only load 2 real wikibase modules and those are the biggest mass [18:34:24] it loads all the data value js files too [18:34:56] Hello [18:35:29] aude: That's already marked as TODO in the lib/ resources [18:35:35] I just learned about Wikidata, seems like a fine addition to the MediaWiki family. But how come there's no property for something as trivial as population? [18:35:51] 'wikibase.store' is the module with wikibase.RepoApi and a lot of stuff we just don't need [18:36:00] ZeroOne: we don't have all the property types available yet [18:36:15] aude: so we're waiting for "integer"? [18:36:16] numbers being one, since they are more complex with various units and formatting [18:36:19] I guess I can refactor that out easily cutting the load down [18:36:29] ZeroOne: yes [18:36:37] hoo: would be nice [18:39:49] aude: OK. 
So, I'm guessing it's the same with dates then? [18:40:46] It is, ZeroOne [18:41:09] ZeroOne: yes [18:41:16] There is a list of properties waiting for the creation of their datatype, if you want to see it [18:41:33] I came to it after having the same questions :) [18:42:55] [[d:Wikidata:Property_proposal/Pending]] [18:42:55] [35] https://www.wikidata.org/wiki/Wikidata:Property_proposal/Pending [18:43:17] Btw, population isn't yet in there, I don't know if it has been commented elsewhere [18:43:45] Obviously it's a must-have property [18:44:18] thanks jem- [18:44:24] Np :) [18:44:58] In which wikis do the infobox properties work, then? In it.wikipedia.org at least. [18:45:40] ZeroOne: http://blog.wikimedia.de/2013/03/27/you-can-have-all-the-data/ [18:45:44] Italian, Hebrew, Hungarian, Russian, Turkish, Ukrainian, Uzbek, Croatian, Bosnian, Serbian and Serbo-Croatian Wikipedias [18:47:03] OK. I wonder how exactly those got picked and why not {insert-your-local-version}? [18:47:20] it, he, hu were the first ones in phase 1 [18:47:28] the others just... asked :) [18:47:57] or are languages that the team understands and can give feedback [18:47:59] in some cases [18:48:12] Anyway, the rest are planned for this wednesday (if plans haven't changed) [18:48:23] jem-: we'll see.... [18:48:30] :) [18:48:35] "We are planning to deploy it on English Wikipedia on April 8 and on all remaining Wikipedias on April 10." says http://meta.wikimedia.org/wiki/Wikidata/Deployment_Questions [18:48:36] if we can sort out the change backlog, etc. [18:48:50] that's assuming no technical issues [18:49:23] there's a significant lag between time of wikidata edit and time the wikipedias are notified (e.g. in recent changes and watchlist) [18:49:36] we want to get that improved [18:50:18] there are improvements to the job queue coming soon, so that should help among other things [18:52:24] So, will the properties deploy a localized version of a string when used? I.e. 
will Q33's 'official language' property be "suomi" when used in fi.wikipedia.org but "Finnish" when used in en.wikipedia.org? [18:52:49] ZeroOne: yes [18:53:03] assuming it's translated on wikidata [18:55:54] Cool. And will you be able to build hierarchies like "San Francisco, California, United States" when, say, someone's 'place of birth' property is 'California'? [18:56:11] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54816 [18:57:53] ...I mean when someone's place of birth is "San Francisco", of course. [18:57:55] aude: Thanks :) [18:58:01] sure :) [18:58:13] Do we still want this? https://gerrit.wikimedia.org/r/54817 [18:58:51] Change merged: Aude; [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/54817 [18:58:54] yes [18:59:59] :) [19:00:46] it should be there for the next deployment.... (not this week) [19:00:56] :) [19:01:27] I'll upload the module refactoring later... I guess that will lower the loading times much further [19:01:55] ok [19:02:22] Only one more wb change of mine is open in gerrit :) [19:02:39] If only all reviews would be this fast [19:04:45] can't promise but don't see why we can't get your other change in for next deployment [19:05:03] * aude needs to work on stats for the dispatcher [19:05:30] It's not a real important thing... link item certainly is more urgent... while the dispatcher is most [19:06:15] bugs are always important [19:07:24] talking about deployments... will the fix for the broken tooltip be deployed this week? [19:07:47] (https://gerrit.wikimedia.org/r/57682) [19:09:07] when status = merged --> deployment this week? 
[19:09:24] lbenedix: Now, that just means it's in our master branch [19:09:41] from that one regular snapshots are taken and then deployed in a two-week phase [19:09:45] lbenedix: probably next week [19:10:14] monday, i'm guessing or wednesday [19:10:41] WD org usually is wednesday AFAIR [19:11:06] hoo: lately it's been wednesday but sometimes was monday [19:11:10] https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap [19:11:35] Phase 2 Wednesday, April 17, 2013 [19:12:18] Yay, Reedy deployed my CentralAuth update :) [19:12:24] :) [19:12:36] https://meta.wikimedia.org/wiki/Special:GlobalUsers/abusefilter works :P [19:15:03] I do tend to do things when I say I will ;) [19:15:59] will there be a fix for the dispatch lag? We are currently discussing slowing down bots to 60 edits/minute https://www.wikidata.org/wiki/Wikidata_talk:Bots#Bot_speed [19:16:06] Damn, I git pulled my core while running Qunit tests... -.- [19:16:48] lolol [19:17:48] Sk1d: yes but can't say how quick [19:18:20] aude: good :) [19:18:35] likely involves having it run entirely via the job queue (which will use redis soon, and be in ashburn) [19:18:35] Sk1d: i'll reply there in a few minutes - thanks for the link [19:19:12] where the dispatcher is, it's at the limits of the hardware, so we can't just add more cron jobs [19:20:55] is there an expected timeframe on that? :) [19:21:24] Is it me or does Wikidata have too many bots editing too quickly? [19:21:26] legoktm: no [19:21:33] * aude don't want to over promise :) [19:21:41] but asap [19:21:46] ok :D [19:21:58] Jasper_Deng: yes currently tbh [19:22:04] Jasper_Deng: faster than other wmf projects yes, but there isnt really any other option [19:22:12] Jasper_Deng: https://www.wikidata.org/wiki/Wikidata_talk:Bots#Bot_speed [19:22:18] legoktm: no other option than what?
[19:22:57] Lydia_WMDE: fast bots [19:22:58] well the other option is slowing down i guess, and that really doesnt seem like a good idea when we're trying to deploy phase 2, except we dont have very much data imported yet [19:23:14] * Jasper_Deng is still waiting for the sourcing interface to start working [19:23:24] (and for actual quantitative data types) [19:23:46] legoktm: it is entirely ok not to have all the data in there when deployed really [19:23:50] Qunit tests running like a charm after I refactored :) [19:24:00] legoktm: i would even say this is a bad reason tbh [19:24:03] yay [19:24:24] legoktm: most infoboxes can't be done with wikidata yet and that's ok [19:24:30] probably even good [19:24:40] so this shouldn't be a reason to rush with bots [19:24:46] I guess. [19:25:20] i think the big lag is a bigger problem than the missing information; if a user wants to use wikidata but does not know why the edit is not shown, he gets frustrated [19:25:34] yes [19:26:06] i dont think a normal wikipedia user will miss a lack of data on wikidata [19:26:14] hehe [19:26:54] (i'd rather have the wikipedias really discuss moving certain infoboxes than rush it and not having the data available is an excellent means to achieve that - even if a bit evil) [19:26:57] muahahah [19:27:00] sorry... [19:27:01] ;-) [19:27:59] :) [19:36:26] New patchset: Tpt; "(Bug 42063, bug 44577) Special:EntityData supports RDF output" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [19:39:18] > My bot speed is arround 75 edits/sec. [19:39:26] right....... [19:40:28] New patchset: Tpt; "(Bug 42063, bug 44577) Special:EntityData supports RDF output" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [19:45:47] legoktm: How will I know that your bot has done the request?
[19:46:13] it gets archived at https://www.wikidata.org/wiki/User:Legobot/properties.js/Archive [19:46:20] or ill just poke you on irc :) [19:46:26] :D [19:47:30] legoktm: What did you name your wikidata bot? I didn't see it in the commit log [19:47:50] multichill: i never committed/finished it :x [19:48:08] i got some homework to do, ill work on it in an hour [19:48:17] legoktm: Dude! What's done, what's missing? [19:50:41] umm it was basically done [19:50:45] the main issue was overwriting [19:50:47] and sources [19:51:17] Just commit it with a list of FIXME's at the start [19:51:28] If you're lucky someone else might fix it before you get to it [19:51:51] lemme pastebin it [19:52:01] No, just commit it to pywikipedia [19:52:40] http://dpaste.de/YIEKo/raw/ [19:52:44] fineeee [19:52:55] but that means i need to document it! [19:53:33] what should it be named? [19:53:49] guys i don't know if i am telling you this enough but just in case: <3 [19:54:00] * Lydia_WMDE is happy with this day after a really sucky weekend [19:54:06] <3 [19:54:14] today is a great day, its 70 degrees out :D [19:58:11] !properties [19:58:29] ah no bot to give me a link to a full list of properties [19:58:37] [[WD:P]] [19:58:38] [36] https://www.wikidata.org/wiki/WD:P [19:58:38] [[List of properties]] [19:58:41] or that... [19:58:42] :P [19:58:51] multichill: suggested name for script? [19:58:55] legoktm: you are more clever than me when it comes to shortcuts... [19:59:04] :P [19:59:10] im a very lazy person [19:59:13] haha [19:59:14] so i know all the shortcuts [19:59:22] not a bad thing really... [19:59:54] claimit.py?
[20:00:16] O0 [20:00:17] http://ruvr.co.uk/2013_04_08/Assange-new-Wikidata-released/ [20:00:27] ^ just popped up in my twitter search [20:00:30] Hahahahaha [20:01:18] heh [20:02:08] legoktm: You should put it under the MIT license like everything else [20:02:17] that is the MIT license ;) [20:02:18] but yeah [20:02:35] i simplified the header [20:02:37] Remove cruft and insert # Distributed under the terms of the MIT license. [20:06:41] !api [20:07:22] which one is for setting props? [20:07:38] umm [20:08:18] committed :) [20:08:22] Base-w: what are you trying to do? [20:08:44] to find what I need in Special:ApiSandbox [20:09:04] wbcreateclaim [20:10:43] hm is that in ApiSandbox? I still don't see it [20:11:00] action=wbcreateclaim [20:11:08] what language are you working in? [20:11:15] there's probably a framework already out there... [20:11:41] Java [20:11:54] I use an old version of MER-C's Wiki.java [20:12:05] ah [20:12:10] dont think so then [20:12:21] Thanks legoktm, that gives me a start :-0 [20:12:37] i think the most important fixme is the one about duplicates [20:12:49] but you're welcome :D [20:12:52] Do you happen to have some code snippets where you're already doing this? [20:13:03] Ah it is not for api.php but for index.php? [20:13:08] Base-w: no, its for api.php [20:13:17] multichill: checking for dupes? yeah, lemme find it [20:13:24] Duplicate checking can be done on three levels: [20:13:37] * Property (if you're trying to add property Pxx and Pxx is already set, skip it) [20:14:08] * Claim (if a Pxx with Qyy is set and you're trying to set Pxx with Qyy, skip it) [20:14:31] * Source (so you can have multiple sources added) [20:14:40] well property is easy [20:14:43] that would be [20:14:51] item.get() [20:15:09] if not claim.getID() in item.claims: item.addClaim(claim) [20:16:40] That works?
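Base-w is pointed at action=wbcreateclaim above. As a rough sketch of what such a request carries, here is the parameter set built in Python; the parameter names (entity, property, snaktype, value, token) match the Wikibase API, but anyone copying this should verify them in Special:ApiSandbox, and the item/property ids used are placeholders rather than data from the discussion:

```python
import json

def build_createclaim_params(entity_id, property_id, target_item_id, token):
    """Build the POST parameters for a wbcreateclaim API request.

    For item-type properties the claim value is a JSON-encoded entity
    reference. All ids passed in below are illustrative placeholders.
    """
    return {
        "action": "wbcreateclaim",
        "format": "json",
        "entity": entity_id,      # item the claim is added to, e.g. "Q42"
        "property": property_id,  # property of the main snak, e.g. "P107"
        "snaktype": "value",
        "value": json.dumps({"entity-type": "item",
                             "numeric-id": int(target_item_id.lstrip("Q"))}),
        "token": token,           # edit token obtained from the API
    }

params = build_createclaim_params("Q42", "P107", "Q5", "dummy-token")
```

The resulting dict would be POSTed to api.php; fetching the edit token and handling the response are omitted here.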
claim.getID() should return a different claim object than in item.claims [20:16:55] Things like Qyy and source make it a different object [20:16:58] Danwe_WMDE: Around? [20:17:08] .getID() is a string [20:17:19] it would return the id of the property the claim sets [20:18:44] http://dpaste.de/Y4f3G/raw/ tests at the claim level (where g is whether to add it or not) [20:19:10] So item.claims is a list of strings? [20:19:17] item.claims is a dict [20:19:29] {'p107':[claim1,claim2]} [20:19:37] where claim1,2 are Claim objects [20:21:17] hoo: yes [20:22:21] Danwe_WMDE: I'm currently refactoring the wikibase.store module [20:22:25] Ah right, and the key is a string so membership testing works [20:23:18] mhm [20:23:29] Danwe_WMDE: Well, is it acceptable to lazy load some (most likely) already loaded modules for sanity using mw.loader.using in there? [20:28:44] Our whole dependencies seem messed, meh [20:28:58] legoktm: Thanks, I'll probably have a shot at it later this week [20:29:07] :D [20:29:16] hoo: example please? [20:29:28] Danwe_WMDE: Messed dependencies? [20:29:33] wikibase.utilities [20:29:40] multichill: one of the things that does need to be fixed in the framework itself is multiline sources [20:29:59] right now we dont support them at all [20:30:06] The wikibase framework or the Pywikipedia framework? [20:30:10] pywikipedia [20:30:38] we basically need to create a new "Reference" object, which is composed of a list of Claims [20:31:53] New patchset: Tpt; "(Bug 42063, bug 44577) Special:EntityData supports RDF output" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [20:32:43] hoo: funny, wb.utilities is not mentioned in the dependencies directly, should be though. For this one it doesn't make sense to lazy load it though since it will always be required, already when initializing the module.
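The duplicate-checking levels multichill lists can be sketched as a small predicate. This is illustrative only: the dict mirrors pywikipedia's item.claims shape as described in the chat, but Claim objects are simplified to plain target strings, and the source level (appending extra references to an existing claim) is not modelled:

```python
def should_add(claims, prop, target, level="claim"):
    """Return True if a (prop, target) statement should be added.

    claims mirrors pywikipedia's item.claims shape, e.g. {'p107': [...]},
    with Claim objects simplified to plain target strings.

    level="property": skip if the property is set at all
    level="claim":    skip only on an exact property+target duplicate
    """
    existing = claims.get(prop, [])
    if level == "property":
        # same idea as: if not claim.getID() in item.claims: item.addClaim(claim)
        return not existing
    return target not in existing

claims = {"p107": ["Q5"]}
should_add(claims, "p107", "Q5")                    # exact duplicate -> skip
should_add(claims, "p107", "Q215627")               # new target -> add at claim level
should_add(claims, "p107", "Q215627", "property")   # property already set -> skip
```

Membership testing works here for the same reason noted in the chat: the dict keys (and the simplified targets) are strings.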
If you're really refactoring it, you could remove the word "store" from it and remove store.js since the whole thing isn't anything close to a "store" anymore. [20:33:17] Hi [20:33:29] maybe we should stop all bots? [20:33:30] https://www.wikidata.org/wiki/Special:DispatchStats [20:34:30] Kizar: have you read the discussion on wikidata-l? [20:34:42] Yep [20:34:43] and no, stopping all bots is not a real solution [20:34:57] it's still increasing... [20:35:41] now it's descending, but very slowly [20:36:10] New review: Tpt; "Note: EasyRdf_Format::getFormat($query) supports mime types." [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [20:37:18] Danwe_WMDE: What would you rename it to? [20:37:33] I already factored the RepoApi out [20:37:38] hoo: As far as I can see, if you're doing what I suggested above, you won't require wb.utilities at all since it was only used to inherit from the wb.EntityStore [20:37:39] New patchset: Tpt; "(Bug 42063, bug 44577) Special:EntityData supports RDF output" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/51164 [20:38:08] * Lydia_WMDE finds it funny that denny is on the picture on http://www.google-melange.com/gsoc/org/google/gsoc2013/kde [20:38:11] Danwe_WMDE: I don't need it no, but I need the datamodels, which are much heavier [20:38:14] is anybody else getting this: [20:38:15] An error occurred while trying to perform save and because of this, your changes could not be completed. [20:38:17] Gateway Time-out [20:38:24] Vacation9: what page? [20:38:26] hoo: simply call the module wikibase.RepoApi perhaps [20:38:52] Danwe_WMDE: Might make sense... atm I got two modules which probably isn't much better [20:39:01] hoo: In your opinion, is using the data model a good or bad thing in the RepoApi?
[20:39:07] legoktm: Any data page [20:39:12] umm [20:39:17] * legoktm tests [20:39:40] Danwe_WMDE: I wouldn't have mixed it in, I can see that it's nice to have but it does things that probably don't really belong there [20:39:44] hmmmmm [20:39:54] hoo: like? [20:40:13] Vacation9: https://www.wikidata.org/w/index.php?title=Q4115189&diff=23553187&oldid=23375733 [20:40:17] took a while to go through though [20:40:18] wb.Claim.newFromJSON( result.claim ), [20:40:21] and my bot was having the same issues [20:40:36] That's not really fetching data from the API, but already using it [20:40:57] Vacation9: I asked in -tech [20:41:07] any response? [20:41:14] no [20:41:32] try changing the description on a normal page - http://www.wikidata.org/wiki/Q4774674 [20:41:33] -operations is also going crazy [20:41:58] on sandbox I got We are experiencing technical difficulties, and because of this your "save" could not be completed. [20:42:00] Error: Command “operations” not recognized. Please review and correct what you’ve written. [20:42:10] * legoktm pats AsimovBot  [20:43:36] hoo: its just an abstraction to return the data in an unserialized form already. Not really a problem imho. Though, it adds dependencies which makes the thing more heavy. Suppose you want to use the API on a very low level without any abstraction, RepoApi is not the best choice. So this brings up the question whether we should have designed RepoApi without abstraction, putting another thing on top of that for introducing that abstraction there. [20:44:00] There are no twisted dependencies or anything evil though [20:44:07] Vacation9: try now [20:44:14] working now [20:44:15] weird [20:44:46] Danwe_WMDE: Yes... I'm currently trying to do that while not breaking b/c too much [20:45:15] :) [20:45:19] they fixed it [20:45:51] Danwe_WMDE: What about RepoApi and AbstractedRepoApi maybe...
mh [20:46:33] hoo: you're trying to separate the current RepoApi into a module only dealing with the API, without heavy abstraction and one module on top of that? [20:46:36] ah [20:46:53] Yes, and I killed its dependent script already [20:47:06] lib/resources/wikibase.store/wikibase.EntityStore.js [20:47:09] I deleted that [20:47:30] First off, in that case you probably won't require any lazy loading since RepoApi won't have heavy dependencies and as soon as you're loading the "AbstractedRepoApi" you can't save much by using lazy load I guess [20:47:45] I lol'd at https://www.wikidata.org/?diff=23554123&oldid=22418456&rcid=23529520 [20:48:07] hoo: good, was about time someone deleted that [20:49:50] Danwe_WMDE: What would you rename 'wikibase.store.FetchedContent' to? Or just leave the name [20:52:08] hoo: that's something designed more into the direction of an actual store module, so I put it in there even though it has nothing to do with the RepoApi [20:52:22] so you could just leave it there and leave the store module just with that one in it [20:52:36] and with store.js [20:53:12] Sounds good :) [20:58:04] hoo: just out of curiosity, why do you need the api without abstraction? [20:58:29] Danwe_WMDE: We want it to be more lightweight for better client usage [20:59:06] I see, when dealing with site-links only you don't want to load the whole data model stuff? [21:00:01] Danwe_WMDE: Yes... using the API doesn't necessarily imply we want SnakList.js loaded etc. [21:09:07] Danwe_WMDE: What if I call the "big" RepoApi RepoApi and the other one RepoApiBase or so? [21:09:29] anywhere except in the client you probably don't want the lightweight thing [21:10:25] hoo: Wouldn't have a problem with AbstractedRepoApi either. RepoApiBase sounds more like some abstract class to me [21:10:34] True [21:17:22] legoktm: have you read: https://www.wikidata.org/wiki/Wikidata_talk:Bots#Bot_speed?
[21:17:43] yes, but it looks like more people have commented since i last did [21:17:45] lemme see [21:22:21] is there a date datatype in wikidata? [21:23:24] not yet [21:23:26] its planned [21:24:02] what is the problem with date type? [21:26:09] it hasnt been coded yet? [21:33:19] i got disconnected... [21:33:24] what is the problem with a date type? [21:33:46] [04:26:08 PM] it hasnt been coded yet? [21:34:41] I dont understand... what has to be coded? [21:35:05] idk, i was just guessing [21:35:10] hoo probably knows ;) [21:35:24] huh? Sorry, my pidgin crashed [21:35:40] I was asking about the problem with date types [21:35:57] New patchset: Hoo man; "Refactor wikibase.store" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [21:36:13] Some aren't yet implemented [21:36:58] aren't there implementations for dates in nearly every language/framework/operating system? [21:38:12] http://de.wikipedia.org/wiki/ISO_8601 [21:38:26] i dont think thats the issue [21:38:41] id guess its the integration with wikidata [21:38:44] New patchset: Hoo man; "Refactor wikibase.store" [mediawiki/extensions/Wikibase] (master) - https://gerrit.wikimedia.org/r/58223 [21:38:50] like the JS to add claims [21:38:54] how to store them [21:38:56] Danwe_WMDE: aude: ^ ;) [21:38:59] the API format, etc [21:39:17] there is a standard for representation of dates [21:40:01] ISO 8601 and RFC 3339 [21:41:37] https://www.wikidata.org/wiki/Wikidata:Project_chat#New_property_types_for_numbers.2C_coordinates.2C_multi-lingual_text [21:41:38] * lbenedix doesn't remember big changes on dates since 1582 [21:41:41] also relevant [21:42:14] bbl food :D [21:56:56] legoktm: your bot is currently doing over half of all edits on wikidata [21:57:27] is that an issue?
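On the serialization point raised above: ISO 8601 round-tripping is indeed a solved problem in most languages, and Python's standard library handles it out of the box. The hard part for a Wikidata time datatype is presumably everything ISO 8601 does not model (precision such as "century", Julian vs. Gregorian calendar models, dates far in the past), plus the storage, API, and UI integration the chat mentions:

```python
from datetime import datetime, timezone

# The day the Gregorian calendar started -- lbenedix's reference point
t = datetime(1582, 10, 15, tzinfo=timezone.utc)

s = t.isoformat()  # ISO 8601 / RFC 3339 style: '1582-10-15T00:00:00+00:00'

# ... and it parses back losslessly
assert datetime.fromisoformat(s) == t
```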
[21:57:36] half the edits are adding a claim [21:57:37] and for each claim, a source is added [21:57:47] as the lag is still over one day i would say yes [21:58:12] hmmm [21:58:17] ok, i just killed one of the processes [21:58:34] the Editcount is only at ~400 edits/min by now... what happened? Was 1000 yesterday at this time [21:59:09] lbenedix: we're slowing down our bots [21:59:18] Sk1d: actually, im just going to throttle one of the processes [21:59:19] yes I estimated that we should have only 350 edits per minute to keep the lag at the same level or reduce it [21:59:37] dropped to 300 [22:00:01] huh [22:00:19] oh wait [22:00:20] gah [22:00:26] i disabled the throttle [22:00:29] * legoktm fixes [22:00:58] there are still about 1.288.556 edits to process [22:01:41] lbenedix: see https://www.wikidata.org/wiki/Special:DispatchStats [22:01:59] http://lbenedix.monoceres.uberspace.de/cam/ [22:02:58] i just use https://wikipulse.herokuapp.com/ [22:03:52] you cant touch wikipulse and throw it against the wall when it dont work [22:04:54] heheheheh [22:09:57] http://lbenedix.monoceres.uberspace.de/screenshots/tt904j37ji_(2013-04-09_00.09.08).png [22:10:24] the great setup including Lego and an android smartphone [22:10:32] <3 legos [22:11:27] LEGOktm: do you like KTM as well? [22:11:38] heh, nope those are just my initials [22:15:03] I saw one of your old signatures, Loved the Kontributions part Legoktm. [22:15:12] heh [22:33:49] currently the lag is going down by about 6.000 edits per hour so maybe only 214 hours left to get to 0 ; ) [22:34:03] wooh [22:34:14] then we can ramp up the bots and do the whole thing again :D [22:34:20] but only a rough estimation [22:34:43] lemme set up a script to record it [22:34:44] i hope in 9 days there is a hardware-software solution :) [22:35:05] can you access the lag via api?
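Sk1d's back-of-the-envelope estimate above checks out: dividing the quoted backlog by the quoted drain rate gives the ~214 hours, which is also where the "in 9 days" hope comes from:

```python
pending = 1_288_556   # edits still to dispatch, figure from the chat
drain_rate = 6_000    # net backlog reduction per hour, figure from the chat

hours_left = pending // drain_rate
days_left = hours_left / 24

print(hours_left)           # 214 hours
print(round(days_left, 1))  # 8.9, i.e. roughly nine days
```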
[22:35:10] lol legoktm [22:35:34] Sk1d: no, i was just gonna screenscrape \o/ [22:35:38] Sk1d: yeah - hope to have it soon - we have another call with the foundation team on thursday and hopefully know more then [22:38:39] hmmm, this regex is going to be rather ugly [22:38:44] my favorite! [22:39:11] :D [22:39:25] legoktm: have some fries with it! [22:39:32] very nomnomnom [22:46:39] * lbenedix set up a livestream for his Wikidata-Meter: http://www.youtube.com/embed/T1b0-VT93iI [22:47:32] :D [22:48:19] about 10sec lag... [22:48:37] JohnLewis: hold off on requesting any deletions for a minute please [22:48:47] trying to archive the page [22:48:53] Ok Legoktm. [22:49:44] JohnLewis: go for it :) [22:49:49] Ok. [23:11:20] Lydia_WMDE, sk1d: can i just track the "average" field? [23:11:24] or should i track median? [23:11:39] legoktm: i _think_ average is fine [23:24:01] ok http://tools.wmflabs.org/legobot/dispatchstats.log will update every 10 minutes as soon as i add it to my crontab [23:24:48] \o/ [23:25:50] it would be nice if we could somehow include EPM [23:26:00] but i dont know of an API for that [23:26:26] who's the person who runs wikipulse again? [23:26:46] edsu: ping! [23:34:24] bah [23:34:27] its not adding newlines [23:36:28] lbenedix: ping [23:37:19] pong [23:37:44] can you set up a webpage or something that i can have my script fetch to get the current EPM? [23:38:05] we have http://tools.wmflabs.org/legobot/dispatchstats.log, but i'd like to be able to compare that against EPM [23:38:50] you can either monitor the irc by yourself or use edsus wikipulse [23:38:51] wikipulse.herokuapp.com/stats/wikidata-wikipedia/60000.json [23:39:37] woaaaah [23:39:40] that is awesome [23:39:44] edsu: <3 [23:39:58] its not really json, but its great!
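A sketch of how such a stats log could be parsed once the cron job is writing it. Both the line layout and the field names below are invented for illustration — the chat never shows what dispatchstats.log actually looks like:

```python
import re

# Hypothetical log line layout: "<date> <time> avg_lag=<seconds>s epm=<edits/min>"
LINE_RE = re.compile(r"^(?P<ts>\S+ \S+)\s+avg_lag=(?P<lag>\d+)s\s+epm=(?P<epm>\d+)$")

def parse_stats_line(line):
    """Split one (invented) stats line into (timestamp, lag seconds, edits/min)."""
    m = LINE_RE.match(line.strip())
    if m is None:
        raise ValueError("unrecognized stats line: %r" % line)
    return m.group("ts"), int(m.group("lag")), int(m.group("epm"))

ts, lag, epm = parse_stats_line("2013-04-09 00:10 avg_lag=93600s epm=400")
```

93600 seconds is about the "over one day" lag mentioned earlier; with lag and EPM in the same record, comparing them is just a matter of reading both columns.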
[23:42:39] i'm working on monitoring the irc with a carambola-microcontroller board to get my meter independent from the computer [23:42:58] woot [23:42:59] http://tools.wmflabs.org/legobot/dispatchstats.log [23:51:17] can you explain to me what this lag means for normal users? [23:52:13] basically the dispatch lag is how long it takes a change on wikidata to be propagated to the wikipedias [23:52:24] so right now if i add a sitelink to a page, itll take a day to show up [23:53:08] there is no privilege for non-bot-edits? [23:53:14] no [23:53:28] this will frustrate users [23:53:34] indeed. [23:53:56] of course, you can purge to manually override which is what we keep telling people