[16:21:32] Should I write a proposal for bot approval (https://www.wikidata.org/wiki/Wikidata:Bots#Approval_process)?
[16:21:45] rdhyee: sounds like the right thing to do, yes.
[16:21:53] ok
[16:21:56] (No, but seriously, I held a Wikidata hackathon with Census people in attendance and we made... very little progress.)
[16:22:06] If you can make more progress, that would be excellent.
[16:22:12] rdhyee: that would be quite cool. How did you do the mapping?
[16:22:25] harej: can you explain why there was little progress?
[16:22:34] ricordisamoa: cool :)
[16:22:34] We didn't know what we were doing?
[16:22:40] heh
[16:22:40] ok
[16:22:41] I'm not very far along... but will start to figure out mappings...
[16:22:50] Wikidata was also not very mature at the time.
[16:22:59] harej: rdhyee: maybe also check if there is a WikiProject? Or create one if none exists?
[16:23:07] Lydia_WMDE: but it doesn't work :(
[16:23:11] No units yet
[16:23:12] I've worked with the US Census API for a while and have access to local census experts at Berkeley
[16:23:24] audephone: a good point
[16:23:36] ricordisamoa: what's the holdup?
[16:23:53] thanks for the encouragement, everyone...
[16:24:17] a query service is needed to obtain the node
[16:24:23] population won't need units
[16:24:24] such as http://overpass-api.de/api/interpreter?data=%5Bout%3Acustom%5D%3Bnode%5Bwikidata%3D%22Q244207%22%5D%3Bout%3B&url=http://www.openstreetmap.org/?{{{type}}}={{{id}}}
[16:25:06] ricordisamoa: on the OSM side? Sorry, I'm not very familiar with OSM :(
[16:25:19] me neither
[16:25:51] planning to start with populations from the 2010 US census and look at the issue of mapping census identifiers to Wikidata identifiers... maybe easy, maybe not...
[16:25:56] I think we already have a property for population
[16:26:04] indeed
[16:26:11] I'm sure, I did use it
[16:26:28] yes... there is a property for population. This idea came from my entering the 2010 population for my city by hand.
[16:26:45] ricordisamoa: then maybe have a chat with audephone and Andy Mabbett? They're both rather familiar with OSM
[16:26:58] The Census has stuff like area as well
[16:27:08] I tried to find a way to compute the precision of the INSEE data; it turns out they only give a PDF on how to compute it
[16:27:09] a WMDE dev should take a look at https://github.com/kenguest/Services_Openstreetmap
[16:27:26] But starting with population is good
[16:28:22] Lydia_WMDE, about precision, will there be a way to see the raw numbers as entered? Currently, when adding a precision, the digits beyond the precision are all 0 even though the original number was more precise
[16:28:51] TomT0m: what would you like to see?
[16:29:48] I don't know; it's weird at first when you entered 198234123 and 198230000 is what finally gets shown
[16:30:49] so in that case the value was 198234123 and the precision was +/- 10000?
[16:32:44] with OSM we could have accurate coordinates, boundaries, streets, etc.
[16:33:18] yep, that's a little weird actually, but INSEE publishes its raw census numbers with a well-hidden PDF on how to compute the error margin
[16:33:40] https://lists.wikimedia.org/pipermail/wikidata/2015-June/006383.html
[16:33:51] "Therefore I would like to suggest the forming of a working group with core users from OpenStreetMap and from Wikidata to sit together on this subject."
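For reference, the Overpass lookup ricordisamoa links above can also be requested as plain JSON. The following is a minimal sketch, assuming the public overpass-api.de endpoint and the Python requests library (neither is mandated by the log); it fetches OSM nodes tagged wikidata=Q244207, the same item used in the example URL.

    import requests  # third-party HTTP library, assumed available

    # Overpass QL: all OSM nodes tagged with wikidata=Q244207, returned as JSON
    # (the URL in the log uses Overpass's custom output format instead).
    query = '[out:json];node["wikidata"="Q244207"];out;'

    response = requests.get(
        "http://overpass-api.de/api/interpreter",
        params={"data": query},
        timeout=60,
    )
    response.raise_for_status()

    for element in response.json().get("elements", []):
        # Each returned element carries its OSM id, coordinates, and tags.
        print(element["id"], element.get("lat"), element.get("lon"), element.get("tags", {}))

Ways and relations can be queried the same way, which is where boundaries and streets would come from.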
[16:34:48] I came upon this because a friend just added the Paris population so the city would show up in the big-cities-with-a-female-mayor query, and ironically he wanted to remove the +/- 1
[16:35:33] of course, remove the precision but not add it :)
[16:35:33] he could have set it to 0
[16:35:47] but yes, I agree that's a bit weird.
[16:36:07] Lydia_WMDE: do you have a spec on how it should look, or do you need one?
[16:36:09] yeah, the +/- 1 we really need to fix -.-
[16:36:12] the community could work on one
[16:36:19] as a suggestion
[16:36:27] dennyvrandecic, yep, that's what he did, but I wanted to add the real precision; it's kind of globally absurd when you think about it
[16:36:31] no, if someone wants to write up a suggestion that'd be appreciated
[16:36:57] TomT0m: let's work on a suggestion. Do you want to start a first draft?
[16:37:45] TomT0m: yeah, a bit absurd, but it sounds right. This kind of thing happens all the time with scientific data
[16:39:06] dennyvrandecic, ok, I'll think about it
[16:39:12] cool :)
[16:39:52] sooooo, then a question from my side: coolest usage of arbitrary access you've seen so far?
[16:40:14] my own taxobox
[16:40:21] matej_suchanek: ohhhh
[16:40:22] link?
[16:40:38] however, cswiki is getting arbitrary access on Monday
[16:41:00] so it isn't live
[16:41:01] Tuesday
[16:41:19] awwwww ok. Do send me a link when it's live then
[16:42:51] also icons for video game ratings (PEGI, USK)
[16:43:04] is that live?
[16:43:20] yes, but it uses an array of images instead
[16:43:30] ok
[16:43:46] ricordisamoa: what's your latest cool thing?
[16:44:34] Tpt: do you want to give us a short update on all things Primary Sources tool?
[16:44:47] Lydia_WMDE: Yes.
[16:45:19] then there is one use case on hold, but our community would like it
[16:45:32] Lydia_WMDE: nothing after http://tools.wmflabs.org/ptable/ :(
[16:45:36] matej_suchanek: which one?
[16:45:38] ...but keep listening!
[16:45:47] ricordisamoa: that is pretty cool! :)
[16:46:01] I have been at Google for two weeks now. I am currently focused mainly on creating a conversion tool between the Freebase dump format and the Wikidata statement system in order to feed the Primary Sources tool
[16:46:04] gender inflection of labels using a monolingual property which hasn't been created yet
[16:46:18] matej_suchanek: ah ok. Noted.
[16:46:30] usable on occupations like actor - actress
[16:46:37] *nod*
[16:46:44] thus herec - herečka
[16:47:05] A first set of statements has already been uploaded to Primary Sources. Your feedback is welcome. To use the Primary Sources tool, please visit https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
[16:47:05] looks interesting
[16:47:39] Tpt: any chance you can tell us what kind of data you've worked on already and what's coming?
[16:48:21] The first uploaded dumps contain only simple statements (i.e. without qualifiers) between items.
[16:48:37] ok
[16:48:49] The next one should support all Wikidata types (times, coordinates, strings...)
[16:49:22] and also qualifiers, in order to reflect the compound data types of Freebase (which work more or less like our qualifiers)
[16:49:41] great
[16:50:03] Lydia_WMDE: in my head there are also sister cities and birthplaces/deathplaces with countries and flag icons
[16:50:53] matej_suchanek: nice! Looking forward to seeing it
[16:51:08] alright, we have about 10 minutes left. Anything else you want to talk about today?
[16:51:53] nope, I'm out
[16:51:57] goodbye
[16:51:58] One of the big challenges that remain is to map as many Freebase properties as possible. I have created a page with the most used properties to do so (https://www.wikidata.org/wiki/Wikidata:WikiProject_Freebase/Mapping). Your help on this topic is welcome
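The property mapping Tpt mentions is essentially a lookup table from Freebase property paths to Wikidata property IDs. Below is a minimal sketch of that idea, not the actual conversion tool; the three entries are illustrative examples only, and the authoritative, community-maintained mapping is the WikiProject Freebase page linked above.

    from typing import Optional

    # Illustrative Freebase-to-Wikidata property mapping (examples only; the
    # real mapping is maintained at Wikidata:WikiProject Freebase/Mapping).
    FREEBASE_TO_WIKIDATA = {
        "/people/person/place_of_birth": "P19",   # place of birth
        "/people/person/date_of_birth": "P569",   # date of birth
        "/people/person/profession": "P106",      # occupation
    }

    def map_property(freebase_property: str) -> Optional[str]:
        """Return the Wikidata property ID for a Freebase property, if one is mapped."""
        return FREEBASE_TO_WIKIDATA.get(freebase_property)

    print(map_property("/people/person/place_of_birth"))  # -> P19
    print(map_property("/people/person/nationality"))     # -> None (not mapped in this sketch)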
[16:52:53] Tpt: want to add it to the weekly summary as well?
[16:53:07] Lydia_WMDE: Yes, it would be nice
[16:54:07] https://www.wikidata.org/wiki/Wikidata:Status_updates/Next is always open ;-)
[16:54:55] Lydia_WMDE: thanks :-)
[16:55:18] alright, thanks so much everyone for coming.
[16:55:23] always nice hanging out with you :)
[16:55:45] popups work on properties
[16:55:46] hope to see some of you at Wikimania
[16:55:50] but it's hackish
[16:55:58] ricordisamoa: yay! Indeed!
[16:56:07] * Lydia_WMDE <3 hovercards
[16:56:18] Thank you very much, Lydia_WMDE
[16:59:44] thanks
[16:59:52] hi all
[17:00:21] #endmeeting
[17:02:11] bye
[17:02:58] * Lydia_WMDE will try to post the log on-wiki
[20:48:17] madhuvishy, the Edit schema was assigned to Dario, but we should assign it to JForrester
[20:49:04] madhuvishy, just telling you because I see you probably want to talk to him next
[20:49:20] mforns: cool :)
[20:49:22] madhuvishy, I'll move the schema down next to James'
[20:49:37] mforns: why are we chatting in #office :)
[20:49:55] madhuvishy, xDDD sorry