[02:27:20] We don't keep filmography lists on an actor's wikidata item, right? I.e. we don't list all films an actor has participated in on the actor item, but instead link them from the film...right?
[02:30:21] Yes
[02:32:21] Because we don't list any of the harry potter movies under https://www.wikidata.org/wiki/Q38119
[08:00:36] disruptive edits from https://www.wikidata.org/wiki/Special:Contributions/189.159.131.45
[08:01:02] changes the English label of property genre
[08:07:38] mutante: might be better to ask on the administrators' noticeboard
[08:07:47] I have to go out now so I can't help :/
[08:08:26] nikki: alright, it's ok
[08:08:31] thx
[10:58:33] LeaAuregann_WMDE: I wonder if the wikidatacon overlaps with one of the local European conferences
[10:59:01] multichill: thanks for the info, I'll check that
[10:59:59] You're early so I think you just claimed the day, but you might want to send a save-the-date to the chapters list so they all put it in their schedule
[13:17:19] Woo.. deprecated ContentHandler hooks and functions are starting to be removed
[13:27:34] Reedy: So everything starts to get broken now? ;-)
[13:27:52] multichill: Well, unless people are using it only via weird reflection things
[13:27:57] And if they are, they deserve it :P
[13:28:34] ha
[13:30:09] I'm not removing ones that have other usages
[13:30:21] Just ones that are only used inside core (tests), and have no usages in WMF-hosted extensions
[13:30:23] It's 1.29 now
[13:30:28] They've been deprecated since 1.21
[13:30:37] So they can gtfo if they complain :)
[17:06:08] what is the best place for a newbie to understand wikidata "architecture" vs. dbpedia and Freebase (and any other meritorious approaches)? Is SPARQL ripe enough for "everyday" use, for example? Can I take a Word document (free text), parse out only NAMES (as in persons), and get a good link into wikipedia from wikidata services? Where to start? -Rick
[17:09:36] maybe http://wikidata.org/wiki/wikidata:Data%20access
[17:15:03] Nemo - thanks for the link.
[17:22:16] When a Google search shows a text box from Wikipedia in the search results page (typically up top, and in the right-hand column), do they make an API call outside of their systems to Wikipedia (or do they wholesale copy wikipedia/wikidata internally)? Does Google use the same "linked data" architecture internally as Wikidata? If not, what "architecture" does Google use internally?
[17:22:57] rjlabs: Nobody knows. Google keeps all this entirely secret.
[17:23:23] How does this matter to you?
[17:25:26] Hey hey, I am looking for an API which helps with adding different language translations in wikidata. Any help?
[17:25:26] Thiemo - I'd like to go with whatever architecture is the "de facto standard" in the industry, vs. being way off the well-beaten trail (if there actually is one).
[17:31:04] Thiemo - this was at least a start http://research.google.com/pubs/pub44818.html
[17:38:35] amisha: could you explain in a bit more detail what you want to do? :) I can think of a few ways to add translations but I'm not sure if any of them would help you
[17:54:40] "Metaweb's Freebase (collaborative knowledge base launched in 2007) was acquired by Google in 2010 and used as the open core of Google Knowledge Graph" -- does this imply that Google's internal "linked data" architecture is basically Freebase? While I know WikiData has a vast collection of data, they do not for instance have data on products at the level Google and Amazon do (for example UPC codes, box size, mfg....
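A minimal sketch of the name-to-article lookup rjlabs describes, using the public wbsearchentities and wbgetentities API modules. The requests library and the take-the-first-hit disambiguation are assumptions; real name resolution from free text needs more care:

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def wikipedia_link_for(name):
        # Search Wikidata items whose label matches the name.
        search = requests.get(API, params={
            "action": "wbsearchentities",
            "search": name,
            "language": "en",
            "type": "item",
            "format": "json",
        }).json()
        if not search["search"]:
            return None
        qid = search["search"][0]["id"]  # naive: take the first hit
        # Fetch the item's sitelinks and pick the English Wikipedia one.
        entity = requests.get(API, params={
            "action": "wbgetentities",
            "ids": qid,
            "props": "sitelinks/urls",
            "sitefilter": "enwiki",
            "format": "json",
        }).json()
        sitelinks = entity["entities"][qid].get("sitelinks", {})
        return sitelinks.get("enwiki", {}).get("url")

    print(wikipedia_link_for("Douglas Adams"))

The same hop could also be done via the SPARQL endpoint, but for a simple label-to-article lookup the search API is usually enough.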
[20:53:39] What happens if a property goes (only hopefully) into cyberheaven? But can re-enter at another site, but that site is not named yet?
[21:12:50] Is any data in WikiData dynamically linked to its source? For example, most language versions of WikiPedia would have an article on Paris, France. Most of those would use an Info Box that presumably links the underlying data to some common record, even though the interface language is unique. For example, Population. Is there a single "golden" copy of the Paris population statistic at WikiWhatever? and...
[21:14:13] Can there be dynamic links from government sources of statistics that dynamically update WikiWhatever's "gold copy" (of, say, population), or is the entire WikiData(base) manually kept up to date?
[21:16:28] What happens if a property goes (only hopefully) into cyberheaven? But can re-enter at another site, but that site is not named yet?
[21:17:45] Migrant: Your question is unclear. What other site? We only have www.wikidata.org
[21:18:05] it's not all manual, there are some bots which update some things
[21:18:12] rjlabs: Property usage on Wikipedia is very minimal right now
[21:18:21] It's still very scary
[21:18:49] as for a single copy, wikidata would be the place for that, for example the english wikipedia page "south pole telescope" has an infobox that comes (almost) entirely from wikidata
[21:19:27] rjlabs: You can see the changes to an item in the history, for example for Paris at https://www.wikidata.org/w/index.php?title=Q90&action=history
[21:19:31] If I change the infobox in any country for Paris Population, does it "percolate" to all other languages?
[21:20:00] I.e. there is only one copy of Paris Population, the gold copy, and that is kept in WikiData?
[21:20:06] No, we're not there yet. Far from it
[21:20:28] Right now most Wikipedias still keep the data locally. Things like the South Pole Telescope are still exceptions
[21:20:33] Slowly it's coming
[21:21:00] yeah, we haven't yet convinced all the projects they should use lots of wikidata data
[21:21:05] so infoboxes - language and data - are unique to each country?
[21:21:11] but if they want to use it, they can
[21:21:57] so info boxes are local to each wikipedia language site?
[21:22:07] yes
[21:22:21] wow did I have misconceptions! lol
[21:22:52] rjlabs: First step, getting all the data from the infoboxes to Wikidata. We're still working on that
[21:23:22] Second step, reusing the data in infoboxes on Wikipedia. We have some pilots. Some languages are doing more than others
[21:24:07] Multichill: wikidata-links for property P1447 Sports-reference.com Profile are now dead links but will hopefully reappear at another site, but that is not yet confirmed.
[21:24:13] o/ hall1467
[21:24:48] Ah, broken links. We change the formatter url at https://www.wikidata.org/wiki/Property:P1447
[21:24:58] Multichill: http://www.sports-reference.com/olympics/
[21:25:04] is the issue that the various country sites can't get (for instance) the "gold copy" of Paris Population from a central server?
[21:25:09] :O the site has gone
[21:25:53] I set the current formatter url to deprecated as it is no longer working
[21:27:02] and last time i saw there were well over 90 thousand IDs created; hopefully we can reuse the profile IDs at a later stage when they reappear
[21:27:09] so abruptly...
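As an aside on the "gold copy" idea: a minimal sketch of reading the single shared population figure for Paris straight from Wikidata. Q90 comes from the history link above, P1082 is the "population" property, and the requests library is an assumption:

    import requests

    resp = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetclaims",
        "entity": "Q90",      # Paris
        "property": "P1082",  # population
        "format": "json",
    }).json()

    # Each claim carries a rank, so several historical figures can
    # coexist with one marked "preferred" as the current best value.
    for claim in resp["claims"].get("P1082", []):
        print(claim["rank"], claim["mainsnak"]["datavalue"]["value"]["amount"])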
[21:27:42] rjlabs: there's a variety of reasons, some people don't like having to edit data on a separate wiki, some people don't trust wikidata's data quality, some people just don't like change, some don't have any editors who know how to insert wikidata info, etc
[21:28:06] but it looks like they will still work for ice hockey players and basketball players
[21:28:32] Migrant: https://www.wikidata.org/wiki/Property_talk:P1447#Broken_links
[21:28:50] \o halfak
[21:28:58] :)
[21:29:31] Fun. https://nl.wikipedia.org/w/index.php?title=Speciaal:VerwijzingenZoeken&limit=500&offset=0&target=http%3A%2F%2Fwww.sports-reference.com%2Folympics%2F
[21:29:35] * multichill thinks about changing his nick to multihal
[21:30:01] Why not edit (e.g. Paris Population) on any language Wikipedia and have that update the "gold copy" that is held at WikiData? (Perhaps too much strain on the WikiData server to answer all the requests for "gold copy" data as pages from everywhere are requested?)
[21:30:11] sjoerddebruin: I'm about to go to bed. Feel like diving into the wayback engine and seeing what it got? :P
[21:30:28] rjlabs: Not a technical problem, it's a community thing
[21:30:42] it seems to have a bunch of pages, haven't checked them all though (obviously, I'm not superhuman :P)
[21:30:46] see also this at the same discussion page https://www.wikidata.org/w/index.php?title=Property_talk%3AP1447&type=revision&diff=412950355&oldid=382837592
[21:31:45] I've got a question I was hoping someone here might know the answer to. How can we track the usage of specific statements from a Wikidata entity on the various Wikipedias? I've seen the properties pages, but it seems like they take an all-or-nothing approach to statement usage tracking.
[21:32:25] Related to kjschiroo's post, I'm working with him on this project as well
[21:32:37] ^ would be a great way to start quantifying the importance of certain statements on Wikidata
[21:33:34] I don't know enough about the usage tracking to be able to help :/
[21:33:49] We've been kicking around the idea of setting up a mirror of Wikipedia and modifying it to track this usage
[21:33:50] Maybe Amir1 or DanielK_WMDE
[21:33:54] if nobody else here can help, I would suggest asking on the "contact the development team" page on wiki
[21:33:57] Is it that the English-language Wikipedia site doesn't want someone from Germany editing the Paris Population "fact" (gold copy kept at WikiData) and having that appear when the English-language site next assembles the Paris page for its users?
[21:34:02] It's kind of late in Germany :/
[21:34:08] I'm around
[21:34:24] I'm kind of in the middle of applying somewhere
[21:35:29] kjschiroo: I think statements are all or nothing, yes. It's tracked for different parts
[21:36:05] I'm not sure I understand what you mean by "tracked for different parts"
[21:36:24] multichill, I was going to ask what you mean by "all or nothing" :)
[21:37:04] https://doc.wikimedia.org/Wikibase/master/php/interfaceWikibase_1_1Client_1_1Usage_1_1UsageLookup.html
[21:37:48] ooh
[21:38:19] multichill, do you know if this is exposed via the API?
[21:38:28] If not, maybe you'd know what action/prop we can look under
[21:38:49] For some reason I end up at http://marefa.org/extensions/Wikibase/docs/usagetracking.wiki
[21:39:49] halfak: https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1#query+wblistentityusage
[21:40:42] So this lets us know when an entity is being used, but not the specific statements that are being used?
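For context on why one dead site breaks so many links at once: an external-ID property stores bare IDs, and the wiki renders them through the property's formatter URL (P1630), so a single pattern governs every link. A minimal sketch, assuming the requests library; "some-athlete-id" is a placeholder, not a real ID:

    import requests

    resp = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetclaims",
        "entity": "P1447",    # Sports-Reference.com profile property
        "property": "P1630",  # formatter URL
        "format": "json",
    }).json()

    for claim in resp["claims"].get("P1630", []):
        pattern = claim["mainsnak"]["datavalue"]["value"]
        # "$1" is replaced by the stored ID; deprecating this claim's
        # rank is how multichill marked the pattern as no longer working.
        print(claim["rank"], pattern.replace("$1", "some-athlete-id"))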
[21:40:46] Oh interesting. So this queries usage across the entire wiki
[21:40:58] So the aspects are S, L, T, X, O
[21:41:07] Just have to figure out what it was again
[21:43:05] ALL / LABEL / OTHER / SITELINK / TITLE
[21:43:35] multichill: hypothetically, if we were to set up a mirror of Wikipedia with the wikidata extension, would we be able to log the specific statements of an entity that a page is using?
[21:43:46] oh cool http://olympstats.com/2016/08/21/the-olymadmen-and-olympstats-and-sports-reference/
[21:43:46] kjschiroo: tracking which statements are used (or at least which statement groups, by property) would require a lot more room in the database. So far, there has not been a compelling use case to justify this. Make your case on https://phabricator.wikimedia.org/T151717
[21:44:04] https://doc.wikimedia.org/Wikibase/master/php/classWikibase_1_1Client_1_1Usage_1_1EntityUsage.html#a83de3369ed00f34eec3a22bee9b1022b
[21:44:21] kjschiroo: but i'm not sure it would be possible to provide reliable tracking without breaking changes to our Lua interface
[21:44:35] DanielK_WMDE: You should explain at https://en.wikipedia.org/w/api.php?action=help&modules=query%2Bwblistentityusage what the meaning of S, L, T, X, O is
[21:44:38] We currently pass the full data structure to Lua, and Lua code can do whatever with this
[21:44:49] which we track as "all" data usage.
[21:45:10] We'd need magic Lua objects that track access to parts of the data structure... possible afaik, but not easy.
[21:45:12] DanielK_WMDE, I was thinking that maybe we could write this stuff to a log rather than the DB
[21:45:27] sjoerddebruin: i would suggest a link to the official Olympic reports at leats until hopefully the new site is up ?
[21:45:38] DanielK_WMDE, I see where that gets difficult
[21:45:43] leats => least
[21:45:43] if they have stable id's, Migrant...
[21:46:37] multichill: true. Sitelink, Label, Title, All, Other (not Sitelink/Label/Title, so it's Statements at the moment)
[21:46:47] multichill: file a ticket ;)
[21:49:30] halfak: a log might be a compromise if db capacity is an issue, though it would be less useful. But we'd still need a lot of voodoo magic to track access to the data structure from Lua.
[21:49:43] sjoerddebruin: https://www.wikidata.org/wiki/Special:Contributions/Mahir256 ? ehm?
[21:49:50] DanielK_WMDE, gotcha.
[21:49:55] We could however offer a specialized Lua function that allows better tracking.
[21:50:11] DanielK_WMDE, I'm thinking that this'll be valuable long term.
[21:50:17] if that became the preferred way of accessing data, that might be good enough
[21:50:20] multichill: yeah, the whole day already
[21:50:25] i agree
[21:50:31] talk to hoo|away :)
[21:50:45] E.g. I imagine that we'd want people patrolling RC to focus on changes to statements that get a lot of use cross-wikimedia.
[21:50:57] I realize though that this isn't the only use of wikidata
[21:51:03] But it is an important use.
[21:51:12] What the hell is he doing?
[21:51:50] halfak: that's an interesting idea! can you write it down somewhere?
[21:52:32] sjoerddebruin: I'm giving him 2 minutes to respond or i'll block
[21:53:21] He got a year-old message about this that he never answered https://www.wikidata.org/wiki/User_talk:Mahir256#follows_.28P155.29_.26_followed_by_.28P156.29
[21:53:43] but they are qualifiers?
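A minimal sketch of querying the wblistentityusage module linked above on English Wikipedia. The wbeu* parameter names and the response layout are assumptions from memory of the module's parameter prefix; verify them against the api.php help page in the log:

    import json
    import requests

    resp = requests.get("https://en.wikipedia.org/w/api.php", params={
        "action": "query",
        "list": "wblistentityusage",
        "wbeuentities": "Q90",  # Paris again, purely as an example
        "wbeulimit": 10,
        "format": "json",
    }).json()

    # Each result carries the aspect codes from the log above:
    # S = sitelink, L = label, T = title, X = all, O = other
    print(json.dumps(resp["query"]["wblistentityusage"], indent=2))

Note what the discussion establishes: this reports usage per entity and aspect, not per statement, which is exactly the granularity kjschiroo found missing.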
[21:54:21] He can add them as qualifiers too if he likes, just remove them from the main item
[21:54:35] https://www.wikidata.org/w/index.php?title=Q2822&type=revision&diff=412992082&oldid=409367049
[21:54:58] It's like the newbie adding genre as a qualifier to instance of film
[21:55:51] DanielK_WMDE, totally :)
[21:56:49] nikki & multichill - thanks very much for the info. Will take a look at a few history pages on WikiData, and "south pole telescope" on en Wikipedia to get a better feel for the current situation (especially how to make Wikipedia draw data from WikiData). Will also look at how WikiData draws IN metadata and data from the various info boxes. Lots to learn! :)
[21:57:43] nikki: Can you give your opinion on https://www.wikidata.org/w/index.php?title=Q2920&type=revision&diff=412993236&oldid=405161959 ?
[21:58:27] In my opinion it's wrong to remove the statements. He shouldn't be mass-doing that anyway before discussing it.
[21:58:28] I can't think of any situation where it would only apply to that P31 statement and not to the item's identity as a whole
[21:58:45] so I think it makes more sense as normal statements, not qualifiers
[22:00:19] and yeah, I agree it shouldn't be mass changed without discussing it if it's not clearly wrong
[22:01:16] nikki: See https://www.wikidata.org/wiki/User_talk:Mahir256#What_are_you_doing.3F , you've got to be kidding me
[22:02:04] >_<
[22:03:05] Delete everything that
[22:03:08] 's not perfect
[22:03:13] Isn't that what we do here?
[22:04:07] halfak: Oh dear, said the universe, and vanished in a puff of imperfect logic.
[22:05:50] multichill: maybe a good time for Alphos's rollback thing?
[22:10:24] I am done for the day. Maybe you can comment on https://www.wikidata.org/wiki/User_talk:Mahir256#What_are_you_doing.3F nikki ? TIA
[22:21:47] DanielK_WMDE: Do you have any sense what proportion of Wikidata statement uses come from Lua modules versus templates?
[23:04:38] Stryn: thanks for the link to the Olympstats.com article!
[23:12:35] hall1467: nearly all the templates use Lua modules
[23:12:59] data access is either via the {{#property}} parser function or Lua.
[23:13:21] that's usually hidden by templates, but that really doesn't make a difference
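For anyone following the qualifier dispute above: the Wikibase JSON keeps main statements and qualifiers in distinct places, which is why moving P155/P156 ("follows"/"followed by") from one to the other is a real modelling change, not a cosmetic one. A minimal sketch over Q2822 from the linked diff, assuming the requests library:

    import requests

    resp = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetclaims",
        "entity": "Q2822",
        "format": "json",
    }).json()

    for prop, claims in resp["claims"].items():
        for claim in claims:
            if prop in ("P155", "P156"):
                print("main statement:", prop)
            # Qualifiers hang off an individual claim, not the item.
            for qprop in claim.get("qualifiers", {}):
                if qprop in ("P155", "P156"):
                    print("qualifier", qprop, "on a", prop, "statement")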