[10:35:36] A bit new to Wikidata. Can someone tell me if there are notability criteria for pages?
[10:35:58] I think as long as it exists on a linked project, it's fine
[10:36:34] I see. Is there a deletion process for WD?
[10:38:03] Such as this page here https://www.wikidata.org/wiki/Q26879248. There is no linked project page, and the page was even created by the person in question, a possible COI.
[10:38:50] fddh: https://www.wikidata.org/wiki/Wikidata:Requests_for_deletions
[10:39:39] Thanks for the help, Daniel. I'll try opening a new RfD there :)
[10:40:14] https://www.wikidata.org/wiki/User:MahmoudHashemi
[10:40:22] nice pic :)
[10:41:14] fddh: Pigsonthewing (Q15136093) did quite a bit of editing on the page, you may want to ping him about this.
[10:41:45] arguably, the same problem applies to him and his page
[10:42:27] should high-profile Wikimedians get data items?...
[10:42:41] * DanielK_WMDE is Q15136093
[10:42:50] * DanielK_WMDE didn't create that item
[10:43:05] err.
[10:43:05] I'll leave a note on onthewing's talk page.
[10:43:07] * DanielK_WMDE is Q29644932
[10:44:48] That would be the same as having a wiki page, wouldn't it? Especially since many other external applications rely on Wikidata as well.
[10:47:42] I don't see why someone who is written about, or who gives conference talks, should not get an item
[10:47:47] Wikimedian or not
[10:48:05] fddh: inclusion criteria for Wikidata are not quite as strict as Wikipedia's. In particular, "structural need" may apply to Wikimedians when the item is useful for describing people organizing events or projects
[10:48:07] https://www.wikidata.org/wiki/Wikidata:Notability
[10:48:08] fddh: see also https://www.wikidata.org/wiki/Wikidata:Notability
[10:48:12] ah, DanielK_WMDE beat me to it
[10:48:33] "If there is no item about you yet, you are probably not notable."
[10:49:10] but in general, i'd say that i'd leave it to *other* people to decide that i should have an item. i don't want to decide on my own "notability".
[10:49:15] Agreed
[10:50:23] my own claim to fame is apparently based on the fact that I appear in the proceedings of WikiSym https://dl.acm.org/author_page.cfm?id=81466647803
[11:10:23] Can pages be moved/renamed in Wikidata?
[11:17:18] fddh: no, identifiers are permanent. they can be merged though, leaving a redirect.
[11:18:08] What about this edit here https://www.wikidata.org/w/index.php?title=Q215639&type=revision&diff=610695511&oldid=610694324
[11:19:05] fddh: that changes the label. you could call that a "rename". But items cannot be referenced by label. so, it's more like DISPLAYTITLE on wikipedia.
[11:19:53] I see. So what can you do to change that label?
[11:20:00] for mediawiki, the actual page title doesn't change: it's still Q215639
[11:20:33] click "edit" above the box that has the labels and descriptions
[11:20:59] all edits are per "section", there is no global edit mode or edit page
[11:21:00] Alright, thanks.
[11:21:24] Yes, that is what I was confused about. No source viewing either.
[11:22:10] right. the "source" is really not relevant. you can get JSON and RDF exports if you want.
[11:22:23] e.g. https://www.wikidata.org/wiki/Special:EntityData/Q215639.json
[11:23:03] or https://www.wikidata.org/wiki/Special:EntityData/Q215639.ttl
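
(For readers who want to script this: the Special:EntityData exports mentioned above are easy to consume from code. A minimal sketch, assuming Python with the requests library installed; Q215639 is the item from the chat, and the "en" label is just an example of what the JSON contains.)

    # Fetch the JSON export of an item and read a label and the
    # sitelinks. Q215639 and "en" are example values.
    import requests

    url = "https://www.wikidata.org/wiki/Special:EntityData/Q215639.json"
    data = requests.get(url, timeout=30).json()

    entity = data["entities"]["Q215639"]
    print(entity["labels"]["en"]["value"])
    print(len(entity.get("sitelinks", {})), "sitelinks")

(The same URL with a .ttl extension returns the Turtle/RDF serialization instead.)
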
[11:23:49] Ah, that's good. Still missing source editing, but I guess it's not much needed on WD.
[11:24:15] you really don't want people to edit that source
[11:24:40] Why so?
[11:24:47] look at it!
https://www.wikidata.org/wiki/Special:Export/Q215639
[11:25:31] is a rather complex data structure mostly consisting of references to numeric identifiers encoded as json
[11:25:39] it's not made for manual editing
[11:26:23] You're right, that makes sense.
[11:26:27] if you *really* must, you can use the API sandbox to edit the json directly. but i don't see the point.
[11:27:43] fddh: the downside is that any non-trivial change takes a lot of clicks. which is why there are a lot of user-created tools for semi-automated editing that give you a better interface for specific tasks.
[11:28:01] https://www.wikidata.org/wiki/Wikidata:Tools
[11:28:49] Thanks for linking me to the tools. These should make it easier.
[11:40:56] Any idea what this is about: "As an anti-abuse measure, you are limited from performing this action too many times in a short space of time, and you have exceeded this limit. Please try again in a few minutes."
[11:44:49] you have hit the edit rate limit
[11:45:03] this exists on all wikis, but it's easier to hit with wikidata
[11:45:29] especially when editing labels - a separate edit is performed for every language. doing too many at once will trigger the limit
[11:49:10] doing one edit per language is... not too great. that should be fixed. but most mass edits are done by bot anyway, so this issue was never high priority
[12:51:58] @daniel That's bad. What if someone is correcting multiple mistakes (as I was)? I thought the edits I was doing at one time would be counted as one, but it looks like each field edit is counted separately; no wonder WD pages have long edit histories.
[13:04:05] https://www.wikidata.org/wiki/Q998628
[13:04:11] illuminator?
[13:04:21] it should be illustrator, shouldn't it?
[13:06:10] I think they're different things
[13:06:30] yes
[13:06:34] http://www.getty.edu/vow/AATFullDisplay?find=&logic=AND&note=&subjectid=300220539
[13:06:34] illuminator seems to be an older term, 13th to 15th century
[13:07:19] (compare also "may he who illuminated this illuminate me", for anyone who's seen Indiana Jones and the Last Crusade ;) )
[13:15:27] that seems to be a meaning of illuminate that I didn't know
[13:22:40] sjoerddebruin: How do you find the sources for references (like getty.edu above)? I'm not very good at that.
[13:23:06] fddh: I clicked through to "art of illumination", where we see "AAT ID"
[13:23:10] (identifier in the Art & Architecture Thesaurus by the Getty Research Institute)
[13:24:53] Didn't notice that.
[13:54:34] I thought an illuminator is a member of the Illuminati...
[17:27:36] maxlath[m]: not that I know anything about your project (inventaire), but I read mentions of "works" and "editions", and if it's books then you might be interested in https://bookbrainz.org/ maybe?
[18:55:11] o/ Is there any limit on running large numbers of queries on the Wikidata Query Service? An example query looks something like this:
[18:55:12] https://query.wikidata.org/#SELECT%20%3Fitem%20%28COUNT%28%3Fsitelink%29%20as%20%3Fcount%29%20WHERE%20%7B%0A%20%20VALUES%20%3Fitem%20%7B%20wd%3AQ1%20wd%3AQ2%20wd%3AQ3%20%7D%0A%20%20%3Fsitelink%20schema%3Aabout%20%3Fitem%0A%20%20FILTER%20REGEX%28STR%28%3Fsitelink%29%2C%20%22.wikipedia.org%2Fwiki%2F%22%29%0A%7D%20GROUP%20BY%20%3Fitem
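
(For readability, the URL-encoded link above decodes to the SPARQL in the sketch below, which also shows one way to run it against the public endpoint; it assumes Python with the requests library, and the User-Agent value is a placeholder you'd replace with real contact info.)

    # Run the decoded sitelink-counting query against the Wikidata
    # Query Service. wd:Q1 wd:Q2 wd:Q3 are the example items.
    import requests

    query = """
    SELECT ?item (COUNT(?sitelink) as ?count) WHERE {
      VALUES ?item { wd:Q1 wd:Q2 wd:Q3 }
      ?sitelink schema:about ?item
      FILTER REGEX(STR(?sitelink), ".wikipedia.org/wiki/")
    } GROUP BY ?item
    """

    r = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": query, "format": "json"},
        headers={"User-Agent": "sitelink-counter/0.1 (contact info here)"},
        timeout=60,
    )
    for row in r.json()["results"]["bindings"]:
        print(row["item"]["value"], row["count"]["value"])
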
[18:55:39] !admin I am unable to understand the "permanent duplicated item" field. Can someone help?
[18:55:39] Attention requested HakanIST sjoerddebruin revi
[18:55:57] SMalyshev: o/ what do you think about ^?
[18:56:08] fddh: there are some wikis that support multiple scripts or dialects
[18:56:12] i want to run queries periodically to gather some data and train machine learning models
[18:56:53] So this is a duplication field for those?
[18:58:01] Oh, I get it. It's for the same field but for diacritic support. Why isn't this directly supported?
[18:58:20] Why even need two data fields for that?
[18:58:28] Two data fields?
[18:58:34] bmansurov: there are limits, yes
[18:59:02] bmansurov: what exactly do you want to do with this?
[18:59:26] CatQuest: there doesn't seem to be much activity there https://bookbrainz.org/statistics + it's not wikidata-centered ;)
[18:59:37] SMalyshev: we want to use this information in order to determine which articles are important and missing from various language wikis
[18:59:37] a wikidata item can only have one link per wiki; in the wikis sjoerd mentioned, there are multiple pages that should be linked to the same wikidata item, but since we can't do that, we have to create separate items
[18:59:48] I mean why is this duplication field even needed? Why not include diacritic support directly?
[18:59:54] which means the items are technically duplicates, but we can't merge them
[19:00:10] bmansurov: ok. but specifically the query - what is it meant to do?
[19:00:10] and the "permanent duplicated item" property lets us link the items together
[19:00:40] SMalyshev: get the number of Wikipedias that link to a specific wikidata item
[19:01:11] here's the query in case you missed it above: https://query.wikidata.org/#SELECT%20%3Fitem%20%28COUNT%28%3Fsitelink%29%20as%20%3Fcount%29%20WHERE%20%7B%0A%20%20VALUES%20%3Fitem%20%7B%20wd%3AQ1%20wd%3AQ2%20wd%3AQ3%20%7D%0A%20%20%3Fsitelink%20schema%3Aabout%20%3Fitem%0A%20%20FILTER%20REGEX%28STR%28%3Fsitelink%29%2C%20%22.wikipedia.org%2Fwiki%2F%22%29%0A%7D%20GROUP%20BY%20%3Fitem
[19:01:13] I see. Looks like this is for technical reasons rather than something else.
[19:01:28] bmansurov: ok, so I think it's better to use schema:isPartOf/wikigroup
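
(A sketch of how that hint could translate into SPARQL - this is my reading of it, not a query from the chat: in the WDQS data model each sitelink points to its wiki via schema:isPartOf, and the wiki carries a wikibase:wikiGroup value, so the REGEX filter can become a join. The string slots into the Python harness shown earlier.)

    # Same count, but filtering on the wiki group instead of running
    # a REGEX over every sitelink URL. The wikiGroup value is
    # "wikipedia" for all Wikipedias.
    query = """
    SELECT ?item (COUNT(?sitelink) AS ?count) WHERE {
      VALUES ?item { wd:Q1 wd:Q2 wd:Q3 }
      ?sitelink schema:about ?item ;
                schema:isPartOf ?site .
      ?site wikibase:wikiGroup "wikipedia" .
    } GROUP BY ?item
    """
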
[19:02:59] SMalyshev: ok, and that wouldn't be limited?
[19:03:36] SMalyshev: say I want to get all wikidata items and links to those items from various wikipedias. Is that the most efficient query?
[19:03:48] bmansurov: limited by what? every query you run is limited (60 sec timeout plus load restrictions)
[19:04:30] yep, technical reasons
[19:04:31] SMalyshev: I see.
[19:05:03] SMalyshev: I meant by the service itself. Would my requests be denied if I make, say, 100 requests per second?
[19:06:14] bmansurov: yes
[19:06:35] let me look up the exact limits
[19:07:01] SMalyshev: thanks! Is there a way to query this information offline without putting any load on the service?
[19:07:33] bmansurov: well, it depends. You can always take the dump and parse it, for example.
[19:07:49] e.g. using Wikidata Toolkit. But that probably would be rather slow
[19:08:14] SMalyshev: i see
[19:15:27] Hello
[19:16:07] I want to know how to edit the Isabel Segovia item
[19:16:40] bmansurov: so why do you need these 100 requests per second? what are you trying to do with it?
[19:18:44] SMalyshev: If each query is limited to 60 seconds, I thought maybe making multiple queries at once would help me pull the data I need quickly. So the number of Wikipedia links a Wikidata item has is an indication of that item's importance. So if we have 5 articles to suggest to the user, we could sort those articles by the number of links and suggest the top one.
[19:19:18] SMalyshev: this data is used as one feature of a model that generates these recommendations.
[19:20:25] bmansurov: and that's why we have limits :) because if everybody thinks "maybe I run 100 queries in parallel and get my data sooner" then nobody gets any data, because the server dies under load :)
[19:20:49] agreed
[19:21:05] bmansurov: we already have the number of sitelinks on each item as a value, you do not need to count them anew
[19:21:36] bmansurov: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Page_properties
[19:21:20] oh that's cool
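
(Among the page properties documented at that link is a precomputed per-item sitelink count, which avoids counting anything. A sketch, again as a query string for the Python harness above; the item list is still the example one, and note - as comes up next in the chat - that this counts sitelinks to all wikis, not only Wikipedias.)

    # Read the precomputed sitelink count exposed as a page property.
    # This counts sitelinks to *all* wikis, not just Wikipedias.
    query = """
    SELECT ?item ?count WHERE {
      VALUES ?item { wd:Q1 wd:Q2 wd:Q3 }
      ?item wikibase:sitelinks ?count .
    }
    """
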
[19:22:23] SMalyshev: I forgot to say that I need sitelinks to Wikipedias only and not other wikis
[19:23:02] ok, that we don't have, so here you may want to count
[19:24:05] SMalyshev: do you know if Wikidata dumps are available in the analytics cluster? Maybe I can use Wikidata Toolkit with Spark to parse those dumps?
[19:32:45] bmansurov: well, wikidata dumps are on /public/dumps, but I am not sure whether it's connected to analytics...
[19:32:58] if not, then I guess you'll need to d/l it
[19:33:09] ok, thanks!
[19:34:16] bmansurov: but I personally would consider just using sitelinks. if the item has a lot of sitelinks, it's usually because of wiki links... it's pretty rare that it has a lot of sitelinks but they are all non-wiki ones
[19:34:46] that makes sense
[19:41:55] yannf: is a terminator a member of the terminati? :D
[19:44:32] fallgtp: what do you want to edit? :-)
[19:45:24] Sorry, let me read so I can explain what I need in a better way
[19:46:11] "so why you need this 100 requests per second? what you are trying to do with it?" Answer: I'm new, maybe this is not the place to get the information
[20:13:51] ignorance is bliss
[20:39:23] oh dear. that was not meant for fallgtp :(
[20:39:51] maxlath[m]: we link to wikidata at least :)
[20:40:23] CatQuest: oh, you're from the project, cool :D
[20:41:29] CatQuest: so now I have questions ^^ do you have an API that can be queried by ISBN or Wikidata id? which license do you use? how do you handle covers?
[20:41:57] CatQuest: on the inventaire side, you can learn more about the data here: https://wiki.inventaire.io/wiki/Data
[20:44:44] maxlath[m]: the main people from the project are at #metabrainz, but I'd expect this to just have the same CC0 license as MusicBrainz itself, at least for most data (https://musicbrainz.org/doc/About/Data_License)
[20:45:22] That said, I'm surprised the license isn't specified
[20:45:29] I'll pester the team about it :p
[20:48:31] ah, I'm just the beta (and alpha) tester monkey and head cheerleader honestly :D not a coder :)
[20:49:09] maxlath[m]: it's a project in the works for now though :) That said, I'd expect it to be easy to map Wikidata to the project in the future - and for books Wikidata is less bad than for music, although I guess it's still at the "work" level, not edition
[20:49:14] but i've been on board since it was written in Haskell :P
[20:49:33] +1 to that
[20:50:01] Anyway, definitely interesting to see more free datasets out there, especially since one already linked to WD should be simpler to match to later on too :)
[21:08:45] CatQuest: to be blunt, i don't think he would have understood it anyway :x
[21:09:34] CatQuest: i still miss the annual cheerleading calendar ;p
[21:10:13] bookbrainz is meant to link to various online databases and libraries in the future. right? riiiiiight?
[21:10:16] * CatQuest has too-hairy legs to be in that sort of calendar :D
[21:10:55] one can shave.
[21:11:41] then you be the one, I'in't
[21:11:45] :D
[21:17:21] CatQuest: you were going to bed!
[21:17:45] yes i am doing this.
[21:18:05] SothoTalKer: It would be a shame not to, so I expect so :p
[21:20:48] Hi
[21:21:23] Any QS master?
[21:44:22] jklamo: https://en.wikipedia.org/wiki/QS - which of those? :-)
[21:47:43] The one missing, QuickStatements :-)
[21:53:54] not me :)