[00:12:20] Could someone help fix this issue, which was caused by the external-ID change... https://www.wikidata.org/wiki/User_talk:Vlsergey#WEF_doesn.27t_load_data_from_Wikidata
[00:14:17] or maybe it is fixed... I'm getting odd results when trying it... sometimes it works...
[06:14:41] when deciding about person or not person in wikidata game, the tough ones are "roman catholic saints", "gods", "category:person"
[06:15:17] gods are never persons? (no matter if you believe in them or not)
[06:15:26] how about those saints.. hrmm
[06:15:56] and category pages can't be persons but they are linked to person pages
[06:17:37] oh yeah, and finally the comic characters
[06:18:01] and fictional persons
[11:05:24] mutante: we have items for fictional human, fictional character and deity, so I would not mark them as humans. not sure what you mean by "category:person" but categories are obviously not people
[13:58:48] is it possible to filter edit history by property?
[14:00:02] johtso: not really, no
[14:00:16] could be done with a direct database query, i guess
[14:00:24] but that would not be 100% reliable either
[14:14:42] I've noticed that back in 2014 a bot added bad data for various football stadia capacities
[14:15:17] great
[14:15:39] this is why we need references with dates, so you can easily grab the batch
[14:16:11] https://www.wikidata.org/w/index.php?title=Q150928&diff=122448221&oldid=113592270
[14:16:37] https://en.wikipedia.org/wiki/Westfalenstadion
[14:16:53] it's because the capacity field in the infobox lists various historical capacities, the oldest coming first
[14:18:49] is there anything to do in this situation? the bot is no longer run/maintained..
[14:20:18] another example https://www.wikidata.org/w/index.php?title=Q159848&diff=121681899&oldid=121681898 -> https://en.wikipedia.org/wiki/Camp_Nou
[14:24:13] feels like a problem that could be solved with another bot :)
[14:24:23] technically it's not wrong, it just needs start/end date qualifiers added (and the most recent one set to the preferred value)
[14:24:36] and yes, I was about to suggest asking on the bot requests page
[14:25:03] just shows what a nightmare it is going from semi/un-structured to structured data!
[14:25:19] aha, I'll take a look there
[14:32:20] I guess one way to do it would be to fetch all the maximum capacity statements that are marked as imported from wikipedia and see whether wikipedia has multiple capacities listed; if so, edit the existing statements to add dates and add the missing ones
[14:33:14] yeah, was thinking something along those lines
[14:33:50] nikki: if I was to try and put something together using python, how much pain would I be letting myself in for? :)
[14:36:24] pywikibot seems quite easy to use, even for me as someone with not much python experience
[14:38:40] yeah, pywikibot seems okay
[14:41:37] I haven't tried to extract data from templates, but there's a script (harvest_templates.py) in the repository which can import specific parameters from templates, so somewhere in there you should be able to find out how it does it :)
[15:04:10] nikki: thank you. by category:person i meant category pages on commons that are named after a person and linked to the person Q in wikidata
[15:04:28] gotcha: fictional humans and deities
[15:04:56] the only part i am not sure about yet is whether Roman Catholic saints were people or deities
[15:05:07] but i skip them for now
[15:06:33] oh, commons. I don't think there's a right answer there, depends who you ask.
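[Editor's note: a minimal pywikibot sketch of the fix discussed above at 14:24 and 14:32 — adding a start-date qualifier to historical capacity statements and marking the current one preferred. The property IDs (P1083 = maximum capacity, P580 = start time) and the example date are assumptions for illustration, not taken from the log.]

```python
# Sketch only: assumes pywikibot is configured with a logged-in account,
# and that P1083 = maximum capacity, P580 = start time.
import pywikibot

site = pywikibot.Site('wikidata', 'wikidata')
repo = site.data_repository()

item = pywikibot.ItemPage(repo, 'Q150928')  # Westfalenstadion, from the diff above
item.get()

claims = item.claims.get('P1083', [])
for claim in claims:
    if 'P580' not in claim.qualifiers:
        start = pywikibot.Claim(repo, 'P580')
        # placeholder date: a real bot would parse the actual year out of
        # the infobox's list of historical capacities
        start.setTarget(pywikibot.WbTime(year=1974))
        claim.addQualifier(start, summary='add start date to historical capacity')

# mark the current capacity as the preferred value (assumes the last
# statement is the most recent one, which a real bot would verify)
if claims:
    claims[-1].changeRank('preferred')
```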
probably easier to keep skipping them :P
[16:19:34] Thanks Wikidata! Just imported this template to svwp, which uses Wikidata info on what will soon be million(s) of articles. https://sv.wikipedia.org/w/index.php?title=Coccobius_sumbarensis&diff=35651074&oldid=28697384
[17:03:13] Josve05a: nice!
[17:17:50] SMalyshev: what is your question re testwikidata.org?
[17:18:10] jzerebecki: well, basically I am having trouble with deletes in wdqs
[17:18:30] so I wanted to test it using test.wikidata. But even on test only admins can delete
[17:18:40] so I wonder if it's possible to test it somehow
[17:19:11] SMalyshev: it is easy to get that right on beta
[17:20:11] there are probably admins for testwikidata around this channel :)
[17:20:26] I saw there only two: you and aude :)
[17:21:27] oh, you are right, I have that
[17:21:41] should I just create an item and delete that?
[17:22:03] need to run, will get to it later
[17:22:12] jzerebecki: not right now, but may need it later :)
[17:22:35] also may need to figure out if it's possible to have a unit test do it...
[17:22:39] may be not trivial
[17:25:01] SMalyshev: i think it's okay to make you admin on test wikidata
[17:26:30] aude: that'd help a bit, though it does not solve the unit test... but I think I know a way around it
[17:27:17] {{done}}
[17:27:17] You rule, aude!
[17:27:19] :)
[17:33:21] aude: thank you!
[18:25:10] Lydia_WMDE, when we deploy tabular data on commons, can we store the page description in wikidata? Also possibly the associated license?
[18:27:13] yurik: can't speak for lydia, but i think not
[18:27:31] especially since commons will eventually be a wikibase repo also
[18:27:46] although not sure if/how that would work with tabular data
[18:28:16] aude, tabular data is an "entity", and it could use some metadata. In a way it is similar to files
[18:28:58] i know
[18:29:10] There are things like license - which is basically a Qid
[18:29:23] it's just that we are very early with developing structured (file metadata) support for commons
[18:29:35] and the scope of this is pretty big already
[18:29:54] really is a question for lydia and maybe daniel
[18:30:50] aude, the reason for it is that I'm thinking about https://phabricator.wikimedia.org/T134426
[18:32:58] license is a Q number, but description can be stored either inside the data, or in Wikidata, or in the wikibase for commons
[18:33:42] plus the question of "where should it be stored" - should we store metadata together with data, or should it be moved to wikibase when it's available
[18:46:45] yurik: probably could work similarly to how files get handled
[18:47:02] i thought they are not yet?
[18:47:12] do you know the ETA on wikibase on commons?
[18:47:15] in the glorious future....
[18:47:22] (weeks vs years)?
[18:47:27] we are actively working on it at the moment, but it's early stages
[18:47:46] i mean, do you think it will be like a few months or a few years?
[18:47:57] that's a lydia question and not sure we can say
[18:48:16] * aude doesn't want to overpromise :)
[18:49:49] yurik: what aude says is correct. i would handle it in the same way we handle files in the future.
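[Editor's note: a minimal sketch of the create-and-delete test SMalyshev asks about at 17:21, assuming pywikibot is configured with an account that has admin (delete) rights on test.wikidata.org.]

```python
# Sketch only: create a throwaway item on test.wikidata.org and delete it
# again, so a WDQS instance watching the site can be checked for how it
# handles the deletion.
import pywikibot

site = pywikibot.Site('test', 'wikidata')
repo = site.data_repository()

# editing a fresh ItemPage with no ID creates a new item
item = pywikibot.ItemPage(repo)
item.editLabels({'en': 'WDQS delete handling test'}, summary='create test item')
print('created', item.getID())

# ...let the query service pick up the new item, then remove it again
# (this call fails unless the account can delete pages)
item.delete(reason='WDQS delete handling test', prompt=False)
```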
it is at least 6 months out before anything of it shows up on commons - likely more
[18:50:20] creating items on wikidata for these datasets seems not good, as aude said
[18:51:27] possibly the license could be linked from commons to wikidata
[18:51:31] but not descriptions
[18:51:38] * aude thinks it's an open question
[18:54:05] *nod*
[18:55:30] Lydia_WMDE, will we want to have Q IDs for each tabular data page (or file for that matter)?
[18:56:03] no but likely a mediainfo entity
[18:56:11] that is the equivalent of an item for media files
[19:01:25] Lydia_WMDE, i wonder if this is a good approach, because a Q ID has one amazing property - it is unique without any meta information like the source of that ID (wikidata vs some other storage). The moment we introduce another numbering system, we can no longer identify items in Wikipedia - we have to say either "wikipedia/wikidata/wiki* entity", vs "a commons file/datablob identified by some other ID"
[19:02:21] it is actually important to separate them because they are very different concept-wise
[19:02:47] they will also have IDs
[19:02:58] so they can be addressed
[19:03:00] for example, let's say we have a list of the largest cities in wikipedia. It has a Q id. Now imagine we move all the data to a commons .tab page and use Lua to generate the list
[19:03:20] shouldn't the data have that Q ID?
[19:03:36] or do you want the data to have a C ID?
[19:04:15] That's my concern - addressing implies you have to make a longer key now --- "commons:ID" vs "wikidata:ID"
[19:06:19] i understand
[19:08:15] and you will still need to have context - like saying "insert image 123 here" may turn out to be impossible because that's not an image but tabular data... So should it have yet another ID?
[19:09:36] how do I transform an item into a redirect?
[19:12:17] you can enable the merge script in your preferences
[19:12:31] and then the merge option will exist in the drop-down where "move" would usually be
[19:12:53] ok thanks
[19:54:52] Lydia_WMDE: remember I was saying how I'm not sure how it decides which languages to put in the terms box when logged out? I just noticed on test.wikidata.org that it offers me japanese, tibetan and dzongkha, now *that's* a bizarre choice :)
[20:03:37] nikki: whut?
[20:03:44] that _is_ kinda wrong :D
[20:07:27] yeah :) wikidata itself still offers me a reasonable selection though
[20:09:35] I actually get a nice selection...
[20:33:59] hmm... looks like I found what the problem with deletes in WDQS is.
[20:34:19] rcstream sets revid to 0 for delete messages for some reason
[20:39:00] also, test.wikidata.org seems to have a bug: "Exception encountered, of type "BadMethodCallException"" when restoring a deleted item
[20:40:54] filed as T137249
[20:40:54] T137249: Exception when restoring deleted item on test.wikidata.org - https://phabricator.wikimedia.org/T137249
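[Editor's note: a hedged sketch of how one might observe the rcstream behaviour SMalyshev describes at 20:34, using the socketIO_client subscribe pattern documented for stream.wikimedia.org at the time. The delete-event field names are assumptions based on the MediaWiki recent-changes format, not confirmed by the log.]

```python
# Sketch only: subscribe to the recent-changes stream for wikidata and log
# delete events, which (per the discussion above) arrive without a usable
# revision id, so an updater must key off the title/entity id instead.
import socketIO_client


class RCNamespace(socketIO_client.BaseNamespace):
    def on_connect(self):
        self.emit('subscribe', 'www.wikidata.org')

    def on_change(self, change):
        # field names assumed from the MediaWiki recent-changes format
        if change.get('type') == 'log' and change.get('log_type') == 'delete':
            print('deleted:', change.get('title'), change.get('revision'))


io = socketIO_client.SocketIO('stream.wikimedia.org', 80)
io.define(RCNamespace, '/rc')
io.wait()
```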