[08:32:12] yurik, could you write some Python that goes along all "what links here" entries and writes the Qxxxx results to a file?
[08:32:49] edoderoo, why "what links here"?
[08:33:10] I would much rather get just the "instance-of" property value, in any language
[08:34:03] I thought you wanted all items that hold a Pxxx property... but you want something different?
[08:37:03] edoderoo, correct - I have a list of 300,000 Q values, and I need their "instance-of" property
[08:37:39] is that list of yours in a file?
[08:37:54] edoderoo, I could put it there, sure
[08:38:27] I might use Excel to match it against some other stuff
[08:39:02] you can simply write your own generator that reads from the file and gets the Wikidata item... and wrap that inside some code to "do the rest"
[08:39:31] if you output that to a CSV (comma/pipe/etc. separated) file, you can read it back into Excel easily
[08:41:33] on https://www.wikidata.org/wiki/User:Edoderoobot/Set-nl-description you'll find an example if you search for wd_from_file()
[08:46:48] let me know if I can be of any help
[09:24:04] edoderoo, are you doing it one Wikidata item at a time?
[09:25:23] usually, yes
[09:25:40] as far as I know, you cannot edit/change multiple items/pages in one go
[09:26:22] edoderoo, but I don't need to change anything, only to retrieve the instance-of property for a known entity
[09:27:08] then loop through the items one by one, check the Pxxx value, and act on it (for example, export to another CSV file)
[09:27:59] but if you have no Python experience, I can write that for you if that helps?
[09:32:41] edoderoo, it's ok, thanks. I was hoping for a more batch-oriented solution
[09:33:03] sending 300,000 requests is not nice to the servers :)
[09:33:06] usually inside a batch, things are done one by one ;-)
[09:33:19] yeah, but the overhead is what would kill it ;)
[09:34:57] when using a tool like PetScan, it is also doing these micro-requests somewhere in the background...
I wouldn't be worried about that
[09:50:18] edoderoo, most requests to the MW API use batching - that's how the MW API was designed (I know, I wrote it :))
[09:50:37] (and not everyone is using it, that's correct) :)
[09:52:54] well, if you can advise me how to batch-request a SPARQL query, I will be glad to study it
[10:06:51] edoderoo, that's the sad part - neither the Wikidata API nor SPARQL was (at least initially) designed for batches :(((
[10:07:08] I was hoping something new and exciting was now available
[10:07:28] I could use SPARQL with a VALUES clause, which allows about 250 values at once
[10:08:57] but I will have to write a SPARQL Python requester regardless - maybe I will use your code for that :)
[10:13:09] I have a small piece of code that is a simple generator for a SPARQL query. Ever since, I don't use WDQ anymore ;-)
[13:47:06] Lydia_WMDE: https://www.mediawiki.org/wiki/Events/FOSDEM/2017#Participants_at_FOSDEM please add yourself and encourage others :-)
[13:47:46] multichill: k
[17:13:06] "vi:Ngòi Thia" has no Wikidata entry... how can that happen?
[17:17:36] no one has created it yet
[17:18:29] isn't that done automatically?
[17:18:47] no
[17:19:38] ah - I thought so ;)
[17:23:25] Stryn: so does one have to "manually" add a new article to Wikidata? or is there some bot or gadget to help with that?
[17:25:58] on #wikidata someone says it's not done automatically - which I thought it would be
[17:26:20] grrr - wrong window
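The one-by-one loop edoderoo describes at 09:27:08 (read one Q-id per line, fetch the item, export its instance-of values to a pipe-separated file for Excel) could be sketched like this. It uses the Wikidata `wbgetentities` API rather than edoderoo's pywikibot-based code; the input file layout (one Q-id per line), the function names, and the User-Agent string are assumptions.

```python
import csv
import json
import urllib.parse
import urllib.request

API = 'https://www.wikidata.org/w/api.php'

def instance_of_ids(entity):
    """Extract the Q-ids of an entity's P31 (instance of) claims."""
    ids = []
    for claim in entity.get('claims', {}).get('P31', []):
        snak = claim['mainsnak']
        if snak.get('snaktype') == 'value':  # skip novalue/somevalue snaks
            ids.append(snak['datavalue']['value']['id'])
    return ids

def fetch_entity(qid):
    """One wbgetentities request for a single item - one of 300,000..."""
    url = API + '?' + urllib.parse.urlencode({
        'action': 'wbgetentities', 'ids': qid,
        'props': 'claims', 'format': 'json'})
    req = urllib.request.Request(url, headers={'User-Agent': 'p31-export/0.1'})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)['entities'][qid]

def export_instance_of(in_path, out_path):
    """Read one Q-id per line, write 'Qxxx|P31-targets' rows for Excel."""
    with open(in_path) as fin, open(out_path, 'w', newline='') as fout:
        writer = csv.writer(fout, delimiter='|')
        for line in fin:
            qid = line.strip()
            if qid:
                writer.writerow(
                    [qid, ';'.join(instance_of_ids(fetch_entity(qid)))])
```

Note that `wbgetentities` also accepts several ids joined with `|` in a single call (the default cap is around 50), which is the MW-API batching yurik mentions; grouping ids that way cuts the per-request overhead considerably.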
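The VALUES-clause batching mentioned at 10:07:28 (roughly 250 ids per SPARQL query against the public endpoint) might look like the sketch below. The chunk size, function names, and User-Agent are assumptions; `wd:`/`wdt:` are standard prefixes that query.wikidata.org predeclares.

```python
import json
import urllib.parse
import urllib.request

ENDPOINT = 'https://query.wikidata.org/sparql'

def values_clause(qids):
    """Inline a chunk of Q-ids as a SPARQL VALUES block."""
    return 'VALUES ?item { %s }' % ' '.join('wd:' + q for q in qids)

def chunked(seq, size):
    """Yield successive size-item slices of seq."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def fetch_instance_of(qids, chunk_size=250):
    """Map each Q-id to its P31 targets, chunk_size ids per query."""
    result = {}
    for chunk in chunked(qids, chunk_size):
        query = ('SELECT ?item ?class WHERE { %s '
                 '?item wdt:P31 ?class . }' % values_clause(chunk))
        url = ENDPOINT + '?' + urllib.parse.urlencode(
            {'query': query, 'format': 'json'})
        req = urllib.request.Request(
            url, headers={'User-Agent': 'p31-batch/0.1'})
        with urllib.request.urlopen(req) as resp:
            rows = json.load(resp)['results']['bindings']
        for row in rows:
            # bindings hold full entity URIs; keep only the trailing Q-id
            qid = row['item']['value'].rsplit('/', 1)[-1]
            result.setdefault(qid, []).append(
                row['class']['value'].rsplit('/', 1)[-1])
    return result
```

At 250 ids per query, the 300,000-item list becomes about 1,200 requests instead of 300,000, which addresses the overhead concern raised at 09:33:19.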