[00:05:58] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1966 bytes in 0.096 second response time
[00:11:07] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1948 bytes in 0.079 second response time
[00:55:02] SMalyshev, I might have found the culprit - user queries during import may cause significant deadlocks... stopping user queries might help, investigating
[01:42:26] is the sparql endpoint under heavy load?
[01:43:16] https://query.wikidata.org/sparql?query=SELECT%20?given_name%20?given_nameLabel%20?fnameLabel%20?genderLabel%20?languageLabel%20WHERE%20{%20?given_name%20wdt:P31%20wd:Q5.%20OPTIONAL%20{?given_name%20wdt:P21%20?gender.}%20OPTIONAL%20{?given_name%20wdt:P735%20?fname.}%20OPTIONAL%20{?given_name%20wdt:P103%20?language.}%20SERVICE%20wikibase:label%20{%20bd:serviceParam%20wikibase:language%20%22[AUTO_LANGUAGE],en%22.%20}%20}%20LIMIT%2010
[01:43:22] timeout problem
[01:50:21] beshoo: I'm having timeout issues too, I think the endpoint is under heavy load atm
[01:50:42] how can we load all human names and genders?
[01:50:55] I need this info, is there a way to do it locally?
[01:51:10] is there a database I can download and do this locally?
[01:51:15] Yes there is
[01:51:30] https://www.wikidata.org/wiki/Wikidata:Database_download beshoo
[01:52:05] and how can I pass this command that I sent you to the database?
[01:52:44] You can't without installing a sparql endpoint yourself, or you write a Python script that reads the JSON data dump and extracts what you need
[01:52:56] Hmmm
[01:53:48] do you know a good sparql command line tool?
[01:54:35] Nope, sorry
[01:54:48] Ok, that's fine
[02:05:46] Nazral, do you know any Python script which I can use to read this BIG JSON dump and give me that query's results?
[02:16:14] No, I wrote my own scripts to do that
[02:20:15] can you share the script to do that select I added, please?
[02:20:48] Or if you have the database locally, can you give me the results? I'd donate $10 to you
[02:22:35] I need the full data of that query
[02:22:47] not just limited to 10
[02:24:10] since the gz JSON file is around 30 GB it will take time to download
[02:24:17] if you already have a recent one
[02:24:39] and could help with that query, it would be nice of you
[02:26:24] SELECT ?given_name ?given_nameLabel ?fnameLabel ?genderLabel ?languageLabel WHERE { ?given_name wdt:P31 wd:Q5. OPTIONAL {?given_name wdt:P21 ?gender.} OPTIONAL {?given_name wdt:P735 ?fname.} OPTIONAL {?given_name wdt:P103 ?language.} SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en". } } LIMIT 10
[02:26:28] sorry, at work atm
[02:26:41] Ok, are you interested in it today?
[02:26:48] no
[02:26:59] in your free time :)
[02:27:45] Even if it's freelance work!
[02:29:58] no, sorry :)
[02:30:45] $50 :)
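A minimal sketch of the "Python script that reads the JSON data dump" approach suggested at 01:52:44, mirroring the query above (humans via P31 = Q5, with gender P21, given name P735, native language P103). It assumes the standard Wikidata JSON dump layout (one large JSON array with one entity per line); the filename is a placeholder for whichever dump you downloaded.

```python
# Stream the ~30 GB gzipped JSON dump line by line instead of loading it
# whole; each line (apart from the array brackets) is one entity document.
import gzip
import json

def claim_ids(entity, prop):
    """Return the item IDs used as values of the given property, if any."""
    ids = []
    for claim in entity.get("claims", {}).get(prop, []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") == "value":
            value = snak["datavalue"]["value"]
            if isinstance(value, dict) and "id" in value:
                ids.append(value["id"])
    return ids

# "wikidata-all.json.gz" is a placeholder filename.
with gzip.open("wikidata-all.json.gz", "rt", encoding="utf-8") as dump:
    for line in dump:
        line = line.strip().rstrip(",")
        if not line or line in ("[", "]"):
            continue  # skip empty lines and the enclosing array brackets
        entity = json.loads(line)
        if "Q5" not in claim_ids(entity, "P31"):
            continue  # not an instance of human
        label = entity.get("labels", {}).get("en", {}).get("value", "")
        print(entity["id"], label, claim_ids(entity, "P21"),
              claim_ids(entity, "P735"), claim_ids(entity, "P103"))
```

Unlike the public endpoint, this runs locally with no timeout, at the cost of one full pass over the dump.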
[03:01:42] yurik: yeah, definitely don't do both loading and query serving, that does not work very well
[03:02:24] thx :) are there any significant changes with the lex intro?
[03:02:49] e.g. did you add any special handling to ttl splitting?
[03:10:23] yurik: no, no substantial changes
[03:10:43] you didn't have to optimize anything for lex storage?
[03:11:01] (like you did with other prefixes)
[03:51:42] yurik: lexemes are not in yet
[12:11:29] I wonder if wdt: should exclude historical data
[12:13:31] it feels a bit silly having to add a novalue statement to things which no longer exist just to make the historical data go away
[12:19:43] people do keep marking correct but historical statements as deprecated... I don't know if they're doing it because of how wdt: behaves though, I'm not sure what else makes use of the ranks
[12:20:24] nikki: Lua parser functions, by default, only show "best" statements.
[12:20:36] The UI for ranks is quite bad.
[12:20:56] I'm glad it's hard to find, otherwise we'd have a lot more incorrectly deprecated statements
[12:20:56] it would be nice to make the meaning and behavior clearer in the UI
[12:21:03] *sigh*
[12:21:41] it should be easy to find, and easy to understand. big dialog, lots of text. it's rarely needed, so it can be big and in-your-face and text-heavy
[12:22:53] why is it silly to add a preferred novalue statement, though? How else would you model "used to have kings A, B, and C, but no longer has a king today"?
[12:23:02] I think part of the problem is that people edit a statement to mark it as historical, they don't expect to have to edit another statement... they might not even know which other statement to edit
[12:23:56] ah, right.
[12:24:10] a "historical" rank between "normal" and "deprecated" would fix that
[12:24:15] yes, I would love that
[12:24:44] though it's not clear what "normal" is supposed to be used for, then
[12:25:33] e.g. three population statements, two from 2010 and one from 1900... the 1900 one is definitely less preferred than the 2010 ones, but you can't say which of the 2010 ones is best without more research
[12:26:01] I would expect it to be used for alternative values that aren't old ones
[12:26:29] but still "less preferred"?
[12:26:44] disagreement about the current value would generally be modeled by multiple preferred statements
[12:33:10] the only thing coming to mind right now is identifiers, where you could say an identifier that redirects is still valid but less preferred than the canonical one, although that could also be modelled as historical on the basis that it was presumably a separate entry
[12:35:13] I'm sure I've come across other situations though... I'll have to make a note of them when I find them
[12:48:00] oh, and to answer your question, I don't know how to model whether something has a king or not (the closest property we have is "head of state", which isn't only for monarchies)
[12:49:30] but I would find it weird to add head of state: novalue to historical countries, there's no head of state because the country doesn't exist. it's technically correct but weird because you don't expect a value to exist
[12:50:33] whereas novalue is very useful when you *do* expect it to have a value but one doesn't exist
[12:50:42] nikki: there is no need to. all the heads of state could have "normal" rank. what's the problem with that?
[12:51:32] if you don't want them to show under "heads of state", the query should probably be "heads of state of existing countries". though actually listing current vs historical countries isn't easy either
[13:10:38] I don't have a problem with them all having the normal rank, it was just an example of a situation where it's not expected to have a current value
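The rank behaviour discussed above is easiest to see in queries: wdt: returns only "truthy" values (best rank, never deprecated), while the p:/ps: path exposes every statement together with its rank. A sketch against the public endpoint, assuming the requests package; Q64 (Berlin) and P1082 (population) are example IDs chosen to match the population example above.

```python
# Compare truthy values with the full statement list (including ranks).
import requests

ENDPOINT = "https://query.wikidata.org/sparql"

truthy = "SELECT ?pop WHERE { wd:Q64 wdt:P1082 ?pop . }"
with_ranks = """
SELECT ?pop ?rank WHERE {
  wd:Q64 p:P1082 ?st .
  ?st ps:P1082 ?pop ;
      wikibase:rank ?rank .
}
"""

for query in (truthy, with_ranks):
    r = requests.get(ENDPOINT, params={"query": query, "format": "json"})
    r.raise_for_status()
    for row in r.json()["results"]["bindings"]:
        print({k: v["value"] for k, v in row.items()})
```

If one statement has preferred rank, the first query returns only that one; with everything at normal rank it returns all of them, which is exactly the behaviour being debated.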
[16:52:58] !admin
[16:52:58] Attention requested  HakanIST sjoerddebruin revi
[16:53:08] could I get a revdel for https://www.wikidata.org/w/index.php?title=Q54878671&diff=prev&oldid=694104577
[16:53:18] accidentally revealed my IP
[16:54:01] Done. Please mail oversight@wikidata.org next time or send me a PM.
[16:54:17] tyvm, will keep that in mind for next time
[16:54:31] It's kinda tricky so out in the open, you're lucky I was online this time. :)
[16:54:57] I had to restrain myself from clicking the link ;)
[16:56:00] yeah, I wasn't sure where to look to ask for oversight on Wikidata, but I figured IRC was better than the admin noticeboard
[16:56:38] thanks again, have a great day
[17:02:13] they accidentally revealed their IP on IRC by linking to the edit here. It was not possible to connect the IP to the user before... meh
[17:05:53] I don't know why people care about it anyway
[17:36:17] Because attackers everywhere!1!1 https://security.stackexchange.com/q/186929/47770
[17:57:14] Hi! Can someone give an example for wbcreateclaim with an external ID?
[18:00:46] nonZero: what are you trying to do? :)
[18:01:18] reosarevok: Add external IDs to an entity
[18:01:42] I got that much :)
[18:01:53] But are you looking into a bot-like thing?
[18:02:34] reosarevok: I have a lot of code already, just adding some new functionality - importing data from the Israeli Film Archive
[18:02:58] Oh, ok. I've always abstracted this stuff via pywikibot
[18:03:20] More examples for that than for direct API usage for bots, I get the feeling
[18:04:13] nonZero: for wbcreateclaim you just have to pass a JSON representation of the claim that you want to create
[18:04:14] * reosarevok But I guess really you should be able to do the same as for the string example here? https://www.wikidata.org/w/api.php?action=help&modules=wbcreateclaim
[18:04:30] Not sure how I ended up with a /me there, sigh
[18:04:32] the funny thing is that you need to generate a statement ID for that claim
[18:05:16] pintoch: https://www.wikidata.org/w/api.php?action=help&modules=wbcreateclaim
[18:05:53] oh sorry, I was thinking about wbsetclaim
[18:06:34] ok, then the example they provide for the string datatype should work, no?
[18:06:47] strings and external-ids are treated in the same way
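Since the help page only shows a string example, here is a hedged sketch of the same call for an external-id property; per the point above, external IDs use the same value encoding as strings (a JSON-serialized string). Q4115189 (the sandbox item) and P345 (IMDb ID) are example targets only, and the session is assumed to be already logged in.

```python
# Create an external-id claim via action=wbcreateclaim.
import json
import requests

API = "https://www.wikidata.org/w/api.php"
session = requests.Session()  # assumed to be authenticated already

# Write actions need a CSRF token.
token = session.get(API, params={
    "action": "query", "meta": "tokens", "format": "json",
}).json()["query"]["tokens"]["csrftoken"]

r = session.post(API, data={
    "action": "wbcreateclaim",
    "entity": "Q4115189",              # sandbox item, example only
    "property": "P345",                # IMDb ID (external-id datatype)
    "snaktype": "value",
    "value": json.dumps("tt0111161"),  # value must be a JSON-encoded string
    "token": token,
    "format": "json",
})
print(r.json())
```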
[18:07:03] (unrelatedly, from wbsetclaim: "and move the claim to the topmost position within the entity's subgroup of claims that feature the main snak property P1. In addition, move the whole subgroup to the top of all subgroups aggregated by property". wat and why?)
[18:09:56] reosarevok: oooohhh.... that should go away. that feature is long dead :)
[18:10:23] the original idea was to have explicit control over the order of statements on the page
[18:10:42] but that was confusing and cumbersome (and buggy, too)
[18:10:43] Yeah, I was wondering because if that was doable I'd expect it to work in the UI too
[18:10:57] yeah. we tried that. it was horrible
[18:11:11] so now we just have a global list of properties, and sort by that
[18:14:01] So there is some sort of a consistent ordering, then?
[18:14:30] I mean, I noticed P31 and P279 being on top but was wondering how much of the rest is hardcoded and how much is just add order or P-number order
[18:16:42] reosarevok: the visual order is controlled by https://www.wikidata.org/wiki/MediaWiki:Wikibase-SortedProperties
[18:17:16] it's interpreted as a single list, sections are ignored
[18:17:39] they still kind of work, because properties from different sections are rarely used together, and generic stuff is on top
[18:17:48] And all the others just sort after these in P order?
[18:18:19] Or are these all except external IDs? (since there are so many of those, I guess that could be the case)
[18:18:52] I don't remember how the rest is handled... it may be numeric sorting, or may just be the order in which they were added to the item
[18:20:48] I see :)
[18:24:30] pintoch: thanks. it works!
[18:34:18] were the timeout problems with mwext-mw-selenium-composer-jessie solved? I see it's voting and is blocking my patch... I thought it was made non-voting some time ago
[18:34:43] ^ on the wikibase extension
[19:01:03] hello
[19:45:38] Hi! I have ~1000 entity IDs - some are redirects (i.e. were merged). How can I quickly check all 1000 IDs to see if any of them are redirects?
[19:51:39] you can do a query like http://tinyurl.com/y9ok8taf
[19:52:29] although that doesn't return anything, because the items I used aren't redirects :)
[19:56:41] nikki: wonderful! BTW - how did you find this answer?
[19:57:40] owl:sameAs is mentioned on the "RDF Data Model" page linked from the query service's help menu
[19:58:16] but I already knew it was possible to do that
[22:43:29] does anyone have a template sample for generating the software version and its publication date from wikidata?
[22:43:39] {{#property:P348}}
[22:43:51] does the former, but for the rest I suspect that Lua is needed
[22:52:43] de:Modul:Wikidata allows doing that with {{#invoke:Wikidata|claim|P348|qualifier=P577}}
[22:52:54] but I'm not able to translate that to the local wikidata module :/
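For the data shape behind that Lua call: the publication date (P577) is stored as a qualifier on each software version (P348) statement, which is why {{#property:P348}} alone can't reach the date. A sketch of the same statement/qualifier walk via the query service, assuming the requests package; wd:Q2005 is only a placeholder item ID, substitute the software item in question.

```python
# Fetch version statements (P348) together with their publication-date
# qualifier (P577).
import requests

query = """
SELECT ?version ?date WHERE {
  wd:Q2005 p:P348 ?statement .
  ?statement ps:P348 ?version .
  OPTIONAL { ?statement pq:P577 ?date . }
}
"""
r = requests.get("https://query.wikidata.org/sparql",
                 params={"query": query, "format": "json"})
r.raise_for_status()
for row in r.json()["results"]["bindings"]:
    print(row["version"]["value"], row.get("date", {}).get("value", "no date"))
```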
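Going back to the redirect question at 19:45: the shortened URL hides the actual query, but a reconstruction of the owl:sameAs approach nikki describes might look like the sketch below. In the query service's RDF, a redirected (merged) item points at its target via owl:sameAs; the Q-IDs used here are placeholders for the ~1000 IDs to check.

```python
# Batch-check which of a list of entity IDs are redirects.
import requests

ids = ["Q4115189", "Q13406268"]  # placeholder IDs; paste all ~1000 here

query = """
SELECT ?item ?target WHERE {
  VALUES ?item { %s }
  ?item owl:sameAs ?target .
}
""" % " ".join("wd:" + q for q in ids)

r = requests.get("https://query.wikidata.org/sparql",
                 params={"query": query, "format": "json"})
r.raise_for_status()
for row in r.json()["results"]["bindings"]:
    print(row["item"]["value"], "->", row["target"]["value"])
```

Only IDs that actually are redirects appear in the result, which matches nikki's note that her example returned nothing because the items she used weren't redirects.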