[03:41:31] how would I go about merging https://www.wikidata.org/wiki/Q32123602 into https://www.wikidata.org/wiki/Q3813 ?
[04:19:25] dominikh: See: https://www.wikidata.org/wiki/Help:Merge
[04:20:02] thanks
[04:20:07] ;)
[04:20:57] though this is your chance to impart extra knowledge that should be documented but isn't :P
[07:30:54] Hello. I'd like to download all labels (e.g., select term_full_entity_id, term_language, term_text from wb_terms where term_type = 'label') as a CSV. I don't have enough space or memory to load the whole wb_terms table. How can I do it?
[07:44:49] from Wikidata?
[07:45:00] Good morning, can someone tell me if it is possible to copy an existing Wikidata entry to a new one?
[07:45:50] I want to create several building entries which vary only in some specific points (address, architect)
[07:46:11] There is a user script for that, I think
[07:46:49] Thanks Tpt[m]. How can I find/use this?
[07:47:04] I'm looking for it
[07:59:18] Is anybody here familiar with JSON-LD syntax? I have problems with a third-party dataset that OpenRefine can't manage, and I think it's due to some syntax problem.
[08:03:15] Tpt[m]: Yes, from Wikidata.
[08:03:56] mindey: you could use the JSON or RDF dumps: https://dumps.wikimedia.org/other/wikibase/wikidatawiki/
[08:04:21] You should be able to extract labels with a simple grep from latest-truthy.nt
[08:04:24] Tpt[m]: how big are they, when extracted?
[08:04:57] you should probably not extract them fully on disk, but pipe the decompressor into grep if you can use a UNIX shell
[08:05:08] from Python, use gzip.open()
[08:05:49] Tpt[m]: good way. I've used zcat to try to load the wb_terms table, but ran out of space (400 GB is not enough for that one table).
[08:06:05] ...
[08:06:18] If you do not care about descriptions/aliases it should be way smaller
[08:06:49] I *only* need entity_id and labels.
[08:07:05] No need for aliases, or anything else.
[08:08:20] ok.
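The streaming approach suggested above (decompress on the fly, keep only the label lines, never materialize the full table) can be sketched in Python. This is a sketch under assumptions: it assumes labels appear in the truthy N-Triples dump as rdfs:label triples, and the file names, CSV layout, and helper names are illustrative, not from the chat.

```python
# Sketch: stream entity_id + labels out of a Wikidata truthy N-Triples dump
# without extracting it to disk (the wb_terms route ran out of 400 GB).
# Assumption: labels are serialized as rdfs:label triples, e.g.
#   <http://www.wikidata.org/entity/Q42> <...rdf-schema#label> "Douglas Adams"@en .
import bz2
import csv
import re

LABEL_RE = re.compile(
    r'^<http://www\.wikidata\.org/entity/([QP]\d+)> '
    r'<http://www\.w3\.org/2000/01/rdf-schema#label> '
    r'"(.*)"@([a-zA-Z-]+) \.$'
)

def parse_label_triple(line):
    """Return (entity_id, language, label) for an rdfs:label triple, else None."""
    m = LABEL_RE.match(line.rstrip("\n"))
    if not m:
        return None
    entity_id, text, lang = m.group(1), m.group(2), m.group(3)
    return entity_id, lang, text

def dump_labels_to_csv(dump_path, csv_path):
    """Stream the bz2 dump line by line and write entity_id,language,label rows."""
    with bz2.open(dump_path, "rt", encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(["entity_id", "language", "label"])
        for line in src:
            row = parse_label_triple(line)
            if row:
                writer.writerow(row)
```

For a gzip dump, swapping `bz2.open` for `gzip.open` (as suggested in the chat) is the only change needed.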
So a "bzcat latest-truthy.nt.bz2 | grep ''" should give you the relevant lines
[08:08:49] Let me try :)
[08:10:09] The wb_terms table is deprecated and should be removed soon, so using the dump is a good idea for future use
[08:10:51] Got it.
[08:16:10] Arch2all: I didn't manage to find the gadget, sorry
[08:16:56] Btw, what I'm attempting here is to realize this idea: https://www.halfbakery.com/idea/Default_20Interlingual_20Synsets . The task tree I'm on is https://dynalist.io/d/_OLqWbcscbx5xGq2SpOTiu3d -- any ideas that could help create a validator for Wikidata's label collisions within the same language (e.g., due to homonyms) would be great. I'm planning to write it in Python, and share it.
[08:17:02] It should define a pipeline that, on each run, generates the latest collision-free synsets and reports the collisions.
[08:19:53] (Human languages don't have default meanings for words, but they could. :))
[08:27:01] Btw, is there something like a global task tree around Wikidata (i.e., who is working on what)?
[08:40:40] mindey: did you have a look at lexicographical data? https://www.wikidata.org/wiki/Wikidata:Lexicographical_data We're storing words and their meanings in Wikidata now. This has only existed since last year, so plenty of data is missing, but feel free to add what you need :)
[08:42:22] Tpt[m]: OK, but are you sure that there is a gadget with this functionality already out there somewhere?
[08:43:04] It would be really helpful for me...
[08:47:54] Auregann_WMDE: Thanks. Actually, I've noticed that Q-items are all noun-like concepts and P-items relation-like. While I was missing verbs and other forms, I always thought that you can replace all other forms with nouns (i.e., "nounize" them) and then use topic-comment structure, which is the natural order that some languages have (instead of SVO/SOV/etc. overcomplexification).
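The collision validator described above (find labels shared by several entities in the same language) reduces to a grouping pass over the extracted labels. A minimal sketch, assuming labels arrive as (entity_id, language, text) tuples; the function name is illustrative:

```python
# Sketch of the core of a label-collision validator: group labels by
# (language, text) and keep only the keys claimed by more than one entity.
from collections import defaultdict

def find_label_collisions(labels):
    """labels: iterable of (entity_id, language, text) tuples.

    Returns {(language, text): [entity_ids]} restricted to labels
    that more than one entity carries in the same language."""
    by_key = defaultdict(list)
    for entity_id, lang, text in labels:
        by_key[(lang, text)].append(entity_id)
    return {key: ids for key, ids in by_key.items() if len(ids) > 1}
```

Running once over the streamed label CSV, this yields the homonym report; everything not in the result is the collision-free set.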
So, while it's an interesting direction,
[08:48:00] introducing additional super-types like L-, F-, S- might be overfitting to what language is about -- i.e., language is about characterizing one thing with others :)
[09:03:49] Though I hope the L-, F-, S- superclasses will permit a greater diversity of things (concepts, relations) to be nounized.
[09:21:21] Arch2all, Tpt[m]: https://www.wikidata.org/wiki/User:Magnus_Manske/duplicate_item.js
[09:24:29] pintoch: thanks
[10:36:46] @pintoch: Thanks, that helps me a lot!
[10:55:12] pintoch: Installed it and tried it; works like a charm! Great!
[13:21:41] Hello. Is there a quick tool or page where we could enter P or Q values to quickly check what they're for?
[13:22:14] For example, when entering P17, it would output "country"
[13:30:29] Rehman: doesn't the search box (top right corner of the Wikidata user interface) suit your needs?
[13:31:16] Unfortunately not for properties. Unless there is a setting I could change to include properties in search?
[13:32:31] ah right, yes, I agree that's an issue -- I usually go to a random item and pretend I want to add a statement to it. It would indeed be useful if the search box returned properties too.
[13:33:16] I set up a search shortcut for Wikidata, so I can type `wd Q123` or `wd P:P456` into my address bar and look at the entity
[13:33:23] (it also works for lexemes with L:)
[13:35:02] still, I find it odd that Special:Search returns properties by default, but the search box itself doesn't
[13:35:47] You can also set up https://www.wikidata.org/entity/$1 to write P17 directly
[13:35:50] But +1 pintoch
[13:41:23] pintoch: wbsearchentities doesn't support searching multiple entity types, unfortunately
[13:41:40] :(
[13:42:15] Lucas_WMDE: I understand why you would want that for items and lexemes, where the data models are very different
[13:42:54] but for items and properties, I don't see what is blocking it, beyond internal architectural issues?
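The one-type-per-call limit of wbsearchentities discussed above means a client that wants items and properties together has to issue one request per type and merge the results. A sketch of the request construction; the `action`, `search`, `language`, `type`, and `format` parameters are real wbsearchentities parameters, while the helper names are illustrative:

```python
# Sketch: build wbsearchentities query URLs. The API accepts a single
# `type` per call, so covering items and properties takes two requests.
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def wbsearchentities_url(search, entity_type, language="en"):
    """Build one wbsearchentities GET URL for a single entity type."""
    params = {
        "action": "wbsearchentities",
        "search": search,
        "language": language,
        "type": entity_type,  # "item", "property", or "lexeme"
        "format": "json",
    }
    return API + "?" + urlencode(params)

def search_urls_all_types(search):
    """Work around the one-type-per-call limit with parallel requests."""
    return [wbsearchentities_url(search, t) for t in ("item", "property")]
```

This is the "double the API calls per keystroke" option mentioned below; the Phabricator ticket asks for the nicer alternative, an API change letting one call cover both types.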
[13:43:12] (of course I understand that this could be hard to actually implement; I am just taking a user stance)
[13:43:19] well, I was only thinking in terms of internal architectural issues ^^
[13:43:33] okay :)
[13:43:36] I quickly checked how hard it would be to make the search box search both entity types
[13:43:41] but it would require an API change
[13:43:51] (or double the API calls per keystroke, which seems less desirable)
[13:43:56] should I open a Phabricator ticket? I could not find any
[13:44:08] sure, it would make sense to me
[13:44:30] Thanks folks :]
[13:46:13] done here: https://phabricator.wikimedia.org/T217768
[14:23:59] Hello again. Anyone familiar with {{#ifeq: expressions? Looking for some help
[14:24:12] o/
[14:25:09] Rehman: What do you need?
[14:26:29] Hey! Thank you. I need to do something like this (hope it doesn't sound alien):
[14:26:32] {{#ifeq: [any one of the P31 values of this page] | [is Q6558431] | Coal plant | Not coal plant}}
[14:27:02] I'm not sure how to go about writing the wikitext to fetch the values from Wikidata to enwiki
[14:27:18] I think for that it's better to use {{#switch:...}}
[14:28:09] https://www.mediawiki.org/wiki/Help:Extension:ParserFunctions##switch
[14:28:35] Possibly :] But I'm still stuck on how to get the Wikidata values in there
[14:29:58] Do you know how to do that?
[14:31:07] I'm not sure if there's a module to get that directly
[14:31:35] To separately get and check every P31 value
[14:33:48] Maybe this: https://en.wikipedia.org/wiki/Module:WikidataCheck
[14:35:43] Maybe; I don't know how it deals with several values
[14:36:18] And, once you get them, you can use https://www.wikidata.org/wiki/Module:String#find in your {{#ifeq: and avoid {{#switch:
[14:41:02] Rehman: Does Module:WikidataCheck work for you?
[14:41:38] Unfortunately no :[
[14:43:48] And [[Module:Wikidata]]?
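The check Rehman wants above ("is any P31 value of this item Q6558431?") is straightforward once you have the entity JSON; the wikitext/Lua side is just plumbing around it. A sketch of the logic in Python against the standard Wikibase entity JSON shape (claims → P31 → mainsnak → datavalue → value → id); the function name and the sample below are illustrative:

```python
# Sketch: check whether any P31 (instance of) statement on an entity
# points at a target item, using the Wikibase entity JSON structure.
def has_instance_of(entity, target_qid, prop="P31"):
    """True if any `prop` claim on the entity has `target_qid` as its value.

    `entity` is a dict in the shape returned by wbgetentities:
    entity["claims"][prop] is a list of statements, each with a mainsnak."""
    for claim in entity.get("claims", {}).get(prop, []):
        snak = claim.get("mainsnak", {})
        if snak.get("snaktype") != "value":
            continue  # skip "novalue"/"somevalue" snaks, which carry no item
        value = snak.get("datavalue", {}).get("value", {})
        if value.get("id") == target_qid:
            return True
    return False
```

Iterating over the whole claim list is what "separately get and check every P31 value" comes down to: an item can be an instance of several classes, so checking only the first statement would misclassify multi-valued items.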
[14:43:48] https://www.wikidata.org/wiki/Module:Wikidata
[15:02:22] Technical Advice IRC meeting starting in 60 minutes in channel #wikimedia-tech, hosts: @milimetric & @CFisch_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[15:05:21] hello, what do I do if a Wikidata item (a library) has physically moved buildings?
[15:29:22] sophietheopossum: you could create a statement for the current location with the rank "preferred" and the qualifier "start date", and another one with the rank "normal" and the qualifier "end date" for the previous location
[15:29:44] if you are not familiar with ranks: https://www.wikidata.org/wiki/Help:Ranking
[15:32:03] hmmm, alright, I'll give it a go now
[15:51:27] Technical Advice IRC meeting starting in 10 minutes in channel #wikimedia-tech, hosts: @milimetric & @CFisch_WMDE - all questions welcome, more info: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[17:31:31] How do I query all Wikidata properties?
[17:46:13] sec^nd: you can find various tools here: https://www.wikidata.org/wiki/Wikidata:List_of_properties
[17:48:38] That doesn't give me a count though
[17:50:04] sec^nd: https://grafana.wikimedia.org/d/000000167/wikidata-datamodel?refresh=30m&orgId=1
[17:50:27] 5944 as we speak :)
[19:59:09] Thanks Auregann_WMDE
[22:47:24] good night
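Besides the Grafana dashboard linked above, the property count can also be obtained directly from the Wikidata Query Service, since every property is typed `wikibase:Property` in the RDF model. A sketch of the query and the GET URL construction; sending the request (with any HTTP client) is left out:

```python
# Sketch: count all Wikidata properties via the query service (WDQS).
# Properties are instances of wikibase:Property in the RDF export.
import urllib.parse

WDQS = "https://query.wikidata.org/sparql"

COUNT_PROPERTIES = """\
PREFIX wikibase: <http://wikiba.se/ontology#>
SELECT (COUNT(?p) AS ?count)
WHERE { ?p a wikibase:Property . }
"""

def wdqs_url(query):
    """Build a GET URL for the query service, asking for JSON results."""
    return WDQS + "?" + urllib.parse.urlencode({"query": query,
                                                "format": "json"})
```

Fetching `wdqs_url(COUNT_PROPERTIES)` returns one binding with the live count (5944 at the time of the conversation above).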