[01:24:05] hare: any idea how it does the relevance scoring?
[01:24:40] incidentally, is there any way to reproduce the sparql endpoint on my own machine (to avoid unnecessary load for wikidata)?
[08:34:34] PROBLEM - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1971 bytes in 0.078 second response time
[09:05:37] Is it possible to calculate the distance between two nodes in the wikidata graph with sparql?
[09:24:45] RECOVERY - wikidata.org dispatch lag is higher than 300s on www.wikidata.org is OK: HTTP OK: HTTP/1.1 200 OK - 1953 bytes in 0.088 second response time
[09:39:18] Lydia_WMDE: Did you see https://twitter.com/jrvosse/status/1001434942377734145 ? Maybe good for the weekly newsletter?
[09:45:36] multichill: I don't see Wikidata mentioned?
I try to keep a direct connection with Wikidata when I add something to the newsletter
[09:51:18] Auregann_WMDE: Try actually opening the paper
[09:51:26] It's slightly hidden
[10:03:45] Auregann_WMDE: The whole last part is about Wikidata
[10:46:00] we have some vandalism here https://www.wikidata.org/w/index.php?title=Wikidata:How_to_use_data_on_Wikimedia_projects&action=history can someone have a look?
[10:52:30] Auregann_WMDE: https://www.wikidata.org/w/index.php?title=Wikidata%3AHow_to_use_data_on_Wikimedia_projects&type=revision&diff=688307433&oldid=686842405
[10:52:42] If you undo vandalism, make sure you undo all edits ;-)
[10:53:06] thanks, I didn't have a lot of time to look at it
[10:55:14] https://www.wikidata.org/wiki/Special:Contributions/2600:387:1:0:0:0:0:0/48 seems to be a source of garbage Auregann_WMDE
[10:55:52] yeah, thanks for blocking
[11:42:28] multichill: thanks for the link :) will have a look too
[11:44:49] Lydia_WMDE: Any plans to embed GeoSPARQL as a new data type?
[11:44:56] Or are we going to stick to the Commons trick?
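An aside on the earlier question about calculating the distance between two nodes with SPARQL: SPARQL 1.1 property paths (e.g. `wdt:P279*`) can test reachability but do not report path length, so the usual fallbacks are per-hop queries or a client-side search over fetched edges. A minimal breadth-first sketch over a toy graph; the Q-ids and edges below are illustrative, not real data:

```python
from collections import deque

def distance(graph, start, goal):
    """BFS shortest-path length between two nodes; None if unreachable."""
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt == goal:
                return depth + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Toy, hypothetical slice of a P279-style hierarchy (directed edges).
graph = {
    "Q144": ["Q39201"],   # dog -> pet
    "Q39201": ["Q729"],   # pet -> animal
}
```

Note that the edges are directed, so `distance(graph, "Q729", "Q144")` is `None`; an undirected distance would need the reverse edges added as well.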
[11:47:07] I'm asking because I found a geojson set under cc0 of every building in the Netherlands :-)
[11:47:33] multichill: is that where you embed the geoshape in the statement itself?
[11:47:42] Yep
[11:47:59] we said no to that until further notice because they'd be too huge, especially for countries etc
[11:48:05] POLYGON ((4.636409293306906 52.38113315352825, .....4.636409293306906 52.38113315352825))"^^
[11:48:17] That was my assumption
[11:48:21] yeah :/
[11:48:45] Let's see how Commons likes it if I put 80,000 building shapes on there :P
[11:49:01] :D
[11:49:56] https://data.pdok.nl/sparql#query=select+%3Fa+%3Fb+%3Fc+%3Fd+%3Fe%0Awhere+%7B%0A++%23%3Chttp%3A%2F%2Fbag.basisregistraties.overheid.nl%2Fbag%2Fid%2Fpand%2F0392100000065734%3E+%3Fa+%3Fb+.%0A++%3Chttp%3A%2F%2Fbag.basisregistraties.overheid.nl%2Fbag%2Fid%2Fpand%2F0392100000065734%3E+%3Chttp%3A%2F%2Fwww.opengis.net%2Font%2Fgeosparql%23hasGeometry%3E+%3Fc+.%0A++%3Fc+%3Chttp%3A%2F%2Fwww.opengis.net%2Font%2Fgeosparql%23asWKT%3E+%3Fe+.%
[11:49:57] 0A%0A%7D+&contentTypeConstruct=text%2Fturtle&contentTypeSelect=application%2Fsparql-results%2Bjson&endpoint=%2Fsparql&requestMethod=POST&tabTitle=Gedefinieerde+klassen&headers=%7B%7D&outputFormat=leaflet
[11:49:59] Ah, too long
[11:50:17] https://tinyurl.com/ydaqkrap
[11:50:36] That's provided for every building
[11:51:47] (I think the housing market collapse of a couple of years ago actually triggered the Open Data, funny how problems can cause good things to happen)
[12:04:22] https://commons.wikimedia.org/wiki/Data:Haarlem/Grote_Kerk.map damn, that works out well
[13:04:51] does anyone know whether, if I remove a statement in Wikidata, mix'n'match will remove the match in its database?
[14:17:16] i see totally broken encoding in the lexeme screen's "language" and lexical category. Is that a known bug?
[14:30:14] hi
[14:30:22] is there a lexeme search anyhere?
[14:30:31] *anywhere
[14:33:54] aude: About?
[14:37:55] how can i search for lexemes?
[14:38:26] like https://www.wikidata.org/wiki/Lexeme:L12
[14:39:05] Not yet.
[14:39:45] okay, thanks :)
[14:55:49] Reedy: aude seems to be MIA for a while
[14:55:58] Yeah, that's what I thought :(
[15:04:19] Reedy: what's up?
[15:04:49] yurik: I think that is patched already, might need some cache purge?
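The recurring PROBLEM/RECOVERY lines earlier in the log come from an Icinga-style HTTP content check: the monitor fetches a status page and, even on an HTTP 200 response, reports CRITICAL when an expected pattern is missing from the body. A rough sketch of that classification logic; the function name and message format are illustrative, not the actual check plugin:

```python
def check_pattern(body: str, pattern: str):
    """Classify a fetched status page the way the alerts above do:
    an HTTP 200 response alone is not enough - the body must also
    contain the expected pattern, otherwise the check goes CRITICAL."""
    if pattern in body:
        return ("OK", "HTTP/1.1 200 OK - %d bytes" % len(body))
    return ("CRITICAL",
            "HTTP/1.1 200 OK - pattern not found - %d bytes" % len(body))
```

RECOVERY is then simply the state transition back to OK on a later poll once the pattern reappears.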
[15:22:27] addshore, possible - probably rendering cache
[15:22:34] addshore: Not a wikidata question :P
[15:30:04] yurik: Do you know if http://tinyurl.com/yd6vmjkk is compatible with Kartographer?
[15:30:41] It returns a POLYGON, but not sure if Kartographer is able to handle that
[15:31:31] multichill, might be tricky... sec
[15:32:02] i mean - it cannot be used with kartographer for sure, but it might be possible to use with graph
[15:34:18] Reedy: rubbish
[15:34:23] Reedy: you're in the wrong channel then ;)
[15:34:29] I was looking for aude specifically
[15:34:35] So tried pinging her in a public channel
[15:43:31] yurik: Or convert the POLYGRAPH to a bunch of points? That seems a bit hard
[16:27:15] multichill, not sure what you mean by POLYGRAPH. The example you gave above is in WKT format. There is a relatively easy way to convert it into GeoJSON. Once converted, it can be used by both graph and kartographer
[17:03:45] yurik: Sorry, POLYGON ((4.636409293306906 52.38113315352825 .....))
[17:04:02] What function to use in SPARQL to convert it?
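The conversion being asked about is mechanical for simple shapes: both WKT and GeoJSON use longitude-latitude order, so a single-ring POLYGON only needs reparsing. A hedged Python sketch covering just that simple case (no holes, no multipolygons; a real implementation would use something like `shapely.geometry.mapping`):

```python
import re

def wkt_polygon_to_geojson(wkt):
    """Convert a simple WKT POLYGON (one outer ring, no holes) to a
    GeoJSON geometry dict. Coordinate order (lon lat) is the same in
    both formats, so each pair is just split and cast to float."""
    m = re.match(r"\s*POLYGON\s*\(\((.+?)\)\)\s*$", wkt)
    if not m:
        raise ValueError("not a simple WKT POLYGON")
    ring = [[float(x), float(y)]
            for x, y in (pair.split() for pair in m.group(1).split(","))]
    return {"type": "Polygon", "coordinates": [ring]}

wkt = ("POLYGON ((4.636409293306906 52.38113315352825, 4.636 52.381, "
       "4.637 52.381, 4.636409293306906 52.38113315352825))")
geojson = wkt_polygon_to_geojson(wkt)
```

The coordinates here are taken from the example pasted above, padded out to a closed ring for illustration.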
[17:04:28] I don't think SPARQL can do the conversion (but it should be relatively easy to add it)
[17:06:00] i'm not sure if Lua can access wikidata query results - if it could, you could do it in lua
[17:06:36] SMalyshev, can this type of conversion be done? Seems that other SPARQL geo data also uses WKT, e.g. POINT()
[17:07:06] basically it's a simple static function that would convert the geo WKT representation into geojson
[17:13:39] yurik: probably possible, though I am not sure I'll have time in the near term to do it...
[17:13:54] yurik: but submit a task, or a patch :)
[17:14:14] i'll let multichill file a task :-P
[17:22:18] hi! does anyone know a translation tool for kana => Russian Cyrillic?
[17:25:26] Hi, Harmonia_Amanda :)
[17:25:41] Nope, but perhaps Amire80... (?) https://www.wikidata.org/wiki/User:Amire80
[17:26:18] someone added hundreds of wrong transliterations and I would love to correct them instead of deleting :s
[17:26:52] (never ever transliterate based only on the kanji, *always* take the kana into account too)
[17:27:14] :S
[17:28:07] I understand enough Cyrillic to know which are wrong, but not enough to add the correct ones :s
[17:29:13] (like Мураками is *not* a transliteration of むかみ. The correct transliteration would probably be Муками?)
[17:29:13] /join #mediawiki-i18n ?
:-/
[17:33:23] Anyway, I'm convinced Amire80 will give you a solution
[17:33:39] thank you :)
[17:35:01] Good luck
[17:35:10] And thanks for your work :D
[17:35:29] if you can find someone who understands russian, https://ru.wikipedia.org/wiki/%D0%A0%D1%83%D1%81%D1%81%D0%BA%D0%B8%D0%B5_%D1%82%D1%80%D0%B0%D0%BD%D1%81%D0%BA%D1%80%D0%B8%D0%BF%D1%86%D0%B8%D0%B8_%D0%B4%D0%BB%D1%8F_%D1%8F%D0%BF%D0%BE%D0%BD%D1%81%D0%BA%D0%BE%D0%B3%D0%BE_%D1%8F%D0%B7%D1%8B%D0%BA%D0%B0 seems to explain it
[17:35:44] nikki: … I'm so stupid
[17:35:54] of course there would be a Wikipedia article
[17:35:59] ><
[17:36:30] I was so confused because there wasn't one in english
[17:36:31] abian: the funny thing is that I thought that I already cleaned Japanese names
[17:36:37] which I did
[17:36:53] I just didn't think people would then add mistakes :p
[17:37:38] They do it every day :(
[17:38:00] Ahem, and appliying uppercase to labels and descriptions, ahem
[17:38:05] *applying
[17:38:47] You revert, but a different contributor comes and does the same, and again...
[17:39:14] oh, uppercase, it's a fight ongoinq ><
[17:39:18] ongoing*
[17:39:28] Auregann_WMDE or Lydia_WMDE, are you around (nothing urgent)?
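The transliteration point made above (Мураками matches むらかみ, while むかみ alone would give Муками) can be reproduced with a toy converter. This is only a four-kana slice of the Polivanov kana-to-Cyrillic table; a real converter needs the full syllabary plus digraphs, long vowels and gemination:

```python
# Four-entry slice of the Polivanov (hiragana -> Russian Cyrillic) table.
KANA_TO_CYRILLIC = {"む": "му", "ら": "ра", "か": "ка", "み": "ми"}

def transliterate(kana: str) -> str:
    """Kana-by-kana transliteration, capitalized as a personal name."""
    return "".join(KANA_TO_CYRILLIC[k] for k in kana).capitalize()
```

This is why transliterating from the kanji alone fails: the same surname kanji can have several readings, and only the kana tell you which one applies.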
[17:39:34] Harmonia_Amanda: Yes :/
[17:39:50] I thought I even had an automated message for copying to each newbie page but I think I deleted the subpage
[17:40:16] every time I spotted someone doing that, I left a message
[17:40:25] so I hoped they would not do it again…
[17:41:02] but at least it's not plain wrong
[17:41:13] Yes, but that warning should be a part of the UI, that would avoid all the later work
[17:41:21] sometimes, people do things and I have absolutely no idea why
[17:41:35] http://cgi.geocities.jp/p451640/php/txt_cyl.php this seems to be a converter
[17:42:21] (I understand the Russian transliteration errors, they are the most frequent ones associated with the kanji, but we also have the less frequent family names, so they are wrong in most cases)
[17:42:37] (at least I entered むかみ and it gave me Муками)
[17:42:40] so, it's wrong, but I understand where it comes from
[17:42:47] nikki: thank you!
[17:43:01] sometimes, I really don't understand
[17:43:16] :)
[17:43:24] like the guy who added "German" as value of P282 in more than a dozen items
[17:43:37] does a language and a writing system look the same to you?
[17:43:58] I could understand if it was "Latin" instead of Latin script, but German?
[17:44:09] I wonder
[17:44:30] I can understand that
[17:44:57] They may associate that idea with the (German) alphabet
[17:45:33] most westerners don't know much about the script they write their language with, they only know it as the language's alphabet
[17:46:09] nikki: we have an item "German alphabet"
[17:46:52] but that doesn't appear when you type "german"
[17:46:58] (in english at least)
[17:47:18] hmm, so it would be a search error
[17:47:29] like people adding music albums as given names
[17:53:08] I think that's slightly different... people selecting the language item probably do think that's what's being asked for (i.e.
it's written using the writing system for the german language), but people who select an album for a given name either didn't realise they picked the wrong thing, or didn't care and just wanted anything with the right label
[17:54:27] P282 is explicitly asking for a *writing system*, not a language
[17:56:24] yes, but you need to understand what the distinction is
[17:56:34] and lots of people don't
[17:57:00] maybe
[17:57:19] so it would be akin to people adding the name in religion as the value for "given name" :p
[17:57:30] misunderstanding the property :p
[17:58:32] (in fact most of those are from semi-automatically added given names, based on the title of the Wikipedia article)
[18:30:33] Hey, this is ArthurPSmith - I wonder if anybody on here can help? I'm on a cell phone network temporarily and it seems to have been blocked, so I can't edit or contact anybody on wiki...
[18:31:38] https://thepasteb.in/p/zmh8yzWGgkXfZ
[18:33:15] I don't use this network very often, but it's never had a problem like this before...
[18:33:37] !admin if anybody on?
[18:33:37] Attention requested HakanIST sjoerddebruin revi
[18:34:09] I didn't realize IP blocks also block logged-in user accounts? That seems odd
[18:34:30] apsmith: here is the user who added that https://meta.wikimedia.org/wiki/User:Matiia
[18:34:43] Yes indeed but I have no way to contact them.
[18:34:47] @seen matiia
[18:34:56] they say they are "on a lot of channels"
[18:35:06] apparently this isn't one of them though.. hold on
[18:35:53] 14:35 < wm-bot> mutante: Last time I saw Matiia they were joining the channel, but they are not in the channel now and I don't know why, in #wikimedia-overflow at 3/12/2018 6:36:01 AM (81d11h59m38s ago)
[18:36:00] hrmm
[18:37:25] I have to shut down here for a few minutes, will be back online shortly...
[18:37:27] apsmith: here's what you can do. send a mail to OTRS.
the user says they are an OTRS agent
[17:37:31] https://meta.wikimedia.org/wiki/OTRS
[17:37:33] via email
[17:37:39] I can't send email on-wiki
[17:37:44] off wiki
[17:37:46] due to the block
[17:37:49] oh - ok thanks
[17:38:46] apsmith: https://en.wikipedia.org/wiki/Wikipedia:Contact_OTRS
[18:38:53] apsmith: I could set the ip block exempt flag for your account on wikidata if that would help
[18:43:41] apsmith: if you missed what I said, I said I could set the ip block exempt flag for your account on wikidata if that would help
[18:44:18] nikki: yes I missed that, can you do that?
[18:45:00] done
[18:45:18] it'll expire in a week, hopefully that's enough time to resolve the problem
[18:45:22] It works, thanks!
[18:45:57] :)
[18:46:13] Ah - ok, I'll ask about it. This is a generic US AT&T network, I'm rather surprised at the block.
[19:10:18] https://www.wikidata.org/wiki/Special:Contributions/2600:387:1:0:0:0:0:0/48 <- this range?
[19:17:44] SMalyshev: https://www.wikidata.org/w/index.php?title=Wikidata%3ASPARQL_query_service%2FFederation_report&type=revision&diff=686075332&oldid=682038977 seems to work and found another open sparql endpoint
[19:17:58] I think the number of sparql endpoints might increase a lot in the coming year
[19:41:06] Harmonia_Amanda: https://www.wikidata.org/wiki/Wikidata:WikiProject_sum_of_all_paintings/Collection/mus%C3%A9e_d%27art_et_d%27histoire_de_Saint-Brieuc the painters seem to be missing from your import here
[21:16:32] multichill: add new ones to the federation requests page, I review it periodically
[21:19:56] Of course, already done for the one I got
[21:42:35] multichill: I would need to go back to the data but I think it was the cases where we didn't have the information in the dataset
[23:37:11] hello. anyone still alive? (:
[23:37:38] yes
[23:37:57] nice
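A footnote on the /48 range block discussed above: an IPv6 /48 covers 2^80 addresses, which is why an entire carrier network (and, absent the exempt flag, logged-in users on it) can be caught at once. Checking whether a given address falls inside the range is straightforward with the Python standard library:

```python
import ipaddress

# The range from the Special:Contributions link above;
# "2600:387:1:0:0:0:0:0/48" normalizes to this form.
BLOCKED = ipaddress.ip_network("2600:387:1::/48")

def is_in_blocked_range(addr: str) -> bool:
    """True if `addr` falls inside the blocked /48 range."""
    return ipaddress.ip_address(addr) in BLOCKED
```

Only the first 48 bits are compared, so `2600:387:1::1` is inside the range while `2600:387:2::1` is not.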