[00:03:15] RECOVERY - High lag on wdqs1004 is OK: OK: Less than 30.00% above the threshold [600.0]
[00:04:15] RECOVERY - High lag on wdqs1005 is OK: OK: Less than 30.00% above the threshold [600.0]
[00:11:46] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[01:27:05] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 31.03% of data above the critical threshold [1800.0]
[01:27:35] PROBLEM - High lag on wdqs1005 is CRITICAL: CRITICAL: 37.93% of data above the critical threshold [1800.0]
[01:29:35] PROBLEM - High lag on wdqs1004 is CRITICAL: CRITICAL: 34.48% of data above the critical threshold [1800.0]
[03:39:56] RECOVERY - High lag on wdqs1004 is OK: OK: Less than 30.00% above the threshold [600.0]
[03:39:56] RECOVERY - High lag on wdqs1005 is OK: OK: Less than 30.00% above the threshold [600.0]
[03:45:35] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[03:49:05] Jhs: thank you.
[06:59:30] thanks quiddity
[06:59:39] just posted another one
[08:52:13] btw, is this the appropriate way to get in touch with the people behind WikiDataMovieDB?
[08:54:10] pere: It's the most appropriate we have, I guess
[08:54:28] https://www.wikidata.org/wiki/Special:EmailUser/WikiDataMovieDB does not work, and they don't have any other contact information stated
[09:02:36] I do not seem to get quick replies. several questions have been without an answer for months.
[09:03:39] :/
[09:35:15] pere, i got it working finally. am importing the missing Internet Archive IDs as we speak (see https://www.wikidata.org/wiki/Special:Contributions/Jon_Harald_S%C3%B8by )
[09:42:08] I am wondering why so few people endorse https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact
[09:42:57] in particular, there is no endorsement from the Wikidata team - any idea why?
[09:44:08] no concerns have been raised on the talk page, so it's hard to figure out what is going on
[09:56:55] pere, done
[10:22:14] "listToTruth" … that's a pretty bold claim for a Lua function :D
[10:24:16] you doubt the listToTruth function? heresy!
[10:57:52] Jhs: thank you very much. I got 196 new entries in my data set. :)
[11:08:41] Jhs: btw, did you both add and correct incorrect/outdated links, or just add?
[11:39:51] pere, just add. items that already had the property were ignored
[11:40:57] right. I've fixed a few typos in the links over the last few days, hope they will be synced some time too.
[15:37:04] PROBLEM - High lag on wdqs1003 is CRITICAL: CRITICAL: 33.33% of data above the critical threshold [1800.0]
[15:37:34] PROBLEM - High lag on wdqs1004 is CRITICAL: CRITICAL: 34.48% of data above the critical threshold [1800.0]
[15:41:04] PROBLEM - High lag on wdqs1005 is CRITICAL: CRITICAL: 36.67% of data above the critical threshold [1800.0]
[16:14:23] PROBLEM - Check systemd state on wdqs2003 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[16:14:24] PROBLEM - Check systemd state on wdqs1005 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[16:14:25] PROBLEM - Check systemd state on wdqs1003 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[16:14:33] PROBLEM - Check systemd state on wdqs2001 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[16:14:44] PROBLEM - Check systemd state on wdqs2002 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[16:14:56] PROBLEM - Check systemd state on wdqs1004 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
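The "High lag" checks above measure how far each wdqs host's copy of the data trails behind Wikidata itself. The same number can be read off a WDQS endpoint directly; a minimal sketch, assuming the standard prefixes that query.wikidata.org predefines:

    # schema:dateModified on the wikidata.org node is the timestamp of the
    # last update applied to this server's copy of the data; its distance
    # from NOW() is the replication lag. On Blazegraph, subtracting two
    # dateTimes is expected to yield the difference in days.
    SELECT ?updated ((NOW() - ?updated) AS ?lagInDays) WHERE {
      <http://www.wikidata.org> schema:dateModified ?updated .
    }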
[16:16:13] Jonas_WMDE, Nice :) It would be good to have those onwiki somewhere. (I'm not sure where, maybe the "gentle introduction" page? https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/Wikidata_Query_Help )
[16:18:15] quiddity yeah sure! feel free to add them, they are already on Commons
[16:20:15] PROBLEM - puppet last run on wdqs2003 is CRITICAL: CRITICAL: Puppet has 1 failures. Last run 3 minutes ago with 1 failures. Failed resources (up to 3 shown): Service[wdqs-updater]
[16:23:04] PROBLEM - puppet last run on wdqs1004 is CRITICAL: CRITICAL: Puppet has 1 failures. Last run 5 minutes ago with 1 failures. Failed resources (up to 3 shown): Service[wdqs-updater]
[16:39:25] RECOVERY - Check systemd state on wdqs1005 is OK: OK - running: The system is fully operational
[16:39:25] RECOVERY - Check systemd state on wdqs2003 is OK: OK - running: The system is fully operational
[16:39:33] RECOVERY - Check systemd state on wdqs1003 is OK: OK - running: The system is fully operational
[16:39:43] RECOVERY - Check systemd state on wdqs2001 is OK: OK - running: The system is fully operational
[16:39:54] RECOVERY - Check systemd state on wdqs2002 is OK: OK - running: The system is fully operational
[16:39:55] RECOVERY - Check systemd state on wdqs1004 is OK: OK - running: The system is fully operational
[16:48:03] RECOVERY - puppet last run on wdqs1004 is OK: OK: Puppet is currently enabled, last run 1 minute ago with 0 failures
[16:50:13] RECOVERY - puppet last run on wdqs2003 is OK: OK: Puppet is currently enabled, last run 3 minutes ago with 0 failures
[17:10:24] RECOVERY - High lag on wdqs1005 is OK: OK: Less than 30.00% above the threshold [600.0]
[17:12:53] RECOVERY - High lag on wdqs1004 is OK: OK: Less than 30.00% above the threshold [600.0]
[17:13:24] RECOVERY - High lag on wdqs1003 is OK: OK: Less than 30.00% above the threshold [600.0]
[17:14:21] hi everyone
[17:14:35] i have a question about the name of a person
[17:15:01] i'm looking to import a medium-sized dataset of authors (around 10,000)
[17:15:26] and was looking to use QuickStatements, since i have already checked that they don't exist in wikidata
[17:16:17] the problem is that properties like given name and family name are not strings, and i have strings in my original dataset.
[17:17:06] rbarbano: have you tried using OpenRefine for that?
[17:17:20] OpenRefine?
[17:17:25] i don't know that tool
[17:17:39] http://openrefine.org/
[17:18:03] going.
[17:18:08] it is relatively convenient to use when you need to match multiple columns of the same dataset to Wikidata
[17:18:49] thank you pintoch
[17:19:04] in fact, i'm looking to import the names as strings
[17:19:39] since it's a large dataset and i don't have the time to match every name with the corresponding entity
[17:19:41] in wd.
[17:19:59] yeah, but OpenRefine actually lets you do that semi-automatically
[17:20:20] I expect it should be relatively straightforward for first names and family names actually
[17:20:47] (you can also try matching your authors against wikidata just to make sure they are all new…)
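The matching pintoch is describing amounts to resolving each name string to the Wikidata item for that name, which is what OpenRefine's reconciliation semi-automates. A rough sketch of the underlying lookup as a WDQS query, assuming the name items carry Spanish labels; "Juan" is only a placeholder:

    # Q202444 is "given name"; male and female given names are subclasses,
    # hence the P31/P279* path. For surnames, use Q101352 ("family name").
    SELECT ?name WHERE {
      ?name wdt:P31/wdt:P279* wd:Q202444 ;
            rdfs:label "Juan"@es .
    }

The item this returns, rather than the raw string, is what a QuickStatements batch would put into given name (P735) or family name (P734).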
[17:21:02] (how did you make sure that none of them were in Wikidata yet?)
[17:21:25] whenever an author gets created it's searched for in wikidata
[17:21:55] manually ?
[17:22:00] yep
[17:22:22] impressive :) what dataset is it?
[17:22:39] hehe
[17:22:46] you can check it here
[17:23:07] http://autores.uy/sites/default/files/datos/autores.csv.zip
[17:23:22] thanks
[17:23:30] i warn you that it's the full dataset, not one filtered against what already exists in wd
[17:25:38] I see
[17:27:05] do you think autores.uy is enough to establish notability for these people?
[17:27:26] (I can't judge)
[17:28:06] for the technical side of things, this dataset looks like a textbook example of what OpenRefine is good at, so it's definitely doable :)
[17:28:12] autores.uy is a database that collects information from reference works and library datasets, like the national library of Uruguay, etc.
[17:28:47] meaning that every person there is an author referenced in some source
[17:29:01] hehe, thanks pintoch
[17:29:04] ideally it would be nice to be able to link to that source
[17:29:10] i'll look into that software
[17:29:25] because http://autores.uy/autor/2697 looks a bit bare… :-/
[17:29:46] yes, you're right
[17:29:51] (but I've seen imports which were waaay worse…)
[17:30:08] the problem with the national library of Uruguay is that you can't link to the data itself
[17:30:13] you have to do a search
[17:30:21] and it returns a temporary link
[17:31:35] Thiemo_WMDE, DanielK_WMDE - I'm going to be out of the office on vacation for the next 12ish days, but if you can coordinate with Matthias and/or get me some more complete definitions of how to structure this work on entity lookups, I'd appreciate it.
[17:32:18] marktraceur: I'm also on vacation for 1 week + 1 day. :-)
[17:32:34] Thiemo_WMDE: Well, I guess that puts a pin in things for a while!
[17:38:55] "Entities using the military rank property should be instances of person or goat (or of a subclass of them), but Jack Bauer currently isn't."
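That closing constraint report really does say "person or goat": per the quoted message, the type constraint on military rank (P410) allows both. A hedged sketch of how one could hunt for such violations on the query service, assuming "person" maps here to Q5 (human) and taking Q2934 for goat:

    # Items using military rank (P410) that are not instances (or instances
    # of subclasses) of human or goat - the situation the report complains about.
    SELECT DISTINCT ?item WHERE {
      ?item wdt:P410 ?rank .
      FILTER NOT EXISTS {
        VALUES ?class { wd:Q5 wd:Q2934 }
        ?item wdt:P31/wdt:P279* ?class .
      }
    }
    LIMIT 10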