[00:00:12] but disabling, it's not broken [00:02:10] maybe requires running jobs [00:07:23] aude: when you can recreate locally, I think https://gerrit.wikimedia.org/r/#/c/313680/ will fix it. It would also need testing with something that causes a dberror to see if that still works right too. [00:10:11] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [00:14:20] ok [00:15:13] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [00:40:17] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [00:42:47] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [02:10:54] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [02:13:43] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [02:41:25] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [02:46:46] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [03:08:16] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [03:13:36] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [03:20:40] !ops hola alvaro molina [03:43:37] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [03:46:07] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [03:46:42] AlexZ: hhm [03:47:03] Does icinga-wm_ need to be making a racket in here? 
[03:47:07] :p [03:47:17] I'd leave it [03:47:26] WDQS is a Wikidata thing [03:47:36] It's going to keep triggering [03:47:53] so if you can live with the pings.. [04:11:06] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [04:16:00] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [04:43:35] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [04:48:29] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [05:16:01] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [05:21:01] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [05:46:16] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [05:51:08] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [06:16:24] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [06:23:57] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [06:54:32] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [06:59:42] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [07:30:23] PROBLEM - Response time of WDQS on wdqs1002 is CRITICAL: CRITICAL: 10.00% of data above the critical threshold [300000.0] [07:33:03] RECOVERY - Response time of WDQS on wdqs1002 is OK: OK: Less than 5.00% above the threshold [120000.0] [07:34:00] :) [07:53:57] aude: Nice talk page @ https://www.wikidata.org/wiki/User_talk:Aude 
[V-C84gpAAEUAARIU2PoAAABK] 2016-10-02 07:53:06: Fatal exception of type "Flow\Exception\InvalidInputException" [07:54:41] https://phabricator.wikimedia.org/T147122 [08:03:39] multichill: what edit rate is BotMultichill set up with? [08:04:01] I think I have seen edits every second [08:04:12] in wikidata [08:11:40] Just the normal pywikibot. So it backs off when the replag goes up [08:18:50] doctaxon: It generally seems to be at around 10-15 edits per minute [08:19:38] multichill: I saw edits in recent changes every second yesterday evening [08:19:50] evening UTC [08:23:06] multichill: SuccuBot actually edits twice per second [08:23:33] how much is allowed with the flooder flag? [08:26:29] Dunno, never used it, bots have a bot flag [08:29:55] multichill: yes, that was my question: is there an edit rate maximum on Wikidata for bot users with the bot flag? [08:30:24] you are an admin there, so I'm asking you [08:32:44] https://meta.wikimedia.org/wiki/Bot_policy#Edit_throttle_and_peak_hours & https://www.wikidata.org/wiki/Wikidata:Bots & https://www.mediawiki.org/wiki/API:Etiquette are the relevant pages. [08:32:54] No hard limit, just don't go too fast and respect replag [08:33:20] We've blocked bots in the past that were going too fast. Several edits per second [08:34:28] ah okay multichill, yes - my maxlag is set to 5 - that's okay [08:37:54] multichill: https://www.wikidata.org/w/index.php?title=Q2027954&type=revision&diff=383119869&oldid=205011862 [08:43:21] sjoerddebruin: nameguzzler forgets the entity description [08:43:58] ? [08:46:28] ya, you only inserted the entity label [08:47:21] Yes, that's what the script was designed for. [08:47:31] Strange, sjoerddebruin. Could you have a look at whether it has to do with https://nl.wikipedia.org/wiki/Categorie:Hoogleraar_aan_de_Katholieke_Universiteit_Leuven?
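[Editor's note] The maxlag=5 that doctaxon mentions is the standard MediaWiki API throttle: the client sends a maxlag parameter with each request, and the server refuses the request with a "maxlag" error when database replication lag exceeds that many seconds. A minimal sketch of a polite client, assuming the Wikidata endpoint; the fixed 5-second pause is illustrative (real bots should honor the Retry-After header):

```python
import json
import time
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"

def needs_maxlag_retry(response):
    """True when the API reports the 'maxlag' error, i.e. the database
    replicas are lagging and the bot should back off."""
    return response.get("error", {}).get("code") == "maxlag"

def api_get(params, max_retries=5, wait=5):
    """GET from the API with maxlag=5 so the servers can ask us to slow
    down; pause and retry whenever they do."""
    query = dict(params, maxlag=5, format="json")
    url = API + "?" + urllib.parse.urlencode(query)
    for _ in range(max_retries):
        with urllib.request.urlopen(url) as resp:
            data = json.loads(resp.read().decode("utf-8"))
        if needs_maxlag_retry(data):
            time.sleep(wait)  # illustrative fixed pause; prefer Retry-After
            continue
        return data
    raise RuntimeError("replicas lagged for too long")
```

This is why "respect replag" and a maxlag of 5 amount to the same thing: the server enforces the slowdown, the bot just has to retry politely.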
[08:47:43] multichill: I'll check in a moment whether there are more [08:47:52] Could be that you went a bit too deep ;) [08:47:54] then it's time to make the script better [08:48:33] Well, feel free to improve it. Perfect descriptions are really hard. [08:48:43] yes it is [08:48:53] And if I add occupation and country, a bot will eventually add a Dutch description. [08:51:51] multichill: 5 others, wrongly marked by Gerard in 2014. [08:53:01] * doctaxon loves Wikidata more and more every second [11:00:08] At least a homosexual Muppet got a source for his sexuality. https://www.wikidata.org/wiki/Q7356093 [11:24:34] Impressive. https://www.wikidata.org/w/index.php?title=Q26761745&type=revision&diff=372720138&oldid=372719940 [12:50:58] phabricator is inconsistent in the order of the tags and subscribers fields: it has tags first on the create task page but subscribers first on the search page >_< [13:04:59] nikki: file a bug, ask for permission to fix the UI, and if greenlit, submit a patch [13:05:04] secure.phabricator.com is the tracker [13:05:28] but they don't like third-party contributions as a general rule [13:05:48] I imagine that if you wish to fix the UI here, that's a low-cost maintenance change, so it would be acceptable [13:06:15] I don't particularly want to fix it myself, I have plenty of other things to do already :P [13:07:15] oh sorry, I didn't know; I was under the strange assumption that you had already fixed it in your local Phabricator installation and hadn't seen it on the Wikimedia tracker [13:07:26] :P [13:07:27] that's better, they are pretty responsive for small fixes [13:12:18] Is there a guideline on Wikidata equivalent to Wikipedia's for semi-bot editing? I have a dataset that I want to merge with Wikidata as a one-time thing. Should I submit it as a bot proposal? [13:13:44] chewieQC: I think you can request the flood flag for that. https://www.wikidata.org/wiki/Wikidata:Flooders [13:15:36] sjoerddebruin: It seems to be exactly what I needed, thanks! [14:42:52] Hi!
What value do I have to use to replace the value of property P2566 with the wbsetclaimvalue API? [14:43:45] is anybody here with some API knowledge? [14:44:01] doctaxon: Use a framework that talks to the api? [14:44:01] yurik: maybe you? [14:44:34] multichill: I have the example: api.php?action=wbsetclaimvalue&claim=q42$D8404CDA-25E4-4334-AF13-A3290BCD9C0F&snaktype=value&value={"entity-type":"item","numeric-id":1}&token=foobar&baserevid=7201010 [14:44:50] and there is a value to set [14:45:05] Back doctaxon, what are you trying to do and why are you talking to the api directly? [14:45:31] because I am using a flooder account [14:46:02] let me explain: [14:46:07] please do [14:49:55] multichill: as an example: entity Q408432, property P2566, its value is wrong and I want to change it using wbsetclaimvalue to the right value "100.006.474" [14:50:57] {"snaktype":"value","property":"P2566","datavalue":{"value":"100.006.474","type":"string"},"datatype":"external-id"} [14:51:30] Let's rephrase that: as an example, entity Q408432, property P2566, its value is wrong and I want to change it to the right value "100.006.474" [14:52:02] yes, that's right [14:52:15] In one of the frameworks like pywikibot or the Java thingy that's just a couple of lines of code [14:52:36] but I have the framework [14:52:39] And it takes care of everything. If you are going to talk to the api directly you have to implement all of that yourself [14:52:47] Which one? [14:52:56] tcl framework [14:53:10] Link? [14:53:13] I only have to know the json for the value [14:53:33] It's a string [14:53:50] "type":"string" [14:54:14] "type":"string","value":"100.006.474" [14:54:34] but this does not work [14:55:14] I only have this example, which doesn't match my case: https://www.wikidata.org/w/api.php?action=help&modules=wbsetclaimvalue [14:55:50] it states something like this: value={"entity-type":"item","numeric-id":1} [14:56:08] what is that? [14:57:25] That's item Q1 [14:57:28] The universe [14:57:41] but what item do I need?
[14:58:11] The data type is external-id so it's a string [14:58:25] and the json? [14:58:54] Just do some edits in your browser with the debug console open [14:58:55] "entity-type":"external-d" ? [14:59:02] Then you'll see all the api requests and answers [14:59:37] my trials give the error: {"servedby":"mw1233","error":{"code":"invalid-snak","info":"Invalid snak. (Can only construct StringValue from strings)","messages":[{"name":"wikibase-api-invalid-snak","parameters":[],"html":{"*":"Invalid snak."}}],"*":"See https://www.wikidata.org/w/api.php for API usage"}} [15:01:22] multichill: can you give me the URL for such a replace? that would help me a lot [15:04:09] multichill: this doesn't work either: "entity-type":"external-id","value":"100.242.450" [15:07:21] multichill: can you give me the URL for such a replace? that would help me a lot [15:18:31] multichill: I got it, thank you for the help [15:47:28] good luck [15:48:11] sjoerddebruin: I run into quite a few items like https://www.wikidata.org/w/index.php?title=Q21552930&oldid=276376461 . Do you happen to be working on these already? [15:48:28] Could do a sparql query now that we have the number of statements in there :-) [15:49:22] multichill: I'm working on UGent first, then I will focus on the Dutch Senate I think. [15:49:47] Ok. I just move around a bit. Did a huge RKDartists clean-up this weekend [15:50:39] It's so annoying that a lot of sites don't include simple stuff like genders. [15:51:22] Parlement.com for example: 444 items without gender. [16:39:04] I have noticed that some metro stations have "metro station" in their name, for example https://www.wikidata.org/wiki/Q2331863 [16:39:59] timmy: yeah, that's because that was the title of the page on the English Wikipedia. [16:40:08] It can be removed imo. [16:40:46] So, is it ok to remove all such occurrences that I find? [16:41:22] Yeah, if it is noted in the description it is not needed in the label. [16:42:10] Thanks, I'll try.
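[Editor's note] The log never spells out what doctaxon "got", but the usual resolution is that for a string-valued datatype such as external-id, the value parameter of wbsetclaimvalue must be a JSON-encoded string (quotes included), not an {"entity-type": ...} object; that object form is only for item-valued properties. A sketch of building such a request, with a hypothetical claim GUID (the real one comes from wbgetclaims):

```python
import json
import urllib.parse

# Hypothetical GUID for the Q408432/P2566 statement; fetch the real one
# with action=wbgetclaims&entity=Q408432&property=P2566.
claim_guid = "Q408432$XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX"

params = {
    "action": "wbsetclaimvalue",
    "claim": claim_guid,
    "snaktype": "value",
    # For string datatypes the value is a JSON string literal, quotes and all,
    # which is exactly what json.dumps produces:
    "value": json.dumps("100.006.474"),
    "format": "json",
    # A real edit also needs a CSRF token (and ideally a baserevid), omitted here.
}

query_string = urllib.parse.urlencode(params)
```

Passing the bare text without JSON quoting is what triggers the "Can only construct StringValue from strings" error seen above.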
[16:43:38] though in some cases, "Station" may be part of the proper name [16:50:10] but how do I find out whether that is the case? [16:52:34] Reading sources. [18:21:10] timmy: if things like that are not capitalised (for English, at least), it's usually just descriptive and not part of the name; if it is capitalised, I would recommend checking more carefully [18:22:46] sitelinks for other languages can be useful too, since enwiki tends to use a descriptive page name where others often treat it like any other disambiguation (you can see that in your example :)) [19:41:52] !ops hola alvaro molina
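[Editor's note] The SPARQL idea multichill floats above (finding barely-filled items such as people missing a gender, now that WDQS exposes a per-item statement count) could look roughly like the sketch below. The query was not actually run in the log; the thresholds and the restriction to humans are illustrative, and it would be pasted into https://query.wikidata.org/

```python
# Sketch of the suggested query: humans with very few statements and no
# P21 (sex or gender), using the wikibase:statements count on each item.
query = """
SELECT ?item ?statements WHERE {
  ?item wdt:P31 wd:Q5 ;                    # instance of: human
        wikibase:statements ?statements .
  FILTER NOT EXISTS { ?item wdt:P21 [] }   # missing sex or gender
  FILTER(?statements < 3)                  # barely-filled items
}
LIMIT 100
"""
```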