[06:31:05] PROBLEM - puppet last run on wdqs1006 is CRITICAL: CRITICAL: Puppet has 1 failures. Last run 5 minutes ago with 1 failures. Failed resources (up to 3 shown): File[/usr/local/bin/depool]
[06:58:07] RECOVERY - puppet last run on wdqs1006 is OK: OK: Puppet is currently enabled, last run 2 minutes ago with 0 failures
[08:08:10] Sv25
[08:08:25] ==
[16:54:43] judging from the lags, somebody is running a mass-editing bot again...
[16:57:42] I am seeing Harmonia Amanda at 73/m, Edoderoobot at 69/m and Hogu465 at 86/m... I guess it's a collective effort?
[17:12:28] 813gan has done 764 MB in the last three hours...
[17:12:45] editing country items
[17:19:26] PROBLEM - WDQS HTTP Port on wdqs1003 is CRITICAL: HTTP CRITICAL: HTTP/1.1 503 Service Unavailable - 649 bytes in 0.002 second response time
[17:21:26] PROBLEM - Check systemd state on wdqs1003 is CRITICAL: CRITICAL - degraded: The system is operational but one or more units failed.
[17:26:38] RECOVERY - WDQS HTTP Port on wdqs1003 is OK: HTTP OK: HTTP/1.1 200 OK - 448 bytes in 0.615 second response time
[17:27:10] RECOVERY - Check systemd state on wdqs1003 is OK: OK - running: The system is fully operational
[17:27:50] I am seeing the edit rate double compared to what it was 6 hrs ago on wikiscan.org
[18:41:14] can anyone add me to #wikidata-admin?
[18:47:37] sjoerddebruin: SMalyshev It's shocking to me that Wikidata can't handle mass imports...
[18:47:37] Is there any ticket tracking this issue?
[18:48:08] M813gan[m]: it can handle it if it's done right
[18:49:58] SMalyshev: what do you mean by right? Should I increase the delay between edits or something like that?
[18:50:20] M813gan[m]: yes, that would be a good idea
[18:54:10] SMalyshev: are there any plans to increase Wikibase performance? I created a reusable bot and I initially want to import everything from https://www.wikidata.org/wiki/Wikidata:WikiProject_Economics . With a bigger delay (which is already huge for me) it will take years...
[19:59:55] also, where is a good place for asking 'bigger' questions?
[20:15:17] M813gan[m]: broad matters can be discussed at https://www.wikidata.org/wiki/Wikidata:Project_chat
[20:16:35] M813gan[m]: I need more information - which data do you have?
[20:16:55] why would it take years?
[20:32:58] SMalyshev: Currently I'm getting data from the World Bank.
[20:32:59] Maybe I went too far with multiple years, I don't know what delay is needed. One import with the current throttle takes a few days
[20:33:45] M813gan[m]: that's what I am trying to figure out - what kind of data are you importing?
[20:34:22] a few days for an import sounds reasonable in general, but it depends on what it is
[20:35:33] macroeconomic data, P2299 at the moment
[20:36:36] SMalyshev: yes, but if I need to increase the delay 10 times it will be 'not ok'
[20:37:24] M813gan[m]: P2299 on which items?
[20:37:46] I understand it applies to countries? How many countries are there, a couple of hundred?
[20:38:15] nearly every country https://www.wikidata.org/wiki/Special:Contributions/813gan
[20:38:44] SMalyshev: about 180
[20:39:05] so, 180 edits shouldn't take too much time
[20:39:11] with any delay
[20:40:52] no :p
[20:40:53] now I'm importing data from the period 1990-2017
[20:40:53] some datasets start in 1960
[20:41:58] M813gan[m]: ok, so a good question to ask would be a) do we really need to keep so many years in Wikidata and b) can't you update the item once instead of updating it 20 times?
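For the "increase the delay between edits" advice in the 18:50 exchange above, and the throttle mentioned at 20:32, this is a minimal sketch of how the pacing is usually tuned when the bot is built on pywikibot (as 813gan's is); the account name and numbers below are purely illustrative, not values taken from the log:

    # user-config.py (pywikibot) -- illustrative values only
    family = 'wikidata'
    mylang = 'wikidata'
    usernames['wikidata']['wikidata'] = 'ExampleBot'  # hypothetical bot account

    # Minimum number of seconds pywikibot waits between write operations (edits).
    put_throttle = 10

    # Send maxlag with every request: the API rejects edits while server lag
    # exceeds this many seconds, so the bot backs off automatically during
    # lag spikes like the ones reported earlier in the channel.
    maxlag = 5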
[20:42:15] it's hugely wasteful to do a new update for each data point
[20:42:39] and our current update model doesn't really handle things like that very well
[20:44:19] Wikidata in general is not very good at storing large time series and working with them
[20:44:23] >can't you update the item once instead of updating it 20 times?
[20:44:24] what API methods can do it? does pywikibot support it?
[20:44:52] and each single edit to an item is a hugely expensive thing (quadratic, really, because the more you add, the bigger the item gets)
[20:45:35] M813gan[m]: I have no idea... that should be a question for the pywikibot maintainers. If it does not, it would be a good idea to add it; doing 20-30x more edits than necessary is a huge waste
[20:45:39] SMalyshev: I was sure that it's one of the purposes of Wikidata...
[20:45:52] M813gan[m]: storing large time series? hardly
[20:46:08] the UI is terrible for that
[20:46:18] and the storage mode too
[20:48:33] >UI
[20:48:34] http://query.wikidata.org/ looks awesome to me...
[20:48:35] But are you sure that the API can do it? I contributed to pywikibot, I have (small, but anyway) knowledge of it, so I could add it.
[20:51:40] >and the storage mode too
[20:51:41] where is a good place for discussion about it?
[21:03:35] M813gan[m]: on project chat probably - https://www.wikidata.org/wiki/Wikidata:Project_chat or on the Wikidata mailing list
[21:04:49] SMalyshev: ok, thanks
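On the question left open above ("what API methods can do it? does pywikibot support it?"): the MediaWiki API action that can write several statements in one edit is wbeditentity, and pywikibot exposes it as ItemPage.editEntity. A minimal sketch of the single-edit import SMalyshev is suggesting, assuming P2299 values qualified with P585 (point in time); the item ID, the yearly values and the edit summary are placeholders:

    import pywikibot

    site = pywikibot.Site('wikidata', 'wikidata')
    repo = site.data_repository()
    item = pywikibot.ItemPage(repo, 'Q183')  # placeholder country item
    item.get()

    # Build one claim per year locally; no API request is made in this loop.
    new_claims = []
    for year, value in [(1990, 1000), (1991, 1100)]:  # placeholder World Bank figures
        claim = pywikibot.Claim(repo, 'P2299')  # the property being imported in the log
        claim.setTarget(pywikibot.WbQuantity(amount=value, site=repo))
        when = pywikibot.Claim(repo, 'P585', is_qualifier=True)  # point in time
        when.setTarget(pywikibot.WbTime(year=year))
        claim.addQualifier(when)  # stays local while the claim is not yet on the item
        new_claims.append(claim.toJSON())

    # One wbeditentity call: the whole series lands in a single edit / revision.
    item.editEntity({'claims': new_claims},
                    summary='import World Bank P2299 series in one edit')

Compared with one addClaim call per data point, this keeps the revision count per item at one per import run, which is the 20-30x reduction in edits discussed above.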