[04:39:45] playing wikidata game - "occupation" .. what comes up? Elizabeth II. .. so what is her job ?:)
[04:40:33] tons of suggestions from "governess" to "miner", "aviator", "ambassador" to "comics artist"
[04:45:13] picked Q19643
[04:57:06] * nikki finally did something on phabricator and boldly marked https://phabricator.wikimedia.org/T103752 as a duplicate of https://phabricator.wikimedia.org/T102888
[04:58:01] if I did anything wrong or should've done something else too, please do tell me :)
[06:40:35] nikki: no - exactly right :)
[06:40:36] thanks
[07:02:05] Morning.
[07:16:27] Ugh, GZWDer is flooding again.
[07:17:26] Lydia_WMDE: good :D
[07:23:37] multichill: Busy following your advice. :) http://tools.wmflabs.org/reasonator/geneawiki2/?q=Q20220290
[07:24:28] sjoerddebruin: About the genea website?
[07:24:46] Yeah, busy filling everything...
[07:25:12] Always nice to see that after some time, you hit a person with a existing item. :)
[07:28:35] I never thought of Genealogics, because of his bad SEO.
[07:46:01] hey Lydia_WMDE :)
[07:46:11] hey
[07:46:11] PROBLEM - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is CRITICAL: HTTP CRITICAL: HTTP/1.1 200 OK - pattern not found - 1373 bytes in 0.110 second response time
[09:06:22] benestar: I approve of your !nyanreview proposal!
[09:06:45] it's a JeroenDeDauw :D
[09:06:51] JeroenDeDauw: interesting work times btw :P
[09:07:03] [05:54:22] (CR) Jeroen De Dauw: [C: 1]
[09:18:22] RECOVERY - check if wikidata.org dispatch lag is higher than 2 minutes on wikidata is OK: HTTP OK: HTTP/1.1 200 OK - 1360 bytes in 0.171 second response time
[09:19:34] Anyone looking into the dispatch lag?
[09:20:08] could someone protect https://www.wikidata.org/wiki/Q349172? it's attracted vandalism from a bunch of ips today
[09:20:48] nvm, seems that's no longer a problem
[09:22:59] hoo: Created a task for the issues from last night
[09:23:21] sjoerddebruin: Ok...
has it been a problem, despite that one moment?
[09:23:45] hoo: New flooding this morning...
[09:23:50] nikki: done.
[09:23:54] thanks :D
[09:24:46] sjoerddebruin: Thanks for the bug
[09:25:39] Maybe I can backport that change later today, if it gets merged
[09:25:52] Still flooding atm. https://www.wikidata.org/wiki/Special:NewPages
[12:54:16] benestar|cloud: I live on the internets, and that is always online
[12:54:27] addshore: "Addshore fixes" ... that PR name
[12:54:39] ;)
[12:54:53] * JeroenDeDauw is tempted to rename it to "Fixes addshore"
[13:14:56] JeroenDeDauw: https://github.com/wmde/wikidata-analysis/pull/9 ? ;0
[13:25:06] addshore: huh, java and python?
[13:25:52] let me watch that project
[13:26:44] addshore: 2013... was that thing moved to the wmde org recently?
[13:27:10] 2013? noo, it has been there for ages
[13:27:25] Just it was rewritten to work again recently ;)
[13:27:57] * JeroenDeDauw sees added production code and no tests
[13:28:00] !nyan | addshore
[13:28:01] addshore: ~=[,,_,,]:3
[13:28:30] hah, I wouldn't call it 'production' code ;)
[13:28:52] addshore: ill have a look at this tomorrow. to tired now
[13:28:57] :)
[13:39:45] addshore: hmm. those map things are fancy and all, but they do not show the really important data... I can haz map which shows the locations of all cat pictures?
[13:40:02] JeroenDeDauw: sure ;)
[13:47:46] JeroenDeDauw: https://gerrit.wikimedia.org/r/#/c/220717/ <-- +2 so I'll +2 yours ;)
[13:49:26] !nyanreview is ~=[,,_,,]:3 .oO(review!!!) $1
[13:49:26] Key was added
[13:53:43] ;)
[13:54:05] ;)
[13:54:29] <- back in ~1h
[13:59:20] benestar|afk: \o/ now all we need to do is make it animated
[14:31:12] ooh maps... I was wondering just the other day if there were maps of things with coordinates mapped by the languages too
[15:09:14] Thiemo_WMDE: around?
[15:10:27] Lydia_WMDE: Around?
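The PROBLEM/RECOVERY lines earlier in the log ("check if wikidata.org dispatch lag is higher than 2 minutes ... pattern not found") come from an Icinga/Nagios-style HTTP check: an HTTP 200 alone is not enough, the monitor also looks for an expected pattern in the page body, and it goes CRITICAL when the lag is over the threshold or the pattern is missing. A minimal sketch of that decision logic (function names are illustrative, not the real monitoring plugin):

```javascript
// Sketch: decide monitor status from a reported dispatch lag in seconds.
// The 120 s threshold matches the "higher than 2 minutes" check in the log.
// dispatchLagStatus() is a hypothetical name, not the real Icinga plugin.
function dispatchLagStatus(lagSeconds, thresholdSeconds = 120) {
  if (lagSeconds === undefined || lagSeconds === null) {
    // The "pattern not found" case from the log: the status page answered
    // with HTTP 200 but did not expose a parseable lag value.
    return "CRITICAL: pattern not found";
  }
  return lagSeconds > thresholdSeconds ? "CRITICAL" : "OK";
}
```

This also explains the confusing "HTTP CRITICAL: HTTP/1.1 200 OK" wording in the alert: the request itself succeeded, but the content check failed.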
[15:27:49] Why I can't to this: https://www.wikidata.org/w/index.php?title=Help:Sources&action=edit&section=8
[15:27:54] *do
[15:28:26] MGChecker: issue of the translate extension
[15:28:58] I can't translate some of this sectiopn too :/
[15:29:00] Why?
[15:30:15] *section
[15:32:21] MGChecker: try #mediawiki-i18n
[15:33:39] I know this channel should be for localisation, but how can it help at my problem?
[15:48:09] DanielK_WMDE_: care to restore your +2 on https://gerrit.wikimedia.org/r/#/c/219808 ? :)
[15:51:40] There isn't anyone talking to me at #mediawiki-i18n
[15:52:12] MGChecker: try again when california is awake, maybe...
[15:52:32] or send mail, or file a bug
[15:52:42] at phabricator?
[15:52:44] if it's an issue with the translate extension, it's unlikely anyone here can help you
[15:52:51] MGChecker: yes
[15:52:57] I haven't think aput time zones...
[15:53:19] MGChecker: you can alsi try to directly ping Nikerabbit in #mediawiki-i18n
[15:53:57] He already did
[15:55:00] ah, well then...
[16:06:54] Okay, can someone mrk the page for translating now?
[17:19:59] jzerebecki: Around?
[17:27:54] DanielK_WMDE_: did you have a chance to look at https://gerrit.wikimedia.org/r/#/c/216894/ more ?
[17:41:28] SMalyshev: The result from some discussion between jan, lydia, and me is to go with https for the document uris. See https://phabricator.wikimedia.org/T103767#1401513
[17:41:55] SMalyshev: i didn't get a chance to look at the redirect stuff, sorry
[17:42:25] DanielK_WMDE_: ok, I'll revert it
[17:44:51] DanielK_WMDE_: https://gerrit.wikimedia.org/r/#/c/220832/
[17:45:08] SMalyshev: thanks
[17:45:13] sorry for the back-and-forth
[17:48:27] hoo: yes
[17:48:51] See #wikimedia-core ... wth
[17:55:03] DanielK_WMDE_: did you discuss dates 7 https://phabricator.wikimedia.org/T99795 recently? it'd be nice to get some handle on that too, it's still not clear what's going on in rdf dump with negative dates
[17:55:47] i.e.
i'm moving the java part to full support of xsd 1.1 but if rdf stays 1.0 results may be wrong
[18:03:29] hoo: #wikimedia-core ? Is that an irc channel? Why do I think I end up in a trollfest if I enter that channel? :P
[18:03:50] multichill: Whoops, meant #mediawiki-core
[18:04:18] It is an IRC channel, but I guess it's not particularly interesting, if you're not hacking on MW core
[18:05:54] hoo: Are the wikidata servers very busy? The api is sending waits to my bot
[18:06:06] jzerebecki: Do you see anything? If not, I guess we need to attempt hard coding the exact versions we want...
[18:06:15] multichill: Uh, let me check
[18:06:35] hoo: still trying something
[18:07:07] jzerebecki: Thanks
[18:07:17] https://www.wikidata.org/w/api.php?action=query&meta=siteinfo&siprop=dbrepllag&sishowalldb= looks normal
[18:07:20] multichill: More than 400 edits/ minute, that's tough
[18:07:34] Me or in total?
[18:07:46] In total
[18:07:50] didn't look at users
[18:08:01] oh
[18:08:03] ok
[18:08:24] Bot back off retry wait logic all seems to work well :-)
[18:09:06] https://www.wikidata.org/wiki/Special:Contributions/BotMultichill <- I do see a red link every once in a while when refreshing this. That's a bit strange
[18:09:31] multichill: It's not, we read stuff from slaves there, so they might not have the new data
[18:09:47] also if you do deletions, pages in the delete log might be displayed as still existing
[18:10:08] I understand the logic behind it, but it's still a bit weird to see as a user ;-)
[18:10:36] Only people that do a lot of edits in a short amount of time should notice that
[18:11:01] If replag is over 10s(?), we display a warning that stuff might be inaccurate
[18:11:46] fuck
[18:11:57] edits aren't the problem
[18:12:02] I guess arbitrary access is
[18:12:27] but probably not ruwiki
[18:15:34] mh, stuff looks ok again
[18:16:03] I don't know what exactly happened, but I think subscriptions are to blame
[18:16:39] multichill: Thanks for letting me know...
I don't usually notice stuff like that (unless I get an alert, but if that happens stuff is very awry already)
[18:16:59] We alter if we ahve more than 300s replag
[18:17:05] * alert
[18:18:01] hoo: The warnings seem to come and go
[18:20:57] Unusually many open connection
[18:20:57] s
[18:21:06] Will poke further
[18:26:21] hoo: https://www.wikidata.org/w/index.php?title=Special:Contributions/Lockal&offset=&limit=500&target=Lockal <- 500 edits in 2 minutes
[18:26:32] Did magnus remove the speed restrictions from Widar?
[18:27:03] I'm looking at the number of edits, but it shouldn't be a problem at this scale
[18:27:58] ah... lot of stuff going on about creating items
[18:28:03] we should really do my backport
[18:28:08] multichill: Are you creating items as well?
[18:28:21] Or do you see issues during normal edits?
[18:28:22] Yeah
[18:28:33] Only during creation I get warnings I think
[18:30:05] hoo: Yes, warning after creation of the item
[18:30:20] What exactly does it sayß
[18:30:22] * ?
[18:33:12] Already checked the logs, nothing useful in there
[18:33:28] mh... not surprising
[18:33:37] It tries to create an item. Fails. Waits. Tries again. Fails...... Success
[18:34:44] jzerebecki: Any chance? If not, I guess I will try to force merge and see what travis says
[18:34:59] And jenkins on the build should owkr, right?
[18:35:05] hoo: working on it
[18:36:48] hoo: Damn you ssl, can't do dump of the tcp stream ;-)
[18:37:12] (I can, but it's not very readable)
[18:48:34] arg writing json ad hoc is not nice
[19:05:09] Amir1: around now :)
[19:11:10] hoo: done
[19:11:38] ok, looking :)
[19:12:19] jzerebecki: hoo: fire fighting?
[19:12:31] Yeah :/
[19:12:38] Got better already
[19:12:46] but we should still deploy the backport
[19:12:56] Lydia_WMDE: Anything else to backport, now that I'm on it?
[19:13:37] hoo: the fix you made yesterday for sjoerd.
the other thing that really needs to get out isn't done afik :(
[19:14:04] Lydia_WMDE: that actually is the fix I made for sjoerd :P
[19:14:10] oh
[19:14:11] ok
[19:14:17] good good
[19:14:28] hoo: why is it failing?
[19:15:07] 19:12:05 [9456b954] [no req] MWException from line 85 of /mnt/jenkins-workspace/workspace/mwext-Wikibase-repo-tests-mysql-hhvm/src/extensions/Wikibase/repo/Wikibase.hooks.php: Wikibase: Incomplete configuration: $wgWBRepoSettings["entityNamespaces"] has to be set to an array mapping content model IDs to namespace IDs. See ExampleSettings.php for details and examples.
[19:15:09] what the hell
[19:15:20] hoo: oh we also need to backport the not to autoload the extension
[19:15:28] ah right
[19:15:49] I'll force merge that one, because we need it to make jenkins run anyway
[19:19:12] I'm also investigating the travis failures, btw (just so that no one else spents time on that)
[19:33:44] Is there a way to delete translations that aren't needed anymore?
[19:34:03] What kind of translations?
[19:34:21] Do you mean labels/descriptions or the stuff from the translate extension?
[19:34:56] tuff from the translate extensio
[19:35:03] https://www.wikidata.org/w/index.php?title=Translations:Wikidata:Arbitrary_access/7/de&action=edit
[19:37:11] That can be deleted like normal pages can, I think
[19:41:17] I remember that there was any problem, but you should know it, beause you theoretically have the possibility to do this
[19:42:23] I have the possibility to do almost everything... the question is whether I actually end up doing it
[19:44:02] Yes, but you coul see if it is possible at Special:Delete
[19:47:31] Need to backport more stuff... :/
[20:00:29] Entity usage on ruwiki keeps growing fast
[20:00:50] (but not to fast, nothing to worry about atm)
[20:29:48] benestar: Wouldn't it be good if https://www.wikidata.org/wiki/Special:AbuseFilter/56 would warn the user?
[20:30:07] * benestar will look later
[20:31:05] * MGChecker thanky benestar
[20:31:13] *thanks
[20:31:27] * MGChecker is typing much to fast without looking
[20:58:15] hmm, I think https://www.wikidata.org/w/index.php?title=Special:AbuseLog&wpSearchFilter=56 isn't even correct :S
[21:30:42] hoo|away: can this cause damage to the servers? https://www.wikidata.org/wiki/Special:Contributions/Lockal
[21:34:01] benestar: if you want real answers, poke in -operations as its their job :)
[21:35:01] JohnFLewis: I think the user is responsible and stopped editing for now
[21:36:31] mkay
[21:37:25] An answer anyway would be, it wouldn't damage them but at most cause rep lag on Wikidata or the databases
[21:37:42] JohnFLewis: I wonder how he does it because widar is supposed to be throttled
[21:38:03] no idea and actually, I wonder what wikidata rep lag is atm
[21:38:25] Last time I checked it was 0
[21:38:48] I have two bots running and both don't complain about it
[21:39:32] Ok. Bots are running, time for bed
[21:39:34] Yeah, still is
[21:40:51] The throtteling on quick statements seems lower.
[21:42:47] Looking at the source, it doesn't contain timeouts.
[21:43:12] https://bitbucket.org/magnusmanske/wikidata-todo/src/13ab57dbdf51e94e2ffc6b9ec385097ebc02fc7d/public_html/quick_statements.php?at=master
[21:44:18] benestar: Not aware of any recent problems beside the id problems where I created a patch yesterday
[21:44:39] constantly monitoring active queries though, as we had trouble earlier
[21:45:12] The only downside is the RC-flooding then.
[21:45:16] setTimeout ( function () { getWidar ( params , callback ) } , 500 ) ; // Again
[21:45:25] Yeah, after errors it seems.
[21:45:30] sjoerddebruin --^ found that but too tired to understand what it actually does ...
[21:45:44] Retry after 500 ms.
[21:47:49] * benestar hopes he didn't affront User:Lockal
[21:47:58] sjoerddebruin: what do you think?
[21:48:28] Editing on such high rate without a flag, I don't like it.
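The `setTimeout` snippet quoted above retries a failed request after a fixed 500 ms, which keeps hammering the API at the same rate when things are already overloaded. What a gentler client could do instead is exponential backoff: double the wait after each failure, up to a cap. A sketch of the delay schedule (this is purely illustrative; quick_statements itself, per the snippet, does not do this):

```javascript
// Sketch: exponential backoff delays, as an alternative to the fixed
// 500 ms retry in the quoted quick_statements snippet. The function
// name and the 60 s cap are assumptions, not from the actual tool.
function backoffDelayMs(attempt, baseMs = 500, maxMs = 60000) {
  // attempt 0 -> 500 ms, 1 -> 1000 ms, 2 -> 2000 ms, ..., capped at maxMs
  return Math.min(baseMs * 2 ** attempt, maxMs);
}
```

The retry itself would then be `setTimeout(retry, backoffDelayMs(attempt))` instead of the hard-coded `500`, so repeated failures slow the client down rather than keeping it at full speed.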
[21:48:42] Saw it when I wanted to look at the RC.
[21:49:05] benestar: How many edits/ minute was that user doing?
[21:49:26] sjoerddebruin: Deploying the fix for the id problem now
[21:49:34] hoo: nice
[21:49:55] > 100
[21:50:04] That's kind of bad
[21:50:31] Shouldn't be a problem, but frankly our capacities are very limited, especially regarding dispatching these days
[21:50:58] When we just had Wikipedias and the dispatcher was simpler, we could process about 3k edits /minute :P
[21:51:12] I think the upper limit is at 400 or 600 now
[21:51:38] hoo: 297
[21:52:01] In a perfect world that wouldn't be a problem
[21:52:16] hoo: can we put a limit server-side?
[21:52:17] but I guess no one should really be doing more than say... 50(?) edits/ minute
[21:52:32] Not really, these people are exempted from rate limits
[21:52:36] yeah, also to avoid that the bot gets out of controll
[21:52:36] Usually that's not a problem
[21:53:03] People should have common sense.
[21:54:04] sjoerddebruin: +1 but that's like saying "people should be clever"
[21:54:25] Using these tools is on your own risk.
[21:57:22] Indeed
[22:04:08] multichill: Did you see errors during page creation recently?
[22:04:13] If so, those should be gone now
[22:04:18] stuff might be slow, though
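As the discussion above notes, flagged users are exempted from server-side rate limits, so hoo's informal "no more than ~50 edits/minute" guideline can only be enforced client-side. A minimal sketch of such self-throttling, spacing edits evenly rather than bursting (class and method names are hypothetical; no Wikidata tool is known to use exactly this):

```javascript
// Sketch: client-side pacing to a target edits-per-minute rate.
// 50/min is the informal guideline from the discussion above, not a
// server-enforced limit; at that rate edits are 1200 ms apart.
class EditPacer {
  constructor(editsPerMinute = 50) {
    this.minGapMs = 60000 / editsPerMinute; // gap between consecutive edits
    this.nextAllowedAt = 0;                 // earliest time for the next edit
  }

  // Given the current time in ms, return how long to wait before making
  // the next edit, and reserve that slot.
  delayFor(nowMs) {
    const wait = Math.max(0, this.nextAllowedAt - nowMs);
    this.nextAllowedAt = Math.max(nowMs, this.nextAllowedAt) + this.minGapMs;
    return wait;
  }
}
```

A bot would call `delayFor(Date.now())` before each API request and sleep for the returned number of milliseconds, keeping it well under the ~400-600 edits/minute dispatcher ceiling mentioned above even without any flag-based exemption logic.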