[09:09:46] *facepalms gerrit being super slow for certain patchsets*.....
[09:33:32] addshore: merged all the things!
[09:33:37] indeed
[09:34:02] New JSON dump is there :)
[09:34:06] YAY
[09:34:13] But it's still smaller than the old one, which is worrisome
[09:34:15] I don't believe I managed to break it in 3 different ways :D
[09:34:30] hoo: well, a lot of 0's got removed!
[09:34:39] yeah, hopefully that's it
[09:35:44] well hoo, well over 10,000,000 dates ;)
[09:37:31] and most of them probably lost 12 '0' characters
[09:38:02] Let's see how big the size change is after gzip
[09:38:09] so 120,000,000 bytes :D
[09:38:35] probably closer to 20,000,000 dates
[10:32:40] addshore: that's #over9000!
[10:36:05] addshore: After unzipping, the dump actually grew by 1.6%
[10:36:27] hoo: yeah, it should ;) as datatypes now get included for all snaks!
[10:36:44] So that makes a bit of sense! :D
[10:37:40] yeah
[10:40:37] addshore unlocked the "waster of disk space" badge
[10:51:39] Fatal error: Class 'Wikibase\Repo\WikibaseRepo' whhhhyyyy
[10:56:42] addshore: I guess the autoloader is not invoked in time
[10:56:53] yeah, got a fix for it, about to put it up
[11:01:56] heh?
[11:34:05] addshore: y u confused about the direction? https://gerrit.wikimedia.org/r/#/c/229678/
[11:34:18] You're from the UK, not Australia!
[11:35:34] !nyan | ~=[,,_,,]#
[11:35:34] ~=[,,_,,]#: ~=[,,_,,]:3
[11:38:14] JeroenDeDauw: phpstorm disagrees with you
[11:49:48] $this->hasHadServicesSet
[11:49:53] I like that name :'D
[11:50:16] hhss
[11:52:15] hoo: I'll finish writing a test for your core thing, then merge it :)
[11:52:26] \o/
[12:49:41] Yay... // FIXME: This is not really well defined. Could be either of the two.
[12:51:15] ugh
[12:51:21] we have cut-off values in wb_changes
[12:51:25] * hoo cries
[12:54:40] Anyone up for fixing the tests for master, or shall I give it a shot?
[12:54:57] I'm still wtf-ing about wb_changes
[13:18:23] hoo: does this sound familiar??? https://gerrit.wikimedia.org/r/#/c/225036/
[13:18:52] yeah, that's the thing
[13:21:18] Thiemo_WMDE: Iaea79bd is related, yes
[13:21:35] I guess SQLite changed its query plan because it now has the indexes at hand
[13:21:51] but the assumption in the test was flawed to begin with... it even has a FIXME on it
[15:43:11] hmm, master has a broken test...
[15:59:43] hoo|away: DanielK_WMDE_ https://gerrit.wikimedia.org/r/#/c/229736/
[16:00:21] oh great, it fails for mysql though... that's lame...
[16:01:33] addshore: yea...
[16:03:46] *facepalm life*
[16:12:47] cheers Thiemo_WMDE!
[17:01:57] I am a bit puzzled by Q7502697 (shoulder angel)... I marked it as an instance of "Allegory" and "Plot device"... but I'm not sure this is right
[17:02:32] english wikipedia says that it is a plot device, but I think that is a narrow definition, as it is not only used in narratives
[17:03:57] related: Q1664033 "innerer Schweinehund"
[17:37:23] ysangkok: "innerer Schweinehund" is a phrase or expression, possibly a figure of speech. Not a plot device.
[21:05:36] https://www.wikidata.org/wiki/Special:Contributions/ProteinBoxBot that bot updates retrieved dates of references... daily? :/
[21:07:28] Wut?
[21:07:43] sjoerddebruin: See the revisions
[21:07:51] Yup, I see.
[21:07:53] I seriously doubt that's a good idea :P
[21:07:59] hahaha, lol
[21:08:01] Could be done like twice a year or so...
[21:08:08] huh https://www.wikidata.org/wiki/User_talk:ProteinBoxBot
[21:08:16] It's like when my bot was edit warring for over a month and nobody pointed it out
[21:08:48] Do we have a central place to keep track of VIAF errors?
[21:09:09] https://www.wikidata.org/w/index.php?title=Q17195901&curid=18789933&diff=239010009&oldid=238959846 <- this stuff scares me a bit
[21:09:12] Wondering that too.
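The dump-size puzzle earlier in the log (raw size shrank a lot after padding zeros were removed, yet the gzipped size barely differs) comes down to how well long runs of identical bytes compress. A toy demonstration of that effect, using made-up record strings rather than the real dump format (the chat estimates ~12 zeros per date; this sketch uses 7 purely for illustration):

```python
import gzip

# Two made-up "dump" payloads: one with zero-padded dates, one without.
# Each padded record carries 7 extra "0" bytes (an assumption for the demo).
padded  = b'{"time": "+00000002015-08-04T00:00:00Z"}' * 100_000
trimmed = b'{"time": "+2015-08-04T00:00:00Z"}' * 100_000

raw_saving = len(padded) - len(trimmed)
gz_saving = len(gzip.compress(padded)) - len(gzip.compress(trimmed))

# The raw saving is large (7 bytes x 100,000 records = 700000 bytes),
# but highly repetitive zeros compress to almost nothing, so the
# gzipped sizes end up very close to each other.
print(raw_saving)
print(gz_saving)
```

This is why comparing gzipped dump sizes understates a change like stripping zero padding: the bytes removed were nearly free under compression anyway.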
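The ProteinBoxBot complaint above (refreshing "retrieved" dates daily when "twice a year or so" would do) is essentially a missing staleness threshold. A minimal sketch of such a gate; the `should_refresh` helper and the 180-day threshold are invented for illustration, not anything the bot actually implements:

```python
from datetime import date, timedelta

# Assumed policy: only rewrite a reference's "retrieved" date if the
# stored one is older than roughly half a year.
REFRESH_AFTER = timedelta(days=180)

def should_refresh(retrieved: date, today: date) -> bool:
    """Return True if the stored 'retrieved' date is stale enough to rewrite."""
    return today - retrieved > REFRESH_AFTER

# A bot running daily would then skip almost every edit:
print(should_refresh(date(2015, 8, 3), date(2015, 8, 4)))   # updated yesterday
print(should_refresh(date(2015, 1, 1), date(2015, 8, 4)))   # half a year old
```

Gating edits this way also keeps watchlists and recent changes usable, which is the complaint raised in the chat.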
[21:09:37] https://www.wikidata.org/w/index.php?title=Q18056036&type=revision&diff=238990318&oldid=238941317
[21:10:13] ouch, that's really bad matching
[21:11:09] I've checked every edit on my watchlist...
[21:11:51] But what about "Wikibase\Lib\Store\UnresolvedRedirectException" on https://www.wikidata.org/wiki/User_talk:ProteinBoxBot
[21:14:42] I wonder if they're just matching based on the names being the same...
[21:14:54] I hope not, because that's a brilliant way to mess up your data
[21:15:25] They are based on matching done by VIAF.
[21:15:35] nikki: Looks like VIAF matched on name
[21:15:56] I meant VIAF
[21:16:08] So Nieuwe Gracht is a street both in Haarlem and Utrecht; it picked Haarlem instead of Utrecht. Same goes for Drenthe, a province of the Netherlands or a town in the US
[21:16:14] I sent them an email but haven't had a response yet
[21:16:31] er, I meant to say I also saw one the other day
[21:16:42] It's a bit like my painter boo boo :P
[21:17:02] VIAF communication is bad.
[21:21:16] hey :)
[21:24:24] hi Amir1
[21:25:13] hey, how are you?
[21:26:12] not too bad
[21:26:53] I would be better if air conditioning were common here or if the weather weren't so hot
[21:29:30] :D
[21:30:53] sjoerddebruin: I don't see an exception on https://www.wikidata.org/wiki/User_talk:ProteinBoxBot
[21:31:30] Weird.
[21:38:02] sjoerddebruin: no, I have it too :-O
[21:40:25] :O
[21:40:28] !panic
[21:40:28] https://dl.dropboxusercontent.com/u/7313450/entropy/gif/omgwtf.gif
[21:40:54] wtf
[21:40:58] on a non-entity page
[21:41:02] * hoo panics as well
[21:41:15] * sjoerddebruin didn't know about !panic
[21:42:08] ok, I think someone deleted an entity that was linked like {{Q|Q19859703}}
[21:42:09] [1] https://www.wikidata.org/wiki/Template:Q
[21:43:44] jzerebecki: No, double redirects are the problem
[21:43:47] :s
[21:43:55] Does anyone have a bot handy to fix these?
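The double-redirect cleanup being asked for above is, at its core, chain collapsing: follow each redirect until a non-redirect target is reached and repoint the source directly at it. A minimal sketch of that logic over an invented `redirects` mapping (the Q-ids are made up; a real bot would read and write redirects through the MediaWiki API):

```python
# Map of redirect source -> target; an entry whose target is itself a
# key in the map is a double redirect. Q-ids are invented examples.
redirects = {"Q1001": "Q1002", "Q1002": "Q1003", "Q2001": "Q2002"}

def resolve(qid: str, redirects: dict) -> str:
    """Follow a redirect chain to its final target, guarding against cycles."""
    seen = set()
    while qid in redirects:
        if qid in seen:
            raise ValueError(f"redirect cycle involving {qid}")
        seen.add(qid)
        qid = redirects[qid]
    return qid

# Collapse every chain so each source points straight at the final target.
fixed = {src: resolve(dst, redirects) for src, dst in redirects.items()}
print(fixed)  # {'Q1001': 'Q1003', 'Q1002': 'Q1003', 'Q2001': 'Q2002'}
```

The cycle guard matters in practice: deleted or mutually redirecting entities would otherwise loop a naive bot forever.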
[21:44:09] See https://www.wikidata.org/w/index.php?title=Special:WhatLinksHere/Q16428906&hidelinks=1
[21:44:32] Yup https://www.wikidata.org/w/index.php?title=Q18020996&type=revision&diff=239007539&oldid=238445134
[21:44:50] For some reason these don't appear on the special page
[21:44:56] so I guess we have various issues here
[21:44:59] uhm... yay?
[21:45:52] I'll file bugs
[21:45:56] * hoo sighs
[21:46:57] thx
[21:56:09] jzerebecki: Running updateSpecialPages.php --wiki wikidatawiki --only DoubleRedirects; maybe the cache is the problem
[21:56:16] if not, I'll open a bug about that as well
[21:57:11] it is supposed to continuously update, right?
[21:57:30] No, it's not... it's too heavy for that (the update takes several minutes)
[22:00:27] oh ok. it is never run in the wikimedia cluster, according to the puppet repo...
[22:02:06] https://www.wikidata.org/w/index.php?title=Special:DoubleRedirects&limit=500&offset=0
[22:02:12] so now it's complete
[22:02:23] Not a bug, just not updated often enough, it seems
[22:02:41] took almost 6 minutes, btw
[22:03:52] It's being refreshed every third day now... maybe we should have it recached like twice a day?
[22:04:27] Only worth it if bots will act this often, though
[22:05:29] hoo: how is it being run? I don't see it in puppet.git
[22:05:43] misc::maintenance::update_special_pages
[22:06:08] that updates all of them, but we could run that one separately
[22:06:22] oh ok, there is one that is missing --only, didn't see that
[22:38:02] jzerebecki: https://gerrit.wikimedia.org/r/229992 Easy one
[22:42:23] hoo: should I sign that up for the next SWAT?
[22:42:47] As a stop gap, it would be enough for someone to fix the double redirects
[22:42:58] but if no one is going to do that, we can quickly backport it
[22:43:40] $ grep -c 'UnresolvedRedirectException' exception.log
[22:43:40] 16
[22:46:33] probably not worth it then. last SWAT before next week is in 14 minutes.
[22:47:36] yeah, let's just wait for the bots to run through
[22:47:48] meh
[22:59:30] Restarting dbus wasn't the best idea I had today, doh
[23:13:19] Do we have a page to list items with invalid sitelinks (sitelinks that are present in two items)?
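On the closing question about sitelinks present in two items: such conflicts can be found by inverting the item-to-sitelink mapping and keeping any sitelink claimed more than once. A small sketch over invented data (a real check would run against a dump or the `wb_items_per_site` table, not a hand-built dict):

```python
from collections import defaultdict

# Invented item -> sitelinks sample data for illustration only.
items = {
    "Q100": ["enwiki:Example", "dewiki:Beispiel"],
    "Q200": ["enwiki:Example"],   # conflicts with Q100
    "Q300": ["frwiki:Exemple"],
}

def conflicting_sitelinks(items: dict) -> dict:
    """Return sitelinks that appear on more than one item."""
    owners = defaultdict(list)
    for qid, links in items.items():
        for link in links:
            owners[link].append(qid)
    return {link: qids for link, qids in owners.items() if len(qids) > 1}

print(conflicting_sitelinks(items))  # {'enwiki:Example': ['Q100', 'Q200']}
```

Since a sitelink must be unique per wiki, any entry in the result is a data error; the same inversion works for any uniqueness constraint.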