[02:08:27] hello
[02:20:06] hi
[02:35:18] so of course Special:RecordImpression hits are sampled at 1:1000
[02:35:20] and of course it's done on the server side
[02:35:54] so instead of only sending a request on one in every thousand page views, we're instead sending them on all page views and dropping 999 out of a thousand on the floor
[02:39:35] Sounds performant
[02:44:22] oh, and also
[02:45:01] when no campaign is running, which is most of the time, Special:RecordImpression records the fact that no banner was shown because no banner is running
[02:52:40] it's an inspiring example of how Wikimedia runs on globally-distributed crowdsourced effort
[02:54:24] ori: We should invest in our own http://devnull-as-a-service.com/
[02:55:10] heh
[03:20:26] TimStarling: fun: https://phab.wmfusercontent.org/file/data/iteayw55oibir447swgc/PHID-FILE-pfstogztly5ohkpzxdlb/wtf.png
[03:22:02] haha
[03:22:57] ori: well, it's cloning all the data in the document, presumably?
[03:23:02] yeah, the mediawiki parser will do quite deep recursion as well, at times
[03:23:50] although it has a lot of virtual stacks which help to avoid recursion in the most common cases
[03:24:08] MatmaRex: not sure why it would do that if it's fetching a brand new DOM from parsoid anyway
[03:24:42] it's still only 0.9%
[03:24:47] ori: isn't that after it has fetched the DOM, and it'd indeed data from the DOM being copied?
[03:24:52] it's*
[03:25:45] TimStarling: the % is from the total time the profiler was running, so 68.56% is idle
[03:26:51] there's also a lot of time spent checking that interface elements which we know are hidden are in fact hidden
[03:28:02] because jQuery's showHide checks the CSS to ensure it doesn't insert a duplicate 'display:' property on an element that already has the desired visibility
[03:28:05] https://phab.wmfusercontent.org/file/data/oobaijxodod4bwlvggwp/PHID-FILE-dqk55yssekvslrmnvu3k/Screen_Shot_2015-01-18_at_18.40.47.png
[03:31:24] it's a shame dev tools won't let you export these profile views as HTML
[05:56:45] bd808: cool beans, re: the FF extension. And good catch w/ the mediawiki.org domain. Should we merge the repos together?
[05:57:37] I took a look at porting it to FF and saw that it couldn't be done using the nice, high-level pure JS API, and immediately got lazy and gave up.
[06:02:47] ori: heh. I cut-and-pasted my way to getting it to work. Found some add-on that stuck an XFF header on all requests and worked from there.
[06:03:51] It would probably be simpler to stuff both of them in the same repo. Should be an easy enough thing to do.
[06:08:02] bd808: Cool. If you feel like doing it, go ahead. If not, I'll do it sometime this week.
[06:12:55] I might poke at it tomorrow. Not sure what I'm going to do with my day off other than run a couple of errands and pack
[06:13:52] I've been trying not to look at logstash to avoid being sad that I killed it :/
[11:31:36] MediaWiki-Core-Team, wikidata-query-service, Wikidata: Wikidata Query - add license - https://phabricator.wikimedia.org/T86833#985371 (daniel) What dependencies does this have, what libraries does it use? Is it all JVM code, or does this involve PHP code too? WikiDataToolkit is Apache2, so making this Apache2...
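[ed. note] The fix implied by the 02:35 exchange is to make the sampling decision on the client, before any request is sent, rather than server-side after the fact. A minimal sketch of that idea, with hypothetical names (not CentralNotice's actual code):

```javascript
// Hypothetical client-side sampling for the Special:RecordImpression
// beacon: decide *before* issuing the request, so 999 of every 1000
// page views never contact the server at all.
const SAMPLE_RATE = 1 / 1000; // 1:1000, as discussed above

// Pure helper so the decision is testable: record only when the
// random draw falls below the sampling rate.
function shouldRecordImpression(randomDraw, rate) {
  return randomDraw < rate;
}

function maybeRecordImpression(url) {
  if (shouldRecordImpression(Math.random(), SAMPLE_RATE)) {
    if (navigator.sendBeacon) {
      // Fire-and-forget request that survives page unload.
      navigator.sendBeacon(url);
    } else {
      // Fallback for older browsers: a simple image ping.
      new Image().src = url;
    }
  }
  // Otherwise: no request leaves the client.
}
```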
[13:26:48] MediaWiki-Core-Team, wikidata-query-service, Wikidata: Wikidata Query - add license - https://phabricator.wikimedia.org/T86833#985429 (Manybubbles) It's all JVM code. IIRC that GPL/Apache incompatibility is something the Apache Foundation disputes, but I'm not sure. If we accept that incompatibility then it mu...
[22:26:38] MediaWiki-Core-Team, operations: Audit log sources and see if we can make them less spammy - https://phabricator.wikimedia.org/T87205#985909 (yuvipanda) NEW
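[ed. note] On the 03:28 point about jQuery's showHide re-checking elements that are already hidden: the redundant work can be avoided by guarding the write on the current inline style. An illustrative vanilla-JS sketch, not jQuery's actual internals:

```javascript
// Hypothetical guard mirroring the 03:28 discussion: only write the
// 'display' property when the element is not already in the desired
// state, so no duplicate value is set and no needless style work runs.
function setDisplay(el, value) {
  if (el.style.display === value) {
    return false; // already in the desired state; nothing to do
  }
  el.style.display = value;
  return true;
}
```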