[00:28:20] (PS5) AndyRussG: Get URL params via mw.Uri() instead of ad-hoc methods [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/230192
[03:26:21] (PS4) AndyRussG: WIP Banner history logger campaign mixin [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/229560 (https://phabricator.wikimedia.org/T90918)
[15:45:08] (CR) Cdentinger: Get URL params via mw.Uri() instead of ad-hoc methods (1 comment) [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/230192 (owner: AndyRussG)
[15:45:53] hey all. I'm working from home today. I'm doing a bit of intense wiki management. I'll be online but not looking here much for most of the morning
[15:51:38] (CR) AndyRussG: Get URL params via mw.Uri() instead of ad-hoc methods (1 comment) [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/230192 (owner: AndyRussG)
[15:51:48] dstrine: cool!
[15:51:58] * AndyRussG waves
[15:52:10] * dstrine waves back
[15:53:54] :)
[16:10:09] AndyRussG: if i'm understanding, urlParams is not deprecated, just usually not set by that point, but the tests will have to set it prior to fake requests?
[16:10:54] cwdent: yeah! This patch doesn't get rid of it, it just uses standard code (mw.Uri) to fetch it instead of our old ad-hoc URL parser
[16:11:41] The availability for tests was there before this patch; the latest change sets just fixed the fact that the first change sets were inadvertently overwriting the values set by tests 8p
[16:12:08] ah ha, cool, that makes sense
[16:12:44] i think everything looks good in that case, happy to merge if you think it's safe
[16:13:46] cwdent: fantastic! Yeah sounds great, many thanks :)
[16:14:49] (CR) Cdentinger: [C: 2] Get URL params via mw.Uri() instead of ad-hoc methods [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/230192 (owner: AndyRussG)
[16:15:04] yeah, looks like a big win for code reuse and standards
[16:16:11] (Merged) jenkins-bot: Get URL params via mw.Uri() instead of ad-hoc methods [extensions/CentralNotice] (campaign_mixins) - https://gerrit.wikimedia.org/r/230192 (owner: AndyRussG)
[16:22:57] cwdent: hope so! I was gonna just keep it in for this round, but it does reduce the size of the code, and in the banner history patch I had a URL param to pull in, so it was either do this, or find a way to grab an ad-hoc-parsed param from display.state, or end up adding some inconsistency in this new branch... Heh, neither of the last two options were OK 8p
[16:24:47] AndyRussG: for sure. anything about it make you nervous? i could try writing something to test url parsing more extensively
[16:29:24] cwdent: no, we're OK... Thanks though! I talked to the main person responsible for mw.Uri, which is used all over MW these days, and it seems we're good :)
[16:29:47] excellent
[16:31:18] cwdent: I'd say for the banner history stuff, more important is review of the other smallish patches, the WIP banner history patch, and other tests :)
[16:31:22] * AndyRussG waves at awight
[16:32:47] cwdent: awight: the blocker I've just come across for banner history is that our data is too complex for EventLogging. In a sec I was gonna look for EL/analytics folks to ping for advice on that; if you have thoughts or suggestions on that also pls LMK... and in case ur interested I'll mention here where/when I'm talking to other teams about it...
[16:33:44] AndyRussG: dang. what does the EventLogging data look like vs. banner history?
[16:33:52] AndyRussG: What do u mean by too complex? Just cos it's a list, or does the schema change?
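(Editor's sketch of the mw.Uri change merged earlier in the log. This is not the actual CentralNotice code: `mw.Uri( location.href ).query` yields a plain object of URL parameters, and outside MediaWiki the standard WHATWG URL API can stand in for it. The function name is hypothetical.)

```javascript
// Minimal stand-in for the mw.Uri pattern (assumption: not the real
// CentralNotice code). Parses a URL's query string into a plain object,
// the way mw.Uri( href ).query does inside MediaWiki.
function getUrlParams( href ) {
	var params = {};
	new URL( href ).searchParams.forEach( function ( value, key ) {
		params[ key ] = value;
	} );
	return params;
}
```

Tests can then pass in a fake href rather than overwriting a shared urlParams object, which is the overwrite problem the later change sets fixed.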
[16:35:53] awight: cwdent: nested arrays and objects rrrgg
[16:36:22] is just stuffing some json in there too terrible?
[16:36:24] But the nested stuff is just a list of single events, which can each be represented as a flat array?
[16:36:38] cwdent: good point
[16:36:50] awight: cwdent: no, it's an array of objects
[16:37:11] see resources/subscribing/ext.centralNotice.bannerHistoryLogger.js in https://gerrit.wikimedia.org/r/#/c/229560/
[16:37:16] postgres has a native binary json type where you can even query against the objects
[16:37:42] specifically makeEventLoggingData() and makeLogEntry()
[16:38:19] cwdent: well, we have to massage EL data anyway, unpacking json would be nbd.
[16:38:20] vs https://www.mediawiki.org/wiki/Extension:EventLogging/Programming#JSON_schema_validation (third bullet point in that section)
[16:38:41] awight: yeah we could JSON it up, but where does that leave querying?
[16:38:58] I don't think we ever want to query EL tables directly, other than to extract the data.
[16:39:18] could make a little sorter, parser specific to this application
[16:39:33] The biggest reason to avoid querying EL is that schema migrations result in new tables
[16:40:21] AndyRussG: How do you feel about forcing our data to fit into EL, vs. finding another backend?
[16:41:45] i usually think key value stores are totally bogus, but for relatively unstructured data like history they can be ok
[16:42:16] you can put a billion rows in mongo easily, long as you don't have to query it in any meaningful way
[16:42:23] or redis
[16:43:58] AndyRussG: One option would be to send each log entry as its own event.
[16:47:25] awight: I did think of that, but wouldn't that be kinda awful?
[16:47:41] cwdent: right! Well we have Hive, I think that's really the place for it 8p
[16:48:02] * AndyRussG puts his ignorance on display
[16:48:46] oh ok cool! i haven't used hive before but it looks legit
[16:49:12] hehe. I guess the real dealbreaker is whether we actually need to query this data in place, or transform it into a more usable schema.
[16:49:45] If we have time to transform it, then we can do any workaround we'd like to gum up EL
[16:50:48] Since this will be replacing banner impression counts, we might have a low tolerance for lag time before the data becomes available
[16:51:13] EL doesn't have a "send multiple" API, eh?
[16:52:14] nope.
[16:53:04] awww, crikey. EL sends all its data in the GET parameters. I think that forces us to log one event at a time, anyway.
[16:53:16] I don't think you can trust URLs > 1024 chars?
[16:54:47] actually, seems to be around 2k chars, http://stackoverflow.com/questions/219569/best-database-field-type-for-a-url
[16:57:59] awight: cwdent: what about something ad-hoc server-side to grab it and munge it into Hive? I think latency isn't a priority issue
[16:58:23] I wonder what Kafkatee is written in?
[16:58:55] That's a similar thing--grabs our S:RI impressions via Varnish and munges them to look like udp2log
[16:59:07] does the existing implementation send it directly from the browser?
[16:59:15] Hive is not exactly an operational data store either, is the only issue.
[16:59:37] what do you mean awight?
[16:59:40] cwdent: there is no existing implementation of banner history... That's the new bit, part of what motivated us to do all this
[17:00:02] (other than all the wrong reasons)
[17:00:08] Just that, I ran some queries that collected impression counts from the weblogs, for a single campaign, and it took maybe 36 hours and caused all the analops to kick my arse
[17:00:34] heh geez that's ridiculous
[17:00:50] i have never been exposed to the use case for that scale of data processing before
[17:01:22] but like, the current sampling is in regular old sql and doesn't take no 36 hours to return data
[17:01:35] awight: K, but... Maybe that's 'cause of how we're storing stuff? Can't we break things down efficiently and make indexes to get stuff more quicklier?
[17:01:48] I'd actually like to understand Hive... heh in passing
[17:02:07] AndyRussG: I'm anxious that we stick really close to best current practices in the backend... We don't want to be maintaining anything custom.
[17:02:13] I don't think Hive does indexes
[17:02:38] awight: and... what are our best current practices?
[17:02:41] I mean yeah, my query definitely sucked, it had to parse the URL maybe 6 times for every web request
[17:03:07] AndyRussG: I think we would have to ask analytics
[17:04:19] re: my Hive tragicomedy: https://phabricator.wikimedia.org/T90635
[17:04:59] Anyway, my hunch is that we should be munging into MySQL
[17:06:00] awight: yeah also an option...
[17:06:07] *munge munge*
[17:07:32] Cookie!
[17:08:20] hey guys - sounds like we're going to start sending a slow trickle to Brazil, FYI. should be this afternoon. awight ejegg cwdent XenoRyet AndyRussG :)
[17:08:33] awight: were you basically trying to correlate access logs into banner impressions there?
[17:08:45] atgo: cool-o-rama!
[17:08:58] is that what you were querying in hadoop?
[17:09:46] cwdent: nah, just counts. But yeah, I searched all weblogs for ones starting with Special:RecordImpression
[17:10:11] Clearly a losing horse
[17:10:17] heh, gotcha
[17:10:48] Whaa. In case anyone needs a little lift: http://hoofbeats-in-heaven.com/HIH/
[17:10:56] if we're going to record every banner impression, seems like that will get into the billions pretty quick
[17:11:30] cwdent: we do record every one! Yeah that's one reason ops and performance love us soooooooooo much :)
[17:11:58] AndyRussG: oh we do? i thought we only had the downsampled data?
[17:11:58] They've got a baseball bat hanging on the wall, with FR-TECH all over it
[17:12:32] awight: cwdent: ejegg: XenoRyet: In about 20 min I have to do my daily afternoon chauffeur thing, of which this is my last week for... I was thinking of bugging other teams about the EL issue later, like around 12 pm PST... I'd be really happy if anyone else would also like to join in... if you're interested, will you be around then? If you won't be around at that time, what other time would be güt?
[17:12:51] cwdent: no... Note that banner history and banner impressions logging are different things
[17:13:15] sure, but isn't banner history not a thing yet?
[17:13:50] For banner impressions logging, we now use Special:RecordImpression and we're gonna start using Special:BannerLoader. Banner history isn't deployed yet anywhere; that's a more complete list of a user's history of banner impressions
[17:14:28] yeah, that's what i thought
[17:14:46] plain banner impressions are not sampled on the client so a full stream is sent to the servers. Then we have a database that takes a 1:100 sample for quick access, and we have Hive that takes the raw full-on unsampled stream
[17:14:59] So FR can either get the sampled version fast, or the unsampled version slow
[17:15:24] gotcha. so what will go into the new banner history that's not captured in hive?
[17:16:16] For banner history, on the other hand, we're sampling on the client. So on any banner view (or campaign selection + banner hide) we randomly choose whether the user goes in the sample. If they do, they'll send back their whole log of banner displays or campaign selections + banner hides
[17:17:03] ok that makes sense
[17:17:19] cwdent: the patterns of viewing of single users (or rather devices). Since a user's history of what they saw influences what they do, FR wants more than just single banner impressions with no indication of what came before
[17:17:42] could that data be churned out of hive with infinite resources, or does the device correlation not exist?
[17:19:12] atgo: Cool, I'll take a peek at the logs and make sure things are all still good
[17:19:22] thanks! i think they're targeting 1pm
[17:19:27] also... is the goal to have this totally replace the current impressions system and get rid of that one request per banner?
[17:19:36] or will this be in addition to that?
[17:24:19] cwdent: it's in addition! Also, what do you mean re: device correlation?
[17:27:53] AndyRussG: correlating an impression to a user or more properly a device
[17:28:05] or a group of impressions as in banner history
[17:28:51] cwdent: we're being pretty religious about not assigning anyone a unique ID, so no we can't get the history from Hive
[17:28:58] err, from raw impression logs
[17:31:24] cwdent: ^ per what awight said
[17:31:48] in banner history we get a group of correlated impressions, but one that still can't be correlated with another group?
[17:32:06] by user or device
[17:33:26] cwdent: correct
[17:33:39] We're anticorrelationists
[17:33:57] excellent
[17:34:25] ;)
[17:34:32] gotta run! back soooon!
[17:35:17] K4-713: Question about "N0NE PROVIDED"... I assume we want to stuff dummy data into street and zip whenever they are empty. Currently, we have code to skip staging this data if the unstaged data !isset, and that seems wrong to me.
[17:35:51] ejegg: also re ^, that ternary in stage_street is completely insane, eh?
[17:36:25] it's not what it appears ;)
[17:37:41] awight: oh, is that the root of our problem?
[17:38:07] I'm unfolding that code just for readability, but I think it might have done the right thing.
[17:38:17] Except for the !isset case, which is creeping me out.
[17:39:24] what really confuses me is how it stopped working
[17:40:00] My staging/unstaging split in February might have shaken out some moths
[17:40:20] yeah, but it started flaking out in october
[17:40:31] or november, whatever date i put in the phab ticket...
[17:41:07] and that ternary was there since stage_street was written
[17:41:19] I saw those notes, not totally convinced that's the date yet, but thank you for digging!
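(Editor's sketch of the EventLogging constraint discussed earlier in the log: EL events go out as GET parameters, so a nested banner-history log would have to be serialized into a single string field and the resulting beacon URL kept under roughly 2000 characters. Names and shapes here are hypothetical, not the actual makeEventLoggingData()/makeLogEntry() code.)

```javascript
// Hypothetical sketch of the "JSON it up" workaround: flatten a nested
// history log into one JSON string field, then check that the beacon URL
// stays under a conservative ~2000-char limit.
var URL_CHAR_LIMIT = 2000;

function makeEventData( log ) {
	// Each log entry is an object, e.g. { banner: ..., campaign: ..., time: ... }
	return { history: JSON.stringify( log ) };
}

function fitsInBeacon( baseUrl, eventData ) {
	var qs = Object.keys( eventData ).map( function ( k ) {
		return encodeURIComponent( k ) + '=' + encodeURIComponent( eventData[ k ] );
	} ).join( '&' );
	return ( baseUrl + '?' + qs ).length <= URL_CHAR_LIMIT;
}
```

This shows why "send each log entry as its own event" keeps coming up: a long history can blow the URL budget even after serialization.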
[17:42:58] ejegg: It was totally me. git log -p -1 0b72a07d
[17:43:07] I made up that !isset crap, and it's wrong
[17:44:09] oh, lemme see...
[17:51:17] (PS1) Awight: WIP Ensure we're plugging the AVS hole with n0thing [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230584 (https://phabricator.wikimedia.org/T108129)
[17:52:53] K4-713: I'm copying the functionality as you had originally written it--yeah, we always want to stuff the AVS fields even if !isset(unstaged['street'])
[17:56:12] awight: Sorry: Meetingtown. :)
[17:56:36] Ah... yes, I think that makes the most sense. But then again, I wrote that silly thing.
[17:56:41] So, I would think that.
[17:57:12] But: I think it only makes sense if unstaged isn't set in the first place.
[17:57:20] Otherwise, we don't need it.
[17:57:34] Unless...
[17:57:49] awight: Are we looking to make sure there's a numeric value somewhere in the address field, for those purposes?
[17:58:02] Or, are we just checking to make sure it's not empty?
[17:58:04] :/
[17:58:30] Just checking that it's not empty
[17:58:35] I was wondering about that, too...
[17:58:49] There are legit addresses with no number, eh?
[17:59:33] I'm not going to make any assumptions about how not-broken processor code might be :(
[18:00:39] I... kind of want to have a filter test for that, and add points on our end.
[18:00:42] K4-713: But this thing you say about unstaged not being set freaks me out. I thought the whole point was, someone goes to the FR form and it doesn't have street fields.....
[18:01:03] omg. https://payments.wikimedia.org/index.php/Special:GatewayFormChooser?uselang=en&language=en&currency_code=EUR&country=FR&paymentmethod=cc&gateway=worldpay&ffname=worldpay
[18:01:09] Well, consider people who are able to get around validation by not filling them out.
[18:01:13] just got that from a banner.
[18:01:32] K4-713: right. they should also get the N0NE brand
[18:01:38] Hargh. What banner.
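(Editor's sketch of the staging rule being settled on above: always stuff a dummy value into empty AVS fields, even when the unstaged key is missing entirely. The real code is PHP in DonationInterface's stage_street; this JS translation and its names are hypothetical.)

```javascript
// Hypothetical sketch of the intended AVS staging rule (the real
// implementation is PHP in DonationInterface): when street or zip is
// missing OR empty, stage a dummy value so AVS checks still run.
var AVS_DUMMY = 'N0NE PROVIDED';

function stageAvsField( unstaged, field ) {
	var value = unstaged[ field ];
	if ( value === undefined || value === null || String( value ).trim() === '' ) {
		return AVS_DUMMY;
	}
	return value;
}
```

The point of contention in the log is the missing-key case: the old !isset check skipped staging entirely, where this rule stages the dummy value instead.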
[18:01:38] wat
[18:03:22] K4-713: one more bleeding limb for your quarterly report :D
[18:03:40] Fundraising-Backlog, MediaWiki-extensions-DonationInterface: Freaky banner arguments cause redirect to (nonexistent) Worldpay form. - https://phabricator.wikimedia.org/T108605#1524774 (awight)
[18:04:10] Also: I just want to point out that the shortest typeahead form of Fundraising Backlog is "fu ba". Accident? I think not.
[18:09:07] awight: you're onto me
[18:09:27] (PS1) Awight: Remove Worldpay from the form chooser options [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230588 (https://phabricator.wikimedia.org/T108605)
[18:10:51] (PS2) Awight: Remove Worldpay from the form chooser options [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230588 (https://phabricator.wikimedia.org/T108605)
[18:11:14] Fundraising Sprint Queen, Fundraising-Backlog, MediaWiki-extensions-DonationInterface, Unplanned-Sprint-Work, Patch-For-Review: Freaky banner arguments cause redirect to (nonexistent) Worldpay form. - https://phabricator.wikimedia.org/T108605#1524808 (awight)
[18:37:02] (PS2) Awight: Ensure we're plugging the AVS hole with n0thing [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230584 (https://phabricator.wikimedia.org/T108129)
[18:39:11] I'm still unable to run tests locally. Swear I've fixed this same problem ten times....
[18:39:42] mediawiki-core/tests/phpunit/phpunit.php doesn't find any tests for extensions
[18:41:44] Fundraising Sprint Queen, Fundraising Sprint The Pogues, Fundraising-Backlog: Set up import for Major Gifts events payment/invitation tool - https://phabricator.wikimedia.org/T101191#1524919 (XenoRyet) @CCogdill_WMF I notice the example data file doesn't have actual data in the name, address, or emai...
[19:03:37] awight: suite.xml has testsuite "extensions" uncommented?
[19:03:54] and whitelist has the extensions directory?
[19:04:58] * awight sniffs for truffles
[19:07:54] ejegg: Do you need to change anything after a standard install? IIRC the defaults used to work.
[19:08:10] I didn't have extensions in the suite.xml whitelist, but adding it hasn't changed the behavior.
[19:08:30] Oh, maybe that was old!
[19:08:35] What scares me is that it sounds like you've had the same issue--the testextensions suite definitely isn't running
[19:08:52] updating...
[19:09:09] Maybe @group is actually spelt "grup"
[19:10:24] So, I've also been running unit tests with the makefile all this time
[19:10:49] phooey, updating broke it for me
[19:10:59] PHP Fatal error: Call to undefined function MediaWiki\suppressWarnings()
[19:11:04] heh
[19:11:14] how do you run them? just phpunit from the tests dir?
[19:11:54] cwdent: I forget where I read this, but make phpunit FLAGS="--group DonationInterface"
[19:11:55] ejegg: that's a composer install thing
[19:12:01] awight: ah, thanks!
[19:12:21] Argh, I'm spreading the poison!
[19:12:31] awight: cwdent: ^ what ejegg said!
[19:13:41] +1 -- thanks! yeah it must be finding a different suite.xml??? "--configuration .../tests/phpunit/suite.xml" is the winning snippet
[19:13:48] gets added by the Makefile
[19:13:52] huh...
[19:14:11] Aww balls, but now it's not restricting tests to just DonationInterface
[19:14:25] nvm. it is
[19:14:30] awight: do you have to install the dev dependencies to get rid of that undefined suppressWarnings?
[19:14:35] * awight considers hiding behind |fud
[19:15:03] ejegg: I usually run composer with no --no-dev argument, so I guess that installs dev dependencies
[19:15:12] thx!
[19:15:41] ejegg: Looks like it's from a non-dev library, though: mediawiki/at-ease
[19:16:35] weird, i don't have any mediawiki dir in vendor, just wikimedia
[19:16:50] Whatever you do, don't read phpunit.php where it says "// Hack to eliminate the need to use the Makefile (which sucks ATM)"
[19:16:59] Heh
[19:17:21] ejegg: sorry, looks like composer.lock is gitignored, so you'll have to composer update
[19:17:28] ah, oops, that's it!
[19:17:45] thx again.
[19:17:50] dang this seems like a can of worms
[19:18:04] (CR) Ejegg: [C: 2] "That oughtta fix it!" (1 comment) [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230584 (https://phabricator.wikimedia.org/T108129) (owner: Awight)
[19:18:13] yeah, the composer discussion has been a few years of disagreement and shared worst practices, AFAICT
[19:19:18] Nice that we're taking a chisel to the obelisk, though. Doesn't have to be very sharp.
[19:20:32] Fundraising Tech Backlog, Fundraising-Backlog, Wikimedia-Fundraising-CiviCRM, MediaWiki-extensions-DonationInterface, Technical-Debt: Merge CRM and DonationInterface queue wrappers - https://phabricator.wikimedia.org/T95647#1525061 (awight) p:Low>Normal
[19:21:25] Fundraising Tech Backlog, Fundraising-Backlog, Wikimedia-Fundraising-CiviCRM, MediaWiki-extensions-DonationInterface, Technical-Debt: Merge CRM and DonationInterface queue wrappers - https://phabricator.wikimedia.org/T95647#1196764 (awight) Raising the priority cos this just became much more r...
[19:22:09] (Merged) jenkins-bot: Ensure we're plugging the AVS hole with n0thing [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230584 (https://phabricator.wikimedia.org/T108129) (owner: Awight)
[19:24:31] (CR) Awight: Ensure we're plugging the AVS hole with n0thing (1 comment) [extensions/DonationInterface] - https://gerrit.wikimedia.org/r/230584 (https://phabricator.wikimedia.org/T108129) (owner: Awight)
[19:36:22] Fundraising Sprint Queen, Fundraising Sprint The Pogues, Fundraising-Backlog, fundraising-tech-ops, Unplanned-Sprint-Work: Make DonationInterface fatal errors accessible - https://phabricator.wikimedia.org/T107918#1525103 (awight) Looks like this will be possible: https://www.mediawiki.org/wi...
[19:42:10] Hi ejegg
[19:43:41] Hi jessicarobell!
[19:44:09] ejegg awight: I just wanted to give you a heads up that the Brazil test will go up at 13 PST (in 20 minutes). It will be limited to 10% traffic though so I don't think you'll be seeing a lot in the logs. The test will run for 24 hours. :)
[19:44:49] thanks for the heads-up! I'll let you know if anything server-side looks amiss
[19:45:07] Thanks ejegg!
[19:56:21] Fundraising Sprint Queen, Fundraising Sprint The Pogues, Fundraising-Backlog, fundraising-tech-ops, Unplanned-Sprint-Work: Make DonationInterface fatal errors accessible - https://phabricator.wikimedia.org/T107918#1525150 (Jgreen) Wow, that's great!
[20:05:54] AndyRussG: you around?
[20:15:22] Fundraising Sprint Queen, Fundraising-Backlog, MediaWiki-extensions-DonationInterface, Unplanned-Sprint-Work, Patch-For-Review: Freaky banner arguments cause redirect to (nonexistent) Worldpay form. - https://phabricator.wikimedia.org/T108605#1525181 (atgo) a:awight
[20:58:22] used systemd to change power button behavior. was easy and painless
[20:58:45] i still haven't done anything with systemd that made me hate it
[21:01:40] woot!
[21:11:30] awight: cwdent: https://office.wikimedia.org/wiki/Data_access
[21:11:43] also https://www.mediawiki.org/wiki/Extension:EventLogging/Guide#Analyzing_EventLogging_data
[21:12:53] Oh, great links!
[21:13:26] huh yeah, this says mysql and mongo?
[21:14:12] woohoo! an existing infrastructure where some people already figured things out :p
[21:14:34] I have no clue what the mongo thing is about, though--I was only aware of the mysql tables.
[21:22:25] oh, the hive backend is mysql?
[21:22:35] the EL backend is, afaik
[21:22:51] ooh, EL != hive
[21:23:07] ejegg|food: I'm gonna deploy the AVS thing, just so nobody else has to step on that mine.
[21:23:43] i thought EL was stored in hive?
[21:23:48] and that hive was https://wiki.apache.org/hadoop/HDFS
[21:24:23] EL might get replicated to Hive, donno, but I'm sure that there are also mysql tables for all the records
[21:24:47] (PS1) Awight: Merge branch 'master' into deployment [extensions/DonationInterface] (deployment) - https://gerrit.wikimedia.org/r/230657
[21:25:01] (CR) Awight: [C: 2] Merge branch 'master' into deployment [extensions/DonationInterface] (deployment) - https://gerrit.wikimedia.org/r/230657 (owner: Awight)
[21:25:25] (Merged) jenkins-bot: Merge branch 'master' into deployment [extensions/DonationInterface] (deployment) - https://gerrit.wikimedia.org/r/230657 (owner: Awight)
[21:26:56] (PS1) Awight: update DonationInterface submodule [core] (fundraising/REL1_25) - https://gerrit.wikimedia.org/r/230658
[21:27:17] (CR) Awight: [C: 2 V: 2] update DonationInterface submodule [core] (fundraising/REL1_25) - https://gerrit.wikimedia.org/r/230658 (owner: Awight)
[21:27:48] awight: cool, didn't look too dangerous!
[21:28:06] Not unless you live in South Africa...
[21:28:40] well, with luck the banks are still doing the same checks
[21:29:04] awight: ejegg: cwdent: dstrine: XenoRyet: K I see ottomata is online on #wikimedia-analytics... I just updated the notes on the data structure we needed (https://www.mediawiki.org/wiki/Extension:CentralNotice/Notes/Campaign-associated_mixins_and_banner_history#Data_and_logging), so I'll start pinging wildly there, unless anyone would like me to wait for them to be more available...
[21:29:41] AndyRussG: whenever is fine, i'll likely just lurk!
[21:29:59] * awight grabs my loudest peanuts
[21:30:12] thanks for being so thorough with documentation, too
[21:30:35] yeah i'll lurk too...
[21:30:55] !log updated paymentswiki from af16d371f9c46d4f0b78986080f2a2be3226ace8 to 325640bd70680a08ae77fd117433565634a98d88
[21:31:00] Logged the message at https://wikitech.wikimedia.org/wiki/Server_Admin_Log, Master
[21:32:37] there are mongo and mysql subscribers to the zeromq stream already? where do they live?
[21:33:35] I try to not know :p
[21:34:13] heh, this all makes sense though... and seems like it should fit the use case after all?
[21:34:46] looks like you can use... html tables... to define schema
[21:39:41] heh caramel popcorn
[21:39:48] AndyRussG: mind adding me to the event if you've got one set up?
[21:41:30] ejegg: yeah! only hangout is tomorrow, right now I'm just about to start IRC'ing on #wikimedia-analytics
[21:41:51] oh, gotcha!
[21:42:47] dstrine: it looks like I can't add guests to tomorrow's meeting with ellery, maybe you could add ejegg? thanks!!! :)
[21:43:23] ah sorry
[21:43:39] I added ejegg and opened the invite to edits
[21:43:54] thanks!
[21:44:26] dstrine: thx!
[21:45:40] Fundraising Sprint The Pogues, Fundraising-Backlog, Unplanned-Sprint-Work: Kick Silverpop export job - https://phabricator.wikimedia.org/T107184#1525687 (awight)
[21:46:03] Fundraising Sprint The Pogues, Fundraising-Backlog, fundraising-tech-ops, Unplanned-Sprint-Work: Footer images on payments missing - https://phabricator.wikimedia.org/T106728#1525689 (awight)
[21:46:39] Fundraising Sprint N*E*R*D, Fundraising-Backlog, Wikimedia-Fundraising-CiviCRM, fundraising-tech-ops, Unplanned-Sprint-Work: Access to https://civicrm.frdev.wikimedia.org/ - https://phabricator.wikimedia.org/T104658#1525699 (awight)
[21:50:16] https://abc.xyz/
[21:57:25] Fundraising Tech Backlog, MediaWiki-extensions-DonationInterface: More and easier testing for DonationInterface - https://phabricator.wikimedia.org/T86247#1525727 (awight)
[21:57:58] Wikimedia-Fundraising, MediaWiki-extensions-CentralNotice, Easy, Technical-Debt: Use CSS instead of obsolete HTML attributes on CentralNotice tables - https://phabricator.wikimedia.org/T108259#1525729 (Aklapper)
[22:36:11] fundraising-tech-ops, Traffic, operations, Patch-For-Review: Decide what to do with *.donate.wikimedia.org subdomain + TLS - https://phabricator.wikimedia.org/T102827#1525929 (CCogdill_WMF) IBM tells us it's not possible to have a customized domain with an active ssl cert in place; they aren't able...
[23:13:59] AndyRussG: moving a side question here, to reduce havoc... So if banner history is controlled per campaign, is it possible to get data back from an hour-long "campaign", or only from a long-running campaign?
[23:17:53] awight: it's turned on or off by campaign but aggregated between campaigns
[23:19:05] whoa.
[23:19:48] awight: I mean, the log is shared among all campaigns that have it on
[23:20:04] So in the log you sample, you may get entries that were added during other campaigns
[23:20:39] We record banner impressions from campaigns without the feature enabled, right?
[23:22:58] awight: yeah the two features are separate. We continue to record full-on unsampled banner impressions, currently with S:RI and soon with S:BL
[23:22:59] And, it took me a minute but I think I see the logic in only sending histories back during a campaign. You're getting people whose most recent impression is in the campaign.
[23:23:49] awight: heh, the only real logic is it's CN-specific code, and we're not turning it on globally for all campaign-targetteds
[23:24:05] mmm, I think it's relevant if a non-history campaign shows someone a bunch of banners, in the same way that it's important to track "no impression" pageviews.
[23:24:39] We can work this all out, no rush. I'm probably misunderstanding...
[23:29:24] awight: it could be relevant, but it's not tracked currently. We'd have to activate the feature globally for all campaigns
[23:30:39] Argh. Cos mixin.
[23:31:00] Well, it could be packaged as a CN RL module instead.
[23:33:13] yes....
[23:57:29] (PS2) Ejegg: Clear out old Amazon code to prepare for PwA [extensions/DonationInterface] (payWithAmazon) - https://gerrit.wikimedia.org/r/230253 (https://phabricator.wikimedia.org/T108112)
[23:57:31] (PS1) Ejegg: Add id attribute to amount and currency [extensions/DonationInterface] (payWithAmazon) - https://gerrit.wikimedia.org/r/230706
[23:57:33] (PS1) Ejegg: WIP redirect to Amazon for login. [extensions/DonationInterface] (payWithAmazon) - https://gerrit.wikimedia.org/r/230707
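(Editor's sketch of the sampling semantics discussed above: banner history samples on the client, the whole device goes in or out of the sample, and the device-local log is shared among all history-enabled campaigns, so a sample taken during one campaign includes entries recorded during others. Shapes and names are hypothetical, not the real CentralNotice data structures.)

```javascript
// Hypothetical illustration: entries accumulate across campaigns, and a
// device chosen for the sample sends back its whole log, including entries
// recorded while other campaigns were running.
function sampleHistory( log, rate, random ) {
	// Client-side sampling: the whole device is in or out of the sample.
	return random() < rate ? log.slice() : null;
}
```

Injecting the random source keeps the sketch testable; in the browser this would just be Math.random().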