[13:10:26] (PS1) Erik Zachte: protect against dounble run with flock [analytics/wikistats] - https://gerrit.wikimedia.org/r/126240
[13:10:41] (CR) jenkins-bot: [V: -1] protect against dounble run with flock [analytics/wikistats] - https://gerrit.wikimedia.org/r/126240 (owner: Erik Zachte)
[13:16:19] (PS1) Erik Zachte: 'incl bots' -> 'excl bots' [analytics/wikistats] - https://gerrit.wikimedia.org/r/126242
[13:16:24] (CR) jenkins-bot: [V: -1] 'incl bots' -> 'excl bots' [analytics/wikistats] - https://gerrit.wikimedia.org/r/126242 (owner: Erik Zachte)
[14:59:11] (PS7) Milimetric: Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[14:59:20] (CR) Milimetric: Fix parse_username to handle Unicode (5 comments) [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[14:59:30] (PS8) Milimetric: Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:08:36] (PS1) Erik Zachte: massive cleanup of comscore files (standard naming now) [analytics/wikistats] - https://gerrit.wikimedia.org/r/126262
[15:08:43] (CR) jenkins-bot: [V: -1] massive cleanup of comscore files (standard naming now) [analytics/wikistats] - https://gerrit.wikimedia.org/r/126262 (owner: Erik Zachte)
[15:08:56] (CR) Csalvia: [V: 2] Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:10:36] (CR) Nuria: [C: 1 V: 2] Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:11:03] (CR) QChris: "Jenkins is complaining, as the test suite was been moved away." [analytics/wikistats] - https://gerrit.wikimedia.org/r/126262 (owner: Erik Zachte)
[15:14:50] (CR) QChris: "recheck" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126262 (owner: Erik Zachte)
[15:15:10] (PS1) Erik Zachte: use new comScore csv file names [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264
[15:15:15] (CR) jenkins-bot: [V: -1] use new comScore csv file names [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[15:16:06] (PS1) Erik Zachte: handle missing values slightly different [analytics/wikistats] - https://gerrit.wikimedia.org/r/126265
[15:16:11] (CR) jenkins-bot: [V: -1] handle missing values slightly different [analytics/wikistats] - https://gerrit.wikimedia.org/r/126265 (owner: Erik Zachte)
[15:25:08] (CR) QChris: "recheck" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[15:25:10] (CR) QChris: "recheck" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126265 (owner: Erik Zachte)
[15:28:06] (CR) QChris: "Jenkins does not seem to pick up the "recheck"." [analytics/wikistats] - https://gerrit.wikimedia.org/r/126262 (owner: Erik Zachte)
[15:29:21] (CR) QChris: "Jenkins does not seem to pick up the "recheck". But when I tell Jenkins to rebuild directly, it goes green:" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[15:30:22] (CR) QChris: "Jenkins does not seem to pick up the "recheck". But when I" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126265 (owner: Erik Zachte)
[15:32:14] hashar: Can I get Jenkins to revote on a change (without uploading a new patch set) if it does not pick up "recheck"?
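An aside on the first change above ("protect against double run with flock"): the usual guard is to wrap the cron command in the flock(1) utility so a second invocation exits immediately while one is still running. A rough Python sketch of the same pattern, with a hypothetical lock-file path, might look like this:

```python
# Minimal sketch of a flock-style double-run guard (the lock path is hypothetical).
import fcntl
import sys

lock_file = open("/var/lock/wikistats-run.lock", "w")
try:
    # Non-blocking exclusive lock: fail fast instead of queueing behind a running job.
    fcntl.flock(lock_file, fcntl.LOCK_EX | fcntl.LOCK_NB)
except BlockingIOError:
    sys.exit("another run is already in progress; exiting")

# ... the actual job runs here; the lock is released when the process exits ...
```

The flock(1) wrapper in a crontab entry achieves the same effect without touching the script itself.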
[15:32:28] https://gerrit.wikimedia.org/r/#/c/126264/1
[15:34:08] (PS9) Milimetric: Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:34:39] (CR) Milimetric: Fix parse_username to handle Unicode (1 comment) [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:34:46] (CR) Csalvia: [C: 2 V: 2] Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:35:11] (PS1) Erik Zachte: dataset2 -> dataset1001 + remove test code [analytics/wikistats] - https://gerrit.wikimedia.org/r/126371
[15:35:43] (CR) Nuria: [C: 2] Fix parse_username to handle Unicode [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/125752 (owner: Csalvia)
[15:38:45] milimetric: +1 for postgres in eventlogging
[15:38:55] milimetric: but cross dbengine joins :(
[15:38:56] seriously YuviPanda :)
[15:39:08] yeah, no, we need a postgresql cluster
[15:39:12] that takes in everything
[15:39:22] Aaron S. and I were talking about this last time I was in SF
[15:39:29] there are a lot of nasty issues
[15:39:38] but the advantage to having a real db would be pretty great
[15:40:05] i meet a lot of people at the foundation that say they hate SQL, I think they really mean they hate mysql
[15:40:32] milimetric: indeed.
[15:40:39] milimetric: it's a lot of work tho :(
[15:40:53] yea, no doubt
[15:40:55] (PS1) Erik Zachte: minus negwikibooks, plus zhwikivoyage [analytics/wikistats] - https://gerrit.wikimedia.org/r/126442
[15:42:50] (PS1) Erik Zachte: dataset2 -> dataset 1001 [analytics/wikistats] - https://gerrit.wikimedia.org/r/126468
[15:44:16] (PS1) Erik Zachte: updated binaries and html index file [analytics/wikistats] - https://gerrit.wikimedia.org/r/126487
[15:45:45] milimetric: we're hiring some more android folks, so I will be doing some analytics work once the app releases (a month?). Will be building a non-limn dashboard, most probably using vega :) Do let me know if you've thoughts
[15:46:44] cool YuviPanda, our vega project might be just getting started around the same time
[15:46:54] right now the designers are trying to figure out how to lay out the dashboard
[15:47:01] we should definitely keep in touch
[15:47:06] will do, milimetric
[15:48:35] (PS1) Erik Zachte: new absolute monthly counts, plus added dates to ignore to blacklist [analytics/wikistats] - https://gerrit.wikimedia.org/r/126546
[16:14:46] (CR) Hashar: "Sorry you would need a patchset. recheck does not rerun the tests until we get them to run in isolated sandboxes :-(" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[16:25:36] (PS1) Erik Zachte: new images for wikistats portal [analytics/wikistats] - https://gerrit.wikimedia.org/r/126665
[16:30:17] YuviPanda, what's "the UA RfC"?
[16:30:33] Ironholds: qchris pointed it out to me in an email in that thread.
let me find it
[16:30:49] * qchris denies everything
[16:31:46] http://tools.ietf.org/html/rfc2616
[16:32:02] YuviPanda ^
[16:33:01] Ironholds: http://tools.ietf.org/html/rfc2616#section-3.8 and the section just before http://tools.ietf.org/html/rfc2616#section-3
[16:33:01] so the UA mentions in the HTTP 1.1 RfC
[16:33:02] Ironholds: let me put those in gerrit
[16:33:33] Ironholds: http://lists.wikimedia.org/pipermail/analytics/2014-March/001730.html
[16:33:47] yes, your user agent is not compliant with that
[16:33:56] if it was, ua-parser would parse it
[16:34:02] I know this because I just altered it TO comply, and it parses ;p
[16:34:20] aha! what's the fix?
[16:34:28] semicolon after the OS version number.
[16:34:32] As qchris's email has
[16:34:45] I'll leave a note
[16:34:52] Ironholds: ah, indeed. Like https://gerrit.wikimedia.org/r/#/c/126247/ does
[16:35:18] then why did the example UAs in the email not include em?
[16:35:37] Ironholds: it's just not been implemented in the iOS version, and I added it to the Android version after Adam sent that email
[16:35:43] Ironholds: hence the confusion.
[16:35:52] ahh
[16:36:08] I should've replied to that email, but it was late when I saw it so didn't.
[16:36:12] and yay for confusion resolution :)
[16:37:21] Ironholds: Can you comment there or should I?
[16:37:27] I'm gonna
[16:37:34] I'm just testing/resolving the iPhone one
[16:37:48] so far I've got the modification necessary to make it recognise it's an iPhone. From there..
[16:38:24] ok!
[16:38:41] and woot for doing these *before* the app releases :)
[16:39:15] well, I said I'd test em
[16:39:27] and I don't particularly want to be on the hook for parsing the unparseable, or writing weird rules we have to maintain
[16:39:54] indeed, but if they do match the RfC they should parse.
[16:40:15] except they didn't as provided to me!:P
[16:40:21] * Ironholds checks the iOS patch
[16:40:34] indeed.
[16:40:41] Ironholds: incomplete they are.
[16:41:09] the android patch should be merged shortly. then they will be WikipediaApp/v2.0 (Android/4.4;tablet)
[16:41:13] err, Tablet
[16:41:17] should there be a space between ; and tablet?
[16:42:41] no slash!
[16:42:45] Android 4.4;
[16:43:07] and, good question
[16:43:45] answer comes back 'it's irrelevant, ua-parser doesn't have device class parsing yet, just device parsing'
[16:43:52] so, if you can provide the actual device, that'd be grand
[16:44:02] if it's just tablet versus mobile it probably won't pick it up
[16:44:02] actual device as in?
[16:44:09] HTC Explorer vs Nexus 4?
[16:44:14] HTC ([A-Z][a-z0-9]+) Build'
[16:44:17] we explicitly didn't put that in because of privacy concerns.
[16:44:29] * Ironholds sighs
[16:44:32] okay. hurgh.
[16:44:45] (CR) QChris: "> Sorry you would need a patchset. recheck does not rerun the tests" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[16:44:48] then include tablet however the hell you want
[16:44:56] because either I'll have to write a regex for it or we won't use it ;p
[16:45:06] heh
[16:45:06] but either way format doesn't matter as long as it doesn't interfere with OS identification
[16:45:11] so
[16:45:21] WikipediaApp/v2.0 (Android 4.4; Tablet)
[16:45:21] WikipediaApp/v2.0 (Android 4.4; tablet)
[16:45:24] hah!
[16:45:24] hah!
[16:45:26] ...
[16:45:27] hah!
[16:45:59] missed that. my mind reader mustn't be working
[16:46:23] user agents: reducing confidence in the theory that we are different people since 1999
[16:46:27] permission to post this to bash?
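Ironholds' check above — tweak the UA string until ua-parser picks it up — is easy to reproduce locally, since ua-parser ships a Python package. A minimal sketch, using the candidate UA strings from the discussion; the exact output fields depend on the uap-python version and its bundled regexes.yaml:

```python
# Quick local check of the proposed app user agents with uap-python
# (pip install ua-parser). Prints the OS and device families it recognises.
from ua_parser import user_agent_parser

candidates = [
    "WikipediaApp/v2.0 (Android 4.4; Tablet)",
    "WikipediaApp/v2.0 (Android 4.4; tablet)",
]

for ua in candidates:
    parsed = user_agent_parser.Parse(ua)
    print(ua)
    print("  os:    ", parsed["os"]["family"], parsed["os"]["major"], parsed["os"]["minor"])
    print("  device:", parsed["device"]["family"])
```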
[16:47:18] hehe
[16:47:24] definitely
[16:47:42] Ironholds: I updated the Android patch to provide that.
[16:48:06] grand!
[16:50:01] alright, patch note added
[16:50:26] y'know, ua-parser is implemented in python
[16:50:30] so testing is trivial
[16:50:34] Indeed
[16:50:54] now need to figure out who to hassle for my stat1003 access (I responded on the thread)
[16:50:59] no ottomata :(
[16:51:13] so are we not including device for iOS devices, where it's just iPhone versus iPod versus iPad?
[16:52:26] good question. depends on what makes more sense to you.
[16:52:32] but there's also iPod touch
[16:52:38] yep
[16:52:42] I more meant from a privacy POV
[16:52:47] but yeah, we can just make it iPhone vs iPad and let it be, or add the ; tablet
[16:52:53] current idea is to mimic android
[16:52:54] point being: fewer devices, so that's good, buuut
[16:53:05] I mean, we won't be reporting iPhone 5s, for example.l
[16:53:20] s/l$//
[17:02:35] yup
[17:06:42] YuviPanda, got it to work
[17:06:52] Ironholds: woot!
[17:06:58] Ironholds: what's the form it worked in?
[17:08:57] YuviPanda, see email and the patch
[17:09:05] Ironholds: woot. awesome!
[17:09:15] Ironholds: I think adam will fix iOS, and android patch should be merged shortly
[17:09:18] cool
[17:09:20] Ironholds: ty!
[17:09:23] I can always patch it myself, but eeeeh.
[17:09:24] cool
[17:14:01] hey ottomata!
[17:14:19] yoyooo
[17:14:20] I remember responding to you about stat1 access moving over, but looks like I don't have stat1003 access? nor bast1001 :(
[17:14:36] hmmmmMMmm
[17:15:16] ! i don't know how that slipped by Yuvi!
[17:15:29] fixing...
[17:15:48] ty ottomata!
[17:19:30] Hey ottomata. Is it possible that some cron jobs are still running from old stat1?
[17:20:33] yes! i didn't disable any of them, i just moved the new ones over
[17:20:39] so they are probably running in both places?
[17:21:09] There are some problematic queries running against s1 and I suspect they are attached to old, defunct crons.
[17:21:58] Sadly, they are using the common "research" user for connecting to MySQL.
[17:22:15] Do you know how I might track down the source?
[17:23:49] well, afaik all crons are still running on stat1
[17:24:02] stat1 is still up, log in and see if any of your crons are there, and if you don't want them, just disable them?
[17:24:24] ahh
[17:24:26] Oh yeah. Totally could do that for myself. This issue isn't one of my crons though.
[17:24:33] ottomata, related question, what happens to crons of people who leave?
[17:25:02] if we remove their account...i'm not actually sure! I assume they won't run...but they might
[17:25:11] i think we don't actually remove accounts though
[17:25:15] usually we just disable ssh keys
[17:25:33] Could we stop all the crons of people with disabled keys?
[17:26:06] Only we're pretty sure some of erosen's scripts are still triggering (or were)
[17:26:14] DarTar: ^
[17:26:51] let's see...
[17:27:07] hey
[17:27:41] so I am not sure if erosen’s scripts are reused by other people, aren’t they generating data for the geodashboards used by grantmaking?
[17:28:01] halfak: i do see that you have crons on stat1
[17:28:10] that are probably still running
[17:28:10] including some mysql stuff
[17:28:14] I just killed 'em ottomata
[17:28:18] like 2 secs ago
[17:28:26] ottomata: I do have crons, I can port them
[17:28:54] DarTar, should have been already ported for you.
[17:29:01] ok cool
[17:29:10] DarTar, they have all been ported ja
[17:29:13] danke
[17:29:14] but I did not remove them from stat1
[17:29:25] DarTar, if erosen's account is running old crons that we need, we need to take them over.
[17:29:29] so they are probably running on stat1 and stat1003 right now
[17:29:35] i don't see erosen crons on stat1
[17:29:36] I'll add a card to audit old crons.
[17:29:40] if the rsync to stat1001 is in place from stat1003 I can safely disable them all from stat1
[17:29:41] checking other hosts
[17:29:52] DarTar it should be
[17:30:05] also i think the rsync from stat1 won't work anymore as of yesterday anyway
[17:30:06] ok
[17:30:10] hmm, or today
[17:30:13] sorry, yeah this morning
[17:30:16] alright
[17:33:10] hm ottomata what’s stat1003 full hostname?
[17:33:17] stat1003.wikimedia.org
[17:33:30] I can’t SSH into it for some reason
[17:34:02] my connection hangs
[17:34:03] you have to go through bast1001
[17:34:08] are you?
[17:34:09] ah, right
[17:34:12] Firewalls!
[17:34:12] makes sense
[17:35:38] heh
[17:35:45] dartar@bast1001:~$ ssh stat1003.wikimedia.org
[17:35:45] Permission denied (publickey).
[17:35:52] huh
[17:35:58] DarTar
[17:36:03] you can't ssh directly without forwarding your ssh key
[17:36:15] https://wikitech.wikimedia.org/wiki/Server_access_responsibilities#SSH
[17:36:30] heh, I forgot that
[17:36:36] you should use ProxyCommand, though, not forward
[17:36:37] but ja
[17:38:09] I thought my .ssh/config was already set up to use ProxyCommand, I’ll have to check again
[18:37:39] halfak, Ironholds, DarTar: the hangout link on the calendar item for the research showcase needs to be updated
[18:38:08] yes, I’ll remove it, sorry for the confusion
[18:39:07] wow, hangout doesn’t allow me to remove a video link once it’s added
[18:39:08] clever
[18:39:13] I’ll ask Office IT
[18:39:41] i don't have the right link
[18:39:46] is there no showcase at the moment?
[18:41:30] ^ DarTar
[18:42:06] ori: https://www.youtube.com/watch?v=Pps__TkfrMs
[18:42:27] thank you
[18:42:29] and #wikimedia-research for discussion
[18:42:47] moar channels
[18:46:46] (CR) Hashar: "@qchris and I upgraded Zuul today, I should be able to have recheck to retrigger test jobs for trusted folks :-]" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[19:09:16] ori: that channel was created around 2005-6 :p
[19:09:28] long before #wikimedia-analytics existed
[19:12:54] is there a doc setup on how to reach the mysql slaves from stat1003? I've forgotten...
[19:16:36] YuviPanda: maybe this? https://wikitech.wikimedia.org/wiki/Analytics/Data_access#Analytics_slaves
[19:18:45] ottomata: is what I need. ty!
[19:27:29] yay!
[19:27:33] people are using my documentation
[19:27:48] thanks for that Ironholds :)
[19:27:54] I will try to keep it up to date too, as things change
[19:27:57] like, more data going into hive
[19:28:02] etc.
[19:28:17] no problemo!
[19:28:20] congrats on bits, btw :)
[19:30:49] ottomata: is there a way I can see a stream of all events that are coming in? Mine might not be validating, and I need to verify that
[19:34:28] milimetric: ^
[19:34:39] or more to the point, how do I debug if my events are making it at all vs if they aren't validating?
[19:34:40] events like event log?
[19:34:46] ottomata: EventLogging, yeah.
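The zsub tool that comes up in the replies below is essentially a small ZeroMQ subscriber, so watching the EventLogging stream by hand is a few lines of pyzmq. A sketch, with a placeholder endpoint since the actual publisher host and port depend on the deployment:

```python
# Subscribe to an EventLogging ZeroMQ publisher and print each event as it
# arrives. The endpoint below is a placeholder, not the real address.
import zmq

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.setsockopt(zmq.SUBSCRIBE, b"")              # empty prefix: receive everything
socket.connect("tcp://eventlogging.example:8600")  # placeholder host:port

while True:
    print(socket.recv_string())
```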
[19:34:57] hm, I think you need to have an account on vanadium for that, not sure,
[19:35:00] ottomata: https://wikitech.wikimedia.org/wiki/EventLogging#Varnish mentions vanadium
[19:35:08] afaik, eventlogging on stat1003 is just archived logs
[19:35:09] hourly i think
[19:35:10] right
[19:35:13] oh wat
[19:35:20] no, the mysql replication is instantaneous, I think?
[19:35:42] oh in the db?
[19:35:43] i guess so
[19:35:48] i'm thinking on the disk
[19:35:49] https://wikitech.wikimedia.org/wiki/EventLogging#Check_if_the_data_I_am_logging_is_coming_through_properly.3F
[19:35:51] is that helpful?
[19:36:20] ottomata: no because zsub isn't installed
[19:36:24] on stat1003
[19:36:25] oh hm
[19:36:27] hmmmm
[19:36:38] oh yeah, we must've missed that in the migration
[19:36:45] as in - it probably wasn't puppetized
[19:36:54] ah, right.
[19:36:59] should I make a patch?
[19:37:07] but that's where i'd check the zeromq stream usually
[19:37:30] sure YuviPanda, patches welcome, but I think after that we'd need to subscribe it as well, I never did that but I guess I could learn
[19:37:41] also, YuviPanda, any problem with just checking the db?
[19:37:54] milimetric: yeah, it isn't there in the db
[19:38:03] milimetric: so I need to get the raw logs which will tell me if it is being dropped
[19:38:30] ok, makes sense YuviPanda, yeah, events might not be validating or there could be other problems between the stream and its being written to the db
[19:38:30] i can patch it for install
[19:38:32] few secs
[19:38:39] so yeah, 1. install 2. subscribe
[19:38:39] ottomata: woo!
[19:39:04] milimetric: yeah, subscribe should be simple if there are no firewall issues
[19:48:40] (CR) QChris: "recheck" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[19:50:22] qchris: na it is not working yet :-D
[19:50:28] :-D
[19:50:34] Was trying ... just in case.
[19:50:41] we never know
[19:50:45] But I am replying to your comment.
[19:51:09] I learnt that hashar is quick ... so I wanted to check before requesting it :-P
[19:51:19] * hashar blushes
[19:51:36] with the old zuul version, it would retrigger based solely on the comment
[19:51:53] with the new version, we can combine requirements
[19:52:15] (CR) QChris: "> I upgraded Zuul today" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[19:52:16] so we could get "comment is 'recheck'" AND "email is whitelisted"
[19:52:26] Mhmmmm.
[19:52:30] Sounds supergood.
[19:53:03] Would it be hard to get that working?
[19:53:15] * hashar ponders
[19:53:29] it is 10pm and I haven't finished my cognac + I need some sleep
[19:53:40] Hahaha. Cognac first.
[19:53:45] on a serious note, no that is not too hard
[19:53:56] just have to read the doc, figure out what is needed, test it locally and push!
[19:54:32] It would help me a lot :-)
[19:59:37] the hardest part is figuring out whether a bug got filed
[20:03:42] milimetric: fwiw, I can use zsub as mentioned in the wikitech EL Page to debug, so all is well
[20:04:01] ok, awesome YuviPanda
[20:07:05] (CR) Hashar: "qchris : the zuul conf would be https://gerrit.wikimedia.org/r/#/c/126838/ :-D" [analytics/wikistats] - https://gerrit.wikimedia.org/r/126264 (owner: Erik Zachte)
[20:07:35] hashar: That's not fair.
[20:07:41] I am not done with filing the bug ...
[20:07:47] well once you did
[20:07:49] Meh...
I knew you were quick :-)
[20:07:54] you can amend that change and Bug: 123456
[20:08:08] then I can redirect the necessary bike shed to that bug report :-]
[20:08:12] And I can make the bug's description "merge change 126838"
[20:08:24] to be honest, I already thought about that change during the dinner
[20:08:31] Cheater!
[20:09:09] before having a kid, I was dreaming about problems, found a solution during the shower. Would then show up at 9:30 exactly, code, and by 10am my day was over :]
[20:09:25] Hahaha.
[20:10:07] the recheck thing, I will go to test it out on labs
[20:10:31] I have to rebuild Zuul/Jenkins in labs for dev purposes tomorrow afternoon. The change above is going to be an excellent motivation.
[20:22:08] hmm, any idea why making a request with 'https://bits.wikimedia.org/event.gif?%7B%22schema%22%3A%22MobileWikiAppEdit%22%2C%22revision%22%3A%228174633%22%2C%22event%22%3A%7B%22action%22%3A%22start%22%2C%22userName%22%3A%22yuvitest44%22%2C%22editSessionToken%22%3A%22cf724bdf-e4d7-4b2f-9db0-0b5a3e129ba7%22%7D%7D' isn't even doing anything?
[20:22:13] nothing on the varnishncsa logs either
[20:22:46] milimetric: ^
[20:22:54] (or whoever I should poke about EL)
[20:23:17] ah, non 200 code
[20:23:19] * YuviPanda investigates
[20:25:02] oh, nevermind. it is returning a 204 as it should
[20:25:05] but nothing on the server
[20:25:06] HWY
[20:25:07] WHY
[20:26:39] ah cool
[20:26:43] it does hit varnishncsa
[20:26:45] ignore me, sorry
[20:46:34] ok, so it hits varnish but somehow gets dropped in the middle and I've no idea why :(
[20:49:01] {"schema":"MobileWikiAppEdit","revision":8188113,"event":{"action":"start","userName":"yuvitest44","editSessionToken":"77078656-8520-4c5d-832a-7b37b4ba7cf8"}} is the payload
[20:49:09] and it is hitting https://bits.wikimedia.org/event.gif
[20:49:14] and I get back a 204
[20:49:35] I looked at both EventCapsule and event itself and they both seem to validate. Unsure why it isn't coming through
[20:50:31] anyone?
[20:52:26] * YuviPanda pokes milimetric and halfak randomly
[20:52:52] In meeting -- receptive to random poke in ~ 45 min
[20:53:01] halfak: oh, I guess that's all of you then.
[20:53:07] halfak: i'll just wait. Thanks for that!
[21:13:46] hm, YuviPanda, I don't know how to debug that, could you file a bug and cc me on it? I'll get it prioritized as we have to learn this sooner rather than later
[21:13:55] milimetric: alright
[21:15:04] and ori, please don't work on this bug that YuviPanda's about to file. I know you could probably fix it in half a second, but we gots to learn :)
[21:25:19] milimetric: https://bugzilla.wikimedia.org/show_bug.cgi?id=64024
[21:26:03] milimetric: hopefully cc'd you in the right mail
[21:26:24] milimetric: I'll be off to sleep in a bit
[21:27:02] yep YuviPanda, thanks and that includes me
[21:27:27] milimetric: :D
[21:27:41] milimetric: yw! And do note that we are hoping to ship the entire thing in a couple of weeks
[21:28:27] yep YuviPanda, this is getting high importance
[21:28:34] milimetric: ty!
[21:28:52] milimetric: it might be me doing something stupid (probable!), fwiw :)
[22:04:36] YuviPanda, I don't read code.
[22:04:53] Could you take a look at the new iOS patch and tell me if it incorporates an underscore between version numbers? (I.e. 4_4)
[22:04:57] I can't see any way it could, but.
[22:11:25] wait, nm
[22:13:08] Ironholds, the ios doesn't try to make an underscore of it
[22:13:18] do you want that? or are dots okay?
[22:13:32] Ironholds: you should hang out on #wikimedia-mobile, too!
[22:14:12] Ironholds: apple requires the build version to be in #.#.#, so it's easy enough to do a find and replace if an underscore is needed. i just rebased, by the way, didn't make any other changes
[22:17:26] dr0ptp4kt, yeah, just tested it
[22:17:29] it works fine :)
[22:17:33] hence my +1
[22:17:45] Ironholds: cool
[22:17:46] (sorry, popped out for a smoke)
[22:18:25] argh, rebase!
[22:18:27] * Ironholds +1s again
[22:18:36] we reaaally need that gerrit config option set
[23:15:32] Hey YuviPanda. Still around?
[23:16:08] Sorry for the delay. :\
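For reference, the event.gif beacon YuviPanda was debugging earlier in the log can be reproduced in a few lines of Python. The payload is the one quoted at 20:49; as the conversation notes, a 204 only confirms that the beacon was received by varnish, not that the event validated further downstream:

```python
# Rebuild the beacon request from the log: URL-encode the JSON payload as the
# query string of event.gif and expect a 204 No Content response.
import json
import urllib.parse
import urllib.request

payload = {
    "schema": "MobileWikiAppEdit",
    "revision": 8188113,
    "event": {
        "action": "start",
        "userName": "yuvitest44",
        "editSessionToken": "77078656-8520-4c5d-832a-7b37b4ba7cf8",
    },
}

url = "https://bits.wikimedia.org/event.gif?" + urllib.parse.quote(json.dumps(payload))
response = urllib.request.urlopen(url)
print(response.status)  # 204 means the beacon was accepted, nothing more
```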