[12:59:28] hey morning everyone.
[12:59:53] I'm doing two interviews today, one of them running into our standup. If I miss it, my update is that I'm still working on timeseries.
[13:17:56] milimetric: http://gingle.wmflabs.org/
[13:22:23] drdee, unavailable?
[13:22:46] 1 sec ago it wasn't
[13:23:01] mmmm i have to ask ottomata to make sure it restarts automatically
[13:23:03] 1 sec
[13:29:40] try again
[13:29:53] gingle?! o_O
[13:33:14] cool drdee, looking good
[13:33:23] except it always writes to 1112 still
[13:33:52] if you think it's ready, change it back to dynamic
[13:34:02] YuviPanda: it's kind of a long story :)
[13:34:13] milimetric: yup
[13:34:15] hmm, 10 mingle points long? :P
[13:34:22] but before that
[13:34:26] just a tiny side project to make our lives easier
[13:34:39] let's hangout!
[13:34:44] and talk oauth!
[13:35:07] like bingle or bugello?
[13:35:07] (bugzilla -> mingle / bugzilla -> trello)
[13:40:44] yep
[13:40:52] git & mingle -> gingle
[14:55:09] (PS1) Erik Zachte: fix: WikiCounts expects underscore as delimiter in StatisticsContentNamespaces.csv, should expect dashes [analytics/wikistats] - https://gerrit.wikimedia.org/r/81935
[14:55:58] qchris around?
[14:56:06] yop.
[14:56:17] hangout?
[14:56:26] Sure. 5min.
[14:57:39] (CR) Erik Zachte: [C: 2 V: 2] fix: WikiCounts expects underscore as delimiter in StatisticsContentNamespaces.csv, should expect dashes [analytics/wikistats] - https://gerrit.wikimedia.org/r/81935 (owner: Erik Zachte)
[14:58:59] drdee: I am in the hangout.
[14:59:05] that's not 5 minutes
[14:59:14] :-D
[15:09:20] (PS1) Erik Zachte: fix perl syntax error [analytics/wikistats] - https://gerrit.wikimedia.org/r/81937
[15:10:21] (CR) Erik Zachte: [C: 2 V: 2] fix perl syntax error [analytics/wikistats] - https://gerrit.wikimedia.org/r/81937 (owner: Erik Zachte)
[15:48:26] (PS1) Diederik: Made a small change for Christian to demo gingle #for 1112.3 [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/81942
[17:01:00] milimetric: scrum
[19:32:54] hey
[19:33:16] hey MaxSem
[19:33:22] stats.wikimedia.org is not very actively maintained these days, right?
[19:36:55] so is there a newer source on user-agent breakdown?
[19:37:25] hi MaxSem
[19:38:03] MaxSem: are you interested in the code that classifies user agents in stats.wikimedia.org
[19:38:06] ?
[19:38:41] yes, currently in the HTML/WAP breakdown - how is it calculated?
[19:42:43] maybe drdee knows? :)
[19:42:50] MaxSem: https://github.com/wikimedia/analytics-wikistats/blob/master/squids/perl/SquidCountArchiveProcessLogRecord.pm#L899
[19:45:57] MaxSem: maybe you want to ask EZ about this particular question on the analytics mailing list
[19:46:41] average_, thanks but that code appears to be counting requests to *.wap.wikipedia.org - which is just a redirect these days
[19:48:31] MaxSem: if you want to update that code, someone will have to run it, and it takes ~20h to see your results for 1 month of data
[19:49:20] average_, so far I just want to understand how it is counted - based solely on user-agent?
[19:49:58] MaxSem: I suggest you open up a discussion on this on the analytics mailing list so Erik (the original author and maintainer of wikistats) gets a chance to understand your request
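
For context on the HTML/WAP question at the end of the log: the linked wikistats Perl module appears (per MaxSem's reading above) to count a request as WAP when its hostname falls under *.wap.wikipedia.org, rather than inspecting the User-Agent string, which is why it only catches the now-redirected WAP domain. A minimal Python sketch of that kind of host-based split is below; it is an illustration of the idea only, not the actual wikistats code, and the hostname handling and function name are assumptions.

```python
# Illustrative sketch (NOT the wikistats Perl implementation) of a host-based
# HTML vs WAP split: a request is counted as "wap" when its host matches
# *.wap.wikipedia.org, and as "html" otherwise. The URL parsing and the exact
# suffix check here are assumptions for demonstration purposes.
from urllib.parse import urlsplit


def classify_html_wap(url: str) -> str:
    """Return 'wap' if the request host is under *.wap.wikipedia.org, else 'html'."""
    host = urlsplit(url).hostname or ""
    if host.endswith(".wap.wikipedia.org"):
        return "wap"
    return "html"


# Example usage against two hypothetical request URLs:
print(classify_html_wap("http://en.wap.wikipedia.org/wiki/Main_Page"))  # -> wap
print(classify_html_wap("http://en.wikipedia.org/wiki/Main_Page"))      # -> html
```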