[00:00:09] see you drdee
[07:26:50] http://blog.wikimedia.org/2012/09/19/what-are-readers-looking-for-wikipedia-search-data-now-available/
[07:26:56] Anyone know where that is?
[07:37:47] weird
[07:38:13] Would love to look at it, but nothing there despite the blog post. :(
[08:24:57] purplepopple: hey
[08:25:24] purplepopple: drdee, the author of the article, will be around here later today so you'll be able to speak with him about it
[08:25:35] purplepopple: perhaps 3-4h from now
[08:26:56] Define later today when today is 6:26PM? :)
[08:27:05] purplepopple: like 4h from now
[08:27:18] * purplepopple nod nods
[08:27:41] Hopefully headed to bed by then.
[08:27:58] * purplepopple puts her stats stuff into packets for GLAM related stuff
[08:28:15] and that one would have been interesting to see Olympic/Paralympic searches during those periods. :)
[13:28:16] moooorning!
[13:33:11] mornin everyone
[13:55:00] hello ottomata
[13:55:06] hi milimetric
[13:55:14] howdy
[13:55:23] drdee: hi
[13:55:29] mornin d
[13:55:37] morning guys
[14:01:41] drdee: can I drop usage of strtok ?
[14:02:02] drdee: I need finer grained control in parse_url
[14:02:16] what are you suggesting to replace it with?
[14:03:21] drdee: well, currently strtok is causing me some trouble, because I need to split the hostname into pieces
[14:03:28] drdee: and split the directory part into pieces
[14:03:41] drdee: and I've used TOKENIZE, TAIL, HEAD, FIELD
[14:03:46] drdee: I understand how they work
[14:03:56] k
[14:04:07] drdee: but I tried to split a hostname into parts and I'm having trouble because of the strtok in TOKENIZE
[14:04:23] what's the problem?
[14:04:39] drdee: not getting what I'd expect from it, I can show you a sample
[14:05:41] drdee: https://gist.github.com/e262556ee467b75c7fe9
[14:06:07] drdee: I was expecting that the first FIELD would return 0, and control flow would not enter the if
[14:06:22] drdee: because the documentation of strtok says that it fails if there are no more separators in the string
[14:06:30] drdee: and I have just one dot in the url
[14:10:25] are you entering the same string? cause then you might need strtok_r
[14:11:20] strtok_r is used
[14:12:02] maybe the macro expects a domain name with at least 2 dots, not 1 as in your current example
[14:12:55] hm, ok, but the log can have a domain name with a single dot
[14:13:09] drdee: for example http://wikimediafoundation.org/wiki/Home
[14:13:29] yes it an :)
[14:13:33] i mean yes it can :)
[14:13:47] drdee: that's why I'd like to replace strtok with just while loops that iterate searching for the next character
[14:15:27] ok, make the changes that you feel are necessary to make this work
[14:15:39] ok
[15:17:25] ok, battery dying, need to pick up laundry, be back on in a bit
[15:50:21] http://www.kickstarter.com/projects/Musopen/open-source-bug-tracking
[16:24:03] I hope all these minimalistic approaches to bug tracking and all the new startups emerging will make traditional stuff like jira go away
[16:24:11] I don't like jira
[16:25:05] anyway, back to coding..
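The replacement average_drifter describes (dropping strtok in favor of a plain loop that scans for the next '.') might look like the sketch below. This is a hypothetical illustration of that approach, not the actual parse_url/TOKENIZE code from webstatscollector; the function name `split_host` and the fixed part size are made up. Unlike strtok, it does not mutate its input and it behaves the same whether the hostname has one dot or many.

```c
#include <string.h>

/* Split `host` on '.' into at most `max` parts of up to 63 chars each.
 * Returns the number of parts written. Does not modify `host`. */
static int split_host(const char *host, char parts[][64], int max)
{
    int n = 0;
    const char *start = host;   /* beginning of the current part */
    const char *p = host;

    while (n < max) {
        if (*p == '.' || *p == '\0') {
            size_t len = (size_t)(p - start);
            if (len > 63)
                len = 63;       /* truncate oversized labels */
            memcpy(parts[n], start, len);
            parts[n][len] = '\0';
            n++;
            if (*p == '\0')
                break;          /* end of string: done */
            start = p + 1;      /* skip past the dot */
        }
        p++;
    }
    return n;
}
```

With the parts in an array, the edge cases from the discussion (wikimediafoundation.org with a single dot vs. en.m.wikipedia.org with three) are just different part counts, which is the point drdee and average_drifter converge on.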
[16:30:52] :)
[16:51:08] mornin
[16:51:22] morning
[17:00:02] https://plus.google.com/hangouts/_/b26d998e265c18b523f7f8f41aed7eaf2ed1e617
[17:01:19] danke
[17:02:24] drdee: I need some squid logs please with data that shows urls coming from *.planet.wikimedia.org
[17:02:36] drdee: also for blog.wikimedia.org or wikimediafoundation.org
[17:02:46] just copy some of the lines in the current example2.log
[17:02:52] and change the domain name
[17:02:54] drdee: if I had access to a /var/log I would get them or find them with grep
[17:04:51] https://plus.google.com/hangouts/_/b26d998e265c18b523f7f8f41aed7eaf2ed1e617
[17:05:39] ottomata: is that for everyone in the channel ?
[17:06:55] in the hangout?
[17:06:55] yeah
[17:25:13] http://etherpad.wikimedia.org/AnalyticsPlanningMeetingNotes
[17:36:23] http://etherpad.wikimedia.org/AnalyticsMilestonesAndPriorities
[17:42:35] Global Engineering Roadmap collab: http://etherpad.wikimedia.org/EngineeringRoadmap2
[17:42:47] i've pasted our parts into http://etherpad.wikimedia.org/AnalyticsMilestonesAndPriorities
[18:42:33] http://art.less.ly/2012/heart-dino.png
[19:06:08] THAT
[19:06:10] was a great meeting
[19:06:15] A+ would meet again
[19:21:35] dschoon, d3 talk?
[20:03:29] fyi dschoon, i'm reinstalling an02-an10 with precise right now
[20:03:58] leaving an01 for now, as it's got some fancy stuff on it I want to make sure I save (http proxy, iptables, puppetmaster, etc.)
[20:25:29] drdee: https://gerrit.wikimedia.org/r/24487
[20:25:40] ty
[20:25:49] drdee: please ./build.sh and check out filter2.c
[20:25:58] drdee: and the respective binary filter2
[20:26:26] better not to put binaries in git
[20:26:37] i have to compile source myself
[20:27:37] oh, I didn't put the binary there, I just mentioned it because it'll be generated after ./build.sh
[20:29:47] note I forgot to add: basically it all boils down to splitting the domain name on '.' into all the parts.
it's easier to handle all edge cases if the parts are available
[20:30:25] okay, both sounds good
[20:57:58] average_drifter: visit https://gerrit.wikimedia.org/r/#/admin/projects/analytics/webstatscollector,branches
[20:58:03] create a new remote branch
[20:58:07] push your changes there
[21:07:06] update for dschoon, all reinstalled except an02 and an07
[21:07:09] dunno what's up with them
[21:07:13] will figure it out tomorrow
[21:07:16] lataers!
[21:10:18] back
[21:10:23] well, that was poorly timed.
[21:11:12] :)
[21:13:22] average_drifter: any luck with the remote branch?
[21:54:35] drdee, if you're still around, I could use a git pointer
[21:54:48] how do you guys collaborate with separate repositories like you have?
[21:54:55] i'm new to the whole forking world
[21:55:20] everybody clones from git/gerrit
[21:55:26] no forking necessary
[21:55:45] small / quick fixes => local branch, merge into master on origin
[21:55:58] big new features => local branch, remote branch
[21:56:12] these are rules of thumb
[21:56:16] limn is a bit different
[21:56:23] well no, right now for limn we have:
[21:56:25] dsc/limn
[21:56:28] milimetric/limn
[21:56:32] wikimedia/limn
[21:56:29] there you would clone
[21:56:36] do dev on your own clone
[21:56:45] and do a pull request against wikimedia/limn
[21:56:53] wikimedia/limn should always be production ready
[21:56:58] ok
[21:57:07] so no branches on wikimedia/limn
[21:57:25] nope, AFAICT
[21:57:33] so you get stuff into master on your own branch via whatever process you want, then you pull from wikimedia/limn and then you push that to wikimedia/limn
[21:57:58] you send a pull request from your repo to wikimedia/limn
[21:58:08] wikimedia/limn needs to accept the pull request
[22:00:44] milimetric: i can help
[22:01:09] uh, i think i understand the idea
[22:01:19] coolio.
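The branching rules of thumb drdee gives (quick fix: local branch merged into master on origin; big feature: local branch pushed up as a remote branch) can be sketched with plain git commands. This is a simulation using throwaway local repositories in a temp directory, not the real gerrit remotes; every path, file, and branch name here is made up for illustration.

```shell
set -e
cd "$(mktemp -d)"

# stand-in for the shared repo on gerrit (e.g. analytics/webstatscollector)
git init --bare upstream.git
git clone upstream.git work && cd work
git config user.email "dev@example.org" && git config user.name "dev"

# small / quick fix: local branch, merge into master, push master to origin
git checkout -b quickfix
echo "fix" > fix.txt && git add fix.txt && git commit -m "quick fix"
git checkout -B master && git merge quickfix
git push origin master

# big new feature: local branch pushed up as a remote branch for others to see
git checkout -b domain-split
echo "feature" > feature.txt && git add feature.txt && git commit -m "split domain on '.'"
git push origin domain-split
```

After the last push, `git ls-remote origin` shows both `refs/heads/master` and `refs/heads/domain-split`, which is the "create a new remote branch, push your changes there" step drdee asked average_drifter for.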
[22:01:32] i set up my local with git flow fyi
[22:01:35] another useful note is
[22:01:54] i've added all my teammates' forks to my local clone as remotes
[22:02:05] so when i fetch --all i get all changes
[22:02:10] smart :)
[22:02:34] and i can pull any branch -- including yours and wikimedia -- into my fork at any time
[22:02:35] their master branches?
[22:02:39] everything.
[22:02:53] remotes don't automatically change anything
[22:03:02] you have to git pull to add their changes to a working copy.
[22:03:13] otherwise you're just keeping track of things
[22:03:29] hm... don't get why that's cool
[22:03:38] but that's 'cause i'm not cool probably
[22:05:15] so wouldn't you only want to know about what teammates merge into their respective master branches?
[22:31:16] no milimetric, you would wanna see what they're working on in experimental branches
[22:33:11] well, but if 10 people start working on this, that's not scalable
[22:33:38] it's ok, I see the idea so it's not as confusing to me that everyone's off on their own repo
[22:39:59] * milimetric be back soon - dinner
[22:58:05] back
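The setup dschoon describes (every teammate's fork added as a remote, so one `git fetch --all` tracks everyone's experimental branches without touching any working copy) can be simulated locally. The repos and branch names below are stand-ins invented for the sketch; the real remotes would be the dsc/limn, milimetric/limn, and wikimedia/limn fork URLs.

```shell
set -e
cd "$(mktemp -d)"

# fake "forks": three independent repos, each with an experimental branch
for fork in wikimedia dsc milimetric; do
  git init "$fork-limn"
  ( cd "$fork-limn" &&
    git config user.email "dev@example.org" && git config user.name "dev" &&
    echo "$fork" > README && git add README && git commit -qm "init" &&
    git branch "experiment-$fork" )
done

# my clone of the canonical repo, with each teammate's fork as a remote
git clone -q wikimedia-limn mine && cd mine
git remote add dsc ../dsc-limn
git remote add milimetric ../milimetric-limn

# one fetch gets every branch from every remote; no working copy changes
git fetch --all -q
git branch -r   # lists origin/*, dsc/*, milimetric/* tracking refs
```

This matches the point made at 22:02:53: fetched remote-tracking refs only record where everyone is; nothing lands in your working copy until you explicitly `git pull` or merge a branch.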