[04:26:17] MediaWiki-extensions-MultimediaViewer, Analytics, Multimedia: Make upload.wikimedia.org serve images with Timing-Allow-Origin header - https://phabricator.wikimedia.org/T76020#939229 (Tgr)
[08:06:44] Analytics-EventLogging, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939288 (Amire80) NEW
[08:24:04] Analytics-EventLogging, ContentTranslation-Deployments, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939308 (Arrbee)
[08:54:46] Analytics-EventLogging: EventLogging ValidateSchemaTest::testValidEvent() fails under HHVM - https://phabricator.wikimedia.org/T78680#939329 (QChris) Open>Resolved a:QChris >>! In T78680#932191, @hashar wrote: > Ideally we would have a test highlighting the issue. If the issue was be the EventLoggin...
[09:58:07] Analytics-EventLogging, ContentTranslation-Deployments: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939427 (hashar)
[10:00:20] Analytics-EventLogging, ContentTranslation-Deployments: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939428 (hashar) The extension has bad resource loader definitions. You can reproduce on your local setup using: cd tests/phpunit php phpun...
[11:55:35] (PS1) QChris: If desktop site starts in 'www.', drop 'www.' for mobile site [analytics/aggregator] - https://gerrit.wikimedia.org/r/181396
[12:16:14] Analytics-EventLogging, ContentTranslation-Deployments, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939595 (hashar) So looking quickly at ULS, it depends on a RL module `schema.UniversalLanguageSelector` which is not i...
[12:19:59] Analytics-EventLogging, ContentTranslation-Deployments, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939608 (Amire80) As I said, AFAIK the schema in ULS is loaded correctly using the hook, with the technique described h...
[12:29:11] Analytics-EventLogging, ContentTranslation-Deployments, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939610 (hashar) The problem is the Jenkins job does not clone the EventLogging extension when preparing the tests for...
[12:48:47] Phabricator, Analytics-Tech-community-metrics, Engineering-Community: Monthly report of total / active Phabricator users - https://phabricator.wikimedia.org/T1003#939636 (Aklapper) >>! In T1003#852987, @Dzahn wrote: > Access denied for user 'phstats'@... [...] This requires a change in DB GRANTS. @dzahn: Do...
[12:50:12] Phabricator, Analytics-Tech-community-metrics, Engineering-Community: Monthly report of total / active Phabricator users - https://phabricator.wikimedia.org/T1003#939640 (Dzahn) >>! In T1003#939636, @Aklapper wrote: > @dzahn: Does that require a separate ticket, or how to proceed? It does, but it already exi...
[12:53:46] Multimedia, Analytics, MediaWiki-extensions-MultimediaViewer: Make upload.wikimedia.org serve images with Timing-Allow-Origin header - https://phabricator.wikimedia.org/T76020#939648 (Aklapper) Thank you that this is now a [[ http://www.google-melange.com/gci/task/view/google/gci2014/5832099869229056 | GCI ta...
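The T85124 failures above come down to ULS depending on a ResourceLoader module, schema.UniversalLanguageSelector, that only exists when the EventLogging extension is present to define it; the Jenkins job was not cloning EventLogging, so the module was undefined and the voting structure tests blocked the patches. For illustration only, a schema-module registration of this kind generally takes the following shape (per the EventLogging guide; the revision id is a placeholder, not ULS's actual value):

```php
// Illustrative only: the usual EventLogging schema-module registration an
// extension ships. The 'schema.*' module can only be resolved when the
// EventLogging extension (which provides ResourceLoaderSchemaModule) is loaded.
$wgResourceModules['schema.UniversalLanguageSelector'] = array(
	'class'    => 'ResourceLoaderSchemaModule',
	'schema'   => 'UniversalLanguageSelector',
	'revision' => 1234567, // placeholder revision id, not the real one
);
```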
[13:32:05] ContentTranslation-Deployments, Analytics-EventLogging, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939691 (Amire80) Looks good to me now. Thank you.
[13:32:18] ContentTranslation-Deployments, Analytics-EventLogging, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939692 (Amire80) a:hashar
[13:32:29] ContentTranslation-Deployments, Analytics-EventLogging, Continuous-Integration: UniversalLanguageSelector patches are blocked by voting schema tests - https://phabricator.wikimedia.org/T85124#939693 (Amire80) Open>Resolved
[13:36:52] operations, Analytics-EventLogging, Wikimedia-Site-requests: wikitech.wikimedia.org error "$wgEventLoggingBaseUri is not set." - https://phabricator.wikimedia.org/T84965#939698 (Dzahn) a:Andrew
[15:07:25] Analytics-Engineering, Analytics-Dashiki: Vital Signs user knows to click on metric title to open definition [3 pts] - https://phabricator.wikimedia.org/T76741#939856 (Milimetric) Open>Resolved
[17:25:59] will be back in 30 mins
[17:43:41] YuviPanda: here :)
[17:45:04] jdlrobson: I poked ottomata in -operations too
[17:47:09] thanks YuviPanda heads up i have a standup in 15
[18:06:06] milimetric: YuviPanda is asking me about the qchris-* labs instances. Will you still need them for the Storm experiments?
[18:06:10] (PS1) Mforns: Add Annotations API [analytics/dashiki] - https://gerrit.wikimedia.org/r/181424
[18:09:15] qchris: you are thinking of killing them entirely?
[18:09:25] i guess that's fine until I finish with the sandbox
[18:09:39] so far the sandbox has been enough
[18:09:51] Yes, I am thinking of killing them entirely. It seems that would be preferable for YuviPanda.
[18:10:12] ok, that works for me qchris: it's not hard to bring them back, right? you have the puppetization saved?
[18:10:12] might be, at least partially because they’ll keep sending me 5 emails a day saying they’re down ;)
[18:10:41] Ever-nagging Shinken :-P
[18:10:58] milimetric: Yup, I have the puppet commits that I used to bootstrap them.
[18:14:47] YuviPanda: Does Shinken allow you (non guest) to acknowledge the alerts?
[18:15:00] qchris: it does, but we don’t have a suitable user plugin yet
[18:15:53] Sounds like I should ping you to acknowledge :-) Then at some point that'll get annoying for you, and booooom we have the user plugin? :-D
[18:16:00] :D
[18:16:05] just so many things.
[18:16:15] Ok. I'll kill the instances then.
[18:17:45] Done.
[18:19:06] (PS2) Nuria: [WIP] Mobile apps oozie jobs [analytics/refinery] - https://gerrit.wikimedia.org/r/181017
[18:26:20] qchris_meeting: wheeee
[18:34:34] (CR) Ottomata: [WIP] UDF for classifying pageviews according to https://meta.wikimedia.org/wiki/Research:Page_view/Generalised_filters (4 comments) [analytics/refinery/source] - https://gerrit.wikimedia.org/r/180023 (owner: Ottomata)
[18:36:14] ottomata, yay comments!
[18:38:06] just a few. Ironholds, i think there is one that maybe still needs to be responded to?
[18:38:15] kk
[18:38:22] patchset 15 line 39
[18:38:28] about \\? in uriPath
[18:38:53] milimetric: you around?
[18:39:01] jdlrobson: yes
[18:39:01] hi
[18:39:28] so i was curious how i can test limn locally now.... i'm trying to work out how to make the urls in dashboards/report.json
[18:39:30] how do i do that now?
[18:40:23] jdlrobson: "make the urls"?
[18:40:32] like what they should be for a new graph that you've just added?
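Condensing the exchange that follows: the dashboards reference the public datasource URL, while a local limn dev server (assumed here to be on its default port 5000) serves the same file under a different path, where "data" is the server's route prefix and "mobile" is the keyword limn uses to symlink the repository. A quick way to compare both:

```bash
# Public datasource URL, as referenced from the dashboard config:
curl -I http://datasets.wikimedia.org/limn-public-data/mobile/datafiles/edits-monthly-new-active.csv

# Same file from a local limn dev server (port 5000 assumed):
curl -I http://localhost:5000/data/datafiles/mobile/edits-monthly-new-active.csv
```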
[18:40:49] milimetric: so currently the datasources are like http://datasets.wikimedia.org/limn-public-data/mobile/datafiles/edits-monthly-new-active.csv
[18:40:56] but i have a local file in datafiles/edits-monthly-new-active.csv
[18:40:58] that i want to use
[18:41:09] oh! ok
[18:41:15] http://localhost:5000/limn-public-data/mobile/datafiles/edits-monthly-new-active.csv doesn't seem to exist
[18:41:35] right, it's stupid... hang on
[18:41:42] i think i know but let me check
[18:43:18] jdlrobson: try: http://localhost:5000/data/datafiles/mobile/edits-monthly-new-active.csv
[18:43:24] yup that did it
[18:43:26] the two pieces of magic in there are:
[18:43:49] 1. "mobile" - this is a keyword that limn uses to differentiate how it symlinks to your repository vs. other repos
[18:44:07] 2. "data" in "data/datafiles" - this is just the route prefix that the limn server uses
[18:44:22] "magic" -> "trash"
[18:58:14] (PS1) Jdlrobson: Update scripts in light of recent changes [analytics/limn-mobile-data] - https://gerrit.wikimedia.org/r/181428
[18:58:15] ^ milimetric w00t!
[18:58:31] cc bmansurov: ^
[19:00:08] (CR) Jdlrobson: [C: 2] Timebox before summing. [analytics/limn-mobile-data] - https://gerrit.wikimedia.org/r/181204 (owner: Bmansurov)
[19:00:15] (Merged) jenkins-bot: Timebox before summing. [analytics/limn-mobile-data] - https://gerrit.wikimedia.org/r/181204 (owner: Bmansurov)
[19:00:36] ha jdlrobson, ok, if that helps :)
[19:10:07] hi, I want to log some events (these ones: https://phabricator.wikimedia.org/T76416) and for that I am using EventLogging and the EL dev server running on vagrant. But I can't seem to get the 'eventlogging-devserver' command to work. It gives me this output: http://pastebin.com/KmAwx1fA And if I try to run the command inside vagrant (from vagrant ssh) it says that I don't have permission (sudo chmod-ing is also not allowed). Any ideas on how to fix this?
[19:15:38] tuxilina: hello, the dev server has to be run "inside" vagrant
[19:16:10] tuxilina: that is, you are instantiating that process inside vagrant's virtual machine
[19:17:08] tuxilina: Did you see: http://www.mediawiki.org/wiki/Extension:EventLogging/Guide#Installing_the_EventLogging_devserver
[19:22:23] nuria: yes, I followed those steps, but when trying to run 'python setup.py install' it gives me 'error: [Errno 13] Permission denied: 'eventlogging.egg-info/requires.txt'' even with root access.
[19:22:59] tuxilina: Did you try running with sudo (while you are the vagrant user)?
[19:23:26] nuria: yes, the same
[19:25:09] nuria: this is the output when run inside vagrant: http://pastebin.com/mJXNtNzA
[19:27:34] I also asked on #wikimedia-dev earlier, but it was apparently a bit too early, and I thought that maybe here is the best place to ask. I'm sorry if I was wrong. The thing is, I struggled with this for quite some time and tried to figure it out by myself, but at some point I'm afraid of wasting too much time on something that can be solved quite easily with a question on IRC.
[19:29:03] tuxilina: I would try uninstalling eventlogging.egg-info and trying again
[19:29:16] tuxilina: when did you last update the vagrant / EL code?
[19:30:06] yesterday I think, but not 100% sure.
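The setup tuxilina is aiming for, per the guide nuria links above, boils down to a few commands run inside the VM; this is a rough sketch assuming a stock MediaWiki-Vagrant checkout rather than the guide's exact wording:

```bash
# All of this runs inside the MediaWiki-Vagrant VM, not on the host.
vagrant ssh
cd /vagrant/mediawiki/extensions/EventLogging/server
sudo python setup.py install
eventlogging-devserver
# A root-owned eventlogging.egg-info/ left over from an install attempted on
# the host (the /vagrant mount is shared with the host) will make the install
# step fail with "Permission denied", which is the error discussed below.
```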
[19:30:16] tuxilina: i just tried the install with the latest EL code and it worked fine
[19:31:49] tuxilina: http://stackoverflow.com/questions/1550226/python-setup-py-uninstall
[19:34:59] ottomata: in order to make my coordinator run on a "daily" partition I need to define a datasets file like: https://github.com/wikimedia/analytics-refinery/blob/master/oozie/webrequest/datasets.xml
[19:35:09] ottomata: without ${HOUR}, correct?
[19:38:28] hm, you don't necessarily need a datasets.xml file, you can define the dataset within your coordinator
[19:38:36] we have a separate file so that we can include it in multiple coordinators
[19:39:19] ottomata: right, but if i want to define "daily" datasets it seems a common enough case that it should go in that file, right?
[19:40:34] but, ja, it would be the same, except it would have a different frequency
[19:40:37] nuria: ja, was about to say that
[19:40:51] i think we don't need an 'unchecked' daily dataset though...
[19:40:53] but i'm not sure
[19:40:54] sure
[19:41:22] ok, ottomata, will make another dataset-daily.xml in that directory, is that correct?
[19:42:15] hm, no, you can put it in the same file
[19:42:18] for daily webrequest
[19:42:23] ottomata: ok
[19:42:24] just name it differently
[19:42:41] etc.
[19:42:42] i think that is good
[19:42:56] nuria: you said to uninstall but I cannot do that, it didn't install at all because it doesn't have permissions. And I tried after updating everything (vagrant and extensions).
[19:43:08] ottomata: and the _SUCCESS flag is also available at the daily level?
[19:43:17] tuxilina: you can "remove" the egg directory
[19:43:41] tuxilina: and start afresh perhaps?
[19:44:25] HMMMM
[19:44:27] wait a minute..
[19:44:32] ottomata: so when partitions are processed we also set up a _SUCCESS at: ${webrequest_data_directory}/webrequest_text/hourly/${YEAR}/${MONTH}/${DAY}
[19:44:37] lemme review some oozie docs, I might be steering you wrong
[19:45:02] ottomata: maybe i can look this up directly in the cluster actually, let me see
[19:45:52] nuria: assuming the directory would be here: "/usr/local/lib/python2.7/dist-packages/eventlogging-0.8_20141222-py2.7.egg", the dist-packages directory doesn't have any eventlogging egg file.
[19:47:29] no, ok nuria
[19:47:34] i think you do not need a new dataset.
[19:47:39] instead, make your coordinator frequency be daily
[19:47:40] and
[19:47:51] HMM
[19:47:59] ah yes
[19:48:11] and then <data-in> has instances that specify a 24 hour period
[19:48:12] like
[19:48:28] <data-in>
[19:48:28] <start-instance>${coord:current(-23)}</start-instance>
[19:48:28] <end-instance>${coord:current(0)}</end-instance>
[19:48:28] </data-in>
[19:48:35] or the other way around even
[19:48:38] might be better?
[19:49:01] tuxilina: look in /vagrant/mediawiki/extensions/EventLogging/server/eventlogging.egg-info
[19:49:49] ottomata: I did that already, but in that case the coordinator is also using faulty partitions, is that not the case?
[19:50:22] nuria: same, permission denied, even with sudo as the vagrant user, and even as the root user.
[19:50:41] nuria, no, because the data-in event will not be available until all 24 datasets are ready
[19:50:50] hang on, linking to example...
[19:51:42] nuria: i'm having trouble linking
[19:51:49] go here:
[19:51:50] https://oozie.apache.org/docs/3.1.3-incubating/CoordinatorFunctionalSpec.html#a6.4._Asynchronous_Coordinator_Application_Definition
[19:51:56] nuria: the eventlogging.egg-info folder belongs to root:root. I don't know if that matters, while the others belong to user and group vagrant
[19:52:07] and search for the text "Coordinator application definition that creates a coordinator action once a day for a year, that is 365 coordinator actions"
[19:53:30] nuria, the idea is
[19:53:37] your coordinator's frequency is 1 day
[19:53:49] and it will use the previous 24 hourly instances
[19:53:58] as input
[19:54:19] in the coordinator then, you should have a reference to the coordinator's current time variables
[19:54:20] tuxilina: likely you are having permission issues because your /vagrant directory (that is mounted to a dir on your local box) must have files you cannot delete from vagrant itself. I would start by removing the EventLogging extension, making sure no code is left in vagrant or on your local box, and doing the setup process within vagrant afresh. If you tried to 'install' outside vagrant under the /vagrant mount, permissions will conflict
[19:54:22] that you can pass to your workflow
[19:54:59] ottomata: ok, but how does it know to use the "cleaned up" partitions?
[19:55:04] you'll probably need multiple <data-in> in <input-events>, one for each webrequest_source partition
[19:55:09] depends on which dataset you use
[19:55:15] if you use
[19:55:17] webrequest_bits
[19:55:22] ottomata: ah, ok, i see
[19:55:35] the dataset will not be available until the _SUCCESS flag exists
[19:55:36] if you choose
[19:55:41] webrequest_bits_unchecked
[19:55:48] it will be available as soon as the directory exists
[19:55:59] nuria: thank you so much for your help. I am trying this now. Thanks!
[19:56:04] so, if you define a coordinator that spans 24 hourly datasets
[19:56:28] it will not be launched until each of the 24 directories has a _SUCCESS flag in it
[19:56:44] ottomata: ok, understood now, thank you
[19:56:49] yup :)
[20:12:03] ottomata: nice talk!
[20:12:10] at hakka
[20:12:51] thanks!
[20:14:05] ottomata is a speaker ninja master, kia!
[20:28:33] (PS3) Nuria: [WIP] Mobile apps oozie jobs [analytics/refinery] - https://gerrit.wikimedia.org/r/181017
[20:48:17] Analytics-Wikimetrics, Analytics-Dashiki: Vital Signs user sees annotations on graphs [13 pts] - https://phabricator.wikimedia.org/T78151#940453 (mforns) {F23564}
[20:49:30] Analytics-Wikimetrics, Analytics-Dashiki: Vital Signs user sees annotations on graphs [13 pts] - https://phabricator.wikimedia.org/T78151#940454 (Milimetric) sweet :)
[20:49:31] kevinator, yt?
[20:49:47] yeah im here...
[20:49:59] I’m in a cafe with a crappy connection, and it seems to be working.
[20:50:06] I was going to move in 10 mins
[20:50:19] ok
[20:50:31] do you want me to wait till you're back?
[20:52:22] nah, what’s up?
[20:52:37] I uploaded a screenshot of the Dashiki annotations to the task
[20:53:08] I wanted to ask you for your opinion, and also ask if you think we should show it to Pau
[20:53:31] maybe you or he will want to change something
[20:53:42] ok, let me look at it
[20:53:50] ok
[20:55:43] that’s cool. can you show me the file the data is in?
[20:57:02] kevinator, yes, just a sec
[20:57:09] https://meta.wikimedia.org/wiki/Dashiki:RollingActiveEditorAnnotations
[20:58:30] mforns: do the annotations on the bottom appear in the order they are in the file?
[20:58:48] no, they appear sorted by start date
[20:58:54] the oldest first
[20:59:26] It’s weird that you can specify an annotation for a date or range that is not even within the data range
[20:59:54] I think I should call a meeting with Pau to discuss next steps...
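Putting the Oozie pieces from the exchange above in one place: a minimal sketch of a daily coordinator that waits for the previous 24 hourly webrequest_text instances, gated on the _SUCCESS done-flag, in the style of the spec section ottomata links. This is not the actual refinery coordinator; apart from ${webrequest_data_directory}, the property and dataset names are illustrative.

```xml
<coordinator-app name="daily-example-coord" frequency="${coord:days(1)}"
                 start="${start_time}" end="${stop_time}" timezone="Universal"
                 xmlns="uri:oozie:coordinator:0.4">
    <datasets>
        <!-- Hourly dataset: an instance only becomes available once the
             _SUCCESS done-flag has been written for that hour. -->
        <dataset name="webrequest_text" frequency="${coord:hours(1)}"
                 initial-instance="${start_time}" timezone="Universal">
            <uri-template>${webrequest_data_directory}/webrequest_text/hourly/${YEAR}/${MONTH}/${DAY}/${HOUR}</uri-template>
            <done-flag>_SUCCESS</done-flag>
        </dataset>
    </datasets>
    <input-events>
        <!-- Each daily action waits for the previous 24 hourly instances. -->
        <data-in name="webrequest_text_input" dataset="webrequest_text">
            <start-instance>${coord:current(-23)}</start-instance>
            <end-instance>${coord:current(0)}</end-instance>
        </data-in>
    </input-events>
    <action>
        <workflow>
            <app-path>${workflow_file}</app-path>
        </workflow>
    </action>
</coordinator-app>
```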
[21:00:13] kevinator: i would say that's not necessarily necessary at the moment
[21:00:15] otherwise this does seem useful when it will have real data in it
[21:00:31] we said when we tasked this that we'd have a second pass where we engaged Pau
[21:00:35] exactly kevinator, Nuria, Dan and I were thinking of showing only the annotations that share the date range of the chart
[21:00:46] and that's when we would try to put the annotations in the graph itself
[21:01:23] aha milimetric, so you think we should finish this task as it is now
[21:01:41] yes, that was how we envisioned it when we tasked it
[21:01:54] and then create another task engaging Pau, ok
[21:02:03] yes
[21:02:05] makes sense to me
[21:02:11] yes, a second pass later
[21:02:28] but this first pass was defined as: have annotations show below the graph with a little popup
[21:02:37] and it's great that you got markdown working, that's really neat :)
[21:02:43] ok, so I'm going to finish it up
[21:02:45] :]
[21:02:51] that was easy, hehe
[21:03:03] the layout was trickier
[21:03:22] so is the width of the box determined by how much text is in it?
[21:03:36] the enclosing box? no
[21:03:44] If we only have one annotation will it show up on the left? (left aligned?)
[21:03:46] the note boxes, yes
[21:03:52] yes, exactly
[21:04:57] one more question: will each project:metric have its own annotation file?
[21:05:15] It seems they may all inherit the same annotations from a metric
[21:07:05] This looks great tho
[21:07:15] kevinator, yes, now all projects share the same annotation file, and a metric will show all annotations regardless of which projects are being shown in the graph
[21:07:58] That’s fine for now. We’ll see if we get any requests for project-specific annotations
[21:08:09] Dan and I spoke about this the other day informally, maybe in the next step we can add 1-project-only support
[21:08:16] yes
[21:08:45] ok, so, let me finish this up and I'll let you guys know
[21:09:00] :-)
[21:09:14] i’m going to sign off for a bit and go back home
[21:09:19] I’ll be online again in 30 mins
[21:10:32] ok
[21:33:04] Where should I request access to hive? RT?
[21:43:49] Ironholds: I'm hoping to look up some personally identifying financial and health information in Hive. I hear I can make queries from stat1002, but I don't see how to?
[21:44:08] awight, type "hive" ;p
[21:44:19] baaa!
[21:44:22] also, that's dumb
[21:44:26] you sneaky analsen
[21:44:31] you know we keep the health information in MySQL
[21:44:36] haha, uhhh
[21:44:36] hive is where we keep the social security numbers
[21:44:41] I just dump it everywhere
[21:44:50] ahh, the AOL Approach
[21:44:52] can't be bothered to clean my dungeon
[21:44:55] a valid analytics technique
[21:45:00] awight, you are not in the analytics-privatedata-users group
[21:45:05] * awight belches some bones
[21:45:07] o no!
[21:45:07] so you will not be able to access the webrequest data in hive.
[21:45:17] (if you can access that data, you had better let me know!)
[21:45:19] ottomata: is there an RT process to enlist myself?
[21:45:31] yes! but RT is no more!
[21:45:36] Phabricator!
[21:45:55] ooo! thx
[21:46:02] the official access request policy in Phab is still being worked out. but for now, if you create an issue and tag it with Operations or maybe ops-request, i think someone will find it
[21:46:14] ottomata: confirmed that permission was denied :)
[21:47:08] good!
[22:14:15] stupid C++
[22:14:33] awight, you'll like this. One of my friends wrote a library for reading delimited files into R. It's 15x faster than the default.
[22:14:51] one minor problem; somewhere in the ~1000 lines of C++, he accidentally, in the process of testing it, hard-coded it to only accept commas as separators
[22:15:03] he can't actually work out where, we just know it blows up when you throw in a TSV.
[22:15:13] Ironholds: wow, I was totally trying to ignore your C++ emission above
[22:15:24] Hey, I like C++. I just hate the people who write it.
[22:15:40] Bjarne is probably a really nice guy once you get past the whole C/C++ thing
[22:15:41] It's like there's a Who Can Add Fewest Comments To Their Code competition and everyone upstream from me is invited.
[22:15:47] Ironholds: this code is open sourced somewhere?
[22:15:56] oh yeah, https://github.com/hadley/fastread
[22:16:03] that is a heinous name
[22:16:10] I found it in the end- https://github.com/hadley/fastread/blob/88d4cfafbdcea174bd901438a7e9596de23f14c2/inst/include/fastread/SeparatorPolicy.h#L8
[22:16:23] also, problem with "read" is past tense vs imperative, etc...
[22:16:24] the name is not the problem
[22:16:26] hah
[22:16:29] I can't get over it
[22:16:34] the problem is that every R/C++ library I encounter is, like.
[22:16:39] yep
[22:16:40] Like there's a guide to writing somewhere that says
[22:16:41] I remember
[22:16:43] "do not add comments"
[22:16:47] oooh
[22:16:53] "If you find you absolutely have to write comments, make sure they describe the code, not the why"
[22:16:53] and do evile magick if possible
[22:17:00] HAH
[22:17:02] thx for the link
[22:17:06] so you end up with a thousand lines of C++ that include one comment which reads "set the namespace"
[22:17:13] WHY. WHY ARE YOU SETTING THE NAMESPACE. TELL ME.
[22:17:19] not quite hardcoded but entertaining enuf
[22:17:32] actually the library you read was a poor example. The code is mostly good. Just the lack of comments.
[22:17:38] It balances out my C++ which is at least 30% comments.
[22:18:01] "Why am I doing this" :p
[22:18:10] more https://github.com/Ironholds/urltools/blob/master/src/encoding.cpp
[22:18:35] I have so much to make less-heinous in that library, but I am tremendously proud of the URL decoder/encoder.
[22:18:50] ohwow, "internal_url_encode"
[22:18:59] Yes, I hate it too
[22:19:02] I like that your "%" fooled github
[22:19:10] TL;DR Rcpp can't handle functions in classes
[22:19:17] WAT
[22:19:20] if I pass encoder::url_encode() out to R, it starts screaming
[22:19:26] "I DON'T KNOW WHAT THE ENCODER NAMESPACE IS HALP"
[22:19:38] oh. mangling.
[22:19:52] so you write things in classes, as god intended, and then https://github.com/Ironholds/urltools/blob/master/src/urltools.cpp#L56-L69
[22:20:22] * awight holds up gorgon shield
[22:20:26] hahah
[22:20:35] oh, you're lucky the guts are hidden for you.
[22:20:37] *from
[22:20:37] Auugh I think it got me
[22:20:45] if you want real fun, you should see R/C code
[22:20:49] everything is a SEXP. EVERYTHING.
[22:21:02] never before have so many caps locks sacrificed so much for so many
[22:21:09] please take out the MediaWiki style spaces around template brackets oh god
[22:21:27] I like spacing it like that, it makes it readable
[22:21:44] now if you insist on void func_name() { instead of void func_name(){ we will come to blows, good sir.
[22:21:47] that is a stupid convention
[22:21:57] * awight abandons holy war
[22:22:02] although less stupid than hurr durr let's put the brace on the next line hurr
[22:22:11] that's just...why. why would you do that.
[22:22:13] I put them on a new line
[22:22:22] why? Seriously, I'm interested to hear the rationale
[22:22:24] because then you can see the balanced pairs
[22:22:25] I've never really got it
[22:22:31] ...okay that's a really good point
[22:22:45] also, imagine a two-line else statement...
[22:22:59] I just do if(foo){
[22:23:01] bar;
[22:23:04] } else {
[22:23:07] baz;
[22:23:07] }
[22:23:25] it's really easy and common to: else if (stupid condition)\n\t make my day;\n }
[22:23:37] can we at least agree that the if(foo){bar} people need to go somewhere else?
[22:23:38] hah. snap.
[22:23:50] anyway the brace on the same line is... K&R
[22:23:56] you want to be cool, don't you?
[22:24:12] I want a lot of things but that actually isn't one of them.
[22:24:17] Mostly I just want to be happy
[22:24:28] a list of string vectors.
[22:24:40] a single-quote before each line of documentation!
[22:24:44] list < std::vector < std::string > >
[22:24:45] you're made of stouter stuff than me
[22:24:56] oh, actually the single quote is for roxygen
[22:25:01] I'm sure...
[22:25:06] it uses [comment_char]' in its parser
[22:25:12] you're not just rubbing that post-colonian shite in, you sure?
[22:25:13] "this line has something to turn into a markdown page"
[22:25:22] *colonial*
[22:25:33] what, with a single apostrophe?
[22:25:39] with half a quote
[22:25:45] groooan
[22:25:57] wait, awight
[22:26:11] * awight pretends was not wandering off already
[22:26:18] would an apostrophe without a closing apostrophe be an unopposedtrophe?
[22:26:19] * awight satisfied to cause small cuts
[22:26:34] Ironholds: don't forget the newline and you shouldn't have that issue
[22:26:43] fine, ignore my terrible pun
[22:27:08] it was fantastic
[22:27:30] It made me feel like the bar was closing.
[22:27:51] I think that's a compliment and will interpret it as such. Thank you!
[22:34:55] (CR) Milimetric: [C: 2] Normalize project before deduplicating [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/181001 (owner: Mforns)
[22:35:06] (Merged) jenkins-bot: Normalize project before deduplicating [analytics/wikimetrics] - https://gerrit.wikimedia.org/r/181001 (owner: Mforns)
[22:36:34] Quarry: Make "Home" navlink go to profile for logged-in users. - https://phabricator.wikimedia.org/T85175#940705 (Capt_Swing) NEW a:yuvipanda
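On the Rcpp exchange earlier (22:19 onward): Rcpp attributes can only export plain free functions, so code organized into classes gets thin wrapper shims like the urltools.cpp lines Ironholds links. A self-contained sketch of that pattern, with illustrative names rather than the actual urltools source; it compiles from R with Rcpp::sourceCpp():

```cpp
#include <Rcpp.h>
#include <string>
#include <vector>

// The class you would like to expose directly, but cannot: Rcpp attributes
// do not know how to export member functions.
class encoder {
public:
    static std::string url_encode( const std::string& value ) {
        // Real percent-encoding elided; identity keeps the sketch runnable.
        return value;
    }
};

// Thin free-function shim that the Rcpp::export attribute can handle; it just
// delegates each element to the class method.
// [[Rcpp::export]]
std::vector<std::string> url_encode( std::vector<std::string> urls ) {
    for ( size_t i = 0; i < urls.size(); i++ ) {
        urls[i] = encoder::url_encode( urls[i] );
    }
    return urls;
}
```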