[15:06:38] _o/
[15:07:22] _o\
[15:07:39] -o-
[15:34:53] \o/
[15:46:06] _o_
[16:32:29] halfak, or others of the revscoring crew: is the example you have in the readme supposed to work out of the box? https://github.com/wiki-ai/revscoring/blob/master/demo_load_model.py
[16:40:32] kjschiroo, looks like it's broken. Check out the ipython notebook instead.
[16:40:42] https://github.com/wiki-ai/revscoring/blob/master/ipython/feature_engineering.ipynb
[16:40:50] Oh! Looks like we don't load a model there.
[16:40:57] I can fix the example quick.
[17:41:19] halfak, thanks
[17:46:33] * halfak gets to the fixing
[18:47:03] Does anyone remember the research/study that tried to measure the persistence of content additions over time? (I found one, but it doesn't seem to report results.)
[18:50:37] pajz: did the research/study you have in mind report results?
[18:52:23] yeah, I think it was presented quite some time ago at a metrics/activities meeting or a research showcase
[18:55:13] Hey pajz. That was mine.
[18:55:16] I'll get some links for you
[18:56:02] https://meta.wikimedia.org/wiki/Research:Measuring_edit_productivity
[18:56:53] The talk and slides are linked from here: https://www.mediawiki.org/wiki/Wikimedia_Research/Showcase#January_2016
[18:58:00] Most of the code for generating this is documented here: http://pythonhosted.org/mwpersistence/
[19:02:24] thanks, as always
[21:07:56] * guillom watches J-Mo|away spamming his watchlist on Meta.
[21:08:12] ;)
[21:21:31] can I say how much I hate ICWSM/AAAI for not supporting DOIs :-(