[17:32:00] halfak: so mwreverts is awesome, i was able to get the API version to work quickly in a PAWS notebook...
[17:32:08] halfak: ... however, i'm wondering how to use the db version (which seems preferable for performance) on PAWS too. at least the example from https://pythonhosted.org/mwreverts/db.html#mwreverts.db.check doesn't work out of the box
[17:32:58] halfak: (disregarding that that example should probably say "import mwreverts.db" instead of "import mwreverts.api")
[17:33:33] halfak: does that "schema" thing work with the databases on PAWS too, or does it (or mwdb in general) need to be modified?
[17:34:16] (happy to open an issue on github instead, if that's more than a simple fix)
[18:24:08] * Nemo_bis follows with interest
[18:57:52] o/
[18:57:58] Sorry I missed the pings.
[18:58:36] It should work on PAWS, but I do remember there was weirdness.
[18:58:56] I think opening an issue on github would be good. Please feel free to ping me here about it.
[18:58:59] HaeB, ^
[19:07:27] halfak: thanks, will do!
[22:25:44] https://meta.wikimedia.org/wiki/Research:Measuring_edit_productivity is now fully up to date with the results presented in a research showcase last year.
[22:26:01] Not a big announcement, but that was a lot of work. Also I cleaned up the efficiency analysis.
[22:26:32] Look at efficiency collapse as Wikipedia grew exponentially and then immediately began to rebound in 2007: https://meta.wikimedia.org/wiki/File:Persistent_tokens_per_hour.smoothed.by_month.svg
[22:26:41] Really says something.
[22:28:26] halfak: wow, that is good reading, esp. https://meta.wikimedia.org/wiki/Research:Measuring_edit_productivity#Productive_efficiency
[22:42:42] Right!? It raises some questions that I think imply coordination costs -- not necessarily the lack of value of newcomers ;)
[22:43:09] One thing I noticed is that we're essentially at a high productivity level now with *far more* people than in 2006.
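[Editor's note: for readers unfamiliar with the library discussed above, the core idea behind mwreverts — detecting identity reverts by matching revision checksums within a look-back radius — can be sketched in plain Python. The function name `detect_reverts` and its return shape below are illustrative only, not mwreverts' actual API; for real revision histories use mwreverts itself.]

```python
import hashlib


def sha1(text):
    """Checksum of a revision's full text, the basis of identity-revert detection."""
    return hashlib.sha1(text.encode("utf-8")).hexdigest()


def detect_reverts(checksums, radius=15):
    """Scan a page's revision checksums in order and report identity reverts.

    Returns a list of (reverting, reverted, reverted_to) index tuples:
    revision `reverting` restored the exact state of `reverted_to`,
    undoing the revisions listed in `reverted`.  Only the `radius` most
    recent states are considered, mirroring mwreverts' radius parameter.
    """
    reverts = []
    recent = []  # (index, checksum) of up to `radius` preceding states
    for i, checksum in enumerate(checksums):
        # Earliest in-window state identical to this revision, if any.
        match = next((k for k, old in recent if old == checksum), None)
        if match is not None and match < i - 1:  # require >= 1 reverted revision
            reverts.append((i, list(range(match + 1, i)), match))
        recent.append((i, checksum))
        recent = recent[-radius:]  # drop states outside the look-back window
    return reverts
```

A sequence like `["a", "b", "a"]` yields one revert: revision 2 restores the state of revision 0, undoing revision 1. The library's `mwreverts.api` variant fetches these checksums live from the MediaWiki API, while `mwreverts.db` reads them from a database replica, which is the performance question raised in the log above.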