[17:17:41] hey leila, did you want me to say anything in the SoS?
[17:17:53] oh yes, schana. sorry.
[17:20:34] schana: please report the following: a reader survey will go out (most probably, if all the technical components work smoothly) as part of the deployment train tomorrow (Thursday). The survey will run for a week on enwiki, using QuickSurveys, and we expect to collect between 200-500K responses. The sampling rates will be finalized today. We will also start documentation at https://meta.wikimedia.org/wiki/Research:Characterizi
[17:21:00] will do. thanks leila
[17:21:05] schana: we are working with the mobile web team for this survey, so you and they can add more if there are questions about the technical components of the survey.
[17:21:11] thank you schana.
[18:20:30] hi leila, it seems the link you posted might not be correct. would you mind double-checking it? thanks
[18:20:52] why is it not correct, schana? it works for me.
[18:20:58] you mean the link to the meta page?
[18:21:13] I'm seeing "https://meta.wikimedia.org/wiki/Research:Characterizi"
[18:21:25] https://meta.wikimedia.org/wiki/Research:Characterizing_Wikipedia_Reader_Behaviour
[18:21:29] schana, ^
[18:21:33] thank you
[18:21:37] np. :-)
[18:54:20] hi schana, I wanted to check in and see how the recommender project's going
[18:55:59] hi milimetric. I had been working on the QuickSurvey stuff, but should be able to get back into the recommender today
[18:56:47] ok. just let me know if you think you won't be able to get to all the fixes by leila's deadlines
[18:57:13] I have a ton of stuff to do, but the recommender sounded more urgent and I'm happy to help
[18:57:39] thanks. I'll let you know
[21:47:55] Quarry: Make available more options for number of shown rows of resultset (Quarry) - https://phabricator.wikimedia.org/T126540#2016800 (XXN) NEW
[21:51:08] Quarry: Add page navigation on top also (Quarry) - https://phabricator.wikimedia.org/T126542#2016825 (XXN) NEW
[22:05:44] joal, do you know how to set up the SparkContext with pyspark so that it works for spark-submit? Looking around online I'm only seeing examples of how to get it to be local.
[22:08:00] kjschiroo: --master yarn --deploy-mode cluster
[22:08:07] (if you don't want it to be attached to your tty)
[22:10:30] Does that just define sc for you? It just appearing in the interactive shell makes me confused about what to expect.
[22:10:57] ah, no, if you are going to submit a job
[22:11:00] instead of using the repl
[22:11:05] you have to do some extra init work
[22:11:33] http://spark.apache.org/docs/1.3.1/submitting-applications.html
[22:12:18] ottomata: thanks! I'll take a look over that.
[22:12:48] oh also
[22:12:49] "Note that cluster mode is currently not supported for Mesos clusters or Python applications."
[22:12:59] so, i guess with python you can't use --deploy-mode cluster
[22:13:07] "For Python applications, simply pass a .py file in the place of <application-jar> instead of a JAR, and add Python .zip, .egg or .py files to the search path with --py-files."
[22:14:25] kjschiroo: maybe something like sc = SparkContext() in your py code
[22:15:41] No args?
[22:15:55] maybe args, not sure
[22:16:04] never done spark submit with python
[22:16:20] What do you usually use?
[22:16:36] scala
[22:17:18] kjschiroo: http://spark.apache.org/docs/1.3.1/api/python/pyspark.html#pyspark.SparkContext
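
(Editor's note: the "extra init work" ottomata describes boils down to constructing the SparkContext yourself, since a submitted script, unlike the pyspark REPL, gets no predefined `sc`. Below is a minimal sketch consistent with the Spark 1.3-era docs linked above; the file name, app name, and job logic are illustrative, not from the log.)

```python
# example_job.py -- hypothetical minimal PySpark job for spark-submit.
# In the pyspark shell, `sc` is created for you; a submitted script
# must build its own SparkContext before doing any work.
from pyspark import SparkConf, SparkContext

if __name__ == "__main__":
    conf = SparkConf().setAppName("example-job")
    # A bare SparkContext() also works; master/config are then taken
    # from the spark-submit command line and spark-defaults.conf.
    sc = SparkContext(conf=conf)
    total = sc.parallelize(range(10)).map(lambda x: x * x).sum()
    print(total)
    sc.stop()
```

Submitted with something like `spark-submit --master yarn example_job.py` (adding `--py-files deps.zip` for extra modules); per the doc quote above, `--deploy-mode cluster` was not available for Python applications in Spark 1.3.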