[18:15:35] We'll be starting our IRC office hour with the Discovery team in about 15 minutes. Join us!
[18:18:50] \0/
[18:29:05] #startmeeting What's new in Discovery | Wikimedia meetings channel | Please note: Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE) | Logs: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-office/
[18:29:05] Meeting started Wed Mar 30 18:29:05 2016 UTC and is due to finish in 60 minutes. The chair is CKoerner_WMF. Information about MeetBot at http://wiki.debian.org/MeetBot.
[18:29:05] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[18:29:05] The meeting name has been set to 'what_s_new_in_discovery___wikimedia_meetings_channel___please_note__channel_is_logged_and_publicly_posted__do_not_remove_this_note____logs__http___bots_wmflabs_org__wm_bot_logs__23wikimedia_office_'
[18:29:41] * MaxSem waves
[18:29:45] o/
[18:29:50] CKoerner_WMF: thanks, and greetings to all of those who have joined us
[18:30:17] * tfinc waves back to MaxSem and dcausse
[18:30:18] (I don't suppose there's any way to access the feed without giving the site webcam access?)
[18:30:48] YairRand You can disable your webcam, but I can't remember if that's before or after you connect.
[18:31:17] o/
[18:31:24] also, scotch
[18:31:36] Reminder for those who'd like to join us via video/audio, we have a meeting set up here: https://bluejeans.com/388063933/
[18:31:46] as in, tape on your webcam, not whiskey :P
[18:32:28] anyway, questions?
[18:33:16] https://www.mediawiki.org/wiki/Wikimedia_Engineering/2015-16_Q4_Goals
[18:33:32] First question on Blue Jeans is about Q4 goals --^
[18:34:06] it is a bit weird to be split between video and IRC here
[18:34:12] but we'll make the best of it
[18:34:12] one of the first goals is looking into language detection
[18:34:38] specifically using https://www.mediawiki.org/wiki/TextCat
[18:34:44] An A/B test this quarter in language detection.
[18:35:03] Also, this quarter, look into upgrading Elasticsearch
[18:35:44] to improve stability and performance
[18:36:27] geospatial queries, that means searching for things within a certain geographic area?
[18:37:18] YairRand I'll tee that up for an answer
[18:37:42] YairRand, yes: https://www.mediawiki.org/wiki/Extension:GeoData#list.3Dgeosearch
[18:40:54] CKoerner_WMF: i'm always torn between transcribing all the good discussion on the video link vs treating these two as different streams
[18:41:05] Now talking about clarifying the record of work for Discovery.
[18:41:24] Deb's talking a little about describing the work on the Portal: https://www.mediawiki.org/wiki/User:DTankersley_(WMF)/Proposals/Wikipedia_Portal_Update
[18:41:44] As an example of plans for a particular area
[18:42:17] (also another thing Discovery is working on!)
[18:43:07] Justin O. wants more metrics! :)
[18:44:15] YairRand: do you have any other questions about GeoSpatial?
[18:44:22] tfinc: nope
[18:45:02] actually, is this stuff going to work with geoshapes, once that's available?
[18:46:17] that's a planned property datatype in wikidata
[18:46:22] YairRand Do you mean polygons on interactive maps?
[18:46:54] YairRand: what's the use case and do we have a phab task for it? we'll need that to prioritize
[18:46:57] reference: https://www.elastic.co/guide/en/elasticsearch/guide/current/geo-shapes.html
[18:47:14] Use cases and phab tickets are welcome to help prioritize!
[18:47:37] (searches around on phab)
[18:47:58] YairRand: I'm not aware of any Phab tickets for this. :-)
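For readers unfamiliar with the geosearch API linked above, here is a minimal sketch of such a query in Python. It assumes the list=geosearch parameters (gscoord, gsradius, gslimit) described on the Extension:GeoData page; the coordinates are only an example, and exact limits and response fields should be checked against the live API docs.

    import requests

    # Sketch: find pages with coordinates within ~5 km of a point (example coordinates).
    params = {
        "action": "query",
        "list": "geosearch",
        "gscoord": "52.5200|13.4050",  # lat|lon (example point: central Berlin)
        "gsradius": 5000,              # search radius in metres
        "gslimit": 10,
        "format": "json",
    }
    resp = requests.get("https://en.wikipedia.org/w/api.php", params=params, timeout=10)
    for page in resp.json().get("query", {}).get("geosearch", []):
        print(page["title"], page["dist"])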
[18:48:44] I think I see one that YairRand mentioned (planned property datatype in Wikidata) here: https://phabricator.wikimedia.org/T57549
[18:49:00] There we go.
[18:52:12] Now talking about http://www.gdal.org and how we might interpret geospatial data
[18:53:58] Max is talking about the need for a central repository for such data, if not in wikidata itself.
[18:54:37] The current plate of work is full, but this would be something to look into in the future.
[18:54:49] Short-term maps plan - get maps on Wikipedia!
[18:55:27] how many people are on search?
[18:56:07] JustinO: About 3.5 engineers.
[18:56:10] k
[18:56:38] JustinO: + ops + PM
[18:56:44] There is a team breakdown here: https://www.mediawiki.org/wiki/Wikimedia_Discovery#The_team
[18:56:49] thx
[18:58:00] Max is reminding us that if you have an interest in something the team is working on, join the related task(s) in Phabricator.
[18:58:44] For those who are possibly new to Phabricator: https://www.mediawiki.org/wiki/Phabricator
[18:59:14] Now we're talking about making repositories public.
[18:59:22] And now on to recall issues
[18:59:34] /me What is 'recall' in relation to search?
[19:01:26] Never mind. There's a wiki article for that. https://en.wikipedia.org/wiki/Precision_and_recall
[19:04:18] Justin O. suggests taking queries that get zero results, running them against other search engines, and seeing what they return. Why do they differ?
[19:04:31] take ZRP queries, run them on google/bing/ddg (theQuery site:wikipedia.org). if google returns a reasonable answer, figure out why.
[19:05:25] recall: the ability to bring the correct result onto the search page. precision: placing the result in the best place on the page
[19:05:37] *crickets*
[19:05:52] https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes/Why_People_Use_Search_Engines
[19:06:24] other recall ideas:
[19:06:25] find top queries w/ a low SAT or click rate. categorize them & count the categories. is there an easy category w/ high impact to attack?
[19:06:25] run top queries w/ a low SAT or click rate on google & compare.
[19:06:26] look at sessions that begin on wikipedia, then go to google and back to wikipedia.
[19:06:32] I'd love any questions/comments/concerns on https://www.mediawiki.org/wiki/Wikimedia_Discovery/FDC_Proposal
[19:06:47] https://www.mediawiki.org/wiki/User:TJones_(WMF)/Notes#Survey_of_Zero-Results_Queries
[19:06:52] for SAT sessions of len >= 2, look at first query & final query. figure out why the first query wasn't successful.
[19:07:59] recall: for SAT sessions of len >= 2, look at first query & final query. figure out why the first query wasn't successful.
[19:08:50] Question from video chat "Have you looked into other ways to determine satisfaction with search results?"
[19:09:05] dwell times are one way
[19:09:14] but what about things like
[19:09:14] copying the URL
[19:09:17] or content snippets?
[19:09:24] signifying that they got the information that they needed?
[19:09:55] Right now our user satisfaction metrics are desktop only
[19:10:06] CKoerner_WMF, should i try to join the video chat?
[19:10:06] looking in the future to do something with mobile (as it's growing)
[19:10:15] yurik Feel free!
[19:10:46] web dwarfs apps for search. That's interesting.
[19:11:04] More fun with numbers here: http://discovery.wmflabs.org/metrics/
[19:12:18] CKoerner_WMF, apps are like 1% of all of our traffic, maybe less
[19:13:00] comparing second queries to first when folks get unsatisfactory results. Look at the distance (frequency of words)
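To make the recall and precision definitions from earlier in the hour concrete, a toy illustration in Python; the judged-relevant set and result list here are entirely hypothetical.

    # Toy example: precision and recall for a single query.
    relevant  = {"A", "B", "C", "D"}       # pages a human judged relevant to the query (hypothetical)
    retrieved = ["A", "X", "B", "Y", "Z"]  # what the search engine returned, in rank order (hypothetical)

    hits = [p for p in retrieved if p in relevant]
    precision = len(hits) / len(retrieved)  # 2/5 = 0.4 -> how much of what we returned was relevant
    recall    = len(hits) / len(relevant)   # 2/4 = 0.5 -> how much of the relevant material we surfaced
    print(precision, recall)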
[19:13:12] TFIDF: https://en.wikipedia.org/wiki/Tf%E2%80%93idf
[19:13:21] ^related to search satisfaction
[19:14:30] Dan's sharing some stats: http://discovery.wmflabs.org/metrics/#survival
[19:17:49] Defining SAT: (what is search success on wikipedia?)
[19:17:49] dwell >30s
[19:17:49] copy url / snippet contents
[19:17:49] click external link & dwell >30s on click
[19:17:49] click & click sub-page & dwell >30s
[19:19:07] one day someone needs to figure out how to get meetings like this on-wiki without losing any features
[19:19:14] If there are no more questions, we're going to wrap up this chat.
[19:19:48] #endmeeting
[19:19:48] Meeting ended Wed Mar 30 19:19:48 2016 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
[19:19:48] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-18.29.html
[19:19:48] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-18.29.txt
[19:19:48] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-18.29.wiki
[19:19:48] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-18.29.log.html
[19:20:02] Thanks everyone!
[21:01:30] #startmeeting https://phabricator.wikimedia.org/E152
[21:01:30] Meeting started Wed Mar 30 21:01:30 2016 UTC and is due to finish in 60 minutes. The chair is robla. Information about MeetBot at http://wiki.debian.org/MeetBot.
[21:01:30] Useful Commands: #action #agreed #help #info #idea #link #topic #startvote.
[21:01:30] The meeting name has been set to 'https___phabricator_wikimedia_org_e152'
[21:01:49] #topic Please note: Channel is logged and publicly posted (DO NOT REMOVE THIS NOTE) | Logs: http://bots.wmflabs.org/~wm-bot/logs/%23wikimedia-office/
[21:02:13] hi folks!
[21:02:58] * robla begins to wonder if he's going to be the only one at this office hour ;-)
[21:03:11] Hi Robla!
[21:03:26] hi Scott_WUaS
[21:05:19] @robla: What in particular do you want to focus on today?
[21:05:47] this is really just going to be more of an office hour in a somewhat traditional sense. we only had a couple of ArchCom folks at the telecon last hour (gwicke and Krinkle)
[21:07:01] sounds good ... and perhaps an opportunity to get things done in a different way, with relatively few participants
[21:07:28] I listed a few RFCs that I'm shepherding in https://phabricator.wikimedia.org/E152 that I'm specifically happy to answer questions about, but really, no locked-down agenda. Scott_WUaS, any specific questions you have?
[21:07:57] Yes, thanks ...
[21:08:30] robla: I'm around too if we need to discuss the Gerrit/Phab one a bit
[21:08:55] ostriches! o/
[21:09:05] WUaS, which donated WUaS to Wikidata last autumn, is curious what the process is for communicating about further developing WUaS in Wikidata / MediaWiki and re ArchCom?
[21:10:29] Scott_WUaS: I think your donation is something we can discuss in a different venue, and I'm happy to do so in the hour after this meeting
[21:10:51] WUaS is currently talking with former CC MIT OCW Executive Director and MIT Dean of Online Learning, Cecilia d'Oliveira, and has received Creative Commons permissions from her to develop and adapt MIT OCW in 7 languages and in Wikidata
[21:11:02] ostriches: we touched on the Gerrit->Phab migration conversation in our last meeting
[21:11:14] robla: thanks
[21:12:10] robla: Yeah I saw. Was there any followup needed on that? I think the only question really is the status. It's not really in draft, it's under implementation now if we consider it accepted
[21:12:47] robla, I had one question about T123753
[21:12:47] T123753: Establish retrospective reports for #security and #performance incidents - https://phabricator.wikimedia.org/T123753
[21:12:48] ostriches: is there help you need from ArchCom? I think the "done"ness is something we can discuss a little bit
[21:13:26] I don't think we really need much in the way of help from ArchCom at this point. Considering the outcome of the various discussions we've had so far, I think there's consensus for it.
[21:13:36] (for it to be accepted and move forward, that is)
[21:14:37] ostriches: Are we going to see a Gerrit upgrade happen before we dump it? :)
[21:14:46] ostriches: I wouldn't go so far as to consider it "accepted", but that did get us into a general conversation about what "accepted" by ArchCom means. I don't think anyone in ArchCom wants to block it
[21:14:55] legoktm: Yes, I've been working on that this week. "Soon"
[21:15:03] <3
[21:15:37] * TimStarling is partially online this hour, as well as looking after kids
[21:15:46] o/ TimStarling :-)
[21:15:48] hi TimStarling :)
[21:17:09] robla: In which case I think we're good then? I don't think we need to bikeshed over the template status too much :)
[21:17:28] As long as ArchCom doesn't need to block and we've got general consensus based on passed discussions, I think RelEng can move ahead
[21:17:37] *past
[21:18:31] ostriches: I really appreciate that y'all wrote up an RFC on this, as I think having that written up is going to help the migration go more smoothly. there's some nitpicking we can do about the RFC about the "how" and the "when", but I don't personally see any problems with the "what"
[21:19:00] I think some more of the how/when will become clear in the coming quarter.
[21:19:05] Krinkle: do you mind if I further paraphrase what you said in the past hour?
[21:19:16] It's going to be like the Gerrit RFC insofar as this one isn't going to be "done" for a long time.
[21:19:49] robla: OK
[21:20:36] ostriches: I think the how/when questions need to be clear in order for it to be marked "approved" (under the current ArchCom process)
[21:21:41] marking ArchCom-RFCs as "approved" is a subject that sends me down the process wonk rabbithole
[21:22:02] robla: We do have https://www.mediawiki.org/wiki/Wikimedia_Release_Engineering_Team/Project/Differential_Migration as a result of our annual planning.
[21:22:08] Which should be incorporated into the RFC.
[21:22:15] * robla looks
[21:23:02] (it's linked to from the RFC, as I was tired of copy/pasting tables all last week ;) )
[21:23:29] greg-g: I understand, truly :-)
[21:23:40] So many tables that was.
[21:24:04] the outcome of planning is great, the process can sometimes be... subpar :)
[21:25:59] ostriches: lemme see if I can paraphrase... Phase 1: T130418 done hopefully June 30 (and then do the same quarter math for Phase 2 and Phase 3)
[21:25:59] T130418: Goal: Phase 1 repository migrations - https://phabricator.wikimedia.org/T130418
[21:26:51] I think that's how the quarter math works out
[21:27:20] Phase 2: T130420 done hopefully by December 31 of this year
[21:27:21] T130420: Goal: Phase 2 repository migrations - https://phabricator.wikimedia.org/T130420
[21:28:12] phase 3: T130421 done hopefully by 2017-03-31
[21:28:13] T130421: Goal: Phase 3 repository migrations - https://phabricator.wikimedia.org/T130421
[21:28:35] does that sum up the plan about right?
[21:28:39] yup, and the KPIs might be helpful to understand what we consider {{done}} along the way
[21:28:44] (they're at the bottom of the doc)
[21:29:41] ie: what "phase X" means :)
[21:30:35] "Q1: By the end of Q1 we plan to have a system in place to manage Differential and Nodepool/Continuous Integration interaction, from the baseline of no system in place." (Q1 ends 2016-09-30, so the middle of Phase 2, right?)
[21:30:54] FY
[21:31:18] the system will be in place before phase 2
[21:31:33] phase 2 is in Q2, semantically luckily enough
[21:31:59] so....is there a numberless phase to this project? ;-)
[21:32:15] I'm confused by the "in the middle of Phase 2" part
[21:32:43] phase 2 happens in Q2, building the glue happens in Q1...
[21:32:59] * greg-g goes to get his hoodie he left outside, his office is surprisingly cold
[21:33:23] greg-g: my apologies, I was extrapolating phases based on when the endpoints were
[21:33:48] * greg-g nods
[21:34:30] I was worried I misaligned something along the way and was not looking forward to copy/pasting a lot more
[21:34:33] :)
[21:34:42] since Phase 1 hopefully ends 2016-06-30, and Phase 2 hopefully ends by 2016-12-31, I put 2016-09-30 in the "middle of Phase 2"
[21:35:03] ah, I see what you mean, yeah
[21:35:49] the way we imagined it (correct me if I'm wrong, ostriches) is that there'd be a period of "build integration and respond to our phase 1 users" before starting the phase 2, er, phase
[21:36:36] phase 1 completion requires integration work to be done, yeah.
[21:36:55] so, phase 1.1 ;-)
[21:37:24] 1.uhoh? ;)
[21:37:58] (the release after 1.0, to fix the inevitable bug you missed, for those that don't get the joke/context)
[21:39:45] Phase 1: hopefully ends 2016-06-30, Phase 1.1: hopefully ends 2016-09-30, Phase 2: hopefully ends by 2016-12-31, Phase 3: hopefully ends by 2017-03-31
[21:40:15] * greg-g nods
[21:41:00] * robla looks for the RFC number for this RFC
[21:41:16] T119908
[21:41:17] T119908: [RfC]: Migrate code review / management to Phabricator from Gerrit - https://phabricator.wikimedia.org/T119908
[21:41:44] #info T119908: Phase 1: hopefully ends 2016-06-30, Phase 1.1: hopefully ends 2016-09-30, Phase 2: hopefully ends by 2016-12-31, Phase 3: hopefully ends by 2017-03-31
[21:41:44] T119908: [RfC]: Migrate code review / management to Phabricator from Gerrit - https://phabricator.wikimedia.org/T119908
[21:42:26] alright, should we talk about the other RFCs, or is that one the most interesting to get cleared up?
[21:43:04] reminder of other topics: https://phabricator.wikimedia.org/E152
[21:43:24] I think matt_flaschen had a question about T123753
[21:43:25] T123753: Establish retrospective reports for #security and #performance incidents - https://phabricator.wikimedia.org/T123753
[21:43:38] robla: anything else you want/curious about from ostriches and me?
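As an aside on the quarter math above, a small sketch in Python, assuming WMF fiscal quarters run July-June (so Q1 = July-September); the phase end dates are the hoped-for ones from the discussion.

    from datetime import date

    # Hoped-for phase end dates from the discussion above.
    phases = {
        "Phase 1":   date(2016, 6, 30),
        "Phase 1.1": date(2016, 9, 30),
        "Phase 2":   date(2016, 12, 31),
        "Phase 3":   date(2017, 3, 31),
    }

    def fiscal_quarter(d):
        # Assumes a July-June fiscal year: Jul-Sep = Q1, Oct-Dec = Q2, Jan-Mar = Q3, Apr-Jun = Q4.
        return ((d.month - 7) % 12) // 3 + 1

    for name, end in phases.items():
        print("%s ends %s (Q%d)" % (name, end.isoformat(), fiscal_quarter(end)))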
[21:43:43] greg-g: ah, right, thanks for the reminder
[21:44:08] * robla doesn't have any followup right now for the Gerrit->Phab stuff
[21:44:22] * greg-g nods
[21:46:02] by the way, gwicke, Krinkle, and I discussed putting the mbstring requirement RFC into last call.... I'll bring that up after I answer matt_flaschen's question
[21:46:08] matt_flaschen: your question?
[21:47:19] robla, what I mentioned on the task: How would these new retros relate to the Incident reports we have currently? https://wikitech.wikimedia.org/wiki/Incident_documentation
[21:48:41] matt_flaschen: I'm hoping we figure out some social norms around this
[21:48:44] so...
[21:49:51] what I would envision happening is the WMF Security Team being able to flag things as "this should have a retrospective"
[21:50:11] key word being "should". I don't envision there would be 100% compliance
[21:50:13] Hi Megan!
[21:50:37] (WMF Performance Team would be able to do the same)
[21:52:06] the point would be that it would not be socially ok to create many security issues and never write a retrospective. at the same time, if the WMF Security Team got really fussy, I wouldn't envision 100% of the retrospectives they suggest actually being written.
[21:52:45] (same holds true for Performance)
[21:53:11] matt_flaschen: does that make sense?
[21:53:21] robla, do you think we should do them at https://wikitech.wikimedia.org/wiki/Incident_documentation ? Potential advantage: As I mentioned on the task, the line between "really bad performance" and "outage" is not always clear-cut.
[21:53:44] matt_flaschen: I don't think robla is asking for duplicative reporting. Take the save-timing regression as an example. When this happened, it was mostly on the performance team to do the full investigation and (in later stages) (maybe) delegate some actionables to the relevant maintainers of the code in regression.
[21:53:58] That's not a healthy or maintainable way of working.
[21:54:54] Krinkle, so are you saying "Incident documentation" should be for documenting the immediate response, and there should be a separate retrospective of the full solution?
[21:55:56] If a performance problem could also be considered an outage.
[21:56:01] Which depends on the severity.
[21:56:20] I imagine if the regression is a result of a regular deployment, it is subsequently reverted and the relevant author/merger/maintainer should do the investigation (probably on Phabricator). The deployer (if they notice the regression) could write an immediate response on wikitech, but I'm not sure it's all that useful. It depends on how big/obvious the
[21:56:20] regression is. In most cases (at least until we have better automated measurements) it will be noticed hours/days later, in which case I think using wikitech/incident is overkill.
[21:56:49] matt_flaschen: I agree, but I'd say the severity threshold is at "If the deployer observed it" (in logs/alerts etc.)
[21:57:21] Which will slowly become a lower threshold as our infrastructure improves
[21:57:21] I think the credibility of the Security and Performance teams is tied up in how frequently they suggest postmortems are needed. It's very subjective, and that seems ok to me.
[21:57:33] +1
[21:57:53] the issue with many of the big systemic issues is that it would take a lot of time to write a proper description & evaluate possible solutions
[21:58:34] Thanks, Krinkle, that answers my question. Basically, do an incident report for severe perf issues (if you notice immediately when deploying), and do a retrospective on Phabricator if the Performance team asks for it (I would add "or if your team thinks it's a good idea").
[21:59:06] by the way, we're coming up on the end of our hour, so I feel bad about ending the official part right on the top of the hour. I may run over a couple of minutes, but probably not more
[22:00:04] matt_flaschen: Yeah, I don't think it's worthwhile pursuing a really strict rule that one can autonomously follow. It's mostly a quest to adopt and accept this as a normal social behaviour going forward. And to not interpret it as an assignment of blame.
[22:00:22] #info general discussion, most of the hour on T119908, and then the end of the hour on T123753
[22:00:23] T123753: Establish retrospective reports for #security and #performance incidents - https://phabricator.wikimedia.org/T123753
[22:00:23] T119908: [RfC]: Migrate code review / management to Phabricator from Gerrit - https://phabricator.wikimedia.org/T119908
[22:02:02] #info T129435 (RFC: drop support for running without mbstring) is going to be heading into last call
[22:02:02] T129435: RFC: drop support for running without mbstring - https://phabricator.wikimedia.org/T129435
[22:02:38] thanks everyone!
[22:02:43] #endmeeting
[22:02:43] Meeting ended Wed Mar 30 22:02:43 2016 UTC. Information about MeetBot at http://wiki.debian.org/MeetBot . (v 0.1.4)
[22:02:43] Minutes: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-21.01.html
[22:02:43] Minutes (text): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-21.01.txt
[22:02:43] Minutes (wiki): https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-21.01.wiki
[22:02:43] Log: https://tools.wmflabs.org/meetbot/wikimedia-office/2016/wikimedia-office.2016-03-30-21.01.log.html
[22:02:54] Thanks, Rob!
[22:03:00] * DanielK_WMDE__ looks around confusedly
[22:03:20] Hello Daniel!
[22:03:40] hey. i thought we were going to skip today's meeting, because people were traveling...
[22:04:01] :)
[22:04:35] DanielK_WMDE__, don't worry - we decided everything without you. porting MW to ASP is gonna be fun!
[22:05:00] what MaxSem said ;-)
[22:05:23] (it ended up just being the RobLa office hour, per what I wrote in E152)
[22:05:25] MaxSem: ah. can you have that done by next tuesday? that would be great.
[22:06:47] robla: oh. today's minutes seem to have last week's minutes mixed in. Last week, the bot went away halfway through the meeting.
[22:07:48] robla: or... it was just me being confused. never mind.
[22:11:58] DanielK_WMDE__: are you in DE now?
[22:12:53] no, in jerusalem, for the hackathon
[22:13:06] ah right ...
[22:13:12] equally late ...
[22:13:16] almost
[22:14:19] Rob said it might be possible to talk in this time about WUaS giving WUaS to Wikidata, and about WUaS currently talking with former CC MIT OCW Executive Director and current MIT Dean of Online Learning Cecilia d'Oliveira, having received Creative Commons permissions from her to develop and adapt MIT OCW in 7 languages and in Wikidata
[22:15:53] For example, she wrote in a recent email that CC WUaS could develop and adapt CC MIT OCW in 7 languages (in Wikidata), but that we must add the following MIT OCW Creative Commons licensing and endorsement clarifications: a) MIT is not affiliated with, and does not endorse, World University and School, b) MIT does not offer credit to WUaS students, and c) All MIT OpenCourseWare materials are available for free through h
[22:16:51] And so for example, I've begun to add this here - http://worlduniversityandschool.org/WUaS_En_Wiki/index.php?title=Nation_States - where each country will become an online accrediting CC university, and in each country's main and official languages
[22:18:05] I had asked RobLa what the process for communicating about this further is ... and he suggested talking about it in this hour
[22:18:09] Scott_WUaS: I'm not sure I'm the right person to help you on this. I'm not the only person you've approached about this, though, am I?
[22:19:45] I emailed WMF Legal Director Michelle Paulson, Lydia Pintscher, Ryan Kaldari and others a few times in recent weeks, but Michelle said she wasn't the person who could help WUaS or Wikidata re this WUaS gift to Wikidata
[22:20:53] Who would be the best person, RobLa?
[22:22:47] As chair of ArchCom or similar, you would be a great person to communicate further about this, from WUaS's and my perspectives ...
[22:24:26] And WUaS seeks to be a growth story for Wikidata, WMF and indeed for the web, as well as for universities and all 7,097+ languages - and in terms of hiring academics, coders +
[22:25:27] Scott_WUaS: do you have an IRC channel on Freenode for WUaS?
[22:25:51] Might it be possible to find a sub-team facilitator and shepherd to explore this further?
[22:26:04] RobLa: Not yet
[22:26:07] Scott_WUaS: do you have an IRC channel on Freenode for WUaS?
[22:26:18] RobLa: Not yet
[22:26:53] ah, ok.....well, it's very easy to create one. it's really just a matter of saying /join #wuas (or whatever you want the channel name to be).
[22:27:00] should we do that now?
[22:27:29] Sounds good ... I think this would be good - #WorldUnivandSch
[22:29:34] Scott_WUaS: ok, great, I just joined it. You aren't there yet, so I can't make you the op of it. If you had been the first to join, you'd automatically have been the op
[22:29:36] was looking around for how to start #WorldUnivAndSch, and shut the IRC window by accident
[22:30:04] just type /join #WorldUnivAndSch
[22:31:41] Great, Scott_WUaS and I are in the new channel now. Anyone else who is interested in discussing this further is free to join
[22:32:14] robla: Thanks!