[00:14:57] Is there a config var that I can change so that my wiki uses hyphens instead of underscores to replace spaces? [00:15:35] tekmosis: no [00:15:53] tekmosis: that would break soo much [00:15:58] :( ok [00:16:37] tekmosis: you can have hyphens in page names, and you can have spaces in page names, how should the software tell the two apart? [00:17:44] Dunno, I didn't code MW. I just want hyphens over underscores for SEO reasons [00:20:59] Is there a config var to change Main_Page to something else? I could do a redirect or a move but I have 150+ wikis so that's not an ideal option to edit every one of them. [00:21:20] tekmosis: edit the page "MediaWiki:Mainpage" to the page title of where you want it to be [00:22:17] oh but if you don't want to edit every page [00:23:00] I could but it's not ideal [00:23:45] what version of mediawiki are you running? [00:23:48] 1.19.1 [00:26:32] yeah, there's no other way to do it without editing it on every wiki :/ [00:27:02] darn. Thanks for checking though [00:27:13] can't hardcode [00:27:15] ? [00:29:02] I could do a redirect in php but I was hoping for an officially supported way like a var I could set in LocalSettings.php [00:32:50] maybe we could make another config var like $wgTitleDelimiter [00:34:16] Withoutaname: that would screw so much code up [00:34:33] I think that'd help but it might be an edge case, not sure. I have a wiki farm so my needs aren't really in line (I think) with most people [00:35:33] tekmosis: you run them in the same box? [00:35:51] off of the same code base, yes [00:35:53] Betacommand: but other wikis like UseModWiki display pages based on capitalization [00:36:33] why an underscore and not say %20 [00:37:46] how did you set it up tekmosis?
[00:37:55] Withoutaname: capitalization is one thing, i.e. case-sensitive/insensitive, but you would need to forbid having - in both page names and user names, along with making several other checks randomly throughout the code [00:39:33] it could be fine for a later version [00:39:40] Withoutaname: I think mediawiki treats them as the same thing [00:39:43] maybe the Parsoid team might have something already [00:39:53] Betacommand: well I mean treating - as space [00:40:15] Withoutaname: because page titles can have - in them [00:40:35] if we wanted, page titles can also have _ but we don't allow that [00:40:55] Withoutaname: correct, that's because of that issue [00:41:15] however in a lot of cases - is needed in page titles [00:41:35] rarely is _ used in page titles [00:41:54] I'd let the user choose what the delimiter is for their wiki, and default it to _ [00:42:18] Withoutaname: it's not that simple [00:42:57] it affects link handling, page creation, username creation, user renaming, blocking, and a slew of other things [00:42:59] is it because existing code like system messages already uses - [00:43:09] that's one of them [00:44:08] Withoutaname: the _ is too ingrained in the system to be changed easily [00:44:48] Withoutaname: that's like making all the Yanks use the metric system. It isn't happening [00:45:27] In theory it could, but not realistically [00:45:45] Betacommand: maybe the Parsoid people could come up with something, they're the design people right?
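[Editor's note] The ambiguity Betacommand describes above can be sketched in a few lines. This is illustrative Python, not MediaWiki code, and the function names are made up:

```python
# In URLs MediaWiki swaps spaces for underscores; because a literal "_"
# is normalized away in titles, the mapping reverses cleanly.

def to_url(title):
    return title.replace(" ", "_")

def to_display(url_title):
    return url_title.replace("_", " ")

# Lossless round-trip, since "_" never appears "for real" in a title:
assert to_display(to_url("Main Page")) == "Main Page"

# A hyphen delimiter would be lossy, because "-" is a legitimate title
# character: "Build-a-Bear" and "Build a Bear" map to the same URL.
def to_url_hyphen(title):
    return title.replace(" ", "-")

assert to_url_hyphen("Build a Bear") == "Build-a-Bear"
```

That collision is why a $wgTitleDelimiter switch would have to forbid hyphens everywhere titles and usernames appear, as discussed above.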
[00:45:58] actually, what do they usually do [00:46:24] Withoutaname: this would require a fairly major rewrite of the entire code base [00:46:40] a tall order, to be sure [00:47:06] the cost/benefit just isn't there [00:50:11] I'm trying to set up a new wiki on top of nginx [00:50:35] I've downloaded the mw tar and unpacked it [00:51:36] the docs I've read say that mw needs to reside in a subdirectory of the document root [00:51:57] then the docs show a config with the mw directory as root [00:52:14] this is confusing to me, but I've tried to imitate it [00:52:44] When I point my browser to the server, I get a blank page [00:53:04] probably means there's a fatal error [00:53:05] !debug [00:53:06] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging [00:54:47] s34n: typically the folder containing mediawiki is in a folder which is then placed in the root. i.e. domain/ [00:55:10] legoktm: I don't see any errors in nginx logs or php-fpm logs [01:02:28] It actually serves a page, with <html> and <body> tags. It's just blank. [01:18:08] I take it back. There are no tags. It's just empty [03:15:23] Hi! I'm just wondering if it's at all possible to change the "time zone" acronym for a wiki? My wiki documents a fictional world which runs on its own "time" (just a clone of PDT), and I'd like to know if it's at all possible to change the term "PDT" to something else. [03:24:29] BumbleBee: take a look at https://www.mediawiki.org/wiki/Timezone, not sure if it'll let you define a fake one [03:34:46] Hmm, seems like I can't, but I'll look into it. [03:35:17] Are there any extensions or methods for displaying the live time anywhere then, say, on the top of the page? [03:36:30] I've seen wikis using Semantic MediaWiki with such a feature.
[03:39:33] http://wiki.guildwars.com/wiki/Main_Page - This wiki has the feature. How do you enable that? [03:43:47] Ah, javascript. [06:05:04] jquery trick to make comma-separated lists into arrays? https://gerrit.wikimedia.org/r/132811 [06:06:28] Nemo_bis: assistantLanguages.split( ',' ) is an array [08:39:36] Hi, how do I insert a piece of multi-line code in a numbered list? [08:42:29] this works [08:42:43]
long\nmulti-line\ncode
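[Editor's note] On the blank-page report above: the advice behind the !debug link boils down to a few configuration lines. This is the standard snippet from Manual:How_to_debug; it goes near the top of LocalSettings.php (or index.php if, as here, the installer has not produced a LocalSettings.php yet):

```php
// Show raw PHP errors instead of a white screen:
error_reporting( -1 );
ini_set( 'display_errors', 1 );
// Show full exception details; only meaningful once LocalSettings.php exists:
$wgShowExceptionDetails = true;
```

If nothing appears even with this, the request may not be reaching PHP at all, which is worth checking in the nginx/php-fpm configuration rather than in MediaWiki.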
[12:24:07] Hi Everyone. Could you please suggest which parser (preferably a PHP stand-alone library) we can use to tokenize or convert a MediaWiki file to a DOM. [14:37:54] Hallo all! I was just wondering what sort of wizardry is required to get SELECT LAST_INSERT_ID(); to work with the MW abstraction layer [14:59:48] Ulfr_: you don't, however calling ->insertId() should do what you want [15:00:10] Skizzerz: That sounds delightfully like exactly what I want, do I call that on $res? [15:00:33] no, directly on the db object [15:01:01] Skizzerz: Preposterous. That sounds entirely too simple for the mediawiki abstraction layer [15:01:26] heh [15:01:58] well, there are legitimate reasons for needing what id was just inserted, and the entire point of the abstraction layer is to avoid dbms-specific things, so it'd make sense that there is a function to retrieve it ;) [15:02:06] I gave up all hope of things being straightforward after I discovered what you lot do with timestamps [15:02:07] :| [15:03:05] lol [15:03:18] I'm trying to install mediawiki. But when I unpack the tarball and point my browser at it, I only get a blank page. [15:03:53] s34n, check your web server's error logs [15:04:10] if apache on a debian/ubuntu-based system, you'll find them somewhere like /var/log/apache2/error.log [15:04:12] Krenair: no log entries for nginx or php-fpm [15:04:31] Ain't no reason to include binaries in what's clearly an integer [15:04:58] Ulfr_, which timestamp field are we talking about here? [15:05:12] Krenair: and nginx isn't returning an error [15:05:15] my guess is that our timestamps are actually varchar(14)/varbinary(14) [15:05:20] I get a 200 back with no content [15:05:22] instead of actual timestamps [15:05:26] https://www.mediawiki.org/wiki/Manual:Timestamp#Datatypes [15:05:27] anyway, /me has to head off [15:05:45] Krenair: I blanked that spot out of my memory once I realized binaries were involved [15:05:49] so many lost terminal sessions...
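[Editor's note] The timestamp oddity Ulfr_ ran into: per Manual:Timestamp, MediaWiki stores timestamps as 14-character YYYYMMDDHHMMSS strings in UTC (the TS_MW format, binary(14)/varbinary(14) in MySQL) rather than native SQL timestamp columns. An illustrative Python sketch of the conversion — the helper names here are made up:

```python
from datetime import datetime, timezone

def unix_to_mw(ts):
    """Unix epoch seconds -> MediaWiki TS_MW string (YYYYMMDDHHMMSS, UTC)."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y%m%d%H%M%S")

def mw_to_unix(mw):
    """TS_MW string -> Unix epoch seconds."""
    dt = datetime.strptime(mw, "%Y%m%d%H%M%S").replace(tzinfo=timezone.utc)
    return int(dt.timestamp())
```

So the epoch is `19700101000000`, and a value like `20140521094056` reads as 2014-05-21 09:40:56 UTC. In PHP, MediaWiki's own `wfTimestamp()` does these conversions.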
[15:06:11] I just use proper unix timestamps for my little extension and illustrate why things sometimes don't make a lot of sense at a glance [15:08:09] * Ulfr_ may or may not also be dramatic about things like that. [15:15:59] I'm a bit stumped. With no errors in logs, I have no idea where else to look. [15:18:06] s34n: White screen? [15:18:18] Ulfr_: yes [15:18:25] no content at all [15:18:32] s34n: You try enabling error output? [15:18:44] s34n: http://www.mediawiki.org/wiki/Manual:$wgShowExceptionDetails [15:18:51] Ulfr_: yes [15:19:08] s34n: And you have the location of your actual apache error logs? [15:19:30] Ulfr_: yes. nginx logs, actually [15:20:14] s34n: Not familiar, sadly. Alls I know is if it white screens /something/ shows up in my apache error log. That logfile can get put somewhere wonky by LocalSettings or apache2.conf [15:20:59] Ulfr_: I haven't got to the point that I have LocalSettings yet [15:21:13] s34n: o.O [15:21:31] Blank wiki? [15:21:40] When you're trying to launch setup? [15:22:49] yes [15:25:42] Like I said, I'm getting a 200 back. But no content. [15:29:43] Anybody here using php-fpm? [16:12:05] ok. I fixed my php problems. Now mw loads an initial page with warnings. [16:12:10] Warning: session_start(): open(/var/lib/php/session/sess_qpth7m3qclsb6qr9g108rqm9i7, O_RDWR) failed: No such file or directory (2) [16:12:35] is that supposed to happen? Or do I have to fix permissions? [16:22:06] Can someone help me with the Special:RandomInCategory option [16:28:22] Anyone here? [16:38:36] WPhelp-901: Yup! What's up? 
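[Editor's note] On the session_start() warning above: it usually just means PHP's session save path is missing, so yes, it needs fixing by hand. A hedged sketch; the real path on the reporter's system is /var/lib/php/session, and both that path and the www-data user are assumptions to check against php.ini (session.save_path) and your php-fpm pool config:

```shell
# Set SESSION_DIR to your php.ini's session.save_path before running;
# the default here is a harmless demo directory.
SESSION_DIR=${SESSION_DIR:-./php-session-demo}
mkdir -p "$SESSION_DIR"
chmod 1733 "$SESSION_DIR"   # sticky + world-writable: the usual mode for this dir
# Hand it to the php-fpm user (often www-data); only possible as root:
if [ "$(id -u)" -eq 0 ] && id www-data >/dev/null 2>&1; then
    chown www-data:www-data "$SESSION_DIR"
fi
```

After that, restart php-fpm so new sessions land in the recreated directory.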
[16:39:21] Hi yes I have been trying to use the Special:RandomInCategory to get a random category or page in a category's subcategories but it doesn't seem to work right [16:39:47] WPhelp-901: I doubt it descends into subcategories...our category implementation doesn't usually do that [16:40:02] https://en.wikipedia.org/wiki/Special:RandomInCategory/Video_games for example gives me only pages with the word Video games in it and the sample size seems very small [16:40:24] Oh [16:40:31] So if you look at https://en.wikipedia.org/wiki/Category:Video_games [16:40:39] Is there any way I can get a random page in a category? [16:40:48] It has 8 pages and 43 subcategories, it will give you one of those 51 pages [16:40:56] WPhelp-901: Not easily, I think :( [16:41:06] You could accomplish it by direct database queries on a dump, but effort [16:41:30] Sorry my browser just crashed [16:41:55] What were you telling me? [16:43:18] Are you there? [16:44:23] 2014-05-21 - 09:40:56 WPhelp-901: Not easily, I think :( [16:44:25] 2014-05-21 - 09:41:06 You could accomplish it by direct database queries on a dump, but effort [16:44:53] You were saying something about going to the category [16:45:16] WPhelp-381: Nothing useful to you [16:46:01] Oh ok well thanks anyway. There was something on toolserver that allowed me to do it but the account was closed. Maybe Wikimedia could make something like it [17:43:42] What's the best way to make a link to Special:UserLogin (with a &returnto set) inside an extension? [17:46:32] I'm interested in using Markdown to mark up pages. [17:47:01] I see there are extensions for this. Has anyone here used any of them? [17:48:58] hi [19:18:04] do you have to do something special to make mediawiki see a new extension? [19:18:29] s34n, you'll need to include the extension's main file in your LocalSettings.php [19:18:50] the extension should come with instructions for this.
you could also check the extension's mediawiki.org page, if it has one [19:20:13] Krenair: I did that. [19:20:32] okay, that should be all unless it has additional steps [19:21:08] s34n, are you having issues with some specific extension? [19:21:11] I've tried 3 different Markdown extensions. The first two seemed to have no effect. The last gives me an error [19:21:13] Fatal error: Call to a member function addMessage() on a non-object in /var/www/mw/extensions/AlternateSyntaxParser/AlternateSyntaxParser.php on line 95 [19:21:32] Some of them won't necessarily work with the latest version of MediaWiki. [19:23:33] I'm surprised there isn't more support for Markdown [19:24:35] All three extensions rely on the same Markdown lib and they all seem to be out of date with it. [19:38:39] hey there, do you need to have a mail server installed if you want to enable emails (password resets, talk page notifications) [19:43:52] Chiyo: no, it's not necessary, but those emails will most likely be caught by spam filters if they don't come from a legitimate mailserver (or at least with the proper MX DNS record, etc) [19:44:55] right, noted :) [20:24:38] Hello, I need a little help. I received a backup of mediawiki web and db, moved it to a new server, everything works except thumbnails. If I check server logs, I see that thumbnails are being created in old path /home/sites/foo/... but I can not find this location in either the php files or the database. [20:26:28] Does anybody know where this path is stored? [20:26:41] !localsettings | Bruce__ [20:26:41] Bruce__: All configuration is done in LocalSettings.php (near the end of the file). Editing other files means modifying the software. Default settings are not in LocalSettings.php, you can look in DefaultSettings.php. See , , , and [20:27:16] In LocalSettings.php there is nothing about this path [20:29:07] I tried "find -type f -exec grep foo {} \; -print" in web root and nothing found, the same for DB dump.
No sign of "foo" [20:30:15] Bruce__: by default they're stored in /images [20:31:21] Bruce__: does it match your current location? [20:32:43] For example one such image is being saved into (the old location) /home/sites/foo/example.com/extensions/WikiEditor/modules/images/toolbar/arrow-down.png but I am now using /var/www/foo/webs/example.com/... [20:34:19] well, those images aren't being saved. They're part of the WikiEditor extension, that's a very different problem! [20:34:45] most likely a problem with ResourceLoader's cache [20:35:11] let me see if I remember how to clear it [20:37:10] Vulpix: Thanks, I think I will be able to solve it now. I just needed a little kick :) [20:39:23] Bruce__: if you still have problems with image links/css/scripts, just clear the contents of the module_deps table https://www.mediawiki.org/wiki/Manual:Module_deps_table [20:42:17] hi DanielK_WMDE [20:43:16] hi sumanah [20:43:32] I am about to run off for an appointment but I will be back in <2 h [20:43:36] hi fhocutt - see ya! [20:43:54] today I updated this: https://en.wikibooks.org/w/index.php?title=Perlwikibot&stable=0 [20:44:18] Yes, I found the bugreport (https://www.mail-archive.com/wikibugs-l@lists.wikimedia.org/msg127256.html) too :) This solved the problem with error logs but thumbnails are still not being displayed. I see that there is a call to /usr/bin/php but I have an open_basedir restriction. [20:44:19] and have continued to make notes here: https://www.mediawiki.org/w/index.php?title=API_talk:Client_code [20:44:24] later sumanah! [20:44:42] :) [20:47:58] Bruce__: the call to /usr/bin/php is not for thumbnails, probably for Job queue: https://www.mediawiki.org/wiki/Manual:Job_queue#Changes_introduced_in_MediaWiki_1.22 [20:48:36] Bruce__: if you have MediaWiki 1.22, set $wgPhpCli = false; at least until 1.23 [20:49:17] Vulpix: yes, I thought it would be something like this. Now I should be able to handle it on my own, thank you for your help!
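[Editor's note] Returning to the earlier Special:RandomInCategory question: since the special page doesn't descend into subcategories, a client-side walk over list=categorymembers can. This is an illustrative sketch, not an existing tool; the helper names are made up, the HTTP call is injected as `fetch` so the logic is testable offline, and it follows the query-continue continuation protocol of 2014-era MediaWiki:

```python
import random

def category_members(fetch, category):
    """Yield (title, is_subcat) for one category via list=categorymembers.

    `fetch(params)` performs the api.php GET and returns parsed JSON.
    A real client should inspect the member's namespace field; checking
    the "Category:" prefix is a simplification for this sketch.
    """
    params = {
        "action": "query", "list": "categorymembers", "format": "json",
        "cmtitle": "Category:" + category, "cmlimit": "500",
    }
    while True:
        data = fetch(dict(params))
        for m in data["query"]["categorymembers"]:
            yield m["title"], m["title"].startswith("Category:")
        cont = data.get("query-continue", {}).get("categorymembers")
        if not cont:
            break
        params.update(cont)

def random_page_in_tree(fetch, category, max_depth=3):
    """Collect pages from a category and its subcategories, pick one."""
    pages, queue, seen = [], [(category, 0)], set()
    while queue:
        cat, depth = queue.pop()
        if cat in seen or depth > max_depth:
            continue
        seen.add(cat)
        for title, is_subcat in category_members(fetch, cat):
            if is_subcat:
                queue.append((title[len("Category:"):], depth + 1))
            else:
                pages.append(title)
    return random.choice(pages) if pages else None
```

Against a live wiki, `fetch` could be `lambda p: requests.get("https://en.wikipedia.org/w/api.php", params=p).json()`; the depth cap and the `seen` set keep Wikipedia's loopy category graph from recursing forever.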
[20:49:42] no problem :) [20:51:22] in 10 min in #wikimedia-office , we're talking about the square bounding box proposal + the typesafe enums proposal https://www.mediawiki.org/wiki/Architecture_meetings/RFC_review_2014-05-21 [20:52:13] Didn't we already talk about square bounding boxes? [20:52:40] bawolff: yep - in general I think it is likely we will sometimes talk about things more than once to get them all finished [20:56:58] bawolff: Hi [20:57:13] Hi. I need to respond to your email, don't I :) [20:57:42] Yeah if you are free :) [20:58:55] I had a doubt regarding the SQL tables, if I add an SQL entry and then run update.php, can't I use the same wiki pages [20:59:23] yeah, sorry I didn't earlier. I was away, and then I was sick and have a bit of an overflowing of email [20:59:27] I mean does the script not create the entry for all the pages? [20:59:50] bawolff: It's ok. Sorry to know about your sickness [21:00:13] I think I'm mostly over it now. [21:00:48] Great :) [21:01:36] If I remember correctly, I didn't like the page_props table/{{#setpagelanguage:..}} originally, because it felt circular to define the language for a page inside the page itself (What if something uses the language before the define statement) [21:05:04] Sorry got disconnected [21:11:13] bawolff: And about using SQL database, in case we need to define a new entry for pages, how to update the wikis to use the same pages? I am getting a database error [21:11:33] I ran update.php [21:12:28] kunalg_: Can you give me more context [21:13:01] JeroenDeDauw: DanielK_WMDE: it strikes me that you would have opinions on https://www.mediawiki.org/wiki/Requests_for_comment/Typesafe_enums which we are discussing in #wikimedia-office right now [21:16:45] I made a new entry for pages in tables.sql and WikiPage.php. My wiki is already set up. On running update.php, what I thought was the entry will be created for all the pages with the default that I specify.
[21:17:05] bawolff: Do I need to do it manually? [21:17:31] Oh I see. You need to add it to the database updater too [21:18:16] kunalg_: You have to add an entry in includes/installer/MysqlUpdater.php and also make a patch file in maintenance/archives [21:18:56] is there any way to use parsoid to convert html to wikitext without submitting the result to a page? [21:19:37] bawolff: Oh. Ok. Thank you [21:24:24] onei, not sure what you mean [21:24:39] did you check https://www.mediawiki.org/wiki/Parsoid#The_Parsoid_web_API ? [21:26:37] gwicke: I was looking at action=visualeditoredit on the api, didn't see that [21:26:41] thanks :) [21:30:07] onei, the VE api is private [21:30:23] & you are welcome ;) [21:30:31] you can run it through node though, I assume? [21:32:46] the mediawiki.org internal links have a popup dialog. it is very buggy for me, flickering. anyone know if this is a known issue? or what component it is? cuz I don't think I have good search terms for bugzilla [21:36:06] Navigational pop ups beta feature maybe [21:43:00] ahh, found it, hovercards [21:44:06] and yes, it is already reported [21:46:11] bawolff: Whatever it's called this week, amirite [21:46:36] damn people changing the names of things [21:46:55] * marktraceur hides his name-change-related patchsets [21:46:58] * bawolff still holds a grudge on pending changes. Also echo vs whatever its called now [21:47:09] * ^d has popup blockers installed for a reason. [21:47:22] bawolff: I was going to make Extension:TechDebt, but now I'm thinking I could just rename UploadWizard [21:47:37] <^d> I see what you did there ;-) [21:58:49] sumanah: about? [21:59:05] hi Reedy [21:59:12] Mind if I PM?
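[Editor's note] bawolff's two steps above look roughly like this in the 1.2x series. The field and patch names here are placeholders, not real schema:

```php
// includes/installer/MysqlUpdater.php, inside getCoreUpdateList():
array( 'addField', 'page', 'page_my_new_field', 'patch-page_my_new_field.sql' ),
```

The matching maintenance/archives/patch-page_my_new_field.sql would then hold the ALTER TABLE statement, written against `/*_*/page` — the `/*_*/` marker is how core patch files account for $wgDBprefix. After both are in place, `php maintenance/update.php` applies the change to an existing wiki.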
[21:59:21] No prob [22:06:09] * rillke is bugged by the jQuery update [22:11:23] * bd808 tells rillke to go write some php instead [22:13:01] bd808: Don't make it *worse* [22:17:09] Today I visited Coney Island for the first time [22:17:14] How do I rebuild Special:Statistics data because http://dev.brickimedia.org/wiki/Special:Statistics is way off [22:17:43] It had an import run on it and then was wiped, but Special:Statistics still has the data from when the import was present [22:18:26] Not really a high priority since it isn't a content wiki but it throws things like WikiApiary off [22:19:41] Meiko: try running maintenance/initSiteStats.php [22:20:43] sumanah: Neat. See anything super cool? [22:21:53] bd808: The Atlantic Ocean is a really fearsome thing. It is huge. It just keeps washing up against the beach, in a regular rhythm. [22:22:27] Every time I visit a beach and see and hear an ocean, I think how amazing it is, and think about the fact that neither I nor anyone has to project-manage it [22:22:54] bawolff: Thanks, that was the script I was looking for :D [22:23:15] Well the moon and wind are constantly adjusting it [22:23:16] lol [22:23:44] bawolff: I just ran it but it didn't work :O [22:23:58] And I did updateSpecialPages.php too [22:24:00] That's sad when that happens [22:24:13] Did it error out, or are the numbers just off [22:24:19] They're just off [22:24:26] oh well I'm going to go play some video games and pretend I didn't bother trying to fix it :P [22:24:41] bd808: yeah ..... but there is no Gantt chart, no burndown chart, no person whose job it is to make the ocean waves happen. Someday I will have been away from management long enough that I don't think about that anymore [22:24:42] maybe [22:25:01] sumanah: I'm very enamored of the oceans too. Growing up in a landlocked state and a desert area makes me really awed by that much water.
[22:25:06] Meiko: For the article count, keep in mind the definition of an "article" is a little odd (It depends on namespace and if it contains at least 1 link) [22:25:31] There's only like 3 pages on the wiki though, that's what is weird [22:25:35] and it doesn't care about me and it has never either been proud of me or disappointed in me [22:25:39] http://dev.brickimedia.org/wiki/Special:AllPages [22:26:01] The first time I went to Fiji i realized it was the inverse of Idaho. Mostly water with a little land sprinkled here and there [22:26:05] Meiko: Yeah, that's a little off :) [22:26:11] bawolff: just a little :P [22:26:40] But like I said, it isn't a high priority since that's not even a content wiki; just a test instance of MediaWiki/Brickimedia [22:27:34] bd808: nod nod nod [22:29:06] bd808: have you ever read Orwell's "Some thoughts on the common toad"? [22:29:31] I don't think I have. Public domain by now I suppose? [22:29:33] not quite, no [22:29:35] http://theorwellprize.co.uk/george-orwell/by-orwell/essays-and-other-works/some-thoughts-on-the-common-toad/ [22:29:55] the last paragraph of that essay is something I can nearly recite by heart, I love it so [22:30:39] Ha. That is nice and horrible at the same time. [22:32:16] Orwell had the turn of mind that could think and articulate: "There must be some hundreds of thousands, if not millions, of birds living inside the four-mile radius, and it is rather a pleasing thought that none of them pays a halfpenny of rent." [22:33:32] I'm turning that over in my head, the attention to numbers, nature, power, economics. Anyway. Gotta put up the RfC meeting notes. bd808 like I said in email, glad your structured logging work is chugging fwd [22:34:05] sumanah: Thanks for checking on it. Someday it will merge... someday. [22:34:29] :) [22:50:18] hello sumanah! [22:50:33] hi fhocutt! 
[22:51:00] fhocutt: perfect timing, in that I have now just sent a note to wikitech-l and am sort of done with RfC review work for this week. Kind of [22:52:23] oh excellent. (mostly.) [22:52:32] :) [22:52:53] so, fhocutt have you & Merlijn been able to work on the shortlisting for a few languages? [22:53:05] not yet, although Perl has taken care of itself [22:53:09] nod [22:53:22] I have been spending most of my time on API talk:Client code [22:53:28] https://www.mediawiki.org/w/index.php?title=API_talk:Client_code [22:53:38] nod [22:54:16] By the way fhocutt did you take a look at my blog post http://www.harihareswara.net/sumana/2014/05/13/0 ? it has a nugget near the end you might find interesting [22:55:09] I am working through them myself--for the simpler ones it is pretty clear from the API calls what is going on, at least in terms of what it's trying to do [22:55:27] cool [22:55:39] (like, is it fetching any sort of "querycontinue=stuff", does it do continuations [22:55:42] ) [22:55:47] right [22:56:15] this morning I was taking a look at pywikibot, which seems rather like a sledgehammer if you're just trying to fetch some data [22:57:01] sumanah: there exist better PHP REPLs. `boris` is pretty okay. [22:57:05] as makes sense from its history it is very focused on enabling bots to make edits, and the documentation also focuses on this [22:57:31] cscott: right - and [22:57:44] The script maintenance/eval.php in MediaWiki provides a basic PHP interpreter with MediaWiki objects and classes loaded. (so says https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker ) [22:57:44] https://github.com/d11wtq/boris [22:57:44] fhocutt: yes indeed [22:57:47] sumanah: oh, that's a new one to me! that's handy [22:58:09] fhocutt: I ran into that problem early in my Hacker School study period. I tried to use pywikibot and was very overwhelmed. Fortunately someone told me about requests! [22:58:27] (I did see that post! Did not know about Apache rewriting headers.) 
[22:58:30] cscott: I shall keep `boris` in mind [22:59:07] sumanah: when I was working on my microtask I looked at pywikibot and quickly decided to go evaluate something else [22:59:11] sumanah: i'm going to add both maintenance/eval.php and boris as a comment on your blog, for other readers [22:59:33] cool! [22:59:39] fhocutt: good choice! [22:59:51] pywikibot is right now a power user's tool [23:00:23] fhocutt: https://www.mediawiki.org/wiki/Amsterdam_Hackathon_2013#Wikibots_.28pywikipedia_and_others.29 and https://www.mediawiki.org/wiki/File:Bots_hackathon_2013.pdf may end up helping you with pywikibot [23:00:26] that is what it looks like; I didn't see any obvious documentation on just using it to get data. [23:00:38] yeah [23:00:45] it's really meant for you to do sophisticated things with [23:00:48] not simple things [23:00:50] in my opinion [23:01:12] this is why I think it's a good idea to have a handful of featured libraries, if available [23:01:25] I will look at those links after [23:01:27] different use cases [23:01:29] nod nod [23:01:31] precisely. [23:01:45] because for Python, there are about 3 levels [23:01:57] * sumanah listens [23:03:44] there's pywikibot (giant, scary, and optimised), wikitools/mwclient (handle GET/POST and have abstraction), Wikipedia only handles GET but has abstraction, and then something like simplemediawiki is really just an API wrapper [23:03:50] so 3.5 levels :) [23:05:08] fhocutt: (btw I sort of hope someone calls me "giant, scary, and optimised" someday. side note.) [23:05:13] if continuations were implemented on simplemediawiki all of them except for Wikipedia would pass our minimum bar [23:05:16] * fhocutt grins [23:06:01] but I don't feel like I can usefully pick one or maybe even two. 
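[Editor's note] The "does it do continuations" minimum bar fhocutt mentions can be made concrete. Below is a sketch of the loop a client library would hide, written against the pre-1.26 query-continue protocol in use at the time; `query_all` and the injected `fetch` are made-up names, not any library's actual API:

```python
def query_all(fetch, params):
    """Follow the (pre-MW 1.26) query-continue protocol for a list query.

    `fetch(params)` performs the api.php request and returns parsed JSON.
    Yields every item across all batches.
    """
    params = dict(params, action="query", format="json")
    listname = params["list"]
    while True:
        data = fetch(dict(params))
        for item in data["query"][listname]:
            yield item
        cont = data.get("query-continue")
        if not cont:
            break
        # Merge the continuation parameters into the next request:
        for extra in cont.values():
            params.update(extra)
```

A library that "does continuations" runs this loop for you; one that doesn't silently hands back only the first batch, which is exactly the gap being flagged in simplemediawiki above.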
[23:06:54] fhocutt: "Wikipedia only handles GET but has abstraction" - I feel like (a) terrible name collision and (b) yeah, doesn't pass our minimum bar and (c) I know I keep harping on this, but requests is awesome [23:06:54] if I were picking two, I would probably implement continuations in simplemediawiki and then choose either mwclient or wikitools, as long as pywikibot was such a power-user tool. [23:07:14] do any of the maintainers seem particularly responsive? [23:08:07] I don't remember if Wikipedia (agree on name) uses requests or not--it's that it doesn't handle tokens and I think not cookies either [23:08:36] I haven't pinged many of them. I did get a response from the maintainer of perlwikibot when I asked about some old documentation [23:09:02] We'll find out at a future step, then [23:09:06] I cleaned this up this morning: https://en.wikibooks.org/w/index.php?title=Perlwikibot&stable=0 [23:09:23] after getting a response here: https://github.com/MediaWiki-Bot/MediaWiki-Bot/issues/56#issuecomment-43701499 [23:09:35] this is good, because it appears to be the only currently maintained Perl library! [23:10:23] I feel like "Mike Doherty" sounds familiar [23:10:50] no, just one of those names [23:11:39] going back to Wikipedia, it does use requests, yes [23:12:33] the ones that I most need another set of eyes on are Java and JS [23:13:07] ok, so, with Java, IIRC you are working with either Tollef or Merlijn - Merlijn I think? I remember he said he could work with you over email on this [23:13:16] and with JS IIRC you will have a chat with Brad tomorrow [23:14:01] yes, planning on those! I looked at Java and got confused by how very many classes there are [23:14:10] I have not touched Java for ~10 y [23:15:18] so sumanah, are the notes on API talk:Client code the sorts of things you were looking for? [23:15:30] yes [23:15:40] good. [23:16:36] so to finish that up I'll talk with Merlijn/Tollef/Brad [23:17:26] what next?
Continue writing up specific criteria now that I have more of a sense of what's out there? [23:17:30] fhocutt: I figure that by the end of this week you'll be able to say "here's the shortlist" for each language by, like, bolding or subsectioning the relevant sections [23:18:02] Perl's shortlist is already taken care of by default [23:18:18] right [23:18:27] Python's shortlist is not so short, I would say [23:19:21] and then next - I'm looking at https://www.mediawiki.org/wiki/Evaluating_and_Improving_MediaWiki_web_API_client_libraries#Deliverables - I think it would be good for you to talk with Mithrandir about what the evaluation criteria/library "gold standard" would look like. He knows the most out of the four of us, I think, about what devs want in an API client library [23:19:43] I see that your Week 2 goals include Finish writing up evaluation criteria/library "gold standard" [23:19:55] ok, that sounds like a plan. [23:20:18] yeah. "Research and decide on criteria to evaluate these libraries in more depth" to lead into the week 2 goals [23:20:30] your plan is useful I think! [23:20:32] Ruby has only two libraries to put on a possible shortlist, and one of them doesn't do login [23:20:57] so it's really going to be Python/Java/JS where most of the work is. [23:21:22] I'm glad that we are saving time on Perl/Ruby to give you more time on Python/Java/JS [23:22:03] I'm a bit sad because I'd enjoyed the bits of Perl from dreamwidth work, but there is always more of that to do! [23:22:07] yes! [23:22:36] and if you really want the Perl library to be the one that you contribute to for the last month of your internship then maybe you could do that? [23:23:05] yeah, we will see. [23:23:13] for your career in general I would not say that's the most promising one to choose. :-) but hey. [23:23:31] I hear that learning Perl is a great way to do nothing but Perl forever [23:23:34] hahahahaha [23:24:09] getting familiar with JS would definitely be more versatile! 
[23:24:09] fhocutt: is now a good time to chat for a few min about Wiki Conference USA & your session? since I have been regrettably tardy in replying to your email about structuring your session(s)? [23:24:29] yes please. I need to get those ready! [23:24:35] We will be starting office hours with the incoming Wikimedia Foundation ED in just over 5 minutes (at the bottom of the hour) in #wikimedia-office. [apologies for multi channel spam] [23:24:42] fhocutt: I'm sorry [23:25:18] is ok--I have also been pretty distracted with getting this started. [23:26:11] so for OPW, I will email Tollef/Brad/Merlijn tonight with what I have and some questions about JS/Java and then what developers want in a library [23:26:21] cool [23:26:25] (developers at different levels, also) [23:26:31] fhocutt: so, you're right that not everyone will have a laptop, and not everyone will have Python installed, and probably there will be crashy slow wifi [23:27:12] would turning it into something more like a live demo with copious notes available to follow along be an option? [23:27:30] I am also concerned about time. [23:27:40] fhocutt: I hear ya [23:28:13] I still need to email the organizers and see if I can do a 20/40min split, not 30/30 [23:28:42] while I could do a 30-min beginner-level talk about the API, I would like to have more time for the workshop [23:29:08] fhocutt: so, here is what I am thinking. At the very beginning of the first part, http://wikiconferenceusa.org/wiki/Submissions:Using_the_MediaWiki_web_API_to_get_%28only%29_the_data_you_need , announce that helpers will circulate and help people get set up for Part II [23:29:37] fhocutt: so, DURING part I, people like me can help go around and get the right stuff installed on people's laptops, one by one, while most people listen to your intro talk [23:29:57] I worry about that being fairly distracting. [23:30:02] Then!
by the time part II rolls around: http://wikiconferenceusa.org/wiki/Submissions:Using_web_API_client_libraries_to_play_with_and_learn_from_our_%28meta%29data [23:30:17] it would be somewhat distracting. [23:30:51] So, then, you could simply do 20 min of intro in Part I, then 10 min of setup, then the 30-min workshop [23:31:19] that is a possibility. [23:32:11] how comfortable are you setting up Python + installing libraries on Windows and Mac? [23:33:35] I can borrow the https://openhatch.org/wiki/Python_Workshops playbook [23:33:46] oh, useful. [23:35:21] hm, ok. [23:36:01] wikitools and mwclient are the obvious options for using libraries [23:36:38] I am wondering if I need to cut anything that I said I was going to cover, for a 30-min workshop. [23:37:10] I think https://openhatch.org/wiki/Philadelphia_Python_Workshop/Setup is the playbook we would copy but I will check in #openhatch [23:38:06] sumanah: tim and i will likely be in a meeting at 2pm next wednesday, to warn you [23:38:23] brion: ok, would Friday May 30 be better? [23:38:53] sumanah: that works for me but check with tim, fridays are ….. some other day for him ;) [23:38:59] true [23:39:16] brion: since next week is a weird week due to Monday in the US being a holiday, would you maybe be ok with doing the meeting Tues in stead? [23:39:18] instead* [23:39:26] lemme check cal [23:39:56] sumanah: 27th at 2pm pacific works for me [23:40:25] fhocutt: goals 1-5 in https://openhatch.org/wiki/Boston_Python_Workshop_6/Friday are applicable re what we want our participants to be ready to do by the time they come into your classroom [23:41:18] fhocutt: so I am leaning more and more towards the idea that what you do in the Part II of your workshop should be a tour with lots of notes and a page somewhere with all your notes/links/scripts so people can follow along [23:41:34] sumanah: yeah, that is what I'm thinking.
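[Editor's note: the raw api.php workflow that wikitools and mwclient wrap can be sketched with only the Python standard library. This is an illustration, not from the log; the endpoint, page title, and parameter choices are assumptions picked to show how much boilerplate a hand-built query carries before a client library hides it.]

```python
from urllib.parse import urlencode

# Assumed public endpoint; any MediaWiki wiki exposes one at /w/api.php.
API_ENDPOINT = "https://en.wikipedia.org/w/api.php"

def build_query_url(title: str) -> str:
    """Hand-roll an api.php URL that fetches the latest revision text of a page.

    Every raw request repeats this boilerplate: the action, the output
    format, and a prop module with its own prefixed options (rvprop, rvlimit).
    """
    params = {
        "action": "query",
        "format": "json",
        "titles": title,
        "prop": "revisions",
        "rvprop": "content",
        "rvlimit": 1,
    }
    return API_ENDPOINT + "?" + urlencode(params)

url = build_query_url("Main Page")
print(url)
```

With a client library the same fetch is typically a one-liner against a page object, which is the gap the workshop's Part II is meant to demonstrate.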
[23:41:45] and then an unconference/open space session on the last day of WCA where we walk people through installation and the various other steps if they want/need help [23:42:08] because 1-5 is a lot for someone who's just fiddled with templates, which is all the experience that the description requires [23:42:10] oh that's right, there's open space! [23:42:47] yes! [23:43:10] ok, cool, if anyone wants that we can offer that. [23:43:51] I will, of course, take a page from Software Carpentry and make sure that I am doing live coding :) [23:44:37] does something that I can get set up and do in ~15 min sound like a good idea? [23:45:09] very possibly! are you gonna try to beta test it on your Seattle friends? [23:45:21] I will try! [23:45:54] having a backup thing that you can do with no internet connection is also a good idea [23:45:55] scheduling, and the fact that I leave for NY in 9 days iirc and there is a giant folk festival for 4 of them :) [23:46:00] ooh good idea. [23:46:39] you don't have to take hours to make slides - something, even if it's just screenshots of datavisualizations other people made [23:47:02] planning to have those in my first presentation [23:47:14] I am not going for superfancy slides [23:47:49] oh, is there going to be ppt karaoke? Because I will hunt down some of my chemistry slides if so :D [23:48:06] hahahaha - I was not planning on running PPT karaoke but I would not object to you foisting some decks into the mix if there is! [23:49:05] ok, so I think that is a plan [23:49:49] Depending on time the client libraries bit may be "look at how much less I have to stare at api.php to write this request!" [23:50:00] yeah! [23:50:21] "come find me after if you actually want to get started with one" [23:50:40] it's only really useful for queries if the libraries are better documented than the API [23:50:43] fhocutt: I want all of this to of course be better than grad school was for you. 
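[Editor's note: "how much less I have to stare at api.php" has a concrete example in query continuation: when a result set spans multiple responses, api.php returns a continue token that the caller must merge back into the next request. The sketch below shows that manual loop; the `fetch` stub and its page titles are invented stand-ins for a network call, so only the control flow is real.]

```python
def fetch(params):
    """Stand-in for an HTTP GET to api.php (invented data, no network).

    Serves two pages of category members, signalling more data via a
    'continue' key the way the real API does.
    """
    if "cmcontinue" not in params:
        return {
            "query": {"categorymembers": [{"title": "Page A"}, {"title": "Page B"}]},
            "continue": {"cmcontinue": "page|C", "continue": "-||"},
        }
    return {"query": {"categorymembers": [{"title": "Page C"}]}}

def all_category_members(category):
    """Handle continuation by hand -- the loop a client library hides."""
    params = {"action": "query", "list": "categorymembers", "cmtitle": category}
    titles = []
    while True:
        result = fetch(params)
        titles.extend(m["title"] for m in result["query"]["categorymembers"])
        if "continue" not in result:
            break
        params.update(result["continue"])  # feed the token back into the next request
    return titles

print(all_category_members("Category:Example"))  # prints ['Page A', 'Page B', 'Page C']
```

Libraries such as wikitools and mwclient generally wrap this loop behind an iterator, which is why they pay off most for query-heavy scripts.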
however at the end of this summer you will sort of be like a PhD in that you will be the world expert in our API client libraries. if you are not already [23:51:02] it is possible that I am getting there :D [23:51:14] but I imagine this kind of presentation is ....... very different from giving an academic paper [23:51:26] I am reassuring myself with the knowledge that I know more about this than pretty much anyone else at the conference [23:51:44] it's different from that, yeah, less "here's what I've done" and more "here's how it works" [23:52:09] but I've been explaining what I'm doing for this internship to friends and family, and that right there is at least half of the initial talk [23:52:26] "what's an API" "what are client libraries" [23:52:26] fhocutt: perhaps you have heard of the Kathy Sierra insight that as long as you stay focused on helping your audience get more awesome & gain skill, you will be golden [23:52:39] fhocutt: did I tell you about the time I had to explain what an API is to Wyatt Cenac? [23:52:56] yes! I was trying to focus on "what will the audience get out of this" when I was writing up my abstracts [23:53:01] and those will be the outline for my talk [23:53:04] yes! [23:53:06] who's Wyatt Cenack? [23:53:10] *Cenac? [23:53:11] stand-up comic [23:53:13] a Daily Show contributor [23:53:17] ah! [23:53:32] no! [23:53:41] I was reading "RESTful Web APIs" just before a stand-up comedy show, I was in the front row, he came out, I had not put my book away, he asked about it [23:54:03] btw, @lilatretikov is over in #wikimedia-office right now [23:54:11] that sounds unexpected! [23:54:13] btw when he read the title aloud, someone in the back yelled YEAH! so I know there is at least one developer in Brooklyn who is really into the REST philosophy of web API architecture [23:54:23] * fhocutt grins [23:56:53] for my timetracking/categorization purposes, would you say that the conference prep is part of my OPW-stuff or not? [23:57:12] anyway.
So, you will indeed know more about your topic than most people at the conference, and those who know more than you will probably not come to your session because they want to do other things! [23:57:40] yeah I would say your conference prep counts as OPW-related, since you are giving a talk that is all about your OPW project really [23:57:53] I mean, you would not be working on these things if not for OPW [23:57:59] I would not be! [23:58:19] cscott: will you be at Wiki Conference USA? [23:58:44] and yes. I'm not very nervous about actually talking/presenting, just about making sure that I have good materials and information to support myself as I do [23:59:12] so I also plan to draft those this week [23:59:26] cool [23:59:47] ok! Anything else, sumanah? If not I will go off and send emails and write slides and such.