[00:02:01] jackmcbarn: hmm? [00:05:19] Skizzerz: never mind, i figured out what i needed (was trying to figure out why you did something to the code back in 2007) [00:06:34] ah, ok :) [00:06:46] chances are whatever I did back in 2007 isn't necessary/needed now [00:06:48] :P [03:03:18] where does one go to check all the classes that are predefined across the WMF wikis? [05:14:50] Hello! I am about to install mediawiki and the script installer asks me which protocol I prefer: http:// or http://www. Which one should I choose? Thank you. [05:15:43] eek [05:17:51] Hello! I am about to install mediawiki and the script installer asks me which protocol I prefer: http:// or http://www. Which one should I choose? Thank you. [06:08:32] you want it online or just intranet? [07:44:43] I wonder why touch.py doesn't support -summary! ;) [07:47:43] FYI https://www.mediawiki.org/wiki/User:AnankeBot [08:04:05] Hi [08:04:18] is there any option of doing oauth2 auth in mediawiki? [08:04:27] I want my users to sign up using their google accounts. [08:04:47] OpenID is not an option anymore as google dropped support. [08:05:09] Ouch, 70k translation units to go on mediawiki.org [08:05:56] otwieracz: http://markmail.org/search/?q=oauth2%20list%3Aorg.wikimedia [08:07:38] 5k outreach, 64k wikidata, 38k commons aaaaaaaaaaaaand... 242k meta! Sigh². [08:09:03] OK, but I do not see anything clear. [08:10:23] Your question wasn't, either [08:40:48] hi all ! [08:41:09] is there a french channel of mediawiki ? [08:55:39] https://github.com/joostdekeijzer/mw-oauth2-client-extension [08:55:45] You have any thoughts how to use it? [08:56:53] I've installed it, configured, I see „OAuth2 Client (Version 0.2)” in Special:Version [08:57:08] But I have no idea how to log in using oauth2 [08:57:41] There's nothing about oauth/whatever on login page [09:29:34] can someone tell me why my pdf file is always 0kb with pdfbook? [10:19:37] mranti: not without more info. 
:) What are steps to reproduce? [10:46:24] im looking to create a new logtype for my extension with subtypes "create, modify, delete" [10:46:38] to put into RC/Watchlist [10:46:43] how would I do that [10:51:45] Withoutaname: take a look at how delete does it [10:53:09] andre_: i installed html doc for windows [10:53:14] Betacommand: I did $logentry = new ManualLogEntry but I don't think that defines the type yet [10:53:33] then put the pdfbook files in /extensions/pdfbook/ [10:54:19] and set in the pdfbook.php file [10:54:29] # Whether or not an action tab is wanted for printing to PDF [10:54:30] $wgPdfBookTab = true; [10:57:51] mranti, how did you download the files? [10:59:50] http://www.mediawiki.org/wiki/Extension:PdfBook [11:00:29] wikimedia-mediawiki-extensions-PdfBook-01f5d14.tar.gz [11:00:39] mranti, which of the links there? [11:00:43] there are many... [11:04:34] http://www.mediawiki.org/wiki/Special:ExtensionDistributor/PdfBook [11:06:38] alright, so git master [11:09:24] !class ManualLogEntry [11:11:37] Withoutaname: So you want a new log, and you want to show creations and modifications on the same log page? What bit are you having a problem with? [11:13:25] Lcawte: I want a new log type like (Foo log) User:Admin (talk|contribs) created Foo ---- or (Foo log) User:Admin (talk|contribs) deleted Foo [11:14:15] ManualLogEntry does the latter part, but to get (Foo log) I have to specify type and subtype in the ManualLogEntry constructor [11:14:36] Have you had a look at https://www.mediawiki.org/wiki/Manual:Logging [11:18:49] Lcawte: oh thanks, ill look at that [11:19:23] Lcawte: im a little confused on the second line though, is "bar" a subtype to pass to logformatter [11:21:47] Withoutaname: I'm running late, I was supposed to leave a few minutes ago. Apologies, try looking up another extension or looking at the Manual pages :( [11:22:58] Withoutaname: I assume you read https://www.mediawiki.org/wiki/Manual:Logging_to_Special:Log ? 
People always ask about this topic but never on talk page so we never know what needs fixing on the docs :( [11:23:30] yeah, why do I need the logformatter [11:23:39] shouldn't it be the same in whatever skin im using [11:24:19] I get $wgLogTypes[] = 'foo'; where (Foo log) is the new log type, but $wgLogActionsHandlers['foo/bar'] = 'LogFormatter'; what does "bar" refer to [11:24:41] and there's a link to old svn... [12:19:55] everything needs a log formatter [12:33:52] Log formatter all the things. [12:37:05] "Are traditional Yellow pages classified ads better than online ads?2462256" That's a lovely spambot wiki page! [13:41:59] so cute https://archive.org/details/JosephLasicaCo-founderAngelaBeesleyonWikia [14:01:39] Hi! Are changes to the mail address in the MediaWiki preferences logged? I.e., if the address *now* is a@b.c, can I safely assume that it has been so at account creation? [14:03:12] am a newbie pulling my hair with getting ldap authentication to work with MW v1.23 on windows 2008 r2. Is there a forum available for troubleshooting? Error is: Failed to bind as DOMAIN\User. [14:04:02] !lists | sul_ [14:04:07] sul_: I suggest wikitech-l [14:05:25] thanks! will try wikitech-l [14:12:58] sul_: 1) no, they're not logged by default; Wikia has an extension for this, though IIRC it won't work as-is on a standard MW instance 2) ask on https://www.mediawiki.org/wiki/Extension_talk:LDAP_Authentication , Ryan Lane (the developer of the extension) is pretty responsive and actively monitors it; likewise, I'd assume that LDAP users are reading that page more than the mailing lists ;-) [14:16:44] thanks ashley! [14:17:14] you're welcome, let's hope you get everything sorted out :) [14:18:03] Umm [14:18:20] DPLForum is totally broken. I think I know why, though. :-) [14:18:34] because Wikia used it? [14:18:42] Well, it is a horrific mess, but no. 
[14:19:04] https://git.wikimedia.org/blobdiff/mediawiki%2Fextensions%2FDPLforum/390d6fed598b23e8b168c21d2035274639f24a2c/DPLforum_body.php [14:19:15] This is fine: [14:19:22] - if ( $row = $dbr->fetchObject( $res ) ) { [14:19:22] + $row = $dbr->fetchObject( $res ); [14:19:23] + if ( $row ) { [14:19:22] So excruciatingly slow... If only I had shell access. :) https://www.mediawiki.org/wiki/Special:Contributions/AnankeBot [14:19:33] This, not so fine: [14:19:33] - while ( $row = $dbr->fetchObject( $res ) ) { [14:19:34] + $row = $dbr->fetchObject( $res ); [14:19:34] + while ( $row ) { [14:20:58] All DPLForum pages will freeze with that code, which went out on the REL1_23 branch. If you put the assignment back into the while it works again. [14:21:59] why is it even using a damn while()? it should be using foreach ( $res as $row ) { /* code goes here */ } [14:22:05] but DPLforum is horrible anyway! [14:22:13] True and true! [14:22:24] WikiForum is...slightly less horrible and sorta maintained, too! [14:23:23] I guess it is doing it for consistency with the two other cases in that section where it's only grabbing one $row. [14:36:06] GreenReaper: https://gerrit.wikimedia.org/r/137931 [14:39:41] ashley: Seems legit. [14:39:49] * GreenReaper tries, it doesn't fall over. [16:02:42] Anyone can help me use LogFormatter? [16:06:33] kunalg, might be easier to help you if you explained the issue first [16:06:59] Krenair: I am trying to make a log formatter in core [16:07:29] ok [16:07:44] Krenair: It should only be declared in $wgLogActionsHandlers right? [16:08:31] kunalg, https://www.mediawiki.org/wiki/Manual:Logging_to_Special:Log#1.19_and_later [16:08:37] yes [16:09:27] Seen that, even the default LogFormatter should give some output right? [16:24:30] are there any gateways between Mediawiki discussion extensions and mailing lists? 
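The loop regression pasted above is a classic hoisted-assignment bug. A minimal sketch in Python for illustration (DPLforum itself is PHP; FakeResult and read_rows are invented names, not extension code): the broken refactor is equivalent to fetching the row once and looping on a value that never changes, while the fix re-fetches inside the loop, which is also what PHP's foreach ( $res as $row ) does implicitly.

```python
# Python illustration of the DPLforum while-loop bug discussed above.
# FakeResult is an invented stand-in for a DB result set; the real
# code uses $dbr->fetchObject() in PHP.

class FakeResult:
    """Yields rows one at a time, then None when exhausted."""
    def __init__(self, rows):
        self._rows = list(rows)

    def fetch_object(self):
        return self._rows.pop(0) if self._rows else None

# The broken REL1_23 refactor was equivalent to:
#     row = res.fetch_object()
#     while row:        # `row` is never re-fetched -> infinite loop
#         process(row)

def read_rows(res):
    # Fixed form: the assignment goes back inside the loop body.
    out = []
    row = res.fetch_object()
    while row:
        out.append(row)
        row = res.fetch_object()  # the line the broken version dropped
    return out

print(read_rows(FakeResult(["a", "b", "c"])))  # → ['a', 'b', 'c']
```

The same termination guarantee is what an iterator-based loop (Python's `for row in res`, PHP's `foreach`) gives you for free, which is why the foreach suggestion in the log avoids the bug entirely.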
[16:32:49] Reedy: do you have a moment to rsync a big file onto Commons for me? a new version of https://commons.wikimedia.org/wiki/File:Sumana-Harihareswara-wikicon-2014-keynote.webm [16:48:00] Hi [16:48:07] I need some special help with skinning [16:48:17] I'd like to place a link to a wiki page in my skin [16:48:35] but instead of doing a <a href="...">foobar</a> [16:48:55] I am looking for a skin method which creates the href for a given title [16:49:00] is that possible? [17:18:17] root-80686: Hi, you can use a Title method, Title::getFullUrl() [17:19:26] kunalg: this is when I have first created a new page object, right? [17:19:42] root-80686: A new Title object [17:19:55] I found now "SkinTemplate::makeUrl()" [17:22:04] root-80686: Oh. Cool [17:22:17] reading some source code ;-) [17:22:26] but I am already stuck again [17:22:35] want to build the "print version" link [17:22:50] so I need to get the URL of the current page - or the current page title [17:23:14] I guess $this->text('title') already returns an HTML-friendly output [17:32:29] root-80686: I am not sure what you need but I think you might want to do something like $this->getSkin()->getTitle() [17:32:52] root-80686: Helps to make a Title object [17:43:57] <?php $this->msg('printableversion') ?> [18:20:57] withoutaname: Have you managed to sort your logging code out? I'm back now if you still need help. [18:22:23] Lcawte: In https://www.mediawiki.org/wiki/Manual:Logging_to_Special:Log $wgLogActionsHandlers['foo/bar'] = 'LogFormatter'; sends "foo/bar" to the formatter, but what does "bar" refer to [18:25:25] Right, so $wgLogActionsHandler simply defines what log handler a log type uses. The 'foo/bar' bit is the 'log/action' like in your ManualLogEntry(); call. [18:26:42] so you can send different actions to different formatters? [18:27:10] factor: I'm here! [18:27:12] err [18:27:15] fhocutt: I'm here! [18:27:40] valhallasw: hi! Let me get my tea, and I'll be ready to get started. [18:27:49] Yes, afaik. 
[18:28:36] I see, thanks Lcawte [18:29:01] Obviously you don't have to, you can wildcard the action to send them all to one formatter which means you have to write less code in your main file. [18:33:03] valhallasw, ok! [18:33:58] fhocutt: I'm looking at the simplemediawiki evaluation at the moment. I like the checklist-y style! [18:34:13] ty! [18:39:43] fhocutt: OK, so what should we look at? Should I give some more specific responses to the SMW eval, or should we take a look at other libraries? [18:40:26] could you take a look at the SMW github and see if what you see matches up with what I have there? [18:40:52] valhallasw: and then I was thinking of moving on to wikitools, for a more complex library [18:41:07] OK. Taking a look at SMW's github now [18:41:57] fhocutt: oh, I just realised we are missing a point that should be in the evaluation: the license used [18:42:18] valhallasw: oh, right! I assumed everything would be open automatically :P [18:42:22] I'll add a point [18:42:24] a check mark on whether the library explicitly mentions the license, and just a note which one it is [18:42:39] because if the library doesn't mention the license, one can't use it legally [18:42:53] and maybe for python whether there is python 3 support [18:44:05] I was going to put that under one of the usability points, hm [18:44:41] but it doesn't really fit, I will make a new bullet point [18:46:11] ok [18:46:57] "Can be used with the most recent stable version of the language it is written in (e.g. Python 3 compatible)" [18:47:27] Yeah, that sounds nice and generic :-) [18:47:53] fhocutt: one thing to note @ https is that not all libraries actually *check* https certificates [18:48:01] are there any gateways between Mediawiki discussion extensions and mailing lists? [18:48:17] valhallasw: could you say more about that? 
fhocutt: basically, if you don't check the server's certificate, you are not protected from a man-in-the-middle attack [18:49:03] fhocutt: because the attacker could just tell you 'Hey! I'm mediawiki.org, and this is my https certificate: ....' [18:49:13] valhallasw: ok, I see [18:49:22] I'm getting some messages in the debug log about "Class XXX not found; skipped loading" [18:49:42] fhocutt: I *think* urllib/urllib2 don't do this, but I'm trying to think of a way to easily test this [18:49:42] Is there a good way to track down why that's happening? [18:49:58] (specifically in this case it's some SimpleSAML classes) [18:50:46] valhallasw: stack overflow says it doesn't by default: http://stackoverflow.com/questions/6648952/urllib-and-validation-of-server-certificate [18:51:18] fhocutt: right. I'm trying to think of a way to test this easily [18:52:11] hm, requests does that validation by default [18:52:48] yep [18:54:00] valhallasw: there's no "ssl" in the code, so I am guessing that smw just doesn't handle https as expected. [18:54:38] fhocutt: ok, I've thought of a way to test it [18:55:25] https://procrastination.nl serves an incorrect certificate (my domain, without anything running) [18:56:16] so using that as url should give an 'incorrect certificate' error, and not a 'I don't understand the response' error [18:58:23] fhocutt: yep, JSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0), so it doesn't check certificates [18:58:41] yep, got the same thing. [18:59:00] I'll add some clarification to "supports https" [19:05:46] valhallasw: I am confused about why adding a link to the documentation in the README should have made the automatic testing fail for 2.6 and 3.3 [19:05:56] see https://github.com/ianweller/python-simplemediawiki/pull/10 and https://travis-ci.org/ianweller/python-simplemediawiki/builds/26683173 [19:06:17] fhocutt: I think maybe the commit you based on was already broken [19:06:38] or... 
[19:06:46] either that or the tests are flawed [19:07:03] yeah, it makes calls to https://simplemediawikitestsuite-ianweller.rhcloud.com/ - maybe that backend is flawed [19:07:14] doesn't work for me in any case [19:07:30] it's not loading for me [19:07:40] no, there it is [19:07:51] also, I'm confused why the library would need python-iso8601 as dependency [19:08:08] the github version doesn't use that :/ [19:09:11] the maintainer said it was an issue on Ubuntu's end, but I don't know much about how Ubuntu handles packages [19:09:19] oh, it's a dependency that was removed later on [19:09:43] hm, I see [19:09:47] https://bugs.launchpad.net/ubuntu/+source/python-simplemediawiki is the ubuntu bug tracker for python-simplemediawiki [19:10:21] I'll report this later today [19:10:24] https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=645164 is actually the same as your bug [19:11:02] so is this the sort of thing that would be under the control of the client library maintainer, or is it under the distro's control? [19:11:23] fhocutt: this is under the distro's control [19:11:25] hey all. I'm new to mediawiki administration. I'm looking to turn off minification of javascript assets in the resource loader. [19:11:34] I haven't found a setting to toggle for it, yet, though. [19:11:55] fhocutt: in this case, it seems it wasn't backported to 'precise' (ubuntu 12.04 LTS) [19:12:20] hm, ok. I'm on 14.04 myself [19:12:21] so once you upgrade to a more recent ubuntu, it should be fixed [19:12:25] hm. [19:12:26] that's weird [19:12:35] that's utopic, right? [19:12:38] trusty. [19:12:48] ah, right. [19:13:00] trusty also has version 1.0.2-2 :/ [19:13:22] I have a patched kernel so my wireless works so there is a chance it's specific to my computer. [19:14:44] yeah, the bug wasn't actually fixed in 1.0.2-2 it seems [19:14:53] so reporting it at the ubuntu end seems like the right thing to do [19:15:26] ok, will do this afternoon. [19:16:12] Cool! 
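The certificate question from the https discussion above can also be checked without any network traffic. This is a minimal sketch using only Python's stdlib ssl module; it shows the two verification policies in play, not simplemediawiki's actual code:

```python
import ssl

# What requests (and modern urllib) use: a context that requires a
# valid certificate chain AND a hostname that matches the certificate.
ctx = ssl.create_default_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True

# What old urllib/urllib2 effectively did: no verification at all.
# A man-in-the-middle presenting any certificate is accepted, which is
# why the bad-certificate test in the log fails with a JSON parse
# error (the response was read!) instead of an SSL error.
insecure = ssl._create_unverified_context()
assert insecure.verify_mode == ssl.CERT_NONE

print("default context verifies; unverified context does not")
```

The practical takeaway matches the log: a library that only "supports https" in the sense of speaking TLS, without verifying the peer, gives no MITM protection.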
Well, that's basically all I can think of on simplemediawiki [19:16:26] Basically because what's there is already really good :-) [19:16:38] suggestions for how to evaluate responsiveness when some issues/pull reqs are responded to and others are not? [19:16:41] :D [19:16:44] thanks! [19:17:09] I'm not sure. Basically, I use bugs also as a 'to do list', so I'm aware they are there, but they are not high priority to fix [19:17:28] However, I also think that's a bad way of doing things - acknowledging the bug has been seen is a good thing to do in itself [19:17:34] I'd like to see maintainers at least acknowledge them, yes [19:17:51] so that reporters aren't saying things into a vacuum [19:19:02] I suppose I could use something like "Out of X reports within the last year, N were responded to within 3 weeks and M within 3 days" [19:19:12] that's nice and quantified [19:19:42] fhocutt: I think it also feels weird because the response is not 'actionable' (not sure what word to use) -- basically, you'd want to say 'I have seen your bug, but have not tried to reproduce it, and thus have not fixed it. Oh, and it's not high-priority for me' [19:20:06] hah, yes [19:20:06] but as developer you're not doing something with it immediately, and there's also nothing you're asking from the submitter at this moment [19:20:38] I like the X within N days version, yes. [19:20:47] might be a lot of work to calculate, though [19:20:51] I see your point there [19:21:01] most of these are not very high-traffic [19:21:16] but I guess you can just sample them otherwise [19:21:38] be sure to also check the closed bugs when you do that, or you'd probably get a very negative view of every project ;-) [19:21:46] oh, definitely :D [19:22:12] if it's too time-consuming, I will figure something else out [19:22:32] I have 5-10 min left, can we look at the structure of wikitools before I need to run off? 
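The "Out of X reports, N within 3 weeks and M within 3 days" metric suggested above is cheap to compute once you have (reported, first-response) timestamp pairs for a sample of open and closed issues. A sketch with invented sample data (responded_within is a made-up helper, not part of any library):

```python
from datetime import datetime, timedelta

# Invented sample data: (reported, first_response); None = no response.
reports = [
    (datetime(2014, 1, 1), datetime(2014, 1, 2)),   # answered next day
    (datetime(2014, 2, 1), datetime(2014, 2, 15)),  # answered in 2 weeks
    (datetime(2014, 3, 1), None),                   # never acknowledged
]

def responded_within(reports, days):
    """Count reports that got a first response within `days` days."""
    limit = timedelta(days=days)
    return sum(
        1 for reported, response in reports
        if response is not None and response - reported <= limit
    )

total = len(reports)
print(f"{responded_within(reports, 3)}/{total} within 3 days")    # 1/3
print(f"{responded_within(reports, 21)}/{total} within 3 weeks")  # 2/3
```

As the log notes, the sample should include closed issues too, or the numbers will skew pessimistic.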
(this is, btw, also clear on your SMW bug: it's really easy to say 'not my issue, complain somewhere else'; it's much harder to reproduce & fix a real bug) [19:22:42] https://github.com/alexz-enwp/wikitools [19:22:47] yes, sure [19:23:16] oh dear, GPLv3 [19:23:32] ? [19:23:51] GPLv3 requires any project based on it to also be GPLv3 [19:24:06] I see [19:24:12] which some people see as an advantage ('helps spread free software') [19:24:45] right, but might discourage others [19:24:49] * fhocutt shrugs [19:25:02] so in the first pass of evaluation, my impression was that this was well documented but didn't have many features [19:25:03] anyway, it has an explicit license (good!) [19:25:28] it splits up its modules, which is good [19:25:54] hrm, where are the docs... [19:26:06] maybe just the example in the README [19:26:18] yeah, the splitting up is nice [19:26:32] I think it is just the example in the README [19:27:05] though it has docstrings, so let me see if they've done anything automated with them [19:27:08] it does do continuations, which is nice [19:27:27] here we go: https://code.google.com/p/python-wikitools/wiki/Documentation [19:28:05] 'The promised "QuickStart?" page doesn't seem to exist on this wiki.' *grin* [19:29:09] also nothing [19:29:13] oops [19:30:10] so it seems to be one step higher-level than SMW [19:30:17] yes. 
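The "continuations" praised above are the MediaWiki API's paging mechanism: a response may carry a continue object whose keys must be merged into the next request's parameters, until the server stops sending one. A minimal offline sketch of that loop (fake_api is an invented stand-in for api.php; real continuation values are opaque strings, not integers):

```python
def fake_api(params):
    """Invented stand-in for api.php: returns 2 category members per call,
    plus a 'continue' object while more remain."""
    members = ["A", "B", "C", "D", "E"]
    start = params.get("cmcontinue", 0)
    resp = {"query": {"categorymembers": members[start:start + 2]}}
    if start + 2 < len(members):
        resp["continue"] = {"cmcontinue": start + 2, "continue": "-||"}
    return resp

def all_members(params):
    """The continuation loop a client library does for you: keep feeding
    the server's 'continue' values back into the request params."""
    params = dict(params)
    results = []
    while True:
        resp = fake_api(params)
        results.extend(resp["query"]["categorymembers"])
        if "continue" not in resp:
            return results
        params.update(resp["continue"])

print(all_members({"list": "categorymembers"}))  # → ['A', 'B', 'C', 'D', 'E']
```

A library that handles this internally (as wikitools does) saves every caller from re-implementing the loop and from silently truncated result sets.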
it looks like the various methods are in the logical module [19:31:06] I'll have to do some testing to figure out how much of the API it covers [19:31:09] it basically exposes API requests in a small OOP layer ('Page', 'Category', and they are mainly used to group related API queries) [19:32:48] and then API has the machinery to handle the mechanics of API calls [19:32:54] (api.py) [19:32:57] right [19:33:33] note that this also uses urllib2, so no certificate validation [19:33:42] all of the Python ones do :/ [19:33:57] pywikibot uses httplib2 [19:34:02] there's an open issue suggesting requests, I'll comment there [19:34:05] oh, ok! [19:34:14] useful [19:34:19] not sure about the others; I would expect there to be a few ones using requests by now [19:34:24] nope. [19:34:32] I checked this earlier this week [19:35:09] I need to head off now-ish. Thanks so much for the feedback/comments/things to look for, valhallasw! [19:35:16] You're very welcome! [19:35:32] I will keep you all updated as I add more libraries to the evaluation page [19:35:52] \o [19:36:02] *waves* [20:03:59] can anyone tell me when AdminSettings.php became LocalSettings.php? (i'm doing some archaeology on a very old mediawiki installation) [20:05:05] Skud: Umm, 1.16 or 1.17 I think... Not sure [20:05:22] Skud: We still maintained backwards compatibility with it long after they were removed [20:05:28] yeah wow, idek what this is. it's from 2007. but i am having trouble getting it running at all, locally [20:05:43] [[Manual:AdminSettings.php]] [20:05:45] @link [20:06:00] Skud: I checked. Removed in 1.16 [20:07:05] i think i might even have to downgrade PHP or something. wow. [20:08:29] Skud: What version is it? [20:08:40] It should say in the RELEASE-NOTES file [20:08:50] or in includes/DefaultSettings.php [20:08:56] 1.9.3 [20:09:17] lol [20:09:33] !1.9 [20:09:43] yes, i am aware of this. 
meh, it doesn't say the release date [20:09:53] the problem is i'm trying to extract data from a long-deceased wiki [20:09:58] 2007 [20:10:05] feb 2007 to be exact [20:10:23] anyway, i want to get the wiki up and running locally, progressively upgrade it to something at least reasonably modern, then run an export/import [20:11:01] Skud: The data is in the database. That's fine. You only need to backup your database, install a recent version of MediaWiki, and do the upgrade [20:11:24] will it upgrade all the way from 1.9? [20:11:32] yes [20:11:39] someone told me it didn't when they tried, and i took them at their word, but if that is true then w00t! [20:13:43] Skud: Well it's supposed to anyway, upgrades from 1.9 probably don't get tested too often nowadays [20:13:54] but if it doesn't work, we could probably guide you through the issues [20:14:19] thanks, i'm gonna give that a shot [20:15:03] Vulpix: There's the big list of when things were released at https://www.mediawiki.org/wiki/Branch_points [20:15:23] Usually though people only run into big problems if trying to upgrade from before 1.5 [20:15:59] hi fhocutt [20:24:04] bawolff: ok i have 1.23 here and ready to go. i have the old database (from 1.9.3) loaded into mysql. i'm a bit confused as to how i should connect the two though. [20:24:42] bawolff: eg. should i go through the usual install process for 1.23 or do something else? [20:25:22] Skud: Just keep the LocalSettings.php from your old install (which should have the db credentials), and run update.php [20:25:57] (You may need to update extensions at the same time depending on which extensions you have installed) [20:27:23] ok, giving it a shot! [20:28:37] well it's running. it says "MediaWiki 1.23.0 Updater" but now it's just sitting there. [20:28:54] kind of expected it to tell me what it's doing/give me a progress report/something. 
Skud: It should be giving a progress report as it does things [20:29:47] each step may take a little while though if it's a big db [20:30:57] oh wait. database timed out. *fixes* [20:32:15] wait, what? DB connection error: No such file or directory (localhost) [20:32:37] csteipp: hi there! [20:32:55] Hey sumanah! Ready when you are :) [20:32:56] ha. it doesn't like localhost but does like 127.0.0.1 [20:32:56] oops! [20:33:22] csteipp: so, I figured we would go over the existing security guidelines examples and see what still needs adding, maybe draw a couple diagrams. [20:33:27] ok, it died here: [20:33:28] [0bb39a05] [no req] Exception from line 337 of /Users/skud/code/permaculture.info/mediawiki-1.23.0/includes/installer/MysqlUpdater.php: Missing rc_timestamp field of recentchanges table. Should not happen. [20:33:31] https://www.mediawiki.org/wiki/Security_for_developers/Architecture [20:33:35] bawolff ^^ [20:34:13] Skud: Make sure it's connecting to the right database [20:34:19] it is. [20:35:06] sumanah: sounds good. I was just about to write up "API Tokens".. should I do that so you can see what I'm thinking? Or should we identify first, and then fill in? [20:35:23] csteipp: sure, go ahead and paste your thoughts, here or in the wiki pg [20:35:29] bawolff: there's a table called "recent" but not "recentchanges" though [20:35:51] Skud: Umm, that's odd. Are you sure it's a mediawiki database? [20:35:59] recentchanges has been in use since the beginning [20:36:19] pretty sure. but then, who the hell knows. [20:36:33] err, by beginning I mean mediawiki 1.5 [20:36:34] Skud: probably a missing $wgDBprefix? 
https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Upgrading_from_1.16.4_to_1.19.2_-_rc_timestamp_error [20:36:40] recent has never been a table [20:36:42] i was given an SQL dump and a tarball of an ancient mediawiki install and asked if i could extract the pages from it and import them to a new mediawiki [20:37:02] There's a list of the history of our tables at https://www.mediawiki.org/wiki/Manual:Database_tables [20:37:29] Skud: What other tables do you have in there? Do you have a revision table, an image table, etc [20:37:34] text table [20:37:40] the tables i have include: images link member nonempty page pref recent session user [20:38:01] user_former_groups user_groups user_newtalk user_properties version [20:38:25] Umm, most of those are mediawiki tables [20:38:50] but version isn't, session isn't, member isn't [20:39:00] pref isn't [20:39:29] i have a feeling they may have used the database for multiple things without using a table prefix [20:39:41] csteipp-ish: hey there. IRC difficulties? :-) so, go ahead and tell me about API Tokens [20:39:44] Skud: All the page contents are stored in a table named text. If you don't have that table, you're missing all contents [20:39:55] we can also move into PM [20:40:06] yeah we're missing that [20:40:13] ok i'm going to tell them this database is unrecoverable. [20:40:47] their fallback is to have their community crowdsource copying stuff across by hand from archive.org [20:41:19] ouch [20:41:57] Skud: by hand? how many articles? [20:42:09] ha ha i just checked. thousands. whyyyyyy do they want to do this? [20:42:19] anyone here use the OAuth extension? [20:42:21] but if this is the only dump they have, and it doesn't include the text table, then there's no choice. [20:43:27] the other thing is, i think the pages they are going for are actually basically the same content that exists elsewhere in a semantic mediawiki [20:43:31] Skud: One other thing, do you have a cur or an old table? 
[20:51:21] bawolff: no [20:52:34] Just checking, as that's where we stored page contents prior to 1.5 [20:52:51] nope, i think this is completely screwed, and they should give up. [20:53:49] http://web.archive.org/web/20070308151934/http://www.permaculture.info/ is what they were trying to retrieve. i think it's pretty much the same as http://practicalplants.org/wiki/Practical_Plants [20:54:54] so i've asked them, "do you just want to make sure the information isn't lost?" because if so i think PP has it covered. [20:55:56] i doubt they desperately want to host 7300 pages about plants themselves. they probably just think it would be sad if it were lost entirely. [20:57:44] incidentally i've lost track of who's who, who's still working at WMF, who's in SF, etc, but if any WMF ppl that I know want to catch up, i'll probably be dropping by the office sometime in the next week or two. [21:00:07] Skud: Wait, so if the site is still up, can't they make a new dump? [21:00:50] Skud: Also, you may be interested in https://code.google.com/p/wikiteam/ [21:02:48] bye all [21:04:03] Skud: couldn't they ask archive.org for, you know, an archive? [21:05:08] of course then you'd still need some kind of screenscraping to convert the archived HTML to wikitext, I guess [21:24:59] good evening, i recently updated my mediawiki install to 1.23.0 and am now having difficulty with the Widgets extension even after updating it - each widget i try to use simply reports "Error in widget " [21:25:28] i did ensure i followed the installation instructions (download & initialize submodule) and ensured compiled_templates folder is writable by the web server [21:25:50] anything i've found related to this issue was smarty (submodule) not being downloaded and i've verified it is present [21:25:58] i also ran maintenance/update.php [22:18:15] wheeeeeere is vulpix when I need him [22:18:28] legoktm, georgebarnick, is this a valid Wikia person? 
https://translatewiki.net/wiki/Special:Contributions/Mira84 [22:18:50] Nemo_bis: never heard of them [22:19:13] http://community.wikia.com/wiki/User:Mira84 exists, but... [22:19:26] uh " i18n Project Manager at Wikia" [22:20:12] And doesn't know how to create a userpage? I guess that's possible. [22:26:28] lol [22:29:11] lol [22:29:14] x2 [22:32:47] Something is happening at Wikia. Today I even saw their CTO editing mediawiki.org pages about ULS [22:33:20] CTO & VP & demigod-whatever [22:34:21] huh [22:43:09] How do I get MediaWiki to reload LocalSettings.php ? [22:43:38] Moonlightning: You shouldn't need to? [22:45:03] hmm [22:45:34] Can $wgSitename contain apostrophes? [22:45:52] Nemo_bis: I know nothing about Wikia... [22:47:23] Moonlightning: should be able to [22:47:56] $wgSitename can be almost anything. Certain characters aren't allowed in $wgMetaNamespace though. However I think apostrophes are allowed in both [22:48:19] Oh, really? o.o [22:50:19] Looks like they are. [23:37:23] in reference to my prior issue with Widgets I had to run purgeList.php to purge all pages' cache [23:47:18] Towncitizen: In theory that's supposed to automatically happen whenever you edit LocalSettings.php [23:47:55] bawolff, would it be a queued job? i checked "showJobs.php" and it said 0 after editing LocalSettings [23:48:53] Towncitizen: no, not a queued job [23:49:21] don't know then, what could cause pages to not be purged? [23:49:34] There would be a slight difference in that it wouldn't purge squid/varnish, but I doubt you have that set up [23:50:21] If $wgInvalidateCacheOnLocalSettingsChange is set to false it wouldn't happen [23:51:56] i do not have squid or varnish set up, and i don't have that setting in my localsettings [23:52:32] * bawolff doesn't know why it wouldn't work for you [23:52:48] ah well, i probably won't run into this issue again and even if i do i'll know what to do [23:52:54] thanks for your help