[00:06:40] might be weird, but why does MediaWiki create accounts from the Wikipedia global account?
[00:07:38] it is already annoying that an account is created if I just visit a Wikipedia project page, already collected 54 or so of them
[00:15:02] !sul | tbo
[00:15:02] tbo: See , , and .
[00:17:24] bawolff: do those answer why I have an account on mediawiki.org now even though I just visited it?
[00:18:36] roughly. I was actually hoping !sul gave better links than that
[00:18:38] tbo: to do anything on a wiki you need a local account
[00:18:56] tbo: https://meta.wikimedia.org/wiki/Help:Unified_login is the link that's most informative
[00:19:30] when visiting a wiki logged in, it creates an account if you don't already have one. that way you don't need to create hundreds of accounts
[00:19:33] Betacommand: I think it is annoying, I just visit a page to read something and suddenly there is an account with my name and I can't delete it. :-(
[00:19:37] but yes, if you visit any Wikimedia site, you automatically get an account if you are logged into a different Wikimedia site (with the exception of some dev things)
[00:19:53] tbo: why is that a problem, your account is global
[00:19:54] tbo: Why do you find this annoying?
[00:21:26] bawolff: because I have 52 Wikipedia accounts I don't need or want, in languages I don't speak :-( maybe I'm just overreacting, but I think it is not a good idea, especially because it is impossible to delete an account even w/ no edits
[00:22:00] tbo: So if we didn't tell you that you had an account, you'd be fine with it?
[00:22:27] tbo: why is having a local account a problem, when you already have a global account?
[00:22:32] bawolff: how about only creating an account when one is needed?
[00:23:17] tbo: Well arguably you need one as soon as you visit any site
[00:23:23] tbo: there is only one account, it's just registering your global account on the local wiki
[00:23:46] otherwise things like global user CSS wouldn't work
[00:23:50] so you can have preferences, people can email you, and whatnot
[00:25:21] I think the system is weird, that is all, some settings are global, others are local, it is confusing
[00:26:44] tbo: we are working on combining the two, but it's a long, slow process
[00:26:57] meh, most of them are local. Which is kind of unfortunate
[00:28:18] <^demon|away> Making things unified would be so much easier with a time machine.
[00:28:36] <^demon|away> Go back and tell Jimmy and Brion that we'll end up having more than just enwiki and dewiki.
[00:28:41] <^demon|away> And unified accounts might be useful.
[00:28:46] <^demon|away> :)
[00:29:38] The fact we had dewiki should have been a giveaway
[00:29:48] I know those Germans like their independence, but... ;)
[00:30:14] bawolff: just ignore Germans, they don't know what they want
[00:30:32] I actually usually say that about the English
[00:30:42] I only care about the sister projects :P
[00:30:52] bawolff: I don't know many English
[00:31:03] <^demon|away> bawolff: I only care about mw.org mostly :)
[00:31:24] It really is the best wiki
[00:31:32] So little drama, comparatively speaking
[00:31:52] anyway, I even made the "wrong" wiki my home wiki, but oh well
[00:32:55] yeah, we should really just not tell people what their "home" wiki is
[00:33:03] Would reduce the number of angry people significantly
[00:34:10] does it have any relevance anyway?
[00:35:08] No. It's just used internally. If we didn't tell you what it was, there would be no way to figure out which wiki it is.
It does nothing.
[04:39:51] my DreamHost shared server has been killing processes recently, I'm trying to figure out if it's WordPress or MediaWiki that is taking all my resources. Is there a way to check how many resources MediaWiki is using?
[05:19:29] Nate_LapT: top ?
[05:40:36] Nate_LapT: I was getting that on DH. it was usually to do with ImageMagick resizing images for MediaWiki
[05:51:10] !wg PasswordSender
[05:51:10] https://www.mediawiki.org/wiki/Manual:%24wgPasswordSender
[05:52:16] that variable name is misleading
[05:52:17] "Also used as the sender address for other email notifications."
[06:33:24] samwilson: curious, why does http://wiki.dreamhost.com/MediaWiki#Not_enough_memory suggest 500 (!) MB then?
[06:34:24] It would be wonderful if someone could summarise the issues of DreamHost users of MediaWiki somewhere. They're probably actionable, but we have no contact between the two worlds.
[07:11:09] nemo_bis: i'd be very happy to help with something like that
[07:12:06] one thing i've been tinkering with is a system of generating thumbnails offline and plonking them in their correct locations. that'd reduce a pile of the out-of-memory things i see on DH sites.
[07:17:31] offline? maybe he means upon generation
[07:17:34] * upload
[07:19:10] meh, there are so many RfCs on thumbs that I'll make a category
[07:20:23] tgr: where is the bug about creating thumbnails on upload, or in some queue, rather than when first requested?
[07:22:59] Nemo_bis: not sure there exists one specifically
[07:23:09] a lot of the related issues are discussed in https://bugzilla.wikimedia.org/show_bug.cgi?id=65217
[07:27:25] tgr: ok, linked from https://www.mediawiki.org/wiki/Requests_for_comment/Simplify_thumbnail_cache#Proposed_solution which basically has the opposite proposal
[07:32:27] Everyone add to the new https://www.mediawiki.org/wiki/Category:Thumbnailing
[08:01:41] Hm. https://pypi.python.org/pypi/pywikibot
[08:36:11] Morning. I've placed some .png files into {mediawikidir}/images on my webserver, but I can't work out how to use these in articles. Can anyone advise?
[09:15:27] jem^: you need to enable the external images whitelist
[09:42:53] Is there a configuration setting to make the whole wiki unavailable while it is set?
[09:53:35] Zaister: there's $wgReadOnly, which makes it all unwritable
[10:08:43] samwilson: yeah I know, I was thinking more, just show a landing page that says something to the effect of "the wiki is currently offline"
[10:09:05] hm, nothing like that that i've ever seen
[10:09:11] ok
[10:09:14] still giving you admin access?
[10:09:29] not necessarily
[10:09:42] because you could always just comment out its DB connection... that'd give some message
[10:09:44] hehe
[10:09:50] :)
[11:28:20] I'm trying to find information on the hierarchical links bar that appears e.g. here: https://wiki.secondlife.com/wiki/Moving_start but part of the problem is that I don't know its name, so I don't know what to search for. I mean the part at the top of the page that reads: Second_Life_Wiki > LSL_Portal > Events > Moving_start
[11:28:43] Can anyone point me in the right direction on what to look for?
[11:55:40] I set $wgEmailConfirmToEdit = true; in my MediaWiki installation (1.19), but it doesn't send any confirmation email to the registered users.
[11:55:54] How can I solve this problem?
[11:56:58] They are members of the group "Autoconfirmed users"...
[12:29:49] my mediawiki is pretty old, and has been upgraded numerous times from, like, 1.1.4, i think.
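A minimal LocalSettings.php sketch for the read-only question asked at 09:42 above. $wgReadOnly and $wgReadOnlyFile are real MediaWiki settings, but the notice wording and the file path are only examples; note this blocks writes and shows the reason on edit attempts rather than replacing the whole site with an offline landing page (that would normally be done at the webserver level).

    # Refuse all writes and show this message instead (wording is just an example).
    $wgReadOnly = 'The wiki is currently offline for maintenance. Please try again later.';

    # Alternative: the wiki is read-only whenever this file exists,
    # and the file's contents are shown as the reason.
    # $wgReadOnlyFile = "$IP/READONLY";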
[12:30:10] is there a default LocalSettings.php I can look at to see what i'm missing for default configs, or perhaps performance tweaks?
[12:30:16] 1.23 "feels" slower.
[12:45:45] Morbus: http://www.mediawiki.org/wiki/Manual:Performance_tuning
[12:46:32] chabibi: sure, i'm aware of these, from a high level.
[12:46:56] just wondering if there was some new default that caused 1.23 to "feel" slower than 1.22, that i'm not getting due to an inherited LocalSettings.php and command-line upgrades.
[12:47:47] I see what you mean, you should perhaps diff your LocalSettings with the one provided with the distributed package
[12:49:20] where's the default file located?
[12:50:16] i see includes/DefaultSettings.php...
[12:51:15] yeah, sorry, it is not with the package.
[12:52:15] it is generated by the installer
[12:52:25] so, install a dummy site locally. works for me.
[12:52:25] thanks.
[12:54:44] I have installed a MediaWiki site with a PostgreSQL backend, but I want to install the site in another place using a MySQL backend
[12:55:13] could I export the data from the old site and import it in the new place?
[12:56:43] Nikerabbit: do you have suggestions? https://twitter.com/NeelieKroesEU/status/482077034974560256
[13:03:25] chabibi: hrm, http://techwelkin.com/slow-mediawiki-optimize-to-enhance-performance-part-1 too.
[13:03:41] though some of it duplicates the Manual:Performance tuning page.
[13:03:49] * Morbus thinks he'll try the file cache, at least.
[13:05:13] Morbus: make sure also that your webserver is not misconfigured
[13:05:40] Morbus: http://httpd.apache.org/docs/2.4/misc/perf-tuning.html
[14:26:12] is there any way to tweak the default galleries provided by Categories? http://www.disobey.com/wiki/Category:Posters_and_covers_for_1963
[14:26:19] i'd like to perhaps make the images bigger or remove the title, etc.
[16:59:25] @seen bawolff
[16:59:25] Lcawte: Last time I saw bawolff they were quitting the network with reason: Quit: ChatZilla 0.9.90.1 [Iceweasel 3.5.16/20121207230533] N/A at 6/26/2014 2:33:52 AM (14h25m33s ago)
[17:21:54] hrm...
[17:22:08] is there a way to disable the parser profiling data that appears on a Preview?
[17:43:54] Hey! This may be a tricky one - I'm using a subclassed makeImage (i.e. Parser::makeImage is called, then this function is applied to the output) for the purpose of outputting the support data for a lightbox image viewer. I'm having trouble with links inside captions. I've tried calling $wgParser->replaceLinkHoldersText() and this works for links in tags, but for links in thumbnail descriptions I get stuff like . I'm wondering how to fix that. :-)
[17:46:15] you're subclassing the parser!
[17:46:19] GreenReaper: That's scary
[17:46:44] is a link holder, which means the replace link holder method wasn't run on output
[17:47:08] Well, we haven't output yet.
[17:47:40] I'm wondering if the link holder doesn't contain all the links at that point. It's possible I'm just messing something up, too.
[17:48:20] This is a hacked version of Extension:PrettyPhoto which I found really didn't work that well. It works better now, but this is a sticking point, because we have many captions like: Here's [[GreenReaper]] with chocolate
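On the file cache Morbus mentions at 13:03 above: a minimal LocalSettings.php sketch. $wgMainCacheType, $wgCacheDirectory, $wgUseFileCache and $wgFileCacheDirectory are real settings; the directory paths and the choice of CACHE_ACCEL are assumptions to be adapted to the host.

    # Serve pre-rendered page HTML to anonymous visitors; often the cheapest
    # speed-up on shared hosting. The paths below are examples only.
    $wgMainCacheType      = CACHE_ACCEL;      # only if a PHP object cache (APC etc.) is available
    $wgCacheDirectory     = "$IP/cache";      # also used for the localisation cache
    $wgUseFileCache       = true;
    $wgFileCacheDirectory = "$IP/cache/html";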
[17:48:27] Well the link holder should have the link the moment the ""
[17:48:28] and it ends up as Here's with chocolate
[17:49:46] If you look at the page it's actually "Here's
[17:51:55] $thumbcaptions = $xpath->query( "//div[@class='thumbcaption']" ); - foreach ( $thumbcaptions as $thumbcaption ) { [extract node values, surround comment nodes with ] ] - pass through $wgParser->replaceLinkHoldersText()
[17:53:19] In case it isn't obvious, this is horribly hacky
[17:53:39] It seems odd that that would be the case, because it actually *works* in , but not for [[Image:something | description [[link]] ]] by itself
[17:53:41] Oh, yes.
[17:53:52] I was thinking "isn't there a better way?" all through this.
[17:54:07] What sort of data does your lightbox thingy need - image captions?
[17:54:10] But I don't know enough about the parser and hook system to recode it.
[17:54:11] Yes.
[17:54:42] I suppose https://www.mediawiki.org/wiki/Extension:MultimediaViewer does not meet your needs?
[17:55:30] And you want the various image captions to be in the "title" attribute of the image?
[17:55:39] Actually, I spent some time evaluating that this week. It would work if I got a 404 handler going (it doesn't request files correctly right now and it runs into caching issues). But overall it seems like it would have a worse reception than this.
[17:56:07] I think we may move to MultimediaViewer, but only after they work out their issues, and maybe after we apply image descriptions that will make it more useful.
[17:56:19] Here, let me show you what I'm working with as a test: http://de.wikifur.com/wiki/Benutzer:GreenReaper/Test
[17:56:47] Yeah, i think it assumes 404 handling for performance reasons, as people were complaining image loads were slow, so they cut out the request to the API that's a no-op on Wikimedia
[17:57:02] They have a fallback mode, but it doesn't work right.
[17:57:10] as I noted in the relevant bug.
[17:57:27] and they request the same image anyway, which will be cached as a 404
[17:57:36] So it does get generated, but too late.
[17:58:14] In the linked page, for the thumbnail at the right, you can see the title is actually "test Test3 an example URL"
[17:58:47] I'm wondering why that isn't being translated by replaceLinkHoldersText.
[17:59:19] Could you var_dump() the value you're feeding replaceLinkHolders right before you send it to replaceLinkHolders?
[18:01:05] I have it outputting to a log, it is: test Test3 an example URL
[18:02:36] Well, that at least means that it's getting what you expect
[18:06:06] I fear the code is irredeemably hacky. It gets screwed up by additional anchor tags within a as well, e.g. [http:// something] will mess it up because it looks for all //a[@class!='internal']. Maybe I can add class!='text' to that . . .
[18:08:56] Hmm, for the thumbnail case, some combination of the ParserMakeImageParams (for converting the caption to a title attribute) and ThumbnailBeforeProduceHTML (for the other various data- attributes) hooks would probably be the useful hooks
[19:05:13] hrms.
[19:05:21] what should i do if all my thumbnails are generating, except for one?
[19:05:41] i'm getting "Error creating thumbnail: Unable to save thumbnail to destination" on one of the images.
[19:06:04] in a previous version of MediaWiki, it was able to generate the thumbnail fine, but i've just changed the size of them in the gallery, and now it's all upset.
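For reference on the hook bawolff suggests at 18:08:56: a rough sketch of a ThumbnailBeforeProduceHTML handler, not the actual PrettyPhoto code. The hook and its signature are real; the 'lightbox-thumb' class, the 'data-lightbox-caption' attribute name, and reusing the alt text as the caption are illustrative assumptions.

    // In an extension setup file: decorate every rendered thumbnail with
    // attributes a client-side lightbox can pick up, instead of re-parsing the HTML.
    $wgHooks['ThumbnailBeforeProduceHTML'][] = function (
        ThumbnailImage $thumbnail, array &$attribs, array &$linkAttribs
    ) {
        $attribs['class'] = isset( $attribs['class'] )
            ? $attribs['class'] . ' lightbox-thumb'
            : 'lightbox-thumb';
        // The alt text is already link-free, so it is safe to reuse as a caption here.
        if ( isset( $attribs['alt'] ) && $attribs['alt'] !== '' ) {
            $attribs['data-lightbox-caption'] = $attribs['alt'];
        }
        return true; // continue with normal thumbnail HTML generation
    };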
[19:07:10] Morbus: have you upgraded recently?
[19:07:35] this site has been upgraded from, like, 1.14 on a regular basis, aye. and most recently, from 1.22 to 1.23
[19:09:08] I'm trying to understand if it's a particular problem or a general one, for example, if you upgraded 5 minutes ago and the only image that is being resized has problems, but you haven't tested others to see if the problem happens on all of them
[19:10:04] sure. i upgraded when 1.23 came out, i'm running a git checkout, and i only just changed the gallery widths today.
[19:10:27] the existing thumbnail, from the original gallery widths prior to the upgrade, continued to show, until just now, when i changed the gallery widths.
[19:10:32] It sounds like file permissions
[19:10:51] on what?
[19:10:56] Yeah, check where the thumb should go and see if your web server user has access to write to that directory.
[19:11:01] given that all the other thumbs regenerated properly?
[19:11:04] i.e. images/thumbs/a/ab/whatever
[19:11:10] you can see it under http://www.disobey.com/wiki/The_Dead_Hate_the_Living!
[19:11:16] Regenerating is different to resizing
[19:11:23] It's quite possible that you ended up with different file permissions for some reason.
[19:11:29] New size means a new file on disk
[19:12:22] all of my files/temp/ directories are :apache 775
[19:12:33] is there any way to tell which directory is attempting to be written to for this one file?
[19:12:48] images/thumbs, usually
[19:13:08] The file in question is The Dead Hate the Living!-2000-German-VCD-Ion-1.jpg and so you need to look in w/files/thumb/5/59/
[19:13:51] yes, interestingly, that dir is 755, not 775
[19:13:54] I would suggest figuring out the correct owner, group, and file permissions, and doing chown -R, chgrp -R and chmod -R on the images directory.
[19:14:06] Make very sure you have it right first.
[19:14:12] Maybe just on images/thumbs
[19:14:30] well, i would prefer everything to be morbus:apache, 775/664
[19:14:35] seems to be a general problem http://www.disobey.com/w/thumb.php?f=The_Dead_Hate_the_Living!-2000-US-DVD-FullMoon-1.jpg&w=190
[19:14:37] whether or not that's how the files get created... ;)
[19:14:40] It may be that, say, a maintenance script that you ran created that directory.
[19:14:52] Have a look at the owner.
[19:14:59] It may be your local account.
[19:15:11] yeah, all the files that i'm seeing in here are my local account.
[19:15:13] In any case, you can reset them for the tree with the above commands.
[19:15:15] and group'd to apache.
[19:15:31] Right, but the apache group won't be able to create files in that directory without write access for the group.
[19:15:33] So that's the problem.
[19:15:37] some files have the "right" (rw for group/apache) permissions, and some don't (r for group/apache)
[19:15:57] Often happens on import.
[19:16:07] could you tell me the directory for. ...
[19:16:09] * Morbus roots around.
[19:16:25] http://www.disobey.com/wiki/File:Blood_Widow-2014-Title.png
[19:16:35] this one was uploaded this morning. curious what it was created as.
[19:17:34] hrm...
[19:17:48] so that directory was created as apache:apache 755
[19:17:59] and the files as apache:apache 644
[19:18:03] If you view an original file or its preview, you get the directory
[19:18:26] so, seems like i should apache:apache the whole files/ dir.
[19:19:01] Possibly! Bear in mind that you may need to be a member of that group to run maint on it.
[19:19:15] yeah, it's a dedicated server and i'm root, blah blah.
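A preventive setting related to the permissions problem being debugged here: a LocalSettings.php sketch. $wgUploadDirectory and $wgDirectoryMode are real settings; the values shown are only one sensible choice for a setup where a shell user and the apache group both need write access, and this only affects directories MediaWiki itself creates — existing ownership still has to be fixed with chown/chmod as discussed above.

    # Point at the upload tree ("$IP/images" is the default; the wiki above uses "files/").
    $wgUploadDirectory = "$IP/images";
    # Mode used for directories MediaWiki creates (e.g. new thumb subdirectories):
    # 0775 keeps them group-writable so the web server group can keep adding thumbnails.
    $wgDirectoryMode = 0775;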
[19:19:18] Or the mode may need to be . . . woah.
[19:19:21] or, can sudo to root, etc.
[19:21:25] there we go.
[19:21:26] apache:apache'd the whole shebang, and everything is golden again.
[19:21:30] thanks for the help.
[19:21:41] (shebang being the files dir.)
[19:21:42] You're welcome.
[19:24:09] hello! bawolff Please help me out with making a new table in the MediaWiki database through the special page of my extension.
[19:25:05] I wish to create a table for a trait whenever any user clicks on "add new trait" on the form of the special page of my extension.
[19:27:14] Usually you wouldn't create a new table for user data, but new rows in an existing table (read up on database normal forms)
[19:29:48] albertcoder: there's some docs at https://www.mediawiki.org/wiki/Manual:Database_access and https://www.mediawiki.org/wiki/Manual:Hooks/LoadExtensionSchemaUpdates
[19:33:34] bawolff, I had already referred to these docs but still could not understand how to use a hook.
[19:35:39] The first link does not say anything about creating tables, and the second one is too brief to understand.
[19:36:36] bawolff, there is a specific requirement for creating a new table for adding user data.
[19:40:08] That's possible, but almost certainly a bad database design
[19:41:21] yeah, there has been a lot of debate on this with my mentor and some other people of the community, but this database design is a must for the project.
[19:42:07] albertcoder: "mentor" - are you doing a gsoc thing?
[19:42:19] yes bawolff
[19:42:33] cool. For which organization, if you don't mind me asking
[19:42:42] brlcad
[19:43:32] this is a web related project, so it mostly comes under mediawiki :) I cannot hold any discussion on the brlcad IRC.
[19:44:43] If you really must create tables from a web interface, you can usually do it via $dbw = wfGetDB( DB_MASTER ); $dbw->query( "CREATE TABLE foo..." );
[19:45:11] But seriously, there is probably a better database structure that doesn't involve creating tables
[19:45:50] Also, sometimes the SQL user MediaWiki runs as won't have sufficient rights to create tables (It's a common security precaution, since usually that's not needed)
[19:46:38] yeah, it has been discussed from the beginning that there shouldn't be any strict requirement to create a new table for each trait each time. But I had to adhere to the instructions.
[19:47:40] anyway bawolff, can all SQL queries be executed the way you just mentioned?
[19:47:42] bawolff: Saw your message in #Wikimania ... strange, I got a 3 day + opening + hackathon early bird ticket, I wonder why they're not available for regular offering. I'd suggest emailing ... whoever it is that's in charge of these things :|
[19:48:18] albertcoder: Also keep in mind there may be existing extensions for storing "structured data", such as SemanticMediaWiki, which may be useful for your purposes
[19:48:36] Lcawte: I did. They told me to check back in a couple of days
[19:49:11] absolutely bawolff, looking forward to porting my extension to SMW just after some consultation with my mentor.
[19:50:04] but meanwhile, please tell: can all the SQL queries be executed like this: $dbw = wfGetDB( DB_MASTER ); $dbw->query( "CREATE TABLE foo..." ); ?
[19:53:02] yep
[19:53:12] or bawolff, what is the specific use of hooks? Can't we just embed the SQL queries the way you just mentioned?
[19:54:09] Although we recommend using the other wrappers where available, to be DB agnostic / prevent SQL injection, everything can be done via ->query
[19:54:46] okay great! thanks bawolff
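A minimal sketch of the LoadExtensionSchemaUpdates approach the linked docs describe, in the MW 1.23-era registration style. The hook and DatabaseUpdater::addExtensionTable() are real; the class name, table name, and SQL file path are hypothetical. The idea is that the schema change runs once via update.php rather than issuing CREATE TABLE from a special page on every click; the special page then only inserts rows, via the usual wrappers.

    // In the extension's setup file:
    $wgHooks['LoadExtensionSchemaUpdates'][] = 'TraitExtensionHooks::onLoadExtensionSchemaUpdates';

    class TraitExtensionHooks {
        /**
         * Called by update.php; creates the table from an .sql file if it is missing.
         * 'trait_data' and the file path are made-up examples.
         */
        public static function onLoadExtensionSchemaUpdates( DatabaseUpdater $updater ) {
            $updater->addExtensionTable( 'trait_data', __DIR__ . '/sql/trait_data.sql' );
            return true;
        }
    }

Ad-hoc runtime writes would then go through wfGetDB( DB_MASTER ) as bawolff shows, but the wrapper methods ($dbw->insert(), $dbw->select(), ...) are preferable for the portability and SQL-injection reasons he mentions.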
[20:14:36] When I click on 'edit' on a specific page, the content doesn't come up
[20:14:49] it stays unresponsive
[20:16:26] any idea what it might be?
[21:06:22] hi
[21:07:28] I have a weird problem with the LDAP Authenticator for MediaWiki. LDAP users that were registered after a certain point in time can't log into the wiki. I'm not sure what changed, but all newly created users cannot log in, though the old users can. I see no difference in the LDAP entries
[21:08:44] here is an example log output for a successful login: https://dpaste.de/fM9n
[21:09:00] and here for an unsuccessful one (i.e. one with a new user): https://dpaste.de/EPpM
[21:09:58] generally the LDAP server works. I also have a phpBB forum with the same LDAP users being used, and the users can log in there