[00:02:19] *AaronSchulz is too busy atm [00:03:00] ok [00:38:12] 03(mod) Confirmation box - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16066 (10innocentkiller) [00:41:44] 03(mod) Confirmation box after saving an edit - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16066 summary (10public) [00:42:55] 14(INVALID) succession box templates adopted from wikipedia probably created incorrectly for BoxRec wiki - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16061 +comment (10innocentkiller) [00:43:30] ^demon: Do you have SVN open? [00:48:26] <^demon> MZMcBride: What's up? [00:48:41] http://www.mediawiki.org/wiki/Extension:CentralNotice <-- Somebody got cute and used a .local address in the source code. [00:48:54] Which has lead to confusion. Could it be changed to the standard example.com ? :-) [00:48:59] It should be a one-line fix. [00:53:21] <^demon> Done :) [00:53:22] <^demon> 2 lines [00:53:23] Led. * [00:53:36] 03demon * r42320 10/trunk/extensions/CentralNotice/CentralNotice.php: Example.com makes a much nicer dummy URL than Smorgasbord.local, and is less likely to confuse people :) [00:53:46] Thanks. :-) [00:54:10] <^demon> np [01:00:37] 03demon * r42321 10/trunk/phase3/includes/FakeTitle.php: Fix some strict standards on function declaration. Others might need this too, but these methods were mentioned in bug 16060. [01:00:52] 03(FIXED) Job queue spews PHP Strict notices - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16060 +comment (10innocentkiller) [01:11:28] 03(mod) Confirmation box after saving an edit - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16066 (10brett.jr.alton) [01:25:11] 03dale * r42322 10/trunk/extensions/MetavidWiki/ (13 files in 7 dirs): lots of sequence support updates. [01:26:10] 03(mod) Confirmation box after saving an edit - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16066 (10innocentkiller) [02:03:07] 03(mod) Wikimedia should become an OpenID provider - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=13631 (10mikelifeguard) [02:03:19] 03(mod) Support OpenID extension on all wikimedia projects - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=9604 +comment (10mikelifeguard) [02:15:40] 03(mod) Confirmation box after saving an edit - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16066 (10mikelifeguard) [02:41:19] Does anyone here know of any AjaxLogin extensions available for download? [02:43:07] https://svn.wikia-code.com/wikia/trunk/extensions/wikia/AjaxLogin/ [02:43:37] though it seems to use yahoo YUI [02:44:50] there's practically no wikia extensions that will actually work unhacked ;) [03:12:55] Splarka: Looks like I'll need a lot of hacking to make that work, thanks for the link to get me started though :-) [03:23:08] rar [03:29:46] Is there a way to exclude redirect pages from search suggest? [03:54:16] 03river * r42323 10/trunk/tools/ts-specs/ (TSlighttpd.spec TSsubversion.spec): add missing dependencies [04:09:46] i have a policy question related to live mirrors, but not exactly about live mirrors [04:10:29] is it ok to use live data from wp in a desktop client application (from where it'll never ever be crawled by a search engine) [04:11:17] this would be done using the Special:Export interface [04:11:40] it'd be better if you could just get the article HTML using the normal URL [04:11:50] much less expensive for the servers [04:11:53] You could use the API, no? [04:12:14] no [04:12:23] /wiki/Article_name [04:12:50] The API offers the page content. Just depends what you want the desktop client application to do, I suppose. 
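For readers following along, a minimal sketch of what "the API offers the page content" means in practice: an action=query request with prop=revisions and rvprop=content returns the current wikitext of a page. This is only an illustration against the standard api.php entry point -- the title, client name and contact address are made up -- and it sets a User-Agent from the start, something the channel asks for a little further on.

    <?php
    // Sketch: fetch the current wikitext of one page via the MediaWiki API.
    // Assumes plain PHP with allow_url_fopen; title and client name are examples.
    $title = 'David_Bowie';
    $url = 'http://en.wikipedia.org/w/api.php?action=query&prop=revisions'
         . '&rvprop=content&format=xml&titles=' . urlencode( $title );
    $context = stream_context_create( array( 'http' => array(
        'user_agent' => 'ExampleDesktopClient/0.1 (operator@example.com)',
    ) ) );
    echo file_get_contents( $url, false, $context );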
[04:13:13] it's as bad as Special:Export [04:13:26] i'm confused. is the API different than Special:Export? [04:13:29] no [04:13:35] both hit the backends instead of the cache [04:13:37] it's the same [04:14:03] same data, same performance characteristics [04:14:04] You'd also hit the backend if you're logged in, though.. [04:14:04] so is it ok to use the API - i would probably want to use the API instead of scraping the HTML, because i'd want to generate simpler formatting [04:14:29] i mean live use the API [04:14:33] in a desktop client app [04:15:32] you're going to write your own wikitext parser? [04:15:43] he probably means action=parse [04:15:45] i guess i would have no choice but to do that [04:16:01] since html scraping is not allowed anyhow [04:16:10] it sure as hell is allowed [04:16:25] on behalf of the wikimedia foundation [04:16:32] I hereby declare that scraping is allowed [04:16:38] wee [04:16:41] ok thakns [04:16:45] *thanks [04:16:53] it's good to have options [04:17:06] and the load per client would be light [04:17:10] mm, Tim, any reason action=render is a MISS? -> http://en.wikipedia.org/wiki/David_Bowie?action=render [04:17:17] private, s-maxage=0, max-age=0, must-revalidate [04:17:25] well let me go ahead and tell you my idea. do any of you use Pidgin? [04:17:31] would be handy for html scraping without UI clutter [04:17:42] if you really want to have the wikitext, I would say use Special:Export [04:17:51] with a two-part URL [04:18:01] Splarka: thx for that pattern [04:18:16] http://en.wikipedia.org/wiki/Special:Export/Article_name [04:18:18] Hungryman: it isn't any better than export/api, no cache [04:18:31] because at least it's cacheable in principle [04:18:37] and make sure you set the User-Agent [04:18:41] i am not a 100% sure yet which of the two options would be best. i will have to see [04:18:56] ok, User-Agent [04:19:03] So one should use Special:Export instead of the API? That seems backward. [04:19:31] TimStarling said the API is Special:Export [04:19:44] MZMcBride: don't you have a job of your own to do? [04:20:02] Heh. [04:20:08] so can i tell you my idea and get a comment or two? [04:20:19] Not everyone lives in your timezone. [04:20:19] I reckon I can handle mine just fine by myself [04:20:31] Well, there's no need to get snippy. [04:21:00] do you use Pidgin, so i can explain further what i have in mind [04:21:12] just checking [04:21:18] But the API was built and has the rvprop=content option for a reason. [04:21:36] what does rvprop=content signify? [04:21:46] when hints fail, I have to be direct [04:22:06] A mind is like a parachute, it works best when open. [04:22:42] we can only purge a finite number of URLs from the squid on each edit [04:22:55] *werdnus waves. [04:23:02] because squid stores its URLs in a hashtable, there's no way to purge a prefix or a regex [04:23:07] you need to know the precise list [04:23:31] all URLs that aren't on that list should, in principle, be sent with cache-suppressing headers [04:23:45] the API conforms to this principle, as does Special:Export [04:24:22] now, if server load from current-version Special:Export requests becomes a problem, we could set cache headers and add the primary Special:Export URL to the purge list [04:24:44] but it's rather more difficult to do that for the API, because of the large number of ways a given query could be formed [04:25:27] what is the best API way to get a list of search results for a given search term? 
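Tim's recommendation above boils down to hitting the two-part Special:Export URL instead of api.php: same data, but a single canonical URL that is at least cacheable in principle. A sketch of the only change relative to the earlier API example (the title is again just an example, and the same stream context / User-Agent applies):

    <?php
    // Sketch: current revision of a page as export XML, via Special:Export.
    $title = 'David_Bowie';
    $url = 'http://en.wikipedia.org/wiki/Special:Export/' . rawurlencode( $title );
    // ...fetch with the same User-Agent context as in the earlier API sketch.

As explained above, if current-version export traffic ever became a problem, that one URL form could be given cache headers and added to the purge list, which is much harder to do for the many equivalent api.php query strings.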
[04:25:52] I think there's only one of those [04:26:17] That makes far more sense. Thanks for the explanation. :-) [04:26:42] TimStarling how can that be done? [04:27:17] http://en.wikipedia.org/w/api.php [04:27:27] look for list=search [04:27:33] ok thanks [04:29:00] ahh, *reads back*, too many ways to action=render, to be cacheable + purgable-on-edit. /wiki/Foo?action=render /w/index.php?title=Foo&action=render /w/index.php?action=render&title=Foo [04:29:52] yes [04:30:02] we could redirect [04:30:06] if it became a problem [04:30:12] I tried to work with roan into making the API partially cachable, but the WMF servers override the api cache headers, so not much use [04:31:36] 03ipye * r42324 10/USERINFO/ipye: info page for ipye [04:31:54] that is, totally voluntarily via &maxage &smaxage, so if you have info you only care about being ~24h accurate, you set them to 86400 [04:32:21] So no way to exclude redirects from search suggestions? [04:33:47] well, some stuff could be cached in-process. [04:33:50] it's lame how several google search results for wikipedia articles point to redirects [04:33:56] Resplendent: search DefaultSettings.php [04:34:12] if it's not there, you would have to patch the source code [04:34:13] I mean, our article hits are X% of all hits in total, where X is some large rough number that Tim probably knows. [04:35:23] 15:33 < werdnus> well, some stuff could be cached in-process. [04:35:29] um, not in-process. [04:35:43] by the software itself, instead of by squid. [04:35:56] Yeah checked there already, guess it's not doable by me [04:36:15] Hungryman: some people suggested __NOINDEX__ on all redirects, but IIRC there are good reasons not to [04:36:38] we could use a 302... [04:36:41] in fact, why don't we? [04:37:14] redirects are good for SEO [04:37:17] they boost relevance [04:37:37] as in, doing them our way instead of the 302 way is good for SEO? [04:37:44] mm, wouldn't 307 be more appropriate (technically, probably not as widely accurately supported as 302) [04:37:48] yes [04:37:59] that's my theory anyway and I'm sticking to it [04:38:00] since all #REDIRECT are potentially temporary [04:38:03] hmmm... we care about SEO? [04:38:36] well, usually people want 301/302 redirects because they have some theory that it helps SEO [04:38:37] *Splarka looks for a recent thread about this on VPT [04:38:48] but I don't think it dose [04:39:11] there's a rather more obvious reason we don't do it that way [04:39:22] there'll be a three-digit bug number which tells you the whole story [04:40:02] some day, we'll be talking about four-digit bug numbers, and them being very old issues. [04:40:12] http://en.wikipedia.org/w/index.php?title=Wikipedia:Village_pump_(technical)&oldid=241287103#How_Google_handles_page_moves_on_Wikipedia [04:42:33] https://bugzilla.wikimedia.org/show_bug.cgi?id=15560 [04:42:35] here's one. [04:42:40] similar [04:42:41] but not the same [04:43:21] I can't find the one with 30x stuff, maybe it's marked WONTFIX [04:44:27] can't find it RESOLVED, either. [04:44:32] mm, isn't there a perennial request to serve 404 for red links too? and then people start going down the list of status codes.. "ooh, payment required, that could replace squid errors that say 'please donate'" [04:44:43] heh [04:45:13] https://bugzilla.wikimedia.org/show_bug.cgi?id=2585 [04:45:20] HTTP is oooold [04:45:21] Hi guys, I have a question [04:45:21] That's the one about 404. 
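To make the list=search answer concrete, a sketch of the request (parameter names per the api.php self-documentation; the query text is an example). The &maxage/&smaxage parameters Splarka mentions are the voluntary knobs for marking a response as cacheable for, say, a day:

    <?php
    // Sketch: full-text search results via the API.
    $url = 'http://en.wikipedia.org/w/api.php?action=query&list=search'
         . '&srsearch=' . urlencode( 'David Bowie' )
         . '&srlimit=10&format=xml'
         . '&maxage=86400&smaxage=86400'; // fine if ~24h-old results are acceptable
    // Fetch as in the earlier sketches and read the <search> result elements.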
[04:45:49] !ask | SquishyVic [04:45:49] --mwbot-- SquishyVic: Don't say "I have a question", or ask "Is anyone around?" or "Can anyone help?". Just ask the question, and someone will help you if they can. Also, please read < http://workaround.org/moin/GettingHelpOnIrc > for a good explanation of getting help on IRC. [04:46:04] on http://swfanon.wikia.com/ the magic word __HIDDENCAT__ isn't working [04:46:09] mwbot: sorry :P [04:47:41] oh god is swfanon still around? [04:47:44] on http://swfanon.wikia.com/ the magic word __HIDDENCAT__ isn't working but on http://starwars.wikia.com it is :\ [04:47:50] Splarka: yes :P [04:48:27] strange [04:48:35] maybe it's only been discussed on the lists [04:48:40] SquishyVic: do you have hidden cats showing in your prefs? [04:48:53] or in site-css or personal css? [04:48:54] *Krimpet hopes that stands for "anonymous Flash animations." o_O [04:48:58] Nope, and I mean it just appears as raw text i guess you'd call it on the category. doens't dsiappear [04:49:06] I dont think so [04:49:12] Krimpet: AFA? [04:49:17] Alternative Family Arrangements, to me. [04:49:19] *werdnus has been studying. [04:49:24] werdnus, "SWF Anon" :p [04:49:46] mmm, damn, Wikia uses internal revision numbering still [04:50:36] What is that? Well either wya, it worked on Wookieepedia, not Star wars Fanon [04:50:53] __HIDDENCAT__ was introduced in 1.13, so you might have a different version of 1.13 (although both you and starwars have 1.13.2 listed) [04:50:59] Yeah [04:51:03] that's the werid thing [04:51:06] (I checked that) [04:51:38] anyway, the reason we don't do #redirect using HTTP redirects is because then you wouldn't get the "redirected from" message [04:51:41] you'll probably have to ask the techs who never visit #wikia [04:51:50] lol... yeah [04:51:54] which would make it hard to edit the redirect itself [04:51:57] or ever respond o_O [04:52:32] Well should I just ask a Wikia Staff member on their talk page? [04:53:00] Like Sannse, the only one who ever responds ;\ [04:53:23] probably. Are you absolutely sure it isn't a typo, and you're putting it on a page in the Category: namespace? [04:53:38] call it a quirk of the wiki display medium [04:53:53] (because even with prefs, the magic word should parse and not show raw, if it exists, unless it has been mistranslated, but swfanon should be english content) [04:54:18] bbl [04:54:21] TimStarling: we should set cookies to send that info between requests :) [04:54:22] *werdnus hides. [04:54:42] Werd: could be done with javascript and not break cache, but ew [04:54:52] there's a few ways to do it [04:55:01] I was joking :) [04:55:02] but they all make the stomach of an old-school programmer churn [04:55:12] well [04:55:22] best thing to do, make a list of index bots, and serve them 302/307 [04:55:35] Hm, ok thanks [04:55:38] well, we could do that, if it helped SEO [04:55:41] which I don't think it does [04:55:47] Couldn't we just check the referrer? [04:55:51] google places a high weight on the URL for relevance [04:55:53] sorry, best thing to do _if you wanted to do this_ [04:56:04] and the "redirected from" message is also prominent [04:56:43] bbl [04:57:21] We could use a 302 redirect, then check the referrer, see if it's a redirect to the page, and display "redirected from" appropriately [05:12:06] is there some kind of replacement for the built-in MediaWiki editor that provides syntax highlighting for templates etc.? [05:12:36] I have no idea what you're talking about. Have an example link? 
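On the __HIDDENCAT__ problem: the magic word itself is wikitext, placed on the category's own description page (the page in the Category: namespace), and it only exists from MediaWiki 1.13 onwards -- if it comes out as raw text, the parser simply does not recognise it on that wiki. The only PHP-side knob is the user preference, roughly as below; treat the 'showhiddencats' option name as an assumption to check against DefaultSettings.php for the version in use.

    <?php
    // LocalSettings.php sketch: show hidden categories to everyone by default,
    // which makes a __HIDDENCAT__'d category visible again while debugging.
    $wgDefaultUserOptions['showhiddencats'] = 1;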
[05:12:44] sure there is [05:12:46] wikEd. [05:13:23] wikEd always sounded like an ailment to me. [05:13:48] heh [05:13:59] what kind of pager am I supposed to use for listing abuse filters? [05:14:01] hey tomaszf [05:14:18] hey there [05:14:28] werdnus? heh [05:14:37] AaronSchulz: my real server's down. [05:14:57] A former employee from my host decided to dev if=/dev/zero of=/dev/sda [05:15:01] erm, dd* [05:15:26] http://oreilly.com/catalog/9780596519797/#top :) [05:15:42] o reilly? [05:15:46] former employees with root rights. nice :) [05:16:34] *werdnus shrugs. [05:16:45] my dbs and stuff were replicated [05:16:54] but my configuration wasn't./ [05:17:06] no backups. even more nice ;) [05:18:40] it was my own VPS... [05:19:55] well, as if hardware could never fail. or ex-admins with root rights could never delete them ;) ok, I'll stop now, I guess you have enough trouble already without me putting salt into the wound :P [05:20:10] I didn't have anything important there. [05:20:15] just a bunch of test wikis, some IRC logs. [05:20:25] anything important was in my database, which is replicated. [05:21:07] man, TablePager confuses the hell out of me. [05:23:16] Best way to avoid that is to store all your stuff on /dev/sdb instead :D [05:25:53] <_mary_kate_> werdnus: so now you're looking for a new hosting provider? [05:38:48] _mary_kate_: not now, at any rate. [05:39:06] _mary_kate_: I'm too busy even to reconfigure it as it was, let alone find someone else. [06:17:46] Hello Everybody [06:18:11] Can anybody show some examples for this variable $wgScriptPath [06:18:19] 03werdna * r42325 10/trunk/extensions/AbuseFilter/ (AbuseFilter.i18n.php SpecialAbuseFilter.php): [06:18:19] Abuse Filter: [06:18:19] * Display list using a sexy TablePager. [06:18:19] * Allow sorting by various things, like hit count, id, consequences, last modified, etc. [06:18:36] I have not installed wiki in a sub directory [06:18:49] that is the problem [06:19:33] Splarka : Hi [06:19:49] *Splarka rars? [06:20:02] Splarka : i have not installed mediawiki in subdirectory [06:20:18] Splarka : what should be the $wgScriptPath [06:20:41] am I suddenly the expert because I joined first after your question? ^_^ [06:20:45] http://www.mediawiki.org/wiki/Manual:$wgScriptPath [06:20:51] ^^; [06:21:43] it is the location of your index.php, api.php, and things like thumb.php, image auth, and skins/js/css/styles if so configured [06:22:31] Splarka : i know what it is, but what should be the value if mediawiki not installed in a subdirectory? [06:23:11] depends on your situation *shrug*, dunno [06:23:37] best not to ask questions here to specific people who look active, just ask them in general [06:24:55] what should be the value of $wgScriptPath if mediawiki is not installed in a subdirectory? [06:25:03] if you want examples of it.. there are defaults in DefaultSettings.php, on wikipedia/wikimedia they use: "$wgScriptPath = '/w';" (see http://noc.wikimedia.org/conf/highlight.php?file=CommonSettings.php) but they have a bit of an odd setup with stylepath not a descendant of scriptpath [06:26:14] Splarka : got it, it should be blanked [06:27:45] well, having a null script path I've heard can be sometimes annoying. but Wikia does that. 
Just be sure not to have a null article path too or you'll have headaches [07:06:02] *werdnus kicks CIA-58 [07:06:03] ow [07:30:28] hello [07:30:52] i succefully installed mediawiki, but still have a small trouble with config the short url [07:31:59] i have an apache2 vhost /home/user/public/w/files_media_wiki.all_files_and_directory where i've put directly the media wiki files [07:32:21] i did what the page about the short url told me [07:33:13] so i can access my mediawiki throught http://subdomain.mydomain.tld/w/ [07:33:40] and the url is rewrited like : http://subdomain.mydomain.tld/wiki/page_blah [07:33:43] perfect [07:34:27] but... http://subdomain.mydomain.tld dont work , only show me the /w/ directory apache's list style [07:36:06] <_wooz> lo [07:36:21] so what i did wrong ? if i change the apache2 DocumentRoot to /home/user/public/w/ it doesnt work anymore [07:36:25] hello _wooz [07:37:11] if someone have an idea to lead me on the right path, i would appreciate [07:37:21] and sorry for my awfull english [07:42:44] hehe [07:42:47] GZ WITH A BOOK! [07:48:13] pingouin: add an index.php to the root directory that contains good morning :) [08:03:50] werdnus: ok thanks for the clue [08:04:07] werdnus: did i miss it on the 'install/configure/short url' page ? [08:04:50] no [08:08:23] ok [08:09:19] so if i only follow the 'install/configure/short url' page i have to give to people this url to access my wiki : http://mysubdomain.domain.tld/w/ [08:09:32] which is a little bit ugly [08:09:41] many thansk for the trick ;) [08:09:44] *werdnus shrugs. [08:09:47] it MIGHT be on that page. [08:10:06] hi, I'm fetching day by day a "picture of the day" and I want always to have the actual picture in the wiki. Is there a way to overwrite pictures behind wiki's eyes? [08:11:23] richardsfsdf, use $wgAllowExternalImagesFrom and some image rotation [08:12:01] richardsfsdf: or you can use a bot :) [08:12:27] rofl :) [08:13:25] thanks wgAllowExternalImagesFrom would be the best solution :) [08:14:20] you could also use CSS and make it a background image (dirty) [08:14:37] zomg^^ [08:14:51] would have to be site-wide css, inline doesn't allow background images [08:18:16] Someone should really file a bug for that... [08:19:13] Or, alternatively, you could use templates and stuff [08:19:40] MZM: for what? intentional [08:19:47] the externalimage will work :) thanks so much [08:20:01] Splarka: Yah, I was being facetious. [08:20:15] well don't be, it tickles [08:27:32] hi domas [08:29:06] Facetiously has all six vowels in alphabetical order. [08:31:02] words that have them all reversely: UNNOTICEABLE SUBCONTINENTAL UNCOMPLIMENTARY [08:31:46] and always fun for i: indivisibilities [08:31:53] roflmao^^ damnit [08:32:26] who crashed my browser? [08:32:27] Stewardesses is the longest word that can be typed using only your left hand. [08:32:37] Carebearstare does not count. [08:32:50] left [08:32:55] Esp. due to the questionable B. [08:33:02] boo, that's assuming qwerty [08:33:17] Indeed. [08:33:26] we zebra sex affected stewardesses are vexed at abstract tesseract art ~ average wetter bastard breeders crafted free access wax text [08:33:28] Typewriter is the longest word using the top row. [08:34:12] What about tewropyuwerqueroptuyo? [08:34:24] Not a real word, unfortunately. [08:34:30] it is now [08:34:37] *werdnus adds to Wiktionary. [08:34:40] And Splarka is cheating. I see those Bs young man! 
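Pulling the $wgScriptPath and short-URL questions above together into one sketch: the MediaWiki files live under /w/, readers see /wiki/Page_name, and the web server rewrites one onto the other. Paths and the rewrite rule are assumptions to adapt to the actual vhost; the closing comment restates the warning already given above.

    <?php
    // LocalSettings.php sketch for the usual Wikimedia-style layout.
    $wgScriptPath  = "/w";         // where index.php, api.php, skins/ really live
                                   // (use "" if MediaWiki sits in the web root)
    $wgArticlePath = "/wiki/$1";   // the pretty URL shown to readers
    // The web server still has to map /wiki/ onto the script, e.g. Apache:
    //   RewriteEngine On
    //   RewriteRule ^/?wiki/(.*)$ /w/index.php?title=$1 [L,QSA]
    // And, as noted above, avoid making *both* paths empty, or article URLs
    // and script/file URLs start colliding.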
[08:34:54] kinkeh [08:35:04] bite [08:35:05] ten points if you can translate that new word into IPA [08:35:45] why is b cheating? [08:35:55] B is a right-hand key. [08:36:00] http://jeff560.tripod.com/words14.html [08:36:08] it isn't [08:37:12] werdnus: http://192.20.225.55/tts/speech/37c954fb8e9095e27c1463518fa93ba3.wav [08:37:20] http://upload.wikimedia.org/wikipedia/commons/thumb/4/40/Touch_typing.png/400px-Touch_typing.png [08:37:21] if the link doesn't work paste it into http://www.research.att.com/~ttsweb/tts/demo.php#top [08:39:04] Splarka: that's funny, but they're missing the final 't' [08:39:45] "CORPS has the plural spelled the same way as the singular but pronounced differently." [08:39:55] yeah, don't give the pronunciation then! [08:40:27] :O [08:40:28] werdnus: it is silent [08:40:34] I've always typed B with my right hand. [08:40:35] like "colbert report" [08:43:55] and here is how to get 1778 points in one turn in scrabble: http://p.defau.lt/?DsBjObJVHpCKcQzK7Jb8eg [08:45:06] Jaculating is a word? [08:45:14] e [08:45:25] what's preffernuss? [08:47:03] MZM: http://thepixiepit.co.uk/cgi-bin/scrab/scrabLookup.pl?search=jaculating [08:47:36] "OSPD*" are the home versions, safe for school, etc [08:48:09] Hmm. [08:48:10] so bad words are left out, like http://thepixiepit.co.uk/cgi-bin/scrab/scrabLookup.pl?search=fuck and http://thepixiepit.co.uk/cgi-bin/scrab/scrabLookup.pl?search=jew [08:48:23] *Splarka waits for "Jew is a swear word?" [08:48:38] I believe that tool is made of witchcraft and lies. [08:49:03] http://thepixiepit.co.uk/cgi-bin/scrab/scrabLookup.pl?search=witchcraft http://thepixiepit.co.uk/cgi-bin/scrab/scrabLookup.pl?search=lies [08:50:40] Here we go. [08:50:45] Required the full version of Oxford. [08:50:53] a. trans. To dart, hurl. b. intr. (for refl.) To dart forward. [08:50:53] 1623 COCKERAM, Iaculate, to dart. 1634 SIR T. HERBERT Trav. 20 They know accurately how to jaculate their Darts of blacke Ebony. 1860 EMERSON Cond. Life i. (1861) 27 Do you suppose, he can be estimated by his weight in pounds,..this reaching, radiating, jaculating fellow? [08:52:12] http://en.wiktionary.org/wiki/Special:WhatLinksHere/jaculate [08:58:26] Hi, I would like to activate a feature in mediawiki. On the page : http://www.mediawiki.org/wiki/User_hub/fr, there is a link under "User hub/fr" displayed " !subpages | janolap1 [08:58:58] --mwbot-- janolap1: By default some namespaces have subpages enabled and others don't. You can enable or disable them using $wgNamespacesWithSubpages in LocalSettings.php. [08:59:17] That little link is called a breadcrumb. [09:00:23] MZMCBride : Thank you ! I don't know that word (as you can see I'm french). [09:00:52] HAPPY CAPS LOCK DAY EVERYONE [09:04:36] scap lock day? [09:04:48] hi, is there a quickstart guide to making a private wiki? ie, nothing visible to anonymous users, new signups subject to admin approval and/or manual adding only? [09:04:51] scap lock week... [09:04:54] *werdnus locks the scap keys away [09:04:59] scap lock month [09:05:05] been ~3 weeks now eh? [09:05:12] *werdnus shrugs. [09:05:12] !access | Xavvy [09:05:12] --mwbot-- Xavvy: For information on customizing user access, see . For common examples of restricting access using both rights and extensions, see . [09:05:27] appreciate it :) [09:06:24] "Note Note: You can use the ConfirmAccount extension if you want to set up an account confirmation queue. 
(If not you may still proceed as follows.)" ;-) [09:06:34] *Splarka wonders if amidanny will ever write fuzzy conversation parsing to make mwbot self answer some questions, if the user is a noob and asking an obvious question [09:06:59] I'd much rather have searchable logs. [09:11:05] i have that half done, but i've been a lil busy [10:25:57] if anyone cares, USERINFO/ipye (r42324) is missing eol style native. [10:44:40] hi everybody [10:45:33] Can MediaWiki be installed 100% command-line, without using a web browser? [10:54:12] imt06: not in a particularly convenient way [10:54:31] there's a wikimedia-specific script to do it, which you may be able to adapt for your situation [10:54:46] TimStarling: maybe using "curl" or "wget" against /config/index.php? [10:55:10] maybe [10:55:21] are you packaging? [10:55:33] or running a wiki farm? [11:02:11] too late, going for dinner [11:03:23] pseudo-packing; I rather write shell scripts than documentation ;) [11:03:52] (I want to be able to drop/reinstall mediawiki several times and be able to reproduce it in the testing/production environments) [11:03:57] l8r [11:03:59] wow, dinner at 10:03pm [11:23:35] Hello. I've a simple problem. How can I make the following assignment work [11:24:07] $phrase = $this->html('subtitle'); [11:24:40] aja2: how does it "not work"? [11:24:47] also why does html('subtitle') ?> work [11:24:48] it's an assignment. it wil lassign something :) [11:24:51] 03(mod) Install more fonts (especially for Unicode) - 10http://bugzilla.wikimedia.org/show_bug.cgi?id=8898 (10N/A) [11:25:04] should there not be an echo command? [11:25:16] define "work" [11:25:19] I put $phrase = $this->html('subtitle'); [11:25:19] echo $phrase; [11:25:19] echo $phrase; [11:25:19] echo $phrase; [11:25:20] ?> [11:25:23] into my skin [11:25:32] but it only printed the subtitle once [11:25:39] it's like it's not being assigned [11:25:44] function html( $str ) { [11:25:45] echo $this->data[$str]; [11:25:45] } [11:25:51] does that answer your question? [11:26:02] generally, when you wonder what a function does, it helps to look at the source code :) [11:26:45] phrase = $this->data['subtitle']; is probably what you want [11:27:23] wow [11:27:27] okay that's great [11:27:35] thanks very very much indeed [11:27:53] have fun [11:27:55] just spent the last hour trying to work it out and now I have the solution [11:28:06] back to my skin. Thanks a lot for the quick answer [11:58:59] 03(NEW) Some #ask queries don't give correct results - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16067 normal; normal; MediaWiki extensions: Semantic MediaWiki; (dongiulio) [12:04:38] how do i create a sysop ? - i did not remember created one at installation script [12:05:37] i want to disallow anonymous user to create/modifying pages, and registration can only be done by a sysops - as i see on the help page/faq [12:06:19] i found how to disalow anonymous user, registration forbiden, but...not how i create a sysop (yet) [12:06:28] the first user is a sysop [12:06:44] oh ? i thought i was only admin [12:06:45] ok [12:06:52] so no problem [12:06:55] admin is sysop [12:07:05] old nomenclature. [12:07:19] ok, i thought it was two separate things [12:07:28] admin AND sysop [12:07:34] ok thanks once again werdnus ;) [12:08:45] :) [12:08:46] the first user is also a beureaucrat, btw [12:13:49] Bureaucrat. [12:14:18] 'crat [12:14:21] period. [12:14:32] werdnus? nerdna [12:15:33] Does anyone have a copy of Misza's AjaxLogin? 
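For the private-wiki question a little earlier (nothing visible to anonymous users, no open signups), the usual starting point is a few $wgGroupPermissions lines in LocalSettings.php, with the ConfirmAccount extension layered on top if an approval queue is wanted. A sketch only; the whitelisted special-page names vary slightly between versions.

    <?php
    // LocalSettings.php sketch: restrict the wiki to logged-in users.
    $wgGroupPermissions['*']['read']          = false; // no anonymous reading
    $wgGroupPermissions['*']['edit']          = false; // no anonymous editing
    $wgGroupPermissions['*']['createaccount'] = false; // sysops create accounts
    // Keep the login page itself reachable:
    $wgWhitelistRead = array( 'Special:Userlogin' );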
[12:15:52] I know where Wikia's is, but I would practically have to rewrite it if I wanted it to work, which I will probably end up doing. [12:16:48] 03(FIXED) show active users in API siprop=statistics - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=16047 +comment (10roan.kattouw) [12:17:25] Charitwo: c'est moi. [12:19:14] 03(mod) cat_hidden field isn't used - 10https://bugzilla.wikimedia.org/show_bug.cgi?id=15191 (10roan.kattouw) [12:30:16] hey RoanKattouw: do you do much with the api opensearch? [12:30:36] Splarka: Do you mean action=opensearch or list=search ? [12:31:26] action=opensearch [12:31:50] Well I hardly ever touch it [12:32:01] And I pretty much never use the API :P [12:32:19] But I believe the Firefox search plugin uses action=opensearch [12:32:41] well, someone asked earlier for a way to suppress redirects with mwsuggest, and it seemed like the first step would be to have the API action support it, then the mwsuggest.js (or php global) could utilize it [12:33:46] What's mwsuggest? [12:34:07] go to a wmf wiki and type in a few letters in the search box [12:34:36] Right [12:34:43] Does that currently use action=opensearch? [12:35:20] view a page source and check in the global definitions