[00:14:15] Hi, if anyone's here there's a page with an error on it: https://www.mediawiki.org/wiki/Manual:$wgCapitalLinkOverrides [00:14:26] Something to do with the translation system [00:17:54] the error would be? [00:18:50] It's supposed to be transcluding {{note}}, but somewhere in between marking for translation and converting it to {{TNTN}} it failed to be a transclusion. [00:19:10] You can see the text Template:Note/en -- that doesn't belong there [10:49:17] I have a large image (3,950 × 12,288 px, 3.96 MB) and ImageMagick fails to create a 350px thumb. Is there a setting I can use to increase the memory limit? [13:22:14] hello, is it true that parsoid can only be installed with root privileges? [13:22:38] so I cannot install VE on a regular webserver? [13:24:23] guest99: do you mean shared hosting? [13:24:29] yes [13:25:04] You'll need sudo/root access to install Parsoid, so no, you can't install it then. [13:28:57] will it ever be made shared-hosting compatible? [19:43:10] hi, can you delete a webpage managed by MediaWiki [19:43:18] the website is on World Wizzy [19:44:00] yes, you just need to be an admin [19:44:35] of the site [19:45:42] how do i become an admin of the site [19:47:05] ask a current admin [19:47:18] most wikis will have a page explaining that [19:48:12] I spent a few hours looking, but couldn't find an admin. [19:48:20] the page says modified 09:35, 2 December 2006. [19:48:41] what site is this? [19:49:34] world wizzy [19:49:39] world wizzy is "A static mirror of Wikipedia from early 2007...Because this is a static snapshot of Wikipedia taken from early 2007, you cannot edit anything you see here" [19:49:40] this is the full site: http://www.worldwizzy.com/library/Healthcare_policy [19:49:57] Oh, well in that case you're not going to get admin.
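[A note on the unanswered 10:49 thumbnail question above: before suspecting ImageMagick's memory use, it is worth checking MediaWiki's own pixel-area cap, since a 3,950 × 12,288 px image is roughly 48.5 megapixels and exceeds the default `$wgMaxImageArea`. A hedged `LocalSettings.php` sketch — the values shown are illustrative assumptions, not tuned recommendations:]

```php
// LocalSettings.php — illustrative values, not tuned recommendations.

// 3,950 × 12,288 px ≈ 4.85e7 pixels, above $wgMaxImageArea's default
// of 1.25e7, so MediaWiki may refuse to thumbnail the image before
// ImageMagick is even invoked.
$wgMaxImageArea = 6.4e7;

// If ImageMagick itself is being killed for memory, raise the limit
// MediaWiki places on shell commands (value in KB; default 307200).
$wgMaxShellMemory = 614400;
```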
[19:50:15] yes, but i would like to delete the information because it has my name associated with it [19:50:22] and i did not create the content [19:50:46] then it doesn't seem mirrored too well [19:50:57] Dionysus, then you'll just have to ask someone with control over the site. [19:51:02] I don't know what you expect us to do about it. [19:51:16] ok. I appreciate your help. [19:51:41] I'll try looking around more [19:51:44] thanks again [19:51:55] whois doesn't have any real contact info either [19:52:10] well, worth trying to contact Liviant LLC about it [19:52:10] Other than registrar@liviant.com [19:52:12] no, it just says godaddy owns it [19:52:37] maybe an enwiki admin could look at who added that revision in the deleted history of https://en.wikipedia.org/w/index.php?title=Healthcare_policy&redirect=no [19:53:03] Dionysus: Liviant LLC of Bradford, Massachusetts +1.8886038438 / registrar@liviant.com [19:53:31] great. Thank you [19:53:42] I will explore these options [19:55:02] Platonides: surprise, it seems to be a full mediawiki installation :P http://www.worldwizzy.com/library/Healthcare_policy?action=history [19:55:51] http://www.worldwizzy.com/library/Special:Version MediaWiki 1.9.0 [19:56:07] And the sidebar is in a really weird place [19:57:09] I guess another option for Dionysus would be to break into their server using any of the multiple vulnerabilities affecting MW 1.9.0 [19:57:18] that's probably illegal, though :) [19:57:21] lol [19:57:35] Hi huh [19:57:41] PiRS [20:01:17] hi [20:02:05] Hey Krenair [20:04:19] !ops [20:04:42] SmartWiki: do not change the topic. [20:06:06] No drinking and debugging... [20:19:05] what about Ballmer's peak? [20:21:53] lol [20:26:04] Krenair: You're not an op in here? [20:26:10] Though it seems ops weren't needed! [20:51:35] Why isn't the topic protected?
[20:52:21] ...and back again [20:52:35] cathfolant: because when a new MediaWiki release is made, nobody with ops powers remembers to update it [20:52:46] ah ok [20:53:04] I don't know if that's the main reason, but at least it's one of them :) [20:53:21] * cathfolant wonders what MediaWhisky software would be [20:54:13] Approximately how long does updating links take when I have 23,000 pages with multiple links on each, on my 2-CPU server, and I edit one of the pages they link to? [20:54:58] (Not edit, create; I need to rehighlight all these links) [20:55:23] porton: the updates will be handled by the job queue, so it depends on your configuration [20:55:44] Vulpix: I meant: how long would the full job queue run? [20:55:50] if you don't execute the job queue manually (or by a cron job), it will pick a batch of pages to update on every page refresh [20:56:15] it may depend on the complexity of the pages, I don't really know [20:56:19] Vulpix: I know. But I'm considering running the FULL job queue on EACH page request. Is that reasonable? [20:56:38] Vulpix: http://withoutvowels.org/wiki/Tanakh:Genesis_1:5 is a typical page [20:56:45] what do you mean by "full job queue"? [20:57:03] Vulpix: Updating ALL links which need updating [20:57:34] MediaWiki doesn't do that, luckily for you [20:57:45] it picks a small batch every time [20:58:12] Vulpix: I know. But what if I configure it to run, say, 1000 updates on each page request? [20:58:24] you'll probably kill your server [20:59:14] Vulpix: Can I make a page with a button that, when clicked, runs a full update at the user's request? Maybe there is such an extension? It is very inconvenient to have links of the wrong colors [20:59:20] if you get one pageview, you'll have a running process until it times out.
If you have several pageviews, you'll end up DDoSing your server [21:00:09] porton: run the job queue manually with this script: https://www.mediawiki.org/wiki/Manual:RunJobs.php [21:00:53] Vulpix: I want to make it accessible not only for me, the root, but also for all registered users, probably. Can this be done? Does it need custom programming? [21:01:09] yes, it needs custom programming [21:01:10] Laagonnie's change was actually helpful; the topic was 6 months out of date or more before I restored it :) [21:01:10] (Yes, it can be done with custom programming, but can it be done without?) [21:01:32] Yes [21:01:52] while true; do curl http://mywiki.org/; done [21:01:52] 21:52:37 cathfolant: because when a new MediaWiki release is made, nobody with ops powers remembers to update it <- indeed [21:02:12] or they do, but others are faster [21:04:24] It took runJobs.php about 10 sec of user+system time (17.5 sec real time) [21:05:05] I will probably create a custom extension to run jobs from a Web interface [21:05:37] try https://www.mediawiki.org/wiki/Manual:$wgJobRunRate porton [21:05:51] * Nemo_bis prefers curl to a UI [21:06:23] SPF|Cloud: We have already agreed that running jobs on regular user requests would make the requests much too long (10 sec page loading is not acceptable) [21:06:32] k [21:07:00] or set a cron job on RunJobs.php [21:07:18] SPF|Cloud: Every minute? [21:07:49] I don't know, depends on the activity of your wiki [21:07:51] every 10 minutes [21:08:09] Vulpix: why do you oppose the idea of running it every minute? [21:08:24] 10 minutes should be good [21:08:39] if you run it every minute, and the job takes longer than 1 minute, you'll end up having multiple job runners in parallel [21:08:46] Vulpix: It takes 0.1 sec to run it when the queue is empty. So running it every minute would not slow down the system [21:09:05] porton: the problem is of course when the job queue is "full" [21:09:22] Vulpix: "full"?
[21:09:33] when there are many jobs in the queue [21:10:17] I think if I run it every minute, then long job queues will just not form [21:10:31] form? [21:10:43] Vulpix: I mean they will not appear [21:11:20] creating a page that's linked from 1000 pages only takes a single edit. Yet it generates jobs to reparse 1000 pages [21:11:45] I'd call that a long job queue :) [21:11:48] 10 minutes and 1 minute are being discussed now [21:11:52] what about 5 minutes? [21:13:11] choose the time interval you like, just be sure to specify the --maxtime parameter of runJobs.php [21:15:33] Vulpix: Oh, I think `php maintenance/runJobs.php --maxtime=50` (is the time in seconds?) every 1 minute is OK [21:16:29] yes, it is in seconds [21:16:48] I wouldn't put a cron job to run every minute, but it's your server, so it's up to you [21:18:19] Does runJobs.php write to stdout or stderr? I need to redirect it to /dev/null [21:20:26] `php maintenance/runJobs.php --maxtime=50 &>/dev/null` should work [21:21:06] assuming you want stdout and stderr both redirected to /dev/null [21:21:48] Thanks all, bye [21:22:20] both? I don't think that will redirect stderr [21:22:43] http://stackoverflow.com/questions/637827/redirect-stderr-and-stdout-in-a-bash-script says both [21:23:17] Carmela, no, I'm not an op here. [21:24:00] &> is a bashism [21:24:18] Carmela, I think I asked about that before and got some bad responses. [21:24:32] ah, right, I've always used 2>&1 to redirect everything to one stream [21:24:49] yeah, I would also expect that [21:25:13] but it seems that &> probably works too [21:25:39] yes, it works, I've tested that [21:34:34] Krenair: Lame. [23:08:21] mit, nice vvv ;)
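[The cron-plus-redirection approach settled on above can be sketched as follows. The crontab line is an assumption — the wiki path and the ten-minute interval come from the discussion, not from a real install — and the demo shows why the portable `> /dev/null 2>&1` form is the safer choice in a crontab, since cron typically runs commands with /bin/sh, where bash's `&>` shorthand is not reliable:]

```shell
# Hypothetical crontab entry (path and interval are assumptions):
#   */10 * * * * /usr/bin/php /var/www/wiki/maintenance/runJobs.php --maxtime=50 > /dev/null 2>&1
#
# "> /dev/null 2>&1" is POSIX and works everywhere; bash's "&>/dev/null"
# does the same thing but only in bash, and cron usually uses /bin/sh.

# Demonstration that "> file 2>&1" captures both streams:
emit() { echo "to stdout"; echo "to stderr" >&2; }
emit > both.log 2>&1
grep -c . both.log   # both lines landed in the file, so this prints 2
```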