[10:24:56] hi fellows
[10:25:16] could T76686 be made public?
[14:25:25] how can i see what a "maintenance/runJobs.php --maxjobs 1" process is doing? my wiki is not loading any pages while that is running
[14:26:49] it's not doing any significant cpu or io load
[14:35:39] maybe with showJobs.php --list http://www.mediawiki.org/wiki/Manual:ShowJobs.php
[15:37:03] Vulpix: no luck, that just shows "0" after i set $wgShowExceptionDetails = true
[15:37:11] which it suggested when showing nothing
[15:40:43] hannes3: if you stop the command (CTRL+C) and run it again, still hangs?
[15:41:11] you may try to enable the display of PHP errors. See https://www.mediawiki.org/wiki/Manual:How_to_debug#PHP_errors
[15:43:49] hannes3: what is your php memory limit?
[15:44:43] and/or total memory usage on the machine?
[15:50:40] i have an article with these chars: ''; <- indicates a quote in the programming language Nix. but mediawiki just won't show the '' even though i've wrapped it in <nowiki>'';</nowiki>, why?
[15:52:24] qknight: at the beginning of a line?
[15:52:59] Betacommand: https://nixos.org/wiki/Xorg/xserver_example_settings_in_nix_configuration, just hit 'view source'
[15:53:45] qknight: looking
[15:53:59] i could wrap every '' into <nowiki>''</nowiki> but that would be tedious
[15:54:14] you don't have "; but '';, that's quite different
[15:54:46] yes, i'm having ''; and not "; ;-)
[15:54:47] bah, disregard that, it's my IRC client's fault
[15:55:09] charsets and encodings are a complicated thing
[15:55:53] use <nowiki> in the whole block (not just surrounding ''), or maybe <source> (if supported/installed in your installation)
[15:56:28] Vulpix: but if i use <nowiki> then the code is not listed as code but all the newlines are removed and i get a one line blob which looks very broken ;-)
[15:57:25] ah... yes. Use <pre> instead of <nowiki>, then
[15:59:29] qknight: it's neither, ; at the beginning of a line is special mediawiki syntax
[16:02:48] 	 Betacommand: i've fixed it now but i don't quite understand why
[16:03:00] https://nixos.org/wiki/Xorg/xserver_example_settings_in_nix_configuration <- looks great even without <source>
[16:03:07] Vulpix: <pre> was a good idea though
[16:03:36] qknight: <code> just adds monospace font. It doesn't escape wiki formatting stuff. <pre> does (also respecting whitespace)
[16:04:19] 	 qknight: wikitext has some special syntax to do stuff, [[]],:,~~~~ are just a few of them
[16:04:56] thanks a lot
[16:04:57] lines prefixed with a space are treated more or less like <pre>, but wikitext is still parsed
[16:05:15] 	 Vulpix: oh, that could explain what i've encountered 
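A minimal Python sketch of the distinction Vulpix describes; the helper name is hypothetical, but the behavior matches what was said above: <code> only switches to a monospace font (wikitext inside is still parsed), while <pre> preserves whitespace and suppresses wikitext parsing, so Nix's '' quotes survive.

```python
# Hypothetical helper: wrap a snippet in <pre> so MediaWiki shows it verbatim.
def wrap_for_wiki(snippet: str) -> str:
    # Escape a literal "</pre>" inside the snippet so it can't close the
    # block early; everything else can stay as-is, since <pre> content
    # is not parsed as wikitext.
    safe = snippet.replace("</pre>", "&lt;/pre&gt;")
    return "<pre>\n" + safe + "\n</pre>"

nix_code = "services.xserver.enable = true;\nextraConfig = ''\n  ...\n'';"
print(wrap_for_wiki(nix_code))
```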
[16:05:50] 	 is there a mediawiki editor like https://stackedit.io/?
[16:08:20] 	 there's VisualEditor
[16:08:36] 	 Vulpix: i think i've been playing with it. but is it a default editor somewhere already?
[16:09:22] 	 no. It's an extension, and it needs a more complex setup than normal extensions
[16:19:17] 	 I'm trying to use the bulk insert extension, but I'm having some php issues with undefined offsets. Could someone help me for a bit?
[16:24:32] 	 Bulk Insert extension?
[16:24:56] 	 Reedy: http://meta.wikimedia.org/wiki/MediaWiki_Bulk_Page_Creator
[16:25:23] 	 That's not an extension :)
[16:25:40] 	 It looks like it's not been updated in like 4 years
[16:25:55] 	 tdannecy: what are you trying to do?
[16:26:07] 	 And it seemingly screenscrapes
[16:26:08] * Reedy barfs
[16:26:44] 	 Betacommand: Create about 200 pages on my wiki with a template on each one.
[16:27:04] 	 tdannecy: take a look at pywikibot
[16:27:15] 	 https://www.mediawiki.org/wiki/PWB
[16:30:09] 	 Betacommand: Will do! Thanks.
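For the "200 pages, each with a template" task, a minimal pywikibot sketch could look like the following; it assumes a working user-config.py for the target wiki, and the page titles and template name are hypothetical placeholders. pagefromfile.py (which comes up below) is the ready-made alternative.

```python
import pywikibot

# Assumes pywikibot is installed and user-config.py points at the wiki.
site = pywikibot.Site()
for i in range(1, 201):
    page = pywikibot.Page(site, "Example page %d" % i)  # hypothetical titles
    if page.exists():
        continue  # skip pages that already exist instead of overwriting
    page.text = "{{MyTemplate}}"  # hypothetical template name
    page.save(summary="Bulk-creating pages with a template")
```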
[17:15:20] seems like the maintenance scripts don't like sqlite. the problem seems to be a "database is locked" error.
[17:22:46] 	 !wm-bot
[17:22:46] 	 Hello, I'm wm-bot. The database for this channel is published at http://bots.wmflabs.org/~wm-bot/db/%23mediawiki.htm More about WM-Bot: https://meta.wikimedia.org/wiki/wm-bot
[17:23:30] 	 hi
[17:25:04] 	 hi
[17:26:59] 	 hi
[17:33:40] does anyone have a timeline regarding the opening of T76686 to the public?
[17:55:41] 	 Betacommand: I ran into an issue with creating a family file. I don't understand what my traceback is telling me
[17:56:04] 	 tdannecy: can you pastebin it?
[18:01:02] 	 Betacommand: Sure. I ran "python pwb.py scripts/pagefromfile.py -start:xxxx -end:yyyy -file:inputfile.txt" Output: http://pastebin.com/3keUuABn
[18:02:11] 	 tdannecy: u'Tim D'Annecy'
[18:02:15] 	 needs to be:
[18:02:23] 	 u"Tim D'Annecy"
[18:02:53] 	 usage of ' in a string caused the issue
[18:03:06] 	 Betacommand: Ah. Just saw that too. Can't believe I missed it.
[18:03:30] tdannecy: I've had dumber errors myself :)
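The quoting problem Betacommand spotted, in isolation (a sketch, not the actual input file):

```python
# The apostrophe ends the single-quoted literal early:
#   author = u'Tim D'Annecy'    # SyntaxError
# Either switch the outer quotes or escape the apostrophe:
author = u"Tim D'Annecy"
author = u'Tim D\'Annecy'
print(author)  # Tim D'Annecy
```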
[18:07:34] 	 Betacommand: Well, I ran into more issues generating a family file. I hate to be a mooch, but would you mind looking at my output from running "generate_family_file.py"?
[18:07:51] tdannecy: pastebin it
[18:08:43] and the error code you're getting too
[18:09:53] tdannecy: I've only got a few minutes before I need to head out
[18:11:38] Betacommand: Okay, I'm having an issue getting the pastebin to save correctly with the input, so I took a screenshot. http://imgur.com/ExNka8w.png
[18:12:31] 	 tdannecy: what is the url to your wiki's home page?
[18:13:02] 	 Betacommand: http://flnowarchive.org/mediawiki/index.php/Main_Page
[18:13:19] 	 Betacommand: Should I put that in instead of just blah.org/mediawiki?
[18:13:26] 	 yeah
[18:13:33] 	 Betacommand: Thank you.
[18:14:04] 	 Betacommand: Wow. It's right there in the instructions. -_- Sorry.
[18:14:04] tdannecy: that's what the docs are also telling you
[18:14:17] 	 Betacommand: Yeah. -_- I'm sorry to bother you.
[18:16:54] 	 hi
[18:16:59] 	 hi
[18:17:30] 	 hi
[18:19:34] Betacommand: Found out what my issue was. I had set the group permission read=false in my LocalSettings
[18:31:00] In https://phabricator.wikimedia.org/T65902, what does 'edit' stand for? is it male, female or unknown?
[18:36:04] 	 Phoenix303: https://www.mediawiki.org/wiki/Help:Magic_words#GENDER
[18:36:28] 'edit' is the text for every gender. I guess translations should be able to specify specific genders
[18:44:40] Vulpix: AFAIU, suppose the current user's gender preference is set to "female", but when i add {{GENDER:|male}} it actually returns "male" while it should return "female"? Please let me know if i have interpreted it wrongly.
[18:45:16] 	 {{GENDER:|male}} will return male in all cases
[18:46:20] what is this bug trying to add support for, then? https://phabricator.wikimedia.org/T65902
[18:47:54] I have no idea... not an #easy bug if you don't know what the author means
[18:50:44] 	 Thanks Vulpix.
[19:45:00] i am installing the Flow extension
[19:45:10] but it gave me an error
[19:45:15] 	 [f6ab0e1e] 2015-01-04 19:40:53: Fatal exception of type MWException
[19:45:25] 	 any solution for this?
[19:46:30] 	 sofat: there should be more details in the error log
[19:47:05] 	 https://www.mediawiki.org/wiki/Manual:How_to_debug#PHP_errors -- set $wgShowExceptionDetails = true
[19:48:30] ori, there is an internal error
[19:48:48] 	 yes, but to figure out what it is, we need to know more about what the error is
[20:20:47] 	 ori, https://hiteshkumarsofat.wordpress.com/2015/01/04/error/
[20:20:51] 	 please check this
[20:21:20] i could not find any solution
[20:22:25] 	 sofat: are you running the latest version of Flow? it looks like the code in the repository's master has changed
[20:22:41] 	 since the extension is still under development, it's probably best to use the latest code available
[21:25:09] 	 hello peoples
[21:28:16] 	 I have a bit of a dilemma. I'm associated with a group of people trying to restart a wiki that's been down for quite a while (since 2011). however the only backups are here: https://dl.dropboxusercontent.com/u/64088059/Steal%20this%20Wiki/wiki.stealthiswiki.org/wiki/Main_Page.html
[21:28:16] 	 How would we transfer this to a new wiki without manually copy/pasting each page?
[21:29:15] The best any of us have come up with is to use a scraper to grab the pages and convert them into markdown (we installed a markdown plugin), but it's still tedious to correct the markdown because it doesn't transfer perfectly.
[21:30:16] 	 wesguy: do you have a database dump, or just the html versions?
[21:30:47] 	 just this version I'm linking, html
[21:31:09] 	 unfortunately the old wiki was abandoned a long time ago and we don't have access to the database
[21:31:13] 	 wesguy: and the original wiki isn't online anymore?
[21:31:58] 	 nah it's been down since 2011, but it's all GNU licensed so we figured we'd just restart it ourselves haha
[21:32:06] 	 me and a group of people
[21:32:12] 	 wesguy: http://w-i-k-i.appspot.com/ seems to do a reasonable job at converting to wikitext
[21:32:28] 	 there's also, you know, Parsoid ;)
[21:32:49] 	 MatmaRex: yeah, but parsoid likes parsoid-generated html somewhat better than random html ;-)
[21:32:56] 	 thanks I'll try that app
[21:33:08] 	 wesguy: it's still a bit of a mess with the links
[21:33:18] 	 yeah the markdown screws up the links too
[21:33:22] 	 there's probably a few others out there that convert html to wikitext as well
[21:35:15] 	 valhallasw`cloud: it'll take random HTML perfectly fine, as long as it's "clean", so to speak
[21:35:18] 	 wesguy: try their tools https://github.com/WikiTeam/wikiteam
[21:35:39] wesguy: pandoc also works, but it's also problematic with links
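A hedged sketch of the pandoc route just mentioned: pandoc really does have a mediawiki output format, but the directory layout here is hypothetical, and (as noted above) links and images will still need hand-fixing afterwards.

```python
import subprocess
from pathlib import Path

# Convert each saved HTML page to wikitext with pandoc (must be on PATH).
# "wiki_backup" is a hypothetical directory holding the scraped .html files.
for html_file in Path("wiki_backup").glob("*.html"):
    result = subprocess.run(
        ["pandoc", "--from=html", "--to=mediawiki", str(html_file)],
        capture_output=True, text=True, check=True,
    )
    html_file.with_suffix(".wiki").write_text(result.stdout)
```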
[21:35:50] 	 this looks like it's working to a degree
[21:35:51] 	 output of the PHP parser should be pretty good for this…
[21:36:47] 	 hmm, i guess it won't handle images well?
[21:37:34] 	 hmm. yeah, you would need some pre- or post-processing :/ oh well
[21:37:55] 	 oh well
[21:38:04] it's still working rather well, better than the markdown
[21:38:13] 	 (but actually, it does handle images well. better than i expected!)
[21:38:40] 	 (try at http://parsoid.wmflabs.org/_html/ , you'd need to install it locally and play with it for it to work better with local links)
[21:40:11] 	 thanks for the help!
[22:25:39] 	 is there a phab ticket for the PDF renderer ignoring infoboxes?
[22:29:07] 	 there's one for it ignoring tables
[22:29:09] 	 so yes
[22:29:12] 	 somewhere
[22:29:28] 	 MatmaRex: it was just closed
[22:29:53] 	 MatmaRex: https://phabricator.wikimedia.org/T32706
[22:38:12] then i dunno.