[04:01:16] does wikiapiary not work anymore
[04:48:57] SantaC, how is this related to #mediawiki ?
[04:50:02] andre__: because it's used in templates on mediawiki.org, thanks
[04:51:28] ah, you mean linked?
[04:51:35] sorry, early morning here :)
[04:52:22] yes, so if it's indeed non-functional or closed down or whatever, we should remove and/or hide it from display until it's resolved
[04:52:44] https://downforeveryoneorjustme.com/wikiapiary.com
[04:53:22] yeah i checked that before asking here and looking silly for a routing issue lol
[04:53:31] So your question is actually whether anyone knows if Wikiapiary just has temporary technical problems or if the site does not exist anymore.
[04:53:42] correct
[15:29:45] Happy Friday, folks! I have a question. I get an error when trying to install AntiSpoof, which a search tells me is caused by not having Composer installed or updated. I'm having some issues figuring out how exactly to install/update it and would love it if somebody could walk me through it step by step. I've tried to do it on my own for a bit without success.
[15:30:08] The error I'm getting is:
[15:30:09] [8bc8101b9477cf405466bc1d] [no req] Error from line 148 of /home/nchakdhb/lostjewel/extensions/AntiSpoof/includes/AntiSpoof.php: Class 'Wikimedia\Equivset\Equivset' not found
[15:37:16] PonyToast: composer is a command-line utility that will need to be downloaded and installed separately from mediawiki. Exactly how to do that will vary based on what OS you have (it is very likely in your OS package manager though)
[15:37:30] it is not a mediawiki extension
[15:43:20] So all of my hosting is through Namecheap; I have SSH and terminal access.
[15:56:11] Skizzerz: i imported several of those files without removing the N prefix and it seemed to work... how important was that?
[16:05:26] stiv2k: wasn't expecting it to work, but I just looked it up and it seems mysql supports that notation, so all should be well
[16:05:34] sweet
[16:05:53] alright so i just gotta convert those binary strings and i should be good to go... then upgrade from 1.27 to 1.33
[16:06:12] i'm going to write a python script to do the conversions
[21:00:19] Skizzerz: what's up with the objectcache table
[21:02:18] stiv2k: you can ignore that one
[21:02:24] Skizzerz: sweet
[21:02:28] (leave it empty on the new install, it'll populate itself as needed)
[21:02:56] Good evening. I would (still) like to add some visual editing to our internal mediawiki. Some people have a hard time editing with wiki code. Anything Parsoid-based is simply too complex. Are there any alternatives that could work? Has anyone used this? https://www.mediawiki.org/wiki/Extension:WYSIWYG It's experimental, but does it work?
[21:03:15] stiv2k: basically that table is used to cache various PHP objects depending on your $wgMainCacheType (and related) settings
[21:03:33] if they aren't in the cache, they get rebuilt as needed, so no need to transfer that data over
[21:04:48] Skizzerz: well that just about wraps it up. wiki looks like it's working
[21:04:51] but i can't log in, lol
[21:05:20] :(
[21:05:39] try the changePassword.php maintenance script to reset your password
[21:07:43] also on the stats page it says active users: -1
[21:08:14] run the rebuildAll maintenance script
[21:10:45] done
[21:14:19] alright my colleague says he was able to log in so i guess i just forgot my pw
[21:15:46] i think i need to configure it to send emails for the password reset link
[21:17:11] unless you set an email when registering, that won't work. the above-mentioned maintenance script can let you change it to something else temporarily
[21:17:32] (then you can log in and change it to something you prefer via the UI -- that way the actual pw isn't logged in command-line history files)
[21:18:32] i did set an email when registering, yes
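A minimal sketch of the kind of conversion script mentioned above at 16:06, for reference. It assumes the "binary strings" in the dump are MySQL hex literals (0x...) holding UTF-8 text, and rewrites them as quoted strings; the actual dump format, columns, and escaping rules may differ, so treat the pattern and decoding as placeholders rather than whatever script ended up being written.

    #!/usr/bin/env python3
    # Rewrite MySQL hex literals (0x...) in a dump as quoted UTF-8 strings.
    # ASSUMPTION: the values are UTF-8 text stored as hex; genuine binary
    # blobs (anything that fails to decode) are left untouched.
    import re
    import sys

    HEX_LITERAL = re.compile(r"0x([0-9A-Fa-f]+)")

    def hex_to_quoted_utf8(match):
        digits = match.group(1)
        if len(digits) % 2:            # not a byte-aligned literal; skip
            return match.group(0)
        try:
            text = bytes.fromhex(digits).decode("utf-8")
        except UnicodeDecodeError:     # real binary data; leave as-is
            return match.group(0)
        # Escape backslashes and quotes so the result is a valid SQL string.
        escaped = text.replace("\\", "\\\\").replace("'", "\\'")
        return "'" + escaped + "'"

    if __name__ == "__main__":
        for line in sys.stdin:
            sys.stdout.write(HEX_LITERAL.sub(hex_to_quoted_utf8, line))

Run as something like python3 convert.py < old-dump.sql > converted.sql (file names here are hypothetical). Working line by line is safe for mysqldump output, since a hex literal never spans lines.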
[23:05:22] Hi all, I'm trying to convert a mediawiki site to a jekyll site (each page is a markdown file with some metadata at the top). Has anyone come across a good way to do this?
[23:05:42] !export
[23:05:42] To export pages from a wiki, navigate to Special:Export on the wiki, type in the names of the pages to export, and hit "export". See for an example of this form. See also: !import
[23:06:53] Yeah, that gives me a big XML file of all the pages. From there I've been looping through and generating the individual files. Where I'm struggling the most is converting the wiki markup to markdown, and preserving the internal links and images
[23:07:53] I also figured this might be a common enough use case that someone's already solved it
[23:08:36] pandoc can do that - a quick search on ddg.gg reveals some github modules
[23:14:12] Yeah, pandoc is what I'm using and it's struggling to do anything beyond the basic conversions :/ I've seen a few repos but they all seem to be unfinished projects. I just figured there might be a well-established tool for this considering how popular Jekyll/GitHub Pages is
[23:23:39] I don't know how complex your wikitext is - templates and magic words are not supported by markdown at all
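For reference, a minimal sketch of the export-to-Jekyll loop described above: parse the Special:Export XML, feed each page's wikitext through pandoc, and write one markdown file with YAML front matter per page. Assumptions: pandoc 2.x is on PATH (its mediawiki reader and gfm writer), the dump holds one revision per page, and the slug naming and front-matter quoting are naive placeholders. As the last message notes, templates and magic words will not survive the conversion, and internal links and images will likely still need a post-processing pass.

    #!/usr/bin/env python3
    # Convert a Special:Export XML dump to Jekyll-style markdown files.
    import pathlib
    import subprocess
    import sys
    import xml.etree.ElementTree as ET

    def export_to_jekyll(dump_path, out_dir):
        out = pathlib.Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        root = ET.parse(dump_path).getroot()
        # The export XML namespace version varies by MediaWiki release,
        # so match any namespace ({*} requires Python 3.8+).
        for page in root.iterfind(".//{*}page"):
            title = page.findtext("{*}title", default="")
            wikitext = page.findtext("{*}revision/{*}text", default="")
            if not title:
                continue
            result = subprocess.run(
                ["pandoc", "--from", "mediawiki", "--to", "gfm"],
                input=wikitext, capture_output=True, text=True,
            )
            if result.returncode != 0:
                print(f"pandoc failed on {title!r}: {result.stderr.strip()}",
                      file=sys.stderr)
                continue
            slug = title.lower().replace(" ", "-").replace("/", "-")
            front_matter = f'---\ntitle: "{title}"\n---\n\n'
            (out / (slug + ".md")).write_text(front_matter + result.stdout)

    if __name__ == "__main__":
        export_to_jekyll("export.xml", "_pages")  # hypothetical paths

Shelling out per page keeps the sketch simple; in practice the internal links pandoc emits usually need a follow-up pass to map page titles onto Jekyll URLs.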