[00:15:11] Jasper: That's right, yeah. We're not planning to replace action=query or anything like that.
[00:15:20] what a shame
[02:21:42] Question for MediaWiki sysops out there: do you know of any good guides on an immutable MediaWiki setup, or suggestions on where to look? I'm thinking of setting up a server with cookbooks or Docker
[02:22:45] I'm working on a LEMP+Varnish stack setup that I want to make a guide for, but doing everything manually doesn't make for good server practice
[02:24:50] CZauX: https://www.mediawiki.org/wiki/MediaWiki-Vagrant may or may not be a good starting point
[02:27:13] I've already manually set up a very high-performing LEMP+Varnish stack; the problem is it's all manual, which isn't very portable. I've contacted a few sysops out there on general methods of immutable servers and proper security, but MediaWiki sysops should have some more specific information on that
[03:38:24] PHP7 has made some good life choices and broken up with Mcrypt. I think they might hook up with Libsodium. http://news.php.net/php.internals/91853
[13:21:34] morning
[15:33:54] hi
[15:34:10] i'm migrating data from a MediaWiki 1.6.10 installation to 1.19.20+dfsg-0+deb7u3 (debian package version). i've populated the mysql database with the old tables and generated the LocalSettings.php using the mw-config/ web interface, which talked about some automatic migration of the database.
[15:35:00] I need your help! I have noticed that the main distributions in the Linux timeline have their logos on the SVG, Debian, Slackware, but not Red Hat. So I grabbed it and added it, but now I don't know how it could be brought to the related page. Who can I reach for this purpose?
[15:35:22] melodie, what "linux timeline"?
[15:35:29] hi andre__ this one:
[15:35:51] when i run "php update.php" in the maintenance/ directory, i get this: ...deleting old default messages (this may take a long time!)...A database query syntax error has occurred.
[15:35:52] The last attempted database query was:
[15:35:52] "INSERT INTO `recentchanges` (rc_timestamp,rc_cur_time,rc_namespace,rc_title,rc_type,rc_minor,rc_cur_id,rc_user,rc_user_text,rc_comment,rc_this_oldid,rc_last_oldid,rc_bot,rc_moved_to_ns,rc_moved_to_title,rc_ip,rc_patrolled,rc_new,rc_old_len,rc_new_len,rc_deleted,rc_logid,rc_log_type,rc_log_action,rc_params,rc_id) VALUES ('20160323152835','20160323152835','8','1movedto2','3','0','0','0','MediaWiki default','No longer
[15:35:52] required','0','0','1','0','','MediaWiki default','1','0',NULL,NULL,'0','1025','delete','delete','a:0:{}',NULL)"
[15:35:55] from within function "RecentChange::save".
[15:35:57] Database returned error "1054: Unknown column 'rc_cur_time' in 'field list' (localhost)"
[15:36:08] https://fr.wikipedia.org/wiki/Liste_des_distributions_Linux#Des_distributions_communautaires_et_grand-public
[15:36:24] https://commons.wikimedia.org/wiki/File:Linux_Distribution_Timeline.svg
[15:36:29] from that page in fact
[15:36:38] is it unrealistic to expect an automatically correct jump from 1.6.10 to 1.19.20?
[15:37:39] melodie: is the Red Hat logo free?
[15:37:51] that may be the reason it wasn't included
[15:38:10] https://en.wikipedia.org/wiki/File:RedHat.svg says it's not
[15:38:11] it's just like the Slackware logo: it bears a ®
[15:38:19] oeisu: it should work, but i guess nobody tried it. Also, try disabling all extensions when running update.php (then run it again with the extensions enabled)
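For reference, "disabling all extensions" on a 1.19-era wiki usually means commenting out their load lines in LocalSettings.php before running update.php. A minimal sketch; the extension names below are placeholders, not taken from oeisu's actual setup:

```php
<?php
# LocalSettings.php (excerpt) — temporarily disable extensions before an upgrade.
# Extension names are hypothetical examples.

# Commented out while maintenance/update.php runs:
# require_once "$IP/extensions/ParserFunctions/ParserFunctions.php";
# require_once "$IP/extensions/Cite/Cite.php";

# Once core update.php succeeds, restore the lines above and run update.php
# once more so each extension's own schema updates are applied.
```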
[15:38:32] oeisu: anyway, please report the failure
[15:38:33] Platonides this is nitpicking, Slackware is also a proprietary brand
[15:38:33] !phab
[15:38:33] https://phabricator.wikimedia.org/
[15:38:48] maybe the Slackware logo shouldn't be there, either
[15:39:12] Hmm, https://www.mediawiki.org/wiki/Manual:Recentchanges_table#rc_cur_time is deprecated, but only removed in 1.24
[15:39:14] hmmm https://commons.wikimedia.org/wiki/File:Slackware_Logo_alt.jpg
[15:39:20] "The copyright holder of this work allows anyone to use it for any purpose including unrestricted redistribution, commercial use, and modification"
[15:39:20] Platonides this is also stupid, when we show the world the gnu/linux distributions they have to be advertised properly
[15:40:03] melodie: the inclusion of that logo may mean that you can't e.g. print that image
[15:40:19] or that it has to be removed from a physical wikipedia distribution
[15:40:49] it all depends on the terms under which the copyright owner allows their logo to be used
[15:41:47] Platonides does the RedHat page at wikipedia mention any issue about representing it? It does not seem so to me: https://en.wikipedia.org/wiki/File:RedHat.svg
[15:41:53] it asks not to misrepresent it
[15:42:16] melodie: Wikipedia in English is using it under a Fair Use rationale
[15:42:21] DanielK_WMDE_: is disabling extensions a command line option to "php update.php", or do i have to regenerate LocalSettings.php, making sure to uncheck certain checkboxes?
[15:42:36] that's not allowed on Wikimedia Commons
[15:42:44] oeisu: you will need to comment out some lines in LocalSettings.php
[15:42:50] nor on many Wikipedias
[15:43:15] see for instance how https://es.wikipedia.org/wiki/Red_Hat doesn't include any image
[15:43:36] Platonides fair use rationale? what sort of alien language is that? (Could you be more precise?)
[15:43:57] Fair Use is a doctrine under US law
[15:44:09] that allows using copyrighted content
[15:44:13] https://en.wikipedia.org/wiki/Fair_use
[15:44:17] ok, I'll head to the #redhat chan and ask them if they want it to look nice
[15:44:21] under certain requisites
[15:44:35] copyright is messy
[15:46:08] it's unlikely anyone on #redhat is able to license their logo
[15:47:38] Platonides they can ask upstream in their company. Meanwhile, my own presentation in a room with a few people will have the logo, because if it doesn't, it looks awkward
[15:49:04] well, that timeline is GFDL
[15:49:22] so your addition of a GFDL-incompatible image is probably not allowed
[15:49:49] in addition to the fact that you might need authorization from RedHat's legal department
[15:49:57] if you wanted to do everything legally
[15:50:17] a burden for what's "logical"? yes, I'm afraid so
[15:51:08] ideally I'd like every part of the gnu/linux distributions to be open source/free as in freedom, but then the hardware does not work. so what can we users do?
[15:51:31] I'll just ask RedHat if they want their logo there, and if so, to consider doing whatever they have to
[15:52:19] but on Friday my own device will show a nice image, not one that looks like it despises one of the 3 mothers of many of the distributions out there
[15:52:44] I can assure you the public won't see any issue in that presentation
[15:53:31] as I said, it's "logical"
[15:53:40] but copyright law often isn't
[15:53:57] their problem
[15:54:09] once I've pointed it out to them, of course
[15:55:01] I'm an advanced user, nothing more, not into legal matters, and they have the means to deal with the legal side, while I don't
[15:55:17] James_F: hey, I configured and enabled citoid for fa.wp :)
[15:55:20] today
[15:56:07] you are probably living in a country that is a signatory of the Berne convention
[15:57:18] Platonides thanks for your insights
[15:57:24] I'm sorry
[15:57:38] Amir1: Yay!
[15:58:27] Hi there,
[15:59:06] :)
[15:59:16] I'm trying to get mediawiki up using vagrant, and I've added some roles in, but after I log in using vagrant ssh, I do not have the mediawiki directory under /vagrant.
[15:59:21] What can I do?
[15:59:31] Did I miss some roles?
[16:00:13] where can I test patches to extensions?
[16:01:02] the following are the enabled roles: * graph * multimedia * multimediaviewer * poem * wikidiff2
[16:03:31] any help with mediawiki vagrant will be appreciated...
[16:03:57] Amir1, Yaron, are you using vagrant?
[16:04:16] dror: so you did a `vagrant up` and you don't have a /vagrant/mediawiki?
[16:04:26] right.
[16:04:49] I wonder if I'm missing some roles...
[16:04:58] Have you tried running `vagrant provision` and checking for errors?
[16:05:20] the main MediaWiki clone is applied by default
[16:05:20] not yet. thanks. I'll try.
[16:06:53] bd808: I got this: Error: Could not run: Could not find file /vagrant/puppet/manifests/site.pp
[16:07:54] if you cloned the vagrant repo, go inside the vagrant directory and do "git status"
[16:07:56] dror: hmm... that sounds pretty broken. Are you running the vagrant command from your laptop directly or from inside the VM? It needs to run from your host computer.
[16:08:26] if it shows you deleted them, just do "git reset --hard origin/master"
[16:09:41] bd808: I'm running the vagrant command from the laptop/host.
[16:10:42] Amir1: my git status seems fine: Your branch is up-to-date with 'origin/master'.
[16:11:56] oh, maybe you're running "vagrant provision" in the wrong place
[16:12:08] do it inside your vagrant folder
[16:13:48] (also try "git pull")
[16:16:37] yeah, did it inside the vagrant folder. git pull did make some changes.
[16:18:18] now, I do have the site.pp file, but vagrant provision still claims the same error.
[16:19:19] I wonder if the volumes from your host computer are not mounted properly into the VM? Perhaps `vagrant reload` would be useful to restart the VM and ensure that the shares are available?
[16:22:05] for the record, when migrating from mediawiki 1.6.10 to 1.19.20, i had to manually touch the mysql database to make maintenance/update.php happy: ALTER TABLE recentchanges ADD `rc_cur_time` varbinary(14) NOT NULL DEFAULT '';
[16:22:05] ALTER TABLE recentchanges ADD `rc_moved_to_ns` tinyint(3) unsigned NOT NULL DEFAULT '0';
[16:22:06] ALTER TABLE recentchanges ADD `rc_moved_to_title` varbinary(255) NOT NULL DEFAULT '';
[16:22:44] the wiki is up and running now :-P
[16:22:55] did the 'vagrant reload', started (almost) fine, with a guest additions comment (installed but not running).
[16:23:31] Then - I received: Usage: service < option > | --status-all | [ service_name [ command | --full-restart ] ] bash: line 3: vboxadd: command not found bash: line 2: setup: command not found
[16:24:16] the virtual machine runs, though.
[16:25:54] where can I find the logs for the setup script?
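For readers skimming the log, here is the recovery sequence suggested to dror above, collected into one sketch. It assumes the MediaWiki-Vagrant clone lives in a local directory called vagrant (the path is an example) and that origin/master is the state you want back:

```sh
cd vagrant                       # the MediaWiki-Vagrant checkout on the host
git status                       # see whether tracked files (e.g. puppet manifests) were deleted
git reset --hard origin/master   # restore locally deleted or modified files
git pull                         # pick up upstream changes
vagrant reload                   # restart the VM so host shares are remounted
vagrant provision                # re-run puppet provisioning, from the host, not inside the VM
```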
[16:50:31] Hey, all, I have a MediaWiki install where if I move a file or update a Template, I have to edit the downstream pages for the changes to take effect. Ideas?
[16:52:47] doesn't it automatically happen after a bit?
[16:52:54] those changes should go into the job queue
[16:53:03] which should run after a bit
[16:53:16] OK, I'll wait 15 minutes and report back. Will you be here, Platonides?
[16:53:17] maybe you have it disabled
[16:53:29] Where could I look for that?
[16:53:39] I don't have many pages, so it shouldn't take long to go through them right now.
[16:53:41] perhaps because it should be done from a cron job which isn't running
[16:53:56] That would explain a lot.
[16:53:59] how many rows does the job queue table have?
[16:54:06] Let me look at the MediaWiki FAQ....
[16:54:19] I'm not that familiar with SQL -- can you tell me how to find that?
[16:54:33] oh, and as a workaround, you could fetch the page with ?action=purge
[16:54:39] instead of performing a real edit
[16:55:32] is it the maintenance/rebuildall.php job that needs to run regularly?
[16:55:43] No
[16:55:46] !job
[16:55:46] http://integration.wikimedia.org/ci/job/
[16:55:49] !jobqueue
[16:55:49] The Job Queue is a way for mediawiki to run large update jobs in the background. See http://www.mediawiki.org/wiki/Manual:Job_queue
[16:55:56] ok, the table is called simply 'job'
[16:56:03] SELECT COUNT(*) FROM job;
[16:56:24] OK, let me figure out how to get into the database in postgresql
[16:57:09] AHHHHH That page makes all of this make so much more sense. Thanks.
[17:01:53] dror: sorry, I only saw your question now, but no, I've never used Vagrant.
[17:03:16] Platonides: Worked perfectly. Thanks. I have 5-minute jobs running now for each of my wikis.
[17:10:48] Hello, I am brand new to working with wikis and have a question that is a bit in-depth, but I cannot for the life of me understand the how-to from reading through the help guides available.
[17:10:48] how do I get article content rendered as html in a json dump? i.e. no skin, just the article content, but not in wikitext, in html?
[17:10:50] I am looking to create a template for generating game detail pages (i.e. one page per game) for multiple games. Information to be included are items such as cover art, platform, genre, publisher, developer, franchise, region, etc. I have been reading through the help manual for the past few days but cannot grasp exactly how to set this up so that when a user goes to create a new page it will automatically have all of these fields
[17:11:11] links to list pages (i.e. click the developer and get a page with all games listed by that developer). Is anyone available to give me a breakdown on how to achieve this? Thank you.
[17:12:55] djidol: Maybe one of my pages can give you somewhere to start. https://aninix.net/wiki/Template:Service
[17:13:24] If you're looking for the CONTENT to autopopulate, MediaWiki may not be your tool, or you will at least need some other tool to generate the wikitext to include.
[17:14:30] No, not necessarily auto-populate, I like the layout of the standard video game info on wikipedia for example
[17:14:53] I don't need all the additional information, i.e. summary, plot, etc., just the facts of the game itself, which I am more concerned with
[17:15:23] if someone goes to create a page for a game, how would I set it up so that it would display, for example, {{Infobox video game |title = "title", etc. that they can then edit
[17:15:32] like this wiki page for example https://en.wikipedia.org/w/index.php?title=Dark_Souls&action=edit
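One way to get the pre-filled edit box djidol is describing, without any extra extension, is a preload page: boilerplate wikitext that MediaWiki copies into the edit form when a new page is created through a link that names it. A sketch; the page name and field list are examples only:

```wikitext
<!-- Contents of a page such as Template:Game/preload (hypothetical name) -->
{{Infobox video game
| title     = 
| image     = 
| caption   = 
| developer = 
| publisher = 
| franchise = 
| platform  = 
| genre     = 
| region    = 
}}
```

Linking to the edit form with &action=edit&preload=Template:Game/preload appended to the URL should drop that skeleton into the textarea for the user to fill in; Extension:InputBox can generate such links from a "create a page" box.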
[17:16:13] djidol: include the subtemplates in the Game template you're making and be smart about the needed fields for the page.
[17:17:19] so, for example, if I created Template:Game
[17:17:30] ?action=render gives the html version; I'd like the page info, with the rendered version included in the same json. Anyone know a url for that?
[17:17:30] and in that game template I placed {{Infobox video game | title = Dark Souls | image = Dark Souls Cover Art.jpg | caption = Cover art used in Western regions | developer = [[FromSoftware]]
[17:17:50] anytime someone went to create a page, it would auto-place that information?
[17:17:59] and then they could simply edit the info as needed?
[17:18:15] djidol, are you using the semantic mediawiki extensions? those have the kind of fields you might be looking for.
[17:18:47] I am not, I have literally just set up my wiki this morning and learned how to configure the initial details, I will look into that extension
[17:18:48] thank you
[17:19:45] semantic plugins handle defining attributes and form templates for page creation that ask for those fields. they also handle querying pages based on those fields.
[17:20:12] semantic-mediawiki.org
[17:21:15] shoot, I don't think semantic will work, as I have installed mediawiki through the softpedia extension via my webhost instead of manually
[17:22:00] well, the templates that DarkFeather mentioned will get you part of what you're looking for, but probably not the query interface.
[17:22:40] anyone point me at a url format to get html-rendered page content as well as page info in the same json output?
[17:42:16] Anyone know where I need to specify server settings in the EmailForm extension?
[17:42:38] typically in LocalSettings.php
[17:44:49] Tried looking there... I think they need to be in EmailForm.php, but no idea where and how exactly they should be put in
[17:46:51] in EmailForm.php there would be the defaults
[17:47:05] which you then override in LocalSettings.php
[17:47:09] after loading the extension
[17:47:47] Do you mean the smtp settings?
[17:48:21] Because I try submitting a form using the default settings and it says it was sent successfully, but it never arrives
[17:52:12] AAlex: my guess is that the extension uses the email settings defined by core, which in turn uses PHP's built-in mail function, which again defaults to using sendmail
[17:52:52] if sendmail isn't set up properly on the box, it won't do anything
[17:53:03] the alternative is to configure mediawiki to use smtp
[17:53:04] !smtp
[17:53:04] See for information about configuring MediaWiki to use SMTP for sending mails, instead of using the sendmail interface.
[17:53:36] AAlex: i don't know if the extension will actually use that, and i don't know if you have another problem, but i'd say it's worth a try.
[17:57:16] Cool, I'll try that. Thanks guys
[18:55:56] Hello, all, I'm trying to apply some categories by template, but I don't want the template in the category. Is there a way to exclude wikitext when viewing the Template page?
[20:00:59] hey
[20:01:17] can I ask questions about the dpl extension here?
[20:13:59] !ask | Lingo____
[20:13:59] Lingo____: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
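The categories-by-template question from 18:55 went unanswered in the log. The standard trick is <includeonly> tags, which make the category apply only to pages that transclude the template, not to the template page itself. A sketch with a placeholder category name:

```wikitext
<!-- In the template's source (category name is an example): -->
<includeonly>[[Category:Game pages]]</includeonly>
<noinclude>Documentation placed here appears only on the template page itself.</noinclude>
```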
[21:12:20] I'm good. re-installed and it works!
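As a closing note on the EmailForm thread: the core SMTP settings Platonides and DanielK_WMDE were pointing at live in LocalSettings.php via $wgSMTP. A sketch only — the host, port, and credentials are placeholders, and whether EmailForm actually honours this was, as said above, untested:

```php
<?php
# LocalSettings.php (excerpt) — send mail via SMTP instead of PHP's
# sendmail-backed mail(). All values below are placeholders.
$wgSMTP = array(
    'host'     => 'ssl://smtp.example.com', // hypothetical SMTP server
    'IDHost'   => 'example.com',            // domain used in Message-ID headers
    'port'     => 465,
    'auth'     => true,
    'username' => 'wiki@example.com',
    'password' => 'secret',
);
```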