[05:12:46] hi [05:14:13] In the Wikipedia revisions information there is a content field... Can anyone tell me what its purpose is and how it differs from the diff field? [05:48:39] Hi Sami. [05:49:09] Sami: Did you read https://www.mediawiki.org/wiki/Manual:Revision_table ? [05:49:16] Which field are you confused about? [05:50:29] if you run this query https://en.wikipedia.org//w/api.php?action=query&format=xml&prop=revisions&titles=Sachin_Tendulkar&rvprop=ids%7Ctimestamp%7Ccomment%7Cuser%7Ccontent&rvlimit=30&rvdiffto=prev [05:50:57] there will be a content field for each revision... now I see it's the same for all revisions [05:52:57] https://en.wikipedia.org/w/api.php?action=help&modules=query%2Brevisions [05:53:07] content is the unparsed (raw) wikitext of the revision, I believe. [05:53:14] For each revision I am confused about the purpose of the content field... If you take a line and do a simple Ctrl+F, you will find it occurring in all the revisions [05:53:15] You're requesting it when you include &rvprop=content. [05:53:28] If you remove rvprop=content, it will be omitted. [05:53:37] If you're looking at diffs, sometimes you want the content. [05:53:41] In order to do a manual diff. [05:53:45] Or a hand-made diff. [05:53:56] Like taking the wikitext of two revisions and comparing them yourself. [05:54:27] Diffs are a bit weird. What are you trying to do? [05:54:38] so if we take the raw content of two revisions and compare them, I will get the diff? [05:54:55] Sure, that's true of any two documents. [05:55:03] The MediaWiki diff engine is a bit more advanced. [05:55:10] So sometimes people prefer its HTML output, I guess. [05:55:30] Since it includes features such as interline diffs. [05:55:35] When a particular character or space changed. [05:55:38] Intraline? [05:55:40] One of those. [05:56:19] I am trying to see what the exact changes made in a revision are, using MediaWiki... is there a simple process?
because sometimes the diff field is empty [05:56:30] Not all revisions have a visible diff. [05:56:39] For example, renaming (moving) a page causes a revision. [05:56:41] Or protecting a page. [05:56:45] But the revision content will stay the same. [05:56:49] You can see this in the user interface. [05:57:13] https://en.wikipedia.org/w/index.php?title=List_of_hypermarkets&action=history [05:57:25] You see " (46,914 bytes) (0) . . (Protected" ? [05:57:41] That's an example of a revision that doesn't change the content. [05:59:02] so if the diff field is empty then the revision does not change the content [06:03:13] Probably, yeah. [06:03:16] That sounds reasonable to me. [06:03:32] You can also use the hash to check if the content changed. [06:03:40] https://en.wikipedia.org/w/api.php?action=help&modules=query%2Brevisions sha1 [06:25:37] Can you please tell me how to use the hash to find out whether the content has changed? [06:27:09] What I see is that even if some changes were made, the diff field may be uncached... as an example you can see this link and search for the FillherTease user's changes https://en.wikipedia.org//w/api.php?action=query&format=xml&prop=revisions&titles=Sachin_Tendulkar&rvprop=ids%7Ctimestamp%7Ccomment%7Cuser&rvlimit=30&rvdiffto=prev [07:26:32] Hello [07:30:30] hi Arturo_ [07:34:55] how do you use rvcontinue in action=query&prop=revisions ? [08:18:03] hey guys I was wondering where I can ask questions about how to implement a new parser function in an existing extension [08:18:11] but #mediawiki-dev is not responding [08:18:30] so does anyone know a better place to ask something about this? [08:44:14] Joeytje50: #wikimedia-dev is the best place. [08:44:27] ...or maybe some mailing list.
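The manual-diff, sha1, and rvcontinue questions above fit together in one small client. A hedged sketch, stdlib only: the function names are our own, the API parameters are the ones quoted in the queries above, and the demo at the bottom runs on made-up wikitext so no network access is needed.

```python
# Sketch of the manual-diff approach discussed above: fetch revisions with
# rvprop=ids|sha1|content, skip no-op revisions (page moves, protections) by
# comparing sha1, and diff the raw wikitext yourself with difflib.
import difflib
import hashlib
import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"

def fetch_revisions(title, limit=30):
    """Yield revision dicts for a page, following rvcontinue pagination."""
    params = {
        "action": "query", "format": "json", "prop": "revisions",
        "titles": title, "rvlimit": str(limit),
        "rvprop": "ids|timestamp|user|sha1|content",
    }
    while True:
        url = API + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        for page in data["query"]["pages"].values():
            yield from page.get("revisions", [])
        # rvcontinue: copy the continuation tokens into the next request
        cont = data.get("continue")
        if not cont:
            break
        params.update(cont)

def content_changed(rev_a, rev_b):
    """Same sha1 means the revision (e.g. a protection) left the text untouched."""
    return rev_a["sha1"] != rev_b["sha1"]

def manual_diff(old_text, new_text):
    """A hand-made diff of two revisions' raw wikitext."""
    return list(difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(), lineterm=""))

# Offline demo with invented wikitext (no network needed):
old = "'''Sachin Tendulkar''' is a cricketer.\nHe was born in 1973."
new = "'''Sachin Tendulkar''' is a former cricketer.\nHe was born in 1973."
assert hashlib.sha1(old.encode()).hexdigest() != hashlib.sha1(new.encode()).hexdigest()
changes = [line for line in manual_diff(old, new)
           if line.startswith(("+", "-")) and not line.startswith(("+++", "---"))]
```

As noted above, this hand-made diff is cruder than MediaWiki's own engine (no intraline highlighting), but it never comes back empty or uncached the way the rvdiffto field can.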
[08:47:36] :/ [08:47:46] I've been waiting for a response there for 45 minutes [08:48:14] and the bots constantly talking kind of makes it hard to see other messages in between them [11:07:42] !hss [11:07:42] https://upload.wikimedia.org/wikipedia/mediawiki/6/69/Hesaidsemanticga2.jpg [11:21:24] not very funny... [13:01:19] Hello! I have an old MediaWiki install that was spammed to hell. I am trying to roll back all the spam, recover the original pages and then set up a new install. I'll copy/paste the data manually if I need to. It wasn't a very large wiki. But I would like to recover it. [13:07:23] Delpy: so to get this straight, by spam I assume you mean people who went on your wiki and made bad edits? [13:08:24] Zppix: Not just bad edits- they put irrelevant (spam) information on the pages. I don't even know how they got in. I might not have gotten a setting right. [13:09:18] If you're able to roll back all the edits, you should have the original revisions and then be able to update to the latest MediaWiki [13:09:52] what version of mediawiki are you on currently? [13:10:03] Zppix: I am going over the site now. I am rolling the pages back manually- 1 by 1. [13:10:40] 1.25.3 [13:12:56] Delpy: once you're finished with that, this manual will be helpful, and myself and other volunteers are around (link: https://www.mediawiki.org/wiki/Manual:Upgrading ) [13:13:19] ok [13:14:53] Delpy: also note that 1.25.3 is an ancient, unsupported, insecure version [13:15:01] but you might know by now. :P [13:15:40] I know databases have significant advantages over flat files, but I am frustrated at how hard it's been for me to properly back up the database. I feel much better having a simple text copy of the files. [13:16:03] andre__: Yep. :-) [13:16:39] You can use mysqldump or similar...
And it's a one-liner to back up a database [13:16:53] I have an even older wiki database backup and I have no idea how to extract info from it and restore it (it is from an even more ancient version of MediaWiki). [13:17:34] Reedy: The problem is not backing up per se. The problem is extracting the data afterwards if your new installation isn't *exactly* the same as the old. [13:18:08] Hi all, I'm installing MediaWiki. I uploaded the mediawiki folder, but I get a 500 internal error page when I try to browse to home/w/ [13:18:17] Again, these are my own frustrations. Maybe they have more to do with my own lack of MediaWiki management ability than MediaWiki itself. [13:19:04] this is the error log: [Tue May 09 07:05:48 2017] [error] [client IP ADDRESS] Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace. [13:19:49] any ideas? [13:22:28] zanato: sounds like a mistake in your configuration [13:22:59] # Use PHP5.6 as default AddHandler application/x-httpd-php56 .php # BEGIN WordPress AddHandler application/x-httpd-php70 .php RewriteEngine On RewriteBase /BLAHBLAH/ RewriteRule ^index\.php$ - [L] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule . /~whatstn7/index.php [L] # END WordPress [13:23:04] zanato: did you try to configure a short URL config? [13:23:06] that's my .htaccess [13:23:15] thing is I've removed WordPress [13:25:40] zanato: so if you removed WordPress, shouldn't you remove the .htaccess lines dealing with WordPress? seems they are still in there. [13:32:10] [Tue May 09 07:30:52 2017] [error] [client IPADDRESS] File does not exist: /home8/local/public_html/w [13:32:59] so what is inside your public_html directory ?
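The mysqldump suggestion above really is a one-liner. A hedged sketch, where the database name, user, and output path are all placeholders for your own values:

```shell
# Back up a MediaWiki database in one line (wikiuser/wikidb are placeholders).
mysqldump -u wikiuser -p --default-character-set=binary wikidb | gzip > wikidb_backup.sql.gz

# Restoring into a fresh database later:
# gunzip -c wikidb_backup.sql.gz | mysql -u wikiuser -p wikidb
```

A plain SQL dump like this sidesteps the restore problem mentioned above only partially: it restores cleanly into the *same* MediaWiki schema version, while moving to a newer version still requires running the upgrader afterwards.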
[13:33:56] the mediawiki folder is in the home directory [13:34:29] public_html has .htaccess as well as error .shtmls [13:34:51] should I move the wiki folder to public_html? [13:35:43] no [13:35:59] what kind of server is this? your own? [13:36:21] nah, it's a shared hosted server [13:36:33] do they provide instructions ? [13:36:52] not with wiki installs [13:36:59] mainly wordpress sites here [13:37:16] can you link them? [13:39:50] https://my.bluehost.com/hosting/help/htaccess [13:40:10] considering the error, I might try adding a redirect to the .htaccess file [13:45:19] right, is this the only thing you will host on that site ? [13:45:36] at the moment yes [13:45:55] ok, move the mediawiki folder into public_html [13:47:10] should i change the permissions from 755? [13:48:02] no, that should be ok. the installer later will guide you on that, I believe... [13:48:34] so once it's there, you should be able to go to http://domain/nameofmediawikidirectory/mw-config/index.php to start the installer [13:48:45] https://www.mediawiki.org/wiki/Manual:Installation_guide [13:50:12] You do not have permission for this request /w/mw-config/index.php [13:52:10] do you have permissions on that file ? [13:54:54] it worked :) [13:55:06] looks like I had to apply the permission change recursively [13:56:08] thanks thedj! [14:16:05] Hey guys, do you know of a reliable offline wiki software? [14:21:24] zanato: good ! [14:22:21] zanato: btw. please keep reading all the documentation carefully. web server configuration is not easy, and a mistake can have quite a few nasty consequences. [14:22:54] Delpy: kiwix.org is probably the most reliable thing there is. [14:23:20] oh, you mean for running an offline wiki ? or just reading wiki content offline ? [14:24:13] thedj: The former. [14:24:46] i would say there is no such thing as an offline wiki, at most a locally hosted or not-internet-connected one :) [14:25:44] thedj: There should be one.
I find I prefer to work in the wiki workflow rather than the regular word processor workflow for some projects. [14:27:54] well, anyone can install a local server (if you know something about servers). I'm not sure what the easiest one is for something like that, however. [14:28:33] it'd be great if there were an official docker container though [14:29:12] there is, but it's badly maintained. https://hub.docker.com/r/wikimedia/mediawiki/ [14:29:52] thedj: Imagine if you had to set up a local server just to write a word processing document. I think the wiki format is good for both online and offline uses. [14:30:45] i'm personally just fine with modern text editors. [14:31:39] oh, how'd I not find that page [14:31:48] https://www.mediawiki.org/wiki/Docker [14:31:52] does not mention it [14:32:05] djr013_: the github page has some pull requests etc. [14:33:02] i think if someone was really enthusiastic about it, it would help push it forward. Internally for Wikimedians it's little more than a pet project with no production consequences, so it doesn't get much attention. [14:33:43] Hello, Special:RequestAccount shows "Make sure that you first read the Terms of Service before requesting an account", how can I replace that with a custom message? [14:33:51] oh, yeah it's obviously more of a mediawiki thing than a wikimedia thing [14:34:10] Delpy: a lot of programs are like that though, they've simply taken all of the basic configuration steps out of building each instance [14:34:45] djr013_: there's lots of info in https://phabricator.wikimedia.org/T92826 (read bottom up, because there's a lot of outdated info at the top) [14:34:53] djr013_: ?
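For the "local wiki without being a sysadmin" use case above, the wikimedia/mediawiki Docker image that was linked can be tried in two commands. A hedged sketch: the port mapping and the assumption that the image serves on port 80 should be checked against the Docker Hub page, since the log itself notes the image is badly maintained.

```shell
# Hedged sketch: run the wikimedia/mediawiki image mentioned above as a
# locally hosted wiki. Port and tag are assumptions - verify on Docker Hub.
docker pull wikimedia/mediawiki
docker run -d --name local-wiki -p 8080:80 wikimedia/mediawiki
# then browse to http://localhost:8080/ and walk through the installer
```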
[14:35:10] Delpy: word processing and similar "local" document editors [14:35:27] DrSlony: you can find the name of any interface message of mediawiki by appending ?uselang=qqx to the url [14:35:47] then edit the wiki page MediaWiki:name_of_interface_message [14:35:49] djr013_: Well yeah- otherwise you'd have to be a sysadmin just to write a memo. [14:36:06] thedj great, thanks. [14:37:06] I want to install Parsoid, but I don't have node.js installed. Is it reasonably possible to install node.js w/o root access? [14:37:43] thedj: yeah, I did see that phabricator page, it's linked from the mediawiki wiki [14:41:04] extrarius: do you have shell access ? [14:41:28] yes, my main concern is getting all the dependencies without being able to use apt-get [14:42:07] yes, that's likely the biggest challenge. or you need to find precompiled binaries for the OS it will run on. [14:42:25] uname -r will tell you what kind of system you are on. [14:42:38] but honestly, I wouldn't get my hopes up [14:42:49] I already have a few custom apps running, mainly memcached along with the python interface, but luckily those didn't require many deps [14:43:22] djr013_: All documents are local somewhere. :-) [14:43:35] wow deep :) [14:46:58] uname -r gives "3.2.61-grsec-modsign", which I'm not at all familiar with [14:47:19] uname -a gives Linux waldport 3.2.61-grsec-modsign #1 SMP Tue Aug 12 09:58:26 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux [14:49:03] custom kernel with the grsecurity mod [14:50:34] are there any tools that take advantage of the apt-get repos but can be run as a low-privilege user? I can understand binary packages not working, but source packages would be very nice [14:50:50] i'm not too familiar with that personally [15:39:59] OK, so I've manually backed up my wiki pages and am going to start anew with a fresh mediawiki install.
Now, I can input the wiki pages manually (it's less than 50 pages), but how do I let MediaWiki know that all the images have been uploaded and it's not starting from nothing (if that makes sense)? [15:44:10] Delpy, how would MediaWiki "start" doing something (what) with the images? [15:45:07] Do https://www.mediawiki.org/wiki/Manual:Importing_external_content and https://www.mediawiki.org/wiki/Manual:ImportImages.php help? [15:45:31] andre__: What I mean is- how do I get MediaWiki to recognize that there are already uploaded images if I upload the images folder, but the database is new..? [15:45:47] Checking... [15:47:15] https://www.mediawiki.org/wiki/Help:Export and https://meta.wikimedia.org/wiki/Help:Import in theory [15:47:21] not sure though what you've already done and why :) [15:47:55] andre__: I simply have a copy of the images folder. [15:48:11] And am going to set up a new database/mediawiki install [15:48:24] Delpy: have you followed any steps / tutorial somewhere to export your wiki content? [15:49:26] andre__: No. I've saved the pages manually and am going to enter them manually again (because my old wiki is old and seriously compromised). The images are all I'm worried about now, and I'm checking the links you sent. [15:56:43] Hello, I'm trying to add this animation model https://fr.wikipedia.org/wiki/Modèle:Animation to this wiki: http://lagbt.wiwiland.net without success. Thanks [16:23:55] Goultard: Hi. Do you have a question? [16:45:12] Hi everybody [16:47:24] Maybe someone can tell me why this code does not turn into a signature? https://meta.wikimedia.org/w/index.php?title=Response_to_2017_ban_in_Turkey/Sig&diff=16722045&oldid=16722035 [16:49:28] hey, i tried to update my MW version from 1.26 to 1.28 and now whenever i try to access a page it displays a 403, can someone explain how to solve this? [17:04:42] OK, so how do I prevent anyone but the admin from creating new pages or editing pages?
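The ImportImages.php manual linked above answers the "images folder with a fresh database" question: the files themselves carry no database state, so you point a maintenance script at the folder and it creates the file pages. A hedged sketch; the paths are placeholders.

```shell
# Hedged sketch: register already-uploaded files with a fresh MediaWiki
# database using the importImages.php maintenance script linked above.
# Run from the new wiki's root; the source path is a placeholder.
php maintenance/importImages.php /path/to/old/images/copy

# For the page text, Special:Export on the old wiki plus importDump.php on
# the new one would avoid retyping, if the old wiki is still readable:
# php maintenance/importDump.php backup.xml
```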
[17:08:57] Delpy: https://www.mediawiki.org/wiki/Special:MyLanguage/Manual:User_rights [17:09:08] Zppix: Ah, thanks! [17:09:14] Delpy: no problem [17:09:41] Zppix: I've become paranoid & don't want my new install to be insecure for longer than it needs to be! [17:11:01] Delpy: be sure to use https [17:11:14] Zppix: Definitely. [17:19:06] hey, i tried to update my MW version from 1.26 to 1.28 and now whenever i try to access a page it displays a 403, can someone explain how to solve this? [17:23:56] facto: how did you update? can you still access the front page? [17:24:12] facto: what does the log file of your webserver software say? [18:04:22] Is it normal for a MediaWiki installation to be hit with spam less than 2 minutes after installation??? [18:21:33] Delpy: I guess it's not unheard of. [18:21:44] !captcha [18:21:44] For more information about CAPTCHAs and MediaWiki, see . [18:22:12] Niharika: I saw it with my own eyes- not 2 MINUTES after I installed it!!! [18:22:49] Niharika: The hilarious thing is that the server decided that *I*, the admin, needed to do a CAPTCHA, but not the bots!!! [18:22:50] Delpy: :( I wonder why people don't spend their time doing something more useful. [18:23:01] Delpy: Anyway, captchas should help. Also... [18:23:02] !nuke [18:23:03] http://www.mediawiki.org/wiki/Extension:Nuke [18:23:05] If it's a known domain and stuff... [18:23:10] There are things already crawling it [18:28:12] Good evening. Does MediaWiki software have cross-site messaging please? I know login (on Wikipedia anyway) is unified, but if I get a message elsewhere, does it show up wherever I log in, or only on the site it came from? [18:29:29] With Echo installed, notifications do appear on other sites [18:30:35] I'm trying to install Echo on my 1.28 wiki, but the download does not include extension.json [18:30:58] do I need to do something to generate this file, or should I just use the 'old' method?
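The Manual:User_rights link above covers the "only the admin can edit" question. A minimal hedged sketch of what that looks like appended to LocalSettings.php; `sysop` is the built-in administrators group, but the exact right names should be verified against the manual for your MediaWiki version:

```php
// Hedged LocalSettings.php fragment for an admin-only wiki, per the
// Manual:User_rights page linked above.
$wgGroupPermissions['*']['edit'] = false;          // anonymous users: no editing
$wgGroupPermissions['*']['createpage'] = false;    // ...and no page creation
$wgGroupPermissions['user']['edit'] = false;       // registered users too
$wgGroupPermissions['user']['createpage'] = false;
$wgGroupPermissions['sysop']['edit'] = true;       // admins keep editing rights
$wgGroupPermissions['sysop']['createpage'] = true;
```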
[18:31:31] old method [18:31:32] Not everything was converted at the same time [18:32:00] And not everything is converted yet [18:36:21] Flow seems to be an up-and-coming extension and Echo is a dependency, so I was surprised both still require the old method [18:37:55] Mixture of reasons. Complexity of their loader code, and active development making it harder still [18:38:05] They have both been converted for REL1_29, thankfully [18:38:31] Niharika: Reedy: It's a known domain name- they've hit it before, but I didn't expect this fast of a response. Also, I want to allow only the admin to make changes, or anyone the admin approves individually. [18:38:49] That probably needs FlaggedRevs [18:39:02] Or you just disable anonymous registration and close signup [18:41:27] Delpy: I use reCAPTCHA 2 from Google for account creation and it works wonderfully [18:42:02] I have it set to always show for account creation [18:42:43] thanks extrarius. This is a closed wiki though. I only want to allow a handful of people to edit it for now. [18:44:05] well, another strategy is to remove createaccount from everybody except admins; then only you can create accounts. If you have their email address, you can create the account and set it to email them a random password [18:44:27] extrarius: Don't know how to do that, though. [18:49:12] Delpy, something like this in your LocalSettings.php : https://pastebin.com/jxJHSncE [18:51:30] Hey thanks, extrarius! [18:56:47] Once you've added that, you can use the Special:CreateAccount page to create accounts for others [18:57:22] extrarius: Ah very good. This will do just fine for my small wiki. [19:10:33] Should this not enable editing for the admin, even if all editing was disabled in a previous line: $wgGroupPermissions['administrators']['edit'] = true; [19:11:16] Delpy: the name of the built-in "Administrators" group is 'sysop', not 'administrators' [19:11:24] so try that [19:11:38] MatmaRex: D'oh!
Thanks [19:15:33] I want to have user names decorated depending on certain conditions, kind of like you can earn 'badges' on some forums. Is there an extension that does something like this? [19:24:11] extrarius, don't know, as you have not explained the conditions. :) However, searching for "mediawiki badges extension" on an internet search engine, one of the first results is https://www.mediawiki.org/wiki/Extension:OpenBadges [19:34:10] yeah, OpenBadges is not what I want. I'd like to have a "special" set of groups that have no associated privileges (neither granted nor revoked) where membership in these groups decorates the username before it is displayed (and I don't care how multiple ones combine) [19:51:15] Is there a good hook for modifying any instance of a link to a user's page? Just the generic LinkBegin / LinkEnd ? [19:59:17] I think there's a link hook [21:14:33] Does anyone know how to change the text that appears when you hover over a link? [21:18:20] Hi. [21:19:25] I need to change the description of a link on hover. Normally it shows the page name; I need it to show a little description on hover. Is that possible? Thanks [21:51:44] How the heck do you delete a user from MediaWiki? [21:54:10] You can't [21:55:45] Reedy: Is that a serious answer? [21:56:23] Yup [21:57:07] Reedy: And who thought that was a great idea? [21:57:26] Pass [21:58:07] It's a pretty basic function.
[21:58:09] Maybe [21:58:14] For Wikipedia's purposes, we have no desire to delete users [21:58:39] for a different answer, check out Extension:User Merge and Delete [21:58:48] no idea how well it works or if it's compatible with recent MW versions [21:59:02] as I've never had the need to delete users via the UI [21:59:11] Skizzerz: I am looking at that page right now, but it says: "you cannot delete a user A without having merged the user A to B" [21:59:26] yes, so that contributions are maintained without corrupting the database [21:59:30] Some sites have a user they merge all the crap to [21:59:38] *why* are you looking to delete a user? [22:00:28] (not trying to pick apart your reason, just want to understand where you're coming from and perhaps offer alternatives) [22:00:46] Skizzerz: Because all my users are spammers that got into my wiki because of MediaWiki's lacking default security. And now I have a bunch of bot/spam users apparently "stuck" in the wiki who can no longer edit or create pages. [22:01:08] ok [22:02:02] so, the issue with deleting users is that you can't just delete a user and expect things to work [22:02:17] if you delete a user, there need to be no references to that user, meaning they can't have any contributions [22:02:38] in the MediaWiki UI, all* actions are reversible, so if you delete a page, it can be viewed/undeleted later [22:02:46] Skizzerz: But these users did not contribute anything but spam, which I have nuked. [22:03:11] Skizzerz: Exactly- which means that the darned spam is *still* there!! [22:03:14] so deleting via the UI (including Extension:Nuke) doesn't actually remove contributions from the db, meaning deleting the user without cleaning up those contribs will cause your db to be in a corrupted state [22:03:40] the solution, if you want to clear out the spam and spam users, is to just throw some SQL at it to wipe it out at the db level [22:03:52] it isn't a perfect solution, but it works in I'd say 80% of cases [22:04:11] ok...
Is there a way at least to clear out the spam via the UI or an extension?? [22:04:42] I don't know of any good ways to clean spam via UI/extension in a way that makes it go away for good... [22:04:51] did you search mediawiki.org? [22:05:05] Yes [22:06:14] are you comfortable enough around databases to script your own thing? I can at least tell you what the main tables to look out for are [22:06:51] (I have a script I use to nuke spam, but I offer that as a paid service so I'm a bit reluctant to just share it, sorry) [22:08:35] but in general you'll want to clean up the user table, as well as page, revision, and text. Then you'll want to run the rebuildall maintenance script to clean up recentchanges and the like [22:09:16] if you've previously deleted the edits, the archive table stores deleted text [22:11:00] depending on the nature of the spam, that might be enough. If the spammers were using MW features such as file uploads, categories, templates, or extensions you have installed, then you may need to clean additional tables [22:12:07] hope that helps! I see that https://www.mediawiki.org/wiki/Manual:Combating_spam has some links to utilities as well that you may be able to use as a starting point for customizations instead of needing to start from scratch [22:21:07] (there's also the deleteArchivedRevisions and deleteUnusedUsers maintenance scripts, if you don't have any other deleted revisions you want to keep around) [22:21:11] Delpy: ^ [23:11:05] Hello all. I'd like to do a complete snapshot/mirror of Wikipedia as it is today, with pictures. Could someone point me in the right direction please? [23:11:32] I just need to know which data files these are, I'm not sure after seeing the directory listings [23:28:31] Skizzerz: thanks. [23:29:00] Not sure how to invoke the script, though. I am on a shared server. Don't think they'd give me shell access.
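The maintenance scripts mentioned above are invoked with the PHP CLI from the wiki's root directory. A hedged sketch of the cleanup pass described in this exchange; verify the script names and flags against your own maintenance/ directory with --help, and take a database backup first, since these operations are not reversible.

```shell
# Hedged sketch of the spam-cleanup pass described above. Back up the
# database first; paths and flags are placeholders to verify with --help.
cd /path/to/wiki
php maintenance/deleteArchivedRevisions.php --delete   # purge archived (deleted) revision text
php maintenance/rebuildall.php                         # rebuild recentchanges, link tables, etc.
```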
[23:29:14] you can ask :) [23:29:18] (deleteArchivedRevisions.php, that is) [23:29:23] if you're paying for hosting, ssh access is typically on offer [23:29:29] if it's free hosting, it typically is not offered [23:29:39] No, not free. [23:29:48] then you should probably be able to get shell access [23:30:02] what host? [23:30:07] (if you don't mind me asking) [23:30:36] SiteGround. [23:31:38] I haven't dealt with their customer service much yet. [23:32:08] you don't need to, it's on by default there [23:32:13] you just need to set a couple of things up first [23:32:16] Oh, ok. [23:32:26] I didn't know. I've never used it before with them. [23:32:40] see https://www.siteground.com/tutorials/ssh/ for some links, but the gist of it is to log into cPanel, scroll down to the bottom and click the SSH/Shell Access icon, then set up a new keypair [23:32:52] Ah. [23:32:56] (or add an existing public key you've generated elsewhere) [23:33:12] Thanks. This is good info. [23:33:58] they run it on a nonstandard port (18765), so you'll need to point your ssh client at that instead of the default [23:34:28] hmm. k [23:34:56] if you're on Windows, the first two tutorials there give full step-by-step instructions ("How to enable SSH through cPanel" and "How to open an SSH connection using Putty") [23:35:30] I'm on a Mac, but I don't think that'll make any significant difference. [23:35:40] it's easier on mac :) [23:35:51] Except that. Hehe [23:36:24] still do the first link, but then after that you can do the rest via mac's Terminal app [23:36:40] k [23:37:25] the first command you'll run is ssh-add path/to/private/key [23:37:41] (replace path/to/private/key with wherever the private key is saved) [23:37:58] then ssh -p 18765 yoursite.com [23:38:06] you'll be prompted for a username, use your cpanel username
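The two commands from the walkthrough above, as a hedged sketch; the key path, username, and hostname are placeholders for your own values:

```shell
# Hedged sketch of the SSH steps above; key path, user, and host are placeholders.
ssh-add ~/.ssh/siteground_key            # load the private key set up in cPanel
ssh -p 18765 cpaneluser@yoursite.com     # port 18765 instead of the default 22
```

Note that ssh takes the port via the -p flag (or a `Port` line in ~/.ssh/config); a `host:port` suffix is not valid ssh syntax.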