[01:43:42] Person1234: what do you mean?
[02:24:17] It'll take ~1 month to import Wikipedia on this PC.
[02:25:38] oh well
[02:28:28] pressure679: I am mirroring a sister wiki.
[02:28:49] pressure679: I agree the process is less than ideal. importDump.php ran out of memory, so I had to split the dump into separate dumps.
[02:29:34] pressure679: I think you can get an SQL dump from wikix or another Wikipedia mirror. They may host these dumps, and importing them may be quicker.
[02:48:19] Sveta: I am clueless. I have this program which can do it in a bit over a week (without compression and encryption), judging by the rate of speed over 3 days, and I have this program which indexes the XML dumps. I want the least read/write wear on the drive, so I think I am going with one of the other 2 programs.
[02:48:49] I suspect MySQL would be a better choice, but me -.
[02:50:48] Which programs are those?
[02:53:03] My own, written in golang; I am mostly self-taught though.
[02:53:12] - A programmer by training.
[02:53:39] You mentioned three programs.
[02:55:44] No, I meant 2: one with a third-party library from GitHub and one with simple indexing by regular expression lookup. There is more to the programs though.
[02:59:36] ok
[08:40:56] Hi. I have a MediaWiki website, and when I click on a link in Special:SpecialPages I get an error saying that one of the characters in the page header is not valid. My site is not in English, and I understood this can happen because of some problem in the URL encoding.
[08:41:10] Can someone please help me debug this?
[10:21:29] Drizt, is the website public?
[13:33:32] Any idea why MW wouldn't be treating new interwiki links right? I tried to add the phab prefixes and it's seeing them as namespaces.
[15:20:26] Hi. I'm trying to use Wikipedia's API from codepen.io (codepen.io is the page loaded in my browser, and I'm typing in the console). I'm getting "No 'Access-Control-Allow-Origin' header is present on the requested resource". What should I do?
[15:29:02] Willwhite: add &origin=* to the URL
[15:30:12] That is, add it to the API URL you are fetching.
[15:31:00] Thanks bawolff. That worked.
[15:31:11] Glad to help
[15:32:31] Do you know if/where that is documented?
[15:33:27] It should be on the main API help page (if you just go to api.php).
[15:35:53] Found it. Thanks.
[15:38:49] That's a better solution than using JSONP, which is the answer I found on other forums. Being unauthenticated is so common, especially when getting started, that I think this should be more prominent.
[15:41:55] Yeah, JSONP is easy to use but kind of terrible.
[15:42:55] I think we used the origin parameter to be extra paranoid, although it might not have actually been all that necessary.
[16:07:37] I've added a note about &origin=* to https://www.mediawiki.org/wiki/API:Main_page#A_simple_example.
[16:09:22] Thank you :) I appreciate that you approved the docs
[16:09:31] *improved
[16:37:59] No problem. Thanks for making the API!
[16:40:13] It's used in one of the Free Code Camp challenges (Build a Wikipedia Viewer).
[19:21:58] I have a multiple-wiki setup using the same base (1.27.1), same domain but different paths, and I'm trying to upgrade to 1.27.3, but the update script gives me an "Undefined index" error for REQUEST_URI.
[19:22:13] Is there documentation on this somewhere, or a changelog that explains why this fails for the 1.27.3 upgrade?
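(Editor's note on the importDump.php memory problem mentioned around 02:28: below is a minimal sketch of one way to split an uncompressed pages-articles XML dump into smaller files that can be imported one at a time. It assumes Node.js with TypeScript, the standard dump layout where the <siteinfo> header precedes the first <page> element, and placeholder values for the input path and chunk size; it is not necessarily how the split was done in the log.)

```typescript
import { createReadStream, createWriteStream, WriteStream } from "node:fs";
import { createInterface } from "node:readline";

const INPUT = "pages-articles.xml";   // hypothetical path to the uncompressed dump
const PAGES_PER_CHUNK = 50_000;       // tune until importDump.php stays within memory

async function splitDump(): Promise<void> {
  const rl = createInterface({ input: createReadStream(INPUT), crlfDelay: Infinity });

  const header: string[] = [];        // <mediawiki ...> opening tag plus the <siteinfo> block
  let out: WriteStream | null = null; // current chunk file; null while still reading the header
  let chunkIndex = 0;
  let pagesInChunk = 0;

  // Each chunk repeats the dump header so importDump.php accepts it on its own.
  const openChunk = (index: number): WriteStream => {
    const ws = createWriteStream(`chunk-${String(index).padStart(4, "0")}.xml`);
    ws.write(header.join("\n") + "\n");
    return ws;
  };

  for await (const line of rl) {
    const trimmed = line.trim();
    if (out === null) {
      if (trimmed.startsWith("<page")) {
        out = openChunk(chunkIndex++);     // first <page> seen: header is complete, start chunk 0
      } else {
        header.push(line);                 // still inside the header
        continue;
      }
    }
    if (trimmed === "</mediawiki>") break; // the closing tag is re-added per chunk below
    out.write(line + "\n");
    if (trimmed === "</page>" && ++pagesInChunk >= PAGES_PER_CHUNK) {
      out.end("</mediawiki>\n");           // close this chunk and start the next one
      out = openChunk(chunkIndex++);
      pagesInChunk = 0;
    }
  }
  if (out !== null) out.end("</mediawiki>\n");
}

splitDump().catch((err) => console.error(err));
```

Each resulting chunk-NNNN.xml can then be fed to importDump.php separately.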
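(Editor's note on the cross-origin error discussed at 15:20: here is a minimal sketch of the suggested fix, appending origin=* to an unauthenticated Action API request so the server sends the Access-Control-Allow-Origin header. The particular query, a search for "CORS", is only illustrative.)

```typescript
// Unauthenticated, cross-origin request to the MediaWiki Action API.
// origin=* is the key part; without it the browser blocks the response.
const params = new URLSearchParams({
  action: "query",
  list: "search",
  srsearch: "CORS",   // illustrative search term
  format: "json",
  origin: "*",        // anonymous CORS, as suggested in the log
});

fetch(`https://en.wikipedia.org/w/api.php?${params}`)
  .then((res) => res.json())
  .then((data) => console.log(data.query.search));
```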
[19:40:31] or... maybe I should just read the docs
[19:40:32] AwoL: It's probably something in your LocalSettings.php.
[19:40:55] bawolff_: This page has some suggestions on updates: https://www.mediawiki.org/wiki/Manual:Wiki_family
[19:41:04] It looks like it's just due to the REQUEST_URI environment variable not being set.
[19:41:48] AwoL: You can also apply DB updates manually (looking at the new things in maintenance/archives/).
[19:42:26] ah neat
[21:30:32] Hi again. I'd like to get a random quote from Wikiquote using an API (the MediaWiki one, I guess). Can I get just one quote, or do I have to get a page with quotes on it?
[21:43:16] willwhite_: page with quotes, probably
[21:48:48] Can the API guarantee I'll get a page with a quote?
[21:49:24] E.g. I just got https://en.wikiquote.org/wiki/715 and there's no quote.
[21:51:09] heh, why does that even exist
[21:52:04] Wikiquote has 30k pages, and 2k of them are those "year placeholders". lol
[22:05:33] One of our most successful projects.
[22:34:31] Esther: just because your dictionary doesn't have the same definition of success as the one the WMF uses, there's no need to take such a dismissive tone.
[22:45:59] Why is Wikiquote a MediaWiki and not just a database of quotes?
[22:47:30] When all you have is a hammer, everything looks like a nail.
[22:48:57] Do you know if there's an effort to change it to a database?
[22:56:51] Commons is the first site we need to migrate from MediaWiki, if we could ever migrate anything away from MW.
[23:00:58] Would it work to use a database for new entries?
[23:03:14] And use experience with that to see how best to migrate the older entries?
[23:03:43] NotASpy: In a way, we are: we're migrating the structured content to Wikibase.
[23:04:29] Yeah, moderately excited for structured data, particularly if it can do something with EXIF data in due course.
[23:07:30] James_F: Are new entries going into Wikibase?
[23:07:42] willwhite_: All the entries will.
[23:14:55] James_F: May as well have new entries going in first, before all of them.
[23:15:21] Right?
[23:15:58] Hi, I'm looking for a database dump of only 2 tables (pages and pagelinks) of Wikipedia. I'd like to look up all redirects to specific pages. Any pointers?
[23:16:59] df4ew: Sounds like you want https://dumps.wikimedia.org/backup-index.html
[23:17:19] willwhite_: I don't know that team's deployment plans; it's a long way off at this point.
[23:19:03] James_F: Thanks, but I looked at that and it seemed too big. I don't need full dumps, just enough to check for redirects.
[23:19:57] James_F: kind of like this tool: http://69.142.160.183/~dispenser/cgi-bin/rdcheck.py?page=Neymar
[23:22:42] James_F: Please would you tell me the best way to contact that team?
[23:25:21] willwhite_: https://commons.wikimedia.org/wiki/Commons_talk:Structured_data I think.
[23:32:23] Thanks, James.
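(Editor's note on the random-quote question at 21:30: a sketch of the "page with quotes" approach using the Action API's list=random module against English Wikiquote. The retry heuristic for skipping the numeric "year placeholder" pages mentioned in the log is an assumption, not something the API provides.)

```typescript
// Fetch one random main-namespace title from English Wikiquote.
const params = new URLSearchParams({
  action: "query",
  list: "random",
  rnnamespace: "0",   // main namespace only
  rnlimit: "1",
  format: "json",
  origin: "*",        // allow unauthenticated cross-origin use from a browser
});

fetch(`https://en.wikiquote.org/w/api.php?${params}`)
  .then((res) => res.json())
  .then((data) => {
    const title: string = data.query.random[0].title;
    if (/^\d+$/.test(title)) {
      // Purely numeric titles like "715" are the year placeholders; retry for a real quote page.
      console.log(`Got a year placeholder (${title}); try again.`);
    } else {
      console.log(`Random quote page: https://en.wikiquote.org/wiki/${encodeURIComponent(title)}`);
    }
  });
```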
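(Editor's note on the redirect-lookup question at 23:15: without downloading any dumps, the Action API's prop=redirects lists the pages that redirect to a given title, much like the rdcheck tool linked above; a sketch follows, using the "Neymar" example from the log. The per-wiki listings on dumps.wikimedia.org also offer per-table SQL dumps such as page.sql.gz and redirect.sql.gz, which are far smaller than the full page-text dumps.)

```typescript
// List all redirects pointing at a given article title.
const params = new URLSearchParams({
  action: "query",
  prop: "redirects",
  titles: "Neymar",   // example title from the log
  rdlimit: "max",
  format: "json",
  origin: "*",
});

fetch(`https://en.wikipedia.org/w/api.php?${params}`)
  .then((res) => res.json())
  .then((data) => {
    const pages = data.query.pages;
    for (const id of Object.keys(pages)) {
      const redirects = (pages[id].redirects ?? []).map((r: { title: string }) => r.title);
      console.log(pages[id].title, redirects);
    }
  });
```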