[00:20:41] Hello there! Could someone help me with MediaWiki permission settings?
[02:30:18] Is there any way to suppress a group editnotice?
[08:30:00] Is there a way to query the mw api *and force rate-limiting*
[08:30:23] I need to debug the rate-limiting behaviour of my API consumer, but I'm here to double-check if there is a way to do so *without* intentionally spamming the API. :P
[08:34:23] ELLIOTTCABLE: not really, we don't rate limit read access.
[08:37:55] hrm, that's strange, because I'm getting messy crashes due to HTTP 429 Too Many Requests responses
[08:38:10] (this is on Wikimedia properties, not a default MediaWiki install)
[08:39:01] oh yeah, that's a recent thing
[08:39:08] * legoktm looks up rate limit
[08:40:06] it's supposed to be 50/s from IP, with a burst of 250
[08:40:14] per IP address*
[08:40:29] are you making API requests in parallel?
[08:40:32] * ELLIOTTCABLE nods
[08:40:38] yes, many-many. :P
[08:40:56] so I'm currently implementing intelligent backing-off via 429's
[08:40:57] ok, please don't
[08:40:59] https://www.mediawiki.org/wiki/API:Etiquette
[08:41:08] > If you make your requests in series rather than in parallel (i.e. wait for the one request to finish before sending a new request, such that you're never making more than one request at the same time), then you should definitely be fine.
[08:41:09] heh, yep. very aware! hence current work.
[08:44:53] The overall goal: I'm trying to crawl *categories*, finding all categories that are a descendant of a given root category.
[08:45:09] is there a way I can do this with an API ‘generator’, instead of brute-forcing list=categorymembers calls?
[08:45:25] ELLIOTTCABLE: what is your consumer's user-agent, btw?
[08:45:51] currently? https://twitter.com/ELLIOTTCABLE/status/665445884847632384
[08:46:27] usually? let me generate it, hold on
[08:46:33] ori: why do you ask?
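[Editor's note: the "back off on 429" idea discussed above can be sketched roughly as below. This is a minimal, hypothetical illustration — the function names and the `fetch(url) -> (status, body)` shape are invented for the example, and per API:Etiquette the real fix is to make requests in series, with backoff only as a safety net.]

```python
import time

def backoff_delays(base=1.0, factor=2.0, max_delay=60.0):
    """Yield an unbounded series of exponentially growing delays (seconds)."""
    delay = base
    while True:
        yield min(delay, max_delay)
        delay *= factor

def fetch_with_backoff(fetch, url, max_attempts=5, base=1.0):
    """Call fetch(url) serially, sleeping between attempts whenever it
    reports HTTP 429 (here modelled as fetch returning the status code)."""
    for _attempt, delay in zip(range(max_attempts), backoff_delays(base)):
        status, body = fetch(url)
        if status != 429:
            return status, body
        time.sleep(delay)  # back off before retrying
    return status, body  # give up, return the last response
```

Requests stay strictly serial: each call waits for the previous one to finish, and a 429 only slows the loop down further.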
[08:46:49] ELLIOTTCABLE: so, you can just grab an sql dump of all the category links from https://dumps.wikimedia.org/enwiki/20151102/ (for enwiki for example) and then process the data however you'd like
[08:47:31] legoktm: I *think* I need more up-to-date information than that. If I can't generate it in one API call with a generator, I'll just query it in series over time, probably
[08:47:54] being able to relate a user-agent from the logs to an IRC nickname can be useful if we need to get in touch to ask you to impose stricter limits on your consumer
[08:48:03] ori: *nods*
[08:48:07] don't worry, we don't send black helicopters to your house or anything :)
[08:48:38] 'wikipedi.as/0.0.2 (http://wikipedi.as; by ELLIOTTCABLE (http://ell.io/tt)) DisambiguationCrawler'
[08:49:07] it's generated from the deploying user's package.json, it's an open-source thing; could be anything of that format
[08:49:37] ELLIOTTCABLE: you can also get mysql access to sanitized (private info is redacted) database replicas by using our tool labs dev platform - https://wikitech.wikimedia.org/wiki/Help:Tool_Labs
[08:49:53] hmmm, that *might* be faster.
[08:50:29] legoktm: is some sort of sign-up / specific access required for that? because this is a “hit a button and deploy your own copy of this” project, so such a thing wouldn't work for any users cloning the project
[08:51:25] You have to sign up for an account and someone does a quick check to make sure you're not a spambot or whatever. You could set up a special API endpoint hosted on tool labs and have people hit it?
[08:51:31] * ELLIOTTCABLE nods
[08:51:38] that could work; two separate tools
[08:52:18] the only thing I need the API for at the moment is determining if a particular page is a disambiguation page (for a wiki-specific definition thereof, but ...); that might be a useful tool to exist there *anyway*.
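[Editor's note: the "find all descendants of a root category" crawl described above is a graph walk. A minimal sketch, assuming a `subcats(cat)` callable that returns the direct subcategories of a category (in the real API this would wrap one or more list=categorymembers calls with cmtype=subcat, following cmcontinue for pagination) — category graphs can contain cycles, so visited categories must be tracked.]

```python
from collections import deque

def descendant_categories(subcats, root):
    """Breadth-first walk of the category graph rooted at `root`.

    `subcats(cat)` yields the direct subcategories of `cat`; the walk
    tracks what it has already seen because categories can form cycles.
    Returns all descendants in the order discovered.
    """
    seen = {root}
    queue = deque([root])
    found = []
    while queue:
        cat = queue.popleft()
        for child in subcats(cat):
            if child not in seen:
                seen.add(child)
                found.append(child)
                queue.append(child)
    return found
```

Because `subcats` is injected, the same walk works against the live API, a Tool Labs replica, or the categorylinks table from a dump.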
[08:52:42] I'll take a look, that sounds cool~
[08:53:22] ELLIOTTCABLE: In the early years Wikimedia's capacity was always just barely enough, and we're still a bit reactive in our approach
[08:53:41] ohhh
[08:53:45] wait, that's easy
[08:53:49] if you are interested in participating in the design of rate-limits, https://phabricator.wikimedia.org/T97204 is a good starting point
[08:54:35] ELLIOTTCABLE: https://en.wikipedia.org/w/api.php?action=query&titles=ABC&prop=pageprops look for "disambiguation" in the "pageprops" object
[08:55:03] legoktm: … wait,
[08:55:05] w
[08:55:08] what. how does it even know that.
[08:55:21] is “This category is a disambiguation category” first-classed in the data-model in some way that I didn't know about!?
[08:55:35] wow, this saves me so much freaking work, if it's reliable
[08:55:57] https://www.mediawiki.org/wiki/Extension:Disambiguator is how we mark disambiguation pages in the database
[08:56:57] basically any page that includes "__DISAMBIG__" is marked in the database as a disambiguation page. And that magic word is usually part of some meta template that they all include
[08:57:11] it's possible there are some pages without the template, but in that case, they'd probably also be missing the category
[09:01:35] mmmmm.
[09:01:44] is this used across all the major languages?
[09:02:03] I believe so
[09:02:11] Is there a quick way I can check?
[09:02:18] oh, obviously, checking the templates for that magic word
[09:02:23] legoktm: thank you so much. :D
[09:02:30] find a random disambiguation page on those projects and see if it's marked properly?
[09:02:32] you're welcome :)
[09:07:17] unrelated to absolutely anything, ‘hispanoescribas’ is now my new favourite word.
[09:10:19] agh. ja.wikipedia doesn't seem to use it, unless I've mangled navigating the language.
[09:10:27] https://ja.wikipedia.org/w/index.php?title=Template:Aimai&action=view
[09:10:30] er, =edit*
[09:12:54] Have you checked a disambiguation page?
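[Editor's note: a sketch of consuming the prop=pageprops response suggested above. This assumes the legacy JSON layout (pages keyed by page ID under `query.pages`), where Extension:Disambiguator's marker shows up as a `disambiguation` key in each page's `pageprops`; the function name is invented for the example.]

```python
def is_disambiguation(api_response):
    """Map each title in a parsed action=query&prop=pageprops&format=json
    response to whether it carries the "disambiguation" page property
    set by Extension:Disambiguator (via the __DISAMBIG__ magic word)."""
    result = {}
    for page in api_response.get("query", {}).get("pages", {}).values():
        props = page.get("pageprops", {})  # absent when the page has no props
        result[page["title"]] = "disambiguation" in props
    return result
```

The property's value is an empty string; what matters is whether the key is present at all.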
[09:13:28] https://ja.wikipedia.org/w/index.php?title=ABC&action=info says it is a disambig page
[09:14:09] ELLIOTTCABLE: you have to go a level deeper, __DISAMBIG__ is in https://ja.wikipedia.org/w/index.php?title=Template:Dmbox&action=edit
[09:14:19] legoktm: ahhah. I spoke too soon.
[09:14:24] * ELLIOTTCABLE goes back to checking
[09:18:52] it's amazing how differently different languages/communities do some of the same things
[13:17:28] hey all
[13:18:09] hi
[13:19:02] ...confused here... I'm installing multiple wikis on the same server, so I'm trying to follow the directions on the manual:wiki_family page...
[13:19:23] ... mediawiki.org/wiki/Manual:Wiki_family...
[13:19:47] I'm on step 6, second case, where you install wikis on the same domain but different paths...
[13:20:21] ...then it gives example code for what goes in your LocalSettings.php file...
[13:20:47] ...where do you put that file?
[13:21:31] ...say I'm installing 3 wikis, in /wiki1, /wiki2, and /wiki3... do I put the same file in all 3 directories?
[13:29:36] ...are there any issues with just installing 3 different times in 3 different directories, no need to go to step 6?
[13:30:21] ...I mean... any issues with just installing it 3x in 3 different directories... not following the wiki family portion of the manual?
[13:32:11] hrmm... okie dok
[14:10:44] Hello all. I've written a special page extension called SpecialRandomWanted, which takes the user to a random page that doesn't exist, but has references to it (redlinks). I know MySQL and I know PHP, I'm not so familiar with the mediawiki querying methods though. To make sure it only returns pages that don't exist, it does a left join from pagelinks to pages. I'm trying to put a "page_title is null" in to the where clause so
[14:10:57] only pages that don't exist are returned, but this makes the query error. removing this condition makes the query work, but not as intended.
[14:11:35] I get the impression I'm doing something wrong with regards to MediaWiki's SQL method, since I've tested the query directly on the MediaWiki db and it works. Could someone help?
[14:12:26] Here's a pastebin of some code that might help: http://pastebin.com/XnrJaY17
[14:31:38] 'ORDER BY RAND()', that's scary from a DBA point of view :P
[14:41:13] krispbacon: it throws "Error: 1054 Unknown column 'page_title' in 'where clause' (localhost)"
[14:41:34] you need to specify the page table in the "tables" array
[14:56:53] Vulpix: Thanks for the answer. Never worked with randoms in databases before, especially not MySQL, not too sure what to do but I'm not too bothered about performance and this seems to get the job done simply, so I'll leave it. Oh so the tables array acts like an aliasing array, rather than the from clause? Also, now the special page returns "". And how did you get that error code? All that happens when I
[14:56:55] load the page is "internal error"
[14:57:48] !debug
[14:57:48] For information on debugging (including viewing errors), see http://www.mediawiki.org/wiki/Manual:How_to_debug . A list of related configuration variables is at https://www.mediawiki.org/wiki/Manual:Configuration_settings#Debug.2Flogging
[15:00:17] you can use eval.php to test things and run $dbr->selectSQLText() to see the generated SQL
[15:08:48] Thanks a lot, I'll have a play around
[17:01:25] hello
[17:01:40] i've some problems connecting to the db to install mediawiki locally
[17:02:16] but i can connect from the command line with the same credentials so...
[17:02:35] it's probably a problem with how mediawiki is running
[17:02:45] can anyone help me out?
[17:05:46] <[PiR2|Bot]> softplay, what error messages do you see?
[17:06:21] DB connection error: Permission denied (127.0.0.1).
[17:06:22] Check the host, username and password and try again.
[17:06:30] in the installation wizard
[17:08:06] <[PiR2|Bot]> did you check the password?
[17:08:11] yes
[17:08:18] username and password are fine
[17:08:24] i double-checked it
[17:08:55] and i checked the variables of the db too
[17:09:04] i really cannot understand
[17:09:19] in the db channel people say it's a problem with the cms
[17:09:40] <[PiR2|Bot]> did you enter the account name anywhere? Was it root?
[17:09:42] cause from the command line i can use the user without problems
[17:09:51] no it wasn't root
[17:10:04] it was a specific account for the wiki usage
[17:10:13] i granted privileges
[17:10:37] wait let me check one thing, i'll be back in one minute
[17:11:27] GRANT USAGE ON *.* TO 'wikiDev'@'localhost.localdomain' IDENTIFIED BY PASSWORD '*DD98640C6870803AEDBDDA36DCCF8E54BC8B9A6D'
[17:11:34] is this enough?
[17:11:42] i mean it's usage
[17:12:23] <[PiR2|Bot]> maybe? try it. I'm not an SQL expert though.
[17:13:58] <[PiR2|Bot]> I found your stackoverflow question when searching for the error
[17:14:26] :)
[17:16:09] <[PiR2|Bot]> try GRANT ALL instead of USAGE
[17:17:32] <[PiR2|Bot]> GRANT ALL PRIVILEGES ON wikidb.* TO 'wikiuser'@'%' IDENTIFIED BY 'password' WITH GRANT OPTION ;
[17:18:30] <[PiR2|Bot]> softplay: ping
[17:18:41] yes i've just tried
[17:18:55] the grant table says that now it has all privileges
[17:19:07] let's see if the installation wizard works
[17:20:45] [PiR2|Bot], it doesn't work :(
[17:20:49] still
[17:20:56] same error
[17:22:00] <[PiR2|Bot]> try following all these steps as closely as possible? https://www.mediawiki.org/wiki/Manual:Installation/Creating_system_accounts
[17:24:21] <[PiR2|Bot]> maybe you should start over with the db/user and follow that exactly. Then it /should/ work (but Murphy's Law, of course)
[17:29:45] [PiR2|Bot], this page is recursive :O
[17:30:53] i didn't know i had to restart the mysql server, is this really needed?
[17:31:52] shouldn't be
[17:32:00] <[PiR2|Bot]> I honestly don't know.
[17:32:00] <[PiR2|Bot]> It wouldn't hurt though, so I guess it's worth a shot.
[17:40:56] <[PiR2|Bot]> Reedy: any suggestions for softplay?
[17:43:07] line 61: ulimit: cpu time: cannot modify limit: Permission denied
[17:43:17] wow
[17:53:01] [PiR2|Human], so i have to run the mediawiki installer as root?
[17:53:38] i mean, if this limit.sh is changing something so crucial
[17:54:03] <[PiR2|Human]> Reedy what do you think ^
[17:54:49] but can i do that? the server would be running as www-data anyway...
[17:54:52] mumble
[17:55:06] let me read the tutorial you gave me again
[18:02:45] i found some bug reports
[18:02:50] old though
[18:07:36] <[PiR2|Human]> link?
[18:08:42] https://www.mediawiki.org/wiki/Project:Support_desk/Archive_10 Error creating thumbnails/galleries
[18:08:54] https://bugzilla.redhat.com/show_bug.cgi?id=611308
[18:10:06] <[PiR2|Human]> are you using RHEL/Fedora?
[18:12:33] Fedora yes
[18:12:46] [PiR2|Human], but a newer version
[18:12:49] it's 20
[18:12:59] that bug report is about Fedora 13
[18:13:08] so it should already be fixed