[11:25:09] So I'm getting constant login-throttled errors on API login with a BotPassword (through Pywikibot). I've got the password-attempt throttles off for that wiki; anything else I can do to debug it?
[11:30:02] Lcawte: have you cleared the throttles after adjusting the settings?
[11:31:15] Yep
[13:54:28] Hey all, I have a tangential question about Apache, but figured I'd ask here since my sites are busy MW wikis. I've noticed (mainly due to some recent cron error output emails) that I tend to get a lot of deleted-but-still-open Apache logs (more often the .1 logs) after the daily logrotate. I am using delaycompress in the logrotate config for Apache,
[13:54:29] and it has a postrotate apache2 reload command, which does a graceful restart, but since there are so many web requests, I think it's just not enough time for the graceful restart to completely let go of the open logs, hence the processes still using the rotated files. Anyone know of a better way to handle this?
[14:05:22] It seems like putting a sleep statement right after the reload in the postrotate command might be sufficient. Any real drawbacks to that approach?
[14:45:51] trying to get a list of pages from my wiki to use with local scripts. basically want Special:AllPages in plaintext, without pagination, and local
[14:46:14] any recommendations? i'm using this in conjunction with edit.php and view.php
[14:47:21] cmc: The api?
[14:47:30] i can access that locally?
[14:47:44] Define locally
[14:47:51] command prompt. no wget/curl
[14:47:52] If you can browse your wiki, you can access the api
[14:48:13] edit.php and view.php are maintenance scripts
[14:48:14] You'd need to use something
[14:48:24] The closest you're gonna get is probably an SQL query then
[14:48:32] they run locally.
[14:48:49] can you point me to the code for Special:AllPages?
[14:49:11] nevermind
[14:49:31] it's /includes/specials/SpecialAllPages.php for those playing at home
[14:49:42] Yes, but you won't be able to just "run" that
[14:50:00] nope but i can hack with it : )
[14:50:24] sql.php might be easier
[16:18:52] easier for a one-off perhaps. i'm trying to do things in a way that will be easy to maintain : |
[16:44:51] Using the API would be easier to maintain...
[16:45:13] Hell, you could write a maintenance script to call the API internally and dump the data
[17:11:30] Any 'crat on mediawiki.org around? I need someone to give my bot the "bot" right so it doesn't flood RC: https://www.mediawiki.org/wiki/Special:UserRights/Dexbot
[17:12:22] Reedy: James_F ^ Please
[17:12:25] https://www.mediawiki.org/wiki/Special:Contributions/Dexbot
[17:12:29] Looks like you've not bothered already :P
[17:12:45] done
[17:12:51] I haven't started yet :D
[17:12:55] That's not much
[17:14:26] It'll do 1k more edits
[17:14:48] Fine.
[17:15:00] Oh, Reedy's done it already.
[17:15:11] [18:12:45] done
[17:15:27] Yeah yeah.
[17:16:05] Thanks
[17:48:56] Reedy: that was my goal initially. i settled for this for now. it's kinda gross
[17:49:13] php maintenance/sql.php --json --query 'SELECT `page_title` FROM `xdc_wiki`.`page` WHERE `page_content_model` = "wikitext" AND `page_namespace` = 0' | jq -r .[].page_title | xargs -P0 -I{} bash -c 'f={}; mkdir -p output/"${f%/*}"; php maintenance/view.php "$f" > output/"$f".wikitext'
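
For reference on the 11:25 throttle question: login throttling is controlled by $wgPasswordAttemptThrottle in LocalSettings.php, and the counters themselves live in the wiki's main object cache, so stale entries can outlive a settings change until the cache is cleared. A minimal sketch of the setting (values illustrative):

```php
// LocalSettings.php — a sketch; an empty array disables the login
// throttle entirely, which is reasonable only while debugging.
$wgPasswordAttemptThrottle = [];

// The default, for comparison: 5 attempts per 300 seconds.
// $wgPasswordAttemptThrottle = [ [ 'count' => 5, 'seconds' => 300 ] ];
```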
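
On the logrotate question (13:54–14:05): the usual pattern is exactly the sleep suggested above, since a graceful reload only asks workers to exit after finishing their current request. A sketch of the stanza, with the path, retention, reload command, and the 30-second delay all assumptions to adjust:

```
# /etc/logrotate.d/apache2 — a sketch, not a drop-in config
/var/log/apache2/*.log {
        daily
        rotate 14
        missingok
        notifempty
        compress
        # delaycompress leaves the newest rotated log (.1) uncompressed,
        # so workers that still hold it open can keep writing safely
        delaycompress
        sharedscripts
        postrotate
                # the graceful reload only asks workers to exit after
                # their current request; the sleep gives busy workers
                # time to finish and release the rotated descriptors
                /usr/sbin/apachectl graceful
                sleep 30
        endscript
}
```

The main cost is that the sleep blocks that logrotate run for its duration, and any request that outlives the sleep can still hold the old file open, so this shrinks the window rather than closing it.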
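
On the "maintenance script that calls the API internally" idea from 16:45: the same data is available over HTTP from api.php?action=query&list=allpages&aplimit=max&format=json, and in-process the standard pattern is ApiMain with a FauxRequest. A hypothetical sketch (dumpTitles.php is not an existing core script):

```php
<?php
// maintenance/dumpTitles.php — hypothetical script that runs the API
// internally and prints every main-namespace title, one per line.
require_once __DIR__ . '/Maintenance.php';

class DumpTitles extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->addDescription( 'Print all main-namespace page titles' );
	}

	public function execute() {
		$apcontinue = null;
		do {
			$params = [
				'action' => 'query',
				'list' => 'allpages',
				'apnamespace' => 0,
				'aplimit' => 'max',
			];
			if ( $apcontinue !== null ) {
				$params['apcontinue'] = $apcontinue;
			}
			// Run the API in-process instead of over HTTP.
			$api = new ApiMain( new FauxRequest( $params ) );
			$api->execute();
			$data = $api->getResult()->getResultData();
			foreach ( $data['query']['allpages'] ?? [] as $page ) {
				// Skip API metadata keys mixed into the result array.
				if ( is_array( $page ) && isset( $page['title'] ) ) {
					$this->output( $page['title'] . "\n" );
				}
			}
			// Follow API continuation until the list is exhausted.
			$apcontinue = $data['continue']['apcontinue'] ?? null;
		} while ( $apcontinue !== null );
	}
}

$maintClass = DumpTitles::class;
require_once RUN_MAINTENANCE_IF_MAIN;
```

Something like `php maintenance/dumpTitles.php > titles.txt` would then replace the sql.php/jq step in the pipeline at 17:49.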
[18:25:10] James_F: oof, I had not looked for 'ContLang' via Config.
[18:25:14] https://gerrit.wikimedia.org/r/#/c/mediawiki/core/+/595205/1/includes/Rest/EntryPoint.php
[18:25:16] nice catch
[18:25:19] https://codesearch.wmflabs.org/deployed/?q=(wg%7C%27%7C%22)ContLang%5Cb&i=nope&files=&repos=
[18:25:37] MF and Cirrus :/
[18:26:01] Krinkle: Yeah. :-(
[18:26:14] Surprise, etc.
[20:42:57] hello everyone. I am on Wikimedia/Wikipedia under the username ItWiki97
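
Back on the 18:25 ContLang exchange: 'ContLang' is not a configuration variable ($wgContLang was a global Language object, since replaced by a service), which is what the gerrit change and the codesearch query above are hunting for. A sketch of the usual fix, with the surrounding code assumed rather than quoted from the patch:

```php
use MediaWiki\MediaWikiServices;

// Broken: 'ContLang' is not a config setting, so reading it through
// a Config object is wrong.
// $contLang = $config->get( 'ContLang' );

// Fixed: fetch the content language as a service.
$contLang = MediaWikiServices::getInstance()->getContentLanguage();
```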