[01:37:20] There is an incorrect image for the cover of the novel "Twilight Eyes" by Dean Koontz, how do I report it?
[01:39:07] Actually it's "Servants of Twilight" by Dean Koontz
[01:39:34] It shows "Twilight" by author Leigh Nichols
[01:41:47] Guest36439: this is the wrong channel for that sort of stuff - you should maybe report it on the #wikipedia channel.
[01:42:06] how do I do that?
[01:42:35] mediawiki.org?
[01:43:38] Guest36439: i'm not sure where you should be complaining, but i think that cover is correct, actually
[01:43:39] "The Servants of Twilight is a novel by American suspense writer Dean Koontz, originally released under the pseudonym Leigh Nichols in 1984. Its original title was just Twilight"
[01:43:45] oh, he left. well.
[03:40:31] I have a gadget on a wiki that I work on that works when I call it from U:ShoeMaker/vector.js with importScript, but doesn't work as a gadget - unless I append `?debug=true` to the URL. Does that mean that I'm not declaring the right dependencies in Gadgets-definition, or what?
[03:42:41] currently the Gadgets-definition entry for it reads: [ResourceLoader|dependencies=mediawiki.api]
[03:43:40] Actually, it's: Script[ResourceLoader|dependencies=mediawiki.util,mediawiki.api]|Script.js
[03:44:01] I don't think I'm using .util - it could probably be trimmed off.
[03:56:17] does mediawiki.api include mediawiki.api.edit etc?
[03:56:34] or do I need to declare all of the pieces I use separately?
[04:14:41] ShoeMaker: possibly, or you rely on the script executing in the global scope
[05:53:25] tgr: the script in question is http://ddowiki.com/page/MediaWiki:Gadget-DeOrphaner.js loaded in http://ddowiki.com/page/MediaWiki:Gadgets-definition#testing
[05:55:05] ShoeMaker|sleeps: check the error console
[05:55:32] if it's something about a missing method then it's a dependency issue
[05:55:41] otherwise, probably timing?
[08:21:06] Hey, is there any place where I could get a sample LocalSettings.php file for 1.30.0?
[08:22:02] I am in the process of upgrading 1.19 sites to 1.30, and it would be neat to be able to revise all the LocalSettings files and remove all the unneeded crap for once
[08:41:54] I'm trying to download Scribunto but the download page just gives an internal server error message. Is this a known issue?
[08:41:56] https://www.mediawiki.org/wiki/Special:ExtensionDistributor/Scribunto
[09:36:43] The (possible) caching can really mess you up: you get something working for 1 second, then it doesn't work again and you never know what made it work
[09:49:53] How could I create a super user via the command line for a site/all sites running on one mediawiki instance?
[11:05:50] Yesterday I installed a 2-wiki wiki family with a shared database using these instructions: http://www.mediawiki.org/wiki/Manual:Wiki_family and http://www.mediawiki.org/wiki/Manual:Shared_database and $wgUseInstantCommons is not working on the new wiki
[11:07:20] The description pages show up just fine but the images don't
[11:07:47] Now that I think of it, maybe I didn't do 'chown -R www-data images_testwiki'
[11:08:23] That could totally be it if $wgUseInstantCommons caches the pictures in the images dir to take load off the WMF servers
[11:11:09] nope.. w/images_testwiki is writable by Apache2
[11:12:54] I did "fix" a very similar "images not showing up" problem on the old wiki (when installing the wiki family) by removing the following and changing the name of the images directory back to plain images/.
[11:12:57] #$wgUploadDirectory = "$IP/images_$wikiId";
[11:12:58] #$wgUploadPath = "/images_$wikiId";
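For context, a sketch of the per-wiki upload settings being debugged here, based on Manual:Wiki_family rather than the actual Consumerium configuration; the $wikiId detection is reduced to a literal placeholder:

```php
// LocalSettings.php (shared by the wiki family) - a sketch only.
// How $wikiId is detected varies per setup; a literal stands in here.
$wikiId = 'testwiki';

// Per-wiki upload directory and its public URL path. As discovered later
// in this log, the path needs the script-path prefix ("/w") when the wiki
// is served from pretty /wiki/ URLs instead of the web root.
$wgUploadDirectory = "$IP/images_$wikiId";
$wgUploadPath = "/w/images_$wikiId";

// InstantCommons: fall back to Wikimedia Commons for files not found locally.
$wgUseInstantCommons = true;
```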
[11:30:17] I don't get it... what is wrong
[11:43:00] I have both '$wgUseInstantCommons = true;' and '$wgForeignFileRepos[]' set. Could this be the problem?
[11:43:17] It should just fall back through the chain
[11:49:07] I dunno what is wrong.. the file description pages on test.consumerium.org/wiki/ work just fine except the main image is missing
[11:50:17] maybe the web server rewrites are doing something they're not supposed to be doing
[11:51:21] Do I need to escape an underscore in a rewrite?
[11:51:25] let's try..
[11:52:33] nope
[11:54:35] removing the underscores solved the issue
[11:54:47] something funky with the rewrites regarding underscores maybe?
[12:10:18] it briefly worked (maybe until I actually moved the directory 'images_testwiki' to 'imagestestwiki'), though it would not make sense that it started working in the first place with the directory name being other than what the wiki is expecting
[12:11:14] something somewhere in the wiki family confs is breaking this, is my guess
[12:14:58] Just a thought on a different issue.. Can users that have made zero edits be safely and cleanly removed from the user table? There are ~ 67,000 accounts on the Consumerium development wiki, out of which maybe 200 are actual human accounts. The rest have been registered by various bots at various time periods
[12:30:09] Ok.. the problem seems to have been solved by prefixing '/w' to the $wgUploadPath .. I copy-pasted this error from https://www.mediawiki.org/wiki/Manual:Wiki_family#Basic_principles. Now, looking at it, I should have understood that that example wiki is installed in the webroot and not in the /wiki/ directory (with the pretty URLs)
[12:39:58] hi there
[12:40:37] I'm trying to work out how to set robots.txt on our MediaWiki instance - I've read https://www.mediawiki.org/wiki/Manual:Robots.txt but am still not clear exactly where to add the robots.txt text
[12:40:50] I've put it here https://hlp-wiki-develop.agileventures.org/MediaWiki:Robots.txt
[12:41:13] but I'm not sure bots will find it - https://wiki.healthylondon.org/robots.txt doesn't show it ...
[12:44:06] No
[12:44:35] on wikimedia wikis, we have a rewrite that pulls from that page via a php script
[12:44:50] https://github.com/wikimedia/puppet/blob/e959321aa620b77403cc9379db2e86080323c6e8/modules/mediawiki/files/apache/sites/public-wiki-rewrites.incl#L3-L4
[12:45:05] https://github.com/wikimedia/operations-mediawiki-config/blob/master/w/robots.php
[12:45:12] But that won't just work by copying it over
[12:45:32] It's easiest to just create a robots.txt file on disk
[12:48:05] @Reedy - thanks I'll put it there on disk as you say
[12:48:09] in htdocs I guess?
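The on-disk file Reedy recommends needs no code at all. For the on-wiki alternative he links above, a minimal sketch of such a script - not Wikimedia's actual robots.php, and the host name and /w/ script path are assumptions:

```php
<?php
// robots.php - serve the editable page [[MediaWiki:Robots.txt]] as
// /robots.txt. Assumes the web server rewrites /robots.txt here, e.g.:
//   RewriteRule ^robots\.txt$ /w/robots.php [L]
header( 'Content-Type: text/plain; charset=utf-8' );

// action=raw returns a page's stored wikitext as plain text.
echo file_get_contents(
	'https://wiki.healthylondon.org/w/index.php?title=MediaWiki:Robots.txt&action=raw'
);
```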
[13:10:34] Are there instructions somewhere on how to set up a Commons-imitating central media repository, to be accessed via $wgForeignFileRepos by the other wikis? I did try searching and didn't find this info.
[13:40:31] Hi there!
[13:43:29] Does anyone know if the LdapAuthentication extension works with mediawiki-1.30.0 and PHP 7.0?
[13:50:18] dmatheus, did you run into an error when trying?
[13:52:37] Yes, the debug log shows that the user was bound OK, but that the password was typed incorrectly - even though I typed it correctly
[13:57:12] I am installing vagrant but got this error
[13:57:31] after vagrant up
[13:57:33] A Vagrant environment or target machine is required to run this command. Run `vagrant init` to create a new Vagrant environment. Or, get an ID of a target machine from `vagrant global-status` to run this command on. A final option is to change to a directory with a Vagrantfile and to try again.
[14:00:26] this error: It appears your machine doesn't support NFS, or there is not an adapter to enable NFS on this machine for Vagrant. Please verify that `nfsd` is installed on your machine, and try again. If you're on Windows, NFS isn't supported. If the problem persists, please contact Vagrant support.
[14:42:30] hi, I've got a spreadsheet with about 2000 rows where each row represents a historical person. We were going to build a little web front end and put the data into JSON, but now my boss wants to put it into a wiki - is there a way to use PHP to generate multiple (~2000) wiki pages at once, from a php array?
[14:50:19] wonkadoo: use a bot
[14:50:58] if it doesn't absolutely have to be PHP, pywikibot is rather fully-featured so you should be able to plug into that framework in order to create 2k pages
[14:51:54] alternatively, you could research mediawiki's import/export XML format, transform your data into that, and upload it via [[Special:Import]]
[14:52:07] (you may need to break it into multiple files in order to prevent timeouts)
[14:52:22] bot method is likely easier tbh
[14:56:14] !bot
[14:56:14] A bot is an automatic process which interacts with MediaWiki as though it were a human editor and is designed to simplify repetitive actions too cumbersome to do manually. An overview of how to create a bot and a list of known frameworks can be found at http://en.wikipedia.org/wiki/Wikipedia:Creating_a_bot
[14:59:20] thank you!
[15:00:42] I'd personally use https://github.com/alexz-enwp/wikitools since it's the framework I know best.
[15:03:58] it doesn't have to be PHP, I've just used PHP a bit so I figured that would be a good route - a bot sounds like a good approach - the data's a bit of a mess, so I'll have to clean it up before it can get into any useful format, so honestly XML is probably just as easy as a php array or whatever
[15:05:12] I wouldn't fuck around with XML and Special:Import.
[15:05:15] would I need some kind of raised privileges to use either the XML import or the bot approach? the wiki is maintained by another organization so I may not get privs
[15:05:22] I would just prepare the wikitext and then POST that.
[15:05:24] roger that, Ivy
[15:05:44] It depends on the wiki's settings and customs.
[15:05:55] Sometimes wikis restrict page creation to prevent vandalism and spam.
[15:06:06] Sometimes wikis want you to use a separate bot account.
[15:06:15] Like User:WonkadooBot instead of User:Wonkadoo.
[15:06:44] ok, I'll add that to our list of questions for them
[15:06:45] The framework you use will allow logging in with a password and then will edit as that user.
[15:06:59] gotcha, cool
[15:07:20] I've always heard of bots, but never actually gotten a chance to use one! :)
[15:08:50] https://en.wikipedia.org/wiki/Wikipedia:Database_reports/Short_single-author_pages/Configuration is an example using wikitools.
[15:09:08] You can ignore the SQL parts in this case. But it's basically just establishing a wiki object, logging in, and then calling .edit().
[15:12:34] this particular example - where is it pulling the data from? is it coming in from that SQL?
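Since wonkadoo already knows some PHP, here is a rough sketch of Ivy's "prepare the wikitext and then POST that" approach against the Action API. The wiki URL, the bot-password credentials and the $people rows are all placeholders:

```php
<?php
// Create one wiki page per spreadsheet row via the MediaWiki Action API.
$api = 'https://example.org/w/api.php';
$people = [
	'Ada Lovelace' => "'''Ada Lovelace''' (1815-1852) was a mathematician.",
	// ... ~2000 more cleaned-up rows ...
];

$ch = curl_init();
curl_setopt_array( $ch, [
	CURLOPT_RETURNTRANSFER => true,
	CURLOPT_COOKIEJAR      => 'cookies.txt', // keep the session between requests
	CURLOPT_COOKIEFILE     => 'cookies.txt',
] );

function apiPost( $ch, $api, array $params ) {
	$params['format'] = 'json';
	curl_setopt( $ch, CURLOPT_URL, $api );
	curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $params ) );
	return json_decode( curl_exec( $ch ), true );
}

// 1. Log in with a bot password created at Special:BotPasswords.
$loginToken = apiPost( $ch, $api, [
	'action' => 'query', 'meta' => 'tokens', 'type' => 'login',
] )['query']['tokens']['logintoken'];
apiPost( $ch, $api, [
	'action' => 'login', 'lgname' => 'WonkadooBot@import',
	'lgpassword' => 'the-bot-password', 'lgtoken' => $loginToken,
] );

// 2. Fetch a CSRF token, then create each page.
$csrf = apiPost( $ch, $api, [
	'action' => 'query', 'meta' => 'tokens',
] )['query']['tokens']['csrftoken'];

foreach ( $people as $title => $wikitext ) {
	apiPost( $ch, $api, [
		'action' => 'edit', 'title' => $title, 'text' => $wikitext,
		'createonly' => 1, // never overwrite an existing page
		'bot' => 1, 'token' => $csrf,
	] );
}
```

A framework like pywikibot or wikitools does the same token/login dance for you, plus throttling and error handling, which is why the bot-framework route was recommended above.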
[15:31:25] Technical Advice IRC meeting starting in 30 minutes in channel #wikimedia-tech, hosts: @addshore & @Lucas_WMDE - all questions welcome, more infos: https://www.mediawiki.org/wiki/Technical_Advice_IRC_Meeting
[16:23:40] is it down?
[16:23:51] it is down
[16:29:10] flying_sausages, what is "it"?
[16:29:39] and why should it?
[16:29:53] https://www.mediawiki.org/wiki/Special:ExtensionDistributor/AdminLinks
[16:54:26] flying_sausages: sorry about that, it should be fixed now.
[16:54:36] <#
[16:54:51] works
[17:29:32] damn, I was unaware of such a meeting
[17:29:43] tech meeting :P
[17:30:11] there'll be another one next week.
[18:07:16] https://phabricator.wikimedia.org/T95504 - here I found Flow\Block\Block in my codebase, so what is the new name that should replace Block?
[18:52:47] hey guys, what do you think is the most efficient way to delete all files that are older than 15 days, with no exception?
[18:57:18] why would you want to do that?
[18:58:26] query the db, paste the output into a txt and then: php deleteBatch.php -u=YourUserName files_to_delete.txt
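A sketch of that "query the db" step; the database credentials are placeholders, while the image table and its img_name/img_timestamp columns are core MediaWiki schema:

```php
<?php
// Build files_to_delete.txt for deleteBatch.php: every file older than
// 15 days, one page title per line. Credentials are placeholders.
$db = new mysqli( 'localhost', 'wikiuser', 'secret', 'wikidb' );

// MediaWiki stores timestamps as 14-digit YYYYMMDDHHMMSS strings.
$cutoff = gmdate( 'YmdHis', time() - 15 * 24 * 3600 );

$res = $db->query( "SELECT img_name FROM image WHERE img_timestamp < '$cutoff'" );
$out = fopen( 'files_to_delete.txt', 'w' );
while ( $row = $res->fetch_assoc() ) {
	// deleteBatch.php expects full page titles, hence the File: prefix.
	fwrite( $out, 'File:' . $row['img_name'] . "\n" );
}
fclose( $out );
// Then: php maintenance/deleteBatch.php -u YourUserName files_to_delete.txt
```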
[18:58:32] Can you please help me with the above bug T95504
[18:58:32] T95504: Rename Flow\Block\* to something that doesn't conceptually interfere with \Block - https://phabricator.wikimedia.org/T95504
[19:00:18] Gopa_____, what is the question?
[19:00:27] /how/ to help?
[19:00:39] the more specific your question is, the more likely you get an answer.
[19:01:19] https://phabricator.wikimedia.org/T95504 - here I found Flow\Block\Block in my codebase, so what is the new word that should replace Block?
[19:02:37] Gopa: which word do you propose?
[19:03:02] and being curious, where did you find Flow\Block\Block in the codebase (file name?)
[19:03:29] andre__: SubmissionHandler.php
[19:05:54] andre__: I propose Block1
[19:06:39] Gopa_____, there are four files where Flow\Block\Block is used?
[19:06:49] Gopa_____, why "Block1"? Please explain your idea
[19:07:04] Gopa_____, plus maybe ask on #wikimedia-collaboration instead. Because it is a Flow question.
[19:07:30] (And #wikimedia-dev is a general developer venue. #mediawiki is often more for users, not developers.)
[19:07:48] andre__: Thanks, I thought Block\Block is a bit confusing, so I suggest Block1
[19:07:58] Gopa_____, how is Block1 less confusing?
[19:08:09] adding indices is even worse I think.
[19:08:17] but naming things is hard, I agree.
[19:08:25] Gopa_____, what does Flow\Block\Block exactly do?
[19:08:53] andre__: so what can you suggest for that?
[19:09:00] Gopa_____, that is what I am asking you.
[19:09:04] Gopa_____, in any case I recommend asking on #wikimedia-collaboration instead. Because it is a Flow question.
[19:09:33] Gopa_____, and you should be able to explain what it does. So you can understand how to come up with a better name.
[19:09:51] ok, Thanks
[19:10:54] Gopa_____, "Block1" is not a better name. You should understand what it does and find a word that is close or better.
[19:11:50] Oh ya thanks, sir, I will go through the code and try to understand it
[19:12:31] Gopa_____: no need for "sir" :)
[19:14:07] andre__: ok thank you
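For context, a small illustration of the name clash behind T95504 - not a proposed fix, and the alias name is invented:

```php
<?php
namespace Flow\Block;

// Inside this namespace the bare name "Block" resolves to Flow's own class,
// so MediaWiki's core \Block (which blocks *users*) needs a fully-qualified
// name or an alias to be referenced at all.
use Block as CoreBlock;

// Flow's Block is unrelated: it is a component of a Flow board page.
class Block {
	// ...
}
```

That ambiguity, rather than the name itself, is what the task asks a rename to resolve - hence andre__'s advice to derive the new name from what the class actually does.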
[19:36:09] mmmhhh... I'm having some spam filter firing on a wiki of mine
[19:37:37] I need to get around it
[19:38:19] What extension takes care of the checking of the blacklists? I think these are coming from Wikimedia's blacklist
[19:43:46] best would be if I could locally whitelist these two hits and leave it like that
[19:50:17] I just briefly disabled the SpamBlacklist while I snuck the article into the wiki
[19:50:35] Problem solved.
[20:38:47] tgr: DanielK_WMDE: Wikis that have it enabled
[20:38:51] https://www.irccloud.com/pastebin/8BXA5OZZ/
[20:39:50] thanks Amir1! that's the list of wikis doing fine-grained usage tracking, right?
[20:40:51] tgr: This is the list of wikis that have Lua fine-grained usage tracking (xkill) enabled; "fine-grained usage tracking" (= statement usage tracking) is enabled in another set of wikis
[20:41:09] that is a subset of this list
[20:42:22] These wikis are cawiki, cewiki, elwiki, kowiki, trwiki
[20:44:58] xkill only differentiates between types of things (description/claim/label/...)?
[20:45:58] yes, it removes the "X" tracking for the getEntity() lua call, and enables more specific tracking
[20:46:08] tracking for individual properties is controlled by a separate switch
[20:48:13] ack
[20:48:28] I got to run too, let's continue this tomorrow
[23:11:41] I'm not sure if I should -1 https://gerrit.wikimedia.org/r/#/c/407596/4/Expr.php for indentation, but the existing indentation in that file seems ridiculous
[23:14:00] bawolff: how long will you stay around? I'd like that AutoProxyBlock security patch uploaded/merged/backported if that's okay?
[23:14:18] I'll be here for a little bit more
[23:14:36] Sorry about being not very responsive on that, I've been kind of sick for the last week
[23:14:47] bawolff: no worries!
[23:15:11] I was thinking that I could upload the patch and git-commit --author= Mainframe, etc.
[23:15:27] I later noticed that the patch is in a weird format
[23:16:07] it's git mailbox format
[23:16:16] if you do git am < patchfile
[23:16:26] git will take care of importing the author and what not
[23:16:41] it will also work with patch -p1 < patchfile
[23:16:52] but git am is nice as it preserves metadata
[23:17:23] so I download that patch and git am < patch.txt ?
[23:17:39] yep
[23:17:44] wish me luck
[23:17:54] if it doesn't apply cleanly for some reason
[23:18:01] git am -3 < patch.txt
[23:18:03] often works
[23:18:21] ok :)
[23:18:32] first I'll clone the repo
[23:21:06] bawolff: okay so I did $ git am < patch
[23:21:12] but now git diff shows nothing
[23:21:26] (topic of the patch was displayed)
[23:22:44] git show
[23:22:50] it also commits the patch
[23:23:05] oh, git show indeed shows the thing
[23:24:40] remote: You are not allowed to perform this operation.
[23:24:40] remote: To push into this reference you need 'Push' rights.
[23:25:11] https://gerrit.wikimedia.org/r/408941
[23:26:07] You probably need to add a Change-Id line
[23:26:17] did you use a command like
[23:26:26] git push origin HEAD:refs/for/master
[23:26:52] https://gerrit.wikimedia.org/r/#/c/408941/1/AutoProxyBlock.body.php displays some red things, I guess those lines should be removed
[23:27:08] I did git push origin HEAD:master
[23:27:29] yeah, that's tabs vs spaces
[23:29:08] btw, did you test that the extension still works after the patch? (If not, I should test it before +2ing it)
[23:29:33] bawolff: I didn't test
[23:29:44] I've removed those empty spaces now
[23:30:49] still looks like there's some there
[23:31:43] phpcbf could automatically fix this, although I don't think it's set up for this extension
[23:32:18] err, wait
[23:32:22] I just misread the diff
[23:33:09] nope, no composer phpcs/phpcbf there
[23:33:30] I think I have a patch for that, but after we fix the security thing + convert to ext-reg
[23:33:36] but yes, this looks correct now. I +2'd it
[23:33:48] thanks
[23:33:56] is the authorship correct?
[23:35:04] umm, I forgot to look at that
[23:35:19] Author mainframe98
[23:35:21] looks right
[23:35:24] okay
[23:35:33] I don't want to get credit for something I didn't do
[23:35:47] author is him, committer is me; that's good
[23:38:14] bawolff: I've cherry-picked to REL1_30 and REL1_29; not sure if we need more backports?
[23:38:38] In theory, 1.27 as well, as usually we do all supported releases
[23:38:50] it's a bit more fuzzy whether it's a requirement for third-party extensions though
[23:40:26] * bawolff is going to finish testing this first
[23:44:44] Cherry pick failed: merge conflict for REL1_28
[23:45:21] weirdly, in my test it didn't work
[23:45:56] because https://en.wikipedia.org/w/api.php?format=json&action=query&list=blocks&bkip=121.143.29.30&bklimit=1&bkprop=expiry|reason returned nothing
[23:46:03] but umm, it should not have returned nothing
[23:46:15] I'm either being stupid or there is a bug in the list=blocks api module
[23:47:50] or it's outdated?
[23:47:59] is the unserialization fixed though?
[23:48:25] yeah, the security issue is fixed
[23:48:37] I was just checking to make sure no new issues were introduced
[23:48:46] good good
[23:51:47] API query looks good
[23:52:46] check with https://en.wikipedia.org/w/api.php?format=json&action=query&list=blocks&bkip=156.3.55.135&bklimit=1&bkprop=expiry|reason
[23:52:50] which is actually blocked
[23:54:48] I think I know what happened.
[23:55:13] I was looking at old blocks prior to when range blocks were supported, and the API is buggy about those
[23:57:53] well, I filed https://phabricator.wikimedia.org/T186766 for that, although it probably doesn't matter much
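The AutoProxyBlock patch itself is private at this point in the log, so the following is only a generic illustration of the "unserialization" class of issue mentioned above - PHP object injection - and not the actual fix; all names are placeholders:

```php
<?php
// Sketch: hardening an unserialize() call on data an attacker may influence.
$cached = file_get_contents( 'proxy-cache.txt' ); // hypothetical data source

// PHP 7+: forbid object instantiation during unserialization, which
// blocks object-injection gadget chains.
$data = unserialize( $cached, [ 'allowed_classes' => false ] );

// Often better: avoid PHP serialization entirely and store JSON instead.
$data = json_decode( $cached, true );
```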