[00:01:22] does anybody have any advice on how to automate or provide a form for the creation of a redirect page to a newly created page?
[00:01:52] I'm using preloaded text and some templates to simplify creating a page from a CreateForm
[00:10:13] is Gloria here
[00:11:30] : idle for 3 hours, 38 minutes, 30 seconds
[00:11:35] @ c
[00:12:42] huh: learn the lore
[00:14:41] ah, http://www.mediawiki.org/wiki/Extension:AutoRedirect
[00:20:34] still trying to finish :S
[00:29:27] c: Probably not.
[00:29:33] Hi huh.
[00:30:02] Gloria: what does mediawiki use to generate the whiteout search page when a page fails to load
[00:31:00] Whiteout search page?
[00:31:48] Gloria: like this http://i.imgur.com/S6UsW8m.png
[00:35:52] Still not sure I understand what you're trying to do.
[00:36:04] Gloria: Add text/links to that page
[00:36:18] https://www.mediawiki.org/wiki/Special:Search/dfji2
[00:36:32] https://www.mediawiki.org/wiki/Special:Search/dfji2?uselang=qqx
[00:36:52] You can modify some pre-existing message there.
[00:36:59] Or use JavaScript to just drop in whatever you'd like.
[00:37:23] Or you can write an extension to hook into the page's output.
[00:39:23] Gloria: the wiki/google search form page i linked is not the same form as special:search
[00:39:41] That was Google?
[00:40:18] that was what loaded when the page wouldn't load (from 'excessive' dpl queries on users with a less than stellar connection)
[00:40:33] Cute.
[00:41:06] hence the FFXIV and WWW radio buttons
[00:41:31] WWW is Google, you see.
[00:44:07] Gloria: yes, hence wiki/google search form. i wanna know how to modify that page
[00:44:59] DatabaseError.php ?
[00:45:29] line 266
[00:50:46] c: https://github.com/wikimedia/mediawiki-core/blob/master/includes/db/DatabaseError.php see line 266
[00:54:00] pir^2: i think that's exactly what I need, line 266 is just a } but i did find the search function. i want to insert some kind of PAGENAME?action=purge&DPL_refresh=yes link in there
[01:04:33] c did it work?
[01:05:37] pir^2: hard to reproduce when my connection can handle the queries
[01:06:05] you could probably force it to display but meh
[01:06:15] and not sure how to do a {{FULLURL:{{FULLPAGENAME}}}} hardcoded
[01:06:44] i added it to dberror-usegoogle, but that text didn't display on that user's screenshot
[01:14:34] maybe
[01:14:36] Clear DPL cache and Refresh Page
[01:15:24] Err, careful there.
[01:15:54] oh?
[01:16:06] Be sure to sanitize any output such as HTTP_REFERER.
[01:16:23] what do you mean?
[01:16:47] HTTP_REFERER can be an arbitrary string, I believe.
[01:16:58] it should return the URL of the current page
[01:16:59] So you want to make sure that when you output it, it doesn't contain anything nasty.
[01:17:07] It should, but I think it can be fucked with.
[01:17:27] Whether it's a safe string or not, it doesn't hurt to escape the output. :-)
[01:17:38] htmlspecialchars() or some such should be fine.
[01:19:07] http://stackoverflow.com/questions/5934747/is-serverhttp-referer-safe
[01:20:06] http://www.gremwell.com/exploiting_xss_in_referer_header
[01:22:32] how would i wrap htmlspecialchars with the a tag
[01:23:25] You could wrap $_SERVER['HTTP_REFERER'] in it.
[01:23:29] And then stick that in an <a> tag.
[01:23:36] But the general approach here seems pretty funky.
[01:23:55] You seem to be mitigating server issues by adding a "try again" button rather than fixing the underlying issue.
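
A minimal sketch of the kind of snippet being discussed here, assuming it gets added to whatever error output c ends up modifying (this is not the actual DatabaseError.php code, and the link text and URL parameters are simply the ones mentioned above); the point is that the untrusted referer is escaped with htmlspecialchars() before it lands inside the <a> tag:

    <?php
    // Sketch only: build a "try again" link from the referer, escaping the
    // untrusted value before it is echoed into the HTML error page.
    $referer = isset( $_SERVER['HTTP_REFERER'] ) ? $_SERVER['HTTP_REFERER'] : '';
    if ( $referer !== '' ) {
        // Naive: assumes the referer does not already carry a query string.
        $href = htmlspecialchars( $referer . '?action=purge&DPL_refresh=yes', ENT_QUOTES );
        echo '<a href="' . $href . '">Clear DPL cache and Refresh Page</a>';
    }
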
[01:24:24] well DPL refresh fixes the issue 90% of the time
[01:25:27] and since it affects a minority of users, I'd like a way to add the purge/refresh link for those who encounter the error page
[01:26:42] Clear DPL cache and Refresh Page
[01:27:32] You have to make sure it's set.
[01:27:40] Sometimes the referer isn't set.
[01:31:08] would you recommend something other than the referer to return the url?
[01:31:49] I dunno.
[01:31:58] I'm tired and thinking about Game of Thrones. :-)
[03:22:36] .whois morbus
[03:22:39] oops
[03:26:00] Who IS Morbus, anyway.
[03:47:30] is there a function that would allow me to get information on every wikilink for a particular page?
[03:56:17] Withoutaname: in what context? API, or internal to MediaWiki or?
[03:56:44] internal to mediawiki, in the php
[03:56:47] i'm looking at the core files
[03:57:13] Withoutaname: outgoing links or incoming links?
[03:57:45] outgoing links
[03:57:53] for outgoing links, you can either fetch from the various link tables (not sure if there's a convenient function), or you can fetch the parsed text (hopefully from ParserCache) and check the links in the ParserOutput object
[03:58:22] oh hey
[03:58:30] Withoutaname: Title::getLinksFrom() may be handy
[03:58:49] though it does have a warning about templates o_O
[03:59:26] thanks
[03:59:36] good luck :D
[05:23:57] People, how do I add an abuse filter?
[05:51:23] hm?
[11:03:56] marktraceur, llwy|sleep: just a long-time MediaWiki user, that's all.
[12:50:37] Hey! After I've upgraded to 1.22, my customized MediaWiki:Sidebar no longer has any effect. I've read the release notes and the like, but can't find anything about a change in the behaviour of Sidebar. The page MediaWiki:Sidebar has not changed, but no matter what I do with it, I can't make the sidebar work. Anyone know what it is?
[12:55:39] Hi aurimai.
[12:55:48] Can you provide a link to your wiki?
[12:59:44] Gloria: you're always changing your nick...
[13:00:51] Gloria: Here you go, but the pages are blacklisted by default, with some whitelisted exceptions: http://trv.jbv.no/wiki/Hovedside
[13:08:24] Gloria: If needed, I can provide the content of some non-whitelisted pages.
[13:08:30] liangent: Every few months. :-)
[13:09:22] aurimai: What looks wrong? I see a working sidebar.
[13:09:42] Hmmm.
[13:09:49] Those are the default links.
[13:10:06] aurimai: Perhaps the sidebar doesn't work because you've restricted access to the page "MediaWiki:Sidebar"?
[13:10:11] You could try whitelisting that page.
[13:10:47] Gloria: Yes, the sidebar itself is working, but I can't customize it. There were no problems with the restricted access before the upgrade from 1.21 to 1.22.
[13:10:56] Hmmm.
[13:11:19] Well, I'd try whitelisting that page as a test.
[13:11:56] It's also possible that MediaWiki is looking at MediaWiki:Sidebar/nb instead of MediaWiki:Sidebar.
[13:13:00] Gloria, thank you, I'll try that.
[13:13:41] Gloria, but I don't understand why this behaviour must be different in the new version. I've not seen any release notes about it.
[13:14:35] Maybe it's a bug. :-)
[13:14:45] Or maybe an incompatible extension, poorly written release notes, etc.
[13:14:49] It could be anything.
[13:18:37] Gloria: I hate release notes. They're always causing merge conflicts :(
[13:19:37] Hello #mediawiki, why do auto-created redirect pages redirect automatically, while manually created redirect pages only show a redirect link on the page?
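
Going back to the Title::getLinksFrom() suggestion a few lines up, here is a minimal sketch of how it might be called from code running inside MediaWiki; "Main Page" is only a placeholder title, and per the warning mentioned it can be expensive on pages with very many links:

    <?php
    // Sketch: list the outgoing wikilinks recorded for a page, from inside
    // MediaWiki (e.g. a maintenance script or an extension).
    $title = Title::newFromText( 'Main Page' ); // placeholder title
    if ( $title ) {
        foreach ( $title->getLinksFrom() as $target ) {
            // Each entry is a Title object for a page linked from $title.
            echo $target->getPrefixedText() . "\n";
        }
    }
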
[13:29:09] CasperVector: the issue is probably not how the pages were created, but rather how you get to the page - if there's a "redirect=no" in the URL, the page won't redirect.
[13:29:52] Taron: I have just discovered the real reason: I was redirecting `Project:Sandbox' to `Special:AllMessages', while redirects to special pages have been disabled by design.
[13:30:14] Yaron: thanks all the same :)
[13:30:24] Ah!
[13:58:05] thedj[laptop]: Are you ok with merging: https://gerrit.wikimedia.org/r/#/c/131489/
[13:58:35] ... there is an edit conflict upcoming
[15:23:31] has anyone tried using an old version of mediawiki over vagrant
[15:23:44] for some reason it doesn't work
[15:34:31] kishanio, old version?
[15:35:06] Krenair: yes, 1.19. i was trying to fix a bug so had to switch back.
[15:35:11] somehow it doesn't seem to work.
[15:35:21] What was wrong with it?
[15:36:24] Krenair: it says /vagrant/mediawiki/includes/utils/IP.php is missing?
[15:36:47] Is it missing?
[15:37:23] oh yes. Seems the vagrant config is dependent on it?
[15:41:27] Krenair: oh yes. Seems the vagrant config is dependent on it?
[15:41:42] * Krenair shrugs
[15:43:29] Krenair: Alright. I'll switch to the master branch and test the code for now.
[15:55:30] I'm trying to log in to a MediaWiki site, but it keeps telling me I have cookies disabled when I don't, and I'm using other login-based websites fine. What could be going on?
[15:56:50] harej: does that happen on more than one browser?
[15:56:58] Good question. Let me try Firefox.
[15:57:35] Yaron, I get the same error on Firefox as I do on Chrome.
[15:57:57] Ah - it's probably the site, then. Is this a public wiki?
[15:58:08] Yep
[15:58:46] Well, you should tell them.
[16:11:57] kishanio: what's your configuration of the old mediawiki on vagrant? did you install it over the default version included, or in a different directory? can you also give me the version of php that you're working with in vagrant, and pastebin the /etc/apache2/sites-enabled/000-default ?
[16:15:37] andre149: Configuration for mediawiki as in? i followed http://www.mediawiki.org/wiki/MediaWiki-Vagrant precisely and it worked swiftly until i switched to the old branch, i.e. 1.19
[16:15:58] andre149: also the php version is 5.3.10 and there is no 000-default
[16:16:10] andre149: instead there is devwiki
[16:17:10] andre149: and the pastebin for the same: http://pastebin.com/aahdD3r0
[16:18:04] can you define what you mean by switching to the old branch? did you manually install 1.19 into vagrant, or use an older version of vagrant that shipped with 1.19?
[16:18:50] also, that file: 'Include site.d/devwiki' -- so I will need the output of that as well
[16:19:20] Urm no, since i had to fix a bug i just checked out the tracking branch from remote/gerrit/REL_19
[16:22:37] andre149: is that an ideal way to switch? And site.d/devwiki: http://pastebin.com/3tuWp6Q8
[16:23:04] it's from site.d/devwiki/000-devwiki
[16:23:33] Djain, please, call me in PM
[16:24:45] ahhh. I would highly recommend taking the current vagrant (which we know works), and just use it for a stable dev environment -- and install whatever version of mediawiki you need to work on into it as you would normally.
the main advantage of vagrant is simply that you know the php, apache, etc. all remain stable and aren't subject to break or change like they are on, say, your main system
[16:27:44] so it looks like the best way to go about it would be to drop 1.19 into /var/www, and then access it via localhost:8080/mediawiki-(versionname)
[16:28:04] andre149: Oh yes, vagrant is a real blessing. but thanks mate, i'll do a clean install over vagrant and get started.
[16:28:16] oh okay, that seems easy and doable
[16:28:22] let me get on it.
[16:28:27] k :)
[16:56:52] hi there
[16:56:58] Hello.
[16:57:08] I'm trying to map country codes to wikidata IDs
[16:57:53] using API?
[16:57:53] is there any way to query the API with something like "get me all items with P901='XX'" ?
[16:57:59] yup, using the API
[16:58:57] I'm trying to get through the docs but I feel that I'm not getting it right
[17:01:32] There *should* be a way, but the Wikidata API is still being developed.
[17:03:56] I can see that :)
[17:04:17] so currently it's unlikely to be implemented, right?
[17:05:42] hum, maybe somehow using list=alllinks
[17:06:10] You might try #wikidata
[17:06:21] I don't see it implemented, although it would be possible with SQL
[17:09:59] makes sense
[17:10:01] thanks
[18:52:36] hello
[18:52:45] i'm using the mediawiki API
[18:53:26] i have this command to get the html of a simple page: ...api.php?action=query&prop=extracts&titles=Therion
[18:53:43] but what if the page is inside a category?
[18:54:10] do i have to change the title of the page to category:name?
[18:55:58] softplay: are you trying to get the list of pages in the category?
[18:56:04] or the category description?
[18:57:40] no just the html of a page inside a category
[18:57:59] but i've tried with title Category:name and it worked..
[18:58:02] thanks
[19:01:08] no, i was wrong
[19:01:13] it's not working
[19:01:27] it returns a 200 code but an error message in html
[19:01:34] :(
[19:01:43] ambiguous..
[19:02:16] softplay: can you explain what you want?
[19:02:26] What content are you trying to get from the category?
[19:02:51] "no just the html of a page inside a category"
[19:03:13] -- you can get the contents of a page regardless of whether or not it is in a category
[19:03:53] oh, so the only thing i have to type is the name of the page?
[19:03:56] yes
[19:03:58] i'm checking
[19:05:05] cool
[19:05:09] thanks
[20:03:40] Hi.
[20:03:53] How can I get a list of wanted pages via the API?
[20:07:08] Piotrek: maybe http://en.wikipedia.org/w/api.php?action=query&list=querypage&qppage=Wantedpages ?
[20:07:43] &qplimit=500
[20:12:54] OK, works, thanks huh.
[21:11:55] if i put mediawiki into read-only mode ($wgReadOnly) and also revoke UPDATE/INSERT/DELETE on the mysql database itself
[21:12:02] will the wiki still work?
[21:27:50] sankey, $wgReadOnly used to mean "no db transactions at all"
[21:28:18] it was then changed to "do not let users write" (but the wiki could still do things)
[21:28:35] why do you want to do that?
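
For the read-only mirror being discussed, a minimal LocalSettings.php sketch (the message text is arbitrary). As noted just above, this only blocks user writes at the MediaWiki level; the wiki itself may still try to write (caching, sessions), so revoking MySQL write grants on top of it is the part that needs care:

    <?php
    // Sketch: LocalSettings.php on the mirror. Any non-false value puts the
    // wiki into read-only mode; the string is shown to users who try to edit.
    $wgReadOnly = 'This wiki is a read-only mirror; please edit the primary wiki instead.';
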
[21:31:47] it should work if you have caching not dependent on the db
[21:31:53] else i'm unsure
[21:34:40] Platonides: basically, i want to host a mirror of a wiki
[21:34:59] but the mirror should not be writable
[21:35:52] i suppose then i would need to rsync the uploads directory
[21:38:10] i should also mention that i was planning on using mysql master-slave replication to keep the databases in sync
[21:39:43] i think i'm over-engineering this
[21:39:52] probably better to just keep backups
[22:05:13] I'm trying to quickly figure out/remember which version of PHP we're running on WMF servers. Can someone remind me where I look for that?
[22:05:51] http://en.wikipedia.org/wiki/Special:Version
[22:05:55] 5.3.10-1ubuntu3.10+wmf1 (apache2handler)
[22:06:01] I thought of looking there and then neglected it, thanks
[22:06:09] duh duh duh. Thanks
[22:07:27] ("duh duh duh" is me being self-deprecating btw, in case that was not clear)
[22:08:06] sumanah: :P
[23:29:30] does anyone know if there is a way to search in gerrit for patchsets that haven't received any comments from humans (i.e. not jenkins)? I tried -is:reviewed but that doesn't seem to work... https://gerrit.wikimedia.org/r/#/q/reviewer:self+status:open+-is:reviewed,n,z
[23:49:29] Hi waldir.
[23:49:38] waldir: I wanted to make a Gerrit report to do that.
[23:49:45] You should write it. :-)
[23:50:27] Gloria: so, that means the functionality definitely doesn't exist? :/
[23:50:55] I was hoping some of the gerrit pros around here could have hacked some dark magic to achieve that effect :P
[23:51:31] It's probably possible to query.
[23:51:37] You can count comments, I imagine.
[23:51:45] And since jenkins-bot always(?) comments, just subtract one?
[23:53:56] hmm, but you mean in gerrit's actual web interface or using some api? I might need to read up some more on the advanced search features
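
A rough sketch of the "count the comments and ignore jenkins-bot" idea, using Gerrit's REST changes endpoint rather than the web UI. The query string is the one from the discussion above; reviewer:self only works on an authenticated request (e.g. the /a/ URL prefix with HTTP credentials), so it may need to be replaced with reviewer:<username>, and error handling is omitted:

    <?php
    // Rough sketch: list open changes you review where every comment so far
    // was left by jenkins-bot (i.e. no human has commented yet).
    $query = rawurlencode( 'reviewer:self status:open' );
    $url = "https://gerrit.wikimedia.org/r/changes/?q=$query&o=MESSAGES&o=DETAILED_ACCOUNTS";
    $body = file_get_contents( $url );
    $body = preg_replace( "/^\)\]\}'/", '', $body ); // strip Gerrit's XSSI prefix
    $changes = json_decode( $body, true );

    foreach ( (array)$changes as $change ) {
        $humanCommented = false;
        $messages = isset( $change['messages'] ) ? $change['messages'] : array();
        foreach ( $messages as $msg ) {
            $author = isset( $msg['author']['name'] ) ? $msg['author']['name'] : '';
            if ( $author !== '' && $author !== 'jenkins-bot' ) {
                $humanCommented = true;
                break;
            }
        }
        if ( !$humanCommented ) {
            echo $change['_number'] . "\t" . $change['subject'] . "\n";
        }
    }
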