[00:00:12] meht: If you create an account on en.wikipedia.org, you'll be able to use that on mediawiki.org
[00:00:25] account creation is a bit borked right now, we're working on fixing it, sorry!
[00:38:24] Hi! Is there any way that I can get the current page's ID via PHP? I'm working on a MagicWord-oriented extension, and getting the page's ID is just about the first step.
[00:39:51] There's the magic word '{{PAGEID}}', but that's not too relevant in terms of writing an extension.
[00:42:39] Orea: Title->getArticleId() ?
[00:43:09] Orea: https://www.mediawiki.org/wiki/Manual:Title.php#getArticleID.28.29
[00:46:28] bd808: That seems to work; however, it does NOT work if you're editing a page (say, I wanted to create a magic word that would display the page's ID when you select preview) -- the output is blank when previewing, but fine when viewing the page directly. Is there not another way?
[00:47:24] {{PAGEID}} somehow manages to get the ID while you're editing, so how can I?
[00:47:37] Hmm… I don't know. I've never played with making a new magic word. Have you looked at how {{PAGEID}} is implemented?
[00:47:42] $parser->getTitle()->getArticleId()
[00:47:43] ?
[00:49:07] legoktm: That appears to work! I was simply running $this->Title->getArticleId().
[00:50:17] Um.
[00:50:41] Yeah, that's not valid.
[00:50:51] And that's why you would've got a blank page from PHP.
[00:50:55] Check your web server error log.
[00:52:27] Krenair: It must have had some form of validity to it - it could generate the page's ID when directly viewed, but not while previewing.
[00:53:20] Viewing a page doesn't necessarily cause it to be reparsed.
[00:54:16] Wait, you got an actual valid ID back?
[00:55:47] Krenair: Yes, '1' - the page ID of the Main Page. It could possibly have been a fluke.
[00:56:23] Well, if you had an actual field of the class called Title, ok...
[01:03:55] Krenair: I was previously outputting numbers to see if the magic word even worked. I probably did something stupid and set the output to '1', though I genuinely do not recall doing so.
[01:13:51] thanks legoktm, I've done so now
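For reference, a minimal sketch of how a custom page-ID variable along the lines discussed above could be wired up. The {{MYPAGEID}} name, the 'mypageid' ID, and the handler bodies are made up for illustration; the hook names follow the standard MediaWiki variable-registration pattern, and the word itself would normally also be declared in the extension's magic-words file.

```php
<?php
// Hypothetical extension snippet: a {{MYPAGEID}} variable that returns the
// ID of the page being parsed, including while previewing an edit.

// Register the variable ID with the parser.
$wgHooks['MagicWordwgVariableIDs'][] = function ( array &$variableIds ) {
	$variableIds[] = 'mypageid';
	return true;
};

// Resolve the variable. Using the parser's Title (rather than a property of
// the extension object) is what makes this work on preview as well.
$wgHooks['ParserGetVariableValueSwitch'][] = function ( $parser, array &$cache, $magicWordId, &$ret ) {
	if ( $magicWordId === 'mypageid' ) {
		$ret = $parser->getTitle()->getArticleID();
		$cache[$magicWordId] = $ret;
	}
	return true;
};
```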
[01:23:30] question time. So these guys I'm working with have MediaWiki installed through a tarball download rather than git, but I'm trying to update their version. If I just do git clone on top of their current installation, will it overwrite/update what's necessary, or am I going to have to delete a bunch of stuff ahead of time and then do git clone?
[01:24:50] georgebarnick: git won't let you clone if the directory already exists and isn't empty
[01:25:04] dammit that's right
[01:25:16] georgebarnick: I would just figure out what changes, if any, they have, set up a git clone in a different directory, and then move everything over
[01:25:37] yeah, that's what I'm gonna do
[01:25:43] thanks :P
[05:12:13] is there a guide on the proper format of wikitext when you're editing? like spacing
[05:12:38] for example I have the option of putting something on the same line, or I can hit enter and leave it below
[05:12:54] both yield the same result, but I don't know what's considered proper
[05:14:30] I mean that's just one example, another would be =Title= vs. = Title = and *Text vs * Text
[05:14:41] and just random stuff like that
[08:26:34] I am working on bug 31331
[08:27:19] I want to know what changes I have to make to the file JavaScriptFFS
[08:27:41] I mean, what is the required output from this file
[08:32:13] alisha: a start would be to ensure it passes validation for the format ;)
[08:33:25] Ok, but can you explain it a little bit more
[08:34:48] alisha: have you located the lines comment 0 is talking about?
[08:36:28] No, I am unable to do so
[08:37:43] Is it JavaScriptFFS.php that is talked about in comment 0?
[08:41:14] alisha: that's fine, the files were split; try now https://bugzilla.wikimedia.org/show_bug.cgi?id=31331#c11
[08:41:39] Note, most of the functions and names mentioned in other comments are from core (or even PHP itself)
[08:42:19] Ok
[08:42:25] I will try it now
[08:52:29] Nemo_bis: I have located the lines
[08:52:45] Now what should I do to proceed further?
[08:53:11] I mean, what are the requirements I should be working towards?
[08:54:10] alisha: can you tell what those lines are doing?
[08:57:20] They are actually counting the number of elements in the segment
[08:57:36] Please correct me if I am wrong
[09:36:59] alisha: well, that's one line; what about the entire "Concatenate separated strings" block
[09:37:14] and beyond that, what about the whole foreach
[09:49:30] can someone give me an example of a wiki with surveys enabled
[09:55:01] pallavi_: https://wikiapiary.com/wiki/Extension:Survey
[09:55:26] (WikiApiary is linked from all extensions' pages, fyi)
[09:56:28] Speaking of which, apparently we're now over 6400 known MediaWiki extensions. https://wikiapiary.com/wiki/Extension:Extensions
[09:59:13] Nemo_bis: Are those staging websites? Can I play around with their enabled extensions?
[09:59:27] no idea, no idea
[10:00:02] they're wikis, just visit them and start doing whatever they let you do, but tidy up after that
[10:23:01] hello everyone!
[10:23:02] I am new to Wikimedia and am getting started with it. I am presently configuring Gerrit for use and get an error while trying to run ssh using
[10:23:02] ssh <username>@gerrit.wikimedia.org -p 29418
[10:23:02] The error being: Permission denied (publickey).
[10:23:02] Could someone please help?
[10:24:34] just to make clear, I did replace <username> with my username
[11:11:09] ankita: is your ssh key added here https://gerrit.wikimedia.org/r/#/settings/ssh-keys
[11:12:26] pallavi_: hi! yes,
[11:12:26] I would also like to point out that I've added keys for two accounts
[11:14:56] ankita:
[11:14:56] Unfortunately, interactive shells are disabled.
[11:14:56] To clone a hosted Git repository, use:
[11:14:57] git clone ssh://<username>@gerrit.wikimedia.org:29418/REPOSITORY_NAME.git
[12:13:43] my search suggest is quite odd. It doesn't find all pages. Typing "ri" finds nothing, for example (pages exist).
[12:24:48] legoktm, your graphs at https://www.reddit.com/tb/2iy10s are a bit misleading. The y-axis doesn't start at 0 :-p
[12:57:49] Is action=purge broken? https://www.mediawiki.org/w/index.php?title=MediaWiki&action=purge redirects to the normal page without the parameter. $request->getVal( 'action' ) returns empty
[12:59:46] Ok, that's a blocker for me
[13:07:45] Subfader: err? what's the issue?
[13:08:01] https://bugzilla.wikimedia.org/show_bug.cgi?id=71978
[13:08:03] Subfader: what else should it do? It purges the cache, then redirects you to the page, which will trigger a parse
[13:08:39] $request->getVal( 'action' ) returns nothing on action=purge. That's a problem.
[13:10:10] Subfader: and you base that observation on your echo?
[13:10:21] not?
[13:10:42] Subfader: because that echo is *probably* not sent to the client at all, as MediaWiki returns an HTTP 302
[13:11:10] I get it in my log: "action: "
[13:11:20] Yes. From the *second* request.
[13:12:44] Ah. Then why is it redirected without the parameter at all?
[13:13:49] Because it's not needed?
[13:14:03] Subfader: this is what happens: You request ?action=purge. MediaWiki purges the cache. MediaWiki returns an HTTP 302 to the plain page URL. You request that URL. MediaWiki has an empty cache, and re-generates the page. MediaWiki serves you the re-generated page.
[13:15:09] Sounds logical, but how am I supposed to validate action=purge in an extension after that? I want to purge the Main Page to flush memcached queries on that page
[13:16:21] Do it via the API and get a proper return value?
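One way to react to a purge without inspecting the request at all (and without worrying about the redirect) is MediaWiki's ArticlePurge hook, which fires when a page is purged. A rough sketch, where the 'newestpages' cache key is a hypothetical stand-in for whatever the extension actually uses:

```php
<?php
// Sketch: clear a custom memcached entry whenever a page is purged.
// The ArticlePurge hook runs for ?action=purge before the redirect is sent,
// so the extension never needs to look at the 'action' request parameter.
$wgHooks['ArticlePurge'][] = function ( $wikiPage ) {
	global $wgMemc;
	// wfMemcKey() builds a wiki-prefixed cache key; 'newestpages' is a
	// hypothetical key name for illustration only.
	$wgMemc->delete( wfMemcKey( 'newestpages', $wikiPage->getTitle()->getArticleID() ) );
	return true;
};
```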
[13:17:18] This breaks lots of extensions and user scripts :(
[13:17:38] Has this behaviour changed recently or something?
[13:17:54] Dunno, I'm coming from 1.16
[13:18:06] what? user scripts don't rely on action=purge returning another action=purge URL
[13:18:38] Reedy: yes, action=purge didn't redirect to the normal page before
[13:19:44] When is before? :)
[13:20:08] 1.19
[13:20:11] in 1.21 it has this behaviour
[13:20:29] I dunno about 1.20
[13:21:22] no, 1.20 already redirects, so it was introduced in 1.20
[13:21:59] Subfader: I'm still confused. Why don't you get fresh data from memcached when the page is parsed?
[13:22:03] * Vulpix could probably do some cleanup of old versions
[13:22:17] Subfader: why do you have a user script to do that...?
[13:23:02] no, I don't have a user script for that. I meant that this change breaks lots of scripts that grab the action parameter to check against "purge".
[13:24:08] I have no clue why a user script would want to do that.
[13:28:31] Then not. I need to do it with https://www.mediawiki.org/wiki/Extension:Newest_Pages now :)
[13:32:55] Nonetheless I'm not sure if the function showDiffPage() really grabs action=purge properly before the redirect.
[13:33:29] Couldn't I just disable the redirect? It would save me lots of work.
[13:35:49] Not without hacking core
[13:35:52] public function onSuccess() {
[13:35:53] $this->getOutput()->redirect( $this->getTitle()->getFullURL( $this->redirectParams ) );
[13:35:53] }
[13:36:17] I hack the core all the time. That's why I was on 1.16 so long :) Cheers, trying
[13:37:16] I still suggest using the API
[13:37:18] do an internal request
[13:37:18] https://en.wikipedia.org/w/api.php?action=purge&titles=Main_Page|API
[13:46:33] I haven't worked with the API yet. I'll keep that for later. Atm I just want to upgrade.
[13:47:33] The redirect is suppressed, but the action is returned as "view". Somewhere certain actions are falling back to "view", right?
[13:53:52] Subfader: in PHP, when you request a page with action=purge, I guess it receives "purge" as the action, but you can't output anything from that request because it's being redirected
[13:55:08] Reedy posted above how to suppress that redirect. I can grab the action parameter now, but it falls back to "view". I guess like others do as well, e.g. 'revisiondelete'
[13:55:09] I don't know if all the hooks that are normally triggered with action=view will be triggered for action=purge. I guess some of them won't be, especially those that are for output
[13:56:01] I guess you only need to use the proper hook for that
[13:59:06] Action::getActionName( $this->getContext() ) >> returns "view", which is logical
[13:59:14] $request->getVal( 'action' ) >> returns nothing
[14:01:09] Subfader: that code is being executed for action=view but not for action=purge. What hook are you using?
[14:01:36] otherwise it would execute twice: once for action=purge, and again for action=view when the page redirects
[14:02:32] none, I guess. https://www.mediawiki.org/wiki/Extension:Newest_Pages
[14:03:19] It's just a special page
[14:04:08] Can I make a direct call to $_GET['action'] safe easily?
[14:04:21] can you actually purge a special page? does it make sense?
[14:04:45] It gets included
[14:05:45] It doesn't seem to be cached
[14:05:52] Subfader: i.e. if you purge the page it's included in, that page will automatically get the new version on the next parse
[14:05:53] I do ;)
[14:08:56] but I memcache each of the outputs, e.g. 50 newest articles (3 minutes), 10 newest pages in a custom NS (20 minutes). On each of these queries I check for action=purge to flush the memcache
[14:10:46] So when I add a new hot page in the custom NS which I want to show immediately, I flush the Main Page
[14:11:21] use a hook on page creation/move/delete/undelete to flush that memcache key, and not on action=purge
[14:14:08] Already thought about that much more elegant solution. A) I write an extension where I handle all these hooks once, or B) I call the hooks on every hack where I memcache stuff.
[14:14:52] A) requires complicated memc key passing. I asked if I can regex match the keys: https://www.mediawiki.org/wiki/Thread:Project:Support_desk/Delete_various_Memcache_by_regex_matching_the_keys
[14:15:17] no, you can't regex memcache keys
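A rough sketch of that hook-based approach: clear the extension's cached listings whenever a page is created/edited, deleted, moved, or undeleted, instead of relying on action=purge. The hook names below are the core ones from that era; the 'newestpages' keys and the helper name are made up for illustration.

```php
<?php
// Sketch: invalidate the extension's memcached listings when content changes,
// so the Main Page picks up new entries without anyone having to purge.
function flushNewestPagesCache() {
	global $wgMemc;
	// One key per cached listing (illustrative key names).
	$wgMemc->delete( wfMemcKey( 'newestpages', 'articles' ) );
	$wgMemc->delete( wfMemcKey( 'newestpages', 'custom-ns' ) );
	return true;
}

// Creation/edit, deletion, move and undeletion all invalidate the listings.
$wgHooks['PageContentSaveComplete'][] = 'flushNewestPagesCache';
$wgHooks['ArticleDeleteComplete'][] = 'flushNewestPagesCache';
$wgHooks['TitleMoveComplete'][] = 'flushNewestPagesCache';
$wgHooks['ArticleUndelete'][] = 'flushNewestPagesCache';
```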
[14:16:40] Ok, so B then. If this works I could bring back the purge redirect for the sake of it :P
[14:23:59] That makes the purge tab from https://www.mediawiki.org/wiki/Extension:Purge not work in all cases then :/
[14:24:38] I can remove the tab on pages where I use memcached queries.
[14:24:50] See: Trouble! ;)
[14:26:21] Would calling the API be complicated? I have no clue how to use it internally. I've only used JSON output so far.
[14:27:07] Nope
[14:27:25] Now to remember what the bot's keyword for this is
[14:27:27] !apiinternal
[14:27:27] https://www.mediawiki.org/wiki/API:Calling_internally
[14:27:33] woo, first time
[14:27:40] :P
[14:29:18] Cheers!
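For reference, an internal purge call along the lines of the page linked above looks roughly like this. It's a sketch: 'Main Page' is just an example title, and how you read the result back differs between MediaWiki versions.

```php
<?php
// Sketch: purge a page through the web API from inside MediaWiki code,
// following the pattern documented at API:Calling_internally.
$request = new FauxRequest(
	array(
		'action' => 'purge',
		'titles' => 'Main Page', // example title
	),
	true // treat the faux request as a POST
);

$api = new ApiMain( $request, true ); // second argument enables write actions
$api->execute();
// The outcome can be inspected via $api->getResult(); the exact accessor
// for the result data varies by MediaWiki version.
```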
[14:33:16] pallavi_: hey, sorry for the delay, certain network issues
[14:33:46] could you please explain what the last line is supposed to do?
[14:34:55] isn't ssh <username>@gerrit.wikimedia.org -p 29418 meant to add the host to the list of known hosts?
[14:36:55] ankita: if it isn't already on the list, it should prompt you to add it to the list. Maybe there's a config option to disallow anything that's not explicitly in known hosts
[14:38:41] Vulpix: this is the complete thing that appeared
[14:38:47] https://www.irccloud.com/pastebin/yDYJCRdp
[14:39:17] so, there was an error after adding it to the list of hosts
[14:39:20] okay, so it has been added because it wasn't on the list
[14:39:47] the error "Permission denied (publickey)." is unrelated to the known hosts
[14:41:03] actually, what I am trying to do is "Run ssh. (Do be paranoid and check for the matching SSH fingerprint for gerrit.wikimedia.org:29418 when logging in for the first time.) $ ssh <username>@gerrit.wikimedia.org -p 29418"
[14:41:15] did you upload your public key to Gerrit? Are you using that key to authenticate through ssh?
[14:41:31] This is as per the instructions given on the Gerrit tutorial wiki
[14:41:48] I uploaded keys for two different accounts
[14:43:48] I tried adding the private keys of both accounts and they both give the same error. I tried them one after the other
[14:44:17] if I enter a wrong username, it throws me the same error
[14:44:25] also, I tried the case-sensitive username as shown in Gerrit, but even that was no use
[14:44:53] the username is supposed to be the git global username, right?
[14:46:23] it's the username you use to log in to Gerrit
[14:46:54] try passing "-vvv" to the ssh command (without quotes), to get debug output
[14:46:56] ok then, I used the same. In fact, Gerrit seems to capitalise the first letter; I even tried that
[14:47:18] Reedy: I don't quite understand what I want from the API. Can it tell me that the user came from action=purge after the redirect?
[14:48:37] Vulpix: debug3: Incorrect RSA1 identifier
[14:48:56] this seems to be causing the problem
[14:51:37] https://www.irccloud.com/pastebin/2ol0wIOf
[14:51:50] This is the detailed error report
[14:52:48] Vulpix: could using two RSA keys be a problem?
[14:52:48] I have added one corresponding to the email ID I use for Gerrit.
[14:53:16] The other one corresponds to the GitHub email ID
[14:53:18] apparently the server rejected your public key id_rsa_707 for your username
[14:53:41] that's the same thing it outputs for me if I attempt to connect specifying a different username
[14:53:58] what could be a possible solution to this?
[14:54:25] shall I regenerate the keys for id_rsa_707 and update the public key on Gerrit?
[14:54:46] check that you've uploaded your public key and not the private key (if that's ever possible)
[14:55:24] the public key is the one that ends with .pub
[14:57:01] thedj: Hi, how are you? To me https://gerrit.wikimedia.org/r/#/c/4060/ seems to be an improvement; what's the reason for your -2? We should see if we either abandon or merge the change
[14:59:34] public keys, checked
[14:59:38] Vulpix:
[15:04:14] * Vulpix away
[17:28:44] Do many templates (like {{Translate|word}}) on one page compromise MediaWiki efficiency (increase CPU load)?
[17:45:39] porton: quantity is unlikely to be an issue in itself, unless there are tens of thousands of them
[17:46:21] How do I update pages using a template after I've edited the template?
[17:47:27] ideally they should purge themselves after a bit, but I've seen lots of people on personal wikis where this doesn't happen
[17:48:12] I also see delayed (slow) changing of the color of links to missing pages after I create these pages anew
[17:48:38] What is the problem with the delayed update?
[17:49:45] in theory, if the template/link is being used on few pages, they are purged immediately. But that doesn't always seem to be true
[17:50:25] Vulpix: I have really a lot of pages and most links are to as-yet-nonexistent pages
[17:50:57] they'll be purged by the job queue, then
[17:51:44] Vulpix: `crontab -l` says "no crontab for root"
[17:52:30] ... `crontab -l -u withoutvowels` says "no crontab for withoutvowels" (withoutvowels is the username the wiki runs under)
[17:52:40] Nemo_bis: I have gone through the lines mentioned in the comments of bug 31331
[17:52:46] Vulpix: which job queue?
[17:53:24] Please tell me what the real bug is that is to be fixed
[17:53:38] And what is the desired output?
[17:54:19] !job
[17:54:19] http://integration.wikimedia.org/ci/job/
[17:54:26] !jobqueue
[17:54:26] The Job Queue is a way for MediaWiki to run large update jobs in the background. See http://www.mediawiki.org/wiki/Manual:Job_queue
[17:54:32] porton: ^
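For context on the job queue: the re-rendering of pages after a template edit (and the link-colour updates) happens through background jobs. By default MediaWiki runs a job at the end of each web request; without that, a cron entry calling maintenance/runJobs.php is the usual substitute. A minimal LocalSettings.php sketch, with the path and schedule in the comment being purely illustrative:

```php
<?php
// LocalSettings.php sketch: how job-queue processing is commonly configured.

// Default is 1: run (up to) one queued job at the end of each web request.
// Raise it if the queue grows faster than it gets processed.
$wgJobRunRate = 1;

// Alternatively, disable in-request processing and run jobs from cron, e.g.:
//   */5 * * * * php /path/to/wiki/maintenance/runJobs.php --maxjobs 100
// $wgJobRunRate = 0;
```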
[17:55:22] alisha: the desired output is that the output stays the same
[17:56:37] But with fewer lines of code
[17:56:46] Ok
[17:57:02] so basically I have to reduce the lines of code
[17:57:24] or have I understood it wrong?
[17:57:57] The bug is about using standard functions to reduce mistakes in JSON production
[17:58:19] PleaseStand linked a function in core which you'll probably need to use
[17:58:28] (with the utf8 option)
[17:58:55] Comment 0 also mentions a specific bug the current implementation might have (single quotes)
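Assuming the core function being referred to is FormatJson (the comment itself isn't quoted here, so that's an assumption), the idea is to let the encoder handle quoting and escaping instead of concatenating strings by hand. A rough sketch, with $data standing in for whatever the FFS code assembles:

```php
<?php
// Sketch: produce JSON with core's FormatJson rather than manual string
// concatenation, so escaping (e.g. of single quotes) is handled for you.
// $data is a placeholder for the structure the FFS code builds up.
$data = array(
	'key'     => "value with 'single quotes' and \"double quotes\"",
	'unicode' => 'naïve – ♥',
);

// Second argument: pretty-print the output.
// Third argument: leave multibyte UTF-8 characters unescaped
// (presumably the "utf8 option" mentioned above).
$json = FormatJson::encode( $data, true, FormatJson::UTF8_OK );
```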
[18:00:14] is it possible to restore MediaWiki images if I have FTP access but not shell access?
[18:00:33] yes
[18:00:45] yes, they should be in the deleted directory
[18:01:34] sorry, incorrect phrasing. I want to migrate to another server. I can copy the /images folder but I don't have shell access to run importImages.php
[18:02:23] gleki: if you copy both the images and the database, you don't need importImages.php at all
[18:22:53] hello
[18:23:10] new to IRC - in need of help with MediaWiki
[18:24:14] anyone who knows how to edit a page at fornits? can't seem to find the solution
[18:26:55] reglohnesnej: fornits?
[18:28:39] Did I say something wrong?
[18:44:09] Hi, I'm trying to test MediaWiki's OAuth extension. Am I supposed to pass the consumer key when calling Special:OAuth/initiate?
[18:53:04] hello again!
[18:53:40] regarding the problems that I faced while trying to run the command: ssh <username>@gerrit.wikimedia.org -p 29418
[18:53:55] I got the error: Permission denied (publickey).
[18:54:29] Googling the error, I found out that changing the sshd_config file might help
[18:54:49] but in /etc/ssh/ I don't have a file by that name
[18:55:04] rather, there's one by the name ssh_config
[18:55:21] could someone please help me figure this out?
[18:56:31] The relevant links, in case these might be useful to someone else facing similar problems:
[18:56:31] http://superuser.com/questions/543626/ssh-permission-denied-on-correct-password-authentication
[18:56:31] http://serverfault.com/questions/55343/cant-get-ssh-public-key-authentication-to-work
[18:56:31] http://stackoverflow.com/questions/7086678/permission-denied-publickey-when-connecting-to-aws-sever
[18:57:23] ankita: have you uploaded your public key to Gerrit?
[18:57:31] I am new to Gerrit and faced the errors while setting it up
[18:57:41] MatmaRex: yes, I did
[18:57:48] https://gerrit.wikimedia.org/r/#/settings/ssh-keys
[18:58:06] hm, okay. Are you sure that git and ssh use the corresponding private key when communicating with Gerrit?
[18:58:12] I have added my public keys pertaining to two different accounts; I hope that is not causing a problem
[18:58:46] MatmaRex: I'm not sure about this. How can I check if they are?
[18:59:19] uh, I don't really know how to verify that, but surely it must be possible
[18:59:31] I'll try googling
[19:00:05] hmm, you can run `ssh -i /path/to/id_rsa` to make it use the chosen private key
[19:05:54] MatmaRex: Can something of this type be useful (1st answer)?
[19:05:55] http://stackoverflow.com/questions/3225862/multiple-github-accounts-ssh-config
[19:06:39] ankita: no idea
[19:08:06] I am getting the same error
[19:08:34] I have tried a lot of things: tried the private keys of both accounts with ssh-add
[19:08:57] made the email ID that I use for Gerrit primary on GitHub
[19:09:43] MatmaRex: Do you have any alternate suggestion that I might try?
[19:11:01] ankita: well, you can submit patches to Gerrit without the command-line tools, but that's not really pleasant in the long run
[19:11:21] using https://tools.wmflabs.org/gerrit-patch-uploader/
[19:11:27] yes, plus it would be more convenient using the CLI
[19:12:11] I don't know how to help you, sorry :( you might have better luck asking after the weekend, there are more people online then
[19:15:41] sure!
[19:16:02] MatmaRex: thanks for your time and support
[20:32:54] Why is this wiki so fast? https://openhatch.org/wiki/Main_Page
[20:45:28] cloudflare
[21:20:31] Hi! Need some opinion or diagnosis concerning an issue. Anyone there?
[21:22:55] !ask | Bobmorane
[21:22:56] Bobmorane: Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :)
[21:23:31] Ok, sry
[21:25:39] No response from Special:Upload when uploading a picture. MediaWiki or server issue? Here is the page: http://wiki.vpmedias.fr/Sp%C3%A9cial:T%C3%A9l%C3%A9verser
[21:26:20] Whatever type of picture it is
[21:28:16] Bobmorane: What exactly do you mean by no response? Timeout? White page?
[21:29:17] Back to the same page, empty form
[21:30:21] FoxT: Back to the same page, empty form
[21:31:26] Bobmorane: This usually means that either nothing is transmitted with the form or that the server ignores it. But really I have no idea what the reason could be.
[21:44:42] FoxT: Okay. So probably the server and restricted access somewhere, because the wiki is still stable, properly updated, and the other forms work... I'll check it. Thanks
[22:04:56] FoxT: thanks
[22:15:05] valhallasw`cloud: I just took them from Grafana https://grafana.wikimedia.org/#/dashboard/db/edit-performance
[22:26:59] valhallasw`cloud: but I think for average Wikipedians, the graphs make a clear point :P
[22:28:58] I'm having trouble finding the source page URL of a transclusion. There's a {{Content}} inserted, but I can't find anything like http://mediawikiinstall/Inserted
[22:29:17] Would be http://mediawikiinstall/Content for the example above.... =P
[22:29:56] vmassuchetto: You are looking for http://mediawikiinstall/Template:Content
[22:30:31] FoxT, awesome, thanks.
[22:30:35] np :)
[22:39:17] Legoktm: I guess :-P
[23:49:59] !mail
[23:49:59] http://lists.wikimedia.org/pipermail/
[23:50:07] !passwordreset
[23:50:07] For information regarding the resetting of a user's password, please see
[23:58:07] I misread vmassuchetto as marktraceur.
[23:58:08] Weird.