[00:00:19] yeah
[00:00:27] what have you tried?
[00:08:22] MatmaRex: Up until now, I was using Mediawiki-Butt to do a basic search and grab page titles with the get_search_results function, but it seems a bit on the sparse side functionality-wise. With the mediawiki_api gem I'd like to allow users to specify a pagename with "!w pagename", find out if the page exists, and if so, return it (possibly with the first sentence or two). If the page doesn't exist, I'd like to do a limited search and
[00:08:22] return the first 3 results.
[00:09:17] (i'll be right back, sorry)
[00:09:25] No prob! I'm here all night.
[00:13:45] I guess I'm just getting tripped up by syntax with mediawiki_api. I don't see where any of the API actions actually return data.
[00:23:24] manafount: i'm not familiar with that gem, but with plain API queries, these would be:
[00:24:27] https://en.wikipedia.org/w/api.php?action=opensearch&format=json&search=Dog&limit=3
[00:24:41] for searching
[00:24:50] …and that actually returns the first sentences of each article too. i didn't even know that.
[00:25:27] you don't really need any special clients. just send the HTTPS request (replace "Dog" with your search query, percent-encoded) and parse the JSON output
[00:25:31] Neat! Is OpenSearch the standard search function for other mediawiki sites as well?
[00:25:45] yeah, this is what powers the search box on all pages
[00:26:30] Cool. How about finding out if a specific page exists? Is that possible through the API?
[00:28:01] can the search do things like "all pages that reference a category name" ?
[00:28:32] manafount: yeah. most of the interesting things are under action=query
[00:29:02] manafount: and btw, there's the full API documentation: https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1 (long page)
[00:29:13] seems to work /api.php?action=opensearch&format=json&search=[[Category:Help]]
[00:29:14] and here's a clever sandbox thing for playing with queries: https://en.wikipedia.org/wiki/Special:ApiSandbox#action=opensearch&format=json&search=Dog&limit=3
[00:29:26] Thanks. Was just practicing with the sandbox :)
[00:29:36] TimRiker: maybe, i don't really know. i do know that the Discovery team implemented some fancy stuff
[00:29:45] it's probably documented somewhere on mediawiki.org
[00:30:37] manafount: to just check for existence, it's enough to use action=query. :) https://en.wikipedia.org/wiki/Special:ApiSandbox#action=query&format=json&titles=dog%7Casdfasdf
[00:30:42] hmm. no format=rss
[00:30:53] (you can check multiple titles at the same time by separating them with '|')
[00:31:19] I don't think I need to log in or anything to make API calls in my wiki, but I do have a bot account set up. Is there any benefit to logging in as a bot before executing any of those https requests?
[00:31:54] manafount: you can ask for up to 5000 results at once instead of up to 500
[00:32:07] Ah. That seems a bit excessive for what I need.
[00:32:10] ;)
[00:32:16] I think you've answered all my questions, though. Thanks so much!
[00:32:40] manafount: it's good to get an account and set a User-Agent header in your requests if you plan to do a lot of queries
[00:33:12] (so in case something goes wrong, people would be able to contact you)
[00:34:25] I'll work on that. I'm a bit new to ruby and haven't done any https requests or html parsing through it, so that's my next task.
[00:36:35] Net::HTTP is kind of lame, but there are a few gems with nicer interfaces. i think httparty and restclient are two big ones? but it's been a while since i wrote stuff in Ruby
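(Below is a minimal, untested sketch of the "!w pagename" flow discussed above, using only Ruby's standard library rather than mediawiki_api or MediaWiki::Butt. The helper names are made up for illustration, and the User-Agent string is a placeholder for your real contact info, per MatmaRex's advice; swap in your own wiki's api.php URL.)

```ruby
# Hypothetical sketch of the "!w pagename" flow, standard library only.
require 'net/http'
require 'json'
require 'uri'

API = URI('https://en.wikipedia.org/w/api.php') # swap in your own wiki's api.php
USER_AGENT = 'MyIrcBot/0.1 (contact: you@example.com)' # placeholder contact info

def api_get(params)
  uri = API.dup
  uri.query = URI.encode_www_form(params)
  req = Net::HTTP::Get.new(uri)
  req['User-Agent'] = USER_AGENT
  res = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true) { |http| http.request(req) }
  JSON.parse(res.body)
end

# action=query flags nonexistent pages with a "missing" key; multiple titles
# can be checked in one request by joining them with '|'.
def page_exists?(title)
  pages = api_get(action: 'query', format: 'json', titles: title)['query']['pages']
  pages.values.none? { |page| page.key?('missing') }
end

# opensearch returns [query, [titles], [first sentences], [urls]]; we take the titles.
def top_matches(term, limit = 3)
  api_get(action: 'opensearch', format: 'json', search: term, limit: limit)[1]
end

def handle_w(pagename)
  if page_exists?(pagename)
    "https://en.wikipedia.org/wiki/#{URI.encode_www_form_component(pagename)}"
  else
    matches = top_matches(pagename)
    matches.empty? ? 'No results.' : "Not found. Did you mean: #{matches.join(', ')}?"
  end
end

puts handle_w('Dog')
```

(httparty or rest-client would shrink api_get to a one-liner, but the request/parse shape stays the same.)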
[00:37:14] hmm. opensearch does not find the same pages that action=search finds
[00:37:27] MatmaRex: Thanks. I was looking at Net::HTTP, so I appreciate the suggestions.
[00:37:51] timr
[00:37:53] bah.
[00:38:11] TimRiker: it's geared more towards "search suggestions" than "search", so i guess the results may differ
[00:38:41] there's some irc channel here where the search people lurk. #wikimedia-discovery, probably?
[00:42:07] k. some other time. thx
[01:43:29] hey dudes, got some problems, anyone here who could help me with the following log entry:
[01:43:30] [fcgid:warn] [pid 24878] (32)Broken pipe: [client 91.63.171.155:50635] mod_fcgid: ap_pass_brigade failed in handle_request_ipc function
[02:03:14] Hi Guest72764.
[02:03:34] Are you getting an error when trying to install or configure the MediaWiki application?
[02:09:32] no, everything else seems fine...
[02:10:25] but i get several dozen error-log entries per hour
[02:14:12] Have you tried searching for the error in a search engine?
[04:23:37] Anyone using varnish here? Trying to find a sane solution to dealing with the Vary: cookie header that MediaWiki sets.
[04:26:27] I'm thinking the right thing to do is whitelist cookies that matter, but haven't quite gotten that working yet..
[04:31:58] Also, why is 'vary: cookie' set in the first place? This seems like overkill
[04:33:11] Hi, is anyone around who might be familiar with IEGs?
[04:51:16] well, got the whitelisting working in varnish; if anyone else has experience with this, I'd still be interested in how you guys handled it.
[06:45:40] !fast
[06:45:40]
[09:01:38] ori around?
[09:02:17] HakanIST: barely. what's up?
[09:26:46] hello
[09:26:50] !list
[09:26:50] mediawiki-l and wikitech-l are the primary mailing lists for MediaWiki-related issues. See http://lists.wikimedia.org/ for details.
[09:41:53] ori there seems to be an issue with the mobile talk page again, posted messages go to [[Undefined]]
[09:43:44] HakanIST: That is so annoying. I don't develop that code, but I can help make sure it gets fixed. Could I ask you again to file a task?
[09:45:37] ori sure thing, I'm waiting for more info from a user on how to reproduce
[09:46:15] HakanIST: thanks. I may not be able to respond for another 8 hours :-/
[12:14:49] Hi, can someone please help me with a query in wikimedia?
[12:15:04] I would like to extract the pagelinks of a category page and I can't seem to find a solution
[12:15:27] For example, this category: https://en.wikipedia.org/wiki/Category:Aviation_accidents_and_incidents_in_2004
[12:16:45] I'm interested in extracting the links of, for example, China Eastern Airlines Flight 5210, Flash Airlines Flight 604, etc.
[12:17:08] But contrary to templates, it seems it is not straightforward
[12:17:14] can someone please help me with this query?
[12:18:14] Ruthygg: https://en.wikipedia.org/w/api.php?action=help&modules=parse ??
[12:19:10] Ruthygg: https://en.wikipedia.org/w/api.php?action=parse&title=Category:Aviation_accidents_and_incidents_in_2004&prop=links
[12:19:41] oops, it should be https://en.wikipedia.org/w/api.php?action=parse&page=Category:Aviation_accidents_and_incidents_in_2004&prop=links
[12:20:02] Ruthygg: then do your own parsing, python, ruby, whatever language
[12:20:36] Thanks, jirib, but I get that result from MySQL as well
[12:20:41] so what I don't understand
[12:20:49] is why I can't get the pagelinks in that category
[12:20:54] for example, if you open that page
[12:21:06] https://en.wikipedia.org/wiki/Category:Aviation_accidents_and_incidents_in_2004
[12:21:25] You will see a link to China Eastern Airlines Flight 5210
[12:21:29] and others
[12:21:50] why is it that I can't get this link in MySQL, and in your link https://en.wikipedia.org/w/api.php?action=parse&page=Category:Aviation_accidents_and_incidents_in_2004&prop=links
[12:21:53] they don't appear
[12:22:35] :(
[12:27:37] Ruthygg: because this is not a real page
[12:29:24] What do you mean?
[12:29:36] How could I get access to those links?
[12:38:18] Ruthygg: that the page is generated: https://en.wikipedia.org/w/api.php?action=parse&page=Category:Aviation_accidents_and_incidents_in_2004&prop=wikitext
[12:40:29] so what do you mean, this is a different page?
[12:40:32] Ruthygg: you need action=query https://en.wikipedia.org/w/api.php?action=help&modules=query
[12:41:40] action query?
[12:42:41] Can you please explain a bit more? Is it not possible to do it with a simple query?
[12:47:21] Ruthygg: https://en.wikipedia.org/w/api.php?action=query&cmtitle=Category:Aviation_accidents_and_incidents_in_2004&list=categorymembers
[12:47:33] https://www.mediawiki.org/wiki/API:Categorymembers
[13:04:18] Thanks a lot jirib!!!
[13:04:36] Ruthygg: urw
[13:04:38] From here I can figure it out! Thanks a lot for taking the time for this, I appreciate it :)
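(A standard-library Ruby sketch of the list=categorymembers query jirib links to, with continuation handling so categories larger than one batch come back complete. `category_members` is a hypothetical helper name, and format=json is added explicitly since jirib's example URL omits it.)

```ruby
# Hypothetical helper: list every member of a category via the API,
# following 'continue' tokens across batches. Standard library only.
require 'net/http'
require 'json'
require 'uri'

API = URI('https://en.wikipedia.org/w/api.php')

def category_members(category)
  titles = []
  continuation = {}
  loop do
    uri = API.dup
    uri.query = URI.encode_www_form(
      { action: 'query', format: 'json', list: 'categorymembers',
        cmtitle: category, cmlimit: 500 }.merge(continuation)
    )
    data = JSON.parse(Net::HTTP.get_response(uri).body)
    titles.concat(data['query']['categorymembers'].map { |m| m['title'] })
    break unless data['continue']   # absent once the last batch is returned
    continuation = data['continue'] # carries the cmcontinue + continue tokens
  end
  titles
end

puts category_members('Category:Aviation accidents and incidents in 2004')
```

(This is also why action=parse&prop=links comes back empty here: category listings are generated at render time, so the member pages never appear in the stored wikitext or in pagelinks.)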
[14:16:44] morning
[15:28:30] Hi! I'm looking to apply here, can anyone suggest something for me to start with?
[15:33:23] malabika: here for GSoC?
[15:34:14] codezee: No, for Outreachy
[15:35:08] malabika: https://www.mediawiki.org/wiki/Outreachy/Round_12
[17:24:50] hi. what's the best (read: working) method to get rid of spammers on a mediawiki install? We have lots of that: https://wiki.videolan.org/Special:RecentChanges/ .
[17:28:42] hi Mediawikians
[17:28:51] thresh, https://www.mediawiki.org/wiki/Manual:Combating_spam
[17:29:18] is there a way to display a user's name as a magic word on a page they are visiting?
[17:30:25] andre__: "Blocking edits by new or anonymous users to specific often-targeted pages", how do I go about that?
[17:30:58] to put it simply, a page that says "welcome, XUsernamePerson!"
[17:32:14] thresh, https://www.mediawiki.org/wiki/Manual:Combating_spam#Individual_page_protection ?
[17:32:50] (I have a more interesting use case in mind, but that's the functionality I need)
[17:36:38] legoktm: would you know?
[17:37:16] Pharos: is this for a Wikimedia wiki?
[17:37:21] there are a few different ways to do it
[17:37:30] yes, for enwiki
[17:37:35] andre__: yeah, unfortunately we don't have specific pages that are spammed. :/
[17:38:02] andre__: I was thinking about some kind of "auto-confirm users after a 24h period" or such.
[17:38:28] Pharos: it would have to be done through a JavaScript gadget then
[17:38:51] legoktm: let me know if another channel is more appropriate for wikimedia-specific mediawiki stuff
[17:39:03] usually #wikimedia-tech
[17:39:22] ok, i'll shift there if that's better
[17:51:55] thresh, https://www.mediawiki.org/wiki/Manual:$wgAutoConfirmAge
[17:53:38] andre__: yeah, I've just now set $wgGroupPermissions['user']['createpage'] = false; (same for edit, createtalk), and $wgAutoConfirmAge to 3 hours.
[17:53:41] andre__: thanks.
[17:54:23] let's see if that helps. maybe manual spammers are not very patient.
[18:57:04] I have mediawiki 1.26.2 deployed in 3 different "lanes". in my local and dev lanes things work, but in production I have this issue: syntaxhighlight blocks are gone when invoking VisualEditor.
[18:57:49] Hello. As a PhD student working on research in software engineering, I would like one-time access to the following data: https://wikitech.wikimedia.org/wiki/Analytics/Data/Webrequest Could you please advise me which contact person I should write to?
[18:58:51] They are there for WikiEditor in source edit, preview, page view, etc., but when VisualEditor kicks in they go away. on my other lanes they show up fine and allow the popup block editor for the syntaxhighlight block.
[19:00:00] dash_2: See https://meta.wikimedia.org/wiki/Research:Access_to_non-public_data
[19:00:02] example page: https://tech.lds.org/wiki/VisualEditor unfortunately my other lanes that work fine are not publicly accessible.
[19:01:04] dash_2: There's also some page view data at https://dumps.wikimedia.org/other/ which is just publicly available, no special permission required
[19:01:36] dash_2: Also, there's a channel for analytics people at #wikimedia-analytics which might be able to answer your questions better
[19:01:49] sry, in order to try the edit, you would have to sign up for a free ldsaccount. same account used for familysearch.org if any here are genealogy buffs.
[19:02:07] bawolff: Thank you very much.
[19:02:34] I'm running the same mediawiki and parsoid code deployed in each of my lanes.
[19:02:45] TimRiker: Yeah, odd. The SyntaxHighlight block doesn't make it into the CE.
[19:02:54] TimRiker: Or the data model.
[19:03:16] ya. no idea why. it works fine on my other installs.
[19:03:40] TimRiker: Inserting a new block looks to work OK…
[19:04:01] hmm. didn't try that.. testing.
[19:04:11] TimRiker: If you edit the content to drop the ` line="1"` does it work?
[19:04:51] James_F, nope. just made that change.
[19:06:28] hmm. I tried inserting a code<> block, and the insert dialog came up, let me enter text, but then didn't show it after i hit insert.
[19:07:36] interesting. saving the page has both paragraphs there. the original and the one I added. neither showed on the page in the editor.
[19:07:48] TimRiker: Did you enter content in the "code" bit or just in "language"? Empty blocks aren't inserted.
[19:08:44] TimRiker: (As an aside, you may wish to have your skin use the Apex theme of OOUI rather than the MediaWiki one, depending on what you're looking to get done.)
[19:09:46] James_F, yes. I entered code and saved it. you can see the update there.
[19:10:26] Eurgh, yeah.
[19:11:20] what's the impact of the apex vs mediawiki theme? by the way, the default themes are installed there too. same results in them.
[19:12:01] TimRiker: I'm really puzzled by this issue.
[19:12:01] TimRiker: Apex looks like: https://doc.wikimedia.org/oojs-ui/master/demos/#toolbars-apex-ltr whereas MediaWiki looks like: https://doc.wikimedia.org/oojs-ui/master/demos/#toolbars-mediawiki-ltr
[19:12:24] TimRiker: (MediaWiki is the default, but Apex's design feels a bit closer to the current look and feel.)
[19:13:05] ah. I see what you mean. thx.
[19:13:18] James_F, well, I'm glad I'm not the only puzzled one. :)
[19:13:59] https://tech.lds.org/wiki/VisualEditor?veaction=edit&useskin=vector (again, you need a free account)
[19:15:35] a long-shot theory is that there is something in the visualeditor request from the browser that looks like a sql attack or something to our firewall and is getting blocked. I don't see any error codes on the network requests though.
[19:17:25] TimRiker: It's most perplexing, indeed. :-( I cannot give a reasonable answer as to what's happening. The blocks show in read mode (extension installed), they can be inserted in the editor (registration working), they serialise (new ones get saved into wikitext == converter code is present and working compatibly), but they don't get into the data model at all.
[19:17:46] TimRiker: The 'normal' reason people have issues is version incompatibilities, but it's the same versions of the code on each server, right?
[19:18:24] yes. same code. both parsoid and mediawiki deployed from the same git repo. let me check and restart parsoid just to be sure.
[19:18:54] It's not going to be a Parsoid issue.
[19:19:14] It's most likely a VisualEditor or SyntaxHighlight up-to-date-ness issue.
[19:19:14] I just can't work out how.
[19:19:51] same version of VisualEditor and SyntaxHighlight as deployed on my other lanes, and it works fine there.
[19:20:12] Including the sub-module?
[19:20:18] (For VE core inside VE-MW.)
[19:21:43] yes. all deployed from the same git repo. all up to date with HEAD. (just double-checked)
[19:22:02] both my dev lane and production are on the same rhel release with the same packages installed.
[19:22:30] both are proxied through a loadbalancer/firewall that handles the SSO and ssl.
[19:24:11] * James_F nods.
[19:24:11] That's what I assumed.
[19:24:11] * James_F sighs.
[19:25:00] if I turn on debug flags so I get the "Dump Model" widget, I don't see a mwBlockSyntaxHighlight in the view on production. but I DO see that on local and dev lanes with the same code.
[19:25:58] Yeah.
[19:26:16] That's how I know something very odd is broken.
[19:28:24] Help
[19:28:35] with?
[19:29:16] how do I login to resubmit documents on camsoda.com
[19:30:00] (camsoda.com is not mediawiki, and nsfw, in case anyone is wondering)
[19:30:56] TimRiker: What's this free account thing? I'm interested in poking at your problem, but https://tech.lds.org/wiki/VisualEditor?veaction=edit&useskin=vector 404s for me
[19:31:06] thank u Lori..
[19:31:21] hmm.. another theory: the apps are deployed on different paths. there could be some unexpected impact of some rewrite rules. again, I don't see any incorrect return results from what I can see. digging into that.
[19:32:47] avery: can't help you, sorry.
[19:33:19] I have no way of contacting camsoda support
[19:38:42] thanks anyway
[20:02:46] ParserError: Failed to parse the JSON response for Extension Parse - hmm. I am seeing some errors in the parsoid log. I didn't notice those before.
[20:03:17] Hello
[20:03:34] RoanKattouw, https://ldsaccount.lds.org/register
[20:04:20] I am so sorry, but I want to discuss vandalism by Azerbaijani Wikipedia sysops.
[20:04:30] Am I in the right channel?
[20:05:35] Aabdullayev851: I'm afraid not, I'm not sure if that local Wikipedia has an IRC channel
[20:05:56] Aabdullayev851: perhaps check in #wikipedia-en ?
[20:06:03] The local Wikipedia has an IRC channel
[20:06:19] But I want to speak to someone from Meta
[20:06:56] Can somebody stop our sysops?
[20:07:08] Aabdullayev851: Ah right, #wikimedia?
[20:07:18] or #wikimedia-stewards
[20:07:47] requesting /tech.lds.org/v3/page/html/VisualEditor/71604 from the parsoid instance gets a page back without the blocks, and an error in the logs
[20:08:47] http://pastebin.com/Ntb7mRRY
[20:08:50] Aabdullayev851: I would recommend #wikimedia and asking there if there is a more appropriate channel
[20:10:44] @myrcx thanks
[20:10:46] what does parsoid call on api.php to get a page?
[20:12:47] this looks ok to me. http://tech.lds.org/wiki/api.php?format=json&action=query&prop=revisions&rawcontinue=1&rvprop=content%7Cids%7Ctimestamp%7Cuser%7Cuserid%7Csize%7Csha1%7Ccontentmodel%7Ccomment&revids=71604
[20:19:38] James_F|Away, I think I found it. parsoid is calling back to the wiki, and getting blocked by a firewall rule. it's getting html instead of json on some request.
[20:28:45] yep. certainly the case. parsoid is calling back to the wiki, and getting blocked. I'll chase it down with our firewall group. thx for the help all!
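(For anyone debugging something similar: a small, hypothetical diagnostic in the same Ruby style as the earlier sketches — replay the api.php revisions query TimRiker pasted above from the Parsoid host and check whether the body actually parses as JSON. A firewall or proxy substituting an HTML block page shows up immediately. The revid 71604 is specific to that wiki.)

```ruby
# Hypothetical diagnostic: replay the revisions query above and verify the
# response is JSON rather than an HTML page injected by a proxy/firewall.
require 'net/http'
require 'json'
require 'uri'

uri = URI('http://tech.lds.org/wiki/api.php')
uri.query = URI.encode_www_form(
  format: 'json', action: 'query', prop: 'revisions', rawcontinue: 1,
  rvprop: 'content|ids|timestamp|user|userid|size|sha1|contentmodel|comment',
  revids: 71604
)

res = Net::HTTP.get_response(uri)
puts "HTTP #{res.code}, Content-Type: #{res['Content-Type']}"
begin
  JSON.parse(res.body)
  puts 'Body parses as JSON -- the callback path looks clean.'
rescue JSON::ParserError
  puts 'Body is NOT JSON -- something on the path is rewriting the response:'
  puts res.body[0, 200]
end
```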
[20:31:03] Hah nice
[20:31:04] Glad you found it TimRiker
[20:31:49] ya! me too. it was tough to track down. on the phone with the firewall team now, we'll see how long this takes....
[20:49:59] Hi! I have a wiki page with the name One Two Three. I link to it using [[One Two Three]]. How do I add the alt text, so that the text looks like 1 2 3?
[20:54:46] new_student: [[One Two Three|1 2 3]]
[20:56:13] new_student: The format is [[Article name|Displayed text]] - does the above work for you? :)
[20:56:24] Yes, just read the wiki page. Worked. Thanks :)
[20:56:29] Awesome :)
[21:00:09] hello
[21:00:34] I am applying for gsoc 2016
[21:01:01] i am interested in the idea: Implement HTML e-mail support in MediaWiki
[21:01:30] i need guidance from you all.
[21:05:30] ?
[21:05:43] i made a review for https://gerrit.wikimedia.org/r/#/c/274773/ ; now it has failed, so should I submit another patch?
[21:10:38] dev_: You may have better luck getting guidance if you ask a specific question
[21:14:19] i want guidance on how to move forward, as i have selected the idea.
[21:14:38] now i want to know what are the things i need to do next?
[21:15:52] Generally speaking, try to fix a small bug
[21:16:05] contact people who have said they're interested in mentoring that idea
[21:17:09] hey ho, anyone have an idea what's broken with the translate extension at https://commons.wikimedia.org/wiki/Template:Rename/i18n ?
[21:17:21] Translate this page; This page has changes since it was last >>marked for translation<<.
[21:17:43] Page translation: Please confirm the action. >> Confirm <<
[21:17:53] Empty page.
[21:18:08] rillke: known bug https://phabricator.wikimedia.org/T128638
[21:18:21] thanks :)
[21:18:27] may i know where i can find bugs?
[21:19:08] dev_: Hi! :)
[21:19:17] rillke: there are no SWAT deployments on Friday, and I am not going to stay up at 3am (https://wikitech.wikimedia.org/wiki/Deployments), so the fix is delayed until next week
[21:19:52] dev_: I noticed you're interested in Wikimedia's GSoC - perhaps you would like to join #wikimedia-dev?
[21:20:51] Both #wikimedia-dev and #mediawiki are really equally on topic for gsoc
[21:21:20] yes sir, i am interested. should i join that channel?
[21:21:21] although #wikimedia-dev does expose people to the bots, which is probably good for gsoc students
[21:21:57] dev_: bawolff raises a good point, probably a good idea to join both here and #wikimedia-dev
[21:22:03] -__-
[21:22:32] bawolff: good point though :)
[21:29:13] TimRiker: Ooooh. Odd.
[21:29:18] TimRiker: Does that mean it now works?
[21:38:44] James_F, I danced around the firewall by directing api.php calls through https and that seems to have fixed it.
[21:39:43] TimRiker: Nice! I'm glad you got it working, but also surprised. Another thing for me to file away as advice.
[21:40:11] thx!
[22:56:17] Hello, I have a question. I tried googling it but can't seem to get a solid answer. Is there a way to make an "iframe" or embed another page into a MediaWiki page? Thanks in advance! :)