[02:37:43] is Katie here
[02:38:57] perhaps not. legoktm ?
[02:39:03] hello c
[02:39:18] legoktm: shouldn't FuzzyBot be flagged as such?
[02:40:00] I'm not sure
[02:40:14] it's making edits via translation admins
[02:40:27] automatically though, based on the contribs
[02:42:00] c: I found https://phabricator.wikimedia.org/T62000, looks like people there are split
[02:43:09] hrm
[02:43:39] But I think that bug is about having the extension flag it as a bot automatically, not per-wiki.
[02:44:12] Start a thread on current issues?
[02:44:34] I assume Nemo_bis and Katie will have the same opinions though :P
[02:46:06] that, and someone in another channel mentioned that the revision history of Shirayuki's talk page is in a state of perpetual loading
[02:47:29] looks like it...
[02:47:36] probably a bug in Flow
[02:47:45] * legoktm eyes quiddity
[02:47:46] (probably)
[02:48:09] > TypeError: mw.config.get(...) is null
[02:48:19] Exception in module-execute in module ext.visualEditor.desktopArticleTarget.init: load.php:177:688
[02:48:19] TypeError: title is null
[02:48:23] weirdddddd
[02:51:34] ohh, I see
[02:53:16] c: https://phabricator.wikimedia.org/T113833
[02:55:37] * legoktm runs away to dinner
[11:27:42] NickServ - what does that mean
[11:30:47] Guest97991: you picked an already registered nickname, I guess (NickServ only talked to you)
[11:32:15] I need some advice/help
[11:33:54] feel free to ask here
[11:35:58] Hello. If Guest97991 is too impersonal for you, feel free to use /nick (chase won't work, as it's already taken)
[11:36:30] a id
[11:37:47] I'm getting all my devices monitored and hacked. how is this possible
[11:39:24] We're a MediaWiki support channel, dedicated to answering questions about our software. I'm afraid general security hygiene isn't in scope for this channel, and we're not the most suitable place to assist you in the pursuit of such matters.
[12:09:24] i am getting Exception encountered, of type "BadMethodCallException" when trying to access my watchlist on mediawiki.org
[12:09:40] anyone else having this issue?
[12:09:46] * Reedy looks
[12:09:52] Lydia_WMDE: is your watchlist large or anything?
[12:10:00] not really, no
[12:10:06] less than 100 entries i'd guess
[12:10:23] and definitely smaller than on other wikis
[12:11:12] let me see if I can find it in logstash
[12:12:25] Lydia_WMDE: https://phabricator.wikimedia.org/T113418
[12:12:43] Reedy: thx
[12:26:47] Lydia_WMDE: blame matthiasmullie I think :D
[15:27:46] can I ask beginner level questions here?
[15:28:54] depends on what you want to ask
[15:30:09] Well, I just started learning about APIs yesterday and I wanted to use the MediaWiki API. But I cannot understand anything at all. It's frustrating.
[15:30:26] I found the embed.ly and Twitter APIs to be very easy
[15:31:09] All I want is to get all the links from a Wikipedia page and I can't understand how to query that.
[15:31:16] And the sandbox is not helping at all.
[15:31:34] Is there something I should have read before attempting this?
[15:31:59] godofhavoc: look at https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1 and just search for something that sounds related.
[15:32:13] godofhavoc: every API module has short documentation there and a few examples
[15:32:53] godofhavoc: for example, what you want is https://en.wikipedia.org/w/api.php?action=help&recursivesubmodules=1#query+links
[15:33:37] yes, the expanded API help page is useful for searching for specific concepts
[15:37:43] MatmaRex: Thanks dude. This is much more helpful than the documentation. Still, I think I will spend a long time figuring this out.
[16:16:48] Hi! I want to download a Wikidata archive and use some tools like http://wdq.wmflabs.org/api?q=around[625,$user_latitude,$user_longitude,$range], and some other requests, locally. Is there an easy way to do that
[16:16:52] ?
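For reference, the "get all the links from a Wikipedia page" question above maps to the `prop=links` module that MatmaRex points to. A minimal Python sketch that only builds the request URL (the page title and the use of the English Wikipedia endpoint are illustrative choices, not anything from the chat):

```python
from urllib.parse import urlencode

# Parameters for the Action API: list the links present on one page.
# prop=links is paginated; pllimit=max asks for as many per request as
# the server allows (500 for anonymous clients).
params = {
    "action": "query",
    "prop": "links",
    "titles": "Albert Einstein",  # illustrative page title
    "pllimit": "max",
    "format": "json",
}

url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

Fetching that URL returns JSON whose `query.pages[...].links` array holds the link targets; follow-up requests are needed when the response carries a `continue` block.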
[16:20:15] I'm sorry but I have to ask one more dumb question. How do you know the number of pages that link to a given page?
[16:21:10] Count them? :D
[16:21:27] Special:WhatLinksHere, or via the API
[16:21:47] The API won't give you the count, so you do the count client-side
[16:22:28] Reedy: Do you mean I have to use continue= to list them and keep counting like that?
[16:22:48] Yup, if you want to count them all
[16:22:55] limit=max
[16:23:49] Reedy: Isn't that grossly inefficient? Moreover, don't they limit it to 500 max?
[16:24:07] It depends on the query
[16:25:14] Reedy: I just need to rank all the links in a page according to their 'popularity'.
[16:25:56] https://phabricator.wikimedia.org/T19993
[16:26:10] I wanted a count ages ago, see above
[16:46:19] hmm, having trouble decoding and re-encoding page titles. https://commons.wikimedia.org/wiki/File:1658_1700_Ara_biae_Schenk_%26_Valk_Janssonius.JPG is the page, but https://commons.wikimedia.org/w/api.php?action=query&prop=info&format=json&titles=File:1658_1700_Ara_biae_Schenk_&_Valk_Janssonius.JPG trips up. Using Ruby URI encode/decode
[16:47:52] also https://commons.wikimedia.org/w/api.php?action=query&prop=info&format=json&titles=File:1658_1700_Ara_biae_Schenk_%26_Valk_Janssonius.JPG trips up with invalid characters :)
[16:48:43] so it's about & in page titles, I should be able to search around for this
[16:49:48] : -> %3a
[16:50:12] hmm
[16:53:48] trying to get it working in the sandbox first :)
[16:58:02] so, if I use the page title from the wiki page with the space it works (e.g. File%3A1658%201700%20Ara%) but not if I copy the title from the URL
[18:25:22] how do I create a link to a page containing all of the articles for a namespace, much in the same way categories are used?
[18:26:15] simplequestionyo: [[Special:AllPages]]?
[18:26:26] no, a specific namespace
[18:28:06] {{fullurl:Special:AllPages|namespace=10}} ?
[18:28:25] PrefixIndex/Namespace?
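Counting backlinks client-side, as Reedy describes, means walking `list=backlinks` and following the API's continuation protocol until no `continue` block is returned. A hedged Python sketch of that loop — the module and parameter names (`bltitle`, `bllimit`) are the real ones, but the fetcher is injected so nothing here depends on a live network, and the fake two-page response below only stands in for api.php:

```python
def count_backlinks(fetch_json, title):
    """Count pages linking to `title` by walking list=backlinks with
    API continuation. `fetch_json` is any callable taking a params dict
    and returning the decoded JSON response (injected for testability;
    in real use it would call api.php over HTTP)."""
    params = {
        "action": "query",
        "list": "backlinks",
        "bltitle": title,
        "bllimit": "max",
        "format": "json",
    }
    total = 0
    while True:
        data = fetch_json(params)
        total += len(data["query"]["backlinks"])
        if "continue" not in data:
            return total
        # Carry blcontinue etc. into the next request, per the protocol.
        params.update(data["continue"])

# Fake two-response sequence standing in for the server.
pages = iter([
    {"query": {"backlinks": [{"title": "A"}, {"title": "B"}]},
     "continue": {"blcontinue": "0|x", "continue": "-||"}},
    {"query": {"backlinks": [{"title": "C"}]}},
])
total = count_backlinks(lambda params: next(pages), "Main Page")
print(total)  # 3
```

As noted in the chat, each response holds at most `bllimit` entries (500 anonymously), so a heavily linked page costs many round trips; T19993 is the long-standing request for a server-side count.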
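The `&`-in-title trouble above comes down to percent-encoding: inside a query string, a literal `&` is read as a parameter separator, so it must become `%26`. The chat used Ruby; this is just the same idea sketched in Python:

```python
from urllib.parse import urlencode, quote

title = "File:1658 1700 Ara biae Schenk & Valk Janssonius.JPG"

# Query-string form: urlencode turns "&" into %26 so it is not taken as
# a parameter separator (":" also becomes %3A, which is harmless).
qs = urlencode({"action": "query", "prop": "info",
                "format": "json", "titles": title})
print(qs)

# /wiki/ path form: MediaWiki uses underscores for spaces, ":" may stay
# literal, but "&" still needs escaping.
path = quote(title.replace(" ", "_"), safe=":")
print(path)
```

The `path` result matches the working Commons URL quoted in the chat; pasting the already-decoded title straight into a query string is what made the first API call trip up.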
(Doesn't work for NS_MAIN)
[18:34:39] thanks
[18:34:54] i would have preferred an internal reference but this works
[18:34:59] it's an external link
[18:35:19] Class plainlinks :p
[18:36:34] simplequestionyo: you can include it on a page like a template: {{Special:AllPages|namespace=10}}
[18:37:16] pagination won't work, though, if you have more pages than what is displayed
[18:38:01] or use any dynamic page list extension that can generate a list with better formatting options
[18:41:42] thanks Vulpix
[19:32:21] Hi. What wiki is, in your opinion, the best-looking wiki on the Internet?
[19:32:57] Except Wikipedia nad WikiHow.
[19:34:53] and*
[20:27:24] elektryk: presumably the "Beautiful wiki" https://wikiapiary.com/wiki/Beautiful
[20:32:36] Nemo_bis: Pity that it's only beautiful by name ;)
[22:04:44] having some trouble getting the EmailPage extension working. links do not appear in the toolbox or as a page action despite having configured wgEmailPageToolboxLink and wgEmailPageActionLink. running MW 1.25.1 and confirmed the extension is loaded by checking Special:Version
[22:06:47] can any of you gurus point me to something I've perhaps missed?
[22:38:55] if it's an older extension, it might just not be compatible with newer versions of MediaWiki
[23:58:05] Does anybody know what the Parsoid API URL is?
[23:58:26] http://parsoid-lb.eqiad.wikimedia.org/ doesn't seem legitimate
[23:59:36] Why not?
[23:59:52] Reedy: dunno, looked a bit like a test server
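For completeness, the API counterpart of the Special:AllPages trick discussed above is `list=allpages` with `apnamespace`, which returns the page list as data rather than HTML. A sketch that only builds the query URL (the mediawiki.org endpoint is an illustrative choice; namespace 10 is Template on a stock install):

```python
from urllib.parse import urlencode

# list=allpages enumerates every page in one namespace, the same data
# Special:AllPages renders; apnamespace selects the namespace.
params = {
    "action": "query",
    "list": "allpages",
    "apnamespace": "10",
    "aplimit": "max",
    "format": "json",
}
url = "https://www.mediawiki.org/w/api.php?" + urlencode(params)
print(url)
```

Like the other list modules, this paginates via `continue`, so it sidesteps the "pagination won't work" caveat of transcluding the special page.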