[02:18:42] what could have been overlooked with subpage breadcrumbs not displaying on a custom skin, when they show up fine in monobook/vector/etc?
[02:43:23] Are the breadcrumbs in the HTML?
[02:43:31] If so, it's a CSS issue.
[02:43:42] If not, it's a PHP issue.
[02:52:16] i assume you mean span class subpages, which is not in the HTML for the custom skin
[03:24:00] You know what they say about people who assume,
[03:32:35] Yvette: what do you suggest
[03:55:42] Fix the custom skin PHP?
[03:55:44] c: ^
[04:05:25] Yvette: i'm trying to compare skins and figure out what's missing between the two
[04:09:19] Sounds riveting.
[04:25:06] Yvette: https://i.imgur.com/DHM6teM.jpg
[08:14:35] Hello, I came a few days ago with a problem that didn't find a solution, so I'm trying again
[08:16:03] On a wiki with 1.28.2 installed, I have erratic behaviour in my RecentChanges: it still shows lines for deleted/oversighted entries (and it shouldn't)
[08:18:08] I don't know if this is related, but in DebugMode I get plenty of:
[08:18:09] [GlobalTitleFail] MessageCache::parse called by SpecialRecentChanges->getExtraOptions/ChangeTags::buildTagFilterSelector/Message->parse/Message->toString/Message->parseText/MessageCache->parse with no title set.
[08:18:10] [GlobalTitleFail] MessageCache::parse called by ChangeTags::formatSummaryRow/ChangeTags::tagDescription/Message->parse/Message->toString/Message->parseText/MessageCache->parse with no title set.
[08:29:42] Linedwell: is your JobQueue running properly?
[08:31:31] there are no jobs in the queue
[08:31:46] How do I know if it "works properly"?
[08:31:57] (showJobs.php returns 0)
[08:32:44] and even if I run a manual execution of the queue, the result is the same, across my whole farm
[08:37:06] p858snake, fyi the problem wasn't there until we updated from 1.26 to 1.28
[10:38:07] Where is the source code of the Minerva CSS located?
[10:46:21] acagastya: https://phabricator.wikimedia.org/diffusion/SMIN/
[10:46:40] (if that's what you mean?)
[10:54:21] andre__: I can't log in to gerrit using my wikitech login credentials. What should I do?
[10:54:43] acagastya: what's the error?
[10:55:21] It says invalid username or password.
[10:57:10] acagastya: Can you successfully log in on https://wikitech.wikimedia.org ?
[10:57:23] Yes, I can.
[10:57:43] acagastya, then I am not sure what the problem is :(
[10:58:00] plus #wikimedia-tech might be better suited for this (not an issue with mediawiki itself)
[10:58:24] Ah, it is case sensitive, the username field.
[10:59:01] This is the problem: my username is supposed to be in lowercase, but the software won't allow it!
[11:00:28] i thought RainbowSprinkles fixed that…
[11:00:33] file a task about that
[11:01:21] https://gerrit.wikimedia.org/r/#/c/326150/ should have fixed that
[11:05:11] p858snake, do you have any idea regarding my issue above, please?
[11:05:34] (RC's showing obsolete lines)
[11:28:24] Hi
[11:28:58] hi Wisam
[11:29:10] I just want to ask if there is any method to use the wikipedia recent changes API for a specific page title?
[11:29:31] how do you mean?
[11:30:23] The recent changes API returns all recent changes for all pages or for a specific category
[11:30:53] I need to get recent changes per page ID or per page title
[11:31:20] so you mean the history of a page?
[11:31:37] sort of
[11:31:41] I mean this
[11:31:43] https://www.mediawiki.org/wiki/API:RecentChanges
[11:32:10] but as mentioned in the link: rctitles (Restrict results to these page titles) 1.14-1.15 (Removed in 1.15)
[11:32:55] https://www.mediawiki.org/wiki/API:Query#Specifying_pages ?
[11:33:56] I've tried this and it does not work, since titles isn't a parameter for the recent changes API
[11:34:35] https://en.wikipedia.org/w/api.php?action=query&list=recentchanges&format=xml&titles=Albert%20Einstein&rcprop=user|timestamp|title&rcshow=anon&rclimit=max&rctype=external
[11:35:31] https://en.wikipedia.org/w/api.php?action=query&list=recentchanges&format=xml&titles=Albert%20Einstein&rcshow=anon&rclimit=max&rctype=external
[11:43:45] Any help?
[11:47:05] please
[11:53:42] Hi, I need to use the API for action=feedrecentchanges; the resulting page should show the page that changed, but without the DIFF. Is this possible?
[11:54:18] Any help regarding the recent changes API per page ID or per page title
[11:54:20] ??
[12:09:53] inviso: Seemingly not
[12:09:54] https://en.wikipedia.org/w/api.php?action=help&modules=feedrecentchanges
[12:10:02] You might want to add a feature request for a property to disable the diff
[12:10:08] Should be easy enough to add
[12:11:37] Wisam: Why do you need recent changes?
[12:11:44] Just get a list of revisions for that page title/id
[12:11:54] It's effectively the same thing
[12:12:32] I did the revisions part
[12:13:33] But I need recent changes to get info such as bot, anonymous, redirect, and patrolled changes
[12:14:26] Hmm
[12:14:36] Well, anonymous you can work out
[12:14:41] Is it an IP address that made a change
[12:14:44] yes
[12:14:47] Bot, you can look up the user properties
[12:14:50] redirect... Look at the page text?
[12:15:14] or the page info, and see if it tells you if it's a redirect
[12:15:25] Anyway, you can't do what you want to do currently. You'd have to file a request for it to be added
[12:15:37] Either a filter on the recent changes (I don't know if it'll need more database indexes)
[12:15:51] but it would be easier if there were a method to get all these attributes per title, if that's possible
[12:16:08] Or, for this sort of information to be on the revision list, yeah
[12:16:45] Is there any other way to show only pages that have changed? New pages does not work since it does not show newly changed pages, only old stuff.
[12:17:23] Nope
[12:17:29] Not currently
[12:18:29] As I read, it was possible in 1.14 (Removed in 1.15)
[12:18:51] Oh? Link?
[12:18:56] I'm not a programmer nor a power user, so I can't request anything, but thanks anyway.
[12:19:02] It may have been removed for performance reasons
[12:19:06] inviso: You don't need to be
[12:19:28] You just need to put a request, in English, of what you want on our bug tracker, and if it's feasible, it should get done
[12:19:37] or if it's not feasible, someone should tell you why
[12:20:35] I'm on LTS, it will take years until that is resolved
[12:21:12] well, it won't take years for it to be resolved, it might take years for it to be in your version, sure
[13:38:22] You can certainly add an extension to your LTS. Extensions can do just about anything, just a SMOP.
[15:14:07] Anyone here tried converting MediaWiki markup to HTML using pandoc?
[15:20:19] pandoc seems to convert ` ` (whitespace) at the start of lines to Â (a-circumflex), why?
[15:20:46] character set / encoding issue maybe?
[15:24:53] Hey everyone
[15:24:57] saper: Are you here today sir?
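
Since rctitles is gone from list=recentchanges, the workaround Yvette outlines above is to list revisions for the page and derive the per-edit attributes from them. Below is a minimal sketch of that approach, assuming the Python requests library; the page title is only an example, anonymity is inferred by checking whether the username is an IP address (as suggested above), and a bot check would need an additional list=users&usprop=groups lookup that is not shown here.

```python
import ipaddress
import requests

API = "https://en.wikipedia.org/w/api.php"

def is_anonymous(username):
    # Heuristic from the discussion above: treat the edit as anonymous
    # if the recorded user name is an IP address.
    try:
        ipaddress.ip_address(username)
        return True
    except ValueError:
        return False

def page_revisions(title, limit=50):
    # Fetch revision metadata (no page text) for a single page title.
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|size|comment",
        "rvlimit": limit,
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

for rev in page_revisions("Albert Einstein"):
    kind = "anon" if is_anonymous(rev.get("user", "")) else "registered"
    print(rev["timestamp"], rev.get("user"), rev["size"], kind)
```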
[15:29:34] andre__: thanks, I checked w3schools; it says Â is 194 and whitespace is 32
[15:29:48] andre__: not sure how they get mixed up
[15:41:11] so I imported a wiki with many articles that *look like* they have a namespace (based on their title), however, they don't appear when using their namespace on Special:PrefixIndex. Does anyone know how to fix this?
[15:42:35] calnation: this happens if the namespace does not exist on your wiki before importing the contents
[15:42:38] !namespace
[15:42:38] See http://www.mediawiki.org/wiki/Help:Namespaces for user help and documentation, and http://www.mediawiki.org/wiki/Manual:Namespace for administration. For adding namespaces, see !extranamespace
[15:43:43] calnation: if you want them to become real namespaces, you should configure them in LocalSettings, and run https://www.mediawiki.org/wiki/Manual:NamespaceDupes.php
[15:47:42] Do you know if the namespace "Draft" has to be defined or not? Documentation on it is sparse and confusing
[17:20:53] hello
[17:25:18] hi eceryone
[17:25:29] *everyone
[17:26:42] Is it possible to get a dump of all the pages having a specified template using the API?
[17:58:16] still trying to figure out what's missing from the skin files for a custom skin that is causing the breadcrumbs for subpages to not display; i've been comparing the files between vector/monobook and our custom skin and I'm not seeing any relevant references to subpages, navigation, or breadcrumbs
[18:12:55] hi c
[18:12:56] Hi
[18:13:29] c: have you made sure the namespace that you're testing on has $wgNamespacesWithSubpages set to true?
[18:13:43] It seems that the extracts API does not work for some wikipedia articles
[18:13:50] as below
[18:13:50] https://en.wikipedia.org/w/api.php?action=query&prop=extracts&explaintext&format=xml&titles=Albert%20Einstein
[18:14:05] legoktm: well yes, considering they work properly in out-of-the-box skins like monobook and vector :)
[18:14:47] c: the relevant part that outputs subpage links is: $this->html( 'subtitle' )
[18:15:14] ^
[18:15:43] and that is set in SkinTemplate::prepareQuickTemplate() / Skin::subPageSubtitle()
[18:15:44] Any help?
[18:16:32] Wisam: hmm, it shouldn't do that, let me see
[18:16:50] Wisam: https://phabricator.wikimedia.org/T165161
[18:16:53] Appreciated
[18:20:58] Is this issue going to be fixed soon?
[18:21:59] we're working on it, yes
[18:22:39] Thanks a lot :)
[18:31:03] Hello, is there a known issue with the Query prop=extracts API? We are seeing some queries return a blank extract when they used to return plaintext content. Ex https://en.wikipedia.org/w/api.php?action=query&format=json&redirects=&continue=&prop=extracts%7Cpageimages%7Cpageprops&pithumbsize=300&exlimit=1&exintro=&explaintext=&exsectionformat=plain&titles=Barack+Obama
[18:31:24] yes, being fixed right now
[18:31:43] Thank you, is there any kind of existing ticket to track this?
[18:33:50] https://phabricator.wikimedia.org/T165161
[18:44:33] that's the ticket
[20:42:39] Hi
[20:43:27] Wisam: hello, how may we help you?
[20:43:38] I am making a lot of requests to wikipedia and I am getting the following error
[20:43:40] ConnectionError: HTTPConnectionPool(host='en.wikipedia.org', port=80): Max retries exceeded with url
[20:43:52] what can I do
[20:43:59] Wisam: is this with an api request or from a browser?
[20:44:18] api request using python
[20:44:45] Have you tried limiting request amounts and time in between them?
[20:46:23] No, I don't know how, since I am making revisions API requests plus user requests for the users who made those revisions
[20:51:59] Any help?
[20:54:20] Wisam: I'm not too familiar with python... sorry, all I can say is to slow down the rate of API requests
[20:56:24] I think that you have just helped me
[20:56:57] I'll use the sleep function after a certain of api requests
[20:57:15] It may solve the problem, right?
[20:57:41] *certain amount of api requests
[20:57:52] Wisam: Possibly, try it and let's see
[20:58:50] Thanks :)
[20:58:58] did it work, Wisam?
[20:59:32] I am waiting for the result, since the error appears after several minutes
[20:59:46] Wisam: okay
[21:00:01] You are the best support team ever
[21:00:09] thanks a lot
[21:00:21] king_nero: yes I am
[21:00:40] Wisam: Well, I'm just a volunteer, but thank you for coming to ask questions, feel free to visit and ask any questions anytime
[21:00:42] :P
[21:00:47] king_nero: regarding endorsements feel free to edit https://www.mediawiki.org/wiki/User_talk:Saper :)
[21:01:10] saper: could you not please
[21:02:47] Wisam: did you try using the pywikibot framework?
[21:03:14] I am using the requests library
[21:03:43] !pywikibot | Wisam
[21:03:44] Wisam: pywikibot is a bot framework written in Python that allows for easier automated actions on a wiki. For more information, see . For alternatives, see . We're in #pywikibot if you want to chat :)
[21:04:10] Oh, thanks a lot
[21:05:02] does it work on the Arabic Wikipedia as well?
[21:05:38] of course
[21:05:52] it does login for you and lots of other goodies
[21:06:24] you can list pages from categories and you can even edit manually in the bot - it can display pages for you and ask you to confirm the changes manually
[21:07:26] I am working on a project to assess the quality of Wikipedia articles based on their attributes
[21:07:51] so I do have to request these attributes per page
[21:08:06] using the API of course
[21:08:30] what do you mean by attributes?
[21:09:07] mmmm
[21:09:21] total revisions
[21:09:32] average revision size
[21:09:34] pywikibot can also work on dumps downloaded from https://dumps.wikimedia.org/arwiki/20170501/ for example
[21:09:36] page views
[21:09:40] so offline work is possible also
[21:09:42] article age
[21:10:21] do you mean downloading the whole database?
[21:10:24] ah, I understand
[21:10:44] you can download a wiki - only metadata, or article text, or even history if you have space
[21:10:54] I have tried, but the dumps are huge
[21:11:23] 9.3 GB compressed (no need to decompress it)
[21:12:22] you can get something without article text, those are way smaller
[21:12:38] These files contain no page text, only revision metadata.
[21:12:39] arwiki-20170501-stub-meta-history.xml.gz 1.4 GB
[21:12:46] It was easier to use the APIs than to use database servers with a database of that size
[21:13:17] sure, it depends how you do it
[21:13:38] I am almost done, but I had the problem of connection failure
[21:13:42] I use dumps sometimes just to search categories / titles / other metadata and then I go and fetch the live articles from the wiki
[21:14:32] http://stackoverflow.com/a/24050000 says it is a confusing error message
[21:15:02] I have also noticed a problem today with the extracts API, since it does not return values for some articles
[21:17:46] saper: not to mention sometimes dumps can be semi-out-of-date
[21:18:09] Yes, you are right
[21:18:33] Zppix: sure they can be, that's the nature of offline. but that one is 12 days old.
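
A minimal sketch of the pywikibot workflow saper describes above (listing pages from a category and pulling revision metadata for each); the language code and the category name are placeholders, and this assumes pywikibot has already been configured with a user-config.py.

```python
import pywikibot

# Placeholder site and category; adjust to the wiki and category you actually need.
site = pywikibot.Site("ar", "wikipedia")
category = pywikibot.Category(site, "Category:Physics")

for page in category.articles():
    # content=False keeps this to revision metadata only (timestamps, users, sizes).
    revisions = list(page.revisions(content=False))
    print(page.title(), "-", len(revisions), "revisions")
```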
[21:18:46] saper: APIs are as of that instant
[21:19:07] Wisam: pywikibot can let you focus on doing your thing fast, no need to worry about many issues like API throttling for example
[21:19:39] yeah but crawling a whole wiki for research can take a bit
[21:20:17] for research purposes dumps are very nice. you can say it was based on dump XXX and others can even try to reproduce the results, for example.
[21:24:03] Thanks a lot for your help
[21:24:15] The problem is solved
[21:24:28] Really appreciated :D
[21:29:00] Wisam: how did you solve it?
[21:35:15] I used the sleep function from the time library
[21:35:31] at the end of each loop iteration
[21:35:50] waiting for 5 seconds, then continuing
[21:41:31] saper: I am just wondering why, when I have the warningmessage installed and I go into edit mode, all the rest (sidebar and header) goes missing
[21:41:34] Do you have any idea?
[21:56:34] king_nero: no. let me check
[21:56:55] I'm running MW 1.26
[22:04:46] king_nero: works4me with MW 1.28.1 with the vector skin
[22:05:17] Want a screenshot?
[22:05:39] no
[22:05:53] I have tested it on 1.28.1
[22:07:00] king_nero: anything in the php log?
[22:07:06] which skin?
[22:07:11] nothing
[22:07:13] default vector
[22:07:30] javascript console?
[22:07:45] do you have php logging enabled to a file?
[22:08:20] nothing in the console giving errors
[22:08:31] sorry, no idea
[22:08:33] maybe I'll run wgDebug
[22:08:36] I'll look sir :)
[22:08:38] good idea
[22:09:30] sometimes it can be letter inserted before just looking, thank you
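
For reference, a minimal sketch of the throttling fix Wisam describes above (a sleep at the end of each loop iteration); the five-second delay is the value mentioned in the chat, the title list is a placeholder, and reusing a single requests.Session also helps avoid the connection-pool errors. A real script should additionally send a descriptive User-Agent.

```python
import time
import requests

API = "https://en.wikipedia.org/w/api.php"
session = requests.Session()  # reuse one connection pool instead of reconnecting per request

titles = ["Albert Einstein", "Barack Obama"]  # placeholder list of pages to process

for title in titles:
    params = {
        "action": "query",
        "format": "json",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user|size",
        "rvlimit": "max",
    }
    data = session.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    print(title, "->", len(page.get("revisions", [])), "revisions fetched")
    time.sleep(5)  # pause at the end of each iteration, as in the fix described above
```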