[08:05:50] hi! I maintain a sourceforge project. I recently had to install mediawiki myself. [08:06:12] now I have the problem that people spam it. [08:06:36] I want to prevent people from editing pages or creating accounts etc. [08:07:22] My old account where I had admin rights doesn't work either. So I can't admin it. [08:08:06] I would really like to back out all recent edits including new accounts etc. And then make it completely readonly. [08:11:33] is there some way that I can back up the pages in a non-db format? so I can paste them in another wiki later? [08:11:53] this is the wiki : http://cppcheck.sourceforge.net/wiki/ [08:52:44] liangent: hi [08:52:45] around? [08:54:52] greetings, is anybody familiar with the pdfbook mediawiki extension? [08:55:23] no [08:55:29] what are you trying to do [08:56:11] i added it to a wiki i've set up and tried to export categories with it to a pdf file. However, the output is a 1kb file with nothing in it [08:57:02] i've checked the extension talk and some forums but that problem was only mentioned during 2007 to 2008. The code seems optimized now and it should not react that way [08:59:57] aharoni: hi [09:00:05] hi! [09:00:09] hello [09:00:24] liangent: maybe you'll be able to help me with zh variants a bit. [09:00:55] I mentored a GSoC project for creating auto-translated screenshots for the VisualEditor user guide. [09:01:08] So we have this in Finnish, for example: https://www.mediawiki.org/w/index.php?title=Help:VisualEditor/User_guide/fi [09:01:27] You can notice that most screenshots have Finnish text. [09:01:39] For Chinese it's a problem, because of the variants. [09:01:46] So we have https://www.mediawiki.org/w/index.php?title=Help:VisualEditor/User_guide/zh , which is a translated page. [09:02:19] hmm but every variant needs to have its own images [09:02:20] but the screenshots are based on the translations in the VE source, and they are separated into zh-hans and zh-hant. [09:02:29] So yeah - we have that already. [09:02:35] let me find you a link... [09:03:12] -{zh-hans:[[File:VisualEditor_toolbar-zh-hans.png]];zh-hant:[[File:VisualEditor_toolbar-zh-hant.png]];}- should work [09:03:25] liangent: https://commons.wikimedia.org/wiki/Category:VisualEditor-zh-hans [09:03:31] https://commons.wikimedia.org/wiki/Category:VisualEditor-zh-hant [09:03:38] but I'm not sure whether it's handy for you to use a different kind of markup for some languages [09:03:45] the screenshots themselves should be OK, but please check. [09:04:40] Guest15683: why did you choose the pdfbook? we usually use Collection [09:05:02] if I'm at https://www.mediawiki.org/w/index.php?title=Help:VisualEditor/User_guide/zh, is it traditional or simplified? [09:05:56] aharoni: it's decided by your user preferences, browser preferences etc [09:06:14] mmmmm... I suspected so :) [09:06:16] it's actually good. [09:06:17] qgil: hi [09:06:27] hi Svetlana [09:06:34] qgil: what happens to doc.wikimedia.org after migration? [09:06:55] liangent: if I'm at /zh, and my variant is automatically picked, will -{zh-hans:[[File:VisualEditor_toolbar-zh-hans.png]];zh-hant:[[File:VisualEditor_toolbar-zh-hant.png]];}- do the right thing? [09:07:17] Svetlana, which migration? There are no specific plans yet. [09:07:26] aharoni: yeah it will do [09:07:37] and btw those images are OK [09:07:50] liangent: thanks.
[09:07:51] qgil: I'm concerned that the documentation lacks examples and is not multilingual so I was wondering whether phabricator can support any of that [09:08:19] Now I need to adapt https://www.mediawiki.org/wiki/Template:LANGUAGESCREENSHOTURL to handling variants. [09:09:00] Svetlana, we are not using Phabricator for documentation, but we want to use MediaWiki better for developer documentation -- see https://www.mediawiki.org/wiki/Data_%26_Developer_Hub [09:09:38] aharoni: I guess you'll have to hard code a few language codes then [09:09:50] Svetlana, besides, Phabricator doesn't excel in multilinguism either [09:09:51] liangent: meh [09:10:10] language converter exposes few interfaces, incl in parser, api, lua etc [09:10:19] liangent: is there maybe a magic word that checks whether I'm in a context where variants are supported? [09:10:52] so that if I'm on a page in zh, I'd do -{}- in the template, and otherwise I'd use the current code. [09:10:55] qgil, why not dev.mediawiki.org then? [09:11:21] aharoni: I don't remember anything similar, but let me have a look [09:11:26] thanks [09:11:45] Svetlana, this is why I was asking "which migration". :) dev.wikimedia.org one day, yes, but we are not fully dedicated to that project yet. Phabricator first. [09:11:58] qgil, IMO that would really be www.mediawiki.org and meta.wikimedia.org for wikimedia-specific, extra site would make it more confusing [09:12:07] qgil, I'm not hurrying you, just sharing some thoughts. [09:12:44] Svetlana, I think one site for all is less confusing. If there are exceptions, then let's look at the exceptions. [09:12:44] qgil, notice that you said dev.wikimedia.org and I said dev.mediawiki.org, two different things (although I mentally rejected both atm). [09:12:58] liangent: maybe I can just poke at {{CURRENTCONTENTLANGUAGE}}, but then I'll have to hardcode zh and sr and kk and others. [09:13:41] Svetlana, dev.wikimedia.org is supposed to be just a redirect to mediawiki.org -- see https://phabricator.wikimedia.org/T372 [09:14:21] qgil, can I update the wiki page then? the "dev.wikimedia.org" notation is not needed then. [09:15:04] why not needed? [09:15:43] I'd rather recommend you to discuss in Phabricator tasks before changing the documentation on-wiki [09:15:53] aharoni: I can't find one [09:16:07] but even CURRENTCONTENTLANGUAGE is not documented on [[Help:Magic words]] [09:16:17] Svetlana, your input is welcome there. You are good at making us to think harder. :) [09:16:55] meh it's not a magic word :( [09:17:46] it's a fake magic word :) [09:17:47] qgil, I'm not sure where we're heading. With bugzilla and wiki, folks are drafting specs on-wiki and talking on-wiki, and only filing stuff to bugzilla to /track/ it. Phabricator is used for drafting specs and making decisions now. I may be conservative, but I don't see how this is a good idea. [09:18:57] Svetlana, Phabricator is a project management tool, MediaWiki is a documentation tool. [09:20:19] qgil, when I have a new idea, I can see it being developed first, and usually this happened on-wiki as it is a good collaboration platform. With Phabricator, the thinking on specs takes a shape of /discussions/, which are not a better structure. 
[09:20:35] discussing documentation on-wiki is good, discussing whole projects, I'm not sure -- or I'm certain that it's not the best tool for that, and we are bringing Phabricator just for that [09:22:00] Svetlana, dev.wikimedia.org is today a small project, and it has already 18 discussions (tasks) open: https://phabricator.wikimedia.org/maniphest/?statuses=needsinfo%2Copen&allProjects=PHID-PROJ-xvpoyy5qoomv3xbonon6#R [09:22:20] I know. I recall Phabricator has a wiki. I would encourage that you use it to collaborate. I can't /stand/ discussions as a source of information about projects scope or status. For years, I had been perfectly able to avoid reading discussions entirely and obtain information by reading its documentation (and only read discussions where I need to communicate ideas and make an impact). [09:22:24] tracking that on-wiki is less efficient, not to talk about defining dependencies and assigning tasks [09:23:02] I can see that, I just suggested how to change things. A large part of the web are forums and Q&A sites and wiki is more readable than any of that. [09:23:29] Svetlana, if you want to read, follow the wiki pages in mediawiki.org. If you want to discuss superficially, wiki discussion pages are good. If you want to get involved in the project, join the project in Phabricator. [09:24:00] I visited a Phabricator task about an IRC bot needed to announce phabricator changes. I couldn't figure out what the spec is (what is needed) or how complete it is, because it was all a dozen of screenfulls of a discussion. [09:25:31] Each task should have a clear spec and if the original initial bug report is not editable, like a wiki, then this is not possible to provide an up-to-date spec. [09:25:55] Svetlana, https://phabricator.wikimedia.org/T131 has a very clear title and a description that is as editable as a wiki, that should reflect the current state of the discussion. You don't need to read the comments if you don't want to. [09:26:56] I can't access editing facilities, it doesn't let me log in with SUL nor with Wikitech username. [09:28:16] Svetlana, the homepage says "NOTE: REGISTRATION IS DISABLED", linking to a task showing why and including a link to https://www.mediawiki.org/wiki/Phabricator#Access_to_phabricator.wikimedia.org [09:28:54] Ah, so that's temporary, okay. About the bot, I can't see it in the description that someone started working on it. Nobody has? [09:30:44] Svetlana, "Assigned To None", "Projects Bugzilla-Migration (Ready to Go)" link to https://phabricator.wikimedia.org/tag/bugzilla-migration/board/ [09:30:45] so no [09:32:59] ok. I'm not sure why one would take collaboration off-wiki, but such is life — not everything fits into wiki. [09:33:27] Thanks. And is there someone (in addition to you) with whom I could chat about the IRC bot if I have more questions later? [09:34:20] legoktm and YuviPanda are kind of looking at this task, but I don't know whether they are committed to do it [09:35:39] Svetlana, if you get a Phabricator account and subscribe to the task, you will be notified as soon as there is any movement [09:36:33] I had one at fab.wmlabs, but I don't know whether that means anything. And my SUL and fab.* and Wikitech usernames/credentials are different... [09:36:49] Thanks I will try to poke around after I do some more reading. [09:38:05] hi! [09:38:07] Svetlana, our fab accounts are gone forever. 
SUL, fab, wikitech etc can all be different, and still all be aggregated to your single account (where you can aggregate your email addresses) [09:38:35] actually, let me document this in https://www.mediawiki.org/wiki/Phabricator/Help#Signing_in [09:39:14] do you guys know if there is a way to show a list of all categories of a linked page next to its title? [09:39:59] something like: The Wheel - Important Inventions, Automotive, Round Things [09:53:34] hi. how can i return the list of links from "what links here" page? [10:19:01] aharoni: using it in that way doesn't work [10:19:12] -{ }- is parsed after [[File: ]] [10:19:23] liangent: thanks for noticing :) [10:20:11] liangent: I hate templates so much. [10:24:38] aharoni: and I guess, if zh-hans image doesn't exist while zh-hant does, preferably show zh-hant but not en for zh-hans users [10:24:52] I know it'll lead to complicated template code [10:29:10] liangent: this shouldn't be an issue, actually, because both zh-hans and zh-hant are created automatically [10:29:24] I still can't manage to get -{ to work though [10:29:30] liangent: see http://etherpad.wikimedia.org/p/zh-ve-screenshots [10:30:32] aharoni: okay [10:32:58] aharoni: done [10:37:25] liangent: thanks, it's getting there, [10:37:37] aharoni: I guess it would be better to write the whole thing in Lua [10:37:43] yes :) [10:37:59] there's an issue, however: [10:38:35] it shouldn't have "[[File:" [10:38:38] only "File:" [10:38:40] is that possible? [10:44:42] liangent: Grr. Looks like if I just remove the "[[", it gives plain-text "[[File:VisualEditor Toolbar Formatting-zh-hans.png|frameless|border|200px|center]]" and doesn't insert the image. [10:44:53] liangent: have to go now, back in about an hour [10:48:12] liangent: how can I get something like -{ }- in Lua? [10:49:13] aharoni: no it's not possible [10:49:25] both things are not possible... [10:49:48] to achieve the latter thing we just output some -{ }- in Lua [10:51:32] * aharoni bangs head [10:51:34] oh well. [10:51:37] I'll try later today. [10:51:42] Thanks a lot for the help. [10:55:01] gleki: I don't understand the question [10:55:26] nabax: yes, you can use the classic skin [10:55:41] or even cologne blue iirc [10:56:14] https://www.mediawiki.org/wiki/Skin:Standard , should still work [10:56:25] Perhaps there are others [11:13:29] Nemo_bis: gleki: Cologne Blue skin is still working and supported, you just need to install it separately [11:13:52] Nemo_bis: "Standard" and several others are defunct, although apparently Paladox showed interest in reviving them [11:22:23] MatmaRex: defunct as in exploding when installed? [11:22:59] Because if you know in what versions Standard doesn't work that would be useful to add on its page [11:23:14] Nemo_bis: no, it's not available for installation at all. [11:23:25] it was removed and is still gone [12:03:38] MatmaRex: what do you mean, it has no repository? [12:03:44] yes [12:03:54] well, one just has to check out a branch [12:04:16] a rather old branch of core [12:04:37] the skin from it won't work on master anymore, might work on 1.23 or 1.24 [12:04:58] (it won't work on master because we just removed backwards-compatibility code to support such skins a few days ago) [12:06:49] ok, edited the page [12:22:00] hi all, i'm looking to populate our mediawiki through the api, do you have any good examples or advice?
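A minimal sketch of the approach liangent describes above ("we just output some -{ }- in Lua"): instead of resolving the variant inside Lua, the module returns -{ }- wikitext, and the language converter picks the right [[File:...]] link when the #invoke output is parsed. The module layout, function name and file names below are illustrative assumptions, not the actual Module:Language_screenshot_filename code.

    -- Illustrative Scribunto module; names are assumptions, not the real module.
    local p = {}

    function p.variantImage( frame )
        -- frame.args[1] is the base file name, e.g. "VisualEditor_toolbar"
        local base = frame.args[1] or 'VisualEditor_toolbar'
        -- Emit the variant markup as wikitext; conversion happens after the
        -- module's output is parsed, not inside Lua.
        return '-{zh-hans:[[File:' .. base .. '-zh-hans.png|frameless|center]];'
            .. 'zh-hant:[[File:' .. base .. '-zh-hant.png|frameless|center]];}-'
    end

    return p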
[12:22:55] aharoni: by the way it's possible and relatively easy to introduce the variant factor earlier in the php parser, for example a parser function or magic word [12:23:35] aharoni: I tried something similar but the parsoid people rejected that, saying that it pollutes or fragments their cache [12:34:47] xinity: in what form is your text currently contained? [12:36:19] Yaron: yo mean my mediawiki text ? in a mysql database [12:38:06] xinity: oh, okay. Have you looked into any of the MediaWiki API wrappers? [12:39:19] I would recommend using one of them; you can see them all here: https://en.wikipedia.org/wiki/Wikipedia:Creating_a_bot#Programming_languages_and_libraries [12:40:45] Yaron: i'm collecting informations as for now, to understand how these apis works and how to use them in my own purpose [12:42:49] Hi sucheta [12:43:07] sumanah, hello hello! [12:57:02] sumanah: *waves* [12:57:08] Hi valhallasw`cloud! [12:58:17] valhallasw`cloud: how are you doing? [12:59:35] sumanah: pretty good -- lot's of thesis writing at the moment. We're going to do some work together with a research group at Syracuse, which I'm excited about [12:59:49] Oh neat! Does that mean you'll be in New York State more often? [13:00:33] For now, someone from Syracuse will come here for a month, but maybe I'll also go there at some point. [13:02:37] sumanah: how about you? Excited to go to Hacker School again? [13:02:59] yes! sorry am in the middle of a couple things, biab [13:26:57] liangent: "pollutes cache"... I hate that excuse :( [13:27:07] I never know what to do about it. [13:33:44] hello, is there a way to have mediawiki on a virtual machine locally and on a production server and then add new data to the local copy and push the changes to the production server? [13:54:24] hi. How to create user without using UserAdmin plugin? [14:00:22] fellipe: Special:UserLogin/signup ? [14:01:55] valhallasw`cloud: when I click on special pages, the page becomes blank. I am using brazilian portuguese language. I've installed the UserAdmin plugin, but, since I am unable to open special pages, I cannot use it. Any idea? [14:10:05] fellipe: you should look at why you can't open special pages [14:10:15] because that is a sign of bigger issues [14:10:25] hi. I'm thinking of retrieving fresh category members recursively with subcats for new page patrol at wikipedia. [14:10:43] is that easily doable, or if not yet, then is it better to do as a tool or as an extension? [14:11:09] p858snake|l: any idea from where do I start? [14:13:09] !blank | fellipe [14:13:09] fellipe: A blank page or HTTP 500 error usually indicates a fatal PHP error. For information on debugging (including viewing errors), see . [14:16:12] I just hope that people don't ignore the question above. I'm asking a third time this week. [14:17:13] Svetlana: just because people don't respond doesn't mean they *ignore* your question [14:17:13] Svetlana: a new extension would be hard to get on to Wikipedia, no? [14:17:46] if it scales, not very hard, I'd just start with small wikis. the question is, who would benefit from it being an extension? 
[14:18:24] * bmansurov is back (gone 00:01:34) [14:18:38] * bmansurov is away: I'm busy [14:18:58] I basically won't be in this channel after Sept 30 but I think http://www.harihareswara.net/sumana/2014/02/26/0 might be a useful thing to add to the bot's vocabulary re explaining to people (a) when to talk on irc vs wikitech-l and (b) that when we refer them to help docs, they should read them [14:18:59] an extension has a perhaps more integration with the wiki, but I'm not sure whether this is needed. [14:19:34] sumanah, please don't vanish! we will miss you [14:20:06] Svetlana: I'm glad I'll be missed. http://www.harihareswara.net/sumana/2014/09/12/0 is why I'm leaving. [14:20:23] I don't mean technically hard, I mean hard to get past all the layers of review required. Or have you previously been able to get extensions onto WMF sites? [14:21:58] https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment - the layers of review that Yaron mentions [14:23:20] sumanah: nice, good luck with your next things. I'd love you to stay here as much as you feel comfortable; you being familiar with this software and helping as a volunteer would be one of the most wanted and missed things. [14:23:45] Svetlana: in case you want to add your voice to the crowd, http://www.gossamer-threads.com/lists/engine?do=post_view_flat;post=506740;page=1;mh=-1;list=wiki;sb=post_latest_reply;so=ASC is a goodbye thread [14:23:51] Yaron: if there is some merit in it being an extension, I'll manage the reviews, I'm sure. [14:24:01] Svetlana: you should probably build upon RTRC or Huggle [14:24:03] Yaron: I'm just not sure whethere there is such merit or not. [14:24:14] Nemo_bis: I'd like it to be server-side and without javascript. [14:24:41] Then why did you ask about tools [14:24:43] I looked at RTRC: it is against all possible thoughts of mine on how to help our work scale in long-term. [14:24:54] because tools are server-side without javascript. [14:25:06] What tools? [14:25:12] tools at wmlabs. [14:25:29] rarely [14:25:32] I'm not using a gadget: it has no db access, and the api is slow. [14:25:44] Svetlana: You may also be interested in reading a speech I gave earlier this year: https://en.wikisource.org/wiki/Hospitality,_Jerks,_and_What_I_Learned [14:26:04] there's also an audio recording and a video recording [14:26:47] sumanah: I'm just interested in people not disappearing. writing code is a rewarding activity, but helping with the software for a year per each line of code written is even more adorable, and ensures that the software grows in the long-term. [14:27:09] Svetlana: If we want to retain people then we need to make an environment that helps retain them. [14:27:12] it took me quite some time to learn that maintaining and supporting software is the 99% activity of its life. [14:27:26] its? [14:27:36] That's why I suggest that you read my speech, which gives some suggestions for how we can do better at retaining people. [14:27:44] ah. [14:27:58] ok, I'm going to read it then, thanks. [14:29:03] Got error 'PHP message: PHP Fatal error: Call to undefined function wfLoadExtensionMessages() in /var/www/wiki/extensions/UserAdmin/SpecialUADMBase.class.php on line 59\n' [14:29:19] Nemo_bis: you also probably ought to read it. [14:29:24] I am using php-fpm with fcgi and I get this when accessing special pages [14:29:30] sumanah: ought to? [14:29:33] Yep. [14:29:37] Sounds strong [14:29:42] Yes. 
[14:29:52] Sounds unwarranted [14:30:33] Well, I suppose that depends on whether you have any particular opinion of my judgment in general. :) [14:30:34] fellipe: is it a new installation or an upgrade? [14:31:33] In the speech I offer some articulations of why we ought to behave in certain ways, and not behave in other ways, if we want to encourage and retain contributors, including technical contributors, and help each other learn. [14:31:36] Vulpix: trying to use UserAdmin plugin [14:31:40] Svetlana: making yet another patrolling tool would IMHO be silly, but if you don't want JavaScript in it I suppose you want a mere listing tool; there were several in the past, for instance https://tools.wmflabs.org/steinsplitter/upu.php [14:31:40] sumanah: I agree with your thoughts on not looking at edit count, and valuing hospitality. I'm trying to address that by making it possible to patrol new pages by topic, so that wikiprojects folks can patrol pages and do more content work than the current patrol does. [14:31:40] fellipe: apparently, that extension isn't maintained and is not compatible with recent versions of the software https://www.mediawiki.org/wiki/Extension:UserAdmin [14:31:43] http://www.mediawiki.org/wiki/Extension_talk:UserAdmin [14:31:59] Svetlana: And in the technical community? and how we treat each other on wikitech-l and here in this channel? [14:32:17] sumanah: current new page patrol (and any patrol) is done by people without matching them by topic. [14:33:21] sumanah: I don't know about here. I felt that the development and decision-making is too internal; they're trying to fix it by moving trello and mingle to phabricator. should be good. I'm not sure what else they have in mind. ... I also thought it's useful to develop wiki software in a way which empowers people with the ability to write wizards using wiki markup. [14:34:29] fellipe: some of the features provided by that extension are provided by other stable extensions [14:34:38] sumanah: why did you specifically tell Nemo_bis to read it? [14:34:41] Svetlana: I notice you're saying "they" and not "we" [14:35:04] sumanah: I couldn't do anything about it. I wanted to convince them to stay with bugzilla, but I couldn't. [14:35:33] Svetlana: and if you want something integrated in core then you'd "only" need to make Special:RecentChangesLinked for categories recursive and/or let people watchlist a category (for new additions) [14:35:51] Svetlana: you're part of this community as well, though, so I hope you will be able to say "we" about some things :) [14:35:53] Yaron: he was already in the conversation [14:35:55] sumanah: they made the decision by polling wmf employees about their common workflows, checking which software does them, opening an 'is it ok to move to fab?' discussion, got support, started moving. [14:36:12] Svetlana: well, there was an RfC that was open to all people [14:36:39] Yaron: I think you might like to read it as well! [14:36:45] sumanah: it was posed as a "is it ok to move to fab?", not as a "stay at bugzilla? or move to fab? or move to X? or move to Y?". this triggers herding instinct. [14:36:48] if you haven't already - I don't think you were at Wiki Conference USA [14:37:12] * Nemo_bis now feels ignored by Svetlana ;) despite the apparent urgency of the question answered [14:37:30] * sumanah can leave the conversation so you can fix the tech question :) [14:37:32] Svetlana: it was intentionally posed this way to reach a conclusion within our lifetimes [14:37:40] Nemo_bis: thanks, I will look into these pages.
[14:37:54] MatmaRex: I know, it's just an explanation why I used "they". [14:38:06] and Svetlana https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment might help you [14:38:16] eh [14:38:21] sumanah: stop linking that page, it's inappropriate for this case [14:38:24] ah ok [14:38:39] Nemo_bis: what, no "please"? sounds strong! sounds unwarranted! ;-) [14:38:53] heh [14:39:13] I was typing the please now ;) [14:39:38] courtesy as an afterthought, got it ;-) [14:39:41] But your repeating that suggestion seemed to ignore Yaron's argument [14:39:48] argument? [14:39:59] 16.17 < Yaron> Svetlana: a new extension would be hard to get on to Wikipedia, no? [14:40:24] you didn't address this point so I find it rude to insist on the extension thingy [14:40:27] oh I thought it was just an illustration of the situation [14:40:34] it's good for me to know how to write it, even if it's hard to deploy... I really wonder whether it's better inside of wiki or separate [14:40:43] I'm not saying "you should do this" I'm saying "if you choose to do this, then here are some things you will face" [14:40:46] a more theoretical question [14:41:05] perhaps I should have explicitly said that [14:41:29] well, when you link it twice it looks insistent [14:41:39] you and I definitely misread each other a lot, I tihnk [14:41:42] Svetlana: Inside, probably. Maybe a way to translate wikitext into GuidedTours things [14:42:56] sumanah: dunno; btw I read that wikisource page ages ago [14:42:57] marktraceur: how do you define 'guided tour'? [14:42:58] now afk [14:43:09] I feel like the trap there is that you'd wind up putting too much functionality into the language [14:43:19] !e GuidedTours | Svetlana [14:43:19] Svetlana: https://www.mediawiki.org/wiki/Extension:GuidedTours_ [14:43:26] wm-bot you douchebag [14:43:30] it's ok [14:43:33] Svetlana: https://www.mediawiki.org/wiki/Extension:GuidedTours [14:43:39] I was talking about fresh category members tool [14:44:14] https://www.mediawiki.org/wiki/Extension:GuidedTours doesn't exist either [14:44:21] https://www.mediawiki.org/wiki/Extension:GuidedTour exists [14:44:22] ah [14:45:05] I sort of wonder where to see that in action and I'll check it out, but it's unrelated to the original question I think [14:45:26] sumanah: I've learned the hard way that sometimes written text (in chat, forums, etc) is missing an important emotive expression system like the voice tone or expression of the face, that often cause unwanted misinterpretations [14:45:32] yup [14:45:37] absolutely [14:46:00] also, best of luck on your new project :) [14:46:03] and of course there's also people who just have a fundamentally different attitude towards what's ok and not ok in communication, collaboration, etc [14:47:01] different culture/language is also a barrier sometimes [14:47:19] yep [14:47:23] Different culture on IRC? Naw. :) [14:47:24] Thank you Vulpix [14:48:03] Svetlana: Basically I'm looking for an API that can already kind of do a "wizard" so we don't have to make one from scratch [14:50:22] If I change $wgServer to a different path, what else do I need to do to correct all the page links? [14:52:25] niqdanger: I don't know, what's broken for you? [14:52:43] niqdanger: if you used internal link syntax ([[link]]) they should be fixed by themselves, but pages may need to be purged so they get reparsed [14:53:59] I had originally set the DocumentRoot in apache to /var/www/html/mediawiki, but since i now have to host other sites. I changed it to just /var/www/html. 
I set $wgServer from "http://hostname" to "http://hostname/mediawiki" but the pages dont seem to reflect this change. [14:55:19] niqdanger: well, that's wrong. $wgServer should be the server name, without any path [14:55:51] niqdanger: you need to change $wgScriptPath - https://www.mediawiki.org/wiki/Manual:$wgScriptPath [14:56:16] and maybe $wgArticlePath if you've changed it [14:57:22] OH! Duh. Im a dingdong. Thanks :-) THat fixed it!! [14:58:01] :) [15:26:26] marktraceur, for what application? [15:28:32] bawolff: hi. [15:28:53] Hi Svetlana [15:29:42] qgil: almost ready [15:29:56] sumanah, ok! [15:30:07] bawolff: I think you could (probably) have an idea. if I'd like to write a tool for fresh category members retrieval recursively, would it better be a tool or on-wiki extension? I keep thinking how it'd benefit from being an extension, from being integrated into the wiki or possibly being transcluded, but I'm not seeing clear benefit. [15:30:45] extension will be much, much, much harder to get approved [15:31:24] I'd recommend starting with a tool labs tool, and if you get success with that/people using it, then perhaps looking at turning it into an extension [15:31:57] I just wouldn't write it in php there. this means that the 'turning into' process would be painful (rewriting in another language from scratch). [15:32:16] Svetlana: For wizards in MediaWiki [15:32:30] marktraceur: [[:n:Help:Wndialog]]? [15:35:42] bawolff, ok, I'll do that then. [15:37:21] What on earth is :n: [15:43:54] wikinews [15:45:15] Wikinews has a magic javascript wizard thing, where basically you add certain templates, which gives links with url parameters, if js detects the url parameters, it sends the page to the parser (via api), but with some template arguments replaced, thus giving magic dynamic wizards [15:49:18] Eugh [15:54:33] marktraceur: As an example, click the see older articles button on https://en.wikinews.org/wiki/Category:Science_and_technology [15:55:00] Ewwww [15:55:18] I want to fix that so badly [15:55:20] But I can't [16:05:28] fix what? [16:09:27] Svetlana: as a reminder, extension and tool are not the only options [16:10:02] what are the other ones? [16:10:34] if you mean doing it in core: I can see something like this doable, but I'm not sure that'd be approved of in recursive mode [16:11:21] it's also not a thing I can see myself doing (and I don't fancy getting anyone else to do it too) [16:12:00] It depends what you want to gain from it and what you'd end up doing otherwise [16:14:16] Svetlana: and if you want something integrated in core then you'd "only" need to make Special:RecentChangesLinked for categories recursive and/or let watchlist a category (for new additions) [16:14:39] Integration in the two places I mentioned is quite straightforward, if only you manage to make a performant query :) which you may need to do anyway for a labs tool [16:14:45] I don't see how any of these two are intuitive for users, I need to think of it [16:15:07] How can watchlist *not* be intuitive? 
O_o [16:15:10] I'm a bit stupid, I was thinking of making separate queries as I was expecting it to be infinite recursion [16:15:30] Infinite and CS don't get along well [16:15:32] which I can't see how to fit into a single query [16:15:56] It may well be impossible [16:16:03] from what I could gather even recursion of level 3 looks a bit ugly :) [16:16:05] But it will be equally impossibly for a tool [16:16:39] yeah, there probably are things like categorytree to steal code from [16:19:22] valhallasw`cloud: thanks. [16:19:55] * valhallasw`cloud looks confused [16:21:41] Svetlana: The UI is a bit...eh. [16:21:55] Also I hate that I know that. Too much design work lately. [16:23:17] infinite loops go against our coding conventions ;) [17:07:36] * aharoni is continually perplexed by Scribunto documentation. [17:07:53] aharoni: what specifically? [17:08:09] * ori is not a Scribunto wizard but can try to help. [17:08:12] שנה טובה :) [17:08:38] I'm trying to hack something with Chinese variants. [17:08:50] (An exotic territory for all of us.) [17:09:37] shana tova! [17:09:46] The wiki syntax "-{zh-hans:Simplified;zh-hant:Traditional;}-" is supposed to write "Simplified" and "Traditional" according to the user's preferred language. [17:10:18] Now, earlier I asked liangent about this and she said that there's no Lua function to get such a function, and I must call the same syntax from Lua. [17:10:36] That is, to get Lua to parse that wiki syntax. [17:10:56] So how do I get Lua to parse wiki syntax? [17:11:47] Reading the Scribunto reference, I'd think that it's `frame:preprocess( '-{zh-hans:Simplified;zh-hant:Traditional;}-' )`, [17:12:18] but if I try to run that in the Scribunto console, I get the error: "attempt to index global 'frame' (a nil value)" [17:12:59] 'frame' what {{#invoke|..}} passes lua functions [17:13:03] so it wouldn't work in global scope [17:13:33] you'd have to enclose it in a function and call that function with a simulated frame object (or save the module and use a real {{#invoke}} to test it) [17:13:50] that bit *is* really confusing; i remember being stymied by it [17:14:30] *'frame' is what, [17:16:50] Did I tell you about my 1.11-to-1.18 upgrade [17:17:33] I needed to comment out one line that was causing the problem and it went flawlessly in a minute [17:18:56] Thanks for the great MediaWiki software [17:19:16] been using it for over 10 yrs now and it's one of the best pieces of software [17:20:46] jubo2: glad to hear [17:20:55] Which line did you need to comment out? [17:21:28] bawolff: iirc it was some SQL line updating some thing [17:29:25] ori: so I tried that at https://zh.wikipedia.org/wiki/%E6%A8%A1%E5%9D%97:VariantTest [17:29:40] with a test page at https://zh.wikipedia.org/wiki/User:Amire80/Lua [17:29:51] and I get a Lua error: "attempt to call field 'prepocess' (a nil value)" [17:30:01] you're missing an 'r' :) [17:30:23] MY KEYBOAD IS BOKEN DAMN IT [17:30:28] thanks :) [17:30:36] np! i hope it works [17:31:19] yep, seems to do the right thing [17:46:15] ori: so I had a bit of progress, but now I'm stuck with something rather odd. [17:46:43] The general idea is that I have auto-created translated sreenshots for the VisualEditor user guide. [17:47:03] It works for English: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide [17:47:23] and it works for Finnish: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/fi (notice the difference in the text in the screenshots). 
[17:47:37] But it mysteriously fails for Chinese: https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/zh . [17:47:49] It is probably a mistake in the module, but a very weird one. [17:48:14] The module is https://www.mediawiki.org/wiki/Module:Language_screenshot_filename . [17:48:32] I am pretty sure that the "language = frame:preprocess('-{zh-hans:zh-hans;zh-hant:zh-hant;}-')" line works correctly, [17:49:08] but when the language is Chinese (zh) I get an error about the fileTitle variable. [17:49:24] Nikerabbit: ^ maybe you'll have an idea. [17:50:01] "fileTitle = mw.title.new(filename, 'Media')" must be failing [17:50:38] probably, but why? [17:50:49] I am pretty sure that filename is valid. [17:50:53] aharoni: "If a number id is given, an object is created for the title with that page_id. If the page_id does not exist, returns nil." ( https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#mw.title.new ) [17:51:22] mmmm, but the page should exist. [17:51:25] ah no, that's not it [17:51:31] "If the text is not a valid title, nil is returned." [17:51:42] (I have no experience with Lua) [17:52:25] aharoni: oh, you know, lua strings are not unicode [17:52:50] yes... but these are supposed to be all-ASCII. [17:52:52] aharoni: i think there is a unicode string library you need to use for string operations involving non-ascii code-points (i know, it's 2014) [17:52:56] hm [17:53:35] ori: that sucks [17:53:48] try just returning the value of "scenario .. '-' .. language .. '.png' " so we can see what it is? [17:54:46] let's see... [17:56:39] ori: returns what I expect: "VisualEditor toolbar-zh-hans.png" [17:57:51] I'm confused about how frame:preprocess('-{zh-hans:zh-hans;zh-hant:zh-hant;}-') would work. Doesn't lang converter only happen if the content language is zh (or sr, etc), which is never the case on mw.org ? [17:58:46] oh wait, translate extension doing funky things. possibly never mind [17:59:32] bawolff: yes :) [17:59:48] why 'Media' and not 'File' btw? [18:00:12] bawolff: Today I noticed for the first time that https://www.mediawiki.org/wiki/Help:VisualEditor/User_guide/zh has a variant selector. [18:00:44] ori: Media is a hack that checks for the existence of a file in Commons and not only on the current wiki. Very user-friendly (not). [18:00:56] i bet that's it [18:01:01] try changing that to File [18:01:40] But that will fail, because the file is on Commons. [18:01:50] And it does work for other languages. [18:04:38] * bawolff still bets on the language converter. =mw.getCurrentFrame():preprocess('-{zh-hans:zh-hans;zh-hant:zh-hant;}-') doesn't seem to work even on zh.wikipedia (but I'm using user language english) [18:05:31] unless module namespace is hard coded to en page language [18:05:36] languages are silly [18:07:43] bawolff: I was able to make something like this work at https://zh.wikipedia.org/wiki/%E6%A8%A1%E5%9D%97:VariantTest . [18:09:07] aharoni: Doesn't work for me [18:09:16] it outputs -{zh-hans:Simplified;zh-hant:Traditional;}- [18:09:17] Huh? [18:09:32] How about https://zh.wikipedia.org/wiki/User:Amire80/Lua ? [18:09:36] which than gets converted to what you expect, because the parser goes over the {{#invoke:'s output a second time [18:10:05] based on testing at Special:ExpandTemplates [18:10:26] bawolff: what does https://zh.wikipedia.org/wiki/User:Amire80/Lua say to you? If you didn't play with preferences much, it should say "Simplified". 
[18:10:52] It says simplified, but https://zh.wikipedia.org/w/index.php?action=raw&ctype=text/css&title=User:Amire80/Lua&templates=expand does not [18:11:12] And you'd need https://zh.wikipedia.org/w/index.php?action=raw&ctype=text/css&title=User:Amire80/Lua&templates=expand to say simplified, since you're passing the result to a lua function, not parsing it [18:12:26] bawolff: I don't know what you are referring to... on which line in which module am I passing the result to a lua function? [18:12:52] I've read before this thing that media: checks in Commons and File: doesn't, but never observed such a thing in practice [18:13:14] The difference to me is that media: links the original and File: transcludes or links the description [18:13:53] In fileTitle = mw.title.new(filename, 'Media') [18:14:31] ok... but I have a reason to think that filename is what I want it to be. [18:14:35] filename at that point is scenario .. '--{zh-hans:zh-hans;zh-hant:zh-hant;}-.png' [18:14:59] oh, bawolff is right [18:15:08] as usual! [18:15:23] OK... is there a way in which I could get it to be actually parsed? [18:16:47] hi - getting this error that blocks runJobs: A database query syntax error has occurred. The last attempted database query was:"UPDATE `btw_site_stats` SET ss_total_edits=ss_total_edits+1,ss_good_articles=ss_good_articles-1" from within function "SiteStatsUpdate::doUpdate". Database returned error "1690: BIGINT UNSIGNED value is out of range in '(`mw117`.`btw_site_stats`.`ss_good_articles` - 1)' (127.0.0.1)" [18:17:17] how do i fix this - i have not changed configuration or anything, first time that this has occurred [18:17:49] i can try running update now [18:18:21] hypergrove: I bet the ss_good_articles is -1 [18:20:16] ran update, same error on runJobs. (during update got "Warning: Illegal string offset 'LIMIT' in /var/www/w/includes/db/Database.php on line 1133")" [18:20:34] bawolff, how should i fix that [18:21:40] hypergrove: Don't know about the second error you mentioned. For the first error, try running [18:21:49] php updateArticleCount.php --update [18:22:08] bawolff, ok trying [18:22:20] actually, php initStats.php is probably a better script [18:22:30] ok [18:22:55] wait, we issue a job run just to increment/decrement fields of the site stats table? .____. that's ridiculous [18:23:05] works, thank you bawolff [18:23:26] Vulpix: I don't think we do [18:23:37] Maybe he has a job that edits a page, and thus updates stats [18:23:52] ah, that would make sense! [18:24:02] replacetext [18:24:16] yep, that would do it [18:24:19] bawolff: so, any idea about how I could get '-{zh-hans:zh-hans;zh-hant:zh-hant;}-' parsed within Lua? Or do you have a reason to think that it's just impossible? [18:24:38] aharoni: Sorry, I was just looking through the lua manual, nothing pops out [18:24:39] thanks again for the help folks - bye [18:25:10] * bawolff was very confused why his mw layout was all weird, then remembered I'd run git checkout 1.19.18 [18:25:53] maybe jackmcbarn might know [18:34:12] aharoni: Hmm, even template expansion doesn't seem to expand lang converter things [18:34:57] bawolff: maybe it would work as a secondary request or something [18:35:19] what I *actually* need is not so much to parse that string, but to find out what is the variant of the current page. [18:35:59] Maybe best bet would be to just introduce a {{CURRENTLANGVARIANT}} magic word [18:39:40] Is it possible to make some users unable to see some pages?
[18:39:57] farhadix: yes [18:40:28] However, we don't optimize for that use case, so restricting who can view pages with mediawiki is a "use at your own risk" type of feature [18:40:32] bawolff, can you help me? can you please send a link? [18:41:21] It's mostly safe though, provided you don't give edit rights to the users who aren't allowed to see some pages [18:42:33] !wg WhitelistRead [18:42:33] https://www.mediawiki.org/wiki/Manual:%24wgWhitelistRead [18:42:40] !groups [18:42:40] For information on customizing user access, see < http://www.mediawiki.org/wiki/Help:User_rights >. For common examples of restricting access using both rights and extensions, see < http://www.mediawiki.org/wiki/Manual:Preventing_access >. [18:44:25] bawolff, thanks [18:45:29] bawolff: https://bugzilla.wikimedia.org/show_bug.cgi?id=71366 , FWIW [18:52:59] is this language converter thing documented anywhere? [18:54:28] bawolff: hardly [18:54:41] I know pretty much nothing about its technical internals. [18:54:48] I spoke to liangent about that earlier today. [18:54:55] cache [18:55:13] Yes, I think liangent is the only one currently around who even remotely understands it [18:55:26] Sorry, meant to write this: [18:55:32] I meant more from a user perspective [18:55:46] Liangent said: "by the way it's possible and relatively easy to introduce the variant factor earlier in the php parser, for example a parser function or magic word. I tried something similar but the parsoid people rejected that, saying that it pollutes or fragments their cache" [18:56:14] maybe also cscott [18:57:32] I just found this - https://meta.wikimedia.org/wiki/Automatic_conversion_between_simplified_and_traditional_Chinese [18:57:44] talking about mediawiki 1.4, but shockingly looks to be accurate [18:58:55] Heh... Now that I think of it, I first learned about the -{ }- syntax from a help page in the Serbian Wikipedia, [18:59:16] I happen to know Russian and Serbian is similar enough. [19:00:01] j.mp/ZgrHcf [19:00:04] http://j.mp/ZgrHcf [19:00:21] fyi, here's a summary of what we came up with after discussing things with Liangent and others: https://bugzilla.wikimedia.org/show_bug.cgi?id=41716#c37 [19:00:51] And https://meta.wikimedia.org/wiki/Automatic_conversion_in_Serbian_language is not very informative. [19:02:52] https://www.mediawiki.org/wiki/Writing_systems/Syntax is what I'm looking for I think [19:04:04] * cscott pokes his head up [19:04:21] yeah, i grok language converter. check the parsoid spec for some links. [19:05:12] https://www.mediawiki.org/wiki/Parsoid/MediaWiki_DOM_spec/Language_conversion_blocks -- but https://www.mediawiki.org/wiki/Writing_systems/Syntax is the main useful link there. [19:05:31] This feature is scary... ;) [19:05:40] yes. it is. [19:06:32] there is a general consensus that this feature is too complicated and hairy and we should deprecate it and never speak of it again --- from people who aren't familiar with the languages involved. [19:07:21] once you understand the language situation better, it seems more and more reasonable. ;) [19:07:29] alternatively, a post-processing step on the DOM could also be fine [19:08:31] yes, i think gwicke is right wrt "polluting the parsoid cache" etc. i agree that it's best to capture all the variant info in the DOM and do the variant conversion in a post-processing step. [19:08:48] i think there's a little disagreement about whether that post-processing step is server side or client side or what. [19:09:02] but i haven't read the backlog yet. what are we talking about?
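For context, the lines under discussion, condensed from the fragments of Module:Language_screenshot_filename quoted earlier in this log (the surrounding code is omitted, and frame and scenario are assumed to be set elsewhere in the module). As bawolff observed above, frame:preprocess does not resolve the -{ }- markup at this stage, so the computed name still contains the raw markup and mw.title.new returns nil for the invalid title.

    -- Condensed from fragments quoted in this discussion; surrounding code omitted.
    -- 'frame' and 'scenario' are assumed to come from the rest of the module.
    local language = frame:preprocess( '-{zh-hans:zh-hans;zh-hant:zh-hant;}-' )
    -- Variant conversion runs after parsing, so 'language' is still the literal
    -- '-{zh-hans:zh-hans;zh-hant:zh-hant;}-' string at this point.
    local filename = scenario .. '-' .. language .. '.png'
    -- 'Media' rather than 'File' so the existence check also covers files hosted
    -- on Commons; an invalid title makes mw.title.new return nil.
    local fileTitle = mw.title.new( filename, 'Media' )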
[19:09:30] * gwicke got to go [19:12:03] oh, and the "more reasonable" thing above doesn't mean that something like the content translation tool might not be a better solution long term. hopefully we will continue making solid progress on maintenance of wikis with very similar content (either direct translations, or character set transliterations, or things in between) [19:12:21] but at some point we need to parse language converter blocks, in only to allow us to migrate content out of them. [19:12:27] *if only [19:13:19] cscott: https://bugzilla.wikimedia.org/show_bug.cgi?id=71366 [19:13:44] cscott: https://www.mediawiki.org/wiki/Module:Language_screenshot_filename [19:14:26] I would actually do it as a simple template, but I'm not sure that that is feasible either. [19:14:37] ah. hm. [19:15:02] i think brad's suggestion is probably correct. the variant conversion happens after wikitext parsing in the pipeline. [19:15:21] yes. [19:15:25] although I've got an open bug to integrate them a little more so that variant parsing isn't quite so brittle [19:15:29] that's what it seems to be doing. [19:16:01] the question is - is there any way at all to get it. [19:16:38] (https://bugzilla.wikimedia.org/show_bug.cgi?id=52661 is the "variant parsing is brittle" bug) [19:16:40] the best thing would be to have a Lua function like mw.language.getVariant [19:17:21] so, the currently-selected variant is a user preference. so it might be best placed under mw.user or some such? i'm not that familiar with the MW JS API. [19:17:42] it's not JS, it's Lua. [19:18:06] there's a "default variant" for the wiki, which is sometimes based on the hostname you use. that is, if you visit zh-cn.wikipedia.org you get the zh-cn variant of zhwiki. (my details here are probably slightly wrong) [19:18:35] but the user's variant preference, iff they are logged in and have selected a preferred variant, takes precedence. [19:18:41] I think we use it as part of the path - e.g. replace /wiki/ with /zh-cn/ [19:18:51] bawolff: yeah, that sounds right. [19:19:00] doing this in Lua would be pretty evil [19:19:12] anyway, there are three levels: wiki default, path-based default, then user preference. [19:19:22] as it'd make it impossible to use the same HTML DOM as the base for DOM-based variant conversion [19:19:30] i'm sure it's bouncing around in PHP land somewhere. [19:19:39] * gwicke really has to go now [19:19:41] < altin: good to see you here > [19:20:24] aharoni: yeah, i agree with gwicke in that it's really best to emit all the variants from Lua land, and let the later stages in the pipeline handle the conversion appropriately. [19:20:46] i haven't even begun discussing character set transliteration, which is somewhat separate from the -{ }- markup. [19:21:17] just let the l18n team translate your message as -{...}- with all the appropriate variants, and emit that from lua land. [19:21:21] unless i'm misunderstanding the problem [19:21:59] The -{ }- is not the important part per se. My current need is to know what is variant of the page as it is currently viewed. [19:22:57] -{ }- could theoretically be a (hacky) way to find it out, but it is actually parsed after Lua runs, so it doesn't help. [19:23:08] some PHP digging and Scribunto hacking is probably called for, then. i wouldn't be surprised to find that the variant wasn't exported yet. [19:23:17] is there a lua function to 'parse to wikitext'? [19:23:54] I mean, parse to HTML. If so, parseToHTML('-{var1:1,var2:2,var3:3}-') might work. [19:24:11] not that I could find. 
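A minimal sketch of the two ways a module can reach the preprocessor, both tried earlier in this log. Per the testing described above, neither resolves the -{ }- markup from within Lua: the markup is only converted when the #invoke output itself is parsed, which is why it cannot be used to compute a value such as a file name inside the module. The function name is illustrative.

    local p = {}

    -- Called via {{#invoke:...}}, which supplies the frame object.
    function p.variant( frame )
        return frame:preprocess( '-{zh-hans:Simplified;zh-hant:Traditional;}-' )
    end

    return p

    -- In the Scribunto debug console there is no global 'frame', so the
    -- equivalent call is:
    -- = mw.getCurrentFrame():preprocess( '-{zh-hans:Simplified;zh-hant:Traditional;}-' )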
[19:26:45] I assume frame:preprocess won't help (because of bug 52661) [19:27:07] but there is mw.html:wikitext which might work [19:28:23] it seems like if you were to add a new API, something underneath mw.language would be the right place. There's already mw.language.getContentLanguage and mw.language.fetchLanguageNames for example. [19:28:41] you'd want something like mw.language.fetchLanguageVariants and mw.language.getContentVariant perhaps. [19:29:34] cscott: yes [19:30:09] at this point in time I can barely write a Scribunto module, let alone hack its internals or libraries :) [19:30:10] from reading https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual it looks to me that the person who created the API didn't know about language converter [19:30:46] aharoni: i could probably help you with that, i've done some scribunto hacking, but i'm currently oversubscribed with the OCG rollout on monday. maybe Tim-away would be available to help? [19:30:55] NP [19:31:08] a number of those APIs could use improvement/extension to handle variants. [19:31:29] cscott: Just remember we don't want to be fetching user-specific settings and saving things based on them into a cache that gets shown wrongly to a user with different settings. [19:32:03] anomie: true. but i think aharoni wants the info for stats gathering or some such? [19:32:16] no, not for stats. [19:32:24] there are a number of scributo apis that provide access to user-specific properties which would be bad to cache. [19:32:26] for creating a file name on the fly. [19:32:41] cscott: Such as? [19:32:48] [[File:-{v1:foo,v2:bar}-.jpg]] doesn't work for you? [19:33:14] fortunately/unfortunately, language variants can be nested inside things in evil ways. [19:33:43] I don't think that such a thing will work on a page that is written in a language that doesn't support variants. [19:33:43] aharoni: Yeah, variant-specific filenames is going to be bad. "Pages using this file" would change depending on who last triggered a parse of the page == evil. [19:34:37] anomie: do you have any better idea for storing a screenshot that can have text in Simplified Chinese and Traditional Chinese? [19:34:51] anomie: I don't think so, the one links are taken from should be parsed with canonical parser options [19:34:57] -{v1:[[File:foo.jpg]],v2:[[File:bar.jpg]]}- might work out; that'll record the page uses both files even though only one is displayed. [19:35:18] anomie's version will be kinder to parsoid in the future as well ;) [19:35:19] Otherwise we'd have that problem with [[{{int:foo}}]] type deals [19:35:32] It could make more sense to get Translate to store pages with variants as /zh-hans and /zh-hant [19:35:35] currently it's /zh [19:35:40] bawolff: You may be right, although then you have files reporting not used when they are which isn't much better. [19:35:44] bawolff: {{int:}} is evil [19:35:45] it has advantages and disadvantages. [19:35:47] true [19:35:49] and yes, it is [19:36:02] i like the /zh-hans and /zh-hant idea. [19:36:23] For a long time, {{int:}} used to actually cause the problem you're describing though [19:36:53] As I said - it has advantages and disadvantages. The advantage is that it's like the Chinese Wikipedia - generic, and controllable by preferences or by the variant selector at the top. [19:37:09] bawolff: Although, was the problem fixed well enough that it would cover variants too, or just well enough that {{int:}} doesn't break it anymore? 
[19:37:14] Disadvantage - I need to have long discussions on IRC to figure out how to do a simple thing :) [19:37:30] * bawolff would need to look at the code to be sure [19:50:42] hey guys im wondering how to combine urls [19:50:49] so if someone types in /wiki/johndoe [19:50:56] but the actual page is /wiki/john_doe [19:51:03] that it will detec that and redirect them [19:52:01] !redirect [19:52:01] Redirects are used to forward users from one page name to another. They can be useful if a particular article is referred to by multiple names, or has alternative punctuation, capitalization or spellings. See [20:04:24] thanks [21:53:33] Nemo_bis: were you pulling a lot of files through OCG yesterday evening? [21:53:58] i was running into some problems where OCG wasn't cleaning up its temporary directories, but I can't seem to reproduce them today [21:54:15] if you were doing anything special yesterday, i'd like to have to do it again so I can figure out what was going wrong [21:56:47] cscott: yes, I was stress testing a bit [21:57:52] Nemo_bis: any wiki projects in particular? [21:59:06] cscott: en.wiki, de.wiki, it.source IIRC [22:16:21] Is it possible to hide a pages title? [23:09:16] marktraceur: I think the wndialog thing is flexible enough to produce pretty looking wizards too