[00:18:08] Is there a channel for MediaWiki devs?
[00:23:32] mpickering, well, I guess #wikimedia-dev has most of us
[00:23:57] Here isn't an incorrect place either though
[00:24:26] yeah
[00:24:46] if you have a question, ask it. it's probably appropriate for here, but if not we should be able to point you in the right direction
[00:28:42] Ok, how do MediaWiki devs find using Phab day to day, and have you modified your installation significantly?
[00:29:06] Are those two questions related?
[00:29:07] :)
[00:30:08] We (GHC) also use Phab but find it has a few rough corners. I was wondering if you had found the same.
[00:30:50] We aren't as predisposed to hack on it though, as it's not written in Haskell ;)
[00:31:02] It's been a vast improvement over Bugzilla
[00:33:01] So have you customised your installation a lot?
[00:33:32] Or which mailing list do I look at to find people discussing this?
[00:36:04] We generally advise against making hacks to core and extensions
[00:36:13] Many of us would try and make sure these "hacks" make it into mainline MW
[00:38:09] legoktm, hi! re https://wikitech.wikimedia.org/wiki/User:Gryllida/sandbox, would you like to mentor it? Would I be able to use it as a starting point for a new labs instance request?
[00:44:56] Reedy: I think they're still talking about Phab.
[00:45:32] Oh
[00:45:33] Meh
[00:45:41] Same rules apply
[00:46:32] i think we have some custom plugins or something, but not really "hacks" in the original source code afaik
[00:46:54] andre__ would know
[00:47:14] mpickering: you can also take a look at our phabricator-phabricator tasks https://phabricator.wikimedia.org/project/view/5/
[01:07:57] gry: uhhh. well fwiw I wouldn't recommend using labs for developing, I'd do it locally.
[01:39:30] legoktm: I can do that; just wondering, what do you recommend using labs for?
[01:40:10] gry: well, after developing something, to share it with other people and have them test it
[01:42:05] sure
[01:43:03] so I just develop it locally and ask questions from time to time? do you have some hints about what to read to get started? would you be willing to mentor the idea as such?
[01:52:11] Hi gry.
[01:54:36] hi Debra
[10:55:40] hello
[10:55:42] all
[10:56:17] I have a problem with my MediaWiki and I'm going crazy O.O
[10:56:40] when I try to do an action=query with api.php
[10:57:13] I receive the message "Action unknown" (French→English translation, I don't know the original message)
[10:57:32] Version is 1.21.2
[10:57:44] Can anybody help me, please?
[11:02:06] chubaka: if you're still running 1.21.2 you have way bigger problems than the one you described here
[11:02:20] (like a few dozen security issues)
[11:04:07] but i cannot upgrade now
[11:16:01] upgraded to 1.23.15
[11:16:06] still buggy
[11:17:09] http://MYSERVER/WIKI/api.php?action=query
[11:17:16] => No such action
[11:18:06] I know it's legacy, but my PHP version is 5.3.3 and i cannot upgrade
[11:18:08] chubaka, which other parameters do you pass?
[11:19:00] api.php?format=json&action=query&list=recentchanges&rclimit=50&rcprop=title&rctype=edit
[11:19:20] (MW 1.23 supports PHP 5.3.2-5.5.8)
[11:20:23] php -v => PHP 5.3.3 / Zend Engine 2.3.0 / Zend Guard Loader 3.3
[11:20:52] Have you tried /w/api.php?action=query instead of /WIKI/api.php?action=query ?
[11:21:04] (Depends on your configuration)
[11:22:08] http://MYSERVER/WIKI/api.php?action=query => same
[11:23:00] so /w/ redirects to /wiki/ ? or what do you mean by "same"?
[11:23:47] WIKI is an alias
[11:23:58] same error
[11:24:00] !time
[11:24:00] For help with configuring timezones in MediaWiki, please consult .
[11:25:00] because mediawiki is installed in /usr/share/mediawiki
[11:25:12] the real name is centreon-bdc
[11:25:43] in reality => http://MYSERVER/centreon-bdc/api.php?action=query
[11:25:57] but same error: No such action
[11:29:44] server specs:
[11:30:04] CentOS 6.6
[11:30:11] Apache 2.2.15
[11:30:19] PHP 5.3.3 / Zend Engine 2.3.0 / Zend Guard Loader 3.3
[11:31:02] MariaDB 5.5.47
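A quick way to narrow down a "No such action" response is to request the endpoint directly from a shell and look at the raw body, which often contains a PHP warning or an HTML error page that the browser hides. A minimal sketch, reusing the placeholder host and path from the conversation above (the root cause was never established in the channel):

```
# Sketch only: MYSERVER and centreon-bdc are the placeholders used above.
# First fetch the API help page, then the failing query, and compare output.
curl -s 'http://MYSERVER/centreon-bdc/api.php' | head -n 40
curl -s 'http://MYSERVER/centreon-bdc/api.php?action=query&list=recentchanges&rclimit=5&format=json'
```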
[13:44:34] premium reputation .. Gengy
[13:47:42] Gengy: Welcome to the MediaWiki support channel.
[15:45:24] hi
[15:45:41] is there a way to download the MediaWiki docs for offline usage ?
[15:46:19] Offline?
[15:46:23] As in on your own wiki?
[15:47:09] i don't have internet for 1-2 months
[15:47:20] and what are "the MediaWiki docs"? :)
[15:47:40] but i want to develop my wiki
[15:47:55] Export a book as PDF? :)
[15:48:09] lol pdf
[15:48:38] so is there a way to download the developer docs ?
[15:54:05] DMI-1407: Depends on what types of docs you mean
[15:54:31] The actual docs on mw.org aren't actually that useful from a developer perspective in many ways (with a few exceptions)
[15:54:35] for example, i don't know all the hooks
[15:54:53] so i need the manual for the hooks for offline usage
[15:54:54] Use the file docs/hooks.txt that is included with MediaWiki
[15:55:48] You can also download doxygen and generate the entire contents of doc.wikimedia.org locally if you want (somehow anyways, I've never done it)
[15:56:20] There's also https://www.mediawiki.org/wiki/Category:MediaWiki_Virtual_Library_(MVL)
[15:57:03] yeah i downloaded it already
[15:57:06] And if you really want all of mw.org there's dumps available at dumps.wikimedia.org (you would then have to import it into a local mw install)
[15:57:36] Maybe you could get mw.org into a kiwix file somehow
[15:57:37] oh cool
[15:59:31] hmm, I don't see a kiwix file for mw.org. You could maybe ask them about it on #kiwix
[16:01:08] uhm, i have an xampp installed
[16:01:26] do i only need the database backup ?
[16:03:05] There's a command line script to import database dumps - importDump.php
[16:03:10] am i on the right track if i download https://dumps.wikimedia.org/other/static_html_dumps/current/en/ ?
[16:03:36] It works fine for small wikis - however if you're importing something big, things get complicated. I think mw.org is small enough to be considered small
[16:04:55] I don't think the static_html_dumps are up to date
[16:05:06] they are not
[16:05:12] yeah, 2008 :/
[16:05:26] that extension has been dead for a number of years
[16:05:26] DMI-1407: also, that's a link to the English Wikipedia dump, probably not what you want
[16:06:02] uhm, why not english ?
[16:06:19] you want mediawiki rather than en.wikipedia, right?
[16:06:26] DMI-1407: you probably want https://dumps.wikimedia.org/mediawikiwiki/20161101/
[16:07:30] oh...
[16:07:37] do we really not have a nice way for people to just grab a 'copy' of the Manual? that's interesting
[16:09:59] DMI-1407: for reference, most of the things in the mw.org manual can also be found out by reading includes/DefaultSettings.php
[16:10:13] really, our docs on mw.org are not that good :s
[16:10:19] https://www.mediawiki.org/wiki/Category:MediaWiki_Virtual_Library_(MVL) it seems some of these are epub/pdf downloadable but many are not
[16:10:39] do i have to download all the gz archives ?
[16:10:51] i want to get this backup to work on my xampp
[16:11:34] DMI-1407: no, https://dumps.wikimedia.org/mediawikiwiki/20161101/mediawikiwiki-20161101-pages-meta-current.xml.bz2 is probably the one you want
[16:11:39] I don't know about xampp, but if you were wanting to set up a mirror site locally, you would need all of the tables, plus the mediawikiwiki-20161101-pages-meta-current.xml.bz2 file
[16:11:51] and the mediawikiwiki-20161101-stub-meta-current.xml.gz file
[16:11:55] depends on whether or not you want categories to work
[16:12:05] and you'd want to run a script on it to convert it to mysql to import it
[16:12:36] * apergos looks around for off-line reader stuff again
[16:13:19] hm, is it not possible to use phpmyadmin to import the xml tables ?
[16:13:36] No
[16:13:41] PHPMyAdmin deals with SQL
[16:13:44] apergos: Surprising the kiwix wiki doesn't seem to have a zim file for mw.org
[16:13:45] And it'd probably time out too
[16:14:10] DMI-1407: there are sql.gz files; these are mysqldumps of various tables
[16:14:26] then for the revision and text data, you need the stubs and pages files referenced above
[16:14:38] IF you go the route of setting up like a mysql db
[16:14:57] if not, you may be able to shortcut that by just using the pages file linked above, and a clever script
[16:15:20] DMI-1407: there's some info about dumps at https://meta.wikimedia.org/wiki/Data_dumps
[16:15:36] Setting up a local copy is kind of an involved process, it's not just "download data file and go"
[16:20:52] there is xowa but I have no idea how well it works
[16:21:12] or exactly what it does even; it's purported to set up a local copy of whatever wiki though
[16:24:11] DMI-1407: What is your goal here exactly? To learn enough about MW to write an extension while spending an extended time offline?
[16:24:49] (I ask because if I know what you're trying to learn, I might be able to suggest more specific docs)
[16:26:38] i'm trying to create a skin and change some settings; for now i've changed my mw so that it connects to my gmail instead of using the php mail function
[16:27:03] but i'm not done with the skin and some other little modifications
[16:27:26] i am moving to another place
[16:27:44] DMI-1407: ok, so first of all, I would recommend downloading some example skins
[16:27:51] so i will not be able to look at the online wikimedia docs
[16:27:59] As having example skins can help a lot
[16:28:28] i am using this vertex skin as a base
[16:28:34] For various config variables, you can probably just look at includes/DefaultSettings.php . For hooks, at docs/hooks.txt
[16:28:54] but finally i need to get these wiki dumps to work on my xampp installation
[16:29:32] i don't need wikipedia, only the wikimedia site
[16:29:47] And then I'd recommend downloading pretty much everything in https://www.mediawiki.org/wiki/Category:Skinning
[16:29:49] (it's a little bit confusing)
[16:30:19] DMI-1407: Actually you need MediaWikiWiki, not Wikimedia (and not to be confused with MediaWiki the software) :)
[16:30:24] The naming is confusing as hell
[16:30:46] Wikimedia means the combination of Wikipedia, Wiktionary, etc
[16:30:51] MediaWiki is the software
[16:30:59] and MediaWikiWiki is the wiki documenting the software
[16:31:20] ok
[16:33:51] uhm
[16:34:00] currently i have php version 5.5.15 installed
[16:34:21] is it possible to use the dumps with this version or do i need to update it ?
[16:35:01] as far as i know, i need to use the correct mw version for the current dump, or not ?
[16:41:23] PHP version is tied to MW version
[16:41:28] 5.5.15 is fine
[16:46:37] i checked the php.ini, so safe_mode is off and open_basedir is empty
[16:47:14] but i'm not sure how to use php via the command line from windows cmd x/
[16:48:29] do i have to set up the database and install the mw before importing the dumps ? (i think yes)
[16:49:10] yes
[16:49:15] but i don't know the configuration for the mw so that the dumps will work on it
[16:49:53] config doesn't really matter
[16:50:44] really ? what about things like the table prefix ?
[16:52:12] oh, don't set a table prefix
[16:52:21] I guess that one does kind of matter
[16:52:32] bawolff: I set one on my dev wiki... Amusing when people merge stuff that breaks it
[16:52:52] If you end up directly importing sql files from the dump sites, a table prefix will complicate things
[16:53:28] Reedy: that's probably a good idea. If people break that, an sql injection probably isn't far away
[16:53:59] I think I remember RoanKattouw committed something (back in SVN days) that didn't cater for table prefixes
[16:57:31] can i change the folder of that installation directory ?
[16:58:24] for example, instead of mediawikiwiki or enwiki, let's say wiki_dump ?
[16:58:49] or is that folder name important ?
[16:59:40] The folder shouldn't matter. After you change it you have to update $wgScriptPath (and possibly $wgArticlePath) in LocalSettings.php
[17:00:05] That is, if the installer has already been run
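To make that concrete, here is a minimal sketch of the relevant LocalSettings.php lines, assuming the installation folder was renamed to the illustrative wiki_dump name from the conversation above:

```
<?php
// Sketch only: "wiki_dump" is the example folder name used above,
// not a required value. $wgScriptPath is the URL path to index.php/api.php.
$wgScriptPath = "/wiki_dump";
// Only relevant if you use "pretty" article URLs; otherwise leave it unset.
$wgArticlePath = "$wgScriptPath/index.php/$1";
```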
[17:06:19] uhm
[17:06:25] what's the database name ? :/
[17:06:31] whatever you want it to be
[17:06:35] ok
[17:08:13] Reputation premium
[17:21:30] trader title
[17:22:11] title trader
[17:26:13] is the order of the xml gz files important while importing ?
[18:19:19] title trader
[18:19:26] reputation premium
[18:23:24] Gengy: do you have an issue or problem with MediaWiki? If no, you're welcome to hang out, but please do not spam the channel with nonsense
[19:26:51] I made my Cite extension unhappy. Can anyone tell me how I appease it?
[19:27:12] I was a fool and updated it for something unrelated and now I see lots of errors like: Cite error: Invalid <ref> tag; name "pmid15265852" defined multiple times with different content
[19:27:27] Sacrifice a goat.
[19:27:46] Tried that. Didn't work. It's getting expensive
[19:27:48] and messy
[19:28:35] :P
[19:29:18] We started with chickens. There was no apparent effect
[19:32:36] Newborn children?
[19:36:08] Election's not over yet. That's still aggressively illegal
[19:36:50] aaaaaaaanyway I've found the error in the Cite documentation, and it's implied that the references need fixing. However there wasn't a problem before I updated Cite.
[19:37:43] Hi - can anyone tell me how i can join two categories to perform an inline ask query? I thought it might be something like {{#ask: [[Cat1]] AND [[Cat2] | ?Cat1Val | ?Cat2Val }} - any ideas?
[19:38:05] !hss
[19:38:05] https://upload.wikimedia.org/wikipedia/mediawiki/6/69/Hesaidsemanticga2.jpg
[19:38:20] I have been waiting SO long to do that to someone else
[19:38:35] :o
[19:39:30] abnev: There's a separate channel for the semantic extension, probably a lot of the same folks
[19:39:58] when I asked my first semantic question in here, that's what I got
[19:39:58] Ulfr: ah cheers
[19:41:42] I'm also here to shamelessly cadge knowledge from wiser heads, so they might not break out the pitchforks for semantic questions these days. xD
[19:41:43] Ulfr: you're welcome
[19:41:55] Wait, what?
[19:42:07] I mean, thank you, but I'm feeling puzzlement bordering on alarm
[19:44:26] Reedy: What did you do. I'm legitimately looking over my shoulder now
[19:45:07] * Reedy grins
[19:45:39] :|
[19:47:38] Well, now that I'm fearing for my life, is there any variable I can feed Cite to make it chill out about that error?
[19:48:32] Don't use the same name many times with differing content?
[19:48:54] It's not my content that's breaking though, and there wasn't an error before the cite update
[19:48:56] Reuse it with <ref name="..." />
[19:49:04] oh god
[19:52:15] Reedy: http://www.wikidoc.org/index.php/ST_elevation_myocardial_infarction_glycoprotein_IIbIIIa_inhibition#References
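For anyone hitting the same error: Cite lets a named reference be defined once and reused by name, but two definitions under the same name must have identical content. A minimal wikitext sketch (the ref name is taken from the error message above; the citation text is invented for illustration):

```
First claim.<ref name="pmid15265852">Author A. Some article. PMID 15265852.</ref>
Second claim, citing the same source.<ref name="pmid15265852" />

<!-- Redefining the name with different text is what triggers the
     "defined multiple times with different content" error: -->
Third claim.<ref name="pmid15265852">A different citation body.</ref>
```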
[19:53:32] sorry to break into this discussion
[19:53:50] about the importDump.php script
[19:53:52] php importDump.php --conf ../LocalSettings.php /path_to/dumpfile.xml.gz wikidb
[19:54:01] wikidb means the name of the database ?
[19:54:10] the mysql database, yes
[19:54:21] and what is meant by the dumpfile ?
[19:54:35] i have 30 dump files...
[19:54:37] that's where the tarball containing your wiki dump is located
[19:54:51] you either need to cat them together or import them one at a time
[19:55:16] i've loaded all the files on this page: https://dumps.wikimedia.org/mediawikiwiki/20161101/
[19:55:40] i am only a little bit confused about which one is first, or is the order not important ?
[19:56:28] if i can, i will do one at a time...
[19:57:27] You're beyond my knowledge, someone else might know better
[19:57:35] https://meta.wikimedia.org/wiki/Data_dumps/Import_examples
[19:57:41] currently i am reading this
[19:58:03] i am on 1.1.1, steps 6-7
[20:00:03] bawolff, Reedy ? :/
[20:04:22] I've never imported a large dump like this...
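For what it's worth, a rough outline of the usual sequence described on the Data_dumps/Import_examples page linked above. The filename matches the 20161101 mediawikiwiki dump mentioned earlier; this is an untested sketch, not a verified recipe:

```
# Run from the root of the MediaWiki installation, after the installer
# has created LocalSettings.php and an empty database.
bzcat mediawikiwiki-20161101-pages-meta-current.xml.bz2 | \
    php maintenance/importDump.php --conf LocalSettings.php
# Rebuild derived data (recent changes, link tables) once the import finishes.
php maintenance/rebuildrecentchanges.php
php maintenance/rebuildall.php
```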
[20:07:43] Reedy: I figured out what you meant with Cite. Thanks man
[20:08:07] Past me: [19:41:43] Ulfr: you're welcome
[20:08:08] :P
[20:08:36] ...
[20:08:36] My mind
[20:08:39] is blow
[20:08:39] n
[20:08:44] how are you a futureman
[20:15:56] :/
[20:40:05] uhm, the script does not work
[20:40:41] nothing happens and no error messages...
[22:41:44] Hi.
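A closing note on the silent importDump.php failure reported at [20:40:41]: maintenance scripts can die quietly when PHP's error display is off. One way to surface the problem, sketched here as a guess rather than a known fix, is to force error output on for the CLI run:

```
# Hypothetical debugging step: make the CLI interpreter print everything.
# The dump filename is a placeholder, as in the command quoted earlier.
php -d display_errors=1 -d error_reporting=E_ALL \
    maintenance/importDump.php --conf LocalSettings.php /path_to/dumpfile.xml.gz
```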