[00:01:01] an ssh client or server?
[00:02:06] ssh client
[00:02:31] it's probably installed already
[00:02:45] sudo apt-get install openssh-client
[00:05:19] it was indeed installed
[00:06:00] and i took the wrong one: openssh-server XD
[00:09:11] what is the difference between openssh-client and openssh-server?
[00:12:05] you use openssh-client to connect to an openssh-server
[00:13:11] i see
[00:15:55] hmm i guess i will take turnkey mediawiki instead then
[00:16:15] so i will go to bed now
[00:16:21] bye
[01:47:11] Hello community!
[01:47:22] I have an API question and I'm hoping I can get some help here
[01:49:20] Hi Hibans.
[01:49:23] What's your question?
[01:49:56] Hi Debra, I'm trying to find out if there is a way for me to retrieve the lead image for an article
[01:50:10] On Wikipedia?
[01:50:14] Or on your personal wiki?
[01:50:16] correct
[01:50:21] on Wikipedia
[01:50:25] Wikipedia uses an extension called PageImages.
[01:50:30] I used Chris Farley as an example: https://en.wikipedia.org/w/api.php?action=query&prop=pageimages&format=json&titles=Chris_Farley
[01:51:05] Sure.
[01:51:10] What's the question?
[01:51:28] I also tried using this call to get a list of images, but there is no indication which one is the main/lead image
[01:51:29] https://en.wikipedia.org/w/api.php?action=query&prop=images&format=json&titles=Chris%20Farley
[01:51:49] Oh.
[01:52:32] Trying to figure out if there is a way for me to find out which image is the main one. I also tried the simple API, but that one doesn't have the image.
[01:52:32] https://simple.wikipedia.org/w/api.php?action=query&prop=pageimages&format=json&titles=Chris_Farley
[01:53:58] https://en.wikipedia.org/w/index.php?title=Chris_Farley&action=info
[01:53:59] Hmmm.
[01:55:00] Oh, so that's what's considered the main image? not what's here? https://en.wikipedia.org/wiki/Chris_Farley
[01:56:11] I guess so?
[01:56:13] That's weird.
[01:57:42] That's odd. But thanks for helping.
[02:05:26] The image on the right is the 'infobox' image
[02:05:32] might be able to google for something that gets that
[02:06:24] CZauX: I tried. No luck.
[02:11:55] Hibans: https://en.wikipedia.org/w/api.php?action=query&prop=revisions&rvprop=content&format=xmlfm&rvsection=0&rvparse=&redirects=&titles=chris%20farley
[02:12:30] That's the actual HTML of rvsection 0. There are oEmbed proxies out there that get the infobox and an abstract of the page
[02:12:44] so it should be possible to use the API to parse that HTML and pull the image out
[02:13:09] awesome! That might just do! :)
[02:16:36] Hypothetically you should be able to just check that the table has the 'infobox' class. Then parse through the table rows and columns until you get the 'image' class. The first one you see should have the URL for the image in an <img> tag as a child
[02:17:31] along with the width and height.
[02:18:35] credit: https://gist.github.com/hubgit/90fd36028dc2b608480a
[02:19:47] Yup. I already have that working in a way. Thanks a bunch, I wish I could send you a cookie! ha
[02:20:02] And thanks for the credit, I'll make sure to add that as well
[02:20:26] np
[02:22:27] Also I believe this page may be of help if there are some other options you need: https://www.mediawiki.org/wiki/API:Revisions
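A minimal PHP sketch of the approach laid out at [02:12:44] and [02:16:36], built on the query linked at [02:11:55] (swapping format=xmlfm for format=json so the response is easy to decode) and PHP's DOM extension that comes up later in the log. The User-Agent string is a placeholder assumption and error handling is omitted.

    <?php
    // Fetch the parsed HTML of section 0 and pull the first image out of the
    // table that carries the 'infobox' class, as described above.
    $title = 'Chris Farley';
    $url = 'https://en.wikipedia.org/w/api.php?action=query&prop=revisions'
         . '&rvprop=content&rvparse=&rvsection=0&redirects=&format=json'
         . '&titles=' . rawurlencode( $title );

    // Wikimedia asks API clients to send a descriptive User-Agent.
    $context = stream_context_create( [
        'http' => [ 'header' => "User-Agent: infobox-image-example/0.1\r\n" ],
    ] );
    $data = json_decode( file_get_contents( $url, false, $context ), true );

    // The page ID is not known in advance, so take the first (only) entry.
    $page = reset( $data['query']['pages'] );
    $html = $page['revisions'][0]['*'];

    // Parse the fragment and look for the first <img> inside the infobox.
    $doc = new DOMDocument();
    libxml_use_internal_errors( true );   // the fragment is not a full document
    $doc->loadHTML( $html );
    $xpath = new DOMXPath( $doc );
    $images = $xpath->query( '//table[contains(@class, "infobox")]//img' );

    if ( $images->length > 0 ) {
        $img = $images->item( 0 );
        // src is typically a protocol-relative thumbnail URL.
        echo $img->getAttribute( 'src' ), ' ',
             $img->getAttribute( 'width' ), 'x', $img->getAttribute( 'height' ), "\n";
    }

When the image PageImages picks is acceptable, the prop=pageimages call linked at [01:50:30] stays the cheaper option; the sketch above only helps when the infobox image specifically is wanted.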
[04:06:11] hi folks, installing mediawiki on debian gnu/linux. i have mysql server and client installed, yet the mediawiki installer only detects postgres (also installed). any help please?
[04:11:16] ah, of course, i also needed the php driver for mysql, that solved it
[04:11:41] spent ages on this, only to work out the solution shortly after asking for help... oh well
[04:11:49] cheers folks
[04:52:11] hi again folks
[04:53:06] hoping somebody might be able to help with this. i'm getting a 'Fatal exception of type MWException' error; this started after i changed the administrator password for the wiki (using the maintenance script)
[04:54:05] thing is - i have purged mediawiki (so all configuration files should have been removed), dropped the database, then reinstalled and gone through the setup process
[04:54:16] but still getting that damn error... any suggestions?
[05:38:57] Does MediaWiki include an actual HTML parser I can throw HTML at in the form of a string and get values from attributes?
[06:11:39] CZauX: https://secure.php.net/manual/en/book.dom.php maybe?
[06:15:59] Hrm, true. ty.
[08:12:32] how can i remove a user from a wiki?
[08:13:36] you don't, really. you can block the user from contributing, hide the username so it doesn't appear in lists of users, and delete/oversight the user and user talk pages if you like; you can do the same to their contribs
[08:14:16] but removing the user completely from the db is something we don't do
[08:16:17] i forgot the user's password, and its email address is incorrect, so i don't really know how to recover it
[08:29:43] if you are the user, there's not much you can do
[08:30:41] if you are the wiki admin and know the user, you can reset the user's password; there's a maintenance script
[09:52:38] just updated from 1.25.6 to 1.27.1 and ran update.php, but it had an error as the db user didn't have the needed permission. when i try to rerun it, it says it's up to date and does nothing. what should i do?
[09:53:36] Hi, I'm trying to install the Statcounter extension. In the instructions it says
[09:53:41] Fill in [[MediaWiki:Statcounter-project-id]] and [[MediaWiki:Statcounter-security]] with the values of the JavaScript variables sc_project and sc_security, respectively, as provided to you by StatCounter.com
[09:53:55] and I don't know what this means
[14:06:10] what does rebasing a patch mean?
[14:07:02] enigmaeth: It means to modify the patch based on changes made upstream by other people, so that there are no conflicts while merging.
[14:07:10] You can look it up online.
[14:07:40] Essentially, to put it on "top" of others' changes.
[14:08:12] okay!
[14:08:55] enigmaeth, also see the first lines of https://www.mediawiki.org/wiki/Gerrit/Tutorial#Prepare_to_push_your_change_set_to_Gerrit
[14:10:02] okay!
[14:12:00] enigmaeth: it's a very good question... We didn't explain that term on the Gerrit/Tutorial page, probably because we were already too used to it. So your fresh questions are very welcome, as they help us improve our documentation.
[14:12:31] nice
[14:15:04] andre__: does someone have to code-review this again before it goes to the main build test? https://gerrit.wikimedia.org/r/#/c/312744/
[14:15:40] enigmaeth, that link still says "Code-Review +2"
[14:15:56] yes
[14:16:19] the patch failed the main build tests
[14:29:26] andre__, enigmaeth: when verification fails after a +2, that +2 needs to be repeated (removed and re-applied)
[14:29:57] i added my own +2 now, that should do it
[14:30:05] enigmaeth: thanks for helping with this :)
[14:30:48] nice... thanks
[14:31:25] DanielK_WMDE: so merge it?
[14:31:53] i gave a +2, that should merge it
[14:32:00] ohh... yes!
[14:32:41] uhhh... whut? why did I give a V+2 instead of a CR+2? I clicked the big blue button?
[14:32:48] The Gerrit UI is horrible :/
[14:32:58] ok, now
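The 'Fatal exception of type MWException' question at [04:53:06] never got an answer in this log. As an assumption on my part rather than anything said above, a common first step is to enable detailed error output in LocalSettings.php so the real message shows up instead of the generic exception page.

    // LocalSettings.php — debugging sketch only; revert on a production wiki.
    // Replaces the bare "Fatal exception of type MWException" page with the
    // underlying message and stack trace.
    $wgShowExceptionDetails = true;
    $wgShowDBErrorBacktrace = true;   // backtraces for database errors too

    // Surface plain PHP errors as well.
    error_reporting( E_ALL );
    ini_set( 'display_errors', 1 );

The message that then appears usually points at the actual culprit (a stale LocalSettings.php, a missing extension file, a database mismatch, and so on).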
[15:08:20] I've made progress! The $updater->dropExtensionField( 'page', 'page_counter', ... ) call is buggering my hitcounter install up, and I need the SQL script associated with that call to execute. The SQL runs fine, but not if there's no matching field to drop. How can I call the SQL without dropping something? D:
[15:15:52] well, I give up. I got the table populated now, but the hooks aren't doing a thing :(
[15:25:03] does https://www.mediawiki.org/wiki/Extension:Google_Analytics_Integration still work as advertised?
[15:26:05] If anyone knew, someone would update that page? :)
[15:26:44] Well, if you've got a GA account set up it should take mere minutes to test that it works.
[15:27:02] I imagine if it *didn't* work, there'd be more complaints on the Talk page; it's a pretty popular extension.
[15:27:41] About 12% of all wikis use it: https://wikiapiary.com/wiki/Extension:Google_Analytics_Integration
[15:28:39] And that does include a few dozen 1.27+ wikis.
[15:43:48] Most people install analytics without any clue as to what to do with them
[15:44:07] True enough.
[15:54:42] Nemo_bis: Happily enough I am not instructed to utilize or understand the analytics. I'm pretty sure it's mostly just about inflating someone's e-peen
[16:04:41] oh! another brainstorm I had. I need to provide a link that loads faster than the homepage so haproxy stops pitching fits for no reason. At the moment I have a tiny little hello world script to check, but I was thinking it would be better to use api.php. Are there any spectacularly bad repercussions from doing so?
[18:21:05] hello folks
[18:21:34] i finally found out what is wrong with my scribunto: it was because of the php version!
[18:22:20] my mediawiki is a big liar: it said it's 7.0.10, as mentioned on my wiki: http://www.allaze-eroler.com/wiki/Special:Version
[18:22:51] and when i typed php -v at my ssh prompt, i got 5.4.45 as the php version
[18:24:20] command line and web server php aren't always the same version
[18:24:29] which i found out
[18:25:08] it's by reading https://www.mediawiki.org/wiki/Thread:Extension_talk:Scribunto/Lua_scribunto_error_on_many_pages_(status_127) that i realised the problem
[18:37:08] is that correct? $wgScribuntoEngineConf['luastandalone']['luaPath'] = "$IP/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_32_generic/lua"
[18:44:21] where can i download that lua file?
[19:10:31] i tried to type "make linux" to generate the lua file
[19:10:33] it failed
[19:12:05] the lua binary should be bundled with the extension
[20:05:41] could someone review this: https://gerrit.wikimedia.org/r/#/c/312744/
[20:05:41] It already has two CR+2. It failed the gate pipeline build tests. A new task was opened and is now resolved.
[20:05:42] This patch has been rebased.
[20:09:38] Just set up a new debian 7.0 server with mediawiki; the IP comes up with the "It works!" message, but I can't get to IP#/mediawiki
[20:09:43] What am I missing?
[20:17:00] Had to uncomment the alias in apache.cfg... So far so good.
[20:17:36] Does mediawiki support facebook login?
[20:17:41] (and automatic account creation)
[20:21:38] OK. Now how to do the database restore... :)
[20:27:06] Does mediawiki support facebook login?
[20:27:19] crhylove: if you have a sql dump, just execute it in mysql
[20:28:45] Yep.... rsyncing it to the right spot now...
[20:30:59] OK, got the sql dump.... Trying restore now....
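For the Scribunto configuration asked about at [18:37:08], a sketch of the relevant LocalSettings.php lines. As noted at [19:12:05] the Lua binary ships with the extension, so the explicit luaPath (quoted verbatim from the question) is normally left out; it is shown here commented out, as an assumption about when it might be needed.

    // LocalSettings.php — Scribunto setup sketch. Depending on the Scribunto
    // version this may need to be
    //   require_once "$IP/extensions/Scribunto/Scribunto.php";
    // instead of wfLoadExtension().
    wfLoadExtension( 'Scribunto' );
    $wgScribuntoDefaultEngine = 'luastandalone';

    // Only if auto-detection of the bundled binary fails; the binary must be
    // executable by the web server user. Path copied from [18:37:08].
    // $wgScribuntoEngineConf['luastandalone']['luaPath'] =
    //     "$IP/extensions/Scribunto/engines/LuaStandalone/binaries/lua5_1_5_linux_32_generic/lua";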
[20:39:38] With a tag parser hook, is there a good way to get the raw text before the tags are stripped, without having another hook?
[20:40:04] using onParserFirstCallInit with $parser->setHook
[20:41:56] OK, so I got the database restored from the other server...
[20:42:27] But now I don't have the main index. If I click Random page, I'm getting all the articles...
[20:42:38] So the data is there, but the main page is the default one.
[20:47:02] How can I get a list of all the pages in the wiki?
[20:47:49] Hi crhylove.
[20:48:01] If you visit the page "Special:AllPages" on your wiki, it has a list.
[20:48:05] Hi Debra! Found an old version of the main page. :)
[20:48:27] so now I just need to make that the current main page, and I'm pretty much good.
[20:50:21] BOOM. Did a quick edit of the older revision of the main page, hit save: GOLDEN.
[20:50:27] Nice to get all this data back. :)
[20:50:42] 2 years' worth of technical documentation. FEELS GOOD
[20:53:44] Does mediawiki support facebook login? Perhaps with Extension:OAuthAuthentication?
[20:54:44] Asterixf2: https://www.mediawiki.org/wiki/Extension:ULogin looks like it does it.
[20:55:08] Meneth: thx!
[21:00:06] can someone explain to me what i should put in the "cite_web" template to make the Cite extension useful?
[23:56:36] OK, so if I have to change the IP of my mediawiki server, how do I fix it after that?
[23:59:12] nm. Found it. LocalSettings IP. :)
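A sketch of the $parser->setHook() route suggested at [20:40:04]: a tag extension registered from ParserFirstCallInit receives the raw text between its tags before the parser processes it. The tag name <rawtag> and the callback name are made-up illustrations, not an existing extension.

    // LocalSettings.php or extension code — register a custom tag.
    $wgHooks['ParserFirstCallInit'][] = function ( Parser $parser ) {
        $parser->setHook( 'rawtag', 'renderRawTag' );
        return true;
    };

    // $input is the unparsed text between <rawtag> and </rawtag>.
    function renderRawTag( $input, array $args, Parser $parser, PPFrame $frame ) {
        // Return HTML; escape the raw content so it is shown verbatim.
        return '<pre>' . htmlspecialchars( $input ) . '</pre>';
    }

With this in place, <rawtag>''some wikitext''</rawtag> on a page would display the quote marks literally instead of rendering italics.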