[00:01:52] Morbus: You could hack something up with categories.
[00:02:08] https://en.wikipedia.org/wiki/Wikipedia:HotCat is vaguely similar to what you want.
[00:02:31] Except for the pre-populated list part, I guess.
[06:19:24] bd808: huh, T70876 is kind of a weird one, I'm going to have to think about that
[06:19:24] T70876: Make user_email_authenticated status visible on labs - https://phabricator.wikimedia.org/T70876
[07:31:24] hi - i've upgraded MediaWiki to 1.29.1 - and now the "Popular Pages" special page is gone. Using the "Monobook" skin.
[07:53:38] mwi: do you remember your old version?
[07:54:45] mwi: you will need https://www.mediawiki.org/wiki/Extension:HitCounters but it might have lost your data
[08:04:32] saper, 1.27.something
[08:05:08] "A fresh install in MW 1.26 and higher is not possible! Attention: If you do not act according to the following instructions, an update to MediaWiki 1.26 or newer can permanently delete your hitcounter numbers!"
[08:07:23] so i guess i have a slight chance of losing the old data then :P
[08:10:33] i guess it doesn't work properly, i added that extension
[08:10:43] and the whole wiki stopped working with db query errors
[08:10:46] removed it, and it works again
[08:11:58] so i guess a fresh installation is still not possible; will have to dig up the old backups then and manually transfer those hit_counter tables
[08:15:44] would be good to fix those db errors, i haven't tried HitCounters recently
[08:32:40] Hi, can anyone help me with OAuth?
[08:33:30] roni: more specifically?
[08:35:29] I got the verifier token, and I'm not sure how to use it to call the API (for the edit action) on translatewiki
[08:36:32] you need to exchange it for an access token
[08:36:42] and use the access token to sign the API requests
[08:37:19] roni: what library are you using?
[08:38:21] the verifier is not the token?
[08:38:58] both are called tokens in the OAuth spec
[08:39:10] but the verifier is just an intermediary
[08:41:50] I'm using querystring and nodejs
[08:42:53] what OAuth library, I mean
[08:45:14] you mean 1.0?
[08:46:38] I mean passport or something like that
[08:46:45] are you trying to implement the OAuth protocol by hand?
[08:48:26] I found some example for Twitter, but it works for translatewiki too, I use require("querystring")
[08:49:24] oh, I see what you mean
[08:49:55] I'm not using any library
[08:54:50] I'd recommend doing so, with a decent library the full OAuth protocol is typically three function calls
[08:55:39] but if you do implement by hand, you need to call Special:OAuth/verify, sign that call with the verifier token, and you get the access token in the response
[09:02:33] tgr, I'm working with roni on this
[09:03:01] it's for a project that involves translations, which is my expertise
[09:03:11] but I've never used OAuth myself, so I don't know anything about it
[09:03:20] the question is:
[09:03:59] 1. There's the one-time step, which involves getting the user to a MediaWiki site, and asking for username and password. I think that we've managed to make that work.
[09:04:26] 3. And there's the final step, which is making the edit action with OAuth authentication.
[09:04:37] But what is step number 2?
[09:04:44] Is there a step number 2?
[09:04:52] step number 1 only needs to happen once.
[09:04:54] username and password are not normally involved
[09:05:14] although if you are not logged in, MediaWiki will probably redirect you to a login screen first
[09:05:20] yes, exactly.
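For reference, here is the three-legged flow tgr spells out just below, condensed into a rough sketch. It is written in Lua only to match the Scribunto examples later in this log (the project discussed here is Node.js); http_post and oauth_sign are hypothetical helpers, not a real library, and a production client should lean on a maintained OAuth 1.0a library instead, where these legs collapse into a few calls.

    -- Rough sketch of 3-legged OAuth against a MediaWiki wiki, run as a
    -- client-side script. http_post and oauth_sign are hypothetical helpers;
    -- the endpoint names follow the mw.org OAuth/For_Developers docs.
    local wiki = "https://translatewiki.net/w/index.php"

    -- Leg 1: get a request token, signed with the consumer credentials only.
    local reqTok = http_post(wiki .. "?title=Special:OAuth/initiate&format=json",
        oauth_sign({ oauth_callback = "oob" }, consumerKey, consumerSecret))

    -- Leg 2: send the user to the authorization dialog; after approval the
    -- wiki hands back an oauth_verifier tied to that request token.
    local authUrl = wiki .. "?title=Special:OAuth/authorize"
        .. "&oauth_token=" .. reqTok.key
        .. "&oauth_consumer_key=" .. consumerKey

    -- Leg 3: exchange the request token + verifier for the access token.
    local accTok = http_post(wiki .. "?title=Special:OAuth/token&format=json",
        oauth_sign({ oauth_verifier = verifier },
                   consumerKey, consumerSecret, reqTok.key, reqTok.secret))

    -- From here on, sign every API request with consumer token + access token.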
[09:05:24] there is 1-legged OAuth and 3-legged OAuth
[09:05:28] and that's the first step, and it only needs to be done once.
[09:05:36] 1-legged is called owner-only in MediaWiki
[09:06:01] from what I read, "owner-only" is good for bots, but not for interactive use. what we do is more interactive.
[09:06:04] (I think)
[09:06:16] indeed
[09:06:40] the OAuth Bible is a decent resource
[09:06:42] http://oauthbible.com/#oauth-10a-three-legged
[09:07:09] so step 1 has three legs
[09:07:09] the application that roni and I are making performs repeated edit actions, each of which is initiated by the user, and we want these edit actions to be done with the correct username.
[09:07:18] - get a request token
[09:07:28] I think that this qualifies as "interactive".
[09:07:44] - send the user to the authorization dialog, use the request token to identify, get a verifier token back
[09:07:58] - exchange the request token for an access token
[09:09:07] once you have done that (or used owner-only, in which case you immediately get an access token) you can sign API requests with the consumer token + access token, and that serves as authentication
[09:09:39] but doing that by hand is better left for masochists
[09:10:18] I'm not familiar with OAuth in node, jdlrobson has done it before
[09:10:38] the libraries he used are linked from the mw.org documentation page
[09:12:13] https://www.mediawiki.org/wiki/OAuth/For_Developers#Node.js
[09:14:10] aharoni: https://www.mediawiki.org/wiki/OAuth/For_Developers#OAuth_in_detail was my attempt to document the steps in detail (although again, if you need to know that, you have probably not looked hard enough for a decent library)
[09:15:34] tgr: thanks
[09:15:39] roni, can you try that?
[09:15:53] jdlrobson is probably not online now, but he may be later
[09:16:19] ok
[09:51:47] Hi all, anyone having a problem with the session dying while typing an article/page?
[10:37:44] km4: on wikipedia?
[11:00:19] tgr: on my MediaWiki install; I'm now testing session.gc_maxlifetime
[11:01:47] MediaWiki is not affected by PHP session settings
[11:02:00] unless you are using a rather old version
[11:02:34] see the ObjectCache docs
[11:17:04] tgr: ok thanks
[17:11:12] Hello everyone.
[17:11:34] http://paste.touhou.fm/izuwehebek.vbs can anyone tell me what this means?
[17:11:50] 2017/10/16 19:06:40 [error] 271#0: *25690 FastCGI sent in stderr: "PHP message: PHP Fatal error: Cannot declare self-referencing constant 'ΓΈ' in /var/www/ways-of-darkness.sonck.nl/includes/db/LBFactory.php on line 52" while reading response header from upstream, client: 84.0.48.185, server: ways-of-darkness.sonck.nl, request: "GET /Main_Page HTTP/1.0", upstream: "fastcgi://unix:/var/run/php-fpm/php5-fpm.sock:", host: "ways-of-darkness.sonck.nl"
[17:19:39] metalhead33: what version of mediawiki and what version of PHP are you using?
[17:21:39] apparently MediaWiki 1.26.2 and PHP 5.6.26-r2
[17:22:01] so that looks/sounds like your mediawiki files got edited or corrupted somehow
[17:22:15] Oh?
[17:22:20] so I'd recommend replacing them with a fresh copy
[17:22:28] What do you mean by "them"?
[17:22:35] A lot of the maintenance scripts still work, though.
[17:22:40] Like, I can still parse articles.
[17:22:48] while you're at it, I'd also recommend upgrading, as 1.26 is very old and out of support
[17:22:49] getText.php still works
[17:22:57] I do not have admin rights to the server
[17:23:24] then please tell the server admin to upgrade -- the upgrade process should fix that error as a byproduct
[17:23:45] Okay, I will... once he is available.
[17:24:12] is there a way to clear caches or something?
[17:24:27] can you please elaborate as to what you are trying to do?
[17:25:02] Trying to find a way to fix the error without having admin access (in other words, the capacity to upgrade) - like running some maintenance scripts.
[17:25:25] not possible
[17:25:39] Damn...
[17:25:51] fixing the error would require editing the files on the webserver, and if you can do that then you can upgrade
[17:26:13] I can edit files within /var/www/ways-of-darkness
[17:26:33] so you can upgrade then :)
[17:26:34] The only things I cannot do are things that require superuser privileges, like installing new software
[17:26:37] oh?
[17:26:59] yep, updating mediawiki doesn't require superuser/root access
[17:27:09] oh, nice
[17:27:20] but I will make some backups first.
[17:27:33] Nah, I won't... I already have backups.
[17:27:48] always make backups :)
[17:27:55] there's a guide at https://www.mediawiki.org/wiki/Manual:Upgrading
[17:28:17] but the tl;dr is 1) make a backup of both the files and db (for the db use mysqldump; for the files, moving them to a different directory is sufficient)
[17:28:30] 2) download and extract a tarball of the latest version into a NEW directory (not over the old files)
[17:28:52] 3) copy over LocalSettings.php, your images, custom skins, and extensions
[17:28:59] (check the extensions for updates as well)
[17:29:04] 4) run the update.php maintenance script
[17:29:54] But wait
[17:29:59] You mentioned images
[17:30:13] chown apache:apache requires superuser
[17:30:31] that guide is assuming that the webserver is running as the 'apache' user
[17:30:41] if you have permission to the files, then chown isn't strictly necessary
[17:30:48] (and would probably break things if you did)
[17:49:43] seeking help with raising the file upload size to 5GB, any help would be appreciated
[17:51:43] stryker: file upload size is governed by 3 things: 1) mediawiki settings, 2) PHP settings, 3) webserver settings
[17:52:03] in addition to the size, you'll need to expand the timeouts by a lot as well
[17:52:10] as 5GB will take a while to upload
[17:53:17] Copy. I have adjusted LocalSettings.php and php.ini; what webserver settings would I need to adjust? I am running mediawiki on IIS7
[17:53:26] I recommend looking into chunked uploads, which would mitigate the timeout issues
[17:53:33] iirc the UploadWizard extension supports them
[17:54:41] wonderful, I will try that. Thanks again!!!
[17:54:56] with that extension, you wouldn't need to configure PHP or the webserver, just mediawiki itself
[17:55:38] perfect. would I need to adjust the webserver settings... as previously mentioned?
[17:56:17] adjusting webserver settings is needed if you want to upload it all in one go (via Special:Upload)
[17:56:37] Extension:UploadWizard will upload the file split into 1MB chunks
[17:56:45] perfect thanks.
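For the curious, the chunked-upload dance UploadWizard performs looks roughly like this. This is a sketch of the action=upload API as a client-side Lua script (Lua to match the Scribunto examples later in this log); api_post is a hypothetical multipart POST helper, the file name is invented, and csrfToken would come from action=query&meta=tokens.

    -- Sketch of chunked upload via action=upload, 1MB chunks as described above.
    local CHUNK = 1024 * 1024
    local f = assert(io.open("big.ogv", "rb"))
    local size = f:seek("end"); f:seek("set", 0)

    local filekey, offset = nil, 0
    while offset < size do
        local chunk = f:read(CHUNK)
        local r = api_post{ action = "upload", stash = 1,
                            filename = "big.ogv", filesize = size,
                            offset = offset, chunk = chunk,
                            filekey = filekey,    -- nil on the first chunk
                            token = csrfToken }
        filekey = r.upload.filekey                -- stash key for later chunks
        offset = offset + #chunk
    end

    -- Final request commits the stashed chunks as one file.
    api_post{ action = "upload", filename = "big.ogv", filekey = filekey,
              comment = "chunked upload", token = csrfToken }
    f:close()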
[17:56:45] which should be far under the existing limits set in the webserver
[17:57:12] note that you will likely need a recent version of mediawiki for the UploadWizard extension to work properly
[18:30:19] well, the mediawiki got updated, but I now have to wait for the lazy admin to chown apache:apache all the stuff
[18:31:41] it should only be chowned to apache:apache if that's what it was beforehand
[18:31:51] that's what it was
[18:31:54] ok
[18:32:09] your wiki will *probably* work just fine regardless
[18:32:24] (as long as you update your extensions and run the update.php maintenance script)
[18:32:51] oh, all right
[18:32:53] but
[18:33:00] the extensions folder isn't owned by apache
[18:33:04] it's owned by another user
[18:33:10] and it's 0770
[19:08:44] Well, this is interesting...
[19:09:02] I got my Wiki to work, and what I am planning to do is to replace this - http://ways-of-darkness.sonck.nl/Module:Demographics_Database - with something more elegant.
[19:09:59] Too bad that actually connecting Lua to an SQL database would require two things: the admin's permission (LuaRocks requires superuser access to install), and a guarantee that LuaRocks functions with Scribunto... which it probably doesn't.
[19:10:28] So I decided to improvise and maybe upload some text files to the wiki, and parse tables from those... but I cannot read files in Scribunto. No file IO, it seems.
[19:10:53] So much for improvisation.
[19:18:49] hi all. I just installed mediawiki and I want to use <code> or <nowiki> or something, but it seems that although <code> formats the text nicely, it still makes # a numbered list, and doesn't seem to understand newlines very well. Any idea how to do this without a truckload of extensions?
[19:20:42] are you looking for <pre>?
[19:21:27] 	 quite possibly - haven't used mediawiki for some time :)
[19:25:24] 	 Does anyone know if it's possible with Scribunto to parse text from an uploaded .txt or .csv file? As in, uploaded into the wiki.
[19:25:55] 	 btw, syntax highlighting - I guess there are a ton of plugins for that - I'm thinking bash, python, perl and perhaps a few others. Can someone recommend a plugin for that?
[19:26:20] 	 !syntaxhighlight
[19:26:20] 	 there are several extensions for syntax highlighting, see http://www.mediawiki.org/wiki/Category:Syntax_highlighting - the most popular one is at http://www.mediawiki.org/wiki/Extension:SyntaxHighlight_GeSHi
[19:26:29] 	 metalhead33: https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual Seems the answer is no
[19:26:35] 	 Sad.
[19:27:39] 	 I might be able to still cheat the system by sending in some huge block of text instead of parsing from a csv.
[19:27:57] 	 Okay, not that huge, given how it'll probably be less text than an average article.
[19:28:07] 	 (as in, parsing from an argument)
[19:38:38] 	 metalhead33: scribunto is heavily sandboxed. You can put the reference tables on the wiki as pages (in JSON format) which is probably your best bet
[19:39:28] 	 That makes sense.
[19:39:39] 	 Guess that's the best way I can implement a database.
[19:39:47] 	 After all, this is repulsive: http://ways-of-darkness.sonck.nl/Module:Demographics_Database
[19:40:08] 	 see https://www.mediawiki.org/wiki/Extension:Scribunto/Lua_reference_manual#mw.loadData for more info
[19:41:58] 	 (can be combined with mw.text.jsonDecode in order to store the data as a json blob instead of as a lua table iirc)
[19:42:51] 	 note that you can use Special:ChangeContentModel to configure a page to be json (lets mediawiki validate the json and uses Extension:CodeEditor if installed)
[19:44:40] 	 So basically mw.text.jsonDecode(mw.loadData('moduleName/data.json'))?
[19:45:42] 	 or just mw.loadData() if you're fine storing it as a lua table
[19:45:50] 	 e.g. https://en.wikipedia.org/wiki/Module:InterwikiTable
[19:46:04] 	 but yes
[19:46:05] 	 Good question.
[19:46:15] 	 It is going to be a relational database, consisting of multiple tables.
[19:46:19] 	 er, other way
[19:46:38] 	 you'd mw.loadData() a lua module which fetches the json page content and parses it
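A sketch of that arrangement, with hypothetical page names: the data module fetches the JSON page and decodes it, and callers go through mw.loadData(), which caches the decoded table once per parse. mw.title, getContent() and mw.text.jsonDecode are real Scribunto APIs.

    -- Module:Demographics/data (hypothetical): fetch and decode the JSON page.
    local content = mw.title.new('Module:Demographics/data.json'):getContent()
    return mw.text.jsonDecode(content)

    -- In the module that renders the tables:
    -- local data = mw.loadData('Module:Demographics/data')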
[19:46:39] 	 https://cdn.discordapp.com/attachments/290860229330206722/369473426890686464/dem.png that is, porting this to Lua or JSON
[19:47:02] 	 either way is going to suck
[19:47:31] 	 it'd be better to figure out what queries you'd run against that rdbms, and then export the results of those queries as the data
[19:47:49] 	 There is a problem with actually connecting to a database.
[19:48:32] 	 1. It'd have to go through the server admin, and I want to deal with him as little as possible. You know, superuser access is required to install LuaRocks, which is required for luasqlite, luamysql, etc. 2. It's not even guaranteed to work with Scribunto to begin with.
[19:48:46] 	 In fact, I consider it highly likely that it wouldn't work.
[19:48:51] 	 2. it definitely wouldn't work
[19:49:05] 	 you'd need to modify scribunto itself to enable that
[19:49:18] 	 (and not "change some config settings" modify but actually edit the code)
[19:49:22] 	 Okay, so that one is definitely out of question.
[19:49:29] 	 Guess I'm back to improvising.
[19:50:12] 	 Before I realized that scribunto just doesn't allow parsing from uploaded files, I intended to just upload a bunch of CSV files, and convert from raw text to string/number/bool depending on some parameters written there
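That coercion step is easy enough in Scribunto even without file IO, as long as the CSV-ish text arrives some other way (say, as a template argument). A sketch using the real mw.text helpers:

    -- Parse a CSV-ish block of text into rows of typed values.
    local function parseCsv(blob)
        local rows = {}
        for _, line in ipairs(mw.text.split(blob, '\n')) do
            local row = {}
            for _, field in ipairs(mw.text.split(line, ',')) do
                field = mw.text.trim(field)
                if field == 'true' or field == 'false' then
                    row[#row + 1] = (field == 'true')   -- boolean
                elseif tonumber(field) then
                    row[#row + 1] = tonumber(field)     -- number
                else
                    row[#row + 1] = field               -- plain string
                end
            end
            rows[#rows + 1] = row
        end
        return rows
    end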
[19:50:24] 	 lemme play around with something
[19:50:33] 	 is the database going to remain the canonical source?
[19:50:39] 	 Yep.
[19:51:33] 	 so my suggestion would be to do what I said above -- export the results of the queries you care about and use a bot to edit the wiki version of those query results (whether in lua or json is irrelevant), maybe schedule it to run every X period of time via cron
[19:51:54] 	 so the bot would connect to the db, run the query, serialize the result to whatever format, then hit the wiki API to edit the data pages
[19:54:51] 	 Maybe I should stick with JSON or Lua tables then.... 
[19:55:32] 	 Or just implement my own thing that processes raw text and turns it into a table.
[19:55:39] 	 *shrug*
[19:55:50] 	 this just automates the export from the db to the wiki if you want the db to remain the canonical source
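A sketch of that export step, where db_query and wiki_edit are hypothetical stand-ins for a DB driver and an API client (a bot framework would handle login and editing), and the column names are invented for the demographics example. The bot serializes each result row into a Lua data module and saves it over the wiki page on every cron run, so the DB stays canonical.

    local rows = db_query("SELECT race, settlement, population FROM demographics")

    local out = { "return {" }
    for _, r in ipairs(rows) do
        out[#out + 1] = string.format(
            "  { race = %q, settlement = %q, population = %d },",
            r.race, r.settlement, r.population)
    end
    out[#out + 1] = "}"

    -- Regenerate the on-wiki copy of the query results.
    wiki_edit("Module:Demographics/data", table.concat(out, "\n"))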
[20:44:33] 	 metalhead33: so as an example, I created https://www.thetestwiki.org/wiki/LoadJsonTest which uses https://www.thetestwiki.org/wiki/Module:LoadJsonTest to generate the wikitable, https://www.thetestwiki.org/wiki/Module:LoadJsonTest/data for the lua table, and https://www.thetestwiki.org/wiki/Module:LoadJsonTest/data.json for the underlying json
[20:44:37] 	 in case you find that useful
[20:45:10] 	 May I reuse the src?
[20:45:37] 	 sure
[20:46:16] 	 thanks
[20:47:22] 	 I made the json somewhat equivalent to a relational db, and the module which loads it resolves the relations by gluing the records together
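The gluing step amounts to building an index on the primary key and resolving foreign keys through it. A sketch against hypothetical field names (not necessarily what Module:LoadJsonTest/data.json actually contains):

    -- Resolve a foreign key between two "tables" from the decoded JSON.
    local data = mw.loadData('Module:LoadJsonTest/data')

    local function glueRecords(d)
        local raceById = {}
        for _, race in ipairs(d.races) do      -- index on the primary key
            raceById[race.id] = race
        end
        local joined = {}
        for _, pop in ipairs(d.populations) do -- resolve pop.race_id
            joined[#joined + 1] = {
                race = raceById[pop.race_id].name,
                settlement = pop.settlement,
                count = pop.count,
            }
        end
        return joined
    end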
[22:17:11] 	 wow, composer update ran out of memory
[22:17:19] 	 what would it even do that's memory intensive
[22:18:18] 	 dependency resolving mess?