[03:44:22] Does anyone here know how to handle cookie issues while using the MediaWiki login/edit REST API? Specifically, how do you preserve cookies between separate XHR requests?
[04:41:21] srish_aka_tux: https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/withCredentials
[04:44:00] Thanks, bd808. I did try this, still no luck. Should debug more! :)
[04:44:18] darn
[04:46:48] my javascript fu is pretty weak
[04:47:05] much better with python :)
[04:48:22] I would very likely 'cheat' and use jquery.ajax
[04:49:36] :) The Python script in https://www.mediawiki.org/wiki/API:Edit/Editing_with_Python works like a charm. I was trying a JS version and ran into tedious cookie stuff :-/
[04:51:47] srish_aka_tux: I'd cheat on that one too with `pip install mwclient` :)
[04:52:38] I do have 2 patches in there though, so it's not completely cheating
[04:54:57] Wow, didn't know this existed; it looks pretty well documented. I was trying not to cheat and write some code :P
[09:39:17] https://en.wikisource.org/wiki/Wikisource:Scriptorium/Help#Page:The_Cutter.27s_Practical_Guide_1898_Edition_Part_1.djvu.2F28
[09:39:54] Once again I am running up against the problem of the parser apparently trying to be helpful and not doing what was desired
[09:40:59] It would be helpful if at some point the parser was rewritten, so that I'm not playing the same back-and-forth every few months, hunting for the precise combination of templates needed to make something format correctly on pages and when using LST
[09:41:55] I've mentioned this at least twice before, and it's apparently not a priority... I'd like to know WHY?
[10:16:27] hi @all
[10:17:50] after moving adhspedia to another server, I've got a weird problem.
maybe one of you could help me with it:
[10:17:51] Fatal error: Uncaught Error: Call to a member function getIP() on null in /home/adhspedia/public_html/includes/user/User.php:2233
Stack trace:
#0 /home/adhspedia/public_html/includes/session/SessionBackend.php(712): User->getName()
#1 /home/adhspedia/public_html/includes/session/SessionBackend.php(197): MediaWiki\Session\SessionBackend->save(true)
[10:17:52] #2 /home/adhspedia/public_html/includes/session/SessionManager.php(473): MediaWiki\Session\SessionBackend->shutdown()
#3 [internal function]: MediaWiki\Session\SessionManager->shutdown()
#4 {main}
thrown in /home/adhspedia/public_html/includes/user/User.php on line 2233
[10:19:16] Apache/2.4.18 (Ubuntu)
[10:19:47] PHP 7.0.18
[10:20:25] I've just manually upgraded MediaWiki to 1.29.0, but the problem stays the same
[10:31:42] Okay, anyone good with SQL?
[10:31:47] This query doesn't work
[10:31:53] https://quarry.wmflabs.org/query/18908
[10:32:11] It should be dropping rows with a particular template, but doesn't
[10:38:33] Tried changing it to a category and it still doesn't work - https://quarry.wmflabs.org/query/18908
[10:38:35] Suggestions?
[12:10:23] Is there a way to add a title to a code snippet? I'd like to add the filename to foo bar
[12:16:03] Hello. I have MediaWiki 1.30.0-alpha, a fresh install on nginx. After configuration, on the index page I don't see skins - it says they are disabled. In LocalSettings.php I have all the "wfLoadSkin" lines included. Still nothing. What can I do?
[12:21:25] anyone?
[12:28:38] bielecki, https://www.mediawiki.org/wiki/Manual:Errors_and_symptoms#The_wiki_appears_without_styles_applied_and_images_are_missing
[12:30:11] MediaWiki 1.24 and later don't include skins in the Git download.
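To put the opening cookie question in Python terms (the route srish_aka_tux said worked): all that is needed to preserve cookies between separate requests is a single shared cookie jar, which is the behaviour `withCredentials` is meant to give XHR. A minimal standard-library sketch, assuming the legacy `action=login` flow; the wiki URL and credentials are placeholders, not anything from this log.

```python
import json
import urllib.parse
import urllib.request
from http.cookiejar import CookieJar

API = "https://example.org/w/api.php"  # placeholder wiki endpoint

def make_opener():
    # One CookieJar shared by every request: session cookies set by the
    # login step are replayed automatically on later calls.
    return urllib.request.build_opener(
        urllib.request.HTTPCookieProcessor(CookieJar()))

def login_fields(token, user, password):
    """POST fields for the legacy action=login step (placeholder creds)."""
    return {"action": "login", "lgname": user, "lgpassword": password,
            "lgtoken": token, "format": "json"}

def api_post(opener, fields):
    body = urllib.parse.urlencode(fields).encode()
    with opener.open(API, data=body) as resp:
        return json.load(resp)

def login_and_edit(user, password, title, text):
    opener = make_opener()
    tok = api_post(opener, {"action": "query", "meta": "tokens",
                            "type": "login", "format": "json"})
    login_token = tok["query"]["tokens"]["logintoken"]
    api_post(opener, login_fields(login_token, user, password))
    csrf = api_post(opener, {"action": "query", "meta": "tokens",
                             "format": "json"})["query"]["tokens"]["csrftoken"]
    return api_post(opener, {"action": "edit", "title": title, "text": text,
                             "token": csrf, "format": "json"})
```

In a browser the equivalent is setting `withCredentials = true` on every XHR (and the server allowing credentialed CORS); here the opener plays the role of the browser's cookie store.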
[12:32:44] Darmock: I installed from the Ubuntu repo - there are skins in skins/ and they are included in the config
[12:33:56] andre__: in load.php I see the comment "Max made me put this here" - so it seems something is wrong (if I understood the wiki correctly), but the wiki help only covers Apache, not nginx :/
[12:35:30] and Ubuntu ships version.....?
[12:36:26] when I open mw-config it says "MediaWiki 1.30.0-alpha installation"
[12:36:57] and you installed from the official ubuntu repos? o.O they only have 1.27
[12:37:42] lol, I just checked dpkg and it indeed says "1:1.27.3-1~bpo8+1"
[12:37:49] * Darmock spins up an ubuntu VM
[12:38:20] but I swear that mw-config says 1.30.0-alpha o.O
[12:38:44] is it 17.04?
[12:38:54] no, it's 16.04
[12:38:58] hmm ok ta
[12:41:10] might be a little while, not really familiar with how these packages are set up myself, I normally install from the tarball
[12:41:26] Is there a way to add a title to a code snippet? I'd like to add the filename to foo bar
[12:41:43] okay, I'm not going anywhere
[12:43:29] Darmock: just to be clear - I set up nginx, php, mediawiki and all that stuff on localhost, and I'm connecting using 127.0.0.1
[12:44:12] yeah, I can't find any mediawiki packages; could you paste the output of `apt-cache show mediawiki`, out of curiosity?
[12:45:21] to a pastebin of course, http://paste.debian.net will do
[12:48:47] Darmock: here you are: http://paste.debian.net/979583/
[12:56:21] bielecki: that version (1:1.27.3-1~bpo8+1) is from the debian repos, not sure how well that's gonna fly
[12:57:06] so what do you suggest I do?
[12:58:53] bielecki: you can grab the ubuntu-specific packages from https://launchpad.net/~legoktm/+archive/ubuntu/mediawiki-lts, but they will conflict with your current installation; you'll need to do some tidying up, not sure how apt deals with name conflicts
[13:00:19] you know, I can purge the current install, because I was stuck at the main page :v
[13:02:51] ahh also, bear in mind the packages will force apache on you, I believe
[13:03:02] you can probably disable whatever vhost it tries to create and replace it with nginx
[13:03:10] I know that it's something stupid, a stupid typo or mistake, I can feel it, but I can't find it
[13:03:37] but if you don't wanna mess around with that, I recommend installing via tarball, it's almost as easy as installing by package
[13:04:20] if the tarball won't install apache by force, I can configure it to work with nginx
[13:04:47] it's funny that the wiki says it should work with nginx as well, but there is no support for it
[13:04:48] yeah, the tarball is just the PHP files; whack em in a directory, point nginx w/ php-fpm at it, and bam
[13:05:06] okay, let's try this one
[13:06:00] might need to fiddle with permissions, so that nginx can read the files there, and also write to $wikiroot/images
[13:06:28] assuming you want uploads of course.. lots of combinations
[13:09:27] er, s/nginx/php-fpm/
[13:09:56] for writing
[13:53:38] Darmock: thank you very much - I installed it from the tarball, reconfigured it, and it works like a charm <3
[13:54:07] bielecki: happy editing =)
[13:54:20] thank you <3
[15:01:47] Hi
[15:01:59] How do I get the raw wiki text/html of a page?
[15:02:47] which of these two?
[15:03:06] action=edit for the first? "view source" or wget or curl or whatever for the second?
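On the raw wikitext/HTML question just above: besides `index.php?title=...&action=raw` for wikitext, the API can return either form. A standard-library sketch; the endpoint and any titles passed in are just examples.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikisource.org/w/api.php"  # example endpoint

def wikitext_params(title):
    # prop=revisions + rvslots=main returns the page's raw wikitext
    return {"action": "query", "prop": "revisions", "titles": title,
            "rvprop": "content", "rvslots": "main",
            "formatversion": "2", "format": "json"}

def html_params(title):
    # action=parse returns the rendered HTML instead
    return {"action": "parse", "page": title, "prop": "text",
            "formatversion": "2", "format": "json"}

def fetch(params):
    url = API + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def get_wikitext(title):
    page = fetch(wikitext_params(title))["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]
```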
[15:03:21] andre: The problem is this
[15:03:41] https://en.wikisource.org/w/index.php?title=Page:United_States_Statutes_at_Large_Volume_2.djvu/3&action=edit hides a secret line
[15:03:56] https://www.mediawiki.org/wiki/API:Parsing_wikitext
[15:06:03] The hidden line is -
[15:06:30] which is pagequality level="" user="Bar"
[15:07:04] It's not currently possible to directly query that hidden data
[15:07:13] At least not easily
[15:08:42] I don't know whether you mean wikitext or HTML, whether that secret line is intended to be secret, what the expected outcome is, or what the actual outcome is, but... alright? :)
[15:08:50] * andre__ shrugs :)
[15:09:28] andre__: The line that doesn't show in the UI edit field is what used to track stuff for Proofread Page
[15:09:44] ok?
[15:10:07] I was needing a way to quickly find which pages are at a specific level and which I'd edited
[15:10:14] or changed to a specific level
[15:10:29] In most other circumstances I'd do this with Quarry
[15:10:48] But you can't run SQL directly against the "TEXT" of a wiki page
[15:10:57] At least not without waiting forever
[15:16:40] I'm considering raising a phab ticket
[15:17:04] Unless there's a fast grep tool that looks through the API?
[15:27:54] ShakespeareFan00: It's still unclear to me 1) what the problem is, 2) what you expect to happen, 3) what actually happens.
[15:28:02] But maybe someone else does understand...
[15:28:26] I am trying to find pages with a particular text string in them
[15:38:54] ShakespeareFan00, looks like there are specific categories for each page quality
[15:39:08] Yes
[15:39:13] why not do some sort of boolean search with the category and whatever other conditions you need?
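On the "fast grep tool that looks through the API" question: on wikis running CirrusSearch (which Wikimedia wikis do), the ordinary search API's `insource:` operator greps page source server-side. A hedged sketch; the endpoint is an example and the exact string to search for (e.g. part of the pagequality markup) would need checking against real page source.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikisource.org/w/api.php"  # example endpoint

def search_params(snippet, limit=50):
    # list=search with an insource:"..." query matches the raw wikitext,
    # not just the rendered page text
    return {"action": "query", "list": "search",
            "srsearch": 'insource:"%s"' % snippet,
            "srlimit": str(limit), "formatversion": "2", "format": "json"}

def grep_titles(snippet):
    url = API + "?" + urllib.parse.urlencode(search_params(snippet))
    with urllib.request.urlopen(url) as resp:
        return [hit["title"] for hit in json.load(resp)["query"]["search"]]
```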
[15:39:29] OH -> Feasible
[15:39:45] Having the field directly in the page table would also be helpful
[15:40:39] I'm not familiar with quarry, but I assume you can use SQL to search by category
[15:40:47] that would probably be something to file a phab ticket about
[15:42:50] https://phabricator.wikimedia.org/T172408
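The category approach suggested above can also be scripted without SQL: `list=categorymembers` walks one of the per-quality categories through the API, with `cmcontinue` handling paging. A sketch under the assumption that the quality categories exist with a plain `Category:` prefix; the endpoint and category names are examples.

```python
import json
import urllib.parse
import urllib.request

API = "https://en.wikisource.org/w/api.php"  # example endpoint

def members_params(category, cont=None):
    # Build one page of a list=categorymembers query; pass the previous
    # response's cmcontinue token to fetch the next page.
    p = {"action": "query", "list": "categorymembers",
         "cmtitle": "Category:" + category, "cmlimit": "max",
         "formatversion": "2", "format": "json"}
    if cont:
        p["cmcontinue"] = cont
    return p

def category_members(category):
    titles, cont = [], None
    while True:
        url = API + "?" + urllib.parse.urlencode(members_params(category, cont))
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        titles += [m["title"] for m in data["query"]["categorymembers"]]
        cont = data.get("continue", {}).get("cmcontinue")
        if cont is None:
            return titles
```

Intersecting the returned titles with, say, a list of one's own edited pages (from `list=usercontribs`) would give the "boolean search" suggested in the log.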