[08:53:20] hm I installed puppet on some of my servers and I quite like it [08:53:32] dunno why I always hate it here, maybe the combination of gerrit [08:53:33] :P [08:53:57] puppet quite sucks when you need to wait 3 weeks for your config updates to get merged [09:24:27] Is this a server misconfiguration?: https://tools.wmflabs.org/xtools/pcount redirects to http://www.tools-webgrid-01.com:4086/xtools/pcount/ [09:32:47] se4598: that is a tool misconfig [09:32:58] nothing to do with server [09:37:06] !tooldocs [09:37:07] https://wikitech.wikimedia.org/wiki/Nova_Resource:Tools/Help [09:38:43] what is the max diskspace a project can use? [09:49:08] lbenedix1: when you say 'project' you mean labs project or tool in toollabs? [09:50:46] I mean in /data/project/Foobar [09:52:04] Ah. Well… there isn't an official quota. You might find yourself chided if you're using space for dumb things (e.g. unpurged logfiles) but as long as you have a valid/interesting case for file use you have a lot of leeway. [09:52:08] How much space are you thinking? [09:52:33] I can't estimate this now [09:53:34] I'm part of a research group based in Berlin, Germany. Currently we are looking for bot edits in enwiki. [10:00:42] lbenedix1: unless you're in the 'dozens of GB' zone then don't worry about it much. [10:00:57] You may have performance issues before space becomes a concern :) [10:12:46] great ;) [10:13:21] the heavy work will be done by our local machines, but we need some data from the revision table [10:13:54] Right now /data/project/ can map to either a Gluster server or an NFS server. [10:14:06] btw. is it possible to get a dump of this table? [10:14:11] Soon we'll migrate everything over to NFS, but if you want to use NFS in the short term you need to specify that in the instance config. [10:14:39] lbenedix1: I don't know. It's possible to get access to labs-local mirrors of most tables but I haven't done it.
[11:18:26] hi, is it possible to make an SQL query to wikidata tables through the Nova Resource:Tools/Tools/Query service? [12:00:25] rotpunkt: I don't think so [12:00:38] rotpunkt: Tools/Query is Semantic MediaWiki [12:00:51] rotpunkt: you might want to ask in #wikidata how to query Wikidata tables. [12:01:05] they said to ask here :) [12:02:15] however, do the wikidata tables start with the wb_ prefix? [12:09:24] (CR) Dzahn: [C: -1] "after talking to AndrewB, just create the key locally for testing since it's a one-off thing, using labs/private is ok, but depends on wha" [labs/private] - https://gerrit.wikimedia.org/r/109480 (owner: JanZerebecki) [12:50:43] (CR) JanZerebecki: "Using role::planet with role::puppet::self fails because this file is missing." [labs/private] - https://gerrit.wikimedia.org/r/109480 (owner: JanZerebecki) [13:03:59] (CR) JanZerebecki: "Clarification: As the result of role::planet the key (and to be generated cert) is used by an apache instance whose https port is not inte" [labs/private] - https://gerrit.wikimedia.org/r/109480 (owner: JanZerebecki) [15:31:55] Coren, I need your assistance again. xtools seems to be getting hammered again. [15:33:16] Cyberpower678: I'm sorry dude, but you need to consider that I do not exist for the next two weeks; between our increasing pressure to move to eqiad and travel, I manage to have anti-free-time. You might want to ask Tim when he's around, he can almost certainly help you. [15:33:42] Coren, thanks. What's his username? [15:33:47] scfc_de [15:34:04] Oh. He's my second goto. [15:35:00] (Anti-free-time annihilates free time on contact, violently) [15:36:36] * Damianz wonders what form is replacing Coren for the next 2 weeks *ghostbusters theme tune* [15:36:47] Coren, travel == fosdem?
[15:37:40] Also the Facebook Open Academy thing; I'll be in Palo Alto right after FOSDEM (I barely have time to shower between the planes), then at the SF office on Feb 10-11 (which we could use for a severe sprint) [15:38:17] Coren: I can throw some water over you on Sunday before we start tidying up, if you're feeling smelly. [15:38:19] that's a whole lot of back and forth! [15:38:52] Heh. [15:39:03] Jet Set Sysadmin! [15:39:30] * yuvipanda steals some airline miles from Coren [15:39:48] Does wonders for my frequent flyer status though; I'll make 1W gold this year. [15:39:57] hurrr [15:40:39] Coren, when I shut down the webservice, why does it only say no input file specified? [15:41:10] Cyberpower678: Probably because you got something in your index.php that expects the lighttpd environment? [15:41:26] Shouldn't it default back to the original webservers? [15:41:29] (If you shut down the lighttpd, you fall back to apache by default) [15:42:10] It does. That message is what happens when your script gets invoked by Apache. Perhaps you have a .htaccess that fiddles things, or permissions issues? [15:42:24] (Apache is more fiddly about permissions than lighttpd is) [15:42:35] * Cyberpower678 looks, but he hasn't messed with .htaccess files. [15:44:18] Probably permissions then. [15:44:37] * Cyberpower678 added some rewrite rules. [15:44:47] In lighttpd? [15:45:02] * Cyberpower678 sees that somebody added some rewrite rules is what he wanted to say. [15:45:08] Ah. [15:45:16] * Coren goes back to work. [15:46:16] I'll just keep xtools off for the time being. [16:25:35] se4598: Re https://tools.wmflabs.org/xtools/pcount, that's a bug in our web config (cf. https://bugzilla.wikimedia.org/show_bug.cgi?id=59926). There are ways to work around that in ~/.lighttpd.conf, but the easiest is to add a "/" to the link, i.e. https://tools.wmflabs.org/xtools/pcount/. [16:26:06] scfc_de: you had some troubles with wm-bot? [16:29:23] scfc_de: oh, ok, it's a new bug.
assuming "www.tools-webgrid-01.com" is the internal hostname being used, isn't that a bit unsafe (especially to end it on a public domain like .com)? [16:29:25] petan|wk: Uh, that was long ago. I think it was updating the SALs on "!log", but didn't report it here as success ("yes, master!"). But I think it has fixed itself since. [16:29:48] scfc_de: wm-bot was never doing that, it was logbot [16:30:06] petan|wk: In this channel as well? [16:30:18] yes, everywhere [16:30:23] wm-bot doesn't log to sal [16:30:51] I think you had problems compiling the requests module [16:31:43] requests are using dotnetwikibot which is broken and doesn't work with latest mediawiki [16:31:45] se4598: I'm pretty sure the redirect goes to http://tools-webgrid-01:4711/something and your browser (or your ISP's proxy) adds www. and .com when it doesn't find the website. If you happen to use Alice DSL in Germany, there's even a configuration option that unfortunately is enabled by default :-). [16:31:46] that is why it doesn't work now [16:33:22] petan|wk: No, I wanted to replace that with querying SMW, and I have done that on the command line in mono successfully since. But I am totally unable to compile Request.csproj into a module that I could load for testing. But I'm in the process of polishing up my changes and then leave it to you to do the packing :-). [16:33:50] you need to compile wm-bot before [16:34:00] it's a dependency; you can't build a module without the core [16:34:32] it was probably erroring because it couldn't find the wm-bot binary to link against [16:36:04] I'm sure it can be done :-). But my brain was quite busy enough to find out what .NET libraries I could use in Mono on Ubuntu Precise and what not :-). But I have it working on the command line for shell requests, so just copy & paste for Tools requests, and you can do the magic that's needed for compiling. [16:36:15] scfc_de: oh, now I see it too, stupid cache and auto-extend from firefox...
[16:58:07] (CR) Andrew Bogott: [C: 1] "I am too sleepy to merge this right now, but everything about your plan sounds just fine to me. I suggest that you add a comment next to " [labs/private] - https://gerrit.wikimedia.org/r/109480 (owner: JanZerebecki) [17:07:31] coren: is this still an issue: https://rt.wikimedia.org/Ticket/Display.html?id=5572 ? [17:08:35] drdee: Yes, though there is a "right" ticket about the controller. I was unaware of that duplicate. Hang on, lemme find it. [17:11:17] drdee: Or not. I would have /sworn/. Lemme update that one. [17:11:29] ty Coren [17:13:19] Coren: another question: what is the oldest version of ubuntu running in labs? 10.04? 9.04? i am trying to replicate https://rt.wikimedia.org/Ticket/Display.html?id=3101 without success so far [17:14:23] drdee: I *think* we have a couple of 10.04 left, but I'm pretty sure the last Hardys are gone now. [17:15:09] The vast majority is now Precise. [17:15:32] And I think the only Lucids are a couple of miscellaneous boxen. [17:15:50] But RobH is probably the better one to answer that with certainty. [17:17:22] i can't login to my instance master.pmtpa.wmflabs [17:17:30] my key gets rejected [17:18:04] * Coren goes look at the logs. [17:19:16] drdee: Your root filesystem is out of inodes. [17:19:57] how to fix that? [17:20:19] Delete files. I'm looking to see if there is somewhere with a lot of files that can safely be nuked. [17:20:53] but I can't login :) [17:21:05] Yeah, hence my looking for you. [17:21:07] andrewbogott: RFEs for wikitech/addmember go to "MediaWiki extensions"/"OpenStackManager"? [17:21:21] Looks like most of the files are in lib/hadoop-hdfs [17:21:41] I mean /var/lib/... [17:22:16] Specifically, /var/lib/hadoop-hdfs/cache/ [17:22:19] scfc_de: RFE... [17:22:29] bugzilla, we're talking about? [17:22:31] then yes [17:22:38] drdee: I can blow up some of the logs in /var/log, probably enough to let you in to cleanup.
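(The "out of inodes" diagnosis above can be reproduced without root access. A minimal Python sketch, assuming a Linux/POSIX host; `inode_usage` is a hypothetical helper name, and `statvfs` reports the same counters `df -i` shows:)

```python
import os

def inode_usage(path="/"):
    """Return (free, total) inode counts for the filesystem holding `path`."""
    st = os.statvfs(path)
    return st.f_ffree, st.f_files

free, total = inode_usage("/")
print("free inodes: %d / %d" % (free, total))
```

When `free` approaches zero, new files (and hence logins, which write session files) start failing even though `df -h` may still show plenty of byte-level space.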
[17:22:41] andrewbogott: Requests for enhancement ~ Bugzilla. [17:22:44] sure go ahead [17:22:45] andrewbogott: k [17:23:12] drdee: Try it now. [17:23:31] yup [17:23:34] ty again [17:23:34] You have 41 free inodes right now. Make room soon. :-) [17:23:50] Well, 37 now that you're logged in. :-) [17:25:12] another question: can you update https://rt.wikimedia.org/Ticket/Display.html?id=2038 [17:25:25] (i am just going through all the labs rt related tickets) [17:25:36] mark said that you would know [17:27:05] That half of the mail issue has indeed been sorted out for some time. [17:29:09] could you update the ticket and mention what's left to be done? [17:30:31] drdee: It's an entirely different problem set, and one which only has had recent clearance from labs. I'm not sure ticket necromancy is the right thing; there's a BZ to track it though (it's a user-facing thing): 58796. I'll update that with news. [17:30:38] s/labs/legal/ [17:31:45] cool [17:35:49] Hello. I need to automatically replace a template parameter name on a wiki page (i.e. {{cite web|title=foo}} to {{cite web|parameter=foo}}). Do you know an easy way to do this without using regular expressions over the entire page? i.e. a wikitext parser? [17:36:51] if you're using Python, http://pythonhosted.org/mwparserfromhell/ [17:40:16] yes sorry, I meant in python [17:40:23] I'm just trying mwparserfromhell [17:40:31] have you got any examples? the documentation is not very good [17:41:14] I haven't really used it much myself, you can wait for Earwig|away to come back, it's his software [17:42:10] Arnaugir: Do you want to change that on a WMF wiki? I think there are some bots who have experience doing something like that.
[17:42:21] scfc_de: yes, ca-wiki [17:42:43] the problem is, users, when translating from Spanish or English, don't translate the citation template params [17:42:52] then my idea is to use a continuous bot to do so for them [17:43:07] I tried with regex and succeeded but wiki code is too complex [17:43:39] there are too many situations so I need a parser, this one looks good. will try and ask Earwig|away later :) [17:43:45] I know, that's why I would try to have someone else do it :-). [17:44:06] I searched for bots in en-wiki which did the same task I'm looking for but didn't find any [17:44:23] if you do know some, please tell me! [17:45:32] I'm sorry, I don't have a link at hand. [17:52:00] scfc_de: thank you anyway :) [17:56:31] Coren: are you around? [17:56:52] Betacommand: Yes, albeit ridiculously busy. Anything really quick I can help with? :-) [17:58:26] Coren: can you turn off Filter 554? it's broken [17:58:49] Arnaugir: as scfc_de says, mwparserfromhell is what you want... I'm sure Earwig|away will help you better than I can [17:59:24] thanks. will wait for him. [17:59:32] Betacommand: On enwiki? [17:59:43] Arnaugir: to figure things out I often play with it in the Python interpreter, you're basically looking at calling filter_templates() after parsing, then iterating over the templates and over each template's "params" list [18:00:07] yes, that's what i'm trying to do. but I'm getting loads of errors (as always :P) [18:00:13] heh [18:00:37] you can also call template.has() to check if it has a given parameter (e.g. an English named one) [18:00:39] anyway I was able to retrieve the list of templates and iterate over them to get their params [18:00:51] it's a good step but I cannot move forward [18:01:00] Coren: yeah [18:01:38] Arnaugir: what are you using to parse templates? [18:02:10] code = mwparserfromhell.parse(text) [18:02:10] for template in code.filter_templates(): [18:03:14] Arnaugir: what kind of errors are you getting? [18:04:05] now none.
when I'm inside the loop: [18:04:05] if template.name.matches("ref-web"): [18:04:16] then in this case I want to replace "title" with "titol" [18:04:30] but I'm trying to figure out how to do so (I must confess I'm kind of a python-noob :P) [18:04:40] I've got a params list [18:05:05] and mwparserfromhell has its own replace function but not for lists [18:05:08] Arnaugir: template.params.replace('title', 'titol') ? [18:05:19] already tried and does not work [18:05:20] hmmm, let me test that [18:05:32] Nettrom: not what I would do [18:06:03] Betacommand: obviously, because it doesn't work :) [18:06:27] Arnaugir: what is an example page? [18:06:51] https://github.com/earwig/mwparserfromhell/ [18:08:31] fail [18:08:50] Arnaugir: what is an example of a wiki page that you are trying to change? [18:08:57] sorry [18:09:02] :) [18:09:19] https://ca.wikipedia.org/wiki/Usuari:Arnaugir/proves is my test page [18:09:28] there is some {{ref-web}} templates with "títol" param [18:09:32] which should be changed to "title" [18:09:40] sorry, the other way round [18:10:39] Arnaugir: I can iterate over params and replace the name by setting it [18:11:02] Arnaugir: for param in template.params: if param.name.matches('title'): param.name = u'titol' [18:11:25] with the caveat that Python might need some line breaks here [18:11:28] Nettrom: thanks, I will try that [18:11:30] yes, sure [18:11:42] now pywikibot stopped working - have to fix this first. so strange. [18:12:40] Arnaugir: define stopped working? [18:12:51] Arnaugir: Im a python programmer [18:13:18] Betacommand: when I'm trying to login.py with pywikibot, this happens: [18:13:59] Exception TypeError: TypeError("'NoneType' object is not callable",) in ignored [18:14:13] this must be because of mwparserfromhell because it always worked... [18:14:20] or I broke sth :( [18:14:55] Arnaugir: pastebin the fill traceback [18:17:12] fill or full? 
:P [18:18:10] *full [18:18:30] Betacommand: http://pastebin.com/kFLUf89K [18:18:46] I just used setup.py install [18:18:48] before that [18:19:14] but now I'm trying to login.py in pywikibot... and it's not working, before it was... [18:19:55] Oh boy, what have I screwed up now? I created a new service group, but when I try to "become" it I get: "sudo: sorry, a password is required to run sudo" [18:20:22] dschwen: logout and re-connect [18:21:17] Arnaugir: first step is to remove "core" and use compat, you will have far fewer headaches [18:22:30] yes, I had this in mind. [18:22:35] How to remove it - just remove files? [18:22:47] Betacommand: [18:23:17] will do this and will see whether it fixes it. I dunno why I installed core - probably because I was even noobier than now! [18:25:11] Arnaugir: http://stackoverflow.com/questions/16092273/how-can-i-uninstall-a-package-loaded-with-python-setup-py-install [18:25:36] Betacommand, I did, still not working. [18:26:27] dschwen: you completely exited SSH and reconnected? [18:26:32] Ughm [18:26:41] it was a shared connection, D'Oh [18:26:52] works now [18:29:06] (PS1) Diederik: Added test to check which version of Ubuntu is running. [labs/migration-assistant] - https://gerrit.wikimedia.org/r/110216 [18:29:31] (CR) Diederik: [C: 2 V: 2] "Ok." [labs/migration-assistant] - https://gerrit.wikimedia.org/r/110216 (owner: Diederik) [18:59:31] Betacommand: can you help me with pywikibot installation (compat)? I svn co the files, and created userconfig.py with no problems, only remaining thing is to change PYTHONPATH as per: https://www.mediawiki.org/wiki/Manual:Pywikibot/Installation#Shortcut_in_command_line [18:59:49] where should I put the .bashrc file? [19:02:46] well it looks like pywikibot is working, no need to do that [19:02:53] :) [19:04:30] Betacommand: Nettrom yay! succeeded!
https://ca.wikipedia.org/w/index.php?title=Usuari%3AArnaugir%2Fproves&diff=12886981&oldid=12886966 thank you very much, you were so helpful. [19:07:04] hi Arnaugir [19:07:10] hi scfc_de [19:07:11] sorry [19:07:14] pywikibot is in Git now, I think, not SVN [19:07:27] so if you've done an SVN co for the file I worry that you're getting a very old version [19:07:31] sumanah: well i did download by svn [19:08:04] looks like it's up to date... [19:08:27] OK! I will be quiet then :) [19:08:29] :) [19:08:41] thx for pointing that out, i will take a look [19:09:41] sure :) [19:14:41] Arnaugir: I'm pretty sure sumanah is right about pywikibot having moved to Git; there are links on https://www.mediawiki.org/wiki/Manual:Pywikibot ("This project in ..."). [19:14:55] scfc_de: thx. Will take a look [19:16:00] Arnaugir: if you will tell us where you saw directions that told you to do "svn co", we can update it [19:16:34] both https://www.mediawiki.org/wiki/Manual:Pywikibot/Installation and https://en.wikibooks.org/wiki/Pywikibot/Installation [19:16:50] hm. [19:17:06] so, is it possible to install via Git with TortoiseSVN, or should i drop TortoiseSVN and stick to git (never did that... maybe I'm talking nonsense) [19:17:25] * sumanah does not know, sorry [19:17:26] good luck! [19:17:53] scfc_de: now that I downloaded via SVN, can I change to Git? [19:19:30] Arnaugir: I don't use Windows, but on Linux I would delete the SVN copy (provided you didn't make any changes there) and start with a fresh Git clone. [19:19:41] thanls [19:19:43] thanks [19:22:45] scfc_de: my checkout url is from github. does this mean I'm already downloading from Git (via SVN, which I just discovered is possible)? [19:22:47] scfc_de, can you have a look at xtools again? It seems to be under some strain again. [19:22:57] According to users. But I'm not seeing much. [19:23:55] Arnaugir: BTW, there's a dedicated IRC channel for pywikibot, #pywikibot.
While there are bot operators in this channel who use it, #pywikibot has probably more experts. How did you download via SVN? Via GitHub.com? [19:24:29] scfc_de: yes github.com [19:24:35] thanks I didn't know about that channel [19:26:38] Arnaugir: I think then that they are equal, *but* if possible I would use a regular Git (or Gerrit) clone, because otherwise you'll probably end up confused when people refer to specific Git commits, etc. [19:26:59] thank you. [19:29:10] CP678: In access.log I'm seeing on average perhaps two requests per minute (!), so it's definitely not being hammered. [19:30:21] scfc_de, I thought so, but something is causing a 502 proxy error. [19:31:07] I was wondering if you could provide insight. [19:32:52] * Coren grumbles as a day full of meetings collides with the inevitable flurry of "wait, before you leave" requests. [19:35:16] Coren, ? [19:35:48] CP678: Travelling for the next two weeks or so. [19:35:56] oh. [19:36:27] Coren, before you leave, can you look at why xtools is throwing 502s? :p [19:37:28] CP678: That happens when your scripts return improper responses to the proxy. I.e.: it's broken. Check your error logs. [19:37:45] CP678: There are a number of requests in webproxy's access.log that show connections timing out to your webserver. On the latter's error.log, there is that "sockets disabled" business again (and some errors in PHPtemp.php). Could it be that you are opening a vast number of files per request? [19:38:22] Or DB connections, for that matter. [19:38:26] scfc_de: (also, sockets in general) [19:38:56] scfc_de, it shouldn't. It's been opening the same number of files that it always opens, same goes with DB connections, and sockets. [19:39:19] xtools shouldn't actually open any socket. [19:42:36] Coren, oh and if I restart the webserver everything works for a while, and then it hangs up and returns to 502 [19:42:58] CP678: Then clearly you are leaking file descriptors.
[19:43:44] CP678: Remember you are running as FastCGI -- it keeps the same PHP process running for speed. If you open files and do not close them all properly, you may well run out. [19:43:54] (Same goes for DB connections) [19:45:40] Coren, by opening files do you mean opening them to read and edit them or require_once? [19:46:18] CP678: PHP should handle its own file descriptors cleanly; I do mean things you open yourself (including database connections). [19:46:25] Coren: Sure? I don't comprehend xtools's sources fully, but I don't see any loop or such (I suspect FastCGI would need such a thing?). [19:48:29] scfc_de: That's run by lighty via /usr/bin/php-cgi PHP_FCGI_CHILDREN=4 [19:48:45] Coren: Do you know an easy way to see the number of fds a process consumes? [19:49:26] scfc_de: lsof can help, you can also look in /proc/<pid>/fd [19:49:51] scfc_de: That'll enumerate the FDs though, not count them (but then wc -l can count) [19:53:56] scfc_de: lsof -p <pid> will show you for any one process. [19:53:57] If I look at a sample, I see only fds 0-5 (=> normal), and only one DB connection. [19:54:04] hey coren: i want to give a public ip address to https://wikitech.wikimedia.org/wiki/Nova_Resource:I-00000a42.pmtpa.wmflabs but it fails? am I over my limit again? [19:54:41] drdee: Almost certainly. FYI, if it's for web access it's generally not useful to give a public IP. [19:54:51] drdee: Otherwise, I can up your quote. [19:54:54] quota* [19:54:59] it is :) [19:55:24] it is, for web access? [19:57:38] drdee: https://wikitech.wikimedia.org/wiki/Help:Proxy [19:57:58] wooo proxy! :D [19:58:07] drdee: use the proxy, you will also get https and spdy for free :P [19:58:39] on it [19:59:13] * Coren idly wonders if Yuvi stalks 'proxy' as a keyword. :-) [19:59:33] Coren: heh, no.
Limechat has this wonderful bottom pane that shows me all messages from all channels I'm not on [19:59:43] Coren: and my peripheral vision just picks up these things :P [20:02:29] If I do "ps auxfwww | sed -ne 's/^50570 \+\([0-9]\+\) .*$/\1/p;' | xargs -rl sudo lsof -p | fgrep -v ' mem ' | less" on webgrid-01 (50570 being local-xtools), I get about ~ 400 lines. That doesn't seem enough for server.max-fds which should be 1024 by default (trigger is 90 % => 921). [20:05:17] Coren: lsof complains about potentially incomplete data due to stale mounts /mnt/pagecounts & /home. But if the processes are newer than the fault, there shouldn't be any fds missing, or should there? [20:06:39] Even if they weren't it'd just be missing exact filenames. [20:06:50] server.max-connections is set to 20, but that shouldn't be an issue either. [20:07:12] But CP said that the problem only occurs after some time; perhaps you should wait until it occurs before looking. :-) [20:07:24] (If he recently restarted the webservice) [20:08:35] For example http://tools.wmflabs.org/xtools/pcount/index.php?name=Schnederpelz&lang=de&wiki=wikipedia times out for me now. What I have more trouble understanding is why no other tool's complaining. If it was a systematic failure, nothing should work. [20:25:18] If I "become xtools", and then "php index.php", I get "Access denied"?! require_once not working? But why does it time out then? [20:33:47] "Access denied" comes from /data/project/xtools/stats.php because I didn't set a User-Agent from the command line. [20:39:42] If I run xtools/pcount/index.php from the command line, I get six fds as well and it returns in 1.something seconds. [20:47:56] So I'm pretty certain that the error lies within xtools.
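(The fd-counting recipe discussed above — `ls /proc/<pid>/fd | wc -l`, or `lsof -p <pid>` — can also be done from Python. A minimal sketch, Linux-only since it reads /proc; `count_fds` is a hypothetical helper name:)

```python
import os

def count_fds(pid):
    """Count open file descriptors of a process by listing /proc/<pid>/fd.

    This is the same number `ls /proc/<pid>/fd | wc -l` would report,
    and roughly what `lsof -p <pid>` enumerates (minus mmap'ed entries).
    """
    return len(os.listdir("/proc/%d/fd" % pid))

# Demonstrate that an unclosed file (the suspected FastCGI leak pattern)
# shows up as one extra descriptor until it is closed again.
before = count_fds(os.getpid())
leaked = open("/dev/null")
assert count_fds(os.getpid()) == before + 1
leaked.close()
```

Run repeatedly against a long-lived worker (such as a php-cgi child), a steadily climbing count is the signature of the descriptor leak Coren suspects.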
[21:19:54] Earwig: I'm trying to install mwparserfromhell in wm labs [21:20:05] okay [21:20:07] I installed from svn [21:20:16] svn co https://github.com/earwig/mwparserfromhell mwparserfromhell [21:20:19] but now, how to install it? [21:20:33] i mean i tried setup.py install but it throws an error [21:20:38] what's the error? [21:21:14] write error [21:21:15] let me paste [21:24:22] Earwig: http://pastebin.com/SzSBLkU2 [21:24:54] right, so that's happening because it's trying to install to the system directory for everyone on the server, which you don't have access to [21:24:56] so there's two options [21:25:08] either you use a virtual environment, or python setup.py install --user [21:25:37] --user will make it install to a directory within your home [21:25:40] I think .local [21:27:18] okay I tried the second one and it worked. so Earwig now if I use "import mwparserfromhell" it should work, right? [21:27:20] will try... [21:27:26] thank you so much [21:28:13] yep, that should be right [21:32:32] Earwig: I still get the error when importing. I successfully installed mwparserfromhell in /home/arnaugir but my scripts are located in /data/project/arnaubot [21:32:43] probably that's the problem? [21:32:45] yes [21:32:51] you need to install it with --user in the project [21:32:56] good [21:33:10] but, for example wikitools I installed them in /home/arnaugir [21:33:14] and then changed PYTHONPATH [21:33:17] and it worked fine [21:34:00] I suppose you could do that... but either way it's best practice to put all code in the project so that if someone else takes over and your account disappears, it'll still work [21:34:56] thx for the advice. [21:35:53] Coren: my proxy still gives me a 504 bad gateway, should I wait longer? i have waited 1.5 hours [21:36:32] drdee: It shouldn't, I suppose. yuvipanda is the proxy guru and will notice my ping to help you. :-) [21:37:34] kool [21:38:47] drdee: it should be instant [21:38:55] nah [21:40:36] any tips?
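(Earwig's explanation of why the --user install works for one account but not another can be checked from Python itself. A sketch under stated assumptions: `site.getusersitepackages()` reports the per-user install directory, and the `/data/project/arnaubot/lib` path below is a hypothetical per-tool library directory, not an actual Tools convention:)

```python
import site
import sys

# `python setup.py install --user` (or `pip install --user`) puts packages
# under the per-user site directory, typically ~/.local/lib/pythonX.Y/site-packages.
print(site.getusersitepackages())

# That directory is only on sys.path for the user who installed into it;
# code running as a different account (e.g. a tool's service group) must
# install its own copy or extend the path, which is what PYTHONPATH does:
extra = "/data/project/arnaubot/lib"  # hypothetical per-tool library dir
if extra not in sys.path:
    sys.path.insert(0, extra)
```

This is why importing worked under /home/arnaugir but failed for scripts run from /data/project/arnaubot, and why keeping the library inside the project is the more robust choice.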
[21:42:36] yuvipanda: ^^ [21:43:32] drdee: what project is it? [21:43:57] drdee: and can you add me? [21:44:00] so I can check? [21:44:29] 1sec [21:45:12] done [21:45:16] instance 'rt' [21:45:21] group 'packaging' [21:45:31] looking [21:51:18] there are again some bots running on tools-login [21:51:29] load ~5 [22:17:16] scfc_de, I'm back. [22:17:28] Any luck figuring out what's going on with xtools? [22:24:22] Cyberpower678: It's apparently an error within xtools that occurs only with online access; I don't know why. I would start by putting a 'die("stop");' at the top of index.php, restarting the webservice, and then moving the die lower until the error occurs. [22:25:35] nothing in the error.log? [22:26:14] scfc_de, it helps to know what the error is. [22:28:42] The error is that it is apparently consuming fds. I don't know if this is related to the errors in /data/project/xtools/public_html/phptemp/PHPtemp.php. [22:29:20] scfc_de, This error has never been an issue before though. [22:31:37] Cyberpower678: I don't know. [22:56:45] yuvipanda: any news