[17:10:56] @regsearch .
[17:10:56] Results (found 48): puppet, instance, morebots, git, bang, nagios, bot, labs-home-wm, labs-nagios-wm, labs-morebots, gerrit-wm, wiki, labs, bastion, extension, wm-bot, projects, putty, gerrit, change, wikitech, revision, monitor, alert, password, unicorn, help, bz, os-change, instancelist, instance-json, leslie's-reset, damianz's-reset, amend, credentials, bug, queue, socks-proxy, sal, info, security, logging, ask, sudo, access, $realm, keys, $site,
[17:18:33] o.0
[17:18:41] My reset!?
[17:19:19] :)
[17:19:30] that's an emergency plan if you caused troubles...
[17:19:41] I have backup plan for everyone...
[17:19:42] :D
[17:20:01] * Damianz throws some peanut butter over petan and lets the dogs lick him to death
[17:20:43] Hmm should I go spend an hour climbing before my gf gets hope, it's kinda cold =/
[17:20:54] s/hope/home/
[17:21:07] jeremyb: still need help setting up an instance?
[17:21:16] I think that bots-2 is pretty idle
[17:27:53] !bug 33406
[17:27:53] N Nova Resource:I-000000bf‎; 20:00 . . (+672) . . 10.4.0.34 (Talk)‎
[17:28:18] :o
[17:28:24] !bugzilla
[17:28:32] !bug del
[17:28:32] Successfully removed bug
[17:28:38] o.0
[17:28:43] heh, ok
[17:28:44] !bug is https://bugzilla.wikimedia.org/show_bug.cgi?id=$1
[17:28:44] Key was added!
[17:28:47] i saw it existed
[17:28:53] thx
[17:28:53] now it does
[17:29:13] !bugzilla alias bug
[17:29:14] Successfully created
[17:29:19] !b alias bug
[17:29:19] Alias is already existing!
[17:29:22] !b
[17:29:22] https://bugzilla.wikimedia.org/$1
[17:29:24] could you enabled that one for #wikimedia-tech ?
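[Editor's note: the `!bug is https://…id=$1` exchange above stores an infobot key whose `$1` placeholder is filled in by the argument, so `!bug 33406` expands to the full Bugzilla URL. A minimal bash sketch of that substitution — the function name and logic are illustrative, not wm-bot's actual code:]

```shell
#!/bin/bash
# Illustrative sketch of infobot key expansion: the stored value for a key
# contains a literal "$1" placeholder that is replaced with the argument
# given after the trigger (e.g. "!bug 33406").
expand_key() {
  local template="$1" arg="$2"
  # Replace the literal "$1" placeholder in the stored value with the argument.
  echo "${template/\$1/$arg}"
}

expand_key 'https://bugzilla.wikimedia.org/show_bug.cgi?id=$1' 33406
# → https://bugzilla.wikimedia.org/show_bug.cgi?id=33406
```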
[17:29:30] sure
[17:29:36] thx
[17:29:37] actually I think you can do it too
[17:29:47] forgot syntax >)
[17:30:16] aah,ok, you just repeat it there, never mind
[17:30:28] i thought of another command that just enables a key for a certain channel
[17:30:37] of course this is fine:)
[17:30:45] it's possible to set up shared db
[17:30:54] this channel and -operation share same
[17:31:01] so now bug exist even there
[17:32:17] cool
[17:32:46] triggers that just output data when actively triggered should be fine in all
[17:33:17] no sure what you mean :o
[17:34:15] a bot alias that does just output stuff if a human uses it should be ok in all channels
[17:35:01] just bots that output stuff by themselves (recent changes, code submits..etc) might cause discussion, depending on the channel
[17:35:19] for the !bug command, it also fits into #mediawiki of course
[17:35:53] hi petan
[17:35:56] hey
[17:36:05] We just need bots that trigger bots that trigger bots that trigger bots in a never ending loop
[17:36:09] That makes for fun irc
[17:36:18] i was just wondering if anyone else was running supybot, we could combine efforts
[17:36:27] but i can just run my own too
[17:36:31] actually features of this bot can be configured per channel
[17:36:41] @infobot-off
[17:36:41] Infobot disabled
[17:36:43] !bot
[17:36:45] you see
[17:36:51] @infobot-on
[17:36:51] Infobot enabled
[17:36:57] same for rc changes etc
[17:37:02] cool
[17:37:19] by default all channels have separate db but it can be shared, and local admins can control what bot is supposed to do
[17:37:41] if they don't like something, they can turn it off
[17:38:12] I wrote a simple manual
[17:38:14] @help
[17:38:14] Type @commands for list of commands. This bot is running http://meta.wikimedia.org/wiki/WM-Bot version wikimedia bot v. 1.1.4 source code licensed under GPL and located in wikimedia svn
[17:38:52] Is wm-bot the same as tech or is tech running the un-maintaind version written in sick?
[17:39:02] it's the same irc user :)
[17:39:06] so same binary etc
[17:39:21] it's running all subsystem in own thread
[17:39:28] * threads
[17:39:43] so it's pretty fast even if it's just one binary
[17:39:53] I really need to re-write my irc bot but I have no desire to do it atm D:
[17:40:02] do you want an idea for another bot feature?
[17:40:09] sure
[17:40:11] that we actually had / have a ticket for
[17:40:28] !nagios down
[17:40:28] http://nagios.wmflabs.org/nagios3
[17:40:33] or similar
[17:40:39] what it should do
[17:40:52] add a scheduled down time to the Nagios host
[17:40:58] ahh
[17:41:05] so if you are working on it, when you do maintenance
[17:41:09] i was thinking to send a page
[17:41:16] you can let Nagios know
[17:41:20] o.0
[17:41:23] and dont get any Criticals
[17:41:24] right how it would work?
[17:41:35] it would need to communicate with nagios server
[17:41:37] Do you really store the dbs in html? Or is that just a dump from the configs
[17:41:43] dump
[17:41:59] it should do what you do via web-frontend, after logging into nagios with a browser, and then clicking "schedule downtime"
[17:42:00] there is a thread which dumps db every 10 seconds when there is a change
[17:42:12] yes.. ehm..let me look it up..holdon
[17:43:00] petan: Should my home directory on bots-cb show up if I log into bots-apache1?
[17:43:13] it doesn't work like that
[17:43:18] methecooldude: we changed it
[17:43:24] petan: http://old.nagios.org/developerinfo/externalcommands/commandinfo.php?command_id=118
[17:43:45] methecooldude: what is your ldap?
[17:43:49] petan: write the right stuff into nagios.cmd
[17:43:59] methecooldude: Look in the share dir
[17:44:28] petan / Damianz: Nothing to do with ClueBot... just wondering
[17:44:36] methecooldude: what is your ldap?
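[Editor's note: the link mutante pastes above is the `SCHEDULE_HOST_DOWNTIME` external command, and "write the right stuff into nagios.cmd" means appending a semicolon-delimited command line to Nagios's external command file. A hedged sketch of what such a line looks like — the host name, duration, and command-file path are assumptions for illustration:]

```shell
#!/bin/bash
# Sketch of the Nagios external-command format being discussed:
# [time] SCHEDULE_HOST_DOWNTIME;<host>;<start>;<end>;<fixed>;<trigger_id>;<duration>;<author>;<comment>
now=$(date +%s)
end=$((now + 3600))   # one hour of fixed downtime
cmd="[$now] SCHEDULE_HOST_DOWNTIME;search-test;$now;$end;1;0;3600;wm-bot;scheduled maintenance"
echo "$cmd"
# A bot running on (or able to ssh to) the Nagios host would append this to
# the command file, whose path varies by install, e.g.:
#   echo "$cmd" >> /var/lib/nagios3/rw/nagios.cmd
```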
[17:44:44] richs
[17:45:08] it's in /mnt/public_html/richs
[17:45:13] methecooldude: Though if you looked in the share you'd find ie
[17:45:15] s/ie/it
[17:45:19] Oh
[17:45:20] so you can set chmod 700 on your home
[17:45:35] to prevent folks without root to steal your pr0n :o
[17:46:04] mutante: I think it would need to run on nagios
[17:46:04] Is the homedir nfs mount with root squash?
[17:46:08] no
[17:46:43] mutante: I could make a bot there which would do that, of course, but if you want to use it in production someone would need to deploy it there
[17:47:40] Damianz: you can't use sudo to access nfs
[17:47:51] because of squash
[17:47:58] * no squash
[17:48:14] Couldn't you just remount it with squash anyway :P
[17:49:09] probably :)
[17:49:21] you could also ssh to bots-nfs and sudo su
[17:49:31] OrenBochman: hi, I heard you had troubles with set up of mediawiki
[17:50:00] petan: hello
[17:50:11] petan: yes
[17:50:20] petan: I have no experience with either labs or with setting things up in linux
[17:50:20] want me to help you?
[17:50:27] I can set it up for you
[17:50:37] what project it is?
[17:50:57] petan: I need any help you can give
[17:51:10] it is called Search
[17:51:31] can you add me to project? my name is there is Petrb
[17:51:47] add me as regular user, and create an instance
[17:52:02] or if you already created one, you don't have to
[17:52:58] OrenBochman: here https://labsconsole.wikimedia.org/wiki/Special:NovaInstance
[17:53:06] there is option to add people to project
[17:53:20] I'm there
[17:53:26] Unless you're me and have no nova access :P
[17:53:47] yes, but I am not there :)
[17:54:04] I've added you
[17:54:06] ok
[17:54:11] 01/02/2012 - 17:54:11 - Creating a home directory for petrb at /export/home/search/petrb
[17:54:40] my internet connection here was supposed to be upgraded to 30 MB but it gor downgraded instead ... so my connection is sporadic
[17:54:41] search-test?
[17:54:48] there is instance search-test
[17:54:55] should I install mediawiki there?
[17:54:56] that's what I set up
[17:55:07] it is supposed to allow testing
[17:55:08] @whoami
[17:55:08] You are trusted identified by name .*@wikipedia/.*
[17:55:08] yes
[17:55:12] 01/02/2012 - 17:55:11 - Updating keys for petrb
[17:55:19] ok
[17:55:31] petan: it _could_ execute a shell script on the nagios host via ssh.. but let me think about that again
[17:55:48] ah... right
[17:56:18] mutante: It could just post to the cgi script using a login that has access to do commands :P
[17:56:58] petan: it would help if you could also do an import simple english wikipedia
[17:57:24] Damianz: scripting the https connection and web user login with curl takes a bit more than executing via ssh, but whatever works
[17:57:29] unfortunatelly I don't think I am allowed to do that, you need to ask someone from operation, like Ryan
[17:57:47] I don't have much access over production stuff
[17:58:12] well the idea was to test something like that with the Nagios labs instance
[17:58:14] I could import articles, but not users
[17:58:25] before trying anything in production (nagios)
[17:58:28] I need just the articles
[17:58:38] mutante: I talked to Oren, but of course I can test it here :)
[17:58:44] OrenBochman: ok
[17:58:50] petan: ooh :) ok
[17:59:43] mutante: I think we should first create a proposal how it should work, if others agree it's a good way to use it I would create a code and test it on labs, then you could do whatever with that on production
[18:00:53] I don't know yet what all would be that thing allowed to do with production nagios, so I have no idea how to do that, but if it could ssh there or execute a command it would be easy
[18:01:02] petan: that's right, but has totally low priority.. we dont really need it in production :p
[18:01:30] petan: because there we prefer to see all alarms anyways
[18:01:33] I will add this into my own todo list :)
[18:01:34] petan: it
[18:01:47] petan: it"s actually more a feature for labs than production
[18:01:48] @whoami
[18:01:48] You are trusted identified by name .*@wikimedia/.*
[18:02:01] ah, ok
[18:02:08] petan: I was told you might be putting the current search config into puppet?
[18:02:13] in that case it's easy because bot we run already run there
[18:02:15] petan: because by definition in labs people will be working on stuff all the time, producing a lot of Nagios warnings, that can be ignored
[18:02:15] on nagios
[18:02:30] mutante: I can implement it here, sure
[18:02:57] petan: great. thanks. letting you help Oren for now:)
[18:03:06] actually I would like to make interface on labsconsole where people could change settings of nagios
[18:03:29] i like doing Nagios stuff too
[18:03:29] it already get data from api.php
[18:03:40] it download a list of instances using smw
[18:03:55] then it parse them and reload
[18:04:21] OrenBochman: could you also add me to Search? (jeremyb)
[18:05:53] OrenBochman: do you want to use svn head or stable medawiki?
[18:06:08] if stable, which version or which revision
[18:06:12] or wmf1.18.1 ?
[18:06:13] petan: which settings? we can do that via puppet classes and puppet variables, and people can just use the instance config dialog to pick classes and set variables
[18:06:20] or whatever it is
[18:06:24] mutante: not really
[18:06:30] Ryan want to change puppet on labs
[18:06:34] so it would not be possible there
[18:06:38] oh..ok
[18:06:46] it works on production only
[18:07:05] I think he talked about it yesterday it's in logs
[18:07:12] ok, i should check
[18:07:30] bbl though, need to get some food as well
[18:07:35] @search ssh
[18:07:36] Results (found 2): bastion, socks-proxy,
[18:07:38] Gah puppet reminds me I need to go figure out cfengine access restirctions per class.
[18:07:41] !socks-proxy
[18:07:41] see https://labsconsole.wikimedia.org/wiki/Access#Accessing_public_and_private_instances ssh @bastion.wmflabs.org -D
[18:08:03] petan: I'm sorry to put lots of my requirments on you - but could you also install java + tomcat
[18:08:04] sure
[18:08:12] OrenBochman: of course
[18:08:36] jeremyb: what user name ?
[18:09:02] OrenBochman: jeremyb
[18:09:11] hmmm - ok done
[18:09:11] 01/02/2012 - 18:09:11 - Creating a home directory for jeremyb at /export/home/search/jeremyb
[18:09:22] wow that's a fast response (from the bot)
[18:09:22] !log search installed mysql + tomcat + java
[18:09:24] Logged the message, Master
[18:09:56] OrenBochman: do you want the same version of stuff as in production?
[18:10:11] 01/02/2012 - 18:10:11 - Updating keys for jeremyb
[18:11:04] yes
[18:11:40] jeremyb: will you install it or I should do it
[18:11:57] looks like 1.18wmf1 then?
[18:12:31] petan: i'll be back on around 20:00-20:30 UTC. running out the door in 2 secs. up to you
[18:15:03] I will see, depends on my free time :)
[18:15:47] ok, well i'll be screened here (irssi) so if you say something about it i'll see later
[18:16:33] petan: also we should hook up with notpeter sometime but i think he's afk now. (he was doing puppetizing for search)
[18:16:39] * jeremyb detaches
[18:16:52] ok
[18:17:03] jeremyb: this channel is logged you don't need to stay
[18:19:08] OrenBochman: I think you need to reconfigure firewall
[18:19:35] OrenBochman: https://labsconsole.wikimedia.org/wiki/Special:NovaSecurityGroup
[18:19:39] click add rule
[18:20:16] petan: what is your local time
[18:21:08] 19:21
[18:22:18] OrenBochman: from port 80 to port 80 protocol tcp range 10.4.0.0/24
[18:22:30] insert that
[18:22:52] we need it so that we can access it from outside
[18:23:05] hi Roan
[18:23:33] OrenBochman: when you finish that, tell me
[18:23:41] just a sec
[18:24:49] what about security groups ?
[18:25:02] just click add
[18:25:08] OrenBochman: from port 80 to port 80 protocol tcp range 10.4.0.0/24
[18:25:35] ok
[18:25:39] done
[18:26:08] nice
[18:26:42] RECOVERY HTTP is now: OK on search-test search-test output: HTTP OK: HTTP/1.1 200 OK - 453 bytes in 0.004 second response time
[18:27:17] * Damianz paws labs-morebots
[18:27:21] petan: I may have to go in a few minute - will you be around tommorow or in about 3 hours ?
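[Editor's note: the `!socks-proxy` bot reply above abbreviates the access command (`ssh @bastion.wmflabs.org -D`). A hedged expansion as an `~/.ssh/config` fragment — the username and local port are placeholders, not from the log:]

```
# ~/.ssh/config fragment (illustrative; <username> is a placeholder)
Host bastion.wmflabs.org
    User <username>
    # Open a SOCKS proxy on localhost:1080; pointing a browser's SOCKS5
    # proxy setting there reaches instances behind the bastion.
    DynamicForward 1080
```

The equivalent one-off command would be `ssh -D 1080 <username>@bastion.wmflabs.org`.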
[18:27:29] maybe
[18:27:33] it's almost finished anyway
[18:27:41] wonderful
[18:27:49] * Damianz wonders where in the intertubes petan actually resides timezone wise
[18:27:49] I think it will be running in 10 minutes
[18:28:08] I have a couple of questions
[18:28:15] I will keep a text file in /mnt/ where it would be described how is it configured with passwords etc
[18:28:32] so you can easily start using it
[18:29:03] is there a way I can access all the wikipedia dumps files without getting a local copy
[18:29:14] dumps not, db yes
[18:29:19] but you need access to tool server
[18:29:42] which is outside of labs
[18:29:55] they must be stashed somewhere in the cluster
[18:30:01] Ryan plans to insert access to production db but it will take few months I guess
[18:30:20] probably yes, maybe ask mutante or someone with access to cluster
[18:30:27] labs are separate network
[18:30:45] but downloading dump to your instance is a matter of few minutes
[18:30:52] it's on same network as wikipedia
[18:46:24] OrenBochman: around?
[18:49:15] OrenBochman: it's done
[19:01:15] hm someone around can help me install 1.18wmf
[19:01:24] it stuck on creating tables
[19:07:54] nvm
[19:23:15] !log search deployed 1.18wmf to /var/www/w
[19:23:16] Logged the message, Master
[19:24:24] !log search extracting simple wiki to /tmp
[19:24:25] Logged the message, Master
[19:25:39] hm...
[19:44:09] !monitor
[19:44:09] http://nagios.wmflabs.org/cgi-bin/nagios3/status.cgi?host=$1
[20:39:12] PROBLEM Disk Space is now: CRITICAL on search-test search-test output: DISK CRITICAL - free space: / 0 MB (0% inode=89%):
[20:39:43] meh
[20:44:08] lol
[20:44:12] RECOVERY Disk Space is now: OK on search-test search-test output: DISK OK
[21:02:12] PROBLEM Disk Space is now: WARNING on search-test search-test output: DISK WARNING - free space: /mnt 1127 MB (5% inode=99%):
[21:17:31] yay
[21:39:04] jeremyb: here?
[21:52:17] re
[21:52:58] OrenBochman: it's almost done
[21:53:12] you should be able to connect there
[21:53:31] now it's running import of simple wiki db it will run for few hours
[21:56:41] great
[21:56:48] can you connect there?
[21:56:56] files are in /var/www/w
[21:57:06] db in /mnt/db_files
[21:57:45] trying to connect
[21:57:58] I just added a tunnel to my ssh
[21:58:15] right
[21:59:09] if possible could you send me a history of what you did
[21:59:26] !sal
[21:59:26] https://labsconsole.wikimedia.org/wiki/Server_Admin_Log see it and you will know all you need
[21:59:32] there is a log
[21:59:51] I deployed mediawiki and now it's running import of db
[21:59:59] before that I installed mysql and configured it
[22:02:42] can you open it?
[22:02:49] no
[22:02:56] I get a connection reset
[22:02:56] I will install some more extensions too
[22:02:59] ah
[22:03:23] I need the lucene search extension and the connector
[22:03:28] ok
[22:04:54] import will probably finish tommorow until then site wouldn't work properly
[22:05:23] tell me once an instance has been set up we take a snapshot and save it ?
[22:05:39] I can create a backup
[22:05:43] so that db would stay same
[22:06:39] petan: are you a developer ?
[22:06:47] yes
[22:07:25] you did work on antivandalism ?
[22:07:32] I still do
[22:07:58] there is some info on my user page on en
[22:08:06] !wiki User:Petrb
[22:08:06] http://en.wikipedia.org/wiki/User:Petrb
[22:08:08] were you in this year's wikimania ?
[22:08:12] no
[22:08:49] it's amazing how slow my internet access is
[22:08:59] irc is the only thing that moves
[22:09:06] :P
[22:10:04] what languages do you develop with maninly ?
[22:13:29] you don't ?
[22:13:47] c
[22:13:54] c#, c++... sometimes php
[22:13:55] is huggle VB ?
[22:14:01] devel not
[22:14:05] it used to be
[22:14:38] so you ported to c# ?
[22:14:42] yes
[22:15:49] I'm asking because I'd like to consult on integration ....
[22:15:55] ah
[22:16:57] one of the active bugs in search is that it olnly indexes current version
[22:17:10] actually I am importing only latest revisions
[22:17:18] because you created only 20gb instance
[22:17:23] full db needs more
[22:17:38] how useful would it be from your point of view to index older version
[22:17:55] to search for deleted stuff?
[22:18:06] maybe it could be useful but it would slow it down
[22:20:17] * Damianz points sphinx at OrenBochman and views his contents
[22:30:00] I can access www on search
[22:30:11] what dir should i use
[22:30:17] uh
[22:30:20] you see a wiki?
[22:30:54] OrenBochman: mediawiki is in /var/www
[22:32:42] you see a wiki?
[22:33:01] I see a dir w/
[22:33:08] can you open it in browser now?
[22:33:25] I maped localhost:8081 to search-wiki:80
[22:33:28] sure
[22:34:30] you can create an account there
[22:34:43] let me know when you do that
[22:34:56] password for db is located in LS.php
[22:35:35] if you just sent a url I missed it
[22:35:47] my connection get's reset
[22:35:48] OrenBochman_: open a wiki
[22:35:51] does it work?
[22:36:15] if so, register so that I can set it up
[22:42:10] I've registerd
[22:42:13] I've registered
[22:42:36] ok
[22:42:56] this import will run maybe several days but I think you can start using it
[22:43:11] thing is the site keeps throwing me to port 8060
[22:43:19] any ideas about that
[22:43:20] oH
[22:43:21] wait
[22:43:59] fixed
[22:44:40] OrenBochman: try
[22:44:49] :o
[22:45:08] ok better
[22:45:36] search works
[22:45:37] :)
[22:45:45] need to install citation extentiontoo
[22:45:50] it is
[22:45:54] but It's great
[22:45:58] Cite is there
[22:48:34] Stonehenge has a cite problem
[22:49:10] not sure what it's about
[22:50:22] I guess the template {{reflist}} isn't imported yet
[22:50:40] ok I'll try again in the morning
[23:03:09] where is tomcat located ?
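[Editor's note: "I maped localhost:8081 to search-wiki:80" above describes an SSH local port forward. A hedged `~/.ssh/config` sketch of that mapping — the username is a placeholder, and the instance name is assumed to be the `search-test` instance from the log:]

```
# ~/.ssh/config fragment (illustrative; <username> is a placeholder,
# instance name assumed from the log)
Host bastion.wmflabs.org
    User <username>
    # Browse http://localhost:8081/ to reach port 80 on the instance.
    LocalForward 8081 search-test:80
```

The equivalent one-off command would be `ssh -L 8081:search-test:80 <username>@bastion.wmflabs.org`; additional `LocalForward` lines can map further ports the same way.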
[23:03:46] dunno
[23:04:14] usr/share/tomcat6
[23:04:21] yep
[23:04:23] thanks
[23:06:17] any idea what port it is running on ?
[23:08:34] !google
[23:08:41] damn
[23:08:57] its 8080
[23:09:00] yes
[23:09:02] it's working
[23:09:11] but I need to map that too
[23:09:17] that's not hard
[23:09:25] sorry for bugging you :-)
[23:09:29] np :)
[23:09:57] are you ever in Budapest ?
[23:11:10] that's close
[23:11:39] if there was a dev meeting I would definitely come there
[23:12:49] not many dev people there
[23:12:58] I was at thier chapter meeting last month
[23:13:47] @RC+ mediawiki Wikimedia_Labs/Create_a_bot_running_infrastructure
[23:13:48] Inserted new item to feed of changes
[23:14:07] @RC+ mediawiki Wikimedia_Labs/status
[23:14:07] Inserted new item to feed of changes
[23:21:16] petan: did you do all the work using the console
[23:21:25] terminal
[23:21:33] nice
[23:21:55] that's how I work heh :)
[23:21:57] can I connect to it with a xtermnal
[23:21:58] I am linux user
[23:22:09] xterm? what for...
[23:22:27] good point
[23:22:39] so I don't freak out so much
[23:23:00] I guess it a server machine ...
[23:23:03] depends what you need to do, but I think terminal offer all possibilities as xterm
[23:23:47] I remeber what editor I used to like not vm and not emacs
[23:23:59] nor vi nor vim
[23:24:03] nano?
[23:24:05] mcedit
[23:24:08] could be
[23:24:14] Eww
[23:24:28] vim > vi > emacs > pico > nano
[23:24:32] sure
[23:24:41] I have to work with vi :D
[23:24:48] ofcourse
[23:24:49] we don't have vim on production in work
[23:24:55] Same
[23:24:58] heh
[23:25:11] Vim on staff/internal servers and desktops, vi on production boxes :(
[23:25:13] I love macros
[23:25:19] I used to know vi for about a month
[23:25:23] My vimrc saves a shed load of time.
[23:25:26] vi is great
[23:25:33] vim is even better :D
[23:27:43] vim is like YAY vi is like wooow
[23:29:19] heh, I need to check some machines brb
[23:29:43] search-test is ubuntu?
[23:29:51] Yeah, everything is.
[23:30:06] Or I think it is anyway :P
[23:30:35] * Damianz should go work on his python apt grabby thing
[23:31:43] ok, everything is broken heh
[23:31:53] LIFE IS BORKED?
[23:32:01] system doesn't like a change of year
[23:32:04] * Damianz accuses petan of beliving 2012 is when the world ends
[23:32:43] :D
[23:33:31] seriously, it did a lot of troubles yesterday
[23:33:40] we are recovering till now
[23:33:42] :D
[23:34:37] It's funny trying to debug the issues that occur once a year :D
[23:34:47] that's why no one fixed them
[23:34:59] always some workaround and who cares...
[23:35:40] devs won't fix stuff which doesn't crash more than 10 times a day
[23:36:19] folks from operation like me are there to restart it :D and we are cheaper
[23:36:37] :o
[23:36:56] yay
[23:37:14] * petan goes to fix something now
[23:58:32] PROBLEM Free ram is now: CRITICAL on bots-cb bots-cb output: Critical: 1% free memory
[23:58:47] Damianz: ^
[23:59:30] 40 mb free