[02:41:34] just installed vagrant, how come it is returning 404 errors when loading file_get_contents()? [02:45:18] can someone explain to me the sense behind installing an entire virtualization system just to test/develop mediawiki, versus just using php's built in webserver? [02:45:39] because I'm guessing the only real reason people want to use vagrant is because they're afraid of apache [02:46:20] ...and if they're afraid of apache, they probably can't deal with php, and as such they probably can't develop for mediawiki [02:46:59] oh [02:47:08] idk i was told vagrant would help me [02:47:18] what's your problem? [02:47:21] maybe I can help [02:47:37] well i did vagrant up and vagrant ssh [02:47:47] it seems i cant locate any files [02:48:01] Are you just trying to load a page? [02:48:01] haha I mean, what's the task you're actually trying to get done. vagrant is another level of abstraction [02:48:06] Or are you doing development work? [02:48:13] dev work [02:48:20] Are you working on a script? Can you pastebin it? [02:48:46] andrew: Setting up a dev environment can be tricky. ;-) [02:48:53] Gloria: do you remember https://bugzilla.wikimedia.org/60135 [02:48:54] andrew: Apache is one of the easier parts, I agree. [02:49:09] Gloria: I know, I want to find out if he's working in an environment where it's not so tricky though ;) [02:49:49] eh, apache is a bit of a nightmare. but nothing compared to php [02:50:04] Apache setup isn't usually particularly difficult. [02:50:13] But emulating Wikipedia, which is usually what people really want, is a bit trickier. [02:50:18] well, to learn it thoroughly, for when weird things come up, is [02:50:24] like url rewriting etc [02:50:38] Right, so even that can be set up in Vagrant. :-) [02:50:40] AIUI. [02:50:47] not an acronym I recognize? [02:50:52] As I understand it. [02:50:54] ah [02:50:57] I haven't actually used Vagrant still. 
[02:51:00] :D [02:51:02] xD [02:51:29] Setting up MediaWiki alone isn't too difficult, but MediaWiki-Vagrant has some pretty neat pieces, I believe. [02:51:35] Including Puppet configuration, etc. [02:51:48] Anything to make testing and contributing changes easier... :-) [02:53:17] well, the only reason I would find virtualization useful, is if I had an unstable environment, where php and apache versions keep changing. the stability of virtualization is nice [02:53:31] but if all he needs to get started is to just extract mediawiki in /var/www.. [02:53:43] or if he's a windows user [02:53:48] Right. [02:54:01] because actually then hooking up this virtualized system to your dev environment/editor...more stuff to configure [02:55:12] Gloria: are you a regular contributor btw? I'm quite new here [02:55:42] Withoutaname: what exactly are you trying to do -- just get a dev environment set up? can we have more information, such as the system you use? [02:55:53] andrew: Yeah, I'm a regular. [02:56:01] cool :) [02:56:12] yes actually im just going to restart vagrant completely [02:56:18] it doesn't seem to see any shared files [02:56:30] Withoutaname: haha one step at a time. what system are you running? [02:56:35] win7 [02:57:32] Gloria [02:57:36] Ori! [02:57:44] Withoutaname: have you ever worked with virtualbox/vagrant/ssh before? [02:57:58] no [02:58:06] ori: I need you or Lego to un-fuck checker's home dir on tool-labs-login-hello. [02:58:12] sDrewth is beating me up about it. [02:58:15] i was told this was the easiest way to test my scripts [02:58:28] Hi Withoutaname. [02:58:33] Ohai Gloria. [02:58:33] hi [02:58:59] Withoutaname: k, can you tell me the steps you've taken to set it up so far? [02:59:15] downloaded everythin [02:59:51] so you've downloaded and installed Git, virtualbox, and vagrant? [03:00:05] yeah [03:00:14] Gloria: what's wrong with it? [03:00:21] Gloria: I submitted a patch for bug 4, who can test it? 
[03:00:23] i did vagrant destroy on the old one anyway [03:00:26] ori: "git status" is dirty. [03:00:35] dirty how? [03:00:44] Like someone moved files around. [03:00:50] Withoutaname: did you then open up the terminal, go to a directory of your choosing, and run the git commands at https://www.mediawiki.org/wiki/Mediawiki-vagrant ? [03:00:55] Why is EasyTimeline in Perl? [03:01:10] pir^2: :D [03:01:12] ori: http://p.defau.lt/?Jq3ZZaY8bF3GZMYj5KjcEQ [03:01:16] yeah, but it wasn't showing the files from my vagrant directory [03:01:16] Withoutaname: what version of vagrant? [03:01:21] andrew: Hi! Do I know you? [03:01:22] Gloria lies, I said that the beatings would stop. ;-) [03:01:24] you can find out on 1.6.1 [03:01:28] err, vagrant -v [03:01:40] pir^2: nah, I just smile whenever anyone finds something written in perl in a modern project [03:01:55] it's a perl script which is invoked from PHP [03:02:10] pir^2: mystery wrapped in an engima? [03:02:13] pir^2: Bug 4? [03:02:18] Gloria: yes, bug 4 [03:02:24] That timeline one? [03:02:26] Let me look. [03:02:32] I did some basic IPC [03:02:33] God that's such ancient magic. [03:02:48] Gloria: well, i live-hacked the code [03:02:51] you should commit it [03:02:51] pir^2: you should take a look at the existing html to mediawiki converters if you want to see some truly chaotic php invoking perl :) [03:03:06] ori: I don't know what's what. It all got changed when Lego moved it. And now you've changed it. [03:03:09] andrew: example? [03:03:11] Withoutaname: can you pastebin the output of the shell commands when you run them? [03:03:37] And there's a testing directory. [03:03:38] how to do that? [03:03:43] And I have no idea what that is. [03:03:44] I'd hate love to see that :P [03:03:59] I can just blow everything away. [03:04:12] But that seemed a bit silly when I can just pester one of you to set it straight. 
[03:04:29] just let me restart vagrant [03:04:32] pir^2: http://www.donationcoder.com/forum/index.php?topic=22715.msg205274#msg205274 -- and don't worry, I've made my own in php that actually works, only takes 6 minutes to convert a ~800pg site, and isn't a nightmare. I'm spending a bit more weeks refining it before I send it out to the world though [03:05:51] Fuck me, GitHub is down? [03:06:00] Withoutaname: please, let's just go a single step at a time. can you open a terminal, make a new directory, enter it, and run the git commands on step 4 of https://www.mediawiki.org/wiki/Mediawiki-vagrant -- then paste the output at pastebin.com [03:06:02] Gloria: yep :D [03:07:35] My browser crashed. [03:07:38] andrew, how do i copy the output from git bash? [03:08:02] Gloria: you use chrome? ;p [03:08:22] GitHub is down :/ [03:08:23] Withoutaname: uh I haven't really used windows in a while, but I'm pretty sure you can select, right click, and hit copy [03:08:26] I overwhelmed Chrome to the point that I'm afraid to re-open it. [03:08:32] pir^2: Yes. [03:08:39] So I was using Safari, which just crashed. [03:08:45] So I guess I'll go hide out in Firefox. [03:08:49] Gloria: hey that's a good argument for virtualization right there, yeah? multiple chrome instances :D [03:09:31] pir^2: please tell me you're not actually trying to understand that script :P [03:09:44] I'm running Firefox 26. [03:09:49] looking at the code [03:09:51] I should probably upgrade. [03:09:52] * pir^2 uses FF 28 [03:10:01] I hear 29 is like Chrome finally. [03:10:06] debian is stuck on 24 =\ [03:10:11] it's molasses [03:10:59] andrew: I can't copy output from a git bash terminal... [03:11:16] there's no right click or ctrl+c [03:11:24] Withoutaname: can you just take a screenshot and put it on imgur xD [03:11:54] wow, that's confusing [03:12:09] Withoutaname: ah I think I get it. 
you're running those commands within vagrant, I need you to run them within windows cmd [03:12:40] andrew: I did vagrant up and now it's running its hour-long update [03:12:47] pir^2: it's just poorly written regex ontop of more regex, which calls perl, with more...regex [03:12:52] so we'll just have to wait and see [03:12:56] Gloria: committed locally [03:13:02] Gloria: can you push to github from tools? [03:13:17] or scp -r it? [03:13:18] andrew: this is the one I had to hack to fix bug 4 (a really old one...) http://git.wikimedia.org/blob/mediawiki%2Fextensions%2Ftimeline.git/HEAD/EasyTimeline.pl -- it's just a bunch of regexes [03:13:18] Of course not. [03:13:30] I don't have a private key on Labs. [03:13:42] Withoutaname: that's not necessary to get you a working environment. and you can run multiple vagrant instances, so while that screwed up one is doing its thing, we can just set up a proper one [03:13:44] can you add me to the repo for a moment? [03:13:47] i'll push it myself [03:13:50] Gloria: you could do ssh -A though [03:14:02] ori: I tried a few minutes ago... GitHub is down. [03:14:08] And then my browser crashed. [03:14:20] your git was dirty [03:14:29] It was. [03:14:32] Looks much better now. [03:14:40] pir^2: I'm getting an 'internal error' at that link. and I don't have anything against regex, just bad regex [03:14:42] > Author: tools.checker [03:14:45] Cute. [03:14:51] pmtpa.wmflabs [03:14:58] Too cute. [03:15:03] nonsense, tampa was shut down years ago [03:15:13] andrew: how about http://git.wikimedia.org/raw/mediawiki%2Fextensions%2Ftimeline.git/HEAD/EasyTimeline.pl (warning: huge)? [03:15:23] andrew: ok thanks. so do you want me to repeat step 4 using cmd.exe [03:15:45] Withoutaname: yes, but first mkdir newvagrant [03:15:51] cd newvagrant [03:15:55] and then run those commands [03:16:11] ori: https://github.com/legoktm/checker # I can't add you as a collaborator. 
[03:16:27] i'll submit a pull request [03:16:32] <3 [03:16:41] Thanks for taking care of that. [03:16:45] Interesting, that script seems to support a number of wiki engines (e.g. DokuWiki) not just MW.. [03:16:51] (or at least part of it) [03:17:04] pir^2: nope, not sure why it's erroring out [03:17:06] Now just need to figure out why http://tools.wmflabs.org/checker fails miserably. [03:17:23] http://tools.wmflabs.org/checker/ works, so meh. [03:17:36] pir^2: wait, is this not a web interface, but an actual gi--oh gosh, my fault [03:17:41] ? [03:17:54] pir^2: this is not accessible via a browser, is it? [03:18:26] nvm [03:18:31] sorry haha [03:18:45] I'll get it [03:19:06] if ($year < 1800) { &Error2("Function 'DaysFrom1800' expects year >= 1800, not '$year'."); return; } [03:19:10] wtf? [03:19:22] pir^2: https://git.wikimedia.org/tree/mediawiki%2Fextensions%2Ftimeline this works for whatever reason [03:19:58] ori, still around? [03:20:05] andrew: are you new here? :) [03:21:03] andrew: so uh, git commands don't work in cmd.ex [03:21:48] Withoutaname: are you trying to install MW-vagrant? [03:21:56] yeah [03:22:07] pir^2: yep. and oh dear, that entire thing looks stressful. and what's wrong with that function you pasted? it makes sense :P [03:22:32] Withoutaname: please pastebin or imgur a screenshot of what you're seeing [03:23:34] andrew: that's probably why bug #4 was unsolved for so long (hint: reported in 2004) [03:23:58] Krenair: hey [03:24:19] bug 5 is similar :/ [03:24:20] pir^2: I'm not sure how to find bug reports in this system yet [03:24:25] https://bugzilla.wikimedia.org/show_bug.cgi?id=4 [03:24:26] ori, earlier on I noticed rcstream.wmflabs.org is now outputting example data rather than a live stream from deployment-prep wikis [03:24:54] Krenair: it's streaming changes as well as sample data [03:25:03] Withoutaname: can you give a screenshot? 
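[editor's note: the `DaysFrom1800` snippet quoted above is from the Perl `EasyTimeline.pl` script under discussion. A rough Python analogue of what that helper computes — days elapsed since 1800-01-01, rejecting earlier years just as the quoted guard does — might look like this (hypothetical translation, not the actual extension code):]

```python
import datetime

def days_from_1800(year, month, day):
    """Rough Python analogue of EasyTimeline's DaysFrom1800 helper:
    days elapsed since 1800-01-01. The Perl original (quoted in the
    log) rejects years before 1800, so this does too."""
    if year < 1800:
        raise ValueError("Function 'DaysFrom1800' expects year >= 1800, not %r" % year)
    return (datetime.date(year, month, day) - datetime.date(1800, 1, 1)).days
```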
[03:25:05] Krenair: there aren't enough edits on the beta cluster [03:25:12] ah, ok [03:25:26] Krenair: i think the fake data is slightly off, too [03:25:35] i'd like to change it to poll the RC API so that it's a lifelike simulation [03:25:57] pir^2: eh, if the time you invest in it starts to exceed what it would take to just rewrite it in modern form.. [03:26:08] pir^2: is it an important extension? [03:26:10] ori: You should have it poll irc.wikimedia.org. [03:26:29] andrew: kind of, it's used on a lot of pages [03:26:35] I don't have time to rewrite it [03:26:57] pir^2: there's a *possibility* I might be interested in such a project after I've committed my html-wiki this month [03:27:12] trying to build up my portfolio for some interviews in june [03:27:45] Gloria: a snake eating its own tail? [03:27:52] this one's a mess; there are a lot nicer things to do [03:28:00] ori, did you see my post on wikitech about that? [03:28:39] ori: Inception. [03:29:11] pir^2: well, I won't let myself think about it until I've gotten my other work through [03:29:23] Krenair: what is rcstream.wmflabs.org? Does it have to do with that RfC? [03:29:25] Krenair: yes, very nice -- thanks. I was actually just in the process of replying. I started some docs at https://www.mediawiki.org/wiki/RCStream [03:29:39] * pir^2 looks [03:29:51] pir^2: http://imgur.com/H5rPaAq [03:30:02] ^ andrew [03:30:06] ori, I noticed after I wrote it that the library I used is for an old version of python [03:30:17] and doesn't actually work with python 3 [03:30:30] Everyone still uses 2.7 anyway. [03:30:38] Or 2.8? Whichever is most recent 2.x. [03:30:45] There will be no 2.8. [03:30:52] 2.7.3? 
[03:31:01] See PEP 404 [03:31:07] "Guido recently felt he needed to re-empathize that there will be no Python 2.8" [03:31:25] Krenair: relevant numbering [03:31:26] *emphasize [03:31:33] actually, while I'm here--is it possible that someone could save me a *lot* of time, and give me a hint as to bypass the main page of the wiki, when index.php is called? Is that within the routing system I should be looking? goal: my own front page with different structure and css than the rest of the wiki [03:32:08] I could make a index.html that then calls index.php instead...but that sounds like a bad idea [03:32:55] Withoutaname: you're inside vagrant, not windows shell [03:33:16] So I started looking into porting it [03:33:28] andrew: What do you mean bypass? [03:33:28] andrew: im logged out now, in the git bash shell [03:33:45] andrew: By default, the main page is "Main Page". This can be customized. [03:34:12] Gloria: the goal is to have people go to costumes.org, for example, and be presented with a webpage that is not actually the wiki. and then it has links to various wiki categories, that then open the wiki [03:34:35] andrew: Right, by default that's how your Web server probably acts. [03:34:46] Krenair: oh, fun [03:34:46] Krenair: it uses coroutines which are cool [03:34:47] Unless you did something silly like put your files in the root directory. [03:34:54] Did you put MediaWiki inside /? [03:35:02] Gloria: yes, but it'll call index.php, which runs mediawiki. so I need a routing rule I suppose that sends it to the html file I wish [03:35:15] Gloria: nah it's a virtual server in apache with the wiki as the root directory [03:35:28] (that domain, calls that directory as its root) [03:35:31] Setting a different index file is usually trivial. [03:36:05] Withoutaname: http://windows.microsoft.com/en-US/windows-vista/Open-a-Command-Prompt-window [03:36:12] Generally you don't want index.php to live at example.com/index.php. 
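[editor's note: the PEP 404 point above — there will never be a Python 2.8, so 2.7 is the end of the 2.x line — is why portability checks gate on the major version rather than enumerating 2.x releases:]

```python
import sys

# Per PEP 404, 2.7 is the final 2.x series, so testing the major
# version is sufficient to distinguish the two lines.
PY3 = sys.version_info[0] >= 3
```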
[03:36:45] andrew: git commands don't work in my cmd [03:37:18] So far I can get it to connect to a server using python 3 [03:38:02] Gloria: how is it usually done? [03:38:17] andrew: example.com/w/index.php [03:38:22] Withoutaname: but are you in the cmd? we can fix the git issue next. I need to confirm where you are [03:38:27] That's what wikipedia.org uses, anyway. [03:38:29] Krenair: cool, let me know if you get stuck -- would love to help [03:38:34] !root [03:38:35] There is no such key, you probably want to try: !nullpath, !rewriteproblems, [03:38:35] andrew: what git command is he trying to run? [03:38:38] !nullpath [03:38:38] Don't use the example.com/Page_title URL scheme. It isn't supported by developers and WILL break in ways you can't foresee (and if it doesn't, it might break in the future). Some examples can be found at http://www.mediawiki.org/wiki/Manual:Wiki_in_site_root_directory#Reasons_why_putting_wiki_pages_in_the_root_directory_of_the_web_site_is_bad [03:38:59] !foo [03:38:59] For changing the page footer, see the FAQ (!) . More information at http://www.mediawiki.org/wiki/Footer and http://www.mediawiki.org/wiki/Manual:Skinning#Footer [03:39:04] pir^2: just the ones to grab vagrant and mediawiki [03:39:25] Gloria: oops. :P [03:39:29] he can just download the source if this is so hard, couldn't he? [03:39:41] Withoutaname: did you install git? [03:39:43] andrew: Anyway, you can still do it, but it'll probably be annoying. [03:39:50] To do what you want to do. [03:40:08] pir^2 yes I'm going to send another pic to show what i mean [03:40:10] \ [03:40:26] ori, yeah, for some reason it gets stuck when I create the SocketIO object in 3 but not 2 [03:40:48] andrew: what's your mediawiki.org username or gerrit username? 
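[editor's note: the short-URL scheme Gloria and the !nullpath factoid describe — MediaWiki installed at `example.com/w/index.php` with `example.com/wiki/Page_title` rewritten onto it — amounts to a path mapping like the following. This is a hypothetical illustration of what the Apache rewrite rule does, not MediaWiki code:]

```python
from urllib.parse import quote

def short_url_to_script(path, script_path="/w/index.php", article_prefix="/wiki/"):
    """Map a 'pretty' article URL onto the real MediaWiki entry point,
    mirroring the example.com/wiki/Foo -> example.com/w/index.php?title=Foo
    rewrite discussed above. Other paths pass through untouched."""
    if path.startswith(article_prefix):
        title = path[len(article_prefix):]
        return "%s?title=%s" % (script_path, quote(title, safe=":/_"))
    return path
```

This is also why the wiki should not live in the web root: keeping the script under `/w/` leaves `/wiki/` free as a purely virtual namespace.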
[03:40:54] if you have one [03:41:06] last log is DEBUG:socketIO_client.transports:[transport selected] websocket [03:41:22] Gloria: I'm actually a little lost though--how then do I tell apache that it needs to run the index.php in a directory below root? virtualhosts? wouldn't the result be the same? [03:41:33] pir^2: too new :) [03:41:51] andrew: index.php can run from any directory... [03:42:01] pir^2: been working with mediawiki source for a while to get a client project up, but only discovered this channel and everything else recently [03:42:08] cool [03:42:10] Gloria: I'll look it up [03:42:32] andrew: http://imgur.com/ge2BS8Z [03:42:41] Withoutaname: good work recently btw, saw all your commits [03:43:01] oh did you search for commit owners? [03:43:09] i saw yours too :P [03:43:30] Krenair: that should be delegating to _WebsocketTransport.__init__, might want to just add print statements/logging calls to that func [03:43:36] andrew: I have both git bash shell and windows shell up [03:43:48] step 4, git clone doesn't work with windows cmd [03:43:55] it does in the git bash i installed [03:44:08] left is old vagrant [03:44:16] andrew: So what most people do is have /var/www/example.com/ as the root directory of example.com. [03:44:18] right is the newvagrant dir you told me to setup [03:44:45] did you clone vagrant to the newvagrant directory? [03:44:59] cant, see the picture [03:45:03] git clone doesn't work in cmd [03:45:07] andrew: And then /var/www/example.com/w/ is the MediaWiki installation while people set up virtual paths to equate example.com/w/index.php?title= to example.com/wiki/ [03:45:08] ori, https://gist.github.com/Krenair/4775feddce4c10de1875 [03:45:26] Withoutaname: you need to install git in windows. http://git-scm.com/download/win [03:45:34] Withoutaname: then the commands will work in windows cmd [03:46:15] so i downloaded the wrong git? 
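[editor's note: Krenair's client hangs right after the "[transport selected] websocket" log line. One classic cause of such a hang — and the diagnosis ori gives a little later in the log — is a `socket.recv` call with no timeout, which blocks forever when the peer never sends anything. A minimal stdlib demonstration:]

```python
import socket

# Two connected sockets; nothing is ever written on `b`, so a recv()
# on `a` has no data to return. Without the settimeout() call it
# would block indefinitely -- the symptom Krenair is seeing.
a, b = socket.socketpair()
a.settimeout(0.2)
try:
    a.recv(1024)
    timed_out = False
except socket.timeout:
    timed_out = True
a.close()
b.close()
```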
[03:46:50] Gloria: I have never actually approached it that way before, that makes much more sense [03:47:18] Withoutaname: I have no idea--I just know that to get git commands working in windows cmd, which is what you need, you have to install that one [03:50:12] Gloria: https://github.com/legoktm/checker/pull/1 [03:50:16] Withoutaname: did you ever successfully clone it? [03:50:45] andrew: I see it now, in the installation process I selected the first option, "use git from git bash only", which was the default, instead of "use git from the windows command prompt" [03:50:52] Krenair: socket.recv without a timeout will block until data is available [03:51:08] Withoutaname: ahhh thank you, I will remember that for others. actually maybe that should be put on the page itself aha [03:51:27] Withoutaname: but can't you still clone from the git bash and then cd to that dierctory? [03:51:38] ori, I've been doing some stuff wrong [03:51:41] pir^2: I did, it's at /mw/vagrant [03:51:47] ori: git reset --hard origin/master && git pull --rebase [03:51:48] andrew was telling me to install a new one [03:51:55] ori: Can I kill this tar.gz now? [03:51:55] ignore me for a few hours :p [03:52:02] plus git bash doesn't allow copy paste [03:52:04] Gloria: yes [03:52:08] pir^2: yeah I have the blow it all up, start over in another directory approach when troubleshooting :P [03:52:09] only windows cmd does [03:52:21] ori: <333 [03:52:30] okay :P [03:52:33] tools.checker@tools-login:~$ git status [03:52:33] # On branch master [03:52:33] nothing to commit (working directory clean) [03:52:37] \o/ [03:52:41] andrew: what I think im going to do is just delete mw/vagrant [03:52:44] and then clone again [03:52:59] Withoutaname: you're in a different directory, so it shouldn't be necessary. 
let's just keep going [03:53:16] Withoutaname: if you deviate from the instructions I have, it makes it harder for me to follow what you're doing, essentially [03:53:20] oh my god, it's six am [03:53:23] sDrewth: Okay, I think we're mostly all set, other than slash issue. If you can follow-up on that, that'd be grea.t [03:53:27] great, too. [03:53:36] ori: Barely midnight. [03:54:11] Gloria: 9pm :) [03:54:22] WEST COAST BEST COAST [03:54:47] Gloria: and yet I'm flying to boston in a couple weeks... :) [03:55:40] Gloria: @andrew doesn't like california so terribly much [03:55:43] * sDrewth goes to look [03:55:56] Withoutaname: how's it going? [03:56:31] sDrewth: I imagine it looks the same, but the underlying files are neater now. [03:57:10] still an internal error [03:57:27] Internal error [03:57:27] The URI you have requested, /checker/?db=enwikisource&title=Index%3ADictionary+of+National+Biography.+Sup.+Vol+I+%281901%29.djvu, appears to be non-functional at this time. [03:57:28] andrew: it's downloading [03:58:32] ori, okay so what I had been doing was stupid [03:58:51] In the beginning I thought I could just modify the installed python scripts with sudo without doing anything in VCS [03:59:05] Withoutaname: git is downloading, or you're successfully running the git commands in cmd? [03:59:22] I also wasn't aware I had an older version of the websockets library. 
The most up to date version already works with python3 [03:59:27] (not bash, windows cmd.exe) [04:00:36] So if you git clone git@github.com:liris/websocket-client.git; cd websocket-client; sudo python3 setup.py install [04:00:50] andrew: it's cloning now [04:01:10] Krenair: try https://docs.python.org/2/library/2to3.html [04:01:20] Then the same with git@github.com:invisibleroads/socketIO-client.git but apply this patch: https://gist.github.com/Krenair/24c1364ddf0c5f6798a0 [04:01:41] andrew: I ran the first three commands in step 4 [04:01:44] should i do vagrant up [04:01:57] It will run the code from my email [04:03:05] Withoutaname: type in 'echo %cd%' (without ' ' ) and paste the output here before you do [04:05:31] Withoutaname: I'll confirm your in the right directory. then I'll need you to shut off the other instance of vagrant that is running, and then you'll be ready [04:06:39] * Krenair is going now [04:06:49] bye Krenair [04:06:49] C:\Users\user\mw\newvagrant\vagrant [04:07:13] Withoutaname: k, by whatever means necessary, you need to close the other running vagrant [04:08:20] (preferably by opening another windows cmd, navigating to where it's installed, and typing vagrant down) [04:09:20] pir^2: so, want to give me a quick rundown of the hierarchy here? :P [04:09:31] There is one? [04:09:43] pir^2: well I assumed there's at the very least maintainers for the core [04:11:26] andrew: https://en.wikipedia.org/wiki/Wikipedia:Do_NOT_bite_the_developers [04:14:00] hm. that is quite possibly the best contract between users and developers that I have ever experienced. [04:15:16] but pir^2 -- not anyone can just commit to the main codebase, can they? [04:15:24] ok andrew done it [04:15:30] Withoutaname: it's closed? [04:15:35] yeah [04:16:04] Withoutaname: I'm distrusting of computers. 
open your browser, go to localhost:8080 and make sure nothing comes up [04:16:46] andrew: actually, anyone can get an account on Wikimedia's gerrit instance pretty easily, then you can commit https://www.mediawiki.org/wiki/Developer_access [04:16:49] andrew: I have PRTG Network Monitor [04:16:58] but only a few users have +2 (merge/approve) rights [04:17:13] andrew: wait, how long ago did you became dev [04:17:18] https://www.mediawiki.org/wiki/%2B2 [04:19:26] andrew: maybe i should just switch to port 8081? [04:19:40] Withoutaname: not actually a dev yet [04:19:58] Withoutaname: yeah, that sounds safe, if you can do that in the vagrant config [04:20:19] been working within mediawiki on my own for a while, learning how it works and doing client projects [04:20:59] andrew: please see https://www.mediawiki.org/wiki/Developer_access - it's quite easy to get dev access [04:21:16] andrew: done, modified HTTP_PORT = 8081 [04:21:32] although it's possible Wikimedia will be changing gerrit and bugzilla to phabricator soon [04:21:40] pir^2: so essentially, you just commit, then others look at it on their own, and send it up if it passes? [04:21:46] Yes. [04:22:06] Most others can just +1 or -1, but a few can +2 (approve) it. [04:22:14] pir^2: is that a lengthy process? [04:22:22] Withoutaname: k, vagrant up [04:22:24] well, it can take a long time, but it's not too bad [04:22:46] it often takes a few tries to get it perfect [04:23:15] Your GIT/Gerrit username has not been set. Please enter your username or hit ENT [04:23:15] ER for anonymous. [04:23:15] If VM has already been created, run 'vagrant provision' to fix all remote git UR [04:23:15] Ls. 
[04:23:22] You can always set it later by editing .settings.yaml file [04:23:27] pir^2: well that's fine :) [04:23:56] should I set my git username [04:24:01] Withoutaname: absolutely [04:24:08] to whatever is on gerrit.wikimedia [04:24:11] yup [04:24:46] ok i think it's going to update [04:24:46] so I'm going to leave for awhile [04:25:12] Withoutaname: k [04:25:52] pir^2: I'll be working with extensions for a while anyhow [04:26:11] I'm kind of new to MediaWiki development too [04:26:30] pir^2: what parts of the codebase have you worked on? [04:26:45] various extensions and a few things in core [04:26:54] what in core? [04:27:21] I added the "creations only" option in Special:Contributions (MW 1.23) [04:27:25] (I could cheat if I knew your gerrit name haha) [04:27:30] I did a few more things in core, but I don't think they're merged [04:28:20] can't you see where they are in the review process? =\ [04:29:35] Yes, just checked [04:29:43] nothing since a week ago :/ [04:31:57] I have merged commits to core, WMF's own config, and these extensions: CentralAuth (x3), GeoData, RSS, Score, EtherpadLite, ProofreadPage, ApiSandbox, Translate [04:32:00] Gloria: still just getting the internal error [04:32:19] actually 4 CentralAuth [04:32:48] bye [04:33:04] oh uh bye [04:46:04] sDrewth: Oh, right. That was a separate issue... [04:53:11] Gloria: got tasks to do, bbl [06:42:24] @Withoutaname it all work out? [07:34:06] sDrewth: All better now. [07:46:55] Gloria: Magic! Thanks so much [07:48:09] Gloria: just noticed that it isn't sorting into groups though [07:48:21] https://tools.wmflabs.org/checker/?db=enwikisource_p&title=Index:Dictionary_of_National_Biography_volume_34.djvu [07:48:23] I don't really understand the groupings. [07:48:36] * Gloria looks. [07:48:46] transcluded page or not [07:48:50] Yeah... [07:49:13] compared to https://toolserver.org/~mzmcbride/checker/?db=enwikisource_p&title=Index:Dictionary_of_National_Biography_volume_34.djvu [07:49:41] Hrm. 
[07:55:18] sDrewth: Okay, fixed that as well. [07:55:34] https://tools.wmflabs.org/checker/?db=enwikisource_p&title=Index:Dictionary_of_National_Biography_volume_34.djvu works properly now. [08:05:50] Gloria: may I ask, what project is that? [08:06:13] andrew: checker? It's some silly script I wrote for sDrewth. [08:06:17] In 2011. [08:06:26] sDrewth: Where does the time go... [08:06:29] :D [08:06:36] It does something with Wikisource. [08:06:40] I can't really remember. [08:06:49] tools.wmflabs.org is a tool testing playground thing. [08:06:56] Servers and database replicas and such. [08:09:21] http://tools.wmflabs.org/mzmcbride/ [08:10:13] fascinating [08:10:27] hello xD [08:17:57] Gloria: your Content-Type is wrong!!! [08:18:19] Oh dear. [08:18:28] Well, the file used to be index. [08:18:30] But I just renamed it. [08:18:34] Because it wasn't being served. [08:19:15] http://p.defau.lt/?1_R_1REShsB25378Ebppqw [08:19:22] Seems I've already addressed this problem before... [08:24:11] andrew: done [08:24:22] got vagrant up and running properly this time [08:25:24] Withoutaname: awesome :) [08:25:46] andrew: now I have to figure out why it's returning 404s when I'm calling the file_get_contents() function [08:25:59] Withoutaname: in php, you mean? [08:26:06] yes [08:26:17] i dont think vagrant likes it [08:26:21] Anyone know about parsing plugins? Writing a plugin using $wgParser->setHook() I want to have a unique value per hook instance "view_1" "view_2" etc. Can I store some state in $parser? (not verry familar with php) [08:26:54] Withoutaname: 404 though? I'm confused--can you test on a lower level, within the vagrant bash? like make a tiny php script that just loads the contents of a file and prints it [08:28:41] Withoutaname: it's also possible that if you're getting 404s, the file you're trying to load isn't accessible by www-data, which is the user apache runs as. 
you could cheat and quickly chmod 777 whatever it is [08:29:40] I'm trying to load https://localhost:8081/wiki/Main_Page [08:29:57] andrew: how do I make it accessible? [08:30:07] Withoutaname: actually, before everyone in here kills me for this, go with a chmod o+rw instead :P [08:30:08] load IIS? [08:30:34] Withoutaname: you're working in vagrant, which runs linux-- apache is the http server, not iis [08:31:15] ok so if it's in C:/Users/user/mw/vagrant [08:31:22] I have to chmod everything in vagrant? [08:31:47] idk where "main_page" would be located on my filesystem [08:31:51] oh right, you haven't worked with virtualized environments before [08:32:20] so forget about c:/users etc. you need to work within vagrant's shell, the one you had up earlier (not cmd). it has its -own- filesystem [08:32:31] and within -that-, you work with things [08:32:41] isn't it synced with my windows filesystem [08:33:13] i can drop files into the windows system and it'll register on the linux one right [08:33:46] I think it's able to connect to it, yes. but that's chaotic, windows works with files in a different way than linux, and permissions will be off-that might be the problem [08:34:03] time for another screenshot ;) [08:35:28] krinkle is looked after in the room 2 (performance guidelines discussion)! [08:36:12] Nemo_bis: ? [08:40:34] andrew: where would the main page normally be located on the filesystem [08:42:18] Withoutaname: that's the concept I'm trying to explain-- vagrant, that you just loaded, is simply a file on your filesystem. when loaded, it expands to its own filesystem. like a box of its own. within that, is a default mediawiki installation that from what I'm reading, should load automatically [08:42:51] you should be able to go to localhost:8080 (in your case 8081) and see something come up [08:43:09] yeah, the main page [08:43:42] to a wiki? it works? [08:43:55] yes the wiki [08:44:04] and everything functions fine? special pages, etc? 
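[editor's note: andrew's permissions advice above — `chmod o+rw` rather than the blunt `chmod 777` — can be expressed with `os.chmod`, OR-ing in only the "others" read/write bits and leaving the rest of the mode alone. Sketch on a throwaway file:]

```python
import os
import stat
import tempfile

def add_other_rw(path):
    """Grant 'others' read and write (chmod o+rw) without touching the
    remaining mode bits -- unlike a blanket chmod 777."""
    mode = stat.S_IMODE(os.stat(path).st_mode)
    os.chmod(path, mode | stat.S_IROTH | stat.S_IWOTH)

# quick demo on a temporary file
fd, path = tempfile.mkstemp()
os.close(fd)
os.chmod(path, 0o600)
add_other_rw(path)
mode = stat.S_IMODE(os.stat(path).st_mode)
os.remove(path)
```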
[08:44:18] special:recentchanges is fine [08:44:23] i can access it from my firefox [08:44:40] http://127.0.0.1:8081/wiki/Special:RecentChanges [08:44:43] try the special pages link itself, the one that loads them all. I find if something isn't right, it's the one that breaks [08:45:38] Special:SpecialPages? It's a list, but it won't load all special pages [08:46:24] well, that list, yeah. it works too? [08:46:47] I can access it just fine [08:47:07] i just want to know why it's rejecting file_get_contents [08:47:09] k. so what is it that you're trying to do -- you need access to a specific file? through the wiki? [08:47:40] yeah, e.g. file_get_contents( 'http://localhost:8081/wiki/Main_Page' ) [08:47:54] ahh. see, here's the separation [08:48:06] that link, is what your browser on your -host- computer sees [08:48:31] but mediawiki is separate, on its own virtual machine [08:49:19] (running a different OS) -- so the answer will be closer to: http://127.0.0.1/wiki/Main_Page [08:49:45] there won't be a port, because on the virtual machine, I'm going to guess apache is actually running on port 80 [08:49:56] and vagrant routes it to 8081 for external hosts [08:51:20] wow... [08:51:23] that worked, thanks [08:51:25] :) [08:51:40] I think it will really help if you do a little bit of reading on virtualization [08:52:15] andrew: you said you were hacking another client's mediawiki installation right? [08:52:46] would you be ok with reviewing code? [08:53:03] Withoutaname: yes, though my major project was really with creating a html -> wiki processor [08:53:18] Withoutaname: I suppose a tiny bit. 
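[editor's note: the resolution that follows — inside the guest VM, Apache listens on its own port 80, and Vagrant only forwards it to 8081 for the host — is the whole file_get_contents() mystery. A hypothetical helper making the two vantage points explicit (port numbers as assumed in this log):]

```python
def wiki_url(title, inside_guest, forwarded_port=8081):
    """Build the URL for a wiki page in a MediaWiki-Vagrant setup.

    Code running inside the guest (e.g. PHP calling file_get_contents)
    must use the guest's own Apache port (80, implicit below); only the
    host browser sees the forwarded port (8081 in this log).
    """
    if inside_guest:
        return "http://127.0.0.1/wiki/%s" % title
    return "http://127.0.0.1:%d/wiki/%s" % (forwarded_port, title)
```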
[08:53:35] but I'm definitely not as familiar with the codebase as a lot of the other people here [09:13:44] what does Content::serialize() do [09:13:49] andrew: problem solved :) [09:16:50] Withoutaname: it serializes the content into whatever the serialization format is...basically it depends on what type of Content object you're dealing with [09:19:19] legoktm: doesn't getNativeData do the same thing? [09:19:28] no [09:19:35] well, sometimes! [09:20:02] getNativeData returns the raw "content" of the object [09:20:39] so for example, if you had a JSONContent object, getNativeData would return a PHP array, but serialize would return a string of json [09:20:51] (I think) [09:24:56] ugh, if I knew about this room before I spent all that time figuring classes out myself.. :P [09:32:37] :) [09:36:41] hey legoktm [09:36:49] hi [09:37:23] could you give me some pointers as to what im doing wrong [09:37:38] I can try. Post your code somewhere? [09:38:03] http://pastebin.com/sYCdgwXE [09:39:05] and what's not working? [09:39:26] hmm [09:39:32] well, I don't think parsing html is a good idea [09:39:52] $user->addGroup( 'bot' ); <-- not necessary since you specify EDIT_FORCE_BOT [09:40:57] I also don't think you need to do $wgContLang->lc; just expect that the person running the script gave a valid input (you already reject invalid ones) [09:41:31] i wasn't sure because prefixes were case-insensitive [09:46:38] legoktm, I wasn't sure how to strip the interwiki prefixes without modifying e.g. those in nowikis [09:47:14] so I opted to try and parse the content to get html, if it produces a link i take the link and try to insert it back in [09:47:20] isn't there a way to do this without parsing? I think I came across methods for dealing with and parsing links [09:47:32] I'm not really sure how to either [09:47:38] *without additional parsing [09:47:42] the parsoid folks might have some ideas [09:48:47] andrew: bug 60135, convert e.g. 
[[wikipedia:foo]] into [https://en.wikipedia.org/wiki/Foo foo] [09:49:38] blockers: stop converting things in <nowiki>, parse [[wikipedia:foo]]bar [09:49:42] ahh [09:51:03] also possible annoying edge cases like [[ wikipedia : foo]] [09:52:43] andrew: how would you do this without parsing [10:01:48] Withoutaname: too new to the codebase, I don't even know the options [10:02:15] if I were not to take advantage of methods already around, I'd do it just like you are :P [10:55:56] Reedy: Hi are you here? [10:56:24] !ask [10:56:24] Please feel free to ask your question: if anybody who knows the answer is around, they will surely reply. Don't ask for help or for attention before actually asking your question, that's just a waste of time – both yours and everybody else's. :) [10:58:43] marktraceur: I patched the bug https://bugzilla.wikimedia.org/show_bug.cgi?id=63211 and now there are some other major changes Reedy commented in https://gerrit.wikimedia.org/r/#/c/132734/ (last comment) [10:59:00] do i amend as part of same bug or file a new one? [10:59:43] marktraceur: Reedy i just need to confirm it before i commit new changes [10:59:58] kishanio: I think you should amend to that *patchset* so we have all of the features right away [11:00:12] marktraceur: Alright. Thanks. [11:33:18] Is mediawiki open source? [11:33:57] Moony22: Yeah :) it's licensed under the GPLv2 or later. [11:34:14] marktraceur: ah, good, thanks [11:34:55] Moony22: Notably, GPL is a copyleft license, so it's not as permissive as you may be used to. [11:35:08] Moony22: But http://git.wikimedia.org/blob/mediawiki%2Fcore.git/d8f3dd8f391512fe68712c4fef229e84fa797afb/COPYING has the gory details [11:35:45] thanks [11:36:05] Yup! [11:37:47] marktraceur: Well, if there is any type of license I am used to it would be a copyleft license [11:38:08] That's good! [12:09:59] Is there a good overview of code browsing modes somewhere? I.e. grep -> find -> mass edit -> whatever?
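legoktm's getNativeData() vs. serialize() distinction from earlier in the log can be illustrated with a toy class. This is deliberately *not* MediaWiki's real JSONContent implementation, just a minimal stand-in showing the relationship between the two methods:

```php
<?php
// Toy stand-in for a JSON-backed Content object: getNativeData()
// returns the raw value, serialize() returns it in the storage format.
class ToyJsonContent {
	private $data;

	public function __construct( array $data ) {
		$this->data = $data;
	}

	// Raw "content" of the object: here, a PHP array.
	public function getNativeData() {
		return $this->data;
	}

	// Serialized form: a JSON string suitable for storage.
	public function serialize() {
		return json_encode( $this->data );
	}
}

$c = new ToyJsonContent( [ 'foo' => 'bar' ] );
$c->getNativeData(); // the PHP array [ 'foo' => 'bar' ]
$c->serialize();     // the JSON string {"foo":"bar"}
```

For a plain-wikitext Content object the two would coincide (both return the wikitext string), which is why legoktm says they are "sometimes" the same.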
[12:10:14] Currently I'm mostly just using ack mode to do greps, dired, occur [12:10:46] and icomplete.. [12:13:24] omfg an avar [12:14:07] Best day avar. [12:25:42] Also very wrong channel, urghl [12:26:14] Fascinating [12:26:19] * Reedy pets avar [12:27:13] whee. extension working http://fingswotidun.com/testwiki/index.php/Main_Page [12:28:00] lerc: Have a bug tracker? :) [12:28:36] not so much, what did you break? [12:28:42] Just two suggestions. [12:28:58] lerc: "escape" should close the overlay; the buttons should have cursor: pointer [12:30:46] Hmm, esc used to close the overlay. Must have got stomped in an update [12:31:50] Well then, bug report. [13:21:24] mergy merge of backport please https://gerrit.wikimedia.org/r/#/c/132464/ [13:22:51] Nemo_bis: REL1_22 backports are Mark/Marcus's call. [13:24:38] hello [13:25:20] is there any way to disable forced capitalisation for usernames? [13:25:52] James_F: that doesn't mean they have an exclusive :) they like to be helped [13:26:03] guest92: No. [13:26:04] No [13:26:23] Nemo_bis: Helped but not ambushed. It's been two days, give them time to react. :-) [13:26:36] Nemo_bis: There's no cut of 1.22 coming out this week; no great hurry. [13:26:51] why though [13:27:03] just to annoy you [13:28:29] James_F: there is no ambushing :) see bug [13:29:01] they should allow it [13:29:16] im appalled [13:29:24] guest92: However. [13:30:08] Nemo_bis: Where Mark said Hemeshi was going to test it. And yet I don't see a comment, let alone a +1. [13:30:41] No idea what your point is :) [13:58:52] Helllo #mediawiki, what's the current function of `MediaWiki:All_messages'? Is it still present in a freshly installed MediaWiki 1.19+ (With one migrated from an old wiki, I'm just too lazy to have a try)? Is it safe to delete it? Thank you very much :) [14:01:17] CasperVector: What do you mean? [14:03:14] Reedy: I mean, (1) is it safe to delete `MediaWiki:All_messages' in favour of `Special:AllMessages'?
(2) (asked out of curiosity: is it still present in a freshly installed MediaWiki 1.19+? Thanks.. [14:03:52] It's not really in favour of [14:03:58] The first is a message [14:04:38] I don't think it's used by anything [14:10:18] Reedy: thanks, I asked because I found a warning saying that `MediaWiki:All_messages' is (still) a key page when I wanted to edit/delete it... [14:17:35] Reedy: BTW, `MediaWiki:All_messages' is really (provable with SQL) a page on my wiki, which was migrated from 1.2.0 rc3 ;) [14:58:05] ori, https://github.com/invisibleroads/socketIO-client/pull/51 [15:03:23] Krenair: wow! [15:03:30] nice work! :D [15:04:00] I don't think it's that impressive, but okay :D [15:24:47] * Gloria commented. [15:25:10] Can't read "heartbeat" without thinking about Heartbleed. [15:25:47] same here, every time [15:26:00] Gloria: do you have suggestions on how to make the HTML less silly? https://gerrit.wikimedia.org/r/#/c/131727/ [15:26:34] (I mean the TODO there) [15:26:35] "Code looks fine as long as it isn't secretly sending private keys to bad guys." [15:27:11] Ugly checkbox? [15:27:23] Can I see the HTML itself anywhere? [15:27:27] Do you have your own beta wiki? [15:33:15] Gloria, fixed [15:40:10] Sweet. [15:48:13] hey, how to disable "Created page with" text from appearing in recent changes? [15:48:36] guest22: by entering a comment in the comment field [15:48:47] any alternatives? [15:48:57] summary field* [15:49:04] what are you trying to accomplish? [15:49:33] i have a project where pages are constantly created and it really spams the recent changes page [15:50:30] bot flag [15:55:23] ok autosumm-new [15:55:52] bye. [15:56:04] BYE [17:34:47] Hey guys, anyone awake? 
[17:35:21] I installed MediaWiki through XAMPP thing a couple of days ago and I'm currently trying to add two custom namespaces on 500 ~ 503 [17:36:23] And then I added this small section at the entire end of the LocalSettings.php: http://puu.sh/8Ii9O.png [17:37:12] But I'm checking both Special:AllPages and HVdict:Chicken through the console now, and the custom namespaces don't seem to exist yet [17:38:19] Anyone have an idea if I'm missing anything or doing something wrong? [17:57:45] Yatalu: have you rebooted the web server? [17:58:26] Well it's localhost stuff [17:58:33] I restarted XAMPP [17:58:46] Stopped and restarted Apache and MySQL twice [17:59:31] My friend's helping me with this issue too and I currently expanded my local settings to this: http://puu.sh/8Ik0X.png [18:03:31] Yatalu: go back to the old setup [18:04:00] Yatalu: run a query similar to http://en.wikipedia.org/w/api.php?action=query&meta=siteinfo&siprop=general|namespaces|namespacealiases|statistics&format=jsonfm on your API and post the results in pastebin [18:06:45] Yatalu: when replying please prefix your comment with my name so I get pinged, makes conversations go faster [18:20:55] Is MediaWiki 1.23 going to be released this month? [18:21:09] I think I remember seeing somewhere that May 2014 was going to be a release time [18:21:13] but I may be wrong [18:21:36] you're right [18:21:46] sweet [18:21:48] can't wait [18:22:08] https://www.mediawiki.org/wiki/MediaWiki_1.23 [18:22:19] 1.23.0 >2014-05-29 [18:22:25] * -->ç [18:22:28] I've been on 1.23wmf15 forever now (and stuff stops working on 1.23wmf16, so I've just stayed on wmf15 since it works) [18:23:09] What stops working, Meiko? 
[18:23:20] I don't remember, it just broke big time [18:23:33] it's likely 1.23 will have the same issues [18:23:43] I would check again and report [18:24:20] I'll see if REL1_23 works when it's officially released before I change anything or report it [18:24:50] wikis I'm refering to are Brickimedia [18:24:52] [[Brickimedia]] [18:24:55] Betacommand: http://pastebin.com/GDRh4KFg [18:24:58] wm-bot :O [18:25:05] why is wm-bot's linkie not on here? :( [18:25:19] Meiko, in that case you may end up waiting for 1.24 for the fixes [18:25:33] It could just be a problem with wmf16 [18:25:58] If i have several branches with commits waiting to be reviewed. and whenever i pull on master with --rebase does that even update my other branches? [18:26:00] or it could just be a 1.22 extension that needs to be updated [18:26:36] Betacommand: console returns 500 as namespace now. I don't know what happened, but whatever we/my friend did has worked [18:26:43] Yay [18:26:47] Thanks for the help [18:26:47] xD [18:34:50] Meiko: the reason we do alphas and release candidates is so that issues like the ones you're having can be reported *before* the official release, as if the bug is indeed with 1.23 and it makes it to the official release, you may end up waiting for 1.23.1 as a best-case scenario, and 1.24 otherwise [18:35:23] if you aren't sure how to tell where the issue is, we can help :) [18:35:25] Skizzerz: problem is its almost certainly a problem with our setup [18:35:39] looking at some of our extensions, some are still on 1.21 and 1.22 branches [18:35:51] okay, do those extensions have updates? [18:35:56] yeah [18:35:57] if yes, have you tried updating them? 
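For reference, the custom-namespace setup Yatalu was attempting is normally a fragment appended to the end of LocalSettings.php along these lines. The "HVdict" name and the IDs 500-503 come from the log; the constant names and the second namespace's name are assumptions made up for this sketch:

```php
<?php
// Appended to LocalSettings.php. Subject namespaces get even IDs and
// their talk namespaces the following odd ID; custom IDs should be
// 100 or higher to stay clear of the core namespaces.
define( 'NS_HVDICT', 500 );
define( 'NS_HVDICT_TALK', 501 );
define( 'NS_HVGRAM', 502 );      // second namespace: name is a guess
define( 'NS_HVGRAM_TALK', 503 );

$wgExtraNamespaces[NS_HVDICT] = 'HVdict';
$wgExtraNamespaces[NS_HVDICT_TALK] = 'HVdict_talk';
$wgExtraNamespaces[NS_HVGRAM] = 'HVgram';
$wgExtraNamespaces[NS_HVGRAM_TALK] = 'HVgram_talk';
```

Note that LocalSettings.php is read on every request, so restarting Apache/XAMPP should not be necessary for a change like this to take effect, consistent with how the thread eventually resolves.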
[18:36:01] we will [18:36:11] when we need to (when 1.23 is released) [18:38:32] on a test instance of the wiki farm we're using to develop some of our future extension/skin releases, it works fine on 1.24 (master branch) so it's just a problem with some extensions on our live surely [18:38:38] *live site surely [18:39:34] http://meta.brickimedia.org/wiki/Special:Version - live site, on 1.23wmf15; http://georgebarnick.com/refreshed-beta/index.php?title=Special:Version - test instance without as many extensions, on 1.24 [18:40:37] changing the live site's core branch to wmf/1.23wmf16 or REL1_23 gives a database error (no source of the error or log entry though), so it's something with an extension that's on brickimedia.org but not georgebarnick.com [18:40:50] Probably semanticmediawiki because that causes problems every time we update :P [19:17:22] Hello, is there a way to let registered users modify pages but only accept those modifications if an sysop/admin approves it? [19:18:11] FlaggedRevs [19:18:32] Reedy: thanks, will check it out [20:02:47] hi [20:02:57] it is still not possible to create non existent page? [20:03:06] wat [20:03:32] MediaWiki allows the creation of pages, Juandev [20:03:49] well, it was a bug [20:03:57] what was a bug? [20:04:23] I think it was removed, but no I tried to create page with no text in mw ns at cs.wv and it was not performed [20:04:27] apergos: hi [20:05:12] Juandev, is there an entry in bugzilla? [20:05:26] yes, I should find it in my e-mail [20:05:48] yes, a bugzilla entry exists for that (Idon't remember the number though) [20:07:06] lets see [20:07:26] what does ParserOptions::setInterwikiMagic() do [20:08:09] Krenair: probably this one: https://bugzilla.wikimedia.org/show_bug.cgi?id=50124 [20:08:55] Withoutaname: it does... magic :P (joke) [20:09:10] :-) [20:13:01] i am trying to restore a wiki from xml dump. 
running: php /var/www/wiki/maintenance/update.php [20:13:06] getting an error: [809c1570] [no req] Exception from line 324 of /var/www/w/includes/installer/MysqlUpdater.php: Missing rc_timestamp field of recentchanges table. Should not happen. [20:13:16] how can i fix it? [20:17:34] the easy fix: create the missing rc_timestamp field of recentchanges table. See maintenance/tables.sql for the definition of the fields of that table [20:18:42] would be good to know from and to which MediaWiki version are you upgrading [20:19:10] 1.22.6 [20:19:47] could you confirm that? [20:19:52] Krenair: ? [20:22:33] I have no idea what you're talking about Juandev [20:22:50] Krenair: well, so I will reopen the bug [20:23:11] gj [20:23:14] Vulpix: am I supposed to somehow run tables.sql? [20:23:26] Juandev, If you are running a version that is supposed to be fixed, but you still get the bug, then reopen it, yes [20:23:37] adrelanos: no, that script is run on database creation, not for an upgrade [20:23:57] Vulpix: i am not upgrading. it's a restore. [20:24:03] Krenair: he he. I am not running, the Wikimedia Foundation is running [20:24:05] updated /var/www as well as xml [20:24:11] now I want to test restore [20:24:23] i mean backup made of /var/www as well as xml [20:24:24] adrelanos: why are you running update.php then? [20:24:42] Juandev, if you should be able to create MediaWiki: pages on a WMF wiki but you can't create blank ones, then okay [20:25:05] Vulpix: what else must i run to initially create sql database? [20:25:37] also because [20:25:38] php /var/www/wiki/maintenance/importDump.php /home/user/WhonixWikiBackups/dumpContentCurrent.xml [20:25:38] Error: 1146 Table 'wiki.xtranslate_messageindex' doesn't exist (localhost) [20:25:48] somehow these tables have to be created [20:25:52] i though update.php does this [20:26:29] adrelanos: you should follow the normal installation instructions. First create your wiki. 
And then import (after being sure the Project: namespace, content language, and namespaces are set correctly!) [20:27:00] Vulpix: i see. skipping the web interface based setup isn't possible? [20:27:01] Krenair: err sorry in user ns [20:27:19] Juandev, that's not this bug then [20:27:49] adrelanos: there's a maintenance script for installation, IIRC [20:28:26] maintenance/install.php [20:33:03] ok [20:33:06] i seem to have no luck [20:33:09] [8259c458] [no req] Exception from line 308 of /var/www/w/includes/db/LBFactory.php: Mediawiki tried to access the database via wfGetDB(). This is not allowed. [20:33:27] lolphail [20:33:41] (this is what i was doing: php install.php --name Whonix --admin test --pass xxxxxx [20:34:42] dbname, sbserver, dbuser, dbpass [20:37:36] still getting that error. using now: [20:37:38] php install.php --name Whonix --admin test --pass xxxxxx --dbname wiki --dbuser root --dbpass xxxxxx --dbserver localhost [20:37:38] [67ed1376] [no req] Exception from line 308 of /var/www/w/includes/db/LBFactory.php: Mediawiki tried to access the database via wfGetDB(). This is not allowed. [20:40:13] done [21:08:07] is it possible to make a mysql mediawiki backup without user accounts and passwords? [21:11:18] adrelanos: mysqldump --ignore-table=mywiki.user [21:12:04] although it would result in a broken database if you restore it [21:12:21] ok [21:12:41] i was looking for something simpler than going though the web based installer first before being able to import the xml [21:12:55] (we want others to be able to reproduce our wiki - but we don't want to share passwords) [21:13:12] just for own restoration, restoring full mysql backup would be probably simpler [22:43:06] what function is responsible for parsing [url foo] into "foo" with an arrow [23:10:07] legoktm: As per your comment in https://gerrit.wikimedia.org/r/#/c/132836/. I need to clarify few thing. 1.) 
After making an anonymous function a class function i would need to pass it a global variable array to store masks? Is this okay? 2.) if so i still don't follow naming conventions for globs. Thanks. [23:10:57] can you please comment on above. :) [23:58:21] So i submitted a patch under a wrong branch? Later i switched branches on my local, committed again, and submitted it by manually appending the same change id from the previous patch. [23:58:40] gerrit created a new change for this patch? is this expected?