Linux Community's Journal
 

Thursday, October 24th, 2002

Time Event
1:14a
all right then
I know I'm risking ridicule, but here we go; I've been working all night and I'm tired.

I need to transfer files from my old server to my new server. I thought the best approach would be to FTP into the old server from the new server and download all the files, since uploading my copy of the site from my dev machine would take two days with my cable modem's upload cap. The problem is that whenever the transfer hits a directory, it says "no such file" and skips it. I understand this is because the directory doesn't exist yet on my new server, but there are over 1000 directories in the site, and it wouldn't save any time to create them all by hand and download the contents into them one by one.

So my question is: is there a way to make the mget command automatically create the directories on the local machine as it finds them and keep downloading the files inside them, something like a -r flag for recursing through them? Or, if mget can't do it, is there some other command altogether that I should be using but am missing? Thank you all.
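For reference, the stock command-line ftp client's mget does not recurse into subdirectories; mirroring tools such as wget (with -r) or ncftpget (with -R) can pull down a whole FTP tree in one pass. Below is a minimal sketch of the same idea using Python's ftplib; the host name, credentials, and paths are placeholders, not details from the post.

import os
from ftplib import FTP, error_perm

HOST = "old.example.com"       # hypothetical old server
USER = "username"              # placeholder credentials
PASS = "password"
REMOTE_ROOT = "/home/site"     # hypothetical document root (absolute path)
LOCAL_ROOT = "site"            # where to recreate the tree locally

def mirror(ftp, remote_dir, local_dir):
    """Recreate remote_dir under local_dir and fetch everything inside it."""
    if not os.path.isdir(local_dir):
        os.makedirs(local_dir)
    ftp.cwd(remote_dir)
    for name in ftp.nlst():
        if name in (".", ".."):
            continue
        try:
            # cwd succeeds only if `name` is a directory, so recurse into it
            ftp.cwd(name)
            mirror(ftp, remote_dir + "/" + name, os.path.join(local_dir, name))
            ftp.cwd(remote_dir)          # come back after the recursion
        except error_perm:
            # `name` is a plain file: download it in binary mode
            out = open(os.path.join(local_dir, name), "wb")
            ftp.retrbinary("RETR " + name, out.write)
            out.close()

ftp = FTP(HOST)
ftp.login(USER, PASS)
mirror(ftp, REMOTE_ROOT, LOCAL_ROOT)
ftp.quit()

The script walks the remote tree and creates each local directory before fetching the files inside it, which is exactly the step mget skips.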
10:00a
9:33p
Win Emu
In your opinion, what's the best Windows emulator for Linux?
I use Win4Lin 4.0, but I can't say it's very stable.
