If you have a lot of files to upload to a website and the only access is via FTP, you can use a nice GUI tool to just drag and drop the directory, or you can use the command-line tool ncftpput.
Why use a command line tool?
I needed to upload a directory containing thousands of files spread across many subdirectories. I could have used a GUI tool, but it would have taken a very long time. There was no SSH, SFTP, rsync or SCP access to the website in question, so I did this:
1) tar up and gzip the directory on my local machine
2) copy it via sftp to my US-based VPS (just one file copied up)
3) decompress it on the VPS
4) use ncftpput to copy it to the website (all files copied from VPS to website), roughly as sketched below
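The commands for each step looked roughly like this; the hostname, username and archive name here are placeholders rather than the real values I used:

# 1) tar and gzip the directory locally
$ tar -czf myapplication.tar.gz myapplication

# 2) copy the single archive up to the VPS over SFTP
$ sftp chris@vps.example.com
sftp> put myapplication.tar.gz
sftp> quit

# 3) log in to the VPS and decompress it there
$ ssh chris@vps.example.com
$ tar -xzf myapplication.tar.gz

# 4) recursively upload from the VPS to the website over FTP
$ ncftpput -u [username] -p [password] -R [hostname] / myapplication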
The whole process took a couple of minutes, instead of potentially 30 minutes or longer using a GUI tool like Transmit on my crappy ADSL connection: only one compressed file had to cross my slow link, and the thousands of individual FTP transfers happened over the VPS's much faster connection.
How to use ncftpput to recursively copy a directory
Let’s say the directory containing the files to upload is called “myapplication” and is to go into the website’s root directory.
The current working directory looks like this:
$ ls -l
total 8380
drwx------ 5 chris chris    4096 Aug 23  2010 myapplication
-rw-r--r-- 1 chris chris 8418990 Apr  3 11:36 myapplication.tar.gz
To recursively upload the myapplication directory, do this:
ncftpput -u [username] -p [password] -R [hostname] / myapplication
Substitute [username], [password] and [hostname] with the appropriate values. -R tells ncftpput to upload recursively; / is the remote directory to upload into; and myapplication at the end is the local directory to upload.
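As a concrete example, with made-up credentials and hostname, and uploading into a hypothetical public_html directory instead of the root:

$ ncftpput -u chris -p s3cret -R ftp.example.com /public_html myapplication

This should create a myapplication directory under /public_html on the server, mirroring the local directory tree.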