I very occasionally need to support people with websites on servers other than those I manage. If the server is only accessible using FTP and I need to get a copy of the entire website, then I use the ncftpget command line tool to recursively download all the files and directories in one go.
ncftpget is part of the ncftp package.
On APT based Linux distros (e.g. Debian) you can install it like this:
apt-get install ncftp
On YUM based Linux distros (e.g. CentOS, RHEL) you can install it like this:
yum install ncftp
Usage of ncftpget
Use the command like so:
ncftpget -R -T -v -u [username] [hostname] [local path] [remote path]
-R tells ncftpget to download files and directories recursively
-T says not to attempt a TAR-mode download. TAR mode has never worked for me: it fails with the errors "tar: This does not look like a tar archive" and "tar: Exiting with failure status due to previous errors", although the actual files subsequently download just fine.
-v says to be verbose and show download progress; you can omit this, but it is useful for seeing what's going on
-u specifies the username to log in with; replace [username] with the actual user.
Replace [hostname] with the server to FTP to.
Replace [local path] with the local directory to copy the files into.
Replace [remote path] with the directory on the remote server that holds the files.
Let’s say our username is "chris", the server is 10.1.1.10, the path we want to save to locally is /tmp and we want to download files from the remote host from /httpdocs. Do this:
ncftpget -R -T -v -u chris 10.1.1.10 /tmp /httpdocs
You are then prompted for the password and the download starts.
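If you need to run the download unattended (from a cron job, say), ncftpget can read the host, user and password from a login file with its -f option instead of prompting. Below is a minimal sketch using the same example details as above; the file path /tmp/ftp-login.cfg and the password shown are illustrative assumptions:

```shell
# Create a login file for ncftpget's -f option (host/user/password are
# the example values from above; "s3cret" is a placeholder password).
cat > /tmp/ftp-login.cfg <<'EOF'
host 10.1.1.10
user chris
pass s3cret
EOF

# Restrict permissions, since the file contains a plaintext password.
chmod 600 /tmp/ftp-login.cfg

# The download command then becomes (no password prompt):
#   ncftpget -f /tmp/ftp-login.cfg -R -T -v /tmp /httpdocs
```

Note that storing a password in plaintext is a trade-off; only do this on a machine you control, and remove the file when you're done.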