Getting website logs

I need to look at my website logs. What is the easiest way to get them?
I thought using SSH would be, and it seems to be the only way: when I try to look at them from the control panel, they are not there; the "logs" directory just shows an empty file named "http".
Over SSH this "http" is a directory, and that is where the logs are. How can I download the log files so I can go over them offline?
For example, this is one: access.log.2013-11-11.gz. I would like to download it, then unzip it so I can read it.
Thanks, Garr

User needs shell access.

Manage Users > Edit

SSH or SFTP works for accessing logs (as you correctly identified, they are not available via plain FTP).

Most FTP clients will let you use SFTP; personally I use WinSCP.
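If you prefer the command line, a single scp invocation can pull a log file down, and gunzip unpacks it. A minimal sketch; the user, host, and log path below are placeholders, so substitute your own account details:

```shell
# Copy one compressed log from the server to the current directory.
# (user, host, and path are placeholders -- adjust to your account)
scp user@example.com:logs/example.com/http/access.log.2013-11-11.gz .

# Unpack it; gunzip replaces the .gz file with the plain-text log
gunzip access.log.2013-11-11.gz

# Read it offline
less access.log.2013-11-11
```

You will be prompted for your password, the same one you use for SSH.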

I already do have SSH access and can view the logs there; however, I do not understand the scp command. How can I use it to download access.log.2013-11-11.gz to my computer, then unzip it so I can read it? Since it is zipped, it cannot be read online.
My computer runs Linux Mint. I did look at man -k scp, but still could not figure out what the syntax would be.
Thanks for the responses
Edit: never mind, I figured it out.

Here’s how I did it:

I went to Manage Users > Edit and gave my username Shell Access.

I connected to my account with the free FileZilla client (saving the connection in FileZilla's Site Manager for future use).

I navigated: logs > > http

I right-clicked on access.log, then chose download.
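As a side note, the compressed .gz logs can also be read in place, with no separate unzip step, using zless or zcat. A quick sketch with a hypothetical filename:

```shell
# Page through a compressed log without unpacking it
zless access.log.2013-11-11.gz

# Or stream it through other tools, e.g. to count 404 responses
zcat access.log.2013-11-11.gz | grep -c ' 404 '
```

Both commands work over an SSH session on the server as well, so you can skim a log before deciding whether to download it.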


OK, thanks again, I did figure it out.
It seems easier from the shell, though not with scp; I just typed sftp "myaddress/login, etc.",
was prompted for my password, then once logged in ran get path/path/filename, and it downloaded to my computer's home directory.
I did need to look at man sftp a little to understand how to write the path correctly. (Linux makes life so much easier! :) )
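An sftp session like the one described above might look roughly like this; the host and paths are placeholders, not the actual account details:

```shell
# Connect (you'll be prompted for your password)
sftp user@example.com

# At the sftp> prompt, roughly:
#   cd logs/example.com/http         # move to the log directory
#   get access.log.2013-11-11.gz     # download to the local working dir
#   exit

# Back on the local machine, unpack and read the file
gunzip access.log.2013-11-11.gz
less access.log.2013-11-11
```

By default, get saves the file into whatever local directory you started sftp from.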