App container and SSH access logs are permanently stored in S3. Logs are streamed to the archive in near real time, minimizing data loss in the unlikely event that a machine becomes unrecoverable. The trade-off is that reviewing archived logs takes a little more effort.
Note: create a help desk ticket to obtain the name of your log bucket and server identifiers.
Browsing Collected Logs
To browse by date, use the AWS CLI to list the prefix for the log type you want to view:
# SSH Logs
aws s3 ls s3://LOG-BUCKET/SERVER/ssh/

# App or DB Logs
aws s3 ls s3://LOG-BUCKET/SERVER/containers/
These commands return the years for which logs have been collected. To drill down to a specific month or day, append the next path segment, followed by a slash. For example:
# SSH Logs - browse months
aws s3 ls s3://LOG-BUCKET/SERVER/ssh/2022/

# SSH Logs - browse days
aws s3 ls s3://LOG-BUCKET/SERVER/ssh/2022/09/

# SSH Logs - browse list of logs for a specific day
aws s3 ls s3://LOG-BUCKET/SERVER/ssh/2022/09/30/

# App or DB Logs - browse months
aws s3 ls s3://LOG-BUCKET/SERVER/containers/2022/

# App or DB Logs - browse days
aws s3 ls s3://LOG-BUCKET/SERVER/containers/2022/09/

# App or DB Logs - browse list of logs for a specific day
aws s3 ls s3://LOG-BUCKET/SERVER/containers/2022/09/30/
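If you already know the month you care about, you can also list every object under a prefix in one shot with the --recursive flag. A quick sketch; be aware the output can be long for busy servers:

# List every SSH log object collected in September 2022
aws s3 ls s3://LOG-BUCKET/SERVER/ssh/2022/09/ --recursive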
Retrieving Archived Logs
First, create a directory to hold the downloaded logs:
mkdir fetched_logs; cd fetched_logs
Next, download an entire day of logs like this:
aws s3 cp s3://LOG-BUCKET/SERVER/containers/2022/09/30/ . --recursive
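If you only need some of that day's files, aws s3 cp also accepts --exclude and --include filters. The pattern below is a hypothetical placeholder; adjust it to match however your log objects are actually named:

# Download only objects whose names match a pattern (PATTERN is a placeholder)
aws s3 cp s3://LOG-BUCKET/SERVER/containers/2022/09/30/ . --recursive --exclude "*" --include "*PATTERN*"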
You should see a very rapid download of multiple logs.
Viewing Log Contents
Linux provides several nifty commands that let you browse and search compressed files without decompressing them first.
To page through all of the downloaded logs:
zmore *
To search for a specific string:
zgrep 'my search' *
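Because zgrep passes its options through to grep, the usual grep flags work here too. For example, a case-insensitive search that prints only the names of the files containing a match:

# Case-insensitive search; -l lists only the matching file names
zgrep -il 'my search' *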
If you'd like to combine all of the logs into a single uncompressed file (named "all_logs") so that you can analyze the data with other tools, run:
zcat * > all_logs
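For quick one-off questions you can also skip the intermediate file and pipe zcat straight into other tools, for example to count the total number of log lines for the day:

# Count total log lines across all downloaded files
zcat * | wc -l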
Extracting Useful Data
The awk utility can pull out fields of interest, which you can then summarize with other commands. For example, the following command extracts the unique client IP addresses from a consolidated Nginx log file:
awk '{print $1}' all_logs | sort | uniq
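To go a step further and rank those addresses by request volume, pipe the same output through uniq -c and sort. This sketch assumes the default Nginx combined log format, where the client IP is the first field:

# Top 20 client IPs by number of requests
awk '{print $1}' all_logs | sort | uniq -c | sort -rn | head -n 20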
Cleaning Up
Remove the downloaded logs to conserve storage space on your servers. If you're still inside the fetched_logs directory, step out of it first:
cd .. && rm -rf fetched_logs