Squid

Prompt

Analyze this Squid proxy log to answer the following questions.

squid_access.log (15.9 KB)

Walk-Through

This challenge involves analyzing a Squid proxy log. Basic scripting knowledge is necessary to complete the challenge in a reasonable amount of time.

Use head to see the first few lines of the log. The first field, which is the timestamp, is in an odd format of digits and a decimal point. This is epoch time: the number of seconds elapsed since midnight UTC on January 1, 1970.
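
For example, to preview the first five lines:

head -n 5 squid_access.log
Display the first five lines of the log; the first field on each line is the epoch timestamp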

Converting an epoch timestamp to a human-readable date:

Online tools can be used to convert the timestamp to a human-readable format (see Epoch Converter under question 1 below), or you can use the date command within Linux:
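
date -d @1485907200
Convert an epoch value to a human-readable date; the timestamp shown here is an arbitrary example rather than a value from this log, so substitute one taken from the first field (GNU date prints the result in the local time zone; add -u for UTC)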

Using awk to extract column data:

To answer the second and third questions, looking up the format of a Squid log (https://wiki.squid-cache.org/Features/LogFormat) shows that the field after the timestamp is the time the proxy spent processing the client request, shown in milliseconds. To extract this field, we can use awk '{print $2}' and sort -n to sort the values numerically.
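
From there, head and tail can pull the smallest and largest values straight out of the sorted output:

cat squid_access.log | awk '{print $2}' | sort -n | head -n 1
cat squid_access.log | awk '{print $2}' | sort -n | tail -n 1
Print the fastest and slowest request times, in milliseconds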

The fourth question can be answered by extracting the IP address field, sorting, and counting the number of unique values with awk '{print $3}' | sort | uniq | wc -l.
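
Equivalently, sort -u collapses the duplicate addresses in a single step:

cat squid_access.log | awk '{print $3}' | sort -u | wc -l
Count the distinct client IP addresses that appear in the log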

For more examples of using awk, refer to the Nginx Log Analysis challenge.

Questions

1. In what year was this log saved?

Take any of the Epoch timestamps and convert it into a human-readable date. An online tool, such as Epoch Converter, can be used to do this, or the date command shown in the walk-through.

2. How many milliseconds did the fastest request take?

cat squid_access.log | awk '{print $2}' | sort -n
Extract the second field (the elapsed time in milliseconds) and sort the results numerically; the fastest request is the first value in the output

3. How many milliseconds did the longest request take?

Same as the question above, but read the largest value at the end of the sorted output

4. How many different IP addresses did the proxy service in this log?

cat squid_access.log | awk '{print $3}' | sort | uniq | wc -l
Extract the third field (the IP address of the proxy client), sort, get the unique values, and then get the line count

5. How many GET requests were made?

cat squid_access.log | awk '{print $6}' | sort | uniq -c
Extract the sixth field (the HTTP request method), sort, and then get the unique values with a count of how many times each occurs
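
Alternatively, awk can filter on the method field directly and count the matching lines:

cat squid_access.log | awk '$6 == "GET"' | wc -l
Count only the requests whose sixth field is GET; swapping GET for POST answers the next question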

6. How many POST requests were made?

Same as the question above; read the count shown next to POST (or swap GET for POST in the awk filter)

7. What company created the antivirus used on the host at 192.168.0.224?

The name of the company is found within the URLs of the requests made by 192.168.0.224

cat squid_access.log | grep "192.168.0.224"
Search for any lines that contain the IP address from the question.
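
To make the output easier to scan, the URL column can be pulled out as well (assuming the URL is the seventh field, as in the default native Squid log format):

cat squid_access.log | grep "192.168.0.224" | awk '{print $7}' | sort -u
List the distinct URLs requested by 192.168.0.224; the antivirus company's name appears within them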

8. What URL is used to download an antivirus update?

Use the command from the question above and then find the URL that includes “virus” and “definitions”
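
grep can narrow this down further; the keywords come straight from the hint above:

cat squid_access.log | grep "192.168.0.224" | grep -iE "virus|definitions"
Show only that host's requests whose URLs mention virus or definitions; the update URL should be among these lines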

©️ 2025 Cyber Skyline. All Rights Reserved. Unauthorized reproduction or distribution of this copyrighted work is illegal.