Login

Prompt

Analyze a custom application login event log to help us understand user behavior.

login.log (256.1 KB)

Walk-Through

This challenge involves analyzing a custom application log format that uses tab-delimited columns. The tab-delimited format is well-suited for the cut tool, which extracts specific columns from the log. cut can be used in combination with several other Linux command line utilities to obtain the answers to the questions.

Using head and tail to see the first few or last few lines:

To start, use ls to list the files in the directory; you should see login.log. The cat command can be used to display the contents of the file. Log files can be quite long, so to avoid having to scroll back up through many lines, use head or tail to see just the first few or last few lines. With no arguments, each displays 10 lines by default:

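As a sketch, here is how head and tail behave against a small hypothetical sample file (sample.log) in login.log's tab-delimited layout — date and time, IP address, username. The actual contents of login.log will differ:

```shell
# Create a tiny hypothetical sample in login.log's layout:
# date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n2023-03-02 11:01:20\t10.0.0.7\tcarol\n' > sample.log

head -n 2 sample.log   # first two lines of the file
tail -n 2 sample.log   # last two lines of the file
```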

This can be helpful for log files that have column headers - using head instead of cat will display the column names and the first few lines of data.

Counting words or lines in the output:

Piping the output to the wc command (short for word count) with the -l flag (lowercase L, for “lines”) will count the lines in the output:

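A minimal sketch, again using a hypothetical sample file in login.log's tab-delimited layout:

```shell
# Hypothetical three-line sample in login.log's layout (the real file is much longer)
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n' > sample.log

wc -l sample.log         # prints the line count followed by the filename
cat sample.log | wc -l   # prints the line count alone
```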

Display only one column with cut:

To display only the usernames, use the cut command with the -f flag to extract field 3 (the username column). The default delimiter for cut is a tab character.

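Sketched against a hypothetical sample file whose third tab-delimited field is the username:

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n' > sample.log

cut -f 3 sample.log   # username column only: alice, bob, alice
```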

Sorting a list alphabetically and displaying unique output:

The usernames can be sorted alphabetically by piping the output through the command sort:

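Continuing the sketch with the same kind of hypothetical sample data:

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n2023-03-02 11:01:20\t10.0.0.7\tcarol\n' > sample.log

cut -f 3 sample.log | sort   # usernames in alphabetical order: alice, alice, bob, carol
```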

Some usernames are listed twice. To list only the unique entries, use the uniq command.


The -c flag will show the number of times an entry occurs in the output:

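Both steps sketched together on a hypothetical sample file (in this sample, alice appears twice):

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n2023-03-02 11:01:20\t10.0.0.7\tcarol\n' > sample.log

cut -f 3 sample.log | sort | uniq      # each username listed once
cut -f 3 sample.log | sort | uniq -c   # occurrence count next to each username
```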
💡 Please note that uniq -c without sort will yield a different (and incorrect) result, because uniq -c only counts consecutive duplicate lines. If the same line appears multiple times but not next to each other, uniq -c cannot identify them; sort puts all identical lines next to each other, allowing uniq -c to count them properly.
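A quick sketch of the pitfall with throwaway data:

```shell
# Without sort, the two 'alice' lines are not adjacent, so each gets its own count
printf 'alice\nbob\nalice\n' | uniq -c

# With sort, identical lines become adjacent and are counted together
printf 'alice\nbob\nalice\n' | sort | uniq -c
```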

This list can be sorted again, this time numerically, with the -n flag:

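The full pipeline, sketched on a hypothetical sample file; since sort -n sorts in ascending order, the most frequent entry lands on the last line:

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n2023-03-02 11:01:20\t10.0.0.7\tcarol\n' > sample.log

cut -f 3 sample.log | sort | uniq -c | sort -n   # least frequent first; most frequent last
```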

Other features of cut:

The output can be piped through cut -f 1,3 to display the first column (Date and Time) and the third column (usernames):

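Sketched on a hypothetical sample file; the selected fields remain tab-separated in the output:

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-01 09:30:02\t10.0.0.6\tbob\n2023-03-02 10:15:33\t10.0.0.5\talice\n' > sample.log

cut -f 1,3 sample.log   # date+time and username columns, skipping the IP address
```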

To display only the date (without the timestamp), use cut -d " " -f 1. This tells cut to split the line by spaces (instead of the default tab) and extract the first field:

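Sketched on a hypothetical sample file, where the first tab-delimited field holds a space-separated date and time:

```shell
# Hypothetical sample: date+time <TAB> IP address <TAB> username
printf '2023-03-01 08:12:45\t10.0.0.5\talice\n2023-03-02 10:15:33\t10.0.0.5\talice\n' > sample.log

cut -f 1 sample.log | cut -d " " -f 1   # date portion only, timestamp dropped
```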

Questions

1. How many total login attempts were made in this log?

cat login.log | wc -l
Get the line count of the log

2. How many unique usernames appear in this log?

cat login.log | cut -f 3 | sort | uniq | wc -l
Extract the third field (with the usernames) of the log, sort the usernames, get the unique usernames, and then get a line count of the number of unique usernames

3. What is the username with the most login attempts?

cat login.log | cut -f 3 | sort | uniq -c | sort -n
Extract the third field (with the usernames) of the log, sort the usernames, get a frequency count of each unique username, and then sort the unique usernames by frequency. Since sort -n sorts in ascending order, the username with the most attempts appears on the last line

4. How many attempts were made for the username with the most login attempts?

Use the same command as in the question above; the count appears next to the username on the last line of the output

5. What is the date with the most login attempts?

cat login.log | cut -f 1 | cut -d " " -f 1 | sort | uniq -c | sort -n
Extract the first field (with the date+time) of the log, extract just the date, sort the dates, get a frequency count of each unique date, and then sort the unique dates by frequency

6. What is the username that had logins from the most unique IP addresses?

cat login.log | cut -f 2,3 | sort | uniq | cut -f 2 | sort | uniq -c | sort -n
Extract the second field (with the IP address) and third field (with the username) of the log, sort the IP/username pairs, and keep only the unique pairs. Then extract just the usernames from each pair, sort them, get a frequency count of how many unique pairs each username has, and sort by frequency

©️ 2025 Cyber Skyline. All Rights Reserved. Unauthorized reproduction or distribution of this copyrighted work is illegal.