Cyber Skyline Trove

Cloud Marketing

Prompt

A hot new firm just opened up to help organizations with digital marketing, but it seems like they need some help with security. Identify the security vulnerability and demonstrate to the firm that their web server is not securely configured by proving that you can extract the contents of /etc/flag from their web server.

Note: Your scope is limited to HTTPS & you may not use automated bruteforce tools for this challenge.

Walk-Through

In this challenge, you are given questions to guide you through a small organization’s webpage and direct you to find a security vulnerability.

With any web application challenge, start by loading the website and manually crawling through the pages, inspecting the HTML, JavaScript, and network requests.

In this case, after some manual browsing it's clear that there isn't a lot of interactivity on this page, other than being able to access individual content pages like Team, About, etc.

💻

All web-based challenges should be opened in a separate window to easily view the developer tools and to reduce confusion between resources on the Cyber Skyline website and the actual challenge website.

Guide

The first two questions for this challenge can be solved by exploring the site. To find how many people work at the organization, navigate to the Team page and count the people visible on it.

The Team page is found at this path

To determine what “path the firm does not want search engines to index”, we’ll need to understand what that means. Any time there's a question about what a domain administrator doesn't want search engines to index, it’s referencing the robots.txt file. This file, by default, lives at the root of the domain at /robots.txt. The file is responsible for telling search engines and other web spiders/crawlers what they should do on this specific website.

robots.txt, however, relies on “voluntary compliance”. Servers do not block access to disallowed pages, browsers do not enforce robots.txt rules, and malicious bots may be designed to ignore this file. The creation of this file is not a security feature, it simply may reduce traffic to certain pages because the pages are not listed by search engines.
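Python's standard library ships a robots.txt parser that illustrates this voluntary-compliance model: the rules only matter if the client chooses to consult them. The file contents below are purely illustrative; the challenge's real robots.txt (and its disallowed path) is different.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt, for illustration only.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Compliance is voluntary: only well-behaved crawlers check these rules
# before fetching. Nothing stops a browser or attacker from ignoring them.
print(parser.can_fetch("MyCrawler", "/private/secret.html"))  # False
print(parser.can_fetch("MyCrawler", "/team.html"))            # True
```

The server itself never runs this check; it happily serves /private/secret.html to anyone who asks directly.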

Add /robots.txt to the end of the URL and press Enter to go to the page.

Here the /robots.txt file contains:

The answer is redacted.

A User-agent value of * specifies that the rules that follow apply to all spiders/bots.

Disallow is the directive that tells crawlers not to index the specified page.

Now for the task of finding the flag. Remember that servers do not block access to disallowed pages, so let's investigate the page by appending it to the base URL. In this specific case, the specified page is not found. If the page existed, its contents would be visible instead of an error message.


Return to the challenge description for more information. The description specifies "... that you can extract the contents of /etc/flag from their web server." This is asking for a file read from the server. The first type of vulnerability to test for here would be a directory traversal (AKA path traversal).

Directory traversal vulnerabilities occur when a web application accepts user-controlled file paths without proper validation. When a specific page (like index.html) is requested, the web server doesn't verify that the request stays within its own context, which allows an attacker to access files outside of the intended directory structure. The classic directory traversal payload requests a file in a folder closer to the root of the file system, like ../../../etc/passwd. For more detailed information about how this vulnerability works, see OWASP's Path Traversal page at owasp.org.
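To see why those ../ sequences escape the web root, consider what happens when a naive server joins user input onto its document root and the path is then resolved. A minimal sketch, with a hypothetical document root:

```python
import posixpath

# Hypothetical document root for a web server.
docroot = "/var/www/html"
payload = "../../../etc/passwd"

# A naive server concatenates the user-supplied path onto its root...
joined = posixpath.join(docroot, payload)

# ...and the filesystem resolves each ../ one level upward,
# walking entirely out of the web root.
resolved = posixpath.normpath(joined)
print(resolved)  # /etc/passwd
```

Three ../ steps cancel out the three directories in /var/www/html, so the request lands at the filesystem root and reads /etc/passwd.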

While navigating around the site you may have noticed that the URL for each page wasn't a standard path like http://example.com/team; instead it had query parameters attached. For example, the Team page is: https://[uid]-marketing.web.cityinthe.cloud/?view=team.html&uid=[uid]

The key parameter to look at here is ?view=team.html. This suggests that the web server is being pointed to a specific file in a way that is controllable by the user. We can test whether this is the case by changing team.html to ./ and observing the response.

If you get a generic server error or the site redirects without doing anything, that's a sign it isn't vulnerable. A site may be vulnerable to directory traversal if you get:

  • an actual file, or
  • a type of "File not found" error

Making the request we see:

Request for https://[uid]-marketing.web.cityinthe.cloud/?view=./&uid=[uid]

… A “Not Found” error. So we likely have a directory traversal vulnerability!

To test for it properly, we know there's a file /etc/flag that we need to read. Now all we need to do is incrementally increase the number of parent directories (../) in the payload until we can read the file. The first request would use etc/flag, the second ../etc/flag, the third ../../etc/flag, and onwards until you either get a response or go beyond a reasonable number of iterations (~10 is a good limit).
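The candidate payloads at each depth can be sketched as below. Under the challenge rules this probing is done by hand in the browser; the function only shows how each path is formed, and the function name is an illustrative assumption.

```python
def traversal_payloads(target="etc/flag", max_depth=10):
    """List traversal payloads from depth 0 up to max_depth.

    Each additional ../ steps one directory closer to the
    filesystem root before descending into the target path.
    """
    return ["../" * depth + target for depth in range(max_depth + 1)]

for payload in traversal_payloads(max_depth=3):
    print(payload)
# etc/flag
# ../etc/flag
# ../../etc/flag
# ../../../etc/flag
```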

Using the payload ../../../etc/flag yields the flag:

https://[uid]-marketing.web.cityinthe.cloud/?view=../../../etc/flag&uid=[uid]

The answer is redacted. The flag generated is unique to each user.

Useful tools for this challenge:

  • robots.txt Wikipedia page: https://en.wikipedia.org/wiki/Robots.txt
  • OWASP path traversal: https://owasp.org/www-community/attacks/Path_Traversal
  • Use the Tutorial Video below

Tutorial Video

Watch our full Tutorial Video to learn more specifics about directory traversal and see a walkthrough of how to solve this challenge:

Cyber Skyline Live: Traversing the Web

In Cyber Skyline Live - Traversing the Web, you'll learn from Franz Payer, CEO of Cyber Skyline, about how to find common web security vulnerabilities. This video is for educational purposes only.


Questions

1. How many people are listed on their firm's team page?

2. What path does the firm not want search engines to index?

3. What is the value of the flag?

4. What is the common name of the vulnerability used to retrieve the flag?

©️ 2026 Cyber Skyline. All Rights Reserved. Unauthorized reproduction or distribution of this copyrighted work is illegal.