Challenge 2
Since the challenge statement asks us not to behave like a bot, this must be related to how bots on the internet work.
The traffic on the website was being generated by Google's web crawler (Googlebot). To reduce this kind of traffic, you forbid crawlers from reading certain directories and files on your server. This is done through a robots.txt file at the server root that tells crawlers which paths they may visit and which they may not.
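As a sketch, a robots.txt like the one below would hide two paths from well-behaved crawlers (the paths here are made up for illustration, not the actual challenge paths):

```
User-agent: *
Disallow: /secret/
Disallow: /hidden/flag.txt
```

Note that robots.txt is purely advisory: it asks crawlers to stay away, but the listed paths remain publicly reachable, which is exactly why it leaks hidden locations to a human reader.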
On reading robots.txt, you will find the hidden directories. Visiting those directories and files gives you the flag.
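The manual step of scanning robots.txt for interesting paths can be sketched in a few lines of Python. The sample content below is hypothetical; in the challenge you would paste the real robots.txt body (or fetch it with `urllib.request`):

```python
# Extract the Disallow'ed paths from a robots.txt body -- these are the
# directories worth visiting by hand. Sample content is made up.
sample = """User-agent: *
Disallow: /secret/
Disallow: /hidden/flag.txt
"""

def disallowed_paths(robots_txt: str) -> list[str]:
    paths = []
    for line in robots_txt.splitlines():
        # Directives are case-insensitive per the robots.txt convention.
        if line.lower().startswith("disallow:"):
            paths.append(line.split(":", 1)[1].strip())
    return paths

print(disallowed_paths(sample))
```

Each printed path is a candidate URL to open in the browser when hunting for the flag.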