The first step when looking for bug bounties is to get to know the target.

Here are some of the types of information that should be gathered about your target:

  • Create a list of all the subdomains and IPs that belong to the target
  • Find information about the type of software and services the site uses
  • Do they have a GitHub account?
  • Check the robots.txt file
  • Does the site have any input forms or parameters in the URLs?

Using search engines to do recon

Sometimes search engines will crawl websites and index pages that contain sensitive information. Google dorking is the practice of using specific search queries to narrow the results down to exactly what you are looking for. Dorks can be used to search for certain files, strings, titles, web pages, and so on.

Exploit-DB is a site where people can submit dorks; dorks submitted there can be used as-is or modified to help find information about the site you want to target.

Google dorking is not just useful for hacking; it can also be used in regular searches. For example, let's say we are targeting the site mcdonalds.com and want to find every page Google has indexed for it. We could use the following dork:

site:mcdonalds.com

The dork above tells the search engine that we only want results from the mcdonalds.com domain.

The intitle operator tells the search engine to return only results that include a given string in the page title. A classic use is finding websites whose title contains “Index of”, which usually indicates an exposed directory listing.
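
For example, a dork along these lines (quoting the phrase so it is matched exactly) would surface those pages:

intitle:"Index of"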

The intext operator can be used to find pages that contain certain words. For example, we could tell the search engine to look only for results from lowes.com that contain the text “password”.
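
A dork matching that description might look like:

site:lowes.com intext:password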

Most operators (intitle, site, intext) can be combined with one another to narrow down the results even further.
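
For example, a combined dork (a hypothetical query for our mcdonalds.com example) might look for exposed directory listings on the target that mention passwords:

site:mcdonalds.com intitle:"Index of" intext:password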

Google dorks can also be used to find API keys or other sensitive information that could aid you in finding a vulnerability. When dorking, you do not need to use Google; other search engines can be used as well. Every search engine is different and will probably return different pages.
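
As an illustration, the dork below hunts for exposed .env files, which sometimes leak credentials; the filetype value and the api_key string are guesses about what a target might expose:

site:mcdonalds.com filetype:env intext:api_key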

Checking the robots.txt file for sensitive directories

Most sites have a robots.txt file; its purpose is to tell web crawlers what they are and are not allowed to crawl. It is up to the crawler's creator to honor the robots.txt file.

The robots.txt file can be a great way to see whether any sensitive directories or files are exposed on a site.

For example, the robots.txt file on usa.gov tells crawlers not to crawl the /includes/ directory.
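
You can pull the file directly and skim the Disallow entries; a quick check (the contents of usa.gov's file may have changed since this was written) could look like:

curl -s https://www.usa.gov/robots.txt | grep -i disallow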

Finding subdomains

There are many hacking tools that can be used to find subdomains. One of my personal favorite tools is subdomain3.

When looking for subdomains, it is a good idea to use a couple of different subdomain enumeration tools because they may use different methods of discovering them. One approach is to brute force through a wordlist of popular subdomain names and check which of them resolve. Another is to mine DNS records and other public data sources to see which subdomains show up there.
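
The wordlist approach boils down to a DNS lookup for each candidate name. A bare-bones version of that check (assuming a wordlist file called subdomains.txt and the placeholder domain example.com; dedicated tools are much faster) can even be done with dig:

while read sub; do dig +short "$sub.example.com" | grep -q . && echo "$sub.example.com resolves"; done < subdomains.txt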

Now that we have a list of subdomains, we have widened our net. We can check whether any of the subdomains are vulnerable to subdomain takeover. Subdomain takeovers usually happen because the subdomain has a CNAME record in DNS but no host is actually serving content for it.

A dangling DNS record is one that still points at a hostname the original owner no longer controls. A malicious user could, for example, create an AWS account and claim a resource with the hostname of the old server; traffic to the subdomain would then be routed to the attacker's server, where they could carry out a variety of attacks. There are several tools that can check whether a subdomain is vulnerable to this.
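
A quick manual check is to look at the subdomain's CNAME and see whether it points at a third-party service that no longer has anything provisioned for it (store.example.com below is a made-up subdomain for illustration):

dig +short CNAME store.example.com

If the answer points at something like an unclaimed storage bucket or a deprovisioned hosting instance, the subdomain is worth a closer look.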

Hacking tools like these are cool and can work well, but sometimes you have to look for vulnerabilities manually in order to find one. It is a good idea to get familiar with how vulnerabilities work instead of relying on tools to do the heavy lifting.

Good ole Nmap

Nmap is a great scanning tool, and it can be used in a bunch of different ways, such as:

  • Looking for services running on non-standard ports
  • Determining what services are running on a host
  • Running its wide selection of NSE scripts
  • Using its firewall evasion features
  • Performing different types of scans (XMAS, FIN, connect, idle, UDP, etc.)

nmap -sV -sC 127.0.0.1

The -sV flag runs a service/version detection scan. The -sC flag runs the default NSE scripts, which can give the scanner more information about the services hosted on the server.

nmap -p- 127.0.0.1

The nmap command above will scan all 65535 ports.

nmap 192.168.1.1 -A 

The -A flag enables OS detection, version detection, script scanning, and traceroute.

nmap -Pn --script=dns-brute example.com

The command above uses the dns-brute NSE script to gather subdomains that might exist for the domain.

Playing with user input

The first thing I do when I come across any input field is run through a list of common payloads.

If that does not work, I enter special characters at random and see whether the website responds in a particular way or strips some of the characters out.

If I enter characters like the following:

<script></script>

and the website responds but removes certain characters like this:

<script</script> 

I might then try a string like this:

<script>><</script>

The hope is that the website removes the first > but keeps the second one. Entering random characters and looking at what the site accepts or strips can give you a good idea of how it filters certain characters. It is also a good idea to look at the page's source code to see if there is any JavaScript that reveals how special characters are filtered.
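
One way to see exactly what a filter keeps or strips is to send a probe wrapped in a unique marker string and check how it comes back in the response. A rough sketch with curl (the URL, the q parameter, and the zzqq marker are all placeholders; only probe targets that are in scope):

curl -sG "https://example.com/search" --data-urlencode "q=zzqq'\"<script>zzqq" | grep -o "zzqq.*zzqq"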

URL parameters can also be tested. There are a bunch of different attack vectors that can be tried against them, such as injection, open redirects, and access-control issues like IDOR.
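
As a small example, checking a redirect-style parameter for an open redirect is often as simple as pointing it at a domain you control and watching the Location header (the redirect parameter name and both domains are made up for illustration):

curl -sI "https://example.com/login?redirect=https://attacker-controlled.example" | grep -i '^location:'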

Read other people’s Bug Bounty write ups

No one is good at everything, and the same goes for bug bounties; there is always new stuff to learn. One of the best ways to learn is to read other people's bug bounty reports. Many write-ups can be found on Medium, and several other sites collect bug bounty write-ups as well.

Another good way to pick up bug bounty tips is to follow bug-bounty-related hashtags on Twitter.

There are also good sites that explain how certain vulnerabilities work and let you practice exploiting them in a safe, legal environment.

Never Give up

When you are starting out, it might take weeks or months until you find a bug. But do not give up! You learn as you hack, and you can apply what you have learned to other bug bounty programs.

It’s not where you start, it’s where you finish.

My high school wrestling coach used to say that all the time during practice and meets, and it holds true for almost anything you do. If you keep hacking away and learning, you will likely become an expert on the topic. Remember, though, that no one can be an expert at everything, and earning a bug bounty will take time and a lot of effort.

Maybe spend a whole week targeting a site with a certain vulnerability in mind and try everything you can think of; then, the next week, look for a different type of vulnerability.

Use whiteboards or take notes during the hunt so you can pick up where you left off after a break. Also, if you have been working on something for too long, you can get tunnel vision: the solution might be right in front of you, but because you are so focused on one thing, you miss it. After a break of a day or two, come back with fresh eyes and you might see something you missed or finally figure it out.