Automation of the reconnaissance phase during Web Application Penetration Testing I

There are many tasks that every Penetration Tester and Bug Bounty Hunter repeats during black-box testing of a web application. These repetitive tasks cost a lot of time, and time during a penetration test is usually short. Facing these obstacles, I have created a tool that automates many of these activities and increases work efficiency (it is still in the development phase).

This article describes the workflow that I use during Web Application Penetration Testing with the scope “*.domain.com”. My research is based on the OWASP methodology and the methodology contained in the book “Hack Tricks” by Carlos Polop. For the purposes of this article, let’s assume that all resources included in the “*.domain.com” domain are in our assessment scope. The results of each of the tools listed below will be saved in text files for further processing.

Generally speaking, when the scope of the test covers all of the company’s websites, we are interested in the following resources:

  1. Protocols (scheme)

Before starting work, set up API keys for Subfinder in “$HOME/.config/subfinder/config.yaml”, launch a new project in Burp Suite, and turn off interception.
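For example (a minimal sketch; the exact YAML layout depends on your Subfinder version, so check its documentation before adding keys):

    # Create the config directory if it does not exist yet
    mkdir -p "$HOME/.config/subfinder"
    # Open the config and add your provider API keys (e.g. Shodan, Censys, VirusTotal)
    nano "$HOME/.config/subfinder/config.yaml"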

The tester’s first task is to collect as many subdomains as possible. Start with passive enumeration, that is, without generating any traffic directed at the infrastructure managed by the target organization. Tools used:

  • crt.sh

This process can be automated with a few lines of bash.
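For example (a minimal sketch; the output file names and the use of jq and Subfinder here are my assumptions, not the original commands):

    # Pull every certificate issued for *.domain.com from the crt.sh certificate transparency logs
    curl -s "https://crt.sh/?q=%25.domain.com&output=json" \
      | jq -r '.[].name_value' | sed 's/^\*\.//' | sort -u > crtsh.txt
    # Passive enumeration with Subfinder, using the API keys configured earlier
    subfinder -d domain.com -all -silent -o subfinder.txt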

The next step is to use brute-force techniques to guess subdomain names. Tools used:

  • massdns

This process can be automated with a few lines of bash.
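For example (a minimal sketch; the wordlist and resolver file paths are assumptions):

    # Build candidate host names from a subdomain wordlist
    sed 's/$/.domain.com/' wordlist.txt > candidates.txt
    # Resolve the candidates in bulk with massdns (simple output format)
    massdns -r resolvers.txt -t A -o S -w massdns_raw.txt candidates.txt
    # Keep only the host names that resolved
    awk -F'. ' '{print $1}' massdns_raw.txt | sort -u > bruteforced.txt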

Combine all generated files into one. Filter out duplicates and dead records from the list. Tools used:

  • dnsx

This process can be automated with a few lines of bash.
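For example (a minimal sketch; the input file names come from the previous steps and are assumptions):

    # Merge every subdomain list gathered so far and drop duplicates
    cat crtsh.txt subfinder.txt bruteforced.txt | sort -u > all_subs.txt
    # Keep only the records that still resolve
    dnsx -l all_subs.txt -silent -o live.txt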

Identify new subdomains using mutations of already known subdomains.
Tools used:

  • altdns

This process can be automated with a few lines of bash.
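For example (a minimal sketch; the permutation wordlist path is an assumption, and dnsx is reused here for resolution):

    # Generate permutations and mutations of the subdomains found so far
    altdns -i live.txt -o permutations.txt -w words.txt
    # Resolve the permutations and keep only the live ones
    dnsx -l permutations.txt -silent -o altdns_live.txt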

If time permits, the next step should be the enumeration of third-level subdomains by brute-forcing them, i.e. the brute-forcing steps mentioned above should be repeated for each subdomain found. This step is very time-consuming, so it should be done at the very end of the subdomain enumeration; that way you can start working on the subdomains found so far.
Tools used:

  • puredns

This process can be automated with a few lines of bash.
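For example (a minimal sketch; the wordlist and resolver paths are assumptions):

    # Brute-force third-level names for every live subdomain found so far
    while read -r sub; do
      puredns bruteforce wordlist.txt "$sub" -r resolvers.txt -w "third_level_${sub}.txt"
    done < live.txt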

By now, you should have a fairly large number of subdomains. The next task is to scrape the web for all known URLs related to the subdomains found, in order to extract more subdomains from them. Tools used:

  • gau

This process can be automated with a few lines of bash.
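For example (a minimal sketch; the output file names are assumptions):

    # Collect historical URLs for every live subdomain
    cat live.txt | gau > urls.txt
    # Extract the host names from the URLs to catch subdomains missed earlier
    awk -F/ '{print $3}' urls.txt | cut -d: -f1 | sort -u > url_hosts.txt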

Check for Cross-Origin Resource Sharing (CORS) misconfigurations and subdomain takeover vulnerabilities in all of the subdomains found. Tools used:

  • CorsMe

This process can be automated with a few lines of bash.
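For example (a minimal sketch; I am assuming CorsMe reads URLs from standard input, as in its README, and the subdomain takeover check is not shown here):

    # Prepend a scheme to each live subdomain and test it for CORS misconfigurations
    sed 's|^|https://|' live.txt | CorsMe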

Take a screenshot of each subdomain and store it in the “screens” directory. Tools used:

  • EyeWitness

This process can be automated with a few lines of bash.
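For example (a minimal sketch; the path to the EyeWitness script is an assumption):

    # Screenshot every live subdomain and save the report to the "screens" directory
    ./EyeWitness.py --web -f live.txt -d screens --no-prompt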

Check the status code of each subdomain. Proxy all of the subdomains and URLs found through Burp Suite for further automated scans and manual testing.

Tools used:

  • Burp Suite

This process can be automated with a few lines of bash.
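For example, a minimal sketch using curl (the proxy address assumes Burp Suite's default listener on 127.0.0.1:8080):

    # Request every URL through Burp Suite and record the HTTP status codes
    while read -r url; do
      curl -sk -o /dev/null -w "%{http_code} %{url_effective}\n" -x http://127.0.0.1:8080 "$url"
    done < urls.txt | tee status_codes.txt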

Finally, check whether a DNS zone transfer is possible. Tools used:

  • dnsrecon

This process can be automated with a few lines of bash.
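For example (a minimal sketch; replace domain.com with the target domain):

    # Attempt a DNS zone transfer (AXFR) against the target's name servers
    dnsrecon -d domain.com -t axfr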

At the very end, the IP addresses of all found subdomains should be resolved, and a port scan should be performed on all discovered IP addresses. Tools used:

  • dig

This process can be automated with a few lines of bash.
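For example (a minimal sketch; the tool list above is truncated, so using nmap for the port scan is my assumption):

    # Resolve every live subdomain to its IPv4 addresses
    while read -r sub; do
      dig +short A "$sub"
    done < live.txt | grep -Eo '([0-9]{1,3}\.){3}[0-9]{1,3}' | sort -u > ips.txt
    # Port-scan the discovered addresses
    sudo nmap -iL ips.txt -p- -T4 -oA portscan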

After all of these steps, check the following text files and directory:

  1. live.txt — list of live subdomains

The above process has been automated in a single script called “crimson_recon”. This is one of the three modules of the “crimson” tool, which I am constantly developing and sharing on GitHub. At this point you should have a general overview of the company’s infrastructure. Select one of the domains and start the next module, called “crimson_target”, which will be described in my next article.

References:

  1. https://github.com/Karmaz95/crimson

Penetration Tester | Security Researcher | Bug Bounty Hunter
