Fingerprinting

This page covers more active enumeration and brute-force options. For passive recon and useful web enumeration tools, see the "OSINT Websites & Domains" section.

https://book.hacktricks.xyz/network-services-pentesting/pentesting-web

Web Server, Technologies and Versions

If virtual hosting is enabled, or we discover the real IP behind a WAF, we should do a reverse lookup using the target itself as the DNS server:

nslookup

server {IP}

{IP}

Then we should add the result to /etc/hosts (Linux) or C:\Windows\System32\drivers\etc\hosts (Windows) --> {IP} {Domain}
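
For example, assuming the discovered IP is 10.10.10.10 for example.com (both placeholders):

echo "10.10.10.10 example.com" | sudo tee -a /etc/hosts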

- whatweb

Stealthy:

whatweb -a 1 <URL>

Aggressive:

whatweb -a 3 <URL>
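
whatweb can also save results for later review, e.g. as JSON (the output filename is arbitrary):

whatweb -a 3 --log-json=whatweb.json <URL>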

- webtech

webtech -u <URL>

- webanalyze

webanalyze -host https://google.com -crawl 2

- Wappalyzer

Check the Wappalyzer Firefox extension.

- Builtwith

http://builtwith.com/site.com

- Firebug

We should review the source code viewable in the browser: JavaScript, CSS, HTML and comments.

Look at the source code with the Firebug extension (its features now live in the Firefox Developer Tools).
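
As a quick command-line complement, a minimal sketch with curl and grep (example.com is a placeholder) to extract HTML comments and referenced scripts:

curl -s https://example.com | grep -oE '<!--[^>]*-->'

curl -s https://example.com | grep -oE 'src="[^"]*\.js[^"]*"'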

- Burp

Check raw responses.

Software Version Reporter Addon: https://portswigger.net/bappstore/ae62baff8fa24150991bad5eaf6d4d38

Retire.js Addon: https://portswigger.net/bappstore/36238b534a78494db9bf2d03f112265c

- Anonymous Proxies

To detect anonymous proxies:

nmap --script="http-open-proxy" --script-args="proxy.url=www.debian.org,proxy.pattern=The Universal Operating System" www.example.com -p8123,3128,8000,8080 -PN

SSL Analysis

When looking at SSL, we should check the version, the supported algorithms, the key length and perform a certificate analysis.

- sslscan

https://github.com/rbsec/sslscan

To verify SSL/TLS-enabled services.

- testssl

https://github.com/drwetter/testssl.sh

To test TLS/SSL encryption anywhere on any port.
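
For example, to check only for known vulnerabilities on a non-standard port (host and port are placeholders):

./testssl.sh -U example.com:8443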

SSL Scanner Burp Addon is based on this tool: https://portswigger.net/bappstore/474b3c575a1a4584aa44dfefc70f269d

- Supported algorithms

Check for null and weak ciphers: https://www.ssllabs.com/ssltest/analyze.html?d=site.com

- Key Length

Flag key lengths shorter than 1024 bits as weak.
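
A minimal sketch to read the public key length with openssl (example host):

echo | openssl s_client -connect example.com:443 2>/dev/null | openssl x509 -noout -text | grep 'Public-Key'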

- Certificate analysis

Check CA, validity and site name

To look at the certificate:

openssl s_client -connect {IP:PORT}
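
To print just the issuer, subject and validity dates (adding -servername for SNI; example.com is a placeholder):

echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null | openssl x509 -noout -issuer -subject -dates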

If we have a CSR (Certificate Signing Request), to get info:

openssl req -in {request file} -text

HTTP Headers

We should check the headers to identify software versions, missing security headers, insecure cookies, information leaked in error responses, etc.

- httprecon

- httprint

- netcraft

http://uptime.netcraft.com/up/graph/?host=site.com

http://searchdns.netcraft.com/?host=site.com&x=11&y=11

- Shodan

https://www.shodan.io/search?query=site.com

- nmap

nmap -sV --version-all -p80,443 {direct IP}

- To identify the lack of security headers

Analyze responses and check whether security headers such as Strict-Transport-Security, Content-Security-Policy, X-Frame-Options, X-Content-Type-Options and Referrer-Policy are missing.
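
A quick sketch to eyeball them with curl (example host):

curl -sI https://example.com | grep -iE 'strict-transport-security|content-security-policy|x-frame-options|x-content-type-options|referrer-policy'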

- Cookies Flags

Look at the Secure, HttpOnly and SameSite attributes and the expiry (check that cookies do not have an excessively long lifetime).

Tools to check cookie flags:

https://github.com/Sinkmanu/cookiescanner/tree/master

https://www.site24x7.com/tools/secure-cookie-tester.html

https://domsignal.com/secure-cookie-test

Also try cookie hijacking (e.g. with EditThisCookie): reuse an authenticated session cookie in a different browser to detect whether the session is vulnerable.

Try decoding cookies: https://cyberchef.org/

If JWTs are present, try manipulating them: https://jwt.io
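
The header and payload of a JWT are just base64url-encoded JSON, so they can also be decoded locally; a minimal sketch assuming the token is in $JWT (base64 padding warnings can be ignored):

echo "$JWT" | cut -d '.' -f2 | tr '_-' '/+' | base64 -d 2>/dev/null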

- Headers Analyzer Burp Extension

https://portswigger.net/bappstore/8b4fe2571ec54983b6d6c21fbfe17cb2

HTTP Methods

Check for potentially dangerous methods: OPTIONS, TRACE, TRACK, PUT, DELETE, CONNECT.

nmap --script=http-methods.nse --script-args http-methods.retest=1 192.168.10.0/24

nmap -p 443 --script http-methods --script-args http-methods.url-path='/path' {IP}

httpmethods tool: https://github.com/ShutdownRepo/httpmethods

python3 httpmethods.py -u http://www.example.com/
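
Individual methods can also be tested manually with curl (example host):

curl -s -i -X OPTIONS https://example.com/ | grep -i '^allow'

curl -s -i -X TRACE https://example.com/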

Information Disclosure

https://taksec.github.io/google-dorks-bug-bounty/

- Check classic files

robots.txt, .DS_Store, crossdomain.xml and clientaccesspolicy.xml, sitemap.xml, security.txt and humans.txt.
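
A quick loop to check which of them exist (example host; note security.txt often lives under /.well-known/):

for f in robots.txt .DS_Store crossdomain.xml clientaccesspolicy.xml sitemap.xml security.txt humans.txt; do curl -s -o /dev/null -w "%{http_code} /$f\n" "https://example.com/$f"; done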

- Passwords in public files

site:domain.com

cache:site:domain.com

"site.com" hack | password | "your password is" ...

filetype:DOC OR filetype:PDF OR filetype:XLS OR ... site:domain.com

intext:(password | passcode | pass)

intext:(username | userid | user)

./theHarvester.py -d site.com -b google -v

- Sensitive Directories

We could try locating some of these through Google dorks, OSINT or fuzzing.

site:site.com intitle:index.of

intitle:index.of.admin

intitle:index.of inurl:admin

- Cross-domain Referer leakage

When a web browser makes a request for a resource, it typically adds an HTTP header, called the "Referer" header, indicating the URL of the resource from which the request originated.

If the resource being requested resides on a different domain, then the Referer header is still generally included in the cross-domain request.
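
To spot pages embedding cross-domain resources (to which the Referer would leak), a rough sketch with curl and grep (example.com is a placeholder):

curl -s https://example.com/ | grep -oE '(src|href)="https?://[^"]+"' | grep -v 'example\.com' | sort -u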

Spidering/Fuzzing/Brute force directories and files

Once the OSINT techniques are exhausted, we could try locating sensitive directories, files and subdomains by fuzzing.

If we locate forbidden (403) directories, we could still try locating files inside them.

Dictionaries

raft-large-directories-lowercase.txt

directory-list-2.3-medium.txt

RobotsDisallowed/top10000.txt

- Others

https://github.com/carlospolop/Auto_Wordlists/blob/main/wordlists/bf_directories.txt

Dirsearch included dictionary

http://gist.github.com/jhaddix/b80ea67d85c13206125806f0828f4d10

Assetnote wordlists

https://github.com/random-robbie/bruteforce-lists

https://github.com/google/fuzzing/tree/master/dictionaries

https://github.com/six2dez/OneListForAll

/usr/share/wordlists/dirb/common.txt

/usr/share/wordlists/dirb/big.txt

/usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt

Extensions

Technologies: .java, .cs, .php, ...

Backups: .zip, .rar, .tar, .tar.gz, .gz, .tar.bz2, .tgz, .7z, ...

Temporary: .ext~, .ext~1, .tmp, ...

Others: .txt, .bak, .src, .inc, ...

Tools

- Burp Suite Pro

- wfuzz

To search for directories:

wfuzz -c --hc=404 -t 200 -w /usr/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt https://example/pathexample/FUZZ (FUZZ marks where the payload will be injected)

wfuzz -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt https://domain.com/api/FUZZ

To search for subdomains:

wfuzz -c --hc=404 -t 200 -w /usr/SecLists/Discovery/DNS/subdomains-top1million-5000.txt -H "Host: FUZZ.example.com" http://example.com

To search for files:

wfuzz -c --hc=404 -t 200 -w /usr/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt -z list,php-html https://example/pathexample/FUZZ.FUZ2Z

wfuzz -c --hc=404 -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -z list,php-aspx http://192.168.1.X/design/FUZZ.FUZ2Z

File extensions can also be added in a file and wfuzz will fuzz all those extensions:

wfuzz -c --hc=404 -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -z file,extensions http://192.168.1.X/design/FUZZ.FUZ2Z
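
The extensions file is just one extension per line, for example (filename and contents are arbitrary):

printf 'php\nhtml\nbak\nzip\n' > extensions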

- gobuster

To search for dirs and files:

gobuster dir -u http://example.com -w /usr/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt -t 200 (-x to search for file extensions)

gobuster dir -e -u http://example.com -w /usr/share/wordlists/dirb/common.txt

gobuster dir -u http://example.com/ -w /usr/share/wordlists/dirb/common.txt -x php,html,aspx > dirs-files.txt

To list subdomains:

gobuster vhost -u http://example.com -w /usr/SecLists/Discovery/DNS/subdomains-top1million-5000.txt

- ffuf

To list subdomains:

ffuf -t 200 -w /usr/SecLists/Discovery/DNS/subdomains-top1million-5000.txt -H "Host: FUZZ.example.com" http://example.com -fs 169
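
ffuf handles directories and files the same way; for example (wordlist path and match codes are illustrative):

ffuf -t 200 -w /usr/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt -u https://example.com/FUZZ -mc 200,301,302,403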

- Dirsearch

To obtain interesting endpoints:

dirsearch -l subdomains_alive.txt -x 500,502,429,404,400 -R 5 --random-agent -t 100 -F -o directories.txt -w /home/coffinxp/oneforall/onelistforallshort.txt

- Paramspider, GAU, Katana and URO

https://github.com/devanshbatham/ParamSpider

https://github.com/lc/gau

https://github.com/projectdiscovery/katana

https://github.com/s0md3v/uro

These tools will help us obtain parameters within URLs.

With GAU and URO:

cat subdomains_alive.txt | gau > urls-params.txt

cat urls-params.txt | uro -o filtered-params.txt

With Paramspider:

paramspider -d example.com --subs

paramspider -l urls.txt

With Katana:

cat subdomains_alive.txt | katana -f qurl > fuzz_endpoints.txt
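
Katana can also crawl a single target directly; a minimal example (the depth value is arbitrary):

katana -u https://example.com -d 3 -o endpoints.txt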
