Bug Bounty Recon (Part-4)

Aswin Thambi Panikulangara
3 min read · Jun 19, 2023

Waybackurls And Discovering Parameters

Waybackurls

Archived web pages refer to copies or snapshots of websites captured at specific points in time. These snapshots are created and stored by web archiving services, such as the Wayback Machine by the Internet Archive. Archiving web pages allows for the preservation of web content, enabling users to access and reference past versions of websites.

Discovering waybackurls

Waybackurls is a command-line tool that extracts URLs from the Wayback Machine’s archived web pages. It allows you to retrieve historical snapshots of websites and gather valuable information for bug bounty recon and other security testing purposes. Here’s how you can use Waybackurls:

waybackurls example.com


You can use the following command to save the output to a file called urls.txt:

waybackurls example.com > urls.txt

Now you can use grep to find sensitive endpoints and information disclosed in the archives.


You can use any keywords, such as admin, user, email, token, keys, etc.
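As a quick sketch of this step, here is a grep over a small stand-in file; the URLs and the keyword list are illustrative, not output from a real target:

```shell
# Demo: a few sample archived URLs standing in for real waybackurls output
printf '%s\n' \
  'https://example.com/admin/login' \
  'https://example.com/blog/post?id=1' \
  'https://example.com/api/token/refresh' > urls.txt

# Case-insensitive search for keywords that often mark sensitive endpoints
grep -iE 'admin|user|email|token|key|secret' urls.txt
```

Here only the admin and token lines match; against a real urls.txt with thousands of archived URLs, the same one-liner surfaces candidate endpoints to review manually.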

Finding parameters from Waybackurls

By discovering and analyzing parameters, you can assess how they are handled, validated, or sanitized by the application. This can lead to the identification of vulnerabilities such as SQL injection, cross-site scripting (XSS), command injection, path traversal, and more.

You can use the following regex for grepping parameters and saving them as a wordlist, removing duplicates along the way:

grep -oP '(?<=\?|&)\w+(?==|&)' urls.txt | sort -u

As you can see, this builds a wordlist of parameter names from a list of URLs.
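To make the extraction concrete, here is the same pipeline run against a few sample URLs (a stand-in for real waybackurls output); the lookbehind `(?<=\?|&)` anchors on `?` or `&`, and the lookahead `(?==|&)` keeps only names followed by `=` or `&`:

```shell
# Demo input: archived URLs with query strings (stand-in for real output)
printf '%s\n' \
  'https://example.com/search?q=test&page=2' \
  'https://example.com/item?id=5' \
  'https://example.com/search?q=other' > urls.txt

# Pull out parameter names and de-duplicate them into a wordlist
grep -oP '(?<=\?|&)\w+(?==|&)' urls.txt | sort -u > params.txt
cat params.txt
```

This yields the unique parameter names id, page, and q; the resulting params.txt can be fed straight into fuzzing tools as a target-specific wordlist.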

Conclusion

Using this technique I have earned a lot of bounties for bugs like information disclosure, broken access control, etc. Give it a try.

For example, I was able to access invoices sent by a company to its clients.

Also, please suggest what I should write about in upcoming blogs.

You can now buy me a coffee: https://www.buymeacoffee.com/r0074g3n7

In my previous blogs I have covered the following:

Part 1

Part 2

Part 3
