
The Wayback URLs Crawler is a Bash script that automates collecting archived URLs from the Wayback Machine for a specified target domain. By querying the Wayback Machine's API, the script fetches historical data about a target website, which can be particularly useful for security researchers, penetration testers, digital archivists, or anyone interested in examining past versions of a website.

The script retrieves a list of URLs that have been archived over time, offering a snapshot of the site’s historical content. This data can help identify past security vulnerabilities, track content changes, or recover lost web pages. The resulting list of URLs is saved to a text file, allowing for easy analysis and further processing.
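The core idea can be sketched with the Wayback Machine's public CDX API, which returns one archived capture per line. This is a minimal illustration, not the script's actual implementation: the function name `build_cdx_url`, the output filename, and the specific query parameters are assumptions chosen for the example.

```shell
#!/usr/bin/env bash
set -euo pipefail

# build_cdx_url: construct a Wayback Machine CDX API query for a domain.
# fl=original keeps only the archived URL column; collapse=urlkey
# de-duplicates repeated captures of the same URL.
build_cdx_url() {
  local domain="$1"
  printf 'https://web.archive.org/cdx/search/cdx?url=%s/*&output=text&fl=original&collapse=urlkey' "$domain"
}

# When run with a domain argument, fetch the URL list and save it to a
# text file for later analysis (requires network access).
if [ "$#" -ge 1 ]; then
  domain="$1"
  outfile="${domain}-wayback-urls.txt"   # hypothetical output name
  curl -s "$(build_cdx_url "$domain")" > "$outfile"
  echo "Saved $(wc -l < "$outfile") URLs to $outfile"
fi
```

Running it as `./crawler.sh example.com` would write the archived URL list to `example.com-wayback-urls.txt`, mirroring the text-file output the script describes.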

https://github.com/lamcodeofpwnosec/Waybash