Website URL Scanner is a simple command-line tool that scans a website and extracts every URL it finds. It is useful for tasks such as link analysis or checking for broken links.

## Features
- Easy-to-use command-line interface
- Fast and efficient URL extraction
- Supports both HTTP and HTTPS URLs
## Prerequisites
Ensure you have PowerShell installed on your system.
## Installation
Clone this repository:
```
git clone https://github.com/dotdesh71/website-url-scanner.git
```

## Usage

Follow the on-screen instructions:
1. Run the `scrape_website.bat` file.
2. Enter the domain name of the website when prompted.
3. Wait for the tool to scan the website and extract the URLs.
4. View the extracted URLs in the `output.txt` file.
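The actual scanning is done by the tool's PowerShell script, but as a rough illustration of the idea (a hypothetical Python sketch, not the tool's real implementation), fetching a page, pulling out every HTTP/HTTPS URL, and writing the results to `output.txt` could look like this:

```python
import re
import urllib.request

def extract_urls(html: str) -> list:
    """Return all HTTP/HTTPS URLs found in an HTML string."""
    # Match http:// or https:// followed by characters that cannot
    # appear unescaped inside a URL in markup (whitespace, quotes, angle brackets).
    return re.findall(r'https?://[^\s"\'<>]+', html)

def scan_website(url: str, outfile: str = "output.txt") -> list:
    """Fetch a page and save every URL found in it, one per line."""
    with urllib.request.urlopen(url) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    urls = extract_urls(html)
    with open(outfile, "w") as f:
        f.write("\n".join(urls))
    return urls

if __name__ == "__main__":
    sample = '<a href="https://example.com/page">link</a> <img src="http://example.com/img.png">'
    print(extract_urls(sample))
    # → ['https://example.com/page', 'http://example.com/img.png']
```

Both schemes listed under Features are covered by the single `https?://` pattern; the helper names (`extract_urls`, `scan_website`) are invented for this sketch.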
## License

This project is licensed under the MIT License - see the LICENSE file for details.

## Contributions
Contributions are welcome! Feel free to open an issue or submit a pull request.

## Support
If you encounter any issues or have questions, please open an issue.

## Acknowledgments
This tool uses PowerShell for web scraping.
If you need custom enhancements or additional features for this tool, you can hire me on Fiverr.