Crawlex - Crawl websites from your browser, quickly and easily

Crawlex is a powerful Chrome extension that helps bug bounty hunters crawl a web page and extract every URL it references with a single click. This simplifies reconnaissance, making it easier to discover potential vulnerabilities and map the attack surface of in-scope targets.

Features

  • Crawls web pages and extracts all URLs for further analysis and testing (see the sketch after this list).
  • Provides a paginated view of the crawled URLs for easy navigation and organization.
  • Supports customization of the number of URLs displayed per page.
  • Automatically removes duplicate URLs to streamline the testing process.
  • Works on any website, including targets listed on popular bug bounty platforms.
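
To make the crawling and de-duplication features above more concrete, here is a minimal, hypothetical sketch (in TypeScript) of how a content script might collect and de-duplicate URLs from a page. The function names and extraction rules below are assumptions for illustration, not Crawlex's actual implementation:

```ts
// Hypothetical sketch: collecting and de-duplicating URLs from the current
// page. Names and extraction rules are assumptions, not Crawlex's real code.

/** Collect candidate URLs from href/src/action attributes and raw markup. */
function collectUrls(doc: Document): string[] {
  const urls: string[] = [];

  // Attribute-based sources: <a href>, <script src>, <img src>, <form action>, ...
  doc.querySelectorAll<HTMLElement>("[href], [src], [action]").forEach((el) => {
    for (const attr of ["href", "src", "action"]) {
      const value = el.getAttribute(attr);
      if (!value) continue;
      try {
        // Resolve relative paths against the page's base URL.
        urls.push(new URL(value, doc.baseURI).toString());
      } catch {
        // Ignore values that are not valid URLs (e.g. "javascript:void(0)").
      }
    }
  });

  // Absolute URLs embedded in the raw HTML (inline scripts, JSON blobs, comments).
  const urlPattern = /https?:\/\/[^\s"'<>()]+/g;
  urls.push(...(doc.documentElement.outerHTML.match(urlPattern) ?? []));

  return urls;
}

/** Remove duplicates while keeping the order of first appearance. */
function dedupe(urls: string[]): string[] {
  return [...new Set(urls)];
}

const uniqueUrls = dedupe(collectUrls(document));
console.log(`Found ${uniqueUrls.length} unique URLs`);
```

Keeping de-duplication as a separate step means the same list can later be filtered or re-sorted in the popup without re-crawling the page.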

Installation

  1. Clone the Crawlex repository to your local machine or download it as a ZIP file.
  2. Open Google Chrome and go to the Extensions page by entering chrome://extensions in the address bar.
  3. Enable Developer mode by toggling the switch at the top-right corner of the page.
  4. Click on the Load unpacked button and select the directory where you cloned or extracted the Crawlex extension.
  5. The Crawlex extension should now be installed and visible in the list of extensions. You can pin it to the toolbar for easy access.

Usage

  1. Open the web page you want to crawl for URLs.
  2. Click on the Crawlex extension icon in the Chrome toolbar to activate the extension.
  3. Crawlex will automatically start scanning the page for URLs. The progress will be displayed in the extension popup.
  4. Once crawling is complete, the extension presents a paginated view of the extracted URLs (a sketch of the pagination logic follows this list).
  5. Use the pagination controls to browse through the URLs.
  6. Click on a URL to open it in a new tab for further analysis and testing.
  7. Customize the number of URLs displayed per page by adjusting the pagination limit in the extension settings.
  8. Use the search functionality within the extension to quickly find specific URLs.
  9. The extension automatically removes duplicate URLs to ensure efficient testing.
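
The pagination, per-page limit, and search behaviour described in steps 4, 7, and 8 can be pictured with a small sketch. This is illustrative TypeScript under assumed names and defaults, not the extension's actual popup code:

```ts
// Hypothetical sketch of popup-side pagination and search. The interface,
// function name, and parameters are assumptions for illustration.

interface PageView {
  items: string[];     // URLs shown on the current page
  page: number;        // 1-based index of the page actually shown
  totalPages: number;  // total number of pages after filtering
}

/** Return one page of URLs, optionally filtered by a search term. */
function paginate(
  urls: string[],
  page: number,
  perPage: number,   // the per-page limit from step 7
  search = ""        // the search box from step 8
): PageView {
  const filtered = search
    ? urls.filter((u) => u.toLowerCase().includes(search.toLowerCase()))
    : urls;

  const totalPages = Math.max(1, Math.ceil(filtered.length / perPage));
  const current = Math.min(Math.max(1, page), totalPages); // clamp to a valid page
  const start = (current - 1) * perPage;

  return {
    items: filtered.slice(start, start + perPage),
    page: current,
    totalPages,
  };
}

// Example: page 2, 25 URLs per page, filtered to URLs containing "api".
// const view = paginate(uniqueUrls, 2, 25, "api");
// view.items.forEach((url) => console.log(url));
```

Clamping the requested page keeps the view valid when a new search term shrinks the result set.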

Contributing

Contributions to Crawlex are welcome! If you find any issues or have suggestions for improvements, please open an issue in the issue tracker: https://github.com/Defend-X/crawlex/issues. You can also submit pull requests with bug fixes or new features.

When contributing to Crawlex, please make sure to follow the existing code style and conventions. Provide clear and concise descriptions for your changes and ensure that they are well-documented.

License

Crawlex is open-source software released under the MIT License. You are free to use, modify, and distribute the extension in accordance with the terms of the license.

Disclaimer

Crawlex is a tool designed to assist bug bounty hunters. It should be used responsibly and ethically, respecting the terms and conditions of bug bounty programs and all applicable laws. The developers of Crawlex are not responsible for any misuse or illegal activity conducted with this extension.

Credits

Crawlex was developed by the DefendyX team.

Support

For any questions, issues, or feedback about Crawlex, please open an issue at https://github.com/Defend-X/crawlex/issues or contact the developer directly at [email protected]

Happy bug hunting with Crawlex!
