Tool and Usage
Latest release: 0.81
Entries that should not be crawled by a web spider are typically listed as Disallow entries in the robots.txt file. A crawler reads this file and skips the listed paths during indexing. These entries are interesting because they can reveal a lot about the web server, such as paths an administrator would rather keep out of search results. This tool helps to quickly check which of these entries are actually accessible.
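The check described above can be sketched in a few lines of Python: fetch robots.txt, collect the Disallow paths, and request each one to see which respond. This is a minimal illustration of the idea, not Parsero's actual code, and the target URL in the usage stub is a placeholder.

```python
import urllib.error
import urllib.parse
import urllib.request


def parse_disallow(robots_txt: str) -> list[str]:
    """Extract the paths listed in Disallow entries of a robots.txt file."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow entry disallows nothing
                paths.append(path)
    return paths


def check_entries(base_url: str) -> dict[str, int]:
    """Request each disallowed path and record the HTTP status code."""
    robots_url = urllib.parse.urljoin(base_url, "/robots.txt")
    with urllib.request.urlopen(robots_url) as resp:
        robots_txt = resp.read().decode("utf-8", errors="replace")
    results = {}
    for path in parse_disallow(robots_txt):
        url = urllib.parse.urljoin(base_url, path)
        try:
            with urllib.request.urlopen(url) as resp:
                results[url] = resp.status
        except urllib.error.HTTPError as err:
            results[url] = err.code
    return results


if __name__ == "__main__":
    # Placeholder target; point this at a host you are authorized to test.
    for url, status in sorted(check_entries("http://example.com").items()):
        print(status, url)
```

Entries that return HTTP 200 are the interesting ones: they are hidden from search engines yet still reachable by anyone who reads robots.txt.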
Tool review and remarks
The review and analysis of this project resulted in the following remarks for this security tool:
+ The source code of this software is available
Supported operating systems
Parsero is known to work on FreeBSD, Linux, macOS, and Microsoft Windows.
Similar tools to Parsero:
arping is a tool for the discovery of hosts on a computer network using the Address Resolution Protocol (ARP).
arp-scan is a security tool that sends ARP packets to hosts on the local network. Any responses to the requests are displayed.
Oscanner is an Oracle assessment framework to perform enumeration on Oracle installations. It is written in Java and provides a graphical overview of findings.