Tool and Usage
Parsero is a Python script that analyzes the robots.txt file of a web server. It specifically looks for Disallow entries and checks which of them are actually accessible.
Entries that should not be crawled by a web spider are typically placed under Disallow in the robots.txt file. A crawler reads this file and skips the Disallow entries when indexing. These entries are interesting, as they sometimes reveal a lot of information about the web server. This tool helps to quickly check which of these entries are accessible.
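To illustrate the idea, the short sketch below (not Parsero's actual code) fetches robots.txt, collects the Disallow paths, and prints the HTTP status of each one; www.example.com is a placeholder target.

import urllib.request
import urllib.error

BASE_URL = "http://www.example.com"  # placeholder, not a real target

def disallow_entries(base_url):
    # Yield the paths listed on Disallow lines of robots.txt.
    with urllib.request.urlopen(base_url + "/robots.txt") as resp:
        for line in resp.read().decode("utf-8", "replace").splitlines():
            line = line.split("#", 1)[0].strip()  # drop comments
            if line.lower().startswith("disallow:"):
                path = line.split(":", 1)[1].strip()
                if path:
                    yield path

def probe(base_url):
    # Request each Disallow path and report its HTTP status code.
    for path in disallow_entries(base_url):
        url = base_url + path
        try:
            with urllib.request.urlopen(url) as resp:
                print(resp.status, url)
        except urllib.error.HTTPError as err:
            print(err.code, url)

probe(BASE_URL)

A 200 status for a Disallow entry is the interesting result: the path is hidden from search engines, yet still publicly reachable.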
Usage and audience
- The source code of this software is available
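For a quick scan, the tool is pointed at a target host on the command line. The -u flag below reflects common Parsero usage, but options can differ per version, so verify with parsero -h; www.example.com is again a placeholder.

parsero -u www.example.com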
Author and Maintainers
Parsero is under development by Javier Nieto.
Supported operating systems
Parsero is known to work on FreeBSD, Linux, macOS, and Microsoft Windows.