LSE tools › Parsero (290)

Tool and Usage

Parsero is a Python script that analyzes the robots.txt file of a web server. It specifically looks for Disallow entries and checks which of them are actually accessible.

Screenshot for Parsero tool review


Entries that should not be crawled by a web spider are typically listed as Disallow entries in the robots.txt file. Crawlers read this file and skip the listed entries during indexing. These entries are interesting, as they sometimes reveal a lot of information about the web server, such as hidden directories or administrative pages. This tool helps to quickly check which of these entries are accessible.
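The technique Parsero automates can be sketched in a few lines of Python. This is a hypothetical illustration of the approach, not Parsero's own code: parse the Disallow entries out of a robots.txt file, then probe each path with an HTTP request and record the status code (a 200 suggests the "hidden" entry is publicly accessible).

```python
from urllib.parse import urljoin
from urllib.request import Request, urlopen


def parse_disallow(robots_txt: str) -> list[str]:
    """Extract the paths listed in Disallow entries of a robots.txt file."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # an empty Disallow entry allows everything
                paths.append(path)
    return paths


def check_entries(base_url: str, paths: list[str]) -> dict[str, int]:
    """Request each disallowed path and record the HTTP status code."""
    results = {}
    for path in paths:
        req = Request(urljoin(base_url, path), method="HEAD")
        try:
            with urlopen(req, timeout=5) as resp:
                results[path] = resp.status
        except Exception as exc:  # HTTPError carries the 4xx/5xx code
            results[path] = getattr(exc, "code", 0)
    return results
```

For example, `check_entries("https://example.com", parse_disallow(robots_text))` returns a mapping of each disallowed path to its status code; entries that answer with 200 are the ones worth a closer look.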

Usage and audience

Tool review

The review and analysis of this project resulted in the following remarks for this security tool:


  • + The source code of this software is available

Author and Maintainers

Parsero is under development by Javier Nieto.


Supported operating systems

Parsero is known to work on FreeBSD, Linux, macOS, and Microsoft Windows.

Project details
Latest release: 0.81 [2014-09-29]
Last updated: Sept. 17, 2017

Project health

This score is calculated by different factors, like project age, last release date, etc.


Parsero GitHub project

Related terms