.\" Text automatically generated by txt2man
.TH parsero 1 "27 Jan 2020" "parsero-0.0+git20140929.e5b585a" "Audit tool for robots.txt of a site"
.SH NAME
\fBparsero \fP- Audit tool for the robots.txt of a site
.SH SYNOPSIS
.nf
.fam C
\fBparsero\fP [\fB-h\fP] [\fB-u\fP \fIURL\fP] [\fB-o\fP] [\fB-sb\fP] [\fB-f\fP \fIFILE\fP]
.fam T
.fi
.SH DESCRIPTION
Parsero is a free script written in Python which reads the robots.txt file of a web server over the network and inspects its Disallow entries. Disallow entries tell search engines which directories or files hosted on the web server must not be indexed. For example, "Disallow: /portal/login" means that the content of www.example.com/portal/login is not allowed to be indexed by crawlers such as Google, Bing or Yahoo. This is how an administrator avoids sharing sensitive or private information with the search engines.
.SH OPTIONS
.TP
.B \fB-h\fP, \fB--help\fP
Show the help message and exit.
.TP
.B \fB-u\fP \fIURL\fP
The \fIURL\fP to be analyzed.
.TP
.B \fB-o\fP
Show only the results with an "HTTP 200" status code.
.TP
.B \fB-sb\fP
Search in Bing for indexed Disallow entries.
.TP
.B \fB-f\fP \fIFILE\fP
Scan a list of domains read from \fIFILE\fP.
.SH EXAMPLE
Common usage:
.PP
.nf
.fam C
$ parsero -u www.example.com
.fam T
.fi
Using a list of domains from a file:
.PP
.nf
.fam C
$ parsero -f /tmp/list-of-domains.txt
.fam T
.fi
.SH SEE ALSO
\fBlinkchecker\fP(1), \fBproxychains4\fP(1).
.SH AUTHOR
\fBparsero\fP was written by Javier Nieto.
.PP
This manual page was written by Thiago Andrade Marques for the Debian project (but may be used by others).
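.SH NOTES
As an illustration, a robots.txt file containing Disallow entries of the kind parsero reports might look like this (the paths shown are hypothetical examples, not taken from any real site):
.PP
.nf
.fam C
User-agent: *
Disallow: /portal/login
Disallow: /backup/
.fam T
.fi
Running \fBparsero\fP against a site serving this file would check each Disallow path and report its HTTP status, since reachable Disallow paths may expose content the administrator intended to keep out of search engines.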