.TH datalad-crawl 1 2016\-11\-10 "datalad-crawl 0.4.1"
.SH SYNOPSIS
\fBdatalad\-crawl\fR [--version] [-h] [-l LEVEL] [-p {condor}] [--is-pipeline] [-t] [-r] [-C CHDIR] [file]
.SH DESCRIPTION
Crawl an online resource to create or update a dataset.
.br
.br
Examples:
.br
.br
$ datalad crawl  # within a dataset having .datalad/crawl/crawl.cfg
.SH OPTIONS
.TP
file
configuration (or pipeline, if --is-pipeline) file defining the crawling, or a directory of a dataset on which to perform crawling using its standard crawling specification. Constraints: value must be a string [Default: None]
.TP
\fB--version\fR
show the program's version and license information
.TP
\fB-h\fR, \fB--help\fR, \fB--help-np\fR
show this help message. --help-np forcefully disables the use of a pager for displaying the help message
.TP
\fB-l\fR LEVEL, \fB--log-level\fR LEVEL
set the logging verbosity level. Choose among critical, error, warning, info, debug. You can also specify an integer <10 to get even more debugging information
.TP
\fB-p\fR {condor}, \fB--pbs-runner\fR {condor}
execute the command by scheduling it via the available PBS. The config file will be consulted for settings
.TP
\fB--is-pipeline\fR
flag if the provided file is a Python script which defines pipeline(). [Default: False]
.TP
\fB-t\fR, \fB--is-template\fR
flag if the provided value is the name of the template to use. [Default: False]
.TP
\fB-r\fR, \fB--recursive\fR
flag to crawl subdatasets as well (for now, serially). [Default: False]
.TP
\fB-C\fR \fI\s-1CHDIR\s0\fR, \fB--chdir\fR \fI\s-1CHDIR\s0\fR
directory to chdir to for crawling. Constraints: value must be a string [Default: None]
.SH AUTHORS
datalad is developed by The DataLad Team and Contributors.
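.SH EXAMPLES
The invocations below are illustrative sketches built only from the options documented above; TEMPLATE is a placeholder, not a template name shipped with datalad.
.br
.br
$ datalad crawl .datalad/crawl/crawl.cfg  # crawl using an explicitly given crawling configuration file
.br
.br
$ datalad crawl -r  # also crawl subdatasets (serially)
.br
.br
$ datalad crawl -t TEMPLATE  # treat the provided value as the name of a crawling template (TEMPLATE is hypothetical)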