'\" t .\" Title: function::tokenize .\" Author: .\" Generator: DocBook XSL Stylesheets v1.76.1 .\" Date: May 2013 .\" Manual: A collection of standard string functions .\" Source: SystemTap Tapset Reference .\" Language: English .\" .TH "FUNCTION:" "3stap" "May 2013" "SystemTap Tapset Reference" "A collection of standard strin" .\" ----------------------------------------------------------------- .\" * Define some portability stuff .\" ----------------------------------------------------------------- .\" ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .\" http://bugs.debian.org/507673 .\" http://lists.gnu.org/archive/html/groff/2009-02/msg00013.html .\" ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .ie \n(.g .ds Aq \(aq .el .ds Aq ' .\" ----------------------------------------------------------------- .\" * set default formatting .\" ----------------------------------------------------------------- .\" disable hyphenation .nh .\" disable justification (adjust text to left margin only) .ad l .\" ----------------------------------------------------------------- .\" * MAIN CONTENT STARTS HERE * .\" ----------------------------------------------------------------- .SH "NAME" function::tokenize \- Return the next non\-empty token in a string .SH "SYNOPSIS" .sp .nf tokenize:string(input:string,delim:string) .fi .SH "ARGUMENTS" .PP \fIinput\fR .RS 4 string to tokenize\&. If NULL, returns the next non\-empty token in the string passed in the previous call to \fBtokenize\fR\&. .RE .PP \fIdelim\fR .RS 4 set of characters that delimit the tokens .RE .SH "DESCRIPTION" .PP This function returns the next non\-empty token in the given input string, where the tokens are delimited by characters in the delim string\&. If the input string is non\-NULL, it returns the first token\&. If the input string is NULL, it returns the next token in the string passed in the previous call to tokenize\&. If no delimiter is found, the entire remaining input string is returned\&. It returns NULL when no more tokens are available\&.