.\" Automatically generated by Pod::Man 2.28 (Pod::Simple 3.29) .\" .\" Standard preamble: .\" ======================================================================== .de Sp \" Vertical space (when we can't use .PP) .if t .sp .5v .if n .sp .. .de Vb \" Begin verbatim text .ft CW .nf .ne \\$1 .. .de Ve \" End verbatim text .ft R .fi .. .\" Set up some character translations and predefined strings. \*(-- will .\" give an unbreakable dash, \*(PI will give pi, \*(L" will give a left .\" double quote, and \*(R" will give a right double quote. \*(C+ will .\" give a nicer C++. Capital omega is used to do unbreakable dashes and .\" therefore won't be available. \*(C` and \*(C' expand to `' in nroff, .\" nothing in troff, for use with C<>. .tr \(*W- .ds C+ C\v'-.1v'\h'-1p'\s-2+\h'-1p'+\s0\v'.1v'\h'-1p' .ie n \{\ . ds -- \(*W- . ds PI pi . if (\n(.H=4u)&(1m=24u) .ds -- \(*W\h'-12u'\(*W\h'-12u'-\" diablo 10 pitch . if (\n(.H=4u)&(1m=20u) .ds -- \(*W\h'-12u'\(*W\h'-8u'-\" diablo 12 pitch . ds L" "" . ds R" "" . ds C` "" . ds C' "" 'br\} .el\{\ . ds -- \|\(em\| . ds PI \(*p . ds L" `` . ds R" '' . ds C` . ds C' 'br\} .\" .\" Escape single quotes in literal strings from groff's Unicode transform. .ie \n(.g .ds Aq \(aq .el .ds Aq ' .\" .\" If the F register is turned on, we'll generate index entries on stderr for .\" titles (.TH), headers (.SH), subsections (.SS), items (.Ip), and index .\" entries marked with X<> in POD. Of course, you'll have to process the .\" output yourself in some meaningful fashion. .\" .\" Avoid warning from groff about undefined register 'F'. .de IX .. .nr rF 0 .if \n(.g .if rF .nr rF 1 .if (\n(rF:(\n(.g==0)) \{ . if \nF \{ . de IX . tm Index:\\$1\t\\n%\t"\\$2" .. . if !\nF==2 \{ . nr % 0 . nr F 2 . \} . \} .\} .rr rF .\" ======================================================================== .\" .IX Title "MediaWiki::Bot 3pm" .TH MediaWiki::Bot 3pm "2016-07-21" "perl v5.22.2" "User Contributed Perl Documentation" .\" For nroff, turn off justification. Always turn off hyphenation; it makes .\" way too many mistakes in technical documents. .if n .ad l .nh .SH "NAME" MediaWiki::Bot \- a high\-level bot framework for interacting with MediaWiki wikis .SH "VERSION" .IX Header "VERSION" version 5.006003 .SH "SYNOPSIS" .IX Header "SYNOPSIS" .Vb 1 \& use MediaWiki::Bot qw(:constants); \& \& my $bot = MediaWiki::Bot\->new({ \& assert => \*(Aqbot\*(Aq, \& host => \*(Aqde.wikimedia.org\*(Aq, \& login_data => { username => "Mike\*(Aqs bot account", password => "password" }, \& }); \& \& my $revid = $bot\->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard"); \& print "Reverting to $revid\en" if defined($revid); \& $bot\->revert(\*(AqUser:Mike.lifeguard\*(Aq, $revid, \*(Aqrvv\*(Aq); .Ve .SH "DESCRIPTION" .IX Header "DESCRIPTION" \&\fBMediaWiki::Bot\fR is a framework that can be used to write bots which interface with the MediaWiki \s-1API \s0(). .SH "METHODS" .IX Header "METHODS" .SS "new" .IX Subsection "new" .Vb 4 \& my $bot = MediaWiki::Bot({ \& host => \*(Aqen.wikipedia.org\*(Aq, \& operator => \*(AqMike.lifeguard\*(Aq, \& }); .Ve .PP Calling \f(CW\*(C`MediaWiki::Bot\->new()\*(C'\fR will create a new MediaWiki::Bot object. The only parameter is a hashref with keys: .IP "\(bu" 4 \&\fIagent\fR sets a custom useragent. It is recommended to use \f(CW\*(C`operator\*(C'\fR instead, which is all we need to do the right thing for you. If you really want to do it yourself, see for guidance on what information must be included. 
.IP "\(bu" 4 \&\fIassert\fR sets a parameter for the AssertEdit extension (commonly 'bot') .Sp Refer to . .IP "\(bu" 4 \&\fIoperator\fR allows the bot to send you a message when it fails an assert. This is also the recommended way to customize the user agent string, which is required by the Wikimedia Foundation. A warning will be emitted if you omit this. .IP "\(bu" 4 \&\fImaxlag\fR allows you to set the maxlag parameter (default is the recommended 5s). .Sp Please refer to the MediaWiki documentation prior to changing this from the default. .IP "\(bu" 4 \&\fIprotocol\fR allows you to specify 'http' or 'https' (default is 'http') .IP "\(bu" 4 \&\fIhost\fR sets the domain name of the wiki to connect to .IP "\(bu" 4 \&\fIpath\fR sets the path to api.php (with no leading or trailing slash) .IP "\(bu" 4 \&\fIlogin_data\fR is a hashref of credentials to pass to \*(L"login\*(R". .IP "\(bu" 4 \&\fIdebug\fR \- whether to provide debug output. .Sp 1 provides only error messages; 2 provides further detail on internal operations. .PP For example: .PP .Vb 10 \& my $bot = MediaWiki::Bot\->new({ \& assert => \*(Aqbot\*(Aq, \& protocol => \*(Aqhttps\*(Aq, \& host => \*(Aqen.wikimedia.org\*(Aq, \& agent => sprintf( \& \*(AqPerlWikiBot/%s (https://metacpan.org/MediaWiki::Bot; User:Mike.lifeguard)\*(Aq, \& MediaWiki::Bot\->VERSION \& ), \& login_data => { username => "Mike\*(Aqs bot account", password => "password" }, \& }); .Ve .PP For backward compatibility, you can specify up to three parameters: .PP .Vb 1 \& my $bot = MediaWiki::Bot\->new(\*(AqMy custom useragent string\*(Aq, $assert, $operator); .Ve .PP \&\fBThis form is deprecated\fR will never do auto-login or autoconfiguration, and emits deprecation warnings. .PP For further reading: .IP "\(bu" 4 MediaWiki::Bot wiki .IP "\(bu" 4 > .IP "\(bu" 4 Creating a new bot .IP "\(bu" 4 Setting the wiki .IP "\(bu" 4 Where is api.php .SS "set_wiki" .IX Subsection "set_wiki" Set what wiki to use. The parameter is a hashref with keys: .IP "\(bu" 4 \&\fIhost\fR \- the domain name .IP "\(bu" 4 \&\fIpath\fR \- the part of the path before api.php (usually 'w') .IP "\(bu" 4 \&\fIprotocol\fR is either 'http' or 'https'. .PP If you don't set any parameter, it's previous value is used. If it has never been set, the default settings are 'http', 'en.wikipedia.org' and 'w'. .PP For example: .PP .Vb 5 \& $bot\->set_wiki({ \& protocol => \*(Aqhttps\*(Aq, \& host => \*(Aqsecure.wikimedia.org\*(Aq, \& path => \*(Aqwikipedia/meta/w\*(Aq, \& }); .Ve .PP For backward compatibility, you can specify up to two parameters: .PP .Vb 1 \& $bot\->set_wiki($host, $path); .Ve .PP \&\fBThis form is deprecated\fR, and will emit deprecation warnings. .SS "login" .IX Subsection "login" This method takes a hashref with keys \fIusername\fR and \fIpassword\fR at a minimum. See \*(L"Single User Login\*(R" and \*(L"Basic authentication\*(R" for additional options. .PP Logs the use \f(CW$username\fR in, optionally using \f(CW$password\fR. First, an attempt will be made to use cookies to log in. If this fails, an attempt will be made to use the password provided to log in, if any. If the login was successful, returns true; false otherwise. .PP .Vb 4 \& $bot\->login({ \& username => $username, \& password => $password, \& }) or die "Login failed"; .Ve .PP Once logged in, attempt to do some simple auto-configuration. At present, this consists of: .IP "\(bu" 4 Warning if the account doesn't have the bot flag, and isn't a sysop account. .IP "\(bu" 4 Setting an appropriate default assert. 
.PP
You can skip this autoconfiguration by passing \f(CW\*(C`autoconfig => 0\*(C'\fR.
.PP
For backward compatibility, you can call this as
.PP
.Vb 1
\& $bot\->login($username, $password);
.Ve
.PP
\&\fBThis form is deprecated\fR, and will emit deprecation warnings. It will
never do autoconfiguration or \s-1SUL\s0 login.
.PP
\fISingle User Login\fR
.IX Subsection "Single User Login"
.PP
On \s-1WMF\s0 wikis, \f(CW\*(C`do_sul\*(C'\fR specifies whether to log in on all projects.
The default is false. But even when false, you still get a CentralAuth cookie
for, and are thus logged in on, all languages of a given domain
(\f(CW\*(C`*.wikipedia.org\*(C'\fR, for example). When set, a login is done on each
\&\s-1WMF\s0 domain so you are logged in on all ~800 content wikis. Since
\&\f(CW\*(C`*.wikimedia.org\*(C'\fR is not possible, we explicitly include meta, commons,
incubator, and wikispecies.
.PP
\fIBasic authentication\fR
.IX Subsection "Basic authentication"
.PP
If you need to supply basic auth credentials, pass a hashref of data as
described by LWP::UserAgent:
.PP
.Vb 9
\& $bot\->login({
\&     username   => $username,
\&     password   => $password,
\&     basic_auth => { netloc => "private.wiki.com:80",
\&                     realm  => "Authentication Realm",
\&                     uname  => "Basic auth username",
\&                     pass   => "password",
\&                   }
\& }) or die "Couldn\*(Aqt log in";
.Ve
.PP
\fIBot passwords\fR
.IX Subsection "Bot passwords"
.PP
\&\f(CW\*(C`MediaWiki::Bot\*(C'\fR doesn't yet support the more complicated (but more
secure) OAuth login flow for bots. Instead, we support a simpler \*(L"bot
password\*(R", which is a generated password connected to a (possibly-reduced)
set of on-wiki privileges, and \s-1IP\s0 ranges from which it can be used.
.PP
To create one, visit \f(CW\*(C`Special:BotPasswords\*(C'\fR on the wiki. Enter a label
for the password, then select the privileges you want to use with that
password. This set should be as restricted as possible; most bots only edit
existing pages. Keeping the set of privileges as restricted as possible limits
the possible damage if the password were ever compromised.
.PP
Submit the form, and you'll be given a new \*(L"username\*(R" that looks like
\&\*(L"AccountUsername@bot_password_label\*(R", and a generated bot password. To
log in, provide those to \f(CW\*(C`MediaWiki::Bot\*(C'\fR verbatim.
.PP
\&\fBReferences:\fR API:Login, Logging in
.SS "logout"
.IX Subsection "logout"
.Vb 1
\& $bot\->logout();
.Ve
.PP
The logout method logs the bot out of the wiki. This invalidates all login
cookies.
.PP
\&\fBReferences:\fR API:Logging out
.SS "edit"
.IX Subsection "edit"
.Vb 8
\& my $text = $bot\->get_text(\*(AqMy page\*(Aq);
\& $text .= "\en\en* More text\en";
\& $bot\->edit({
\&     page    => \*(AqMy page\*(Aq,
\&     text    => $text,
\&     summary => \*(AqAdding new content\*(Aq,
\&     section => \*(Aqnew\*(Aq,
\& });
.Ve
.PP
This method edits a wiki page, and takes a hashref of data with keys:
.IP "\(bu" 4
\&\fIpage\fR \- the page title to edit
.IP "\(bu" 4
\&\fItext\fR \- the page text to write
.IP "\(bu" 4
\&\fIsummary\fR \- an edit summary
.IP "\(bu" 4
\&\fIminor\fR \- whether to mark the edit as minor or not (boolean)
.IP "\(bu" 4
\&\fIbot\fR \- whether to mark the edit as a bot edit (boolean)
.IP "\(bu" 4
\&\fIassertion\fR \- usually 'bot', but see the AssertEdit documentation.
.IP "\(bu" 4
\&\fIsection\fR \- edit a single section (identified by number) instead of the
whole page
.PP
An \s-1MD5\s0 hash is sent to guard against data corruption while in transit.
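.PP
The module computes this hash for you. Purely as an illustration (a sketch,
not MediaWiki::Bot's internal code), such a checksum is the hex \s-1MD5\s0
digest of the \s-1UTF\-8\s0 encoded text:
.PP
.Vb 6
\& use Digest::MD5 qw(md5_hex);
\& use Encode qw(encode_utf8);
\&
\& my $text = "Some page text\en";
\& # The server rejects the edit if this digest doesn't match the text it received
\& my $md5 = md5_hex(encode_utf8($text));
.Ve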
.PP
You can also call this as:
.PP
.Vb 1
\& $bot\->edit($page, $text, $summary, $is_minor, $assert, $markasbot);
.Ve
.PP
\&\fBThis form is deprecated\fR, and will emit deprecation warnings.
.PP
\fICAPTCHAs\fR
.IX Subsection "CAPTCHAs"
.PP
If a \s-1CAPTCHA\s0 is encountered, the call to \f(CW\*(C`edit\*(C'\fR will return false,
with the error code set to \f(CW\*(C`ERR_CAPTCHA\*(C'\fR and the details informing you
that solving a \s-1CAPTCHA\s0 is required for this action. The information you
need to actually solve the captcha (for example the \s-1URL\s0 for the image) is
given in \f(CW\*(C`$bot\->{error}\->{captcha}\*(C'\fR as a hash reference. You will want
to grab the keys 'url' (a relative \s-1URL\s0 to the image) and 'id' (the
\&\s-1ID\s0 of the \s-1CAPTCHA\s0). Once you have solved the \s-1CAPTCHA\s0
(presumably by interacting with a human), retry the edit, adding
\&\f(CW\*(C`captcha_id\*(C'\fR and \f(CW\*(C`captcha_solution\*(C'\fR parameters:
.PP
.Vb 8
\& my $edit = {page => \*(AqMain Page\*(Aq, text => \*(Aqgot your nose\*(Aq};
\& my $edit_status = $bot\->edit($edit);
\& if (not $edit_status) {
\&     if ($bot\->{error}\->{code} == ERR_CAPTCHA) {
\&         my @captcha_uri = split /\eQ?/, $bot\->{error}{captcha}{url}, 2;
\&         my $image = URI\->new(sprintf \*(Aq%s://%s%s?%s\*(Aq =>
\&             $bot\->{protocol}, $bot\->{host}, $captcha_uri[0], $captcha_uri[1],
\&         );
\&
\&         require Term::ReadLine;
\&         my $term = Term::ReadLine\->new(\*(AqSolve the captcha\*(Aq);
\&         $term\->ornaments(0);
\&         my $answer = $term\->readline("Please solve $image and type the answer: ");
\&
\&         # Add new CAPTCHA params to the edit we\*(Aqre attempting
\&         $edit\->{captcha_id} = $bot\->{error}\->{captcha}\->{id};
\&         $edit\->{captcha_solution} = $answer;
\&         $edit_status = $bot\->edit($edit);
\&     }
\& }
.Ve
.PP
\&\fBReferences:\fR Editing pages, API:Edit, API:Tokens
.SS "move"
.IX Subsection "move"
.Vb 1
\& $bot\->move($from_title, $to_title, $reason, $options_hashref);
.Ve
.PP
This moves a wiki page.
.PP
If you wish to specify more options (like whether to suppress creation of a
redirect), use \f(CW$options_hashref\fR, which has keys:
.IP "\(bu" 4
\&\fImovetalk\fR specifies whether to attempt to move the talk page.
.IP "\(bu" 4
\&\fInoredirect\fR specifies whether to suppress creation of a redirect.
.IP "\(bu" 4
\&\fImovesubpages\fR specifies whether to move subpages, if applicable.
.IP "\(bu" 4
\&\fIwatch\fR and \fIunwatch\fR add or remove the page and the redirect from your
watchlist.
.IP "\(bu" 4
\&\fIignorewarnings\fR ignores warnings.
.PP
.Vb 6
\& my @pages = ("Humor", "Rumor");
\& foreach my $page (@pages) {
\&     my $to = $page;
\&     $to =~ s/or$/our/;
\&     $bot\->move($page, $to, "silly \*(Aqmerricans");
\& }
.Ve
.PP
\&\fBReferences:\fR API:Move
.SS "get_history"
.IX Subsection "get_history"
.Vb 1
\& my @hist = $bot\->get_history($title, $limit, $revid, $direction);
.Ve
.PP
Returns an array containing the history of the specified \f(CW$title\fR, with up
to \f(CW$limit\fR revisions (default is as many as possible).
.PP
The array returned contains hashrefs with keys: revid, user, comment, minor,
timestamp_date, and timestamp_time.
.PP
\&\fBReferences\fR: Getting page history, API:Properties#revisions
.SS "get_text"
.IX Subsection "get_text"
Returns the wikitext of the specified \f(CW$page_title\fR. The second parameter
is \f(CW$revid\fR \- if defined, returns the text of that revision; the third is
\&\f(CW$section_number\fR \- if defined, returns the text of that section.
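.PP
For example, a sketch fetching only the lead section (section 0) of a page;
the title here is merely illustrative:
.PP
.Vb 2
\& # undef revid means the current revision; section numbers count from 0
\& my $intro = $bot\->get_text(\*(AqMain Page\*(Aq, undef, 0);
.Ve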
.PP
A blank page will return wikitext of "" (which evaluates to false in Perl,
but is defined); a nonexistent page will return undef (which also evaluates
to false in Perl, but is obviously undefined). You can distinguish between
blank and nonexistent pages by using defined:
.PP
.Vb 2
\& my $wikitext = $bot\->get_text(\*(AqPage title\*(Aq);
\& print "Wikitext: $wikitext\en" if defined $wikitext;
.Ve
.PP
\&\fBReferences:\fR Fetching page text, API:Properties#revisions
.SS "get_id"
.IX Subsection "get_id"
Returns the id of the specified \f(CW$page_title\fR. Returns undef if the page
does not exist.
.PP
.Vb 2
\& my $pageid = $bot\->get_id("Main Page");
\& die "Page doesn\*(Aqt exist\en" if !defined($pageid);
.Ve
.PP
\&\fBReferences:\fR API:Properties#info
.SS "get_pages"
.IX Subsection "get_pages"
Returns the text of the specified pages in a hashref. A content value of
undef means the page does not exist. Also handles redirects or article names
that use namespace aliases.
.PP
.Vb 6
\& my @pages = (\*(AqPage 1\*(Aq, \*(AqPage 2\*(Aq, \*(AqPage 3\*(Aq);
\& my $thing = $bot\->get_pages(\e@pages);
\& foreach my $page (keys %$thing) {
\&     my $text = $thing\->{$page};
\&     print "$text\en" if defined($text);
\& }
.Ve
.PP
\&\fBReferences:\fR Fetching page text, API:Properties#revisions
.SS "get_image"
.IX Subsection "get_image"
.Vb 1
\& $buffer = $bot\->get_image(\*(AqFile:Foo.jpg\*(Aq, { width => 256, height => 256 });
.Ve
.PP
Download an image from a wiki. This is derived from a similar function in
MediaWiki::API. This one allows the image to be scaled down by passing a
hashref with height & width parameters.
.PP
It returns raw data in the original format. You may simply spew it to a file,
or process it directly with a library such as Imager.
.PP
.Vb 3
\& use File::Slurp qw(write_file);
\& my $img_data = $bot\->get_image(\*(AqFile:Foo.jpg\*(Aq);
\& write_file(\*(AqFoo.jpg\*(Aq, {binmode => \*(Aq:raw\*(Aq}, \e$img_data);
.Ve
.PP
Images are scaled proportionally: the height/width ratio remains constant,
except for rounding errors.
.PP
Height and width parameters describe the \fBmaximum\fR dimensions. A 400x200
image will never be scaled to greater dimensions. You can scale it yourself;
having the wiki do it is just lazy & selfish.
.PP
\&\fBReferences:\fR API:Properties#imageinfo
.SS "revert"
.IX Subsection "revert"
Reverts the specified \f(CW$page_title\fR to \f(CW$revid\fR, with an edit summary
of \f(CW$summary\fR. A default edit summary will be used if \f(CW$summary\fR is
omitted.
.PP
.Vb 3
\& my $revid = $bot\->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard");
\& print "Reverting to $revid\en" if defined($revid);
\& $bot\->revert(\*(AqUser:Mike.lifeguard\*(Aq, $revid, \*(Aqrvv\*(Aq);
.Ve
.PP
\&\fBReferences:\fR API:Edit
.SS "undo"
.IX Subsection "undo"
.Vb 1
\& $bot\->undo($title, $revid, $summary, $after);
.Ve
.PP
Reverts the specified \f(CW$revid\fR, with an edit summary of \f(CW$summary\fR,
using the undo function. To undo all revisions from \f(CW$revid\fR up to but not
including this one, set \f(CW$after\fR to another revid. If not set, just undo
the one revision ($revid).
.PP
\&\fBReferences:\fR API:Edit
.SS "get_last"
.IX Subsection "get_last"
Returns the revid of the last revision to \f(CW$page\fR not made by \f(CW$user\fR.
undef is returned if no result was found, as would be the case if the page is
deleted.
.PP
.Vb 5
\& my $revid = $bot\->get_last(\*(AqUser:Mike.lifeguard/sandbox\*(Aq, \*(AqMike.lifeguard\*(Aq);
\& if (defined $revid) {
\&     print "Reverting to $revid\en";
\&     $bot\->revert(\*(AqUser:Mike.lifeguard\*(Aq, $revid, \*(Aqrvv\*(Aq);
\& }
.Ve
.PP
\&\fBReferences:\fR API:Properties#revisions
.SS "update_rc"
.IX Subsection "update_rc"
\&\fBThis method is deprecated\fR, and will emit deprecation warnings. Replace
calls to \f(CW\*(C`update_rc()\*(C'\fR with calls to the newer \f(CW\*(C`recentchanges()\*(C'\fR,
which returns all available data, including rcid.
.PP
Returns an array containing the \f(CW$limit\fR most recent changes to the wiki's
\fImain namespace\fR. The array contains hashrefs with keys title, revid,
old_revid, and timestamp.
.PP
.Vb 5
\& my @rc = $bot\->update_rc(5);
\& foreach my $hashref (@rc) {
\&     my $title = $hashref\->{\*(Aqtitle\*(Aq};
\&     print "$title\en";
\& }
.Ve
.PP
The \*(L"Options hashref\*(R" is also available:
.PP
.Vb 10
\& # Use a callback for incremental processing:
\& my $options = { hook => \e&mysub, };
\& $bot\->update_rc($options);
\& sub mysub {
\&     my ($res) = @_;
\&     foreach my $hashref (@$res) {
\&         my $page = $hashref\->{\*(Aqtitle\*(Aq};
\&         print "$page\en";
\&     }
\& }
.Ve
.ie n .SS "recentchanges($wiki_hashref, $options_hashref)"
.el .SS "recentchanges($wiki_hashref, \f(CW$options_hashref\fP)"
.IX Subsection "recentchanges($wiki_hashref, $options_hashref)"
Returns an array of hashrefs containing recentchanges data.
.PP
The first parameter is a hashref with the following keys:
.IP "\(bu" 4
\&\fIns\fR \- the namespace number, or an arrayref of numbers to specify several;
default is the main namespace
.IP "\(bu" 4
\&\fIlimit\fR \- the number of rows to fetch; default is 50
.IP "\(bu" 4
\&\fIuser\fR \- only list changes by this user
.IP "\(bu" 4
\&\fIshow\fR \- itself a hashref where the key is a category and the value is a
boolean. If true, the category will be included; if false, excluded. The
categories are kinds of edits: minor, bot, anon, redirect, patrolled. See
\&\*(L"rcshow\*(R" in the API documentation.
.PP
An \*(L"Options hashref\*(R" can be used as the second parameter:
.PP
.Vb 4
\& my @rc = $bot\->recentchanges({ ns => 4, limit => 100 });
\& foreach my $hashref (@rc) {
\&     print $hashref\->{title} . "\en";
\& }
\&
\& # Or, use a callback for incremental processing:
\& $bot\->recentchanges({ ns => [0,1], limit => 500 }, { hook => \e&mysub });
\& sub mysub {
\&     my ($res) = @_;
\&     foreach my $hashref (@$res) {
\&         my $page = $hashref\->{title};
\&         print "$page\en";
\&     }
\& }
.Ve
.PP
The hashref returned might contain the following keys:
.IP "\(bu" 4
\&\fIns\fR \- the namespace number
.IP "\(bu" 4
\&\fIrevid\fR
.IP "\(bu" 4
\&\fIold_revid\fR
.IP "\(bu" 4
\&\fItimestamp\fR
.IP "\(bu" 4
\&\fIrcid\fR \- can be used with \*(L"patrol\*(R"
.IP "\(bu" 4
\&\fIpageid\fR
.IP "\(bu" 4
\&\fItype\fR \- one of edit, new, log (there may be others)
.IP "\(bu" 4
\&\fItitle\fR
.PP
For backwards compatibility, the previous method signature is still
supported:
.PP
.Vb 1
\& $bot\->recentchanges($ns, $limit, $options_hashref);
.Ve
.PP
\&\fBReferences:\fR API:Recentchanges
.SS "what_links_here"
.IX Subsection "what_links_here"
Returns an array containing a list of all pages linking to \f(CW$page\fR.
.PP
Additional optional parameters are:
.IP "\(bu" 4
One of: all (default), redirects, or nonredirects.
.IP "\(bu" 4
A namespace number to search (pass an arrayref to search in multiple
namespaces)
.IP "\(bu" 4
An \*(L"Options hashref\*(R".
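.PP
In the simplest case, call it with only a page title and all defaults (the
title here is illustrative):
.PP
.Vb 2
\& my @links = $bot\->what_links_here(\*(AqMain Page\*(Aq);
\& print scalar(@links), " pages link here\en";
.Ve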
.PP
A typical query:
.PP
.Vb 10
\& my @links = $bot\->what_links_here("Meta:Sandbox",
\&     undef, 1,
\&     { hook => \e&mysub }
\& );
\& sub mysub {
\&     my ($res) = @_;
\&     foreach my $hash (@$res) {
\&         my $title = $hash\->{\*(Aqtitle\*(Aq};
\&         my $is_redir = $hash\->{\*(Aqredirect\*(Aq};
\&         print "Redirect: $title\en" if $is_redir;
\&         print "Page: $title\en" unless $is_redir;
\&     }
\& }
.Ve
.PP
Transclusions are no longer handled by \fIwhat_links_here()\fR \- use
\&\*(L"list_transclusions\*(R" instead.
.PP
\&\fBReferences:\fR Listing incoming links, API:Backlinks
.SS "list_transclusions"
.IX Subsection "list_transclusions"
Returns an array containing a list of all pages transcluding \f(CW$page\fR.
.PP
Other parameters are:
.IP "\(bu" 4
One of: all (default), redirects, or nonredirects
.IP "\(bu" 4
A namespace number to search (pass an arrayref to search in multiple
namespaces).
.IP "\(bu" 4
\&\f(CW$options_hashref\fR as described by MediaWiki::API:
.Sp
Set max to limit the number of queries performed.
.Sp
Set hook to a subroutine reference to use a callback hook for incremental
processing.
.Sp
Refer to the section on \*(L"linksearch\*(R" for examples.
.PP
A typical query:
.PP
.Vb 10
\& $bot\->list_transclusions("Template:Tlx", undef, 4, {hook => \e&mysub});
\& sub mysub {
\&     my ($res) = @_;
\&     foreach my $hash (@$res) {
\&         my $title = $hash\->{\*(Aqtitle\*(Aq};
\&         my $is_redir = $hash\->{\*(Aqredirect\*(Aq};
\&         print "Redirect: $title\en" if $is_redir;
\&         print "Page: $title\en" unless $is_redir;
\&     }
\& }
.Ve
.PP
\&\fBReferences:\fR Listing transclusions, API:Embeddedin
.SS "get_pages_in_category"
.IX Subsection "get_pages_in_category"
Returns an array containing the names of all pages in the specified category
(include the Category: prefix). Does not recurse into sub-categories.
.PP
.Vb 2
\& my @pages = $bot\->get_pages_in_category(\*(AqCategory:People on stamps of Gabon\*(Aq);
\& print "The pages in Category:People on stamps of Gabon are:\en@pages\en";
.Ve
.PP
The options hashref is as described in \*(L"Options hashref\*(R". Use
\&\f(CW\*(C`{ max => 0 }\*(C'\fR to get all results.
.PP
\&\fBReferences:\fR Listing category contents, API:Categorymembers
.SS "get_all_pages_in_category"
.IX Subsection "get_all_pages_in_category"
.Vb 1
\& my @pages = $bot\->get_all_pages_in_category($category, $options_hashref);
.Ve
.PP
Returns an array containing the names of \fBall\fR pages in the specified
category (include the Category: prefix), including sub-categories. The
\&\f(CW$options_hashref\fR is described fully in \*(L"Options hashref\*(R".
.PP
\&\fBReferences:\fR Listing category contents, API:Categorymembers
.SS "get_all_categories"
.IX Subsection "get_all_categories"
Returns an array containing the names of all categories.
.PP
.Vb 2
\& my @categories = $bot\->get_all_categories();
\& print "The categories are:\en@categories\en";
.Ve
.PP
Use \f(CW\*(C`{ max => 0 }\*(C'\fR to get all results. The default number of categories
returned is 10; the maximum allowed is 500.
.PP
\&\fBReferences:\fR API:Allcategories
.SS "linksearch"
.IX Subsection "linksearch"
Runs a linksearch on the specified \f(CW$link\fR and returns an array containing
anonymous hashes with keys 'url' for the outbound \s-1URL,\s0 and 'title' for
the page the link is on.
.PP
Additional parameters are:
.IP "\(bu" 4
A namespace number to search (pass an arrayref to search in multiple
namespaces).
.IP "\(bu" 4
You can search by \f(CW$protocol\fR (http is default).
.IP "\(bu" 4 \&\f(CW$options_hashref\fR is fully documented in \*(L"Options hashref\*(R": .Sp Set \fImax\fR in \f(CW$options\fR to get more than one query's worth of results: .Sp .Vb 7 \& my $options = { max => 10, }; # I only want some results \& my @links = $bot\->linksearch("slashdot.org", 1, undef, $options); \& foreach my $hash (@links) { \& my $url = $hash\->{\*(Aqurl\*(Aq}; \& my $page = $hash\->{\*(Aqtitle\*(Aq}; \& print "$page: $url\en"; \& } .Ve .Sp Set \fIhook\fR to a subroutine reference to use a callback hook for incremental processing: .Sp .Vb 10 \& my $options = { hook => \e&mysub, }; # I want to do incremental processing \& $bot\->linksearch("slashdot.org", 1, undef, $options); \& sub mysub { \& my ($res) = @_; \& foreach my $hashref (@$res) { \& my $url = $hashref\->{\*(Aqurl\*(Aq}; \& my $page = $hashref\->{\*(Aqtitle\*(Aq}; \& print "$page: $url\en"; \& } \& } .Ve .PP \&\fBReferences:\fR Finding external links , API:Exturlusage .SS "purge_page" .IX Subsection "purge_page" Purges the server cache of the specified \f(CW$page\fR. Returns true on success; false on failure. Pass an array reference to purge multiple pages. .PP If you really care, a true return value is the number of pages successfully purged. You could check that it is the same as the number you wanted to purge \- maybe some pages don't exist, or you passed invalid titles, or you aren't allowed to purge the cache: .PP .Vb 2 \& my @to_purge = (\*(AqMain Page\*(Aq, \*(AqA\*(Aq, \*(AqB\*(Aq, \*(AqC\*(Aq, \*(AqVery unlikely to exist\*(Aq); \& my $size = scalar @to_purge; \& \& print "all\-at\-once:\en"; \& my $success = $bot\->purge_page(\e@to_purge); \& \& if ($success == $size) { \& print "@to_purge: OK ($success/$size)\en"; \& } \& else { \& my $missed = @to_purge \- $success; \& print "We couldn\*(Aqt purge $missed pages (list was: " \& . join(\*(Aq, \*(Aq, @to_purge) \& . ")\en"; \& } \& \& # OR \& print "\en\enone\-at\-a\-time:\en"; \& foreach my $page (@to_purge) { \& my $ok = $bot\->purge_page($page); \& print "$page: $ok\en"; \& } .Ve .PP \&\fBReferences:\fR Purging the server cache , API:Purge .SS "get_namespace_names" .IX Subsection "get_namespace_names" .Vb 1 \& my %namespace_names = $bot\->get_namespace_names(); .Ve .PP Returns a hash linking the namespace id, such as 1, to its named equivalent, such as \*(L"Talk\*(R". .PP \&\fBReferences:\fR API:Meta#siteinfo .SS "image_usage" .IX Subsection "image_usage" Gets a list of pages which include a certain \f(CW$image\fR. Include the \f(CW\*(C`File:\*(C'\fR namespace prefix to avoid incurring an extra round-trip (which will also emit a deprecation warnings). .PP Additional parameters are: .IP "\(bu" 4 A namespace number to fetch results from (or an arrayref of multiple namespace numbers) .IP "\(bu" 4 One of all, redirect, or nonredirects. .IP "\(bu" 4 \&\f(CW$options\fR is a hashref as described in the section for \*(L"linksearch\*(R". 
.PP
.Vb 1
\& my @pages = $bot\->image_usage("File:Albert Einstein Head.jpg");
.Ve
.PP
Or, make use of the \*(L"Options hashref\*(R" to do incremental processing:
.PP
.Vb 11
\& $bot\->image_usage("File:Albert Einstein Head.jpg",
\&     undef, undef,
\&     { hook => \e&mysub, max => 5 }
\& );
\& sub mysub {
\&     my $res = shift;
\&     foreach my $page (@$res) {
\&         my $title = $page\->{\*(Aqtitle\*(Aq};
\&         print "$title\en";
\&     }
\& }
.Ve
.PP
\&\fBReferences:\fR API:Imageusage
.ie n .SS "global_image_usage($image, $results, $filterlocal)"
.el .SS "global_image_usage($image, \f(CW$results\fP, \f(CW$filterlocal\fP)"
.IX Subsection "global_image_usage($image, $results, $filterlocal)"
Returns an array of hashrefs of data about pages which use the given image.
.PP
.Vb 1
\& my @data = $bot\->global_image_usage(\*(AqFile:Albert Einstein Head.jpg\*(Aq);
.Ve
.PP
The keys in each hashref are title, url, and wiki. \f(CW$results\fR is the
maximum number of results that will be returned (not the maximum number of
requests that will be sent, like \f(CW\*(C`max\*(C'\fR in the \*(L"Options hashref\*(R");
the default is to attempt to fetch 500 (set to 0 to get all results).
\&\f(CW$filterlocal\fR will filter out local uses of the image.
.PP
\&\fBReferences:\fR Extension:GlobalUsage#API
.SS "links_to_image"
.IX Subsection "links_to_image"
A backward-compatible call to \*(L"image_usage\*(R". You can provide only the
image title.
.PP
\&\fBThis method is deprecated\fR, and will emit deprecation warnings.
.SS "is_blocked"
.IX Subsection "is_blocked"
.Vb 1
\& my $blocked = $bot\->is_blocked(\*(AqUser:Mike.lifeguard\*(Aq);
.Ve
.PP
Checks if a user is currently blocked.
.PP
\&\fBReferences:\fR API:Blocks
.SS "test_blocked"
.IX Subsection "test_blocked"
Retained for backwards compatibility. Use \*(L"is_blocked\*(R" for clarity.
.PP
\&\fBThis method is deprecated\fR, and will emit deprecation warnings.
.SS "test_image_exists"
.IX Subsection "test_image_exists"
Checks if an image exists at \f(CW$page\fR.
.IP "\(bu" 4
\&\f(CW\*(C`FILE_NONEXISTENT\*(C'\fR (0) means \*(L"Nothing there\*(R"
.IP "\(bu" 4
\&\f(CW\*(C`FILE_LOCAL\*(C'\fR (1) means \*(L"Yes, an image exists locally\*(R"
.IP "\(bu" 4
\&\f(CW\*(C`FILE_SHARED\*(C'\fR (2) means \*(L"Yes, an image exists on Commons\*(R"
.IP "\(bu" 4
\&\f(CW\*(C`FILE_PAGE_TEXT_ONLY\*(C'\fR (3) means \*(L"No image exists, but there is text
on the page\*(R"
.PP
If you pass in an arrayref of images, you'll get out an arrayref of results.
.PP
.Vb 10
\& use MediaWiki::Bot::Constants;
\& my $exists = $bot\->test_image_exists(\*(AqFile:Albert Einstein Head.jpg\*(Aq);
\& if ($exists == FILE_NONEXISTENT) {
\&     print "Doesn\*(Aqt exist\en";
\& }
\& elsif ($exists == FILE_LOCAL) {
\&     print "Exists locally\en";
\& }
\& elsif ($exists == FILE_SHARED) {
\&     print "Exists on Commons\en";
\& }
\& elsif ($exists == FILE_PAGE_TEXT_ONLY) {
\&     print "Page exists, but no image\en";
\& }
.Ve
.PP
\&\fBReferences:\fR API:Properties#imageinfo
.SS "get_pages_in_namespace"
.IX Subsection "get_pages_in_namespace"
.Vb 1
\& $bot\->get_pages_in_namespace($namespace, $limit, $options_hashref);
.Ve
.PP
Returns an array containing the names of all pages in the specified
namespace. The \f(CW$namespace\fR must be a number, not a namespace name.
.PP
Setting \f(CW$limit\fR is optional, and specifies how many items to retrieve at
once. Setting this to 'max' is recommended, and this is the default if
omitted. If \f(CW$limit\fR is over 500, it will be rounded up to the next
multiple of 500.
If \f(CW$limit\fR is set higher than you are allowed to use, it will silently be
reduced. Consider setting key 'max' in the \*(L"Options hashref\*(R" to retrieve
multiple sets of results:
.PP
.Vb 2
\& # Gotta get \*(Aqem all!
\& my @pages = $bot\->get_pages_in_namespace(6, \*(Aqmax\*(Aq, { max => 0 });
.Ve
.PP
\&\fBReferences:\fR API:Allpages
.SS "count_contributions"
.IX Subsection "count_contributions"
.Vb 1
\& my $count = $bot\->count_contributions($user);
.Ve
.PP
Uses the \s-1API\s0 to count \f(CW$user\fR's contributions.
.PP
\&\fBReferences:\fR API:Users
.SS "timed_count_contributions"
.IX Subsection "timed_count_contributions"
.Vb 1
\& ($timed_edits_count, $total_count) = $bot\->timed_count_contributions($user, $days);
.Ve
.PP
Uses the \s-1API\s0 to count \f(CW$user\fR's contributions made within the last
\&\f(CW$days\fR days, as well as the user's total contribution count (if needed).
.PP
For example, to get a user's contributions for the last 30 and 365 days, plus
their total number of edits:
.PP
.Vb 2
\& my ($last30days, $total) = $bot\->timed_count_contributions($user, 30);
\& my $last365days = $bot\->timed_count_contributions($user, 365);
.Ve
.PP
You could also get the total number of edits by calling count_contributions
separately:
.PP
.Vb 1
\& my $total = $bot\->count_contributions($user);
.Ve
.PP
and using timed_count_contributions only in scalar context, but that would
mean one more call to the server (and more server load), which is unnecessary
because timed_count_contributions already returns both values.
.PP
\&\fBReferences:\fR Extension:UserDailyContribs
.SS "last_active"
.IX Subsection "last_active"
.Vb 1
\& my $latest_timestamp = $bot\->last_active($user);
.Ve
.PP
Returns the last active time of \f(CW$user\fR in \f(CW\*(C`YYYY\-MM\-DDTHH:MM:SSZ\*(C'\fR.
.PP
\&\fBReferences:\fR API:Usercontribs
.SS "recent_edit_to_page"
.IX Subsection "recent_edit_to_page"
.Vb 1
\& my ($timestamp, $user) = $bot\->recent_edit_to_page($title);
.Ve
.PP
Returns the timestamp and username for the most recent (top) edit to
\&\f(CW$title\fR.
.PP
\&\fBReferences:\fR API:Properties#revisions
.SS "get_users"
.IX Subsection "get_users"
.Vb 1
\& my @recent_editors = $bot\->get_users($title, $limit, $revid, $direction);
.Ve
.PP
Gets the most recent editors to \f(CW$title\fR, up to \f(CW$limit\fR, starting from
\&\f(CW$revid\fR and going in \f(CW$direction\fR.
.PP
\&\fBReferences:\fR API:Properties#revisions
.SS "was_blocked"
.IX Subsection "was_blocked"
.Vb 3
\& for ("Mike.lifeguard", "Jimbo Wales") {
\&     print "$_ was blocked\en" if $bot\->was_blocked($_);
\& }
.Ve
.PP
Returns whether \f(CW$user\fR has ever been blocked.
.PP
\&\fBReferences:\fR API:Logevents
.SS "test_block_hist"
.IX Subsection "test_block_hist"
Retained for backwards compatibility. Use \*(L"was_blocked\*(R" for clarity.
.PP
\&\fBThis method is deprecated\fR, and will emit deprecation warnings.
.SS "expandtemplates"
.IX Subsection "expandtemplates"
.Vb 1
\& my $expanded = $bot\->expandtemplates($title, $wikitext);
.Ve
.PP
Expands templates on \f(CW$title\fR, using \f(CW$wikitext\fR if provided, otherwise
loading the page text automatically.
.PP
\&\fBReferences:\fR API:Parsing wikitext
.SS "get_allusers"
.IX Subsection "get_allusers"
.Vb 1
\& my @users = $bot\->get_allusers($limit, $user_group, $options_hashref);
.Ve
.PP
Returns an array of all users. Default \f(CW$limit\fR is 500. Optionally specify
a \f(CW$user_group\fR (like 'sysop') to list that group only. The last optional
parameter is an \*(L"Options hashref\*(R".
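.PP
For example, a brief sketch counting sysops (the group name is as documented
above):
.PP
.Vb 2
\& my @sysops = $bot\->get_allusers(500, \*(Aqsysop\*(Aq);
\& print scalar(@sysops), " sysops found\en";
.Ve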
.PP \&\fBReferences:\fR API:Allusers .SS "db_to_domain" .IX Subsection "db_to_domain" Converts a wiki/database name (enwiki) to the domain name (en.wikipedia.org). .PP .Vb 6 \& my @wikis = ("enwiki", "kowiki", "bat\-smgwiki", "nonexistent"); \& foreach my $wiki (@wikis) { \& my $domain = $bot\->db_to_domain($wiki); \& next if !defined($domain); \& print "$wiki: $domain\en"; \& } .Ve .PP You can pass an arrayref to do bulk lookup: .PP .Vb 6 \& my @wikis = ("enwiki", "kowiki", "bat\-smgwiki", "nonexistent"); \& my $domains = $bot\->db_to_domain(\e@wikis); \& foreach my $domain (@$domains) { \& next if !defined($domain); \& print "$domain\en"; \& } .Ve .PP \&\fBReferences:\fR Extension:SiteMatrix .SS "domain_to_db" .IX Subsection "domain_to_db" .Vb 1 \& my $db = $bot\->domain_to_db($domain_name); .Ve .PP As you might expect, does the opposite of \*(L"domain_to_db\*(R": Converts a domain name (meta.wikimedia.org) into a database/wiki name (metawiki). .PP \&\fBReferences:\fR Extension:SiteMatrix .SS "diff" .IX Subsection "diff" This allows retrieval of a diff from the \s-1API.\s0 The return is a scalar containing the \fI\s-1HTML\s0 table\fR of the diff. Options are passed as a hashref with keys: .IP "\(bu" 4 \&\fItitle\fR is the title to use. Provide \fIeither\fR this or revid. .IP "\(bu" 4 \&\fIrevid\fR is any revid to diff from. If you also specified title, only title will be honoured. .IP "\(bu" 4 \&\fIoldid\fR is an identifier to diff to. This can be a revid, or the special values \&'cur', 'prev' or 'next' .PP \&\fBReferences:\fR API:Properties#revisions .SS "prefixindex" .IX Subsection "prefixindex" This returns an array of hashrefs containing page titles that start with the given \f(CW$prefix\fR. The hashref has keys 'title' and 'redirect' (present if the page is a redirect, not present otherwise). .PP Additional parameters are: .IP "\(bu" 4 One of all, redirects, or nonredirects .IP "\(bu" 4 A single namespace number (unlike linksearch etc, which can accept an arrayref of numbers). .IP "\(bu" 4 \&\f(CW$options_hashref\fR as described in \*(L"Options hashref\*(R". .PP .Vb 12 \& my @prefix_pages = $bot\->prefixindex("User:Mike.lifeguard"); \& # Or, the more efficient equivalent \& my @prefix_pages = $bot\->prefixindex("Mike.lifeguard", 2); \& foreach my $hashref (@pages) { \& my $title = $hashref\->{\*(Aqtitle\*(Aq}; \& if $hashref\->{\*(Aqredirect\*(Aq} { \& print "$title is a redirect\en"; \& } \& else { \& print "$title\en is not a redirect\en"; \& } \& } .Ve .PP \&\fBReferences:\fR API:Allpages .SS "search" .IX Subsection "search" This is a simple search for your \f(CW$search_term\fR in page text. It returns an array of page titles matching. .PP Additional optional parameters are: .IP "\(bu" 4 A namespace number to search in, or an arrayref of numbers (default is the main namespace) .IP "\(bu" 4 \&\f(CW$options_hashref\fR is a hashref as described in \*(L"Options hashref\*(R": .PP .Vb 2 \& my @pages = $bot\->search("Mike.lifeguard", 2); \& print "@pages\en"; .Ve .PP Or, use a callback for incremental processing: .PP .Vb 8 \& my @pages = $bot\->search("Mike.lifeguard", 2, { hook => \e&mysub }); \& sub mysub { \& my ($res) = @_; \& foreach my $hashref (@$res) { \& my $page = $hashref\->{\*(Aqtitle\*(Aq}; \& print "$page\en"; \& } \& } .Ve .PP \&\fBReferences:\fR API:Search .SS "get_log" .IX Subsection "get_log" This fetches log entries, and returns results as an array of hashes. The first parameter is a hashref with keys: .IP "\(bu" 4 \&\fItype\fR is the log type (block, delete...) 
.IP "\(bu" 4 \&\fIuser\fR is the user who \fIperformed\fR the action. Do not include the User: prefix .IP "\(bu" 4 \&\fItarget\fR is the target of the action. Where an action was performed to a page, it is the page title. Where an action was performed to a user, it is User:$username. .PP The second is the familiar \*(L"Options hashref\*(R". .PP .Vb 8 \& my $log = $bot\->get_log({ \& type => \*(Aqblock\*(Aq, \& user => \*(AqUser:Mike.lifeguard\*(Aq, \& }); \& foreach my $entry (@$log) { \& my $user = $entry\->{\*(Aqtitle\*(Aq}; \& print "$user\en"; \& } \& \& $bot\->get_log({ \& type => \*(Aqblock\*(Aq, \& user => \*(AqUser:Mike.lifeguard\*(Aq, \& }, \& { hook => \e&mysub, max => 10 } \& ); \& sub mysub { \& my ($res) = @_; \& foreach my $hashref (@$res) { \& my $title = $hashref\->{\*(Aqtitle\*(Aq}; \& print "$title\en"; \& } \& } .Ve .PP \&\fBReferences:\fR API:Logevents .SS "is_g_blocked" .IX Subsection "is_g_blocked" .Vb 1 \& my $is_globally_blocked = $bot\->is_g_blocked(\*(Aq127.0.0.1\*(Aq); .Ve .PP Returns what IP/range block \fIcurrently in place\fR affects the IP/range. The return is a scalar of an IP/range if found (evaluates to true in boolean context); undef otherwise (evaluates false in boolean context). Pass in a single \s-1IP\s0 or \s-1CIDR\s0 range. .PP \&\fBReferences:\fR Extension:GlobalBlocking .SS "was_g_blocked" .IX Subsection "was_g_blocked" .Vb 1 \& print "127.0.0.1 was globally blocked\en" if $bot\->was_g_blocked(\*(Aq127.0.0.1\*(Aq); .Ve .PP Returns whether an IP/range was ever globally blocked. You should probably call this method only when your bot is operating on Meta \- this method will warn if not. .PP \&\fBReferences:\fR API:Logevents .SS "was_locked" .IX Subsection "was_locked" .Vb 1 \& my $was_locked = $bot\->was_locked(\*(AqMike.lifeguard\*(Aq); .Ve .PP Returns whether a user was ever locked. You should probably call this method only when your bot is operating on Meta \- this method will warn if not. .PP \&\fBReferences:\fR API:Logevents .SS "get_protection" .IX Subsection "get_protection" Returns data on page protection as a array of up to two hashrefs. Each hashref has a type, level, and expiry. Levels are 'sysop' and 'autoconfirmed'; types are \&'move' and 'edit'; expiry is a timestamp. Additionally, the key 'cascade' will exist if cascading protection is used. .PP .Vb 6 \& my $page = \*(AqMain Page\*(Aq; \& $bot\->edit({ \& page => $page, \& text => rand(), \& summary => \*(Aqtest\*(Aq, \& }) unless $bot\->get_protection($page); .Ve .PP You can also pass an arrayref of page titles to do bulk queries: .PP .Vb 7 \& my @pages = (\*(AqMain Page\*(Aq, \*(AqUser:Mike.lifeguard\*(Aq, \*(AqProject:Sandbox\*(Aq); \& my $answer = $bot\->get_protection(\e@pages); \& foreach my $title (keys %$answer) { \& my $protected = $answer\->{$title}; \& print "$title is protected\en" if $protected; \& print "$title is unprotected\en" unless $protected; \& } .Ve .PP \&\fBReferences:\fR API:Properties#info .SS "is_protected" .IX Subsection "is_protected" This is a synonym for \*(L"get_protection\*(R", which should be used in preference. .PP \&\fBThis method is deprecated\fR, and will emit deprecation warnings. .SS "patrol" .IX Subsection "patrol" .Vb 1 \& $bot\->patrol($rcid); .Ve .PP Marks a page or revision identified by the \f(CW$rcid\fR as patrolled. To mark several RCIDs as patrolled, you may pass an arrayref of them. Returns false and sets \&\f(CW\*(C`$bot\->{error}\*(C'\fR if the account cannot patrol. 
.PP \&\fBReferences:\fR API:Patrol .SS "email" .IX Subsection "email" .Vb 1 \& $bot\->email($user, $subject, $body); .Ve .PP This allows you to send emails through the wiki. All 3 of \f(CW$user\fR (without the User: prefix), \f(CW$subject\fR and \f(CW$body\fR are required. If \f(CW$user\fR is an arrayref, this will send the same email (subject and body) to all users. .PP \&\fBReferences:\fR API:Email .SS "top_edits" .IX Subsection "top_edits" Returns an array of the page titles where the \f(CW$user\fR is the latest editor. The second parameter is the familiar \f(CW$options_hashref\fR. .PP .Vb 4 \& my @pages = $bot\->top_edits("Mike.lifeguard", {max => 5}); \& foreach my $page (@pages) { \& $bot\->rollback($page, "Mike.lifeguard"); \& } .Ve .PP Note that accessing the data with a callback happens \fBbefore\fR filtering the top edits is done. For that reason, you should use \*(L"contributions\*(R" if you need to use a callback. If you use a callback with \fItop_edits()\fR, you \fBwill not\fR necessarily get top edits returned. It is only safe to use a callback if you \fIcheck\fR that it is a top edit: .PP .Vb 9 \& $bot\->top_edits("Mike.lifeguard", { hook => \e&rv }); \& sub rv { \& my $data = shift; \& foreach my $page (@$data) { \& if (exists($page\->{\*(Aqtop\*(Aq})) { \& $bot\->rollback($page\->{\*(Aqtitle\*(Aq}, "Mike.lifeguard"); \& } \& } \& } .Ve .PP \&\fBReferences:\fR API:Usercontribs .SS "contributions" .IX Subsection "contributions" .Vb 1 \& my @contribs = $bot\->contributions($user, $namespace, $options); .Ve .PP Returns an array of hashrefs of data for the user's contributions. \f(CW$ns\fR can be an arrayref of namespace numbers. \f(CW$options\fR can be specified as in \*(L"linksearch\*(R". .PP Specify an arrayref of users to get results for multiple users. .PP \&\fBReferences:\fR API:Usercontribs .SS "upload" .IX Subsection "upload" .Vb 2 \& $bot\->upload({ data => $file_contents, summary => \*(Aquploading file\*(Aq }); \& $bot\->upload({ file => $file_name, title => \*(AqTarget filename.png\*(Aq }); .Ve .PP Upload a file to the wiki. Specify the file by either giving the filename, which will be read in, or by giving the data directly. .PP \&\fBReferences:\fR API:Upload .SS "upload_from_url" .IX Subsection "upload_from_url" Upload file directly from \s-1URL\s0 to the wiki. Specify \s-1URL,\s0 the new filename and summary. Summary and new filename are optional. .PP .Vb 5 \& $bot\->upload_from_url({ \& url => \*(Aqhttp://some.domain.ext/pic.png\*(Aq, \& title => \*(AqTarget_filename.png\*(Aq, \& summary => \*(Aquploading new pic\*(Aq, \& }); .Ve .PP If on your target wiki is enabled uploading from \s-1URL,\s0 meaning \f(CW$wgAllowCopyUploads\fR is set to true in LocalSettings.php and you have appropriate user rights, you can use this function to upload files to your wiki directly from remote server. .PP \&\fBReferences:\fR API:Upload#Uploading_from_URL .SS "usergroups" .IX Subsection "usergroups" Returns a list of the usergroups a user is in: .PP .Vb 1 \& my @usergroups = $bot\->usergroups(\*(AqMike.lifeguard\*(Aq); .Ve .PP \&\fBReferences:\fR API:Users .SS "Options hashref" .IX Subsection "Options hashref" This is passed through to the lower-level interface MediaWiki::API, and is fully documented there. .PP The hashref can have 3 keys: .IP "max" 4 .IX Item "max" Specifies the maximum number of queries to retrieve data from the wiki. This is independent of the \fIsize\fR of each query (how many items each query returns). Set to 0 to retrieve all the results. 
.IP "hook" 4 .IX Item "hook" Specifies a coderef to a hook function that can be used to process large lists as they come in. When this is used, your subroutine will get the raw data. This is noted in cases where it is known to be significant. For example, when using a hook with \f(CW\*(C`top_edits()\*(C'\fR, you need to check whether the edit is the top edit yourself \- your subroutine gets results as they come in, and before they're filtered. .IP "skip_encoding" 4 .IX Item "skip_encoding" MediaWiki's \s-1API\s0 uses \s-1UTF\-8\s0 and any 8 bit character string parameters are encoded automatically by the \s-1API\s0 call. If your parameters are already in \s-1UTF\-8\s0 this will be detected and the encoding will be skipped. If your parameters for some reason contain \s-1UTF\-8\s0 data but no \s-1UTF\-8\s0 flag is set (i.e. you did not use the \&\f(CW\*(C`use utf8;\*(C'\fR pragma) you should prevent re-encoding by passing an option \&\f(CW\*(C`skip_encoding => 1\*(C'\fR. For example: .Sp .Vb 2 \& $category ="Cat\ex{e9}gorie:moyen_fran\ex{e7}ais"; # latin1 string \& $bot\->get_all_pages_in_category($category); # OK \& \& $category = "Cat". pack("U", 0xe9)."gorie:moyen_fran".pack("U",0xe7)."ais"; # unicode string \& $bot\->get_all_pages_in_category($category); # OK \& \& $category ="Cat\ex{c3}\ex{a9}gorie:moyen_fran\ex{c3}\ex{a7}ais"; # unicode data without utf\-8 flag \& # $bot\->get_all_pages_in_category($category); # NOT OK \& $bot\->get_all_pages_in_category($category, { skip_encoding => 1 }); # OK .Ve .Sp If you need this, it probably means you're doing something wrong. Feel free to ask for help. .SH "ERROR HANDLING" .IX Header "ERROR HANDLING" All functions will return undef in any handled error situation. Further error data is stored in \f(CW\*(C`$bot\->{error}\->{code}\*(C'\fR and \f(CW\*(C`$bot\->{error}\->{details}\*(C'\fR. .PP Error codes are provided as constants in MediaWiki::Bot::Constants, and can also be imported through this module: .PP .Vb 1 \& use MediaWiki::Bot qw(:constants); .Ve .SH "AVAILABILITY" .IX Header "AVAILABILITY" The project homepage is . .PP The latest version of this module is available from the Comprehensive Perl Archive Network (\s-1CPAN\s0). Visit to find a \s-1CPAN\s0 site near you, or see . .SH "SOURCE" .IX Header "SOURCE" The development version is on github at and may be cloned from .SH "BUGS AND LIMITATIONS" .IX Header "BUGS AND LIMITATIONS" You can make new bug reports, and view existing ones, through the web interface at . .SH "AUTHORS" .IX Header "AUTHORS" .IP "\(bu" 4 Dan Collins .IP "\(bu" 4 Mike.lifeguard .IP "\(bu" 4 Alex Rowe .IP "\(bu" 4 Oleg Alexandrov .IP "\(bu" 4 jmax.code .IP "\(bu" 4 Stefan Petrea .IP "\(bu" 4 kc2aei .IP "\(bu" 4 bosborne@alum.mit.edu .IP "\(bu" 4 Brian Obio .IP "\(bu" 4 patch and bug report contributors .SH "COPYRIGHT AND LICENSE" .IX Header "COPYRIGHT AND LICENSE" This software is Copyright (c) 2016 by the MediaWiki::Bot team . .PP This is free software, licensed under: .PP .Vb 1 \& The GNU General Public License, Version 3, June 2007 .Ve