# NAME
MediaWiki::Bot - a high-level bot framework for interacting with MediaWiki wikis
# VERSION
version 5.007000
# SYNOPSIS
use MediaWiki::Bot qw(:constants);
my $bot = MediaWiki::Bot->new({
assert => 'bot',
host => 'de.wikimedia.org',
login_data => { username => "Mike's bot account", password => "password" },
});
my $revid = $bot->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard");
print "Reverting to $revid\n" if defined($revid);
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
# DESCRIPTION
**MediaWiki::Bot** is a framework that can be used to write bots which interface
with the MediaWiki API ([http://en.wikipedia.org/w/api.php](http://en.wikipedia.org/w/api.php)).
# METHODS
## new
    my $bot = MediaWiki::Bot->new({
host => 'en.wikipedia.org',
operator => 'Mike.lifeguard',
});
Calling `MediaWiki::Bot->new()` will create a new MediaWiki::Bot object. The
only parameter is a hashref with keys:
- _agent_ sets a custom useragent. It is recommended to use `operator`
instead, which is all we need to do the right thing for you. If you really
want to do it yourself, see [https://meta.wikimedia.org/wiki/User-agent\_policy](https://meta.wikimedia.org/wiki/User-agent_policy)
for guidance on what information must be included.
- _assert_ sets a parameter for the AssertEdit extension (commonly 'bot')
Refer to [http://mediawiki.org/wiki/Extension:AssertEdit](http://mediawiki.org/wiki/Extension:AssertEdit).
- _operator_ allows the bot to send you a message when it fails an assert. This
is also the recommended way to customize the user agent string, which is
required by the Wikimedia Foundation. A warning will be emitted if you omit
this.
- _maxlag_ allows you to set the maxlag parameter (default is the recommended 5s).
Please refer to the MediaWiki documentation prior to changing this from the
default.
- _protocol_ allows you to specify 'http' or 'https' (default is 'http')
- _host_ sets the domain name of the wiki to connect to
- _path_ sets the path to api.php (with no leading or trailing slash)
- _login\_data_ is a hashref of credentials to pass to ["login"](#login).
- _debug_ - whether to provide debug output.
1 provides only error messages; 2 provides further detail on internal operations.
For example:
my $bot = MediaWiki::Bot->new({
assert => 'bot',
protocol => 'https',
        host => 'en.wikipedia.org',
agent => sprintf(
'PerlWikiBot/%s (https://metacpan.org/MediaWiki::Bot; User:Mike.lifeguard)',
MediaWiki::Bot->VERSION
),
login_data => { username => "Mike's bot account", password => "password" },
});
For backward compatibility, you can specify up to three parameters:
my $bot = MediaWiki::Bot->new('My custom useragent string', $assert, $operator);
**This form is deprecated**; it will never do auto-login or autoconfiguration,
and emits deprecation warnings.
For further reading:
- [MediaWiki::Bot wiki](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki)
- [Installing `MediaWiki::Bot`](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Install)
- [Creating a new bot](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Creating-a-new-bot)
- [Setting the wiki](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Setting-the-wiki)
- [Where is api.php](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Where-is-api.php)
## set\_wiki
Set what wiki to use. The parameter is a hashref with keys:
- _host_ - the domain name
- _path_ - the part of the path before api.php (usually 'w')
- _protocol_ is either 'http' or 'https'.
If you don't set a parameter, its previous value is used. If it has never been
set, the defaults are 'http', 'en.wikipedia.org' and 'w'.
For example:
$bot->set_wiki({
protocol => 'https',
host => 'secure.wikimedia.org',
path => 'wikipedia/meta/w',
});
For backward compatibility, you can specify up to two parameters:
$bot->set_wiki($host, $path);
**This form is deprecated**, and will emit deprecation warnings.
## login
This method takes a hashref with keys _username_ and _password_ at a minimum.
See ["Single User Login"](#single-user-login) and ["Basic authentication"](#basic-authentication) for additional options.
Logs the user $username in, optionally using $password. First, an attempt will be
made to use cookies to log in. If this fails, an attempt will be made to use the
password provided to log in, if any. If the login was successful, returns true;
false otherwise.
$bot->login({
username => $username,
password => $password,
}) or die "Login failed";
Once logged in, attempt to do some simple auto-configuration. At present, this
consists of:
- Warning if the account doesn't have the bot flag, and isn't a sysop account.
- Setting an appropriate default assert.
You can skip this autoconfiguration by passing `autoconfig => 0`.
For backward compatibility, you can call this as
$bot->login($username, $password);
**This form is deprecated**, and will emit deprecation warnings. It will
never do autoconfiguration or SUL login.
### Single User Login
On WMF wikis, `do_sul` specifies whether to log in on all projects. The default
is false. But even when false, you still get a CentralAuth cookie for, and are
thus logged in on, all languages of a given domain (`*.wikipedia.org`, for example).
When set, a login is done on each WMF domain so you are logged in on all ~800
content wikis. Since `*.wikimedia.org` is not possible, we explicitly include
meta, commons, incubator, and wikispecies.
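For example, to log in on all WMF projects in one call (a minimal sketch; the
credentials are placeholders):

    $bot->login({
        username => $username,
        password => $password,
        do_sul   => 1,
    }) or die "SUL login failed";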
### Basic authentication
If you need to supply basic auth credentials, pass a hashref of data as
described by [LWP::UserAgent](https://metacpan.org/pod/LWP%3A%3AUserAgent):
$bot->login({
username => $username,
password => $password,
basic_auth => { netloc => "private.wiki.com:80",
realm => "Authentication Realm",
uname => "Basic auth username",
pass => "password",
}
}) or die "Couldn't log in";
### Bot passwords
`MediaWiki::Bot` doesn't yet support the more complicated (but more secure)
OAuth login flow for bots. Instead, we support a simpler "bot password", which
is a generated password connected to a (possibly-reduced) set of on-wiki
privileges, and IP ranges from which it can be used.
To create one, visit `Special:BotPasswords` on the wiki. Enter a label for
the password, then select the privileges you want to use with that password.
This set should be as restricted as possible; most bots only edit existing
pages. Keeping the set of privileges as restricted as possible limits the
possible damage if the password were ever compromised.
Submit the form, and you'll be given a new "username" that looks like
"AccountUsername@bot\_password\_label", and a generated bot password.
To log in, provide those to `MediaWiki::Bot` verbatim.
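For example, a minimal sketch; the label and generated password below are
placeholders for the values Special:BotPasswords gives you:

    $bot->login({
        username => 'AccountUsername@bot_password_label',
        password => 'generated_bot_password',
    }) or die "Couldn't log in with bot password";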
**References:** [API:Login](https://www.mediawiki.org/wiki/API:Login),
[Logging in](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Logging-in)
## logout
$bot->logout();
The logout method logs the bot out of the wiki. This invalidates all login
cookies.
**References:** [API:Logging out](https://www.mediawiki.org/wiki/API:Logout)
## edit
my $text = $bot->get_text('My page');
$text .= "\n\n* More text\n";
$bot->edit({
page => 'My page',
text => $text,
summary => 'Adding new content',
section => 'new',
});
This method edits a wiki page, and takes a hashref of data with keys:
- _page_ - the page title to edit
- _text_ - the page text to write
- _summary_ - an edit summary
- _minor_ - whether to mark the edit as minor or not (boolean)
- _bot_ - whether to mark the edit as a bot edit (boolean)
- _assertion_ - usually 'bot', but see [http://mediawiki.org/wiki/Extension:AssertEdit](http://mediawiki.org/wiki/Extension:AssertEdit).
- _section_ - edit a single section (identified by number) instead of the whole page
An MD5 hash is sent to guard against data corruption while in transit.
You can also call this as:
$bot->edit($page, $text, $summary, $is_minor, $assert, $markasbot);
**This form is deprecated**, and will emit deprecation warnings.
### CAPTCHAs
If a [CAPTCHA](https://en.wikipedia.org/wiki/CAPTCHA) is encountered, the
call to `edit` will return false, with the error code set to `ERR_CAPTCHA`
and the details informing you that solving a CAPTCHA is required for this
action. The information you need to actually solve the captcha (for example
the URL for the image) is given in `$bot->{error}->{captcha}` as a
hash reference. You will want to grab the keys 'url' (a relative URL to
the image) and 'id' (the ID of the CAPTCHA). Once you have solved the
CAPTCHA (presumably by interacting with a human), retry the edit, adding
`captcha_id` and `captcha_solution` parameters:
my $edit = {page => 'Main Page', text => 'got your nose'};
my $edit_status = $bot->edit($edit);
if (not $edit_status) {
if ($bot->{error}->{code} == ERR_CAPTCHA) {
my @captcha_uri = split /\Q?/, $bot->{error}{captcha}{url}, 2;
my $image = URI->new(sprintf '%s://%s%s?%s' =>
$bot->{protocol}, $bot->{host}, $captcha_uri[0], $captcha_uri[1],
);
require Term::ReadLine;
my $term = Term::ReadLine->new('Solve the captcha');
$term->ornaments(0);
my $answer = $term->readline("Please solve $image and type the answer: ");
# Add new CAPTCHA params to the edit we're attempting
$edit->{captcha_id} = $bot->{error}->{captcha}->{id};
$edit->{captcha_solution} = $answer;
            $edit_status = $bot->edit($edit);
}
}
**References:** [Editing pages](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Editing-pages),
[API:Edit](https://www.mediawiki.org/wiki/API:Edit),
[API:Tokens](https://www.mediawiki.org/wiki/API:Tokens)
## move
$bot->move($from_title, $to_title, $reason, $options_hashref);
This moves a wiki page.
If you wish to specify more options (like whether to suppress creation of a
redirect), use $options\_hashref, which has keys:
- _movetalk_ specifies whether to attempt to move the talk page as well.
- _noredirect_ specifies whether to suppress creation of a redirect.
- _movesubpages_ specifies whether to move subpages, if applicable.
- _watch_ and _unwatch_ add or remove the page and the redirect from your watchlist.
- _ignorewarnings_ ignores warnings.
my @pages = ("Humor", "Rumor");
foreach my $page (@pages) {
my $to = $page;
$to =~ s/or$/our/;
$bot->move($page, $to, "silly 'merricans");
}
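To pass options, for example to move the talk page along and suppress creation
of a redirect (a sketch using the keys documented above; titles and summary are
illustrative):

    $bot->move('Old title', 'New title', 'Routine rename', {
        movetalk   => 1,
        noredirect => 1,
    });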
**References:** [API:Move](https://www.mediawiki.org/wiki/API:Move)
## get\_history
my @hist = $bot->get_history($title);
my @hist = $bot->get_history($title, $additional_params);
Returns an array containing the history of the specified page $title.
The optional hash ref $additional\_params can be used to tune the
query by API parameters,
such as 'rvlimit' to return only 'rvlimit' number of revisions (default is as many
as possible, but may be limited per query) or 'rvdir' to set the chronological
direction.
Example:
my @hist = $bot->get_history('Main Page', {'rvlimit' => 10, 'rvdir' => 'older'})
The array returned contains hashrefs with keys: revid, user, comment, minor,
timestamp\_date, and timestamp\_time.
For backward compatibility, you can specify up to four parameters:
my @hist = $bot->get_history($title, $limit, $revid, $direction);
**References**: [Getting page history](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Getting-page-history),
[API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## get\_history\_step\_by\_step
my @hist = $bot->get_history_step_by_step($title);
my @hist = $bot->get_history_step_by_step($title, $additional_params);
Same as get\_history(), but instead of returning the full history at once, it
lets you loop through it.
The optional call-by-reference hash ref $additional\_params can be used to loop
through a page's full history by using the 'continue' param returned by the API.
Example:
my $ready;
my $filter_params = {};
while(!$ready){
my @hist = $bot->get_history_step_by_step($page, $filter_params);
if(@hist == 0 || !defined($filter_params->{'continue'})){
$ready = 1;
}
# do something with @hist
}
**References**: [Getting page history](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Getting-page-history),
[API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## get\_text
Returns the wikitext of the specified $page\_title.
The first parameter $page\_title is the only required one.
The second parameter is a hashref with the following independent optional keys:
- `rvstartid` - if defined, this function returns the text of that revision, otherwise
the newest revision will be used.
- `rvsection` - if defined, returns the text of that section. Otherwise the
whole page text will be returned.
- `pageid` - this is an output parameter and can be used to fetch the id of a page
without needing to call ["get\_id"](#get_id) separately. Note that any value passed
in is ignored; it will be overwritten by this function.
- `rv...` - any param starting with 'rv' will be forwarded to the api call.
A blank page will return wikitext of "" (which evaluates to false in Perl,
but is defined); a nonexistent page will return undef (which also evaluates
to false in Perl, but is obviously undefined). You can distinguish between
blank and nonexistent pages by using [defined](https://metacpan.org/pod/perlfunc#defined):
# simple example
my $wikitext = $bot->get_text('Page title');
print "Wikitext: $wikitext\n" if defined $wikitext;
# advanced example
    my $options = { rvstartid => 123456, rvsection => 2 };
$wikitext = $bot->get_text('Page title', $options);
die "error, see API error message\n" unless defined $options->{'pageid'};
warn "page doesn't exist\n" if $options->{'pageid'} == MediaWiki::Bot::PAGE_NONEXISTENT;
print "Wikitext: $wikitext\n" if defined $wikitext;
**References:** [Fetching page text](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Fetching-page-text),
[API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
For backward-compatibility the params `revid` and `section_number` may also be
given as scalar parameters:
my $wikitext = $bot->get_text('Page title', 123456, 2);
print "Wikitext: $wikitext\n" if defined $wikitext;
## get\_id
Returns the id of the specified $page\_title. Returns undef if page does not exist.
my $pageid = $bot->get_id("Main Page");
die "Page doesn't exist\n" if !defined($pageid);
**References:** [API:Properties#info](https://www.mediawiki.org/wiki/API:Properties#info_.2F_in)
## get\_pages
Returns the text of the specified pages in a hashref. Content of undef means
page does not exist. Also handles redirects or article names that use namespace
aliases.
my @pages = ('Page 1', 'Page 2', 'Page 3');
my $thing = $bot->get_pages(\@pages);
foreach my $page (keys %$thing) {
my $text = $thing->{$page};
print "$text\n" if defined($text);
}
**References:** [Fetching page text](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Fetching-page-text),
[API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## get\_image
$buffer = $bot->get_image('File:Foo.jpg', { width=>256, height=>256 });
Download an image from a wiki. This is derived from a similar function in
[MediaWiki::API](https://metacpan.org/pod/MediaWiki%3A%3AAPI). This one allows the image to be scaled down by passing a hashref
with height & width parameters.
It returns raw data in the original format. You may simply spew it to a file, or
process it directly with a library such as [Imager](https://metacpan.org/pod/Imager).
use File::Slurp qw(write_file);
my $img_data = $bot->get_image('File:Foo.jpg');
write_file( 'Foo.jpg', {binmode => ':raw'}, \$img_data );
Images are scaled proportionally. (height/width) will remain
constant, except for rounding errors.
Height and width parameters describe the **maximum** dimensions. A 400x200
image will never be scaled to greater dimensions. You can scale it yourself;
having the wiki do it is just lazy & selfish.
**References:** [API:Properties#imageinfo](https://www.mediawiki.org/wiki/API:Properties#imageinfo_.2F_ii)
## revert
Reverts the specified $page\_title to $revid, with an edit summary of $summary. A
default edit summary will be used if $summary is omitted.
my $revid = $bot->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard");
print "Reverting to $revid\n" if defined($revid);
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
**References:** [API:Edit](https://www.mediawiki.org/wiki/API:Edit)
## undo
$bot->undo($title, $revid, $summary, $after);
Reverts the specified $revid, with an edit summary of $summary, using the undo
function. To undo a range of revisions, set $after to an earlier revid: all
revisions after $after, up to and including $revid, will be undone. If $after is
not set, just the one revision ($revid) is undone.
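For example (the revids below are placeholders):

    # Undo a single revision:
    $bot->undo('Project:Sandbox', 123456, 'rvv');

    # Undo everything after revid 123450, up to and including 123456:
    $bot->undo('Project:Sandbox', 123456, 'rvv', 123450);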
**References:** [API:Edit](https://www.mediawiki.org/wiki/API:Edit)
## get\_last
Returns the revid of the last revision to $page not made by $user. undef is
returned if no result was found, as would be the case if the page is deleted.
my $revid = $bot->get_last('User:Mike.lifeguard/sandbox', 'Mike.lifeguard');
    if (defined $revid) {
print "Reverting to $revid\n";
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
}
**References:** [API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## update\_rc
**This method is deprecated**, and will emit deprecation warnings.
Replace calls to `update_rc()` with calls to the newer `recentchanges()`, which
returns all available data, including rcid.
Returns an array containing the $limit most recent changes to the wiki's _main
namespace_. The array contains hashrefs with keys title, revid, old\_revid,
and timestamp.
my @rc = $bot->update_rc(5);
foreach my $hashref (@rc) {
        my $title = $hashref->{'title'};
print "$title\n";
}
The ["Options hashref"](#options-hashref) is also available:
# Use a callback for incremental processing:
my $options = { hook => \&mysub, };
$bot->update_rc($options);
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $page = $hashref->{'title'};
print "$page\n";
}
}
## recentchanges($wiki\_hashref, $options\_hashref)
Returns an array of hashrefs containing recentchanges data.
The first parameter is a hashref with the following keys:
- _ns_ - the namespace number, or an arrayref of numbers to
specify several; default is the main namespace
- _limit_ - the number of rows to fetch; default is 50
- _user_ - only list changes by this user
- _show_ - itself a hashref where the key is a category and the value is
a boolean. If true, the category will be included; if false, excluded. The
categories are kinds of edits: minor, bot, anon, redirect, patrolled. See
"rcshow" at [http://www.mediawiki.org/wiki/API:Recentchanges#Parameters](http://www.mediawiki.org/wiki/API:Recentchanges#Parameters).
An ["Options hashref"](#options-hashref) can be used as the second parameter:
my @rc = $bot->recentchanges({ ns => 4, limit => 100 });
foreach my $hashref (@rc) {
print $hashref->{title} . "\n";
}
# Or, use a callback for incremental processing:
$bot->recentchanges({ ns => [0,1], limit => 500 }, { hook => \&mysub });
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $page = $hashref->{title};
print "$page\n";
}
}
The hashref returned might contain the following keys:
- _ns_ - the namespace number
- _revid_
- _old\_revid_
- _timestamp_
- _rcid_ - can be used with ["patrol"](#patrol)
- _pageid_
- _type_ - one of edit, new, log (there may be others)
- _title_
For backwards compatibility, the previous method signature is still
supported:
$bot->recentchanges($ns, $limit, $options_hashref);
**References:** [API:Recentchanges](https://www.mediawiki.org/wiki/API:Recentchanges)
## what\_links\_here
Returns an array containing a list of all pages linking to $page.
Additional optional parameters are:
- One of: all (default), redirects, or nonredirects.
- A namespace number to search (pass an arrayref to search in multiple namespaces)
- An ["Options hashref"](#options-hashref).
A typical query:
my @links = $bot->what_links_here("Meta:Sandbox",
undef, 1,
{ hook=>\&mysub }
);
sub mysub{
my ($res) = @_;
foreach my $hash (@$res) {
my $title = $hash->{'title'};
my $is_redir = $hash->{'redirect'};
print "Redirect: $title\n" if $is_redir;
print "Page: $title\n" unless $is_redir;
}
}
Transclusions are no longer handled by what\_links\_here() - use
["list\_transclusions"](#list_transclusions) instead.
**References:** [Listing incoming links](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Listing-incoming-links),
[API:Backlinks](https://www.mediawiki.org/wiki/API:Backlinks)
## list\_transclusions
Returns an array containing a list of all pages transcluding $page.
Other parameters are:
- One of: all (default), redirects, or nonredirects
- A namespace number to search (pass an arrayref to search in multiple namespaces).
- $options\_hashref as described by [MediaWiki::API](https://metacpan.org/pod/MediaWiki%3A%3AAPI):
Set max to limit the number of queries performed.
Set hook to a subroutine reference to use a callback hook for incremental
processing.
Refer to the section on ["linksearch"](#linksearch) for examples.
A typical query:
$bot->list_transclusions("Template:Tlx", undef, 4, {hook => \&mysub});
sub mysub{
my ($res) = @_;
foreach my $hash (@$res) {
my $title = $hash->{'title'};
my $is_redir = $hash->{'redirect'};
print "Redirect: $title\n" if $is_redir;
print "Page: $title\n" unless $is_redir;
}
}
**References:** [Listing transclusions](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Listing-transclusions),
[API:Embeddedin](https://www.mediawiki.org/wiki/API:Embeddedin)
## get\_pages\_in\_category
Returns an array containing the names of all pages in the specified category
(include the Category: prefix). Does not recurse into sub-categories.
my @pages = $bot->get_pages_in_category('Category:People on stamps of Gabon');
print "The pages in Category:People on stamps of Gabon are:\n@pages\n";
The options hashref is as described in ["Options hashref"](#options-hashref).
Use `{ max => 0 }` to get all results.
**References:** [Listing category contents](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Listing-category-contents),
[API:Categorymembers](https://www.mediawiki.org/wiki/API:Categorymembers)
## get\_all\_pages\_in\_category
my @pages = $bot->get_all_pages_in_category($category, $options_hashref);
Returns an array containing the names of **all** pages in the specified category
(include the Category: prefix), including sub-categories. The $options\_hashref
is described fully in ["Options hashref"](#options-hashref).
**References:** [Listing category contents](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Listing-category-contents),
[API:Categorymembers](https://www.mediawiki.org/wiki/API:Categorymembers)
## get\_all\_categories
Returns an array containing the names of all categories.
my @categories = $bot->get_all_categories();
print "The categories are:\n@categories\n";
Use `{ max => 0 }` to get all results. The default number of categories returned
is 10; the maximum allowed is 500.
**References:** [API:Allcategories](https://www.mediawiki.org/wiki/API:Allcategories)
## linksearch
Runs a linksearch on the specified $link and returns an array containing
anonymous hashes with keys 'url' for the outbound URL, and 'title' for the page
the link is on.
Additional parameters are:
- A namespace number to search (pass an arrayref to search in multiple namespaces).
- You can search by $protocol (http is default).
- $options\_hashref is fully documented in ["Options hashref"](#options-hashref):
Set _max_ in $options to get more than one query's worth of results:
my $options = { max => 10, }; # I only want some results
my @links = $bot->linksearch("slashdot.org", 1, undef, $options);
foreach my $hash (@links) {
my $url = $hash->{'url'};
my $page = $hash->{'title'};
print "$page: $url\n";
}
Set _hook_ to a subroutine reference to use a callback hook for incremental
processing:
my $options = { hook => \&mysub, }; # I want to do incremental processing
$bot->linksearch("slashdot.org", 1, undef, $options);
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $url = $hashref->{'url'};
my $page = $hashref->{'title'};
print "$page: $url\n";
}
}
**References:** [Finding external links](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Finding-external-links),
[API:Exturlusage](https://www.mediawiki.org/wiki/API:Exturlusage)
## purge\_page
Purges the server cache of the specified $page. Returns true on success; false
on failure. Pass an array reference to purge multiple pages.
If you really care, a true return value is the number of pages successfully
purged. You could check that it is the same as the number you wanted to
purge - maybe some pages don't exist, or you passed invalid titles, or you
aren't allowed to purge the cache:
my @to_purge = ('Main Page', 'A', 'B', 'C', 'Very unlikely to exist');
my $size = scalar @to_purge;
print "all-at-once:\n";
my $success = $bot->purge_page(\@to_purge);
if ($success == $size) {
print "@to_purge: OK ($success/$size)\n";
}
else {
my $missed = @to_purge - $success;
print "We couldn't purge $missed pages (list was: "
. join(', ', @to_purge)
. ")\n";
}
# OR
print "\n\none-at-a-time:\n";
foreach my $page (@to_purge) {
my $ok = $bot->purge_page($page);
print "$page: $ok\n";
}
**References:** [Purging the server cache](https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Purging-the-server-cache),
[API:Purge](https://www.mediawiki.org/wiki/API:Purge)
## get\_namespace\_names
my %namespace_names = $bot->get_namespace_names();
Returns a hash linking the namespace id, such as 1, to its named equivalent,
such as "Talk".
**References:** [API:Meta#siteinfo](https://www.mediawiki.org/wiki/API:Meta#siteinfo_.2F_si)
## image\_usage
Gets a list of pages which include a certain $image. Include the `File:`
namespace prefix to avoid incurring an extra round-trip (which will also emit
a deprecation warning).
Additional parameters are:
- A namespace number to fetch results from (or an arrayref of multiple namespace
numbers)
- One of all, redirects, or nonredirects.
- $options is a hashref as described in the section for ["linksearch"](#linksearch).
my @pages = $bot->image_usage("File:Albert Einstein Head.jpg");
Or, make use of the ["Options hashref"](#options-hashref) to do incremental processing:
$bot->image_usage("File:Albert Einstein Head.jpg",
undef, undef,
{ hook=>\&mysub, max=>5 }
);
sub mysub {
my $res = shift;
foreach my $page (@$res) {
my $title = $page->{'title'};
print "$title\n";
}
}
**References:** [API:Imageusage](https://www.mediawiki.org/wiki/API:Imageusage)
## global\_image\_usage($image, $results, $filterlocal)
Returns an array of hashrefs of data about pages which use the given image.
my @data = $bot->global_image_usage('File:Albert Einstein Head.jpg');
The keys in each hashref are title, url, and wiki. `$results` is the maximum
number of results that will be returned (not the maximum number of requests that
will be sent, like `max` in the ["Options hashref"](#options-hashref)); the default is to
attempt to fetch 500 (set to 0 to get all results). `$filterlocal` will filter
out local uses of the image.
**References:** [Extension:GlobalUsage#API](https://www.mediawiki.org/wiki/Extension:GlobalUsage#API)
## links\_to\_image
A backward-compatible call to ["image\_usage"](#image_usage). You can provide only the image
title.
**This method is deprecated**, and will emit deprecation warnings.
## is\_blocked
my $blocked = $bot->is_blocked('User:Mike.lifeguard');
Checks if a user is currently blocked.
**References:** [API:Blocks](https://www.mediawiki.org/wiki/API:Blocks)
## test\_blocked
Retained for backwards compatibility. Use ["is\_blocked"](#is_blocked) for clarity.
**This method is deprecated**, and will emit deprecation warnings.
## test\_image\_exists
Checks if an image exists at $page. The return value is one of:
- `FILE_NONEXISTENT` (0) means "Nothing there"
- `FILE_LOCAL` (1) means "Yes, an image exists locally"
- `FILE_SHARED` (2) means "Yes, an image exists on [Commons](http://commons.wikimedia.org)"
- `FILE_PAGE_TEXT_ONLY` (3) means "No image exists, but there is text on the page"
If you pass in an arrayref of images, you'll get out an arrayref of
results.
use MediaWiki::Bot::Constants;
my $exists = $bot->test_image_exists('File:Albert Einstein Head.jpg');
if ($exists == FILE_NONEXISTENT) {
print "Doesn't exist\n";
}
elsif ($exists == FILE_LOCAL) {
print "Exists locally\n";
}
elsif ($exists == FILE_SHARED) {
print "Exists on Commons\n";
}
elsif ($exists == FILE_PAGE_TEXT_ONLY) {
print "Page exists, but no image\n";
}
**References:** [API:Properties#imageinfo](https://www.mediawiki.org/wiki/API:Properties#imageinfo_.2F_ii)
## get\_pages\_in\_namespace
$bot->get_pages_in_namespace($namespace, $limit, $options_hashref);
Returns an array containing the names of all pages in the specified namespace.
$namespace must be a namespace number, not a namespace name.
Setting $limit is optional, and specifies how many items to retrieve at once.
Setting this to 'max' is recommended, and this is the default if omitted. If
$limit is over 500, it will be rounded up to the next multiple of 500. If $limit
is set higher than you are allowed to use, it will silently be
reduced. Consider setting key 'max' in the ["Options hashref"](#options-hashref) to
retrieve multiple sets of results:
# Gotta get 'em all!
my @pages = $bot->get_pages_in_namespace(6, 'max', { max => 0 });
**References:** [API:Allpages](https://www.mediawiki.org/wiki/API:Allpages)
## count\_contributions
my $count = $bot->count_contributions($user);
Uses the API to count $user's contributions.
**References:** [API:Users](https://www.mediawiki.org/wiki/API:Users)
## timed\_count\_contributions
($timed_edits_count, $total_count) = $bot->timed_count_contributions($user, $days);
Uses the API to count $user's contributions over the last $days days, and also
returns the user's total contribution count.
Example: to get the user's contributions for the last 30 and 365 days, plus the
total number of edits, you would write something like this:
my ($last30days, $total) = $bot->timed_count_contributions($user, 30);
my $last365days = $bot->timed_count_contributions($user, 365);
You could also get the total number of edits by calling count\_contributions separately:
my $total = $bot->count_contributions($user);
and use timed\_count\_contributions only in scalar context, but that would mean
one more call to the server (and more server load), which timed\_count\_contributions
spares you by returning an array with both values.
**References:** [Extension:UserDailyContribs](https://www.mediawiki.org/wiki/Extension:UserDailyContribs)
## last\_active
my $latest_timestamp = $bot->last_active($user);
Returns the last active time of $user, in `YYYY-MM-DDTHH:MM:SSZ` format.
**References:** [API:Usercontribs](https://www.mediawiki.org/wiki/API:Usercontribs)
## recent\_edit\_to\_page
my ($timestamp, $user) = $bot->recent_edit_to_page($title);
Returns timestamp and username for most recent (top) edit to $page.
**References:** [API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## get\_users
my @recent_editors = $bot->get_users($title, $limit, $revid, $direction);
Gets the most recent editors to $page, up to $limit, starting from $revision
and going in $direction.
**References:** [API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## was\_blocked
for ("Mike.lifeguard", "Jimbo Wales") {
print "$_ was blocked\n" if $bot->was_blocked($_);
}
Returns whether $user has ever been blocked.
**References:** [API:Logevents](https://www.mediawiki.org/wiki/API:Logevents)
## test\_block\_hist
Retained for backwards compatibility. Use ["was\_blocked"](#was_blocked) for clarity.
**This method is deprecated**, and will emit deprecation warnings.
## expandtemplates
my $expanded = $bot->expandtemplates($title, $wikitext);
Expands templates on $title, using $wikitext if provided; otherwise the page
text is loaded automatically.
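For example, a minimal sketch that expands whatever templates appear on a page
(the title is illustrative):

    my $expanded = $bot->expandtemplates('User:Mike.lifeguard/sandbox');
    print "$expanded\n" if defined $expanded;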
**References:** [API:Parsing wikitext](https://www.mediawiki.org/wiki/API:Parsing_wikitext)
## get\_allusers
my @users = $bot->get_allusers($limit, $user_group, $options_hashref);
Returns an array of all users. Default $limit is 500. Optionally specify a
$group (like 'sysop') to list that group only. The last optional parameter
is an ["Options hashref"](#options-hashref).
**References:** [API:Allusers](https://www.mediawiki.org/wiki/API:Allusers)
## db\_to\_domain
Converts a wiki/database name (enwiki) to the domain name (en.wikipedia.org).
my @wikis = ("enwiki", "kowiki", "bat-smgwiki", "nonexistent");
foreach my $wiki (@wikis) {
my $domain = $bot->db_to_domain($wiki);
next if !defined($domain);
print "$wiki: $domain\n";
}
You can pass an arrayref to do bulk lookup:
my @wikis = ("enwiki", "kowiki", "bat-smgwiki", "nonexistent");
my $domains = $bot->db_to_domain(\@wikis);
foreach my $domain (@$domains) {
next if !defined($domain);
print "$domain\n";
}
**References:** [Extension:SiteMatrix](https://www.mediawiki.org/wiki/Extension:SiteMatrix)
## domain\_to\_db
my $db = $bot->domain_to_db($domain_name);
As you might expect, does the opposite of ["db\_to\_domain"](#db_to_domain): Converts a domain
name (meta.wikimedia.org) into a database/wiki name (metawiki).
**References:** [Extension:SiteMatrix](https://www.mediawiki.org/wiki/Extension:SiteMatrix)
## diff
This allows retrieval of a diff from the API. The return is a scalar containing
the _HTML table_ of the diff. Options are passed as a hashref with keys:
- _title_ is the title to use. Provide _either_ this or revid.
- _revid_ is any revid to diff from. If you also specified title, only title will
be honoured.
- _oldid_ is an identifier to diff to. This can be a revid, or the special values
'cur', 'prev' or 'next'
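For example, a minimal sketch (the revid is a placeholder) fetching the HTML
diff of one revision against its predecessor:

    my $diff_html = $bot->diff({
        revid => 123456,
        oldid => 'prev',
    });
    print $diff_html if defined $diff_html;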
**References:** [API:Properties#revisions](https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv)
## prefixindex
This returns an array of hashrefs containing page titles that start with the
given $prefix. The hashref has keys 'title' and 'redirect' (present if the
page is a redirect, not present otherwise).
Additional parameters are:
- One of all, redirects, or nonredirects
- A single namespace number (unlike linksearch etc, which can accept an arrayref
of numbers).
- $options\_hashref as described in ["Options hashref"](#options-hashref).
my @prefix_pages = $bot->prefixindex("User:Mike.lifeguard");
# Or, the more efficient equivalent
my @prefix_pages = $bot->prefixindex("Mike.lifeguard", 2);
    foreach my $hashref (@prefix_pages) {
        my $title = $hashref->{'title'};
        if ($hashref->{'redirect'}) {
            print "$title is a redirect\n";
        }
        else {
            print "$title is not a redirect\n";
}
}
**References:** [API:Allpages](https://www.mediawiki.org/wiki/API:Allpages)
## search
This is a simple search for your $search\_term in page text. It returns an array
of page titles matching.
Additional optional parameters are:
- A namespace number to search in, or an arrayref of numbers (default is the
main namespace)
- $options\_hashref is a hashref as described in ["Options hashref"](#options-hashref):
my @pages = $bot->search("Mike.lifeguard", 2);
print "@pages\n";
Or, use a callback for incremental processing:
my @pages = $bot->search("Mike.lifeguard", 2, { hook => \&mysub });
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $page = $hashref->{'title'};
print "$page\n";
}
}
**References:** [API:Search](https://www.mediawiki.org/wiki/API:Search)
## get\_log
This fetches log entries and returns the results as a reference to an array of
hashes. The first
parameter is a hashref with keys:
- _type_ is the log type (block, delete...)
- _user_ is the user who _performed_ the action. Do not include the User: prefix
- _target_ is the target of the action. Where an action was performed to a page,
it is the page title. Where an action was performed to a user, it is
User:$username.
The second is the familiar ["Options hashref"](#options-hashref).
my $log = $bot->get_log({
type => 'block',
        user => 'Mike.lifeguard',
});
foreach my $entry (@$log) {
my $user = $entry->{'title'};
print "$user\n";
}
$bot->get_log({
type => 'block',
        user => 'Mike.lifeguard',
},
{ hook => \&mysub, max => 10 }
);
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $title = $hashref->{'title'};
print "$title\n";
}
}
**References:** [API:Logevents](https://www.mediawiki.org/wiki/API:Logevents)
## is\_g\_blocked
my $is_globally_blocked = $bot->is_g_blocked('127.0.0.1');
Returns the IP/range block _currently in place_ that affects the IP/range. The
return is a scalar of an IP/range if found (evaluates to true in boolean
context); undef otherwise (evaluates false in boolean context). Pass in a
single IP or CIDR range.
**References:** [Extension:GlobalBlocking](https://www.mediawiki.org/wiki/Extension:GlobalBlocking/API)
## was\_g\_blocked
print "127.0.0.1 was globally blocked\n" if $bot->was_g_blocked('127.0.0.1');
Returns whether an IP/range was ever globally blocked. You should probably
call this method only when your bot is operating on Meta - this method will
warn if not.
**References:** [API:Logevents](https://www.mediawiki.org/wiki/API:Logevents)
## was\_locked
my $was_locked = $bot->was_locked('Mike.lifeguard');
Returns whether a user was ever locked. You should probably call this method
only when your bot is operating on Meta - this method will warn if not.
**References:** [API:Logevents](https://www.mediawiki.org/wiki/API:Logevents)
## get\_protection
Returns data on page protection as an array of up to two hashrefs. Each hashref
has a type, level, and expiry. Levels are 'sysop' and 'autoconfirmed'; types are
'move' and 'edit'; expiry is a timestamp. Additionally, the key 'cascade' will
exist if cascading protection is used.
my $page = 'Main Page';
$bot->edit({
page => $page,
text => rand(),
summary => 'test',
}) unless $bot->get_protection($page);
You can also pass an arrayref of page titles to do bulk queries:
my @pages = ('Main Page', 'User:Mike.lifeguard', 'Project:Sandbox');
my $answer = $bot->get_protection(\@pages);
foreach my $title (keys %$answer) {
my $protected = $answer->{$title};
print "$title is protected\n" if $protected;
print "$title is unprotected\n" unless $protected;
}
**References:** [API:Properties#info](https://www.mediawiki.org/wiki/API:Properties#info_.2F_in)
## is\_protected
This is a synonym for ["get\_protection"](#get_protection), which should be used in preference.
**This method is deprecated**, and will emit deprecation warnings.
## patrol
$bot->patrol($rcid);
Marks a page or revision identified by the $rcid as patrolled. To mark several
RCIDs as patrolled, you may pass an arrayref of them. Returns false and sets
`$bot->{error}` if the account cannot patrol.
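For example, a sketch that patrols everything returned by a
["recentchanges"](#recentchanges) query, assuming the account has patrol rights:

    my @rc = $bot->recentchanges({ limit => 10 });
    my @rcids = map { $_->{rcid} } @rc;
    $bot->patrol(\@rcids)
        or warn "Couldn't patrol: $bot->{error}->{details}\n";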
**References:** [API:Patrol](https://www.mediawiki.org/wiki/API:Patrol)
## email
$bot->email($user, $subject, $body);
This allows you to send emails through the wiki. All 3 of $user (without the
User: prefix), $subject and $body are required. If $user is an arrayref, this
will send the same email (subject and body) to all users.
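For example, to send the same notice to several users (the usernames, subject,
and body are illustrative):

    $bot->email(['Mike.lifeguard', 'Jimbo Wales'],
        'Bot run finished',
        'The bot has finished its run; see the logs for details.');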
**References:** [API:Email](https://www.mediawiki.org/wiki/API:Email)
## top\_edits
Returns an array of the page titles where the $user is the latest editor. The
second parameter is the familiar [$options\_hashref](#linksearch).
my @pages = $bot->top_edits("Mike.lifeguard", {max => 5});
foreach my $page (@pages) {
$bot->rollback($page, "Mike.lifeguard");
}
Note that accessing the data with a callback happens **before** filtering
the top edits is done. For that reason, you should use ["contributions"](#contributions)
if you need to use a callback. If you use a callback with top\_edits(),
you **will not** necessarily get top edits returned. It is only safe to use a
callback if you _check_ that it is a top edit:
$bot->top_edits("Mike.lifeguard", { hook => \&rv });
sub rv {
my $data = shift;
foreach my $page (@$data) {
if (exists($page->{'top'})) {
$bot->rollback($page->{'title'}, "Mike.lifeguard");
}
}
}
**References:** [API:Usercontribs](https://www.mediawiki.org/wiki/API:Usercontribs)
## contributions
my @contribs = $bot->contributions($user, $namespace, $options, $from, $to);
Returns an array of hashrefs of data for the user's contributions. $namespace
can be an arrayref of namespace numbers. $options can be specified as in
["linksearch"](#linksearch).
$from and $to are optional timestamps. ISO 8601 date and time is recommended:
2001-01-15T14:56:00Z, see [https://www.mediawiki.org/wiki/Timestamp](https://www.mediawiki.org/wiki/Timestamp) for all
possible formats.
Note that $from (=ucend) has to be before $to (=ucstart), unlike direct API access.
Specify an arrayref of users to get results for multiple users.
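For example, a sketch listing main-namespace contributions made during 2023
(the timestamps are illustrative, and the `title` key is assumed from the usual
revisions data):

    my @contribs = $bot->contributions('Mike.lifeguard', 0, undef,
        '2023-01-01T00:00:00Z', '2023-12-31T23:59:59Z');
    foreach my $contrib (@contribs) {
        print "$contrib->{title}\n";
    }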
**References:** [API:Usercontribs](https://www.mediawiki.org/wiki/API:Usercontribs)
## upload
$bot->upload({ data => $file_contents, summary => 'uploading file' });
$bot->upload({ file => $file_name, title => 'Target filename.png' });
Upload a file to the wiki. Specify the file by either giving the filename, which
will be read in, or by giving the data directly.
**References:** [API:Upload](https://www.mediawiki.org/wiki/API:Upload)
## upload\_from\_url
Uploads a file directly from a URL to the wiki. Specify the URL, the new
filename, and a summary. The summary and the new filename are optional.
$bot->upload_from_url({
url => 'http://some.domain.ext/pic.png',
title => 'Target_filename.png',
summary => 'uploading new pic',
});
If uploading from a URL is enabled on your target wiki (i.e. `$wgAllowCopyUploads`
is set to true in LocalSettings.php) and you have the appropriate user rights, you
can use this function to upload files to your wiki directly from a remote server.
**References:** [API:Upload#Uploading\_from\_URL](https://www.mediawiki.org/wiki/API:Upload#Uploading_from_URL)
## usergroups
Returns a list of the usergroups a user is in:
my @usergroups = $bot->usergroups('Mike.lifeguard');
**References:** [API:Users](https://www.mediawiki.org/wiki/API:Users)
## get\_mw\_version
Returns a hash ref with the MediaWiki version. The hash ref contains the keys
_major_, _minor_, _patch_, and _string_.
Returns undef on errors.
my $mw_version = $bot->get_mw_version;
# get version as string
my $mw_ver_as_string = $mw_version->{'major'} . '.' . $mw_version->{'minor'};
if(defined $mw_version->{'patch'}){
$mw_ver_as_string .= '.' . $mw_version->{'patch'};
}
# or simply
my $mw_ver_as_string = $mw_version->{'string'};
**References:** [API:Siteinfo](https://www.mediawiki.org/wiki/API:Siteinfo)
## Options hashref
This is passed through to the lower-level interface [MediaWiki::API](https://metacpan.org/pod/MediaWiki%3A%3AAPI), and is
fully documented there.
The hashref can have 3 keys:
- max
Specifies the maximum number of queries to retrieve data from the wiki. This is
independent of the _size_ of each query (how many items each query returns).
Set to 0 to retrieve all the results.
- hook
Specifies a coderef to a hook function that can be used to process large lists
as they come in. When this is used, your subroutine will get the raw data. This
is noted in cases where it is known to be significant. For example, when
using a hook with `top_edits()`, you need to check whether the edit is the top
edit yourself - your subroutine gets results as they come in, and before they're
filtered.
- skip\_encoding
MediaWiki's API uses UTF-8 and any 8 bit character string parameters are encoded
automatically by the API call. If your parameters are already in UTF-8 this will
be detected and the encoding will be skipped. If your parameters for some reason
contain UTF-8 data but no UTF-8 flag is set (i.e. you did not use the
`use [utf8](https://metacpan.org/pod/utf8);` pragma) you should prevent re-encoding by passing an option
`skip_encoding => 1`. For example:
$category ="Cat\x{e9}gorie:moyen_fran\x{e7}ais"; # latin1 string
$bot->get_all_pages_in_category($category); # OK
$category = "Cat". pack("U", 0xe9)."gorie:moyen_fran".pack("U",0xe7)."ais"; # unicode string
$bot->get_all_pages_in_category($category); # OK
$category ="Cat\x{c3}\x{a9}gorie:moyen_fran\x{c3}\x{a7}ais"; # unicode data without utf-8 flag
# $bot->get_all_pages_in_category($category); # NOT OK
$bot->get_all_pages_in_category($category, { skip_encoding => 1 }); # OK
If you need this, it probably means you're doing something wrong. Feel free to
ask for help.
# ERROR HANDLING
All functions will return undef in any handled error situation. Further error
data is stored in `$bot->{error}->{code}` and `$bot->{error}->{details}`.
Error codes are provided as constants in [MediaWiki::Bot::Constants](https://metacpan.org/pod/MediaWiki%3A%3ABot%3A%3AConstants), and can also
be imported through this module:
use MediaWiki::Bot qw(:constants);
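For example, a minimal sketch of checking for a handled error after a call:

    unless ($bot->edit({ page => 'Project:Sandbox', text => 'test', summary => 'test' })) {
        warn sprintf "Error %s: %s\n",
            $bot->{error}->{code}, $bot->{error}->{details};
    }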
# AVAILABILITY
The project homepage is [https://metacpan.org/module/MediaWiki::Bot](https://metacpan.org/module/MediaWiki::Bot).
The latest version of this module is available from the Comprehensive Perl
Archive Network (CPAN). Visit [http://www.perl.com/CPAN/](http://www.perl.com/CPAN/) to find a CPAN
site near you, or see [https://metacpan.org/module/MediaWiki::Bot/](https://metacpan.org/module/MediaWiki::Bot/).
# SOURCE
The development version is on github at [https://github.com/MediaWiki-Bot/MediaWiki-Bot](https://github.com/MediaWiki-Bot/MediaWiki-Bot)
and may be cloned from [git://github.com/MediaWiki-Bot/MediaWiki-Bot.git](git://github.com/MediaWiki-Bot/MediaWiki-Bot.git)
# BUGS AND LIMITATIONS
You can make new bug reports, and view existing ones, through the
web interface at [https://github.com/MediaWiki-Bot/MediaWiki-Bot/issues](https://github.com/MediaWiki-Bot/MediaWiki-Bot/issues).
# AUTHORS
- Dan Collins <dcollins@cpan.org>
- Mike.lifeguard <lifeguard@cpan.org>
- Alex Rowe <alex.d.rowe@gmail.com>
- Oleg Alexandrov <oleg.alexandrov@gmail.com>
- jmax.code <jmax.code@gmail.com>
- Stefan Petrea <stefan.petrea@gmail.com>
- kc2aei <kc2aei@gmail.com>
- bosborne@alum.mit.edu
- Brian Obio <brianobio@gmail.com>
- patch and bug report contributors
# COPYRIGHT AND LICENSE
This software is Copyright (c) 2021 by the MediaWiki::Bot team <perlwikibot@googlegroups.com>.
This is free software, licensed under:
The GNU General Public License, Version 3, June 2007