Download Information
To download the datasets, please fill out the following User Agreement. You will receive a link to download the datasets once the User Agreement has been registered.
Please cite the following paper if you use the dataset:
T. Markchom et al., “PETS2025: Multi-Authority Multi-Sensor Maritime Surveillance Challenge and Evaluation,” 2025 IEEE International Conference on Advanced Visual and Signal-Based Systems (AVSS), Tainan, Taiwan, 2025, pp. 1-14, doi: 10.1109/AVSS65446.2025.11149786.
```bibtex
@INPROCEEDINGS{11149786,
  author={Markchom, Thanet and Boyle, Jonathan and Chen, Lulu and Ferryman, James and Marturini, Matteo and Veigl, Stephan and Opitz, Andreas and Kriechbaum-Zabini, Andreas and Bratskas, Romaios and Gkamaris, Anastasios and Papachristos, Dimitris and Leventakis, George and Fan, Wenjun and Huang, Hsiang-Wei and Hwang, Jeng-Neng and Kim, Pyongkun and Kim, Kwangju and Huang, Chung-I and Saito, Kenta and Kaneko, Shunta and Sudo, Kyoko and Thanh Thien, Nguyen and Kao, Meng-Yu and Hsieh, Jun-Wei and Lilek, Teepakorn and Pomsuwan, Tossapol and Gu, Jinjie and Xu, Tianyang and Zhu, Xuefeng and Wu, Xiaojun and Kittler, Josef and Stacy, Stephanie and Gabaldon, Alfredo and Tu, Peter and Kim, Sangwon and Kim, Dongyoung and Lee, Kyoungoh},
  booktitle={2025 IEEE International Conference on Advanced Visual and Signal-Based Systems (AVSS)},
  title={PETS2025: Multi-Authority Multi-Sensor Maritime Surveillance Challenge and Evaluation},
  year={2025},
  volume={},
  number={},
  pages={1-14},
  keywords={YOLO;Target tracking;Geology;Surveillance;Sea measurements;Thermal sensors;Autonomous aerial vehicles;Transformers;Sensors;Telemetry},
  doi={10.1109/AVSS65446.2025.11149786}}
```
Legal note: The image sequences are copyrighted by the EURMARS project and the University of Reading. Permission is hereby granted for free download for the purposes of the PETS2025 challenge and academic and industrial research. Where the data is disseminated (e.g., in publications or presentations), the source should be acknowledged.
Challenges 1 and 2 (`challenge1_challenge2/`)
- Below is the structure of the dataset folder.
```
challenge1_challenge2/
├── train/
│   ├── [scenario]/
│   │   ├── [sensor type]/
│   │   │   ├── annotations.xml
│   │   │   ├── images/
├── test/
│   ├── [scenario]/
│   │   ├── [sensor type]/
│   │   │   ├── images/
```
- For Challenge 1 and Challenge 2, there are:
  - 11 scenarios for training
  - 4 scenarios for testing (evaluation)
- Each scenario includes a varying number of sensor types.
- The dataset folder `challenge1_challenge2/` contains `train/` and `test/`.
- Data is organized by `[scenario]` and `[sensor type]`.
- Each sensor type folder contains:
  - `annotations.xml`: Stores the ground truth bounding boxes, labels, and track IDs for Challenges 1 and 2.
  - `images/`: Contains images in a sequential format (e.g., `0001.jpg`, `0002.jpg`, etc.).
- The test set does not contain `annotations.xml`, as it is intended for evaluation.
- Each `annotations.xml` file contains ground truth bounding boxes, object classes, and track IDs in XML format.
- Each image is represented by an `<image>` tag, which contains the following attributes:
  - `id`: A unique identifier for the image.
  - `name`: The file name of the image (e.g., `0001.jpg`).
  - `width` and `height`: The dimensions of the image (in pixels).
- Inside each `<image>` tag, multiple `<box>` tags are present.
- Each `<box>` defines a bounding box around an object of interest in the image and includes the following attributes:
  - `label`: The class of the object (`person`, `vessel`, or `vehicle`).
  - `xtl`, `ytl`: The coordinates of the top-left corner of the bounding box.
  - `xbr`, `ybr`: The coordinates of the bottom-right corner of the bounding box.
- Each bounding box contains an `<attribute>` tag specifying a track ID for the object.
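As a minimal sketch of reading this format, the snippet below parses the described structure with the standard library. The sample XML and the attribute name `track_id` are illustrative assumptions based on the description above, not taken from a real file; check an actual `annotations.xml` for the exact attribute naming.

```python
# Sketch: parsing an annotations.xml in the format described above.
# SAMPLE and the attribute name "track_id" are illustrative assumptions.
import xml.etree.ElementTree as ET

SAMPLE = """<annotations>
  <image id="0" name="0001.jpg" width="1920" height="1080">
    <box label="vessel" xtl="100.0" ytl="200.0" xbr="300.0" ybr="400.0">
      <attribute name="track_id">3</attribute>
    </box>
  </image>
</annotations>"""

def parse_annotations(xml_text):
    """Return a list of (image name, label, (xtl, ytl, xbr, ybr), track id)."""
    root = ET.fromstring(xml_text)
    records = []
    for image in root.iter("image"):
        for box in image.iter("box"):
            coords = tuple(float(box.get(k)) for k in ("xtl", "ytl", "xbr", "ybr"))
            attr = box.find("attribute")  # track ID attribute of this box
            track_id = attr.text if attr is not None else None
            records.append((image.get("name"), box.get("label"), coords, track_id))
    return records

records = parse_annotations(SAMPLE)
print(records)
```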
The tables below summarise the statistics of the datasets.
- GS_RGB = Visible (RGB) ground sensor
- GS_SWIR = Short-wave infrared (SWIR) ground sensor
- GS_Therm = Thermal ground sensor
- GS_UV = Ultraviolet (UV) ground sensor
- UAV_RGB = Visible (RGB) UAV
- UAV_Therm = Thermal UAV
Training set
| Scenario | Sensor | #images | #persons | #vessels | #vehicles | #tracks |
|---|---|---|---|---|---|---|
| bg1 | GS_RGB | 378 | 3141 | 769 | 76 | 24 |
| | GS_SWIR | 385 | 2288 | 822 | – | 15 |
| | GS_Therm | 386 | 2675 | 778 | – | 25 |
| | GS_UV | 386 | 2479 | 897 | – | 15 |
| bg3 | GS_RGB | 728 | 6791 | 1523 | – | 16 |
| | GS_SWIR | 800 | 6693 | 1846 | – | 14 |
| | GS_Therm | 742 | 6183 | 1485 | – | 19 |
| | GS_UV | 702 | 5117 | 1657 | – | 16 |
| bg4 | GS_RGB | 589 | 4148 | 1367 | 162 | 42 |
| | GS_SWIR | 754 | 6591 | 2091 | 13 | 32 |
| | GS_Therm | 635 | 4145 | 1471 | – | 26 |
| | GS_UV | 746 | 6669 | 2204 | 17 | 30 |
| bg5 | GS_RGB | 486 | 1792 | 676 | – | 17 |
| | GS_SWIR | 583 | 1751 | 1165 | – | 9 |
| | GS_Therm | 499 | 450 | 861 | – | 12 |
| | GS_UV | 530 | 879 | 995 | – | 6 |
| bg7 | GS_RGB | 756 | 2758 | 1009 | 69 | 24 |
| | GS_Therm | 810 | 1165 | 1869 | – | 13 |
| | UAV_Therm | 808 | 8801 | 1395 | 408 | 44 |
| bg9 | GS_RGB | 357 | 1060 | 611 | 14 | 14 |
| | GS_SWIR | 386 | 581 | 1094 | – | 7 |
| | GS_Therm | 391 | 1037 | 685 | – | 8 |
| | GS_UV | 387 | 600 | 1250 | – | 7 |
| bg10 | GS_RGB | 459 | 1798 | 1051 | 144 | 17 |
| | GS_SWIR | 449 | 1297 | 1559 | – | 9 |
| | GS_Therm | 454 | 1192 | 972 | – | 6 |
| | GS_UV | 387 | 986 | 1340 | – | 11 |
| bg11 | GS_RGB | 391 | 5203 | 1255 | – | 30 |
| | GS_SWIR | 409 | 4085 | 1325 | – | 30 |
| | GS_Therm | 406 | 4217 | 1102 | – | 33 |
| | GS_UV | 410 | 3771 | 1356 | – | 30 |
| bg12 | GS_RGB | 263 | 776 | 958 | – | 9 |
| | GS_SWIR | 300 | 753 | 306 | – | 7 |
| | GS_Therm | 297 | 652 | 36 | – | 5 |
| | GS_UV | 301 | 790 | 416 | – | 7 |
| cy1 | UAV_RGB | 970 | 2115 | 399 | – | 10 |
| | UAV_Therm | 969 | 1795 | 906 | – | 5 |
| cy2 | UAV_RGB | 1204 | 1569 | 888 | – | 12 |
| | UAV_Therm | 1193 | 3962 | 2151 | – | 22 |
| Total | | 22086 | 112755 | 44540 | 903 | 678 |
Test set
| Scenario | Sensor | #images | #persons | #vessels | #vehicles | #tracks |
|---|---|---|---|---|---|---|
| bg2 | GS_RGB | 871 | 8233 | 2506 | – | 26 |
| | GS_SWIR | 846 | 6069 | 2197 | – | 14 |
| | GS_Therm | 845 | 5679 | 1723 | – | 20 |
| | GS_UV | 845 | 5131 | 2318 | – | 18 |
| | UAV_Therm | 842 | 6780 | 1947 | – | 17 |
| bg6 | GS_RGB | 628 | 3536 | 1485 | 133 | 28 |
| | GS_SWIR | 569 | 3531 | 2116 | – | 14 |
| | GS_Therm | 565 | 3851 | 1823 | – | 15 |
| | GS_UV | 571 | 3818 | 2038 | – | 16 |
| bg8 | GS_RGB | 1526 | 7434 | 5345 | – | 24 |
| | GS_SWIR | 1515 | 5093 | 4609 | – | 25 |
| | GS_Therm | 1515 | 6337 | 4046 | – | 22 |
| | UAV_Therm | 1513 | 9455 | 7426 | 425 | 47 |
| cy3 | UAV_RGB | 1072 | 440 | 1552 | – | 16 |
| | UAV_Therm | 1059 | 1714 | 1840 | – | 9 |
| Total | | 14782 | 77101 | 42971 | 558 | 311 |
Challenge 3 (`challenge3/`)
- Below is the structure of the dataset folder:
```
challenge3/
├── train/
│   ├── [scenario]/
│   │   ├── [sensor type]/
│   │   │   ├── annotations.xml
│   │   │   ├── annotations_geo.xml
│   │   │   ├── images/
│   │   │   ├── telemetry/
├── test/
│   ├── [scenario]/
│   │   ├── [sensor type]/
│   │   │   ├── annotations.xml
│   │   │   ├── images/
│   │   │   ├── telemetry/
```
- For Challenge 3, there are 8 scenarios for training and 5 scenarios for testing. All scenarios can be categorised into three groups:
- rd: Controlled scenarios designed with known conditions for trials and calibration. The UAV deployment position and the object of interest were at the same altitude, both above sea level.
- bg: Real-world scenarios where the UAV deployment position and the object of interest were at the same altitude, both at sea level.
- cy: Real-world scenarios where the UAV deployment position and the object of interest were at different altitudes: the UAV deployment position was above sea level, while the object was at sea level.
- The dataset folder structure is similar to Challenges 1 and 2, but with additional data.
- Each sensor type folder contains:
  - `annotations.xml`: Stores ground truth bounding boxes for the object of interest, for which participants must provide geolocations. The format is the same as in Challenges 1 and 2.
  - `annotations_geo.xml`: Contains ground truth bounding boxes for the object of interest along with their corresponding geolocations.
  - `images/`: Contains images in a sequential format (e.g., `0001.jpg`, `0002.jpg`, etc.).
  - `telemetry/`: Contains JSON files with telemetry data in a sequential format, corresponding to the images in `images/` (e.g., `0001.json`, `0002.json`).
- The test set does not contain `annotations_geo.xml`, as it is intended for evaluation.
- The `annotations.xml` format is the same as in Challenge 1 and Challenge 2.
- The `annotations_geo.xml` format is similar to `annotations.xml` but adds geolocation coordinates. Specifically, each `<box>` tag contains an additional `<attribute>` tag with the `coordinates` attribute, which holds the geolocation of the object in the format `[longitude, latitude]`.
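As a minimal sketch of reading the geolocation from such a `<box>`, the snippet below parses the `[longitude, latitude]` string with the standard library. The sample XML (including the coordinate values) is an illustrative assumption based on the format described above, not taken from a real file.

```python
# Sketch: reading geolocations from an annotations_geo.xml <box>.
# SAMPLE is an illustrative assumption; the "coordinates" attribute
# holds "[longitude, latitude]" per the description above.
import json
import xml.etree.ElementTree as ET

SAMPLE = """<annotations>
  <image id="0" name="0001.jpg" width="640" height="512">
    <box label="vessel" xtl="10.0" ytl="20.0" xbr="30.0" ybr="40.0">
      <attribute name="coordinates">[23.5912, 38.0432]</attribute>
    </box>
  </image>
</annotations>"""

root = ET.fromstring(SAMPLE)
geolocations = []
for box in root.iter("box"):
    for attr in box.iter("attribute"):
        if attr.get("name") == "coordinates":
            # The attribute text is a JSON-style list: [longitude, latitude]
            lon, lat = json.loads(attr.text)
            geolocations.append((lon, lat))
print(geolocations)  # [(23.5912, 38.0432)]
```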
The tables below show the statistics of the training and test sets for this challenge.
- UAV_Therm = Thermal UAV
Training set
| Scenario | Sensor | #images | #persons | #vessels |
|---|---|---|---|---|
| cy1 | UAV_Therm | 969 | – | 1812 |
| cy2 | UAV_Therm | 1193 | – | 2268 |
| rd1 | UAV_Therm | 514 | 830 | – |
| rd2 | UAV_Therm | 588 | 1054 | – |
| rd3 | UAV_Therm | 1388 | 2776 | – |
| rd4 | UAV_Therm | 322 | 644 | – |
| rd5 | UAV_Therm | 395 | 674 | – |
| rd6 | UAV_Therm | 589 | 1058 | – |
| Total | | 5958 | 7036 | 4080 |
Test set
| Scenario | Sensor | #images | #persons | #vessels |
|---|---|---|---|---|
| bg7 | UAV_Therm | 808 | – | 1394 |
| cy4 | UAV_Therm | 351 | – | 686 |
| cy5 | UAV_Therm | 225 | – | 450 |
| cy6 | UAV_Therm | 201 | – | 402 |
| rd7 | UAV_Therm | 561 | 928 | – |
| Total | | 2146 | 928 | 2932 |