{"id":255,"date":"2025-03-05T14:48:27","date_gmt":"2025-03-05T14:48:27","guid":{"rendered":"https:\/\/research.reading.ac.uk\/pets2025\/?page_id=255"},"modified":"2025-03-21T09:15:59","modified_gmt":"2025-03-21T09:15:59","slug":"author-instructions","status":"publish","type":"page","link":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/","title":{"rendered":"Author Instructions"},"content":{"rendered":"<h1>Task Descriptions<\/h1>\n<h3 data-start=\"51\" data-end=\"199\"><strong data-start=\"55\" data-end=\"197\">Challenge 1: Target Detection and Classification<span style=\"color: #ff0000\"> (Mandatory)<\/span><\/strong><\/h3>\n<ul data-start=\"110\" data-end=\"739\">\n<li data-start=\"110\" data-end=\"382\">\n<p data-start=\"112\" data-end=\"293\"><strong data-start=\"112\" data-end=\"125\">Objective<\/strong>: Detect and classify persons (including individuals on deck, on the shore, and a floating mannequin or dummy in the water), vessels, and vehicles in image sequences.<\/p>\n<ul data-start=\"298\" data-end=\"382\">\n<li data-start=\"298\" data-end=\"382\">Provide bounding boxes and class labels of detected objects in the given format.<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"384\" data-end=\"563\">\n<p data-start=\"386\" data-end=\"457\"><strong data-start=\"386\" data-end=\"397\">Sensors<\/strong>: Image sequences from multiple sensor types are provided:<\/p>\n<ul data-start=\"462\" data-end=\"563\">\n<li data-start=\"462\" data-end=\"517\"><strong data-start=\"464\" data-end=\"482\">Ground Sensors<\/strong>: Visible (RGB), Thermal, UV, and SWIR.<\/li>\n<li data-start=\"522\" data-end=\"563\"><strong data-start=\"524\" data-end=\"539\">UAV Sensors<\/strong>: Visible (RGB) and Thermal.<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"565\" data-end=\"649\">\n<p data-start=\"567\" data-end=\"649\"><strong data-start=\"567\" data-end=\"584\">Training data<\/strong> with ground truth bounding boxes and class labels is provided.<\/p>\n<\/li>\n<li data-start=\"650\" data-end=\"739\">\n<p data-start=\"652\" data-end=\"739\"><strong>Test data<\/strong> is provided; results should be submitted for <strong>all test sequences (if possible, for comprehensive evaluation), or at least two test sequences from different sensor types<\/strong>. 
For example, one sequence could be from RGB ground sensor in scenario &#8220;bg2&#8221;, and another could be from the thermal UAV in scenario &#8220;cy3&#8221;.<\/p>\n<\/li>\n<\/ul>\n<table style=\"border-collapse: collapse;width: 100%\">\n<tbody>\n<tr style=\"height: 24px\">\n<td style=\"width: 15.7368%;height: 24px;text-align: center\"><strong>Class<\/strong><\/td>\n<td style=\"width: 22.521%;height: 24px;text-align: center\"><strong>Example<\/strong><\/td>\n<td style=\"width: 22.9656%;text-align: center;height: 24px\"><strong>Example<\/strong><\/td>\n<\/tr>\n<tr style=\"height: 24px\">\n<td style=\"width: 15.7368%;height: 10px;text-align: center\"><strong>Person <\/strong><\/p>\n<p><strong>(on deck)<\/strong><\/td>\n<td style=\"width: 22.521%;height: 10px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-540 aligncenter\" style=\"letter-spacing: 0.08px\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB-1024x683.jpg 1024w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB-768x512.jpg 768w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_RGB.jpg 1277w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<td style=\"width: 22.9656%;height: 10px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-541 aligncenter\" style=\"letter-spacing: 0.08px\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_SWIR-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_SWIR-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_SWIR-768x512.jpg 768w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_SWIR-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_deck_SWIR.jpg 801w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<\/tr>\n<tr style=\"height: 24px\">\n<td style=\"width: 15.7368%;height: 24px;text-align: center\"><strong>Person <\/strong><\/p>\n<p><strong>(on the shore)<\/strong><\/td>\n<td style=\"width: 22.521%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-538 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-1024x683.jpg 1024w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-768x512.jpg 768w, 
https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-1536x1024.jpg 1536w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-2048x1366.jpg 2048w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_UV-272x182.jpg 272w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<td style=\"width: 22.9656%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-581 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR-1024x683.jpg 1024w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR-768x512.jpg 768w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_on_shore_SWIR.jpg 1227w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<\/tr>\n<tr style=\"height: 24px\">\n<td style=\"width: 15.7368%;height: 24px;text-align: center\"><strong>Person <\/strong><\/p>\n<p><strong>(in the water)<\/strong><\/td>\n<td style=\"width: 22.521%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-543 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVRGB-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVRGB-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVRGB-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVRGB.jpg 675w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<td style=\"width: 22.9656%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-542 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVTherm-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVTherm-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVTherm-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/person_in_water_UAVTherm.jpg 427w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<\/tr>\n<tr style=\"height: 24px\">\n<td style=\"width: 15.7368%;height: 24px;text-align: center\"><strong>Vessel<\/strong><\/td>\n<td style=\"width: 22.521%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-551 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_UAVRGB-300x201.jpg\" alt=\"\" width=\"300\" height=\"201\" 
srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_UAVRGB-300x201.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_UAVRGB-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_UAVRGB.jpg 554w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<td style=\"width: 22.9656%;height: 24px\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-552 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_Therm-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_Therm-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_Therm-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vessel_Therm.jpg 436w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<\/tr>\n<tr>\n<td style=\"width: 15.7368%;text-align: center\"><strong>Vehicle<\/strong><\/td>\n<td style=\"width: 22.521%\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-553 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB-1024x682.jpg 1024w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB-768x512.jpg 768w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_RGB.jpg 1301w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<td style=\"width: 22.9656%\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-554 aligncenter\" src=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_UAVTherm-300x200.jpg\" alt=\"\" width=\"300\" height=\"200\" srcset=\"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_UAVTherm-300x200.jpg 300w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_UAVTherm-272x182.jpg 272w, https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/vehicle_UAVTherm.jpg 512w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>&nbsp;<\/p>\n<hr data-start=\"741\" data-end=\"744\" \/>\n<h3 data-start=\"441\" data-end=\"572\"><strong data-start=\"445\" data-end=\"570\">Challenge 2: Long-Term Target Tracking (Optional)<\/strong><\/h3>\n<ul data-start=\"773\" data-end=\"1702\">\n<li data-start=\"773\" data-end=\"1319\">\n<p data-start=\"775\" data-end=\"863\"><strong data-start=\"775\" data-end=\"788\">Objective<\/strong>: Track persons, vessels, and vehicles continuously over image sequences.<\/p>\n<ul data-start=\"868\" data-end=\"1319\">\n<li data-start=\"868\" data-end=\"947\">Objects should be detected and classified as persons, vessels, or vehicles.<\/li>\n<li data-start=\"952\" 
data-end=\"1183\">Each detected object should be assigned a unique identifier (or track ID) that remains consistent throughout the sequence. The ID should be provided along with the corresponding bounding box and class label in the given format.<\/li>\n<li data-start=\"1188\" data-end=\"1319\">If an object disappears temporarily (due to occlusion or movement), it should be associated with the same ID when it reappears.<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"1321\" data-end=\"1500\">\n<p data-start=\"1323\" data-end=\"1394\"><strong data-start=\"1323\" data-end=\"1334\">Sensors<\/strong>: Image sequences are provided from multiple sensor types:<\/p>\n<ul data-start=\"1399\" data-end=\"1500\">\n<li data-start=\"1399\" data-end=\"1454\"><strong data-start=\"1401\" data-end=\"1419\">Ground Sensors<\/strong>: Visible (RGB), Thermal, UV, and SWIR.<\/li>\n<li data-start=\"1459\" data-end=\"1500\"><strong data-start=\"1461\" data-end=\"1476\">UAV Sensors<\/strong>: Visible (RGB) and Thermal.<\/li>\n<\/ul>\n<\/li>\n<li data-start=\"1502\" data-end=\"1612\">\n<p data-start=\"1504\" data-end=\"1612\"><strong data-start=\"1504\" data-end=\"1521\">Training data<\/strong> includes ground truth bounding boxes, class labels, and track IDs of individual objects.<\/p>\n<\/li>\n<li data-start=\"1613\" data-end=\"1702\">\n<p data-start=\"1615\" data-end=\"1702\"><strong data-start=\"1615\" data-end=\"1628\">Test data<\/strong> is provided; results should be submitted for <strong>all test sequences (if possible, for comprehensive evaluation), or at least two test sequences from different sensor types.<\/strong>\u00a0For example, one sequence could be from RGB ground sensor in scenario &#8220;bg2&#8221;, and another could be from the thermal UAV in scenario &#8220;cy3&#8221;.<\/p>\n<\/li>\n<\/ul>\n<hr data-start=\"1704\" data-end=\"1707\" \/>\n<h3 data-start=\"813\" data-end=\"893\"><strong data-start=\"817\" data-end=\"891\">Challenge 3: Target Geolocation Approximation\u00a0<\/strong><strong data-start=\"445\" data-end=\"570\">(Optional)<\/strong><\/h3>\n<ul data-start=\"1753\" data-end=\"3147\">\n<li data-start=\"1753\" data-end=\"2757\">\n<p data-start=\"1755\" data-end=\"1918\"><strong data-start=\"1755\" data-end=\"1768\">Objective<\/strong>: Approximate the geolocations (longitudes and latitudes) of a specified object (either a person or vessel) over image sequences from a thermal UAV.<\/p>\n<ul data-start=\"1923\" data-end=\"2757\">\n<li data-start=\"1923\" data-end=\"2058\">Throughout each image sequence, ground-truth bounding boxes of objects are provided, for which geolocations should be approximated.<\/li>\n<li data-start=\"2063\" data-end=\"2187\">Provide the geolocations of these bounding boxes in the form of <strong>longitude and latitude<\/strong> coordinates, in the given format.<\/li>\n<li data-start=\"2192\" data-end=\"2493\">Telemetry data is provided, including focal length, digital zoom ratio, latitude, longitude, relative altitude, absolute altitude, gimbal yaw, gimbal pitch, gimbal roll, and timestamps corresponding to each image in the image sequence.<\/li>\n<li data-start=\"2192\" data-end=\"2493\">The thermal UAV used for collecting the data is <strong>DJI MAVIC 3T.<\/strong> The specifications of this UAV can be found in <a href=\"https:\/\/enterprise.dji.com\/mavic-3-enterprise\/specs\">https:\/\/enterprise.dji.com\/mavic-3-enterprise\/specs<\/a>.<\/li>\n<li data-start=\"2498\" data-end=\"2757\"><strong>(Important)<\/strong> Please note that the UAV&#8217;s absolute altitude could 
---

### Challenge 3: Target Geolocation Approximation (Optional)

- **Objective**: Approximate the geolocations (longitude and latitude) of a specified object (either a person or a vessel) over image sequences from a thermal UAV.
  - Ground-truth bounding boxes of the objects whose geolocations should be approximated are provided throughout each image sequence.
  - Provide the geolocations of these bounding boxes as **longitude and latitude** coordinates, in the given format.
  - Telemetry data is provided, including focal length, digital zoom ratio, latitude, longitude, relative altitude, absolute altitude, gimbal yaw, gimbal pitch, gimbal roll, and a timestamp for each image in the sequence.
  - The thermal UAV used for collecting the data is the **DJI Mavic 3T**; its specifications are available at https://enterprise.dji.com/mavic-3-enterprise/specs.
  - **(Important)** The UAV's absolute altitude may be inaccurate, since it relies primarily on barometric pressure sensors, which are susceptible to air-pressure fluctuations under changing weather conditions. Using absolute altitude may therefore require calibration.
- **Sensor**: Only image sequences from the **thermal UAV** sensor are provided.
- **Training data** includes ground-truth bounding boxes of the objects whose geolocations should be approximated, along with their ground-truth longitude and latitude coordinates.
- **Test data** is provided with ground-truth bounding boxes of the objects whose geolocations should be approximated. Results should be submitted for **all test sequences (if possible, for comprehensive evaluation), or at least two test sequences**. (A flat-plane projection sketch follows this list.)
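A common baseline for this task is to cast a ray from the camera through the centre of the target's bounding box and intersect it with a flat ground/water plane at the relative-altitude datum. The sketch below assumes a pinhole camera, zero gimbal roll, yaw measured clockwise from north, pitch negative when looking down (the usual DJI gimbal convention), a focal length already expressed in pixels, and a locally flat Earth; the angular composition of pitch and pixel offset is itself an approximation, and every name here is illustrative.

```python
import math

def approx_geolocation(u, v, img_w, img_h, fx_px,
                       uav_lat, uav_lon, rel_alt_m, yaw_deg, pitch_deg):
    """Approximate (longitude, latitude) of pixel (u, v) on a flat ground plane."""
    # Angular offsets of the pixel from the optical axis (pinhole model).
    horiz = math.atan2(u - img_w / 2.0, fx_px)   # positive to the right
    vert = math.atan2(v - img_h / 2.0, fx_px)    # positive below centre
    # Total depression angle below the horizon (DJI pitch is negative downward).
    down = math.radians(-pitch_deg) + vert
    if down <= 0:
        raise ValueError("ray does not intersect the ground plane")
    ground_range = rel_alt_m / math.tan(down)    # horizontal distance, metres
    bearing = math.radians(yaw_deg) + horiz      # clockwise from north
    north = ground_range * math.cos(bearing)
    east = ground_range * math.sin(bearing)
    # Metre offsets to degrees under a locally flat-Earth approximation.
    dlat = north / 111_320.0
    dlon = east / (111_320.0 * math.cos(math.radians(uav_lat)))
    return uav_lon + dlon, uav_lat + dlat        # [longitude, latitude]
```

The returned order matches the [longitude, latitude] convention required by the Challenge 3 XML format (see the XML schema section). As the task description warns, relying on absolute altitude would additionally require barometric calibration; this sketch sidesteps that by using relative altitude.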
---

# Submission Guidelines

- Submissions can be based on any of the provided test sequences; however, please submit results for test sequences from at least two sensors.
- A submission consists of the generated XML files and a short description (fewer than 500 words) of the methods used.
- Please create a submission folder structured as follows:

```
submission_ch[challenge_number]_[submission_name]/
├── [scenario]/
│   ├── [sensor_type]/
│   │   └── predictions.xml
```

- For example, the structure of the submission folder for Challenge 1, submitted by "JohnDoe", is shown below. It includes prediction files for test image sequences from "GS_Therm" and "UAV_Therm" across the "bg2", "bg6", "bg8", and "cy3" scenarios.

```
submission_ch1_JohnDoe/
├── bg2/
│   ├── GS_Therm/
│   │   └── predictions.xml
│   ├── UAV_Therm/
│   │   └── predictions.xml
├── bg6/
│   ├── GS_Therm/
│   │   └── predictions.xml
├── bg8/
│   ├── GS_Therm/
│   │   └── predictions.xml
│   ├── UAV_Therm/
│   │   └── predictions.xml
├── cy3/
│   ├── UAV_Therm/
│   │   └── predictions.xml
```

- Please generate a ZIP file that includes the submission folder and a text file containing the short (<500-word) description of the methodology used, then email it to **submissions@pets2025.net**, providing full author and affiliation details. (A packaging sketch follows this list.)
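Assuming the per-sequence XML outputs already exist on disk, a small script along these lines could assemble and zip the layout above (all paths and the scenario/sensor pairs are illustrative placeholders):

```python
# Sketch: build the submission folder layout and zip it with the description.
import shutil
from pathlib import Path

challenge, name = 1, "JohnDoe"                      # illustrative
staging = Path("staging")
root = staging / f"submission_ch{challenge}_{name}"

# (scenario, sensor_type) -> path of the generated XML (placeholder paths).
results = {
    ("bg2", "GS_Therm"): "out/bg2_gs_therm.xml",
    ("cy3", "UAV_Therm"): "out/cy3_uav_therm.xml",
}

for (scenario, sensor), xml_path in results.items():
    dest = root / scenario / sensor
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy(xml_path, dest / "predictions.xml")

# The short (<500-word) method description sits beside the submission folder.
shutil.copy("description.txt", staging / "description.txt")

# Everything under staging/ becomes submission.zip, ready to email.
shutil.make_archive("submission", "zip", root_dir=staging)
```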
---

# XML schema

**Challenges 1 and 2**

- XML schema file: [ch1ch2-schema.xsd](https://drive.google.com/file/d/170kmXXgJCkrKBUIJJAQ7owV4xmQQfseL/view?usp=sharing)
- XML output example file: [ch1ch2-predictions.xml](https://drive.google.com/file/d/1uUUpzjmzIAtXa9gFOLdlGc_jvyWQlOxF/view?usp=sharing)

**Challenge 3**

- XML schema file: [ch3-schema.xsd](https://drive.google.com/file/d/1DOFTkU7F566X-Yr99xwbKo4woVxu5JgG/view?usp=sharing)
- XML output example file: [ch3-predictions.xml](https://drive.google.com/file/d/1GZZZqWA4lM3dEVUyZtO-bf17kGdoLdNl/view?usp=sharing)
- Please note that the "coordinates" attribute in the XML files follows the format **[longitude, latitude]** (a small parsing helper follows this list).
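Because longitude-first ordering is the reverse of the everyday "latitude, longitude" habit, it may help to make the convention explicit when reading this attribute. A tiny helper, assuming the attribute value is a bracketed comma-separated pair (the attribute name comes from the note above; everything else is illustrative):

```python
# Parse a "coordinates" attribute that uses the [longitude, latitude] order.
def parse_coordinates(attr):
    lon, lat = (float(v) for v in attr.strip("[] ").split(","))
    return lon, lat  # keep longitude first, matching the challenge format

lon, lat = parse_coordinates("[-0.9781, 51.4414]")  # illustrative values
```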
-->","yoast_head_json":{"title":"Author Instructions - Performance Evaluation of Tracking and Surveillance 2025","robots":{"index":"noindex","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"og_locale":"en_GB","og_type":"article","og_title":"Author Instructions - Performance Evaluation of Tracking and Surveillance 2025","og_description":"Task Descriptions Challenge 1: Target Detection and Classification (Mandatory) Objective: Detect and classify persons (including individuals on deck, on the shore, and a floating mannequin or dummy in the water),...Read More >","og_url":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/","og_site_name":"Performance Evaluation of Tracking and Surveillance 2025","article_modified_time":"2025-03-21T09:15:59+00:00","og_image":[{"width":500,"height":500,"url":"https:\/\/research.reading.ac.uk\/pets2025\/wp-content\/uploads\/sites\/355\/2025\/03\/Untitled-design-27.png","type":"image\/png"}],"twitter_card":"summary_large_image","twitter_misc":{"Estimated reading time":"6 minutes","Written by":"Thanet Markchom"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/","url":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/","name":"Author Instructions - Performance Evaluation of Tracking and Surveillance 2025","isPartOf":{"@id":"https:\/\/research.reading.ac.uk\/pets2025\/#website"},"datePublished":"2025-03-05T14:48:27+00:00","dateModified":"2025-03-21T09:15:59+00:00","breadcrumb":{"@id":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/#breadcrumb"},"inLanguage":"en-GB","potentialAction":[{"@type":"ReadAction","target":["https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/research.reading.ac.uk\/pets2025\/author-instructions\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/research.reading.ac.uk\/pets2025\/"},{"@type":"ListItem","position":2,"name":"Author Instructions"}]},{"@type":"WebSite","@id":"https:\/\/research.reading.ac.uk\/pets2025\/#website","url":"https:\/\/research.reading.ac.uk\/pets2025\/","name":"Performance Evaluation of Tracking and Surveillance 2025","description":"PETS2025","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/research.reading.ac.uk\/pets2025\/?s={search_term_string}"},"query-input":"required 