Deadwood detection from RGB UAV imagery using Mask R-CNN

Author

Janne Mäyrä

Published

December 22, 2022

Abstract

Deadwood and decaying wood are among the most important components of boreal forest biodiversity, and it has been estimated that around a quarter of all flora and fauna in Finnish forests depend on decaying wood. However, there is a severe lack of stand-level deadwood data in Finland, as operational inventories either focus on large-scale estimates or omit deadwood altogether. Unmanned Aerial Vehicles (UAVs) are the only method for remotely mapping small objects such as fallen deadwood, as even the most spatially accurate commercial satellites provide a ground sampling distance of around 30 cm, compared to the less than 5 cm easily achievable with UAVs.

In this work, we utilized Mask R-CNN to detect individual standing and fallen deadwood instances from RGB UAV imagery. We manually annotated over 14 000 deadwood instances from two separate study sites to use as training and validation data, and also compared these data with field-measured deadwood data. Our models achieved a test set Average Precision (AP) of 0.284 for the geographical area the models were trained on, and an AP of 0.220 for a geographically distinct area used only for testing.

In addition to instance-level deadwood maps, we estimated stand-level deadwood characteristics, such as the number of deadwood instances, and approximated the total volume of fallen deadwood based on the annotated polygons. These stand-level features clearly show the borders of the conserved forest, and the volume estimates are able to distinguish naturally formed deadwood hotspots from areas with logging remnants. The proposed method enables deadwood mapping over larger areas, complementing traditional field work.
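
The volume estimation itself is described later in the workflow; as a rough illustration of the idea, a fallen trunk polygon can be approximated as a cylinder whose length comes from the polygon's minimum rotated bounding rectangle and whose mean diameter is the polygon area divided by that length. The sketch below shows this simplified approach with shapely; it is an assumption for illustration only, not the exact procedure used in the study.

```python
from shapely.geometry import Polygon
import numpy as np

def approximate_trunk_volume(polygon: Polygon) -> float:
    """Rough volume estimate (m^3) for a fallen trunk polygon in a metric CRS.

    Illustrative assumption: the trunk is modeled as a cylinder whose length is
    the longer side of the minimum rotated bounding rectangle and whose mean
    diameter is polygon area / length.
    """
    rect = polygon.minimum_rotated_rectangle
    coords = list(rect.exterior.coords)
    # Lengths of two adjacent sides of the rotated rectangle; the longer one
    # approximates the trunk length
    sides = [np.linalg.norm(np.subtract(coords[i + 1], coords[i])) for i in range(2)]
    length = max(sides)
    mean_diameter = polygon.area / length
    return np.pi * (mean_diameter / 2) ** 2 * length

# Example: a 10 m long, 0.3 m wide rectangular trunk polygon -> roughly 0.7 m^3
trunk = Polygon([(0, 0), (10, 0), (10, 0.3), (0, 0.3)])
print(approximate_trunk_volume(trunk))
```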

Getting started

Much of the work relies heavily on https://github.com/jaeeolma/drone_detector, and its installation instructions apply here as well.

Data used

The examples use UAV RGB orthomosaics from either Hiidenportti (Kuhmo, Eastern Finland) or Sudenpesänkangas (Evo, Southern Finland). The Hiidenportti dataset has a spatial resolution of around 4 cm and consists of 9 separate UAV mosaics, while the Sudenpesänkangas dataset is a single orthomosaic with a spatial resolution of 4.85 cm. From these data, we constructed rectangular virtual plots (hereafter referred to as scenes) to use as training and validation data for the models. From Hiidenportti, we constructed 33 scenes of varying sizes such that all 9 m circular field plots present in the area were covered and each field plot center was at least 45 meters from the edge of its scene. For Sudenpesänkangas, as both the area and the orthomosaic are larger, we extracted 100 x 100 m scenes such that each scene contains exactly one circular field plot. In total, the Hiidenportti data contained 33 scenes covering 71 field plots, and the Sudenpesänkangas data contained 71 scenes.
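
As an illustration of how such a scene could be clipped from a georeferenced orthomosaic, the sketch below reads a square window around a plot center with rasterio. The file names and coordinates are placeholders, and the actual extraction in this work was done with the project's own tooling.

```python
import rasterio
from rasterio.windows import from_bounds

def extract_scene(mosaic_path: str, center_x: float, center_y: float,
                  size_m: float, out_path: str) -> None:
    """Clip a square scene of size_m x size_m (in CRS units) centered on a point."""
    half = size_m / 2
    with rasterio.open(mosaic_path) as src:
        window = from_bounds(center_x - half, center_y - half,
                             center_x + half, center_y + half,
                             transform=src.transform)
        data = src.read(window=window)
        profile = src.profile.copy()
        profile.update(height=data.shape[1], width=data.shape[2],
                       transform=src.window_transform(window))
        with rasterio.open(out_path, "w", **profile) as dst:
            dst.write(data)

# Hypothetical usage with placeholder path and plot center coordinates (EPSG:3067)
extract_scene("sudenpesankangas_ortho.tif", 393000.0, 6790000.0, 100.0, "scene_001.tif")
```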

The deadwood data used for training the models were manually annotated with the QGIS software. We annotated all visible fallen deadwood trunks and standing deadwood canopies present in the scenes and saved the results as GeoJSON files. These data were then tiled and converted to COCO format using functions from drone_detector. The Sudenpesänkangas dataset contains 5334 annotated deadwood instances, and the Hiidenportti dataset contains 8479.
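
The tiling and COCO conversion were done with drone_detector; the sketch below does not use that library's API but only illustrates the general idea of turning georeferenced GeoJSON polygons into COCO-style annotation records with geopandas, rasterio, and shapely. File names and the category name are placeholders.

```python
import json
import geopandas as gpd
import rasterio
from shapely.geometry import Polygon as ShapelyPolygon

def polygons_to_coco(geojson_path: str, image_path: str, category_id: int = 1) -> dict:
    """Convert georeferenced polygon annotations into a COCO-style dict for one image.

    Simplified sketch: assumes simple (non-multipart) polygons and a single category.
    """
    gdf = gpd.read_file(geojson_path)
    with rasterio.open(image_path) as src:
        inv = ~src.transform  # maps map coordinates to pixel coordinates
        height, width = src.height, src.width

    annotations = []
    for ann_id, geom in enumerate(gdf.geometry, start=1):
        # Exterior ring in pixel coordinates, flattened to [x1, y1, x2, y2, ...]
        pixel_coords = [inv * (x, y) for x, y in geom.exterior.coords]
        segmentation = [round(c, 2) for xy in pixel_coords for c in xy]
        xs, ys = zip(*pixel_coords)
        bbox = [min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys)]
        annotations.append({
            "id": ann_id,
            "image_id": 1,
            "category_id": category_id,
            "segmentation": [segmentation],
            "bbox": [round(v, 2) for v in bbox],
            "area": round(ShapelyPolygon(pixel_coords).area, 2),
            "iscrowd": 0,
        })

    return {
        "images": [{"id": 1, "file_name": image_path, "height": height, "width": width}],
        "annotations": annotations,
        "categories": [{"id": category_id, "name": "deadwood"}],
    }

# Hypothetical usage with placeholder file names
coco = polygons_to_coco("scene_001_deadwood.geojson", "scene_001.tif")
with open("scene_001_coco.json", "w") as f:
    json.dump(coco, f)
```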

Authors

  • Janne Mäyrä, Finnish Environment Institute SYKE
  • Topi Tanhuanpää, University of Eastern Finland
  • Anton Kuzmin, University of Eastern Finland
  • Einari Heinaro, University of Helsinki
  • Timo Kumpula, University of Eastern Finland
  • Petteri Vihervaara, University of Eastern Finland