Title: Automatic detection of blurred images in UAV image sets
Authors: Sieberth, Till
Wackrow, R.
Chandler, Jim H.
Issue Date: 2016
Publisher: © 2016 International Society for Photogrammetry and Remote Sensing, Inc. (ISPRS). Published by Elsevier
Citation: SIEBERTH, T., WACKROW, R. and CHANDLER, J.H., 2016. Automatic detection of blurred images in UAV image sets. ISPRS Journal of Photogrammetry and Remote Sensing, 122, pp. 1-16.
Abstract: Unmanned aerial vehicles (UAVs) have become an interesting and active research topic in photogrammetry. Current research is based on images acquired by a UAV, which have a high ground resolution and good spectral and radiometric resolution due to the low flight altitude combined with a high-resolution camera. UAV image flights are also cost-effective and have become attractive for many applications, including change detection in small-scale areas.
One of the main problems preventing full automation of data processing of UAV imagery is the degradation effect of blur caused by camera movement during image acquisition. This can be caused by the normal flight movement of the UAV as well as strong winds, turbulence or sudden operator inputs.
This blur disturbs the visual analysis and interpretation of the data, causes errors and can degrade the accuracy of automatic photogrammetric processing algorithms. The detection and removal of such images is currently performed manually, which is both time consuming and prone to error, particularly for large image sets. To increase the quality of data processing, an automated process is necessary, which must be both reliable and quick. This paper describes the development of an automatic filtering process based upon the quantification of blur in an image. Images with known blur are processed digitally to determine a quantifiable measure of image blur. The algorithm is required to process UAV images quickly and reliably to relieve the operator from detecting blurred images manually. The newly developed method makes it possible to detect blur caused by linear camera displacement and is based on how humans detect blur: humans detect blurred images best by comparing them to other images in order to establish whether an image is blurred or not. The developed algorithm simulates this procedure by creating an image for comparison using image processing. Creating a comparable image internally makes the method independent of additional images. However, the calculated blur value, named SIEDS (saturation image edge difference standard-deviation), does not on its own provide an absolute number to judge whether an image is blurred. To achieve a reliable judgement of image sharpness, the SIEDS value has to be compared to other SIEDS values from the same dataset.
The speed and reliability of the method was tested using a range of different UAV datasets. Two datasets are presented in this paper to demonstrate the effectiveness of the algorithm. The algorithm proves to be fast and the returned values agree with visual inspection, making it applicable to UAV datasets. Additionally, a close range dataset was processed to determine whether the method is also useful for close range applications. The results show that the method is also reliable for close range images, which significantly extends its field of application.
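The abstract describes the core idea: blur an image internally to create its own comparison image, difference the edge responses of the two, and take the standard deviation as the SIEDS value. A minimal sketch of such a metric is below, assuming Gaussian blurring for the comparison image and Sobel edges on the HSV saturation channel; the paper's exact operators and parameters may differ, and the function name is hypothetical:

```python
import numpy as np
from scipy import ndimage

def sieds_like(image_rgb: np.ndarray) -> float:
    """Sketch of a SIEDS-style blur value for one RGB image.

    A higher value suggests a sharper image; the value is only
    meaningful relative to other images of the same dataset.
    """
    img = image_rgb.astype(np.float64) / 255.0
    # Saturation channel (HSV definition), assumed from the metric's name.
    cmax = img.max(axis=2)
    cmin = img.min(axis=2)
    sat = (cmax - cmin) / np.maximum(cmax, 1e-12)
    # Internal comparison image: an artificially blurred copy, so no
    # additional images from the flight are needed.
    comparison = ndimage.gaussian_filter(sat, sigma=2.0)

    # Sobel edge magnitude of original and comparison image.
    def edges(a: np.ndarray) -> np.ndarray:
        return np.hypot(ndimage.sobel(a, axis=0), ndimage.sobel(a, axis=1))

    # Standard deviation of the edge difference: a sharp image loses much
    # edge energy when blurred; an already blurred image changes little.
    return float(np.std(edges(sat) - edges(comparison)))
```

As the abstract notes, such a value is not an absolute threshold; a practical filter would compare `sieds_like` values across all images of a flight, e.g. flagging images far below the dataset's typical value.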
Description: This paper was accepted for publication in the journal ISPRS Journal of Photogrammetry and Remote Sensing and the definitive published version is available at http://dx.doi.org/10.1016/j.isprsjprs.2016.09.010
Publisher Link: http://dx.doi.org/10.1016/j.isprsjprs.2016.09.010
Appears in Collections: Published Articles (Architecture, Building and Civil Engineering)