HUBEI AGRICULTURAL SCIENCES ›› 2021, Vol. 60 ›› Issue (23): 153-156.doi: 10.14088/j.cnki.issn0439-8114.2021.23.034


Farmland boundary extraction from UAV remote sensing images for agricultural applications based on improved Mean Shift

ZHENG Ming-xue, LUO Zhi-qing, CHEN Pin-ting, GUAN Bo, MA Hai-rong   

  1. Institute of Agricultural Economics and Technology, Hubei Academy of Agricultural Sciences / Agricultural Economics and Technology Research Sub-Center, Hubei Agricultural Science and Technology Innovation Center / Hubei Rural Revitalization Research Institute, Wuhan 430064, China
  • Received: 2021-08-02    Online: 2021-12-10    Published: 2021-12-21

Abstract: Timely knowledge of the spatial distribution of crops is of great significance for scientific field management and yield improvement. UAV remote sensing systems offer unique advantages for large-area agricultural remote sensing because of their mobility, low cost, and high-resolution image acquisition. To address the difficulty of boundary extraction from UAV images, namely the time and labor required for manual vectorization and the over-segmentation common to existing image segmentation methods, an automated boundary extraction workflow based on an improved Mean Shift algorithm is proposed. Based on the global characteristics of the farmland and the local characteristics of the crops within it, the method combines pixel location information with image color information to describe farmland boundary characteristics, following a principle for setting the spatial bandwidth. The experimental results show that the improved Mean Shift method proposed in this paper achieves excellent results for parcel boundary extraction from UAV agricultural remote sensing images, providing support and inspiration for further research on farmland parcel boundary extraction.
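The abstract does not give the full algorithm, but the idea of running Mean Shift over joint spatial-and-color features with a tunable spatial bandwidth can be illustrated with a minimal Python sketch. This is not the authors' exact pipeline: OpenCV's pyrMeanShiftFiltering is used as a stand-in for the improved Mean Shift step, the input file name "uav_field.jpg" is hypothetical, and the spatial radius (sp), color radius (sr), and area threshold are illustrative values that would need tuning to the image resolution.

```python
# Minimal sketch (not the paper's exact method): spatial+color Mean Shift
# smoothing of a UAV orthophoto, followed by simple parcel boundary tracing.
import cv2

img = cv2.imread("uav_field.jpg")          # hypothetical UAV image path (BGR)
img = cv2.GaussianBlur(img, (5, 5), 0)     # light denoising before clustering

# Mean Shift filtering in the joint (x, y, color) domain:
# sp is the spatial bandwidth, sr the color bandwidth (both assumed values).
filtered = cv2.pyrMeanShiftFiltering(img, sp=21, sr=30)

# Collapse the smoothed regions to edges and trace candidate parcel outlines.
gray = cv2.cvtColor(filtered, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Keep only sufficiently large contours as farmland parcel boundaries.
min_area = 500  # pixels; illustrative threshold
parcels = [c for c in contours if cv2.contourArea(c) > min_area]

vis = img.copy()
cv2.drawContours(vis, parcels, -1, (0, 0, 255), 2)
cv2.imwrite("parcel_boundaries.jpg", vis)
```

In this sketch the two radii play the role the abstract assigns to the spatial bandwidth setting: a larger sp merges pixels over a wider neighborhood (suppressing within-field crop texture), while sr controls how much color variation is tolerated inside one parcel.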

Key words: Mean Shift, farmland boundary extraction, UAV remote sensing images, agricultural application
