First Online: 05 June 2017. DOI: 10.1007/s11119-017-9528-3
Cite this article as: Louargant, M., Villette, S., Jones, G. et al. Precision Agric (2017).
M. Louargant, S. Villette, G. Jones, N. Vigneau, J. N. Paoli, C. Gée
Agroécologie, AgroSup Dijon, INRA, Univ. Bourgogne Franche-Comté, Dijon Cedex, France
This study aimed to assess the spectral information potential of images captured with an unmanned aerial vehicle, in the context of crop–weed discrimination.
A model is proposed that simulates the entire image acquisition chain, computing the digital values of image pixels from several parameters (light, plant characteristics, optical filters, sensors, etc.) so as to reproduce in-field acquisition conditions.
The spectral mixing within pixels is modeled, based on an image with a 60 mm spatial resolution, to estimate the impact of resolution on the ability to discriminate small plants. The classification potential (i.e. the ability to separate two classes) into soil and vegetation classes, and into monocotyledon and dicotyledon classes, is studied using simulations for different vegetation rates (the vegetation rate being defined as the proportion of the pixel's projected surface covered by vegetation).
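The spectral mixing described above can be illustrated with a simple linear mixing model, in which a pixel's reflectance is the vegetation-rate-weighted combination of a pure vegetation spectrum and a pure soil spectrum. This is a minimal sketch, not the authors' simulation chain; the reflectance values below are invented placeholders.

```python
import numpy as np

def mixed_pixel(veg_spectrum, soil_spectrum, vegetation_rate):
    """Reflectance of a pixel whose surface is partly vegetation.

    vegetation_rate: fraction of the pixel's projected surface covered
    by vegetation (0 = pure soil, 1 = pure vegetation).
    Assumes simple linear mixing, ignoring shadowing and sensor effects.
    """
    v = np.asarray(veg_spectrum, dtype=float)
    s = np.asarray(soil_spectrum, dtype=float)
    return vegetation_rate * v + (1.0 - vegetation_rate) * s

# Example with four spectral bands (placeholder values, e.g. green,
# red, red-edge, near-infrared):
veg = np.array([0.10, 0.05, 0.35, 0.50])   # hypothetical vegetation spectrum
soil = np.array([0.15, 0.20, 0.25, 0.30])  # hypothetical soil spectrum
print(mixed_pixel(veg, soil, 0.5))
```

A pixel with a low vegetation rate therefore has a spectrum close to pure soil, which is why such pixels are the hardest to classify as vegetation.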
The classification is unsupervised and based on Mahalanobis distance computation. The soil-vegetation discrimination results show that pixels with low vegetation rates can still be classified as vegetation: pixels with a vegetation rate greater than 0.5 had a probability of correct classification between 80 and 100%. Classification between monocotyledonous and dicotyledonous plants requires pixels with a high vegetation rate: to reach a probability of correct classification above 80%, the vegetation rate in a pixel has to exceed 0.9.
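A Mahalanobis-distance decision rule of the kind mentioned above can be sketched as follows: a pixel is assigned to the class (e.g. soil vs. vegetation) whose mean it is closest to under that class's covariance. This is an illustrative sketch only; the class statistics here are invented, whereas in the study they would come from the simulation or from clustering the image data.

```python
import numpy as np

def mahalanobis(x, mean, cov_inv):
    """Mahalanobis distance of x from a class with the given mean and
    inverse covariance matrix."""
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def classify(pixel, class_stats):
    """Assign the pixel to the class minimizing the Mahalanobis distance.

    class_stats: {label: (mean_vector, covariance_matrix)}
    """
    best_label, best_dist = None, np.inf
    for label, (mean, cov) in class_stats.items():
        dist = mahalanobis(np.asarray(pixel, dtype=float),
                           np.asarray(mean, dtype=float),
                           np.linalg.inv(np.asarray(cov, dtype=float)))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Hypothetical two-band class statistics (not data from the study):
stats = {
    "soil": (np.array([0.20, 0.25]), 0.01 * np.eye(2)),
    "vegetation": (np.array([0.05, 0.50]), 0.01 * np.eye(2)),
}
print(classify([0.19, 0.26], stats))
```

Unlike Euclidean distance, the Mahalanobis distance accounts for each class's spectral variability, which matters when vegetation spectra are more dispersed than soil spectra.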
To compare the results with data from real images, the same classification was tested on multispectral images of a weed-infested field. The comparison confirmed the ability of the model to assess vegetation-soil and crop-weed discrimination potential for specific sensors (such as the multiSPEC 4C sensor, AIRINOV, Paris, France), for which the acquisition chain parameters can be tested.
Direct link: http://ift.tt/2setsrH