
Comparison of multi-temporal and multispectral Sentinel-2 and Unmanned Aerial Vehicle imagery for crop type mapping

Author

  • Stephania Zabala Ramos

Summary, in English

Precision Agriculture aims to maximize crop production and the efficiency of land use to meet the increasing demand for food while minimizing the environmental impact and economic cost of food production. Maps showing where crops are grown, which crops, and in what quantities are needed for Precision Agriculture applications. Data acquired remotely with sensors on platforms such as satellites, aircraft or drones are very useful for producing this information. However, at present, no single sensor can provide sufficient data to produce these maps while capturing the various growth stages and temporal changes of crops over the growing cycle. Therefore, this study investigated the possibility of combining data from the recently launched Sentinel-2A (S2A) satellite and an Unmanned Aerial Vehicle (UAV) for crop monitoring. We evaluated the potential of the spectral, spatial and temporal information of S2A imagery for crop type mapping at the plot level in southern Sweden. We explored the compatibility of spectral bands between the MicaSense RedEdge, a commonly used multispectral UAV sensor, and S2A to assess the utility of UAV observations for complementing and replacing satellite imagery affected by cloud cover and noise. Moreover, we examined the seasonal variation of crops, based on their greenness, through S2A and UAV observations.
A method known as Random Forest (RF) was used to classify the different crop types in the study area. In this method, a model predicts the crop type (predicted variable) from the reflectance values of 11 spectral bands and vegetation indices (explanatory variables). The model was trained and tested on separate subsets of ground truth data. In addition, we tested the performance of the Variable Selection Using Random Forests (VSURF) algorithm to reduce the number of variables, eliminate redundancy in the dataset and achieve acceptable accuracies for crop classification. First, we used S2A imagery from 12 dates (from April through July) to assess the improvement in crop discrimination when using multi-temporal data. The number of variables used in the classification was reduced from 145 to 8, achieving an overall accuracy of 93% and a Kappa coefficient of 0.92. Regarding key spectral information, we found that the red-edge and shortwave infrared bands were of high value for crop mapping. The blue band also appeared to be important for differentiating crops, together with the maximum Normalized Difference Vegetation Index (NDVI) of the growing season. Conversely, bands in the near-infrared were amongst the least important for the classification of crops in the study area.
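
As an illustration only, the sketch below shows how such a Random Forest classification with a simple importance-based variable selection might look in Python with scikit-learn. The thesis itself used the VSURF package (in R); the file and column names here are hypothetical and not taken from the thesis.

  # Minimal sketch (not the thesis pipeline): Random Forest crop-type classification
  # with a simple importance-based variable selection using scikit-learn. The thesis
  # used VSURF (an R package); file and column names below are hypothetical.
  import pandas as pd
  from sklearn.ensemble import RandomForestClassifier
  from sklearn.model_selection import train_test_split
  from sklearn.metrics import accuracy_score, cohen_kappa_score

  samples = pd.read_csv("training_samples.csv")   # ground-truth plots with band/VI values
  X = samples.drop(columns=["crop_type"])         # explanatory variables (bands, VIs)
  y = samples["crop_type"]                        # predicted variable (crop type)

  X_train, X_test, y_train, y_test = train_test_split(
      X, y, test_size=0.3, stratify=y, random_state=42)

  rf = RandomForestClassifier(n_estimators=500, random_state=42).fit(X_train, y_train)

  # Keep the most important variables (the thesis reduced 145 variables to 8 with VSURF)
  importances = pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)
  selected = importances.head(8).index

  rf_reduced = RandomForestClassifier(n_estimators=500, random_state=42)
  rf_reduced.fit(X_train[selected], y_train)
  pred = rf_reduced.predict(X_test[selected])

  print("Overall accuracy:", accuracy_score(y_test, pred))
  print("Kappa coefficient:", cohen_kappa_score(y_test, pred))
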
Moreover, three atmospherically corrected S2A images were compared to three UAV orthomosaics, represented both as raw image values (intensity values) and as reflectance values after the radiometric correction performed by ATLAS (a cloud-based data platform service provided by MicaSense that converts image values to reflectance values). The comparison was based on UAV pixels averaged within S2A pixel-sized cells. A band-by-band analysis of the blue, green, red, red-edge and near-infrared bands evaluated the correlation and mean differences between sensors for reflectances and for three vegetation indices: the NDVI, the Enhanced Vegetation Index (EVI) and the Green Chromatic Coordinate (GCC). The results showed that the correlation of reflectances improved after the radiometric correction performed by ATLAS. The most strongly correlated bands were red and near-infrared, closely followed by the green band. However, statistically significant differences were found in the actual physical units. Vegetation indices (VIs) reduced the variability in the data and showed stronger correlations, although significant differences were also found, mostly for EVI. VI values from S2A imagery were higher over bare soil and lower over green areas compared to those from the UAV orthomosaics.
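
The sketch below illustrates this type of comparison, using the standard NDVI, EVI and GCC formulas together with a Pearson correlation and mean difference between paired pixel values; the variable names and example data are hypothetical stand-ins, not values from the thesis.

  # Minimal sketch of the band/index comparison: standard NDVI, EVI and GCC formulas
  # applied to paired reflectance arrays (UAV pixels assumed already averaged to the
  # S2A grid), plus Pearson correlation and mean difference between sensors.
  # Variable names and the random stand-in data are hypothetical.
  import numpy as np
  from scipy.stats import pearsonr

  def ndvi(nir, red):
      return (nir - red) / (nir + red)

  def evi(nir, red, blue):
      # Standard EVI coefficients (G=2.5, C1=6, C2=7.5, L=1)
      return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

  def gcc(red, green, blue):
      return green / (red + green + blue)

  def compare(s2a_vals, uav_vals):
      # Pearson correlation and mean difference between paired pixel values
      r, _ = pearsonr(s2a_vals.ravel(), uav_vals.ravel())
      return r, float(np.mean(s2a_vals - uav_vals))

  # Example with random stand-in data shaped like one S2A-resolution subset
  rng = np.random.default_rng(0)
  s2a_red, s2a_nir = rng.uniform(0.02, 0.4, (2, 50, 50))
  uav_red = s2a_red + rng.normal(0.0, 0.02, (50, 50))
  uav_nir = s2a_nir + rng.normal(0.0, 0.02, (50, 50))
  print(compare(ndvi(s2a_nir, s2a_red), ndvi(uav_nir, uav_red)))
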
The S2A NDVI time-series for crops showed potential to provide seasonality information that can be of high value for various agricultural applications, including crop monitoring. NDVI values derived from the UAV orthomosaics were used to complement the time-series and to evaluate how well they represent the temporal variation. TIMESAT was used to improve data quality and produce smooth seasonal curves. Results showed that, despite absolute differences between the indices obtained from the two sensors, UAV observations could provide continuity to the S2A time-series and improve up-scaling of vegetation phenology.
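
As a rough illustration of this step, the sketch below smooths a per-plot NDVI series with a Savitzky-Golay filter, one of the fitting methods TIMESAT offers; the thesis used the TIMESAT software itself, and all values below are hypothetical.

  # Minimal sketch of smoothing a per-plot NDVI time-series. The thesis used the
  # standalone TIMESAT software; a Savitzky-Golay filter (one of the methods TIMESAT
  # offers) is used here as a simple stand-in. All values below are hypothetical.
  import numpy as np
  from scipy.signal import savgol_filter

  # Day-of-year of acquisitions and the corresponding plot-mean NDVI; the last value
  # could be a UAV-derived NDVI appended to extend the S2A series
  doy = np.array([100, 110, 120, 130, 145, 160, 175, 190, 205])
  ndvi = np.array([0.25, 0.30, 0.42, 0.55, 0.70, 0.78, 0.74, 0.60, 0.45])

  # Resample to a regular daily grid, then smooth into a seasonal curve
  grid = np.arange(doy.min(), doy.max() + 1)
  ndvi_daily = np.interp(grid, doy, ndvi)
  ndvi_smooth = savgol_filter(ndvi_daily, window_length=21, polyorder=2)

  print(ndvi_smooth[::10])
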

Publishing year

2017

Language

English

Publication/Series

Lund University GEM thesis series

Document type

Student publication for Master's degree (two years)

Topic

  • Earth and Environmental Sciences

Keywords

  • classification
  • Random Forest
  • precision agriculture
  • UAV
  • remote sensing
  • MicaSense
  • GEM

Funder

  • Erasmus Mundus Programme

Report number

22

Supervisor

  • David Tenenbaum (Senior Lecturer)