UAS Precision Agriculture


The aim of this Big Data Fusion Contest is to promote research in UAS-based precision agriculture using very high-resolution Big Data. The target observations are summer-season agricultural crops: cotton, soybean, and corn. In precision agriculture, the sensitivity of multispectral (MS) and hyperspectral imaging (HSI) data collected over a time series of unmanned aircraft system (UAS) flights can be used to study and detect pesticide-induced stress at various scales in carefully controlled experiments.

The task is to train a Machine Learning (ML) or Deep Learning (DL) model for UAS precision agriculture classification from a limited number of image tiles. What makes the task challenging is the very high spatial resolution of the imagery (HSI at 8 cm and MS at 4 cm), which carries its own radiometric burden of speckle and noise at that level of detail.

A Deep Learning model for classification in UAS precision agriculture might yield higher accuracy than an ML model.
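As an illustration of what a simple ML baseline for the 3-class task could look like, here is a hedged, pure-Python sketch of a per-pixel nearest-centroid classifier over spectral vectors. The class names, band counts, and function names are our own illustrative choices, not part of any released contest code or data format.

```python
# Hypothetical nearest-centroid baseline for 3-class crop classification
# on per-pixel spectra (each sample is a list of band reflectances).

def fit_centroids(spectra, labels):
    """Average the training spectra per class to get one centroid per crop."""
    sums, counts = {}, {}
    for vec, lab in zip(spectra, labels):
        acc = sums.setdefault(lab, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in acc] for lab, acc in sums.items()}

def predict(centroids, vec):
    """Assign the class whose centroid is nearest in squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], vec))

# Toy usage with 2-band "spectra" for the three contest crops:
train_x = [[1.0, 0.0], [1.1, 0.0], [0.0, 1.0], [0.0, 1.2], [5.0, 5.0], [5.2, 5.0]]
train_y = ["cotton", "cotton", "soybean", "soybean", "corn", "corn"]
centroids = fit_centroids(train_x, train_y)
```

A real entry would of course replace this with a model trained on the released tiles; the sketch only shows the per-pixel classification framing.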

Evaluation Criteria:

  • Kappa accuracy of the 3-class classification.
  • The top 5 accuracy scores will be assessed based on computation time and Kappa accuracy on the test data.
  • Computation times will be measured on ViCAR machines, and the ranking criterion will be the HPC Score: ((120 - computation time in sec) * 0.62) + (Kappa accuracy * 0.38)
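Since entries are ranked on Kappa accuracy and the HPC Score, a minimal pure-Python sketch of both may be useful for self-checking a submission. The function names are our own, and the HPC formula below simply transcribes the stated rule; it is not official contest code.

```python
from collections import Counter

def cohen_kappa(y_true, y_pred):
    """Cohen's kappa: chance-corrected agreement for multi-class labels."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    true_counts, pred_counts = Counter(y_true), Counter(y_pred)
    # Expected agreement under independent marginal label frequencies.
    expected = sum(true_counts[l] * pred_counts[l] for l in labels) / (n * n)
    return (observed - expected) / (1 - expected)

def hpc_score(computation_time_sec, kappa):
    """HPC Score as stated in the contest rules:
    ((120 - computation time in sec) * 0.62) + (Kappa accuracy * 0.38)."""
    return (120 - computation_time_sec) * 0.62 + kappa * 0.38
```

For example, a perfect 3-class prediction gives kappa = 1.0, and a run finishing in 60 s with kappa = 1.0 would score (60 * 0.62) + 0.38 = 37.58 under this formula.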


Balakrishna Gokaraju, Yogesh S. Kale and Clinton Griffin
ViCAR Center & North Carolina A&T State University in NC, USA

Sathishkumar Samiappan
GRI and Mississippi State University in MS, USA

Note: Further details regarding the competition will be released shortly.