Competition 2025: Beyond Visible Spectrum: AI for Agriculture

Introduction

Welcome to the 2025 competition: Beyond Visible Spectrum: AI for Agriculture!

This competition presents an exciting opportunity for researchers and practitioners to advance computer vision techniques in agricultural crop disease monitoring. By leveraging vast multi/hyper-spectral remote sensing image datasets, participants are encouraged to develop innovative deep learning algorithms to enhance the accuracy and efficiency of identifying crop diseases, contributing to sustainable agricultural practices and global food security.

Participants in this challenge will have the chance to contribute to significant advancements in precision farming and crop monitoring by improving deep learning vision models. Our competition aims to drive forward the capabilities of agricultural technology, making it more efficient, accurate, and sustainable.

This is not just a competition; it is a call to action for the computer vision research community to bring about transformative changes in the analysis and utilisation of agricultural data, benefiting communities worldwide through smarter, more sustainable farming practices.


Background

The escalating challenges posed by crop diseases and pests necessitate advanced technological interventions to ensure global food security and promote sustainable agricultural practices. According to the Food and Agriculture Organization, crop diseases and pests account for an estimated 20 to 40 percent loss in global crop yields each year, resulting in economic impacts exceeding $220 billion annually. Addressing these challenges demands the development of AI systems that prioritize precision, accuracy, and adaptability, particularly in data-scarce environments.

Hyperspectral remote sensing plays a vital role in precision agriculture by providing detailed spectral information that can be used to monitor the health and condition of crops. Unlike traditional imaging methods, hyperspectral sensors capture data across hundreds of narrow, contiguous wavelength bands, allowing for the detection of subtle differences in plant physiology that may indicate stress, disease, or nutrient deficiencies. This high spectral resolution enables more accurate classification of crop types, earlier identification of diseases, and better assessment of crop productivity. By using hyperspectral data, farmers and researchers can make more informed decisions regarding irrigation, fertilization, and pest management, ultimately leading to improved yields and more sustainable agricultural practices.
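As a toy illustration of how narrow spectral bands support crop monitoring, the sketch below computes a narrow-band NDVI from a simulated hyperspectral cube. The cube, band wavelengths, and band choices (670 nm red, 800 nm near-infrared) are illustrative assumptions, not the competition's data format.

```python
import numpy as np

# Toy hyperspectral cube: height x width x bands, with the band-centre
# wavelengths (nm) known from the sensor. Values here are synthetic.
rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 1000, 120)          # 120 bands, 400-1000 nm
cube = rng.uniform(0.0, 1.0, size=(4, 4, 120))     # reflectance in [0, 1]

def band(cube, wavelengths, target_nm):
    """Return the image plane of the band closest to target_nm."""
    return cube[:, :, np.argmin(np.abs(wavelengths - target_nm))]

# NDVI from narrow bands near 670 nm (red) and 800 nm (near-infrared):
# healthy vegetation reflects strongly in the NIR and absorbs in the red,
# so stressed or diseased canopy tends to show lower NDVI.
red = band(cube, wavelengths, 670.0)
nir = band(cube, wavelengths, 800.0)
ndvi = (nir - red) / (nir + red + 1e-8)

print(ndvi.shape)   # (4, 4); values lie in (-1, 1)
```

In practice, narrow-band indices like this are only a starting point; the deep learning models targeted by this competition learn to exploit the full spectral curve rather than two hand-picked bands.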

However, hyperspectral data also comes with certain drawbacks. Its high dimensionality increases computational complexity, making data processing and analysis more resource-intensive. Additionally, hyperspectral sensors are often costly, and the large volume of data generated requires significant storage capacity and advanced processing capabilities. These challenges can limit the accessibility and scalability of hyperspectral technology, particularly for small-scale farmers or in regions with limited technological infrastructure.

Synthetic data generation is becoming increasingly important in deep learning, particularly in scenarios where real-world data is limited or expensive to obtain. Synthetic data can augment existing datasets, providing a more diverse set of training examples that enhances the robustness and generalisation capabilities of AI models. Techniques such as Generative Adversarial Networks (GANs), Variational Autoencoders (VAEs), and diffusion models are commonly used to create realistic synthetic data that mimics the spectral properties of actual agricultural imagery. By using synthetic data, researchers can address data scarcity, improve model performance, and reduce the need for extensive manual data collection, which is often labour-intensive and time-consuming.

In precision agriculture, synthetic hyperspectral data is particularly valuable because it allows researchers to simulate a wide range of crop conditions, including various stages of disease progression, without requiring extensive fieldwork. This capability enables the development of more accurate and resilient models for crop monitoring, disease detection, and resource management. By augmenting real datasets with synthetic examples, deep learning models can better understand subtle variations in spectral features, leading to improved early detection of crop diseases and more effective interventions. Synthetic data generation thus plays a crucial role in advancing precision agriculture by enabling scalable, data-driven solutions that contribute to sustainable farming practices.
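To make the adversarial idea concrete, here is a deliberately minimal GAN sketch in NumPy: a linear generator and a logistic-regression discriminator trained adversarially on simulated one-dimensional spectra. All names, dimensions, and the "real" spectrum model are assumptions for illustration; practical hyperspectral GANs (or VAEs and diffusion models) use deep networks and a framework such as PyTorch.

```python
import numpy as np

rng = np.random.default_rng(0)

N_BANDS = 50      # number of spectral bands per sample
Z_DIM = 8         # latent dimension
LR = 0.01

# "Real" spectra: a smooth synthetic reflectance curve plus noise
# (a stand-in for real hyperspectral pixel spectra).
wavelengths = np.linspace(0.0, 1.0, N_BANDS)
def sample_real(n):
    base = 0.3 + 0.4 * np.sin(2 * np.pi * wavelengths)
    return base + 0.05 * rng.standard_normal((n, N_BANDS))

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Linear generator G(z) = Wg @ z + bg; logistic discriminator D(x).
Wg = 0.1 * rng.standard_normal((N_BANDS, Z_DIM))
bg = np.zeros(N_BANDS)
wd = 0.1 * rng.standard_normal(N_BANDS)
cd = 0.0

for step in range(500):
    # --- discriminator update: push D(real) -> 1, D(fake) -> 0 ---
    x_real = sample_real(1)[0]
    z = rng.standard_normal(Z_DIM)
    x_fake = Wg @ z + bg
    for x, y in ((x_real, 1.0), (x_fake, 0.0)):
        p = sigmoid(wd @ x + cd)          # D's probability of "real"
        wd -= LR * (p - y) * x            # cross-entropy gradient step
        cd -= LR * (p - y)
    # --- generator update: push D(fake) -> 1 ---
    z = rng.standard_normal(Z_DIM)
    x_fake = Wg @ z + bg
    p = sigmoid(wd @ x_fake + cd)
    dx = (p - 1.0) * wd                   # d(-log D)/dx at the fake sample
    Wg -= LR * np.outer(dx, z)
    bg -= LR * dx

# After training, the generator maps latent noise to synthetic spectra.
synthetic = Wg @ rng.standard_normal((Z_DIM, 5)) + bg[:, None]  # 5 spectra
print(synthetic.shape)  # (50, 5)
```

The two alternating updates are the essence of adversarial training: the discriminator's gradient pulls real and fake apart, while the generator's gradient follows the discriminator's judgement back through the fake sample.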

Building on the success of previous precision agriculture initiatives, we are pleased to host the 2025 Competition on “Beyond Visible Spectrum: AI for Agriculture.” This competition aims to inspire participants to develop innovative deep learning algorithms tailored to hyperspectral datasets, focusing on synthetic data generation and crop disease classification. The competition is structured into two key tasks that reflect emerging AI needs in agriculture, described in the Challenge section below.


Challenge

The competition comprises two tasks: Task 1, Hyperspectral Data Analysis for Crop Disease Classification, and Task 2, Exploring Synthetic Hyperspectral Data Generation.

Task 1: Hyperspectral Data Analysis for Crop Disease Classification

Participants will develop AI models to classify crop diseases with high accuracy using real hyperspectral imagery. The task focuses on distinguishing between wheat stripe rust, healthy crops, and other conditions by leveraging the rich spectral and spatial data provided by hyperspectral images. The goal is to enable early and reliable disease detection, empowering farmers with actionable insights to improve productivity and sustainability.
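As an illustration of the task setup (not a competitive approach), the sketch below classifies simulated per-pixel spectra with a simple nearest-centroid rule. The class names echo the task description, but the spectra and noise model are invented for the example; real submissions would train deep models on the competition's imagery.

```python
import numpy as np

rng = np.random.default_rng(2)
N_BANDS = 60

# Toy training set: per-pixel spectra for three hypothetical classes,
# simulated as distinct mean reflectance curves plus noise.
means = {
    "healthy":     0.5 + 0.3 * np.sin(np.linspace(0, 3, N_BANDS)),
    "stripe_rust": 0.4 + 0.2 * np.cos(np.linspace(0, 3, N_BANDS)),
    "other":       0.3 * np.ones(N_BANDS),
}
def simulate(label, n):
    return means[label] + 0.05 * rng.standard_normal((n, N_BANDS))

train = {lab: simulate(lab, 50) for lab in means}
centroids = {lab: X.mean(axis=0) for lab, X in train.items()}

def classify(spectrum):
    """Nearest-centroid rule: assign the class whose mean spectrum
    is closest in Euclidean distance."""
    return min(centroids, key=lambda lab: np.linalg.norm(spectrum - centroids[lab]))

# Evaluate on fresh simulated samples.
correct = total = 0
for lab in means:
    for s in simulate(lab, 20):
        correct += classify(s) == lab
        total += 1
accuracy = correct / total
print(accuracy)
```

A nearest-centroid baseline like this ignores spatial context entirely; the spectral-spatial models the task calls for exploit both the spectrum at each pixel and its neighbourhood.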

Task 2: Exploring Synthetic Hyperspectral Data Generation

This task challenges participants to generate high-quality synthetic hyperspectral data that replicates the spectral features of real-world agricultural imagery. By creating realistic synthetic datasets, participants will help address data scarcity and enhance the performance of deep learning models. This task aims to reduce the reliance on costly and time-intensive data collection, paving the way for scalable AI-driven solutions in precision agriculture.
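One common way to check that synthetic spectra replicate real spectral features is the Spectral Angle Mapper (SAM), which measures similarity of spectral shape independent of overall brightness. The sketch below is illustrative only: the "real" and "synthetic" spectra are simulated, and the competition's actual evaluation metric may differ.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): angle in radians between two spectra.
    Smaller angles mean more similar spectral shape; the metric is
    invariant to a uniform brightness scaling of either spectrum."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

rng = np.random.default_rng(3)
real = 0.4 + 0.3 * np.sin(np.linspace(0, 4, 80))         # reference spectrum
good_fake = 1.5 * real + 0.01 * rng.standard_normal(80)  # same shape, scaled
bad_fake = rng.uniform(0.0, 1.0, 80)                     # unrelated spectrum

print(spectral_angle(real, good_fake))  # small (close to 0)
print(spectral_angle(real, bad_fake))   # noticeably larger
```

Metrics like SAM complement downstream checks, such as whether a classifier trained on synthetic spectra generalises to real imagery.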


Schedule

Launch Date: TBD
Dataset Released: TBD
Submission Opening: TBD
Submission Deadline: TBD
Evaluation Period: TBD

Participation

This competition contains two tasks, and participants can enter either or both.

To participate:

  1. Create or log in to your Kaggle account
  2. Follow the link below for the task you wish to complete
  3. Join the competition and download the dataset
  4. Submit your solutions!

Full details for both tasks are available below.


Organisers

Professor Liangxiu Han

Manchester Metropolitan University

l.han@mmu.ac.uk

Professor Wenjiang Huang

Aerospace Information Research Institute, Chinese Academy of Sciences

huangwj@radi.ac.cn

Technical Chairs

Dr. Xin Zhang

Manchester Metropolitan University

Tam Sobeih

Manchester Metropolitan University

Dr. Yue Shi

Manchester Metropolitan University

Dr. Yingying Dong

Aerospace Information Research Institute


Contact

Professor Liangxiu Han

Manchester Metropolitan University

l.han@mmu.ac.uk

Acknowledgements


References

[1] Zhang, X., Han, L. et al. “How well do deep learning-based methods for land cover classification and object detection perform on high resolution remote sensing imagery?.” Remote Sensing 12.3 (2020): 417. https://doi.org/10.3390/rs12030417

[2] Zhang, X., Han, L., Dong, Y., Huang, W., et al. “A deep learning-based approach for automated yellow rust disease detection from high-resolution hyperspectral UAV images.” Remote Sensing 11.13 (2019): 1554. https://doi.org/10.3390/rs11131554

[3] Zhang, X., Han, L., et al. “The self-supervised spectral–spatial vision transformer network for accurate prediction of wheat nitrogen status from UAV imagery.” Remote Sensing 14.6 (2022): 1400. https://doi.org/10.3390/rs14061400

[4] Zhang, X., and Han, L. “A generic Self-Supervised Learning (SSL) framework for representation learning from spectral–spatial features of unlabeled remote sensing imagery.” Remote Sensing 15.21 (2023): 5238. https://doi.org/10.3390/rs15215238

[5] Shi, Y., Han, L., et al. “A Fast Fourier Convolutional Deep Neural Network for Accurate and Explainable Discrimination of Wheat Yellow Rust and Nitrogen Deficiency from Sentinel-2 Time-Series Data.” Frontiers in Plant Science 14 (2023).

[6] Shi, Y., Han, L., Han, L., Chang, S., Hu, T., and Dancey, D. “A Latent Encoder Coupled Generative Adversarial Network (LE-GAN) for Efficient Hyperspectral Image Super-resolution.” IEEE Transactions on Geoscience and Remote Sensing (2022). https://doi.org/10.1109/TGRS.2022.3193441

[7] Shi, Y., Han, L., Kleerekoper, A., Chang, S., and Hu, T. “Novel CropdocNet Model for Automated Potato Late Blight Disease Detection from Unmanned Aerial Vehicle-Based Hyperspectral Imagery.” Remote Sensing 14.2 (2022): 396. https://doi.org/10.3390/rs14020396

[8] Shi, Y., Han, L., Huang, W., et al. “A Biologically Interpretable Two-stage Deep Neural Network (BIT-DNN) for Vegetation Recognition from Hyperspectral Imagery.” IEEE Transactions on Geoscience and Remote Sensing (2021). https://doi.org/10.1109/TGRS.2021.3058782

[9] Saxena, Divya, and Jiannong Cao. “Generative adversarial networks (GANs) challenges, solutions, and future directions.” ACM Computing Surveys (CSUR) 54.3 (2021): 1-42.

[10] Razghandi, Mina, et al. “Variational autoencoder generative adversarial network for synthetic data generation in smart home.” ICC 2022-IEEE International Conference on Communications. IEEE, 2022.

[11] Yang, Ling, et al. “Diffusion models: A comprehensive survey of methods and applications.” ACM Computing Surveys 56.4 (2023): 1-39.

[12] https://han-research.gitlab.io/Agvision/

[13] https://www.kaggle.com/competitions/beyond-visible-spectrum/overview

[14] https://www.kaggle.com/competitions/beyond-visible-spectrum-ai-for-agriculture-P1/overview

[15] https://www.kaggle.com/competitions/beyond-visible-spectrum-ai-for-agriculture-2023-p2

[16] https://www.kaggle.com/competitions/beyond-visible-spectrum-ai-for-agriculture-2024

[17] https://www.kaggle.com/competitions/beyond-visible-spectrum-ai-for-agriculture-2024p2