Drees, Lukas: Data-driven Image Generation for Crop Growth Modeling. - Bonn, 2025. - Dissertation, Rheinische Friedrich-Wilhelms-Universität Bonn.
Online edition in bonndoc: https://nbn-resolving.org/urn:nbn:de:hbz:5-80296
@phdthesis{handle:20.500.11811/12721,
urn = {https://nbn-resolving.org/urn:nbn:de:hbz:5-80296},
author = {Drees, Lukas},
title = {Data-driven Image Generation for Crop Growth Modeling},
school = {Rheinische Friedrich-Wilhelms-Universität Bonn},
year = 2025,
month = jan,

note = {Crop growth models play a crucial role in agriculture because they help us understand how crops grow and thrive. They allow farmers to predict the yield they can expect, which is important for planning, storing, and marketing food. They help optimize the use of water, fertilizers, and pesticides, which saves costs and contributes to sustainability by minimizing environmental impact. Moreover, they play a central role in research, for example, when investigating modern cultivation systems such as mixed cropping and determining in which combinations different crops thrive best. These aspects help make agriculture more sustainable and more efficient, both of which are major challenges given climate change and a growing world population. The many existing crop growth models can be categorized into process-based models, data-driven models, and hybrids of the two. This thesis focuses on data-driven models, which learn the growth behavior of plants from real data using machine learning methods. We impose two fundamental requirements on the crop growth models developed in this work. First, the model prediction should be based on an image showing the current state of the plant at an early growth stage, in addition to other factors. Second, target parameters should not be estimated directly; instead, an artificial image is generated first that shows the plant at a potential future growth stage. Implementing both requirements yields several advantages. The model prediction is grounded in real observations from the field, allowing realistic images of future plants to be generated that are related to the input. The generated images can not only be reused as artificial sensor data but also provide significant added value for visualizing the spatial crop distribution in the field and for model explainability.
For image generation, we use Generative Adversarial Networks (GANs) and perform experiments on Arabidopsis thaliana, Brassica oleracea var. botrytis (cauliflower), and mixed crops consisting of Triticum aestivum (spring wheat) and Vicia faba (field bean). We first demonstrate that a data-driven crop growth model based purely on RGB images and a fixed growth step can generate realistic images for different field treatments. We show that the images not only look realistic but can also be used as artificial sensor data from which meaningful plant traits can be derived. Next, we increase the input flexibility of the data-driven model so that irregular image sequences can be processed as input and images of arbitrary growth stages can be generated. This enables not only interpolation and extrapolation of image sequences but also the generation of stochastic image distributions and the pixel-wise visualization of growth variability. Finally, we present a third data-driven crop growth model that handles multi-modal input, i.e., an image plus additional growth-influencing factors of different types. We demonstrate that results from a process-based model can be used as input to the data-driven growth model, which offers the opportunity to add a spatial component to the process-based model or to re-calibrate it. Particularly noteworthy is the ability to recombine images and growth-influencing factors, allowing for simulations that are analyzed qualitatively and quantitatively.
Overall, this work contributes significantly to data-driven crop growth modeling by generating realistic images of future growth stages from which realistic target parameters can be derived. In particular, the flexibility of handling irregular image sequences or multi-modal conditions in the input, the ability to generate time-variable and stochastic output, and the integration with a process-based model are demonstrated experimentally. In combination with investigations of data requirements and experiments on the generalizability of crop growth models, this thesis provides essential indications of how image-generating, data-driven crop growth models can be used in agricultural practice in the future.},

url = {https://hdl.handle.net/20.500.11811/12721}
}
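To make the image-to-image setup described in the abstract concrete, the following is a minimal sketch of a conditional GAN for growth prediction. It is not the dissertation's implementation: the pix2pix-style encoder-decoder generator, the PatchGAN-style discriminator, the 256x256 RGB input size, and the scalar growth-stage condition broadcast as an extra image channel are all illustrative assumptions.

# Minimal sketch of an image-conditioned GAN for crop growth prediction.
# Assumptions (not from the dissertation): pix2pix-style generator,
# PatchGAN-style discriminator, 256x256 RGB images, and a scalar target
# growth stage appended as an extra input channel.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps an early-stage RGB image + growth-stage channel to a future image."""
    def __init__(self, in_ch=4, out_ch=3, base=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, base, 4, stride=2, padding=1),              # 256 -> 128
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1),           # 128 -> 64
            nn.BatchNorm2d(base * 2),
            nn.LeakyReLU(0.2, inplace=True),
            nn.ConvTranspose2d(base * 2, base, 4, stride=2, padding=1),  # 64 -> 128
            nn.BatchNorm2d(base),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base, out_ch, 4, stride=2, padding=1),    # 128 -> 256
            nn.Tanh(),
        )

    def forward(self, img, stage):
        # stage: one scalar per sample, e.g. days until the target growth stage
        cond = stage.view(-1, 1, 1, 1).expand(-1, 1, *img.shape[2:])
        return self.net(torch.cat([img, cond], dim=1))

class Discriminator(nn.Module):
    """Judges (early image, real/generated future image) pairs patch-wise."""
    def __init__(self, in_ch=6, base=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, base, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1),
            nn.BatchNorm2d(base * 2),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base * 2, 1, 4, padding=1),  # per-patch real/fake logits
        )

    def forward(self, early_img, future_img):
        return self.net(torch.cat([early_img, future_img], dim=1))

if __name__ == "__main__":
    G, D = Generator(), Discriminator()
    early = torch.randn(2, 3, 256, 256)    # batch of early growth-stage images
    stage = torch.tensor([14.0, 28.0])     # hypothetical target offsets in days
    fake_future = G(early, stage)
    logits = D(early, fake_future)
    print(fake_future.shape, logits.shape) # (2, 3, 256, 256), (2, 1, 63, 63)

The multi-modal variant described in the abstract would, under the same assumptions, condition the generator on additional growth-influencing factors (e.g., treatment or weather descriptors) instead of, or in addition to, the single growth-stage scalar.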

The following terms of use are associated with this resource:

InCopyright