Soybean (Glycine max (L.) Merr.) Leaf Moisture Estimation Based on Multisource Unmanned Aerial Vehicle Image Feature Fusion.
Plants (Basel, Switzerland)
Efficient acquisition of crop leaf moisture information holds significant importance for agricultural production. This information provides farmers with an accurate data foundation, enabling them to implement timely and effective irrigation management strategies and thereby maximize crop growth efficiency and yield. In this study, unmanned aerial vehicle (UAV) multispectral technology was employed. Through two consecutive years of field experiments (2021-2022), soybean (Glycine max (L.) Merr.) leaf moisture data and corresponding UAV multispectral images were collected. Vegetation indices, canopy texture features, and randomly combined texture indices that previous studies had shown to correlate strongly with crop parameters were established. By analyzing the correlation between these parameters and soybean leaf moisture, parameters with significant correlation coefficients (p < 0.05) were selected as input variables for the model (combination 1: vegetation indices; combination 2: texture features; combination 3: randomly combined texture indices; combination 4: vegetation indices, texture features, and randomly combined texture indices together). Subsequently, an extreme learning machine (ELM), extreme gradient boosting (XGBoost), and a back-propagation neural network (BPNN) were used to model the leaf moisture content. The results indicated that most vegetation indices exhibited higher correlation coefficients with soybean leaf moisture than the texture features, while randomly combined texture indices could enhance the correlation with soybean leaf moisture to some extent. RDTI, the random-combination texture index, showed the highest correlation coefficient with leaf moisture at 0.683, with the texture combination being Variance1 and Correlation5.
When combination 4 (vegetation indices, texture features, and randomly combined texture indices together) was used as the input and the XGBoost model was employed for soybean leaf moisture monitoring, the best performance in this study was achieved. The coefficient of determination (R²) on the estimation model's validation set reached 0.816, with a root-mean-square error (RMSE) of 1.404 and a mean relative error (MRE) of 1.934%. This study provides a foundation for UAV multispectral monitoring of soybean leaf moisture, offering valuable insights for rapid assessment of crop growth.
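The two statistical steps the abstract relies on, screening candidate features by correlation with measured leaf moisture and scoring models by R², RMSE, and MRE, can be sketched in plain Python. The function names are illustrative, not from the paper, and significance testing of the correlations is omitted:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences,
    as used to screen vegetation indices and texture features."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def validation_metrics(observed, predicted):
    """R², RMSE, and mean relative error (MRE, %) on a validation set."""
    n = len(observed)
    mean_obs = sum(observed) / n
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    r2 = 1 - ss_res / ss_tot
    rmse = math.sqrt(ss_res / n)
    mre = 100 * sum(abs(o - p) / o for o, p in zip(observed, predicted)) / n
    return r2, rmse, mre
```

In the study's workflow, features whose `pearson_r` against leaf moisture is significant (p < 0.05) form the model inputs, and `validation_metrics` corresponds to the reported R² = 0.816, RMSE = 1.404, MRE = 1.934% for the XGBoost model.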
10.3390/plants13111498
Design and implementation of a portable snapshot multispectral imaging crop-growth sensor.
Frontiers in plant science
The timely and accurate acquisition of crop-growth information is a prerequisite for implementing intelligent crop-growth management, and portable multispectral imaging devices offer reliable tools for monitoring field-scale crop growth. To meet the demand for obtaining crop spectral information over a wide band range and to achieve real-time interpretation of multiple growth characteristics, we developed a novel portable snapshot multispectral imaging crop-growth sensor (PSMICGS) based on the spectral sensing of crop growth. A wide-band co-optical-path imaging system, using mosaic-filter spectroscopy combined with dichroic-mirror beam separation, is designed to acquire crop spectral information over a wide band range and enhance the device's portability and integration. Additionally, a sensor-information and crop-growth monitoring model, coupled with a processor system based on an embedded control module, is developed to enable real-time interpretation of the aboveground biomass (AGB) and leaf area index (LAI) of rice and wheat. Field experiments showed that the prediction models for rice AGB and LAI, constructed using the PSMICGS, had determination coefficients (R²) of 0.7 and root-mean-square error (RMSE) values of 1.611 t/ha and 1.051, respectively. For wheat, the AGB and LAI prediction models had R² values of 0.72 and 0.76, respectively, and RMSE values of 1.711 t/ha and 0.773, respectively. In summary, this research provides a foundational tool for monitoring field-scale crop growth, which is important for promoting high-quality, high-yield crop production.
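The on-device interpretation step described above reduces band reflectances from the snapshot imager to a spectral index and maps it to a growth indicator through a fitted regression. A minimal sketch of that pattern follows; the NDVI index, band names, and coefficients are assumptions for illustration, since the abstract does not state which indices or model forms the PSMICGS uses:

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel pair."""
    return (nir - red) / (nir + red)

def predict_agb(nir_band, red_band, slope, intercept):
    """Linear model on the mean spectral index: returns an AGB estimate (t/ha).
    The slope/intercept would be fitted offline against field measurements."""
    indices = [ndvi(n, r) for n, r in zip(nir_band, red_band)]
    mean_index = sum(indices) / len(indices)
    return slope * mean_index + intercept
```

An embedded pipeline of this shape is what lets the device report AGB or LAI in real time instead of storing raw images for later processing.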
10.3389/fpls.2024.1416221
Flexible wearable sensors for crop monitoring: a review.
Frontiers in plant science
Crops are the main source of human food and have met the increasingly diversified demands of consumers. Sensors are used to monitor crop phenotypes and environmental information in real time, providing a theoretical reference for optimizing the crop growth environment, resisting biotic and abiotic stresses, and improving crop yield. Compared with non-contact monitoring methods such as optical imaging and remote sensing, wearable sensing technology has higher temporal and spatial resolution. However, existing crop sensors are mainly rigid mechanical structures, which can easily damage crop organs, and challenges remain in terms of accuracy and biosafety. Emerging flexible sensors have attracted wide attention in the field of crop phenotype monitoring due to their excellent mechanical properties and biocompatibility. This article introduces the key technologies involved in the preparation of flexible wearable sensors, covering flexible preparation materials and advanced preparation processes. The monitoring functions of flexible sensors in crop growth are highlighted, including the monitoring of crop nutrient, physiological, ecological, and growth-environment information. The monitoring principle and performance, together with the pros and cons, of each sensor are analyzed. Furthermore, the future opportunities and challenges of flexible wearable devices in crop monitoring are discussed in detail from the aspects of new sensing theory, sensing materials, sensing structures, wireless power supply technology, and agricultural sensor networks, providing a reference for smart agricultural management systems based on flexible crop sensors and helping realize efficient management of agricultural production and resources.
10.3389/fpls.2024.1406074
Identifying rice field weeds from unmanned aerial vehicle remote sensing imagery using deep learning.
Plant methods
BACKGROUND:Rice field weed object detection can provide key information on weed species and locations for precise spraying, which is of great significance in actual agricultural production. However, in complex and changing real farm environments, traditional object detection methods still have difficulty identifying small, occluded, and densely distributed weed instances. To address these problems, this paper proposes a multi-scale feature-enhanced DETR network, named RMS-DETR. By adding multi-scale feature extraction branches on top of DETR, the model fully utilizes information from different semantic feature layers to improve recognition of rice field weeds in real-world scenarios. METHODS:Introducing multi-scale feature layers on the basis of the DETR model, we apply a differentiated design to the different semantic feature layers. The high-level semantic feature layer adopts a Transformer structure to extract contextual information between barnyard grass and rice plants. The low-level semantic feature layer uses a CNN structure to extract local detail features of barnyard grass. Introducing multi-scale feature layers inevitably increases model computation and thus lowers inference speed, so we employ a new type of convolution, Pconv (partial convolution), to replace the traditional standard convolutions in the model. RESULTS:Compared with the original DETR model, the proposed RMS-DETR model improved average recognition accuracy by 3.6% and 4.4% on our constructed rice field weeds dataset and the DOTA public dataset, respectively, reaching average recognition accuracies of 0.792 and 0.851. The RMS-DETR model size is 40.8 M, with an inference time of 0.0081 s. Compared with three classical DETR variants (Deformable DETR, Anchor DETR, and DAB-DETR), the RMS-DETR model improved average precision by 2.1%, 4.9%, and 2.4%, respectively.
DISCUSSION:This model can accurately identify rice field weeds in complex real-world scenarios, providing key technical support for precision spraying and the management of variable-rate spraying systems.
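The computational saving from Pconv comes from convolving only a fraction of the channels and passing the rest through untouched. A toy 1-D sketch of that idea is below; the paper's Pconv operates on 2-D feature maps inside the DETR backbone, so the kernel, channel count, and `ratio` here are illustrative only:

```python
def conv1d_same(signal, kernel):
    """1-D convolution with zero padding, output the same length as input."""
    k = len(kernel)
    pad = k // 2
    padded = [0.0] * pad + list(signal) + [0.0] * pad
    return [sum(kernel[j] * padded[i + j] for j in range(k))
            for i in range(len(signal))]

def partial_conv(channels, kernel, ratio=0.25):
    """Partial convolution: convolve only the first `ratio` fraction of the
    channels and copy the remaining channels through unchanged, cutting the
    convolution cost roughly in proportion to `ratio`."""
    n_conv = max(1, int(len(channels) * ratio))
    out = [conv1d_same(ch, kernel) for ch in channels[:n_conv]]
    out += [list(ch) for ch in channels[n_conv:]]
    return out
```

With `ratio=0.25`, only a quarter of the channels incur convolution cost per layer, which is the kind of saving that lets RMS-DETR add multi-scale branches without losing inference speed.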
10.1186/s13007-024-01232-0
Phytotoxicity of HNTs to rice (Oryza sativa L.): Effects on rice growth and development.
Chemosphere
The phytotoxicity of halloysite nanotubes (HNTs) to rice (Oryza sativa L.) was evaluated at several stages, from germination and seedling growth to spike setting, and the seedling stage was selected to study the effect of HNTs on rice growth. Rice was cultured using HNT dispersions at different concentrations, with a blank control group cultured in deionized water. It was found that HNTs did not affect the germination of rice seeds, and at the seedling stage a low-concentration HNT dispersion (0.1 mg mL⁻¹) promoted rice growth, significantly increasing the biomass and root system of the seedlings and promoting the development of their stems and leaves. However, a high-concentration HNT dispersion (100 mg mL⁻¹) had an inhibitory effect on rice growth, resulting in a significant decrease in rice biomass, oxidative damage (increased H₂O₂ and malondialdehyde contents and disruption of cell membrane permeability), and a decrease in chlorophyll content. When the HNT-treated rice seedlings were transplanted into soil, all of the rice was found to grow healthily; the growth trend was consistent with the seedling stage, and all groups of rice were able to produce spikes, indicating that the effect of HNTs on rice was slight. Overall, this work characterizes the toxicity of HNTs to rice, laying a foundation for the application of HNTs in the agricultural field.
10.1016/j.chemosphere.2024.143735