Publications
NIBIO's employees contribute to several hundred scientific articles and research reports every year. You can browse or search our collection, which contains references and links to these publications as well as other research and dissemination activities. The collection is continuously updated with new and historical material.
2025
Abstract
The simultaneous improvement of crop yield and nitrogen (N) use efficiency (NUE) can potentially be achieved through precision N management. In this study, crop modeling, remote sensing, and machine learning were combined to develop and assess a precision N recommendation strategy (MSM-PNM) for maize (Zea mays L.) based on twenty-eight site-years of field experiments conducted in Northeast China involving various N rates and planting densities. In the MSM-PNM strategy, a crop growth model was used at the eight-leaf stage of maize to initially predict the in-season optimal side-dressing N rate (EOSN), combining the current season's weather data available prior to side-dressing with weather data from previous years for the remaining growing period. Maize N status was then estimated using machine learning models based on data collected with an active canopy sensor along with weather, soil, and management information. Finally, the crop-model-predicted EOSN was adjusted using the estimated maize N status. The prediction accuracy of EOSN (R2 = 0.70, root mean squared error (RMSE) = 18.60 kg ha−1) based on this integrated MSM-PNM strategy was higher than using the crop model alone (R2 = 0.65, RMSE = 20.10 kg ha−1). The precision maize management system based on the integrated MSM-PNM strategy decreased N rates by 6–16 % and increased NUE by 8–18 % relative to both farmer practice (250 kg N ha−1 applied as basal fertilizer without side-dress N) and regional optimal management practice (split N applied at a fixed rate and timing), while maintaining high grain yield and marginal net return. It is concluded that this new integrated precision N management strategy, combining the advantages of crop modeling, remote sensing, and machine learning, can significantly increase maize NUE while maintaining high crop yield, thus contributing to food security and agricultural sustainability.
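The final step described above, adjusting the crop-model EOSN with a sensor-derived N status estimate, can be sketched roughly as follows. The linear adjustment rule, the gain `k`, and the function name are illustrative assumptions, not the published algorithm.

```python
def adjust_eosn(eosn_model, n_sufficiency_index, k=100.0):
    """Adjust a crop-model-predicted side-dress N rate (kg ha-1) using an
    estimated N sufficiency index (NSI ~ 1.0 means adequate N).
    The linear rule and gain k are illustrative assumptions."""
    # If the crop is N-deficient (NSI < 1), raise the recommended rate;
    # if it shows surplus N (NSI > 1), lower it. Never go below zero.
    adjusted = eosn_model + k * (1.0 - n_sufficiency_index)
    return max(0.0, adjusted)

# Example: the model predicts 120 kg N ha-1; a sensor-based NSI of 0.9
# indicates mild deficiency, so the rate is nudged upward.
print(adjust_eosn(120.0, 0.9))  # -> 130.0
```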
Authors
Xiaokai Chen Yuxin Miao Krzysztof Kusnierek Fenling Li Chao Wang Botai Shi Fei Wu Qingrui Chang Kang Yu
Abstract
Timely and accurate monitoring of crop nitrogen (N) status is essential for precision agriculture. UAV-based hyperspectral remote sensing offers high-resolution data for estimating plant nitrogen concentration (PNC), but its cost and complexity limit large-scale application. This study compares the performance of UAV hyperspectral data (S185 sensor) with simulated multispectral data from DJI Phantom 4 Multispectral (P4M), PlanetScope (PS), and Sentinel-2A (S2) in estimating winter wheat PNC. Spectral data were collected across six growth stages over two seasons and resampled to match the spectral characteristics of the three multispectral sensors. Three variable selection strategies (one-dimensional (1D) spectral reflectance, optimized two-dimensional (2D), and three-dimensional (3D) spectral indices) were combined with Random Forest Regression (RFR), Support Vector Machine Regression (SVMR), and Partial Least Squares Regression (PLSR) to build PNC prediction models. Results showed that, while hyperspectral data yielded slightly higher accuracy, optimized multispectral indices, particularly from PS and S2, achieved comparable performance. Among models, SVMR and RFR showed consistent effectiveness across strategies. These findings highlight the potential of low-cost multispectral platforms for practical crop N monitoring. Future work should validate these models using real satellite imagery and explore multi-source data fusion with advanced learning algorithms.
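The workflow above — resampling hyperspectral reflectance to broad multispectral bands, deriving a spectral index, and fitting a regression model for PNC — can be sketched as below. The band windows, the synthetic data, and the index are rough placeholders, not the actual sensor response functions or field measurements.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: 200 "hyperspectral" samples, 450-950 nm at 4 nm steps.
wavelengths = np.arange(450, 950, 4)
spectra = rng.uniform(0.05, 0.6, size=(200, wavelengths.size))

# Simulated broadband resampling: mean reflectance within each band window
# (center, width in nm). These windows are illustrative placeholders, not
# the exact P4M / PlanetScope / Sentinel-2A spectral response functions.
bands = {"green": (560, 30), "red": (665, 30), "red_edge": (740, 20), "nir": (842, 40)}

def resample(spec):
    cols = []
    for center, width in bands.values():
        mask = np.abs(wavelengths - center) <= width / 2
        cols.append(spec[:, mask].mean(axis=1))
    return np.column_stack(cols)

X = resample(spectra)
# One optimized 2D index: a normalized difference of the NIR and red bands.
nd_index = (X[:, 3] - X[:, 2]) / (X[:, 3] + X[:, 2])
features = np.column_stack([X, nd_index])

# Synthetic PNC target loosely tied to the index, with noise.
pnc = 2.0 + 1.5 * nd_index + rng.normal(0, 0.05, size=200)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(features, pnc)
print(round(model.score(features, pnc), 2))  # training R^2, optimistic by design
```

With real data the model would of course be evaluated on held-out site-years rather than the training set.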
Abstract
Interpreting multi-component 1H NMR spectra is difficult due to peak overlap, concentration variability, and low-abundance signals. We cast mixture identification as a single-pass multi-label task. A compact CNN–Transformer (“Hybrid”) model was trained end-to-end on domain-informed and realistically simulated spectra derived from a 13-component flavor library; the model requires no real mixtures for training. On 16 real formulations, the Hybrid attains micro-F1 = 0.990 and exact-match (subset) accuracy = 0.875, outperforming CNN-only and Transformer-only ablations, while remaining efficient (~0.47 M parameters; ~0.68 ms on GPU, V100). The approach supports abstention and shows robustness to simulated outsiders. Although the evaluation set was small, and the macro-ECE (per-class, 15 bins) was inflated by sparse classes (≈0.70), the micro-averaged Brier score is low (0.0179), and temperature scaling had negligible effect (T ≈ 1.0), indicating good overall probability quality. The pipeline is readily extensible to larger libraries and adjacent applications in food authenticity and targeted metabolomics. Classical chemometric baselines trained on simulation failed to transfer to real measurements (subset accuracy 0.00), while the Hybrid model maintained strong performance.
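The single-pass multi-label readout with abstention, and the micro-F1 metric reported above, can be sketched as follows. The per-class scores, decision threshold, and abstention band are illustrative values, not the paper's.

```python
import numpy as np

def predict_labels(scores, present=0.5, abstain_low=0.35, abstain_high=0.65):
    """Turn per-class sigmoid scores into multi-label decisions.
    Scores inside (abstain_low, abstain_high) are flagged for abstention;
    these thresholds are illustrative, not the published values."""
    labels = scores >= present
    uncertain = (scores > abstain_low) & (scores < abstain_high)
    return labels, uncertain

def micro_f1(y_true, y_pred):
    """Micro-averaged F1: pool true/false positives and negatives
    across all classes and samples before computing F1."""
    tp = np.sum(y_true & y_pred)
    fp = np.sum(~y_true & y_pred)
    fn = np.sum(y_true & ~y_pred)
    return 2 * tp / (2 * tp + fp + fn)

# Toy example: 2 mixtures, 3 library components.
y_true = np.array([[1, 0, 1], [0, 1, 0]], dtype=bool)
scores = np.array([[0.90, 0.10, 0.80], [0.20, 0.70, 0.55]])
y_pred, uncertain = predict_labels(scores)
print(round(micro_f1(y_true, y_pred), 3))  # -> 0.857 (one borderline false positive)
```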
Abstract
Single-class object detection, which focuses on identifying, counting, and tracking a specific animal species, plays a vital role in optimizing farm operations. However, dense occlusion among individuals in group activity scenarios remains a major challenge. To address this, we propose YOLO-SDD, a dense detection network designed for single-class densely populated scenarios. First, we introduce a Wavelet-Enhanced Convolution (WEConv) to improve feature extraction under dense occlusion. Second, we propose an occlusion perception attention mechanism (OPAM), which further enhances the model’s ability to recognize occluded targets by simultaneously leveraging low-level detailed features and high-level semantic features. Lastly, a Lightweight Shared Head (LS Head) is incorporated and specifically optimized for single-class dense detection tasks, enhancing efficiency while maintaining high detection accuracy. Experimental results on the ChickenFlow dataset, which we developed specifically for broiler detection, show that the n, s, and m variants of YOLO-SDD achieve AP50:95 improvements of 2.18%, 2.13%, and 1.62% over YOLOv8n, YOLOv8s, and YOLOv8m, respectively. In addition, our model surpasses the detection performance of the latest real-time detector, YOLOv11. YOLO-SDD also achieves state-of-the-art performance on the publicly available GooseDetect and SheepCounter datasets, confirming its superior detection capability in crowded livestock settings. YOLO-SDD’s high efficiency enables automated livestock tracking and counting in dense conditions, providing a robust solution for precision livestock farming.
Abstract
In broiler breeding, precise counting is crucial for improving production efficiency and ensuring animal welfare. Nevertheless, counting chickens precisely is a challenging task, especially when young chicks huddle together for warmth. Although deep learning has been widely adopted in various counting-related tasks, accurate localization and counting of chickens in high stocking density scenes has not been well investigated. We propose a point-supervised dense chicken flock counting network (PCCNet), which directly uses points as learning targets. The network adopts information feature fusion to assist the identification of broilers in high stocking density scenes. In addition, using the distance between neighboring points as the matching cost in point matching algorithms helps generate more reasonable matching results, facilitating model convergence. To validate the effectiveness of the proposed network, a Chicken Counting Dataset (CCD) is built, consisting of two subsets separated by age: CCD_A and CCD_B. The accuracies of PCCNet on the two subsets of CCD are 97.85% and 97.06%, with corresponding Mean Absolute Errors (MAE) of 1.966 and 5.173, and Root Mean Square Error (RMSE) values of 3.474 and 7.034, respectively. Our model achieves better broiler counting performance than other state-of-the-art (SOTA) methods.
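The counting metrics reported above (accuracy, MAE, RMSE) can be computed from per-image predicted versus ground-truth counts as sketched below. The accuracy definition (1 minus mean relative error) is one common convention and may differ from the paper's; the counts are made-up toy values.

```python
import numpy as np

def counting_metrics(true_counts, pred_counts):
    """MAE, RMSE, and a relative-accuracy score for flock counting.
    The accuracy definition (1 - mean relative error) is one common
    convention; the published metric may be defined differently."""
    t = np.asarray(true_counts, dtype=float)
    p = np.asarray(pred_counts, dtype=float)
    err = p - t
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    accuracy = 1.0 - np.mean(np.abs(err) / t)
    return mae, rmse, accuracy

# Toy example: five images of ground-truth vs. predicted chick counts.
mae, rmse, acc = counting_metrics([100, 120, 80, 150, 90], [98, 123, 80, 145, 92])
print(round(mae, 2), round(rmse, 2), round(acc, 3))  # -> 2.4 2.9 0.98
```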
2024
Abstract
Raman spectroscopy is a powerful and non-invasive analytical method for determining the chemical composition and molecular structure of a wide range of materials, including complex biological tissues. However, the captured signals typically suffer from interferences manifested as noise and baseline, which need to be removed for successful data analysis. Effective baseline correction is critical in quantitative analysis, as it may impact peak signature derivation. Current baseline correction methods can be labor-intensive and may require extensive parameter adjustment depending on the input spectrum characteristics. In contrast, deep learning-based baseline correction models trained across various materials offer a promising and more versatile alternative. This study reports an approach to manually identify the ground-truth baselines for eight different biological materials by extensively tuning the parameters of three classical baseline correction methods, Modified Multi-Polynomial Fit (Modpoly), Improved Modified Multi-Polynomial Fitting (IModpoly), and Adaptive Iteratively Reweighted Penalized Least Squares (airPLS), and combining the outputs to best fit the training data. We designed a one-dimensional Transformer (1dTrans) tailored to Raman spectral data for estimating their baselines, and evaluated its performance against a convolutional neural network (CNN), ResUNet, and the three aforementioned parametric methods. The 1dTrans model achieved lower mean absolute error (MAE) and spectral angle mapper (SAM) scores than the other methods on both the development and evaluation sets of manually labeled raw Raman spectra, highlighting the effectiveness of the method in Raman spectra pre-processing.
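For context, the classical Modpoly family of methods mentioned above iteratively fits a polynomial to the spectrum and clips the spectrum down to the fit, so peaks are progressively excluded. The sketch below is a simplified illustration of that idea on synthetic data, with illustrative default parameters; it is not the exact published implementation.

```python
import numpy as np

def modpoly_baseline(y, degree=3, n_iter=50, tol=1e-4):
    """Simplified Modpoly-style baseline estimation: iteratively fit a
    polynomial and clip the spectrum to it, so peaks are progressively
    excluded from the fit. Parameters are illustrative defaults."""
    x = np.linspace(-1, 1, y.size)  # scaled axis for numerical stability
    work = y.astype(float).copy()
    prev = work.copy()
    for _ in range(n_iter):
        coeffs = np.polyfit(x, work, degree)
        baseline = np.polyval(coeffs, x)
        work = np.minimum(work, baseline)  # suppress points above the fit (peaks)
        if np.linalg.norm(work - prev) / max(np.linalg.norm(prev), 1e-12) < tol:
            break
        prev = work.copy()
    return baseline

# Synthetic Raman-like spectrum: a smooth linear baseline plus two narrow peaks.
x = np.linspace(0, 1, 500)
true_baseline = 0.5 + 0.3 * x
peaks = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.8 * np.exp(-((x - 0.7) / 0.01) ** 2)
spectrum = true_baseline + peaks
est = modpoly_baseline(spectrum)
print(round(float(np.mean(np.abs(est - true_baseline))), 3))  # small residual error
```

The deep learning models in the study replace this hand-tuned iteration with a single learned forward pass.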
Authors
Jakob Geipel
Abstract
No abstract has been registered
Authors
Ingeborg Klingen Nils Bjugstad Therese With Berge Krzysztof Kusnierek Hans Wilhelm Wedel-Jarlsberg Roger Holten Anette Sundbye Lene Sigsgaard Håvard Eikemo Kirsten Tørresen Valborg Kvakkestad
Abstract
No abstract has been registered