Publications

NIBIO's employees contribute to several hundred scientific articles and research reports every year. You can browse or search our collection, which contains references and links to these publications as well as other research and dissemination activities. The collection is continuously updated with new and historical material.

2018

Abstract

Introduction and purpose: The ability of apple rootstocks to become infected by Neonectria ditissima, the cause of European canker, was studied over two years. Materials and methods: Rootstocks B9 and M9 with a size suitable for grafting (6-10 mm stem diameter, termed rootstocks), and smaller sized rootstocks (<5 mm stem diameter, termed transplants) of B9, M9, M26, MM106 and Antonovka were inoculated with N. ditissima at different times, either with contaminated map pins or with spore suspensions. In addition, the rootstocks were either defeathered (side shoots removed), topped (top shoot headed) or both, to create wounds that would normally occur during propagation, while wounds on transplants were made by removing leaves. Results and discussion: One month after inoculation, slightly sunken canker lesions had developed around the inoculation points of the map pins or wounds. No lesions developed on the non-inoculated controls. Map pin inoculation resulted in 30% to 89% infection, and spore suspensions sprayed on wounds resulted in 5% to 45% infection. When the cankered areas were split open, brown lesions with necrotic tissue due to infection by N. ditissima appeared. The transplants of M9, M26 and MM106 inoculated with contaminated map pins in 2014 developed necrosis on 40% to 67% of the plants, but there were no differences in incidence or severity among the different rootstock types. On the transplants of B9, Antonovka and M9 inoculated in 2015, there was more necrosis on B9 (42%) than on Antonovka (11%), and more sporulating lesions on B9 (29%) than on M9 (9%) or Antonovka (4%). Conclusion: Rootstocks used for apple trees may become infected by N. ditissima, and wounds should therefore be protected during propagation.


Abstract

In recent years, rising competition for water coupled with new environmental regulations has exerted pressure on water allocations for turfgrass irrigation. In this article, we reviewed published scientific and industry evidence on the agronomic and environmental impacts of turfgrass irrigation using a robust systematic review methodology. Our focus was on the links between (i) irrigation management (amount and frequency), (ii) agronomic responses to irrigation (turf quality, growth rates and rooting) and (iii) environmental impacts (nitrogen leaching). Based on an initial screening of 653 studies and data extracted from 83 papers, our results show that in most cases, under moderate levels of deficit irrigation (50%–60% of actual evapotranspiration), turf quality can be maintained at an acceptable level but with lower water consumption compared to irrigating back to field capacity. Irrigation beyond field capacity was found to increase the risk of nutrient leaching. However, evidence also showed that the concentration and total loss of nitrate (NO3−) in leachate were influenced more by nitrogen (N) rates, soil characteristics, turfgrass species and turfgrass growth rates than by irrigation practices. Our analyses suggest that turfgrass irrigation should be scheduled to apply water at moderate levels of deficit irrigation, sufficient to maintain turfgrass quality but limited enough to promote a deep and extensive rooting system. The findings provide new insights and valuable evidence for both scientists and practitioners involved in turfgrass research and management.
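The scheduling recommendation above can be made concrete with a minimal sketch: apply a fixed fraction of actual evapotranspiration (here 55%, i.e. within the 50%–60% range reported) rather than refilling to field capacity. The function name and the daily ET figures are illustrative assumptions, not values from the review.

```python
def deficit_irrigation_depth(eta_mm: float, fraction: float = 0.55) -> float:
    """Irrigation depth (mm) as a fraction of actual evapotranspiration (mm).

    fraction=0.55 sits inside the 50-60% deficit range discussed in the review;
    both the name and default are illustrative, not from the source.
    """
    if not 0.0 <= fraction <= 1.0:
        raise ValueError("fraction must be between 0 and 1")
    return eta_mm * fraction

# Weekly schedule from hypothetical daily actual-ET readings (mm/day):
daily_eta = [4.2, 3.8, 5.1, 4.9, 3.5, 4.0, 4.4]
weekly_irrigation = sum(deficit_irrigation_depth(e) for e in daily_eta)
print(f"Irrigate {weekly_irrigation:.1f} mm this week")
```

In practice the ET input would come from a weather station or reference-ET model; the sketch only shows the deficit-fraction arithmetic.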

Abstract

Norway is strongly committed to the Paris Climate Agreement with an ambitious goal of a 40% reduction in greenhouse gas emissions by 2030. The land sector, including agriculture and forestry, must contribute critically to this national target. Beyond emission reduction, the land sector has the unique capacity to actively remove CO2 from the atmosphere through biological carbon storage in biomass and in soils. Soils are the largest reservoir of terrestrial carbon, and relatively small changes in soil carbon content can have an amplified mitigation effect on the Earth’s climate. Therefore, improved management of soils for carbon storage is receiving much attention, for example through international political initiatives such as the “4 per mille” initiative. However, in Norway, many mitigation measures targeting soil carbon might negatively impact food production and economic activity. For example, soil carbon storage can be increased by shifting from cereal crop production to grasslands, but Norway already has abundant grassland and a comparatively small area dedicated to cereals. Another such issue is cultivation on drained peatland, where food is produced at the expense of large losses of soil carbon as CO2 to the atmosphere. Therefore, there is a need to look for win-win solutions for soil carbon storage, which benefit both food production and climate mitigation. Large-scale conversion of agricultural and forest waste biomass to biochar is such an option, and is considered the activity with the largest potential for soil carbon sequestration in Norway. Biochar has been demonstrated to have a mean residence time exceeding 100 years under Norwegian field conditions (Rasse et al., 2017), and no negative effects on plants or soils have been observed. However, despite the convincing benefits of biochar as a climate mitigation solution, it has not yet advanced much beyond the research stage, notably because its effects on yield are too modest.
Here, we will first present the comparative advantage of biochar technology as compared to traditional agronomy methods for large-scale C storage in Norwegian agricultural soils. We will further discuss the need for developing innovations in pyrolysis and nutrient-rich waste recycling leading to biochar-fertilizer products as a win-win solution for carbon storage and food production.


Abstract

Using Caenorhabditis elegans as a model organism, this study addresses the potential linkage between the toxicity of NM300K Ag nanoparticles (AgNPs), their particle size distribution and the presence of dissolved Ag in the test media. Of the three endpoints assessed (growth, fertility and reproduction), reproduction was the most sensitive, with 50% effect concentrations (EC50) ranging from 0.26 to 0.84 mg Ag L-1 for NM300K and from 0.08 to 0.11 mg Ag L-1 for AgNO3. Silver uptake by C. elegans was similar for both forms of Ag, while bioaccumulation was higher in the AgNO3 exposure. The observed differences in toxicity between NM300K and AgNO3 did not correlate with bioaccumulated Ag, which suggests that toxicity is a function of the type of exposure agent (AgNPs vs AgNO3) and its mode of action. Before addition of the food source, E. coli, size fractionation revealed that dissolved Ag comprised 13–90% and 4–8% of total Ag in the AgNO3 and NM300K treatments, respectively. No dissolved Ag was detectable in the actual test media, due to immediate Ag adsorption to bacteria. Results from the current study highlight that information on the behavior and characterization of exposure conditions is essential for nanotoxicity studies.