
Using transient models such as the HYDRUS model has been suggested as an alternative

The differential response of roots to nutritional patchiness is probably a consequence of complex nutrient-specific signal transduction pathways. To investigate the effects of heterogeneous root salinity and nutrient conditions, several split-root tomato experiments were conducted. Water uptake from the saline root-zone dramatically decreased within 8 h of treatment in contrast to the non-saline root-zone, with a more pronounced effect when nutrients were provided only to the non-salinized root-zone. This reduction in water uptake did not correlate with decreased root growth, with the saline root-zone only showing significantly less root growth towards the end of the experiment. The rapidity and consistency of decreased water uptake by roots in the saline zone, from treatment imposition through to Day 9, suggests that a primary physiological response was followed by a morphological response. To further explore the role of heterogeneous nutrient provision on root activity, complete nutrient solutions were selectively depleted of either N or K+ in the non-saline root half while the other root half received a saline, complete nutrient solution. These treatments provoked a 'two-phase response'. Immediately upon treatment application, the saline conditions given to one side of the roots dominated, decreasing water uptake of those roots. Subsequently, water uptake from the saline-treated, nutrient-supplied roots proportionally increased, probably in response to the nutrient deficiency induced by the omission of the nutrient on the non-saline side. This effect was marked when K+ was present only in the saline root half and slight in the case of N.
The presence of K+ in the nutrient solution was the most important determinant of root activity even when coinciding with salinity, resulting in notably higher shoot tissue Na+ and Cl− concentrations when the sole source of K+ was supplied to the saline root volume. One valuable tool in categorizing and quantifying genetic variation in salt tolerance has been to define crop relative yield responses in terms of threshold salinities up to which yields are unaffected and linear decreases in relative yield with increasing salinity thereafter.
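The threshold-slope relative-yield relationship described above can be sketched in a few lines. The threshold and slope values used here are illustrative only, not taken from any crop tolerance table:

```python
def relative_yield(ece, threshold, slope):
    """Relative yield (%) as a function of root-zone salinity (ECe, dS/m).

    Yield is unaffected below the threshold, then declines linearly
    at `slope` percent per dS/m, bottoming out at zero.
    """
    if ece <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (ece - threshold))

# Hypothetical moderately sensitive crop: threshold 2.5 dS/m,
# 10% yield loss per additional dS/m.
print(relative_yield(2.0, 2.5, 10.0))  # 100.0 (below threshold)
print(relative_yield(5.0, 2.5, 10.0))  # 75.0
```

The piecewise-linear form makes it easy to compare cultivars: a higher threshold or a shallower slope both translate directly into higher relative yield at a given salinity.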

However, it is critical to recognize that these relationships have generally been presented in terms of variation in parameters such as ECe, or more occasionally EC1:5, that relate to the salinity of the soil. Yet it is not the salinity of the soil that affects plant growth but the salinity of the soil solution, and thus the ratio of salt to water in the soil. This means that the salinity stress on a plant can be doubled either by doubling the salt concentration in a soil or by halving its water content. Furthermore, as soils become drier, plant growth becomes affected by the increasingly negative matric potentials that develop because of the adhesion of water to soil pores. This view profoundly affects the whole idea of the heterogeneity of salinity stress in soils, because heterogeneity arises from variability in: the leaching effects of irrigation or rainfall on salt concentrations in soil; the hydrating effects of irrigation or rainfall on soil water contents; the effects of surface soil evaporation, which increases salt concentrations by capillarity and decreases soil water contents; and/or the water extraction rates of roots and their ion uptake/exclusion capacity, which over time also influence ion and water abundances near the roots.

All irrigation water introduces salts to the system, and in regions with high evapotranspiration and low rainfall, traditional salinity management emphasizes deliberate leaching of salts away from the root-zone while avoiding elevation of the water table to prevent damage to crops. Leaching is usually achieved by applying irrigation water in excess of crop evapotranspirational demands. The fraction of applied water that drains below the root-zone is referred to as the 'leaching fraction' and this value is used to coarsely gauge the extent of leaching.
Larger leaching fractions generally result in larger zones with low soil water salinity but may necessitate disposal of large volumes of saline drainage water and may cause additional salinization through capillary rise of saline water by raising the water table, as well as environmental impacts of drainage water disposal. Designing the appropriate leaching fractions needed to avoid yield loss is context-specific and will depend on the crop, soil texture, climate, irrigation system and schedule, and the salinity of the irrigation water being used. Ayers and Westcot developed a simple approach to calculate the leaching requirement based on salt mass balance calculations.
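As a sketch of this mass-balance idea, one widely used form of the leaching requirement expresses it from the irrigation-water salinity and the salinity the crop can tolerate. The function and numbers below are illustrative and may differ in detail from the exact calculation in the cited guidelines:

```python
def leaching_requirement(ec_iw, ec_e_tolerable):
    """Minimum leaching fraction to keep average root-zone salinity
    at or below the crop's tolerable ECe (both ECs in dS/m).

    One common mass-balance form: LR = ECiw / (5 * ECe - ECiw).
    """
    return ec_iw / (5.0 * ec_e_tolerable - ec_iw)

# Hypothetical case: irrigation water at 1.5 dS/m, crop tolerating
# an average root-zone ECe of 2.5 dS/m.
lr = leaching_requirement(1.5, 2.5)
print(round(lr, 3))  # 0.136
```

In this sketch, roughly 14% of the applied water would have to drain below the root-zone; either saltier irrigation water or a more sensitive crop pushes the required fraction up quickly.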

This approach estimates the leaching fraction required to keep the average root-zone salinity below the salinity threshold of the crop, assuming a specific root distribution and a strictly vertical, continual water flow. Approaches like this neglect the spatial non-uniformity of irrigation water application as well as the temporal dynamics of irrigation and water uptake during the season, and assume that the average root-zone salinity determines the impact of salinity on the crop. While the physical principles underlying salinity management have not changed since Ayers and Westcot developed these leaching guidelines, management goals have shifted over time to better recognize the environmental impacts of nutrient and salinity losses and to develop more advanced micro-irrigation and fertigation systems. This has given rise to both new challenges and new opportunities in managing salinity.

Challenge 1: Managing salinity under micro-irrigation systems. Spatial patterns of salt accumulation are diverse and differ by irrigation system, with each irrigation system posing specific challenges to salinity management. In the simplest case, flood irrigation applies water uniformly across the whole surface. In this case, salinity distribution is approximately uniform in the horizontal direction, but a salinity gradient exists vertically. Assuming sufficient leaching, salinity increases with depth in these systems, and uniform leaching of salts below the root-zone keeps the salinity within it relatively homogeneous. In contrast, applying water to only part of the surface causes strong horizontal salinity heterogeneity, as in furrow irrigation and more advanced micro-irrigation systems. Micro-irrigation aims to target water application to the root-zone, thereby improving water use efficiency by applying less water to regions with low root density and providing an opportunity to deliver water at a rate which matches crop demand.
Flood and overhead sprinkler irrigation manage soil moisture and salt content at the field scale, while micro-irrigation approaches management at the root-zone scale. Targeted water application results in targeted leaching: micro-irrigation leaches salts in zones rich with plant roots, while flood irrigation requires additional water to also leach salts from field zones between plants with low root density, making micro-irrigation more efficient than furrow/sprinkler irrigation for managing salinity. When drip and furrow irrigation were compared, drip irrigation sustained higher yields of salt-sensitive crops where saline groundwater was shallow, while using less water than furrow irrigation.

The economic incentive to install micro-irrigation systems is context-dependent, with the advantage of micro-irrigation over conventional irrigation becoming less clear when growing salt-tolerant crops or when irrigation water is abundant. Despite its potential to accumulate salts in the root-zone, even subsurface drip can have advantages over salinity management with traditional irrigation. While higher tomato yields justified the expense of installing a subsurface drip irrigation system in California, the same was not true of cotton, which remained lucrative with furrow irrigation, as such salt-tolerant crops tend to tolerate flood irrigation without yield loss provided that irrigation is applied pre-planting to avoid stand establishment losses. In drip irrigation systems with strongly localized water application, salt is not only leached downwards; significant lateral water movement away from the drip emitter also leaches salt horizontally, resulting in salt accumulation at the fringes of the wetted volume. This leads to a strongly heterogeneous small-scale salt distribution in which soil salinity levels in the top 20 cm can vary by a factor of more than five within only 40 cm of horizontal distance. Although the extent of horizontal salt movement depends on soil texture and can be partially controlled by emitter spacing, under micro-irrigation the salts concentrated between emitters near the surface generally have little opportunity to intrude into the root-zone in the absence of precipitation, owing to surface evaporation and irrigation. It is therefore recommended that crops be arranged close to emitters, where salinity is low, and that new lines be installed as close as possible to where old lines existed, to avoid the need for preseason reclamation leaching. Subsurface drip irrigation results in a different pattern of water flow and salinity accumulation.
While water application at the soil surface causes salts to leach downward and outward from the water source, subsurface irrigation causes resident and irrigation-derived salts to flow upward through advection and accumulate above the dripline where plants are present. This accumulation pattern antagonizes the establishment of many row crops because germination is relatively sensitive to salt stress. Such production systems rely on pre-season rain, sprinkler or surface irrigation to leach salts below the drip line, where they may then be leached downward by subsurface irrigation. Shallow installation of subsurface drip lines is advantageous where sufficient pre-season rains occur, as irrigating the soil surface may then be avoided altogether. This issue can be mechanically managed in processing tomato by adding soil to planting beds, followed by irrigation to concentrate salts into the uppermost zone of the bed, which is subsequently removed and placed in the furrow between rows, where very little horizontal salt movement occurs. The strong localization of water application in drip irrigation calls into question the applicability of historical steady-state leaching models to micro-irrigation systems. These models insufficiently account for the highly local nature of micro-irrigation and underestimate both the local leaching fraction experienced by plants and the tolerable EC of irrigation water.

Adequate management of heterogeneous salinity patterns and localized leaching under drip or micro-sprinkler may allow sustainable crop production in soils that would otherwise be deemed too saline for that species. Using transient models such as HYDRUS has been suggested as an alternative. These models account for localized application of water and changes in flow rates over time by explicitly simulating two-dimensional water and solute transport in the root-zone, numerically solving mechanistic models. However, although these models are very strong in depicting physical transport processes, they often oversimplify the description of plant physiological processes governing water and solute uptake. For example, the HYDRUS model neglects the fact that the distribution of water uptake is also affected by nutrient concentrations. Moreover, even if it were possible to perfectly simulate the water, nutrient and salinity dynamics for a given scenario, it would still be unclear how the calculated heterogeneous salinity distribution would translate into plant performance. Incorporating current knowledge of plant responses to heterogeneous conditions might make these models more suitable for evaluating salinity management practices.

Challenge 2: How to simultaneously optimize N efficiency and minimize the impact of salinity. The necessity of a leaching fraction for long-term salinity management is coupled with the issue of nutrient loss, especially for nitrate, which exhibits similar leaching potential to Cl−. Any practice designed to remove Na+ or Cl− from the root-zone probably also leaches NO3−. Although this is a common problem, few studies have addressed the integrated nature of salinity and nutrient management. While NO3− and Cl− are subject to very similar transport mechanisms and rates in the soil, their distributions in the soil can nevertheless be quite different, and high Na+ and Cl− concentrations do not necessarily coincide with high NO3− concentrations.
This is because, in contrast to Na+ and Cl−, NO3− is preferentially taken up by plant roots, and because nitrogen fertilizer is deliberately added to the irrigation water during fertigation, to some degree independently of water application. Understanding crop nitrogen demands and responses to spatially localized nutrients and salinity may help manage fertigation systems to achieve the simultaneous goals of salinity leaching and minimal nitrate loss. By providing nutrients through fertigation in a manner that retains them in the low-salinity zone adjacent to the drip emitter, roots can avoid exploring the saline fringes of the wetted zones, thus reducing salt exposure. HYDRUS-based modelling suggests that high-frequency applications of small amounts of nitrate, timed toward the end of a fertigation event, can help retain NO3− in the root-zone adjacent to the irrigation source while allowing salt to be leached to the peripheral root-zone.
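The late-pulse idea can be illustrated with a deliberately crude piston-flow toy: the root-zone is a stack of cells, and each irrigation pulse shifts the water one cell deeper. This is a hypothetical sketch, not HYDRUS; the cell count, pulse count and injection schedule are arbitrary:

```python
N_CELLS = 10
salt = [1.0] * N_CELLS       # resident salt, initially uniform (relative units)
nitrate = [0.0] * N_CELLS    # no nitrate in the profile at the start

PULSES = 6                   # irrigation pulses in one fertigation event
for pulse in range(PULSES):
    # inject fertilizer only in the last two pulses of the event
    feed_no3 = 1.0 if pulse >= PULSES - 2 else 0.0
    # piston flow: everything moves one cell deeper; entering water fills cell 0
    salt = [0.0] + salt[:-1]
    nitrate = [feed_no3] + nitrate[:-1]

print(salt)     # resident salt displaced to the deeper cells
print(nitrate)  # nitrate retained in the cells nearest the emitter
```

Even this caricature reproduces the qualitative result: salt present before the event ends up at the periphery, while nitrate injected late in the event has moved only a short distance and remains near the source.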

Weed block fabric is commonly used in organic and hydroponic production systems

Fabric was unrolled and pinned by hand to cover the post-row surface between raspberry beds prior to post installation. The fabric remained in place during the experiment and was unpinned and rolled up at the end of the project for potential reuse. Yard waste mulch from local suppliers was delivered to the project sites. Mulch was a woody, < 2-inch screened material with < 20% fine components. Different mulch sources were used at the two sites because the distance between sites and the volume requirements for each site were prohibitively large to source from a single supplier. Mulch was delivered by tractor to post rows, where it was spread with rakes to cover the entire post row with a 2- to 3-inch-thick layer. At both locations mulch was applied once prior to post installation and persisted throughout the trial period. Polyacrylamide (PAM), a nontoxic soil-binding polymer, was applied prior to rain events at a rate of 2 pounds per acre. In 2016–2017, PAM was mixed with water and applied with a backpack sprayer, but due to plugging of nozzles we dispersed dry PAM to post rows instead in 2017–2018 and observed similar efficacy and increased ease of application.

In the 2016–2017 season, we collected runoff samples by hand within 30 min from the beginning of runoff generation, approximately 25 feet away from the ends of each of the treatment post rows. About 250 milliliters of runoff water in each sample were brought from field sites to the UC Cooperative Extension Ventura County lab, immediately tested for turbidity using a turbidimeter, acidified with sulfuric acid to reach pH 3 and either shipped immediately to the ANR analytical lab at UC Riverside or stored at 4°C until shipment. Levels of nitrogen forms and total nitrogen and phosphorus were determined using a Discrete Analyzer AQ2. In 2017–2018, we collected grab samples as described above.
We also collected runoff in 5-gallon buckets installed at 25 feet from the end of post rows to intercept first flush of runoff at soil surface level.

Additionally, we installed suction lysimeters about 30 feet away from the ends of the post rows at 8-inch depth at Santa Maria and at 8- and 24-inch depths at Somis, and collected leachate after rains. In 2017–2018 we also collected sediment from the buckets after runoff occurred, and the sediment samples were dried and weighed at the UCCE Ventura County lab. In April 2018, we took soil samples that were analyzed for soil moisture, nitrate nitrogen and phosphorus content. We calculated the costs of each treatment for the 1,800-square-foot experiment plot and then extrapolated the costs to a per-acre basis for one tunnel use period. A tunnel use period covers a 3-year production cycle of raspberry from establishment until termination. Costs of treatments included materials, labor and equipment when applicable. The granular dry PAM formulation applied to soil was used in the analyses. We also adjusted a treatment's costs if it provided a weed control benefit, and because some treatments can serve for more than one tunnel use period, we distributed the costs accordingly.

Not all treatments had runoff during light rains. Barley cover crop and yard waste mulch likely interfered with low flows and aided water retention in post rows. We observed slower flows and greater puddling in post rows with barley or mulch than in other treatments or untreated soil. Soil sampled 3 days after rain in March 2018 at Somis had 8% to 12% greater moisture content at both sampling depths under mulch compared with other treatments. Mulch also conserved more soil moisture than fabric at Santa Maria. Combined nitrite and nitrate levels in runoff samples ranged from 0.29 to 6.48 milligrams per liter over two seasons of sampling. This variability is due to the intensity and frequency of the rains during this period, which also affected the amount of fertigated nitrogen that accumulated between rain events.
Fabric and PAM did not reduce nitrate or nitrite in runoff compared with untreated soil at any of the sampling dates at either location in either sampling season, while mulch was equally ineffective in reducing NOx in runoff at both locations in 2016–2017. During one out of five runoff events in 2016–2017, barley reduced NOx levels in runoff by 48% compared with untreated soil, but not significantly during other rain events of that season.

During two out of five runoff events at Somis in 2017–2018, barley reduced NOx levels in runoff by 71% and 82% and mulch reduced them by 67% and 91% compared with untreated soil, but reductions were not significant at other sampling events. At Santa Maria, none of the treatments had a significant impact on NOx in runoff when compared with untreated soil. All treatments at Somis were effective in reducing ammonium in runoff in 2016–2017 compared with untreated soil, but only barley was effective in 2017–2018. The overall greater average levels of ammonium in 2017–2018 were likely due to use of passive samplers that intercepted the first flush of runoff, which may have had a greater concentration of pollutants than runoff collected later. Ammonium is typically carried on sediments, so lower ammonium would indicate less sediment movement. This suggests that barley cover crop and yard waste mulch can reduce both the concentration of dissolved ammonium nitrogen in runoff and the volume of runoff, leading to potential reductions in nitrogen losses to the environment compared with untreated soil. Soil under barley and mulch had significantly less nitrate nitrogen compared with other treatments in March 2018 at Somis. At Santa Maria, all treatments except mulch had 25% to 81% less nitrate nitrogen than untreated soil, although mulch was also statistically similar to all other treatments. Mulch deterioration might have reduced its efficacy at Santa Maria. At Santa Maria, nitrate nitrogen levels in leachate collected at 8-inch depth on all sampling dates ranged from 12 to 27 parts per million in PAM and untreated plots, which was 52% to 80% greater than in other treatments. At Somis a similar trend was observed: nitrate nitrogen levels in leachate under PAM and untreated soil were 7 to 22 ppm, 80% to 90% greater than those under barley or mulch. Leachate nitrate concentrations under fabric were not different from those in untreated soil.
These results suggest that barley and mulch can reduce nitrate nitrogen in soil and leachate. Mulch and cover crop act as a barrier to runoff water carrying dissolved nitrogen and sediment, and may retain nitrogen for cover crop growth and for residue and mulch decomposition. Turbidity in the first flush of runoff was reduced 5- to 10-fold by all treatments compared with untreated soil at both locations in 2018. These results were similar to turbidity in grab samples taken in 2017 and 2018, which suggests that all treatments were effective in reducing waterborne sediments on site. Additionally, 75% to 97% less sediment was collected from passive samplers in all treated post rows compared with untreated soil, as shown for March 10, 2018.

The relatively high sediment load in the fabric treatment resulted from deposits of soil on top of the fabric during removal of plastic from raspberry beds. Similar to the March 10 rain event, we observed significantly lower sediment levels after other rains in all treated post rows compared with untreated rows. We also observed fewer erosion channels in treated post rows than in untreated plots at both sites during the trial. Besides the agronomic benefits, retaining soil in the field is also a good pesticide management practice because soil-adsorbed pesticides will stay in the field and not end up in receiving bodies of water. In a previous study, Mangiafico et al. showed that concentrations of the harmful insecticide chlorpyrifos in runoff were linearly related to sample turbidity. This suggests that retaining waterborne sediments on-site is an effective method for mitigating runoff of this pesticide. Preventing soil movement with these post row treatments may also reduce the costs of sediment removal from receiving waterways and the associated environmental impacts.

Phosphorus levels in the first flush of runoff samples were reduced by 24% to 85% in all treatments compared with untreated soil at Somis in 2018, except for PAM on Feb. 27, 2018. The lack of efficacy of PAM on that date may have resulted from deterioration of the PAM seal due to soil disturbance after PAM application and before runoff sample collection. At Somis in 2016–2017 and Santa Maria in 2018, we observed a similar reduction in phosphorus by all post row treatments compared with untreated soil. Since phosphorus is normally adsorbed to soil particles, reductions in turbidity and phosphorus in runoff samples from treated post rows followed a similar trend. Reducing losses of phosphorus from production fields may help prevent eutrophication in receiving waterways where this nutrient is limiting for algal growth.
Since tunnel post rows receive water and retain soil moisture, conditions are favorable for weed growth. At both locations weed barrier fabric provided nearly complete weed control with only occasional weed germination in areas where soil was deposited on the top of the fabric.

Application of PAM did not provide weed control, and weed densities in PAM-treated rows were similar to those in untreated plots. Yard waste mulch provided 81% to 90% weed control at Somis but did not control weeds on two out of three evaluation dates at Santa Maria. Mulch at Santa Maria was much finer than that at Somis and likely decomposed more rapidly, allowing weed growth. Barley cover crop provided 86% and 42% weed control on two evaluation dates at Somis, but after barley was reseeded, high germination of little mallow occurred. Incorporation of barley during reseeding likely disturbed hard-coated weed seeds sufficiently to break dormancy; however, mallow was controlled before seed production when barley was mowed in spring. Barley cover crop at Santa Maria provided 87% and 43% weed control on two out of three evaluation dates. At Somis in 2018, we observed 3.5 times more volunteer raspberry shoots in post rows with mulch compared with other treatments or untreated plots. Unlike weeds, raspberry shoots were able to penetrate mulch and establish, likely benefiting from the greater soil moisture content under it. These results show that weed barrier fabric, mulch and barley can effectively reduce weed control costs in raspberry tunnel post rows, but greater volunteer raspberry shoot management may be required if mulch is used.

Although the diversity and economic size of California's agricultural production may increase its resilience and resistance to perturbations such as urbanization, higher temperatures and increasing resource costs are forecast for the next 100 years, and there is great uncertainty as to how producers will respond to a changing climate both within California and globally. Producers may face significant challenges as regional temperatures, precipitation and weather pattern variability, and national and international markets are altered by global climate change.
As commodity prices depend on global production and demand, any assessment of the impacts of climate change on California agriculture must be done in the context of both regional and global changes in yields. The magnitude and direction of these yield changes will be determined by climatic factors such as temperature, precipitation and weather variability, and by production factors such as biotic responses to elevated atmospheric CO2 concentrations, the availability and application of nutrients, and the ability of producers to adapt to these changes. Furthermore, as global markets develop for carbon trading, opportunities may arise for California agricultural producers to mitigate greenhouse gases. Adjustments in global food and mitigation markets together will therefore significantly shape California agricultural producers' response to climate change. Moreover, since agriculture is not only of economic significance but also secures the livelihood of much of the world's population, impacts of climate change on food and farm security are of particular importance. Predicting yields in the coming century requires complex modeling that integrates global and regional climate change models, crop growth models and economic models, with the expectation that climate change will likely affect different regions of the world in distinct ways.