The differential response of roots to nutritional patchiness is probably a consequence of complex nutrient-specific signal transduction pathways. To investigate the effects of heterogeneous root salinity and nutrient conditions, several split-root tomato experiments were conducted. Water uptake from the saline root-zone decreased dramatically within 8 h of treatment, in contrast to the non-saline root-zone, with a more pronounced effect when nutrients were provided only to the non-salinized root-zone. This reduction in water uptake did not correlate with decreased root growth, as the saline root-zone only showed significantly less root growth towards the end of the experiment. The rapidity and consistency of the decreased water uptake by roots in the saline zone, from treatment imposition through to Day 9, suggests that a primary physiological response was followed by a morphological response. To further explore the role of heterogeneous nutrient provision in root activity, complete nutrient solutions were selectively depleted of either N or K+ in the non-saline root half while the other root half received a saline, complete nutrient solution. These treatments provoked a ‘two-phase’ response. Immediately upon treatment application, the saline conditions imposed on one side of the roots dominated, decreasing water uptake by those roots. Subsequently, water uptake by the saline-treated, nutrient-supplied roots proportionally increased, probably in response to the nutrient deficiency induced by the omission of the nutrient on the non-saline side. This effect was marked when K+ was present only in the saline root half and slight in the case of N. The presence of K+ in the nutrient solution was the most important determinant of root activity even when it coincided with salinity, resulting in notably higher shoot tissue Na+ and Cl− concentrations when the sole source of K+ was supplied to the saline root volume.

One valuable tool for categorizing and quantifying genetic variation in salt tolerance has been to define crop relative yield responses in terms of a threshold salinity up to which yields are unaffected and a linear decrease in relative yield with increasing salinity thereafter.
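This threshold-slope formulation can be written as a simple piecewise-linear function (a Maas-Hoffman-type response). The sketch below is a minimal illustration; the threshold and slope values shown are illustrative assumptions, not parameters reported in this review.

```python
def relative_yield(ec_e, threshold, slope):
    """Threshold-slope (Maas-Hoffman-type) salt tolerance response.

    ec_e      : average root-zone salinity (ECe, dS/m)
    threshold : ECe up to which yield is assumed unaffected (dS/m)
    slope     : percentage yield decline per dS/m above the threshold
    Returns relative yield as a percentage (0-100).
    """
    if ec_e <= threshold:
        return 100.0
    return max(0.0, 100.0 - slope * (ec_e - threshold))

# Illustrative parameters only (roughly in the range of a moderately
# salt-sensitive crop; not values taken from this review).
for ec in (1.0, 2.5, 5.0, 10.0):
    print(ec, relative_yield(ec, threshold=2.5, slope=9.9))
```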

However, it is critical to recognize that these relationships have generally been presented in terms of variation in parameters such as ECe, or more occasionally EC1:5, which relate to the salinity of the soil. Yet it is not the salinity of the soil that affects plant growth but the salinity of the soil solution, and thus the ratio of salt to water in the soil. This means that the salinity stress on a plant can be doubled either by doubling the salt concentration in a soil or by halving its water content. Furthermore, as soils become drier, plant growth is increasingly affected by the negative matric potentials that develop because of the adhesion of water within soil pores. This view profoundly affects the whole idea of the heterogeneity of salinity stress in soils, because heterogeneity arises from variation in: the leaching effects of irrigation or rainfall on salt concentrations in the soil; the hydrating effects of irrigation or rainfall on soil water contents; the effects of surface soil evaporation, which increases salt concentrations by capillarity and decreases soil water contents; and/or the water extraction rates of roots and their ion uptake/exclusion capacity, which over time also influence ion and water abundances near the roots.

All irrigation water introduces salts to the system, and in regions with high evapotranspiration and low rainfall, traditional salinity management emphasizes deliberate leaching of salts away from the root-zone while avoiding elevation of the water table, to prevent damage to crops. Leaching is usually achieved by applying irrigation water in excess of crop evapotranspirational demands. The fraction of applied water that drains below the root-zone is referred to as the ‘leaching fraction’, and this value is used to coarsely gauge the extent of leaching. Larger leaching fractions generally result in larger zones of low soil water salinity, but may necessitate the disposal of large volumes of saline drainage water and may cause additional salinization through capillary rise of saline water if the water table is raised, as well as the environmental impacts of drainage water disposal. The leaching fraction needed to avoid yield loss is context-specific and depends on the crop, soil texture, climate, irrigation system and schedule, and the salinity of the irrigation water being used. Ayers and Westcot developed a simple approach to calculate the leaching requirement based on salt mass balance calculations.
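The salt mass balance reasoning behind such guidelines can be sketched in a few lines. The example below shows the steady-state relation between irrigation and drainage water salinity and the widely quoted FAO-29-style leaching requirement rule of thumb; the functional forms are standard textbook approximations and the numbers are illustrative assumptions, not values taken from this review.

```python
def steady_state_drainage_ec(ec_iw, leaching_fraction):
    """Steady-state salt balance: all applied salt is assumed to leave in the
    drainage water, so drainage EC equals the irrigation water EC concentrated
    by 1 / leaching fraction."""
    return ec_iw / leaching_fraction

def leaching_requirement(ec_iw, ec_e_threshold):
    """Approximate minimum leaching fraction needed to hold the average
    root-zone salinity at the crop threshold, using the commonly cited rule
    of thumb LR = ECiw / (5 * ECe - ECiw); the coefficient reflects an
    assumed root water-uptake distribution."""
    return ec_iw / (5.0 * ec_e_threshold - ec_iw)

# Illustrative example: irrigation water of 1.5 dS/m, crop threshold of 2.5 dS/m.
lr = leaching_requirement(ec_iw=1.5, ec_e_threshold=2.5)
print(round(lr, 2))                                 # ~0.14
print(round(steady_state_drainage_ec(1.5, lr), 1))  # resulting drainage EC, dS/m
```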

This approach estimates the leaching fraction required to keep the average root-zone salinity below the salinity threshold of the crop, assuming a specific root distribution and strictly vertical, continual water flow. Approaches like this neglect the spatial non-uniformity of irrigation water application as well as the temporal dynamics of irrigation and water uptake during the season, and they assume that the average root-zone salinity determines the impact of salinity on the crop. While the physical principles underlying salinity management have not changed since Ayers and Westcot developed these leaching guidelines, management goals have shifted over time to better recognize the environmental impacts of nutrient and salinity losses, and more advanced micro-irrigation and fertigation systems have been developed. This has given rise to both new challenges and new opportunities in managing salinity.

Challenge 1: Managing salinity under micro-irrigation systems.

Spatial patterns of salt accumulation are diverse and differ by irrigation system, with each system posing specific challenges for salinity management. In the simplest case, flood irrigation applies water uniformly across the whole surface. Here, the salinity distribution is approximately uniform in the horizontal direction, but a salinity gradient exists vertically. Assuming sufficient leaching, salinity increases with depth in these systems, and the uniform leaching of salts below the root-zone keeps salinity within the root-zone relatively homogeneous. In contrast, applying water to only part of the surface causes strong horizontal salinity heterogeneity, as in furrow irrigation and the more advanced micro-irrigation systems. Micro-irrigation aims to target water application to the root-zone, thereby improving water use efficiency by applying less water to regions of low root density and providing an opportunity to deliver water at a rate that matches crop demand. Flood and overhead sprinkler irrigation manage soil moisture and salt content at the field scale, while micro-irrigation approaches management at the root-zone scale. Targeted water application results in targeted leaching: micro-irrigation leaches salts from the zones that are rich in plant roots, whereas flood irrigation requires additional water to also leach salts from field zones between plants with low root density, making micro-irrigation more efficient than furrow or sprinkler irrigation for managing salinity. When drip and furrow irrigation were compared, drip irrigation sustained higher yields of salt-sensitive crops where saline groundwater was shallow, while using less water than furrow irrigation.
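The steady-state, strictly vertical calculation described above, and the associated increase of salinity with depth under uniform leaching, can be caricatured with a simple layer-by-layer salt balance. The 40-30-20-10 root water-uptake split is a common simplifying assumption, and the input values are illustrative only.

```python
def root_zone_salinity_profile(ec_iw, leaching_fraction,
                               uptake_fractions=(0.4, 0.3, 0.2, 0.1)):
    """Steady-state, strictly vertical salt balance down a layered root-zone.

    Water enters the top of the profile at EC ec_iw (relative depth of 1.0).
    Each layer removes its share of crop water uptake but no salt, so the
    remaining percolating water becomes progressively more concentrated.
    Returns the EC of water draining from the bottom of each layer (dS/m).
    """
    et_fraction = 1.0 - leaching_fraction    # share of applied water transpired
    flux = 1.0                               # downward water flux, relative units
    salt = ec_iw * flux                      # salt load carried downward
    profile = []
    for f in uptake_fractions:
        flux -= et_fraction * f              # uptake removes water, not salt
        profile.append(salt / flux)          # remaining water is more saline
    return profile

# Illustrative inputs: 1.5 dS/m irrigation water, 15% leaching fraction.
ecs = root_zone_salinity_profile(ec_iw=1.5, leaching_fraction=0.15)
print([round(ec, 2) for ec in ecs])          # EC increases with depth
print(round(sum(ecs) / len(ecs), 2))         # crude average root-zone salinity
```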

The economic incentive to install micro-irrigation systems is context-dependent, with the advantage of micro-irrigation over conventional irrigation becoming less clear when growing salt-tolerant crops or when irrigation water is abundant. Despite its potential to accumulate salts in the root-zone, even subsurface drip irrigation can have advantages over traditional irrigation for salinity management. While higher tomato yields justified the expense of installing a subsurface drip irrigation system in California, the same was not true of cotton, which remained lucrative with furrow irrigation, as such salt-tolerant crops tend to tolerate flood irrigation without yield loss provided that irrigation is applied pre-planting to avoid stand establishment losses.

In drip irrigation systems with strongly localized water application, salt is not only leached downwards; significant lateral water movement away from the drip emitter also leaches salt horizontally, resulting in salt accumulation at the fringes of the wetted volume. This leads to a strongly heterogeneous small-scale salt distribution in which soil salinity levels in the top 20 cm can vary by a factor of more than five within only 40 cm of horizontal distance. Although the extent of horizontal salt movement depends on soil texture and can be partially controlled by emitter spacing, salts concentrated between emitters near the surface under micro-irrigation generally have little opportunity to intrude into the root-zone in the absence of precipitation, owing to surface evaporation and irrigation. It is therefore recommended that crops be arranged close to emitters, where salinity is low, and that new lines be installed as close as possible to where old lines existed, to avoid the need for pre-season reclamation leaching.

Subsurface drip irrigation results in a different pattern of water flow and salinity accumulation. While water application at the soil surface causes salts to leach downward and outward from the water source, subsurface irrigation causes resident and applied salts to move upward by advection and accumulate above the dripline where plants are present. This accumulation pattern hinders the establishment of many row crops because germination is relatively sensitive to salt stress. Such production systems rely on pre-season rain, sprinkler or surface irrigation to leach salts to below the drip line, from where they can be leached further downward by subsurface irrigation. Shallow installation of subsurface drip lines is advantageous where sufficient pre-season rains occur, as irrigating the soil surface may then be avoided altogether. This issue can also be mechanically managed in processing tomato by adding soil to planting beds, followed by irrigation to accumulate salts in the uppermost zone of the bed, which is subsequently removed and placed in the furrow between rows, where very little horizontal salt movement occurs.

The strong localization of water application in drip irrigation calls into question the applicability of historical steady-state leaching models to micro-irrigation systems. These models insufficiently account for the highly local nature of micro-irrigation and underestimate both the local leaching fraction experienced by plants and the tolerable EC of irrigation water.

Adequate management of heterogeneous salinity patterns and localized leaching under drip or micro-sprinkler irrigation may allow sustainable crop production in soils that would otherwise be deemed too saline for that species. Using transient models such as the HYDRUS model has been suggested as an alternative. These models account for localized water application and changes in flow rates over time by explicitly simulating two-dimensional water and solute transport in the root-zone through the numerical solution of mechanistic models. However, although these models are very strong in depicting physical transport processes, they often oversimplify the description of the plant physiological processes governing water and solute uptake. For example, the HYDRUS model neglects the fact that the distribution of water uptake is also affected by nutrient concentrations. Moreover, even if it were possible to perfectly simulate the water, nutrient and salinity dynamics for a given scenario, it would still be unclear how the calculated heterogeneous salinity distribution would translate into plant performance. Incorporating current knowledge of plant responses to heterogeneous conditions might make these models more suitable for evaluating salinity management practices.

Challenge 2: How to simultaneously optimize N efficiency and minimize the impact of salinity.

The necessity of a leaching fraction for long-term salinity management is coupled with the issue of nutrient loss, especially of nitrate (NO3−), which exhibits a leaching potential similar to that of Cl−. Any practice designed to remove Na+ or Cl− from the root-zone probably also leaches NO3−. Although this is a common problem, few studies have addressed the integrated nature of salinity and nutrient management. While NO3− and Cl− are subject to very similar transport mechanisms and rates in the soil, their distributions can nevertheless be quite different, and high Na+ and Cl− concentrations do not necessarily coincide with high NO3− concentrations. This is because, in contrast to Na+ and Cl−, NO3− is preferentially taken up by plant roots, and because nitrogen fertilizer is deliberately added to the irrigation water during fertigation and is therefore to some degree independent of water application. Understanding crop nitrogen demands and responses to spatially localized nutrients and salinity may help manage fertigation systems to achieve the simultaneous goals of salinity leaching and minimal nitrate loss. By providing nutrients through fertigation in a manner that retains them in the low-salinity zone adjacent to the drip emitter, roots can avoid exploring the saline fringes of the wetted zones, thus reducing salt exposure. HYDRUS-based modelling suggests that high-frequency applications of small amounts of nitrate, timed towards the end of a fertigation event, can help retain NO3− in the root-zone adjacent to the irrigation source while allowing salt to be leached to the peripheral root-zone.
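As a conceptual illustration of why nitrate injected late in a fertigation event stays closer to the emitter than the bulk of the leached salts, the sketch below uses a crude one-dimensional piston-flow assumption: a solute parcel is displaced only by the water applied after it enters the soil. This is a deliberate caricature of the transport processes that HYDRUS-type models simulate mechanistically, and the parameter values are illustrative assumptions.

```python
def travel_distance(water_applied_after_injection_cm, theta=0.30):
    """Piston-flow approximation: a solute parcel moves with the wetting front
    only for the water applied after its injection. Displacement (cm) is that
    post-injection water depth divided by the volumetric water content theta."""
    return water_applied_after_injection_cm / theta

# Illustrative fertigation event: 3 cm of water applied at a constant rate.
event_depth_cm = 3.0

# Salts carried in the irrigation water enter throughout the event, so on
# average they are displaced by roughly half the event depth of water.
salt_mean_displacement = travel_distance(event_depth_cm / 2.0)

# Nitrate injected only during the final 10% of the event is displaced far less,
# leaving it near the emitter while earlier-applied salt is pushed outward.
late_no3_displacement = travel_distance(0.1 * event_depth_cm)

print(round(salt_mean_displacement, 1), "cm average displacement of event-long salts")
print(round(late_no3_displacement, 1), "cm displacement of late-pulse nitrate")
```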