Risk management

USA - The Agricultural Credit Situation

Serious problems that developed in the agricultural credit situation in 2009 could escalate in 2010–2011. The earliest problems have occurred for lenders with loan concentrations in beef, dairy, hogs and poultry. Producers in all the protein sectors have suffered significant losses for over a year, resulting in a large increase in non-performing loans. Although there were few foreclosures in 2009, without a significant turnaround in income, many dairy and hog loans are in a near-crisis situation. Many producers have lost enough equity that their lenders will be forced to discontinue financing. The impact on the agricultural sector will be magnified by the fact that dairy and hog operations represent the majority of assets for many producers. If the livestock operation fails, all of the assets of the business will have to be liquidated, including the land base. In addition, most of the more successful dairy operations are not in a position where their lender will allow them to leverage up to purchase the assets of the operations that are liquidating, even at a significant discount.

While dairies tend to fail as individual businesses, many hog operations are contractually part of integrated supply chains. Some very well managed hog operations are going to be liquidated not because of their own performance, but because their integrator fails and the entire supply chain goes down with it. Crop producers have fared fairly well over the past several years, but many grain producers are likely to have carryover operating debt if they purchased inputs early in the year when input prices were high and then had a poor crop, or forward priced their crop after prices declined. Fortunately, most grain producers experienced a run of several years of above-normal income and pushed cash forward into 2009 for tax reasons. Now, however, margins going forward appear to be returning to normal levels.
An increase in the federal biofuels standard would provide a temporary benefit to grain producers, but it would further exacerbate the financial problems of livestock producers. Some borrowers will be able to restructure their loans using guarantees from USDA’s Farm Service Agency (FSA), but even this lender of last resort will require that borrowers demonstrate the ability to service the loan. Also, many confinement livestock operations have credit needs that far exceed FSA’s limits. The reality is that there has been little involuntary exit from agriculture in the last 4 or 5 years. Unfortunately, extended boom periods tend to be followed by a cleansing period of about 3 years, and a hangover effect can extend beyond that. It has been my observation that the half-life of the lessons learned from a financial crisis is about 10 years.

A good example is the importance that lenders place on profitability analysis. After the farm financial crisis of the 1980s, the Office of the Comptroller of the Currency found that 70 percent of agricultural banks were evaluating borrowers’ accrual-adjusted net income in 1990. By 1995, the number had dropped to 50 percent as conditions improved, memories faded and competition heated up. Remember that the function of a competitive market is to drive the economic return to the average producer to breakeven through supply and demand responses in both input and output markets. In equilibrium, the top-end producers are profitable and growing, the average are hanging in there, and the bottom end are losing money and being forced to exit the industry. Business success and survival depend on continuous improvement at a pace necessary to stay out of the back of the pack. The same is true for lenders. During boom periods, growth is strong, profits are increasing, loan losses are low and competition among lenders is intense.
As farm income deteriorates, loan problems begin to mount and commercial lenders begin to pull in their horns as they start experiencing loan losses and their lending staff gets bogged down in dealing with adverse credit. Fortunately, really serious industrywide problems occur only every 20 to 30 years. The last time was the farm financial crisis of the 1980s. Unfortunately, some signs point to a period of financial stress extending over several years. How severe the problems become will depend primarily on three factors: how soon net farm income rebounds, what happens to land values, and how soon and how much interest rates increase.

Although financial regulators and Congress are always more reactive than proactive, their actions significantly affect how lenders operate. The current administration is increasing the money supply and providing more liquidity in the financial markets, but commercial bank and Farm Credit System regulators are aggressively working to ensure that lenders recognize and mitigate risks. Also, the increase in bank failures and FDIC losses, in part because of higher limits on insured deposits, has prompted the FDIC to recommend that banks prepay their FDIC premiums for the next 3 years to rebuild the insurance fund. To mitigate risk, bank regulatory agencies are considering raising minimum capital standards. The Farm Credit Administration is also focusing on minimum capitalization and profitability levels for the Farm Credit Associations. The result will be that regardless of the underlying cost of funds, risk premiums and interest rate spreads will have to increase for all commercial lenders. A major regulatory change is that loan loss reserve requirements are now expected to be more forward looking and anticipatory and less dependent on recent history. Only a few years ago, regulators and accounting firms were criticizing lenders for excess loss reserves that were built to absorb longer-term cyclical downturns.
Some lenders that were required to roll out what were deemed “excess” reserves are now being criticized for either not being adequately reserved or needing higher capital levels to absorb shock events. The current climate has affected lender behavior. First, all lenders have less appetite for risk. This is being manifested in several ways:

Lenders are requiring more and better documentation of the information provided by borrowers, as well as closer monitoring of performance after loans are made.

There are fewer exceptions to underwriting standards.

Emphasis has increased on repayment capacity, including more analysis of accrual-adjusted net income rather than just cash-basis tax returns. The Farm Financial Standards Council has recognized for nearly two decades that cash-basis accounting can lag true profitability by 2 years or more in terms of both upturns and downturns.

Working capital and liquidity are also more important. Cash may be king, but a business can be making payments and going broke by refinancing, selling assets, building accounts payable and deferring the replacement of capital assets. So staying current on payments may not be enough by itself to keep borrowers’ loans out of trouble.

Repricing terms are shorter: a loan may be amortized over 15 or 20 years but repriced every 5 years. This practice is driven largely by the lender’s ability to match fund the maturity of the loan or to sell the loan in the secondary market.

Higher risk premiums are being built into interest rates. In part, this reflects that inadequate premiums had previously been priced into higher risk loans, often because of competition.

Advance rates are lower, such as requirements for higher down payments or equity.

More emphasis is being placed on borrowers’ risk management practices.
Borrowers with larger credit needs are having more difficulty getting loans larger than their primary lender’s hold position or legal lending limit, as potential participants are experiencing their own problems and demanding better documentation and quality. The use of FSA guarantees has increased significantly. Nationally, operating loan guarantee volume was up over 30 percent and mortgage guarantees up 9 percent. FSA actually ran out of funding for the operating loan guarantees in 2009. The funding for both programs has been increased by 20 percent for 2010, but the carryover from 2009 is going to have to come out of that as well. Borrowers who will require FSA assistance must start early. FSA’s staff will be swamped, and if demand increases as expected the funding could run out before the year is over.

These changes reinforce the importance of several factors. First, interest rates and debt structure can be as important as debt levels in terms of the impact of debt on producers’ financial performance, and rates are subject to changing much more rapidly. One of the current concerns in financial markets is the potential for increases in interest rates and inflation as a result of the increased federal debt and the creation of new entitlement programs. Interest rates are about as low as they can get, so the only way to go is up. If the economy rebounds and the private sector reenters the capital debt markets, there is a potential for a crowding-out effect. The federal debt will be issued and refinanced, but the rates are determined by the level of competition in the market. Recently, the federal government has had little competition. Since nearly 40 percent of the federal debt is held by foreign investors, the problem could be exacerbated if inflation occurs and the dollar is devalued, which would make U.S. Treasuries less attractive to those investors except at much higher rates.
The Federal Reserve could end up between the proverbial rock and a hard place, needing to raise interest rates to curb inflation, but at the same time not wanting to stall economic growth. While interest rates are likely to increase, they probably will not go up significantly in 2010. The current economic recovery doesn’t have enough legs under it, and I have not yet seen a new economic engine emerging.

Another increasingly important factor in the credit decision process is the borrower’s proven management ability. Lenders recognize that management is the primary determinant of success or failure, but it is also extremely hard to quantify in risk rating models. Many studies have found that the top quarter of producers in terms of profitability tend to be only about 5 percent better than average, whether in terms of costs, production or marketing. But they do it over and over again. By way of analogy, remember that the future Hall of Fame baseball player with a .300 lifetime batting average gets only 1 more hit every 20 times at bat than the player who hits .250 and just manages to hang on. Of the different management attributes, risk management ability will become more important in separating the winners from the losers as increased volatility gets priced into or pushed down the value chain. Among the most obvious examples are situations where grain elevators, merchandisers and fertilizer dealers either have changed their internal policies or are being limited by their own lenders in terms of the exposure they can take on futures contracts or inventories. If they can’t manage the risk for their customers through forward contracts, hedging, or prepurchasing inventory without prepayment, the price risk gets shifted to the producer. Another major issue for producers, lenders, suppliers and buyers of agricultural products is counterparty risk: the risk of the other party to a contract failing to keep their end of the agreement.
Bankruptcies by ethanol plants, processors, integrators, grain elevators, fertilizer dealers and others have left a number of producers with major losses and their lenders with new problem loans. As for loan losses on a broad scale, the ultimate impact on the financial health of the agricultural sector will be determined by, and reflected in, land values. The basic reason is that 87 percent of total farm assets are in real estate. With the increase in land values in recent years, the total debt:asset ratio for the agricultural sector is at historically low levels, but the number can be very deceiving. First, 70 percent of farm operations carry no debt. The use of credit is more concentrated among capital-intensive and larger operations that depend primarily on farm income for debt repayment. Most of the shift away from debt over the last 10 years has occurred in farms and ranches generating less than $500,000 in annual gross sales. Add to this the fact that 42 percent of land in farms is owned by non-operator landlords, and of the 58 percent owned by farm operators, 61.3 percent is owned by farmers with less than $250,000 in annual gross sales.

Because both the net worth and the underlying collateral for many farm loans, even operating loans, is real estate, and because the majority of farm debt and farm income is concentrated on commercial-scale farms and ranches, the value of land is critical to the risk of loss that agricultural lenders face in the event of default. The market value of land is determined at the margin, by the prices of the land actually sold. If farm income drops and debt servicing problems occur, forced sales will increase. If able buyers get nervous about reduced income prospects and believe land values could fall, they will sit on the sidelines. This would exacerbate the problem, and land values would fall even further. Changes in land values obviously aren’t evenly distributed. Land type, quality and location differ significantly, and so will the market impacts.
Declines in value have already been occurring in recreational and transitional land markets, and on marginal-quality agricultural land. If values fall by less than 10 percent from their peak, the impact will be minimal, but a 20 percent decline would result in significant problems, and more than 30 percent could result in a restructuring of the industry similar to the 1980s. One of the major problems is that declines in land values not only result in foreclosures and loan losses, they also decrease the market-value equity of all landowners. Unfortunately, markets tend to overreact on both the upside and the downside. Alan Greenspan referred to this response as irrational exuberance/fear and said that 80 percent of market economics is psychology.

Experience has shown that when it comes to predicting financial problems, the debt:income ratio is a much better leading indicator than the debt:asset ratio. For example, the charts at the end of the article show that the debt:income ratio indicated problems starting to develop in 1977, while the debt:asset ratio didn’t start reflecting anything negative until 1981. The question is whether the drop in net farm income from $87 billion in 2008 to a forecasted $54 billion in 2009 is an aberration or the beginning of an extended downturn. If net farm income remains below $60 billion in 2010 and 2011, there will be problems. If it falls below $50 billion, the problems will be serious.

Another source of credit problems is that over the past 10 years, some lenders have moved into new types and areas of lending where they had no previous experience or expertise. In good economic times, no problems were apparent, but as conditions deteriorate, the weaknesses are beginning to appear. Unfortunately, at this point it’s often too late.
Not only can loans in these areas jeopardize the lending institution, but performing borrowers also experience greater difficulty in getting their needs serviced as the lending institution moves to limit its risk, increases its spreads to offset losses, and its lending staff’s time becomes consumed by fighting fires. Obviously some areas and lenders are experiencing more problems than others, not necessarily because they are poorer lenders or their borrowers are poorer managers, but because of the nature of their market and location. Current examples include drought- or water-damaged areas, concentrated livestock areas and regions with large amounts of transitional property.

These changes point to several lessons for producers:

A lender’s request for more accurate and complete information should not be viewed as questioning the borrower’s character; it’s just good business practice.

Tougher credit standards will almost always follow new ownership or management of a lending institution. Such changes often indicate not that the new management is too demanding, but that the former management was too lax, which often explains why there is new ownership/management.

Many of the stricter credit standards being adopted by lenders can be directly attributed to legislation passed during the 1980s that provided for additional borrowers’ rights, mandatory debt restructuring, more liberalized bankruptcy laws and the increased threat of lender liability lawsuits. In response, lenders have been forced to be more selective about whom they finance. Because litigation usually arises from situations in which borrowers are highly leveraged or in financial trouble, it has become more difficult for higher risk borrowers to qualify for credit or for lenders to continue financing if a borrower’s financial situation deteriorates significantly.
Just as malpractice lawsuits have raised the cost of health care, the threat of legal action has changed the lending environment and caused lenders to become more cautious and conservative. Many lenders have felt the impact of the recession through losses on participations purchased in loans originated outside the state. There are several banks and farm credit associations where over half of their charge-offs in 2009, and a significant portion of their adverse credit volume, have resulted from loan participations. Loans on ethanol plants and commercial real estate are obvious problem areas. Others include loans for purposes outside the lender’s area of expertise and loans where lenders relied too heavily on the lead lender or financial rating services. Larger banks and farm credit associations that have defined benefit retirement programs are under additional stress to maintain profitability and rebuild capital. In some instances, 25 percent or more of their net earnings are being required to rebuild retirement funds that were depleted by the collapse of the financial markets.

The past 2 years in commodity, real estate and financial markets have made it abundantly clear that changes can occur quickly. We have also learned that Black Swan events are real: the tails of economic and financial distributions are fatter than a normal distribution assumes. Most risk models capture only “normal” periods, and that includes the rating services such as Moody’s and Dun & Bradstreet. This experience has several lessons that need to be heeded in the future:

Econometric models tend to be data dependent and backward looking. Boards and managers need to rely on judgment and experience and learn to look for leading indicators outside their immediate environment.

Total enterprise risk management is critical, but implementing it is both expensive and easier said than done.
Even the most sophisticated financial institutions are still basically silo risk managers.

Although linear trends are good indicators of behavior and performance, they seriously understate the potential rate of change created by the external environment, including the impact of technological change. Tipping points often cause exponential rather than linear changes, for both upturns and downturns.

Timing is critical, for getting in, expanding, cutting back or getting out. The studies I have seen and my own experience indicate that timing is the main difference that separates the top 10 percent from the rest of the top 25 percent of managers and businesses.

Employee and management incentive compensation systems need to be evaluated and redesigned. People ultimately do what they are incentivized to do. We have learned over time that incentives that focus too much on volume or cost minimization can be disasters. Even systems that focus on profitability have often failed to effectively factor in the risk:reward relationship, not just for the individual but also for the business. Prime examples are Wall Street trader bonus structures, the combining of commercial banking and investment banking enabled by the repeal of the Glass-Steagall Act, the lack of regulation of derivative markets, and executive golden parachutes that pay off even if the business is unsuccessful.
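The article's point that the debt:income ratio leads the debt:asset ratio as a stress indicator can be made concrete with a toy calculation. Only the $87 billion and $54 billion net farm income figures come from the article; the debt and asset totals below are hypothetical round numbers chosen purely for illustration:

```python
def debt_ratios(total_debt, net_farm_income, total_assets):
    """Return (debt:income, debt:asset) for a farm business or sector.

    Debt:income moves as soon as income falls; debt:asset moves only
    once asset (land) values are marked down, so it lags the downturn.
    """
    return total_debt / net_farm_income, total_debt / total_assets

# Hypothetical sector figures in billions of dollars; only the income
# numbers ($87B in 2008, $54B forecast for 2009) are from the article.
di_2008, da_2008 = debt_ratios(240, 87, 2000)
di_2009, da_2009 = debt_ratios(240, 54, 2000)
print(f"2008: debt:income {di_2008:.1f}, debt:asset {da_2008:.2f}")
print(f"2009: debt:income {di_2009:.1f}, debt:asset {da_2009:.2f}")
```

With land values held constant, the income drop pushes debt:income from about 2.8 to 4.4 while debt:asset sits unchanged at 0.12, which is exactly why a debt:asset ratio can look reassuring well into a downturn.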


Properly identify ear rot diseases

With several types of ear rot diseases appearing on Ohio’s corn crop, properly identifying them is important for producers to make decisions about feeding grain to livestock. “It is important to identify ear rot problems before harvest because some ear rot fungi produce mycotoxins that are harmful to livestock,” said Pierce Paul, an Ohio State University Extension plant pathologist with the Ohio Agricultural Research and Development Center. “After harvest it becomes much more difficult to tell what’s causing the problem.”

Weather conditions

Current weather conditions are favorable for ear rot development: wet weather late in the season, frost occurring before maturity, corn standing in the field for an extended period and delayed maturity. Other indicators of potential ear rot problems include bird and insect damage, hybrid susceptibility and ears drying down in an upright position.

Most common

Three of the most common fungal ear diseases in Ohio are Gibberella, Diplodia and Fusarium ear rots. “Gibberella ear rot is the most prevalent of the ear rots this year. However, we also have received reports of Diplodia ear rot in some fields,” said Paul. “It is fairly easy to tell ear rots apart in the field based on the color of the fungal growth on the ear, how the mold develops, and how the moldy kernels are distributed on the ear.” With Gibberella ear rot, the fungus enters the ear tips and leaves a pinkish mold on the kernels that progresses down from the tip toward the base of the ear. Gibberella ear rot develops best when cool temperatures and frequent rainfall occur during the three-week period after silk emergence. Like Gibberella ear rot, Fusarium ear rot also causes a pink discoloration on the kernels, but the disease develops best when warm, wet weather occurs during the two- to three-week period after silking.
In addition, the pink moldy kernels are usually scattered all over the ear, and as the disease develops, the infected kernels may become tan or brown or have white streaks.

Most different

Diplodia ear rot is the most different of the three diseases. It causes a thick white mold on the ear, usually starting from the base and progressing toward the tip. Infection can begin any time from before tasseling up to silking, and disease development is favored by wet weather and mild temperatures during early ear development. As the disease develops, the husk becomes bleached. Paul emphasized that symptoms of ear rot don’t always appear on the outside of the husk. “This is particularly true with late infections,” said Paul. “To determine if you have an ear rot problem, walk fields, strip back the husks of about 50 plants and look for telltale symptoms.”

Mycotoxins

The Gibberella ear rot fungus produces mycotoxins that are harmful to animals. These include deoxynivalenol (vomitoxin), zearalenone and T-2 toxin. “Suspect grain should be tested for these mycotoxins by chemical analysis before being fed to animals,” said Paul. “As a general rule, do not feed any grain with 5 percent or more Gibberella moldy kernels.” Diplodia ear rot is less of a concern from a mycotoxin standpoint. There have been no reports in Ohio of Diplodia producing mycotoxins that are harmful to animals, but animals do refuse to eat grain with high levels of Diplodia-damaged kernels. Additionally, severely affected grain has low nutritional value.

Source - www.farmanddairy.com
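Paul's 5 percent rule of thumb is straightforward to apply once a kernel sample has been counted. A minimal sketch, where the function name and the 200-kernel sample size are illustrative assumptions, not from the article:

```python
def needs_mycotoxin_test(moldy_kernels, total_kernels, threshold=0.05):
    """Apply the rule of thumb quoted above: grain with 5 percent or more
    Gibberella-moldy kernels should not be fed until a chemical mycotoxin
    analysis has been done. Returns True when the sample is at or above
    the threshold."""
    return moldy_kernels / total_kernels >= threshold

# e.g. 12 moldy kernels in a hypothetical 200-kernel sample is 6 percent
print(needs_mycotoxin_test(12, 200))  # True
print(needs_mycotoxin_test(6, 200))   # False: 3 percent, below the cutoff
```

A counted sample is only a screen; as the article stresses, suspect grain still needs chemical analysis before feeding.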


Sprinkler Irrigation Systems, Which to Choose

Sprinkler irrigation systems are rainfall-like methods of distributing water over soil. Water is pumped through a network of pipes to spray heads, which spray it into the air and break it up into tiny drops that fall to the soil. It is one of four basic irrigation methods; the other three are subsurface, surface (gravity) and trickle irrigation. Of the four, sprinkler irrigation systems are the most commonly used throughout the globe. Composed of one or many sprinklers, sprinkler irrigation systems deliver efficient coverage for both small and large areas, and they are adaptable to nearly all irrigable soils. Field crops such as wheat, gram, pulses, vegetables, cotton, soya bean, tea and coffee benefit from sprinkler irrigation, as do fodder crops. Sprinkler systems are also suitable for residential and industrial lawns as well as golf and race courses. Some common sprinkler irrigation systems are as follows:

Center pivot system
It is one of the most adaptable sprinkler irrigation systems. However, it is not suitable for irregularly shaped fields, long narrow fields, or fields containing obstructions such as trees, farmsteads or others.

Center pivot with corner system
With a corner arm attached, it can irrigate most of the corner areas that a conventional center pivot system hardly covers.

Linear move system
Sometimes called a lateral move, it is built the same way as a center pivot system, with moving towers and pipes connecting them. However, in a linear move system all towers move at the same speed and in the same direction. Due to its high installation cost, the linear move system is commonly used for high-value crops such as potatoes, vegetables and turf.
Traveling big gun system
It is particularly adaptable to various crop heights, variable travel speeds, odd-shaped fields and rough terrain. The big gun system requires a moderate installation cost, but more labor and higher operating pressures than center pivot and linear systems.

Side roll system
Sometimes called a wheel roll system, it is the most commonly chosen of the available sprinkler irrigation systems to irrigate an area from 60 to 90 feet wide. The sprinklers are mounted on weighted, swiveling connectors so that no matter where the side roll stops, the sprinklers will always be right side up. It is not recommended for slopes greater than 5 percent, and it should be used mainly on flat surfaces.

When selecting one or more sprinkler irrigation systems, the most important things to consider are:

The field shape and size
The field topography
The amount of time and labor required to operate the system
The types of crop you are going to grow

Source - agricultureguide.org


Hail Injury on Corn

Hail pounded various parts of Iowa over the last two weeks as storm systems continue to march aggressively across the state. Shredded, twisted corn lies in their paths (Figure 1). Corn across the state ranges from the 6th to perhaps the 10th leaf stage, which means that corn growing points extend above the ground, and damage to the growing point compromises recovery and yield.

Figure 1. Corn field damaged by June 14, 2009, hail storm. Photo on June 18 near Bloomfield, Iowa. R. Elmore photo.

Hail decreases yields by reducing stands as well as destroying leaves. The severity depends on the crop’s growth stage. Corn has an advantage over soybean early in the season when storms roll through, since corn’s growing point remains below ground until about the sixth-leaf stage. Young plants like this are not killed if only leaf or stem tissue is lost. Before we go on, though, remember that the staging method used by most agronomists differs from that employed by hail adjusters. The sixth-leaf stage of the ISU leaf-collar system correlates to the seventh-leaf stage used by hail adjusters. From V8 on (ISU system), the hail adjuster method is about two leaves ahead. We will use the ISU leaf-collar system in this discussion.

By destroying leaves and reducing stands, hail can wreak havoc on a corn crop, especially this late in the season. For corn, the growing point (the area of the plant where leaves and the tassel are initiated, located at the base of the stalk) is a visual indicator of the plant’s health. If the growing point is above ground and damaged by hail, the corn cannot recover. A healthy growing point is a white to cream color, whereas a brown growing point is a sign of death: it indicates the tissue has rotted. This can be seen by cutting the plant lengthwise. If your field has been hailed, wait it out for a couple of days; it takes four to five days of good weather for corn to recover.
Here are some points to keep in mind if your field has been hailed:

1) Patience. Call your crop insurance agent; hail adjusters are trained and equipped to assess hail damage losses. Wait at least three to five days after a hail storm to obtain an accurate damage appraisal, allowing recovery time for the plants.

2) Evaluate crop growth stage. Corn growth stage at the time of the storm is critical.
If the plant has less than six collared leaves, yield will rarely be affected. Expect re-growth; this is true regardless of the amount of defoliation if weather after the storm favors growth. As mentioned above, though, most Iowa corn has more than six collared leaves, so growing points are vulnerable.

3) Assess viable stands. Evaluate injured plants to determine the growing point’s viability. Use a sharp knife and cut lengthwise down the stem. The growing point of a healthy plant is white to cream colored. Plants with a healthy growing point should survive. Make assessments of plant survival three to five days after the storm, allowing plants time to recover. If weather is not conducive for plant growth for a prolonged period after the storm, assessing the remaining stand may require waiting up to a week. Some plants near Bloomfield damaged by the June 14 storm will not recover because of a rot that developed in the stalk. The rot was visible only when plants were cut lengthwise (Figure 2). Cool, damp weather following the storm discouraged rapid plant recovery, allowing the organisms to invade stems and destroying the plants’ opportunity to recover.

Figure 2. Corn plant damaged by hail on June 14, 2009, with base of stem cut lengthwise. Brown discoloration above the growing point will likely kill the plant’s growing point. Weather following the hail storm was not conducive for vigorous plant growth and recovery. Bloomfield, Iowa, June 18, 2009. R. Elmore photo.

4) Estimate yield losses from defoliation. As just mentioned, leaf loss or defoliation will rarely affect yield before the sixth-leaf stage. Plants with six leaves or more will experience yield losses depending on the extent of the defoliation; see Table 1.

Table 1. Estimated percent yield reduction from leaf loss caused by hail damage.

Corn can withstand a substantial loss of leaf area without major yield losses. A reduction in leaf area of less than 50 percent does not reduce yield if it occurs before V13.
For example, at V13 - thirteen collared leaves - a 50 percent reduction in leaf area reduces yield by only 10 percent. However, when the crop is tasseled, VT, yield is reduced by 31 percent. 5) Estimate yield losses from stand reductions. Stand loss may occur following significant hail storms. Small reductions in plant survival do not impact yields much; for example, a one-third reduction in stand will only reduce yield by 10 percent if it occurs before V8. Neighboring plants compensate to some extent for the lost plant. However, after V8, yields are reduced by the same amount that the stand is reduced: a one-third reduction in stand will reduce the yield potential by one-third. We are conducting research in conjunction with the crop insurance industry to determine if this is still valid with modern hybrids and management. Twisted-whorl plants - a.k.a. tied or buggy-whipped - may result from hail injury. A study on twisted-whorl plants by Thomison and Mangen at the Ohio State University found that fields with major hail damage exhibited 36 to 61 percent twisted whorls. One month later that number was reduced to 0 to 9 percent; most plants grew out of it. The site that had the largest yield losses experienced major stand losses. Once plant survival is established, use Table 2 to determine yield potential of the current stand based on the original planting date and plant population. Compare this to the yield potential of a replant. Replanting corn now is difficult to justify based on normal planting date responses. Table 2. Relative yield potential of corn by planting date and population. Note: Values based on preliminary Iowa research and modeling; 100% yield potential is estimated to occur with a 35,000 plant population and early planting. From: Iowa State University Extension, Corn Field Guide, CSI 001. 2009. Overall, remember the key is to assess plant viability thoroughly once plants have had a good chance to recover. 
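The stand-loss arithmetic above can be sketched in a few lines. This is an illustration only, not an official loss chart: the V8 threshold and the "one-third stand loss costs about 10 percent of yield" example come from the article, while the linear scaling between points is an assumption of the sketch.

```python
# Hedged sketch of the stand-loss rule described in the article.
# Before V8, neighboring plants largely compensate (roughly a 10%
# yield loss per one-third stand loss); from V8 on, yield falls
# one-for-one with the stand. Linear interpolation is assumed.

def stand_loss_yield_reduction(stand_reduction_pct: float, v_stage: int) -> float:
    """Estimated percent yield reduction from a hail-caused stand loss."""
    if v_stage < 8:
        # article's example: one-third stand loss -> ~10% yield loss
        return stand_reduction_pct * (10.0 / (100.0 / 3.0))
    # at V8 and later, yield loss is proportional to stand loss
    return stand_reduction_pct
```

For example, a one-third stand loss at V6 suggests only about a 10 percent yield hit, while the same loss at V10 costs about a third of the yield potential. An adjuster's full loss tables, not this sketch, should drive any replant decision.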
Contact your crop insurance company before destroying the crop or replanting. Roger W. Elmore and Lori Abendroth


Grasshopper Management Under Drought Conditions

Department of Entomology, North Dakota State University. DS-1-97, June 1997; revised May 2004. Cultivation of the soil at the proper time can be one of the most effective cultural practices available to farmers for the reduction of grasshopper populations. Tillage reduces grasshoppers by eliminating the green plants on which grasshoppers feed. Fields that are tilled in late summer or early fall will not attract grasshoppers for egg-laying activity. Grasshoppers seldom lay eggs in tilled fields even when a heavy covering of plant residue remains. Early spring tillage is a better option than fall tillage during seasons of drought for the obvious reasons of reducing wind erosion and trapping snow to maximize moisture for the next growing season. It is advisable to complete early spring tillage to eliminate all green growth before grasshopper egg hatching begins. If no food is available when they hatch, the young grasshoppers will starve because they can only move a short distance (about a yard) to locate food. Early spring tillage will also provide weed control to conserve moisture. Tillage for the sole purpose of physically destroying grasshopper egg pods or exposing them so that they dry out or are consumed by birds and other predators is of limited value. Again, the primary benefit of a well-timed tillage operation is to either reduce green plant material for attracting adult hoppers or to induce starvation by spring tillage. If grasshoppers are present when tillage operations begin, it will probably be impossible to achieve adequate control by simply eliminating green plant material in a field. Once grasshoppers have fed and developed to the second growth stage (2nd instar), they usually are mobile enough to move to adjacent crops for a new food supply. In these fields, trap strips could be used to "collect" or concentrate grasshoppers in a relatively small area. 
Then it is possible to control them quickly and economically using a minimum amount of insecticide. Trap strips can consist of narrow strips of weeds left during tillage or seeded small grain strips grown to concentrate adult hoppers. The disadvantage of this method is that the grower has to fallow the strip area a second year and utilize another method of control. Late summer or early fall cover crop seedings and winter wheat may be in jeopardy in areas where grasshoppers have been a problem during the growing season. As an example, flax seeded for a cover crop on fallow ground in areas of high hopper incidence could be lost within a few days of emergence. Winter wheat could also disappear rapidly to hungry adult hoppers. Thimet 20G systemic insecticide applied as a planting time treatment can give effective hopper control in wheat. Since the hoppers will be feeding into fields from the margins, normally only the margins for about 80 to 100 feet into the fields should require treatment. Green vegetation in fall stubble left for winter wheat cover should be controlled since it will draw hoppers into the field prior to planting winter wheat. Current recommendations for winter wheat include spraying a herbicide at least two weeks ahead of seeding to desiccate volunteer grain used as a food source by the wheat curl mite that transmits wheat streak mosaic. Early control of this fall regrowth has the additional benefit of reducing attractiveness of these fields to grasshoppers. High hopper populations in stubble left for winter wheat could make spring seeded grain a more practical alternative, especially in drought affected fields. Late seeded winter wheat is a second alternative since hoppers become lethargic and less active as fall approaches. 
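Because hoppers feed into fields from the margins, only a band around the field edge usually needs treatment, which greatly reduces the insecticide required. A quick illustrative calculation (the field dimensions are hypothetical; only the 80 to 100 foot margin guideline comes from the circular) shows how little of a field a margin-only treatment covers:

```python
# Illustrative sketch: fraction of a rectangular field covered by an
# edge-margin insecticide treatment. Field dimensions below are
# hypothetical examples; the 80-100 ft guideline is from the text.

def margin_treated_fraction(length_ft: float, width_ft: float,
                            margin_ft: float) -> float:
    """Treated-area fraction when only the outer margin is sprayed."""
    total = length_ft * width_ft
    # untreated interior rectangle (clamped at zero for small fields)
    inner_length = max(length_ft - 2.0 * margin_ft, 0.0)
    inner_width = max(width_ft - 2.0 * margin_ft, 0.0)
    return (total - inner_length * inner_width) / total

# A quarter-section field (2,640 ft on a side) with a 100 ft treated
# margin puts insecticide on only about 15% of the field's area.
```

The design point is simply that margin treatment cost scales with field perimeter rather than area, which is why the circular recommends it over whole-field spraying when hoppers are moving in from the edges.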
Additional Resources that can be Accessed through the Internet: North Dakota State University Extension Service: Insect Updates for North Dakota - a source of insect information for the region; NDSU Crop and Pest Report Newsletter - seasonal information for North Dakota agricultural production; NDSU IPM Field Crop Survey - grasshopper field surveys affecting cropland around the state; E-272 Grasshopper Biology and Management - NDSU Extension Circular (web version). North Dakota Department of Agriculture: Road Right of Way Grasshopper Control Laws and Considerations; North Dakota Laws Relating to Road Right of Way Grasshopper Control; Spray Program Considerations for Roadside Right of Way Grasshopper; NDDA Approval of Grasshopper Control Programs in Rights of Way Involving State Funds Along State Highways; Insecticides for Right-of-Way and Non-crop Areas. Other Resources: Map of the western United States illustrating the current Grasshopper and Mormon Cricket Outbreak and Survey Information - field surveys and mapping were done by USDA-APHIS-PPQ Western Region; 2004 National Rangeland Grasshopper Forecast Map; 2003 National Rangeland Grasshopper Forecast Map; 2002 National Adult Grasshopper Survey Map. Grasshoppers: Their Biology, Identification, and Management (USDA-ARS Northern Plains Agricultural Research Laboratory) - a comprehensive resource for information on grasshopper management that contains the latest research in grasshopper management, identification, ecology and control tactics, including: APHIS' Grasshopper Integrated Pest Management User Handbook - This handbook provides practical information on biological and chemical control methods; range management techniques; and environmental impacts. It incorporates decision support tools and an overview of grasshopper ecology, outbreaks and modeling. Field Guide to Common Western Grasshoppers - This handy guide provides an overview of grasshopper biology, anatomy, populations and life cycles. 
It also shows how to survey and collect grasshoppers. Grasshopper Species Fact Sheets - This collection of 56 species fact sheets features distribution maps and color photographs for each species, as well as information on their economic importance, identification and biology. Grasshopper Identification Tools - Identification guides, keys, and photographs to help identify over 90 species of grasshoppers. While there are more than 400 known species of grasshoppers in the Western United States, only about two dozen are considered pest species and a few are beneficial. Decision Support Software for Rangeland Grasshopper Management - Hopper 4.0 and CARMA 3.3 computer software provide advice on the best course of action when dealing with grasshopper outbreaks on rangeland. New Grazing Management Research - The latest preventative grasshopper management research on using grazing management to reduce the intensity and duration of grasshopper outbreaks. New Grasshopper Chemical Control Methods (RAATs) - The latest techniques for reducing application rates and costs up to 50% by alternating untreated swaths with treated swaths. RAATs maintain effective grasshopper control and reduce environmental impact. New Integrated Pest Management Research - Includes: South Dakota RAATs Demonstration Project Report, North Dakota IPM Demonstration Project Report, Preventative Grasshopper Management Brochure


Small Grain Diseases: Management of Those More Common and Severe in Dry Years

Small grain diseases that are more severe under dry soil conditions in North Dakota include wheat streak mosaic virus and common root rot.  These two diseases add stress to the small grain plant which is already under stress from lack of moisture and too much heat.  Wheat streak mosaic is a virus disease primarily attacking wheat crops, and is transmitted by wheat curl mites.  Common root rot is caused by a fungus ordinarily found in North Dakota soils, a fungus that may attack roots and crowns of wheat and barley.  Another fungal root disease, Fusarium crown rot, is not as frequently observed as common root rot, but under dry conditions, it also may cause damage on wheat, barley and oats. Wheat streak mosaic: Wheat streak mosaic virus is carried from plant to plant by wheat curl mites.  Wheat curl mites are very tiny (1/100 inch in length).  Their population generally increases rapidly under extended warm dry weather, and with high mite numbers, the chance of infection of susceptible crops becomes higher.  The mite transmits the virus during feeding and the virus infection causes yellow streaking of leaves, stunting of the plant, and reduced yields.  The mite needs a green bridge for survival; it frequently overwinters on winter wheat or perennial grassy weeds, and then moves from these plants to adjacent spring grains in the spring or early summer. Mites move more frequently during heat and drought stress, seeking green, healthy plant tissue. They move to the outer edge of leaves where they are positioned for easy movement in the wind.  The wind moves them to adjacent plants or fields, where mite transmission of the virus reoccurs.  The virus disease severity also is greater under drought stress because the plant has fewer nutritional and water resources to compensate for virus infection. 
Wheat streak mosaic management: Wheat streak mosaic is managed by two cultural practices: elimination of wheat volunteer plants and grassy weeds with herbicides (or tillage) at least two weeks prior to planting a new crop; and use of appropriate planting dates.  Each practice helps to break the green bridge needed for mite survival. Elimination of volunteers and grassy weeds: Roundup or other glyphosate-containing compounds may be used to destroy these hosts of the mites and virus.  This should be done two weeks prior to planting a new crop into that field or adjacent field.  If infected volunteers, grassy weeds, or even infected fields are left standing, they may be a serious source of infection.   Roundup or other glyphosate products act slowly to kill plants, and mites on these plants will move out of the infected volunteers for up to 10 days after herbicide treatment. Planting dates: Winter wheat should be planted in mid-September, or later in southern North Dakota counties.  This reduces risk of exposure to high mite populations dispersing from infected plants during August and early September.  Green corn also may be an important reservoir for the mite and virus, and planting winter wheat next to corn that is still green increases risk of movement of the mite from the corn into the winter wheat.  On the other hand, spring wheats should be planted early to escape infection prior to summer mite buildup. Severe damage is most likely if the crop is infected at a young growth stage. Additional information about wheat streak mosaic disease may be found in NDSU Extension Circular PP-646 “Wheat Streak Mosaic”. Common root rot: Common root rot is caused by a soil-borne fungus, Bipolaris sorokiniana. The common root rot fungus is widespread in North Dakota soils, causing root rot of wheat and barley.  Common root rot is characterized by a dark-brown discoloration of the roots, sub-crown internode, and often the crown.  
The disease interferes with water uptake and subsequently affects grain fill.  Moist soil and cool temperatures often allow wheat and barley plants to compensate for infected roots, and yield losses may not be noticed in a cool growing season.  However, early infection of roots can result in severe yield loss if the plant is subsequently exposed to high temperatures and dry soils.  A diseased root system can’t absorb enough water for the plant under these stress conditions. Common root rot management: Three strategies are generally used to combat common root rot:  crop rotation, variety choice, and seed treatment. Crop rotation: Crop rotation is the most effective method of reducing the risk of common root rot.  Non-host crops include broadleaf crops such as soybean, canola, dry beans, and flax.  Oats also are a good rotation crop if small grains must be grown.  Increasing the time between re-cropping wheat or barley also is effective; each additional year break from these two crops reduces the level of the fungus in the soil. Varieties: Varieties of spring wheat, durum, and barley differ in their susceptibility to common root rot.  The latest information on spring wheat variety susceptibility can be found in the NDSU Extension publication on spring wheat varieties “NDSU Extension Publication A-574". Seed treatment: Several currently registered seed treatments have suppression activity against the common root rot fungus in wheat and barley.  The most likely conditions where seed treatments would be beneficial are: under continuous wheat or barley production; short rotational cycles between wheat or barley; or in soils or areas where moisture stress is likely.  Further information on seed treatments for wheat and barley may be found in NDSU Extension Circular P-622, “ND Field Crop Fungicide Guide”. Fusarium root rot: Root rot caused by several fungi in the genus Fusarium may be associated with very dry soils or areas that get low annual precipitation.  
Fusarium root rot on wheat often is called dryland foot rot.  Fusarium root rots are characterized by a brown to reddish-brown discoloration of the roots and crowns.  Affected plants may be in patches, appearing as prematurely ripened plants.  As with common root rot, plants infected with Fusarium root rot cannot absorb enough water to carry plants through grain fill. Fusarium root rot management: Crop rotation and seed treatments help reduce the risk of Fusarium root rot. Crop rotation: All non-cereal crops are good rotation choices to reduce the risk of this disease.  As with common root rot, the longer break between cereal crops reduces the risk, as well. Seed treatment: Several currently registered seed treatment products have activity against root rot caused by Fusarium species.  Further information on seed treatments for this disease may be found in NDSU Extension Circular P-622, “ND Field Crop Fungicide Guide”. Additional information about common root rot and Fusarium dry rot may be found in NDSU Extension Circular PP-785 “Root and Crown Rots of Small Grains”. Marcia McMullen - Plant Pathologist
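The green-bridge timing rule for wheat streak mosaic above (eliminate volunteer wheat and grassy weeds at least two weeks before planting, since mites keep dispersing from glyphosate-treated plants for up to 10 days) reduces to a simple date check. The function below is an illustrative sketch, not an NDSU tool; only the two-week interval comes from the circular:

```python
from datetime import date, timedelta

# Sketch of the green-bridge timing rule: mites need living green
# hosts, so planting must come at least two weeks after the burndown
# that killed volunteers and grassy weeds.
GREEN_BRIDGE_BREAK = timedelta(days=14)

def planting_respects_green_bridge(burndown: date, planting: date) -> bool:
    """True if the planned planting date leaves the required
    two-week host-free gap after the herbicide application."""
    return planting - burndown >= GREEN_BRIDGE_BREAK
```

For example, a burndown on 1 September allows planting from 15 September on, while planting on 10 September would leave the green bridge intact and risk carrying mites and virus into the new crop.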


Sharing the cost of fighting animal disease

UK government is determined to press ahead with plans to make livestock producers share the cost of dealing with animal disease outbreaks. Producers will have to join an annual fee-based livestock registration scheme under proposals to establish a fighting fund to help pay for disease outbreaks. In deciding how to implement cost-sharing here, DEFRA has looked at the following schemes which operate overseas: Denmark Production-based levies in Denmark are focused on disease prevention rather than controlling outbreaks and any associated compensation. Levies are collected in accordance with Danish law following recommendations made by the agricultural sector. Some 14% of the money raised is then invested in the prevention and control of animal diseases. Levies are administered by levy boards for each agricultural sector. Each board includes representatives from the farming sector. Board membership, budgets and accounts are supervised by the Danish ministry of agriculture. Netherlands The Netherlands Animal Health Fund was created in the mid-1990s following a deal between the livestock sectors and the government. In effect, though, it is wholly funded by industry. The Dutch government believes disease control is an integral part of livestock production and so the costs should be borne by farmers. Money is raised in advance of any outbreak by means of a "peace time" levy. This funds disease surveillance and monitoring. A bank guarantee ensures financial obligations are honoured should the fund be called upon. Germany Cost-sharing in Germany is well-developed. In the case of notifiable diseases such as BSE or swine fever, a statutory compensation scheme refunds the value of livestock - as well as subsequent culling and rendering costs. But costs such as private veterinary fees, cleansing and disinfection and consequential losses are ineligible. 
The scheme is administered by each province (Länder) and financed equally by the government and industry through a species-specific levy. Levy rates are fixed annually according to need, and funds raised, including the reserve, are ring-fenced by species. Spain Livestock farmers in Spain can insure against animal disease through the Spanish National Agricultural Insurance Agency (ENESA) - a government agency. The range of insurable risks is set out in an annual agricultural insurance plan. The agency, which includes industry stakeholders, acts as arbiter in all disputes. Animal disease insurance products are provided by the private sector but are subsidised by the state - by some 37-43% - a level set by the agency. Farmers can take out insurance individually or through a co-operative or professional body. France Farm disease standards and disease freedom accreditation in France are set by private groups of farmers. Established in the early 1950s, these groups are recognised in French law as animal health bodies. The cost of belonging to an animal health body depends on livestock numbers. Compensation funds depend on accumulated contributions. In the case of foot-and-mouth, the French government pays compensation for animals slaughtered. Consequential losses due to the disease are covered by the animal health groups. But foot-and-mouth is the only disease covered this way. Ireland Since 1979, Ireland has operated an animal disease levy system for dairy cattle and cattle slaughtered or exported live. Money collected is used to contribute towards the compensation costs for the TB and brucellosis eradication schemes. Levy rates are determined on the basis of contributing about 50% towards compensation costs. The remaining compensation costs, testing, equipment purchases and other costs are paid by the Irish Department of Agriculture and Food, which seeks partial re-imbursement each year from Brussels. 
Australia Cost-sharing for emergency animal disease control was launched in Australia in 2002. Negotiated and signed by the government and industry, the agreement covers new and exotic diseases and some endemic diseases. Contributions vary according to disease, ranging from disease controls entirely funded by the government, to measures against severe outbreaks of known endemics which are 80% funded by industry. Ultimate accountability for cost-sharing resides with an Emergency Animal Disease National Management Group which takes decisions on policy and resource allocation issues during a disease outbreak. It includes government and industry representatives. DEFRA


New techniques keep disease on the back foot

Changing disease threats to UK livestock, as highlighted by the arrival of bluetongue last year, mean research into animal health problems is needed now more than ever, according to the Moredun Foundation's honorary president, John Cameron. "Present research topics include worm control - still one of the biggest costs to livestock producers - and Johne's disease, widely regarded as the biggest health problem facing the cattle industry," Mr Cameron told delegates at last week's Moredun open day. Worm vaccine Farmers could soon have a new weapon in the ongoing battle against worms, according to Moredun researcher David Smith. "Ongoing work on a vaccine against haemonchus, the barber's pole worm, is promising and could lead to the development of vaccines against a range of worms." "Although it is one of the most important parasites of sheep in the southern hemisphere, there are signs of increasing prevalence in the UK and it is thought this is in line with rising global temperatures," said Prof Smith. "Hopefully a vaccine will be in place before this worm becomes a real problem in the UK," he said. "And by using this parasite as a pathfinder, we will be able to use this model to adapt to other parasites such as Ostertagia in cattle." Haemonchus attaches to the abomasum and sucks blood, resulting in anaemic sheep. Because this worm feeds on blood, the molecules on the microvilli membrane have become appropriate vaccination targets. The vaccine had been trialled on Merinos in Australia, which have a high incidence of haemonchus infestation. Vaccinated animals showed a lower rate of infection, with fewer worm eggs than non-vaccinated sheep. On-going trials suggested the vaccine should be commercially feasible and Prof Smith hoped one may be available in the next five years. Drench resistance Unless anthelmintics were used sensibly they could soon be exhausted, Frank Jackson warned delegates. 
"And with rising temperatures extending the parasite season, control is needed now," he said. "In Scotland, 80% of lowland farms have benzimidazole (white drench) resistance and 30% ivermectin (yellow drench) resistance. But, even when you stop using a wormer family, resistance will still persist on individual farms for many years. "We need to slow the development of resistance, and this can be done by not exposing worms to anthelmintics unnecessarily by blanket-treating every animal of a class in a flock. However, when we have a means of identifying animals requiring treatment, then we could use a targeted selective treatment approach," he said. Targeted treatments currently being used included faecal egg counting; however, this required lab analysis, said Dr Jackson. Current research had been looking at liveweight gain as an indication of worm infestation. The thinking behind this was that early in infection appetite decreases, with a knock-on impact on liveweight gain. Nematodes alone could reduce feed intake by between 10% and 40% and reduce food use by up to 40%. "Electronic ear tags and automated shedding systems would mean large numbers of sheep could be handled with minimal labour input," said Dr Jackson. "Moredun is currently developing a model which predicts what weight lambs should be at a certain point in time and, using this system, could also calculate the efficiency of production. Preliminary tests show weight is an important indicator of worm infection; however, an issue requiring further research is how often animals need weighing." Scrapie Detecting sheep with scrapie was set to become much easier following the development of a new test for use on live sheep, explained vet research pathologist, Mark Dagleish. A test developed at Moredun could now be used to screen large numbers of live sheep at farm level, detecting infection up to one year before animals show any clinical signs. This allowed for large-scale sampling quickly and accurately, said Dr Dagleish. 
"The abnormal form of the prion protein (PrP) causing scrapie is detected in nervous tissue and lymph glands. The protein also accumulates in the recto-anal mucosa-associated lymphoid tissue (RAMALT), which is more easily accessed using a rectal speculum. "The speculum, inserted into the anus, exposes crypts where the abnormal prion protein is found. Forceps are then used to cut the crypt off. This part of the procedure is quick; the hardest part is catching the sheep and holding them in place on the bale," he said. A local anaesthetic was used around the anus, although Dr Dagleish said there were no health or welfare problems resulting from the procedure. Currently the procedure was being used purely for research, but Dr Dagleish said that if in future there were a move over to scrapie-resistant genotypes, this test would become more widely available, particularly since it detected 98.7% of clinically infected animals. Ovine pulmonary adenocarcinoma A disease still proving difficult to combat is the contagious viral disease, ovine pulmonary adenocarcinoma (OPA). The disease, also known as Jaagsiekte, has non-specific clinical symptoms including loss of condition, increasing breathing difficulties and, in some cases, discharge of fluid from the nostrils. The true number of affected flocks was unknown because people didn't want to admit they'd got it, but it was thought to be quite widespread, said research scientist Chris Cousens. Of 125 tested farms in Scotland, 61% of flocks had some clinical cases of OPA. And although several sheep in a flock may be affected, often only a single sheep was noticed at any one time, added Dr Cousens. "Many more animals are infected with Jaagsiekte sheep retrovirus (JSRV) than will develop the disease and some of the carriers will never get sick. It is thought the disease is transmitted by lung fluid; a teaspoonful of lung fluid contains 50m particles of JSRV. 
Respiratory route and also milk could be important transmissions routes," said Dr Cousens. "There is no treatment for affected animals, nor a vaccine to prevent infection," she added. "The current control strategy is regular inspection of adult sheep and testing of animals showing clinical symptoms. OPA can cause losses of 25% in a flock, so it is important to buy animals from reputable sources." Sarah Trickett


Growers can benefit from new technology that warns of potential pest and disease threats

Growers can benefit from new technology that warns of potential pest and disease threats. Mike Beardall reports. Weather stations and micro-climate monitoring have become major weapons in the battle against pests and diseases in crops. With the development of equipment linked to computers, accurate print-outs give growers precise models on which to base spraying programmes against pests and diseases in specific crops. Growers can now create condition models for individual crops and locations that will indicate when they are at risk and the optimum times to spray. Lancashire-based vegetable consultant Fred Tyler says: "Systems have never been more accurate in recording temperatures, leaf wetness and humidity, the three key components in forecasting the onset of disease. "In brassicas, particularly, we can measure conditions and put them against the model situation for diseases like ringspot, dark leaf spot, white blister and light leaf spot. "By spraying only when it is appropriate and at the most efficient time, growers can save a great deal of money through not wasting products - and disease prevention is at its most effective." Tyler has been working with models developed by Warwick HRI through the Morph system and using Smaartlog weather-monitoring stations and software, developed by Aardware Design of Walton-on-Thames, Surrey. Smaartlog offers a system that can be adapted and expanded to fit an individual grower's specific requirements, from a single-sensor standalone system to a 17-sensor system with remote communications. Weather data is downloaded and analysed in the software and can be exported to bespoke horticultural software. One of the system's designers, Graham Moss, says: "Growers have to have as much easy-to-use data as possible and be able to put it against disease and pest models. We have made this as up-to-the-minute as possible and results from growers across the country have shown it to be hitting the target." 
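The kind of rule such systems apply can be illustrated with a toy threshold model combining the three inputs Tyler names. All threshold values below are hypothetical placeholders, not figures from Morph, Smaartlog or any Warwick HRI model:

```python
# Toy disease-risk flag combining temperature, leaf wetness and
# humidity. Every threshold here is an illustrative assumption;
# real models are calibrated per disease and per crop.

def high_disease_risk(temp_c: float, leaf_wetness_hours: float,
                      rel_humidity_pct: float,
                      temp_range=(10.0, 25.0),
                      min_wetness_hours=6.0,
                      min_humidity=85.0) -> bool:
    """Flag a spray-decision alert when all three conditions coincide."""
    warm_enough = temp_range[0] <= temp_c <= temp_range[1]
    return (warm_enough
            and leaf_wetness_hours >= min_wetness_hours
            and rel_humidity_pct >= min_humidity)
```

The point of such models is the conjunction: a warm day alone, or a humid night alone, does not trigger a spray alert, which is how growers avoid spraying when conditions do not actually favour infection.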
From 1 July, brassica growers in Lincolnshire will receive alerts direct to their mobile phones of high disease-risk conditions, as part of a free service run by Syngenta. The Brassica Disease Warning System uses spore-trapping and disease-monitoring information from the Lincolnshire-based Allium & Brassica Centre, combined with in-field weather data and a predictive model developed by Warwick HRI. It provides a risk assessment for Alternaria, white blister and ringspot, based on field traps across Lincolnshire, the North East and South East. The forecasts are the result of extensive research and development undertaken by Warwick HRI, with funding from both Defra and the HDC. Information comes from seven monitoring sites across Lincolnshire. Allium & Brassica Centre joint managing director Andy Richardson says: "Advance warning of disease enables better timing of protective fungicides and the opportunity to produce the highest-quality brassicas." In onions, susceptible to downy mildew, growers use similar condition models to those in potato blight prediction, he adds. Potato growers use the Blightwatch network, through ADAS and the Met Office, which monitors atmospheric conditions in 650 post-coded areas of the UK and texts warnings to growers' mobile phones (see panel, p34). "Growers can be far more in control now," says Mark Bullen, specialist crops commercial manager at Syngenta, which has developed monitoring systems for growers that not only predict disease and pest attack but recommend the correct products to apply. He adds: "Knowledge is vital and puts the grower in a stronger position to make correct decisions about when to spray. "General weather data from the Met Office is useful, but information from specific fields and knowing about localised weather patterns, wind, airflow and the like means you can spray at the right time." 
Bullen believes that in the past five years or so, monitoring equipment has become far more efficient, and growers now have much greater control over when to treat crops. Agri-tech Services managing director Simon Turner agrees. "Growers should be in the driving seat with their own equipment," he says. "The grower puts statistics into the computer and this information is put against the model for the specific crop. "The results today are really accurate and microclimates can be monitored in small areas, rather than referring to statistics from some miles away." He adds: "There is a danger of relying on technology too much. Most growers are experienced enough to know their crops and local conditions. They will still use their own knowledge and observation, but monitoring equipment and computer predictions are terrific management tools." Turner's company has more than 200 clients using weather stations, mainly the Davis Vantage Pro. Nottinghamshire-based Fresh Growers chief executive Martin Evans, who is also chairman of the British Carrot Growers' Association, has used Dacon weather stations linked to computers for some years to help determine optimum times to spray against diseases and pests. But he is not entirely convinced that new technology holds all the answers, believing that it is the skill and experience of growers - plus "getting out on the ground" - that determines final decisions on spraying. "A 10ha field can have rainfall variations of as much as 80 per cent," says Evans. "So a rainfall monitor in one corner will only give you the reading for that area. Weather stations are fine, but it is a combination of the old skills and the new information that is used in the decision-making process. "You cannot rely on computers entirely to tell you when to spray. There is a lot of precision equipment around but there is a risk if you put your faith in it 100 per cent. 
Experience tells you when the risk is greatest and I think we can be hoodwinked into assuming there is a perfect pest-and-disease model for every situation." The 12-member Fresh Growers group has 8,000ha across Nottinghamshire and North Lincolnshire and grows eight per cent of Britain's carrots - along with 90 per cent of the country's Chantenay carrots - plus asparagus, sprouts, cabbages and lettuce. "We plan to have zero tolerance on chemical residues, and that means careful use of spraying," says Evans. "Forecasting and predicting is vital, but we have to be realistic and use local knowledge as well as scientific information." Dr Richard Harrington of Hertfordshire-based Rothamsted Research says monitoring and forecasting systems are becoming crucial for growers as climate warming continues. "The general implication is that, with climate warming, insect pests are set to be a greater problem for UK growers. In-crop inspections and pest-monitoring systems will have to be ready to cope with the threat, but can provide growers with the crucial information to take action early and effectively." Climate change will have a dramatic effect on vegetable and potato pest populations in the future, he adds. Mealy cabbage aphid can be expected to be much more of a problem, while the risk of cabbage root fly may increase, with each generation appearing earlier and three generations occurring in the few remaining areas and years that currently have two. Diamondback moth may also become more of a problem if it is able to overwinter in milder conditions, but the damage caused by turnip moth could be reduced if heavy spring rain becomes more normal, as this will kill larvae. Harrington also warns potato growers that it may be increasingly difficult to keep out Colorado beetle if conditions continue to warm up. 
"We anticipate first disease alerts for Alternaria and ringspot being triggered around the end of June or early July, but with many growers not starting fungicide programmes until August, disease can build within the crop," warns Richardson. "Starting spray programmes earlier and timing sprays in conjunction with a disease-forecasting system is the key to control of foliar diseases, particularly with fungicides such as Amistar." Syngenta specialist crop manager Bruce McKenzie adds: "The more help that growers can get from modern forecasting methods, the better they will be able to treat crops before problems start. Pest and disease warnings give growers extra support in making informed decisions." Crop protection: monitoring systems can help growers calculate the optimum time to spray crops, leading to minimal damage and the chance to produce the highest-quality harvest NEW PRODUCT US-based Spectrum Technologies recently launched "affordable solutions for growers" at Plumpton Agricultural College in East Sussex. The Watchdog 2000 atmospheric recording unit, which retails at around Pounds 1,000, uses wireless communication systems from field sensors to a computer to predict disease onset. Alerts, which can be linked to mobile phones, suggest optimum times to apply fungicides and pesticides. "The system monitors more than a dozen important growing conditions for effective crop and pest management," says Spectrum's international sales director David Lau. "The LCD display reviews historical data and can show severity indicators from onboard plant-disease models. This equipment gives growers a complete picture of environmental conditions." Distributed by Weather Front of Eastbourne - which also markets Davis weather stations under its EnviroMonitors banner - Watchdog 2000 includes a digital microscope that sends a picture of leaf and stem surfaces to a computer to be emailed for analysis. The system starts at around Pounds 250, including the software. 
"Knowledge is vital and puts the grower in a stronger position to make correct decisions about when to spray" Mark Bullen, specialist crops commercial manager, Syngenta BLIGHTWATCH Blightwatch, run by ADAS and the Met Office, offers potato growers a unique warning system of impending blight threat conditions. Growers in 650 monitored areas get a text alert to their mobile phones warning of the need to spray. While most potato growers monitor their own localised conditions with weather stations, the Blightwatch system has proved a valuable extra precaution. Danger times are "Smith periods" when at least two consecutive days have a minimum temperature of 10°C and on each day at least 11 hours have relative humidity greater than 90 per cent. Smith period conditions are conducive for sporulation of the potato blight pathogen on lesions - and leaf wetness is also necessary for infection to occur. If these periods occur at seven- to 10-day intervals, there is a greater chance of blight development. Smith periods are only an aid to decision support and do not in themselves always indicate the need for immediate application of a fungicide. The Met Office collects hourly weather information from a network of more than 100 stations in the UK. Traditionally, Smith periods for potato blight warnings have been calculated from these station values. "Smith periods are only one management tool and do not predict weather conditions on a field-by-field basis," says an ADAS representative. "At this level, the slope, aspect and effects of the local microclimate have an influence. This must be borne in mind when deciding on spray programmes." Spraying potatoes: Blightwatch system tells growers when to spray against blight Source - Horticulture Week
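The Smith period rule quoted in the Blightwatch panel is simple enough to express as a check over two days of hourly readings. The sketch below is a hypothetical illustration of that rule only (the function names and sample data are invented); it is not the Met Office's actual implementation, which draws on a network of more than 100 stations and, as ADAS cautions, still says nothing about field-level microclimate.

```python
# Hypothetical sketch of the Smith period rule: two consecutive days,
# each with a minimum temperature of at least 10 degrees C and at least
# 11 hours of relative humidity greater than 90 per cent.

def day_qualifies(hourly_temps_c, hourly_rh_pct):
    """A day qualifies if its minimum temperature is >= 10 C and
    at least 11 of its hourly readings show RH > 90%."""
    humid_hours = sum(1 for rh in hourly_rh_pct if rh > 90)
    return min(hourly_temps_c) >= 10 and humid_hours >= 11

def smith_period(day1, day2):
    """Two consecutive qualifying days constitute a Smith period."""
    return day_qualifies(*day1) and day_qualifies(*day2)

# Invented sample data: a warm, very humid day versus a dry one.
warm_humid_day = ([12.0] * 24, [95.0] * 12 + [80.0] * 12)
dry_day = ([12.0] * 24, [70.0] * 24)
print(smith_period(warm_humid_day, warm_humid_day))  # True
print(smith_period(warm_humid_day, dry_day))         # False
```

Even when this test fires, the panel notes it is only an aid to decision support, not in itself a command to spray.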


Farm Commodity Programs: Direct Payments, Counter-Cyclical Payments, and Marketing Loans

Federal law has authorized farm income and commodity price support programs for over 70 years. The 2002 farm bill authorizes the current programs for the 2002-2007 crop years. The payment framework combines direct payments of the 1996 farm bill with counter-cyclical payments of prior laws. Subsidies continue for wheat, feed grains, upland cotton, and rice, and soybeans and peanuts are added to the list of major crops. Dry peas, lentils, and chickpeas were added to the loan program, and wool, mohair, and honey were reinstated. The law continues to allow planting flexibility and does not require acreage reduction. Payments for each crop year can be spread out for up to 23 months. The Congressional Budget Office projects FY2005 outlays of $13.5 billion for farm commodity, conservation, and export programs, $8.4 billion of which are for the covered commodities and peanuts. Legislative Background Since the 1930s, federal law has required the U.S. Department of Agriculture (USDA) to offer price and income support to producers of certain farm commodities.1 Authority comes from three permanent laws: the Agricultural Adjustment Act of 1938 (P.L. 75-430), the Agricultural Act of 1949 (P.L. 81-439), and the Commodity Credit Corporation (CCC) Charter Act of 1948 (P.L. 80-806). Congress typically alters provisions in these laws through multiyear farm bills or appropriations acts. The most recent authorizing legislation, the Farm Security and Rural Investment Act of 2002 (P.L. 107-171, or the 2002 farm bill), was signed into law on May 13, 2002. This law temporarily suspends most provisions of the permanent laws. Title I contains provisions regarding farm income and commodity price support programs for the 2002-2007 crop years. It replaced the Federal Agriculture Improvement and Reform (FAIR) Act of 1996 (P.L. 104-127), including provisions for the 2002 crop year. Other titles in the law affect conservation, trade, nutrition, credit, rural development, and research. 
Eligible Commodities. This report covers grains, cotton, oilseeds, and peanuts. These commodities have similar rules, and generally account for about two-thirds of CCC outlays. Payments for dairy, sugar, and tobacco are outside the scope of this report.2 The 2002 farm bill defines two classes of commodities: “covered commodities” and “loan commodities.” Covered commodities include wheat, feed grains (corn, grain sorghum, barley, and oats), upland cotton, rice, soybeans, and other oilseeds (sunflower seed, rapeseed, canola, safflower, flaxseed, mustard seed, crambe, and sesame seed). Loan commodities include the covered commodities, plus extra long staple cotton, wool, mohair, honey, dry peas, lentils, and small chickpeas.3 Peanuts are classified separately, but receive the same types of payments as covered commodities.4 Eligible Producers. To receive payments, an individual must share in the risk of producing a crop and comply with conservation and planting flexibility rules. If a landlord receives a fixed cash rent, then the tenant bears all the risk and receives the government payment. Tenants might not benefit fully, though, if landlords raise cash rents or switch to share rental agreements. Agricultural economists widely agree that a large fraction of government payments passes through to landlords, and that government payments raise the price of land. About 60% of acres enrolled in the program are rented.5 Types of Payments Commodity program payments under the 2002 farm bill combine the direct payment framework of the 1996 farm bill with counter-cyclical payments of preceding laws. Depending on the crops farmers grow or have a history of planting, they can receive three types of payments: direct payments, counter-cyclical payments, and marketing loans. 
Each payment has an annual limit per farm or individual, but these limits, in practice, are not constraining because some large farms can be reorganized to meet the rules, or marketing loans can be repaid in such a way as to avoid the limits. Legislation was introduced in the 108th Congress (S. 667) to further restrict payment limits. For more information, see CRS Report RS21493, Payment Limits for Farm Commodity Programs. Direct Payments. The 1996 farm bill created production flexibility contract (PFC) payments, and the 2002 farm bill renamed them direct payments. These annual payments are unrelated to (decoupled from) current production or current market prices. The farmer is not obligated to grow the crop to receive a direct payment, and may plant any crop (with the exception of fruits and vegetables) without losing benefits.6 The 2002 farm bill preserves direct payments for wheat, feed grains, cotton, and rice, and extends them to previously uncovered soybeans, minor oilseeds, and peanuts. As with the prior law, the direct payment is based on 85% of the eligible “base acres” multiplied by the “direct payment yield” for each farm and the “payment rate” per unit (Table 1). For more information about crop bases and payment yields, see CRS Report RS21615, Updating Crop Bases and Yields Under the 2002 Farm Bill. The annual limit on direct payments is $40,000 per person, and can be doubled under certain rules. Counter-Cyclical Payments. Automatic payments when market prices are low were first implemented in 1973, but were discontinued in the 1996 farm bill. The 2002 farm bill reinstates such payments for grains and upland cotton and now extends them to soybeans, other oilseeds, and peanuts.7 Counter-cyclical payments, formerly called deficiency payments, make up for the difference between a crop’s target price and a lower season-average market price. The target price is a statutory benchmark defined in the farm bill. 
When market prices exceed the target price, no payment is made. As with direct payments, counter-cyclical payments are tied to a farm’s base acres and “counter-cyclical payment yield” and do not depend on current production. Thus, even though the counter-cyclical program depends on market prices, it does not require the farmer to market any of the relevant commodity. The annual limit on counter-cyclical payments is $65,000 per person and can be doubled under certain rules. Other payments are considered “counter-cyclical” also. For example, loan deficiency payments (described below) are counter-cyclical because they increase as prices fall. Marketing Loans and Loan Deficiency Payments. Commodity loans have long been part of farm policy, but the current form of marketing loans and loan deficiency payments (LDPs) began with the 1985 farm bill to keep the storage requirements of the loan program from distorting supply. Marketing loans are nonrecourse loans that farmers can obtain by pledging their harvested commodities as collateral. The loans provide interim financing by allowing farmers to receive some revenue for their crop when the loan is requested, while at the same time storing the commodity for later disposition when prices may be higher. LDPs are an alternative to taking out a loan, and allow farmers to market grain at any time in response to market signals while receiving the benefits of the loan program. The covered commodities and peanuts are eligible; extra long staple (ELS) cotton also is eligible, but not for LDPs. The 2002 farm bill reinstated loans for wool, mohair, and honey, and added dry peas, lentils, and small chickpeas. Marketing loans provide minimum price guarantees on the crop actually produced, unlike direct or counter-cyclical payments, which are tied to historical bases. 
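The contrast among the three payment types can be illustrated with a simplified calculation. All figures below are invented for illustration, and the counter-cyclical formula is reduced to the "target price minus season-average market price" description above; the statutory computation also nets out the direct payment rate and uses the higher of the market price and loan rate, so treat this strictly as a sketch, not the legal formula.

```python
# Simplified sketch of the three payment types. All rates, prices,
# and yields are hypothetical illustrative numbers.

def direct_payment(base_acres, dp_yield, payment_rate):
    # Paid on 85% of base acres times the farm's direct payment yield,
    # regardless of what is actually planted or current prices.
    return 0.85 * base_acres * dp_yield * payment_rate

def counter_cyclical_payment(base_acres, ccp_yield, target_price, market_price):
    # Simplified: pays the shortfall of the season-average price below
    # the target price; nothing when prices exceed the target.
    shortfall = max(0.0, target_price - market_price)
    return 0.85 * base_acres * ccp_yield * shortfall

def loan_deficiency_payment(actual_bushels, loan_rate, posted_price):
    # Paid on actual production when the local price is below the loan rate.
    return actual_bushels * max(0.0, loan_rate - posted_price)

# A hypothetical 500-acre corn base with a 130 bu/acre payment yield:
print(direct_payment(500, 130, 0.28))
print(counter_cyclical_payment(500, 130, 2.63, 2.40))
print(loan_deficiency_payment(500 * 150, 1.95, 1.80))
```

The distinction the report draws is visible in the arguments: direct and counter-cyclical payments take base acres and historical yields, while the loan deficiency payment takes actual production.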
The farm bill sets loan rates at the national level (Table 1), but USDA adjusts these to local loan rates to reflect spatial differences in markets, transportation, and other factors.8 The annual limit on marketing loan gains and LDPs is $75,000 per person, and this limit can be doubled under certain rules. However, gains from using commodity certificates or forfeiting commodities are not limited. Thus, the marketing loan program is effectively unlimited. Timing of Payments The farm bill establishes a payment schedule. Direct payments (DP) are made in two parts: a 50% advance payment in December, and the balance in the following October. Counter-cyclical payments (CCP) are made in three parts: a first payment in October of the year the crop is harvested of up to 35% of the projected payment, a second payment in the following February, and a final payment at the end of the marketing year. Thus, payments for the 2004 crop began in December 2003 with the advance direct payment, and will end by October 2005 with the final counter-cyclical payment. For tax deferral or other reasons, producers can elect not to receive advance or partial payments. Marketing loans are available anytime after the commodity is normally harvested until a specified date in the following calendar year (e.g., for corn, from fall harvest until May 31). Marketing loans mature nine months after a loan is obtained. Federal Spending on Commodity Programs The 2002 farm bill covers crop years 2002-2007. Given the timing of payments, federal outlays for these crop years will be made primarily in FY2003-2008. The Congressional Budget Office (CBO) periodically estimates a baseline for agricultural programs. These estimates account for projections of production, inventories, and prices. Jim Monke, Analyst in Agricultural Policy, Resources, Science, and Industry Division


Using crop insurance as a marketing tool

“How many of you who grow corn have already sold part of your 2007 crop, in light of these high corn markets?” By asking that question, Dr. Art Barnaby put corn growers on the spot right away during his recent presentation at the NDSU Extension Service Annual Crop Insurance Conference in Fargo. “More than likely, many of you haven't because of the following two reasons,” he continued. “First, you think the price may go higher and you are waiting for that to happen, or you are hesitant to contract bushels, even before the crop is planted.” But Barnaby, who is a professor of Ag Economics at Kansas State University, said there are management tools that growers can use to cover both situations. In the case of the possibility of the prices going higher, a grower can purchase options in the grain market, and to cover the bushels contracted, a producer can purchase crop insurance to cover that risk. In some areas, producers are beginning to look at Group Risk Plans (GRP) and Group Risk Income Protection (GRIP) for reducing risk, but he feels these programs need to be studied carefully before making a decision to base coverage on them. The GRP, according to the Risk Management Agency (RMA), uses a county index as the basis for determining a loss. When the county yield for the insured crop, as determined by the National Agricultural Statistics Service (NASS), drops below the trigger level chosen by the farmer, an indemnity is paid. Payments are not based on the individual farmer's loss records. Yield levels in this program are available for up to 90 percent of the expected county yield. GRP protection involves less paperwork and costs less than farm-level coverage; however, individual crop losses may not be covered if the county yield does not suffer a similar loss level. Under the GRIP plan, indemnity payments are made only when the average county revenue for the insured crop falls below the revenue level chosen by the farmer. 
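The county-index mechanics described above can be sketched in a few lines. The function and numbers are hypothetical illustrations, not RMA's actual rating logic:

```python
# Hypothetical sketch of a GRP-style county trigger. The farmer picks a
# coverage level (up to 90% of expected county yield); whether an
# indemnity is paid depends on the NASS county yield, not the
# individual farm's yield.

def grp_pays(expected_county_yield, coverage_level, actual_county_yield):
    trigger = expected_county_yield * coverage_level
    return actual_county_yield < trigger

# A county expected to make 140 bu/acre, insured at the 90% level:
print(grp_pays(140, 0.90, 120))  # county at 120, below the ~126 trigger -> True
print(grp_pays(140, 0.90, 130))  # county at 130, above the trigger -> False
```

Note the gap this creates: a farm with a total loss collects nothing if the county average holds above the trigger, which is exactly the lender concern raised below.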
Again, Barnaby points out, there is little protection for hail, wind, flood or other spot losses. There is no Prevented Planting or Re-Plant protection, and lenders should be concerned if a farmer elects this type of coverage since a grower can suffer a total loss and receive no payment. Also, the indemnity payments for GRP and GRIP coverage can often be delayed up to 10 months. Instead of going with the GRP or GRIP plans, Barnaby encourages farmers to develop a marketing plan that would sell crops in increments throughout the year and use options as a way of protecting from market price changes once contracts are made. “Many farmers do just fine by selling in increments and trying to get a high average rather than trying to pick the top of the market,” he said. “Oftentimes the more complicated marketing plans don't add a great deal, but no decision is also a decision. “Once a marketing plan is decided upon, put it in writing, since a written plan is more likely to be followed,” he continued. “And for those who are having a difficult time setting up a marketing plan, maybe they should turn that task over to their wife. Many who have tried this approach say it works fine.” Based on the current market conditions for corn, Barnaby is recommending that corn growers take out Revenue Assurance (RA) crop insurance policies with a Harvest Price Option. “An RA policy with a Harvest Price Option or Crop Revenue Coverage expands the marketing window up to two years before harvest,” he said. “Most farmers are more comfortable forward selling grain when the inventory is guaranteed and their indemnity payment is not delayed five to 10 months as is often the case for GRP and GRIP coverage.” Finally, since crop insurance has so many options and plans, each grower should make his crop insurance plans in conjunction with a trusted crop insurance agent, who is trained to help the producer make his way through the complex maze of Federal Crop Insurance. 
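Why a Harvest Price Option makes forward selling more comfortable can be seen in a simplified revenue guarantee. The formula and numbers below are illustrative only; actual RA and CRC policies involve approved yields, insurance units, and price limits. The key idea is that the guarantee uses the higher of the base and harvest prices, so bushels sold forward but lost to a production shortfall are indemnified near the price at which they must be replaced.

```python
# Simplified sketch of a revenue policy with a harvest price option.
# Numbers and formula are hypothetical illustrations, not policy terms.

def revenue_guarantee(aph_yield, coverage, base_price, harvest_price,
                      harvest_option=True):
    # With the option, the guarantee is valued at the higher of the
    # springtime base price and the harvest-time price.
    price = max(base_price, harvest_price) if harvest_option else base_price
    return aph_yield * coverage * price

def indemnity(guarantee, actual_yield, harvest_price):
    # Pays the shortfall of actual revenue below the guarantee.
    revenue = actual_yield * harvest_price
    return max(0.0, guarantee - revenue)

# 150 bu/acre history, 75% coverage; prices rally from $3.50 to $4.00
# while the crop comes in short at 90 bu/acre:
g = revenue_guarantee(150, 0.75, 3.50, 4.00)  # guarantee values bushels at $4.00
print(indemnity(g, 90, 4.00))  # prints 90.0 (per acre, in this illustration)
```

Without the option, the same shortfall would be indemnified at the lower $3.50 base price, leaving the grower short on forward-contracted bushels.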
DALE HILDEBRANT, Farm & Ranch Guide Source  - http://www.farmandranchguide.com


Drought and Disaster Reduction

Introduction Drought is a complex, slow-onset phenomenon that affects more people than any other natural hazard and results in serious economic, social, and environmental impacts in both developing and developed countries.  Drought conditions have been widespread in recent years in Europe, North Africa, the Mid-East and West Asian countries, India, China, North and Central America, and South America.  Recent droughts in Africa have placed more than 30 million people at risk for hunger and malnutrition.  In Ethiopia, an estimated 11 million people are at risk of experiencing food shortages as a result of a drought that has cut cereal production by 20 to 30 percent.  Afghanistan has experienced 4 consecutive years of drought in which precipitation has been 55 percent of the long-term average.  India is currently experiencing its worst drought in 15 years.  Twelve states have been declared drought areas and more than 300 million people are affected.  In Australia, drought has increased the nation’s trade deficit to its highest level in 2 years, cut wool production by 40 to 60 percent, cut summer harvests to their lowest levels in 20 years, and resulted in damages of A$300 million to residential properties because of cracking foundations. Since 1996, severe drought has occurred throughout large portions of North America, including the United States, Mexico, and Canada.  More than 5 consecutive drought years affected northern Mexico and the south-central and southwestern United States during this period and resulted in serious impacts and aggravated existing transboundary water conflicts between Mexico and the United States.  With more than 300 river basins currently being shared by 2 or more countries, drought conditions will continue to exacerbate international water conflicts. 
Growing concerns about a future increase in the frequency and severity of drought and mounting evidence of the expanding vulnerability of many countries to drought further underscore the importance of placing greater emphasis on pro-active drought policies and preparedness. One of the trends associated with recent drought events has been the growing complexity of drought impacts.  Past drought impacts have been linked most closely to the agricultural sector, for obvious reasons.  These impacts continue and are increasing in many countries as poor land use practices, rapidly increasing population, inappropriate government policies, environmental degradation, poverty, and civil conflicts undermine food production potential.  However, in many nations, particularly those characterized by more complex economies, the impacts of drought quickly ripple to other sectors as drought conditions extend for multiple seasons and years.  In the United States, for example, recent droughts have produced significant impacts on transportation, recreation and tourism, energy production, wildland and forest fires, and the environment, endangering the existence of animal and plant species and aggravating soil erosion.  In 2002, 50% of the United States was affected by moderate, severe, and extreme drought, and estimates of losses range from $11 to $20 billion. Drought conditions have continued into 2003 for most of the western United States, and impacts may be more severe than in 2002. In both developing and developed countries, the impacts of drought are often an indicator of nonsustainable land and water management practices, and drought assistance or relief provided by governments and donors often encourages land managers and others to continue these practices.  It is precisely these existing resource management practices that have often increased societal vulnerability to drought (i.e., exacerbated drought impacts). 
This often results in a decreased resilience of individuals and communities and an increased dependence on government.  One of the principal goals of drought policies and preparedness plans is to move societies away from the traditional approach of crisis management, which is reactive in nature, to a more pro-active, risk management approach.  The goal of risk management is to promote the adoption of preventative or risk-reducing measures and strategies that will mitigate the impacts of future drought events, thus reducing societal vulnerability.  This paradigm shift emphasizes preparedness, mitigation, and improved early warning systems (EWS) over emergency response and assistance measures. Drought is a creeping natural hazard that is a normal part of climate for virtually all regions of the world. Drought onset and end are often difficult to determine.  In fact, there is often widespread disagreement among scientists as to whether a drought exists and its level of severity.  Certainly this confusion explains, to some extent, the lack of progress in drought preparedness in most countries.  Drought early warning systems must be an integral part of preparedness plans, but to date most of these systems have been largely ineffective for a variety of reasons.  Improved coordination within and between levels of government and with regional and international organizations, as well as the application of new technologies and tools, provide unprecedented opportunities in drought monitoring that are proving to be effective in some settings.  It is essential that lessons learned in drought monitoring, planning, mitigation, and policy be shared between countries and regions in order for nations to reduce their vulnerability to drought in the immediate future. Drought Risk and Vulnerability Assessment Drought risk is a product of a region’s exposure to the natural hazard and its vulnerability to extended periods of water shortage.  
If nations and regions are to make progress in reducing the serious consequences of drought, they must improve their understanding of the hazard and the factors that influence vulnerability. The frequency of occurrence of meteorological drought at various levels of intensity and duration defines the drought hazard for drought-prone nations and regions.  It is critical for countries to better understand this hazard and how it varies temporally and spatially and establish comprehensive and integrated drought early warning systems that incorporate climate, soil, and water supply factors such as precipitation, temperature, soil moisture, snowpack, reservoir and lake levels, ground water levels, and streamflow.  It is also essential that we identify trends in temperature and precipitation amounts, changes in the seasonal distribution and intensity of precipitation events, and other changes in climate that might be helpful in understanding how the hazard may change on a temporal and spatial basis in the future. Vulnerability to drought is dynamic and influenced by a multitude of factors, including population growth and regional shifts in population, urbanization, technology, government policies, land use and other natural resource management practices, desertification processes that reduce the productivity of the natural resource base, water use trends, and increasing environmental awareness.  Individually, these factors are important because they may increase or decrease vulnerability.  Collectively, they can have a drastic effect on the types and magnitude of impacts associated with drought.  Thus, current concerns about the escalating impacts of drought may be the result of an increased frequency of drought events (i.e., meteorological drought), changes in vulnerability, or a combination of these elements. 
Drought Monitoring and Early Warning A recent report published by the World Meteorological Organization (WMO) documented the shortcomings and needs of drought early warning systems.  The shortcomings noted were:
· data networks – often of inadequate density and data quality for meteorological and hydrological monitoring, and lacking for other key climate and water supply parameters (e.g., soil water, stream gauges);
· data sharing – inadequate data sharing between government agencies and the high acquisition costs for these data by end users limit the application of data for drought research and for enhancing drought preparedness, mitigation, and response activities;
· early warning system products – data and information products are often not end-user friendly, and users are not trained in the application of this information in decision making;
· drought forecasts – unreliable seasonal forecasts and the lack of specificity of information provided by forecasts limit the use of this information by farmers and other weather-sensitive industries;
· drought monitoring tools – inadequate indices for detecting the early onset and end of drought, although the Standardized Precipitation Index is a new tool that is being widely used to detect the early emergence of drought in many countries;
· integrated drought/climate monitoring – drought monitoring systems should be integrated and based on multiple indicators to fully understand the magnitude, spatial extent, and impacts of drought;
· impact assessment methodology – the lack of an impact assessment methodology hinders impact estimates and the activation of mitigation and response programs;
· delivery systems – data and information on drought conditions, seasonal forecasts, and other products are often not delivered to users in a timely manner;
· global early warning systems – a global drought assessment product based on multiple key drought indicators or indices (e.g., SPI, satellite data) would be helpful to international organizations, NGOs, and others in detecting emerging drought areas and potential food deficit areas.
Drought Policy and Preparedness Drought-prone nations should develop national drought policies and preparedness plans that place emphasis on risk management rather than following the traditional approach of crisis management, where the emphasis is on reactive emergency response measures.  Crisis management decreases self-reliance and increases dependence on government and donors.  This approach has been ineffective because response is untimely (i.e., post-impact), poorly coordinated within and between levels of government and with donor organizations and NGOs, and poorly targeted to drought-stricken groups or areas.  Many governments and others now understand the fallacy of crisis management and are striving to learn how to employ proper risk management techniques to reduce societal vulnerability to drought and therefore lessen the impacts associated with future drought events. Developing vulnerability profiles for regions, communities, population groups, and others will provide critical information on who and what is at risk and why.  This information, when integrated into the planning process, can enhance the outcome of the process by identifying and prioritizing specific areas where progress can be made in risk management. In the past decade or so, drought policy and preparedness plans have received increasing attention from governments, international and regional organizations, and NGOs.  Simply stated, a national drought policy should establish a clear set of principles or operating guidelines to govern the management of drought and its impacts.  The policy should be consistent and equitable for all regions, population groups, and economic sectors and consistent with the goals of sustainable development.  The overriding principle of drought policy should be an emphasis on risk management through the application of preparedness and mitigation measures. 
Preparedness refers to pre-disaster activities designed to increase the level of readiness or improve operational and institutional capabilities for responding to a drought episode.  Mitigation actions, programs, or policies are implemented during and in advance of drought to reduce the degree of risk to human life, property, and productive capacity.  Emergency response will always be a part of drought management because it is unlikely that government and others can anticipate, avoid, or reduce all potential impacts through mitigation programs.  A future drought event may also exceed the “drought of record” and the capacity of a region to respond.   However, emergency response should be used sparingly and only if it is consistent with longer-term drought policy goals and objectives. A national drought policy should be directed toward reducing risk by developing better awareness and understanding of the drought hazard and the underlying causes of societal vulnerability. The principles of risk management can be promoted by building greater institutional capacity within countries through encouraging the improvement and application of seasonal and shorter-term forecasts, developing integrated monitoring and drought EWS and associated information delivery systems, developing preparedness plans at various levels of government, adopting mitigation actions and programs, and creating a safety net of emergency response programs that ensure timely and targeted relief. Drought Preparedness Networks: Stimulating Regional and Global Cooperation and Information Sharing As new technologies, tools, and methodologies become available and are subsequently adopted by drought-prone countries and regions, the importance of sharing this information and experience is paramount to future advances in drought preparedness.  One way to accomplish that goal is through development of a network of regional networks for drought preparedness.  
Such networks, relying heavily on the Internet for linking institutions within and between regions, will facilitate the exchange of information and experiences. The NDMC and International Drought Information Center (IDIC) at the University of Nebraska are working in partnership with ISDR and other key U.N. agencies, U.S. federal agencies, NGOs, and appropriate regional and national institutions to build a global drought preparedness network that will promote the concepts of drought preparedness and mitigation with the goal of building greater institutional capacity to cope with future episodes of drought. In essence, this global drought partnership will enhance current national and regional institutional capacities through expansion of the NDMC's drought information clearinghouse on the World Wide Web and by building regional drought preparedness networks. Working individually, many nations and regions will be unable to improve drought coping capacity. Collectively, working through global and regional partnerships, the goal of reducing the magnitude of economic, environmental, and social impacts associated with drought in the 21st century can be achieved. Drought EWS, automated data collection techniques, drought indices and indicators, triggers for mitigation and response actions, planning methodologies, drought policies, and mitigation actions and programs are just a few of the areas where interaction between countries and regions can expedite progress on drought preparedness.

Donald A. Wilhite - Director, National Drought Mitigation Center and International Drought Information Center, and Professor and Associate Director, School of Natural Resource Sciences, University of Nebraska, Lincoln, Nebraska 68583 U.S.A.


Satellite technology to help farmers predict soil moisture and crop yield

Australian farmers will soon be able to measure soil moisture in paddocks from data collected by a space satellite under a University of Melbourne, NASA and European Space Agency (ESA) experiment. Dr Jeff Walker from the Department of Civil and Environmental Engineering of the University of Melbourne is leading an international experiment (the National Airborne Field Experiment) to test and enhance satellite technology that will measure soil moisture levels in paddocks for Australian primary producers. “Using the space technology, farmers will be able to obtain predictions about soil moisture and crop yield out to seven days and three months. This will help them to make critical decisions about what to plant and when, and their likely crop yield,” Dr Walker said. “Our vision is that via the Internet, farmers will be able to download key information about current and future soil moisture in their paddocks, which has been generated from a combination of model predictions and satellite observations.” Using a small aircraft fitted with equipment similar to that of the satellite, the University of Melbourne-led research team aims to find out how to measure soil moisture up to one metre underground. The satellite technology currently measures only five centimetres below the Earth’s surface. Researchers on foot will be collecting ground measurements concurrently with the plane as it flies over the area, to help validate the aircraft’s measurements. The results of the experiments will contribute to the development of the first dedicated soil moisture satellite (SMOS - Soil Moisture and Ocean Salinity), to be launched by the ESA next year. “Water management for irrigation is a critical issue for farmers in Australia and the world,” Dr Walker said.
“The enhanced satellite technology will enable farmers to forecast crop yield, politicians to make drought declarations and monitor global climate change, and organisations like the Bureau of Meteorology to conduct flood forecasting and weather prediction,” he said. The three-week experiment is being conducted between 30 October and 22 November in Narrandera, 100 km west of Wagga, NSW. It is the second in a series of experiments to be conducted in Australia. International collaborators include the USA (NASA, USDA, Uni South Carolina), Canada (Env. Canada, Guelph Uni), France (CESBIO), The Netherlands (Wageningen University) and Australia (University of Melbourne, University of Newcastle, Flinders University, CSIRO, Charles Sturt University, NSW Dept of Primary Industries, NSW Dept of Natural Resources).

Source - http://uninews.unimelb.edu.au/
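The downscaling problem Dr Walker describes — extending a roughly five-centimetre surface retrieval toward root-zone depth — is often approached with a recursive exponential filter that blends each new surface observation into a slowly varying root-zone estimate. The sketch below illustrates that generic technique only; it is not the actual NAFE or SMOS processing chain, and the characteristic time `t_char` is an assumed tuning parameter.

```python
import math

def soil_water_index(surface_obs, t_char=20.0):
    """Recursive exponential filter: propagate surface soil-moisture
    retrievals (top ~5 cm) into a smoother root-zone estimate.

    surface_obs: iterable of (day, moisture) pairs, days ascending.
    t_char: assumed characteristic time in days (a tuning parameter).
    Returns a list of (day, root_zone_estimate) pairs.
    """
    out, swi, gain, prev_day = [], None, 1.0, None
    for day, value in surface_obs:
        if swi is None:
            swi = value  # initialise with the first retrieval
        else:
            # Update the gain for the elapsed time, then nudge the
            # root-zone estimate toward the new surface observation.
            gain = gain / (gain + math.exp(-(day - prev_day) / t_char))
            swi += gain * (value - swi)
        prev_day = day
        out.append((day, swi))
    return out
```

Because the filter damps short-lived surface wetting and drying, the root-zone series responds more slowly than the raw retrievals, which is the behaviour a deeper soil layer should show.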


Climate change in central Europe

The number of weather-related natural catastrophes is on the increase in Europe. There is growing evidence that this trend is already due in part to climate change. The latest IPCC report confirms that the climate is changing. The mean global temperature is constantly rising. It has increased by 0.74°C in the last 100 years and by no less than 0.13°C per decade in the last 50 years – double the rate for the 100-year period as a whole. The rate of increase has been even more pronounced in Europe, with a rise of 0.95°C in the last 100 years, and as much as 1.0°C in Germany, 1.1°C in Austria and 1.4°C in Switzerland. The rise in temperature has been even more pronounced in the last few decades. Munich Re was one of the first companies in the finance sector to draw attention to the problem, pointing out in a 1973 publication on flooding that the growing losses might be due to human-induced climate change. The 2007 IPCC (Intergovernmental Panel on Climate Change) report confirms the statements and warnings we have issued over the last three decades: it is more than 90% probable that climate-changing trace gases released into the atmosphere by human activity are the primary cause of the global increase in temperature. The IPCC also confirms our analyses indicating that climate change is already causing a greater number and higher intensity of weather extremes.

Climate change curbs economic growth

Sir Nicholas Stern’s report on the “Economics of Climate Change”, published in October 2006, addressed the financial impact of climate change. It predicted a reduction in annual global growth of at least 5%, or US$ 2,200bn, by the middle of this century. The Stern Review forecasts that the costs will be limited to 1% of annual global gross domestic product (US$ 445bn) provided we take corresponding action. Such action would enable us to remain below the critical dividing line of a 2°C increase in global average temperatures compared with pre-industrial levels.
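As a quick arithmetic cross-check, the two Stern Review cost figures quoted above imply nearly the same annual global output, so the percentages and dollar amounts are internally consistent:

```python
# Cross-check of the Stern Review figures: both percentages should
# imply roughly the same annual global output (GDP).
cost_unmitigated = 2_200e9  # US$: at least 5% of annual global output
cost_mitigated = 445e9      # US$: about 1% of annual global output

gdp_implied_by_5pct = cost_unmitigated / 0.05  # ~US$44.0tn
gdp_implied_by_1pct = cost_mitigated / 0.01    # ~US$44.5tn

# The two implied totals agree to within about 1%.
assert abs(gdp_implied_by_5pct - gdp_implied_by_1pct) / gdp_implied_by_1pct < 0.02
```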
This objective will only be achieved if CO2 eq concentrations can be stabilised at 445–535 ppm by 2050. Some increase is inevitable due to growing emissions from emerging countries such as China and India. It is also crucial that we finance steps to adapt to climate change impacts that can no longer be prevented. The insurance industry has a key role to play in this by providing solutions to deal with the financial losses.

Insurance aspects

Catastrophe losses have vastly increased for insurers in recent decades, five out of six natural catastrophes being triggered by weather extremes. These developments have prompted the need for a re-analysis of the risks posed by weather-related hazards in Europe.

Increasing loss potential

A winter storm can produce insured losses of over €50bn in central Europe. However, events on a local scale should not be underestimated either. Hailstorm scenarios can cause billions of insured losses in major cities. At the same time, the vulnerability of insured property is increasing due to structural changes, the use of new materials, and the extension of cover. As a result of selective underwriting, insured losses from floods remain low in central Europe, but the loss potential is growing because of the higher incidence of heavy precipitation, the development of exposed areas, and increasing concentrations of values in the areas at risk. Scenarios involving more than one country are likely to produce losses of several billion euros. Despite global warming, extreme winters like that of 2005/2006 remain a possibility. Heavy snowfall and prolonged cold spells during which the snow accumulates could cause major losses – as in Austria in 2006, when the insured market loss came to a substantial €300m. Munich Re considers deductibles an effective means by which insurers can manage the risks, since the burden is more widely shared.
Sublimits of up to 50% of the sum insured, on the other hand, are less efficient because, across the portfolio as a whole, they are much higher than the probable maximum loss Munich Re expects for natural hazards. However, sublimits can be useful when applied to individual risks.

Hail and windstorm insurance penetration high

Munich Re has examined various aspects of insurance in Germany, Italy, Austria, Switzerland, the Czech Republic, the Slovak Republic and Slovenia. Hail and storm insurance penetration in the central European countries reviewed is 80–100%, but the figure is generally much lower for flood risks. The high figure for storm and hail reflects the fact that these perils are often covered under the fire policy. Market penetration in Switzerland, where the insurance of natural hazard risks (except earthquake) is normally obligatory, is virtually 100%. In Austria, unlike the other countries analysed, the state has set up an emergency fund for exceptional natural hazard events, funded out of tax revenues. However, there is no legal entitlement to compensation.

State coverage

State help is primarily used to deal with flood risks and reduced agricultural yields. Reduced yields have increased insurers’ loss ratios considerably in some years, resulting in higher premiums for farmers. In central Europe, cover in Austria is among the most comprehensive. In addition to hail, policies also cover frost, windstorm, flood, drought, and prolonged rainfall at harvest time, premiums being subsidised by the state to the tune of 50%. The provision of blanket, state-subsidised multi-peril crop insurance necessitates a risk partnership between the agricultural sector, the insurance industry and the state.

Transferring risk to the capital markets

Alternative risk transfer methods are assuming greater importance, given the growing loss potential. The securitisation of catastrophe risks by the issue of catastrophe bonds on the capital markets has soared since Hurricane Katrina.
In 2006, around US$ 5bn worth of catastrophe risk bonds were issued, twice as much as in 2005. In all, catastrophe bonds currently amount to around US$ 15bn. Experts believe that, in the medium term, 20% of catastrophe risk capacity will be placed on the capital markets. In addition, catastrophe risks are traded using insurance derivatives. Thus, companies can protect themselves in the short term against the financial consequences of exceptional weather factors. It is now also possible to place risks with lower return periods (25–30 years), following an initial phase in which mainly top-layer risks were securitised. Munich Re uses catastrophe bonds to cover weather-related natural catastrophes (windstorm Europe, hurricane USA) and offers both insurance and derivative solutions tailored to the situation of the individual company. Munich Re’s Risk Trading Unit provides support and advice for clients on the transfer of risks to the capital markets.

Effects during the summer half-year

Heatwaves
— In the course of the 21st century, temperatures will continue to increase during the summer half-year: by 2.5–3.5°C in northern Germany, compared with the 1961–1990 average, and more than 3.5°C in southern Germany, the south-west Czech Republic, Austria, Switzerland, northern Italy and Slovenia. Many places will experience more heatwaves.
— Scenarios for Upper Austria indicate heat periods (temperatures remain at or above approximately 30°C for a minimum of 20 days) every two years on average, compared with every 20 years at present. There will also be significantly more dry periods – like heat periods, often associated with high-pressure conditions – in Upper Austria, Burgenland and Styria.
— According to the results of climate research, Switzerland, the Czech Republic, the Slovak Republic, Hungary, northern Italy and Slovenia can also expect more periods of heat and drought.
Precipitation
— Total precipitation in the summer half-year will decrease, but some events could be more intense, so that rainfall will be distributed over a smaller number of occurrences.
— One climate model shows that average July–September precipitation will be significantly less than in the period 1961–1990, the reduction being as much as 20–30% in parts of Austria, northern Italy, the Czech Republic, Switzerland and the Slovak Republic.
— Conversely, the heaviest 1% of five-day summer precipitation events will increase in parts of northern Italy and northern Switzerland and along a band comprising northern Austria and many parts of the Czech Republic, Poland and eastern Germany.
— Models indicate that a warmer climate could mean fewer low-pressure rain systems forming in the northern Mediterranean basin due to central European trough conditions. However, those precipitation events could become more intense.

Thunderstorms
— The tendency towards thunderstorm formation, with the risk of hail, strong gusts, tornadoes, flash floods and lightning, will increase to a varying extent, depending on the region. This trend has been confirmed by observations in the Swiss plateau region and in southwest Germany, where it has been directly measured during the past 30 years.
— The Swiss observations show that European weather conditions conducive to thunderstorms also became more frequent in the course of the 20th century. Consequently, local increases of this sort are also probable for other regions of central Europe, although difficult to substantiate at the present time. Climate models are still too approximate to indicate such effects in the projections. In the areas of northern Italy, Slovenia and southern Austria affected by moist air masses from the Adriatic, people will have to be prepared to deal with more severe thunderstorms bringing hail, heavy precipitation and strong gusts.
West European trough conditions
— Repeated bouts of extremely severe precipitation along the Mediterranean coast from southeast Spain to Italy and Slovenia cause floods between September and November. Torrential rain events in late summer and autumn are expected to become more frequent and intense in the Mediterranean countries.

Munich Re
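The deductible and sublimit mechanics discussed in the insurance section above can be sketched in a few lines of code. This is a simplified per-loss illustration of the general idea, not a representation of any actual Munich Re treaty terms:

```python
def insured_payout(loss, deductible=0.0, sublimit=None):
    """Insurer's payout on a single loss: the policyholder retains the
    deductible, and the remainder is capped at the sublimit if one applies."""
    payout = max(loss - deductible, 0.0)
    if sublimit is not None:
        payout = min(payout, sublimit)
    return payout

# A deductible shares the burden: the insured keeps the first slice of
# every loss, while a sublimit instead caps the insurer's exposure.
example_losses = [5.0, 40.0, 120.0]
with_deductible = [insured_payout(x, deductible=10.0) for x in example_losses]
# with_deductible == [0.0, 30.0, 110.0]
```

This also shows why a high sublimit is inefficient across a whole portfolio: if it sits far above the probable maximum loss, it rarely binds, whereas a deductible trims every claim.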


Greening most serious citrus disease

Citrus greening has been described as the world’s most serious citrus disease, and researchers are combining their efforts to help combat this threat to an important Florida industry. Citrus greening is a fatal disease that has never been successfully eradicated in other parts of the world. The disease, which slowly weakens and kills all types of citrus trees, causes fruit to become lopsided and taste bitter, making it unusable. Fruit does not develop the desired color, hence the name greening. There is no known cure for the disease, which is on the U.S. Department of Agriculture’s select list of threats to plants and wildlife regulated by the Agricultural Bioterrorism Protection Act. The disease does not harm people. The scientific name for citrus greening — huanglongbing — means “yellow shoot,” which describes the symptom of a bright yellow shoot that commonly occurs on a sector of infected trees. It is a serious disease of citrus because it affects all citrus cultivars and causes decline of trees. Citrus greening has seriously affected citrus production in a number of countries in Asia, Africa, the Indian subcontinent and the Arabian Peninsula, and was discovered in July 2004 in Brazil. Wherever the disease has appeared, citrus production has been compromised with the loss of millions of trees. It has not been reported in Australia or in the Mediterranean Basin. In August 2005, the disease was found in the south Florida areas of Homestead and Florida City. Since that time, multiple residential and commercial citrus sites have been found infected with citrus greening. The type found is the Asian form, which occurs in warm, low-altitude areas. According to a recent report published by the University of Florida Institute of Food and Agricultural Sciences (IFAS), the initial or early symptoms of citrus greening on leaves are vein yellowing and asymmetrical chlorosis referred to as a blotchy mottle.
The blotchy mottle symptom is the most diagnostic symptom of the disease. Leaves might be small and upright with a variety of chlorotic patterns that often resemble mineral deficiencies such as those of zinc, iron, manganese, calcium, sulfur and/or boron. Often some of the leaves may be totally devoid of green or with only islands of green spots. The blotchy mottle symptom also may be confused with other diseases such as stubborn, severe forms of citrus tristeza virus (CTV), Phytophthora root rot, water logging or citrus blight. Root systems of infected trees are often poorly developed, and new root growth may be suppressed. Citrus greening is difficult to manage, and continued production of citrus has proven difficult and expensive in areas where it is widespread. Since it is transmitted by the Asian citrus psyllid, which is well established in Florida, there is clearly a potential for the continued spread of the disease throughout Florida citrus. IFAS researchers say the use of clean budwood and certified healthy trees is essential in preventing the disease. Budwood sources and nursery stock should be protected from psyllid infestation by screened enclosures and the use of systemic insecticides such as imidacloprid. Some biological control of the psyllid is available, but the amount of psyllid control provided by introduced parasitoids has not been sufficient to limit disease spread. The Asian citrus psyllid, say the scientists, feeds on many rutaceous plant species. The psyllid has a preference for the landscape ornamental, orange jessamine (Murraya paniculata), which has recently been found to host the citrus greening bacterium in addition to the psyllid itself. Another rutaceous ornamental, Severinia buxifolia or orange boxwood, is also a host for the bacterium as well as the psyllid. Movement of these ornamentals is restricted under state compliance agreements, and they should not be moved from areas where the disease occurs.
Scouting for greening-infected trees should be done routinely so that infected trees can be removed. It has been suggested that scouting be conducted four times per year. The frequency of scouting may be higher in areas previously determined to have positive greening trees. Symptoms of citrus greening, say IFAS researchers, are the easiest to find from October to March. However, symptoms may be present at other times of the year. The current methods used to scout are walking, using all-terrain vehicles, and on platforms mounted on vehicles. Symptomatic tree numbers and the rows in which they are found should be marked with colored flagging tape, and GPS coordinates taken or the sites marked on a map to facilitate relocation and removal of these trees. In some cases, a greening PCR diagnosis test may be necessary to confirm the disease. Diagnosis of citrus greening may be difficult since some nutrient deficiency symptoms and other problems are often confused with some of the symptoms associated with greening. Greening-affected leaves may accumulate starch, and an iodine-based starch test can be used to assist in determining which leaves should be sent for PCR diagnosis. The iodine test alone is not used for greening diagnosis; however, it is useful in deciding which leaves should be sent for diagnosis. The procedure for the test can be found on the Citrus Research and Education Center greening Web site at the address listed below. Samples of suspected greening-infected trees may be sent in for PCR diagnosis to the Southern Gardens Diagnostic Laboratory or to the Southwest Florida REC in Immokalee beginning in the late spring of 2008. The procedures for submission of suspect samples for PCR testing are available at the following Web site: http://www.flcitrusmutual.com/content/docs/issues/canker/sg_samplingform.pdf Removing infected trees is the only way to ensure they will not serve as a source of the bacteria for psyllid acquisition and subsequent transmission.
Prior to removal, the infected tree should be treated with a foliar insecticide (such as Danitol, fenpropathrin) to kill all adult psyllids feeding on that tree. Failure to control these psyllids will result in the infected psyllids dispersing to new plants once the diseased tree is removed. Pruning of symptomatic limbs has been attempted in many countries to reduce the inoculum available to the psyllids. However, because the disease is systemic, pruning has not been successful since other parts of the tree may already be infected, but not yet symptomatic. Additionally, if a tree is still infected after pruning, the new flush produced will serve as a feeding site for adult psyllids to acquire the greening bacteria. The infected psyllids may then disperse to uninfected trees once the new flush hardens off. Integrated pest management strategies should focus on the following: use of disease-free nursery trees, reduction of the inoculum by frequent disease surveys, removal of symptomatic trees, and suppression of Asian citrus psyllid populations.

Paul L. Hollis, Farm Press Editorial Staff


How to Make Soybean Replant Decisions

Some soybean fields have been flooded by heavy rainfall. Others have been hit by hail. Hail damage early in the growing season often looks worse than it really is, while flood damage at that stage is often more detrimental. "But that doesn't mean you should ignore hail damage," says Palle Pedersen, Iowa State University Extension soybean agronomist. As soon as a bean plant emerges, the growing point, located between the cotyledons, is above ground, he explains. This makes soybeans particularly susceptible to damage from hail, frost, insects such as bean leaf beetle, or anything that cuts the plant off below the cotyledons early in its life.

Examine growing point carefully

"Consider a soybean plant dead if it is in the cotyledon stage and is cut off below the cotyledons, or if it is damaged by hail to such a degree that there is no remaining green leaf tissue or regrowth," says Pedersen. "The reason is that nutrients and food reserves in the cotyledons supply the needs of the young plant during emergence and for about seven to 10 days after emergence, or until about the V1 stage - one fully-developed trifoliate leaf." Cotyledons are the first photosynthetic organs of the soybean seedling and also are major contributors to seedling growth. That's why stand reductions are likely to follow hailstorms. After V1, photosynthesis by the developing leaves is adequate for the plant to sustain itself, he adds.

Estimate the plant population

Accurately estimating soybean plant population is important before making replant decisions. "Plant population should be based on an accurate stand count, along with factors such as yield potential of the existing stand, date of replanting and the real cost of replanting," says Pedersen. To evaluate the existing stand, you should look at uniformity of stand and overall health of plants.
Only some areas of the field may require replanting if the majority of the field seems to have enough viable plants remaining. Pedersen says it's important to wait several days (3 to 5) after a crop has been damaged or has emerged before replanting. Injury can look very serious the day after the hail damage event but recovery may be possible.

How low can your stand go?

Previous ISU studies show that a final stand as low as 73,000 plants per acre has consistently produced more than 90% of the yield of the optimum plant population. That is a little more than four plants per foot of row in 30-inch row spacing. That may not sound like a lot but it is, says Pedersen. The reason is that soybean plants can compensate for missing plants and reduced stands by branching out to make up for a thin stand. "Keep in mind the lower the stand count, the more weeds will become a problem due to less shading, especially later in the growing season. If a reduced stand is saved, weed control must be a top priority," he says.

Other problems associated

There are some secondary problems associated with flooding and hail damage. Disease may increase and consequently further reduce stands, since plants that have been damaged or wounded are more susceptible to infection from plant pathogens like Phytophthora root rot and Pythium. In addition, seed quality was a serious issue this year. Flooding and pathogens have a greater impact when poor quality seed is used than when the seed isn't mechanically damaged and is free of seedborne disease.

Watch for wounds on plants

"Soybean plants that have torn stems should be watched closely in the coming weeks for evidence of pathogen infection," says Pedersen. "Lesions around the base of the stem and plant wilting are often good indicators. If this is the case, it will be necessary to estimate the number of viable plants in the field again and make a decision concerning replanting."
However, it is difficult to assess this type of injury soon after flooding or a hail event. Thus, if the field has a history of pathogen problems and if it continues to rain, loss of wounded plants will probably increase, he says.

Make estimate of viable plants

"Remember, yields will not necessarily be reduced just because the plant stand has been reduced," says Pedersen. "When it is possible to get back into the fields, take plenty of time to visit each of your fields and make a good estimate of the number of viable plants in the stand where flooding or hail has occurred." A replant decision based on a quick look at a field may underestimate the existing plant population. "We recommend that you plant the original full season variety until June 20 in northern and central Iowa and until early July in southern Iowa," says Pedersen.

Rod Swoboda - rswoboda@farmprogress.com

More information on soybean replant decisions can be found at http://www.soybeanmanagement.info/.
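The stand counts the article calls for are commonly taken by counting plants along a row length equal to 1/1000 of an acre and multiplying by 1,000. The helper below sketches that standard conversion (43,560 sq ft per acre); it is a generic illustration, not ISU's specific scouting protocol:

```python
SQFT_PER_ACRE = 43_560

def thousandth_acre_row_ft(row_spacing_in):
    """Row length in feet whose area equals 1/1000 acre at this row spacing."""
    return SQFT_PER_ACRE / 1000 / (row_spacing_in / 12.0)

def plants_per_acre(count, row_length_ft, row_spacing_in):
    """Scale a plant count taken over any measured row length up to plants/acre."""
    row_ft_per_acre = SQFT_PER_ACRE / (row_spacing_in / 12.0)
    return count * row_ft_per_acre / row_length_ft

# For 30-inch rows, 1/1000 acre is about 17.4 ft of row, so 73 plants
# counted in that length correspond to roughly 73,000 plants per acre.
```

Counting several such lengths in different parts of the field and averaging gives a more reliable population estimate than a single count.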
