Precision Ag Practices Can Buttress Conservation Efforts

First envisioned in the 1990s after the U.S. government relaxed restrictions on GPS signals, precision ag technology can help farmers reduce inputs such as fuel and fertilizer, saving money and helping the environment.

Over the last few decades, U.S. farmers have increasingly turned to precision agriculture practices to manage the application of a range of inputs, including seeds, fertilizer, and irrigation water. These practices lower production expenses while reducing the potential for fertilizer runoff into ground and surface water. A 2007 NRCS study found that reduced or targeted application of inputs and irrigation water can improve soil and water quality, benefiting the environment.

Satellite images of the Earth’s surface became available in the early 1970s. The first man-made satellites were launched in 1957, mainly to demonstrate the robustness of Russian and U.S. engineering and scientific advances. The U.S. Landsat satellites, first launched by NASA in 1972, were designed to observe and record remote images of the Earth’s surface. For the first few decades, these satellites did not provide resolution fine enough to be useful for agricultural purposes, especially at the individual farm level; initially, resolution could be no more precise than 33 feet. This limitation was imposed on U.S.-based satellites by the U.S. government for national security reasons: the fear was that accurate information from such images might assist foreign powers in identifying potential military targets in this country. The restrictions were lifted in the late 1990s to allow U.S. commercial satellite firms to compete better in an international market.

Dr. Pierre Robert, a soil scientist at the University of Minnesota, is credited as the ‘father of precision agriculture’ for his work in the late 1980s describing the concept of applying inputs at variable rates across fields to maximize production while minimizing input costs. He established the first Precision Agriculture Center in 1995 with funds from USDA’s Fund for Rural America grant program and conducted early research on variable rate application of fertilizer and herbicides. The first yield monitor was sold commercially in 1992 by Al Myers through his company, Ag Leader Technology. The idea of a variable rate fertilizer spreader was conceived by USDA scientist John Hummel in 1985, and the first such equipment was built and used in-field in 1987, guided by radio beacons and digitized soil maps.

Today, farmers can equip their combines with GPS receivers that pinpoint the machine’s location to within one meter. Satellite images captured in different light spectra, such as near infrared and microwave, can help farmers not only determine where their crops are having problems but also diagnose why. Combined with the ability to deliver precise amounts of fertilizer or herbicide to a given spot in a large field based on such a diagnosis, this technology helps farmers maximize yield and optimize their use of inputs.

Data from the U.S. Department of Agriculture’s (USDA) Agricultural Resource Management Survey (ARMS) showed that some form of precision agriculture was used on more than 70 percent of all U.S. corn acres in 2010 and soybean acres in 2012. This included use of a yield monitor on more than 60 percent of corn and soybean acres, but only about one-fifth of those acres benefited from the use of variable rate technology (VRT).

Such technology not only helps farmers reduce their use of fuel, fertilizer, and other chemical inputs, lowering production costs, but those savings also translate into reduced greenhouse gas emissions at an aggregate level. For example, NRCS estimates that if 10 percent of U.S. crops were planted using automatic guidance systems, fuel use would fall by 16 million gallons, herbicide use by 2 million quarts, and pesticide use by 4 million quarts. Automatic guidance can also reduce the number of trips across the field, reducing soil compaction.

Precision ag can also help farmers identify portions of their fields that have low innate productivity. Rather than farming those sub-field areas at a net loss, farmers could generate more revenue by installing conservation structures such as buffer strips, bioreactors, or pollinator habitat and receiving government payments for adopting those practices. Installing edge-of-field practices, especially on fields that border bodies of water, reduces nutrient runoff and improves water quality.

Recent research also suggests that improvements in the efficiency of nitrogen fertilizer applications can reduce emissions of nitrous oxide (N2O) from cropland. This gas is one of the main greenhouse gases emitted by agricultural activities in the United States, accounting for about half of direct agricultural emissions and five percent of total U.S. emissions as of 2020. Using VRT to apply fertilizer based on the specific needs of the crop at various locations within a field can align applied rates more accurately with expected nitrogen uptake, leaving less excess nitrogen subject to runoff or volatilization into the atmosphere as N2O. In a 2020 field trial by the University of Nebraska extension service, applying nitrogen at prescribed rates matched to the agronomic characteristics of different portions of the field, rather than at a flat rate across the entire field, reduced application by an average of 25 pounds per acre.
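For readers curious about the arithmetic behind such a trial, the sketch below compares a uniform (flat) nitrogen rate against simple per-zone prescriptions and reports the average per-acre saving. It is purely illustrative: the zone names, acreages, and rates are hypothetical, not data from the Nebraska trial or any agronomic model.

```python
# Illustrative comparison of flat-rate vs. variable-rate nitrogen application.
# All zones, acreages, and rates below are hypothetical examples.

FLAT_RATE_LBS_PER_ACRE = 180  # uniform rate applied across the whole field

# Hypothetical management zones: (description, acres, prescribed N rate in lbs/acre)
zones = [
    ("high-yield loam", 60, 190),
    ("average upland", 80, 175),
    ("low-yield sandy ridge", 40, 130),
]

total_acres = sum(acres for _, acres, _ in zones)
flat_total = FLAT_RATE_LBS_PER_ACRE * total_acres          # lbs of N, flat rate
vrt_total = sum(acres * rate for _, acres, rate in zones)  # lbs of N, per-zone rates

savings_per_acre = (flat_total - vrt_total) / total_acres
print(f"Flat-rate total:   {flat_total} lbs N")
print(f"VRT total:         {vrt_total} lbs N")
print(f"Average saving:    {savings_per_acre:.1f} lbs N per acre")
```

With these made-up numbers, the prescription raises the rate on the most productive ground and cuts it sharply on the poor ridge, so total nitrogen applied still drops relative to the flat rate.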

A 2021 USDA report on computer usage and ownership found that only 25 percent of American farmers were using precision agriculture technologies in their farming operations, with the highest adoption rates in Midwestern states dominated by row crops. The report’s definition included both cropping practices such as those described above and technologies such as robotic milking parlors and electronic tagging and monitoring of livestock. A 2016 USDA study found that larger cropping operations (2,900 acres or more) are twice as likely as their smaller counterparts to adopt such technology.
