The use of computer-related technologies is nothing new in agriculture. Farmer-technologists began experimenting with mapping fields using the earliest commercially available global positioning data 20 or more years ago, and the term "precision agriculture" has firmly entered the language of data-enabled resource stewardship.
However, the integration of field-level data with global network capabilities, advances in hardware and software and, perhaps most importantly, societal imperatives surrounding pollution mitigation and animal welfare, has led to a new era of tools that not only make farmers' calculations more accurate and their lives less labor-intensive, but also portend better dissemination of data about agriculture's impact literally downstream.
In just the past five years, academic researchers have released scalable predictive models that help farmers fine-tune the amount of fertilizer they need to apply under given weather patterns, and create new land-use plans to reduce pollutants such as manure residue flowing off their land. Additionally, farm implement vendors have created computationally enabled tools such as robotic milking systems. These systems relieve farmers of the physical labor of setting their cows up in milking parlors (the cows, once trained, come to be milked when they are ready, improving their comfort and health), and can also report real-time cow health and milk quality data to business associates such as veterinarians and farm consultants anywhere in the world.
The advent of such tools does not mean farmers will be able to slack off, according to Francisco Rodriguez, marketing director of automatic milking and feeding technology for the North American offices of Tumba, Sweden-based DeLaval.
"The idea is to help progressive farmers, not lazy farmers," Rodriguez says. "It's our work to set the right expectations, because when you have a robot, the quantity of data you receive is a lot more than you had before, and that information has to be leveraged. Otherwise, you're not taking advantage of the reason to have a system like that."
Rodriguez's observations appear to hold true across a wide swath of agriculture. While not every farmer will need every data tool, the age of the "Internet of Everything" has clearly arrived for agriculture. Leveraging this data in context is a new imperative, and the tools are emerging to help farmers and their customers and partners to do so in concert.
Cornell University professor Harold van Es has long experience with data-driven agriculture. While there has been no shortage of relevant devices and databases in the past 20 years, van Es says systematically fitting the resulting data points together lagged far behind.
"The development of computerized data acquisition and control systems in agriculture, which happened 15 or 20 years ago, was really driven by engineers, equipment dealers, and people like that," van Es says. "The challenge was the equipment engineering was way ahead of the agronomic knowledge. So, sure, you had the ability to measure yield every second on your combine, but what were you going to do with the data? How did it allow you to make better decisions? Similarly, with the control systems, you could apply phosphorus or nitrogen at different rates on different parts of the field based on very minute scales – but how do you know how much to put where? That became the biggest challenge."
U.S. Department of Agriculture (USDA) soil research scientist Peter Vadas also says the type and amount of information available were often mismatched with where the need was greatest. For example, phosphorus loss, often the result of excess manure runoff, was measured and predicted primarily through a tool developed in the 1990s called the Phosphorus Index. Yet the index was adopted with different variables in different states, regardless of whether a waterway in one state ultimately drained into a watershed in another, with different soil characteristics. The tools available to front-line farmers, Vadas says, were either too simple and often inaccurate, like the index, or watershed-sized databases that were clearly overkill for a single farmer's needs.
Both van Es and Vadas have been at the forefront of developing new modeling tools that aim to deliver just the right amount of information at the field level.
The prospects for such tools are promising, and the reasons go far beyond farmers' profit and loss: as enterprises worldwide seek to reduce their impact on the environment, they are discovering in-house efforts can only go so far.
One example, van Es says, is American retail giant Wal-Mart; the company, he says, can install only so many solar panels on its warehouse roofs to decrease its carbon footprint. When Wal-Mart executives looked through the company's supply chain, they found over-fertilization was the single largest factor impacting the environment. As a result, Wal-Mart has enlisted its major suppliers to undertake fertilizer optimization efforts, which include the use of tools such as Adapt-N; an Ithaca-area test farm, they noted, could reduce greenhouse gas emissions by 2,450 tons annually by using the tool.
"What Wal-Mart is doing is very exciting because it may be a new paradigm of how we're going to be addressing some of these sustainability issues," van Es says. "Regulations have their limits. Now the private sector is saying, 'our customers and our board of directors want us to be sustainable,' and when a company as big as Wal-Mart says 'we want to do this,' everybody jumps. They have the deep pockets to make it happen. General Mills can't say to Wal-Mart, 'go take a hike.'"
Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.