Yet despite the benefits, technology in animal agriculture is often vilified.
By Jude Capper, Ph.D.
In 1611, European dairy cattle were imported into Jamestown, Va., and the fledgling U.S. dairy industry was formed. Since those first cattle arrived, the industry has made huge productivity gains.
The earliest recorded U.S. milk production data relates to a Jersey cow (Flora 13) that produced 511 lbs. of milk over 350 days in 1854. By contrast, USDA data shows average milk production per cow was 20,396 lbs./cow in 2008.
Advances in technology allow the dairy industry to produce more milk using fewer cows, and thus fewer resources, than in the past. Compared to dairying in the 1940s, we use 21% of the animals, 23% of the feed, 35% of the water and only 10% of the land to produce a gallon of milk.
Nonetheless, a lengthy time interval may occur between discovery of a new technology and its widespread adoption. For example, A.I. use improves productivity by increasing the rate of genetic improvement. However, a 40-year gap occurred between its first use in dairy cattle (1936) and its use (to some extent) in 90% of herds (1977). To date, the only technology that approached 100% adoption within a relatively short time was the bulk milk tank.
The accompanying graph shows trends in milk production per cow from 1960 to 2007 for the United States, Canada, an aggregate of the top six milk-producing countries in Europe (Netherlands, United Kingdom, Germany, France, Italy and Poland) and New Zealand. Although milk yields were somewhat similar back in 1960, the lines have diverged widely over time. The U.S. has shown the fastest rate of improvement; Canada and Europe are intermediate; and New Zealand production has remained fairly static.
Improvements in productivity for the U.S., Canada and Europe were made possible by advances in genetics, nutrition, management and animal health. However, differences in the rate of improvement may be partially explained by the attitude towards, and the adoption of, technology within the various regions. The U.S. is generally pro-technology, whereas Europe in particular takes a more hostile position.
Lower productivity increases the environmental impact of dairy production, as a greater number of animals are required to produce the same amount of milk. For every one animal within the 2007 U.S. dairy population, Canada required 1.1 animals, Europe required 1.4 animals and New Zealand required 2.4 animals to maintain the same level of production. This rise in animal numbers increases the amount of feed needed to maintain the population, thus increasing cropland, water, fertilizer and fossil fuel use.
This issue is not confined to regions that have highly developed dairy industries. In 2008, the Chinese government recommended that the dairy product intake of each citizen should increase from 3.5 oz./day to 10.6 oz./day. At current levels of milk production, this would require an additional 65 million dairy animals in China and a huge increase in resource (feed, cropland, water, etc.) use. If productivity were improved to current U.S. levels, the number of additional animals required would fall by two-thirds, to 23 million head.
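The arithmetic behind the last two paragraphs is the same: for a fixed milk supply, the number of animals required varies inversely with yield per cow. A minimal sketch of that scaling is below; the Chinese yield figure is an assumption back-calculated from the article's own numbers, not a published statistic.

```python
# Animals needed for a fixed milk supply scale inversely with yield per cow.
US_YIELD_LBS = 20_396   # 2008 U.S. average milk per cow (USDA, cited above)
CN_YIELD_LBS = 7_200    # assumed Chinese average, back-calculated from the article

def animals_at_new_yield(current_animals: float,
                         current_yield: float,
                         new_yield: float) -> float:
    """Animals required to produce the same total milk at a different yield."""
    total_milk = current_animals * current_yield
    return total_milk / new_yield

# 65 million extra animals at the assumed Chinese yield shrink to roughly
# 23 million head at the U.S. yield, matching the article's figure.
extra_at_us = animals_at_new_yield(65e6, CN_YIELD_LBS, US_YIELD_LBS)
print(round(extra_at_us / 1e6))  # → 23
```

The same inverse relationship underlies the regional comparison above: a region producing at half the U.S. yield per animal needs roughly twice the animals, and proportionally more feed, land and water, for the same milk output.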
It is paradoxical that in every other industry, technology is embraced as the path to future improvement, but in animal agriculture, the use of productivity-enhancing technologies is actively campaigned against by activists. Prohibiting technology use reduces productivity and increases both resource use and environmental impact. As the population increases and resources continue to dwindle, the role of technology in improving productivity and helping to supply consumers with safe, affordable, nutritious dairy products should be celebrated, not vilified.
■ Jude Capper, Ph.D. is an assistant professor of dairy sciences at Washington State University. Her current research includes investigating the environmental impact of dairy products produced through differing on-farm management practices. Contact her via phone: 509-335-6192; or e-mail: email@example.com.