NewsReport Online
ENVIRONMENT & RESOURCES

Regulating Biotech Crops

It has been 25 years since the National Academy of Sciences held its now-historic Asilomar conference, where a group of scientists, lawyers, and government officials proposed new ground rules for the regulation of recombinant DNA technology, a laboratory tool that made genetic engineering a reality. Since then, the National Academies have continued to offer guidance on the safety of biotechnology.

Most recently, the National Research Council issued the results of a yearlong study examining the health and environmental risks posed by plants genetically engineered to resist pests, and how the federal government regulates these plants.

The committee that wrote the report found no evidence suggesting that foods on the market today are unsafe to eat as a result of genetic modification. But it also said the federal agencies that regulate them should do a better job of coordinating their work as pest-protected plants become more popular with farmers. In 1999 alone, more than 70 million acres of transgenic crops — such as corn, cotton, and soybeans — were planted in the United States.

To enhance the credibility of the regulatory process, the agencies also should expand public access to the scientific data on which regulations are based, the committee said. Ultimately, the public’s acceptance of genetically modified foods rests on its confidence in the system.

Only in very rare circumstances have pest-protected plants caused obvious health or environmental problems, the committee found. And it emphasized that no strict distinction exists between the risks posed by plants genetically engineered through modern molecular techniques and those modified by conventional breeding practices. This means regulators should focus on the individual properties of a plant, not the method by which it was bred.

Among the ecological concerns examined is the possibility that transgenic plants could harm so-called nontarget species, such as beneficial insects. The committee said, however, that any such harm is likely to be smaller than that caused by chemical pesticides. Even so, more research is needed in this area. For example, a highly publicized laboratory study showed that monarch butterfly larvae suffered adverse health effects or died after being fed pollen from corn that had been genetically engineered to produce an insecticide known as Bt. The issue needs to be studied further in the field, where pollen densities might be lower and the toxin might be deactivated by environmental factors.

Likewise, more research is needed to assess how pest-resistance genes might spread to weedy relatives, possibly exacerbating weed problems, the committee said. In addition, scientists need to further evaluate the ability of pests to evolve resistance to plants that have been genetically modified to kill them. — Bill Kearney

Genetically Modified Pest-Protected Plants: Science and Regulation. Committee on Genetically Modified Pest-Protected Plants, Board on Agriculture and Natural Resources and Commission on Life Sciences (2000, approx. 315 pp.; ISBN 0-309-06930-0; pre-publication copies available from National Academy Press, tel. 1-800-624-6242; $50.00 plus $4.50 shipping for single copies).

The committee was chaired by Perry Adkisson, distinguished professor emeritus and chancellor emeritus, Texas A&M University, College Station. The study was funded by the National Research Council.


Power to Spare

High gas prices are a stark reminder of this country’s need for cheap, secure, environmentally friendly energy sources. The federal government recognized this back in 1974, when the oil crisis had worried drivers lining up in droves at the pumps. Part of the federal response was to invest in renewable energy, spurring development of technologies that could produce electricity from sources that would, for all practical purposes, never run out. Examples include power derived from crop wastes, geothermal heat from within the Earth, sunlight, and wind.

While this program of research and development has led to better, more affordable technologies, the rate at which they have been put into commercial use has not met expectations, says a new report from the National Research Council. The report, requested by the U.S. Department of Energy’s Office of Power Technologies (OPT) itself, reviews the effectiveness of the office’s efforts.

A big part of OPT’s problem in guiding these new technologies into commercial use stems from an earlier lack of strategic focus, the report concludes. In setting its goals, the office did not adequately account for barriers to selling electricity produced by alternative methods, or for major changes within the electricity industry. Big utility companies — thought to be the primary customers — are more likely to stick with traditional fossil fuels as long as they are cheaper. One possible strategy would be to seek out niche markets that are not well served by existing power grids, particularly in developing countries. In the end, OPT must do a better job of tracking changes in the marketplace.

Another problem is that congressional funding has been inconsistent, and frequently too small to support the development of a coordinated strategy. The report offers a series of recommendations for the overall program and for the work being done on each specific technology. They highlight areas where new plans are most needed, and call for the creation of a new office within the Department of Energy to focus on distributed power systems that would be well suited to renewable energy technologies. — Neil Tickner

Renewable Power Pathways: A Review of the U.S. Department of Energy’s Renewable Energy Programs. Committee on Programmatic Review of the DOE’s Office of Power Technologies, Board on Energy and Environmental Systems, Commission on Engineering and Technical Systems (2000, 136 pp.; ISBN 0-309-06980-7; available from National Academy Press, tel. 1-800-624-6242; $30.75 plus $4.50 shipping for single copies).

The committee was chaired by H.M. Hubbard, retired president and chief executive officer, Pacific International Center for High Technology Research, Golden, Colo. The study was funded by the U.S. Department of Energy.


Too Much of a Good Thing

Each spring and summer, part of the Gulf of Mexico along Texas and Louisiana turns into a “dead zone.” Excessive amounts of nitrogen and phosphorus — which make their way to coastal waters via acid rain and rivers polluted with agricultural runoff and industrial waste — trigger harmful algal blooms that deplete the water’s oxygen as they die and decompose, killing or driving away fish, crabs, starfish, and other marine life.

The damage in the Gulf is just one example of the serious environmental problems — from red tides in the mid-Atlantic to manatee deaths off Florida — occurring along all of the nation’s coasts because of too much nitrogen and phosphorus in the water. Of 139 U.S. coastal areas assessed recently, 44 were identified as severely affected by high levels of these nutrients.

A new report by a committee of the Research Council calls for the federal government to work with state and local agencies to develop a national strategy for protecting fresh and coastal waters from excessive nitrogen and phosphorus. At a minimum, the effort should strive to reduce the number of severely damaged coastal areas by at least 25 percent before 2020 — cutting the 44 areas now severely affected to 33 or fewer — and ensure that no healthy coastal areas get into trouble.

Coastal environmental quality could be significantly improved if local and state agencies focused on identifying sources of excess nutrients and reducing their release, the committee said. But oversight at this level often is not sufficient for protecting large watersheds that span several states or for dealing with pollution sources far from coastal areas.

The federal government should take the lead on issues that span multiple jurisdictions or threaten federally protected natural resources, the committee said. Clear guidelines are needed for the maximum amounts of nutrients, or loads, that may be released into waterways. The Environmental Protection Agency, for example, develops standards for different types of regional watersheds and should continue to do so, focusing its efforts on identifying the sources of nutrients and setting total maximum daily loads.

Accurate estimates of nutrients that travel along waterways and get dumped into coastal waters are essential for developing effective strategies to curb excesses. Federal, state, and local agencies should form partnerships with academic and research institutions to develop a monitoring program for the nation’s coastal waters. And every 10 years, the government should conduct a national assessment to determine the extent of nutrient problems and the effectiveness of efforts to combat them.

In addition, a federally managed clearinghouse would provide better information to state and local coastal authorities, as would a comprehensive database on the Internet with links to relevant data. The committee recommended further research to improve understanding of the causes, and the environmental and economic impacts, of nutrient contamination. — Molly Galvin

Clean Coastal Waters: Understanding and Reducing the Effects of Nutrient Pollution. Committee on the Causes and Management of Coastal Eutrophication, Ocean Studies Board and Water Science and Technology Board, Commission on Geosciences, Environment, and Resources (2000, approx. 300 pp.; ISBN 0-309-06948-3; available from National Academy Press, tel. 1-800-624-6242; $44.95 plus $4.50 shipping for single copies).

Robert Howarth, professor of ecology, program in biogeochemistry and environmental change, Cornell University, Ithaca, N.Y., chaired the committee. The study was funded by the National Oceanic and Atmospheric Administration, Environmental Protection Agency, U.S. Geological Survey, and the Electric Power Research Institute.


Privatizing Helium

During the Cold War, the United States began producing and stockpiling helium at a special facility in Texas. The rare, nonrenewable element — extracted mostly from natural gas — was considered essential for many military applications. In the decades that followed, the private sector also developed a broad array of technologies and processes that depend on helium, including the manufacture of fiber-optic cables and semiconductors, jet propulsion systems, and medical devices such as magnetic resonance imaging scanners.

After several U.S. companies began to produce and sell helium in the 1970s and 1980s, the government decided to get out of the helium business. In 1996 Congress directed the Bureau of Land Management (BLM) to sell off most of the government’s considerable stockpile of the gas. The scientific community expressed concern that selling the federally managed helium reserve could lead to a shortage, threatening national security and the interests of U.S. scientific, technical, and biomedical enterprises.

But a new report by a committee of the Research Council says that privatizing the federal helium reserve, as mandated by law, should not adversely affect the overall production and use of the gas over the next two decades. The committee based its finding on the assumptions that demand for helium would continue to rise slowly but steadily, that no drastic reductions in supplies would occur, and that no large new source of the gas would be discovered. However, a number of actions should be taken to ensure that sufficient supplies remain available beyond 2020.

Federal law requires the government to assess the results of the sale of its reserves in 2015. A mechanism should be developed to allow reviews to take place earlier, the committee said, especially if dramatic changes in supply or demand appear. In addition, BLM should bolster its methods for tracking and forecasting the international helium market, which could affect domestic supplies. Although the government keeps tabs on the gross amount of helium exported, little is known about how the gas is actually used.

To ensure that enough helium will continue to be produced in the future, the government should explore new ways to locate supplies of the gas. In addition, improved storage systems are needed, and new technologies that could conserve, recycle, or eventually replace the use of helium should be developed. — M.G.

The Impact of Selling the Federal Helium Reserve. Committee on the Impact of Selling the Federal Helium Reserve, Board on Physics and Astronomy, Commission on Physical Sciences, Mathematics, and Applications; and National Materials Advisory Board, Commission on Engineering and Technical Systems (2000, 98 pp.; ISBN 0-309-07038-4; available from National Academy Press, tel. 1-800-624-6242; $18.00 plus $4.50 shipping for single copies).

Committee co-chairs were Robert Ray Beebe, senior vice president, Homestake Mining Co. (retired), Tucson, Ariz.; and John Reppy, John L. Wetherill Professor of Physics, Cornell University, Ithaca, N.Y. The study was funded by the U.S. Department of the Interior.

