ENGINEERING & TECHNOLOGY
Army Reserve National Guard personnel on training exercises; U.S. Army photo
Reserves Going Hi-Tech
National Guard and Reserve personnel who serve part time in the military are increasingly being called on to participate in peacekeeping operations, as well as to train for possible deployments in international conflicts. Although guardsmen and reservists bring considerable expertise to the U.S. armed forces, their principal jobs are in the civilian world.
A report from a National Research Council committee examines how new technologies can be used to improve the effectiveness of reserve forces, and how their personnel can be better integrated with those on active duty.
The committee urged the Department of Defense to implement four high-priority pilot programs to provide military leaders with better information on issues that affect the reserves:
- Increased Training Time through Technology, to increase training opportunities by applying distance-learning technology;
- Advanced Distributed-Learning Technology for Maintenance Personnel, to determine whether diagnostic and repair technologies can be transmitted over long distances to field maintenance personnel;
- Streamlined Administrative Processes, to reduce the time it takes to mobilize reservists by demonstrating the use of advanced computer information systems; and
- Telesupport and Remote Staffing, to help reduce the size and vulnerability of combat forces deployed overseas by linking them with technical support units and personnel based in the United States. — Bob Ludwig
Technology-Based Pilot Programs: Improving Future U.S. Military Reserve Forces. Committee on Reserve Forces for 2010 and Beyond, Division of Military Science and Technology, Commission on Engineering and Technical Systems (1999, 82 pp.; ISBN 0-309-06576-3; available from National Academy Press, tel. 1-800-624-6242; $18.00 plus $4.50 shipping for single copies).
The committee was chaired by Donald Fredericksen, Hicks and Associates, McLean, Va. The study was funded by the U.S. Department of Defense.
Lessons From Y2K
While revelers around the globe celebrate the turn of the century and a new millennium this New Year’s Eve, computer technicians everywhere will be keeping their eyes on the clock for another reason. For them, the approach of the year 2000 has meant hundreds of hours spent fixing the so-called Y2K bug, and they are anxious to see the results.
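At its core, the bug comes from legacy software that stored years as two digits, so that arithmetic spanning the century rollover goes wrong. The sketch below is a hypothetical illustration, not code from any system discussed in the study; the "windowing" fix it shows was one widely used remediation technique.

```python
# The Y2K bug in miniature: date arithmetic on two-digit years
# produces nonsense once the century rolls over.
def years_elapsed_2digit(start_yy: int, now_yy: int) -> int:
    """Elapsed years computed on two-digit years, as much legacy code did."""
    return now_yy - start_yy

# An account opened in 1995 ('95'), checked in 2000 ('00'):
print(years_elapsed_2digit(95, 0))  # -95 instead of the correct 5

# One common remediation, "windowing": interpret two-digit years
# below a chosen pivot as 20xx and the rest as 19xx.
def expand_year(yy: int, pivot: int = 30) -> int:
    """Map a two-digit year onto a sliding 1930-2029 window."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(0) - expand_year(95))  # 5
```

Windowing only postpones the problem to the pivot year, which is part of why remediation also involved expanding stored dates to four digits wherever feasible.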
Only when the New Year has arrived will we find out how well governments and businesses around the world have succeeded in mitigating potential mishaps. But regardless of the magnitude of disruptions, or lack thereof, the preparation demanded by the Y2K computer issue and the vulnerabilities it exposed will provide valuable lessons. For this reason, the Research Council is launching a study to assess the impact of Y2K and to look at how organizations can best prepare for future “shocks” to systems dependent on computer technology.
One phase of the study will involve an examination of the U.S. Air Force’s Y2K preparedness efforts and the result of those endeavors. Researchers will pay particular attention to Air Force systems that are connected to outside or even non-U.S. networks, and the risk that poses. In addition, researchers will explore whether disruptions caused by Y2K can be used in training to simulate the effects of a “cyber warfare” attack on Air Force networks, and whether the Air Force’s response to Y2K can be applied to the protection of other critical systems in the future.
In the second part of the project, the Research Council will team up with the European Commission — the policy arm of the European Union — to study the international effects of Y2K. Together they will draw on lessons learned to foster a better understanding of how dependable complex computer systems really are.
Researchers involved in the project expect to hold workshops and issue reports in 2000 and 2001 on both phases of the project. — Bill Kearney (See listing under New Projects.)
Rights to Data
The public’s access to science and technology information in computer databases is in danger of being compromised. New legal approaches such as the European Union’s 1996 Directive on the Legal Protection of Databases, and other legal initiatives now being considered in the U.S. Congress and state legislatures, are threatening to upset the balance between the rights of database owners who are concerned about possible commercial misappropriation of their products, and public users of the data such as researchers, educators, and libraries.
In examining these developments, a new report from a Research Council committee concluded that because database owners enjoy significant legal, technical, and market-based protections, the need for new statutory protection has not been sufficiently substantiated. Nevertheless, although the committee opposed the creation of any strong new protective measures, it recognized that some additional limits against wholesale misuse of databases may be necessary.
In particular, a new, properly scoped and focused U.S. law might be a reasonable alternative to the European Union’s database directive. Such legislation could then serve as a legal model for an international treaty. The report also recommends a number of guiding principles for such legislation, as well as related policy actions for the administration. For example, one principle is that any new federal protection of databases should balance the costs and benefits of the proposed changes for both database rights holders and users. Other principles cover areas such as the length and scope of legal protection and exemptions for nonprofit institutions. — B.L.
A Question of Balance: Private Rights and the Public Interest in Scientific and Technical Databases. Committee for a Study on Promoting Access to Scientific and Technical Data for the Public Interest, Commission on Physical Sciences, Mathematics, and Applications (1999, 136 pp.; ISBN 0-309-06825-8; available from National Academy Press, tel. 1-800-624-6242; $38.00 plus $4.50 shipping for single copies).
The committee was chaired by Robert J. Serafin, director, National Center for Atmospheric Research, Boulder, Colo. The study was funded by the National Science Foundation, National Institutes of Health, National Institute of Standards and Technology, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, and Department of Energy.
Broadband’s Home Run
More efficient use of copper wire, along with the accelerating adoption of fiber-optic cable and wireless approaches, continues to revolutionize the telecommunications industry — making it cheaper and more efficient to transport greater amounts of information. Telephone companies, Internet service providers, and cable television operators are using these new “broadband” technologies to offer people new services. But who has access to which technologies, and at what cost, are questions attracting widespread attention.
As technical innovation, industry competition, and government regulation interact, broadband technology will continue to advance. The various stakeholders, however, view this interaction in different ways. In a new study, a Research Council committee will explore these technologies and examine scenarios for their future deployment. The study group will identify options that are technically and economically feasible, and make recommendations for public policy that would facilitate broader deployment of, and support for, local access to broadband service.
The project is expected to be completed in the latter part of 2001. — B.L. (See listing under New Projects.)