Ghost in the machine: Year 2000 spooks nation’s computers
May 1, 1997
Written by Jeff Green
The birth of the new millennium could sound the death knell for many of the nation’s computers. Barring drastic measures, experts say, many computer-controlled operations worldwide will malfunction at midnight, Dec. 31, 1999.
Computers will fall prey to the Year 2000 problem, also known as the Millennium Bug or “Y2K.” The problem is simple, its implications nearly unfathomable. Early in the computer age, programmers designed computers and applications using just two digits to record calendar years. A two-digit date field records the year 1997 as “97.” Problems arise when date fields read “00” and computers calculate the dawn of a new century – the 20th century, that is. Computers using two-digit date fields read the year “00” as 1900, rendering date-related calculations false and potentially crashing operations worldwide.
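The arithmetic failure described above is easy to demonstrate. The sketch below is illustrative only (the function name and sample values are invented, not drawn from any cited system): subtraction on two-digit years works until one of them rolls over to "00."

```python
def years_elapsed(start_yy, end_yy):
    """Subtract two-digit years, as much legacy code did."""
    return end_yy - start_yy

# Within a single century, the shorthand works:
print(years_elapsed(60, 97))  # 37: a person born in 1960 is 37 in 1997

# At rollover, "00" is treated as 1900, not 2000:
print(years_elapsed(60, 0))   # -60: the same person now appears to be -60 years old
```

Any calculation built on that result, from interest accrual to license expiration, inherits the error.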
“Anything that runs off a computer chip could very well have this problem,” says Mike Humphrey, business director at Public Technology Inc. (PTI), the Washington, D.C.-based technology arm of the National League of Cities, the National Association of Counties and the International City/County Management Association. Y2K failure in software related to finance, taxes, insurance and countless other key areas will spawn waves of consequences.
Disagreement over the scope of the problem exists, but discussions of Y2K’s economic impact typically deal in billions of dollars. The Gartner Group, a Stamford, Conn.-based market research firm, estimates that Y2K’s total global cost will range from $300 billion to $600 billion. Capers Jones, chairman of Burlington, Mass.-based Software Productivity Research, disagrees. He says Y2K appears to be the most expensive single problem in human history, with a tally of $1.6 trillion including expected litigation costs.
Concern has escalated recently regarding potential problems in products, including everything from elevators to security systems, that have two-digit date fields in “embedded” microprocessors. In local governments, Y2K could disrupt everything from school enrollment to prisoner-release schedules. Nationally, affected areas range from medical care to Defense Department operations.
Awareness runs high, but follow-through among localities is poor, says Bill Loller, a project leader at Mountain View, Calif.-based G2 Research. Too few have begun efforts to achieve Y2K compliance, a labor-intensive and time-consuming process. Gartner estimates that fewer than 50 percent of IS departments will be fully compliant by the year 2000. Local governments, which confront special challenges in the process, could lag even further behind.
Y2K already is disrupting operations that extend past 1999. In February, Keith Rhodes, technical director of the General Accounting Office, told a House of Representatives subcommittee that the Defense Department had mistakenly notified a contractor that it was 97 years past due on a contract that ends in January 2000.
“As we get closer and closer to the year 2000, we are going to see an increasing frequency of problems until, of course, the big bang,” says Y2K consultant Peter de Jager.
Public and private entities alike have been slow to act on the problem, at least in part because it sounds like science fiction or millennium-induced hysteria to those holding the purse strings. Experts bemoan the lack of recognition Y2K has received from government and corporate leaders.
Florida’s House speaker, for one, voiced doubts. Reacting last December to a cost estimate for the state’s Y2K fix, Orlando Republican Daniel Webster was quoted as saying, “I don’t see it as a big problem, a $115 million problem . . . Given the way technology is moving, I guarantee you that if you put it on the Internet, someone will come up with a solution far cheaper.”
However, “It is not that simple,” responds Mary Kurkjian, a vice president at Blue Bell, Pa.-based Unisys and creator of the company’s state and local government Y2K program.
Shortsighted Shorthand

The Y2K problem is hardly new, says de Jager, a former COBOL programmer and the founder of the Toronto-based firm de Jager & Co. (http://www.year2000.com). Though new to the public consciousness, the problem has been known to programmers since they first wrote the code.
Until recent years, computer memory was highly limited and expensive. As a result, programmers were called on to use “shorthand.” Given little choice, programmers used two-digit date fields, assuming wrongly that the code would not be in use decades later. Many later applications were also designed with two-digit date fields. In some cases, programmers who wrote the faulty shorthand are now being called out of retirement for top pay to aid Y2K repairs.
The Y2K problem directly interferes with hardware, software and data; consequently, it affects all platforms. Customized software applications written for mainframe computers are particularly troublesome. Automated tools are available to help convert code, but the procedure remains a project manager’s nightmare.
Gartner says pilot projects verified that up to 90 percent of applications are subject to failure if not made compliant. The firm estimates that the cost of upgrading computer systems will average $1.10 per line of code, but that number varies greatly according to case-specific circumstances and is expected to rise as competition for resources grows.
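The article does not specify which conversion techniques the tools use, but one widely used repair of the period, date "windowing," expands each two-digit year against a fixed pivot rather than widening every stored field. A minimal sketch, assuming a pivot of 50 (the cutoff is a per-project choice, not a standard):

```python
PIVOT = 50  # assumed cutoff: 00-49 -> 2000s, 50-99 -> 1900s

def expand_year(yy):
    """Map a two-digit year to a four-digit year using a fixed window."""
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(expand_year(97))  # 1997
print(expand_year(3))   # 2003
```

Windowing is cheaper than widening every date field, but it only postpones the ambiguity: any two dates more than a century apart still collide.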
Still, “On the mainframe side, the hardware problem has been pretty well taken care of,” says PTI’s Humphrey. Also, Macintosh computers and newer PCs are not directly affected. According to Microsoft, newer products such as Windows 95 are compliant. However, an estimated 80 percent of PCs using older Windows products could face Y2K failure. Relatively simple fixes are available for operating-system and BIOS-level problems, though some applications could prove more complex. Macs were designed to handle 21st-century dates, but that is no guarantee that all Mac applications are compliant.
If, in fact, Y2K does end up costing billions of dollars, the money will find its way into someone’s pockets. Thus, a sizable industry has sprouted. In addition to the hundreds of companies offering Y2K repair services, some insurance companies are creating policies to cover companies against Y2K losses. Many law firms have established Y2K working groups to deal with the flood of litigation expected to follow Y2K failures. The American Stock Exchange in March began trading options on the de Jager Year 2000 Index, the first such index in the industry. The list of companies on the index was provided by de Jager and narrowed by AMEX to 18, in industries such as packaged software and programming.
Where do you stand?

“A very small percentage of governments are on track to have everything done by the year 2000,” says G2’s Loller. Bob Cohen, vice president of the Information Technology Association of America, says localities are not necessarily any further behind than other sectors. Most sectors lag behind, but banks and insurance companies lead the pack.
The Y2K solution can be broken into three major steps: inventory, conversion and testing. The work can be completed entirely by an outside vendor, in-house or by a combination of the two. In many cases, managers will be forced to focus only on key systems. Managers can opt to replace systems entirely or to outsource entire software applications. Outsourcing spreads the cost of conversion over multiple payments, but it can be a political issue if jobs are lost. Also, the time needed to replace hardware and applications is running out.
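The inventory step amounts to finding every place dates are handled. As a rough illustration of the kind of automated pass a conversion tool might make (the pattern and sample lines are hypothetical), a scanner can flag source lines containing two-digit date literals for human review:

```python
import re

# Hypothetical pattern: two-digit date literals such as 12/31/99.
SUSPECT = re.compile(r"\b\d{2}/\d{2}/\d{2}\b")

def scan(lines):
    """Return (line number, text) for each source line that needs review."""
    return [(n, line.strip()) for n, line in enumerate(lines, 1)
            if SUSPECT.search(line)]

source = [
    "expiry = '12/31/99'",
    "total = total + 1",
    "renewal = '01/01/00'",
]
print(scan(source))  # lines 1 and 3 are flagged; line 2 is not
```

A real inventory must also catch dates hidden in file formats, databases and embedded chips, which is why the step is labor-intensive rather than a one-pass search.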
Among the challenges faced by localities are achieving recognition at high levels, dealing with often-antiquated and poorly documented systems and coordinating efforts with disparate entities. Budget issues, such as immutable salary cycles, will hamper localities. “It is not common for governments to make new money available for the year 2000 problem,” Kurkjian says. Cost is a deterrent, particularly for a project that seems to offer little return on investment.
De Jager predicts that as competition for Y2K services grows, the public sector will be less able to compete. Cohen says smaller localities will suffer most as competition increases. Computer hardware companies are already turning away Y2K business.
The seller’s market will force local governments to change the way they do business, according to Kurkjian. “Government is notoriously slow at IT projects,” she says.
De Jager agrees, “We seem to have forgotten our illustrious history of never getting things done on time.”
Companies often tout products on Wall Street as magic bullets that will solve the Y2K problem, but industry experts generally agree that such a panacea is unlikely to surface. “The best that we can hope for is that there will be increasingly better productivity tools,” Kurkjian says.
“The denial is huge,” de Jager says. Faced with demands for year 2000 resources, he adds, many high-level officials “literally don’t believe it.”
The federal government has also reacted sluggishly. Starting last spring, the House began a series of subcommittee hearings on Y2K issues. GAO acknowledges that there will likely be some computer failure in the government as a result of Y2K, and the Office of Management and Budget has forecast a budget of $2.3 billion for federal Y2K problems, an estimate widely criticized as too low.
By May, de Jager says, “Everybody should have at least finished their inventory and done a full impact analysis so they have some sense of how big the problem is.”
Conventional wisdom says all of 1999 should be left for testing. Any later, according to Kurkjian, and “there will be some cycles that they will never have tested.”
“We’re supposed to ideally spend 50 percent of the time of any project on testing,” de Jager explains. “For every 1,000 changes you make to a system, you introduce about eight errors.”
“We’ve been having a concerned look at this area for a couple of years,” says Brian Wantling, director of data processing in Henrico County, Va. The first step was to name a point person to lead Y2K efforts. The county paid Unisys more than $100,000 for an inventory test, which covered about five million lines of code. The company recommended three potential routes. The least expensive, at about $2.2 million, was for the county to complete the task in-house. Wantling chose that route and is now two-thirds done. He says the cost should be under $1 million and that he aims to finish conversion by late 1998.
Sam Owen, MIS director for Winston-Salem, N.C., began looking at the Y2K problem more than two years ago. Like Wantling, he says his goal is to finish conversion by late 1998. The most difficult aspect of the process, according to Owen, was getting recognition that Y2K is a major problem that is not going to go away. Beyond that, he says, “We’ve had good support from management.” Owen says his biggest concern is the unknown, particularly that resulting from a lack of coordination among agencies.
Karen Henderson, Y2K project manager for King County, Wash., says her department is doing inventories outside of the MIS department to find what operations will be affected. Henderson began looking at the Y2K problem a year ago and aims to finish conversion by March 1999.
Margin for error

Industry experts admit they cannot accurately forecast the extent of the Y2K problem. The margin for error on many Y2K estimates, according to de Jager, could be as high as plus or minus 50 percent. No one can really grasp how far-reaching the problem is, he adds, because “there is no central inventory” for computer applications.
Still, de Jager hardly seems equivocal. “There will be failures. It is inevitable,” he says. De Jager estimates that 1 percent of all businesses will fail as a result of the Y2K problem, basing that on the assumptions that 35 percent of companies have compliance projects in place and that 86 percent of all projects are either turned in late or never completed.
Jones, whose Software Productivity Research publishes Y2K information largely on a pro-bono basis, predicts that 5 to 7 percent of all businesses will fail as a result of Y2K problems (http://www.spr.com).
De Jager expresses hope, but little faith, that the year 2000 problem has been overestimated. “That may very well end up being the truth of the matter, that it was never as big as we thought,” he says. Of course, it might also be bigger.
After five years using UNIX, the Planning, Zoning and Building Department (PZB) of Palm Beach County, Fla., last year switched its GIS operation to Windows NT. “We studied NT for almost two years and came to the conclusion that it is the future of our networked GIS,” says Donna Goldstein, PZB’s GIS systems supervisor.
Goldstein cites several reasons for the change, including the growing number of software developers who favor NT and release products exclusively in that format, as well as NT’s reputation as being easy to use because of its graphical interface. NT had been installed in several other county departments, and PZB wanted them to have access to the GIS server and databases via an existing Wide Area Network (WAN). Also, GIS work had risen to a point where upgrades were needed to maintain PZB’s level of service, and the department was installing Windows-based office software inaccessible from the UNIX workstations.
The switch required installation of an NT version of the GIS software and the purchase of a new server and workstations. UNIX data had to be moved to an NT version of the same database software. PZB originally built its GIS with Huntsville, Ala.-based Intergraph’s Modular GIS Environment (MGE) software, server and workstations for UNIX. PZB bought the new software and hardware from the same vendor, which implemented the switch and moved the data from UNIX.
PZB says the toughest aspect of the switch was timing it to occur all at once. Over a weekend, the UNIX data was exported to tape drives. The NT server was installed and set up with an identical database structure so the files could be imported. Goldstein credits the smoothness of the switch partly to training classes her staff completed.
As a result of the transition, six county departments now access the GIS data via the WAN, and several offices plan to install their own GIS to better utilize the data. Instead of calling PZB to request data, these departments can now obtain needed data from their own desktops.
The switch allowed PZB to speed its GIS development. Most GIS requests had come from within PZB and were limited to the census block level, as the database had no parcel information. However, PZB worked with the County Appraiser’s Office to export parcel DXF files into the GIS database. PZB will link the database to parcels using site addresses or property control numbers to conduct site-specific GIS queries. The department wrote a DOS batch program within the GIS software to translate “dumb” DXF parcel files into “smart” multilayer DGN files, so that every parcel boundary is identified in a separate GIS layer.
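The article does not show the county's batch program, but the shape of the task can be sketched. DXF is a plain-text format of code/value line pairs in which group code 8 carries an entity's layer name; a minimal analogue (the sample data is invented) groups entities by that code, the first step in splitting parcels into separate layers:

```python
def layers_in_dxf(text):
    """Count entities per layer in a minimal DXF stream.
    DXF alternates a group-code line with a value line; code 8 = layer name."""
    lines = text.splitlines()
    counts = {}
    for code, value in zip(lines[::2], lines[1::2]):
        if code.strip() == "8":
            name = value.strip()
            counts[name] = counts.get(name, 0) + 1
    return counts

sample = "0\nLINE\n8\nPARCEL_001\n0\nLINE\n8\nPARCEL_002\n0\nLINE\n8\nPARCEL_001\n"
print(layers_in_dxf(sample))  # {'PARCEL_001': 2, 'PARCEL_002': 1}
```

A production translator would of course read real DXF sections and write DGN output through the GIS software, as PZB's program did.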
With its database growing in value and accessibility, PZB plans to provide access to more customers, including county offices and the public. PZB is buying GeoMedia Web Map, World Wide Web site development software that is intended to allow customers to obtain certain data via an Intranet on the WAN and eventually via the Internet.