SolutionSoft Systems, Inc.
2350 Mission College Blvd. Suite 715
Santa Clara, CA 95054
Phone: (408)988-7378
E-mail: paulwang@solution-soft.com
Abstract
Testing is critical to the success of your Year 2000 project. Hewlett-Packard estimates that 40 percent of the time and resources of a typical Year 2000 project will be spent on the testing phase. Market research firms such as Meta Group and Gartner Group estimate an even higher percentage (50 to 60 percent). Year 2000 testing is unique in that both time and data must reflect the Year 2000 date.
The challenge is greatest with client-server applications, where multiple systems and data sources are involved. This paper describes how to set up a productive and safe test environment and how to conduct boundary testing, functional testing, and baseline testing in such an environment.
A well-designed test strategy and test environment are critical to ensure successful and on-time completion. After all, the Year 2000 deadline is immovable and the clock is ticking!
Introduction
The cause of the Year 2000 problem is well known. Storage and memory were very expensive twenty years ago. To save space, and believing their applications would not last until the year 2000, programmers traditionally stored the year as two digits, without the century portion; a "19" century prefix was always assumed. Of course, this logic breaks when we reach the year 2000: "00" is interpreted as 1900 instead!
The Year 2000 problem is nearly universal. Not only do older platforms such as mainframes, MPE, and UNIX have Year 2000 problems; relatively new platforms such as the PC and Java have problems too.
Year 2000 poses the biggest challenge the software industry has ever faced. For the first time in software history, the deadline is immovable! To make matters worse, millions of applications are affected. Everyone will be doing the same thing at the same time. Resources will be scarce and budgets will be tight.
The Federal Reserve estimates the cost of Year 2000 conversion for Fortune 500 companies at 11 billion dollars, and the total cost for US companies at 50 billion dollars. Those figures do not include hardware and software upgrade costs. Gartner Group estimates 300 to 600 billion dollars in conversion costs worldwide.
The legal and liability costs may be even higher than the actual conversion cost! Lawsuits are expected wherever contract obligations are not met or customers suffer damages due to Year 2000 glitches. Even in this early Year 2000 season, more than 200 cases are already in court or in settlement! The two most recent Year 2000 settlements were for 2.2 and 8.5 million dollars.
Finally, the so-called Year 2000 problem may not occur exactly on Jan. 1, 2000. Depending on how far an application looks into the future, it may fail earlier than "expected." Indeed, we have already seen prison inmates released before their time, and credit cards with year 2000 expiration dates crashing cash register systems NOW.
Year 2000 Project Phases
The typical Year 2000 conversion project consists of four phases: inventory, assessment, conversion, and testing.

In the inventory phase, we document what hardware and software we have. Software includes operating systems, compilers, applications, utilities, databases, data files, scripts, batch jobs, command files, and anything else related to running the business. We should also note which versions of the software we have and whether source code or test suites are available. This should be a living document, maintained indefinitely. It will come in handy for the next major project, such as the European currency conversion.
In the assessment phase, we determine whether the applications are Year 2000 compliant. If not, we want to know HOW they will fail, and thus how much impact there will be on the business. We must also find out WHEN they will start failing, so we know how much time we have to fix them!
We do the actual code correction in the conversion phase. Using a 4-digit year is an obvious approach. It is a long-term solution; however, it requires more work. Not only must the applications be converted, but the data must also be migrated to the new format. The data conversion aspect makes the switchover between old and new applications tricky. A "magic weekend" cutover may be hard to pull off for 24-by-7 shops, or for companies spanning many countries with dependent applications!
The other popular approach is the fixed window. In this scheme, a 2-digit year is still used, and its value is interpreted relative to a fixed window threshold: any 2-digit year less than the threshold means the 2000s, and the 1900s otherwise. For example, the MPE/iX operating system uses 50 as the threshold and the HP-UX operating system uses 68. This approach is less work, since no data conversion is required. However, it is a relatively short-term solution, and it may not work with data spanning more than 100 years, such as age-related data. In addition, sorting may become an issue: as 2-digit values, 00 sorts before 99, yet year 2000 comes after year 1999.
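As an illustrative sketch (not taken from any particular product), the fixed window expansion can be expressed in a few lines of Python; the pivot values for MPE/iX and HP-UX come from the text above:

```python
def expand_year(yy, pivot=50):
    """Expand a 2-digit year using a fixed window.

    2-digit years below the pivot are read as 20xx, the rest as 19xx.
    pivot=50 mirrors MPE/iX; HP-UX uses pivot=68.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a 2-digit year")
    return 2000 + yy if yy < pivot else 1900 + yy
```

Note that with a pivot of 50, 49 expands to 2049 but 50 expands to 1950, which is exactly why dates around the window threshold deserve their own boundary tests.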
One "best of both worlds" approach, the 2-digit year with special coding, deserves special attention. It still uses a 2-digit year field, but uses "A" through "Z" in the tens position to represent the years 2000 through 2259. For example, year 2001 is "A1" and year 2014 is "B4". No data conversion is required, and sorting still works, since digits collate before letters. The solution is relatively long-term, and the conversion effort is similar to the fixed window approach.
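A sketch of the letter-coded scheme, reconstructed from the two examples above (the exact field layout is an assumption):

```python
import string

def encode_yy(year):
    """Encode a year in 2 characters: 1900-1999 as plain digits,
    2000-2259 with a letter in the tens position (A=200x ... Z=225x)."""
    if 1900 <= year <= 1999:
        return "%02d" % (year - 1900)
    if 2000 <= year <= 2259:
        return string.ascii_uppercase[(year - 2000) // 10] + str(year % 10)
    raise ValueError("year out of range for this scheme")

def decode_yy(field):
    """Invert encode_yy."""
    if field[0].isdigit():
        return 1900 + int(field)
    return 2000 + 10 * string.ascii_uppercase.index(field[0]) + int(field[1])
```

Sorting still works because digits collate before letters in ASCII, so "98" (1998) sorts ahead of "A1" (2001).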
The last phase is testing, where we verify that the applications work and are indeed Year 2000 compliant. Not only is boundary testing needed, but also functional testing and integration testing. The details are covered in later sections.
Testing is the most time-consuming and costly phase. Hewlett-Packard estimates that 40 percent of the time and resources of a typical Year 2000 project will be spent on the testing phase. Market research firms such as Meta Group and Gartner Group estimate an even higher percentage (50 to 60 percent). Chase Manhattan recently announced that it is increasing its Year 2000 budget from 250 million to 300 million dollars due to underestimating testing. For them, the testing phase accounts for 70 percent of the resources and costs.
Test Environment
For obvious reasons, users should NEVER do assessment or testing against their production environment. The applications may not be Year 2000 compliant, which may produce unpredictable results or even damage the production environment! Testers may also accidentally put test data into the production environment, which could be a disaster.

As a result, users must duplicate their production environment into a test environment for assessment and testing. The test environment may reside on the same production systems or on standalone test systems. When creating the test environment, make it very obvious to all users which environment is which, so there is no confusion with production. New directory structures and new users should be created.
With client-server and distributed applications, the test environment may span multiple systems and platforms. With supply chain and EDI applications, the test environment sometimes even spans multiple companies and countries!
Date Simulation
To verify Year 2000 compliance, we obviously must simulate dates and times in the year 2000. The first idea that comes to mind is to reset the system clock to the year 2000. This approach has many issues.

First of all, many demo and utility programs have built-in expiration dates. Such time-bombed software may expire in the test environment. This can be a major inconvenience if those utilities are needed to maintain or monitor the test environment.
Many users try to reset the system clock on their production systems. This is very inconvenient and costly. The system may only be available during "off hours" (weekends and midnights). In addition to poor morale, higher overtime and facility costs may result.
It is also very dangerous to the production environment. The file timestamps (last accessed time, last modified time) of some files will carry year 2000 dates. This causes backup and archiving problems. Applications built with make files are also in danger, since make decides what to compile and link based on file timestamps. Worse yet, resetting the clock may cause data corruption. Consider a database log file containing transactions dated 1998, then 2000, then 1998 again: database recovery from this log file would surely corrupt data! Therefore, USERS SHOULD NEVER EVER RESET SYSTEM CLOCKS ON THEIR PRODUCTION SYSTEMS.
Users may only reset system clocks on standalone "crash and burn" systems, where loss of data does not matter. This is a very unproductive approach. Resetting the system clock affects all users on the system, so testing is single-threaded: one clock at a time. Tester A cannot test Feb. 29, 2000 until tester B finishes testing with Jan. 1, 2000. Testers are constantly waiting for and synchronizing with each other. It is also a very costly approach: users must not only buy or lease the hardware, but also duplicate all the software licenses for the various databases and utilities. Given the sheer volume of testing demands and the unproductive nature of tester synchronization, a large number of standalone test systems may be needed.
The best approach is to utilize a software date simulation tool, which provides virtual clocks. Multiple users may see different virtual clocks concurrently on the same system. This boosts productivity dramatically: testers can conduct their testing without worrying about or waiting for other testers. A better date simulation tool also lets the rest of the users, and the file system, continue to see the actual current date and time. As a result, file timestamps reflect the current time, and it is safe to run on production systems! With the productivity gain and the ability to conduct assessment and testing on production systems, users can eliminate or reduce the number of test systems needed.
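A real date simulation tool is necessarily system-specific, but the core idea of a per-process virtual clock can be sketched as follows; the environment variable name is purely hypothetical:

```python
import datetime
import os

def virtual_now():
    """Return this process's simulated time: the real time plus a
    per-process offset. Each tester exports a different offset, so many
    virtual clocks coexist on one system, while file timestamps (written
    by the operating system, not by us) keep the real time."""
    # VIRTUAL_CLOCK_OFFSET_DAYS is a hypothetical variable for this sketch.
    days = int(os.environ.get("VIRTUAL_CLOCK_OFFSET_DAYS", "0"))
    return datetime.datetime.now() + datetime.timedelta(days=days)
```

Tester A can export an offset that lands on Feb. 29, 2000 while tester B works with Jan. 1, 2000 in a different session, with no coordination between them.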
Data Aging
There are two dimensions to Year 2000 testing: time and data. The applications we run today have both time and data before year 2000. The converted applications should certainly work when both time and data are after year 2000. But don't forget to test the forecast case, where the time is before year 2000 and the data is after, as well as the review case, where the time is after year 2000 and the data is before.

One common oversight is to test beyond-year-2000 time with only current data. Not all date-related program logic may be exercised, since there is no beyond-year-2000 data in the current database!
One way is to use the date simulation tool to generate aged data. The other is to age the production database by a fixed number of years into a test database, say by adding three years to all date-related data.
One potential pitfall is that an aged date may land on a different weekday. This can create an "impossible combination" where the application behaves unexpectedly and delays testing, even though nothing is really broken. For example, an application may assume no orders are taken during the weekend. This condition can be avoided with the 28-year rule: if we add a multiple of 28 years to any date, it always lands on the same weekday.
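The 28-year rule is easy to verify; this sketch ages a date while preserving its weekday (valid for dates between 1901 and 2099, where no century leap-year exception applies):

```python
import datetime

def age_date(d, years=28):
    """Age a date by a multiple of 28 years.

    Within 1901-2099, 28 years contain exactly 7 leap days, so the span
    (28 * 365 + 7 = 10227 days) is divisible by 7 and the weekday is
    preserved. Even Feb. 29 stays valid, since the target year is also
    a leap year in that range."""
    if years % 28 != 0:
        raise ValueError("use a multiple of 28 years")
    return d.replace(year=d.year + years)
```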
Just like date simulation tools, data aging tools are available to help users age their databases and create test databases.
Boundary Testing
It is obvious that we want to test all boundary cases for Year 2000 compliance. These include at least Jan. 1, 1999, Sep. 9, 1999, Dec. 31, 1999, Jan. 1, 2000, Jan. 31, 2000, Mar. 31, 2000, and Dec. 31, 2000. Sep. 9, 1999 is included because "9/9/99" is commonly used as a special value to denote expiration, "not applicable," etc. Of course, don't forget Feb. 29, 2000. Year 2000 IS a leap year: century years are leap years only when divisible by 400, and 2000 qualifies. Many applications' leap year calculations miss this and treat it as an invalid date! If the converted application uses the fixed-window technique, then dates around the window threshold should be tested as well.

Functional Testing
Once we modify an application to make it Year 2000 compliant, there is no guarantee that the original functionality remains intact. Unfortunately, not all applications come with complete functional test suites that we could simply run to verify the functionality.

We could always update or create complete functional test suites. Although this is the best approach, and valuable for future use as well, most companies simply have no time for it. One popular technique to combat this problem is called baseline testing.
With baseline testing, we fix the inputs with today's date and data and collect the outputs. This is our baseline. We then rerun the applications with a simulated date and/or aged data and compare the outputs. The headings of the reports may show different dates, but all the numbers (averages, totals, etc.) should match. If they all match, we pass the baseline test and can be reasonably confident that the functionality is still intact. Of course, baseline testing involves many runs with different date and data combinations, covering various business-critical reports and situations.
Rather than comparing the reports manually each time, report comparison tools can compare the results automatically and alert the tester only when there are differences.
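A minimal comparison pass might mask out date fields before comparing lines, so that only the business figures must agree; the date format used here is an assumption to adapt to the real reports:

```python
import re

# Assumed report date format, e.g. 12/31/1999; adjust for the actual reports.
DATE_RE = re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")

def reports_match(baseline_lines, test_lines):
    """Compare two report runs line by line, replacing every date with a
    placeholder so that differing report dates do not count as failures,
    while any change in the numbers still does."""
    if len(baseline_lines) != len(test_lines):
        return False
    for base, test in zip(baseline_lines, test_lines):
        if DATE_RE.sub("<DATE>", base) != DATE_RE.sub("<DATE>", test):
            return False
    return True
```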
Integration Testing
A production system has many applications, and those applications may interface with and influence one another. Rather than testing everything at once, it is best to test each application standalone and incrementally with boundary and functional testing. Once the applications pass their tests individually, we conduct integration testing by putting them all together and testing every aspect of a day's production.

With client-server or distributed applications, integration testing may involve more than one system and platform. Since a PC or client may not be a reliable time source, fortunately most client-server applications get their time solely from the server. Still, it is a good idea to simulate the "same time" across all test systems during integration testing.
Test Automation
Testing is a tedious and repetitive process. Each test case in a test suite must be repeatable. Test setup, execution, and result interpretation may take hours and be highly labor intensive. The same test suite will run many iterations against the same application during a Year 2000 compliance project. Even with just one line of source code modified, the whole suite needs to be rerun.
In such a repetitive environment, it is critical to automate the test process as much as possible to increase efficiency, eliminate human error, prevent boredom, and save costs. The best scenario is to automate every aspect of a test suite, so that test setup (date simulation, initial database content), the test run, and result checking are all done by test scripts. Testers are notified only if a test case fails. It makes perfect sense to spend more time up front automating a suite rather than running it hundreds of times manually.
Date simulation, data aging, and comparison tools are all instrumental in the automation process. Without them, many test steps would remain manual.
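Putting the pieces together, one automated test case might look like the sketch below; the virtual-clock environment variable and the file names are assumptions for illustration:

```python
import os
import subprocess

def run_case(cmd, offset_days, baseline_file, output_file):
    """Run one automated test case: set the (hypothetical) virtual clock
    offset for the child process, execute the application, then compare
    its report against the stored baseline. Returns True on a match."""
    env = dict(os.environ, VIRTUAL_CLOCK_OFFSET_DAYS=str(offset_days))
    subprocess.run(cmd, env=env, check=True)  # raises if the app itself fails
    with open(baseline_file) as base, open(output_file) as out:
        return base.read() == out.read()
```

A driver script then loops over the date and data combinations and notifies the tester only when run_case returns False.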
Disk Space Consideration
One big surprise in Year 2000 testing is the disk space requirement. The test environment takes as much room as the production environment. Many testers keep multiple copies of the same test database to test different aspects of the application. In addition, multiple versions of the source code and programs of the converted applications also take disk space.
Users probably do not want to double their disk capacity just for the Year 2000 project! On the other hand, juggling disk space by purging and restoring test environments from tape wastes valuable time and may invalidate test results through out-of-disk-space errors.
An online archiving compression tool can be a nice alternative. Infrequently used files, including older versions of applications, can be compressed online to make room for the Year 2000 test environment. The test environment itself can also be compressed. The compressed files are transparent to users: when users access them, they are automatically decompressed. Assuming a 75% compression ratio, one application's worth of disk space can actually hold four applications!
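The capacity claim is simple arithmetic, sketched here for clarity:

```python
def effective_capacity(compression_ratio):
    """Capacity multiplier for a given compression ratio: at 75%
    compression each file shrinks to 25% of its size, so one
    application's worth of disk holds 1 / 0.25 = 4 applications."""
    if not 0 <= compression_ratio < 1:
        raise ValueError("ratio must be in [0, 1)")
    return 1 / (1 - compression_ratio)
```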
Priority Plan & Contingency Plan
Gartner Group estimates that more than 50 percent of companies will not be fully Year 2000 compliant come Jan. 1, 2000. With so many applications impacted and so little time left, companies that are not fully Year 2000 compliant today should also be thinking about a contingency plan.
If you have not yet finished the assessment phase, or have not even started, a priority plan is critical. Finish the assessment phase as the first priority. Then base your priority plan on each application's business impact, sensitivity to dates, and must-fix date (when it will start failing). Applications with great business impact, high date sensitivity, and imminent failure dates should get first attention, with very comprehensive and complete testing. We can take shortcuts with applications that have little business impact and low date sensitivity: minimal testing will do, and we might even decide it is OK to let them fail!
All business-critical applications should have a contingency plan. This might mean hiring additional staff to handle transactions manually on paper, or kludges such as aging the data into the past so the current applications continue to function, albeit with the wrong date. It may look ugly, but it sure beats having no plan at all when everything falls apart.
Conclusion
With 50 to 60 percent of a Year 2000 project's resources and costs spent on the testing phase, a well-designed test strategy and test environment are critical to ensure successful and on-time completion. Boundary testing, functional testing, and integration testing are all musts for a comprehensive and complete test. Software tools for date simulation, data aging, report comparison, and data compression are key to automating the test process, reducing additional hardware and software costs, and boosting productivity.
After all, the Year 2000 deadline is immovable and the clock is ticking. The question is: will you beat the clock?
Biography
Paul Wang is the president of SolutionSoft Systems, Inc. He is a software developer specializing in transaction management, system performance, file system internals, databases, and online transaction processing. Previously, he was the internal architect of transaction processing at Hewlett-Packard.

SolutionSoft Systems, Inc. is a Hewlett-Packard channel partner and Cure2000 partner, specializing in Year 2000 and data management solutions.