The mistake of the century
Speaking of the numerous blunders in software, one cannot ignore a mistake that tormented programmers' minds for at least the last five years of the twentieth century.
As you may have already guessed, we are talking about the notorious "mistake of the year 2000," known as Y2K. An entire industry grew up around correcting it, in which vendors earned considerable profit by playing on users' fear of the approaching computer madness.
Worldwide, about $200 billion was spent on preventing the "error of 2000." But, as they say, it blew over. The essence of the problem was that, by a convention established in the 1970s, only two digits were used to record the year. Consequently, when the year 2000 came, thousands of computers around the world would have been unable to tell the year 2000 from 1900. For home PCs this is not so terrible, but for banking, transport, and other systems rigidly tied to the current date, such a "mistake" could be costly.
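As a minimal illustration of how a two-digit year breaks ordinary date arithmetic (a hypothetical fragment, not taken from any of the systems mentioned here):

```python
# Hypothetical fragment: years stored as two digits, the way many pre-Y2K systems did.
issued_yy = 75    # document issued in 1975
checked_yy = 0    # checked in the year 2000, stored as "00"

age = checked_yy - issued_yy
print(age)  # -75: the program concludes the document was issued 75 years in the future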
However, despite all the measures taken to correct the error, some trouble did occur, and it started before the year 2000. For example, in Philadelphia in November 1999 several hundred people received subpoenas obliging them to appear in court in... 1900. And more than 30,000 Americans were surprised to find in their mailboxes official notifications from the U.S. Social Security Administration that their pension certificates had to be replaced, since they had expired back in 1900.
Nothing global happened because of the "mistake of 2000," though minor "inconsistencies" connected with it surfaced from time to time throughout the year 2000. For example, the Y2K-related failure in the information system of the Norwegian national railway company NSB occurred with a delay of one year: on the morning of December 31, 2000, none of the company's new trains could leave the depot on time.
The computers were unable to recognize the current date, which came as a great surprise to the specialists who, before the year 2000, had carefully tested all the systems against the millennium error. None of them suspected that the stumbling block would be not January 1 but December 31, 2000. The workaround turned out to be quite simple: the on-board computer clocks were set back to December 1, 2000, and that was enough.
Everything would be fine, but few users know that Y2K is likely to come back. The reason is that, according to experts, in about 80% of the world's computer systems the measure used to avert the crisis of 2000 gives only a relatively brief "stay of execution."
Instead of simply widening the format used to represent the year from two digits to four, many systems applied a "time window shift" (date windowing). This simple trick lets the program "guess" whether a date belongs to the 20th or the 21st century: as a rule, year numbers from the lower part of the supported range (for example, 0 to 30) are treated as belonging to the new century, and the rest, for example 87, to the previous one. It is clear that this measure is temporary, and in some 30 to 40 years the "problem of 2050," as it is already called, will arise in full force. Then again, back in the 1970s it also seemed that there was plenty of time before the twenty-first century...
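A minimal sketch of such windowing, assuming a pivot of 30 purely for illustration (real systems chose their own cut-off):

```python
PIVOT = 30  # illustrative cut-off; real systems picked their own window

def expand_two_digit_year(yy: int, pivot: int = PIVOT) -> int:
    """Date windowing: map a two-digit year to a full four-digit year.

    Values below the pivot are assumed to lie in the 2000s, the rest
    in the 1900s -- which is exactly why the trick stops working once
    real dates cross the chosen window.
    """
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_two_digit_year(7))    # 2007
print(expand_two_digit_year(87))   # 1987
print(expand_two_digit_year(55))   # 1955 -- wrong as soon as the real year 2055 arrives
```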
Instead of a conclusion
Funny as it may sound, one of the methods of checking a program for errors is to intentionally introduce... additional errors into an already finished and debugged program. Surprised?
There is nothing surprising here. A program seeded in advance with known errors is handed to a programmer for debugging, who tries to restore it to working order. If only the deliberately introduced errors are found, the original program is considered free of errors. But if, while hunting for the "reference" errors, additional ones turn up whose origin is unknown, the program, alas, has to be declared raw.
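The same idea is often used quantitatively, though the text above does not spell this out. A minimal sketch, assuming the classic fault-seeding (capture-recapture) estimate: the share of planted errors that testers rediscover is taken as the overall detection rate, which gives a rough guess at how many genuine errors remain.

```python
def estimate_remaining_errors(seeded_total: int,
                              seeded_found: int,
                              real_found: int) -> float:
    """Fault-seeding (capture-recapture) estimate of undiscovered errors.

    The share of seeded errors that was rediscovered is taken as the
    overall detection rate; the same rate is then assumed to apply to
    the genuine errors, giving a rough count of those still hiding.
    """
    if seeded_found == 0:
        raise ValueError("no seeded errors found -- the estimate is undefined")
    detection_rate = seeded_found / seeded_total
    estimated_real_total = real_found / detection_rate
    return estimated_real_total - real_found

# 20 errors seeded, 18 of them rediscovered, plus 3 unexpected genuine ones:
print(estimate_remaining_errors(20, 18, 3))   # ~0.33 genuine errors presumably remain
```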
It is a good, reliable method. But, as one of Murphy's laws observes, "errors cause new errors." Maybe that is why there are even more errors in finished programs?