John C E Ellis

My thoughts and ideas

  • THE MILLENNIUM BUG: 25 YEARS ON

    A Tale of Hype, Herculean Efforts, and Successful Outcomes (in the main)

    INTRODUCTION

    As the clock ticked closer to midnight on December 31, 1999, the world braced itself for a technological apocalypse known as the Millennium Bug, or Y2K. This looming threat was predicted to disrupt computer systems worldwide, leading to catastrophic failures in everything from financial systems to power grids. The hysteria in the media surrounding Y2K was intense, resulting in a massive, coordinated global effort to prevent the feared outcomes. Yet, as the new millennium dawned, the expected chaos failed to materialize, leaving many to wonder: was the threat real, or was it all just hype?

    THE ISSUE: WHAT WAS THE MILLENNIUM BUG?

    The Millennium Bug, or Y2K, was in reality two separate problems.

    The first stemmed from a seemingly innocuous programming shortcut used in the early days of computing. To save valuable memory space and money, programmers represented years with only two digits (e.g., 1970 as “70”). This practice worked well until the year 2000 approached. The fear was that systems would interpret the year “00” as 1900, potentially causing errors in date-sensitive operations across a myriad of systems, including financial transactions, power grids, and government databases.
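    As a minimal illustration of how that could bite (a hypothetical sketch in C, not code taken from any real Y2K system), consider a record that stores only the last two digits of the year and then does simple date arithmetic:

    ```c
    #include <stdio.h>

    /* Hypothetical record from a system that stores years as two digits. */
    struct policy {
        int start_yy;    /* 70 meaning 1970 */
        int renewal_yy;  /* 00 intended to mean 2000, but effectively 1900 */
    };

    int main(void)
    {
        struct policy p = { 70, 0 };   /* taken out in 1970, renews in 2000 */

        /* Arithmetic that worked for decades suddenly goes wrong:
           00 - 70 = -70 years, instead of the intended 30. */
        int years_held = p.renewal_yy - p.start_yy;
        printf("Years between start and renewal: %d\n", years_held);
        return 0;
    }
    ```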

    The second was a specific technical issue involving PC BIOS dates: many older computers’ Basic Input/Output System (BIOS) would not correctly recognize the year 2000, potentially causing PC startup failures. Chris Myers, a Fellow of the Institution, wrote this technical document back in 1997 to explain the hardware issues.

    THE HYPE: THE WORLD ON EDGE

    The potential consequences of Y2K were widely publicized, creating a global frenzy. Media outlets predicted widespread chaos: banks would fail, airplanes would fall from the sky, and essential services would grind to a halt. Governments, businesses, and individuals prepared for the worst. Companies spent billions of dollars on remediation, and contingency plans were made to address possible disruptions. Some people even hoarded supplies, fearing that basic utilities might fail.

    While the world woke up to the idea of the Millennium Bug during 1999, in preparation for the potential downfall of society at one minute past midnight on the 1st of January 2000, the problem was already a threat on the 1st of January 1999. Many systems, for example motor, house and life insurance and banking systems, were already entering data where the next renewal would fall in the year 2000, or “00”; if a system was not ready, the potential for failure was high.

    THE WORK: THE GLOBAL EFFORT

    In response to the impending threat, a monumental effort was launched to fix and upgrade systems. The work involved several key steps:

    1. *Inventory and Assessment*: Identifying systems, hardware and software that were potentially vulnerable.

    2. *Remediation*: Updating or replacing hardware and software to handle the date change correctly, including updating PC BIOS to ensure proper date recognition (a simple sketch of one common software fix follows this list).

    3. *Testing*: Rigorous testing to ensure that changes were effective and did not introduce new problems.

    4. *Contingency Planning*: Developing backup plans to address potential failures.
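    One common software fix, referred to in point 2 above, was “windowing”: interpreting two-digit years relative to a pivot so that existing data did not have to be re-keyed. The sketch below is illustrative only; real systems chose their own pivot values.

    ```c
    #include <stdio.h>

    /* A typical "windowing" remediation: two-digit years below the pivot
       are treated as 20xx, those at or above it as 19xx.  The pivot of 50
       is illustrative; each application chose a value suited to its data. */
    static int expand_year(int yy)
    {
        const int pivot = 50;
        return (yy < pivot) ? 2000 + yy : 1900 + yy;
    }

    int main(void)
    {
        printf("70 -> %d\n", expand_year(70));  /* 1970 */
        printf("00 -> %d\n", expand_year(0));   /* 2000 */
        printf("49 -> %d\n", expand_year(49));  /* 2049 */
        return 0;
    }
    ```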

    This colossal undertaking saw cooperation across industries and borders, involving governments, private sector companies, and international organizations. The UK alone spent over £20 million and the United States is estimated to have spent over $100 billion on Y2K preparations.

    SPECIFIC EXAMPLES: UK & WORLDWIDE

    In the UK, the government launched the Action 2000 initiative, led by the respected business leader Don Cruickshank. This initiative aimed to ensure that both public and private sector organizations were prepared for Y2K. The National Health Service (NHS), for instance, invested significantly in ensuring that its systems, including patient records and medical devices, would not fail due to Y2K-related issues.

    One notable global example is the work done by the financial sector. The New York Stock Exchange and NASDAQ underwent extensive testing and upgrades to prevent trading disruptions. Similarly, in Japan, the government and private sectors collaborated to ensure that essential services, such as banking and utilities, were safeguarded against potential Y2K failures.

    Some IAP members reported how busy they were while others said it was quiet, or even mundane.

    THE OUTCOME: MINOR GLITCHES, NO MAJOR FAILURES

    As the clock struck midnight on January 1, 2000, the world held its breath. But rather than the predicted chaos, the transition to the new millennium was surprisingly smooth. Minor glitches occurred, but nothing on the scale of the anticipated catastrophe. Some examples of minor issues included:

    1. *United States*: A few slot machines in Delaware stopped working temporarily, and in Washington, D.C., a couple of spy satellites experienced minor data hiccups.

    2. *Japan*: Some minor glitches were reported, such as errors in radiation monitoring equipment, which were quickly resolved.

    3. *Australia*: Bus ticket validation machines in two cities failed to operate correctly for a few hours.

    4. *UK*: A few credit card transactions went awry.

    5. *South Korea*: Spare a thought for the 170 people who were sent court summonses to attend on the 4th of January 1900.

    Overall, these minor incidents were quickly addressed, and no significant disruptions were reported.

    ANALYSIS: WAS IT WORTH IT?

    In hindsight, the smooth transition can be seen as evidence of the success of the massive remediation efforts. Without the extensive preparations, the outcome might have been very different. The lack of major incidents was not because the threat was imaginary, but because of the proactive measures taken to address it.

    Furthermore, the Y2K preparations had several positive side effects:

    *Modernization*: Many outdated systems were updated or replaced, leading to more efficient and reliable operations.

    *Increased Awareness*: The event raised awareness of the importance of maintaining and updating critical infrastructure.

    *Preparedness*: Organizations developed better contingency planning and risk management practices.

    CONCLUSION

    The story of the Millennium Bug is a testament to the power of coordinated global action in the face of a common threat. While the anticipated chaos did not materialize, the extensive preparations undoubtedly played a crucial role in ensuring a smooth transition into the new millennium. Y2K serves as a valuable lesson in the importance of vigilance, preparation, and collaboration in managing technological risks. In the end, the Millennium Bug was not a catastrophe, but a catalyst for improvement and modernization.

    THE FUTURE

    While researching for this article, one of our Fellows, Irene Jones, mentioned epoch time. Used on UNIX-based systems, it stores the number of seconds that have elapsed since 00:00:00 on the 1st of January 1970. Also known as UNIX time, it will have a similar crisis moment on the 19th of January 2038, when any 32-bit signed integer fields used to store the time will overflow. The solution is, of course, to replace those fields with 64-bit fields, but who knows what dragons may exist in some of these older systems. It may seem a way off, but some serious system changes may well be needed in the years ahead.
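    As a minimal sketch of the 2038 problem (assuming a platform where time_t was historically a 32-bit signed integer, run here on a modern 64-bit time_t so the resulting dates can be printed):

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* Largest value a 32-bit signed counter of seconds since
           00:00:00 UTC on 1 January 1970 can hold. */
        int32_t last_good = INT32_MAX;                 /* 2,147,483,647 */

        time_t t = (time_t)last_good;
        printf("Last representable moment: %s", asctime(gmtime(&t)));
        /* Tue Jan 19 03:14:07 2038 */

        /* One more second and a 32-bit field wraps to a large negative
           value, which the system then interprets as a date in 1901. */
        int32_t wrapped = (int32_t)((uint32_t)last_good + 1u);
        time_t bad = (time_t)wrapped;
        printf("After the overflow:        %s", asctime(gmtime(&bad)));
        /* Fri Dec 13 20:45:52 1901 */
        return 0;
    }
    ```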

     

  • D-DAY 80 YEAR ANNIVERSARY

    On 6th June 2024, the 80th anniversary of D-Day, events were held across the UK to commemorate the largest seaborne invasion in history; a mission that marked the beginning of Western Europe’s liberation in the Second World War.

    Here’s an overview of how computing helped at this critical time.

    COMPUTING IN THE 1940s AND ITS ROLE IN D-DAY: The 1940s marked a significant period in technological advancement, particularly in computing. This era, defined by World War II, saw the development and use of early computers that played critical roles in various military operations. One of the most notable events where computing technology made a significant impact was during the planning and execution of D-Day, the Allied invasion of Normandy on June 6, 1944.

    EARLY COMPUTERS & THEIR CAPABILITY: In the early 1940s, the concept of digital computing was in its infancy. The computers of this era, such as the British Colossus, were rudimentary by today’s standards but revolutionary at the time. These machines could perform calculations at speeds unattainable by humans and were crucial in processing large volumes of data quickly.

    COLOSSUS: Developed by British engineer Tommy Flowers, Colossus was designed to break German encryption, specifically the Lorenz cipher used by the German High Command. Its ability to decipher encrypted messages allowed the Allies to gather crucial intelligence.

    CODEBREAKING & INTELLIGENCE: One of the most critical contributions of computing to D-Day was in the field of codebreaking. The British Government Code and Cypher School at Bletchley Park, home to the Colossus computer, played a pivotal role in deciphering German communications.

    DECIPHERING THE LORENZ CIPHER: The Lorenz cipher, used by the German High Command, was more complex than the Enigma cipher. Colossus, which became operational in late 1943, was instrumental in breaking this cipher. The intelligence gleaned from these decrypted messages provided the Allies with insights into German troop movements, defensive strategies, and overall military planning.

    OPERATIONAL SECURITY: Understanding German communications allowed the Allies to implement effective countermeasures and deception strategies, such as Operation Fortitude, which misled the Germans about the actual landing site of the invasion.

    WEATHER FORECASTING: Another critical aspect of D-Day planning was in weather forecasting. The success of the invasion was heavily dependent on favorable weather conditions.

    METEOROLOGICAL CALCULATIONS: Meteorologist Group Captain James Stagg, who advised General Eisenhower, relied on observations (from a ship several hundred miles off the west coast of Ireland), discussions with colleagues and his intuition, rather than computational aids, to predict a brief window of good weather, which ultimately determined the timing of the invasion. Today, satellites and buoys in the ocean give far more accurate weather predictions (sometimes!).

    LOGISTICS & PLANNING: The sheer scale of Operation Overlord, the code-name for the Battle of Normandy, required meticulous planning and coordination. Most of the planning was manual, but picking a time to land came down to a simple computation device that calculated tide tables for high and low water, along the lines sketched below.
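    Those tide machines worked by mechanically summing a handful of cosine “constituents”. As a rough, hedged sketch of the idea (the amplitudes and phases below are made-up illustrative values, not Normandy data):

    ```c
    #include <math.h>
    #include <stdio.h>

    /* Harmonic tide prediction: h(t) = sum of A * cos(speed * t + phase).
       Constituent speeds are standard; amplitudes and phases are invented. */
    struct constituent { double amp_m, speed_deg_per_hr, phase_deg; };

    static const struct constituent tide[] = {
        { 2.0, 28.984, 110.0 },   /* M2: principal lunar semi-diurnal */
        { 0.7, 30.000,  45.0 },   /* S2: principal solar semi-diurnal */
        { 0.4, 15.041, 200.0 },   /* K1: lunisolar diurnal */
    };

    static double height_at(double hours)
    {
        const double deg_to_rad = 3.14159265358979323846 / 180.0;
        double h = 0.0;
        for (int i = 0; i < 3; i++)
            h += tide[i].amp_m *
                 cos((tide[i].speed_deg_per_hr * hours + tide[i].phase_deg) * deg_to_rad);
        return h;
    }

    int main(void)
    {
        /* Tabulate one day of hourly heights, as a printed tide table would. */
        for (int hr = 0; hr < 24; hr++)
            printf("hour %2d: %+.2f m\n", hr, height_at(hr));
        return 0;
    }
    ```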

    CONCLUSION: The computing technology of the 1940s, while primitive compared to modern standards, played a crucial role in the success of D-Day. The integration of these technologies into military operations not only contributed to the success of D-Day but also laid the groundwork for the development of modern computing and its applications in various fields.

    The biggest contribution to D-Day was made by the 130,000 men who landed on the day, nearly 5,000 of whom died on the landing areas. This allowed nearly one million men over the next few weeks to land in France and go on to liberate Europe of the evil that had plagued it for many years.