Homeless in Vancouver: The Unix Epoch-alypse is only 25 years away


      Three weeks ago, on December 1, Google’s YouTube and South Korean pop star PSY shared an amazing milestone: the music video for “Gangnam Style”, by far the most viewed thing on YouTube, became the first to be watched more than 2,147,483,647 times, causing the video-sharing service’s 32-bit view counter to overflow, or roll over like an odometer.

      As Google stated on its blog, no one originally expected a video would ever be watched in excess of 2.1 billion times, but when the time came they were quick to upgrade the counter to a 64-bit integer (maximum value: 9,223,372,036,854,775,807).
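The wraparound is easy to picture in code. Here is a minimal Python sketch (my own illustration; YouTube’s actual counter code is not public) of how a signed 32-bit counter behaves the moment it passes its limit:

```python
import ctypes

def as_int32(n):
    # Reinterpret an integer as a signed 32-bit value, wrapping
    # around exactly the way a 32-bit hardware counter would.
    return ctypes.c_int32(n).value

views = 2_147_483_647           # the most a signed 32-bit counter can hold
print(as_int32(views))          # 2147483647 -- still fine
print(as_int32(views + 1))      # -2147483648 -- one more view and it wraps negative
```

A 64-bit counter pushes the same wrap point out to 9,223,372,036,854,775,807, which is why the upgrade made the problem go away.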

      History made. Problem solved (yawn).

      The failure of YouTube’s 32-bit view counter, and the blithe way it was dealt with, illustrates a looming issue faced by millions of aging 32-bit computer systems and how we may (or may not) be affected by it.

      The limit of YouTube’s 32-bit counter is shared by all 32-bit operating systems and it is baked into all computers built on top of 32-bit processors.

      This limit doesn’t apply to new laptops and desktop computers, which have 64-bit processors and run 64-bit compatible operating systems, but it could apply to programs which may continue to use 32-bit functions, for whatever reason.

      It certainly does apply to all the millions of 32-bit ARM processors in new Google Chromebooks and Android tablets and smartphones, as well as Microsoft’s ill-fated Surface RT tablet and HP’s new Stream 7 tablet, which uses a 32-bit Intel Atom processor running 32-bit Windows 8.1.

      But mostly it applies to the tens of millions of 32-bit computer systems still in use today and the millions which I expect will continue to be used for years and quite possibly decades to come.

      How Unix will be bitten by the 2038 bug

      Consider Unix, the influential operating system developed at Bell Labs. For whatever reason, one of the ways it records time is as the number of seconds that have elapsed since its Epoch, or rough birth date, of 00:00:00 Coordinated Universal Time (UTC) Thursday, January 1, 1970 (excluding leap seconds).
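In other words, Unix time is just a running count of seconds. A short Python sketch (the December 22, 2014 date is my assumption for roughly when this column appeared, three weeks after December 1):

```python
from datetime import datetime, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)      # the Unix Epoch
moment = datetime(2014, 12, 22, tzinfo=timezone.utc)   # an assumed "now"

# Unix time is simply the whole seconds elapsed since the Epoch.
unix_time = int((moment - epoch).total_seconds())
print(unix_time)  # 1419206400
```

Nearly 1.42 billion seconds on the clock already, out of the roughly 2.15 billion a signed 32-bit counter can ever hold.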

      The habit of keeping Unix time was carried over to at least four other influential operating systems: BSD and Linux, which are both modelled on Unix; Mac OS X, which is built on top of BSD; and Android, which is based on Linux.

      Programmers have known for decades that 32-bit versions of Unix-like operating systems have a problem with the year 2038:

      Eight seconds after 3:14 a.m. UTC on January 19, 2038, any Unix-like operating system that records Unix time in a signed 32-bit register will have run through all 2,147,483,647 seconds it can physically count. At that point the register wraps around to −2,147,483,648, a value such a system interprets as a moment in December 1901, and the clock starts counting forward again from there.
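The arithmetic is easy to check. Here is a Python sketch of the rollover, using ctypes to mimic a signed 32-bit register (on a real 32-bit system the wrap would happen inside the C library itself):

```python
from datetime import datetime, timedelta, timezone
import ctypes

INT32_MAX = 2**31 - 1   # 2,147,483,647: the last second a 32-bit time_t can hold
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

last_good = epoch + timedelta(seconds=INT32_MAX)
print(last_good)        # 2038-01-19 03:14:07+00:00

# One second later the register wraps to -2,147,483,648, which
# a 32-bit system reads as a moment in December 1901.
wrapped = ctypes.c_int32(INT32_MAX + 1).value
print(epoch + timedelta(seconds=wrapped))   # 1901-12-13 20:45:52+00:00
```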

      Why is Windows XP even a bit affected?

      Curiously, although all versions of Microsoft Windows are supposed to count time from January 1, 1601, forward to nearly 29,000 years in the future, one FAQ on the so-called 2038 bug indicates that the 32-bit Windows 2000 Professional also fails to cross the 2038 time barrier.

      The test involves running a simple Perl script, so I installed Perl (ActivePerl) on the three versions of Windows I have on hand.
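The FAQ’s test is just a short Perl loop that prints the seconds on either side of the rollover. A rough Python equivalent (my own sketch, not the FAQ’s script) does the same thing:

```python
import time

# Print the ten seconds straddling the 32-bit rollover point.
for clock in range(2**31 - 5, 2**31 + 5):
    try:
        print(clock, time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(clock)))
    except (OverflowError, OSError, ValueError):
        # A system limited to a 32-bit time_t fails right here.
        print(clock, "<- cannot represent this second")
```

On a healthy 64-bit system every line prints a date in January 2038; on a 32-bit-limited one, the output breaks partway through the loop.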

      Windows XP (left) failed the 2038 test, but Windows 10 had no problems.

      My 64-bit Windows 8 had no problem with 2038 and neither did the 64-bit Technical Preview of Windows 10 running in emulation. However, a 32-bit Windows XP running in emulation failed to display the date after the critical second in 2038.

      Itsy-bitsy nuisance or a big old problem?

      This could be seen as simply an amusing intellectual curiosity; after all, who will be still using a 32-bit computer 24 years and one month from now?

      Perhaps the better question isn’t who but rather what might still be using 32-bit computer controls in a quarter century?

      I would expect the only real problems to come from legacy 32-bit computer systems embedded where we least expect them—in traffic lights, ATMs, hospital equipment, anti-lock braking systems, water purification pumps, and the like.

      We should remember that since the year 2000—after the famous Y2K bug was behind us—the habit of sticking computer controls in things has only accelerated.

      Even as new laptops and desktop computers have largely transitioned to 64-bit architecture, low-power 32-bit ARM processors represent the cutting edge of embedded systems and many embedded systems are still being sold with 8- or 16-bit microprocessors because less processing power means less cost and power consumption.

      All this suggests there could be millions of embedded 32-bit versions of Windows and Linux lurking in every aspect of our infrastructure well past 2038.

      That may sound far-fetched, but is it any more far-fetched than the fact that millions of computers are still running Windows XP 13 years after it was introduced? StatCounter still places XP second, with a 13.16 percent worldwide share of users.

      And what about the 95 percent of the world’s ATMs that are believed to be running embedded XP? How many of those have been upgraded?

      In 2010 it was reported that an estimated 54 percent of Windows 7 installations were 32-bit, along with 89 percent of Vista installations and 99 percent of XP installations. Four years later, that suggests over a third of all desktop systems in use could still be 32-bit.

      We’ll just have to wait a bit to find out

      Those of us around in 2038 will find out which 32-bit systems remain and, more importantly, which ones were programmed to care about keeping track of time or numbers to such a degree that they fail.

      Meanwhile, programmers, who have known about all this for ages, appear to be of two minds.

      Some programmers foresee real problems, with an unpredictable number of critical systems suddenly failing, but the majority appear to look forward to the extra work: 40- and 50-something programmers even joke about it representing job security or part of their retirement nest egg.

      Actually, the 32-bit overflow problem, as exemplified by the 2038 bug, has already been cropping up.

      Thanks to a programming workaround for handling really big numbers, an America Online (AOL) server crashed back in 2006 at a moment which turned out to be, not coincidentally, one billion seconds before the expiry of 32-bit Unix time.

      There were reports last year that both Android devices and iPhones displayed 2038 problems: an iPhone running iOS 6.1 couldn’t stay on a date past January 1, 2038, and a tablet running Android 4.1 couldn’t be set to a date beyond 1/19/2038.

      It’s also been assumed and rumoured (but not demonstrated) that long-term calculations involving date ranges beyond 2038, such as mortgage amortizations and government fiscal plans, are already being compromised or at least complicated.

      The YouTube counter problem is the highest-profile manifestation of 32-bit overflow that I know of, and hopefully all future instances will be as trivial and easy to fix.

      Harder to fix, probably, is the habit some programmers have of implementing easy solutions without much thought to the future.

      How we count a bit too much on computers

      These days people use the digits 0 to 9, or base-10 notation, to represent every possible number. Computers instead rely on base-2 notation, which can represent every number using ones and zeroes, which neatly correspond to the on and off states of an electric current.

      Each on or off state represents a single binary digit, aka a “bit”, which can be either 0 or 1.
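For example, Python can show the binary form of any number directly. The YouTube limit from earlier is simply thirty-one 1 bits in a row:

```python
n = 2_147_483_647          # the 32-bit signed maximum from earlier
print(bin(n))              # 0b1111111111111111111111111111111
print(n.bit_length())      # 31 -- thirty-one value bits; the 32nd bit holds the sign
```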

      The central processing units (CPUs) inside computers that do the actual mathematical heavy lifting are physically limited by the number of bits they can process at one time, as well as the number of bits they can use to address memory.

      This bit limit of the CPU (which has risen as the cost of computer chips has fallen) also places an upper limit on the ability of software to manipulate numbers.

      Gets a bit confusing, doesn’t it?

      The computer industry has therefore gotten into the habit of referring to 8-, 16-, 32- or 64-bit as a measure of computing power. This is as much of a marketing ploy as anything else. The bit limit is real but also really confusing.

      It becomes a bit meaningless because these bit limits can differ between different CPU operations, and even a fully 64-bit CPU is no guarantee that all the programs on a computer are fully 64-bit compatible.

      Eight bits is, by longstanding convention, said to equal a byte (also called an octet). The maximum possible ranges of values below are unsigned, that is to say, they carry no positive or negative sign (signed values cover the same total span but divide it between positive and negative).

      • 8-bit (1 byte) = 0 to 255
      • 16-bit (2 bytes) = 0 to 65,535
      • 24-bit (3 bytes) = 0 to 16,777,215
      • 32-bit (4 bytes) = 0 to 4,294,967,295
      • 64-bit (8 bytes) = 0 to 18,446,744,073,709,551,615
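Those maxima all come from the same formula: 2 to the power of the bit count, minus 1. A couple of lines of Python reproduce the whole list:

```python
# Maximum unsigned value for each common bit width: 2**bits - 1
for bits in (8, 16, 24, 32, 64):
    print(f"{bits}-bit ({bits // 8} byte{'s' if bits > 8 else ''}) = 0 to {2**bits - 1:,}")
```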

      The bittersweet end of an era

      We are clearly in the closing days of the 32-bit era of personal computing, which began about 20 years ago.

      In 1985, Intel introduced the 32-bit 80386 microprocessor. It took a decade for this little chip to work its way through the food chain to become the rocket engine under the hood of the second phase of the personal computing revolution, which I mark as beginning with the 1995 release of Microsoft’s first consumer-oriented 32-bit operating system, Windows 95.

      I’m not forgetting that Apple actually beat Microsoft by moving the Macintosh to 32-bit addressing in 1991 with System 7 (my first Apple OS) but Apple’s market share never exceeded 12 percent.

      No one can deny that over 80 percent of all personal computer users got their first taste of the benefits of real 32-bitness on a Windows-Intel machine.

      Stanley Q. Woodvine is a homeless resident of Vancouver who has worked in the past as an illustrator, graphic designer, and writer. Follow Stanley on Twitter at @sqwabb.
