In IT, legacy technology is older technology that a business still depends on even though it is outdated in one way or another, often to the point that the company that created it no longer supports it and people with the skills and experience to maintain it are hard to find. It persists because the cost or difficulty of upgrading makes it hard, or even impossible, to replace with newer technology.
Legacy technology is rarely “better” than the newer technology that could potentially replace it. Still, there are cases where people stick with an older technology, usually an operating system or computer, because they’re more comfortable with it or because the newer version is missing important features. Sometimes the newer version actually has more features, but they aren’t very useful and come at the cost of the older version’s simplicity and performance.
Even when legacy technology is clearly inferior to newer technology, however, one still has to be impressed by the ability of legacy tech to last as long as it does (up to 30 years, in some cases), especially when most IT assets only last three to five years on average.
Legacy technologies can include hardware, applications, operating systems, and programming languages. In many cases, they go together—old programming languages with old mainframes, old applications with old operating systems, and so on. This is partly why upgrading some legacy technologies is so difficult and expensive: it’s usually not just one asset but an entire system that has to be replaced.
The five legacy technologies that currently have, or have had, the most impact on the IT world are:
Mainframes are large, powerful, centralized, and highly reliable computers. They are mainly used by big organizations to quickly, continually, and securely process and store huge amounts of data. Some banks, for example, use mainframes to manage all of their transactions. Other types of organizations that are likely to have mainframes include insurance companies, retailers, credit card companies, universities, and government agencies.
In fact, 96 of the world’s largest 100 banks, nine out of 10 of the world’s largest insurance companies, 23 of the 25 largest retailers in the United States, and 71 percent of the Fortune 500 use IBM System z mainframes. Currently, there are 10,000 mainframes actively being used around the world.
Mainframes were first introduced in the late 1950s, and for a while they were the only way for large businesses and government organizations to satisfy their high data storage and processing requirements. Later, though, smaller and less centralized computers emerged that could meet some of these requirements at a lower cost, and mainframes became more of a niche product for organizations that, beyond their high data storage and processing requirements, also had stringent reliability and security requirements.
Mainframes also continue to be used due to the high cost and complexity of migrating mainframe-based systems to other types of hardware, and because organizations with existing mainframes can’t just give up on such a large investment (most of IBM’s mainframes cost in the high six figures, for example).
Businesses aren’t still using the same mainframes that they were in the 1950s, of course. Even mainframes fresh off the assembly line are considered legacy technology, however, because they’re often used to run old, custom-built applications written in obsolete programming languages.
And speaking of programming languages…
COBOL (COmmon Business-Oriented Language) is a general-purpose, hardware-independent programming language that was created in 1959 by a committee of American engineers. It wasn’t the first widely adopted general-purpose programming language; that distinction belongs to FORTRAN (FORmula TRANslating System), developed by IBM engineers in 1957.
The point of COBOL was to make it possible to develop advanced data processing programs that could be used on any computer. It was also designed to be easy to use, so that business users didn’t have to be mathematicians to program with it.
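That English-like design is visible even in a tiny program. The sketch below uses hypothetical data names and free-format source, as accepted by modern compilers such as GnuCOBOL; it simply adds two sales figures and displays the total:

```cobol
IDENTIFICATION DIVISION.
PROGRAM-ID. SALES-TOTAL.
DATA DIVISION.
WORKING-STORAGE SECTION.
*> PIC clauses declare fixed-point decimal fields: five integer
*> digits plus two decimal places, well suited to money amounts.
01 SALES-A      PIC 9(5)V99 VALUE 12345.50.
01 SALES-B      PIC 9(5)V99 VALUE 200.25.
01 TOTAL-SALES  PIC 9(6)V99.
PROCEDURE DIVISION.
    ADD SALES-A TO SALES-B GIVING TOTAL-SALES.
    DISPLAY "TOTAL SALES: " TOTAL-SALES.
    STOP RUN.
```

Note how the arithmetic statement reads almost like an English sentence, which is exactly what the design committee was aiming for.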
Members of the committee that came up with COBOL included engineers from the United States Air Force and Navy and businesses such as IBM, Honeywell, and RCA. In 1962, IBM made COBOL its primary programming language, and by 1970, it was the most widely used programming language in the world.
COBOL is still incredibly popular even 55 years after its initial development. It’s still widely used by the federal government, and 48 percent of businesses and government organizations report using COBOL “a lot,” which is more than any other programming language.
Many of the programs that are currently run on mainframes are written in COBOL, as well.
The Digital Equipment Corporation (DEC), founded in 1957, was at one time the second-largest computer company in the world. Its specialty was the minicomputer, essentially a smaller, lower-cost alternative to the mainframe. The company failed to adjust as the minicomputer gave way to the personal computer (PC) after the 1970s, and it was acquired by Compaq in 1998.
Even though DEC no longer exists, a number of organizations—including the United States military—still use the company’s hardware and software.
For example, its Programmed Data Processor (PDP) minicomputers are currently used in the US Navy’s onboard radar systems, at the UK’s Atomic Weapons Establishment, by Airbus, and in nuclear power plants; and its VAX minicomputers serve in the US Navy’s submarines, the US Air Force’s Minuteman ICBM program, the Hawk missile systems, and the F-15 and F-18 fighter jets.
Never had an opportunity to crunch numbers with a mainframe, write code in COBOL, or launch missiles with a DEC minicomputer before? Well, here’s a legacy technology that you probably have some first-hand experience with: Windows XP.
Windows XP was first introduced in 2001, and eventually sold over one billion copies. At the height of its popularity, it was running on three out of every four computers in the world.
On April 8, 2014, Microsoft ended its support for Windows XP; it won’t be providing any further free security updates or technical support for the operating system. Even five months after the deadline, however, nearly 1 in 4 PCs still run Windows XP.
Reportedly, many of the XP holdovers are located in China, and they’re putting off upgrading for a variety of reasons, including cost and supposed fears about US government surveillance.
Organizations in other parts of the world, meanwhile, are sticking with the operating system because they have XP applications that are difficult or impossible to migrate to a newer version of Windows.
For these organizations that are stuck with the old OS, Microsoft offers continued security patches through a paid “Custom Support” program, at annual costs that can run into the millions of dollars; the British government, for example, is paying about $9 million per year for it.
OS/2 was an operating system developed in 1987 by IBM and Microsoft. The companies wanted OS/2 to replace their outdated DOS-based operating systems, and to help them establish a dominant position in the market for personal computers.
OS/2 never really caught on, however, because its resource requirements were too high and its interface was overly complicated. It was also overshadowed by Microsoft’s Windows 3.0, released in 1990, the first version of Windows to achieve widespread popularity (the operating system now has a market share of about 90 percent).
Though the last official version of OS/2 was released in 2001, and IBM stopped supporting the operating system in 2006, a number of organizations kept using OS/2 well into the late 2000s, and some use it still.
For example, OS/2 is currently used, or was used until very recently, by ATM companies in Australia, Brazil, and Iran; ticketing systems for the New York City subway, the London Underground, and France’s SNCF railways; and the checkout systems of grocery stores.