How Will Today’s Supercomputers Influence Tomorrow’s Home Computers?
Your desktop might never be the same…
Supercomputers have always offered a tantalizing glimpse into the future. Fifty years ago, Seymour Cray’s silicon transistor-powered CDC 6600 was roughly ten times faster than any standard computer of its era. Along with Manchester University’s Atlas, which pioneered the use of virtual memory, these groundbreaking machines were the world’s first true supercomputers. They also provided early indications of how computing power would develop, which is why today’s exclusive band of supercomputers may offer clues about all our computers of tomorrow.
In The Beginning…
The first supercomputer forecasts for mainstream technology involve their power and processing abilities. The 1976 Cray-1 introduced vector chaining as a way of increasing computational speed, while its 1985 successor, the Cray-2, was the first computer to sustain over a gigaflop (a billion calculations per second) – a level the Apple iPad 2 matched almost three decades later. Intel’s 1990s ASCI Red supercomputer offered 12 terabytes of disk storage, yet it was only last autumn that Samsung’s PM1633a solid state drive brought nearly 16TB of storage to market.
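To see why chaining boosts speed, consider a toy cycle-count model: without chaining, a dependent add must wait for the entire multiply result vector, whereas chaining forwards each element to the adder as soon as it emerges. The sketch below is illustrative only – the pipeline latencies are assumed round figures, not exact Cray-1 timings.

```python
# Toy cycle-count model of vector chaining, in the spirit of the Cray-1.
# The pipeline latencies are illustrative assumptions, not hardware specs.

def unchained_cycles(n, mul_latency=7, add_latency=6):
    """Multiply a whole vector, then add: the add pipeline cannot start
    until the final multiply result has been written back."""
    multiply = mul_latency + n  # fill the pipeline, then one result per cycle
    add = add_latency + n
    return multiply + add

def chained_cycles(n, mul_latency=7, add_latency=6):
    """With chaining, each multiply result feeds the adder directly,
    so the two pipelines overlap almost completely."""
    return mul_latency + add_latency + n

n = 64  # the Cray-1's vector registers held 64 elements
print(f"unchained: {unchained_cycles(n)} cycles")  # 141
print(f"chained:   {chained_cycles(n)} cycles")    # 77
```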
Of course, generation gaps between supercomputers and desktop computers have a great deal to do with economies of scale, not to mention physical scale. Samsung’s 16TB drive fits in a standard 2.5-inch SSD case, whereas the ASCI Red supercomputer would have filled a pair of three-bedroom semi-detached houses. Moore’s law observes that transistor counts double roughly every two years – popularly translated into processing power doubling every 18 months – yet the physical hardware required to deliver that processing keeps shrinking even as specifications rise. Mass production of microprocessors and solid state storage has simultaneously brought the cost of formerly laboratory-grade hardware down to levels where mass adoption can take place.
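The popular 18-month rule is easy to put into numbers. The quick sketch below is our own back-of-the-envelope extrapolation, using only dates already mentioned in this article:

```python
# Back-of-the-envelope extrapolation of the popular "performance doubles
# every 18 months" rule of thumb cited above.

def moores_law_factor(years, doubling_period=1.5):
    """Projected performance multiplier after the given number of years."""
    return 2 ** (years / doubling_period)

years = 2011 - 1985  # from the Cray-2's gigaflop to the iPad 2's launch
print(f"{moores_law_factor(years):,.0f}x projected growth over {years} years")
# ~165,000x: far more than needed merely to match 1985 performance, which
# is why most of that progress went into shrinking size, power and cost
# rather than raw speed.
```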
The Future Of Home Computers
So what does the performance of today’s supercomputers suggest for domestic devices in the 2020s? It should be borne in mind that 2016’s water-cooled monsters are vastly more powerful than anything previously seen, with the Japanese K computer squeezing over a hundred eight-core processors into each of its 864 individual cabinets. The recently unveiled Sunway TaihuLight can perform 93 quadrillion calculations per second – 93 petaflops – and its only real similarity to any domestic computer is its Linux-based operating system. Industry observers have speculated that today’s supercomputers are too powerful for most software to exploit, with their full potential really only suited to tasks like molecular dynamics modelling simulations.
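Those headline figures are straightforward to sanity-check. The snippet below uses the commonly cited TOP500-era specifications, taken here as assumptions:

```python
# Sanity-checking the headline supercomputer figures.
# Counts follow commonly cited TOP500 specifications.

# K computer: 88,128 eight-core SPARC64 VIIIfx processors in 864 cabinets.
k_processors, k_cabinets = 88_128, 864
print(f"K computer: {k_processors / k_cabinets:.0f} processors per cabinet")  # 102

# Sunway TaihuLight: 40,960 SW26010 processors with 260 cores each,
# delivering roughly 93 petaflops on the LINPACK benchmark.
tl_cores = 40_960 * 260          # 10,649,600 cores in total
tl_flops = 93e15                 # 93 quadrillion calculations per second
print(f"TaihuLight: {tl_cores:,} cores at "
      f"{tl_flops / tl_cores / 1e9:.1f} gigaflops per core")
```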
But Will It Be Enough?
Clearly this untapped power will be harnessed for less mathematical tasks by tomorrow’s domestic and commercial computers. The Internet of Things will require vast amounts of processing power and storage, both to govern billions of web-enabled devices and to draw meaningful conclusions from their collective output. Truly seamless VR will demand rendering capabilities far beyond today’s hardware platforms, and the relentless march towards autonomous vehicles will involve computers that can recognize and resolve challenges from stray pets to faulty traffic lights. Some industry analysts expect fully autonomous vehicles to require processing power akin to a turn-of-the-millennium supercomputer just to navigate safely and effectively.
To Infinity And Beyond!
Supercomputers also offer a glimpse into the future of commercial IT services like web hosting. Costs would fall if more affordable technology could perform the functions of today’s server rooms, while reduced space and power consumption could make web hosting companies both more cost-effective and more environmentally friendly. As the IoT takes hold, hosting firms with sufficient hardware may be able to offer processing services alongside website hosting and analytics. Abundant storage should make cloud capacity effectively limitless, accelerating the migration towards online file sharing – a far cry from the offline data storage of historic supercomputers. After all, even these elite machines don’t always accurately predict the future of IT…