IBM's Sequoia and the politics of dual-use supercomputers
Once again, the United States can claim the fastest supercomputer in the world. The US system, built by IBM and funded by the National Nuclear Security Administration (NNSA), is called Sequoia, and it displaced the previous reigning champion, Japan's K Computer. The title had been out of American hands for the past two years, held first by China's Tianhe-1A and then by K Computer. China's turn at the top in particular dismayed many western countries, mainly because of what a country can do with such advanced systems.
Supercomputers, pioneered by Seymour Cray in the 1960s, were originally used to assist scientists with nuclear physics research. These complex systems, then as now, are classified as "dual-use" by many governing bodies -- including the US federal government -- which means they have both civilian and military applications. Because of these dual capabilities, countries do not pursue the title of world's fastest supercomputer simply for bragging rights; they consider it imperative for national security.
According to NNSA Administrator Thomas D'Agostino, "Computing platforms like Sequoia help the United States keep its nuclear stockpile safe, secure and effective without the need for underground testing" and "give us increased confidence in the nation's nuclear deterrent as the weapons stockpile changes under treaty agreements." Sequoia will allow nuclear scientists to better understand weapons performance by simulating what happens to materials at extremes of pressure and temperature. It will also help anticipate and prevent problems caused by the aging of nuclear weapons. Through life extension programs (LEPs) made possible by Sequoia, weapons are assessed and modified to make them last longer, an important function considering the United States has not manufactured any new nuclear weapons since the Cold War. LEPs also help ensure that the US no longer has to run underground nuclear tests, the last of which were conducted in 1992.
On the civilian side, supercomputers have myriad uses. They are essential to modern weather forecasting and are used in research to predict climate change. They have helped us better understand earthquakes by simulating how seismic waves travel, which in turn lets us map Earth's interior far more efficiently. Supercomputers are also extremely beneficial in bioscience. Scientists can use them to take apart the structure of a virus, as they did during the 2009 H1N1 ("swine flu") pandemic, which helps them work out how to treat the disease.
While the flexible capabilities of supercomputers are incredibly useful, they also create tension between countries when it comes to buying and selling these machines. Governments are wary of companies selling supercomputers to rival countries because of their ability to drive nuclear and biological weapons development. A "dual-use" designation means that, beyond technical specifications, there are immediate political, financial and policy ramifications for the designers and companies that build such systems. In practice, certain markets are off-limits, both legally and politically. For example, in the 1990s, the United States government accused the Chinese government of purchasing supercomputers under the guise of civilian use and re-purposing them for military ends.
Policy makers in the United States -- often members of Congress -- will cite western companies for assisting certain foreign governments through such sales and force these companies to discontinue them. This is detrimental to the financial well-being of many large companies because, in many cases, the fastest-growing market for such technologies is overseas. Many supercomputers are also used for cryptography and cryptanalysis -- encryption and code breaking -- which have always been considered a strategic priority of the United States. Advances in this area by entities not affiliated with the National Security Agency (NSA) are immediately flagged as a potential threat to the country's ability to communicate securely on matters of importance.
On a secondary level, other issues affect the use of supercomputers in academia. While we like to consider such institutions beyond politics, many graduate students with access to these systems are not US citizens. This means the dual-use policies that typically cover the tangible component of a technology transfer can be worked around by individuals with access to the systems themselves. It can also affect US citizens who work alongside foreign graduate students -- much as when the State Department told several law programs that students who read WikiLeaks could jeopardize their ability to receive a security clearance.
Managing the politics around building a supercomputer is just as important as managing the technology.
When it comes to technical specifications, Sequoia is rather impressive. On the Linpack benchmark it tested at 16.32 petaflops -- quadrillions of floating-point operations per second -- compared with K Computer's 10.51 petaflops. In simpler terms, according to the BBC, "IBM said Sequoia was capable of calculating in one hour what otherwise would have taken 6.7 billion people using hand calculators 320 years to complete if they had worked non-stop." It is also more efficient: despite its much greater performance, Sequoia draws 7.9 megawatts of electricity to K Computer's 12.6.
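The BBC's comparison is easy to sanity-check. The short Python sketch below reproduces the arithmetic under one assumption that is ours, not IBM's: each person with a hand calculator completes one calculation per second, non-stop.

```python
# Back-of-envelope check of the BBC's hand-calculator comparison.
# Assumption (ours, not IBM's): one calculation per person per second.

SEQUOIA_FLOPS = 16.32e15             # Sequoia's Linpack result: 16.32 petaflops
SECONDS_PER_HOUR = 3600
SECONDS_PER_YEAR = 365.25 * 24 * 3600

# Operations Sequoia completes in one hour.
sequoia_ops = SEQUOIA_FLOPS * SECONDS_PER_HOUR

# Operations 6.7 billion people produce in 320 years at one per second.
hand_calc_ops = 6.7e9 * 320 * SECONDS_PER_YEAR

print(f"Sequoia, one hour:       {sequoia_ops:.2e} operations")
print(f"6.7B people, 320 years:  {hand_calc_ops:.2e} operations")
print(f"Ratio: {hand_calc_ops / sequoia_ops:.2f}")
```

The two totals land within about 15% of each other, so the comparison holds up under that assumption.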
Today's fastest supercomputer is 273,930 times faster than Thinking Machines' CM-5/1024, which held the top spot in 1993, and researchers show no signs of stopping. IBM first broke the "petaflop barrier" in 2008 with its Roadrunner, and Intel, which makes the majority of chips for supercomputers, says it will be able to create a chip that breaks the next one, the exaflop barrier, by 2018. Some are skeptical of this claim because processor clock speeds are no longer rising, but Intel says its "Xeon Phi" chip will get around this by acting as a co-processor running alongside the server CPU. The first Xeon Phi chip, code-named Knights Corner, will have over 50 cores and put out four or five gigaflops per watt; roughly ten times that performance per watt is needed to reach exascale. It will be used in a supercomputer called Stampede, which is slated to debut toward the end of this year at the Texas Advanced Computing Center.
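To see why performance per watt is the crux of the exascale claim, consider the power bill. The sketch below (our arithmetic, not Intel's) shows what sustaining one exaflop would cost in megawatts at Knights Corner-class efficiency versus the roughly tenfold improvement cited above.

```python
# Why exascale is a power problem: megawatts needed to sustain one
# exaflop (1e18 floating-point operations per second) at a given
# efficiency in gigaflops per watt.

EXAFLOP = 1e18

def megawatts_for_exaflop(gflops_per_watt: float) -> float:
    """Power draw, in megawatts, to sustain one exaflop."""
    watts = EXAFLOP / (gflops_per_watt * 1e9)
    return watts / 1e6

# Knights Corner-class efficiency: four to five gigaflops per watt.
print(f"At  5 GFLOPS/W: {megawatts_for_exaflop(5):.0f} MW")   # 200 MW
# Roughly ten times better, the level the exascale push requires.
print(f"At 50 GFLOPS/W: {megawatts_for_exaflop(50):.0f} MW")  # 20 MW
```

For scale, Sequoia itself draws 7.9 megawatts; a 200-megawatt machine would need its own power plant, which is why the efficiency gain, not raw clock speed, is the gating factor.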
The implications of supercomputers, as with most other technologies, extend beyond the technical performance of the systems themselves. The dual-use designation -- logically decided or not -- is a reality that affects innovation. While most engineers, researchers and scientists want to create technological marvels, it is just as important that they understand the world in which such systems operate and the consequences of developing them, beyond questions of technical excellence.