This article was originally published in the spring 2016 issue of Argonne Now, the laboratory’s science magazine.
Director, Mathematics & Computer Science division
“We have a couple more generations of current silicon transistor technology in which devices continue to shrink. I’d say 8 to 10 years.
Today’s supercomputers are massively parallel; they are so fast because they string many processors together. But as you keep adding processors, software becomes the problem: can you split your task into 10 million smaller tasks? 100 million? Then power becomes a problem. So does resilience: how do you build a system that keeps functioning when some individual gates are misfiring?
One interesting area is called approximate computing. We spend a lot of power reducing the frequency of errors. What if you built a system that makes mistakes much more frequently but uses much less energy?
We can also explore specialized computers; we’ve known for a long time that, for a particular problem, a specialized machine can be faster than a general-purpose computer. As silicon reaches its limits and general-purpose computing starts slowing down, specialized systems may become more attractive.”
“I think the greatest challenge of the age may be how to process information in a way that mimics the elegance of the human brain. The brain does incredibly power-efficient computing. It runs on about 20 watts; supercomputers run on many megawatts. It’s something like a million times more efficient. Google and these other big companies that use huge computer banks spend a lot of energy cooling them.
And the thing is, the “clock cycle” is very slow in the brain compared to computers. A neuron can fire only about 5 times per second; transistors can switch billions of times per second. But computers are very linear, performing one operation and then another. Human brains are wired very differently. So maybe when we look for the “next transistor,” we don’t actually need a faster version of silicon, but something with new properties that inspires us to build computers with an entirely different architecture.”
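The efficiency gap in the quote holds up as back-of-envelope arithmetic. A quick sketch, assuming a 20-megawatt machine as a stand-in for “many megawatts” and the widely cited estimate of roughly 86 billion neurons (neither figure is from the article):

```python
# Back-of-envelope check of the numbers in the quote above.
brain_watts = 20.0
supercomputer_watts = 20e6          # assumed example of a "many megawatt" machine
efficiency_gap = supercomputer_watts / brain_watts
print(f"power gap: {efficiency_gap:.0e}x")     # about a million times

# Slow "clock", massive parallelism: even at ~5 firings per second,
# tens of billions of neurons produce hundreds of billions of events
# per second in aggregate.
neurons = 86e9                      # common estimate of human neuron count
firing_rate_hz = 5                  # per-neuron rate from the quote
events_per_second = neurons * firing_rate_hz
print(f"aggregate firings per second: {events_per_second:.1e}")
```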
Suzanne G.E. te Velthuis
“Memory is a serious bottleneck for better, faster computers. One approach we’re studying is to base memory on skyrmions, tiny bubbles of magnetism that form in certain materials.
In current hard disk drives, the read/write head measures changes in magnetism as it moves across the disk. But in an alternative scheme called “racetrack” memory, you move the data, not the head. That data could be a ribbon of skyrmions encoding the 1s and 0s. This system could be more efficient and use much less power.
Just recently our team discovered a way to make skyrmions at room temperature that can be controlled with a relatively small electric current. The field has made a lot of progress in the last three to five years, but we still need to shrink the skyrmion bubbles and to understand more about their behavior and motion at different scales.”
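Conceptually, racetrack memory behaves like a shift register: the stored bit pattern streams past a fixed read head instead of a head moving over static data. A toy sketch (my illustration, not the team’s design):

```python
# Toy shift-register model of racetrack memory. A skyrmion at a site
# encodes 1, its absence encodes 0; a current pulse shifts the whole
# ribbon one site, so a fixed head reads the data as it streams past.
from collections import deque

class RacetrackToy:
    def __init__(self, bits):
        self.track = deque(bits)

    def pulse(self):
        """One current pulse: every bit advances one position along the track."""
        self.track.rotate(-1)

    def read_head(self):
        """Fixed read head samples whatever bit sits at position 0."""
        return self.track[0]

track = RacetrackToy([1, 0, 1, 1, 0])
readout = []
for _ in range(5):              # read a bit, then pulse the current
    readout.append(track.read_head())
    track.pulse()
assert readout == [1, 0, 1, 1, 0]   # the stored pattern streamed past the head
```

The potential power saving comes from having no mechanical parts: only the magnetic pattern moves, driven by small current pulses.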
Director, Center for Nanoscale Materials
“The basic architecture of the computer hasn’t changed in 60 or 70 years. But as we go forward, the computing workloads we need to run are increasingly different, and that is driving a sense that we need new architectures.
It’s not a question of figuring out a new device. I think it has to come from the other direction: a new architecture that may demand a new type of device—not the way we do it now, where physicists come up with a new device and then try to figure out how to use it for computing. We need to come up with new architectures that solve problems more efficiently, in both computing time and power, and then find the circuits, devices, processes and materials to build those architectures.
To get there, one of my goals at the Center for Nanoscale Materials is to increase the interaction of physicists, chemists and materials scientists with computer scientists—an interaction that could be much stronger than it is today.”
Deputy Director, Institute for Molecular Engineering
“In the past 15 years we’ve made enormous strides in using quantum systems to store and manipulate information. In a classical computer, information is encoded in binary (1 or 0). But an atom can exist in a superposition of many quantum states at once, and this can be used to store and process information in a fundamentally different way. In a quantum system, a single computational cycle could perform highly complex calculations and could solve certain problems that are difficult for even the fastest supercomputers.
In addition, as the act of measurement can change the state of a quantum system, it could offer inherently secure communication. If anyone tries to eavesdrop, they will end up destroying the message.
Quantum technology is rapidly progressing and may soon have an impact on many parts of society. If you had told someone 10 years ago that you wanted to control the quantum states of atomic nuclei at room temperature, they would have thought you were crazy. Now we do it routinely.”
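One way to see the “fundamentally different way” of storing information: a classical n-bit register holds one of 2^n values, while an n-qubit state is described by 2^n amplitudes all at once. A minimal sketch in plain Python (my illustration, not the institute’s work), for the simplest case where every basis state is equally weighted:

```python
# Illustrative sketch: the state of an n-qubit register is a list of
# 2**n amplitudes. Here we build the uniform superposition (the state
# after a Hadamard gate on each qubit of |00...0>).
import math

def uniform_superposition(n_qubits):
    """Return the 2**n equal amplitudes of the uniform superposition."""
    dim = 2 ** n_qubits
    amp = 1.0 / math.sqrt(dim)
    return [amp] * dim

for n in (1, 2, 3, 10):
    state = uniform_superposition(n)
    # A classical n-bit register holds ONE of these dim values; the
    # quantum state carries an amplitude for every one simultaneously.
    total_prob = sum(a * a for a in state)
    print(n, "qubits ->", len(state), "amplitudes, total probability",
          round(total_prob, 6))
```

The exponential growth of that amplitude list with n is what makes quantum states hard to simulate classically, and what a quantum computer exploits directly.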