Each transistor in a traditional computer represents either a "1" or a "0", the basic language of computing. The more transistors you can fit on a chip, working efficiently with one another and with the rest of the machine, the more calculations per second you can achieve within a single reasonably sized box. Each transistor also draws a minute amount of electricity, which generates heat (that's why your laptop and desktop computers get warm), and too much heat would melt the system itself.
Thus, making a computer that outperforms the last version and keeps pace with progress (whether following Moore's Law, faster, or slower) is a challenge of balance and design.
And since we're already at the 10nm transistor scale today, about 100 hydrogen atoms across, there is not much room left to shrink: a single stray atom in the manufacturing process can ruin a transistor, making these chips less cost-effective and less reliable to produce. Some people think we won't be able to go much smaller than this.
However, thanks to new manufacturing techniques, we've recently been able to produce transistors at the 7nm and then 5nm scale (we won't see these in products before 2019 at the earliest). A 5nm transistor is only about 50 hydrogen atoms wide, so there are hard limits to how much smaller we can go.
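To put those sizes in perspective, here is the back-of-the-envelope arithmetic, as a rough sketch that takes a hydrogen atom to be about 0.1 nm in diameter:

```python
# Rough scale check: how many hydrogen atoms span a transistor feature?
# Assumes a hydrogen atom diameter of ~0.1 nm (about 1 angstrom).
HYDROGEN_DIAMETER_NM = 0.1

for feature_nm in (10, 7, 5):
    atoms_across = feature_nm / HYDROGEN_DIAMETER_NM
    print(f"{feature_nm} nm is about {atoms_across:.0f} hydrogen atoms across")
# 10 nm -> ~100 atoms, 7 nm -> ~70 atoms, 5 nm -> ~50 atoms
```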
So, in order to resolve these issues and keep making computers more powerful without growing them in size, we need to find other ways forward than shrinking transistors.
I discuss the implications of this in the video below:
In a nutshell, though, we could do one of the following:
- Find a way to work with subatomic particles to emulate "1" and "0" transistors. This would mean using single atoms in some way to get discrete signals. I haven't seen any research going in that direction, since we're not even sure how subatomic particles behave within atoms concretely enough to leverage them in this fashion.
- Use quantum state bits, or qubits. Instead of transistors limited to "1" or "0", we would use quantum devices of roughly the same scale that can hold multiple states at once. For example, a traditional computer with 2 bits has 4 possible states (00, 01, 10, and 11) but can only be in one of them at a time; 2 qubits could process all four states simultaneously. The best quantum computer built so far operates with 16 qubits, which can span 2^16 = 65,536 states at once (see the first sketch after this list). If we started using qubit-driven computers for processing, we could increase the power of computers exponentially without sacrificing size at all. A friend of mine who is well connected in quantum computing recently told me that quantum computers could go commercial as soon as 2018.
- Change the computer architecture to make it more efficient and less energy-hungry. Usually, a processing unit and the memory it works with are bundled together as a core. We connect cores together to increase the amount of information processed simultaneously, but since each core has to shuttle information back and forth with its own memory before producing results, there is a lot of traffic inside each core that takes extra time and heats up the system. Hewlett Packard seems to have figured out a way to use a single shared memory block for all processors instead, with each processor communicating with that memory over optics, which is faster, more energy-efficient, and thus runs much cooler. There is also less interconnect traffic, since all processors use the same central memory system (see the second sketch after this list). This architecture change makes machines cooler and more modular: we could add processors (or new cores) to the system without worrying about memory. Less communication within the system means less energy consumed and less heat. It can significantly increase computing speed and shrink the machines, since far less empty space is needed for ventilation. Improvements here are likely not exponential, but this is a great piece of the puzzle. Combined with quantum computing for the processing itself, incredible speeds and an exponential increase in simultaneous computations per second could be reached.
- Last but not least, good research has been done on a brand-new type of transistor meant to emulate biological neurons: it takes inputs from different sources in a weighted fashion, learns from the frequency of those inputs, and adapts to provide different outputs through "experience" (see the third sketch after this list). Though these would not fit neatly within the frameworks of conventional computing like the advancements discussed in the three points above, one can see the advantage of transistors that learn on the fly and adapt based on usage, particularly for learning artificial intelligence systems. We would then get a system where the software is naturally able to learn, but where the hardware itself also has a learning component that simulates experience.
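To make the qubit point concrete, here is a minimal Python sketch (using numpy) of the state-counting idea. It simulates the bookkeeping classically, so it only illustrates why 2 qubits span 4 states and 16 qubits span 65,536, not an actual quantum speedup:

```python
import numpy as np

# A classical n-bit register holds ONE of 2**n values at a time; an n-qubit
# register holds an amplitude for ALL 2**n values simultaneously.
def uniform_superposition(n_qubits: int) -> np.ndarray:
    """Statevector after applying a Hadamard gate to each of n qubits."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim))

state = uniform_superposition(2)
for index, amplitude in enumerate(state):
    print(f"|{index:02b}>: probability {abs(amplitude) ** 2:.2f}")
# Each of the 4 basis states (00, 01, 10, 11) carries probability 0.25.

print(2 ** 16)  # 65536 -- the number of states a 16-qubit register spans
```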
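For the shared-memory idea, here is a deliberately crude toy model of the data movement involved; it is my own illustration of the principle, not a description of Hewlett Packard's actual design:

```python
# Toy accounting of data movement: compare copying a dataset into each
# core's private memory versus addressing one shared central pool.
def bytes_moved_private(n_cores: int, dataset_bytes: int) -> int:
    # Every core needs its own copy of the working set.
    return n_cores * dataset_bytes

def bytes_moved_shared(n_cores: int, dataset_bytes: int) -> int:
    # One copy lives in the shared pool; all cores address it directly.
    return dataset_bytes

n_cores, dataset = 64, 10 * 1024**3  # 64 cores, a 10 GiB working set
print(bytes_moved_private(n_cores, dataset) / 1024**3)  # 640.0 GiB shuffled
print(bytes_moved_shared(n_cores, dataset) / 1024**3)   # 10.0 GiB
```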
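And for the neuron-like transistor, here is a small sketch of the behavior such a device aims to provide, modeled in software with a crude Hebbian-style weight update (my own simplification, not the actual device physics):

```python
# A neuron-like unit with weighted inputs whose weights drift with usage,
# so frequently seen input patterns shape future outputs ("experience").
class AdaptiveNeuron:
    def __init__(self, n_inputs: int, learning_rate: float = 0.05):
        self.weights = [1.0 / n_inputs] * n_inputs
        self.learning_rate = learning_rate

    def fire(self, inputs: list[float]) -> float:
        # Weighted sum of inputs; the unit fires if it crosses a threshold.
        activation = sum(w * x for w, x in zip(self.weights, inputs))
        output = 1.0 if activation > 0.5 else 0.0
        # Hebbian-style adaptation: strengthen weights on active inputs
        # whenever the unit fires.
        if output:
            self.weights = [w + self.learning_rate * x
                            for w, x in zip(self.weights, inputs)]
        return output

neuron = AdaptiveNeuron(n_inputs=3)
for _ in range(5):
    neuron.fire([1.0, 1.0, 0.0])  # a frequently seen pattern
print(neuron.weights)             # weights on the first two inputs grew
```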
All in all, this is exciting stuff that should allow Moore's Law to continue on its exponential path for a while longer. Can I say whether or not Moore's Law will eventually break down? I really can't. But given our creativity in finding novel ways to keep computing power increasing, I can only assume for now that over the next few years we'll keep finding new ways to make computers better, more efficient, faster, smaller, and smarter.
All we need to do is keep encouraging creativity, and we'll all reap the rewards of improved support in our personal and professional lives.