In 1965 Gordon Moore published his paper "Cramming more components onto integrated circuits" and became famous for what is now called Moore's law. Moore's law states that the number of transistors on a computer processor doubles every 2 years[2]. In this paper he predicted a whole revolution in computing and communication driven by this integration, such as the advent of home computers and cell phones (the picture above, taken from his paper, illustrates this):
"Integrated circuits will lead to such wonders as home computersor - at least terminals connected to a central computer - automatic controls for automobiles, and personal portable communications equipment".
Moore's statement proved very successful, especially because the number of transistors is tied to every other computing metric, processor performance for example. If I had read this article in 1965, I would have thought mathematically and reasoned that this pace of integration could not be sustained for very long. Moore thought so too: his prediction was only for the following 10 years, in other words, until 1975.
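Just to make the exponential concrete, here is a quick back-of-the-envelope check in Python (my own sketch, nothing from Moore's paper; the 2-year doubling period is the revised figure from [2]):

```python
def growth_factor(years, doubling_period=2):
    """Transistor-count multiplier after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

print(growth_factor(10))  # 32.0 -> Moore's own 10-year horizon (1965-1975)
print(growth_factor(40))  # 1048576.0 -> roughly a million-fold over 40 years
```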
To understand why the prediction held, it helps to look at what the industry did with all those transistors. Processor performance grew in proportion to integration, not just because of faster clock cycles but mainly because the extra transistors went into amazing architectural improvements: pipelining, superscalar execution, branch prediction, and so on.
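To give a flavor of what those improvements buy, here is the classic textbook pipelining arithmetic (standard material I am adding, not something from Moore's paper): an n-instruction program on a k-stage pipeline takes k + (n - 1) cycles instead of n * k, so the speedup approaches k for long instruction streams.

```python
def pipeline_speedup(n_instructions, k_stages):
    """Ideal speedup of a k-stage pipeline over unpipelined execution."""
    unpipelined = n_instructions * k_stages       # n * k cycles in total
    pipelined = k_stages + (n_instructions - 1)   # fill once, then 1 per cycle
    return unpipelined / pipelined

print(pipeline_speedup(1_000_000, 5))  # ~5x from the same clock speed
```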
But what would such numbers have sounded like back then? I can imagine myself, in the late 60's, talking to some "mates":
- What do you think of Gordon’s paper? – I would have asked
- Well, in 10 years we may reach 32 times more transistors. It seems feasible – somebody would have said.
- What if we could keep this up for, let's say, at least the next 40 years? What if computers could really be found on every desk? – I wish I had said this :-)
- Wow – someone would say – but this would mean 1,000,000 times more transistors. And, of course, tons of computers.
- Imagine the performance growth, the memory growth, …
- But – there is always a skeptic – who on Earth would need that much performance at his desk?
So, from the time of Moore's paper until today, that skeptical question has been asked again and again, and it has always been answered. The fact is that computers got faster and applications got far more complex, not necessarily in that order. Today we use computers for applications that would have been unthinkable decades ago, for the simple reason that they could only arise now, when the right conditions exist.
By now you might be asking yourself where I am going with this. Well, my dear reader, for a number of reasons (power limitations, heat, …) the processor industry decided to use the extra transistors to put more than one processor, or core, on a chip. Multi-cores are now the fashion in computer architecture and everybody is discussing them. It is certainly a big change of direction.
Cores per processor are growing (2, 4, 6, 8, 16 …) and advancing to the desktop. Some people then started asking: for how long? How many cores can we use on the desktop? Who on Earth will need so many cores at their desk?
And this is exactly the point. Most of today's applications are not really able to take advantage of several cores. Does this mean we have reached the point where we no longer need to buy new computers? Will I be happy for several years with my 6-core computer? Will the computer industry finally slow down on computer sales?
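A standard way to quantify that limitation is Amdahl's law (my addition; I am naming it here, the argument above does not): if only a fraction p of a program can run in parallel, n cores give a speedup of 1 / ((1 - p) + p / n), which hits a hard ceiling of 1 / (1 - p) no matter how many cores you add.

```python
def amdahl_speedup(p, n_cores):
    """Amdahl's law: speedup when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n_cores)

for cores in (2, 4, 6, 8, 16):
    print(cores, round(amdahl_speedup(0.5, cores), 2))
# With p = 0.5 the speedup never reaches 2x, however many cores we buy.
```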
Some people think so, but I disagree. I believe it is just a matter of time before current applications start multi-threading everywhere and put our cores to use (see the sketch after this paragraph). But this, in my opinion, will not be the reason we buy next-generation hardware. I believe a whole set of new applications will arise precisely because today we have all this processing power "for free": applications we can barely imagine today and applications we cannot yet think of. Really good speech transcription (I would not need to type this post), image recognition, and photo/video search. What about asking the computer for all the videos, pictures and documents where your child appears? What about the ones where the day was sunny? Your laptop will drive your car, based on traffic info, while you dictate a memo.
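As a minimal sketch of what "multi-threading everywhere" could look like (my own illustration; the work function is a hypothetical stand-in for any CPU-bound task), a program can split an embarrassingly parallel job across however many cores the machine has:

```python
import os
from concurrent.futures import ProcessPoolExecutor  # processes sidestep Python's GIL

def work(chunk):
    """Stand-in for real per-chunk computation."""
    return sum(i * i for i in chunk)

if __name__ == "__main__":
    n = os.cpu_count() or 1
    # Deal the iteration space out to the cores in interleaved strides.
    chunks = [range(i, 10_000_000, n) for i in range(n)]
    with ProcessPoolExecutor(max_workers=n) as pool:
        total = sum(pool.map(work, chunks))
    print(f"{total} computed on {n} cores")
```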
Either multi-cores will slow down the computer industry and we will be satisfied with our desktop computers for longer, or the race will keep going and amazing new hardware and software will emerge. The third possibility is the advent of a totally different model of computer business. Choose your future scenario and tell me your opinion.
I am not good at "futurology" at all, but I believe cores are here to stay, and programmers will find their way to use them, giving us types of functionality we did not even know we wanted but will not be able to live without. And this technology race will keep on going.
For how long? For at least a couple of decades, by which time the business itself might have changed.
Where is the limit? "To infinity and beyond"[1].
[1] Buzz Lightyear, in Toy Story.
[2] Initially he said the count would double every year, but he later revised this to a doubling every couple of years.