A startup recently crammed 100 cores onto a single chip. While that prototype is not ready for general use, Andy Bechtolsheim (co-founder of Sun Microsystems) recently predicted that, in about a decade, 64 cores will be the norm, but that each core will have no more memory available to it than cores do today.
This raises the question: why? Who needs massively multi-core? Who are we building this for?
Mobile devices and laptops? Minimizing power consumption is far more important there. Desktops and gaming boxes? Most desktop applications barely tax the systems we have, and gaming systems need specialized GPUs, not more CPUs. Servers? The massive clusters run by Google, Yahoo, and Microsoft crave memory and network bandwidth, not more CPUs.
Adding to the difficulty, we currently have no idea how to use all these cores, especially on mobile or desktop systems. Current programming models simply cannot find the parallelism, and there is considerable doubt that they ever will. On servers, there may be some hope that carefully constructed independent workloads could balance demands on bandwidth, memory, I/O, and CPU, but, even there, finding enough exclusively CPU-heavy workloads to keep all those cores busy seems unlikely.
Rather than more cores, it seems most of us would rather have lower power consumption and more memory. We want mobile devices to run for days without recharging. On the desktop and in the data center, we want lower power bills and fast access to data. We do not want more CPUs.
Industry segmentation seems to have blinded us to what we should build. CPU manufacturers are pushing their piece in isolation, ignoring the big picture of what consumers want. Consumers do not need more CPUs. Consumers do not need massively multi-core.
For a radically different look at massively multi-core, please see also my earlier post, "What to do with those idle cores?"