Mentions of the phrase heterogeneous computing have been on the rise in the past few years and will continue to rise for years to come, because heterogeneous computing is here to stay. What is heterogeneous computing, and why is it becoming the norm? How do we deal with it, from both the software side and the hardware side? This article provides answers to some of these questions and presents different points of view on others.
Let's start with the easy questions. What is heterogeneous computing? In a nutshell, it is a scheme in which the different computing nodes have different capabilities and/or different ways of executing instructions. Because it involves multiple computing nodes, a heterogeneous system is by definition a parallel system (single-core systems are almost ancient history). When multicore systems first appeared, they were homogeneous: all the cores were similar. Even then, moving from sequential programming to parallel programming, which used to be an area only for niche programmers, was a big jump. In heterogeneous computing, the cores are different, which makes the jump bigger still.
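To make the definition concrete, here is a minimal sketch, assuming a machine with an OpenCL runtime installed (OpenCL is one widely supported interface for heterogeneous systems; its use here is illustrative, not something the discussion above prescribes). It enumerates the compute devices visible on the first platform, and on typical hardware it prints a mix of device types, for example a CPU alongside a GPU:

```c
/* Sketch: list the compute devices on one machine via the OpenCL C API.
 * Assumes an OpenCL runtime and headers are installed; build with e.g.
 * cc list_devices.c -lOpenCL (the file name is arbitrary). */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_uint num_devices;

    /* Grab the first available platform (vendor driver/runtime). */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return 1;

    /* First ask how many devices of any type the platform exposes. */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 0, NULL,
                       &num_devices) != CL_SUCCESS) return 1;

    cl_device_id devices[16];
    if (num_devices > 16) num_devices = 16;  /* cap for the fixed buffer */
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, num_devices, devices, NULL);

    for (cl_uint i = 0; i < num_devices; i++) {
        char name[256];
        cl_device_type type;
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
        clGetDeviceInfo(devices[i], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
        printf("Device %u: %s (%s)\n", i, name,
               (type & CL_DEVICE_TYPE_GPU) ? "GPU" :
               (type & CL_DEVICE_TYPE_CPU) ? "CPU" : "other");
    }
    return 0;
}
```

Seeing two entries with different types in that output is exactly what "the cores are different" means in practice: the programmer must now decide which kind of device runs which part of the work.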