I've pointed to finding applications that are embarrassingly parallel (EP), meaning it is essentially trivial for them to use parallel hardware, as a key way to avoid problems. Why should that be the case?
Actually, what I’m trying to find are not just EP applications, but rather killer apps for mass-market parallel systems: mass-market applications that force many people to buy a “manycore” system, because it’s the only thing that runs them well. In fact, I’m now thinking that EP was probably a misstatement, because that’s really not the point. If a killer app’s parallelism is so convoluted that only three people on the planet understand it, who cares? The main requirement is that the other 6.6 billion minus 3 of us can’t live without it (or at least a sufficiently large fraction of that 6.6B-3 can’t). Those three might get obscenely rich. That’s OK, just so long as the rest of us don’t go down the tubes.
I’m thinking here by analogy to two prior computer industry revolutions: the introduction of the personal computer, and the cluster takeover. The latter is slightly less well known, but it happened all the same.
The introduction of the PC was clearly a revolution. It created the first mass market for microprocessors and related gear, as well as the first mass market for software. Prior to that, both were boutique products: a substantial boutique, but still a boutique. Certainly IBM’s adoption of the PC spurred this; it meant software developers could rely on there being a product that would last long enough, and be widespread enough, for them to afford to invest in product development. But none of that would have mattered if, at the time this revolution began, microprocessors hadn’t gotten just (barely) fast enough to run a killer application: the spreadsheet. Later, with a bit more horsepower, desktop publishing became a second killer app (and boosted Macs substantially), but the first big bump was the spreadsheet. It was the reason businesses purchased PCs in large numbers, and that kicked everything into gear: the fortuitous combination of a particular form and level of computing power, and an application using that form and power.
Note that nobody really lost in the PC revolution. A few players, Intel and Microsoft in particular, won big. But revenue for other microprocessor vendors didn’t decline; they just didn’t see the enormous increases those two did.
The takeover of servers by clusters was another revolution, ignited by another conjunction of fast-enough microprocessors and a killer app. This time, the killer app was the Internet. Microprocessors got just fast enough to provide adequate turnaround time for a straightforward “read this page” web request, and once that happened you simply didn’t need single big computers to serve up your website any more, no matter how many requests came in: just spray the requests across a pile of very cheap servers. This revolution definitely had losers. It was a big part of what was referred to within IBM as “our near-death experience”; it’s hard to fight a cost difference of about 300X (the figure I heard estimated within IBM development for mainframes vs. commodity clusters, in the cases where clusters were adequate; a big sore point there). Later Sun succumbed to this too, even though it started out as one of the spears in IBM’s flank. Once again, capability plus application produced revolution.
We have another revolution going on now in the turn to explicit parallelism. But so far, it’s not the same kind of revolution: the sea change in computer power is there, but the killer app is not, or at least it has not yet been widely recognized.
We need that killer app. Not a simpler means of parallel programming. What are you going to do with that programming? Parallel Powerpoint? The mind reels.
(By the way, this is perfect: it’s exactly what I hoped would happen when I started this blog. Trying to respond to a question, I rethought a position and ended up with a stronger story as a result. Thanks to David Chess for his comment!)