Thursday, April 2, 2009

Twilight of the GPU?

If a new offering really works, it may substantially shrink the market for gaming consoles and high-end gamer PCs. Their demo runs Crysis – a game known to melt down extreme PCs – on a Dell Studio 15 with Intel's minimalist integrated graphics. Or on a Mac. Or on a TV, with their little box for the controls. The markets for Nvidia CUDA, Intel Larrabee, IBM Cell, and AMD Fusion are all affected. And so much for cheap MFLOPS for HPC, riding on huge gamer GPU volumes.

This game-changer – pun absolutely intended – is Onlive. It came out of stealth mode a few days ago with an announcement and a demo presentation (more here and here) at the 2009 Game Developers' Conference. It's in private beta now, and scheduled for public beta in the summer followed by full availability in winter 2009. Here's a shot from the demo showing some Crysis player-versus-player action on that Dell:



What Onlive does seems kind of obvious: Run the game on a farm/cloud that hosts all the hairy graphics hardware, and stream compressed images back to the players' systems. Then the clients don't have to do any lifting heavier than displaying 720p streaming video, at 60 frames per second, in a browser plugin. If you're thinking "Cloud computing meets gaming," well, so are a lot of other people. It's true, for some definitions of cloud computing. (Of course, there are so many definitions that almost anything is true for some definition of cloud computing.) Also, it's the brainchild of Steve Perlman, creator of Web TV, now known as MSN TV.
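
To make that division of labor concrete, here's a minimal sketch of the loop in Python. None of these class or method names are Onlive's; it's a hypothetical illustration of the architecture just described: inputs go up, compressed frames come down, and the only real work left on the client is video decode.

# Hypothetical sketch of the thin-client streaming loop described above, not
# Onlive's actual protocol or API. All rendering happens in the server farm;
# the client only forwards inputs and decodes video.

class StreamingGameClient:
    def __init__(self, server, display, input_device):
        self.server = server              # connection to the remote render farm
        self.display = display            # anything that can show 720p video
        self.input_device = input_device  # keyboard, mouse, or gamepad

    def run_frame(self):
        # 1. Ship the latest user inputs upstream (tiny packets).
        self.server.send_inputs(self.input_device.poll())

        # 2. Receive the next compressed frame the farm rendered for this player.
        compressed = self.server.receive_frame()

        # 3. The heaviest lifting left on the client: video decode and display.
        self.display.show(decode_720p(compressed))

def decode_720p(compressed_frame):
    # Stand-in for a hardware or browser-plugin video decoder.
    raise NotImplementedError("placeholder for a real 720p decoder")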

Now, I'm sure some of you are saying "Been there, done that, doesn't work" because I left out something crucial: The client also must send user inputs back to the server, and the server must respond. This cannot be just a typical render cloud, like those used to produce cinematic animation.

That's where a huge potential hitch lies: Lag. How quickly you get round-trip response to inputs. Onlive must target First Person Shooters (FPS), a.k.a. twitch games; they're collectively the largest sellers. How well a player does in such games depends on the player's reaction time – well, among other things, but reaction time is key. If there's perceived lag between your twitch and the movement of your weapon, dirt bike, or whatever, Onlive will be shunned like the plague because it will have missed the point: Lag destroys your immersion in the game. Keeping lag imperceptible, while displaying a realistic image, is the real reason for high-end graphics.

Of course Onlive claims to have solved that problem. But, while crucial, lag isn't the only issue; here's the list, which I'll elaborate on below: lag, angry ISPs, and game selection & pricing.

Lag

The golden number is 150 msec. If the displayed response to user inputs is even a few msec. longer than that, games feel rubbery. That's 150 msec. to get the input, package it for transport, ship it out through the Internet, get it back in the server, do the game simulation to figure out what objects to change, change them (often, a lot: BOOM!), update the resulting image, compress it, package that, send it back to the client, decompress it, and get it on the screen.
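
To make that budget concrete, here's a back-of-the-envelope breakdown in Python. Every number is my own illustrative guess, not anything Onlive has published; the point is only that the pieces can plausibly sum to under 150 msec.

# Hypothetical round-trip latency budget; all figures are illustrative guesses.
budget_ms = {
    "client: capture and package input": 2,
    "network: client to server":         20,
    "server: game simulation":           17,  # roughly one tick at 60 Hz
    "server: render frame":              17,  # roughly one frame at 60 Hz
    "server: compress and packetize":    10,
    "network: server to client":         20,
    "client: decompress and display":    25,  # decode plus display refresh
}

total = sum(budget_ms.values())
print(f"Total: {total} msec of a 150 msec budget ({150 - total} msec to spare)")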

There is robust skepticism that this is possible.

The Onlive folks are of course exquisitely aware that this is a key problem, and spent significant time talking vaguely about how they solved it. Naturally such talk involves mention of roughly a gazillion patents in areas like compression.

They also said their servers were "like nothing else." I don't doubt that in the slightest. Not very many servers have high-end GPUs attached, nor do they have the client chipsets that provide 16-lane PCIe attachment for those GPUs. Interestingly, there's a workstation chipset for Intel's just-announced Xeon 5500 (Nehalem) with four 16-lane PCIe ports.

I wouldn't be surprised to find some TCP/IP protocol acceleration involved, too, and who knows – maybe some FPGAs on memory busses to do the compression gruntwork?

Those must be pretty toasty servers. Check out the fan on this typical high-end GPU (Diamond Multimedia using ATI 4870):


The comparable Nvidia GeForce GTX 295 is rated as consuming 289 Watts. (I used the Diamond/ATI picture because it's sexier-looking than the Nvidia card.) Since games like Crysis can soak up two or more of these cards – there are proprietary inter-card interconnects specifically for that purpose – "toasty" may be a gross understatement. Incendiary, more likely.

In addition, the Onlive presenters made a big deal out of users having to be within a 1000-mile radius of the servers, since beyond that the speed of light starts messing things up. So if you're not that close to their initial server farms in California, Texas, and Virginia (and possibly elsewhere), you're out of luck. I think at least part of the time spent on this was just geek coolness: Wow, we get to talk about the speed of light for real!
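
For what it's worth, the speed-of-light point is easy to sanity-check. The figures below are my own rough approximations, not Onlive's:

# Propagation delay alone over a 1000-mile radius, using approximate figures.
miles = 1000
km = miles * 1.609
fiber_speed_km_per_ms = 200.0  # light in fiber travels at roughly 2/3 of c

one_way_ms = km / fiber_speed_km_per_ms
print(f"one way: {one_way_ms:.1f} msec, round trip: {2 * one_way_ms:.1f} msec")
# About 8 msec each way, 16 msec round trip, and that is before routers,
# queuing, and less-than-straight fiber paths eat further into the 150 msec budget.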

Well, can they make it work? Adrian Covert reported on Gizmodo about his hands-on experience trying the beta, playing Bioshock at Onlive's booth at the GDC (the server farm was about 50 miles away). He saw lag "just enough to not feel natural, but hardly enough to really detract from gameplay." So you won't mind unless you're a "competitive" gamer. There were, though, a number of compression artifacts, particularly when water and fire effects dominated the screen. Indoor scenes were good, and of course whatever they did with the demo beach scene in Crysis worked wonderfully.

So it sounds like this can be made to work, if you have a few extra HVAC units around to handle the heat. It's not perfect, but it sounds adequate.

Angry ISPs

But will you be allowed to run it? The bandwidth requirements are pretty fierce.

HD 720p bandwidth, with normal ATSC MPEG-2 compression, rarely goes over 3 Mb/sec. Given that, I'm inclined to take at face value Onlive's claim to require only 1.5 Mb/s for games. But 1.5 Mb/s translates into 675 megabytes per hour. Ouch. A good RPG can clock 50-100 hours before replay, and when a player is immersed that can happen in a nearly continuous shot, interrupted only by biology. That's passing-a-kidney-stone level of bandwidth ouch.
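
The conversion behind that number is simple unit arithmetic; the 1.5 Mb/s rate is Onlive's claim, the rest is just bits-to-bytes and seconds-to-hours:

# Bandwidth-to-volume conversion for the claimed 1.5 Mb/s game stream.
def megabytes_per_hour(megabits_per_second):
    return megabits_per_second / 8 * 3600  # bits to bytes, then seconds to an hour

print(megabytes_per_hour(1.5))       # 675.0 MB per hour
print(megabytes_per_hour(1.5) * 75)  # about 50,000 MB over a 75-hour RPG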

NOTE: See update below. It's likely even worse than that.

Some ISPs already have throttling and bandwidth caps in place. They won't let Onlive take over their networks. Tiered pricing may become the order of the day, or, possibly, an ISP may offer the gaming service itself and bill for the bandwidth invisibly, inside the subscription (as is done today for some on-demand TV). Otherwise, ISPs aren't likely to subsidize Onlive. Just limit bandwidth a little, or add just a bit of latency…

Net Neutrality, anyone?

Game Selection and Pricing

This is an area that Onlive seems to have nailed. Atari, Codemasters, Eidos, Electronic Arts, Epic, Take Two, THQ, Ubisoft, and Warner Bros. are all confirmed publishers. That's despite this being a new platform for them to support, something done only reluctantly. How many PC games are on the Mac?

I can see them making that investment, though, for two reasons: it reduces piracy, and it offers a wide range of pricing and selling options they don't have now.

The piracy issue is pretty obvious. It's kind of hard to crack a game and share it around when the game itself never leaves Onlive's servers; all you ever receive is video.

As for pricing and selling, well, I can't imagine how much drooling is going on among game publishers over those. There are lots of additional ways to make money, as well as a big reduction in barriers to sale. Obviously, there are rentals. Then rent-to-own options. Free metered demos. Spectating: You can watch other people playing – the show-offs, anyway – to get a feel for a game before jumping in yourself. All this can dramatically reduce the hump everybody has to get over before shelling out $40-$60 for a new high-end title, and could broaden the base of potential customers for high-end games substantially. It could also, of course, give a real shot in the arm to more casual games.

Of course, this assumes nobody gets greedy here and strangles the whole thing with prices people aren't willing to pay. I expect some experimentation before things settle down.

A final note before leaving this general topic: Massively multiplayer game and virtual world publishers (e.g., Blizzard (World of Warcraft), Linden Lab (Second Life)) were conspicuously absent from the list. This may not be a temporary situation. You could surely run, for example, the WoW client on Onlive – but the client is useless unless it is talking to a WoW server farm somewhere.

Impact on the GPU Market

According to HPCWire's Editor's blog, "GPU computing is the most compelling technology to come on the scene in recent memory" because GPUs have become general enough to do a wide variety of computationally intensive tasks, the software support has become adequate, and, oh, yes, they're dirt cheap MFLOPS. Relatively speaking. Because they're produced in much larger volumes than the HPC market would induce. Will Onlive change this situation? It's hard to imagine that it won't have at least some impact.

Obviously, people with Dell Studio 15s aren't buying GPUs. They never would have, anyway – they're in the 50% of the GPU market Intel repeatedly says it has thanks to integrated graphics built into the chipset. The other end of the market, power gamers, will still buy GPUs; they won't be satisfied with what Onlive provides. That's another 10% spoken for. What of the remaining 40%? If Onlive works, I think it will ultimately be significantly impacted.

Many of those middle-range people may simply not bother getting a higher-function GPU. Systems with a decent-performing GPU, and the memory needed to run significant games, cost substantially more than those without; I know this directly, since I've been pricing them recently. A laptop with >2GB of RAM, a >200GB disk, and a good Nvidia or ATI GPU with 256KB or more memory pushes or surpasses $2000. This compares with substantially under $1000 for an otherwise perfectly adequate laptop if you don't want decent game performance. Really, it compares with a flat $0, since without gaming as the driver, the modest CPU performance gains on offer wouldn't even warrant a replacement.

GPUs, however, won't completely disappear. Onlive itself needs them to run the games. Will that replace the lost volume? Somewhat, but it's certain Onlive's central facilities will have higher utilization than individuals' personal systems. That utilization won't be as high as it could be with a worldwide sharing facility, since the 1000-mile radius prohibits the natural exploitation of time zones.

They did mention using virtualization to host multiple lower-intensity games on each server; that could be used to dramatically increase utilization. However, if they've managed to virtualize GPUs, my hat's off to them. It may be possible to share them among games, but not easily; no GPU I know of was designed to be virtualized, so changes to DirectX will undoubtedly be needed. If this kind of use becomes common, virtualization may be supported, but it probably won't be soon. (Larrabee may have an advantage here; at least its virtualization architecture, Intel VT, is already well-defined.)

There are other uses of GPUs: Microsoft's Windows Vista Aero interface, for example, and the "3D Web" (like this effort). These, however, generally expect to ride on GPU volumes driven by the multi-billion-dollar game market. They're not volume drivers (although Microsoft thought Aero would be); they're followers.

If Onlive succeeds – and as noted above, it may be possible but has some speed bumps ahead – the market for games may actually increase while the GPU market sinks dramatically.

--------------------------------------------------

Acknowledgement: The ISP issue was originally raised by a colleague in email discussion. He's previously asked to remain incognito, so I'll respect that.

UPDATE: 

Well, I misheard some numbers. Onlive needs 1.5 Mbps for standard-TV resolution; they say it requires 5 Mbps for 720p HDTV images. That's just achievable with current CATV facilities, and works out to roughly 2.25 GB downloaded per hour. That really puts it beyond the pale unless the service provider is itself involved in providing the service. (In my defense, when I tried to find out the real bandwidth for 720p video, I found contradictions. This What Exactly is HDTV? site, which I used above, seems pretty authoritative, and has the 3 Mbps number. But this site is incredibly detailed, and claims everybody needs major compression to fit into the congressionally allocated 18 Mbps television channel bandwidth; uncompressed, it's about 885 Mbps. So I can't claim to understand where this all ends up in detail, but it really doesn't matter: clearly it is a very bad problem.)
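
The same conversion as before, at the corrected rate; 5 Mb/s is Onlive's stated HD requirement, the rest is unit math:

# Hourly download volume at the 5 Mb/s rate for 720p.
rate_mbps = 5
gb_per_hour = rate_mbps / 8 * 3600 / 1000  # Mb/s to MB/s, to MB/hour, to GB/hour
print(f"{gb_per_hour:.2f} GB downloaded per hour")  # 2.25 GB per hour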

Thanks again to my anonymous colleague, who pointed this out.

By the way, welcome to all the Mac folks directed here from The Tao of Mac. I appreciate the reference, and I fully understand why this may be of interest. If I could run Fallout 3 (and Oblivion) on a Mac without going through the hassle of Boot Camp, I’d have one now. I’m just totally disgusted with Vista.

8 comments:

Igor Ostrovsky said...

This is an interesting possibility, but as always, there are too many variables to predict anything with certainty.

If it works, it could be a strong competitor for gaming laptops.

The cost argument is harder to make against gaming consoles, as they are fairly cheap. But, some of the features like fast game delivery do sound compelling.

Rex Guo said...

"..a good Nvidia or ATI GPU with 256KB or more memory.."

I think you meant 256MB.

But yes, Onlive definitely has the opportunity to grow the games market. While the FPS genre is more popular now, there's no reason why new game genres cannot be created that rely less on low latency, thus making them more synergistic with Onlive.

Greg Pfister said...

Rex,

Oops - thanks for the catch. Yes, 256MB.

There may indeed be new game genres. Hard to rely on that, though.

Greg

Anonymous said...

Hi, I enjoyed reading your post. Can you tell me where you got that golden number of 150ms from? From playing Quake, Unreal Tournament, and Quake Live I can tell that playing with more than 60ms isn't much fun anymore. At least not if you want to win. I guess for single player games like Mass Effect this number can be higher. But for competitive play with other humans the delay cannot be small enough.

Greg Pfister said...

Source - you know what, I forget! I remember finding it generally floating around the web. Some examples, from searching just now:

http://bit.ly/9qWd6a - A paper on effects of local lag on virtual environments. It says 0-150ms is OK, above that is bad.

http://bit.ly/9tWX3T - which is about measured lag in a QuakeIII Arena tournament, specifically calling out 150 ms. But it's from 2001, so maybe things have gotten more competitive since then.

http://bit.ly/c4gPwY - this one cites other research that agrees with you, putting the threshold for visual feedback down at 67 ms.; but their own work on haptic feedback indicates it's OK up to 200ms.

So, maybe "it depends." I think if they manage 150ms., they'll be OK. I don't think truly serious twitch gamers, with reflexes far better than mine are now, will ever use it, anyway. That still leaves lots of gamers.

Anonymous said...

Anonymous: you're probably just looking at network latency. Keep in mind that there's also lag introduced due to rendering time and even your display, if you're not using a CRT. Eurogamer.net's Digital Foundry section has had several good articles on this; start with this one and follow the links for more information.

The advantage that Onlive has here is that they can use high-end GPUs and so on that most gamers, except for those with rather expensive tricked-out PCs, don't have. (Keep in mind that the 7th generation console technology is pretty old and slow these days--they should be able to render a frame several times faster than a console fairly easily.)

That said, the problem they're attacking is far from trivial even in theory, much less in practice.

And Greg, sheesh, just get a PS3 or something if you want to play Fallout 3. You'll spend less than a sixth of what you would on a well-configured computer, and avoid the configuration hassles. (That said, Fallout 3 has still wedged my PS3 a dozen times in my last 60 hours or so of gameplay.)

Greg Pfister said...

Curt,

Thanks for the knowledgeable comment -- here and the others you made recently.

PS3 - yeah, that's a better solution, I agree, and cheaper, if I can believably promise myself to get a lower-end laptop next time. Stupid Vaio, which I bought for the Nvidia card (and low weight), now can't even run the new SL client: it correctly detects the GPU, uses it, and overheats in about 3 minutes. :-(

Anonymous said...

Yes, and it makes a great Blu-ray player, too!

Oh, one thing I didn't mention about that, though, is that of course you'll want an HDTV if you're going to use a PS3. Living in Japan it just didn't come to mind, because almost nobody has a standard definition TV any more. (The only place you can even buy a new SD TV here in Japan would be as part of your cellphone or iPod or something like that.)

For me it was a no-brainer, since I'm a big movie buff and have had an HD projector for more than half a decade now. The games were sort of a nice bonus, as I hadn't been much of a gamer for, well, decades, but they sucked me back in. The recent generation of games, in particular some of the Sony exclusives such as the Uncharted series, have incredible production values, and are dragging in a lot of the art of cinema now. (One example is the rise of motion capture, especially when done at the same time as voice acting, though that must frustrate a number of graphics researchers who've been trying to develop algorithms for natural body movement over the past few decades.) And then the geek side really dragged me in: following the progress and comments from game development studios is quite fascinating. It's also fun to see the developers' geek factor influence the games. One comment that particularly stuck in my mind was in the commentary videos that came with Uncharted: they said that they really wanted a lot of water throughout the game because they'd just developed some code that allowed them to make water look really, really nice for the first time when doing real-time rendering.
