If a new offering really works, it may substantially shrink the market for gaming consoles and high-end gamer PCs. Their demo runs Crysis – a game known to melt down extreme PCs – on a Dell Studio 15 with Intel's minimalist integrated graphics. Or on a Mac. Or on a TV, with their little box for the controls. The markets for Nvidia CUDA, Intel Larrabee, IBM Cell, and AMD Fusion are all affected. And so much for cheap MFLOPS for HPC, riding on huge gamer GPU volumes.
This game-changer – pun absolutely intended – is Onlive. It came out of stealth mode a few days ago with an announcement and a demo presentation (more here and here) at the 2009 Game Developers' Conference. It's in private beta now, and scheduled for public beta in the summer followed by full availability in winter 2009. Here's a shot from the demo showing some Crysis player-versus-player action on that Dell:
What Onlive does seems kind of obvious: Run the game on a farm/cloud that hosts all the hairy graphics hardware, and stream compressed images back to the players' systems. Then the clients don't have to do any lifting heavier than displaying 720p streaming video, at 60 frames per second, in a browser plugin. If you're thinking "Cloud computing meets gaming," well, so are a lot of other people. It's true, for some definitions of cloud computing. (Of course, there are so many definitions that almost anything is true for some definition of cloud computing.) Also, it's the brainchild of Steve Perlman, creator of Web TV, now known as MSN TV.
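To make that division of labor concrete, here's a minimal sketch of what such a thin client might look like. The host name, port, and wire format below are entirely my own assumptions for illustration; Onlive hasn't published its protocol.

    # Minimal sketch of a thin cloud-gaming client: the server does all the
    # rendering; the client only ships input events up and displays the
    # compressed frames coming back. Host, port, and message framing are
    # illustrative assumptions, not Onlive's actual protocol.
    import socket
    import struct

    SERVER = ("gameservice.example.com", 7001)   # hypothetical endpoint

    def recv_exact(sock: socket.socket, n: int) -> bytes:
        """Read exactly n bytes from the socket."""
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("server closed the stream")
            buf += chunk
        return buf

    def send_input(sock: socket.socket, key_code: int, pressed: bool) -> None:
        # Assumed 3-byte event: tag 'I', key code, pressed flag.
        sock.sendall(struct.pack("!cBB", b"I", key_code, int(pressed)))

    def recv_frame(sock: socket.socket) -> bytes:
        # Assumed framing: 4-byte length prefix, then compressed video data.
        (length,) = struct.unpack("!I", recv_exact(sock, 4))
        return recv_exact(sock, length)

    def play() -> None:
        with socket.create_connection(SERVER) as sock:
            while True:
                send_input(sock, key_code=0x57, pressed=True)  # e.g. 'W' held
                frame = recv_frame(sock)   # one compressed 720p frame
                # A real client would hand `frame` to a hardware video decoder
                # and blit the result; nothing heavier happens on this end.

The point of the sketch is how little the client does; everything interesting happens back in the server farm.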
Now, I'm sure some of you are saying "Been there, done that, doesn't work" because I left out something crucial: The client also must send user inputs back to the server, and the server must respond. This can't just be a typical render cloud, like the farms used to produce cinematic animation for film.
That's where a huge potential hitch lies: Lag. How quickly you get round-trip response to inputs. Onlive must target First Person Shooters (FPS), a.k.a. twitch games; they're collectively the largest sellers. How well a player does in such games depends on the player's reaction time – well, among other things, but reaction time is key. If there's perceived lag between your twitch and the movement of your weapon, dirt bike, or whatever, Onlive will be shunned like the plague because it will have missed the point: Lag destroys your immersion in the game. Keeping lag imperceptible, while displaying a realistic image, is the real reason for high-end graphics.
Of course Onlive claims to have solved that problem. But, while crucial, lag isn't the only issue; here's the list, which I'll elaborate on below: lag, angry ISPs, and game selection & pricing.
Lag
The golden number is 150 msec. If the displayed response to user inputs is even a few msec. longer than that, games feel rubbery. That's 150 msec. to get the input, package it for transport, ship it out through the Internet, get it back in the server, do the game simulation to figure out what objects to change, change them (often, a lot: BOOM!), update the resulting image, compress it, package that, send it back to the client, decompress it, and get it on the screen.
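To see how tight 150 msec. is, here's a back-of-the-envelope tally of that pipeline. Every per-stage number is an illustrative guess of mine, not an Onlive figure:

    # Back-of-the-envelope round-trip budget for one hosted-game frame.
    # All per-stage values are illustrative guesses, not measured Onlive data.
    BUDGET_MS = 150

    stages_ms = {
        "capture and packetize input":       2,
        "client -> server over Internet":   25,
        "game simulation (one 60 Hz tick)":  17,
        "render frame on server GPU":        17,
        "compress frame":                    10,
        "server -> client over Internet":    25,
        "decompress frame":                   5,
        "display (one 60 Hz refresh)":       17,
    }

    total = sum(stages_ms.values())
    for stage, ms in stages_ms.items():
        print(f"{stage:34} {ms:3d} ms")
    print(f"{'total':34} {total:3d} ms  (budget: {BUDGET_MS} ms)")
    # With these guesses the round trip is ~118 ms: under budget, but with only
    # ~30 ms of slack for Wi-Fi retries, congested links, or a busy server.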
There is robust skepticism that this is possible.
The Onlive folks are of course exquisitely aware that this is a key problem, and spent significant time talking vaguely about how they solved it. Naturally such talk involves mention of roughly a gazillion patents in areas like compression.
They also said their servers were "like nothing else." I don't doubt that in the slightest. Not very many servers have high-end GPUs attached, nor do they have the client chipsets that provide 16-lane PCIe attachment for those GPUs. Interestingly, there's a workstation chipset shown for Intel's just-announced Xeon 5500 (Nehalem) with four 16-lane PCIe ports.
I wouldn't be surprised to find some TCP/IP protocol acceleration involved, too, and who knows – maybe some FPGAs on memory busses to do the compression gruntwork?
Those must be pretty toasty servers. Check out the fan on this typical high-end GPU (Diamond Multimedia using ATI 4870):
The comparable Nvidia GeForce GTX 295 is rated as consuming 289 Watts. (I used the Diamond/ATI picture because it's sexier-looking than the Nvidia card.) Since games like Crysis can soak up two or more of these cards – there are proprietary inter-card interconnects specifically for that purpose – "toasty" may be a gross understatement. Incendiary, more likely.
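For a sense of scale, here's a rough power tally for one such server, assuming two GTX 295-class cards; the per-component and rack-density numbers are my own guesses, not anything Onlive has disclosed:

    # Rough server power tally, assuming two GTX 295-class cards per server.
    # Every number here is my own guess for illustration, not an Onlive spec.
    GPU_WATTS = 289          # Nvidia's rating for one GeForce GTX 295
    GPUS_PER_SERVER = 2      # games like Crysis can use two or more cards
    OTHER_WATTS = 250        # assumed: CPU, memory, disk, fans, power losses

    server_watts = GPUS_PER_SERVER * GPU_WATTS + OTHER_WATTS
    servers_per_rack = 20    # assumed density; real packaging is unknown
    rack_kw = server_watts * servers_per_rack / 1000

    print(f"per server: ~{server_watts} W")    # ~828 W
    print(f"per rack:   ~{rack_kw:.1f} kW")    # ~16.6 kW
    # A conventional data-center rack is often provisioned for well under
    # 10 kW, so "toasty" is fair: this needs serious power and cooling work.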
In addition, the Onlive presenters made a big deal out of users having to be within a 1000-mile radius of the servers, since beyond that the speed of light starts messing things up. So if you're not that close to their initial server farms – in California, Texas, Virginia, and possibly elsewhere – you're out of luck. I think at least part of the time spent on this was just geek coolness: Wow, we get to talk about the speed of light for real!
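The 1000-mile figure is easy to sanity-check: light in fiber travels at roughly two-thirds of its vacuum speed, and real routes aren't straight lines. A quick calculation, where the route-overhead factor is my own assumption:

    # Why distance to the server farm matters: pure propagation delay.
    # Light in fiber moves at roughly 2/3 c; the routing factor is a guess.
    C_MILES_PER_MS = 186.3    # speed of light in vacuum, miles per millisecond
    FIBER_FRACTION = 0.66     # light in glass travels at ~2/3 c
    ROUTE_OVERHEAD = 1.5      # assumed: real network paths aren't straight

    def round_trip_ms(miles: float) -> float:
        one_way = miles / (C_MILES_PER_MS * FIBER_FRACTION) * ROUTE_OVERHEAD
        return 2 * one_way

    for miles in (50, 500, 1000, 2000):
        print(f"{miles:5d} miles: ~{round_trip_ms(miles):5.1f} ms round trip")
    # ~1000 miles already costs on the order of 24 ms of the 150 ms budget
    # before any queueing, rendering, or compression; 2000+ miles starts to hurt.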
Well, can they make it work? Adrian Covert reported on Gizmodo about his hands-on experience trying the beta, playing Bioshock at Onlive's booth at the GDC (the server farm was about 50 miles away). He saw lag "just enough to not feel natural, but hardly enough to really detract from gameplay." So you won't mind unless you're a "competitive" gamer. There were, though, a number of compression artifacts, particularly when water and fire effects dominated the screen. Indoor scenes were good, and of course whatever they did with the demo beach scene in Crysis worked wonderfully.
So it sounds like this can be made to work, if you have a few extra HVAC units around to handle the heat. It's not perfect, but it sounds adequate.
Angry ISPs
But will you be allowed to run it? The bandwidth requirements are pretty fierce.
HD 720p bandwidth, with normal ATSC MPEG-2 compression, rarely goes over 3 Mb/sec. Given that, I'm inclined to take at face value Onlive's claim to require only 1.5 Mb/s for games. But 1.5Mb/s translates into 675 Megabytes per hour. Ouch. A good RPG can clock 50-100 hours before replay, and when a player is immersed that can happen in a nearly continuous shot, interrupted only by biology. That's passing-a-kidney-stone level of bandwidth ouch.
NOTE: See update below. It's likely even worse than that.
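The arithmetic behind that ouch, and behind the worse number in the update at the end of the post, is straightforward; the 75-hour RPG is just an illustrative figure:

    # Convert a streaming bitrate into the download volume an ISP actually sees.
    def gigabytes_per_hour(megabits_per_second: float) -> float:
        # Mb/s -> megabytes per hour -> gigabytes per hour
        return megabits_per_second * 3600 / 8 / 1000

    for label, mbps in (("standard-definition stream", 1.5),
                        ("720p HD stream (see update)", 5.0)):
        gb_hr = gigabytes_per_hour(mbps)
        print(f"{label:30} {mbps:3.1f} Mb/s = {gb_hr:5.3f} GB/hour"
              f" = {gb_hr * 75:6.1f} GB for a 75-hour RPG")
    # 1.5 Mb/s -> 0.675 GB/hour (~51 GB for that RPG);
    # 5.0 Mb/s -> 2.25 GB/hour (~169 GB), which is what upsets the ISPs.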
Some ISPs already have throttling and bandwidth caps in place. They won't let Onlive take over their networks. Tiered pricing may become the order of the day, or, possibly, the ISP may offer the gaming service itself and bill for the bandwidth invisibly, inside the subscription (this is done today for some on-demand TV). Otherwise, they're not likely to subsidize Onlive. Just limit bandwidth a little, or add just a bit of latency…
Net Neutrality, anyone?
Game Selection and Pricing
This is an area that Onlive seems to have nailed. Atari, Codemasters, Eidos, Electronic Arts, Epic, Take Two, THQ, Ubisoft, and Warner Bros. are all confirmed publishers. That's despite this being a new platform for them to support, something done only reluctantly. How many PC games are on the Mac?
I can see them investing that much in it, though, for two reasons: It reduces piracy, and it offers a wide range of pricing and selling options they don't have now.
The piracy issue is pretty obvious. It's kind of hard to crack a game and share it around when you're playing it on a TV.
As for pricing and selling, well, I can't imagine how much drooling is going on by game publishers over those. There are lots of additional ways to make money, as well as a big reduction of barriers to sale. Obviously, there are rentals. Then rent-to-own options. Free metered demos. Spectating: You can watch other people playing – the show-offs, anyway – to get a feel for a game before jumping in yourself. All this can dramatically reduce the hump everybody has to get over before shelling out $40-$60 for a new high-end title, and could broaden the base of potential customers for high-end games substantially. It could also, of course, give a real shot in the arm to more casual games.
Of course, this assumes nobody gets greedy and strangles the whole thing with prices people aren't willing to pay. I expect some experimentation there before things settle down.
A final note before leaving this general topic: Massively multiplayer game and virtual world publishers (e.g., Blizzard (World of Warcraft), Linden Lab (Second Life)) were conspicuously absent from the list. This may not be a temporary situation. You could surely run, for example, the WoW client on Onlive – but the client is useless unless it is talking to a WoW server farm somewhere.
Impact on the GPU Market
According to HPCWire's Editor's blog, "GPU computing is the most compelling technology to come on the scene in recent memory" because GPUs have become general enough to do a wide variety of computationally intensive tasks, the software support has become adequate, and, oh, yes, they're dirt cheap MFLOPS. Relatively speaking. Because they're produced in much larger volumes than the HPC market alone would ever induce. Will Onlive change this situation? It's hard to imagine that it won't have at least some impact.
Obviously, people with Dell Studio 15s aren't buying GPUs. They never would have, anyway – they're in the 50% of the GPU market Intel repeatedly says it has thanks to integrated graphics built into the chipset. The other end of the market, power gamers, will still buy GPUs; they won't be satisfied with what Onlive provides. That's another 10% spoken for. What of the remaining 40%? If Onlive works, I think that middle segment will ultimately be significantly impacted.
Many of those middle-range people may simply not bother getting a higher-function GPU. Systems with a decent-performing GPU, and the memory needed to run significant games, cost substantially more than those without; I know this directly, since I've been pricing them recently. A laptop with >2GB of RAM, >200GB of disk, and a good Nvidia or ATI GPU with 256MB or more of its own memory pushes or surpasses $2000. That compares with substantially under $1000 for an otherwise perfectly adequate laptop if you don't need decent game performance. Really, it compares with a flat $0, since without the gaming requirement the modest gains in CPU performance wouldn't even warrant a replacement.
GPUs, however, won't completely disappear. Onlive itself needs them to run the games. Will that replace the lost volume? Somewhat, but it's certain Onlive's central facilities will have higher utilization than individuals' personal systems. That utilization won't be as high as it could be with a worldwide sharing facility, since the 1000-mile radius prohibits the natural exploitation of time zones.
They did mention using virtualization to host multiple lower-intensity games on each server; that could be used to dramatically increase utilization. However, if they've managed to virtualize GPUs, my hat's off to them. It may be possible to share them among games, but not easily; no GPU I know of was designed to be virtualized, so changes to DirectX will undoubtedly be needed. If this kind of use becomes common, virtualization may be supported, but it probably won't be soon. (Larrabee may have an advantage here; at least its virtualization architecture, Intel VT, is already well-defined.)
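To make the utilization argument concrete, here's a toy sketch of the kind of packing a hosting facility could do if GPU sharing really worked: heavy titles get a whole board, lighter ones get co-scheduled. The slot counts and titles are invented for illustration; nothing here reflects how Onlive actually schedules work.

    # Toy scheduler: pack game sessions onto GPU boards by "intensity".
    # A heavy title (Crysis-class) takes a whole board; casual titles can share.
    # Capacities are invented to illustrate the utilization argument.
    from dataclasses import dataclass, field
    from typing import List

    GPU_CAPACITY = 4  # assumed "slots" per GPU board

    @dataclass
    class GpuServer:
        free_slots: int = GPU_CAPACITY
        sessions: List[str] = field(default_factory=list)

    def place(servers: List[GpuServer], title: str, slots_needed: int) -> None:
        # First-fit: reuse a partially loaded board before powering a new one.
        for server in servers:
            if server.free_slots >= slots_needed:
                server.free_slots -= slots_needed
                server.sessions.append(title)
                return
        servers.append(GpuServer(free_slots=GPU_CAPACITY - slots_needed,
                                 sessions=[title]))

    servers: List[GpuServer] = []
    demand = [("Crysis", 4), ("Crysis", 4), ("puzzle game", 1),
              ("platformer", 1), ("card game", 1), ("RPG", 2)]
    for title, slots in demand:
        place(servers, title, slots)

    print(f"{len(servers)} GPU boards for {len(demand)} sessions")
    for i, s in enumerate(servers):
        used = GPU_CAPACITY - s.free_slots
        print(f"  board {i}: {s.sessions} ({used}/{GPU_CAPACITY} slots used)")

With this toy demand, six sessions fit on four boards instead of six; that's the kind of consolidation that would partially offset the lost retail GPU volume.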
There are other uses of GPUs. Microsoft Windows Vista's Aero interface, for example, and the "3D Web" (like this effort). These, however, generally expect to ride on GPU volumes driven by the multi-billion-dollar game market. They're not drivers (although Microsoft thought Aero would be); they're followers.
If Onlive succeeds – and as noted above, it may be possible but has some speed bumps ahead – the market for games may actually increase while the GPU market sinks dramatically.
--------------------------------------------------
Acknowledgement: The ISP issue was originally raised by a colleague in an email discussion. He has previously asked to remain incognito, so I'll respect that.
UPDATE:
Well, I misheard some numbers. Onlive needs 1.5 Mbps for standard-TV resolution; they say it requires 5 Mbps for 720p HDTV images. That's just achievable with current CATV facilities, and works out to about 2.25 GB downloaded per hour. That really puts it beyond the pale unless the service provider is itself engaged in providing the service. (In my defense, when I tried to find out the real bandwidth for 720p video, I found contradictions. This What Exactly is HDTV? site, which I used above, seems pretty authoritative, and has the 3 Mbps number. But this site is incredibly detailed, and claims everybody needs major compression to fit into the congressionally allocated 18 Mbps television channel bandwidth; uncompressed, it's about 885 Mbps.) So I can't claim to understand where this all ends up in detail, but it really doesn't matter: Clearly it is a very bad problem.
Thanks again to my anonymous colleague, who pointed this out.
By the way, welcome to all the Mac folks directed here from The Tao of Mac. I appreciate the reference, and I fully understand why this may be of interest. If I could run Fallout 3 (and Oblivion) on a Mac without going through the hassle of Boot Camp, I’d have one now. I’m just totally disgusted with Vista.