Comments on "The Perils of Parallel: A Larrabee in Every PC and Mac" by Greg Pfister (11 comments)

Rex Guo, 2009-09-08 05:28:
I was at Nvidia HQ earlier this year, and I popped the "x86 part" question to a guy at Director level. He simply said, "We've been doing microprocessors for a long time."

If the case of Intel licensing x64 extensions from AMD is anything to go by, I think it's only a matter of time before Intel and Nvidia work out a cross-licensing deal. They should know that fragmenting the market at this level can only hurt everyone. Does anyone know under what terms Transmeta got their license for the x86 ISA?

Anonymous, 2009-09-07 08:56:
"If you build it, they [the apps] will come..."

MSeenu, 2009-09-05 09:42:
Greg,
I just bought a Logitech QuickCam Pro 9000 camera for $100, and Microsoft is coming out with an HD camera as well. As far as compression goes, Nvidia and ATI have demonstrated that H.264 encoding can be accelerated 3-4x.

I think it is the combination of all three, large broadband pipes, HD cameras, and GPUs, that provides the technology inflection point for videoconferencing. Further, the recession has pushed businesses to seriously consider videoconferencing as an alternative to air travel.

I agree, time will tell whether it's a killer app or not! But in the absence of any other candidates, it's worth watching.

MSeenu

Greg Pfister, 2009-09-04 17:16:
Good idea. What's the price tag for an HD camera? I'd have to assume someone would come up with a way to do major compression with all those ops, but I'd be willing to make that assumption.

But doesn't this presume that what's holding back videoconferencing now is bad picture quality? I'm not so sure of that. Every laptop has a lower-definition camera now, and it hasn't taken over the world as far as I can tell.

My take: surely worth working on, but probably not "killer." I could be wrong.

Greg

MSeenu, 2009-09-04 12:14:
How about high-definition videoconferencing as a killer app?
With bidirectional Internet bandwidth crossing the 1 Mbit/s mark, GPUs providing teraflops, and high-definition webcams becoming affordable, videoconferencing is all set to take off!
Greg Pfister, 2009-09-04 11:30:
@Igor,

I agree, Larrabee is a leap, and that's a major issue I didn't mention.

Who knows, Larrabee may end up as successful as Itanium. Both aim at parallelism, of different kinds. Hey, that's probably worth another post to discuss.

Great comments, everybody.

Greg

Anonymous, 2009-09-04 00:25:
As far as "the future is parallel" stories go, this one is fairly plausible. Integration between the CPU and the GPU may well be unavoidable.

Of course, at this point even the success of Larrabee is hard to predict. Larrabee is a big leap from any existing hardware, and great technologies are usually developed incrementally. Nvidia, for example, arguably had a more gradual path toward its GPGPUs. I will be very impressed if Intel pulls this off.

Igor

Greg Pfister, 2009-09-03 23:34:
I tend to avoid The Inquirer. Bit-tech said last March, though, that at a Morgan Stanley Technology Conference, "Nvidia's senior vice president of investor relations and communications, Michael Hara, was asked when Nvidia would want to get into the general-purpose microprocessor business. Hara said that 'the question is not so much I think if; I think the question is when.'"

See http://bit.ly/15McbQ .

Maybe they could partner with someone who has a license.
I've no clue who, though.Greg Pfisterhttps://www.blogger.com/profile/12651996181651540140noreply@blogger.comtag:blogger.com,1999:blog-3155908228127841862.post-43729210252939179612009-09-03T23:17:24.799-06:002009-09-03T23:17:24.799-06:00Interesting to hear you buy into The Inquirer'...Interesting to hear you buy into The Inquirer's Nvidia X86 rumors. Your explanation certainly motivates it but leaves me wondering whether they can actually legally do it. Presumably Intel and AMD have no reason to extend any X86 license to them?Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3155908228127841862.post-7686621906281401622009-09-03T22:56:48.771-06:002009-09-03T22:56:48.771-06:00Thanks for the quick catch! A little googling indi...Thanks for the quick catch! A little googling indicated the 4870 X2 is now out, at 2.4 TF, so I just inserted a "4". :-)<br /><br />GregGreg Pfisterhttps://www.blogger.com/profile/12651996181651540140noreply@blogger.comtag:blogger.com,1999:blog-3155908228127841862.post-14022951427028553762009-09-03T22:43:11.180-06:002009-09-03T22:43:11.180-06:00Good piece, but the flops number you quoted for AT...Good piece, but the flops number you quoted for ATI GPUs is wrong. Their last year GPUs reach 1.2 teraflop/s, while in a week or so ATI will launch their new architecture which is likely to be rated 2+ teraflop/s.Anonymousnoreply@blogger.com