Comments on "The Perils of Parallel: Nvidia-based Cheap Supercomputing Coming to an End"
Blog author: Greg Pfister (http://www.blogger.com/profile/12651996181651540140)

---

Greg Pfister | 2010-10-11 14:51 -06:00 | https://www.blogger.com/profile/12651996181651540140

Hi, Curt. Thanks for that link. Amazing how many comments there are from people denying reality in this case.

Greg

---

Anonymous | 2010-10-11 13:26 -06:00

It seems that NVIDIA themselves are now seeing the beginning of the end and are eating their graphics card partners (http://www.semiaccurate.com/2010/10/05/why-nvidia-killing-their-partners-branded-cards/) in an attempt to make revenue look better (though at the expense of their profit margins).

---

Anonymous | 2010-09-06 13:23 -06:00

Interesting point. It's worth mentioning here that the new Xbox 360 already comes with the CPU and GPU in one package (chip?).

---

Greg Pfister | 2010-08-30 16:56 -06:00 | https://www.blogger.com/profile/12651996181651540140

Oh, and by the way - Clarkdale is not, as I said above, a low-end laptop device; it's mainstream. Sorry. I'll edit the text to fix that.

---

Greg Pfister | 2010-08-30 16:55 -06:00 | https://www.blogger.com/profile/12651996181651540140

@Uncle Joe - see the discussion by Linus Torvalds I referenced in the prior comment: http://tiny.cc/vz79k

@anonymous - +1. Exactly.

@zork - I agree with you; it's a point I would have made myself, had I thought of it.

---

me | 2010-08-20 17:06 -06:00 | https://www.blogger.com/profile/07764659981449970714

Your post has a good point, but I think that rather than being the end of "supercomputing for the masses," integration onto the CPU die is really the beginning of it going mainstream.

CPUs are optimized for branching, and as a result the percentage of die area devoted to cache is approaching 90%. It's very hard to increase throughput under those conditions.

GPUs are optimized for throughput, and the die area devoted to ALUs is 80-90%. That's perfect for uniform data processing, i.e. photos, video, number crunching, etc.

It will be nice to have both, and applications will surely use whichever processor is best. NVidia might have a tough time of it, though.

---

Anonymous | 2010-08-19 17:39 -06:00

One way or another, this is going to have an effect on NVIDIA's dynamics.

Since most of these units ship in PCs, and a great many end users don't really care about graphics performance, the question will be: can it run whatever? There will always be a niche of people who want neat (gfx) hardware bought at a premium, but whether that remains a viable financial proposition is yet to be demonstrated. There's a long history of dead companies that also produced neat hardware, e.g. NeXT and the NeXTcube, or DEC and Alpha; they too had a dedicated following and mindshare...

---

Yale Zhang | 2010-08-15 20:33 -06:00 | https://www.blogger.com/profile/14962451710129452473

I'm not sure about NVIDIA getting most of its subsidies from the low end. I know NVIDIA gets a lot of revenue from its Tesla products. I'm using a Tesla C2050 ($2,400), and architecturally it's the same as a GTX 470 ($500).

The thinking is basically: if you can't afford errors, then you can afford error-free hardware.

---

Greg Pfister | 2010-08-12 19:41 -06:00 | https://www.blogger.com/profile/12651996181651540140

Andrew, I must respectfully disagree.

About the effect of low-end sales, and the issue of high-end/low-end profits: rather than going through a long-winded explanation, I'll refer you to Linus Torvalds' long-winded but excellent explanation over on RWT: http://tiny.cc/vz79k

He makes the case that even if all the profit is from the high end -- *zero* from the low end -- the low end still effectively subsidizes the high end by soaking up fixed costs. I agree with his analysis.

I agree Nvidia has a lot of mind share. At one point, so did Cell -- although Nvidia's probably exceeds Cell's peak by an order of magnitude. The question is whether that mindshare will last when customers have to pay the whole price for their hardware. Their high mindshare may result in a long fall.

Unfortunately, Intel merely has to be adequate for this to happen - not particularly good, just adequate. That isn't a mark they have historically met, but with integration and a massive improvement in silicon technology, adequacy might just happen.

Greg

---

Andrew Richards | 2010-08-12 04:48 -06:00 | https://www.blogger.com/profile/06349106495739189749

NVIDIA makes its money from high-end graphics. That's a mixture of graphics professionals (engineers, designers, video editors, artists...) and gamers. They have added a bit of HPC in there, with CUDA.

The graphics professionals and gamers aren't going to switch to the netbook Fusion chips or Intel's integrated graphics. Maybe some might want a netbook with Fusion instead of an Atom netbook. But Intel's IGP has a really bad reputation with anyone who is serious about graphics (which is where most of NVIDIA's income comes from).

You've got to remember that NVIDIA really has a lot of mindshare among software developers: especially games developers, professional graphics developers, and HPC researchers. Those people (actually, me too) need to have a company like NVIDIA pushing technology forward, whereas Intel with IGP and (to a much lesser extent) ATI have been following behind.

Fusion will push ATI/AMD ahead of NVIDIA in some senses, as long as NVIDIA don't have a secret x86 project up their sleeve and ready to go next year.

NVIDIA's Tegra has really disappointed in the marketplace. They haven't managed to break into a market now dominated by Qualcomm, Samsung, and TI, despite having some great technology.

So, really, all of these projects are making it difficult for NVIDIA to sell their back catalogue of chips. They don't block their ability to sell new GPUs.

What we're seeing is companies saying they can do a better job than NVIDIA, and then finding that it's actually quite hard. NVIDIA definitely have some tough times ahead, but they still have the high-margin end of the market almost to themselves (apart from the period when Cypress was king and Fermi was struggling).