Nvidia 480-Core Graphics Card Approaches 2 Teraflops 261
An anonymous reader writes "At CES, Nvidia has announced a graphics card with 480 cores that can crank up performance to reach close to 2 teraflops. The company's GTX 295 graphics card has two GPUs with 240 cores each that can execute graphics and other computing tasks like video processing. The card delivers 1.788 teraflops of performance, which Nvidia claims is the fastest single graphics card on the market."
But will it run Crysis?... (Score:5, Interesting)
Yes (Score:3, Informative)
The problem with video card reviews is they don't bother testing anything lower than 1920x1080, which is 2.25x bigger than 720p (1920x1080 = 2,073,600 pixels vs. 1280x720 = 921,600).
Crysis takes a lot to run but it has already been tamed as long as you aren't running at 2560x1600 or some other absurd resolution.
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
Right, but I don't know very many people with 1920x1200 displays. I have one, and my 18-month old GPU can run Crysis and any other game just fine at that rez, but practically everyone else I know is still at 1280x1024 or 1680x1050.
Realistically, reviewers should find the resolution and settings at which a game is playable, meaning 25-30 fps average for most games. Sure, it's funny to know that Crysis will get 8 fps at 2560x1600 with 16x AA+AF, but if that's what the reviewers think even hardcore gamers e
Re: (Score:3, Insightful)
1920x1200 is the most preferred resolution because it is the native resolution of most of the 24" panels. If you don't play at native resolution, you get to experience glorious scaling artifacts. Glorious, glorious scaling artifacts.
Re: (Score:3, Insightful)
Almost makes me pine for the days of the CRT. ... Well, maybe not exactly. I don't want to imagine how heavy a 24" or larger CRT would be, but I'd love for another technology not locked to a single native resolution to break through the never-ending sea of fixed-pixel devices. For now, I just run my LCD in the scaled "maintain aspect" mode on my Radeon and enjoy the black borders on non-native resolutions. Better than that nice blurry stretch effect I'd get otherwise!
Re: (Score:2)
I was thinking about this. I am picking up a 6gb ram/4870x2/i7 920 setup and kept thinking: "Why not just run 1680x1050 dual monitors with like 16xAA"
The thing about being able to run the current generation of games at 2560x1600 is that there isn't a chance in hell you'll be able to do it with the same setup a year later, as games will be too demanding, and lowering the resolution while preserving the aspect ratio probably makes everything look like crap. Not to mention how disappointing that would be.
Car analogy
Re:But will it run Crysis?... (Score:4, Insightful)
Re: (Score:2)
Something tells me that you have no idea whatsoever.
UID right? Too high. You should be at least in the 5 digit range to make a claim like that.
Re:But will it run Crysis?... (Score:5, Funny)
A year from now, people won't be talking about Crysis anymore. Bigger and better games will be out. Such is the nature of the gaming industry.
The fact that Crysis has great graphics doesn't mean it's a great game. As an avid gamer for over 20 years, I can say without a doubt that on average there is no correlation between good graphics and good games.
In addition to my 20 years of gaming, I've got a 5 digit UID. I am therefore an authoritative source on the subject.
Re:But will it run Crysis?... (Score:5, Funny)
Where can I get one of these five digit things you speak of? OMG, I am so behind the times!
Re: (Score:3, Funny)
Re: (Score:2)
My point? A lot of work can go into making a game the *best*, whether it be the prettiest or the most technical, but if it fails to run well on hardware even a few generations later, then something is wrong.
(I do believe Doom ]|[ does run quite well on current top of the line GPUs, so the comparison isn't c
Re: (Score:2)
Re: (Score:2, Troll)
Re: (Score:2)
Mod me troll if you like, but on second thought, I'll post AC just in case.
Re:But will it run Crysis?... (Score:5, Insightful)
Well, I do know what goes into a game like Crysis, being a 3D game programmer and all. Those programmers were very, very good, believe me. Some of the stuff they pulled off is just amazing.
The reason Crysis is slow is the artistic direction. Outdoor environments full of plants and shadows with a huge viewing distance are very hard to implement in a 3D engine. I mean really fucking hard. Making a game like that playable at all is a tradeoff: either two scraggly trees on a flat green carpet that pretends to be grass, OR an enormous amount of research into optimization techniques that are very hard and time-consuming to implement. The Crysis engine is pretty much the state of the art in optimization. And these guys managed to squeeze in fantastic shader effects on top of that, depth of field, and even some basic radiosity shadowing for the characters!!! That's just insane.
Most reviewers and players with the right hardware thought the game looked amazing, way better than its peers at the time, or even now. I thought the effects (especially in the spaceship) looked better than most Sci-Fi movies, which is a stunning achievement for a 3D game running on a $500 video card. I upgraded my PC just to play the game, and I thought it was worth it. Lots of people did too:
http://www.penny-arcade.com/comic/2007/10/15/ [penny-arcade.com]
Take your head out of your ass and stop belittling other people's achievements until you have some of your own to compare it to, OK?
Re: (Score:3, Funny)
Dead Space.
2008 GotY for me.
Looks awesome, sounds absolutely amazing, loses very little in the console ports, is a great game, runs great on a variety of hardware, is extremely stable, and was published by EA with SecuRom.
Re:But will it run Crysis?... (Score:4, Interesting)
No game is made for gamers in the future.
Game sales are extremely front loaded.
After a month, 50% of games are in the bargain bin.
Re:But will it run Crysis?... (Score:5, Interesting)
This used to be true, but actually seems to be less true now than it was. When I went to buy a game at Best Buy recently, some of the games with good stock, good display space, and $30+ prices were more than a year old.
The development cost on a tier-1 computer game is high enough now that not many of them get released. There isn't another game to put in the shelf slot if they take down Crysis, and there won't be for another year or so.
Re: (Score:2)
I'm tired of random rants about how Crysis sucks just because it's graphically demanding. They made an incredible game that has continued to take advantage of new hardware. Most games are the opposite: they code backwards so that most people with existing hardware can max them out.
Re: (Score:3, Insightful)
I'll do one better.
Case = bullshit $20 wonderjob at a pawn shop.
PSU - 700w Rocketfish for 70 bucks.
mobo/CPU combo - PC Chips with dual-core AMD Athlon64 X2 5200+ - 60 bucks
RAM - 4GB cheapo RAM - 20 bucks from craigslist.
GPU - 512MB 9800GTX+ - 175 from pricewatch.
Hard Drive - 80GB 7200RPM WD - FREE from craigslist, complete with porn!
Optical drive - DAEMON TOOLS, but I've found the one in my machine for 10 bucks
OS License - XP Pro - 100.
455 bucks, Crysis at 1920x1080 at high settings. I get very few framerat
Power Requirement (Score:5, Funny)
1.21 Jiggawatts
Re: (Score:2, Troll)
What the hell is a Jiggawatt?
Re: (Score:2, Informative)
Re: (Score:2, Interesting)
Actually, the soft g ("j") pronunciation is correct and illiterate computer types abominated it with a hard g. "Back to the Future" wasn't wrong, we are.
Re: (Score:2)
Re:Power Requirement (Score:5, Insightful)
If you really want to go back to the source, "giga" is Greek and uses a "j" sound. [wiktionary.org]
Consider the word "gigantic". It has the same root, "giga". Some people pronounce it with a hard "g", some with a soft "g".
The language is a mess.
Re: (Score:2)
And some people, like me, pronounce it with both a hard 'g' and a soft 'g'.
Re: (Score:3, Interesting)
You're definitely wrong about the pronunciation of gif: http://www.olsenhome.com/gif/ [olsenhome.com]
Re: (Score:3, Informative)
The spec and the creator say you're wrong.
Do you have sources or do you just like telling people they're wrong without any data to back it up?
Here's another source if you trust wikipedia more than random webpages that can't be edited by half the world: http://en.wikipedia.org/wiki/Gif#Pronunciation [wikipedia.org]
Re:Power Requirement (Score:4, Informative)
Apparently the US National Bureau of Standards decided in the 1960s that Jiggawatt was the one true pronunciation.
http://en.wikipedia.org/wiki/Giga [wikipedia.org]
And jif is only correct for the same reason, the developers decided "Choosy developers choose Jif" was a hilarious slogan they could use internally for the gif format.
http://en.wikipedia.org/wiki/Gif#Pronunciation [wikipedia.org]
So yes, Jigabit, Jigabyte, Jigawatt, those are how we are legally supposed to pronounce them, at least in the US.
Re: (Score:2)
WHOOOSH
And what does Marty say after Doc keeps repeating '1.21 Jigowatts' over and over after hearing himself say it on the video tape?
Re: (Score:2)
Jay-Z's so big now he generates his own form of power.
Re: (Score:3, Funny)
Weight has nothing to do with it, Marty!
Re: (Score:2)
It doesn't even take an entire jigabyte of my HD to store my favorite movie, Jodzilla vs Jamera.
Re: (Score:2)
Re: (Score:2)
Not to mention that you have to have the video card traveling at 88 MPH.
Re: (Score:2)
Ok so the card is only really useful for mobile computing. To get that speed you have to be on an interstate in most of the US, or a school zone in california.
Contest... (Score:5, Funny)
Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.
Re:Contest... (Score:5, Funny)
Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.
Yeah, but it's mega-floppy at that.
Re: (Score:2)
Once again we see it's not size, but how you use it. Come on, it fits in any normal PCI slot!
Re:Contest... (Score:5, Funny)
Not quite - They proved they have the biggest number of penises... Making for some interesting crossover potential into the Hentai gaming market.
/ Wonders what "ultra realistic" means as regards H - "Wow, the fur on her tail looks almost real, and her breasts look like actual porcelain!"
Re: (Score:2)
Not quite - They proved they have the biggest number of penises... Making for some interesting crossover potential into the Hentai gaming market.
And to fit all those penises on the card, they had to make sure they were very very small.
Re: (Score:2, Informative)
ATI has a bigger p*nis (Score:3, Informative)
Great... (Score:5, Insightful)
Re:Great... (Score:5, Funny)
when can I get a video card that doesn't take up half my case and doesn't melt down after 6 months of use? Not to mention one that doesn't cost an arm and a leg.
2006?
Re: (Score:2)
Re: (Score:2)
No kidding. I have a BFG Tek 8800GTX that's been replaced five times since I got it. My game system used to be an overclocked affair with several hard drives, but over time I've reduced it to a 700W Corsair PSU, an un-overclockable Intel branded motherboard, one hard disk and stock Crucial RAM, thinking maybe my setup was killing the card... all in an enormous Antec P180 case, which has dedicated cooling for the graphics slot and multiple 120mm fans.
Fucking thing died again a couple weeks ago. Even when it'
Re: (Score:2)
Re: (Score:2)
and Aftermarket Heatsink. My 8800GTX runs just fine tyvm.
Re: (Score:2)
I bought an ATI when my 8800GTS 512 died; I didn't want to play the lottery as to whether the replacement would have the same manufacturing defects [tgdaily.com] or not.
nVidia are going to have to do something pretty special to attract me back after that; putting two of their power hungry barely-fabricatable huge monolithic GPUs on a single card just isn't it.
Re: (Score:2)
The problem seems to be that many video cards ship with inadequate cooling systems. At least that's been my experience. Back in the day, custom cooling solutions were pretty much reserved for those doing serious overclocking. Now cooling requirements have gone up, but manufacturers generally use the bare minimum, such that the GPU doesn't overheat as soon as it's powered up, and nothing more.
I've only got a 7900GTX, but after having it replaced once, and then getting more jaggies, slowdown, and stutterin
Re: (Score:2)
Right now (Score:4, Informative)
One of the benefits of the technology war is that it produces good midrange and low end technology as well. This is particularly true in the case of graphics cards since they are so parallel. They more or less just lop off some of the execution units and maybe slow down the clock and you get a budget card.
Whatever your budget is, there's probably a good card available at that level. Now will it be as fast as the GTX 295? Of course not. However they'll be as fast as they can be at that price/power consumption point.
Don't pitch a fit because some people need/want high-end cards. Enjoy the fact that they help subsidize you getting good, cheap midrange cards.
If you want serious suggestions, tell me your budget range and what you want to do and I'll recommend some cards.
Re: (Score:2)
Amen to that. I got a 9600GT (1GB DDR3) a couple of months ago for a very nice price. It meets all my needs, and then some.
I never buy the latest model of anything. It is simply (for me) not worth it.
Re: (Score:2)
So how'd you solve it? (Score:2)
No kidding! I just ran into my first Nvidia heat-o'-death situation too.
Anyone know of an after-market part to draw air directly over your PCIe cards? This is a problem that's right now solved by the turning-my-graphics-card-into-a-jet-engine solution. It works, but if there's a quieter answer that keeps the graphics power I'd be happy to hear it.
Here's the skinny:
790i comes with 3 PCIe slots so I thought to try SLI with two new cards, and an older one (in the middle thanks to the bridge) for second monitor
Re: (Score:2, Informative)
Re: (Score:2)
Might do it. I've been reading about some directional fans that may help.
Problem is these cards tend to draw air from the face instead of the back. That's no good when your neighbours are mighty-hot already.
Zalman's got a fan I might crack the plastic case on these guys for. I figure a little more space between cards and adding a bunch of surface area might help get some air in that's not coming directly off the other cards.
Thanks a bunch!
Better Question (Score:2)
When *won't* you be able to get a video card that takes up less than half your case and doesn't require its own power supply?
Right now you can still get a high powered graphics card for less than $50 with a small or no fan. But those cards are 2 year old technology. These days all the latest and greatest are essentially a PC within a PC and I doubt the power and cooling requirements will go down with time.
So in 5 years these ridiculously large cards will cost $50, but they'll still be ridiculously large.
10
Re: (Score:2)
Right now you can still get a high powered graphics card for less than $50 with a small or no fan.
Define "high powered", please...
It's a feature (Score:2)
Didn't you know the second power connection to your GPU is actually for the oven/space heater function? So it's actually a feature!
Nvidia realized long ago that to maximize play-time they needed a way for users to cook and stay warm near their PCs.
I've made some mean eggs on my case, recipe came from the included Nvidia cook-book.
-Matt
480 core? (Score:5, Interesting)
Color me doubtful but I suspect it's 480 stream processors which isn't anywhere NEAR the same thing as the "cores" on the CPU or even the core of the GPU.
Why has the press suddenly started to call stream processors "cores"? Marketing?
Re: (Score:3, Insightful)
OpenCL (Score:2)
Will this card support OpenCL?
Why do we bother... (Score:5, Funny)
Re: (Score:2, Informative)
Because this card can only do 1.788 tera-multiply-adds per second. Try instead to have it build a parse tree, then run transformation algorithms on it (chasing pointers all over the place) and so on, like you would while compiling code, and this thing will make the Atom look great.
CPUs are optimized for general computing, GPUs are optimized for stream-oriented numeric computing. Both have their uses, and the ideal is probably a combination of both, as is currently done.
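To make the contrast concrete, here's a rough CUDA sketch of the kind of streaming multiply-add work these cards chew through. This is my own illustration, not anything from the article or Nvidia's SDK samples, and all the names are made up. A parse tree walk has none of this regular, branch-free, every-element-independent structure, which is exactly why the GPU wins here and loses there.

    // saxpy.cu -- minimal, illustrative CUDA kernel (hypothetical example, not from the article).
    // Every thread does one independent multiply-add: no branching, no pointer chasing.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one array element per thread
        if (i < n)
            y[i] = a * x[i] + y[i];                      // the multiply-add those teraflop numbers count
    }

    int main() {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *hx = new float[n], *hy = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc(&dx, bytes);
        cudaMalloc(&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);  // thousands of threads, one tiny program
        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

        printf("y[0] = %f\n", hy[0]);                      // expect 5.0
        cudaFree(dx); cudaFree(dy);
        delete[] hx; delete[] hy;
        return 0;
    }

Compile it with nvcc and the card will happily saturate itself on it; try expressing a recursive tree transform the same way and you'll see the point.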
Re: (Score:2)
This is reaching the point of "why bother with a supercomputer?" If your app can make use of those cards, an enthusiast board with 3-way SLI can deliver more performance than the 4-year-old supercomputer I manage (it debuted at #66). In one PC, with a hell of a lot less to go wrong.
4 years old is pretty old for a supercomputer, but still that amount of computational power is staggering.
Re: (Score:2)
Why do we bother... with CPUs anymore? I'm just going to fill a case with graphics cards and call it a day.
Then you can enjoy the fact that you'll be able to run your anti-virus software 21x faster too. [theinquirer.net]
Just in time... (Score:3, Funny)
... for Windows 7 (or whatever they call Vista now).
Re: (Score:2)
Is that on Doubles or Floats? (Score:2)
Because their Tesla boards post nearly a TFLOP of performance for single precision computing, but only about 78 GFLOPS for double precision.
*sigh* (Score:5, Funny)
Re: (Score:2)
Pissing contest indeed (Score:3, Interesting)
Compare this to the Radeon 4870 X2: two 55nm RV770 GPUs on the same PCB connected by a PCIe bridge, although the card has a "CrossFireX Sideport" interlink (which I think is HyperTransport, although I may be wrong) that directly connects the two GPUs, which isn't enabled in their drivers at the moment. (You can see it on the PCB -- a set of horizontal traces directly linking both GPUs.) One might wonder if they've delayed enabling the direct link because they knew Nvidia would respond this way.
Anyway, it's always great when two companies battle it out, as the consumer always wins.
jdb2
Full Review with Benchmarks of The Card Here: (Score:4, Informative)
I'm sticking with ATI (Score:5, Insightful)
Re: (Score:3, Interesting)
Well, ATI recently announced that they want to start supporting open source drivers again. It's just a matter of time, I hope. Otherwise I'll have to go with Intel for my next chipset.
480 cores and no user's manual (Score:4, Insightful)
I mean seriously, as long as they don't publish the hardware specifications so you can write your own software for it, it's pretty much useless. The only thing you can do with it is play games. And even then you have to fear every little software update, as it might trigger some bug in the binary-only drivers the manufacturer provides.
Re: (Score:3, Interesting)
Well, but if you only have a binary-only interface, you can still only do what the manufacturer allows. And if the manufacturer says that you cannot do whatever you are doing, it can simply stop you from doing it.
But of course you are right, there is a large chance that CPU-based rendering might make dedicated GPUs obsolete again.
Re: (Score:2)
I'm waiting for the latest and greatest supercomputers to have huge GPU farms.
Just wait until they perfect rapid fabrication and live expansion. GPU farming is the future, fabricating additional cores on demand.
Re: (Score:2)
Not gonna happen.
There's a lot of flops, sure, but they're arranged in a long pipeline where the only input is "texture map" and the only output is "frame buffer". That's not much good for general purpose processing.
Oh, and they're only single precision, which wipes out another big chunk of possibilities.
Re:Sounds good but.. (Score:5, Informative)
http://www.realworldtech.com/page.cfm?ArticleID=RWT072405191325&p=2
A single 8800 kills the Cell and the video processor in the PS3 combined.
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:2)
According to:
http://en.wikipedia.org/wiki/Cell_(microprocessor) [wikipedia.org]
The ps3 cell would be capable of 1 teraflop, IF you could keep it fed. The nvidia part is actually getting that level of throughput.
Re: (Score:2)
"The card fits into any normal PCI Express slot."
Re: (Score:3, Informative)
The specs are very specific (lol, get it?).
I take it you haven't seen full-length graphics cards yet? 280s, 8800 GTXs, GX2s, etc., aren't full-length cards, but they're close.
These are full length cards: http://management.cadalyst.com/cadman/Review/AMD-ATI-FireGL-V8600-and-FireGL-V8650-Graphics-Car/ArticleStandard/Article/detail/526886?contextCategoryId=6631 [cadalyst.com]
You can tell the difference by them not only being longer, but having that retention connector at the end (right side of the pictures) which helps stea
Re: (Score:3, Insightful)
Hey now, this man speaks the truth, albeit with a poor choice of words. To use an alternative but equally popular automotive analogy: I may attach a PCIe connector to my car, but that does not therefore mean that my car is suitable for operation inside a standard computer case, much less plugged into an actual PCIe slot.
it's not a problem to implement 52342525113 cores (Score:5, Informative)
Apart from, you know, link length.
The most important thing to understand is that these aren't actually 'cores' in the same sense that your Core 2 Duo has two of them. They're shader units. It works more like SIMD than parallelization, only instead of something like SSE, which can perform a single operation per clock across 4 packed floating-point values, it performs the operation on thousands of them.
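If a concrete example helps, here's a tiny hypothetical CUDA kernel (my sketch, with made-up names and sizes, nothing official) showing why they aren't independent cores: threads execute in 32-wide warps that share one instruction stream, so a data-dependent branch forces the warp to run both paths one after the other. A real CPU core just picks one path and moves on.

    // divergence.cu -- illustrative only: why 480 shader "cores" aren't 480 CPU cores.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void diverge(float *out) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i % 2 == 0)
            out[i] = i * 2.0f;   // even-numbered threads take this path...
        else
            out[i] = i * 0.5f;   // ...then the SAME warp re-executes for the odd threads
    }

    int main() {
        const int n = 256;                       // 256 threads = 8 warps of 32
        float *d_out;
        cudaMalloc(&d_out, n * sizeof(float));
        diverge<<<1, n>>>(d_out);
        float h_out[256];
        cudaMemcpy(h_out, d_out, n * sizeof(float), cudaMemcpyDeviceToHost);
        printf("%.1f %.1f\n", h_out[2], h_out[3]);   // 4.0 and 1.5
        cudaFree(d_out);
        return 0;
    }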
If they could slap a billion or a million or even a thousand shader units on a card without actually reducing performance they would, but they can't. At a certain point the bottleneck becomes link length. You can overcome it by increasing voltage but then heat becomes the issue. This is a large part of the reason transistor count is tied to transistor size. NVIDIA isn't "failing" in this respect, they're just succumbing to the laws of physics.
If they could improve performance by slapping 20 or 4 or even 2 of the *actual* cores on each card they would, but they can't. Because it's not an actual processor, it doesn't have fancy features like three levels of cache and a TLB and branch prediction and out-of-order execution. But even if they were engineered to work this way, you can't improve PC performance by slapping in a thousand Core 2 Quads either. A part of the reason Xeons have so much cache is so you can mitigate the penalty of having 8 processors using commodity RAM, but eventually you run up against that bottleneck. Shared resources become saturated much faster than most people expect.
The most efficient way of improving graphics performance is with SLI because you are replicating all of the hardware, the memory and the bus the *actual* core depends on. For the exact same reason, you can extract the most performance out of each CPU core by putting each one in a different machine.
Re: (Score:3, Insightful)
Re: (Score:3, Insightful)
A part of the software design process is how to break up the main application into the different components. With multi-threading, the design needs to figure out what can be handled in a different thread, and if having a different thread for that function is worth the code administration needed to tie things all together.
Remember, it is fairly easy to make a different thread and have it do what you want it to do. The difficulty is in how to tie the different threads together to make the application work
Re: (Score:2, Funny)
Yeah, but you'll need all that power to run Windows 8
Re: (Score:2, Informative)
What the fuck are you smoking? It's a $500 card.
Re: (Score:2)
Current PC games are utilizing these latest generation cards NOW...
I use the predecessor to this card (Nvidia's GTX280 GPU with its 240 'cores') to play the latest FPS games at 1920x1200 and it runs a Folding@Home GPU CUDA client whenever it is not gaming...
If I had one of these new $500 GTX295s I could run my games even faster, or even assign one of the GPUs with its 240 'cores' to physics processing (A/K/A Nvidia PhysX, F/K/A Ageia PhysX)
Re: (Score:2)
Re: (Score:2)
You owe me a cup of coffee and a new keyboard, anonymous friend.
LOL!