NVIDIA Makes First 4GB Graphics Card

Frogger writes to tell us NVIDIA has released what it is calling the most powerful graphics card in history. With 4GB of graphics memory and 240 CUDA-programmable parallel cores, this monster sure packs a punch; with a $3,500 price tag, it certainly should. Big spenders can rejoice at a shiny new toy, and the rest of us can be happy with the inevitable downward pressure on prices of the more reasonable models.
  • by Jaysyn ( 203771 ) on Monday November 10, 2008 @11:11AM (#25704119) Homepage Journal

    A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.
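
    For scale, here is the address-space arithmetic behind that complaint as a tiny illustrative C program (nothing in it is nVidia-specific):

      #include <stdio.h>
      #include <stdint.h>

      /* A 32-bit virtual address space covers 2^32 bytes = 4 GiB in total.
       * The card's 4 GB of VRAM alone would consume the entire space,
       * before system RAM, kernel space, and MMIO apertures are even
       * considered. */
      int main(void) {
          uint64_t address_space = 1ULL << 32;   /* 4 GiB total     */
          uint64_t vram          = 4ULL << 30;   /* the card's VRAM */
          printf("32-bit address space: %llu GiB\n",
                 (unsigned long long)(address_space >> 30));
          printf("card VRAM alone:      %llu GiB\n",
                 (unsigned long long)(vram >> 30));
          return 0;
      }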

    • by IanCal ( 1243022 ) on Monday November 10, 2008 @11:30AM (#25704467)
      If you're doing scientific computing that requires about 4 gigs of RAM and the processing power of current-gen graphics cards, then you should be able to figure out how to migrate from 32-bit XP to a 64-bit OS.

      That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.

      • That you are using an old operating system incapable of dealing with this new hardware is not the fault of nVidia.

        There's [gizmodo.com] a FUCKLOAD [wikipedia.org] of problems that are [nvidia.com] nVidia's fault, though. I'll never buy anything with an Nvidia card in it again.
      • You can still run all your 32-bit programs with no problems. Windows has an extremely good compatibility layer (WOW64) that lets 32-bit software run on the 64-bit OS without trouble. We've done extensive testing and use of it at work. So even supposing you did need a big card and a 32-bit app, that would work.

        Of course if you are doing anything that would need 4GB of video RAM, good chance you need a good deal more than 4GB of system RAM. After all, you probably aren't keeping the data only on the card, nev
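
        The layer being described is WOW64. A standard detection sketch for a 32-bit process, using the stock Win32 call (resolved dynamically because very old kernel32 builds do not export it):

          #include <windows.h>
          #include <stdio.h>

          /* Report whether this 32-bit process is running under WOW64,
           * i.e. on a 64-bit Windows. */
          typedef BOOL (WINAPI *IsWow64Process_t)(HANDLE, PBOOL);

          int main(void) {
              BOOL wow64 = FALSE;
              IsWow64Process_t fn = (IsWow64Process_t)GetProcAddress(
                  GetModuleHandleA("kernel32"), "IsWow64Process");
              if (fn)
                  fn(GetCurrentProcess(), &wow64);
              printf("running under WOW64: %s\n", wow64 ? "yes" : "no");
              return 0;
          }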

    • The sort of person still running a 32-bit OS is not from the same set as those who might spend $3k on the latest and greatest hardware. You don't matter to them.

    • The days of the graphics card mapping all its memory into PCI address space at once are over, and have been for some time. IIRC, modern cards use a movable window of 256MB or so for access to graphics card RAM from the rest of the system.
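
      Roughly what such a movable window looks like from the driver's side. The aperture mapping and set_aperture_base below are hypothetical stand-ins for a register write, not a real nVidia interface:

        #include <stddef.h>
        #include <stdint.h>
        #include <string.h>

        #define APERTURE_SIZE (256u << 20)   /* 256 MB window */

        extern uint8_t *aperture;                  /* mapped PCI BAR (stub) */
        extern void set_aperture_base(uint64_t vram_offset);       /* stub */

        /* Read VRAM through the aperture, sliding the window as needed
         * so only 256 MB of PCI address space is ever occupied. */
        void read_vram(uint64_t vram_offset, void *dst, size_t len) {
            while (len > 0) {
                uint64_t base = vram_offset & ~(uint64_t)(APERTURE_SIZE - 1);
                size_t off = (size_t)(vram_offset - base);
                size_t n = APERTURE_SIZE - off;
                if (n > len)
                    n = len;
                set_aperture_base(base);           /* slide the window */
                memcpy(dst, aperture + off, n);
                dst = (uint8_t *)dst + n;
                vram_offset += n;
                len -= n;
            }
        }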

    • by Nimey ( 114278 ) on Monday November 10, 2008 @11:59AM (#25705095) Homepage Journal

      Really, people. If you're going to buy such an expensive professional card, you're going to go with a professional-grade operating system, which will of course be 64-bit.

    • Re: (Score:3, Insightful)

      32-bit is dead. It should have been dead 4 years ago...

      Any serious computer enthusiast or professional running a 32-bit OS on today's hardware should be ashamed. They're holding the industry back.

      • by PRMan ( 959735 )

        Actually, it's all the applications (such as Adobe Flash) that don't work on 64-bit that are holding the industry back.

        It's much easier for them to recompile than it is for us to work without certain software...

        • 32-bit apps work fine on 64-bit Windows unless the application specifically checks for it and doesn't work on purpose.
          The most common problem I have is my Windows XP x64 being misidentified as Server 2003 because it shares the same kernel version (NT 5.2): even though the "I am a server OS" flag is OFF, the software still refuses to run because you apparently have a server OS. Vista x64 is obviously not affected by this, as it has the same kernel version as 32-bit Vista, so an app that works on 32-bit Vista should work on the 64-bit version as well.
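
          The check those apps should be making, sketched with the stock Win32 call: wProductType distinguishes workstation from server regardless of kernel version.

            #include <windows.h>
            #include <stdio.h>

            /* XP x64 reports kernel NT 5.2, same as Server 2003, but
             * its product type is still VER_NT_WORKSTATION. */
            int main(void) {
                OSVERSIONINFOEX vi = { 0 };
                vi.dwOSVersionInfoSize = sizeof(vi);
                if (GetVersionEx((OSVERSIONINFO *)&vi))
                    printf("NT %lu.%lu, %s\n",
                           vi.dwMajorVersion, vi.dwMinorVersion,
                           vi.wProductType == VER_NT_WORKSTATION
                               ? "workstation" : "server");
                return 0;
            }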

    • A video card I can't use on XP32 since it can't properly allocate that much VRAM & system RAM at the same time.

      A few things wrong with this statement:

      1. The GPU (which is far beyond 32 bits) is accessing the VRAM, not the CPU
      2. Video rendering/CAD powerhouses are the target audience for this card (not consumer-level gamers/enthusiasts), who are probably NOT going to be running it on a 32 bit version of XP
  • History repeats... (Score:2, Informative)

    by B5_geek ( 638928 )

    I am reminded of old 3DFx advertisements just before they went belly-up.

  • by jollyreaper ( 513215 ) on Monday November 10, 2008 @11:13AM (#25704155)

    Does this mean we can finally run Crysis now?

  • I've always wanted to watch WALL-E as it is being rendered in real time

  • no it's not (Score:5, Funny)

    by hcdejong ( 561314 ) <hobbes@@@xmsnet...nl> on Monday November 10, 2008 @11:25AM (#25704363)

    ... "the most powerful video card in history", it's "the most powerful videocard yet".

    [/pet peeve]

    • by Anonymous Coward on Monday November 10, 2008 @11:32AM (#25704513)

      I dunno, those Germans made quite a powerful video card back in the 1940s.

      It certainly had more power than those steam-powered video cards the French made in WWI.

    • Wrong closing tag.
       
      You mean [/pedant].

    • Maybe you should call the Guinness Book of World Records and tell them that all their records are incorrect. Or you could, you know, stop being such a pedant.

    • "the most powerful video card in history", it's "the most powerful videocard yet".

      FACT: The Rebel Alliance used significantly more powerful videocards to render the Death Star in Star Wars Episode IV: A New Hope.

      FACT: This event occurred a long time ago in a galaxy far, far away.

      [/pet peeve]

    • Everyone else takes the definition of "history" in this context as:

      * the aggregate of past events

      You seem to be taking it as:

      * the continuum of events occurring in succession leading from the past to the present and even into the future

      Both are valid, though the first is far more common.

      Claiming that the statement is wrong when it is correct according to the more common definition of the word is a bit of a stretch.

  • Not for home users (Score:3, Informative)

    by Bieeanda ( 961632 ) on Monday November 10, 2008 @11:30AM (#25704477)
    If the price tag didn't tip you off, this is one of Nvidia's Quadro line. They're not enthusiast boards, they're for intensive rendering work: film-grade CG or simulations. The technology may well come down to consumer-level hardware, especially if Carmack's supposition that real-time raytracing is the next big step pans out, but for now this is like comparing a webcam to a real-time frame grabber.
  • by DragonTHC ( 208439 ) <Dragon AT gamerslastwill DOT com> on Monday November 10, 2008 @11:32AM (#25704509) Homepage Journal

    I don't believe anyone claimed this was a gaming card.

    This is a scientific number cruncher. Its use is in visual computer modeling for anything from weather models to physics models.

    How about folding@home? This does it faster than any other computer on the block.

    All of you kids making jokes about crysis are missing the point. This might run games, but it's a science processor first.
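
    A minimal host-side sketch of sizing up such a card with the standard CUDA runtime API before any number crunching (device 0 assumed):

      #include <stdio.h>
      #include <cuda_runtime.h>

      /* Query the properties a CUDA number-cruncher cares about:
       * multiprocessor count and total global memory. Compile with
       * nvcc; no kernel is needed for the query itself. */
      int main(void) {
          struct cudaDeviceProp prop;
          if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
              fprintf(stderr, "no CUDA device found\n");
              return 1;
          }
          printf("%s: %d multiprocessors, %lu MB of global memory\n",
                 prop.name, prop.multiProcessorCount,
                 (unsigned long)(prop.totalGlobalMem >> 20));
          return 0;
      }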

    • Re: (Score:2, Informative)

      by sa1lnr ( 669048 )

      folding@home.

      My 3GHz C2D gives me 1920 points every 30/33 hours. My Geforce 8800GT gives me 480 points every 2.5 hours. (That works out to roughly 58-64 points per hour on the CPU versus 192 on the GPU, about a 3x speedup.)

    • It's mostly a professional visualization card. nVidia has three different brands for the same basic hardware:

      GeForce: This is the consumer line of cards. They are intended for desktop and gaming use. They generally feature the least RAM, and no special outputs (just dual DVI and the like).

      Quadro: Professional visualization cards. Same core as the GeForce, just a different market. The primary difference is they are certified with pro apps, and you pay extra for that. Along those lines, they have drivers optimized for professional applications.

    • No it's not, it's a graphics card. Most people buying them will be doing it for realtime rendering. Something like volume rendering can use boatloads of memory. Certainly we were using 1Gbyte of texture memory in 2000 on SGI machines.

      If you want a 'science processor' you should be thinking about a Tesla http://www.nvidia.co.uk/object/tesla_c1060_uk.html [nvidia.co.uk].
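
      The memory arithmetic for dense volume rendering shows why 4GB is attractive. The sizes below are illustrative, not taken from the article:

        #include <stdio.h>
        #include <stdint.h>

        /* Memory required by a dense side^3 volume texture at 4 bytes
         * per voxel (RGBA): a 1024^3 volume alone fills a 4 GB card. */
        int main(void) {
            for (uint64_t side = 256; side <= 1024; side *= 2) {
                uint64_t bytes = side * side * side * 4;
                printf("%4llu^3 volume: %5llu MiB\n",
                       (unsigned long long)side,
                       (unsigned long long)(bytes >> 20));
            }
            return 0;
        }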

  • That's Awesome. (Score:2, Insightful)

    by Petersko ( 564140 )
    In two years I'll be able to pick it up for $149.

    That's the great thing about video cards. Even a card that's two generations old is a terrific card, and they're fantastically cheap.
  • It's all about the headlines, that's all.
  • Old news (Score:4, Informative)

    by freddy_dreddy ( 1321567 ) on Monday November 10, 2008 @11:56AM (#25705033)
    These [nvidia.com] were being sold in the first half of August for $10,500, containing 2 of those cards. Only 3 months late.
  • No, really, could it?

  • All this amounts to is a proof of concept for the technology. To quote Simon from Mad TV: "LOOK WHAT I CAN DO!"

    To those who think that this has any application whatsoever, let me say a few things.

    #1) Can you just think of what the driver for this thing might be? Ludicrous.

    #2) It would likely require specialized programming even to function, none of which would be supported anywhere, and all of which would likely have to be custom.

    #3) For those who think this has scientific applications, guess again. You should get t
