Nvidia Reintroduces SLI with GeForce 6800 Series

An anonymous reader writes "It's 1998 all over again, gamers: a major release from id Software and an expensive hot-rod video card, all in one year. However, rather than Quake and the Voodoo2 SLI, it's Doom 3 and Nvidia SLI. Hardware Analysis has the scoop: 'Exact performance figures are not yet available, but Nvidia's SLI concept has already been shown behind closed doors by one of the companies working with Nvidia on the SLI implementation. On early driver revisions, which only offered non-optimized dynamic load-balancing algorithms, their SLI configuration performed 77% faster than a single graphics card. However, Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card. There are a few things that need to be taken into account, however, when you're considering buying an SLI configuration. First off, you'll need a workstation motherboard featuring two PCI-E x16 slots, which will also mean using the more expensive Intel Xeon processors. Secondly, you'll need two identical, same brand and type, PCI-E GeForce 6800 graphics cards.'"
This discussion has been archived. No new comments can be posted.

  • Damn (Score:4, Funny)

    by MikeDX ( 560598 ) on Monday June 28, 2004 @08:52AM (#9549841) Journal
    There goes my savings again!
    • Re:Damn (Score:4, Funny)

      by Total_Wimp ( 564548 ) on Monday June 28, 2004 @09:47AM (#9550249)
I had gotten so used to even the best gaming equipment being within the realm of possibility. There was no need to buy a really expensive mobo or Xeon processors because they wouldn't help gaming one lick. One awesome $3k rig was really just as good as another.

Now the option actually exists for me to play Doom 3 on one of those very high-rez LCDs, if only I had the balls to mortgage the house for one of these setups.

Thank you Nvidia, now I can dream again of something I'll never touch! Bugatti Veyron, Liv Tyler, Fort Knox; what would life be like without the pleasure of the untouchable dream?

      TW
  • Hmmm... (Score:4, Funny)

    by cOdEgUru ( 181536 ) on Monday June 28, 2004 @08:53AM (#9549845) Homepage Journal
It's sad that my firstborn has to go..

But perversely exhilarating to hold an SLI configuration in my hands instead..
    • Re:Hmmm... (Score:3, Funny)

      by 7-Vodka ( 195504 )
      No daddy noo!!
      Please daddy don't trade me for some extra FPS. Everyone knows the human eye can't tell the difference.

Shut up, you little brat. I can tell, and it ruins my game, man!

  • For Rich Folks Only (Score:5, Interesting)

    by Brain Stew ( 225524 ) <{ten.nozirev} {ta} {gawkcaz}> on Monday June 28, 2004 @08:55AM (#9549858) Homepage
    These cards are expensive enough, now they are suggesting we buy 2!?

    I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you.

As for myself, I am a poor college student who can't even afford one of these cards, a situation I think a lot of other geeks/gamers share.

    Which begs the question, who is this aimed at?
    • by King_of_Prussia ( 741355 ) on Monday June 28, 2004 @08:58AM (#9549876)
      14 year old 1337-sp33king white boys living with their rich parents. The same people who will use these computers to play counterstrike with hacks on.
    • by PIPBoy3000 ( 619296 ) on Monday June 28, 2004 @09:09AM (#9549979)
I picked up a Voodoo 2 card way back when for the incredibly high price of $300 (which was a ton of money close to ten years ago, on what I was making). A couple years later, I picked up my second Voodoo 2 for $30.

Think of it as a fairly cheap way to nearly double your video card's performance while others are upgrading to the new version of the card that is only 40-50% faster (unlike SLI mode, which is rumored to be 75-90% faster).

The tricky part is that you need a motherboard that supports it, which for now means only the ones made for high-end workstations.
If you wait a couple of months after its release, you can probably save 50%. It's just another graphics card that will be outdated in a few months.
    • by Kjella ( 173770 ) on Monday June 28, 2004 @09:15AM (#9550035) Homepage
...priorities. If gaming is your life (or if you're a working man with a gaming fix), two of these aren't that "extreme". People easily spend $10k+ more on a car than one that'd get them from A to B just as safely and easily, just for the style and extra luxury.

If gaming is what you do for a considerable number of hours of your life, why not? Even as a student, it'd only take a few weekends of not getting completely wasted (and maybe an hour or two of work as a weekend extra), and you'd have it.

All that being said, from what I saw with the last generation of cards, it looked to me like GPU speed was starting to outrun what conventional monitors and CPUs could keep up with. And those really huge monitors are usually far more expensive than the GFX cards, even two of them.

2x GF6800 = 10,000 NOK
Sony 21" that can do 2048x1536 @ 86 Hz = 14,000 NOK

...and that was the 3rd webshop I had to go to in order to actually find one of those - most now carry some legacy 17" and 19" CRTs and the rest LCDs, which go no further than 1600x1200 (even at 21") and don't need an SLI solution.

Personally, I'll probably stick with my GF4600 until hell freezes over; I just can't get hyped up about FPS games anymore. I'd rather go with an HDTV + HD-DVDs, should they ever appear...

      Kjella
      • Slightly O/T (Score:2, Insightful)

        by baudilus ( 665036 )
        Sony 21" that can do 2048 x 1536/86 Hz

        Every serious gamer knows that 86Hz is unacceptable. True gamers know: CRT > LCD / PLASMA. Until you can find me a plasma that can refresh at 125Hz or greater, I'll stick with my 80lb. CRT.

        Any gamer extreme enough to buy two of these cards plus the requisite hardware should be smart enough to know that a flat panel is a waste of money for games. Then again, they are gamers...
        • Re:Slightly O/T (Score:3, Informative)

          by Jeremy Erwin ( 2054 )
          Every serious gamer knows that 86Hz is unacceptable. True gamers know: CRT > LCD / PLASMA.

          I think he was talking about a CRT. LCDs aren't capable of rendering even 86 frames per second.

          However, if you want the absolute highest resolution, a 3840x2400 LCD may be the way to go.

        • More O/T (Score:3, Funny)

          by IncohereD ( 513627 )
          CRT > LCD / PLASMA

          This equation just made me laugh. Like...if plasma is less than one, just think of how much smaller LCD has to be than CRT!

          OK, I'm done.
      • by Doppler00 ( 534739 ) on Monday June 28, 2004 @09:39AM (#9550190) Homepage Journal
Performance is not just about "screen resolution". Many people would like to turn on more details in games. In some cases, these details could give you a tactical advantage. For example, turning shadows on wastes some CPU time, sure, but if you have it on, you might spot someone around a corner that you would not have otherwise.
    • "begs the question" (Score:2, Informative)

      by image ( 13487 )
      > Which begs the question, who is this aimed at?

      I recently learned this here, so please don't take this as a criticism.

      The phrase "begs the question" doesn't mean what you think it means. It does not mean, "this leads to the question."

Rather, it is a term used in logic for a fallacy in which a statement tries to prove its truth by assuming its own truth. This is commonly known as circular reasoning. More here [nizkor.org].

I agree with you about wondering who the product is aimed at.
    • Which begs the question, who is this aimed at?

      Well, I bet the developers of the beautiful Unreal Engine 3 [unrealtechnology.com] are using this. Current hardware can't run it at very playable framerates. I remember them saying you'll need 2GiB of RAM to play it maxed out.
    • by flsquirrel ( 115463 ) on Monday June 28, 2004 @09:32AM (#9550146)
      Ok, I'm not too much older than you if you're still in college, but I'm going to play old curmudgeon anyway.

You can put together a decent computer now for around a grand. Kick in another $250 for the good workstation board you need to get the right slots, and say $600 ($300 x 2) for the two cards, and you're still just under $2000. THIS IS CHEAP. I'm sorry. I know how many lawns I had to mow as a youngin to buy my first Pentium 60. That was $2k for JUST the computer and monitor. That included a baseline 1 meg video card, no CD-ROM and no sound. The CD-ROM and sound card cost me another $400 a couple months later.

So cry me a freaking river. Get a weekend job. Stop spending so much money on booze. If this is a priority for you, then you'll find the money. If it's not a priority, then quit your pissing and moaning.
    • Sounds good to me (Score:5, Insightful)

      by not_a_product_id ( 604278 ) on Monday June 28, 2004 @09:43AM (#9550215) Journal
      No point in complaining. Let the folk rich enough (stupid enough?) to afford it, buy it. Either it just won't take off (in which case you've saved yourself a load of cash) or it'll go great, the price will drop, the bugs will be ironed out and you'll get it at a price you can afford.
      What is there to complain about?
    • "I guess if you have a lot of money and want to play with a (marginal) advantage, an SLI setup is for you."

      "On early driver revisions which only offered non-optimized dynamic load-balancing algorithms their SLI configuration performed 77% faster than a single graphics card. However Nvidia has told us that prospective performance numbers should show a performance increase closer to 90% over that of a single graphics card."
      • I wouldn't consider 77% or 90% to be marginal.
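
For concreteness, here is a minimal Python sketch of what those scaling figures mean in frame rates. Only the 0.77 and 0.90 factors come from the article; the 60 fps single-card baseline is an invented example value.

```python
# Toy calculation of the quoted SLI scaling figures.
# 0.77 and 0.90 are from the article; 60 fps is hypothetical.

def sli_fps(single_card_fps: float, scaling: float) -> float:
    """Frame rate of a two-card SLI rig, where scaling = 0.77
    means '77% faster than a single card'."""
    return single_card_fps * (1.0 + scaling)

baseline = 60.0  # hypothetical single GeForce 6800 frame rate
for label, scaling in [("early drivers", 0.77), ("projected", 0.90)]:
    print(f"{label}: {sli_fps(baseline, scaling):.0f} fps "
          f"vs {baseline:.0f} fps on one card")
```

On those assumptions, a 60 fps game would land around 106 fps on early drivers and 114 fps with the projected scaling, which is why posters below argue the gain is anything but marginal.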
    • > Which begs the question, who is this aimed at?

It will be aimed at the hardware reviewers. The resurrection of SLI will win back Nvidia's ranking as number one for high-performance video. I would imagine a few gamers with more money than they need will also set up a dual-Nvidia system, but the primary audience will be those who publicise performance ratings.
    • by Fweeky ( 41046 )
Get two moderately priced cards, SLI them together, and get much better performance than a single high-end card for around the same price. Plenty of people pay that for a mere 10% performance difference; what makes you think they won't pay it for 70%+?

      Not all geeks are poor, and not all poor geeks are beyond saving up and spending a large amount of their income on what interests them.
  • by Tsunayoshi ( 789351 ) <tsunayoshi&gmail,com> on Monday June 28, 2004 @08:57AM (#9549867) Journal
    Am I the only one to find it hilarious that at the top of the page there was a Flash ad for an ATI Radeon X800?
  • Just a band aid.. (Score:5, Interesting)

    by eddy ( 18759 ) on Monday June 28, 2004 @08:57AM (#9549871) Homepage Journal

    ... till we have multi-core and/or multi-GPU consumer cards. (they're already available [darkcrow.co.kr] at the high-end)

    Questionmark.

    • Re:Just a band aid.. (Score:2, Informative)

      by RageEX ( 624517 )
      SGI has had multi-GPU graphics cards for a long time (since the late 80s?), and boy are they expensive.

I have an Indigo2 with MaxIMPACT graphics. It has 2 Geometry Engines and 2 Raster Managers. I believe that each set handles a different scan line. Because it is done entirely in hardware, MaxIMPACT is twice as fast as a single GE/RE board like HighIMPACT.

I believe that ATI's modern GPUs have been designed to work in parallel (up to 32 chips?). It's very cool to see a card using 4 R300s.

      SGI is startin
  • Time to... (Score:2, Funny)

    by Korgan ( 101803 )
    Break out my old Voodoo2 cards again... SLI? Sure... My voodoo2 cards in series from my Radeon x800 :-) Who says you have to have a 6800? ;-)
  • by gl4ss ( 559668 ) on Monday June 28, 2004 @08:59AM (#9549883) Homepage Journal
Like 3dfx, which they bought?

Maybe they shouldn't have.. sure, they probably had some great people and so on, but ultimately "it didn't work out".

    "hey, we can't keep up! let's just use brute force on increasing our cards capabilities!!! that's cheap and economical in the long run keeping our company afloat, right? right??"

Except that Nvidia is keeping up with its competitors in most other areas. Whatever performance margin they have relative to ATI, it isn't enough to say, hands down, X is better than Y.

      If you remember the last days of 3dfx, what they were selling was more expensive, slower, had a lower resolution and a distinctly washed-out look compared to comparable Nvidia parts. In fact, I remember convincing several people at a LAN party to dump their Voodoo 2 cards for the TNT, because although the frame rat
  • Can you hook up 4 monitors to this badass configuration?
  • by Anonymous Coward
    So THAT'S how we can run Longhorn! It makes sense!

    Oh wait...
  • Power Requirements (Score:5, Interesting)

    by Anonymous Coward on Monday June 28, 2004 @09:00AM (#9549895)
    So, One card that requires a 400 Watt power supply + Another card that requires a 400 Watt power supply = The need for an 800 Watt power supply?!
    • by Henriok ( 6762 )
Of course not! One card needs a COMPUTER with a 400W power supply. There's a lot more than a graphics board that needs power in a computer.
Nvidia says that their cards draw 110 watts (more if you overclock), so a good 600W power supply should be able to handle the second card.
    • by afidel ( 530433 ) on Monday June 28, 2004 @10:40AM (#9550706)
Nah, these cards draw a LOT of power, but not anywhere near 400W. They DO draw over 150W from the 12V rail, though, so getting a PSU with four 12V rails capable of handling in excess of 300W on 12V is going to be somewhat problematic. Run-of-the-mill 550W PSUs supply at most 24A @ 12V, which is NOT enough for the cards, let alone cards plus motherboard. The biggest PSUs I could find were capable of 36A @ 12V, which gives you an overhead of under 100W for all other 12V devices in the case: hard drive(s), motherboard, CD/DVD-ROM(s), etc. Guess you are going to need a server-class case with multiple PSUs to run this kind of configuration!
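
A quick sketch of the parent's 12V budget arithmetic, assuming the figures given above (over 150W per card on the 12V rail, 36A @ 12V for the biggest PSU found); all numbers are the poster's estimates, not measurements.

```python
# Back-of-the-envelope 12V power budget from the parent's figures.
CARD_12V_WATTS = 150.0  # parent: each card draws *over* 150W from 12V
PSU_12V_AMPS = 36.0     # largest 12V rating the parent could find
PSU_12V_WATTS = PSU_12V_AMPS * 12.0   # 432 W total on the 12V rail

sli_draw = 2 * CARD_12V_WATTS         # 300 W minimum for the two cards
headroom = PSU_12V_WATTS - sli_draw   # what's left for everything else
print(f"12V capacity:        {PSU_12V_WATTS:.0f} W")
print(f"Two cards (minimum): {sli_draw:.0f} W")
print(f"Headroom for drives, motherboard, fans: at most {headroom:.0f} W")
```

Since each card draws *over* 150W, the real headroom is below the printed 132W figure, which is how the parent arrives at "under 100W" for everything else on the rail.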
    • by TubeSteak ( 669689 ) on Monday June 28, 2004 @11:18AM (#9551018) Journal
      forget power requirements, what about the effin cooling? Does it strike anyone else as extremely stupid to put two scorching hot graphics cards back to back? [hardwareanalysis.com] I mean... come on!

Alienware took a very different tack with their solution [pcper.com], because it requires a 3rd PCI slot AND it's analog (3rd & 4th pics). I guess it's a series of tradeoffs, space vs. flexibility, with Nvidia winning the battle for space but losing on flexibility.

That aside, it's ridiculous that Nvidia is expecting their OEM cooling solutions to do any kind of justice to the heat from those cards. Alienware already expects water cooling to be part of the solution and has cases designed accordingly... couldn't Nvidia have done it any other way? Do they absolutely have to have a hardware link between their cards?

      "A power draw of 250 Watts for the 6800 Ultra SLI solution is very realistic."
      Then explain how this will work [tomshardware.com].

    • The need for an 800 Watt power supply?!

      Nah. The reason why the new graphics cards run so hot is that they're self-powered. Each carries its own RTG [doe.gov].

      As long as you have a lead-lined case and follow local, state, and federal ordinances regarding disposal of nuclear materials--then you should be fine.

      Glad I could clear that up.

  • New Motherboards (Score:3, Interesting)

    by Anonymous Coward on Monday June 28, 2004 @09:00AM (#9549896)
It's a bit presumptuous to assume that when these SLI cards come out, the only motherboards supporting multiple PCI-E x16 slots will be Intel Xeon based. As far as I knew, AMD was planning socket-939 motherboards with multiple PCI-E slots.

    At any rate, doesn't this sort of make the whole Alienware Video-Array seem like a bust?
Why don't they make a graphics card with two GPUs and double the memory size? Or wouldn't one of these buggers fit into a computer case? Yes, they exploit the dual PCI-E buses, but I doubt that they really use the full bandwidth.
Each PCIe channel is, IIRC, 150MB/sec of independent bandwidth, and slots come in 1-, 4-, and 16-channel versions.

      a 16-channel PCIe slot is 2.4GB/sec of bandwidth...

      I bet a high-resolution FX card would use most of that. But then again, they probably use PCIe-16 because PCIe-4 would be far too little.
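
A sketch of that bandwidth arithmetic, using the poster's "IIRC" figure of 150MB/sec per lane (PCIe 1.0 is more commonly quoted at 250MB/s per lane per direction, so treat these numbers as illustrative):

```python
# Per-slot bandwidth from the parent's per-lane figure.
PER_LANE_MB_S = 150.0  # the parent's "IIRC" number, per lane

for lanes in (1, 4, 16):
    total_mb = PER_LANE_MB_S * lanes
    print(f"x{lanes:<2} slot: {total_mb:,.0f} MB/s (~{total_mb / 1000:.1f} GB/s)")
```

The x16 case works out to the 2.4GB/sec the parent quotes; with the more common 250MB/s per-lane figure it would be closer to 4GB/sec.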
  • Reliability (Score:5, Insightful)

    by lachlan76 ( 770870 ) on Monday June 28, 2004 @09:02AM (#9549912)
Am I the only person who thinks that holding the two cards together with a rigid bridge attached only by solder is a bit dangerous? Not that the solder would break, but removing it could be a bit tricky. Perhaps a cable there would be safer.

Other than that, the only problem I can see is that you need about AU$2000 worth of video cards, and at least AU$1000 worth of Xeon to use them. Maybe for engineers and artists, but will the average person have any use for it? I don't feel that an extra AU$3000 is worth it for the extra frame rate in games.

For the pros, though, it would be very good.
    • by Zocalo ( 252965 ) on Monday June 28, 2004 @09:12AM (#9550012) Homepage
      Other than that the only problem I can see is that you need about AU$2000 worth of video card, and at least AU$1000 worth of Xeon to use it.

Look on the bright side; most Xeon systems already have the second PSU that you are going to need to power the extra card and its turbofan-based cooling system.

    • Re:Reliability (Score:5, Interesting)

      by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Monday June 28, 2004 @09:13AM (#9550020) Homepage
Never mind how they are held together. The GeForce 6 already requires a shitload of power (2 Molex connectors on the rear of it) and puts out a lot of heat. So you have two very hot cards right next to each other, one of them getting really bad airflow. If one of your $500 video cards doesn't die, your PSU surely will!
    • I don't feel that an extra AU$3000 is worth it for the extra frame rate in games

      Get out of here you heathen...........
  • ALX (Score:3, Interesting)

    by paradesign ( 561561 ) on Monday June 28, 2004 @09:04AM (#9549931) Homepage
How does this stack up against Alienware's ALX dual-graphics-card system? I remember reading an article where the Alienware guys bashed the SLI method. With theirs, each card renders half the screen, either top or bottom, not every other line.
    • Re:ALX (Score:3, Insightful)

      by kawaichan ( 527006 )
Dude, Alienware is basically using Nvidia's SLI method for their ALX boxes.

Notice that they were using two 6800s for their benchmarks?
Then why did they say back then that extra software did the trick? Because that's what they said.

This Nvidia solution doesn't really seem like Alienware's.
Nvidia gives everybody the chance to use dual cards without buying a $5000 Alienware system. And because it comes directly from the vendor, I bet the quality/speed is better. This is going to lose AW a good deal of money; of course they are going to bash it.
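
For readers unfamiliar with the two schemes compared in this subthread, here is a toy Python sketch of how each one divides a frame between two GPUs: classic scan-line interleaving gives each GPU every other line, while the split-frame approach attributed to Alienware gives each GPU a contiguous half. The frame height and the even/odd assignment are illustrative, not taken from either vendor's implementation.

```python
# Toy comparison of the two work-splitting schemes.
HEIGHT = 8  # tiny example frame; real frames have 1000+ lines

def scan_line_interleave(height: int) -> dict:
    """GPU 0 takes the even scan lines, GPU 1 the odd ones."""
    return {gpu: [y for y in range(height) if y % 2 == gpu] for gpu in (0, 1)}

def split_frame(height: int) -> dict:
    """GPU 0 takes the top half of the frame, GPU 1 the bottom half."""
    mid = height // 2
    return {0: list(range(mid)), 1: list(range(mid, height))}

print("scan-line interleave:", scan_line_interleave(HEIGHT))
print("split-frame:         ", split_frame(HEIGHT))
```

Presumably the "dynamic load-balancing algorithms" mentioned in the article move the split point between frames so that each GPU gets a roughly equal share of the rendering work, though the article does not spell this out.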
  • 4 slots (Score:5, Insightful)

    by MoreDruid ( 584251 ) <moredruid&gmail,com> on Monday June 28, 2004 @09:04AM (#9549933) Journal
    OK, I'm all for performance gain and pushing the limit, but geez, 2 of these cards take up 4 slots. How are you supposed to squeeze in your Audigy card with extra connectors and still put in your extra firewire/usb?

    And I'm also wondering how the heat is going to be transferred away from the cards. It looks like you need some serious cooling setup to keep those two babies running.

  • Bah... (Score:5, Insightful)

    by mikis ( 53466 ) on Monday June 28, 2004 @09:05AM (#9549947) Homepage
Call me when they put two GPUs on one card... or even better, when they put two cores on one chip. Soon enough the motherboard will be an add-on to the graphics card.

Plus, many people were upset about the power and cooling requirements. This monster would occupy FOUR slots and require, what, a 600W PSU? (OK, just kidding, "only" 460W should be enough.)
This is a workstation-class configuration; it's not meant for your casual gamer, it's meant for the guys who actually shelled out for an EE P4.
You can also look at it as a means of obtaining what was previously Quadro-only class quality, which is awesome for all of us Maya and 3ds Max hobbyists ;).
At any rate, if you don't like it, don't buy it! For the rest of us, it's xmas early.
  • The proliferation of aimbots and wallhackers will still mean you just look better getting pwn3d.
  • by mustardayonnaise ( 685416 ) on Monday June 28, 2004 @09:14AM (#9550028)
John Carmack said about a year and a half ago that Doom 3 would run 'well' on a top-end system of that time, which was a 3.06 GHz P4 equipped with a Radeon 9700 Pro. What's frightening/upsetting is that this SLI setup really isn't coming into play to satisfy the games of today, like Doom 3; it's coming into play for the games of next year and the year after. It's just a little off-putting that in order to play the newest games you need a SET of graphics cards with that kind of power and space requirements.
No you won't; by the time these games appear, there'll be a single card with more power than these. That's how it usually goes, anyway. I've been bitten by the bad value of "2 of anything" (CPUs, video cards) before.
Whoa... with one GPU requiring a 500W power supply, where the hell am I going to get a power supply to run two of these beasts? Soon my PC is going to draw more than the beer fridge!
  • Only Nvidia? (Score:3, Interesting)

    by ViceClown ( 39698 ) * on Monday June 28, 2004 @09:24AM (#9550079) Homepage Journal
This is the mobo design Alienware came up with, right? My understanding is that you can use ANY two video cards that are the same and are PCI-E. You could just as well do two ATI cards. Who submitted this? Nvidia marketing? :-)
  • by Viceice ( 462967 )
    Please register or login. There are 9 registered and 3599 anonymous users currently online. Current bandwidth usage: 872.66 kbit/s

    How many of you are there hitting refresh just to see the hit counter go up? :D

  • SLI (Score:5, Informative)

    by p3d0 ( 42270 ) on Monday June 28, 2004 @09:25AM (#9550097)
    Is it too much to ask to define, or at least hyperlink, the acronyms you use?

    SLI stands for Scan Line Interleave [gamers.org].

I seem to remember that one of these cards takes up 2 slots, and needs a third just for good airflow. How much space are these going to take up? Also, just one of these bad boys needed something like 400-500W of power. What kind of power supply is needed for 2???
  • My question... (Score:3, Informative)

    by SageMadHatter ( 546701 ) on Monday June 28, 2004 @09:26AM (#9550100)
When a single 6800 card requires a 480W power supply [anandtech.com] and two dedicated power lines, what would the power requirements be for two [hardwareanalysis.com] of these cards in the same computer system?
  • Xeons? (Score:5, Insightful)

    by ameoba ( 173803 ) on Monday June 28, 2004 @09:27AM (#9550105)
    Why would they design something like this and force it to use a Xeon?

For starters, the Xeon is still stuck at a 533MHz FSB, limiting its performance. Add in the fact that they're ridiculously overpriced and that most games show little to no performance improvement when running on an SMP system. A single P4 or Athlon 64 will stomp the Xeon in almost all gaming situations.

    Of course, with this tech a ways away & there not really being any PCI-E motherboards on the market now that Intel's recalled them all, I guess they're betting on high-end enthusiast boards to ship with the second x16 slot by the time this thing is actually ready for market...

Really, the biggest application for this kind of power that I can foresee would be game developers who want to see how well their games scale on next-gen video hardware...
Hanging in my closet is a "souvenir" from my last adventure with SLI: a Quantum3D Obsidian t-shirt. In my eagerness to own the latest and greatest graphics card, I paid 600 bucks up front to preorder this card, which was developed by a spin-off from 3DFx. The card shipped 6 weeks late, suffered from overheating since it crammed the components from 2 cards into a single PCI slot, and was soon equaled in performance by a simple pair of Voodoo 2 cards in adjacent slots. I expect a similar fate for this monstrosity.
  • The G5s come with PCI-X slots, and will prolly run Doom 3.

Usually the cost of a G5 is a bit too steep, but compared to a Xeon they are an absolute steal!

    And you get a cool machine/OS to boot! :)

  • by HolyCoitus ( 658601 ) on Monday June 28, 2004 @09:46AM (#9550238)
That's all I am seeing here. You don't need to use the Ultra in this configuration. The article even states you can use the single-slot GT, which would be greater than a single Ultra and cost you 200 dollars more, for a great performance boost. Or you could even use basic 6800 cards, which are under 300 dollars.

    This is going to be great when it matures, and is one of the huge advantages to PCI-Express when that becomes the standard on future motherboards over AGP. Yes, I know Intel is making motherboards with this, but who the hell wants to pay all that money for such a small jump?

Since people seem to be lost on the Nvidia cards, here's a rundown of what they are releasing and the price range:

$300 - Nvidia 6800
$400 - Nvidia 6800 GT
$500 - Nvidia 6800 Ultra
$600+ - Nvidia 6850/6800 Ultra Extreme

The 6800 and GT are single-slot cards with a single Molex connector. Those can be used in the SLI configuration as well. Get the facts straight before you post flamebait and troll.
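
To make the parent's point concrete, here is a tiny sketch tallying those ballpark launch prices for single-card and dual-card setups. Prices are the poster's approximations, and the SLI figure naively doubles the card price, ignoring the motherboard, CPU, and PSU costs discussed elsewhere in the thread.

```python
# Tally of the parent's ballpark launch prices, single vs. SLI pair.
PRICES = {
    "6800": 300,
    "6800 GT": 400,
    "6800 Ultra": 500,
    "6850 / Ultra Extreme": 600,
}

for card, price in PRICES.items():
    print(f"{card:>22}: ${price} single / ${2 * price} for an SLI pair")
```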
  • Cost? (Score:4, Insightful)

    by Watcher ( 15643 ) on Monday June 28, 2004 @09:51AM (#9550283)
So, lemme get this straight: in order to get a 77% speed increase, I'm going to have to blow hundreds on a second card ($400), a Xeon processor, motherboard, memory, and a damned good cooling system so it all doesn't melt and I don't go deaf? Wouldn't it make more sense to buy a decent card now and wait two years for them to put out a single-GPU card with the same performance for $200? Unless you're really worried about dropping under 100 frames, or you have a lot of high-end rendering to do, I can't imagine this really being worth it. At least with the Voodoo 2 SLI system you could buy a second card without having to invest in a huge honking system that makes a dual G5 look cheap.
  • 98 ? NOT ! (Score:3, Funny)

    by TTL0 ( 546351 ) on Monday June 28, 2004 @11:25AM (#9551069)
    "It's 1998 all over again"

    No, in '98 I had a great job and salary.

Alienware already has a patent-pending process to do SLI on their own motherboards, whether with an ATI- or Nvidia-based video card. The two caveats are: 1. so far, this will only be available through Alienware, and 2. the video cards have to be exactly the same card.

    Alienware purchased a former 3dfx licensee who had outstanding patents on some of their own SLI tech. Alienware has wisely furthered the research and will be marketing it soon. And it doesn't require a Xeon processor...

    Here's the press release:

http://www.alienware.com/press_release_pages/press_release_template.aspx?FileName=press051204.asp

  • by tstoneman ( 589372 ) on Monday June 28, 2004 @12:25PM (#9551617)
    Ahhh, what sweet memories.

I bought a shitload of 3DFX stock back in the late 90s because they were the king of 3D. I remember walking into a computer store and seeing something on the screen... I thought it was a clip from a movie, but they told me it was Mechwarrior 2 (I think 2) playing on a Voodoo card. My mind was blown. How they got movie-like graphics onto a computer was beyond my capacity to understand. I dropped the $350 and bought one immediately, and played with it and loved it.

Then, after a while, I thought, 3DFX is the king and they will never die. I put my money where my mouth was and forked over my entire savings, around $15k, to buy 3DFX stock. Therein I learned a few great lessons:

1) The best technology doesn't mean the best company. "Good enough" from a better-run company will usually blow you away. Ask Microsoft or nVidia (well, at the time nVidia wasn't the top runner that it is today).

2) No matter how great an explanation you make, the stupidest things, like 16-bit color vs 32-bit color, can kill you (22-bit color just doesn't cut it for the dumb-ass consumers). It's better to cross your t's and dot your i's in the first place so that you don't have any such vulnerabilities.

They went tits up, and I basically lost my money. nVidia bought the remaining pieces of 3DFX, including all their patents. I'm not surprised they went SLI, and for companies that use it, like 3D-effects houses, it will probably save them bundles of time.
  • by quantax ( 12175 ) on Monday June 28, 2004 @12:34PM (#9551686) Homepage
Given that this configuration requires a Xeon-based system with dual PCI-E slots, this seems geared more towards the 3D development end of things, with Maya, Softimage and such. I've yet to meet a pure gamer with a Xeon rig, so this would seem to be a boon for the new Gelato systems, allowing for more GPU power. I just hope Nvidia doesn't end up emulating 3DFx's later moves, deciding raw speed > innovation, as that is not really a winning strategy, especially these days, when we're on the brink of a new age of gaming graphics using advanced shading techniques previously seen only in pre-rendered footage.

"No matter where you go, there you are..." -- Buckaroo Banzai

Working...