Graphics Software

Comprehensive Video Benchmarks

Crusader writes: "Matt Matthews has produced an extensive series of benchmarks which examine four separate games' performance on the Voodoo5, the Rage 128 Pro, the G400 Max, and the GeForce 2 3D graphics cards under Linux. Performance against Windows 98 is also included." We also received: driveitlikeyoustoleit writes, "3dfx, NVIDIA and ATI's best are all pitted against each other in a high-end 3D video card roundup. The authors pit six GeForce2 GTS-based cards (from ASUS, Creative, ELSA and Hercules) against an ATI Radeon 256 and a 3dfx Voodoo5 5500. For a change, performance isn't the only criterion in question (although the end scores are somewhat weighted in favor of fps); full-scene anti-aliasing, image quality and DVD performance/quality are also looked at critically. The screenshots page showing off FSAA comparisons is a great visual indicator of what the cards can do."
  • Now, I know this isn't the best forum for anything pro-ATI, but I'll continue nonetheless.

    For as long as I can remember, ATI's chips have performed far better in 32-bit color depth tests than 16-bit color-depth tests. Yet, Sharky doesn't seem to show any charts comparing the cards in 32-bit except for the Re-Volt benchmark which they admit is outdated. However, they do state on page 6 [sharkyextreme.com] of their review that the GeForce-2 cards rule both 16-bit and 32-bit.

    Did I miss something?

    Anyways, it seems that the Radeon is only a few FPS behind the GeForce-2 cards, and I imagine that difference is imperceptible to humans, except maybe super-humans. Meanwhile, you gain better DVD playback and other huge multimedia offerings, especially if you look into the always-a-pleasure All-in-Wonder line from ATI.

    So, why did Sharky need to use so many pages to get these points across?

  • For the same resolution, run the games in a window instead of going full-screen for Windows.
    As far as the display driver is concerned, doing a page flip on a non-fullscreen surface is impossible, so that should eliminate the difference produced by the blitting drivers for Linux (a small sketch of the difference follows this comment).

    Besides, issues like efficient use of the AGP bus, DMA, eliminating bad polygons when you do clipping, and synchronization can all have a great impact on performance - blitting vs. flipping is just a minor issue at all but the highest resolutions.
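To make the flip-versus-blit distinction above concrete, here is a minimal, hypothetical C sketch (a generic illustration, not driver code from any of the cards discussed): a page flip only swaps which buffer is scanned out, while a blit copies every pixel of the back buffer into the visible surface each frame.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

#define WIDTH  1024
#define HEIGHT  768

/* Two full-size color buffers, as in a double-buffered framebuffer. */
static uint32_t *front_buf, *back_buf;

/* "Page flip": exchange the front/back pointers.  A real driver would
 * reprogram the scanout address instead; no pixel data is copied. */
static void present_flip(void)
{
    uint32_t *tmp = front_buf;
    front_buf = back_buf;
    back_buf = tmp;
}

/* "Blit": copy the entire back buffer into the visible surface.  For a
 * 1024x768 32-bit frame that is about 3 MB of memory traffic per frame. */
static void present_blit(void)
{
    memcpy(front_buf, back_buf, (size_t)WIDTH * HEIGHT * sizeof(uint32_t));
}

int main(void)
{
    front_buf = calloc((size_t)WIDTH * HEIGHT, sizeof(uint32_t));
    back_buf  = calloc((size_t)WIDTH * HEIGHT, sizeof(uint32_t));

    /* ... draw the next frame into back_buf ... */
    present_flip();    /* fullscreen path */
    present_blit();    /* windowed path */

    free(front_buf);
    free(back_buf);
    return 0;
}
```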
  • I am a major gamer. I like to play Quake1/2/3 a LOT. I play some UT and a lot of Half-Life (Firearms/TFC/Counter-Strike/plain ol' HL). I have a 3D Prophet II GeForce2 GTS 64MB video card. The thing is insane; it cost me a pretty penny, but it was worth it. So far I've only tested it in Windows. What I want to do next is test it out in Linux. In fact, just a few hours ago I finished downloading the SuSE 6.4 evaluation ISO. I guess I could try it out with that. Unfortunately for me I can't play Half-Life under Linux, but I would be able to try it out with Quake3.

    In Windows, I'm running the Detonator 3 6.31 drivers. Before, I was using the 5.16a drivers (came on the CD) and was pulling about 45-50fps on timedemos at 1152x864 with 32-bit color on and all the other settings jacked up (sound on during the timedemos too). Now with those same settings and the DET3 6.31 drivers I get about 70fps. Sure, I probably can't truly see 70fps, but it's WAY smoother than 45-50fps.

    I'd just like to see how Linux could handle my card and Quake3. Oh yeah, my box is an Athlon 750/128MB RAM/3D Prophet II GF2 GTS 64MB/of course more stuff, but nothing else essential to gaming that I need to mention. So what I would like to know is if anyone else has similar hardware that can run tests with Linux and Windows to see what difference they get. I did read that article a while back comparing Linux to Windows with 3D games, but I'd rather see what a user gets, not some lab.

    -PovRayMan
  • Although gaming is important, I already have two Nvidia/NT systems that do a fine job. Right now, I want to build a Linux OpenGL (err...MESA) development system and plan to use XF86 4.0 and DRI to take advantage of windowed hardware acceleration. Can anyone recommend a solution here? I am seriously looking at the G400 or G450 from Matrox as their DRI drivers seem fairly mature. The only other option is a Voodoo5. I am willing to try either (though the dual head Matrox is really tempting). I would like some feedback from other people who have done more than run fullscreen gaming benchmarks. I am especially interested in windowed rendering performance and stability. The article mentions lockups with the Matrox card which, of course, is unacceptable. Any input is welcome.

    Thanks.
  • How about:

    Does the 2D look any good on my monitor? I spend far more time using 2D than playing 3D games. ....
  • Heard anything about VIA chipsets? (Specifically more recent ones, I have a KT133 Athlon mobo)

    On the whole, I've never had problems with VIA chipsets and AGP; this is the first unstable driver I've ever had, and I've been using VIA-based mobos (an Epox Super7 board for a K6-2 and now an Athlon KT133-based board) for 3 years.
  • If you read the original submission [slashdot.org] to Slashdot, it links to two articles. There were actually two submissions combined into one. The second submission pointed to the Sharky Extreme article [sharkyextreme.com].

    Anyways, back to your comments... The Radeon has been quite surprising to the industry. It came in just behind all of the GeForce-2 cards in Sharky's benchmarks and well above the Voodoo5. My point of contention with them is that their tests seem to be limited to 16-bit color-depth.

    As for the stability of their drivers, I don't know where you're coming from. I've personally had no problems with their drivers. Hanging out in comp.sys.ibm.pc.hardware.video [pc.hardware.video], I've seen a pretty much uniform distribution of complaints for all card manufacturers.

    Finally, I'm not a Linux user, so I can't comment on the drivers available for Linux. I've found limited BeOS [be.com] drivers available, though, and they're stable. Perhaps by your logic BeOS developers are better than Linux developers? Perhaps ATI doesn't forecast enough profit in the Linux sector to justify making in-house drivers? That opens a whole 'nother can of worms.

  • by Andy Dodd ( 701 ) <atd7NO@SPAMcornell.edu> on Monday October 09, 2000 @06:01AM (#721376) Homepage
    One thing that doesn't seem to get addressed by ANY Linux 3D hardware review I've seen (one was also in LJ recently) is the issue of stability.

    We run Linux because it's stable, even in cases where we know that we lose performance because of that stability.

    Well, all of these reviews basically say, "If you can afford it, buy NVIDIA! They're fast!".

    So what if your card is fast but your machine crashes in the middle of a game? NVIDIA's drivers just plain suck in the stability arena - My machine crashes on a regular basis in both Q3 and UT with those crap drivers. Even Windows looks rock-solid compared to Linux when using those accursed drivers! And all this because the only difference between the Quadro and GeForce is a PCI/AGP device ID and a few flags set in the driver because of said ID - it's the only plausible reason for being the only vendor not to release source/specs. (Check out http://www.geocities.com/tnaw_xtennis/)
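For what it's worth, the claim above (same silicon, different device ID and flags) is easy to picture with a small, entirely hypothetical C table; the device IDs and feature flags below are invented for illustration and are not NVIDIA's actual values or code.

```c
#include <stdio.h>
#include <stdint.h>

/* Hypothetical feature flags a unified driver might gate on the PCI ID. */
#define FEAT_CONSUMER_3D    (1u << 0)
#define FEAT_WORKSTATION_GL (1u << 1)   /* e.g. extra "pro" driver paths */

struct pci_id_entry {
    uint16_t vendor_id;
    uint16_t device_id;     /* fictional IDs, for illustration only */
    uint32_t features;
    const char *name;
};

static const struct pci_id_entry id_table[] = {
    { 0x10de, 0x1111, FEAT_CONSUMER_3D,                       "consumer card"    },
    { 0x10de, 0x2222, FEAT_CONSUMER_3D | FEAT_WORKSTATION_GL, "workstation card" },
};

/* Look up the probed device and report which code paths would be enabled. */
static const struct pci_id_entry *probe(uint16_t vendor, uint16_t device)
{
    for (size_t i = 0; i < sizeof(id_table) / sizeof(id_table[0]); i++)
        if (id_table[i].vendor_id == vendor && id_table[i].device_id == device)
            return &id_table[i];
    return NULL;
}

int main(void)
{
    const struct pci_id_entry *e = probe(0x10de, 0x2222);
    if (e)
        printf("%s: feature mask 0x%x\n", e->name, e->features);
    return 0;
}
```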
  • thanks (and to everyone else who replied).

    So, what I'm hearing is that there is no attempt or standard way to make the quality of each frame's graphics lower to lighten the load and sustain a high framerate? That would seem like a better solution for the player.

  • All of these cards are blazing. As such, driver stability is much more of an issue. Matrox is pretty top-notch in this department. NVidia has had its share of problems in the past, but may be improving. I would much, much prefer to get a rock-solid card and driver combination than some crazy card that goes for extra performance at all costs.
  • The difference between Linux and Windows at 1280x1024 on the GF2 is the difference between playable framerates and unplayable framerates. 'nuff said.

    Actually, I think I need to cogitate more. The NVIDIA Linux drivers are pulled from more or less the same source as the Windows drivers. This means that, code-wise, the two are fairly even in terms of performance tweaks. Now, the Linux drivers will gain a little bit as NVIDIA tweaks the glue between the drivers and the OS, but I doubt the speed will ever reach that of Windows. Now, page flipping could be one of the things holding back the Linux drivers, but note that even at the lower-res tests, Linux is still behind. All this ignores one point: shouldn't Linux be undeniably FASTER? Why use the OS if it is only just ALMOST as fast as Windows? Windows is a piece of junk. Why can't Linux beat it?
  • Here's my token useless post:

    It doesn't take a rocket scientist to tell you that the same card will perform similarly. One of the definitions of insanity is repeating the same action and expecting a different result. Now, I guess the silkscreening of the brand name might cause some molecular differences, leading to a performance boost/loss. If you were to perform some simple statistics on those results from the various GF's you could eliminate most, if not all, of them as normal variation.
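If you actually wanted to run that "simple statistics" check, a minimal sketch is below; the fps numbers are invented for illustration, not taken from the review.

```c
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Hypothetical timedemo results for six nominally identical GF2 boards. */
    const double fps[] = { 98.2, 97.9, 98.6, 97.5, 98.1, 98.4 };
    const int n = (int)(sizeof(fps) / sizeof(fps[0]));

    double sum = 0.0;
    for (int i = 0; i < n; i++)
        sum += fps[i];
    const double mean = sum / n;

    double sq = 0.0;
    for (int i = 0; i < n; i++)
        sq += (fps[i] - mean) * (fps[i] - mean);
    const double stddev = sqrt(sq / (n - 1));   /* sample standard deviation */

    /* If the board-to-board spread is within a couple of standard deviations,
     * it is indistinguishable from ordinary run-to-run noise. */
    printf("mean %.1f fps, stddev %.2f fps\n", mean, stddev);
    return 0;
}
```

(Compile with -lm for the math library.)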
  • No, I wouldn't call it good performance, but then again I've never seen such a problem either. Apart from lockups when switching to console in the initial release of the drivers they've behaved flawlessly for me.
  • Because of things like page flipping?
    >>>>>>>
    A 20fps difference because of page flipping? I highly doubt it. At most, you're sucking up around 200MB/s of bandwidth because of the blitting (a quick back-of-the-envelope follows this comment). That's a drop in the bucket for the GF2. Not to mention the fact that even at low res, the Linux drivers are STILL slower. No, the evidence suggests that something other than the blitting is holding back the GF2.

    Because most Linux users routinely have tens of processes running in the background while using their systems?
    >>>>>
    If those idiots conducted benchmarking with tens of processes running in the background, then it is their own damn fault. Not to mention the fact that any half-intelligent user tweaks their machine, cutting out unneeded processes. Of course, half of the people using RH6 are running SendMail in the background (which is loaded by default on those systems).

    Because, as good as the NVIDIA drivers are, they are still new (labeled as Beta, even)?
    >>>>>>>
    They are still beta, but they are based on rock solid code. What about cross-platform access don't you understand? Any bugs are in the driver/OS glue layer, and I seriously doubt that any major performance bugs could be hiding in there.

    Because a good Linux driver should perform some minimal input checking to enforce security (I don't know if NVIDIA does this--more of a DRI-ish
    thing)?
    >>>>>>>
    If security involves a 20-30% performance hit, then I say "welcome, crackers!" On a game machine, security at that level is absolutely and totally useless and should be turned off. For a home machine, the only security should be in the network server. There is no need on a desktop to protect the user from local processes.
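The "200MB/s" figure mentioned above is easy to sanity-check with a rough back-of-the-envelope; the resolution and framerate here are plausible examples, not numbers taken from the benchmarks.

```c
#include <stdio.h>

int main(void)
{
    /* One full-screen blit copies width * height * bytes-per-pixel. */
    const double width  = 1280.0;
    const double height = 1024.0;
    const double bpp    = 4.0;     /* 32-bit color */
    const double fps    = 40.0;    /* illustrative framerate */

    const double bytes_per_frame = width * height * bpp;   /* ~5.2 MB */
    const double bytes_per_sec   = bytes_per_frame * fps;  /* ~210 MB/s */

    printf("%.1f MB per frame, %.0f MB/s at %.0f fps\n",
           bytes_per_frame / 1e6, bytes_per_sec / 1e6, fps);
    return 0;
}
```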
  • I'm not a gamer, and therefore I don't pay much attention to this technology. I am somewhat interested in it, though. Can someone give a quick rundown of how it works? What I mean is, if one piece of graphics hardware is slower than another, I assume the game doesn't just run slower, as that would screw up the play. So either it tones down the operations it undertakes (fewer colors, lines, etc.), or it throws some operations overboard (I'm falling behind, just start with the next screen and the eye will barely see a flicker), or it does something I haven't imagined...

    Is it easy to describe or is there a place that has a good explanation? Oh, and what are the most useful/critical accelerations?

  • Well, this review really is going to be a factor in the purchase of my next card when it admits that it has been "weighted" in favour of certain types of applications. Is it really too much to expect to be given reviews in terms of raw performance? Considering the amount of journalistic integrity seen so far on the net, probably.

    Then again, this isn't really surprising from a website that somehow manages to fit more banner ads onto each page (and there are, for some reason, a lot of pages) than there is actual content. With the amount of corporate whoring they're managing to achieve in their page layouts, is it any wonder that their reviews feature skewed statistics which practically invalidate their purpose? It also makes you wonder where else corporate $$$ comes into the equation in these kinds of reviews.

    I'd much rather that we saw more reviews from sources that don't appear to be pandering for cash from commercial sources. Whenever you see a banner ad, you can't trust the information you're being given. Hmm, now what site does that bring to mind?

  • - It's doubtful this article would have been posted to /. if it hadn't focused on Linux vs. Windows performance.

    - With the exception of Nvidia, Linux drivers substantially trailed the windows drivers on any given card.

    - The disparity between Windows and Linux performance gets bigger at higher resolutions and texture sizes.

    - Certain cards (omitted to prevent flame wars) aren't worth bothering with.

  • I know I look at video cards with a number of things in mind. They are:

    How well will it run with Windows 2000 and Linux
    How well will it play Unreal Tournament and Quake
    How much does it cost?

    I don't give a flying hoot how many floating point mega textured shaded pixel snagget things it can draw in 1 bazillionith of a second. I wanna play games and have KDE look funky in 1024x768 mode. Benchmarks don't impress me, I'm afraid. I honestly just care about how well the card will run on my machine and play the games I like.

    And this review answered exactly that. It is almost precisely the review I have been waiting for. The NVIDIA kicked serious butt, and is now my #1 choice.

  • Well, this review really is going to be a factor in the purchase of my next card when it admits that it has been "weighted" in favour of certain types of applications.
    Well, I'd have thought such a scheme would be ideal for people whose primary concern is running "those certain types of applications".
    Is it really too much to expect to be given reviews in terms of raw performance?
    Oh yeah, I find those "3-million bogo-texels per second" numbers on the side of the boxes really useful......
  • Video card rendering and the calculations that determine the state of the game at any moment run asynchronously; you might think of the video card as fetching a "snapshot" of the game world whenever it is done drawing the previous frame to your screen. (It's not really that simple of course, but I think the analogy is fairly sound.) So the state of the game is calculated at an arbitrary (very high) rate, and the video card keeps up as best it can. The faster the framerate, the more information your eyes receive, the less your brain needs to interpolate, and the more "smoothly" the game subjectively runs. And in the case of a fast-paced action game, you receive more timely and accurate feedback of game events that require reaction on your part - such as the position of a rocket that last frame was homing in on your face.
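A minimal, single-threaded sketch of that "snapshot" idea, along the lines of a common fixed-timestep game loop; this is a generic illustration, not code from any of the games or drivers benchmarked.

```c
#include <stdio.h>

#define SIM_DT 0.01                      /* simulation advances in fixed 10 ms steps */

static double world_x = 0.0;             /* toy "game state": one coordinate */
static const double velocity = 3.0;      /* game units per second */

static void simulate(double dt) { world_x += velocity * dt; }

/* The renderer simply draws whatever the latest simulated state is. */
static void render_snapshot(void) { printf("draw at x = %.3f\n", world_x); }

int main(void)
{
    double accumulator = 0.0;

    /* Pretend each iteration is one rendered frame; in a real game the frame
     * time comes from a clock and varies with how fast the card can draw. */
    const double frame_times[] = { 0.016, 0.033, 0.021 };
    for (int f = 0; f < 3; f++) {
        accumulator += frame_times[f];
        while (accumulator >= SIM_DT) {  /* catch the simulation up to real time */
            simulate(SIM_DT);
            accumulator -= SIM_DT;
        }
        render_snapshot();               /* the card fetches the current state */
    }
    return 0;
}
```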
  • Do you call crashing routinely within an hour (sometimes within 5 minutes) of starting a game good performance?

    I don't. NVidia's drivers are unstable and crash my system all the time.
  • Whenever I have needed help I have found the nvidia people that sit in that channel very helpful.
  • Trailing behind the POS that is Windows 98 by more than 10% (at the higher-res tests, more like 25%) is DEFINITELY bad! It gets even worse. The disparity gets bigger when you note that Quake III frame rates in NT4 are even faster (by about 10-15%) than in Win98.
  • Were they helpful when you asked them when their driver will be open source? Were they helpful when you asked them when they are going to use XF86 4.0's DRI API instead of their own closed, proprietary one?
    I've asked them neither of those things; quite frankly, I can find little reason to bitch about their drivers when they seem to perform significantly better than the open source ones out there.
  • A) NVIDIA has some unopenable stuff in the drivers.
    B) (Much more important) NVIDIA has no reason to help out Matrox and the others. If you get your head out of your OSS ass and look around, you'll notice that all the card companies *except* NVIDIA are having problems with their OpenGL drivers. If you read last month's Maximum PC, you'll find an interview with the OpenGL driver developer at Matrox. He says that a GL driver is a lot of work, which is why the Matrox GL drivers aren't 100% yet. NVIDIA has a kick-ass OpenGL driver. A GL driver isn't an ordinary graphics driver; it is a complete implementation of the OpenGL pipeline (an ICD). Now, if your implementation of OpenGL were faster than everyone else's (who are having problems with their own implementations and would love to get their hands on yours), wouldn't YOU keep yours closed?
  • It's also somewhat interesting that the card with (by far) the greatest performance under Linux has closed source drivers.

    It's funny you mention this - this is ridiculous, especially for us loyal few who have been patiently waiting for 3dfx to get their head out of their *ss. I would be content with a Linux driver from 3dfx that was halfway decent. They're not even close to the performance of the Windows drivers, and yet NVIDIA Linux users have some pretty solid drivers. Not a real strong argument for open source, is it? My next card will be an NVIDIA, closed source or whatever...
  • To an extent Matt does deal with this--you'll note the absence of the G400 and ATI cards from certain tests because, as Matt noted, they are not stable enough to be considered reviewable.

    As to the NVIDIA drivers, it is my experience that their stability is inversely proportional to how crappy your AGP chipset is. My work box has never had a problem running any of our games, my flatmate's box with an AVi chipset has occasional crashes during games for no explicable reason.

    As a counterpoint, the Win32 TNT setup I'm using right now incorrectly renders triangle strips for some reason...
  • They are the best that you can get, but their customer support sux. Linux has all sorts of problems with them -- mine too!
  • I want to build a Linux OpenGL (err...MESA) development system and plan to use XF86 4.0 and DRI to take advantage of windowed hardware acceleration. Can anyone recommend a solution here?

    YMMV, but here goes. We recently installed SuSE 7.0 on a handful of machines in order to get a "one-stop-shop" for X 4.0 and OpenGL.

    The Matrox G400 and Creative Labs Annihilator Pro GeForce cards we had lying around worked but weren't too stable (Matrox is rock solid on 2D tho'); but (cross fingers) the Creative Labs TNT2 Ultra seems stable (and fast) over the last week or so.

    I would like some feedback from other people who have done more than run fullscreen gaming benchmarks.

    We use them for writing OpenGL and OpenInventor programs, and also Quake III.

  • Because of things like page flipping? Because most Linux users routinely have tens of processes running in the background while using their systems? Because, as good as the NVIDIA drivers are, they are still new (labeled as Beta, even)?
    Because a good Linux driver should perform some minimal input checking to enforce security (I don't know if NVIDIA does this--more of a DRI-ish thing)?

    Etc.

    m.
  • Now, if you want them to do what you tell them to, you better either buy a card or buy some stock

    And after having done that, whine to them, not to us.
  • This is OT I know but I have to respond.

    If the framerate is lower, that doesn't affect the speed of the game, because the game is framerate-independent. They do this by calculating everything according to real time. Let's say you have a car moving at 3 game units per second. To find the car's position, you take the last position and, using the velocity and the time since the last calculation, you compute the new position. This has its own problems if the framerate is too low or too high. It is the same thing as aliasing: if you take few samples, then the resulting model (projectile path, object movement, etc.) will be inaccurate. Example: if you throw a ball up and sample it 3 times in midair, the maximum height achieved probably won't be one of the samples, so in a game you wouldn't be able to jump as high (see the little sketch after this comment). I hope this answered your question... wish I was better at explaining things, sorry.


    My Home: Apartment6 [apartment6.org]
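Here is the ball example from the comment above as a tiny C sketch (the numbers are invented): the trajectory itself is computed from real time, so it is framerate-independent, but at a low sample rate the highest point you ever observe can fall well short of the true apex.

```c
#include <stdio.h>

/* Height of a ball thrown straight up, evaluated from real time:
 * y(t) = v0*t - 0.5*g*t^2, independent of how often it is sampled. */
static double height(double t)
{
    const double v0 = 5.0, g = 9.8;
    return v0 * t - 0.5 * g * t * t;
}

static void sample(double fps, double duration)
{
    double peak = 0.0;
    for (double t = 0.0; t <= duration; t += 1.0 / fps)
        if (height(t) > peak)
            peak = height(t);
    printf("%5.0f fps: highest sampled point %.3f units\n", fps, peak);
}

int main(void)
{
    const double flight = 1.0;   /* roughly when the ball comes back down */
    sample(3.0, flight);         /* coarse sampling misses the true apex */
    sample(60.0, flight);        /* fine sampling gets much closer */
    /* True apex is v0*v0 / (2*g) = 25 / 19.6, about 1.28 units. */
    return 0;
}
```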
  • by nehril ( 115874 ) on Monday October 09, 2000 @06:44AM (#721402)
    I get the impression that Sharky's reviewers have sort of lost touch with the real world a bit. They talk about how one card is better than another because it has TV-out or TV-in... yet how many hardcore gamers use such a feature? would a hardcore gamer (sharky's intended audience) EVER play Q3 on a fuzzy, 640x480 television? Would anyone even drag their rig out into the living room?

    All this from the same crew who benchmark Q3 at 1600x1200, and spit on any card that loses that race (how many people have monitors that can do 1600x1200 at 100Hz anyway?)

    They rated the Elsa card as SuperFantasticGetOneOrDie, yet the identical Powergene card was rated as "bleh," for "those on an extremely tight budget" (the Powergene is only $10 less than the Elsa). But according to the reviewer "you don't get a name brand" with the Powergene, so stay away unless you are ghetto. Reality check, anyone? BOTH cards are stock reference designs, except for a possible future TV in/out module for the Elsa.

    Also, by reading these "shootouts" one would get the impression that quake 3 is the only game on the market. If they benchmark some other game, it has to be a quake clone. I play the Quake series to death, but I also play strategy games like Homeworld (which can bring a video card/cpu to its KNEES during intense battles.) Where is the benchmark on some non-FPS game?

    How about image quality? I personally turn on FSAA on my GeForce when playing Homeworld at 800x600, because it looks SO much better than 1024x768 without FSAA. If Sharky's reviewers would play something besides FPSs, then stuff like image quality would get ranked way higher.

    Anyways, that's the end of my rant. Whenever you read one of these reviews, keep in mind the biases of the reviewer, and remember that they sometimes get caught up in "reviewerland," which is not necessarily connected to the "real world."

  • NVIDIA's drivers just plain suck in the stability arena

    Umm... that's funny, because I have a GeForce 2 and I have no stability problems whatsoever.


  • I'd reply to this if it wasn't at -1 already. Oh whatever. If you don't like it, whatcha doing in the thread anyway? Stop wasting bandwidth/servertime.
  • No matter what engine you put in, and no matter how much you trick it out, you can never turn a Camaro into a Corvette. 3dfx and NVIDIA are the Vettes, ATi is the Camaro, and S3 is... oh, let's say an Escort?
  • Yes, quality is generally scalable in several ways, including but not limited to (a small sketch follows this list):

    - screen resolution in pixels
    - geometric complexity, i.e., the simplification of curved solids, the use of less complex representations of items in the game etc.
    - the presence/absence of special effects that require additional rendering passes before the final image goes to screen (shadows, dynamic lighting effects)
    - filtering algorithms with varying computational requirements that affect the subjective quality of textured geometry and lighting.
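As a rough illustration of how knobs like those in the list above might be grouped into switchable configurations, here is a hypothetical C sketch; the setting names and values are made up and do not correspond to any particular engine's variables.

```c
#include <stdio.h>

/* Hypothetical quality settings a game might expose. */
struct quality {
    int screen_w, screen_h;     /* screen resolution in pixels */
    int geometry_lod;           /* 0 = simplest models, 2 = full detail */
    int dynamic_lights;         /* extra rendering passes on/off */
    int trilinear_filtering;    /* cheaper bilinear vs. costlier trilinear */
};

static const struct quality eye_candy   = { 1600, 1200, 2, 1, 1 };
static const struct quality competitive = {  800,  600, 0, 0, 0 };

static void apply(const struct quality *q)
{
    printf("%dx%d, LOD %d, dynamic lights %s, %s filtering\n",
           q->screen_w, q->screen_h, q->geometry_lod,
           q->dynamic_lights ? "on" : "off",
           q->trilinear_filtering ? "trilinear" : "bilinear");
}

int main(void)
{
    apply(&eye_candy);      /* "see all the eye candy" config */
    apply(&competitive);    /* "competitive edge" config */
    return 0;
}
```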

    As far as what is a better solution... well, it's a matter of personal preference. Everybody wants games to look "pretty", and much of the value of today's games seems to lie in the "wow" factor that comes with roaming a reasonably convincing, immersive environment. But in the highly competitive world of first-person 3D shooters such as the Quake series of games, for example, many hardcore players choose to make compromises in order to sustain high framerates and keep the game as RESPONSIVE as possible. I myself (while certainly not among the finest players of such games) maintain 3-4 different configuration files for different moods and purposes. Sometimes I feel like seeing all the eye candy (map development, benchmarking), sometimes I want the competitive edge that comes with pure speed - it's all a set of compromises.

    <ramble>

    Also remember, 3D games are among the most demanding applications available. Games like the original GL Quake drove the video hardware industry forward as much as they responded to available hardware, and consumers continue to demand more and more "cool stuff". OpenGL was originally formulated as a way to represent 3D geometry in a serious engineering/CAD environment, to display work prior to final rendering; if you had told the folks at Silicon Graphics that their libraries would be the basis for applications that demanded full-screen rendering 60 times a second and consumer-level hardware that could actually do it, they would have laughed in your face.

    </ramble>

    So anyway, yes - there are fairly standard ways to scale the quality of game graphics back in order to sustain framerate. It's a matter of personal preference. I personally can't justify my purchase of one of the original GeForce cards (I had a perfectly functional - no, actually better in terms of 2D desktop graphics - Matrox card with an older 3DFX card for games until a few months ago) EXCEPT that I wanted to play games faster, and have them look cooler. And I'll eventually purchase another card so I can enjoy what I feel to be acceptable performance on the next round of games, and at the same time I'll be able to play Quake III Arena at 1600x1200 with all the options turned up and still get a more-than-usable framerate :)
  • How well will it run with Windows 2000 and Linux
    How well will it play Unreal Tournament and Quake
    How much does it cost?


    Will it be stable
    Does it use an IRQ (S3 chipsets, anyone?)
    How well does the company do forward compatibility

    I've got a Diamond Stealth III S540 card that costs nothing and works pretty much OK. And as for Linux drivers, check on
    http://www.s3.com/default.asp?menu=support&sub_menu=S3_Graphic_Chips&item=drivers&product=Savage4_395


  • Never misspeled a word?? BIG!!!
  • Hang on to that Virge card of yours, it's a bloody relic. Mine has worked for four years now, under 4 different operating systems, and it only recently got replaced. Right now it's still a display adapter, as it is prominently present as a wall decoration:-)
  • by CrusadeR ( 555 ) on Monday October 09, 2000 @03:04AM (#721410) Homepage
    Anandtech [anandtech.com] has a Linux benchmark article up this morning also:
  • All this shows is that even the slowest G400 is an order of magnitude faster than the S3 ViRGE/DX I have now, and more stable. Wish I had the time to fix that, though it might act even funnier on NetBSD than it does on Linux.

    Mr. Piccolo (probably posting anonymously since this is Mosaic)
  • I would be interested in how graphics cards would perform under Berlin, especially 3D graphics. Shouldn't Berlin perform _a lot_ better than X? I mean... it is designed to be a 3D environment, isn't it?
    Maybe it isn't the time yet. Berlin has a long way to go...
  • hmmnn, so X card gives, what, maybe a couple more fps than Y card.

    In the end, the only people that could be bothered to read the article, are hard-core gamers, or someone considering upgrading.
    Neither of which would be reading Slashdot. 90% of Hard-Core gamers don't really care about Linux performance, because all of the best games run on Windows (let's be honest).

    Ah well, it is 15C in Soho, and raining, with a minimum low temperature of 10C
  • Well, lesee here.
    -I- read the article.

    Apparently, you at least looked at it, or you wouldn't have preached what was in it.

    I think the lure of this article is the fact that it's comparing windows and linux graphics. It's important because, unfortunately (as you said), many awesome games are windows only. And articles like this show the capabilities of games/graphics under linux, as well as the community interest in having those capabilities...

    *shrug*, not trying to convince you you're wrong, just another viewpoint...
  • X is not as slow as people seem to think -- witness the good showing of the NVIDIA Linux drivers against the Windows drivers. There is no magic bullet that will give you substantially better performance under Berlin than you have in X.
  • Were they helpful when you asked them when their driver will be open source?

    Yes, they were. They explained to me why the drivers weren't open source, and they had a good reason.

    These days, a company has to watch its back when it comes to patents. Some little snot could sue them into oblivion for a broken patent. Unfortunately, a product as complex as a graphics card can infringe on any number of patents, and it slips through the cracks. Apparently all their cards rely on some technology that is already patented. They didn't admit this outright, but they hinted strongly. It was an honest mistake, but they can't afford to take any chances. Opening the drivers would reveal the infringed patent, and that's a Bad Thing(tm). To make up for it, they have put a lot of time and effort into their drivers. You can see that by their performance.

    Not using the DRI API, however, was a technical decision that I don't agree with. They didn't feel it was adequate. Personally, I'd rather lose 5% of performance for standards-based software, but I don't run the company, and I haven't bought a new card from them in three years because I'm so happy with the one I have. So, I don't feel that I have the authority to tell them what to do. Now, if you want them to do what you tell them to, you'd better either buy a card or buy some stock. Otherwise, you have no right.

    Dave

    'Round the firewall,
    Out the modem,
    Through the router,
    Down the wire,
  • Because the difference is significantly more than a couple of frames per second.

    It's also somewhat interesting that the card with (by far) the greatest performance under Linux has closed source drivers.

    I'm neither a hard core gamer (I occasionally play Quake III) nor considering upgrading but I still found it an interesting read. If it doesn't interest you then why not just shut up and move on?
  • I think I missed something -- why are you talking about Sharky?

    Nobody is interested in ATi. With the exception of their TV output, which I hear is excellent, their products are passable at best. ATi is what you throw into a cheap machine, or into a server that won't be running X (or any other GUI).

    And speaking of flakiness, ATi is the king. They have awful, awful drivers for Windows, and if their open source X servers are stable, it means some small group of hacker hobbyists are better than the ATi programmers, and/or that the server uses limited acceleration features.

    The AiW series doesn't support Linux.

    More pages = more ad impressions.
    More ad impressions = More money.
