
NVidia Accused of Inflating Benchmarks

Junky191 writes "With the NVidia GeForce FX 5900 recently released, this new high-end card seems to beat out ATI's 9800 pro, yet things are not as they appear. NVidia seems to be cheating on their drivers, inflating benchmark scores by cutting corners and causing scenes to be rendered improperly. Check out the ExtremeTech test results (especially their screenshots of garbled frames)."
  • by SRCR ( 671581 ) on Thursday May 15, 2003 @09:07AM (#5963232) Homepage
    Too bad Nvidia has to resort to these things to keep selling their cards. They used to be great, but now I have my doubts.
    • by op51n ( 544058 )
      Upon reading the article, it even states that nVidia doesn't have access to the version of 3DMark2003 used (they're not on the beta team), so there can be errors between the drivers and the 3DMark code without them knowing. This is the kind of thing that can happen, and it will take a driver update to fix, but it does not necessarily mean they are doing anything wrong.
      As someone who has always been impressed by nVidia's driver updates and the benefits they can give each time, I am going to wait to see if it really is somethin
    • by satch89450 ( 186046 ) on Thursday May 15, 2003 @10:00AM (#5963652) Homepage
      [Nvidia] used to be great, but now I have my doubts

      Oh, c'mon. Benchmark fudging has been an ongoing tradition in the computer field. When I was doing computer testing for InfoWorld, I found that some people in a vendor's organization would try to overclock computers so they would do better in the automated benchmarks. ZD Labs found some people who "played" the BAPco graphics benchmarks to earn better scores by detecting that a benchmark was running and cutting corners.

      <Obligatory-Microsoft-bash>

      One of the early players was Microsoft, with its C compiler. I have it from a source in Microsoft that when the Byte C-compiler benchmark figures were published in the early 1980s, Microsoft didn't like being at the back of the pack. "It would take six months to fix the optimizer right." It would take two weeks, though, to put in recognizers for the common benchmarks of the time and insert hand-optimized "canned code" to better their score.

      </Obligatory-Microsoft-bash>

      Microsoft wasn't the only one. How about a certain three-letter company who fudged their software? You have multiple right answers to this one. :)

      When the SPECmark people first formed their benchmark committee, they knew of these practices, so they made the decision that SPECmarks were to be based on real programs, with known input and output, and that the output would be checked for correct answers before the execution times were used.

      And now you know why reputable testing organizations that use artificial workloads check their work with real applications: to catch the cheaters.

      Let me reiterate an earlier comment by Alan Partridge: it's idiots who think that a less-than-one-percent difference in performance is significant. (Whether the shoe fits you is something you have to decide for yourself.) What benchmark articles don't tell you is the spread of results they obtain through multiple testing cycles. When I was doing benchmark testing at InfoWorld, it was common for me to see trial-to-trial spreads of three percent in CPU benchmarks, and broader spreads than that with hard-disk benchmarks. Editors were unwilling to admit to readers that the collected results formed a "cloud" -- they wanted a SINGLE number to put in print. ("Don't confuse the reader with facts, I want to make the point and move on.") I see that in the years since I did this full-time, editors are still insisting on "keep it simple" even when it's wrong.

      Another observation: when I would trace back hardware and software that was played with, the response from upper management was universally astonishment. They would fall over backwards to ensure we got a production piece of equipment. To some extent, I believed their protestations, especially when bearded during their visits to our Labs. One computer company (name withheld to protect the long-dead guilty) was amazed when we took them into the lab and opened up their box. We pointed out that someone had poured White-Out over the crystal can, and that when we carefully removed the layer of gunk the crystal was 20% faster than usual. Talk about over-clocking!

      So when someone says "Nvidia is guilty of lying," I say "prove it": show with positive proof that the benchmark fudging was authorized by top management. I can't tell from the article, but I suspect someone pulled a fast one and will soon be joining the very long high-technology bread line.

      Pray the benchmarkers will always check their work.

      And remember, the best benchmark is YOUR application.

    • Voodoo economics (Score:3, Interesting)

      Nvidia's current problems sound familiar, don't they? 3DFX started floundering once they made it to the top and started worrying more about profit margin and market share than putting out the best video cards. If Nvidia keeps this behavior up, I give it two years before ATI starts looking at buying them out.
  • by binaryDigit ( 557647 ) on Thursday May 15, 2003 @09:08AM (#5963235)
    Isn't this SOP for the entire video card industry? Every few years someone gets caught targeting some aspect of performance to the prevailing benchmarks. I guess that's what happens when people wax on about "my video card does 45300 fps in Quake and yours only does 45292, your card sucks, my experience is soooo much better". For a while now it's been the ultimate hype-driven market with respect to hardware.
    • I know, I thought this was common practice across the board in the video card industry. NVidia has always had the shadiest marketing (remember what the 256 stood for in the GeForce 256?) so I don't really think anyone would be surprised by this.
      • In a way, it's a symptom of the importance that these benchmarks have assumed in reviews. Now, cards are tweaked towards improved performance within a particular benchmark, rather than improving overall.
        • by newsdee ( 629448 ) on Thursday May 15, 2003 @09:35AM (#5963442) Homepage Journal
          Now, cards are tweaked towards improved performance within a particular benchmark

          This is always the case with any chosen performance measurement. Look at managers asked to bring quarterly profits. They tend to be extremely shortsighted...

          Moral of the story: be very wary of how you measure, and always add a qualitative side to your review (e.g. in this case, "driver readiness/completeness").

        • by Ed Avis ( 5917 ) <ed@membled.com> on Thursday May 15, 2003 @09:47AM (#5963550) Homepage
          Why is it that people are assessing the performance of cards based on running the same narrow set of benchmarks each time? Of _course_ if you do that then performance optimization will be narrowly focused on those benchmarks. Not just at the level of blatant cheating (hardcoding a particular text string or clipping plane) but more subtle things like only optimizing one particular code path because that's the only one the benchmark exercises.

          More importantly, why is any benchmark rendering the exact same scene each time? Nobody would test an FPU based on how many times per second it could take the square root of seven. You need to generate thousands, millions of different scenes and render them all. Optionally, the benchmark could generate the scenes at random, saving the random seed so the results are reproducible and can be compared.
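
          As a rough sketch of that idea (the structure and names below are made up, not from any real benchmark), deterministic scene generation from a published seed might look something like this:

          #include <cstdint>
          #include <iostream>
          #include <random>
          #include <vector>

          // Hypothetical scene parameters a benchmark might randomize.
          struct Scene {
              int objectCount;
              float cameraYawDegrees;
          };

          // Generate N scenes deterministically from one seed: publishing the seed
          // with the results makes the run reproducible and comparable.
          std::vector<Scene> generateScenes(std::uint32_t seed, int n) {
              std::mt19937 rng(seed);
              std::uniform_int_distribution<int> objects(100, 10000);
              std::uniform_real_distribution<float> yaw(0.0f, 360.0f);
              std::vector<Scene> scenes;
              for (int i = 0; i < n; ++i)
                  scenes.push_back({objects(rng), yaw(rng)});
              return scenes;
          }

          int main() {
              const std::uint32_t seed = 12345u;   // report this alongside the scores
              for (const Scene& s : generateScenes(seed, 3))
                  std::cout << s.objectCount << " objects, yaw " << s.cameraYawDegrees << "\n";
          }
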
          • by satch89450 ( 186046 ) on Thursday May 15, 2003 @10:19AM (#5963817) Homepage
            Nobody would test an FPU based on how many times per second it could take the square root of seven.

            Really? Do you write benchmarks?

            I used to write benchmarks. It was very common to include worst-case patterns in benchmark tests to try to find corner cases -- the same sort of thing that QA people do to try to find errors. For example, given your example of a floating-point unit: I would include basic operations that would have 1-bits sprinkled throughout the computation. If Intel's QA people had done this with the Pentium, they would have discovered the unprogrammed quadrant of the divide look-up table long before the chip was committed to production.

            Why do we benchmark people do this? Because we are amazed (and amused) at what we catch. Hard disk benchmarks catch disk drives that can't handle certain data patterns well at all, even to the point of being completely unable to read back what we just wrote. My personal favorite: how about modems from big-name companies that drop data when stressed to their fullest?

            The SPECmark group recognizes that a wrong answer is always bad, so they insist that in their benchmarks the unit under test get the right answer before they even talk of timing. This is from canned data, of course, not "generated random scenes." The problem with random data is that you don't know whether the results are right -- or at least whether they match the results you've gotten on other testbeds.

            Besides, how is the software supposed to know how the scene was rendered? Read back the graphics planes and try to interpret the image for "correctness"? First, is this possible with today's graphics cards, and, second, is it feasible to try? Picture analysis is an art unto itself, and I suspect that being able to check rendering adds a whole 'nuther dimension to the problem. I won't say it can't be done, but I will say that it would be expensive.

            For FPUs, it's easy: have a test vector with lots of test cases. Make sure you include as many corner cases as you can conceive. When you make a test run, mix up the test cases so that you don't execute them in the same order every pass. (This will catch problems in vector FPU implementations.) Check those results!

            Now, if you will tell me how to extend that philosophy to graphic cards, we will have something.
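
            A tiny sketch of that kind of harness (the test cases are chosen so the expected quotients are exactly representable; a real vector would be far larger and far nastier):

            #include <algorithm>
            #include <iostream>
            #include <random>
            #include <vector>

            // One FPU test case with an exactly representable, precomputed expected result.
            struct FpuCase {
                double a, b, expected;   // check: a / b == expected
            };

            int main() {
                std::vector<FpuCase> cases = {
                    {1.0, 4.0, 0.25},
                    {3.0, 8.0, 0.375},
                    {255.0, 256.0, 0.99609375},
                    // a real suite would add hundreds of corner cases with 1-bits
                    // sprinkled throughout the mantissa
                };

                // Shuffle the execution order on every pass so the cases never run in
                // the same sequence (catches state-dependent bugs in pipelined FPUs).
                std::mt19937 rng(std::random_device{}());
                std::shuffle(cases.begin(), cases.end(), rng);

                int failures = 0;
                for (const FpuCase& c : cases)
                    if (c.a / c.b != c.expected) {
                        std::cout << "MISMATCH: " << c.a << " / " << c.b << "\n";
                        ++failures;
                    }
                std::cout << failures << " failures\n";
            }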

            • What you are describing isn't benchmarking, it's stress testing.

              Benchmarks are meant to predict performance. While it is essential to check the validity of the answer (wrong answers can be computed infinitely fast), the role of a benchmark isn't to check never-seen-in-practice cases or so-rarely-seen-in-practice-that-running-100x-slower-won't-matter cases.

              That reminds me of the "graphic benchmark" used by some Mac websites that compares Quickdraw/Quartz performance when creating 10k windows. Guess what, Quart
            • Just a note (Score:3, Interesting)

              by Sycraft-fu ( 314770 )
              On the whole scene being rendered correctly:

              It is perfectly possible to read the graphics data from the card and write it to a file, like a TIFF. In fact, I've seen some benchmarking programs that do. Then what you can do, for DirectX at any rate, is compare against a reference renderer. The development version of DX has a full software renderer built in that can do everything. It is slow as hell, being a pure software implementation, but also 100% 'correct' being that it is how DirectX intends for stuff t
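
              A minimal sketch of just the comparison step, assuming the card's frame and the reference render have already been captured as raw 8-bit RGB buffers (the capture itself is API-specific and left out here):

              #include <algorithm>
              #include <cstdint>
              #include <cstdlib>
              #include <iostream>
              #include <vector>

              // Fraction of pixels whose worst per-channel difference exceeds `tolerance`.
              // A small tolerance allows for legitimate rounding differences between
              // the hardware and the reference rasterizer.
              double mismatchRatio(const std::vector<std::uint8_t>& frame,
                                   const std::vector<std::uint8_t>& reference,
                                   int tolerance) {
                  if (frame.size() != reference.size() || frame.size() % 3 != 0) return 1.0;
                  std::size_t badPixels = 0, pixels = frame.size() / 3;
                  for (std::size_t p = 0; p < pixels; ++p) {
                      int worst = 0;
                      for (int c = 0; c < 3; ++c)
                          worst = std::max(worst, std::abs(int(frame[3 * p + c]) - int(reference[3 * p + c])));
                      if (worst > tolerance) ++badPixels;
                  }
                  return double(badPixels) / double(pixels);
              }

              int main() {
                  std::vector<std::uint8_t> captured(640 * 480 * 3, 200);   // stand-in frame data
                  std::vector<std::uint8_t> reference(640 * 480 * 3, 202);  // stand-in reference render
                  std::cout << "mismatch ratio: " << mismatchRatio(captured, reference, 4) << "\n";
              }
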
      • Lies (Score:2, Funny)

        by Flamesplash ( 469287 )
        Lies, Damn Lies, and Marketing
    • by Surak ( 18578 ) * <surak&mailblocks,com> on Thursday May 15, 2003 @09:16AM (#5963297) Homepage Journal
      Goodbye, karma. ;) And, realistically, what does it matter? If two cards are similar in performance, but one is just a little bit faster, in reality it's not going to make *that* much of a difference. You probably wouldn't even notice the difference in performance between the new nVidia card and the ATI 9800, so what all the fuss is about, I have no clue.
      • You probably wouldn't even notice the difference in performance between the new nVidia card and the ATI 9800, so what all the fuss is about, I have no clue.

        Two things, both related to the key demographic:

        1) When you're spending $200USD or more on any piece of hardware, you want to know that your purchasing decision was the best one you could make. Given that the majority of the people making these big-buck video card purchasing decisions are males in high school/college, who in general don't have that m

      • so what all the fuss is about, I have no clue.


        The fuss is about the honesty of nvidia's business practices. I don't know about you, but I do not excuse dishonesty from business people -- they should be held to a very high standard.

        If what ExtremeTech is saying is true (that nvidia purposefully wrote their driver to identify a specific benchmark suite, and then ONLY to inflate the results), it would be incredibly significant. If so, I would *NEVER* buy another nvidia product again -- and I would make clear to the
        • Yeah, but they all do it, and it isn't strictly video board manufacturers either. That '80 GB' hard drive you just bought isn't 80 GB; it's (depending on the manufacturer) either an 80,000,000,000-byte hard drive or an 80,000 MB hard drive... either way it isn't by any stretch of the imagination 80 GB. That Ultra DMA 133 hard drive, BTW, can't really do a sustained 133 MB/s transfer rate either; that's the burst speed, and you'll probably NEVER actually achieve that transfer rate in actual use. That 20" CRT you just bought isn't 20", it's 19.2" of viewable area. A 333 MHz FSB isn't 333 MHz, it's 332-point-something MHz, and even then it isn't really 333 MHz because it's really 166 MHz doubled, since DDR memory allows you to read and write on the high and low sides of the clock. That 2400 DPI scanner you just bought is only 2400 DPI with software interpolation. Your 56K modem can really only do 53K due to FCC regulations requiring the 56K transfer rate to be disabled. The list goes on.
          • by Polo ( 30659 ) * on Thursday May 15, 2003 @12:56PM (#5965426) Homepage
            I believe my 19.2" viewable-area monitor is a twenty-ONE inch monitor, thank-you-very-much!
          • actually it is 80 GB (Score:3, Interesting)

            by Trepidity ( 597 )
            giga = 10^9, and an 80 GB hard drive has 80 x 10^9 (80 billion) bytes. This is standard notation that has been in use for at least a hundred years. Perhaps what you're looking for is 80 GiB, which the hard drives are not advertised as.

            This is standard even in most other parts of computing (anything engineering-oriented especially). For example, that 128kbps mp3 you downloaded is 128000 bits/second, not 128*1024 bits/second.
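
            For the curious, the arithmetic behind the usual "my 80 GB drive shows up as 74-and-change GB" complaint (the operating system is actually reporting GiB):

            #include <iostream>

            int main() {
                const double bytes = 80e9;                               // what the label means: 80 x 10^9
                const double gib = bytes / (1024.0 * 1024.0 * 1024.0);   // binary gigabytes (GiB)
                std::cout << bytes << " bytes = " << gib << " GiB\n";    // ~74.5 GiB
            }
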
    • by Hellkitty ( 641842 ) on Thursday May 15, 2003 @09:16AM (#5963300) Journal
      You make an excellent point. I am tired of spending way too much money trying to reach that holy grail of gaming. The slight improvement in hardware isn't going to change the fact that I'm only a mediocre gamer. The best gamers are going to kick my ass regardless of what hardware they use. I don't need to spend $400 every six months to be reminded of that.
      • I know man, every time I think I've found the holy grail of gaming, it just turns out to be a beacon!
    • by Anonymous Coward on Thursday May 15, 2003 @09:31AM (#5963406)
      Posting anonymously because I used to work for a graphics card company.

      I've seen a video card driver where about half the performance-related source code was put in specifically for benchmarks (WinBench, Quake3, and some CAD-related benchmarks), and that code was ONLY used when the user was running said benchmark. This is one of the MAJOR consumer cards, people.

      So many programming hours put into marketing's request to optimize the drivers for a particular benchmark. It makes me sick to think that we could have been improving the driver's OVERALL performance and adding more features! One of the reasons I left......
    • by medscaper ( 238068 ) on Thursday May 15, 2003 @09:59AM (#5963647) Homepage
      my video card does 45300 fps in quake and yours only does 45292, your card sucks

      Uhhh, can I have the sucky card?

      Please?

  • Hmmmm (Score:2, Interesting)

    Well, they got caught... they obviously aren't too good at it; after all, they did get caught.

    I don't know why anyone ever cheats on benchmarks... how could you ever get away with it? Do you really think no one is going to do their own benchmark? Come on. This is probably one of the most retarded things I have ever seen a company do.

    Oh well, Nvidia is getting to the point where they are going to have to beat out ATI at some point if they want to survive.
    • Re:Hmmmm (Score:5, Informative)

      by drzhivago ( 310144 ) on Thursday May 15, 2003 @09:15AM (#5963285)
      Do you remember how a year or so ago ATI released a driver set that reduced image quality in Quake 3 to increase frame rate?

      Here [tech-report.com] is a link about it in case you forgot or didn't know.

      It just goes to show that both companies play that game, and neither with good results.
      • Re:Hmmmm (Score:4, Informative)

        by D3 ( 31029 ) <daviddhenningNO@SPAMgmail.com> on Thursday May 15, 2003 @09:53AM (#5963600) Journal
        Actually, ATI has done this as far back as the Xpert@Play series from 1997/98. They wrote drivers that gave great benchmarks with the leading benchmark tests. Then people started using game demos as benchmarks and the cards showed their true colors. This is why places like Tom's Hardware use a variety of games, to make it hard for manufacturers to cheat.
    • they obviously aren't too good at it; after all, they did get caught

      What stands out in my mind is that they cheated, and yet they still lose compared to ATI! It's the worst kind of cheating... mediocre.

      do you really think no one is going to do their own benchmark?

      The benchmark will say the same thing when you run it as it did when they ran it... You would have to notice that the images are lower quality to realize that something is awry.

      This is probably one of those most retarded things I h

    • Database Vendors (Score:3, Interesting)

      by CaptainZapp ( 182233 )
      DB vendors absolutely love benchmarks, especially when they can rig them themselves. My take is that it looks good to management-type geezers. Something along the lines of:

      20 zillion transactions per second, provided you have a massively parallel Alpha with 1024 processors and 256 TB of physical memory, for just $23.99 per transaction, assuming that you found your massively parallel Alpha on a heap of scrap metal.

  • I don't know (Score:5, Insightful)

    by tedgyz ( 515156 ) on Thursday May 15, 2003 @09:10AM (#5963247) Homepage
    I have read two reviews on AnandTech [anandtech.com] and [H]ardOCP [hardocp.com]. Neither of them made any such accusations. They both said visual quality was fine.

    Targeting benchmarks is just part of the business. When I was on the compiler team at HP, we were always looking to boost our SPECint/fp numbers.

    In a performance driven business, you would be silly not to do it.
    • Speaking of HardOCP, they saw this kind of thing coming and have been leading a drive against artificial benchmarks since February. They seem to have been focusing on engine-based testing since then, with the likes of Quake 3 and UT in their reviews too.

      More information here [hardocp.com].
      • Re:I don't know (Score:3, Interesting)

        by tedgyz ( 515156 )
        I absolutely agree with the drive towards real-world benchmarks. Who the hell cares if some stupid artificial benchmark can be derailed. Show me problems with a real game or app and then I'll care.
      • This cheat can be used in ALL benchmarks, including real-life game demos. All those benchmarks work the same way: they play a pre-recorded demo and calculate its FPS. And that's how 3DMark works as well.
    • by krog ( 25663 ) on Thursday May 15, 2003 @09:21AM (#5963335) Homepage
      The point is that visual quality *was* fine... within the benchmark's prescribed path. "Off the rail" is where problems started occurring.

      This is why all software and hardware should be open-source.
      • by Anonymous Coward
        This is why all software and hardware should be open-source.

        Right, and why all your bank records should be public (just in case you are stealing and have illegal income). And all your phone records should be public, as well as details of your whereabouts (just in case you're cheating on your wife/skipping class). And of course, why the govt should have access to all your electronic transmissions (internet, cell, etc), just in case you're doing something that they don't like.
      • Hey, here's an off-the-wall idea: why don't you let the people writing the code choose what they want to do with it. If you want to make your code open-source, fine; and if you don't want to use code that isn't open-source, also fine. Otherwise, save the zealotry for Sunday mornings.

    • Re:I don't know (Score:2, Interesting)

      by eddy ( 18759 )

      Read the article. The cheating does not directly affect quality. Then how is it cheating, I hear you ask? Because it only increases performance in the _specific_ scene and path rendered in the benchmark.

      This is similar to claiming to have the world's fastest _calculator_ of decimals of Phi, only to have it revealed that you're simply doing std::cout << phi_string << std::endl;

      ATI, Trident [spodesabode.com] and now nVidia. I really hoped nVidia would stand above this kind of lying.

    • The way I look at it is that the benchmark is based upon performing an identical set of operations; to use an analogy, 100 mathematical problems on an exam paper. Your grade is based on the time it takes for you to answer these 100 questions correctly.
      nVidia realised that the markers of this paper were only checking the accuracy of every other question on this particular exam, so they specifically designed their driver for this specific exam paper to only bother answering every other question.
      You
    • Targeting benchmarks is just part of the business. When I was on the compiler team at HP, we were always looking to boost our SPECint/fp numbers.

      Sure, but the Spec guidelines for optimization say: "Optimizations must generate correct code for a class of programs, where the class of programs must be larger than a single SPEC benchmark or SPEC benchmark suite."

      We all know that vendors target benchmarks. The important question is: does the optimization have a general benefit, other than inflating the bench
  • whatever (Score:2, Insightful)

    by JeffSh ( 71237 )
    I looked at the photos, and it seems to me to be just a driver fuckup on the 3dmark benchmarks.

    Since when did rendering errors caused by driver problems become "proof" of a vendor inflating benchmarks?

    And this story was composed by someone with the qualifications of "website content creator who likes video games a lot", not a driver writer, not anyone technically inclined beyond the typical geek who plays a lot of video games and writes for a website called "EXTREME tech" because, you know, their name makes
    • Re:whatever (Score:5, Informative)

      by Pulzar ( 81031 ) on Thursday May 15, 2003 @09:26AM (#5963375)
      Instead of only looking at the pictures, read the whole article before making decisions on whether it's a driver "fuckup" or an intentional optimization.

      The short of it is that nVidia added hard-coded clipping of the scenes for everything that the benchmark doesn't show in its normal run, which gets exposed as soon as you move the camera away from its regular path.

      It's a step in the direction of recording an MPEG of what the benchmark is supposed to show and then playing it back at 200 fps.
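
      To illustrate the distinction being alleged (a purely hypothetical toy in C++, not anything from an actual driver): honest culling tests geometry against whatever the live camera can see, while path-specific culling tests against a region baked in for the benchmark's fixed camera rail, and that falls apart the moment the camera leaves the rail.

      #include <iostream>

      struct Box { float minX, maxX; };   // toy 1-D "bounding box"

      // Honest path: cull against whatever the current camera actually sees.
      bool visibleToCamera(const Box& b, float camMinX, float camMaxX) {
          return b.maxX >= camMinX && b.minX <= camMaxX;
      }

      // Alleged cheat: cull against a range precomputed for the benchmark's
      // known camera path, regardless of where the camera really is.
      bool visibleOnKnownRail(const Box& b) {
          const float railMinX = -10.0f, railMaxX = 10.0f;  // baked in ahead of time
          return b.maxX >= railMinX && b.minX <= railMaxX;
      }

      int main() {
          Box distantBuilding{40.0f, 50.0f};
          // On the rail both answers agree; move the camera off the rail and the
          // rail-specific test still discards geometry the camera should now see.
          std::cout << visibleToCamera(distantBuilding, 35.0f, 60.0f) << " vs "
                    << visibleOnKnownRail(distantBuilding) << "\n";   // prints "1 vs 0"
      }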

      • Re:whatever (Score:3, Informative)

        by MrBlue VT ( 245806 )
        My opinion (being a 3D programmer) is that the situation is most likely a bug in the 3DMark program itself that then compounds a driver bug in the nVidia drivers. Since the driver itself does not have access to the program's data structures, it would be impossible for the driver to throw away undrawn objects before the point where it would normally do so when clipping. Just because these "leet" game playerz at ExtremeTech think they know anything about graphics programming doesn't mean they actually do.
    • Re:whatever (Score:5, Interesting)

      by GarfBond ( 565331 ) on Thursday May 15, 2003 @09:45AM (#5963522)
      Because these rendering errors only occur when you go off the timedemo camera track. If you were on the normal track (like you would be if you were just running the standard demo) you would not notice it. Go off the track and the card ceases to render properly. It's an optimization that is too specific and too coincidental for the excuse "driver bug" to work. It's not the first time nvidia has been seen to 'optimize' for 3dmark either (there was a driver set, a 42.xx or 43.xx, can't remember, where it didn't even render things like explosions and smoke in game test 1 for 3DM03)
  • Yeah well... (Score:3, Interesting)

    by IpsissimusMarr ( 672940 ) * on Thursday May 15, 2003 @09:11AM (#5963256) Journal
    Read this article NVIDIA's Back with NV35 - GeForceFX 5900 Ultra [anandtech.com]

    3Dmark03 may be inflated but what counts is real world game benching. And FX 5900 wins over ATI in all but Comanche 4.

    Interesting ehh?
    • Re:Yeah well... (Score:3, Insightful)

      by truenoir ( 604083 )
      Same deal with Tom's Hardware. They did some pretty extensive benchmarking and comparison, and the 5900 did very well in real world games (to include the preview DOOM III benchmark). I'm inclined to believe the driver problem nVidia claims. Especially since it's nVidia and not ATI, they'll likely fix it quickly (not wait 3 months until a new card comes out...not that I'm still bitter about my Rage Fury).
    • 3Dmark03 may be inflated but what counts is real world game benching. And FX 5900 wins over ATI in all but Comanche 4.

      Well, in all honesty, this cheat could be used in ALL popular benchmarks. I mean, how do those real-life game benchmarks work? They run a pre-recorded demo and calculate the FPS, just like 3DMark does. The only difference is that in 3DMark you can stop the demo and move the camera around, which exposes this type of cheating. You can't do that in the real-life game demos.

  • The reason (Score:5, Funny)

    by S.I.O. ( 180787 ) on Thursday May 15, 2003 @09:11AM (#5963257)
    They just hired some ATI engineers.
  • Surprisingly, most people didn't flinch when ATI did it. (Remember the Quake.exe vs Quack.exe story?)
    • Well, to tell you the truth... I LIKE application-specific optimization as long as it is general-purpose enough to be applied across the board to that application. However, in this case, the corners are cut in a benchmark and are targeted SPECIFICALLY at the scene as rendered in the benchmark. If ATI had done the same thing in Quake, the pre-recorded timedemos would be faster, but not actual gameplay... that wasn't the case; the game itself was rendered faster. The only poor choice they made was how they
  • by mahdi13 ( 660205 ) <icarus.lnx@gmail.com> on Thursday May 15, 2003 @09:13AM (#5963271) Journal
    nVidia has been one of the more customer-friendly video card makers... ever. They have full support for all platforms, from Windows to Macs to Linux; this makes them, to me, one of the best companies around.
    So now they are falling into the power trap of "we need to be better and faster than the others," which is only going to have them end up like 3DFX in the end. Cutting corners is NOT the way to gain consumer support.

    As I look at it, it doesn't matter if you're the fastest or not... it's the wide variety of platform support that has made them the best. ATi does make better hardware, but their software (drivers) is terrible and not very well supported. If ATi offered the support that nVidia has been giving for the last few years, I would start using ATi hands down... It's the platform support that I require, not speed.
    • As much as I agree with you, I don't think the article gives sufficient grounding for the accusation. First of all, driver optimizations that are specific to a certain type of 3D engine, or even a particular 3D engine, or even a particular application of that 3D engine, aren't per se a bad thing; it's certainly the case that nVidia and ATI probably take specific account of the Q3 and UT2003 engines in their drivers - if those account for a large part of their usage, it would be insane not to. As such, a benchmark t
    • by SubtleNuance ( 184325 ) on Thursday May 15, 2003 @09:49AM (#5963562) Journal
      ATi does make better hardware but their software (drivers) are terrible and not very well supported.

      That is an old accusation that had a kernel of truth 24 months ago, but I've used ATI cards for years, and they have been rock solid since about the time forums like this started to accept that schlock as 100% truth.

      Bottom line: don't believe the hype. This is just *not* true.
  • by Anonymous Coward
    "Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues."

    Wow, some prerelease software is having issues with brand-new drivers? Who would have thought... Why not wait for the official release of the software and the drivers before drawing hasty conclusions?

    In addition, who really cares about 3DMark? Why not use the time that is wasted on the 3DMark benchmark for benchmarking real games? After all
    • by Pulzar ( 81031 ) on Thursday May 15, 2003 @09:39AM (#5963470)
      Please try reading the article in more detail.

      The developer version is not a pre-release, it's the same version with some extra features that let you debug things, change scenes, etc.

      As soon as you move the camera away from its usual benchmark path, you can see that nVidia hard-coded clipping of the benchmark scenes to make it do less work than it would need to in a real game, where you don't know where the camera will be in advance.

      As I mentioned in another post, it's a step in the direction of recording an mpeg of the benchmark and playing it at a high fps rate.
  • Very old practice. (Score:5, Interesting)

    by shippo ( 166521 ) on Thursday May 15, 2003 @09:17AM (#5963307)
    I recall that about 10 years ago one of the video adaptor manufacturers optimised their Windows 3.1 accelerated video drivers to give the best possible performance with the benchmark program Ziff-Davis used for their reviews.

    One test involved continuously writing a text string in a particular font to the screen. This text string was encoded directly in the driver for speed. Similarly, one of the polygon-drawing routines was optimised for the particular polygons used in this benchmark.
    • That was my employer at the time. We released a driver that pushed our Winmark score up to 40, while ATI had 25 or so, and needless to say got caught. The story goes that Marketing had been complaining that ATI had Winmark cheats in their drivers, and ZD wasn't doing anything about it, so we put in as many cheats as we could think of, supposedly to make a point. Of course, all it accomplished was to give us a big ol' black eye. Wouldn't have been the dumbest stunt those clowns in Marketing ever pulled eithe
  • Sigh... (Score:3, Insightful)

    by Schezar ( 249629 ) on Thursday May 15, 2003 @09:18AM (#5963310) Homepage Journal
    Back in the day, Voodoo cards were the fastest (non-pro) cards around when they first came out. A significant subset of users became Voodoo fanboys, which was ok, since Voodoo was the best.

    Voodoo was beaten squarely by other, better video cards in short order. The fanboys kept buying Voodoo cards, and we all know what happened to them ;^)

    GeForce cards appeared. They were the best. They have their fanboys. Radeon cards are slowly becoming the "other, better" cards now.

    Interesting....

    (I'm not sure what point I was trying to make. I'm not saying that nVidia will suck, or that Radeon cards are the best-o. The moral of this story is: fanboys suck, no matter their orientation.)
    • 3DFX slowly died when the GeForces came out, and then 3DFX was acquired by nVidia.
      So, what you are saying is that now that nVidia is slowly dying, they will soon be acquired by ATi in the next couple of years?

      I like that theory...hopefully it doesn't happen to nVidia, but it's a solid theory ;-)
    • Re:Sigh... (Score:3, Interesting)

      by cgenman ( 325138 )
      Good point, but I think the larger point is this:

      No one has ever held onto the #1 spot in the graphics card industry. No one.

      Perhaps it is because, when competing against a monolith, the up-and-comers can convince their engineers to give up hobbies and work 12-hour days. Perhaps it is because the leader at #1 must be conservative in its movements to please the shareholders. Perhaps it is because, with 10 other companies gunning for your head, one of them will be gambling on the right combination of t
  • Investigate what? How to make up excuses like "this is an unexpected irregularity of the driver"? This is ridiculous.

    It's clearly a deliberate attempt. But it looks like NV's going to deny responsibility on this one.

    Shame on them...
  • by BenjyD ( 316700 ) on Thursday May 15, 2003 @09:21AM (#5963338)
    The problem is that people are buying cards based on these silly synthetic benchmarks. When performance in one arbitrary set of tests is so important to sales, naturally you're going to see drivers tailored to improving performance in those tests.

    Of course, if Nvidia's drivers were released under the GPL, none of the mud from this would stick as they could just point to the source code and say "look, no tricks". As it is, we just get a nasty combination of the murky world of benchmarks and the murky world of modern 3D graphics.
    • Of course, if Nvidia's drivers were released under the GPL, none of the mud from this would stick as they could just point to the source code and say "look, no tricks".

      Forgive me, but that sounds like one very stupid idea.

      Why would you want to expose your hard-earned work to the world? NVidia pays its programmers very well to think of wild and imaginative (out o' the box) programming techniques to get the most from their hardware.

      With rogue drivers out there thanks to open-sourcing the code, someone cou
      • by BenjyD ( 316700 ) on Thursday May 15, 2003 @09:57AM (#5963629)
        If Nvidia GPL their drivers, no other company can directly incorporate code from them without also releasing their own drivers under the GPL. So NVidia would find out just as much about ATI as ATI would about them.

        GPLing the drivers would give NVidia:

        1) Thousands of developers willing to submit detailed bug reports, port drivers, improve performance on 'alternative' operating systems etc.
        2) Protection from these kind of cheating accusations
        3) Better relationship with game developers - optimising for an NVidia card when you've got details of exactly how the drivers work is going to be much easier than for a competitor card.
        4) A huge popularity boost amongst the geek community, who spend a lot on hardware every year.

        NVidia is, first and foremost, a hardware company. In the same way that Sun, IBM etc. contribute to open-source projects in order to make their hardware or other services more appealing, NVidia stand to gain a lot too.

        And as for rogue drivers? I suppose you're worried about rogue versions of the Linux kernel destroying your processor?
        • 5) Liability. Though it doesn't Make Sense (tm), if someone downloaded an "optimized driver" from superoptimizedrivers.com that in turn melted their chip or corrupted their video card's RAM in some way, there would be repercussions.

          Realize, in a society in which people sue others over dogs barking too loud, NVidia would definitely hear from a very small but very vocal group about it.

          6) Nvidia's Programmers Don't Want This. Why? Let's say they GPL'd just the Linux reference driver, and in less than two weeks a new optimized version came out that was TWICE as fast as the one before. This makes the programmers look foolish. I know this is pure ego, but it is a concern, I'm sure, for a programmer with a wife and kids.

          I know this all sounds goofy, and trivial. But politics and Common Sense do not mesh. Again, I think your intentions are great and in a perfect world there would be thousands working on making the best, most optimized driver out there.

          But if such a community were to exist (and you know it would), why bother paying a league of great programmers rather than just sending out a few test boards to those most active in that new community, who are more than willing to do the work for Free (as in beer)?

          Just something to think about.
    • by aksansai ( 56788 ) <aksansai@gmEEEail.com minus threevowels> on Thursday May 15, 2003 @12:24PM (#5965115)
      Companies had adopted the fundamental "open-source" philosophy even before Linux and what I call the modern open source movement caught on. Often, a company would have a nice product and license the code to a sub-company (who would modify/repackage/etc. the original product). The license agreement stipulated that 1) all modifications had to be reviewable by the company without restriction from the sub-company, and 2) the modifications had to be approved by the company.

      Take for instance the relationship between Microsoft and IBM during the OS/2 era. The two companies working on the same code base produced OS/2 and, eventually, the NT kernel.

      Or, more recently - the brilliant strategy of Netscape Communications Corporation - the birth of the Mozilla project. To the open source community - take our browser, modify it like hell, make it a better project. You have, of course, Mozilla as the browser - but Netscape (Navigator) still exists (as a repackaged, "enhanced" Mozilla).

      nVidia's source code release would have two major impacts as far as their performance goes.

      1) ATI (et al.) would find the actual software-based enhancements they could also incorporate into their own driver to improve their product.

      2) nVidia could capture the many brilliant software developers that happen to be a part of the whole nVidia "cult" - this could lead to significant advancements to their driver quality (and overall product quality).

      My guess is that the lid is kept so tightly shut on nVidia's drivers because they can keep their chips relatively simple through their complex software driver. ATI, perhaps, has the technical edge in the hardware arena, but does not have the finesse for software enhancing drivers like nVidia does.
  • My suspicion is the benchmark is giving (incorrect) culling info to the driver. ATI's driver ignores it, and Nvidia honors it.
    • I would suspect something like this too... I'm not a 3D card expert, but from what I understood, the way the "cheating" was found was by stopping the whole scene, freezing everything going on (including all processing of culling information). When you then start rotating the camera around, you are supposed to get rendering anomalies, since the scene is optimised to be viewed from a different angle. Why this happens only with the GeForce I don't know, but I would guess that it's because nvidia and ati drivers
  • Reviews should try to uncover it and find out who is doing it right now, which is the only thing that really matters when buying a product.

    The whole Quake / Quack fiasco for ATI was enlightening, but does anyone know if ATI does this currently?

    Frame rates are overrated anyway, since people buying these cards are buying new ones before their current ones drop to noticeably low frame rates. Features, picture quality, and noise are what matter.

    ATI seems to still have the upper hand, and at least for ATI cards ther
    • by pecosdave ( 536896 ) on Thursday May 15, 2003 @09:33AM (#5963427) Homepage Journal
      I bought the first 64MB DDR Radeon right after it came out. I held on to the card for months waiting for ATI to release a driver; it didn't happen. I heard of people having success getting 3D acceleration to work, but I could never duplicate that success.

      Finally, after months of waiting, I traded my Radeon to my roommate and got a GeForce 2 Pro with 64MB of DDR. It runs beautifully on Linux; I even play UT2K3 with it on an Athlon 850. Finally, after having the GeForce2 for about four months, I happened across a site that tells me how to make 3D acceleration work for the Radeon. Too late now; I'm happy with my GeForce, and UT2K3 seems to only really want to work with nVidia anyways.

      I don't think drivers are the best way to defend ATI, considering they tend to shrug off other OSes while nVidia has committed themselves to supporting alternate OSes.
  • Benchmarks are nothing but statistics: in order to get a (more or less) meaningful benchmark, you repeat the same process over and over, possibly in different environments. Then you analyze the results, producing a statistic of whatever you've benchmarked.

    Therefore, the old Disraeli saying applies: "There are lies, damn lies, and statistics."

    Or, to essentially say the same thing without expletives: Never trust a statistic you haven't faked yourself.

  • Not a big deal. (Score:5, Informative)

    by grub ( 11606 ) <slashdot@grub.net> on Thursday May 15, 2003 @09:32AM (#5963416) Homepage Journal

    One has to take all benchmarks with a grain of salt if they come from a party with a financial interest in the product. Win 2K Server outperforms Linux, a Mac is 2x the speed of the fastest Wintel box, my daddy can beat up your daddy...

    It's not surprising, but it is somewhat disappointing.
  • ATI was caught optimizing Quake3. In theory, this is a *good* thing. Quake3 is used by a lot of people, and was/is the engine for many of the games that people buy top end video cards for.

    I'm sure nVidia does the same thing: new Detonator driver releases have been known to get amazing improvements for specific games.

    ATI screwed up by affecting the visual quality. Well, screwing up visual quality would be acceptable if there was a documented setting to turn that particular optimization off, but there wa
  • by mr_luc ( 413048 ) on Thursday May 15, 2003 @09:38AM (#5963467)
    Targetting performance for benchmarks is one thing.

    These drivers were written with specific limits built in that make the drivers COMPLETELY irrelevant to ordinary gaming, as ET demonstrates by moving the camera just a bit from the designated path.

    This would be like chopping the top off of a car to make it lighter, to reduce the distance it takes for it to decelerate in a brake test. Or compensating for a crappy time off the starting line by removing the back half of the car and bolting a couple of RATO rockets where the back seats used to be. Or loading the car up with nitro, or something. You think Car and Driver magazine wouldn't say something?

    These drivers make the card completely unsuitable for ordinary gaming. They aren't 'more powerful' -- they are a completely altered version of the drivers that are ONLY good at improving one particular set of benchmarks.
    • You might be interested to know that automakers (at least American automakers) used to provide "gross" HP ratings for their engines. The gross rating is obtained by running the engine with no alternator, fan, air cleaner, or other loads experienced during normal operation. After the industry switched to net ratings in the early 70's, the HP numbers dropped sharply, and a lot of people thought it was all because of the smog controls.
  • by Anonymous Coward
    Overclockers.com [overclockers.com] has a very well-thought-out editorial on this issue titled "Trust is Earned". It is well worth the read.
  • Enough of that... (Score:3, Interesting)

    by Dwedit ( 232252 ) on Thursday May 15, 2003 @09:47AM (#5963555) Homepage
    Show me the Quack 3 Arena benchmarks! Then we'll decide which card is the best!
  • by D3 ( 31029 ) <daviddhenningNO@SPAMgmail.com> on Thursday May 15, 2003 @09:49AM (#5963569) Journal
    Damn, a few years ago ATI did a similar thing to the drivers with the Xpert@play cards. The cards got good benchmarks that never held up once people actually played the games. They got beat up pretty bad for it at the time. Now it looks like nVidia's turn.
  • Possible solutions. (Score:2, Interesting)

    by eddy ( 18759 )

    The article talks about possible solutions to the problem of "repeatability" while still avoiding the kind of cheating alleged here. I don't remember it mentioning this possible solution, though: how about if the camera were controlled by a mathematical function of a hand-entered seed, like you'd seed a PRNG?

    This way you could repeat the benchmarks by giving the same seed. Generate a 'default one' at each new install (this to ensure clueless reviewers get a new seed). Make it easy to enter a
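
    A minimal sketch of that seeded-camera idea (the names here are hypothetical): derive the whole camera path as a pure function of the seed, so the same seed reproduces the same run, but a driver can't precompute the path for a seed it hasn't seen.

    #include <cmath>
    #include <cstdint>
    #include <iostream>
    #include <random>

    struct CameraPose { float x, y, z, yaw; };

    // The entire flight path is a pure function of (seed, time): same seed,
    // same path, but a different seed gives a path nobody could bake in.
    CameraPose cameraAt(std::uint32_t seed, float t) {
        std::mt19937 rng(seed);
        std::uniform_real_distribution<float> u(0.5f, 2.0f);
        float radius = 20.0f * u(rng), height = 5.0f * u(rng), speed = u(rng);
        return {radius * std::cos(speed * t), height, radius * std::sin(speed * t), speed * t};
    }

    int main() {
        const std::uint32_t seed = 987654u;   // printed alongside the score for reproducibility
        for (float t = 0.0f; t < 3.0f; t += 1.0f) {
            CameraPose p = cameraAt(seed, t);
            std::cout << "t=" << t << "  pos=(" << p.x << ", " << p.y << ", " << p.z << ")\n";
        }
    }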

  • As a programmer, I develop a test plan before I even start writing code. This is similar to someone giving me a requirement, and then changing the requirement after I've built a test plan and developed code toward that test... it's not really fair to the driver developers.

    I'm going to side with nVidia, that this is a bug in the driver. Benchmarks make good testing software, but the best way to ensure good drivers is to make the benchmarks as comprehensive as possible. ExtremeTech is attributing to ma

  • Nvidia has announced a hardware solution to the problem...

    ...the fix consists of another vacuum cleaner to be attached to the card.

  • by YE ( 23647 )
    The 3DMark03 benchmark is cheating in the first place, implementing stencil shadows in two of the game tests in such a braindead manner that no sane programmer would put it in an actual game.

    It also uses ATI-only pixel shaders 1.4, and reverts to dual-pass on other cards.

    Why all this?

    NVIDIA isn't on the 3dmark03 beta program (read: didn't pay FutureMark a hefty lump of greenbacks).
  • by ElGanzoLoco ( 642888 ) on Thursday May 15, 2003 @10:51AM (#5964101) Homepage
    My personal favorite from this article:

    nVidia believes that the GeForceFX 5900 Ultra is trying to do intelligent culling and clipping to reduce its rendering workload

    It's alive ! :-)

  • NVidia not cheating (Score:4, Informative)

    by linux_warp ( 187395 ) on Thursday May 15, 2003 @11:14AM (#5964347) Homepage
    hardocp.com [hardocp.com] on the front page has a great writeup on this.

    But basically, ExtremeTech is just a little bit mad because they were excluded from the DOOM3 benchmarks. Since nvidia refused to pay the tens of thousands of dollars to be a member of the 3DMark03 beta program, they have absolutely no access to the software used to uncover this bug.

    Here is the full excerpt from hardocp.com:

    3DMark Invalid?
    Two days after Extremetech was not given the opportunity to benchmark DOOM3, they come out swinging with heavy charges of NVIDIA intentionally inflating benchmark scores in 3DMark03. What is interesting here is that Extremetech uses tools not at NVIDIA's disposal to uncover the reason behind the score inflations. These tools are not "given" to NVIDIA anymore, as they will not pay the tens of thousands of dollars required to be on the "beta program" for 3DMark "membership".

    nVidia believes that the GeForceFX 5900 Ultra is trying to do intelligent culling and clipping to reduce its rendering workload, but that the code may be performing some incorrect operations. Because nVidia is not currently a member of FutureMark's beta program, it does not have access to the developer version of 3DMark2003 that we used to uncover these issues.

    I am pretty sure you will see many uninformed sites jumping on the news reporting bandwagon today with "NVIDIA Cheating" headlines. Give me a moment to hit this from a different angle.

    First off it is heavily rumored that Extremetech is very upset with NVIDIA at the moment as they were excluded from the DOOM3 benchmarks on Monday and that a bit of angst might have precipitated the article at ET, as I was told about their research a while ago. They have made this statement:

    We believe nVidia may be unfairly reducing the benchmark workload to increase its score on 3DMark2003. nVidia, as we've stated above, is attributing what we found to a bug in their driver.

    Finding a driver bug is one thing, but concluding motive is another.

    Conversely, our own Brent Justice found a NVIDIA driver bug last week using our UT2K3 benchmark that slanted the scores heavily towards ATI. Are we to conclude that NVIDIA was unfairly increasing the workload to decrease its UT2K3 score? I have a feeling that Et has some motives of their own that might make a good story.

    Please don't misunderstand me. Et has done some good work here. I am not in a position to conclude motive in their actions, but one thing is for sure.

    3DMark03 scores generated by the game demos are far from valid in our opinion. Our reviewers have now been instructed to not use any of the 3DMark03 game demos in card evaluations, as those are the section of the test that would be focused on for optimizations. I think this just goes a bit further showing how worthless the 3DMark bulk score really is.

    The first thing that came to mind when I heard about this was to wonder whether NVIDIA was doing it on purpose, to invalidate the 3DMark03 scores by showing how easily they could be manipulated.

    Thanks for reading our thoughts; I wanted to share with you a bit different angle than all those guys who will be sharing with you their in-depth "NVIDIA CHEATING" posts. While our thoughts on this will surely upset some of you, especially the fanATIics, I hope that it will at least let you look at a clouded issue from a different perspective.

    Further on the topics of benchmarks, we addressed them earlier this year, which you might find to be an interesting read.

    We have also shared the following documentation with ATI and NVIDIA while working with both of them to hopefully start getting better and more in-game benchmarking tools. Please feel free to take the documentation below and use it as you see fit. If you need a Word document, please drop me a mail and let me know what you are trying to do.

    Benchmarking Benefiting Gamers

    Objective: To gain reliable benchmarking and image quality tools
  • by Maul ( 83993 ) on Thursday May 15, 2003 @11:36AM (#5964587) Journal
    Companies always tweak their code, insist on tests optimized for their hardware, etc. in order to get an edge up on benchmarks. This is probably especially true in cases where the competition is so neck-and-neck, as it seems to be with the video card industry. It seems that these companies will do anything to show they can get even two or three more FPS than the competition. It is hard to treat any benchmark seriously because of this.

    At the same time, I'm debating what my next video card should be. Even though ATI's hardware might be slightly better this round, the differences will probably be negligible to all but the most extreme gamers. Meanwhile, NVidia has proven to me that they have a history of writing good drivers, and they still provide significantly better support to the Linux community than ATI does.

    For this reason I'm still siding with the GeForce family of video cards.
  • Short Description. (Score:3, Informative)

    by BrookHarty ( 9119 ) on Thursday May 15, 2003 @12:43PM (#5965281) Journal
    Reading the posts, I don't think everyone is understanding the point of the rail test.

    Using the rail test, Nvidia excluded almost all non-visible data. This shows nvidia tweaked its drivers to render only the data seen on the rail test, which would only happen if you tweak your drivers for the benchmark (aka the cheat).

    I'd like it better if benchmarks used the average FPS in a game, and you then went and PLAYED the game and watched for yourself.

    Try 1024x768/1280x1024/1600x1200 with all AA/AF modes. Also, stop using 3GHz P4s for the benchmarks; use a mix of 1GHz/2GHz/3GHz AMD/Intel boxes so we can know if the hardware is worth the upgrade.

  • They did it before (Score:3, Interesting)

    by kwiqsilver ( 585008 ) on Thursday May 15, 2003 @01:12PM (#5965585)
    With the Riva128, back when I had a 3Dfx Voodoo (or Voodoo2).
    They garbled texture maps to achieve a higher transfer rate and frame rate. Then they went legit for the TNT line.
    I guess the belief "if you can't win, cheat" is still there at nvidia.
    I wonder if ATi makes a good Linux driver...
  • by Animats ( 122034 ) on Thursday May 15, 2003 @01:50PM (#5965907) Homepage
    This problem came up in compiler benchmarks years ago, and a solution was developed. Someone wrote a benchmark suite which consisted of widely used benchmarks plus slightly modified versions of them. Honest compilers did the same on both. Compilers that were recognizing the benchmarks did quite differently. The results were presented as a row of bar graphs - a straight line indicated the compiler was honest; peaks indicated a cheat.

    Some compilers miscompiled the modified benchmark, because they recognized the code as the standard benchmark even though it wasn't exactly the same.

    (Anybody have a reference for this? I heard the author give a talk at Stanford years ago.)
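
    A minimal sketch of that detection approach (numbers invented for illustration): score each well-known benchmark and a semantically equivalent but slightly rewritten variant of it; an honest optimizer produces nearly the same ratio everywhere, while a benchmark recognizer shows up as a spike.

    #include <iostream>
    #include <string>
    #include <vector>

    struct BenchPair {
        std::string name;
        double originalScore;   // score on the well-known benchmark
        double modifiedScore;   // score on a slightly rewritten but equivalent version
    };

    int main() {
        std::vector<BenchPair> results = {
            {"dhrystone-like", 100.0, 98.0},
            {"whetstone-like", 80.0, 79.0},
            {"matrix-like", 120.0, 60.0},   // big gap: the original was likely recognized
        };
        for (const BenchPair& r : results) {
            double ratio = r.originalScore / r.modifiedScore;
            std::cout << r.name << ": ratio " << ratio
                      << (ratio > 1.25 ? "  <-- suspicious peak" : "") << "\n";
        }
    }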
