Ati's new card!
Posted: May 4, 2004, 2:33 pm
by pyrella
http://www.hothardware.com/viewarticle. ... =517&cid=2
Preliminary testing is very good. Going to have to wait until both this and the new GeForce card are out with retail drivers to get some true comparisons....but raw power ....looking very nice.
Posted: May 4, 2004, 3:10 pm
by Aslanna
And only one power plug! Bonus.
Posted: May 4, 2004, 4:01 pm
by Winnow
Shhhhh Py!
We need someone to keep buying those nVidia cards to help push ATi to keep pumping out the best product for us to buy!
Posted: May 5, 2004, 11:25 am
by Winnow
It looks like business as usual early on. If you have a prototype fuel cell home power generator, the nVidia 6800 will get you in the ballpark of ATi's new card, but still not quite as good, let alone better.
AnandTech.com gave an edge to ATI, particularly citing the smaller size and power requirements of cards using its chips, but in the end also saw things roughly even.
"I don't think anyone thought the race would be this close after what has been going on over the past couple years with ATI and Nvidia," the site's Derek Wilson said in a review.
HardOCP.com gave both versions of the X800 its "Must Have Hardware" award and said its testing showed Markham, Ontario-based ATI with a clear performance advantage.
"When it comes right down to it the X800Pro matches or beats the GeForce 6800Ultra in game performance and (image quality)," said Brent Justice in his review. "The Radeon X800XT Platinum Edition goes even further and burns through these games like a hot knife through butter, besting Nvidia's 6800Ultra by an easily noticeable real-world margin."
Posted: May 5, 2004, 2:41 pm
by Kguku
Christ, I thought we had seen the last of the Winnow ATi fanboyism. I guess not.
Posted: May 5, 2004, 2:49 pm
by Winnow
Kguku wrote:Christ, I thought we had seen the last of the Winnow ATi fanboyism. I guess not.
That would be BestAvailableProduct Fanboyism.
Posted: May 5, 2004, 2:55 pm
by Kelshara
Eh, it is only very recent history that ATi has had a better product. Their previous products were crap, and their drivers even worse. And yes, you are a fanboi.

Was not aware that they came this fanatical except in the "Blizzard" flavour.
Posted: May 5, 2004, 3:02 pm
by Winnow
Kelshara wrote:Eh, it is only very recent history that ATi has had a better product. Their previous products were crap, and their drivers even worse. And yes, you are a fanboi.

Was not aware that they came this fanatical except in the "Blizzard" flavour.
It's been almost two years now. If that's considered recent in computer technology, then so be it. I don't particularly want to have a shitty product for two years while clinging to a brand name...the problem you have is that after two years, the latest ATi product is still better.
How many more years will you be saying "next gen cards will be different!" If they are, great. I'm buying the best out there when I need a card.
You go clingboy.
Posted: May 5, 2004, 4:20 pm
by Xouqoa
I just got a GeForce FX 5900 for free, so no ATI for me! (Upgraded from GF4 ti4600 from the manufacturer since they don't stock my old card for warranties anymore!)
Posted: May 5, 2004, 4:24 pm
by Janx
Winnow is just one of MANY people (and I'm not talking fanbois) proclaiming ATi as the better hardware. Go read a majority of the hardware sites that bench competing products and you'll see that they favor the ATi card as well.
As it stands I still favor ATI as they have a more refined product. Instead of piling on new features like nVidia, they realized that the features already existing en masse have yet to be utilized to their fullest, and expanded on them. Also, ATi's top-end card doesn't have the power requirements that nVidia's does. 2 power connectors to power a video card? Sheez.
I still don't like ATi's drivers compared to nVidia's though, and there's the matter of OGL performance. I HATE running OGL applications on my 9800 Pro. Right now I can't get CoH to run at anything other than a 60Hz refresh rate; if I try to force it, I get an error that fullscreen mode isn't supported. Also, gamma doesn't "stick": if I tab in/out of an application, it reverts the gamma to a low setting (regardless of what it's set to in game).
ATI Fanboi
Posted: May 5, 2004, 5:13 pm
by Adelrune Argenti
I have a couple of 6800's here that I am testing. Overall, a very nice card compared to the Radeon 9800 XT. More frames per second is always good. I have been spending the last 2 days now trying to track down some of the Radeon X800 cards to test with. Currently, no one has them yet.
For the record, the ATI cards have performed better than the Nvidia cards up till the new 6800. However, ATI countered with their new card and all indications are it will be the card to beat.
Posted: May 5, 2004, 5:22 pm
by masteen
I love how the nVidia people keep saying "It's only been two years!" Well, it's only been 5 years since 3dfx was king of the hill, and where are they today? The graphics industry moves too fast to be anything but a Winnow-caliber whore.

Posted: May 5, 2004, 5:22 pm
by Faerin
As much as I would love to support ATI (them being a Canadian company and all), I just simply can't switch to their cards until they pull their head out of their asses and start supporting architectures other than DirectX.
If all you do is game, then an ATI card may be the way to go (that still remains to be seen, as the early benchmarks are all pretty even atm with the NVx cards generally showing a little higher image quality). But if you do any development/cad work etc.. NVidia is still leader of the pack by a large margin. And with the gap between consumer and pro level hardware closing more and more quickly, ATI may get left behind if they don't wise up.
Posted: May 5, 2004, 5:30 pm
by masteen
I thought that Matrox still owned a large part of the high-end CAD/CAM market? Has nVidia passed them?
Posted: May 5, 2004, 5:53 pm
by Janx
NVx cards showing higher quality images? Everything I've personally experienced and read shows ATI having the upper hand in image quality.
Posted: May 5, 2004, 6:00 pm
by masteen
The 8xAA image is superior on the nVidia, but puts out like 1/4 the fps that the ATI does at comparable settings. In most other AA, aniso, or combined benchmarks, the ATI comes out on top.
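A back-of-the-envelope sketch of why that fps gap appears (the numbers below are hypothetical, not benchmark data): pure multisampling runs the pixel shader once per pixel, while the supersampled portion of a mixed mode shades every sample, and nVidia's 8xS mode is commonly described as 2x multisampling combined with 2x2 supersampling.

```python
def relative_fps(base_fps, ss_factor=1):
    """Rough model: the supersampled part of an AA mode multiplies
    the number of shaded pixels by ss_factor, roughly dividing frame
    rate by the same amount. Multisampling mostly costs extra memory
    bandwidth, which this toy model ignores."""
    return base_fps / ss_factor

# Hypothetical 100 fps baseline:
pure_msaa = relative_fps(100)      # e.g. ATi's 6X MSAA: shader cost unchanged
mixed_8xs = relative_fps(100, 4)   # 8xS: 2x2 supersampled portion

print(pure_msaa, mixed_8xs)  # 100.0 25.0 -> roughly 1/4 the fps
```

This is only a shading-cost caricature; real results also depend on bandwidth, resolution, and the game, which is why the reviews linked in this thread are the real evidence.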
Posted: May 5, 2004, 6:19 pm
by Janx
One thing I do miss since I've gone to ATI is digital vibrance. <3 DV.
Posted: May 5, 2004, 6:34 pm
by Winnow
I need to recheck the breakdown of which graphics card companies are getting the new console game machine contracts and which companies are losing them.
Until last year Nvidia was the overwhelming leader in making the chips that are used to improve the ability of personal computers to portray complex three-dimensional and fast-moving graphics images, a quality particularly prized by video game players. Moreover, it had an exclusive deal with Microsoft for the graphics processing chips used in the Xbox game machine.
That all came undone last year when Nvidia placed an ill-timed bet on a new generation of chip-making technology that ended up reaching the market too late. At the same time, Nvidia lost its Xbox contract to rival ATI when it refused to meet Microsoft's stringent pricing demands.
As a result, after growing at about 100 percent annually for four years, Nvidia's revenue fell from $1.91 billion in 2002 to $1.82 billion in 2003.
"ATI not only outperformed Nvidia, they had a cheaper chip," said David Wu, an analyst at Wedbush Morgan Securities who follows the graphics chip market.
Shares of Nvidia fell from a high of $71.71 in January 2002 to as low as $7.37 in October 2002.
--------
The Radeon X800 series cards perform best in some of our most intensive benchmarks based on newer games or requiring lots of pixel shading power, including Far Cry, Painkiller, UT2004, and 3DMark03's Mother Nature scene—especially at high resolutions with edge and texture antialiasing enabled. The X800s also have superior edge antialiasing. Their 6X multisampling mode reduces edge jaggies better than NVIDIA's 8xS mode, and the presence of temporal antialiasing only underscores ATI's leadership here. With a single-slot cooler, one power connector, and fairly reasonable power requirements, the Radeon X800 XT Platinum Edition offers all its capability with less inconvenience than NVIDIA's GeForce 6800 Ultra. What's more, ATI's X800 series will be in stores first, with a more mature driver than NVIDIA currently has for the GeForce 6800 line.
Posted: May 5, 2004, 8:17 pm
by Kelshara
How many more years will you be saying "next gen cards will be different!" If they are, great. I'm buying the best out there when I need a card.
Have I ever said that? Nope. Did it take a long time for ATi to prove their newest drivers weren't a fucking waste of space? Hell yeah, and I STILL don't like their drivers all that much.. Would I blindly buy ATi? Hell no, nor would I blindly buy Nvidia. But overall, in work situations and at home I have (over the last what, quite a few years now) had less problems with Nvidia than ATi. ATi is ahead now, but who knows who will be ahead tomorrow.
I buy the best bang for the buck usually, whatever brand that might be. So stop putting words in my mouth, dumbass.
Edit: Oh and I still have a Voodoo card laying around! Best card evah!
Posted: May 6, 2004, 2:35 am
by Durew
/drool X800 XT.. definitely will be my next upgrade.. my comp is pretty new, 6 months old.. with a 9600 Pro in it; I can't wait!

Posted: May 6, 2004, 9:40 am
by Dregor Thule
My compositing/digital FX teacher was the one responsible for doing the lighting on the poses for the Ruby character. I got to see the scene file for the actual model. Was pretty cool. Had breast controls ><
Oh, and the funny thing about that pose, if you look at it from any other angle she's falling over backwards, hehe.
Posted: May 6, 2004, 10:52 am
by Drolgin Steingrinder
that model needs camel toe.
Posted: May 6, 2004, 2:24 pm
by Winnow
Dregor Thule wrote:
Oh, and the funny thing about that pose, if you look at it from any other angle she's falling over backwards, hehe.
Using a 6800 she would have fallen over!
Drolgin Steingrinder wrote:
that model needs camel toe.
That's the new Playboy Airbrush filter in action.
Posted: May 7, 2004, 2:58 pm
by Winnow
For those with time, here's an informative thread on the x800 and 6800 cards. Watch the 6800 backer get worked over. Good info especially if you are considering an HTPC but it also explains why ATi owns this round as well.
http://www.avsforum.com/avs-vb/showthre ... did=398227
Posted: May 17, 2004, 9:06 pm
by Adelrune Argenti
Last week, I got my hands on a couple of X800's. I ended up replacing some 6800's with the X800's and wow, what a difference. We were seeing some huge gains in FPS with it. This is definitely the new card to beat. I am now awaiting several more X800's as well as other ATI schwag.
Posted: May 28, 2004, 6:53 pm
by Forthe
Both are nice cards and performance swings depending on the API used. ATI's part outperforms Nvidia by a decent margin with DX9. Nvidia's part outperforms ATI by a decent margin with OGL. They perform fairly evenly on DX8 and older.
If you are mostly a Windows user then I'd go with the ATI, due to nVidia's ugly 2-slot design and power requirements. For non-Windows OSes, or if you run a lot of OGL apps/games, I'd go with nVidia.
Posted: May 28, 2004, 7:41 pm
by Winnow
You can hard mod the x800 pro to open up 8 more pipes as well! More reason to get ATi!
http://forums.overclockers.co.uk/showth ... genumber=5
Posted: June 3, 2004, 7:05 pm
by Forthe
Posted: June 3, 2004, 8:27 pm
by Winnow
old news! ATI already responded to those accusations! There's nothing to see here!
I'll find the link later.
Posted: June 4, 2004, 2:46 am
by Forthe
Winnow wrote:
old news! ATI already responded to those accusations! There's nothing to see here!
I'll find the link later.
ATI's written response is irrelevant. With ATI's current drivers it is impossible to get anisotropic trilinear filtering unless you use all colored mipmaps, which you will only ever see in image quality tests (coincidence, of course).
This is the same optimization, except that ATI excludes colored mipmaps, that nvidia took a shitload of flak for last year. You can now turn off those optimizations in nvidia's drivers and ATI should allow the same.
Every ATI benchmark using anisotropic trilinear filtering is now suspect and judging by the tests done using colored mipmaps they will be revised much lower.
Call a spade a spade.
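The colored-mipmap trick Forthe describes can be illustrated with a toy sketch (this is illustrative logic only, not either vendor's actual driver code): full trilinear filtering blends the two nearest mipmap levels using the fractional LOD, while a "brilinear" shortcut only blends inside a narrow band around each mip transition and falls back to plain bilinear elsewhere.

```python
def trilinear_weight(lod):
    """Full trilinear: the blend fraction between the two nearest
    mipmap levels is just the fractional part of the LOD."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.5):
    """'Brilinear' sketch: blend only inside a band around each mip
    transition; snap to a single level elsewhere. band < 1.0 shrinks
    the transition region (and the filtering work done)."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0               # sample only the finer mip level
    if f >= hi:
        return 1.0               # sample only the coarser mip level
    return (f - lo) / (hi - lo)  # blend only inside the band

# With ordinary textures, adjacent mip levels look nearly identical,
# so the skipped blending is almost invisible. Color each mip level
# differently and the unblended regions show up as solid bands.
```

Which is exactly why the shortcut only disappears when the driver detects colored mipmaps: those are the one case where reviewers would see it.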
Posted: June 4, 2004, 3:19 am
by Winnow
ATI has been cleared on the technical side of things, although they admit to some shenanigans.
ATI's algorithms appear to fit into the legitimate category with professional hardware Web sites hard-pressed to find any image-quality degradations caused by the filtering optimizations. According to Tech Report's Scott Wasson, "ATI's adaptive trilinear appears to be damn near impossible to catch with the naked eye."
They made optimizations, but the optimizations didn't do much for performance, unlike nVidia's shenanigans.
While ATI has been cleared of cheating charges in the technical arena, the company still has to overcome the damage caused by its failure to disclose the existence of the optimizations to the editors of major hardware publications.
A spade is only a spade if it makes a significant difference and in this case it didn't while the nVidia optimizations did. Just more nVidia propaganda!
Here's more of nVidia trying to pull a fast one with the PCI Express announcement.
http://www.theinquirer.net/?article=16332
Posted: June 4, 2004, 5:04 am
by Forthe
I have no problem with optimizations that improve speed with little cost.
I do have a problem when a company advertises it does anisotropic trilinear filtering when in fact the product will never do anisotropic trilinear filtering (minus IQ tests).
The right solution is to put the option in there to disable it and then have the marketing department praise up your optimization. Much better than lying about it.
PS. I have no doubt Nvidia x-ray'd ATI's chips. Damn yanks

Posted: June 4, 2004, 9:11 am
by Kelshara
This is amusing... Winnow needs bigger oars to keep rowing like that!

Posted: June 4, 2004, 11:56 am
by Winnow
Forthe wrote:I have no problem with optimizations that improve speed with little cost.
I do have a problem when a company advertises it does anisotropic trilinear filtering when in fact the product will never do anisotropic trilinear filtering (minus IQ tests).
The right solution is to put the option in there to disable it and then have the marketing department praise up your optimization. Much better than lying about it.
PS. I have no doubt Nvidia x-ray'd ATI's chips. Damn yanks

Do you have a fetish for anisotropic trilinear filtering or something, Forthe? : ) Since ATI is a Canadian company I'm cutting them some slack on throwing around big fancy techno terms, and concentrating on the bottom-line performance of newly released cards.
Both companies have to hype their products but all you need to do is check any of the geek websites to know which card performs better.
The difference here is ATI made optimizations that were like a placebo and didn't do anything to help performance while nVidia's optimizations were the equivalent of a football player shooting himself up with steroids.
I'm trying to promote peace and harmony here by supporting a Canadian company!
Posted: June 4, 2004, 12:05 pm
by Kelshara
Support a Canadian company all you want, I support the clothes your avatar wears!
Posted: June 4, 2004, 2:16 pm
by Forthe
Winnow wrote:The difference here is ATI made optimizations that were like a placebo and didn't do anything to help performance while nVidia's optimizations were the equivalent of a football player shooting himself up with steroids.
The optimizations did help performance. Just check out the performance drop when using colored mipmaps.
Winnow wrote:Both companies have to hype their products but all you need to do is check any of the geek websites to know which card performs better.
The majority of tests where ATI showed better performance were the benchmarks with anisotropic filtering turned on. Very often ATI was behind with no AA or aniso turned on and with just AA turned on. From Tom's:
As we will see repeatedly throughout our benchmark section, anisotropic filtering is the greatest strength of the new Radeon X800 cards.
The real hypocrisy is that ATI has been pushing reviewers to turn off nVidia's brilinear optimizations, giving them directions to do so and condemning it as a cheat, while hiding the fact that they were using brilinear optimizations themselves.
Posted: June 6, 2004, 8:21 pm
by Winnow
GeForce 6800 Only Has 8 Pipelines?
Posted by Newsfactory on Monday, June 07 @ 01:10:04 CEST
Thanks to Digitalwanderer for pointing me to this post over at Beyond3D's Forum posted by Ufon.
If you check out his benchies, which were run on an Inno3D GeForce 6800 Ultra 400/1100 MHz and a Leadtek A400 TDH (GeForce 6800) 335/700 MHz, it seems that these cards are running with 8 pipes and not the 12 nVidia claimed.
No HDTV in Geforce 6800
Posted by El_Coyote on Saturday, June 05 @ 14:55:03 CEST
Quote:
"If you’re counting on HDTV working on your spanky new 6800, you’re out of luck: it has been confirmed this morning by Derek Perez of NVIDIA that both the core and the existing reference board have no capacity to support an HDTV signal – though it is planned for the future NV43 & NV41 cores. If you are hoping a third party vendor is going to slap a discrete chip on there, you are likely to be disappointed because as far as we know no vendors are straying from the reference design."
Bahahahah. What an oversized, overheating piece of shit card. Please buy it!
Posted: June 6, 2004, 8:25 pm
by Kelshara
I love how everything Nvidia does is BAD and whenever ATi lies they are just doing what is best for the customer.. fanboi!
And WTF would I run HDTV on a computer anyway?
Posted: June 6, 2004, 8:58 pm
by Winnow
Kelshara wrote:I love how everything Nvidia does is BAD and whenever ATi lies they are just doing what is best for the customer.. fanboi!
And WTF would I run HDTV on a computer anyway?
HTPCs (Home Theater Personal Computers)
There are lots of things you can do to an image if you run it through your computer. ffdshow has all sorts of image-enhancing filters. Integration between PCs and home theaters is growing rapidly. You personally may not use it, but many would like to.
I can guarantee the DVD movies I view on my projector look better than anything you've ever seen from a standalone DVD player, including the new 1080i upscaling players, or from a normal DVD player on a PC, unless of course you're using ffdshow : )
You can read up on that here:
http://htpcnews.com/main.php?id=ffdshowdvd_1
You can read up on HTPCs a bit here:
http://www.avsforum.com/avs-vb/forumdis ... forumid=26