Ati's new card!
http://www.hothardware.com/viewarticle. ... =517&cid=2
Preliminary testing is very good. Going to have to wait until both this and the new GeForce card are out with retail drivers to get some true comparisons....but raw power ....looking very nice.
Pyrella - Illusionist - Leader of Ixtlan on Antonia Bayle
if you were walking around and you came upon a tulip with tits, would you let it be for the rest of the world to enjoy.. or would you pick it and carry it off to a secluded area to motorboat them?
-Cadalano
It looks like business as usual early on. If you happen to have a prototype fuel-cell home power generator, the nVidia 6800 will get you into the ballpark of ATi's new card, but it still won't match or beat it.
AnandTech.com gave an edge to ATI, particularly citing the smaller size and power requirements of cards using its chips, but in the end also saw things roughly even.
"I don't think anyone thought the race would be this close after what has been going on over the past couple years with ATI and Nvidia," the site's Derek Wilson said in a review.
HardOCP.com gave both versions of the X800 its "Must Have Hardware" award and said its testing showed Markham, Ontario-based ATI with a clear performance advantage.
"When it comes right down to it the X800Pro matches or beats the GeForce 6800Ultra in game performance and (image quality)," said Brent Justice in his review. "The Radeon X800XT Platinum Edition goes even further and burns through these games like a hot knife through butter, besting Nvidia's 6800Ultra by an easily noticeable real-world margin."
Kelshara wrote: Eh, it is only very recent history that ATi has had a better product. Their previous products were crap, and their drivers even worse. And yes, you are a fanboi. Was not aware that they came this fanatical except in the "Blizzard" flavour.

It's been almost two years now. If that's considered recent in computer technology, then so be it. I don't particularly want to be stuck with a shitty product for two years while clinging onto a brand name... and the problem you have is that after two years, the latest ATi product is still better.
How many more years will you be saying "next gen cards will be different!" If they are, great. I'm buying the best out there when I need a card.
You go clingboy.
- Janx
- Almost 1337
- Posts: 537
- Joined: July 3, 2002, 1:44 pm
- Gender: Mangina
- XBL Gamertag: Janx
- Location: Memphis
Winnow is just one of MANY people (and I'm not talking about fanbois) who are proclaiming ATi as the better hardware. Go read the majority of the hardware sites that bench competing products and you'll see that they favor the ATi card as well.
As it stands I still favor ATI, as they have a more refined product. Instead of adding new features a la Nvidia, they realized that the features already out there en masse have yet to be utilized to their fullest, and expanded on them. Also, ATi doesn't require the power consumption on their top-end card that Nvidia does. Two power connectors to power a video card? Sheez.
I still don't like ATi's drivers compared to Nvidia's though, and there's the matter of OGL performance. I HATE running OGL applications on my 9800 Pro. Right now I can't get CoH to run at anything other than a 60Hz refresh rate; if I try to force it, I get an error that fullscreen mode isn't supported. Also, gamma doesn't "stick": if I tab in or out of an application, it reverts the gamma to a low setting (regardless of what it's set to in game).
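A common workaround for gamma not "sticking" is to re-apply the ramp yourself whenever the game window regains focus. Below is a rough Win32/GDI sketch of that idea; the 1.3 gamma value and the notion of calling it from a focus handler are assumptions for illustration, not anything ATi's drivers or CoH actually do.

```cpp
// Re-apply a display gamma ramp via GDI (sketch only).
// Link against gdi32 and user32 (e.g. -lgdi32 -luser32 with MinGW).
#include <windows.h>
#include <cmath>

// Build a simple power-curve gamma ramp and push it to the primary display.
// gamma > 1.0 brightens, < 1.0 darkens, 1.0 is the identity ramp.
bool ApplyGamma(double gamma)
{
    WORD ramp[3][256];
    for (int i = 0; i < 256; ++i)
    {
        // Map 0..255 through the gamma curve into the 0..65535 range GDI expects.
        double v = 65535.0 * std::pow(i / 255.0, 1.0 / gamma);
        if (v > 65535.0) v = 65535.0;
        WORD w = static_cast<WORD>(v);
        ramp[0][i] = ramp[1][i] = ramp[2][i] = w; // same curve for R, G and B
    }
    HDC screen = GetDC(nullptr);                 // device context for the whole screen
    BOOL ok = SetDeviceGammaRamp(screen, ramp);  // may fail if the driver rejects the ramp
    ReleaseDC(nullptr, screen);
    return ok != FALSE;
}

int main()
{
    // Hypothetical usage: call this again whenever the game window regains focus.
    return ApplyGamma(1.3) ? 0 : 1;
}
```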
ATI Fanboi
- Adelrune Argenti
- Almost 1337
- Posts: 831
- Joined: July 9, 2002, 4:22 pm
- Location: San Diego, CA
I have a couple of 6800s here that I am testing. Overall, a very nice card compared to the Radeon 9800 XT. More frames per second is always good. I have spent the last two days trying to track down some of the Radeon X800 cards to test with; so far, no one has them yet.
For the record, the ATI cards have performed better than the Nvidia cards up till the new 6800. However, ATI countered with their new card and all indications are it will be the card to beat.
Adelrune Argenti
- masteen
- Super Poster!
- Posts: 8197
- Joined: July 3, 2002, 12:40 pm
- Gender: Mangina
- Location: Florida
- Contact:
I love how the nVidia people keep saying "It's only been two years!" Well, it's only been 5 since 3dfx was king of the hill, and where are they today? The graphics industry moves too fast to be anything but a Winnow-caliber whore. 

"There is at least as much need to curb the cruel greed and arrogance of part of the world of capital, to curb the cruel greed and violence of part of the world of labor, as to check a cruel and unhealthy militarism in international relationships." -Theodore Roosevelt
-
- Gets Around
- Posts: 104
- Joined: July 3, 2002, 12:29 pm
- Location: Ottawa, Ontario. Canada
- Contact:
As much as I would love to support ATI (them being a Canadian company and all), I simply can't switch to their cards until they pull their head out of their asses and start supporting APIs other than DirectX.
If all you do is game, then an ATI card may be the way to go (that still remains to be seen, as the early benchmarks are all pretty even atm, with the NVx cards generally showing a little higher image quality). But if you do any development/CAD work etc., NVidia is still the leader of the pack by a large margin. And with the gap between consumer and pro-level hardware closing more and more quickly, ATI may get left behind if they don't wise up.
- masteen
- Super Poster!
- Posts: 8197
- Joined: July 3, 2002, 12:40 pm
- Gender: Mangina
- Location: Florida
- Contact:
I thought that Matrox still owned a large part of the high-end CAD/CAM market? Has nVidia passed them?
"There is at least as much need to curb the cruel greed and arrogance of part of the world of capital, to curb the cruel greed and violence of part of the world of labor, as to check a cruel and unhealthy militarism in international relationships." -Theodore Roosevelt
- masteen
- Super Poster!
- Posts: 8197
- Joined: July 3, 2002, 12:40 pm
- Gender: Mangina
- Location: Florida
- Contact:
The 8xAA image is superior on the nVidia, but puts out like 1/4 the fps that the ATI does at comparable settings. In most other AA, aniso, or combined benchmarks, the ATI comes out on top.
"There is at least as much need to curb the cruel greed and arrogance of part of the world of capital, to curb the cruel greed and violence of part of the world of labor, as to check a cruel and unhealthy militarism in international relationships." -Theodore Roosevelt
I need to recheck the breakdown of which graphics card companies are getting the new console game machine contracts and which companies are losing them.
Until last year Nvidia was the overwhelming leader in making the chips that are used to improve the ability of personal computers to portray complex three-dimensional and fast-moving graphics images, a quality particularly prized by video game players. Moreover, it had an exclusive deal with Microsoft for the graphics processing chips used in the Xbox game machine.
That all came undone last year when Nvidia placed an ill-timed bet on a new generation of chip-making technology that ended up reaching the market too late. At the same time, Nvidia lost its Xbox contract to rival ATI when it refused to meet Microsoft's stringent pricing demands.
As a result, after growing at about 100 percent annually for four years, Nvidia's revenue fell from $1.91 billion in 2002 to $1.82 billion in 2003.
"ATI not only outperformed Nvidia, they had a cheaper chip," said David Wu, an analyst at Wedbush Morgan Securities who follows the graphics chip market.
Shares of Nvidia fell from a high of $71.71 in January 2002 to as low as $7.37 in October 2002.
The Radeon X800 series cards perform best in some of our most intensive benchmarks based on newer games or requiring lots of pixel shading power, including Far Cry, Painkiller, UT2004, and 3DMark03's Mother Nature scene—especially at high resolutions with edge and texture antialiasing enabled. The X800s also have superior edge antialiasing. Their 6X multisampling mode reduces edge jaggies better than NVIDIA's 8xS mode, and the presence of temporal antialiasing only underscores ATI's leadership here. With a single-slot cooler, one power connector, and fairly reasonable power requirements, the Radeon X800 XT Platinum Edition offers all its capability with less inconvenience than NVIDIA's GeForce 6800 Ultra. What's more, ATI's X800 series will be in stores first, with a more mature driver than NVIDIA currently has for the GeForce 6800 line.
Winnow wrote: How many more years will you be saying "next gen cards will be different!" If they are, great. I'm buying the best out there when I need a card.

Have I ever said that? Nope. Did it take a long time for ATi to prove their newest drivers weren't a fucking waste of space? Hell yeah, and I STILL don't like their drivers all that much. Would I blindly buy ATi? Hell no, nor would I blindly buy Nvidia. But overall, in work situations and at home I have (over the last what, quite a few years now) had fewer problems with Nvidia than ATi. ATi is ahead now, but who knows who will be ahead tomorrow.
I buy the best bang for the buck usually, whatever brand that might be. So stop putting words in my mouth, dumbass.
Edit: Oh, and I still have a Voodoo card lying around! Best card evah!
- Dregor Thule
- Super Poster!
- Posts: 5994
- Joined: July 3, 2002, 8:59 pm
- Gender: Male
- XBL Gamertag: Xathlak
- PSN ID: dregor77
- Location: Oakville, Ontario

My compositing/digital FX teacher was the one responsible for doing the lighting on the poses for the Ruby character. I got to see the scene file for the actual model. Was pretty cool. Had breast controls ><
Oh, and the funny thing about that pose, if you look at it from any other angle she's falling over backwards, hehe.
- Drolgin Steingrinder
- Way too much time!
- Posts: 3510
- Joined: July 3, 2002, 5:28 pm
- Gender: Male
- PSN ID: Drolgin
- Location: Århus, Denmark
For those with time, here's an informative thread on the x800 and 6800 cards. Watch the 6800 backer get worked over. Good info especially if you are considering an HTPC but it also explains why ATi owns this round as well.
http://www.avsforum.com/avs-vb/showthre ... did=398227
- Adelrune Argenti
- Almost 1337
- Posts: 831
- Joined: July 9, 2002, 4:22 pm
- Location: San Diego, CA
- Forthe
- Way too much time!
- Posts: 1719
- Joined: July 3, 2002, 4:15 pm
- XBL Gamertag: Brutus709
- Location: The Political Newf
Both are nice cards and performance swings depending on the API used. ATI's part outperforms Nvidia by a decent margin with DX9. Nvidia's part outperforms ATI by a decent margin with OGL. They perform fairly evenly on DX8 and older.
If you are mostly a Windows user then I'd go with the ATI, due to Nvidia's ugly two-slot design and power requirements. For non-Windows OSes, or if you run a lot of OGL apps/games, I'd go with Nvidia.
All posts are personal opinion.
My opinion may == || != my guild's.
"All spelling mistakes were not on purpose as I dont know shit ." - Torrkir
You can hard mod the X800 Pro to open up 4 more pipes as well! More reason to get ATi!
http://forums.overclockers.co.uk/showth ... genumber=5
Forthe wrote: Seems ATI may have been cheating some.
http://www.tomshardware.com/graphic/20040603/index.html
I'll find the link later.

Old news! ATI already responded to those accusations! There's nothing to see here!
- Forthe
- Way too much time!
- Posts: 1719
- Joined: July 3, 2002, 4:15 pm
- XBL Gamertag: Brutus709
- Location: The Political Newf
Winnow wrote:
Forthe wrote: Seems ATI may have been cheating some.
http://www.tomshardware.com/graphic/20040603/index.html
I'll find the link later.
Old news! ATI already responded to those accusations! There's nothing to see here!

ATI's written response is irrelevant. With ATI's current drivers it is impossible to get anisotropic trilinear filtering unless you use all colored mipmaps, which you will only ever see in image-quality tests (coincidence, of course).
This is the same optimization that nvidia took a shitload of flak for last year, except that ATI excludes colored mipmaps. You can now turn off those optimizations in nvidia's drivers, and ATI should allow the same.
Every ATI benchmark using anisotropic trilinear filtering is now suspect and judging by the tests done using colored mipmaps they will be revised much lower.
Call a spade a spade.
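To make the trilinear-versus-brilinear point concrete, here's a toy C++ sketch. It is a simplified model, not ATI's or NVIDIA's actual filtering logic: each mip level is a distinct solid color, full trilinear blends the two nearest levels across the whole fractional LOD range, and a brilinear-style shortcut (with an assumed 0.25-wide blend band) snaps to the nearest level outside that band. With colored mipmaps the missing blend range shows up as hard color bands; with real textures it is nearly invisible, which is exactly why it only surfaces in IQ tests.

```cpp
// Toy model of trilinear vs. "brilinear" mip blending -- an illustration only,
// not vendor code. Each mip level is a solid, distinct color so that any
// reduction in blending is easy to spot, just like a colored-mipmap IQ test.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Color { double r, g, b; };

// Pretend mip chain: level 0 = red, 1 = green, 2 = blue, 3 = white.
const Color kMipColor[4] = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1}, {1, 1, 1} };

Color Lerp(const Color& a, const Color& b, double t)
{
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t, a.b + (b.b - a.b) * t };
}

// Full trilinear: blend the two nearest mip levels by the fractional LOD.
Color Trilinear(double lod)
{
    int lo = std::min(3, static_cast<int>(std::floor(lod)));
    int hi = std::min(3, lo + 1);
    return Lerp(kMipColor[lo], kMipColor[hi], lod - lo);
}

// Brilinear-style shortcut: only blend inside a narrow band around the mip
// transition; elsewhere just use the nearest level (plain bilinear).
// The 0.25 band width is an assumption for illustration.
Color Brilinear(double lod, double band = 0.25)
{
    int lo = std::min(3, static_cast<int>(std::floor(lod)));
    int hi = std::min(3, lo + 1);
    double f = lod - lo;
    double t = std::clamp((f - (0.5 - band / 2)) / band, 0.0, 1.0);
    return Lerp(kMipColor[lo], kMipColor[hi], t);
}

int main()
{
    // Walk the LOD range and print both results side by side.
    for (double lod = 0.0; lod <= 2.0; lod += 0.125)
    {
        Color t = Trilinear(lod), b = Brilinear(lod);
        std::printf("lod %.3f  trilinear (%.2f %.2f %.2f)  brilinear (%.2f %.2f %.2f)\n",
                    lod, t.r, t.g, t.b, b.r, b.g, b.b);
    }
    return 0;
}
```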
All posts are personal opinion.
My opinion may == || != my guild's.
"All spelling mistakes were not on purpose as I dont know shit ." - Torrkir
ATI has been cleared on the technical side of things, although they admit to some shenanigans.
ATI's algorithms appear to fit into the legitimate category with professional hardware Web sites hard-pressed to find any image-quality degradations caused by the filtering optimizations. According to Tech Report's Scott Wasson, "ATI's adaptive trilinear appears to be damn near impossible to catch with the naked eye."

They made optimizations, but unlike nVidia's shenanigans, they didn't do much for performance.

While ATI has been cleared of cheating charges in the technical arena, the company still has to overcome the damage caused by its failure to disclose the existence of the optimizations to the editors of major hardware publications.

A spade is only a spade if it makes a significant difference, and in this case it didn't, while the nVidia optimizations did. Just more nVidia propaganda!
Here's more of nVidia trying to pull a fast one with the PCI Express announcement.
http://www.theinquirer.net/?article=16332
- Forthe
- Way too much time!
- Posts: 1719
- Joined: July 3, 2002, 4:15 pm
- XBL Gamertag: Brutus709
- Location: The Political Newf
I have no problem with optimizations that improve speed with little cost.
I do have a problem when a company advertises it does anisotropic trilinear filtering when in fact the product will never do anisotropic trilinear filtering (minus IQ tests).
The right solution is to put the option in there to disable it and then have the marketing department praise up your optimization. Much better than lying about it.
PS. I have no doubt Nvidia x-ray'd ATI's chips. Damn yanks

All posts are personal opinion.
My opinion may == || != my guild's.
"All spelling mistakes were not on purpose as I dont know shit ." - Torrkir
Forthe wrote: I have no problem with optimizations that improve speed with little cost.
I do have a problem when a company advertises it does anisotropic trilinear filtering when in fact the product will never do anisotropic trilinear filtering (minus IQ tests).
The right solution is to put the option in there to disable it and then have the marketing department praise up your optimization. Much better than lying about it.
PS. I have no doubt Nvidia x-ray'd ATI's chips. Damn yanks

Do you have a fetish about anisotropic trilinear filtering or something, Forthe? : ) Since ATI is a Canadian company, I'm cutting them some slack on throwing around big fancy techno terms and concentrating on the bottom-line performance of newly released cards.
Both companies have to hype their products but all you need to do is check any of the geek websites to know which card performs better.
The difference here is ATI made optimizations that were like a placebo and didn't do anything to help performance while nVidia's optimizations were the equivalent of a football player shooting himself up with steroids.
I'm trying to promote peace and harmony here by supporting a Canadian company!
- Forthe
- Way too much time!
- Posts: 1719
- Joined: July 3, 2002, 4:15 pm
- XBL Gamertag: Brutus709
- Location: The Political Newf
Winnow wrote: The difference here is ATI made optimizations that were like a placebo and didn't do anything to help performance while nVidia's optimizations were the equivalent of a football player shooting himself up with steroids.

The optimizations did help performance. Just check out the performance drop when using colored mipmaps.

Winnow wrote: Both companies have to hype their products but all you need to do is check any of the geek websites to know which card performs better.

The majority of tests where ATI showed better performance were the benchmarks with anisotropic filtering turned on. Very often ATI was behind with no AA or aniso turned on, and with just AA turned on. From Tom's: "As we will see repeatedly throughout our benchmark section, anisotropic filtering is the greatest strength of the new Radeon X800 cards."

The real hypocrisy is that ATI has been pushing reviewers to turn off nvidia's brilinear optimizations, giving them directions to do so and condemning it as a cheat, while they were hiding the fact that they were using brilinear optimizations themselves.
All posts are personal opinion.
My opinion may == || != my guild's.
"All spelling mistakes were not on purpose as I dont know shit ." - Torrkir
GeForce 6800 Only Has 8 Pipelines?
Posted by Newsfactory on Monday, June 07 @ 01:10:04 CEST
Thanks to Digitalwanderer for pointing me to this post over at Beyond3D's Forum posted by Ufon.
Check out his benchies, which were run on an Inno3D GeForce 6800 Ultra (400/1100 MHz) and a Leadtek A400 TDH (GeForce 6800) at 335/700 MHz. It seems that these cards are running with 8 pipes and not the 12 Nvidia claimed.
No HDTV in Geforce 6800
Posted by El_Coyote on Saturday, June 05 @ 14:55:03 CEST
Quote:
"If you’re counting on HDTV working on your spanky new 6800, you’re out of luck: it has been confirmed this morning by Derek Perez of NVIDIA that the both the core and the existing reference board have no capacity to support an HDTV signal – though it is planned for the future NV43 & NV41 cores. If you are hoping a third party vendor is going to slap a discrete chip on there, you are likely to be disappointed because as far as we know no vendors are straying from the reference design."
Bahahahah. What an oversized, overheating piece of shit card. Please buy it!
Kelshara wrote: I love how everything Nvidia does is BAD and whenever ATi lies they are just doing what is best for the customer.. fanboi!
And WTF would I run HDTV on a computer anyway?

HTPCs (Home Theater Personal Computers).
There are lots of things you can do to an image if you run it through your computer. ffdshow has all sorts of image-enhancing filters, and integration between PCs and home theaters is growing rapidly. You personally may not use it, but many would like to.
I can guarantee the DVD movies I view on my projector look better than anything you've ever seen from a standalone DVD player, including the new 1080i upscaling players, or from normal DVD players on the PC, unless of course you're using ffdshow : )
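As an illustration of the kind of post-processing a filter chain like ffdshow's applies, here's a tiny self-contained C++ sketch. It is not ffdshow's code, just a basic 3x3 sharpen kernel run over a made-up grayscale frame; the frame size and values are arbitrary assumptions for the example.

```cpp
// Minimal sharpen-filter sketch (illustration only, not ffdshow source).
// Applies a classic 3x3 sharpen kernel (center 5, four neighbors -1) to an
// 8-bit grayscale frame and leaves the one-pixel border untouched.
#include <algorithm>
#include <cstdio>
#include <vector>

std::vector<unsigned char> Sharpen(const std::vector<unsigned char>& src, int width, int height)
{
    std::vector<unsigned char> dst(src);
    for (int y = 1; y < height - 1; ++y)
        for (int x = 1; x < width - 1; ++x)
        {
            int i = y * width + x;
            int v = 5 * src[i] - src[i - 1] - src[i + 1] - src[i - width] - src[i + width];
            dst[i] = static_cast<unsigned char>(std::clamp(v, 0, 255)); // keep it in 8-bit range
        }
    return dst;
}

int main()
{
    // Hypothetical 8x8 test frame: a soft vertical edge that the kernel makes crisper.
    const int W = 8, H = 8;
    std::vector<unsigned char> frame(W * H, 64);
    for (int y = 0; y < H; ++y)
        for (int x = W / 2; x < W; ++x)
            frame[y * W + x] = 192;

    std::vector<unsigned char> out = Sharpen(frame, W, H);
    std::printf("edge before/after: %d %d -> %d %d\n",
                frame[3 * W + 3], frame[3 * W + 4], out[3 * W + 3], out[3 * W + 4]);
    return 0;
}
```

Real-time filter chains like ffdshow's stack steps of this sort (sharpening, resizing, deblocking) onto every decoded frame.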
You can read up on that here:
http://htpcnews.com/main.php?id=ffdshowdvd_1
You can read up on HTPCs a bit here:
http://www.avsforum.com/avs-vb/forumdis ... forumid=26