Are you kidding me? The 4850 will CRUSH that game at max settings and the highest resolution, and so will an 8800GT. After reading every post you made in this thread, you're obviously lacking hardware knowledge, no offense. This is a Blizzard game, after all.
Sadly you're incorrect, Nvidia still has a performance edge on ATI thanks to SLI. When it comes down to money and single-card configs, your best bet by far is to go with ATI's 4870 for $260 or a 9800GX2 for $310. Unless you can do SLI, in which case by all means pick up two 8800GT's for $220. I say this because lately there are many conflicting reviews that sway toward both ATI and Nvidia. For example, on Tom's Hardware Nvidia is crushing ATI, and on AnandTech ATI is crushing Nvidia - which is it? Good question; we won't really find out for a couple more months. But so far, after trading in the 8800 GTX I'd had since the card was released for an ATI 4870, I could not be happier.
Sorry pal, but you don't know what you're talking about. There is a little thing called Crossfire X where you can put two 4870 X2's together, though I'm not even going to bother going into that. The GeForce GTX280 in a card-to-card comparison does NOT stack up to the 4870 X2 at the ultra high end, which was what I was talking about to begin with. In SLi, two GTX280's outperform a single 4870 X2, as they should. However, dual 4870 X2's outperform two-card and match three-card SLi GTX280's in most games. As always, it boils down to scaling and drivers, which are always in a state of constant evolution.
HOWEVER! What I was talking about, before you decided to erroneously correct me, was that in a SINGLE card to SINGLE card comparison, the 4870 X2 wins by large margins at the ultra-high resolutions. And by ULTRA-high, I mean 2560x1600, AKA 30" monitors, where the GTX280's 1GB of RAM bottlenecks the card.
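To put a rough number on that bottleneck, here's a quick back-of-envelope sketch (assuming 32-bit color, a 32-bit depth/stencil buffer and a 4x multisampled target; real VRAM use varies a lot by engine and driver, and textures pile on top of this):

```python
# Rough framebuffer math at 2560x1600. Illustrative only; actual VRAM use
# depends heavily on the engine, the driver and the texture set.
WIDTH, HEIGHT = 2560, 1600
BYTES_PER_PIXEL = 4 + 4      # 32-bit color + 32-bit depth/stencil
MSAA_SAMPLES = 4             # 4x multisampled render target

pixels = WIDTH * HEIGHT
target_mb = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES / 1024**2
print(f"{pixels:,} pixels -> ~{target_mb:.0f} MB for one 4xAA render target")
# ~125 MB gone before textures, shadow maps and post-processing buffers,
# which is why 1 GB starts to feel tight at this resolution with AA on.
```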
The reason I made a single card to single card comparison is mainly because, who do you think here is going to drop 1300 dollars on Tri SLi? (Slightly more than that, but I'm going with easy-to-quote numbers.) Hell, people aren't even going to spend 1100 on dual 4870 X2's, because spending 550 on a single card is a bit of a stretch for the common computer user. If anything I should have expanded my post to include the 4870 and the 4850 and explained my recommendations for them, though my original argument was to suggest that Diablo 3 fans WAIT before investing in a graphics card until the game actually comes out. It's not a question of "can the card run Diablo 3 now or not" but "can it play all games in the future or not." As good as Diablo 3 may or may not be, people use their computers for more than just one game. They're going to pick up other games, they're going to want to play them with reasonable performance, and I want them to know that they don't have to run out and buy a card now when the game isn't even out yet. If they do, they're only going to get screwed in the long run, because something they're interested in will come out and, instead of waiting, they'll be stuck with poor graphics performance on a two-year-old card.
As for "Tom's Hardware has so-and-so leading and Anandtech has who-cares leading": you have to look at real-world tests, and you have to look and see what they are testing the cards on. 3DMark06 and 3DMark Vantage are cute little tests with big numbers to make overclockers feel special, and that's about it. They don't really prove anything, mainly because ATi and nVidia can fudge the numbers very easily with drivers. For the longest time ATi had higher 3DMark06 scores than nVidia while the 8800 GTX was mopping the floor with ATi's cards. Why is that? Because ATi wrote better 3DMark drivers. Hell, nVidia is doing it right now with their current drivers by having the card offload PhysX from the CPU onto the GPU during the physics test in Vantage. It drastically boosts their scores in Vantage. However, if you know how a game works, in a single-card configuration you don't have your GPU offload physics from the CPU and do the graphics processing at the same time, because it only slows down framerates and isn't very efficient. In SLi or Crossfire it works, and works very well, but in single slots it doesn't work very well. Is it cheating to boost scores? Probably, but since the test itself is a flawed representation of real-world performance, you take it with a grain of salt.
Also you have to look to see if they actually stress the cards. Big deal if you can get 100 FPS in Crysis if you don't have it completely maxed out with high amounts of AA and AF at a resolution of 1920x1200 or higher.
Also you have to look at the test bed. What processor was it paired with? What motherboard? RAM? Clock speeds? What good is the data if it isn't gathered on an appropriate test bed?
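Here's roughly what I mean, sketched out (the structure and numbers are made up for illustration, not any site's actual methodology): keep the test bed attached to the results, and summarize games with a geometric mean instead of leaning on a synthetic score.

```python
from math import prod

# Hypothetical benchmark record: the test bed travels with the numbers, and
# the summary is a geometric mean of per-game FPS (less distorted by a single
# outlier title than a plain average or a synthetic score).
run = {
    "test_bed": {                          # example components only
        "cpu": "Core 2 Quad @ 3.0 GHz",
        "motherboard": "P35 board",
        "ram": "4 GB DDR2-800",
        "driver": "Catalyst 8.8",
    },
    "fps": {"Crysis": 31.0, "CoD4": 88.0, "GRID": 74.0},   # placeholder numbers
}

geomean = prod(run["fps"].values()) ** (1 / len(run["fps"]))
print(f"Geometric mean: {geomean:.1f} fps across {len(run['fps'])} games")
```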
I do this shit for a living. I know what I'm talking about. If I say something is better than something else, I'm rarely wrong. Not because I magically know it is better, but because I make sure I'm not wrong. I gather information, I look for proof, I do tests, I interview people, and then I make my decision to recommend a product or not. However, I'm man enough to say "if I'm proved wrong, I'll take what I said back." The only thing is, what you said has nothing to do with what I said, and I think I deserve an apology for you claiming my information to be erroneous, especially since in my argument I said nothing about Crossfire or SLi and was only speaking of single-card configurations.
ATI rapes Nvidia. Use two 4870 X2's on a CrossFire motherboard and you'll rape an SLI motherboard with 9800GTX's. Plus Nvidia is horrible: they had an overpriced GPU at the time the 38xx series came out, so ATI had a lower price AND better performance. Nvidia had to cut the price 100 bucks just to get even with ATI, and the ATI card was still a better buy, so ATI rules.
fanbois fanbois fanbois.
Ati and nVidia each have their strengths and weaknesses.
If ATi is still doing this good a job around the D3 release, I will go with ATi again. I have an 8800GTS now that I bought a little over a year ago; it was probably the best choice back then, since ATi didn't have anything to compete with it. Nvidia's recent slacking has kinda made me favor ATi again. ATi has been improving a lot with the 3000 and 4000 generations, while Nvidia keeps making new model names with basically the same technology but slightly higher clock speeds.. ripoff.
A friend wants to get a new Laptop and must do so in the near future. He wants to make sure that his laptop can play Diablo III.
With his budget he narrowed his choices down to 2 laptops.
1: 512 MB dedicated graphics
2.0 GHz
3 GB memory
240 GB hard drive
Windows Vista
13.3" screen.
2: 256 MB shared graphics
3GB memory
2.2 GHz
Windows Vista
13.3" screen.
Which one do you guys think is more suitable? Oh, and the shared one says "up to 1GB" when I checked the graphics card (but I have no clue what that means :confused:)
It means up to 1 GB of the system RAM is shared between the system and the GPU. The other laptop's GPU has 512 MB of DEDICATED memory on the card itself, which is reserved for the GPU and not taken out of your 3 GB of RAM. I would get the one with dedicated graphics; it also has a fairly good HDD, which means a fair amount of space. And can you respond with the manufacturers and models of these 2 computers? I would like to look into them a little more, because if they have a PCIe expansion slot you can add something like the Magma box or the Asus XG Station. These use small PCIe connectors and (I don't know for sure) have a VGA or DVI port. The cable goes from the laptop into the Magma or Asus station, you open up the station and plug in your new graphics card, then plug in the power cord and your new monitor, and you'll be able to run at higher resolutions. The Asus XG Station is the better choice only because you can overclock the card right from the station with a little knob; the Magma box is a little trickier. You can only use a single-width card with the basic Magma box and must buy a different model to support double-width GPUs. These cost around 200-500 bucks and I think they're worth it... just an add-on for your gaming pleasures.
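Rough worst-case math on what "shared" costs you (this assumes the integrated chip actually grabs the full 1 GB; in practice the driver hands memory out dynamically):

```python
# Dedicated vs shared graphics memory, worst-case sketch with assumed figures.
SYSTEM_RAM_GB = 3.0
SHARED_GFX_GB = 1.0    # "up to 1 GB" borrowed from system RAM
DEDICATED_GB = 0.5     # 512 MB sitting next to the discrete laptop GPU

print("Shared-graphics laptop   :", SYSTEM_RAM_GB - SHARED_GFX_GB,
      "GB of RAM left for Windows and the game")
print("Dedicated-graphics laptop:", SYSTEM_RAM_GB,
      "GB of RAM left, plus", DEDICATED_GB, "GB of VRAM on the side")
```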
Edit: This is not the only reason I would like to know the models; I also want to see if they are decent, rugged computers that can meet your gamer needs... also I need to know your budget for all of this.
I don't usually stand up for 13-year-olds who use the word "rape", but he's right on most points, even though what he said barely stands as a full sentence. I wouldn't blame them, though, for keeping the 88xx and the 98xx overpriced while they could; it's called marketing. And they didn't lower the GTX280 price by $100, if that's what you mean, they lowered it by $250. (That got so many fanboys pissed off over at my gfx card forum - buy it at $650 only to have it lowered by $250 in 3 weeks' time!! ;))
I laughed long and hard over that... and then looked over and remembered I paid full price for my 9800 GTX only to have it be worth 200 a month later... lol, that's how it goes though. Anyone who's been in the computer game as long as I have knows that when you buy stuff right when it comes out, it's going to be inflated and cost considerably less down the road... though normally it isn't slashed in price a month after it comes out...
Agreed 100%, Varth. You know what, lately I've been noticing that many people bend conversation topics to fit their arguments. I find this annoying and I keep away from such people. (Sometimes it's just a misunderstanding, but you can't always tell.)
PS: Tom's is falling apart lately; it's getting more and more biased towards nVidia, and I don't like it. They didn't even use the 4870x2 beta drivers in their review, just some older drivers (the betas are a vast performance boost; the older ones don't support the 4870x2 properly). Oh, and with reviews, I'm sure he didn't mean just 3DMark. There are games optimized towards nV (Unreal Tournament, Company of Heroes), there are games optimized towards ATi (Half-Life 2), and all game benches favor the 4870x2 when forcing 16x AA at 1920x1200 :D. It always depends on the games they choose to benchmark and the CPU.
Review sites sometimes favor one vendor, and most of the time they're ignorant enough to do so unwittingly. I checked at least 15 review sites before making my choice. Can you believe that no site except this one http://www.cluboverclocker.com/reviews/video/sapphire/hd4870x2_2GB/page1.asp does heavy AA checks on even one title? Hell, most don't even push very high in Crysis.
What do they think? That people buying these cards won't consider cranking up AA? /shrug
I know what you mean. I try not to take it personally when people misquote me or argue over something I said.
And yeah, I've noticed too that Tom's Hardware is losing its credibility, which sucks. It makes it hard to find unbiased reviews now. I tend to go to Maximum PC for good reviews. They are completely unbiased, but of course, being primarily a magazine, they don't have as many reviews as Tom's. For what I can't find on there, I tend to just google it and fish through the BS till I get some good info. lol, and if I really wanna see bias, I can always go to The Inquirer and laugh at Charlie's nVidia and Vista rants.
Oh yeah, the AA thing is VERY annoying. If I see a review that doesn't have an AA test for each title, I just stop reading right there. I want to see a no-AA test at 1920x1200 and then a heavy-AA test at 1920x1200 so I can get a good idea of just how strong the card is at the extremes.
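Something like this is the grid I'd want every review to run (the titles are just examples, and it only lays out the runs - no invented scores):

```python
from itertools import product

# Hypothetical review matrix: every title gets a no-AA run and a heavy-AA run
# at 1920x1200, so you can see the card at both extremes.
titles = ["Crysis", "Call of Juarez", "Unreal Tournament 3"]        # examples
settings = [("1920x1200", "0xAA / 16xAF"), ("1920x1200", "8xAA / 16xAF")]

for game, (res, aa) in product(titles, settings):
    print(f"run benchmark: {game:22s} @ {res} {aa}")
```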
www.overclockersclub.com is great, really. When they're done benching a card and none of the staff needs it, they give it away too; recently they gave away an HD4850. They're a smaller community, but they always review the same games, some leaning towards ATi (Call of Juarez) and some towards nVidia (Crysis).
They're a small community, but they can also help you with any computer problem you have; I've been able to keep my computer going twice because of them.
Spent $429.00 on this 8800, so I don't predict I'll be buying a video card anytime soon. Blizzard should make the 8800/9800/HD4870 series the focal point for high settings in Diablo 3. It's better for Blizzard to focus on today's standards than the future's; many more people will have cards from these series than the latest rip-off at the time Diablo 3 releases. It probably won't take much at all to max out Diablo 3 anyway, knowing how easy Blizzard games are on system specs. It will probably look great and require relatively low specs because of good optimization.
Sorry pal, but you don't know what you're talking about. There is a little thing called Crossfire X where you can put two 4870 X2's together, though I'm not even going to bother going into that. The GeForce GTX280 in a card-to-card comparison does NOT stack up to the 4870 X2 at the ultra high end, which was what I was talking about to begin with. In SLi, two GTX280's outperform a single 4870 X2, as they should. However, dual 4870 X2's outperform two-card and match three-card SLi GTX280's in most games. As always, it boils down to scaling and drivers, which are always in a state of constant evolution.
REALLY, WHAT'S CROSSFIRE?!?! Durrrr. Alright... So why are you comparing a $400 card to a $600 card? Then you're comparing an $800 card (two 280's) to a $1200 card (two 4870 X2's) - what's your point? Oh, I see what your point is... Two 4870 X2's, yeah, a WASTE OF MONEY. Agreed, it does come down to drivers, but not in this case because your comparison is bogus; on the plus side, SLI drivers > Crossfire drivers. I find it absolutely hilarious that you try to act smart by comparing two completely different cards and proving that one is better. Really sir, shucks, I didn't know a DUAL GPU card was better than a single GPU card, gee thanks!!
HOWEVER! What I was talking about, before you decided to erroneously correct me, was that in a SINGLE card to SINGLE card comparison, the 4870 X2 wins by large margins at the ultra-high resolutions. And by ULTRA-high, I mean 2560x1600, AKA 30" monitors, where the GTX280's 1GB of RAM bottlenecks the card.
CORRECT, you're absolutely right, but the 4870 X2 is about $230 more expensive here than a 280 - quite the fair comparison? I don't think so. You're comparing a single card that has one GPU to another card that has two. You SERIOUSLY think the reason the X2 is superior to the 280GTX is a RAM bottleneck? I think you're forgetting those 1600 SHADERS. Again an unfair comparison, and again, pointless.
The reason I made a single card to single card comparison is mainly because, who do you think here is going to drop 1300 dollars on Tri SLi? (Slightly more than that, but I'm going with easy-to-quote numbers.) Hell, people aren't even going to spend 1100 on dual 4870 X2's, because spending 550 on a single card is a bit of a stretch for the common computer user. If anything I should have expanded my post to include the 4870 and the 4850 and explained my recommendations for them, though my original argument was to suggest that Diablo 3 fans WAIT before investing in a graphics card until the game actually comes out. It's not a question of "can the card run Diablo 3 now or not" but "can it play all games in the future or not." As good as Diablo 3 may or may not be, people use their computers for more than just one game. They're going to pick up other games, they're going to want to play them with reasonable performance, and I want them to know that they don't have to run out and buy a card now when the game isn't even out yet. If they do, they're only going to get screwed in the long run, because something they're interested in will come out and, instead of waiting, they'll be stuck with poor graphics performance on a two-year-old card.
Hope that wasn't directed at me because tl;dr.
As for "Tom's Hardware has so-and-so leading and Anandtech has who-cares leading": you have to look at real-world tests, and you have to look and see what they are testing the cards on. 3DMark06 and 3DMark Vantage are cute little tests with big numbers to make overclockers feel special, and that's about it. They don't really prove anything, mainly because ATi and nVidia can fudge the numbers very easily with drivers. For the longest time ATi had higher 3DMark06 scores than nVidia while the 8800 GTX was mopping the floor with ATi's cards. Why is that? Because ATi wrote better 3DMark drivers. Hell, nVidia is doing it right now with their current drivers by having the card offload PhysX from the CPU onto the GPU during the physics test in Vantage. It drastically boosts their scores in Vantage. However, if you know how a game works, in a single-card configuration you don't have your GPU offload physics from the CPU and do the graphics processing at the same time, because it only slows down framerates and isn't very efficient. In SLi or Crossfire it works, and works very well, but in single slots it doesn't work very well. Is it cheating to boost scores? Probably, but since the test itself is a flawed representation of real-world performance, you take it with a grain of salt.
Great... It's a shame that I look at gaming benchmarks and don't give a rat's ass about 3DMark garbage. I really hope you weren't directing that at me either, because for one, I don't care - I'm aware of how computer hardware functions - and two, it's irrelevant. Almost a tl;dr.
Also you have to look to see if they actually stress the cards. Big deal if you can get 100 FPS in Crysis if you don't have it completely maxed out with high amounts of AA and AF at a resolution of 1920x1200 or higher.
Oh, and why would you need high amounts of AA and AF at high resolutions? Here's a little tip: at 1920x1200, the most you'd need is 2x, because the image is already rendered at such a high resolution that the jagged edges that need smoothing are, well, not apparent. Anything higher than that doesn't need AA; you're just adding an extra layer of fail and slowness. Read up on it. How come Mr. "I do this shit for a living" didn't know that? Oh Snape.
Also you have to look at the test bed. What processor was it paired with? What motherboard? RAM? Clock speeds? What good is the data if it isn't gathered on an appropriate test bed?
Why would I waste my time on something so trivial? As long as both GPUs use the same components - which I check, minus the motherboards for SLI/Crossfire - I really don't care, because it won't make a difference. Either way, you won't get a single shred of extra frames on a $300 motherboard vs. a $150 motherboard. If for some reason it does make a difference, the difference will be TRIVIAL.
I do this shit for a living.
I do this for a hobby, it's easier to get by in life when you have a REAL job and do this sort of thing on the side. Working at a computer store doesn't make you smarticus.
I know what I'm talking about. If I say something is better than something else, I'm rarely wrong. Not because I magically know it is better, but because I make sure I'm not wrong. I gather information, I look for proof, I do tests, I interview people, and then I make my decision to recommend a product or not. However, I'm man enough to say "if I'm proved wrong, I'll take what I said back." The only thing is, what you said has nothing to do with what I said, and I think I deserve an apology for you claiming my information to be erroneous, especially since in my argument I said nothing about Crossfire or SLi and was only speaking of single-card configurations.
Are you on crack, kid? You're comparing a DUAL GPU card to a single GPU card - who the hell did you expect would take the crown, and you want an apology? You weren't talking about SLI or Crossfire, but I was: two 8800 GT's perform similarly to one 4870 X2. Two 8800 GT's cost $200 total; one 4870 X2 costs over $600. Not only that, I was saying you were incorrect in terms of one 4870 X2 vs. a 280GTX being double the performance at 1920x1200+; it clearly is not - it clearly does NOT get smoked, as you put it, and at $230 cheaper I would EXPECT it to get beaten, not smoked - which stands true. Seriously, don't you have a better comparison? You're just a fanboy trying to make Nvidia look bad. If those benchmarks look like DOUBLE to you, or even close to DOUBLE, you should get your eyes checked and/or go back to school. I'd also like an apology for having to read through all of your rubbish, please and thank you.
EDIT:
Quote from "Varth" »
I laughed long and hard over that... and then looked over and remembered I paid full price for my 9800 GTX only to have it be worth 200 a month later... lol, that's how it goes though. Anyone who's been in the computer game as long as I have knows that when you buy stuff right when it comes out, it's going to be inflated and cost considerably less down the road... though normally it isn't slashed in price a month after it comes out...
You bought a 9800GTX at full price right when it first came out? Even though the trend was obvious, right in your face, that the video card market was going to tumble? Surely someone with your experience and sheer cunning could have avoided such a mistake. Shame. I picked up my 4870 at quite a discount; see, I KNEW how the market worked, so a week BEFORE the 4870 was planned to be released, I sold my 8800GTX while prices were still inflated and pretty much got my 4870 for FREE with cash to spare.
This is fairly old news but some might not know it, so I'm going to post it here.
AMD has decided to bundle some of the upcoming Blizzard Games with some of their ATI Radeon graphics cards, and I think we can expect at least 2 of the 3 upcoming titles to be bundled with a suitable graphics card.
So if you were planning to upgrade your graphics card to be able to play Diablo III, you might just want to wait a bit and buy the bundle instead, because at least that way you'll be sure you have a capable card.
As I see it, I bet my Nvidia 7600 will run Diablo on high graphics. Blizzard always makes their games so they can run on the lowest-end machines possible, so I don't know why you guys even think of upgrading graphics cards. No need, especially for D3.
Creatures don't even stay on the ground after you kill them now either, according to Blizzard. Which is odd, because in Titan Quest all the creatures you killed stayed on the ground for the whole duration of the game with no slowdowns, and so did the items. :rolleyes:
I think YOU are the one on crack, no offense, but the 8800GT is ~10% slower than the 4850 in the best-case scenario - how could two of them outperform the 4870x2, especially when SLI scaling is slightly worse on average than Crossfire scaling? Or are we talking nV-optimised games, old drivers where the X2's Crossfire doesn't scale at all, 1280x1024, no AA, no AF?
I said similar, NOT better. The difference is within 10-40%, well justified by the price. Is the 4870 X2 a better card? OF COURSE, but obviously a lot of people don't have $600+ to dish out on one card.
About AA not being needed at high resolutions:
I will get a 1920x1200 screen, and I will use 16xAF and at least 4xAA. 1920x1200 screens start at 24 inches, which means that you WILL see rough edges. Heck, I was annoyed by rough edges on my 19" CRT (=18" effective) @ 1600x1200.
Scroll up, check the overclockersclub link I gave, click the Crysis benchmark, check it out. There are screenshots. Depending on the size of your screen, you might get an almost 1:1 ratio with a 24" @ 1920x1200, like I do with my 15" at 1280x800. And believe me, the edges are very noticeable.
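The pixel-density numbers back that up; here's the quick calculation (plain screen geometry, the sizes are just examples):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and screen diagonal."""
    return hypot(width_px, height_px) / diagonal_in

screens = [
    ('19" CRT @ 1600x1200 (18" viewable)', 1600, 1200, 18.0),
    ('24" LCD @ 1920x1200',                1920, 1200, 24.0),
    ('30" LCD @ 2560x1600',                2560, 1600, 30.0),
]
for label, w, h, d in screens:
    print(f"{label:38s} ~{ppi(w, h, d):.0f} PPI")
# All of these land in roughly the 94-111 PPI range, so the pixels (and the
# jaggies) are about the same physical size; a higher resolution alone
# doesn't make AA pointless.
```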
I have a 24/22 inch screen, and let me tell you, anything past 4xAA is POINTLESS. Feel free to bog down your frames to satisfy your ego; I really don't care how you waste your time. 1600x1200 will need A LOT more AA, at least 8x, to remove the majority of jagged edges. If you think you're doing well, it's all in your head. Why are you telling me all this garbage about how annoyed you were with your low-grade monitors? I don't care. Turn up your AA and quit crying; low resolutions like that need VERY high amounts of AA.
Comparing a dual-GPU card to a single-GPU card is pretty normal. It's fine. Both cores are on one board. It's ONE card. When you buy it, both cores come on one PCB, inside one package.
No, comparing a dual card to a single card is a waste of time because they aren't in the same category. I'm aware of how the X2 is designed, thanks for the useless post. Though I can see how it's relevant for you, since you're from Europe and the prices there are so screwed up, not to mention overpriced.
You don't need to buy two things. ONE card. It's like saying a 9800GTX can't be compared directly to the 9800GT because the GT has 16 fewer shader processors.
Are you INSANE? You're comparing something insignificant to the major component that makes the X2 what it is. A fair comparison would be the 280GTX's version of an SLI card.
And it's at $550, not $600, with the GTX280 at $430 (Newegg prices) => a $120 difference for ~30% up to 80% better performance. It's worth the extra money, since the price/perf ratio is even slightly better on the 4870x2. Worst-case scenario, one core doesn't scale (lol, not going to happen) and you still have a 4870 with 1GB on the go (~6-7% less perf than a GTX280). If you are interested in a review with one of the 4870x2's cores disabled, I can find one later.
In my country, the GTX280 is just 30 euros less. Not worth it, period.
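To spell that math out (US Newegg prices as above; the 30-80% uplift is the spread I saw across reviews, so treat it as rough):

```python
# Price/performance sanity check with the figures quoted above:
# 4870 X2 at $550, GTX 280 at $430, and a claimed 30-80% performance lead.
X2_PRICE, GTX280_PRICE = 550.0, 430.0

for uplift in (0.30, 0.80):
    price_ratio = X2_PRICE / GTX280_PRICE    # how much more you pay (~1.28x)
    perf_ratio = 1.0 + uplift                # how much more you get
    value = perf_ratio / price_ratio         # >1.0 = better perf per dollar
    print(f"+{uplift:.0%} faster: pay {price_ratio:.2f}x, get {perf_ratio:.2f}x "
          f"-> {value:.2f}x the perf per dollar")
# Even at the low end the X2 roughly breaks even on perf per dollar; at the
# high end it's clearly ahead - at US prices, not in the 30-euro-gap case.
```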
You assume I'm from the States; in Canada the 280GTX is about $410, while the X2 is sitting at around $620 - prices are still inflated on the X2. Meanwhile I can buy two 8800 GT's for $180, or even three for $250. Obviously not the performance of an X2, but close, and a hell of a lot cheaper. I'm sure we can all agree that the 280GTX is a ripoff, but there's no point comparing it to something that is in another league.
Wow, you're being a complete ass. How about you show some respect in your argument instead of insulting me at every turn?
First off, I don't work at a computer store. I'm a graphic artist and cartoonist and I have a computer business on the side.
I live in the US, where computer components aren't nearly as inflated as in Canada or Europe, so ALL my price comparisons are based on US prices and are meant for US consumers.
As for the cheap shot about my 9800 GTX, I only "technically" bought it at full price, because I used the EVGA step-up program to get it from an 8800 GTS 320MB that I had purchased for 200 dollars at the time for a budget build (marked down from the original 300 at Newegg). When the 9800 GTX came out at a then-unbelievable price of approximately 300 dollars, it was a no-brainer to go for it, considering the 8800 GTX was STILL selling for over 400 dollars and the 9800GX2 was going to retail for over 600 dollars. You still think that was a bad buy, now that you actually know the backstory behind it?
HERE WE GO AGAIN! The common complaint of "you can't compare a dual-GPU card to a single-GPU card." Here's the deal: you've got two different designs. You have the massive monolithic GPU design and you have the dual-GPU design. One uses brute force, the other uses two GPUs in tandem to perform Alternate Frame Rendering to, in theory, double performance (which isn't truly the case, thanks to drivers). This is the same BS excuse people made when Intel was stuck with single-core P4's and AMD had the Athlon X2: "You can't compare performance because it's got an extra processor." Well, I'm sorry, but you can. The GTX280 is positioned as the "high-end Nvidia product" and the 4870 X2 is positioned as the "high-end ATI product," so OF COURSE they are going to be compared, because they are supposed to be in the same "high end" price bracket. They are both dual-slot cards designed to use ONE PCI-Express slot to interface with a computer motherboard. One uses one design and the other uses another, so you can't complain that one design works better than the other; you can only legitimately complain about the price (which everyone, myself included, does).
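If you want to see what I mean about AFR scaling living and dying by the drivers, here's a toy model (my own illustration, not how any real driver is written): even frames go to one GPU, odd frames to the other, and an efficiency knob stands in for why you rarely see the full 2x.

```python
# Toy Alternate Frame Rendering (AFR) model. "efficiency" is a made-up factor
# standing in for driver overhead, inter-frame dependencies, CPU limits, etc.
def afr_fps(single_gpu_fps: float, num_gpus: int, efficiency: float) -> float:
    """Rough effective frame rate for an AFR multi-GPU setup."""
    ideal = single_gpu_fps * num_gpus                     # perfect scaling
    return single_gpu_fps + (ideal - single_gpu_fps) * efficiency

for eff in (1.0, 0.8, 0.5, 0.0):
    print(f"scaling efficiency {eff:.0%}: {afr_fps(40.0, 2, eff):.0f} fps")
# 100% -> 80 fps, 80% -> 72, 50% -> 60, broken profile -> 40 (no gain at all)
```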
People made the same BS remarks about the prototype dual 6800, the flawed 7950GX2, the underperforming 1950X2 and 2600X2, the 3870X2, and the 9800GX2. Now comes the 4870X2. The only difference is there's more complaining and moaning, because for the FIRST time the dual-GPU design is actually paying off, whereas before, the performance gain over a single GPU wasn't as spectacular as promised.
BELIEVE ME, if nVidia could make a dual GTX280 card, they would in a heartbeat. However, their flawed, oversized GPU produces too much heat and has yields that are too low for them to even attempt it. Even when they switch the card to 55nm manufacturing, it'll still be too hot and too power-hungry for a dual-GPU card.
ATi processors are vastly different from nVidia processors. The most basic way I can explain this is that, shader for shader, ATi's stream processors are roughly less than half as powerful as the same number of nVidia shader processors. Why is that? Because they are architecturally different from each other. If you want to know the details, go google it, but there is a reason why the 800 shaders of the 4870 do not outperform the 240 shaders of the GTX280 at resolutions that eliminate the extra-RAM advantage of the GTX280.
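The commonly quoted paper specs make the point, roughly (these are the usual published figures and peak FLOPS is a very crude proxy for game performance, so take it as ballpark only):

```python
def peak_gflops(alus: int, clock_ghz: float, flops_per_alu_per_clock: int) -> float:
    """Theoretical peak shader throughput in GFLOPS."""
    return alus * clock_ghz * flops_per_alu_per_clock

# Commonly quoted figures (approximate):
#   HD 4870: 800 stream processors @ 0.75 GHz, 2 FLOPs/clock (multiply-add)
#   GTX 280: 240 stream processors @ 1.296 GHz shader clock, 3 FLOPs/clock
print("HD 4870:", round(peak_gflops(800, 0.75, 2)), "GFLOPS")   # ~1200
print("GTX 280:", round(peak_gflops(240, 1.296, 3)), "GFLOPS")  # ~933
# The 4870's bigger paper number comes from many simple ALUs that are hard to
# keep fully busy, which is why, shader for shader, they look weaker in games.
```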
As for "AA being useless at those resolutions," I don't know what monitor you're using, but I'm working with the good stuff here. I'm not talking about your standard 6-bit TN monitors, I'm talking about 8-bit S-IPS monitors. Again, go google it and research it, because I'm not going to waste my time going into depth. I'll just say this: it displays a vastly superior picture.
Saying a 350-dollar board doesn't perform any better than a 150-dollar board is bullshit. Wait, I'll amend that: saying a 350-dollar board performs no better than a 250-dollar board is valid, but lumping either of those in with a "budget" 150-dollar board has no bearing. You've probably never even researched this, because there is a decent difference between the two. Granted, I'll never buy a 350-dollar motherboard, but the high-end boards perform about 10% or more better than a cheaper board in MORE than just framerates. They provide more stable overclocks, they have better RAID performance, they have better communication with RAM and graphics/PCI slots. In about all areas, the more expensive boards tend to deliver about 10% better performance across a large range of parts. Hell, just look at Skulltrail. For 650 dollars, you can have a beast of a board that LITERALLY leaves every other high-end board in the dust.
Myself, I'm a budget buyer. I don't need the highest frames on the block. I laugh at the people who spend 8 grand on a PC when my computer can still perform to high expectations in the same games and applications. But to ignorantly say that extra money doesn't bring any performance benefits - well sir, you're just not doing your homework.
Dual-GPU boards are the wave of the future. I assure you nVidia will do their damnedest to get an efficient dual-GPU board out there as fast as they can. I'm no fanboy for ATi. I've gone through the MX440, the FX5600, the 6800, 7800 GS, 8600, 8800 GTS, and the 9800 GTX. The only ATi card I ever owned was the 2600, and it sucked, so I returned it. The only card I ever regretted was the FX5600, because guess what? I bought the nVidia hype behind it. I ate their shit when I should have gotten a Radeon 9800 Pro or better. After that I learned to be objective and ignore hype. So I look for the best deals out there, and I also educate myself on what the best performer is, price tag aside.
Right now, the 4870X2 has a price tag of 550 in America. If it's too expensive in Canada, too bad; you'll have to wait for the price to drop just like everyone here does. The 4870X2 is almost the exact price of two 4870's in the US, and honestly, with the performance it delivers, it has earned that price (to a degree). If nVidia's GTX280 were still king, it would demand an exorbitant price as well. It's how it goes. And no, two 8800GT's in SLI do NOT perform the same as a 4870 X2... I don't know where you read that, but that is not even close...
If you direct your attention to the right of your charts, you'll notice that the numbers for 30" monitors are available, where a single 4870X2 is beating a pair of GTX280's in SLi. And if you look at 1920x1200, the 4870X2 Crossfire isn't working. Everyone knows that the drivers for the 4870X2 are not mature, especially in Crysis. Just like with nVidia cards earlier, Crysis does not like Crossfire, but give it a few more months and you'll see improvements for both cards in SLi and Crossfire across the board.
Lastly, if you want to argue, be respectful. I'm not even going to dignify you with a response if you continue being immature in the way you present your argument here.
Has anyone ever seen the Cow King's dance? Well, I have, and it's right here! http://youtube.com/watch?v=Cd2wyKQm13I
Oh, and do you wanna see the new cows in D3? Here they are:
http://youtube.com/watch?v=TpMAqQ_pc0I
/l、
(゚、 。 7
l、 ~ヽ ------CAT IM A KITTY CAT AND I DANCE DANCE DANCE
..じしf_, )ノ AND I DANCE DANCE DANCE!!!
Come, stay awhile and... it's time for shedding our enemies' blood, not idle talk. Aww, no one ever listens... lol