Hey all, I'm pretty tech savvy myself, but I'd like a few opinions on some things. This summer I'll be getting a job and graduating from high school, so I'll have a nice sum of money to throw around. I've settled on a solid build, but two things are still uncertain.
First, the GPUs. I've got a 6950 currently flashed with a 6970 BIOS, but it won't be able to run an Eyefinity setup, which is my goal. So my question is: do I buy a 7970 or a GTX 680 and run a 3x1 Eyefinity setup, or do I get another 6950, lose the ability to CrossFire with better cards in the future, but save some dough? I'm not keen on buying two more 6950s, because I've heard that once you hit three GPUs a lot of problems arise and it's just not worth the hassle. If someone can argue against this point well enough, it might be the better route to take.
Second and lastly, the monitors. I want the Dell U2312 monitors because they're IPS with beautiful colors, a nice screen size, and a 16:9 aspect ratio, but I've heard that the Samsung S23A750D is even better. I've seen the U2312 and I know it's beautiful, but I've never seen the Samsung, so if any of you have experience with it, I could use some help.
Not even Death will save you from Diablo Bunny's Cuteness!
The cost of three Dell U2312s puts you very close to one Dell U2711. Just personal opinion, but I enjoy one large screen much more than several smaller ones separated by 1-inch-thick bezels. Plus, Diablo 3 (like most modern games) is natively 16:9, so multi-monitor setups of any kind have issues.
You don't need to upgrade your GPU for Diablo 3 even on a 2560x1440 monitor like the U2711, but it might help if you insist on the triple-screen approach. For other games, a 7970 or a 680 (or two of either, or a 690) will be needed if you want to stay close to max details at triple-screen resolutions.
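For a rough sense of the GPU load involved, here is a quick pixel-count comparison at the panels' native resolutions. This is only a sketch: pixel count is a proxy for rendering load, and the real numbers also depend on the game and detail settings.

```python
# Rough pixel-count comparison: triple 1080p Eyefinity vs. a single 1440p panel.
# Pixel count is only a proxy for GPU load; settings and the game matter too.

def pixels(width: int, height: int, screens: int = 1) -> int:
    """Total pixels rendered per frame across all screens."""
    return width * height * screens

triple_1080p = pixels(1920, 1080, screens=3)  # three U2312-class panels
single_1440p = pixels(2560, 1440)             # one U2711

print(triple_1080p)                           # 6220800
print(single_1440p)                           # 3686400
print(round(triple_1080p / single_1440p, 2))  # 1.69
```

So a 3x1 Eyefinity wall pushes roughly 1.7x the pixels of a single U2711, which is why the triple-screen route calls for the stronger (or doubled-up) GPU.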
True, but I heard the Samsung is better quality, runs at 120 Hz, and has thinner bezels.
By the way, do you think an 1100T will bottleneck a 6950 CrossFire setup or a 7970?
True, but I heard the Samsung is better quality, runs at 120 Hz, and has thinner bezels.
You'll need dual 690s if you want to hit 120 FPS on three screens.
If you don't mind my asking, what do you do for a living?
I'm a student who absolutely loves gaming. I also enjoy photo editing, so the e-IPS panel is a big plus on that side. And that's not true: I saw a video of a guy hitting ~120 FPS on three Samsungs with two 7970s, although he wasn't completely maxed out.
True, but I heard the Samsung is better quality, runs at 120 Hz, and has thinner bezels.
By the way, do you think an 1100T will bottleneck a 6950 CrossFire setup or a 7970?
I could imagine it, at least. Bulldozer is crap.
Well, there goes trying to get a cheaper build. Guess I'll be sticking with Ivy Bridge or Sandy Bridge.
Well, there goes trying to get a cheaper build. Guess I'll be sticking with Ivy Bridge or Sandy Bridge.
A cheaper build kind of went out the window when you started talking about triple monitors and multiple GPUs.
Bulldozer is okay for applications that can saturate all the cores, but isn't competitive otherwise. Sandy Bridge is now outdated, but can be overclocked a bit further than Ivy Bridge. Ivy Bridge is extremely power-efficient, and is definitely the way to go if you're not overclocking much. The characteristics of modern games give smaller numbers of very fast cores an advantage, so even Intel's "budget" CPUs can outperform Bulldozer on a performance-per-dollar level.
My personal Ivy Bridge + GTX 680 system (built just 5 days ago) idles at only 48 watts, peaks at about 300 watts, and is the quietest system I've ever built, thanks to the minimal cooling needs of this low-power gear. Two months ago, such a system would have been impossible.
Well, I guess you're right. I've always been more of an AMD guy; it's just that I might want to do 3D at some point (if not, it doesn't matter), and AMD is offering cheaper solutions. GRR, THIS GETS ME SO ANGRY, I DON'T KNOW WHAT TO DO!
The AMD FX-8150 sells for around $240, while the Intel i5-3550 sells for $220 and is faster. AMD may advertise its chip as having 8 cores, but in reality it's 4 fully-featured cores with some duplication to make 8. One thing that isn't duplicated is the floating-point hardware, which shows up in 3D rendering benchmarks like Cinebench: watch the "8-core" 8150 get knocked around by Intel's 4-core i5-2500K (Sandy Bridge) here: http://www.anandtech...duct/434?vs=288 .
I don't consider myself an Intel guy or an AMD guy; I buy whatever is best for my money. Back in the Pentium 4 / Athlon 64 days, I bought an Athlon 64 because Intel was honestly embarrassing at the time. Starting with the Core 2 series, Intel has been superior, and it seems that AMD isn't even competitive today :|
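The "best for my money" idea can be made concrete as a performance-per-dollar calculation. The prices below are the ones quoted above; the benchmark scores are placeholder values purely for illustration, not real Cinebench results:

```python
# Performance-per-dollar sketch. Prices are from the discussion above;
# the scores are PLACEHOLDER values for illustration, not real benchmarks.

def perf_per_dollar(score: float, price: float) -> float:
    """Benchmark points bought per dollar spent."""
    return score / price

chips = {
    "AMD FX-8150":   {"price": 240, "score": 100},  # placeholder score
    "Intel i5-3550": {"price": 220, "score": 115},  # placeholder score
}

for name, c in chips.items():
    print(f"{name}: {perf_per_dollar(c['score'], c['price']):.3f} points/$")
```

Since the i5 is both cheaper and faster, it wins on this metric no matter what the exact scores are, as long as its score is at least equal.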
I agree; they're just looking to be the middle guy for the cheaper stuff. The Phenom II X4 965 isn't bad, but it's outperformed, and it comes in at a relatively low price point. I'm still on a Core 2 Duo at 3.33 GHz. I'd like to upgrade to a 2700K or a 3770K; the Hyper-Threading entices me because I might do a video stream. If I ever get into that stuff, I'd have the processing power to do so with plenty left to spare for the game, but if I get tight on budget I might have to settle for a 2500K or 3570K. I was also thinking: why not just future-proof the system by going with socket 2011? Get a 3820, which is 4 cores Hyper-Threaded to 8 and performs at or above the 2700K. I can add more RAM, which is good for photo editing, and it has a lot more PCIe lanes; I remember Z68 could only support something like two cards at x8 each. Hell, even going with LGA 1366 would be better, it would still never bottleneck the GPU, and those prices are probably dropping rapidly thanks to Ivy Bridge.
I was also thinking: why not just future-proof the system by going with socket 2011? Get a 3820, which is 4 cores Hyper-Threaded to 8 and performs at or above the 2700K. I can add more RAM, which is good for photo editing, and it has a lot more PCIe lanes.
There's a $700 price difference between these two.
The additional PCIe lanes do nothing if you're not adding hardware to use them; Ivy Bridge still gets 16 PCIe 3.0 lanes, which is enough for two GTX 690s.
The 3960X has twice the heat output, so keeping your system noise down will be that much harder.
Motherboards are much more expensive.
Ivy Bridge allows up to 32 GB of RAM, and you haven't listed a single application that would use more than 8.
LGA 2011 is not future-proof because "Haswell" uses a new socket.
PHOTOSHOP, that thing EATS MEMORY. I've worked on a few projects where I couldn't add any more brush strokes due to a lack of RAM. It gets annoying when you have a lot of brush strokes simulating fire, or what have you.
I guess you're right on all the other points, though. What about LGA 1366 as a cheaper build, or is Sandy Bridge beating it there already?
PHOTOSHOP, that thing EATS MEMORY. I've worked on a few projects where I couldn't add any more brush strokes due to a lack of RAM. It gets annoying when you have a lot of brush strokes simulating fire, or what have you.
Okay, well, go with Sandy/Ivy Bridge and start with 2x8 GB of RAM; if you need more, max it out with another 2x8. If memory is critically important, you can go with a Xeon or Opteron system with over 100 GB of RAM, but gaming performance suffers on those platforms due to reduced single-thread performance.
I guess you're right on all the other points, though. What about LGA 1366 as a cheaper build, or is Sandy Bridge beating it there already?
LGA 1366 has the same problems as LGA 2011, with less speed. I haven't seen very aggressive discounting; Newegg is still selling the 6-core 1366 CPU for $1030, which is crazy. Sandy Bridge has seen a few price cuts, though.
I used to have 8 GB of DDR2 on an old Core 2 Quad, and that wasn't enough memory, so I'm starting out with 16 GB; it should be plenty for my needs. I've been looking at Newegg as well, and I'm surprised they aren't discounting the 1366 CPUs. They're more than two years old at this point; you'd think they'd drop at least a little. So I guess I'm going with a 3570K. This is the current build. I decided to leave the second and third monitors till later, and I won't Eyefinity them; I'll most likely just use them for guides and web browsing while playing games like LoL and whatnot.
1000 watt PSU? Are you going to use dual video cards?
What GPU are you getting?
"Is God willing to prevent evil, but not able? Then he is not omnipotent. Is he able, but not willing? Then he is malevolent. Is he both able and willing? Then whence cometh evil? Is he neither able nor willing? Then why call him God?"
That case is gigantic and doesn't seem to have any sound-suppression features. You're not buying enough equipment to fill it out, and there are many quieter options.
How are you calculating that you need such a big power supply? By my math, you're looking at 600 watts under load (at most), even with a 6990 or dual 6950s. I'd go with a 750-watt unit (or so) with higher efficiency, like this: http://www.newegg.co...N82E16817151087
Get any Z77 board except that one. The "thermal armor" would be more appropriately called "thermal insulator", as it causes your board to run hotter unless you use the included 40mm fan, in which case it's louder.
The H100 cooler is only going to buy you about 100 MHz more overclocking than a single-fan design like the H80. If you're not overclocking, there's no need to water cool as Ivy Bridge has relatively little heat output at stock clocks.
A 6950 under full load pulls 280 watts; times two, that's 560 watts.
A 6990 is going to pull about 440 watts under full load, 500 if OC'd.
Intel Core i5-3570K @ 3.4 GHz = 160 watts under full load
OC'd @ 5 GHz = 270 watts under full load
So 750 watts would be pushing the minimum, IMO.
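Those figures can be sanity-checked with a quick sum. The GPU and CPU numbers below are the ones quoted above; the allowance for drives, fans, and the motherboard is an assumed round figure, and the 80% sustained-load rule is a common rule of thumb, not a hard spec:

```python
# Back-of-the-envelope PSU sizing from the load figures quoted above.
# Keeping sustained draw at or under ~80% of the PSU rating leaves
# headroom and keeps the unit near its efficiency sweet spot.

loads_watts = {
    "2x HD 6950 (CrossFire)": 560,  # 280 W each, per the estimate above
    "i5-3570K OC'd to 5 GHz": 270,
    "drives, fans, board":     70,  # assumed allowance for everything else
}

total = sum(loads_watts.values())
recommended = total / 0.8           # apply the 80% rule of thumb

print(total)              # 900
print(round(recommended)) # 1125
```

By these worst-case numbers, a 1000 W unit isn't unreasonable; with more efficient components drawing far less, as measured in the reply below, a 750 W unit leaves plenty of headroom.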
Intel Core i5-3570K @ 3.4 GHz = 160 watts under full load
My own 3770K system uses about 130 watts from the wall at stock clocks under Prime95, but I'm cheating a little: I'm using a gold-efficiency power supply (assuming your source used a lesser one), 1.35 V RAM (most people use 1.5 V or more), and a GTX 680 (28 nm GPUs seem to idle very efficiently).
I'm not really disputing your numbers, but I want to point out that a system with high-efficiency components doesn't need that much power.
Ivy Bridge might be able to hold 5 GHz long enough to finish a benchmark, but I don't think it could hold that speed for much longer, even with an H100.
Something to think about: a system that's expected to pull 700+ watts for extended periods of time is going to be loud. I count at least 11 fans in this build (5 on the case, 2 on the H100, at least 2 on the dual GPUs, one in the power supply, and one on the motherboard to offset the insulation from the "thermal armor"), and they're all going to have to spin up to evacuate that much heat.
Current build (Newegg):
Antec DF-85 Black Steel/Plastic ATX Full Tower Case (Item #N82E16811129087) — $139.99
Dell UltraSharp U2312HM 23" IPS LED Monitor (Item #N82E16824260055) — $249.99
StarTech 10 ft Mini DisplayPort to DisplayPort Cables, x2 (Item #N82E16815158233) — $19.98 ($9.99 each)
Antec CP-1000 1000 W 80 PLUS Modular PSU (Item #N82E16817371036) — $149.99
Corsair Vengeance 16 GB (4 x 4 GB) DDR3-1600 CMZ16GX3M4X1600C9G (Item #N82E16820233247) — $119.99
ASUS Sabertooth Z77 LGA 1155 Motherboard (Item #N82E16813131821) — $239.99
Intel Core i5-3570K Ivy Bridge 3.4 GHz Quad-Core (Item #N82E16819116504) — $249.99
Corsair H100 Liquid CPU Cooler (Item #N82E16835181017) — $114.99
Subtotal: $1,284.91