Budget 41-50k Gaming PC

Thanks @jhononweed for the details. I am mostly an FPS / RTS guy, so a lower-cost AMD solution seems to be the right bet. I'm not too worried about power consumption, and I will be using aftermarket cooling, so that's that.

If I choose to go with the AMD rig you've suggested, will my Corsair 450VX be able to handle the load? I am currently looking at the possibility of keeping my 4850 and 450VX, and then opting for an R9 later, when prices settle... your opinion?

Yup, I would say save up and go for the Seasonic S12II 620W over any 500-550W PSU, as that would give you better headroom for dual cards with an overclocked beasty FX.
Power consumption is not that much higher on the FX, as every chip is different from the next... but 20~25W more than Intel's at best.
Use your current PSU + GPU and get an FX 8320/8350 with an Asus M5A99FX Pro R2 mobo and a Corsair H100i... that would be the best bang for the buck for your usage. AMD R9 280X - $299 (over Rs. 30k here) and R9 290 - $450~500 (Rs. 40k+ here). The R9 290X will be a limited edition priced at $599. Just a guess, but mostly ±2-3k Rs on prices.
 
Power consumption is not that much higher on the FX, as every chip is different from the next... but 20~25W more than Intel's at best.
Not sure what you mean by "at best", but the maximum power consumption difference between the i7 4770K and FX 8350 comes to 90 watts, and happens when all threads/cores are loaded. The 4670K, 3770K and 2550K all draw less power than the 4770K - http://www.xbitlabs.com/articles/cpu/display/core-i5-4670k-4670-4570-4430_7.html#sect0 - and, depending on the importance of single-threaded performance, perform better or worse than the FX 8350.
 
Haswell runs too hot and needs good cooling solutions
Yes. But I am seeing people going overboard on Haswell cooling, thinking that it will help overclocking. Spending more than 2-3k on cooling Haswell won't improve overclocking potential significantly. Why do I say that?

The delidders have reduced temperatures by more than 12 degrees Celsius. That is enormous. Without delidding, getting temperatures that much lower than with a 2-3k air cooler would need a 50k cooling solution. And yet, the delidders have been unable to overclock the processor much further than people with a Hyper 212 EVO and no delidding. 4.5GHz is where overclocking mostly stops for most people - whether temperatures are 70-77C for delidded processors or 90C for un-delidded processors doesn't affect overclocking that much.

So while Haswell runs hot, spending a lot on cooling it is not going to help.
 
Not sure what you mean by "at best", but the maximum power consumption difference between the i7 4770K and FX 8350 comes to 90 watts, and happens when all threads/cores are loaded. The 4670K, 3770K and 2550K all draw less power than the 4770K - http://www.xbitlabs.com/articles/cpu/display/core-i5-4670k-4670-4570-4430_7.html#sect0 - and, depending on the importance of single-threaded performance, perform better or worse than the FX 8350.
And I can pull a dozen other reviews where the max difference in power consumption between the i7 and FX 8350 is ~25-30W.
Everybody forgets Intel's power usage when it's overclocked, but hey, not every chip is the same... ;)
 
And I can pull a dozen other reviews where the max difference in power consumption between the i7 and FX 8350 is ~25-30W.
Everybody forgets Intel's power usage when it's overclocked, but hey, not every chip is the same... ;)

OK then, quote the reviews. Your "at best" appeared, prima facie, to mean the maximum power difference. If you meant "minimum" by "at best", do clarify.

Though if it meant the minimum power difference, your point is enormously weakened.

Edit: all chips are different, but the existence of one pair of 8350 and 4770K chips where the power difference is 90 watts proves that 20-25W cannot be the maximum difference across all pairs of 8350 and 4770K chips.
 
And I can pull a dozen other reviews where the max difference in power consumption between the i7 and FX 8350 is ~25-30W.
Everybody forgets Intel's power usage when it's overclocked, but hey, not every chip is the same... ;)
No need for dozens, one is enough.

In the meantime, here's one more review that doesn't show a 25~30W difference: http://www.bit-tech.net/hardware/2013/06/12/intel-core-i5-4670k-haswell-cpu-review/6 . Also, for an 800MHz overclock the AMD uses ~50W more, while a 1.2GHz overclock on the Intel uses ~34W more.
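To put those two overclocks on a comparable footing, here is a quick sketch normalising each power delta per 100MHz of clock gained. The wattage and clock figures are the review numbers quoted above; the per-100MHz framing itself is just my rough way of comparing them, not something either review computes.

```python
# Normalise the overclock power deltas quoted above:
# ~50W extra for an 800MHz overclock on the FX,
# ~34W extra for a 1.2GHz (1200MHz) overclock on the i5.
def watts_per_100mhz(extra_watts, oc_mhz):
    """Extra load power drawn per 100MHz of overclock gained."""
    return extra_watts / oc_mhz * 100

amd = watts_per_100mhz(50, 800)     # 6.25 W per 100 MHz
intel = watts_per_100mhz(34, 1200)  # ~2.83 W per 100 MHz
print(amd, intel)
```

By this rough measure the FX pays more than twice the power for each step of clock speed gained, which is the point being made about overclocked power draw.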

You forget that the FX 8xxx chips have trouble staying within their TDP limits even at stock clocks. That's the reason all those MSI boards with VRMs not designed with any overhead were burning out.
 
No need for dozens, one is enough.

In the meantime, here's one more review that doesn't show a 25~30W difference: http://www.bit-tech.net/hardware/2013/06/12/intel-core-i5-4670k-haswell-cpu-review/6 . Also, for an 800MHz overclock the AMD uses ~50W more, while a 1.2GHz overclock on the Intel uses ~34W more.

So 16W more... oh god, that's way too much consumption if you consider the price per dime of the i5. :wtf:

You forget that the FX 8xxx chips have trouble staying within their TDP limits even at stock clocks. That's the reason all those MSI boards with VRMs not designed with any overhead were burning out.

Never knew that; maybe Asus/Gigabyte knew it.

Here it comes >> http://www.legitreviews.com/amd-fx-8350-8-core-black-edition-processor-review_2055/13
[Image: system-power.jpg]


Consumption data is against the i7 3770K under load; take the i7 4770K and the situation improves for the FX.
Still, I can't find a 90+W difference either way. :)
You do know that Bit-tech reviews are biased towards Intel and they use their own benchmarking software.
 
Here it comes >> http://www.legitreviews.com/amd-fx-8350-8-core-black-edition-processor-review_2055/13
[Image: system-power.jpg]

Consumption data is against the i7 3770K under load; take the i7 4770K and the situation improves for the FX.
Still, I can't find a 90+W difference either way. :)
You do know that Bit-tech reviews are biased towards Intel and they use their own benchmarking software.

OK, a 57W load difference between the 3770K and 8350. From the xbitlabs review, the difference between the 3770K and 4770K is 10W. So the hypothetical difference between the 4770K and 8350 is around 47W, for the chips and test setup of Legit Reviews.
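The chain of deltas used there can be written out explicitly. A small sketch using only the two review figures just cited, and assuming the deltas roughly carry across the two sites' different test rigs (a rough assumption, since CPU samples and setups differ):

```python
# Chain load-power deltas from two different reviews.
fx8350_vs_3770k = 57   # W more drawn by the FX 8350 (Legit Reviews load test)
i4770k_vs_3770k = 10   # W more drawn by the 4770K over the 3770K (xbitlabs)

# Implied FX 8350 vs i7 4770K gap, if the deltas transfer across rigs:
fx8350_vs_4770k = fx8350_vs_3770k - i4770k_vs_3770k
print(fx8350_vs_4770k)  # 47
```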

Waiting for a review with a maximum difference of 25-30W. Also waiting for clarification on "at best".
 
The Prime95 used for the CPU load tests was made by Bit-tech? :rolleyes:
You could add Cinebench too if you want... but hey, there too the FX fares badly against the i5/i7.

OK, a 57W load difference between the 3770K and 8350. From the xbitlabs review, the difference between the 3770K and 4770K is 10W. So the hypothetical difference between the 4770K and 8350 is around 47W, for the chips and test setup of Legit Reviews.

Waiting for a review with a maximum difference of 25-30W. Also waiting for clarification on "at best".
First prove the 90+W from your initial post addressed to me, before you edited it to this:
Edit: all chips are different, but the existence of one pair of 8350 and 4770K chips where the power difference is 90 watts proves that 20-25W cannot be the maximum difference across all pairs of 8350 and 4770K chips.
25-30W or 90W, which one is more? Then correlate the extra wattage to an actual cost, on a monthly or yearly basis, and compare it with the price premium Intel is charging.
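That electricity-cost comparison is easy to sketch. A rough Python calculation, where the hours per day and the Rs-per-kWh tariff are assumed figures for illustration only (not from any review), and the wattage gaps are the ones being argued over in this thread:

```python
# Yearly electricity cost of a CPU drawing extra watts under load.
# hours_per_day and rs_per_kwh are illustrative assumptions, not data.
def yearly_cost_rs(extra_watts, hours_per_day=4, rs_per_kwh=6.0):
    """Rupees per year for `extra_watts` of additional draw under load."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * rs_per_kwh

for gap in (25, 47, 90):
    print(f"{gap}W extra -> Rs. {yearly_cost_rs(gap):.0f}/year")
```

At these assumed figures even the 90W gap comes to under Rs. 800 a year, so the running-cost side of the argument is small either way; the disputed part is the size of the load-power gap itself.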
 
First prove the 90+W from your initial post addressed to me, before you edited it to this.

25-30W or 90W, which one is more? Then correlate the extra wattage to an actual cost, on a monthly or yearly basis, and compare it with the price premium Intel is charging.

I proved it immediately after mentioning the 90W, by posting the xbitlabs review. It is you who are yet to substantiate your claim of a 25-30W difference. The "evidence" you did post has no direct comparison between the 4770K and 8350, and deduction doesn't support your claim. You are free to post the "dozens" of reviews supporting it.

90W is more, even though you've been claiming the "maximum" is 25-30. Waiting for a mathematics book reference proving 90 is less than 25 or 30.
If the xbitlabs review is not "proof", then neither is Legit Reviews. Why then do you post such links?
 
I would stay away from the weed; then you might be able to appreciate that Bit-tech is biased with their in-house benchmarks. :)
What do you mean by "in-house benchmark"? Bit-tech used a standard Prime95 run for its power consumption load tests. Legit Reviews has used the very same Prime95 test for their load tests as well. So where's the bias? Anandtech and TechReport have shown very similar ~80W differences between procs in their benchmarks. I suppose they are biased too.

Just to be clear, I started off about power consumption. What do Cinebench and compiler optimisations have to do with CPU load tests? That's not an unbiased view, it's a muddled view.

P.S. AFAIK the AMD compiler is not even available for Windows. I guess the OP should switch to Linux and recompile all his apps with the AMD compiler.
 
What do you mean by "in-house benchmark"? Bit-tech used a standard Prime95 run for its power consumption load tests. Legit Reviews has used the very same Prime95 test for their load tests as well. So where's the bias? Anandtech and TechReport have shown very similar ~80W differences between procs in their benchmarks. I suppose they are biased too.

Just to be clear, I started off about power consumption. What do Cinebench and compiler optimisations have to do with CPU load tests? That's not an unbiased view, it's a muddled view.

And I showed you a power consumption chart.
Have a look at the Prime95 results from both reviewing sites and try to make out the difference between them. Still in dispute? Then check results from other reviewers.
Software
Below is a list of the applications we've used for our testing - most of them are available free for public consumption, although some are popular professional software applications.
We've used Windows 7 Home Premium 64-bit SP1, as this is the most flexible and reliable 64-bit OS for testing the applications above.
http://www.bit-tech.net/hardware/2013/06/12/intel-core-i5-4670k-haswell-cpu-review/2
Cinebench and compiler optimisations came up when you posted Prime95 as the benchmarking software, which is Intel-optimised; then again, you will dispute that too. :rolleyes:
P.S. AFAIK the AMD compiler is not even available for Windows. I guess the OP should switch to Linux and recompile all his apps with the AMD compiler.
Sure, tell every other reviewer this for their next AMD CPU reviews. Till then, Cinebench R11.5 is heavily one-sided; or use the latest R15 version, which might open up a few blind views.

PS: Just try to read full posts/reviews rather than hopping on the anti-AMD bandwagon.
 
I repeat - what do compiler optimisations have to do with power consumption? Here's a hint - nothing. So what's your problem?
 
And I can pull a dozen other reviews where the max difference in power consumption between the i7 and FX 8350 is ~25-30W.
Everybody forgets Intel's power usage when it's overclocked, but hey, not every chip is the same... ;)



What's taking the time? Are you writing the dozen reviews?
 
@amitkher , @Crazy_Eddy

While I find the discussion insightful, it still doesn't help me. I am not into heavy overclocks, and unless the TDP drastically affects my choice of mobo/PSU, a difference of 40-50W does not bother me. :)

It would be great if you could give your views on :-
1. Multi-core vs single-core performance, with an eye on the future (considering that both MS and Sony have incorporated AMD multi-core solutions in their consoles)
2. Choice of mobo for AM3+ (I'm thinking 990FX - but I got a little apprehensive reading the bit about MSI mobos burning out)
3. If not AMD, then which kit in Intel?

Cheers! :)
 
I repeat - what do compiler optimisations have to do with power consumption? Here's a hint - nothing. So what's your problem?
Eddy, kindly stop being crazy! Re-read the post. :)
@amitkher , @Crazy_Eddy

While I find the discussion insightful, it still doesn't help me. I am not into heavy overclocks, and unless the TDP drastically affects my choice of mobo/PSU, a difference of 40-50W does not bother me. :)

It would be great if you could give your views on :-
1. Multi-core vs single-core performance, with an eye on the future (considering that both MS and Sony have incorporated AMD multi-core solutions in their consoles)
2. Choice of mobo for AM3+ (I'm thinking 990FX - but I got a little apprehensive reading the bit about MSI mobos burning out)
3. If not AMD, then which kit in Intel?

Cheers! :)

1. Mate, buy the i7 keeping the future in mind, as I see that not many upgrade their rig's core components frequently. It might be costly, but otherwise FX is the only alternative; if money is limited the choice is pretty obvious, or else buy Intel.
2. Buy the Asus 990X or FX Pro... one is a sub-10k board and the other an 11k board. You won't go wrong with either of them.
3. Same as point 1.
 
While I find the discussion insightful, it still doesn't help me. I am not into heavy overclocks, and unless the TDP drastically affects my choice of mobo/PSU, a difference of 40-50W does not bother me. :)
Sorry for going off-topic. As I say, whatever CPU you buy will perform as it should for its price point. Each company prices its processors based on where it sees them fitting in terms of performance. A dual-core Intel i3 costs 7k, a "six-core" FX 6300 costs 7k - AMD is not being generous here; the reason they are priced similarly is that you will find both perform fairly similarly. If your CPU budget is 10k and you get an FX 8xxx, there's absolutely nothing wrong with that.

My post was only to point out the erroneous claims by John, who claims that :-
a) FX processors have hardly any power difference vs Haswell <-- his 25~30W claim is still nowhere to be seen, even after we've given him 4 reviews with 80W+ differences
b) Haswells use more power than FX when overclocked <-- still not proven; John's only response was a sarcastic comment about price
If John had quit being defiant about his incorrect claims, we could have closed this off-topic discussion many posts ago.

It would be great if you could give your views on :-
1. Multi-core vs single-core performance, with an eye on the future (considering that both MS and Sony have incorporated AMD multi-core solutions in their consoles)
2. Choice of mobo for AM3+ (I'm thinking 990FX - but I got a little apprehensive reading the bit about MSI mobos burning out)
1. The AMD "Jaguar" CPU architecture used in the new PS4/Xbox One consoles is an Intel Atom equivalent, i.e. nowhere near the processing performance of AMD's FX processors, let alone Intel's Core procs. While it's not a big deal considering consoles are optimised platforms, I guess the lesson from this is that your GPU matters more than your CPU.
2. You should be fine, since people in this thread haven't recommended MSI boards. You can cross-check with the motherboard VRM info database: http://www.overclock.net/t/946407/amd-motherboards-vrm-info-database - pick a board rated for a 140W TDP and with as many phases as you can fit into your budget.


Eddy, kindly stop being crazy! Re-read the post. :)
Please understand that:
1. We are discussing power consumption ONLY.
2. Prime95 is used for the power consumption load tests in both your Legit Reviews test and the Bit-tech test. The "in-house" media benchmark* is NOT used for the power consumption tests.
3. Prime95 results cannot be directly compared between sites because they are using different test setups altogether. I am unable to see what the Legit Reviews Intel test setup is, since they have only listed the AMD test rig.
4. Even if Prime95 is optimised for Intel and runs 10 times faster, it will still load the Intel chip to ~100%, which will still let us evaluate its max power consumption.

* The Bit-tech "in-house" media benchmark uses Handbrake for video encoding. Handbrake is a front-end for the x264 encoder.
Legit Reviews uses an x264HD benchmark, which uses the x264 encoder again.
Please point out the Intel bias in the above test scenario. If you're saying the input video files to the x264 encoder are "optimised for Intel", you are smoking some killer stuff.

P.S.: We are all still waiting for the dozens of reviews, or even one review, showing a 25~30W difference. Is it usual for AMD fanbois to throw about exaggerated claims and then call others crazy when they can't back them up? :)
 