Confused between (3060 Ti vs 6700 XT) and (i5-12400F vs Ryzen 5 5600)

Daemon_

Are Nvidia's advantages actually worth sacrificing 4GB of VRAM?

Which CPU-GPU combo would be the best between these? I can get either GPU used at 18k.
 
Yes, the 6700XT or 6750XT is an easy pick over the 3060Ti, 3070, 3070Ti or 4060Ti because of the 12GB VRAM & the value it offers at 35k. Pick an Asus model, else Sapphire. My GPU is a 3070, FYI.

I picked the i5 12400 for my rig because of the iGPU; I don't want any downtime if the dGPU has any issues.

Mention your overall budget & fill out the questionnaire.
 
Are Nvidia's advantages actually worth sacrificing 4GB of VRAM?

Which CPU-GPU combo would be the best between these? I can get either GPU used at 18k.
Nah, unless RT really matters to you (not that the 3060Ti will be good at RT), just get the 6700XT.

Between the 12400F and 5600, pick whichever is cheaper.
 
Yes, the 6700XT or 6750XT is an easy pick over the 3060Ti, 3070, 3070Ti or 4060Ti because of the 12GB VRAM & the value it offers at 35k. Pick an Asus model, else Sapphire. My GPU is a 3070, FYI.

I picked the i5 12400 for my rig because of the iGPU; I don't want any downtime if the dGPU has any issues.

Mention your overall budget & fill out the questionnaire.
I haven't been following the PC market for years now, so I would be glad if you could answer a few questions for me. I thought DLSS made these worth picking over AMD cards? I thought that would have become a standard option in the majority of AAA games by now, no? I know FSR 3 is coming out (or has it already come out?), but it still isn't a match for DLSS, right?
 
I haven't been following the PC market for years now, so I would be glad if you could answer a few questions for me. I thought DLSS made these worth picking over AMD cards? I thought that would have become a standard option in the majority of AAA games by now, no? I know FSR 3 is coming out (or has it already come out?), but it still isn't a match for DLSS, right?
Yes, DLSS is indeed superior to FSR 3 (which is already out). However, none of the 30-series cards below the 3080 are worth it, mostly because of VRAM limitations. The 6700XT/6750XT are better buys, and better priced than the NVIDIA equivalents (unless you're going used). Plus, AMD has also implemented its own version of frame generation (Fluid Motion) across the 6000 series as well, alongside the 7000 series.
 
I haven't been following the PC market for years now, so I would be glad if you could answer a few questions for me. I thought DLSS made these worth picking over AMD cards? I thought that would have become a standard option in the majority of AAA games by now, no? I know FSR 3 is coming out (or has it already come out?), but it still isn't a match for DLSS, right?
Having used both DLSS 2.x & FSR 2.x in games, I usually can't see a difference. Game to game, there can be artefacts because of upscaling; I have seen them in both. From online comparisons, DLSS surely has slightly better image quality. IMO neither is worth it for a 1080p target resolution, because sub-1080p rendering doesn't have enough pixels for great upscaling. So 1440p DLSS/FSR Quality is the lowest you can use (unless you are using it on a 15" laptop, where, because of the smaller screen size, sub-1080p rendering is not as big of an issue as on a 24"+ monitor).
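For reference, the internal render resolutions behind these presets can be worked out from the usual published per-axis scale factors (a rough sketch; the ~0.667/0.58/0.5 ratios are the commonly cited DLSS 2.x / FSR 2.x figures, and exact numbers can vary by game):

```python
# Approximate internal render resolution for common upscaler presets.
# Per-axis scale factors (commonly published for DLSS 2.x / FSR 2.x):
# Quality ~0.667, Balanced ~0.58, Performance ~0.5.
PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(width, height, preset):
    """Return the approximate internal render size for a target output size."""
    scale = PRESETS[preset]
    return round(width * scale), round(height * scale)

for target in [(1920, 1080), (2560, 1440)]:
    for preset in PRESETS:
        w, h = internal_resolution(*target, preset)
        print(f"{target[0]}x{target[1]} {preset}: renders at ~{w}x{h}")
```

This is the point about 1080p: even Quality mode renders at roughly 720p internally, which leaves the upscaler very little to work with, while 1440p Quality starts from a much healthier ~960p.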

I game at 1440p, and I have already seen many games where I hit the VRAM limit. This situation will get worse with time. My rig is almost 2 years old, so 8GB VRAM today for 1440p gaming is an easy no from me. I have to lower texture quality to make sure the VRAM limit is not hit. Games where I faced VRAM issues are FH5, Hogwarts Legacy & TLOU1. In FH5, the GPU is powerful enough to run it at 1440p Extreme at 90+ fps, but I have to set textures to High (even at Ultra I get an out-of-VRAM warning after about 1 hour of gameplay). Sure, I will miss Nvidia Broadcast, but if I were choosing between the 3070 & 6750XT, my choice would be the 6750XT. So take your call based on this info.

Another option is to wait for a month & hope the 7700XT drops below 40k. IMO this is required, as 45k is too close to 7800XT & 4070 pricing.


Regarding frame generation: those are not actual frames, so if you are already getting 60+ fps natively, it will be great for making it even smoother with FG & making it seem like 100 fps. It is getting better with time, but RTX 40xx GPUs are a terrible value at this price, & DLSS 3.x alone is not enough to justify buying a terrible-value card like the 4060 or 4060Ti.
 
I agree that the 6700XT is the better option vs the 3060Ti at the same price.

Having used both DLSS 2.x & FSR 2.x in games, I usually can't see a difference.
FSR is terrible at lower quality settings, whereas DLSS holds up much better.
In one game where I tried FSR, I got extra artifacts around smoke that looked ugly. Also, I generally think it gives a softer output vs DLSS.
I did not check, but maybe 4K Quality or similar is OK, dunno.

Also, DLDSR is very nice too, but again maybe not as relevant in this performance range. I really prefer DLDSR + DLSS these days. AMD has its own super resolution feature; not sure how it compares.

Regarding frame generation, those are not actual frames, so if you are already getting 60+fps natively, it will be great to make it even smoother with FG & make it seem like 100fps. It is getting better with time but RTX 40xx GPUs are a terrible value at this price & DLSS 3.x is alone not enough to be worth buying a terrible value card like 4060 or 4060Ti.
FSR frame gen can work very well. I have only tested it in one game (Witcher 3) with a 3080 through a mod + DLSS upscaling, but it makes things smoother + I don't feel any lag + very few artifacts, only in extreme cases. Well worth the trade-off for me. It also goes against Nvidia's BS about needing extra hardware for frame gen. I was skeptical about this earlier, and yes, frame numbers may mean little here, but it looks to be worth it and hopefully will get better with time.

But the point is that this is no longer an Nvidia 4000-series monopoly feature.
 
Idk about you guys, but FSR and DLSS are both terrible at 1080p.
And both the 3060Ti/6700XT are 1080p cards.
But I'm interested to know if they look alright at 1440p?
 
Idk about you guys, but FSR and DLSS are both terrible at 1080p.
And both the 3060Ti/6700XT are 1080p cards.
But I'm interested to know if they look alright at 1440p?
Depends on the game. DLSS Quality at 1440p is fine vs native, but I instead use DLSS Balanced/Performance + DLDSR from 4K, which looks better to me.
There is too much blur these days. It was terrible in Metro Exodus until I did this + also used AMD CAS. AMD cards should be able to use CAS without mods, so that's a plus for AMD too.

At 1080p, yeah, I tried Control with DLSS Quality and it was very soft. I did not realize for some time that it was DLSS that was causing the graphics to look crappy. But another game looked fine...
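The DLDSR + DLSS chain above is easier to follow with the numbers written out (a sketch; DLDSR 2.25x means 1.5x per axis, and ~0.58 is the commonly published DLSS Balanced ratio):

```python
# Sketch of the DLDSR-from-4K + DLSS Balanced chain on a 1440p monitor.
# DLDSR 2.25x is 1.5x per axis (2560x1440 -> 3840x2160); DLSS renders
# internally at its scale factor, reconstructs to the 4K DLDSR target,
# and the driver then downsamples that back to the 1440p panel.
monitor = (2560, 1440)
dldsr_axis_scale = 1.5      # 2.25x total pixels = 1.5x per axis
dlss_balanced = 0.58        # commonly cited Balanced ratio (assumption)

dldsr_target = tuple(round(d * dldsr_axis_scale) for d in monitor)
internal = tuple(round(d * dlss_balanced) for d in dldsr_target)

print(f"DLDSR target: {dldsr_target[0]}x{dldsr_target[1]}")       # 3840x2160
print(f"Internal render: ~{internal[0]}x{internal[1]}")
print(f"Internal pixels vs native 1440p: "
      f"{internal[0] * internal[1] / (monitor[0] * monitor[1]):.2f}x")
```

The takeaway: the internal render ends up around 2227x1253, far more pixels than plain 1440p DLSS Balanced would use, plus a 4K reconstruction pass before the downsample, which is plausibly why it looks sharper.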
 
Depends on the game. DLSS Quality at 1440p is fine vs native, but I instead use DLSS Balanced/Performance + DLDSR from 4K, which looks better to me.
There is too much blur these days. It was terrible in Metro Exodus until I did this + also used AMD CAS. AMD cards should be able to use CAS without mods, so that's a plus for AMD too.

At 1080p, yeah, I tried Control with DLSS Quality and it was very soft. I did not realize for some time that it was DLSS that was causing the graphics to look crappy. But another game looked fine...
I normally use the same combination with my 3080 as well.

Looks and runs great! Any sort of upscaling will look like garbage at 1080p. They are only effective at 1440p and higher.
 
I owned an RTX 3050 for more than a year before I upgraded to a 3080. I bought it at the height of the GPU shortage because my old card failed. So I feel qualified to say that if you can get it, DLSS is absolutely worth it, and it can take a 30-40 fps gaming experience to 60-70 with little to no loss in visual fidelity.

I would go so far as to say that I'd pick an Nvidia card over a corresponding AMD card that has ~20% better raster performance, just for DLSS.

I used it at 1080p exclusively, usually on Quality mode, so an internal render resolution of 720p, and it was like magic. With the occasional artefact and ghosting to remind me that it wasn't, but still a very easy choice. FSR, at least from what I've seen, is not remotely in the same league. It just looks like a spatial upscaler. DLSS can actually add detail that is not there. And in certain games like Cyberpunk, where the TAA implementation, at least at launch, was really bad, DLSS 1080p was/is superior to native 1080p.

Furthermore, Nvidia has always used a more efficient texture compression algorithm than AMD. So while Nvidia's cards are absolutely still behind AMD when it comes to VRAM, in some games I've noticed that Nvidia cards use up to 1GB less VRAM at the same settings.
 
I used it at 1080p exclusively, usually on Quality mode, so an internal render resolution of 720p, and it was like magic. With the occasional artefact and ghosting to remind me that it wasn't, but still a very easy choice. FSR, at least from what I've seen, is not remotely in the same league. It just looks like a spatial upscaler. DLSS can actually add detail that is not there. And in certain games like Cyberpunk, where the TAA implementation, at least at launch, was really bad, DLSS 1080p was/is superior to native 1080p.
This is game-dependent; I get that it's good enough in Cyberpunk even at 1080p. But it was horrible in Control - quite blurry, and disabling it worked much better for me at 1080p. The general opinion of many reviewers (who must have played many more games) is that upscaling in general does not work well at 1080p.
Of course, having DLSS as an option if we move up to 1440p is also well worth considering.

Anyway both are decent cards.

Right now, VRAM usage has gone back down in new games. So we could say 8GB is still borderline enough for now, but it's on much shakier ground vs 12GB (or 11GB, if you want to call it that).
I am hitting VRAM limits in Witcher 3 next-gen frequently because of how I am playing with the 3080.
I tried to use a mod to increase the draw distance of vegetation/detail and crashed soon enough; I had to scale down a bit.

Reduced texture quality will look worse too.

RT performance IMO is not very relevant, because they are both slow. I think the 3080 is probably the bare minimum. And RT itself can take up more VRAM.

It's unfortunate that Nvidia has been so skimpy with VRAM. And hopefully AMD catches up with RT / AI-based FSR next gen to get more competitive.
RT is the bottleneck now; that is where the focus should be.
 
This is game-dependent; I get that it's good enough in Cyberpunk even at 1080p. But it was horrible in Control - quite blurry, and disabling it worked much better for me at 1080p. The general opinion of many reviewers (who must have played many more games) is that upscaling in general does not work well at 1080p.
Of course, having DLSS as an option if we move up to 1440p is also well worth considering.

Anyway both are decent cards.

Right now, VRAM usage has gone back down in new games. So we could say 8GB is still borderline enough for now, but it's on much shakier ground vs 12GB (or 11GB, if you want to call it that).
I am hitting VRAM limits in Witcher 3 next-gen frequently because of how I am playing with the 3080.
I tried to use a mod to increase the draw distance of vegetation/detail and crashed soon enough; I had to scale down a bit.

Reduced texture quality will look worse too.

RT performance IMO is not very relevant, because they are both slow. I think the 3080 is probably the bare minimum. And RT itself can take up more VRAM.

It's unfortunate that Nvidia has been so skimpy with VRAM. And hopefully AMD catches up with RT / AI-based FSR next gen to get more competitive.
RT is the bottleneck now; that is where the focus should be.
I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. I believe that was a launch title for DLSS and used an earlier version, which could explain why it was bad. But I know for a fact it works really well in Death Stranding, Guardians Of The Galaxy (an extremely underrated game with a great DLSS implementation), Baldur's Gate 3, Hogwarts Legacy (took it from barely playable with frequent dips at High 1080p on the 3050 to a smooth 40-60 fps) and Alan Wake II. In fact, I've never had to turn it off since getting a 3000-series card. And with the RTX 3050 being an underpowered card, DLSS is crucial to why it can still be a good choice for 1080p gaming at the right price. Something many reviewers missed when it launched.

DLSS at 1080p absolutely works. But it depends on what your reference is. Is it better than playing at 720p? Absolutely. Is it better than FSR? Nearly always. FSR never manages to convince you that you're looking at anything more than a clever sharpening filter. DLSS will do things like reconstruct a wire fence texture that wasn't even visible at 720p, and even compensate for low texture resolution in many places. Imagine doing this kind of upscaling with FSR lol!

Nvidia definitely gimped all the 3000-series cards, and the 4000 series even more so, with low VRAM. They're pretty greedy and need to be disrupted. I desperately want Intel (XeSS looks better than FSR imo, and I'm excited for Battlemage) or AMD to give them some real competition. But AMD has consistently dropped the ball on the software side of things. FSR is not a real competitor to DLSS. It's great if you have an old card, but no one should seriously be comparing FSR to DLSS as if they're equivalent technologies. The new AMD Fluid Motion thing is a great example of the AMD mindset. More frames! But only when you're basically standing still?? And not recommended in FPS games according to reviewers! It's a gimmick.
 
I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. I believe that was a launch title for DLSS and used an earlier version, which could explain why it was bad. But I know for a fact it works really well in Death Stranding, Guardians Of The Galaxy (an extremely underrated game with a great DLSS implementation), Baldur's Gate 3, Hogwarts Legacy (took it from barely playable with frequent dips at High 1080p on the 3050 to a smooth 40-60 fps) and Alan Wake II. In fact, I've never had to turn it off since getting a 3000-series card. And with the RTX 3050 being an underpowered card, DLSS is crucial to why it can still be a good choice for 1080p gaming at the right price. Something many reviewers missed when it launched.
OK, I only played 2 games at 1080p before upgrading my monitor, and it worked well in the first game but not in the second.

DLSS at 1080p absolutely works. But it depends on what your reference is. Is it better than playing at 720p? Absolutely.
At 1440p, DLSS seems good vs native too. But usually that's because native isn't very good: TAA and similar techniques apparently use a lower resolution and then add details from nearby frames.
Even with DLSS, I don't really like how blurry games can be these days, almost always in motion and sometimes even when still (Metro Exodus Enhanced).
I played Watch Dogs 1 with some mods and MSAA, and that was nice and sharp.

So I use DLDSR to start from a higher resolution and then use DLSS, and somehow it works. So yeah, DLSS/DLDSR etc. are almost essential for me in many games.
On the AMD side there is CAS, which can also be nice. AMD nicely open-sourced it, and it's available to Nvidia users through ReShade.

The new AMD Fluid Motion thing is a great example of the AMD mindset. More frames! But only when you're basically standing still?? And not recommended in FPS games according to reviewers! It's a gimmick.
Have you tried it? It works very well. I am playing Witcher 3 with it + DLSS via a mod on a 3080.
I had my doubts, but frame gen is very useful too, and I don't have any particular issue in motion in this game (I have heard the mod is working well on Cyberpunk as well).
It's certainly not a gimmick. The UI is clean (I had to use DLDSR 2.25x instead of a lower-resolution DLDSR, as it affected the UI), artifacts are rare, and I don't feel any latency issues - well worth the trade-off for getting smoother frames.

FSR frame gen is the first example for me showcasing that AMD can catch up on software too.
They really need to improve RT performance, as it's the bottleneck for new games, and substantially improve FSR to be competitive. There was some news that AMD's next console chip might have neural stuff, so maybe it will come to desktop too and we might get better upscaling. And that they were going to do something in the RT pipeline in hardware, so maybe both things get fixed before Intel slaps them next gen. Let's see...
 
OK, I only played 2 games at 1080p before upgrading my monitor, and it worked well in the first game but not in the second.


At 1440p, DLSS seems good vs native too. But usually that's because native isn't very good: TAA and similar techniques apparently use a lower resolution and then add details from nearby frames.
Even with DLSS, I don't really like how blurry games can be these days, almost always in motion and sometimes even when still (Metro Exodus Enhanced).
I played Watch Dogs 1 with some mods and MSAA, and that was nice and sharp.

So I use DLDSR to start from a higher resolution and then use DLSS, and somehow it works. So yeah, DLSS/DLDSR etc. are almost essential for me in many games.
On the AMD side there is CAS, which can also be nice. AMD nicely open-sourced it, and it's available to Nvidia users through ReShade.
Yeah, DLSS absolutely kicks ass at 1440p. The difference between it and native is even smaller. I usually have to pixel-peep and compare screenshots to even see a difference. Can you imagine playing without it at any high resolution? That's why I can't bring myself to buy an AMD card, or even recommend one to people. AI upscaling is absolutely the future. And FSR is just not there yet.
Have you tried it? It works very well. I am playing Witcher 3 with it + DLSS via a mod on a 3080.
I had my doubts, but frame gen is very useful too, and I don't have any particular issue in motion in this game (I have heard the mod is working well on Cyberpunk as well).
It's certainly not a gimmick. The UI is clean (I had to use DLDSR 2.25x instead of a lower-resolution DLDSR, as it affected the UI), artifacts are rare, and I don't feel any latency issues - well worth the trade-off for getting smoother frames.
You're talking about FSR 3. I was talking about AFMF, the generic solution that works at the driver level. It does not have access to the motion vectors from the game engine, so it's much worse.

In the video, he got much worse latency (an extra ~20ms), and his fps dips drastically each time he moves his cursor at a speed appropriate for an FPS game. So it's probably a nifty trick for an MMO or an isometric-perspective game, if you don't mind a big latency penalty, but I probably wouldn't even bother, because those games usually don't need the fps boost in the first place. The use case for AFMF is tiny.
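To put the latency numbers in perspective, here is a quick frame-time sketch (illustrative only; the 60-to-100 fps figure is from earlier in the thread, and the ~20ms penalty is the number reported in the video):

```python
# Illustrative frame-time arithmetic for frame generation.
# Generated frames smooth motion, but input is still sampled at the
# base rate, and any reported FG latency penalty comes on top of that.
def frame_time_ms(fps):
    """Milliseconds per frame at a given frame rate."""
    return 1000.0 / fps

base_fps, displayed_fps = 60, 100   # "60+fps natively ... seem like 100fps"
afmf_penalty_ms = 20                # extra latency reported in the video

print(f"Base: {frame_time_ms(base_fps):.1f} ms/frame; "
      f"displayed: {frame_time_ms(displayed_fps):.1f} ms/frame")
print(f"Responsiveness with the added penalty: roughly "
      f"{frame_time_ms(base_fps) + afmf_penalty_ms:.1f} ms, "
      f"vs {frame_time_ms(base_fps):.1f} ms without FG")
```

In other words, a ~20ms penalty on top of a ~16.7ms base frame time more than doubles the per-frame input cost, which is why it matters so much more in an FPS than in an MMO.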

Coming to FSR 3, I tried the mod for Cyberpunk that combines FSR frame generation with DLSS. It made full path tracing at 1440p doable on the 3080, as long as you were okay with lots of ghosting. Not sure how that compares to DLSS 3. Frame generation definitely has potential, but here again you have the classic case of Nvidia locking DLSS 3 to the 40-series cards because they claim the optical flow analysis can't be done on the 30 series, etc. They can get away with that sort of thing because DLSS has no real competitors at the moment. Hopefully FSR 3 gets a lot better with time.
 
I'm trying to recall all the games I played with DLSS at 1080p. I never tried Control. I believe that was a launch title for DLSS and used an earlier version, which could explain why it was bad. But I know for a fact it works really well in Death Stranding, Guardians Of The Galaxy (an extremely underrated game with a great DLSS implementation), Baldur's Gate 3, Hogwarts Legacy (took it from barely playable with frequent dips at High 1080p on the 3050 to a smooth 40-60 fps) and Alan Wake II. In fact, I've never had to turn it off since getting a 3000-series card. And with the RTX 3050 being an underpowered card, DLSS is crucial to why it can still be a good choice for 1080p gaming at the right price. Something many reviewers missed when it launched.

DLSS at 1080p absolutely works. But it depends on what your reference is. Is it better than playing at 720p? Absolutely. Is it better than FSR? Nearly always. FSR never manages to convince you that you're looking at anything more than a clever sharpening filter. DLSS will do things like reconstruct a wire fence texture that wasn't even visible at 720p, and even compensate for low texture resolution in many places. Imagine doing this kind of upscaling with FSR lol!

Nvidia definitely gimped all the 3000-series cards, and the 4000 series even more so, with low VRAM. They're pretty greedy and need to be disrupted. I desperately want Intel (XeSS looks better than FSR imo, and I'm excited for Battlemage) or AMD to give them some real competition. But AMD has consistently dropped the ball on the software side of things. FSR is not a real competitor to DLSS. It's great if you have an old card, but no one should seriously be comparing FSR to DLSS as if they're equivalent technologies. The new AMD Fluid Motion thing is a great example of the AMD mindset. More frames! But only when you're basically standing still?? And not recommended in FPS games according to reviewers! It's a gimmick.
I'm gonna be gaming at 1080p, so I don't think DLSS or FSR will matter, since they won't look good there. And Nvidia's 40-series cards are capable of DLSS frame gen, while FSR 3 has come out, which works on almost every GPU (even iGPUs) and can produce similar performance.
 
In the video, he got much worse latency (an extra ~20ms), and his fps dips drastically each time he moves his cursor at a speed appropriate for an FPS game. So it's probably a nifty trick for an MMO or an isometric-perspective game, if you don't mind a big latency penalty, but I probably wouldn't even bother, because those games usually don't need the fps boost in the first place. The use case for AFMF is tiny.
It will need time; hopefully there will be use cases where it's useful. These things seem to get released early and fixed over time.

Coming to FSR 3, I tried the mod for Cyberpunk that combines FSR frame generation with DLSS. It made full path tracing at 1440p doable on the 3080, as long as you were okay with lots of ghosting. Not sure how that compares to DLSS 3. Frame generation definitely has potential, but here again you have the classic case of Nvidia locking DLSS 3 to the 40-series cards because they claim the optical flow analysis can't be done on the 30 series, etc. They can get away with that sort of thing because DLSS has no real competitors at the moment. Hopefully FSR 3 gets a lot better with time.
Haven't tried it in Cyberpunk, but it's close to perfect in Witcher 3 - no ghosting as such, at least I don't notice it. It can produce some artifacts and flickering over objects, but rarely.
I'm using this mod, version 0.7. Since it's open source, people will hopefully keep improving it and fixing where it's broken.
 