
The official "Can AMD finally deliver a competitive GPU?" thread


Recommended Posts

7 minutes ago, Opinionated Ham Scarecrow said:

I'm leaning towards a 6800 too. I've never owned an AMD gpu though so I feel scared and ashamed.

Think it was 2016(?) when I jumped from AMD's 290X to Nvidia's 1070, and the difference in driver quality (fewer glitches and random crashes) was huge. Very much hope AMD sorts these types of issues out for the 6000 series. The one thing that gives me massive hope is the timing of the new consoles, given they share the new GPU architecture. Should see a lot of optimisations made across the board.

9 minutes ago, Thor said:

 

 

 

Also this! :lol: I just don't like the idea of switching now that I've seen what Nvidia can do with DLSS and ray tracing on my 2080 Ti. But if it's clear next year that AMD's cards are genuinely competitive, we'll see...

If AMD manage to sustain year on year gains, or even just hit 20%, bulk up their RT capability, and get their DLSS competitor released, then with DirectStorage hitting the market and RDNA3 they may well kick some sand in Nvidia’s picnic.

Just now, footle said:

If AMD manage to sustain year on year gains, or even just hit 20%, bulk up their RT capability, and get their DLSS competitor released, then with DirectStorage hitting the market and RDNA3 they may well kick some sand in Nvidia’s picnic.

And that's in the future, which is why I'm not buying anything now. :)


I went from a Radeon 7950 to my 980 Ti, so it's not too scary to move back for the 6800. I don't recall anything problematic with my 7950, but it was a long time ago.

 

I think I heard that both new consoles are Radeon-based; is that true? If so, would that mean we could all benefit from some underlying optimisation in some games, which in turn would help with driver stability?

 

No idea if that's a valid question or not. Keen to hear people's thoughts.

10 minutes ago, Simbo said:

I went from a Radeon 7950 to my 980 Ti, so it's not too scary to move back for the 6800. I don't recall anything problematic with my 7950, but it was a long time ago.

 

I think I heard that both new consoles are Radeon-based; is that true? If so, would that mean we could all benefit from some underlying optimisation in some games, which in turn would help with driver stability?

 

No idea if that's a valid question or not. Keen to hear people's thoughts.

 

Both next-gen consoles are using bespoke RDNA 2 solutions. Like you mention, I think this is going to pay massive dividends in the future, especially around RT. Nvidia will always have the best implementation of it, but it's going to have to be specifically developed on a per-game/engine basis, the same as DLSS. AMD are going down the route of having most of their features (including dedicated RT hardware, done differently, and their sort-of-but-not-quite DLSS rival) exposed through DX12, which should mean that anything done for console will also be enabled on the 6000-series GPUs that utilise it.

 

Guess the million-dollar question is whether devs will go the proprietary Nvidia route (and how much cash Nvidia bungs them to do so) or the broader catch-all of the DX12 route.
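For what it's worth, the DX12 side of this is DXR: an engine just asks the API what ray tracing tier the card supports rather than coding against a specific vendor, and RDNA 2, Turing and Ampere all answer through the same query. A minimal sketch (not from anyone in the thread, just the standard D3D12 feature check, with device creation kept to the bare minimum):

```cpp
// Minimal DXR capability check via the standard D3D12 feature query.
// Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter (whichever GPU Windows picks).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12-capable GPU found.");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier: NOT_SUPPORTED, 1_0 or 1_1.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1)
        std::puts("DXR tier 1.1 (RDNA 2 / Turing / Ampere class).");
    else if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("DXR tier 1.0.");
    else
        std::puts("No hardware/driver DXR support.");
    return 0;
}
```

The vendor-specific bits (DLSS, NVAPI extras) sit on top of that, which is really what the proprietary-vs-DX12 question above is about.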


Brilliant. Thanks for explaining that so clearly. I think it's given me more confidence that I've made the right choice with the RX 6800 over the 3070.

 

Just need to hope there is sufficient availability come Nov 18th.

 

Really looking forward to increasing the fidelity of my gaming in VR. My 980 Ti is starting to struggle given the higher screen resolution of the Quest 2 compared to my original CV1.

5 minutes ago, Simbo said:

Brilliant. Thanks for explaining that so clearly. I think it's given me more confidence that I've made the right choice with the RX 6800 over the 3070.

 

Just need to hope there is sufficient availability come Nov 18th.

 

Really looking forward to increasing the fidelity of my gaming in VR. My 980 Ti is starting to struggle given the higher screen resolution of the Quest 2 compared to my original CV1.

 

This is quite a good video on the subject of RT; it may explain things a bit more easily.

 

 

Obviously this isn't an apples-to-apples comparison as he's not using the 6800 series, but it should give you the basic gist.


Watching that video, it seems to me that developers are going to overuse RT at the start, making everything ridiculously shiny and unnatural, before settling down as time goes on. Some of those scenes look like staring into a mirror in every little patch of water, which is even more unrealistic given that the water doesn't move at all. Plus it makes the character models look incredibly low-res.

 

Done well I can see it will be fantastic and immersive without you even realising it's there, but in that video it was just too much. 


Looking at that Watch Dogs video (DAMN it's gorgeous, might have to get it, considering it's London), the Nvidia RT is much better, but it is a little too much in terms of reflection, at least in this game. Glass and windows shouldn't "mirror" quite as much. Looking at the side by side, something in the middle would be good.

18 hours ago, Gabe said:

Watching that video, it seems to me that developers are going to overuse RT at the start, making everything ridiculously shiny and unnatural, before settling down as time goes on. Some of those scenes look like staring into a mirror in every little patch of water, which is even more unrealistic given that the water doesn't move at all. Plus it makes the character models look incredibly low-res.

 

Done well I can see it will be fantastic and immersive without you even realising it's there, but in that video it was just too much. 

 

Always happens, like the overuse of bloom in the PS2/GC/Xbox era and Witcher 3's foliage thrashing about in silent 100km/h winds.

2 hours ago, Benny said:

Lense flaaaaaare.

 

(That was the PS/N64 era equivalent)

 

Aw, I miss lens flare :(

 

Especially the lens flare going on in Mass Effect - that was great.

2 hours ago, Benny said:

Lense flaaaaaare.

 

(That was the PS/N64 era equivalent)

 

 

The other one I couldn't be doing with was chromatic aberration but, thankfully, its popularity as a thing this generation was mercifully brief.

Just now, Mike S said:

 

 

The other one I couldn't be doing with was chromatic aberration but, thankfully, its popularity as a thing this generation was mercifully brief.

 

It still rears its ugly head on occasion, and when it does, everything becomes a blurry mess. A Plague Tale is simply beautiful... once you turn it off and can actually see the texture work at the edges of the screen.

3 minutes ago, gooner4life said:


Warzone hits that at 1080p with everything maxed now.

Reported VRAM usage isn't necessarily what the card needs; it just "allocates" it, if you will. The highest I've heard of something actually using is about 6.5GB, and that was Red Dead at 4K Ultra.
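If anyone wants to poke at this themselves, Windows exposes the OS-level view through DXGI: per adapter, there's a budget (how much VRAM the OS will let a process commit) and the amount the process has actually committed, and even that committed figure isn't the same as what a game genuinely touches each frame. A rough sketch using the standard DXGI calls (nothing game-specific; adapter 0 is assumed to be the main GPU):

```cpp
// Query local (on-card) video memory budget vs. current process usage via DXGI.
// Link against dxgi.lib.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    // Adapter 0 is assumed to be the primary/discrete GPU here.
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    // "Budget" is what the OS will let this process commit in VRAM;
    // "CurrentUsage" is what the process has actually committed right now.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("VRAM budget:  %.1f GB\n", info.Budget       / 1e9);
    std::printf("VRAM in use:  %.1f GB\n", info.CurrentUsage / 1e9);
    return 0;
}
```

Overlays that show a single "VRAM used" number are generally reading counters like these, which is why the figure looks scarier than what the game actually needs.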

1 minute ago, Benny said:

 

It still rears its ugly head on occasion, and when it does, everything becomes a blurry mess. A Plague Tale is simply beautiful... once you turn it off and can actually see the texture work at the edges of the screen.

 

 

Whenever I go through the graphics settings after first installing a game, it's something I turn off immediately: just an awful gimmick that, as you say, only makes things look worse.

 

3 minutes ago, IcEBuRN said:

Reported VRAM usage isn't necessarily what the card needs; it just "allocates" it, if you will. The highest I've heard of something actually using is about 6.5GB, and that was Red Dead at 4K Ultra.

 

Agreed, but on my 1060 6GB, Warzone's performance now tanks if I let it go over the allocated 6GB. It used to use way less, but it's a pretty horribly optimised game, in fairness.


As mentioned earlier in the thread, I've been thinking of the RX 6800, but that was because I was limiting myself to my existing 650W PSU.

 

Whilst I may not have an immediate requirement just now for the RX 6800 XT, I will be upgrading the rest of my PC over the next 12 months (motherboard/CPU & monitor), so I think I should just bite the bullet and go for the XT now and get myself a new PSU.

 

Would appreciate some recommendations on PSUs - any particular make/model you recommend?   Also, should I go as high as 800W/850W or would 750W be sufficient?  

 

(Apologies in advance if I should be asking this in the PC building thread)

1 hour ago, Simbo said:

As mentioned earlier in the thread, I've been thinking of the RX 6800, but that was because I was limiting myself to my existing 650W PSU.

 

Whilst I may not have an immediate requirement just now for the RX 6800 XT, I will be upgrading the rest of my PC over the next 12 months (motherboard/CPU & monitor), so I think I should just bite the bullet and go for the XT now and get myself a new PSU.

 

Would appreciate some recommendations on PSUs - any particular make/model you recommend?   Also, should I go as high as 800W/850W or would 750W be sufficient?  

 

(Apologies in advance if I should be asking this in the PC building thread)

Well, the 3080's board power is only about 20W higher than the 6800 XT's and Nvidia recommends 750W for it, but most reviews have been fine with a quality 80+ Gold unit at 700W+.

 

Think quality of supply is proving to be more important than raw wattage with these new GPUs (brief transient spikes can trip weaker units), so maybe grab a quality 750-800W 80+ Gold PSU, just in case you want a beefier CPU in the future too.

 

LTT forums have a tier list of PSUs which is very useful too: https://linustechtips.com/topic/1116640-psucultists-psu-tier-list/
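As a rough sanity check on that 750-800W figure, here's a back-of-the-envelope estimate. The 300W is AMD's quoted total board power for the 6800 XT; the CPU and "rest of system" numbers are my own ballpark guesses, not from any spec sheet:

```cpp
// Back-of-the-envelope PSU sizing: sum typical component draws,
// then add headroom for transient spikes and future upgrades.
#include <cstdio>

int main()
{
    const double gpu_w    = 300.0; // RX 6800 XT total board power (AMD's figure)
    const double cpu_w    = 150.0; // ballpark for a higher-end desktop CPU under load
    const double rest_w   = 75.0;  // board, RAM, drives, fans, USB - rough guess
    const double headroom = 1.4;   // ~40% margin for spikes and headroom

    const double load  = gpu_w + cpu_w + rest_w; // ~525 W sustained
    const double sized = load * headroom;        // ~735 W
    std::printf("Estimated load: %.0f W, suggested PSU: ~%.0f W\n", load, sized);
    return 0;
}
```

Which lands right around 750W; the tier list is then about making sure the 750W unit you buy actually behaves like 750W.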


In addition, for recommendations: https://psutierlist.com/. After that, you can cross-reference units with reviews (hopefully some will lead you to torture tests like what HardOCP used to do).

I personally use a Seasonic Focus Gold 750W and it seems perfectly capable of serving a 3080; a 6800 XT should be chugging less than that, as mentioned.

It's also worth mentioning that Seasonic supplies other manufacturers with PSUs based on the Focus Gold series (like the Riotoro G2 mentioned in the list), so be sure to shop around to find yourself a bargain.

4 hours ago, Simbo said:

As mentioned earlier in the thread, I've been thinking of the RX 6800, but that was because I was limiting myself to my existing 650W PSU.

 

Whilst I may not have an immediate requirement just now for the RX 6800 XT, I will be upgrading the rest of my PC over the next 12 months (motherboard/CPU & monitor), so I think I should just bite the bullet and go for the XT now and get myself a new PSU.

 

Would appreciate some recommendations on PSUs - any particular make/model you recommend?   Also, should I go as high as 800W/850W or would 750W be sufficient?  

 

(Apologies in advance if I should be asking this in the PC building thread)

Ah shit, this has just made me think. I use an Akitio Node Pro eGPU enclosure with my Mac for rendering and I've currently got a 1080 Ti in there, but it means I'm stuck on High Sierra. I was really looking to upgrade to Big Sur when it launches and put a 6800 XT in there instead, but looking at the specs, it says the enclosure only has a 500W PSU. Does this mean I'll need a new one to run it?


I don't know anything about eGPUs, but that said, the total board power of the 6800 XT is 300W, so maybe it's OK if it's only powering the card itself?

 

....

 

In amongst all the recent performance benchmarks, this popped up on Twitter and made me laugh.

 

 

 

On 28/10/2020 at 17:02, Liamness said:

The most popular GPU today is still the GTX 1060, a 120W card, yet AMD have just announced one 250W and two 300W cards. I don't really get it.

 

Same idea as Nvidia: launch the aspirational halo-effect flagships first and win back some mindshare. The average online/real-world peer group influencer isn't running a GTX 1060 and probably never was; they'd have been on a GTX 1080 or GTX 1080 Ti.

 

AMD had the mid-range RX 480/580 to compete against the GTX 1060 (and anything better) for a few years, and then the RX 5700 to compete against the RTX 2070, but Nvidia didn't lose much in the way of market share (I tend to forget Vega ever existed), despite both of those products being performance and price competitive. They clearly needed to show they had something good enough against the best, even if only a small fraction of sales comes from the most expensive cards.

 


 

 

AMD finally semi-competitive again, but with the rumoured weaker RT (around Turing level). Nvidia clearly have spies in AMD's business, which is why they whacked up the power requirements of their parts and rushed them to market to smother AMD before they can get a foothold.

 

 

 

  

On 29/10/2020 at 10:38, Scruff said:

Guess the million-dollar question is whether devs will go the proprietary Nvidia route (and how much cash Nvidia bungs them to do so) or the broader catch-all of the DX12 route.

 

Makes no real difference to Nvidia; they are also involved in the DX12 Ultimate spec, and both Turing and Ampere are fully compliant with the feature set, so it doesn't really matter if that stuff gets used. It would also explain why they are pushing more exclusive software and high-end hardware features now, like their fancy latency-measurement tools for people with the money and their latency-lowering technologies in all the popular online games.

 

All that should still give them the edge, and as they already control 80%+ of the gaming GPU market, it's an uphill battle for AMD. Nvidia aren't going to get caught with their trousers down for years like Intel did, so there's no easy way to beat them.

 

 

 

On 28/10/2020 at 16:35, dfq23 said:

Too many asterisks with the 6900 XT benchmarks for my liking. It's overclocked and using the new Ryzen 5000-series memory thing.

 

Not exactly an apples-to-apples comparison.

 

I think this really needs restating: ALL of those slides state Smart Access Memory... which seems to only be for Ryzen 5000 on B550/X570 motherboards...

 

So... it will be interesting to see real-world examples of what the difference is between running one of these cards on AMD versus Intel: whether it actually offers a measurable difference, or whether it was the only way to get those better results.

 

It could put people off buying one if they only have the money for a GPU and know they won't be getting the full benefit...

 

Guess we will know in a couple of weeks though!
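For anyone wondering what Smart Access Memory actually is under the hood: it's AMD's branding of PCIe resizable BAR, which lets the CPU address the whole of VRAM directly instead of through the usual 256MB window. From the software side that shows up as a device-local memory type that's also host-visible, backed by a heap bigger than 256MB, so you can probe for it. A rough sketch (Vulkan used purely as an illustration here; the heap-size check is a common rule of thumb, not anything official from AMD):

```cpp
// Probe for a resizable-BAR-style memory type: DEVICE_LOCAL + HOST_VISIBLE
// with a backing heap larger than the classic 256 MiB BAR window.
// Link against vulkan-1.lib (Windows) or -lvulkan (Linux).
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main()
{
    VkInstanceCreateInfo ici = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS)
        return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus)
    {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);

        bool large_bar = false;
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i)
        {
            const VkMemoryType& t = mem.memoryTypes[i];
            const bool dev_local = t.propertyFlags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
            const bool host_vis  = t.propertyFlags & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
            // CPU-mappable VRAM heap larger than 256 MiB likely means the BAR was resized.
            if (dev_local && host_vis &&
                mem.memoryHeaps[t.heapIndex].size > 256ull * 1024 * 1024)
                large_bar = true;
        }
        std::printf("%s: resizable BAR %s\n", props.deviceName,
                    large_bar ? "appears enabled" : "not visible");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

(Integrated GPUs will also report "enabled" since all their memory is CPU-visible, which is why this is a heuristic rather than a definitive check.)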

 

