
NVIDIA 4090 $1599 Oct 12, 4080 16GB $1199, 4080 12GB (basically a 4070 Ti) $899 Nov - stay warm this winter


Uzi

6 minutes ago, layten said:

Palit: Blocks your path.

[image: a Palit 3080 Ti]

My 3080 Ti: looks not even its mother could love.

 

 

That's the weak old light-side Palit. They've brought a new level of brilliance with absolute dark power this time; it cares not for mothers, just to crush other cards' dreams, and your foot if you drop it out of the box.


Tbf you need to compare those to 3090 launch prices, as it's a 90-series product, and the price-to-perf jump per pound/dollar from the 3090 to the 4090 is actually really good. Objectively speaking, as a titan-class/halo product aimed at extreme enthusiasts only.

 

The current 4080 stack needs to be discontinued and reshuffled. Next year the 16GB 4080 needs to be slotted in at the 12GB model's price point, and the inevitable 4080 Ti needs to take the 16GB 4080's slot. The 12GB needs to be discontinued and sold, slightly cut down, as a 4070 for way less money.

 

TSMC fabs, and fabs in general, are really expensive now. You'll never see 3080 pricing again.


1 hour ago, Uzi said:

Tbf you need to compare those to 3090 launch prices, as it's a 90-series product, and the price-to-perf jump per pound/dollar from the 3090 to the 4090 is actually really good. Objectively speaking, as a titan-class/halo product aimed at extreme enthusiasts only.

No doubt the 4090 is the best 'value' here from what we've seen, but surely any other judgements have to be reserved for numbers that haven't been cherry-picked and massaged by Nvidia? On the titles they showed numbers for without DLSS it looked like a 70% jump at best, and the RRP is about 70% more than you can buy a new 3090 Ti for at the moment. I'm not sure price/performance is really increasing all that much without DLSS3, which seems to pretty much be turning your TV's motion smoothing on, having it look horrible and calling it double the frame rate.
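
A quick back-of-envelope check on that ratio. These are illustrative figures only: Nvidia's $1,599 launch RRP for the 4090 against an assumed ~$950 street price for a new 3090 Ti, and street prices shift by the day:

# Does a ~70% performance jump at ~70% more money actually move price/perf?
# All figures are assumptions for illustration; street prices fluctuate.
rtx_4090_rrp = 1599        # USD, Nvidia's launch RRP
rtx_3090ti_street = 950    # USD, assumed street price for a new 3090 Ti

perf_ratio = 1.70          # ~70% faster at best in the non-DLSS numbers shown
price_ratio = rtx_4090_rrp / rtx_3090ti_street

print(f"price ratio:     {price_ratio:.2f}x")                # ~1.68x
print(f"perf per dollar: {perf_ratio / price_ratio:.2f}x")   # ~1.01x, i.e. roughly flat

If both ratios land around 1.7x, perf per dollar barely moves, which is the point.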

 

Edit: Not trying to discourage you from buying one BTW, if top performance is within your reach then why not! I'm just not sure we've seen anything like enough to even really guess at the value yet.


21 minutes ago, Rsdio said:

No doubt the 4090 is the best 'value' here from what we've seen, but surely any other judgements have to be reserved for numbers that haven't been cherry-picked and massaged by Nvidia? On the titles they showed numbers for without DLSS it looked like a 70% jump at best, and the RRP is about 70% more than you can buy a new 3090 Ti for at the moment. I'm not sure price/performance is really increasing all that much without DLSS3, which seems to pretty much be turning your TV's motion smoothing on, having it look horrible and calling it double the frame rate.

 

Edit: Not trying to discourage you from buying one BTW, if top performance is within your reach then why not! I'm just not sure we've seen anything like enough to even really guess at the value yet.

Of course. Basing it on a few things. One is specs, which are set in stone and not up for debate, hence my CUDA core value calculation. That's not even taking into account the lift from architecture, which usually makes one core in a new gen more efficient than one from the last gen. The 4090 has almost 17k CUDA cores, and this alone makes it a monster in comparison to the core count of a 3090 or 3090 Ti. There is no logical way the card won't provide significant perf value vs the 3090 unless the card is broken.
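
For illustration, here's roughly what that sum looks like with the published core counts and US launch RRPs. It's a deliberately crude sketch: it ignores clocks and the per-core architectural lift mentioned above, so if anything it understates Ada:

# Rough CUDA-cores-per-dollar comparison at launch RRPs.
# Core counts are the published specs; prices are US launch RRPs.
cards = {
    "RTX 3090":    (10496, 1499),   # cores, USD
    "RTX 3090 Ti": (10752, 1999),
    "RTX 4090":    (16384, 1599),
}

for name, (cores, rrp) in cards.items():
    print(f"{name}: {cores / rrp:.1f} CUDA cores per dollar")

# RTX 3090:    7.0 cores per dollar
# RTX 3090 Ti: 5.4 cores per dollar
# RTX 4090:   10.2 cores per dollar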

 

Next, and this does deserve a pinch of salt: Nvidia have released some game perf indications, such as AC Valhalla, which has neither DLSS nor RT, and the indication is a large architectural jump for raster alone. Nothing stupid like 2x+. AC Valhalla has no Nvidia tech in the game engine that they could use to cherry-pick to any significant degree.

 

I've not taken anything with DLSS 3.0 into account, as I don't know how it works. The hardware alone is sufficient to make an educated conclusion without knowing final perf numbers: it completely annihilates the previous 90 series without a significant price jump.


Isn't the CUDA core count also up by about 70% vs. the 3090 Ti? I don't know how linearly that tends to scale, but since it ties in I'm kinda feeling like that might end up being the real-world, apples-to-apples jump, trying to read between the lines of it all. Obviously you're totally right about price/performance based on their launch RRPs, but at current prices it seems like price/performance might be pretty similar?

 

Obviously price/performance of the 90 vs the 80 is way better than usual but that definitely says more about what they've done with the 80s than anything else.


32 minutes ago, Rsdio said:

Isn't the CUDA core count also up by about 70% vs. the 3090 Ti? I don't know how linearly that tends to scale, but since it ties in I'm kinda feeling like that might end up being the real-world, apples-to-apples jump, trying to read between the lines of it all. Obviously you're totally right about price/performance based on their launch RRPs, but at current prices it seems like price/performance might be pretty similar?

 

Obviously price/performance of the 90 vs the 80 is way better than usual but that definitely says more about what they've done with the 80s than anything else.

The 4090 is not the best bang for buck considering you can get a brand new 3090 for £700! I wouldn't say it is a bargain by any means, and we don't know where DLSS 3.0 will end up once it matures. I've seen some knee-jerk theory posts and videos about it online, though, from people who don't even have the hardware yet.

 

What I would say is that it's the best bang-for-buck halo product at launch, and for those wanting the newest and highest power possible right now. I'd buy the 3090, but it's simply too weak for the graphics power I want right now with some of the games I play. Whereas the 4080 is dogshit from every angle: price and perf.

 

Edit: if anyone searches for the below on Amazon, do your research on the seller.

[screenshot of the Amazon listing]

 


Hmm, very tempting! Feedback looks alright but slightly weird combination of product lines lol.

 

They also have a Gigabyte 6900 XT for £513 and an Asus TUF 3080 12GB for £588. Do you reckon the 3090 is worth the jump? I have no experience of Palit, but having had several Asus cards I'd definitely be more confident with them.


Just now, Rsdio said:

Hmm, very tempting! Feedback looks alright but slightly weird combination of product lines lol.

 

They also have a Gigabyte 6900 XT for £513 and an Asus TUF 3080 12GB for £588. Do you reckon the 3090 is worth the jump? I have no experience of Palit, but having had several Asus cards I'd definitely be more confident with them.

I did some research; I'd steer clear!

 

[screenshots from that research]

 

Seems like horse shit

 


Just seen that AMD have dropped the RRPs on the 6000 series; the 6900 XT going from $999 to $699 caught my eye. Should be another step towards the outgoing cards coming down in general, at least.

 

Edit: Ignore me, seems like it was just a clickbait article based on Newegg prices :facepalm:


11 minutes ago, Moz said:

I'm still not convinced by raytracing. It smacks of PhysX to me. With my 3080 it's never worth the performance hit. Having said that, going back and breathing new life into old games with raytracing is a great idea.


That's why you need a 4090: turn everything up and never worry about performance again until the next generation of consoles.


1 hour ago, Benny said:

That's some extremely clever tech that I will never use to interfere with the art direction of old games.

I'm going to add so much raytracing to Full Throttle and there's nothing you can do to stop me!*

 

 

* I appreciate there's no way it can do anything to a 2D, SCUMM-engine based game


21 hours ago, Rsdio said:

I'm not sure price/performance is really increasing all that much without DLSS3, which seems to pretty much be turning your TV's motion smoothing on, having it look horrible and calling it double the frame rate.

 

BTW, I've seen this claim going around from random YouTubers, and it's frankly silly for them to make guesses without testing it out in person, and to assume Nvidia's AI tech is the same as what TV options do (it would be like calling DLSS the same as how TVs upscale content with their built-in scalers). But Digital Foundry just posted this (and they have a 4090 which they are currently testing). Going to be interesting to see their full video.

 

[screenshot of the Digital Foundry post]


Yeah, I was just being glib there, and anyone putting stuff on YouTube that's likely to be parroted everywhere shouldn't be as daft. But personally I'm not aware of any kind of real-time interpolation that doesn't have a significant (to me) downside, be it the image or lag being introduced, or both. Which isn't to say it won't ever happen, of course, and it'd be nice if it were soon.

 

Also, Alex on DF seems like a sound guy and his videos are always worth watching, but he really loves his upscaling in general and also seems to favour slathering motion blur on everything, so I don't necessarily find that his subjective findings jibe with my own preferences as far as image quality goes. Motion clarity has never come across as much of a priority for him when it's really important to me - I doggedly clung on to my 1080p plasma until two months ago and only bought a C1 because 120Hz BFI was being phased out! Once his test for this comes out I might get on DF's Patreon for the uncompressed video, though, because I'm definitely curious about how it'll look.

 

On a related note, am I the only one who's starting to struggle a bit with the number of 'features' you have to juggle and tweak between GPU drivers and games these days? Various types of traditional vsync (which was always a minefield purely in itself), VRR, upscaling options, input lag tweaks (tying in with various window mode options), RT, and now motion interpolation, and that's without Windows getting involved with its own 'enhancements', how complex modern TVs are, and the weird interactions between all of the above that can often arise. Optimising games could always be something of a pain, I know, but I feel like either it's getting worse or I'm just getting old.


5 hours ago, Rsdio said:

On a related note, am I the only one who's starting to struggle a bit with the number of 'features' you have to juggle and tweak between GPU drivers and games these days? Various types of traditional vsync (which was always a minefield purely in itself), VRR, upscaling options, input lag tweaks (tying in with various window mode options), RT, and now motion interpolation, and that's without Windows getting involved with its own 'enhancements', how complex modern TVs are, and the weird interactions between all of the above that can often arise. Optimising games could always be something of a pain, I know, but I feel like either it's getting worse or I'm just getting old.

 

It can be a huge nightmare, but I really enjoy eking out performance by tweaking settings... I think I spend more time faffing about with that than I do playing the bloody things sometimes! I literally spent an hour earlier today running through the benchmark on CP2077 with various settings on and off just to see what effects they had on the frame rate... I found it strangely enjoyable :P

 

On another tangent, I am seriously wondering just how loud the new 40xx cards are going to be when pushed. I'm a headphone user anyway, but by God, running those benchmarks caused some hefty noise when I took the headset off... and the heat. Winter's gonna be lovely in my office/gamesroom/delete as appropriate.

 

 


I didn't use to mind it as much, but I think that was when your tweaking was more or less confined to the in-game options and maybe, at worst, an ini/cfg file or whatever. Your display was set and forget, the drivers more or less were too, and Windows stayed more out of the way. Things feel messier and harder to keep track of now - it can be hard to figure out whether this setting over here is helping or hindering that related one over there.

 

The absolute best is when you're fiddling endlessly and it's a game featuring the message everyone wants to see: 'Changes will not take effect until the game is restarted.'

