This collaborative space allows users to contribute additional information, tips, and insights to enhance the original deal post. Feel free to share your knowledge and help fellow shoppers make informed decisions.
Deal History includes data from multiple reputable stores, such as Best Buy, Target, and Walmart. The lowest price among stores for a given day is selected as the "Sale Price".
Sale Price does not include sale prices at Amazon unless a deal was posted by a community member.
Are the leaks accurate that a 4070 will be competitive with this and at a sub $500 price point?
Looking to put a build together, but it doesn't make sense to if that's accurate.
I do believe a 4070 will fall somewhere between the 3080 and 3090. The current-gen 3070 traded blows with the 2080 Ti. The only thing that wasn't close was the $500 price tag, up until recently!
Are the leaks accurate that a 4070 will be competitive with this and at a sub $500 price point?
Looking to put a build together, but it doesn't make sense to if that's accurate.
Honestly, I sincerely doubt it. Turing was a minor improvement over Pascal. Ampere represented a pretty large improvement over Turing, but with a high power envelope. Sources show pretty uniformly that Nvidia's power envelope is even higher this time around. I don't see that translating into big performance gains. My gut tells me this is going to be another Turing generation, which wasn't a meaningful improvement. There's also the issue of availability; even though crypto is down, I still don't think inventory is going to be good.
AMD's cards are the wild card this year. Rumors claim they've made pretty big year over year gains while maintaining power efficiency. I'm not seeing those same sorts of rumors from Nvidia.
Got this card from Micro Center last week and I totally regret it. Even at idle, the backplate gets too hot to touch, and load temperatures are way too high. Seems like Nvidia is ignoring efficiency and just making their cards more and more power hungry to crush the competition.
Got this card from Micro Center last week and I totally regret it. Even at idle, the backplate gets too hot to touch, and load temperatures are way too high. Seems like Nvidia is ignoring efficiency and just making their cards more and more power hungry to crush the competition.
That GDDR6X runs HOT. It's not uncommon for the memory modules to hit 90-100°C under load. They're rated up to 105-110°C before throttling, but damn, that's toasty.
Honestly, I sincerely doubt it. Turing was a minor improvement over Pascal. Ampere represented a pretty large improvement over Turing, but with a high power envelope. Sources show pretty uniformly that Nvidia's power envelope is even higher this time around. I don't see that translating into big performance gains. My gut tells me this is going to be another Turing generation, which wasn't a meaningful improvement. There's also the issue of availability; even though crypto is down, I still don't think inventory is going to be good.
AMD's cards are the wild card this year. Rumors claim they've made pretty big year over year gains while maintaining power efficiency. I'm not seeing those same sorts of rumors from Nvidia.
Knowing Nvidia, they, as a company, do not want to play second fiddle to AMD. Otherwise, they will start to lose market share. Nvidia WILL do whatever it can to be number one, including raising power usage. I can see Nvidia pushing 600-900 watt SKUs if they start to lag behind AMD. AMD is really pushing themselves too. A lot of the rumors are beginning to gain traction; at this point, I would even call them leaks instead of rumors. We're in July, and the next generation of GPUs will launch within 3-4 months.
Knowing Nvidia, they, as a company, do not want to play second fiddle to AMD. Otherwise, they will start to lose market share. Nvidia WILL do whatever it can to be number one, including raising power usage. I can see Nvidia pushing 600-900 watt SKUs if they start to lag behind AMD. AMD is really pushing themselves too. A lot of the rumors are beginning to gain traction; at this point, I would even call them leaks instead of rumors. We're in July, and the next generation of GPUs will launch within 3-4 months.
I'm sure nVidia doesn't *want* to lose to AMD, but they appear to be stuck. Ampere required a pretty big increase in power envelope to begin with, and it scaled less and less efficiently as more power was pushed into it. A lot of the speculation around nVidia's 4000 series echoes the froth around the 3000 series, which was pretty overplayed. (For example, a lot of the wild performance improvements over previous gens turned out to require DLSS.)
AMD's chiplet design, however, clearly has a lot of gas left in the tank. The RX 6000 series was their first real go at the crown (I don't count the Radeon VII; that was basically vaporware), and while they didn't get it (and their OEMs going crazy with pricing didn't help), they delivered way, way more power-efficient performance than nVidia.
AMD could still bork things (they're actually kind of great at doing that on the GPU side), but I'm not convinced that nVidia's 4000 series is going to be the huge improvement the internet wants it to be. I hope I'm wrong, and this is the generation where we get awesome, performant cards for cheap. But I'm not holding my breath.
Are the leaks accurate that a 4070 will be competitive with this and at a sub $500 price point?
Looking to put a build together, but it doesn't make sense to if that's accurate.
Absolutely zero chance a 4070 will be as powerful as this. Even when they held back performance at the launch of the 2000 series because AMD couldn't compete, the 2070 didn't come anywhere close to the 1080 Ti. They also used ray tracing as an excuse back then for why performance didn't take a big leap. I think people will regret holding off for something they won't be able to buy for a while, and the performance will be less than they expect.
Absolutely zero chance a 4070 will be as powerful as this. Even when they held back performance at the launch of the 2000 series because AMD couldn't compete, the 2070 didn't come anywhere close to the 1080 Ti. They also used ray tracing as an excuse back then for why performance didn't take a big leap. I think people will regret holding off for something they won't be able to buy for a while, and the performance will be less than they expect.
I remember that the 2000 series excuse was ray tracing. My RTX 2070 wasn't as powerful as my GTX 1080. Hopefully AMD and Nvidia release a moderate boost. I am assuming the 4080 will be equivalent to a 3090, but with 16GB max.