Slickdeals is community-supported.  We may get paid by brands for deals, including promoted items.
Heads up, this deal has expired. Want to create a deal alert for this item?
expired Posted by SehoneyDP • Mar 15, 2024

NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X GPU Card (Refurb)

(Select Stores) + Free Store Pickup

$700

Micro Center
149 Comments 119,048 Views
Deal Details
Select Micro Center Stores have NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Graphics Card (Refurbished, 9001G1362510RF2) on sale for $699.99. Select free store pickup where available.
  • Note: Availability for pickup will vary by location and is very limited.
Thanks to Deal Hunter SehoneyDP for sharing this deal.

Features:
  • 24GB GDDR6X 384-bit Memory
  • 7680 x 4320 Maximum Resolution
  • PCIe 4.0
  • Full Height, Triple Slot
  • DisplayPort 1.4a, HDMI 2.1

Editor's Notes

Written by jimmytx | Staff
  • About this Store:
    • All products come with 60 days of Complimentary Tech Support
    • This product may be returned within 30 days of purchase (details).

Original Post

Written by SehoneyDP

Community Voting

Deal Score
+84


Top Comments

HappyAccident • 517 Posts • 229 Reputation
If you have a 24GB card, just download koboldcpp, which is a ~250MB executable, get a GGUF model off Hugging Face (a 20B or 34B model is about 15GB), and run it. Total time, including installing the card and drivers: ~1 hour.

Check out /r/localllama on reddit.
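The size figures in that comment are easy to sanity-check. A back-of-envelope sketch (my own, not from the post; the ~4.5–5.5 bits-per-weight range is typical of common GGUF quants like Q4_K_M and Q5_K_M, and real files add some metadata overhead):

```python
def gguf_size_gb(params_billions: float, bits_per_weight: float = 5.0) -> float:
    """Rough on-disk size of a quantized GGUF model file."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 20B model at ~5 bits/weight is ~12.5 GB; a 34B model at ~4.5 bits is ~19 GB,
# both of which leave room on a 24GB card for context (KV cache).
print(f"20B @ ~5.0 bpw: ~{gguf_size_gb(20, 5.0):.1f} GB")
print(f"34B @ ~4.5 bpw: ~{gguf_size_gb(34, 4.5):.1f} GB")
```

Either lands comfortably inside 24GB of VRAM, which is why the card draws this crowd.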
HappyAccident • 517 Posts • 229 Reputation
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
xlongx • 273 Posts • 153 Reputation
It's a great card, but for gamers it seems the 4070 Ti Super, with its 3-year warranty, is a better choice.

148 Comments


HappyAccident • 517 Posts • Joined Feb 2021 • Mar 16, 2024
Quote from sky0102 :
mang, this used to be over $2.5k at the peak of the pandemic and mining craze with an HP rig with a 10th gen intel.
1. Invent time machine
2. Buy up all the 3090s
3. Go back 3 years and sell 3090s for $$$$$
4. Realize this is what I used my damn time machine for and smack myself
6
1
cyberandroid • 796 Posts • Joined Sep 2004 • Mar 16, 2024
90 day warranty kills it for me!
52club • 813 Posts • Joined Dec 2007 • Mar 16, 2024
One thing I noticed is they already marked down the price a bit on the 3080 refurbs they got. I'd love to hear the story behind these cards suddenly appearing in relatively large numbers at a specific retailer like they have. All that said, unless you need it now, I wouldn't be surprised if they continue to mark these down.
1
pmperry • 3,109 Posts • Joined Dec 2012 • Mar 16, 2024
Quote from wpc :
the vram matters a lot when you start gaming at higher resolutions. 4k specifically
https://youtu.be/TQzwOBXFUYg?feature=shared

Not really benefiting all that much, honestly.
ThirstyCruz • 2,807 Posts • Joined Jul 2020 • Mar 16, 2024
Quote from HappyAccident :
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
Precisely... dead bodies and bombs? You need a local LLM. Nvidia needs to stop being so damn cheap with VRAM. Yeah, I know it's expensive, but it's utterly insane that 24GB is the max before getting into the tens of thousands of dollars.

While that extra 8GB of VRAM can help, LLMs are such hogs that it won't suddenly enable you to do THAT much more.

Soon, either Nvidia will have legit competition and/or LLM compute will become so much more efficient that we will look back and laugh at the 16GB or 24GB VRAM days.

It might take a few years, but there's a lot of human compute working on this every second, and I will applaud the demise of Nvidia's stronghold.
2
ThirstyCruz • 2,807 Posts • Joined Jul 2020 • Mar 16, 2024
Quote from 52club :
One thing I noticed is they already marked down the price a bit on the 3080 refurbs they got. I'd love to hear the story behind these cards suddenly appearing in relatively large numbers at a specific retailer like they have. All that said, unless you need it now, I wouldn't be surprised if they continue to mark these down.
They will come down, because those that NEED them will buy an H100/A100 instead. LLM hobbyists will wait for the 50 series. The reality TODAY: buying a 3090 or 4080 for LLMs isn't going to make or break you.
1
HappyAccident • 517 Posts • Joined Feb 2021 • Mar 16, 2024
Quote from ThirstyCruz :
They will come down, because those that NEED them will buy an H100/A100 instead. LLM hobbyists will wait for the 50 series. The reality TODAY: buying a 3090 or 4080 for LLMs isn't going to make or break you.
The 50 series won't have any more VRAM than we have now, so it would not be smart to wait for another generation when the 3090 is sitting here now for $700. Unless you are training models constantly, you are not greatly limited by the compute power of these cards; you are limited by how many you can fit into a case without blowing up your power supply, since you max out VRAM by running cards in parallel.
1
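The power-supply constraint described above can be sketched numerically. The RTX 3090's nominal board power is 350 W; the base-system draw and 20% headroom figures below are illustrative assumptions, not from the post:

```python
def max_cards(psu_watts: int, card_watts: int = 350,
              base_system_watts: int = 150, headroom: float = 0.8) -> int:
    """How many GPUs fit a PSU budget, keeping ~20% headroom for transient spikes."""
    budget = psu_watts * headroom - base_system_watts
    return max(0, int(budget // card_watts))

for psu in (750, 1000, 1600):
    n = max_cards(psu)
    print(f"{psu} W PSU -> {n} x 3090 ({n * 24} GB total VRAM)")
```

Under these assumptions, even a 1000 W unit tops out at one card; stacking VRAM with multiple 3090s quickly pushes you into 1600 W territory.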


Yukikaze (Pro) • 294 Posts • Joined Jan 2013 • Mar 16, 2024


Use Tom's hardware GPU hierarchy charts to help you decide what GPU to buy specifically for gaming.

https://www.tomshardware.com/revi...,4388.html

Looking at these charts, it seems like the 3090 is roughly on par with the 4070 super at 1080p and 1440p. At 4k the 3090 beats the 4070 super and is roughly on par with the 4070 ti. Of course, in a game that is heavily VRAM dependent, you'll probably see a slightly smoother experience on the 3090 but only at 4k and maybe maxed out 1440p.

Personally, for gaming specifically, I'd buy either the 4070 super or 4070 ti super. You'd be using noticeably less power thus you won't need to change your current PSU (maybe you're only running say a 650w Gold?) and be roughly at the same level of performance depending on your target resolution.
2
peteer01 • 171 Posts • Joined Aug 2013 • Mar 16, 2024
As others have said, this is not the card for you if you're looking for gaming.

But if you're looking for anything AI related, especially things like Stable Diffusion, the 24GB is the reason this is your choice. You can get ~10% faster generation with a 3080 Ti refurb, but generation time takes a SIGNIFICANT hit any time your VRAM becomes the limiting factor. 24GB is the most VRAM available on a consumer-level card, so at 24GB you've eliminated VRAM limitations as much as you can.

Combinations of higher resolutions, upscaling, LoRAs and ControlNet will make extra VRAM critical. You need a minimum of 16GB to train locally as well.

If you want gaming performance and a warranty, a new 4000 series makes more sense. If you want 24GB of VRAM and Nvidia, this is the cheapest entry available.

The price increase and performance increase at 768x768 generation is almost the same percentage between the 3090 and 3090 Ti at MicroCenter, so the 3090 Ti also makes sense if you've got the extra $100 and want the extra speed.

The 4090 is roughly 250% more, so unless you already knew you wanted a $2,000 card, the 3090 and 3090 Ti are the only real choices for someone looking for 24GB of VRAM.
Last edited by peteer01 March 16, 2024 at 09:25 AM.
1
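One way to frame the comparison above is dollars per GB of VRAM. A quick sketch using only the prices quoted in this thread ($700 for the 3090, "the extra $100" for the 3090 Ti, and the $2,000 4090), so treat them as a snapshot:

```python
# (price_usd, vram_gb) as quoted in this thread
cards = {
    "RTX 3090 (refurb)":    (700, 24),
    "RTX 3090 Ti (refurb)": (800, 24),
    "RTX 4090 (new)":       (2000, 24),
}

for name, (price, vram_gb) in cards.items():
    print(f"{name}: ${price / vram_gb:.2f} per GB of VRAM")
```

All three offer the same 24GB, so for a pure VRAM-per-dollar buyer the refurb 3090 is the clear floor.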
starcaptor • 3,572 Posts • Joined Sep 2003 • Mar 16, 2024
Anyone with experience running LLMs know if running two of the 3060 12gb cards (currently $245 each at woot) would be comparable to one 3090, strictly for LLM purposes, not for gaming?
peteer01 • 171 Posts • Joined Aug 2013 • Mar 16, 2024
Quote from starcaptor :
Anyone with experience running LLMs know if running two of the 3060 12gb cards (currently $245 each at woot) would be comparable to one 3090, strictly for LLM purposes, not for gaming?
Yes: they are not comparable. Some training and running options will require 16GB (or more) on a single card. Better to get as much VRAM on a single card as possible.
Last edited by peteer01 March 16, 2024 at 09:25 AM.
YW55 • 504 Posts • Joined Apr 2021 • Mar 16, 2024
Before dropping money on a 24GB card, try running a GGUF model split between the GPU and CPU to see if the larger model is even worth it to you. Also, just for LLMs, you can get away with AMD cards, since ROCm (AMD's CUDA equivalent) forks are available now. A 6900 XT 16GB can run up to a 34B EXL2 model at about 15 t/s.

For gaming there are AMD 6950XT, 7900GRE, 7800XT that are cheaper and have similar performance with the exception of ray tracing.
Last edited by YW55 March 16, 2024 at 08:28 AM.
1
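That ~15 t/s figure is consistent with token generation being memory-bandwidth bound: each generated token streams essentially the whole model through the GPU's memory. A rough ceiling estimate (my own sketch; the ~512 GB/s bandwidth and ~15 GB model size are illustrative assumptions, and real throughput lands below the ceiling due to compute and KV-cache overhead):

```python
def peak_tokens_per_sec(mem_bandwidth_gbs: float, model_size_gb: float) -> float:
    """Bandwidth-bound upper limit on generation speed for a fully resident model."""
    return mem_bandwidth_gbs / model_size_gb

# A ~15 GB quantized 34B model on a card with ~512 GB/s of memory bandwidth:
print(f"theoretical ceiling: ~{peak_tokens_per_sec(512, 15):.0f} t/s")
```

Observed speeds of roughly half the ceiling are typical, which is why splitting layers off to the CPU (with tens of GB/s of bandwidth) hurts so much.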
Maverick1776 • 1,302 Posts • Joined Nov 2013 • Mar 16, 2024
As tempting as this is, remember there is no DLSS 3.0 / 3.5 support. And specific RTX settings in Cyberpunk won't work on 3000 series cards.
Like many others have said, picking up a 4070 ti or ti super is a better idea.
slimdunkin117 • 1,713 Posts • Joined Sep 2018 • Mar 16, 2024
Quote from wpc :
did you just bring up AMD in an NVIDIA deal thread?
This is a deal site. I bring up deals
2


justletmepost1 • 55 Posts • Joined Dec 2010 • Mar 16, 2024
FYI Ollama now supports AMD cards:

https://ollama.com/blog/amd-preview

So the question becomes: is it better to buy a refurbished 3090 or a new 7900 XTX with the same amount of VRAM at a similar price?

If you are just looking to use LLMs, an argument can be made that the 7900 XTX is a better investment, but if you're looking to use it predominantly for image generation, such as Stable Diffusion, then the Nvidia card is better.
1

