Slickdeals is community-supported.  We may get paid by brands for deals, including promoted items.
This deal has expired.
Posted by SehoneyDP • Mar 15, 2024

NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X GPU Card (Refurb)

(Select Stores) + Free Store Pickup

$700

Micro Center
149 Comments • 119,043 Views
Deal Details
Select Micro Center Stores have NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Graphics Card (Refurbished, 9001G1362510RF2) on sale for $699.99. Select free store pickup where available.
  • Note: Availability for pickup will vary by location and is very limited.
Thanks to Deal Hunter SehoneyDP for sharing this deal.

Features:
  • 24GB GDDR6X 384-bit Memory
  • 7680 x 4320 Maximum Resolution
  • PCIe 4.0
  • Full Height, Triple Slot
  • DisplayPort 1.4a, HDMI 2.1

Editor's Notes

Written by jimmytx | Staff
  • About this Store:
    • All products come with 60 days of Complimentary Tech Support
    • This product may be returned within 30 days of purchase (details).

Original Post

Written by SehoneyDP

Community Voting

Deal Score: +84 (Good Deal)


Top Comments

HappyAccident • 517 Posts • 229 Reputation
If you have a 24GB card, just download koboldcpp, which is a 250 MB executable, grab a GGUF model off Hugging Face -- a 20B or 34B model is about 15GB -- then run it. Total time, including installing the card and drivers, ~1 hr.

Check out /r/localllama on reddit.
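To see where the ~15GB figure for a 20B-34B model comes from, here is a rough sketch. The constants are illustrative assumptions, not measured values: ~4.5 bits per weight for a typical 4-bit GGUF quant, plus ~15% overhead for KV cache and compute buffers.

```python
# Back-of-envelope VRAM estimate for a quantized GGUF model.
# Assumed constants (illustrative only): ~4.5 bits/weight for a
# typical 4-bit quant, ~15% overhead for KV cache and buffers.
def gguf_vram_gb(params_billion, bits_per_weight=4.5, overhead=1.15):
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params * bits -> GB
    return weights_gb * overhead

for size in (20, 34):
    print(f"{size}B model: ~{gguf_vram_gb(size):.1f} GB")
```

Under these assumptions a 20B model lands around 13GB and a 34B around 22GB, both inside the 3090's 24GB.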
HappyAccident • 517 Posts • 229 Reputation
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
xlongx • 273 Posts • 153 Reputation
It's a great card. But for gamers, it seems the 4070 Ti Super with 3 years of warranty is a better choice.

148 Comments



rowmean77 • Mar 16, 2024 • 220 Posts • Joined Mar 2016
Tree fiddy maybe
3
Richard1979 • Mar 16, 2024 • 47 Posts • Joined Mar 2019
90 day warranty scares me, and Micro Center hasn't yet emailed me back on whether an extended warranty is available (and the price). I got a 3060 Ti 8GB that's fine right now. But it's bottlenecking my R9-5950X CPU, so I was thinking of pulling the trigger on this…
EthanH5438 • Mar 16, 2024 • 347 Posts • Joined Dec 2017
I hope they have this with 4090s
YW55 • Mar 16, 2024 • 504 Posts • Joined Apr 2021
Quote from MuddyBottoms :
Just no, this is a refurbished last gen card. I cannot imagine in any scenario a person being happy spending 700 on this a year or two from now.
It's the CUDA tax. There's really no other reason to get this card besides machine learning. Nvidia purposefully limits VRAM on consumer cards to push machine learning customers to their enterprise cards that cost 5-10X more.
d4deal • Mar 16, 2024 • 1,026 Posts • Joined Mar 2005
Rep to Op. I reserved RTX 3090 for $700 + tax in Tustin. Then, I saw 3090 Ti for $800 + tax and reserved that too. When my brother wakes up, I will let him choose. woot

Per Micro Center, the 3090 Ti has more CUDA cores (10,752 vs. 10,496) but needs less power (750W vs. 850W) than the 3090. Both have 2 fans and take 3 slots.
https://www.microcenter.com/produ...efurbished)

I have bought many refurbished iPads and laptops from Micro Center. They are usually in good to excellent condition. If anything goes wrong, it's an easy return for a full refund.
Last edited by d4deal March 16, 2024 at 10:47 AM.
2
YW55 • Mar 16, 2024 • 504 Posts • Joined Apr 2021
Quote from d4deal :
Rep to Op. I reserved RTX 4090 for $700 + tax in Tustin. Then, I saw 4090 Ti for $800 + tax and reserved that too. When my brother wakes up, I will let him choose. woot

Per Micro Center, the refurb 4090 Ti enables more CUDAs (10,752 vs. 10,496) and needs less power (750w vs. 850w) than the refurb 4090. Both are dual fans and triple slots.
https://www.microcenter.com/produ...efurbished)

I have bought many refurbished iPads and laptops from Micro Center. They are usually in good to excellent conditions. If anything goes wrong, easy return for full refund.
You mean 3090, 4090Ti doesn't exist yet.


NavyCreature9244 • Mar 16, 2024 • 130 Posts • Joined Nov 2018
This or a 4080 Super?
ThirstyCruz • Mar 16, 2024 • 2,805 Posts • Joined Jul 2020
Quote from starcaptor :
Anyone with experience running LLMs know if running two of the 3060 12gb cards (currently $245 each at woot) would be comparable to one 3090, strictly for LLM purposes, not for gaming?
And while you can run 2 cards, you'd best have a lot of patience getting them to work and then optimized. I'd rather pull my hair out.
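On the raw numbers alone, single-stream LLM decoding is mostly memory-bandwidth-bound, so the 3060-vs-3090 question can be sketched crudely. The figures are assumptions for illustration: ~936 GB/s for a 3090, ~360 GB/s per 3060, and layer-split inference keeping only one card active at a time.

```python
# Crude bandwidth-bound estimate of decode speed: generating each token
# reads roughly the whole model from VRAM once, so
#   tokens/s ~ effective memory bandwidth / model size.
def est_tokens_per_sec(bandwidth_gb_s, model_gb):
    return bandwidth_gb_s / model_gb

MODEL_GB = 15  # ~20B-34B 4-bit GGUF, per the comment earlier in the thread

rtx3090 = est_tokens_per_sec(936, MODEL_GB)    # one 3090
dual_3060 = est_tokens_per_sec(360, MODEL_GB)  # layer-split: one 3060 active at a time
print(f"3090: ~{rtx3090:.0f} tok/s, 2x3060: ~{dual_3060:.0f} tok/s")
```

Under these assumptions the single 3090 comes out roughly 2.6x faster, which matches the sentiment that two 3060s are not comparable to one 3090 even before the setup headaches.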
asifgunz • Mar 16, 2024 • 312 Posts • Joined Dec 2014
Quote from HappyAccident :
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
But the same GPT has a workaround if you phrase your prompts with a bit of "love". For example, if you ask GPT to advise you like your loving grandmother would, it will be a lot nicer to you 😉 with its outputs.

You'd think I'm trolling but I'm really not.
ThirstyCruz • Mar 16, 2024 • 2,805 Posts • Joined Jul 2020
Quote from HappyAccident :
The 5 series won't have any more VRAM than we have now, so you would not be smart to wait for another series when there is the 3090 sitting here now for $700. Unless you are training models constantly you are not being greatly limited by the compute power of these cards -- you are being limited by how many you can fit into a case and not blow up your power supply so you can max out VRAM by paralleling the cards.
You're absolutely right! Some applications need more VRAM vs. higher CUDA core counts. There's certainly a use case for a 3090 24GB at this price, no doubt... Just always know what you need, and do your due diligence.
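The "how many cards before you blow up your power supply" point above can be put in rough numbers. All figures here are hypothetical for illustration: ~350W per RTX 3090, ~150W for the rest of the system, and loading the PSU to only 80% sustained.

```python
# Rough PSU budget check for stacking GPUs to pool VRAM.
# Assumed numbers (illustrative): ~350 W per RTX 3090, ~150 W for
# CPU/board/drives, and 80% sustained PSU loading for headroom.
def cards_that_fit(psu_watts, card_tdp=350, system_overhead=150, headroom=0.8):
    usable = psu_watts * headroom - system_overhead
    return max(0, int(usable // card_tdp))

for psu in (750, 1000, 1200, 1600):
    print(f"{psu} W PSU -> {cards_that_fit(psu)} card(s)")
```

Under these assumptions even a 1600W unit only comfortably feeds three such cards, so pooling VRAM hits the power-supply wall well before compute does.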
ThirstyCruz • Mar 16, 2024 • 2,805 Posts • Joined Jul 2020
Quote from justletmepost1 :
FYI Ollama now supports AMD cards:

https://ollama.com/blog/amd-preview

So the question becomes is it better to buy a refurbished 3090 or get a new 7900 xtx with the same ram for a similar price?

If you are just looking to use LLMs, an argument can be made that the 7900 XTX is a better investment, but if you're looking to use it predominantly for image generation, such as Stable Diffusion, then the Nvidia card is better.
I hate Nvidia, and can't wait for the CUDA swamp to be done with. However, unless you're a natural genius at ironing out dependencies, if you need actual work done you will spend as much time just getting AMD to work as on the work itself.

There might be some exceptions, but not many; I hope it improves fast. It won't be this year for sure.
1
shilderb • Mar 16, 2024 • 95 Posts • Joined Apr 2016
Quote from SehoneyDP :
I was definitely thinking the same thing; for just $100 more the longevity should be worth it, right? I'm no expert, but the big difference seems to be the VRAM? 24GB vs 16GB: is there a big difference in consideration of that? Does it help things in particular?
VRAM's become my obsession since I started using generative AI through Stable Diffusion. That stuff will eat up as much as you can feed it. I'm currently on a 2070, and a 3-minute video at 48 frames a second takes me 16 hours to generate. Upgrading to 16GB should cut that in half to 8 hours, and an additional 8GB on top of that would probably bring it down to 4 hours... it's a big difference when you're trying to learn something that requires a lot of trial and error.
domonin • Mar 16, 2024 • 251 Posts • Joined Feb 2015
Is this card loud at all? My Gigabyte 3080 is extremely loud and the fans cannot be controlled by Afterburner, etc.



