Expired • Posted by SehoneyDP • Mar 15, 2024

NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X GPU Card (Refurb)

(Select Stores) + Free Store Pickup

$700

Micro Center
149 Comments • 118,750 Views
Deal Details
Select Micro Center Stores have NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Graphics Card (Refurbished, 9001G1362510RF2) on sale for $699.99. Select free store pickup where available.
  • Note: Availability for pickup will vary by location and is very limited.
Thanks to Deal Hunter SehoneyDP for sharing this deal.

Features:
  • 24GB GDDR6X 384-bit Memory
  • 7680 x 4320 Maximum Resolution
  • PCIe 4.0
  • Full Height, Triple Slot
  • DisplayPort 1.4a, HDMI 2.1

Editor's Notes

Written by jimmytx | Staff
  • About this Store:
    • All products come with 60 days of Complimentary Tech Support
    • This product may be returned within 30 days of purchase (details).

Original Post

Written by SehoneyDP

Community Voting

Deal Score
+84

Top Comments

If you have a 24GB card, just download koboldcpp which is a 250mb exec, and get a GGUF model off huggingface -- a 20B or 34B model is about 15GB, then run it. Total time, including installing card and drivers ~1hr.

Check out /r/localllama on reddit.
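For scale, the "about 15GB" figure matches simple arithmetic: a GGUF file is roughly the parameter count times the bits per weight. A rough sketch, assuming ~4.8 bits/weight for a typical 4-bit quant such as Q4_K_M (the exact rate varies by quant type):

```python
def gguf_size_gb(params_billions: float, bits_per_weight: float = 4.8) -> float:
    """Rough on-disk size of a quantized GGUF model, in GB."""
    return params_billions * bits_per_weight / 8

# A 20B model lands around 12 GB and a 34B around 20 GB at ~4.8 bpw,
# bracketing the ~15 GB figure mentioned above.
for b in (20, 34):
    print(f"{b}B @ ~4.8 bpw ≈ {gguf_size_gb(b):.1f} GB")
```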
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
It's a great card. But for gamers, the 4070 Ti Super with 3 years of warranty seems like a better choice.

148 Comments


xslickIDx
Mar 19, 2024 • 151 Posts • Joined Oct 2008
Quote from Delph :
I ordered this online yesterday and drove 1h one way today to pick it up from Tustin, CA. The manager said that they were told not to sell these as they are missing a cable. I tried to convince him to sell it to me anyway, but no luck. I recommend calling the store before you drive to pick it up.
I complained when I went to pick up an item that was ready, but it wasn't there. Manager was going to send me the item for free, but I felt bad so I only asked for what was worth my time for free. No way would I let it go that easily driving an hr one way.
Delph
Mar 20, 2024 • 36 Posts • Joined Aug 2008
Quote from xslickIDx :
I complained when I went to pick up an item that was ready, but it wasn't there. Manager was going to send me the item for free, but I felt bad so I only asked for what was worth my time for free. No way would I let it go that easily driving an hr one way.
I actually asked them if I could pay right now and have it delivered, but they said no.
natecmd
Mar 20, 2024 • 241 Posts • Joined Oct 2012
Quote from hiro :
Last time I tried to use the Razer enclosure on my Mac (around 2020), it only worked with an AMD card, and I could not get it to work with my NVIDIA card.
If you manage to get this to work, please also keep in mind the limitation of the hardware. Thunderbolt 3 has a bandwidth of 5 GB/s, while PCIe 3.0 x16 has 16 GB/s. Bandwidth may or may not matter to your workflow, depending on how often you transfer data between CPU and GPU.
Just set up a 3090 in a Razer core X (tb3) connected to a Framework 13 AMD Ryzen 7840 with 64gb ram.

I was using ollama on Windows 11, and 7B and 13B parameter Llama 2 models were somewhat usable without the GPU, but after getting all of the NVIDIA drivers and CUDA stuff installed and the 3090 recognized by ollama, it absolutely flies. Way better experience! And you can use 4-bit quantized models up to about 30B parameters in the 24GB of VRAM.

Then I got WSL2 installed, and ollama and open-webui running in a Docker container with GPU access, and it is great to play with uncensored models, multimodal models, and the API to learn about LLMs and LangChain etc.

I had tried to use oobabooga/textgenwebui a few months ago but had issues with virtual environment and dependencies with the install, so I love the simplicity of my docker setup with ollama…

I agree with the others that if you want to play with LLMs, get the biggest vram card you can afford, ideally 24GB.
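The "~30B at 4-bit in 24GB" rule of thumb checks out arithmetically: the quantized weights take roughly 18-20GB, leaving a little headroom for KV cache and CUDA context. A rough feasibility check, where the 3GB overhead allowance is a ballpark assumption, not a measured figure:

```python
def fits_in_24gb(params_billions: float,
                 bits_per_weight: float = 4.8,
                 overhead_gb: float = 3.0,
                 vram_gb: float = 24.0) -> bool:
    """Do the quantized weights plus a ballpark KV-cache/context
    allowance fit in the card's VRAM?"""
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb + overhead_gb <= vram_gb

# 13B and 30-34B fit comfortably at ~4.8 bpw; 70B does not.
for p in (13, 30, 34, 70):
    print(f"{p}B fits in 24 GB: {fits_in_24gb(p)}")
```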
capncrunk
Mar 20, 2024 • 86 Posts • Joined Feb 2012
Deal is dead
MrBabycake
Mar 20, 2024 • 21 Posts • Joined Jan 2008
Went to pick up the card today and the register would not allow it to be sold to me. MC employees told me that the home office had restricted the sale of the item, so they couldn't ring it up and there was nothing they could do about it. The manager on duty immediately made it right by grabbing a refurb 3090 Ti off the shelf and selling it to me with a price match of the 3090. Super happy right now.

FYI to everyone. You won't be able to walk out the door with the card you reserved. I had good luck with it getting made right, but there's no guarantee here.
elvsrbad
Mar 21, 2024 • 83 Posts • Joined Jul 2017
Quote from natecmd :
Just set up a 3090 in a Razer core X (tb3) connected to a Framework 13 AMD Ryzen 7840 with 64gb ram.

I was using ollama on Windows 11, and 7B and 13B parameter Llama 2 models were somewhat usable without the GPU, but after getting all of the NVIDIA drivers and CUDA stuff installed and the 3090 recognized by ollama, it absolutely flies. Way better experience! And you can use 4-bit quantized models up to about 30B parameters in the 24GB of VRAM.

Then I got WSL2 installed, and ollama and open-webui running in a Docker container with GPU access, and it is great to play with uncensored models, multimodal models, and the API to learn about LLMs and LangChain etc.

I had tried to use oobabooga/textgenwebui a few months ago but had issues with virtual environment and dependencies with the install, so I love the simplicity of my docker setup with ollama…

I agree with the others that if you want to play with LLMs, get the biggest vram card you can afford, ideally 24GB.
Textgenwebui has an installer now. When I'd originally run it I had to manually set up everything, and honestly it was a pain. Once I got it working, though, it was great. I only have a 3060 Ti with 8GB of VRAM, so I'm seriously limited in what models I can run (7B models run well, but past that not so much). Since AMD GPUs have so much more VRAM for the price, I wonder how well they'd do. I know they don't have CUDA, so I'm guessing it's not at all apples to apples.
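The same back-of-envelope math explains the 8GB ceiling described above: at ~4.8 bits/weight (a typical 4-bit quant), a 7B model's weights are around 4GB with room left for KV cache, while a 13B model's weights alone are nearly 8GB. A quick illustration:

```python
# Approximate 4-bit (Q4_K_M-style, ~4.8 bits/weight) weight footprint.
for params_b in (7, 13):
    weights_gb = params_b * 4.8 / 8
    print(f"{params_b}B ≈ {weights_gb:.1f} GB of weights, before KV cache/overhead")
```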
sandwich
Mar 21, 2024 • 2,115 Posts • Joined Jul 2008
Quote from slackerjj :
I run my 3090 Ti a bit undervolted and it runs much cooler and stays stable.
3090 Ti doesn't have the same VRAM overheating issues as the regular 3090.


Frugorilla
Mar 23, 2024 • 4 Posts • Joined May 2012
Quote from Delph :
I ordered this online yesterday and drove 1h one way today to pick it up from Tustin, CA. The manager said that they were told not to sell these as they are missing a cable. I tried to convince him to sell it to me anyway, but no luck. I recommend calling the store before you drive to pick it up.
same thing happened to me
Big Wang (Pro)
Mar 26, 2024 • 1,205 Posts • Joined Oct 2005
Question: I got the 3090 FE. It has a 12-pin power port. However, my box didn't come with an adapter, and my PSU only has 8-pin connectors. Would I be fine buying a dual 8-pin to 12-pin adapter off Amazon? Or do I need a specific cable for my PSU?
erthian
Mar 27, 2024 • 988 Posts • Joined Jun 2005
Quote from Big Wang :
Question: I got the 3090 FE. It has a 12-pin power port. However, my box didn't come with an adapter, and my PSU only has 8-pin connectors. Would I be fine buying a dual 8-pin to 12-pin adapter off Amazon? Or do I need a specific cable for my PSU?
Where did you find one in stock still?
Big Wang (Pro)
Mar 27, 2024 • 1,205 Posts • Joined Oct 2005
Quote from erthian :
Where did you find one in stock still?
I got it last week and I just hadn't had time to install it until now. I guess that was before they realized the cable was missing?
SplendidStar8982
Apr 19, 2024 • 1 Post • Joined Dec 2022
They have a few more available today (4/19) at the Parkville MD location.
dahknee
Apr 27, 2024 • 41 Posts • Joined Jul 2010
Quote from SplendidStar8982 :
They have a few more available today (4/19) at the Parkville MD location.
Thanks for posting. I was able to get one at the Rockville, MD store. They have 5 left in stock.
