Slickdeals is community-supported.  We may get paid by brands or deals, including promoted items.
Frontpage

NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X GPU Card (Refurb) Expired

$700
(Select Stores) + Free Store Pickup
+84 Deal Score
108,999 Views
Select Micro Center Stores have NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Graphics Card (Refurbished, 9001G1362510RF2) on sale for $699.99. Select free store pickup where available.
  • Note: Availability for pickup will vary by location and is very limited.
Thanks to Deal Hunter SehoneyDP for sharing this deal.

Features:
  • 24GB GDDR6X 384-bit Memory
  • 7680 x 4320 Maximum Resolution
  • PCIe 4.0
  • Full Height, Triple Slot
  • DisplayPort 1.4a, HDMI 2.1

Editor's Notes & Price Research

  • About this Store:
    • All products come with 60 days of Complimentary Tech Support
    • This product may be returned within 30 days of purchase (details).

Original Post

Edited March 16, 2024 at 05:04 AM.
Micro Center [microcenter.com] has NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Video Graphics Card GPU (Refurbished) on sale for $699.99. Select free store pickup where available.
If you purchase something through a post on our site, Slickdeals may get a small share of the sale.




xlongx · L3: Novice · Joined Nov 2019 · 212 Posts · 141 Reputation
03-15-2024 at 09:24 AM
It's a great card, but for gamers the 4070 Ti Super with its 3-year warranty seems like a better choice.
SehoneyDP (Original Poster) · Deal Hunter · Joined Nov 2022 · 2,450 Posts · 12,279 Reputation
03-15-2024 at 09:34 AM
Quote from xlongx :
It's a great card, but for gamers the 4070 Ti Super with its 3-year warranty seems like a better choice.
I was definitely thinking the same thing; for just $100 more, the longevity should be worth it, right? I'm no expert, but the big difference seems to be the VRAM? 24GB vs. 16GB: is that a big difference, and does it help anything in particular?
Outrager · L8: Grand Teacher · Joined Dec 2007 · 3,991 Posts · 3,436 Reputation
03-15-2024 at 09:40 AM
Quote from SehoneyDP :
I was definitely thinking the same thing; for just $100 more, the longevity should be worth it, right? I'm no expert, but the big difference seems to be the VRAM? 24GB vs. 16GB: is that a big difference, and does it help anything in particular?
Some newer games need the extra VRAM, but from what I remember it's mostly because they're so unoptimized. Like for The Last of Us Part 1 they eventually released a patch that lowered the VRAM requirement for the ultra textures.
Also, maybe for scientific stuff you'll want more VRAM, but I have no idea if you'll want more VRAM or compute power for stuff like that.
wpc · L9: Master · Joined Jun 2010 · 5,150 Posts · 793 Reputation
03-15-2024 at 09:42 AM
VRAM matters a lot when you start gaming at higher resolutions, 4K specifically.
SehoneyDP (Original Poster) · Deal Hunter · Joined Nov 2022 · 2,450 Posts · 12,279 Reputation
03-15-2024 at 09:46 AM
Quote from Outrager :
Some newer games need the extra VRAM, but from what I remember it's mostly because they're so unoptimized. Like for The Last of Us Part 1 they eventually released a patch that lowered the VRAM requirement for the ultra textures.
Also, maybe for scientific stuff you'll want more VRAM, but I have no idea if you'll want more VRAM or compute power for stuff like that.
Quote from wpc :
VRAM matters a lot when you start gaming at higher resolutions, 4K specifically.
Ah, that makes sense, ty. I've got a 1440p monitor for now, so I probably don't really see a need. I do feel like I could use more for PCVR, though; I'll have to look at which one is better for that. I can still wait until the 5000 series is out, though, tbh.
Meteo · Trollslayer · Joined Apr 2007 · 800 Posts · 102 Reputation
03-15-2024 at 10:09 AM
Tempted to get a second 3090 for NVLink.
HappyAccident · L4: Apprentice · Joined Feb 2021 · 455 Posts · 205 Reputation
03-15-2024 at 10:47 AM
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will tell you exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
duijver · L7: Teacher · Joined Apr 2006 · 2,317 Posts · 343 Reputation
03-15-2024 at 10:51 AM
Quote from HappyAccident :
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will tell you exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone that has done it.
HappyAccident · L4: Apprentice · Joined Feb 2021 · 455 Posts · 205 Reputation
03-15-2024 at 11:09 AM
Quote from duijver :
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone that has done it.
If you have a 24GB card, just download koboldcpp, which is a ~250MB executable, and grab a GGUF model off Hugging Face; a 20B or 34B model is about 15GB. Then run it. Total time, including installing the card and drivers, is about an hour.

Check out /r/LocalLLaMA on Reddit.
Last edited by HappyAccident March 15, 2024 at 02:31 PM.
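For a sense of why a 20B-34B GGUF model lands around 15GB and fits a 24GB card, here is a back-of-envelope sketch. The bits-per-weight values, the ~10% overhead factor, and the 2GB context budget are ballpark assumptions of mine, not official GGUF figures:

```python
def gguf_size_gb(n_params_billion: float, bits_per_weight: float,
                 overhead: float = 1.1) -> float:
    """Rough GGUF file size: params * bits/8, plus ~10% (assumed) for
    embeddings, quantization scales, and metadata."""
    bytes_total = n_params_billion * 1e9 * bits_per_weight / 8 * overhead
    return bytes_total / 1e9

def fits_in_vram(n_params_billion: float, bits_per_weight: float,
                 vram_gb: float = 24.0, context_budget_gb: float = 2.0) -> bool:
    """Leave a couple of GB (assumed) for KV cache and CUDA buffers."""
    return gguf_size_gb(n_params_billion, bits_per_weight) + context_budget_gb <= vram_gb

# A 34B model at ~4.5 bits/weight (a mid-range quant) still squeezes into 24GB:
print(round(gguf_size_gb(34, 4.5), 1))  # 21.0
print(fits_in_vram(34, 4.5))            # True
print(fits_in_vram(70, 4.5))            # False -- a 70B model needs offloading
```

By the same arithmetic, a 20B model at ~5 bits/weight comes out near 14GB, in line with the ~15GB cited above.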
duijver · L7: Teacher · Joined Apr 2006 · 2,317 Posts · 343 Reputation
03-15-2024 at 11:29 AM
Quote from HappyAccident :
If you have a 24GB card, just download koboldcpp, which is a ~250MB executable, and grab a GGUF model off Hugging Face; a 20B or 34B model is about 15GB. Then run it. Total time, including installing the card and drivers, is about an hour.

Check out /r/LocalLLaMA on Reddit.
Thank you. I will check out Reddit and see if it makes sense to pick up this card. I do have a 3060 12GB coming this weekend; perhaps that could make do?

I'm not sure my Precision 7810 system (dual Xeon v3 beasts) will handle the power requirements of the 3090, and the 4070 Ti Super might be worth looking at, as it is more in line with the 2080S in there now.
jpswaynos · New User · Joined Jun 2013 · 9 Posts · 18 Reputation
03-15-2024 at 11:44 AM
Quote from duijver :
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone that has done it.
If you're on Windows you can get set up in minutes: https://lmstudio.ai/

Hardware minimums are pretty low to run a 7B-parameter LLM, but they ramp up substantially if you want to run a 30B or 60B-parameter LLM and get more than a couple of tokens/s.
Last edited by jpswaynos March 15, 2024 at 11:46 AM.
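The tokens/s figure mentioned above is largely memory-bandwidth-bound: generating each token requires reading roughly the whole model from VRAM once. A rough upper-bound sketch (936 GB/s is the 3090's spec-sheet memory bandwidth; KV-cache traffic and kernel overhead are ignored, so real throughput comes in lower):

```python
def ideal_decode_tps(model_size_gb: float, bandwidth_gb_s: float) -> float:
    """Upper bound on autoregressive decode speed: one full read of the
    model weights per generated token, limited by GPU memory bandwidth."""
    return bandwidth_gb_s / model_size_gb

RTX_3090_BW_GBS = 936.0  # spec-sheet memory bandwidth of the RTX 3090

# A ~15GB quantized model (the 20B/34B case above) tops out around 62 tok/s;
# a ~4GB 7B quant could in principle approach 234 tok/s.
print(round(ideal_decode_tps(15.0, RTX_3090_BW_GBS), 1))  # 62.4
print(round(ideal_decode_tps(4.0, RTX_3090_BW_GBS), 1))   # 234.0
```

This is also why a model that spills past 24GB into system RAM slows to a crawl: PCIe bandwidth is an order of magnitude below VRAM bandwidth.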
static.CH03 · New User · Joined Mar 2024 · 11 Posts · 32 Reputation
03-15-2024 at 11:46 AM
I believe you need a power adapter to move to the required 12-pin connector? Anyway, it's not included in the box on the refurbished model.
sandwich · L6: Expert · Joined Jul 2008 · 1,991 Posts · 265 Reputation
03-15-2024 at 12:00 PM
The base 3090 has an issue with overheating VRAM modules on the backside. Be sure to have a fan pointing at the backplate to keep those modules cool.
Page 1 of 10