Expired • Posted by SehoneyDP • Mar 15, 2024

NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X GPU Card (Refurb)

(Select Stores) + Free Store Pickup

$700

Micro Center
149 Comments 117,670 Views
Deal Details
Select Micro Center Stores have NVIDIA GeForce RTX 3090 Founders Edition Dual Fan 24GB GDDR6X PCIe 4.0 Graphics Card (Refurbished, 9001G1362510RF2) on sale for $699.99. Select free store pickup where available.
  • Note: Availability for pickup will vary by location and is very limited.
Thanks to Deal Hunter SehoneyDP for sharing this deal.

Features:
  • 24GB GDDR6X 384-bit Memory
  • 7680 x 4320 Maximum Resolution
  • PCIe 4.0
  • Full Height, Triple Slot
  • DisplayPort 1.4a, HDMI 2.1

Editor's Notes

Written by jimmytx | Staff
  • About this Store:
    • All products come with 60 days of Complimentary Tech Support
    • This product may be returned within 30 days of purchase (details).

Original Post

Written by SehoneyDP

Community Voting

Deal Score
+84
Good Deal


149 Comments


xlongx
Mar 15, 2024 • 266 Posts • Joined Nov 2019


It's a great card, but for gamers, it seems the 4070 Ti Super with its 3-year warranty is a better choice.
SehoneyDP (Original Poster)
Mar 15, 2024 • 2,451 Posts • Joined Nov 2022
Quote from xlongx :
It's a great card, but for gamers, it seems the 4070 Ti Super with its 3-year warranty is a better choice.
I was definitely thinking the same thing. For just $100 more, the longevity should be worth it, right? I'm no expert, but the big difference seems to be the VRAM: 24GB vs. 16GB. Is there a big difference in consideration of that? Does it help with anything in particular?
Outrager
Mar 15, 2024 • 4,034 Posts • Joined Dec 2007
Quote from SehoneyDP :
I was definitely thinking the same thing. For just $100 more, the longevity should be worth it, right? I'm no expert, but the big difference seems to be the VRAM: 24GB vs. 16GB. Is there a big difference in consideration of that? Does it help with anything in particular?
Some newer games need the extra VRAM, but from what I remember it's mostly because they're so unoptimized. Like for The Last of Us Part 1 they eventually released a patch that lowered the VRAM requirement for the ultra textures.
Also, maybe for scientific stuff you'll want more VRAM, but I have no idea if you'll want more VRAM or compute power for stuff like that.
wpc (Pro)
Mar 15, 2024 • 5,226 Posts • Joined Jun 2010


The VRAM matters a lot when you start gaming at higher resolutions, 4K specifically.
SehoneyDP (Original Poster)
Mar 15, 2024 • 2,451 Posts • Joined Nov 2022
Quote from Outrager :
Some newer games need the extra VRAM, but from what I remember it's mostly because they're so unoptimized. Like for The Last of Us Part 1 they eventually released a patch that lowered the VRAM requirement for the ultra textures.
Also, maybe for scientific stuff you'll want more VRAM, but I have no idea if you'll want more VRAM or compute power for stuff like that.
Quote from wpc :
The VRAM matters a lot when you start gaming at higher resolutions, 4K specifically.
Ah, that makes sense, ty. I've got a 1440p monitor for now, so I probably don't really see a need. I do feel like I could use more for PCVR, though. I'll have to look at which one is better for that. I can still wait until the 5000 series is out, though, tbh.
Meteo
Mar 15, 2024 • 882 Posts • Joined Apr 2007
Tempted to get a second 3090 for NVLink.
HappyAccident
Mar 15, 2024 • 506 Posts • Joined Feb 2021


This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.


duijver
Mar 15, 2024 • 2,345 Posts • Joined Apr 2006
Quote from HappyAccident :
This card is great if you want to play with local language models. Tired of GPT refusing to answer your questions about how to dispose of that dead body in your freezer? Run your own local model that has been 'un-aligned' and it will let you know exactly which brand of acid to get at the hardware store to remove that pesky problem. All you need is a boatload of VRAM to run it.
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone who has done it.
HappyAccident
Mar 15, 2024 • 506 Posts • Joined Feb 2021


Quote from duijver :
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone who has done it.
If you have a 24GB card, just download koboldcpp, which is a ~250MB executable, and get a GGUF model off Hugging Face -- a 20B or 34B model is about 15GB -- then run it. Total time, including installing the card and drivers, is ~1hr.

Check out /r/localllama on reddit.
Last edited by HappyAccident March 15, 2024 at 02:31 PM.
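HappyAccident's sizing numbers can be sanity-checked with some back-of-envelope math. A minimal sketch, assuming ~4.5 bits per weight for a typical 4-bit GGUF quantization and a rough 3 GB allowance for KV cache and driver overhead (both figures are assumptions, not measurements):

```python
# Back-of-envelope: does a quantized GGUF model fit in 24 GB of VRAM?
# The bits-per-weight and overhead figures below are rough assumptions.

BITS_PER_WEIGHT = 4.5  # typical 4-bit GGUF quantization (e.g. Q4_K_M)
OVERHEAD_GB = 3.0      # KV cache, activations, driver buffers (rough guess)

def model_size_gb(params_billion: float) -> float:
    """Approximate in-VRAM size of a quantized model, in GB."""
    return params_billion * 1e9 * BITS_PER_WEIGHT / 8 / 1e9

def fits_in_vram(params_billion: float, vram_gb: float = 24.0) -> bool:
    """Check whether the model plus overhead fits in the given VRAM."""
    return model_size_gb(params_billion) + OVERHEAD_GB <= vram_gb

for b in (7, 20, 34, 70):
    print(f"{b}B: ~{model_size_gb(b):.1f} GB, fits in 24 GB: {fits_in_vram(b)}")
```

Under these assumptions a 20B model comes out around 11 GB and a 34B around 19 GB, which is in the same ballpark as the "about 15GB" claim above; a 70B model would not fit in 24 GB without offloading layers to system RAM.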
duijver
Mar 15, 2024 • 2,345 Posts • Joined Apr 2006
Quote from HappyAccident :
If you have a 24GB card, just download koboldcpp, which is a ~250MB executable, and get a GGUF model off Hugging Face -- a 20B or 34B model is about 15GB -- then run it. Total time, including installing the card and drivers, is ~1hr.

Check out /r/localllama on reddit.
Thank you. I will check out Reddit and see if it makes sense to pick up this card. I do have a 3060 12GB coming this weekend, and perhaps that could make do?

I am not sure if my Precision 7810 system (dual Xeon v3 beasts) will handle the power requirements of the 3090; the 4070 Ti Super might be worth looking at, as it is more in line with the 2080S in there now.
jpswaynos
Mar 15, 2024 • 10 Posts • Joined Jun 2013


Quote from duijver :
What is the investment time + hardware to play around with your own GPT / LLM model?

I can play with GPT to see what it spits out - I am curious to hear from someone who has done it.
If you're on Windows you can get set up in minutes: https://lmstudio.ai/

Hardware minimums are pretty low to run a 7B-parameter LLM, but they ramp up substantially if you want to run a 30B or 60B-parameter LLM and get more than a couple of tokens/s.
Last edited by jpswaynos March 15, 2024 at 11:46 AM.
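The tokens/s point above has a simple physical ceiling: generating one token streams every model weight through the GPU once, so memory bandwidth bounds decode speed. A rough sketch, assuming the RTX 3090's spec-sheet bandwidth of ~936 GB/s and 4-bit quantization (real throughput is lower; this is an upper bound, not a benchmark):

```python
# Rough decode-speed ceiling: tokens/s <= memory_bandwidth / model_bytes,
# since each generated token reads every weight once. Figures are assumptions.

BANDWIDTH_GBS = 936.0   # RTX 3090 spec-sheet bandwidth; a ceiling, not a promise
BITS_PER_WEIGHT = 4.5   # typical 4-bit quantization

def max_tokens_per_sec(params_billion: float) -> float:
    """Bandwidth-bound upper limit on decode speed for a quantized model."""
    model_gb = params_billion * BITS_PER_WEIGHT / 8  # model size in GB
    return BANDWIDTH_GBS / model_gb

for b in (7, 30, 60):
    print(f"{b}B: <= ~{max_tokens_per_sec(b):.0f} tok/s")
```

This matches the shape of the claim: a 7B model has a ceiling in the hundreds of tokens/s, while a 60B model drops to a few dozen even before any real-world overhead.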
static.CH03
Mar 15, 2024 • 11 Posts • Joined Mar 2024
I believe you need a power adapter in order to move to the required 12-pin connector? Anyway, it's not included in the box on the refurbished model.
sandwich
Mar 15, 2024 • 2,112 Posts • Joined Jul 2008


The base 3090 has an issue with overheating VRAM modules on the backside. Be sure to have a fan pointing at the backplate to keep those modules cool.
static.CH03
Mar 15, 2024 • 11 Posts • Joined Mar 2024


Quote from jpswaynos :
If you're on Windows you can get set up in minutes: https://lmstudio.ai/

Hardware minimums are pretty low to run a 7B-parameter LLM, but they ramp up substantially if you want to run a 30B or 60B-parameter LLM and get more than a couple of tokens/s.
Seconding LM Studio (I personally use the top download for Dolphin, and there is a phrase you can use to bypass the ethics parameters).


