DanCar posted Yesterday 07:59 PM

AMD Radeon AI Pro R9700 32GB Amazon and Newegg $1,300

$1,300

Newegg
19 Comments 2,579 Views
Deal Details
Great for LLMs. How fast can you run qwen3 coder on this?

https://www.newegg.com/asrock-cha...6814930143
https://www.newegg.com/gigabyte-g...6814932822 OOS

Might run models that don't fit in a 4090's 24 GB of VRAM faster. Compare to the NVIDIA 5090 with 32 GB of VRAM, which costs $3.3K.
Specs:
https://www.amd.com/content/dam/a...asheet.pdf
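For anyone curious how the "run Qwen3 Coder on this" part looks in practice, here is a minimal sketch of loading a Qwen coder checkpoint with a ROCm build of PyTorch. The model ID and sizes are illustrative only, and it assumes a working ROCm install (AMD GPUs show up through the regular torch.cuda API on ROCm builds), not anything specific to this listing.

```python
# Sketch: run a Qwen coder model on an AMD GPU with a ROCm build of PyTorch.
# Assumes ROCm-enabled torch + accelerate, and a checkpoint that fits in 32 GB
# of VRAM; the model id below is illustrative, not a recommendation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-Coder-7B-Instruct"  # swap for the Qwen3 coder size you want
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # fp16/bf16; quantize further if the model is too big
    device_map="auto",           # places layers on the GPU (ROCm exposes it as "cuda")
)

prompt = "Write a Python function that merges two sorted lists."
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=256)
print(tok.decode(out[0], skip_special_tokens=True))
```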

Community Voting: Deal Score +14 (Good Deal)


19 Comments


CyanWriter8569 · Yesterday 09:02 PM · 628 Posts · Joined Dec 2021
When did this release????
ROB.E.REIN · Yesterday 09:51 PM · 5,791 Posts · Joined Nov 2013
It shows $1,556.87 on Amazon, and without CUDA, it will not be as great as NVIDIA for AI, correct?
PT89 · Yesterday 10:02 PM · 221 Posts · Joined Aug 2023

Quote from ROB.E.REIN :
It shows $1,556.87 on Amazon, and without CUDA, it will not be as great as NVIDIA for AI, correct?
ROCm and HIP are at parity with CUDA for the vast majority of AI workloads. There are still a few use cases where CUDA is needed, but that number has been shrinking rapidly. This means this card is a pretty big deal and is the 32GB AI value king.
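For illustration, on a ROCm build of PyTorch the AMD card is driven through the same torch.cuda API that CUDA code uses, which is why most PyTorch-based workloads run unmodified. A minimal sanity check (not from this thread, just a generic ROCm install check):

```python
# Sanity check on a ROCm build of PyTorch: AMD GPUs are exposed through the
# same torch.cuda interface CUDA cards use, so most existing code just works.
import torch

print(torch.cuda.is_available())       # True on a working ROCm install
print(torch.cuda.get_device_name(0))   # prints the AMD device name
print(torch.version.hip)               # HIP/ROCm version string; None on CUDA builds
```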
Bennybobenny · Yesterday 10:30 PM · 5 Posts · Joined Mar 2021
Good luck finding a model that runs on AMD.
n0p (Pro) · Yesterday 10:48 PM · 591 Posts · Joined Jun 2021
Quote from Bennybobenny :
Good luck finding a model that runs on AMD.
It's actually possible to run models like Llama, Mistral, and Stable Diffusion.
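For example, a Stable Diffusion run with the diffusers library looks the same on ROCm as on CUDA. The sketch below assumes a ROCm PyTorch install, and the checkpoint name is just illustrative:

```python
# Sketch: Stable Diffusion via diffusers on an AMD GPU (ROCm PyTorch build).
# The checkpoint id is illustrative; any SD 1.5/SDXL-class model loads the same way.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # ROCm devices are addressed as "cuda" in PyTorch

image = pipe("a watercolor painting of a graphics card on a workbench").images[0]
image.save("out.png")
```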
DanCar (Original Poster) · Yesterday 10:50 PM · 307 Posts · Joined Aug 2007
Quote from CyanWriter8569 :
When did this release????
Today 😀
DanCar (Original Poster) · Yesterday 10:53 PM · 307 Posts · Joined Aug 2007
Wow, Amazon raised their prices a few times today. Better to shop at Newegg.

CoralLanguage185 · Yesterday 10:55 PM · 2 Posts · Joined Aug 2024
Lots of bugs running LLMs on an AMD card, don't waste your time.
PT89 · Yesterday 11:03 PM · 221 Posts · Joined Aug 2023

Quote from CoralLanguage185 :
Lots of bugs running LLMs on an AMD card, don't waste your time.
Not true at all, LLMs work perfectly on AMD.
DanCar (Original Poster) · Yesterday 11:12 PM · 307 Posts · Joined Aug 2007
The review below had an issue with CUDA but not with AMD.
https://www.youtube.com/watch?v=Mt_8iNHOHFQ
Shake-N-Bake · Yesterday 11:34 PM · 5,418 Posts · Joined May 2020
Quote from DanCar :
Wow, Amazon raised their prices a few times today. Better to shop at Newegg.
It's never better to buy from Newegg. Sometimes it's cheaper, but it's never better.
veyvey · Today 12:04 AM · 436 Posts · Joined Dec 2017
Quote from PT89 :
ROCm and HIP are at parity with CUDA for the vast majority of AI workloads. There are still a few use cases where CUDA is needed, but that number has been shrinking rapidly. This means this card is a pretty big deal and is the 32GB AI value king.
No, not really.

The memory bandwidth and raw performance on this are miles behind a 5090. Maybe if the 5090 were selling for $4K this would make sense. Not to mention FP4.

Nearly double the performance and triple the memory bandwidth, paired with CUDA and FP4, for less than double the price is a no-brainer IMO.

If you only had $1,300, even dual 3090s with NVLink make more sense (lacks FP8 though).

Or if you just need memory, then the AMD 395 is an option.

If you look at Flux/Stable Diffusion performance it's around 3060-4070 level, and for LLMs you mainly want memory and memory bandwidth, which this card is somewhat lacking at this price point.
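Rough numbers behind that claim, as a back-of-envelope sketch: single-stream decode speed for a dense model is roughly bounded by memory bandwidth divided by the bytes read per token. The bandwidth figures below are approximate published specs (an assumption, not measurements from this thread):

```python
# Back-of-envelope: decode tokens/sec for a dense model is roughly bounded by
# memory_bandwidth / bytes_read_per_token (the whole weight set streams each token).
# Bandwidth figures are approximate published specs, not benchmarks.
cards_gbps = {
    "R9700 Pro (32 GB)": 640,    # assumed ~640 GB/s (GDDR6, 256-bit bus)
    "RTX 3090 (24 GB)": 936,
    "RTX 5090 (32 GB)": 1792,
}

model_weights_gb = 30  # e.g. a ~30 GB q8 checkpoint read once per generated token

for name, bw in cards_gbps.items():
    ceiling = bw / model_weights_gb
    print(f"{name}: ~{ceiling:.0f} tok/s ceiling for a {model_weights_gb} GB dense model")
```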
DanCar (Original Poster) · Today 02:34 AM · 307 Posts · Joined Aug 2007
Quote from Shake-N-Bake :
It's never better to buy from Newegg. Sometimes it's cheaper, but it's never better.
Why? I've been a happy Newegg customer many times.
DanCar (Original Poster) · Today 03:39 AM · 307 Posts · Joined Aug 2007
Quote from veyvey :
No, not really.

The memory bandwidth and raw performance on this are miles behind a 5090. Maybe if the 5090 were selling for $4K this would make sense. Not to mention FP4.

Nearly double the performance and triple the memory bandwidth, paired with CUDA and FP4, for less than double the price is a no-brainer IMO.

If you only had $1,300, even dual 3090s with NVLink make more sense (lacks FP8 though).

Or if you just need memory, then the AMD 395 is an option.
If your goal is to run a 60 GB model at q4, why pay $3K when you can pay only $1,300? I'm OK with the slower speed. This will run Qwen3 Coder 30B at q8 at > 30 tokens per second, which is plenty fast. I don't need 120 tps; Claude CLI is slower than that.
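Worth noting for that estimate: the 30B Qwen3 coder variant is a mixture-of-experts model, so only the active parameters have to be read per generated token, which is why the speed can land well above the dense-model bandwidth ceiling. A rough sketch, where the ~640 GB/s bandwidth and ~3B active parameters are assumptions taken from public specs, not measurements:

```python
# Rough decode ceiling for a MoE model: only the *active* parameters are read
# per token, so the bandwidth-bound estimate uses active params, not total size.
# Figures are assumptions from public specs, not benchmarks from this deal.
bandwidth_gbps = 640          # assumed R9700 memory bandwidth
active_params_b = 3           # assumed ~3B active params/token for the 30B MoE variant
bytes_per_param = 1           # q8 quantization is roughly 1 byte per weight

bytes_per_token_gb = active_params_b * bytes_per_param
print(f"~{bandwidth_gbps / bytes_per_token_gb:.0f} tok/s bandwidth-bound ceiling")
# Real-world throughput lands well below this, but ">30 tok/s" is plausible.
```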


veyvey · Today 04:11 AM · 436 Posts · Joined Dec 2017
Quote from DanCar :
If your goal is to run a 60 GB model at q4, why pay $3,500 when you can pay only $1,300? I'm OK with the slower speed. This will run Qwen3 Coder 30B at q8 at > 30 tokens per second, which is plenty fast.
Those are terrible speeds. I got over 35 T/s on a 265K with 96 GB of memory plus a single 3090 running 120B gpt-oss. The total system cost less than the 9700 Pro alone.

Also, the 5090 is not even close to $3.5K. You can get them at $2K MSRP with some patience, $1,800-1,900 with an open-box/CC deal, or $2.1K-2.5K if you're lazy.

If you just want cheap memory, then get a dual 3090 or XTX system. Or better yet, if you don't care about speed, the AMD AI 395 system would be perfect.

The 9700 Pro has already been shown to be slower than the XTX, which makes sense given the lower memory bandwidth. Although FP8 is nice.

AI/LLM usage is just not the strength of this card; it's just something it's capable of. I think where this card's value shines is in productivity tasks like SolidWorks or AutoCAD, where CUDA doesn't make a big difference (like in Blender) but the "Pro" driver support is still relevant.
