Quote from ROB.E.REIN:
It shows $1,556.87 on Amazon, and without CUDA, it will not be as great as NVIDIA for AI, correct?
ROCm and HIP are at parity with CUDA for the vast majority of AI workloads. There are still a few use cases where CUDA is needed, but that number has been shrinking rapidly. That makes this card a pretty big deal, and the 32GB AI value king.
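For what it's worth, the software side really is close to drop-in for most PyTorch work: the ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API, so the usual scripts run unmodified. A minimal sketch, assuming a ROCm build of PyTorch and a supported card:

```python
# Quick check that an AMD GPU is driven through the standard torch.cuda API.
# Assumes a ROCm build of PyTorch and a supported Radeon/Instinct card.
import torch

print(torch.cuda.is_available())      # True on a working ROCm install
print(torch.version.hip)              # HIP version string on ROCm builds (None on CUDA builds)
print(torch.cuda.get_device_name(0))  # the card's marketing name

# Same device string as on NVIDIA hardware; the matmul runs on the AMD GPU.
x = torch.randn(4096, 4096, device="cuda")
print((x @ x).shape)
```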
No, not really.
The memory bandwidth and raw performance on this are miles behind a 5090. Maybe if the 5090 were selling for $4k this would make sense. Not to mention FP4.
Nearly double the perf and triple the memory bandwidth, paired with CUDA and FP4, for less than double the price is a no-brainer IMO.
If you only had $1,300, even dual 3090s with NVLink make more sense (though they lack FP8).
Or if you just need memory, then the AMD 395 is an option.
If you look at Flux/Stable Diffusion perf it's around the 3060-4070 level, and for LLMs you just want memory and memory bandwidth, which this is kind of lacking for the price point.
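To put the bandwidth point in rough numbers: single-stream LLM decode is mostly bandwidth-bound, so tokens/s tops out around usable bandwidth divided by the bytes of weights read per token. A back-of-the-envelope sketch; the bandwidth and model-size figures below are placeholder assumptions, not quoted specs:

```python
# Back-of-the-envelope decode ceiling: tokens/s ~= usable bandwidth / bytes read per token.
# All numbers below are illustrative assumptions, not vendor specs.

def decode_ceiling(bandwidth_gb_s: float, model_gb: float, efficiency: float = 0.7) -> float:
    """Rough upper bound when every weight is read once per generated token."""
    return bandwidth_gb_s * efficiency / model_gb

model_gb = 17.0  # e.g. a ~30B dense model at ~4.5 bits/weight (assumed)
for label, bw in [("~640 GB/s card (assumed)", 640.0), ("~1.8 TB/s card (assumed)", 1790.0)]:
    print(f"{label}: ~{decode_ceiling(bw, model_gb):.0f} tok/s ceiling")
```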
If your goal is to run a 60G model at Q4, why pay $3,500 when you can pay only $1,300? I'm OK with the slower speed. This will run Qwen3 Coder 30B at Q8 at >30 tokens per second, which is plenty fast. I don't need 120 tps; Claude CLI is slower than that.
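For anyone sizing this up: the quick way to gauge what fits in 32GB is that the weights take roughly parameter count times bits-per-weight divided by 8, plus a few GB for KV cache and runtime overhead. A small sketch with assumed, illustrative numbers:

```python
# Rough VRAM footprint of a quantized model's weights. Illustrative assumptions only.

def weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of the weights alone, in GB."""
    return params_billion * bits_per_weight / 8

for label, params_b, bits in [("~30B model at Q4 (assumed)", 30.0, 4.5),
                              ("~30B model at Q8 (assumed)", 30.0, 8.5)]:
    print(f"{label}: ~{weight_gb(params_b, bits):.0f} GB of weights, plus KV cache/overhead")
```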
Those are terrible speeds. I got over 35 T/s on a 265K with 96GB of RAM plus a single 3090 running GPT-OSS 120B, and that whole system cost less than the 9700 Pro alone (rough math on why that's possible is sketched after this comment).
Also, the 5090 is not even close to $3.5k. You can get one at the $2k MSRP with some patience, $1,800-1,900 with an open-box or credit card deal, or $2.1k-2.5k if you're lazy.
If you just want cheap memory, then get a dual 3090 or XTX system. Or better yet, if you don't care about speed at all, the AMD AI 395 system would be perfect.
The 9700 Pro has already been shown to be slower than the XTX, which makes sense given its lower memory bandwidth. FP8 support is nice, though.
AI/LLM usage just isn't the strength of this card; it's something it's capable of, not its strong suit. Where this card's value shines is productivity work like SolidWorks or AutoCAD, where CUDA doesn't make a big difference (like it does in Blender) but the "Pro" driver support is relevant.
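On the 3090 + system RAM numbers above: the reason a 120B-class mixture-of-experts model can still decode quickly with most of its weights in system RAM is that only a few billion parameters are active per token. A rough bandwidth-bound sketch; the active-parameter count, bandwidth figures, and VRAM split are all assumptions:

```python
# Rough decode estimate for a sparse (MoE) model partially offloaded to system RAM.
# Every figure here is an assumption for illustration, not a measurement.

ACTIVE_PARAMS_B = 5.0   # assumed active parameters per token for a large MoE model
BYTES_PER_PARAM = 0.55  # assumed ~4.4-bit quantization
GPU_BW_GB_S = 900.0     # assumed GPU memory bandwidth
RAM_BW_GB_S = 90.0      # assumed dual-channel DDR5 bandwidth
VRAM_SHARE = 0.4        # assumed fraction of the active weights resident in VRAM

gb_per_token = ACTIVE_PARAMS_B * BYTES_PER_PARAM  # GB of weights touched per token
t_gpu = gb_per_token * VRAM_SHARE / GPU_BW_GB_S
t_ram = gb_per_token * (1 - VRAM_SHARE) / RAM_BW_GB_S
print(f"~{1 / (t_gpu + t_ram):.0f} tok/s (bandwidth-bound estimate)")
```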
https://www.youtube.com/watch?v=Mt_8iNH