Dr.W posted Sep 28, 2025 03:27 PM

GMKtec EVO-X2 AI Mini PC: Ryzen AI Max+ 395, 128GB LPDDR5X-8000, 2TB SSD, Radeon 8060S $1795

$1,795 (list price $2,800, 35% off)
34 Comments 9,237 Views
Deal Details
Lowest price ever!

eCoupon at checkout: GMKVX2

SPECS:
  • AMD Ryzen AI Max+ 395 3.0GHz Processor
  • 128GB LPDDR5X-8000 Onboard RAM
  • 2TB PCIe 4.0 M.2 2280 SSD; Dual M.2 2280 slots
  • AMD Radeon 8060S Graphics
  • Microsoft Windows 11 Pro
  • SD Memory Card Reader
  • 2.5GbE LAN
  • Wi-Fi 7: RZ717(MT7925) + BT 5.4
  • Ports:
  • Front Panel I/O:
    • 2× USB-A 3.2 Gen2
    • 1× Audio Combo (3.5mm jack)
    • 1× USB Type-C (USB4.0)
    • 1× SD Card Reader (SD4.0, supports SDXC)
    • Buttons: Power, System Fan Lighting Control, Performance Mode Switch
  • Rear Panel I/O:
    • 1× DisplayPort 1.4 HBR3 (8.1 Gbps)
    • 1× HDMI 2.1 (FRL at 8 Gbps)
    • 1× USB-A 3.2 Gen2
    • 2× USB-A 2.0
    • 1× USB Type-C (USB4.0)
    • 1× Audio (3.5mm CTIA)
    • 1× DC IN (5525 interface)
    • 1× RJ45 2.5GbE LAN (Realtek 8125BG)


https://www.gmktec.com/products/a...header_1.1

Community Voting

Deal Score: +9 (Good Deal)

34 Comments

Sep 28, 2025 05:11 PM · 1,228 Posts · Joined Aug 2019

This comment has been rated as unhelpful by Slickdeals users.

morglor · Sep 28, 2025 06:08 PM · 16 Posts · Joined Sep 2015
This or the Framework Desktop? I know the Framework is $2k and has no storage, but I'm wondering if it has any advantages.
ElatedSpaniel543 · Sep 28, 2025 07:44 PM · 720 Posts · Joined Nov 2020
Are people using these to run LLMs? Are these a cheaper alternative to desktop GPUs?
phonytoast · Sep 29, 2025 03:32 AM · 14 Posts · Joined Feb 2014
Quote from ElatedSpaniel543 :
Are people using these to run LLMs? Are these a cheaper alternative to desktop GPUs?
Yes. They aren't as fast as high-end GPUs, but they run pretty well and can handle large models because the GPU shares the system's memory.
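To make the "large models fit because of shared memory" point concrete: a quantized model needs roughly parameters times bits-per-weight divided by 8 bytes, plus some runtime overhead. The sketch below uses my own illustrative model sizes and a hypothetical 24 GB discrete GPU for comparison; none of these figures come from the thread.

# Back-of-the-envelope: does a quantized model fit in a given memory pool?
# Model sizes, quantization levels, and the 24 GB GPU figure are illustrative
# assumptions for this sketch, not specs quoted in the deal.

def model_footprint_gb(params_billion: float, bits_per_weight: float,
                       overhead: float = 1.2) -> float:
    """Approximate memory needed to load the weights, with ~20% runtime overhead."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for name, params, bits in [("8B @ 4-bit", 8, 4), ("70B @ 4-bit", 70, 4), ("70B @ 8-bit", 70, 8)]:
    need = model_footprint_gb(params, bits)
    print(f"{name}: ~{need:.0f} GB -> 24 GB GPU: {'fits' if need <= 24 else 'no'}, "
          f"128 GB unified: {'fits' if need <= 128 else 'no'}")

A 70B-class model at 4-bit is roughly a 40 GB load, which is why it can run on a 128 GB unified-memory box but not on a typical consumer GPU.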
sr27 · Sep 29, 2025 03:33 AM · 546 Posts · Joined Apr 2008
Someone tested one and said these are about three times slower than a desktop with a GPU, though they can still run any model that fits in memory.
Dr.W (Original Poster, Pro) · Sep 29, 2025 04:45 AM · 11,929 Posts · Joined Nov 2020
Quote from SlickCrayon1512 :
Not lowest, $1699 pre order.
Don't spread false information.
This was never $1,699 at pre-order. The price was $1,799 for the 64GB or 96GB RAM variant. Look it up in the SD search bar and see for yourself:
warmonked · Sep 29, 2025 06:07 AM · 108 Posts · Joined Jul 2008
Quote from SlickCrayon1512 :
Not lowest, $1699 pre order.
That's a good point. I'll just get in my time machine and go back to preorder it.


konoplya · Sep 29, 2025 12:08 PM · 3,815 Posts · Joined Dec 2008
Is this a desktop? Why so much RAM?
WhenPigsFly09090 · Sep 29, 2025 12:48 PM · 318 Posts · Joined Apr 2013
Quote from morglor :
This or the Framework Desktop? I know the Framework is $2k and has no storage, but I'm wondering if it has any advantages.
If you plan on using them as a cluster in the future, the Framework Desktop will be the better choice with its 5GbE option.
vd853 · Sep 29, 2025 01:29 PM · 1,373 Posts · Joined Feb 2008
Wow, 128GB of DDR5...
wiouxev · Sep 29, 2025 03:27 PM · 349 Posts · Joined Jan 2010

Our community has rated this post as helpful.

Quote from konoplya :
Is this a desktop? Why so much RAM?
It would probably be considered a mini desktop, yes, since it doesn't have a display. But its size is a big advantage: you can mount it out of the way (even on the back of a monitor), put it in a rack, or just move it around easily.

The high RAM is for running AI locally. You can download LLMs (Large Language Models) and have your own "ChatGPT" running on your computer. LLMs work by accessing that data thousands of times per second.

Memory bandwidth for SSDs: 3-7 GB/s
Memory bandwidth for RAM or VRAM: 500-1,500 GB/s

So that's why the model needs to sit in RAM (regular RAM, or VRAM on a GPU, which is even better).

Sorry if this is TMI or you don't care, just thought I'd answer your question in case you didn't know!
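To put rough numbers on the bandwidth point above: for a dense model, generating one token means reading essentially every weight once, so single-stream decode speed is capped near bandwidth divided by model size. The sketch below uses my own illustrative figures (a 40 GB quantized model, ~7 GB/s for an NVMe SSD, ~256 GB/s for this class of unified LPDDR5X memory, ~1 TB/s for high-end GPU VRAM); it is a back-of-the-envelope estimate, not a benchmark of this machine.

# Rough ceiling on decode speed for a bandwidth-bound dense model.
# All bandwidth and model-size figures below are illustrative assumptions,
# not measurements of the EVO-X2.

def max_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound: each generated token reads roughly all weights once."""
    return bandwidth_gb_s / model_size_gb

model_gb = 40  # e.g. a ~70B-parameter model quantized to 4-bit
for device, bw in [
    ("NVMe SSD (streaming weights)", 7),
    ("unified LPDDR5X (this class of mini PC)", 256),
    ("high-end GPU VRAM", 1000),
]:
    print(f"{device:40s} ~{max_tokens_per_sec(bw, model_gb):6.1f} tokens/s ceiling")

Those ratios line up with the "about three times slower than a desktop with a GPU" observation earlier in the thread, at least for models that fit in either memory pool.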
konoplya · Sep 29, 2025 03:38 PM · 3,815 Posts · Joined Dec 2008
Quote from wiouxev :
It would probably be considered a mini desktop, yes...
No, that's a great answer, thank you. So would this be good for also running something like Stable Diffusion, or would that be more GPU-side than RAM since it's images/video? As for the ChatGPT type, how would you train it on your desktop? Or do you not even need to?
wiouxev · Sep 29, 2025 03:57 PM · 349 Posts · Joined Jan 2010
Quote from konoplya :
So would this be good for also running something like Stable Diffusion, or would that be more GPU-side than RAM since it's images/video? As for the ChatGPT type, how would you train it on your desktop? Or do you not even need to?
Same basic concept to run. VRAM is ideal in either case, but RAM can do the job in builds like this GMKtec EVO-X2.
Small caveat worth noting: while it is normal system RAM in this computer, you would allocate some or most of it to the iGPU, so technically it would be treated as VRAM.

For LLMs, the limiting factors are model size and context length.
For Stable Diffusion, it's resolution.

As for your other question, I'm sure someone around SD will see this and give a better answer, as I'm a bit of a noob.

As I understand it, LLMs, Stable Diffusion, and similar models are already "trained." That's essentially what you're downloading when you go here and grab a model:
https://ollama.com/library

But there's a caveat: LLMs can also leverage "RAG," which is kind of a second layer of "training" that might be, say, your personal docs that you feed it. Think of it as a personalization layer, but it can be leveraged to make the model "smarter" or more "specific."
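As a concrete illustration of that workflow, here is a minimal sketch that talks to a locally running Ollama server on its default port (http://localhost:11434) and does the crudest form of "RAG" by pasting a personal document into the prompt. The model name, the notes file, and the questions are placeholders of mine; it assumes you have already installed Ollama and pulled a model (for example, ollama pull llama3).

# Minimal local-LLM sketch: assumes Ollama is installed, a model has been pulled
# ("ollama pull llama3"), and its server is listening on the default port.
# Model name and notes file are placeholders for illustration only.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local REST endpoint

def ask(prompt: str, model: str = "llama3") -> str:
    """Send one prompt to the local model and return its full (non-streamed) reply."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Plain question: answered from whatever the downloaded model already "knows".
print(ask("In one sentence, why do local LLMs need lots of fast memory?"))

# Poor man's RAG: stuff your own document into the prompt so the answer is
# grounded in your data, with no retraining of the model at all.
notes = open("my_notes.txt", encoding="utf-8").read()  # hypothetical personal document
print(ask("Using only the notes below, summarize my GPU vs unified-memory decision.\n\n" + notes))

Real RAG setups add an embedding index so only the relevant chunks get inserted, but the no-retraining point stands: the base model stays frozen and your documents just ride along in the prompt.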
konoplya · Sep 29, 2025 04:02 PM · 3,815 Posts · Joined Dec 2008
Quote from wiouxev :
Same basic concept to run. VRAM is ideal in either case...
Great info, thank you.


ChangeUsername · Sep 29, 2025 05:02 PM · 2 Posts · Joined Sep 2018
$4 more at your local Microcenter: https://www.microcenter.com/produ...ai-mini-pc
