Dr.W posted Yesterday 07:22 PM

GMKtec EVO-X2 AI Mini PC: Ryzen AI Max+ 395, 64GB LPDDR5X 8000MHz, 1TB PCIe 4.0 SSD, Quad Screen 8K Display, WiFi 7 & USB4, SD Card 4.0 $1399.99

$1,400 (list price $2,000, 30% off) at Micro Center
12 Comments · 3,046 Views
Deal Details
Lowest price so far!

Available in-store only; in stock at all stores at the time of posting.

SPECS:
  • AMD Ryzen AI Max+ 395 (3.0GHz) Processor
  • 64GB LPDDR5X-8000 RAM
  • AMD Radeon 8060S Integrated Graphics
  • 1TB SSD
  • 2.5GbE LAN, WiFi 7 (802.11be), Bluetooth 5.4
  • Windows 11 Pro
https://www.microcenter.com/produ...ai-mini-pc

Community Voting: Deal Score +11 (Good Deal)

12 Comments

cuoreesitante · Yesterday 10:12 PM · 1,947 Posts · Joined Aug 2008

About $100 cheaper than the launch price. I bought it at launch (this specific version) and ended up returning it. For local LLMs, 64GB of RAM is sort of a handicap that will keep you from some of the larger and better models, and since you can't upgrade the RAM yourself, it's a dealbreaker. If you just want it for gaming, the 8060S is really good (with frame gen I was playing Cyberpunk at 1440p and getting over 100fps, which is wild on an iGPU), but for the price you can do much better in terms of performance with a regular tower PC.
SlickCrayon1512 · Today 12:41 AM · 1,220 Posts · Joined Aug 2019
Quote from cuoreesitante :
About $100 cheaper than the launch price. I bought it at launch (this specific version) and ended up returning it. For local LLMs, 64GB of RAM is sort of a handicap that will keep you from some of the larger and better models, and since you can't upgrade the RAM yourself, it's a dealbreaker. If you just want it for gaming, the 8060S is really good (with frame gen I was playing Cyberpunk at 1440p and getting over 100fps, which is wild on an iGPU), but for the price you can do much better in terms of performance with a regular tower PC.
Any upgradable RAM is a joke for AI. You need at least 96GB of VRAM, so 128GB is the one to go for.
TPMJB · Today 01:38 AM · 1,026 Posts · Joined Apr 2015
Quote from cuoreesitante :
About $100 cheaper than the launch price. I bought it at launch (this specific version) and ended up returning it. For local LLMs, 64GB of RAM is sort of a handicap that will keep you from some of the larger and better models, and since you can't upgrade the RAM yourself, it's a dealbreaker. If you just want it for gaming, the 8060S is really good (with frame gen I was playing Cyberpunk at 1440p and getting over 100fps, which is wild on an iGPU), but for the price you can do much better in terms of performance with a regular tower PC.
This COULD BE a good device, but it's very, very poor marketing, trying to hop on the AI train. By attaching "AI" to the name, you can demand double what the product is worth. If you're trying to get into AI, why not just build a computer for half the price?

What this would be good for is a home theater PC. The integrated GPU can handle emulation up to nearly PS4 level, and of course you could get Jellyfin and everything else running too. Nobody is going to spend $1,400 on a TV box, though.

That is to say, I want this as a home theater PC but absolutely cannot justify the price.
PurpleSeed694 · Today 02:01 AM · 103 Posts · Joined Nov 2024
This can fit 4-bit quantized models.

It depends on what you're using the AI models for, but for my use cases the 4-bit versions have been more than enough for summarizing text, rewording paragraphs, etc.
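
For anyone wondering what running one of these 4-bit quants locally actually looks like, here is a minimal sketch using llama-cpp-python with a GGUF file; the model path and prompt are placeholders, not something specific to this machine or this thread.

    # Minimal local-inference sketch: assumes llama-cpp-python is installed
    # and a 4-bit (Q4) GGUF model file has already been downloaded.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/example-model-Q4_K_M.gguf",  # placeholder path
        n_gpu_layers=-1,  # offload every layer to the GPU backend if available
        n_ctx=4096,       # context window size
    )

    result = llm(
        "Summarize the following paragraph in two sentences:\n<your text here>",
        max_tokens=200,
    )
    print(result["choices"][0]["text"])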
electrobento · Today 02:21 AM · 2,010 Posts · Joined Aug 2011
Quote from TPMJB :
This COULD BE a good device, but it's very, very poor marketing, trying to hop on the AI train. By attaching "AI" to the name, you can demand double what the product is worth. If you're trying to get into AI, why not just build a computer for half the price?

What this would be good for is a home theater PC. The integrated GPU can handle emulation up to nearly PS4 level, and of course you could get Jellyfin and everything else running too. Nobody is going to spend $1,400 on a TV box, though.

That is to say, I want this as a home theater PC but absolutely cannot justify the price.
And what graphics card would you use to get 64GB+ of memory that's reasonably functional for AI? The answer would make your custom PC wildly more expensive than this.

Unified memory is truly a game changer for AI purposes.
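
Some rough, back-of-the-envelope numbers (an illustration, not figures from the post) on why a pooled memory budget matters for model size:

    # Rough weight-memory estimate for quantized LLMs (illustrative only).
    def weights_gb(params_billion, bits_per_weight):
        # parameters * bytes per weight, expressed in GB
        return params_billion * 1e9 * (bits_per_weight / 8) / 1e9

    for params, bits in [(8, 4), (32, 4), (70, 4), (70, 16)]:
        print(f"{params}B model @ {bits}-bit ~= {weights_gb(params, bits):.0f} GB of weights")

    # A ~70B model at ~4-bit is roughly 35 GB of weights before the KV cache,
    # more than any single consumer GPU's VRAM, but well within a 64GB or
    # 128GB unified-memory pool.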
msmith777 · Today 03:39 AM · 124 Posts · Joined Jul 2017
Quote from cuoreesitante :
About $100 cheaper than the launch price. I bought it at launch (this specific version) and ended up returning it. For local LLMs, 64GB of RAM is sort of a handicap that will keep you from some of the larger and better models, and since you can't upgrade the RAM yourself, it's a dealbreaker. If you just want it for gaming, the 8060S is really good (with frame gen I was playing Cyberpunk at 1440p and getting over 100fps, which is wild on an iGPU), but for the price you can do much better in terms of performance with a regular tower PC.
To be clear, the RAM cannot be upgraded by anyone; it's soldered to the board. The most these 395 chips can handle is 128GB. Personally, I would just go maxed out so you don't get stuck later. Some of the new 395 machines only come in 128GB models, which is really how it should be.
seanleeforever · Today 03:46 AM · 5,005 Posts · Joined Dec 2009
I think those who use it for AI purposes are going to get 128GB, and it's often paid for as a company expense, so it's not like the price matters (I think I got the HP laptop with this chip for about $4k).
For home use... it's a bit too pricey for me to get one at this price. I'd be fine with 32GB (for home gaming use) if the price went under $700.


DanCar · Today 03:51 AM · 292 Posts · Joined Aug 2007
TPMJB · Today 05:57 AM · 1,026 Posts · Joined Apr 2015
Quote from electrobento :
And what graphics card would you use to get 64GB+ of memory that's reasonably functional for AI? The answer would make your custom PC wildly more expensive than this.

Unified memory is truly a game changer for AI purposes.
Alright then, pardon my ignorance, but isn't a GPU with some real horsepower more useful for AI workloads? Otherwise, why is everyone snatching up as many 5090s as possible?

What exactly is a consumer running AI workloads for, anyway? Is it making you money?
cuoreesitante · Today 07:18 AM · 1,947 Posts · Joined Aug 2008
Quote from TPMJB :
Alright then, pardon my ignorance, but isn't a GPU with some real horsepower more useful for AI workloads? Otherwise, why is everyone snatching up as many 5090s as possible?

What exactly is a consumer running AI workloads for, anyway? Is it making you money?
Yes, high-end GPUs are better, but usually at a higher price, with a larger footprint and more power consumption. 5090s have been hard to get a hold of ever since they launched, 4090s are still selling for more than their original MSRP on the used market, and even 3090s get snatched up super quickly.

I can see plenty of people who do freelance work using local AI for coding, text analysis, creative writing, and music/video work. There's value in keeping it all offline and not being at the mercy of the whims of tech CEOs who like to change their model/algorithm every other week.
seanleeforever · Today 07:34 AM · 5,005 Posts · Joined Dec 2009
Quote from TPMJB :
Alright then, pardon my ignorance, but isn't a GPU with some real horsepower more useful for AI workloads? Otherwise, why is everyone snatching up as many 5090s as possible?

What exactly is a consumer running AI workloads for, anyway? Is it making you money?
People get 5090s to do training; you don't need nearly as much horsepower for inference, which is loading someone else's trained model for your application. If you're going to do any training at all at a large scale, you need both the VRAM and the GPU compute.
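
The memory gap between the two is easy to see in code; this generic PyTorch sketch (not specific to this machine) shows why a training step needs gradients and optimizer state on top of the weights, while inference only needs the weights and activations.

    import torch
    import torch.nn as nn

    model = nn.Linear(4096, 4096)   # stand-in for a much larger network
    x = torch.randn(8, 4096)

    # Inference: no gradients tracked, so memory is roughly weights + activations.
    with torch.no_grad():
        y = model(x)

    # Training step: gradients plus optimizer state (Adam keeps two extra
    # tensors per parameter) multiply the per-parameter memory several times.
    opt = torch.optim.Adam(model.parameters())
    loss = model(x).pow(2).mean()
    loss.backward()
    opt.step()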
SantaBoy · Today 10:54 AM · 15 Posts · Joined Jul 2018
Quote from TPMJB :
Alright then, pardon my ignorance, but isn't a GPU with some real horsepower more useful for AI workloads? Otherwise, why is everyone snatching up as many 5090s as possible?

What exactly is a consumer running AI workloads for, anyway? Is it making you money?
For consumers running local LLMs, what you need most is (V)RAM and, more importantly, very high data bandwidth (GB/s) to the GPU. You can't get decent RAM bandwidth to the GPU in a traditionally built consumer PC, so people need VRAM, which sits on the GPU board. With this AMD chip or Apple silicon, the RAM is unified, with high bandwidth to both the CPU and GPU, so you just need RAM. That's the difference between the RAM in the two cases.
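
A rough way to see why bandwidth dominates (my own estimate; the listing only says LPDDR5X-8000, and the ~256 GB/s figure assumes a 256-bit bus): generating each token requires streaming roughly all of the model's weights from memory once, so bandwidth divided by model size gives an upper bound on tokens per second.

    # Crude upper-bound estimate for generation speed on a bandwidth-bound system.
    bandwidth_gb_s = 256.0   # assumed: LPDDR5X-8000 on a 256-bit bus (not stated in the post)
    model_size_gb = 35.0     # e.g. a ~70B model quantized to ~4-bit

    # Each new token reads ~all weights once, so this is an optimistic ceiling.
    tokens_per_sec = bandwidth_gb_s / model_size_gb
    print(f"~{tokens_per_sec:.1f} tokens/s upper bound")   # about 7 tokens/s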
