Update: This popular deal is still available.

B&H Photo Video has the Apple 14.2" MacBook Pro Laptop w/ M4 Pro Chipset (Late 2024 Model, Space Black or Silver) on sale for $1749. Shipping is free. Thanks to Community Member GlennP9031 for sharing this deal.
Alternatively, Costco Wholesale offers Members the Apple 14.2" MacBook Pro Laptop w/ M4 Pro Chipset (Late 2024 Model, Space Black or Silver) w/ Free 2nd-Year Warranty & Technical Support Included on sale for $1799.99. Shipping is free.
- Note: Costco offers up to a $1575 Costco Shop Card w/ the trade-in of an old Mac computer (see product page for details).
Product Specs:
- Apple M4 Pro chip w/ 12 CPU cores, 16 GPU cores, and 16 Neural cores
- 14.2" 3024x1964 120 Hz 1000-nit (1600-nit peak) Liquid Retina XDR display
- 24GB Unified Memory RAM
- 512GB NVMe Solid State Drive (SSD) Storage
- Wi-Fi 6E 802.11ax 2x2 MU-MIMO | Bluetooth 5.3
- 12MP Center Stage camera w/ Desk View
- 6-speaker high-fidelity system with Force-cancelling woofers
- 3-mic studio-quality array with high SNR and directional beamforming
- Backlit Magic Keyboard w/ Touch ID & Force Touch trackpad
- Ports:
- 3x USB-C/Thunderbolt ports
- 1x SDXC card slot
- 1x HDMI 2.1
- 1x Headphone/mic combo
- 1x MagSafe 3
- macOS operating system
- Weight: 3.5 lbs.
Warranty: 1-Year Limited Parts & Labor (additional year included w/ Costco purchase)
Top Comments
Geekbench 6 Metal GPU shows a ~40% increase in Metal performance; that's a pretty nice change.
M3 Pro 11-core CPU / 14-core GPU vs. M4 Pro 12-core CPU / 16-core GPU
https://www.youtube.com/watch?v=KVG5R7Q
Of the workloads on your list, I only do LLM inference.
My 14-core/20-GPU machine has performance similar to the M1 Max 32-GPU on LLM workloads.
That's a pretty decent speed increase up from the M3 regular Air (8 GPU cores) I was carrying (I skipped the M3 Pro and M3 Max).
I can actually do local inference on code, and now it's fast enough that I cannot read the code it's writing out line by line. The M3 Pro at 40% slower might annoy me at this point, now that I'm used to the faster inference speed. I did try the M3 Max but returned the machine because I didn't want a 16" and I didn't need the inference speed as much at the time. 7B or 11B models load just fine on my 24GB RAM machine; 32GB is more than I need.
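A rough back-of-the-envelope sketch of why 7B-11B models load fine in 24GB of RAM: at 4-bit quantization a model needs roughly half a byte per parameter, plus some overhead for the KV cache and runtime buffers. The 20% overhead factor below is an assumption for illustration; actual usage varies by quantization scheme and context length.

```shell
# Estimate memory for an 11B-parameter model at 4-bit quantization.
# Assumptions: ~0.5 bytes/param, ~20% overhead for KV cache and buffers.
awk -v params_b=11 'BEGIN { printf "%.1f GB\n", params_b * 0.5 * 1.2 }'
```

By the same arithmetic, a 7B model lands around 4.2 GB, leaving plenty of headroom on a 24GB machine even with macOS and other apps running.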
You can test your M3 Pro. I use ollama and had it write a 500-word story to test some of the models I use:
- 12B mistral-small: 18 t/s (why is this so slow compared to 11B vision? no idea)
- 11B llama3.2-vision: 38 t/s
- 7B llama3.1: 43 t/s
- 7B qwen2.5-coder: 42 t/s
- 2B llama3.2: 72 t/s
- 1B llama3.2: 124 t/s
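A minimal sketch of how to reproduce this kind of test, assuming ollama is installed and the model has been pulled: `ollama run --verbose` prints timing stats (including an eval rate in tokens/s) after each response, and the same figure can be computed by hand from the reported eval token count and eval duration.

```shell
# Benchmark a model's generation speed (assumes ollama is installed
# and `ollama pull llama3.2` has been run):
#   ollama run llama3.2 --verbose "Write a 500-word story."
# The --verbose output reports an eval rate in tokens/s. To compute it
# yourself from the reported numbers (example values below):
eval_count=1084   # tokens generated
eval_ms=8730      # eval duration in milliseconds
awk -v n="$eval_count" -v ms="$eval_ms" \
    'BEGIN { printf "%.1f t/s\n", n / (ms / 1000) }'
```

Running the same prompt across models of different sizes, as the list above does, gives a quick apples-to-apples comparison on one machine.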
87 Comments
I decided to learn React Native for mobile app development and found out it's best to have a Mac for iOS testing via Xcode.
I have access to an old 2012 MBP with 16GB RAM in which I replaced the HDD with an SSD, and it still works pretty well, but unfortunately macOS Catalina is the latest OS I can upgrade it to, so I can't run the latest Xcode.
The M4 Pro + 24GB RAM seems future-proof.
From work I currently have the M3 16" MBP with 36GB of RAM, and I sometimes get warnings for maxing out the RAM. So I think 16GB for personal development use would be too low.
Plus, the SYW credit card offer I have gives 150k points for $1000 in online spend and 250k points for $1500, so I should be able to get 400k points, which is equivalent to $400.
Saw the B&H price and contacted BB via chat a short while ago. They refunded me a total of $55.82. Very easy process.