Original Post
Edited July 6, 2020 at 07:24 PM
First time post.
https://store.hp.com/us/en/pdp/hp...3-ay0021nr
$899 after code JULY4STACK10
You might need to log out of your student/etc. account before adding to cart.
Previous discussion
https://slickdeals.net/f/14086562-hp-envy-x360-13-ay0021nr-13-3-fhd-ips-touch-ryzen-7-4700u-16gb-ddr4-512gb-pcie-ssd-win10p-1000-f-s
Windows 10 Pro 64 Bit
13.3" 1080P 300 Nits IPS Touchscreen Display, 72% NTSC
Ryzen 7 4700U 2 GHz (8 Core, 4.2 GHz Boost, 12MB Cache)
AMD VEGA 7 Graphics (1600 MHz)
16GB DDR4-3200 RAM (onboard)
512GB M.2 PCIe NVMe SSD
Backlit Keyboard
720p HD webcam w/ dual-array microphone, camera shutter button
Accelerometer; Gyroscope; eCompass; IR Thermal sensor
Fingerprint Reader
Intel Wi-Fi 6 AX200 (2x2) + Bluetooth 5.0
PORTS:
2x USB 3.0 Type-A (one w/ HP Sleep & Charge)
1x USB 3.1 Gen 2 Type-C (10 Gbps Data, DisplayPort 1.4, Power Delivery 3.0)
1x Audio Combo Jack
1x Micro SD Card Reader
3 Cell 51 WHr Li-Ion Battery
2.92 lbs
Per the previous thread, no pen is included.
148 Comments
Featured Comments
https://www.laptopmag.c
Tiger Lake was expected to close the performance gap with Ryzen 4000, but even at higher clocks the Tiger Lake part still benchmarked below the lower-clocked Ryzen, which suggests the new Intel Tiger Lake still won't match this generation of Ryzen on performance and efficiency.
Like I said before, Intel's 10-nanometer process is going to be its main limitation: Tiger Lake will still be built on 10nm, while Ryzen has already moved to 7nm. Not trying to start a debate; I'm a fan of innovation and of both Intel and AMD, but this time around AMD really put the pressure on Intel to start innovating instead of churning out the same CPU under different names.
I hope that's the case and we get some good years of fierce competition (which means happy end users).
When I run my current machine at home, it is connected to all my peripherals (and charges) via USB-C to a dock. When I'm traveling (or otherwise not at my desk and not running on battery), I use my traditional power brick, which frees up the USB-C port for other uses.
So, again, why is not getting a USB-C power brick a bad thing?
If so, wouldn't this be a no-brainer, at such a low marginal cost? Other than viewing angles, is quality/color accuracy going to be similar to the 400 nit screen?
There seems to be a lot of confusion about the screen and what is acceptable. The panel on the Envy 15 is only 250 nits. It is subpar, with a very poor color gamut and color accuracy; the hue tends to run orange, so images of red appear orange. I have tried all the software calibration and color tweaking, but it is simply poor panel quality that gives it that orange tinge, and nothing in software can fix a hardware issue.

The Envy 15 is otherwise a great laptop, with user-upgradable RAM slots (up to 32GB of DDR4-3200), a full-size HDMI 1.4 connector, 2x USB-A (one drop-jaw and one standard), and 1x USB-C (non-Thunderbolt, as that is an Intel feature) that supports Power Delivery charging to the laptop, display out, and USB multi-adapters. The main issue that cripples the Envy 15 is the screen: 250 nits is plenty bright for indoor use, as many have said, but with the glossy finish, outdoor use is difficult.
Now to the Envy 13 that everyone is talking about. Its 300-nit panel is a different panel from the 15-inch 250-nit one and does not suffer from the orange tinge and awful color accuracy. It isn't the best panel for a 13", but at this price point it is acceptable. If brightness and color accuracy are crucial to you, bump up to the 1000-nit option; it is a big jump in price, but with a better panel and the privacy feature, it's worth the upgrade. Failing that, go with the 400-nit panel, which is more than adequate for the majority of users.
Between the Envy 15 and Envy 13, they are practically identical internally, except for these items:
Envy 15 has:
- full size HDMI
- 1x USB-C 3.2 with PD charging, DisplayPort out
- upgradable RAM (up to 32GB in two slots), good for future-proofing
- upgradable NVMe slot (for larger drives in the future)
- full size SD card slot
Envy 13 has:
- 2x USB A drop jaw
- 1x USB-C 3.2 with PD charging, DisplayPort out
- soldered on RAM (get what you need when you order it)
- upgradable NVMe slot (for larger drives in the future)
- micro SD card slot
As for buyers comparing the CPU and iGPU: the Ryzen is far superior to the 10th Gen Intel CPU with Iris Plus Graphics, at a fraction of the cost. The AMD Ryzen 7 4700U is built on a 7-nanometer lithography and the Intel 10th Gen i7 on a 10-nanometer lithography; a smaller lithography lets you shrink the transistors and therefore fit more of them in the same area (smaller is better). FYI, the i7 has 4 cores vs. 8 cores on the Ryzen 7, which basically gives AMD the advantage in this generation of CPUs.
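To illustrate the lithography point, here is a back-of-envelope sketch of idealized transistor-density scaling. This is a simplification: marketing node names like "7nm" and "10nm" no longer correspond to exact physical feature sizes, so treat the numbers as illustrative only.

```python
# Idealized scaling: if every feature shrinks linearly with the node size,
# transistor density scales with the inverse square of the feature size.
# NOTE: real "7nm" and "10nm" processes are marketing names, not literal
# dimensions, so this is a rough illustration, not a real density figure.

def ideal_density_ratio(old_nm: float, new_nm: float) -> float:
    """Density gain when moving from old_nm to new_nm, ideal scaling."""
    return (old_nm / new_nm) ** 2

ratio = ideal_density_ratio(10, 7)
print(f"Idealized density gain going 10nm -> 7nm: {ratio:.2f}x")  # ~2.04x
```

Even under this idealized model, the 7nm node fits roughly twice the transistors in the same area, which helps explain how AMD can offer 8 cores in the same thin-and-light power envelope where Intel offers 4.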
I have a Surface Pro 7 with the 10th gen i7 and Iris Plus, and the Envy 13 with the Ryzen 7. The Envy is much faster in real-world use, from opening apps to playing light games (Far Cry 5, Battlefield 4, Tomb Raider). FPS (frames per second) in games is a big indicator of AMD's superior architecture: the Ryzen's Radeon graphics run about 10-20 FPS better than Intel's Iris graphics. That makes a big difference when certain games push these ultrabooks to their limit around the acceptable minimum of 30 FPS; the extra 10-20 FPS makes the light gaming experience much better.
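To put that uplift in perspective, a quick sketch using only the figures quoted above (a 30 FPS floor and a 10-20 FPS advantage; no benchmark data beyond that) shows why extra frames matter most near the minimum:

```python
# Frame time shrinks non-linearly with FPS, so a 10-20 FPS uplift near the
# 30 FPS playability floor is a large relative improvement.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering each frame at a given FPS."""
    return 1000.0 / fps

base = 30  # commonly cited minimum for playable frame rates
for uplift in (10, 20):
    fps = base + uplift
    print(f"{base} -> {fps} FPS: frame time {frame_time_ms(base):.1f} ms -> "
          f"{frame_time_ms(fps):.1f} ms ({uplift / base:.0%} more frames)")
```

Going from 30 to 50 FPS is a 67% increase in frames delivered, which is why the same games feel dramatically smoother on the Ryzen's iGPU.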
Someone mentioned using this for schoolwork (CAD, Photoshop, and other graphics-intensive coursework). Either Intel's or AMD's latest-gen CPU will run those apps fine; the difference is just how fast files get processed. Laptop CPUs, iGPUs, and dedicated GPUs have come a long way in a few years, so it shouldn't have any issues running those apps.
IMO, the $50 jump is worth it for a slightly better panel; the 300-nit isn't bad at all. The 400-nit has some good reviews via Lisa at MobileTechReview. I don't have that unit, so I can't comment on the accuracy, but she mentioned it's 90-something percent sRGB coverage, so you should be able to do photo editing on it in a pinch. The price jump to the 1000-nit is a lot, and I can't say it's going to be 3x better on color accuracy, but the brightness is definitely there: my SIL has the 300-nit, and side by side the 1000-nit looks about 30-40% brighter to the naked eye. I liked the privacy option, and also the fact that the panel doesn't have to push to its max to get really bright (my hope is slightly better battery life, since the panel is less taxed). Not a scientific assessment, but I fell for the marketing upcharge!
She tested it at 200 nits and got 8 hours of battery life. I tried dropping down to 150 nits and was able to get close to 10 hours!
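Those two data points, combined with the 51 Wh battery listed in the specs above, let you estimate the average power draw at each brightness. Back-of-envelope only: it assumes the battery delivers its full rated capacity and nothing else changed between the two runs.

```python
# Average system draw implied by runtime on the Envy 13's 51 Wh battery:
# ~8 h at 200 nits vs ~10 h at 150 nits (figures from this thread).

BATTERY_WH = 51.0  # 3-cell battery capacity from the spec list

def avg_draw_w(hours: float) -> float:
    """Average power draw implied by draining the full battery in `hours`."""
    return BATTERY_WH / hours

draw_200 = avg_draw_w(8)    # ~6.4 W average at 200 nits
draw_150 = avg_draw_w(10)   # ~5.1 W average at 150 nits
print(f"~{draw_200:.1f} W at 200 nits, ~{draw_150:.1f} W at 150 nits; "
      f"dimming saved ~{draw_200 - draw_150:.1f} W")
```

That ~1.3 W difference is plausible for a 50-nit brightness drop on a 13" panel, and it shows why screen brightness is one of the biggest levers on ultrabook battery life.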
So it'll be several months before they start showing up in laptops, only to be (yet again) a marginal improvement over current chips. At best, the GPU performance might be ~2x Ice Lake, which would certainly be substantial, but I suspect AMD will have an answer to it, which will probably be at least close, if not better, and significantly cheaper. And with USB4, which should be showing up in laptops by then, eGPUs will be better (more throughput) and should be available on at least some AMD systems, allowing a budget system to be used as a high-end gaming one, making the improved iGPU less important (not to say the improvements wouldn't still be important or welcome).
As always with computers, there's something better right around the corner. If you don't need to upgrade now, you may as well wait until you do to get the most you can; if you do need to upgrade, it's probably not worth the wait. Actually, one of the reasons I'm seriously considering the Envy is that it'll hold me over, and the money saved over a Spectre or something similar can go toward an upgrade in a year or so, when I can get a system with a next-gen Ryzen + USB4. I just hope by then they'll finally stop making ridiculous keyboard layouts, though I'm sure they won't.