The Intel Arc A750 makes the RTX 3050 look pathetic and even gives some tough competition to the 3060, though AMD's RX 6600 makes Intel's offering a little less enticing. Still, it's good to see competition from Intel in the GPU space.
Hoping Intel keeps making cards. Competition is good for everyone (except AMD and Nvidia).
They haven't dropped the GPU business. They already have a couple of generations coming out over the next few years, they've reported they're selling more GPUs, and their drivers have gotten a lot better.
Nope. I work for a company that gets a lot of Intel's test business. The GPU business is not going away... if anything, it's a big part of their future lol. They made the right move attacking the affordable segment first to iron out the kinks. Arc was never meant to be a big hit off the bat; nobody should've expected that. The improvements they've made thus far have been more than impressive. Battlemage is where the big guns will be revealed. They'll still be behind by then, but it will rival the competition's current-gen high-end cards, sans the 4090. You gotta root for them, because both AMD and Nvidia really suck these days on pricing and need to be humbled.
What's the big deal with AV1 encoding anyway? Hardly anything supports it yet. I would think decoding is a big deal, but what is the use case for encoding now (where software speed is not sufficient)? Wouldn't you want your media encoded in a format most devices support?
Intel is losing server share right now, and that's causing long-term revenue issues because that business makes up half of their revenue. The more AMD eats up that share and wins 10+ year customers (once you move servers, you stay for a long time), the more that long-term revenue lets AMD fund aggressive graphics and CPU initiatives. Intel, meanwhile, needs to fix their consumer CPU market, because server isn't something they can turn around quickly.
https://www.lightreading.com/semi...-id/782916 [lightreading.com]
AV1 offers like 30% better compression than H.265-- and playback is pretty widely supported on modern stuff at this point... On mobile devices Android supports it natively, as does ChromeOS-- and in browsers Firefox, Chrome, Opera, Edge have all supported AV1 playback for years now, as does VLC. Even Safari finally added it.
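To put the ~30% figure in perspective, here's a rough back-of-the-envelope sketch. The bitrate and duration are made-up illustrative numbers, and it assumes "30% better compression" means the same quality at ~30% lower bitrate:

```python
# Hypothetical 4K HEVC stream bitrate, in Mbps (illustrative number only)
h265_mbps = 8.0
av1_mbps = h265_mbps * (1 - 0.30)   # same quality at ~30% lower bitrate
hours = 2.0                         # e.g. a two-hour movie
# Mbps -> GB: megabits/sec * seconds / 8 bits-per-byte / 1000 MB-per-GB
gb_saved = (h265_mbps - av1_mbps) * hours * 3600 / 8 / 1000
print(f"AV1: {av1_mbps:.1f} Mbps, saving {gb_saved:.2f} GB over {hours:.0f} hours")
```

So at those (hypothetical) numbers, roughly 2 GB saved per movie, which is why streaming services care so much.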
I've been a PC gamer since the Commodore 64 days and GPUs on motherboards were never common from the mid 90's through the late 2000's. In fact, I used to work in television and I've tracked digital editing capabilities over the decades from the Amiga and the Video Toaster through modern CPUs and even dedicated video editing computers always had daughter boards.
Integrated graphics were always part of the motherboard until the 2010s. You can read about it here[computer.org]. Not every motherboard included integrated graphics, kind of like how not every CPU has integrated graphics.
All the more reason Intel needs to keep this GPU line alive. GPUs aren't only for gaming; they're also for servers that need lots of parallel processing. Intel's gaming GPU effort is a byproduct of that server need.
Perhaps, but AMD has shown the path is more cores through additional chiplets, or as Intel calls them, "tiles".
Cores, power consumption, delivery times, and security are all important in the server space. Intel really screwed things up badly with the Spectre and Meltdown issues too.
Yes, but they weren't really GPUs. They were called IGCs or IGPs and they were super weak. GPUs are typically on daughterboards.
I have a 6600XT but if I had to choose right now I'd probably go Intel. The arch is brand new and will continue to improve with drivers and optimization, while the 3060 and 6600XT are as good as they'll ever be.
It's like back in the RX 480/580 days vs the 960, at first it didn't seem like it would keep up, but it aged tremendously better.
I've been really pleased with the performance on my A750 anywhere between 1080p and 1440p, which I grabbed just after the MSRP price drop to $250. With that as the price-point and reasonable awareness that it's a first-gen product, it's a great card. The current state of the drivers (basically everything after v. 4091) is solid, with mostly just one-off issues that tend to be knocked down by the next game or driver version update.
The main thing to keep in mind is to be sure your system supports resizable bar, which the Arc cards rely on for a good chunk of their potential performance. It's the one downside to the A-series if you're hoping to upgrade from an older system.
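If you're on Linux, one quick way to check is looking for the Resizable BAR capability in `lspci -vv` output. A minimal sketch (the helper function and the sample output below are hypothetical, for illustration):

```python
import subprocess  # only needed for the live-system variant at the bottom

def has_resizable_bar(lspci_text: str) -> bool:
    """Return True if any device in `lspci -vv` output lists the Resizable BAR capability."""
    return "Resizable BAR" in lspci_text

# Illustrative (made-up) snippet of what `lspci -vv` might print for an Arc card:
sample = """03:00.0 VGA compatible controller: Intel Corporation DG2 [Arc A750]
        Capabilities: [420] Physical Resizable BAR
                BAR 2: current size: 8GB, supported: 256MB 8GB"""
print(has_resizable_bar(sample))  # prints: True

# On a real system you'd feed it live output instead (requires pciutils):
# out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
# print(has_resizable_bar(out))
```

Note that ReBAR also has to be enabled in the BIOS (usually alongside "Above 4G Decoding"), not just supported by the hardware.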
If the Nvidia shield would play av1, I'd be all over these.
As far as I know it's not in the cards for the 2019 Shields.
I think most of you guys only care about the gaming performance, but there are a few of us buying this for AV1 encoding performance. Software encoding is good, but even with my Ryzen 9 5900 the average FPS is 58 under full load. Hardware AV1 apparently goes above 200 FPS; that's nearly 4x faster than a 12-core CPU.
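For a sense of what that FPS gap means in wall-clock time, a quick sketch (the clip length is a made-up example; the 58 and 200 FPS figures are from the comment above):

```python
frames = 30 * 60 * 60     # hypothetical 1-hour clip at 30 fps = 108,000 frames
sw_fps, hw_fps = 58, 200  # software (Ryzen 9 5900) vs hardware AV1 encode rates
sw_minutes = frames / sw_fps / 60
hw_minutes = frames / hw_fps / 60
print(f"software: {sw_minutes:.0f} min, hardware: {hw_minutes:.0f} min, "
      f"speedup: {hw_fps / sw_fps:.2f}x")
```

So roughly half an hour of encoding drops to about nine minutes for that hypothetical clip.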
the big draw with this card isn't av1, because cameras don't record av1.
what's needed for editing is 4:2:2 h.265 decoding support, and afaik this card has that??
amd and nvidia both refuse to include 4:2:2 h.265 decoding even on their most expensive cards, it's pathetic.
Nvidia's really gonna need to refresh the hardware at some point... even Roku and Firestick have AV1 support to varying degrees on newer models (as do many smart TVs).