The Intel Arc A750 makes the RTX 3050 look pathetic and even gives the RTX 3060 some tough competition, though AMD's RX 6600 makes Intel's offering a little less enticing. Still, it's good to see competition from Intel in the GPU space.
Hoping Intel keeps making cards. Competition is good for everyone (except AMD and Nvidia).
They haven't dropped the GPU business. They already have a couple of generations coming out over the next few years. They've reported they're selling more GPUs, and their drivers have gotten a lot better.
Nope. I work for a company that gets a lot of Intel's test business. The GPU business is not going away...if anything, it's a big part of their future lol. They made the right move attacking the affordable sector first to iron out their kinks. Arc was never meant to be a big hit off the bat; nobody should've ever expected that. The improvements they've made thus far have been more than impressive. Battlemage is where the big guns will be revealed. They'll still be behind by then, but it should rival the competition's current generation of high-end cards, sans the 4090. You gotta root for them, because both AMD and Nvidia really suck these days on pricing and need to be humbled.
I don't agree. While Arc isn't perfect, this is a far better buy.
Current drivers are significantly better than the launch ones. Performance is between RX 6700 XT and RTX 3070 levels in a lot of newer games, and at worst around RX 6650 XT levels.
If you enable ray tracing, Arc blows RDNA2 out of the water; the RT performance is at RTX 3000/RDNA3 levels. For example, here is Hogwarts Legacy at 1440p with RT, no upscaling: https://tpucdn.com/review/hogwart...0-1440.png
XeSS is a much better upscaler than FSR, and Arc can use both XeSS and FSR.
The XMX cores accelerate AI applications, like upscaling in Gigapixel AI and running GANs. They aren't as widely adopted as CUDA, but they're better than what AMD offers for AI performance.
Encoding quality and performance are the best in the industry: Quick Sync is better than NVENC, and both are better than AMD's encoders, even on their newest flagships. The A750 also supports AV1 encoding, which neither RDNA2 nor RTX 3000 supports (you have to pay $800+ for their latest GPUs to get this).
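To make the XMX/AI point concrete, here's a minimal sketch of pushing a PyTorch workload onto an Arc card via Intel's XPU backend. This is my own illustration, not something from the thread: it assumes intel-extension-for-pytorch is installed against a matching PyTorch build, and the tiny conv net is just a placeholder standing in for a real upscaler or GAN.

```python
# Minimal sketch: offloading a PyTorch workload to an Intel Arc GPU.
# Assumes intel-extension-for-pytorch is installed alongside a matching
# PyTorch build; importing it registers the "xpu" device.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401 (import for its side effect)

device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

# Placeholder model standing in for something like an upscaler or GAN.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 64, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(64, 3, kernel_size=3, padding=1),
).to(device).eval()

# A dummy 1080p RGB frame; a real app would feed decoded image data here.
frame = torch.rand(1, 3, 1080, 1920, device=device)

with torch.no_grad():
    out = model(frame)
print(out.shape, "computed on", device)
```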
Regarding your statements on gaming performance (RX 6700 XT/RTX 3070), that's a bit off based on updates from major reviewers who retested these cards. Hardware Unboxed on YouTube published an update a few days ago, and they published an article as well: https://www.techspot.com/review/2...0-revisit/
The more expensive A770 was what they tested; the A750 is about 10% slower in raster: https://www.techpowerup.com/gpu-s...a750.c3929. At 1080p the averages were: A770 111 fps, RX 6650 XT 120 fps, RX 6700 XT 140 fps, RTX 3070 143 fps. At 1440p the Arc A770 did go up slightly, coming in 2 fps faster than the RX 6650 XT. The fps averages are a bit inflated for all cards because CS:GO was included in the 12-game suite to test DX9.
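Quick back-of-the-envelope on where that leaves the A750 in the same suite, assuming the roughly 10% raster gap behind the A770 quoted above carries over:

```python
# Rough estimate of where the A750 lands in the 1080p averages above,
# assuming it runs ~10% behind the A770 (per the TechPowerUp figure).
averages_1080p = {"A770": 111, "RX 6650 XT": 120, "RX 6700 XT": 140, "RTX 3070": 143}

a750_est = averages_1080p["A770"] * 0.90  # ~100 fps

for card, fps in averages_1080p.items():
    print(f"{card}: estimated A750 at {a750_est / fps:.0%} of its average")
# Puts the estimated A750 at ~83% of the RX 6650 XT and ~70% of the RTX 3070.
```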
This price bracket/tier of GPU is fine for 1080p on current games (no RT) or 1440p (no RT) on older and less demanding games. The RX 6600 XT/RTX 3060 commonly shows up as the recommended GPU for 1080p (no RT) on recent releases so I wouldn't rely on anything in this tier for 1440p or RT moving forward.
That said, $230-250 is within reason given the competition's pricing. Hopefully Battlemage can go up a few tiers and fix the higher-than-expected power draw at idle and load; if they go up two or more tiers, more people would actually care about RT. Btw, another updated review from Techtesters, specifically covering the A750 (including power draw): https://www.youtube.com/watch?v=EYb5PhP
Arguably, they have been the #1 graphics company of all time, seeing as their integrated graphics are in almost all their CPUs.
That may be where the consumer apprehension originates. I've had half a dozen laptops with integrated Intel graphics, and without fail each one has been, in the graphics department, utter shaite.
Intel loses on perf per watt, runs about the same price and is less mature. Go used 6600 or 5700 and win.
You don't win with the two cards you mentioned. You win if you wait for the next-generation cards, which are overdue at the midrange level. GPU prices are still inflated while demand is low. If the A750 were $150, it would be a no-brainer filler card that will mature with drivers over the next few months.
The 6700 XT would be the card at $250, but the pricing is not there yet. Waiting 2-3 months for the 4060 and AMD's midrange 7000-series cards is the best bet.
I've been a PC gamer since the Commodore 64 days, and GPUs on motherboards were never common from the mid '90s through the late 2000s. In fact, I used to work in television and have tracked digital editing capabilities over the decades, from the Amiga and the Video Toaster through modern CPUs, and even dedicated video editing computers always had daughterboards.
The Amiga 3000T with the Video Toaster was the shit. It would have been amazing to see where the Amiga would have landed had Commodore not dice-rolled the company on that ridiculous CD32 to go up against Sega.
Even a new RX 6600 goes on sale for cheaper than this. You don't have to buy used. You can get a 6650 XT in this price range new.
The question is, what are the benefits of this Intel GPU over a 6650 XT? I'm open-minded, and I agree with the idea of supporting competition in the GPU space, but it has to make sense.
I like to make bad decisions, but to me, the novelty is worth something.
With the newer drivers, most testing seems to put the Arc A750 in between the 66xx XT and 67xx cards performance-wise... and at 1440p the Intel seems to get nearer the 67xx than the 66xx cards.
It also beats the hell out of comparable AMD cards in ray tracing, though I dunno how much of that you're doing in this price class anyway, and if RT is a huge deal to you, you prob still want Nvidia?
So it's a reasonable buy in that sense (prob a touch more so if you're looking at 1440p), and supporting more competition can be your cherry on top of that, I suppose.
I linked a couple of updated benchmarks in my earlier comment. Both Hardware Unboxed's and Techtesters' 1080p results would still put the A750 at roughly RX 6600 level, FYI. Those benchmarks were done after the major driver update and marketing push that included the DX9 improvements.
I benchmarked the RX 6600 XT, RX 5700 XT, and 3060 Ti over the past year. Based on what I'm seeing in more demanding games (Forspoken, Hogwarts Legacy, Witcher 3 next-gen, etc.), I'd say the Arc A750, RX 6600, and similar cards are only going to be consistent at 1080p raster high/ultra for newer releases. Forspoken in particular was only running at 50 fps at 1080p high on an ultrawide on the RX 5700 XT (about the same as the RTX 3060 and Arc A750, per TechPowerUp).
Buyers beware: if you're running an older CPU and motherboard (some Ryzen 3000 and older, and Intel 9th gen and older) without Resizable BAR (ReBAR) support, an Intel Arc GPU will be significantly handicapped.
I have a Ryzen 3700X, but on a motherboard that doesn't support ReBAR, and the A750's performance is noticeably worse than the Radeon 6650 XT's at 1080p. https://arstechnica.com/gadgets/2...recommend/
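If you want to verify whether ReBAR is actually active before buying, here's a rough sketch of one way to check on Linux by sizing the GPU's PCI regions from sysfs. The PCI address is a placeholder, and the per-line "BAR" labeling is approximate (the resource file also lists ROM and I/O windows). With ReBAR working, one region should span close to the card's full VRAM, e.g. ~8 GB on an A750, instead of the classic 256 MB window.

```python
# Rough ReBAR sanity check on Linux: size each PCI region of the GPU
# from sysfs. With Resizable BAR active, one region should cover
# (nearly) all of VRAM, e.g. ~8 GB on an A750, not just 256 MB.
from pathlib import Path

# Placeholder PCI address; find yours with `lspci | grep -i vga`.
GPU = Path("/sys/bus/pci/devices/0000:03:00.0/resource")

for i, line in enumerate(GPU.read_text().splitlines()):
    # Each line holds start, end, and flags as hex values.
    start, end, flags = (int(x, 16) for x in line.split())
    if start or end:  # skip unused slots
        size = end - start + 1
        print(f"region {i}: {size / 2**20:.0f} MiB")
```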
I think most of you guys only care about gaming performance, but there are a few of us buying this for its AV1 encoding performance. Software encoding is good, but even with my Ryzen 9 5900, the average is 58 fps at full load. Hardware AV1 apparently goes above 200 fps; that's nearly 4x faster than a 12-core CPU.
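For anyone who wants to try that comparison themselves, here's a minimal sketch driving ffmpeg from Python. It assumes an ffmpeg build with the Intel QSV AV1 encoder (av1_qsv) and the SVT-AV1 software encoder (libsvtav1) compiled in; "input.mp4" is a placeholder clip, and the wall-clock timing is crude (it includes decode time).

```python
# Minimal sketch: compare hardware (Arc/Quick Sync) vs software AV1 encoding.
# Assumes ffmpeg was built with av1_qsv (Intel hardware) and libsvtav1
# (software) support; "input.mp4" is a placeholder clip.
import subprocess
import time

def encode(codec: str, outfile: str) -> float:
    """Encode input.mp4 with the given video codec and return wall time."""
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "input.mp4", "-c:v", codec, outfile],
        check=True,
    )
    return time.perf_counter() - start

hw = encode("av1_qsv", "out_hw.mkv")    # Arc's hardware AV1 encoder
sw = encode("libsvtav1", "out_sw.mkv")  # CPU-based SVT-AV1
print(f"hardware: {hw:.1f}s, software: {sw:.1f}s, speedup: {sw / hw:.1f}x")
```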
AV1 encoding. Nvidia's and AMD's last-generation cards don't support hardware AV1 encoding at all.