The Intel Arc A750 makes the RTX 3050 look pathetic and even gives some tough competition to the 3060, though AMD's RX 6600 makes Intel's offering a little less enticing. Still, it's good to see competition from Intel in the GPU space.
Hoping Intel keeps making cards. Competition is good for everyone (except AMD and Nvidia).
They haven't dropped the GPU business. They already have a couple of generations coming out over the next few years. They've reported they're selling more GPUs, and their drivers have gotten a lot better.
Nope. I work for a company that gets a lot of Intel's test business. The GPU business isn't going away... if anything, it's a big part of their future lol. They made the right move attacking the affordable sector first to iron out the kinks. Arc was never meant to be a big hit right off the bat; nobody should've expected that. The improvements they've made thus far have been more than impressive. Battlemage is where the big guns will be revealed. They'll still be behind by then, but it will rival the competition's current generation of high-end cards, sans the 4090. You gotta root for them, because both AMD and Nvidia really suck these days on pricing and need to be humbled.
I have to walk back my claims a little. There were integrated graphics chips, but they were just for display purposes and really didn't do much in the way of gaming or video editing. They just made the computer work without having an additional graphics card before graphics integrated on the CPU was common.
Hopefully this doesn't sound worse than I intend it to, but maybe I can help you understand why some of the other posters' comments feel personal to you.
You went out of your way to correct a dude who said something accurate. Then when I chimed in, you posted a couple Linus videos mentioning relatively modern motherboards to support your point, and also claimed OG gaming bona fides. Then when more evidence came in, you said you need to walk your claims back "a little."
You ducked up. That's ok, we all do. I do it a lot. But when you do, eat it, don't try to minimize it.
Are you familiar with the terms "northbridge" and "southbridge"?
There was a long period where CPUs didn't have memory controllers built in, so you'd have a chipset called the northbridge which the CPU would talk to in order to communicate with the RAM.
Before PCI Express, there was also a different kind of interface called AGP which was the absolute fastest way a top end graphics card could achieve a high bandwidth connection. This was also something the northbridge handled.
Eventually, they had enough room on the northbridge to also add a low performance GPU (iGPU).
These days, all of that is integrated into the processor, so no northbridge is needed.
Now, the southbridge (which still exists, we just don't call it that anymore) handled all your interfacing with PCI cards (sound card and modem were two very popular components) as well as USB, and it sometimes included the storage controller, so your hard drive and CD-ROM drive would also go through it.
I do remember the northbridge and southbridge, and I also left a comment walking back some of my claims. I jumped to some conclusions because this is a thread about a graphics card, and I didn't think the person I was responding to meant basic display-only integrated graphics, but rather a full onboard GPU.
I don't recall anyone ever calling the northbridge a GPU. At best it contained a graphics controller.
I would be interested in getting an Intel graphics card, but the price would have to be no more than half that of competing AMD/Nvidia cards.
Yes, I fully agree. Intel is going to have to buy their way into the market with cheap prices and solid reliability and performance. Then once their reputation and user base are up to a certain level, they can start to increase prices. Right now, somewhere around 60% of the competition's price sounds about right.
I've had the A770 16GB for three weeks. I have not had any instability or crashing issues. The only games with any major issues that I have seen are some of the Microsoft exclusives (Halo series). I think those were fixed in the most recent driver. Gears of War Ultimate Edition is the only game I've come across that is unplayable (extreme visual corruption). And their screen capture software won't capture if HDR is enabled, but OBS works fine.
Lot of these comments speaking so negatively have not actually used an Arc GPU.
The Topaz software works well, worth $99 if you have a use for it. I have not installed the other software yet.
Handbrake can use the card and encodes AV1 at DVD quality at over 300 fps, and Blu-ray 1080p at over 60 fps, on an ancient (10-year-old i7) computer.
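Assuming the speeds quoted above, a quick back-of-the-envelope calculation shows what those encoder throughputs mean for a full-disc encode. The runtimes and source frame rates below are illustrative assumptions, not measurements from the original post:

```python
def encode_minutes(duration_min: float, source_fps: float, encode_fps: float) -> float:
    """Minutes needed to encode a video at a given encoder throughput."""
    total_frames = duration_min * 60 * source_fps
    return total_frames / encode_fps / 60

# Hypothetical 90-minute NTSC DVD (29.97 fps source) encoded to AV1 at ~300 fps:
dvd = encode_minutes(90, 29.97, 300)

# Hypothetical 120-minute 1080p Blu-ray (23.976 fps source) encoded at ~60 fps:
bluray = encode_minutes(120, 23.976, 60)

print(f"DVD:     ~{dvd:.0f} min")     # ~9 min
print(f"Blu-ray: ~{bluray:.0f} min")  # ~48 min
```

So at those speeds, a whole DVD transcodes in under ten minutes and a full-length Blu-ray in under an hour, which is the practical payoff of the hardware AV1 encoder.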
Agree, Intel drivers will get better. It's not like latest AMD and Nvidia drivers are stable either.
I bought an Arc A380 for OBS. But I also use it for games. It's pretty decent for a card I paid $130 for. I'm looking forward to more Intel cards in the future if they keep this up.
Dude it's a $230 card that performs about in line with a lot of other $200-$300 cards. It's not going to ruin anyone's gaming experience.
https://www.pcgamer.com/both-amd-...gaming-pc/