I can't imagine that anyone expected anything else.
It's an undertold story that Intel has been a drag on the PC industry since the early '00s through numerous decisions that have narrowed the performance gap between mainstream PCs and smartphones. Intel's real goal has been to sell as much of the balance-of-system of a PC as possible. They're offended by the idea that NVIDIA or AMD gets some money from the graphics card when you buy a PC, so they came out with integrated "graphics" functionality in the CPU, which is a big reason people think Windows Vista sucked.
The other day my son and I found an old PC with a "Pentium"-branded chip from the early 2010s that had an artificially limited number of PCIe lanes. The chip claims to support 16 PCIe lanes, which makes you think you could hook up a graphics card, except some of those lanes are eaten up by SATA, Ethernet, sound, Super I/O and other features. This machine is now e-waste because of that short-sighted decision.
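If you want to see how the lanes actually got divvied up on a machine like that, here's a minimal Python sketch (Linux-only, stdlib; it assumes the kernel exposes the usual sysfs link attributes, which not every device does) that prints the negotiated versus maximum lane width per PCIe device:

    from pathlib import Path

    # Walk every PCI device the kernel knows about and report its PCIe link.
    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        try:
            cur = (dev / "current_link_width").read_text().strip()
            cap = (dev / "max_link_width").read_text().strip()
            speed = (dev / "current_link_speed").read_text().strip()
        except OSError:
            continue  # legacy PCI device, or no link attributes exposed
        print(f"{dev.name}: x{cur} of x{cap} lanes at {speed}")

On a lane-starved board you'll typically see a pile of x1 links hanging off the chipset, even where the slots are physically longer.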
People who've never built a PC, or tried putting cards in an existing one and fought with the restrictions on PCIe lanes, don't realize that PCs have been badly starved for I/O since 2010.
I'm sorry, but this is all a matter of perspective, even in the narrow purview of graphics drivers.
As a counterpoint, Intel's Linux graphics drivers are the very best. That's not to say they don't have problems; they have plenty, more than I'd like by a long shot. But they shine compared to NVIDIA's steaming pile of kernel panics, which still treats Wayland as a second-class citizen (try scrolling a Jupyter notebook in PyCharm under Wayland and then again under X; keep your system monitor open and watch a single CPU core get pegged as your screen updates at 5 fps). And don't get me started on AMD: sure, it's better than NVIDIA for displaying stuff, but only by one Linus middle finger. If you "use" their ROCm compute stack... "use"... what a joke of constant breakage and missing support. I deal with it every day and cry.
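If you want numbers instead of eyeballing a system monitor, here's a quick stdlib-Python sketch (Linux-only, since it just diffs /proc/stat; and XDG_SESSION_TYPE is a convention, not a guarantee) to run while you scroll under Wayland and then under X:

    import os
    import time

    def per_core_busy(interval=1.0):
        """Sample /proc/stat twice; return % busy per core over the interval."""
        def snapshot():
            cores = {}
            with open("/proc/stat") as f:
                for line in f:
                    # per-core lines look like "cpu0 ...", the aggregate is "cpu ..."
                    if line.startswith("cpu") and line[3].isdigit():
                        name, *vals = line.split()
                        fields = [int(v) for v in vals]
                        cores[name] = (sum(fields), fields[3] + fields[4])  # total, idle+iowait
            return cores
        a = snapshot()
        time.sleep(interval)
        b = snapshot()
        return {c: 100.0 * (1 - (b[c][1] - a[c][1]) / max(1, b[c][0] - a[c][0]))
                for c in a}

    print("session:", os.environ.get("XDG_SESSION_TYPE", "unknown"))
    for core, busy in sorted(per_core_busy().items()):
        print(f"{core}: {busy:5.1f}% busy")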
Sorry to vent... but I have strong feelings about this. Intel can suck at times (plus, you know, they are the hardware anti-competition leaders), but Windows is not the only game in town. It's worth mentioning that I'm complaining about the now; in the olden days, even trying AMD or NVIDIA on Linux was just asking for pain. If it wasn't for GPU compute, we'd still be in the dark(er) times.
PCIe 2.0 x8 has enough bandwidth for any video card you'd sensibly pair with an early-2010s Pentium.
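The back-of-the-envelope numbers support that; here's a trivial Python sketch of per-direction bandwidth from the published signaling rates and encoding overheads:

    # PCIe 1.x/2.x use 8b/10b encoding (20% overhead); 3.0+ use 128b/130b.
    def pcie_gbs(gt_per_s, lanes, encoding):
        return gt_per_s * encoding * lanes / 8  # Gb/s -> GB/s, per direction

    print(f"PCIe 2.0 x8 : {pcie_gbs(5.0, 8, 8/10):.2f} GB/s")      # ~4.00
    print(f"PCIe 2.0 x16: {pcie_gbs(5.0, 16, 8/10):.2f} GB/s")     # ~8.00
    print(f"PCIe 3.0 x16: {pcie_gbs(8.0, 16, 128/130):.2f} GB/s")  # ~15.75

Roughly 4 GB/s each way, which was not the bottleneck for midrange cards of that era.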
I'm hungry for PCIe lanes on mainstream motherboards, but it seems the majority of buyers aren't. Many boards have only one PCIe 5.0 x16 slot wired directly to the CPU for a video card, and the remaining slots hang off the PCH. Some boards have just one x16 from the CPU and two x1 from the PCH despite being ATX. That is overkill in one slot and inflexible everywhere else, but I see little criticism of it.
This seems absolutely bizarre to me. If you're a) going to enter the low-to-middle end of the market and b) do so decades after the major players, you really need to differentiate yourself positively. As a customer, getting "good enough" in that range of the GPU market has been dead simple for a long time, even through the recent semiconductor shortages.
I already dealt with NVIDIA and ATI's teething pains in the late '90s/early '00s; why would I do it again? I'm nostalgic for a lot of things from my high school years, but keeping my fingers crossed while installing the ATI Radeon driver update I got off of a PC Gamer demo disk is not one of them.
They seemed to mostly be hoping to provide stock and low-end GPUs in an extremely tight, starved market. They probably would have done okay with that, but now that the market is suddenly flooded with ex-mining GPUs...
The weird thing is, even at the worst, I didn't really have trouble getting my hands on a GPU in that price range. I threw together a basic "play Red Dead Redemption 2 and racing sims" PC for my dad a year ago, and had no problems.
They won't have decided to ship poor drivers. That'll be an emergent feature of the conditions under which the drivers were written plus their commercial decisions.
Naturally! "This is bizarre" is referring to the commercial decisions leading to this point. They seem to have chosen a hard market to crack, and then cornered themselves before even reaching the starting line.
Just look at their track record on the BIOS for server boards. A company I worked with ran into the exact same PCIe probing bug in the BIOS on every new generation of Intel server motherboard + CPU. And it always took six months for the BIOS guys to get around to reimplementing the same fix, because the BIOS code for next-gen CPUs forks off before the production fixes are complete.
I had a business partner who was impressed with Intel in a way I just couldn't understand. In my own area of software expertise, Intel was hiring freshers to make half-baked projects that were even more half-baked than my own half-baked projects, and I could never get him to understand why a software person would find it almost impossible to be taken seriously in a company that thinks its main competence is hardware.
(That said, NVIDIA is a bit of a counterexample. As much as people complain about their drivers, a lot of the brilliance in their product is in the CUDA compiler. I wound up having to fight with NVIDIA's drivers to make a family of machine learning products deployable, and came to have a lot of appreciation for the release engineering work that goes into those drivers, even if, in the end, they seem to be just another way to build a mailing list of people to spam and annoy.)
Do they value any employee at all? As per the latest grapevine, they have stopped giving RSU refreshers as well; I guess that may be a smart decision if they know all the smart horses have left the stable.
I’m surprised they got this far. The graphics software world is full of little code paths special-cased for Nvidia, AMD, and Intel integrated graphics. Nvidia drivers have game-specific code in them.
I really don't understand the Intel hate in this situation. A third competitor is a good thing, and on DirectX 12 titles they offer good price-to-performance.
Really, an utter waste of R&D and resources during a semiconductor shortage. I detest Intel, and the fact that the US govt is trying to give them more money to continue to fail. I lost faith in this company a while ago.