I know Intel is dipping its toe into the GPU market, but let’s be real: AMD and Nvidia are the only options, and have been for the last 20+ years. The manufacturers/assemblers of the complete graphics cards are varied and widespread, but the core tech comes from only two companies.

Why is this the case? Or am I mistaken and just brainwashed by marketing, and there are in fact other viable GPU options?

Cheers!

  • FreedomAdvocate@lemmy.net.au · 1 day ago

    It’s insanely costly, insanely difficult, and an insanely hard market for a new competitor to enter. Even something as “simple” as developer support is a gigantic hurdle.

  • Kyrgizion@lemmy.world · 4 days ago

    It’s not even really “two companies”: Nvidia has about 92% of the entire market on its own. And the reason for that is mostly CUDA and its ecosystem, which has become entrenched among developers.
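
    To make the lock-in concrete, a minimal sketch (assuming a working PyTorch install; purely illustrative): most GPU-compute code targets the CUDA backend, and while AMD’s ROCm builds of PyTorch do answer the same torch.cuda calls, much of the wider tooling is only ever tested on Nvidia.

        import torch

        # Standard device-selection idiom: use "cuda" if a GPU backend is visible.
        # On Nvidia this is literal CUDA; on AMD, the ROCm build of PyTorch reuses
        # the torch.cuda namespace, but many projects never test that path.
        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

        x = torch.randn(2048, 2048, device=device)
        y = x @ x  # the matrix multiply runs on whichever backend was found
        print(device, y.shape)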

    • Buffalox@lemmy.world · 4 days ago

      I think 90+% market share is technically considered a monopoly in many places.
      But the existence of AMD still makes a huge difference IMO: you do have an alternative, and Nvidia doesn’t control the market completely.
      Also, personally I use AMD because I’m on Linux, and I don’t want the proprietary Nvidia driver to fuck up my system.
      So AFAIK on Linux, the majority actually run AMD.

      • LainTrain@lemmy.dbzer0.com · 4 days ago (edited)

        I run Linux on my PC with a 3090, and my server running Debian oldstable has a 1070 Ti.

        The proprietary driver works OK. Nouveau is a valiant effort but isn’t very useful. I personally can’t imagine having an AMD GPU and not being able to play games, do ML, or have NVENC for transcoding (e.g. in a Jellyfin Docker container) and video editing. AMD is just not a serious company in the GPU space, sadly.
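
        Roughly, the kind of hardware-encode invocation I mean looks like this (a sketch with placeholder filenames, assuming an ffmpeg build with the relevant encoders; NVENC on Nvidia, with VAAPI being what AMD and Intel users typically fall back on under Linux):

            import subprocess

            src, dst = "input.mkv", "output.mkv"  # placeholder filenames

            # Nvidia: hardware H.264 encode via NVENC.
            nvenc = ["ffmpeg", "-y", "-i", src,
                     "-c:v", "h264_nvenc", "-c:a", "copy", dst]

            # AMD/Intel on Linux: hardware H.264 encode via VAAPI instead.
            vaapi = ["ffmpeg", "-y", "-vaapi_device", "/dev/dri/renderD128", "-i", src,
                     "-vf", "format=nv12,hwupload", "-c:v", "h264_vaapi",
                     "-c:a", "copy", dst]

            subprocess.run(nvenc, check=True)  # or vaapi, depending on the GPU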

        Proper competition is sorely needed. Such a shame too, because I’m quite glad to never have to buy an Intel CPU ever again and deal with their ass-backwards ecosystem; it would be cool if AMD could pull off a similar comeback in GPUs.

        • herrvogel@lemmy.world · 4 days ago

          Not being able to play games? Not serious? What the hell are you talking about? Did you compare a 3090 to a bottom-tier ATI Radeon card or something? My RX 6800 was a fucking champ that ran everything I threw at it without a single problem, and with quite satisfactory performance at 1440p. It was most definitely a very competent GPU for gaming.

          ML and NVENC are extra features that not everyone needs or even wants.

          • LainTrain@lemmy.dbzer0.com · 3 days ago (edited)

            Maybe I’m misinformed, but can the RX 6800 do path tracing in CP2077 at 1440p native at ~30 fps, or at least on DLSS Quality or even FSR Quality (in both upscalers’ cases with Ray Reconstruction and no frame generation, of course)?

            Can any AMD GPU do that at all, at a better price-to-performance point?

            Because I was under the impression that basically no AMD cards can do practically any substantial RT of any kind, and that’s why it’s either missing on consoles or borderline unnoticeable, often confined to reflections, where it’s pretty useless.

            I just briefly looked it up, and all I could find was someone on Steam Discussions with a 79xx XTX card talking about old-timey, simple ray tracing being “possible” at 1080p in Cyberpunk. Granted, there were some videos that looked promising, but I steer clear of YT and Reddit.

            And as for features you don’t need: idk, to me, if I have to buy a GPU, I might as well get one that can do more things rather than fewer for the money. I didn’t even play video games much when I got my 3090 years ago, during summer 2021, for like $400 used from CeX, but I’m glad I got something that can, and something that can encode video well, whether it’s recording, editing and rendering gaming footage to share with friends, transcoding media on a server, or doing ML (e.g. for self-hosting Immich).

            If I were to buy a GPU now, I’d look at all the features. Even if I don’t want those things now, I’d of course want them later, and for that, why would you choose an inferior option?

            • lordbritishbusiness@lemmy.world · 3 days ago

              An RX 6800? Yes; in fact I was playing CP2077 with FSR and ray tracing on at 4K on a 6800. It was the first real card AMD had that could, though it did struggle at times.

              Sadly it was replaced after 4 years with an RX 9060 of similar capability but better ray tracing.

              AMD cards are only about 2-3 years behind NVIDIA in a lot of specialised tasks, but they’re trying to keep pace with the behemoth that is NVIDIA’s R&D on a much smaller budget. ROCm works but is held up by compatibility issues with the newer CUDA features.

                • lordbritishbusiness@lemmy.world · 1 day ago

                  Hmm. I’ve decided I don’t like you.

                  It feels like you’re setting an artificially high bar purely to make challenging your assertion impossible. I’m not even sure a 3090 can do path tracing; a 4090 or 5090, maybe.

                  Could an AMD card do it? Yes. An RX 7900 possibly, and a theoretical RX 9090 could, if they bothered to release one.

                  But none of that matters really. Never has.

        • rapchee@lemmy.world · 3 days ago

          lol, “not being able to run games”? Have you heard of a little thing called the Steam Deck?

              • Blackmist@feddit.uk · 3 days ago

                Runs it better than any PC costing a similar amount.

                You can barely get an entry-level GPU for that.

              • survirtual@lemmy.world · 3 days ago

                This is nonsense and, frankly, sounds like guerrilla marketing for nvidia.

                All things considered, I can play any game I want on the Steam Deck, which has an old SoC by today’s standards. A newer AMD GPU can run anything at max settings on a Linux machine.

                So again, either you are grossly misinformed or working for Nvidia to sow gentle doubt. Either way, stop it.

          • LainTrain@lemmy.dbzer0.com · 3 days ago (edited)

            I have a Steam Deck. It’s a portable, and it’s really cool for what it is, but as a main gaming system it can’t exactly compete visually or performance-wise with being able to run path tracing in CP2077, for instance, which I’d say is borderline required to enjoy the game’s visuals.

            It’s like comparing the PS3 and the PSP, or the N64 and the Game Boy Color. Both are cool; one doesn’t replace the other, nor does it have to.

            • rapchee@lemmy.world · 2 days ago (edited)

              The point was about “not being able to game on AMD”.
              Yeah, duh, mobile chips are less performant, but Forza Horizon 4/5 still runs better on my Steam Deck than on my desktop PC with an RTX 2080 in Linux.

        • thelittleblackbird@lemmy.world · 3 days ago

          Sorry, I had to downvote your comment due to the false information. Nothing personal, just keeping the house clean.

            • thelittleblackbird@lemmy.world · 3 days ago

              AMD GPUs are inside the Xbox and PS5, without even taking into account handhelds like the Steam Deck.

              For ML it’s often better to use AMD cards because they tend to have more VRAM, and many, many models can be trained on AMD.

              And as for the transcoding comment, I won’t even bother with it.

              In summary: tell me you don’t have a clue without telling me you don’t have a clue.

              • FreedomAdvocate@lemmy.net.au · 1 day ago

                AMD cards are not really better than Nvidia cards at anything. That’s not being a “shill”; it’s just the truth. They’re cheaper and easier to find in stock, and that’s about it.

                • thelittleblackbird@lemmy.world · 1 day ago (edited)

                  AMD are not better than Nvidia -> sure

                  I prefer Nvidia to AMD for everything -> perfect, it is your opinion and a respectable one

                  With AMD you cannot do AAA gaming, ML, or even transcoding -> simply a lie, nothing more to add

                  And I will ignore the sentence about AMD not being a serious company, because it is too absurd to discuss.

  • HailSeitan@lemmy.world · 3 days ago

    Matt Stoller had a nice writeup recently in his monopoly newsletter BIG about how we got into the current mess. TL;DR: basically financialization (prioritizing stock price over innovation, like at Boeing) and a lack of antitrust enforcement as a previously competitive market got monopolized.

    • SmoothIsFast@lemmy.world · 1 day ago

      Interesting to note that most of those are not chip makers but fabless semiconductor companies that outsource all of their production to GlobalFoundries and TSMC.

  • nialv7@lemmy.world · 3 days ago

    Duopolies are very prevalent in tech: think AMD/Intel, AMD/Nvidia, Windows/macOS, iOS/Android, etc.

    As to why? Idk. Big companies buy up small ones until only one rival is left, so they don’t get sued for being a monopoly? Maybe, but I don’t think that applies to all those cases.

    • scarabic@lemmy.world · 3 days ago

      In tech it’s often a bad thing to have 37 of something. How many phone operating systems can app developers reasonably serve? Does it benefit consumers to have 19 different graphics chip standards?

  • AliasVortex@lemmy.world · 4 days ago

    The short, concise answer is mostly cost. Nvidia, AMD, and Intel are all spending multiple billions of dollars per year on R&D alone. It’s just not a space where someone can invent something in their garage and disrupt the whole industry (like, even if someone were to come out of left field with a revolutionary chip design, they’d need to convince investors that they’d be a better bet than literal trillion-dollar companies).

    • XeroxCool@lemmy.world · 4 days ago

      The question isn’t just about upstarts; it’s asking how we got here. We can’t start “Ovidia” in a garage, but Nvidia did at one point. So where’d everyone else go? What partnerships and preferences put Nvidia on top?

      • HobbitFoot@thelemmy.club · 1 day ago

        In general, tech is an industry with high fixed costs and low costs per unit sold. That kind of pricing structure tends to limit competition.

        Nvidia was founded at a time when outsourcing chip fabrication was common and viable, so all Nvidia had to do was focus on design. After a series of failures and near-bankruptcy, Nvidia was finally able to invent the idea of a GPU and sell it to the marketplace.

        After that, Nvidia bought several companies to round out its patent portfolio and capabilities, remaining the dominant company in an industry it created. The only real competition was from other companies that had previous chip-design experience.

      • despoticruin@lemmy.zip · 4 days ago

        Patents, and the fact that these chips are massively complex designs. We are talking architecture on the complexity level of the Empire State Building, most of which is a blend of proprietary designs developed over decades.

        Nobody is saying you can’t do it in your garage; in fact, it’s easier than ever to start. Let me know how it goes. Look into some of the recent tapeout challenges to get an idea of what you’re proposing people just make in a garage.

        • XeroxCool@lemmy.world · 4 days ago

          You said exactly what the parent comment said and ignored the secondary part of OP’s intent. But thanks?

      • AliasVortex@lemmy.world · 4 days ago (edited)

        I was content to let the other comments address the history, since I’m not particularly well-versed there (and there’s already enough confidently incorrect bullshit in the world). I mostly just wanted to interject on why there aren’t more chip companies, beyond just hand-waving it away as “market consolidation”, which is true but doesn’t take into account that the barrier to entry in this space is less on the scale of opening a sandwich restaurant or boutique clothing store and more on the order of waking up tomorrow and deciding to compete with your local power/water utility.

        The answer also gets kind of fuzzy outside the conventional computer space, where single-board/system-on-a-chip designs are common, stuff like Raspberry Pis or smartphones, since those technically have graphics modules designed by companies like Qualcomm (Snapdragon) or MediaTek. It’s also worth noting that computers have gotten orders of magnitude more complicated compared to the era of starting a tech company in your garage.

        If it helps answer your question: according to Wikipedia, most of the other GPU companies have either been acquired, gone bankrupt, or aren’t competing in the desktop PC market segment.

  • brucethemoose@lemmy.world · 3 days ago (edited)

    It’s even better than that:

    They all come from Taiwan Semiconductor (TSMC).

    There used to be more of a split between many fabs. Then it was TSMC/GlobalFoundries/Samsung Foundry. Then it was TSMC/Samsung Foundry. Now, AFAIK, all GPUs are TSMC, with Nvidia’s RTX 3000 series (excluding the A100) being the last Samsung chips. Even Intel fabs Arc there, as far as I know.

    Hopefully Intel won’t kill Arc, as they are planning to move it back to their fabs.

  • False@lemmy.world · 4 days ago

    Intel never really tried to be a real competitor until a few years ago. 3dfx had market dominance in the 90s but then basically committed suicide. There were a few other smaller manufacturers in the late 90s and early 2000s, but they never had significant market share and couldn’t keep up with the investment required to be competitive.

    • Buffalox@lemmy.world · 4 days ago (edited)

      3dfx had market dominance in the 90s but then basically committed suicide.

      As I remember it, it was Nvidia that killed 3dfx: Nvidia had an absolutely cutthroat development pace, 3dfx simply couldn’t keep up, and they ended up being bought by Nvidia.
      But oh boy, Voodoo graphics were cool when they came out! An absolute revolution in PC gaming.

      • Mereo@piefed.ca · 4 days ago

        We cannot forget that 3dfx went under when they bought STB to manufacture their own video cards instead of letting their board partners do it.

        • Buffalox@lemmy.world · 3 days ago

          Maybe you are right, but I think they did that because they thought it would help them remain competitive: keeping the profit share that would normally go to board vendors, allowing them to sell cheaper while still making money, and so compete better against Nvidia.

          Maybe I remember it wrong, but I think Voodoo was already dying with the Voodoo 2.

        • Buffalox@lemmy.world · 4 days ago

          IDK, I think it was because they couldn’t keep up with Nvidia; I already bought the Voodoo 2 at about half price.
          After that it was basically lights out for Voodoo.

          Intel has somewhat the same problem, I think, because their GPU is reasonably good and, for the customer, it’s a competitive product.
          But for Intel, the GPU chip probably costs three times as much to make as a comparable Nvidia or AMD chip, because Intel needs a die twice as big to be competitive!
          That means Intel is probably not making any profit from its GPU division.
          Same with Voodoo: they simply couldn’t keep up and still make a profit. They had to compete with Nvidia, which quickly surpassed 3dfx, and since Nvidia was better, Voodoo had to be cheaper, but they couldn’t make the cards cheap enough to profit from them.

          It’s not that Voodoo got worse, because obviously they didn’t. But Nvidia had a development cycle that was unheard of at the time. It wasn’t just 3dfx that couldn’t keep up; it was also S3, Matrox and ATI. And ATI was by far the biggest GPU maker at the time. ATI, however, made a strong comeback as the only competitor to Nvidia in mainstream performance desktop graphics and gaming, and was later bought by AMD.

    • Mereo@piefed.ca · 4 days ago (edited)

      3dfx had market dominance in the 90s but then basically committed suicide.

      Very true. They committed suicide when they bought STB so they could manufacture their own video cards. They didn’t just focus on chip R&D; they insisted on manufacturing and marketing their own video cards instead of letting board partners do it.

    • SharkAttak@kbin.melroy.org · 4 days ago

      Yeah, I remember Matrox, PowerVR, Number Nine(?)… But R&D probably became too costly, and despite DirectX leveling the field a bit, some were forced to step back or sell.

  • Part4@infosec.pub · 4 days ago

    Huawei has just started selling a GPU to the public that it made a few years back. It has a lot of VRAM, but it’s old, slow RAM, and it doesn’t have the software infrastructure (drivers etc.) that Nvidia has. So currently it isn’t a great option, but if you look at phones or electric cars, there is every chance they become competitive in a relatively short time. Time will tell.

  • lemming741@lemmy.world · 3 days ago

    You’ve watched Google, Facebook, and Apple do it for the last 20 years. If a good idea is spotted early enough, they buy the whole company before it can make it to market and grow into a threat. It happens in any emerging tech, and you’re watching it happen now in the LLM space. Companies burn cash, waiting for their competitors to make a mistake or run out of money. Then they buy out the struggling company, absorbing any tech it might have, maybe some branding, but more importantly, its customers. Now they can jack up prices once market forces are eliminated.
    If not for the threat of antitrust laws, you would see single-company rule in every single sector. That is the end goal of a company: a monopoly that crushes potential competition and squeezes consumers.
    Railroads, telephone, petroleum, internet, airlines: all ended up as regional monopolies.

  • zxqwas@lemmy.world · 3 days ago

    I’m happy we still have two.

    Short answer: hard to start a new company that can compete. Over the decades all the other companies have done poorly and gone bankrupt or been bought out.

  • bluecat_OwO@lemmy.world · 3 days ago

    Still very distant in the future, but I believe in the power of ARM. Dedicated GPUs are beasts now, but I am rooting for ARM to win this race.

  • Buffalox@lemmy.world · 4 days ago (edited)

    You are not mistaken.

    In the early days of the PC, there were lots of GPU options, as in literally dozens. So the first part of the question is: why did they almost all disappear? The answer is that Windows made it a much more complicated market, with way higher demands on the software side, and many hardware vendors suck at making software. Over time the best combo of hardware and drivers beat out the other high-end manufacturers, so we ended up with just two, and the on-board/on-chip GPU made every low-end third-party GPU next to irrelevant, with very little possibility of making a profit.

    The low-end chips were no longer needed, as they can now be had cheaper and more efficiently as part of the CPU from both AMD and Intel. And since these are the only two CPU options for x86, now that VIA has discontinued the x86 line it acquired from Cyrix, there is no low-end entry point into the PC market for a new GPU maker.
    The natural evolution is to start at the lower end and, if successful, work up. That is not possible in the PC market, which makes entry near impossible except with enormous investments that may never pay off, especially since the PC is a dwindling market.

    As you mention, Intel is dipping its toes in, but despite a pretty big and serious effort, investing a lot to develop a better GPU, and actually delivering a good product at a reasonable price that should be absolutely competitive on paper, its market share is minuscule, because Nvidia and AMD together dominate, already fill the needs of the mid-to-high-end market, and have the brand recognition for graphics.

    It’s not that there aren’t technologies that could possibly compete if scaled up for the PC; those are actually pretty numerous on phones and tablets. But you can’t port them to PC cheaply, because there is no market segment for them to slide into easily.
    It would require major investments to make them hardware-performance competitive at higher scales, plus investments in making good drivers. Intel had a big head start in these respects, already making on-chip graphics that already had drivers. And still they are struggling, despite delivering a good product and despite people screaming for a third option because of high GPU prices.

    This may not be the entire explanation, but I think it’s a very significant part of it.
    The better question IMO is why Intel never became more popular, considering how much people have raved that more competition in the GPU market is required.

    And the explanation for that is:

    but let’s be real, AMD and Nvidia are the only options

    Except Intel actually presented a good alternative, but was never seriously considered by most people for whatever reason.

    Personally I didn’t consider Intel, because I remember earlier attempts by Intel, where Intel quickly left the market again, and I didn’t want a GPU where I’d be left without driver support a year after I purchased it.
    So in my case it was a lack of trust in Intel to stay the course. But every other new maker would have that exact same issue.
    There have been a few attempts in the past from other makers, but they all had performance or driver issues, or both, and left the market quickly. Intel delivered a stellar product by comparison. And if Intel drops out of GPUs again, I think there’s a pretty big risk it may have been our last chance for a third significant mid-to-high-end GPU maker on PC.

    TLDR:

    1. All the old competitors couldn’t cut it on either the hardware or software side, and so they died out.
    2. It’s an insanely expensive market to enter and to stay in, with high risk of never making a profit.

    • jacksilver@lemmy.world · 4 days ago (edited)

      The other thing you didn’t talk about was the size of the market in general.

      As on-board graphics on CPUs became popular, the biggest reason to buy a GPU was games or video processing, which, while significant markets, aren’t huge.

      Over the past couple of decades, GPUs have made headway as the way to do machine learning/AI. Nvidia spent a lot of time and money making this process easier on their GPUs, which led to them not only owning the graphics market but also the much bigger ML/AI market. And I say the AI/ML market is bigger simply because GPUs are being installed in huge quantities in data centers.

      Edit: My point being that the market shrank before GPUs became so critical. To counteract Nvidia’s stranglehold, a lot of big tech companies are creating custom TPUs (tensor processing units), which are just ML/AI-specific chips.

    • Kyrgizion@lemmy.world · 4 days ago

      Region          Q2 2025 growth rate   Key factors
      North America   -0.5%                 Declining consumer demand, tariff concerns
      EMEA            +5.3%                 Stronger-than-usual seasonal demand
      APAC            Flat                  Stabilization after previous declines

      Not exactly a dwindling market, except in the US.

      • Buffalox@lemmy.world · 3 days ago

        You don’t say which market you are describing, but:
        The PC market has been dwindling for decades; the PC gaming market has also been dwindling, with consoles taking a bigger share, and over the past 5-10 years because of high GPU prices that have been wildly unstable.
        Lately prices have returned to more normal levels when accounting for inflation, which could explain a bump this year.
        The USA is an outlier because of tariffs.