Well I am shocked, SHOCKED I say! Well, not that shocked.

  • RedSnt 👓♂️🖥️ · 12 days ago

    It’s like how banks figured out there was more money in catering to the super rich and just shit all over the rest of us peasants. GPU manufacturers that got big because of gamers have now turned their backs on us to cater to the insane “AI” agenda.
    Also, friendly advice: unless you need CUDA cores and you have to upgrade, try avoiding Nvidia.

    • @overload@sopuli.xyz · 25 points · 6 days ago

      Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t selling GPUs, and it’s the only game where I’ve personally had poor performance with my 3070 Ti at 1440p.

      • Robust Mirror · 6 points · 5 days ago

        Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.

  • LostXOR · 56 points · 6 days ago

    For the price of one 5090 you could build 2-3 midrange gaming PCs lol. It’s crazy that anyone would even consider buying it unless they’re rich or actually need it for something important.

    • Oniononon · 24 points · 6 days ago

      And still have your house burn down because it’s really just a 2080 with 9.8 jiggawatts pushed through it.

      There isn’t a single reason to get any of the 50 series, imo; they don’t offer anything. And I say that as a 3D artist for games.

      Edit: never mind, I remember some idiots got roped into 4K for gaming and are now paying the price, just like the marketing wanted them to.

        • Oniononon · 9 points · 5 days ago

          You pay a ton more money for a screen whose PPI is too dense to matter, only to pay a ton more money for a PC that still runs it at a terrible framerate, with lowered settings and fake frames.

          4K is a pure scam.

          • @CybranM@feddit.nu · 1 point · 5 days ago

            Have you tried 4k? The difference is definitely noticeable unless you play on like a 20" screen

            • Oniononon · 1 point · edited · 5 days ago

              Yes, and it’s pointless; what’s most noticeable is the low frame rate and the lowered graphics needed to make the game playable. High fps is more noticeable and more useful. Blind tests confirmed that, even the one LTT did.

              You could argue 2K is solid, but even then the PPI is already so dense that it doesn’t really matter.

              Edit: then again, there is some research showing people perceive fps and PPI differently, so 4K may make sense for some, while for others it’s really an overpriced 2K that no PC can run. Rough PPI numbers below.
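              PPI is just the diagonal pixel count divided by the diagonal size in inches; a quick sketch, with a 27-inch screen assumed purely for illustration:

              ```python
              import math

              def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
                  """Pixels per inch: diagonal resolution over diagonal size."""
                  return math.hypot(width_px, height_px) / diagonal_in

              # Assumed 27-inch desktop monitor at three common resolutions.
              for name, w, h in [("1080p", 1920, 1080), ("1440p/2K", 2560, 1440), ("4K", 3840, 2160)]:
                  print(f"{name} @ 27in: {ppi(w, h, 27.0):.0f} PPI")
              # 1080p    @ 27in: ~82 PPI
              # 1440p/2K @ 27in: ~109 PPI
              # 4K       @ 27in: ~163 PPI
              ```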

              • @CybranM@feddit.nu · 1 point · 5 days ago

                Not arguing fps here lol. Arguing 4K, which you can run at 144 Hz in a lot of games even without a 5090. You failed to mention whether you’d tried 4K, which I assume you haven’t, based on the switch to fps instead of resolution.

              • Robust Mirror · 1 point · 5 days ago

                I play in 1080p, so I can’t comment on 4K, but I can confirm fps doesn’t seem to affect me past 30 fps. I don’t perceive a noticeable difference between 30, 60, and 120 fps. Haven’t played higher than that. I suspect 4K would probably look better to me than higher fps, though. But I’m happy with 30-60 fps and 1080p so…

                • Oniononon · 1 point · 4 days ago

                  I went from 1080p 144 Hz to a 2K 100 Hz ultrawide. I stopped noticing the increased framerate pretty quickly, as the “mouse so smooth” effect wears off fast. But the ultrawide’s huge FOV is a massive plus. I don’t notice the resolution increase at all, beyond lower frames and more text on screen in docs.

                  4K on laptops is just 1080p with extra battery drain and worse performance.

        • @Damage@feddit.it · 3 points · 6 days ago

          Somehow 4K resolution got a bad rep in the computing world, with people opposing it for both play and productivity.

          “You can’t see the difference from 50 cm away!” or something like that. Must be bad eyesight, I guess.

          • @GrindingGears@lemmy.ca · 11 points · 6 days ago

            It’s just kind of unnecessary. Gaming in 1440p on something the size of your average computer monitor, hell, even just good ol’ 1080p HD, is more than sufficient. I mean, from 1080p to 4K, sure, there’s a difference, but from 1440p it’s a lot harder to tell. Nobody cares about your mud-puddle reflections cranking along at 120 fps. At least not the normies.

            Putting on my dinosaur hat for a second: I spent the first decade of my life gaming in 8/16-bit and 4-color CGA, and I’ve spent the last thirty years and God only knows how much money trying to replicate those experiences.

            • @Damage@feddit.it · 5 points · 5 days ago

              I mean, I play at 1440p and I think it’s fine… Well, it’s 3440x1440; the problem is I can still see the pixels, and my desk is quite deep. Do I NEED 4K? No. Would I prefer to have it? Hell yes, but not enough to spend the huge amounts of money that are damaging an already unrealistic market.

          • @BCsven@lemmy.ca · 1 point · 5 days ago

            Does it really help gameplay on the average monitor? If it’s a fast-paced game, I’m not even paying attention to pixels.

    • Lord Wiggle · 10 points · 6 days ago

      > unless they’re rich or actually need it for something important

      Fucking youtubers and crypto miners.

    • @Grimtuck@lemmy.world · 3 points · 6 days ago

      I bought a secondhand 3090 for £750 when the 40 series came out. I really don’t need to upgrade. I can even run the bigger AI models locally, since I have a huge amount of VRAM.

      Games run great and look great. Why would I upgrade?

      I’m waiting to see if Intel or AMD come out with something awesome over the next few years. I’m in no rush.

    • @Murvel@lemm.ee · 3 points · 6 days ago

      But then, the Nvidia xx90 series has never been for the average consumer, and I don’t know what gave you that idea.

  • @bluesheep@lemm.ee · 51 points · 6 days ago

    > Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090

    Yeah no shit, what a weird fucking take

    • SharkAttak · 16 points · 5 days ago

      But why spend to “eat food” when you can have RAYTRACING!!!2

  • simple · 48 points · 5 days ago

    Unfortunately gamers aren’t the real target audience for new GPUs, it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock as LLM enthusiasts and small companies use them for AI.

    • @imetators@lemmy.dbzer0.com · 8 points · 5 days ago

      Ex-fucking-actly!

      Ahaha, “gamers are skipping.” Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know major tech went in the AI direction, disregarding whether the average Joe wants AI or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.

    • @brucethemoose@lemmy.world · 3 points · edited · 4 days ago

      The 5090 is actually kinda terrible for AI. It’s too expensive. It only just got support in PyTorch, and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.

      The 4090 is… mediocre, because it’s expensive for 24 GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.

      Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s, or just use APIs.

      The server cards DO eat up TSMC capacity, but insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.

  • @sp3ctr4l@lemmy.dbzer0.com · 44 points · edited · 6 days ago

    In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.

    (Lowest price I can find)

    … That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago… or even a reasonably high-end PC from right now.

    The entire move to the realtime raytracing paradigm has enabled AAA game devs to get very sloppy with development, not really bothering to optimize lighting or textures… which in turn necessitated the invention of intelligent temporal frame upscaling and frame generation. The whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

    This reality is a farce.

    Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.

    RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X, but about half the wattage draw)… so far my pretax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650-watt power supply being all you’d need (rough sketch below)… potentially with enough room to also add some extra internal HDD storage drives, i.e., you’ve got leftover wattage headroom.
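    For anyone double-checking that 650 W figure, the back-of-the-envelope version looks like this; the component wattages are my own ballpark assumptions, not measured numbers:

    ```python
    # Rough PSU sizing: sum worst-case component draws, then add headroom.
    # All wattages below are assumed ballpark figures, not manufacturer specs.
    components = {
        "RX 9070 (board power)": 220,
        "BD795i SE laptop CPU": 65,
        "motherboard + RAM + NVMe": 50,
        "fans, USB, misc.": 25,
    }

    total = sum(components.values())
    with_headroom = total * 1.5  # rule of thumb: ~50% headroom for transient spikes

    print(f"Estimated peak draw: {total} W")            # ~360 W
    print(f"With 50% headroom: {with_headroom:.0f} W")  # ~540 W, so a 650 W unit fits
    ```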

    If you want to go a bit over the $1500 mark, you could fit this all in a console-sized ITX case.

    That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps in almost all modern games at ultra settings in 1440p, though you will have to futz around with intelligent upscaling and frame gen if you also want realtime raytracing at similar framerates, and realistically, you’ll probably wait another quarter or two for AMD driver support and FSR 4 to mature and get properly implemented in said games.

    Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m making a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.

    • @CheeseNoodle@lemmy.world · 20 points · 6 days ago

      Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still costs less than a single RTX 5090, and on average it draws less power too.

      • @sp3ctr4l@lemmy.dbzer0.com · 9 points · edited · 5 days ago

        Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.

        I’m guessing you could get all of that to work with a 750 W PSU, or 850 W if you also want a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing in 4K.

        Does that sound about right?

        Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:

        Consoles cannot really do what they claim to do at 4K… at actual 4K.

        They use checkerboard upscaling, so they’re basically running at 2K and scaling up; it’s even less than 2K in demanding raytraced games, because they’re using FSR or DLSS on top. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show gamers real graphics settings menus, so they don’t know that. (Quick pixel math below.)
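        Counting shaded pixels only (ignoring the reconstruction filter itself), the numbers look like this:

        ```python
        # Pixels actually shaded per frame: native vs. checkerboard rendering.
        native_4k = 3840 * 2160      # 8,294,400 px
        native_2k = 2560 * 1440      # 3,686,400 px
        checker_4k = native_4k // 2  # checkerboarding shades ~half the 4K grid

        print(f"native 4K:       {native_4k:>9,} px")
        print(f"checkerboard 4K: {checker_4k:>9,} px (~{checker_4k / native_4k:.0%} of native)")
        print(f"native 1440p:    {native_2k:>9,} px")
        # Checkerboard 4K shades barely more pixels than native 1440p,
        # which is why 'basically 2K scaled up' is a fair description.
        ```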

        Maybe, maybe we could have focused on just perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and, more recently, being told that 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they offer no physically perceptible gains.

        • @CheeseNoodle@lemmy.world · 6 points · 5 days ago

          1000 W PSU, for the theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500 W.

        • @CheeseNoodle@lemmy.world · 2 points · edited · 5 days ago

          I tried Mint and Ubuntu, but Linux dies a horrific death trying to run newly released hardware, so I ended up on Ghost Spectre.
          (I also assume you’re being sarcastic, but I’m still salty about wasting a week trying various pieces of advice to make Linux goddamn work.)

          • bitwolf · 5 points · 5 days ago

            Level1Techs had relevant guidance:

            Kernel 6.14 or greater, Mesa 25.1 or greater.

            I don’t think Ubuntu and Mint have those yet, hence your difficult time. Quick check below.
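            If you want to verify, here’s a rough check; just a sketch, and it assumes glxinfo (from mesa-utils) is installed:

            ```python
            import platform
            import re
            import subprocess

            def major_minor(s: str) -> tuple:
                """'6.14.0-generic' -> (6, 14); (0, 0) if unparseable."""
                m = re.match(r"(\d+)\.(\d+)", s)
                return (int(m.group(1)), int(m.group(2))) if m else (0, 0)

            # Kernel version straight from the running system.
            kernel = platform.release()
            print("kernel:", kernel, "OK" if major_minor(kernel) >= (6, 14) else "too old")

            # Mesa's version appears in glxinfo's brief (-B) output.
            out = subprocess.run(["glxinfo", "-B"], capture_output=True, text=True).stdout
            m = re.search(r"Mesa (\d+\.\d+)", out)
            if m:
                print("mesa:", m.group(1), "OK" if major_minor(m.group(1)) >= (25, 1) else "too old")
            ```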

    • @CybranM@feddit.nu · -1 points · 5 days ago

      > The entire move to the realtime raytracing paradigm has enabled AAA game devs to get very sloppy with development, not really bothering to optimize lighting or textures

      You clearly don’t know what you’re talking about here. Ray tracing has nothing to do with textures, and very few games force you to use RT. What is “allowing” devs to skimp on optimization (which is also questionable; older games weren’t perfect either) is DLSS and other dynamic-resolution + upscaling tech.

      • @lordbritishbusiness@lemmy.world · 5 points · 5 days ago

        Doom: The Dark Ages is possibly what they’re referring to. id skipped baked lighting in favour of having ray tracing do it.

        Bethesda Studios also has a tendency to use HD textures on features like grass and terrain, which could safely be low-res.

        There is a fair bit of inefficient code floating around because optimisation is considered more expensive than throwing more hardware at the problem, and not just in games. (Bonus points if you outsource the optimisation to someone else’s hardware or to the modding community.)

        • @sp3ctr4l@lemmy.dbzer0.com · 3 points · 5 days ago

          That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…

          idTech 8 seems to be the first engine that literally requires RT for its entire render pipeline to work.

          They could theoretically build another version of it off the Vulkan base, to let you turn RT off… but that would likely be a massive amount of work.

          On the bright side… at least the idTech engines are actually well coded; they put a lot of time into making the engine very good.

          I didn’t follow the marketing for Doom: The Dark Ages, but it would have been really shitty if they did not make clear ‘you need a GPU with RT cores’.

          On the other end of the engine spectrum:

          Bethesda… yeah, they have entirely lost control of their engine; it is a mangled mess of nonsense. The latest Oblivion remaster just uses UE to render things slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.

          Compare that to, oh I dunno, the Source engine.

          Go play Titanfall 2. It’s a 10-year-old game now, built on a modified version of the Portal 2 Source engine.

          Still looks great, runs very efficiently, can scale down to older hardware.

          Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.

          Looks great, runs efficiently.

          None of them use RT.

          Because you don’t need to, if you take the time to actually optimize both your engine and game design.

      • @sp3ctr4l@lemmy.dbzer0.com · 5 points · 5 days ago

        I meant they also just don’t bother to optimize texture sizes; I didn’t mean to imply textures are directly related to ray-tracing issues.

        Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.

        Sure, it’s not outright forced on in that many games… but TAA often is, because no one can run raytracing without temporal intelligent upscaling and frame gen…

        …and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS/FSR implementations, and don’t bother to recode the TAA to expose the motion vectors through an optional API that doesn’t actually do AA…

        …and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.

        So, to summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.

        That being said: ray tracing really only makes a significant visual difference in many (not all, but many) situations… if you have very high-res textures.

        If you don’t, older light rendering methods work almost as well, and run much, much faster.

        Ray tracing involves… you know, light rays bouncing off of models, with textures on them.

        Like… if you have a car with a glossy finish that reflects the entire scene around it in its paint… well, if the reflection map being added to the base car texture is very low-res, generated from a world of low-res textures… you might as well just use the old cube-map method, or other methods, and not bother turning every reflective surface into a ray-traced mirror.

        Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher-res textures on everything being lit.

        I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.

  • @chunes@lemmy.world · 30 points · 5 days ago

    I stopped maintaining an AAA-capable rig in 2016. I’ve been playing indies since, and I haven’t felt left out whatsoever.

    • @MotoAsh@lemmy.world · 7 points · 5 days ago

      Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…

      • @Honytawk@feddit.nl · 4 points · 5 days ago

        The majority, sure, but there are some gems.

        Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War… for example.

        You can always wait a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.

      • JustEnoughDucks · 3 points · edited · 5 days ago

        It’s funny, because often they aren’t prettier. Well-optimized and well-made games from 5 or even 10 years ago often look on par with, or better than, the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and some others), and the disk size is still 10x what it was. They are just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and antialiasing to make up for the mediocre quality.

        • @MotoAsh@lemmy.world · 1 point · 4 days ago

          The irony is that it is optimized in several notable cases, like Cyberpunk 2077 and most major UE5-based games. It’s just that all the mipmap levels, from distant all the way to 4K up close, really add up when the game actually has a decent amount of content.

          I wonder how many people really run games at settings that require the highest detail. I bet a lot of people would appreciate half the download size, or more, just to leave them out and disable ‘ultra’ settings. Rough numbers below.
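          The mip math backs that up; a rough sketch with assumed uncompressed RGBA sizes (real games ship compressed formats, but the ratios hold):

          ```python
          # Disk size of a full mip chain vs. one that drops the top (4K) level.
          def mip_chain_bytes(top_res: int, bytes_per_px: int = 4) -> int:
              total, res = 0, top_res
              while res >= 1:
                  total += res * res * bytes_per_px
                  res //= 2
              return total

          full = mip_chain_bytes(4096)     # all levels up to 4096x4096
          trimmed = mip_chain_bytes(2048)  # same chain minus the highest-detail level

          print(f"full chain:  {full / 2**20:.1f} MiB")     # ~85.3 MiB
          print(f"without top: {trimmed / 2**20:.1f} MiB")  # ~21.3 MiB, ~75% smaller
          ```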

    • @tea@lemmy.today · 6 points · 5 days ago

      Indies are great. I can play AAA titles but I don’t really ever… It seems like indies are where the folks with the most creativity are focusing their energy anyway.

      • Bakkoda · 5 points · 5 days ago

        I had lost all interest in games for a while; the desktop just ended up being for tinkering in the homelab. The Steam Deck has been so great for falling in love with gaming again.

  • candyman337 · 27 points · edited · 5 days ago

    It’s just because I’m not impressed. The raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues, and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing, then switching to team red.

  • Ulrich · 26 points · edited · 6 days ago

    I think the Steam Deck can offer some perspective. If you look at the top games on SD, it’s like Baldur’s Gate, Elden Ring, Cyberpunk, etc., all games that run REALLY poorly on it. Gamers don’t care that much.

  • JackbyDev · 17 points · 5 days ago

    Uhhh, I went from a Radeon 1090 (or whatever they’re called; it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal not to buy a GPU every year.

    • @46_and_2@lemmy.world · 1 point · edited · 4 days ago

      As long as you make an upgrade that’s equivalent to or better than the current console generation, you’re basically good to go until the next generation of consoles comes.

      • JackbyDev · 1 point · 4 days ago

        I don’t really care whether my current graphics are better or worse than the current console generation; it was just an illustration comparing PC gaming to console gaming.

  • @gravitas_deficiency@sh.itjust.works · 14 points · 4 days ago

    Nvidia doesn’t really care about the high-end gamer demographic nearly as much as they used to, because it’s no longer their bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer + enthusiast market.

    So my next card is probably gonna be an RX 9070XT.

    • @ameancow@lemmy.world · 4 points · edited · 4 days ago

      Even the RX 9070 is running around $900 USD; I cannot fathom affording even state-of-the-art gaming from years ago at this point. I am still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat: either not even considering upgrading their PC, or playing their kid’s console games.

      Every year we say “gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private-equity predator), and the prices of just living and eating are insane. So at this rate, a lot of us might start reading again.

      • @jacksilver@lemmy.world · 2 points · 4 days ago

        It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than a GPU alone, it’ll be more tempting.