• EvilZ@thelemmy.club · 15 hours ago

    I actually have to build a new CPU… The last time I built mine was in 2013 with an Asus i7, which lasted quite a while until the whole Windows 11… non-compatibility…

    Now to look at socket types and see if my old case can take it, and yada yada…

    I don’t want to go the laptop route since I still prefer desktops to laptops for gaming…

        • stinky@redlemmy.com · 7 hours ago

          sorry I’m being a douche

          “cpu” sometimes refers to the processor chip which goes on the motherboard

          • EvilZ@thelemmy.club · 7 hours ago

            Oh man… I’m so tired I didn’t even see my mistake 😂 You’re not a douche, you’re just calling out a blunder on my part lol. Thanks! I corrected it.

            But yeah… I’m looking at the specials for motherboards and CPUs and… wow, it’s going to be a pain to rebuild a computer… since I usually like doing it myself.

            • stinky@redlemmy.com · 6 hours ago

              I had one built by a computer repair place when I was in college - they did great and it had a fun case with lights

              These days I’m too far out of the loop to get good prices or anything so I bought a laptop from Acer, here is the link

              They offered a payment plan, 3 payments over 3 months with 0% interest

  • johannesvanderwhales@lemmy.world · 2 days ago

    Power consumption is part of the equation now too. You’ll often see newer generation hardware that has comparable performance to a last gen model but is a lot more power efficient.

    • eyeon@lemmy.world · 2 days ago

      Or you’ll see something equally efficient and equally performant at the same power levels… except newer gens or upgraded SKUs are allowed to pull more power.

  • arc@lemm.ee · 2 days ago

    I occasionally “refresh” my PC with a new board, CPU, etc. I never buy the top-of-the-line stuff, and quite honestly there is little reason to. Games are designed to play perfectly well on mid-range computers, even if you have to turn off some graphics option that enables a slight improvement in image quality.

      • kerf@lemmy.world · 2 days ago

        For many games you can set the graphics rendering to, for example, 1080p but run the whole game at 4K, so text, menus and so on are super crisp while the game still runs very light. But maybe it’s good advice to never even start, because I can’t imagine going back to 1080p after using 2K and 4K screens.
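
        For reference, rendering at 1080p on a 4K panel is a 50% render scale per axis (1920/3840), so the game shades only about a quarter of the pixels (roughly 2.1 million instead of 8.3 million) while the UI stays at native resolution.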

  • kamen@lemmy.world · 2 days ago

    Naming conventions are somewhat consistent; it’s the pricing that has gotten a bit out of hand.

    • Possibly linux@lemmy.zip · 1 day ago

      If you are blindly renting things without running the numbers, you have bigger issues.

      Always read the fine print and do the long-term calculations.

    • arc@lemm.ee · 2 days ago

      I saw a video on Gamers Nexus about how shitty a company they are. Hopefully word spreads amongst gamers & builders that they’re no good and they should be avoided.

      • fishbone@lemmy.dbzer0.com · 2 days ago

        What’s the deal with them? The only NZXT component I’ve had is my current case, which has awful airflow (an older model of the H710, I think, bought about 5 years ago).

        • boonhet@lemm.ee · 15 hours ago

          Apparently they very recently got acquired or invested in and are probably looking to increase profits tenfold in under a year so the company can be dumped before it all crashes.

        • ipkpjersi@lemmy.ml · 2 days ago

          Apparently their PC rental program is a worse value than illegal loans that are likely mafia-backed.

  • MonkderVierte@lemmy.ml · 2 days ago (edited)

    Meanwhile the data I care about, efficiency, is not readily available. I’m not gonna put a 350-watt GPU in the 10-liter case if I can get the same performance at 250 watts.
    At least Tom’s Hardware now includes efficiency in its tests for newer cards.
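
    To illustrate what I mean by efficiency, with made-up numbers: a card averaging 100 fps at 250 watts works out to 0.40 fps per watt, while one doing 110 fps at 350 watts is only about 0.31 fps per watt, so the nominally faster card is the less efficient one.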

    • addie@feddit.uk · 2 days ago

      Tell me about it. The numbers that I’m interested in - “decibels under full load”, “temperature at full load” - might as well not exist. Will I be able to hear myself think when I’m using this component for work? Will this GPU cook all of my hard drives, or can it vent the heat out the back sufficiently?

      • Zanz@lemmy.ml · 14 hours ago

        Temperature is meaningless unless you want OC headroom. A watt dumped into your room is the same no matter what temperature the part runs at.

        • addie@feddit.uk · 14 hours ago

          That’s not correct, I’m afraid.

          Thermal expansion is proportional to the temperature change; it’s quite significant for ye olde spinning-rust hard drives, but the mechanical stress affects all parts in a system, especially in a gaming machine that’s not run 24/7 and therefore goes through thermal cycling. Mechanical strength also decreases with increasing temperature, which makes it worse.

          The second law of thermodynamics says that heat only moves spontaneously from hotter to colder. A 60° bath can melt more ice than a 90° cup of coffee - it contains more heat - but it can’t raise the temperature of anything above 60°, which the coffee could. A 350 W graphics card at 20° couldn’t raise your room above that temperature, but a 350 W graphics card at 90° could. (The “runs colder” card would presumably have big fans to move the heat away.)
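
          As a rough back-of-the-envelope check (the 150 L bath and 0.3 L cup are made-up illustrative figures), using Q = m·c·ΔT relative to 0°: the bath holds about 150 kg × 4.19 kJ/(kg·K) × 60 K ≈ 38,000 kJ, while the coffee holds about 0.3 kg × 4.19 kJ/(kg·K) × 90 K ≈ 113 kJ - hundreds of times less - yet neither can push anything above its own temperature.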

      • Fizz@lemmy.nz · 23 hours ago

        I wish this data was more available. I upgraded my GPU to a 6800 XT and it’s so loud. I can’t enjoy sitting at my desk without hearing a loud whine and a bunch of other annoying noises. It’s probably because the card is second-hand, but still.

  • KingOfTheCouch@lemmy.ca · 2 days ago

    A thousand times this. For actual builders who care about the nuance it all probably makes sense, but then there’s me over here looking at pre-builts, wondering why the fuck two seemingly identical machines have a $500 difference between them.

    I’m spending so much time poring through spec sheets to find “oh, the non-Z version discombobulator means this cheaper one is gonna be trash in three years when I can afford to upgrade to a 6megadong tri-actor unit”.

    I’m in this weird state of being too cheap to buy a Mac but unable to be arsed to build my own.

  • lorty@lemmy.ml · 2 days ago

    Just go here and check the charts for the kind of work you want the PC to do. If one looks promising you can check specific reviews on YouTube.

    For gaming, the absolute best CPU/GPU combo currently is the 9800X3D and an RTX 4090, if you don’t have a budget.

    Yes, the part naming is confusing, but it’s intentional.

    • Artyom@lemm.ee · 2 days ago (edited)

      Make sure to get your 5900X3D with your 7900XTX. Note that one is a CPU and the other is a GPU. For extra fun, their numbers should eventually overlap, given their respective incrementation schemes. The 5900X3D is the successor to the 5900XD, which is a major step down in performance even though it has more cores.

      I’m gonna give this award to Intel, which has increased the numbers on their CPU line by 1000 every generation since before the 2008 housing crash.

    • VeganCheesecake@lemmy.blahaj.zone · 2 days ago

      Just ordered another CPU from them. Downside is that there isn’t any modern AMD desktop platform that works with coreboot, which seems to be the only workable way to deactivate the Management Engine/Platform Security Processor after boot.

      I was really considering swapping to Intel for that, but I got a good deal on a Ryzen 9 that fits in my socket, so…

      • lorty@lemmy.ml · 2 days ago

        The only things you can realistically take from the naming conventions are the relative generation and which price/performance bracket the part targets. Assuming more than that is just a mistake.

      • Valmond@lemmy.world · 2 days ago

        Isn’t it still “higher is better” with AMD? With the obvious X or “M” suffixes, but usually the price reflects the specs when the numbers are the same.

    • qyron@sopuli.xyz · 2 days ago

      Honestly my preferred manufacturer since I started putting together my own machines.

    • Fizz@lemmy.nz · 2 days ago

      UserBenchmark is a terrible site. It’s a shame it shows up first in the search results.

    • Dudewitbow@lemmy.zip · 2 days ago

      UserBenchmark is so biased against AMD that it’s actually banned from /r/Intel. Absolutely do NOT use UserBenchmark.

    • lorty@lemmy.ml · 2 days ago

      They fudge their criteria to make Intel look good and AMD look bad. Do not use this site.