But is it game over for 8K on the PC, too?

    • how_we_burned@lemmy.zip · 12 hours ago

      I bought a bunch of sub-$400 (Australian dollaroos, at that) 4k monitors (Samsung, ViewSonic, etc). They’re not the greatest monitors, but this was like 5 years ago. Some were on special.

      You can play a lot of older games in 4k and it makes a big difference.

      4k gaming is more than accessible, and far better in my view than high refresh rate gaming at 1080p or 1440p.

      • SponTen@lemmy.zip · 10 hours ago

        I think it’s just “to each their own”.

        My wife and I have a 4K TV and an Apple TV. Initially, the Apple TV was automatically tuned to 4K, but we were having glitches with the picture every now and then. After a bit of troubleshooting, I wondered if it might be either the cable or the TV struggling, given that the cable is old and the TV, while it does frame generation, is also old and thus underpowered.

        Lo and behold, manually tuning the Apple TV down to 1080p solved the issue.

        I told my wife and we tested back and forth between 4K and 1080p, with frame gen on vs off for each. Neither of us could tell the difference between 4K and 1080p when sitting on the couch (though we could if we went up close to the TV), but both of us immediately noticed and preferred frame gen on. And yes, we’ve had our eyes tested and have at least decent vision.

        For me, if the downsides to 4K were much lower then of course I’d turn it on and never look back. But we don’t notice it on the TV, and while I’d probably notice it on my PC monitor, upgrading it and my gaming rigs would cost many thousands (also dollarydoos for me) for what would be a pretty mild improvement.

  • scintilla@crust.piefed.social · 24 hours ago

    We’ve reached the point where FPS has a far bigger impact on how a game feels than pixel count does, imo. The visual difference between 4k and 8k isn’t anywhere near as big as the drop in performance.

    • UnfortunateShort@lemmy.world · 24 hours ago

      We’re past that point as well. 4k @ 240 Hz is so good that most people won’t be able to tell the difference from an 8k, 480 Hz monitor, even if they pay special attention. Probably not even in A/B testing.

      There is still room for improvement in the area of HDR, but monitors are almost as good as they will ever get.

    • Technus@lemmy.zip · 23 hours ago

      It’s because we’re at the limits of the human visual system. The difference in pixel pitch between 4k and 8k at the distances we watch TV is literally imperceptible.

      It also doesn’t help that there’s not much content authored and distributed for higher resolutions. It’s dramatically more expensive to produce, store, and deliver (8k is four times the pixels of 4k).

      Home Internet connections on average aren’t any better than they were ten years ago, either, at least not in the US. I doubt a lot of them can even support 8k streaming, let alone with anyone else using it at the same time.

        • Technus@lemmy.zip · 10 hours ago

          Yeah but we’re talking diminishing returns here. Doubling the resolution to 8k makes about as much sense as doubling refresh rates to 480 Hz. At that point it’s going to be mostly dependent on the individual, and likely heavily subject to the placebo effect.

          By my math, a 55" 8k screen has pixels that are about 0.0062" (6.2 thou) wide.

          At ten feet, each pixel subtends an angle of about 0.003 degrees, or roughly 0.18 arcminutes.

          There’s obviously a lot of variation and it depends on exactly what you’re measuring, but normal (20/20) visual acuity only resolves details down to about 1 arcminute, maybe 2 on a generous test, so those pixels are well below anything the eye can pick out.
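
          Here’s a quick Python sketch of that viewing-distance math, for anyone who wants to check the numbers (it assumes a 16:9 panel; the helper name is just for illustration):

          ```python
          import math

          def pixel_angle_arcmin(diagonal_in, horizontal_px, distance_in):
              """Angle one pixel subtends at the eye, in arcminutes."""
              # width of a 16:9 panel, from its diagonal
              width_in = diagonal_in * 16 / math.hypot(16, 9)
              pitch_in = width_in / horizontal_px  # pixel pitch in inches
              return math.degrees(math.atan(pitch_in / distance_in)) * 60

          # 55" panel viewed from ten feet (120")
          for name, px in [("1080p", 1920), ("4k", 3840), ("8k", 7680)]:
              print(name, round(pixel_angle_arcmin(55, px, 120), 2), "arcmin")
          # 1080p 0.72, 4k 0.36, 8k 0.18 -- all at or below the ~1 arcminute
          # that 20/20 vision can resolve
          ```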

    • mrnobody@reddthat.com · 23 hours ago

      To add… it would only matter on large-format displays anyway. Pixel density is only going to matter so much.

      I remember when Sharp put out their Aquos 70" FHD TV and I thought, “eww, so grainy”! But now I’ve got an 85" UHD with the same pixel density as a ~42" FHD, which helps with clarity since my viewing distance hasn’t really changed (~10 ft).

      FPS is great and all, but not when most content is 24-60 fps. 120 Hz is an awesome sweet spot for 24 fps content since that’s exactly 5 refreshes per frame.
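
      As a minimal sketch of why that cadence works out, just checking divisibility:

      ```python
      # which refresh rates show common content frame rates with a whole
      # number of refreshes per frame (i.e. no pulldown judder)?
      for hz in [60, 120, 144, 240]:
          fits = [fps for fps in [24, 25, 30, 60] if hz % fps == 0]
          print(f"{hz} Hz evenly fits: {fits}")
      # 120 Hz fits 24, 30, and 60 fps (24 fps -> exactly 5 refreshes per
      # frame), while 60 Hz needs 3:2 pulldown for 24 fps content
      ```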

      IMO UHD still has room for growth and adoption before another tech hits. Not to mention the financial strain everyone’s under due to the fucking billionaire squeeze… And they wonder why people are tight on money?! Fucking idiots!

      • UnspecificGravity@piefed.social · 22 hours ago

        I doubt the streaming model is going to support 8k content anytime soon. Actual 4k is already more data than anyone wants to be pushing around every time they watch something, to the point that what most people actually watch as “4k” in streaming is at bitrates that make it almost indistinguishable from 1080p.

        • mrnobody@reddthat.com · 20 hours ago

          Well, yes and no. h.265 (HEVC) made UHD streaming far more workable, to an extent: roughly half the bandwidth of h.264 for the same quality, but 4x the pixels, so you only go up about 2x in bandwidth.

          Now we have AV1 and h.266 (VVC), which need adoption first before we can really push 8K content. Again, not 100% accurate, but call it 3-5x the bandwidth of 1080p h.264 for ~16x the pixels.
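
          As a back-of-the-envelope sketch (the efficiency ratios are the loose figures above, not measured benchmarks):

          ```python
          # bandwidth relative to 1080p in h.264, per the rough ratios above
          PIXELS_1080P = 1920 * 1080

          def relative_bandwidth(efficiency_vs_h264, width, height):
              return efficiency_vs_h264 * (width * height) / PIXELS_1080P

          print("4k HEVC:", relative_bandwidth(0.5, 3840, 2160))  # 0.5 * 4  = 2.0x
          print("8k VVC :", relative_bandwidth(0.3, 7680, 4320))  # 0.3 * 16 = 4.8x
          ```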

          We’ve come a long way!

        • mrnobody@reddthat.com · 20 hours ago

          That’s kind of what I’m getting at. Once you hit a certain size, it only makes sense to have a certain resolution. I know jumping from 65" to 85" made all my Plex content “blurry” because the quality/bitrate wasn’t good enough. Re-ripping my BDs and 4K BDs with h.265 at 12-15 GB/hr per UHD file was way better!
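
          For reference, converting those rip sizes to an average bitrate (assuming decimal gigabytes):

          ```python
          # GB per hour -> average megabits per second
          def gb_per_hour_to_mbps(gb_per_hour):
              return gb_per_hour * 8 * 1000 / 3600  # GB -> Gb -> Mb, over 3600 s

          print(gb_per_hour_to_mbps(12))  # ~26.7 Mbps
          print(gb_per_hour_to_mbps(15))  # ~33.3 Mbps
          ```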

          Idk what 8K looks like, but for those new 98"+ displays I wouldn’t go any bigger unless it’s 8K. I’d say 42-50" max for FHD and ~85" max for UHD. You can’t really sit much further back in a living room, and sitting that close I’d want the density. Plus, it’d need a faster refresh rate to not look bad with motion sweeping across that much surface area.

          I’m just excited for PeLED or PeNC (Perovskite LED / Nano Crystal). 😎🤯 sorry, off topic…

  • dil@piefed.zip · 23 hours ago

    In regular use I prefer 4k, but in games I legit can’t tell the difference between 4k and 1440p. I can tell the difference between 60 and 120 fps; 180, not really.

  • Tim_Bisley@piefed.social · 24 hours ago

    Still on a 1080p plasma TV. Those old viewing-distance charts showing whether you can tell the difference in quality say that at my distance I’d be fine with 720p. Do people sit inches from their 80 inch TVs?

    • roofuskit@lemmy.world · 21 hours ago

      I can 100% tell the difference between 1080p and 720p. I can tell between 1080p and UHD as well, but I honestly think that has more to do with the size of the compression artifacts relative to the image.

      • jacksilver@lemmy.world · 19 hours ago

        I mean, if you have compression artifacts, wouldn’t that mean the codec/delivery of the content is the issue and not the resolution?

        I’m pretty sure that most 4k content isn’t actually 4k (especially when streaming). I’d link a source talking about it, but they’re all ad garbage.

  • commander@lemmy.world · 20 hours ago (edited)

    PC gaming should head towards 21:9 for ubiquitous support in games: 1680x720, 1920x800, 2560x1080, 3440x1440, …

    Also OLED or higher-density dimming zones, plus full DCI-P3 coverage. At that point color reproduction and brightness highlights will also be hitting diminishing returns, and it’ll be on to VR/head-mounted displays, where density and brightness/contrast improvements will have more to show off.

    I early adopted 3840x2160 way back, then went with a no-name $200 3440x1440 monitor in 2024, and that was a way better upgrade than 1080p to 2160p was. I’d take 2560x1080 over 3840x2160. 8k has no relevance until it’s the best value at up to $1000 for a 65" TV.
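
    A quick check shows those “21:9” modes are really a loose family of ratios, not one exact shape:

    ```python
    for w, h in [(1680, 720), (1920, 800), (2560, 1080), (3440, 1440)]:
        print(f"{w}x{h} -> {w / h:.3f}:1")
    # 2.333, 2.400, 2.370, 2.389 -- clustered around 2.33-2.40:1
    # (21:9 itself is exactly 2.333:1)
    ```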

    • mrnobody@reddthat.com · 16 hours ago

      Check out the next evolution, PeLED or PeNC: Perovskite LED or Perovskite Nano Crystal.

      Should outlast LED 3x+, with better brightness, better contrast, way better refresh rates with less ghosting, smaller pixels/higher density, etc.

  • James R Kirk@startrek.website · 23 hours ago

    I would like to see 8K for movies in cinema, especially for remasters/digitizations from existing film negatives and archival purposes.

    While I wouldn’t say no to it, 8K on a TV under 75" is not going to add much value for the average consumer.

  • nonentity@sh.itjust.works · 21 hours ago (edited)

    4320p (8k) video makes as much sense as 96kHz audio. Both have a legitimate role in capture and creation, but a vanishingly minuscule role at the point of playback and consumption.

  • dazaroo@lemmygrad.ml · 24 hours ago

    My eyesight isn’t perfect, but I honestly struggle to tell the difference between a 1440p monitor and a 4K television, so I’d much rather we stop the resolution nonsense and get back to tech that’s at least mildly more interesting, like Q-dot or 3D.