• Guy Ingonito · 17 points · 1 year ago

    Televisions are one of the few things that have gotten cheaper and better these last 20 years. Treat yourself and upgrade.

    • @[email protected] · 9 points · 1 year ago

      But be careful of the “smart” ones. If you have a “dumb” one that is working fine, keep it. I changed mine last year and I don’t like the new “smart” one. IDGAF about Netflix and Amazon Prime buttons or apps. And now I’m stuck with a TV that boots. All I want is to use the HDMI input, but the TV has to be “on” all the time because it runs Android. So if I unplug the TV, it has to boot an entire operating system before it can show the HDMI input.

      I don’t use any “smart” feature and I would very much have preferred to buy a “dumb” TV but “smart” ones are actually cheaper now.

      Same for my parents. They use OTA with an antenna, and their new smart TV has to boot into tuner mode instead of just… showing TV. Being boomers, they’re confused as to why a TV boots into a menu where they have to select TV again to use it.

      New TVs may be cheap, but it’s because of the “smart” “spying” function, and they are so annoying. I really don’t like them.

      • @[email protected] · 5 points · 1 year ago

        Yeah, the bootup kills me. I got lucky that my current TV doesn’t do it. But man, the last one I had took forever to turn on. It’s stupid.

      • @[email protected] · 2 points · 1 year ago

        Can’t speak for your TV, but mine takes all of 16 seconds to boot into the HDMI input from the moment I plug it in, and there’s a setting to change the default input when it powers on. I use two HDMI ports, so I have it default to the last input, but I have the option to tell it to default to the home screen, a particular HDMI port, the AV ports, or the antenna.

        Not a fan of the remote though. I don’t have any of these streaming services, and more importantly I’ll be dead and gone before I let this screen connect to the Internet.

    • Flying Squid · 1 point · 1 year ago

      I’ve been in stores which have demonstration 8K TVs.

      Very impressive.

      I’m still fine with my 720p and 1080p TVs. I’ve never once felt like I’ve missed out on something I was watching which I wouldn’t have if the resolution was higher and that’s really all I care about.

      • Guy Ingonito · 1 point · 1 year ago

        I have a 4k tv with backlighting that matches the screen. When I take magic mushrooms and watch it I can see god

      • @[email protected] · 2 points · 1 year ago

        I think the impressiveness likely has more to do with facets other than the resolution. Without putting your face up to the glass, you won’t be able to discern a difference; human visual acuity just isn’t that high at a normal distance of a couple of meters or more.

        I’d rather have a 1080p plasma than most cheap 4K LCDs. The demonstrators are likely OLED, which means supremely granular control of both color and brightness, like plasma used to offer. Even nice LCDs have granular backlighting, sometimes with something like a 1920x1080 array of backlights, close enough to OLED in terms of brightness control.
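        A rough back-of-the-envelope check of that acuity point (in Python; the 60 pixels-per-degree figure is the commonly cited 20/20 threshold, and the 65”-at-2-m numbers are assumed purely for illustration):

```python
import math

def pixels_per_degree(width_px, width_m, distance_m):
    """Horizontal pixels packed into one degree of visual angle."""
    px_per_m = width_px / width_m
    # arc length subtending one degree at this viewing distance
    m_per_degree = 2 * distance_m * math.tan(math.radians(0.5))
    return px_per_m * m_per_degree

# a 65" 16:9 panel is roughly 1.44 m wide; viewer sitting 2 m away
for name, width_px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(width_px, 1.44, 2.0)
    marker = " (beyond 20/20 acuity)" if ppd > 60 else ""
    print(f"{name}: {ppd:.0f} pixels/degree{marker}")
```

        With these assumed numbers, 4K already overshoots the threshold at couch distance, which is the commenter’s point about 8K.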

    • @[email protected] · 26 points · 1 year ago

      Except they turned into trash boxes in the last couple of years. Everything is a smart TV with ad potential and functionality that will eventually be unsupported. I’m holding onto my dumb TVs as long as I can.

        • capital · 1 point · 1 year ago

          “Monitor” is another. Not just for PC monitors.

      • @[email protected] · 15 points · 1 year ago

        Yup. Those cheap TVs are being subsidized by advertisements that are built right in. If you don’t need the smart functionality, skip connecting it to the Internet. (If you can. Looking at you, Roku TVs!)

      • voxel · 3 points · edited · 1 year ago

        Well, you can just not connect it to the internet and still have some extra features.
        Also, if it’s an Android TV it’s probably fine (unless you have one with the new Google TV dashboard).
        These usually don’t come with ads or anything beyond regular Google Android tracking, and you can just unpin Google Play Movies or whatnot.

      • @[email protected] · 4 points · 1 year ago

        We’ve got a pair of LG C1 OLEDs in the house, and the best thing we did was remove any network access whatsoever. Everything is now handled through Apple TVs (for AirPlay, Handoff etc.), but literally any decent media device or console would be an upgrade on what manufacturers bundle in.

  • @[email protected] · 10 points · 1 year ago

    It’s funny that we got to retina displays, which were supposed to be the highest resolution you’d ever need for the form factor, and then manufacturers just kept making higher and higher resolutions anyway because Number Go Up. I saw my first 8K laptop around this time and the only notable difference was that the default font size was unreadable.

  • @[email protected] · 4 points · 1 year ago

    I don’t play games on my TV, but I have a really old 1080p one with native Plex and YouTube apps and no nonsense. I’ve seen the ads and other stupid bullshit modern TVs come with; I’m going to keep fixing this TV up until my dying breath.

  • @[email protected] · 19 points · 1 year ago

    My son is on his 3rd DualSense controller in about 18 months.

    Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.

        • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 🏆 · 4 points · edited · 1 year ago

          I had it pretty consistently with every Joy-Con I went through; but I’ve had my PS5 for a little over a year, and I also use it for PC gaming without issue on the DualSense. Did they redesign them between launch and more recently, maybe? I worry about it all the time because they’re $80. I can’t be replacing them all the time.

          • @[email protected] · 2 points · 1 year ago

            Yeah, we had the Joy-Con fiasco as well.

            He runs 2 controllers; one has been fine, the other 2 got the drift.

            Best breakdown here https://www.youtube.com/watch?v=7qPNyio3VDk&pp=ygUPRHVhbHNlbnNlIGRyaWZ0

            It’s wild to me that flagship consoles ship with weak controllers. I was reminded of it when using the X360 controller, which is battered and probably 15 years old, and it still works perfectly.

            Even more annoying, I think they’re now selling a ‘pro’ controller, the DualSense Edge or something, for 150-odd.

        • Pika · 4 points · 1 year ago

          Have you looked into repairing them yourself? It happened with my PS4 controller; it was fairly simple to fix myself and it cost significantly less than buying a whole new controller.

      • @[email protected] · 4 points · 1 year ago

        I had mine maybe 8 months before the left stick started drifting hard. Completely unusable. And Sony wanted me to go through all these hoops AND spend like $20 to ship it to them.

        Ended up getting an 8BitDo Ultimate instead, and so far it’s worked great! It has Hall effect joysticks too, so drift shouldn’t ever be an issue. The major console makers NEED to switch to Hall effect sticks for the next gen.

    • @[email protected] · 2 points · edited · 1 year ago

      My Xbox Series S controller got stick drift like 3 months after I got it. My friend’s finally succumbed last week, after about a year of owning it. What is it with stick drift on new controllers? Seems like every modern system has the exact same problem

    • @[email protected] · 3 points · 1 year ago

      I’m still with my first dualsense, my dualshocks from PS3 and PS4 still work without any issues. I don’t want to know what people do to their controllers.

    • Domi · 19 points · 1 year ago

      Yesterday I plugged my Xbox 360 controller into my Steam Deck and played Halo 3 like an OG.

      If you had told someone 10 years ago that you’d be able to play Halo 3 on a handheld running Linux, with an OG Xbox 360 controller, on Steam, they would have called you crazy.

      • @[email protected] · 3 points · 1 year ago

        Halo 3 is seventeen years old. Ten years ago, a seventeen-year-old game would be something like Quake 2 or Castlevania: Symphony of the Night, both of which could easily be run on handhelds by that time.

  • arthurpizza · 5 points · 1 year ago

    I can’t really imagine being close enough to any screen where I need more than 1080p. I’m sitting across the room, not pressing my face against the glass.

  • @[email protected] · 19 points · 1 year ago

    The performance difference between 1080p and 720p on my computer makes me really question if 4k is worth it. My computer isn’t very good because it has an APU and it’s actually shocking what will run on it at low res. If I had a GPU that could run 4k I’d just use 1080p and have 120fps all the time.

    • @[email protected] · 11 points · 1 year ago

      Tldr: Higher resolutions afford greater screen sizes and closer viewing distances

      There’s a treadmill effect when it comes to higher resolutions

      You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience

      The reason to upgrade to a higher resolution is because you want bigger screens

      If you want a TV for a monitor, for instance, you’ll want 4k because you’re close enough that you’d be able to SEE the pixels otherwise.

      • Flying Squid · 1 point · 1 year ago

        You don’t mind the resolution you’re used to. When you upgrade the higher resolution will be nicer but then you’ll get used to it again and it doesn’t really improve the experience

        This is sort of how I feel about 3D movies and why I never go to them. After about 20 minutes, I mostly stop noticing the 3D.

      • Johanno · 3 points · 1 year ago

        As long as you don’t know that there is anything better, you will love 1080p. Once you have seen 2k you don’t want to switch back, especially on bigger screens.

        On the TV I still like 1080p. I remember the old CRT TVs with just bad resolution; in comparison 1080p is a dream.

        However, if the video is that high in quality you will like 4k on a big TV even more. But if the movie is only 720p (like most DVDs or streaming services), then 4k is worse than 1080p: you need some upscaling in order to have a clear image.

    • @[email protected] · 14 points · 1 year ago

      1440p is the sweet spot. Very affordable these days to hit high FPS at 1440 including the monitors you need to drive it.

      1080@120 is definitely low budget tier at this point.

      Check out the PC Builder YouTube channel. Guy is great at talking gaming PC builds, prices, performance.

    • @[email protected] · 4 points · 1 year ago

      It’s a chicken/egg problem. We need 8k so we can use bigger TVs, but those bigger TVs need 8k content to be usable.

      • @[email protected] · 2 points · edited · 1 year ago

        What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV. Only suckers buy into 8k. Same people who bought those rounded screen smartphones thinking it will be the new thing. Where are those phones now?

        • @[email protected] · 1 point · edited · 1 year ago

          What kind of TV do you need bro? A 60 inch with 4k is more than enough, especially when you think about how far you are gonna sit from a 60 inch TV.

          You misunderstand the point of higher resolutions. The point is not to make the image sharper, the point is to make the TV bigger at the same sharpness. This also means the same viewing distance.

          At the end of the CRT era I had a 28” TV; at PAL resolutions that is ~540p visible. At the end of the HD era I had a 50” TV. Note that the ratios between resolution and size are close together. Now we’re partway through the 4k era and I currently have a 77” 4k TV. By the time we move to the 8k era I expect to have something around 100”. 8k would allow me to go up to a 200” TV.

          I sit just as far from my 77” TV as I sat from my 28”, my 50” or my 65”. The point of a larger TV is to have a larger field of view, to fill a larger part of your vision. The larger the FoV, the better the immersion. That’s why movie theaters have such large screens, and why IMAX theaters have screens that curve around you.

          Don’t think of an 8k TV as sharper; think of 4k as a cropped version of 8k. You don’t want to see the same things sharper, you want to see more things. Just like when we went from square to widescreen TVs: the wider aspect ratio got us extra content on the sides, and the 4:3 version just cut off the sides of the picture. So when you go from a 50” 4k to a 100” 8k, you can see this as getting a huge additional border around the screen that would simply be cut off on a 4k screen.

          Of course, content makers need to adjust their content to take into account this larger field-of-view. But that’s again a chicken/egg problem.

          The endgame is to have a TV that fills your entire field-of-view, so that when you are watching a movie that is all you see. As long as you can see the walls from the corners of your eye, your TV is not big enough.
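          The “same sharpness, bigger screen” trade above can be sketched numerically (Python; the 60 pixels-per-degree target and 2.5 m seat are assumed numbers, and flat-panel geometry gets dubious at the extreme sizes):

```python
import math

def max_width_m(width_px, distance_m, ppd=60.0):
    """Widest screen that still hits `ppd` pixels per degree at this distance."""
    total_degrees = width_px / ppd  # horizontal visual angle the pixels can cover
    return 2 * distance_m * math.tan(math.radians(total_degrees / 2))

def diagonal_inches(width_m):
    """16:9 diagonal in inches for a given screen width in metres."""
    return width_m * math.hypot(16, 9) / 16 / 0.0254

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    d = diagonal_inches(max_width_m(px, 2.5))
    print(f'{name}: up to roughly {d:.0f}" at 2.5 m before pixels become visible')
```

          Each doubling of resolution roughly doubles the screen size you can sit the same distance from, matching the 50”-to-100”-to-200” progression described above.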

  • BmeBenji (he/him) · 132 points · 1 year ago

    4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.

    • @[email protected] · 48 points · 1 year ago

      Yeah. Once games are rendering 120fps at a native 6K downscaled to an amazing looking 4K picture, then maybe you could convince me it was time to get an 8K TV.

      Honestly most people sit far enough from the TV that 1080p is already good enough.

      • @[email protected] · 12 points · 1 year ago

        I find 4k is nice on computer monitors because you can shut off anti-aliasing entirely and barely be left with any jagged edges. 1440p isn’t quite enough to get there.

        Also, there’s some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.

        • @[email protected] · 5 points · 1 year ago

          Oh yeah, I have read some very cool things about emulators and being able to simulate the individual phosphors with 4K resolution. I have always been a sucker for clean crisp pixels (that’s what I was trying to achieve on the shitty old CRT I had for my SNES) so I haven’t jumped into the latest on crt shaders myself.

        • @[email protected] · 1 point · 1 year ago

          But anti-aliasing needs far less performance. And you need to mess about with scaling on a 4k monitor which is always a pain. 1440p for life IMHO

      • @[email protected] · 3 points · edited · 1 year ago

        I’m set up to THX spec, 10 feet from an 85-inch. I’m right in the middle of 1440p and 4K being optimal, but with my eyes I see little difference between the two.

        I’d settle for 4k @ 120 FPS locked.

        • @[email protected] · 2 points · 1 year ago

          I’m 6-8 feet from a 65, depending on seating position and posture. It seems to be a pretty sweet spot for 4K (I have used the viewing distance calculators in the past, but not recent enough to remember the numbers). I do wear my glasses while watching TV too, so I see things pretty clearly.

          With games that render at a native 4K at 60fps and an uncompressed signal, it is absolutely stunning. If I try to sit like 4 feet from the screen to get more immersion, then it starts to look more like a computer monitor rather than a razor sharp HDR picture just painted on the oled.

          There is a lot of quality yet to be packed into 4K. As long as “TV in the living room” is a similar format to now, I don’t think 8K will benefit people. It will be interesting to see if all nice TVs just become 8K one day like with 4K now though.

    • @[email protected] · 6 points · 1 year ago

      For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.

      TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.

    • bruhduh · 4 points · edited · 1 year ago

      Divide the resolution by 3, though; current-gen upscaling tech can give you that much: 4k = upscaled 720p and 8k = upscaled 1440p.
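      The divide-by-three figure matches the usual “ultra performance” upscaling ratio; a quick sketch (the mode names and ratios here are the commonly cited DLSS-style ones, treated as assumptions rather than an official spec):

```python
# Typical upscaler input-resolution ratios (commonly cited DLSS-style mode
# scales; illustrative, not an official spec)
SCALES = {
    "quality": 1 / 1.5,
    "balanced": 1 / 1.7,
    "performance": 1 / 2,
    "ultra performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Resolution the GPU actually renders before upscaling to (out_w, out_h)."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "ultra performance"))  # 4K output from a 720p render
print(internal_res(7680, 4320, "ultra performance"))  # 8K output from a 1440p render
```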

      • AngryMob · 4 points · 1 year ago

        can doesn’t mean should.

        720p to 4k using DLSS is okay, but you start to see visual tradeoffs, strictly for the extra performance.

        To me it really shines at 1080p to 4k, where it is basically indistinguishable from native for a still-large performance increase.

        Or even 1440p to 4k, where it actually looks better than native with just a moderate performance increase.

        For 8k the same setup holds true: go for better-than-native or match native visuals. There is no real need to go below native just to get more performance; at that point the hardware is mismatched.

        • bruhduh · 1 point · edited · 1 year ago

          Devs already use it instead of optimization, so what makes you think the bosses won’t push it further for deadlines and quarterly profits? Immortals of Aveum is an example, and it’s not even the end of the generation, only halfway. (I agree with you from a user standpoint, though.)

    • Final Remix · 25 points · 1 year ago

      *monkey’s paw curls*

      Granted! Everything’s just internal render 25% scale and massive amounts of TAA.

  • teft · 48 points · 1 year ago

    That’s how I feel when people complain about 4k only being 30fps on PS5.

    I laugh because my 1080p TV lets the PS5 output at like 800fps.

      • teft · 35 points · 1 year ago

        The PS5 running 120 fps at 1080p in front of me says that your comment is mistaken.

        • @[email protected] · 21 points · 1 year ago

          The fact it can output a 120Hz signal doesn’t mean the processor is making every frame. Many AAA games will be performing at well under 120fps especially in scenes with lots of action.

          It’s not limited to 30fps like the other poster suggested though, I think most devs try to maintain at least 60fps.

          • @[email protected] · 9 points · 1 year ago

            Unlike Bethesda, who locks their brand-new AAA games with terrible graphics at 30 fps, and tells you that if you don’t feel the game is responsive and butter smooth, you’re simply wrong.

            I’d almost bet money that Todd has never played a game at 60 fps or higher.

            • Poggervania · 3 points · 1 year ago

              iirc that has more to do with lazy coding of their physics system in the Gamebryo/Creation engine. From what I understand, the “correct” way for the physics to work is more or less locked at 60fps or less, which is why in Skyrim stuff can flip out if you run it above 60fps, and you can even get stuck on random ledges and edges.

              There are use cases for tying things to framerate, like every fighting game for example is basically made to be run at 60fps specifically - no more and no less.

              • @[email protected] · 3 points · 1 year ago

                This used to be the way that game engines were coded because it was the easiest way to do things like tick rates well, but like with pretty much all things Bethesda, they never bothered to try to keep up with the times.

                There’s some hilarious footage out there of this in action with the first Dark Souls, which had its frame rate locked at I believe 30fps and its tick rate tied to that. A popular PC mod unlocked the frame rate, and at higher frame rates stuff like poison can tick so fast that it can kill you before you can react.
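                A toy sketch of that tick-rate coupling (hypothetical numbers, not actual engine code): tying an effect to rendered frames doubles its speed when the frame rate doubles, while scaling it by delta time does not.

```python
# Poison damage handled once per rendered frame (the old fps-locked way)
def poison_per_frame(hp, frames, dmg_per_tick=1.0):
    for _ in range(frames):
        hp -= dmg_per_tick
    return hp

# Same effect scaled by delta time: frame rate no longer matters
def poison_per_second(hp, frames, fps, dps=30.0):
    dt = 1.0 / fps  # seconds elapsed per frame
    for _ in range(frames):
        hp -= dps * dt
    return hp

print(poison_per_frame(100, 30))   # one second of poison at 30fps
print(poison_per_frame(100, 60))   # one second at 60fps: double the damage
print(poison_per_second(100, 30, 30), poison_per_second(100, 60, 60))  # ~equal
```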

      • Poggervania · 8 points · 1 year ago

        No, the PS5 can output higher FPS at 1080p.

        What you might be thinking of is refresh rate, which yeah, even if the PS5 was doing 1080p/60fps, if you for some reason have a 1080p/30hz TV, you won’t be able to see anything above 30fps.

    • @[email protected] · 1 point · edited · 1 year ago

      I have a 4k TV, it legitimately is no better than 1080 lmao

      There’s a very noticeable difference, but it’s nothing like the difference between SD and HD. It’s pretty, but not that pretty. I prefer the performance (and proper scaling for my computer) of 1080, even on a 55" screen

      • @[email protected] · 2 points · 1 year ago

        Could this be a configuration issue? I can’t speak from experience, but I’d assume it would be quite a bit better.

        Thanks for the info anyway.

        P.s. I’m not the person who downvoted you. I don’t do that when disagreeing.

        • @[email protected] · 7 points · 1 year ago

          Yeah and once you’re deep into playing… you stop caring about that stuff and focus on the game.

      • @[email protected] · 5 points · 1 year ago

        Context matters a lot. On a 27" monitor, it makes a pretty decent difference. On a 50" TV at 10+ ft…meh?

      • @[email protected] · 12 points · 1 year ago

        Hell, I can’t tell ANY difference (though I do need glasses so maybe that’s got to do with it)

        • @[email protected] · 1 point · 1 year ago

          Small print text rendering is where you’ll see the difference.

          Game graphics, whatever, but if you do a lot of reading or coding, you can make the text smaller and it still stays crystal clear.

      • qaz · 4 points · 1 year ago

        1080 vs 2k is pretty clear to me, but I have a hard time telling the difference between 2k and 4k.

  • @[email protected] · 40 points · 1 year ago

    Has anyone else here never actually bought a TV? I’ve been given 3 perfectly good TVs that relatives were gonna throw out when they upgraded to smart TVs. I love my dumb, free TVs. They do exactly what I need them to and nothing more. I’m going to be really sad when they kick the bucket.

    • @[email protected] · 6 points · edited · 1 year ago

      I was given a free, very decent, dumb TV and upgraded it to a smart TV with a $5 Steam Link, running a Cat 6 cable to it from my router. Best $5 ever. I have no intention of buying a new one. If I ever do, I will try my hardest to make sure it’s a dumb one. I know they sell “commercial displays” that are basically a TV with no third-party apps or any way to install them.

    • Flying Squid · 2 points · 1 year ago

      One of my TVs was given to us by my mother-in-law, but we did buy the other one. Before the ‘smart’ TV era though.

    • @[email protected] · 2 points · 1 year ago

      Yes, people like me buy TVs. I’m the guy who keeps giving away perfectly good TVs to other people because I’ve bought a new one and don’t want to store the old one. I’ve given away 2 smart TVs so far, though I’m not sure what I’ll do with my current one when I inevitably upgrade.

    • @[email protected] · 2 points · 1 year ago

      I used my family’s first HDTV from 2008 up until last year, when my family got me a 55" 4k TV for like $250. Not gonna lie, it’s pretty nice having so much screen, but I’m never getting rid of the ol’ Sanyo.

    • @[email protected] · 3 points · edited · 1 year ago

      I’ve bought my TVs because all my relatives are the same way. My mom finally tossed an old CRT TV a couple of years ago because it started having issues displaying colours correctly.