Nothing is released in 8K, so why would anyone want a format that nothing is released in?
I’m so content with 1080p
The consumer has spoken and they don’t care, not even for 4K. Same as happened with 3D and curved TVs, 8K is a solution looking for a problem so that more TVs get sold.
In terms of physical media: at stores in Australia, the 4K Blu-ray section takes up a single rack of shelves. Standard Blu-rays and DVDs take up about 20.
Even DVDs still sell well because many consumers don’t see a big difference in quality, and certainly not enough to justify the added cost of Blu-ray, let alone 4K editions. A current example: Superman is $20 on DVD, $30 on Blu-ray (a 50% cost increase) or $40 on 4K (a 100% cost increase). Streaming services have similar pricing curves for increased fidelity.
It sucks for fans of high res, but it’s the reality of the market. 4K will be more popular in the future if and when it becomes cheaper, and until then nobody (figuratively) will give a hoot about 8K.
How many homes have walls big enough for a screen big enough for 8K to matter?
I would love to have an 8K TV or monitor if I had an internet connection up to the task and enough content in 8K to make it worth it, or if I had a PC powerful enough to run games smoothly at that resolution.
I think it’s silly to say ‘nobody wants this’ when the infrastructure for it isn’t even close to adequate.
I will admit that there are diminishing returns now; going from 4K to 8K was less impressive than FHD to 4K, and I imagine that 8K will probably be where it stops, at least for anything that can reasonably fit in a house.
I hate the wording of the headline, because it makes it sound like the consumers’ fault that the industry isn’t delivering on something they promised. It’s like marketing a fusion-powered sex robot that’s missing the power core, and turning around and saying “nobody wants fusion-powered sex robots”.
Side note, I’d like for people to stop insisting that 60fps looks “cheap”, so that we can start getting good 60fps content. Heck, at this stage I’d be willing to compromise at 48fps if it gets more directors on board. We’ve got the camera sensor technology in 2025 for this to work in the same lighting that we used to need for 24fps, so that excuse has flown.
As someone who stupidly spent the last 20 or so years chasing the bleeding edge of TVs and A/V equipment, GOOD.
High end A/V is an absolute shitshow. No matter how much you spend on a TV, receiver, or projector, it will always have some stupid gotcha, terrible software, ad-laden interface, HDMI handshaking issue, HDR color problem, HFR sync problem or CEC fight. Every new standard (HDR10 vs HDR10+, Dolby Vision vs Dolby Vision 2) inherently comes with its own set of problems and issues and its own set of “time to get a new HDMI cable that looks exactly like the old one but works differently, if it works as advertised at all”.
I miss the 90s when the answer was “buy big chonky square CRT, plug in with component cables, be happy”.
Now you can buy a $15,000 4k VRR/HFR HDR TV, an $8,000 4k VRR/HFR/HDR receiver, and still somehow have them fight with each other all the fucking time and never work.
8K was a solution in search of a problem. Even when I was 20 and still had good eyesight, sitting 6 inches from a 90 inch TV I’m certain the difference between 4k and 8k would be barely noticeable.
For what content? Video gaming (GPUs) has barely gotten to 4K. Movies? 4K streaming is a joke; you’re better off with a 1080p BD. If you care about quality, go physical… UHD BD is hard to find and you have to wait and hunt to get discs at reasonable prices… And these days there are only a couple of UHD BD player manufacturers left.
I am a filmmaker and have shot in 6K+ resolution since 2018. The extra pixels are great on the filmmaking side: pixel binning when stepping down resolutions allows for better noise performance, color reproduction, and sharper detail, and leaves room for re-framing/cropping. 99% of my clients still want their stuff in 1080p! I barely even feel the urge to jump up to 4K unless the quality of the project somehow justifies it. Images have gotten to a good place; more detail won’t provide much more for human enjoyment. I hope they continue to focus on dynamic range, HDR, color accuracy, motion clarity, efficiency, etc. I won’t say no when we step up to 8K as an industry, but computing as a whole is not close yet.
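For anyone wondering what pixel binning actually buys you, here’s a minimal toy sketch of the idea: just 2x2 averaging in numpy, not any real camera or debayering pipeline, and the function name is made up for illustration.

```python
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one output pixel.
    Averaging four noisy samples roughly halves random noise, which is
    the basic appeal of shooting above your delivery resolution."""
    h = frame.shape[0] - frame.shape[0] % 2   # trim to even dimensions
    w = frame.shape[1] - frame.shape[1] % 2
    blocks = frame[:h, :w].reshape(h // 2, 2, w // 2, 2, -1)
    return blocks.mean(axis=(1, 3))

# A noisy stand-in for a UHD frame, binned down to 1920x1080
frame = np.random.rand(2160, 3840, 3).astype(np.float32)
print(bin_2x2(frame).shape)  # (1080, 1920, 3)
```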
What’s your opinion on using 8K TV as a monitor?
The difference between 1080 and 4K is pretty visible, but the difference between 4K and 8K, especially from across a room, is so negligible that it might as well be placebo.
Also the fact that 8K content takes up a fuckload more storage space. So, there’s that, too.
Even 4K content is not yet easily available. Apart from Apple TV+, where everything is 4K and included in the basic subscription, every other streaming service charges much more for 4K content, and most people don’t want to pay more every month for it.
So 8K is just a distant prospect that content makers aren’t really pushing to happen.
Pretty sure my eyes max out at 4K. I can barely tell the difference between 4K and 1080P from my couch.
I do want a dumb 8K TV. I do not want all the so-called smart features of a TV. A small Linux device with Kodi works way better.
deleted by creator
Some Xiaomi TVs have root exploits, so you can manually disinfect the OS, but it’s cumbersome to get done since you need to enter adb commands over the remote control to get there in the first place.
Easier to just use an external device and the TV as a screen only. Personally I’ve been using the Nvidia Shield for 5+ years now and regret nothing.
Not ideal, but you can air-gap the TV from the network and use some small SBC, or even a Firestick or Android box. That’s what I do. Stremio?
As far as my TV is concerned I don’t have an internet connection.
I do want a TV that can access Netflix etc without another box. I just don’t want the surveillance that comes with it.
I just run mine without ever connecting it to the internet.
I run an Apple TV (shock, walled garden!), as it is the only device I’ve seen that consistently matches frame rates properly on the output. I personally hate the Kodi UI, but I get your point.
uh…there are hundreds of Kodi UIs.
The article took forever to get to the bottom line: content. 8K content essentially does not exist. TV manufacturers were putting the cart before the horse.
4K TVs existed before the content did. I think the larger issue is that the difference between what is and what could be is not worth the additional expense, especially at a time when most people struggle to pay for rent, food, and medicine. More people watch videos on their phones than watch broadcast television. 8K is a solution looking for a problem.
Hell, I still don’t own a 4K TV and don’t plan to go out of my way to buy one unless the need arises. I don’t see why I’d need one when a normal flat-screen looks fine to me.
I actually have some tube TVs and have been thinking of just hooking my VCR back up and watching old tapes. I don’t need fancy resolutions in my shows or movies.
The only time I even think of those things is with video games.
4K hardly even makes sense unless your tv is over 70" and your watching it from less than 4 feet away. I do think VR could benefit from ultra-high resolution, though.
https://www.rtings.com/tv/reviews/by-size/size-to-distance-relationship
Extensive write up on this whole issue, even includes a calculator tool.
But, basically:
Yeah, going by angular resolution, even leaving the 8K content drought aside…
8K might make sense for a computer monitor you sit about 2 feet / 0.6m away from, if the diagonal size is 35 inches / ~89cm, or greater.
Take your viewing distance up to 8 feet / 2.4m away?
Your screen diagonal now has to be about 125 inches / ~318cm, or larger, for you to be able to maybe notice a difference with a jump from 4K to 8K.
…
The largest 8K TV that I can see available for purchase anywhere near myself… that costs ~$5,000 USD… is 85 inches.
I see a single one of 98 inches that is listed for $35,000. That’s the largest one I can see, but it’s… uh, wildly more expensive.
So with a $5,000, 85 inch TV, that works out to…
You would have to be sitting closer than about 5 feet / ~1.5 meters to notice a difference.
And that’s assuming you have 20/20 vision.
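If you’d rather sanity-check those numbers than take the calculator’s word for it, here’s a rough back-of-the-envelope sketch. It assumes 20/20 vision resolves about 1 arcminute and a 16:9 panel; it’s an approximation, not the rtings methodology, and the function name is just for illustration.

```python
import math

def max_useful_distance_m(diag_in, horiz_px, aspect=16 / 9):
    """Distance (metres) beyond which a 20/20 eye (~1 arcminute of
    resolving power) can no longer separate adjacent pixels at this
    resolution, i.e. where a further resolution bump stops being visible."""
    width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)
    pixel_pitch_in = width_in / horiz_px
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) * 0.0254

# 85" panel: how close must you sit before 4K pixels become resolvable
# (the point at which an 8K upgrade could even theoretically be noticed)?
print(round(max_useful_distance_m(85, 3840), 2))  # ~1.7 m (~5.5 ft)
print(round(max_useful_distance_m(85, 7680), 2))  # ~0.8 m to fully resolve 8K
```

Which lands right around the “closer than about 5 feet” figure.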
…
So yeah, VR goggle displays… seem to me to be the only really practical use case for 8K… other than basically being the kind of person who owns a home with a dedicated theater room.
What this chart is missing is the impact of the quality of the screen and the source material being played on it.
A shit screen is a shit screen, just like a badly filmed TV show from the 80s will look like crap on anything other than an old CRT.
People buy a 4K screen from Walmart for $200, then wonder why they can’t tell it’s any better than their old 1080p screen.
The problem with pushing up resolution is that the cost to get a good set right now is so high that it’s a niche within a niche of people who actually want it. Even a good 4K set with proper HDR support, big enough to make a difference, is expensive. Even when 8K moves away from early-adopter markups it’s still going to be expensive, especially compared to the tat you can buy at the supermarket.
It is totally true that things are even more complex than just resolution, but that is why I linked the much more exhaustive write up.
It’s even more complicated in practice than all the things they bring up; they’re focusing mainly on a movie-watching experience, not a video-game-playing experience.
They don’t go into LED vs QLED vs OLED vs other actual display techs, response latency, refresh rates, or, as you say, all the different kinds of HDR and color gamut support… I’m sure I’m forgetting things…
Power consumption may be a significant thing for you, image quality at various viewing angles…
Oh right, FreeSync vs GSync, VRR… blargh there are so many fucking things that can be different about displays…
At about 1.6 meters for the metric-minded. If you really stretch out and can hit the TV with your toes, it’s about the right distance.
you’re*
It’s not hard, get it right.
gidoombiigiz*
Nobody likes a grammar-nazi. Due better mein fuhrer.
You’re describing my bedroom tv.
I think it’s NHK, or one of the Japanese broadcasters anyways, that has actually been pressing for 8K since the 1990s. They didn’t have content back then and I doubt they have much today, but that’s what they wanted HD to be.
Not familiar with NHK specifically (or, to be clear, I think I am but not with enough certainty), but it really makes a lot of sense for news networks to push for 8k or even 16k at this point.
Because it is a chicken and egg thing. Nobody is going to buy an 8k TV if all the things they watch are 1440p. But, similarly, there aren’t going to be widespread 8k releases if everyone is watching on 1440p screens and so forth.
But what that ALSO means is that there is no reason to justify using 8k cameras if the best you can hope for is a premium 4k stream of a sporting event. And news outlets are fairly regularly the only source of video evidence of literally historic events.
From a much more banal perspective, it’s why there is a gap in TV/film where you go from 1080p or even 4K re-releases, to increasingly shady upscaling of 720p or even 480p content, back to everything being natively 4K. Oversimplifying, it’s because we were using MUCH higher quality cameras than we really “should” have been for so long, before switching to cheaper film and outright digital sensors because “there is no point”. Obviously this ALSO depends on saving the high resolution originals but… yeah.
It’s not exactly “there is no point”. It’s more like “the incremental benefit of filming and broadcasting in 8K does not justify the large cost difference”.
Filming in 8k does have advantages. You can crop without losing quality.
I’m sorry, but if we are talking about 8k viability in TVs, we are not talking about shooting in 8k for 4k delivery.
You should be pointing out that shooting in higher than 8k, so you have the freedom to crop in post, is part of the reason 8k is burdensome and expensive.
So correct the person above me, they wrote about shooting in 8k.
The RED V-Raptor is expensive for consumer grade but nothing compared to some film equipment. There are lenses more expensive than an 8k camera.
Which, for all intents and purposes, means there is no point. Because no news network is going to respond to “Hey boss, I want us to buy a bunch of really expensive cameras that our audience will never notice because it will make our tape library more valuable. Oh, not to sell, but to donate to museums.” with anything other than laughter and MAYBE firing your ass.
The point is, the cost/benefit calculation will change over time as the price of everything goes down. It’s not a forever “no point”.
… Almost like it would be more viable to film in higher resolution if more consumers had higher resolution displays?
Not only does the content not exist yet, it’s just not practical. Even now 4K broadcasting is rare, and 4K streaming is a paid premium (and not always with a good bitrate, which matters a lot more) when it was once pitched as a cost-free upgrade of the future. Imagine 8K, which would roughly quadruple the amount of data required to transmit it, and transmission cost is not linear: 4x the speed would probably be at least 8x the cost.
And I seriously think no one except the nerdiest of nerds would notice a difference between 4K and 8K.
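The “roughly quadruple” part is just pixel arithmetic; the bitrate figures below are a rough illustration (assuming data scales more or less with pixel count, which codecs complicate), not numbers from any actual service.

```python
# Pixel counts behind "roughly quadruple the amount of data"
pixels_4k = 3840 * 2160   # 8,294,400
pixels_8k = 7680 * 4320   # 33,177,600
print(pixels_8k / pixels_4k)  # 4.0

# If a decent 4K stream sits somewhere around 15-25 Mbps (a rough,
# codec- and content-dependent guess), same-quality-per-pixel 8K would
# land around 60-100 Mbps before any codec-efficiency gains.
for mbps_4k in (15, 25):
    print(f"{mbps_4k} Mbps at 4K -> ~{4 * mbps_4k} Mbps at 8K")
```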
That’s usually the case
Not only does it not exist, it isn’t wanted. People are content watching videos on YouTube and Netflix. They don’t care for 4K. Even if they pay extra for Netflix 4K (which I highly doubt they do), I still question whether they’re actually getting 4K given their bandwidth and other limiting factors, which means they’re not watching 4K and are fine with it.
TV manufacturers are idiots.