
6K is the new 1440p, Convince me otherwise


Snowbro

2 hours ago, kye said:

My first hard drive was 20MB. I got a second one that had been physically opened and was now unreliable, so I had to run SpinRite every week (it would read the drive and re-write the same data over the top to refresh it); otherwise it would just stop working and you'd have to format it again. That process took about an hour, because it was so much data.

At full image quality, my GH5 would fill one of those hard drives every 0.4 seconds.

8K will be mainstream, just wait long enough. And after that it will be vintage, then it will be positively pre-historic.
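A quick sanity check on that fill rate (a sketch; the 400 Mbps figure is the GH5's top internal All-Intra bitrate, which I'm assuming is what "full image quality" means here):

# How fast a 400 Mbps stream fills a 20 MB drive.
bitrate_mbps = 400                     # GH5 400 Mbps All-Intra mode
write_rate_mb_s = bitrate_mbps / 8     # = 50 megabytes per second
drive_mb = 20                          # the old hard drive
print(drive_mb / write_rate_mb_s)      # -> 0.4 seconds per drive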

I’m not discounting the march of technology. But there are physical limits to storage technology. Storage of any kind takes space.

Will 8K be mainstream someday? Yep. Will it be sooner than we think? I'm throwing my hat in the "not as soon as you think" ring.


1 hour ago, Video Hummus said:

I’m not discounting the march of technology. But there are physical limits to storage technology. Storage of any kind takes space.

Will 8K be mainstream someday? Yep. Will it be sooner than we think? I'm throwing my hat in the "not as soon as you think" ring.

I agree with "not as soon as you think", but if we take my 20MB 3.5-inch HDD as an example, a 2TB drive is 100,000 times the capacity, and that's not even cherry-picking a better example. That took about 30 years, which is a 47% annual growth rate, and I don't see any reason that wouldn't continue almost indefinitely.

Moore's Law says things double every 1.5 years for the same cost, which means that in 3 years 8K will be as taxing as 4K is now, and in 6 years 16K will be as taxing as 4K is now.

That's not that much time, really.
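The arithmetic behind both claims, as a quick sketch (using the 30-year span and 1.5-year doubling period from above):

import math

# 20 MB -> 2 TB over ~30 years: the implied compound annual growth rate.
growth = (2e12 / 20e6) ** (1 / 30)              # 100,000x over 30 years
print(f"{(growth - 1) * 100:.0f}% per year")    # -> 47% per year

# Moore's-Law framing: 8K is 4x the pixels of 4K, 16K is 16x.
# At one doubling every 1.5 years, 4x takes 2 doublings, 16x takes 4.
for factor in (4, 16):
    print(f"{factor}x -> {math.log2(factor) * 1.5:.1f} years")
# -> 4x in 3.0 years (8K), 16x in 6.0 years (16K)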


16 minutes ago, kye said:

and I don't see any reason that wouldn't continue almost indefinitely.

Laws of physics. Bits have to be represented by something physical; the smallest you can go is maybe an atom, or an electron. Problem is, the smaller you go, the less predictably those things behave. Quantum physics and all that jazz.

The same problem exists with fuel and space travel.

Moore's Law has essentially hit a wall (in the CPU domain); storage isn't far behind.

I know hard drive size has not shrunk in any way for 10 years or more. That's a problem.

It makes me wonder at the immense amount of work YouTube has to do to process and store video.


3 minutes ago, Video Hummus said:

Laws of physics. Bits have to be represented by something physical; the smallest you can go is maybe an atom, or an electron. Problem is, the smaller you go, the less predictably those things behave. Quantum physics and all that jazz.

The same problem exists with fuel and space travel.

Moore's Law has essentially hit a wall (in the CPU domain); storage isn't far behind.

I know hard drive size has not shrunk in any way for 10 years or more. That's a problem.

We keep reaching theoretical thresholds and keep finding ways around them, so I'm not worried; Moore's Law is safe.

And in terms of hard drives not shrinking, that's a form factor question; otherwise we wouldn't have SSDs. An SSD is smaller than a 2.5-inch HDD, and things like 256GB microSD cards are ridiculously tiny, so storage is absolutely shrinking.

This has some interesting data...

https://medium.com/predict/moores-law-is-alive-and-well-adc010ea7a63


11 hours ago, thebrothersthre3 said:

Maybe when Arri releases an 8K camera I'll believe that.

In 2040?

5 hours ago, kye said:

8K will be mainstream, just wait long enough. And after that it will be vintage, then it will be positively pre-historic.


The resolution wars will slow down eventually.

Just like with audio capture. 


4 hours ago, kye said:

I agree with "not as soon as you think", but if we take my 20MB 3.5-inch HDD as an example, a 2TB drive is 100,000 times the capacity, and that's not even cherry-picking a better example. That took about 30 years, which is a 47% annual growth rate, and I don't see any reason that wouldn't continue almost indefinitely.

Moore's Law says things double every 1.5 years for the same cost, which means that in 3 years 8K will be as taxing as 4K is now, and in 6 years 16K will be as taxing as 4K is now.

That's not that much time, really.

That's what I thought when I wanted to do 4K gaming in 2012; it's only now becoming feasible (at 60 fps, at least) with a single flagship GPU.


6 hours ago, kye said:

Moore's Law says things double every 1.5 years for the same cost, which means that in 3 years 8K will be as taxing as 4K is now, and in 6 years 16K will be as taxing as 4K is now.

Moore's Law is not a law of physics or science; it is nothing more than an observation of a trend.

That trend has already slowed significantly in the last few years, to the point where it cannot be used as an accurate predictor of future progress.


1 minute ago, barefoot_dp said:

Moore's Law is not a law of physics or science; it is nothing more than an observation of a trend.

That trend has already slowed significantly in the last few years.

I'm not disagreeing with your assessment of Moore's Law, but things can change in a heartbeat. We honestly don't even know what companies are withholding; they do it all the time. Not to mention that the implementation of AI systems is yielding exponential growth. I work with high-level AI every day and it is incredible what it evolves into year by year, and that is just in our company alone. Quantum computing at places like Google is going to shatter what you currently perceive as a limitation.


2 minutes ago, Snowbro said:

We honestly don't even know what companies are withholding; they do it all the time.

And do you know why they withhold it, in many circumstances? Marketing.

They release certain features at a time which will give them the best market advantage.

The camera megapixel wars slowed down once they realised people were more interested in other features (video being one of them) and their R&D dollars were better spent elsewhere. Similarly, I think the end of the TV resolution wars will have less to do with technical limits and more to do with consumer disinterest - eventually everyone will just go "that's more than enough for me". That is when the companies will stop flogging higher resolution as their primary marketing tool and start on the next big thing, which they might have been holding onto for some time - perhaps AI screens?


@barefoot_dp 

Your variable costs are greatly reduced the more volume you do and the longer you have been in production of said product. If the price elasticity (what the market will bear) of your product sits within a certain threshold, say $400-500, the incentive to reduce current margins is zero. If you have been producing a product for x number of years, your variable costs move closer and closer towards your fixed costs, increasing your profit margins. Why would you introduce a new product with the same MSRP and higher initial production costs if your old product is still selling fine? It is less marketing and more economics/managerial accounting.
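A toy illustration of that volume effect (all numbers hypothetical; unit cost is modeled with a standard learning curve, falling 15% per doubling of cumulative volume):

import math

MSRP = 450.0            # hypothetical selling price, held fixed
INITIAL_COST = 300.0    # hypothetical unit cost for the first batch
LEARNING = 0.85         # cost multiplier per doubling of volume

def unit_cost(cumulative_units, first_batch=10_000):
    # Cost falls a fixed fraction every time cumulative volume doubles.
    doublings = math.log2(max(cumulative_units / first_batch, 1))
    return INITIAL_COST * LEARNING ** doublings

for units in (10_000, 100_000, 1_000_000):
    cost = unit_cost(units)
    margin = 100 * (MSRP - cost) / MSRP
    print(f"{units:>9,} units: cost ${cost:,.0f}, margin {margin:.0f}%")
# -> 33%, then 61%, then 77%: margins widen at a fixed MSRP,
#    which is the incentive to keep the old product in production.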

Nvidia and Intel do this all the time. They withhold their next-generation chips until AMD catches up to them.


20 hours ago, kye said:

I agree with "not as soon as you think", but if we take my 20MB 3.5-inch HDD as an example, a 2TB drive is 100,000 times the capacity, and that's not even cherry-picking a better example. That took about 30 years, which is a 47% annual growth rate, and I don't see any reason that wouldn't continue almost indefinitely.

Moore's Law says things double every 1.5 years for the same cost, which means that in 3 years 8K will be as taxing as 4K is now, and in 6 years 16K will be as taxing as 4K is now.

That's not that much time, really.

What universe are people living in? Moore's Law is not eternal; if you haven't noticed, the growth in computing power has slowed down quite a bit, and if it weren't for AMD's Ryzen processors we wouldn't have seen any increase in the last 5-7 years. RAM and hard disks are a bit worse; long gone are the days of ever-increasing density for less and less money. HDD prices have stayed about the same for more than 5 years. Media is costing quite a bit more, as moving from 1080p to 4K quadruples the data needed, and nowhere has the price of media decreased or capacity increased by 4x.
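For reference, the 4x figure is straight pixel arithmetic:

# UHD "4K" vs 1080p: exactly four times the pixels.
print((3840 * 2160) / (1920 * 1080))   # -> 4.0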

14 hours ago, barefoot_dp said:

And do you know why they withhold it, in many circumstances? Marketing.

They release certain features at a time which will give them the best market advantage.

The camera megapixel wars slowed down once they realised people were more interested in other features (video being one of them) and their R&D dollars were better spent elsewhere. Similarly, I think the end of the TV resolution wars will have less to do with technical limits and more to do with consumer disinterest - eventually everyone will just go "that's more than enough for me". That is when the companies will stop flogging higher resolution as their primary marketing tool and start on the next big thing, which they might have been holding onto for some time - perhaps AI screens?

This thread has turned into some conspiracy theory. Ask Intel how marketing messed up their 10nm process so badly that AMD is beating them up. Intel must be so happy losing market share, because you know that's the aim of mega corporations. By the way, if you want to see one example of how we are reaching some very, very hard limits, the Intel 10nm debacle is the right one. Once reliably number one in process tech by one to two generations, they missed the boat completely.


5.7K/6K sensors are really needed to output true 4K, because of the debayering required off the sensor: a 4K Bayer sensor produces only roughly a 2.7K image after demosaicing. The cameras with 4K sensors out there add sharpening that's not in the source (some more than others - e.g. Sony more than Canon) and aliasing. So yeah, for true 4K, you need the larger sensor.
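A rough sketch of that rule of thumb (the ~0.7x effective-resolution factor after demosaicing is a commonly cited approximation, not an exact figure):

BAYER_FACTOR = 0.7   # approximate fraction of pixel resolution a Bayer
                     # sensor resolves after demosaicing; varies by
                     # debayer algorithm and optical low-pass filter

def effective_width(sensor_width):
    # What a Bayer sensor of this width actually resolves.
    return sensor_width * BAYER_FACTOR

def sensor_width_for(target_width):
    # Sensor width needed to resolve a true target width.
    return target_width / BAYER_FACTOR

print(effective_width(3840))    # -> ~2688: a "4K" sensor yields ~2.7K
print(sensor_width_for(3840))   # -> ~5486: true 4K wants a ~5.5-6K sensor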

On the consumer side, 4K is nice on my 65" OLED, but the HDR that comes with it is junk, so I don't watch 4K content for the most part. No consumer display can be calibrated for HDR Rec.2020/P3, so the colors and black levels are all over the place and differ from display to display, as do mastering and display maximum nits. It's a total mess. 4K without HDR would have been perfect, but unless everyone had a 65" or greater screen and the source was true 4K or 4K-scanned film, nobody would be able to see a major difference worth investing in.


It's always a little frustrating to me when I make an awesome 4K video, only to realize that pretty much everyone is just going to view it on a 1080p phone. I mean, it looks good on my 6.4" 1440p OLED phone, but most people don't have that, or won't switch it to 1440p. Even fewer will watch it on a 4K TV. Heck, even the big Hollywood movies with millions poured into them are usually only viewed at 2K resolution, or later at 1080p in the home through streaming. And that 1080p is a low bitrate via Netflix etc.; Blu-ray 1080p actually looks pretty decent.

I still film in high resolution and edit in a 4K timeline. One day my hard work with a higher resolution will be appreciated.


44 minutes ago, Snowbro said:

It's always a little frustrating to me when I make an awesome 4K video, only to realize that pretty much everyone is just going to view it on a 1080p phone. I mean, it looks good on my 6.4" 1440p OLED phone, but most people don't have that, or won't switch it to 1440p. Even fewer will watch it on a 4K TV. Heck, even the big Hollywood movies with millions poured into them are usually only viewed at 2K resolution, or later at 1080p in the home through streaming. And that 1080p is a low bitrate via Netflix etc.; Blu-ray 1080p actually looks pretty decent.

I still film in high resolution and edit in a 4K timeline. One day my hard work with a higher resolution will be appreciated.

Blu-ray is nice. I have decent internet, but half the time Netflix drops into what looks like 280p.


2 hours ago, Snowbro said:

It's always a little frustrating to me when I make an awesome 4K video, only to realize that pretty much everyone is just going to view it on a 1080p phone. I mean, it looks good on my 6.4" 1440p OLED phone, but most people don't have that, or won't switch it to 1440p. Even fewer will watch it on a 4K TV. Heck, even the big Hollywood movies with millions poured into them are usually only viewed at 2K resolution, or later at 1080p in the home through streaming. And that 1080p is a low bitrate via Netflix etc.; Blu-ray 1080p actually looks pretty decent.

I still film in high resolution and edit in a 4K timeline. One day my hard work with a higher resolution will be appreciated.

If you upload to YT etc. in 4K, then as internet speeds get better, more people will view it in higher resolutions. I'm sure that if the YT app sensed it had ridiculous speeds to play with, it would stream in 1440p or 4K and downscale to whatever the display resolution is, or at least it will in a future update. One thing I've noticed is that app developers are keen to give you a better user experience even if it absolutely smashes your internet usage.

1 hour ago, Snowbro said:

I have The Dark Knight on Blu-ray (1080p) and it looked better than when I watched it streamed in "4K".

Yeah, I think that when they say streaming in 4K it might not actually be 4K, just a name on a bitrate. These things will get better with time though, and if H.265 streaming (or whatever compression is invented after it) becomes a thing, it would be pretty easy for the services to re-compress their original files into new resolutions/bitrates to suit the internet speeds people are streaming on.
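That is more or less how adaptive streaming already works: the service keeps a ladder of pre-encoded resolution/bitrate rungs and the player picks one from measured bandwidth. A minimal sketch (the ladder values here are made up for illustration):

# Hypothetical bitrate ladder: (label, Mbps required), highest first.
LADDER = [("4K", 16.0), ("1440p", 10.0), ("1080p", 5.0),
          ("720p", 2.5), ("480p", 1.1)]

def pick_rung(measured_mbps, headroom=0.8):
    # Leave headroom so the buffer survives bandwidth dips.
    for label, need in LADDER:
        if measured_mbps * headroom >= need:
            return label
    return LADDER[-1][0]   # fall back to the lowest rung

print(pick_rung(25.0))   # -> 4K
print(pick_rung(6.0))    # -> 1080p, which is why "decent" internet can
                         #    still land you on a soft-looking stream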


5 minutes ago, kye said:

If you upload to YT etc. in 4K, then as internet speeds get better, more people will view it in higher resolutions. I'm sure that if the YT app sensed it had ridiculous speeds to play with, it would stream in 1440p or 4K and downscale to whatever the display resolution is, or at least it will in a future update. One thing I've noticed is that app developers are keen to give you a better user experience even if it absolutely smashes your internet usage.

YouTube is fine for me; I can usually watch things in 4K. Netflix automatically adjusts the resolution though, so I don't have control.

I only watch 4K on YouTube if I'm looking at an IQ comparison or something. HD looks fine as long as it's a 4K upload; native HD uploads are a bit too soft, especially if it's a poorly lit video. It's really about the bitrate.

