Help me on an eBay hunt for 4K under $200 - Is it possible?


Andrew Reid

Recommended Posts

2 hours ago, tupp said:

Nope.  Color depth is the number of different colors that can be produced in a given area.  A given area has to be considered, because imaging necessarily involves area, and area necessarily involves resolution.

 

Obviously, if a 1-bit imaging system produces more differing colors as the resolution is increased, then resolution is an important factor to color depth -- it is not just bit depth that determines color depth.

 

The common screen-printing example above is just such an imaging system: it produces a greater number of differing colors as the resolution increases, while the bit depth remains 1-bit.
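
To make that concrete, here is a tiny Python sketch (my own illustration, not from the thread) of the halftone idea: each dot in an n x n printing cell is strictly 1-bit, yet the number of distinct average tones the cell can render grows with its resolution.

```python
# Each dot in an n x n halftone cell is 1-bit (ink or no ink), yet the cell
# as a whole can render n*n + 1 distinct ink coverages (0..n*n dots inked).
def halftone_levels(n: int) -> int:
    """Distinct average tones a 1-bit n x n halftone cell can render."""
    return n * n + 1

for n in (1, 2, 4, 8):
    print(f"{n}x{n} cell -> {halftone_levels(n)} distinct tones")
# 1x1 -> 2, 2x2 -> 5, 4x4 -> 17, 8x8 -> 65: the bit depth stays at 1,
# but the tone count rises with resolution.
```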

 

 

The Wikipedia definition of color depth is severely flawed in at least two ways:

  1. it doesn't account for resolution;
  2. and it doesn't account for color depth in analog imaging systems -- which possess neither bit depth nor pixels.

 

Now, let us consider the wording of the Wikipedia definition of color depth that you quoted.   This definition actually gives two image areas for consideration:

  1. "a single pixel" -- meaning an RGB pixel group;
  2. and "the number of bits used for each color component of a single pixel" -- meaning a single pixel site of one of the color channels.

 

For simplicity's sake, let's just work with Wikipedia's area #2 -- a single-channel pixel site that can render "N" distinct tones (i.e., a bit depth of log₂N bits).  We will call the area of that pixel site "A."

 

If we double the resolution, the number of pixel sites in "A" increases to two.  Suddenly, we can produce more tones inside "A."  In fact, area "A" can now produce "N²" possible tonal combinations -- many more than "N" tones.

 

Likewise, if we quadruple the resolution, "A" suddenly contains four times the pixel sites that it did originally, with the number of possible tonal combinations within "A" now increasing to "N⁴."
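
To make the counting explicit (my own notation, under the assumption that each site independently renders one of "N" distinct tones):

```latex
% With k pixel sites in area A, each taking one of N tones:
\[
  \text{spatial combinations in } A = N^{k}
  \quad\text{(so } k=2 \Rightarrow N^{2} \text{ and } k=4 \Rightarrow N^{4}\text{)}
\]
% If the k sites are instead read as one averaged tone, the distinct levels
% number fewer, but still grow with resolution:
\[
  \text{distinct averages} = k\,(N-1) + 1
\]
% e.g. four 8-bit sites (N = 256) give 4(255) + 1 = 1021 average levels,
% essentially the 1024 levels of a 10-bit channel.
```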

 

Now, one might say, "that's not how it actually works in digital images -- two or four adjacent pixels are not designed to render a single tone."  Well, the fact is that there are some sensors and monitors that use more pixels within a pixel group than are found in the typical Bayer pixel group or in a striped RGB pixel group.  Furthermore (and probably most importantly), image detail can feather off within one, two, or three pixel groups, and such tiny transitions might be where higher tone/color depth is most utilized.

 

By the way, I didn't come up with the idea that resolution is "half" of color depth.  It is a fact that I learned when I studied color depth in analog photography in school -- back when there was no such thing as bit depth in imaging.

 

In addition, experts have more recently shown that higher resolutions give more color information (color depth), allowing for conversions from 4K, 4:2:0, 8-bit to Full HD, 4:4:4, 10-bit -- using the full, true 10-bit gamut of tones.  Here is Andrew Reid's article on the conversion and here is the corresponding EOSHD thread.
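
As a rough illustration of the mechanism behind that conversion, here is a minimal numpy sketch of the luma plane only (my own simplification -- the article's method also reconstructs the 4:2:0 chroma, which is omitted here):

```python
import numpy as np

# Sum each 2x2 block of 8-bit UHD samples into one FHD sample.  A sum of
# four values in 0..255 spans 0..1020, which fits the 10-bit range 0..1023.
rng = np.random.default_rng(0)
uhd_8bit = rng.integers(0, 256, size=(2160, 3840), dtype=np.uint16)

blocks = uhd_8bit.reshape(1080, 2, 1920, 2)   # pair up rows and columns
fhd_10bit = blocks.sum(axis=(1, 3))           # one 10-bit value per block

print(fhd_10bit.shape, fhd_10bit.max())       # (1080, 1920), at most 1020
```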

You're making progress, but haven't gotten there yet.

Please explain how, in an 8K image with banding -- an area with dozens or hundreds of pixels that are all the same colour -- the downsampling process will somehow give you something other than simply a lower-resolution version of that flat band of colour?

6 hours ago, kye said:

You're making progress, but haven't gotten there yet.

Glad to know that I am making progress.  You have not directly addressed most of my points, which suggests that you agree with them.

 

 

6 hours ago, kye said:

Please explain how, in an 8K image with banding -- an area with dozens or hundreds of pixels that are all the same colour -- the downsampling process will somehow give you something other than simply a lower-resolution version of that flat band of colour?

Firstly, the banding doesn't have to be eliminated in the down conversion to retain the full color depth of the original image.  Banding/posterization is merely an artifact that does not reduce the color depth of an image.  One can shoot a film with a hair in the gate or shoot a video with a dust speck on the sensor, yet the hair or dust speck does not reduce the image's color depth.

 

Secondly, broad patches of uniformly colored pale sky tend to exhibit shallow colors that do not utilize a lot of color depth bandwidth.  So, it's not as if there is much color depth lost in the areas of banding.

 

Thirdly, having no experience with 8K cameras, I am not sure whether the posterization threshold at such a high resolution behaves identically to that at lower resolutions.  Is the line in the same place?  Is it smooth or crooked or dappled?

 

In regards to eliminating banding during a down conversion, there are many ways to do so.  One common technique is selective dithering.  I have read that diffusion dithering is generally favored over other dithering methods.
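
For illustration, here is a bare-bones Python sketch of one diffusion method, Floyd-Steinberg error diffusion, assuming a single-channel image with float values in 0..1 (a minimal example, not production code):

```python
import numpy as np

def floyd_steinberg(img: np.ndarray, levels: int) -> np.ndarray:
    """Quantize a single-channel float image (0..1) to `levels` tones,
    diffusing each pixel's quantization error to unvisited neighbours."""
    out = img.astype(np.float64).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = round(old * (levels - 1)) / (levels - 1)
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

# A smooth ramp quantized to 4 tones: without dithering it bands; with
# error diffusion the bands break up into a fine, less visible pattern.
ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
dithered = floyd_steinberg(ramp, levels=4)
```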


8 hours ago, tupp said:

Glad to know that I am making progress.  You have not directly addressed most of my points, which suggests that you agree with them.

 

 

Firstly, the banding doesn't have to be eliminated in the down conversion to retain the full color depth of the original image.  Banding/posterization is merely an artifact that does not reduce the color depth of an image.  One can shoot a film with a hair in the gate or shoot a video with a dust speck on the sensor, yet the hair or dust speck does not reduce the image's color depth.

 

Secondly, broad patches of uniformly colored pale sky tend to exhibit shallow colors that do not utilize a lot of color depth bandwidth.  So, it's not as if there is much color depth lost in the areas of banding.

 

Thirdly, having no experience with 8K cameras, I am not sure whether the posterization threshold at such a high resolution behaves identically to that at lower resolutions.  Is the line in the same place?  Is it smooth or crooked or dappled?

 

In regards to eliminating banding during a down conversion, there are many ways to do so.  One common technique is selective dithering.  I have read that diffusion dithering is generally favored over other dithering methods.

You're really not getting this....

Let's revisit your original statement:

On 3/19/2021 at 1:39 AM, tupp said:

4K has 4 times the color depth (and 4 times the bit rate) of full HD, all other variables being equal and barring compression or any artificial effects.

So, if 4K has 4 times the colour depth, then downscaled to FHD it should be equivalent to FHD 10-bit.

When I shoot a 4K 8-bit image and get banding in it, and downscale it to FHD, why does the banding remain?  If I took the same shot in FHD 10-bit, there is no banding, so why doesn't the banding get eliminated like you've claimed in your original statement?


5 minutes ago, kye said:

You're really not getting this....

I sense irony here.

 

 

6 minutes ago, kye said:

So, if 4K has 4 times the colour depth, then downscaled to FHD it should be equivalent to FHD 10-bit.

A conversion can be made so that the resulting Full HD 10-bit image has essentially the same color depth as the original 4K 8-bit image.  Of course, there are slight conversion losses/discrepancies.

 

 

11 minutes ago, kye said:

When I shoot a 4K 8-bit image and get banding in it, and downscale it to FHD, why does the banding remain?  If I took the same shot in FHD 10-bit, there is no banding,...

The banding remains because it is an artifact that is inherent in the original image.

 

That artifact has nothing to do with the color depth of the resulting image -- the banding artifact in this case is caused merely by a lower bit depth failing to properly render a subtle transition.  However, do not forget that bit depth is not color depth -- bit depth is just one factor of color depth.

 

It's actually very simple.

 

 

14 minutes ago, kye said:

... so why doesn't the banding get eliminated like you've claimed in your original statement?

I never claimed that the banding would get eliminated in a straight down-conversion.  In fact, I made this statement regarding your scenario of a straight down-conversion from an 8K banded image to an SD image: 

22 hours ago, tupp said:

it is very possible that the 8K camera will exhibit a banding/posterization artifact just like the SD camera.

 

Where did you get the idea that I said banding would get eliminated in a straight conversion?

 

You have to grasp the difference between banding/posterization artifacts and the color depth of an image.


22 minutes ago, tupp said:

I sense irony here.

I'm sensing things here too, but it's not irony.

22 minutes ago, tupp said:

A conversion can be made so that the resulting Full HD 10-bit image has essentially the same color depth as the original 4K 8-bit image.  Of course, there are slight conversion losses/discrepancies.

Ah, now we've changed the game.  You're saying that the resulting downscaled image will have the same reduced colour depth as the original image.  This is not what you have been saying up until this point.

You said that "4K has 4 times the color depth (and 4 times the bit rate) of full HD" which implies that I can film a 4K 8-bit image and get greater colour depth than FHD 8-bit, but now you're saying that the resulting downscale to FHD will have the same limitations to colour depth, which completely disagrees with your original statement.

22 minutes ago, tupp said:

The banding remains because it is an artifact that is inherent in the original image.

Correct.

Which is why "4K has 4 times the color depth (and 4 times the bit rate) of full HD" is a fundamentally incorrect statement.

22 minutes ago, tupp said:

That artifact has nothing to do with the color depth of the resulting image

I shoot with 8-bit, I get colour banding.
I shoot with 10-bit, I don't get colour banding.

Seems like it has everything to do with the colour depth of the resulting image.

22 minutes ago, tupp said:

That artifact has nothing to do with the color depth of the resulting image -- the banding artifact in this case is caused merely by a lower bit depth failing to properly render a subtle transition.  However, do not forget that bit depth is not color depth -- bit depth is just one factor of color depth.

Please provide links to any articles or definitions (or anything at all) that talk about how colour depth is different to bit depth, because I have looked and I can't find a single reference where someone has made the distinction except you.  It seems suspiciously like you're changing the definition just to avoid being called out for posting BS online.

22 minutes ago, tupp said:

It's actually very simple.

Then explain it simply.

I have asked you lots of times to do so.

The really sad thing is that there is some basis to this (and thus why Andrew and others have reported on it) and there are some situations where downscaling does in fact have a similar effect to having shot in an increased bit-depth, but you are not explaining how to tell when these situations apply and when they do not.

Making assertions that resolution can increase bit-depth but then saying that banding will still occur is simply disagreeing with yourself.

For those having to read this: firstly, I'm sorry that discussions like this happen and that it is so difficult to call someone out for posting misleading, generic BS.  The reason I do this is that, as I've learned more about film-making and the tech behind it, I've realised that so many of the things people say on forums like these are just factually incorrect.  This would be fine -- I'm not someone who is fact-checking 4chan or anything -- but people make decisions and spend their limited funds on the basis of BS like this, so I feel that we should do our best to call it out when we see it, so that people are better off, rather than worse off, after reading these things.


To that end, here's the explanation about resolution vs bit-depth.

You can downscale a higher-resolution image to a lower resolution and get some/all of the benefits associated with an increased bit-depth, but only in certain circumstances.  The critical factor is the noise / detail in the image.  If there is a decent amount of noise then this technique works, but if there isn't any noise or detail then it won't work.

You also need to be aware that this detail or noise has to be present at the point where the downsample happens.  For example, if you are downsampling in post, perhaps by putting 4K files on a 1080 timeline, then the detail and noise need to be present in the files coming out of the camera.
Therefore, any noise-reduction that happens in camera will limit or eliminate this effect.  Any flattening of areas of the image due to compression will limit or eliminate this effect.  This is why banding in skies or in out-of-focus areas is not fixed by this process, and needs to be captured in 10-bit or higher in the first place.

The effect operates pixel by pixel, based on what the surrounding pixels are doing, so you might get varying levels of it in different parts of the same frame, depending on the values of each group of pixels.
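
Here is a minimal sketch of that point, using a synthetic ramp (a toy example of my own, with an exaggerated 16-level quantizer so the effect is obvious):

```python
import numpy as np

# Quantize a smooth ramp coarsely, with and without noise added before
# quantization, then 2x box-downscale both and count surviving levels.
levels = 16
ramp = np.tile(np.linspace(0.0, 1.0, 512), (512, 1))

def quantize(img):
    return np.round(img * (levels - 1)) / (levels - 1)

def downscale2x(img):
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
clean = downscale2x(quantize(ramp))                       # no noise captured
noisy = downscale2x(quantize(ramp + rng.normal(0.0, 0.02, ramp.shape)))

print(len(np.unique(clean)), len(np.unique(noisy)))
# The clean ramp keeps its coarse steps -- the bands survive the downscale.
# The noisy ramp averages out to far more intermediate levels -- the bands
# dissolve, which is the dither-like effect described above.
```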

This is one of the reasons why RAW is so good: it gives a really good bit-depth (which is colour depth!) and it doesn't eliminate the noise in the image, which can benefit the processing in post even further.

Some examples:

  • If you're shooting something with a lot of detail or texture, and it is sharply in-focus, then the variation in colour between adjacent pixels will enable this effect.  For example, skin tones sharply in focus can get this benefit.  If there is noise in the image above a certain amount then everything that has this level of noise will benefit, such as out-of-focus areas and skies etc.
  • Skies from a camera with a poor bitrate and/or noise reduction will not be saved by this method.
  • Skin-tones from a camera with a poor bitrate and/or noise reduction will not be saved by this method either.
  • Details that are out-of-focus from a camera with a poor bitrate and/or noise reduction will not be saved by this method.

This is why I really shake my head when I see all the Sony 4K 8-bit 100Mbps S-Log cameras.  100Mbps is a very low bitrate for 4K (for comparison, 4K ProRes HQ is 707Mbps, and even 1080p ProRes HQ is 176Mbps -- almost double Sony's 4K bitrate for a quarter of the pixels!), and combined with the 8-bit, the very low-contrast S-Log curves, and the low noise of Sony cameras, it means they're very susceptible to banding and 8-bit artefacts, which will not be saved by this method.
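
To put those numbers side by side, a quick back-of-envelope comparison (25fps is my assumed frame rate, purely for illustration):

```python
# Bits available per pixel per frame at a given bitrate (before compression
# efficiency is considered -- a rough gauge, not an exact quality measure).
def bits_per_pixel(mbps: float, width: int, height: int, fps: float = 25) -> float:
    return mbps * 1e6 / (width * height * fps)

print(bits_per_pixel(100, 3840, 2160))   # Sony 4K 100Mbps  -> ~0.48 bpp
print(bits_per_pixel(707, 3840, 2160))   # 4K ProRes HQ     -> ~3.41 bpp
print(bits_per_pixel(176, 1920, 1080))   # 1080p ProRes HQ  -> ~3.40 bpp
```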

What can you do to improve this if, for example, you are buying a budget 8-bit camera with 4K so that you can get better 1080p images?

Well, beyond making sure you're choosing the highest bit-rate and bit-depth the camera offers, then, assuming the camera has manual settings, you can try using a higher ISO.
Seriously.
Find a scene with some smooth colour gradients -- a blue sky works great, or even indoors, a light pointed at a wall will give a gradual falloff away from the light -- then shoot the same exposure across all the available ISO settings.  You may need to balance the exposure with shutter speed for this test, which is fine.  If the walls are too busy, set the lens as far out of focus as possible and use the largest aperture to get the biggest blurs.  Blurs are smooth graduations.
Then bring the files into post, put them onto the lower-resolution timeline, and compare the smoothness of the blurs and any colour banding.
Maybe your camera will be fine at base ISO, which is great, but maybe you have to raise it up some; at some point it should get noisy enough to eliminate the banding.  If you've eliminated the banding, then the bit-depth increase will work in all situations, as banding is the hardest artefact to eliminate with this method.
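
If you want to compare the test clips numerically rather than just by eye, something like this crude helper could work (a hypothetical score of my own devising, assuming the frames are loaded as single-channel numpy crops of the gradient area):

```python
import numpy as np

def banding_score(gradient_crop: np.ndarray) -> float:
    """Crude banding score for a crop that should hold a smooth left-to-right
    gradient: average each column (suppressing noise), then return the largest
    jump between adjacent column means.  Near zero means smooth; large steps
    mean visible bands.  Not a standard metric -- just a ranking heuristic."""
    profile = gradient_crop.astype(np.float64).mean(axis=0)
    return float(np.max(np.abs(np.diff(profile))))

# Usage sketch: export a frame from each ISO test clip, load the same crop
# from each, and pick the lowest ISO whose score has levelled off.
```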

Be aware that by raising the ISO you're probably also lowering DR and lowering colour performance, so it's definitely a trade-off.

Hopefully that's useful, and hopefully it's now obvious why "4K has more colour depth" is a misleading oversimplification.


39 minutes ago, kye said:

Ah, now we've changed the game.  You're saying that the resulting downscaled image will have the same reduced colour depth as the original image.  This is not what you have been saying up until this point.

Nope.  I said "A conversion can be made so that the resulting Full HD 10-bit image has essentially the same color depth as the original 4K 8-bit image."

 

I didn't say anything about the original image having "reduced" color depth.  You came up with that.

 

However, the original image does have lower bit depth than the down-converted image -- even though both images have the same color depth.

 

 

46 minutes ago, kye said:

You said that "4K has 4 times the color depth (and 4 times the bit rate) of full HD" which implies that I can film a 4K 8-bit image and get greater colour depth than FHD 8-bit,...

Yes.  That is a fact -- all other variables being the same in both instances.

 

 

48 minutes ago, kye said:

... but now you're saying that the resulting downscale to FHD will have the same limitations to colour depth, which completely disagrees with your original statement.

No. It doesn't disagree with anything that I have said.  You are just confusing bit depth with color depth.

 

With such a down-conversion, resolution is being swapped for greater bit depth -- but the color depth remains the same in both images.

 

It really is simple, and Andrew Reid's article linked above verifies and explains the down-conversion process.

 

 

1 hour ago, tupp said:

The banding remains because it is an artifact that is inherent in the original image.

55 minutes ago, kye said:

Correct.  Which is why "4K has 4 times the color depth (and 4 times the bit rate) of full HD" is a fundamentally incorrect statement.

No. The banding artifact is caused by the bit depth -- not by the color depth.  Posterization, bit depth and color depth are all different properties.

 

 

58 minutes ago, kye said:

I shoot with 8-bit, I get colour banding.
I shoot with 10-bit, I don't get colour banding.

Seems like it has everything to do with the colour depth of the resulting image.

Well, it would be so if bit depth and color depth were the same thing, but they're not.

 

Stop confusing bit depth with color depth.

 

 

1 hour ago, kye said:

Please provide links to any articles or definitions (or anything at all) that talks about how colour depth is different to bit depth, because I have looked and I can't find a single reference where someone has made the distinction except you

I linked an article by the founder of this forum in which a prominent image-processing expert verifies that one can swap resolution for higher bit depth when down-converting images -- while retaining the same amount of color information (color depth):

20 hours ago, tupp said:

In addition, experts have more recently shown that higher resolutions give more color information (color depth), allowing for conversions from 4K, 4:2:0, 8-bit to Full HD, 4:4:4, 10-bit -- using the full, true 10-bit gamut of tones.  Here is Andrew Reid's article on the conversion and here is the corresponding EOSHD thread.

 

Also, that very article gives a link to the expert's Twitter feed, should you care to verify that the down-conversion process is actually a fact.

 

Furthermore, I have given an example proving that resolution affects color depth, and you even agreed that it was correct: 

On 3/18/2021 at 5:27 PM, kye said:

Your newspaper example is technically correct...

 

Additionally, I gave a lead-pipe cinch demonstration of how increasing pixel sites within a given area creates more possible shades, to which you did not directly respond.  Do you deny the facts of that demonstration?

 

 

1 hour ago, kye said:

it seems suspiciously like you're changing the definition just to avoid being called out for posting BS online.

I have repeated essentially the same thing over and over.  It is you who keeps lapsing into confusion between color depth and bit depth.

 

 

1 hour ago, kye said:

Then explain it simply.

I have asked you lots of times to do so.

I have paraphrased the concept repeatedly in different forms, and I have given examples that clearly demonstrate that resolution is integral to color/tone depth.  In addition, I have linked an article from the forum founder that further explains the concept and how it can work in a practical application.

 

If you don't understand the simple concept by now, then I am not sure there is much more that I can do.

 

 

1 hour ago, kye said:

The really sad thing is that there is some basis to this (and thus why Andrew and others have reported on it)

Wait, I thought that you said:

1 hour ago, kye said:

I have looked and I can't find a single reference where someone has made the distinction except you

 

 

1 hour ago, kye said:

Making assertions that resolution can increase bit-depth but then saying that banding will still occur is simply disagreeing with yourself.

I never asserted that "resolution can increase bit-depth."  Please quit putting words into my mouth.  Changing the resolution will have no effect on the bit depth nor vice versa -- resolution and bit depth are two independent properties.

 

On the other hand, resolution and bit depth are the two non-perceptual factors of color depth.  So, if the resolution is increased, the color depth is increased (as exemplified in halftone printing and in my pixel site example).  Likewise, if the bit depth is increased, the color depth is increased.

 

Banding/posterization is something completely different.  It is a phenomenon that can occur with lower bit depths in some situations, but it can also occur in analog imaging systems that possess neither bit depth nor pixels.  The color depth of an image is not affected by whether or not the image exhibits posterization.

 

Let's say that you shoot two 4K, 8-bit images of the sky within the same few seconds: one aimed at a part of the sky that exhibits no banding, and one aimed at a portion of the sky that produces banding.  Then, you down-convert both images to Full HD, 10-bit using the method described in Andrew Reid's article, in which no color information is lost from the original images.  Would you say that the Full HD, 10-bit image of the banded sky has less color depth than the Full HD, 10-bit image of the smooth sky?

 

 

1 hour ago, kye said:

For those having to read this: firstly, I'm sorry that discussions like this happen and that it is so difficult to call someone out for posting misleading, generic BS.  The reason I do this is that, as I've learned more about film-making and the tech behind it, I've realised that so many of the things people say on forums like these are just factually incorrect.  This would be fine -- I'm not someone who is fact-checking 4chan or anything -- but people make decisions and spend their limited funds on the basis of BS like this, so I feel that we should do our best to call it out when we see it, so that people are better off, rather than worse off, after reading these things.

Well, it's good that you are looking out for us.  Now, if you could only avoid confusing bit depth and color depth...


On 3/18/2021 at 6:41 PM, Andrew Reid said:

Nope.

Well, I guess I got "busted" for having not read your article before posting! Ha! "So smartphones are out, as are GoPros or cheap rubbish."

But, on a related side point: wasn't it partly the early adoption of such consumer-friendly and inexpensive, um, "4K creation devices" that helped fuel the widespread adoption of 4K? Methinks we're in the middle of seeing this happen again with both 8K and HDR.

Anyhoo, back to the topic...


7 hours ago, fuzzynormal said:

Why the bump in used camera prices?  Is it because brand offerings of enthusiast cams in the mid/low tier range have restricted a little over the past few years?  More YT kids trying to buy in on the used market?

I suspect a few things:

  1. People in quarantine taking up new hobbies and wanting a "cheaper older camera"
  2. People in quarantine / not able to travel getting nostalgic for older equipment they sold or used to lust after
  3. People who ravenously upgraded because of specs have discovered that quantity of pixels isn't a substitute for quality of pixels and are upgrading to older cameras

16 hours ago, fuzzynormal said:

Why the bump in used camera prices?  Is it because brand offerings of enthusiast cams in the mid/low tier range have restricted a little over the past few years?  More YT kids trying to buy in on the used market?

Inflation?
[image attachment]


8 hours ago, kye said:

I suspect a few things:

  1. People in quarantine taking up new hobbies and wanting a "cheaper older camera"
  2. People in quarantine / not able to travel getting nostalgic for older equipment they sold or used to lust after
  3. People who ravenously upgraded because of specs have discovered that quantity of pixels isn't a substitute for quality of pixels and are upgrading to older cameras

Or all of this!

But I think it is also a good theory that, as manufacturers have given up on the low end, people who would have been entry-level camera buyers have instead turned to the secondhand market! Which has helped some of those prices hold up over the years.


20 hours ago, fuzzynormal said:

Why the bump in used camera prices?  Is it because brand offerings of enthusiast cams in the mid/low tier range have restricted a little over the past few years?  More YT kids trying to buy in on the used market?

Too many people having too much time on their hands to look for the best and cheapest camera.


On 3/21/2021 at 11:49 AM, fuzzynormal said:

Why the bump in used camera prices?  Is it because brand offerings of enthusiast cams in the mid/low tier range have restricted a little over the past few years?  More YT kids trying to buy in on the used market?

I wonder if the retail price for cameras over the last year or so has made people reconsider buying used? As great as they were, most folks simply aren't in a position to drop $3500 or more on the A7siii or S1H, especially during a pandemic. 

The truth is you can create your vision with pretty much any camera that has been released in the last 5 years (or older if you don't care about 4K!), or at least get close enough that you're willing to forgo the latest gear in favor of value. 

It might be hard to find a $200 camera that can do 4K and has an M43 or bigger sensor, but the fact that you can find one in the $300-350 range, and it's not that far off from what you'd need to spend $2000 or more to get, is a pretty huge deal. The GX85 I bought for a bargain still holds its own, and if it weren't for my concerns with the M43 system as a whole I'd be perfectly fine using it with my G85 and GH5 for years to come.

