
What was the first professional camera to shoot LOG gamma?


maxotics

@Django Here’s a list of portable devices that support Mobile HDR: Samsung Galaxy Note 8, Samsung Galaxy S8, Samsung S8 Plus, LG G6, LG V30, Sony Xperia XZ Premium, Sony Xperia XZ1, Pixel, Razer Phone, Apple iPhone X, Apple iPhone 8, Apple iPhone 8 Plus, Apple iPad Pro, Samsung Galaxy Tab S3, and the Samsung Galaxy Book.

For the fiscal first quarter of 2018, Apple alone shipped nearly 80 million smartphones worldwide, representing over 1% of the world’s population. And that’s just 3 months for one manufacturer, not including tablets and Apple TV. So I believe the number is actually far greater than .2%. lol
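A quick check of that figure (my sketch; assumes a world population of roughly 7.6 billion in early 2018):

```python
iphones_q1_2018 = 80e6        # units shipped, per the figure above
world_pop = 7.6e9             # assumed early-2018 world population
print(f"{iphones_q1_2018 / world_pop:.1%}")   # -> 1.1%
```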

 

17 minutes ago, maxotics said:

My latest video covered this subject.  The short answer is that the sensor is helpless.  When making video, the manufacturer will try to maximize visual information in its 8-bit space.  I don't believe 10-bit really exists at the consumer level because camera chips, AFAIK, are 8-bit.  I am NOT AN EXPERT, in that I don't work for a camera company and never have.  I don't have an engineering degree.  I'm completely self-taught.  My conclusions are based on personal experiments where I have to build a lot of the tests.  I'm happy to help anyone find an excuse to ignore me ;)

The question you want to ask yourself is how much the manufacturer knows that it isn't telling you, and that no one else has an economic incentive to dig into.  This isn't conspiracy stuff, this is business as usual.  To see it on a trillion-dollar scale, watch the VW episode of the new Netflix documentary series "Dirty Money".

Anyway, Sony and Panasonic know exactly how much color information is lost for every LOG "pulling" of higher-DR data, and how noisy that data is.  Since they sell many cameras on the strength of users believing they're getting "professional" tools, why should they point out their weaknesses?  Same for the cottage industry of color profiles.  Why don't those people run tests to tell you how much color information you lose in LOG?  There is no economic incentive here.  I'm only interested because I'm one of those hyper-curious people.

A case in point is Jon's comment.  No offense, Jon, but think about what this implies.  It implies that 8-bit LOG can capture a fully saturated HDR image.  I don't believe it can, compared to a RAW-based source.  Again, the reason is that LOG in 8-bit must sacrifice accurate color data (midtones) for inaccurate color data (extended DR).  This is gobsmackingly obvious when one looks at the higher ISOs LOG shoots at.

Again, everyone is constrained by current bandwidth and 8-bit processors.  All the companies Jon mentioned want to sell TVs, cameras, etc.  They need something new.  The question is, are they going to deliver real benefits or will they, like VW, just plain lie about what's really happening when you take your equipment out ;)  If you don't know, VW's diesel TDI was supposed to be very clean.  It was, when on a dyno.  As soon as you turned the steering wheel, the defeat device switched off the emissions controls and the car spewed out 40x more pollutants than drivers thought when they bought it.

 

So now you’re resorting to accusing camera manufacturers of deceiving the public about bit depth, without any concrete proof. I won’t go there - whether it’s 9-1/2 or 10-bit, I can’t say. All I can respond at this point is that you’re grasping at straws. Essentially, what you’re also saying is that my Atomos can’t record 10-bit log from my GH5, is that right?


17 minutes ago, jonpais said:

So now you’re resorting to accusing camera manufacturers of deceiving the public about bit depth, without any concrete proof. I won’t go there - whether it’s 9-1/2 or 10-bit, I can’t say. All I can respond at this point is that you’re grasping at straws. 

Please, Jon.  I'm not accusing them of deceiving anyone.  It's NOT their job to educate people in how to use cameras.  They are completely right to let the user decide when/if to use LOG.  From my experience, users can't be educated anyway ;)  I'm only using VW as an example that many companies are desperate for market share and aren't looking at things the way others are.  And I'm only pointing out that the data is available, and one should at least ask why it isn't given out.

As for whether it's 9-1/2 or 10-bit: you can do that experiment yourself.  (It's what I'm trying to do, but it's taking me time to build up the test rig.)  Go out and shoot a high-DR scene in 8-bit, 10-bit, and RAW (if possible), then show how much more DR and color there is in the 10-bit footage over the 8-bit.  My first experiments, last year, indicated that except for some very small improvements in banding, there is no real extra DR in 10-bit.
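If anyone wants to reproduce this, here's roughly how I'd measure the banding part (a sketch only; the file names, crop window, and the ffmpeg export step are placeholders you'd adapt):

```python
# Compare how many distinct tonal levels survive in a smooth gradient
# (e.g. clear sky) in an 8-bit vs 10-bit clip. First export one frame
# from each clip at 16 bits per channel, e.g.:
#   ffmpeg -i clip.mov -frames:v 1 -pix_fmt rgb48be frame.png
import numpy as np
import imageio.v3 as iio

def distinct_levels(path, crop):
    """Count distinct green-channel values inside a gradient crop.
    More distinct levels across the same gradient = less banding."""
    y0, y1, x0, x1 = crop
    frame = iio.imread(path)                 # uint16 RGB
    return np.unique(frame[y0:y1, x0:x1, 1]).size

crop = (100, 200, 100, 800)                  # a patch of featureless sky
print("8-bit clip: ", distinct_levels("frame_8bit.png", crop))
print("10-bit clip:", distinct_levels("frame_10bit.png", crop))
```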


11 minutes ago, jonpais said:

And if I record from my GH5 to an Atomos Ninja Inferno with ProRes HQ, that’s 8-bit too, is that what you’re saying?

Again, I recorded 10-bit video using the Sony X70.  I could find no real difference in DR compared to 8-bit.  I have no idea whether the GH5 and Atomos Ninja fully "saturate" the 10-bit space with visual data.  I haven't seen anything online that leads me to believe there is a huge difference.  In theory, I'm with you!  10-bit video should be serious competition to RAW.  I just haven't seen it.  Jon, really, I'm not trying to claim there is any conspiracy, or that 10-bit or HDR can't deliver real improvements.  I just haven't seen them in in-camera recording.

So my opinion:

  • Large sensors with fat pixels (C100, Sony A7S, etc.)... HUGE
  • 4K... HUGE
  • LOG... Very useful, but over-used.
  • 10-bit shooting... SKEPTICAL with existing tech

@maxotics You’re only able to see the advantage of 10-bit when applying LUTs or color grading. If you’re seeing a difference between 8-bit and 10-bit without grading, you’re messing something up. Of course, GH5 10-bit is obviously not going to compete with a Red or Alexa: the GH5 isn’t a $75,000 camera after all! Also, logically, whatever movie you watched, whatever book you read,  or whatever Volvo did or did not do has absolutely nothing to do with HDR. You are only trying to divert attention away from your previous unverifiable assertions. I talk about log, you bring up 8-bit; then you accuse manufacturers of lying, wanting to sell more television sets and all sorts of other irrelevant nonsense - and the goal posts keep shifting. I don’t meddle in the Blackmagic forums because I have never shot with one: you’ve never watched HDR in your home, so how can you make pronouncements like that? I’d be more than happy to talk facts, but the ‘whataboutisms’ are just too much. And for what it’s worth, I did not say that log is necessary only for Panasonic when capturing for HDR - it applies to all cameras, whether it’s Canon, Arri, Red or Sony. So are you saying all these companies are lying about the bit-depth of their cameras?


1 hour ago, jonpais said:

You’re only able to see the advantage of 10-bit when applying LUTs or color grading

98% false.  You might be enjoying a placebo effect. But I'm not going to beat a dead horse here.

2 hours ago, IronFilm said:

I don't see why you expect one *must* change just because the other changes. (yes, they might, but doesn't mean they're always connected)

In order to display a continuous color gradient, each code value must be close enough to the next that the colors blend together.  If they aren't, we notice banding.  I'm just using banding because it is the easiest way to visualize whether the bit depth is great enough to spread a color out evenly through the data space.  So let's assume that we never see banding in 5 stops of DR with 256 shades of red (8-bit).  If we reduce it to 200 shades, we notice banding (and if we increase it to 512, we don't notice any difference).

Okay, so let's say we shoot at 10 stops: how many bits do we need to maintain smooth color, assuming, again, that we accept we needed 256 values in 8 bits?  Please think this out, @IronFilm, and explain what bit depth you came up with and why.  Then explain why bit depth and DR are not married at the hip.

To put this in audio terms, which you should know more about than I do: why don't you record your audio in 8 bits, which, I believe, is closer to what is distributed than the 16-bit you probably use?  Is DR in audio not connected to bit depth?  If I told you I had a new "Audio LOG" technology that would give you, at 8-bit, the same quality you get with your mics now, what would you say to me?
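To make the arithmetic concrete, here's a toy sketch (my own illustration: a simplified linear model and an idealized log curve, not any camera's actual encoding):

```python
# Linear encoding halves the available code values with each stop down
# from clipping; an idealized log encoding spreads them evenly per stop.

def linear_values_per_stop(bits, stops):
    """Code values landing in each stop, brightest stop first."""
    total = 2 ** bits
    return [total // (2 ** (s + 1)) for s in range(stops)]

def log_values_per_stop(bits, stops):
    """Idealized log curve: every stop gets an equal share."""
    return 2 ** bits / stops

for bits, stops in [(8, 5), (8, 10), (10, 10)]:
    lin = linear_values_per_stop(bits, stops)
    print(f"{bits}-bit over {stops} stops: "
          f"linear top stop {lin[0]}, bottom stop {lin[-1]}; "
          f"log, every stop ~{log_values_per_stop(bits, stops):.0f}")
# 8-bit over 10 stops leaves the deepest linear stop with 0 whole code
# values; the log curve still gives each stop ~26, but only by taking
# values away from the midtones -- which is exactly my complaint.
```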


15 minutes ago, jonpais said:

The world will go on shooting RAW or log for HDR in spite of your disapproval

Am I speaking English?  I don't disapprove of HDR or LOG or anything.  I only DISAGREE with the claim that LOG can fit more DR into a fixed bit depth without, essentially, overwriting data.  And I'm not making a judgment on you, or anyone else, who believes they can grade 10-bit footage better than 8-bit.  This stuff is esoteric and, in the scheme of things, completely pointless.  One shoots with what one can afford.  Even if I approved, who gives a sh_t? ;)  I've gone to great pains to figure some of this arcane stuff out.  If you say, "I don't care what you say, Max, I love my image," I have no answer to that!  If you say, "this isn't working as well as I thought it would" (which is what happened to me and got me on this whole, long, thankless trip), then I have some answers for you.

The short answer is that the manufacturers have already maximized video quality in 8-bit, and there is no way of recording increased DR unless you give up color fidelity.  Therefore, if color fidelity is your prime goal, don't shoot LOG.

In 10-bit, theoretically, there should be more DR.  But this 10-bit isn't true 10-bit built from 10 bits of RGB values per channel; it's 10-bit in the sense that when three 8-bit values are averaged, say 128, 129, and 131, it saves the result as 129.33 instead of 129.  If you want me to explain this in greater depth, let me know.  Though it doesn't seem you trust anything I say.
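Here's a toy illustration of what I mean (numbers made up; this is my understanding of the claim, not a verified account of any camera's pipeline):

```python
# Averaging several 8-bit samples yields fractional values; a 10-bit
# container (4x finer steps) can keep some of that precision instead
# of rounding back to whole 8-bit codes.
samples = [128, 129, 131]            # three 8-bit values
avg = sum(samples) / len(samples)    # 129.333...
code_8bit = round(avg)               # 129  (stored as 129/255)
code_10bit = round(avg * 4)          # 517  (stored as 517/1023, ~129.25/255)
print(avg, code_8bit, code_10bit, code_10bit / 4)
```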


But you just said that HDR was a scheme to sell more television sets. And how can you presume to know more than Arri? How can you make baseless accusations about manufacturers lying about bit depth? Which manufacturers are lying? Why couldn’t you answer my question about trading off some color saturation for significant gains in dynamic range? How can you insist that the human eye can take just so much dynamic range when Dolby and others arrive at completely different conclusions?  Why is it that almost every authority says we can’t discern a difference between a 10-bit image and an 8-bit image but you say that’s false? What do Volvo or VW have to do with HDR? And how can you claim that everyone is happy with current display technology? Why are cellphone companies continually striving to improve their displays? Hundreds of millions of people around the world already own an HDR device. Finally, this is not about bit-depth - it’s about shooting log for HDR delivery.


1 hour ago, jonpais said:

But you just said that HDR was a scheme to sell more television sets. And how can you presume to know more than Arri? How can you make baseless accusations about manufacturers lying about bit depth? Which manufacturers are lying? Why couldn’t you answer my question about trading off some color saturation for significant gains in dynamic range? How can you insist that the human eye can take just so much dynamic range when Dolby and others arrive at completely different conclusions?  Why is it that almost every authority says we can’t discern a difference between a 10-bit image and an 8-bit image but you say that’s false? What do Volvo or VW have to do with HDR? And how can you claim that everyone is happy with current display technology? Why are cellphone companies continually striving to improve their displays? Hundreds of millions of people around the world already own an HDR device. Finally, this is not about bit-depth - it’s about shooting log for HDR delivery.

HDR is a scheme to sell more television sets.  They are not artists living off free love.  How much the technology can/will deliver is the question.  I NEVER said manufacturers were lying about bit depth.  Please, if you're going to put words in my mouth, quote me.  I thought I answered all your questions.  I don't know exactly what the eye can take in.  I only explained my experience.  You said it looks fantastic.  I said, "great, I look forward to it."  Yes, you can't see the difference between TRUE 8-bit and 10-bit image data, but that was NOT what we were talking about.  We're talking about 8- and 10-bit consumer video.

And I don't understand why you fight me so much on these technical issues when you say you're not interested in math.  I was never good at math, but I have taught myself what I need, I believe, to understand these issues.  I made that effort.  You don't have to make the effort if you don't want to.  But to fight me on it when I have done the work, and you haven't, is, well, disrespectful.

You may think I have disparaged what you said about HDR.  If you read again, you will see I have not.  I have only pointed out technical problems that I would think they're up against.  Because, as I've said repeatedly, I have not seen good HDR, I can only speculate.  And for the umpteenth time: if LOG can't really fit good color AND extended DR into 8-bit, how will HDR make an end-run around that?  Again, I'm not saying there isn't more to the technology.  I'll just have to see, literally ;)


50 minutes ago, maxotics said:

HDR is a scheme to sell more television sets.  They are not artists living off free love.  How much the technology can/will deliver is the question.  I NEVER said manufacturers were lying about bit depth.  Please, if you're going to put words in my mouth, quote me.  I thought I answered all your questions.  I don't know exactly what the eye can take in.  I only explained my experience.  You said it looks fantastic.  I said, "great, I look forward to it."  Yes, you can't see the difference between TRUE 8-bit and 10-bit image data, but that was NOT what we were talking about.  We're talking about 8- and 10-bit consumer video.

And I don't understand why you fight me so much on these technical issues when you say you're not interested in math.  I was never good at math, but I have taught myself what I need, I believe, to understand these issues.  I made that effort.  You don't have to make the effort if you don't want to.  But to fight me on it when I have done the work, and you haven't, is, well, disrespectful.

You may think I have disparaged what you said about HDR.  If you read again, you will see I have not.  I have only pointed out technical problems that I would think they're up against.  Because, as I've said repeatedly, I have not seen good HDR, I can only speculate.  And for the umpteenth time: if LOG can't really fit good color AND extended DR into 8-bit, how will HDR make an end-run around that?  Again, I'm not saying there isn't more to the technology.  I'll just have to see, literally ;)

Can you send me a link to the video you mention where you discuss bit depth per stop in various formats/gammas? I want to make sure I watch the right one. It's an interesting topic and worth exploring. There are, no doubt, trade-offs, with log gammas screwing with tonality. But by distributing data differently (I believe most camera sensors have 14-bit ADCs for RAW, but that data is not stored efficiently) you can maintain good tonality in a smaller package. Which is the whole point of log capture. No one says it's better than RAW capture, but in the case of the Alexa, for instance, 10-bit Log C 444 is maybe 99.9% as good, and a tiny fraction of the size.

Furthermore, dynamic range is not the question so much as tonality is. With adequate dithering (or, in the case of most cameras, noisy sensors doing the job for you) you can avoid banding for any given dynamic range at an arbitrarily low bit depth. (At a certain point it'll just be dithered black and white pixels, but no banding!) The color and tonality, however, will suffer terribly. I shoot a bit with a Sigma DP2 and I notice a lot of poor tonality on that camera relative to the gold standard of 4x5 slide film, despite both having poor dynamic range, and even in RAW. I believe it has a pretty low-bit ADC.
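A quick toy demonstration of that point (a synthetic ramp, not camera data):

```python
# Quantize a smooth ramp to 3 bits, with and without dither. The dithered
# version trades banding for grain: locally it averages back to the ramp.
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 4096)
step = 1.0 / (2 ** 3 - 1)                       # 3-bit quantization step

banded = np.round(ramp / step) * step
dithered = np.round((ramp + rng.uniform(-0.5, 0.5, ramp.size) * step) / step) * step

# Compare local averages to the original ramp: the banded staircase keeps
# a systematic error of up to half a step; the dithered one does not.
k = np.ones(64) / 64
for name, sig in [("banded", banded), ("dithered", dithered)]:
    err = np.abs(np.convolve(sig, k, "same") - ramp)[64:-64].mean()
    print(f"{name:8s} local error: {err:.4f}")
```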

While I admire your reasoning and rigor, I agree with @jonpais for the most part. I agree that a 10-bit image, properly sourced, will hold up better in the grade than an 8-bit one, but will look the same to the eye ungraded. While I know (secondhand) of some minor "cover-ups" by camera manufacturers, none are too nefarious; consistently, it's stuff you can figure out for yourself by running camera tests, things people online identified anyway, and issues that were eventually rectified to some extent. Most camera manufacturers are surprisingly transparent if you can talk to their engineers, and there are white papers out there:

Deep-Dive-HDR-Part2.pdf


However, this is over my head.

Where I disagree with Jon is his statement that a given log profile from any camera is adequate for HDR content production. In theory, since HDR standards are poorly defined, this might be true. But it doesn't mean it's giving you the full experience. My only exposure to HDR (other than displays at Best Buy, and trying HDR video on an iPhone X) has been a Dolby 10,000-nit demonstration and a few subsequent conversations with Dolby engineers. The specs I was given for HDR capture by them were 10-bit log or RAW capture, rec2020 or greater color space, and 15 stops of dynamic range or greater. Of course, there are many HDR standards, and Dolby was giving the specs for top-of-the-line HDR. But still, this was the shorthand for what Dolby thought was acceptable, and it's not something any consumer camera offers. They are, however, bullish on consumer HDR in the future. Fwiw, the 10,000-nit display is mind-blowingly good.

Just because Sony seems to be careless at implementing log profiles (which is weird, since the F35 is excellent and the F3 is good, too) doesn't mean log profiles are universally useless. The idea is to compress the sensor data into the most efficient package while sacrificing as little tonality as possible. The problem arises when you compress things too far, either in terms of too low a bit depth or (much worse) too much compression. I do see this with A7S footage. And I think it's the reason Canon won't allow Canon Log 2 gamma in its intermediate 8-bit codec on the C200. I realize you wouldn't consider the Varicam LT and Alexa consumer-level, but the images from their 10-bit log profiles are great, with rich tonality and color that does not fall apart in the grade. Furthermore, I suspect the C200's RAW capture would actually fulfill even Dolby's requirements for high-end HDR, and $6,000 is not that expensive, considering.

Out of curiosity, do you use Canon Log on the C100? It's quite good, not true log, but well-suited for an 8 bit wrapper. 


1 hour ago, maxotics said:

HDR is a scheme to sell more television sets.  They are not artists living off free-love.  How much the technology can/will HDR deliver is the question.  ...  Because, as I've said repeatedly, I have not seen good HDR I can only speculate. 

I've got a high-end Sony HDR TV, and I compared the Dark Knight 2K Blu-ray and 4K HDR Blu-ray, and there is absolutely a difference. The gradations are so subtle, the colors unlike anything I've seen on a TV before... It's glorious, and I highly recommend you experience it! So I don't think I'd call it a scheme, because it is a truly improved viewing experience for high-end productions.

Here's why I think low-compression 10-bit HDR is 100% worth it:

My TV has the exact same settings for both sources, with main contrast and brightness turned to 100% but all "special" contrast enhancements (i.e. dynamic range and color "killers") turned off:

Exhibit 1: Exposed for the highlights with my iPhone. It's very hard to see in these photos, but you've got to trust me that in person there are 1-2 extra stops of visible detail in the clouds in the HDR version. This is one of the highest-contrast scenes imaginable.

2K:

[attached image]

4K:

[attached image]

 

Exhibit 2: 

2K (shot with non-HDR iPhone camera setting) this is pretty much how my eye sees it:

[attached image]

4K HDR - without using the HDR iPhone setting, the dynamic range doesn't even fit in the photo: the sky is too bright (I tried to get the rest of the image exposed the same, but I'd say there's still an extra stop visible in the shadows). Please also note the subtle transition on the ground from shadow to highlight:

[attached image]

And here's 4K HDR shot with HDR iPhone settings to better show how my eye sees it:

[attached image]

 

Now, here are some comparisons, all shot with the iPhone in HDR mode, that speak for themselves:

[four attached comparison images]

On the last one, it looks like the helicopter spotlight is blown out, but in person it's not. There's still plenty of detail there. So it seems like the dynamic range displayed might actually be higher than what my iPhone 8 can capture even in HDR... or maybe I didn't expose perfectly.

I guess you could say the difference is... Knight and day. Ok, I'm done.


1 hour ago, HockeyFan12 said:

Alexa, for instance, 10 bit Log C 444 is maybe 99.9% as good–and a tiny fraction of the size

Oh, I could pull my hair out!  Those Alexa files (HD) are at around 42 MB/sec, in other words, pretty close to what you need for a BMPCC or ML RAW.  At 4K it would be 4x that amount.  10-bit isn't the same for all cameras.  That is, 10-bit on an Alexa isn't the same as 10-bit on a GH5, because the former is, in the case above, doing 444, which is essentially full-color compressed RAW (no chroma subsampling).  It's not a "tiny fraction of the size" in my book.  It's more like half the size of RAW, which, don't get me wrong, is nice!  No matter how many ways or times I try, some people don't want to read the fine print or figure out whether they're really getting 10 bits of DR with no color loss.

@EthanAlexander I don't question your viewing experience.  Again, I've never said HDR can't be good.  The disconnect here is that you and Jon are looking at images where the brain doesn't care if midtone color is reduced.  If I'm shooting a beach, or landscape, with no one in it, I would shoot LOG.  Again, I never said one shouldn't use LOG.  However, if a person were the subject on a beach, I might not shoot LOG, in order to maintain the best skin tones.  This is a subjective judgment!  I might shoot LOG anyway.  However, if I did, I would not expect the LOG footage to give me as nice skin tones as I would have gotten shooting a standard profile and letting the sky blow out.

Your example above demonstrates my point.  In the non-HDR version of the truck, the red headlights are nice and saturated.  The image is contrasty, but not because a standard profile is naturally contrasty; it's because most of the image data is OUTSIDE the sensor's ability to detect it without a lot of noise.  In your second image, you have more detail in the street, and if that's what you're going for, good!  But look at the truck's red headlight and the yellow paint: it's all washed out.  And if you go back to your editor and try to saturate them, you will get close to the other image, but this time with high contrast and crude saturated colors, a LOSS against your original image, which at least had nicely saturated colors around your center exposure.

It's the same thing with your helicopter shot.  If you look for rich colors, you won't find them in the HDR.

Again, it all depends on which look you're going for.  If you don't need nuanced, saturated colors, then your HDR is great.  I do favor saturated colors.  We should respect each other's values, right?  For anyone to say I can get what I want in HDR is fine, but they need to prove it to me.  The images you posted just confirm my experience.  But thanks for posting them!  We all learn by looking at evidence!

 

1 hour ago, HockeyFan12 said:

While I know (secondhand) of some minor "cover-ups" by camera manufacturers, none are too nefarious; consistently, it's stuff you can figure out for yourself by running camera tests, things people online identified anyway, and issues that were eventually rectified to some extent. Most camera manufacturers are surprisingly transparent if you can talk to their engineers, and there are white papers out there

This is why I wish Jon would be more careful in accusing me of things.  I never said the manufacturers are covering anything up.  I said they have no incentive to release tests/specifications of their consumer equipment.  No professional gives an arse about any of this stuff I'm talking about, because they just go out and get a camera that gives them what they want.  If Netflix gave me money for a show, do you think I'd shoot it on a DSLR? :)  Arri, here I come ;)  I'm confining myself to the subject of this blog, which is understanding and getting the most out of consumer equipment.


19 minutes ago, maxotics said:

Oh, I could pull my hair out!  Those Alexa files (HD) are at around 42 MB/sec, in other words, pretty close to what you need for a BMPCC or ML RAW.

2k (not HD) 444 ProRes is about 38MB/sec; ArriRAW (2.8k Bayer for 2k delivery) is 168MB/sec.

Yes, it's only about a 77% reduction in file size, which is significant on TV shows but perhaps not on the largest feature films. I suppose "tiny fraction" is an exaggeration.
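Plugging in the rates above:

```python
# File-size arithmetic from the rates quoted above (MB/sec).
prores_444_2k = 38    # 2k ProRes 444
arriraw_2k = 168      # 2.8k Bayer ArriRAW for 2k delivery
print(f"reduction: {1 - prores_444_2k / arriraw_2k:.0%}")   # -> 77%
```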

But ArriRAW has its own gamma mapping to a 12-bit container from a dual 14-bit ADC whose outputs are combined into a 16-bit signal in the camera. So if you were starting with the true RAW signal, which is either 28-bit or 16-bit depending on how you look at it, the reduction in file size would be dramatically greater. In the case of ArriRAW, the RAW data itself has its gamma stretched (similar to, but different from, log) to save space.
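Something like this toy curve shows the idea (my illustration only; Arri's actual transfer function is different and published in their white papers):

```python
# Gamma-stretch a 16-bit linear signal into a 12-bit container with a
# log-shaped curve, spending code values more evenly per stop.
import numpy as np

def encode_12bit(linear16):
    x = linear16.astype(np.float64) / 65535.0        # normalize to 0..1
    y = np.log2(1.0 + 4095.0 * x) / 12.0             # toy log curve, 0..1
    return np.round(y * 4095).astype(np.uint16)      # 12-bit codes

vals = np.array([0, 16, 256, 4096, 65535], dtype=np.uint16)
print(encode_12bit(vals))   # shadows get far more codes than linear would give
```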

So perhaps ArriRAW is not the best example because it compresses the gamma, too, and a 77% reduction in file size isn't that big for your needs (it is for mine).

I'm not sure what I "don't get." My own experience shooting 10-bit SLOG 2 on the F5 indicated that the codec wasn't well implemented for that flat a gamma, and I ended up not liking that camera when it was first released. (Overexposed by a stop, it's not so bad, and it's better now.) I think what you miss is that most serious shooters are running these tests for themselves. Problems like sensor banding in the C300 Mk II reducing the stated 15 stops of dynamic range, SLOG 2 on the A7S being too "thin," and Red's green and red chromaticities being placed too close together are well documented at the ASC and ACES boards.

Furthermore, the Alexa is pretty darned good even at 422, which you posit is too thin for log. (And Log C is very flat as gammas go.) Many TV shows shoot 1080p 422 (not even HQ) for the savings in file size. They still shoot log, and the images still have good tonality, if slightly less flexibility than 444 ProRes or ArriRAW affords. Just because a few log profiles aren't all they're cracked up to be doesn't mean log profiles are inherently bad or wasteful.


1 hour ago, HockeyFan12 said:

Out of curiosity, do you use Canon Log on the C100? It's quite good, not true log, but well-suited for an 8 bit wrapper. 

Yes!  Because C-Log has been tuned to give the biggest increase in the DR "look" without overly compromising color.  It's a beautiful look, but it's also a sensor made for video.  Anyway, I shoot LOG; I never said anyone shouldn't.  All that said, C-Log isn't my first choice.


4 minutes ago, maxotics said:

Yes!  Because C-Log has been tuned to give the biggest increase in the DR "look" without overly compromising color.  It's a beautiful look, but it's also a sensor made for video.  Anyway, I shoot LOG; I never said anyone shouldn't.  All that said, C-Log isn't my first choice.

Same! Maybe we don't disagree that strongly after all. 

There are issues with log profiles, I'll admit. One of the great things about Red is that it looks great rated at 200 ISO (one could argue it doesn't look so good rated much faster) and pulled down in post, whereas ETTR with log can thin out your signal. But I still think the better-implemented log gammas (Canon Log, Log C; SLOG 1 was quite good on the F35 and F3) are the best thing we've got going at the moment.


8 minutes ago, HockeyFan12 said:

I'm not sure what I "don't get."

Sorry, I'm just frustrated.  I believe you get everything I'm saying.  My guess is you have the reverse blind spot to Jon's.  Sorry! :)  You obviously shoot with high-end equipment, so your cameras have fat-pixel sensors and powerful electronics.  LOG IS useful to you.  But I believe that sometimes, when you think about LOG, you forget you're thinking about it in high-bit-depth or cinema-sensor contexts.  Many people on this forum have never shot RAW or high-bit-depth cameras.  All they know/have is 8-bit.  That's always what I'm focused on.  Anyway, THANK YOU SO MUCH for your observations.  I haven't used the equipment you have, and I certainly don't have your experience, so I find your comments extremely interesting.

Now that I hope we're getting somewhere: I didn't say that Sony and Panasonic were lying.  But I do question what they mean by 10-bit video.  @HockeyFan12, do you find Sony consumer and Panasonic cameras to have true 10-bit, like an Arri?  I don't believe they do.  I believe they are 10-bit in the sense of adding a couple of decimal places to the internal 24-bit color values, but not in the sense of capturing 1,024 distinct values per color channel.  Let me know if my question doesn't make sense.

