alexcosy

Members · 51 posts
Reputation Activity

  1. Like
    alexcosy got a reaction from gloopglop in The REAL difference between normal DSLR video and 5D Mark III raw video   
    Indeed, but as Andrew said, like with the D800, they can tell you all they want that it's clean 4:2:2 10-bit from the HDMI, but if it's just 4:2:0 8-bit "rewrapped" in a 4:2:2 10-bit signal, it doesn't matter: it's the same "crappy" signal in a beautiful box ;) but they still get to tell you it's great, because they're not actually lying... (see the first sketch after this list)
  2. Like
    alexcosy got a reaction from gloopglop in The REAL difference between normal DSLR video and 5D Mark III raw video   
    Hey, many thanks for answering. My question was more like this: as for almost anyone (99% of us aren't on 10-bit Eizo or FSI displays, I guess), the actual display I'm looking at right now is 8-bit. How can I possibly perceive 14-bit color gradation on an 8-bit display? I get that the video itself contains the information, but if the display is 8-bit... I only see 8-bit color, right? Just as if it were black and white, I wouldn't see color, no matter what.
    I know it's tricky, and I guess there's something I'm misunderstanding... That's why I'm asking, so I can fully understand this, and maybe at some point give the answer to someone asking the same thing. (the second sketch after this list shows where the extra bits pay off)

    Or were you saying that an 8-bit display isn't the same thing, and can actually show more colors than 8-bit video? It's always 8 bits per channel, no?
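
To make the first post's point concrete, here is a minimal Python sketch (a hypothetical value ramp, not any camera's actual pipeline) of what "rewrapping" does: padding 8-bit samples into a 10-bit container widens the numeric range but creates no new tonal information, because only 256 distinct levels exist either way.

```python
import numpy as np

# Hypothetical 8-bit luma samples: 256 possible levels (0..255).
eight_bit = np.arange(256, dtype=np.uint16)

# "Rewrap" into a 10-bit container by left-shifting 2 bits, roughly
# what a recorder does when it pads an 8-bit feed into a 10-bit format.
ten_bit_container = eight_bit << 2

# The values now span 0..1020, but there are still only 256 distinct
# levels; the gaps of 4 between codes carry no information.
print(len(np.unique(ten_bit_container)))  # 256, not 1024
print(ten_bit_container.max())            # 1020
```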
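And a sketch for the second post's question: an 8-bit display does limit what you see at any instant, but the extra source bits pay off when grading. In this rough Python illustration (a simulated shadow ramp, not real camera data), we lift the shadows by three stops (x8 gain) on an 8-bit source and on a 14-bit source, then quantize both to 8 bits for display.

```python
import numpy as np

def distinct_display_levels(source_bits):
    # Simulated shadow ramp: the bottom 1/8 of the source range.
    levels = 2 ** source_bits
    shadows = np.arange(levels // 8)
    # Grade: lift the shadows by three stops (x8 gain in linear terms)...
    graded = shadows * 8
    # ...then quantize to 8 bits for an 8-bit display.
    display = graded * 255 // (levels - 1)
    return len(np.unique(display))

print(distinct_display_levels(8))   # 32  distinct levels: coarse steps, banding
print(distinct_display_levels(14))  # 255 distinct levels: smooth gradient
```

The 8-bit source lands on only 32 distinct display levels after the grade (visible posterization), while the 14-bit source still fills essentially every level the 8-bit display can show. That headroom is the benefit you "see" even on an 8-bit monitor.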