BenjaminJ

Members · Content Count: 3
  1. Of course not, but the same is true for similar (fixed) Zeiss lenses on Sony cameras, Schneider lenses on Samsung, etc. What matters is their involvement in the optical design, which can be significant.
  2. I think it has less to do with the color space than with tone mapping or camera profiles. The same RAW file will give different colors in different RAW developing applications for exactly this reason: they use different camera profiles. Regarding the worse sharpness of image B: it's caused by the chromatic aberration (CA) of the lens -- different colors get different magnification, which reduces sharpness. This can be corrected quite effectively in software by scaling the individual color channels, and the NX1 does it in-camera very well, as you can see in image A.
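     To illustrate the idea in post 2: here is a minimal sketch of lateral-CA correction by rescaling the red and blue channels about the image centre. The function name and the scale factors are made up for illustration -- a real corrector derives per-lens factors (often radially varying) from calibration data.

     ```python
     import numpy as np
     from scipy.ndimage import zoom

     def correct_lateral_ca(img, r_scale=1.1, b_scale=0.9):
         """Rescale the R and B channels about the centre so they
         register with G (scale factors here are illustrative)."""
         h, w, _ = img.shape
         out = img.astype(np.float64).copy()
         for ch, s in ((0, r_scale), (2, b_scale)):
             scaled = zoom(out[:, :, ch], s, order=1)
             sh, sw = scaled.shape
             if s >= 1.0:
                 # channel grew: centre-crop back to original size
                 y0, x0 = (sh - h) // 2, (sw - w) // 2
                 out[:, :, ch] = scaled[y0:y0 + h, x0:x0 + w]
             else:
                 # channel shrank: centre-pad back to original size
                 pad = np.zeros((h, w))
                 y0, x0 = (h - sh) // 2, (w - sw) // 2
                 pad[y0:y0 + sh, x0:x0 + sw] = scaled
                 out[:, :, ch] = pad
         return out.astype(img.dtype)
     ```

     The green channel is left untouched because it is conventionally taken as the registration reference; only the misregistered red and blue planes are warped onto it.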
  3. Respectfully, let me provide a correction: Sigma's Foveon X3 sensors do not use stacked color filters -- they measure color by penetration depth in the silicon (red penetrates deepest, blue least deep, green in between). The only advantage of this type of sensor is that the spatial color resolution for red and blue is twice as high as with a Bayer sensor, and there is no color aliasing or colored moiré (though still some luminance aliasing/moiré). The downside is that the colors must be calculated with heavy mathematics, and this is error-prone, leading to more metameric failures (fa
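     The "heavy mathematics" in post 3 can be sketched as a matrix inversion. The mixing matrix below is an assumption for illustration only (real Foveon layer responses are proprietary); the point is that because the three layers' spectral responses overlap heavily, the matrix is far from diagonal, and inverting it amplifies sensor noise, which is what drives the colour errors.

     ```python
     import numpy as np

     # Hypothetical layer-response matrix (an assumption, not real data):
     # rows = sensor layers (top/middle/bottom), columns = true R, G, B.
     # Large off-diagonal terms model the heavy spectral overlap.
     M = np.array([
         [0.15, 0.35, 0.80],  # top layer: responds mostly to blue
         [0.30, 0.70, 0.35],  # middle layer: responds mostly to green
         [0.85, 0.40, 0.15],  # bottom layer: responds mostly to red
     ])

     rng = np.random.default_rng(0)
     true_rgb = np.array([0.5, 0.4, 0.3])
     layers = M @ true_rgb                      # what the sensor records
     noisy = layers + rng.normal(0, 0.01, 3)    # small readout noise
     recovered = np.linalg.solve(M, noisy)      # invert the mixing

     # Noise is amplified by roughly the condition number of M;
     # a Bayer filter's near-diagonal matrix would amplify far less.
     print(np.linalg.cond(M), np.abs(recovered - true_rgb).max())
     ```

     With a well-separated (near-diagonal) response matrix the same inversion would barely amplify the noise, which is why Bayer sensors are less prone to these failures.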