Everything posted by tupp

  1. No. I am referring to the actual color depth inherent in an image (or imaging system). I never mentioned dithering. Dithering is the act of adding noise to parts of an image or re-patterning areas in an image. The viewer's eye blending adjacent colors in a given image is not dithering. Again, I am not talking about dithering -- I am talking about the actual color depth of an image. The formula doesn't require viewing distance because it does not involve perception. It gives an absolute value of color depth inherent in an entire image. Furthermore, the formula and the point of my example are two different things. By the way, the formula can also be used with a smaller, local area of images to compare their relative color depth, but one must use proportionally identical sized areas for such a comparison to be valid. What I assert is perfectly valid and fundamental to imaging. The formula is also very simple, straightforward math. However, let's forget the formula for a moment. You apparently admit that resolution affects color depth in analog imaging: Not sure why the same principle fails to apply to digital imaging. Your suggestion that "non-discrete color values" of analog imaging necessitate measuring color in parts of an image to determine color depth does not negate the fact that the same process works with a digital image. The reason why I gave the example of the two RGB pixels is that I was merely trying to show in a basic way that an increase in resolution brings an increase in digital color depth (the same way it happens with an analog image). Once one grasps that rudimentary concept, it is fairly easy to see how the formula simply quantifies digital, RGB color depth. In a subsequent post, I'll give a different example that should demonstrate the strong influence of resolution on color depth.
  2. So what? It works the same way with digital imaging. However, in both realms (analog and digital), the absolute color depth of an image includes the entire image. I will try to demonstrate how color depth increases in digital imaging as the resolution is increased. Consider a single RGB pixel group of size "X," positioned at a distance at which the red, green and blue pixels blend together and cannot be discerned separately by the viewer. This RGB pixel group employs a bit depth that is capable of producing "Y" number of colors. Now, keeping the same viewing distance and the same bit depth, what if we squeezed two RGB pixel groups into the same space of size "X"? Would you say that the viewer would still only see "Y" number of colors -- the same number as the single pixel group that previously filled size "X" -- or would the (slightly) differing shades/colors of the two RGB pixel groups blend to create more colors? What if we fit 4 RGB pixel groups into space "X"? ... or 8 RGB pixel groups into space "X"? As I think I have shown above, resolution plays a fundamental role in digital color depth. Resolution is, in fact, a factor equally weighted with bit depth in digital color depth. I would be happy to explain this further once you have accepted the above example.
  3. I don't have a reference. When I studied photography long before digital imaging existed, I learned that resolution is integral to color depth. Color depth in digital imaging is much more readily quantifiable, as it involves a given number of pixels with a given bit depth, rather than the indeterminate dyes and grain found in film emulsion (and rather than the unfixed, non-incremental values inherent in analog video). The formula for "absolute" color depth in RGB digital imaging is: Color Depth = (Bit Depth x Resolution)^3
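
     A minimal sketch of how the formula above could be evaluated, assuming "Bit Depth" means bits per channel and "Resolution" means total pixel count (both are my assumptions); the function name and frame sizes are only illustrative:

```python
# The formula quoted above, taken literally:
#   Color Depth = (Bit Depth x Resolution)^3
# Assumptions (mine): "Bit Depth" is bits per channel and
# "Resolution" is the total pixel count of the frame.

def absolute_color_depth(bit_depth, resolution):
    return (bit_depth * resolution) ** 3

hd  = absolute_color_depth(8, 1920 * 1080)   # 8-bit HD frame
uhd = absolute_color_depth(8, 3840 * 2160)   # 8-bit UHD frame

print(f"8-bit HD : {hd:.3e}")
print(f"8-bit UHD: {uhd:.3e}")
# Quadrupling the pixel count at the same bit depth raises the figure
# by 4^3 -- i.e., resolution enters the formula with the same weight
# as bit depth, which is the point being argued.
```
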
  4. The notion that bit depth is identical to color depth is a common misconception that has apparently made its way into Wikipedia. The only instance in which bit depth and color depth are the same is when considering a single photosite/pixel or a single RGB pixel group. Once extra pixels/pixel groups are added, color depth and bit depth become different properties. This happens because resolution and bit depth are equally weighted factors of color depth in digital imaging.
  5. Actually, "8bit" and "10bit" refer only to bit depth -- bit depth and color depth are two different properties. My understanding is that Vlog doesn't actually change the dynamic range -- it changes the tone mapping within the dynamic range to capture more shades in desired exposure regions.
  6. Almost every mic that I have seen on hand-held booms had an interference tube. What kind of mic is that? They might stop laughing when the director and post production sound supervisor start asking why there is so much extraneous noise in the audio. I have been in production for a little while. By the way, I started out in audio recording. I have worked on all kinds of film productions in various departments, from small corporate gigs to large features on stages and on location. I am telling you exactly what I see audio pros use on such sets. When I was involved in audio recording, I was utterly ignorant about the different types of mics and recording techniques. I also was completely unaware of certain brands/models that had more desirable response for certain applications. Please enlighten those of us with limited knowledge with your mastery of the important mic properties in recording. Specifically, please explain what is more important than the degree of directionality in regards to film audio production, given a quality, professional mic.
  7. You should tell that to the pros here in Hollywood. Almost all the boom operators that I see here on set are using shotguns on their booms, both indoors and outdoors. These operators are nimble and precise, and they want clean, dry audio. I am not specifically an audio person, but I always use my boomed shotgun mic, and I have always gotten great results. I would never want anything with wider reception.
  8. Agreed! Otherwise, somebody needs to tell all the pro mixers here in Hollywood that they are wasting their money hiring boom operators! I suspect that this suggestion will sound alien to some here, but never mount your shotgun mic to your camera. Always boom the shotgun mic just outside of the frame, as close as possible to the subject. If your subject is static, such as someone sitting for an interview, you can boom from a light stand or C-stand. Of course, make sure that the shotgun mic is aimed directly at the subject's mouth or aimed directly at the action you want to record. One important trick that is often overlooked -- position the mic so that there are no air conditioners or other noise sources along the shotgun mic's axis (both in front and in back of the mic) where it is most sensitive. So, if there is an air conditioner on camera right, boom the mic from slightly to camera right, so that it is aiming a little leftward towards the subject, perpendicular to the noise source on the right of camera. As @maxotics suggested, it is best to use both a lav and a boomed shotgun, if possible. In such a scenario, one is always a backup for the other.
  9. I don't have a photo of the effect, but there are several video tutorials on YouTube that might show before/after images.
  10. If you want to fix dead/hot pixels in your footage, you can do so in most NLEs. First, make a duplicate track of your footage. Then, make a mask that is transparent, except for the dead/hot pixels. Attach that mask to the top video track, so that the hot/dead pixels are the only part of the top track that covers the identical track below. Then, Gaussian blur the footage in the top track (but not the attached mask), so that the values/color from the adjacent pixels seep into the masked hot/dead pixels.
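
     A rough sketch of the same masking-and-blur idea applied to a single frame outside an NLE, assuming NumPy/SciPy are available; the frame, the hot-pixel coordinates, and the function name are made up for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def patch_hot_pixels(frame, hot_coords, sigma=2.0):
    """Replace listed pixels with values blurred in from their neighbors.

    frame      -- H x W x 3 image array (e.g., uint8)
    hot_coords -- list of (row, col) positions of dead/hot pixels
    """
    # "Duplicate track": blur a copy of the whole frame so neighboring
    # values seep into every position, including the bad ones.
    blurred = np.stack(
        [gaussian_filter(frame[..., c].astype(np.float32), sigma)
         for c in range(frame.shape[-1])], axis=-1)

    # "Mask": transparent everywhere except the dead/hot pixels.
    mask = np.zeros(frame.shape[:2], dtype=bool)
    for r, c in hot_coords:
        mask[r, c] = True

    # Composite: keep the original everywhere, and use the blurred copy
    # only where the mask marks a bad pixel.
    out = frame.astype(np.float32).copy()
    out[mask] = blurred[mask]
    return out.astype(frame.dtype)

# Example with a synthetic frame and two fake hot pixels:
frame = np.full((1080, 1920, 3), 64, dtype=np.uint8)
frame[100, 200] = frame[500, 750] = 255          # simulate hot pixels
fixed = patch_hot_pixels(frame, [(100, 200), (500, 750)])
```
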
  11. One way is to make a duplicate track of your footage and then make a mask that is transparent, except for the dead/hot pixels. Attach that mask to the top video track. Then, Gaussian blur the footage in the top track (the one attached to the mask), so that the values/color from the adjacent pixels seep into the hot/dead pixels. Here is a demo:
  12. By the way, one good site to go to for documentary tips is http://doculink.org/ The Doculink "mailing list" is their forum.
  13. Make transcription text files with time code references to all spoken lines in your footage/audio. Then, edit "on paper" before you even get close to sitting down at the edit bay. It makes the process exceedingly easier and much more thorough. You can add b-roll/cut-aways during the paper edit and/or during the NLE session. If you have a lot of footage/audio (more than ~20-30 minutes), hire a cheap transcription service to make the text files with time code references. The least expensive transcription services are probably in India and, perhaps, in the Philippines, and they can often transcribe most languages. These services are worth every penny you spend on them.
  14. With the high-speed capability, you can shoot videos like this: Just get a few smoke bombs and you're set!
  15. You beat me to it! I was also going to suggest a cheap base rig with rails, like this one that has adjustable height on the camera platform. OP, you might as well get a cheap rig such as this, which additionally gives you rails to attach a matte box or lens hood, a follow focus, etc., if you ever want to use such items. On the other hand, you could also just slide the camera to its most forward position on the Manfrotto 502.
  16. There is nothing like lensrentals.com or borrowlenses.com where you are located?
  17. Consider renting. A $7000 budget for a camera package sans lenses seems rather excessive for a short. I've worked on shorts in which the budget for the entire project was a fraction of that of your camera (sans lenses) package. Plus, renting gives you more flexibility/recoverability. If you rent a camera that has a problem (such as fixed pattern noise) or if you just don't like it, you can return it and get another camera. Also, for a narrative short, you probably don't even need 4K, which could greatly reduce your budget.
  18. Consider a fixture with some throw (not an LED panel), and a tall stand with a fillable bag. Unless you want to spend the money on a waterproof fixture, just learn how to deploy/use rain hats.
  19. Might want to check the math and add up the percentages. At any rate, there were tons of C-mount lenses made for larger vidicons/plumbicons, so the above percentages are not even a "rule of thumb."
  20. There is no absolute image circle size for a C-mount, nor is there an absolute image circle size for the various focal lengths with a C-mount. Furthermore, one can use an adapter to mount APS-C and full frame lenses to C-mount cameras. You just have to determine the image circle of each lens on an individual basis.
  21. I was getting corrupt file headers, and, consequently, the files would not work on all players. Changing the microSD card fixed the problem. Some have reported that removing or changing the SD card has fixed noise and crackling problems on Zoom recorders: https://www.youtube.com/watch?v=E274Ev6yvJ8
  22. Have you tried using a different microSD card? I was having problems with my H1 and a new microSD card fixed them.
  23. I think that the mechanical shutter makes a more noticeable difference with significant movement and with handheld shots. The mechanical shutter probably also reduces noise. By the way, Panavision modified a few F65s by removing the mechanical shutter. They called the modified version the "F65 Mini," and they usually live in France: http://panavision.fr/produits/sony-f65-mini/ www.vimeo.com/197192795